Last week I received a few emails from the Google Search Console Team. They said:
Googlebot cannot access CSS and JS files on <website-name>
It looked like this:

It turns out I’m not the only one getting these emails. Many of my clients and students have also been getting them.
What Does It Mean?
Back in October 2014, Google published an article on their Webmaster Central Blog about changes to how their indexing system renders your site. The old system rendered your site as text only, which predates the latest wave of responsive design.
Now Google wants to see whether your site is mobile friendly (you can put your site to the test here). The new indexing system renders your site much as a modern browser would, which means it now needs access to your CSS and JavaScript files to determine how mobile friendly your site is.
By default, older versions of WordPress generated a robots.txt file that would automatically block access to certain directories in your site. This meant that obedient search engines would not index those directories. Some of those directories contain CSS and JavaScript files that Google now needs access to.
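For reference, the robots.txt that older WordPress versions generated looked roughly like this (the exact contents varied between versions; the Disallow lines are the important part):

```
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
```

That `Disallow: /wp-includes/` line is what hides the bundled CSS and JavaScript files from Googlebot.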
How Do I Fix It?
Before we do anything some of you may be wondering what a robots.txt file is. According to Google:
A robots.txt file is a file at the root of your site that indicates those parts of your site you don’t want accessed by search engine crawlers.
In other words, a robots.txt file tells search engine crawlers which parts of our site they should not crawl.
Step 1: Fetch and Render
The first step I took was to use the Fetch as Google tool in my Search Console Dashboard. You can access it from the left hand menu under the Crawl section:
From the dropdown, choose Mobile: Smartphone and click Fetch and Render.

Click on Partial to see which resources the bots can’t access.

Now I can see the files that Google can’t access.

What I notice is that almost all of the files are in the wp-includes folder.
Step 2: Robots.txt Checker
Navigate to the robots.txt checker tool in your Search Console Dashboard. It’s also in the Crawl section.

Now I can see the content of my robots.txt file.
I can confirm that the wp-includes folder is being blocked in the robots.txt file.
Step 3: Updating the robots.txt file
If you haven’t already, you should install the Yoast SEO Plugin. It’s great for helping you with SEO, and it happens to include a very useful robots.txt editor! They even have a useful tutorial about how to use that functionality.
If you don’t want to follow their tutorial here’s a very quick guide to updating your robots.txt file using their plugin:
- Log in to your WordPress-powered website
- Hover over SEO and click on Tools
- Click File Editor
- Remove the line that says Disallow: /wp-includes/
- Click Save changes to Robots.txt
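If you’d like to verify what effect removing that line has, Python’s standard-library robotparser can simulate how an obedient crawler reads the file. This is just an illustrative sketch; the domain and file contents below are made-up examples, not your actual robots.txt:

```python
from urllib.robotparser import RobotFileParser

# A robots.txt like the one older WordPress installs generated,
# with the line we want to remove still present.
before = """User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
"""

# The same file after deleting the "Disallow: /wp-includes/" line.
after = """User-agent: *
Disallow: /wp-admin/
"""

# A hypothetical CSS/JS asset path under wp-includes.
asset = "https://example.com/wp-includes/js/jquery/jquery.js"

# Before the change: Googlebot is not allowed to fetch the asset.
blocked = RobotFileParser()
blocked.parse(before.splitlines())
print(blocked.can_fetch("Googlebot", asset))  # False

# After the change: Googlebot can fetch it.
rp = RobotFileParser()
rp.parse(after.splitlines())
print(rp.can_fetch("Googlebot", asset))  # True
```

The same check works against a live site by calling `rp.set_url("https://yoursite.com/robots.txt")` followed by `rp.read()` instead of `parse()`.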

Step 4: Checking with Google
Go back to your Search Console dashboard and open the robots.txt checker. You should now see the updated version. If you don’t, make sure you clicked Save changes and wait a few minutes before checking again.

Conclusion
The robots.txt file is very powerful when it comes to SEO. It determines what a search engine can and cannot see when crawling and indexing our site.
Since it’s so important for our businesses to rank as high as possible, you can see why we should give search engines access to what they need. Google has stated that blocking its access to your CSS and JS files will affect your ranking: these files help Google confirm that your website works properly, so blocking them can result in sub-optimal rankings.
Do the right thing and update that robots.txt file today!
1 comment
Paul Hughes – August 9, 2015
Thanks for this. I have a slightly longer list: 16 items, and my site has fallen two spots on the organic listings. This is still too techie for me, so I’ve forwarded it to my web guy. I only saw the error message last night after seeing my site dip. Hopefully the 16 things, once open to Google, will restore my rankings.
By the way, what does ‘temporarily unreachable’ mean? I thought this error message only concerned blocked stuff? Why are a couple of your bits unreachable (and loads of mine) and is the fix for all of these things the same?
Thanks
Paul