How to Fix Googlebot Cannot Access CSS & JS Files Issue

Hey, webmasters! Don't panic if a notice from Google landed in your inbox today: many site admins received a warning that "Googlebot cannot access CSS and JS files on" their site.

Google delivered this notice via Search Console, warning site owners that Googlebot's inability to access those files may result in "suboptimal rankings".

That sounds bad, right? The good news is that there is a simple fix, and applying it properly will sort the issue out.

Googlebot Cannot Access CSS & JS

Here is the warning many webmasters received from Google yesterday:

Google systems have recently detected an issue with your homepage that affects how well our algorithms render and index your content. Specifically, Googlebot cannot access your JavaScript and/or CSS files because of restrictions in your robots.txt file. These files help Google understand that your website works properly so blocking access to these assets can result in suboptimal rankings.

According to the Google Webmaster Guidelines updated last October, blocking CSS and JavaScript is explicitly disallowed. Although the company did little to communicate this to webmasters until recently, it has now started sending warning messages asking site owners to fix the issue immediately.

So if your site has received a similar warning, you need not worry: you can fix the issue yourself with a few simple steps.

Fixing the CSS & JS warning by editing your robots.txt file:

Check your robots.txt file for any of the following lines of code:

Disallow: /*.js$

Disallow: /*.inc$

Disallow: /*.css$

Disallow: /*.php$

Disallow: /wp-includes/

If you find any of these lines, or similar rules you wrote in the past, remove them immediately; they are exactly what the Googlebot warning is about. Removing them lets Googlebot crawl your JS and CSS files freely. Rules like these also appear in many custom CMS setups, so check carefully for blocking lines depending on the CMS you use.
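As an illustration, here is what trimming such a file might look like. The specific paths are placeholders; keep whatever rules genuinely protect private areas of your own site:

```
# Before: these rules stop Googlebot from fetching the assets
# it needs to render your pages.
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /*.css$

# After: only the genuinely private area remains blocked.
User-agent: *
Disallow: /wp-admin/
```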

Note: If you are a WordPress user, there is a good chance your file contains the Disallow: /wp-includes/ line, which many SEOs have used and which blocks Google from accessing CSS and JS files. Remove that line and it will fix the issue straight away.
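If you would rather not open up /wp-includes/ entirely, one alternative (a sketch, not an official Google recommendation) is to keep the Disallow but explicitly allow the script and stylesheet directories inside it. Under Google's rule-precedence semantics, the longer, more specific Allow rules win over the shorter Disallow:

```
User-agent: Googlebot
Disallow: /wp-includes/
Allow: /wp-includes/js/
Allow: /wp-includes/css/
```

Note that once a Googlebot-specific group exists, Googlebot follows only that group and ignores your User-agent: * rules, so any restrictions you still want applied to Googlebot must be repeated here.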

If you use the Yoast SEO plugin, you can edit your robots.txt file right from the WordPress dashboard. Follow the instructions below.

Go to your WordPress dashboard, click SEO > Tools, then click the blue "File editor" link, which lets you edit your robots.txt file.

Look for the line Disallow: /wp-includes/, remove it, and save the file. That's it; you are done.

Once you have cleaned up your robots.txt file, the final step is to confirm that your site is not blocking any other CSS or JS files. Use Google's Fetch and Render tool in Search Console to verify that the issue is completely fixed.
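If you want to sanity-check your rules offline before waiting on Fetch and Render, Python's standard-library robots.txt parser can tell you whether a given URL would be blocked. This is a minimal sketch: the rules and example.com URLs are placeholders for your own robots.txt and assets. One caveat: the stdlib parser does not understand Google's * and $ wildcard extensions, so it is only reliable for plain path prefixes like /wp-includes/.

```python
from urllib.robotparser import RobotFileParser

# Placeholder rules standing in for your site's robots.txt.
rules = """
User-agent: *
Disallow: /wp-includes/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# Check representative asset URLs the way a crawler would.
for url in (
    "https://example.com/wp-includes/js/jquery/jquery.js",
    "https://example.com/wp-content/themes/twentyfifteen/style.css",
):
    verdict = "allowed" if parser.can_fetch("Googlebot", url) else "BLOCKED"
    print(url, "->", verdict)
```

With the rules above, the jquery.js URL comes back BLOCKED, which is exactly the condition Google's warning describes; remove the Disallow line and it becomes allowed.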

Why does Googlebot need to access your CSS & JS files?

After receiving that warning, you might be wondering why this issue suddenly cropped up. Google's message is loud and clear: "Help Google get a better picture of your site."

A web page relies on the availability of my_script.js, which is typically run by web browsers to provide the browsers with the core textual content of the page. If my_script.js is blocked from Google, we won’t be able to get the text content when Googlebot renders the web page.

Also check out the video below from Matt Cutts, where he explains why you should not block Google from accessing the JavaScript and CSS on your website:
