Google Sends Mass Warning: “Googlebot Cannot Access Your JavaScript and CSS Files”

Don’t be alarmed if you received a warning from Google in your email today — many webmasters were alerted that “Googlebot cannot access your JavaScript and/or CSS files.”

Google sent out this warning via Search Console, reminding webmasters that Googlebot’s inability to access those files may result in “suboptimal rankings”.

That sounds bad, but the good news is there’s an easy fix for it, and implementing the fix may even end up helping your site.

Here is the full warning:

“Google systems have recently detected an issue with your homepage that affects how well our algorithms render and index your content. Specifically, Googlebot cannot access your JavaScript and/or CSS files because of restrictions in your robots.txt file. These files help Google understand that your website works properly so blocking access to these assets can result in suboptimal rankings.”

Blocking CSS and JavaScript has been a Google no-no since it was written into the Webmaster Guidelines last October. It’s only recently that the company has been issuing warnings about it.

If your site has been blocking Googlebot from accessing those files, then it’s a good thing you know about it so you can deal with the issue.

There’s an easy fix for it, which involves editing your site’s robots.txt file. If you’re comfortable editing that file, then go ahead with this fix.

Look through the robots.txt file for any of the following lines of code:

Disallow: /.js$*
Disallow: /.inc$*
Disallow: /.css$*
Disallow: /.php$*

If you see any of those lines, remove them. That’s what’s blocking Googlebot from crawling the files it needs to render your site as other users can see it.
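After removing those lines, a minimal WordPress-style robots.txt (illustrative only; your private paths may differ) could look like this:

```
# Assets (.js, .css) are no longer blocked; only the admin area is off-limits
User-agent: *
Disallow: /wp-admin/
```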

The next step is to run your site through Google’s Fetch and Render tool, which will confirm whether or not you fixed the problem.

If Googlebot is still being blocked, the tool will provide further instructions on changes to be made to the robots.txt file.

In addition, you can use the robots.txt testing tool in Search Console to identify if there are any other crawling issues.
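If you want to sanity-check a set of rules before uploading them, here is a minimal sketch using Python’s standard-library `urllib.robotparser` (the rules and URLs below are made up; note that this parser follows the original robots.txt spec and does not implement Google’s `*` and `$` wildcard extensions, so it is only a rough local check):

```python
from urllib import robotparser

# Illustrative rules similar to a default WordPress robots.txt
lines = [
    "User-agent: *",
    "Disallow: /wp-admin/",
    "Disallow: /wp-includes/",
]

rp = robotparser.RobotFileParser()
rp.parse(lines)  # parse() accepts an iterable of lines

# A script under a disallowed folder is blocked for every user agent
print(rp.can_fetch("Googlebot", "https://example.com/wp-includes/js/jquery.js"))  # False

# A stylesheet outside the blocked folders is crawlable
print(rp.can_fetch("Googlebot", "https://example.com/wp-content/themes/x/style.css"))  # True
```

For rules that rely on wildcards, Google’s own robots.txt testing tool remains the authoritative check.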

Image Credit: Shutterstock

Matt Southern
Matt Southern is the lead news writer at Search Engine Journal. His passion for helping people in all aspects of online marketing flows through in the expert industry coverage he provides.
  • Glad to see I wasn’t the only one. Although in our case, it was a 3rd party analytics javascript file that caused the problem. A single file.

    In our robots.txt file, we had “disallowed” the entire folder for the analytics, so of course when Googlebot crawled any page, the analytics calls for the js file and I guess Google didn’t like that.

    We got the notice today around lunch and, like anyone with a web presence, totally flipped out.

    Seems like an alarming warning message that could have several causes. Really wish big G could personalize it a bit more so webmasters don’t have heart attacks!

    Thanks for the news article!

    Ryan @ Netfloor USA

  • Principal Garden

    Is there any way to find our robots.txt file to begin with? And would it work the same for us if we changed the Disallow to Allow instead, or should we simply remove it as you’ve mentioned? How about wp-admin? Many thanks!

    • R.Rogerson

      Your robots.txt file should be in the root of your website (where your index.html or index.php file is).
      If you have FTP access (or can use a FileManager through your webhost), it should be the first “web accessible” directory (often called “htdocs”, “web”, “public_html” etc.).

      It should be accessible in your browser [ http:// your domain . tld / robots.txt ].

      It can be more complicated if you have a “dynamic” robots.txt file.
      You’ll know if you do because you will be able to access it through the browser, but won’t see an actual file via FTP/FileManager.
      (If that is the case, then a look in any .htaccess file should show a rewrite for robots.txt – you’ll need to look through the instructions for your site/plugins to see how to control it)
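      For reference, a WordPress-style rewrite of this kind, as it typically appears in .htaccess, looks something like the following (a sketch; your setup may differ):

```
RewriteEngine On
# Requests for robots.txt are handed to WordPress's front controller,
# which generates the file dynamically
RewriteRule ^robots\.txt$ /index.php?robots=1 [L]
```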

  • Nauseen

    I used a standard robots.txt file, but I still got Google webmaster errors:
    User-agent: *
    Disallow: /wp-admin/
    Disallow: /wp-includes/

    Please help me figure out how to solve this problem.

    • Try adding the below lines to the end:

      User-Agent: Googlebot
      Allow: .js
      Allow: .css

      This should specifically allow Googlebot to access .js and .css files. Then run the render tool as suggested by Matt.

      Hope it helps.

      • Hi Ven Tesh

        This adjustment might work better for all js and css
        Allow: /*.js$
        Allow: /*.css$

        Cheers!
        jules

  • Kamaldeep

    I got mine fixed by adding the lines below to my robots.txt file. Of course, this code is for WordPress users only. I also recommend installing the “WP Robots Txt” plugin for easy editing of the robots.txt file.

    Allow: /wp-content/themes/
    Allow: /wp-content/plugins/
    Allow: /wp-content/uploads/
    Allow: /wp-includes/css/
    Allow: /wp-includes/js/
    Allow: /wp-includes/images/

  • Hi Matt,

    I got that warning yesterday. I don’t have any other line of code other than:
    User-agent: *
    Disallow: /wp-admin/

    Is there anything else that might cause the warning? Will this affect rankings significantly if I ignore the warning?

    • Caroline

      Hi Matt,

      I had the same problem as you, only had those 2 lines in my robots.txt. After doing some research I found this solution which has since worked:

      #Googlebot
      User-agent: Googlebot
      Allow: *.css
      Allow: *.js

      # global
      User-agent: *
      Disallow: /wp-admin/
      Disallow: /wp-includes/

      Once you do that, update your robots.txt in Search Console and then fetch as Google. The warning message didn’t disappear instantly, but wait a while and it’ll go.

      Hope that helps!

  • R.Rogerson

    Knowing what directories/folders your resource files (JS/CSS) are located in would help.
    Many people will block access to directories for things like themes, not realising that includes certain files the themes load up (like CSS or JS for widget styling etc.).

  • I got this error for one of the sites I host for a friend. I’m only blocking /wp-admin in robots.txt, I can’t see any reason for CSS and JS to be blocked, and “fetch as Google” doesn’t show anything.

    If I was inclined towards pessimism, I’d say this is just Google being lame again and slinging FUD around like it’s a paying job. Dorks.

  • Sujit

    Hi,

    I got the same error yesterday, but I haven’t blocked anything in the robots.txt file. Still, Google is not able to fetch and render the .js and .css files.

    Can anyone please help me with this? Thanks in advance.

    • R.Rogerson

      Without a domain/URL, no one is going to be able to help.

  • Since Google sent out this warning via Search Console, it’s reasonable to guess that this is going to be an important ranking factor.

    I did receive the message for all my WordPress sites where I blocked the wp-includes folder for Googlebot.

  • John

    So I have tried numerous configurations of robots.txt to no avail. However, when I disabled the WordPress plugin Quick Adsense, everything rendered OK. Could someone who knows how to code look into the issue and maybe post a fix? To be more specific, this is what showed in the blocked list when I tried to render the page before deactivating Quick Adsense: “http ://page ad2.googlesyndication. com/pagead/js/ adsbygoogle.js”. After deactivating Quick Adsense, the page rendered OK.
    Edited to break link.

    I have contacted BuySellAds which appears to be the owner of the plugin. Their response is that they no longer support the plugin.

    *Please note this robots.txt did not solve the problem.

    My current robots.txt:
    User-agent: *
    Disallow: /wp-admin/
    Disallow: /wp-includes/
    Allow: /wp-content/themes/
    Allow: /wp-content/plugins/
    Allow: /wp-content/uploads/
    Allow: /wp-includes/css/
    Allow: /wp-includes/js/
    Allow: /wp-includes/images/
    User-Agent: Googlebot
    Allow: .js
    Allow: .css
    Allow: /*.js*
    Allow: /*.css*
    Allow: /wp-admin/
    Sitemap: http://mywebsitename. com/sitemapindex.xml

  • For WordPress sites, there is pretty much no point in applying the kind of restrictions variously described above. Google “yoast robots.txt” for information on that.

    Unless you have a special requirement (affiliate links, unusual plugins or functionality, etc.), the best format is just:

    User-agent: *
    Allow: /

    You should use an SEO plugin and/or meta tags to block indexing/following of inappropriate or duplicated content.
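    For example, a page you want crawled but kept out of the index can carry a robots meta tag in its head (a sketch; note that noindex only takes effect if the page itself is not blocked in robots.txt, since Google must be able to fetch the page to see the tag):

```
<meta name="robots" content="noindex, follow">
```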

    Having said that … we liberalised all our clients’ robots.txt files this week and are still getting whingeing from Google. In particular, they are complaining about access-restricted assets hosted on third-party systems, including their very own YouTube and other Google properties. The usual rule applies: the bigger the organisation, the lower the evident corporate IQ and the less they care.

  • Thanks for the heads-up. The only site I got a notice for is a WP site of recent vintage, and I didn’t make a robots.txt. Is that a new WP default?

  • RomanM

    Nice tip, Matt.
    Although most people will probably have received this message because they blocked certain folders in which JS/CSS files are kept.

  • I suppose it makes sense, but it’s a frustrating update if you manage a lot of sites and have to change a ton of robots.txt files.