Google sent out this warning via Search Console, while also reminding site owners that Googlebot’s inability to access those files may result in “suboptimal rankings”.
That sounds bad, but the good news is there’s an easy fix, and implementing it may even end up helping your site.
Here is the full warning:
If your site has been blocking Googlebot from accessing those files, then it’s a good thing you know about it so you can deal with the issue.
The fix involves editing your site’s robots.txt file. If you’re comfortable editing that file, go ahead with the steps below.
Look through the robots.txt file for any lines of code that block Googlebot from crawling your CSS and JavaScript files.
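Blocking rules generally take one of a few common forms. The directory names and patterns below are illustrative assumptions rather than a definitive list; your site’s file may use different paths:

```
User-agent: *
# Blocking entire asset directories (directory names vary by site)
Disallow: /css/
Disallow: /js/
Disallow: /scripts/
# Blocking by file extension using wildcards
Disallow: /*.css$
Disallow: /*.js$
```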
If you see any of those lines, remove them. That’s what’s blocking Googlebot from crawling the files it needs to render your site as other users can see it.
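If other crawlers still need to be restricted, a gentler option is to explicitly allow Googlebot to fetch those file types instead of deleting the rules outright. A minimal sketch, assuming your assets use the standard .css and .js extensions:

```
# Give Googlebot explicit access to stylesheets and scripts
User-agent: Googlebot
Allow: /*.css$
Allow: /*.js$
```

Keep in mind that Googlebot obeys only the most specific matching user-agent group, so a dedicated Googlebot group like this one replaces the generic * rules for Googlebot; carry over any Disallow lines that should still apply to it.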
The next step is to run your site through Google’s Fetch and Render tool, which will confirm whether or not you fixed the problem.
If Googlebot is still being blocked, the tool will provide further instructions on changes to be made to the robots.txt file.
In addition, you can use the robots.txt testing tool in Search Console to check for any other crawling issues.
Image Credit: Shutterstock