
Google Launches Official Robots.txt Testing Tool

A robots.txt file is a key component of your website’s SEO, but sometimes these files do more harm than good. For example, your robots.txt file may be blocking search engine spiders from crawling important pages on your site, or it may be preventing your site from being indexed altogether.
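For instance, a single overly broad directive can block far more than intended. A hypothetical robots.txt illustrating that kind of accident might look like this (the paths are made up for illustration):

```
# Intended to hide a staging folder, but the trailing slash
# was forgotten -- this blocks /staging, /staging-2, and any
# other path beginning with "/staging"
User-agent: *
Disallow: /staging

# Worse still, a stray catch-all rule like this one would
# keep the entire site from being crawled:
# Disallow: /
```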

Google wants to make it easier for you to detect these kinds of errors, which is why they announced today the launch of an updated robots.txt testing tool in Webmaster Tools. You can find it within the Crawl section.

With the new tool you can view your current robots.txt file and test specific URLs to see whether they’re disallowed for crawling. The tool will guide users through complicated directives in the robots.txt file and highlight the specific one that led to Googlebot’s final decision.
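The same kind of per-URL check can be sketched with Python’s standard-library robots.txt parser (the rules and URLs below are made-up examples, and the stdlib parser understands fewer directives than Googlebot does):

```python
from urllib.robotparser import RobotFileParser

# Parse an in-memory robots.txt (normally you would fetch
# the live file from https://example.com/robots.txt)
rules = """\
User-agent: *
Disallow: /private/
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# Ask whether a given user agent may fetch a given URL
print(rp.can_fetch("Googlebot", "https://example.com/private/page.html"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/about.html"))         # True
```

This mirrors what the testing tool does interactively: match a URL against each directive and report the final allow/disallow decision.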

You can make changes in the file with the new tool, and test the changes as you go. Once everything is working as it should, just upload the new version of the file to your server to make the changes take effect.

If you know of errors on some of your existing sites, Google recommends double-checking the robots.txt files of those sites. A common problem comes from old robots.txt files that block CSS, JavaScript, or mobile content. Once identified, such issues are usually simple to fix.
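A legacy file of that sort might contain directives like these (hypothetical paths), which keep Googlebot from fetching the assets it needs to render pages:

```
# Old-style rules that block rendering resources
User-agent: *
Disallow: /css/
Disallow: /js/

# The fix is usually just to remove the directives above
# so crawlers can fetch CSS and JavaScript files.
```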

Google’s John Mueller chimed in on the discussion and said he recommends you try this new tool, even if you’re sure that your robots.txt file is fine.

Some of these issues can be subtle and easy to miss. While you’re at it, also double-check how the important pages of your site render with Googlebot, and whether you’re accidentally blocking any JS or CSS files from crawling.

For more information about robots.txt, here’s a Webmaster Help page on how the files are processed.


Matt Southern

Freelance Writer at MattSouthern.com
Matt Southern is the lead news writer at Search Engine Journal.
