Google Webmaster Tools has a new feature that will certainly be useful to site owners in making their sites Googlebot friendly – the Robots.txt generator. Basically, what this tool does is automatically generate a robots.txt file. With a few clicks of the mouse, you can easily have your site’s robots.txt file ready. It eliminates the technicalities of creating a robots.txt file by hand, which have haunted novice webmasters like me before.
Using the Robots.txt generator, webmasters can easily instruct robots which files or folders in their site’s root directory should be crawled. You can even choose which specific robot is allowed to access and index your site while restricting other robots from doing the same. Similarly, you can further refine these crawling rules by specifying which robot should access certain files in your root directory and which robot should access others.
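For those curious what the generated file actually looks like, here is a minimal sketch of the kind of robots.txt the tool produces (the folder and file names are made up for illustration). It uses the standard `User-agent`, `Disallow`, and `Allow` directives to give Googlebot broader access than other crawlers:

```
# Allow Googlebot to crawl everything except a private folder
User-agent: Googlebot
Disallow: /private/

# All other robots: keep out of admin and private areas
User-agent: *
Disallow: /private/
Disallow: /admin/
```

Rules are grouped per `User-agent`, and a crawler follows the most specific group that matches its name, so Googlebot here ignores the `*` block entirely.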
Although this might seem like a pretty useful tool, there are still some limitations to its application. Some robots may ignore the instructions in the robots.txt file and continue to crawl your site, including those files or folders you have restricted. So for highly sensitive files or documents, it would still be wise to put them behind password protection.
While Googlebot is guaranteed to honor your robots.txt according to the guidelines you indicated when generating it, other major robots may simply ignore it.
But still, the Robots.txt generator is a great addition to Webmaster Tools, as it will definitely make the lives of webmasters a little bit easier than before.