Google’s URL Removal Tool Lacks Support for Wildcards
Dan Thies reports over at the Search Engine Watch Forums that Google’s URL removal tool doesn’t support robots.txt extensions. He explains that although “Googlebot supports an extension to the robots.txt syntax, which allows webmasters to use wildcards in disallow directives,” the URL removal tool does not support those same extensions. He said it “will generate an error message telling you that wildcards aren’t allowed, if you feed it a robots.txt file which makes use of these extensions.”
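For readers unfamiliar with the extension in question, here is a minimal robots.txt sketch using the wildcard syntax Googlebot supports; the paths shown are hypothetical examples, not from the original report:

```
User-agent: Googlebot
# "*" matches any sequence of characters and "$" anchors the end of the URL.
# Googlebot honors these patterns, but the URL removal tool reportedly rejects them.
Disallow: /*.pdf$
Disallow: /private*/
```

A file like this would block Googlebot from crawling the matching URLs, yet submitting it to the URL removal tool would trigger the wildcard error Dan describes.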
Dan continues to explain that “Matt Cutts confirmed this… but it really shouldn’t be a huge problem under normal circumstances, since it should only take a few days for Googlebot to pick up changes in the robots.txt file, and drop any pages that are disallowed.”
So I would expect wildcard support to be added to the removal tool soon.
Barry Schwartz is the Editor of Search Engine Roundtable and President of RustyBrick, Inc., a Web services firm specializing in customized online technology that helps companies decrease costs and increase sales.