According to an email sent to the owner of a Romanian site, Zoso.ro (reminds me of Led Zeppelin), the DreamHost web hosting & server company is blocking the Googlebot on its customers' high-traffic sites.
According to the DreamHost site, they host over 500,000 ‘domains’ (not sure if they mean parked domains and sites, but that’s probably the case).
This email is to inform you that a few of your sites were getting hammered by Google bot. This was causing a heavy load on the webserver, and in turn affecting other customers on your shared server. In order to maintain stability on the webserver, I was forced to block Google bot via the .htaccess file.
<Limit GET HEAD POST>
deny from 66.249
allow from all
</Limit>
You also want to consider making your files be unsearchable by robots and crawlers, as that usually contributes to high number of hits. If they hit a dynamic file, like php, it can cause high memory usage and consequently high load…
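Ironically, the snippet as quoted may not even do what DreamHost intended. In Apache's mod_access, when no explicit Order directive is given, the default is "Order deny,allow", under which a matching Allow overrides a matching Deny; "allow from all" would therefore let Googlebot (66.249.x.x) right back in. A sketch of a version that would actually deny that range (assuming Apache 1.3/2.x with mod_access enabled, as DreamHost ran at the time):

    # Hypothetical .htaccess sketch, not DreamHost's actual edit:
    # "order allow,deny" evaluates Allow first, then Deny, and Deny
    # wins on a tie, so the 66.249.0.0/16 range is refused while
    # everyone else is served normally.
    <Limit GET HEAD POST>
    order allow,deny
    allow from all
    deny from 66.249
    </Limit>

Either way, denying by IP prefix is a blunt instrument: it cuts off indexing entirely rather than slowing the crawl down.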
I do not think I have heard of a more ridiculous way for a hosting company to 'help' its customers than manually altering their .htaccess files to block Google. Furthermore, asking those customers to block other search bots as well is completely off the hook.
It would be far more cost effective for the customer to choose a hosting service that can handle the dreaded Googlebot, instead of signing their own traffic death certificate by blocking Google and forfeiting all of the incoming visitors that come from being indexed.
[Hattip to SEOpedia]