While various companies and analysts have criticized Google, most of these criticisms are either quickly resolved, negligible given the big picture, or simply false (like DuckDuckGo’s claim that your Google search history will somehow, by magic we suppose, hurt your credit). However, one major complaint has stuck to Google as if the company were nothing but an orb of super-glue. That complaint? That Google is serving up more and more spam results.
Blekko, the small but tough search engine that features human reviews of sites and an innovative “slashtag” search feature, has been one of the leaders in condemning the Google model. As part of their attack on Google’s spam-ridden approach, they’ve released a “spam clock” that gives a running count of the number of spam sites created since 2011 began. Search Engine Land gives us the details.
As of this writing, about 225,000,000 spam sites have allegedly been created, and according to Blekko a million more are made every hour. Of course, while this figure is stunning, it doesn’t represent the number of spam sites that have actually made their way into Google’s top pages, nor does it answer a crucial question: Is Blekko’s approach any better?
While it’s certainly an excellent idea to have humans review SERP data, the sheer size of the web makes Google’s automation the more pragmatic approach if the objective is to search the entire web. The choice may be between more spam-cluttered results on engines like Google and less inclusive results on engines like Blekko, and which is best may often be a matter of taste or circumstance. More pressing is the question of whether automation can address this obviously critical issue while still providing good results. If we don’t base rankings on inbound links and similar data, what can we base them on?