Complaints about Google’s current spam results are whirling around like a hurricane. The intense attention to this critical issue (which began with reports in November of last year) has done at least three things. One: it prompted new search engines to push “anti-spam” features. Two: it increased attention to Blekko, the search engine that currently has the strongest “zero spam” platform. And three: it prompted Google to pay greater attention to anti-spam development.
While it was generally accepted that Google knew about the spam issues and was working on them, we received an official statement from Matt Cutts, the Google representative who often serves as its public face, especially to webmasters. Cutts indicated that “to respond to the [spam] challenge, we recently launched a redesigned document-level classifier.” The classifier is designed to make spamming the search engine harder, both by recognizing pages that are entirely spam and by tagging “spam sections” of pages (such as blog comments used for no purpose other than promoting a site).
Google isn’t stopping there, however. Once the algorithm has been adjusted to recognize spam more fully, Google will turn its attention to eliminating the low-quality or duplicate content used to “fluff” a site. Since content is so crucial to a site’s success, many webmasters take one of two easy ways out. Option one is simply stealing content from another site. Option two is paying writers a pittance to create it. These writers often don’t speak English as their first language, and the content itself (often paid at $0.01 per word or less) is usually absolute garbage.
Once these low-quality content farms are put up against the wall, some will likely perish, while those able to create insightful, useful, unique content will still find work in the field. Better yet, users won’t have to sort through promotional filler to find valuable content.