
Why Google Gets Gamed (and Why That Won’t Stop)

Google is facing a big problem with spam. That’s not a secret. In fact, it’s why Google released a bloodthirsty Panda into the wild to destroy as much spam as was physically (or electronically) possible. However, while some of the worst offenders were removed – especially when it came to content mills and more blatantly thin sites – the Panda algorithm update didn’t fix things. Here’s the way one of my Twitter followers so aptly put it: “Why does every Google search result I come up with redirect me to advertisers?”

By advertisers we’re referring to re-sellers of some outside product, subscription, or service. These are typically affiliate groups who get a big bonus (ranging from $15 to $100) when you sign up for something as harmless as a free trial. So, why do these affiliate sites so entirely outrank pages that, you know, actually answer the question? To answer this thoroughly, let’s take a quick trip in Rob’s magical time machine, visiting a time long, long ago: the mid 90s.

The Birth of Link-Based Algorithms

In 1993, the Mosaic browser was released and the world was given access to a graphical web. It wasn’t fast, it wasn’t real pretty, but it was visual – and that meant the start of a public internet. Sites were originally categorized and put into independently-owned directories, and major directories were the primary way typical users navigated the chaos that is the web. Then innovative search engines took the field, with key players at the time including Yahoo!, Lycos, and AltaVista (and feel free to give those names a moment of silence).

These original search engines looked through actual web pages, indexing the content found inside. They actually read the text and matched it up to your query. The theory was simple: If a site uses the words you’re searching for, it’s probably on the topic you’re searching for.

Quickly, however, the search engine result pages plummeted into chaos. The reason was the first wave of search engine optimization, which recognized how the sites were being ranked and tried to put in the exact right number of keyword repetitions (that’s where the “keyword saturation” idea, and the obsession some SEOs still have with it, comes from).
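The “keyword saturation” idea can be sketched in a few lines. The exact formulas the early engines used were never published, so this is only an illustration of the general concept those first-wave SEOs were reverse-engineering:

```python
import re

def keyword_density(text, keyword):
    """Fraction of a page's words that are an exact match for the keyword."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return hits / len(words)

# A hypothetical over-optimized page: 3 of its 8 words are "cheap"
page = "cheap flights cheap hotels book cheap flights today"
density = keyword_density(page, "cheap")  # 0.375
```

With a metric this naive, whoever repeats the keyword at the “right” rate wins the ranking, regardless of whether the page is useful, which is exactly why the results collapsed into chaos.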

The gaming got bad fast. Beyond simply optimizing their sites for the given keywords, webmasters and first-generation SEOs were creating duplicate pages that had the exact same content but were located at different addresses. An entire page of the search results could be from the exact same publisher.

That’s where Google came in. Larry Page and Sergey Brin were certain they could make a better search. This time they wouldn’t be using keywords found within a site as the mode of ranking (although it would, and still does, play a role). Rather, they were on a hunt for a method by which site popularity could be automatically tracked. What they came to was the idea of link popularity, and in the early phases, that was largely based on the site’s PageRank.

PageRank is, in the broadest (and admittedly least thorough) summary possible, a simple way of indicating how many links point to a site. Links are more powerful if they originate from sites that have a high PageRank themselves. So, theoretically, as people went around sharing the link – be it on their own site, on a blog, or on any other medium that could be indexed – those links would serve as evidence of the high quality of the site being linked to.

Immediately, Google’s algorithm gave results that were substantially better than those on Yahoo! and other major search sites (which were basically just spam). However, it didn’t take shady SEOs long to catch up.

The Next Generation of Spam

Search engine optimization is, overall, a good thing. It’s a way by which webmasters can communicate with the search engines, telling them what their site is about, what they want to be discovered for, and so on and so forth. The problem is that there are certain optimizers who use a set of tactics that sabotage the aim of the search engines (which is providing high-quality content for searchers). In the Google era this spamming has happened in the form of link generation.

More inbound links bolster your search engine ranking, so it’s not hard to see the solution to increasing rank: Get more links! Rather than getting these links by word-of-mouth marketing, providing guest content, or otherwise earning the links, however, most spam groups these days rely on the network of sites that sell links, the self-publishing article sites, and other mediums that allow you to push links by yourself.

The Capital Power Conundrum

So the quick answer to why Google provides spam results is that, since they have an automated system, people are always trying to go through and game the SERP. It’s impossible to afford a manual management system and it’s impossible to spam-proof an automated one. But there’s one other question that deserves attention: Why is it just spam sites doing this?

Well, obviously, it’s not just spam sites, but the sites that are re-selling for someone have a huge advantage: an advertising and SEO budget. Since they can make a huge amount of money with every signup on the site, they generate hundreds, thousands, or tens of thousands of dollars each month that can be turned inward for buying links, article submission packages, and so forth. As the profit (i.e., capital) generated is re-invested, they become more profitable and more powerful.

What about the sites you actually want to see? What you’re probably looking for is an informational site that provides direct, honest, and unbiased information. In other words, you’re looking for a group who will charitably give you information with little hope of a large return. While you may be fine with an ad or two on these sites, the overall revenue for the publisher and for each advertisement you click on is substantially lower. So the honest, informational sites have less incentive to hire SEOs and they have less starting and ongoing resources to invest in optimization.

Why Social Web, Curation, Etc., Won’t Stop Gaming

There have been numerous propositions for how this situation can be improved or fixed. Social sharing as a ranking factor (including Twitter shares, Facebook likes, and Google +1s) sounds great. Human curation and contributions (such as we see on Blekko) are fantastic conceptually. Here’s the problem: The moment one of these factors becomes significant enough, it absolutely, definitely will be gamed.

The social web is on the rise, and once it hits the mainstream, you can bet that there will be groups who create hundreds – even thousands – of social accounts for the simple purpose of selling you their “likes.” In the same way, if Blekko became popular, then SEO groups would hire third parties or develop networks to game the human contributions.

For every step the search sites take toward “fixing” the problem and establishing countermeasures, the SEOs have come up with one more way to get around the countermeasure. The only long-term fixes would involve either an idealistic (probably impossible) way to detect those who are gaming the search sites or full, single-group human curation (which is far too expensive to pull off). Sadly, there isn’t a quick fix. The search sites will continue to create well-rounded algorithms that are more likely to surface good results, but it’s impossible to “kill spam” completely – at least in today’s world.

The good news is that webmasters who don’t invest in gaming will still see the best long-term results. Focusing on quality, basic promotion through guest blogging or social sites, and honestly providing value will get organic results over time – and won’t be tossed to the sidelines with algorithm updates.

And, on the whole, we’re improving. The number of spam sites filtered out of the top results has multiplied like bunnies, and upcoming ideas such as social feedback, site blocking, and other user contributions will have a valuable role to play. The mistake is thinking that these new ideas are solutions, when in fact they’re just one more territory where the search sites will wage war with the black-hat SEOs. But each one of these territories gives Google, Bing, and other search sites an additional advantage – and the overall picture, believe it or not, is getting gradually less spammy.

Rob has been insatiably obsessed with Google, search engine technology, and the trends of the web-based world since he began life as a webmaster in 2002. His work as an SEO consultant since 2006, and his subsequent move to content writing for technology- and internet-focused publications, has done nothing but fuel this passion.

8 thoughts on “Why Google Gets Gamed (and Why That Won’t Stop)”

  1. Google itself invited the spam game by being liberal with its AdSense policies. Every other spam page I see on the internet is full of nothing but AdSense. In a way, spam has become an industry in itself: the spam writers, the spam SEOs, and the spam buyers. One of my friends publishes almost 100 pages daily that are nothing but a bunch of keywords with content written around them. The content is just rubbish, with no human meaning and no use to anyone, but it’s surrounded by AdSense. By its rules of thumb Google reaches those pages: they have fresh content (irrespective of its usability), so they are crawled and well ranked (because of off-site SEO). So I believe Google has to control its AdSense business (which is least likely to happen) to deliver quality content on the SERP.

    1. I’m not going to make a judgment on whether Google is responsible for spam SEO (AdSense is just one of many groups, after all, and others may have stepped in). If Google really is anti-spam, though, running AdSense as they currently do represents a real conflict of interest.

      1. Rob, so you are also of the opinion that, in one way or another, Google’s ad business is a conflict of interest with their search engine. Unless Google introduces some limitations on AdSense, they cannot come out of the spam jungle clean.

  2. No, it is not true that “it’s impossible to spam-proof an automated one.” We do it. It is a matter of policy, not technology. DuckDuckGo, Blekko, and our SiteTruth system all block far more spam than Google does.

    We, as SiteTruth, start by requiring that commercial web sites properly identify the organization behind the web site. Is there a business name and address on the web site? This is a legal requirement in many jurisdictions. If a site has ads, we consider it commercial. No identifiable business means a low ranking.

    Once we identify the business, we do automated due diligence.  We use hard sources like SEC filings and D&B credit ratings, not junk rating sites like Citysearch and Yelp. 

    This works much better than “crowdsourcing”. Crowdsourcing is very spammable, as Google discovered when they merged Google Places data into web search results last October. Lying to Citysearch has no penalties. Lying to Google can reduce the ranking for a single domain. Lying to Dun and Bradstreet can cut off your business credit. Lying to the SEC can send you to jail.

    Affiliate sites, spam blogs, and other junk just disappear. Then we can start making finer distinctions, like observing that the line of business of the company behind a site is “ad agency”, or that  a business has too many domains for their annual revenue.

    So don’t go around saying it’s impossible.

    SiteTruth technology is covered by U.S. Patent #7,693,833; other patents pending.

  3. As you just said, follow the money. While there is money to be made, the issue will always exist. It’s not really a new issue – direct mail cons existed before the net/web, and human nature being what it is… We’ll always need technology and our own decision-making capability to beat it; it’s inevitable, unfortunately.

  4. Hi Rob, I thought this was a serious online magazine, but since my comment has been censored I must mark SEJ down as a non-trustworthy site. If you want to reply to me privately as to what was wrong with my comment to deserve such censorship, I’d be thankful to you.
