Negative SEO: What It Is & How to Protect Yourself


With recent events such as unnatural link warnings from Google, over optimization filters, and blog networks being deindexed by Google, concern that links can harm a website has risen again.

The biggest concern is the issue of ‘negative SEO’: sabotaging a competitor’s rankings to help you move past them on the search engine results pages.

Many webmasters are concerned about what negative SEO is and how they can protect themselves from it. This post will tell you what to do about it.

Negative SEO: It’s Very Real

…under certain conditions.

First, let’s imagine that the search engines tolerate a certain threshold of optimization for a site before they penalize it for being over optimized.

Let’s call this the ‘over optimization cup’.

To visualize this, just imagine that every site starts with an empty cup. Once the cup gets overfilled, a site is at risk for getting penalized.

A larger, more authoritative site such as Amazon has a much larger cup; it can tolerate more dirt building up. ‘Dirt’ here means spammy links, blatant keyword stuffing, duplicate content, or anything else that isn’t squeaky-clean white hat SEO.

In contrast, a smaller, newer site has many forces working against it. For example, it might not have much content or links pointing to it. So right off the bat, we can see a few reasons why the search engines wouldn’t rank this site in the same category as Amazon:

  1. Thin content – e.g. duplicate content
  2. Manipulative on-site SEO – e.g. excessive footer links or keyword stuffing
  3. Spammy link profile – e.g. blog network links, mass directory, forum, or social bookmarks

In comparison, Amazon has an overwhelming number of quality links, loads of user generated content, excellent site design, and a quick page load time. It’s easy for Google to crown Amazon as an authority because of all the positive signals the site has.

It’s also easy for Google to identify the small sites as cockroaches in the niche. And when the cockroaches mess around too much, they get stomped on.

Right now, some of you might be jumping up and down because you think you can take your competitor’s site out of commission by pushing it over the top with some negative SEO. And in some cases, it might work (more on that later).

But if you’re going after a giant like Amazon, you’re not going to succeed.

Here’s why.

Larger Sites Have A Higher Threshold for Spam

Naturally, larger sites have a stronger overall presence (such as # of quality inbound links, in-depth content and domain age). This means their over optimization cup is much larger because the search engines trust these sites more.

To put things in perspective, think of a small site as a sailboat and a large site as a battleship.

Which one can take more abuse?

The battleship.

How To Protect Yourself

There’s no groundbreaking cure to defending your site against negative SEO – all you need to do is follow the standard SEO guidelines of having:

  • A website with useful content that helps people.
  • Solid site architecture.
  • Great website design for usability and trust purposes. Nowadays, design is marketing.
  • A relatively clean link profile that shows the search engines you aren’t doing anything overly manipulative.

None of this is new, but with the uproar about over optimization, people are scrambling to figure out how to stop negative SEO. If you have any doubts about your website, be sure to refer to Google’s own guide on how to build a high-quality website.

Furthermore, if you need a better idea of how to do SEO in moderation, this infographic is a great resource:

[SEO strategies infographic, by SEO Book]

Don’t Waste Your Time With It

By now, you might be thinking that negative SEO is something you should add to your arsenal of tricks so you can ultimately make more money and laugh as your competitors go down.

Don’t do that.

Not only is it unethical, it’s a misuse of time.

Executing a negative SEO campaign requires:

  1. Time
  2. Money
  3. Tools
  4. A lack of morals

But why is it a misuse of time? Because there’s no guarantee that you will succeed. So instead of allocating valuable resources based on the possibility of taking down a competitor, it would be more efficient to pour those resources into your own site.

Negative SEO: Myths, Realities, and Precautions

SEOmoz did a Whiteboard Friday on different negative SEO techniques.

Rand Fishkin did an excellent job defining some characteristics of high risk and low risk websites:

High risk:

  • You have engaged in spammy link building
  • Manipulative on-site stuff
  • Your site has few brand signals

Low risk:

  • Clean backlink profile – no manipulative linking (at least not intentionally). Note: Everyone is going to have some bad backlinks, given the number of scrapers, crawlers, and other bots out there.
  • Clean, high quality design/interface
  • A site that doesn’t feel SEO’ed – Whenever you have that sixth sense that a site feels manipulative, it probably is. Examples of sites that don’t feel SEO’ed: Zappos/Amazon/TechCrunch/SEOmoz
  • Strong brand signals – e.g. brand name searches
  • Lots of people searching for your brand name – e.g. social media, press
  • Lots of user and usage metrics

If you’re ever wondering if a certain website is ‘high risk’ or ‘low risk’, simply refer to the characteristics above to help you classify a site.


With all the ruckus about over optimization right now, it’s important to arm yourself with knowledge so you can react properly to the situation. In this case, all you need to do is to continue to follow SEO best practices and deliver great value to your customers. To give yourself more of an edge, try being remarkable. It’ll take you much further than a negative SEO campaign.

What are your thoughts on negative SEO?

Image Credit: Fotolia and Jai Mansson

Eric Siu
Eric Siu is the User Growth Lead at Treehouse, an online technology school that teaches coding, web design, how to build startups, and more.
  • Dan Thies


    If you want the facts about what you called “very real,” they’re “very different” from what you think. Rand was good enough to correct the error he made in his video, on the post at SEOmoz. It’s a little disappointing to see it repeated here.

    • Eric Siu

      I already replied to you via Twitter and I said you’re more than welcome to clarify your statements. Looking forward to hearing it.

    • John S. Britsios

      Dan, I must agree with you.

      Eric, you asked for a case study?

      I am dealing with a customer who has excellent site UX, a PR5, and a very clean link profile, who was ranking #1 for more than two years for the most competitive terms in his industry, and who has been attacked by “negative SEO” (splogs) twice: once in December 2011 and once in April 2012.

      On the 4th of January his rankings dropped. His biggest term, for example, fell from #1 to #7, and many more terms dropped in the same way.

      At some point he filed a reconsideration request, and on the 27th of March 2012 he got an email from Google for the first time. It said he had some on-page issues as well as inorganic/unnatural links, which together violated their guidelines, and it demanded that he fix the on-page issues, go after the webmasters to ask them to remove the links, and update Google with his results.

      The first thing we did was fix the on-page issues and file a reconsideration request, this time telling Google that the links were not created by him and that they should disregard them.

      Google replied, this time without mentioning any on-page issues, but still mentioning the bad links.

      Didn’t Rand Fishkin say that you only need to tell Google to disregard the links? That would have been a sweet dream if it were “very real”!

      At last, with the 24th of April 2012 Panda update, traffic went up, but a few days later, on the 27th of April 2012, he was hit by the Penguin algorithm. His major term was not to be found in the first 200 results!

      I forgot to mention above that the customer hired me at the end of March, when he got that message from Google, and that I then began taking action.

      First I checked whether there were any on-site issues. On the 3rd of May 2012 someone wrote up most of the things I was already checking; I added a comment there because I thought he was reading my thoughts. 🙂

      In the end, the only things I found wrong with his site were that he was unintentionally spamming RDFa ratings and reviews, and that the page headers contained a link to a JavaScript file hosted on another site which was not accessible because it returned a 403 HTTP status, something that was probably perceived as suspicious.
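The 403 issue described in the comment above is easy to check for yourself. A minimal sketch (standard library only; the `is_suspicious` heuristic is an illustrative assumption, not a known Google signal) that reports the HTTP status of an externally hosted resource a page depends on:

```python
import urllib.error
import urllib.request

def resource_status(url, timeout=10):
    """Return the HTTP status code for a URL using a HEAD request."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        # urlopen raises on 4xx/5xx; the status code is still available
        return e.code

def is_suspicious(status):
    """Flag statuses that make a referenced resource look blocked or
    broken, like the 403 described in the comment above."""
    return status == 403 or status >= 500
```

Running `is_suspicious(resource_status(url))` over every external script or stylesheet URL a page references would surface cases like the one above.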

      At the same time, I was working on what Bruce Clay now calls “Link Pruning”.

      We finished all of those tasks last week and contacted Google with a summary of our efforts, and we are now waiting for a reply or, failing that, at least a recovery.

      My client knows he probably shouldn’t expect miracles like jumping back up to #1, but returning to page one should be the minimum.

      I will update you guys when I have some news.

      Any further questions?

  • Aman Singh@seoinsiter

    The whole list is true. I read the whole infographic and realized that your links should never all point to the home page. Instead, a link should point to the page that actually relates to the page the link came from.

  • Satya

    Negative SEO is something you can’t do squat about. These days any idiot with a huge Scrapebox list can bring a mom-and-pop site down. So your post title, “How to Protect Yourself,” is a misnomer.

    The only way one would stay protected from negative SEO is if Google stopped ignoring the elephant in the room and fixed their broken Pandas and Penguins 😉

    • Eric Siu

      Hi Satya,
      Can you provide some case studies on this?

  • Norm

    No article on this subject would be complete without recommending that people set up Google Alerts. You should add all of your important keywords, your company name, and your URL as Google Alerts set to “Everything.”

    • Eric Siu

      Thanks Norm – that’s a great idea!

  • Nick Stamoulis

    I think that as long as you’ve stuck with white hat SEO, you shouldn’t have too much to fear from negative SEO. Any potential harm an unscrupulous competitor might do to your site will be offset by all the good work (both on-site and off) that you’ve been doing. I think too many site owners believe that a few bad links are going to ruin their site, but as long as you’re toeing the line, chances are your white hat SEO will counter any negative SEO.

  • Eric Siu

    Hi Nick,
    For the most part, I agree with what you’re saying. This is assuming that the ‘negative SEO’ we’re talking about is strictly dirty backlinks. However, as seen in Rand’s video, there are different ways of doing negative SEO, such as hacking a site and blocking it with robots.txt.

  • Jessica Rosengard

    Excellent post. I think the most important message is not to over optimize in an obsessive way. Since Google is constantly changing their algorithm, this is actually an ever changing issue which requires constant attention. Thanks for sharing this info!

  • matt

    There seems to be no clear way to protect your site from it. You just need to monitor things and keep building quality links.

  • Chris C

    Good stuff Eric! I’m a little skeptical of Rand’s distinction between Just Good Cars and Zappos in terms of “feeling SEO’d.” Zappos litters their homepage with links the same way JUC does. The only real differences I see are:

    1- The obvious difference in aesthetics
    2- JUC has the word “used” at the beginning of each link.

    Would you say this is enough to distinguish between what Google would identify as a “manipulative” footer vs. a natural one? Or rather, is this what Rand is essentially saying?

    • Eric Siu

      Hey Chris,
      Honestly I think Rand might be referring more to these sites from a design standpoint. While the footers might be manipulative, I don’t think that’s enough of a signal to give these brands a ding because they have so many other positive signals.

  • pete

    Good article. It’s ridiculous that Google lets these things happen. How can people who are starting a new business in a competitive market protect themselves against negative SEO?

    • Eric Siu

      Thanks Peter,
      It’s easier said than done, but all it’s going to take is unwavering effort and a lot of time to build out a high quality site.

    • Patrick Allmond

      The reality is you can’t. Google doesn’t know who creates links. It cannot tell a good link from a bad. This will continue to be a problem as long as the internet and hyperlinks are structured the way that they are. Since you can game regular positive SEO you can game negative SEO. They are the same thing.

  • Eric Siu

    Pete* 🙂

  • Richard

    To me, it’s amazing that a company with the “brainpower” of Google cannot counteract negative SEO. It seems so simple. They have come up with a formula to penalize sites that supposedly have bad inbound links, which is exactly what competitors use to try to destroy their competition. Using that same formula, instead of imposing a penalty, they could just de-emphasize those links and not factor them in at all. Wouldn’t that be easy?

  • Eamon Moriarty

    Ideally, websites should be assessed entirely on content, design, and usability. Unfortunately, the search engines have not yet found a way to do this reliably.
    Google and other engines have been trying to tweak their assessment of links as an indicator of the value and relevance of websites but have failed to perfect it. So Panda and Penguin have destroyed perfectly good and relevant websites which are not involved in any spamming, scamming, or shady practices – many of them providing a good service.
    I agree with Richard. Why not just give no value to links which are perceived to be ‘bad’? Imposing a penalty and sending good websites to the sin bin is unfair and also skews Google’s own results.
    Many small online businesses are getting fed up with Google’s manipulations. They don’t have the time or resources to keep up with the constant changes.

  • Pete

    You guys are right TO AN EXTENT… If a site has age, authority, and a huge link profile that is all white hat, a few thousand spam links will not bring the site down altogether.

    But what about mom-and-pops that take the time to develop social pages and gather an audience, reach out to bloggers and guest post, list by hand in related, moderated directories, and leave meaningful comments on related blogs, all by hand? These people can only build up a small link profile; over time they grow through organic backlinks, but again, that takes time. What if, in the midst of this, a 6-month-old site that is a registered business and ranking in the top 3 gets hit with over 5,000 spammy comments all targeting one keyword?

    I can tell you what happens: in just a few days we experienced a 10-position drop!

    Let me give you an example… We have 3 separate sites, all selling a similar product, all in the same niche.

    For our purposes, I will say one site is selling Toyo tires, one is selling BF Goodrich tires, and the last is selling Eagle tires.
    Let’s also assume that all 3 are ranking on page 1 for related keywords, all have social media accounts, all have guest posts linking to them, all have over 100 pages of content, and all are 6 months old. They all use the same CMS and have related articles.

    One of the 3 sites was hit with 5,000 spam links on various blogs, where the anchor text is our main keyword and the comment is our keyword repeated multiple times. Using this example, the spammer linked the keyword “Toyo tires” to our URL, and in the comment he wrote “toyo tires, toyo tires, toyo tires.” Within a few weeks (once Google indexed the links), that very same site started to slide…

    There is not a whole heck of a lot that newer sites just coming into their own can do to fight this behavior. To restate: 3 sites, similar age, similar niche, different content, similar marketing strategy, and the one that got spammed is falling while the other 2 are fine. That is not a coincidence.
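The pattern Pete describes, thousands of links all carrying a single exact-match anchor, shows up clearly if you tally the anchor-text distribution of any backlink export. A minimal sketch (the 50% warning threshold is an illustrative assumption, not a known Google cutoff):

```python
from collections import Counter

def anchor_concentration(anchors):
    """Return (top_anchor, share): the most common anchor text and the
    fraction of all links that use it."""
    counts = Counter(a.strip().lower() for a in anchors)
    top, n = counts.most_common(1)[0]
    return top, n / len(anchors)

# Example profile dominated by one exact-match keyword, as in Pete's story
anchors = ["Toyo tires"] * 5000 + ["example.com"] * 120 + ["click here"] * 80
top, share = anchor_concentration(anchors)
if share > 0.5:  # illustrative threshold, not an official number
    print(f"Warning: {share:.0%} of links use the anchor '{top}'")
    # prints: Warning: 96% of links use the anchor 'toyo tires'
```

A natural profile tends to spread across brand names, bare URLs, and generic phrases; a single anchor dominating like this is the kind of signal that is easy to spot in an export.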

    • Tim

      Pete, you could have helped insulate yourself by combining all three websites into one… that way your total backlink count would have been the combined total of the three sites. Also, your total number of indexed pages would have been tripled. Both of these factors could have raised your “strength” by a factor of 9!
      Instead you have 3 weak sites (I’m guessing you have a keyword-loaded domain for each, which has just been the focus of the recent EMD Google update), and you’re tripling your time requirements (now social media work has to be done for each website instead of one, plus updates, etc.).
      It’s pretty clear you are not a “mum and dad blog” operator; you’ve intentionally gone and tried to game the rankings… So, if you’re serious, start over again with ONE strong domain, not three weak ones…

  • John

    A competitor has initiated a negative SEO campaign. I can see from my raw GWT data unsolicited links by the bucketload from spammy sites, many of which show a 404 error when I try to visit them. What should my strategy be to recover from the damage? I have had no warnings from Google, but my SERPs have suffered.
    Has anybody out there had a similar problem?
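For a situation like John’s, a practical first step is triage: take the “links to your site” export from Google Webmaster Tools and group the unsolicited links by domain, so removal outreach (or documentation for a reconsideration request) can start with the worst offenders. A minimal sketch, assuming a one-column CSV export with a header row (the exact export format is an assumption and varies):

```python
import csv
import io
from collections import Counter
from urllib.parse import urlparse

def read_link_export(csv_text):
    """Parse a one-column CSV export of linking URLs, dropping a header
    row such as 'Linking URL' if one is present (format is assumed)."""
    rows = [r[0] for r in csv.reader(io.StringIO(csv_text)) if r]
    if rows and not rows[0].startswith("http"):
        rows = rows[1:]
    return rows

def domains_by_link_count(link_urls):
    """Group linking URLs by domain, most prolific first, so outreach or
    documentation can start with the worst offenders."""
    counts = Counter(urlparse(u).netloc for u in link_urls if u.strip())
    return counts.most_common()

# Tiny made-up export for illustration
sample = (
    "Linking URL\n"
    "http://spam-blog-1.example/page1\n"
    "http://spam-blog-1.example/page2\n"
    "http://spam-blog-2.example/x\n"
)
print(domains_by_link_count(read_link_export(sample)))
# prints: [('spam-blog-1.example', 2), ('spam-blog-2.example', 1)]
```

With the domains ranked, a handful of sites usually accounts for most of the spam, which keeps the outreach (and the record you can show Google) manageable.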