
How to Deal With an Unnatural Links Penalty on a Budget

Many websites, businesses, and employees have been affected by Google’s recent clampdown on spammy and manipulative link practices. It is becoming an all too common problem for small businesses, and we are increasingly finding that new clients have been affected by penalization and need help finding a resolution.

Whether a site has been hit by a manual penalty or has suffered a drop in traffic following a Penguin update, there is always a lot of work required to get a penalty lifted. Many of the tools needed are expensive, and many of the popular methods require a lot of manual work. This post offers some tools, guidelines, and methods for reducing the time and cost of penalty removal work.

Identifying Penalties

The first step to any kind of penalty recovery is figuring out whether you have a penalty and what kind of penalty it is. With manual penalties this is nice and easy, as Google just tells you in Google Webmaster Tools. Algorithmic penalties are a bit trickier, and you normally have to break the old correlation ≠ causation rule when deciding that one is present.

When dealing with new clients, we try to qualify them as quickly as possible – often clients don’t even know that they’ve been hit by a penalty. A couple of tools can give you a quick indicator:

Website Penalty Indicator – Cost $0
My company developed this tool recently (built using SEMrush traffic data). It can be used for a quick visual check on traffic trends, and on whether a website has seen any large drops following Google update announcements (or not, as the case may be…)

Website Penalty Indicator Screenshot

Screenshot taken 17/02/2014 of feinternational.com

ahrefs – Cost $0 (paid plans available)
Our backlink tool of choice, ahrefs is widely considered in the SEO community to have the largest and most accurate index. For our budget-conscious analysis we’ll be making use of the free account, which gives you restricted data and only 3 lookups a day – but that is perfectly adequate for our needs at this stage.

ahrefs Link stats dashboard

Screenshot taken 17/02/2014 of ahrefs.com

By looking at a site’s backlink numbers and their anchor text ratios you can get a pretty good idea of how manipulative its backlink profile is, and whether it might be worth doing some further investigation.

ahrefs Anchor Cloud

Screenshot taken 17/02/2014 of ahrefs.com
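To make the anchor-ratio check concrete, here is a minimal Python sketch of the idea. The anchor list below is hypothetical; in practice you would read the anchor text column out of an ahrefs backlink export.

```python
from collections import Counter

def anchor_ratios(anchors):
    """Return each anchor text's share of the total backlink count."""
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    return {text: n / total for text, n in counts.most_common()}

# Hypothetical data -- in a real check, read these from an ahrefs CSV.
anchors = ["cheap widgets"] * 60 + ["example.com"] * 25 + ["click here"] * 15
ratios = anchor_ratios(anchors)

# Branded and bare-URL anchors are expected; a single commercial phrase
# dominating the profile is the classic sign of manipulative link building.
for text, share in ratios.items():
    print(f"{text}: {share:.0%}")
```

A profile where one money keyword accounts for well over a third of all anchors would normally justify the deeper investigation described below.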

Screenshots from these tools along with a brief explanation should be enough to convince most prospective clients that it is worth doing some more detailed investigations, giving you deeper insight into their situation.

Panguin – Cost $0
This is a fantastic tool for identifying penalties, but it does require you to have access to the Google Analytics account of the site in question, so it is not much use for pre-qualifying. However, once you do have GA access, it is incredibly useful. Once you log in, you choose one of your Profiles (Views) and Panguin will overlay Google update data on your Google organic search traffic.

Panguin tool

Screenshot taken 17/02/2014 of barracuda-digital.co.uk

You are looking for an update followed by a fall in search traffic (or a series thereof), which could imply a penalty is present. Make sure to compare the data to previous years to ensure it isn’t simply a seasonal fluctuation.
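The visual check Panguin performs can also be approximated in code. This is only a sketch under assumptions: weekly organic session counts exported from Google Analytics, and a hand-maintained dictionary of update dates (the `UPDATES` entry and the traffic figures here are illustrative).

```python
from datetime import date, timedelta

# Illustrative update list -- maintain your own from Google's announcements.
UPDATES = {"Penguin 2.1": date(2013, 10, 4)}

def drop_after(sessions, when, weeks=4):
    """Average weekly sessions in the N weeks after `when`, relative to
    the N weeks before. Returns None if either window has no data."""
    before = [v for d, v in sessions.items()
              if when - timedelta(weeks=weeks) <= d < when]
    after = [v for d, v in sessions.items()
             if when <= d < when + timedelta(weeks=weeks)]
    if not before or not after:
        return None
    return sum(after) / len(after) / (sum(before) / len(before)) - 1

# Hypothetical GA export: weekly organic sessions keyed by week start date.
sessions = {date(2013, 9, 6) + timedelta(weeks=i): v
            for i, v in enumerate([1000, 980, 1010, 990, 600, 590, 610, 580])}

for name, when in UPDATES.items():
    change = drop_after(sessions, when)
    print(f"{name}: {change:+.0%} change in average weekly organic traffic")
    # To rule out seasonality, run the same comparison for the identical
    # window one year earlier before blaming the update.
```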

SEMrush – Cost $0 (paid plans available)
A sublime tool for competitor research, SEMrush makes use of its database of ranking history for millions of keywords to give you loads of valuable data. In the absence of keyword ranking history, the SEMrush overview (free) can give us a great idea of previous trends.

SEMrush Keyword ranking overview

Screenshot taken 17/02/2014 of semrush.com

Combining all these sources you should be able to make a fairly accurate judgment as to whether a site has been penalized or not.

Patrick Hathaway


SEO Consultant at Hit Reach
Patrick Hathaway is an SEO consultant for Hit Reach, delivering client-side and in-house SEO campaigns. He is currently doing LOTS of website audits (SEO and CRO), link removal work and content marketing campaigns. When he isn't working, Patrick spends most of his time trying to keep up with his baby boy. In the (minuscule) remaining free time, he enjoys playing sport and drinking beer (normally not at the same time).


22 thoughts on “How to Deal With an Unnatural Links Penalty on a Budget”

  1. That’s all good stuff there, I do love Panguin – that’s a great tool to show clients who have penalized sites. I am an LRT Associate with LinkResearchTools in Vienna who make the wonderful Link Detox Genesis. I have their rather expensive Superhero Account and at the moment I have quite a few Link Detox credits spare, so I am offering to do a full detox/backlink audit of any site and produce a disavow file and full backlink report (showing healthy, suspicious and toxic links). I’m prepared to do this for £50/75€/$100 at the moment as I need the experience of looking at different penalized sites. I know this is not as cheap as what Patrick has described, but it will get you results faster. Get in touch ASAP if you want me to help, I am about to have a case study published with LRT, so I can see a busy summer coming up! Thanks again Patrick, there’s a few great Excel tips in there I hadn’t thought of.

  2. This is a very good article. I’d like to give my thoughts as someone who does a lot of penalty removal work, and yes, I’m one of the people whom you have “called BS on”. :) But that’s ok…it’s good to debate topics like this. Here are my thoughts:

    Regarding not needing to check links manually, I have tried several tools and each time I find that they make too many mistakes for me to be happy with the work. Now, in some cases, you can use tools to eliminate the worst offenders and then manually review the rest. But, if you’ve got a manual review, you’re still going to need to check if those worst offenders are live before sending an email. I suppose you could just send emails to everyone, but you’re guaranteed to have site owners respond angrily telling you that that link doesn’t exist anymore. I suppose the argument against this would be to use automated tools to check if your link is live. These tools fail a lot as well. I find that Scrapebox is pretty good at finding Alive vs Dead pages but it’s not 100%.

    The first time I used it, I sent a client a list of the alive pages and he sent me back an email with a bunch of pages that were marked as “dead” that still had links on them. This is true for other tools as well… you’re definitely going to miss some links if you are relying on tools to tell you whether they’ve been removed or not.

    About a year ago we had a huge site that we were working on. We used two different automated tools to determine whether our link was still live on the page, marked the ones that had been removed and then filed for reconsideration. When we failed, we looked back manually at every link that we had marked as “page not found” or “link not found” and found several hundred that we had missed because the tools could not see our link. Very embarrassing. So now, I just check them all by hand.

    What I do is visit one link from each domain by hand. For the links that I have marked “page not found” or “link not found” I take every link that we have from those domains and run those ones through Scrapebox to see if they can pick up other live links from another page on that site.
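    A first pass of this kind of live-link check can be scripted; here is a minimal standard-library Python sketch. The fetch details (User-Agent, timeout) are arbitrary choices, and the naive HTML substring test will miss JavaScript-inserted or obfuscated links, which is exactly why the manual pass described above is still needed.

```python
import urllib.request

def contains_link(html, target_domain):
    """Naive substring test -- misses JS-inserted or obfuscated links."""
    return target_domain.lower() in html.lower()

def link_still_live(page_url, target_domain, timeout=10):
    """Fetch a page and report whether it still mentions target_domain.
    Returns True/False, or None when the page itself can't be fetched
    (dead pages should be rechecked later -- links come and go)."""
    req = urllib.request.Request(page_url,
                                 headers={"User-Agent": "Mozilla/5.0"})
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            html = resp.read().decode("utf-8", errors="replace")
    except (OSError, ValueError):
        return None
    return contains_link(html, target_domain)
```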

    Good tip on using Bing for finding links! Another good source is your Google analytics referral data. Although many unnatural links will send no referral traffic, often the site owner will check the link once after installing it.

    Back to using tools – the tools are not going to pick up certain patterns that are against the quality guidelines. For example one of the definitions of “link schemes” is creating partner pages exclusively for the purpose of cross linking. If you have a reciprocal linking scheme going on and you have been sophisticated about it, I’m not aware of a tool that will catch that.

    I’m not sure about the ethics of hiring someone on fiverr to supply you with a report from a paid tool….but then that’s a discussion for another day.

    I find it interesting that you put your data for your reconsideration requests in dropbox. Matt Cutts once mentioned that if you were going to point to external sites, that you should use Google Docs as the webspam team won’t visit other sites for fear of getting malware. Still, Dropbox probably is a trusted site.

    All in all though, I thought this was a good article. I hope I have not overstepped boundaries by giving my opinions here. They weren’t meant as criticism but rather as a response to the “I call BS” statement. :)

    1. Ha! Marie I love that comment. Thanks for taking the time to share some of your ideas and add a load more value to the post.

      Regarding this idea about not needing to check links. What I was trying to say was:
      (a) In most cases it is sufficient to check the domain, or at least one link on the domain. Sometimes you have to go back and check a few more, but there are a lot of cases where you don’t need to check every article or directory link etc… as you get a good idea from one.
      (b) I wasn’t really advocating using tools to classify links for you. The Excel filtering etc… gives you a list that you can then check manually. Or, as is often the case, if it does reveal a footprint – say 100 directories all with the same anchor text – you can check a few and be pretty confident in a bulk classification of the others.
      (c) I agree on the issues with checking whether links are live using tools. It is not perfect. But then again, the list of links comes from a range of tools in the first place – even GWT lists links which are no longer live. Personally I think this is just the nature of the web – links are somewhat transient (and the worst links even more so). I totally agree that if someone says they removed a link, you should go and check it yourself.
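      The bulk-classification idea in point (b), spot-checking a handful of links from a large group that shares one footprint, can be sketched as a simple grouping step. Python here stands in for the Excel filtering; the data rows are hypothetical, and in practice they would come from a GWT or ahrefs export.

```python
from collections import defaultdict

def find_footprints(links, min_size=20):
    """Group backlinks by exact anchor text; any large group sharing one
    anchor is a candidate for bulk classification after spot-checking
    a few of its members by hand."""
    groups = defaultdict(list)
    for url, anchor in links:
        groups[anchor.strip().lower()].append(url)
    return {a: urls for a, urls in groups.items() if len(urls) >= min_size}

# Hypothetical export rows of (linking URL, anchor text).
links = [(f"http://seo-directory-{i}.info/listing", "cheap widgets")
         for i in range(100)]
links += [("http://blog.example/review", "a review of Example Ltd")]

for anchor, urls in find_footprints(links).items():
    print(f"{len(urls)} links share the anchor {anchor!r}")
```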

      Overall, I was trying to outline a methodology for an in-house SEO or consultant whose employer/client doesn’t have a big budget for this – where any reduction in time/cost would be valuable. Sorry if the ‘BS’ comment caused any offence, I was merely trying to make a point.

      To briefly cover your other points:
      – Google Analytics referral data is a great shout! Never would have thought of that.
      – I’ve used Dropbox many times for sending screenshots, and never had an issue. Possibly you are right that it’s ‘safer’ to use Google Drive.

      Thanks for your input!

      Patrick

    2. Nice one Patrick

      I get what you are both saying – I’m sure Marie knows the usual suspects though :)

      After all natural links don’t come from
      123-SEO-directory.info
      124-SEO-directory.info
      etc

    3. I agree with Marie 100% about using tools. Getting started with them can help find a majority of bad pages, but they can also give you a false sense of security of being “done” when there are hundreds more bad links incoming. It is a huge pain to do this by hand (I recently had a client come to me with a site that had tens of thousands of spammy forum/alternate language/off topic/etc incoming links) but it is the best method to ensure the results are what you want. That said, this is about doing this on a budget, and manually checking a hundred/thousand/10,000 pages is not time efficient and thus not budget efficient. So if money is an issue, then tools are likely the best use of your time. Thanks for sharing this!

  3. Good article! While I typically get 80-90% of the domains from GWT, other tools like ahrefs and Majestic SEO are required for the remaining domains. I must mention a couple of fantastic “free” resources for link collection:

    Webmeup is a pretty decent source for link collection – http://webmeup.com/tools/backlinks.html. Another resource is the good old Moz – it is easy to get a 30-90 day trial pro account. I must confess I converted to a full-time paid subscription because of the other awesome SEO & Analytics features.

    Also agree with Marie – always check at least one link from each domain. Had to do it once for 70k domains for a single client… it was a painfully slow process but worth the time.

    1. OSE is hidden in there already! Yeah, Webmeup is pretty good too – and they have a free tool so it fits with my theme!

      Still disagree on this ‘check at least one link’ thing – it makes sense in principle but see Chris Dyson’s comment above, no way I am checking sites like that!

    1. If a site has been affected by Panda it is normally down to a significant proportion of the content on the site being duplicated or offering very little in terms of unique value. You really need to try and analyse your site objectively and determine whether each of your pages adds value.

      There is no ‘one size fits all’ way to analyse this, but some common things to look out for:
      – Scraper sites stealing your content and outranking you for it (could count against you as duplicate content)
      – Competitors manually stealing your content and outranking you for it
      – The use of manufacturers’ descriptions which are also used on many other sites
      – Content (posts/articles) that are heavily syndicated
      – Tag/category pages that are thin, with very little unique content
      – Over-optimised internal linking strategies

      Identifying what the problems are is the first step, then you need to clean up all the issues and potentially add a lot more unique content.

    1. I’ve not heard this advice from Matt Cutts about not needing to remove links; that seems to go against everything else they’ve ever said on the topic.

      Some SEOs believe you can successfully remove penalties using only the disavow tool; however, we don’t recommend that to our clients as the most effective method. Even if you do get a penalty removed this way, it could take only another Google update lowering the thresholds to bring your site back under penalty. If the links are genuinely gone, they can’t hurt you.

    1. You’re right…there aren’t many published cases of sites recovering from Penguin. I’ve had a couple of cases. In one case the site did more than recover and actually saw an improvement in rankings. However, this is rare. The reason for this is that in order to recover from Penguin, not only does there have to be an extremely thorough link cleanup, but it is vitally important that the site has the ability to attract links naturally. Sites that are able to attract truly natural links are not often the same sites that have used low quality link building methods that would get them into Penguin trouble.

  4. Some good stuff here, both in the article and the comments! I absolutely agree on the Analytics referral data – I consistently find really crappy “one-click wonders” that never show up in any tool (including GWMT).

    To add to the free tools for those who need to do things on a budget, rmoov has a never ending free trial “Basic” plan which allows those who are cash-strapped to run limited size campaigns for as long as they need to for free. Of course we also have paid plans, but decided to make the free plan available after talking to many very small business owners who have been forced into trying to do the work themselves because of loss of revenue.

    We do also provide full support and help with how to deal with penalties for all subscribers, paid and free (shhhh… ;)

    On the subject of live and dead links: This is something to be very careful of. We see incredible transience in URLs and whole sites. We have determined that in some cases this is due to webmaster behavior, sometimes due to the fact that dubious sites tend to utilize unstable or unreliable hosting, and sometimes purely and simply because a site crashed and was restored from backups. With all this coming and going of links, we became very concerned about the integrity of reporting for our client campaigns, so we introduced a “rechecker” last year. This is a second crawler which constantly recrawls links that have been found to be dead throughout the life of a campaign. When links become live again they are automatically reinstated to the campaign and a notation is made in the reports. We have found this greatly improves reporting accuracy. So my recommendation would be that if you are using tools like Scrapebox etc, you need to check for live links more than once to ensure that what you send to the Webspam team is on the money.

    Hmmm…didn’t mean for that to sound like a promotion …edit accordingly if you need to :)

    Sha

  5. Hey Patrick,

    indeed very good stuff in here, but not so much on the Scrapebox Link Checker Tool and the Website Penalty Indicator – neither of which works for me :P.

  6. Good post – I read your content about dealing with an unnatural links penalty. For the last few months I have been doing SEO work in line with Google’s algorithm (Google Penguin); lots of unnatural links were removed but we can’t see any results. Can anyone tell me how much time it actually takes to recover from Google Penguin?

  7. Every action has a reaction, so any action we take in SEO might have an adverse influence beyond our control. We need to follow correct procedures within a framework, since all commercial activities are bound by regulations.

  8. @Ravi Well, hopefully you don’t miss the point here. The links that once helped your website to manipulate the rankings and to rank higher have been detected, flagged and devalued. So removing and/or disavowing them (which, by the way, can take months until the Google bot crawls them again and applies the disavow to those links) doesn’t mean that you get your rankings back – simply because all the “good” links that helped to manipulate the rankings are gone, and you can’t re-establish your old rankings just by removing/disavowing the “bad” ones!

  9. I work in the penalty recovery business. I’ve seen people trying to get out of penalties themselves, in ways like those you’ve listed, and others, but often they fail. They do not always know the ins and outs and what needs to stay and what needs to go. I’ve had clients come to me with ‘I used WMT with no results’ and other such statements. This is because many link removal services have metrics that don’t benefit the user, offering unhelpful results; I’m often talking about Link Detox here. At The Link Auditors, we do not use complex metrics: either a link is toxic or it is not, never ‘suspicious’ or a ‘threat’, labels which lead to unsure results and confusion. Our results are clear, and once received, our clients know exactly which links need to go and which need to stay.

    We have a collection of tools that find toxic backlinks in a number of ways, from sitewide links to links on duplicate IPs, banned domains and more. We also run a link removal service that averages an 80% success rate, which is higher than Link Detox and other services. All our tools are automatic, quick and easy to use; with our service you can have all your toxic backlinks found and removed with a few clicks of a mouse.