
How to Deal With an Unnatural Links Penalty on a Budget

Many websites, businesses, and employees have been affected by Google’s recent clampdown on spammy and manipulative link practices. It is becoming an all too common problem for small businesses, and we are increasingly finding that new clients have been hit by penalties and need help finding a resolution.

Whether a site has been hit by a manual penalty, or has suffered a drop in traffic following a Penguin update, there is always a lot of work required in order to get a penalty lifted. Many of the tools needed are expensive, and many of the popular methods require a lot of manual work. This post offers some tools, guidelines, and methods for reducing the time and cost of penalty removal work.

Identifying Penalties

The first step to any kind of penalty recovery is figuring out if you have a penalty and what kind of penalty it is. With manual penalties this is nice and easy, as Google just tell you in Google Webmaster Tools. Algorithmic penalties are a bit trickier, and normally you have to bend the old correlation ≠ causation rule, inferring that a penalty is present from a traffic drop that coincides with a known update.

When dealing with new clients, we try to qualify them as quickly as possible – often clients don’t even know that they’ve been hit by a penalty. A couple of tools can give you a quick indicator:

Website Penalty Indicator – Cost $0
My company recently developed this tool (built using SEMrush traffic data), which can be used for a quick visual check of traffic trends and whether a website has seen any large drops following Google update announcements (or not, as the case may be…).

[Screenshot: Website Penalty Indicator]

Screenshot taken 17/02/2014 of feinternational.com

ahrefs – Cost $0 (paid plans available)
Our backlink tool of choice, ahrefs is widely considered within the SEO community to have the largest and most accurate index. For our budget-conscious analysis we’ll be making use of the free account, which gives you restricted data and only 3 lookups a day – but that is perfectly adequate for our needs at this stage.

[Screenshot: ahrefs link stats]

Screenshot taken 17/02/2014 of ahrefs.com

By looking at a site’s backlink numbers and their anchor text ratios you can get a pretty good idea how manipulative their backlink profile is, and whether it might be worth doing some further investigation.

[Screenshot: ahrefs anchors]

Screenshot taken 17/02/2014 of ahrefs.com
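If you already have a raw link export to hand (you will be collecting these properly later on), the same anchor text check can be approximated with a few lines of pandas. This is only a sketch – the file name links.csv and the anchor column are placeholder assumptions, so adjust them to match whatever your export actually contains.

```python
# Rough anchor text ratio check from a raw backlink export.
# "links.csv" and the "anchor" column are placeholders - adjust to your export.
import pandas as pd

links = pd.read_csv("links.csv")

# Share of each anchor text across the whole profile.
anchor_ratios = links["anchor"].str.lower().value_counts(normalize=True)

# A natural profile is dominated by brand and bare-URL anchors; a large share
# of exact-match commercial phrases near the top is a warning sign.
print(anchor_ratios.head(20))
```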

Screenshots from these tools along with a brief explanation should be enough to convince most prospective clients that it is worth doing some more detailed investigations, giving you deeper insight into their situation.

Panguin – Cost $0
This is a fantastic tool for identifying penalties, but it does require you to have access to the Google Analytics account of the site in question, so it is not much use for pre-qualifying. However, once you do have GA access, it is incredibly useful. Once you log in, you choose one of your Profiles (Views) and Panguin will overlay Google update dates on your Google organic search traffic.

[Screenshot: Panguin]

Screenshot taken 17/02/2014 of barracuda-digital.co.uk

You are looking for an update followed by a fall in search traffic (or a series thereof), which could imply a penalty is present. Make sure to compare the data to previous years to ensure it isn’t simply a seasonal fluctuation.

SEMrush – Cost $0 (paid plans available)
A sublime tool for competitor research, SEMrush makes use of its database of ranking history for millions of keywords to give you loads of valuable data. If you don’t have your own keyword ranking history for the site, the free SEMrush overview can give you a great idea of previous trends.

[Screenshot: SEMrush]

Screenshot taken 17/02/2014 of semrush.com

Combining all these sources you should be able to make a fairly accurate judgment as to whether a site has been penalized or not.

Total spend so far – $0.

Identifying Unnatural Links

For the purpose of this post, I’m ignoring Panda-type penalties, as those are usually about improving the site’s content itself; I want to focus on link-based penalties. Often the biggest job when working on a penalty case is identifying the links themselves. Sure, you can use premium services like Link Risk and Link Detox, which can help a lot with the identification work – but we are on a budget here, people!

Collecting Links

The first step in any link cleanup is to actually collect as many links as possible – and more data is always better.

Google Webmaster Tools – $0
Although Google have stated it is sufficient to only use the data listed in Google Webmaster Tools, there are two problems with this:

  1. GWT lists different sample links at different times
  2. You still need more link intelligence (e.g. anchor text) to make any kind of decision about link quality

But as a starting point, you can do a lot worse than GWT link data. If you want to be really thorough, log into GWT every day for a couple of weeks and grab the data each time, then collate and dedupe in Excel (or script it, as in the sketch below).
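If the daily spreadsheet wrangling gets tedious, the collation can be scripted instead. A minimal sketch, assuming each day’s GWT download has been saved as a CSV matching gwt_links_*.csv with a URL column (both names are placeholders for whatever your exports actually look like):

```python
# Collate several days of GWT "Links to your site" exports and dedupe them.
# The file pattern and the "URL" column name are placeholders for your own exports.
import glob
import pandas as pd

frames = [pd.read_csv(path) for path in glob.glob("gwt_links_*.csv")]
all_links = pd.concat(frames, ignore_index=True)

# Keep one row per linking URL.
deduped = all_links.drop_duplicates(subset="URL")
deduped.to_csv("gwt_links_combined.csv", index=False)
print(f"{len(all_links)} rows collected, {len(deduped)} unique URLs")
```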

Bing Webmaster Tools – $0
Just because they don’t send you any traffic doesn’t mean you should forget about Bing! Their inbound link data gives you links and anchor text, so it can actually be more useful than GWT.

Fiverr – $5
I know I’ve been spoiling you with all these freebies, but if you can’t afford $5 (£2.99 to us Brits at today’s exchange rate) then you’re probably not charging enough for your valuable service. Why Fiverr? ‘Cause you can make use of someone else’s costly tool subscriptions and just pay them for the link data – one ‘gig’ can get you backlink data from Majestic, ahrefs, AND Open Site Explorer. Best 5 bucks you’ve ever spent.

With all that lot, you probably have enough link data to get going with link classifications. Get all your data in Excel and homogenize the columns as much as possible (a scripted version of this merge step is sketched below).
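Each tool labels its columns differently, so it helps to map everything onto one simple schema before you start filtering. A rough sketch of that merge step – every file name and column name in the mapping below is hypothetical, so check them against your real exports:

```python
# Normalise exports from different tools into a common url / anchor schema.
# Every file name and column name in this mapping is hypothetical.
import pandas as pd

column_maps = {
    "ahrefs.csv":   {"Referring Page URL": "url", "Anchor": "anchor"},
    "majestic.csv": {"SourceURL": "url", "AnchorText": "anchor"},
    "bing_wmt.csv": {"Source Url": "url", "Anchor Text": "anchor"},
}

frames = []
for path, mapping in column_maps.items():
    df = pd.read_csv(path).rename(columns=mapping)
    frames.append(df[["url", "anchor"]])

combined = pd.concat(frames, ignore_index=True).drop_duplicates(subset="url")
combined.to_csv("all_links.csv", index=False)
```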

Link Classification

Before you spend any time even looking at your links, it is worth checking whether the links are still live. Backlink indexes only stay up to date to a point, and the kinds of sites that have been the target of spam get pulled down all the time.

Scrapebox Free Link Checker – $0
Dedupe your linking data and plug it into the free Link Checker incarnation of Scrapebox, which will check whether the links are still live in the first place. For any that aren’t, double-check that no other links from that domain are live, then move these across into a spreadsheet or workbook called ‘Links Removed’. If you are filing a reconsideration request for a manual action, you might as well add these links to your list of links that you managed to get removed and make it look like your work.

Once you have a final list of links that you know are still live, remove duplicates by domain, as we will only be checking links at the domain level (when was the last time you saw a ‘good’ link on a crappy site?).
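If you would rather script the live check than use Scrapebox, a crude equivalent is sketched below: it fetches each page, looks for the target domain in the HTML, then dedupes the surviving links by domain. This is an assumption-heavy stand-in rather than a proper link checker – the file names follow on from the earlier hypothetical merge sketch, and example.com stands in for the penalised site.

```python
# Crude live-link check plus dedupe by linking domain.
# example.com is a placeholder for the penalised site; file names follow
# on from the earlier (hypothetical) merge sketch.
from urllib.parse import urlparse

import pandas as pd
import requests

TARGET = "example.com"

links = pd.read_csv("all_links.csv")
links["domain"] = links["url"].apply(lambda u: urlparse(u).netloc.lower())

def link_is_live(url: str) -> bool:
    """Return True if the page loads and still mentions the target domain."""
    try:
        resp = requests.get(url, timeout=10)
        return resp.status_code == 200 and TARGET in resp.text
    except requests.RequestException:
        return False

links["live"] = links["url"].apply(link_is_live)

removed = links[~links["live"]]                      # candidates for 'Links Removed'
live_by_domain = links[links["live"]].drop_duplicates(subset="domain")
live_by_domain.to_csv("live_domains.csv", index=False)
removed.to_csv("links_removed.csv", index=False)
```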

You Don’t Need To Manually Check Every Link

I’ve heard many times that during the link classification process you ‘should’ check every single domain manually to make sure you don’t accidentally mis-classify good quality links.

I call BS on this.

The reason we have unnatural link penalties in the first place is because SEOs took normal legitimate linking practices and spammed the crap out of them. And how do you scale things quickly? You look for footprints, find one that works – rinse and repeat.

We can very quickly speed up the link classification process by finding ways to reverse engineer these footprints and bulk classify lots of links at once.

Excel Custom Filters

Working in Excel, we’ll use custom filters to try to identify some footprints and reduce the amount of manual classification required. Go to Data->Filter and apply a filter across your entire data set, then select the Domain or URL column, choose Text Filters->Contains, and enter a term such as ‘directory’.

[Screenshot: Excel text filter]

This will likely return all or most of the directory submissions for the site. That doesn’t mean they are all necessarily bad ‘SEO directories’, so cross-referencing against something like anchor text usually helps you make a quick decision on the value/quality of each link. If you manually check a few you’ll quickly see the link builder’s footprint, and you should be able to classify the rest without ever visiting them. I tend to cut and paste links out into new workbooks, so I end up with a ‘Good’ list and a ‘Bad’ list (for future removal/disavow work, the list of Good links is your best friend).
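The same Contains filter translates directly into a substring match in pandas, which makes it easy to pull out a footprint and eyeball the anchor text alongside it. A minimal sketch, reusing the hypothetical url and anchor columns and file names from the earlier sketches:

```python
# Footprint filter: pull every linking URL containing a suspect term and show
# its anchor text alongside, for a quick bulk quality judgement.
import pandas as pd

links = pd.read_csv("live_domains.csv")   # output of the earlier live-check sketch

footprint = links[links["url"].str.contains("directory", case=False, na=False)]
print(footprint[["url", "anchor"]].to_string(index=False))

# Obvious offenders (money-keyword anchors) can be bulk-moved to the 'Bad' list;
# the keyword list here is purely illustrative.
bad = footprint[footprint["anchor"].str.contains("cheap|buy|loans", case=False, na=False)]
bad.to_csv("bad_links.csv", index=False)
```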

Then apply this same principle to a range of different search words:

  • links
  • article
  • forum
  • seo
  • submit
  • search
  • engine
  • guest
  • etc…

Once you spot a particular tactic has been used, try out different variations of this method to see if you can quickly identify any more links.

Rank Cracker – $0
This software is actually designed as a link building tool, but we will use it to do the opposite. It runs through a list of links and determines which automated link building software could be used to replicate them – or, in our case, which automated link building tool was probably used to create the links in the first place.

[Screenshot: Rank Cracker]

Again, don’t use this signal alone – make sure you double-check the results or pair it with anchor text data – but it can be a good indicator that the links are manipulative.

Another cool feature of Rank Cracker is that it provides you with contact details for the sites behind the links it could not identify, which will be useful for link removal work (see below).

Some Manual Checking Is Unavoidable

Once you have exhausted this footprint method, you should have classified a decent number of links, thus reducing the amount of (unavoidable) manual work required. However, even when you are working through a list manually you can take some shortcuts. Filter by anchor text and look for money keywords – it is highly likely that these links are manipulative.

You can also – after a while – begin to judge a link based on its URL alone. If the URL or the domain itself does not look particularly natural, it is unlikely that the link is. Again, you will start to see patterns in the sites and URLs that can lead to further filtering – I have classified thousands of links using this method.

Total spend so far – $5.

Link Removals

Although some believe that you don’t need to bother with link removals, I feel you are doing your client a disservice by not even attempting to get some of their bad links removed. That is a debate for another day however; assuming you do wish to do link removals, here are some budget-friendly methods.

Citation Labs Contact Finder – $10
Citation Labs have a whole range of tools, and you pay for bandwidth depending on how many requests you need to process. The tool that is particularly useful for link removal work is the Contact Finder, which returns contact form URLs as well as email addresses. You start with 10MB of free bandwidth, which can find contact details for around 300-400 domains, but you can purchase 100MB for only $10 – which should be sufficient for most projects.

Tout – $0 (Free Trial)
Once you have all your contact details, you’ll need to email all the site owners to ask them to remove the links. Tout is a great tool for automating this process, as it allows you to bulk upload the data and use dynamic templates to populate each email. It will also tell you who has opened and clicked your email, so you know who is ignoring you entirely. I tend to follow up twice more after the initial request, using different subject headers, typically around 5 days apart – meaning it is possible to complete all your removal requests within the 14-day trial period.
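If the Tout trial runs out, the mail-merge part is easy enough to reproduce with Python’s standard library, although you lose the open and click tracking. A hedged sketch only – the SMTP host, credentials, contacts.csv layout and email wording below are all placeholders:

```python
# Minimal removal-request mail merge - no open/click tracking, unlike Tout.
# SMTP host, credentials, contacts.csv layout and wording are all placeholders.
import csv
import smtplib
from email.message import EmailMessage

TEMPLATE = """Hello,

I am cleaning up the backlink profile for {site}. Could you please remove
the link(s) pointing to our site from {domain}?

Thanks,
Patrick"""

with smtplib.SMTP("smtp.example.com", 587) as server:
    server.starttls()
    server.login("user@example.com", "app-password")
    with open("contacts.csv", newline="") as fh:
        for row in csv.DictReader(fh):               # expects email, domain columns
            msg = EmailMessage()
            msg["From"] = "user@example.com"
            msg["To"] = row["email"]
            msg["Subject"] = "Link removal request"
            msg.set_content(TEMPLATE.format(site="example.com", domain=row["domain"]))
            server.send_message(msg)
```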

One of the useful side-effects of this mass mailing is that many of the email addresses will fail, giving you an inbox full of delivery failure notices. When it comes to collecting evidence for a reconsideration request for a manual penalty, screenshots of these failure notices can serve as further proof to Google that you have carried out plenty of work.

Tip: Make sure to keep a detailed spreadsheet of every website you contacted and the result.

Total spend so far – $15.

Do It Yourself Disavow ($0)

For some reason, one of the ‘selling’ points of the premium tools is the generation of a ‘Google friendly’ disavow list. This is probably because lots of webmasters do it wrong – but the process is really very straightforward.

You have already classified all your links, so you know which ones you don’t want Google to count. Use this as the basis to build your file, again working in Excel:

  1. Paste all of your URLs into a fresh worksheet, then extract the domain if you don’t already have it
  2. Remove duplicates by domain
  3. In the second column, type ‘domain:’ and fill down all the rows
  4. In the third column, use the CONCATENATE function to merge the two cells (see image below)
  5. Copy and paste all the values from column C into a Notepad document and save it as a plain .txt file

[Screenshot: CONCATENATE formula]
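The five steps above can also be scripted end to end. A sketch that builds the file from your ‘Bad’ list (the bad_links.csv name and url column are assumptions); the output format – one ‘domain:’ entry per line, with ‘#’ for comment lines – is the one the disavow tool accepts:

```python
# Build a disavow file: one "domain:" entry per unique linking domain,
# with "#" lines as comments. "bad_links.csv" / "url" are placeholder names
# for your full 'Bad' list.
from urllib.parse import urlparse

import pandas as pd

bad = pd.read_csv("bad_links.csv")
domains = sorted({urlparse(u).netloc.lower() for u in bad["url"].dropna()})

with open("disavow.txt", "w") as fh:
    fh.write("# Unnatural links we were unable to get removed\n")
    for domain in domains:
        fh.write(f"domain:{domain}\n")

print(f"{len(domains)} domains written to disavow.txt")
```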

 

You then need to upload your newly created disavow file for Google to process. You’ll need to be signed in to Google Webmaster Tools as the site owner and visit the Disavow Links tool page. Upload your file and submit, and the page should then look something like this:

[Screenshot: disavow file upload]

Screenshot taken 17/02/2014 of google.com

Make sure to check the notification underneath, which will tell you if you’ve made any errors during the creation of your disavow file, and check that the number of domains matches the number you expected.

Total spend so far – $15.

How To Write Reconsideration Requests

Without going into too much detail, here are some brief pointers on how to write a reconsideration request, based on my experience:

  • Use a bullet point list of all the work you have carried out, emphasizing headline figures (e.g. 257 links removed, 1200 site owners contacted, 3548 emails sent).
  • Include as much thorough documentation as you possibly can – link to Google Drive spreadsheets of removal data and Dropbox folders of email failures and contact page screenshots.
  • Don’t submit your reconsideration request too soon – if you send it one week after you got your manual penalty it won’t seem like you have worked very hard, so leave it a couple of months.
  • Use polite, concise, and factual language – keep it brief and to the point.
  • Don’t lie.

The main thing to remember with manual penalties (and reconsideration requests) is that your email is being read by a human. They don’t care about your history, or your story, or your ‘future plans’. They care about the fact that you have been manipulating Google’s algorithm, and want to see plenty of evidence that you’ve repented and tried to right those wrongs.

Penalty Removal for 15 Dollars

This is deliberately not the title of my post, simply because many penalties take several attempts to get removed or revoked, so I would imagine most cases would exceed that $15. Of course, this also doesn’t take into account the value of your time, the cost of which would depend primarily upon the size of the cleanup operation required.

I’ve also got a confession to make – we use paid tools all the time for our link removal work, since the more you do, the more these tools save you time and begin to pay for themselves. For the cash-strapped site owner or freelancer, however, this method will see you through.

Featured image credit: Screenshot taken 17/02/2014 from wordle.net


Patrick Hathaway

SEO Consultant at Hit Reach
Patrick Hathaway is an SEO consultant for Hit Reach, delivering client-side and in-house SEO campaigns. He is currently doing LOTS of website audits (SEO and CRO), link removal work and content marketing campaigns. When he isn't working, Patrick spends most of his time trying to keep up with his baby boy. In the (minuscule) remaining free time, he enjoys playing sport and drinking beer (normally not at the same time).




22 thoughts on “How to Deal With an Unnatural Links Penalty on a Budget”

  1. That’s all good stuff there, I do love Panguin – that’s a great tool to show clients who have penalized sites. I am an LRT Associate with LinkResearchTools in Vienna who make the wonderful Link Detox Genesis. I have their rather expensive Superhero Account and at the moment I have quite a few Link Detox credits spare, so I am offering to do a full detox/backlink audit of any site and produce a disavow file and full backlink report (showing healthy, suspicious and toxic links). I’m prepared to do this for £50/75€/$100 at the moment as I need the experience of looking at different penalized sites. I know this is not as cheap as what Patrick has described, but it will get you results faster. Get in touch ASAP if you want me to help, I am about to have a case study published with LRT, so I can see a busy summer coming up! Thanks again Patrick, there’s a few great Excel tips in there I hadn’t thought of.

  2. This is a very good article. I’d like to give my thoughts as someone who does a lot of penalty removal work, and yes, I’m one of the people whom you have “called BS on”. :) But that’s ok…it’s good to debate topics like this. Here are my thoughts:

    Regarding not needing to check links manually, I have tried several tools and each time I find that they make too many mistakes for me to be happy with the work. Now, in some cases, you can use tools to eliminate the worst offenders and then manually review the rest. But, if you’ve got a manual review, you’re still going to need to check if those worst offenders are live before sending an email. I suppose you could just send emails to everyone, but you’re guaranteed to have site owners respond angrily telling you that that link doesn’t exist anymore. I suppose the argument against this would be to use automated tools to check if your link is live. These tools fail a lot as well. I find that Scrapebox is pretty good at finding Alive vs Dead pages but it’s not 100%. The first time I used it, I sent a client a list of the alive pages and he sent me back an email with a bunch of pages that were marked as “dead” that still had links on them. This is true for other tools as well…you’re definitely going to miss some links if you are relying on tools to tell you whether they’ve been removed or not. About a year ago we had a huge site that we were working on. We used two different automated tools to determine whether our link was still live on the page, marked the ones that had been removed and then filed for reconsideration. When we failed, we looked back manually at every link that we had marked as “page not found” or “link not found” and found several hundred that we had missed because the tools could not see our link. Very embarrassing. So now, I just check them all by hand.

    What I do is visit one link from each domain by hand. For the links that I have marked “page not found” or “link not found” I take every link that we have from those domains and run those ones through Scrapebox to see if they can pick up other live links from another page on that site.

    Good tip on using Bing for finding links! Another good source is your Google Analytics referral data. Although many unnatural links will send no referral traffic, often the site owner will check the link once after installing it.

    Back to using tools – the tools are not going to pick up certain patterns that are against the quality guidelines. For example one of the definitions of “link schemes” is creating partner pages exclusively for the purpose of cross linking. If you have a reciprocal linking scheme going on and you have been sophisticated about it, I’m not aware of a tool that will catch that.

    I’m not sure about the ethics of hiring someone on fiverr to supply you with a report from a paid tool….but then that’s a discussion for another day.

    I find it interesting that you put your data for your reconsideration requests in Dropbox. Matt Cutts once mentioned that if you were going to point to external sites, you should use Google Docs, as the webspam team won’t visit other sites for fear of getting malware. Still, Dropbox probably is a trusted site.

    All in all though, I thought this was a good article. I hope I have not overstepped boundaries by giving my opinions here. They weren’t meant as criticism but rather as a response to the “I call BS” statement. :)

    1. Ha! Marie I love that comment. Thanks for taking the time to share some of your ideas and add a load more value to the post.

      Regarding this idea about not needing to check links. What I was trying to say was:
      (a) In most cases it is sufficient to check the domain, or at least one link on the domain. Sometimes you have to go back and check a few more, but there are a lot of cases where you don’t need to check every article or directory link etc… as you get a good idea from one.
      (b) I wasn’t really advocating using tools to classify links for you. The Excel filtering etc… gives you a list that you can then check manually. Or, as is often the case, if it does reveal a footprint – say 100 directories all with the same anchor text – you can check a few and be pretty confident in a bulk classification of the others.
      (c) I agree on the issues with checking whether links are live using tools. It is not perfect. But then again the list of links comes from a range of tools in the first place – even GWT lists links which are no longer live. Personally I think this is just the nature of the web – links are somewhat transient (and the worst links even more so). I totally agree that if someone says they removed a link, you should go and check it yourself.

      Overall, I was trying to outline a methodology for an in-house SEO or consultant whose employer/client doesn’t have a big budget for this – where any reduction in time/cost would be valuable. Sorry if the ‘BS’ comment caused any offence, I was merely trying to make a point.

      To briefly cover your other points:
      – Google Analytics referral data is a great shout! Never would have thought of that.
      – I’ve used Dropbox many times for sending screenshots, and never had an issue. Possibly you are right that it’s ‘safer’ to use Google Drive.

      Thanks for your input!

      Patrick

    2. Nice one Patrick

      I get what you are both saying – I’m sure Marie knows the usual suspects though :)

      After all, natural links don’t come from
      123-SEO-directory.info
      124-SEO-directory.info
      etc

    3. I agree with Marie 100% about using tools. Getting started with them can help find the majority of bad pages, but it can also give you a false sense of security about being “done” when there are hundreds more bad links incoming. It is a huge pain to do this by hand (I recently had a client come to me with a site that had tens of thousands of spammy forum/alternate language/off topic/etc incoming links), but it is the best method to ensure the results are what you want. That said, this is about doing this on a budget, and manually checking a hundred/thousand/10,000 pages is not time efficient and thus not budget efficient. So if money is an issue, then tools are likely the best use of your time. Thanks for sharing this!

  3. Good article! While I typically get 80 – 90% of the domains from GWT, other tools like ahrefs and Majestic SEO are required for the remaining domains. Must mention a couple of fantastic “free” resources for link collection:

    Webmeup is a pretty decent source for link collection – http://webmeup.com/tools/backlinks.html. Another resource is the good old Moz – it’s easy to get a 30-90 day trial pro account. I must confess I converted to a full-time paid subscription because of the other awesome SEO & Analytics features.

    Also agree with Marie – always check at least one link from each domain. Had to do it once for 70k domains for a single client… it was a painfully slow process but worth the time.

    1. OSE is hidden in there already! Yeah, Webmeup is pretty good too – and they have a free tool so it fits with my theme!

      Still disagree on this ‘check at least one link’ thing – it makes sense in principle, but see Chris Dyson’s comment above; no way I am checking sites like that!

    1. If a site has been affected by Panda it is normally down to a significant proportion of the content on a site being duplicated or offering very little in terms of unique value. You really need to try and analyse your site objectively and determine whether each of your pages add value.

      There is no ‘one size fits all’ way to analyse this, but some common things to look out for:
      – Scraper sites stealing your content and outranking you for it (could count against you as duplicate content)
      – Competitors manually stealing your content and outranking you for it
      – The use of manufacturers’ descriptions which are also used on many other sites
      – Content (posts/articles) that are heavily syndicated
      – Tag/category pages that are thin, with very little unique content
      – Over-optimised internal linking strategies

      Identifying what the problems are is the first step, then you need to clean up all the issues and potentially add a lot more unique content.

    1. I’ve not heard this advice from Matt Cutts about not needing to remove links – that seems to go against everything else they’ve ever said on the topic.

      Some SEOs believe you can successfully remove penalties using only the disavow tool; however, we don’t recommend that to our clients as the most effective method. Even if you do get a penalty removed this way, it could only take another Google update for them to lower the thresholds and bring your site back under penalty. If the links are genuinely gone they can’t hurt you.

    1. You’re right…there aren’t many published cases of sites recovering from Penguin. I’ve had a couple of cases. In one case the site did more than recover and actually saw an improvement in rankings. However, this is rare. The reason for this is that in order to recover from Penguin, not only does there have to be an extremely thorough link cleanup, but it is vitally important that the site has the ability to attract links naturally. Sites that are able to attract truly natural links are not often the same sites that have used low quality link building methods that would get them into Penguin trouble.

  4. Some good stuff here, both in the article and the comments! I absolutely agree on the Analytics referral data – I consistently find really crappy “one-click wonders” that never show up in any tool (including GWMT).

    To add to the free tools for those who need to do things on a budget, rmoov has a never-ending free trial “Basic” plan which allows those who are cash-strapped to run limited-size campaigns for as long as they need to for free. Of course we also have paid plans, but we decided to make the free plan available after talking to many very small business owners who have been forced into trying to do the work themselves because of loss of revenue.

    We do also provide full support and help with how to deal with penalties for all subscribers, paid and free (shhhh… ;)

    On the subject of live and dead links: This is something to be very careful of. We see incredible transience in URLs and whole sites. We have determined that in some cases this is due to webmaster behavior, sometimes simply because dubious sites tend to use unstable or unreliable hosting, and sometimes purely because a site crashed and was restored from backups. With all this coming and going of links, we became very concerned about the integrity of reporting for our client campaigns, so we introduced a “rechecker” last year. This is a second crawler which constantly recrawls links that have been found to be dead throughout the life of a campaign. When links become live again they are automatically reinstated to the campaign and a notation is made in the reports. We have found this greatly improves reporting accuracy. So my recommendation would be that if you are using tools like Scrapebox etc., you need to check for live links more than once to ensure that what you send to the Webspam team is on the money.

    Hmmm…didn’t mean for that to sound like a promotion …edit accordingly if you need to :)

    Sha

  5. Hey Patrick,

    Indeed very good stuff in here, but not so much on the Scrapebox Link Checker tool and the Website Penalty Indicator – neither of which works for me :P.

  6. Good post – I read your content about dealing with an unnatural links penalty. For the last few months I’ve been doing SEO work in line with the Google algorithm (Google Penguin); lots of unnatural links have been removed, but we can’t see any results. Can anyone tell me how long it actually takes to recover from Google Penguin?

  7. Every action has a reaction, so any action we take in SEO might have an adverse influence that is beyond our control. We need to follow correct procedures within a framework, since all commercial activities are bound by regulations.

  8. @Ravi Well, hopefully you don’t miss the point here. The links that once helped your website manipulate the rankings and rank higher have been detected, flagged and devalued. So removing and/or disavowing them (which, by the way, can take months until the Google bot crawls them again and applies the disavow to those links) doesn’t mean that you get your rankings back – simply because all the “good” ones that helped manipulate the rankings are gone, and you can’t re-establish your old rankings just by removing/disavowing the “bad” ones!

  9. I work in the penalty recovery business. I’ve seen people trying to get out of penalties themselves, in ways like those you’ve listed and others, but often they fail. They do not always know the ins and outs of what needs to stay and what needs to go. I’ve had clients come to me with ‘I used WMT with no results’ and other such statements. This is because many link removal services have metrics that don’t benefit the user, offering silly results – often I’m talking about Link Detox here. At The Link Auditors, we do not use complex metrics: a link is either toxic or not, never ‘suspicious’ or a ‘threat’, labels which lead to unsure results and confusion. Our results are clear, and once received, our clients know exactly which links need to go and which need to stay.

    We have a collection of tools that find toxic backlinks in a number of ways, from sitewide links to links on duplicate IPs, banned domains and more. We also run a link removal service that averages an 80% success rate, which is higher than Link Detox and other services. All our tools are automatic, quick and easy to use; with our service you can have all your toxic backlinks found and removed with a few clicks of a mouse.