9 Roadblocks That Could be Harming Your Google Rankings

Almost every business in the world wants to rank on the first page of Google, but very few ever get there. Even with the best will in the world, top-result status is a mountain many marketers struggle to climb.

It’s not for want of trying. With Google’s algorithm changing constantly, it can be hard to keep up with best practices. Unfortunately, as with any supply and demand situation, there are also a number of unscrupulous “specialists” offering to help unsuspecting businesses raise their ranking. The blind leading the blind, often enough.

Should you give up on SEO? Categorically not. Your search engine ranking can have a major impact on your business, driving traffic to your site and helping you secure more customers.

As the saying goes, everything worth having is worth working hard for. 

I have put together this guide to help you out, highlighting nine roadblocks that could be holding your website back from ranking success.

The Overarching Principle

The most important thing to grasp when it comes to SEO is the underlying principle: Google wants to help its users. It really is that simple.

When Google crawls websites and decides where to rank them, it is really deciding how useful and relevant the content is to people using the search engine.

The concept gets obscured sometimes, but that’s the most important thing to bear in mind–and it’s the logic that dictates each of these nine roadblocks. The algorithm changes pretty regularly, leaving SEOs everywhere scrambling to keep up, but you can’t go far wrong if you bear that principle in mind.

1. Duplicate Content

Duplicate content isn’t useful content. Whether it’s your website copy or your blog pieces, Google proactively penalizes sites which don’t have original content.

In the era of content marketing, people can fall into the trap of thinking that any content is worthwhile content. That simply isn’t the case. The moment you start sacrificing quality for quantity is the moment your traffic will plummet.

But what if your website has pages with legitimately duplicated content, like product pages or location-tailored landing pages? In these situations, you can use canonical URLs, which tell Google which version of the copy should be treated as the original.

A common example of this is blog posts, where the same post can be accessed via a variety of URLs. For example, you might ‘tag’ a blog post in multiple ways, which would generate different post URLs:

  • /blog/nutrition/top-tips-for-healthy-living
  • /blog/fitness/top-tips-for-healthy-living

And so on.
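
To collapse these variants into one, you add a canonical link element to the head of each duplicate page, pointing at whichever URL you want Google to treat as the original. Here's a minimal sketch using the hypothetical blog URLs above (the example.com domain is an assumption for illustration):

```html
<!-- Placed in the <head> of every variant of the post,
     e.g. /blog/fitness/top-tips-for-healthy-living -->
<link rel="canonical" href="https://www.example.com/blog/nutrition/top-tips-for-healthy-living" />
```

With this in place, Google consolidates the ranking signals from all the variants onto the one canonical URL instead of treating them as competing duplicates.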

Without a canonical version, Google would register the different URLs as duplicate content and penalize accordingly. You can easily check if your content will pass Google’s beady eye for duplicates by running it through tools such as Copyscape.


2. Thin Content

Thin content is just as bad as duplicate content. Think back to our overarching principle–thin content (literally, not enough ‘meaty’ content) isn’t useful to users searching for information on a topic.

It used to be that the Google ranking algorithm prioritized link building, but this is no longer the case. With the introduction of the not-so-affectionately regarded Panda update, Google penalizes any site it deems to be offering thin content.

Think rich, in-depth, engaging content and you’re thinking along the same lines as Google. This is even more important for highly competitive keywords. To stand a chance of competing with the big boys, smaller businesses need a content-rich site.

As with every Google change in the past, the businesses that are penalized are the ones that put ranking first and their users second. Always think about how you can make your site more relevant, more useful, and easier to navigate and Google will reward you accordingly!

3. Anchor Text Over-Optimization

The world of SEO is one of constant push and pull–new ways to rank are identified, those methods are abused, Google introduces new rules to penalize cheaters. And so on it goes.

Anchor text has been far from immune. Back in the day, SEO was simple. Identify a keyword, then use that keyword a few times in your copy and in the anchor text. Hey presto, hello first-page ranking!

Then people figured that out, and everyone jumped on the bandwagon. Dodgy SEOs started spamming their backlinks with highly optimized anchor text keywords, and usefulness and relevance took a back seat to bad SEO practice.

Predictably, Google then changed the algorithm to actively penalize anyone who over-optimized their anchor text. The principle is to reward unique, useful content–and to stop people from tricking the system when that’s not what they offer.

The key here is to focus on building a bank of rich, shareable, interesting content, instead of spamming your anchor text. Sure, it’s important to bear keywords in mind (without them Google can’t identify relevance at all), but don’t prioritize your anchor text over creating amazing content. Think human first, search engine second.

4. Bad Backlinks

Backlinks are still one of the most important factors in search ranking. Google uses backlinks to determine how relevant and valuable content is, figuring that a link from page A to page B is effectively an endorsement of page B by page A.

In simple terms, if I link out in this piece to an article by, say, HubSpot, that link counts as a backlink for HubSpot and will help elevate their ranking.

Backlinks followed the same cycle as the rest of SEO–first it worked well, then people figured out how to abuse it, now you can get penalized in Google if you don’t stick to stringent quality guidelines.

Look back less than a decade, and SEOs were wildly building links–focusing on quantity much more than quality. Every site, every tactic, was fair game–from directory submissions and article directories to comment spam and guest blogging.

That’s not to say those are automatically bad practices, by the way. The point is that they become bad practice, as with everything in this piece, when the people using them focus on manipulating the SERPs rather than adding value for users.

Directory link building, for example, can actually be really valuable. There are directories out there that are genuinely useful to users, providing high-quality, relevant lists of links that give an in-depth overview of a topic area.

The problem arises when directories have low standards, letting anyone in for a price simply to build backlinks.

You will probably have noticed a trend throughout this article. The best way to get onto directories that can help your SEO strategy is to write comprehensive, valuable content on a topic. Make the editor want to include you on the directory, by virtue of how good your content is.

Guest blogging is the same. It is a legitimate practice, if done legitimately. Unfortunately, what a number of people did was start placing posts on poor-quality, irrelevant websites simply to get the backlink. It doesn’t work. It’s spammy. Google doesn’t like it, people don’t like it, and it can damage your brand reputation.

That said, as with directory link building, guest blogging can still be a useful strategy. Guest bloggers getting exposure on high-authority domains (Forbes, for example) are getting incredibly high-quality backlinks, plus exposure to a huge audience. How do you get a guest post accepted on an industry-leading platform? Your content needs to be exceptional.

If you want to read more about good link building, you can check out my traffic improvement guide published here on Search Engine Journal in March 2015.

5. Non-Optimized Page Titles

SEO is about balance. You have to think about keywords, because that is fundamentally how Google determines ranking, but you don’t want to think about it too much or you will get hit with a penalty!

Page titles are an element that Google weights very heavily when deciding what your page is about and how valuable it is. The logic here is that your page title has to capture everything your page is about in a short character limit, so it’s probably a good indicator of content.

Basically, it’s really important to include the core keyword you’re targeting in your page title. Not only is this better for your ranking, it is also better for your click-through rate, as people can immediately see the relevance to their search query (the page title is the first thing shown in Google’s results).
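
In HTML terms, the page title is simply the title element in your page’s head. A hypothetical example for a page targeting the keyword ‘healthy living tips’ might look like:

```html
<head>
  <!-- Core keyword near the front, and roughly 50-60 characters
       in total so it isn't truncated in the search results -->
  <title>Healthy Living Tips: 9 Simple Habits That Work</title>
</head>
```

Front-loading the keyword keeps it visible even if Google shortens the title, and a natural, readable phrasing avoids any whiff of keyword stuffing.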

Ranking factors 2015 (Image credit: Marcus Tober | Searchmetrics | ‘The Content Evolution’)

As with everything SEO, the importance of page titles is always shifting, but it’s still worth focusing on. Research from Searchmetrics actually suggests page titles are declining in importance slightly, but it’s important not to forget to optimize them. Even if Google decreases the importance of the page title as a ranking factor, it still has a huge impact on your click-through rate.

6. Insecure Site

This is another relatively recent change from Google, who announced last year that site security is a ranking factor. Sites served over https:// encrypt traffic between the visitor and the server, and this will now have a role to play in determining ranking.

It’s hard to determine how important this factor will be, but when it comes to SEO any boost is a worthwhile boost.

Also, switching to secure https has the major benefit of improving your traffic referral data. When a visitor moves from a secure site to an insecure one, the browser strips the referrer data, so the visit shows up as direct traffic and you can’t build an accurate picture of your traffic sources. If your site is https, though, referrer data from other secure sites is preserved, so far more of your visitors can be traced back to their source.

Although switching to https is more effort than some of the other ranking factors, it’s definitely worth doing if you can. If not purely for the SEO benefits, then for the better visibility it gives you into your traffic, as well as the increased trust you can build with consumers by offering a secure site.

7. Accidentally No-Indexing

No-indexing is one of those things that seems really obvious, but it can so easily be missed. A single stray line of code hiding on your website, and your site is invisible to Google.

Even big businesses run afoul of no-indexing errors; I’ve worked with enough of them! This is one of the first things you should check if you’re convinced you should be ranking in Google but aren’t.
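
The usual culprit is a robots meta tag in the page’s head (or an equivalent X-Robots-Tag HTTP header). If you find something like this and didn’t add it deliberately, remove it:

```html
<!-- This single tag tells search engines not to include the page in their index -->
<meta name="robots" content="noindex" />
```

Check your page templates in particular: a tag like this left over from a staging or development environment can keep an entire site out of the index.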

8. Bad User Experience

Search ranking can only get you so far. If your user experience is bad, it doesn’t matter how many visitors your site gets, because your bounce rate will be outrageous.

Google looks for signals that people who visit your site find it valuable – one such signal is how long they stay there. A high bounce rate is an indication that your traffic quickly leaves, which Google figures to mean it’s not useful.

Loads of factors come under the heading of ‘user experience’: basically, anything and everything your site can do to improve things for visitors, including:

  • Making it easy to navigate
  • Including a platform for user reviews
  • Including a ‘related articles’ tab
  • Transcribing videos so content is more easily accessible
  • Ensuring fast loading speeds
  • Using consistent calls to action
  • Making Contact and Help pages easy to find
  • Using mobile-responsive design
  • Keeping pages well spaced and easy to read
  • Enabling easy social sharing
  • Inviting feedback
  • Choosing appealing colors and design
  • Including testimonials
  • Displaying FAQs

Try to think from the perspective of a website visitor yourself: what do you find annoying or frustrating? What stops you from doing what you came to do? What makes you leave?

All of these things come under the user experience heading, and are critical to getting users to stay on your site, which in turn is critical to getting Google to rank you favorably.

9. Using a Bad Agency

With the best will in the world, reading a few articles online won’t make you an SEO expert. The world of SEO is confusing, ever-evolving, and time-consuming – so most people hire an agency to handle everything for them.


Unfortunately, there is no shortage of charlatans in this game. It stems from the ease, and the massive potential rewards, of cheating the system back in the day. Many SEOs still believe they can get back to the black-hat glory days, manipulating Google to secure high rankings and high returns.

That simply isn’t possible anymore, but too many SEOs are still clinging to the dream. Instead of genuinely taking the time to learn the SEO landscape, and keeping on top of it regularly, they’re putting up a front.

For clients, who often don’t know much about SEO at all, it can be an alluring front indeed. “I can make you rank first on Google,” they say: “Guaranteed first page.” Except nothing is guaranteed, and anyone who tells you otherwise is lying.

SEOs who try to ‘trick’ Google are destined to trail behind the search giant, which pours millions of dollars, and some of the best minds in the world, into stopping people from abusing the system. You might get lucky one week, but you’ll never win out overall.

Instead, your best bet when it comes to hiring the right agency is to check references extensively. Ideally, you want to see testimonials, case studies, and tangible evidence of results. And you want someone who recognizes that SEO is about genuinely providing value, not about manipulation.


Image Credits

Featured Image: rusty_clark/Flickr.com
In-post Photo #1: Unsplash/Pixabay.com
In-post Photo #2: mintchipdesigns/Pixabay.com
In-post Photo #3: ClkerFreeVectorImages/Pixabay.com

Barrie Smith is a Link Building guru at Receptional Ltd and has over 11 years of experience in Digital Marketing.
  • Phil Gregory

    Thin content is the bugbear that keeps on giving. I am constantly banging on to clients about how they simply must improve their content offering. Especially on e-commerce platforms like Magento. It’s very easy to get your inventory up online, presenting it well is a completely different matter though. Great article!

    • R.Rogerson

      Part of the problem is the sheer size of the task.
      Revising hundreds of product pages is daunting – thousands is a nightmare.
      Help them prioritise it – they need to pick their biggest earners, best converters and most sought after products.
      That should give you about 10% to target. Everything else can be done at a later date/slower rate.

      Whilst they revise the content, get them to add in notes about features/aspects/benefits. At a later date, if these aren’t separate data points, you should be able to parse their notes and generate the data. This can aid UX/internal search, and push conversion too.

      Basically – the main motivator is increasing the bottom line.

      The other motivator is fear.
      Tell them if the content is naff, they may get hit by Panda – then game over for at least 3 months. No rankings, no traffic, no conversions, no money!
      Then tell them that they can noindex the non-priority pages whilst they get up to speed. That will keep them safe. Their stronger content will generate traffic, conversions and money. Then step by step they can expand the indexable content.

      • Barrie Smith

        Great advice R. Rogerson, thank you for your contribution πŸ™‚

    • Barrie Smith

      Thanks Phil, I know all about the pains of clients with thin content πŸ™

  • Kelvin Igbinigie

    Hi Barrie,

    Thanks for this great article. Really timely as I plan to focus more on my blog in 2016.

    Please your number 1 point about Duplicate content. I need your assistance. How do I create canonical url of my post so I don’t get penalized for duplicate content.

    Thank you.

    • Barrie Smith

      Hi Kelvin, this guide from Google should be able to assist you with canonical URLs: https://support.google.com/webmasters/answer/139066?hl=en

      Let me know if you need any more support.


  • R.Rogerson

    “… With Google’s algorithm changing constantly, it can be hard to keep up with best practices …”
    No offence Barrie, but I’m more than a little tired of seeing that line used.
    The Google Guidelines haven’t really changed all that much, and the core principles have remained pretty much the same as well.
    The only things that have changed are;
    a) G got better/faster at catching crud
    b) Stupid ideas, exploits and spam methods got booted
    c) G got smarter/faster at processing things (content+queries)
    Care to give 3 examples of ‘real’ (not spam, not bunk) best practices that have changed in the past 2, 4, 5, 10 years?

    1) Duplicate content
    2) Thin content
    3) Over optimised anchor text
    4) Bad backlinks
    5) Un-optimised titles
    6) Insecure sites
    7) Accidentally noindexing
    8) Bad UX
    9) Using a bad agency

    So ….. things like

    * Not researching the market
    audience/competitors, existing content, popular content forms etc.

    * Not researching target terms
    keywords/phrases etc. – highly/strongly competed, non-competed etc.

    * Blocking bots with robots.txt
    preventing the whole site or important parts of the site from being crawled,
    causing PR flow black holes etc.

    * No/Incorrect/Poor canonical structure
    No use of the CLE, using it to point to the wrong URL, mixed messages between variant pages etc. This not only includes internal duplicates/close variants, but also test sites on different domains/subdomains, www and non-www web addresses, different case URLs etc.

    * No/Incorrect/Poor redirect usage
    See Canonical above πŸ˜€

    * URL removal via search console (webmaster tools)
    Telling G to remove the entire site or directories from the SERPs

    * Improper URL Parameter tool usage via search console (webmaster tools)
    Similar results to Canonical above πŸ˜€

    * Keyword stuffing
    Cramming in the primary term in all the right places, all the wrong places and just about any other place you can find

    * No/Improper/Weak internal linking
    Internal links are important and influential – for users and bots. Failing to link up/down/sideways etc. can result in lower effectiveness.

    * Bad internal linking
    Linking to URLs that are no-longer working can infuriate visitors, and wastes crawls, fills up errors logs and wastes PR flow.

    * Creating worthy content
    Optimising things is all well and good – but if the page is useless, then you might as well not bother optimising it. Instead, you should make sure what you create is of use/interest, and ideally shareable/worthy of talking about.

    * etc. etc. etc.

    Personally, I would have thought some of them more important than things like Insecure sites.

  • steve carl

    Of course, all of the above is really helpful for beginners. I think you missed internal link building in the list. You can easily reduce the bounce rate, and visitors will navigate to other pages, if the article is good and unique.

    • Barrie Smith

      Good point Steve, thank you for your contribution πŸ™‚

  • Vishwajeet Kumar

    Hi, Barrie, thanks for this very informative article. Actually, I faced a problem with bad links which harmed my site’s rankings in Google. But luckily I am out of this problem now and my website works fine. Your article is very helpful to newbies who are struggling to rank their sites.

    • Barrie Smith

      Nice to hear you overcame your bad links issue, Vishwajeet πŸ™‚

  • George

    To sum it up, optimize your website with users in mind. Make your website easier to access and use, and make sure that you follow Google’s guidelines.

  • Norm

    Actually, duplicate content won’t inherently get you penalized. As Google webmaster guidelines say “Duplicate content on a site is not grounds for action on that site unless it appears that the intent of the duplicate content is to be deceptive and manipulate search engine results.”

    We sell complete websites including content. We have over 7,000 clients. Most all of those clients have the same or very similar content. In other words, they may change the home and about, but overall the service pages are identical. I can show you results page after results page after results page that list our clients on page 1 for localized searches.

    The point/theory is, only one page will rank for a given keyword. If you google “burlington tax prep”, you will see the number one listing is a client of ours. If you copy the first two paragraphs of that page and paste that into google in quotes, you will see about 4,400 other sites have the same page. Now Google “san antonio tax prep”. The #10 listing is our client with the same page you saw in previous first page results.

    Try “miami cpa”. In this very competitive market, the #9 listing is our client with thin content home page and a site that essentially has no original content. “san antonio cpa”, “cincinnati cpa”, “seattle cpa”, “denver cpa”, “san jose accountant”, “phoenix accountant”, “indianapolis accountant”, I could go on. All examples of site full of duplicate content on page 1.

    To be clear, I am not promoting duplicate content. The best thing anyone can do for their site is craft compelling and original content that engages the reader. That content should also recognize SEO best practices. My comment is simply meant to clarify a growing misinterpretation of the Google webmaster guidelines: that Google will penalize sites with duplicate content. Google may penalize sites that don’t have original content, but not having original content is not the reason for the penalty.

    • Barrie Smith

      Hi Norm,

      Interesting to see that you are getting away with duplicate content.

      I’ve certainly not seen this pulled off in a more competitive industry though. Looking at the search volumes of those keywords you suggested, the competition on page 1 is generally very weak – I’ve seen sites at the top of low-search-volume industries get away with article directory and other low-quality links etc.

      If you find an example of a non-authoritative site ranking for a popular keyword with duplicate content, I’d like to see πŸ™‚


  • Haider Amin

    Over the experience of past two years, I have found out that content is the king in gaining rankings. I did a small experiment on one of my sites. Posted high quality articles with at least 2000 words in each articles. Every article was unique and written by me. I did not create any backlinks. That site got ranked in two months and the ranking never dropped! The key to persistent traffic is daily posting with quality content. Backlinks matter too but they work well if the content is of good quality. Otherwise site will probably be penalized in an algorithm update of Google.

  • Tom Ordonez

    Great tips but thin content is a grey area. If you have any type of directory listing that is SEOd. Then is that considered thin content? Will Google penalize those directory websites for having thin content?

    • R.Rogerson

      People need to understand that it’s not a one-size-fits-all subject.
      G understands – they categorise content types and rate/rank based on content types.
      Further, it’s not just the content, but the perceived value of that. Are people using the site? Are they going back to the site from the SERPs? Are they linking to it?

      More content doesn’t equate to better.
      The content has to be of use/interest and serve the visitor/user. Adding 600 words from Alice in Wonderland is not going to help people much.
      Adding 100 words that cover the pros/cons of the business might. Adding 50 words that accurately tag the business by location, purpose, consumer base, opening hours etc. will.

      Don’t look for measurements/metrics other than quality (usefulness/interest/appreciation).

  • Karl Craig-West

    Thanks for this, especially the bits on content. I’m constantly harping on about decent original content to clients and it’s just possible that they get sick of hearing it.
    Hopefully, when they read this post they’ll be convinced.

  • Jay Staniforth

    This massively reinforces what I have always said – “Marketing is easy, anyone can do it, what’s hard, and what many people seem to miss, is that it’s difficult to do right!”