Why Do Sites Rank High on Google When They Aren’t Optimized?


Have you ever wondered why some sites rank high on Google when they aren’t optimized for search engines? Or even worse, when they barely have any backlinks?

I’ve been asked this question a lot over the last few months, so I thought I would write a blog post explaining why that happens.

Here’s why some sites rank high when they aren’t optimized: 

Reason #1: Click-Through Rate

Part of Google’s algorithm looks at click-through rate, which it calculates as a percentage: the number of clicks your listing receives out of the total number of people searching for the particular phrase you rank for.

The higher the percentage, the more appealing your listing is compared to the competition. And if your click-through rate is higher than everyone else’s, Google will slowly start moving you up the search engine results page, as this algorithm factor tells Google that searchers prefer your listing.

Looking at the click-through rate isn’t enough, however, as people could create deceptive title tags and meta descriptions to increase their results. So, Google also looks at your bounce rate.

It assesses the number of people who leave your page by hitting the back button to return to the search listing page. If Google sends 1,000 people to one of your web pages and each of those 1,000 people hit the back button within a few seconds, it tells Google your web page isn’t relevant.

A lot of the websites ranking well on Google that don’t seem to be optimized have a high click-through rate and a low bounce rate. And that helps maintain their rankings.

For example, if you look at this guide, you’ll see it ranks really high for the term “online marketing,” and the ranking very rarely fluctuates, as my click-through rate according to Webmaster Tools is 31%.

Here’s another example. This post ranks well for “best times to post on social media”. It would be hard to outrank this listing as my click-through rate is currently 52%.
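To make those two ratios concrete, here is a minimal sketch of the arithmetic in Python. The click, impression, and quick-return counts below are made up for illustration, and the way Google actually measures and weights these signals isn't public.

```python
# Illustrative arithmetic only; Google's real signals and thresholds are not public.

def click_through_rate(clicks: int, impressions: int) -> float:
    """Share of searchers who clicked your listing out of everyone who saw it."""
    return clicks / impressions if impressions else 0.0


def bounce_back_rate(quick_returns: int, visits_from_search: int) -> float:
    """Share of search visitors who hit the back button within a few seconds."""
    return quick_returns / visits_from_search if visits_from_search else 0.0


# Hypothetical numbers: 1,000 searches, 310 clicks, 120 quick returns to the SERP.
ctr = click_through_rate(clicks=310, impressions=1000)
bounce = bounce_back_rate(quick_returns=120, visits_from_search=310)
print(f"CTR: {ctr:.0%}, bounce-back rate: {bounce:.0%}")
```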


If you want to see your click-through rates, log into Webmaster Tools, and click on your site profile. If you don’t have a site profile, that means you need to add your site to Webmaster Tools and wait a few days.

Once you are viewing your site in Webmaster Tools, click on the navigational option “search traffic,” and then click on “search queries.”
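If you would rather pull the same per-query clicks, impressions, and CTR figures programmatically, the Search Analytics endpoint of the Webmaster Tools/Search Console API exposes them. Below is a rough sketch, assuming google-api-python-client is installed and `creds` already holds OAuth credentials for a verified property; the site URL and dates are placeholders.

```python
from googleapiclient.discovery import build

# `creds` stands in for an authorized OAuth2 credentials object for a verified property.
service = build("webmasters", "v3", credentials=creds)

request = {
    "startDate": "2015-01-01",
    "endDate": "2015-01-31",
    "dimensions": ["query"],  # one row per search query
    "rowLimit": 25,
}

response = service.searchanalytics().query(
    siteUrl="https://www.example.com/", body=request
).execute()

for row in response.get("rows", []):
    print(f"{row['keys'][0]}: {row['clicks']} clicks, "
          f"{row['impressions']} impressions, CTR {row['ctr']:.1%}")
```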

If you need help increasing your click-through rates, read this post as I walk you through the steps you need to take.

Reason #2: Age

One of the big factors that cause some sites to rank well is their age. Most of the sites that rank high are at least a few years old.

Sure, most of these older sites have more backlinks and content as they have been around for longer, but not all of them.

What I’ve noticed is that if you take a brand new website, build tons of relevant links, and add high quality content, you still won’t get as much search traffic as older sites will.

There is not much you can do here other than just give it time. The older your site gets, the more search traffic you will generally receive, assuming you are continually trying to improve it.

Reason #3: Backlinks

Google doesn’t just look at the sheer number of backlinks a site has—it also looks at relevancy and authority.

Many of these non-optimized sites that are ranking well have a few high quality backlinks pointing to the right internal pages. For example, if you have only a few links, but they come from .edu and .gov extensions, your site will rank extremely well.

In addition to having the right backlinks, those sites also have spot-on anchor text for those links. Most SEOs think you need keyword-rich anchor text to rank well, but the reality is you don’t.

Google is able to look at the web page that is linking to you and analyze the text around the link as well as the text on the rest of the page. This helps Google determine whether the link is relevant to your site and what you should potentially rank for.
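As a rough illustration of that idea (and only an illustration, not Google's actual method), here is how you could pull the anchor text and surrounding text for every link a referring page points at your domain, using requests and BeautifulSoup:

```python
import requests
from bs4 import BeautifulSoup


def link_context(referring_url: str, your_domain: str):
    """Return (anchor text, surrounding text) for each link pointing at your domain."""
    html = requests.get(referring_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    contexts = []
    for a in soup.find_all("a", href=True):
        if your_domain in a["href"]:
            anchor = a.get_text(strip=True)
            # Text of the enclosing element approximates "the text around the link".
            surrounding = a.parent.get_text(" ", strip=True)
            contexts.append((anchor, surrounding))
    return contexts


# Hypothetical usage:
# for anchor, context in link_context("https://referrer.example/post", "quicksprout.com"):
#     print(anchor, "->", context[:120])
```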

Reason #4: Cross-Linking

Even if you don’t have the best on-page SEO and a ton of backlinks, you can rank well from an overall site perspective if you cross-link your pages.

And it’s important not just from a navigational or breadcrumb perspective, but from an in-content perspective. If you can add in-content links throughout your site and cross-link your pages, you’ll find that they all will increase in rankings.

On the flip side, if you aren’t cross-linking your pages within your content, you’ll find that some of your web pages will rank extremely well, while others won’t. It’s because you are not distributing link juice and authority throughout your whole site.
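One way to see this effect on your own site is to model internal in-content links as a directed graph and compute a PageRank-style score per page. This is only a crude proxy for how authority flows internally, not Google's actual calculation; the page paths below are hypothetical.

```python
import networkx as nx

# Each edge is an in-content link from one page to another (hypothetical site).
internal_links = [
    ("/", "/blog/guide-to-online-marketing"),
    ("/", "/services"),
    ("/blog/guide-to-online-marketing", "/blog/social-media-timing"),
    ("/blog/social-media-timing", "/blog/guide-to-online-marketing"),
    ("/services", "/blog/guide-to-online-marketing"),
]

graph = nx.DiGraph(internal_links)
scores = nx.pagerank(graph, alpha=0.85)

# Pages with no incoming in-content links end up with noticeably lower scores.
for page, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{score:.3f}  {page}")
```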

Reason #5: Content Quality

Since its Panda update, Google has been able to assess the content quality of websites. For example, it can determine whether a site’s content is too thin or duplicated elsewhere, allowing for a much better analysis of content quality than before.

A lot of these well-ranking older sites have extremely high quality content. You may not think so, but Google does.

Why?

Because Google doesn’t just look at the content on a site in isolation; it compares the content on one website to that of others within the same space. So if you have higher quality content than all of your competitors, you are much more likely to outrank them in the long run.
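You can’t audit your site exactly the way Panda does, but a crude self-check for the two issues mentioned above, thin pages and duplicate content, is easy to sketch. The word-count threshold and similarity cutoff below are arbitrary illustrations, not values Google has published.

```python
from difflib import SequenceMatcher

THIN_WORD_COUNT = 300   # arbitrary threshold for "thin" content
DUPLICATE_RATIO = 0.9   # arbitrary similarity cutoff for "near-duplicate" pages


def is_thin(text: str) -> bool:
    """Flag pages with very little body text."""
    return len(text.split()) < THIN_WORD_COUNT


def near_duplicates(pages):
    """Compare every pair of {url: text} pages and flag ones that are mostly identical."""
    urls = list(pages)
    flagged = []
    for i, a in enumerate(urls):
        for b in urls[i + 1:]:
            ratio = SequenceMatcher(None, pages[a], pages[b]).ratio()
            if ratio >= DUPLICATE_RATIO:
                flagged.append((a, b, ratio))
    return flagged


# Hypothetical usage with extracted page text:
# pages = {"/page-a": "...", "/page-b": "..."}
# print([url for url, text in pages.items() if is_thin(text)])
# print(near_duplicates(pages))
```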

Reason #6: Competition

The beautiful part about ranking for certain keywords is that they are low in competition. And some of these low-competition terms don’t get searched often.

From what I’ve seen, the results pages for these low competition key phrases aren’t updated by Google as often as some of the more competitive terms are. Why? Because more people are viewing the competitive terms.

If you were Google, wouldn’t you focus your resources on ensuring that popular terms and results pages are updated more frequently than phrases that aren’t searched for very often?

Reason #7: Growth Rate

What should you do if you want to rank really high for a keyword? Build a ton of relevant backlinks and write a lot of high quality content, right?

Although that’s true, what happens is a lot of webmasters grow their link count a bit too fast…so fast that it seems unnatural. And chances are it is.

Google is smart enough to know this as it has data on a lot of sites within your space. For this reason, you see a lot of older sites ranking well as they are growing at a “natural” pace versus one that seems manufactured.
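There is no public formula for what Google considers an “unnatural” pace, but you can at least chart your own link acquisition and flag sudden spikes. A minimal sketch with made-up monthly counts of new referring domains:

```python
# Hypothetical new referring domains gained per month.
monthly_new_links = {
    "2014-09": 14, "2014-10": 18, "2014-11": 16,
    "2014-12": 21, "2015-01": 240,   # a spike worth explaining
}

SPIKE_FACTOR = 3  # arbitrary: flag months that triple the recent average

months = list(monthly_new_links)
for i in range(1, len(months)):
    prior = [monthly_new_links[m] for m in months[:i]]
    average = sum(prior) / len(prior)
    current = monthly_new_links[months[i]]
    if current > SPIKE_FACTOR * average:
        print(f"{months[i]}: {current} new links vs. recent average {average:.0f}, "
              "which looks more manufactured than natural")
```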

Conclusion

There are a lot of reasons why sites that don’t seem well-optimized rank well. The seven I listed above are the main reasons I’ve seen over the years.

So the next time you are trying to figure out why a certain site ranks well when it shouldn’t, chances are it’s because of one or more reasons on the list.

As a website owner, you shouldn’t focus too much on your competition; instead, you should focus on improving your website. In the long run, the company with the best product or service tends to win.

Why else do you think non-optimized sites rank well on Google?

 

This post originally appeared on Quick Sprout, and is re-published with permission.

Neil Patel
Neil Patel is the co-founder of KISSmetrics, an analytics provider that helps companies make better business decisions. Neil also blogs about marketing and entrepreneurship at Quick Sprout.
  • R.Rogerson

    Redirects.
    This is a problematic one to detect.
    Though a site may not have any links (at all, or of relevance, or of worth) – if other URLs or domains are redirecting to that site, it will, in most cases, get value from them.

    External Canonical Link Elements.
    Much like the Redirects (above), but done via the CLE.

    History.
    Believe it or not, Google has a memory – and it can/does remember certain things, sometimes for a long time.
    So a site may not have much in the way of relevant content at the moment – but at one time, it may have, and that can be of influence.

    I see you reference correlations (proximity of associated words to links etc.),
    and SERP CTR.

    SERP CTR.
    When asked several years ago – Google denied SERP CTRs as a ranking factor. I know some people have been pushing the concept of Crowd Searching and stating it is a ranking signal … but I’m always more than a little doubtful of results from “SEO Experiments”. So … has anyone at Google actually confirmed it?
    Alternatively, do you know of any qualified/thorough tests/experiments that point to such a factor (other than surface correlation)?

    Co-Occurrence.
    I don’t think I’ve seen/heard of anyone at G admitting to this, nor of any tests that strongly prove it?
    I know people like Dan Petrovic were running tests a few years ago – and saw no significant indication … but it could be that there is a threshold, a delay or that things have changed since then.
    As per SERP CTR – do you know of any reliable tests to look at?

    And sorry … but…
    “Internet Market” has over 300 links to it, most followed and a fair few contain the words “internet marketing” in them (and that’s discounting the numerous shares on Social Platforms).
    Further, it’s not like it’s on a “new” or “untrusted” site – you’ve got enough links to your site to sustain you more than well enough 😀 The amount of PR flowing from your existing pages to any new page is hardly insignificant.

  • http://www.homes.com Grant Simmons

    While I agree that Click Through Rate is an important metric (it’s something that Google has easy access to, and has admitted as a key factor in paid ads), I don’t agree that bounce rate as a metric is a critical component of the algorithm, or considered in a ‘search vacuum’.

    I believe that the algorithm ‘looks’ at CTR of query topics, analyzes the dwell time (time to return to the SERP), measures site engagement metrics (quality signals from content etc.), considers query modifications (changes to the original query in subsequent searches), and then (as you mention above) compares to other sites it qualifies in the same query topic segment.

    In that way bounce rate isn’t a definitive number, it’s a relative number ‘similar’ sites are measured against. e.g. a query for opening hours of a restaurant will probably have a high bounce rate and one page engagement, a query for the history of a city might have a relatively low bounce rate and multi-page engagement.

    Challenges of absolutes and a focus on scale of metrics make bounce rate a useless metric in a vacuum, but add in other data points and a bit of common sense, and you can work to improve site wide metrics based on user engagement expectations.

    One more thing I’d add to ‘content quality’ is the factors of uniqueness (when published), authority, and supplemental content as key differentiators of ‘quality’.

    Google is a massive connections engine and understands the connections of age (as noted above) and uniqueness i.e. ‘who said it first might be the authority, all others are pretenders’ *unless* they have added significant value through additional (supplemental) content.

    2c

    Cheers

    • R.Rogerson

      I’m sure that, somewhere, there was a patent of G’s regarding “supplementary content”, such as PDFs etc. that pertain to the page being looked at. The idea is that sites that go the extra step give extra value to the user.

      Nice point on the relative-comparative, rather than absolute of things like CTR/Time before Return to SERPs.
      Though G identifies “content types”, automatically grouping pages by common performance would aid them no-end.

  • http://www.leapfrogleads.biz/neeraj-kumar-knkayastha/ Neeraj Kumar

    Hi Neil,
    Very True.
    May I add that if the page goes viral on social media, there is every possibility that it will rank well on search engines!

  • Dave

    First of all, I do believe that there are websites that are not optimized but still ranking well in SERPs, and I also believe in most of the previous blogs of Neil Patel. However, it seems that he failed me on this one, especially what he said in #1, 2, 3, 4, 5, and #7. I disagree with all of these.

    • http://www.moxiedot.com Kelsey Jones

      I’d love to hear more on why, Dave!

    • Mike

      Really? I cannot even agree with #6. It’s a silly theory.

      I thought this was one of the worst SEO related articles I have ever read.

      Seriously things like EDU/GOV links carrying magical ranking powers and link velocity playing a role in rankings… It’s laughable.

      Hopefully, people are smart enough to realize this is nothing more than crappy link bait designed to target newbs.

      • http://www.moxiedot.com Kelsey Jones

        Hi Mike, we appreciate the feedback. SEO is an ever-changing industry and we are all here to learn. I’d be really interested in knowing of any studies that have proven EDU/GOV links are not valuable (I’m not saying I believe either way).

      • http://spartanmarketingacademy.com Mike

        Kelsey,

        EDU/GOV links have long been a tactic of snakeoil salesmen in the SEO niche.

        There is zero evidence that a link from a page with a .edu or .gov TLD is stronger than any other link out there.

        Don’t get me wrong. If Harvard wants to link to one of my sites, I’ll take it. But the kind of .edu and .gov links that you can create on your own, are worthless.

      • R.Rogerson

        I cannot believe it’s 2015 and we are Still banging on about .edu and .gov links!
        It’s been more than 10 years.
        10 years!
        Google have stated numerous times (MC did a vid what … 5 years ago?) that there is no ranking bonus for certain Global TLDs.
        In fact – the Only TLD difference is Global vs Geographical – and that depends on whether the site appears to be local or not.

      • https://www.facebook.com/royalravirajput Ravi Rajput

        Yeah.. I never heard before that links from edu/gov sites help you rank higher.
        Can’t understand what type of special stuff they have 🙂 🙂

    • http://nosuchagency.dk Soren Jensen

      @Dave
      I totally agree with you on this one…
      This post is full of postulates without any evidence whatsoever…. CTR?? .edu and .gov links! Seriously???
      And #5, Content Quality. Then please explain why loads of sites with sh** content (Google Translate horror) or almost no content at all rank…

      I have to say, I’m disappointed Mr. Patel – This is not what I expected from you. Sorry ’bout that.

  • http://www.marc-heiss.com/online-marketing/ Marc Heiss

    Hi Neil,

    great article. Thanks for your share.

    Best,
    Marc

  • http://quickwebresources.com/ Deepanker

    I believe that the number of backlinks, domain age, and CTR are the main factors which help sites rank high even if they are not optimized. I see various tech blogs that write most of their posts in under 150 words, but they rank high because of their backlinks and authority.

  • http://www.brainybull.com/ Anil Saini

    Great post Neil. I have seen some websites, like bubblews.com, that do not focus on SEO optimization for user posts but still have good Google PageRank and a good Alexa rank as well. Thanks for the post 🙂

  • http://www.prosites.com Norm

    Regarding CTR, it’s funny: all those rank-checking tools that check your position for various keywords trigger impressions in GWT but of course do not trigger clicks. Of course they trigger impressions and no clicks for everyone, so it’s not necessarily hurting anyone, but you can do very weird long-tail searches with rank checkers that trigger your site, and you will see those phrases in Webmaster Tools. It also explains why you will see high impression counts for phrases where your average position is 10+ pages back. All these rank checkers are naturally lowering the collective CTR of everyone.

  • http://www.digitalsuccessagency.com Peggy White

    Thanks for sharing this useful piece of content. You cleared up my perspective on backlinks. I agree with you that cross-linking alone can do what on-page tactics can’t.

  • http://www.laravelecommerce.com/ Arun Kumar

    Hi Neil,

    You are absolutely right. I have been noticing that the above seven are the core reasons sites rank high on Google. Domain age especially plays a vital role in ranking higher on Google, even more than backlinks.

  • http://www.seoworx.net.au/website-design-parramatta/ Richard

    A big reason why these completely un-optimized substandard websites rank is because people search for them using branded search terms. And when they search for them using standard keywords, it’s done so via a genuine IP address that has relevant and authentic search history behind it.

    Crowd searching & CTR manipulation can definitely help rankings, but the integrity and authority of the IP addresses used is a huge contributing factor. Hits on your site from proxy IPs won’t even show up in your analytics, let alone boost your rankings. Even if you used elite proxies (that actually do show up in analytics as legitimate traffic) to manipulate your CTRs, the reputation, age and authority of those IP addresses won’t do much to increase your SERPs.

    Just like certain social media profiles have more authority and clout, individual IP addresses are also classified using behavioural profiling by Google and rated according to their authenticity & online activity. This is why fake crowd sourcing to manipulate CTR & Serps doesn’t work because the integrity & authority of the IP addresses used to do it is non-existent.

    • http://www.grahamseo.com Graham

      I’m not so sure about that purely for the fact that IP addresses change when people are mobile or not on a fixed IP with their ISP. My IP address changes several times a day depending on where I am and what I am doing (office, gym, home etc). My Google profile is consistent though and all of the history and things that go with it.

      I agree with what you are saying though, but for my money it’s more likely to be connected with the profile.

  • http://www.flexibleseo.com.au Jason

    Great info. A few years ago I always used to ask myself the very question this article addresses. I did a little bit of research and realised CTR is important. I have since always valued CTR as an important ranking factor.

  • http://soref.fr Marc

    I wonder why nobody had a go at another explanation: some unoptimized websites rank higher just because… they are not using SEO techniques! In our post-Panda/Penguin/Hummingbird/Platypus etc. Google world, FUD is still the best asset Googlers have to counter the SEO business. Thus I’ve noticed some websites were on the first page without any links. The content is OK but not that great. Nothing else in this article applies (domain age, etc.). The only reason, for me, is to make SEO people search for years… without finding any good answer…

    • R.Rogerson

      A site at the top of G without links is … well … darn unlikely from experience.
      I know they used to have things like the “honeymoon” period, when a new site may get a short term boost to monitor response/performance … and some content may get a pickup for freshness for a time …
      … but the chances of a brand spanking new site being in the top position without links….
      … care to give some examples?

      That’s not going against the concept of FUD though – I know G likes to play games. They have some things in place to delay (and vary that delay) the influence of some ranking factors – and they aren’t always completely informative with updates (they released Penguin with Panda … and made a bunch of other little tweaks at the same time – some of what was assigned to P+P was nothing other than threshold adjustments).

  • http://oziti.com.au/web-design Vikas

    Nicely put together. I often get asked the same question and it can be really hard to explain. An explanation from a reputable source is just what I was looking for.

  • http://www.straightnorth.com Scott Hepburn

    I challenge the premise that sites meeting the criteria above aren’t optimized. Most of those factors are the very epitome of optimization.

  • http://lasertekservices.com/ Neil Lopez

    Hello Neil,

    “As a website owner, you shouldn’t focus too much on your competition” makes a lot of sense. Sometimes I tend to spend more time spying on my competitors rather than making simple changes to my site. Your article is an eye-opener for me.

  • https://www.facebook.com/royalravirajput Ravi Rajput

    Agreed with all, but a bit confused about how the links we get from edu/gov sites help us rank higher in the SERPs.