On-Page SEO Factors: Which Ones Have the Most Impact on Rankings?


Even though SEO is a synergy of different practices, not all of them are equally important for higher rankings. And, because SEOs are usually pressed for time, it’s essential for them to know which SEO tasks should be their priority.

Speaking of on-page and off-page SEO (many SEOs also treat keyword research as a separate aspect), I’d say that, quite often, on-page SEO does not get the attention it should. This is because, in general, it takes less time than link building and often plays second fiddle to off-page SEO when an emergency rankings boost is required.

However, there are certain on-page SEO factors that, when leveraged, can work miracles for your site, and it’s important to know how much impact each of them carries. So, let’s talk about these factors.

Lay a Foundation for Higher Rankings

Any construction work begins with laying a foundation for the future building. Likewise, on-page SEO begins with building a framework for the site’s content. What’s important SEO-wise is that this framework gets ‘A’ grades from the search engines. In other words, search engines should find it easily crawlable and non-confusing.

Here are the checkpoints that should be covered when running an on-page SEO audit. Each factor is rated on the following five-point scale, from the strongest impact to the weakest:

  • of utmost importance
  • highly important
  • very important
  • quite important
  • of minimum importance

Use this scale to gauge how much weight each of the factors below carries.

HTTP response code errors

A reliable hosting provider is crucial to your website’s success in the SERPs. If the server your site is hosted on is often unavailable or takes a long time to respond, the search engines won’t think much of your web resource.

Not to mention that, if your site is unavailable, the users will simply not be able to access it. Therefore, we consider this factor to be of utmost importance.

Site speed

It’s all about the user experience these days, and the search engines have gotten even pickier when it comes to site speed. Over a year ago, Google confirmed that site load speed was an important ranking factor.

Internal links pointing to the page

When the webpages of your site are interlinked, it helps both users and the search engines navigate it. Internal links with appropriate anchor texts have a big impact on a page’s position in the SERPs. However, be careful not to overdo it – your internal links should look natural to the search engines.

Correct rel=”canonical” Use

Canonical tags help one eliminate duplicate content WITHIN their own site. Let’s say you have several versions of your URL indexed (for instance, www.example.com, example.com and www.example.com/home.html), in which case Google may treat them as different pages with duplicate content on them. So, using the rel=”canonical” attribute would solve the problem.
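When auditing many pages at once, it helps to confirm what canonical URL each page actually declares. Here is a minimal sketch using only Python’s standard library; the markup and URL in the demo are hypothetical:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Records the href of the first <link rel="canonical"> tag encountered."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical" and self.canonical is None:
            self.canonical = attrs.get("href")

page = '<head><link rel="canonical" href="https://www.example.com/"></head>'
finder = CanonicalFinder()
finder.feed(page)
print(finder.canonical)  # https://www.example.com/
```

Running this over each indexed URL variant quickly shows whether they all point to one preferred version.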

Absence of Broken Links

Search engine bots crawl not only the page they arrive at, but the links they find on it. If any of those links are broken, the overall impression of your page is spoiled.

At the same time, it’s understandable that you can hardly be held responsible for what is happening on third-party sites you link to. Hence, it’s not a HUGE ranking factor in the eyes of the search engines, even though a fairly important one. The best practice would be to check your site for broken links from time to time.
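That periodic check can be scripted. Below is a rough Python sketch (the URLs and the stubbed fetcher are made up for the demo) that flags links returning an HTTP error or no response at all:

```python
from urllib.error import HTTPError, URLError
from urllib.request import urlopen

def find_broken(urls, fetch=None):
    """Return the URLs that fail to load (HTTP 4xx/5xx or no response at all)."""
    if fetch is None:
        def fetch(url):
            return urlopen(url, timeout=10).getcode()
    broken = []
    for url in urls:
        try:
            if fetch(url) >= 400:
                broken.append(url)
        except (HTTPError, URLError):
            broken.append(url)
    return broken

# Demo with a stubbed fetcher so the sketch runs without touching the network.
fake_statuses = {"https://example.com/ok": 200, "https://example.com/gone": 404}
print(find_broken(fake_statuses, fetch=fake_statuses.__getitem__))
# ['https://example.com/gone']
```

In practice you would feed it the list of outbound links scraped from your pages and run it on a schedule.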

Perfect HTML Code

Although the quality of your markup is not an insanely important factor, sloppy HTML code does diminish your chances of ranking higher in the search engines. It’s best to make sure there are no errors on your webpages using the W3C markup validator. Besides, there are on-page SEO tools (like our WebSite Auditor) that come with a built-in markup validator as well.

Valid CSS

Just like perfect HTML, valid CSS is not a must, but rather an additional asset to one’s recipe for successful SEO. Besides, you’d want to pay more attention to your CSS if you’re optimizing for the mobile Web and using special mobile style sheets for mobile users.

And, as W3C offers a CSS validation service as well, you can use it to check for CSS errors and warnings.

Now we are talking CONTENT

Once the infrastructure for your site’s content has been taken care of, it’s time to optimize the content. A site’s content sends the search engines certain signals, using which they determine how relevant the site is for a particular query.

The more signals there are and the stronger they are, the more likely a search engine is to deem your site relevant. However, too strong of a signal could be suspicious, right? So, keep that in mind.

In respect to content, what matters for rankings is:

Keyword-related factors

Keyword in URL

Everybody in the SEO industry knows that exact match domains tend to perform extremely well in the SERPs. If so, why doesn’t everyone just buy a domain name that also happens to be their keyword and hit Google’s top?

Well, there are certain cons here. First, the desired domain name is often taken. Second, for branding purposes, it is better to go for a brand name as your domain name. And, third, we believe that the impact that exact match domain names have on rankings will most likely decrease in the future.

Keyword in Page Title

Unlike in the case with domain names, it’s much easier to stick your keyword in a page title. Just make sure that your titles also look natural, since real people are going to see them in the search results. For example, out of the two page titles

  • Holiday Gifts New York|Gifts Brooklyn|Free Delivery|Gifts
  • Holiday Gifts for All Occasions – Brooklyn New York – Free Delivery

The second one is definitely more attractive.
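Checks like this can be automated across a whole site. This Python sketch is only an illustration; the 65-character limit is an assumption (the length Google displays varies), not an official cutoff:

```python
def title_ok(title, keyword, max_len=65):
    """Hypothetical sanity check: is the keyword in the title, and does
    the title fit within a typical SERP snippet length?"""
    return keyword.lower() in title.lower() and len(title) <= max_len

print(title_ok("Holiday Gifts for All Occasions - Brooklyn, NY", "holiday gifts"))  # True
print(title_ok("Gift Shop", "holiday gifts"))  # False
```

A title that fails either condition is worth a human look rather than an automatic rewrite.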

Keywords in Internal Anchor Texts

As I said earlier, internal linking is important for higher rankings. However, simply linking to your landing page from other pages of your site won’t do much. To attain the desired effect, use your keywords in the anchor texts. Also, vary your anchor texts: links with identical anchors will look unnatural to the search engines and may be ignored.
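One rough way to spot over-repeated anchors is to count them. In this sketch, the 50% threshold is an arbitrary assumption for illustration, not a known limit the search engines use:

```python
from collections import Counter

def overused_anchors(anchors, share=0.5):
    """Flag anchor texts that make up more than `share` of all internal links."""
    counts = Counter(a.strip().lower() for a in anchors)
    return [text for text, n in counts.items() if n / len(anchors) > share]

anchors = ["holiday gifts", "holiday gifts", "holiday gifts", "gift ideas"]
print(overused_anchors(anchors))  # ['holiday gifts']
```

Anything the function flags is a candidate for rewording, not proof of a problem.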

Keyword in H1 Text

H1 is an HTML tag normally used to mark headings. When your keywords stand in H1 tags, they carry more weight, so to speak, and have a bigger effect on your site rankings.

Keyword in Image Alt Text

Image alt text is what gets displayed instead of your image when it cannot be loaded, or when certain functions responsible for rendering images are disabled in a person’s browser. If image alt texts reflect the rest of your page’s semantics, this sends the search engines a signal that your page is relevant to the search query, and they rank it higher.
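Images with missing or empty alt text are easy to find programmatically. A minimal sketch with Python’s stdlib parser (the demo markup is invented):

```python
from html.parser import HTMLParser

class AltAudit(HTMLParser):
    """Collects the src of every <img> that lacks a non-empty alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and not attrs.get("alt"):
            self.missing_alt.append(attrs.get("src"))

audit = AltAudit()
audit.feed('<img src="gift.jpg" alt="holiday gift basket"><img src="logo.png">')
print(audit.missing_alt)  # ['logo.png']
```

The flagged images are the ones worth describing, keeping the alt text in line with the page’s actual content.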

Keyword Frequency (body text)

Needless to say, your keywords should be present in your page copy as well. However, the exact number of keywords that helps you achieve top rankings would depend on the niche, the search engine, etc. So, there is no ideal keyword density that’ll work for any site.

The general rules that apply here are: it’s best to use your keywords more towards the beginning of your page and to avoid keyword stuffing.
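For what it’s worth, keyword density is simple to measure, even though (as said above) there is no ideal value to aim for. A hypothetical calculation in Python:

```python
import re

def keyword_density(text, keyword):
    """Share of words in the copy that belong to occurrences of the keyword phrase."""
    words = re.findall(r"[a-z']+", text.lower())
    kw_words = keyword.lower().split()
    hits = sum(
        words[i:i + len(kw_words)] == kw_words
        for i in range(len(words) - len(kw_words) + 1)
    )
    return hits * len(kw_words) / len(words) if words else 0.0

page_copy = "Holiday gifts for every occasion. Our holiday gifts ship free."
print(round(keyword_density(page_copy, "holiday gifts"), 2))  # 0.4
```

A number this high would scream keyword stuffing on a real page; the toy copy is just ten words long.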

Keyword in Bold/Italic

Keywords in bold/italic do have a bit more significance in the eyes of the search engines, but, please, remember that the user comes first and use formatting wisely.

Keyword in Meta Description

If your meta description contains keywords irrelevant to the overall theme of your webpage, this will do little to improve your rankings, and the search engine may even choose not to display the description you specify.

However, if your meta description is semantically in line with the rest of the page’s content, this can give your site rankings a boost.

Content quality-related factors

Content Uniqueness

Did you know that, if you copy content from another site on the Web and post it on a webpage, the page may not even turn up in the search results? Google sometimes treats pages with identical content as versions of one and the same page, determines which one of them appears to be the most trustworthy and filters out the rest of the pages from its search results.

So, if you’re striving for higher rankings, the content on your page must be as unique as possible.

Content Freshness

In general, the more often you update your content, the better. Fresh content is especially important for Web 2.0 sites, news portals, etc. Besides, many SEO professionals say that, in the future, the query deserves freshness (QDF) mechanism in Google’s algorithm will be triggered more and more often, so, fresh content will become even more important.

The Amount of Content Above the Fold

In light of Google’s recent algo update, having sufficient content ‘above the fold’ has become more important than ever. ‘Above the fold’ is the part of the page users see immediately upon arriving at your site, without having to scroll down. Hence, make sure that enough meaningful content is visible right away.

Content Density (thin content)

The overall content density of your site is also important. Google’s Panda update series targeted websites with ‘thin content’, among other things. Well, looks like owners of minimalist sites now have something to ponder.

The Use of Visual Content

The search engines also look at whether you’re using any visual aids, such as images or videos on your website. It’s not a crucial on-page SEO factor, but still can win you a tiny fraction of Google’s affection, since it serves to indicate that your site is more likely to provide a decent user experience.


If not done properly, on-page SEO can really become a stumbling block for your site on its way to Google’s top. Hence, it is just as important to take care of it as it is to do link building.

And, when performing on-page SEO, identify your priorities to save yourself time and move your site up the ladder of success with minimum effort. The above list of on-page SEO factors will help you do just that.


Alesia Krush


Alesia is an SEO and a digital marketer at Link-Assistant.Com, a major SEO software provider and the maker of SEO PowerSuite tools.
  • Jenni

    Great list. I think rel=”canonical” should be upgraded in its importance though; if it’s wrong it can kill rankings dramatically.

  • mike litson

    Ok, just a couple of issues here,

    a) the canon tag is subjective to the level of dupe issues a site has. so saying it’s only quite important is a little iffy, I’ve seen sites come from nowhere just because the canons have been put in properly.

    b) keywords in bold/italic – a waste of time and the sort of thing which (if Cutts’ recent interview wasn’t just scare tactics) is about to be slammed.

    c) keyword in the URL is not of the utmost importance – it can help in the sense that your link building will look more natural if one of the key terms you’re going for is your brand, but it won’t get you any rankings by itself in a competitive niche. Let’s look at “online slots” as an example (I’m browsing in the UK here): I don’t see any exact match domains in the top 5 (which is where most of the traffic comes from, and I’m being generous – most goes to the top 3). But it is not of the UTMOST importance.

    Other than that nice enough article.

    • Melissa Fach

      Great points Mike.

  • New Egypt Consulting

    Brilliant. I would like to add two related hints:
    1- Adding social media sharing buttons would highly support long-term SEO efforts for the website owner.
    2- Installing blog with updated content is truly recommended for corporate websites (especially B2B). Google’s Matt Cutts had referred to blogs with fresh content as key tool for link building.

  • todd

    Great article, definitely getting bookmarked. On-page SEO is the most important.

  • Amit

    Great Info Alesia, however I think you have missed to talk about robots.txt, sitemap.xml.

  • David

    Very nice article… Thanks for sharing that great information…

    I agree with you, Mike, on b) keywords in bold/italic.

  • Luke McGrath

    Hi Alesia, nice article and good presentation. For a 101, it ticks all the boxes. I have to agree with Mike though, bolded keywords don’t belong on this list. It’s probably no longer a ranking factor and can lead to poor user experience if mishandled.

    I’d add a disclaimer that if doing any of these things detracts from the usability of your page, conversions may suffer despite increased traffic. Things like tacky, keywordy URLs don’t always engender confidence.

    • Alesia Krush

      Thanks, Luke! I totally agree, these on-page techniques are quite basic and for sure shouldn’t interfere with one’s website usability optimization. Thanks for your point.

  • Sanat Singha

    I like the content area you discussed. Any content should be visually appealing other than its research values.

  • Autocrat (Lyndon NA)

    Sorry to say – some of that isn’t accurate.

    Valid HTML & Valid CSS are Not ranking factors.
    So long as G can parse them (and it handles most errors) and you aren’t doing any “bad” (such as tiny text etc.),
    then you can have errors/warning and no negative affects at all.

    Meta Descriptions are Not a ranking factor.
    They haven’t been for years.

    Speed is factor – that affects a small %.
    Further – G never said it was an important factor – merely that it was “a” factor.

    Content Density?
    Thin Content refers to have low value/low originality content for the purpose of ranking for specific terms.
    IOt is not really associated with the Quantity of text on a page.
    You can have 2000 words and still have Thin Content.

    • Alesia Krush

      Autocrat, thanks for your comment.

      It looks like some of the things sound too categorical in my post and need some clarification.

      OK, let’s talk about valid code not being a ranking factor. Although crawlers have gotten really advanced and can “handle” tons of code errors, there’s one significant “but”: if they cannot handle them, they stop indexing the page right at that point in the code. That wouldn’t be caused by a missing ALT attribute, of course, but, if you completely ignore validation, you risk overlooking more serious issues, which may lead to the situation I described above.

      Besides, invalid code can be an issue not only for search bots, but also for users – for instance, if the page is incorrectly rendered in a web browser. I think Google would not like that. ;-)

      Meta descriptions are rated “of minimum importance” in my post. However, they can be very important in some cases: by modifying the description, I once managed to double the CTR of one of the pages – it took me about half an hour to revise and change the copy, which had such great results. In its turn, better CTR can also result in more traffic and better rankings, so, in a way, descriptions can influence rankings (not so directly, I agree!)

      Site speed is important to users as well. When it takes more than 10 sec for a page to load, your bounce rate back to the SERPs may rise considerably. Wouldn’t that influence rankings?

      Yep, you are absolutely right, “thin” content is most frequently low-quality, no-value text. But that’s what we, users, can definitely see and identify. The googlebot, however, as sophisticated as it is, is a machine and follows algorithms, certain rules and criteria. The quantity of content is just one (of many) factors the bot can rely on. Google has been talking for ages about working on your site’s content. Yes, it does mean creating valuable content, but it also means creating sufficient content. Remember the recent “above-the-fold” update? Google (well, Matt Cutts, to be exact) said that it’s been hard for some users to find actual content above the fold, and they decided to address it. That’s because each page is like an answer to one’s question and people prefer full, profound answers.

      • Scott Harris

        From the official Google docs:
        #1 consideration goes to properly formatted and filled schema.org tagging.
        #2 consideration goes to properly formatted and filled opengraph tagging
        #3 consideration goes to the venerable “meta description” tag.
        #4 Leaving no meta or microformat description on a page, instead leaving it up to Google’s bot. Something they specifically do NOT recommend.

        What’s interesting is no one says squat about Dublin Core, other than it “inspired” the use of other microformatting schema. W3C’s Unicorn red flags all DC (they red flag a lot of schema.org, too – and be careful with html5: I’ve seen Unicorn validate things that won’t render in a browser). Wikimedia released 1.18 (and 1.19 since) with access to formatting Dublin Core… With Wikipedia figuring as prominently in the SERPs as they do, it makes you wonder…

  • Ros

    A very thorough article in my opinion. The nature of the ‘Google Beast’ is that we cannot ever know for definite the exact results of our efforts but I do think that if the advice above is implemented, we won’t go far wrong because all the actions combined ‘should’ indicate a quality website. It’s a little like playing musical chairs though as it could soon be ‘all change’ and we’ll have to jump to it all over again!

  • Ned Wells

    This is a handy list, Alesia, thanks for posting. Your ranking for importance feels about right to me, but I’d be interested to know… are they based on your experience, or on testing and experiments?

    • Alesia Krush

      Thanks, Ned.

      As SEOs, we all have our unique experiences based on the campaigns we run and the stuff we hear from our fellow SEOs. But I think one should check his/her presumptions against the actual data, just because there are so many factors at play. Thus, I’ve always preferred boring, figure-dominated stats reports to well-written but speculative SEO posts.

      So, what you read here is rather my accumulated (and, I’d say, abridged) experience plus the stats that I’ve collected from various resources over a long period of time and have tried to bring to a common denominator.

  • Nick Stamoulis

    Whenever I am working with a new client, on-site SEO is always the first thing we focus on. In my opinion, what is the point of doing a great link building campaign if the site isn’t ready for the visitors? You can control your onsite SEO, so why not get it in top shape from the start?

  • Ivan Temelkov

    Absolutely love this post. Dead on. Always good to see others out there that are considerate about SEO efforts. Your blog post outlines all the specifics that you should be optimizing accurately and especially when building a new website.

    Thanks for sharing the knowledge.

  • Reg-NBS-SEO

    Great article.
    I would move the H1 tag up to ‘of utmost importance’, as it is of prime importance to Google and to visitors when converted to its visual equivalent.
    I would actually lump the h tags in one category. h1 -> h3 being the primary tags defining the semantic hierarchy.

    Keyword in Image Alt Text is one i would downgrade.
    Tests have shown that unique text in alt tags is not indexed.
    Therefore alt tags have very little effect on SERPs.
    It is also important to have the alt text reflect the content of the image.
    Of higher impact is CSS image replacement. Using this for accessibility you can set the text in h tags.

    @Mike Litson,
    On my search on Google.co.uk, the top 3 URLs have the word slots, #4 does not, and #5 has the word in the Domain name.
    If I use Google.com the top 5 have the word.
    If you look at the phrase, the word “online” is redundant, the key being “slots”.

    Google tells us that both position and size of text are important.
    They also say that capitalization factors in, so I would imagine that any decoration of the text is important.


    • Alesia Krush

      Thanks, Reg-NBS-SEO!

      In the beginning, I felt it was pretty hard to classify all on-page factors according to their importance. I’ve worked on different projects and, to tell the truth, sometimes observed “contradictory” SEO results. So, some SEO’s may have noticed that H tags are just as important as page titles. However, in my own experience, titles have always played a greater role.

      As for image alt text, I had an interesting experience: one day, I needed to rank for a very competitive keyword pretty badly. After doing on-page, I saw traffic coming for that very word! I couldn’t actually believe my eyes and rushed to Google to check where the website ranked. Well, it wasn’t there, at least not in the top 100. Then I spotted the image we had optimized (using an alt tag) in Google Images results. Turned out, our competitors lacked visual content, that’s why we had hit the top of Google Images no problem. The story taught me to take image alt text seriously.

      • Scott Harris

        Image alt text… there’s also image title text: that little box that pops up when you hover your mouse over an image. Hidden deep in the bowels of the accessibility specs is a limit on these of 90 characters each. Image maps: yes, Google says they don’t read these but screen readers do. If screen readers can, do you expect the Googlebot to be less capable? Fill out all alt and title tags, then somewhere on the page every link in the imagemap must be mirrored in text.

        Image alt text is required for accessibility, image title text is not. Link title text is required. When using heading tags, they must be used in order: h1 > h2 > h3 etc. There are specific limits on how many headings of each level are allowed on a page.

        Accessibility also requires that the page title tag mirror all or part of the page’s main heading content. Page title preferred limit: 90 characters. Heading preferred limit: 45 characters. Never use more than two of the same heading tags on a page. If you run a .js slideshow or other effect at the top of a page, make sure there is a heading tag on something before it.

        Rather than bolding or italicizing a keyword, use a heading tag; the bot recognizes that on first read. Be careful with this: heading tags inline, surrounded by normal text, get negative flags. Use the word as a section title instead, which adds extra weight.

  • Deb Ward

    Terrific presentation of good insights and material. I really like the idea of “ranking” the importance of each piece. Thanks Alesia.

  • Gian Marc

    nice article. well written and structured. read through until the end without fatigue :) thanks!

  • Scott Harris

    You’re kidding, right? Some of the comments in this “article” are now 180° from true. So many potentially valuable on-page elements aren’t even touched, while deprecated factors are rated “very important”. If this were the cheat sheet for an SEO 1.0 exam that would be one thing, but this is a new world and the new world revolves around new user enhancements, not quotes from an early draft of the SEO 1.0 Bible.

    No “apple-touch-icon”? No “viewport”? No “modernizr”? (Well, if we’re talking SEO 1.0 you probably don’t know what modernizr is or does.) No microformats? No schema .org? No “landmark” tags (and their associated user enhancements)? And then comes “link building”…

    I got into it a while ago with a group of folks who swear by the SEO PowerSuite toolset. Their customer lists are long and of good pedigree. And as the lead SEO folks at each company have written me since Panda’s last run, their link building techniques (read: linkspamming) have cost their customers greatly… as I said it would. Our conversations are quite different now as they are in recovery mode and recovery mode offers no bragging rights until they’ve recovered completely. And to think they were charging (and getting) big bucks for that… Some of their customers lost 80% (and more) of their traffic overnight.

    Are you ready for what Google is going to throw at us next? I’ve been throwing things at them for years and we’re only about halfway down that list… they finally got to “over optimization penalties.” I hope they get to “publisher intent” before too much longer… and I know Google has denied any comments about them targeting common ad-server coding but hey, pull those javascripts (if you can afford to) and see where you rise to in the SERPs… If you can’t afford to pull those javascripts you’re really in trouble.

    • Josh

      “Modernizr?” Strangely, I found no mentions of it in SEOmoz blog posts.. And from what I found in Google, it’s more of a dev tool. That’s a 101 type of article, targeted mostly at SEO beginners. Or do they have to start learning on-page optimization with utilizing Modernizr? lol

    • Alesia Krush


      I wonder if you could expand on the “many potentially valuable on-page elements” that “aren’t even touched” in the post. I’m sure that would benefit the readers of this resource, including me.

      So, you’ve mentioned apple-touch-icon. That’s probably an important thing to do, considering the ever-increasing number of iOS/Android users. I have to admit I didn’t know exactly what it was for, but thanks to Wikipedia, I do now.

      To save someone’s time on googling:

      “For Apple devices with the iOS operating system version 1.1.3 or later, such as the iPod Touch, iPhone, and iPad, as well as some Android devices, one can provide a custom icon that users can display on their Home screens using the Web Clip feature (called Add to Home Screen within Mobile Safari). This feature is enabled by supplying a <link> element in the <head> section of documents served by the web site. If the custom icon is not provided, a thumbnail of the web page will be put on the home screen instead.”

      So, I don’t really think it’s critical for rankings. In case you have experienced any improvement or vice versa due to the presence/lack of this element, I’d appreciate your sharing the case.

      Also, I’ve talked about some basic on-page SEO practices that can be applied to pretty much any SEO campaign. The fact that they’re basic does not make them unimportant. Believe it or not, one needs to fix broken links and optimize page code before applying the microformats, which have their impact as well, but, in my opinion, do not belong to the basics.

  • Ron

    I’m wondering whether a web page/site that employs all the on-page optimization methods listed will be adversely affected by Google’s announced “over-optimization” change to its algorithm.

  • Autocrat (Lyndon NA)

    @ Ron

    Apparently Matt Cutts has said that they are looking at things like;
    Keyword stuffed pages
    Unnatural Backlink Profile

    So look through some of your pages.
    If they appear informative, natural (or slightly above) and readable – you should be fine.
    On the other hand, if you seem to be using your target term 1 word in 6 … you have a problem.

    • Ron

      Thanks Autocrat … hope you’re right. Then again it’s all speculation at this point I suppose … since we don’t really know Google’s math (what it is or will be). It’s interesting how a few people (mostly guys from what I can tell) sitting around a table at ‘the plex’ can algorithmically impose their subjective value systems … and the rest of us are left to dealing with the implications. That’s a lot of power they have … more than elected officials in many cases.

      • Scott Harris

        @ Ron: You don’t know the half of it. Google engineers experiment all the time, with virtually everything they have running on the web. There’s good and bad about that. Often enough, an experiment will have a particular target but will cause massive collateral damage. In sorting out the collateral damage, they’ll come up with a manageable tweak for something else and voila: Matt Cutts is shortly making an announcement of a new “feature” and they add another notch to the list of this month’s tweaks. Getting from the disaster to the feature can take months…and more experimentation.

        Not long ago we saw an experiment run by AdWords/AdSense engineers on Youtube videos. Everything about the experiment broke every rule Google imposes on their advertisers and their publishers. The experiment was ugly as sin, the resulting outcry huge, huge enough that Google engineers acknowledged the gaffe and said it wouldn’t be repeated. Doesn’t mean it won’t come back as a coyote wearing sheep’s clothing next time… it just won’t be the same wolf. But that one was large enough and nasty enough that many folks took note and screamed.

        Most of their experiments run just under the radar and either they are successful at returning specific planned-for data in a set time period or they are shut down. The nastiest of all experiments are those that affect data in the search index. Second nastiest are probably those that affect Google accounts. My personal favorite response from their side when something went wrong was: “That was the result of an over-aggressive spam bot…” All it takes is for one engineer to misplace a special character… and millions of folks and their businesses get nuked with no apologies ever offered. I’ll betcha that thought makes some politicians just salivate…

  • pradip

    Nice info, Alesia. Your suggestions are very good for improving the ranking of a website. I appreciated your post. Thanks for sharing the information.

  • Chris Gregory

    Good overall article. I would second the keywords being in bold or italics. I also believe that synonyms of your keywords should be used. Google’s LSI algo will recognize them as similar and you should escape the over-SEO penalties coming down.

  • Chris Gregory

    Correction: I would second NOT to use bold or italics on keywords.

    • Scott Harris

      Any modern website does all formatting with CSS. So how would a bot know that something was bolded or in italics? The author is probably a nice person, probably does her best to do a good job but she needs a refresher in today’s web… The old snake oil in this article just doesn’t cut it – and I’ll reserve my comments on those who pat her on the back and say “Job well done.” Just chafes me no end when I see this kind of stuff being passed off as “responsible, informative journalism.” It ain’t.

      One thing to know about all who try to pass themselves off as “SEO experts”: They will never tell you everything. Else why would you pay for their services? And we who are jumping around here are not the intended targets of this article: we have half a clue. This author appears to be trolling for folks without any clue.

      A lot of what SEO is about is really simple stuff, like taking advantage of the tag opportunities that most folks ignore because it’s simply a pain. Or they just don’t know html and the “opportunities” built into it, or Dreamweaver doesn’t offer them a simple “fill in the blank” option. Where the “over optimization” comes in is when you decide to fill out your image “alt” tags and write a novel in each one, then duplicate that novel in every alt tag. Same for “title” tags. There are prescribed limits placed on those tags and if you don’t know (and most for-hire SEOs are only guessing)… There’s other places where the over-optimization comes in, too, but just read any SEO 1.0 offering and whatever they suggest that you do should raise a red flag in your consciousness. As Autocrat points out: keyword stuffing (by any name) and abnormal backlink profiles (by any name) are top of the list. But for many for-hire “SEO experts,” both activities are still putting lots of your money in their bank. Beware when you buy the whole line and start buying their “SEO” software… don’t be victimizing your own customers with this stuff.

      Something else to recognize: by the time Matt Cutts makes a public comment about something, Google has been diddling with that for months already, finally settled on something that doesn’t cause too much collateral damage and already moved on to a dozen other things. He will never truthfully tell you what’s just around the corner. Whenever you are dealing with comments from a Google mouthpiece you need to look at it like a legalist: reverse engineer what is being said and try to determine what isn’t being said. Usually, what isn’t said is of significantly more value.

      It’s about the user… and only about the user. Google is finally starting to get back to that basic. If only they’d just bite the bullet and de-index all the crap, ad-bait, ad-click-bait, user-useless sites they know they keep showing. Sure would make surfing the web more enjoyable and productive…

      • Autocrat (Lyndon NA)

        Glad I’m not the only one concerned by the “well done” statements.

        Considering I pointed out some glaring issues at the start … I find it darn worrying that some people may be following the advice/suggestions 🙁

        And – for those in doubt … click my name and go view my profile.
        I was a TC in Google Webmaster Central.
        I know “wrong” when I see it 😀

      • Reg-NBS-SEO

        Wow, you are a smart guy, Scott, and from the tone of your writing, the only one.
        Why is it that we don’t see your name on the article?

        “Any modern website does all formatting with CSS. So how would a bot know that something was bolded or in italics? ”
        How indeed. Perhaps by parsing the CSS file?
        Or don’t you think that Google reads the CSS?

        Granted there is a lot of misinformation and garbage but just calling it that without any concrete examples of where it is wrong just won’t cut it.

        “One thing to know about all who try to pass themselves off as “SEO experts”: They will never tell you everything. Else why would you pay for their services?”

        If you know anything about internet marketing, there are those that will always take the free information and run with it, but there is a sizable percentage that want it done for them.
        Even if you tell *everything*, you will have those looking for you to do it.

        Damn right Google does not come right out and tell us what to do.
        Matt Cutts is full of evasions and half-truths.
        He just came out with the statement that Google will be looking at “over-optimizing SEO,” and they have been looking at this for years.

        Did you see the huge error Vanessa Fox made in her evaluation of the Mayday update?
        Do you know what it was?

        Google IS getting back to the user.
        Onpage counts for most of the SEO now and links count for little or nothing.

      • Autocrat (Lyndon NA)


        Erm – did you just say links don’t count for much?
        Considering that is still the primary method of detecting Popularity, one of the methods for Relevancy to be passed, and what initially decides the base crawl rate for a site …
        … No – links are still one of (if not “the”) most influential aspects.

        On-page becoming more important?
        Possibly … but not in a direct manner.
        Instead, User Behaviour seems to be more influential over the past year than it was before.

  • Scott Harris

    Autocrat: I don’t know if these folks know what “Google Webmaster Central” is, never mind what a “TC” is… or how one becomes a “TC”… the author of this “article” surely doesn’t. I don’t think she’d last more than a few posts in those forums before someone pointed her directly at “Newbie Central” and suggested she read everything there before coming back again. Same for all these “Good job well done” folks.

    On the other hand, why should we care? Someone has to occupy those 10 million slots in the SERPs that aren’t on the first page… might as well be the folks who hang out at Search Engine Journal.

    You do realize, Autocrat, that Sasch put this “article” in my datastream and every time I come across one of these “offal” sites I get my knickers in a twist, right? Things are hard enough on the web without people claiming some kind of authority spreading this stuff around.

    • Autocrat (Lyndon NA)

      That is actually a collection of very good points.

      Ah yes, Sasch did a share of it, did he not?
      Bless him – I think he was put out by this as well.

  • Scott Harris

    There’s 2 ways to go about this whole process:

    1. Write purely for the user.
    2. Play never-ending games with the Searchbots: always running, always looking over your shoulder, always trying to guess what’s next, always praying you didn’t go too far and that you made no “mistakes.”

    I write for the user, 100% of the users. I don’t research all the guessing games around the bot activities, I don’t follow the latest obsessions, I don’t normally hang out in places like this or webmasterworld or searchengineland or any of the hundreds of others of similar ilk. There’s too much work on my desk for that nonsense. That’s also why you’ll never see my name on any “article” in a place like this. But mis- and dis-information… that I find really annoying. And when someone tosses this stuff into the datastream in the few places where I do hang out…

    I write for the user, 100% of the users. In the field where I work, there are certain legal requirements. I know, I know, “legal requirements” when it comes to web design… Yeah. You know what’s really wonderful about that? So many folks simply have no idea, no interest, no clue as to the very real value of that discipline. I can’t (and don’t) take shortcuts. Nowhere. So I never have to wonder what the bots will think when they come visiting, I never have to worry about it.

    NBS-SEO: your site fails the simplest of tests that I have to pass 100%. And that’s the simplest of the tests I have to deal with. When I ran it past the next test, it set a new record for the number of red flags (for me, anyway – I don’t normally test other people’s stuff because so little does pass, and other people’s stuff doesn’t generally concern me anyway).

    The author of this “article”: if she were honestly espousing the technical methodologies practiced in the site of the company she works for, that would be one thing. But she doesn’t. And that site would do better than NBS-SEO in the tests I have to run but it still wouldn’t pass 100%. As the Googlers say “You need to eat your own dog food and make sure it’s the best dog food before you go passing it off on other folks.”

    So I’ll stick with what I do and continue collecting those pure organic backlinks from .gov, .edu and .us TLDs. You can’t buy those, you can’t trade those, you can’t beg, borrow or steal those. You earn them, the hard way: doing the work, meeting and/or exceeding the content quality and coding standards. It also really helps if you take the time when you’re done to go through everything you did like a real user and not like a programmer… and I do have tools that help me with that… the best things about those tools: they’re free, they leave no footprints, they do the job I need to have done perfectly well. Oh, and they help me see around the corner so I know what’s in the pipeline and arriving on the scene soon.

    When you write 100% for 100% of the users, they do your social networking work for you, gladly. It’s a whole other world. But if you can’t meet (or exceed) the content and code standards, perhaps you should invest $249 USD in the SEO Powersuite… see if they’re offering any indemnification when it comes to Panda nuke strikes and the Googlebot’s newest daily activities while you’re at it.

    Do I sell SEO services? Nope, not in this life anyway.

    • Autocrat (Lyndon NA)

      I agree with the sentiment and the “spirit” behind it all …
      … but unfortunately there are minor issues.

      Most markets don’t have audiences that have sites … thus links are going to be in short supply.
      Thus you have to market/push a little to get your feet under you.
      And unfortunately, you can get .gov and .edu links by asking, arranging and, in some rarer cases, making payments 🙁

      Such things aside … indeed, if you don’t “overdo it” … if you simply tick the semantic boxes and generate for the user (properly), you generally will do better, and be virtually risk free (never 100% – I’ve watched G nuke innocents and shrug off the collateral 🙁 ).

      I personally advise as I always have – focus on the user, research their Needs/Wants/Desires … meet and exceed those in text content, functional content and actual quality of service/product … and you should do fine.
      Throw in some low-key, basic and obvious marketing, and you should do OK.

      The only time you need to get a little messy is when you are new and in an overly-populated and highly competitive market.
      Unfortunately – having a “good site” is simply not enough in many such cases.
      That’s when the real SEO research and application comes in.
      And it’s more to do with targeting correctly and ensuring a solid site than getting links.

      Sell SEO services?
      You bet.
      But normally to SEOs that need higher knowledge or a more accurate eye, or a better understanding of the real game, rather than simply ranking for naff terms or following questionable advice from someone who’s 5 years out of date or making it up as they go along (content density!!! ROFL).

    • Reg-NBS-SEO

      Man oh man.
      I just LOVE self-proclaimed SEO pros (idjits) that allude to secretive measures under the umbrella of NDAs or security.

      How do you keep up with the ever-changing search industry if you do not frequent its hangouts?

      Since you do not seem to have a site Scott, perhaps you could tell me what tests my site does not pass?
      I am #1 for my chosen keyword phrase out of a field of over 30 million competing sites.

      For someone so knowledgeable you seem to have a very limited online “footprint”.
      The only mention I could find that *might* be you is a page on iMeet pushing their services.

      Just WHERE DO the “purists” hang out and where will we see your name on an article?

      • Scott Harris

        I know, I know… I really should hang out more often with the chicken-blood-spattered practitioners of “santeria,” the snake charmers and the purveyors of purloined oil of snake… Sorry, that’s where you find “articles” like this one. And folks patting each other on the back with a hearty “Good job well done.” No, I like the places where the first thing a new visitor sees is something to the effect of: “If you don’t have an active invitation, your registration will NOT be approved.” And active invitations don’t come easy: usually you need some kind of specific credential(s) and someone who’ll personally “sponsor” you. It’s behind those barriers that productive discussion happens.

        Every now and then someone will point out one of these other sites for something or other and I’ll take a look. Often enough I’ll shortly see the author (or one or several of the major responders) pop up somewhere else, screaming about Panda or some other Google-caused disaster – funny that Bing never gets that attention… maybe they’re smart, staying under that radar like they do. Maybe they don’t emulate exploding nuclear weaponry when they do their tweaks. Then again, maybe Bing doesn’t get abused as much by the SEO freaks.

        I don’t test for keywords: keywords come and go with algo tweaks. The tests that last concern accessibility: the FAE and WAVE. But those are only the most basic tests. See what AMP says. See if you can walk through your site with NVDA and VoiceOver and ChromeVox. And see if your site makes sense when seen through those eyes. The better you do there, the better your basics will do long term with the Googlebot. But there is no test for actual content quality like real live traffic, like visitors actively recommending your stuff to their buddies, pals and friends. That’s a “special sauce” you have to come up with yourself, especially these days.
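        The kind of mechanical check that tools like FAE and WAVE automate can be sketched in a few lines. A hedged illustration – the two rules below (a declared lang attribute and no skipped heading levels) are representative examples of such checks, not those tools’ actual rule sets:

```python
# Hedged sketch of two basic accessibility checks: a lang attribute on <html>
# and heading levels that descend without skips (h1 -> h2, not h1 -> h4).
from html.parser import HTMLParser

class A11yChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.has_lang = False
        self.last_level = 0
        self.issues = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "html" and attrs.get("lang"):
            self.has_lang = True
        if tag in ("h1", "h2", "h3", "h4", "h5", "h6"):
            level = int(tag[1])
            if self.last_level and level > self.last_level + 1:
                self.issues.append("heading skip: h%d -> h%d" % (self.last_level, level))
            self.last_level = level

checker = A11yChecker()
checker.feed("<html><body><h1>Title</h1><h4>Deep</h4></body></html>")
if not checker.has_lang:
    checker.issues.append("missing lang attribute on <html>")
print(checker.issues)
```

        Real audits (WAVE, screen readers, manual walkthroughs) go far beyond this, but the discipline is the same: the whole job, checked mechanically.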

        No, my name is not a “chosen keyword phrase.” It’s a pretty common name, useless in terms of “search.” No, I don’t publish articles and plaster my name on them, same as I don’t visit several forums a day like this one and leave droppings pointing back to my business. I focus on what I like to do, my several thousand pages of content and my own advertising customers (yes, some people actually directly pay me to run their advertising on my sites – they get a much better ROI than they do with things like AdWords and OpenX). I like my space, my privacy, my eclectic sleeping hours, my castle in the mountains next door to wilderness and the time I get to spend teaching my tiny granddaughter the differences between hard smooth surfaces and walking in gravel or on grass. I’m also more openly active in the “special needs” community than out here among the neurotypicals… although it’s not my “special needs” activities that earned me the open invite to most Google offices. I will say they put out a pretty good spread of food and I especially like it when they pick up the bar tab. They send out some pretty nice gifts at Christmas, too.

        iMeet? Nope, never been there.

  • Scott Harris

    Autocrat: if you’ve been able to garner reputable backlinks from .gov, .edu and .us TLDs, hey… I’ve never tried. The crowd I hang out with are total purists: They sometimes freely offer but don’t ever ask and never, ever offer payment.

    G and their nuke problems: Yeah, they like to test their code tweaks in production. Nothing turns up problems like that exposure to scale. Of course, at that scale you can’t really back-pedal and repair any collateral damage either. My understanding is they learn more from the collateral damage anyway: a by-product of trying to define things by what they aren’t.

  • Reg-NBS-SEO

    Sorry if I don’t believe you when you say you have a site with several thousand pages.
    I think you are a legend in your own mind.

    How about a link to this mythical site?

    NVDA and other screen readers are for accessibility issues and are far removed from SEO.
    If the blind were my target market, I would be quite compliant, but they are not.

    My site displays fairly well in a text only browser. I am not concerned with the audio output readers.

    If you don’t test for keywords, you do not understand SEO. Keywords are page/site specific and do not change with Google’s algos.

    I too have a castle in the woods but it does not stop me from learning, as yours seems to do.

    I was invited to speak to Google with a client about mind-mapping software but had to turn the invitation down; I do not travel well.

    People, like you, that criticize in broad generalities are worse than spammers.
    Prove your worth, show us your work, or get off the pot.

    • Luke McGrath

      It’s a shame that “people with sight” is your target market. Taking a standards-based approach to web design would not only benefit blind or otherwise disabled customers but also your customers at large.

      It seems to me that standards-based (i.e. HTML5, CSS, WCAG) sites are future-proofing themselves against the next Google crackdown. One of the goals of SEO is to increase your customer base, so why not open your sites to all users while you’re at it?

      In any case, as much as this isn’t the best ever SEO article, no-one benefits from a p*ssing contest about who’s met who and how many pages we all have or where they are. Let’s share best practice and opinion without claiming people don’t “understand” SEO. If they really don’t – great! You can steal their clients when they get wise.

      • Scott Harris

        Good points, Luke: “future-proofing” a site by moving forward into the full coding array. If only the standards were set and validators worked correctly – I’ve seen things that won’t render pass HTML5 validation using Unicorn. And while a large part of that spec isn’t ready yet, neither are most browsers for many of the elements in that code base – but that’s what “modernizr” is about: never overdelivering to a browser.

        US Government stat: up to 20% of Americans on the web use some form of assistive technology. EU numbers are very similar. SE rewards for writing to that are not in the future, they’ve been present for years, just ignored by the mainstream.

        Fact: Google is out to destroy the whole “SEO” game as it has been played since the days of Alta Vista and HotBot. The whole concept of SEO needs to be baked into the design workflow from wireframes on. Bolt-on and after-market solutions are going out the window.

        Anyone looking at the coding of the “link-assistant” site will see a better display of present “SEO” reality than this “article” has shown. That’s what really set me off… now I can crawl back into my cave and unwrap a new software package delivered from the coding gods in Mountain View. I smell bug reports cooking… Ta Ta!

  • Scott Harris

    @NBS-SEO: Your beliefs are not my problem. Your methodology: that’s not my problem either but it does bring up a different issue. One thing about the better screen readers: they see a page almost exactly the same way the Googlebot does, although they trip over nested tables and certain other not-recommended gaffes more. And the more modern versions of JAWS and VoiceOver actually try to guess at things to fill in the blanks where lazy programmers don’t do the whole job, same way Google’s ImageBot does. I’m sure you’re aware of the shortcomings of the ImageBot, right? Wouldn’t have the problem if programmers did the whole job… And that’s one thing I like about the accessibility tools: they impose a discipline that requires that someone do the whole job… or fail the tests. In doing the whole job you’ll find all kinds of SEO possibilities that are generally ignored by most “programmers.” Not doing the whole job is a slap in the face to more US Web surfers than the entire population of Canada. That’s a lot of potential business to throw away.

    Keywords: One of the biggest question marks in the whole genre. How many is enough… how many is too many… density questions, variations, synonyms, mis-spellings, human-readability indexes versus machine-read… if your entire strategy is based around your present presentation of keywords, you’re in for a bumpy ride while Google engineers fiddle with their scripts. But you already know that. I’m also sure you know about the hit Google recently gave sites with unnatural backlink profiles… They’re looking a lot more at behaviors these days and programming to that (which is something the author of the above “article” never touched on either).

    Thanks for testing the firewall around my privacy. Several years ago an economics professor at the Université de Montreal found his way to me, not through my site but via a completely unexpected vector… caused me to take extra precautions. In some places I am a prominent personage, prominent enough to have suffered a couple DDoS attacks and assorted other nonsense years ago. These days, my primary hobby site sees a couple million visitors per year but I remain anonymous. No big deal… I’m just some twit SEO idjit making noises in the wilderness, right?

    PS: You do live in a beautiful place. Many years ago I stayed in Courtenay and Campbell River, traveling with college buddies from Newfie. But I left Canada in 1979 and haven’t looked back.

  • Paul

    Thanks, I appreciate the guide – well-structured and well-written!

  • Reg-NBS-SEO

    Left Canada eh Scott? Your loss.

    If you are indeed that paranoid about your privacy, and have protected yourself, then why not give us the links to your site(s)?

    Your words, “How many is enough… how many is too many… density questions, variations, synonyms, mis-spellings, human-readability indexes versus machine-read…” only emphasize just how little you know about SEO.

    It is proper SEO that does not get hit when the Google guys tweak their algos.
    It is the SEO flack’s pages that try to push the limits that get hurt.

    Google has been after spammy linking profiles since they first introduced PageRank.
    This came to a head in the Mayday update and has been a major part of the Panda filters.

  • Jamy

    Most of the commenters here have a deny-all opinion, while they fail to suggest or contribute anything really valuable. You’d better keep your snobbish opinions for your private blogs. For years, “top SEO guys” have been talking about mysterious “advanced” techniques. Looks like they wanna boost their “authority” and squeeze even more money out of their customers for “advanced” SEO magic (at the same time these folks keep on outsourcing the bulk of the tasks to Eastern Europe, India or wherever it’s cost efficient). They are not really able to produce real results; they only have a “vision” and the ability to share it with like-minded persons during all sorts of conferences. Even the best vision and theory are nothing without practice.

    What do we have in practice? Any website is an ecosystem where everything is important and interdependent: content, proper HTML, microformats, meta, images, layout 🙂 – simply everything. There is no magic or anything “advanced” about it. For good results you need to PROPERLY and APPROPRIATELY use every element without looking at its relative importance. The holistic approach is the key to success in SEO 1.0, 2.0 or 0.5. On top of that – you don’t have to use every trick that is available to you. Damn, why on earth should I implement modernizr on the very simple local 5-page website of a customer who uses it to promote catering services?.. Right, “top SEO guys,” it’s a fine way to make them pay another $1K for silly SEO magic that they don’t need.

  • Scott Harris

    @Jamy: You make some good points: one has to always be aware of the context (and scale) one plays in. It’s easy enough to get “Albuquerque School Photographers” or “Minneapolis Custom Cabinetry” or “Ocala RV Parks” to the top of the SERPs for those terms using the data presented above. My own perspective is tainted by the scale on which I generally operate, a scale which requires that you throw virtually everything (including the plumbing under the sink) into your projects – and yes, you vet every aspect of the hardware and software that run your site and the plumbing that connects it to the web. You deal with reputable hosts, ones who never see their names in the malware stats. You reduce server calls, especially off-site (so I don’t use calls for “googleapis” or “github” or “code.google” etc. – if I need code from any off-site repository, I import it myself and put the load on my own servers where I have some control and never get caught in their traffic jams). You strip and compress libraries. You maximize accessibility (especially if your primary traffic comes from North America or the EU). You design and build for 100% of users, everywhere… knowing that somewhere in the world, there’s a team of programmers financed by some multi-billion-dollar outsourcer copying your every step. And it doesn’t matter how much money they have backing them or how little their efforts cost, they WILL take shortcuts. Sometimes it’s more satisfying to stay small. If I didn’t enjoy the challenge so much… and no, I don’t sell “SEO services,” I sell comprehensive design and construction for the future.

  • Scott Harris

    In the end, isn’t this really what everyone’s looking for:

    “The following structured data can be used to sort or bias search results…”

    Everyone has seen this, right? It’s taken from official, publicly available Google documentation… and comes with explicit instructions for implementation and tools for testing…
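    For anyone who hasn’t seen it: that documentation covers marking pages up with structured data, e.g. schema.org JSON-LD. A minimal sketch using only the standard library; the business details below are made-up placeholders:

```python
# Hedged sketch: emitting a schema.org JSON-LD block with the stdlib.
# Property names follow the public schema.org vocabulary; the values are fictional.
import json

data = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Catering Co.",
    "address": {
        "@type": "PostalAddress",
        "addressLocality": "Albuquerque",
        "addressRegion": "NM",
    },
}

# Wrap the serialized data in the script tag search engines look for.
snippet = '<script type="application/ld+json">%s</script>' % json.dumps(data)
print(snippet)
```

    Google ships its own testing tools for validating markup like this, which is exactly the “explicit instructions and tools” point above.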

  • Scott Harris

    I’m going to stick my foot a bit further into this one again:

    A problem with doing any of this comes up when you consider that Google these days is delivering results pages targeted to the cookies you have in your browser. Until you dump cookies and cache, you can’t get an honest response. And even then, the response you may get will usually be influenced by whatever web history Google has in their files with your name on it. So this is the Heisenberg Principle in action: “The act of observing influences the observed.” If Google hadn’t invested billions of dollars in a system that trips over itself to deliver you what the system thinks you want (in a manner slanted to make the system money), it might be different: you might be able to derive an honest objective answer to a straightforward query. But it’s not that way. And every different machine you try to use to get real honesty will also most likely be tainted by the cookies and cache onboard and whatever web history (stored in Google’s database) that browser may be linked to.

    There is an incredible amount of explicit usable data for structuring queries in the back end of Google Webmaster Tools but there is nothing anywhere that can work for you to guarantee a fully honest and unbiased search results page. Even should you begin afresh after dumping cookies and cache, your queries themselves will quickly start to influence your results. And out here in the real world, who has a clue about “structured queries”? It seems only about 20% of the general American public has any idea what “SEO” means…

    So the more you try to ascertain exactly where you place in the SERPs for any series of queries, the more your act of searching (and possibly clicking on your own returns) will apparently improve your position… for yourself anyway: the results pages you will see are not explicitly the results pages I (and a large subset of the browsing public) are likely to see. So you can truly make yourself #1 in your own eyes and think you’re getting Google’s blessing while doing it… and never really approach the objective truth because in the end, the results delivered to everyone else on the planet are slanted by their own browser’s cache and cookies and Google-stored web history, too.

    Notice I’m not saying anything about whether or not you or your visitors might happen to be logged in to a Google (or OpenID) account of some sort while you are searching… that’s an influence of a whole other order.

  • Neha

    Well, meta tags are very important nowadays, as they give a short summary of what we are writing about. You have explained it very nicely; thanks for providing this useful article.

  • Mike

    What a handy list! Thanks for this.

  • Nisha Advani

    Meta tags are great if they are implemented in an appropriate manner. The list you have compiled gives various insights on how on-page SEO really impacts a website! Super list! Good for beginners as well as experienced bloggers! 🙂

  • Alice Stevert

    Thank you, Alesia. This was a well-written post on SEO techniques; I would like to commend you for your knowledge. I like the way you elaborated each and every factor.

  • Conetix

    Thanks for sharing, it helps seeing these listed based on importance.