8 Advanced On-Page SEO Techniques Your Boss Expects You to Know


After more than a decade of doling out penalties, Google has driven both the rise and the demise of manipulative SEO methods. Along the way, many professionals have nearly forgotten how much room for optimization on-page SEO still offers, and now work at a very basic level, whether out of added caution or simple habit.

No, this isn’t another article about title optimization, keyword optimization, and the importance of tags. Below, I intend to present a list of problems and solutions that will help you improve search engine rankings using on-page elements from less talked about, yet still significant, angles.

To draw a parallel, envision your site as a brick and mortar store. Your off-page SEO would be equivalent to reputation management and PR efforts, while on-page is more about what your store contains: the shelves, the cash register, etc. Each element varies in importance, but all are crucial to the store’s success.

Often, business owners insist on getting quotes for off-page work only, usually content-based, without performing any on-page changes. They may feel confident that they’ve already done everything that could be done, but the reality is often very different. A quick analysis typically reveals unprofessional or SEO-unfriendly code and markup, which can damage rankings. Advanced implementation of on-page SEO techniques strengthens the site and can have a quick, almost immediate effect on rankings, unlike off-page efforts, which usually take longer to bear fruit and depend on many factors we cannot control.

After nearly 10 years of experience in the field, I’ve stumbled across a lot of strange situations and interesting facts. In this article I’ll round up a few key tips for effective on-page optimization. As you might deduce from the title, this piece is intended for somewhat experienced SEO professionals, so I’ve made certain assumptions about base knowledge and technical ability. If anything is unclear, feel free to ask me questions in the comments section.

The following list of on-page methods is ordered randomly, and not according to significance. Take a deep breath, and let’s dive in.

Editor’s Note: the author has no affiliation with any of the tools mentioned in this post.

1. Internal Link Structure Doesn’t Tell a Story

General Explanation: Have you ever wondered why sub-pages such as “About” and “Contact Us” typically have strong PR and DA scores? In many cases, they don’t even have external links. The reason is internal links: these sub-pages usually appear in site-wide menus. By linking to these pages internally at a higher frequency, we’re effectively declaring them important. However, they may not be as strong as they could be.

Crawling robots don’t have preferences or desires; their mission is simple and calculated. For this reason, if we know exactly what they’re after, we can optimize our site accordingly. When a site is crawled, movement from one page to the next happens by following these internal links. Information is collected until the crawl is finished or until it times out. In most cases, sites squander a large part of this opportunity.

Improving the structure of your internal links is one of the most effective ways to optimize your on-page elements. Nearly every time this is done, we see an improvement in rankings upon Google’s next crawl. The method for optimizing this structure is widely known: anchors tell stories. If we took all the anchors on a given site and read them consecutively, we should get a clear picture of the site’s theme.

The Problem: Without a clear internal links structure, Google’s algorithm assigns less relevant search phrases to the site’s most important landing pages. The primary symptom of this is having the homepage attributed to most of the phrases we want to rank high for, but few of them make it to the top of SERPs.

The Solution: If your site has a blog, start out by linking 20-30 posts (don’t overdo it) to relevant, important landing pages. Write an additional 15-20 new posts per month delving deeper into the main subject dealt with in the landing page, and link from these new posts to the older posts, and vice versa. This internal linking system should highlight the main crossroads in your site, using both old, indexed pages, and newer pages.

Ensure your content is interesting and of high quality; otherwise, your internal links won’t cut it. Note that I haven’t specified which anchors are preferable, since that matters less as long as the page you’re linking from is relevant. Naturally, it’s advisable to use anchors that describe the target page. “Click here” is also a popular anchor and is even recommended every now and then, when appropriate.

Now, it’s time to locate the strongest pages on your website and link from those pages to your most important landing pages (if possible). In doing this, you’ll leverage existing strength to assist crucial assets. To accomplish this, we’ll use ScrapeBox: type the site’s URL after the site: operator and click “Start Harvesting”. See the example below:

Screenshot taken on 2.10.14

The outcome should present all the pages in the site. Now, we’ll check strength and popularity in two simple steps.

  1. Click “Check PageRank” and export the data using the right-hand menu (CSV file);
Screenshot taken on 2.10.14

True, PR isn’t the most reliable or accurate measure of strength, but it still serves as a good indicator. Naturally, testing PA (page authority) would be more accurate, but that requires setting up an account with Moz and registering for an API key.

Note: Those of you who’ve used Moz’s API in the past know it can sometimes be inaccurate, and sometimes returns no output at all. Pay attention and check every suspicious result manually. It’s also important to remember that these metrics don’t necessarily reflect the weight Google’s algorithm assigns to a page, but they’re sufficient as a general indication.

  2. Check popularity on social networks by clicking “ScrapeBox Social Checker”, export the data, and unify the results with the output from step one:
Screenshot taken on 2.10.14

Now, you can unify the two tables and sort the data any way you’d like to determine which pages are strongest, and then link from those pages to your important landing pages. This is what the template I use looks like:

Screenshot taken on 2.10.14
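
If you prefer to script the merge rather than combine the two exports by hand, below is a minimal Python sketch. It assumes both ScrapeBox exports are CSV files keyed by a URL column; the file names and column names ("URL", "PR", "Facebook", "Twitter") are placeholders, so adjust them to whatever your exports actually contain.

import csv

def to_int(value):
    # Exports sometimes contain "N/A" or blanks; treat anything non-numeric as 0.
    try:
        return int(value)
    except (TypeError, ValueError):
        return 0

def load(path, key="URL"):
    # Read a CSV export into a dict keyed by normalized URL.
    with open(path, newline="", encoding="utf-8") as f:
        return {row[key].strip().rstrip("/"): row for row in csv.DictReader(f)}

pr_rows = load("pagerank_export.csv")
social_rows = load("social_export.csv")

merged = []
for url, row in pr_rows.items():
    social = social_rows.get(url, {})
    merged.append({
        "URL": url,
        "PR": to_int(row.get("PR")),
        "Facebook": to_int(social.get("Facebook")),
        "Twitter": to_int(social.get("Twitter")),
    })

# Strongest pages first: sort by PR, then by combined social shares.
merged.sort(key=lambda r: (r["PR"], r["Facebook"] + r["Twitter"]), reverse=True)

with open("strongest_pages.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=["URL", "PR", "Facebook", "Twitter"])
    writer.writeheader()
    writer.writerows(merged)

The resulting strongest_pages.csv then plays the same role as the template above: a ranked list of pages worth linking from.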

If you have a WordPress site, it’s best to familiarize yourself with the following plugins that can help build and maintain the internal link structure:

No Sweat WP Internal Links Lite

SEO Smart Links

2. Confusing Root Folder

General Explanation: This is likely one of the most overlooked elements of your site. Metaphorically, it’s the dark attic no one thinks about during their regular routine. However, considering that Google crawls your website for only a few seconds, maximizing that crawl time can be significant.

What’s the Problem? Site builders, optimizers, and site owners tend to dump files into the root folder and its subfolders incessantly. As long as nothing visual breaks, they don’t consider it a problem, but every single file has an influence. The main problem is that ‘junk files’ like these dilute the relevant information:

  • Different file versions
  • Trial files
  • Unused DOC/PDF files
  • Backup directories
  • Images / video / music (media files should be placed in designated directories)
  • Temporary files

Though these files are no longer in use, they are considered whenever your site is crawled, which could be preventing you from maximizing the crawl resources allocated to you, or worse: diluting the important information found on the site.

The Solution: Time to clean the attic! Create a directory named “old-files” and place all unused files in it. Additionally, sort your media files into designated sub-directories, and don’t forget to update every address referenced in the code wherever it’s used. Don’t bother hunting down every single link that points to them; just perform the changes and then run a broken-link checker to find which links need updating. For this step, I use Xenu Link Sleuth. Finally, update the robots.txt file with the following directive:

Disallow: /old-files/
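
If the root folder is large, it can help to generate a candidate list before moving anything by hand. The following rough Python sketch walks the document root and flags likely leftovers; the root path and the suffix list are assumptions, so adapt them to your own naming conventions and review every hit before moving it to /old-files/.

import os

DOC_ROOT = "/var/www/html"   # hypothetical path to your site's document root
SUSPECT_SUFFIXES = (".bak", ".old", ".tmp", "~", ".zip", ".doc", ".docx", ".pdf")

for dirpath, dirnames, filenames in os.walk(DOC_ROOT):
    # Skip the quarantine directory itself once it exists.
    dirnames[:] = [d for d in dirnames if d != "old-files"]
    for name in filenames:
        lowered = name.lower()
        if lowered.endswith(SUSPECT_SUFFIXES) or "copy" in lowered or "backup" in lowered:
            print(os.path.join(dirpath, name))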

3. Duplicate Content Right Under Your Nose

General Description: Are penalties issued for duplicate content? Hardly, except in extreme cases. Generally speaking, duplicate content by itself is not sufficient grounds for a penalty, excluding cases where Google recognizes fraudulent intent behind said duplicate content, aimed at ‘tricking’ the search engine. Usually, duplicate content will lead to devaluation, and most of the time, we won’t even realize it.

Every search query matches thousands of indexed results; Google gives us access to the first 1,000, meaning 100 pages, of search results. Nearly no one is interested in what page 10+ has to offer, due to the simple fact that we’re accustomed to believing that whatever is on the first page of search results is more relevant and thus more reliable. It’s almost second nature to trust Google’s algorithm, and Google in turn does everything to avoid disappointing us. Duplicate or similar content poses quite a challenge in this respect, and therefore Google is determined to face it head on to avoid compromising user experience.

Once more, those first 1,000 results are essentially a long list sorted by relevance and authority. However, this list is also subject to filters intended to keep quality high. As you may have guessed, one of these filters is set to recognize duplicate and similar content.

What is the Problem? Like a ball and chain, duplicate or similar content weighs your site down and makes all work on the site harder. Theoretically, a duplicated page could eventually make its way to the first page of search results, provided other ranking factors leverage its strength over competitors despite the duplicate content. Getting there, however, will cost far more in time and money.

The Solution: I’ll touch upon each individual problem, and describe its solution thereafter.

  • You have: Written the main site’s URL in several ways. For example: with or without www prefix, with/without index.php suffix, etc. All of these versions lead to the same page – your homepage – and they may all be indexed by Google.

Screenshot taken on 2.10.14

Solution: First, input the following lines in the .htaccess file (replace “domain” with your site’s name; replace “php” with “html” if relevant):

RewriteEngine On

RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /index\.php

RewriteRule ^index\.php$ / [L,R=301]

RewriteCond %{HTTP_HOST} ^www\.domain\.com$ [NC]

RewriteRule ^(.*)$ http://domain.com/$1 [L,R=301]

Then, set your preferred domain (with or without the www prefix) in Google Webmaster Tools.

Screenshot taken on 2.10.14

  • You have: A secure version and a regular version: http/https

Solution: When shifting from the regular site display (http) to the secure display (https) input the following code into your htaccess file (replace “domain” with your site’s name):

RewriteEngine on

RewriteCond %{SERVER_PORT} !^443$

RewriteRule ^(.*)$ https://www.domain.com/$1 [NC,R=301,L]

  •  You have: Duplicate titles and page descriptions. Google serves you this data on a silver platter, and therefore expects you to do something about it.

Solution: Access Google Webmaster Tools, get rid of the duplicates, and while you’re at it re-examine the rest of the recommendations that appear.

Screenshot taken on 2.10.14
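
If you’d rather not wait for Webmaster Tools to refresh, you can also spot duplicate titles yourself by fetching a list of your own pages and grouping them by their <title> tag. Below is a minimal Python sketch using only the standard library; the URL list is a placeholder, so feed it from your sitemap or a crawl export.

import re
import urllib.request
from collections import defaultdict

# Placeholder list -- replace with the pages you actually want to check.
urls = [
    "http://www.example.com/",
    "http://www.example.com/about/",
    "http://www.example.com/contact/",
]

titles = defaultdict(list)
for url in urls:
    with urllib.request.urlopen(url, timeout=10) as resp:
        html = resp.read().decode("utf-8", errors="ignore")
    match = re.search(r"<title>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
    title = match.group(1).strip() if match else "(no title)"
    titles[title].append(url)

# Any title shared by more than one URL is a duplicate worth fixing.
for title, pages in titles.items():
    if len(pages) > 1:
        print("Duplicate title:", repr(title))
        for page in pages:
            print("   ", page)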

  • You have: Internal pages with parameters. Many sites use different URL versions for tracking and analysis. Additionally, there are various reasons why pages might receive different URLs while all leading to the same content. In these cases, Google meets us halfway and offers an easy-to-use tool in Webmaster Tools.

Screenshot taken on 2.10.14

Define parameters which Google will use to determine, ahead of time, whether duplicate content is present, and treat said addresses accordingly. You can find more on this subject in this post by Google. It’s best to define the parameters in advance, since once they’re indexed it’ll take some time before Google gets rid of them completely.

  •  Search for duplicate content using CopyScape and handle accordingly.
  • Reduce similar content by unifying similar pages. When possible, opt to create longer, thorough pages which cover an entire topic rather than separate pages for sub-topics. If you’ve decided to do this using existing content pages, don’t forget to set a permanent redirect from the cancelled page to the updated page which should now include more comprehensive information on the subject.
  • Use rel=”canonical” when you need to define an internal page as identical to another page. See example code below:

<link href="http://www.example.com/canonical-version-of-page/" rel="canonical" />

Read more about this subject here.

4. Site Loading Speed

General Description: In the eyes of Google, providing a great user experience is synonymous with increasing accessibility to information – helping users find precisely what they want as quickly as possible. The less time you spend in each search session, the better. When it comes to your site, the opposite is true: you need to retain your visitors for as long as possible (excluding key phrases having to do with addresses, telephone numbers and other fast-info). Site loading speed is one of the most important user experience metrics. If your site loading speed is average or slower, you’re probably paying the price for it with higher bounce rates. Google notices this, tracks you, and ‘penalizes’ you in rankings.

What’s the Problem? There are different ways to measure user experience, one of which is site loading speed on PC and mobile devices. A slow-loading site will be subject to penalty. As early as 2011, Matt Cutts stated that site loading speed is one of the factors taken into account in rankings, and as time goes by we’re inclined to believe the importance of this factor has increased. Three years after that statement, I can say with confidence that improving site loading speed nearly always improves rankings.

The Solution: First and foremost, assess your site’s current state. I always prefer to use the tools provided by Google, since they reflect the actual measures Google examines. The following example uses two tools in order to get a broad assessment of the problem and resolve it.

  •  PageSpeed Insights scores sites according to loading speed, from 0 (slowest) to 100 (fastest). Google scores itself at 99, and I recommend aiming for a score of 70 and above. Every score range is assigned a different color; try to reach the yellow range at the very least.
Screenshot taken on 2.10.14

After conducting the test, you’ll also receive recommendations for changes you can implement on your site to improve your score.

  •  GTmetrix will give you a more comprehensive overview. The reason I use another tool in addition to the one provided by Google is that PageSpeed Insights doesn’t list all the factors that influence site speed, and I like to be thorough. Focus on the YSlow tab, and run an analysis on the Timeline tab:
Screenshot taken on 2.10.14

For advanced, real-time analysis, install the following plugin in your browser: http://yslow.org

For a quick fix that deals less with the code, start with the next four recommendations and then regroup according to the results yielded by the two tools recommended above.

  •  Optimize all the images on your site (access from your main image directory). The simplest tool you can use to do this is Yahoo! Smush.it.
  • Enable compression using GZIP. Just input the following code into your htaccess file:

AddOutputFilterByType DEFLATE text/plain

AddOutputFilterByType DEFLATE text/html

AddOutputFilterByType DEFLATE text/xml

AddOutputFilterByType DEFLATE text/css

AddOutputFilterByType DEFLATE application/xml

AddOutputFilterByType DEFLATE application/xhtml+xml

AddOutputFilterByType DEFLATE application/rss+xml

AddOutputFilterByType DEFLATE application/javascript

AddOutputFilterByType DEFLATE application/x-javascript

  • Assign expiration dates to files by inputting the following code into your htaccess file:

<IfModule mod_expires.c>

ExpiresActive On

ExpiresDefault "access plus 1 month"

ExpiresByType image/x-icon "access plus 1 year"

ExpiresByType image/gif "access plus 1 month"

ExpiresByType image/png "access plus 1 month"

ExpiresByType image/jpg "access plus 1 month"

ExpiresByType image/jpeg "access plus 1 month"

ExpiresByType text/css "access plus 1 month"

ExpiresByType application/javascript "access plus 1 year"

</IfModule>

  • Use a CDN (content delivery network). This is a service typically offered by your hosting provider.
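
Once the compression and expiration rules are in place, it’s worth confirming the server actually sends the expected headers. Here is a minimal Python sketch that requests a page with gzip accepted and prints the relevant response headers; the URL is a placeholder.

import urllib.request

url = "http://www.example.com/"   # placeholder -- use one of your own pages
req = urllib.request.Request(url, headers={"Accept-Encoding": "gzip"})

with urllib.request.urlopen(req, timeout=10) as resp:
    headers = resp.headers
    print("Content-Encoding:", headers.get("Content-Encoding"))  # expect "gzip" once mod_deflate is active
    print("Cache-Control:", headers.get("Cache-Control"))        # expect a max-age set by mod_expires
    print("Expires:", headers.get("Expires"))                    # expect a date in the future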

5. Full Transparency and Disclosure of Malicious Features

General Explanation: Blocking certain files and code elements may force Google into assumptions that hurt us. Google can’t know everything about your site, nor can it examine everything manually. Therefore, it’s important to be cautious and not produce artificial warning signs.

What’s the Problem? The presence of blocked files makes Google err on the side of caution and treat the site as potentially malicious, even when nothing manipulative is actually there.

Solution: Go over the checklist below and make sure you’re not giving off warning signs unintentionally.

  • Don’t block your JS and CSS files in robots.txt. Google pays great attention to this. It’s unclear whether this is a precaution because spammers use these files for various content manipulations, or because crawling robots see a broken/missing page when the code is unavailable to them:
    Screenshot taken on 2.10.14

    Be careful not to block template directories either (blocking them usually blocks JS and CSS files too). This is a common mistake; many think it helps Google navigate to content files instead of code files, but it’s simply wrong. WordPress site owners can and should block their Plugins directory, though. A quick way to verify this is sketched below.
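
    The sketch uses the Python standard library’s robotparser to confirm that Googlebot may fetch a few of your CSS/JS paths; the domain and asset paths below are placeholders, so swap in real files from your own theme.

import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.set_url("http://www.example.com/robots.txt")   # placeholder domain
rp.read()

# Placeholder asset paths -- use real CSS/JS files from your own templates.
# (/wp-content/plugins/ may legitimately show as BLOCKED, per the advice above.)
for path in ("/wp-content/themes/mytheme/style.css",
             "/wp-content/themes/mytheme/script.js",
             "/wp-content/plugins/"):
    allowed = rp.can_fetch("Googlebot", "http://www.example.com" + path)
    print("ALLOWED" if allowed else "BLOCKED", path)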

  • Though Google can only decipher (‘read’) text, the presence of images is just as important. Don’t spare Google from crawling your image directory. Blocking this directory will leave all your image files unindexed, which in turn decreases the amount of content attributed to your site. There’s no direct evidence that this influences rankings, but leaving the directory open should bring additional incoming traffic through Google Image search. If there’s no justification for blocking media files, I’d strongly advise against it.
  • Avoid blocking your RSS Feed directory. This is an important source of information for Google (and for you, depending how you look at it), even more than your site map.
  • Once a month, I recommend checking the list of search queries in Google Webmaster Tools. There, you’ll be able to see the traffic your website gets from keywords that have nothing to do with your site. Ensure that your site is getting traffic only from search terms having to do with your field of activity. Words which should raise a red flag are those related to adult-content, medical supplies, and loans (assuming your site doesn’t deal with these subjects). In checking this you’ll be able to determine with greater confidence whether or not your site has been hacked and inserted with pages preying on its strength for black-hat SEO purposes.
illustration made on 2.10.14

  •  Check Google Webmaster Tools under the tab Security Issues.
  • Use a scanning tool at least once per month. Sign up for these paid services if you want to check daily:

Sucuri SiteCheck


  •  Check your code! Search for suspicious, automatically generated code using the above tools. I recommend running the following manual searches:

<script>function followed by a long number

text-indent: followed by a number greater than 300 or smaller than -300

hxxp:// followed by a URL





A cross-site search can be conducted using Dreamweaver or similar software. Before you start, remember to back up your entire site to a local directory, something you should ideally do every few weeks anyway (in addition to the backup services offered by your web hosting provider). NOTE: treat every suspicious result seriously, but don’t jump to the conclusion that you’re dealing with malicious code.
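
If you don’t have Dreamweaver handy, a simple script over that local backup can run the same searches. Below is a rough Python sketch; the backup path is a placeholder and the patterns are loose heuristics, so treat every hit as something to review manually rather than as proof of malicious code.

import os
import re

SITE_BACKUP = "/path/to/local-backup"   # placeholder -- your local copy of the site
PATTERNS = [
    re.compile(r"<script>function\s*\w*\s*\(?\s*\d{6,}"),   # script functions followed by long numbers
    re.compile(r"text-indent:\s*-?\d{3,}"),                 # rough: flags any 3+ digit indent; check against the +/-300 threshold above
    re.compile(r"hxxp://\S+"),                              # obfuscated URLs
]

for dirpath, _, filenames in os.walk(SITE_BACKUP):
    for name in filenames:
        if not name.lower().endswith((".php", ".html", ".htm", ".js", ".css")):
            continue
        path = os.path.join(dirpath, name)
        with open(path, encoding="utf-8", errors="ignore") as f:
            text = f.read()
        for pattern in PATTERNS:
            if pattern.search(text):
                print(path, "matches", pattern.pattern)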

Check your links and where they lead. Use the following tool on all the most important pages of your website: Unmask Parasites

6. Broken Links and Lost URL Recycling

General Explanation: Ongoing maintenance includes running periodical scans, in order to ensure optimal user experience. Google understands internet dynamics, and is well aware of the incompleteness that characterizes almost every site out there. Therefore, there’s no danger of penalty due to a few broken links, but you should definitely do something about them if they’re so common they send a message of neglect. There’s no exact number, but if you’re dealing with hundreds of broken links appearing in Google Webmaster Tools, or if you’re seeing a constant rise in crawling problems instead of a decrease, it’s cause for concern. Again, I recommend using the data provided by Google to try to restore the natural balance.

What’s the Problem: Broken links damage user experience and, in extreme cases, may indicate site neglect. When links break, we lose accumulated strength, and as a result the site’s overall strength is damaged. Typically, when we see strength compromised, it’s due to lost addresses.

 Solution: There are two stages to solving this problem.

  1. Run a Xenu Link Sleuth test and fix all the broken links you find.
  2. Handle the lost addresses by finding the broken addresses list in Google Webmaster Tools and exporting the data into a CSV file.
Screenshot taken on 2.10.14

Save only the URL column and delete all the rest. It’s important to know that despite being broken, some of these addresses may still hold value that you should strive to conserve. If you’ve never completed this process before, the number of URLs in the file may run into the hundreds or even thousands. It can seem daunting and frustrating, but if you maintain this periodically, it shouldn’t take more than a few minutes on an ongoing basis.

Now, find the valuable URLs in the list. Delete the URLs that were fixed in the first phase. Then highlight all the URLs you know from experience are valuable (perhaps popular old posts/pages that you decided to delete for some reason) and check PR and social strength as shown earlier in technique #1. Note all results with a PR of 1 or higher, and those with high social signals (set the threshold yourself based on the data).

Now, find a replacement for each broken link, and write a redirect rule for each one according to the following template:

Redirect 301 /old-url.php http://www.website-name.com/new-url.php

Now you have a list of all the valuable lost addresses, each with a permanent redirect to its new URL. Copy this list and insert it into your .htaccess file.
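
If the list is long, you can generate those lines instead of typing them. Here is a minimal Python sketch that reads a two-column CSV of old and new URLs and prints ready-to-paste Redirect 301 rules; the file name and column names ("old_url", "new_url") are assumptions to match to your own spreadsheet.

import csv
from urllib.parse import urlparse

with open("lost-urls.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):               # expects columns "old_url" and "new_url"
        # .htaccess Redirect rules take the old path, not the full old URL.
        old_path = urlparse(row["old_url"]).path or "/"
        print("Redirect 301 {} {}".format(old_path, row["new_url"]))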

7. Correcting Mistakes and Removing Superfluous Code

General Explanation: Errors in the code, as well as superfluous code, can delay crawling and signal an unmaintained site. Like broken links, errors in the code are commonplace and natural, up to a certain degree. In most cases, it isn’t the number of mistakes that reflects the level of maintenance, but the types of mistakes and the depth of their effects. It’s entirely possible to have 100 mistakes in your code that require no particular attention, while one critical error can have devastating consequences.

What’s the Problem? Any site that’s been around for a few years typically suffers from inflated code filled with superfluous lines and errors. These can cause instability and inefficiency, and signal to Google that the site isn’t regularly maintained.

Solution: Again, a two-phase solution is necessary.

  1. Fix the errors in the code using either http://validator.w3.org/ or http://jigsaw.w3.org/css-validator
  2. Remove superfluous code (back-up the code before you begin):
  • Remove unused tracking codes, such as duplicate Analytics/WMT snippets
  • Remove code that is commented out and no longer used
  • Remove any and all code you’re positive isn’t in use
  • Remove your meta keywords tag
  • Remove unused comments (but keep comments that help make sense of changes in the code)
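
As a small aid for the checklist above, here is a rough Python sketch that flags template files still carrying a meta keywords tag or unusually large commented-out blocks; the directory path and the size threshold are arbitrary placeholders, and every hit deserves a manual look before anything is deleted.

import os
import re

TEMPLATE_DIR = "/path/to/site-templates"   # placeholder -- a local copy of your templates
META_KEYWORDS = re.compile(r'<meta\s+name=["\']keywords["\']', re.IGNORECASE)
HTML_COMMENT = re.compile(r"<!--.*?-->", re.DOTALL)

for dirpath, _, filenames in os.walk(TEMPLATE_DIR):
    for name in filenames:
        if not name.lower().endswith((".php", ".html", ".htm")):
            continue
        path = os.path.join(dirpath, name)
        with open(path, encoding="utf-8", errors="ignore") as f:
            text = f.read()
        if META_KEYWORDS.search(text):
            print(path, "- still contains a meta keywords tag")
        for comment in HTML_COMMENT.findall(text):
            if len(comment) > 500:   # large comment blocks are often dead code worth reviewing
                print(path, "-", len(comment), "character comment block; review before deleting")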

8. Latent Semantic Index and Necessary Keywords

General Explanation: We can’t avoid discussing keywords, but this time I’d like to approach the subject from a different angle: placing keywords in key locations and using LSI (latent semantic indexing) are both important measures for ensuring you get full credit for your content.

LSI: SEO professionals, content writers, and web designers often apply LSI when writing, in order to enrich their content with words and phrases related to the subject matter, such as synonyms and related terms. For instance:

Synonyms for “economical” = inexpensive, cheap, cost-effective

Related to the subject “Dog” = breed, pound, names, adoption, the bounty hunter

Most of us have no background in psychology, but Google invests a great deal of resources to decipher user behavior in order to yield better search results. By understanding the way users naturally scan and decipher the page, Google can further improve its algorithm. By being aware of this process, we can optimize our content to appeal to this natural user behavior, and to Google’s algorithm.

What’s the Problem? Content-poor landing pages, or pages that have incorrectly structured content, typically rank low.

Solution: Our aim is to naturally enrich the content offered, and benefit the user. To accomplish this, we’ll need to address the content on the page itself. After you’ve chosen a page to focus on, write down the keywords you’d like to target:

Screenshot taken on 2.10.14

For every keyword or phrase you choose, write down synonyms and related keywords. Synonyms are available in any thesaurus; related keywords can be found via Google. Type the word, hit the space key and then consider Google’s suggested search queries.

Google’s autocomplete suggestions are a great tool for this. If you still haven’t found what you’re looking for, click “Search” and scroll all the way down the first page, where you’ll find a section showing related searches.

Once the Excel file is ready, make modifications to the text and ensure that as many words from your list as possible appear naturally in the landing page. Be mindful not to overstep the boundary here, since unnatural content has been in Google’s sights in recent years. The objective is to improve the level and quality of coverage of this specific topic. Avoid keyword stuffing, which will almost always expose the site to the danger of a penalty.
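
To sanity-check the result, you can count how many of the terms from your list actually appear on the page. Below is a minimal Python sketch; the URL and term list are placeholders, and the crude tag stripping is only meant for a rough count, not a proper text extraction.

import re
import urllib.request

url = "http://www.example.com/economical-packages/"   # placeholder landing page
terms = ["economical", "inexpensive", "cheap", "cost-effective", "package"]

with urllib.request.urlopen(url, timeout=10) as resp:
    html = resp.read().decode("utf-8", errors="ignore")

text = re.sub(r"<[^>]+>", " ", html).lower()   # strip tags crudely for a rough word count

for term in terms:
    print(term, "->", text.count(term.lower()), "occurrence(s)")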

The Takeaway

The above tips should get you well on your way to optimizing your site’s on-page aspects, after taking care of basic on-page factors you should all be relatively familiar with. However, note that no site is ever ‘finished’ – maintenance should be ongoing and frequent. Through experience, we should strive to prioritize the on-page tasks and focus on those that yield the best results. Good luck, and get to work!

Got more expert tips? Sound off in the comments!


Image Credits

Featured Image: Creativa Images via Shutterstock
Image #1: alice-photo via Shutterstock

Ben Oren

Director of Marketing at Whiteweb LTD
Ben Oren specializes in web marketing and boosting online conversion for large corporations in highly competitive niches, mostly in the US and Europe.
  • Core


    Baller stuff, the highest quality “hands-on” stuff I have seen on SEJ in a long while. TY!

    • Ben Oren

      Thank you very much for your kind words!

  • Rohit

    Hi Ben,

    Thanks a trillion for this article. I am a digital marketer, and everybody over here thinks that they know everything (on-page, website audit, etc.) – but then nobody remembers to DO IT! You always skip a couple of things and BANG – GA starts booing you!

    Your article is a perfect checklist for me! 🙂

    • Ben Oren

      Hi Rohit,

      I’m glad you found it useful.

  • Nick Stamoulis

    “Your off-page SEO would be equivalent to reputation management and PR efforts, while on-page is more about what your store contains: the shelves, the cash register, etc. Each element varies in importance, but all are crucial to the store’s success.” I really like this analogy. There’s so much more to on site SEO than adding keywords to content. If the structure and back end coding on a site is a mess that will affect any SEO efforts.

    • Ben Oren

      very true!

      Thank you for reading.

  • Paul Barrs

    Great checklist; you’ve just given me more work to do! There’s nothing more important than the user experience, certainly. But if we can tweak the small things to get more traffic to help more users enjoy that experience – there’s the apples right there!

    Well explained and well written 😉

    • Ben Oren

      Thank you Paul.

  • Raul

    Very interesting post!

    I think another important thing is to add rich snippets: Bread crumbs, prices (if you’re working on an online shop), and author snippets.

    I implemented rich snippets on my webpage last month, and it eventually ranked on SERPs!

    (I’m sorry if my English is a bit bad. I’m Spanish.)

    • Ben Oren

      Thank you for your feedback Raul!

      I agree, rich snippets are important too. There are a lot of things that can be done at the site level.


  • Jennifer

    Great tips regarding on-page optimization. I have one thing that confuses me, though: How do I implement latent semantic indexing?

    • Ben Oren

      Hi Jennifer,

      It’s easy: after you build your new keyword list with all the variations, you start replacing a few keywords in the content with words from the list.

      For example: if you have a page talking about economical solutions, your main target keyword is “economical package”, and it appears on your page 4 times, you can replace 2 of those instances with “inexpensive package” and “cost-effective package”. That way you help Google understand the page and rank you for a wider set of key phrases relevant to your industry.

      I hope it helps you!


      • Jennifer

        Thanks Ben, for your response. I will follow the steps you mentioned.

  • Scoot kincher

    These are great tips to rank a blog; I think they also help make your blog stronger. Thanks for sharing.

    • Ben Oren

      Thank you Scoot.

      I’m glad you found it useful!


  • Ankit Gupta

    Really great tools you’ve talked about for each problem … thanks for it. We know many of these issues but didn’t know about the tools you mentioned above or the way to use them. Thanks Ben.

    • Ben Oren

      Thanks for the feedback Ankit!

      If you follow my guide you will find out that these tools are very easy to use.

      This is a great opportunity to learn new tools, right?


  • Bhanu Chawla

    Darn! This has to be one of the best posts I’ve read on SEJ in a while.

    Good one, Ben!

    • Ben Oren

      Thank you Bhanu, I’m glad you like it!


  • Zain

    Great stuff! For once someone has gone beyond the amateur keyword research and title tag basics.

    • Ben Oren

      Thank you Zain.

  • Bhim Rai

    Hi Ben,

    I couldn’t agree more. Very well written post. Spot on with everything.

    Keep it coming.


    Bhim Rai

    • Ben Oren

      Thank you! You can check my previous articles as well 🙂

  • Chris Pereira

    Excellent read. You’ve put together some great information and resources all in one page. For site speed, image expiry and minifying/combining files also help quicken load times. I can certainly think of a few others that would greatly benefit SEO (alt tags and images), but I’m convinced there will be a part deux to this article at some point, n’est-ce pas? 😉

    • Ben Oren

      Maybe….. 🙂

  • Soumya Roy

    Amazing post, and important things you pointed out for on-page optimization. Good internal linking between pages is really helpful, and keeping LSI keywords in the content and other strategic positions on a page pays off a lot. Site speed is another big factor. The only thing I missed is mobile-friendly site structure; though it’s not fully part of on-page SEO, I believe it could have been added.

    • Ben Oren

      Hi Soumya,

      Thank you for your detailed comment!
      You’re right, there are a few more things that can be done, but I felt it was already becoming a bit overwhelming. Good feedback though.

  • Diwakar

    Nice post Ben thanks…!

    I think at point 7. we are dependent on the web designer and the developer for fixing the errors. Most of the time I have faced this problem.

    • Ben Oren

      You are right, this is a problem that we all have, but we still need to deal with it 🙂

      • Danny

        This is a problem at our place; designers don’t do anything to help. Lazy gits won’t even rename images for us. 🙂

      • Kelsey Jones

        Ha, I think that’s definitely a designer thing. I just have learned all the renaming file shortcuts on my mac to make it easy. 🙂

  • Lucas Rose

    Solid article.

    • Ben Oren

      Thank you Lucas.

  • Angel

    Hi Ben,

    Cool article! I took the advice and made some corrections in the robots.txt file.
    Hope to have a positive result!


    • Ben Oren

      Cool Angel!

      Let me know regarding any improvements.


  • Danny

    I do a lot of basic websites for small companies like plumbers and so on. I found the information/tools/tips very useful and will be sure to take some of these techniques into consideration when I next do work on one of my clients’ websites, so thank you.

    • Ben Oren

      Thank you Danny,

      Don’t forget to take care of the basic on-page elements as well.


      • Danny

        Hello again Bed, I was wondering what methods you use for a specific subject so I will explain.

        Because I am always trying to improve on my SEO, I always bookmark pages such as this 😛

        How do you keep track of helpful posts and tools you find without bloating your bookmarks?

        This is a genuine question because my bookmarks are a mess and I honestly need a better way of keeping track of all the helpful stuff I find in a way it will be easier to find them again.

        Surely we all have this problem right?

      • Danny

        Sorry I meant Ben 😀

  • Leon

    Great piece, Ben!

    Been dealing with IA in the past few years, and can definitely ‘approve’ that LSI these days, and information architecture in general, is a huge contributor to the general on-page SEO, and I found it to be amongst the strongest signals.
    Kudos, and looking forward to more technical stuff to come! 🙂

    • Ben Oren

      Thank you Leon,

      I’m happy to hear that you also found it useful, a lot of people underestimate the power of LSI.

      Good luck!

  • Vimlesh Maurya

    Superb tips SEJ, I just loved it and recommended everyone to read it. Thanks for sharing.

    • Ben Oren

      Vimlesh, Thank you for sharing the article.


  • King Rosales

    Hey Ben, thanks for this. I agree, I haven’t seen any penalties with duplicate content for my ecommerce sites. I took action on products with slight variations and combined them into a single product page with a drop down, but it hasn’t boosted my traffic. If anything, it makes it easier to find products which leads to increased conversion rates so its been good that way. Speed could always be faster, but its a pain to test servers especially because you have to test a full site. Thank goodness for free trials but hosting isn’t that expensive in the first place, its just time which is not as easy to come by. My biggest take away from your post was the internal linking, which I gotta do a lot more of. Besides increasing on site time, its great for the site in SEO terms.

    • Ben Oren

      Thank you for your detailed comment.

      It is good that you took care of things and gained more conversions. This is the job: optimize, analyze, and optimize again and again! It’s not easy to maintain an ecommerce site, but it’s not easy to take care of an offline business either 🙂

      Nothing worth having comes easy….

  • Bhavin M

    Wow, Really awesome post.

    I’ve learnt a lot of things from this post. I really like the step by step guide with complete screen shots so we can understand well.

    I think “Implementing Latent Semantic Index” tip is one of the agile techniques that Google can use to know about our web page content easily and it might also help get ranked well on the search engine result page.

    Only Thanks isn’t enough for this post… you’re a genius 🙂

    • Ben Oren

      Thank you Bhavin,

      I’m glad you found it useful.


  • Maximillian Heth

    Nice article, Ben! I just downloaded this as a PDF and stored in my “SEO Checklists” folder.

    • Ben Oren

      Cool! Good Idea 🙂

  • Mallory

    Thank you so much for this article! This is seriously one of the best articles about SEO I’ve read in a while. I love that you included the actual code and changes that a webmaster should make – often you find posts like these that tell you WHAT to do, but not HOW to implement it. Thanks!

    • Ben Oren

      Thank you for your kind words Mallory.

  • Claudio Heilborn

    Great stuff, actually I’ll send it to our SEO on-page staff. I loved your technique #1 tactics, method and sweat. Nothing really new, but you take it to the limits! Great job. We will be following in your footsteps! hahaha

    • Ben Oren

      Thank you Claudio!

      I’m happy you found this useful and that you’re going to use it in your company.


  • Jill C

    Excellent stuff! Great information on some really important tasks that are sometimes not done – or not done well. Thanks for such a useful post!

    • Ben Oren

      Thank you Jill.

  • Lina

    Wow! {Jaw dropped}
    I think it’s been a WHILE since I read such a comprehensive post on SEO!
    Could also be titled “this is why I ask this kind of money for the job I am doing”!!!
    Thank you for sharing, bookmarked it is!

    • Ben Oren

      Thank you Lina for your great feedback 🙂

  • Darius Gaynor

    Great Read. Website loading speed is very important. I tested the duplicate content penalty with a viral stories site. Never got penalized.

    • Ben Oren

      Hi Darius,

      As I mentioned in the article, there is no penalty for duplicate content except in extreme cases. Generally speaking, duplicate content by itself is not sufficient grounds for a penalty. You might suffer from devaluation though.

  • David Albert

    Thanks for sharing such an informative article. These are great tips which really improve Page Rank and get more traffic.

    • Ben Oren

      Thank you David

  • Jeffrey Enabe

    Nice breakdown of sometimes complicated technical SEO. I really like your tips for running things through .htaccess files as well.

    • Ben Oren

      I’m glad you like it!


  • S.Rajesh

    Hi Ben,

    Thanks for the wonderful article. I am really gonna share this post on my company’s social pages.

    Thanks Again,

    • Ben Oren

      Thank you! much appreciated.


  • Frank Johnson

    Nice to see real meat in an article and not just the usual “fluff”. Nice job. SEO pros should already know these things, but there is so much garbage and misdirection out there that few people even know what real SEO is nowadays.
    SEO gets a bad name because it’s surrounded by bad information and crappy “clickbait” articles. Thank you so much for adding useful content to the swamp of misinformation that permeates cyberspace. Kudos to you sir.

    • Ben Oren

      Thank you very much for your feedback Frank.


  • John

    I usually don’t like doing this, but this typo was just too epic not to point out.

    On #4: The Solution: First and foremost, asses your site’s current state. (you missed an “s”… it should be “assess” not “asses”)

    I told my strategist to make sure to “Ass” the site’s current state…. needless to say, it was good for a laugh.

    • Kelsey Jones

      It’s so epic it’s as if we did it on purpose…but we didn’t, unfortunately! 🙂 Thanks for the heads up and we are fixing right this second!

  • Michael McCabe

    As a relatively inexperienced webmaster for my own website, I just want to say a big thank you, and especially for the inclusion of a host of analytical tools which I wasn’t even aware existed – tools which I will be using going forward.


  • Suhrud Potdar

    This one is a Power Pack for SEO! Talk about hands-on, toolsets makes it great! Thanks!

  • Linda Fiorentini

    Wow! This is the most comprehensive and easy-to-understand post about on-page SEO techniques I’ve come across. I read about five posts on the same topic a few hours ago, and this is the best one I’ve read so far.

    I particularly liked the “disclosure of malicious features.” Most websites tend to do the opposite. They try hiding malicious features from Google and miserably fail. But to disclose it? Well I haven’t heard it before and as I see it, you are actually right for saying so.

    Thank you so much for these tips. I’ll implement each one of them to my website STAT!

  • Catana Alexandru

    This is beyond the amateur keyword research and title tag basics. Nice. Thank you.

  • Vikas Singh Gusain

    Hi Ben,

    I completely agree with this blog, and thanks for sharing this kind of information. But can anyone let me know: if I do an off-page activity like social bookmarking with changing descriptions, is that spam?