7 Simple But Overlooked SEO Audit Tips

SEO audits. I love ’em and I hate them. There is nothing more gratifying than uncovering that one issue that’s keeping a client’s site from performing well. The downside is that it can be very time-consuming. Everyone has their own methods and checklist when doing an SEO audit.

Sometimes we get into a rhythm and never think beyond our own checklist. Expanding that checklist is the key to an effective audit, especially as new technologies and advancements in search become commonplace. Three years ago you had to check that Google Authorship was set up correctly; today, not so much. If you work with a client that is having issues with stories showing up in Google News, your checklist will be different from that of a standard SEO audit.

In some cases, something very simple can cause a very large problem. I’ve compiled a list of the seven most common issues I’ve helped businesses with after their original audit was inconclusive.

What’s Your Status?

Status codes are easy to overlook. Are your 404 pages actually returning a 404 status? Are your redirects 301s? Are there multiple chained redirects? You can use a Chrome plugin to easily see the status code and the path of redirects. I’m surprised by the number of sites I’ve come across with 404 pages that don’t return a 404 status.
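As an alternative to a browser plugin, a short script can walk a URL’s redirect chain and flag the usual problems. This is an illustrative sketch: `redirect_chain` assumes the third-party `requests` library, and the rules in `audit_chain` are just the checks described above.

```python
def redirect_chain(url):
    """Follow a URL and return [(url, status), ...] for every hop."""
    import requests  # third-party: pip install requests
    resp = requests.get(url, allow_redirects=True, timeout=10)
    hops = [(r.url, r.status_code) for r in resp.history]
    hops.append((resp.url, resp.status_code))
    return hops

def audit_chain(hops):
    """Flag common redirect problems in a [(url, status), ...] chain."""
    problems = []
    statuses = [status for _, status in hops]
    if 302 in statuses[:-1]:
        problems.append("302 (temporary) redirect where a 301 is expected")
    if len(hops) > 2:
        problems.append("chained redirects (%d hops)" % (len(hops) - 1))
    return problems
```

For the soft-404 check, request a URL you know doesn’t exist and confirm the final status really is 404, not 200.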

Duplicate Pages

Do you have two home pages? Serving the home page at two URLs is both a duplicate-content and a PageRank issue, since link equity gets split between the two. You’ll find this mostly on sites with a “home” navigation link that points to a separate URL. It’s still a common issue, so be sure to check for it.
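One way to test for this mechanically: crawl each common home-page variant, record where it finally resolves after redirects, and flag any variant that serves its own copy instead of redirecting to `/`. A sketch; the variant paths are illustrative, not exhaustive.

```python
# Common home-page variants worth checking (illustrative list).
HOME_VARIANTS = ["/", "/index.html", "/index.php", "/home", "/home/"]

def duplicate_homes(resolved):
    """resolved maps each variant path to the final path it redirects to.
    Any variant that does not end up at '/' is a duplicate home page."""
    return [path for path, final in resolved.items()
            if path != "/" and final != "/"]
```

The fix for any path this flags is usually a 301 redirect (or a canonical tag) pointing at `/`.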

Many SEO audits look for duplicate content across the web. However, are you checking for duplicate content within the site itself? Certain SEO crawlers look at duplicate title and meta description tags, but not the content itself. Yes, duplicate tags can be a sign of duplicate content, but not always.
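To check the content itself rather than just the tags, you can fingerprint each page’s extracted body text and group identical pages. A minimal sketch using whitespace- and case-normalized exact matching; real near-duplicate detection would use shingling or similar.

```python
import hashlib

def duplicate_content_groups(pages):
    """pages: {url: body_text}. Returns groups of URLs whose
    normalized body text is identical."""
    groups = {}
    for url, text in pages.items():
        # Normalize whitespace and case before hashing.
        normalized = " ".join(text.split()).lower()
        fingerprint = hashlib.md5(normalized.encode("utf-8")).hexdigest()
        groups.setdefault(fingerprint, []).append(url)
    return [urls for urls in groups.values() if len(urls) > 1]
```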

What’s Your Preferred Domain?

www vs. non-www: are you checking for those? Then I assume you are also checking http:// vs. https:// for each. I’ve come across a few of these issues, especially since Google’s announcement of the ranking “benefits” of going SSL. I even came across one site that had all four versions indexed, quadrupling its pages in the search results. If you run across this issue, be sure to use the Moz toolbar to determine which version of the site has the strongest link signals. Redirecting the version with greater PageRank to the version with lower PageRank could cause some temporary ranking drops.
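Checking all four versions is easy to script: generate the variants, fetch each one, and confirm every variant ends up at the same canonical URL. A sketch (example.com is a placeholder):

```python
def domain_variants(host):
    """Return the four protocol/www combinations for a hostname."""
    bare = host[4:] if host.startswith("www.") else host
    return ["http://%s/" % bare, "http://www.%s/" % bare,
            "https://%s/" % bare, "https://www.%s/" % bare]

def is_consolidated(final_urls):
    """True if every variant redirected to one canonical version."""
    return len(set(final_urls)) == 1
```

Feed each variant through whatever fetcher you use, collect the final post-redirect URLs, and `is_consolidated` tells you whether the site resolves to a single preferred domain.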

Conversion Pages in Search Results

Check the search results for “odd” pages. It’s not uncommon to find old legacy pages floating around. Conversion pages in search results are still common, especially on sites using WordPress. Check to ensure these pages are not indexed, so users can’t stumble across them and throw off your goal tracking, or, worse yet, access downloadable content for free.
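Keeping a thank-you or download page reachable for converting users while hiding it from search usually comes down to one tag in its head (a sketch; your CMS or SEO plugin may set this for you):

```html
<!-- Thank-you / download pages: usable by visitors, invisible to search -->
<meta name="robots" content="noindex, nofollow">
```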

Orphaned Pages

Keep an eye out for orphaned pages as well. These are pages that no longer have any internal links pointing to them, often because the site was redesigned and those pages were forgotten. I’ve seen case studies get created but never linked to from the site, which results in a lot of wasted effort. Sometimes these pages can only be found in the sitemap.
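If your crawler can export the set of URLs it discovered by following links, orphans fall out of a simple set difference against the sitemap (or any other list of pages you know exist):

```python
def orphaned_pages(known_urls, linked_urls):
    """URLs that exist (e.g. listed in the sitemap) but that no
    internal link points to."""
    return sorted(set(known_urls) - set(linked_urls))
```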

Check Your Sitemap

Are you finding pages in the search results that can’t be found on the site? Check the sitemap. Outdated sitemaps cause issues: if your sitemap contains redirects, 404 pages, or non-canonical URLs (only canonical URLs should be in the sitemap), Google will not index them. Check your sitemap report in Search Console to see how many pages Google is crawling from the sitemap versus how many it is indexing.
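Pulling the loc entries out of a sitemap and flagging entries that redirect or 404 takes only a few lines of standard-library Python. A sketch; `statuses` would come from fetching each URL with your crawler of choice.

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Extract the <loc> entries from a sitemap XML document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

def sitemap_problems(statuses):
    """statuses: {url: (status_code, final_url)}. Returns entries that
    are not clean 200s at their own URL (redirects, 404s)."""
    return {url: status for url, (status, final) in statuses.items()
            if status != 200 or final != url}
```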

Also, be sure to check that internal site search results are blocked from crawling. This is usually overlooked. Site search can generate endless URLs that you don’t want Google to index, and Google doesn’t want to index site search results either: they provide a poor user experience, and they can generate tons of 404 pages.
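Blocking internal search results is typically a robots.txt rule. The paths below are illustrative (WordPress uses `?s=` by default); note that the `*` wildcard is an extension honored by Google but not by every parser. You can sanity-check prefix rules with Python’s built-in parser:

```python
from urllib import robotparser

# Illustrative rules: block a /search/ path and WordPress-style ?s= queries.
RULES = """\
User-agent: *
Disallow: /search/
Disallow: /*?s=
"""

rp = robotparser.RobotFileParser()
rp.parse(RULES.splitlines())
print(rp.can_fetch("*", "https://example.com/search/widgets"))  # blocked
print(rp.can_fetch("*", "https://example.com/about/"))          # allowed
```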

Blocking Pages From Search

Robots.txt is the most common way to block pages from search. However, a blocked page or site can still appear in organic search: if Google feels the page is relevant to a user’s query, it may still show that page even though it’s blocked via the robots.txt file. The best way to remove an already-indexed page or site from the SERPs is to noindex it, using the meta robots noindex tag or the X-Robots-Tag header. (Keep in mind that Google must be able to crawl the page to see the noindex, so don’t also block it in robots.txt.)
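Auditing this at scale means checking both places a noindex can live: the X-Robots-Tag response header and the meta robots tag in the HTML. A regex-based sketch (a real crawler would use a proper HTML parser):

```python
import re

META_ROBOTS = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']*)["\']',
    re.IGNORECASE)

def has_noindex(html, headers=None):
    """True if the page opts out of indexing via header or meta tag."""
    headers = headers or {}
    if "noindex" in headers.get("X-Robots-Tag", "").lower():
        return True
    match = META_ROBOTS.search(html)
    return bool(match and "noindex" in match.group(1).lower())
```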

Verify that your developers have best practices in place. I would recommend that every development team at least have a checklist to work against.

What do you think is commonly overlooked in audits? Let me know in the comments!

Image Credits

Featured Image: nerucci/Shutterstock.com
In-post Photo: Image by Joe Balestrino

Joe Balestrino
Joe Balestrino is an Internet Marketing Consultant located in New York City and is available for consulting on all forms of search marketing and internet marketing. He is a 13-year search marketing veteran.