
50 Questions You Must Ask to Evaluate the Quality of Your Website

Do you really have a high-quality website that follows SEO best practices? Here is a detailed list of 50 questions you should ask yourself to create a better website.


How do you evaluate a website?

By asking the right questions.

No website is perfect.

Every website has flaws.

Many things can go wrong, whether due to technical SEO, on-page optimization, page speed, or something else.

Ready to find your website’s flaws so you can get them fixed and working properly?


Read on to learn 50 questions you should ask to evaluate the quality of your website.

1. Do the Webpages Contain Multiple H1 Tags?

There shouldn’t be multiple H1 tags on the page.

The H1 tag is the focus of the page’s topic, and thus it should only occur on the page once.

If you include more than one H1 tag on the page, it will cause the dilution of the focus of the page.
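To audit this at scale, a short script can count H1 tags per page. A minimal sketch using Python's standard html.parser; the class and function names are illustrative:

```python
from html.parser import HTMLParser

class H1Counter(HTMLParser):
    """Counts <h1> opening tags in an HTML document."""
    def __init__(self):
        super().__init__()
        self.h1_count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "h1":
            self.h1_count += 1

def count_h1(html: str) -> int:
    parser = H1Counter()
    parser.feed(html)
    return parser.h1_count

page = "<html><body><h1>Main topic</h1><h2>Subtopic</h2><h1>Second h1</h1></body></html>"
print(count_h1(page))  # 2 -> flag this page for review
```

Any page where the count is not exactly 1 goes on the fix list.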

2. Is the Website Easily Crawlable?

One major issue that can negatively impact a website is 4xx and 5xx error pages.

If your site cannot be crawled due to these issues, you can experience decreases in performance when search engines cannot effectively crawl the site.
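Once you have crawl results from any crawler that records status codes, bucketing the errors is straightforward. A hedged sketch, assuming the crawl output is a list of (URL, status) pairs:

```python
def find_crawl_errors(crawl_results):
    """Split crawled (url, status) pairs into 4xx and 5xx buckets."""
    client_errors = [(u, s) for u, s in crawl_results if 400 <= s < 500]
    server_errors = [(u, s) for u, s in crawl_results if 500 <= s < 600]
    return client_errors, server_errors

# Hypothetical crawl output for illustration.
crawl = [("https://example.com/", 200),
         ("https://example.com/old-page", 404),
         ("https://example.com/api", 503)]
c, s = find_crawl_errors(crawl)
print(c)  # [('https://example.com/old-page', 404)]
print(s)  # [('https://example.com/api', 503)]
```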

3. Are Error Pages Configured Properly?

Issues can occur when error pages are not properly configured.

Properly configured error pages help search engines see each page for exactly what it is.

One example: if a page returns a 200 OK status but displays a "Page Not Found" error message, search engines will treat it as a normal 200 OK page (a soft 404).

This can introduce conflicts in crawling and indexing.

The safe solution is to make sure that every page returns a status code that matches what it actually displays.

4xx pages should display 4xx errors with 4xx statuses.

5xx pages should display 5xx statuses.

Using any other configuration will add confusion and will not help when it comes to your specific issues.

Identifying issues in such a manner and correcting them will help increase the quality of your website.

4. Does Navigation Use JavaScript?

If your navigation relies on JavaScript for its implementation, you will interfere with cross-platform and cross-browser compatibility.

In 99.9 percent of cases, JavaScript does not need to be used for navigation.

Many of the same effects can be achieved through straight CSS 3 coding, which is what should be used for navigation instead.

If your website is using a responsive design, this should be part of it already.

Seriously, ditch the JavaScript.

5. Are URLs Resolving to a Single Case?

Again, this can be another major issue.

Like other canonicalization issues, URLs served in multiple cases can cause duplicate content problems because search engines see every case variant as a separate URL.

One trick to take care of multi-case URLs is adding a lowercase rewrite rule to the website's .htaccess file.

This forces all case variants to resolve to the single canonical URL you choose.
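One way to spot this problem in a URL list (for example, one exported from a crawler) is to group URLs that differ only by case. An illustrative sketch:

```python
from collections import defaultdict

def find_case_duplicates(urls):
    """Group URLs that differ only by letter case; return only the conflicting groups."""
    groups = defaultdict(list)
    for url in urls:
        groups[url.lower()].append(url)
    return {k: v for k, v in groups.items() if len(v) > 1}

# Hypothetical crawl export.
urls = ["https://example.com/About-Us",
        "https://example.com/about-us",
        "https://example.com/contact"]
dups = find_case_duplicates(urls)
print(dups)  # {'https://example.com/about-us': ['https://example.com/About-Us', 'https://example.com/about-us']}
```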

6. Does Your Site Use a Flat Architecture?

A flat architecture is simple to build, but it does not lend itself well to topical focus and organization.

With a flat architecture, all your pages are typically dumped in the root directory, with no focus on topics or related themes, and everything lumped together in one disorganized pile.

Using a siloed architecture helps you group webpages by topics and themes and organize them with a linking structure that further reinforces topical focus.

This, in turn, helps search engines better understand what you are trying to rank for.

For sites with narrower topical focus, a flat architecture may be a better way to go, but there should still be an opportunity to silo the navigation appropriately.

7. Is Thin Content Present on the Website?

Thin content by itself, so long as it is high quality enough that it answers a user’s query with information for that query, is not necessarily an issue.

Thin content becomes an issue when it is of no value at all. Thin content should not be measured by word count but in terms of quality, uniqueness, authority, relevance, and trust.

Is the content high quality?

Is it unique (written uniquely enough that it doesn't appear anywhere else, on Google or on your own site)?

Does the content authoritatively satisfy the user’s query?

And, is it relevant and does it engender trust when you visit the page?

8. Are You Planning on Reusing Existing Code or Creating a Site From Scratch?

Copying and pasting code is not as simple as you would expect.

Have you ever seen websites that seem to have errors in every line of code when checking it in the W3C validator?

The reason is usually that the developer copied and pasted code that was written for one DOCTYPE and used it for another.

If you copy and paste the code for XHTML 1.0 into an HTML 5 DOCTYPE you can expect many thousands of errors.

This is why, if you're transferring a site to WordPress, it is so important to consider the DOCTYPE being used. A mismatch can interfere with cross-browser and cross-platform compatibility.

9. Does the Site Have Schema.org Structured Data Where Applicable?

Schema is key to obtaining rich snippets in the SERPs on Google.

Even if your site is not in one of the industries with their own specific structured data types, there are still ways to add structured markup.

First, perform an entity audit to find what markup your site already has.

If it has none, you can fill that gap by adding Schema to elements like:

  • Navigation.
  • Logo.
  • Phone number.
  • Certain content elements that are common on every website.
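As a sketch of what such markup can look like, here is a minimal JSON-LD Organization snippet built in Python; all values shown are hypothetical placeholders, not real data:

```python
import json

# Hypothetical values -- replace with your own organization's details.
org_markup = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Example Co",
    "url": "https://example.com",
    "logo": "https://example.com/logo.png",
    "telephone": "+1-555-0100",
}

# The script tag is what gets embedded in the page <head>.
snippet = '<script type="application/ld+json">\n%s\n</script>' % json.dumps(org_markup, indent=2)
print(snippet)
```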

10. Does the Site Have An XML Sitemap?

A single item seldom translates to increasing the overall quality of a website, but this is one that does.

Having an XML sitemap makes your website much easier to crawl by the search engines.

Things like 4xx and 5xx errors in the sitemap, non-canonical URLs in the sitemap, blocked pages in the sitemap, sitemaps that are too large, and other issues should be looked at to gauge how the sitemap impacts the quality of a website.
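Auditing a sitemap usually starts with extracting its URLs so each one can be checked for status, canonicalization, and indexability. A minimal sketch using Python's standard xml.etree:

```python
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text):
    """Extract <loc> values from an XML sitemap."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall(".//sm:loc", NS)]

# Hypothetical sitemap content for illustration.
sitemap = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/blog/</loc></url>
</urlset>"""
print(sitemap_urls(sitemap))  # ['https://example.com/', 'https://example.com/blog/']
```

Each extracted URL can then be fed to whatever status-code and canonical checks you already run.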

11. Are Landing Pages Not Properly Optimized?

Having multiple landing pages optimized for the same keywords does not actually make your site more relevant.

In fact, it can cause confusion for the search engines and what is called keyword cannibalization.

For one of those pages to rank, the search engine has to decide which page to show, and it may not pick the one that is best for your purposes.

Having multiple landing pages for the same keyword can also dilute your link equity.

If other sites are interested in linking to your pages about a certain topic, what could really happen is that the link equity would be diluted across all those pages about that certain topic.

12. Is the Robots.txt File Free of Errors?

Robots.txt can be a major issue if the website owner has not configured it correctly.

One of the things I tend to run into in website audits is a robots.txt file that is not properly configured.

All too often I see sites that have indexation issues and they have the following code added:

Disallow: /

BAD webmaster. BAD BAD.

This blocks all crawlers from crawling the website from the root folder on down.
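You can test a robots.txt file for this exact mistake with Python's standard urllib.robotparser; the example URL is hypothetical:

```python
from urllib.robotparser import RobotFileParser

def blocks_everything(robots_txt: str) -> bool:
    """True if this robots.txt blocks all crawlers from the whole site."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    # If even an ordinary page is off-limits to all agents, the site is locked down.
    return not rp.can_fetch("*", "https://example.com/any-page")

print(blocks_everything("User-agent: *\nDisallow: /"))  # True  -> BAD webmaster
print(blocks_everything("User-agent: *\nDisallow:"))    # False -> crawling allowed
```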

13. Does the Website Use a Responsive Design?

Gone are the days of separate mobile websites (you know, the sites that use subdomains for the mobile site: “mobile.example.com” or “m.example.com”).

Thanks to responsive design technologies, this is no longer necessary.

Instead, the modern method is to utilize HTML 5 and CSS 3 Media Queries to create a responsive design.

This is even more important with the arrival of Google’s mobile-first index.

14. Are CSS & JavaScript Blocked in Robots.txt?

It is important to go over this one because robots.txt should not block CSS or JS resources at all.

Google sent out a mass warning in July 2015 about blocking CSS and JS resources.

In short, don’t block CSS and JS resources.

15. Are Excessive Dynamic URLs Used Throughout the Website?

Identifying the number of dynamic URLs and whether they present an issue can be a challenge.

The best way to do this: identify whether the number of dynamic URLs outweighs the static URLs on the site.

If they do, then you could have a problem with dynamic URLs impacting crawlability.

It makes it harder for the search engines to understand your site and its content.

16. Is the Site Plagued by Too Many Links?

Too many links can be a problem, but not in the way you would think.

Google no longer penalizes for more than 100 links on a page (John Mueller said so in 2014).

But if you have significantly more than that, it can still be read as a spam signal when combined with other spammy behavior.

17. Does the Site Have Daisy-Chained Redirects, and Do Any Chains Reach Five or More Hops?

While Google will follow up to five redirects, they can still present problems.

Redirects can present even more problems if they continue into excessive territory – beyond five redirects.

It is, therefore, a good idea to ensure that your site has two redirects or less, assuming making this move does not impact prior SEO efforts on the site.
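Given a map of redirects (for example, exported from your server config or a crawl), a short script can measure chain length before the redirects go live. An illustrative sketch:

```python
def redirect_chain(start, redirects, limit=10):
    """Follow a URL through a redirect map and return the full chain."""
    chain = [start]
    while chain[-1] in redirects and len(chain) <= limit:
        nxt = redirects[chain[-1]]
        if nxt in chain:  # redirect loop detected
            chain.append(nxt)
            break
        chain.append(nxt)
    return chain

# Hypothetical redirect map: /a -> /b -> /c -> /final (3 hops, one too many).
hops = {"/a": "/b", "/b": "/c", "/c": "/final"}
chain = redirect_chain("/a", hops)
print(chain)           # ['/a', '/b', '/c', '/final']
print(len(chain) - 1)  # 3 hops -- aim for 2 or fewer
```

The fix is to point every old URL directly at the final destination rather than at the next link in the chain.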

18. Are Links Served Via JavaScript?

Serving any navigation element with JavaScript is a bad idea because it limits cross-browser and cross-platform compatibility, which also interferes with the user experience.

When in doubt, do not serve links with JavaScript and use only plain HTML to serve links.

19. Is the Anchor Text in the Site’s Link Profile Overly Optimized?

If your site has a link profile with overly optimized and repetitive anchor text, it can eventually lead to possible action, whether algorithmic or manual in nature if it’s severe enough.

Ideally, your site should have a healthy mix of anchor text pointing to your site. A good balance follows the 20 percent rule: 20 percent branded anchors, 20 percent exact match, 20 percent topical match, and roughly 20 percent naked URLs.

But, the challenge is achieving this link profile balance while also not leaving identifying footprints that you are doing anything manipulative.

20. Is Any Canonicalization Implemented on the Website?

Canonicalization refers to making sure that Google sees the URL that you prefer them to see.

In short, using a snippet of code you can declare that Google sees one URL as the preferred source of content for that URL.

Failing to canonicalize causes many different issues at once, including:

  • The dilution of inbound link equity.
  • The self-cannibalization of the SERPs (where multiple versions of that URL are competing for results).
  • Inefficient crawling when search engines spend even more time crawling the exact same content every time.

The fix for canonicalization issues is to serve all public-facing content from one URL per page on the site.

The preferred solution is to use 301 redirects to redirect all non-canonical versions of URLs to the canonical version.

Decide early on in the web development stage which URL structure and format you wish to use and use this as your canonical URL version.
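When auditing, it also helps to extract each page's declared canonical URL and compare it against the URL you crawled. A minimal sketch using Python's standard html.parser:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Records the href of the first <link rel="canonical"> tag."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        d = dict(attrs)
        if tag == "link" and d.get("rel") == "canonical" and self.canonical is None:
            self.canonical = d.get("href")

def find_canonical(html):
    p = CanonicalFinder()
    p.feed(html)
    return p.canonical

# Hypothetical page head for illustration.
page = '<head><link rel="canonical" href="https://example.com/page/"></head>'
print(find_canonical(page))  # https://example.com/page/
```

Pages where the declared canonical differs from the crawled URL (or is missing entirely) deserve a closer look.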

21. Are the Images on the Site Too Large?

If images on your site are too large, you risk running into issues with load time, which is being implemented as a mobile ranking signal starting in July.

If you have a 2MB image loading on your page, this is a major problem.

Images that large are rarely necessary, and leaving them in place wastes an easy optimization opportunity.
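A quick way to catch oversized images before they ship is to scan a directory against a size budget. A sketch; the 200 KB budget is an arbitrary example, not a Google threshold:

```python
from pathlib import Path

BUDGET_BYTES = 200 * 1024  # hypothetical 200 KB per-image budget

def oversized_images(folder):
    """List image files in `folder` that exceed the size budget."""
    exts = {".jpg", ".jpeg", ".png", ".gif", ".webp"}
    return [p.name for p in Path(folder).iterdir()
            if p.suffix.lower() in exts and p.stat().st_size > BUDGET_BYTES]
```

Run it against your site's image directory and anything it returns is a candidate for compression or resizing.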

22. Are Videos on the Site Missing Schema Markup?

It is possible to add Schema.org Structured Data to videos as well.

Using the VideoObject type, you can mark up all your videos with Schema.

23. Does the Site Have All Required Page Titles?

Missing SEO titles on a website can be a problem.

If your site is missing an SEO title, Google could automatically generate one based on your content.

You never want Google to auto-generate titles and descriptions.

You don’t want to leave anything to chance when it comes to optimizing a site properly, so all page titles should be manually written.

24. Does the Site Have All Required Meta Descriptions?

If your site does not have meta descriptions, Google could automatically generate one based on your content, and it is not always the one that you want to have added to the search results.

To err on the side of caution, and to make sure that you don’t have issues with this, always make sure that you write a custom meta description for each page.
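Both checks, titles and meta descriptions, are easy to script across a crawl. A minimal sketch using Python's standard html.parser:

```python
from html.parser import HTMLParser

class HeadAudit(HTMLParser):
    """Collects the page title and meta description, if present."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = None
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        d = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and d.get("name") == "description":
            self.description = d.get("content")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def audit_head(html):
    """Return a list of problems found in the page <head>."""
    p = HeadAudit()
    p.feed(html)
    problems = []
    if not p.title.strip():
        problems.append("missing title")
    if not (p.description or "").strip():
        problems.append("missing meta description")
    return problems

print(audit_head("<head><title>My Page</title></head>"))  # ['missing meta description']
```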

Meta keywords should also be considered. Google and Bing have stated that these are not used in search ranking, but other search engines still use them. It is a mistake to have such a narrow focus that you are only optimizing for Google and Bing.

In addition, there is the concept of linear distribution of keywords, which helps with points of relevance.

While stuffing meta keywords may not necessarily help rankings, carefully and strategically adding meta keywords can add points of relevance to the document.

The only time this can hurt is if you are spamming, and Google decides to use it as a spam signal to nail your website.

25. Is the Page Speed of Top Landing Pages More Than 2-3 Seconds?

It is important to test and find out the actual page speed of your top landing pages.

This can make or break your website’s performance.

If it takes your site 15 seconds to load, that’s bad.

Always make sure your site takes less than a second to load.

While Google’s recommendation says 2-3 seconds, the name of the game is being better than their recommendations and better than your competition.

26. Does the Website Leverage Browser Caching?

It is important to leverage browser caching because this is a component of faster site speed.

To leverage browser caching on an Apache server, you can add the following code to your .htaccess file. Please be sure to read the mod_expires documentation on how to use it.

PLEASE NOTE: Use this code at your own risk. The author does not accept liability for this code not working for your website.

<IfModule mod_expires.c>
ExpiresActive On
ExpiresByType image/jpg "access 1 year"
ExpiresByType image/jpeg "access 1 year"
ExpiresByType image/gif "access 1 year"
ExpiresByType image/png "access 1 year"
ExpiresByType text/css "access 1 month"
ExpiresByType text/html "access 1 month"
ExpiresByType application/pdf "access 1 month"
ExpiresByType text/x-javascript "access 1 month"
ExpiresByType application/x-shockwave-flash "access 1 month"
ExpiresByType image/x-icon "access 1 year"
ExpiresDefault "access 1 month"
</IfModule>

27. Does the Website Leverage the Use of a Content Delivery Network?

Using a content delivery network can make site speed faster because it decreases the distance between servers and customers – thereby decreasing the time it takes to load the site to people in those locations.

Depending on the size of your site, use of a content delivery network can help increase performance significantly.

28. Has Content on the Site Been Optimized for Targeted Keyword Phrases?

It is usually easy to identify whether a site has been properly optimized for targeted keyword phrases.

Poorly optimized keywords almost always stick out like a sore thumb.

You know how it is: spammy text reads similar to the following if it’s optimized for widgets: “These widgets are the most awesome widgets in the history of widgetized widgets. We promise these widgets will rock your world.”

Well-optimized keywords read well with the surrounding text, and if you are someone who is on the inside of the optimizations, you will likely be able to identify them more easily.

But, just because they are easily identified by you doesn’t always make them spammy.

If they are well organized and properly interwoven with the surrounding text, then you really shouldn’t have to do much else to these optimizations.

On the flip side, if there is so much spammy text that it is negatively affecting optimizations on-site, then it may be time to junk some of the content and rewrite it entirely.

29. How Deeply Has Content on the Site Been Optimized?

Just as there are different levels of link acquisition, there are different levels of content optimization.

Some optimization is surface-level, depending on the initial scope of the content execution mandate.

Other optimizations are deeper, with images, links, and keywords being fully optimized.

Further questions you may want to ask to ensure that content on your site is properly optimized include:

  • Does my content include targeted keywords throughout the body copy?
  • Does my content include headings optimized with keyword variations?
  • Does my content include lists, images, and quotes where needed? Don’t just add these things randomly throughout your content. They should be contextually relevant and support the content.
  • Does my content include bold and italicized text for emphasis where needed?
  • Does my content read well?
30. Has Keyword Research Been Performed on the Site?

Just adding keywords everywhere doesn't work well.

You have to know things like search volume, how to target those words accordingly with your audience, and how to identify what to do next.

This is where keyword research comes in.

You wouldn’t build a site without first researching your target market, would you?

In the same vein, you wouldn’t write content without performing targeted keyword research.

31. Has Content on the Site Been Proofread?

Have you performed any proofreading on the content on your site before posting?

I can’t tell you how many times I have performed an audit and found silly mistakes within the content: grammatical errors, spelling errors, and other major issues.

Be sure to proofread your content before posting. This will save a lot of editing work later, when the SEO would otherwise end up doing the editing.

When it’s part of your job, however, and it’s expected, grin and bear it. Or party and enjoy it, whichever side of the table you’re on.

32. Have Images on the Site Been Optimized?

Image optimizations include things like keyword phrases in the file name, image size, image load time, and making sure that images are optimized for Google image search.

Image size should match or otherwise appear to complement the design of your site.

You wouldn’t include images that are completely irrelevant if you were doing marketing correctly, right?

In the same vein, don’t include images that appear to be completely spamming your audience.

33. Does the Site Follow Web Development Best Practices?

This is a big one.

Sites violate even the basics of web development best practices in so many ways: from polyglot documents, to code that fails W3C validation, to excessive load times.

Now, I know I’m going to get plenty of flak from developers about how some of my usual requirements are “unrealistic,” but when you have been practicing these development techniques for years, they are not all that difficult.

It just takes a slightly different mindset than what you are used to: you know, the mindset of constantly building and going after the biggest, best, and therefore most awesome website you can create.

Instead, the mindset of creating the lightest-weight, least resource-intensive site should be at the forefront of your development practices.

The problem is that the former has been done so much that even SEOs have given up even trying to change things. There is always so much contention from web developers about even following basic web development best practices.

Yes, I know many websites out there don’t follow the W3C. But, when you are being paid by your client, and the client requests this, you need to know your stuff and know how to make sure that your site validates in the validator.

Coming up with excuses will only make you look unprofessional.

  • Are things like 1-2 second load times unrealistic? Not when you use CSS Sprites and lossless compression in Adobe Photoshop properly.
  • Are less than 2-3 HTTP requests unrealistic? Not when you properly structure the site and you get rid of unnecessary WordPress scripts that are taking up valuable code real estate.
  • Want to really get some quick page loading times? Get rid of WordPress entirely and code the site yourself. You’ll remove at least 1.5 seconds of load time just due to WordPress.

Stop being an armchair developer and become a professional web developer. Expand those horizons!

Think outside the box.

Be different. Be real. Be the best.

Stop thinking web development best practices are unrealistic – because the only thing that is unrealistic is your attitude and how much you don’t want to work hard or learn something new instead of machine-gunning your website development work in the name of profits, and dare I say – screwing the client over.

34. Has an HTTPS Migration Been Performed Correctly?

When you are setting up your site for a proper HTTPS migration, you have to purchase a website security (SSL) certificate.

One of the first steps is to perform the purchase of this certificate. If you don’t do this step correctly, you can completely screw up your HTTPS migration later.

Here’s why.

Say you purchased an SSL certificate and selected the option that covers just one subdomain. By choosing the wrong option during the purchase process, you may have inadvertently created more than 100 errors on-site.

For this reason, it is best to consider, at the very least, a wildcard SSL certificate that covers all domain variations.

While this usually costs a little more, it ensures that you never introduce these errors into the process.

35. Was a New Disavow File Submitted with the HTTPS GSC Profile? Was That HTTPS GSC Profile Ever Created?

You would be surprised how often this comes up in website audits. But, sometimes, a disavow file was never submitted to the new Google Search Console (GSC) HTTPS profile.

Or, a GSC HTTPS profile was never created, and the current GSC profile is either under-reporting or over-reporting on data, depending on how the implementation was handled.

Thankfully, the fix is pretty simple – just make sure you transfer the old HTTP Disavow file to the new HTTPS profile and continue to update it regularly.

36. Were GSC Settings Carried Over to The New Account?

This can also cause certain issues with an HTTPS migration.

Say you had the HTTP property set up as www, but then you set the domain in the new GSC property to non-www.

Or some other setting different from what the original profile had.

This is one example where errant GSC settings can cause issues with an HTTPS migration.

37. Did You Make Sure to Notate the Migration in Google Analytics?

Failing to notate the migration, or any other major website change or overhaul, can hurt your decision-making later.

If precise details are not kept by notating them in Google Analytics (GA), you could be flying blind when making website changes that depend on these details.

Here’s an example: say a major content overhaul took place, and the site got a penalty later. That overhaul was the only site change, but in the meantime, department heads changed, as did SEOs.

Notating this change in Google Analytics will help future SEO folks understand what happened before that has impacted the site in the here and now.

38. Is the Social Media Implementation on the Site Done Correctly?

This comes up in audits a lot. I see cases where social media links were never removed after things changed (like when an effort on a particular platform was unnecessarily hyperfocused), or where smaller things, like potential customer interactions, were not handled quite properly.

These things will impact the quality of your site.

If you are constantly machine-gunning your social posts, and not interacting with customers properly, you are doing it wrong.

39. Are Lead Submission Forms Properly Working?

If a lead generation form is not properly working, you may not be getting all possible leads coming through.

If there’s a typo in an email address or a typo in a line of code that is breaking the form, these need to be fixed.

For lead generation forms, it is always a high priority to make sure that regular maintenance is performed. This helps to prevent things like under-reporting of leads, and errant information being submitted.

Nothing’s worse than getting information from a form and finding that the phone number is one digit off, or the email address is wrong, due to a programming error rather than a submission error.

40. Are Any Lead Tracking Scripts Working Correctly?

Performing ongoing testing on lead tracking scripts is crucial to ensure proper functioning of your site.

If your lead tracking scripts ever break, and you get errant submissions on the weekend, this can wreak havoc on your customer acquisition efforts.

41. Is Call Tracking Properly Set Up?

I remember working with a client at an agency, and they had call tracking set up on their website.

Everything appeared to be working correctly.

When I called the client and discussed the matter, everything appeared correct.

We got to discussing the phone number and the client mentioned that they had changed that phone number a while back.

It was one digit off.

You can imagine the client’s reaction when I informed them what the phone number on the site was.

It is easy to forget to audit something as simple as the phone number when you are in the midst of increasingly complex website optimizations.

That’s why it is important to always take a step back from time to time and test things and talk to your client to make sure that your implementations are working correctly everywhere.

42. Is the Site Using Excessive Inline CSS and JavaScript?

To touch on an earlier topic, inline CSS and JavaScript become a problem when they turn excessive.

This leads to excessive browser rendering times and can potentially ruin cross-browser and cross-platform functionality by relying on inline implementations of CSS and JavaScript.

It is best to avoid these entirely in the course of your web development: always add new styles to the CSS stylesheet, and keep any new JavaScript in properly maintained external files rather than inline.
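A rough way to quantify the problem is to count style attributes and inline script blocks per page. An illustrative sketch:

```python
from html.parser import HTMLParser

class InlineAudit(HTMLParser):
    """Counts style="" attributes and <script> tags without a src."""
    def __init__(self):
        super().__init__()
        self.inline_styles = 0
        self.inline_scripts = 0

    def handle_starttag(self, tag, attrs):
        d = dict(attrs)
        if "style" in d:
            self.inline_styles += 1
        if tag == "script" and "src" not in d:
            self.inline_scripts += 1

def audit_inline(html):
    p = InlineAudit()
    p.feed(html)
    return p.inline_styles, p.inline_scripts

# Hypothetical page fragment for illustration.
page = '<div style="color:red">x</div><script>var a=1;</script><script src="app.js"></script>'
print(audit_inline(page))  # (1, 1)
```

Pages with high counts are the first candidates for moving styles and scripts into external files.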

43. Are the Right GSC/GA Accounts Linked Properly?

You wouldn’t believe how often this has come up when I took over a website. I looked at their GSC or GA accounts, and they were not properly reporting or otherwise working.

Turns out that at some point, the GA or GSC account had been switched to another account, and no one bothered to update the website accordingly. Or, some other strange scenario.

This is why it is doubly important to always check on the GSC and GA accounts and make sure that the site has the proper profiles implemented.

44. Does The Site Have URLs That Are Too Long?

By making sure that URLs are reasonably short, and avoiding extra-long URLs (over 100 characters), it is possible to avoid user experience issues.

When in doubt, if you have two URLs you could use in a redirect scenario and one is shorter, use the shorter version.

In addition, it is considered a standard SEO best practice to limit URLs to less than 100 characters. The reason why comes down to usability and user experience.

Google can process longer URLs. But, shorter URLs are much easier to parse, copy and paste, and share on social.
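Flagging over-long URLs from a crawl export takes only a few lines. A sketch, treating 100 characters as the rule-of-thumb cutoff discussed above:

```python
MAX_LEN = 100  # common SEO rule of thumb, not a hard Google limit

def too_long(urls, max_len=MAX_LEN):
    """Return URLs longer than max_len characters."""
    return [u for u in urls if len(u) > max_len]

# Hypothetical crawl export: one short slug, one long dynamic URL.
urls = [
    "https://example.com/blog/short-slug",
    "https://example.com/products?" + "&".join(f"param{i}=value{i}" for i in range(12)),
]
print(too_long(urls))  # flags the long dynamic URL
```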

This can also get pretty messy. Longer URLs, especially dynamic ones, can wreak havoc on your analytics data.

Say you have a dynamic URL with parameters.

This URL gets updated for whatever reason multiple times a month and generates new variations of this same URL with the same content, and also updates the parameters.

When URLs are super long in this situation, it can be challenging to sift through all the analytics data and identify what is what.

This is where shorter URLs come in. They make such a process easier, help ensure one URL for each piece of unique content, and avoid the risk of damaging the site’s reporting data.

It all comes down to your industry and what you do. This advice might not make as much sense for an e-commerce site that may have just as many URLs with such parameters.

In such a situation, a different method of handling such URLs may be desired.

45. How Targeted Are Keywords on the Site?

You can have the best content in the world. Your technical SEO can be flawless, and your site can be the fastest-loading website ever. But, in the end, keywords are the name of the game.

Keyword queries are how Google matches what people are searching for to your content.

The more targeted your keywords on-site are, the better that Google will be able to discern where to place your site in the search results.

What exactly is meant by targeted keywords? These are the words users type into Google to find your site: the queries your pages are mapped to.

And what is the best, most awesome method to use to optimize these keywords?

The keyword optimization concept of linear distribution applies. It’s not about how many keywords you can add to the page, but more about what linear distribution tells the search engines.

It is better to have keywords sprinkled throughout the text evenly (from the title tag, description, and meta keywords down to the bottom of the page) than stuff everything up the wazoo with keywords.

Don’t think you can randomly stuff keywords into a page with “high keyword density” and make it work for long. That’s just random keyword spamming, and the search engines don’t like that.

There is a major difference between spamming the search engines and keyword targeting. Just make sure your site adheres to proper keyword targeting for the latter, and that you are not seen as a spammer.

46. Are There Any Notations in Google Analytics About Major Site Changes?

To expand on an earlier point made during our HTTPS migration discussion, it is important to ensure that any major website changes are notated in Google Analytics.

This helps to pinpoint where things (if any) went wrong during major technical overhauls.

If a penalty occurs later, the notation can make the cause easier to pinpoint.

47. Is Google Analytics Even Set Up Properly on the Site?

Expanding on our earlier discussions about the right accounts being linked, even just setting up Google Analytics can be overlooked by even the most experienced SEO professionals.

It is a detail that, while not always identified during an audit, can wreak havoc on reporting data later.

During a site migration or design, it can be easy to miss an errant Google Analytics installation, or otherwise think the current implementation is working properly.

Even during domain changes, overall technical domain implementations, and other site-wide changes, always make sure the proper Google Analytics and GSC implementations are working and set up properly on-site.

You do not want to run into the situation later where an implementation went wrong, and you don’t know why content is not properly performing when it was posted.

48. Is Google Tag Manager Working Properly?

If you use Google Tag Manager (GTM) for your reporting, it is important to also test Google Tag Manager to make sure that it is working properly.

If your reporting implementations are not working, then they can end up underreporting or over-reporting, and you can make decisions based on false positives being presented by errant data.

Preview and debug modes, Google Tag Assistant, and Screaming Frog can all be great means to that end.

For example, identifying pages that don’t have Google Tag Manager code added is easy with Screaming Frog.

Using custom search and extraction can help you do this. This method can find pages that, quite simply, just do not have GTM installed.
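As a rough alternative check, a script can scan page HTML for the GTM container reference; the pattern here is an assumption based on the usual gtm.js snippet, not an official detection method:

```python
import re

# Matches the gtm.js loader URL or a GTM container ID in page source.
GTM_PATTERN = re.compile(r"googletagmanager\.com/gtm\.js|GTM-[A-Z0-9]+")

def has_gtm(html: str) -> bool:
    """Rough check for a Google Tag Manager container snippet in page HTML."""
    return bool(GTM_PATTERN.search(html))

# Hypothetical page sources for illustration.
with_gtm = "<script src='https://www.googletagmanager.com/gtm.js?id=GTM-ABC123'></script>"
print(has_gtm(with_gtm))          # True
print(has_gtm("<p>no tags</p>"))  # False
```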

Google Tag Assistant, a Chrome extension, can help you troubleshoot GTM, Google Analytics, and AdWords. It works by recording a browsing session, then reporting on everything that happened and how the data interactions will show up in GA.

49. Does the Site Have Any Other Major Quality Issues?

Other quality issues that can impact your site include bad design.

Be honest.

Look at your site and other competitors in the space.

How much do you really like your site in comparison to those competitors? This is one of those things that you just can’t really put a finger on.

It must be felt out, or otherwise navigated through intangibles like a gut instinct. If you really don’t like what your design is doing, it may be time to go back to the drawing board and start again from scratch.

Other issues to keep an eye out for include errors within the content, grainy images, plug-ins that aren’t working, or anything that impacts something negatively from a reporting point of view.

It may not even be a penalty either. It may simply be errant reporting due to a plug-in’s implementation that went wrong.

50. Is Your Reporting Data Accurate?

A consistent theme I wanted to carry throughout this article is inaccuracies in reporting.

Because GSC and GA reporting data are so often used to make decisions, it is important to make sure that your GSC and GA implementations are 100 percent correct.

This article describes exactly what can happen when you have issues with reporting data.

Dark traffic, or hidden traffic, can be a problem if not dealt with properly.

This can skew a huge chunk of what you think you know about your visitor traffic statistics.

That can be a major problem!

Analytics platforms, including Google, have a hard time tracking every single kind of traffic source.


A Website Is Never Done!

Evaluating the quality of a website can be an ongoing process that is never done. It is important to stick to a regular schedule.

Perhaps schedule website audits to occur every year or so. That way, you can continue an evaluation process that identifies issues and gets them in the development queue before they become problems.


Brian Harnish

Lead SEO at iLoveSEO

Brian has been doing SEO since before it was called SEO, back in the days of 1998.
