
11 Lessons Learned From Auditing Over 500 Websites

After auditing hundreds of websites, I’ve found that the most frequent and significant problems are not complex technical SEO issues. Read on to learn what truly works.

After conducting more than 500 in-depth website audits in the past 12 years, I’ve noticed clear patterns in what works in SEO and what doesn’t.

I’ve seen almost everything that can go right – and wrong – with websites of different types.

To help you avoid costly SEO mistakes, I’m sharing 11 practical lessons from critical SEO areas, such as technical SEO, on-page SEO, content strategy, SEO tools and processes, and off-page SEO.

It took me more than a decade to discover all these lessons. By reading this article, you can apply these insights to save yourself and your SEO clients time, money, and frustration – in less than an hour.

Lesson #1: Technical SEO Is Your Foundation For SEO Success

  • Lesson: You should always start any SEO work with technical fundamentals; crawlability and indexability determine whether search engines can even see your site.

Technical SEO ensures search engines can crawl, index, and fully understand your content. If search engines can’t properly access your site, no amount of quality content or backlinks will help.

After auditing over 500 websites, I believe technical SEO is the most critical aspect of SEO, and it comes down to two fundamental concepts:

  • Crawlability: Can search engines easily find and navigate your website’s pages?
  • Indexability: Once crawled, can your pages appear in search results?

If your pages fail these two tests, they won’t even enter the SEO game — and your SEO efforts won’t matter.

I strongly recommend regularly monitoring your technical SEO health using at least two essential tools: Google Search Console and Bing Webmaster Tools.

The Google Search Console Indexing report provides valuable insights into crawlability and indexability. (Screenshot from Google Search Console, April 2025)

When starting any SEO audit, always ask yourself these two critical questions:

  • Can Google, Bing, or other search engines crawl and index my important pages?
  • Am I letting search engine bots crawl only the right pages?

This step alone can save you huge headaches and ensure there are no major technical SEO blockages.
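
To make those two checks concrete, here is a minimal Python sketch (the URL is a placeholder, and the meta robots check is intentionally crude): it asks whether robots.txt allows Googlebot to fetch the page, and whether the response carries a noindex signal.

```python
# Minimal crawlability/indexability check - a sketch, not a full audit.
# The URL below is hypothetical; replace it with one of your important pages.
import re
import urllib.request
import urllib.robotparser
from urllib.parse import urlparse

URL = "https://www.example.com/important-page/"

# Crawlability: does robots.txt allow Googlebot to fetch this URL?
origin = "{0.scheme}://{0.netloc}".format(urlparse(URL))
robots = urllib.robotparser.RobotFileParser(origin + "/robots.txt")
robots.read()
print("Allowed by robots.txt:", robots.can_fetch("Googlebot", URL))

# Indexability: status code, X-Robots-Tag header, and meta robots noindex.
response = urllib.request.urlopen(URL)
html = response.read().decode("utf-8", errors="ignore")
print("Status code:", response.status)
print("X-Robots-Tag:", response.headers.get("X-Robots-Tag"))
print("Meta robots noindex:",
      bool(re.search(r'<meta[^>]+name=["\']robots["\'][^>]*noindex', html, re.I)))
```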

Read more: 13 Steps To Boost Your Site’s Crawlability And Indexability

Lesson #2: JavaScript SEO Can Easily Go Wrong

  • Lesson: You should be cautious when relying heavily on JavaScript. It can easily prevent Google from seeing and indexing critical content.

JavaScript adds great interactivity, but search engines (even ones as sophisticated as Google) often struggle to process it reliably.

Google processes JavaScript in three phases (crawling, rendering, and indexing), with rendering handled by an evergreen Chromium browser. However, rendering delays (from minutes to weeks) and limited rendering resources can prevent important content from getting indexed.

I’ve audited many sites whose SEO was failing because key JavaScript-loaded content wasn’t visible to Google.

Typically, important content was missing from the initial HTML, it didn’t load properly during rendering, or there were significant differences between the raw HTML and rendered HTML when it came to content or meta elements.

You should always test if Google can see your JavaScript-based content:

  • Use the Live URL Test in Google Search Console and verify rendered HTML.
The Google Search Console Live Test allows you to see the rendered HTML. (Screenshot from Google Search Console, April 2025)
  • Or, search Google for a unique sentence from your JavaScript content (in quotes). If your content isn’t showing up, Google probably can’t index it.*
The site: search in Google allows you to quickly check whether a given piece of text on a given page is indexed by Google. (Screenshot from Google Search, April 2025)

*This will only work for URLs that are already in Google’s index.
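
If you want to script the same kind of spot check, here is a minimal sketch (the URL and sentence are hypothetical) that only looks at the initial HTML a crawler receives before any rendering; it doesn’t replace the Live URL Test.

```python
# Check whether a unique, JavaScript-loaded sentence is present in the raw HTML.
# If it is missing here, Google has to render the page to see it.
import urllib.request

URL = "https://www.example.com/product/blue-widget"      # hypothetical URL
SENTENCE = "Hand-assembled in small batches since 2012"  # hypothetical content

request = urllib.request.Request(URL, headers={"User-Agent": "Mozilla/5.0"})
raw_html = urllib.request.urlopen(request).read().decode("utf-8", errors="ignore")

if SENTENCE.lower() in raw_html.lower():
    print("Found in the initial HTML - indexable without rendering.")
else:
    print("Missing from the initial HTML - Google must render JavaScript to see it.")
```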

Here are a few best practices regarding JavaScript SEO:

  • Critical content in HTML: You should include titles, descriptions, and important content directly in the initial HTML so search engines can index it immediately. You should remember that Google doesn’t scroll or click.
  • Server-Side Rendering (SSR): You should consider implementing SSR to serve fully rendered HTML. It’s more reliable and less resource-intensive for search engines.
  • Proper robots.txt setup: You should make sure robots.txt doesn’t block the JavaScript and CSS files needed for rendering; blocking them can prevent your content from being indexed (see the sketch after this list).
  • Use crawlable URLs: You should ensure each page has a unique, crawlable URL. You should also avoid URL fragments (#section) for important content; they often don’t get indexed.
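
As a companion to the robots.txt point above, here is a minimal sketch (the page URL is hypothetical, and the regex-based asset extraction is deliberately simple) that flags JavaScript and CSS files Googlebot is not allowed to fetch.

```python
# Flag render-critical JS/CSS assets that robots.txt blocks for Googlebot.
import re
import urllib.request
import urllib.robotparser
from urllib.parse import urljoin, urlparse

PAGE = "https://www.example.com/products/blue-widget"  # hypothetical URL

origin = "{0.scheme}://{0.netloc}".format(urlparse(PAGE))
robots = urllib.robotparser.RobotFileParser(origin + "/robots.txt")
robots.read()

html = urllib.request.urlopen(PAGE).read().decode("utf-8", errors="ignore")
assets = re.findall(r'(?:src|href)="([^"]+\.(?:js|css)[^"]*)"', html)

for asset in assets:
    asset_url = urljoin(PAGE, asset)
    if not robots.can_fetch("Googlebot", asset_url):
        print("Blocked by robots.txt:", asset_url)
```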

For a full list of common JavaScript SEO errors and best practices, see the JavaScript SEO guide for SEO pros and developers.

Read more: 6 JavaScript Optimization Tips From Google

Lesson #3: Crawl Budget Matters, But Only If Your Website Is Huge

  • Lesson: You should only worry about the crawl budget if your website has hundreds of thousands or millions of pages.

Crawl budget refers to how many pages a search engine like Google crawls on your site within a certain timeframe. It’s determined by two main factors:

  • Crawl capacity limit: This prevents Googlebot from overwhelming your server with too many simultaneous requests.
  • Crawl demand: This is based on your site’s popularity and how often content changes.

No matter what you hear or read on the internet, most websites don’t need to stress about crawl budget at all. Google typically handles crawling efficiently for smaller websites.

But for huge websites – especially those with millions of URLs or daily-changing content – crawl budget becomes critical (as Google confirms in its crawl budget documentation).

In its documentation, Google clearly defines which types of websites should be concerned about crawl budget. (Screenshot from Search Central, April 2025)

In this case, you need to ensure that Google prioritizes and crawls important pages frequently without wasting resources on pages that should never be crawled or indexed.

You can check your crawl budget health using Google Search Console’s Indexing report. Pay attention to:

  • Crawled – Currently Not Indexed: This usually indicates indexing problems, not crawl budget.
  • Discovered – Currently Not Indexed: This typically signals crawl budget issues.

You should also regularly review Google Search Console’s Crawl Stats report to see how many pages Google crawls per day. Comparing crawled pages with total pages on your site helps you spot inefficiencies.

While those quick checks in GSC won’t replace log file analysis, they give fast insight into possible crawl budget issues and can indicate whether a detailed log file analysis is necessary.
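
A rough back-of-the-envelope calculation is often enough to tell whether a deeper look is warranted. The figures below are made up purely for illustration:

```python
# Compare daily crawl volume (from the Crawl Stats report) with site size.
total_indexable_urls = 2_500_000   # hypothetical number of indexable canonical URLs
crawl_requests_per_day = 40_000    # hypothetical average from Crawl Stats
share_wasted_on_parameters = 0.35  # hypothetical share of hits on parameter URLs

useful_crawls_per_day = crawl_requests_per_day * (1 - share_wasted_on_parameters)
days_for_full_recrawl = total_indexable_urls / useful_crawls_per_day

print(f"Useful crawls per day: {useful_crawls_per_day:,.0f}")                    # 26,000
print(f"Days to recrawl every indexable URL once: {days_for_full_recrawl:.0f}")  # ~96
```

If a full recrawl would take months, crawl budget is worth investigating; if it would take a day or two, it almost certainly is not.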

Read more: 9 Tips To Optimize Crawl Budget For SEO

This brings us to the next point.

Lesson #4: Log File Analysis Lets You See The Entire Picture

  • Lesson: Log file analysis is a must for many websites. It reveals details you can’t see otherwise and helps diagnose problems with crawlability and indexability that affect your site’s ability to rank.

Log files track every visit from search engine bots, like Googlebot or Bingbot. They show which pages are crawled, how often, and what the bots do. This data lets you spot issues and decide how to fix them.

For example, on an ecommerce site, you might find Googlebot crawling product pages and then hitting add-to-cart and remove-from-cart URLs, wasting your crawl budget on useless actions.

With this insight, you can block those cart-related URLs with parameters to save resources so that Googlebot can crawl and index valuable, indexable canonical URLs.

Here is how you can make use of log file analysis:

  • Start by accessing your server access logs, which record bot activity.
  • Look at what pages bots hit most, how frequently they visit, and if they’re stuck on low-value URLs.
  • You don’t need to analyze logs manually. Tools like Screaming Frog Log File Analyzer make it easy to identify patterns quickly.
  • If you notice issues, like bots repeatedly crawling URLs with parameters, you can easily update your robots.txt file to block those unnecessary crawls (see the parsing sketch below).
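
Here is a minimal parsing sketch, assuming logs in the common combined format and a hypothetical file path; a real analysis should also verify Googlebot hits via reverse DNS.

```python
# Count Googlebot hits per URL and flag parameterized URLs eating crawl budget.
import re
from collections import Counter

LOG_FILE = "access.log"  # hypothetical path to your server access log
pattern = re.compile(r'"(?:GET|HEAD) (\S+) HTTP')

hits = Counter()
with open(LOG_FILE, encoding="utf-8", errors="ignore") as logs:
    for line in logs:
        if "Googlebot" not in line:
            continue
        match = pattern.search(line)
        if match:
            hits[match.group(1)] += 1

parameter_hits = sum(count for url, count in hits.items() if "?" in url)
print("Top crawled URLs:", hits.most_common(10))
print("Hits on parameterized URLs:", parameter_hits)
```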

Getting log files isn’t always easy, especially for big enterprise sites where server access might be restricted.

If that’s the case, you can use the aforementioned Google Search Console’s Crawl Stats, which provides valuable insights into Googlebot’s crawling activity, including pages crawled, crawl frequency, and response times.

The Google Search Console Crawl Stats report provides a sample of data about Google’s crawling activity. (Screenshot from Google Search Console, April 2025)

While log files offer the most detailed view of search engine interactions, even a quick check in Crawl Stats helps you spot issues you might otherwise miss.

Read more: 14 Must-Know Tips For Crawling Millions Of Webpages

Lesson #5: Core Web Vitals Are Overrated. Stop Obsessing Over Them

  • Lesson: You should focus less on Core Web Vitals. They rarely make or break SEO results.

Core Web Vitals measure loading speed, interactivity, and visual stability, but they do not influence SEO as significantly as many assume.

After auditing over 500 websites, I’ve rarely seen Core Web Vitals alone significantly improve rankings.

Most sites only see measurable improvement if their loading times are extremely poor (taking more than 30 seconds) or if Google Search Console flags critical issues (everything marked in red).

The Core Web Vitals report in Google Search Console provides real-world user data. (Screenshot from Google Search Console, April 2025)

I’ve watched clients spend thousands, even tens of thousands of dollars, chasing perfect Core Web Vitals scores while overlooking fundamental SEO basics, such as content quality or keyword strategy.

Redirecting those resources toward content and foundational SEO improvements usually yields way better results.

When evaluating Core Web Vitals, you should focus exclusively on real-world data from Google Search Console (as opposed to lab data in Google PageSpeed Insights) and consider users’ geographic locations and typical internet speeds.

If your users live in urban areas with reliable high-speed internet, Core Web Vitals won’t affect them much. But if they’re rural users on slower connections or older devices, site speed and visual stability become critical.

The bottom line here is that you should always base your decision to optimize Core Web Vitals on your specific audience’s needs and real user data – not just industry trends.
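
The field data behind Search Console’s Core Web Vitals report comes from the Chrome UX Report (CrUX). If you want to pull that real-user data programmatically, here is a hedged sketch using the CrUX API; the API key and origin are placeholders, and metrics may be missing if the origin doesn’t have enough traffic.

```python
# Query the CrUX API for an origin's field data and print p75 values.
import json
import urllib.request

API_KEY = "YOUR_API_KEY"  # hypothetical key from Google Cloud Console
ENDPOINT = ("https://chromeuxreport.googleapis.com/v1/records:queryRecord?key="
            + API_KEY)
payload = json.dumps({"origin": "https://www.example.com",
                      "formFactor": "PHONE"}).encode("utf-8")

request = urllib.request.Request(ENDPOINT, data=payload,
                                 headers={"Content-Type": "application/json"})
record = json.loads(urllib.request.urlopen(request).read())["record"]

# Each reported metric (LCP, INP, CLS, ...) exposes a 75th-percentile value.
for metric, data in record["metrics"].items():
    print(metric, "p75:", data.get("percentiles", {}).get("p75"))
```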

Read more: Are Core Web Vitals A Ranking Factor?

Lesson #6: Use Schema (Structured Data) To Help Google Understand & Trust You

  • Lesson: You should use structured data (Schema) to tell Google who you are, what you do, and why your website deserves trust and visibility.

Schema Markup (or structured data) explicitly defines your content’s meaning, which helps Google easily understand the main topic and context of your pages.

Certain schema types that are eligible for rich results allow your listings to display extra details, such as star ratings, event information, or product prices. These “rich snippets” can grab attention in search results and increase click-through rates.

You can think of schema as informative labels for Google. You can label almost anything – products, articles, reviews, events – to clearly explain relationships and context. This clarity helps search engines understand why your content is relevant for a given query.

You should always choose the correct schema type (like “Article” for blog posts or “Product” for ecommerce pages), implement it properly with JSON-LD, and carefully test it using Google’s Rich Results Test or the Schema Markup Validator.

In its documentation, Google shows examples of structured data markup supported by Google Search. (Screenshot from Google Search Console, April 2025)
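
As a simple illustration, here is a sketch that builds Article markup as JSON-LD and prints the script tag you would place in the page’s head; all values are placeholders, and the output should still be validated with the Rich Results Test.

```python
# Generate Article structured data as a JSON-LD script tag (placeholder values).
import json

article_schema = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "11 Lessons Learned From Auditing Over 500 Websites",
    "author": {"@type": "Person", "name": "Olga Zarr"},
    "datePublished": "2025-04-01",                          # hypothetical date
    "image": "https://www.example.com/featured-image.jpg",  # hypothetical URL
}

print('<script type="application/ld+json">')
print(json.dumps(article_schema, indent=2))
print("</script>")
```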

Schema lets you optimize SEO behind the scenes without affecting what your audience sees.

While SEO clients often hesitate about changing visible content, they usually feel comfortable adding structured data because it’s invisible to website visitors.

Read more: CMO Guide To Schema: How Your Organization Can Implement A Structured Data Strategy

Lesson #7: Keyword Research And Mapping Are Everything

  • Lesson: Technical SEO gets you into the game by controlling what search engines can crawl and index. But, the next step – keyword research and mapping – tells them what your site is about and how to rank it.

Too often, websites chase the latest SEO tricks or target broad, competitive keywords without any strategic planning. They skip proper keyword research and rarely invest in keyword mapping, both essential steps to long-term SEO success:

  • Keyword research identifies the exact words and phrases your audience actually uses to search.
  • Keyword mapping assigns these researched terms to specific pages and gives each page a clear, focused purpose.

Every website should have a spreadsheet listing all its indexable canonical URLs.

Next to each URL, there should be the main keyword that the page should target, plus a few related synonyms or variations.

A keyword mapping document is a vital element of any SEO strategy. (Image from author, April 2025)
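
In practice, that spreadsheet can be as simple as a CSV with url and main_keyword columns (hypothetical file and column names). The sketch below reads such a file and flags keywords accidentally mapped to more than one URL:

```python
# Load a keyword map from CSV and flag possible keyword cannibalization.
import csv
from collections import defaultdict

keyword_to_urls = defaultdict(list)
with open("keyword_map.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        keyword_to_urls[row["main_keyword"].strip().lower()].append(row["url"])

for keyword, urls in keyword_to_urls.items():
    if len(urls) > 1:
        print(f"'{keyword}' is mapped to {len(urls)} URLs - possible cannibalization:")
        for url in urls:
            print("  ", url)
```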

Without this structure, you’ll be guessing and hoping your pages rank for terms that may not even match your content.

A clear keyword map ensures every page has a defined role, which makes your entire SEO strategy more effective.

This isn’t busywork; it’s the foundation of a solid SEO strategy.

Read more: How To Use ChatGPT For Keyword Research

Lesson #8: On-Page SEO Accounts For 80% Of Success

  • Lesson: From my experience auditing hundreds of websites, on-page SEO drives about 80% of SEO results. Yet, only about one in every 20 or 30 sites I review has done it well. Most get it wrong from the start.

Many websites rush straight into link building, generating hundreds or even thousands of low-quality backlinks with exact-match anchor texts, before laying any SEO groundwork.

They skip essential keyword research, overlook keyword mapping, and fail to optimize their key pages first.

I’ve seen this over and over: chasing advanced or shiny tactics while ignoring the basics that actually work.

When your technical SEO foundation is strong, focusing on on-page SEO can often deliver significant results.

There are thousands of articles about basic on-page SEO: optimizing titles, headers, and content around targeted keywords.

Yet, almost nobody implements all of these basics correctly. Instead of chasing trendy or complex tactics, you should focus first on the essentials:

  • Do proper keyword research to identify terms your audience actually searches.
  • Map these keywords clearly to specific pages.
  • Optimize each page’s title tags, meta descriptions, headers, images, internal links, and content accordingly.

These straightforward steps are often enough to achieve SEO success, yet many overlook them while searching for complicated shortcuts.
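
To see what “optimize each page accordingly” can look like as a quick check, here is a minimal sketch (the URL and keyword are hypothetical, and the regex parsing is deliberately rough) that tests whether the mapped keyword appears in the title tag, meta description, and H1:

```python
# Rough on-page check: does the mapped keyword appear in title, description, H1?
import re
import urllib.request

URL = "https://www.example.com/blue-widgets/"  # hypothetical URL
KEYWORD = "blue widgets"                       # main keyword from your keyword map

html = urllib.request.urlopen(URL).read().decode("utf-8", errors="ignore")

elements = {
    "Title": re.search(r"<title[^>]*>(.*?)</title>", html, re.I | re.S),
    "Meta description": re.search(
        r'<meta[^>]+name=["\']description["\'][^>]+content=["\']([^"\']*)', html, re.I),
    "H1": re.search(r"<h1[^>]*>(.*?)</h1>", html, re.I | re.S),
}

for label, match in elements.items():
    text = re.sub(r"<[^>]+>", "", match.group(1)).strip() if match else ""
    status = "contains keyword" if KEYWORD.lower() in text.lower() else "MISSING keyword"
    print(f"{label}: {status} -> {text[:80]}")
```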

Read more: Google E-E-A-T: What Is It & How To Demonstrate It For SEO

Lesson #9: Internal Linking Is An Underused But Powerful SEO Opportunity

  • Lesson: Internal links hold more power than overhyped external backlinks and can significantly clarify your site’s structure for Google.

Internal links are way more powerful than most website owners realize.

Everyone talks about backlinks from external sites, but internal linking – when done correctly – can actually make a huge impact.

Unless your website is brand new, improving your internal linking can give your SEO a serious lift by helping Google clearly understand the topic and context of your site and its specific pages.

Still, many websites don’t use internal links effectively. They rely heavily on generic anchor texts like “Read more” or “Learn more,” which tell search engines absolutely nothing about the linked page’s content.

Low-value internal links with generic anchor text. (Image from author, April 2025)
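
A quick way to surface those low-value links is to scan a page’s HTML for internal links with generic anchor text. Here is a minimal sketch (the URL is hypothetical, and the regex extraction is intentionally simple):

```python
# Flag internal links whose anchor text says nothing about the target page.
import re
import urllib.request
from urllib.parse import urljoin, urlparse

PAGE = "https://www.example.com/blog/some-post/"  # hypothetical URL
GENERIC = {"read more", "learn more", "click here", "here", "more"}

html = urllib.request.urlopen(PAGE).read().decode("utf-8", errors="ignore")
site = urlparse(PAGE).netloc

for href, anchor in re.findall(r'<a[^>]+href="([^"]+)"[^>]*>(.*?)</a>', html, re.I | re.S):
    target = urljoin(PAGE, href)
    text = re.sub(r"<[^>]+>", "", anchor).strip()
    if urlparse(target).netloc == site and text.lower() in GENERIC:
        print(f'Generic anchor "{text}" -> {target}')
```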

Website owners often approach me convinced they need a deep technical audit.

Yet, when I take a closer look, their real issue frequently turns out to be poor internal linking or unclear website structure, both making it harder for Google to understand the site’s content and value.

Internal linking can also give a boost to underperforming pages.

For example, if you have a page with strong external backlinks, linking internally from that high-authority page to weaker ones can pass authority and help those pages rank better.

Investing a little extra time in improving your internal links is always worth it. They’re one of the easiest yet most powerful SEO tools you have.

Read more: Internal Link Structure Best Practices to Boost Your SEO

Lesson #10: Backlinks Are Just One SEO Lever, Not The Only One

  • Lesson: You should never blindly chase backlinks to fix your SEO. Build them strategically only after mastering the basics.

SEO audits often show websites placing too much emphasis on backlinks while neglecting many other critical SEO opportunities.

Blindly building backlinks without first covering SEO fundamentals – like removing technical SEO blockages, doing thorough keyword research, and mapping clear keywords to every page – is a common and costly mistake.

Even after getting those basics right, link building should never be random or reactive.

Too often, I see sites start building backlinks simply because their SEO isn’t progressing, hoping more links will magically help. This rarely works.

Instead, you should always approach link building strategically, by first carefully analyzing your direct SERP competitors to determine if backlinks are genuinely your missing element:

  • Look closely at the pages outranking you.
  • Identify whether their advantage truly comes from backlinks or better on-page optimization, content quality, or internal linking; a minimal referring-domain comparison sketch follows below.
The decision on whether or not to build backlinks should be based on whether direct competitors have more and better backlinks. (Image from author, April 2025)
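
One way to ground that decision in data is to compare referring domains from backlink-tool exports. The sketch below assumes two CSV files with a referring_domain column (hypothetical file and column names):

```python
# Compare unique referring domains between your site and an outranking competitor.
import csv

def referring_domains(path):
    with open(path, newline="", encoding="utf-8") as f:
        return {row["referring_domain"].lower() for row in csv.DictReader(f)}

ours = referring_domains("our_backlinks.csv")
theirs = referring_domains("competitor_backlinks.csv")

print("Our referring domains:", len(ours))
print("Competitor referring domains:", len(theirs))
print("Domains linking to them but not to us:", len(theirs - ours))
```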

Only after ensuring your on-page SEO and internal links are strong, and confirming that backlinks are indeed the differentiating factor, should you invest in targeted link building.

Typically, you don’t need hundreds of low-quality backlinks. Often, just a few strategic editorial links or well-crafted SEO press releases can close the gap and improve your rankings.

Read more: How To Get Quality Backlinks: 11 Ways That Really Work

Lesson #11: SEO Tools Alone Can’t Replace Manual SEO Checks

  • Lesson: You should never trust SEO tools blindly. Always cross-check their findings manually using your own judgment and common sense.

SEO tools make our work faster, easier, and more efficient, but they still can’t fully replicate human analysis or insight.

Tools lack the ability to understand context and strategy in the way that SEO professionals do. They usually can’t “connect the dots” or assess the real significance of certain findings.

This is exactly why every recommendation provided by a tool needs manual verification. You should always evaluate the severity and real-world impact of the issue yourself.

Often, website owners come to me alarmed by “fatal” errors flagged by their SEO tools.

Yet, when I manually inspect these issues, most turn out to be minor or irrelevant.

Meanwhile, fundamental aspects of SEO, such as strategic keyword targeting or on-page optimization, are completely missing since no tool can fully capture these nuances.

Screaming Frog SEO Spider says there are rich result validation errors, but when I check them manually, there are no errors. (Screenshot from Screaming Frog, April 2025)

SEO tools are still incredibly useful because they handle large-scale checks that humans can’t easily perform, like analyzing millions of URLs at once.

However, you should always interpret their findings carefully and manually verify the importance and actual impact before taking any action.

Final Thoughts

After auditing hundreds of websites, the biggest pattern I notice isn’t complex technical SEO issues, though they do matter.

Instead, the most frequent and significant problem is simply a lack of a clear, prioritized SEO strategy.

Too often, SEO is done without a solid foundation or clear direction, which makes all other efforts less effective.

Another common issue is undiagnosed technical problems lingering from old site migrations or updates. These hidden problems can quietly hurt rankings for years if left unresolved.

The lessons above cover the majority of challenges I encounter daily, but remember: Each website is unique. There’s no one-size-fits-all checklist.

Every audit must be personalized and consider the site’s specific context, audience, goals, and limitations.

SEO tools and AI are increasingly helpful, but they’re still just tools. Ultimately, your own human judgment, experience, and common sense remain the most critical factors in effective SEO.

Featured Image: inspiring.team/Shutterstock

Olga Zarr, SEO Consultant at SEOSLY
