If you’re a CMO, I feel your pain. For years, decades even, brand visibility has largely been an SEO arms race against your competitors. And then along come ChatGPT, Perplexity, and Claude, not to mention Google’s new AI-powered search features: AI Mode and AI Overviews. Suddenly, you’ve got to factor in your brand’s visibility in AI-generated responses as well.
Unfortunately, the technical shortcuts that helped your brand adapt quickly and stay competitive over the years have most likely left you with various legacy issues. This accumulated technical SEO debt could devastate your AI visibility.
Of course, every legacy issue or technical problem will have a solution. But your biggest challenge in addressing your technical SEO debt isn’t complexity or incompetence; it’s assumption.
Assumptions are the white ants in your search strategy, hollowing out the team’s tactics and best efforts. Everything might still seem structurally sound on the surface because all the damage is happening inside the walls of the house, or between the lines of your SEO goals and workflows. But then comes that horrific day when someone inadvertently applies a little extra pressure in the wrong spot and the whole lot caves in.
The new demands of AI search are applying that pressure right now. How solid is your technical SEO?
Strong Search Rankings ≠ AI Visibility
One of the most dangerous assumptions you can make is thinking that, because your site ranks well enough in Google, the technical foundations must be sound. So, if the search engines have no problem crawling your site and indexing your content, the same should also be true for AI, right?
Wrong.
Okay, there are actually a couple of assumptions in there. But that’s often the way: One assumption provides the misleading context that leads to others, and the white ants start chewing your walls.
Let’s deal with that second assumption first: If your site ranks well in Google, it should enjoy similar visibility in AI.
We recently compared Ahrefs data for two major accommodation websites: Airbnb and Vrbo.
When we look at non-branded search, both websites have seen a downward trend since July. The most recent data point we have (Oct. 13-15, 2025) has Airbnb showing up in ~916,304 searches and Vrbo showing up in ~615,497. That’s a ratio of roughly 3:2.
Image from author, October 2025
But when we look at estimated ChatGPT mentions (September 2025), Airbnb has ~8,636, while Vrbo has only ~1,573. That’s a ratio of roughly 11:2.
Image from author, October 2025
I should add a caveat at this point: any AI-related datasets are early and modeled, so they should be taken as indicative rather than absolute. However, the data suggests Vrbo appears far less in AI answers (and ChatGPT in particular) than you’d expect if there were any correlation with search rankings.
Because of Vrbo’s presence in Google’s organic search results, it does have a modest presence in Google’s AI Overviews and AI Mode. That’s because Google’s AI features still largely draw on the same search infrastructure.
And that’s the key issue here: Search engines aren’t the only ones sending crawlers to your website. And you can’t assume AI crawlers work in the same way.
AI Search Magnifies Your Technical SEO Debt
So, what about that first assumption: If your site ranks fine in Google, any technical debt must be negligible.
Google’s search infrastructure is highly sophisticated, taking in a much wider array of signals than AI crawlers currently do. The cumulative effect of all these signals can mask or compensate for small amounts of technical debt.
For example, a page with well-optimized copy, strong schema markup, and decent authority might still rank higher than a competitor’s, even if your page loads slightly slower.
Most AI crawlers don’t work that way. They strip away code, formatting, and schema markup to ingest only the raw text. With fewer other signals to balance things out, anything that hinders the crawler’s ability to access your content will have a greater impact on your AI visibility. There’s nowhere for your technical debt to hide.
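To picture what “raw text only” means in practice, here’s a minimal sketch of text-only extraction in Python. It’s an illustration, not any platform’s actual pipeline, and it assumes the requests and beautifulsoup4 libraries are installed:

```python
# Illustration only: AI crawler pipelines aren't public, but text-only
# ingestion looks broadly like this.
import requests
from bs4 import BeautifulSoup

def extract_plain_text(url: str) -> str:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    # Scripts, styles, and JSON-LD schema blocks are simply thrown away.
    for tag in soup(["script", "style", "noscript"]):
        tag.decompose()

    # What remains is raw, unstructured text: no markup, no schema signals.
    return " ".join(soup.get_text(separator=" ").split())

print(extract_plain_text("https://example.com")[:500])
```

Notice what gets discarded: if a key message only exists in rendered JavaScript or in your structured data, a text-only crawler may never see it at all.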
The Need For Speed
Let’s look at just one of the most common forms of technical SEO debt: page speed.
Sub-optimal page speed rarely has a single cause. It’s usually down to a combination of factors – bloated code, inefficient CSS, large JavaScript bundles, oversized images and media files, poor infrastructure, and more – with each instance adding just a little more drag on how quickly the page loads in a typical browser.
Yes, we could be talking fractions of a second here and there, but the accumulation of issues can have a negative impact on the user experience. This is why faster websites will generally rank higher; Google treats page speed as a direct ranking factor in search.
Page speed also appears to be a significant factor in how often content appears in Google’s new AI Mode.
Dan Taylor recently crunched the data on 2,138 websites appearing as citations in AI Mode responses to see if there was any correlation between how often they were cited and their Largest Contentful Paint (LCP) and Cumulative Layout Shift (CLS) scores. What he found was a clear drop-off in AI Mode citations for websites with slower load times.
Image from author, October 2025
Image from author, October 2025
We also looked at another popular method website owners use to assess page speed: Google’s PageSpeed Insights (PSI) tool. This aggregates a range of metrics, including LCP and CLS alongside many others, to generate an overall score out of 100. However, we found no correlation between PSI scores and citations in AI Mode.
So, while PageSpeed Insights can give you handy diagnostic information, identifying the various issues impacting your load times, your site’s Core Web Vitals are a more reliable indicator of how quickly and efficiently site visitors and crawlers can access your content.
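If you want to check your own pages, a sketch like the one below pulls the field Core Web Vitals from the PageSpeed Insights API (v5) rather than leaning on the single 0-100 score. The API key placeholder and the exact response fields are assumptions to verify against Google’s current documentation:

```python
# A minimal sketch: field Core Web Vitals from the PageSpeed Insights API (v5).
# YOUR_API_KEY is a placeholder; confirm the response fields against the docs.
import requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def core_web_vitals(url: str, api_key: str) -> dict:
    resp = requests.get(
        PSI_ENDPOINT,
        params={"url": url, "strategy": "mobile", "key": api_key},
        timeout=60,
    ).json()

    field = resp.get("loadingExperience", {}).get("metrics", {})
    lab_score = resp["lighthouseResult"]["categories"]["performance"]["score"]

    return {
        # Field (real-user) data, reported at the 75th percentile.
        "lcp_ms": field.get("LARGEST_CONTENTFUL_PAINT_MS", {}).get("percentile"),
        "cls": field.get("CUMULATIVE_LAYOUT_SHIFT_SCORE", {}).get("percentile"),
        # The aggregate 0-100 lab score that showed no correlation with AI Mode citations.
        "psi_score": round(lab_score * 100),
    }

print(core_web_vitals("https://example.com", "YOUR_API_KEY"))
```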
I know what you’re thinking: This data is confined to Google’s AI Mode. It doesn’t tell us anything about whether the same is true for visibility in other AI platforms.
We currently lack any publicly available data to test the same theory for other agentic assistant tools such as ChatGPT, but the clues are all there.
Crawling Comes At A Cost
Back in July, OpenAI’s Sam Altman told Axios that ChatGPT receives 2.5 billion user prompts every day. For comparison, SparkToro estimates Google serves ~16.4 billion search queries per day.
The large language model (LLM) powering each AI platform responds to a prompt in two ways:
- Drawing on its large pool of training data.
- Sending out bots or crawlers to verify and supplement the information with data from additional sources in real time.
ChatGPT’s real-time crawler is called ChatGPT-User. At the time of writing, the previous seven days saw ChatGPT-User visit the SALT.agency website ~6,000 times. In the same period, Google’s search crawler, Googlebot, accessed our website ~2,500 times.
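You can run the same comparison on your own site by counting crawler user agents in your access logs. Here’s a rough sketch; the log path, log format, and user-agent list are assumptions, so adjust them for your own stack:

```python
# Count hits per crawler user agent in a standard web server access log.
from collections import Counter

CRAWLER_TOKENS = ["ChatGPT-User", "GPTBot", "OAI-SearchBot",
                  "PerplexityBot", "ClaudeBot", "Googlebot"]

def count_crawler_hits(log_path: str) -> Counter:
    hits = Counter()
    with open(log_path, errors="ignore") as log:
        for line in log:
            for token in CRAWLER_TOKENS:
                if token in line:
                    hits[token] += 1
                    break
    return hits

for agent, count in count_crawler_hits("/var/log/nginx/access.log").items():
    print(f"{agent}: {count}")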
Handling billions of prompts each day consumes a huge amount of processing power. OpenAI estimates that its current expansion plans will require 10 gigawatts of power, which is roughly the output of 10 nuclear reactors.
Each one of those 6,000 crawls of the SALT website drew on these computational resources. However, a slow or inefficient website forces the crawler to burn even more of those resources.
As the volume of prompts continues to grow, the cumulative cost of all this crawling will only get bigger. At some point, the AI platforms will have no choice but to improve the cost efficiency of their crawlers (if they haven’t already), shunning websites that require more resources to crawl in favor of those that are quick and easy to access and read.
Why should ChatGPT waste resources crawling slow websites when it can extract the same or similar information from more efficient sites with far less hassle?
Is Your Site Already Invisible To AI?
All the above assumes the AI crawler can access your website in the first place. As it turns out, even that isn’t guaranteed.
In July this year, Cloudflare (one of the world’s largest content delivery networks) started blocking AI crawlers by default. This decision potentially impacts the AI visibility of millions of websites.
Cloudflare first gave website owners the ability to block AI crawlers in September 2024, and more than 1 million customers chose to do just that. The new pay-per-crawl feature takes this a step further, allowing paid users of Cloudflare to choose which crawlers they will allow and on what terms.
However, the difference now is that blocking AI crawlers is no longer opt-in. If you want your website and content to be visible in AI, you need to opt out of the default block (assuming you’re aware of the change, of course).
If your site runs on Cloudflare infrastructure and you haven’t explicitly checked your settings recently, there’s a decent chance your website might now be invisible to ChatGPT, Claude, and Perplexity. Not because your content isn’t good enough. Not because your technical SEO is poor. But because a third-party platform made an infrastructure decision that directly impacts your visibility, and you might not even know it happened.
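A quick, admittedly partial, first check is to see whether your robots.txt currently disallows the main AI crawlers. The sketch below uses Python’s standard library; the agent names are the commonly published ones, and bear in mind that Cloudflare can also block at the network level, which a robots.txt check won’t catch, so review your CDN and firewall settings as well:

```python
# Check whether common AI crawler user agents are disallowed in robots.txt.
# This only covers robots-level directives, not network-level blocks.
from urllib import robotparser

AI_AGENTS = ["GPTBot", "ChatGPT-User", "OAI-SearchBot",
             "ClaudeBot", "PerplexityBot", "Google-Extended"]

def audit_robots(site: str) -> None:
    rp = robotparser.RobotFileParser()
    rp.set_url(f"{site.rstrip('/')}/robots.txt")
    rp.read()
    for agent in AI_AGENTS:
        allowed = rp.can_fetch(agent, f"{site.rstrip('/')}/")
        print(f"{agent}: {'allowed' if allowed else 'BLOCKED'} by robots.txt")

audit_robots("https://www.example.com")
```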
This is the uncomfortable reality CMOs need to face: You can’t assume what works today will work tomorrow. You can’t even assume that decisions affecting your AI visibility will always happen within your organization.
And when a change like this does happen, you absolutely can’t assume someone else is handling it.
Who Is Responsible?
Most technical SEO issues will have a solution, but you’ve got to be aware of the problem in the first place. That requires two things:
- Someone responsible for identifying and highlighting these issues.
- Someone with the necessary skills and expertise to fix them.
Spelled out like this, my point might seem a tad patronizing. But be honest: Could you name the person(s) responsible for these in your organization? Who would you say is responsible for proactively spotting something like Cloudflare’s new pay-per-crawl policy and raising it with you? And would they agree with your expectation if you asked them?
Oh, and don’t cop out by claiming the responsibility lies with your external SEO partners. Agencies might proactively advise clients whenever there’s “a major disturbance in the Force,” such as a pending Google update. But does your contract with them include monitoring every aspect of your infrastructure, including third-party services? And does this responsibility extend to improving your AI visibility on top of the usual SEO activities? Unless this is explicitly spelled out, there’s no reason to assume they’re actively ensuring all the various AI crawlers can access your site.
In short, most technical SEO debt happens because everyone assumes it’s someone else’s job.
The CMO assumes it’s the developer’s responsibility. It’s all code, right? The developers should know the website needs to rank in search and be visible in AI. Surely, they’ll implement technical SEO best practice by default.
But developers aren’t technical SEO experts in exactly the same way they’re not web designers or UX specialists. They’ll build what they’re briefed to build. They’ll prioritize what you tell them to prioritize.
As a result, the dev team assumes it’s up to the SEO team to flag any new technical changes. But the SEO team assumes all is well because last quarter’s technical audit, based on the same list of checks they’ve relied on for years, didn’t identify anything amiss. And everybody assumes that, if there were going to be any issues with AI visibility, someone else would have raised it by now.
This confusion all helps technical debt to accumulate, unseen and unchecked.
→ Read more: Why Your SEO Isn’t Working, And It’s Not The Team’s Fault
Stop Assuming And Start Doing
The best time to stop white ants from eating the walls of your home is before you even know they’re there. Wait until the problems are obvious, and the expense of fixing the damage will far outweigh the cost of an initial inspection and a few precautionary measures.
In the same way, don’t wait until it becomes obvious that your brand’s visibility in AI is compromised. Perform the necessary inspections now, and identify and fix any technical problems that might hinder AI crawlers.
A big part of this will be strong communication between your teams, with accountabilities that make clear who is responsible for monitoring and actioning each factor contributing to your overall visibility in AI.
If you don’t, any investment and effort your team puts into optimizing brand content for AI could be wasted.
Stop assuming tomorrow will work like today. Technical SEO debt will impact your AI visibility. That’s not up for debate. The real question is whether you’ll proactively address your technical SEO debt now or wait until the assumptions cause your online visibility to crumble.
More Resources:
- Ask An SEO: How Do You Prioritize Technical SEO Fixes With Limited Dev Support?
- Beyond Keywords: Leveraging Technical SEO To Boost Crawl Efficiency And Visibility
- The Complete Technical SEO Audit Workbook
Featured Image: SvetaZi/Shutterstock