SEO professionals have long discussed the concept of a “crawl budget”: the idea that a search engine will crawl only a limited number of pages on a given site each day.
The assumption is that sites must stay within this allotted budget to get pages indexed.
However, in a recent podcast, Google’s Search Relations team debunked several misconceptions about crawl budgets and explained how Google prioritizes crawling.
How Googlebot Prioritizes Crawling
Dave Smart, an SEO consultant and Google Product Expert, acknowledges the confusion surrounding crawl budget:
“I think there’s a lot of myths out there about crawling, about what it is and what it isn’t. And things like crawl budgets and phrases you hear thrown around that may be quite confusing to people.”
Google’s Gary Illyes answered with a question:
“All right. I will turn this around and I will ask you, if you operated a crawler, how would you decide what to fetch?”
Smart responded:
“You need to do it by looking at what’s known, finding somewhere to start, a starting point. And from that, you get the links and stuff, and then you would try and determine what’s important to go and fetch now, and maybe what can wait until later and maybe what’s not important at all.”
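What Smart describes is, in effect, a priority-queue crawl frontier: seed it with known URLs, fetch the highest-priority page, extract its links, and score each discovered URL to decide whether it gets fetched now, later, or not at all. As a rough illustration only (not Google’s actual implementation), here is a minimal Python sketch; the `fetch`, `extract_links`, and `importance` inputs are placeholders supplied by the caller:

```python
import heapq

def score_url(url, importance):
    """Toy importance heuristic; a real crawler would weigh signals
    such as link popularity, freshness, and how useful past fetches were."""
    return importance.get(url, 0.0)

def crawl(seeds, fetch, extract_links, importance, max_fetches=100):
    """Fetch the highest-scoring known URL first, discover links, repeat."""
    # Max-heap via negated scores: the most important URL pops first.
    frontier = [(-score_url(u, importance), u) for u in seeds]
    heapq.heapify(frontier)
    seen = set(seeds)
    fetched = 0
    while frontier and fetched < max_fetches:
        neg_score, url = heapq.heappop(frontier)
        if -neg_score <= 0:
            continue  # "not important at all": never worth a fetch
        page = fetch(url)
        fetched += 1
        for link in extract_links(page):
            if link not in seen:
                seen.add(link)
                heapq.heappush(frontier, (-score_url(link, importance), link))
```

Note that nothing in the sketch enforces a fixed daily quota: the frontier simply keeps fetching whatever currently scores highest, while low-value URLs wait or are skipped entirely.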
Gary Illyes expanded on how Google decides how much to crawl by explaining the role of search demand.
This is what Gary said:
“One is the scheduler, which basically says that I want to crawl this… But that’s also kind of controlled by some feedback from search. …if search demand goes down, then that also correlates to the crawl limit going down.”
Gary doesn’t define “search demand,” but his entire statement is framed from Google’s perspective, so the phrase most likely means search query demand. That reading makes sense: if nobody is searching for Cabbage Patch Kids, Google has little reason to keep crawling websites about Cabbage Patch Kids. Still, since Gary never spelled it out, the phrase has to be interpreted from the context in which it was spoken.
Gary finishes his thought on that topic with this sentence:
“So if you want to increase how much we crawl, then you somehow have to convince search that your stuff is worth fetching, which is basically what the scheduler is listening to.”
Gary does not elaborate on what he means by “convince search that your stuff is worth fetching,” but one interpretation is to keep content relevant to user trends, which means keeping it up to date.
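Read together, these remarks describe a feedback loop rather than a fixed allowance: a scheduler that turns a site’s crawl limit up or down as signals come back from search. The following Python sketch is purely illustrative; the function name, the `demand_signal` input, and the bounds are all invented for the example:

```python
def adjust_crawl_limit(current_limit, demand_signal, min_limit=1, max_limit=10_000):
    """Hypothetical scheduler feedback step.

    demand_signal > 1.0 means search demand for the site's content rose;
    < 1.0 means it fell. The crawl limit follows the signal, within bounds.
    """
    new_limit = current_limit * demand_signal
    return max(min_limit, min(max_limit, int(new_limit)))

# If search demand drops 20%, the crawl limit drifts down with it...
limit = adjust_crawl_limit(current_limit=500, demand_signal=0.8)    # -> 400
# ...and rises again when demand (or, per the next section, quality) recovers.
limit = adjust_crawl_limit(current_limit=limit, demand_signal=1.5)  # -> 600
```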
Related: How To Manage Crawl Budget For Large Sites
Focus On Quality & User Experience
So, what can websites do to ensure their pages get crawled and indexed efficiently? The answer lies in focusing on site quality.
As Illyes puts it:
“Scheduling is very dynamic. As soon as we get the signals back from search indexing that the quality of the content has increased across this many URLs, we would just start turning up demand.”
By consistently improving page quality and the usefulness of your content to searchers, you can overcome any assumed limitations on crawling.
The key is to analyze your site’s performance, identify areas for improvement, and focus on delivering the best possible experience to your target audience.
In Summary
Google’s recent insights clarify that a fixed “crawl budget” is largely a myth. Instead, the search engine’s crawling decisions are dynamic and driven by content quality and search demand.
By prioritizing quality, relevance, and user experience, site owners can ensure that their valuable pages get discovered, crawled, and indexed by Google, without worrying about hitting an arbitrary limit.
Hear the full discussion in the podcast episode.
FAQ
How does the concept of a crawl budget affect SEO strategies?
SEO professionals have discussed the concept of a crawl budget, believing that staying within a certain limit of pages crawled daily is essential. However, Google’s search engineers have clarified that there is no set crawl budget that websites must adhere to.
Instead, Google prioritizes crawling based on content quality and user interaction signals. Therefore, SEO strategies should shift focus from managing a crawl budget to optimizing for high-quality, user-centric content to increase the chances of being crawled and indexed effectively.
What factors influence Googlebot’s prioritization for crawling web pages?
A dynamic set of factors influences Googlebot’s crawl prioritization, predominantly content quality, though Gary Illyes noted that search demand also plays a role (likely a reference to search query demand).
In what ways can marketers enhance the crawlability of their website’s content?
Marketers looking to improve their website’s crawlability should concentrate on the following:
- Producing high-quality content that is informative, relevant, and engaging to the target audience.
- Ensuring the website offers a superior user experience with fast loading times, mobile-friendliness, and navigational ease.
- Gaining natural backlinks from reputable sources to increase credibility and visibility to search engines.
- Regularly updating content to reflect the latest information, trends, and user needs, and signaling those updates to crawlers (see the sitemap sketch after this list).
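On the last point, one mechanical way to make content updates visible to crawlers is an XML sitemap with accurate `<lastmod>` values, which are part of the standard sitemaps.org protocol. A minimal Python sketch with placeholder URLs:

```python
from datetime import date

def sitemap_xml(pages):
    """Render a minimal sitemaps.org-style sitemap.

    `pages` is an iterable of (url, last_modified_date) pairs.
    """
    entries = "\n".join(
        f"  <url><loc>{url}</loc><lastmod>{modified.isoformat()}</lastmod></url>"
        for url, modified in pages
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n</urlset>"
    )

# Placeholder URL; real sitemaps should list canonical, indexable pages.
print(sitemap_xml([("https://example.com/guide", date(2024, 6, 1))]))
```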
Featured Image: BestForBest/Shutterstock