There is no benchmark for what is considered an optimal crawl budget for websites, Google’s John Mueller says.
This was stated in response to a Reddit thread in which an SEO asked whether there's an ideal percentage of pages that Googlebot should be crawling every day. The thinking behind the question is that content may be kept fresh if Googlebot crawls it more regularly.
Here is the question in full:
“While everyone talks about crawl budget, I haven’t found a specific cutoff or a range for this. Like what should be the minimum percentage of pages out of total (or total changes made everyday) should be crawled by GoogleBot everyday to keep the content fresh?
I understand that this can vary a lot because lot depends upon static/variable content on the website. I am just trying to understand how to benchmark crawl budget.”
In response, Mueller simply states: “There is no number.”
So if you're looking to boost your crawl budget, there's no ideal number to aim for. That's not to say, however, that there's no benefit to optimizing a website's crawl budget.
What is Crawl Budget?
Simply put, a crawl budget is the number of URLs Googlebot is able to crawl (based on site speed) and wants to crawl (based on user demand).
A higher crawl budget can help keep popular content fresh, and help prevent older content from becoming stale.
See: Googlebot Crawl Budget Explained by Google’s Gary Illyes
Factors Affecting Crawl Budget
One of the best ways to improve crawl budget is to limit the number of low-value-add URLs on a website, such as:
- Faceted navigation and session identifiers
- On-site duplicate content
- Soft error pages
- Hacked pages
- Infinite spaces and proxies
- Low quality and spam content
These types of pages can steal crawl activity away from a site’s more important pages.
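As an illustration, faceted-navigation parameters and session identifiers can often be kept out of the crawl with robots.txt rules. Here's a minimal sketch, assuming hypothetical query parameters named `color`, `sort`, and `sessionid` (substitute your site's actual parameters, and test rules before deploying, since a wrong pattern can block important pages):

```
User-agent: *
# Block hypothetical faceted-navigation parameters
Disallow: /*?*color=
Disallow: /*?*sort=
# Block a hypothetical session identifier parameter
Disallow: /*?*sessionid=
```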
It’s also advisable for site owners to monitor the crawl errors report in Search Console and keep server errors to a minimum.
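To show what "keeping server errors to a minimum" means in practice, here's a minimal sketch of how status codes from a crawl report might be bucketed, with server errors (5xx) flagged separately. The function name and sample codes are illustrative, not part of any Google tool:

```python
def classify_status(code: int) -> str:
    """Bucket an HTTP status code for a simple crawl-health report."""
    if 200 <= code < 300:
        return "ok"
    if 300 <= code < 400:
        return "redirect"
    if code == 404:
        return "not found"
    if 500 <= code < 600:
        return "server error"  # the bucket to keep as close to empty as possible
    return "other"

# Hypothetical sample of codes pulled from server logs or a crawl report
sample_codes = [200, 301, 404, 503]
report = {code: classify_status(code) for code in sample_codes}
print(report)
```

A real audit would feed this from server logs or Search Console exports rather than a hard-coded list.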
Also see: 7 Tips to Optimize Crawl Budget for SEO