Google uses two types of crawling when it goes through webpages — one to discover new content and one to refresh existing content.
This is explained by Google’s Search Advocate John Mueller during the Google Search Central SEO office-hours hangout recorded on January 7.
An SEO professional named Swyamdipta Chakraborty joins the livestream to ask Mueller a series of questions, one of which concerns how often Googlebot crawls his site.
He notes that Googlebot used to crawl his site daily when he published more regularly, but doesn’t crawl as much when fewer articles are published.
Perhaps out of concern that a reduction in crawl frequency is a bad sign, he asks Mueller if this is normal.
Mueller assures him this is fine, and goes on to explain the two types of crawling Googlebot engages in.
Learn more about how Google crawls websites in the section below.
Two Types Of Googlebot Crawling
You can find out how often Googlebot crawls your site via a report in Search Console, and there may be periods when your site is crawled more than others.
When questioned about the report, Mueller confirms the fluctuations are normal and discusses the two types of crawling:
“That can happen. It’s not so much that we crawl a website, but we crawl individual pages of a website. And when it comes to crawling, we have two types of crawling roughly.
One is a discovery crawl where we try to discover new pages on your website. And the other is a refresh crawl where we update existing pages that we know about.”
Not only can crawl frequency vary for the whole site, it can also vary for individual webpages.
If your homepage is updated more regularly than other pages, for example, then you’ll see more Googlebot activity on that page.
“So for the most part, for example, we would refresh crawl the homepage, I don’t know, once a day, or every couple of hours, or something like that.
And if we find new links on their home page then we’ll go off and crawl those with the discovery crawl as well. And because of that you will always see a mix of discover and refresh happening with regard to crawling. And you’ll see some baseline of crawling happening every day.
But if we recognize that individual pages change very rarely, then we realize we don’t have to crawl them all the time.”
Certain types of websites are likely to be crawled more than others.
A news website that’s updated multiple times a day will be crawled more than a site that’s updated once a month.
Googlebot is capable of recognizing these patterns and adjusting its crawl frequency accordingly.
“For example, if you have a news website and you update it hourly, then we should learn that we need to crawl it hourly. Whereas if it’s a news website that updates once a month, then we should learn that we don’t need to crawl every hour.
And that’s not a sign of quality, or a sign of ranking, or anything like that. It’s really just purely from a technical point of view we’ve learned we can crawl this once a day, or once a week, and that’s ok.”
So don’t be alarmed if you notice Googlebot is visiting your site more or less often.
Further, don’t be concerned if Googlebot recently crawled your site and updates to existing content aren’t reflected in search results.
That could be a case where Google crawled your site to discover new content, not to refresh existing content.
If your site rarely makes changes to published content, then Googlebot may crawl more for discovery than for refreshes.
Again, it doesn’t necessarily have anything to do with content quality.
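If you want to see this pattern for yourself, you can look at your own server access logs rather than relying only on Search Console. The sketch below is a rough, hypothetical illustration (the sample log lines and the discovery/refresh heuristic are assumptions, not Google's definitions): it counts Googlebot requests per day, and treats a URL seen for the first time as a proxy for a discovery crawl and a repeat visit as a proxy for a refresh crawl.

```python
import re
from collections import Counter

# Hypothetical sample lines in combined log format; in practice you would
# read these from your server's access log (e.g. an nginx or Apache log).
# Note: the Googlebot user-agent string can be spoofed, so a production
# script should also verify the requesting IP via reverse DNS.
LOG_LINES = [
    '66.249.66.1 - - [07/Jan/2022:09:15:01 +0000] "GET / HTTP/1.1" 200 5123 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [07/Jan/2022:11:02:44 +0000] "GET /new-article HTTP/1.1" 200 8910 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.7 - - [07/Jan/2022:11:03:00 +0000] "GET / HTTP/1.1" 200 5123 "-" "Mozilla/5.0"',
    '66.249.66.1 - - [08/Jan/2022:09:14:37 +0000] "GET / HTTP/1.1" 200 5123 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
]

# Pull the date (day granularity) and the requested path out of each line.
LOG_PATTERN = re.compile(r'\[(\d{2}/\w{3}/\d{4}):[^\]]+\] "GET (\S+) HTTP')

def googlebot_crawls(lines):
    """Count Googlebot requests per day and split first-seen vs. repeat URLs.

    A first-seen URL is a rough proxy for a discovery crawl; a repeat
    visit is a rough proxy for a refresh crawl. This is a heuristic,
    not how Google itself labels requests.
    """
    per_day = Counter()
    seen_urls = set()
    discovery = refresh = 0
    for line in lines:
        if "Googlebot" not in line:
            continue  # skip non-Googlebot traffic
        match = LOG_PATTERN.search(line)
        if not match:
            continue
        day, url = match.groups()
        per_day[day] += 1
        if url in seen_urls:
            refresh += 1
        else:
            seen_urls.add(url)
            discovery += 1
    return per_day, discovery, refresh

per_day, discovery, refresh = googlebot_crawls(LOG_LINES)
print(dict(per_day))       # Googlebot hits per day
print(discovery, refresh)  # first-seen URLs vs. repeat visits
```

Run against a real log over a few weeks, a chart of `per_day` will show the baseline crawl activity Mueller describes, with the mix of first-seen and repeat URLs shifting as you publish more or less often.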
Hear the full discussion below:
Featured Image: Diyajyoti/Shutterstock