
Google: Changing URLs On Larger Sites Takes Time To Process

Google's John Mueller explains how long it takes Google to process changes across a larger site


Someone on Reddit asked a question about making a sitewide code change on a website with ten languages. Google’s John Mueller offered general advice about the pitfalls of sitewide changes and a word about complexity (implying the value of simplicity).

The question was related to hreflang but Mueller’s answer, because it was general in nature, had wider value for SEO.

Here is the question that was asked:

“I am working on a website that contains 10 languages and 20 culture codes. Let’s say blog-abc was published on all languages. The hreflang tags in all languages are pointing to blog-abc version based on the lang. For en it may be en/blog-abc

They made an update to the one in English language and the URL was updated to blog-def. The hreflang tag on the English blog page for en will be updated to en/blog-def. This will however not be dynamically updated in the source code of other languages. They will still be pointing to en/blog-abc. To update hreflang tags in other languages we will have to republish them as well.

Because we are trying to make the pages as static as possible, it may not be an option to update hreflang tags dynamically. The options we have is either update the hreflang tags periodically (say once a month) or move the hreflang tags to sitemap.

If you think there is another option, that will also be helpful.”
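For context on the second option mentioned in the question, Google supports declaring hreflang alternates in an XML sitemap instead of in each page’s HTML, using xhtml:link elements inside every url entry. Below is a minimal Python sketch of how that sitemap could be regenerated from a single URL mapping so all language versions stay in sync; the example.com URLs, the language list, and the BLOG_ABC mapping are hypothetical illustrations, not details from the Reddit question.

```python
# Minimal sketch: generate hreflang annotations in an XML sitemap so every
# language version is kept in sync from one mapping, instead of republishing
# each page's HTML. URLs and the slug mapping are hypothetical examples.

from xml.sax.saxutils import escape

# One record per piece of content: language code -> current URL in that language.
# When the English slug changes (blog-abc -> blog-def), only this mapping changes.
BLOG_ABC = {
    "en": "https://example.com/en/blog-def",   # English URL was renamed
    "fr": "https://example.com/fr/blog-abc",
    "de": "https://example.com/de/blog-abc",
}

def sitemap_entries(alternates: dict[str, str]) -> str:
    """Build one <url> block per language, each listing every alternate."""
    blocks = []
    for lang, loc in alternates.items():
        links = "\n".join(
            f'    <xhtml:link rel="alternate" hreflang="{alt_lang}" href="{escape(alt_url)}"/>'
            for alt_lang, alt_url in alternates.items()
        )
        blocks.append(f"  <url>\n    <loc>{escape(loc)}</loc>\n{links}\n  </url>")
    return "\n".join(blocks)

sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"\n'
    '        xmlns:xhtml="http://www.w3.org/1999/xhtml">\n'
    + sitemap_entries(BLOG_ABC) +
    "\n</urlset>"
)

print(sitemap)
```

Regenerating the sitemap on a schedule would keep the hreflang data current without touching the static HTML of every language version, which is the trade-off the question is weighing.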

Sitewide Changes Take A Long Time To Process

I recently read something interesting in a research paper that reminded me of what John Mueller has said about how it takes time for Google to understand how updated pages relate to the rest of the Internet.

The research paper mentioned how updated webpages require recalculating the semantic meanings of those pages (the embeddings) and then re-indexing the embeddings for all of the other documents.

Here’s what the research paper (PDF) says in passing about adding new pages to a search index:

“Consider the realistic scenario wherein new documents are continually added to the indexed corpus. Updating the index in dual-encoder-based methods requires computing embeddings for new documents, followed by re-indexing all document embeddings.

In contrast, index construction using a DSI involves training a Transformer model. Therefore, the model must be re-trained from scratch every time the underlying corpus is updated, thus incurring prohibitively high computational costs compared to dual-encoders.”
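To make the dual-encoder workflow the paper contrasts with a DSI more concrete, here is a toy Python sketch: adding documents means computing embeddings for the new ones and then rebuilding the index over every document embedding. The embed() function and the brute-force cosine index are illustrative stand-ins, not the paper’s implementation or anything Google has described.

```python
# Toy sketch of the dual-encoder update described in the paper: embeddings are
# computed for new documents, then the index over *all* document embeddings is
# rebuilt. embed() is a placeholder for a real encoder model.

import numpy as np

rng = np.random.default_rng(0)

def embed(texts: list[str]) -> np.ndarray:
    """Placeholder encoder: returns a random unit vector per document."""
    vecs = rng.normal(size=(len(texts), 64))
    return vecs / np.linalg.norm(vecs, axis=1, keepdims=True)

class DenseIndex:
    def __init__(self, docs: list[str]):
        self.docs = list(docs)
        self.matrix = embed(self.docs)            # embeddings for the whole corpus

    def add(self, new_docs: list[str]):
        new_vecs = embed(new_docs)                # step 1: embed only the new documents
        self.docs.extend(new_docs)
        self.matrix = np.vstack([self.matrix, new_vecs])  # step 2: rebuild index over all embeddings

    def search(self, query: str, k: int = 3):
        scores = self.matrix @ embed([query])[0]  # cosine similarity (unit vectors)
        top = np.argsort(-scores)[:k]
        return [(self.docs[i], float(scores[i])) for i in top]

index = DenseIndex(["doc about hreflang", "doc about sitemaps"])
index.add(["newly published blog post"])
print(index.search("sitemaps"))
```

The point of the sketch is the cost structure, not the retrieval quality: every corpus update touches the index for the whole document set, which is the expense the paper says a DSI makes even worse by requiring full retraining.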

I mention that passage because in 2021 John Mueller said it can take Google months to assess the quality and relevance of a site, and he described how Google tries to understand how a website fits in with the rest of the web.

Here’s what he said in 2021:

“I think it’s a lot trickier when it comes to things around quality in general where assessing the overall quality and relevance of a website is not very easy.

It takes a lot of time for us to understand how a website fits in with regards to the rest of the Internet.

And that’s something that can easily take, I don’t know, a couple of months, a half a year, sometimes even longer than a half a year, for us to recognize significant changes in the site’s overall quality.

Because we essentially watch out for …how does this website fit in with the context of the overall web and that just takes a lot of time.

So that’s something where I would say, compared to technical issues, it takes a lot longer for things to be refreshed in that regard.”

That part about assessing how a website fits in the context of the overall web is a curious and unusual statement.

What he said about fitting into the context of the overall web sounds surprisingly similar to what the research paper said about how updating the search index “requires computing embeddings for new documents, followed by re-indexing all document embeddings.”

Here’s John Mueller’s response on Reddit about the problem with updating a lot of URLs:

“In general, changing URLs across a larger site will take time to be processed (which is why I like to recommend stable URLs… someone once said that cool URLs don’t change; I don’t think they meant SEO, but also for SEO). I don’t think either of these approaches would significantly change that.”

What does Mueller mean when he says that big changes take time to be processed? It could be similar to what he said in 2021 about evaluating the site all over again for quality and relevance. The relevance part could also be similar to what the research paper said about “computing embeddings,” which relates to creating vector representations of the words on a webpage as part of understanding their semantic meaning.

See also: Vector Search: Optimizing For The Human Mind With Machine Learning

Complexity Has Long-Term Costs

John Mueller continued his answer:

“A more meta question might be whether you’re seeing enough results from this somewhat complex setup to merit spending time maintaining it like this at all, whether you could drop the hreflang setup, or whether you could even drop the country versions and simplify even more.

Complexity doesn’t always add value, and brings a long-term cost with it.”

Creating sites with as much simplicity as possible has been something I’ve done for over twenty years. Mueller’s right. It makes updates and revamps so much easier.

Featured Image by Shutterstock/hvostik

