
57 SEO Insights From Google’s John Mueller

Brush up on your SEO knowledge with this comprehensive collection of insights from Google Search Advocate John Mueller.

As Google’s Search Advocate, John Mueller shares so many SEO insights that keeping up with them all would be a full-time job.

Mueller assists SEO professionals each week, answering their questions in a live Q&A. Many of the tips you’ll find here are from those office-hours hangouts.

This roundup also includes key takeaways from search explainer videos that Mueller puts out on a less frequent basis.

Each of these snack-sized servings of Google insider knowledge is grouped by category, starting with ranking factors.

Here’s a roundup of top tips you may have missed.

Ranking Factors

1. Google Doesn’t Have 200+ Ranking Factors

In the past, Google has said there are 200+ factors its algorithm takes into consideration when ranking content.

Google is officially moving away from that number, saying it’s misleading and creates a false impression of how its algorithms work.

“… we’ve kind of moved away from the over 200 ranking signals number, because it feels like even having a number like that is kind of misleading in the sense that, Oh Google has a spreadsheet with all of the ranking signals and they can just sort them by importance and tell me which ones they are. And that’s definitely not the case.”

2. Quantity Of Backlinks Doesn’t Matter

The total number of links pointing to a website is irrelevant to Google.

One good link from a relevant website can be more impactful than millions of low-quality links.

“… you could go off and create millions of links across millions of websites if you wanted to, and we could just ignore them all.

Or there could be one really good link from one website out there that is, for us, a really important sign that we should treat this website as something that is relevant because it has that one link… So the total number essentially is completely irrelevant.”

3. Changing Dates Won’t Improve Rankings

Changing publishing dates on webpages, without making any significant changes, will not help to improve rankings in Google search results.

“I don’t think it would change anything with regards to search, and we definitely wouldn’t rank those pages differently in search just because you’re changing the date and time on a page.”

4. Duplicate Content Is Not A Negative Ranking Factor

Duplicate content does not count negatively against a site in terms of search rankings. Google handles it by displaying one version of the content and ignoring the others.

“So if you have the same content on multiple pages then we won’t show all of these pages. We’ll try to pick one of them and show that. So it’s not that there’s any negative signal associated with that. In a lot of cases that’s kind of normal that you have some amount of shared content across some of the pages.”

5. Presentation Can Impact Rankings

The visual presentation of a website can impact its visibility in search results.

“Sometimes those small differences do play a role in regards to how people perceive your website. If, for example, you have something that is on a financial topic and people come to you and say “well your information is okay but it’s presented in a way that looks very amateurish,” – then that could reflect how your website is perceived. And in the long run could reflect something that is visible in search as well.”

6. Customer Reviews Are Not A Ranking Factor

Customer reviews are not used by Google’s algorithms to rank web search results.

They’re used in local search rankings, but not organic web search rankings.

“As far as I know we don’t use the number of customers or reviews when it comes to web search, with regards to ranking. Sometimes we do pull that information out and we might show it as kind of a rich result in the search results.”

7. It Might Take A Month To See Ranking Changes

After fixing quality issues on a site it could take up to a month to see changes in Google’s search results.

“How long that takes… yeah… it’s hard to say… it’s really hard to say… to re-crawl that across a larger site that can take a bit of time, especially if you make bigger changes like across everything if you change the structure of your website.

I would assume something like that, just purely from a technical point of view would take… I don’t know… maybe a month.”

8. Removing Blog Comments May Impact Rankings

Google indexes blog comments like other content, which means they could be helping webpages rank in search results.

Therefore, removing all blog comments from a site could impact its rankings.

“I think it’s ultimately up to you. From our point of view we do see comments as a part of the content. We do also, in many cases, recognize that this is actually the comment section so we need to treat it slightly differently. But ultimately if people are finding your pages based on the comments there then, if you delete those comments, then obviously we wouldn’t be able to find your pages based on that.”

9. Core Web Vitals Is More Than A Tiebreaker

Contrary to what was previously believed, Mueller confirms the Core Web Vitals ranking factor is more than just a tiebreaker.

“It is a ranking factor, and it’s more than a tie-breaker, but it also doesn’t replace relevance. Depending on the sites you work on, you might notice it more, or you might notice it less…

The other thing to keep in mind with core web vitals is that it’s more than a random ranking factor, it’s also something that affects your site’s usability after it ranks (when people actually visit).”

10. Core Web Vitals Ranking Factor Gets Calculated Slowly

Core Web Vitals data is collected over a trailing 28-day window. That means the scores reported in Google Search Console, or in tools like PageSpeed Insights, reflect what Google measured over (roughly) the previous 28 days.

Therefore, if Core Web Vitals scores are improved, it will take time to see a noticeable impact from the ranking signals. Mueller says it’s not yet decided whether this will change or whether there will always be a general lag.

“I don’t know if that’s decided completely yet. I mean… part of that is also that there is just a general lag for the data anyway… We kind of have to wait that period of time until we have collected enough data…

So I suspect it’s not something that will be optimized for… speedy updates but more kind of to have a clear understanding of the overall picture… my guess is it’ll be more something of a slow thing rather than a real-time change.”

11. Traffic Doesn’t Impact Core Web Vitals

Core Web Vitals scores are calculated from actual traffic, but the traffic itself does not influence the scoring.

“It doesn’t matter if millions of users are seeing that or just… I don’t know… thousands of users are seeing it… the pure number of visitors to your site is not a factor when it comes to core web vitals and generally not a factor for ranking either.”

12. Google My Business Is Essential For Local Search Rankings

To rank well in local search results, optimizing your Google My Business listing is just as important as optimizing your website.

“… it sounds like what you’re looking at is a local service or local business, essentially. And for that I would make sure that you really have a really strong Google My Business entry set up. Because that’s something that can be shown a little bit easier in the search results for queries like this.

And in particular, queries that include something like “near me,” it’s not that you need to rank for “near me” because near me is essentially, like… global. It’s not something specific on your website.

But rather what you need to do there is just make sure that you have your location very clearly defined on your pages, so that we can recognize this location is associated with your website or with this page and the user is in that location.”

13. Product Price Is Not A Ranking Factor

Offering competitive prices may help attract more customers, but it won’t have any impact on the search rankings of ecommerce stores.

“Purely from a web search point of view, no, it’s not the case that we would try to recognize the price on a page and use that as a ranking factor.

So it’s not the case that we would say we’ll take the cheaper one and rank that higher. I don’t think that would really make sense.”

14. Word Count is Not a Ranking Factor

There’s no truth to the theory that word count matters for search rankings.

If a shorter article communicates the same information as a longer article, Google will recognize it offers the same value to searchers.

As Mueller says, it doesn’t make sense to rank content based on which page has more words than the other.

“We don’t use word count for ranking. It’s fine to use word counts for *yourself* as a guideline for your content, if it encourages better content from your writers.”

15. Original Page Titles Are Still Used For Rankings

Following an update to how Google generates page titles in search results, Mueller confirms original titles are no less important than they were before.

The carefully crafted page title you wrote will still be used for search rankings even if Google replaces it in the SERPs.

“You never know how these things evolve over time, but at least at the moment it is the case that we continue to use what you have in your title tag, in your title element, as something that we can use for ranking.

It’s not like something that replaces everything for the website, but it is a factor that we use in there. Even if when we display the title for your page we swap out maybe that one keyword that you care about, we would still use that for ranking.”

16. E-A-T Is Not A Ranking Factor

After Google wrote about E-A-T (expertise, authoritativeness, and trustworthiness) in its Quality Rater Guidelines, a belief started to emerge that it’s a direct ranking factor.

It’s not, Mueller confirms. And there’s no such thing as an “E-A-T score” either.

“…  it’s not something where I would say Google has an E-A-T score and it’s based on five links plus this plus that.

It’s more something that, our algorithms over time …we try to improve them, our quality raters try to review our algorithms and they do look at these things.

So there might be some overlap here but it’s not that there’s a technical factor that’s involved which would kind of take specific elements and use them as an SEO factor.”

17. There’s No Single Deciding Factor In Search

There’s no single ranking factor you can point to and say it’s the deciding factor above others.

A ranking factor may carry considerable weight for one query, and not matter at all for another.

“And it’s also not the case that any particular kind of factor within this big network is the one deciding factor or that you can say that this factor plays a 10% role because maybe for some sites, for some queries, it doesn’t play a role at all.

And maybe for other sites, for other queries, it’s the deciding factor. It’s really hard to say kind of how to keep those together.”

18. Heading Tags Are A Strong Signal

Text within a heading tag sends a strong signal to Google, telling it what a page is about and what you want it to rank for.

“And when it comes to text on a page, a heading is a really strong signal telling us this part of the page is about this topic.

…whether you put that into an H1 tag or an H2 tag or H5 or whatever, that doesn’t matter so much.

But rather kind of this general signal that you give us that says… this part of the page is about this topic. And this other part of the page is maybe about a different topic.”
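To illustrate why heading text is such a useful signal, a crawler can recover a page’s topic outline from its headings alone, regardless of whether each one is an H1 or an H5. A minimal Python sketch (the sample page and helper name are invented for illustration, not Google’s actual parsing):

```python
import re

def heading_outline(html: str) -> list[str]:
    """Extract the text of all heading tags (h1-h6), in document order.

    A crude regex sketch of the topic signal Mueller describes --
    the heading level matters less than the text itself.
    """
    return [re.sub(r"<[^>]+>", "", m).strip()
            for m in re.findall(r"<h[1-6][^>]*>(.*?)</h[1-6]>", html, re.I | re.S)]

# Hypothetical page: each heading labels the topic of the section below it.
page = """
<h1>Best Hiking Boots</h1>
<p>Intro...</p>
<h2>Waterproof Boots</h2>
<p>...</p>
<h2>Lightweight Boots</h2>
<p>...</p>
"""

print(heading_outline(page))
# → ['Best Hiking Boots', 'Waterproof Boots', 'Lightweight Boots']
```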

19. Keywords In The Domain Name Do Not Impact Rankings

A website is not any more likely to rank for a particular keyword if that keyword is in the domain name.

Keywords in the domain name are not a ranking signal.

You’re much better off with a domain that reflects the name of your company, or has something to do with your brand.

“Just because a website has a keyword in its domain name doesn’t mean that it’s more relevant than others for that keyword.

In short, you don’t need to put keywords in the domain name.”

Main Content

20. Make The Focus Keyword As Visible As Possible

Mueller strongly advises putting a page’s focus keyword where it’s most visible, including titles, headings, subheadings, and so on.

“I would recommend that if there’s something that you want to tell us that your page is about, to make that as visible as possible.

So don’t just put that as a one-word mention at the bottom of your article. But rather, use it in your titles, use it in your headings, use it in your subheadings, use it in captions from images…

All of these things, to make it as clear as possible for users and for Google when they go to your page that this page is about this topic.”

21. Improve It Or Remove It

When asked whether it’s better to improve low-quality content or remove it, Mueller says improving it is the best way to go.

If you have no intention to improve the content, however, then you should go ahead and remove it.

“I think if that’s something that you think is good content that you want to publish with your website, with your name, then I would keep it. Just because it’s old doesn’t mean it’s bad.

But if you look at it and you say, oh, this is embarrassing for me now, I don’t want it to be online, it’s like so bad. Then that’s something where I’d say either improve it or remove it.”

22. Put Unique Content Above The Fold

A webpage should have at least some unique content in the above-the-fold area.

There’s always going to be some content that’s duplicated across different pages, but aim for at least a small amount of unique content at the top of the page.

“The important part for us is really that there is some amount of unique content in the above the fold area. So if you have a banner on top, and you have a generic hero image on top, that’s totally fine. But some of the above the fold content should be unique for that page.”

23. Spelling And Grammar Are High Priority

Poor spelling and grammar are seen by Google as a quality issue because they can directly impact a user’s experience.

“With regard to spelling errors, grammatical errors, I think that’s something that’s a bit more of almost like a gray zone in that on the one hand we have to be able to recognize what a page is about.

And if we can’t recognize that because there’s so many errors on the page in the text, then that makes it harder.

The other aspect is also that we try to find really high quality content on the web and sometimes it can appear that a page is lower quality content because it has a lot of grammatical and technical mistakes in the text.

I would almost say spelling and grammar is probably for most websites a higher priority than broken HTML.”

24. Most Content Gets Indexed Within A Week

When a new page is published it can take anywhere from several hours to several weeks for it to be indexed.

Mueller suspects most good content is picked up and indexed within about a week.

25. Same Content In Different Formats Is Not Duplicate

Identical content published in different formats, such as a video and a blog post, is not considered duplicate content.

Google doesn’t transcribe the dialogue in a video to compare it against the written content in a blog post.

“First of all we don’t do text analysis of the videos and then map them to webpages. If your video has the same content as your blog post it’s still something different. People sometimes go to Google with the intent to read something, and sometimes they go to Google with the intent to watch something or to listen to something, and those are very different things.

We wouldn’t say the text in this video is exactly the same as a blog post therefore we don’t show either of them or we only show one of them. So if you have a video that matches your blog post I think that’s perfectly fine.”

26. Lots Of Affiliate Links OK If Content Is Valuable

There’s no harm in having a lot of affiliate links on a page if the main content adds value to the web.

Websites are free to use as many affiliate links as they want on a single page, as long as there’s useful content as well.

“There is no limit. From our side it’s not that we’re saying that affiliate links are bad or problematic. It’s more a matter of, well, you actually need to have some useful content on your page as well. So that’s kind of the angle that we take there.

The amount of affiliate links that you have on a site is totally irrelevant. The ratio of links to article length is also totally irrelevant.”

27. Embedded Videos Have Same Value As Uploaded Videos

Videos embedded from other sources have the same SEO value as videos natively hosted on a website.

“It’s essentially the same. It’s very common that you have a separate CDN (content delivery network) for videos, for example, and technically that’s a separate website. From our point of view if that works for your users, if your content is properly accessible for indexing then that’s perfectly fine.”

28. Too Many Internal Links Can Lower Their Value

Using a large number of internal links on the same page can dilute their value.

When asked if too many internal links on a page do more harm than good, Mueller says:

“Yes and no. I think, in the sense that we do use the internal links to better understand the structure of a page, and you can imagine the situation where if we’re trying to understand the structure of a website, with the different pages that are out there, if all pages are linked to all other pages on the website, where you essentially have like a complete internal linking across every single page, then there’s no real structure there.

So regardless of what PageRank, and authority, and passing things like that, you’re essentially not providing a clear structure of the website. And that makes it harder for search engines to understand the context of the individual pages within your website. So that’s the way that I would look at it there.”

29. Images Instead Of HTML For Charts

A rare instance where Mueller recommends using images instead of plain text is when you’re displaying a chart in the main content.

There’s no benefit to coding a chart with HTML. An image with an understandable alt attribute is perfectly fine.

“I think it depends a bit on what you want to achieve with the chart. Usually, these kind of things I would just add as an image and make sure that you have an understandable alt attribute for the image as well.

So if there’s any critical information in that chart that you need to get across then put it in the alt attributes. So that we can pick it up as text, so that people who can’t see the image can also get that information. But in general I would just use images.”
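A hedged sketch of the markup Mueller suggests: the chart ships as an image, and the critical takeaway lives in the alt attribute so it survives as text for Google and for screen readers. The filename and figures below are invented for illustration:

```python
from html import escape

# Hypothetical chart: the key numbers from the chart go into the alt text,
# so the information is available even when the image itself can't be seen.
chart_alt = "Organic traffic grew from 10k to 25k monthly visits, Jan-Jun 2021"
img_tag = f'<img src="/charts/traffic-growth.png" alt="{escape(chart_alt, quote=True)}">'

print(img_tag)
```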

30. Anchor Text Should Provide Context

Internal links can help Google discover more articles within a site, so the anchor text should provide context to what the linked page is about.

“With regards to internal links you’re giving us a signal of context. So basically you’re saying, in this part of my website you’ll find information about this topic. And that’s what you would use as the anchor text for those internal links…

With regards to external links, if you’re linking out to other people’s websites, the same things. Like, supply some context why people should go and click on this link, what kind of extra information it gives.”

31. Long Anchor Text Gives Google More Context

There’s nothing wrong with using long anchor text on a page. In fact, it may help.

Google uses anchor text to learn more about the page being linked to.

The longer the anchor text, the more context you’re giving to Google.

That information will be taken into account when ranking the page.

“I don’t think we do anything special to the length of words in the anchor text. But rather, we use this anchor text as a way to provide extra context for the individual pages.

Sometimes if you have a longer anchor text that gives us a little bit more information. Sometimes it’s kind of like just a collection of different keywords.”

32. Google Doesn’t Understand Sarcasm

Writing content with a sarcastic tone may endear you to readers, but it won’t win any points with Google.

Google’s algorithm is likely to misunderstand sarcasm, so avoid using it when writing content that communicates critical information.

“I would say there’s definitely a risk that we misunderstand things like that or that we don’t understand when there is sarcasm on a page.

And especially if it’s something where it’s really critical for you to get the right message across to Google and to all users then I would make sure it’s as clear as possible.

So maybe in cases where you’re talking about medical information, maybe try to avoid sarcasm.

If you’re writing about … an entertainment topic or something like that then that’s like probably less of an issue.”

Technical SEO

33. Some Types Of Schema Markup Can’t Be Combined

Google’s rich results can combine certain types of structured data markup but not others.

“… in the search results, some of the rich results types we can combine and some of them we can’t combine. So, for example, if you have a recipe and you have ratings then we can often combine that in the search results, in one rich results type.

However, if you have an FAQ and you have a how-to, then at least from what I recall what these look like, these are things that wouldn’t be combined in a single rich result type, which means our systems would have to pick one of them to show.”
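The recipe-plus-ratings case Mueller mentions works because the rating is nested inside the Recipe markup, so both surface in one combined rich result. A minimal JSON-LD sketch (recipe name and figures are invented), serialized with Python’s json module:

```python
import json

# Hypothetical recipe page: nesting aggregateRating inside the Recipe
# object is what lets both appear together in a single rich result.
recipe = {
    "@context": "https://schema.org",
    "@type": "Recipe",
    "name": "Simple Banana Bread",
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.7",
        "ratingCount": "312",
    },
}

json_ld = json.dumps(recipe, indent=2)
print(json_ld)  # paste into a <script type="application/ld+json"> tag
```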

34. No Benefit To A Flat URL Structure

An artificially flat URL structure, where every page looks like it’s one click away from the home page, isn’t necessary.

Google isn’t concerned with how many slashes there are in a URL. It treats URLs as identifiers of content, not as a way to understand site structure.

“You don’t have to have kind of an artificially flat directory structure. So from that point of view, if you have a directory structure that users can recognize and where you can tell that sometimes people are like even typing in the URL, or copy and pasting parts of a URL together, I think that’s perfectly fine. There’s no need to hide that kind of URL structure from users by doing URL rewriting or anything like that.”

35. 404 Errors Are Normal

It’s normal for a site to have 404 errors, so Google doesn’t treat them as a negative ranking factor.

There’s no reason to be concerned even if Search Console shows up to 40% of a site’s pages are 404s.

“I don’t think that would look unusual to us. It’s not like we would see that as a quality signal or anything. The only time where I think 404s would start to look like something problematic for us is when the home page starts returning 404s. Then that might be a situation where we go: “oh, I don’t know if this site is actually still up.”

But if parts of the site are 404, like, whatever. It’s like a technical thing, like, it doesn’t matter.”

36. A ccTLD Is Not Required For Geotargeting

A country-code top-level domain (ccTLD), such as .de for Germany, is not necessary to geotarget searchers in that country.

“No, it’s not required.

In general, if you want to use geotargeting there, there are two ways to do that.

One is to use the country-level top-level domain, which would be dot DE for Germany in that case.

The other is to use a generic top-level domain and to use a geotargeting setting in search console.

So that could be, for example, a dot Com website or dot Net or dot Info or dot EU or whatever.

Any of those would also work and then you just set geotargeting for Germany.”

37. No Technical Fix For Quality Issues

Technical fixes can only get you so far with improving search rankings.

For a website to be taken seriously by Google it has to meet a certain level of quality, which cannot be achieved with technical fixes alone.

“Website quality is not something you can fix with technical changes. If you want search engines to take your site more seriously, you really, really need to improve the game there.”

38. URL Length Is A Light Signal For Canonicalization

The length of a URL is a light signal that Google uses to determine which version of a URL is canonical.

Google is likely to choose the shorter version of a URL to display in search results, provided all else is equal.

“We use URL length very lightly for canonicalization, so if we spot url.htm?utm=greencheeseandham and url.htm, we might choose url.htm as the canonical assuming all else is the same. That can make it look like shorter URLs are better for SEO, but it’s really just a side-effect.”
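The side-effect Mueller describes can be sketched as a simple tie-break: among otherwise-equal duplicate URLs, the one without the tracking parameter is shorter and wins. An illustrative Python sketch with invented URLs, not Google’s actual canonicalization logic:

```python
def pick_canonical(duplicates: list[str]) -> str:
    """Among URLs serving identical content, prefer the shortest --
    a rough stand-in for the light length signal described above."""
    return min(duplicates, key=len)

# Hypothetical duplicate pair: same page, one with a tracking parameter.
urls = [
    "https://example.com/url.htm?utm=greencheeseandham",
    "https://example.com/url.htm",
]

print(pick_canonical(urls))
# → https://example.com/url.htm
```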

39. Keep URLs The Same When Revamping A Site

A site revamp could do more harm than good if you go so far as to change the URLs.

When making changes and/or improvements to a site, make sure the URLs stay the same.

If the URLs are changed, Google may crawl them as new pages, which means starting from scratch in terms of search rankings.

“For revamps there’s sometimes a few things that come together and it’s sometimes tricky to figure out exactly what all is happening.

But the main thing that I would watch out for when you’re doing a revamp is to make sure:

That the URLs stay the same as much as possible so that you don’t change the URL structure.

That the internal linking stays the same as much as possible.

That the content and the layout on the pages stays the same as much as possible.”

40. Structured Data Is An “Extremely Light” Signal

Structured data helps communicate to Google what a page is about, but SEOs shouldn’t depend on it to have a huge impact on search rankings.

Why?

Because it’s only a light signal.

If you really want to make it obvious to Google what you want a page to rank for, communicate it through the main content.

“How do you rank something purely from SD hints? It’s an extremely light signal. If you’re worried, make the content more obvious.”

41. Multiple H1 Tags On The Same Page Is Fine

Does Google recommend using one H1 heading? No.

Publishers are free to use as many H1 headings as they want.

“You can use H1 tags as often as you want on a page. There’s no limit, neither upper or lower bound.

Your site is going to rank perfectly fine with no H1 tags or with five H1 tags.”

42. Nofollow Links In Guest Posts

If you’re submitting a guest post to another site with a link back to your site, that link should carry a nofollow (or sponsored) attribute.

The same goes for guest posts published on your site with a link back to the author’s site.

Google sees guest posts as promotion for the author’s site, so any links within the content are not considered natural links.

Therefore, a nofollow attribute must be present to prevent Google from thinking you’re involved in some kind of link scheme.

“The part that’s problematic is the links — if you’re providing the content/the links, then those links shouldn’t be passing signals & should have the rel-sponsored / rel-nofollow attached. It’s fine to see it as a way of reaching a broader audience.

Essentially if the link is within the guest post, it should be nofollow, even if it’s a “natural” link you’re adding there.

FWIW none of this is new, and I’m not aware of any plans to ramp up manual reviews of this. We catch most of these algorithmically anyway.”
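In markup terms, a guest-post byline link would carry the rel values Mueller names. A minimal sketch with an invented URL and anchor text:

```python
# Hypothetical guest-post byline link: rel="nofollow sponsored" tells
# Google not to pass ranking signals through the link.
author_url = "https://example.com/"
link = f'<a href="{author_url}" rel="nofollow sponsored">Example Author</a>'

print(link)
```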

43. Google Prefers JSON-LD Structured Data

There are two main types of structured data markup you can use on your site.

One is JSON-LD, the other is microdata.

Google supports both types but prefers JSON-LD.

“We currently prefer JSON-LD markup. I think most of the new structured data types kind of come out for JSON-LD first. So that’s what we prefer.”

44. Keyword-Rich Meta Titles Not Against Google’s Guidelines

Though Google doesn’t recommend it, filling up a page’s meta title with keywords is not against the search engine’s guidelines.

Previously there was a belief that this would be considered keyword stuffing and result in a demotion.

According to Mueller, that’s not the case.

“It’s not against our webmaster guidelines. It’s not something that we would say is problematic. I think, at most, it’s something where you could improve things if you had a better fitting title because we understand the relevance a little bit better.

And I suspect the biggest improvement with a title in that regard there is if you can create a title that matches what the user is actually looking for then it’s a little bit easier for them to actually click on a search result because they think “oh this really matches what I was looking for.”

General SEO

45. SEO Won’t Become Obsolete

Mueller says he doesn’t think search engines will ever advance to a point that SEO becomes obsolete.

This puts to rest concerns that Google’s machine learning will advance to a point where good content can rank without SEO.

“I think one of the things that people always worry about is everything around machine learning and that Google’s algorithms will get so far as to automatically understand every website and SEO will be obsolete, nobody will need to do that. I don’t think that will happen.”

46. Perfect Time For An SEO Side Hustle

With the launch of the Page Experience update, Mueller says now is the right time to get into optimizing websites.

“A good consultant who helps a site get into the green can be worth a lot of money. If you like this kind of work, if you like working on a low level on websites, and have practice with various setups/CDNs/plugins/frameworks, now’s the perfect time to level up and get paid well for it.”

47. Sometimes There’s No SEO Solution

Sometimes SEO is not the solution for making a site rank better, Mueller says:

“One of the things to keep in mind is that it’s possible that there’s just no SEO solution. 6 years is a long time, and the web + Google News + everything around it has evolved quite a bit.

Sometimes it’s not a technical issue, sometimes it’s not something you can fix by just “buying a bunch of links,” sometimes it’s just that the site strategy is now obsolete.”

48. Google Indexes Mobile Versions Of Pages By Default

The change to mobile-first indexing means Google indexes the mobile versions of a page by default, rather than the desktop.

In the case of sites with separate mobile URLs, that means the m-dot version is used for indexing.

“The change with mobile first indexing is that we’ll use the mobile version (m-dot) as the version for indexing, instead of the www (desktop) version. For most sites, this change has already happened. If your site is already indexed with mobile, nothing will change.”

49. Don’t Rely On Google Discover Traffic

Referral traffic from Google Discover is likely to fluctuate, so it’s important not to depend on it as a consistent source.

“… with Discover it’s something which is not tied to a specific query. So it’s really hard to say what you should be expecting because you don’t know how many people are interested in this topic or where we would potentially be able to show that.

So that’s something where, if you do see a lot of visibility from Google Discover, I think that’s fantastic. I just would be careful and kind of realize that this is something that can change fairly quickly.”

50. Build Links With “Digital PR”

Mueller speaks positively of building links through digital PR (public relations). He clarifies it’s not spam. In fact, it may be as critical as technical SEO.

“I love some of the things I see from digital PR, it’s a shame it often gets bucketed with the spammy kind of link building. It’s just as critical as tech SEO, probably more so in many cases.”

51. Mobile And Desktop Rankings Are Contextually Personalized

For some searches, the needs of individual users are different depending on whether they’re searching from mobile or desktop, and that can influence rankings.

“… It’s normal that desktop and mobile rankings are different.

Sometimes that’s with regards to things like speed. Sometimes that’s with regards to things like mobile friendliness.

Sometimes that’s also with regards to the different elements that are shown in the search results page.

For example if you’re searching on your phone then maybe you want more local information because you’re on the go.

Whereas if you’re searching on a desktop maybe you want more images or more videos shown in the search results. So we tend to show …a different mix of different search results types.

And because of that it can happen that the ranking or the visibility of individual pages differs between mobile and desktop.

And that’s essentially normal. That’s a part of how we do ranking. It’s not something where I would say it would be tied to the technical aspect of indexing the content.”

52. Syndication Can Impact Rankings

Republishing and/or syndicating content on other sites can reduce your website’s chances of ranking for target keywords.

Mueller calls syndicating/republishing a bad idea:

“If you’re republishing, then it *is* duplicate content. If your goal is to reach a broader audience, go for it. If your goal is that only your site ranks for those queries, then syndicating / republishing is a bad idea. Pick your goals & select the work that helps you reach them.”
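If you do syndicate, one documented way to signal which copy you consider the original is a cross-domain canonical on the republished version (Google treats cross-domain canonicals as a hint, not a directive, and has also suggested publishers may ask syndication partners to `noindex` their copies instead). The URLs below are placeholders:

```html
<!-- On the syndicated copy (https://partner-site.example/republished-article) -->
<link rel="canonical" href="https://your-site.example/original-article">
```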

53. Search Is Not A Science

Search is not an exact science; there is no single set of steps every website must follow to achieve favorable rankings.

There are multiple ways to rank highly in Google, and not every site has to follow the same blueprint.

“I think that’s really important to keep in mind in the sense that there is no absolute truth out there with regards to which page should be ranking for which query…

So it’s not that every site has to do the same thing, but rather there are multiple ways to get there and you don’t have to blindly follow just one ranking factor to get to the end result.”

54. Core Updates Impact Google Discover

When Google rolls out a core update in search results it also impacts how content is surfaced in Google Discover.

If traffic goes up or down after a core update, but your search rankings are stable, it could be due to changes in Discover.

“We do use a number of the same quality algorithms in Discover as we use in web search. When a broad core update happens in web search it’s very common that you would also see changes in discover as well. So that’s certainly not totally unrelated.”

55. Google Doesn’t Index All Pages

Google doesn’t index all pages of a website, even if it knows about them.

It’s completely normal, Mueller says, for as much as 20% of a site’s pages not to be indexed.

“The other thing to keep in mind with regards to indexing, is it’s completely normal that we don’t index everything off of the website.

So if you look at any larger website or any even midsize or smaller website, you’ll see fluctuations in indexing.

It’ll go up and down and it’s never going to be the case that we index 100% of everything that’s on a website.

So if you have a hundred pages and (I don’t know) 80 of them are being indexed, then I wouldn’t see that as being a problem that you need to fix.”

56. There Is No “Sandbox” Or “Honeymoon” Period

There’s no such thing as a “Google sandbox,” where new pages are deliberately held back from appearing in search results.

Nor is there a “honeymoon period” where new pages get a boost in rankings because Google has a preference for fresh content.

When a new page is published, Google makes assumptions about where it should rank.

Sometimes those assumptions end up being inaccurate, which is why a page may initially rank high and then see an abrupt drop.

“In the SEO world this is sometimes called kind of like a sandbox where Google is like keeping things back to prevent new pages from showing up, which is not the case.

Or some people call it like the honeymoon period where new content comes out and Google really loves it and tries to promote it.

And it’s again not the case that we’re explicitly trying to promote new content or demote new content.

It’s just, we don’t know and we have to make assumptions.

And then sometimes those assumptions are right and nothing really changes over time.

Sometimes things settle down a little bit lower, sometimes a little bit higher.”

57. Low Traffic Does Not Mean Low Quality

Be careful not to assume that low traffic is an indication that a page is low quality.

A page can still be useful, and high quality, even if it doesn’t get as much traffic as other pages on a site.

So it’s important not to remove pages from your site just because they’re not getting as much traffic as you’d like to see.

“On some websites, pages that get low traffic are often almost like correlated with low quality as well but that doesn’t have to be the case.

On other websites it might just be that a lot of traffic goes to the head pages and the tail pages are just as useful but they’re useful for a much smaller audience.

So they get barely any traffic.

From our point of view, those websites are still useful and it’s still high quality.

I wouldn’t remove it just because it doesn’t get traffic.”

Conclusion

There are a lot of moving parts in SEO, and Google’s John Mueller helps the community make sense of how everything fits together.

Between his weekly live Q&A shows, dozens of daily tweets, occasional Reddit forum posts, and random bits of insight around the web, it’s a challenge to keep up with everything.

That’s what we’re here for.

Follow Search Engine Journal’s coverage for the latest insights from Mueller and everyone else in Google’s search relations team.

Featured image: pathdoc/Shutterstock

SEJ STAFF Matt G. Southern Senior News Writer at Search Engine Journal

Matt G. Southern, Senior News Writer, has been with Search Engine Journal since 2013. With a bachelor’s degree in communications, ...

