Editor’s Note: This is a section of our completely redone SEO Guide. Enjoy!
Google’s algorithm has multiple named components that influence search rankings. Google Panda is the part of the algorithm specific to the quality of content, Penguin is specific to the quality of links, and Hummingbird is the part that handles conversational search queries accurately.
Google Panda takes the quality of a site’s content into account when ranking sites in the search results. Sites with lower quality content are likely to find themselves negatively impacted by Panda. As a result, higher quality content surfaces higher in the search results: quality content is rewarded with higher rankings, while low-quality content drops.
When Panda originally launched, many saw it as a way for Google to target content farms specifically, which were becoming a major problem in the search results; their extremely low-quality content tended to rank due to sheer volume. These sites were publishing enormous amounts of low-quality content very quickly, on topics they had researched very little, and it was obvious to any searcher who landed on one of those pages.
Google has now evolved Panda to be part of the core algorithm. Previously, we had a known Panda update date, making it easier to identify when a site was hit or had recovered from Panda. Now it is part of a slow rolling update, lasting months per cycle. As a result, it is hard to know whether a site is negatively impacted by Panda or not, other than doing a content audit and identifying factors that sites hit by Panda tend to have.
User Generated Content
It is important to note that Panda does not target user-generated content specifically, something that many webmasters are surprised to learn. While Panda can impact user-generated content, it tends to hit sites producing very low-quality content, such as spammy guest posts or forums filled with spam.
Do not remove your user-generated content, whether it is forums, blog comments, or article contributions, simply because you heard it is “bad” or saw it marketed as a “Panda proof” solution. Look at it from a quality perspective instead. Many high-ranking sites, such as Stack Overflow, are built on user-generated content, and many sites would lose significant traffic and rankings if they removed that type of content. Even comments made on a blog post can help it rank and even earn a featured snippet.
Word count is another aspect of Panda that is often misunderstood by SEOs. Many sites make the mistake of refusing to publish any content unless it is above a certain word count, with 250 and 350 words often cited as minimums. Instead, Google recommends you think about how many words the content needs to be successful for the user.
For example, there are many pages out there with very little main content, yet Google thinks the page is quality enough that it has earned the featured snippet for the query. In one case, the main content was a mere 63 words, and many would have been hard pressed to write about the topic in a non-spammy way that was 350+ words in length. So you only need enough words to answer the query.
Content Matches the Query
Ensuring your content matches the query is also important. If you see Google is sending traffic to your page for specific queries, ensure that your page is answering the question searchers are looking for when they land there. If it is not, it is often as simple as adding an extra paragraph or two to ensure that this is happening.
As a bonus, these are the types of pages – ones that answer a question or implied question – that Google not only looks to rank well but also awards the featured snippet for the query to.
Technical SEO also does not play any role in Panda. Panda looks just at the content, not things like whether you are using H1 tags or how quickly your page loads for users. That said, technical SEO can be a very important part of SEO and ranking in general, so it should not be ignored. But it does not have any direct impact on Panda specifically.
If you are struggling to determine whether a particular piece of content is considered quality or not, there is one surefire way to confirm. Look at the individual page in Search Analytics or your site’s analytics program, such as Google Analytics. If Google is ranking a page and sending it traffic, then clearly it views the page as quality enough to show high in the search results, since people are landing there from Google’s search results.
However, if a page is not getting traffic from Google, it does not automatically mean it is bad, but the content is worth looking at closer. Is it simply newer and has not received enough ranking signals to rank yet? Do you see areas of improvement you can make by adding a paragraph or two, or changing the title to match the content better? Or is it truly a garbage piece of content that could be dragging the site down the Panda hole?
Also, do not forget that there is traffic outside of Google. You may question a page because Google is not sending it traffic, but perhaps it does amazingly well in Bing, Baidu, or one of the other search engines instead. Diversity in traffic is always a good thing, and if you have pages that Google might not be sending traffic to but that are getting traffic from other search engines, other sites, or social media shares, then removing that content would be the wrong decision to make.
How to prevent Google Panda from negatively impacting your site is pretty simple. Create high-quality, unique content that answers the question searchers are asking.
Reading content out loud is a great way to tell whether content is high-quality or not. When content is read aloud, things like overuse of repeated keywords, grammatical errors, and other signals that the content is less than quality suddenly stand out. Read it aloud yourself and edit as you go, or ask someone else to read it so you can flag what should be changed.
The second major Google algorithm is Penguin. Penguin deals solely with link quality and nothing else. Sites that have purchased links or have acquired low-quality links through places such as low-quality directories, blog spam, or link badges and infographics could find their sites no longer ranking for search terms.
Who Should Worry about Penguin?
Most sites do not need to worry about Penguin unless they have done some sketchy link building in the past or have hired an SEO who might have engaged in those tactics. Even if the site owner was not aware of what an SEO was doing, the owner is still ultimately responsible for those links. That is why site owners should always research an SEO or SEO agency before hiring.
If you have done link building in the past using tactics that were accepted at the time but are now against Google’s webmaster guidelines, you could be impacted by Penguin. For example, guest blogging was fine years ago, but it is not a great way to build links now unless you are choosing your sites well. Likewise, asking site visitors or members to post badges that linked to your site was also fine previously, but will now likely result in Penguin or a link manual action.
Algorithmic Penguin and Link Manual Actions
Penguin is strictly algorithmic in nature. It cannot be lifted by Google manually, regardless of the reason why those links might be pointing to a website.
Confusing the issue slightly is that there is a separate manual action for low-quality links and that one can be lifted by Google once the links have been cleaned up. This is done with a reconsideration request in Google Search Console. And sites can be impacted by both a linking manual action and Penguin at the same time.
Incoming Links Only
Penguin only deals with a site’s incoming links. Google only looks at the links pointing to the site in question and does not look at that site’s outgoing links at all. Note, however, that there is also a Google manual action related directly to a site’s outgoing links (which is different from the regular linking manual action), so the pages and sites you link to could result in a manual action and the deindexing of your site until those links are cleaned up.
Finding Your Backlinks
If you suspect your site has been negatively impacted by Penguin, you need to do a link audit and remove or disavow the low-quality or spammy links. Google Search Console includes a list of backlinks for site owners, but be aware that it also includes links that are already nofollowed. If a link is nofollowed, it will not have any impact on your site, but keep in mind that the site could remove that nofollow in the future without warning.
There are also many third-party tools that will show links to your site, but because some websites block those third-party bots from crawling, no single tool will be able to show you every link pointing at your site. And while some of the sites blocking these bots are high-quality, well-known sites that do not want to waste bandwidth on those bots, some spammy sites also block them to hide their low-quality links from being reported.
Assessing Link Quality
When it comes to assessing the links, this is where many have trouble. Do not assume that because a link comes from an .edu site that it is high-quality. There are plenty of students who sell links from their personal websites on those .edu domains which are extremely spammy and should be disavowed. Likewise, there are plenty of hacked sites within .edu domains that have low-quality links.
Do not make judgments strictly based on the type of domain. While you cannot make automatic assumptions about .edu domains, the same applies to all TLDs and ccTLDs. Google has confirmed that being on a specific TLD does not by itself help or hurt search rankings. But you do need to make individual assessments. There is a long-running joke that there has never been a quality page on a .info domain because so many spammers were using them, but in fact there are some great quality links coming from that TLD, which shows why individual assessment of links is so important.
Beware of Links from Presumed High-Quality Sites
Do not look at the list of links and automatically consider links from specific websites as being a great quality link, unless you know that very specific link is high quality. Just because you have a link from a major website such as Huffington Post or the BBC does not make that an automatic high-quality link in the eyes of Google – if anything, you should question it more.
Many of those sites are also selling links, albeit some disguised as advertising or placed by a rogue contributor selling links within their articles. That links from high-quality sites can themselves be low-quality has been confirmed by many SEOs who have received link manual actions that include links from these sites in Google’s examples. And yes, such links could likely be contributing to a Penguin issue.
As advertorial content increases, we are going to see more and more links like these get flagged as low-quality. Always investigate links, especially if you are considering not removing any of them simply based on the site the link is from.
As with advertorials, you need to think about any links pointing to your site that could be considered promotional. Paid links do not always mean money was exchanged for them.
Examples of promotional links that are technically paid links in Google’s eyes include any links given in exchange for a free product to review or a discount on products. While these types of links were fine years ago, they now need to be nofollowed. You will still get value from the link, but through brand awareness and traffic rather than rankings. You may have links out there from a promotional campaign done years ago that are now negatively impacting a site.
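As a sketch of what nofollowing such a link looks like in practice, the markup change is a single `rel` attribute on the anchor tag (the URL and anchor text below are hypothetical examples):

```html
<!-- A compensated link without nofollow: passes ranking signals,
     which Google treats as a paid link -->
<a href="https://example.com/widget-review">Great widget</a>

<!-- The same link nofollowed: still sends traffic and brand
     awareness, but passes no ranking credit -->
<a href="https://example.com/widget-review" rel="nofollow">Great widget</a>
```

The nofollowed version keeps the promotional relationship transparent to Google while preserving the link’s value as a source of referral traffic.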
For all these reasons, it is vitally important to assess every link individually. You want to remove the poor quality links because they are contributing to a Penguin issue or could cause a future manual action. But you do not want to remove the good links, because those are the links helping your rankings in the search results.
Promotional links that are not nofollowed can also trigger the manual action for outgoing links on the site that placed those links.
Editor Note: Removing links and submitting a disavow request is also covered in more detail in the ‘What to Do When Things Go Wrong’ section of our SEO Guide.
Once you have gone through your backlinks and determined that there are some that should be removed or disavowed, you will need to get these links removed. You should first approach site owners and ask them to remove the links pointing to your site. If removals are unsuccessful, add those URLs to a disavow file, one you will submit to Google.
There are tools that will automate the link removal requests and agencies that will handle the requests as well, but do not feel it is necessary to do this. Many webmasters find contact forms or emails and will do it themselves.
Some site owners will demand a fee to remove a link from a site, but Google recommends not paying for link removals. Just include them in your disavow file instead and move on to the next link removal. Some site owners are using link removals to generate revenue, so the practice is becoming more common.
Creating and Submitting a Disavow File
The next step in cleaning up Penguin issues is to submit a disavow file. The disavow file is a file you submit to Google telling them to ignore all the links included in it, so that those links will not have any impact on your site. The result is that the negative links will no longer cause ranking issues for your site, such as with Penguin. It also means, however, that if you erroneously included high-quality links in your disavow file, those links will no longer help your ranking. This is another reason it is so crucial to check your backlinks carefully before deciding to remove them.
If you have previously submitted a disavow file to Google, they will replace that file with your new one, not add to it. So it is important to make sure that if you have previously disavowed links, you still include those links in your new disavow file. You can always download a copy of the current disavow file in Google Search Console.
Disavowing Individual Links Versus Domains
It is recommended that you disavow links on a domain level instead of disavowing individual links. There will be some cases where you will want to disavow specific individual links, such as on a major site that has a mix of quality and paid links. But for the majority of links, you can do a domain-based disavow. Then Google only needs to crawl one page on that site for the link to be discounted for your site.
Doing domain-based disavows also means you do not have to worry about whether those links are indexed as www or non-www, as the domain-based disavow takes this into account.
What to Include in a Disavow File
You do not need to include any notes in your disavow file, unless they are strictly for your reference. It is fine just to include the links and nothing else. Google does not read any of the notations you have made in your disavow file, as they process it automatically without a human ever reading it. Some find it useful to add internal notations, such as the date a group of URLs was added to the disavow file or comments about their attempts to reach the webmaster about getting a link removed.
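Putting the pieces above together, a minimal disavow file might look like the following (the domains and URLs are hypothetical examples). Lines beginning with # are comments for your own reference only, since Google processes the file automatically without reading them:

```text
# Outreach attempted, site owner did not respond
# Domain-level disavow covers every link from this site
domain:spammy-directory.example

# A single paid link on an otherwise quality site,
# so only the specific URL is disavowed
https://bigsite.example/article-with-paid-link.html
```

Note the two forms shown: the `domain:` prefix disavows every link from a site, while a bare URL disavows only that one page’s links.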
Once you have uploaded your disavow file, Google will send you a confirmation. But while Google will process it immediately, it will not immediately discount those links. So you will not instantly recover from submitting the disavow alone. Google still needs to go out and crawl those individual links you included in the disavow file, but unfortunately the disavow file itself will not prompt Google to crawl those pages specifically.
It can take six or more months for all those individual links to be crawled and disavowed. And no, there is no way to determine which links have been discounted and which ones have not been, as Google will still include both in your linking report in Google Search Console.
Speeding Up the Disavow Process
There are ways you can speed up the disavow process. The first is using domain-based disavows instead of individual links. The second is not to waste time including lengthy notations for Google’s benefit, so that you can submit your disavow faster. Because reconsideration requests require you to submit more details, some misunderstand and believe the disavow needs more details, too.
Lastly, if you have undergone any changes in your domain, such as switching to https or switching to a new domain, you need to remember to upload that disavow file to the new domain property in Google Search Console. This is one step that many forget to do, and they can be impacted by Penguin or the linking manual action again, even though they have cleaned it up previously.
Recovery from Penguin
When you recover from Penguin, do not expect your rankings to go back to where they used to be before Penguin, nor for the return to be immediate. Far too many site owners are under the impression that they will immediately begin ranking at the top for their top search queries once Penguin is lifted.
First, some of the links that you disavowed were likely contributing to an artificially high ranking, so you cannot expect those rankings to be as high as they were before. Second, because many site owners have trouble assessing the quality of the links, some high-quality links inevitably get disavowed in the process, links that were contributing to the higher rankings.
Add to the mix the fact that Google is constantly changing its ranking algorithm, so factors that benefited you previously might not have as big an impact now, and vice versa.
Compensated Links via Badges and More
Also be aware of any link building campaigns you are doing, or legacy ones that could come back to impact your site. This would include things like badges you have given to other site owners to place on their sites or the requirement that someone includes a link to your site to get a directory listing or access something. In simple terms, if the link was placed in exchange for anything, it either needs to be nofollowed or disavowed.
When it comes to the disavow files people use to clean up poor quality links, there is a concern that a site could be hurt by competitors placing its URLs into a disavow file uploaded to Google. But Google has confirmed that they do not use the URLs contained within a disavow file for ranking, so even if your site appears in thousands of disavows, it will not hurt. That said, if your site legitimately appears in thousands of disavows, then it probably has a quality issue you should fix.
There is also the negative SEO aspect of linking, where some site owners worry that a competitor could buy spammy links and point them to their site. And many use negative SEO as an excuse when their site gets caught by Google for low-quality links.
If you are worried about this, you can proactively disavow the links as you notice them. But Google has said they are pretty good about recognizing this when it happens, so it is not something most website owners need to worry about.
Real Time Penguin
Google is expected to release a new version of Penguin soon, which will have one very notable change. Instead of site owners needing to wait for a Penguin update or refresh, the new Penguin will be real-time. This is a huge change for those dealing with the impact of spammy links and the long waits many have had to endure after cleaning up.
Google Hummingbird is part of the main Google search algorithm and was the first major change to the algorithm since 2001. What is different about Hummingbird is that it is not a spam-targeting algorithm, but rather an algorithm meant to ensure Google serves the best results for specific queries. Hummingbird is about understanding search queries better, particularly with the rise of conversational search.
It is believed that Hummingbird is positively impacting the types of sites that are providing high-quality content that reads well to the searcher and is providing answers to the question the searcher is asking, whether it is implied or not.
Hummingbird also impacts long-tail search queries, similar to how RankBrain helps those types of queries. Google wants to ensure that it can provide high-quality results for longer queries. For example, instead of sending a specific question about a company to the company’s homepage, Google will try to serve an internal page on the site about that specific topic or issue instead.
You cannot optimize for Hummingbird, outside of optimizing for the rise of conversational search. Longer search queries, such as those we see with voice search and the types of queries searchers tend to make on mobile, are often conversational in nature. And optimizing for conversational search is easier than it sounds: make sure your content is highly readable and can answer those longer-tail queries as well as shorter ones.
Like RankBrain, Hummingbird had been released for a period before it was announced, and SEOs did not particularly notice anything different regarding the rankings. It is not known how often Hummingbird is updated or changed by Google.