How Tabbed Content Might Be Hurting Your Search Rankings

Cai Simpson

Putting content behind tabs is a common way to keep a website's body content clean and concise, offering the user the ability to show or hide content with a single click.

But is tabbed content a good thing for your search engine optimization efforts?

Recently, Gary Illyes of Google confirmed that Google is OK with tabbed content.

So the question isn’t whether tabbed content is SEO-friendly or not but how to implement it right.

How Badly Implemented Tabbed Content Can Hurt Your Rankings

As you know, search engine crawlers have had a hard time reading JavaScript over the years.

Since 2014, Google has been working to render and understand JavaScript, which has become essential in modern-day website design. However, the process is still far from perfect, as JavaScript is a complicated yet beautiful thing.

We don’t know exactly what Google can and can’t read with JavaScript, so the best thing you can do at this point is to make sure that your JS files are readable and not disallowed in your robots.txt.
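As an illustration, a robots.txt rule like the following (the /js/ path here is hypothetical) would stop Googlebot from fetching your scripts, meaning any content those scripts generate can never be rendered or indexed:

```
# Blocking your script directory prevents Googlebot from rendering
# any content that JavaScript generates on the page.
User-agent: *
Disallow: /js/
```

Removing such a Disallow rule (or adding an explicit Allow for your script paths) ensures crawlers can at least attempt to render your JavaScript.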

When a crawler reads your JS statements, it will either be able to understand them or not. If a script isn't understood, any content generated within it won't be displayed or rendered, meaning that your well-structured content may be of no value to your search aspirations.

A good way to see if your content is being read is to use the Fetch as Google function within Google Search Console, which displays both a rendered version for Googlebot and how a visitor will see the page.

Fetch as Google

So how does this relate to tabbed content?

Tabbed content generated purely by JavaScript may not be visible to search engines: crawlers don't rank JavaScript code itself, so text that exists only inside a script is treated as part of the code and ignored.

Here's an example of a standard piece of tabbed content – notice how it doesn't contain a standard hyperlink:

<button class="tablinks" onclick="OPENTAB(event, 'EVENTNAME')" id="defaultOpen">TAB TITLE</button>
<div id="EVENTNAME" class="tabcontent"></div>
 
<script>
// The tab's text exists only inside this script – it never
// appears in the page's HTML source.
var theDiv = document.getElementById("EVENTNAME");
var content = document.createTextNode("<YOUR_CONTENT>");
theDiv.appendChild(content);
// Open the default tab on page load.
document.getElementById("defaultOpen").click();
</script>

Googlebot won't be able to read your tab because <YOUR_CONTENT> is part of the JavaScript code, and Google simply doesn't give any weight to it. If you have five tabs with 200 words in each, you're losing out on 1,000 words on your page.

This is bad news for your webpage: it significantly lowers your content quality, not to mention missing out on those keyword-rich and relevant pieces of content.

What You Can Do to Fix This Issue

Because tabbed content often depends on the kind of code showcased above, the most reliable fix is a CSS-based technique: place the text for every tab in the HTML as separate div containers, hide the inactive ones with display:none, and switch which container is displayed when the user clicks a tab.
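A minimal sketch of this approach (the class names, IDs, and showTab function are illustrative, not from a specific library): all of the tab text is present in the HTML source from the start, so crawlers can read it, and the script only toggles visibility.

```
<!-- Every tab's text is in the HTML source, readable by crawlers. -->
<button class="tablinks" onclick="showTab('tab1')">Tab 1</button>
<button class="tablinks" onclick="showTab('tab2')">Tab 2</button>

<div id="tab1" class="tabcontent">First tab's keyword-rich content...</div>
<div id="tab2" class="tabcontent" style="display:none">Second tab's content...</div>

<script>
function showTab(id) {
  // Hide every panel, then reveal only the requested one.
  var panels = document.getElementsByClassName("tabcontent");
  for (var i = 0; i < panels.length; i++) {
    panels[i].style.display = "none";
  }
  document.getElementById(id).style.display = "block";
}
</script>
```

The key design point is that JavaScript here only changes presentation; with scripting disabled (or unrendered by a crawler), the text is still present in the document.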

Here is a real-life example of such an implementation.

tabbed-content-example

When you click on a section row, it displays the text behind it. If you need an in-depth how-to example, you can find one here.

Wondering whether hidden text might be a problem for Google? Here is John Mueller's confirmation that Google is okay with content not being visible by default, especially in a mobile-first indexing era.

So aim for a well-structured, well-designed page that enables Googlebot to crawl your website effectively, indexing every single piece of content you have to offer.

We ran tests over many months to measure how JavaScript-based tabbed content affects rankings. All pages in this test were optimized with keyword-rich content but received no external links during the course of the test. We simply created the pages, noted which pages were not ranking well, and made one simple page edit – the removal of badly implemented tabbed content.

Tabbed content research search engine rankings

From the table above, you can quite clearly see the effect that tabbed content removal had on search engine rankings. Pages that previously struggled to get onto Page 1 are now on Page 1 for their target keywords (and have remained there). Target keywords that were already on Page 1 have gained position, pushing for that top slot.

Keyword 8, for example, wasn't gaining any position. Once the tabbed content was removed, the page dropped slightly; over time, however, it went from Page 3 to Page 1 within a matter of months.

Keyword 2 gradually started to pick up position, then instantly jumped to Page 1, which indicates that the page was most likely crawled naturally at that point. Once Googlebot noticed the rich content that was no longer hidden, it decided the page was worthy of Page 1 status.

Arguments Against These Findings

When creating an SEO rule such as this, considering other factors is also essential. Are there any other reasons why these pages have grown?

Below is a short list of other possible factors that may have affected these search engine results:

  • Page age: As webpages get older, trust grows, which can affect position. The pages showcased in this research didn't contain any information indicating the date the page was created.
  • Natural organic external links: No external links were built to these pages in the time we were monitoring the research.
  • Algorithm changes: The only algorithm change that would have helped this would have been the ongoing content quality updates. This means that ongoing algorithm changes would benefit pages that showcase more (and relevant) content.
  • Page creation: The pages weren't gaining any position in the 3 months following their creation.
  • Existing on-page optimization: When the pages were created, standard SEO was applied, including meta optimization, content creation, header optimization and image optimization.

Will Google Ever Be Able To Read Tabbed Content?

We don't know whether search engine bots will ever be able to fully read JavaScript-based code, so any future-proofing of this kind of web development is purely speculative.

For now, the research shows us that, typically, standard JavaScript isn't read, and SEO professionals should therefore keep a keen eye out for any JavaScript that may be harming a website's ability to rank.


Image Credits
Featured Image: royguisinger/Pixabay

In-post Images: Screenshots by Cai Simpson. Taken August 2017.

Cai Simpson

Digital Marketing Manager at Bravr Digital Marketing

Cai Simpson is the Digital Marketing Manager at Bravr, a digital marketing agency based in London, Vancouver and Devon. Cai ...
