
SEO & JavaScript: The Good, the Bad & the Uncertainty

Search engines have made improvements in indexing JavaScript websites. But the question of whether they can properly render JS pages remains muddled.


JavaScript and SEO have long been a debated topic among developers and SEO experts.

Search engines have made, and continue to make, significant improvements in indexing JavaScript websites.

That said, the question of whether or not major search engines can properly render pages created using JavaScript remains muddled.

The Good: New Developments Ease Compatibility

Google and Bing both made SEO announcements related to JavaScript last year, revealing improvements that ease compatibility.

Google announced that they have started using the latest version of Google Chrome to render webpages, executing JavaScript, style sheets, and more.

Bing announced that they are adopting the new Microsoft Edge as the Bing Engine to render pages.

Bingbot will now render all web pages using the same underlying web platform technology already used by Googlebot, Google Chrome, and other Chromium-based browsers.

Both leading search engines also announced that they will make their solution evergreen, committing to regularly update their web page rendering engine to the most recent stable version of their browser.

These regular updates will ensure support for the latest features, a significant leap from the previous versions.

Search Engines Are Simplifying SEO by Leveraging the Same Rendering Technology

These developments from Google and Bing make it easier for web developers to ensure their websites and their web content management systems work for both search engines, without having to spend time investigating each rendering solution in depth.

With the exception of files that are robots.txt disallowed, the content that developers see and experience in the new Microsoft Edge browser or in Google Chrome is what the search engines will also see and experience.

For SEOs and developers, this saves time and money.

For example, there is:

  • No longer a need to keep Google Chrome 41 around to test how Googlebot renders pages.
  • No longer a need to escalate rendering issues to Bing.
  • No longer a need to maintain a compatibility list of which JavaScript functions and style sheet directives work for each search engine.

And the list goes on and on.

With all this great news and free time, does that mean a green light for JavaScript?

Likely not.

The Bad: JavaScript Is Still Facing Many Limitations & Risks

Long story short, JavaScript can complicate the search engines’ ability to read your page, leaving room for error, which could be detrimental for SEO.

When a search engine downloads a web document and starts analyzing it, the first thing it does is understand the document type.

If the document is a non-HTML file (for example, an HTTP redirect, PDF, image, or video), then there is no need to render the document using the JavaScript stack, as this type of content does not include JavaScript.

For HTML documents, if the search engines have enough resources, they will attempt to render the document using their optimized browser rendering solutions.

Problems start to surface when JavaScript is not directly embedded in the document.

<script type="text/javascript" src="https://www.domain.com/files/myjavascript.js"></script>

Search engines must download the file to read and execute it.

If the file is robots.txt disallowed, they won’t be able to.

If the file is allowed, search engines must still succeed in downloading it, facing per-site crawl quota and site unavailability issues.
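
For instance, a rule like the following in robots.txt (a hypothetical example, assuming the script lives under /files/ as above) would prevent crawlers from fetching the file, leaving the page only partially renderable:

  # Hypothetical robots.txt rule: it blocks crawlers from fetching any
  # JavaScript under /files/, so pages that depend on those scripts
  # cannot be fully rendered even though the HTML itself is crawlable.
  User-agent: *
  Disallow: /files/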

Search engines generally don’t perform complex actions such as clicking a button, so it is best to reference the file with a basic HTML <script> tag, as in the example above.

Another potential pitfall is that the JavaScript file may not be in sync with the cached version of the website. Search engines generally cache files for extended periods of time to avoid fetching every resource on a page too often.
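
One common mitigation (a hypothetical sketch, not something the search engines prescribe) is to version the file name, so that an updated script is fetched under a new URL rather than served from a stale cache:

  <!-- Hypothetical example: bumping the version in the file name forces
       crawlers to fetch the updated script instead of a stale cached copy. -->
  <script type="text/javascript" src="https://www.domain.com/files/myjavascript.v2.js"></script>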

JavaScript may also make HTTP requests to load content and additional resource files, which multiplies the chance of facing the issues explained above.
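
As an illustration, a script like the following (with a hypothetical endpoint and element ID) adds yet another request that must be allowed, downloaded, and executed before the content exists on the page:

  // Hypothetical example: the description only appears after this extra
  // request succeeds, so a blocked or failed call leaves the rendered
  // page without that content.
  fetch('https://www.domain.com/api/product-description')
    .then(response => response.json())
    .then(data => {
      document.getElementById('description').textContent = data.text;
    });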

The JavaScript included in these files or in the HTML may also not be compatible with the JavaScript engine used by the search engines.

When it’s not compatible, the search engine isn’t going to read it, and if we can’t read it, we’re not going to remember it.

With the recent move for search engines to use the same technology and commitment to updating their browsers, this should become easier to deal with in the future.

Also, don’t forget that the handling of JavaScript by the search engines is limited:

  • Search engines normalize URLs containing a #, dropping everything after the # (except for the legacy #! standard). For example, /page#section is treated the same as /page.
  • Search engines don’t generally click buttons and do other complex actions.
  • Search engines don’t wait long periods of time for pages to render.
  • Search engines don’t output complex interactive webpages.

JavaScript should not be the new Flash!

Keep in mind that every instance of JavaScript has to be read. When used excessively, it will slow down page speed, which is a ranking factor.

The Uncertainty: For Optimal SEO, Use JS Practically, Sparingly or Ideally, Not at All

For large websites, and for websites that want to get the most out of search engines, it is preferable to detect search engine crawlers based on their user agent (Bingbot, Googlebot) and output basic HTML without JavaScript, or with limited JavaScript.

Also, allow crawlers to access the HTML and text that you want indexed with a single HTTP request.
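
A minimal sketch of that approach, assuming a Node.js server with Express and a hypothetical prerenderPage() helper that returns static HTML, might look like this:

  // Minimal dynamic rendering sketch (hypothetical helper and file names).
  // Known crawler user agents receive pre-rendered HTML in a single
  // response; regular visitors get the normal client-side application.
  const express = require('express');
  const path = require('path');
  const app = express();

  const BOT_PATTERN = /bingbot|googlebot/i;

  // Hypothetical pre-rendering step: in practice the HTML would come from
  // a headless browser snapshot or a server-side rendering framework.
  function prerenderPage(url) {
    return '<html><body><h1>Pre-rendered content for ' + url + '</h1></body></html>';
  }

  app.get('*', (req, res) => {
    const userAgent = req.headers['user-agent'] || '';
    if (BOT_PATTERN.test(userAgent)) {
      res.send(prerenderPage(req.originalUrl));
    } else {
      res.sendFile(path.join(__dirname, 'index.html'));
    }
  });

  app.listen(3000);

The user agent check here is deliberately simple; production setups typically match a longer list of crawlers.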

There is also a concern that if a site feels the need to differentiate the experience it serves to bots from the JavaScript experience it serves to humans, it may be penalized for cloaking.

The good news is that Google and Bing both suggest there is no need to worry, as long as you output nearly the same text and content as what your human visitors see.

Google says:

“Currently, it’s difficult to process JavaScript and not all search engine crawlers are able to process it successfully or immediately. … we recommend dynamic rendering as a workaround solution to this problem. Dynamic rendering means switching between client-side rendered and pre-rendered content for specific user agents.”

Bing says:

“When it comes to rendering content specifically for search engine crawlers, we inevitably get asked whether this is considered cloaking… and there is nothing scarier for the SEO community than getting penalized for cloaking … The good news is that as long as you make a good faith effort to return the same content to all visitors, with the only difference being the content is rendered on the server for bots and on the client for real users, this is acceptable and not considered cloaking.”

Do or Don’t?

For SEO experts, it is preferable not to output JavaScript when search engine crawlers are visiting your webpages, assuming the HTML text content and formatting you return look nearly the same as what human visitors to your sites see.

If JavaScript has a purpose on the site and page, it can be fine to use it.

Be sure to understand the technical implications so that your documents can be properly indexed or consult with a technical SEO expert.

Search engines are incentivized to index your content to satisfy their customers.

If you come across issues, investigate them using the search engines’ webmaster tools, or contact the search engines directly.

