This is also referred to as client-side rendering, which requires multiple HTTP requests, compared to the single HTTP request needed to render static HTML.
Needless to say, the dozens of HTTP calls required to render a single page are not optimal. However, Bingbot has a way to deal with this.
Bing offers the following recommendations to minimize HTTP requests while ensuring its web crawler can render the most complete version of a site every time:
- Program the site to detect the Bingbot user agent
- Prerender the content on the server side and output static HTML
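The two recommendations above can be sketched as simple request-routing logic: inspect the User-Agent header and serve a prerendered static snapshot to known crawlers, while regular visitors receive the normal client-rendered app. This is a minimal, hypothetical illustration; the pattern, function names, and HTML snippets are placeholders, not anything specified by Bing.

```python
import re

# Illustrative crawler pattern; real deployments would match the
# user-agent strings documented by each search engine.
BOT_PATTERN = re.compile(r"bingbot|googlebot", re.IGNORECASE)

def should_prerender(user_agent: str) -> bool:
    """Return True when the request appears to come from a known crawler."""
    return bool(BOT_PATTERN.search(user_agent or ""))

def handle_request(user_agent: str) -> str:
    """Route crawlers to static HTML and real users to the JS app."""
    if should_prerender(user_agent):
        # Prerendered snapshot: same content, already rendered on the server.
        return "<html><body><h1>Article title</h1><p>Full content</p></body></html>"
    # Client-rendered version: the browser executes the bundle to build the page.
    return "<html><body><div id='app'></div><script src='/app.js'></script></body></html>"
```

Note that both branches must ultimately deliver the same content; the only difference is where the rendering happens, which is the distinction Bing draws below when addressing the cloaking question.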
The above recommendations will make crawling and indexing by Bing more predictable and should help other web crawlers as well.
The inevitable question Bing gets asked about rendering content differently for search crawlers is whether the practice is technically considered cloaking.
As long as the same content is shown to all visitors, it is not considered cloaking, Bing says.
Here is the exact quote:
“The good news is that as long as you make a good faith effort to return the same content to all visitors, with the only difference being the content is rendered on the server for bots and on the client for real users, this is acceptable and not considered cloaking.”