How Google handles JavaScript throughout the indexing process

Vercel published an article with research on how Googlebot deals with client-side JavaScript. People assume that sites rendered only with JavaScript are worse for SEO, and that worry is entirely reasonable: Google itself warns that there are limitations with client-side rendering and that “some pages may encounter problems with content not showing up in the rendered HTML. Other search engines may choose to ignore JavaScript and won’t see JavaScript-generated content.”

Their testing methodology makes sense: when a page is requested by Googlebot, wait for the page to finish loading entirely (regardless of rendering method), and only then send tracking data back to be stored and analyzed.
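
Something like the sketch below captures the idea, assuming a client-side script, a simple user-agent check for Googlebot, and a hypothetical `/api/render-tracking` endpoint. None of this is Vercel’s actual code; it’s just the shape of the approach.

```js
// Hedged sketch: if the visitor looks like Googlebot, wait for the window
// "load" event (so every rendering strategy has had a chance to finish),
// then beacon the rendered result back for later analysis.
const isGooglebot = /Googlebot/i.test(navigator.userAgent);

if (isGooglebot) {
  window.addEventListener("load", () => {
    const payload = JSON.stringify({
      url: location.href,
      // Rough proxy for "did the content actually end up in the DOM?"
      renderedHtmlLength: document.documentElement.outerHTML.length,
      timestamp: Date.now(),
    });
    // sendBeacon is asynchronous and survives the page being unloaded,
    // so it won't interfere with what the crawler sees.
    navigator.sendBeacon("/api/render-tracking", payload);
  });
}
```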

Their findings are mostly “this is fine”: literally every page was rendered in full, including pages with complex JavaScript and dynamic content. That said, the comparisons of rendering strategies show that client-side rendering still clearly fares the worst.

The testing was also done directly on nextjs.org, which I imagine Google sees as a pretty highly trafficked and important site, and is perhaps willing to send its most robust bots. Does your brand new site for Jimbob’s Bakery get the same treatment?
