
How Google handles JavaScript throughout the indexing process

Vercel published an article with research on how Googlebot deals with client-side JavaScript. People assume that sites rendered only with JavaScript are worse for SEO, and they worry about it, which is entirely reasonable, as Google itself warns that there are limitations with client-side rendering and that “some pages may encounter problems with content not showing up in the rendered HTML. Other search engines may choose to ignore JavaScript and won’t see JavaScript-generated content.”

Their testing approach makes sense: if a page is hit by Googlebot, wait for the page to finish loading entirely (regardless of rendering method), and only then send tracking data back to store and analyze.
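In browser terms, the general idea might look something like this. This is a minimal sketch of the concept, not Vercel’s actual implementation: the Googlebot user-agent check, the post-load timing, and the `/collect` endpoint are all assumptions for illustration.

```ts
// Sketch: detect a likely Googlebot visit, wait for the page to fully load,
// then beacon back what was actually rendered.

function isLikelyGooglebot(): boolean {
  return /Googlebot/i.test(navigator.userAgent);
}

window.addEventListener("load", () => {
  if (!isLikelyGooglebot()) return;

  // Give client-rendered content a moment to settle before sampling the DOM.
  setTimeout(() => {
    const payload = JSON.stringify({
      url: location.href,
      renderedAt: Date.now(),
      // A rough proxy for "did the JS-generated content show up?"
      renderedHtmlLength: document.documentElement.outerHTML.length,
    });

    // "/collect" is a hypothetical logging endpoint for this sketch.
    navigator.sendBeacon("/collect", payload);
  }, 0);
});
```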

Their findings are mostly “this is fine”: literally every page was rendered in full, including pages with complex JavaScript and dynamic content, although their comparisons of rendering strategies show that client-side rendering is still clearly the worst.

This testing was also done directly on nextjs.org, which I imagine Google sees as a pretty highly trafficked and important site, one it’s perhaps willing to send its most robust bots to. Does your brand-new site for Jimbob’s Bakery get the same treatment?

