
JavaScript SEO: How Google Renders SPAs, React, and Angular

Let’s cut through the noise. This guide explains exactly how Google’s JavaScript SEO rendering works for modern frameworks and how to stop guessing and start auditing.

Stop Trusting Google to ‘Just Figure Out’ Your JavaScript

Relying on Google to ‘just figure out’ your JavaScript-heavy website is a fantastic way to get your most important pages de-indexed. It’s a common refrain from development teams: ‘Google can render JavaScript now, it’s fine.’ This statement is both true and dangerously misleading.

Yes, Googlebot has evolved. It now runs a Web Rendering Service (WRS) based on a recent version of Chrome. But understanding the nuances of this JavaScript SEO rendering process is the difference between a high-ranking site and one that’s invisible to search.

This guide isn’t about hypotheticals. It’s a technical breakdown of how rendering actually works, the common failure points for Single Page Applications (SPAs) built with React, Angular, or Vue, and how to audit your site to ensure search engines see the same content your users do.

The Two Waves of Indexing: A Recipe for Delay and Despair

Google doesn’t process your JavaScript-powered site in one clean pass. It uses a two-wave indexing process, and the gap between those waves is where SEO performance goes to die.

Wave One: Googlebot fetches your initial HTML. It’s fast and efficient. It checks `robots.txt`, grabs the HTML, and immediately indexes any content and links it finds. For a traditional, server-rendered site, the process often ends here.

For an SPA, that initial HTML is often just an empty shell with a link to a massive `.js` bundle. Googlebot sees a mostly blank page, notes that it requires rendering, and throws it onto a massive queue for the second wave.

Wave Two: Sometime later—minutes, days, or even weeks—the Web Rendering Service finally gets around to your page. It executes the JavaScript, renders the final DOM, and sends that back for indexing. Only now does Google see your actual content, links, and metadata.

This delay is a killer for time-sensitive content and creates a massive drain on your Crawl Budget. Every page that requires rendering is a page that costs more resources to index, slowing down the discovery of new and updated content across your entire site.
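The two-wave split can be made concrete with a toy triage function. This is an illustrative sketch only; Google’s real heuristics are not public, and the 50-character threshold below is an arbitrary assumption:

```javascript
// Toy model of wave-one triage (an illustration, not Google's actual logic):
// index whatever text exists in the raw HTML; if the page is a near-empty
// shell that depends on a script bundle, queue it for the rendering wave.
function waveOneTriage(rawHtml) {
  const text = rawHtml
    .replace(/<script[\s\S]*?<\/script>/g, '') // drop inline script bodies
    .replace(/<[^>]+>/g, ' ')                  // strip tags, keep text
    .trim();
  const hasBundle = /<script[^>]*src=/.test(rawHtml);
  return {
    indexedText: text,
    needsRendering: text.length < 50 && hasBundle, // arbitrary threshold
  };
}

const serverRendered =
  '<html><body><h1>JavaScript SEO: the complete rendering guide</h1>' +
  '<p>Everything a crawler needs is already here in the raw HTML payload.</p></body></html>';
const spaShell =
  '<html><body><div id="root"></div>' +
  '<script src="/static/js/main.bundle.js"></script></body></html>';

console.log(waveOneTriage(serverRendered).needsRendering); // false — indexed in wave one
console.log(waveOneTriage(spaShell).needsRendering);       // true — joins the render queue
```

The point of the sketch: only the shell page pays the rendering tax, and every page that pays it waits in that queue.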

Warning

The rendering queue is not a FIFO (First-In, First-Out) system. Google prioritizes pages for rendering based on its own internal signals, like PageRank. Your new, low-authority product page could be waiting a very long time.

Client-Side vs. Server-Side: The Core of JavaScript SEO Rendering

The root of most JavaScript SEO issues lies in the rendering pattern you choose. If you don’t understand the difference between client-side and server-side rendering, you’re flying blind.

The debate isn’t just academic; it directly impacts what Googlebot can and cannot see on that first pass. Let’s break down the main contenders.

  • Client-Side Rendering (CSR): This is the default for frameworks like React (using Create React App) and Angular. The server sends a minimal HTML document, and the user’s browser executes JavaScript to render the rest of the page. This is the pattern that relies entirely on Google’s second wave of indexing.
  • Server-Side Rendering (SSR): The server renders the full HTML of the page before sending it to the browser. When the crawler (or user) requests the page, they receive a complete document. This is ideal for SEO as all content is available in the first wave. Frameworks like Next.js (for React) and Angular Universal provide SSR capabilities.
  • Static Site Generation (SSG): Every page is pre-rendered into a static HTML file at build time. This offers the best performance and SEO friendliness, as there’s no server-side computation needed per request. It’s perfect for blogs, documentation, and marketing sites where content doesn’t change in real-time.
  • Dynamic Rendering: A workaround, not a strategy. This involves detecting user-agent and serving a pre-rendered version to bots (like Googlebot) and the standard client-side rendered version to human users. It’s a stopgap for when you can’t implement true SSR.
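To make the CSR/SSR contrast concrete, here is a minimal sketch of what a crawler receives in each case. It’s a plain Node.js illustration; `renderProductPage` is a hypothetical server-side template function, not a real framework API:

```javascript
// SSR: the full, crawlable HTML is assembled on the server, so all content
// (title, canonical, heading, body text) is present in the wave-one fetch.
function renderProductPage(product) {
  return [
    '<!doctype html>',
    '<html><head>',
    `<title>${product.name}</title>`,
    `<link rel="canonical" href="https://example.com/products/${product.slug}">`,
    '</head><body>',
    `<h1>${product.name}</h1>`,
    `<p>${product.description}</p>`,
    '</body></html>',
  ].join('\n');
}

// CSR: by contrast, the initial HTML is an empty shell; the crawler must
// queue the page for the second (rendering) wave to see anything at all.
const csrShell =
  '<!doctype html><html><head></head>' +
  '<body><div id="root"></div><script src="/bundle.js"></script></body></html>';

const html = renderProductPage({
  name: 'Widget',
  slug: 'widget',
  description: 'A fine widget.',
});
console.log(html.includes('<h1>Widget</h1>')); // true — visible in wave one
console.log(csrShell.includes('Widget'));      // false — nothing for the crawler
```

Frameworks like Next.js or Angular Universal do exactly this assembly step for you on each request (SSR) or at build time (SSG).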

Common JavaScript SEO Rendering Catastrophes (And How to Spot Them)

Theory is great, but let’s talk about the real-world problems we see in audits every day. These are the subtle-but-deadly issues that can make your site invisible to search, even if it looks perfect to a human user.

The most common culprit is improper linking. Developers often use non-anchor tags for navigation because it’s convenient within a framework’s component model. This is a disaster for crawlers.

A crawler looks for `<a>` tags with `href` attributes. It does not execute arbitrary JavaScript `onClick` events. If your site’s navigation is built on `div` or `span` elements, you don’t have a website; you have a digital island that Google can’t navigate.

Here’s what that anti-pattern looks like in the wild:

<!-- DO NOT DO THIS. This is invisible to crawlers. -->
<div class="fancy-button" onClick="navigateTo('/about-us')">
  Learn More About Our Team
</div>

<!-- DO THIS INSTEAD. This is crawlable and accessible. -->
<a href="/about-us" class="fancy-button">
  Learn More About Our Team
</a>
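To see why the `div` version fails, here is a rough sketch of the link discovery a crawler performs on raw HTML. The regex is a simplified stand-in for a real HTML parser, but the principle holds:

```javascript
// Simplified stand-in for a crawler's link discovery: it scans the raw
// HTML for <a href="..."> attributes and never executes onClick handlers.
function extractCrawlableLinks(html) {
  const links = [];
  const anchorHref = /<a\s[^>]*href="([^"]+)"/g;
  let match;
  while ((match = anchorHref.exec(html)) !== null) {
    links.push(match[1]);
  }
  return links;
}

const divNav =
  '<div class="fancy-button" onClick="navigateTo(\'/about-us\')">Learn More</div>';
const anchorNav =
  '<a href="/about-us" class="fancy-button">Learn More</a>';

console.log(extractCrawlableLinks(divNav));    // [] — the crawler finds nothing
console.log(extractCrawlableLinks(anchorNav)); // ['/about-us']
```

The `onClick` URL is just an opaque string inside an attribute; without executing your router, no link is ever discovered.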
  • Content Hidden Behind User Actions: If content only loads after a user clicks a ‘Read More’ button or scrolls to a certain point (infinite scroll), search engines will likely never see it. Bots don’t click buttons for fun.
  • Missing or Injected Metadata: Critical SEO tags like the `<title>`, meta description, and canonical link must be present in the initial HTML source. If they are only injected by JavaScript after rendering, they may be missed or processed too late.
  • Blocked Resources in robots.txt: If you disallow crawling of the `.js` or `.css` files needed to render the page, Googlebot will see a broken, likely blank, page. Always ensure critical rendering resources are crawlable.

How to Audit Your JavaScript SEO Rendering with ScreamingCAT

You can’t fix what you can’t see. Guessing about rendering is a losing game. You need to crawl your site the way Google does, which means using a tool that can process JavaScript.

ScreamingCAT is built for this. Its crawler integrates a headless Chromium browser to render pages, allowing you to compare the initial HTML with the fully rendered DOM. This is the only way to diagnose rendering issues at scale.

To get started, enable JavaScript rendering in the crawl configuration: go to `Configuration > Spider > Rendering` and select `JavaScript` from the dropdown. ScreamingCAT will then crawl each URL twice: once for the raw HTML and again after executing the page’s JavaScript.

Once the crawl is complete, the magic happens. Add columns to the main report for ‘Rendered Word Count’, ‘Rendered H1’, or ‘Rendered Canonical’ and compare them against the non-rendered versions. Any significant discrepancy is a red flag pointing to a JavaScript SEO rendering problem.

For a more detailed walkthrough, check out our full JavaScript Rendering Tutorial (/blog/tutorials/javascript-rendering-headless-chrome-screamingcat/). It covers the exact settings and reports you need to become a rendering expert.

Pro Tip

Use the ‘View Source’ tab in ScreamingCAT’s lower pane. You can instantly toggle between ‘Original HTML’ and ‘Rendered HTML’ for any selected URL. It’s the fastest way to visually confirm whether your most important content or links are missing from the initial source.

Is Dynamic Rendering Still a Viable Strategy?

Let’s be blunt: dynamic rendering is a technical-debt bandage. It was a necessary evil years ago when search engine renderers were less capable, but on the modern web it should be a last resort.

The strategy involves using a service like Puppeteer or Rendertron to create a pre-rendered static HTML snapshot of your page. Your server then detects the user-agent of each request. If it’s a known bot (like Googlebot), you serve the snapshot. If it’s a human user, you serve the normal client-side rendered application.

While Google officially supports this and doesn’t consider it cloaking (as long as the content is equivalent), it adds a significant layer of complexity and fragility to your architecture. You now have two versions of your site to maintain and debug. It’s one more thing that can, and will, break.

If you’re working with a legacy SPA and have no developer resources to implement SSR, dynamic rendering can save your SEO. But if you’re building a new application or have the ability to refactor, invest in a proper long-term solution like SSR or SSG. Your future self will thank you.

“While we support dynamic rendering and are processing it, we generally don’t recommend it. We think a better long-term solution is to use server-side rendering, static rendering, or hydration.”
— Martin Splitt, Google Search Relations

Key Takeaways

  • Google uses a two-wave indexing process, which can delay the indexing of content that relies on client-side JavaScript rendering.
  • Server-Side Rendering (SSR) and Static Site Generation (SSG) are vastly superior to Client-Side Rendering (CSR) for SEO performance and crawl efficiency.
  • Common JavaScript SEO issues include non-anchor-tag links (e.g., `onClick` divs), content hidden behind user interactions, and missing metadata in the initial HTML.
  • Audit your site with a JavaScript-capable crawler like ScreamingCAT, comparing the raw HTML with the fully rendered DOM to find rendering issues at scale.
  • Dynamic rendering is a complex workaround, not a sustainable long-term strategy. Prioritize native SSR or SSG whenever possible.
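As a closing illustration of the dynamic rendering pattern discussed above, the user-agent branch at its core can be sketched like this. The bot pattern and the page snapshots are illustrative assumptions, not a production implementation:

```javascript
// Hedged sketch of the user-agent branch at the heart of dynamic rendering.
// The bot list is illustrative; real setups must also handle verification
// of bot identity and keep snapshots fresh, which is where fragility creeps in.
const BOT_UA_PATTERN = /googlebot|bingbot|duckduckbot|baiduspider/i;

function chooseResponse(userAgent, { snapshotHtml, csrShellHtml }) {
  // Known crawlers get the pre-rendered snapshot; humans get the SPA shell.
  return BOT_UA_PATTERN.test(userAgent) ? snapshotHtml : csrShellHtml;
}

const pages = {
  snapshotHtml:
    '<html><body><h1>About Us</h1><p>Full pre-rendered content.</p></body></html>',
  csrShellHtml:
    '<html><body><div id="root"></div><script src="/bundle.js"></script></body></html>',
};

const botHtml = chooseResponse(
  'Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)',
  pages
);
const humanHtml = chooseResponse('Mozilla/5.0 (Windows NT 10.0; Win64; x64)', pages);

console.log(botHtml.includes('<h1>About Us</h1>')); // true — the bot sees full content
console.log(humanHtml.includes('id="root"'));       // true — humans get the SPA shell
```

Two response paths for one URL is exactly the maintenance burden the article warns about, which is why SSR or SSG remains the better long-term answer.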