{"id":4810,"date":"2024-12-18T12:43:51","date_gmt":"2024-12-18T17:43:51","guid":{"rendered":"https:\/\/frontendmasters.com\/blog\/?p=4810"},"modified":"2024-12-18T12:43:52","modified_gmt":"2024-12-18T17:43:52","slug":"introducing-tanstack-start","status":"publish","type":"post","link":"https:\/\/frontendmasters.com\/blog\/introducing-tanstack-start\/","title":{"rendered":"Introducing TanStack Start"},"content":{"rendered":"\n<p>The best way to think about&nbsp;<a href=\"https:\/\/tanstack.com\/start\/latest\">TanStack Start<\/a>&nbsp;is that it\u2019s a thin server layer atop the&nbsp;<a href=\"https:\/\/tanstack.com\/router\/latest\">TanStack Router<\/a>&nbsp;we&nbsp;<a href=\"https:\/\/frontendmasters.com\/blog\/introducing-tanstack-router\/\">already know and love<\/a>; that means we don&#8217;t lose a single thing from TanStack Router. Not only that, but the nature of this server layer allows it to side-step the pain points other web meta-frameworks suffer from.<\/p>\n\n\n\n<p>This is a post I&#8217;ve been looking forward to writing for a long time; it&#8217;s also a difficult one to write. <\/p>\n\n\n\n<p>The goal (and challenge) will be to show why a server layer on top of a JavaScript router is valuable, and&nbsp;why&nbsp;TanStack Start\u2019s implementation is unique compared to the alternatives (in a good way). From there, showing how TanStack Start actually works will be relatively straightforward. Let&#8217;s go!<\/p>\n\n\n\n<p class=\"learn-more\">Please keep in mind that, while this post discusses a lot of generic web performance issues, TanStack Start is still a React-specific meta-framework. It&#8217;s not a framework-agnostic tool like <a href=\"https:\/\/astro.build\/\">Astro<\/a>.\u00a0<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Why Server Rendering?<\/h2>\n\n\n\n<p>Client-rendered web applications, often called \u201cSingle Page Applications\u201d or \u201cSPAs\u201d have been popular for a long time. 
With this type of app, the server sends down a mostly empty HTML page, possibly with some sort of splash image, loading spinner, or maybe some navigation components. It also includes, very importantly, script tags that load your framework of choice (React, Vue, Svelte, etc) and a bundle of your application code.<\/p>\n\n\n\n<p>These apps were always fun to build, and in spite of the hate they often get, they (usually) worked just fine (any kind of software can be bad). Admittedly, they suffer a big disadvantage: initial render performance. Remember, the initial render of the page was just an empty shell of your app. This displayed while your script files loaded and executed, and once&nbsp;<em>those<\/em>&nbsp;scripts were run, your application code would most likely need to request data before your actual app could display. Under the covers, your app is doing something like this<\/p>\n\n\n<div class=\"wp-block-image\">\n<figure class=\"aligncenter size-full is-resized\"><img data-recalc-dims=\"1\" loading=\"lazy\" decoding=\"async\" width=\"772\" height=\"874\" src=\"https:\/\/i0.wp.com\/frontendmasters.com\/blog\/wp-content\/uploads\/2024\/12\/csr-perf-flow.png?resize=772%2C874&#038;ssl=1\" alt=\"\" class=\"wp-image-4816\" style=\"width:537px;height:auto\" srcset=\"https:\/\/i0.wp.com\/frontendmasters.com\/blog\/wp-content\/uploads\/2024\/12\/csr-perf-flow.png?w=772&amp;ssl=1 772w, https:\/\/i0.wp.com\/frontendmasters.com\/blog\/wp-content\/uploads\/2024\/12\/csr-perf-flow.png?resize=265%2C300&amp;ssl=1 265w, https:\/\/i0.wp.com\/frontendmasters.com\/blog\/wp-content\/uploads\/2024\/12\/csr-perf-flow.png?resize=768%2C869&amp;ssl=1 768w\" sizes=\"auto, (max-width: 772px) 100vw, 772px\" \/><\/figure>\n<\/div>\n\n\n<p>The initial render of the page, from the web server, renders only an empty shell of your application. Then some scripts are requested, and then parsed and executed. When those application scripts run, you (likely) send some other requests for data. 
Once&nbsp;<em>that<\/em>&nbsp;is done, your page displays.<\/p>\n\n\n\n<p>To put it more succinctly, with client-rendered web apps, when the user first loads your app, they&#8217;ll just get a loading spinner. Maybe your company&#8217;s logo above it, if they&#8217;re lucky.<\/p>\n\n\n<div class=\"wp-block-image\">\n<figure class=\"aligncenter size-full is-resized\"><img data-recalc-dims=\"1\" loading=\"lazy\" decoding=\"async\" width=\"684\" height=\"498\" src=\"https:\/\/i0.wp.com\/frontendmasters.com\/blog\/wp-content\/uploads\/2024\/12\/csr-user.png?resize=684%2C498&#038;ssl=1\" alt=\"\" class=\"wp-image-4819\" style=\"width:487px;height:auto\" srcset=\"https:\/\/i0.wp.com\/frontendmasters.com\/blog\/wp-content\/uploads\/2024\/12\/csr-user.png?w=684&amp;ssl=1 684w, https:\/\/i0.wp.com\/frontendmasters.com\/blog\/wp-content\/uploads\/2024\/12\/csr-user.png?resize=300%2C218&amp;ssl=1 300w\" sizes=\"auto, (max-width: 684px) 100vw, 684px\" \/><\/figure>\n<\/div>\n\n\n<p>This is perhaps an overstatement. Users may not even notice the delay caused by these scripts loading (which are likely cached), or hydration, which is probably fast. 
Depending on the speed of their network, and the type of application, this stuff might not matter much.<\/p>\n\n\n\n<p><em>Maybe.<\/em><\/p>\n\n\n\n<p>But if our tools now make it easy to do better, why not do better?<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Server Side Rendering<\/h3>\n\n\n\n<p>With SSR, the picture looks more like this<\/p>\n\n\n<div class=\"wp-block-image\">\n<figure class=\"aligncenter size-full is-resized\"><img data-recalc-dims=\"1\" loading=\"lazy\" decoding=\"async\" width=\"788\" height=\"884\" src=\"https:\/\/i0.wp.com\/frontendmasters.com\/blog\/wp-content\/uploads\/2024\/12\/ssr-render.png?resize=788%2C884&#038;ssl=1\" alt=\"\" class=\"wp-image-4821\" style=\"width:515px;height:auto\" srcset=\"https:\/\/i0.wp.com\/frontendmasters.com\/blog\/wp-content\/uploads\/2024\/12\/ssr-render.png?w=788&amp;ssl=1 788w, https:\/\/i0.wp.com\/frontendmasters.com\/blog\/wp-content\/uploads\/2024\/12\/ssr-render.png?resize=267%2C300&amp;ssl=1 267w, https:\/\/i0.wp.com\/frontendmasters.com\/blog\/wp-content\/uploads\/2024\/12\/ssr-render.png?resize=768%2C862&amp;ssl=1 768w\" sizes=\"auto, (max-width: 788px) 100vw, 788px\" \/><\/figure>\n<\/div>\n\n\n<p>The server sends down the complete, finished page that the user can see immediately. We do still need to load our scripts and hydrate, so our page can be&nbsp;<em>interactive<\/em>. 
But that&#8217;s usually fast, and the user will still have content to see while that happens.<\/p>\n\n\n\n<p>Our hypothetical user now looks like this, since the server is responding with a full page the user can see.<\/p>\n\n\n<div class=\"wp-block-image\">\n<figure class=\"aligncenter size-full is-resized\"><img data-recalc-dims=\"1\" loading=\"lazy\" decoding=\"async\" width=\"796\" height=\"506\" src=\"https:\/\/i0.wp.com\/frontendmasters.com\/blog\/wp-content\/uploads\/2024\/12\/ssr-user.png?resize=796%2C506&#038;ssl=1\" alt=\"\" class=\"wp-image-4823\" style=\"width:569px;height:auto\" srcset=\"https:\/\/i0.wp.com\/frontendmasters.com\/blog\/wp-content\/uploads\/2024\/12\/ssr-user.png?w=796&amp;ssl=1 796w, https:\/\/i0.wp.com\/frontendmasters.com\/blog\/wp-content\/uploads\/2024\/12\/ssr-user.png?resize=300%2C191&amp;ssl=1 300w, https:\/\/i0.wp.com\/frontendmasters.com\/blog\/wp-content\/uploads\/2024\/12\/ssr-user.png?resize=768%2C488&amp;ssl=1 768w\" sizes=\"auto, (max-width: 796px) 100vw, 796px\" \/><\/figure>\n<\/div>\n\n\n<h3 class=\"wp-block-heading\">Streaming<\/h3>\n\n\n\n<p>We made one implicit assumption above: that our data was fast. If our data was slow to load, our server would be slow to respond. It&#8217;s bad for the user to be stuck looking at a loading spinner, but it&#8217;s even worse for the user to be stuck looking at a blank screen while the server churns.<\/p>\n\n\n\n<p>As a solution for this, we can use something called &#8220;streaming,&#8221; or &#8220;out of order streaming&#8221; to be more precise. 
The user still requests all the data, as before, but we tell our server &#8220;don&#8217;t wait for this\/these data, which are slow: render everything else, now, and send that slow data to the browser when it&#8217;s ready.&#8221;<\/p>\n\n\n\n<p>All modern meta-frameworks support this, and our picture now looks like this<\/p>\n\n\n<div class=\"wp-block-image\">\n<figure class=\"aligncenter size-full is-resized\"><img data-recalc-dims=\"1\" loading=\"lazy\" decoding=\"async\" width=\"822\" height=\"500\" src=\"https:\/\/i0.wp.com\/frontendmasters.com\/blog\/wp-content\/uploads\/2024\/12\/ssr-streaming-user.png?resize=822%2C500&#038;ssl=1\" alt=\"\" class=\"wp-image-4824\" style=\"width:569px\" srcset=\"https:\/\/i0.wp.com\/frontendmasters.com\/blog\/wp-content\/uploads\/2024\/12\/ssr-streaming-user.png?w=822&amp;ssl=1 822w, https:\/\/i0.wp.com\/frontendmasters.com\/blog\/wp-content\/uploads\/2024\/12\/ssr-streaming-user.png?resize=300%2C182&amp;ssl=1 300w, https:\/\/i0.wp.com\/frontendmasters.com\/blog\/wp-content\/uploads\/2024\/12\/ssr-streaming-user.png?resize=768%2C467&amp;ssl=1 768w\" sizes=\"auto, (max-width: 822px) 100vw, 822px\" \/><\/figure>\n<\/div>\n\n\n<p>To put a finer point on it, the server does still initiate the request for our slow data&nbsp;<em>immediately<\/em>, on the server during our initial navigation. It just doesn&#8217;t block the initial render, and instead&nbsp;<em>pushes down<\/em>&nbsp;the data when ready. We&#8217;ll look at streaming with Start later in this post.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Why did we ever do client-rendering?<\/h3>\n\n\n\n<p>I&#8217;m not here to tear down client-rendered apps. They were, and frankly&nbsp;<em>still are<\/em>&nbsp;an incredible way to ship deeply interactive user experiences with JavaScript frameworks like React and Vue. The fact of the matter is, server rendering a web app built with React was tricky to get right. 
You not only needed to server render and send down the HTML for the page the user requested, but also send down the&nbsp;<em>data<\/em>&nbsp;for that page, and hydrate everything&nbsp;<em>just right<\/em>&nbsp;on the client.<\/p>\n\n\n\n<p>It&#8217;s hard to get right. But here&#8217;s the thing:&nbsp;<strong>getting this right is one of the primary purposes of this new generation of meta-frameworks<\/strong>. Next, Nuxt, Remix, SvelteKit, and SolidStart are some of the more famous examples of these meta-frameworks. And now TanStack Start.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Why is TanStack Start different?<\/h2>\n\n\n\n<p>Why do we need a new meta-framework? There are many possible answers to that question, but I&#8217;ll give mine. Existing meta-frameworks suffer from some variation on the same issue. They&#8217;ll provide some mechanism to load data on the server. This mechanism is often called a &#8220;loader,&#8221; or in the case of Next, it&#8217;s just RSCs (React Server Components). In Next&#8217;s (older) pages directory, it&#8217;s the&nbsp;<code>getServerSideProps<\/code>&nbsp;function. The specifics don&#8217;t matter. 
What matters is, for each route, whether the initial load of the page, or client-side navigation via links, some server-side code will run, send down the data, and then render the new page.<\/p>\n\n\n\n<div class=\"wp-block-columns learn-more is-layout-flex wp-container-core-columns-is-layout-9d6595d7 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:20%\">\n<figure class=\"wp-block-image size-full is-resized\"><a href=\"https:\/\/frontendmasters.com\/courses\/complete-react-v9\/?utm_source=boost&amp;utm_medium=blog&amp;utm_campaign=boost\"><img data-recalc-dims=\"1\" loading=\"lazy\" decoding=\"async\" width=\"500\" height=\"500\" src=\"https:\/\/i0.wp.com\/frontendmasters.com\/blog\/wp-content\/uploads\/2024\/12\/thumb.webp?resize=500%2C500&#038;ssl=1\" alt=\"\" class=\"wp-image-4840\" style=\"width:122px;height:auto\" srcset=\"https:\/\/i0.wp.com\/frontendmasters.com\/blog\/wp-content\/uploads\/2024\/12\/thumb.webp?w=500&amp;ssl=1 500w, https:\/\/i0.wp.com\/frontendmasters.com\/blog\/wp-content\/uploads\/2024\/12\/thumb.webp?resize=300%2C300&amp;ssl=1 300w, https:\/\/i0.wp.com\/frontendmasters.com\/blog\/wp-content\/uploads\/2024\/12\/thumb.webp?resize=150%2C150&amp;ssl=1 150w\" sizes=\"auto, (max-width: 500px) 100vw, 500px\" \/><\/a><\/figure>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-vertically-aligned-center is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:66.66%\">\n<p>Need to bone up on React in general? 
Brian Holt&#8217;s <a href=\"https:\/\/frontendmasters.com\/courses\/complete-react-v9\/?utm_source=boost&amp;utm_medium=blog&amp;utm_campaign=boost\">Complete Intro to React<\/a> and <a href=\"https:\/\/frontendmasters.com\/courses\/intermediate-react-v5\/?utm_source=boost&amp;utm_medium=blog&amp;utm_campaign=boost\">Intermediate React<\/a> will get you there.<\/p>\n<\/div>\n<\/div>\n\n\n\n<h3 class=\"wp-block-heading\">An Impedance Mismatch is Born<\/h3>\n\n\n\n<p>Notice the two worlds that exist: the server, where data loading code will always run, and the client. It&#8217;s the difference and separation between these worlds that can cause issues.<\/p>\n\n\n\n<p>For example, frameworks always provide some mechanism to mutate data, and then re-fetch to show updated state. Imagine your loader for a page loads some tasks, user settings, and announcements. When the user edits a task, and revalidates, these frameworks will almost always re-run the entire loader, and superfluously re-load the user&#8217;s announcements and user settings, in addition to tasks, even though tasks are the only thing that changed.<\/p>\n\n\n\n<p>Are there fixes? Of course. Many frameworks will allow you to create extra loaders to spread the data loading across, and revalidate only&nbsp;<em>some<\/em>&nbsp;of them. Other frameworks encourage you to cache your data. These solutions all work, but come with their own tradeoffs. And remember, they&#8217;re solutions to a problem that meta-frameworks <em>created<\/em>, by having server-side loading code for every path in your app.<\/p>\n\n\n\n<p>Or what about a loader that loads 5 different pieces of data? After the page loads, the user starts browsing around, occasionally coming back to that first page. These frameworks will usually cache that previously-displayed page, for a time. Or not. But it&#8217;s all or none. 
When the loader re-runs, all 5 pieces of data will re-fire, even if 4 of them can be cached safely.<\/p>\n\n\n\n<p>You might think using a component-level data loading solution like react-query can help. react-query is great, but it doesn&#8217;t eliminate these problems. If you have two different pages that each have 5 data sources, of which 4 are shared in common, browsing from the first page to the second will cause the second page to re-request all 5 pieces of data, even though 4 of them are already present in client-side state from the first page. The server is unaware of what happens to exist on the client. The server is not keeping track of what state you have in your browser; in fact the &#8220;server&#8221; might just be a Lambda function that spins up, satisfies your request, and then dies off.<\/p>\n\n\n<div class=\"wp-block-image\">\n<figure class=\"aligncenter size-full is-resized\"><img data-recalc-dims=\"1\" loading=\"lazy\" decoding=\"async\" width=\"724\" height=\"800\" src=\"https:\/\/i0.wp.com\/frontendmasters.com\/blog\/wp-content\/uploads\/2024\/12\/extra-data.png?resize=724%2C800&#038;ssl=1\" alt=\"\" class=\"wp-image-4825\" style=\"width:537px;height:auto\" srcset=\"https:\/\/i0.wp.com\/frontendmasters.com\/blog\/wp-content\/uploads\/2024\/12\/extra-data.png?w=724&amp;ssl=1 724w, https:\/\/i0.wp.com\/frontendmasters.com\/blog\/wp-content\/uploads\/2024\/12\/extra-data.png?resize=272%2C300&amp;ssl=1 272w\" sizes=\"auto, (max-width: 724px) 100vw, 724px\" \/><\/figure>\n<\/div>\n\n\n<p>In the picture, we can see a loader from the server sending down data for <code>queryB<\/code>, which we already have in our TanStack cache.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Where to, from here?<\/h3>\n\n\n\n<p>The root problem is that these meta-frameworks inevitably have server-only code running on each path, integrating with long-running client-side state. This leads to conflicts and inefficiencies which need to be managed. 
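To make the inefficiency concrete, here is a toy sketch in plain TypeScript of those two pages sharing four of their five data sources. This is not TanStack code, and every name in it is made up for illustration: a loader that cannot see the client's cache re-fetches everything on every navigation, while a client-side cache fetches only what is missing.

```typescript
// Simulated network request; counts how often we actually hit the server.
let networkCalls = 0;
const fetchData = async (key: string): Promise<string> => {
  networkCalls++;
  return `data for ${key}`;
};

// A server loader runs fresh per navigation, with no view of client state:
// it must fetch every key the page needs, every time.
async function serverLoader(keys: string[]): Promise<string[]> {
  return Promise.all(keys.map((key) => fetchData(key)));
}

// A client-side cache only fetches keys it has not already loaded.
const cache = new Map<string, Promise<string>>();
async function cachedLoad(keys: string[]): Promise<string[]> {
  return Promise.all(
    keys.map((key) => {
      if (!cache.has(key)) cache.set(key, fetchData(key));
      return cache.get(key)!;
    })
  );
}

// Two pages that share four of their five data sources.
const pageA = ["tasks", "settings", "announcements", "user", "stats"];
const pageB = ["tasks", "settings", "announcements", "user", "epics"];

async function demo() {
  await serverLoader(pageA);
  await serverLoader(pageB);
  const uncached = networkCalls; // 10: nothing was reused

  networkCalls = 0;
  await cachedLoad(pageA);
  await cachedLoad(pageB);
  const cached = networkCalls; // 6: page B only fetched its one new key

  return { uncached, cached };
}
```

Running `demo()` yields `{ uncached: 10, cached: 6 }`: the four shared requests disappear once something on the client can remember them. That is the gap a server-only loader cannot close on its own.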
There are ways of handling these things, which I touched on above. But it&#8217;s not a completely clean fit.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">How much does it matter?<\/h3>\n\n\n\n<p>Let&#8217;s be clear right away: if this situation is killing the performance of your site, you have bigger problems. If these extra calls are putting undue strain on your services, you have bigger problems.<\/p>\n\n\n\n<p>That said, one of the first rules of distributed systems is to never trust your network. The more of these calls we&#8217;re firing off, the better the chances that some of them might randomly be slow for some reason beyond our control. Or fail.<\/p>\n\n\n\n<p>We typically tolerate requesting more than we need in these scenarios because it&#8217;s hard to avoid with our current tooling. But I&#8217;m here to show you some new, better tooling that side-steps these issues altogether.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Isomorphic Loaders<\/h3>\n\n\n\n<p>In TanStack, we do have loaders. These are defined by TanStack Router. I wrote <a href=\"https:\/\/frontendmasters.com\/blog\/introducing-tanstack-router\/\">a three-part series on Router&nbsp;here<\/a>. If you haven&#8217;t read that, and aren&#8217;t familiar with Router, give it a quick look.<\/p>\n\n\n\n<p>Start takes what we already have with Router, and adds server handling to it. On the initial load, your loader will run on the server, load your data, and send it down. On all subsequent client-side navigations, your loader will run&nbsp;<em>on the client<\/em>, like it already does. That means subsequent invocations of your loader have access to any client-side state, cache, etc. If you like react-query, you&#8217;ll be happy to know that&#8217;s integrated too. Your react-query client can run on the server, to load, and send data down on the initial page load. 
On subsequent navigations, these loaders will run on the client, which means your react-query <code>queryClient<\/code> will have full access to the usual client-side cache react-query always uses. That means it will know what does, and does not, need to be loaded.<\/p>\n\n\n\n<p>It&#8217;s honestly such a refreshing, simple, and most importantly, effective pattern that it&#8217;s hard not to be annoyed that none of the other frameworks thought of it first. Admittedly, <a href=\"https:\/\/svelte.dev\/tutorial\/kit\/universal-load-functions\">SvelteKit does have universal loaders<\/a> which are isomorphic in the same way, but without a component-level query library like react-query integrated with the server.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">TanStack Start<\/h2>\n\n\n\n<p>Enough setup; let&#8217;s look at some code. TanStack Start is still in beta, so some of the setup is still a bit manual, for now.<\/p>\n\n\n\n<p><a href=\"https:\/\/github.com\/arackaf\/tanstack-start-blog-dataloading\">The repo for this post&nbsp;is here.<\/a><\/p>\n\n\n\n<p>If you&#8217;d like to set something up yourself, check out&nbsp;<a href=\"https:\/\/tanstack.com\/router\/latest\/docs\/framework\/react\/start\/getting-started\">the getting started guide<\/a>. If you&#8217;d like to use react-query, be sure to add the library for that. You can see an example&nbsp;<a href=\"https:\/\/github.com\/TanStack\/router\/blob\/main\/examples%2Freact%2Fstart-basic-react-query%2Fapp%2Frouter.tsx\">here<\/a>. Depending on when you read this, there might be a CLI to do all of this for you.<\/p>\n\n\n\n<p>This post will continue to use the same code I used in my&nbsp;<a href=\"https:\/\/frontendmasters.com\/blog\/introducing-tanstack-router\/\">prior posts<\/a>&nbsp;on TanStack Router. I set up a new Start project, copied over all the route code, and tweaked a few import paths since the default Start project has a slightly different folder structure. 
I also removed all of the artificial delays, unless otherwise noted. I want our data to be fast by default, and slow in a few places where we&#8217;ll use streaming to manage the slowness.<\/p>\n\n\n\n<p>We&#8217;re not building anything new here. We&#8217;re taking existing code, and moving the data loading up to the server in order to get it requested sooner, and improve our page load times. This means everything we already know and love about TanStack Router is still 100% valid. <\/p>\n\n\n\n<p>Start does not replace Router; Start&nbsp;<em>improves<\/em>&nbsp;Router.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Loading Data<\/h3>\n\n\n\n<p>All of the routes and loaders we set up with Router are still valid. Start sits on top of Router and adds server processing. Our loaders will execute on the server for the first load of the page, and then on the client as the user browses. But there&#8217;s a small problem. While the server environment these loaders will execute in does indeed have a&nbsp;<code>fetch<\/code>&nbsp;function, there are differences between client-side fetch and server-side fetch\u2014for example, cookies, and fetching to relative paths.<\/p>\n\n\n\n<p>To solve this, Start lets you define a&nbsp;<a href=\"https:\/\/tanstack.com\/router\/latest\/docs\/framework\/react\/start\/server-functions\">server function<\/a>. Server functions can be called from the client, or from the server; but the server function itself always&nbsp;<em>executes on<\/em>&nbsp;the server. 
You can define a server function in the same file as your route, or in a separate file; if you do the former, TanStack will do the work of ensuring that server-only code does not ever exist in your client bundle.<\/p>\n\n\n\n<p>Let&#8217;s define a server function to load our tasks, and then call it from the tasks loader.<\/p>\n\n\n<pre class=\"wp-block-code\" aria-describedby=\"shcb-language-1\" data-shcb-language-name=\"TypeScript\" data-shcb-language-slug=\"typescript\"><span><code class=\"hljs language-typescript\"><span class=\"hljs-keyword\">import<\/span> { getCookie } <span class=\"hljs-keyword\">from<\/span> <span class=\"hljs-string\">\"vinxi\/http\"<\/span>;\n<span class=\"hljs-keyword\">import<\/span> { createServerFn } <span class=\"hljs-keyword\">from<\/span> <span class=\"hljs-string\">\"@tanstack\/start\"<\/span>;\n<span class=\"hljs-keyword\">import<\/span> { Task } <span class=\"hljs-keyword\">from<\/span> <span class=\"hljs-string\">\"..\/..\/types\"<\/span>;\n\n<span class=\"hljs-keyword\">export<\/span> <span class=\"hljs-keyword\">const<\/span> getTasksList = createServerFn({ method: <span class=\"hljs-string\">\"GET\"<\/span> }).handler(<span class=\"hljs-keyword\">async<\/span> () =&gt; {\n  <span class=\"hljs-keyword\">const<\/span> result = getCookie(<span class=\"hljs-string\">\"user\"<\/span>);\n\n  <span class=\"hljs-keyword\">return<\/span> fetch(<span class=\"hljs-string\">`http:\/\/localhost:3000\/api\/tasks`<\/span>, { method: <span class=\"hljs-string\">\"GET\"<\/span>, headers: { Cookie: <span class=\"hljs-string\">\"user=\"<\/span> + result } })\n    .then(<span class=\"hljs-function\"><span class=\"hljs-params\">resp<\/span> =&gt;<\/span> resp.json())\n    .then(<span class=\"hljs-function\"><span class=\"hljs-params\">res<\/span> =&gt;<\/span> res <span class=\"hljs-keyword\">as<\/span> Task&#91;]);\n});<\/code><\/span><small class=\"shcb-language\" id=\"shcb-language-1\"><span class=\"shcb-language__label\">Code 
language:<\/span> <span class=\"shcb-language__name\">TypeScript<\/span> <span class=\"shcb-language__paren\">(<\/span><span class=\"shcb-language__slug\">typescript<\/span><span class=\"shcb-language__paren\">)<\/span><\/small><\/pre>\n\n\n<p>We have access to a&nbsp;<code>getCookie<\/code>&nbsp;utility from the <a href=\"https:\/\/github.com\/nksaraf\/vinxi\">vinxi<\/a> library on which Start is built. Server functions actually provide a lot more functionality than this simple example shows. Be sure to check out&nbsp;<a href=\"https:\/\/tanstack.com\/router\/latest\/docs\/framework\/react\/start\/server-functions\">the docs<\/a>&nbsp;to learn more.<\/p>\n\n\n\n<p>If you&#8217;re curious about this fetch call:<\/p>\n\n\n<pre class=\"wp-block-code\" aria-describedby=\"shcb-language-2\" data-shcb-language-name=\"JavaScript\" data-shcb-language-slug=\"javascript\"><span><code class=\"hljs language-javascript\">fetch(<span class=\"hljs-string\">`http:\/\/localhost:3000\/api\/tasks`<\/span>, { <span class=\"hljs-attr\">method<\/span>: <span class=\"hljs-string\">\"GET\"<\/span>, <span class=\"hljs-attr\">headers<\/span>: { <span class=\"hljs-attr\">Cookie<\/span>: <span class=\"hljs-string\">\"user=\"<\/span> + result } });\n<\/code><\/span><small class=\"shcb-language\" id=\"shcb-language-2\"><span class=\"shcb-language__label\">Code language:<\/span> <span class=\"shcb-language__name\">JavaScript<\/span> <span class=\"shcb-language__paren\">(<\/span><span class=\"shcb-language__slug\">javascript<\/span><span class=\"shcb-language__paren\">)<\/span><\/small><\/pre>\n\n\n<p>That&#8217;s how I&#8217;m loading data for this project, on the server. I have a separate project running a set of Express endpoints querying a simple SQLite database. You can fetch your data however you need from within these server functions, be it via an ORM like Drizzle, an external service endpoint like I have here, or you could connect right to a database and query what you need. 
But that latter option should probably be discouraged for production applications.<\/p>\n\n\n\n<p>Now we can call our server function from our loader.<\/p>\n\n\n<pre class=\"wp-block-code\" aria-describedby=\"shcb-language-3\" data-shcb-language-name=\"TypeScript\" data-shcb-language-slug=\"typescript\"><span><code class=\"hljs language-typescript\">loader: <span class=\"hljs-keyword\">async<\/span> ({ context }) =&gt; {\n    <span class=\"hljs-keyword\">const<\/span> now = +<span class=\"hljs-keyword\">new<\/span> <span class=\"hljs-built_in\">Date<\/span>();\n    <span class=\"hljs-built_in\">console<\/span>.log(<span class=\"hljs-string\">`\/tasks\/index path loader. Loading tasks at + <span class=\"hljs-subst\">${now - context.timestarted}<\/span>ms since start`<\/span>);\n    <span class=\"hljs-keyword\">const<\/span> tasks = <span class=\"hljs-keyword\">await<\/span> getTasksList();\n    <span class=\"hljs-keyword\">return<\/span> { tasks };\n  },<\/code><\/span><small class=\"shcb-language\" id=\"shcb-language-3\"><span class=\"shcb-language__label\">Code language:<\/span> <span class=\"shcb-language__name\">TypeScript<\/span> <span class=\"shcb-language__paren\">(<\/span><span class=\"shcb-language__slug\">typescript<\/span><span class=\"shcb-language__paren\">)<\/span><\/small><\/pre>\n\n\n<p>That&#8217;s all there is to it. It&#8217;s almost anti-climactic. The page loads, as it did in the last post. Except now it server renders. 
You can shut JavaScript off, and the page will still load and display (and hyperlinks will still work).<\/p>\n\n\n<div class=\"wp-block-image\">\n<figure class=\"aligncenter size-large is-resized\"><img data-recalc-dims=\"1\" loading=\"lazy\" decoding=\"async\" width=\"671\" height=\"1024\" src=\"https:\/\/i0.wp.com\/frontendmasters.com\/blog\/wp-content\/uploads\/2024\/12\/tasks-page.png?resize=671%2C1024&#038;ssl=1\" alt=\"\" class=\"wp-image-4826\" style=\"width:429px;height:auto\" srcset=\"https:\/\/i0.wp.com\/frontendmasters.com\/blog\/wp-content\/uploads\/2024\/12\/tasks-page.png?resize=671%2C1024&amp;ssl=1 671w, https:\/\/i0.wp.com\/frontendmasters.com\/blog\/wp-content\/uploads\/2024\/12\/tasks-page.png?resize=196%2C300&amp;ssl=1 196w, https:\/\/i0.wp.com\/frontendmasters.com\/blog\/wp-content\/uploads\/2024\/12\/tasks-page.png?resize=768%2C1173&amp;ssl=1 768w, https:\/\/i0.wp.com\/frontendmasters.com\/blog\/wp-content\/uploads\/2024\/12\/tasks-page.png?w=926&amp;ssl=1 926w\" sizes=\"auto, (max-width: 671px) 100vw, 671px\" \/><\/figure>\n<\/div>\n\n\n<h3 class=\"wp-block-heading\">Streaming<\/h3>\n\n\n\n<p>Let&#8217;s make the individual task loading purposefully slow (we&#8217;ll just keep the delay that was already in there), so we can see how to stream it in. 
Here&#8217;s our server function to load a single task.<\/p>\n\n\n<pre class=\"wp-block-code\" aria-describedby=\"shcb-language-4\" data-shcb-language-name=\"TypeScript\" data-shcb-language-slug=\"typescript\"><span><code class=\"hljs language-typescript\"><span class=\"hljs-keyword\">export<\/span> <span class=\"hljs-keyword\">const<\/span> getTask = createServerFn({ method: <span class=\"hljs-string\">\"GET\"<\/span> })\n  .validator(<span class=\"hljs-function\">(<span class=\"hljs-params\"><span class=\"hljs-params\">id<\/span>: <span class=\"hljs-params\">string<\/span><\/span>) =&gt;<\/span> id)\n  .handler(<span class=\"hljs-keyword\">async<\/span> ({ data }) =&gt; {\n    <span class=\"hljs-keyword\">return<\/span> fetch(<span class=\"hljs-string\">`http:\/\/localhost:3000\/api\/tasks\/<span class=\"hljs-subst\">${data}<\/span>`<\/span>, { method: <span class=\"hljs-string\">\"GET\"<\/span> })\n      .then(<span class=\"hljs-function\"><span class=\"hljs-params\">resp<\/span> =&gt;<\/span> resp.json())\n      .then(<span class=\"hljs-function\"><span class=\"hljs-params\">res<\/span> =&gt;<\/span> res <span class=\"hljs-keyword\">as<\/span> Task);\n  });<\/code><\/span><small class=\"shcb-language\" id=\"shcb-language-4\"><span class=\"shcb-language__label\">Code language:<\/span> <span class=\"shcb-language__name\">TypeScript<\/span> <span class=\"shcb-language__paren\">(<\/span><span class=\"shcb-language__slug\">typescript<\/span><span class=\"shcb-language__paren\">)<\/span><\/small><\/pre>\n\n\n<p>Note the&nbsp;<code>validator<\/code>&nbsp;function, which is how we strongly type our server function (and validate the inputs). 
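As a sketch of what a stricter validator could look like, here is a standalone function in the same shape: it takes untrusted input and either returns it with a narrowed type or throws. The numeric-id format is an assumption made up for this example, and the function itself is illustrative, not TanStack Start's own API.

```typescript
// Hypothetical validator: accepts unknown input and narrows it to a string,
// rejecting anything that is not a numeric id (the numeric-id rule is an
// assumption for illustration).
function validateTaskId(input: unknown): string {
  if (typeof input !== "string" || !/^\d+$/.test(input)) {
    throw new Error(`Invalid task id: ${JSON.stringify(input)}`);
  }
  return input;
}

console.log(validateTaskId("42")); // → 42
// validateTaskId("abc") or validateTaskId(42) would throw
```

In a real app you would likely reach for a schema library rather than hand-rolled checks; the server function docs linked above cover what the validator accepts.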
But otherwise it&#8217;s more of the same.<\/p>\n\n\n\n<p>Now let&#8217;s call it in our loader, and see about enabling streaming<\/p>\n\n\n\n<p>Here&#8217;s our loader:<\/p>\n\n\n<pre class=\"wp-block-code\" aria-describedby=\"shcb-language-5\" data-shcb-language-name=\"TypeScript\" data-shcb-language-slug=\"typescript\"><span><code class=\"hljs language-typescript\">loader: <span class=\"hljs-keyword\">async<\/span> ({ params, context }) =&gt; {\n    <span class=\"hljs-keyword\">const<\/span> { taskId } = params;\n\n    <span class=\"hljs-keyword\">const<\/span> now = +<span class=\"hljs-keyword\">new<\/span> <span class=\"hljs-built_in\">Date<\/span>();\n    <span class=\"hljs-built_in\">console<\/span>.log(<span class=\"hljs-string\">`\/tasks\/<span class=\"hljs-subst\">${taskId}<\/span> path loader. Loading at + <span class=\"hljs-subst\">${now - context.timestarted}<\/span>ms since start`<\/span>);\n    <span class=\"hljs-keyword\">const<\/span> task = getTask({ data: taskId });\n\n    <span class=\"hljs-keyword\">return<\/span> { task };\n  },<\/code><\/span><small class=\"shcb-language\" id=\"shcb-language-5\"><span class=\"shcb-language__label\">Code language:<\/span> <span class=\"shcb-language__name\">TypeScript<\/span> <span class=\"shcb-language__paren\">(<\/span><span class=\"shcb-language__slug\">typescript<\/span><span class=\"shcb-language__paren\">)<\/span><\/small><\/pre>\n\n\n<p>Did you catch it? We called&nbsp;<code>getTask<\/code>&nbsp;<strong>without<\/strong>&nbsp;awaiting it. That means <code>task<\/code> is a promise, which Start and Router allow us to return from our loader (you could name it&nbsp;<code>taskPromise<\/code> if you like that specificity in naming).<\/p>\n\n\n\n<p>But how do we&nbsp;<em>consume<\/em>&nbsp;this promise, show loading state, and&nbsp;<code>await<\/code>&nbsp;the real value? There are two ways. 
TanStack Router defines an&nbsp;<a href=\"https:\/\/tanstack.com\/router\/latest\/docs\/framework\/react\/api\/router\/awaitComponent#await-component\"><code>Await<\/code>&nbsp;component<\/a> for this. But if you&#8217;re using React 19, you can use the new&nbsp;<code>use<\/code>&nbsp;pseudo-hook.<\/p>\n\n\n<pre class=\"wp-block-code\" aria-describedby=\"shcb-language-6\" data-shcb-language-name=\"TypeScript\" data-shcb-language-slug=\"typescript\"><span><code class=\"hljs language-typescript\"><span class=\"hljs-keyword\">import<\/span> { use } <span class=\"hljs-keyword\">from<\/span> <span class=\"hljs-string\">\"react\"<\/span>;\n\n<span class=\"hljs-function\"><span class=\"hljs-keyword\">function<\/span> <span class=\"hljs-title\">TaskView<\/span>(<span class=\"hljs-params\"><\/span>) <\/span>{\n  <span class=\"hljs-keyword\">const<\/span> { task: taskPromise } = Route.useLoaderData();\n  <span class=\"hljs-keyword\">const<\/span> { isFetching } = Route.useMatch();\n\n  <span class=\"hljs-keyword\">const<\/span> task = use(taskPromise);\n\n  <span class=\"hljs-keyword\">return<\/span> (\n    &lt;div&gt;\n      &lt;Link to=<span class=\"hljs-string\">\"\/app\/tasks\"<\/span>&gt;Back to tasks list&lt;<span class=\"hljs-regexp\">\/Link&gt;\n      &lt;div className=\"flex flex-col gap-2\"&gt;\n        &lt;div&gt;\n          Task {task.id} {isFetching ? 
\"Loading ...\" : null}\n        &lt;\/<\/span>div&gt;\n        &lt;h1&gt;{task.title}&lt;<span class=\"hljs-regexp\">\/h1&gt;\n        &lt;Link \n          params={{ taskId: task.id }}\n          to=\"\/<\/span>app\/tasks\/$taskId\/edit<span class=\"hljs-string\">\"\n        &gt;\n          Edit\n        &lt;\/Link&gt;\n        &lt;div \/&gt;\n      &lt;\/div&gt;\n    &lt;\/div&gt;\n  );\n}<\/span><\/code><\/span><small class=\"shcb-language\" id=\"shcb-language-6\"><span class=\"shcb-language__label\">Code language:<\/span> <span class=\"shcb-language__name\">TypeScript<\/span> <span class=\"shcb-language__paren\">(<\/span><span class=\"shcb-language__slug\">typescript<\/span><span class=\"shcb-language__paren\">)<\/span><\/small><\/pre>\n\n\n<p>The <code>use<\/code> hook&nbsp;will cause the component to suspend, and show the nearest&nbsp;<code>Suspense<\/code>&nbsp;boundary in the tree. Fortunately, the&nbsp;<code>pendingComponent<\/code>&nbsp;you set up in Router also doubles as a Suspense boundary. 
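The mechanics of this unawaited-loader pattern can be illustrated without any framework code. In this sketch (plain TypeScript; the `loader` and `render` functions here are hypothetical stand-ins), the loader returns immediately with a pending promise, the shell renders, and the data is awaited afterwards — roughly what `use(taskPromise)` does for us inside React:

```typescript
// The loader kicks off the async work but returns right away; the
// promise itself is the loader data.
function loader() {
  const task = new Promise<string>(resolve =>
    setTimeout(() => resolve("Task 1: ship the blog post"), 50)
  );
  return { task }; // note: not awaited
}

async function render() {
  const { task } = loader();
  // The shell (and any pendingComponent / Suspense fallback) can
  // paint immediately, before the data exists.
  console.log("shell rendered, fallback showing");

  // Later, awaiting the promise fills in the real content.
  const data = await task;
  console.log(`resolved: ${data}`);
}

render();
```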
TanStack is impressively well integrated with modern React features.<\/p>\n\n\n\n<p>Now when we load an individual task&#8217;s page, we&#8217;ll first see the overview data, which loaded quickly and was server rendered, above the Suspense boundary for the task data we&#8217;re streaming.<\/p>\n\n\n<div class=\"wp-block-image\">\n<figure class=\"aligncenter size-full is-resized\"><img data-recalc-dims=\"1\" loading=\"lazy\" decoding=\"async\" width=\"692\" height=\"662\" src=\"https:\/\/i0.wp.com\/frontendmasters.com\/blog\/wp-content\/uploads\/2024\/12\/streaming-tasks.png?resize=692%2C662&#038;ssl=1\" alt=\"\" class=\"wp-image-4829\" style=\"width:362px;height:auto\" srcset=\"https:\/\/i0.wp.com\/frontendmasters.com\/blog\/wp-content\/uploads\/2024\/12\/streaming-tasks.png?w=692&amp;ssl=1 692w, https:\/\/i0.wp.com\/frontendmasters.com\/blog\/wp-content\/uploads\/2024\/12\/streaming-tasks.png?resize=300%2C287&amp;ssl=1 300w\" sizes=\"auto, (max-width: 692px) 100vw, 692px\" \/><\/figure>\n<\/div>\n\n\n<p>When the task comes in, the promise will resolve, the server will push the data down, and our&nbsp;<code>use<\/code>&nbsp;call will provide data for our component.<\/p>\n\n\n<div class=\"wp-block-image\">\n<figure class=\"aligncenter size-full is-resized\"><img data-recalc-dims=\"1\" loading=\"lazy\" decoding=\"async\" width=\"754\" height=\"764\" src=\"https:\/\/i0.wp.com\/frontendmasters.com\/blog\/wp-content\/uploads\/2024\/12\/streaming-tasks-finish.png?resize=754%2C764&#038;ssl=1\" alt=\"\" class=\"wp-image-4830\" style=\"width:362px\" srcset=\"https:\/\/i0.wp.com\/frontendmasters.com\/blog\/wp-content\/uploads\/2024\/12\/streaming-tasks-finish.png?w=754&amp;ssl=1 754w, https:\/\/i0.wp.com\/frontendmasters.com\/blog\/wp-content\/uploads\/2024\/12\/streaming-tasks-finish.png?resize=296%2C300&amp;ssl=1 296w\" sizes=\"auto, (max-width: 754px) 100vw, 754px\" \/><\/figure>\n<\/div>\n\n\n<h2 class=\"wp-block-heading\">React Query<\/h2>\n\n\n\n<p>As before, let&#8217;s 
integrate react-query. And, as before, there&#8217;s not much to do. Since we added the&nbsp;<code>@tanstack\/react-router-with-query<\/code>&nbsp;package when we got started, our&nbsp;<code>queryClient<\/code>&nbsp;will be available on the server, and will sync up with the <code>queryClient<\/code> on the client, and put data (or in-flight streamed promises) into cache.<\/p>\n\n\n\n<p>Let&#8217;s start with our main epics page. Our loader looked like this before:<\/p>\n\n\n<pre class=\"wp-block-code\" aria-describedby=\"shcb-language-7\" data-shcb-language-name=\"TypeScript\" data-shcb-language-slug=\"typescript\"><span><code class=\"hljs language-typescript\"><span class=\"hljs-keyword\">async<\/span> loader({ context, deps }) {\n    <span class=\"hljs-keyword\">const<\/span> queryClient = context.queryClient;\n\n    queryClient.ensureQueryData(\n      epicsQueryOptions(context.timestarted, deps.page)\n    );\n    queryClient.ensureQueryData(\n      epicsCountQueryOptions(context.timestarted)\n    );\n  }<\/code><\/span><small class=\"shcb-language\" id=\"shcb-language-7\"><span class=\"shcb-language__label\">Code language:<\/span> <span class=\"shcb-language__name\">TypeScript<\/span> <span class=\"shcb-language__paren\">(<\/span><span class=\"shcb-language__slug\">typescript<\/span><span class=\"shcb-language__paren\">)<\/span><\/small><\/pre>\n\n\n<p>That would kick off the requests on the server, but let the page render, and then suspend in the component that called&nbsp;<code>useSuspenseQuery<\/code>\u2014what we&#8217;ve been calling streaming.<\/p>\n\n\n\n<p>Let&#8217;s change it to actually load our data in our loader, and server render the page instead. 
The change couldn&#8217;t be simpler.<\/p>\n\n\n<pre class=\"wp-block-code\" aria-describedby=\"shcb-language-8\" data-shcb-language-name=\"TypeScript\" data-shcb-language-slug=\"typescript\"><span><code class=\"hljs language-typescript\"><span class=\"hljs-keyword\">async<\/span> loader({ context, deps }) {\n  <span class=\"hljs-keyword\">const<\/span> queryClient = context.queryClient;\n\n  <span class=\"hljs-keyword\">await<\/span> <span class=\"hljs-built_in\">Promise<\/span>.allSettled(&#91;\n    queryClient.ensureQueryData(\n      epicsQueryOptions(context.timestarted, deps.page)\n    ),\n    queryClient.ensureQueryData(\n      epicsCountQueryOptions(context.timestarted)\n    ),\n  ]);\n},<\/code><\/span><small class=\"shcb-language\" id=\"shcb-language-8\"><span class=\"shcb-language__label\">Code language:<\/span> <span class=\"shcb-language__name\">TypeScript<\/span> <span class=\"shcb-language__paren\">(<\/span><span class=\"shcb-language__slug\">typescript<\/span><span class=\"shcb-language__paren\">)<\/span><\/small><\/pre>\n\n\n<p>Note we&#8217;re awaiting a <code>Promise.allSettled<\/code> call here so the queries can run together. Don&#8217;t sequentially&nbsp;<code>await<\/code>&nbsp;each individual call (that would create a waterfall), and don&#8217;t use <code>Promise.all<\/code> (that would reject immediately if any of the promises errors out).<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Streaming with react-query<\/h3>\n\n\n\n<p>As I implied above, to stream data with react-query, do the exact same thing, but&nbsp;<em>don&#8217;t<\/em>&nbsp;<code>await<\/code>&nbsp;the promise. 
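To see why `Promise.allSettled` is the right tool here, consider this standalone timing sketch — no TanStack or react-query code, and `fakeQuery` is just a hypothetical stand-in for a call like `ensureQueryData`. Both promises start immediately, they settle in parallel, and a rejection in one doesn't discard the other's result:

```typescript
const delay = (ms: number) => new Promise<void>(res => setTimeout(res, ms));

// A stand-in for a query: resolves (or rejects) after a delay.
async function fakeQuery(name: string, ms: number, fail = false): Promise<string> {
  await delay(ms);
  if (fail) throw new Error(`${name} failed`);
  return `${name} data`;
}

async function main() {
  const start = Date.now();

  // Both "queries" are created (and therefore started) before we await,
  // so they run concurrently; allSettled never rejects, even though one
  // of the underlying promises does.
  const results = await Promise.allSettled([
    fakeQuery("epics", 100),
    fakeQuery("epicsCount", 100, true),
  ]);

  console.log(results.map(r => r.status).join(", ")); // "fulfilled, rejected"
  // Sequentially awaiting each call would take ~200ms; this takes ~100ms.
  console.log(Date.now() - start < 180 ? "ran concurrently" : "waterfall");
}

main();
```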
Let&#8217;s do that on the page for viewing an individual epic.<\/p>\n\n\n<pre class=\"wp-block-code\" aria-describedby=\"shcb-language-9\" data-shcb-language-name=\"TypeScript\" data-shcb-language-slug=\"typescript\"><span><code class=\"hljs language-typescript\">loader: <span class=\"hljs-function\">(<span class=\"hljs-params\">{ <span class=\"hljs-params\">context<\/span>, <span class=\"hljs-params\">params<\/span> }<\/span>) =&gt;<\/span> {\n  <span class=\"hljs-keyword\">const<\/span> { queryClient, timestarted } = context;\n\n  queryClient.ensureQueryData(\n    epicQueryOptions(timestarted, params.epicId)\n  );\n},<\/code><\/span><small class=\"shcb-language\" id=\"shcb-language-9\"><span class=\"shcb-language__label\">Code language:<\/span> <span class=\"shcb-language__name\">TypeScript<\/span> <span class=\"shcb-language__paren\">(<\/span><span class=\"shcb-language__slug\">typescript<\/span><span class=\"shcb-language__paren\">)<\/span><\/small><\/pre>\n\n\n<p>Now if this page is loaded initially, the query for this data will start on the server and stream to the client. If the data are pending, our suspense boundary will show, triggered automatically by react-query&#8217;s&nbsp;<code>useSuspenseQuery<\/code>&nbsp;hook.<\/p>\n\n\n\n<p>If the user browses to this page from a different page, the loader will instead run on the client, but still fetch those same data from the same server function, and trigger the same suspense boundary.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Parting Thoughts<\/h2>\n\n\n\n<p>I hope this post was useful to you. It wasn&#8217;t a deep dive into <a href=\"https:\/\/tanstack.com\/start\/latest\">TanStack Start<\/a> \u2014 the docs are a better venue for that. Instead, I hope I was able to show&nbsp;why&nbsp;server rendering can offer almost any web app a performance boost, and why TanStack Start is a superb tool for doing so. 
Not only does it simplify a great deal of things by running loaders isomorphically, but it even integrates wonderfully with react-query.<\/p>\n\n\n\n<p>The react-query integration is especially exciting to me. It delivers component-level data fetching while still allowing for server fetching, and streaming\u2014all without sacrificing one bit of convenience.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>TanStack Start enhances the TanStack Router by adding a server layer that improves performance through server-side rendering (SSR) and isomorphic loaders.<\/p>\n","protected":false},"author":21,"featured_media":4855,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"sig_custom_text":"","sig_image_type":"featured-image","sig_custom_image":0,"sig_is_disabled":false,"inline_featured_image":false,"_jetpack_memberships_contains_paid_content":false,"footnotes":""},"categories":[1],"tags":[174,3,64,240],"class_list":["post-4810","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-blog-post","tag-data","tag-javascript","tag-server-side-rendering","tag-tanstack"],"acf":[],"jetpack_featured_media_url":"https:\/\/i0.wp.com\/frontendmasters.com\/blog\/wp-content\/uploads\/2024\/12\/tanstack-start.png?fit=1670%2C958&ssl=1","jetpack_sharing_enabled":true,"_links":{"self":[{"href":"https:\/\/frontendmasters.com\/blog\/wp-json\/wp\/v2\/posts\/4810","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/frontendmasters.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/frontendmasters.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/frontendmasters.com\/blog\/wp-json\/wp\/v2\/users\/21"}],"replies":[{"embeddable":true,"href":"https:\/\/frontendmasters.com\/blog\/wp-json\/wp\/v2\/comments?post=4810"}],"version-history":[{"count":24,"href":"https:\/\/frontendmasters.com\/blog\/wp-json\/wp\/v2\/posts\
/4810\/revisions"}],"predecessor-version":[{"id":4859,"href":"https:\/\/frontendmasters.com\/blog\/wp-json\/wp\/v2\/posts\/4810\/revisions\/4859"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/frontendmasters.com\/blog\/wp-json\/wp\/v2\/media\/4855"}],"wp:attachment":[{"href":"https:\/\/frontendmasters.com\/blog\/wp-json\/wp\/v2\/media?parent=4810"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/frontendmasters.com\/blog\/wp-json\/wp\/v2\/categories?post=4810"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/frontendmasters.com\/blog\/wp-json\/wp\/v2\/tags?post=4810"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}