What To Know in JavaScript (2026 Edition)

Chris Coyier

We’ve done posts like this for CSS, but JavaScript deserves the same, dangnabbit! Especially as JavaScript does a better job of versioning itself anyway. We’ll cover new stuff in the language itself, but being a JavaScript practitioner involves more than the language, extending into runtimes, frameworks, libraries, and tooling. Let’s just do this, you’ve probably scrolled down already anyway.

What’s New in the Language

JavaScript has yearly version releases, which is a pretty nice way of doing things if you ask me!

ECMAScript 2025

The latest is ECMAScript 2025, which came out in June 2025, and the whole spec of that version is available.

Iterator Helpers

There are now methods like .map(), .filter(), .take(), .drop() directly on iterators with lazy evaluation. Honestly, to a mostly front-end guy like me, this feels a bit esoteric. Like, we already can map over arrays, so what’s the big deal? But I do understand performance, and that’s one aspect here.

const result = array
  .map(x => x * 2)      // creates a new array in memory
  .filter(x => x > 10)  // ... and again
  .slice(0, 3);         // ... and again

So that’s “slow” and “memory intensive”, especially if the array is quite large and the things you’re doing are “expensive” as they say. The fancy new way is like this:

const result = Iterator.from(array)
  .map(x => x * 2)
  .filter(x => x > 10)
  .take(3)
  .toArray(); // no intermediate arrays; computation stops after 3 results

And as a nice bonus, that whole Iterator.from() thing works on anything iterable. So not just arrays, but sets, maps, generators, etc, which means they all get the same nice set of functions to use.

Set Methods

Sets are kinda nice in JavaScript: it’s like an array, only each item is guaranteed to be unique. That’s nothing new, but if you have two sets, there are now methods for returning interesting things about them, like what overlaps, what doesn’t, etc.

const youKnow  = new Set(["JS", "Python", "CSS", "SQL"]);
const jobNeeds  = new Set(["JS", "TypeScript", "Python"]);

// Skills the job wants that you already have
youKnow.intersection(jobNeeds); // → Set {"JS", "Python"}

// Everything combined — your full stack + job needs
youKnow.union(jobNeeds); // → Set {"JS", "Python", "CSS", "SQL", "TypeScript"}

// What the job needs that you DON'T know yet (skill gaps)
jobNeeds.difference(youKnow); // → Set {"TypeScript"}

// Skills you have that the job doesn't care about
youKnow.difference(jobNeeds); // → Set {"CSS", "SQL"}

// Skills that appear in only one set, not both
youKnow.symmetricDifference(jobNeeds); // → Set {"CSS", "SQL", "TypeScript"}

// Are all job requirements a subset of what you know?
jobNeeds.isSubsetOf(youKnow); // → false

// Do you have every skill and more?
youKnow.isSupersetOf(jobNeeds); // → false

// Do you and the job have zero overlap?
youKnow.isDisjointFrom(jobNeeds); // → false

Pretty useful, I’d say. Claude Code had fun producing an interactive demo from that.

RegEx Updates

Lemme set the stage here. You’re building an on-page search function where your users type in their own search terms. And you want to implement this as a RegEx search. There is some danger there, as some characters a user may type are “special” characters in RegExes, like how a $ matches the end of the string. So if the user searches for $9 and you just dunk that into a RegEx, it would break. Which characters you need to “escape” to fix that is specific to the RegEx implementation at hand.

So! After apparently a 15-year journey, there is now RegExp.escape().

const query = userInput; // e.g. "$5.00 (off!)"

// ❌ BEFORE — breaks for any regex special chars
const badRe  = new RegExp(query, "g");

// ✅ ES2025 — one method, problem solved
const goodRe = new RegExp(RegExp.escape(query), "g");

Again, Claude Code kinda knocked it out of the park with quite a good demo.

There has also been an update to how “flags” can work inside a RegEx. I feel like a super common one is that “i” flag, meaning case-insensitive. So you’d have a RegEx that ended in like /i meaning the whole thing is case-insensitive. But what if you only wanted part of a RegEx to be case-insensitive? Now you can wrap parts of it in parentheses and add those flags at the beginning.

// Old way — you couldn't mix case sensitivity
/[a-z]+@[A-Z]+/i  // 'i' flag applies to EVERYTHING

// ES2025 — inline modifiers per-group
/(?i:[a-z.]+)@(?-i:[A-Z]+)\.(?i:com|org)/
//^^^^ case-insensitive part
//             ^^^^^ case-SENSITIVE part
//                           ^^^ case-insensitive part

Promise Update

Everyone’s favorite asynchronous programming flow model (Promises) has a bit of an update with Promise.try() which can help simplify error handling. A function might error in a sync or async way, and you’d have to handle them separately, but now you can deal with it together:

// A function that MIGHT be async, MIGHT throw sync
function loadUser(id) {
  if (!id) throw new Error("No ID");          // sync throw
  return fetch(`/api/users/${id}`);           // async
}

// ❌ BEFORE — two separate error paths
let p;
try {
  p = loadUser(id);                           // catch sync throw here…
} catch (e) { handleError(e); }
p?.catch(e => handleError(e));                // …and async reject here

// ✅ ES2025 — one liner, one .catch()
Promise.try(() => loadUser(id))
  .then(user  => render(user))
  .catch(err  => showError(err));             // catches BOTH

I will, once again, refer you to a Claude Code-produced demo which does a surprisingly good job of demonstrating the concept.

Import Attributes

Import attributes are a big personal favorite. For real! For one, I like the idea of just importing JSON data as JSON data rather than having to fetch it and parse it and all that:

import data from "./file.json" with { type: 'json' }

That whole with keyword and the object after it are what’s called the “import attributes”, and they have a few more tricks up their sleeve that we’ll get to.

The JSON import just looks nice to me and saves a line of code or two, but to be fair, it has some really notable downsides that Jake Archibald points out in Importing vs fetching JSON. One big one: if the import fails, “it takes the whole module graph down with it,” which is, uh, very bad. You can use a dynamic import() instead to catch a failure…

try {
  const { default: data } = await import(url, {
    with: { type: 'json' },
  });
} catch (error) {
  // Fallback logic
}

But it’s not as good as the data you get when you just do a fetch for the JSON, so it’s still kinda meh. Jake rounds it out, noting that the data you import “will live in the module graph for the life of the page”, rather than being garbage-collectible like the data after a fetch would be. Anyway: tread lightly.

JSON isn’t the only thing you can import with import attributes, though. When I said import attributes are a personal favorite feature, I mostly mean I’m excited to import CSS in this way.

import componentStyles from "./component.css" with { type: "css" };

I get into this in A Nice Vanilla App Architecture Using Web Components and CSS Module Scripts. I just really like how we can keep CSS to CSS files which could live in a folder right next to a JavaScript component.

import sheet from './styles.css' with { type: 'css' };

class MyComponent extends HTMLElement {
  constructor() {
    super();
    const shadowRoot = this.attachShadow({ mode: 'open' });
    shadowRoot.adoptedStyleSheets = [sheet];
  }

  // ...
}

This isn’t absolutely everything in ES2025, and there are plenty of other articles out there specifically getting into that. I found Matthew Tyson’s ECMAScript 2025: The best new features in JavaScript for InfoWorld pretty helpful. It’s got some info in there on Float16Array, for example, that’s a little outside my wheelhouse but has to do with trading precision for memory usage when you know that’s useful.

ECMAScript 2026 (Expected Mid-2026)

It’s still early in 2026, but we’ll expect the annual ECMAScript release mid-year, as in years prior. Here’s stuff that’s already at Stage 4 and will likely make the cut.

Temporal API

Easily the most exciting and useful thing to come into JavaScript in a while. Summed up basically with “Dates and times in JavaScript are good now, no libraries required.” For a long time, big-but-good libraries like Moment filled the gap, making developers choose between performance and DX 😬.

As I write, Safari is the last browser without support, but it’s being worked on and is now in TP (what they call “Technology Preview”), so it’s not far out.

One thing that is now trivial to do is get the time in a particular time zone. No libraries required.

const now = Temporal.Now.zonedDateTimeISO("America/New_York");
// Programmatic date
console.log(now.toString());

// Or more readable...
console.log(now.toLocaleString());

I got a kick out of how TC39 meetings have a bit of code to run in your DevTools console to show you when an upcoming meeting is in your current timezone:

Temporal.ZonedDateTime.from('2026-03-10T10:00[America/New_York]')
  .withTimeZone(Temporal.Now.timeZoneId()) // your time zone
  .toLocaleString();

That’s just cool.

There are a million things Temporal can do, but here are a couple more that really sucked before.

Like how if we “added one month” to the last day in January, we’d get a really whack result:

const date = new Date(2026, 0, 31); // Jan 31
date.setMonth(date.getMonth() + 1); // "add one month"
console.log(date.toDateString()); // Tue Mar 03 2026 ❌ 😬

But with the lovely Temporal API, we’re square:

const jan31 = Temporal.PlainDate.from("2026-01-31");
const feb = jan31.add({ months: 1 });
console.log(feb.toString()); // 2026-02-28 ✅

Also, comparing things is… correct now.

const a = Temporal.Duration.from({ hours: 25 });
const b = Temporal.Duration.from({ days: 1 });

const cmp = Temporal.Duration.compare(a, b, { relativeTo: Temporal.Now.plainDateISO() });
console.log(cmp); // 1  (25h > 1 day) ✅

Explicit Resource Management

There is a new using keyword (plus await using for async resources) that ensures cleanup. The runtime guarantees [Symbol.dispose]() (or [Symbol.asyncDispose]()) is called when the variable goes out of scope.

class FileHandle {
  constructor(path) {
    this.path = path;
    console.log(`Opened ${path}`);
  }

  async write(data) {
    // ... write data
  }

  async [Symbol.asyncDispose]() {
    await someFlushOperation();
    console.log(`Flushed and closed ${this.path}`);
  }
}

async function saveData() {
  await using file = new FileHandle("output.txt");
  await file.write("hello world");
  // file is automatically flushed + closed here, even if an error is thrown
}

The using keyword is nice there for a single resource, but there are also now DisposableStack and AsyncDisposableStack for multiple resources you need to be sure to clean up.

async function runJob() {
  await using stack = new AsyncDisposableStack();

  const db = stack.use(await openDatabase());
  const file = stack.use(new FileHandle("output.txt"));
  stack.defer(async () => removeTempDir("/tmp/job")); // deferred cleanup, runs on dispose

  // Do work...
  await file.write(await db.query("SELECT * FROM jobs"));

  // All three are cleaned up here, in reverse order, even if something threw
}

Array.fromAsync / Iterator Sequencing

Array.fromAsync first shipped in browsers back in 2024, but there were some spec issues with it, so it only officially landed in the spec with ES2026. It awaits each yielded value as it walks the async iterator, collecting results into a plain array. Without it, you’d have to manually loop and push.

async function* fetchNumbers() {
  yield 1;
  await new Promise(r => setTimeout(r, 100)); // simulate async delay
  yield 2;
  await new Promise(r => setTimeout(r, 100));
  yield 3;
}

const numbers = await Array.fromAsync(fetchNumbers());
console.log(numbers); // [1, 2, 3]

It’s probably most useful when you’re consuming an async generator that yields results as it goes, like paginated API fetches. Instead of an async iterator, you can also pass in an array of Promises, and each one is awaited in turn.

And speaking of pagination, Iterator.concat is a new thing that allows you to lazily evaluate each thing you’re iterating over. So, rather than spreading everything into an array up front to iterate over, this can still do the iteration, but if you bail early, you save the memory you would have used filling up that array early.

const page1 = [{ id: 1 }, { id: 2 }][Symbol.iterator]();
const page2 = [{ id: 3 }, { id: 4 }][Symbol.iterator]();
const page3 = [{ id: 5 }, { id: 6 }][Symbol.iterator]();

for (const item of Iterator.concat(page1, page2, page3)) {
  process(item); // streams through all pages lazily
}

Error.isError()

The point is that you can now reliably know if a value is a genuine Error object, not just an object that kinda looks like one. Useful in situations like a centralized error reporting service that potentially receives errors from places like web workers or iframes, which are different “realms” and can screw it up.

Math.sumPrecise

Surely you’ve seen console.log(0.1 + 0.2); — and how the result is a super weird 0.30000000000000004. Long story. Well, just try console.log(Math.sumPrecise([0.1, 0.2])); — (in Firefox, where it’s supported so far) and you’ll see it is… exactly the same.

But apparently it’s still useful anyway for some stuff 🤷‍♀️
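One place it genuinely helps is summing lots of floats, where naive left-to-right addition can lose small values entirely. A sketch (the Math.sumPrecise line is commented out since support is still limited):

```javascript
const values = [1e20, 0.1, -1e20];

// Naive reduce: 1e20 + 0.1 rounds right back to 1e20,
// so the 0.1 vanishes and the huge terms cancel to 0
const naive = values.reduce((a, b) => a + b, 0);
console.log(naive); // 0

// Math.sumPrecise sums as if with unlimited intermediate precision:
// console.log(Math.sumPrecise(values)); // 0.1
```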

Base64 / Hex Encoding

Kinda cool that there are simple, straightforward method calls for these things now.

const val = "Frontend Masters!";
const textEnc = new TextEncoder();
const bytes = textEnc.encode(val);
console.log(bytes.toBase64());
// 'RnJvbnRlbmQgTWFzdGVycyE='
console.log(bytes.toHex());
// '46726f6e74656e64204d61737465727321'

Yet again, let Claude Code demo it in a really neat way.

New in Frameworks

React Ecosystem

React 19 dropped in December 2024, so it’s been a bit since then. We’re at 19.2 right now, and as far as I know, there isn’t a ton of public info about what’s in React 20.

But React 19 was a pretty big release with what they call “RSC” (React Server Components), the React Compiler, and Server Actions. Here they are in a nutshell:

  • RSC: If you can have a Node server involved, maybe just maybe, some components that would normally be bundled into the client-side ball of JavaScript could be left out, and that work could be done on the server instead, communicating back just the needed data.
  • Server Actions: Speaking of having a Node server available, these allow you to call functions that exist specifically on the server. Form processing is a classic example.
  • Compiler: Some performance optimizations have been traditionally left for humans to figure out. Are you a useMemo expert? Me either. By running your React code through this compiler first, it can do these optimizations for you instead. A little build complexity for a little performance gain.

There are, naturally, a whole bunch of little things too, but broad strokes, those are the big things you should know exist. React Native went to 0.83, which I know very little about, I’m afraid, but I do find it notable that they’ve “announced” (kind of) a 1.0, which must feel good to everyone involved after a decade of development. I can’t find a link for that; I think the announcement came as an on-stage shoutout at React Universe Conf.

Those server-based React technologies fresh out of the oven? Well, they were subject to back-to-back very serious security vulnerabilities last year, which rightfully scared plenty of people.

Vue Ecosystem

Vue 3.5 is holding stable, and Vue 3.6 has gone alpha with a new opt-in feature called Vapor Mode for big-time performance improvements (“comparable to Solid and Svelte 5”).

We had a nice overview of the whole Vue ecosystem in 2024. But as a total Vue outsider, it’s a little hard for me to understand the 2025/2026 scene. Obviously, Evan You is the main dude here, but he’s running VoidZero (“The JavaScript Tooling Company”), which now produces Vite+, which is a whole slew of major projects, like Vite itself, formatting, linting, testing, etc. None of that is Vue-specific, and I gotta imagine it’s hard to focus on Vue itself when all this is going on 🤷‍♀️.

Arguably, the main Vue metaframework is Nuxt, which went 4.0. The “stewards” of Nuxt itself are NuxtLabs, which was acquired by Vercel. So Vercel doesn’t like “own” Nuxt, but… kinda? Part of me feels good that metaframeworks have theoretically sustainable homes, and part of me feels weird that VoidZero has like every step in the JavaScript toolchain except a metaframework from their home language. Pinia seems to be the predominant state management library for Vue, which went v3 and dropped Vue 2 support.

Svelte Ecosystem

Svelte is cruising on v5. That was a big update to the Svelte world with what they call the “Runes API” which totally changed how reactivity works, making it more “fine-grained”, as they say, which means more efficient and faster. Honestly, I don’t know that much about Svelte or SvelteKit, except they are part of Vercel as well, and are awfully beloved by the people who use them.

JavaScript Runtimes

The biggest runtimes are clearly the ones baked into browsers. But as far as the ones you can choose and run yourself, Node is still the dominant player, with two interesting competitors. We’ve covered when Deno or Bun might be a viable alternative to Node. There has been more convergence than divergence lately, with all of them supporting TypeScript natively and the alternatives increasingly supporting canonical Node APIs.

Node.js

Perhaps the biggest news recently in Node is that it can run TypeScript files natively. So:

node my-script.ts

That works as of Node 22.18.0, without needing the --experimental-strip-types flag anymore. Note that it still does strip types, meaning it’s not going to help warn you if there are actual problems in your TypeScript code.

The biggest news out of Node land tends to be simple but important bread & butter stuff like improvements to security, performance, and alignment with browser JavaScript APIs.

On a personal note, I’ve been quite pleased with Node’s progress. I’ve worked on projects switching to Node’s built-in test-runner, which feels good to reduce dependencies. I applaud Node’s work on its permissions model, making it feel more usable with untrusted code situations.

Bun

Bun’s big release was 1.3 with lots of DX features around running dev servers. It’s pretty satisfying you can run a full-featured dev server just by pointing bun toward the HTML files:

bun './**/*.html'

This does all the processing and bundling as well, making Bun something of a Vite alternative in this context.

Perhaps the biggest news for Bun is that Anthropic (e.g. Claude) acquired Bun late last year. I think the general vibe is that it is good news for Bun, giving it a stable and well-funded home.

Generally, people choose Bun because of speed. It installs from npm extremely fast and generally performs faster across the board, at the cost of some stability.

Deno

Deno has been at v2 for a while. It’s got, as far as I know, full Node.js compatibility and is the most stable of the three. It’s also got full npm compatibility now, thanks to the npm: specifier in imports.

I think people generally choose Deno because of the stability and security-first architecture. They say it clearly:

Deno is secure by default. Unless you specifically enable it, a program run with Deno has no access to sensitive APIs, such as file system access, network connectivity, or environment access. You must explicitly grant access to these resources with command line flags or with a runtime permission prompt. This is a major difference from Node, where dependencies are automatically granted full access to all system I/O, potentially introducing hidden vulnerabilities into your project.

That’s good design.

Build Tools

Vite

Vite has become the predominant build tool of the JavaScript ecosystem. It was in the right place at the right time, I guess! While it was born out of the same folks that make Vue, Vite is a build tool that works for almost any front-end project. Color me a fan of their approach, where local development works by updating only the small parts of code that change as you work, without requiring full-blown bundling, but it still does production-worthy bundling on demand.

Vite has recently gone v8. This was a significant change in that, rather than relying on the third-party bundling tool Rollup, it now uses Rolldown, a bundler of their own creation. This is in line with Vite’s recent charge into becoming a more “unified toolchain”, as they put it. They can share tooling across their offerings (like a parser), making the whole thing more predictable. They call that whole toolchain Vite+, which includes the fancy dev server from Vite, formatting, linting, type checking, testing, task running, monorepo support, and packaging. That’s a lot!

They are even working on taking it a step further with a “deployment platform” called Void, which uses Cloudflare’s offerings for hosting, data storage, cloud functions, and all that.

Database, KV storage, object storage, AI inference, authentication, queues, and cron jobs. All built-in. Import what you need, skip what you don’t.

Almost all frameworks are using Vite these days: Astro, SolidStart, SvelteKit, Nuxt, etc. The notable exception is Next.js, which historically used webpack and has moved to Turbopack (see next section). But we’ve even seen Next.js AI-ported over to Vite by Cloudflare, which was a controversial move.

Turbopack

Turbopack is Vercel’s bundler that has now become the default bundler as of Next.js v16. Turbopack is a Rust-based project that is supposed to be 5-10✕ faster at refreshing than webpack was in previous versions of Next.js. I believe at the moment Turbopack is specific to Next.js.

webpack

webpack is still heavily used and has a development plan for 2026, which includes many ideas for reducing the need for various loaders and other simplifications. A welcome update, as the general sentiment around webpack is that it’s too complicated.

TypeScript

TypeScript just went v6. They are saying mid-2026 for v7, which is going to be a huge release, swapping over to their new Go-based compiler. The main point of v6 is housekeeping to prepare people for that change. I think the Bytes newsletter had a good quick summary:

Strict mode is now true by default, module defaults to esnext, target floats to the current-year ES spec (currently es2025), and types now defaults to an empty array instead of vacuuming up everything in node_modules/@types. That last one alone will break a lot of projects, but should also speed them up by 20–50%.

Probably worth getting ready for v7 as you’ll almost certainly want the ~10✕ speed improvements seen in places like VS Code and Playwright usage.

Notably, TypeScript has become the #1 language on GitHub, with 66% year-over-year growth.

Types in JavaScript?

Years ago there was a bit of chatter about adding types directly into JavaScript. So perhaps some of the benefits of TypeScript without needing a compiler. This doesn’t seem to have a lot of movement and is unlikely to truly replace what TypeScript can do.

AI

This is probably as good a place to slip this in as any, but with the extreme popularity of TypeScript both in what developers are actively using and what’s available as open source for LLMs to train on, AI is just very good at writing code these days, and particularly TypeScript. They say 92% of developers are using AI to write code to some degree, which is astonishing growth and easily the biggest story in development right now.

Testing

All the main JavaScript testing frameworks are still around and doing the bulk of testing: Jest, Jasmine, Mocha, etc. But there has been some movement. As Vite has grown incredibly popular, its testing framework Vitest has taken off. It’s Jest-compatible, so porting tests to it is generally pretty easy, and it’s much faster (and looks nicer, I think). Vitest also has “browser mode,” meaning it can run tests in a real browser, which is pretty crucial for testing your components. This generally happens with Playwright, which also seems to be having a boom in popularity, can do “end to end” testing in its own right, and seems to have grown past Puppeteer and Cypress (to me, anyway).

Meta Frameworks

Next.js

Next.js is on v16 which is the first release with Turbopack as the default. Personally, I like the push forward with that, but I’ve turned it off on my own projects as I’ve found the migration difficult. But the logging/error improvements are a noticeable step forward. This version of Next uses the React Compiler and React Server Components automatically, which is theoretically a performance gain all around, but the results seem more complex and mixed.

If you use AI with your Next.js site a lot, it’s notable that they have an MCP server now. That essentially means if you hook it up, the AI will be a heck of a lot smarter at working on your site.

It’s on React 19 also, which means <ViewTransition> support, which we looked at here.

Remix / React Router

Once upon a time (a few years ago) Remix was “bought” by Shopify. It went to a v2, then it was announced that what was to be Remix v3 was actually just gonna be React Router v7. Now Remix v3 is going to be a thing after all, and it’s under active development. The big thing is that React isn’t going to be a part of it anymore:

Instead, we’re building our own component model that feels closer to the web than anything we’ve seen before.

They had an event, Remix Jam, where they got into things, so check that out if you’re super interested.

TanStack

Some of the fallout from the Remix confusion may have benefited the TanStack universe, which is a collection of tools, including a router that is quite popular. And like Remix before it, that router has grown up into a framework as well.

We’ve got lots of content getting into the TanStack world from Adam Rackis.

Astro

Astro has been going strong for years now and isn’t slowing down. Just this year, they were acquired by Cloudflare, which generally feels like a good thing, as really good front-end frameworks are notoriously hard to build a strong business model around, and the answer seems to be partnering with serious hosting. It’s already being used to build a weird WordPress clone.

If you’re looking to build a site that is static-by-default, but still uses modern JavaScript framework component-based architecture, and makes it easy to opt-in to more dynamic behavior, Astro is the gold standard and darn fine choice if you ask me.

Astro’s latest release is 6.0, with grown-up features like customizing which runtime you use in development, a content security policy, and an experimental faster compiler. This was quickly followed by a 6.1 release with lots of little nice config improvements and such, proving how dedicated they are to being a good framework.

npm

There doesn’t seem to be a ton happening in npm land. It’s been 6 years since Microsoft/GitHub bought it and it seems to be running fine. GitHub itself has struggled with uptime.

What’s less fine with npm is supply chain incidents, like s1ngularity, which stole people’s credentials / tokens / config files and publicly posted them on GitHub 😳. Then there was debug/chalk, where malicious package updates went out that could rewire crypto transactions to some bad guy’s wallet. Then there was the Shai-Hulud worm (sorry, worms, plural), some kind of self-replicating credential-stealing nastiness, with the 2.0 version overwriting/deleting every single file in a user’s home directory. That one went out to 796 npm packages with over 20 million downloads, so… wow. Not a great year for npm from a security standpoint.

It may be worth checking out a tool like Socket for some protection if you’ve got serious production apps using npm.

What should I learn?

The forever answer is that learning fundamental skills on how these things work will serve you no matter what changes happen in tooling and frameworks and all that. And, dare I say it, the more AI helps us with code, the more we need people like you who will actually know what they are doing and can help plan, guide, shape, test, architect, and apply good taste to code no matter how it is created.

Signing up for Frontend Masters is the ticket to those fundamental skills.
