Check out a free preview of the full WebAssembly (Wasm) course:
The "Memory" Lesson is part of the full WebAssembly (Wasm) course featured in this preview video. Here's what you'd learn in this lesson:

Jem describes memory as a giant warehouse full of storage bins. Each bin has an address and can store anything as long as it fits within the bin. The main job of the operating system on a computer is to allocate and deallocate the use of these storage bins.


Transcript from the "Memory" Lesson

[00:00:00]
>> We spent the last section talking about binary and hex and low-level things. But honestly, none of that matters on its own, because computers are stateless. They don't have a concept of "this thing is here, that thing is there." All a computer does is process instructions in real time, because we're manipulating tiny, tiny, tiny amounts of electricity.

[00:00:27] Once the computer is turned off, all of that is gone. The computer has no concept of where it was, where it wants to be, or who it wants to grow up to be in the future. It has none of that. All right, so we have this concept called memory.

[00:00:41] That is the ability to store these ones and zeros in a state that can be accessed either by the CPU or the operating system. And we can store those long term via things like SSDs, solid state drives, or short term in memory known as RAM, random access memory.

[00:01:01] And memory is everywhere on a computer. There are caches at different levels, L1 cache, L2 cache, and these are caches of information on the CPU itself. So it's memory built into the CPU so that it can access data really, really quickly. Actually, the L1 cache is the fastest memory access we can get on a computer.

[00:01:23] And as we go out to bigger, more distant memory, so if we go out to RAM, our hard drives, tape drives, things like that, access takes much, much longer. So if you've shopped for a computer, you might see that it has, I don't know, a one megabyte L1 cache.

[00:01:39] I think that's a pretty small cache. But a 1 or 2 megabyte L1 cache just means it has space to store a million or two bytes of information. And we actually want the caches to be as large as possible, because that's the fastest way of accessing data.

[00:01:57] And I know, for a high level language like JavaScript, we generally don't spend too much time thinking about memory. That's because it's a high level language, and to me, when I hear high level language, it means it covers up all of the ugly stuff, all the nasty parts we don't want to think about as programmers.

[00:02:16] For instance, when's the last time in JavaScript you've had to think about garbage collection? If you're a performance engineer or performance oriented, you probably do think about garbage collection. But most frontend engineers don't think about what's actually happening in memory. When I create a string, how is that being stored?

[00:02:38] We don't think about that because JavaScript does it for us. When we work with WebAssembly, which doesn't have garbage collection, we actually have to consider: I've got a number, let's say 21. I've got 21, and that means I need to allocate at least 8 bytes of storage. That means I need to tell the computer, hey, I want to reserve 8 bytes, and I want to start at this memory address so I can access it later.

[00:03:07] And then later, when we're done with that information, that number 21 we don't need anymore, we have to deallocate it. That's when we tell the computer, hey, actually you can free this up, this is now free space. And when you start thinking about that, you're like, man, if I had to do that for every string and every object and every property, it would get so tedious. And it would.
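
To make that concrete, here is a minimal JavaScript sketch, using the WebAssembly.Memory API, of what manually allocating and deallocating 8 bytes for the number 21 might look like. The allocate and deallocate helpers are hypothetical, just to illustrate the bookkeeping a language without garbage collection leaves to you:

    // WebAssembly linear memory: one page is 64 KiB of byte-sized "bins".
    const memory = new WebAssembly.Memory({ initial: 1 });
    const view = new DataView(memory.buffer);

    // Hypothetical bump allocator: hand out addresses, one after another.
    let nextFree = 0;
    function allocate(byteCount) {
      const address = nextFree; // the "pointer" we hand back
      nextFree += byteCount;    // reserve byteCount bins after it
      return address;
    }
    function deallocate(address, byteCount) {
      // A real allocator would track freed ranges for reuse; the point here
      // is only that freeing is something we must remember to do ourselves.
    }

    const ptr = allocate(8);                  // reserve at least 8 bytes for 21
    view.setBigInt64(ptr, 21n, true);         // store 21 at that address
    console.log(view.getBigInt64(ptr, true)); // 21n, read back via the pointer
    deallocate(ptr, 8);                       // tell the "warehouse" the bins are free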

[00:03:30] So we don't do that in JavaScript, but we're gonna do some of that today. If nothing else, after all this you will appreciate how much high level languages like JavaScript do for you as programmers. Now, the best way to think of memory, and I struggled with this one because there are so many analogies.

[00:03:53] I don't know if any of you are Star Trek fans, but I am a Star Trek: The Next Generation fan. There's an episode, it's super famous in Star Trek canon, called Darmok, about Darmok and Jalad. But anyway, it's about this entire society.

[00:04:11] Their main form of communication is telling stories and analogies, and those represent something larger. So I like to apply the same approach to hard-to-grasp concepts, and I'll tell the same kind of story. Think of memory as a giant warehouse full of storage bins.

[00:04:33] You could also think of it as a town full of houses or many other things, but I'll use the storage warehouse as my example. So in your mind, picture this massive, massive warehouse, and the warehouse is full of bins. The bins are generally a fixed size, so you can only store so much in a given bin, and we can put anything in there.

[00:04:53] We can put as big a jumble of ones and zeros into that bin as the bin will fit. And oftentimes, actually most of the time, you're probably gonna need multiple bins. So you'll have an address that starts, say, here, and as we move down the aisle it fills up more and more bins.

[00:05:11] It takes more and more bins to store more and more information. That's the way to think of memory: just this giant storage system, with different pointers to different addresses. A pointer is just an address, the location of that bin within the warehouse. I hope that analogy makes sense, because if we started talking about what memory actually does at a lower level, it wouldn't be as meaningful to you.
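
As a small illustration of that idea, here is a JavaScript sketch, again assuming the WebAssembly.Memory API, where a few bytes of data spill across consecutive "bins" and a pointer plus a length is all we keep in order to find them again:

    const memory = new WebAssembly.Memory({ initial: 1 });
    const bins = new Uint8Array(memory.buffer);     // each element is one byte-sized bin

    const data = new TextEncoder().encode("hello"); // 5 bytes of ones and zeros
    const pointer = 32;                             // the address of the first bin we use
    bins.set(data, pointer);                        // the data fills 5 consecutive bins

    // To read it back, all we need is the pointer and how long the data is.
    const slice = bins.subarray(pointer, pointer + data.length);
    console.log(new TextDecoder().decode(slice));   // "hello"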

[00:05:37] So where does the operating system come into all this? The operating system is the foreperson of the warehouse. The operating system is the one who decides, hey, this process here, you need 10 bins' worth of memory? Sure, you can have these, here's your address now.

[00:05:56] It's your mailing address, I know where to route all the mail you get, it's gonna go to these bins. The operating system's job, its primary job, is allocating memory and deallocating memory, deciding who gets what and who goes where. And it keeps track of all of this in its internal state.

[00:06:16] So you're probably thinking, why is that important? Well, there are many reasons why we have to be very, very careful about memory. For instance, let's say I have a process. I'm gonna use JavaScript; it's probably not the right example because it's so high level, but I'll use JavaScript as an example.

[00:06:35] I have a JavaScript process that is gonna require about 10 megabytes of memory; the reason doesn't really matter. It has a heap, it has an execution stack, it's gonna need 10 megabytes of memory. And let's say I have another process, say I'm playing Spotify on my computer.

[00:06:55] Spotify is also gonna need memory to run and store its internal state, things like that. What would happen if they were both writing to the same addresses, the same memory? You'd get memory corruption, and the entire computer could crash, because instead of these organized series of ones and zeros, it would just be gibberish ones and zeros.

[00:07:13] So the order matters, the instructions matter, and having a good, stable, reliable operating system honestly matters. It's another bit of magic that we take for granted in programming, that all these things are managed for us. We don't have to think about them manually.
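
Here's a toy JavaScript demonstration, not from the lesson, of that corruption scenario: two "processes" share the same buffer, their writes overlap, and the first value turns to gibberish.

    const shared = new ArrayBuffer(8);
    const processA = new DataView(shared);
    const processB = new DataView(shared);

    processA.setInt32(0, 1234567890, true); // A stores its number in bytes 0-3
    processB.setInt32(2, 42, true);         // B writes bytes 2-5, clobbering part of A's value

    console.log(processA.getInt32(0, true)); // no longer 1234567890, just gibberish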

[00:07:33] And I said the word pointer; again, think of a pointer as an address. It's not the actual data itself, it's just the start of the data, and we know exactly how long that data is. So we're gonna read from, say, index zero, it's not really an index, but we'll start at zero, and we're gonna read 32 bits of information and pull that out.

[00:07:54] It is critical, and I do mean critical, that we understand the size of the information and its type, because there are different types of numbers and they are represented differently in binary. We'll cover that in the next lesson on numeric types. So, are there any questions about memory? Because memory is a foundational part of computing that a lot of us just forget is there, because it's kind of magical at a high level.
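
As a final sketch, again assuming the WebAssembly.Memory API: the same 32 bits starting at a pointer read back very differently depending on the type we assume, which is why both the size and the type matter.

    const memory = new WebAssembly.Memory({ initial: 1 });
    const view = new DataView(memory.buffer);

    const pointer = 0;
    view.setFloat32(pointer, 3.14, true);        // store 32 bits as a float

    console.log(view.getFloat32(pointer, true)); // ~3.14, the bits read as an f32
    console.log(view.getInt32(pointer, true));   // 1078523331, the same bits read as an i32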