The "Node Streams Q&A" Lesson is part of the full, The Hard Parts of Servers & Node.js course featured in this preview video. Here's what you'd learn in this lesson:

After discussing the callback queue and event loop, Will fields questions about sharing the queue, the final batch of data, buffers, queue timing, and handling incomplete data.


Transcript from the "Node Streams Q&A" Lesson

[00:00:00]
>> Will Sentance: This is a little bit of a bonus for people. What's most important here, and what you want to take away from this, is that we can break down any inbound flow of data into chunks, and on each chunk, run a function to handle it in that batch.

[00:00:16] And then, while we're doing code up here in JavaScript, our single thread of JavaScript is running away. Good to go. In the background we've got a thread spun up by libuv that's pulling in the next data. It's definitely multithreaded, just not in the same runtime. JavaScript's got one thread, and libuv spun up a background thread.
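To make the pattern concrete, here is a minimal sketch of breaking an inbound file into chunks and running a function on each batch. This isn't the course's exact code, and tweets.json is just a placeholder file name:

```js
const fs = require('fs');

// libuv pulls the file in from disk in the background, batch by batch;
// our single JavaScript thread only runs this function when a chunk is ready.
const readStream = fs.createReadStream('./tweets.json'); // placeholder file name

readStream.on('data', (chunk) => {
  // chunk is one batch of the inbound data
  console.log('Got a batch of', chunk.length, 'bytes');
});
```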

[00:00:39]
>> Will Sentance: Okay, so let's have thumbs on this, cuz it's getting pretty advanced at this point. Thumbs up, you're clear; thumbs down, you lost me; sideways, clarification. William's clear. Sam, you're clear. Michael's got a clarification, Sarah Rose is clear. Virginia has got a clarification, Matt's clear. Okay, so we've got Charlie.

[00:01:00]
>> Off screen male: All the events, they share the same callback queue?
>> Will Sentance: All the functions associated with events, you mean, all the associated functions,
>> Will Sentance: Unfortunately.
>> Will Sentance: So all functions that have been set to autorun by Node do not share the same callback queue. That's what we're gonna see. I'm just thinking if all the event-driven ones do.

[00:01:25] The I/O ones do, I think. Do timers trigger events, do they?
>> Will Sentance: That's a really interesting question. More importantly, not all functions set to autorun share the same queue. Any ones that involve input/output, so I/O, that is, ones which do trigger events, they all do share the same queue.

[00:01:48] My question just to myself is, are there some autorun functions that do not have events that trigger that autorun, and those ones we're going to see in a moment. We'll come to it in a second, Mark, if you don't mind. Let's just do the ones here. Matt? Roman, okay fine, go ahead.

[00:02:05]
>> Off screen male: Once it's finished reading the file, can it send out a new response, kind of a message that it's done? Like send-
>> Will Sentance: What a great question! Absolutely it does! Anyone know what the flushed-out message is? Cuz we're going to need that, right? Because there's something we're going to run at that point.

[00:02:24] The JSON parse, presumably. Yeah, it's going to auto-close it and it's going to shout out closed. It's closed, isn't it? At that point we'd go and look at the docs to see exactly which event it flushes out. I believe it's close. And at that moment it's going to, we've hopefully set up a function on the close event to handle the data.

[00:02:45] Is it closed?
>> Off screen male: It is.
>> Will Sentance: Good job. Yeah, yeah exactly.
>> [INAUDIBLE]
>> Will Sentance: Yeah, on close, exactly. On the close event, run a function which in this case would, I presume, take all the stringified data, [INAUDIBLE] JSON, and turn it into an object at that moment, once you've got it all recompiled together.

[00:03:04] Yeah, great question. Thank you, Roman.
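A rough sketch of that finished-reading message: fs read streams flush out 'end' when the data is done and then 'close' when the file is closed. Again, tweets.json is just a placeholder file name:

```js
const fs = require('fs');
const readStream = fs.createReadStream('./tweets.json'); // placeholder file name

// Once every chunk has been pulled in, the stream shouts out that it's done;
// that's the moment it becomes safe to JSON.parse the recompiled string.
readStream.on('close', () => {
  console.log('All data in — run JSON.parse on the assembled string now');
});
```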
>> Off screen male: It'll always deal with a string? Whatever file we give it to read, or?
>> Will Sentance: Again, I've been a little bit cheeky here, but this doesn't come in as a string. It technically comes in as what's called a buffer, which is a space in memory which you fill with zeros and ones that represent any data type.

[00:03:24] And once you grab the first bit out of it, it gets filled in with more zeros and ones representing the next batch.
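As a small illustration of the buffer point (made-up data, not the course's code), a buffer holds raw bytes that only become text once you decode them:

```js
// A Buffer is a space in memory filled with zeros and ones (bytes), not text
const buf = Buffer.from('{"tweets": ["hi"'); // an incomplete slice of JSON, for illustration

console.log(buf);            // <Buffer 7b 22 74 77 65 65 74 73 ...> — raw bytes
console.log(buf.toString()); // '{"tweets": ["hi"' — the same bytes decoded as text
```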
>> Off screen male: So my question is since we lose-
>> Will Sentance: That's a buffer data type, something to note.
>> Off screen male: Since we were reading a JSON, right? So it started reading from the beginning, but it didn't close the file, because it just took a chunk of data from it.

[00:03:42] How does it know that it's a complete object?
>> Will Sentance: This is what I said. We're gonna hold off on the parsing of the JSON until we've done all the cleaning. So you can imagine what we do is we can turn the buffer into a string. The buffer is just zeros and ones, 64,000 zeros and ones.

[00:04:00] 64,000, sorry, bytes, which is 64,000 characters. Turn that into a string, remove all the mean words, attach that to clean tweets. The next 64,000, do the same thing, attach it on the end of clean tweets, and so on, through hundreds and hundreds of 64-kilobyte batches. The whole thing is now one long stringified string of data, then we apply JSON parse on the close event, which is when all the data's in.
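Here's a hedged sketch of that whole flow. The file name, the mean-word list, and the cleanChunk helper are all made up for illustration, not the course's code:

```js
const fs = require('fs');

// Hypothetical word list and helper, standing in for "remove all the mean words"
const meanWords = ['rubbish', 'awful'];
const cleanChunk = (text) =>
  meanWords.reduce((soFar, word) => soFar.split(word).join(''), text);

let cleanTweets = ''; // grows by one cleaned ~64KB batch at a time

const readStream = fs.createReadStream('./tweets.json'); // placeholder file name

readStream.on('data', (buffer) => {
  const text = buffer.toString();   // ~64,000 bytes decoded into characters
  cleanTweets += cleanChunk(text);  // clean this batch and attach it on the end
});

readStream.on('close', () => {
  const tweets = JSON.parse(cleanTweets); // all the data's in, so parsing is now safe
  console.log(tweets);
});
```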

[00:04:27]
>> Off screen male: So, so that means the only operations that we can do are ones assuming it to be a string, right?
>> Will Sentance: We wouldn't be able to start parsing the data, because for the JSON object, sorry, the JSON-stringified JavaScript object, you really do need to have the opening curly brace and the closing one to be able to work with it.

[00:04:48] So you couldn't do things like working with the JSON particularly, yeah. Very fair, yeah. Other questions? Michael.
>> Off screen male: Yes, I do have a question. So the browser's connection to the server socket.
>> Will Sentance: Yes.
>> Off screen male: Times out after a while. If this takes 15 seconds, would that happen, and if so, does it make sense?

[00:05:11]
>> Will Sentance: I'm assuming you're doing this before any requests come in.
>> Off screen male: Okay.
>> Will Sentance: We're not grabbing in 1.5 gigabytes in response to any user's requests anytime soon.
>> Off screen male: Okay, but you could send a chunk.
>> Will Sentance: Yeah. [CROSSTALK] Yeah, yeah, absolutely. Video data, you can sort of send pieces, although HTTP has ways of working with video data that this doesn't necessarily imply.
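A sketch of sending a big file out in pieces rather than all at once. bigfile.mp4 and the port are placeholders, and as Will notes, real video delivery usually relies on dedicated HTTP features (e.g. range requests) beyond this:

```js
const http = require('http');
const fs = require('fs');

http.createServer((req, res) => {
  // Stream the file to the browser chunk by chunk instead of loading it all first
  const readStream = fs.createReadStream('./bigfile.mp4'); // placeholder file name
  readStream.pipe(res); // each chunk is written to the response as it arrives
}).listen(3000);
```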

[00:05:33]
>> Off screen male: Is there anything that can delay adding a datastream to the callback queue so that things get out of sync?
>> Will Sentance: Wow, that was much more advanced. Very intrigued by that.
>> Will Sentance: Here's the thing. Yes, because it is a single thread, a single-threaded language, including Node passing stuff into JavaScript.

[00:06:00] Meaning, without going into too much detail until the final slide, meaning that that function is only actually literally put on the callback queue when, well, things are put on the callback queue at a certain moment. The moment being when there's a break from synchronous code running. But it could never run anyway until the synchronous code is run.

[00:06:34] Never run anyway until the synchronous code is finished running. So who cares? That function could never run anyway until all the synchronous code has finished running. So when it gets added here doesn't matter, as long as it's added in the order of completion, and libuv handles logging the order of completion of functions being ready to be triggered.

[00:06:55] So actually, yes, technically, although we don't need to worry about it in my opinion, it's technically added to the callback queue only when the event loop does a cycle and there's no synchronous code.
>> Will Sentance: We just don't need to care, because they were never gonna be running until all synchronous code was done anyway, so who cares when they are added; it's when they're executed that matters.

[00:07:17] And they're always added in the order of completion down here. They're always queued up, and then grabbed out in the order they were finished, or the order they were triggered, sorry, to autorun down here. We'll see more of this in a moment. So we're hitting the very limits of what we have in terms of information for the event loop and the callback queue.
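A tiny sketch of why the exact moment of being added doesn't matter: the callback can't run until the synchronous code has finished anyway (tweets.json is again a placeholder):

```js
const fs = require('fs');

fs.readFile('./tweets.json', (err, data) => {
  // Queued by libuv when the read completes, but it only runs once
  // all the synchronous code below has finished.
  console.log('file callback runs last');
});

for (let i = 0; i < 1e9; i++) {} // long-running synchronous work blocks the single thread

console.log('synchronous code done'); // always logs before the callback above
```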

[00:07:33] Final thing after the break will be a monstrous full event loop and we'll do as many of the queues as feels comfortable. And that'll be the very final code. Mark.
>> Off screen male: What would you do when a partial word has arrived?
>> Will Sentance: Yeah I mean, you could do a small thing, like if you spot a word that hasn't got a closing quote on it, you could make sure you just have it as a little cache, as a little separate string here.

[00:08:08] And then append it to the beginning of the next batch to analyze it as a full batch of words. That would be more of an implementation detail, how you ended up handling that. Okay, does everyone get that? The one thing I'm saying by that is if the stringified data came in and it cut off halfway, and that word was FU.

[00:08:25] It's the letter F, followed by that U, and you're like, this is great, my favorite two-letter word, you'd miss that actually there's a word to be cleaned there.
>> Will Sentance: How do you spot it? You'd make sure you always end on a space for example, or you always end on a closing quote.

[00:08:45] And this is just a parsing challenge that anybody faces when you're parsing data. You'd make sure it was done on a quote or the end of a string, and then go back to the previous space or previous quote and take that word and just cache, I mean like hold on to that word as an end-of-previous-string hangover, save it, and then append that to the next block, the next batch.
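A rough sketch of that end-of-previous-string hangover idea. The helper names and the word list are made up, and real code would pick its own safe break point (a space, a closing quote, and so on):

```js
// Hypothetical cleaning helper, standing in for "remove all the mean words"
const cleanChunk = (text) => text.split('rubbish').join('');

let hangover = ''; // possibly-partial word held over from the previous batch

const processBatch = (chunk) => {
  const text = hangover + chunk.toString(); // glue the leftover onto the front
  const lastSpace = text.lastIndexOf(' ');
  if (lastSpace === -1) {   // no safe break point — hold the whole batch for next time
    hangover = text;
    return '';
  }
  hangover = text.slice(lastSpace + 1);            // save the trailing, maybe-cut-off word
  return cleanChunk(text.slice(0, lastSpace + 1)); // clean only the complete words
};
```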