Practical Problem Solving with Algorithms

Algorithms & Techniques Overview

Kyle Simpson

You Don't Know JS

The "Algorithms & Techniques Overview" Lesson is part of the full, Practical Problem Solving with Algorithms course featured in this preview video. Here's what you'd learn in this lesson:

Kyle reviews common algorithms found in computer science. These include BubbleSort, QuickSort, Tree Traversals, Path Finding, and Binary Search. Techniques like Iteration, Recursion, Indexing, and Referencing are also discussed in this lesson.

Transcript from the "Algorithms & Techniques Overview" Lesson

[00:00:00]
>> So let's talk about some common algorithms. Not all of these are ones that we're gonna tackle. Again, these are just things that you'll hear, that you'll Google, that you'll find. There are sorting algorithms. A couple of the most famous ones are BubbleSort and QuickSort, and there are dozens of others.

[00:00:18]
There's all kinds of general and specialized sorts for different conditions. I'm gonna be straight up honest with you, I don't care about sorting at all. It's the least interesting DSA topic to me. People have figured sorting out; they've written volumes of information about it. And the most I've ever cared about a sorting algorithm is that I love the little fancy animations that show how it works conceptually.

[00:00:46]
That's about as much as I'm gonna care, so we're not gonna deal with sorting in this course. Sorting is one of those key topics commonly covered in a DSA course, but I'm just gonna skip it cuz I think it's not as interesting. If I need to sort something, I'm pretty likely to just use the built-in JavaScript sort.
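
For reference, leaning on the built-in sort looks something like this (a minimal sketch of mine, not course code); the comparator is what makes numeric sorting work:

```js
var nums = [10, 1, 5, 2];

// copy first so the original isn't mutated; sort() sorts in place
var ascending = nums.slice().sort(function compare(a, b) {
  return a - b;   // negative: a before b, positive: b before a, 0: keep order
});

console.log(ascending);              // [1, 2, 5, 10]
console.log([10, 1, 5, 2].sort());   // [1, 10, 2, 5] -- default sort compares values as strings
```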

[00:01:07]
And if that's somehow, for some reason, not gonna work, then I'm just gonna grab a QuickSort implementation, most likely. I'm not gonna get much deeper than that in my own coding. But your mileage may vary, your mileage will vary on that. All right, so setting aside sorts, there are things that I find much more interesting, and that I seem to come across a lot more often in the kinds of work that I tackle.

[00:01:34]
Traversing things like trees and graphs. And when we get into graphs, we have this interesting concept called path finding, which is: I have this multi-connected set of different elements, which we'll call nodes in a graph, and they may not actually have direct connections to each other.

[00:01:53]
So there are various different circuitous routes that I could take to go from one node in the graph to another node in the graph. And there are different reasons why I might wanna take one route versus another. For example, you might have what's called a weighted graph, meaning that the connection between node A and B costs more, for some definition of cost, than the connection from A to C.

[00:02:15]
And it might be that a longer route, meaning more hops, costs less than the more direct route. So we can think about a variety of different real-world metaphors for this, like toll bridges and things like that. There's a more circuitous route that costs less but takes you longer. There's the more direct route, you get there quicker, but now you've gotta pay a higher toll.
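
To give that a concrete shape, here's one common way to represent a small weighted graph, an adjacency list where each edge records a cost (this sketch and its numbers are mine, not from the course):

```js
// A tiny weighted, undirected graph as an adjacency list.
// Each node maps to a list of { node, cost } edges.
const graph = {
  A: [ { node: "B", cost: 10 }, { node: "C", cost: 2 } ],
  B: [ { node: "A", cost: 10 }, { node: "D", cost: 1 } ],
  C: [ { node: "A", cost: 2 },  { node: "D", cost: 5 } ],
  D: [ { node: "B", cost: 1 },  { node: "C", cost: 5 } ]
};

// the "direct" route A -> B costs 10 (the expensive toll bridge),
// while the longer A -> C -> D -> B route costs 2 + 5 + 1 = 8:
// more hops, lower total cost
```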

[00:02:38]
By the way, I talk about toll roads and toll bridges. I used to think that must be one of those universal cultural references, and it's not. I've gone places in the world where people are like, you have to pay to use a road? It's completely foreign to them.

[00:02:54]
I grew up in the land of toll roads. I happily pay that as one of my little life's luxury taxes. I don't worry about toll roads, but for some people, that's a foreign concept, so. Maybe you need a different metaphor there. Insert your own metaphor for having something that you have to pay more to get across or something, I don't know.

[00:03:14]
Anyway, so we have algorithms that allow us to go through a tree. And again, remember trees are ordered, so the order that we go through the tree is a big deal sometimes. Sometimes we wanna just visit every node in the tree and it doesn't matter what order we visit them in.

[00:03:32]
But many times, and I would say maybe even most of the time, if we're gonna go through a tree, the order matters. The order matters, perhaps, because we're processing all of the elements in the tree, and the operations that we're processing with are not commutative, so the order of the operations matters.
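
As a rough illustration of ordered traversal, here's a minimal depth-first, pre-order walk, assuming a simple { value, children } node shape (the shape and code are my sketch, not the workshop's):

```js
// assumed node shape: { value, children: [ ...nodes ] }
const sampleTree = {
  value: 1,
  children: [
    { value: 2, children: [] },
    { value: 3, children: [ { value: 4, children: [] } ] }
  ]
};

// depth-first, pre-order: visit the node first, then each child left to right
function visitPreOrder(node, visit) {
  visit(node.value);
  for (const child of node.children) {
    visitPreOrder(child, visit);
  }
}

visitPreOrder(sampleTree, v => console.log(v));   // 1, 2, 3, 4 -- a fixed, deterministic order
```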

[00:03:49]
Sometimes it's that we're traversing through a tree and we need to stop early. So obviously, the order matters, because if we go in the wrong order, we're not gonna stop as early as we wanted to. So those are things to think about. And then I threw up here binary search.

[00:04:04]
Similar to sorting, search algorithms, to me, seem pretty straightforward, and again, I don't end up implementing search algorithms in the classical sense very often. But the concept of binary search, we'll get into this in the next slide, the concept of binary search being recursive, that's pretty important. So it's important to be able to understand the underlying principle of something like binary search, meaning dividing a problem into multiple pieces.

[00:04:34]
In the binary world, that's dividing into two pieces: divide it in half, only work with half the problem, and then divide that in half again. But there are higher-order divisions where you divide it into three or four pieces or whatever. Binary is just the most common, and it means divided in half.
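
Here's a minimal recursive binary search sketch (my own, assuming a sorted array of numbers), just to show the divide-in-half-and-recurse shape being described:

```js
// assumes `arr` is already sorted ascending
function binarySearch(arr, target, low = 0, high = arr.length - 1) {
  if (low > high) return -1;               // empty range: not found

  const mid = Math.floor((low + high) / 2);

  if (arr[mid] === target) return mid;     // found it
  if (arr[mid] < target) {
    return binarySearch(arr, target, mid + 1, high);   // only work with the right half
  }
  return binarySearch(arr, target, low, mid - 1);      // only work with the left half
}

console.log(binarySearch([1, 3, 5, 8, 13, 21], 8));    // 3
console.log(binarySearch([1, 3, 5, 8, 13, 21], 4));    // -1
```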

[00:04:49]
Divide and conquer would be sort of a general term for that. And finally, a few of the techniques, these are things that we'll go through throughout the workshop. These are common techniques that you end up using to create your algorithms. When you implement algorithms, these are some of the LEGO pieces that we put together.

[00:05:08]
Iteration, pretty straightforward things like for loops and while loops. We're going to iterate, in order, for a certain number of times or perhaps just an arbitrary amount of time. When you have a while-true loop, that's just gonna keep going until you've exhausted the entire space. So that's iteration.
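
For completeness, both shapes of iteration in a small sketch of mine: a bounded for loop, and an open-ended loop that runs until the work is exhausted:

```js
const items = ["a", "b", "c"];

// bounded iteration: we know how many steps up front
for (let i = 0; i < items.length; i++) {
  console.log(items[i]);
}

// open-ended iteration: keep going until the space is exhausted
const work = ["a", "b", "c"];
while (true) {
  if (work.length === 0) break;   // nothing left, stop
  console.log(work.shift());
}
```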

[00:05:28]
Recursion is one of those terms, I heard somebody earlier talk about regular expressions, those are things that we can sometimes be kinda frightened about. With recursion, it's similar. There are a few other topics in computer science that people feel are taboo or difficult. Closure is one of those that many people feel like you have to have this special magical enlightenment to be able to understand.

[00:05:58]
We are gonna use recursion a lot. Depending on your level of comfort with recursion, some of that may feel a bit uncomfortable. But you absolutely cannot make any progress in this DSA topic, data structures and algorithms, without getting comfortable with it. So whereas somebody might be able to tell you, you can get away with never using a regular expression, I don't necessarily agree with that.

[00:06:21]
But you might have somebody say, you don't need a regular expression, there's other ways to solve those problems, or at least, abstract those problems. While that might be true of regular expressions, you can't get anywhere [LAUGH] in this topic without getting real comfortable with recursion. So I think it's just one of those bullets that we need to bite.

[00:06:43]
We need to tackle it, so we're gonna use it quite a bit. Put very simply, recursion means that a function ends up calling itself, either directly or indirectly. So it's a cycle of function calls where, while the function is still running, there's another invocation on the call stack, there's that word stack that we referred to.

[00:07:04]
So you think about a function call, and then when it calls another function while it's still running, then it adds to what we call the call stack, right? So you have these functions that call each other. That's not the same thing as calling a function, it finishing, and then calling another function and it finishing.

[00:07:19]
It's: call function a, which then, while it's running, invokes function b, which, while it's running, invokes function c. That's what we mean by the call stack, that's stacking up. And if a calls b and b calls a, then you would have a stack, you'd have a, b, a.

[00:07:39]
Or if a calls b, b calls c, and c calls a, now you have a cycle. Either one of those is what we call recursion. And you might be thinking, why on earth would you have a calling a, or a calling b which calls a, why would you have direct or indirect recursion?
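
To picture those two shapes before the answer, here's a tiny sketch of my own showing direct and indirect recursion:

```js
// direct recursion: countdown calls itself
function countdown(n) {
  if (n <= 0) return;
  console.log(n);
  countdown(n - 1);                          // a calls a
}

// indirect (mutual) recursion: isEven and isOdd call each other
function isEven(n) {
  return n === 0 ? true : isOdd(n - 1);      // a calls b...
}
function isOdd(n) {
  return n === 0 ? false : isEven(n - 1);    // ...and b calls a
}

countdown(3);             // 3, 2, 1
console.log(isEven(10));  // true
```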

[00:07:55]
Why would you allow that sort of thing to happen? The answer is that the call stack is an abstraction over iteration with state management. Every time we invoke a new function, we get a new set of state that is encapsulated in that function call. And therefore, we have this stack where we're doing things, we're stacking up work, a, b, c, d, however deep it goes, could go 10,000 deep, right?

[00:08:26]
Each one of those individual function calls has its own state. And what we're asking the engine to do, the JavaScript engine, in this case, is manage that state for us in a stack, where it pushes it on and pops it off. The entire world of that function gets pushed onto the stack, and then when it's done, it pops off.

[00:08:44]
I don't wanna worry about all of that stuff. Well, what if I did worry about it? I absolutely could take any recursive algorithm and implement it with iteration, with a while loop. And what I would have to do is take every bit of state that I need for each one of those levels and manage it in my own stack.

[00:09:02]
Can I do that? Yes, sometimes I do do that. Something that we would think of recursively, you can implement iteratively, and vice versa. They are, to use a formal mathematical term, isomorphic to each other. Any recursion can be expressed iteratively, and in fact, most compilers for most languages unroll these sorts of things.

[00:09:26]
They will unroll recursion and make it a loop, or sometimes vice versa, because the reverse is also true, or at least generally true. So you have to be real familiar with iteration. And most of us that have done any sort of imperative programming are familiar with for loops and while loops and so forth.

[00:09:42]
Maybe less familiar with recursion, but you just need to understand that that's asking the engine to do our iteration for us. It's an abstraction over iteration, that says, I'm gonna need this state to be separate for each of these iterations, and I want you to manage that. I'm gonna do that through the call stack rather than through my own data structure.
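
Here's roughly what that trade looks like, a sketch of mine (not the workshop's code): the same sum over nested arrays, once recursively, letting the call stack hold the state, and once iteratively, managing an explicit stack:

```js
// assumed shape: nested arrays of numbers, e.g. [1, [2, [3]], 4]

// recursive: the engine's call stack tracks where we are at each level
function sumRecursive(list) {
  let total = 0;
  for (const item of list) {
    total += Array.isArray(item) ? sumRecursive(item) : item;
  }
  return total;
}

// iterative: we manage the pending work in our own explicit stack
function sumIterative(list) {
  let total = 0;
  const stack = [...list];
  while (stack.length > 0) {
    const item = stack.pop();
    if (Array.isArray(item)) {
      stack.push(...item);   // defer the nested items to later iterations
    } else {
      total += item;
    }
  }
  return total;
}

console.log(sumRecursive([1, [2, [3]], 4]));   // 10
console.log(sumIterative([1, [2, [3]], 4]));   // 10
```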

[00:10:02]
In fact, we have examples of places in the modern framework world where the call stack, which we think of as a built-in thing that we just invoke and use, turns out not to be quite malleable enough for some frameworks. And they have re-implemented the entire concept of a call stack in the framework, so that they have control.

[00:10:27]
I'm referring to React and Fiber and non-local continuations and stuff, but you can get real deep with this. So anything that you think of as built in, you could re-implement yourself. You could implement memory management yourself if you wanted. You shouldn't, but you could, all right? So recursion is just a dual of, or isomorphic to, iteration.

[00:10:47]
And sometimes it's more convenient to invoke recursion than it is to do it with iteration. In fact, we'll see an example where we're going through a tree. And you typically think of that recursively, because trees are typically referred to as a recursively-defined data structure, because every node is its own subtree.

[00:11:06]
That's what we mean by recursively-defined data structure. Because trees are recursive data structures, recursion is a very natural operation to use to traverse them. But that's not always true. The breadth-first traversal, again, we'll come back to this later, but the breadth-first traversal of a tree is much more complicated to represent recursively.

[00:11:28]
In fact, most people don't implement it recursively. You could, but most people don't. They implement it with an iterative loop and a queue. An iterative loop plus a queue is a breadth-first traversal of a tree. We'll come back to some of that stuff, so if that feels unfamiliar to you, don't worry, we're coming back to it.
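
Since that's stated so directly, here's a minimal sketch of it (mine, reusing the assumed { value, children } node shape): an iterative loop plus a queue gives you breadth-first, level-by-level order:

```js
// breadth-first traversal: an iterative loop plus a queue
function visitBreadthFirst(root, visit) {
  const queue = [root];               // the queue starts with the root
  while (queue.length > 0) {
    const node = queue.shift();       // dequeue from the front...
    visit(node.value);
    queue.push(...node.children);     // ...enqueue children at the back
  }
}

const tree = {
  value: 1,
  children: [
    { value: 2, children: [ { value: 4, children: [] } ] },
    { value: 3, children: [] }
  ]
};

visitBreadthFirst(tree, v => console.log(v));   // 1, 2, 3, 4 -- level by level
```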

[00:11:47]
I'm just giving you a sense of how these pieces can fit together. Okay, the last thing I have listed up here is indexes and references. I've never really come up with a good term to describe what I mean here, but I'm using indexes in the database sense. When I talk about this concept of an index, you might have a main data structure, let's say, for example, an array.

[00:12:10]
It's a sequentially-ordered collection of values, doesn't matter what the value types are, just sequentially-ordered. But you might want a separate data structure alongside that array that tells you something about some or all of the elements in there. Here's an example.

[00:12:33]
I need quick access to all of the elements in the array whose value starts with the letter a. And that's literally like what databases do. When they create indexes, they have this big old table of all your data. And then they create these separate data structures that say, if you're gonna look for all the things that start with a, if you're gonna look for all the IDs that are even or whatever, we maintain this separate list, this separate data structure.

[00:12:55]
That's not a copy of the data, it's pointers, it's references into the data. What's extremely common in data structure and algorithm creation is that we construct the main data structure, and rather than duplicating the data, we construct these other data structures that have references into those places. And you can access it sequentially and by name, if you will.

[00:13:23]
Those are generally at odds, but we can have two parallel data structures, one with references to the other. That's what I mean here, that's another extremely common technique. So these are things that we will see play out for us.
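
As one last concrete sketch (mine, with hypothetical names), here's that idea in miniature: a main array plus a separate "index" that stores positions into it rather than copies of the data:

```js
// the main data structure: a sequentially-ordered array of values
const names = ["alice", "bob", "ada", "carol", "aaron"];

// a separate "index" (in the database sense): for each first letter,
// keep the *positions* of matching entries -- references, not copies
const byFirstLetter = {};
for (let i = 0; i < names.length; i++) {
  const letter = names[i][0];
  (byFirstLetter[letter] = byFirstLetter[letter] || []).push(i);
}

// quick access to everything starting with "a", without rescanning the array
console.log(byFirstLetter["a"].map(i => names[i]));   // ["alice", "ada", "aaron"]
```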
