Serverless with AWS Lambda

Wrapping Up "ServerLess with AWS"

Scott Moss


The "Wrapping Up "ServerLess with AWS"" Lesson is part of the full, Serverless with AWS Lambda course featured in this preview video. Here's what you'd learn in this lesson:

After taking questions from students about cold starts and "keeping Lambdas warm," API gateways, and more, Scott wraps up the Serverless with AWS course.

Transcript from the "Wrapping Up "Serverless with AWS"" Lesson

[00:00:00]
>> Scott: Any questions on serverless, Lambda, AWS, any of this crazy stuff I just mentioned, or anything in general?
>> Scott: Yes?
>> Speaker 2: You had mentioned keeping something warm earlier, how does one do that?
>> Scott: Yeah, how does one keep something warm? Okay, there's a couple of ways. So when I said keeping something warm, when you deploy a Lambda, basically the reason it's called serverless is because you don't have to maintain the infrastructure.

[00:00:34]
But there's still infrastructure there, and there's a container there. That container spins up every time your Lambda's accessed, and then it tends to stay around for some period of time. And it will stay around longer if you continue accessing that Lambda, although eventually it will recycle. So that's what's called keeping it warm, because that time from spinning up a new container to your Lambda actually running is called a cold start.

[00:00:57]
So you can keep it warm by just pinging your Lambda, and how you do that really depends on your Lambda. So if your Lambda is an API Lambda on API Gateway, you could probably just set up a route or a GET request to it, like some status page or something, right, so that will keep it warm.

[00:01:11]
If your Lambda is not exposed over HTTP and it's like some background stuff, then you can subscribe the Lambda to an interval, which is a type of event you can do. And on the interval you can just run the Lambda, which will then keep it warm. So there's many ways to do it, but at the end of the day, as long as you invoke a Lambda on some interval, that will increase the longevity of the container that it's in, which will decrease the chance of a cold start.
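As a concrete sketch of that interval approach, the same function can serve API Gateway traffic and bail out early when a CloudWatch Events schedule pings it. The handler name, response body, and the `aws.events` check below are illustrative assumptions, not code from the course:

```typescript
// handler.ts -- illustrative sketch: one Lambda that serves API Gateway
// requests but returns early when a CloudWatch Events schedule invokes it,
// which is the "ping on an interval" that keeps the container warm.
import {
  APIGatewayProxyEvent,
  APIGatewayProxyResult,
  ScheduledEvent,
} from "aws-lambda";

export const handler = async (
  event: APIGatewayProxyEvent | ScheduledEvent
): Promise<APIGatewayProxyResult | void> => {
  // Scheduled invocations come from CloudWatch Events ("aws.events"),
  // not from API Gateway, so we can skip the real work.
  if ("source" in event && event.source === "aws.events") {
    console.log("warm-up ping, skipping real work");
    return;
  }

  // Normal request path for callers coming through API Gateway.
  return {
    statusCode: 200,
    body: JSON.stringify({ message: "hello" }),
  };
};
```

With the Serverless Framework, the interval itself is just a `schedule` event (for example, a rate of a few minutes) attached to that same function.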

[00:01:38]
So that's what I mean by keeping it warm. But eventually, you're gonna run into a cold start, and when you do it's gonna suck, for real, especially if your Lambdas are big. We run into that problem a lot. We have five-second API calls sometimes, so it's kind of bad.

[00:01:52]
But also, that has a lot to do with our code, we had a really bad database infrastructure set up too, so it's a lot of things. It basically just compounds all the way down, so you kinda gotta have some clean code to take advantage of that. And for a while I wasn't convinced that Lambda would be a good approach yet for something that a client had to wait on.

[00:02:13]
Lambda's great for anything in the background, hands down. But then when it comes down to, all right, this is an API request somebody's gonna be waiting on, I wasn't convinced that it was ready. But now with things like Cloudflare Workers, FaunaDB, serverless databases, I'm convinced that actually you can get really good performance out of these things.

[00:02:30]
Even with a cold start penalty, because you can grab that performance back by having serverless databases and things on the edge. Whereas before, when it first came out, I mean you couldn't get an API call faster than 200 milliseconds, there was just nothing you could do about it.

[00:02:45]
So that's not the case anymore.
>> Speaker 3: What do people use, or what did you use as an alternative to avoid the cold starts before? Or, if you're gonna have a bigger application that you want always running, what's the alternative in AWS?
>> Scott: If you want a bigger application that's always running, just a traditional server, is what you're asking?

[00:03:09]
>> Speaker 3: Yeah, I guess so, you're just gonna host it on a traditional server at that point?
>> Scott: Yeah, you just put it on Beanstalk, or EC2, or you can look at Fargate, which is basically serverless containers. So AWS Fargate, I don't know where they come up with these names, it's like they have a hat, like Harry Potter.

[00:03:28]
You're like, Fargate, so yeah, this one is, it's like serverless containers basically. So this one will run containers for you on servers or clusters without you managing the underlying infrastructure, and it will only charge you when you use it. That would probably be the long-running server version of serverless, I guess.

[00:03:48]
>> Speaker 3: That's cool, that's what I was gonna ask, what I was getting at. Cool.
>> Speaker 4: This one doesn't work with Kubernetes, though, I don't think?
>> Scott: Probably not, I haven't really looked too much into it. Let me see.
>> Speaker 4: The last, if you scroll down, it's the last line down there. Right there, support for EKS, yeah.

[00:04:09]
>> Scott: Okay, yeah, there's another one too. I forgot what it's called, but it's in preview. And that one might be- I think that's EKS actually. Yeah, EKS, okay, yeah. Fargate support for EKS will be available in 2018. Yeah, it's coming.
>> Scott: Any other questions? As for API Gateway, cuz API Gateway's not that good.

[00:04:32]
Check out Kong, Kong's really dope. Not the dog toy.
>> [LAUGH]
>> Scott: [LAUGH] Get Kong, yeah. So this is, like, Mashape, they pivoted and they do Kong now. They have a paid enterprise plan, super expensive. So if you've got the money, look into it. But the open source version is actually really cool.

[00:04:50]
It's a Lua-based API gateway, based off Lua and NGINX, which is really amazing and fast. The open source version is really good, and there's tons of open source dashboards that tie into it. We used it for the longest, so we're probably gonna go back to it. Highly recommend using this over API Gateway, way better, way more extensible.
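For a rough idea of how the open source gateway gets configured, Kong is driven through its Admin API (port 8001 by default). The sketch below assumes Kong's Services/Routes Admin API and Node 18+ for the built-in fetch; the service name, upstream URL, and path are placeholders:

```typescript
// kong-setup.ts -- illustrative sketch of Kong's Admin API: register an
// upstream service, expose it on a route, and turn on key-auth for it.
const ADMIN = "http://localhost:8001";

async function post(path: string, body: Record<string, unknown>) {
  const res = await fetch(`${ADMIN}${path}`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(body),
  });
  if (!res.ok) throw new Error(`${path}: ${res.status} ${await res.text()}`);
  return res.json();
}

async function main() {
  // 1. The backend Kong should proxy to (placeholder name and URL).
  await post("/services", { name: "orders-api", url: "http://orders.internal:3000" });
  // 2. The public path that maps to that service.
  await post("/services/orders-api/routes", { paths: ["/orders"] });
  // 3. One of Kong's built-in authentication plugins, scoped to the service.
  await post("/services/orders-api/plugins", { name: "key-auth" });
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```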

[00:05:08]
Way better as far as authentication strategies go. You could do so many more things here, had no problems with it whatsoever. And it's free if you do the open source. But here, we got some questions here. What do you think of Dynogels with DynamoDB when using Lambda?

[00:05:25]
I've never heard of Dynogels, that sounds like a toothpaste. But let me see what that is. It's a DynamoDB mapper, it's like an ORM. I have no opinion on Dynogels, but I think if there's any abstraction around DynamoDB, it's probably a safe bet, because interacting with DynamoDB natively is horrible.
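For context on what "natively" means here, even the friendlier DocumentClient in the AWS SDK for JavaScript (v2) leaves you hand-writing parameter objects for every call, which is the layer a mapper like Dynogels wraps. The table and attribute names below are made up for illustration:

```typescript
// users-table.ts -- illustrative sketch of "native" DynamoDB access with the
// AWS SDK v2 DocumentClient; a mapper/ORM would sit on top of calls like these.
import { DynamoDB } from "aws-sdk";

const db = new DynamoDB.DocumentClient();

export async function saveUser(id: string, email: string): Promise<void> {
  await db
    .put({
      TableName: "users", // placeholder table name
      Item: { id, email, createdAt: Date.now() },
    })
    .promise();
}

export async function getUser(id: string) {
  const result = await db.get({ TableName: "users", Key: { id } }).promise();
  return result.Item;
}
```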

[00:05:48]
So yeah, I would say, go for it. And using it with a Lambda? Yeah, of course, definitely. Would you recommend using DynamoDB, any disadvantages? I would recommend using DynamoDB. I think it's the best bet you could use for AWS right now in your Lambdas. As long as you don't plan on doing things like analytics on your database or anything like that, but as far as feature set goes, yeah, it's pretty straightforward.

[00:06:12]
It's gonna give you the best support for Lambda, you're gonna be able to subscribe to different events. You're gonna be able to do a lot of different things, but you're gonna be stuck in the Amazon ecosystem. So you won't be able to benefit from tools that only work with SQL, like some type of business analytics dashboard, unless you pipe it to some other ETL and put it through there, or you track it with Segment.
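The "subscribe to different events" part typically means DynamoDB Streams: attach a Lambda to the table's stream and it gets invoked with the change records. A minimal sketch, with the handler name and logging as illustrative assumptions:

```typescript
// stream-handler.ts -- illustrative sketch: a Lambda subscribed to a table's
// DynamoDB stream, reacting to newly inserted items.
import { DynamoDBStreamEvent } from "aws-lambda";

export const handler = async (event: DynamoDBStreamEvent): Promise<void> => {
  for (const record of event.Records) {
    // eventName is "INSERT", "MODIFY", or "REMOVE".
    if (record.eventName === "INSERT") {
      // NewImage is in raw DynamoDB attribute-value format ({ S: "..." }, etc.).
      console.log("new item:", JSON.stringify(record.dynamodb?.NewImage));
    }
  }
};
```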

[00:06:31]
So there's some things there, but that's kinda true with any non-SQL database. So I would say, yeah, definitely at least give it a shot, especially if you don't plan on leaving AWS anytime soon. But if you're like, we might go to DigitalOcean or something later on, then don't do it, because it's only for AWS.

[00:06:47]
What's your opinion about CI/CD options, like CodePipeline, CodeBuild? I don't use any of those CIs that AWS has. I think they're just, I don't know, like I don't like things that don't look good, and they don't look good to me, I'll just say that. So I've been using Travis and Circle lately, Travis has been pretty good.

[00:07:07]
So Jenkins is kind of old school for me, I don't really like the way it looks either, I'm kind of picky on things that look good.
>> Speaker 3: Do you use CodePipeline?
>> Scott: They use CodePipeline, so there you go. It probably works good, I just don't like the way it looks.

[00:07:18]
And I should, I got free credits, maybe I should check it out but I don't know.
>> Scott: Okay, I think that's it, any other questions? Last call for some questions. All right, thanks for coming. Everybody is well equipped to do some serverless now. If there's any other questions, feel free to email me or reach out on Twitter. But thank you.

[00:07:45]
[APPLAUSE]
