The "Security Ethics" Lesson is part of the full, Full Stack for Front-End Engineers, v2 course featured in this preview video. Here's what you'd learn in this lesson:

Jem discusses security ethics with the audience, including when and how to hold companies accountable for their bugs and security breaches.


Transcript from the "Security Ethics" Lesson

[00:00:00]
>> Jem Young: There is a team at Google, Project Zero, and if you ask me what my dream job would be if I were in security, that's it. Their job is solely to hack, to try to find zero days in software, and then they let the manufacturer know.

[00:00:18] And they do that as white hat hackers, doing good in the world, and it's really cool that that job exists. But one of the issues when it comes to security is, if I'm a security researcher and I say, hey, Frontend Masters, you have a security hole in your software.

[00:00:39] And I tell you about that, what responsibility do I have? Cuz essentially, I found a zero day. If I tell you and you don't do anything about it, what do I do? What should I do? I didn't think we were gonna talk about ethics today, but let's talk a little bit about ethics.

[00:00:54] So, what is the responsibility of companies? If I say, hey, I found a bug in your bank software or something like that, what do you think? Cuz this is a discussion, you know?
>> Speaker 2: Patch it, they should patch the defect.
>> Jem Young: But what if they don't?
>> Speaker 3: Give them a set amount of time, and then you kind of put them on blast on Twitter or something to pressure them.

[00:01:21]
>> Jem Young: Yeah, and now we get into a really ethically gray area, and that's why I spend so much time on security, cuz it's really interesting. Say I email Marc Grabanski, CEO of Frontend Masters, that there's a bug on frontendmasters.com. I emailed you 30 days ago and you didn't fix it yet.

[00:01:40] And that means if I found it, someone else can find it, and they could be exploiting it right now, so I think it's your responsibility to fix that. And if they don't, at that point I can say, hey, there's a bug on frontendmasters.com, and shame them publicly.

[00:01:54] Which is ethically dubious, cuz at that point I've just released this vulnerability to everybody. And if they still don't fix it, now the entire world knows that the hole exists. That's why security research is a really tricky area. Because what if it's a really complicated bug, and it takes them more than 30 days to fix it?

[00:02:13] And I still release it anyway. That's not fair to them, but they should fix the hole, too. Yes, Jeff.
>> Jeff: Well, I was gonna say, depending on the complexity of the vulnerability and how many people are potentially affected, as I understand it there's a different set of standards for how urgently you expect a company to respond.

[00:02:36] And the other thing is, after a certain amount of time, whether it's a month or six months, you can announce, hey, we've found a bug, without saying where it is or how to exploit it. Then six months or a year later, it's like, by the way, this is where we found the bug.

[00:02:53] And here are some GitHub repos, so that any script kiddie could compromise it, to show just how easy it is. I mean, it's a graduated response rather than just all or nothing.
>> Jem Young: Yeah, but even there it gets so ethically dubious because what if I say there's a bug with your login protocol.

[00:03:18] And I just announce that to the world, cuz they haven't fixed it in 30 days, 90 days. That's enough that people are gonna start hammering on it, and they're probably gonna find that vulnerability too. So even if you're trying to do the right thing, you can still get in trouble.

[00:03:32] There's no right answer to this. Google's Project Zero policy is to give the company 90 days. They'll tell you about the bug, and if you don't fix it in 90 days, they'll release it out into the world. It's kind of a naming and shaming type of thing.

[00:03:45] Which again, if the company doesn't respond, you've just released a bug out into the world that anybody can take advantage of. And especially when you start chaining these zero days together, you can blow up nuclear reactors, things like that. Yes?
>> Speaker 5: The scariest one for me is the medical device ones.

[00:04:01] Because at DEFCON they'll do a presentation: here's how you make this pacemaker kill someone, right? And some of the medical device companies don't take security seriously. So I don't know, that freaks me out anyways.
>> Jem Young: It should, you should all be freaked out by security. And when there are hacks and breaches, we should all collectively hold these companies more accountable.

[00:04:31] Sorry, this is a total tangent, but I care about these sorts of things. Equifax has your Social Security number and all of your credit history; essentially, I can figure out who you are just by how you spend your money. They got hacked, and the consequences were so minimal, for data that nobody asked them to collect and nobody asked them to monetize. Nothing happened.

[00:04:50] So honestly, I think it's up to us as engineers in the private sector to take these things more seriously and hold companies accountable. If you see your company has bad security practices, you should say something. See something, say something. [LAUGH] Anyways, that was a tangent on security.

[00:05:08] But what we've learned so far is: keep your software up to date. That's one of the easiest things you can do, and with unattended upgrades, it happens for you automatically.
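[Editor's note: as a quick refresher on that last point, here is a minimal sketch of enabling automatic updates with the unattended-upgrades package referenced above, assuming an Ubuntu or Debian server like the one set up earlier in the course.]

    # Install and enable unattended-upgrades (Ubuntu/Debian)
    sudo apt update
    sudo apt install unattended-upgrades
    sudo dpkg-reconfigure --priority=low unattended-upgrades

    # The resulting config at /etc/apt/apt.conf.d/20auto-upgrades
    # tells apt to refresh package lists and apply upgrades on a
    # daily schedule:
    #   APT::Periodic::Update-Package-Lists "1";
    #   APT::Periodic::Unattended-Upgrade "1";

With that in place, security patches for installed packages are applied without manual intervention, closing the most common window attackers exploit: known vulnerabilities in outdated software.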