Security Nation, S3 E4

How Entrepreneur Christian Wentz Takes On Identity Authentication and Data Integrity One Line of Code at a Time

September 25, 2020

 

In our latest episode of Security Nation, we are joined by Christian Wentz, CEO, CTO, and founder of Gradient. From an electrical-engineering-applied-to-neuroscience background to a present spent protecting privacy and data, we discuss what it’s like to thread the needle between internet profitability and end-user privacy. There’s technology, there’s politics, there’s policy, and there’s Tod getting very excited about code.

Stick around for our Rapid Rundown, where Tod talks through CVE-2020-1472, a CVSS-10 privilege escalation vulnerability in Microsoft’s Netlogon authentication process that the paper's authors christened “Zerologon.”

Appears on This Episode

Jen Ellis
Vice President, Community and Public Affairs

Jen Ellis is the vice president of community and public affairs at Rapid7. Jen’s primary focus is on creating positive social change to advance security for all. She believes that it is critical to build productive collaboration between those in the security community and those operating outside it, and to this end, she works extensively with security researchers, technology providers, operators, influencers, and various government entities to help them understand and address cybersecurity challenges. She believes effective collaboration is our only path forward to reducing cyber attacks and protecting consumers and businesses. She has testified before Congress and spoken at a number of security industry events including SXSW, RSA, Derbycon, Shmoocon, SOURCE, UNITED, and various BSides.

Tod Beardsley
Research Director, Rapid7

Tod Beardsley is the director of research at Rapid7. He has over 20 years of hands-on security experience, stretching from in-band telephony switching to modern IoT implementations. He has held IT Ops and IT Security positions in large organizations such as 3Com, Dell, and Westinghouse, as both an offensive and defensive practitioner. Today, Tod directs the myriad security research programs and initiatives at Rapid7. He can be uniquely identified at https://keybase.io/todb.

Christian Wentz
Founder/CEO/CTO, Gradient

Christian is the Founder, CEO, and CTO of Gradient (gradient.tech). Previously, he was Founder/CEO of Kendall Research Systems, a neural interface company (acquired by Kernel.co in 2017), and a member of the founding team at Misfit, Inc. (acquired by Fossil for $270MM in 2015). Wentz received the inaugural Forbes 30 Under 30 award in Science & Innovation and was awarded a Hertz Foundation Myhrvold Family Fellowship to pursue a PhD at MIT. He earned an M.Eng in Electrical Engineering and Computer Science under Profs. Ed Boyden and Rahul Sarpeshkar, and an S.B. in Electrical Science and Engineering, both from MIT.

About the Security Nation Podcast

Security Nation is a podcast dedicated to celebrating the champions in the cybersecurity community who are advancing security in their own ways. We also cover the biggest events in security that you should know about. In each episode, host Jen Ellis (@infosecjen) sits down with a guest so they can share their stories, what worked, what didn’t, and what you can learn from their initiative so maybe we can inspire you to do something new, while Tod Beardsley breaks down the biggest security headlines of the week. 


View all Security Nation episodes

Podcast Transcript

Jen Ellis: Hi, and welcome to another thrilling episode of Security Nation, the podcast where we talk to interesting people doing cool things to advance security in some way. With me is my amazing cohost, who may or may not be in his hot tub, Mr. Tod Beardsley. How are you doing?


Tod Beardsley:

Hi, Jen. I'm doing well. I'm fully clothed and reasonably dry.

Jen Ellis:

I was going to say, being fully clothed, there's no guarantee that you're not in the hot tub. And as a quick aside, if you have not checked out Under The Hoodie and seen cartoon Tod yet-

Tod Beardsley:

Oh no.

Jen Ellis:

It's at rapid7.com/under-the-hoodie, and you will see cartoon Tod. I am going to say it right now, I think the hair is a little perfect. It's a little too perfect.

Tod Beardsley:

It's a little on the nose. Yeah.

Jen Ellis:

No, he definitely looks like he got some styling. He looks like he's salon-fresh-

Tod Beardsley:

Sure, yeah.

Jen Ellis:

I do enjoy the fact though that they have just beaten the hell out of your cartoon character.

Tod Beardsley:

Oh yeah. I don't know what it is with that pointer. Really has it in for me.

Jen Ellis:

Yeah. He claimed that that happens in real life as well as like a giant monster chasing you down the street.

Tod Beardsley:

Constantly.

Jen Ellis:

Right. When you're in the hot tub.

Tod Beardsley:

Uh-huh (affirmative).

Jen Ellis:

Okay. So we also have a guest this week, which is exciting. With us is Christian Wentz who is a CEO, CTO, and founder of Gradient. So that's three more interesting titles than I have, although I have a very long, wordy title to try and make up for it. Christian, welcome. It is delightful to meet you. Thank you for joining us.

Christian Wentz:

Yeah, thanks for having me.

Jen Ellis:

And I feel like you're one of those people, when I look at your bio, it makes me feel like I have done nothing with my life. You have this habit of founding companies, winning awards, that kind of thing. I mean, what have you been doing with your time, Christian?

Christian Wentz:

It's all just a farce. Nah, nah ...

Jen Ellis:

Just a smoke screen.

Christian Wentz:

Yes. It's an inside game. No. I guess, quick background. I'm a chip designer by training. I thought I always wanted to build chips and computer systems and those sorts of things. Came out of MIT as an undergrad, switched my major three times.

Jen Ellis:

I like that.

Christian Wentz:

When your freshman physics professor wins the Nobel Prize, it gives you a good reason to be like, "This is why I cannot follow this person."

Jen Ellis:

Yeah. That's a moment, right there.

Christian Wentz:

Yeah. Got an interest early on in building early-stage companies. So actually before I graduated, I sort of settled into this electrical-engineering-applied-to-neuroscience bent. And we started a pair of companies around neural interfaces to the brain and how we could use that to improve cognitive function and restore cognitive performance in people suffering from various brain disorders and diseases. And so I was lobbying John Kerry for money instead of taking my final for power electronics. Yeah. I thought it was a worthwhile tradeoff.

Tod Beardsley:

Sure!

Christian Wentz:

Yeah. So the whole complexity of taking an early, nascent idea, figuring out is it real? Does it really matter? Can you actually achieve it? Can you get people around it, build a team, and launch a product? That has kind of been the thing I've been excited about. Since then, I've worked on a wearable sensor company called Misfit Wearables that we sold to Fossil. And so for the first 10 years or so, I was really doing medical-ish things. How we got here in terms of Gradient is, I was thinking of what to do next. My previous company, Kendall Research, was building these high-bandwidth, closed-loop, neural-interface technologies. So if you're following the news today, there's Elon's Neuralink, there's Bryan Johnson's Kernel, there's Paradromics, a few others. It's a really exciting space. I ended up deciding that we were going to sell the company. And so that company is now part of what is called Kernel today.

Tod Beardsley:

And that's Kernel I/O, right? kernel.io, I believe.

Christian Wentz:

That's right. Yeah. Yeah. So the field's roughly divided into non-invasive, somewhat invasive, and really invasive methods. And Elon's on the really invasive side. I'm excited about what he's doing. We haven't talked in a while, but it's great progress. Anyway, I was trying to think of what to do next. Because one downside of doing things in medtech is everything moves very slowly. You do all this great innovation and then you wait nine years for the FDA. And one of the problems that we were running into in terms of actually working with the real world and regulatory bodies and so forth is, with any of these neural interface technologies, what you really need to do is get data in context. Knowing the signal out of some neurons in your brain, but not knowing where they are or what else is going on at the same time, is not very interesting.

Christian Wentz:

If I want to sort of figure out what's wrong in a neural circuit, it's helpful to know, for example, what am I doing actively? What posture am I in? What's going on around the world? And so you need to gather data from a bunch of different sources, and sync it all up, and compute on it to figure out what the hell is this thing doing? And I think the we'll-just-throw-AI-at-it model, if you have enough of this data and you throw the right ConvNet in there, then the result is obvious. But the challenge there is this data is coming out of a person. It's going to be a thing in your body. And if you get the output wrong, you could actually cause significant harm or death to the patient. See, this isn't one of those like, "Well, we'll just train the model and eventually, it will work."

Jen Ellis:

That does seem like it could be a problem.

Christian Wentz:

Yeah.

Jen Ellis:

I'm not an expert, but it seems like it could be a problem.

Christian Wentz:

Right. And not only that, but after thinking about this problem for a little bit, I realized this is a generic problem, maybe an extreme version of it, but you care about privacy. Is there interesting information in there that might uniquely identify a person? And if so, is that information something that we need to protect? If so, how? An autonomous vehicle is another example. We have more and more compute pushing to the edge, necessarily so, because we need lower latencies. We can't send everything back to the core. And in all those cases, imagine I've got a car with some LIDAR sensors or a vision system, capturing information and computing on it, making a decision as to whether I should stop, go left, go right, whatever. If that thing is wrong, you die. And it's not necessarily malicious actors, that is definitely a thing to be worried about, but it's also just sloppy code. Humans don't write perfect code. So ...

Jen Ellis:

Oh, that's music to our ears. We talk about that all the time.

Christian Wentz:

So this is a big, hairy, ambiguous problem that exists on pretty much every connected device to some degree. I mean, wouldn't it be great to know the provenance of information that is being used to make some decision, or knowing that the execution of that code has integrity so I can trust the output, or any of this stuff? That was sort of the flavor of what got Gradient going.

Jen Ellis:

Okay. So I'm following, and I feel like I can practically hear Tod bouncing up and down in his chair, because I like the word provenance, the idea of being able to understand the provenance of the code and understand potential issues with it upfront. I can see the appeal in that. How are you going about this? What is the approach?

Christian Wentz:

Well, the easy answer which we started with, our initial approach, was: it's quite simple. All of this silicon that has these side-channel vulnerabilities and is poorly architected to ensure privacy and integrity of data, we'll just replace it. And we'll have you all adopt a new cryptosystem that allows us to provably authenticate computation, point-to-point, such that you have a unified, glorious trust fabric, and we can all go home.

Jen Ellis:

Easy.

Tod Beardsley:

Right, yeah. And I really appreciate the greenfield approach here. I think for security people worldwide, this is always their first instinct: "Well, just tear it all down and do it right. Why not just do that?"

Christian Wentz:

Yeah. It turns out grand visions are great for raising initial amounts of money. And then you actually go talk to potential end users of these kinds of things. So who cares? Well, anyone who relies upon this code being accurate and uncorrupted, if I'm building an autonomous vehicle, or in finance, healthcare, defense, all the big things you think of. And the immediate reaction, as anyone would probably have concluded with some further thought, is that, as it turns out, people are reluctant to replace everything they've spent billions or trillions of dollars on.

Jen Ellis:

Weird.

Christian Wentz:

Yeah.

Jen Ellis:

Really?

Christian Wentz:

We didn't, by the way, totally let go of that vision. And we have built our own formally verified and tested secure-boot processors. They run our own backend infrastructure, and we're going to be making as much of that as transparently inspectable as possible. Don't trust us that we got it right. Here it is, try to break it. But it's a high bar to ask people to replace silicon. It's not impossible. There are groups and customers who will do that. But even if they're willing to do so, it's a multiyear process.

Christian Wentz:

So really the first thing we started doing is, well, what do we have to work with that's already there? Yeah, we're not going to get the glorious, unified, formally verified trust fabric across the world just yet. But what's a step there? And there's actually a lot of great stuff here. All of the chip companies have made attempts, some better than others, at making isolated or provable execution of code possible. Intel's got their SGX architecture. AMD's got SEV. Arm in the embedded space has their TrustZone framework, I wouldn't call it an architecture. And there are open projects like Google's OpenTitan. There are trusted platform modules.

Christian Wentz:

The standard for TPMs is something I can't say I'd really encourage anyone to read, but it is standardized, and TPMs are in almost every enterprise server. At least in the U.S., they're mandated by Federal Acquisition Regulations, so they have to be in things. I would love to know who lobbied to have that happen, because nobody does anything with them, but they exist. So anyway, there are these bits of things, it's as if there's this toolbox of spare parts, there are chunks of these things everywhere. And the question we reframed our challenge into was, what can I do with what I've got? So starting with: what can I prove about whatever system we're talking about, using what it has already? Servers are a pretty easy story. Enterprise stuff has 80% of what's necessary, so we can just do a software deployment to bootstrap up these various inspectable properties.
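The TPM property Christian is leaning on here comes down to one primitive: a Platform Configuration Register can only be extended, never overwritten, so its final value commits to every boot-stage measurement in order. A minimal sketch of that extend semantics, assuming SHA-256 PCRs; the component names are illustrative, not anyone's actual boot chain:

```python
import hashlib

def pcr_extend(pcr: bytes, measurement: bytes) -> bytes:
    """TPM-style extend: new PCR = H(old PCR || H(measurement))."""
    return hashlib.sha256(pcr + hashlib.sha256(measurement).digest()).digest()

def measured_boot(components: list) -> bytes:
    """Fold each boot stage, in order, into a PCR that starts at all zeros."""
    pcr = bytes(32)
    for component in components:
        pcr = pcr_extend(pcr, component)
    return pcr

chain = [b"firmware-v1.2", b"bootloader-v3", b"kernel-5.8"]

# A verifier replaying the same measurements reaches the same value...
assert measured_boot(chain) == measured_boot(chain)

# ...but a single tampered stage changes the final PCR, no matter what
# runs after it, which is what makes the register worth attesting to.
tampered = [b"firmware-v1.2", b"evil-bootloader", b"kernel-5.8"]
assert measured_boot(tampered) != measured_boot(chain)
```

A real attestation would then have the TPM sign the PCR values (a "quote") with a key that never leaves the chip; the sketch only shows why the hash chain itself is tamper-evident.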

Christian Wentz:

And we came up with a cryptographic framework to provably attest, with privacy-preserving properties, I think that's the big addition there, that this device is what it says it is and that it was constructed as it claims it was. So that's our first target: while we're educating the world, not just ourselves, but with the RISC-V Foundation, with TCG, and all these other groups, on how computing should be done, how can we make the best of what exists today and allow it to be inspectable by anyone? So not just saying, "Well, I trust that Intel's attestation of this thing is correct, so it must be." Transparent, open, all that good stuff.

Tod Beardsley:

Do you expect, Christian, that people will actually trust this stuff, though? Well, and actually not just trust it, but actually do the verification? Do you think that there are people sitting around verifying things all the time? Because that's the promise of open-source software in general, right? Like, "Oh, well, you can't have back doors in open source because everyone looks at it," but it turns out, few people look at it.

Christian Wentz:

Right. So I don't think that open is enough by any means. So for example, I think we have more of the anonymous digital signature community of the world under one roof than has ever been assembled before except in certain esoteric conferences, which are cool, but-

Jen Ellis:

You do have a lot of PhDs. I did notice that.

Christian Wentz:

Yes. And so when we started saying, "Okay, well, practically we're building something to ship commercially," we've done some public demonstrations. We did a demonstration in June with HP Enterprise, leveraging their Silicon Root of Trust on their machines. And there'll be some more coming up. As for packaging this stuff up, obviously one of the core components is these cryptographic libraries. And there are a handful that everyone seems to use and everyone assumes are fine. And when our cryptographers started digging into these: "You know, this one actually appears to just pull 90% of its guts from this other one. And no one really trusts that one."

Christian Wentz:

But there's this sort of chaining bit of ambiguity. To your point, open-source software is theoretically watched by everyone, but in practice, that's a high bar. So one of the big tenets that we've taken is: simplify, as much as possible, what it is you need to trust. So our attestation framework is extremely tight in terms ... it's something like 2,000 lines of code. It's meant to be small enough that you can formally verify it. I mean, the idea of even saying, "We formally verify and trust this hypervisor," is, I think, an unreasonably high bar given the current state of formal verification, but I'm not the formal verification guy.

Jen Ellis:

I feel like now you've said that it's 2,000 lines of code that Tod is like, "Ooh, let me look at it."

Christian Wentz:

Yeah. Actually, we can show you the code. If you think about the challenges that exist ... I mean, it's also a bit of a political minefield, particularly right now in the United States. I'll just leave it there. What cryptosystem can we use? What can't we use? Who can have what? It was never a clean-cut thing, and now it's less so. So we try to use, for example, RSA-based crypto, because we know we can export it, things like that. We have pro versions as well. And so keeping a handle on that is one thing. The other is to encourage public use of these kinds of provable trust frameworks, if that's what we call them. And I think we're talking in the abstract, so maybe some concrete examples might be helpful.

Christian Wentz:

But before that, you have to be able to guarantee to the end user that their information, or the side channel, that is, information about these attestations, will be kept private. I think that is something that hasn't been given a lot of commercial airtime, because if you think about it, how does the internet make money? The internet makes money by doing the exact opposite of what I just said.

Tod Beardsley:

Yeah.

Christian Wentz:

Well, no, I don't want to anonymize you. I want to find out as much information about you as I can and sell you ads. And that's fine. I mean, I think none of us are-

Tod Beardsley:

And your friends so I can sell your friends ads. Right?

Christian Wentz:

Yep. Right. When Zuckerberg was testifying to Congress, I at least had the shocking realization that most of Congress doesn't understand how the internet works or makes money. Okay, that's retrospectively not shocking. Anyway, all this to say it's a bit of a threading-the-needle thing here, but there is, we believe, a way to achieve end-user privacy as well. And there are certain pressures mounting aggressively day-to-day now that are working in favor of driving adoption there. GDPR obviously being the first one. California's privacy laws. Everyone's jumping on this bandwagon of saying, "No, you can't actually do anything you want with this information." There's always a question here of, "Who cares about this stuff and why? Is that sufficiently compelling to drive a change in how we essentially built the internet?"

Christian Wentz:

But as we were putting together the first version of the company, so we're about two years old, in the first year I collected a bunch of PhDs who were way smarter than me in their individual areas. And we said, "Okay, well, what is possible? What can we do in hardware? What can we do in privacy-preserving compute? What can we do in attested, cryptographically verifiable constructs?" All this stuff. And then in our second year, we took what we thought was really interesting and talked to a bunch of industry verticals who seemed to need these kinds of things in various degrees. And they'd come back and say, "Well, no, no. That can't possibly work. All this is broken. Theoretically very interesting, but you just have to allow us to change nothing with what we do and just drop this in magically. And then, yes, the offering is amazing." Then going back to the engineering team and saying, "Well, all right, we have to comply with this standard and this standard and this standard. And no, we can't have new hardware deployed. Can we do it?" I think we've actually gotten there.

Jen Ellis:

Okay. So you guys have been around for two years. You are definitely addressing some crazy complicated, ambitious stuff here. And I like this sort of thing of where you had this vision and this and this ideal. And then you were like, "Oh wait, this is the real world. And we're going to find some challenges with that." And so you adjusted. And that is the appropriate thing to do. That is how these things work. How is it going now? Now that you've made those adjustments, how is it going in the new direction?

Christian Wentz:

Yeah. I appreciate the generous framing of our initial approach. It's going well. I mean, I wouldn't say we're redirecting so much as that the boil-the-ocean vision, any connected device can prove its identity, authentication, integrity of code, provenance, all of that is still what we are driving towards. And there are aspects of that that are actually already there. But, I hate the phrase deep-tech company, but anytime you're doing something really hard that has multiple factors that aren't purely technology, they're policy, they're political, there's this natural cycle where the middle of that hero-quest situation is you go through the woods and figure out all the things that you cannot move in the immediate term. And so, pretty early on, I brought in a number of people who have done things like this before.

Christian Wentz:

So the former VP of hardware from Apple, and we periodically consult with some senior former members of the NSA, both in terms of how they build things and how they break things. No one's breaking nation-state-level cryptosystems by doing the math there. I believe the phrase I got from one or two people was, "We bribed the code clerk." So, in building your own system, where do you think about vulnerabilities? But where we're at, for better or worse, the realization we had, and this is great for us as a for-profit entity, is that when you start describing this grand vision of what we can do for you, what's in it for you, the end customer, whether that's an individual user, or a Fortune 100, or a government, or whomever, is that they have much more basic problems than the thing that you articulated.

Christian Wentz:

Even some of the most sophisticated companies or users we talked to say, "Wait, wait, you can uniquely identify and authenticate how many machines I have?" Just that in and of itself was interesting. It's like, okay, well, so I would describe this as building a pyramid structure. We're not a Ponzi scheme, where the-

Tod Beardsley:

Important distinction!

Christian Wentz:

Right. Right. There's sort of a hierarchy of needs, technologically. Where we are aspiring to be, there are currently people, and they're non-trivially sized companies. But the target market is massive: any connected device, anything that connects to the internet by some means, needs to have authenticated identity, it needs to have integrity of code, and ways to communicate that. Why should I trust anything coming from this device? The late-'80s, early-'90s CA model doesn't work when you have a hundred devices on the internet for every person. So just building in that foundation is a huge undertaking, but it's not hard to convince people that that is something they need.
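The base of that hierarchy, a device proving it holds an identity without shipping the secret over the wire, can be illustrated with the simplest possible scheme: a per-device key and a fresh challenge. This is a toy symmetric sketch, not Gradient's framework (which adds privacy-preserving attestation), and all names here are hypothetical; in real hardware the key would live in a TPM or secure element, not in a dictionary:

```python
import hashlib
import hmac
import secrets

# Hypothetical enrollment database: device ID -> device-unique secret key.
device_keys = {"sensor-001": secrets.token_bytes(32)}

def device_respond(device_id: str, challenge: bytes) -> bytes:
    """Runs on the device: prove possession of its key without revealing it."""
    return hmac.new(device_keys[device_id], challenge, hashlib.sha256).digest()

def verifier_check(device_id: str, challenge: bytes, response: bytes) -> bool:
    """Runs on the backend: recompute and compare in constant time."""
    expected = hmac.new(device_keys[device_id], challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

challenge = secrets.token_bytes(16)  # a fresh nonce per session defeats replay
response = device_respond("sensor-001", challenge)
assert verifier_check("sensor-001", challenge, response)

# A captured response is useless against any other challenge.
assert not verifier_check("sensor-001", secrets.token_bytes(16), response)
```

The reason the old CA model strains at IoT scale is visible even here: someone still has to enroll and safeguard a key per device, which is exactly the provisioning problem hardware roots of trust try to solve.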

Tod Beardsley:

Well, and I mean, you can argue that the late '80s, early '90s certificate authority model kind of never worked.

Christian Wentz:

Fair. I was trying to be generous.

Tod Beardsley:

We all played along. We all played along because we wanted to be able to take people's credit card numbers. That was the thing. So we all just kind of hand-waved at it. I mean, it's not like credit card numbers are of themselves particularly secure. They're not generated in a secure way. You give them out all over the place to tons of merchants. The bar for the CA was: be at least as secure as the teller at the corner store. And what you're describing is significantly more secure than that. I wonder, would it just be easier to have an Ozymandias, Adrian Veidt sort of plan: EMP blast all of our electronics, blame it on aliens, and then come in and say, "Okay, well, now we're going to greenfield it the right way." Cheaper!

Christian Wentz:

If I can get an invite to that, let me know. I think there's more already deployed out there than I initially gave credit for, looking at this large problem space. The huge issue is that there's nothing unifying it. The incentive structure to do some of these things often runs, or at least for a while ran, at odds with how you make money on the internet. I don't think that's necessarily the case anymore, or it doesn't have to be. So to get to the grand-vision side of things, I spend a tremendous amount of time with every ... if they make a chip in the machine in front of you, we're probably having C-level conversations with them about how to get them to do this. And what's cool is that they all want to. I've never had a conversation with a company that says no, even the ones you'd think would say, "No, it's terrible." Behind closed doors, everyone says, "This is awesome. This is the future happening now."

Jen Ellis:

I mean, they've learned that the hard way, right? In the past couple of years, they've all been massively impacted by very, very broad-scale vulnerability disclosures that have left them reeling. And so they're finding ways to create new trust and get past those challenges. Because as you say, technology is developed by humans and, as we know, to err is pretty goddamn human. And so I think finding a path forward is in their interest, and focusing on competing in the ways they want to compete, rather than just on who trusts which technology more. So yeah, I think it makes perfect sense from that point of view.

Jen Ellis:

Listening to you chat, it sounds like you are very up on a lot of the sort of policy conversations that are happening in areas around this stuff. And right now, there is a lot of movement on privacy and cybersecurity, as you mentioned with GDPR, and the California bill, and others will follow. But also, in driving the adoption of secure-by-design principles and that kind of stuff. Do you find that these policy conversations are helpful to you? Are they creating an environment in which you have a better runway to have conversations with people?

Christian Wentz:

I mean, frankly, at a startup pace, nothing moves fast enough. The operating assumption that I have is that in the absence of those policy transformations, we will and are growing an initial user base, and we'll continue to grow that. And at some point, we will be in a position where we're big enough to be able to move, or help move, that conversation in the direction we think it should go. And I guess that comes down to, do you then trust us to be a benevolent actor? We're appealing to a degree. I mean, I think it does create markets in certain areas.

Christian Wentz:

But for example, we take the privacy and security thing very seriously on the public policy side. So we have on the team Obama's former privacy advocate. He was tapped, stolen from the ACLU, or borrowed from the ACLU, to monitor the intelligence community and basically say, "Yeah, let's ensure that they're actually abiding by the laws we set." And so, lots of interesting insights from that. I think that community, for what it's worth, I may lose cred saying this, really does try not to be the big evil brother. Stuff happens when you have all of the internet.

Jen Ellis:

Yeah. It's complicated, right?

Christian Wentz:

Right.

Jen Ellis:

There are no simple solutions. If there were, we would have solved these problems a long time ago.

Christian Wentz:

Yeah. And that's actually also kind of part of how we think fundamentally about architecting ways to prove, whether it's identity, or authenticity, or code integrity, or, I mean, data provenance is huge, particularly given these evolving geopolitical concerns and the whole new ruling around GDPR, where EU citizens' data transferred to the U.S. is no longer protected by Safe Harbor. Things like that, less so the policy itself. Maybe some of that, you could say, is policy, but those are the things that drive adoption of new things in this space.

Jen Ellis:

Yeah. And I think when I used the term policy, I was giving a broad term to incorporate a lot of that. I mean, I think that policy will try and react to these things and it is trying to react. But you're right, it moves slowly. But I mean, it's interesting. If you look at something like you mentioned medical devices at the beginning and clearly, you're up on FDA requirements, particularly given where you worked in the past.

Jen Ellis:

But the FDA is one of the regulatory authorities that I would say is leaning in hard on cybersecurity and doing really excellent work on that front. It has come out with pre-market and post-market guidance for the cybersecurity of connected medical devices, and in 2018 it came out with an updated version of its pre-market guidance, which includes what it terms a Cyber Bill of Materials, specifically to get at the problem that you're talking about, which is understanding how to think about trusting the technologies you're buying and understanding what the components are. It's a policy approach, a process approach, rather than the technology approach that you're proposing. What do you think about that sort of thing?

Christian Wentz:

Yeah, we looked early on at ... medtech is a great, low-hanging-fruit application of what we're doing, because it is a captive market. Medtronic, Boston Scientific, Abbott, the big four make the whole thing full stack. They control the software. It's very easy to drop in the enabling components that we're talking about. And it's 2K of code. These are not meaningful changes. And I was excited to see the FDA's, essentially, cybersecurity chief ...

Jen Ellis:

Suzanne Schwartz.

Christian Wentz:

Yes. So I think what she's proposing is fantastic. The issue with medtech specifically, and I think it's the case for a lot of industries: when I was last building a med device team, medtech was highly sophisticated in several dimensions, but just based on the financial incentives and so forth, only about 10% of revenue goes to R&D. So when I was trying to find a chip design team that could build the closed-loop, high-bandwidth neural interface that we were building, I hired them from Qualcomm and Intel and those places, because of the sophistication of thinking about the problem there. Modern semiconductors in general are so fantastically complex that we've adopted means to handle the complexity. So, long way of saying, medtech, I think, doesn't generally have the in-house resources to understand how to do what is being advocated to them by the great work that Schwartz is doing. Almost every industry has this. Automotive has it. They're not stupid people, it's just, this is ...

Jen Ellis:

Yeah. I mean, they're learning a new area. I don't just think that, I actually know it, because I've sat in a room with a bunch of automotive manufacturers who've said they don't think of themselves as technology manufacturers the way that people who traditionally participate in security conversations do. The software aspect is still new to them, so thinking about secure-by-design principles is a journey. And it has been, I think, actually quite an amazing journey over the past several years. If you wind back the clock to the Jeep hack, things have changed a huge amount in that time, and there continues to be a really strong effort to keep changing and building in security. But it is a journey. It doesn't get done overnight.

Christian Wentz:

Yeah. I think there's an opportunity to bridge between policy and technology solutions. So for example, in our approach, automotive was one informing industry. One of the pieces of feedback we got from a large component supplier was, "Everything is certificate-driven." My initial reaction was, that's fantastic: that's a thing, you know how to use digital certificates. And so that became a forcing function to say, "Okay, well, here's the thing. You're more or less telling me you have your own certificate infrastructure, every large OEM has their own, and they're not going to change that. So fit into that mold." And if you can do that, then you can move; you can retrofit, bootstrap these industries. So things like that, I think. And maybe that's a standards conversation. I really try to avoid getting into those.

Jen Ellis:

Right. Nobody likes a standards conversation. I feel like I could actually nerd out with you about this stuff for hours; I can get very boring when I get onto this topic. But we should wrap it up. I should let you go. You have three different titles, so you must be a very busy man. But thank you so much. This is fascinating. And this is the first time that we have had somebody on who literally worked in neuroscience. Amazing. I feel like this is squad goals for the podcast.

Jen Ellis:

Thank you so much for coming on. The last thing I will ask is the question I always ask everyone. Part of what we try to do with the podcast is inspire other people to feel like they can take on their own projects to advance security in some way. I'm guessing that not many of our listeners are thinking, "I'm going to go and found a company and do some super ambitious thing." But if you were looking to give advice to people who are worried about taking on their own projects, or thinking about something they want to take on, be it big or small, what advice would you provide?

Christian Wentz:

Yeah. I mean, what's the worst that could happen?

Jen Ellis:

I love that.

Christian Wentz:

I guess I look at it like: it seems to me, it's really rare that attempting to innovate hurts you, even if ... So imagine you have a day job at a large company. Innovation from within is always valued highly. Actually, one of the problems I frequently face talking to large companies is that you get this, "Well, we didn't invent it here." So I don't know that that's really advice. Sorry.

Jen Ellis:

No, but clearly, with your bio, you have a very entrepreneurial spirit. And for sure, for lots of people there are situations where it is very daunting to take something on and there can be real repercussions: for example, walking away from a job and the job security that would let you go and start your own thing. But I think that attitude, maybe with smaller-scale projects, of "they may fail, but that's an opportunity to learn, and to adjust, and to move forward; it's not necessarily the end of the world" ... I do actually like that attitude.

Christian Wentz:

Yeah. Maybe that's the way to frame it. I was talking to someone yesterday who's thinking about doing this early in her career, and the fear was, "Well, what if it doesn't work?" Well, then it doesn't work, but you tried something. You're going to learn a tremendous amount in the process of figuring it out, and 95% of that isn't technology skills. It will help you in everything else you do.

Jen Ellis:

Yep. Awesome. All right. Well, Christian, thank you so much. Really appreciate you joining us. I'm a little bit worried that Tod is quiet because he's actually been looking at your code.

Christian Wentz:

We'll follow up afterwards and I'll give you access.

Tod Beardsley:

Cool.

Jen Ellis:

And good luck with Gradient.

Christian Wentz:

Thank you.

Tod Beardsley:

Yeah.

Jen Ellis:

Keep us posted.

Christian Wentz:

All right. Thanks, guys.

Jen Ellis:

So, Tod, what's happening in the world of security today? Tell me everything.

Tod Beardsley:

Well, I have two items, Jen, and first is I have learned that we have a super fan and it is Bill from Mississippi.

Jen Ellis:

What?

Tod Beardsley:

Yeah, I know, right?

Jen Ellis:

Is it my mom?

Tod Beardsley:

No, it's not your mom. It is Bill from Mississippi and apparently he has a Rapid7 banner in his office and he listens to the podcast and he loves it. And so, hey Bill, thanks for listening. You're the best.

Jen Ellis:

Yeah, Bill, we love you. Thank you. Please tell all of your friends and even people you're not sure if you like, just anybody on the street, just tell them about Security Nation. Sing it high and low. Awesome.

Tod Beardsley:

So that was the first bit of news: someone listens to the podcast.

Jen Ellis:

Yay!

Tod Beardsley:

And secondly, let's talk about... I wanted to talk about our friend Zerologon, which is a bug known formally as CVE-2020-1472, not 1492, because that is a different thing.

Jen Ellis:

It is a very different thing, but nice historical reference. And that is a very formal name, you are correct. What was it again?

Tod Beardsley:

It is CVE-2020-1472, aka Zerologon. And this is a good bug for bad guys.

Jen Ellis:

Isn't Zerologon Dade's character when he's in Hackers? No.

Tod Beardsley:

That would be Zero Cool. But it would've been 20 times cooler if it was called that, here in this week of I believe the 25th anniversary of "Hackers."

Jen Ellis:

I'm glad I worked in a Hackers reference then.

Tod Beardsley:

So Zerologon is a big, big deal. If you haven't heard about it, go patch your domain controllers right away. What it is, essentially, is that if you have a straight shot at a domain controller, meaning you can establish a TCP session to it, you can trick it into setting any computer account's password to all nulls, and that gets you to domain ownage, basically. Typically that means you have to be inside the network. Sometimes people put their domain controllers on the internet, because sometimes people put everything on the internet; you shouldn't. It has to do with the way computer accounts authenticate. Computer accounts, as opposed to human accounts or service accounts, are a different class of thing in Windows networking, and they authenticate in a very different way: they use this thing called AES-CFB8, which nobody knows about.

Jen Ellis:

It just rolls right off the tongue, though. Doesn't it?

Tod Beardsley:

Well, nobody except for Tom Tervoort, who wrote the paper on this. He knows everything about this, apparently; I believe he is the one human on earth who knows everything about these cryptographic schemes. Good job, Tom. Basically, he found two issues here. One is that the protocol spec calls for an initialization vector, an IV, to be set to all zeros. Anytime you see any kind of cryptographic algorithm defined with "this should be all zeros," that's something to look at. So, pro tip to anyone else rolling their own crypto: if you require your IV to be all zeros, you probably have something broken in there.

Jen Ellis:

Yeah, that sounds not how crypto works.

Tod Beardsley:

It's not great.

Jen Ellis:

I'm no mathematician but...

Tod Beardsley:

And I've been reading the paper off and on this morning. I know it doesn't seem like it, but I did actually kind of prep for this. Basically, if you set this IV to all zeros, then there's a one-in-256 chance, because a byte has 256 possible values between 0x00 and 0xFF, that the first byte of the result is also zero. And that gets used in the next byte, and the next byte, and the next byte, so it basically turns the whole thing to zeros. So, yeah, don't do that. Microsoft has fixed this; we're recording this in September, and it was fixed in the August Microsoft Patch Tuesday, but the paper and the analysis behind the fix weren't released until September. So everybody got a free month.
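For the curious, the chaining Tod is describing can be sketched in a few lines. This is a toy illustration, not code from Tervoort's paper: it swaps AES for a hash-based stand-in so it runs with nothing but the Python standard library, but the CFB8 mechanics, and the reason a zero IV is fatal, are the same.

```python
import hashlib

BLOCK = 16  # AES block size in bytes

def toy_block_encrypt(key: bytes, block: bytes) -> bytes:
    # Stand-in for AES: a deterministic keyed function with uniform-looking
    # output. (Zerologon uses real AES; the CFB8 chaining flaw is identical.)
    return hashlib.sha256(key + block).digest()[:BLOCK]

def cfb8_encrypt(key: bytes, iv: bytes, plaintext: bytes) -> bytes:
    # CFB8: encrypt the shift register, XOR its first output byte with one
    # plaintext byte, then shift that ciphertext byte into the register.
    register = iv
    out = bytearray()
    for pt in plaintext:
        ct = toy_block_encrypt(key, register)[0] ^ pt
        out.append(ct)
        register = register[1:] + bytes([ct])
    return bytes(out)

zeros = bytes(BLOCK)

# About 1 key in 256 encrypts the all-zero register to a block whose first
# byte is zero. Find one by brute force.
weak_key = next(
    k.to_bytes(BLOCK, "big")
    for k in range(100_000)
    if toy_block_encrypt(k.to_bytes(BLOCK, "big"), zeros)[0] == 0
)

# With a zero IV, zero plaintext, and such a key, the register never changes,
# so every ciphertext byte comes out zero.
ciphertext = cfb8_encrypt(weak_key, iv=zeros, plaintext=bytes(8))
print(ciphertext.hex())  # "0000000000000000"
```

Because each ciphertext byte is fed back into the shift register, one zero keystream byte freezes the register at all zeros for the rest of the message, and the one-in-256 odds apply fresh to every handshake attempt.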

Tod Beardsley:

And I remember distinctly our own William Vu seeing this bug pop up in the August wrap-up and saying, "Hmm, that looks bad. I wonder what that's about. Boy, I can't wait for someone to figure that thing out." And William Vu was right: this is bad. This is the kind of bug that will be used on penetration tests forever. Now, it only affects domain controllers. Interestingly, it affects Samba domain controllers as well, not just the Windows ones, because this IV being zero is part of the spec, and so Samba said, "Yes, sir, sounds good, this is how we do it." That's how Samba works: they basically reverse-engineered the SMB protocol from Windows. So it's interesting that it's not just a Windows issue. Now, I don't know anybody who runs Samba domain controllers. I assume they exist because the capability is there, but it would be weird.

Tod Beardsley:

I don't know why you would choose to do that. Samba is usually for things like network-attached storage, backups, and things like that, normally not a domain controller. But, again, this only affects domain controllers, unlike the other two big Windows bugs that everybody uses, which are MS17-010, aka EternalBlue, and MS08-067, aka Conficker. Those two are well-worn tools in the pen tester's toolkit, and this one will be as well, because it also affects Windows Server 2008, which got end-of-lifed this year. So, no patch for you guys. Actually, Microsoft tends to fix big, big bugs like this even when things are end-of-life. But if you're running a domain controller on end-of-life software, you have other problems. You really should have dealt with this before 2020, or even 2010.

Jen Ellis:

Both.

Tod Beardsley:

Yeah, why not? So, there's a little bit of a misconception about this bug that I've seen, where one of the theories was, oh, you already have to be on a domain-joined computer, like you have to be on a Windows computer, basically, in order to launch this attack. And that is not true, because if you read the paper, the second part of it is how to spoof any client in the domain as well. So you can be on the thermostat, or on the TV, or on any object that can establish a TCP connection to the domain controller. That is literally the only requirement here. So it's like, see a domain, own a domain; that's how this works. It's a pretty good bug.
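The "see a domain, own a domain" economics follow from that one-in-256 chance per handshake: an attacker simply retries until one lands. A quick simulation (an editor's sketch, not from the paper) shows how many attempts that takes on average.

```python
import random

random.seed(2020)  # fixed seed so the run is repeatable

def attempts_until_success(p: float = 1 / 256) -> int:
    # Each spoofed Netlogon handshake independently succeeds with
    # probability p; count the tries until the first success.
    n = 1
    while random.random() >= p:
        n += 1
    return n

# Average over many simulated attacks; a geometric distribution with
# p = 1/256 has an expected value of exactly 256 attempts.
trials = [attempts_until_success() for _ in range(20_000)]
average = sum(trials) / len(trials)
print(f"average attempts: {average:.0f}")  # roughly 256
```

A few hundred handshakes, each over a fresh TCP session, take only seconds against an unpatched domain controller, which is why Tod calls this a good bug for bad guys.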

Jen Ellis:

That sounds no good. That sounds pretty bad.

Tod Beardsley:

Yeah, fix it. There are mitigations: there are configurations on Netlogon, which is the protocol being used here, where you can require signing and sealing. That's essentially what the patch does: it makes that mandatory rather than optional, which it was before. Microsoft says there'll be another fix soon. I don't know what "soon" means, but that makes it better, probably by dealing with this IV-zero business.

Jen Ellis:

If I was following along, which I have been attempting to do, it sounds like your advice is more or less "patch it," but it's also "patch it and don't put it on the internet"?

Tod Beardsley:

Don't put it on the internet. Now, people who have done Windows admin would say, of course you never put your domain controller on the internet; why would you ever do that? But domain controllers these days also tend to be DNS servers. So I can see a situation where an organization says, well, why would I build a whole other DNS server when I have this DNS server right here? Let's just put this thing on the internet. And you could do that, I guess, with good firewall rules. It's probably a bad idea, but you could expose just port 53 or something, and you wouldn't get owned through that. But a lot of people will then screw that up, right? They'll say, I'll put my domain controller on the internet for DNS, and then I'll set the firewall rule after my coffee break, and then they walk through the door and forget to do the firewall thing.

Jen Ellis:

Yeah, I got you.

Tod Beardsley:

So we do see that there are DCs on the internet. I don't know off the top of my head how many; I assume it's in the five-digit area. I think I saw Tom Sellers say something about tens of thousands. So not tons and tons, but a few.

Jen Ellis:

Enough.

Tod Beardsley:

And these Windows bugs are nominally local-network bugs, right? Like, you have to talk SMB, and who would put SMB on the internet? That's crazy. And then we saw WannaCry happen, and that was purely a Windows issue. So the worldwide ransomware clock is ticking on this one, if you have not already figured out that this is a truly critical Windows bug that affects a critical system, too, by the way. I understand change control happens, and domain controllers have backups. The way to run Windows domains is you always have at least two domain controllers.

Tod Beardsley:

So you bring one down, patch it, and let the other one do the thing, and then you bring it back up and patch the other one. You can do this. This is normal in Windows, and you should be patching every month anyway, right? But if you haven't been patching, today is a fine time to go frantically to your change control board and say, we need this fixed yesterday, because big, bad news. And I know for a fact, a fact, Jen, that pen testers are using this. A pen tester friend of mine, not at Rapid7, sent me a screenshot of, "Wow, this is cool that this dropped in the middle of my pen test, because now I have 24 domain controllers." So I've seen the screenshot. I know it works.

Jen Ellis:

It's a real thing. People are doing it. And if pen testers are doing it that probably means other people are doing it, too.

Tod Beardsley:

Yeah. I don't know how real they are; I don't know how confirmed they are. There are reports that there's some ransomware using it. The thing that makes me suspicious is that I haven't seen the name of the ransomware. It's not Emotet, right? Because if it was Emotet, everyone would say Emotet is using this, and I haven't seen that. So I don't know how true that is, but I would expect it to be revving up pretty good. We're in month two now of this thing being patched, and three months after the patch is typically when you start seeing big activity around it.

Jen Ellis:

Yeah.

Tod Beardsley:

And with MS17-010: we just released an Under the Hoodie report, and we asked, "Hey, did you see MS17-010?" And 30% of the time they said they did. And that was three years ago. So three years from now, we should be on the road to having this thing nailed down, patched.

Jen Ellis:

Wow. Is it crazy that it was three years? Wow. Blimey.

Tod Beardsley:

Blimey indeed.

Jen Ellis:

Any excuse.

Tod Beardsley:

So to sum up, patch your Windows servers, and if you're Bill, patch your Windows servers in Mississippi.

Jen Ellis:

And keep listening.

Tod Beardsley:

And keep listening.

Jen Ellis:

Awesome. Thank you, Tod. Thank you so much. And thank you again to our amazing guest, Christian. We look forward to hearing what happens with Gradient. And thank you, as ever, to our Patron Saint of Patience, Bri. Thank you for making this all happen.

Tod Beardsley:

Yay! Thank you Bri.

Jen Ellis:

Hopefully, hopefully you'll tune in again in the future.