In our latest episode of Security Nation, we are joined by Christian Wentz, CEO, CTO, and founder of Gradient. From an electrical-engineering-applied-to-neuroscience background to a present spent protecting privacy and data, we discuss what it’s like to thread the needle between internet profitability and end-user privacy. There’s technology, there’s politics, there’s policy, and there’s Tod getting very excited about code.

You can listen to the full discussion below, or read on for a transcript of the interview:

Jen Ellis:

Hi, and welcome to another thrilling episode of Security Nation, the podcast where we talk to interesting people doing cool things to advance security in some way. With me is my amazing co-host, who may or may not be in his hot tub, Mr. Tod Beardsley. How are you doing?

Tod Beardsley:

Hi, Jen. I'm doing well. I'm fully clothed and reasonably dry.

Jen Ellis:

I was going to say, being fully clothed, there's no guarantee that you're not in the hot tub. And as a quick aside, if you have not checked out Under The Hoodie and seen cartoon Tod yet-

Tod Beardsley:

Oh no.

Jen Ellis:

Go to rapid7.com/under-the-hoodie and you will see cartoon Tod. I am going to say it right now, I think the hair is a little perfect. It's a little too perfect.

Tod Beardsley:

It's a little on the nose. Yeah.

Jen Ellis:

No, he definitely looks like he got some styling. He looks like he's salon-fresh-

Tod Beardsley:

Sure, yeah.

Jen Ellis:

I do enjoy the fact though that they have just beaten the hell out of your cartoon character.

Tod Beardsley:

Oh yeah. I don't know what it is with that pointer. Really has it in for me.

Jen Ellis:

Yeah. He claimed that that happens in real life as well as like a giant monster chasing you down the street.

Tod Beardsley:

Constantly.

Jen Ellis:

Right. When you're in the hot tub.

Tod Beardsley:

Uh-huh (affirmative).

Jen Ellis:

Okay. So we also have a guest this week, which is exciting. With us is Christian Wentz who is a CEO, CTO, and founder of Gradient. So that's three more interesting titles than I have, although I have a very long, wordy title to try and make up for it. Christian, welcome. It is delightful to meet you. Thank you for joining us.

Christian Wentz:

Yeah, thanks for having me.

Jen Ellis:

And I feel like you're one of those people, when I look at your bio, it makes me feel like I have done nothing with my life. You have this habit of founding companies, winning awards, that kind of thing. I mean, what have you been doing with your time, Christian?

Christian Wentz:

It's all just a farce. Nah, nah ...

Jen Ellis:

Just a smoke screen.

Christian Wentz:

Yes. It's an inside game. No. I guess, quick background. I'm a chip designer by training. I thought I always wanted to build chips and computer systems and those sorts of things. Came out of MIT as an undergrad, switched my major three times.

Jen Ellis:

I like that.

Christian Wentz:

When your freshman physics professor wins the Nobel Prize, it gives you a good reason to be like, "This is why I cannot follow this person."

Jen Ellis:

Yeah. That's a moment, right there.

Christian Wentz:

Yeah. Got an interest early on in building early-stage companies. So actually before I graduated, I sort of settled into this electrical-engineering-applied-to-neuroscience bent. And we started a pair of companies around neural interfaces to the brain and how we could use that to improve cognitive function and restore cognitive performance in people suffering from various brain disorders and diseases. And so I was lobbying John Kerry for money instead of taking my final for power electronics. Yeah. I thought it was a worthwhile tradeoff.

Tod Beardsley:

Sure!

Christian Wentz:

Yeah. So the whole complexity of taking an early, nascent idea, figuring out is it real? Does it really matter? Can you actually achieve it? Can you get people around it, build a team, and launch a product? That has kind of been the thing I've been excited about. Since then, I've worked on a wearable sensor company called Misfit Wearables that we sold to Fossil. And so for the first 10 years or so, I was really doing medical-ish things. And how we got here in terms of Gradient is, I was thinking of what to do next. My previous company, Kendall Research, was building these high-bandwidth, closed-loop, neural-interface technologies. So if you're following the news today, there's Elon's Neuralink, there's Bryan Johnson's Kernel, there's Paradromics, a few others. It's a really exciting space. I ended up deciding that we were going to sell the company. And so that company is now part of what is called Kernel today.

Tod Beardsley:

And that's Kernel I/O, right? kernel.io, I believe.

Christian Wentz:

That's right. Yeah. Yeah. So the field's roughly divided into non-invasive, somewhat invasive, and really invasive methods. And Elon's on the really invasive side. I'm excited about what he's doing. We haven't talked in a while, but it's great progress. Anyway, I was trying to think of what to do next, because one downside of doing things in medtech is everything moves very slowly. You do all this great innovation and then you wait nine years for the FDA. And one of the problems we were running into in terms of actually working with the real world and regulatory bodies and so forth is that with any of these neural interface technologies, what you really need to do is get data in context. Knowing the signal out of some neurons in your brain, but not knowing where they are or what else is going on at the same time, is not very interesting.

Christian Wentz:

If I want to figure out what's wrong in a neural circuit, it's helpful to know, for example: What am I doing actively? What posture am I in? What's going on around me in the world? And so you need to gather data from a bunch of different sources, sync it all up, and compute on it to figure out, what the hell is this thing doing? And there's the we'll-just-throw-AI-at-it model: if you have enough of this data and you throw the right ConvNet in there, then the result is obvious. But the challenge there is that this data is coming out of a person. It's going to be a thing in your body. And if you get the output wrong, you could actually cause significant harm or death to the patient. So this isn't one of those, "Well, we'll just train the model and eventually, it will work."
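To make the "sync it all up" problem concrete, here is a minimal sketch of aligning two independently timestamped sensor streams onto a common clock before computing on them. Everything here, the stream names, sample values, and tolerance, is hypothetical and for illustration only, not from any real neural-interface system.

```python
import bisect

def align_streams(reference, other, tolerance_s=0.005):
    """Pair each (timestamp, value) sample in `reference` with the
    nearest-in-time sample from `other`, if one exists within tolerance."""
    other_times = [t for t, _ in other]
    pairs = []
    for t_ref, v_ref in reference:
        i = bisect.bisect_left(other_times, t_ref)
        # Check the neighbors on either side of the insertion point.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(other)]
        if not candidates:
            continue
        j = min(candidates, key=lambda j: abs(other[j][0] - t_ref))
        if abs(other[j][0] - t_ref) <= tolerance_s:
            pairs.append((t_ref, v_ref, other[j][1]))
    return pairs

# Hypothetical example: neural samples aligned with posture-sensor samples.
neural = [(0.000, 0.12), (0.010, 0.15), (0.020, 0.11)]
posture = [(0.001, "sitting"), (0.019, "standing")]
print(align_streams(neural, posture))
# [(0.0, 0.12, 'sitting'), (0.02, 0.11, 'standing')]
```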

Jen Ellis:

That does seem like it could be a problem.

Christian Wentz:

Yeah.

Jen Ellis:

I'm not an expert, but it seems like it could be a problem.

Christian Wentz:

Right. And not even that, but thinking about this problem for a little bit, I realized this is a generic problem, maybe an extreme version of it, but you care about privacy. Is there interesting information in there that might uniquely identify a person? And if so, is that information something that we need to protect? If so, how? An autonomous vehicle is another example. We have more and more compute pushing to the edge, necessarily so, because we need lower latencies. We can't send everything back to the core. And in all those cases, imagine I've got a car with some LIDAR sensors or a vision system, capturing information and computing on it, making a decision as to whether I should stop, go left, go right, whatever. If that thing is wrong, you die. And it's not necessarily the case that it's malicious actors, that is definitely a thing to be worried about, but it's also just sloppy code. Humans don't write perfect code. So ...

Jen Ellis:

Oh, that's music to our ears. We talk about that all the time.

Christian Wentz:

So this is a big, hairy, ambiguous problem that exists on pretty much every connected device to some degree. I mean, wouldn't it be great to know the provenance of information that is being used to make some decision, or knowing that the execution of that code has integrity so I can trust the output, or any of this stuff? That was sort of the flavor of what got Gradient going.
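The provenance idea here can be sketched with a simple hash chain: each record commits to the one before it, so later tampering with any earlier record is detectable. This is a generic illustration, not Gradient's design; the sensor names and payloads are hypothetical.

```python
import hashlib, json, time

def add_record(chain, source, payload):
    """Append a provenance record whose hash covers the previous record,
    so any later tampering with history is detectable."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = {"source": source, "payload": payload,
            "time": time.time(), "prev": prev_hash}
    digest = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append({**body, "hash": digest})

def verify(chain):
    """Recompute every link; returns True only if the chain is intact."""
    prev = "0" * 64
    for rec in chain:
        body = {k: rec[k] for k in ("source", "payload", "time", "prev")}
        digest = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if rec["prev"] != prev or digest != rec["hash"]:
            return False
        prev = rec["hash"]
    return True

chain = []
add_record(chain, "lidar-07", {"range_m": 42.1})         # hypothetical sensor
add_record(chain, "vision-02", {"object": "pedestrian"})  # hypothetical sensor
print(verify(chain))                  # True
chain[0]["payload"]["range_m"] = 99.9
print(verify(chain))                  # False: provenance broken
```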

Jen Ellis:

Okay. So I'm following, and I feel like I can practically hear Tod bouncing up and down in his chair. I like the word provenance: the idea of being able to understand the provenance of the code and understand potential issues with it upfront. I can see the appeal in that. How are you going about this? What is the approach?

Christian Wentz:

Well, the easy answer, which we started with, our initial approach, was: well, it's quite simple. We just replace all of the silicon that has all of these side-channel vulnerabilities and is poorly architected to ensure privacy and integrity of data. And we'll have you all adopt a new cryptosystem that allows us to provably authenticate computation, point-to-point, such that you have a unified, glorious trust fabric, and we can all go home.

Jen Ellis:

Easy.

Tod Beardsley:

Right, yeah. And I really appreciate the greenfield approach here. I think for security people worldwide, this is always their first instinct: "Well, just tear it all down and do it right. Why not just do it that way?"

Christian Wentz:

Yeah. It turns out grand visions are great for raising initial amounts of money. And then you actually go talk to potential end users of these kinds of things: so, who cares? Well, if I'm building an autonomous vehicle, I rely upon this code to be accurate and uncorrupted. Likewise finance, healthcare, defense, all the big things you think of. And the immediate reaction, as anyone would probably have concluded with some further thought: it turns out people are reluctant to replace everything they've spent billions or trillions of dollars on.

Jen Ellis:

Weird.

Christian Wentz:

Yeah.

Jen Ellis:

Really?

Christian Wentz:

We didn't, by the way, totally let go of that vision. We have built our own formally verified and tested secure-boot processors. They run our own backend infrastructure, and we're going to be making as much of that as transparently inspectable as possible. Don't trust us that we got it right. Here it is, try to break it. But it's a high bar to ask people to replace silicon. It's not impossible. There are groups and customers who will do that. But even if they're willing to do so, it's a multiyear process.

Christian Wentz:

So really the first thing we started doing is, well, what do we have to work with that's already there? Yeah, we're not going to get the glorious, unified, formally verified trust fabric across the world just yet. But what's a step there? And there's actually a lot of great stuff here. All of the chip companies have made attempts, some better than others, at making isolated or provable execution of code possible. Intel's got their SGX architecture. AMD's got SEV. Arm in the embedded space has their TrustZone framework, I wouldn't call it an architecture. And there are open projects like Google's OpenTitan. There are trusted platform modules.

Christian Wentz:

The standard for TPMs is something I can't say I'd really encourage anyone to read, but it is standardized, and TPMs are in almost every enterprise server. At least in the U.S., they're mandated by Federal Acquisition Regulations, so they have to be in things. I would love to know who lobbied to have that happen, because nobody does anything with them, but they exist. So anyway, it's like there's this toolbox of spare parts; there are chunks of these things everywhere. And the question we reframed our challenge into was: what can I do with what I've got? Starting with, what can I prove about whatever system we're talking about, using what it has already? Servers are a pretty easy story. Enterprise gear has 80% of the stuff necessary, so we can just do a software deployment to it to bootstrap up these various inspectable properties.
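For readers unfamiliar with how those TPMs get used, the core primitive is the PCR "extend" operation: a register that can only be folded forward, never set directly, so its final value commits to every boot-time measurement and the order they arrived in. A minimal sketch of the arithmetic; the boot components listed are hypothetical.

```python
import hashlib

def extend(pcr, measurement):
    """TPM-style PCR extend: new = H(old || H(measurement)).
    Because a PCR can only be extended, the final value commits to
    every measurement and the order in which they were recorded."""
    m_digest = hashlib.sha256(measurement).digest()
    return hashlib.sha256(pcr + m_digest).digest()

pcr = bytes(32)  # PCRs reset to all zeros at power-on
boot_chain = [b"firmware-image", b"bootloader", b"kernel", b"initrd"]
for component in boot_chain:  # hypothetical boot components
    pcr = extend(pcr, component)
print(pcr.hex())  # reproducible only if every stage was identical
```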

Christian Wentz:

And we came up with a cryptographic framework to provably attest, with privacy-preserving properties (I think that's the big addition there), that this device is what it says it is and was constructed as it claims to be. So that's our first target: while we're educating the world, not just ourselves but with the RISC-V Foundation, with TCG, and all these other things, on how computing should be done, how can we make the best of what exists today and allow it to be inspectable by anyone? So not just saying, "Well, I trust that Intel's attestation of this thing is correct, so it must be." Transparent, open, all that good stuff.
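The general shape of an attestation like the one described here, stripped of the privacy-preserving machinery Gradient adds on top, is a signed quote: the device signs its current measurement together with a verifier-chosen nonce. A minimal sketch using Ed25519 from the `cryptography` package; the key handling and "golden" value are hypothetical, and in a real system the device key would live in hardware and never leave it.

```python
import os
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Device side: in a real system this key lives in hardware
# (TPM, SGX, TrustZone) and is never exportable.
device_key = Ed25519PrivateKey.generate()
device_pub = device_key.public_key()

def quote(pcr_value: bytes, nonce: bytes) -> bytes:
    """Sign the current measurement together with the verifier's nonce,
    binding this quote to one device state and one fresh challenge."""
    return device_key.sign(pcr_value + nonce)

# Verifier side: challenge with a fresh nonce, then check the signature
# against the measurement expected for a known-good build.
nonce = os.urandom(16)
expected_pcr = b"\x11" * 32  # hypothetical golden value
signature = quote(expected_pcr, nonce)
try:
    device_pub.verify(signature, expected_pcr + nonce)
    print("attested: device is in the expected state")
except InvalidSignature:
    print("attestation failed")
```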

Tod Beardsley:

Do you expect, Christian, that people will actually trust this stuff, though? Well, and actually not just trust it, but actually do the verification? Do you think that there are people sitting around verifying things all the time? Because that's the promise of open-source software in general, right? Like, "Oh, well, you can't have back doors in open source because everyone looks at it," but it turns out, few people look at it.

Christian Wentz:

Right. So I don't think that open is enough by any means. So for example, I think we have more of the anonymous digital signature community of the world under one roof than has ever been assembled before except in certain esoteric conferences, which are cool, but-

Jen Ellis:

You do have a lot of PhDs. I did notice that.

Christian Wentz:

Yes. And so when we started saying, "Okay, well, practically we're building something to ship commercially," we've done some public demonstrations. We did a demonstration in June with HP Enterprise, leveraging the silicon root of trust on their machines. And there'll be some more coming up. As we package this stuff up, obviously one of the core components is these cryptographic libraries. There are a handful that everyone seems to use and everyone assumes are fine. And when our cryptographers started digging into these: "You know, this one actually appears to just pull 90% of its guts from this one. And no one really trusts that one."

Christian Wentz:

But there's this sort of chaining bit of ambiguity. To your point, open-source software is theoretically watched by everyone, but in practice, that's a high bar. So one of the big tenets that we've taken is: simplify, as much as possible, what it is you need to trust. Our attestation framework is extremely tight in terms ... it's something like 2,000 lines of code. It's meant to be small enough that you can formally verify it. I mean, the idea of even saying, "We formally verified and trust this hypervisor," is, I think, an unreasonably high bar given the current state of formal verification, but I'm not the formal verification guy.

Jen Ellis:

I feel like now you've said that it's 2,000 lines of code that Tod is like, "Ooh, let me look at it."

Christian Wentz:

Yeah. Actually, we can show you the code. If you think about the challenges that exist ... I mean, it's also a bit of a political minefield, particularly right now in the United States. I'll just leave it there. What cryptosystem can we use? What can't we use? Who can have what? It was never a clean-cut thing, and now it's less so. So we try to use, for example, RSA-based crypto, because we know we can export it, things like that. We have pro versions as well. And so keeping a handle on that is one thing. But I think the goal is to encourage public use of these kinds of provable trust frameworks, if that's what we call them. And I think we're talking in the abstract, so maybe some concrete examples might be helpful.

Christian Wentz:

But before that, you have to be able to guarantee to the end user that their information, or the sort of side-channel information around these attestations, will be kept private. I think that is something that hasn't been commercially given a lot of airtime, because if you think about how the internet makes money, the internet makes money by doing the exact opposite of what I just said.

Tod Beardsley:

Yeah.

Christian Wentz:

Well, no, I don't want to anonymize you. I want to find out as much information about you as I can and sell you ads. And that's fine. I mean, I think none of us are-

Tod Beardsley:

And your friends so I can sell your friends ads. Right?

Christian Wentz:

Yep. Right. When Zuckerberg was testifying to Congress, I at least had the shocking realization that most of Congress doesn't understand how the internet works or makes money. Okay, that's retrospectively not shocking. Anyway, all this to say that it's a bit of a threading-the-needle thing here, but there is, we believe, a way to achieve both internet profitability and end-user privacy. There are certain pressures mounting aggressively day-to-day now that are working in favor of driving adoption there: GDPR obviously being the first one, California's privacy laws. Everyone's jumping on this bandwagon of saying, "No, you can't actually do anything you want with this information." There's always a question here of, "Who cares about this stuff and why? Is that sufficiently compelling to drive a change in how we essentially built the internet?"

Christian Wentz:

As we were putting together the first version of the company (we're about two years old), the first year I collected a bunch of PhDs who were way smarter than me in their individual areas. And we said, "Okay, well, what is possible? What can we do in hardware? What can we do in privacy-preserving compute? What can we do in attested, cryptographically verifiable constructs?" All this stuff. And then in our second year, we took what we thought was really interesting and talked to a bunch of industry verticals who seemed to need these kinds of things in various degrees. And they'd come back and say, "Well, no, no. That can't possibly work. All this is broken. Theoretically very interesting, but you just have to allow us to change nothing with what we do and just drop this in magically. And then, yes, the offering is amazing." Then it was going back to the engineering team and saying, "Well, all right, we have to comply with this standard and this standard and this standard. And no, we can't have new hardware deployed. Can we do it?" I think we've actually gotten there.

Jen Ellis:

Okay. So you guys have been around for two years. You are definitely addressing some crazy complicated, ambitious stuff here. And I like that you had this vision, this ideal, and then you were like, "Oh wait, this is the real world, and we're going to find some challenges with that." And so you adjusted. And that is the appropriate thing to do. That is how these things work. How is it going now? Now that you've made those adjustments, how is it going in the new direction?

Christian Wentz:

Yeah. I appreciate the generous framing of our initial approach. It's going well. I wouldn't say it's so much that we're redirecting; the boil-the-ocean vision, where any connected device can prove its identity, authentication, integrity of code, provenance, all of that is still what we are driving towards. And there are aspects of that that are actually already there. But I think there's this natural cycle with any, I hate the phrase, deep-tech company, anytime you're doing something really hard that has multiple factors that aren't purely technology; they're policy, they're political. The middle of that hero-quest situation is where you go through the woods and figure out all the things that you cannot move in the immediate term. And so, pretty early on, I brought in a number of people who have done things like this before.

Christian Wentz:

So, the former VP of hardware from Apple. We periodically consult with some senior former members of the NSA, both in terms of how they build things and how they break things. No one's breaking nation-state-level cryptosystems by doing the math. I believe the phrase I got from one or two people was, "We bribed the code clerk." So, in building your own system, where do you think about vulnerabilities? But where we're at, for better or worse, and the realization we had (this is great for us as a for-profit entity), is that when you start describing this grand vision of what we can do for you, what's in it for you, the end customer, whether that's an individual user, or a Fortune 100, or a government, or whomever, is that they have much more basic problems than the thing that you articulated.

Christian Wentz:

Even some of the most sophisticated companies or users we talked to say, "Wait, wait, you can uniquely identify and authenticate how many machines I have?" Just that, in and of itself, was interesting. So I would describe this as building a pyramid structure. We're not a Ponzi scheme, where the-

Tod Beardsley:

Important distinction!

Christian Wentz:

Right. Right. There's sort of a hierarchy of needs, technologically. Where we're aspiring to be, there are currently players, non-trivially sized companies, but the target market is massive: any connected device, anything that connects to the internet by some means, needs to have authenticated identity, needs to have integrity of code, and needs ways to communicate that. Why should I trust anything coming from this device? The late-'80s, early-'90s CA model doesn't work when you have a hundred devices on the internet for every person. So just building in that foundation is a huge undertaking, but it's not hard to convince people that that is something they need.
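One way to read "authenticated identity without the old CA model" is self-certifying identity: a device's name is simply a fingerprint of its own public key, so any party can check an identity claim against a signature with no certificate authority in the loop. A minimal sketch; the truncated fingerprint length is an arbitrary choice here, and this is an illustration of the general idea, not Gradient's scheme.

```python
import hashlib
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.hazmat.primitives.serialization import Encoding, PublicFormat

# Each device holds a keypair; its identity is just the fingerprint of
# its public key, so identity claims are checkable without any CA.
device_key = Ed25519PrivateKey.generate()
pub_bytes = device_key.public_key().public_bytes(Encoding.Raw, PublicFormat.Raw)
device_id = hashlib.sha256(pub_bytes).hexdigest()[:16]
print(f"device-id: {device_id}")
# Any message signed by device_key can now be tied to this ID, and
# possession of the private key is what backs the identity claim.
```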

Tod Beardsley:

Well, and I mean, you can argue that the late '80s, early '90s certificate authority model kind of never worked.

Christian Wentz:

Fair. I was trying to be generous.

Tod Beardsley:

We all played along. We all played along because we wanted to be able to take people's credit card numbers. That was the thing. So we all just kind of hand-waved at it. I mean, it's not like credit card numbers are of themselves particularly secure. They're not generated in a secure way. You give them out all over the place to tons of merchants. The bar for the CA was: be at least as secure as the teller at the corner store. And what you're describing is significantly more secure than that. I wonder, would it just be easier to have an Ozymandias, Adrian Veidt sort of plan: EMP blast all of our electronics, blame it on aliens, and then come in and say, "Okay, well, now we're going to greenfield it the right way." Cheaper!

Christian Wentz:

If I can get an invite to that, let me know. I think there's more that's already been deployed than I initially gave credit for when looking at this large problem space. The huge issue is that there's nothing that's unifying. The incentive structure to do some of these things often goes, or at least for a while went, at odds with how you make money on the internet. I don't think that's necessarily the case anymore, or it doesn't have to be. So to get to the grand-vision side of things, I spend a tremendous amount of time with every ... if they make a chip in the machine in front of you, we're probably having C-level conversations with them about how to get them to do this. And what's cool is that they all want to. I've never had a conversation with a company that says ... even the ones you'd think would say, "No, it's terrible." Behind closed doors, everyone says, "This is awesome. This is the future happening now."

Jen Ellis:

I mean, they've learned that the hard way, right? In the past couple of years, they've all been massively impacted by very, very broad-scale vulnerability disclosures that have left them reeling. And so they're finding ways to create new trust and get past those challenges. Because as you say, technology is developed by humans, and as we know, to err is pretty goddamn human. So I think finding a path forward is in their interest, so they can focus on competing in the ways they want to compete rather than just on who trusts which technology more. So yeah, I think it makes perfect sense from that point of view.

Jen Ellis:

Listening to you chat, it sounds like you are very up on a lot of the policy conversations happening around this stuff. And right now, there is a lot of movement on privacy and cybersecurity, as you mentioned, with GDPR and the California bill, and others will follow. But also in driving the adoption of secure-by-design principles and that kind of stuff. Do you find that these policy conversations are helpful to you? Are they creating an environment in which you have a better runway to have conversations with people?

Christian Wentz:

I mean, frankly, at a startup pace, nothing moves fast enough. The operating assumption that I have is that in the absence of those policy transformations, we will, and we are, growing an initial user base, and we'll continue to grow that. And at some point, we will be in a position where we're big enough to be able to move, or help move, that conversation in the direction we think it should go. And I guess that comes down to: do you then trust us to be a benevolent actor? That's appealing, to a degree. I mean, I think it does create markets in certain areas.

Christian Wentz:

But for example, we take the privacy and security thing very seriously on the public policy side. So we have on the team Obama's former privacy advocate. He was tapped, stolen (or borrowed) from the ACLU, to monitor the intelligence community and basically say, "Yeah, let's ensure that they're actually abiding by the laws we set." And so, lots of interesting insights from that. I think that community, for what it's worth, and I may lose credit saying this, really does try not to be big evil brother. Stuff happens when you have all of the internet.

Jen Ellis:

Yeah. It's complicated, right?

Christian Wentz:

Right.

Jen Ellis:

There are no simple solutions. If there were, we would have solved these problems a long time ago.

Christian Wentz:

Yeah. And that's also part of how we think fundamentally about architecting ways to prove, whether it's identity, or authenticity, or code integrity, or data provenance, which is huge, particularly given these evolving geopolitical concerns and the whole new ruling around GDPR, where EU citizens' data transferred to the U.S. is no longer protected by Safe Harbor. Things like that, more than the policy itself. Maybe some of that, you could say, is policy, but those are the things that drive adoption of new things in the space.

Jen Ellis:

Yeah. And when I used the term policy, I was using it broadly to incorporate a lot of that. I think that policy will try to react to these things, and it is trying to react. But you're right, it moves slowly. But it's interesting. Look at something like, you mentioned medical devices at the beginning, and clearly you're up on FDA requirements, particularly given where you've worked in the past.

Jen Ellis:

But the FDA is one of the regulatory authorities that I would say is leaning in hard on cybersecurity and doing really excellent work on that front. It has come out with pre-market and post-market guidance for the cybersecurity of connected medical devices, and in 2018 came out with an updated version of its pre-market guidance, which includes what it terms a Cybersecurity Bill of Materials, specifically to get at the problem that you're talking about: understanding how to think about trusting the technologies you're buying and understanding what the components are. It's a policy approach, a process approach, rather than the technology approach that you're proposing. What do you think about that sort of thing?

Christian Wentz:

Yeah, we looked early on at ... medtech is a great, low-hanging-fruit application of what we're doing, because it's a captive market. Medtronic, Boston Scientific, Abbott, the big four make the whole thing full stack. They control the software. It's very easy to drop in the enabling components that we're talking about. And it's 2K of code. These are not meaningful changes. And I was excited to see the FDA's, essentially, cybersecurity chief ...

Jen Ellis:

Suzanne Schwartz.

Christian Wentz:

Yes. So I think what she's proposing is fantastic. The issue with medtech specifically, and I think it's the case for a lot of industries: medtech is highly sophisticated in several dimensions, but just based on the financial incentives and so forth, only about 10% of revenue goes to R&D. So when I was trying to find a chip design team that could build the closed-loop, high-bandwidth neural interface we were building, I hired them from Qualcomm and Intel and those places, not from ... because of the sophistication of thinking about the problem. Modern semiconductors in general are so fantastically complex that we've adopted means to handle the complexity. So, long way of saying, medtech I think doesn't generally have the in-house resources to understand how to do what's being advocated to them by the great work that Schwartz is doing. Almost every industry has this. Automotive has it. They're not stupid people, it's just, this is ...

Jen Ellis:

Yeah. I mean, they're learning a new area. I don't just think that, I actually know, because I've sat in a room with a bunch of automotive manufacturers who've said they don't think of themselves as technology manufacturers the way that people who traditionally participate in security conversations think of themselves as technology manufacturers. The software aspect is still new. And so thinking about secure-by-design principles is a journey for them. And it has been, I think, actually quite an amazing journey over the past several years. If you wind back the clock to the Jeep hack, things have changed a huge amount in that time. And there continues to be a really strong effort to keep changing and building in security. But it is a journey. It doesn't get done overnight.

Christian Wentz:

Yeah. I think there's an opportunity to bridge between policy and technology solutions. So for example, in our approach, automotive was one informing industry. One of the pieces of feedback we got from a large component supplier was: everything there is certificate-driven ... My initial reaction was, that's fantastic. That's a thing; you know how to use certificates, digital certificates. And so that became a forcing function to say, "Okay, well, here's the thing. You're more or less telling me you have your own certificate infrastructure, and every large OEM has their own, and they're not going to change that. So fit into that mold." And if you can do that, then you can move. You can sort of retrofit, bootstrap these industries. So things like that, I think. And maybe that's a standards conversation. I really try to avoid getting into those.

Jen Ellis:

Right. Nobody likes a standards conversation. I feel like I could actually nerd out with you about this stuff for hours. I can get very boring when I get onto this topic. But we should wrap it up. I should let you go. You're a busy man. You have three different titles. You must be a very busy man. But thank you so much. This is fascinating. And this is the first time that we have had somebody on who literally worked in neuroscience. Amazing. I feel like this is squad goals for the podcast.

Jen Ellis:

Thank you so much for coming on. The last thing I will ask is the question I always ask everyone. Part of what we try to do with the podcast is inspire other people to feel like they can take on their own projects to advance security in some way. I'm guessing that not many of our listeners are like, "I'm going to go and found a company and do some super-ambitious thing." But if you were looking to give advice to people who are worried about taking on their own projects, or thinking about something they want to take on, be it big or small, what advice would you provide?

Christian Wentz:

Yeah. I mean, what's the worst that could happen?

Jen Ellis:

I love that.

Christian Wentz:

I guess I look at it like, it seems to me, it's really rare that attempting to innovate, even if ... So imagine you have a day job at a large company. Innovation from within is always valued highly. It's actually one of the problems that I frequently face talking to large companies, that you get this, "Well, we didn't invent it here." So I don't know that that's really advice. Sorry.

Jen Ellis:

No, but clearly, with your bio, you have a very entrepreneurial spirit. And for sure, for lots of people, there are situations where it is very daunting to take something on, and there can be real repercussions, for example, if you walk away from a job and the job security it provides in order to go and start your own thing. But I think that attitude, maybe with smaller-scale projects, of: they may fail, but that's an opportunity to learn, and to adjust, and to move forward. It's not necessarily the end of the world. I do actually like that attitude.

Christian Wentz:

Yeah. Maybe that's the way to frame it. I was talking to someone yesterday who's thinking about doing this early in her career. And the fear was, "Well, what if it doesn't work?" Well, then it doesn't work, but you tried something. You're going to learn a tremendous amount in the process of figuring it out. And 95% of this stuff isn't technology skills. It will help you in everything else you do.

Jen Ellis:

Yep. Awesome. All right. Well, Christian, thank you so much. Really appreciate you joining us. I'm a little bit worried that Tod is quiet because he's actually been looking at your code.

Christian Wentz:

We'll follow up afterwards and I'll give you access.

Tod Beardsley:

Cool.

Jen Ellis:

And good luck with Gradient.

Christian Wentz:

Thank you.

Tod Beardsley:

Yeah.

Jen Ellis:

Keep us posted.

Christian Wentz:

All right. Thanks, guys.

To hear Christian’s interview in full and check out other episodes of Security Nation, click here.
