April 24, 2020
On our latest episode of Security Nation, we caught up with Casey Ellis, founder and CTO at Bugcrowd. Joining us during the 2020 RSA Conference, he takes the time to discuss normalizing vulnerability disclosure, the safe harbor debate, and the legal implications of crowdsourced security testing.
Stick around for our Rapid Rundown, where Tod breaks down the recent controversy on online vs. mail-in voting, and gives the inside scoop on Rapid7’s newest project, AttackerKB.
Jen Ellis is the vice president of community and public affairs at Rapid7. Jen’s primary focus is on creating positive social change to advance security for all. She believes that it is critical to build productive collaboration between those in the security community and those operating outside it, and to this end, she works extensively with security researchers, technology providers, operators, and influencers, and various government entities to help them understand and address cybersecurity challenges. She believes effective collaboration is our only path forward to reducing cyber attacks and protecting consumers and businesses. She has testified before Congress and spoken at a number of security industry events including SXSW, RSA, Derbycon, Shmoocon, SOURCE, UNITED, and various BSides.
Tod Beardsley is the director of research at Rapid7. He has over 20 years of hands-on security experience, stretching from in-band telephony switching to modern IoT implementations. He has held IT Ops and IT Security positions in large organizations such as 3Com, Dell, and Westinghouse, as both an offensive and defensive practitioner. Today, Tod directs the myriad security research programs and initiatives at Rapid7. He can be uniquely identified at https://keybase.io/todb.
Casey is the Founder, Chairman, and CTO of Bugcrowd. He is a 20-year veteran of information security, serving clients ranging from startups to multinational corporations as a pen tester, security and risk consultant, and solutions architect, and most recently as a career entrepreneur. Casey pioneered the Crowdsourced Security as a Service model, launching the first bug bounty programs on the Bugcrowd platform in 2012, and co-founded the https://disclose.io vulnerability disclosure standardization project in 2016. A proud expat of Sydney, Australia, Casey lives with his wife and two kids in the San Francisco Bay Area. He is happy as long as he’s in the passionate pursuit of potential.
Jen Ellis: Hi and welcome to the latest episode of Security Nation, the podcast where we talk to cool people doing interesting things to advance security in some way. I'm your host, Jen Ellis. I'm Rapid7's VP of community and public affairs, and I am joined by my cohost, the amazing Tod Beardsley.
Tod Beardsley:
Hi.
Jen Ellis:
Live from day three of RSA. Bear in mind that we started recording, I think at days zero or minus one or something, which is why I now have conference voice, hurray. Which means that I veer between sounding like a certain kind of phone operator and a dirty old man. Hurray. Okay. So with us this week we have my cousin, Casey Ellis.
Casey Ellis:
There you go.
Jen Ellis:
CTO and founder of Bugcrowd. Aussie, but we won't hold it against him.
Casey Ellis:
You guys kicked us out, so I don't think you can-
Jen Ellis:
Right, right. Which is why now we lose it all sporting events. Terrible. You've got your revenge in the long term and also wielder of DayGlo hair. Casey, thank you for joining us. It's great to have you on.
Casey Ellis:
Thank you so much for having us. And yes, I am also packing the dulcet baritone of day three of RSA.
Jen Ellis:
Why does it work for you so much better than it does for me?
Casey Ellis:
I don't know.
Jen Ellis:
It should be a family trait though, dammit. So tell us a little bit about yourself.
Casey Ellis:
Yeah, for sure. So yeah, the extended, incorrect, whatever full title. I started Bugcrowd, that's the one I lead with because it's the piece I'm proudest of.
Jen Ellis:
Congratulations. I hear it's doing very well. You had a big party last night, which people were very excited to go to.
Casey Ellis:
Yeah. The joke that I make is that we're actually a party and a schwag company that has cybersecurity kind of as a side hustle.
Jen Ellis:
Right. No. Rapid7 is also a clothing brand as far as I can tell.
Casey Ellis:
Yeah, yeah.
Jen Ellis:
Yeah, absolutely.
Casey Ellis:
Which is a fun thing.
Jen Ellis:
But we also make great security products.
Casey Ellis:
Exactly right. Yeah. So started the company in Australia, back in 2012 was when we launched our first programs. I'm also the chairman and now CTO of the company as well.
Jen Ellis:
Congratulations, chairman sounds pretty fancy.
Casey Ellis:
It's pretty great.
Jen Ellis:
Fancy!
Casey Ellis:
Yeah. And I mean, one of the cool things about RSA is getting to talk to security entrepreneurs that have ideas, because folk are always coming up, like, "Look, I was not doing this before I was doing it. It is possible," dah dah dah. I digress. So what Bugcrowd does is, the original kind of mission statement of the business was to be able to connect the entire potential creative power of the white hat community with the entire problem space that exists in cybersecurity as it evolves. So both at a point in time, but also there's new stuff even today that didn't exist when we started that we're hacking on.
Casey Ellis:
And I view cybersecurity as a fundamentally human problem, the tech just makes it go faster. So this like, "How do we like balance the equation between what's available to the bad guys and what we have from a human resource standpoint?" And really, the other side of that is that hackers have been at the table for a long time trying to help. Right?
Tod Beardsley:
For sure.
Casey Ellis:
But we haven't been invited and it's usually been a pretty hostile response, so keeping my buddies out of jail was one of the things as well.
Tod Beardsley:
It can be a very emotional event, disclosing bugs, on both sides.
Casey Ellis:
Calling someone's baby ugly is like never an easy conversation to have proactively. But I think this whole idea of building software is hard. To err is human, therefore we need to figure out how we've done that and figure out how not to do it. That's becoming more of a normal idea.
Jen Ellis:
Right. So on that point, I mean, I use the calling babies ugly thing all the time, but the reality is what we need to do is get to a culture where that's not how it's perceived.
Casey Ellis:
Yes, exactly.
Jen Ellis:
You're not saying your baby is ugly, you're more saying, "Hey, your baby has some spit-up. Maybe we should change it," or something. I don't know. It needs to become like a diaper change. And for people not in America, that's nappies.
Tod Beardsley:
Nappies, yes.
Casey Ellis:
No, hang on. Is it nappies? No, it was nappies back home, and diapers here.
Jen Ellis:
Yeah. That's why I said people not in America.
Tod Beardsley:
I thought it was chips? No?
Jen Ellis:
Freedom fries.
Casey Ellis:
Oh, dear god, let's not start this.
Jen Ellis:
Okay. Firstly, I feel like I should have been bowing every time I've seen you now, I'm sorry. I feel like I've really been remiss in my GGs towards the chairman. So I'll correct it going forward. We'll make sure.
Casey Ellis:
That's quite all right. That's quite all right.
Jen Ellis:
But I want to know, Tod has a question that he likes to ask people. I want to know if he's going to lay it on you.
Tod Beardsley:
Oh, I will. So Casey, in the universe of hacking scenes in movies and TV shows, is there one in particular that speaks to you?
Casey Ellis:
"Sneakers." The whole thing.
Jen Ellis:
There was no hesitation as well, it was very quick.
Casey Ellis:
I've given this a lot of thought and it's definitely "Sneakers."
Tod Beardsley:
Well, we're in the right city for it. Sneakers is also the most San Francisco movie ever.
Casey Ellis:
Straight up. I mean-
Jen Ellis:
Not the most San Francisco ever.
Tod Beardsley:
Well, okay. Maybe not ever, ever.
Jen Ellis:
I mean, doesn't "Bullitt" have a very famous chase scene on the streets of San Francisco?
Tod Beardsley:
Sure.
Jen Ellis:
And have you seen "The Rock?"
Tod Beardsley:
Uh-huh(affirmative).
Jen Ellis:
Right?
Casey Ellis:
Yeah. There's hacking in "The Rock," right? I mean, literally crossing the San Mateo bridge for the first time after we came out here, we were like, this is the bridge, it's amazing.
Jen Ellis:
Oh my god, is that why you chose San Francisco? That's why you're here?
Casey Ellis:
San Francisco is like Hollywood for an actor. Do you know what I mean? This place is crazy and there's lots of things that aren't great about it, but there's lots of things that are very unique because it's been doing high-tech for the longest, really. And it was kind of always the plan. My wife and I wanted to live somewhere else at some point and we both love San Francisco just as a city. And as I started getting into the startup thing, I'm like, "I want to see inside the belly of the beast and get access to the different things."
Casey Ellis:
Like the resource from a VC standpoint, all of that. But then it's more than that, it's like early adopters, people that had done this stuff for a really long time. I like to think that I'm pretty good at asking the right dumb questions and figuring out which parts of the answers to discard. And you can do a lot of that here. So, "Sneakers," yes.
Tod Beardsley:
Yes. Yeah. I give you that, 100%.
Casey Ellis:
I mean, I think looking back on it too, it's such a savant movie. I don't know. I got into hacking because I just really enjoy thinking like a criminal, but I don't want to be one.
Tod Beardsley:
Sure.
Jen Ellis:
To be fair, an orange jumpsuit would clash very badly.
Casey Ellis:
And also, I'm Australian so the shoe fits pretty cleanly.
Jen Ellis:
Oh, yeah. That's true. You don't want to just be a cliche.
Casey Ellis:
Right. Right. It's a little cliche but it does actually make a lot of sense. I think different people get into security in different ways. Like folk come in from risk, finance even, folk coming because they're just computer scientists. And they want to make computers bend to their will. For me, it's the computer science part, but the stronger thing is that I'm just fascinated by almost the entrepreneurship of being a criminal and the patterns of thinking that come behind that. But then at the same time, I've got no desire whatsoever. The do no harm thing or something that I take pretty seriously. So okay, how do you reconcile that? And it turns out we've got this industry that lets us do that. Yes. That's how I got started.
Jen Ellis:
So tell us a little bit about... Well, actually let's do the quick 30 seconds on Bugcrowd.
Casey Ellis:
Sure. The spiel.
Jen Ellis:
It's a good jumping-off point.
Casey Ellis:
We haven't gotten to that bit. Yeah, yeah, for sure. Yeah. So what Bugcrowd is, we've assembled a platform where we run vuln disclosure programs, we run bug bounty programs, and then what we do is we actually take the fact that we can attract people to the platform using those public models to learn about what they can do, how much we can trust them, how professional are they, where are they from? All of that sort of stuff, and basically carve up the crowd in different ways to deliver private crowdsourced testing in all sorts of different forms. And really the idea from... I ran a pen test company in Australia, was a pen tester for about seven years and then moved into solution architecture and sales and broke bad into the startup thing. And the company before Bugcrowd was a pen testing firm.
Casey Ellis:
And I was looking at it thinking, "All right, like firstly, the margin that's achievable in this space is insane. If everyone's walking away from this happy, then something's systemically wrong and that's eventually going to get to a breaking point." I'm like, "Well, maybe I can be that breaking point, that could be cool." But the other side of it was-
Jen Ellis:
Life goals.
Casey Ellis:
Yeah. Life goals. Squad goal. The other side of it was this idea of if we've got like a crowd of people building software and going back to that, yeah, mistakes are human, they're not... Yeah, I think you were spot on before, the whole idea of the security team or the security people having the tendency to call people stupid. It's like, "Well, have you built a platform and maintained it? I'm not sure that you have." And it's the same in the opposite direction. So yeah, the whole idea of this crowd building software, there being mistakes and then this crowd of people trying to attack it maliciously and then you've got Bob or Jane the pen tester and whatever tooling they've got available to them.
Casey Ellis:
The tooling is awesome but it's never going to be as creative as the human to get into that gap. So how do you balance it out? So that was really the idea of it, normalizing disclosure, normalizing the idea that hackers are locksmiths, too, we're not just burglars. That was a part of what I knew we needed to do to be able to see the thing grow. And honestly that side of it is the part that I'm actually proud of. There wasn't anything else around when we started and that, to me, has shifted. I don't think everyone views hackers that way yet, but we're definitely over the hump of that becoming the consensus.
Jen Ellis:
I think in the years since Bugcrowd was founded, we have seen a huge evolution. I mean, we had a letter from the Department of Justice in support of security research. That was a milestone.
Casey Ellis:
Yeah. It's phenomenal. So it's cool to be a part of that. I'm very proud of what we've done and all of that. But it's honestly been a privilege to be a part of this evolution of what's going on.
Jen Ellis:
No, I mean, I agree on that front as well. So tell us about disclose.io.
Casey Ellis:
Yeah, for sure. So this is the project piece. So what we did in 2016, actually a little bit before that, we recognized that researchers don't tend to read legal language. Right?
Jen Ellis:
What?
Casey Ellis:
It's true. Shock, horror. And also there's the OG people that have done vulnerability research and disclosure and stuff like that, and then there's this new talent pool that's come in that doesn't really have the history or the empathy of the 30 years that's led up to this. They don't read the brief and they don't understand the legal context of it.
Jen Ellis:
I mean, no one reads the terms of service, let's be honest.
Casey Ellis:
Yeah, exactly. So how do we make that simpler and more consumable and then standardize it so you can reduce it almost to a logo that becomes like a "Oh, okay. They've got a neighborhood watch sticker on their mailbox. I get it." But then what we did the year before last was to recognize the fact that yep, CFAA is not going to change anytime soon and anti-hacking laws are being hotly debated in all sorts of ways.
Jen Ellis:
For those who are listening who have apparently never listened before, because I talk about it all the time. The CFAA is the...
Casey Ellis:
Computer Fraud and Abuse Act.
Jen Ellis:
Which is the main anti-hacking law in the U.S. And a really big problem with the CFAA is that it doesn't define consent.
Casey Ellis:
And the idea, I think really the core of it, is that it sort of assumes bad faith. This whole idea of, is a hacker a good person? Anyone who attempts unauthorized access, if you're doing that, you're automatically viewed as...
Jen Ellis:
It doesn't. You have to prove mens rea. That's basic for U.S. law, which basically means you have to prove bad intent. I'm sorry, you've got the law nerd here.
Casey Ellis:
I got the esquire.
Jen Ellis:
Yes. Yes, you did. You got the juris fricking doctorate out, right?
Casey Ellis:
Yeah. Completely, we've had a lot of conversations around the letter of the law, but the CFAA is also very vague, so it comes down to the interpretation and how people use it, which is the part I'm talking about.
Jen Ellis:
100%. The CFAA has got a big challenge in it. There's a lot of gray area, but just to be clear on this and to make it a little bit less U.S.-centric, there is a problem in anti-hacking laws generally whereby the concept of consent on the internet is very tricky, because there is a school of thought that if you put something on the internet and make it publicly accessible, then consent is implied.
Jen Ellis:
However, anti-hacking laws all hinge around this concept of authorization either exceeding authorized access or accessing without authorization. And people have traditionally tried to lean into things like user agreements as a way of saying whether or not there's authorization, but that doesn't work legally because there is no expectation of people reading user agreements and in fact, you can't negotiate them.
Casey Ellis:
That's a far more eloquent way of laying it out. That's absolutely right.
Jen Ellis:
The match of concepts is tricky and it applies to the CFAA, the CMA in the U.K., probably everybody's anti-hacking laws. They're all sort of based on the same kind of stuff mainly.
Casey Ellis:
Exactly, yeah.
Jen Ellis:
So sorry for bringing out the esquire.
Casey Ellis:
I appreciate. That was my Lyft driver pitch of the problem. And I think for the audience, that description was spot-on.
Jen Ellis:
Oh my god, I would be a very boring Lyft driver. I would be like, "Well, let me tell you."
Casey Ellis:
How can you reduce the problem to something that people will understand and then engage with? But yeah, the nuance of it is exactly as you say, and that was really well-put. So 2016, no one reads the brief; 2018, hang on a sec, we're testing the limits of this far more frequently because of the varieties of bounty and the rise of vuln disclosure. We need to start to figure out ways to allow companies to align their legal teams, their marketing teams, their PR teams internally, and then externally, to basically the entire internet, on what is okay and what's not okay. Because the challenge when you go even deeper with that concept of access on the internet is that if the way that you're helping is to bypass the things that would normally control access, then it gets really funky really quickly.
Casey Ellis:
So that's really what disclose.io is. We took the open source repo, we branded it, we put it under that domain, created a logo around it, and all of that sort of stuff. And I'll get to that in a bit. And it was the idea of how do you create an easy on-button from a boilerplate terminology standpoint for people to either copy and paste or to use as a framework for their legal teams to work with? Because usually what we see when we talk to someone who's never thought about this before is, the security team get it, the legal team completely freak out, and you end up with "War and Peace," and you work it back from there. We wanted to skip that. Yeah. So that was the idea on the actual language itself.
Casey Ellis:
It's here for the U.S., and we've got a generalized version that talks about any anti-hacking laws, any circumvention laws, the DMCA, or whatever else might be equivalent to that globally.
Jen Ellis:
So the DMCA is the?
Casey Ellis:
The Digital Millennium Copyright Act, which comes in if you circumvent a control that's intended to prevent a behavior within software that you actually own, so it's the difference between someone else's computer and something that runs as software on a computer that is yours. That's when that comes in and affects people doing security research. So all of that stuff, like being able to then transport that into different countries, different languages, that's the goal, and this is really the call to action for people that are listening: folks that have a legal background, who understand how this stuff works. I'm not a lawyer, I play one on TV, but I'm very open about everything not to-
Jen Ellis:
You do? Which shows are you on?
Casey Ellis:
Bugcrowd. The Bugcrowd show.
Jen Ellis:
I love that show, although it went really downhill after the first season.
Casey Ellis:
We haven't got a theme song yet but we should work on that. The whole idea of it's open source, the whole idea is this is actually, obviously Bugcrowd benefits from this shift in how it all works. It makes life easier for us as a business, but it's also something that we want to see happen anyway. So, okay how do you open source? How do you encourage contribution, collaboration, consensus, adoption? And that's why the logo's there. Because the whole idea, my thought on this and really the master plan is to get it to the point where Joe or Jane Internet is seeing this and saying, "That's neighborhood watch for the internet. I understand that." And it becomes a positive driver for a company to adopt something like this instead of just, "I don't know if I want to talk to hackers or not."
Tod Beardsley:
So the text, the things that you're producing on disclose.io, are they short enough that I can just chuck them into a readme in a GitHub repo, or are they more GPL-length?
Casey Ellis:
Yeah, that's a really good question. Because the idea is how do you make it as legally complete as possible whilst keeping it brief and then accommodating for the fact that most of your audience is English as a second language or isn't the native language of the text itself? When we start doing stuff in Russia, for example, that's going to get really interesting. And it's an interesting enough problem here in the English language because brevity tends to trade itself off against the legal completeness of a thing.
Casey Ellis:
And really, in terms of actual prosecutions, any language that's proactively put out onto a website by a company is a very strong signal that that company's not going to cloud up and rain on a security researcher. And that's a conversation we've had with the EFF a lot. How accurate does this need to be legally?
Casey Ellis:
Yeah. Because if it gets on the website, there's a lot of people who go into that decision, and you get organizational alignment as a byproduct of the path to that. So there's all of these weird dynamics that we're trying to balance out behind the scenes. But yeah, the endgame is for it to be something where someone can see it and they can understand the concept of neighborhood watch. At that point, it's not about hackers anymore, it's just like, "Oh, okay. They're getting feedback from the internet. That's a good thing. Cool."
Jen Ellis:
Yeah.
Casey Ellis:
Yeah.
Jen Ellis:
So, one of the things I really like about it is that you do have versions in other languages and you are trying to sort of tailor to other countries' needs. How many countries have you now covered?
Casey Ellis:
Yeah. At this point it's Canada and the U.S. We're working actively on Australia, the U.K. and Spain.
Jen Ellis:
Nice.
Casey Ellis:
And yeah, it's straight up, this is one of those things at Bugcrowd, there's people that work on it within Bugcrowd. It's not a core mission thing, so it's as much input as we can get.
Jen Ellis:
Do you have external people in the community working on it as well?
Casey Ellis:
Yes, we do. Yeah.
Jen Ellis:
That's great.
Casey Ellis:
The goal is for that to be able to grow. So for people to push on the repo, say, "Hey, I don't like that word. I think we should use responsible instead of coordinated or whatever."
Jen Ellis:
Yeah. Good lord. I hope that you smack those people hard.
Casey Ellis:
I mean, then a conversation would ensue and that would be on record, so that's-
Jen Ellis:
No, I don't advocate violence. I don't want people to say I was responsible. Okay. So, are you talking to the governments of those countries? Do they get involved?
Casey Ellis:
Yeah. Yeah. So the thing that's been interesting with this is the whole conversation, we got drawn into election security about 18 months ago by the House Rules Committee and we've been working since with the National Association of Secretaries of State and some of the states.
Jen Ellis:
Election security, crazy. I haven't heard much about that. Such a big topic.
Casey Ellis:
It's this hot new thing apparently. I don't know. Democracy could-
Tod Beardsley:
I'm excited to solve it by 2024.
Casey Ellis:
Here's my bad business idea of the week. Instead of looking at putting voting on the blockchain, let's just turn the election into an ICO.
Tod Beardsley:
Sure.
Casey Ellis:
Right?
Tod Beardsley:
Why not?
Casey Ellis:
If a candidate beats the price of Bitcoin, then they win.
Tod Beardsley:
Perfect.
Casey Ellis:
No one do that, please. That was a joke. So yeah, so the National Association of Secretaries of State, the states themselves and others, and this is where it kind of comes together. The main risk I think for 2020 is just misinformation. We're kind of out of time to do much more on the security front, though we still can, and so don't hear what I'm not saying there. To me, the bigger threat and the bigger risk is the idea of people that want to manipulate the outcome or undermine confidence in democracy itself calling out computer security as a reason not to vote and a reason not to trust the government.
Casey Ellis:
And I see vuln disclosure, and just the idea of it being there, as something that's easy enough to communicate to a layperson voter that it can actually help with that. So that's kind of the big play there. And it's a bit of a long bow, but when you kind of run through it, it's like, "Oh, that does actually make sense. Democracy's a crowd thing and we've got a crowd of hackers to help out with stuff. So plug them in together and let's see what we can do."
Jen Ellis:
Do you have a sense of how many people have adopted it?
Casey Ellis:
Yeah, we're going to tweet the stats again probably at the end of this week. So we've got a list. The other thing that's on disclose.io is literally a crowdsourced list of every known vulnerability disclosure program, which again is open source and is actually being contributed to by the community.
Jen Ellis:
It's great.
Casey Ellis:
Yeah, there's about I think 900 programs on there right now.
Jen Ellis:
That's great.
Casey Ellis:
Yeah. It's good. Last time I looked at it, it was about 30% that had adopted what we refer to as full safe harbor language. And really the idea of that is, if you do these things, you are authorized, and therefore prosecution under stuff like the CFAA becomes more difficult.
Casey Ellis:
Versus partial safe harbor, which is basically, "Hey, we won't be jerks," they still could be as a company-
Jen Ellis:
You can trust us, honest.
Casey Ellis:
Yeah, exactly. I describe that as like in California, where the speed limits are more kind of speed recommendations, and that's like a consensus. That's the difference between partial and full. If the speed limit were actually raised, that would be the equivalent of full, in a sense.
Jen Ellis:
So for those paying attention, he just sidestepped the question of how many people?
Casey Ellis:
30% of 900, which is 300. I didn't sidestep it. What are you talking about?
Jen Ellis:
Oh, so you've got 300 people who've adopted it. That's really awesome. I'm sorry. Maybe my brain just couldn't cope with maths.
Casey Ellis:
It's day three of RSA.
Jen Ellis:
Yeah. I'm like, "Is it bedtime again?" That's awesome. So then all of that sounds very rosy and like it's all gone very well.
Casey Ellis:
What hasn't?
Jen Ellis:
Yeah, right. Right. I don't believe it's all been smooth sailing.
Casey Ellis:
Not at all, no. I mean, consensus is hard to get, and of all of the topics in cybersecurity, vulnerability disclosure I think is the hardest to get consensus around-
Jen Ellis:
There are no opinions, I don't know what you're talking about.
Casey Ellis:
There are absolutely no opinions. There are no-
Jen Ellis:
While I've got you here, I've got some thoughts on this.
Casey Ellis:
Yeah, so I think language is hard, like I mentioned before, the conversation with the EFF and other folk where it's like, "Okay, how?" Because we've probably had a bunch of lawyers come up and say, "Well, that doesn't fully cover this edge case and that edge case and the other." At which point we had to have that conversation, and that was heavy because lawyers are lawyers and they know stuff. It's like, "Okay, we have to actually balance this out and have them understand what we're trying to achieve, which is not complete legal completeness. By the way, is that okay? We should probably triple-check that." That whole process. I wouldn't say that hasn't gone well. It's just been a blip in-
Jen Ellis:
Yeah. And I imagine it was slow.
Casey Ellis:
Yeah. It's been one of those ones where it's like, "Okay, we've had to grind our way through that part." There's different opinions on even the use of the word safe harbor.
Tod Beardsley:
I'm just scrolling through all your old pull requests on that topic. So it's fascinating.
Casey Ellis:
I mean, it's gotten pretty animated in there already, which is it's the point. Let's get people talking about this, and as a byproduct of that conversation, hang on a sec, are we doing this? We should do it. Because that's the endgame.
Jen Ellis:
So can you lay out what the concerns are around the use of the word safe harbor?
Casey Ellis:
Yeah. I mean, what I've heard is, "Well, hackers are going to interpret that to mean immunity to any kind of prosecution whatsoever, and they've got carte blanche to do what they want," as the main risk, which I can see happening, especially with newer players. To me that's one of those ones where, well, hang on a sec, if safe harbor is a term that the person implementing the policy can understand more quickly because of the use of that term, then we kind of get past that problem because it becomes normalized. So that's the tradeoff there. Do you know what I mean?
Jen Ellis:
It is. I mean, certainly in the conversations around what a carve-out for security research might look like in an anti-hacking law, the concept of inadvertently creating, if you will, a backdoor to the law for criminals is certainly one of the big concerns. And with the safe harbor concept it goes even further than researchers believing that anything goes. It's people masquerading as researchers and getting away with other things.
Casey Ellis:
Yeah. And I mean, that's a really good call-out, so that's another thing to be able to market through and explain to people. It's like, no, it's not, "Hey, you can just hack us. YOLO. It's all good." Because that's not-
Jen Ellis:
There's so many tech vendors that do live by YOLO.
Casey Ellis:
Well, we're in the city of YOLO, so yeah. But yeah, because really the way that it works is like a conditional, if-this-then-that statement. So if you do dah dah, dah, dah, we will do these things, and by the way you are authorized and exempted and so on. So the precursor is the if statement. If you go outside those lines-
Jen Ellis:
Right. That is the important thing.
Casey Ellis:
Yeah, that becomes the definition of good faith that's set out by the company right in the policy. And what we tried to do with disclose.io is get that to as sane a middle ground as possible. Okay, these are the things that, yeah, you should...
Casey Ellis:
There's the whole conversation around, should there even be scope in vuln disclosure or is it just the organization itself? I believe it should ultimately be everything that affects the entity but different people are at different stages of being able to accept that idea and do that.
Jen Ellis:
Right. And incremental progress. Baby steps.
Casey Ellis:
Exactly, exactly. And the goal is to get it to the point where it's just, hey, you know what? If you see something, say something and that's fine. That's the end state that I want to get to-
Jen Ellis:
It sounds familiar, I feel like I've heard this somewhere.
Casey Ellis:
It's from this agency that has really awesome stickers. I think so.
Jen Ellis:
One of the other criticisms or concerns I've heard with safe harbor is that if you draw a box around a set of activities and say everything in this box is okay, then do you over time create an expectation or an understanding that anything outside that box is inherently considered to be bad or at least shady?
Casey Ellis:
Yeah. Yeah. You end up with that dichotomy, even if it wasn't expressed, it's implied.
Jen Ellis:
This is definitely something that I've heard one of my favorite people, Kurt Opsahl, GC of the EFF, has raised as a concern in the past.
Casey Ellis:
100%. Yeah. I've talked to Kurt about that too and-
Jen Ellis:
In fact, over lunch with me.
Casey Ellis:
Yeah, there you go. Yeah. And he's got a really good point. It's one of these ones where if we get that kind of criticism or that kind of critical feedback, it's like, "Okay, help me understand the nature of the angle that you're viewing the problem from, and let's work out how that reconciles with the end goal that we're trying to achieve. And is that an inherent part of the next step?" Which I sort of think it is. The whole idea of a dichotomy, that already existed informally.
Casey Ellis:
We've helped thousands of companies establish these policies at this point, so there are norms forming around what is okay, acceptable research anyway, as a function of this being a novel space that's been adopted pretty quickly. To me it's one of those hazards that comes with any new social norm, period. So how do we actually steer that? How do we get to a point where we can influence it over time?
Jen Ellis:
I mean, again, I think, much as we were saying before, it's about incremental progress. You take the space that you can occupy safely now, you safeguard around the other spaces, and you try to be vigilant and think about how you adapt that over time.
Casey Ellis:
Exactly. And what I suspect will happen, because I mean, this already happens in Bounty Land when it comes to people going out of scope-
Jen Ellis:
Bounty Land, one of my favorite places.
Casey Ellis:
Bounty Land, we do have a theme song for that, it's fun.
Jen Ellis:
There's skipping, there's rainbows, unicorns.
Casey Ellis:
There's a merry-go-round and it goes around and around the room. But the idea of a company running a private program, and then there's someone that submits an issue to them and they're like, "Well, hang on a sec, this is a private program, but that's not the same as the disclosure program." There are all of these different things that are starting to surface now as things that need to be normalized and better understood. And we're just in the process of evolving from one thing to the next.
Casey Ellis:
So I think it's really a matter of being mindful. Again, coming back to: what's the North Star? What do we want to hit? And then reconciling whatever those hiccups might be against that North Star, and being vigilant for them. Because otherwise it ends up as, "You know what? You didn't do blah, blah, blah according to disclose.io, so you're a criminal."
Jen Ellis:
I mean, that's how I act. Yeah. I always say that to people.
Casey Ellis:
I can get a little legalistic sometimes too.
Jen Ellis:
I say it particularly to Tod, on the regular.
Casey Ellis:
Tod, you need to reel him in sometimes.
Jen Ellis:
Exactly, absolutely. It's for his own good.
Casey Ellis:
So I kind of expect that to happen at different points in time, so it's just a matter of: how do you get it back on track?
Jen Ellis:
I mean, I think it's a great project, Casey. I'm super happy that we have people advocating for security research and trying to think about how we make this stuff easier and more consistent and more streamlined, and how we adapt the relationship between vendors and researchers over time to make it healthy. So thank you for everything you're doing on that front. If you could offer, just as your final thought, your sort of Jerry Springer moment, a single learning or piece of advice to people who are launching their own projects, trying to build consensus, that kind of stuff. Obviously, avoiding the word "responsible" is one of them.
Casey Ellis:
Yeah. Act responsibly. To me, it's all about community. All the brains I have and all the brains I can borrow. So it's being-
Jen Ellis:
Nice. How many brains do you have?
Casey Ellis:
A few, but I've borrowed more.
Jen Ellis:
In boxes. Jars.
Casey Ellis:
I've borrowed far more. I collect them. No. That gets creepy quick.
Jen Ellis:
You think.
Casey Ellis:
The whole idea... I think security can be kind of an introverted space. And this idea of, "Hey, I've got this idea," folks that haven't done that before find it really intimidating to do. And the part I don't think a lot of them realize is that there are folks who actually think that what you're trying to do is awesome, and you will get a lot of help if you ask for it. Build community to whatever degree you can. And then, as I said before, ask people dumb questions, figure out which parts of the answer apply to what you're doing and which parts don't, and integrate that into what you know.
Jen Ellis:
I agree. I think that's awesome. I think it is a great note to finish on. Casey, thank you so much for joining us despite the fact that we all feel a little bit like we're half dead.
Tod Beardsley:
And it's mostly your fault because of your party.
Casey Ellis:
I'm sorry.
Jen Ellis:
He was blaming me over text message this morning. Whichever Ellis he can pick on, he will.
Casey Ellis:
That's right. It's Ellis first.
Tod Beardsley:
This is definitely an Ellis-based blame.
Jen Ellis:
I hope that you will come on again in the future and give us an update on it.
Casey Ellis:
Yeah, absolutely.
Jen Ellis:
We would love that. And again, thank you for everything.
Casey Ellis:
Absolutely. Cool. Cheers.
Jen Ellis:
Thank you so much to Casey for being our special guest during RSA, which is no longer now, in case you've been asleep for a really long time and you weren't aware.
Tod Beardsley:
It was in the before times, even.
Jen Ellis:
It was the before times. It was BC, Before COVID, and Casey is now in Australia and I'm in the U.K. So much has changed. What's happening in the world?
Tod Beardsley:
Well, from the start, because this was our Casey Ellis talk, I did want to talk about online voting. Specifically, the march toward online voting, as seen from the point of view of Voatz, and I say that because they-
Jen Ellis:
I'm sorry, was that Voatz?
Tod Beardsley:
V-O-A-T-Z, which, you know, that name should just inspire the appropriate level of confidence in handling online balloting. It is so goofball, Jen, I can't even get over it. Over the last week or so, they've had this on-again, off-again Twitter campaign to say, basically, no, no, no, we're legit and we can totally do this. It's very much on the heels of COVID-19, and the disastrous election that happened in Wisconsin.
Jen Ellis:
Yeah.
Tod Beardsley:
In Wisconsin, Milwaukee went from 130 polling places to five on the day of.
Jen Ellis:
Yikes.
Tod Beardsley:
And it was bonkers. Super long lines, several hours hanging out with a bunch of strangers during a full-blown pandemic. It made me livid. But here's the thing: the solution to this is ballot by mail, and I know ballot by mail, up and down the board, is really hard. You're dealing with a giant population you've never had to deal with before. That said, ballot by mail has been the thing for people who happen to be overseas, people who are gone for other reasons, the military. The entire military votes this way, by the way.
Jen Ellis:
I mean that's a bold assumption that the entire military votes, but yeah, sure.
Tod Beardsley:
Sure.
Jen Ellis:
Those in the military who are deployed overseas and vote.
Tod Beardsley:
Yeah. They can't, yeah. The ones who are deployed...The deployed military votes this way, which in recent history tends to be a lot of people.
Jen Ellis:
Yeah.
Tod Beardsley:
From what I hear, when you're in the military, you will take every opportunity to break up your routine at all. So voting is one of those few things they get to do that makes them feel connected to American society.
Jen Ellis:
Good way of thinking about it. I was going down the "postal voting is a pain and massively reduces the number of people who vote" line of thinking, which I think is where you were starting from.
Tod Beardsley:
It is. It's not perfect, and moving from, let's say, 8% of ballots by mail, if that, to 100%, where everyone who votes, votes by mail, will be a task. That will be a huge thing. But it's also a pandemic and it's a crisis, and desperate times and desperate measures and all that. And here's the thing, let's bring this back around to Voatz. What Voatz is doing now is undercutting any kind of vote-by-mail argument, and it's just so naked and transparent. Just go look at their Twitter feed for the last couple of weeks when you hear this. It is twitter.com/voatz, and they are just saying these bonkers things about, oh, voting by mail is insecure, and voting by mail exposes your personal identification, PII, and all that, and it's hard and people don't want to do it. All in the service of, have you tried our vote-by-internet solution over here? Which is crazy, because they just came off of an audit from Trail of Bits that exposed a few dozen issues with their solution. And I do like voting on the internet. I do think that voting on the internet is going to be a thing. It's inevitable. We will be doing it. We will not be doing it with Voatz, full stop. They just don't have the solution for it today. And I think that's a foregone conclusion among security people, which is why Voatz is now turning to attack security people directly.
Jen Ellis:
Oh, that's a good sign. It always goes well when that happens.
Tod Beardsley:
Yeah. I'm very much looking forward to my personal attack from Voatz. We'll see after this airs.
Jen Ellis:
Great.
Tod Beardsley:
Depends how many people will listen to this. But they are... So West Virginia, the Secretary of State there seems to like Voatz. I guess it's okay to have one state try it out, because that's half the point of a federated country: you have these laboratories of democracy, and what works in one place might work in other places, so let's find out.
Tod Beardsley:
But that said, I don't know, man. It seems like Voatz just doesn't... They don't have the cryptographic chops. They certainly don't have the trust of basically anyone in security, and now they're pulling the skeezy playbook of, oh, well, let's just diss security researchers and let's diss vote by mail, which is a totally valid solution even though it's hard. Today, at this point in April, during a pandemic, I think vote by mail is an easier lift than vote by internet. You know? And I think that's just the way the world works. It's my opinion, man.
Jen Ellis:
And you've held back on it so far, otherwise, in this gripping episode.
Tod Beardsley:
I am annoyed with this company and the way they're handling crisis comms, with the way they're handling marketing, because Jen, I know what good crisis comms looks like.
Jen Ellis:
You smooth talker, you. It will totally work. Yeah, I mean, definitely they seem to be in the school of self-serving marketing shenanigans and not really thinking about the fact that if you're a company that is selling a product for voting, then probably what you should care about more than anything is the sanctity of elections. So when you go around trying to undermine public confidence in elections...
Tod Beardsley:
Yeah, exactly.
Jen Ellis:
That basically makes you the bad guy. That makes you skeezy. It means that you're not... You don't really care about the thing that you say you care about.
Tod Beardsley:
You have skulls on your lapels at that point. It is impossible to be the good guy in this situation.
Jen Ellis:
Yeah. Yep. I agree there's a question, a distinct question mark over that piece of activity.
Tod Beardsley:
Yeah, and speaking of bad guys.
Jen Ellis:
Yeah?
Tod Beardsley:
You can go get an index of them. Not really, but if you go to AttackerKB... This is a product that mostly the Metasploit folks, but a whole bunch of other Rapid7 folks, worked on, UX-wise and all that, about ranking vulnerabilities as they appear in the eyes of an attacker.
Jen Ellis:
Wait, this is not a product though, right? This is, like it's more of an open source.
Tod Beardsley:
Oh, yeah. It's super free and everything.
Jen Ellis:
Anybody can contribute knowledge, isn't that right?
Tod Beardsley:
Yes. I mean, I would call it a product because it has been produced by Rapid7, that's how grammar works. It's not for sale or anything like that.
Jen Ellis:
Right.
Tod Beardsley:
It's at AttackerKB. If you Google "AttackerKB," you'll find it. If you just go to attackerkb.com, you'll find it. And what this is, it's basically a crowdsourced approach to criticality and severity and risk calculations. This is an area of computer security that we're actually not that great at, because cryptographers, for example, might say, oh, this crypto bug is super, super bad. But then somebody else, an engineer, might look at it and think, well, what do you get out of this anyway? Okay, you get this one secret, but who cares? That kind of thing.
Jen Ellis:
Just to clarify, when you say we are not super good at it, you mean humans. You don't mean Rapid7?