Security Nation, Episode 10

How to Get Your Engineering Team to Take On Security Initiatives (Without Even Realizing It)

November 15, 2019

 

In this episode of Security Nation, we chat with Oliver Day about his experience embedding security into the engineering team at a medium-sized publisher. Oliver discusses the importance of understanding other people’s roles and what matters to them, and how that helps drive security efforts.

Also, join Tod for the Rapid Rundown, where he digs into the latest BlueKeep attacks in everyone’s favorite segment, “BlueKeep Watch.” If you like what you hear, please subscribe! We release episodes every two weeks, each featuring a new guest who is doing something positive to help advance security. Our next episode will be released on Tuesday, Dec. 3.

Appears on This Episode

Jen Ellis
Vice President, Community and Public Affairs

Jen Ellis is the vice president of community and public affairs at Rapid7. Jen’s primary focus is on creating positive social change to advance security for all. She believes that it is critical to build productive collaboration between those in the security community and those operating outside it, and to this end, she works extensively with security researchers, technology providers, operators, influencers, and various government entities to help them understand and address cybersecurity challenges. She believes effective collaboration is our only path forward to reducing cyberattacks and protecting consumers and businesses. She has testified before Congress and spoken at a number of security industry events including SXSW, RSA, Derbycon, Shmoocon, SOURCE, UNITED, and various BSides.

Tod Beardsley
Research Director, Rapid7

Tod Beardsley is the director of research at Rapid7. He has over 20 years of hands-on security experience, stretching from in-band telephony switching to modern IoT implementations. He has held IT Ops and IT Security positions in large organizations such as 3Com, Dell, and Westinghouse, as both an offensive and defensive practitioner. Today, Tod directs the myriad security research programs and initiatives at Rapid7. He can be uniquely identified at https://keybase.io/todb.

Oliver Day

Oliver Day is a security engineer at a publishing company in the U.S. and an old-school citizen of the hacking community. He started a nonprofit to help other nonprofits with security, called Securing Change, and it is now in the hands of a new generation of technologists. He worked at Rapid7 approximately 500 million years ago.

About the Security Nation Podcast

Security Nation is a podcast dedicated to celebrating the champions in the cybersecurity community who are advancing security in their own ways. We also cover the biggest events in security that you should know about. In each episode, host Jen Ellis (@infosecjen) sits down with a guest so they can share their stories, what worked, what didn’t, and what you can learn from their initiative so maybe we can inspire you to do something new, while Tod Beardsley breaks down the biggest security headlines of the week. 


View all Security Nation episodes

Podcast Transcript

Jen Ellis: Hi, and welcome to the latest episode of Security Nation, the podcast where we talk to interesting people doing interesting things to advance security. I'm your host, Jen Ellis. I'm Rapid7's VP of Community and Public Affairs, and as usual, I'm joined by my cohost, Tod Beardsley. Amazing Tod. How are you doing, Tod?


Tod Beardsley: Hello. Amazing Tod? Okay.

Jen Ellis: Amazing, Tod.

Tod Beardsley: No, I'm great.

Jen Ellis: Like Amazing Grace but more Tod-esque.

Tod Beardsley: And we already have a Grace, so.

Jen Ellis: We do. Who is amazing. Why don't we call her Amazing Grace all the time? Oh, because that'd be annoying.

Tod Beardsley: Starting now.

Jen Ellis: Right. You know who else we have this week? We have Amazing Oliver. Oliver Day, who is going to join us and is going to talk about embedding security in engineering, which is actually one of our favorite topics. We are big fans of the—well I'm going to call it SecOps because that's literally what we call it. So that's what it is. Before we get to talking to Oliver, we are as usual going to do the Rapid Rundown and Tod is going to tell me what's happening in security.

Tod Beardsley: So, the big news for the Rapid Rundown this week is, you guessed it, updates on BlueKeep!

Jen Ellis: Woo! That's your thing!

Tod Beardsley: I love talking about BlueKeep. It's so interesting. So, during the last couple weeks, there was an outbreak, I guess is how you would describe it, of attacks involving BlueKeep. It was all kind of lame. It wasn't a worm, it was super crashy, all that. We've already talked about that before, but the interesting part that we can talk about in this Rapid Rundown is that these attacks caught the attention of our buddy ZeroSum, and on Nov. 7, ZeroSum posted a pretty long, technical deep dive on why the BlueKeep exploits that are floating around are so crashy, and it turns out the culprit is Meltdown. Yes, the speculative execution bugs of a year or two ago are screwing up how BlueKeep exploits work. Mainly because the common case these days appears to be that Windows machines that are vulnerable to these RDP bugs are incidentally also fixed for the Spectre and Meltdown bugs. So, this is changing the way that kernel memory management works in the wild. This wasn't noticed or considered during the lab-based exploit development, but it turns out the real world is different from the lab.

Jen Ellis: What?!

Tod Beardsley: Yeah, I know. Right? And it's real interesting, too, and this is kind of a fundamental feature of exploit development that we run into occasionally where earlier patches may not fix the issue for later vulnerabilities, but they twiddle some things. So we've all said that the BlueKeep exploits are all very fiddly. It all has to deal with the spooky kernel memory management, and this is a perfect case where memory management is getting affected by some patches that appear to be pretty popular. So, hooray for Intel and Microsoft for getting good patch rollout on Spectre and Meltdown, which is not surprising. It got a lot of news, a lot of mainstream news. I heard about it on NPR, which was weird. I think that it's interesting in terms of exploit development. It's interesting in terms of when we actually start seeing attacks in the wild, how that feeds back into security research. We have a fix in on the Metasploit module now. It deals with this. It deals with the way that all the kernel memory hooking works, and if you are interested in reading up on ZeroSum's stuff, just look up ZeroSum on Twitter, you'll find a link to his blog. He doesn't post super often so it should be near the top there.

Jen Ellis: So, I want to make sure that I understand because this sounds like the most InfoSec-y story I've heard for a long time. Firstly, the InfoSec community is disappointed because the exploitation that they're seeing in the wild is not cool enough for them. Am I hearing that correctly?

Tod Beardsley: No, I guess? I mean, we were all expecting a big flare-up and catching a lot of people's attention as people's stuff just got burned down to the ground. But the thing is, the attack ended up with a lot of blue screens of death, a lot of BSoD crashes, which are super obvious. So, it was easy to detect, easy to notice that you were getting attacked at all. It hit a bunch of honeypots. It wasn't very targeted, and ultimately, the payload was installing a cryptominer, which is about the best kind of payload from a victim point of view that you can get, because cryptominers will work your fan, but they don't really hurt. It's not ransomware. It's not permanently breaking devices. It's not shells.

Jen Ellis: That is such a cyber-putdown: It'll work your fan. Wow. That is cold, literally, in all the ways. So as I understand it, it's lame. It's not interesting enough, and people are being saved by a prior bug, a vuln, that they've patched.

Tod Beardsley: Well, and that's another interesting thing too, right? It's like we've said before that IDS can interrupt these attacks just by triggering on the payload and then killing the connection, while killing the connection will also cause these blue screens of death. Turns out these earlier patches are also "causing" blue screens of death because they are changing the way that that memory layout works, which is the whole point, right? That's why you have these patches, but it changes the profile, the vulnerability profile. So again, kernel exploitation like this is always super wanky.

Jen Ellis: I'm sorry, it's what now?

Tod Beardsley: It's janky. It's jinky, it's wanky. It's like all of those funny words that mean not quite right, because when they fail, they fail hard. And this is terrible for pen testers and for criminals, too, because you get noticed.

Jen Ellis: All right. So, we don't like it. I feel like this is what happens when you give a thing its own sound effect. After that, it just can't live up to the pressure. It's never going to be able to be what you want it to be, Tod.

Tod Beardsley: Oh, well, don't worry. Now that we are learning more about how kernel exploitation works in the wild, I expect a far more stable exploit to come out. Probably at some point, maybe another two months, three months, maybe over the holidays. This kind of vulnerability is great for ransomware. So, the moral of this story is still patch your stuff. Patch against this bug, please. If you're exposing RDP to the internet, please stop. If you absolutely can't, which I cannot think of a reason why. If you can think of a reason why you need RDP on the internet, please let me know.
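
For anyone who wants to act on Tod's plea, a quick way to spot-check an asset is to see whether anything answers on the default RDP port. The sketch below is a minimal example, not a full inventory tool; the hostname is a hypothetical placeholder, and a real check would sweep your entire external address space.

```python
# A minimal sketch, assuming you want to spot-check a host you own for an
# internet-reachable RDP service. "rdp-gateway.example.com" is a hypothetical
# placeholder; a real check would sweep your whole external address space.
import socket

def rdp_port_open(host, port=3389, timeout=3.0):
    """Return True if a TCP connection to the default RDP port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    host = "rdp-gateway.example.com"  # hypothetical asset
    if rdp_port_open(host):
        print(f"{host}: TCP 3389 answers -- close it or put it behind a VPN")
    else:
        print(f"{host}: TCP 3389 did not answer")
```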

Jen Ellis: So, thank you for educating me as usual, and now we get to talk to Oliver. Oliver is a security professional working at a medium-sized publisher. Before he worked at said publisher, Oliver started a nonprofit providing security services to other nonprofits, which I think is the coolest thing ever. I love that he did that. He's a super community-minded person. And before that, he worked at Rapid7, like a million years ago. How long ago did you work at Rapid7, Oliver?

Oliver Day: I think it was about a half a million years ago?

Jen Ellis: That is not what you're doing today, and we are going to talk about what you are doing. But before we get to that, let's learn a little bit more about you. So, Tod has a question that he's been dying to ask one of our guests. So, Tod, take it away.

Tod Beardsley: All right. Oliver, not counting the movie "Hackers…"

Jen Ellis: Which by the way, should always be counted.

Tod Beardsley: Yes. That should just be the beginning of every question. Not counting the movie "Hackers," can you talk about, describe a scene in a hacking movie that speaks to you?

Oliver Day: So, I'm a pretty old person. I think politely we say old-school or OG or something like that. But I'm that generation and I think a lot of us have the same answer, which is "Sneakers." And it's, to me, the best representation of what hacking was in the golden age or could be in a new golden age. But in "Sneakers," it was these guys just figuring out how something worked, right? And they had this device that was from the NSA that could basically just break all encryption. And then suddenly, they were like, "Well hey, what about all those..." It was a black book of encrypted websites. Or it wasn't even websites back then. They were like, "Hey let's log into the Federal Reserve node in Virginia." And they log in, they're like, "Cool, who wants to crash the economy?"

Oliver Day: And it's this sort of discovery, and they spent the whole movie just figuring out what this device even was. And they were the only ones really in the world who could even say, "Oh okay, I know what that is, and I totally know what we can do with this." And then they go on and decrypt FAA traffic and they're like, "Hey, does anyone want to crash a jet?" And that's when they kind of have that moment of clarity. They're like, "Dude, this is really not okay." It was actually the really dreamy hunky dude from forever ago…uh…

Jen Ellis: Robert Redford?

Oliver Day: Robert Redford. Thank you. Yeah.

Jen Ellis: He is dreamy and hunky.

Oliver Day: Right? That's a pretty accurate description. He was like, "Guys, we cannot do this." And that's one of the best moments for me. It was where he says, "Guys, this is totally wrong. I get that you're giddy with power. Stop it."

Jen Ellis: So you're saying that your favorite hacker movie moment is realizing that terrorism is wrong?

Oliver Day: Yes, yes. It's in realizing that hackers have amazing powers and that they can be used for really good things or really, really bad things. It's up to you guys to choose what you do with your life.

Jen Ellis: I like that. Use your powers for good, not evil, Tod. I don't want to single you out, Tod. So, Oliver, you have undertaken a role and made a sort of shift, in that you are embedded with the engineering team. Is that correct?

Oliver Day: Yeah actually, I took a role that was supposed to just be DevOps. They added security engineering because they saw my background. But when I started, it was really just learning how to DevOps in this company. And that means learning how they deploy their infrastructure, which they keep as code, and then learning to write the code to deploy new infrastructure. And as I did that, as I actually contributed to the team effort, what I found was that eventually there would be a crisis. And at that time, they would ask questions that were security-related. And that felt like the natural time to put on the security hat versus as soon as I come in the door saying, "Okay I'm the security guy, this is what we need to do." And just throwing advice at them.

Oliver Day: And I think the other thing that really helped in the beginning was something that Wendy Nather said, which is, "If you're asking why don't you just do X, it means you probably should find out first." So, I went about asking, so why do we do this, this way? Without making any judgements against them, what is the story behind this? And sometimes the stories were okay. And I was like, okay that's a pretty valid business justification. And other times they weren't. And what I told the first guy that I worked for, he's left and I have a new boss now, but the first guy, I was like, "I'm never going to yell at you about security stuff. I'm just going to take a lot of notes." And then we're going to have discussions later.

Jen Ellis: I mean, that seems like a good way of building a little bit of trust and a little bit of empathy. I think giving people the benefit of the doubt is often harder than it sounds like it should be. Right? That seems like something we should all be able to say. Yeah, we do that all the time. We have enough sort of basic mutual respect for each other to give each other the benefit of the doubt, but actually, when you get caught up in the thing and something seems obvious to you, it gets hard to do that, to operate that way. And I think when you do, it makes a huge difference to the tone of the conversation and the tone of the response you get and how open people are to listening.

Oliver Day: Yeah. And I think, there's one person, I wish I could remember her name, but one piece of advice she gave me, she was learning to become an executive and she said, "Even a CTO should be a CEO once, to learn how to sell." And honestly that's what I've kind of learned here is, there was a security... There's this one part, one facet of our security that I've been unhappy with. I'm not going to say it out loud, but it's something I've been unhappy with for two years that I've been here. And it's only in the last six months that the CTO has relented. And I don't mean that I was hammering away at him, but I would just bring it up like, "Well I would really like this security goal to happen, but I can't because of this one policy we have. How about if we just carve around it and do this?"

Oliver Day: And then the next time we would carve around it and do this other thing and eventually he was sort of painted into a corner. It's like, look, can we just get rid of the policy? And he did. He eventually said, "Yeah, let's do that." But it took two years... A year and a half. And it took being told no a lot and not being upset about it, not saying, "Well then, I quit. You can just be hacked. Whatever happens, you deserve it." Which I feel is sort of the common response from security people. Well if you don't listen to me, you're going to get hacked and whatever happens, you deserve it. Those words literally come out sometimes and it's not... I don't think it's super helpful. It hasn't been helpful here, anyway.

Oliver Day: And then I've been able to make very small, incremental changes. Well, we have this thing like deployments and it was one guy. Or a data fix is a really good one. There's data fixes that we get tickets for, and it was one guy in charge of it or one guy who would do it. He was the only one who could do it. And I said, "Well, this is a single point of failure. Can we instead create a rotation around this?" And they took that idea and implemented it in other areas, like who's triaging bugs and who's doing this.

Oliver Day: And now we've taken this sort of security concept and implemented it in other places, and they don't even realize it. It's like, they're infosec-ing. That's what they're doing, but it doesn't matter to them. What I've done is said, "Look, here's a tool. This isn't a requirement. Here's a tool that will help you solve a problem you have." And they're like, "Oh, that's actually kind of cool. Yeah, let's do that." Versus, here's a requirement, and if you don't do it, you're fired. They tend to push back.

Jen Ellis: Yes, absolutely. And I think if you demonstrate the value to the business, then people don't care what the label is that goes on it. They care about the value that you're demonstrating, which it sounds like is exactly what you were able to do in this situation.

Oliver Day: And I love the way you phrase that, because I think that's something that we also as a community need to do. And the first person I've heard really talk about this was Justine Bone. She gave a keynote at Source Boston and she basically said, "Look you guys that are snickering at the word 'cyber,' you are literally pushing yourselves out of conversations with stakeholders. You're laughing at them, they're now going to ignore you. And guess who controls your budget, who controls your head count, who controls literally everything above you? And you're snickering at them. That's never going to work." For me, I started at... They would give me these privacy projects and I was like, "You guys understand I have a background in law. I'm not a lawyer. I do security stuff. I'm not a privacy person. You're conflating two things."

Oliver Day: But eventually what I realized was if I wanted my objectives to be taken seriously, I would just have to accept they would be lumped under privacy. And now I have a privacy and security roadmap. Hooray for me! And all my security goals are getting pushed through, and sometimes I call it security.

Tod Beardsley: Yeah, I mean it sounds like what you've done is that you've wrapped your security in a privacy blanket, right?

Oliver Day: Yeah.

Tod Beardsley: And you can kind of smuggle them through that way. So good social engineering, Oliver.

Oliver Day: Thank you!

Jen Ellis: I mean, I think the bottom line here, the bottom lesson is, figure out what the business cares about and what's relevant to where they are now, and then figure out how to make yourself relevant to that. That's how you get people interested and you get people to the table. And that, by the way, is an incredibly standard negotiation tactic. That's a normal negotiation tactic. It isn't even empathy, you don't have to... Empathy is a scary word for a lot of people. And they feel like they have to develop a new skill. And it actually isn't really about that; you don't have to care about what they care about. You have to know what they care about and know how to map what you care about back to that.

Tod Beardsley: Yeah. In your case, Oliver, it sounds like privacy was the thing. I mean I know at other places, a great backdoor for security is just talking about availability and uptime. And then sneaking security in on that.

Oliver Day: Yeah. Like the single point of failure stuff. Okay, this is why we're going to divide into a team.

Tod Beardsley: Yeah and sometimes that doesn't resonate, so you got to try something else.

Oliver Day: It's interesting on that point, right? One of the things that's helped me a lot, and I wish that I had come up with this on my own. I did not. I first read about this on a blog from Fastly, the CDN company and they basically had this entire post, several posts about why security teams should adopt the language of agile when working with engineers. And I was just like, that's actually quite brilliant. I was working at a different company at the time and so I started, I was like, okay well with SecOps, we want user authentication to use slow hashes. Okay, that feels good. And then it was just learning JIRA and learning how to break that down into other stories and stuff.

Oliver Day: But what I found was that bringing a JIRA ticket to a group of engineers was way easier than anything else... Any other way of packaging that request. Right? By saying, "Look, here's a JIRA ticket. I've broken it from an epic down into multiple stories. Everything has points assigned. Here are all the components that it affects. Here's some release dates. What do you think?" And suddenly they were just interacting in a scrum meeting like it was any other ticket. And they were saying, "Okay, yeah I could do that. And let's sequence it in after this because it might affect this other thing." And, "Oh, Oliver, have you thought about maybe implementing it this way?" "Oh no, I didn't. Okay, let's do that."

Oliver Day: And suddenly I was speaking basically their language and they could talk back to me in a way that I could accept, that I could take in and understand. It just made it so much easier. Dino Dai Zovi also, I probably butchered his last name, but he also talks about this stuff a lot and has really good advice about learning to work with your engineering teams instead of what I almost feel like, in other places I've worked, was against the engineering team. It was this antagonistic relationship where I was out to find all the flaws that they were coding before they shipped. It didn't feel right. It really did feel like I was working against them instead of with them.
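
For anyone who wants to try packaging a security request the way Oliver describes, here is a minimal sketch of filing it as an ordinary story through Jira's REST API, so it lands in the same backlog as everything else. The instance URL, credentials, and the "SEC" project key are hypothetical placeholders, and story points are left out because they live in an instance-specific custom field.

```python
# A minimal sketch, assuming Jira Cloud and the v2 REST API, of filing a
# security request as an ordinary Story in the team's backlog. The base URL,
# credentials, and "SEC" project key are hypothetical placeholders.
import requests

JIRA_BASE = "https://example.atlassian.net"     # hypothetical instance
AUTH = ("you@example.com", "your-api-token")    # basic auth: email + API token

def create_story(summary, description, project_key="SEC"):
    """Create a Story issue and return its key, e.g. 'SEC-123'."""
    payload = {
        "fields": {
            "project": {"key": project_key},
            "summary": summary,
            "description": description,
            "issuetype": {"name": "Story"},
        }
    }
    resp = requests.post(f"{JIRA_BASE}/rest/api/2/issue",
                         json=payload, auth=AUTH, timeout=30)
    resp.raise_for_status()
    return resp.json()["key"]

if __name__ == "__main__":
    key = create_story(
        summary="Move user authentication to a slow password hash",
        description=("Broken out of the auth-hardening epic: replace the fast "
                     "hash with a memory-hard KDF, with a migration plan for "
                     "existing records."),
    )
    print("Filed", key)
```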

Jen Ellis: And to your point, I think both a lot of security pros and a lot of engineers do feel that way. I think a lot of security pros feel like they're seen as being the bad guy and they're unappreciated. I think a lot of engineers feel like they put all of their time, effort, blood, sweat, and tears into their code. Bloody code is the best code. And some smug security person comes along and tells them that their baby is ugly, with no appreciation of the time and effort that goes into it. And to your point, I think it helps when you learn to work within the framework that other people like... I mean, even stupid stuff, like we have conversations here when we're working cross-functionally about whether we'll use Confluence for something or Slack for something.

Jen Ellis: And sometimes for the people who are not in the engineering teams, we have a real eye-roll and we're kind of like, "Those are not the tools we like to use and we don't think they're actually the best form of communication for this kind of situation." But we have to play the, okay this is where we'll accommodate you game, and this is where we'll need you to accommodate us game. And before you can bring people to where you want to be, you have to bring them along with you, which means you can't be super arrogant about it. You have to start where they start and try and fit in with their workflow. And I think it sounds like you've done that incredibly effectively.

Oliver Day: Thank you. Yeah. I like the way you framed it, because recently I had this task where our vendor, I won't describe who, but they're pretty good, came in, did an assessment, and figured out something about our login system that wasn't great. So we were going to introduce rate limiting. And by the way, just for security professionals who have never dealt with the remediation aspects, I plead with you to go speak with people who have gone through it. It's not as easy as you think. And you say, "You should just change that." It can take months to work out the plans, write the code, migrate things. So anyway, with just this one login rate-limiting thing, I ended up breaking about three or four RSpec tests. And it turns out I'm not the smartest guy in the room when it comes to writing RSpec tests. I'm very good with security. I'm a middling cryptographer. But when it comes to writing RSpec, I'm awful. And I think that's where, at least for the people that I've worked with that I don't really respect anymore, that's where they always fell short to me. Because they were like, "Look, we're security. We are at least senior-level engineers anywhere else." And I'm like, dude, you're really not. You can't have that attitude.
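
As a rough illustration of the rate limiting Oliver mentions, here is a minimal sketch of a per-account fixed window. It is an assumption-heavy toy: the thresholds are made up, state lives in process memory, and a production version would use shared storage and probably also key on source IP.

```python
# A minimal sketch of per-account login rate limiting, assuming an in-memory
# sliding window. The thresholds are made up; a production version would use
# shared storage (e.g., Redis) and likely also key on source IP.
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 300   # look at the last five minutes of attempts
MAX_ATTEMPTS = 10      # hypothetical budget per account per window

_attempts = defaultdict(deque)

def allow_login_attempt(username):
    """Return False once an account exceeds its attempt budget for the window."""
    now = time.monotonic()
    window = _attempts[username]
    # Drop attempts that have aged out of the window.
    while window and now - window[0] > WINDOW_SECONDS:
        window.popleft()
    if len(window) >= MAX_ATTEMPTS:
        return False
    window.append(now)
    return True

if __name__ == "__main__":
    for i in range(12):
        print(i, allow_login_attempt("alice"))  # the last two attempts are refused
```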

Jen Ellis: And here's the thing, and I think this is true in the issue between security people and engineers, or breakers and builders if you like. As much as it's true in the world that I sit in, which tends to be more policy conversations, the ideal thing is when you bring multiple people to the table who have different skill sets and expertise and you can put those skills together and learn from each other. If everybody had the same skills, a lot of us would really be redundant. So it's much better when we can learn from people who have different complementary skills to us.

Breakers are good at breaking things. That's what they do. And it's a super valuable skill to have, but it's not the same as building. And it's important to recognize what builders bring to the table. There'd be nothing to break if builders didn't make stuff!

Oliver Day: Exactly. And if you're a breaker, please recognize you might not be the best builder in the world. And respect the builders that you talk to. They have a skill, too.

Jen Ellis: I feel like I got very soapboxy in this episode. I apologize. I shouldn't have. So tell us, for somebody else who wanted to go about this and emulate what you've done, build a program like yours, what would be your No. 1 piece of advice to them?

Oliver Day: So my No. 1 piece of advice would be that the product owner of the product that you're trying to improve is the No. 1 relationship that you should focus on, cultivate, and protect. Because that person literally controls the flow of resources and time and energy that goes into the product. And if you want any chance of your security improvements to be taken seriously, that's the person to convince. And I have a pretty good relationship with the product manager here and sometimes he's like, "Dude, I don't feel comfortable with this." And one time he was absolutely right because I made a change to our password reset system during our peak registration time.

Jen Ellis: Ouch.

Oliver Day: Really stupid and shortsighted of me, by the way.

Jen Ellis: But props for calling it out and props for learning from it.

Oliver Day: Yeah. And the thing is, if I hadn't had a good relationship with our product manager, I probably would have been in a lot of trouble. But because he and I talked about it and I came into his office one morning when we were having some issues and he looked at me, he was like, "We maybe shouldn't have done this during registration peak time." And I was like, "Yeah, I agree. Let's never do this again."

Jen Ellis: That does sound like a very productive relationship.

Oliver Day: It is. It is. And like I said, there's stuff that is going to get put on the back burner that I'm super into, but that's fine. They will eventually work their way into our product, that I'm convinced of. Because I think I've developed enough respect here that they're like, "Okay, some of his ideas are a little out there, but eventually we do come to agree that that's what all of our auditors say we should be doing. That's what all of the security experts say we should be doing. So, let's just go ahead and do it and let's figure it out." Figure out how, and I think that's the biggest lesson. I mean, not the one about protecting your relationship with the product manager, but the secondary one is: if you've never experienced the remediation side, I very much wish that you would at least ask some people to walk you through it. How hard is it to change from, say, a fast hash in your password field to a slow hash? What is it like to really migrate from, say, SHA512 to scrypt? It sounds super-duper simple. It's not. It is insanely difficult. There's so many moving parts.
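
To give a flavor of the migration Oliver describes, here is a minimal sketch of the common lazy-upgrade pattern: verify against whichever scheme a stored record uses, then rewrite the record with scrypt on the next successful login. The record format, the salted-SHA512 legacy scheme, and the scrypt parameters are illustrative assumptions, not the scheme from Oliver's system, and the hard parts he alludes to (column sizes, data migrations, rollout sequencing) are exactly what this sketch leaves out.

```python
# A minimal sketch of a lazy password-hash migration, assuming stored records
# look like "scheme$salthex$digesthex". The salted-SHA512 legacy scheme and
# the scrypt parameters are illustrative assumptions only.
import hashlib
import hmac
import os

SCRYPT_PARAMS = dict(n=2**14, r=8, p=1, maxmem=64 * 1024 * 1024)

def hash_scrypt(password, salt=None):
    """Hash a password with scrypt, returning a self-describing record string."""
    salt = salt or os.urandom(16)
    digest = hashlib.scrypt(password.encode(), salt=salt, **SCRYPT_PARAMS)
    return "scrypt$" + salt.hex() + "$" + digest.hex()

def verify_and_migrate(password, stored):
    """Return (ok, stored_value), rewriting legacy records with scrypt on success."""
    scheme, salt_hex, digest_hex = stored.split("$")
    salt = bytes.fromhex(salt_hex)
    if scheme == "sha512":
        candidate = hashlib.sha512(salt + password.encode()).hexdigest()
        ok = hmac.compare_digest(candidate, digest_hex)
        # Upgrade the record on a successful login so the old hash ages out.
        return ok, (hash_scrypt(password) if ok else stored)
    if scheme == "scrypt":
        candidate = hashlib.scrypt(password.encode(), salt=salt, **SCRYPT_PARAMS)
        return hmac.compare_digest(candidate.hex(), digest_hex), stored
    return False, stored

if __name__ == "__main__":
    legacy_salt = os.urandom(16)
    legacy = "sha512$" + legacy_salt.hex() + "$" + hashlib.sha512(
        legacy_salt + b"hunter2").hexdigest()
    ok, upgraded = verify_and_migrate("hunter2", legacy)
    print(ok, upgraded.startswith("scrypt$"))  # True True
```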

Jen Ellis: I think walking a mile in other people's shoes is great advice. Tod, have you ever done that?

Tod Beardsley: Have I ever migrated from SHA512 to scrypt? No, I have not. I am blissfully ignorant about it.

Jen Ellis: No, that was very specific. But what I was getting at was the walking a mile in other people's shoes. And I think you have done that. I think you would have sat in the engineer's seat and in the breaker seat.

Tod Beardsley: Yeah, I mean it's the same deal on project management kind of stuff, right? You have to be talking the other person's love language when you are assigning them tasks.

Jen Ellis: Oh my God. Did you say love language?

Tod Beardsley: I did say love language.

Jen Ellis: This is the best podcast ever. Thank you so much for coming on and telling us all about this, and congratulations; it sounds like a pretty healthy relationship there that's yielding great results. And good luck for the future.

Oliver Day: Thank you very much. Yeah, and I really love the podcast. I did actually listen to podcasts. You made fun of me when we first started talking about this, but I did. I did. And I have to admit, as soon as I heard you interview Wendy Nather, I was like, "I'm totally in. I will totally do the podcast." Because she is legitimately one of my heroes. She's amazing.

Jen Ellis: She's one of my heroes as well. And I love the fact that she brought us credibility. I will totally tell her that. Obviously she listens to us all the time. No, she doesn't. She's busy. But I will totally tell her that that is the case. That is hilarious.

Oliver Day: Yeah. No, for the Securing Change nonprofit, when we first started, the Security Poverty Line was inspirational to us. It was one of the ways that we could best describe the thing that we were trying to fix in the world.

Jen Ellis: It's not just an amazing piece of work, it's an amazing encapsulation of a problem. She describes the problem in a way that everyone can relate to, and it has been profoundly impactful, I think. Yeah. Well-earned praise there. Thank you. And yeah, thank you for coming on. Tod, thank you for co-hosting, particularly as I know you're not feeling great still.

Tod Beardsley: It is my pleasure.

Jen Ellis: And Bri, as ever, you're amazing and we appreciate it. Thank you. I say this every week, but I don't think what you guys know is I'm actually gazing at her adoringly and as I say it, which is so awkward for her. So, thank you, Bri. We appreciate it. All right, we're out. Bye!