Last updated at Mon, 21 Aug 2017 14:13:43 GMT

I've been thinking about a number of topics related to the state of the security industry. If you're reading this blog post, you probably have been too. News of breaches is hard to avoid. In particular, I've been interested in learning more about what I see as a curious gap between what people say and what they do. Perhaps you've seen it too.

Have you ever talked to an executive, peer, vendor, or partner and thought that their vision for security didn't match up with their actions? Perhaps they told you security was important. Maybe they used words and phrases like “critical”, “non-negotiable”, “key to our success”. I've had people tell me “Without the trust of our customers, we have nothing”. I'm not talking about people who are selling you something, but rather people who are in a position to help drive their organization to a more secure posture for their employees, customers, and products. In almost all cases, I believe them. Or maybe I should say that they believe what they are saying. I believe that when they think about the users, customers, source code, or whatever their crown jewels are, that they really want to protect them.

But frequently we see people doing things that do not support their vision for security. Executives don't fund important initiatives. Peers, even security professionals, take shortcuts that increase risk to their organizations. Deal makers and partners prioritize speed of execution over security, accumulating technical debt. I'm sure you're reading these examples and have many of your own, regardless of your role.

These disconnects add up. We should not really be surprised at the number and magnitude of security incidents over the past few years. (As an aside, you're allowed to be shocked even if you are not surprised. If you knowingly put your fingers in an electric socket, you'll be shocked but not surprised!) We keep seeing reports of increased spending, and read about all the “cyber wake-up calls” that companies and industries receive when others are breached. But the breaches just keep coming.

What might explain these disconnects? Why do people stress how important things are to them, like security, without engaging in activities to get there? You might say that they are prioritizing, and maybe that's right in some cases. Security can't be number one on the list at all times. It might be tempting to think that they don't really mean what they say. But let's assume for a while that they really do think security is important and that they'd pass a polygraph test.

Let's take a look at some examples from outside the security space, starting with people who have had a heart attack. That has to be one of the biggest wake-up calls I can imagine. Assuming modern science saves your life, it should be clear that you need to make some lifestyle changes. When people talk to their doctors after the heart attack, the doctors will urge three life changes: smoking cessation, healthy eating, and physical exercise. As hard as breaking bad habits can be, it would be reasonable to assume that most Americans follow that advice. But that's not what we see. In a study published in the Journal of the American Medical Association, researchers found that only 4.3% improved in all three areas. And 26% of men and 7% of women in the study were unable to make any of the changes. (Article at the St. Louis Post-Dispatch)

As in the case of information security, it would be tempting to think that these patients weren't taking things seriously. That reaction is unlikely to explain the poor numbers. Change is hard, even when your life is on the line.

Let's also take a look at another topic outside of information security: research into why people prepare poorly for natural disasters. A study conducted at Wharton/UPenn looked at how people perceive the threat of being in the path of a storm, or of other natural disasters. The researchers had in mind things like people leaving cars in flood zones, and how people planned to evacuate and under what circumstances. Wharton marketing professor Robert Meyer reached the following conclusion:

…people are subject to three major biases. One is, simply put, that there is a tendency to under-appreciate the future or under-consider the future, or future consequences. A second thing is that people are too quick to forget the past, or too slow to remember the negative events that have happened in the past. The third one is that if in doubt, what often happens is that people will follow the advice of other people who are no less prone to those sorts of mistakes than they are.
…most people fail to adequately understand the threats they face as a result of natural and other disasters, and often those poor “mental models” lead to insufficient preparation.

Again, keep in mind that this study isn't about information security risk, but about how people in flood zones think about risk and taking action. Yet these results sound eerily like preparing for a breach. I encourage you to read a little more on the study and to watch the video, replacing in your mind phrases like “rising water” with security-themed terms like “data exfiltration”.

But the phrase I want you to keep coming back to is “mental models”. I have a few thoughts on mental models in the security space, but I'll use that as a teaser for an upcoming blog post.

I'll also be covering these topics on my webinar next week. You can register here. Hope to see you then!

Have thoughts? Drop me a line on Twitter at @boblord.