
Ron Woerner: Solving for Human Error in Cybersecurity


Despite incredible advances in technology, the Achilles' heel of cybersecurity continues to be human error. Why is it so hard to solve?

On this week's episode of The Secure Communications Podcast, guest Ron Woerner talks about the intersection of human psychology and cybersecurity and why the future of cyber needs to be grounded in simplicity.

Welcome to the Secure Communications Podcast
Data in motion is complex, chaotic, and unsecure, but the ability to seamlessly communicate is what drives innovation, growth and progress. Discover how the leading minds in the fields of technology, cybersecurity and communications are tackling the challenge of securing data in motion, and gain insights into what’s new and what’s next on the Secure Communications Podcast. Each week, host Kathleen Booth interviews bold thinkers who are developing and/or employing transformational technologies to solve communication security challenges.

In this episode

Ron Woerner

Ron Woerner has spent his career studying, advising businesses, and teaching others about the intersection between human behavior and cybersecurity.

As President and Chief Trusted Advisor at Cyber-AAA, he solves cybersecurity problems by providing answers, assurance, awareness, assessments and advising for small, medium and large organizations. Woerner established the Cybersecurity Studies program at Bellevue University, an NSA Center of Academic Excellence where he still teaches.

Ron has been featured at TEDx, (ISC)2, ISACA, and RSA conferences for over 15 years. He has written for multiple security publications and has created online courses for LinkedIn Learning and Cybrary.

In this episode, Ron talks about why human error continues to be one of the primary challenges facing the cybersecurity industry, and what needs to happen to solve for it. 

Quick links

Listen, watch, or read

Want to hear Ron's insights on solving the human factor in cybersecurity?

Kathleen (00:02): Thank you for joining today's episode of The Secure Communications Podcast. I'm your host, Kathleen Booth, and today my guest is Ron Woerner, who is the CEO, President and Chief Cybersecurity Consultant at Cyber-AAA, as well as a professor of cybersecurity studies at Bellevue University. Welcome, Ron.

Ron (00:25): Oh Kathleen. Glad to be here. 

Kathleen (00:30): I'm happy to have you here. For those who are listening, could you tell my audience a little bit more about yourself and your background and what you do in the field of cybersecurity?

Ron (00:39): Certainly. So I'm an Air Force veteran. I was an Air Force intelligence officer, and that's how I actually got started in information security. I was a computer science graduate working in intelligence, and that led me naturally into data protection. When I left the Air Force, I was a Unix systems administrator, and this was back in the day when we didn't have specialized security professionals. So it was also my responsibility to secure the networks and the systems for companies like Mutual of Omaha, ConAgra Foods, and TD Ameritrade. About 10 years ago I decided I wanted to try teaching, so for the past 10 years I've been teaching for Bellevue University in their cybersecurity studies program. But I also act as a security consultant for small and medium businesses, helping them solve numerous types of security problems and ensuring they're compliant with the different laws and regulations such as NIST, PCI and ISO.

Kathleen (01:38): Fantastic. One of the things I find so interesting about the whole realm of cybersecurity and secure communications is that there is such a strong technical or technological aspect to it. The technology is evolving so rapidly, but there's also this other side of it, which has to do with the human condition, psychology, and human behavior; human factors, I guess, is the best way to put it. In many cases you can have an incredibly well-crafted technological solution, and still the weakest link in your chain is going to be the people who are part of the ecosystem involved in that solution. And you've done some really interesting work in this area. So I just wanted to say that to kick off the conversation, and I would love to get your thoughts on the human element of communication security.

Ron (02:40): Well, definitely. It's not that humans are bad or trying to be malicious on purpose. We're often just trying to take the path of least resistance in solving whatever problem we have, whether it's communicating remotely, like so many of us are doing right now because of the coronavirus, or just trying to get our jobs done. So it's important to understand the user's perspective in developing solutions. That way they can implement them as easily as possible, and security is not seen as a blockade, as it sometimes is. So it's the idea of understanding the user experience, or UX, as a part of the technology development process: talking to end users, finding out what they want, how they will use it, and how they might misuse the technology. That's what I've been studying for about the last 10 to 12 years.

Kathleen (03:36): So why is this such a difficult problem to solve? Because it's been around for some time. As you said, you've been studying it for 10 years. Why is it so hard?

Ron (03:47): Because we're all different, but we're all very similar. If you read the book Predictably Irrational (the author's name is slipping my mind), he studied this: how we're very predictable, but we all tend to be irrational in how we approach and solve problems. So that's always going to be the challenge. Plus, coming from technology, I'm a techie, a geek, a nerd. I think my perspective is the way to go, and it's sometimes hard for me to understand the end user's perspective. It also takes a lot of time and effort as part of the development process to go out and talk to sample user populations to find out how they will leverage the technology and, again, how it could be abused and misused as well.

Ron (04:38): And that's why the whole development life cycle needs to include requirements testing throughout, and then penetration testing or red teaming at the end of the process to find those security gaps.

Kathleen (04:52): So, during the design stage of architecting either a technological solution or putting together a cybersecurity program, what are some of the best practices organizations are currently using to try to tackle this before it becomes a problem?

Ron (05:13): Of course, the answer is: it depends. The really progressive ones have user focus groups available early and get them involved in the conversations and the development process. So many times, though, what happens is we take a compliance-based approach, where we look at the current requirements like NIST, PCI, HIPAA, and ISO, and we base our security on those, which aren't necessarily based on how end users will actually use the technology.

Ron (05:44): Keep in mind, most of the laws and regulations were written a long time ago. NIST, the framework many of us now follow, was actually written back in 2012, 2013. Think about how things have changed since then. We're so much more reliant on mobile technologies and cloud infrastructure, and that was not necessarily included in those early compliance documents. So again, the idea for developers is just finding out who your user base is, getting them involved in the process as early as possible, and then understanding how they think and how they are influenced, which is actually a whole other area of human aspects that we like to talk about.

Kathleen (06:28): Is that really where design thinking comes in, structuring your development process to have those sprints where you're really engaging prospective users and essentially observing them as they behave in the wild?

Ron (06:44): That's exactly right, Kathleen. You need to observe how they will leverage the technology early so you can understand not only how they will use it in normal environments, but also how it could be used abnormally, through the testing cycles. For example, I am a hacker, a pen tester, and one of the things I'll do is figure out how technology breaks. And that's because end users will accidentally break things all the time. They won't even understand how they did it. They'll just start playing with it and all of a sudden, poof, they can get into an insecure state. So that's why having security as part of these agile sprints can be very beneficial. And then understanding how to make the technology usable, so the end users who are leveraging it will want to use it. It's user-friendly.

Ron (07:44): So from the design principles, we're looking at it based on psychology. Is the product easy to understand? Is it almost likable? Just as we like humans, we can like technology as well. And understanding those types of usability gaps actually comes from fundamentals that were designed back in the 1970s.

Kathleen (08:07): Yeah, it's fascinating. Some of this stuff, you said NIST is from what, six years ago? And other elements are from even further back than that. What are some common ways you see these issues manifest themselves? I imagine there are certain behavioral things that come up over and over again that you see as common mistakes in the process.

Ron (08:34): Right. People, again, will take the path of least resistance, for example, with their network security. So many, when they're working from home, don't want to use a VPN, for example, because of that fear, the unknown unknowns, if you will.

Ron (08:49): I'm afraid if I use a VPN to encrypt my network traffic, that it will slow everything down or that I won't understand how it really works. So it's just, how do we educate as a part of this? And that's where being a teacher really comes in handy. Being able to teach early and make it very, very small chunks of information. Because if you overwhelm your end users with too much information right away, then they won't bother either. So it's just, how do we take like, two to three minutes worth of chunk learning to help our end users leverage the technology?

Kathleen (09:29): You just raised education, and that actually taps into the next question I was going to ask you. Knowing that, at least at the stage we're at right now, there isn't a way to guarantee that any security solution is going to be architected in a way that is perfect or completely locked down, I think we have to acknowledge that we need to do something to encourage users to comply, if you will, with the behavioral traits we're looking for. Tapping into psychology, I feel like there are a couple of different ways you can approach this. You talked about education, which can be very anticipatory. But then there's also the carrot-and-stick approach: there are incentives, and there's punishment for noncompliance. What do you find is most effective? I'm assuming it's probably a mix of these things, but I've never seen any data to support what works and what doesn't in this regard.

Ron (10:36): Actually, there's a lot of data saying that the stick, in the carrot-and-stick approach, doesn't really work very well with most people. Young kids, maybe, but you really need to show people the benefits and how it benefits them personally. So, Simon Sinek, Start With Why: what is their personal why, the reason they want to do this? For example, taking from the current situation with the coronavirus, we're constantly being reminded to wash our hands and not to go out if we're sick. This is nothing new, right? But it's really being driven home now because it's affecting us personally. We don't want to impact our neighbors if we happen to have this and not even realize it. So it's taking those ideas from outside the world of technology and bringing them in. We call it cybersecurity hygiene.

Ron (11:28): So what does that hygiene we can be taking and doing, like washing our hands? Well, it's like spring cleanup around our homes right now. We're recording this at spring. We should be doing spring cleanup on our PCs. Are we running only the programs we need to run? Otherwise there's too much overhead. Just like if we have too much junk around our house, it makes it hard to walk around. Too much junk on your computers or on your networks, it's too hard to, to be able to allow the traffic, network traffic, that needs to get through. And so many people don't even know what's on their home networks with all of the IOT types of devices that people are plugging in. A common complaint I hear is that now everyone is working from home and all the kids are taking classes from home. It's bogging down the network. So look at your inventory. Start with the inventory. What do you have? If it's not needed, unplug it. Get rid of it. Make those decisions. So it's really what we need to be doing, on the technology side, is really learning from how we do things in our daily lives.

Kathleen (12:31): So do you have any examples you can point to at the organizational level, of companies or organizations that have done this really well? How does that play out? What does it look like?

Ron (12:44): Yeah, so the companies that do cybersecurity well, again, they make it personal for the employee. They show how the employee is part of the security equation through the little steps they take, keeping it as simple as possible. So KISS, if you will: keep information security simple. Maybe it's just making sure my computer is being updated, or knowing, at the start of a Zoom meeting, what am I talking about? If there's anything sensitive, then making sure we know who is in the Zoom meeting. Those little behavioral steps that everyone can be involved in go a long way, letting the employees know that they are an important part of the security equation. And again, encouraging the right behavior. So, learning from common management practices like The One Minute Manager, remember? The whole book by Blanchard and Johnson where they talk about one minute goals. What are the one-minute things end users can do? One minute praise. In security we like to just tell people how they're doing things wrong. No, we need to show them: hey, you're doing things right. When you compliment how things are going well, then when there's room for improvement, it makes it so much easier to make those subtle little changes. So that again comes back to this human aspect: understanding basic influence and leadership techniques, and bringing them back into communications and security.

Kathleen (14:16): Well, that's a really interesting point, because that makes me think about where responsibility for this typically lies within the organization. Who do you see within companies or larger enterprises, or even educational institutions, nonprofits, and government, typically owning responsibility for this?

Ron (14:41): In the ones that are doing it well, and I'm seeing this right now with my clients as a cybersecurity consultant, it's where upper management is taking the lead, because it comes down to having the right culture of security. At one of the companies I'm working with, the CEO is leading the charge for security. Even though he's delegated it to his chief security officer and chief technology officer, he's the one who is saying, hey, this is what we need to do to be successful as a company. I've seen the same thing with the chief financial officer, because security is really about risk, which is an economics problem. For example, I was working with one client and they were wondering why they were using so much network bandwidth. This was the first time the CFO had asked the question: so where are we using all of our network bandwidth? And the network engineer was in the room and said, oh, well here, let me show you.

Ron (15:32): And I pull it up. And lo and behold, what were the top sites? Facebook, Netflix, Google. The CFO had a minor cow right there because this was a defense contractor where employees shouldn't have been watching Netflix during, you know, using the company bandwidth. So right there, he was able to set the direction and then percolate it down to the rest of the company. So it's just, again, common, good leadership techniques. Take it from the top. Too often we see Kathleen, and this is a situation, unfortunately this may resonate with some of your listeners. It's known as Spaf's first theorem, first law, of security administration. It goes like this. If you have responsibility for security, but you don't have the authority to enforce it, you're just there to take the blame when something goes wrong.

Kathleen (16:24): That's so true. I think that's true of most positions of responsibility.

Ron (16:31): Very true. So that's where they need to match up, and the companies that are successful match responsibility with the authority for network communications and network security.

Kathleen (16:43): That gets me thinking about leadership paths for cybersecurity professionals, because so many of them start out with more of a technical education background. But from everything you're saying, I'm hearing that if you're going to be an organizational leader in the field of cybersecurity, you need to have really strong skills in education and leadership, all of these other soft skills that are not necessarily the things we associate with an education in cybersecurity. I wonder if you could speak to that.

Ron (17:17): Certainly, that's sometimes the challenge with this. I'll be talking to young folks trying to break into security, or they're already a network security analyst working in, say, a security operations center, and they'll ask, well, how do I get a leadership or management position? Well, you can actually start within your communities. There are numerous groups throughout the United States and throughout the world, like (ISC)2, ISSA, ISACA and InfraGard. So take a leadership role within your local community. It's a great way to build up your human networking connections and give you practice with leadership and management as well. There are numerous other groups too; it's just taking initiative, taking a lead, and finding a mentor. That's sometimes what we forget about as well. The really good security professionals learn how to coach and mentor. So first of all, if you're a junior in cybersecurity or network communications, find a mentor. Ask: will you mentor me? And then, as you get more advanced, become a mentor to other people. It becomes very synergistic, where you can co-mentor each other and build each other up. And that really helps the whole community, because you're gaining understanding in real-world environments, not just within a classroom.

Kathleen (18:40): That's great advice. So what is the implication of this problem not being solved? When we think about the human element, it could be one person somewhere in a massive organization of thousands of people. If one person makes a wrong move, how big of a problem can that be?

Ron (19:00): Well, it can take a company down when it comes to ransomware. It just takes one person not thinking, going unconscious for a few minutes, getting that email that looks real, clicking on that attachment, and all of a sudden, poof, because they didn't take responsibility for securing their system or knowing what network shares they had access to. So one person can take down an organization, unfortunately. The way to do this well, again, is continual coaching of employees. Why do we keep seeing this? There's a phrase that a few of my buddies and I created about 10 years ago: Security Groundhog Day. It's like the Bill Murray movie; we're reliving the same day over and over again. Because it's common practices: good network security, making sure you're configuring your network devices appropriately, staying up to date on patches, having policies in place, knowing who has access to what, where, when, why, and how. These things are nothing new, even though the technology has changed. We're now using all sorts of devices on our bodies and in our homes, but the same philosophies and concepts ring true. So it's going back to the basics of blocking and tackling, to take a sports analogy. Work with your users, coach them, and again, show them why it's important. Get them involved.

Kathleen (20:31): Great. Now shifting gears for a minute, I have a couple of questions that I always like to ask my guests when they come on the podcast. The first one is, with the way that we communicate and manage data changing, what do you see as the biggest challenge that we as a society are going to face in the next few years when it comes to securing our communications?

Ron (20:57): It's the challenge we're still seeing today: we're plugging so many devices into our networks now, whether it's the business network or the home network. Do we know everything that we're plugging in, whether it's wired or Wi-Fi? Do we have the policies in place to tell the end users that, no, you can't plug in a Google Home or Amazon Echo type of device? They're just thinking, it's a Bluetooth speaker, I want to listen to music while I'm working, why is that not allowed? Or, I'm connecting my watch, whether it's an Apple Watch or Fitbit or whatever, and potentially plugging that into a production network. So it starts with that inventory. What do we have? Because you can't secure things if you don't know they even exist. Getting alerted that, hey, we have new devices on our network. Should they be there? Make a decision, yes or no. If yes, then are they configured appropriately? Many times the manufacturers give direction on how to secure the device, but people may not take the time to do it. So realizing, it's up to all of us. Just like we secure our homes and our cars, it's up to all of us to secure our business environments as well.

Kathleen (22:12): It's pretty unbelievable how much of our lives are run by some degree of internet connectivity. I think most people have no idea how many things they have running on their network. So following on that, looking ahead to the next five years or so, what new secure communications technology are you personally most excited about?

Ron (22:43): Well, I think making security simple, that KISS approach: keep information security small, simple, and accessible for everybody. That's where I really see the innovations, where we can bring security with us wherever we go. Let's just say I need to work from a coffee shop; they're known to be very insecure. Or an airport. When I would travel, I would get a lot of work done at airports, but using their Wi-Fi is just not always a good idea. There are numerous steps we need to take, and again, sometimes we're in a hurry and we forget about locking the car doors, if you will, with our network security. So keeping those devices small and simple, where we can plug in our security wherever we are, on any of the devices that we have. This way, we'll always be in a secure state and we won't even have to think about it, like how today we lock our car doors as we're walking away from the car because it's become unconscious for us. It becomes so easy. We need to get to that within our network communications and security.

Kathleen (23:48): Great. And is there anyone in particular that you think is doing really interesting or bleeding-edge work in this area of communication security?

Ron (23:58): Well, of course Attila, with your products, making them very small and accessible. Really, again, it's that approach we need to take, and we're seeing this with numerous other companies as well: integrating security into the technology. So it's not just an add-on. How do I secure a home after it's already been built? No, you want to build security in. So that model is part of the application or product development life cycle as well, where you can turn to the companies and find out, yes, they've already been certified in security and can prove that they have the right amount of security depending on the sensitivity of the data or the systems in use.

Kathleen (24:45): Great. All right, well we're coming to the top of our time, Ron. If someone wants to learn more about you or follow you online, what's the best way for them to do that?

Ron (24:56): Certainly, go out to my website. You can also follow me on Twitter at @ronw123 or connect with me on LinkedIn, just Ron Woerner. I make myself very accessible on purpose because I love connecting with people. Together, we can solve today's security challenges, and that's why I love being able to come on podcasts like this. So Kathleen, I appreciate you having me.

Kathleen (25:22): Oh, I appreciate you coming on. I will put all of the links you just mentioned into the show notes, so you can find those on the website. Thank you for joining me this week, Ron. And if you're listening and you enjoyed this episode, please consider leaving the podcast a review on Apple Podcasts or wherever you choose to listen. We want to hear from you, so if you have an idea for a future episode or a guest we should speak with, tweet us @Attilasecurity and we'll look forward to hearing from you. Thanks again, Ron.

Ron (25:55): My pleasure. Be safe everyone.

Kathleen (25:57): You as well.