
A people-centric approach to patching the human firewall

Sep 19, 2023

Editor’s note: The following was written by Zscaler Solutions Architect Director Ian Curtis.

When an attacker scans your environment for entryways, what’s the most promising vulnerability they will discover? 

Verizon’s 2023 Data Breach Investigations Report found that 74% of breaches involved the human element. That means humans still represent the greatest vulnerability in our cybersecurity posture. This won’t surprise many working in the field; it is part of our daily experience. Unfortunately, awareness of the problem has not reduced the risk it represents. Humans can seem like anomalies to a cyber-defender who’s made a career studying technology.

The difficulty of changing a security culture is often overlooked due to a few common assumptions.

Common false assumptions

1. When people know more, they will behave differently 

While some research demonstrates a slight behavior change when users undergo security awareness education, the impact tends to be minimal. If there was ever a time when we mistook humans for computers, this is it. When a computer fails to perform a function properly, we simply update the code and assume it will behave differently. There’s only one problem: humans are not computers. Just because they have been made aware does not mean they have been made to care.

2. We alone can effect seismic cultural shifts

You may have heard the tale of two fish being asked a question by an older fish passing by: “Hey boys! How’s the water?” The two confused fish respond, “What the heck is water?”

Water, in this scenario, is culture. Sometimes, the hardest things to see are all around us, with culture often more elusive than water for a fish. This doesn’t mean we can’t practically assess our company culture, but it does mean an outsider's assessment may be uniquely valuable. 

3. We can work against human nature

Humans, on balance, are both overloaded with information and incredibly social. We must therefore be careful with how we allocate our attention. Efficiency and socialization are critical to our survival. 

It would be foolish to think we can fight our nature. Nonetheless, we attempt to. For security professionals, it’s often easier to assume everyone who didn’t change their habits after a 30-minute training or countless email reminders is somehow deficient rather than simply human. To affect how people behave, we must ensure that doing the right thing is easier than doing the wrong thing.

Overcoming erroneous assumptions

Charles Kettering, famed head of research at General Motors, once said, “A problem well-stated is a problem half-solved.” He was referring to the importance of clearly defining terms, ensuring consensus, and then reducing them to a set of attributes that can be measured and improved. In our case, if you ask every stakeholder to define “security culture,” you will get varied responses. 

Therein lies much (50 percent, according to Mr. Kettering) of our problem. It's nearly impossible to secure buy-in, establish a baseline, and make improvements before first defining your terms.

Here I define security culture as the “ideas, customs, and social behaviors of a group that influence security.” This security culture is a subset of the overall culture and exists whether we foster it or not. It is also constantly changing and, if we do not proactively influence the direction of that change, it’s unlikely to change for the better.

Ideas are broad values organizations hold to determine what is acceptable. True organizational values are not found tacked up beneath a mission statement in a cubicle on floor seven. Instead, they are best observed by asking well-crafted questions to the widest audience possible. While there are many security culture surveys, I propose customizing them on a per-company basis. Questions should address aspects of security, but not relate directly to individual actions. In other words, don’t ask whether a user clicks on suspicious emails or allows piggybacking, but how the user understands the role they play in the security of their company. 

Customs refer to the accepted and observable norms adopted within the confines of the company culture. Some behaviors, like logging in with a password, will be automatic and contribute to overall organizational security. Luckily, these behaviors can be directly observed via existing indicators or addressed broadly within the aforementioned survey. 

Most organizations have data on how many phishing emails were reported and can compare that against overall phishing activity. They can combine that data with surveys asking employees which behaviors peers are most likely to engage in. It’s important to focus on others because people are far less likely to report their own mistakes than to report on the behaviors others have engaged in. 
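As a rough illustration of how those two inputs could be turned into a measurable baseline, here is a minimal sketch. All of the figures, field names, and the survey scale are purely hypothetical and are not drawn from any real dataset or tool.

```python
# Hypothetical sketch: combining phishing-report telemetry with peer-survey
# responses to establish a simple "customs" baseline. Every value below is
# illustrative only.

reported_phish = 130          # phishing emails employees reported this quarter
simulated_phish_sent = 1_000  # known phishing volume (e.g., a simulation campaign)

# Survey item: "How likely are your peers to report a suspicious email?" (1-5)
peer_reporting_scores = [4, 3, 5, 2, 4, 4, 3]

reporting_rate = reported_phish / simulated_phish_sent
avg_peer_score = sum(peer_reporting_scores) / len(peer_reporting_scores)

print(f"Observed reporting rate: {reporting_rate:.1%}")        # e.g., 13.0%
print(f"Perceived peer reporting (1-5): {avg_peer_score:.1f}") # e.g., 3.6
```

Tracking a pair of numbers like these over time, one observed and one perceived, gives a baseline against which later goals can be set.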

Social behavior refers to how these actions adapt to a broader changing culture. Remember, security culture is a subset of the overall culture. Culture, again, is an ever-changing force that ultimately dictates the success of most functioning organizations. So, while ideas and customs can be measured against a baseline, behavior is best addressed continuously en route to the goal.

Steps to transforming behavior

Define success

After establishing a baseline, we can define short- and long-term goals. A long-term goal should be achievable within 3-5 years and aligned with other long-term organizational goals. A short-term goal should take no longer than 12 months to achieve. When defining each, we should ensure they map to the first two components of security culture (ideas and customs) as attested by the data gathered.

Engage influencers

As described, humans are social creatures. To instigate positive security-related change, the effort must be supported by key stakeholders. 

There are two types of influencers that we must recruit to be successful: 

  1. Leadership Stakeholders - These are people who already have a track record of influencing the culture of the organization. They must believe in their ability to affect change if they are to be recruited to participate. Ensure they understand their role in influencing risky organizational behavior. 
  2. Detractors - These individuals will be the most openly opposed to the new security program, likely influencing others to feel the same. They are a significant hurdle to achieving cultural change. Involving these naysayers is often the surest route to winning them over. Recruit them as program ambassadors and suddenly once-vocal detractors become outspoken proponents. 

Adjust as necessary

More often than not, people make the easy decision, not the right one. This means we must make desirable behaviors easier to execute than risky ones. Is reporting an email easier than opening it? If not, we should swap formal reporting structures for a single, bright button that’s easier to find than any other.

Educate, don’t pontificate

Armed with new methods, we can begin immersing as many individuals as possible in education that ensures they know how to respond appropriately to potential threats. Remember, just because they are aware does not mean that they care. Each training touchpoint must be as personable, immersive, and relatable as possible. Fear alone cannot ensure engagement. While it’s true that we should stress the negative impacts poor decisions can have on the organization, it’s important to also demonstrate how good behaviors help us achieve business goals more effectively.

Celebrate

Finally, we should publicly celebrate high performers, ideally in ways they genuinely care about. This could include something as coveted as additional paid time off or as small as a coffee shop gift card. Regardless, it’s important that rewards and not just punishments are on the table.

A note on technology

Technology should never be the primary focus in patching the human firewall. That said, there are a few key concepts from the NIST Zero Trust framework that can reduce the likelihood of human-enabled exposure. 

When we think about how the NIST Zero Trust framework can help with security culture, consider core concepts like no assumed trust, continuous verification, and, most importantly, ensuring no internal employee or contractor is ever granted trusted network access in order to use an application.

With traditional network-and-firewall-based segmentation, we were forced to give every authenticated user full network access to everything within the security domain the application is connected to. With cloud-based proxy technologies, however, we can connect users to applications without granting network access. This means that, if they do click a phishing email, the attacker cannot fingerprint connected apps or move laterally thereafter. This alone can have a dramatic impact on reducing the risk associated with the human element of an attack.
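To make the application-versus-network distinction concrete, here is a minimal sketch of per-application authorization under the zero trust assumptions described above. The users, applications, and policy structure are hypothetical and do not represent any particular product’s API.

```python
# Minimal sketch: access decisions are made per application, never per
# network segment, and every request is verified explicitly.
# All names and policies below are hypothetical.

from dataclasses import dataclass

@dataclass(frozen=True)
class AccessRequest:
    user: str
    device_compliant: bool
    application: str  # requests name a single app, not a network

# Each identity is entitled to specific applications only.
ENTITLEMENTS = {
    "employee-42": {"crm", "payroll"},
    "contractor-7": {"ticketing"},
}

def authorize(req: AccessRequest) -> bool:
    """No trust is assumed from network location; verify every request."""
    allowed_apps = ENTITLEMENTS.get(req.user, set())
    return req.device_compliant and req.application in allowed_apps

# A compromised but authenticated account still cannot reach other apps:
print(authorize(AccessRequest("contractor-7", True, "ticketing")))  # True
print(authorize(AccessRequest("contractor-7", True, "payroll")))    # False
```

The design point is that even a fully authenticated but compromised identity can only reach the applications it is explicitly entitled to, so there is no flat network for an attacker to explore.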

Rinse & repeat 

Changing culture is a continuous, iterative process requiring repeated and consistent measurement, goal setting, and growth. Anything less is cultural decay. When compared to the gaping security hole the unpatched human firewall represents, the patch is well worth the effort.
