A problem with security awareness programs is that it is so hard to reach everyone in an organization. Information security professionals have long said that “security is everyone’s business”, but that is an empty phrase if people don’t agree. This blog post is more about “why should anyone care” than about how to actually reach people. If we can answer that question, the rest becomes much easier.

Why should anyone listen to what you say?

Imagine that you work as a sales executive in your company. It is Monday morning, and you are early in the office to set the plan for the day. You plan to prepare the morning meeting with the sales managers, have a call with marketing about a new campaign launching next month, and go over the sales rep hiring strategy with HR. The sun is shining; it looks like a busy but good day for business. An email arrives from information security with the subject “Mandatory anti-phishing security training”. You think to yourself, “I know what a phishing email is; this is probably yet another boring 20-minute e-learning session. A waste of time.” What do you do next?

  1. Do the mandatory training immediately, to get it done with
  2. Schedule 30 minutes to do the training at a more convenient time
  3. Delete the email and go on with your day

If you picked option 3, that is perfectly understandable. It is far from obvious how that awareness training would help you get your job done; it is an interruption, and it is not clear how it supports growing sales over the next quarter or hiring the next great sales representative.

Our executive, John Harris, deleted that email.

John Harris, our sales executive, planning his day in the morning. Does he have time for a 20-minute e-learning session on phishing now?

This is not an information security problem as such; it is a communication problem, perhaps even a leadership problem. Within a company, we are trying to shape behaviors that reduce risk exposure and thereby help the business. That is leadership. A classic article on persuasive leadership, “Why should anyone be led by you?”, was published in the Harvard Business Review in 2000 by Robert Goffee and Gareth Jones. They identified four characteristics of inspirational leaders:

  1. They selectively show some weaknesses. This reveals their humanity and helps build trust; without trust, we don’t want to be influenced by someone.
  2. They manage with “tough empathy”: they empathize with employees and care about the work they do, but give employees what they need rather than what they want.
  3. They use intuition to help decide what action to take and how to time it.
  4. They make it visible how they are different from other leaders, which makes them more interesting and worth following.

It is easy to see that there is a long way from how the average security awareness program seeks to shape behaviors to what Goffee and Jones described as the characteristics of inspirational leaders. This gap severely limits the reach of many security awareness programs. And that is what security awareness is really about: inspiring behavioral changes that benefit the cybersecurity maturity of the organization.

Let’s get back to the sales executive and that Monday morning at the office. This time you are the awareness program manager at the company, and you are concerned because a wave of Emotet phishing emails has recently hit your company, with by far the most infections in the sales department. You need to reduce the threat. How can you act differently from using the e-learning system to send a mandatory phishing training email to everyone? Perhaps the content of that training carries exactly the message you need to get across, but first you need to catch people’s attention.

Timing: obviously you need to act quickly, since this is an ongoing attack campaign. But what does your intuition tell you about the course of action that will best shape the behavioral change you need? To get the sales department to listen, it would be great to have Harris as an ally. He cares about sales numbers. What does Emotet do? It installs further malware, steals credentials, and can allow criminals to exfiltrate data. Antivirus has detected malware on the sales reps’ computers, and cleanup has cost the sales team productivity. These are problems you can help the sales executive solve. Perhaps all you need is a phone call before you send that email from the e-learning system?

As a security awareness manager you not only need to know what people are doing and when the right time to reach them is, but also which events in the threat landscape should trigger awareness messages. We think building awareness messages on relevant threat intelligence is a great way to provide valuable input to people.

Balancing between low visibility and overload

Sometimes a good message drowns in the noise. If you get “nagging” from the security team all the time, you will likely just classify it mentally as noise, like the reminders from HR to fill in your time sheet. On the other hand, if a message is sent through only one channel, for example the intranet, you will miss a lot of the people you want to reach.

Not everything is important

Are we prioritizing messages, or are we just distributing everything through the same channels, to the same people? Obviously, some things are more important than others.

Some channels should perhaps be reserved for the most important messages. Take the recent iOS vulnerabilities that made it possible to hack an iPhone just by sending the owner an email. Let’s say your executive team has previously been targeted by hackers-for-hire, and that they all use company-issued iPhones. What would be more effective: sending another email from the same sender address as those “mandatory training” e-learning emails, or texting them directly on the phone? Obviously, a text is more likely to reach the person in question, but it is also more intrusive. Reserving the most intrusive channels for the most important messages is a good idea.

What can be less important? A message asking people to take an e-learning module on recognizing phishing is probably less important than many other things, unless there is a specific threat of such attacks. An infosec channel on Teams or Slack may then be the right place for that message, perhaps followed up with an email reminder after a while. But you wouldn’t send e-learning nudges as text messages to the executive team; that is too intrusive.
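
To make this concrete, here is a minimal sketch in Python of how a team might encode such channel-selection rules. The priority levels, channel names, and mapping below are hypothetical examples for illustration, not a reference to any specific tool or policy.

```python
from enum import IntEnum

class Priority(IntEnum):
    """How important (and urgent) an awareness message is."""
    ROUTINE = 1    # e.g. generic e-learning nudges
    ELEVATED = 2   # e.g. an ongoing phishing campaign against the company
    CRITICAL = 3   # e.g. an actively exploited zero-day affecting executives

# Hypothetical mapping: the most intrusive channels are reserved
# for the most important messages.
CHANNELS_BY_PRIORITY = {
    Priority.ROUTINE: ["intranet", "teams_or_slack_channel"],
    Priority.ELEVATED: ["teams_or_slack_channel", "email"],
    Priority.CRITICAL: ["sms", "phone_call"],
}

def pick_channels(priority: Priority) -> list[str]:
    """Return the channels to use for a message of the given priority."""
    return CHANNELS_BY_PRIORITY[priority]

if __name__ == "__main__":
    # A routine e-learning reminder stays in low-intrusion channels...
    print(pick_channels(Priority.ROUTINE))
    # ...while a critical warning to executives justifies a text message.
    print(pick_channels(Priority.CRITICAL))
```

The exact mapping matters less than the principle: the more intrusive the channel, the higher the bar for using it.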

Recruit allies to help shape behaviors

If you are a ghost in the wire, nobody will care. Messages that gain support from people others listen to are more likely to be acted upon. This is why recruiting influential supporters throughout the organization is time and money well spent. These are your security champions: people on your side who can help sell your product, the security message. Great security champions don’t need line management responsibility, but they should be respected by their peers and believe in security as a business enhancer.

The role of leadership

Leadership plays an important role in security awareness. The most important part is walking the walk. Managers who follow security policies and visibly make security objectives part of their business objectives do more for the security level of the company than any e-learning system can.

A leader who makes security priorities visible through actions does more for security than any e-learning system ever has.

This is why the security team should support business managers in getting their jobs done: spending time with managers, talking not only about threats but also about how good working practices help both productivity and security at the same time.

Channels: how should we send our message to optimize reach?

Let’s say you have your security champions supporting you, and leaders who exemplify good security behaviors. You are planning how to use the communication channels available in your organization. Which channel do you use to reach whom? And what message are you trying to get across? Let’s take a page out of a marketing textbook and start by creating “security personas” instead of “marketing personas”.

A marketing persona, or in our case, a security persona, is a description of a fictitious person you are trying to “sell” your message to. It will help you frame your message, understand the motivations of that person, and choose a good channel for reaching them. Here are a few questions you would ask to define a persona: 

  • What is their business role? What performance goals do they have?
  • What is the educational background of this person?
  • What is their primary motivation at work?
  • What do they care about?
  • What is their expected age group?
  • How do they communicate with others in their day-to-day business?
  • What are stress factors for them?
  • What changes would help them reach their goals more easily?

Few security teams think through these things before creating a communications plan. Doing so is very helpful, as it shapes the language of the message, the frequency and timing of messages, the format, and the channels to use. This is putting “tough empathy” into practice for cybersecurity awareness programs.
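
If you prefer to keep personas in a structured form rather than in a slide or document, a simple data structure is enough. Here is a minimal sketch in Python; the fields mirror the questions above, and the example values for a sales-manager persona are purely illustrative.

```python
from dataclasses import dataclass

@dataclass
class SecurityPersona:
    """A fictitious recipient we are trying to 'sell' a security message to."""
    name: str
    business_role: str
    performance_goals: list[str]
    motivations: list[str]
    stress_factors: list[str]
    preferred_channels: list[str]  # how they actually communicate day to day
    age_group: str = "unknown"

# Illustrative persona for a sales manager during the Emotet wave.
sales_manager = SecurityPersona(
    name="Sales Manager Persona",
    business_role="Regional sales manager",
    performance_goals=["Hit quarterly revenue target", "Grow pipeline"],
    motivations=["Closing deals", "Team performance"],
    stress_factors=["Lost selling time", "Workstation downtime after malware cleanup"],
    preferred_channels=["phone", "in_person", "teams_or_slack_channel"],
)

# The persona tells us which channel and framing to choose:
print(f"Reach the {sales_manager.business_role} via {sales_manager.preferred_channels[0]}")
```

Keeping personas in a structured form like this makes it easy to check, for each planned message, which framing and channel each audience calls for.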

Let’s use HubSpot’s free tool for creating marketing personas to build a persona for sales managers, since we are trying to stop the Emotet wave. You can find the tool at https://www.hubspot.com/make-my-persona

A “security persona” is a marketing persona adapted to the needs of selling security internally.

We know his goal is revenue growth, and the problem we are trying to solve (malware infections) is limiting the sales representatives’ ability to deliver that growth. Our interests are aligned.

We know John communicates by phone or in person; sending an email is likely to have less impact. Perhaps a phone call, and a quick coffee together, before sending e-learning messages to the sales team would help? What if John talked about security as a threat to reaching sales goals in the sales manager meeting? What if he asked his managers about productivity challenges and the recent phishing attacks? That could inspire a completely different reception of those e-learning emails.

Fast and slow messages

Some years ago the psychologist Daniel Kahneman published his bestselling book “Thinking, Fast and Slow”, in which he described humans as having two ways of thinking: one fast, intuitive, based on recognizing patterns and supporting quick decision making; the other slow, analytical, based in logic. Most of the time we act in the fast system, sometimes even when the slow system should have been triggered.

This may be a simplification, but shaping our awareness program around the principle of turning responses to attack triggers into fast responses, that is, good security habits, is very helpful for reducing the human attack surface in the organization. Fast reactions are unavoidable, and they are also what makes us fall for social engineering scams. When criminals craft a scam so that our “fast system” switches on and we act before noticing any signs that something is not right, they succeed.

Our slow system is also necessary: it helps us design systems and processes, and it supports decisions about which habits to internalize. When we shape our awareness programs, we should tend to both ways of thinking. Unfortunately, most awareness programs focus heavily on an analytical approach to decision making, whereas we spend most of our time using the intuitive, fast way of thinking. This is why asking people to inspect email headers will not reduce the number of Emotet infections in your organization.

Three awareness message classes

Awareness messages are not all the same. Broadly speaking, there are three types of messages we want to communicate: 

  1. Orders: we have an imminent threat and need to tell people what to do. Now. This should not happen very often!
  2. Supporting habits: building good “fast thinking responses”. This is the core objective of the day-to-day awareness program. Short, to the point, more like a reminder than a lecture.
  3. Motivating security practices: the analytical background we use to build acceptance for our “fast thinking” habits. This is logical, supported by facts, procedures, and compliance.

Let’s start with orders. These require some slow thinking up front to support fast decisions in the security team. Take a zero-day allowing full takeover of a computer with no user interaction: we need to tell people to turn off a feature, or to do something we cannot manage remotely. This is the time to issue an order. But will it work? It depends. If people know that “security orders” exist, that they are very important, and that there is a process behind them, it can work very well. If an “order” comes out of the blue, many people will still do what is required, but some will refuse, think it is hyperbole, perhaps even get angry. So to support the “order” system, you need to build acceptance by talking to leaders, recruiting security champions to support it, and communicating about it during normal times as part of your security awareness program.

What are the “fast system decisions” we are talking about as good security habits? Is it “hover over the link to check whether the link in an email is potentially dangerous”? No, that is definitely analytical behavior requiring conscious, planned action. Rather than that tip from the typical awareness curriculum, it is the fast decision to share a document through Office 365 instead of sending an email with an attachment. It is the choice to generate the password for your new online account with your password manager, instead of typing in something easy to remember, like the Summer2020 you also used on Facebook and eBay. When people routinely make choices that differ from the patterns a typical social engineering attack mimics, they build a more natural fast response to that attack, because it becomes easier to recognize that “this does not follow the normal pattern”. In other words, training people to use modern collaboration tools is better for security than another video telling them to hover over links before clicking.

Building motivation for fast responses with slow thinking

How can we shape our messages for the “slow system” so that they motivate acceptance of good “fast thinking” patterns? Knowing WHY we do something is a strong motivator. Most people, if simply told what to do, will be reluctant to comply unless it is clear why it would be beneficial. This is where the analytical system of thought comes in.

Which analytical messages to send depends on the role. We should tie those messages to the “security personas” we discussed above: how do the security habits we want to shape align with the business objectives, as well as the personal objectives and goals, of the personas we defined?

Let’s get back to John Harris, our sales executive, and the Emotet danger. Starting from the position of the security team: how would we defend against Emotet infections?

  • Patching of computers and antivirus
  • Blocking of command and control at the firewall based on threat intelligence
  • No administrative accounts on endpoints
  • GET PEOPLE TO STOP OPENING THOSE WORD ATTACHMENTS!

So we have a few technical controls; that’s fine. The last item is about not opening Word attachments and then activating macros. That’s what the security department wants. How does this align with what the sales executive wants? He wants revenue growth. His team has had malware infections threatening short-term growth through workstation restoration, and potential longer-term problems from leaked documents or even ransomware deployed through the Emotet command and control. That would of course mean more downtime, and also future clients distrusting the company if it reaches the media. The alignment of interests is relatively clear: no downtime, no reputation damage.

A key problem here is people opening attachments. We could tell them to stop opening attachments. But if the typical workflow of a sales representative is to write proposals in Word and send them as attachments to colleagues for comments, and then to prospective clients, that instruction would not work. You would literally be telling them to stop doing their jobs.

Another option would be to train them to look for signs of a “suspicious email”: strange wording, an unusual sender address, an unexpected Word document, a prompt to activate macros, and so on. The problem is that this requires switching on the analytical approach every time you receive an email with an attachment. If that happens 20 times per day, you have already lost.

Your strategy could instead be to train them to use cloud software for internal work, with collaboration tools that aren’t available outside the company. That way an email with a Word attachment would be highly unusual. This has the added benefit of faster information sharing within the sales team, and more collaboration. You are aligning with the goals of the sales executive! So if you have a cup of coffee with John and discuss these things before sending any training material, the completion rate will likely skyrocket. And if you team up with HR’s training department and have them help with training on using the cloud tools effectively, you will move ways of working much faster from the old patterns that criminals emulate in their social engineering attacks to new ones with much better security.

Measuring awareness

When companies provide awareness training, they often have little data to show that the training actually makes the company less vulnerable to cyber attacks. Of course, measuring aggregate behaviors of people in an organization as a response to limited inputs is very difficult. There is a lot of noise and most likely a weak signal. 

Commonly used ways of measuring awareness program performance

The most common way to measure the performance of a security awareness program is to look at completion statistics for e-learning modules. This is more a compliance measurement than a performance measurement; registering how many people have clicked through a learning module says very little about how much of the knowledge they have internalized. Measuring completion is not useless, but it doesn’t really measure what we want to know. So let’s first discuss what we actually want to measure:

  • Is cybersecurity knowledge improving?
  • Is cybersecurity practice improving?
  • Is the risk going down?

Many companies use e-learning with built-in quizzes at the end of each module. While this is good for helping people remember what they just learned from a video or text, it is not a good way to measure knowledge, improvements in practice, or changes in risk.

One common type of test that tries to measure risk level, and perhaps practice, is the simulated phishing test. How many people click on a phishing email depends a lot on how “difficult” it has been made, so phishing tests produce quite uncertain data, but they do to some extent measure the ability of people in an organization to detect and avoid falling for social engineering scams.
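
As a small illustration, here is a sketch in Python of how you might summarize simulated phishing results per department. The data format and department names are made up for the example; real phishing platforms export their own formats.

```python
from collections import defaultdict

# Hypothetical export from a simulated phishing campaign:
# (department, clicked_link, reported_email)
results = [
    ("sales", True, False),
    ("sales", True, False),
    ("sales", False, True),
    ("engineering", False, True),
    ("engineering", False, False),
    ("hr", True, False),
]

def summarize(rows):
    """Compute click rate and report rate per department."""
    counts = defaultdict(lambda: {"total": 0, "clicked": 0, "reported": 0})
    for dept, clicked, reported in rows:
        counts[dept]["total"] += 1
        counts[dept]["clicked"] += clicked
        counts[dept]["reported"] += reported
    return {
        dept: {
            "click_rate": c["clicked"] / c["total"],
            "report_rate": c["reported"] / c["total"],
        }
        for dept, c in counts.items()
    }

if __name__ == "__main__":
    for dept, stats in summarize(results).items():
        print(f"{dept}: {stats['click_rate']:.0%} clicked, {stats['report_rate']:.0%} reported")
```

Tracking how many people report the simulated email, not only how many click it, can give a more constructive metric to follow over time.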

Operational measurements

In addition to artificial tests like simulated phishing or quizzes, one can use operational data to gauge the security performance of the organization: metrics such as security reports from end users, incident reports, malware infections, and so on. These are good measures of risk exposure, but they can be difficult to tie to activities in the awareness program.

Human factors: measuring the likelihood of human error

If we look at the underlying factors behind why we fall for social engineering attacks, we need to turn to psychology. It is not that we do not know that a link can be dangerous; it is that our attention is elsewhere and we are focused on getting the job done. If there are many distractions in our environment, or in our lives, the probability of letting our guard down increases. The same can be said about typical “human errors”, like sending a confidential spreadsheet to the wrong person, or the developer who forgets to validate user input before storing it in the database. How can we measure these factors?

A field of study that attempts to estimate the risk of “human error” while performing an operation is HRA, or human reliability analysis. This type of analysis combines an assessment of the complexity of the task at hand with so-called performance shaping factors to gauge the likelihood of mistakes that cause undesired events, such as opening a Word attachment from an email and then activating macros without thinking about it.
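
To illustrate the idea, here is a minimal sketch in Python of how performance shaping factors might adjust a nominal error probability, loosely inspired by multiplier-based HRA methods. The factor names, multipliers, and nominal probability are invented for illustration and are not calibrated values from any published method.

```python
# Loosely inspired by multiplier-based HRA methods: a nominal human error
# probability (HEP) is adjusted by performance shaping factors (PSFs).
# All numbers below are illustrative, not calibrated values.

NOMINAL_HEP = 0.01  # assumed baseline chance of e.g. opening a malicious attachment

# Hypothetical PSF multipliers: >1 increases error likelihood, <1 decreases it.
PSF_MULTIPLIERS = {
    "high_time_pressure": 5.0,
    "many_distractions": 2.0,
    "task_is_routine_and_well_supported": 0.5,
}

def adjusted_hep(active_psfs: list[str]) -> float:
    """Multiply the nominal HEP by the multipliers of the active PSFs, capped at 1.0."""
    hep = NOMINAL_HEP
    for psf in active_psfs:
        hep *= PSF_MULTIPLIERS[psf]
    return min(hep, 1.0)

# A stressed, distracted employee is far more likely to slip than the baseline:
print(adjusted_hep(["high_time_pressure", "many_distractions"]))   # 0.1
print(adjusted_hep(["task_is_routine_and_well_supported"]))        # 0.005
```

The point is not the exact numbers, but that stress and distraction can multiply the likelihood of a slip, which is why the working environment matters for security.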

There are many factors that influence our ability to make good decisions. Performance influencing factors are important for cybersecurity.

At Cybehave we have adapted knowledge from human reliability analysis, as used in the nuclear industry, aviation, and oil and gas, to the reality of the information worker. Using such methods to characterize the organizational culture can greatly help prioritize the factors that motivate good security behaviors.

Working environment

The working environment greatly influences decision-making performance, whether we are talking about the fast or the slow system of thinking. People who feel safe and are not under unhealthy levels of stress are more likely to make good decisions, both when reacting intuitively and when performing conscious analysis before selecting an option. This clearly has implications for cybersecurity.

Most companies run annual working environment surveys. The results of these surveys are important input for security awareness programs too. Because of this, the security team should work closely with HR and leadership on working environment topics, and at least get insight into the results:

  • Do you have departments where the stress level is high? 
  • Do you have departments where there is a high level of conflict?
  • Do you have departments where employees distrust their direct managers?
  • Do you have departments where employees distrust the company management as a whole?
  • Do you have departments where employees report harassment from peers or managers?

In addition to working environment surveys, some operational HR data can be used to assess performance shaping factors as well, such as overtime use (an indicator of too many tasks, causing stress) and high employee turnover (indicating an unhealthy working environment and discontent).

As this discussion shows, actually measuring what we want to know about cybersecurity awareness is very difficult. Still, we should measure some essential characteristics and use them to improve how we communicate with the organization about security. Cooperation is critical to good communication and can do a lot to increase the reach of your security awareness program. Cooperating with leaders and HR is important here, and to make that cooperation work well we should remember to see the relationship from the other side: what is the benefit to the HR advisor or line manager of spending time on your security issues?

Reaching them all (tl;dr)

So, we can conclude that optimizing security awareness reach to influence everyone is difficult. Here are some bullet points to summarize our approach to better security communications. 

  • Give people a reason to listen to you. Put a face on your communications, and remember the four qualities of inspirational leaders: (1) show that you are human by revealing some weaknesses, (2) use tough empathy: understand what they do and what they want, but give them what they need, (3) use intuition and “soft decision making” to decide on timing and the actions to take, (4) use the differences that set you apart from others to make yourself noticeable.
  • Build workflows that are not easily emulated by attackers. Then social engineering attacks will be much more difficult to perform since the attack will stand out as “unusual” rather than as part of the daily workflow.
  • Use the three classes of awareness messages wisely to make it easier to build good security practices (warnings, reminders supporting fast thinking, motivators requiring slow thinking).
  • Use personas to shape the messages, formats, and channels to better fit the recipients of your awareness messages.
  • Connect what you measure to what you want to know. Use direct, operational and HR measurements to build a good picture of the overall security awareness performance.
  • Adjust your program according to needs; if measurements show that certain departments are under high levels of stress and that they click more than others in phishing tests, the right medicine is more likely found in leadership than in more reminders to avoid clicking those emails.

If you enjoyed this content, please join our weekly newsletter on cybersecurity and privacy management.
