The security awareness industry can, if you ask Albert Einstein, be declared insane. Why? Security awareness training (SAT) keeps doing the same thing while expecting different results. When things are not working, instead of trying to understand what is wrong and then fixing it, the SAT industry just keeps throwing more of the same at already overwhelmed employees.
Whether you call it new school, old school, death by PowerPoint, training, awareness or even security culture, chances are you are following some vendor's recommendations and best practices, which you expect to make a difference. And when the vendor tells you to train more often, and favors training with assessments and questionnaires, you probably believe they not only know that it works, but that they have evidence that the recommendations will actually provide you with the results you expect when spending your budget on these tools and platforms.
What if I told you that research suggests that best practices from security awareness training providers not only miss the target, but can be counterproductive?
When you talk to your vendor next time, make sure to ask them for the evidence behind the actions they recommend. Most of them can't provide it. The problem is not that the sales rep or CSM you are most likely to be talking to doesn't know. The problem is that their employer doesn't know. They are, of course, highly unlikely to admit that. And yes, some of them may actually have valid data, but most do not.
My evidence for this bold claim is a research project my team and I started in 2020. We teamed up with a 5,000-employee company and their local awareness specialist, Caitriona Forde. Together, we wanted to know whether targeted training was more or less effective than the traditional shotgun approach of the SAT industry. We set up the project with two groups of employees: an experiment group and a control group. The experiment group received a highly tailored program, crafted especially for their specific needs (skills), their communication style and their culture. The control group received the industry-standard best-practices approach we are all too familiar with.
After implementing the program, which lasted almost 18 months from start to finish, we learned that, not very surprisingly, targeted training is much more effective than the traditional approach. It should be no surprise: people are unique in their needs, their education, their understanding of security, their roles and responsibilities, and their approaches to the world, so it makes sense that targeted, tailored training outperforms the traditional approach.
What shocked us was the control group. Whereas the experiment group showed dramatic improvements in our measurements, the control group showed a dramatic reduction in their security behaviors, attitudes and communication.
We did not expect that result.
The most important lesson from this research project is that the one-size-fits-all approach to security awareness training is, at best, not working. At worst, it actively reduces employees' ability to protect themselves and their employer. So if that is your current approach, you really need to change. And fast.
The better approach is to understand the actual needs of your colleagues around the organization. Understand their work environment, their tasks and responsibilities, the tools they use (including those that may be prohibited), and how they feel about using those tools. Every role is different, with its own unique requirements, tools and practices. Every human is different, with their own frames of reference, skills and world views. Your security program should cater to them, not the other way around. Because they are not there to protect you - you are there to protect them.
I believe that in order to show progress, you need to be able to measure it in a meaningful way. Create a baseline: you can leverage a tool like Praxis Navigator, use one of the existing SAT vendor assessments, create your own assessments, or collect and analyze the data yourself. The most important thing is to measure the results with the same metric you used when creating the baseline, as in the sketch below.
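As a minimal, hypothetical sketch of that comparison (the metric names and scores below are invented for illustration, not taken from our study or from any particular tool), this is roughly what "same metric against the baseline" looks like in practice:

```python
# Minimal sketch: compare follow-up measurements against a baseline using the same metrics.
# The metric names and numbers are invented examples, not real assessment data.

baseline = {"phishing_report_rate": 0.12, "password_hygiene": 0.55, "policy_awareness": 0.40}
followup = {"phishing_report_rate": 0.31, "password_hygiene": 0.58, "policy_awareness": 0.36}

def change(before: dict[str, float], after: dict[str, float]) -> dict[str, float]:
    """Return the per-metric change, only for metrics present in both measurements."""
    shared = before.keys() & after.keys()
    return {metric: after[metric] - before[metric] for metric in shared}

for metric, delta in change(baseline, followup).items():
    direction = "improved" if delta > 0 else "declined" if delta < 0 else "unchanged"
    print(f"{metric}: {delta:+.2f} ({direction})")
```

The point of the sketch is simply that the baseline and the follow-up use identical metrics; comparing a baseline built on one assessment against results measured with another tells you nothing about progress.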
You can find the published research paper here if that is your thing:
You can check out Praxis Navigator here:
And you can reach out to us if you have any questions!