
The Internet is a dark room. Your brain thinks the lights are on.

How can you tell when someone is lying to you? Perhaps the most obvious method is to analyze the content of what is being said. People who are lying may contradict themselves, or give you information that you already know to be false from another source. However, the vast majority of our biological hardware for lie detection relies on other, less conscious methods.

These methods include indicators such as the intonation of somebody's voice, their emotional state, whether they hesitate as they speak, whether their visual presentation matches how they are attempting to portray themselves, and whether they are making too much or too little eye contact. Any one of these indicators, from this non-exhaustive list, can signal that something about what you are being told is not accurate, and that you should be cautious with both the information and the people offering it.

A significant number of our defenses against social threats such as lying and manipulation, including those listed above, are disabled by the digital environment because the physical sensory cues they depend on are absent. This in itself is not the biggest issue facing cybersecurity. The biggest issue is that we do not appear to understand that we have lost the vast majority of our social threat defenses. That is to say, despite its pivotal role in the invention of the internet, the human brain has little idea how to safely traverse digital environments.

Let's take a look at an example. Assume that you and I know one another. A person you believe to be me approaches you in a dark room. I hand you an object. I tell you to eat it. You have no idea what this object is. Your response, one hopes, is to question what I have given you: this is a dangerous situation. Merely being deprived of your sight is enough to increase the level of evidence you require before accepting information as true.

Now let us take another scenario. You are in your email inbox. A person you believe to be me, a person you know and trust, sends you an email. The email includes an attachment which you are told to download, but you have no idea what this attachment is. Even if your suspicions are raised, it's unlikely that your response will match the heightened skepticism you feel in the first scenario.

The first scenario is some hypothetical nightmare realm, whereas the second is a standard day at the office. People are used to traversing the unknown in digital spaces, and the level of uncertainty we accept before we act is not only significant, it is significantly more than it should be. Yet these are, for all intents and purposes, the same scenario.

Research has shown that when it comes to cybersecurity scenarios, we are chronically and unjustifiably optimistic. One of the leading causes of this over-optimism can be remedied fairly easily: we just need to start treating the digital world with the same caution and skepticism we naturally apply in physically uncertain environments. By understanding that the principles of trust, caution, and verification are just as vital online as they are offline, we can significantly bolster our defenses against digital deception and social engineering.

How do we help employees change their mindset? 

To facilitate a shift in employees' mindset towards cybersecurity, Praxis Security Labs suggests an innovative approach that goes beyond traditional training methods. Like phishing simulations, real-life simulations are a powerful tool for helping people understand complicated or abstract concepts. We recommend engaging employees in tabletop exercises focused on social interactions, simulating the loss of different senses.

Specifically, we recommend using your cyber lunches or other engagement activities to run sensory simulations as tabletop exercises. These scenarios should challenge employees to navigate various situations without relying on certain senses, mirroring the limitations we face in digital communication. We also suggest running such scenarios with the following in mind:

  • Small team discussions -- Organize your employees into small teams. This encourages more active participation and discussion and will positively influence your security culture.
  • Facilitated debriefings -- After each exercise, conduct a debriefing session where teams can discuss their strategies and decisions. Consider nominating a facilitator to guide and encourage the discussion.
  • Feedback -- Soliciting feedback after the exercises helps you gauge their success and how your employees feel about security and their role in such interventions.

_____

An accomplished researcher and neuroscientist, Thea Mannix, PhD, is the Director of Research at Praxis Security Labs, where she uses her knowledge of neurobiology and social science to further our understanding of human factors in cybersecurity. Given the invaluable human-centered perspective she brings to the industry, Thea is a popular speaker and is often requested to give presentations on cyberpsychology.

You can request Thea Mannix (or any of the other Praxis Security Labs subject matter experts) to give a presentation to your organization or to speak at your next event by reaching out via our contact request page.

Request a speaker

Praxis Security Labs experts are available to present on a number of topics, from cyberpsychology and human factors to data visualization and statistical modeling.