What makes you click? What prompts you to pass along, or share, information that you've seen online? Whether it's a personality quiz, a startling announcement, a provocative political statement, the promise of a Top X list of just about anything, or any number of other forms of "clickbait," we are all wired with pre-programmed, automatic responses to certain prompts.
And threat actors know it. In fact, they don't just want to prompt people to click; they want to compel them to share potentially dangerous emails and posts with others. And they succeed on an increasingly massive scale.
While much time, effort and expense goes into technological solutions for protecting data security and integrity, the truth is that the biggest risks to data and systems aren't technological at all; they are human.
Knowing this, a wide array of hackers and cybercriminals target people, rather than machines, to wreak havoc. These efforts are known as social engineering, and criminals use them in a variety of ways. Entire populations are being manipulated through increasingly prevalent and hyper-compelling information designed to evoke emotion and exploit known biases.
Our biases raise cyber risks
We're fertile ground for this type of manipulation because we all carry cognitive biases, shaped by our life experiences and interactions, that predispose us to believe information confirming what we already think, whether we recognize that this is happening or not.
Whenever we see something on Facebook that triggers one of those biases (and, by extension, our interest in not only reading but sharing), we become part of the problem, however unwittingly.
We are constantly filtering what we call reality through everything that has happened to us, and each of those biases influences the way we process information.
Cognitive biases
The more you read about cognitive biases, the more you will begin to see how you are being targeted through them. We all live within our own social and ideological frames. In fact, as Susan Bales, president of the FrameWorks Institute, has pointed out: "If the facts don't fit the frame, it's the facts people reject, not the frame."
If you've ever been drawn into a heated Facebook thread or an argument around the holiday dining room table, chances are you've heard comments along the lines of: "That sounds like it makes sense, but I just don't agree."
Even in the face of facts that challenge our assumptions, we often fail to change our views.
Therein lies the challenge for all of us. We have an innate tendency to value, and even seek out, information that supports what we already believe. That's called "confirmation bias," and it's one form of cognitive bias. According to Christopher Dwyer, PhD, writing for Psychology Today, there are 12 common biases that shape how we make decisions, but that list only hints at how far-reaching the cognitive effects can be.
How can you combat these tendencies? First, by taking steps to identify your own cognitive biases.
Pressing pause on your cognitive biases
A firm grasp of the OODA Loop (which stands for Observe, Orient, Decide and Act), created by the military strategist John Boyd, can help us minimize the potential to fall prey to socially engineered misinformation. Understanding this natural cycle of decision making lets us build checks and balances in between observing and acting: we can learn to pause long enough to check our biases, orient ourselves, and decide whether the information we are seeing, reading or observing is, in fact, credible.
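To make the cycle concrete, here is a minimal sketch in Python. All of the names and the simple credibility heuristic are illustrative assumptions, not part of Boyd's formulation; the point is only to show how the orient step can force a deliberate pause between seeing a post and acting on it.

from dataclasses import dataclass

@dataclass
class Observation:
    claim: str              # the headline or post we encountered
    triggers_emotion: bool  # did it provoke a strong reaction?
    source_verified: bool   # have we checked the source's credibility?

def orient(obs: Observation) -> bool:
    """Pause and check biases: treat strong emotion as a flag to verify first."""
    if obs.triggers_emotion and not obs.source_verified:
        # Heightened emotion plus an unverified source: do not trust yet.
        return False
    return obs.source_verified

def decide(credible: bool) -> str:
    """Decide whether to share only after orienting."""
    return "share" if credible else "hold"

def act(decision: str, obs: Observation) -> None:
    if decision == "share":
        print(f"Sharing: {obs.claim}")
    else:
        print(f"Holding back pending fact-check: {obs.claim}")

# Example run: an emotionally charged, unverified claim gets held back.
post = Observation(claim="Shocking: Top 10 secrets THEY don't want you to know",
                   triggers_emotion=True, source_verified=False)
act(decide(orient(post)), post)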
The orientation piece involves analysis and synthesis and, in some cases, a search for new information. Information has been weaponized and is being used by players with an agenda; when their agenda meshes with your cognitive biases, the spread of misinformation can be, and increasingly is, amplified.
As Thomas L. Friedman said in a recent opinion piece for The New York Times: “I worry because Facebook and Twitter have become giant engines for destroying the two pillars of our democracy — truth and trust.”
Friedman goes on to add that these social media channels have become “huge, unedited cesspools of conspiracy theories that are circulated and believed by a shocking—and growing—number of people.”
To combat this, we need to monitor our own behaviors and intentionally slow down when we encounter information that grabs us, for whatever reason. In those moments, we should treat the heightened interest and emotion as a flag to fact-check and to consider what the information is trying to evoke in us. Then, and only then, should we make a conscious, informed decision about whether to share that information with others.
Ongoing education and communication are a must
To keep your staff members, colleagues, partners, family and friends from forwarding one meme after another and participating, however unknowingly, in the spread of misinformation, ongoing education and communication are required. Education and preparation are the only defense.