LAS VEGAS — When it comes to security, much of what matters may be based not on technology but on what happens in people's heads.
In a keynote presentation here at Black Hat, noted security expert Bruce Schneier explored the psychology that clouds most security-related judgments and actions. In Schneier's opinion, security is fundamental to being alive.
“Security is both a feeling and a reality,” Schneier told
the overflow crowd. “You can feel secure even if you’re
not, and you can be secure even if you don’t feel it. In
some sense there are two different meanings to the same
word, and that makes it hard to talk about.”
Schneier sees security as a tradeoff for consumers: they spend some money or time and get something in return. The question to ask is not whether it is worth it, but whether it does any good.
“People have a natural intuition about security
tradeoffs,” Schneier said. “Whether you walk down one
street or another, lock your door or wear a bulletproof
vest.”
According to Schneier, a security tradeoff involves five factors: the severity of the risk, the probability of the risk, the magnitude of the costs, how effective the countermeasure is at mitigating the risk, and the tradeoff itself.
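Those five factors can be sketched as a simple comparison of expected losses. The class name, the dollar figures, and the expected-loss arithmetic below are illustrative assumptions, not anything from the talk; the final comparison stands in for the tradeoff itself.

```python
from dataclasses import dataclass

# Toy model of the five tradeoff factors Schneier listed. All names,
# scales, and numbers here are hypothetical illustrations.
@dataclass
class Tradeoff:
    risk_severity: float       # cost of the bad event if it happens
    risk_probability: float    # chance of the bad event (0..1)
    countermeasure_cost: float # what the countermeasure costs you
    effectiveness: float       # fraction of the risk it mitigates (0..1)

    def worth_it(self) -> bool:
        # The fifth factor: weighing the tradeoff itself.
        expected_loss = self.risk_severity * self.risk_probability
        avoided_loss = expected_loss * self.effectiveness
        return avoided_loss > self.countermeasure_cost

# A door lock: cheap, fairly effective against a plausible burglary risk.
lock = Tradeoff(risk_severity=5000, risk_probability=0.05,
                countermeasure_cost=50, effectiveness=0.8)
print(lock.worth_it())  # True: 5000 * 0.05 * 0.8 = 200 > 50

# A bulletproof vest for daily wear: costly protection against a
# vanishingly unlikely risk.
vest = Tradeoff(risk_severity=1_000_000, risk_probability=0.000001,
                countermeasure_cost=500, effectiveness=0.9)
print(vest.worth_it())  # False: expected avoided loss is under a dollar
```

The point of the sketch is only that the same five inputs can yield opposite answers, which is why intuition about which street to walk down usually works while intuition about rare, severe risks often fails.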
The amygdala, Schneier explained, is the oldest part of the brain, and it is where we first think about security. It is the part of the brain that controls the fight-or-flight mechanism.
“Amygdala is a very fast part of your brain — faster than
consciousness.”
The neocortex, by contrast, is the part of the brain associated with consciousness, thinking and reasoning. Schneier called the neocortex the newest part of the brain and said that, as such, it is still in beta testing. It is also the slowest part and therefore reacts last.
Heuristics, or cognitive biases, are mental shortcuts that all humans share and that help shape perception. Schneier argued that many security problems occur when these biases fail.
“We exaggerate some risks and downplay others.”
According to studies he cited, humans are risk-averse when it comes to gains and risk-seeking when it comes to loss.
“A sure gain means you live for another day; on the other hand, the sure loss means you lose,” Schneier said. “The risky loss means you might not.”
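The classic gain/loss framings behind that finding pair a sure outcome with a gamble of identical expected value, which is what makes the asymmetry in people's choices striking. The dollar amounts below are illustrative assumptions, not figures Schneier cited.

```python
# Illustrative prospect-theory framing; the amounts are hypothetical.

# Gain frame: a sure $500 vs. a 50% chance at $1,000 (else nothing).
sure_gain = 500
risky_gain = 0.5 * 1000 + 0.5 * 0

# Loss frame: a sure $500 loss vs. a 50% chance of losing $1,000.
sure_loss = -500
risky_loss = 0.5 * -1000 + 0.5 * 0

# Both pairs have identical expected values...
print(sure_gain == risky_gain)  # True
print(sure_loss == risky_loss)  # True
# ...yet most subjects take the sure gain (risk-averse for gains)
# and gamble on the loss (risk-seeking for losses).
```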
The control bias leads us to believe that things under our control are less likely to be bad for us.
The availability heuristic, he said, means that humans aren't as good with big numbers as with small ones.
The hindsight bias, otherwise known as Monday-morning-quarterback syndrome, says that once something has happened, you misremember how you thought it would happen.
Then there is “representativeness.” Schneier argued that we
tend to think of something as more probable based on how
well it fits the stereotype.
One study Schneier cited demonstrated what he referred
to as the anchoring effect.
In the study, participants were put in front of a roulette wheel that spun to a random number. They were then asked whether the number of African nations is more or less than the number that came up on the wheel.
“Turns out the higher number you see the higher number you
guess,” Schneier said. “Our brain anchors on the higher
number. This has bizarre implications; just hand people
random data and they start to fixate on it. It means that
if we’re looking for computer-like calculations in people,
we’re wasting our time.”
Schneier concluded that people have finely tuned perceptions of risk and cost, and of the way they deal with them. As good security people, you can try to understand the biases and overcome them.
“The evil people will try and understand the biases so
that they can exploit them,” Schneier said. “We do see the
evil more and more in fields of persuasion.”
The real problem for security professionals is when
security perception and reality are out of whack.
“I think we as a community need to spend more time on how
people perceive security, especially when designing
products.”
This article was first published on InternetNews.com.