Author: Gershon Ben Keren

In 1995, a small-time crook named McArthur Wheeler, along with an accomplice, robbed two banks having sprayed his face with lemon juice, believing it would make him invisible to security cameras; just as lemon juice can be used as “invisible ink” on paper, only becoming visible when exposed to heat, he assumed the same principle would apply to a CCTV tape. He was so confident in this belief that he deliberately looked up at a security camera and smiled. When he was arrested just a few hours after the robberies, Wheeler was astounded that he’d been identified, as he believed he’d been effectively invisible – due to the lemon juice – when he committed his offenses. The case caught the attention of two psychologists, David Dunning and Justin Kruger. They were intrigued that Wheeler had complete and absolute confidence in his flawed logic, and it was this strange “contradiction” that became the foundation for their research into cognitive biases. In 1999, they published their study on what is now called the Dunning-Kruger effect: a psychological phenomenon whereby people with limited knowledge or skill greatly overestimate their understanding and competence in a subject. The Dunning-Kruger effect demonstrates that ignorance doesn’t simply manifest as the absence of understanding concerning a subject; it can also create a misplaced understanding of it i.e., the less someone knows about something, the more certain of it they tend to be.
In my thirty-plus years of teaching personal safety and self-defense there is one constant I have experienced with members of a certain demographic (one that isn’t easily identifiable until you start talking about violence and personal safety): they believe they already understand violence. I understand 100% why people want to believe that they understand threats, violence and personal safety; to admit that you don’t is scary on both a personal and a societal level. It is far simpler to believe that your greatest danger comes from someone who is mentally ill (in the US, 60% of people believe those with schizophrenia are likely to be violent) – a person you are unlikely to interact with – than from a family member or friend who you deal with on a daily or weekly basis. If all mental illnesses – not personality disorders – could be cured/eliminated, it is estimated that serious violent crime, including active killer events/incidents, would only drop by about 4% i.e., roughly 96% of all violent crimes are committed by offenders who are not judged mentally ill (Swanson et al., 2015). I have lost count of the number of conversations in which people have insisted that I don’t understand the extent and seriousness of mental illness and its relationship to violence; no information or data is provided to back up such arguments, just an insistence that those who are mentally ill are a threat to people’s personal safety.
Most people don’t want to educate themselves concerning the realities of violence, which is understandable. It is rarely rewarding to look at the worst side of our species – which includes ourselves – and consider the things we are capable of. However, our denial, discounting, and possible re-working of reality should not be mistaken for education i.e., substituting an “opinion” for knowledge is rarely beneficial. We should also understand that we are a species that values “stories” over “facts”. This is why we are so susceptible to infomercials; we are more likely to believe a personal account/story than actual facts and data e.g., if we hear an account of “someone” who lost 35 pounds taking a diet pill supplement, that will resonate more with us than a statistic saying that 13% of people who took the pill lost weight. This isn’t because we are inherently “stupid” but because stories – whether true or not, factually reliable or unreliable – are how we pass down and communicate information to others. This means that one person’s story/account can inform us to a degree that it shouldn’t i.e., we can become utterly convinced that something is “true” based on very limited knowledge. We may think we are cleverer than the guy who believed he could make himself invisible by spraying lemon juice on his face, but we suffer from the same biases – that is who we are as a species, and we should have the intelligence to accept this, especially when the stakes are potentially high.
One of our default responses to a threat or danger is to discount or deny it. This phenomenon has been seen time and time again during natural and human-made disasters e.g., people not leaving a burning building, not exiting a crashed plane, refusing to leave a house that is about to be flooded etc. These are incidents in which we naturally have limited information, but which can quickly see us become “experts”, believing that we are “right” whilst basing our opinion(s) on restricted/limited information. Whilst we may laugh at McArthur Wheeler being so convinced of his ability to be “invisible” when he had ample time to research whether this belief was valid, we should recognize how quickly we can convince ourselves of something in the moment e.g., that the plane which has crashed into our building doesn’t mean we should evacuate it. On 9/11, when the second plane hit the second tower, the average delay before people started leaving their desks was 6 minutes and 47 seconds. It may seem inconceivable, in retrospect, that people wouldn’t leave their desks immediately – and some didn’t leave at all – however there were those who became “experts” in the moment (not because of arrogance but because they fell prey to inherent cognitive biases), believing they understood the situation better than those around them.