A Forum for Opinions on News, Politics, and Life
July 18th, 2011
In a recent post, I argued that common sense is vastly overrated as a tool for making sound judgments and that we need to engage in “reasoned sense” that includes both extensive direct experience and critical thinking. Taking steps that include the informal use of the scientific method can help us make better decisions.
However, as recent research has demonstrated, even scientists who strictly adhere to the scientific method can’t guarantee that they will draw the best possible conclusions. When I read this research my first thought was, “How could such highly educated and precisely trained professionals veer off the path of objectivity?” The answer is simple: They, like all of us, possess one quality from which it is impossible to divorce themselves. That quality? Being human.
As the fields of psychology and behavioral economics have demonstrated, homo sapiens is a seemingly irrational species that appears to, more often than not, think and behave in nonsensical rather than commonsensical ways. The reason is that we fall victim to a veritable laundry list of cognitive biases that cause us to engage in distorted, imprecise, and incomplete thinking which, not surprisingly, results in “perceptual distortion, inaccurate judgment, or illogical interpretation” (thanks Wikipedia), and, by extension, poor and sometimes catastrophic decisions.
Well-known examples of the results of cognitive biases include the Internet, housing, and financial crises of the past decade, truly stupid use of social media by politicians, celebrities, and professional athletes, the existence of the $2.5 billion self-help industry, and, well, believing that a change in the controlling party in Washington will somehow change its toxic political culture.
What is interesting is that many of these cognitive biases must have had, at some point in our evolution, adaptive value. These distortions helped us to process information more quickly (e.g., stalking prey in the jungle), meet our most basic needs (e.g., help us find mates), and connect with others (e.g., be a part of a “tribe”).
The biases that helped us survive in primitive times, when life was much simpler (e.g., life goal: live through the day) and the speed of a decision rightfully trumped its absolute accuracy, don’t appear to be quite as adaptive in today’s much more complex world. Given the complicated nature of life these days, correctness of information, thoroughness of processing, precision of interpretation, and soundness of judgment are, in most situations, far more important than the simplest and fastest route to a judgment.
Unfortunately, there is no magic pill that will inoculate us against these cognitive biases. But we can reduce their power over us by understanding these distortions, looking for them in our own thinking, and making an effort to counter their influence as we draw conclusions, make choices, and come to decisions. In other words, just knowing and considering these universal biases (in truth, what most people call common sense is actually common bias) will make us less likely to fall victim to them.
Here are some of the most widespread cognitive biases that contaminate our ability to use common sense:
The Bandwagon effect (aka herd mentality) describes the tendency to think or act in certain ways simply because other people do. Examples include the popularity of Apple products, the use of “in-group” slang and clothing styles, and watching the “Housewives of…” reality-TV franchise.
The Confirmation bias involves the inclination to seek out information that supports our own preconceived notions. The reality is that most people don’t like to be wrong, so they surround themselves with people and information that confirm their beliefs. The most obvious example these days is the tendency to follow news outlets that reinforce our political beliefs.
The Illusion of Control is the propensity to believe that we have more control over a situation than we actually do; when we don’t have control, we fool ourselves into thinking we do. Examples include rally caps in sports and carrying “lucky” items.
The Semmelweis Reflex (just had to include this one because of its name) is the predisposition to deny new information that challenges our established views. Sort of the yang to the yin of the Confirmation bias, it exemplifies the adage “if the facts don’t fit the theory, throw out the facts.” An example is the Seinfeld episode in which George Costanza’s girlfriend simply refuses to allow him to break up with her.
The Causation bias describes the tendency to assume a cause-and-effect relationship in situations in which none exists (or in which there is merely correlation or association). An example is believing someone is angry with you because they haven’t responded to your email when, more likely, they are busy and just haven’t gotten to it yet.
The Overconfidence effect involves unwarranted confidence in one’s own knowledge. Research has shown that people who say they are “99% certain” are wrong about 40% of the time. Examples include political and sports prognosticators.
The False Consensus effect is the penchant to believe that others agree with you more than they actually do. An example is the guy who assumes that all guys like sexist humor.
Finally, there is the granddaddy of all cognitive biases: the Fundamental Attribution Error, the tendency to attribute other people’s behavior to their personalities and our own behavior to the situation. When someone treats you poorly, you probably assume they’re a jerk; but when you’re not nice to someone, it’s because you’re having a bad day.
I could go on and on (for an exhaustive list of cognitive biases, do a search on Wikipedia), but you get the point. If you look at your own thinking, you’ll likely find yourself at the mercy of these distortions (though I may just be suffering from the False Consensus effect). But I really am sure that we fall for cognitive biases all of the time (I may be guilty of the Overconfidence effect). In any event, all the research I read supports this post’s claims (uh-oh, I think I just fell for the Confirmation bias). Note to self: Need to continue to work on resisting cognitive biases.
(This article was also posted at Dr. Jim Taylor’s Blog.)
Copyright 2016 Opinion Forum