September 4, 2013

Flawed Reasoning and Failures in Cognition, continued

 - Part 2 of 3

Yesterday we established confirmation bias as the root of our failure to reason logically when debating political and religious divisions. Today we will discuss textbook definitions for some of the subdivisions within the umbrella we call confirmation bias.

In-group Bias is a somewhat nonspecific bias of genetic origin. It is rooted deep in our animalistic or tribalistic tendencies, manifesting as the fear and hatred of people “not like us.” Research has shown this bias may be affected by the neurotransmitter oxytocin. University of Amsterdam psychologist Carsten De Dreu describes oxytocin as helping us forge bonds with the people of our in-group while having the opposite function for those on the outside. It promotes suspicion, fear and even hatred of the out-group.

This particular bias is evident in both political subdivisions and religious sects, causing individuals to dismiss those outside the group while elevating insiders of possibly lesser ability to positions of leadership. Where it hurts us is that it leads to an overestimation of the value of our fellow tribesmen while diminishing that of people we don't really know, often resulting in a terrible waste of talent.

Observational Selection Bias is when we suddenly start noticing things we didn't notice much before, and then incorrectly assume their frequency has increased. An example might be pregnant women suddenly noticing a lot of other pregnant women, or new car buyers suddenly noticing the same model everywhere they look. The likelihood is that there really isn't any increase in frequency; instead, the thing has become elevated in our minds, and in turn we notice it more often. The trouble is that most people don't recognize this as a selection bias. Most actually believe these items or events are happening with increased frequency, which can be distinctly disconcerting. This bias also contributes to the feeling that such occurrences couldn't be coincidence.

Status-Quo Bias promotes the human tendency to be apprehensive of change and often leads to choices that guarantee things will remain the same or change as little as possible. This has obvious ramifications in everything from politics to economics. Take for instance the difficulties of the 1960s experienced by those pushing for racial equality, and the resistance still evident 50 years later. More recent examples are LGBT rights and the continued support for marijuana prohibition.

We like to stick to our routines, our political parties, and even our favorite restaurants. When given the choice between the unknown Bob’s Diner and the familiar Burger King, status quo bias provokes a fearful resistance to the unknown and often results in the choice of the latter.

The perniciousness of this bias is the unwarranted assumption that another choice will be inferior or make things worse. We know the Burger King will serve something familiar, even if perhaps not of the highest quality or the best flavor. Although Bob’s Diner might have far better food, the risk is more than many will take. The status-quo bias can be summed up in the saying, "If it ain't broke, don't fix it"… an adage that fuels conservative tendencies. In fact, some commentators say this is why the U.S. hasn't been able to enact universal health care, despite the fact that so many support the idea of reform.

Negativity Bias is the tendency to treat all news as bad news. People pay more attention to bad news… and it's not just because we are morbid. Steven Pinker, in his book The Better Angels of Our Nature: Why Violence Has Declined, argues that crime, violence, war, and other injustices are steadily declining. Recent national crime statistics tend to verify this, yet most people would argue that things are getting worse. An example is the constant drumbeat that the U.S. economy has steadily gotten worse under the current administration, when all reliable data show this to be untrue.

Social scientists tell us that we perceive negative news as being more important or profound. We also tend to give more credibility to bad news, perhaps because we are suspicious of proclamations to the contrary. In our prehistoric past, heeding bad news may have been more adaptive, but today we run the risk of allowing this bias to inhibit growth. Dwelling on negativity at the expense of genuinely good news tends to cause people to believe that the world is a worse place than it actually is.

Some who voted for the current President are experiencing Post-Purchase Rationalization bias. This occurs when what starts out looking like a good deal later seems a bad bargain. The same occurs when we see something in a store and just can’t live without it. We take it home and later find the gewgaw not as valuable as we first thought, and we start doubting our decision. We might regret the purchase because of the expense or because it did not perform as expected… but then the bias kicks in and we convince ourselves that it was a smart move regardless of the deficiencies.

This is the mental mechanism that causes us to feel better after we make poor decisions. It provides us with a way to subconsciously justify our decisions. Psychologists call this the Dissonance Model of Post-Decision Product Evaluation, and describe it as stemming from the commitment principle and the need to avoid the state of cognitive dissonance.

Neglecting Probability bias stems from an irrational fear of low-probability threats. An example would be the fear of flying. Almost nobody is afraid of riding in a car, but a measurable portion of the population refuses to fly out of fear of crashing. Many others suffer elevated stress levels while flying. This is in spite of the fact that automobile accidents account for at least 67 times more deaths than air crashes; some estimates put the odds considerably higher still.

Now compare this with the current, rampant fear of terrorist incidents. In the U.S. you are far more likely to die of cancer than in a terrorist attack, yet the anti-terrorism budget spends on average half a million dollars annually per documented victim of terrorism, while the budget for cancer prevention lays out only about $10,000 per victim.

This phenomenon represents the human brain’s failure to maintain a proper sense of peril and risk. It leads to the overestimation of risk for rare events while underestimating the risks of the more familiar yet far more dangerous. This country is currently suffering from an almost hysterical fear of terrorism, even though the odds of choking on your food or being accidentally poisoned are far greater. If society wishes to effectively counter the actual dangers we face, we must first put them in perspective.

Gambler's Fallacy, or Positive Expectation Bias, is perhaps more like a bug in our software than a bias. We inexplicably put tremendous weight on previous experience and let it influence our expectations. Think about flipping a quarter. If it comes up heads four or five times in a row, we are inclined to believe (and to bet on) the likelihood that the next flip will be tails. As Spock might say, this is illogical. Each coin flip is statistically independent of previous results, so the probability remains 50-50 regardless of earlier outcomes.
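The independence claim is easy to check for yourself with a quick simulation. Here is a minimal sketch (the function name and parameters are illustrative, not from any particular library): it flips a simulated coin many times, finds every run of four heads, and measures how often tails follows. If the gambler's fallacy were true, tails would follow a streak more than half the time; it doesn't.

```python
# A minimal sketch of the gambler's fallacy check: after a run of four
# heads, the next flip is still 50-50. We simulate many flips and look
# at the outcome immediately following every four-heads streak.
import random

def tails_rate_after_streak(n_flips=1_000_000, streak=4, seed=42):
    rng = random.Random(seed)  # fixed seed for a repeatable run
    flips = [rng.choice("HT") for _ in range(n_flips)]
    # Collect the flip that comes right after each streak of heads.
    followers = [
        flips[i + streak]
        for i in range(n_flips - streak)
        if flips[i:i + streak] == ["H"] * streak
    ]
    # Fraction of those followers that came up tails.
    return sum(f == "T" for f in followers) / len(followers)

print(round(tails_rate_after_streak(), 3))  # hovers near 0.5, not above it
```

Running this with different seeds or streak lengths gives the same result: the coin has no memory, so conditioning on a previous streak does not shift the odds.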

The positive expectation is that luck must eventually change, and that all the previous bad luck means it is our turn to win. Successful gamblers know this is not a valid assumption and do not rely on luck. These people have the ability to tabulate previous events, maintain the current odds in their heads, and bet only when those odds are favorable. They also have an ability to “read” people, and can be fairly accurate in judging a bluff. This is not luck… it is science.

Further discussion and the conclusion of this thesis will continue tomorrow.