December learning groups: Confirmation Bias

Every month we also have a study/discussion group (now called learning groups) about an aspect of critical thinking, logic or argumentation. Our first one in October was about informal logical fallacies, and the previous one was on evidence. Initially this one was going to be on bias in general, but as usual, I discovered the scope was much too wide to fit into just one meet up. In fact, covering it all may take about a year. So I decided to narrow it down to confirmation bias.

But first, let’s look at what bias is, exactly.

Bias

We all work within a subjective social reality, a way of looking at the social world from our personal perspective.  But it is potentially rife with distortion, as well as inaccurate judgments and interpretations; in other words, irrationality.

Bias is a subjective and often flawed desire or tendency to hold a particular perspective or outlook, often by dismissing or denying other points of view. It is generally held as a stance towards a particular object: an individual person (a boss, say), an individual non-human object (black cats, for example) or even a group of people (a nationality or even a gender). There are, as I discovered, several types of biases and numerous examples of each type. Some affect judgment or decision-making (one of which we will be covering today); these are called cognitive biases. Others affect our social behavior, and others our memory. Some biases exist externally as well, such as statistical bias and media bias.

Biases are not always detrimental. They lead to more efficient – if sometimes less effective – decisions. They also enable us to be more proactive, rather than wallowing in perpetual over-analysis.

So why do people have bias?

Quite frankly, it would seem that we don’t have much choice:

  1. Our rationality is bounded, meaning we have cognitive limitations: a finite amount of time within which to acquire a finite amount of information and make decisions. Often our decisions need to be quick, and a deliberate, analytical process is unproductive or even unfeasible. We often seek satisficing (good-enough) decisions rather than optimal ones.
  2. Most of the time evidence is neither simple nor clear-cut; it is often complex, ambiguous, confusing or even contradictory. Again, because of bounded rationality, we can't possibly weigh all the evidence efficiently, much less objectively.
  3. We are biased for our own sanity: we are bombarded with stimuli every day, particularly in modern society, much too much to take in completely, at least at a conscious level. To compensate, we have developed mental shortcuts called heuristics, which we will talk about in a future meet up.
  4. It pays to be selectively perceptive. There are intrinsic social costs to being wrong about your beliefs, as opposed to holding objectively correct ones. Being wrong makes us look, validly or not, foolish, unintelligent, gullible or dishonest. It discredits us as potential leaders, lowering our status, and can get us ostracized from groups we identify with. For this reason, we benefit from forgetting or ignoring stimuli that are emotionally discomforting or contradict our paradigms, and we focus on the information that confirms our current paradigm.
  5. Unfortunately, it is more or less reflexive: confirmation bias is often the result of the Semmelweis reflex, a tendency to instinctively reject any evidence or knowledge that contradicts our own norms, beliefs or paradigms. There is reason to think that an admission of error is socially costly; you lose face, reputation, trustworthiness, and so on.
  6. On the other hand, self-verification and self-enhancement are to our advantage. For this reason we tend to be one-sided, looking for evidence or arguments that bolster our ideas rather than ones that may contradict them, even though the latter may lead us to a more likely conclusion. This leads to presumptuous and loaded questions and inquiry, as well as fallacious appeals and other flawed thought processes.
  7. Wishful thinking is a concept based on the “Pollyanna Principle,” which dictates that if a conclusion is pleasant it is preferred over an unpleasant one, no matter how cogent or sound the unpleasant one may be. Basically, it's the idea that if we simply wish something were true, that should be sufficient to make it true, and anything that contradicts it is to be dismissed, ignored or ridiculed.

Another related bias which helps explain confirmation bias is subjective validation: people consider a piece of information more valid if it has personal significance for them. So, for instance, a superstitious person feels validated on seeing a correlation between getting fired and the date being the 13th (and consequently relieves themselves of personal responsibility), whereas another person would see no such connection. Related is the Forer effect, our tendency to accept vague, flattering descriptions as uniquely true of ourselves, which psychics, mentalists, cold readers and other scammers exploit.

  • Example: This was shown using a fictional child custody case. Subjects read that Parent A was moderately suitable to be the guardian in multiple ways. Parent B had a mix of salient positive and negative qualities: a close relationship with the child, but a job that would take him or her away for long periods. When asked, “Which parent should have custody of the child?” the subjects looked for positive attributes and a majority chose Parent B. However, when the question was, “Which parent should be denied custody of the child?” they looked for negative attributes, and again a majority answered Parent B, implying that Parent A should have custody. (This could be a good reason why it's beneficial to ask positive questions rather than negative ones, as opposed to the act of asking positive questions being “idealistic” or “naive.”)

So what is confirmation bias?

We are going to focus on cognitive biases, specifically judgment and decision-making biases: starting with confirmation bias, then moving on to illusory biases, which make us look at the world in an overly positive or negative way, as well as attentional biases, probability biases and what I call “comfort” biases, which motivate us to preserve the status quo.

Confirmation bias is the tendency to favor information that confirms our beliefs or hypotheses. It may be a bias in collecting or searching for information (we tend to read books or magazines or visit websites that go along with our own views), in remembering information (we remember information that supports our views better than information that contradicts them), or in paying attention to or accepting information (when people talk to us, we tend to pick out the bits that confirm positions we already hold). It can lead a person to interpret ambiguous information as favorable to their position. For instance, when disaster strikes, an atheist might say, “See, a loving God wouldn't do that,” whereas a theist might say, “See, that's God showing his disapproval.” We tend to interpret information in a biased way when we have a strong opinion one way or the other: our standards are more lenient for evidence that supports our beliefs, and stricter for opposing evidence. With regard to opposing evidence, this is also called “disconfirmation bias.”

  • Example: A team at Stanford University ran an experiment with subjects who felt strongly about capital punishment, half in favor and half against. Each subject read descriptions of two (in fact fictional) studies: a comparison of U.S. states with and without the death penalty, and a comparison of murder rates in a state before and after the introduction of the death penalty. Half the subjects were told that one kind of study supported the deterrent effect and the other undermined it, while for the other subjects the conclusions were swapped. After reading a quick description of each study, the subjects were asked whether their opinions had changed; whether proponents or opponents, they reported shifting their attitudes slightly in the direction of the first study they read. They then read a much more detailed account of each study's procedure and had to rate how well-conducted and convincing the research was. At this point they almost all returned to their original belief regardless of the evidence provided, pointing to details that supported their viewpoint and disregarding anything contrary. Subjects described studies supporting their pre-existing view as superior to those that contradicted it, in detailed and specific ways. Writing about a study that seemed to undermine the deterrence effect, a death penalty proponent wrote, “The research didn't cover a long enough period of time,” while an opponent's comment on the same study said, “No strong evidence to contradict the researchers has been presented.” In a separate brain-imaging study, subjects evaluated statements while in a magnetic resonance imaging (MRI) scanner that monitored their brain activity; as they evaluated contradictory statements by their favored candidate, emotional centers of their brains were aroused, which did not happen with statements by other figures.
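To make the asymmetric-standards mechanism concrete, here is a minimal simulation sketch in Python. It is my own illustration, not part of the Stanford study: two hypothetical readers see the same mixed body of evidence, but each one discounts studies that cut against their current leaning. The starting beliefs, step size and discount factor are all made-up values, chosen only to show the shape of the effect.

```python
# A toy illustration (not the Stanford study itself) of "biased assimilation":
# two hypothetical readers see the same mixed evidence, but each one discounts
# studies that cut against their current leaning. All numbers are made up.

def update_belief(belief, evidence_direction, discount=0.3, step=0.1):
    """Nudge a belief (0..1, where 1 = 'the deterrent effect is real').

    Evidence that agrees with the current leaning is taken at full weight;
    evidence that disagrees is discounted (the stricter standard for
    opposing evidence described above).
    """
    leaning = 1 if belief >= 0.5 else -1
    weight = 1.0 if evidence_direction == leaning else discount
    return min(max(belief + step * weight * evidence_direction, 0.0), 1.0)

# The same mixed body of evidence: half supports deterrence (+1), half undermines it (-1).
mixed_evidence = [+1, -1] * 10

proponent, opponent = 0.7, 0.3   # hypothetical starting beliefs on opposite sides
for e in mixed_evidence:
    proponent = update_belief(proponent, e)
    opponent = update_belief(opponent, e)

print(proponent, opponent)       # the proponent drifts up, the opponent drifts down
```

Run as written, the proponent drifts toward certainty that deterrence works while the opponent drifts the other way, even though both processed exactly the same list of studies: a toy version of the polarization discussed in the next section.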

Yeah… so what?

Well, for one thing, confirmation bias can lead to attitude polarization, the phenomenon by which people's attitudes drift further and further apart from those who disagree with them; to belief perseverance, the persistence of a belief even after it has been demonstrated to be false; and to illusory correlation, seeing correlations where there actually are none. Conspiracy theorists are a good example of attitude polarization and belief perseverance at work: even when they find contradictory evidence, they chalk it up as part of the conspiracy and feel that much more assured that their beliefs are true. They are also a good example of illusory correlation, seeing signs that the Illuminati are at work in the supposed “symbols” and gestures that celebrities exhibit.

  • Example of attitude polarization: A study was done at Stanford in which subjects with strong opinions about the death penalty read about mixed experimental evidence. Twenty-three percent of the subjects reported that their views had become more extreme, and this self-reported shift correlated strongly with their initial attitudes. In another study, researchers measured subjects' attitudes towards controversial issues such as gun control before and after they read arguments on each side of the debate. Two groups of subjects showed attitude polarization: those with strong prior opinions and those who were politically knowledgeable. In part of this study, subjects chose which information sources to read from a list prepared by the experimenters; for example, they could read the National Rifle Association's and the Brady Anti-Handgun Coalition's arguments on gun control. Subjects were more likely to read arguments that supported their existing attitudes.
  • The belief perseverance effect has been shown by a series of experiments using what is called the “debriefing paradigm”: subjects read fake evidence for a hypothesis, their attitude change is measured, and then the fakery is exposed in detail. Their attitudes are measured once more to see if their belief returns to its previous level. A typical finding is that at least some of the initial belief remains even after a full debriefing. In one experiment, subjects had to distinguish between real and fake suicide notes. The feedback was random: some were told they had done well, while others were told they had performed badly. Even after being fully debriefed, subjects were still influenced by the feedback; they still thought they were better or worse than average at that kind of task, depending on what they had initially been told.

Confirmation bias can also lead us to see relationships where none exist. This is called illusory correlation or illusory association.

  • Example: A study recorded the symptoms experienced by arthritic patients, along with weather conditions over a 15-month period. Nearly all the patients reported that their pains were correlated with weather conditions, although the real correlation was zero. People rely heavily on the number of positive-positive cases when judging correlation: in this example, instances of both pain and bad weather. They pay relatively little attention to the other kinds of observation (of no pain and/or good weather).
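To see why counting only the coincidences is misleading, here is a small sketch using hypothetical day counts (not the study's actual data). It computes the phi coefficient, a standard correlation measure for two yes/no variables, which comes out to zero once all four cells of the table are considered, even though the memorable “pain plus bad weather” days are plentiful.

```python
# Hypothetical day counts (not the study's data) showing why correlation needs
# all four cells of the 2x2 table, not just the memorable "pain + bad weather" days.
from math import sqrt

#                  bad weather  good weather
pain    = dict(bad=60,          good=60)
no_pain = dict(bad=30,          good=30)

a, b = pain["bad"], pain["good"]
c, d = no_pain["bad"], no_pain["good"]

# Phi coefficient: the correlation between two binary (yes/no) variables.
phi = (a * d - b * c) / sqrt((a + b) * (c + d) * (a + c) * (b + d))
print(phi)  # 0.0: no association, despite 60 days of pain during bad weather
```

The 60 pain-and-bad-weather days feel like a pattern, but in this made-up table pain is just as common on good-weather days, so there is no association at all.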

Confirmation bias can also lead to the “backfire effect”: when people's confirmation bias is quite strong, they tend to be more repelled by contradictory evidence than someone more objective would be. In other words, contradictory evidence makes them more adamant about their current beliefs! This effect explains the behavior of the conspiracy theorist as well.

In more practical terms, it can lead to bad financial decisions: it can produce overconfidence in one's decisions, which leads one to ignore evidence that those decisions may be bad ones. It can inhibit medical advances when professionals are certain that their current knowledge is optimal. It is a factor in depression, leading depressed individuals to seek out and confirm evidence that fits and reinforces their negative paradigms. It can lead to belief in the paranormal, such as psychic readings: if someone wants to believe in such phenomena, he or she will make a cognitive effort to find connections between themselves and the reading. And it can exacerbate or prolong conflict, whether between individuals or even nations, when both sides are certain of the veracity of their side of the argument.

The underlying idea is that, optimally, we want to make rational rather than irrational decisions, or at least inject as much rationality as we can into our judgments and decisions. While it may be impossible to eliminate emotion from decisions, especially those which affect us personally, ideally we want our decisions to be based on an optimal, formal process, particularly decisions that will impact us and those around us most strongly and for the longest term.

Emotions tend to lead toward detrimental decisions when considered in the long run. The more intense and immediate the anticipated results and the emotions themselves are, the more impact we feel they have. We tend to focus on short-term effects and anticipated negative emotions in order to alleviate our present emotional state or currently perceived negative impact, rather than look at the objective long-term results of our decisions. In these cases, our desires or fears override our reasoning, and our beliefs or actions suffer. Fear and sadness tend to lead to irrational pessimism; anger, to overly quick and unanalytical decisions. Stress generated from emotional upset can add to cognitive “load,” which makes it difficult to remain rigorous when making decisions. Also, fear of potential regret or disappointment in the future can negatively affect a decision in the present. These effects touch not only trivial decisions, but major financial or even medical ones. Neuroscience experiments have shown how emotion and cognition, which engage different areas of the brain, interfere with each other in the decision-making process, often resulting in a primacy of emotion over reasoning.

Biases are not only detrimental on such a small scale. Consider that many social institutions, like courts, rely on individuals to make rational judgments, and that people in leadership positions are susceptible to bias as well.

Examples:

1. An investor who imagines losing a small amount of money even after a big gain will generally focus with disappointment on the lost investment, rather than with pleasure on the overall amount still owned.

2. A dieter who anticipates losing two pounds may imagine feeling pleasure even though those two pounds are a very small percentage of what needs to be lost overall.

3. Game participants who could win $1000 and end up with nothing base their disappointment on the loss of the hoped-for prize, rather than on the fact that they have no less money than they had when they began the game.

4. A fear of flying experienced while deciding how to travel may lead a person to choose driving, even though air safety statistics would show air travel to be far less likely to present a danger (a rough per-mile comparison is sketched below). The fear may be enhanced by the vividness of the mental image of a plane crash in the mind of the decision-maker.
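Here is the rough back-of-the-envelope comparison the flying example alludes to. The fatality rates in it are placeholder assumptions for illustration only, not official statistics; substitute current figures from a source such as the NTSB or NHTSA before relying on the result.

```python
# Back-of-the-envelope per-mile risk comparison. The rates below are placeholder
# assumptions for illustration only; substitute current figures from an official
# source (e.g. NTSB/NHTSA) before relying on the result.

TRIP_MILES = 1_000

# Assumed fatality rates per 100 million passenger-miles (illustrative, not official).
DRIVING_RATE = 1.2
FLYING_RATE = 0.01

def expected_fatalities(miles, rate_per_100m_miles):
    return miles * rate_per_100m_miles / 100_000_000

drive_risk = expected_fatalities(TRIP_MILES, DRIVING_RATE)
fly_risk = expected_fatalities(TRIP_MILES, FLYING_RATE)

print(f"driving: {drive_risk:.2e}  flying: {fly_risk:.2e}  ratio: {drive_risk / fly_risk:.0f}x")
```

Under these assumed rates the driving risk comes out orders of magnitude higher for the same trip, yet the vivid image of a crash can still dominate the decision.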

Confirmation bias, among other biases which we will discuss in the future, can affect our memory. This is called “selective recall,” “confirmatory memory” or “access-biased memory.” Information that matches expectations or beliefs is more easily stored and recalled.

  • Example: In one study, a group of subjects were shown evidence that extroverted people are more successful than introverts, while another group was told the opposite. In a subsequent, apparently unrelated study, they were asked to recall events from their lives in which they had been either introverted or extroverted. Each group provided more memories connecting themselves with the more desirable personality type, and recalled those memories more quickly.

So, as we can see, bias can affect us pretty severely.

Ok, so what can we do about it?

Some would say very little to nothing. However, some studies have shown that awareness of a bias tends to decrease the likelihood of that bias. Simply put, the more conscious we become of our unconscious, the less influence it has. By refocusing our attention on our behaviors, rather than trying to decipher their inner workings, we can become more objective, and thus more accurate, about our own biases.

In the future we will discuss cognitive bias mitigation, or ways to handle bias when it arises, and cognitive bias modification, ways that we can change our own biases to prevent them from interfering when we need to think clearly and rationally.

 Recommended Books

A Mind of Its Own: How Your Brain Distorts and Deceives – Cordelia Fine

Cognitive Illusions: A Handbook on Fallacies and Biases in Thinking, Judgment and Memory – Rüdiger Pohl

Don’t Believe Everything You Think – Thomas E. Kida

Mistakes Were Made (But Not by Me) – Carol Tavris and Elliot Aronson

Social Cognition: Making Sense of People – Ziva Kunda

Strangers to Ourselves – Timothy Wilson

Thinking, Fast and Slow – Daniel Kahneman

Thinking and Deciding – Jonathan Baron

When Prophecy Fails – Leon Festinger

 Recommended Videos and Websites

Selective perception videos:

http://www.youtube.com/watch?v=vJG698U2Mvo

http://www.youtube.com/watch?v=RzwZ0kwhYyo

Illusory Correlation:

http://www.youtube.com/watch?v=2I1n8-zpvMI

Amusing video on confirmation bias:

http://www.youtube.com/watch?v=hcucGn_X8AA

Website on confirmation bias (as well as other biases):

http://www.sciencedaily.com/articles/c/confirmation_bias.htm
