Psych in Sum: Confirmation Bias in Politics


Image from Clay Bennett

We saw the first debate between the two major-party nominees for president this Monday, and critiques of the candidates’ performances flooded the internet. As with every debate, people on both sides declared victory: not only did their candidate win, he or she mopped the floor with the other one! But political experts largely agree that Hillary Clinton won the debate, and polls of likely voters conducted offline show that the majority of the electorate thinks Clinton won as well. So how do a bunch of people take some unscientific online polls and hold them up as real evidence that Trump won? Confirmation bias.

Confirmation bias is a tendency to seek out information that supports your position (hello .net web address!) and to interpret information in such a way that it confirms what you already think.1 Everyone is vulnerable to this bias. In fact, scientists regularly try to avoid having their a priori beliefs affect the later interpretation of their data through various techniques, like deciding sample size before looking at the results, or using a blind experimental design. They go through this trouble because we know that people tend to selectively expose themselves to information they agree with, and tend to ignore information they don’t agree with.1 That isn’t good science, and it isn’t the road to good decision-making either.

Thinking your biases are founded in fact can be extremely problematic and can lead to overconfidence in judgments, such that you are more certain your judgment is correct than your evidence warrants.2 And if there’s something worse than someone who is wrong, it’s someone who is desperately trying to convince you they aren’t. Generally, if you look hard enough, you’re going to find something that assures you that you are right. Whether that something has any real merit is often another story.

 

  1. Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2(2), 175.
  2. Kahneman, D., & Tversky, A. (1977). Intuitive prediction: Biases and corrective procedures. Decisions and Designs Inc., McLean, VA.

Papa Don’t Preach: Josh Duggar and the Psychological Consequences of Christian Patriarchy (Part 2)


Image from Twitter

Over a week ago, allegations were confirmed that Josh Duggar sexually assaulted five underage girls while still a minor himself. While his personal actions are deplorable and inexcusable, his upbringing in the Quiverfull and Christian Patriarchy movements likely had an impact on his behavior. At SocialPsyQ we’re exploring the psychological forces that may have helped to motivate these crimes and others like them. If you missed part 1, you can read it here.

In part 2 of our analysis of the Duggar scandal, we’ll discuss Groupthink and Confirmation Bias.

3) Groupthink: Groupthink can help to explain many a disaster, from the decision to launch the Challenger to the ill-fated Bay of Pigs invasion. Groupthink is a phenomenon whereby people in a group are motivated to maintain group harmony by isolating themselves from outside influences and ignoring unpopular opinions.1 The problem is that this often leads to bad decisions, since the decision-makers don’t have all of the information and haven’t considered all of the alternatives. Several types of situations can make groups vulnerable to groupthink. For instance, there’s collective avoidance, in which all members of a group act defensively to prevent failure, and collective overoptimism, which is marked by overconfidence in achieving success.2 Either or both could be at play in Christian Patriarchy: the group acts to avoid failing to instill an appreciation of its religious beliefs in its children, and it is convinced that its brand of Christianity will successfully accomplish this.

Groupthink is definitely at work in extreme religious movements or cults where a group of people comes to advocate certain practices or beliefs, regardless of their actual merit or social acceptability. For instance, beliefs in the Christian Patriarchy movement that women belong in the home and need a male authority to remain pure are fairly outmoded, but being surrounded by a large group of people with these same beliefs normalizes them, and discourages dissenters from speaking up. Researchers have found that contrary to popular opinion, we aren’t most strongly influenced by close others like friends, but by people who we identify with as part of the same social group.3 So, for the Duggars, these influencers are likely fellow members of their church, and the leaders of the Quiverfull and Christian Patriarchy movements.

The major antecedents of groupthink are: 1) the existence of a cohesive group, 2) the expression of a preference by a respected leader, and 3) insulation from useful outside opinions that should be considered.4 There are also several symptoms of groupthink. Some important ones that may have played a role here:

  - Morality: the group comes to believe that its opinion is the morally superior one.
  - Stereotyped views of others: group members hold simplified views of people with opposing positions.
  - Pressure on dissent: group members pressure people who question the majority opinion.
  - Self-censorship: people who disagree with the majority feel they cannot speak up.
  - Illusion of unanimity: members assume that those who don’t speak up agree with the majority position.
  - Mindguarding: specific group members act to shield the group from outside information that contradicts group beliefs.4

It is easy to see that almost all of these symptoms are at play in extreme religious movements. They may help to explain why women also adhere to Christian Patriarchy, even though it seems obvious that it is not in their best interests. Sadly, groupthink discourages people from speaking up, and it may have played a role in the cover-up of Duggar’s abuse as well.

4) Confirmation bias: We’ve discussed this on SocialPsyQ before, but confirmation bias seems to underlie many of the actions of these extremely religious groups. Confirmation bias occurs when someone specifically seeks evidence, or interprets information, in such a way that it confirms their already existing beliefs.5 Confirmation bias can even operate in science, when researchers design their studies or interpret their data in such a way that it confirms their hypothesis.6 But, of course, laypeople do this as well with their own ideas about the world.

For instance, Fox News often presents news differently than other outlets. If Fox News presents information that is more in line with your existing beliefs, you are more likely to watch that station than one that calls your beliefs into question. People who are more committed to their ideas, or who have just been reminded of their strong beliefs, are more likely to exhibit confirmation bias than individuals who are not.7 That’s not to say that only some people show confirmation bias; rather, people show the bias in different areas, specifically those in which their beliefs are strongest. It’s no coincidence that the Duggars sought out extremist religious homeschooling materials in line with their own extremist beliefs, or that they associate only with families that share those beliefs. It is simply an instance of confirmation bias.

It is clear that a religious group headed by influential male authorities could produce the various symptoms of groupthink, encouraging the Duggars to handle their son’s crimes in-house and putting Josh Duggar in a position where he felt he could not communicate his deviant thoughts and feelings to his religious parents. In addition, confirmation bias helped to keep the Duggars in the dark. As they acted to keep their kids safe from the “evils” of the world with their brand of authoritarian parenting, they invited the devil right in the front door.

Come back on Thursday for the conclusion of the Duggar scandal series!

NOTE: I would be remiss not to mention that some research has called groupthink theory into question, and that results from laboratory studies do not always replicate all of the antecedents and outcomes proposed by Janis and colleagues. While the laboratory evidence does not always support all of the tenets of the theory, the process clearly plays out in some form, based on the many real-life examples scientists invoke when discussing it, such as launching the Challenger in less-than-ideal conditions despite the engineers’ warnings about the temperature.8

  1. Janis, I. L. (1972). Victims of groupthink: A psychological study of foreign-policy decisions and fiascoes. Boston: Houghton Mifflin.
  2. Esser, J. K. (1998). Alive and well after 25 years: A review of groupthink research. Organizational Behavior and Human Decision Processes, 73(2), 116-141.
  3. Hogg, M. A., & Hains, S. C. (1998). Friendship and group identification: A new look at the role of cohesiveness in groupthink. European Journal of Social Psychology, 28(3), 323-341.
  4. Moorhead, G., Ference, R., & Neck, C. P. (1991). Group decision fiascoes continue: Space shuttle Challenger and a revised groupthink framework. Human Relations, 44(6), 539-550.
  5. Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2(2), 175.
  6. Oswald, M. E., & Grosjean, S. (2004). Confirmation bias. Cognitive illusions: A handbook on fallacies and biases in thinking, judgement and memory, 79.
  7. Munro, G. D., & Stansbury, J. A. (2009). The dark side of self-affirmation: Confirmation bias and illusory correlation in response to threatening information. Personality and Social Psychology Bulletin.
  8. Turner, M. E., & Pratkanis, A. R. (1998). Twenty-five years of groupthink theory and research: Lessons from the evaluation of a theory. Organizational Behavior and Human Decision Processes, 73(2), 105-115.