This is Kellyanne Conway trying to explain why Trump’s Press Secretary Sean Spicer would come out and claim verifiable falsehoods as the truth. You see, to them, these aren’t lies. They are “alternative facts” – as in, there is an original fact, and then there are alternative facts. If someone consistently lies, what happens when that person is presented with verifiable evidence of the lie?
Here are your choices:
- Admit it was a lie
- Attack the evidence
- Attack the person giving the evidence
- Claim it wasn’t a lie while not refuting the evidence
Kellyanne Conway went with a combination of #3 and #4 here. First, she attacked the press over something completely different. Then she claimed Sean Spicer was giving “alternative facts,” not lies. This is exactly what happens when we get stuck in cognitive dissonance. Cognitive dissonance occurs when we hold conflicting ideas, beliefs, or attitudes. It creates a cognitive strain that we attempt to relieve. On the one hand, we are truth-tellers, not liars. On the other hand, what was said is a proven lie. How do we accept both statements as true? When we come up against this sort of cognitive dissonance, humans are incredibly adept at creating stories that allow two seemingly opposing views to be correct at the same time. The creation of “alternative facts” is just that sort of story – one we tell to hold on to our previously held views when shown contradictory evidence.
Cognitive Dissonance Theory
Cognitive dissonance theory was first investigated sixty years ago by the psychologist Leon Festinger. He became curious about how cults worked. He wondered how cults that claimed the world was about to end dealt with the verifiable truth that the world did not end when they predicted. He conducted interviews with members of a cult who claimed the earth was about to be drowned in an epic biblical flood – after the predicted day came and went without so much as a heavy rain. He noticed that some members on the fringe, who hadn’t placed their whole identity in the cult, were willing to accept that they were wrong and move on with their lives. However, those at the center of the cult had to come up with a way to ease the mental strain of believing the earth would flood while plainly seeing that it did not. The stories Festinger heard from these members were not ones of contrition or of accepting that their beliefs were inaccurate. Instead, they doubled down on their belief system. Cult members told him that, due to the group’s prayers and faithfulness, God had spared the world from the imminent flood. They interpreted evidence that contradicted their beliefs as evidence that their beliefs had been right all along. Several studies in the intervening years have found similar results. Festinger called this mental strain and its subsequent effects cognitive dissonance.
Festinger suggested that when presented with cognitive dissonance, we have three options:
- Change one of your beliefs (often requiring admitting your belief was wrong)
- Find new information that outweighs one of the dissonant beliefs (even if this information is itself wrong)
- Reduce the importance of one of the beliefs (often convincing yourself that the new, dissonant information doesn’t really matter)
According to research on the backfire effect and confirmation bias, we are extremely unlikely to go with option #1, especially when our beliefs are deeply held and political in nature. For instance, it is my personal belief that a hot dog is not, in fact, a sandwich. It is also my belief that corporations are not people and thus should not be treated as such. If presented with enough contrary evidence, I might be willing to change my mind about whether a hot dog is a sandwich (I care about it, but none of my personal identity or ideology is wrapped up in this age-old debate). However, if presented with contradictory evidence that corporations are people, I would be much less likely to change my deeply held political belief.
Trump is popular, right? So his crowd was huge, right?
Trump and his team faced a cognitive dissonance of their own. On one side were their beliefs: Trump won the election “bigly,” Trump is popular, everyone loves him, the crowds and the people are behind him, so his inauguration crowd must have been the largest in history. On the other side was the actual size of the inauguration crowd, shown in side-by-side pictures against Obama’s 2009 inauguration. Rather than change the belief, Trump and his team doubled down and found new information – “alternative facts” – that outweighed any objective reality the media could show.
When we simply claim Trump and company are liars, it is easy to dismiss them and their faults. If, however, we see something in their lies that we ourselves do as well, that creates cognitive dissonance of our own. We assume we are good people, not liars. We would then have to see Trump (the liar) as going through the same cognitive strain we do when he comes up with his “alternative facts.” That is much more difficult. It is much easier psychologically to put our political opponents in boxes (liar, cheater, egotistical maniac, 1%-er, racist, xenophobe) and label those boxes “other.” We belong to none of those boxes – so we experience no cognitive dissonance when labeling others that way. This tendency, however, creates division, which makes it that much more difficult to build consensus. At this point in our political culture, we would all benefit from reaching out to those who disagree with us.
This is cognitive dissonance. Yes, these are lies. Yes, we should call them out as lies. Yes, the media should be an arbiter between facts and “alternative facts.” However, just as calling someone a racist doesn’t automatically fix racism, calling someone a liar doesn’t fix this problem either. The only way to fix it is to understand the problem, understand ourselves, and perhaps create a political culture where it’s perfectly acceptable to say “I was wrong.” That won’t happen overnight, and it won’t happen with Trump and company. But in the world of “alternative facts” and obvious cognitive dissonance, we must find a way to stop these lies and myths from decaying the public discourse.