Scott Sommers
In the past few years, there's been a growing interest in the cognitive mechanisms behind the 9/11 conspiracy beliefs we see both here on JREF and elsewhere.
There has been much speculation that conspiracy beliefs, like 9/11 Truth, have their origins in systematic cognitive mechanisms, such as cognitive dissonance and a thing that gets called the Dunning-Kruger effect. I don't believe this. Conspiracy beliefs are pathological. They are the result of cognition gone wrong. At least the kind of beliefs you see associated with 9/11 Truth have their origins in disordered thinking that, for lack of a better word, I will call confusion.
Cognitive Bias
Cognitive dissonance is one form of a kind of mental behavior that goes by a number of different names, including cognitive bias, cognitive heuristics and even social cognition. I'm going to use the term cognitive bias in this post, just because it's handy. By cognitive bias, I mean an explanation for why information processing in people does not follow a strictly logical fashion. People reach conclusions that are not derived strictly from the facts of the situation alone. Humans make decisions and reach conclusions using shortcuts, and they even make errors in processing. When this happens in a systematic fashion, it can be thought of as a cognitive bias.
Why would there be an evolutionary need for a bias in cognition? Why would humans evolve patterns of thought that are, strictly speaking, not logical and that result in conclusions not based in the facts of the matter? The social world is extremely complex, and there’s just too much information to process at the speed needed for social life. If people are to respond at the pace that’s required, some sort of shortcut is needed to sort through all the information, identify the key aspects and prepare a response. As a result of such pressures, people sometimes act outside of logic.
Still, the result of these actions seems to achieve an almost unbelievable social order. It is as if people were following another type of logic, one not based in mathematics and predicate logic, or at least not based in these alone. It’s as if something is going on that tries to infer meaning from the limited information available at the time and leap ahead to an understanding that’s just probably true.
So while on the level of an individual social act, you can point out what’s illogical about someone’s behavior, these cognitive biases are the foundation of our social lives. They occur all the time, and work most of the time to bring about socially positive results.
One of the beefs I have with social psychology as an academic discipline is the emphasis it places on cognitive bias as error. Almost all the experimental results that you’ll read about in textbooks come from designs that detect bias through error. By this I mean that the experimental effect of the bias is detected through deviation from a decision-making process in which a logical decision would have resulted in a correct or polite or appropriate result. Think of all those experiments you read about in social psych. Milgram’s obedience studies led ordinary people to be violent. The Bystander Effect is measured through people’s indifference to the suffering of others. Zimbardo’s Stanford Prison Experiment led to obsessive control. Asch’s conformity effect was observed through ridiculously incorrect conclusions. But these are in fact typical of how post-WWII social psychology evolved – highly controlled, ecologically questionable laboratory experiments that result in people doing stupid or even bad things.
But the reality of the matter is that cognitive biases exist largely because they have been selected for by natural selection. They create the enormous order that we see around us. They are generally successful in leading us to the correct and safest way to interact with others. The mistakes observed in the social psychology lab are the result of contrived situations, designed by specialists to trick these mechanisms into leading us into the kind of bad situations we would normally want to avoid.
Cognitive Dissonance
So let’s move on to the particular issue that has drawn me to think about this – cognitive dissonance. The term was first used by Leon Festinger in his famous piece of research When Prophecy Fails. It’s important to point out that this was a field study quite different from the kind of experimental social psychology that almost everyone does today. Festinger joined a Christ-based doomsday cult to observe their reaction to the failure of an end-of-the-world prediction. But the point of this study had little to do with irrational beliefs.
The idea of cognitive dissonance was originally a response to Behaviorism. A key tenet of Behaviorism is that only observable behaviors can be studied, not the actions of the mind. This led to the well-known principles of reward and punishment, whereby animals will repeat behaviors that are rewarded and avoid behaviors that are punished. Festinger and his colleagues were trying to demonstrate that this could not account for some of the complex social behaviors we all know about, and that cognition – thinking, learning, remembering, etc. – must be taken into account to understand the results. In the romantic language of a slightly later period, you would hear about the idea of getting inside the black box, meaning that psychologists would be investigating the hidden processes of the mind.
So what is cognitive dissonance? Well, one thing about it is that it happens all the time as a way of reconciling behavior and attitudes. From Social Psychology 8th Edition by Taylor, Peplau, and Sears (p. 173):

“Dissonance is defined as an aversive motivational state that results when some behavior we engage in is inconsistent with our attitudes.” Their discussion continues, “One situation that almost always arouses dissonance is making a decision… After we make the decision, all the good aspects of the unchosen alternative and all the bad aspects of the chosen alternative are inconsistent with the decision. Dissonance can be reduced by improving our evaluation of the chosen alternative or by lowering our evaluation of the unchosen alternative. After making decisions, there is a tendency for us to increase our liking for what we chose and to decrease our liking for what we did not choose.”
Another key aspect of cognitive dissonance is that it is a positive force in the shaping of knowledgeable beliefs about the world. From Group Process in the Classroom 7th Edition by Schmuck & Schmuck (p. 182):

“New information is brought out, and the student is pushed to self-analysis and to a consideration of alternative conceptualizations. Some of the student’s previously held conclusions do not hold up. A personal sense of conflict or disequilibrium arises. Some psychologists call this a state of cognitive dissonance. Cognitive dissonance arouses an active inner search for new understanding and new conclusions. The student learns to consider very different conceptual schemes simultaneously and is energized to develop his or her own unique version of the issue.”
So let’s talk about what cognitive dissonance is not.
It is not unusual.
It’s not a bad thing.
It is not a pathology.
It is not disordered thought.
It is not a response to something you disagree with.
It does not happen from being exposed to counterfactual information.
Once again, from Social Psychology 8th Edition by Taylor, Peplau, and Sears (p. 173):

“Dissonance is defined as an aversive motivational state that results when some behavior we engage in is inconsistent with our attitudes.”

Cognitive dissonance occurs when you come to understand that your behaviors and attitudes are not consistent. The result is a reevaluation of your attitude.
Cognitive Dissonance and 9/11 Truth
I don’t see much of what I described above happening among 9/11 Truthers. I have never seen a Truther reevaluate their opinion. Truthers frequently tell one of two stories about their attitudes toward 9/11. One is that they always knew what was going on. The second is that they used to believe this thing they call ‘the official story’ but then after reevaluation of the evidence, they realized it couldn’t be true, blah, blah, blah, blah, blah. Neither of these seems to be what Festinger or more contemporary colleagues are talking about when they discuss cognitive dissonance.
But in fact, I doubt cognitive bias has much to do with the opinions of Truthers or almost any conspiracy theorists. For one thing, cognitive bias is generally an adaptive process in the minds of people. It evolved to construct a social world in which people could expect things to happen in a way that makes sense. Truthers and other conspiracy theorists are wildly confused people. These are people for whom obvious facts are not having the influence that nature expects them to have. I don’t see how this could generally be the result of natural and adaptive psychological mechanisms. I don’t see much of anything among Truthers and their conspiracy friends that resembles the cognitive dissonance I described above.
Rather, I see conspiracy beliefs as pathologies. They are the result of cognitive mechanisms that have broken down. Conspiracy theorists are not thinking properly. There are reasons for this, but they do not lie in the cognitive mechanisms that nature has provided us with to handle social life. And this is why conspiracy beliefs are not widely held, are transitory, and are immediately identifiable as the messed up thinking that they are.
Why do people believe in conspiracy theories?
I call conspiracy beliefs pathologies for a reason. Mental illness is clearly the reason for some of this belief. I am astounded at the number of Truthers on the JREF and elsewhere who talk about their drinking and drug problems, about having been involuntarily committed, or about hallucinations treated by physicians with powerful drugs. I don’t know if this is a higher rate than you’d find in the general population, but my sense is that it is much higher. Certainly these discussions are the only place where I encounter such people.
My point is not that Truthers are crazy, but rather that mental illness is not cognitive bias. It is mental illness.
Another likely source of conspiracy beliefs is deception. There are situations we have all seen on the JREF where Truthers just lie about things. In fact, this is so common that it is the norm here. A Truther says something, someone shows this is not the case, and the Truther disappears only to come back a few months later to make the same claim, perhaps even under a changed username.
But lying isn’t a cognitive bias. People who lie don’t even believe what they’re saying. That’s why it’s a lie. People have reasons for this deception, but it is clearly not that they are trying to get at the facts of the matter.
What I think is the most common source of conspiracy beliefs is confusion. I'm not really sure how to define this term. What I mean is something antithetical to cognition. 9/11 conspiracy beliefs are caused by a breakdown in the functioning of cognitive mechanisms. The mental illness I described above would probably be one example of this, but here I mean something more general. Something akin to a lack of the skills and abilities that make understanding possible.
Can you blame people for being confused? The claims and counterclaims being made about this 9/11 conspiracy are so complex I can hardly keep up with them. This stuff about thermite and paint is really beyond my understanding, and I have two years of university chemistry and math. Many of these beliefs appear to be based on faith in the credentials of people like Steven Jones and Judy Wood. Regardless of what you think about them as scientists and people, they are highly credentialed. Dick Gage is an architect and, if you don’t know a lot about building construction or have experts to talk to, he could easily be convincing for that reason. Truthers are generally not very well educated, and the ones who are don’t seem to have a technical background of any sort.
But confusion is a complicated idea. I know of no comprehensive definition. You might be confused because you’re drunk. You might be confused because you’ve been spinning around and around for a long time. You might be confused because you have a low IQ or brain damage or only one eye. And none of these would be a cognitive bias.
You might also be confused because you can’t read very well. To the Truthers out there, this may seem very strange. There are Truther organizations that claim to be full of university graduates and they can’t have reading problems, right? I’m less sure of this. Marlene Scardamalia and Carl Bereiter have speculated that reading at an expert level might not be as widespread among the educated as we would all like to think. If you read their chapter in Toward a General Theory of Expertise, they produce evidence of a general lack of expert reading ability among journal editors and leading scientists.
In fact, I have often wondered about this with some of the Truthers I have encountered. Some are literate and able to compose long, precise and complex pieces of writing, but at the same time they seem to have some very subtle misunderstandings of news and technical writing. It’s as if they have missed some key aspect of meaning in writing that is just a little too hard for them. On the JREF, we make fun of this, calling such people stupid or reminding them that we have already answered this or that point. Or we make the more polite assumption that our Truther friends are fully capable of comprehending what we wrote. I am less sure we can count on this. And right or wrong, a reading deficit would not be a cognitive bias.
The End
I’m quite interested in this idea, that verbal issues are part of what causes conspiracy beliefs. I’d like to do more work on this in the future. But for now, my point is that confusion, with its many origins, rather than the systematic cognitive biases described with terms like cognitive dissonance, is at the root of much of the conspiracy belief we see on the JREF and elsewhere.