The Doc
Curing Stupidity
- Joined
- Nov 9, 2006
- Messages
- 2,158
Cognitive bias lies at the root of most logical fallacies, and the conspiracists use it like there is no tomorrow. This thread will delve deep into the realm of bias, and how the "truth" movement runs on it.
I was browsing through the various types of cognitive bias today, and a thought struck me: so many of them can be attributed to the 9/11 "truth" movement. So here goes:
Bandwagon effect — the tendency to do (or believe) things because many other people do (or believe) the same. Related to groupthink, herd behaviour, and manias.
Closely related to the argumentum ad populum (appeal to popularity), this is often seen in the phrase "84% of the US believes 9/11 was an inside job!". Not true, but even if it were, it would still be a fallacy.
Bias blind spot — the tendency not to compensate for one's own cognitive biases.
A lot of conspiracists fail to realize why they hold their beliefs. For example, some fail to realize that a pre-existing hatred of the US, Bush or the US government is what fuels their beliefs. Note: I am not saying that all conspiracists hate the US, the USG or GWB. It's just an example.
Confirmation bias — the tendency to search for or interpret information in a way that confirms one's preconceptions.
Getting all your information from prisonplanet or 911blogger isn't a good idea, yet a large portion of conspiracists do exactly that. Failure to research all sides of an argument is common amongst the truth movement.
Contrast effect — the enhancement or diminishment of a weight or other measurement when compared with recently observed contrasting object.
For example, "my rabbit cage didn't collapse! Therefore WTC shouldn't have collapsed!". Enough said.
Focusing effect — prediction bias occurring when people place too much importance on one aspect of an event; causes error in accurately predicting the utility of a future outcome.
Self explanatory, really. Nitpicking is common amongst "truth" movement "researchers".
Illusion of control — the tendency for human beings to believe they can control or at least influence outcomes that they clearly cannot.
Alex Jones is a prime example of this.
Impact bias — the tendency for people to overestimate the length or the intensity of the impact of future feeling states.
Conspiracists think that one day they will liberate the world and become mankind's heroes. How wrong they are.
Information bias — the tendency to seek information even when it cannot affect action.
Take, for example, Dylan's recent "soot" fiasco.
Neglect of probability — the tendency to completely disregard probability when making a decision under uncertainty.
An example of this would be completely ignoring the jet liner impacts when assessing the likelihood of the WTC collapse.
Omission bias — The tendency to judge harmful actions as worse, or less moral, than equally harmful omissions (inactions).
A good example of this is the LC forum condoning the murder of Gravy, and then getting extremely upset when Troy threatened William Rodriguez. I condone neither myself.
Planning fallacy — the tendency to underestimate task-completion times.
May 2006, Loose Change Final Cut to be released. Oh wait...
Reactance — the urge to do the opposite of what someone wants you to do, out of a need to resist a perceived attempt to constrain your freedom of choice.
The classic rebellion attitude. Simply because people tell them they are wrong, they think they are right. For example, George Bush saying that we shouldn't listen to these wild conspiracies was fuel for the conspiracist fire.
Selective perception — the tendency for expectations to affect perception.
See Alex Jones followers. "The revolution is coming!".
Anchoring — the tendency to rely too heavily, or "anchor," on one trait or piece of information when making decisions.
Again, nitpicking is common amongst the movement.
Anthropic bias — the tendency for one's evidence to be biased by observation selection effects.
Self explanatory.
Attentional bias — neglect of relevant data when making judgments of a correlation or association.
Another example of nitpicking, this time in the form of omission.
Availability heuristic — a biased prediction, due to the tendency to focus on the most salient and emotionally-charged outcome.
We saw this in a recent post on the LC forum. A lot of conspiracists are expecting some kind of revolution, because it seems cool to a lot of them, I guess.
Clustering illusion — the tendency to see patterns where actually none exist.
A BIG one amongst the movement. Making patterns out of nothing is a strong habit of the conspiracists.
Hindsight bias — sometimes called the "I-knew-it-all-along" effect, the inclination to see past events as being predictable.
LIHOP'ers PAY ATTENTION.
Illusory correlation — beliefs that inaccurately suppose a relationship between a certain type of action and an effect.
Self explanatory.
Ludic fallacy — the analysis of chance-related problems within the narrow frame of games, ignoring the complexity of reality and the non-Gaussian distribution of many things.
Most conspiracists follow Hollywood as their guide to reality. Sadly, this is true. Things that are not possible in reality (such as space beams) fit right in on truther websites.
Observer-expectancy effect — when a researcher expects a given result and therefore unconsciously manipulates an experiment or misinterprets data in order to find it (see also subject-expectancy effect).
See every experiment by Steven Jones.
Optimism bias — the systematic tendency to be over-optimistic about the outcome of planned actions.
Another example of overconfident truthers.
Overconfidence effect — the tendency to overestimate one's own abilities.
The odd belief some conspiracists hold that they could effectively spark a revolution and take on the US Armed Forces, or even the police for that matter.
Positive outcome bias — a tendency in prediction to overestimate the probability of good things happening to them (see also wishful thinking, optimism bias and valence effect).
See the two above.
Texas sharpshooter fallacy — the fallacy of selecting or adjusting a hypothesis after the data are collected, making it impossible to test the hypothesis fairly.
Self explanatory.
I was thinking of also adding in logical fallacies, but there was no room. They do manage to use just about every one of them, after all.