
How Did Confirmation Bias Evolve?

Mr. Scott

Confirmation Bias is one of my research projects right now. After our discussion on it in the thread "Why the militant atheism?" I started to wonder how such a seemingly self-destructive mechanism was selected for.

The particular aspect of Confirmation Bias which I think is remarkable is this: It's been found that ignoring evidence against a belief one firmly holds stimulates the same pleasure centers as addictive drugs (explaining why woos take on skeptics here with addictive persistence).

However, isn't there an advantage to being the one with the right belief -- to correcting a belief once you find evidence that it's wrong? I offer this parable:

A cave clan may find that every so often an infant simply disappears from their cave on dark nights. They want to know why so they can cut their losses.

Cave man woo-Me argues that waving a totem each night for two hours while chanting "woo-woo-woo" will scare away the spirit lifting the baby to the cave in the sky.

Cave man ran-De points out that paw prints appear outside the cave on the nights babies disappear, and suggests they try to hunt down that dingo they sometimes see with a blood-soaked snout.

Woo-Me argues forcefully that he is right and that the paw prints are just the sky spirit's trick to confuse us. Woo-Me is so emphatic and persuasive, because of the pleasure he experiences from ignoring ran-De's evidence, that the clan believes him and banishes ran-De. Babies continue to disappear and the clan becomes extinct, along with the confirmation bias gene.

So, why is the confirmation bias gene still around?

(After I started this I thought of an answer, but still want to see the discussion play out here untainted by my own confirmation bias. I've written what I think is the answer into a text file and will paste it here after the discussion approaches maturity.)
 
I always assumed it had something to do with how very little thinking is involved in survival. If you have to stop and think all the time, the lion catches and eats you. If you just react in the same way you always react, you get a head start on the thinkers, and they get eaten instead of you.
 
Confirmation Bias is one of my research projects right now. After our discussion on it in the thread "Why the militant atheism?" I started to wonder how such a seemingly self-destructive mechanism was selected for.

The particular aspect of Confirmation Bias which I think is remarkable is this: It's been found that ignoring evidence against a belief one firmly holds stimulates the same pleasure centers as addictive drugs (explaining why woos take on skeptics here with addictive persistence).

However, isn't there an advantage to being the one with the right belief -- to correcting a belief once you find evidence that it's wrong? I offer this parable:

A cave clan may find that every so often an infant simply disappears from their cave on dark nights. They want to know why so they can cut their losses.

Cave man woo-Me argues that waving a totem each night for two hours while chanting "woo-woo-woo" will scare away the spirit lifting the baby to the cave in the sky.

Cave man ran-De points out that paw prints appear outside the cave on the nights babies disappear, and suggests they try to hunt down that dingo they sometimes see with a blood-soaked snout.

Woo-Me argues forcefully that he is right and that the paw prints are just the sky spirit's trick to confuse us. Woo-Me is so emphatic and persuasive, because of the pleasure he experiences from ignoring ran-De's evidence, that the clan believes him and banishes ran-De. Babies continue to disappear and the clan becomes extinct, along with the confirmation bias gene.

So, why is the Confirmation Bias gene still around?

(After I started this I thought of an answer, but still want to see the discussion play out here untainted by my own confirmation bias. I've written what I think is the answer into a text file and will paste it here after the discussion approaches maturity.)

Who has the higher probability of producing offspring? Ran-de who has been banished to fend for himself or Woo-Me who remains within the protection of the clan?
 
What makes you think there would be a "confirmation bias gene"? Behavior is the emergent property of physiology acting within its environment.

There is no reason to assume nature would be efficient in weeding out every single detrimental aspect of life. In fact, in a natural process, we would expect our definition of "perfection" to go completely ignored.
Evolution != Progress.

Our ancestors could probably get away with Confirmation Bias, because most of the time its consequences were not as dramatic as in your parable.
Perhaps such Bias was a shortcut past full-scale investigation that was merely "good enough" most of the time.
This Bias might have had some survival value anyway, since full-scale investigations would require more resources. And few had such resources to spare, until relatively recently.
 
Who has the higher probability of producing offspring? Ran-de who has been banished to fend for himself or Woo-Me who remains within the protection of the clan?

Also, it's possible that rather than the clan dying out, with fewer mouths to feed those remaining have more food each and remain healthier than they would otherwise.
 
Actually, I'd first argue that your example does not illustrate the case very well. Your example is discussing the formulation of a new idea, which is a matter of critical thinking rather than confirmation bias.

Although take the same example. They accept woo-Me's explanation, and dance all night outside the cave. While they do this, no babies get stolen (the dancing scares the dingo away, it doesn't approach). Later, one of the cavemen suggests that maybe it's not evil spirits...that's where confirmation bias is more likely (at least in this example). Of course, confirmation bias can even come into play when you have the right answer, as well.

In any case, my point is that confirmation bias is more about defending/holding on to an already existing belief, rather than choosing between new alternatives. This would tend to reinforce the behaviors that already exist within a clan/group. As that clan or group has obviously been successful at surviving (it's still around), this means that the successful survival-oriented patterns would be reinforced, as well.

I'd say that confirmation bias was actually less of a problem when we were cavemen than now. In primitive days, confirmation bias defending incorrect and dangerous ideas would tend to promote the removal of those ideas...while keeping around good and useful ideas. Keep in mind that evolution is about the species, not the individual. It would seem that a method to preferentially weed out those with poor thinking skills, something like confirmation bias, could actually be useful to the species as a whole. Some neutral ideas wouldn't be affected much (various rituals and traditions, for example). Today, we protect people from their own stupidity, so confirmation bias has a free hand to promote both useful and useless ideas.

That's my two cents worth, anyway. I'll bill you for it ;)
 
Confirmation Bias is one of my research projects right now. After our discussion on it in the thread "Why the militant atheism?"
However, isn't there an advantage to being the one with the right belief -- to correcting a belief once you find evidence that it's wrong? I offer this parable:

As previously stated, there should be no such thing as a "confirmation bias gene".

As to the question above, when it comes to complex beliefs like religion, no, there would be no advantage to those with the "right belief". The important aspect of that sort of behavior is group unity. Confirmation bias aids group unity by ignoring pesky issues that do not impact survival generally (in other words, your example is far too contrived).

Group unity is so important to our survival that I would expect confirmation bias to be a strong force even if it led to some weird examples of decreased survival in strange situations.
 
Actually, I'd first argue that your example does not illustrate the case very well. Your example is discussing the formulation of a new idea, which is a matter of critical thinking rather than confirmation bias.

Although take the same example. They accept woo-Me's explanation, and dance all night outside the cave. While they do this, no babies get stolen (the dancing scares the dingo away, it doesn't approach). Later, one of the cavemen suggests that maybe it's not evil spirits...that's where confirmation bias is more likely (at least in this example). Of course, confirmation bias can even come into play when you have the right answer, as well.

I'd agree with this example. The confirmation bias here is that the cavemen keep dancing to appease the spirits. They don't have the right reasoning, but in this situation, where's the evolutionary benefit in trying to falsify your belief? If you change what you're doing to test your theory, there's a good chance that the dingo comes back while you're formulating alternative explanations and tests.
 
Confirmation Bias is one of my research projects right now. After our discussion on it in the thread "Why the militant atheism?" I started to wonder how such a seemingly self-destructive mechanism was selected for.

Because in the main it is not destructive? I can consider a whole bunch of ways in which it is helpful.

The particular aspect of Confirmation Bias which I think is remarkable is this: It's been found that ignoring evidence against a belief one firmly holds stimulates the same pleasure centers as addictive drugs (explaining why woos take on skeptics here with addictive persistence).

This?

However, isn't there an advantage to being the one with the right belief -- to correcting a belief once you find evidence that it's wrong? I offer this parable:

A cave clan may find that every so often an infant simply disappears from their cave on dark nights. They want to know why so they can cut their losses.

Cave man woo-Me argues that waving a totem each night for two hours while chanting "woo-woo-woo" will scare away the spirit lifting the baby to the cave in the sky.

Cave man ran-De points out that paw prints appear outside the cave on the nights babies disappear, and suggests they try to hunt down that dingo they sometimes see with a blood-soaked snout.

Woo-Me argues forcefully that he is right and that the paw prints are just the sky spirit's trick to confuse us. Woo-Me is so emphatic and persuasive, because of the pleasure he experiences from ignoring ran-De's evidence, that the clan believes him and banishes ran-De. Babies continue to disappear and the clan becomes extinct, along with the confirmation bias gene.

I suspect a caveperson will form beliefs around that which is familiar. In this case, it seems more likely that they will initially attribute the babies' disappearance to animals or other humans. It is my impression that erroneous beliefs are more likely to be formed around those things where cause and effect is not so obvious - medical outcomes, weather, war, love.

Linda
 
The associative nature of pattern recognition is hard-wired and soft-programmed. Being able to see edible things and dangerous things will lead to reproductive success.

"Superstitious behavior" is an unfortunate behavioral consequence: patterns will be seen that are not there.
 
In any case, my point is that confirmation bias is more about defending/holding on to an already existing belief, rather than choosing between new alternatives.

Absolutely.

In this rather good review, Nickerson comes at the problem from a cognitive perspective, and explores the nature of belief-persistence.

He argues that there are several advantages to having less vulnerable beliefs (along with lots of disadvantages too), and that whilst it leads to dogmatism and intolerance, these are not necessarily disadvantageous to individuals or groups. He also cites evidence that we are more prone to confirmation bias in certain situations (e.g. abstract reasoning tasks, hypothesis testing) than others (e.g. processing and monitoring social rules and the behaviour of others), which may be a clue to the utility or otherwise of this particular bias. The paper also observes that sometimes one particular bias may be protective against more serious errors.

How difficult must it be to write a selective review on confirmation bias!
 
Thanks for the criticisms of my OP, which help me focus on the point that really interests me.

I was concerned that someone would suggest that the woo-woo ceremony was scaring off the dingo, but I never said the dingo didn't eat the babies after the ceremony. A smart dingo might learn to associate the ceremony with the presence of delicious babies. Observing this failure, woo-Me might suggest the ceremony go for three hours, not two, a response consistent with confirmation bias. Also, ran-De could find another clan that's not so woo-woo in its beliefs, while the clan that shunned him goes extinct because they maintained false hypotheses in spite of conflicting evidence.

But such ambiguities in the parable are beside the point.

Whether or not there is a specific confirmation bias gene is also beside the point. Following correct hypotheses about how the world works seems to me to yield a reproductive/survival advantage. There must be genes involved in such an apparently silly wiring plan for the brain that results in confirmation bias.

What possible reproductive advantage could result from being pleasured by ignoring evidence that your hypothesis about how the world works is incorrect?
 
What possible reproductive advantage could result from being pleasured by ignoring evidence that your hypothesis about how the world works is incorrect?
Because your value to others may not depend so much on your being right as on your ability to convince them that you are right.

Woo-Me is so emphatic and persuasive, because of the pleasure he experiences from ignoring ran-De's evidence, that the clan believes him and banishes ran-De.

The struggle Woo-Me won was not for accurate knowledge, but for status -- and what made him so emphatic and (hence) persuasive was that he himself believed what he was saying.
 
JoeEllison said:
I always assumed it had something to do with how very little thinking is involved in survival. If you have to stop and think all the time, the lion catches and eats you. If you just react in the same way you always react, you get a head start on the thinkers, and they get eaten instead of you.
And it follows that feeling pleasure at such automatic reactions would reinforce them.

What does a chipmunk on a road do when you approach it with a car? I believe it dodges randomly, rather than thinking about where to go. Thinking will get it squished every time.

~~ Paul
 
Whether or not there is a specific confirmation bias gene is also beside the point. Following correct hypotheses about how the world works seems to me to yield a reproductive/survival advantage. There must be genes involved in such an apparently silly wiring plan for the brain that results in confirmation bias.

What possible reproductive advantage could result from being pleasured by ignoring evidence that your hypothesis about how the world works is incorrect?

I think the point people are making when they say there is no confirmation bias gene is that it is not an inherited trait (not that there needs to be one single gene for it). If it's hard wired, then the only way to avoid it would be to change our anatomy (lose some connections and grow others). I find that difficult to believe. I would also question whether any fallacious way of thinking is an inherited trait.

I would accept that we're hard-wired to want explanations for things. (Probably because of the selective advantage for a social creature in being able to think about other peoples' motives and desires and so on, and the huge surplus in reasoning power that accompanies that.) Wanting to explain things can lead to genuine curiosity (which can lead eventually to science), or it can lead to clinging to false explanations.

Leaving the inheritance question aside, I can see C.B. having some advantage. The confirmation bias, as mentioned, is sort of a defense of existing ideas, not a way of coming up with new ones. I'd lump it in with ways of protecting the status quo (sort of the thinking that is behind superstitious behaviour). Whether your thinking about it is right or wrong, you're generally safer with the tried and true. In cases where you're wrong about a risk, it's at least innocuous to avoid something novel that isn't really dangerous.
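The "safer with the tried and true" point can be made concrete with a toy expected-cost comparison (a sketch with made-up numbers, not a model of real selection pressures): when missing a real danger costs far more than needlessly avoiding a harmless novelty, the cautious, status-quo-biased rule wins even though it is "wrong" far more often.

```python
# Toy error-management comparison; every number here is an
# illustrative assumption, not an empirical estimate.
p_danger = 0.05            # chance the novel thing is actually dangerous
cost_false_negative = 100  # cost of treating a real danger as safe
cost_false_positive = 1    # cost of needlessly avoiding something harmless

# Rule A: always avoid the novel thing (biased toward the status quo).
expected_cost_avoid = (1 - p_danger) * cost_false_positive

# Rule B: always try the novel thing (unbiased but risky).
expected_cost_try = p_danger * cost_false_negative

print(expected_cost_avoid, expected_cost_try)
```

Rule A's expected cost (~0.95) comes out far below Rule B's (~5), so under these assumed costs selection would favor the "biased" avoider despite its many false alarms.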
 
I would accept that we're hard-wired to want explanations for things. (Probably because of the selective advantage for a social creature in being able to think about other peoples' motives and desires and so on, and the huge surplus in reasoning power that accompanies that.) Wanting to explain things can lead to genuine curiosity (which can lead eventually to science), or it can lead to clinging to false explanations.

Bolding mine.

With only one sample set, who's to say that what we have isn't what would have evolved under any other circumstances? With such a startlingly brief history (in terms of the evolutionary record), the modern scientific method seems more like the evolutionary quirk than confirmation bias or even tribal social behaviour.

If we are somehow able to take full control of our own evolution (or that of a synthetic follow-on super species), it may lead to an entirely different set of questions being asked 10,000 years from now...
 
But such ambiguities in the parable are beside the point.

Whether or not there is a specific confirmation bias gene is also beside the point. Following correct hypotheses about how the world works seems to me to yield a reproductive/survival advantage. There must be genes involved in such an apparently silly wiring plan for the brain that results in confirmation bias.

What possible reproductive advantage could result from being pleasured by ignoring evidence that your hypothesis about how the world works is incorrect?


That is way too specific for human biology. There is a benefit from associative pattern recognition, even if it gets misapplied at times.
Learning to see patterns is good: that is edible, that is where the edible thing grows, that is dangerous, that is where danger lives. Those are possible benefits of associative pattern recognition.

Superstitious behavior occurs when a consequence gets associated with an unrelated stimulus, so we have pattern recognition and superstition, both based upon healthy traits that become misapplied.
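That misapplication can be sketched in a few lines of Python (a purely illustrative toy; the action names, reward rate, and exploration rate are all made-up assumptions): rewards arrive at random, independent of behavior, but the agent credits whatever it was doing at the time and then prefers the credited action, which lets that action pile up still more spurious "evidence".

```python
import random

# Toy sketch of superstition growing out of healthy associative learning.
def superstitious_agent(trials=200, seed=7):
    rng = random.Random(seed)  # fixed seed for a reproducible illustration
    actions = ["wave totem", "chant", "sit still"]
    credit = {a: 0 for a in actions}
    for _ in range(trials):
        if rng.random() < 0.1 or max(credit.values()) == 0:
            action = rng.choice(actions)          # occasional exploration
        else:
            action = max(credit, key=credit.get)  # stick with the "winner"
        if rng.random() < 0.3:                    # reward is pure chance...
            credit[action] += 1                   # ...but this action gets credit
    return credit

print(superstitious_agent())
```

Whichever action happens to catch the first few random rewards ends up with far more credit than the other two combined, even though nothing about the world favored it.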
 
That is way too specific for human biology. There is a benefit from associative pattern recognition, even if it gets misapplied at times.
Learning to see patterns is good: that is edible, that is where the edible thing grows, that is dangerous, that is where danger lives. Those are possible benefits of associative pattern recognition.

Superstitious behavior occurs when a consequence gets associated with an unrelated stimulus, so we have pattern recognition and superstition, both based upon healthy traits that become misapplied.

I'm not really focusing on superstition.

My main curiosity is in the connection between ignoring evidence that contradicts your beliefs and stimulating the pleasure center. I'm ready to believe that a single genetic mutation resulted in a connection between these two groups of neurons, and that this bit of accidental neuron wiring was then selected for.

In pseudocode one would write this as follows:

FUNCTION ConfirmationBias(input, model)
    IF input doesn't agree with model THEN
        discard input
        CALL StimulatePleasureCenter()
    END IF
END FUNCTION

FUNCTION StimulatePleasureCenter()
    whatever you just did, do it again ASAP and with more gusto
END FUNCTION
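For what it's worth, that loop runs as a toy Python model too (purely illustrative: "confidence" stands in for belief strength, and the 1.5 multiplier stands in for the pleasure-center reinforcement). The perverse result is that the agent ends up more confident the more contrary evidence it meets.

```python
# Toy model of the pseudocode above; names and numbers are assumptions.
def confirmation_bias(observations, model, confidence=1.0):
    for obs in observations:
        if obs != model:     # input doesn't agree with model
            # Discard the input; the act of discarding is itself
            # reinforced, which strengthens the belief.
            confidence *= 1.5
    return confidence

# Five paw prints (disconfirming evidence for the spirit model) leave
# the believer several times *more* confident than before:
print(confirmation_bias(["paw print"] * 5, model="sky spirit"))  # 7.59375
```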

Could some dyed-in-the-wool skeptics have a damaged confirmation bias gene?

I only ask. ;)
 
And it follows that feeling pleasure at such automatic reactions would reinforce them.

What does a chipmunk on a road do when you approach it with a car? I believe it dodges randomly, rather than thinking about where to go. Thinking will get it squished every time.

~~ Paul
That's exactly right. And, if it were capable of higher thinking, it would probably later think "Hey, I dodged to the left first, and I survived... better go ahead and dodge left first every time!" We're not adapted to be "perfect", but merely "good enough", and some sort of dodging motion is going to almost always be at least a little better than no movement at all. So, you can surmise that there will be greater benefit to "keep doing what you have done successfully in the past" than in "stop and formulate a new method every time."
 
I'm not really focusing on superstition.

My main curiosity is in the connection between ignoring evidence that contradicts your beliefs and stimulating the pleasure center. I'm ready to believe that a single genetic mutation resulted in a connection between these two groups of neurons, and that this bit of accidental neuron wiring was then selected for.

In pseudocode one would write this as follows:

FUNCTION ConfirmationBias(input, model)
    IF input doesn't agree with model THEN
        discard input
        CALL StimulatePleasureCenter()
    END IF
END FUNCTION

FUNCTION StimulatePleasureCenter()
    whatever you just did, do it again ASAP and with more gusto
END FUNCTION

Could some dyed-in-the-wool skeptics have a damaged confirmation bias gene?

I only ask. ;)

Then how do you explain people's ability to be educated out of the confirmation bias fallacy in their thinking?

If it's hard-wired, it'd be like height or hair color, wouldn't it? Yours for life.

I think Dancing David is right, that a larger issue might be something connected with selective advantage, but a specific logical fallacy doesn't seem like something close enough to protein-encoding.

Just like when people say how could homosexuality have evolved? (I'd ask then how could poetry writing or philosophy have evolved?) (See Wowbagger's post above.)

We evolved as social animals where things like face-recognition and recognizing intent was important, and the large brains that were selected for those (among other) properties also have other interesting properties. . . including, for instance, the ability to learn calculus. Does it make sense to ask whether the ability to learn calculus (itself) was something selectively advantageous?
 
