We all know people who cling to their emotional beliefs with close-minded tenacity, regardless of evidence, so yeah, I get the article. It's like the old saying: you can't reason somebody out of a position they didn't reason themselves into.
But there seems to be a bit of naivete in the examples given in the article, or maybe the examples just aren't explained fully enough. For example:
Geoffrey Munro at the University of California and Peter Ditto at Kent State University concocted a series of fake scientific studies in 1997. One set of studies said homosexuality was probably a mental illness. The other set suggested homosexuality was normal and natural. They then separated subjects into two groups; one group said they believed homosexuality was a mental illness and one did not. Each group then read the fake studies full of pretend facts and figures suggesting their worldview was wrong. On either side of the issue, after reading studies which did not support their beliefs, most people didn’t report an epiphany, a realization they’ve been wrong all these years. Instead, they said the issue was something science couldn’t understand.
The subjects weren't living in a vacuum. They'd surely been exposed to other reports before these. I propose it would be silly to expect a person to have a sudden epiphany just because one report contradicted everything they'd based their opinion on until then.
If I believed the new report to be true, yet it contradicted previous ones, then the logical conclusion would be something like "science doesn't have the answers yet."
If people really were that easily manipulated, they could fall for any hoax.
Another example:
In 1992, Peter Ditto and David Lopez conducted a study in which subjects dipped little strips of paper into cups filled with saliva. The paper wasn’t special, but the psychologists told half the subjects the strips would turn green if he or she had a terrible pancreatic disorder and told the other half it would turn green if they were free and clear. For both groups, they said the reaction would take about 20 seconds. The people who were told the strip would turn green if they were safe tended to wait much longer to see the results, far past the time they were told it would take. When it didn’t change colors, 52 percent retested themselves. The other group, the ones for whom a green strip would be very bad news, tended to wait the 20 seconds and move on. Only 18 percent retested.
Again, most people know they feel healthy or have had a good report from a recent physical, so they already have prior evidence that would lead them to expect a good outcome.
A bad outcome would be a major life-changing event, affecting everything from career plans to health insurance. Only the most gullible would immediately accept an unexpected, major announcement at face value and act on it without double-checking.
I'm not sure why the gullibility these experiments test for is supposed to be a virtue. They seem like the opposite of the Milgram experiment: rather than showing that people bow to whatever authority figures tell them in official settings, they show that people are skeptical of information that doesn't fit their past experience, even when it's presented as a scientific study (albeit a faked one) or dressed up as a medical test. I'm not sure that's a bad thing.