Why bin Laden Death Photos Won’t Change Minds

• May 07, 2011 • 4:00 AM

Whether it’s Osama’s death throes or Obama’s birthplace, a wealth of academic research shows that people believe today what they believed yesterday — even increasingly outlandish conspiracy theories.

In debating whether to release the allegedly gruesome images of Osama bin Laden's death, one of the main arguments in favor of release was that it might help counter the conspiracy-minded thinking that the operation was a fake, or that bin Laden had been dead for years, or even that he was a CIA fabrication all along.

But President Barack Obama seems to have realized what many social scientists have known for years: that no evidence in the world would convince certain people that a U.S. Navy SEAL unit killed bin Laden at his compound in Abbottabad, Pakistan, and buried him at sea. Much in the same way, a certain percentage of Americans will never believe Obama was born in the United States, no matter how many long-form birth certificates are released.

But how is it that the human mind is capable of spinning increasingly baroque tales to deny simple facts?

Much actually follows the simple idea that most of us only seek and believe information that supports what we already think, all the while ignoring and disregarding information that would contradict our pre-existing beliefs. It’s a phenomenon that goes by many names: confirmation bias, selective exposure, myside bias, cognitive dissonance. But they all add up to the same thing: We like to pick one side and be right about it, and once we commit to thinking a certain way, gosh darn it, we’re going to make sure we keep thinking that way – whatever it takes.

“When you get some evidence, you only take it seriously if it agrees with you,” explains Jonathan Baron, a professor of psychology at the University of Pennsylvania. “And in a way, that’s not unreasonable. If I hear of some psychology experiment that demonstrates ESP, I’ll tend to say, ‘I don’t believe that.’ But the difficulty comes when people selectively expose themselves to evidence and then pretend that they didn’t.”

Studies in psychology have demonstrated, among other things, a consistent pattern of avoiding evidence that contradicts an initial hypothesis; irrepressible overconfidence in one’s own judgment; that initial impressions are hard to dislodge; that people feel more confident in their decisions when they only consider one side; and that we, in fact, have a preference for one-sided thinking.

One of the most famous studies in this line of research recruited people who had strong pro and con views on capital punishment, and then exposed them to made-up studies of the effect of the death penalty as a deterrent to crime. Opponents of the death penalty described the research finding no link to deterrence to be more convincing, while the opposite was true for supporters of the death penalty. But what was remarkable was that people came away from the experiment even more convinced of their original viewpoint.

Another famous example details what happened when a UFO cult leader’s prophecy that aliens from the planet Clarion would rescue cult members from an earth-destroying flood on Dec. 21, 1954, did not, in fact, come to fruition. Rather than losing faith, cult members instead became more resolute, believing that the prophecy was real but that their group had spread enough goodness to save the planet from destruction. After all, what was harder to believe for the group members: that they had made a mistake, or that they had singlehandedly prevented the destruction of humanity?

Such appears to be the human mind: Start with a faulty premise, add time and intention, and voilà! — without constant efforts to challenge our beliefs, the cognitive shortcuts and motivated reasoning we are all prone to can easily cement into impossible-to-dislodge conspiracy theories.

“There’s what we’d talk about as an epistemological stance that people take,” says Deanna Kuhn, a professor of psychology and education at Columbia University Teachers College. “By that, I mean what their stance is with respect to evidence, how claims are supported, what kind of evidence would you take as proof of a claim. And there are many people, too many people, who are operating in an epistemological framework that is not open to evidence.”

In the U.S., conspiracies often take on a partisan tinge, such as the birther phenomenon.

“When a Democrat is in the White House, the prevailing theories [are] fairly conservative,” says Mark Fenster, a University of Florida law school associate dean and author of Conspiracy Theories: Secrecy and Power in American Culture. “And when a Republican is in charge, the theories trend toward the left. There’s a partisan aspect, but there’s an underlying fear about concentrations of power.”

If you don’t like Obama, you’re much more likely to attune to allegations of his foreign birth. They confirm what you always thought of him — something is not quite right about that man. And all that filtering has a way of feeding upon itself to construct an elaborate epistemological edifice.

“If you think Obama is a terrible president, then you want to think that he’s lying about bin Laden,” Baron says. “So some of it is wishful thinking.”

Studies have shown that even something as seemingly objective as one’s own personal financial well-being is subject to one’s partisan leanings.

Fenster also notes that there are some intriguing demographic drivers: African-Americans tend to be more likely to believe in conspiracy theories (perhaps because they’ve been victimized more in American history, and belief in conspiracy theories tends to be associated with alienation from power). Men also tend to be more likely to believe in conspiracy theories than women, though nobody is sure why.

All in all, it suggests that convincing those who are fully invested in disbelieving you is almost always a fool’s errand. In fact, if you liked this article, it’s probably only because you suspected as much about human cognition in the first place.


Lee Drutman
Lee Drutman, Ph.D., teaches at the University of California Washington D.C. Semester Program. He has worked as a staff writer for the Philadelphia Inquirer and the Providence Journal. His work has also appeared in the Los Angeles Times, New York Newsday, Slate, Politico and the American Prospect.

