Persistence of Myths Could Alter Public Policy Approach


By Shankar Vedantam
Washington Post Staff Writer
Tuesday, September 4, 2007; A03

The federal Centers for Disease Control and Prevention recently issued a flier to combat myths about the flu vaccine. It recited various commonly held views and labeled them either "true" or "false." Among those identified as false were statements such as "The side effects are worse than the flu" and "Only older people need flu vaccine."

When University of Michigan social psychologist Norbert Schwarz had volunteers read the CDC flier, however, he found that within 30 minutes, older people misremembered 28 percent of the false statements as true. Three days later, they remembered 40 percent of the myths as factual.

Younger people did better at first, but three days later they made as many errors as older people did after 30 minutes. Most troubling was that people of all ages now felt that the source of their false beliefs was the respected CDC.

The psychological insights yielded by the research, which has been confirmed in a number of peer-reviewed laboratory experiments, have broad implications for public policy. The conventional response to myths and urban legends is to counter bad information with accurate information. But the new psychological studies show that denials and clarifications, for all their intuitive appeal, can paradoxically contribute to the resiliency of popular myths.

This phenomenon may help explain why large numbers of Americans incorrectly think that Saddam Hussein was directly involved in planning the Sept. 11, 2001, terrorist attacks, and that most of the Sept. 11 hijackers were Iraqi. While these beliefs likely arose because Bush administration officials have repeatedly tried to connect Iraq with Sept. 11, the experiments suggest that intelligence reports and other efforts to debunk this account may in fact help keep it alive.

Similarly, many in the Arab world are convinced that the destruction of the World Trade Center on Sept. 11 was not the work of Arab terrorists but was a controlled demolition; that 4,000 Jews working there had been warned to stay home that day; and that the Pentagon was struck by a missile rather than a plane.

Those notions remain widespread even though the federal government now runs Web sites in seven languages to challenge them. Karen Hughes, who runs the Bush administration's campaign to win hearts and minds in the fight against terrorism, recently gave a glowing account of the "digital outreach" teams working to counter misinformation and myths by challenging those ideas on Arabic blogs.

A report last year by the Pew Global Attitudes Project, however, found that the proportion of Muslims worldwide who do not believe that Arabs carried out the Sept. 11 attacks is soaring -- to 59 percent of Turks and Egyptians, 65 percent of Indonesians, 53 percent of Jordanians, 41 percent of Pakistanis and even 56 percent of British Muslims.

The research on the difficulty of debunking myths has not been specifically applied to beliefs about Sept. 11 conspiracies or the Iraq war. But because the experiments illuminate basic properties of the human mind, psychologists such as Schwarz say the same phenomenon is probably implicated in the spread and persistence of a variety of political and social myths.

The research does not absolve those who are responsible for promoting myths in the first place. What the psychological studies highlight, however, is the potential paradox in trying to fight bad information with good information.

Schwarz's study was published this year in the journal Advances in Experimental Social Psychology, but the roots of the research go back decades. As early as 1945, psychologists Floyd Allport and Milton Lepkin found that the more often people heard false wartime rumors, the more likely they were to believe them.

The research is painting a broad new understanding of how the mind works. Contrary to the conventional notion that people absorb information in a deliberate manner, the studies show that the brain uses subconscious "rules of thumb" that can bias it into thinking that false information is true. Clever manipulators can take advantage of this tendency.

The experiments also highlight the difference between asking people whether they still believe a falsehood immediately after giving them the correct information, and asking them a few days later. Long-term memories matter most in public health campaigns or political ones, and they are the most susceptible to the bias of thinking that well-recalled false information is true.

The experiments do not show that denials are completely useless; if that were true, everyone would believe the myths. But the mind's bias does affect many people, especially those who want to believe the myth for their own reasons, or those who are only peripherally interested and are less likely to invest the time and effort needed to firmly grasp the facts.

The research also highlights the disturbing reality that once an idea has been implanted in people's minds, it can be difficult to dislodge. Denials inherently require repeating the bad information, which may be one reason they can paradoxically reinforce it.

Indeed, repetition seems to be a key culprit. Things that are repeated often become more accessible in memory, and one of the brain's subconscious rules of thumb is that easily recalled things are true.

Many easily remembered things, in fact, such as one's birthday or a pet's name, are indeed true. But someone trying to manipulate public opinion can take advantage of this aspect of brain functioning. In politics and elsewhere, this means that whoever makes the first assertion about something has a large advantage over everyone who denies it later.

Furthermore, a new experiment by Kimberlee Weaver at Virginia Polytechnic Institute and others shows that hearing the same thing over and over again from one source can have the same effect as hearing that thing from many different people -- the brain gets tricked into thinking it has heard a piece of information from multiple, independent sources, even when it has not. Weaver's study was published this year in the Journal of Personality and Social Psychology.

The experiments by Weaver, Schwarz and others illustrate another basic property of the mind -- it is not good at remembering when and where a person first learned something. People are not good at keeping track of which information came from credible sources and which came from less trustworthy ones, or even remembering that some information came from the same untrustworthy source over and over again. Even if a person recognizes which sources are credible and which are not, repeated assertions and denials can have the effect of making the information more accessible in memory and thereby making it feel true, said Schwarz.

Experiments by Ruth Mayo, a cognitive social psychologist at Hebrew University in Jerusalem, also found that for a substantial chunk of people, the "negation tag" of a denial falls off with time. Mayo's findings were published in the Journal of Experimental Social Psychology in 2004.

"If someone says, 'I did not harass her,' I associate the idea of harassment with this person," said Mayo, explaining why people who are accused of something but are later proved innocent find their reputations remain tarnished. "Even if he is innocent, this is what is activated when I hear this person's name again.

"If you think 9/11 and Iraq, this is your association, this is what comes in your mind," she added. "Even if you say it is not true, you will eventually have this connection with Saddam Hussein and 9/11."

Mayo found that rather than deny a false claim, it is better to make a completely new assertion that makes no reference to the original myth. Rather than say, as Sen. Mary Landrieu (D-La.) recently did during a marathon congressional debate, that "Saddam Hussein did not attack the United States; Osama bin Laden did," Mayo said it would be better to say something like, "Osama bin Laden was the only person responsible for the Sept. 11 attacks" -- and not mention Hussein at all.

The psychologist acknowledged that such a statement might not be entirely accurate -- issuing a denial or keeping silent are sometimes the only real options.

So is silence the best way to deal with myths? Unfortunately, the answer to that question also seems to be no.

Another recent study found that when accusations or assertions are met with silence, they are more likely to feel true, said Peter Kim, an organizational psychologist at the University of Southern California. He published his study in the Journal of Applied Psychology.

Myth-busters, in other words, have the odds against them.