"Once people receive misinformation, it’s quite difficult to remove its influence. This was demonstrated in a 1994 experiment where people were exposed to misinformation about a fictitious warehouse fire, then given a correction clarifying the parts of the story that were incorrect. Despite remembering and accepting the correction, people still showed a lingering effect, referring to the misinformation when answering questions about the story."
There are a lot of psychology gems in there, but that one stood out because it describes a common media pattern: hyping a new story that later turns out to be utterly false. Issuing corrections matters precisely because wrong information must be corrected. Where I think we run into problems is that the correction rarely receives the same diligence as the super-hyped falsehood. Medical information and science reporting in particular are prone to this behavior in news cycles. It would seem better if, instead of reporting every new study in ten-foot headlines, we waited for more confirmations to trickle in first.
The various debunking tips:
1) Avoid backfiring. We tend to remember things in networks, relating new information to what we already "know to be true". If we already "know" a myth to be true, then associating new information with it, even information demonstrating the falseness of the myth, is liable to cause trouble.
2) Overkill: don't do it. I do this a lot. Burying people under mountains of information disclaiming their weak thesis is a common response to seeing someone as wrong. Stick to the best facts and arguments and build from there as needed. Only among the most "informed" and dedicated followers of a myth will one need to amass copious amounts of information to combat it. Most people believe vague and poorly thought-out straw-man positions on most issues, mostly because they don't care about those issues very much. Pile-driving them with information demonstrates your passion for the subject, but it doesn't convince them to share that passion or to alter a preconceived opinion in light of new facts challenging it. This is probably why news media tend to cater to those preconceived opinions rather than chart new courses following surprising facts that overturn the old ways.
3) Don't challenge worldviews. This comes up frequently in debates over evolution. Apparently, men and apes sharing common ancestry destroys a lot of comfortable assumptions about man's special place in the universe for some religious people, and hence Darwin was wrong. Or something. However, changing the topic to something like geological time scales or raw genetics, complicated subjects in their own right, is likely to make at least some headway with the Darwin-was-wrong crowd. Some of it, at least.
4) Alternative explanations. The human brain's drive for complete schemas is a powerful force behind mythology. Explaining complex phenomena like weather, the evolution of species, or why "bad things happen to good people" requires sustained and generally complicated reflection to reach a concise and thorough understanding. People don't like doing that very much (nor do they have the time to make such inquiries into every subject). So they fill in the gaps until they feel like they know what they are talking about. Admitting in a public setting that one does not know what one is talking about carries a devastating social cost. Such honest ignorance is rare, and it is greeted with derision and scorn. It should be greeted with acceptance and conversation, if the ignorance is honest (Rick Perry's "oops" moment, for instance, was not, as he was forgetting talking points intended as bona fides of his conservative intentions to govern). Experts often do not know the answers to everything going on in their own circle, much less in broad fields outside their sphere of expertise. They will ask questions and express fascination or curiosity about new developments, and if they are good enough, they will also help poke holes in new theories and discoveries. This is not the common response of laymen in fields with which they are not intimately acquainted. Deference to expertise, to group acceptance, to ideological biases, and so on is much more likely where knowledge bases are poor.