Trust in Science Can Ironically Lead to False Beliefs. Luckily, There’s a Solution

Fans of science aren’t immune from swallowing the occasional bit of baloney. Fortunately, reminders that science values a critical eye can go a long way when it comes to sorting robust evidence from misinformation dressed in a lab coat.

A study by researchers from the University of Illinois, Urbana-Champaign, and the University of Pennsylvania in the US showed how a broad trust in science can make pseudoscience dressed in scientific-sounding language more appealing.

Four experiments conducted online, each with between 382 and 605 volunteers, compared responses to two fictitious accounts: one claiming that genetically modified organisms cause cancer, the other involving a viral bioweapon.

The experiments varied how each story was presented, couching it either in scientific language or in lay terms. On analysis, participants who reported trusting science were, unsurprisingly, more likely to be swayed by the more scientific-sounding accounts.

Going on these results alone, efforts to promote greater trust in science ironically become a win-lose proposition when it comes to dispelling conspiracy myths and pseudoscience.

One final experiment gives us some hope. Participants reminded to "think for themselves and not blindly trust what media or other sources tell them" thought twice about their responses and were less likely to view the stories favorably.

Retaining a healthy amount of skepticism in the face of scientific-sounding claims isn’t exactly shocking advice.

Yet as reliable evidence struggles to stand out in a churning sea of misinformation, there’s a growing need to identify exactly what makes for effective public communication.

“What we need are people who also can be critical of information,” says Dolores Albarracín, a social psychologist affiliated with the University of Pennsylvania and the University of Illinois, Urbana-Champaign.

“A critical mindset can make you less gullible and make you less likely to believe in conspiracy theories.”

After centuries of steady improvements in medicine and technology demonstrating the worth of science, most people tend to associate scientific endeavors with positive outcomes. On the whole, we think science is a good thing, even if deciding whom to trust is more complicated.

At the heart of the problem lies an easily distracted human brain shaped by millions of years of evolution. With attention at a premium, our brains must be economical in identifying the kinds of information most likely to benefit us.

Unfortunately, human thinking has been shaped less by a need to compute the fundamentals of nature and more by the need to work with other human brains. Our cognitive tools are adapted to search for shortcuts, known as heuristics, based on language, facial expressions, and even fashion to quickly determine who is on our side and who isn't.

Being reminded to remain critical can put the brakes on an over-reliance on heuristic thinking, giving our brains a chance to look for more information to build a belief.

While the study emphasizes the need to promote science hand-in-hand with the value of thinking critically, it doesn't describe a panacea against misinformation.

Few of us are in a position to take the time needed to build beliefs from the ground up; in the end, virtually all of us rely on trusting other people who present themselves as well informed, whether they mean to deceive us or have simply backed the wrong horse themselves.

“People are susceptible to being deceived by the trappings of science,” says Albarracín.

“It’s deception but it’s pretending to be scientific. So people who are taught to trust science and normally do trust science can be fooled as well.”

This research was published in the Journal of Experimental Social Psychology.
