Study Finds a Key Way to Build Trust in Science – And It’s Not Education

Trust in the truth is a major talking point these days. How we respond to the greatest global emergencies of our time depends on the outcome of that discussion.

Emerging evidence now suggests that a popular tactic for improving public trust in science is actually based on a myth.

A new survey of 705 individuals in the United States suggests that education does not drive general acceptance of science, contrary to what the so-called ‘information deficit model’ once proposed.

“This is somewhat (but not completely) surprising,” explain two researchers at the University of Maribor in Slovenia.

While it’s probably true that trust in science somewhat hinges on a greater understanding of the scientific process itself, psychologists Nejc Plohl and Bojan Musil argue that education may only play a role in “determining trust within specific areas of science.”

General trust in science is a different matter. It is linked to education only at a correlational level; when other factors are considered, the association disappears, explain Plohl and Musil.

The findings suggest that merely presenting people with scientific facts isn’t enough to convince them to trust in the scientific process, as those studying risk and science communication have been pointing out for decades now.

Meanwhile, the new survey results also show that stronger predictors of trust in science include political conservatism, religiousness, and conspiracy ideation, consistent with previous research.

But the new study also shows that openness to changing one’s viewpoint is an especially powerful factor.

This openness is considered an aspect of intellectual humility, defined as “a nonthreatening awareness” of one’s fallibility.

In recent years, the concept has been tied to trust in COVID-19 vaccines.

In one study, for instance, researchers found those who show less intellectual humility are more likely to have anti-vaccine attitudes. Meanwhile, those who show more intellectual humility generally plan to get vaccinated.

“Since scientific claims can contradict individuals’ beliefs and expectations about the world, being able to shift one’s opinion when confronted with alternative evidence may be crucial to building trust between individuals and the scientific community,” write Plohl and Musil.

But while openness was the second strongest factor the two psychologists identified in their survey, it only explained a small part of the picture.

Two-thirds of the variation in results was not tied to any factor the authors considered.

As such, Plohl and Musil think psychologists might be overlooking some major ways to improve public trust in science.

Tackling logical fallacies and cognitive biases in human thought could be more important than delivering cold hard facts.

“Scientific information can be difficult to swallow, and many individuals would sooner reject the evidence than accept information that suggests they might have been wrong,” psychologists argued in a 2022 paper on distrust in science.

“This inclination is wholly understandable, and scientists should be poised to empathize.”

The study was published in Personality and Individual Differences.
