(Dis)Trust in Science
Dozens of infants and children in Romania died recently in a major measles outbreak, fueled in part by prominent celebrities campaigning against vaccination. This trend parallels that of Europe as a whole, which suffered a 400 percent increase in measles cases from 2016 to 2017. Unvaccinated Americans traveling to the World Cup may well bring the disease back to the U.S.
Of course, we don’t need to go to Europe to catch measles. Kansas just experienced its worst outbreak in decades, with children and adults in a few unvaccinated families at the center of it. Just as in Romania, parents in the U.S. are being fooled by the false claim that vaccines cause autism, a belief that has spread widely across the country, with serious consequences for public health.
Measles was declared eliminated in the U.S. in 2000. In recent years, however, outbreaks of measles have been on the rise, driven by parents in a number of communities failing to vaccinate their children. We should be especially concerned because our president has frequently expressed the false view that vaccines cause autism, and his administration has pushed against funding science-based policies at the Centers for Disease Control and Prevention.
These illnesses and deaths are among the many terrible consequences of the crisis of trust our institutions have suffered in recent years. While headlines focus on declining trust in the media and the government, science and academia are not immune to this crisis of confidence, and the results can be deadly.
Consider that in 2006, 41 percent of respondents in a nationwide poll expressed “a lot of confidence” in higher education. Less than 10 years later, in 2014, only 14 percent of those surveyed expressed “a great deal of confidence” in academia.
What about science as distinct from academia? Polling shows that the number of people who believe science has “made life more difficult” increased by 50 percent from 2009 to 2015. According to a 2017 survey, only 35 percent of respondents have “a lot” of trust in scientists, and the number of people who do “not at all” trust scientists increased by over 50 percent compared with a similar poll conducted in December 2013.
This crumbling of trust in science and academia forms part of a broader pattern, what Tom Nichols called The Death of Expertise in his 2017 book. Growing numbers of people claim their personal opinions hold equal weight to the opinions of experts.
SHOULD WE TRUST SCIENTIFIC EXPERTS?
While we can all agree that we do not want people to get sick, what is the underlying basis for the idea that the opinions of experts, including scientists, deserve more trust than those of the average person in evaluating what is true?
The term “expert” refers to someone with extensive familiarity with a specific area, as shown by commonly recognized credentials: a certification, an academic degree, publication of a book, years of experience in a field, or some other marker by which a reasonable person would recognize expertise. Experts can draw on this substantial body of knowledge and experience to provide an opinion, often expressed as “expert analysis.”
That doesn’t mean an expert opinion will always be right: it’s simply much more likely to be right than the opinion of a non-expert. The underlying principle here is probabilistic thinking, our ability to predict current and future reality based on limited information. Thus, a scientist studying autism is much more likely to assess accurately the consequences of vaccination than someone who has spent 10 hours googling “vaccines and autism” online.
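To make this probabilistic claim concrete, here is a minimal sketch in Python of the underlying Bayesian arithmetic. The accuracy figures are purely illustrative assumptions, not measured values; the point is only that a more reliable source shifts our confidence further.

```python
# Toy Bayesian update: how much should hearing a source endorse a claim
# move our belief that the claim is true? All numbers below are
# illustrative assumptions for the sake of the example.

def posterior(prior: float, accuracy: float) -> float:
    """P(claim true | source endorses it), for a source that endorses true
    claims with probability `accuracy` and false ones with 1 - accuracy."""
    true_endorsed = prior * accuracy
    false_endorsed = (1 - prior) * (1 - accuracy)
    return true_endorsed / (true_endorsed + false_endorsed)

prior = 0.5              # start out maximally uncertain about the claim
expert_accuracy = 0.9    # assumed: a domain expert judges correctly 90% of the time
googler_accuracy = 0.55  # assumed: 10 hours of googling barely beats a coin flip

print(f"Belief after expert endorsement:  {posterior(prior, expert_accuracy):.2f}")   # 0.90
print(f"Belief after googler endorsement: {posterior(prior, googler_accuracy):.2f}")  # 0.55
```

The same endorsement moves a rational observer from 50 percent confidence to 90 percent when it comes from the (assumed) expert, but barely at all when it comes from the casual googler, which is the whole of the argument for weighting expert opinion more heavily.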
This greater likelihood of experts being correct does not mean we should always defer to them. First, research shows that experts evaluate reality best in environments that are relatively stable over time, and thus predictable, and in which they have had the chance to learn those environments’ regularities. Second, other research suggests that ideological biases can strongly degrade experts’ ability to make accurate evaluations. Third, material motivations can sway experts to conduct an analysis favorable to their financial sponsors.
However, while individual scientists may make mistakes, it is incredibly rare for the scientific consensus as a whole to be wrong. Scientists get rewarded in money and reputation for finding fault with statements about reality made by other scientists. Thus, for the large majority of them to agree on something—for there to be a scientific consensus—is a clear indicator that whatever they agree on reflects reality accurately.
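A back-of-the-envelope calculation helps show why consensus is so much more reliable than any individual scientist. Under the idealized assumption that experts reach their conclusions independently (real scientists share data and training, so treat this as an intuition pump rather than a real estimate), the probability that a majority of them is wrong collapses as their numbers grow, a result known as Condorcet’s jury theorem:

```python
# Condorcet's jury theorem, sketched: if each of n independent judges is
# correct with probability p > 0.5, the chance that the majority is correct
# rises rapidly toward 1 as n grows. Independence is an idealized assumption.
from math import comb

def majority_correct(n: int, p: float) -> float:
    """Probability that more than half of n independent judges are correct."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

for n in (1, 11, 101):
    print(f"{n:>3} experts at 70% individual accuracy -> "
          f"majority correct {majority_correct(n, 0.7):.3%} of the time")
# 1 expert: 70%; 11 experts: ~92%; 101 experts: >99.99%
```

Even modestly reliable individual judgments, aggregated across a large community that rewards catching one another’s errors, yield a collective verdict that is very unlikely to be wrong.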
THE INTERNET IS FOR … MISINFORMATION
The rise of the internet and, more recently, social media is key to explaining the declining public confidence in expert opinion.
Before the internet, the information accessible to the general public about any given topic usually came from experts. Scientific experts on autism, for instance, were the ones invited to discuss the subject in mainstream media; major publishers released their books; and the same experts wrote the relevant encyclopedia articles.
The internet has enabled anyone to be a publisher of content, connecting people around the world with any and all sources of information. On the one hand, this freedom is empowering and liberating: Wikipedia is a great example of a highly curated and accurate source on the vast majority of subjects. On the other, anyone can publish a blog post making false claims about links between vaccines and autism, and anyone skilled at search engine optimization, or with money to invest in advertising, can get that message spread widely.
Unfortunately, research shows that people lack the skills to differentiate misinformation from accurate information. This deficit has clear real-world effects: U.S. adults who remembered seeing fake news stories about the 2016 presidential election believed them about 75 percent of the time. And the more often someone sees a piece of misinformation, the more likely they are to believe it.
Blogs with falsehoods are bad enough, but the rise of social media has made the situation even worse. Most people re-share news stories without reading the actual article, judging a story’s quality by its headline and image alone. No wonder research indicates that misinformation spreads as much as 10 times faster and farther on social media than true information. After all, the creator of a fake news item is free to devise the most appealing headline and image, while credible sources of information have to stick to factual headlines and images.
These problems result from the train wreck of human thought processes meeting the internet. We all suffer from a series of thinking errors such as confirmation bias, our tendency to look for and interpret information in ways that conform to our beliefs.
Before the internet, we got our information from sources such as mainstream media and encyclopedias, which curated it for us to ensure it came from experts, limiting the room for confirmation bias. Now, without that curation, thinking errors lead us to choose information that fits our intuitions and preferences rather than the facts. Moreover, some unscrupulous foreign actors (such as the Russian government) and domestic politicians use misinformation as a tool to influence public discourse and public policy.
The large gaps between what scientists and the public believe about issues such as climate change, evolution, GMOs, and vaccination exemplify the problems caused by misinformation and lack of trust in science. Such mistrust results in great harm to our society, from outbreaks of preventable diseases to highly damaging public policies.
WHAT CAN WE DO?
Fortunately, there are proactive steps we can take to address the crisis of trust in science and academia.
For example, we can elevate the role of science in our society. The March for Science movement is a great example of this effort: first held on Earth Day in 2017 and repeated in 2018, it involves people rallying in the streets to celebrate science and push for evidence-based policies. Another example is the Scholars Strategy Network, which supports scholars in popularizing their research for a broad audience and connects them with policy makers.
We can also fight the scourge of misinformation. Many governments around the world are taking steps to combat falsehoods. While the U.S. federal government has dropped the ball on this problem, a number of states have passed bipartisan measures promoting media literacy. Likewise, many non-governmental groups are pursuing a variety of efforts to fight misinformation.
The Pro-Truth Pledge combines the struggle against misinformation with science advocacy. Founded by a group of behavioral science experts (including myself) and concerned citizens, the pledge calls on public figures, organizations, and private citizens to commit to 12 behaviors, listed on the pledge Web site, that research in behavioral science shows correlate with truthfulness. Signers are held accountable through a crowdsourced reporting and evaluation mechanism while gaining reputational rewards for their commitment. The scientific consensus serves as a key measure of credibility, and the pledge encourages pledge-takers to recognize expert opinions as more likely to be true when the facts are disputed.
More than 500 politicians have taken the pledge, including state legislators Eric Nelson (Pa.) and Ogden Driskill (Wyo.), and members of the U.S. Congress Beto O’Rourke (Texas) and Marcia Fudge (Ohio). Two research studies at Ohio State University have demonstrated, with strong statistical significance, that taking the pledge changes the behavior of pledge-takers to be more truthful. Taking the pledge yourself, and encouraging people you know and your elected representatives to do the same, is thus an easy way both to fight misinformation and to promote science.
I have a dream that, one day, children will not be getting sick with measles because their parents put their trust in a random blogger instead of extensive scientific studies. I have a dream that schools will be teaching media literacy and people will know how to evaluate the firehose of information coming their way. I have a dream that we will all know that we suffer from thinking errors and watch out for confirmation bias and other problems. I have a dream that the quickly growing distrust of experts and science will seem like a bad dream. I have a dream that our grandchildren will find it hard to believe our present reality when we tell them stories about the bad old days.
Living these dreams requires all of us who care about truth and science to act now, before we fall further down the slippery slope. Our information ecosystem and credibility mechanisms are broken: only about a third of Americans have a lot of trust in scientists, and most people can’t tell the difference between truth and falsehood online. The lack of trust in science, and the excessive trust in persuasive purveyors of misinformation, is perhaps the biggest threat to our society right now. If we don’t turn back from the brink, our future will not be a dream: it will be a nightmare.