Characterizing the Eroding Trust in Science
Updated: Jan 25
While the U.S. is experiencing a drastic uptick in COVID-19 cases, with tens of thousands of new cases per day, trust in leading infectious disease expert Dr. Anthony Fauci seems to be wavering, especially among top government officials. Dr. Fauci was not invited to President Donald Trump’s coronavirus briefing on July 21st — the first such briefing in two months — as the White House has moved to distance itself from the public health expert.
Dr. Fauci at a coronavirus briefing over three months ago (Shutterstock, 2020).
Trust in science has seemingly dwindled in the United States, and this deteriorating relationship between the public and scientific experts has contributed to COVID-19’s spread. “In many states and cities, you have the leadership actually giving the right guideline instruction. But somehow, people for one reason or another, don't believe it or are not fazed by it. And they go ahead and do things that are either against the guidelines that their own leadership is saying,” Fauci told CBS News on July 17th. He continued, “It's the kind of mistrust of science because science is viewed as authority. And there's a lot of anti-authority feeling. I think that's the kind of thing that drives the anti-vaxxers, the people who don't believe the science of vaccination and don't want to get their children vaccinated. It's all part of that trend, which is very disturbing.”
Fauci says that the mistrust of science is a trend, but is that really the case? Is faith in science really on the downturn?
Many surveys indicate that it is. Regarding trust in academia, one nationwide poll in 2004 found that 41.6% of Americans had “a lot of confidence” in America’s colleges and universities. In 2014, just 10 years later, a similar poll found that just 14% of respondents felt “a great deal of confidence” in institutions of higher education. Meanwhile, YouGov polls show that, between 2013 and 2017, the share of respondents who had some level of trust in scientists decreased from 87% to 80%, while the share who did not trust scientists at all increased from 6% to 10%.
Yet, not all measures of trust are on the downturn. Surveys conducted by the Pew Research Center indicate that the percentage of U.S. adults who believe that scientists act in the public interest rose between 2016 and 2019, suggesting that there may be recent improvements in science’s public image. Still, America’s fickle trust in its own leading experts is a growing concern, especially given the country’s poor handling of the COVID-19 pandemic.
The Demographics of Distrust
Many factors influence how trusting someone is of science — perhaps the most studied is political ideology. A 2019 Pew Research Center survey found that 43% of Democrats reported “a great deal of confidence” that scientists act in the public interest, compared to 27% of Republicans. Across a slew of other questions, such as whether scientists should take an active role in policy debates and whether the scientific method generally produces accurate conclusions, Democrats consistently showed higher favorability toward and trust in science.
However, the survey also showed that the partisan divide varies across areas of science. For example, far more Democrats (70%) had a generally positive view of environmental research scientists than Republicans (40%), but when it came to medical research scientists, the numbers were much closer (70% of Democrats and 67% of Republicans had a generally positive opinion). A 2017 study found that, apart from particularly politicized issues such as climate change, Democrats and Republicans were no more or less likely to be skeptical of science, such as skepticism about the safety of genetically modified foods or about vaccines. Although political ideology clearly plays a role, its exact impact appears to be quite complicated.
The Pew Research Center also found that another key driver in scientific trust is a person’s level of familiarity with scientists. Consider that a larger portion of Americans reported mostly positive views of medical doctors (74%) than medical research scientists (68%). Regarding medical researchers, 84% of Americans who reported knowing a lot about medical researchers had a mostly positive view of the group. However, those who reported knowing “nothing at all” about medical researchers were much less likely to hold a positive opinion (only 41% of these Americans had a mostly positive view). Furthermore, those who knew “a lot” about medical researchers were more likely to believe that medical researchers cared about the public interest, conducted good research, and provided fair and accurate information than those who knew either “a little” or “nothing at all.”
Familiarity, it seems, breeds trust, which would explain why Americans tend to have better opinions of doctors than of researchers: many Americans are simply more familiar with doctors and their work.
A similar result occurs when comparing Americans with high, medium, or low science knowledge. As with familiarity, those with more scientific knowledge were more likely to hold positive opinions of medical researchers and to have faith that the group was acting in the public interest and providing accurate information. Interestingly, among those with low science knowledge, a higher proportion had a positive opinion of doctors (61%) than of medical researchers (53%). This gap in opinion between doctors and researchers was much smaller among those with high science knowledge (81% versus 79%).
Similar results persist when comparing opinions on dietitians, nutrition research scientists, environmental health specialists, and environmental research scientists — those with higher familiarity and those with higher science knowledge tend to be more trusting towards those groups.
When it comes to active distrust, many Americans believe misconduct among scientific researchers to be either a “very big problem” or a “moderately big problem,” and that researchers sometimes or often avoid serious consequences. When research is funded by an industry group, the majority of Americans say they would trust those findings less. Meanwhile, if data is openly available to the public, or if findings are reviewed by an independent committee, Americans say they tend to trust research results more. In addition, conservative Republicans are much less likely to trust research that is funded by the federal government compared to moderate/liberal Republicans and Democrats.
Aside from political ideology and familiarity, race and religion can also factor into scientific trust. For example, Black and Hispanic Americans were much more likely than white Americans to say that scientific misconduct is a big problem. Given the racist misconduct of the infamous Tuskegee syphilis study and the case of Henrietta Lacks, this is perhaps unsurprising. A separate study also reported religious orthodoxy to be a major predictor of low trust in science, even more so than political bias.
But what can be done about this “crisis of trust”? In some ways, the problem is out of scientists’ hands. The spread of misinformation on the internet is perceived by many to be a major cause of the crisis, and it’s not hard to see why. Because false information can spread rapidly, often under catchier headlines than legitimate articles, the internet is incredibly efficient at misinforming the public. At the same time, research indicates that people lack the skills to discern the credibility of information online. Solving these issues requires both anti-misinformation measures by media sites and educational reform that provides genuine internet literacy skills for the American population.
Changing how science is presented in the news could also improve public trust. The majority of Americans say they learn about medical, dietary, or environmental professionals and researchers through the news. At times, the news focuses on poorly performed science for the sake of sensationalism or political and social commentary. While regulating the news is dangerous territory to tread, something should be done to change how science is portrayed in the media.
Some of that may come down to scientists themselves. Science journal articles are often extremely technical, with overwhelming amounts of jargon that render them unreadable to non-experts. While such jargon-laden studies are necessary for experts to communicate with one another, the unfortunate consequence, in the absence of other sources of direct information, is a barrier between the public and exciting new science.
Furthermore, the Pew Research survey shows that a large part of the American public worries about misconduct among scientists conducting research. Part of that concern may stem from scientists retracting erroneous statements or findings. Alleviating this aspect of the problem may involve educating the public about the inherently self-correcting nature of science, where the retraction of a previous finding may well be an indicator that science does work: every time a previous finding is revised, science gets closer to the truth. Not all overturned findings are due to faulty research, but some are, and we also need to address the systemic reasons why faulty research gets conducted and published in the first place.
Still, while the relationship between the scientific community and American society won’t be easy to repair, the rapid spread of COVID-19 in the United States has demonstrated the importance of a science-receptive public. Finding ways to combat misinformation, addressing systemic issues in research, creating more opportunities to present work to a wider audience, or funding more science journalism may be worthwhile investments if they can create a deeper trust between scientists and the society they serve.
Coronavirus Task Force Daily Briefing, Washington DC, USA [Online image]. (2020). Shutterstock. https://www.shutterstock.com/editorial/image-editorial/coronavirus-task-force-daily-briefing-washington-dc-usa-01-apr-2020-10599739bp
Donald, B. (2016, November 22). Stanford Researchers Find Students Have Trouble Judging the Credibility of Information Online. Stanford Graduate School of Education. https://ed.stanford.edu/news/stanford-researchers-find-students-have-trouble-judging-credibility-information-online
Funk, C., Hefferon, M., Kennedy, B., & Johnson, C. (2019, August 2). Trust and Mistrust in Americans’ Views of Scientific Experts. Pew Research Center. https://www.pewresearch.org/science/2019/08/02/trust-and-mistrust-in-americans-views-of-scientific-experts/
Gross, N., & Simmons, S. (2006, May 22). Americans’ Views of Political Bias in the Academy and Academic Freedom (working paper version). https://www.aaup.org/NR/rdonlyres/DCF3EBD7-509E-47AB-9AB3-FBCFFF5CA9C3/0/2006Gross.pdf
Johnson, D. R., & Peifer, J. L. (2017, April 3). How Public Confidence in Higher Education Varies by Social Context. The Journal of Higher Education. https://www.tandfonline.com/doi/abs/10.1080/00221546.2017.1291256?journalCode=uhej20&
Rutjens, B. T., Sutton, R. M., & van der Lee, R. (2017, December 01). Not All Skepticism Is Equal: Exploring the Ideological Antecedents of Science Acceptance and Rejection. Personality and Social Psychology Bulletin (PSPB), 44(3), 384-405. https://doi.org/10.1177/0146167217741314
YouGov/Huffington Post. (2013). YouGov December 6-7, 2013 [Data set]. https://d25d2506sfb94s.cloudfront.net/cumulus_uploads/document/phqmhkwqbe/tabs_HP_science_20131209.pdf
YouGov/Huffington Post. (2017). YouGov April 28-29, 2017 - 1000 US Adults [Data set]. http://big.assets.huffingtonpost.com/tabsHPScienceandPolitics20170428.pdf