To Fight Misinformation, We Need to Teach That Science Is Dynamic

Science is a social process, and teaching students how researchers work in tandem to develop facts will make them less likely to be duped by falsehoods

Sixty-five years ago, a metal sphere the size of a basketball caught the U.S. science, military and intelligence communities by surprise. Sputnik 1, the first artificial satellite, launched into orbit by the Soviet Union, forced U.S. policy makers to recognize that the country was falling behind globally in educating and training scientists. In response, the government began investing in science education at every level from elementary to postgraduate. The aim was to scale up the nation’s scientific workforce and improve the public’s understanding of science, ensuring that we would never again face a comparable technology gap.

The Sputnik-era reforms produced a cadre of experts. But these reforms were largely unsuccessful in helping the public understand how science works, why science matters, and why and when it should be trusted. Reading most textbooks today, a student might never realize that before settled facts and models emerge, there is a period of uncertainty and disagreement. As we have seen during the COVID pandemic, some people take the absence of consensus as an indication of scandal or malfeasance rather than a normal part of how science is conducted. From there, one might be inclined to doubt the entire system, including any subsequent consensus.

It’s easy to see why so many of us struggle to distinguish trustworthy science from what is flawed, speculative or fundamentally wrong. When we don’t learn the nature of consensus, how science tends to be self-correcting and how community as well as individual incentives bring to light discrepancies in theory and data, we are vulnerable to false beliefs and antiscience propaganda. Indeed, misinformation is now a pervasive threat to national and international security and well-being.


Giving people more facts is insufficient. Instead, we need a populace that can tell which sources of information are likely to be reliable, even if the science itself is beyond what they learned in school, so that they can identify when they need scientific information to make decisions in their own lives. Just as critically, people must understand enough about how science attempts to minimize error. In other words, every member of our society needs to be what science education researcher Noah Feinstein calls a “competent outsider.”

To become competent outsiders, students need to learn how science produces reliable knowledge. But here our educational system is falling short. In the words of the American Association for the Advancement of Science, the process of science is taught as a sequence of “posing problems, generating hypotheses, designing experiments, observing nature, testing hypotheses, interpreting and evaluating data, and determining how to follow up on the findings.” That sequence describes what individual scientists do, but it leaves out the collaborative scrutiny that makes the results trustworthy. Curricula at all levels must also teach how the social, collaborative nature of science works to produce reliable knowledge. Here are five core topics that should be included:

Uncertainty. Practicing scientists spend most of their time dealing with unsettled questions, whereas textbooks traffic in long-settled science. This can be disorienting when science-in-the-making is suddenly thrust into public view. Students should be taught how scientists manage uncertainty: typically, scientists consider some explanations more likely than others while holding open the possibility that any of a number of alternatives is correct. In most cases, when a new study is published, its results are not taken to be the definitive answer but rather a pebble on the scale favoring one of several hypotheses.

Peer review. Scientific claims are validated (or tossed out) through peer review, but this process does not guarantee that any particular conclusion is correct. Rather, it filters for work that is more likely to be interesting, plausible and methodologically sound. It is not designed to detect fraud or experimental error, for example; reviewers do not replicate the original experiments. While much attention goes to the prepublication peer review that determines whether a paper will be published, the process is ongoing. Projects face peer review when they are first proposed, as the scientists working on them make progress, and again after publication, on social media sites and discussion boards and in the formal scientific literature.

Expertise. When evaluating scientific claims, researchers consider the expertise of the people making them. The competent outsider must likewise ask whether the claimant has appropriate expertise. It is not always practical to comprehensively evaluate a person’s training, qualifications, track record, standing within the field, employment, and potential sources of bias, financial or otherwise. But you can at least consider where a person works: is a scientist who is endorsing a product employed by the company making that product? Science today is a highly specialized activity; the further a topic lies from an individual scientist’s expertise, the more caution their claims warrant. An active researcher with an M.D. or a Ph.D. in a medical field is probably qualified to explain the general principles behind vaccines, whereas a life scientist is unlikely to be a good authority on how polar ice sheets are contributing to sea level rise.

Consensus. When scientists broadly agree on observations or on interpretations of data, that agreement is a consensus, and it guides their understanding of the world. Some issues have resolved into a broad consensus (the earth’s climate is changing because of human activity), whereas others remain unsettled (the specific biological mechanisms responsible for long COVID). In the absence of a scientific consensus, there is good reason to be skeptical of anyone who claims to know the answer with certainty. Consensus does not emerge immediately and is never based on a single publication; it is established by extensive, meticulous, empirical work that other scientists and reviewers examine deeply and critically at all stages. Even a strong scientific consensus may not be unanimous. Most important scientific claims, from the causes of climate change to the role of evolution by natural selection, have at least a handful of contrarians. These range from unqualified people who assert claims with no evidence to people who have serious scientific arguments to make. There are even cases in which a contrarian is exceptionally qualified in a closely related discipline, such as a Nobel laureate in medicine promoting fringe views about the causes of AIDS. In science, consensus trumps expertise every time.

Agnotogenesis. Corporations and other interests with a financial or political stake in outcomes use agnotogenesis, the deliberate creation of doubt, to undermine confidence in scientific findings. Typically, the aim is to create enough uncertainty to stave off regulatory action. For example, the tobacco industry tried to cast doubt on findings linking smoking to cancer, and fossil fuel companies have tried to undermine scientific evidence of anthropogenic climate change.

Some may argue that our proposal for teaching students to be competent outsiders adds yet another subject to an already overloaded curriculum. But it can be done, as illustrated by courses such as Sense & Sensibility & Science at the University of California, Berkeley. We have seen how pandemic misinformation undercut efforts by public health and medical practitioners. That dismissal of the iterative process of science and of consensus drove, in part, high rates of vaccine refusal, leading to large numbers of unnecessary deaths and immeasurable additional harm. We cannot bemoan the plethora of misinformation if we are not prepared to explain and defend the tools and processes that will help us deal with the next pandemic, prevent mass extinction and reverse climate change.

This is an opinion and analysis article, and the views expressed by the author or authors are not necessarily those of Scientific American.