Four things newsrooms can do right now to counter science polarization
There are no easy fixes for systemic issues—but there are steps we can take to inoculate readers against misinformation
At SRCCON in June, we hosted a discussion about countering polarization in coverage of science topics. On the heels of two years of pandemic weirdness, not to mention years of entrenched discourse about climate change, the topic feels especially relevant.
We brought together an audience of journalists with four scholars studying this topic from various angles: researcher and reporter Jaime Longoria of the Equity First Vaccination Initiative and the Disinfo Defense League; psychologist Stephan Lewandowsky of the University of Bristol; communications researcher Jiyoung Lee of the University of Alabama; and historian Scott Gabriel Knowles of Korea Advanced Institute of Science and Technology and the COVIDCalls podcast.
A systemic issue like polarization does not have easy fixes, we concluded. This problem will be part of our media reality until forces from policy, society, and tech industries make substantial changes. But newsrooms can be part of those changes, and encourage civil discourse. There are things we can do now.
1) Give your audience agency in your newsroom’s coverage
Several movements in journalism offer strategies for engaging audiences: participatory journalism, human-centered design, and the best practices in transparency encouraged by Trusting News. We discussed many examples of how newsrooms have accomplished this relationship-building. Jaime Longoria talked about his work during the pandemic to partner with trusted messengers in communities of color to spread the word about life-saving public health initiatives.
The Equity First Vaccination Initiative worked in five U.S. cities—Baltimore, Chicago, Houston, Newark, and Oakland—to provide access to vaccines and accurate information about them. The collaboration included the Brown School of Public Health, people such as Longoria with journalism backgrounds, and organizers in communities of color.
Longoria emphasized that working with folks who already had relationships with the communities they sought to inform was key to the initiative’s quick response to the spread of misinformation: “Our work was to make sure the trusted messengers were educated and had the support of a research team behind them.”
Working with trusted messengers is one way to make sure communities receive valid information. But journalists should also work to make themselves trusted messengers, especially at the local level, by incorporating transparency and audience interconnection into their newsroom processes.
Although this effort was not technically journalism, Longoria describes it as one of the most rewarding experiences of his career, one that prompted him to reflect on how journalism practices could improve. “When you are out doing reporting, do people see you as representing your news org or the communities you belong to?” he asked. “If the answer is the former, you may be locked out of … negotiations.”
2) Get ahead of misinformation through “prebunking”
The psychology of misinformation makes it a pernicious problem: “When a statement is corrected, people adjust their belief downward in most circumstances,” Stephan Lewandowsky said. “However, even when people tell us that they no longer believe information X, they still act as though they rely on X.”
This psychological feature is called the continued influence effect. In the “post-truth” world, “basic human cognition will make it difficult to restore a common factual reality,” Lewandowsky said. So it’s critical that we inoculate people against misinformation by warning them about falsehoods before they encounter them, a strategy known as “prebunking.”
The latest edition of the Debunking Handbook, which Lewandowsky contributed to, includes information about prebunking best practices, such as teaching people to recognize the manipulation techniques commonly used in misinformation. Lewandowsky and colleagues recently showed that exposure to prebunking videos could inoculate nearly 1 million YouTube users against misinformation.
Monitoring Google Trends can help some journalists stay aware of when a falsehood is taking off about a topic they cover. However, most journalists, especially those of us in small newsrooms, are overwhelmed by the amount of misinformation that falls in our wheelhouses. Our SRCCON discussion ended in a call for improved networks between scholars and tech employees who monitor misinformation and those who are connected to audiences who need to know about it (see #4, below).
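As an illustration, here is a minimal sketch of the kind of spike detection a newsroom might run on Google Trends interest scores. The weekly scores below are invented, and in practice they would come from a Trends export or a library such as pytrends; the `flag_spike` function and its threshold are assumptions for the sake of the example, not an established tool.

```python
def flag_spike(interest, baseline_weeks=4, threshold=2.0):
    """Flag when the latest interest score jumps well above the
    recent baseline, suggesting a falsehood may be taking off.

    `interest` is a list of Google Trends-style scores (0-100),
    oldest first. Returns True if the newest value looks like a spike.
    """
    if len(interest) < baseline_weeks + 1:
        return False  # not enough history to judge
    baseline = interest[-(baseline_weeks + 1):-1]
    avg = sum(baseline) / len(baseline)
    return avg > 0 and interest[-1] >= threshold * avg

# Hypothetical weekly interest scores for a search term a reporter covers.
quiet = [10, 12, 9, 11, 13]  # steady chatter
surge = [10, 12, 9, 11, 41]  # sudden jump worth investigating

print(flag_spike(quiet))  # False
print(flag_spike(surge))  # True
```

Even a crude alert like this can tell an overwhelmed reporter *when* to look, which is a smaller problem than watching everything all the time.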
3) Slow down viral misinformation and speed up fact-checking
Viral misinformation is a systemic problem that demands solutions from companies and policy makers. Tech companies have introduced a variety of “speed bumps” to sharing viral and potentially false stories, including Twitter’s “Want to Read an Article First?” pop-up or Facebook’s pop-up implemented in 2020 when people shared information about COVID-19. These speed bumps are helpful, but alone are not enough.
Journalists can contribute to curbing this problem by speeding up fact-checking processes to identify misinformation quickly. Jiyoung Lee pointed to the development of AI fact-checking as one tool for this purpose. “AI fact-checking systems have been mostly applied by independent fact-checking orgs,” Lee said. “But journalists do rely on the fact-checking decisions made by these orgs before generating their news stories.” She noted particular contexts where AI fact-checking has been useful, such as during political speeches.
Journalists in the discussion were largely unfamiliar with AI fact-checking tools and understandably wary of a technology that has already been implicated in reinforcing problematic social biases. “Given that decisions made by AI are based on the data set, there are limitations,” Lee said. “In particular, novel misinformation can be difficult to fact-check, given the lack of evidence.” These questions point to a fruitful area for discourse among scholars, tech professionals, and journalists.
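Many automated fact-checking systems begin by matching an incoming claim against a database of claims that human fact-checkers have already rated. The sketch below illustrates that first step with simple word-overlap (Jaccard) similarity; the claims, verdicts, and cutoff are invented for illustration, and real systems use learned text embeddings rather than word overlap. Note how a genuinely novel claim fails to match anything, echoing Lee’s point about the limits of these tools.

```python
def jaccard(a, b):
    """Word-overlap similarity between two claims (0.0 to 1.0)."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb) if wa | wb else 0.0

def match_claim(new_claim, checked_claims, cutoff=0.5):
    """Return the best-matching previously fact-checked claim,
    or None if nothing clears the similarity cutoff."""
    best = max(checked_claims, key=lambda c: jaccard(new_claim, c["text"]))
    return best if jaccard(new_claim, best["text"]) >= cutoff else None

# Invented mini-database of claims an independent fact-checker has rated.
database = [
    {"text": "the vaccine alters human dna", "verdict": "false"},
    {"text": "masks reduce viral transmission indoors", "verdict": "true"},
]

hit = match_claim("the vaccine alters your dna", database)
print(hit["verdict"] if hit else "novel claim, needs human review")   # false
miss = match_claim("5g towers spread the virus", database)
print(miss["verdict"] if miss else "novel claim, needs human review")  # novel claim, needs human review
```

The failure case is the important one: when no prior fact-check matches, the system has no evidence to lean on, and the claim has to go to a human.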
Better networks for making journalists aware of misinformation relevant to their audience will also speed up this process. Journalists can cultivate these networks by collaborating with their audiences (as discussed in #1) and linking up with relevant experts (as discussed below).
4) Build networks between scholars and journalists
Many people in our discussion at SRCCON, as well as in other related discussions at the conference, expressed a sense of being overwhelmed when confronted with polarization and misinformation. There’s a real need for community and camaraderie among those of us who face these problems in our everyday professional lives. To respond to misinformation and polarization with the speed, accuracy, and compassion warranted, journalists and others working on these problems need a coordinated network and community to support our efforts.
Scott Gabriel Knowles called for networks between scholars and journalists. “The past … allows people to look past contemporary political squabbles and see the long trends that shape their lives,” Knowles said. His background studying disasters that unfold slowly, such as climate change, plastic pollution, or viral evolution, informs his position. “These processes move slowly, they have deep histories, and we can’t explain them thoroughly in the midst of the flood or the pandemic, on deadline, when the flow of daily news is too fast and when people are scared and susceptible to disinformation,” he said.
Historian-journalist collaboration has yielded projects like The Washington Post’s Made By History column and Andrew Revkin’s YouTube channel. During the pandemic, Knowles’s podcast COVIDCalls provided daily, livestreamed interviews with scholars processing the pandemic through the lens of their research.
History is not the only field where journalists need to develop networks. Scholars from a variety of fields often have helpful insights about media coverage, while journalists have insights about their audiences’ misconceptions and needs, and about the best ways to impart certain types of information. We need deeper connections between these groups, through workshops, conferences, and other networking and community-building endeavors.
Our SRCCON discussion aimed to identify strategies that journalists can use now to improve our reporting within a polluted and polarizing media system. Many heads are better than one, and we hope that by connecting with audiences, experts, and other journalists, we will feel less alone and more effective at informing the audiences we serve.
Katie L. Burke is an award-winning editor and science writer. As digital features editor at American Scientist, she shares the wonder of science and helps scientists tell their stories. She also serves on the advisory board of the Virginia Tech Center for Communicating Science.
Amanda Yarnell is the Senior Director of Harvard T.H. Chan School of Public Health’s Center for Health Communication. The center defines, teaches, and shares best practice for credibly communicating health information in an increasingly fragmented world. Prior to joining Harvard Chan, Amanda led the editorial, product, and audience engagement teams at Chemical & Engineering News, an award-winning nonprofit science news outlet published by the American Chemical Society. She also serves on the advisory board of Drug Hunter, a platform that empowers, educates, and connects scientists who work in drug discovery.