Disinformation Day 2022 Considers Pressing Need for Cross-sector Collaboration and New Tools for Fact Checkers

November 9, 2022

October 26, 2022 marked the first annual Disinformation Day hosted by Good Systems’ “Designing Responsible AI Technologies to Curb Disinformation” research team. Approximately 150 attendees from across the globe came together virtually to discuss challenges and opportunities in curbing the spread of digital disinformation. Thought leaders representing a range of disciplines and sectors examined the needs of fact checkers, explored issues of bias, fairness, and justice in mis- and disinformation, and outlined next steps for addressing these pressing issues together.

The event began with remarks from Dr. Dhiraj Murthy (School of Journalism and Media) and Dr. Matthew Lease (School of Information), who co-lead the Good Systems project. “This inaugural Disinfo Day event serves as a call to action around two key themes in mis-/disinformation studies: fact checking and bias and fairness in annotation and labeled data,” said Murthy.

Dr. David Corney, data scientist and engineer at Full Fact, an organization that examines claims in the news and asks media outlets to publish corrections, delivered a keynote on “How AI Can Help Fact Checkers Fight Bad Information.” In the address, Corney emphasized that “Bad information ruins lives” and that it is important for society that everyone receives the best information possible. One way to approach the sheer volume of claims, he suggested, is to ask, “What is the most important claim to fact check each day?” and “How can we increase the impact of our work (reach as many people as possible with positive impact)?” He shared that finding claims to be true can be just as useful as finding claims to be false in helping the public navigate the massive amount of information around politics, pandemics, and more. Looking ahead to new tool development, Corney urged, “It’s essential to speak to users at every stage of development.”


Building upon the discussion, Dr. Greg Durrett (Computer Science) moderated the panel “New Frontiers in Tools for Fact Checkers,” featuring Corney along with Dr. Scott Hale and Shalini Joshi of Meedan, a global technology nonprofit that builds tools to help fact checkers and to increase equitable access to quality information while reducing harms like hate speech and misinformation. Durrett noted that while natural language processing tools such as large neural networks have progressed, they haven’t necessarily advanced in alignment with what fact checkers really need. He asked the panel, “What are tools out there that can be useful? And how can we better meet the needs of folks that are working in fact-checking?” Panelists discussed challenges including an ever-changing media landscape and the need for real-time data; a gap between academic researchers and practitioners; expanding tools and models into more languages; a lack of tools for effectively coding images, memes, and videos; and building forums where coalitions of fact checkers, academic researchers, lawyers, and stakeholders can tackle urgent challenges together. All panelists underscored that collaboration is key to making the progress we collectively wish to see. “…there is a need for people developing tools and products to work together with fact checkers to help them be more efficient and engage with their audiences, and for researchers to do so as well,” said Joshi. “We need fact checkers, researchers, developers, and product people coming together to look at the challenges and find solutions.”

Moving into the final panel of the day, Dr. Maria De-Arteaga (Information, Risk, and Operations Management) led a discussion on “Bias/Fairness in Dis-/Misinformation Studies” featuring Angie Holan, editor-in-chief of the fact-checking news outlet PolitiFact, and researchers Dr. Rachel Moran and Dr. Sukrit Venkatagiri of the University of Washington’s Center for an Informed Public. De-Arteaga asked panelists to consider, “What does it mean to think about risks of bias?” and “What does it mean to think about fairness in the context of misinformation detection?” Panelists agreed that a commitment to accuracy is essential to mitigating personal biases, as is including multiple, diverse perspectives in news stories and fact-checking processes. Venkatagiri prompted the audience to consider shifting to a justice-oriented perspective, focusing not only on how many people are impacted by misinformation but on who is most affected and most needs help, given the power dynamics at play. While panelists acknowledged challenges such as overcoming widespread mistrust of science and of academic work related to elections, deeply embedded biases, and the need to better measure the disproportionate impact misinformation has on different populations, all remained optimistic about the future, citing new work and initiatives nationwide dedicated to curbing the spread of mis- and disinformation. “There has been so much progress on fighting misinformation over the past several years that I’m actually quite hopeful,” said Holan. “…we are seeing, across society, concern about misinformation and how to have positive standards of information on the internet and on social media. Even though things seem somewhat gloomy and there’s a lot of misinformation out there, I actually feel optimistic that awareness has increased exponentially and we’re starting to see these various innovative projects trying out new solutions.”

Concluding the event, Dr. Josephine Lukito (School of Journalism and Media) summarized key takeaways and calls to action. She reiterated the call for collaboration, a theme that came up in every session, and for more spaces like the one created by Disinformation Day, both virtual and in person, where people can engage across disciplines and across academia, industry, and civil society to help curb the spread of mis- and disinformation.

Lukito also called for participants to come together to address needs identified throughout the discussion, including:

  • Building generalizable and equity- and justice-driven fact-checking tools
  • More fact-checking resources in languages other than English
  • More fact-checking resources for video and multimedia
  • More cross-platform disinformation detection
  • More tools to address misinformation that spreads via ephemeral content (like livestreams that disappear or get deleted)
  • Resources that benefit the public at large (“Don't let things sit in academic papers.”) 
    • Media literacy and public engagement resources to ensure that academic research is accessible to the public
  • Increased collaboration between academic researchers, fact checkers, industry professionals, lawyers, policymakers, and nonprofit organizations

Visit the Disinformation Day website to learn more, stay connected, and view recordings from the event. Use the hashtag #DisinfoDay to continue the conversation on social media.


Resources:

Discover resources resulting from and shared during Disinformation Day 2022.

Disinformation Day Video Playlist

Policy Brief: “Moderating Social Media Discourse for a Healthy Democracy”

Authored by Dr. Josephine Lukito, Kathryn Kazanas, and Bin Chen as part of the “Designing Responsible AI Technologies to Curb Disinformation” Good Systems research project.  

Nakov, Preslav, David Corney, Maram Hasanain, Firoj Alam, Tamer Elsayed, Alberto Barrón-Cedeño, Paolo Papotti, Shaden Shaar, and Giovanni Da San Martino. “Automated Fact-Checking for Assisting Human Fact-Checkers.” arXiv:2103.07769. March 13, 2021.

#FactsFirstPH

“Meedan launches collaborative effort to address misinformation on WhatsApp during Brazil’s presidential election”