Longterm Wiki

Credibility Rating

4/5 — High

High quality. Established institution or organization with editorial oversight and accountability.

Rating inherited from publication venue: Carnegie Endowment

Though mislabeled as a WEF report, this is a Carnegie Endowment policy guide on disinformation countermeasures. It is relevant to AI safety discussions around AI-generated misinformation, content moderation policy, and governance of information ecosystems.

Metadata

Importance: 42/100 · policy brief · analysis

Summary

This Carnegie Endowment report provides an evidence-based policy guide for countering disinformation, synthesizing research on what interventions actually work. It evaluates a range of strategies—from platform regulation to media literacy—and offers actionable recommendations for policymakers seeking to address information integrity threats.

Key Points

  • Reviews empirical evidence on the effectiveness of various anti-disinformation interventions, distinguishing proven approaches from unproven ones.
  • Covers platform-level, government, and civil society responses to disinformation, including content moderation, labeling, and media literacy programs.
  • Emphasizes evidence-based policymaking over reactive or politically motivated responses to information threats.
  • Highlights risks of over-correction, such as censorship or chilling effects on legitimate speech when countering disinformation.
  • Relevant to AI governance as AI-generated content and synthetic media increasingly intersect with disinformation challenges.

Cited by 1 page

Page | Type | Quality
AI-Era Epistemic Infrastructure | Approach | 59.0

Cached Content Preview

HTTP 200 · Fetched Apr 9, 2026 · 98 KB
Countering Disinformation Effectively: An Evidence-Based Policy Guide | Carnegie Endowment for International Peace

 A high-level, evidence-informed guide to some of the major proposals for how democratic governments, platforms, and others can counter disinformation.

 By Jon Bateman and Dean Jackson | Published on Jan 31, 2024

 Table of Contents

 00 Methodology 
 01 Challenges and Cautions 
 02 Case Study 1: Supporting Local Journalism 
 03 Case Study 2: Media Literacy Education 
 04 Case Study 3: Fact-Checking 
 05 Case Study 4: Labeling Social Media Content 
 06 Case Study 5: Counter-messaging Strategies 
 07 Case Study 6: Cybersecurity for Elections and Campaigns 
 08 Case Study 7: Statecraft, Deterrence, and Disruption 
 09 Case Study 8: Removing Inauthentic Asset Networks 
 10 Case Study 9: Reducing Data Collection and Targeted Ads 
 11 Case Study 10: Changing Recommendation Algorithms 
 12 Looking Ahead: Generative AI 
 Additional Links

 Full Text (PDF) | Key Takeaways (PDF)

 Program: Technology and International Affairs

 The Technology and International Affairs Program develops insights to address the governance challenges and large-scale risks of new technologies. Our experts identify actionable best practices and incentives for industry and government leaders on artificial intelligence, cyber threats, cloud security, countering influence operations, reducing the risk of biotechnologies, and ensuring global digital inclusion.

 Project: Partnership for Countering Influence Operations

 The goal of the Partnership for Countering Influence Operations (PCIO) is to foster evidence-based policymaking to counter threats in the information environment. Key roadblocks identified in our work include the lack of: transparency reporting to show what data is available for research purposes; rules guiding how data can be shared with researchers and for what purposes; and an international mechanism for fostering research collaboration at scale.

 Project: Information Environment Project

 Carnegie’s Information Environment Project is a multistakeholder effort to help policymakers understand the information environment, think through the impact of efforts to govern it, and identify promising interventions to foster democracy.

 Summary

 Disinformation is widely seen as a pressing challenge for democracies worldwide. Many policymakers are grasping for quick, effective ways to dissuade people from adopting and spreading false beliefs that degrade democratic discourse and can inspire violent or dangerous actions. Yet disinformation has proven difficult to define, understand, and measure, let alone address.

 Even when leaders know what they want to achieve in countering disinformation, they struggle to make an impact and often don’t realize how little is known about the effectiveness of policies commonly recommended by experts. Pol

... (truncated, 98 KB total)
Resource ID: d25a731963a5a372 | Stable ID: sid_qNZTl1s7X7