2025/06/01 Andreas Jungherr

New preprint: Political Disinformation: ‘Fake News’, Bots, and Deep Fakes

I’m happy to share that my preprint Political Disinformation: ‘Fake News’, Bots, and Deep Fakes is now available. It is forthcoming in the *Oxford Research Encyclopedia of Communication*.

You can read the preprint here: Political Disinformation: ‘Fake News’, Bots, and Deep Fakes

The piece is an attempt to take a step back from the often alarmist public discourse on disinformation and look more carefully at what we actually know. Concerns about political disinformation—whether in the form of fake news, bots, deepfakes, or foreign interference—have become increasingly prominent, especially since 2016. But these concerns don’t always align with what empirical research tells us.

In the article, I try to do three things:

  • Clarify what we’re talking about: I revisit some of the definitional challenges around disinformation and related terms. A central point is that political speech is often contested, interpretive, and not easily reduced to true vs. false categories.
  • Summarize the current evidence: Drawing on a broad range of studies, I look at what we know about the actual reach and effects of disinformation. While the issue is real, the available data suggest that its impact is more limited than many fear.
  • Reflect on how we respond: Efforts to fight disinformation can come with their own risks, especially if they concentrate power over speech or suppress disagreement. I argue for responses that are proportionate, evidence-based, and attentive to democratic openness.

The piece is meant as a contribution to ongoing conversations across research, policy, and civil society. It doesn’t dismiss the challenges posed by disinformation, but it does suggest we might better address them by being more precise in our terms, more rigorous in our methods, and more careful about the trade-offs involved.

    Abstract: Political disinformation has become a central concern in both public discourse and scholarly inquiry, particularly following the electoral successes of right-wing populist movements in Western democracies since 2016. These events have been widely interpreted through the lens of manipulation, deviance, and disruption, all closely linked to the spread of false information, automated amplification, and foreign interference in digital communication environments. More recently, advances in Artificial Intelligence have added to anxieties about the growing sophistication and scale of disinformation campaigns. Collectively, these concerns are often framed under the concept of digital disinformation and viewed as posing existential threats to democratic systems. This entry provides a comprehensive and critical overview of the disinformation debate. It traces the definitional and conceptual challenges inherent in the term “disinformation,” highlights how digital infrastructures shape both the problem and its perceived urgency, and synthesizes empirical evidence on the actual reach, distribution, and impact of political disinformation. The article distinguishes between individual, collective, and discursive harms, while cautioning against inflated threat narratives that outpace empirical findings. Importantly, the entry addresses the risks of regulatory overreach and centralized control. Efforts to counter disinformation may themselves undermine democratic openness, suppress dissent, and weaken societies’ capacity for collective information processing. In response, the article outlines a research agenda that prioritizes conceptual clarity, empirical rigor, and systemic analysis over alarmism. It advocates for a shift away from overstretching framings of disinformation toward more precise and differentiated understandings of digital political communication and its challenges.

  • Andreas Jungherr (2025). Political Disinformation: “Fake News”, Bots, and Deep Fakes. In Oxford Research Encyclopedia of Communication. Oxford: Oxford University Press. (Forthcoming).