Main Body
Foreign and Domestic Computational Propaganda
https://twitter.com/ngleicher/status/1504186935291506693?ref_src=twsrc%5Etfw
Above, Nathaniel Gleicher, head of Security Policy at Facebook, announces on Twitter that Facebook has removed a deepfake video of Ukrainian President Zelensky produced by Russian propagandists.
If you happen to own a Twitter/X account, click through the link and review the pattern of comments to get a sense of their sentiment.
Overview
Up to this point in our studies, we have examined the presence of intermediary technologies and the nature of our relationships with them. As new technologies proliferate into common usage, a predictably intense effort is made by bad actors (and some with the best of intentions) to exploit the affordances of these systems for a variety of social or political ends. When photography, radio, and motion picture film emerged, they were quickly employed to deliver propaganda, often to audiences that had not yet developed the literacy to recognize the creators’ manipulative tactics to distort reality.
The question we encounter in this module’s study goes beyond identifying how social media systems and search engines can be exploited to spread propaganda – we know that this has been going on in various forms for a long time. The more important question is how well it works. Even if the most realistic-looking deepfake video were shared all over social media, would it make a difference? If not, why do regimes all over the world continue to produce it?
This chapter examines Computational Propaganda: a form of propaganda employed by foreign and domestic groups that exploits the affordances of AI, social media systems, and search algorithms to promote ideological or political messages. Computational propaganda is a form of Cognitive Warfare.
The motives for producing this propaganda vary depending on the entity. A foreign enemy may wish to undermine faith in democratic institutions, foment mistrust in elected officials, compromise democratic common ground, or reduce voter motivation to participate in the democratic process. A domestic enemy may wish to promote an ideology that elevates a dominant social order or to demonize a particular group or individual.
The intended long-term effect of computational propaganda is to produce a perception of reality that operates as an internal psychological force, leading to social and political conflict. As you will see, however, it is difficult to quantifiably measure how cognitive warfare affects the outcome of key events, such as a presidential election.
Key Terms
Cognitive Warfare – “Cognitive warfare is an unconventional form of warfare that uses cyber tools to alter enemy cognitive processes, exploit mental biases or reflexive thinking, and provoke thought distortions, influence decision-making and hinder actions, with negative effects, both at the individual and collective levels” (Claverie & du Cluzel, 2022, p. 2).
Manufacturing Consensus – The practice of using a collection of automated bots on social media to promote the illusion of consensus about an issue. The collective effect of this practice also drowns out other views through the sheer quantity of content published by bots (Howard, 2018, p. 4). This term is often associated with astroturfing, which is the intentional effort to create the illusion of grassroots support for an issue or candidate.
PSYOPS – An acronym for Psychological Operations: “Psychological warfare involves the planned use of propaganda and other psychological operations to influence the opinions, emotions, attitudes, and behavior of opposition groups” (RAND Corporation, n.d.). PSYOPS is considered an umbrella term under which cognitive warfare is classified.
What should you be focusing on?
Your goal in this chapter is to synthesize how all of the prior topics of study in this book culminate in an operationalized effort by entities to shape the perception of reality through algorithmically optimized communication systems.
Readings & Media
Note that APA formatted references for all of the articles below are listed at the bottom of this chapter for use in your posts and paper.
Thematic narrative in this chapter
In the following readings and media, the authors will present the following themes:
- Social media systems and search algorithms are routinely exploited by ideological and political actors to promote propaganda to targeted audiences.
- Synthetic media will be an ongoing component of computational propaganda.
- Both humans and pre-programmed bots will generate propaganda which will appear across many different device ecologies.
Required “The Cognitive Warfare Concept” (PDF) by Bernard Claverie and François du Cluzel, Innovation Hub Sponsored by NATO Allied Command Transformation, 2022 (10 pages)
This article explains the concept and purpose of cognitive warfare as a state-sponsored strategic operation executed primarily through networked information systems. Generally, cognitive warfare is expressed here as covert operations, though one definition characterizes cognitive warfare as “conflict short of war.”
“The main goal is not to serve as an adjunct to strategy or to defeat an enemy without a fight, but to wage a war on what an enemy community thinks, loves, or believes in by altering perceptions. It is a war on how the enemy thinks, how its minds work, how it sees the world and develops its conceptual thinking. The effects sought are an alteration of worldviews, and thereby affect their peace of mind, certainties, competitiveness and prosperity” (Claverie & du Cluzel, 2022, p. 5).
Required Video: “Disinformation Technology: How Online Propaganda Campaigns Are Influencing Us | Renee DiResta” March 11, 2018, Long Now Foundation. (9:00 min)
Renee DiResta is a renowned scholar in the field of disinformation and propaganda studies. As a data scientist, she demonstrates how information flows through social media according to various techniques. In the lecture segment below, DiResta describes how social media systems are used as efficient vehicles for computational propaganda.
“As deepfakes become more prevalent, as the content that is manipulatable changes, the adversary grabs onto the most bleeding edge thing, recognizing that that’s where the least defense is, that those fronts are not really well defended, not well-monitored” (18:48).
You may wish to use the speed controls in the YouTube player to play back at a faster speed, if needed. Hover over the video and click the gear button on the lower right corner, then change the playback speed – try it at 1.25.
The required section in this video starts at 14:11 and ends at 23:13.
Required “How foreign operations are manipulating social media to influence your views” by Filippo Menczer, The Conversation, October 8, 2024 (4 pages)
This article was produced by the Observatory on Social Media, a scholarly institute at Indiana University that tracks influence campaigns. It describes the techniques used to flood social media systems with content and manipulate promotional algorithms to elevate propaganda to wider audiences.
The article references a research study that describes how malicious actors amplify and distribute disinformation across the social media landscape: Quantifying the vulnerabilities of the online public square to adversarial manipulation tactics. (Read the Conclusion section on page 8).
Similarly, Chen et al. (2021) describe how political ecosystems form around user accounts: “Neutral bots probe political bias on social media.” Focus your attention on the Introduction and Discussion.
“…not only do users exchange huge amounts of information [on social media] with large numbers of others via many hidden mechanisms, but these interactions can be manipulated overtly and covertly by legitimate influencers as well as inauthentic, adversarial actors who are motivated to influence opinions or radicalize behaviors” (Huston & Bahm, 2020, p. 1).
Contrary Evidence
The articles below dispute the claim that propaganda on social media has any significant effect on swaying election results. While elections do reflect the collective reality of a population, many factors contribute to a person’s decision to vote a particular way, which may explain why computational propaganda is less effective at swaying elections.
It is important to ask, however, whether the findings of these studies related to election outcomes transfer similarly to popular sentiment about other issues. For example, if studies suggest that computational propaganda does not significantly affect voting behavior, can the same be said about the use of propaganda to influence popular sentiment about transgender people, immigrants, climate change, or privately owned health insurance companies?
Required
Note that APA formatted references for these articles are listed at the bottom of this chapter for use in your posts and paper.
- In contrast to the prior readings, MIT Technology Review’s 2024 article “AI’s impact on elections is being overblown” proposes reasons why AI-generated content had little effect on election outcomes.
- In this study, “Do (Microtargeted) Deepfakes Have Real Effects on Political Attitudes?” review the Abstract, then read the sub-section “Should we worry about (microtargeted) deepfakes?”
- Review the Abstract from the Eady, G., et al. (2023) research study: Exposure to the Russian Internet Research Agency foreign influence campaign on Twitter in the 2016 US election and its relationship to attitudes and voting behavior.
- Read the Discussion section toward the bottom of the research study by Almond et al. (2022), which makes indirect inferences about the effect of Russian interference: Reduced trolling on Russian holidays and daily US Presidential election odds.
Optional: “What’s behind the “All Eyes on Rafah” image that went viral on social media” by Ivana Saric in Axios.com, May 30, 2024.
This article describes how, during the 2024 Israel-Palestine conflict, AI-generated images showing destruction in Rafah appeared on social media.
Optional: “Social media goes to war” by Mark Scott and Rebecca Kern in Politico.com, March 2022.
This article describes specific action taken between governments and social media companies in response to misinformation and propaganda campaigns.
Optional: “How I Built an AI-Powered, Self-Running Propaganda Machine for $105” by Jack Brewster, The Wall Street Journal, April 12, 2024.
Optional: “Meta Adversarial Threat Report 2023” published by Meta, the parent corporation of Facebook, November 2023. This report describes the extent of Coordinated Inauthentic Behavior (CIB) propagated by foreign governments using Facebook to reach American audiences. Read the brief Summary of Findings.
Optional: “Synthetic and manipulated media policy” published by X (formerly Twitter), April 2023. This document describes X’s policy regarding users posting synthetic media designed to mislead or confuse viewers.
References
Almond, D., Du, X., & Vogel, A. (2022). Reduced trolling on Russian holidays and daily US Presidential election odds. PLoS ONE, 17(3), e0264507. https://doi.org/10.1371/journal.pone.0264507
Bradshaw, S., & Howard, P. N. (2019). The global disinformation order: 2019 global inventory of organised social media manipulation. Global Inventory of Organised Social Media Manipulation. Working Paper 2019.2. Oxford, UK: Project on Computational Propaganda.
Chen, W., Pacheco, D., Yang, K.-C., et al. (2021). Neutral bots probe political bias on social media. Nature Communications, 12, 5580. https://doi.org/10.1038/s41467-021-25738-6
Claverie, B., & du Cluzel, F. (2022). The Cognitive Warfare Concept. Innovation Hub Sponsored by NATO Allied Command Transformation, 2022-02.
Dobber, T., Metoui, N., Trilling, D., Helberger, N., & de Vreese, C. (2021). Do (microtargeted) deepfakes have real effects on political attitudes? The International Journal of Press/Politics, 26(1), 69–91.
Eady, G., Paskhalis, T., Zilinsky, J., et al. (2023). Exposure to the Russian Internet Research Agency foreign influence campaign on Twitter in the 2016 US election and its relationship to attitudes and voting behavior. Nature Communications, 14, 62. https://doi.org/10.1038/s41467-022-35576-9
Howard, P., & Woolley, S. (2018). Computational propaganda: Political parties, politicians, and political manipulation on social media. Oxford University Press.
Simon, F. M., McBride, K., & Altay, S. (2024). AI’s impact on elections is being overblown. MIT Technology Review.