Foreign and Domestic Computational Propaganda
1/ Earlier today, our teams identified and removed a deepfake video claiming to show President Zelensky issuing a statement he never did. It appeared on a reportedly compromised website and then started showing across the internet.
— Nathaniel Gleicher (@ngleicher) March 16, 2022
Above, Nathaniel Gleicher, head of Security Policy at Meta (Facebook's parent company), announces on Twitter that the company has removed a deepfake video of Ukrainian president Zelensky produced by Russian propagandists.
Click through the link and review the pattern of sentiment among the commenters.
Overview
This chapter examines Computational Propaganda: a form of propaganda employed by foreign and domestic groups that exploits the affordances of social media systems and search algorithms to promote ideological or political messages. Computational propaganda is a form of Cognitive Warfare.
The motives for producing this propaganda vary depending on the entity. A foreign enemy may wish to undermine faith in democratic institutions, foment mistrust in elected officials, compromise democratic common ground, or reduce voter motivation to participate in the democratic process. A domestic enemy may wish to promote an ideology that elevates a dominant social order or to demonize a particular group or individual.
The key principle in this chapter is that this strategy would not be a viable investment for foreign enemies if it did not produce the intended effect. The intended effect, in this case, is a perception of reality that operates as an internal psychological force leading to social and political conflict. As you will see, however, it is difficult to quantifiably measure how Cognitive Warfare impacts the outcome of key events, such as a presidential election.
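To make the exploit described above concrete, the sketch below simulates a naive engagement-ranked feed being gamed by a small coordinated network of inauthentic accounts. Every account count, post, and ranking weight here is a hypothetical illustration, not data or a scoring formula from any real platform.

```python
# Toy illustration (all data hypothetical) of why engagement-ranked
# feeds are exploitable: a ranking that scores posts purely by
# engagement counts cannot tell organic interest apart from a small
# coordinated group of inauthentic accounts acting in unison.

posts = [
    {"id": "news_report", "likes": 120, "shares": 15},
    {"id": "prop_meme",   "likes": 40,  "shares": 5},
]

def engagement_score(post: dict) -> int:
    """Naive engagement signal: shares weighted above likes."""
    return post["likes"] + 3 * post["shares"]

# A coordinated network of 60 bot accounts each likes and shares the
# propaganda post once -- trivial at bot-farm scale.
bots = 60
posts[1]["likes"] += bots
posts[1]["shares"] += bots

ranked = sorted(posts, key=engagement_score, reverse=True)
print([p["id"] for p in ranked])  # the propaganda item now ranks first
```

A few dozen fake interactions are enough to flip the ranking, which is why the readings below treat coordinated inauthentic engagement as the core mechanic of computational propaganda.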
Key Terms
Cognitive Warfare – “Cognitive warfare is an unconventional form of warfare that uses cyber tools to alter enemy cognitive processes, exploit mental biases or reflexive thinking, and provoke thought distortions, influence decision-making and hinder actions, with negative effects, both at the individual and collective levels” (Claverie & du Cluzel, 2022, p. 2).
PSYOPS – An acronym short for Psychological Operations: “Psychological warfare involves the planned use of propaganda and other psychological operations to influence the opinions, emotions, attitudes, and behavior of opposition groups” (RAND Corporation, n.d.). PSYOPS is an umbrella term under which cognitive warfare can be classified.
What should you be focusing on?
Your goal in this chapter is to synthesize how all of the prior topics of study in this book culminate in an operationalized effort by groups and political entities to shape the perception of reality through algorithmically optimized communication systems.
Readings & Media
Thematic narrative in this chapter
In the following readings and media, the authors will present the following themes:
- Social media systems and search algorithms are routinely exploited by ideological and political actors to promote propaganda to targeted audiences.
- Synthetic media will be an ongoing component of computational propaganda.
- Both humans and pre-programmed bots will generate propaganda which will appear across many different device ecologies.
Required “The Cognitive Warfare Concept” (PDF) by Bernard Claverie and François du Cluzel. Innovation Hub Sponsored by NATO Allied Command Transformation, 2022 (10 pages)
This article explains the concept and purpose of cognitive warfare as a state-sponsored strategic operation executed primarily through networked information systems. One definition characterizes cognitive warfare as “conflict short of war,” since it does not involve killing people or taking territory; in practice it is thought of more as a form of “covert operations.”
“The main goal is not to serve as an adjunct to strategy or to defeat an enemy without a fight, but to wage a war on what an enemy community thinks, loves or believes in, by altering perceptions. It is a war on how the enemy thinks, how its minds work, how it sees the world and develops its conceptual thinking. The effects sought are an alteration of worldviews, and thereby affect their peace of mind, certainties, competitiveness and prosperity” (Claverie & du Cluzel, 2022, p. 5).
Required “A Global Inventory of Organized Social Media Manipulation” (PDF) by Samantha Bradshaw and Philip N. Howard. Computational Propaganda Research Project, 2019 (about 20 pages)
This article surveys the range of techniques used to promote foreign and domestic propaganda and disinformation through social media. Focus your reading on the sections of the article that describe how this work is conducted.
Required Video: “Disinformation Technology: How Online Propaganda Campaigns Are Influencing Us | Renee DiResta” March 11, 2018, Long Now Foundation. (9:00 min)
Renee DiResta is a renowned scholar in the field of disinformation and propaganda studies. As a data scientist, she demonstrates how information flows through social media via a variety of manipulation techniques. In the lecture segment below, DiResta describes how social media systems are used as efficient vehicles for computational propaganda.
“As deepfakes become more prevalent, as the content that is manipulatable changes, the adversary grabs onto the most bleeding edge thing recognizing that that’s where the least defense is, that those fronts are not really well defended, not well-monitored” (18:48).
You may wish to use the speed controls in the YouTube player to play back at a faster speed, if needed. Hover over the video and click the gear button on the lower right corner, then change the playback speed – try it at 1.25.
The required section in this video starts at 14:11 and ends at 23:13.
Required “The Supply of Disinformation Will Soon Be Infinite” by Renée DiResta, The Atlantic, September 20, 2020 (12 pages)
This article describes the ecosystem of fake authors and AI-generated Internet sources that clutter the information landscape with propaganda and other conflicting messages designed to obfuscate reliable information.
“Disinformation campaigns used to require a lot of human effort, but artificial intelligence will take them to a whole new level.”
Note that when OpenAI released GPT-2 in 2019, it stated the potential risk inherent in an AI text generation system that achieves human-like authenticity. See Section 2, “GPT-2 can be fine-tuned for misuse.”
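To illustrate how low the barrier has become, the sketch below generates fluent text with the publicly released GPT-2 model via the Hugging Face transformers library. The prompt and tooling are illustrative assumptions on my part, not anything prescribed by DiResta's article or by OpenAI.

```python
# A minimal sketch of machine-generated text using the publicly
# available GPT-2 model through the Hugging Face `transformers`
# library. The prompt is a hypothetical example, not taken from
# any real campaign.
from transformers import pipeline, set_seed

generator = pipeline("text-generation", model="gpt2")
set_seed(42)  # make the illustration reproducible

prompt = "Voters in the region are increasingly concerned that"
outputs = generator(prompt, max_length=60, num_return_sequences=3)

# Each result is fluent, on-topic text produced in seconds --
# the scaling property behind an "infinite" supply of disinformation.
for i, out in enumerate(outputs, 1):
    print(f"--- sample {i} ---")
    print(out["generated_text"])
```

One commodity laptop can produce thousands of such variants per hour, which is the economic shift the article describes: the human-effort bottleneck disappears.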
Required “Neutral bots probe political bias on social media” by Wen Chen, Diogo Pacheco, Kai-Cheng Yang, and Filippo Menczer, Nature Communications, September 22, 2021 (about 20 pages)
This article describes a research study that measures the “political drift” that occurs in Twitter content according to the algorithmic feed each account receives, including the influence of bot accounts. The value of this research lies in understanding how a political ecosystem can form around a given user account and why certain content proves more persuasive than other content.
As with typical research papers of this kind, focus your attention on the Introduction and Discussion sections.
“…not only do users exchange huge amounts of information [on social media] with large numbers of others via many hidden mechanisms, but these interactions can be manipulated overtly and covertly by legitimate influencers as well as inauthentic, adversarial actors who are motivated to influence opinions or radicalize behaviors” (Huston and Bahm, 2020, p. 1).
In contrast, review the Abstract from this research study: Exposure to the Russian Internet Research Agency foreign influence campaign on Twitter in the 2016 US election and its relationship to attitudes and voting behavior. Essentially, this study could not determine the extent of any effect of foreign influence on the outcome of the 2016 presidential election. Another study suggests only indirect links between Russian interference and the prospective outcome of the election: Reduced trolling on Russian holidays and daily US Presidential election odds.
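To make the neutral-bot (“drifter”) design from the Chen et al. study concrete, here is a toy simulation of one bot whose feed lean is scored day by day. The lean scale, the amplification factor, and every number below are hypothetical modeling assumptions for illustration, not values or methods taken from the paper.

```python
import random
from statistics import mean

# Toy model in the spirit of Chen et al. (2021): a neutral "drifter"
# bot starts from one seed account; each day we score the political
# lean of the feed it is shown and track how its content ecosystem
# shifts over time. All parameters are hypothetical assumptions.

AMPLIFY = 1.05  # assumed echo-chamber factor: feeds skew slightly
                # more partisan than the account's current position

def daily_feed(lean: float, n_posts: int = 50) -> list[float]:
    """Simulate one day's feed: post leans (-1 left ... +1 right)
    clustered around an amplified version of the bot's lean."""
    return [random.gauss(lean * AMPLIFY, 0.2) for _ in range(n_posts)]

def run_drifter(seed_lean: float, days: int = 30, pull: float = 0.2) -> list[float]:
    """Track the bot's ecosystem lean as recommendations pull it
    toward the average lean of the content it was exposed to."""
    lean, trajectory = seed_lean, []
    for _ in range(days):
        exposure = mean(daily_feed(lean))
        lean += pull * (exposure - lean)  # drift toward what it sees
        trajectory.append(round(lean, 3))
    return trajectory

random.seed(7)
# A bot seeded with a mildly right-leaning account drifts further
# right over the month; a mirror run seeded at -0.3 drifts left.
print(run_drifter(0.3))
```

The point of the sketch is the measurement design, not the specific numbers: because the bots themselves are neutral, any drift in the trajectory can be attributed to the platform's feed rather than to the bot's own behavior.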
Required “Political Communication, Computational Propaganda, and Autonomous Agents” (PDF) by Samuel C. Woolley and Philip Howard, International Journal of Communication, 2016 (7 pages)
This article is a preface to a Special Section in the International Journal of Communication (Volume 10, 2016). It references the contemporary ecology of mediated communications and how algorithmic systems on social media are leveraged to achieve message penetration:
“The bulk of digital communications are no longer between people but between devices…” (Woolley and Howard, 2016, p. 4882).
This article mentions a series of additional articles in the same journal issue that elaborate on the principles presented here. You may wish to explore these Special Section articles further.
Optional: “What’s behind the “All Eyes on Rafah” image that went viral on social media” by Ivana Saric in Axios.com, May 30, 2024.
This article describes an episode from the 2024 Israel-Palestine conflict in which AI-generated images showing destruction in Rafah spread widely on social media.
Optional: “Social media goes to war” by Mark Scott and Rebecca Kern in Politico.com, March 2022.
This article describes specific action taken between governments and social media companies in response to misinformation and propaganda campaigns.
Optional: “How I Built an AI-Powered, Self-Running Propaganda Machine for $105” by Jack Brewster, The Wall Street Journal, April 12, 2024.
Optional: “Meta Adversarial Threat Report 2023” published by Meta, the parent corporation of Facebook, November 2023.
This report describes the extent of Coordinated Inauthentic Behavior (CIB) propagated by foreign governments using Facebook to reach American audiences. Read the brief Summary of Findings.
Optional: “Synthetic and manipulated media policy” published by X (formerly Twitter), April 2023.
This document describes X’s policy regarding users posting synthetic media designed to mislead or confuse viewers.
References
Claverie, B., & du Cluzel, F. (2022). The Cognitive Warfare Concept. Innovation Hub Sponsored by NATO Allied Command Transformation, 2022-02.