An image from a now-deleted Facebook page run by a Russian-paid Ghanaian troll, touting alleged police targeting of African Americans. (Photo by CNN)[1]
Introduction
Within the last decade, the world has witnessed authoritarian regimes increasingly using social media to advance their global interests. One example is Russia’s exploitation of ethnic Russians living in Ukraine to set conditions for its 2014 annexation of Crimea. Playing on existing biases and aspirations for an autonomous Crimea, Russia gained control of the disputed territory. The annexation of Crimea is just one example of how a state can exploit biases to support its strategic interests.
The biggest threat to a state’s national security is a global competitor’s exploitation of its population’s biases. Cognitive biases are evident in everyday life, for example, in sports, politics, and socioeconomic discrimination.[2] Both large companies and adversaries use algorithms to track our online behavior, which is shaped by our biases: what we search for, ‘like,’ and view. Additionally, adversaries leverage other tools, like social media bots and trolls, to mislead populations by playing on these existing biases. Unfortunately, the average person is unaware of this digital intrusion and manipulation, which makes them an unwitting participant in enemy exploitation.
Cognitive Biases
Misinformation has become the tool of choice for exploiting people’s biases through social media. The average citizen will intentionally or unintentionally encounter misinformation and may reshare it if it supports their personal preferences. This resharing can gain traction and spread quickly among populations with shared biases. According to research by the Observatory on Social Media at Indiana University, there are three types of biases that make social media vulnerable to misinformation.[3]
Bias in the Brain: Cognitive biases originate in the way the brain processes the information that every person encounters every day. The brain can deal with only a finite amount of information, and too many incoming stimuli can cause information overload. That in itself has serious implications for the quality of information on social media. The researchers found that steep competition for users’ limited attention means that some ideas go viral despite their low quality, even when people prefer to share high-quality content.[4]
Bias in Society: When people connect directly with their peers, the social biases that guide their selection of friends come to influence the information they see. The researchers’ analysis of the structure of these partisan communication networks found that social networks are particularly efficient at disseminating information, accurate or not, when they are closely tied together and disconnected from other parts of society. The tendency to evaluate information more favorably when it comes from within one’s own social circle creates “echo chambers” that are ripe for manipulation, whether conscious or unintentional.[5]
Bias in the Machine: The third group of biases arises directly from the algorithms used to determine what people see online. Both social media platforms and search engines employ them. These personalization technologies are designed to select only the most engaging and relevant content for each individual user, but in doing so, they may end up reinforcing the cognitive and social biases of users, thus making them even more vulnerable to manipulation. For instance, the detailed advertising tools built into many social media platforms let disinformation campaigners exploit confirmation bias by tailoring messages to people who are already inclined to believe them.[6]
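Taken together, the three biases form a feedback loop: limited attention constrains what a user can see, personalization ranks stance-congruent content highest, and each engagement further narrows the ranking. The minimal Python sketch below illustrates that loop under stated assumptions; the stances, engagement model, and parameters are illustrative inventions, not any platform’s actual algorithm.

import random

random.seed(42)

NUM_ITEMS = 200     # items competing for attention each round
FEED_SIZE = 10      # "bias in the brain": attention is limited to 10 items
ROUNDS = 50

# Each hypothetical item has a political "stance" in [-1, 1] and an
# independent "quality" in [0, 1]; the two are unrelated by design.
items = [{"stance": random.uniform(-1, 1), "quality": random.random()}
         for _ in range(NUM_ITEMS)]

user_stance = 0.2         # the user starts only mildly partisan
ranker_preference = 0.0   # what the ranker has inferred from past engagement

def predicted_engagement(item):
    # "Bias in the machine": score items by closeness to the inferred preference.
    return 1.0 - abs(item["stance"] - ranker_preference)

feed = []
for _ in range(ROUNDS):
    # Personalization: of a random slice of content, show only the top items.
    feed = sorted(random.sample(items, 50), key=predicted_engagement,
                  reverse=True)[:FEED_SIZE]
    for item in feed:
        # Confirmation bias: engage with stance-congruent items, quality aside.
        if abs(item["stance"] - user_stance) < 0.3 and random.random() < 0.8:
            # Feedback loop: each engagement pulls the ranker's model, and
            # slightly the user, toward the engaged stance.
            ranker_preference += 0.1 * (item["stance"] - ranker_preference)
            user_stance += 0.02 * (item["stance"] - user_stance)

avg_quality = sum(i["quality"] for i in feed) / FEED_SIZE
print(f"user stance drifted to {user_stance:+.2f}, "
      f"ranker preference {ranker_preference:+.2f}, "
      f"avg quality of final feed {avg_quality:.2f}")

In this toy model, the items the user ends up seeing are selected for stance agreement rather than quality, which is the mechanism the researchers describe: low-quality content can dominate a feed as long as it confirms the user’s bias.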
The combination of these biases provides adversaries a means to exploit populations, reinforcing adversary narratives and setting conditions for their strategic interests. Moreover, once these messages gain traction and go viral, they can overpower competing narratives, creating a false impression of consensus in the information environment. This can increase support for an adversary's interests because the message aligns with the population's biases, as seen during Russia's annexation of Crimea.
Unwitting Collaborators
Adversaries like Russia are deliberately identifying ideological fault lines they can exploit. According to Kate Starbird, associate professor at the University of Washington in Seattle, “...those behind disinformation campaigns purposely entangle orchestrated action with organic activity. Audiences become willing but unwitting collaborators, helping to achieve campaigners’ goals.”[7] Once these messages are intertwined with an existing bias of a social circle, those within that circle unwittingly spread the word in overwhelming numbers.
However, Russia isn’t the only state targeting others; for the past two decades, China has been targeting developing countries to exploit economic opportunities. For example, in Latin America, China is the second-largest trading partner of Colombia and Peru.[8] These economic partnerships in developing countries also contribute to a positive Chinese image in the region and open doors to exploitation in other areas. Although not all of it is positive, Figure 1 provides a snapshot of perceptions of China throughout Latin America, collected from the region’s top five online news outlets.
Figure 1. Positive news in Latin America’s online media outlets on the topic of China, 2014.[9]
This positive news about China is conditioning local biases for future economic opportunities. For example, Chinese companies can potentially exploit local landowners and real estate companies to expand their footprint in the Western Hemisphere. Local farmers or companies may believe Chinese investments will benefit them, based on biases formed by the media. Hence, local societies become unwitting collaborators in Chinese strategic interests.
The Ongoing Threat
Recent history has proven that cognitive biases are the biggest vulnerability for any state. Through sophisticated information campaigns, adversaries can manipulate societies to meet their strategic interests. Consequently, pervasive access to online information has divided populations along lines of bias, providing adversaries an opportunity to exploit. These divisions have created “echo chambers” within communities that unwittingly continue to spread false information.[10]
States like Russia and China also have the resources to flood the information environment and exploit these echo chambers. They continuously use bots and paid trolls around the world to seed misinformation into targeted communities. However, it is unwitting family members, co-workers, and politicians who continue to amplify these messages.[11] Once unwitting collaborators make these messages go viral, the trust and legitimacy those collaborators carry make the misinformation difficult to discern.
Figure 2. CNN and Twitter found that accounts purportedly belonging to people across the US were being set up and run from Ghana and Nigeria, funded by Russia.[12]
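The dynamic described above, a handful of coordinated accounts seeding a message while organic resharing does the amplification, can be made concrete with a small simulation. The Python sketch below is a toy model under illustrative assumptions; the community size, follower counts, and share probability are invented numbers, not measurements of any real campaign.

import random

random.seed(7)

COMMUNITY = 500   # members of one like-minded "echo chamber"
BOTS = 5          # coordinated accounts seeding the message
FOLLOWERS = 20    # each account reaches roughly 20 community members
P_SHARE = 0.15    # chance a member reshares a message seen from a trusted peer

exposed = set()   # members who have seen the message
shared = set()    # members who reshared it (the unwitting collaborators)

# Seeding: each bot pushes the message to its followers.
for _ in range(BOTS):
    exposed.update(random.sample(range(COMMUNITY), FOLLOWERS))

# Organic cascade: newly exposed members may reshare to their own followers.
frontier = set(exposed)
while frontier:
    next_frontier = set()
    for member in frontier:
        if member not in shared and random.random() < P_SHARE:
            shared.add(member)
            for follower in random.sample(range(COMMUNITY), FOLLOWERS):
                if follower not in exposed:
                    exposed.add(follower)
                    next_frontier.add(follower)
    frontier = next_frontier

print(f"{BOTS} bots seeded the message; {len(shared)} organic sharers "
      f"amplified it, exposing {len(exposed)} of {COMMUNITY} members")

Even in this crude model, nearly all of the spread is carried by ordinary accounts rather than the bots themselves, which is why the amplified message inherits the trust of the community it moves through.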
Before states can counter misinformation, they must first understand the cognitive biases within their communities. More importantly, countries must understand how the combination of all these biases, from individuals to societies, creates more complex vulnerabilities to their national security.[13] Failing to track these biases can open opportunities for adversaries to influence markets and elections, or to provoke divisions within democratic countries.
No country is safe from misinformation, including the United States. During the 2016 elections, adversaries used misinformation campaigns to discourage certain demographic groups from voting. According to Kate Starbird, "the pervasive use of disinformation is undermining democratic processes by fostering doubt and destabilizing the common ground that democratic societies require."[14] In closing, combating misinformation will take policymakers, academia, and big tech companies working together to protect democratic societies from adversaries' long-term conditioning of vulnerable populations’ cognitive biases.
The views expressed are those of the author and do not reflect the official policy or position of the U.S. Army, Department of Defense, or the U.S. Government.
Author
Assad Raza is an Active Duty Civil Affairs Officer in the United States Army. He holds a B.A. in Psychology from The University of Tampa and an M.A. in Diplomacy with a concentration in International Conflict Management from Norwich University, and he is a graduate of The Western Hemisphere Institute for Security Cooperation’s Command and General Staff Officer Course at Fort Benning, Georgia. Raza has served with the 82nd Airborne Division, the 96th Civil Affairs Battalion (Special Operations) (Airborne), and the 5th Special Forces Group (Airborne).
Endnotes
[1] Ward, Clarissa et al., “Russian Election Meddling Is Back -- Via Ghana and Nigeria -- and in Your Feeds,” CNN World, March 13, 2020, https://edition.cnn.com/2020/03/12/world/russia-ghana-troll-farms-2020-ward/index.html.
[2] Moynihan, Donald P., and Stéphane Lavertu. "Cognitive Biases in Governing: Technology Preferences in Election Administration." Public Administration Review 72, no. 1 (2012): 68-77. Accessed March 23, 2020. www.jstor.org/stable/41433144.
[3] Ciampaglia, Giovanni Luca, and Filippo Menczer. “Misinformation and Biases Infect Social Media, Both Intentionally and Accidentally.” The Conversation, June 20, 2018. https://theconversation.com/misinformation-and-biases-infect-social-media-both-intentionally-and-accidentally-97148.
[4] Ibid.
[5] Ibid.
[6] Ibid.
[7] Starbird, Kate. “Disinformation’s Spread: Bots, Trolls and All of Us.” Nature, July 24, 2019. https://www.nature.com/articles/d41586-019-02235-x.
[8] Ospina Estupinan, Jhon Deyby. “The coverage of China in the Latin American Press: Media framing study.” Cogent Arts & Humanities 4, no. 1 (2017). http://dx.doi.org/10.1080/23311983.2017.1287319.
[9] Ibid.
[10] Ciampaglia, Giovanni Luca, and Filippo Menczer. “Misinformation and Biases Infect Social Media, Both Intentionally and Accidentally.”
[11] Starbird, Kate. “Disinformation’s Spread: Bots, Trolls and All of Us.”
[12] Ward, Clarissa et al., “Russian Election Meddling Is Back -- Via Ghana and Nigeria -- and in Your Feeds.”
[13] Ciampaglia, Giovanni Luca, and Filippo Menczer. “Misinformation and Biases Infect Social Media, Both Intentionally and Accidentally.”
[14] Starbird, Kate. “Disinformation’s Spread: Bots, Trolls and All of Us.”