Jan. 30, 2020 —
Despite recent technical innovations, such as the use of social media, Russia’s current malign influence campaigns follow the strategies of its Soviet predecessor. Unless we understand those strategies, we remain vulnerable to them. The new National Security Strategy acknowledges the return of great power competition, along with a fundamental shift in the locus of struggle away from armed conflict toward cyberspace and non-kinetic information warfare. Nevertheless, our actions in the information domain, especially regarding social media, remain geared to response and hampered by limited awareness of how our adversaries, especially Russia, structure and implement malign influence campaigns.
Unless we utilize and protect our national investment in social science expertise, our adversaries will understand us better than we understand them—and exploit that gap to their advantage. But by wedding America’s intensely competitive and rigorous social and behavioral science to our expertise in machine learning and AI, the U.S. can gain an asymmetric advantage that enables “persistent engagement” rather than response (Nakasone 2019) to defeat malign influence campaigns and better protect our Nation.
Social scientists in the United States have documented Russia’s past strategies, successes, and failures. Social science can detect and dissect our adversaries’ moves today so as to maintain information domain awareness, that is, knowledge of where and how they operate in the information domain to impact our and our allies’ security, safety, economy, and social and political environment. Quality information domain awareness involves the ability to automatically track and analyze data trends in real time, and to identify targets for countering disinformation by providing anomaly alerts (variance from truth, or from past or projected patterns of behavior).
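The anomaly alerts described above—flagging variance from past or projected patterns of behavior—can be illustrated with a minimal sketch. The function names, the rolling-window approach, and the three-standard-deviation threshold are all hypothetical choices for illustration, not a description of any deployed system.

```python
# Illustrative sketch only: flag "anomaly alerts" as deviations from a
# projected pattern of behavior, here modeled as a trailing rolling mean
# of daily message counts. All names and thresholds are hypothetical.

def rolling_stats(series, window):
    """Mean and (population) std of the trailing `window` values."""
    tail = series[-window:]
    mean = sum(tail) / window
    var = sum((x - mean) ** 2 for x in tail) / window
    return mean, var ** 0.5

def anomaly_alerts(counts, window=7, threshold=3.0):
    """Return indices where a count deviates from the trailing
    rolling mean by more than `threshold` standard deviations."""
    alerts = []
    for i in range(window, len(counts)):
        mean, std = rolling_stats(counts[:i], window)
        if std > 0 and abs(counts[i] - mean) > threshold * std:
            alerts.append(i)
    return alerts

# Example: a steady baseline of daily counts with one sudden burst.
daily_counts = [100, 98, 102, 101, 99, 100, 103, 500, 101]
print(anomaly_alerts(daily_counts))  # the burst at index 7 is flagged
```

A real system would, of course, track many signals at once (topics, accounts, narratives) and model projected behavior with something richer than a rolling mean, but the alerting logic follows the same shape: compare observed activity against an expected baseline and surface the deviations.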
In Eastern Europe, and increasingly in Western Europe, Russian influence campaigns have strategically played on cultural values and psychological biases, using lessons learned from Soviet attempts to control the USSR’s own ethnic groups and nationalities, beginning with Stalin’s tenure as People’s Commissar of Nationalities (1917-1923), when he coined the term dezinformatsiya (дезинформа́ция). Stalin sought to undermine traditional values, institutions, and leaders in various parts of the Soviet Union and beyond in order to prepare the psychological ground for communist takeover.
A complementary effort sought to discredit pro-Western forces as foreign, immoral, and dangerous. According to Gen. Oleg Kalugin, former KGB operations chief in the U.S., the Cold War aim was “Subversion… to weaken the West, to drive wedges in the Western community alliances of all sorts, particularly NATO, to sow discord among allies” (Kalugin 2007). Realizing that they could not match US technological or economic development, Soviet strategists sought to neutralize that advantage by exploiting psychological vulnerabilities and cultural preferences in human decision-making: disorient and disarm the ability to decide, formulate, or execute commands or policy, and political, economic, or military power can be rendered useless (Bagge 2019). According to Gen. Ion Mihai Pacepa of the Romanian secret police, the Soviet bloc had more people engaged in disinformation than in the armed forces and defense industry combined. A central behavioral principle “for disinformation to succeed… was that a story should always be built around ‘a kernel of truth’ that would lend credibility” (Pacepa and Rychlak 2013, 38).
Arguably Soviet disinformation’s greatest successes occurred in the late 1940s and early 1950s, when it facilitated the expansion of Soviet control over Eastern Europe and created serious threats of communist takeover in key Western European countries—France, Italy, and Greece. Although the U.S. began to recognize the Soviet strategy as early as 1946-1947, it was slow to mobilize the human resources needed to contain and counter it: a budget of $4.7 million, 302 personnel, and 7 foreign stations in 1947 grew by 1952 to $80 million, nearly 6,000 personnel, and 47 foreign stations (Gaddis 1982).
The U.S. appears similarly slow today in addressing the threat to our homeland and our allies. Without informed countermeasures, the Russian strategy may reverse the world’s general postwar trend toward greater tolerance and less violence, including the spread of liberal democracies (from 35 countries in 1970 to more than 100 in the early 2000s). Freed of the ideological constraints of the Soviet era, Russia has greater flexibility and agility in trying and testing disinformation tactics, while denying the U.S. a clear ideological target.
Moreover, the internet and social media now reach nearly 3 billion people (Statista 2019), and the speed and reach of Russia’s disinformation have eliminated much of its technological and economic disadvantage. Messages can be created and disseminated without physical agents or clear instruction. Such operations enable hostile activities below the threshold of armed conflict, while preserving plausible deniability for any damage inflicted. Russia knows us and our vulnerabilities well enough to develop messages and target audiences effectively. It also has the analytical capacity to evaluate its own performance and the agility to seize opportunities and change course. There may be little time to catch up before our interests and allies suffer serious degradation.
Our response has consisted primarily of reactive stopgap efforts, like those employed by the data scientists and analysts of Twitter, Instagram, and Facebook to block propaganda and hateful content (Weedon et al. 2017). Some of these defensive efforts involve technically accomplished pattern-recognition algorithms for processing big data. Proactive attempts to push messages favorable to U.S. interests, such as those in public diplomacy initiatives, are frequently based on selected “success” stories from the past, anecdotes, and intuition. However, there is scant evidence that these efforts have sustained results, or that we have a strategic appreciation of the psycho-social depth of Russia’s malign campaigns. Our operations appear equally ignorant of our own psychological susceptibilities and cultural belief systems, which Russia uses technology to monitor and manipulate (Atran et al. 2017).
As a result, we are being attacked through vulnerabilities created by our own core values, with Russian efforts succeeding at polarization by amplifying both sides of contentious political and social issues. In parallel, Russia has attacked democratic values (e.g., due process, freedom of opinion, tolerance of minorities) and subverted trust in related institutions of democratic governance (e.g., the judiciary, the press, representation of societal diversity). It has become skilled at triggering viral cascades when pivotal decisions are made, and at undermining key personalities and political institutions, vulnerable social groups, and the mutual respect needed for reasoned debate and consensus among the general population.
By closely studying American and European societies, Russia has discovered the strategic advantages of appearing to embrace the traditional values that we accused it of attacking during the Cold War: family, religion, nation, and longstanding cultural and ethnic mores. A persistent message from Russian President Vladimir Putin is that “the liberal idea… is allowed to overshadow the culture, traditions and traditional values of millions of people making up the core population.” The kernel of truth here is that some segments of our own society do perceive “traditional values” to be threatened by rapidly changing political, economic, and social conditions. Putin feeds those fears by claiming that the fault lies with liberal Western permissiveness, wherein “migrants can kill, plunder and rape with impunity because their rights as migrants have to be protected” and LGBT persons make “children play five or six gender roles” (BBC News 2019), a position echoed in anti-Western hate messaging that can be systematically linked to state-sponsored Russian sources (e.g., Civil Society Platform 2019).
Russian agencies and institutions working today on “nationalities” and “values” include the Russian special services, the recently created Federal Ethnic Affairs Agency, and various institutes of the Russian Academy of Sciences populated by former Soviet and Russian generals and ministers, in addition to their scientists. Many of the leading scientists are area experts trained in the disciplines of classic anthropology, including ethnology, archaeology, and linguistics. They seek encyclopedic understanding of target populations’ social, political, and economic makeup. Combined with basic research on social and psychological processes, that knowledge enables Russia to know whom, when, and how to target with specific interventions that further its strategic goals. In fact, interaction with such Russian expertise could prove beneficial to both our countries, as it has in counterterrorism efforts on the ground and in social media, although Moscow has chosen to play most other areas as a zero-sum game.
What is lacking in Russian social science, however, are rigorous methods of (null) hypothesis testing and experimental design, as well as sophisticated AI for designing automated countermeasures in real time (though Putin recently declared ramping up AI to be a national priority). Although the U.S. has an advantage in the scientific method and AI, the effectiveness of their deployment depends on how well we understand our adversaries’ goals, tools, and tactics; their perceptions of our strengths and weaknesses; and what those really are. For the persistent engagement that faces us, our strategies and tactics need to be informed by expertise in the psycho-cultural factors that affect political, economic, and social shifts in populations of interest. Without that expertise, we will persist in such demonstrably ineffective practices as countering disinformation with truth and evidence, however logically consistent or factually correct it may be (Murphy et al. 2019). In fact, decades of research demonstrate the greater persuasive effectiveness of leveraging cognitive biases and cultural norms that defend deeply held beliefs (Mercier and Sperber 2017). Coupled with our advanced computational capabilities, rigorous social science analysis of psychological vulnerabilities and cultural preferences could offer a powerful countermeasure to Russia’s well-practiced Soviet-style influence campaigns, powered up by modern social media.
Atran, Scott, Robert Axelrod, Richard Davis, and Baruch Fischhoff. 2017. Challenges in Researching Terrorism from the Field. Science 355:352-354.
Bagge, Daniel P. 2019. Unmasking Maskirovka: Russia’s Cyber Influence Operations. New York: Defense Press.
BBC News. 2019. Putin: Russian President Says Liberalism ‘Obsolete’. June 28.
Civil Society Platform No to Phobia. 2019. Media Monitoring Search Engine; http://notophobia.ge/eng/media/
Gaddis, John Lewis. 1982. Strategies of Containment: A Critical Appraisal of Postwar American National Security Policy. New York: Oxford University Press.
Kalugin, Oleg. 2007. Inside the KGB: An Interview with Retired KGB Maj. Gen. Oleg Kalugin. CNN. June 27.
Mercier, Hugo, and Dan Sperber. 2017. The Enigma of Reason. Cambridge, MA: Harvard University Press.
Murphy, Gillian, Elizabeth F. Loftus, Rebecca Hofstein Grady, Linda J. Levine, and Ciara M. Greene. 2019. False Memories for Fake News during Ireland’s Abortion Referendum. Psychological Science 30(10): 1449-1459.
Nakasone, Paul M. 2019. A Cyber Force for Persistent Operations. Joint Force Quarterly 92: 10-14.
Pacepa, Ion Mihai and Ronald J. Rychlak. 2013. Disinformation: Former Spy Chief Reveals Secret Strategies for Undermining Freedom, Attacking Religion, and Promoting Terrorism. Washington, DC: WND Books.
Statista. 2019. Number of social media users worldwide; https://www.statista.com/statistics/278414/number-of-worldwide-social-network-users/
Weedon, Jen, William Nuland, and Alex Stamos. 2017. Information Operations and Facebook. Facebook Security. April 27.
Scott Atran is an anthropologist and psychologist who studies how cognitive constraints and biases, and cultural preferences and values, shape social structures and political systems. He is co-founder of Artis International; Research Professor at the University of Michigan’s Ford School of Public Policy; Research Fellow at Oxford University’s Changing Character of War Centre; and advisor to the UN Security Council on counterterrorism and issues of Youth, Peace and Security.
Richard Davis focuses on policy development to thwart illicit physical and internet trafficking by states and criminal organizations of people, money, goods, and ideas. He is former Director of Prevention, White House Homeland Security Council; co-founder of Artis International and Artis Looking Glass; Professor of Practice at Arizona State University; and Research Fellow at Oxford University’s Changing Character of War Centre.
Hasan Davulcu is a computer engineer and mathematician who has developed US-patented algorithms for tracking illicit and malign behavior in social media. He is co-founder of Artis Looking Glass and Associate Professor at Arizona State University’s School of Computing, Informatics and Decision Systems Engineering.
Associated Minerva Projects
Addressing Resilience in the Western Alliance against Fragmentation: Willingness to Sacrifice and the Spiritual Dimension of Intergroup Cooperation and Conflict
New Analytics for Measuring and Countering Social Influence and Persuasion of Extremist Groups
Supporting Service Agency
Air Force Office of Scientific Research and Office of Naval Research
Nota Bene: Content appearing from Minerva-funded researchers—be it the sharing of their scientific findings or the Owl in the Olive Tree blog posts—does not constitute Department of Defense policy or endorsement by the Department of Defense.