Both as individuals and collectively, we make decisions and behave in a way that reflects our perception of the world and our interpretation of the information available to us. Yet this construct is changing dramatically. The creation of the Internet and Social Media (ISM) has resulted in massive changes of scale in time, space and cost of information flows. The diffusion of information is now practically instantaneous across the entire globe.
This has resulted in a qualitatively new landscape of influence and persuasion. First, the ability to influence is now effectively “democratized,” since any individual or group has the potential to communicate and influence large numbers of others online in a way that would have been prohibitively expensive in the pre-Internet era.
Second, this landscape is now significantly more quantifiable. Data from ISM can be used to measure the response of individuals as well as crowds to influence efforts, and the impact of those operations on the structure of the social graph.
Finally, influence is also far more concealable. Users may be influenced by information provided to them by anonymous strangers, or even by the very design of an interface. In general, ISM provides new ways of constructing realities for actors, audiences and media. It fundamentally challenges the traditional news media’s function as gatekeepers and agenda-setters.
Thinking About Influence
More often than not, the word “propaganda” is used in a negative or pejorative context. But this was not always the case. In 1622, Pope Gregory XV created the Congregatio de Propaganda Fide (Congregation for the Propagation of the Faith), whose purpose was to supervise the Church’s missionary efforts in the New World and elsewhere. This was partly a reaction to the spread of Protestantism and was intended to help people follow the “true” path.
Edward Bernays, considered by many to be the father of the modern field of public relations, had a perhaps somewhat more flexible interpretation. He said: “Modern propaganda is a consistent, enduring effort to create or shape events to influence the relations of the public to an enterprise, idea or group.” He also took note of its power, making clear that “the conscious and intelligent manipulation of the organized habits and opinions of the masses is an important element in a democratic society. Those who manipulate this unseen mechanism of society constitute an invisible government which is the true ruling power of our country.”
An even more modern and flexible perspective has been offered by Dmitry Kiselev, the Director General of Russia’s state-controlled Rossiya Segodnya media conglomerate and the Kremlin’s all-around media czar. According to him, “objectivity is a myth that is proposed and imposed on us.” He has accused the European Union of hypocrisy and of violating his right to free speech (which is protected by international law) for imposing sanctions on him for broadcasting propaganda (which, by the way, is not illegal under international law).
All of this is important context for thinking about the rapidly changing Information Environment that we now confront.
The Information Environment
The U.S. Department of Defense Dictionary of Military Terms defines the Information Environment (IE) as “the aggregate of individuals, organizations, and systems that collect, process, disseminate, or act on information.” The IE consists of a wide variety of complex interacting and interconnected components, ranging from individuals to groups at multiple scales of organization to physical systems such as the power grid and medical facilities. The decisions and actions taken by these components, individually and collectively, simultaneously shape and are shaped by the IE in which we live.
The nature of interaction within the IE is rapidly evolving and old models are becoming irrelevant faster than we can develop new ones. The result is uncertainty that leaves us exposed to dangerous influences without proper defenses.
The IE can be broadly characterized along both technical and psychosocial dimensions. IE security today (often referred to as cybersecurity) is primarily concerned with defense of its purely technical features—for example, defense against denial of service attacks, botnets, massive thefts of intellectual property, and other attacks that typically take advantage of security vulnerabilities. This view is too narrow, however. For example, little attention has been paid to defending against incidents like the April 2013 Associated Press Twitter hack, in which a group hijacked the news agency’s Twitter account to put out a message reading “Two explosions in the White House and Barack Obama is injured.” The result of this message, with the weight of the Associated Press behind it, was a drop and recovery of roughly $136 billion in equity market value over a period of about 5 minutes.
This attack exploited both the technical (hijacking the account) and psychosocial (understanding how the markets would react) features of the IE.
Another attack, exploiting purely psychosocial features, took place in India in September 2013. It was an incident designed to fan the flames of Hindu-Muslim violence, involving the posting of a gruesome video of two men being beaten to death, accompanied by a caption that identified the two men as Hindu, and the mob as Muslim. It took 13,000 Indian troops to put down the resulting violence. It turned out that while the video did show two men being beaten to death, it was not the men claimed in the caption and in fact the incident had not taken place in India at all. The attack, moreover, required no technical skill whatsoever; it simply required a psychosocial understanding of the right place and right time to post it in order to achieve the desired effect.
These last two actions are examples of cognitive hacking. Key to the successes of these cognitive hacks were the unprecedented speed and the extent to which the essential disinformation could be distributed. Another core element of the success of these two efforts was their authors’ correct assessment of a cognitive vulnerability of their intended audiences—a premise that the audience is already predisposed to accept without too much critical thinking, because it makes a fundamental emotional appeal to existing fears or anxieties. And while the execution of this strategy relies on fundamentally new features of the IE, some of the underlying principles have been known throughout recorded history.
A Call to Action
An important question regarding the survival of our nation is how we answer the increasing threats that we face in the Information Environment from adversaries ranging from nation states large and small, to criminal and terrorist organizations, to a handful of people with malicious intent. At this time, all of our adversaries possess a significant asymmetric advantage over us as a result of policy, legal and organizational constraints that we are subject to and they are not. We need honest and open debate about how to meet these threats.
For example, both the research community and the operational community that is charged with defending us are subject to suffocating constraints on access to data. To understand the absurdity of our current situation, consider the fact that many parts of the U.S. government that need access to open and public social media data are denied that access, while every single one of our adversaries has complete and ready access to that information.
This author, as a program manager at the Pentagon’s Defense Advanced Research Projects Agency (DARPA), recently concluded what is probably the largest-ever government-sponsored research program in foundational social media technology, known as the Social Media in Strategic Communications (SMISC) program. SMISC researchers accomplished amazing things and significantly advanced the field, resulting in over 200 publications in the open literature as well as a number of groundbreaking technologies ready for application. At this point, the biggest fear is that, because of uninformed and antiquated policies and undue legal constraints, the principal beneficiaries of this work will end up being not the U.S. government but its adversaries.
To ensure this does not happen, the United States needs to create a new Center for Cognitive Security, the goal of which is to create and apply the tools needed to discover and maintain fundamental models of our ever-changing IE and to defend us in that environment, both collectively and as individuals. Such a Center would bring together experts in areas such as cognitive science, computer science, engineering, social science, security, marketing, political campaigning, public policy, and psychology, with the goal of developing a theoretical as well as an applied engineering methodology for managing the full spectrum of cognitive security issues. The U.S. government has already laid the foundation for such a construct; now is the time to erect it.
Today, the manipulation of our perception of the world is taking place on scales of time, space and intentionality that were previously unimaginable. That perception is shaped entirely by the information we receive. And that, precisely, is the source of one of the greatest vulnerabilities we as individuals and as a society must learn to deal with.
Rand Waltzman
[Originally published in The American Foreign Policy Council Defense Technology Program Brief, September 2015]