U.S. Department of State

Date of this Version

5-2017

Citation

United States Advisory Commission on Public Diplomacy, Can Public Diplomacy Survive the Internet? Bots, Echo Chambers, and Disinformation, May 2017

Edited by Shawn Powers and Markos Kounalakis

Comments

United States government work. Public domain material.

Abstract

Report from a 2017 meeting on disinformation, the Internet, and public diplomacy, held at the Hoover Institution, Stanford University.

Executive Summary

Scientific progress continues to accelerate, and while we’ve witnessed a revolution in communication technologies in the past ten years, what comes in the next ten years may be far more transformative. It may also be extremely disruptive, challenging long-held conventions behind public diplomacy (PD) programs and strategies. To think carefully about PD in this rapidly changing communications space, the Advisory Commission on Public Diplomacy (ACPD) convened a group of private sector, government, and academic experts at Stanford University’s Hoover Institution to discuss the latest trends in research on strategic communication in digital spaces. The results of that workshop, refined by a number of follow-on interviews and discussions, are included in this report. I encourage you to read each of the fourteen essays that follow, which are divided into three thematic sections: Digital’s Dark Side, Disinformation, and Narratives.

Digital’s Dark Side focuses on the emergence of social bots, artificial intelligence, and computational propaganda. Essays in this section aim to raise awareness of how technology is transforming the nature of digital communication, offer ideas for competing in this space, and raise a number of important policy and research questions needing immediate attention. The Disinformation section confronts Oxford Dictionaries’ 2016 word of the year – “post-truth” – with a series of compelling essays from practitioners, a social scientist, and a philosopher on the essential roles that truth and facts play in a democratic society. Here, theory, research, and practice neatly align, suggesting it is both crucial and effective to double down on fact-checking and evidence-based news and information programming in order to combat disinformation campaigns from our adversaries. The Narratives section concludes the report by focusing on how technology and facts are ultimately part of, and dependent on, strategic narratives. Better understanding how these narratives form, and what predicts their likely success, is necessary to think through precisely how PD can, indeed, survive the Internet. Below are some key takeaways from the report.

In Defense of Truth

• We are not living in a “post-truth” society. Every generation tends to think that the current generation is less honest than the previous generation. This is an old human concern, and should be seen today as a strategic narrative (see Hancock, p. 49; Roselle, p. 77). Defending the value and search for truth is crucial. As Jason Stanley notes (p. 71), “without truth, there is just power.”

• Humans are remarkably bad at detecting deception. Studies show that people tend to trust what others say, an effect called the truth bias. This bias is actually quite rational—most of the messages that a person encounters in a day are honest, so being biased toward the truth is almost always the correct response (see Hancock, p. 49).

• At the same time, people are also continuously evaluating the validity of their understanding of the world. This process, called “epistemic vigilance,” is a continuous check that the information a person believes about the world is accurate. While we have a difficult time detecting deception from interpersonal cues, people can detect lies when they have the time, resources, and motivation. Lies are often discovered through contradicting information from a third source, or through evidence that challenges a deceptive account (see Hancock, p. 49).

• Fact checking can be effective, even in hyper-partisan settings (see Porter, p. 55), and is crucial for sustained democratic dialogue (Bennett, p. 61; Stanley, p. 71). Moreover, it is possible, using digital tools, to detect and effectively combat disinformation campaigns in real time (Henick and Walsh, p. 65).
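
The essays do not spell out how such real-time detection works; one commonly cited signal is coordination, i.e. many accounts pushing near-identical text in a short window. The Python sketch below is a minimal, hypothetical illustration of that single heuristic; the thresholds, function names, and message format are assumptions made for this example, not anything described in the report.

```python
# Illustrative sketch only: flag bursts of near-identical messages in a
# short time window, one simple signal of a coordinated campaign.
import re
from collections import defaultdict

WINDOW_SECONDS = 600     # assumed: look for bursts within 10 minutes
BURST_THRESHOLD = 50     # assumed: copies of one message that trigger an alert

def normalize(text):
    """Collapse case, URLs, and whitespace so near-duplicates hash together."""
    text = re.sub(r"https?://\S+", "<url>", text.lower())
    return re.sub(r"\s+", " ", text).strip()

def detect_bursts(messages):
    """messages: iterable of (timestamp_seconds, text) pairs in rough time order.
    Yields (normalized_text, count) once a message repeats suspiciously often."""
    recent = defaultdict(list)  # normalized text -> timestamps inside the window
    for ts, text in messages:
        key = normalize(text)
        recent[key] = [t for t in recent[key] if ts - t <= WINDOW_SECONDS]
        recent[key].append(ts)
        if len(recent[key]) == BURST_THRESHOLD:  # report once per burst
            yield key, BURST_THRESHOLD
```

Real systems layer many such signals (account age, posting cadence, network structure); this shows only the burst heuristic.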

Computational Propaganda

• Computational propaganda refers to the coordinated use of social media platforms, autonomous agents, and big data to manipulate public opinion.

• Social media bots (or “web robots”) are the primary tools used to disseminate computational propaganda. In their most basic form, bots provide answers to simple questions, publish content on a schedule, or disseminate stories in response to triggers (e.g., breaking news); a toy sketch of these mechanics follows this list. Bots can have a disproportionate impact because they are easy to create in large numbers and can post high volumes of content at high frequency (see Woolley, p. 13).

• Political bots aim to automate political engagement in an attempt to manipulate public opinion. They allow for massive amplification of political views and can empower a small group of people to set conversation agendas online. Political bots are used over social media to manufacture trends, game hashtags, megaphone particular content, spam the opposition, and attack journalists. The noise, spam, and manipulation inherent in many bot deployment techniques threaten to disrupt civic conversation and organization worldwide (see Chessen, p. 19).

• Advances in artificial intelligence (AI) – an evolving constellation of technologies enabling computers to simulate cognitive processes – will soon enable highly persuasive machine-generated communications. Imagine an automated system that uses the mass of online data to infer your personality, political preferences, religious affiliation, demographic data, and interests. It knows which news websites and social media platforms you frequent, and it controls multiple user accounts on those platforms. The system dynamically creates content specifically designed to plug into your particular psychological frame and achieve a particular outcome (see Chessen, p. 39). A minimal illustration of such tailoring follows this list.

• Digital tools have tremendous advantages over humans. Once an organization creates and configures a sophisticated AI bot, the marginal cost of running it across thousands or millions of user accounts is relatively low. Such bots can operate 24/7/365, reacting to events and creating content at machine speed, shaping the narrative almost immediately. This is critical in an information environment where the first story to circulate may be the only one people recall, even if it is untrue (see Chessen, p. 39).

• PD practitioners need to consider how they can create and sustain meaningful conversations and engagements with audiences if the media they typically rely upon are becoming less trusted, compromised, and dominated by intelligent machines.

• Challenging computational propaganda should include efforts to ensure the robustness and integrity of the marketplace of information online. Defensively, this strategy would focus on producing patterns of information exchange among groups that would make them difficult to sway using techniques of computational propaganda. Offensively, the strategy would seek to distribute the costs of counter-messaging broadly, shaping the social ecosystem to enable alternative voices to effectively challenge campaigns of misinformation (see Hwang, p. 27). In the persuasive landscape formed by social media and computational propaganda, it may at times be more effective to build tools than to construct a specific message.

• Practitioners are not alone in their concern about the escalating use of social bots by adversarial state actors; the private sector shares it. Social media platforms see this trend as a potentially existential threat to their business models, especially if the rise of bots and computational propaganda weakens users’ trust in the integrity of the platforms themselves. Coordination with the private sector is key, as platform policies governing autonomous bots will adapt and, thus, shape what is and isn’t feasible online.
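
As a concrete companion to the bullet on bot mechanics above, here is a minimal toy sketch of a bot that posts on a schedule and reacts to a trigger. It is an assumption-laden illustration: post() and fetch_headlines() are hypothetical stand-ins for a real platform API, not code from any of the essays.

```python
# Illustrative sketch only: a toy bot with the two basic behaviors named
# in the report -- scheduled posting and trigger-based dissemination.
import time

SCHEDULE_INTERVAL = 3600      # assumed: post queued content once an hour
TRIGGER_KEYWORD = "breaking"  # assumed: react when this appears in a headline

def post(text):
    print(f"[bot] {text}")    # hypothetical stand-in for a platform API call

def fetch_headlines():
    return []                 # hypothetical stand-in for polling a news feed

def run_bot(queued_posts):
    last_post = 0.0
    while True:
        now = time.time()
        if queued_posts and now - last_post >= SCHEDULE_INTERVAL:
            post(queued_posts.pop(0))          # scheduled content
            last_post = now
        for headline in fetch_headlines():
            if TRIGGER_KEYWORD in headline.lower():
                post(f"Seen: {headline}")      # trigger-based dissemination
        time.sleep(5)                          # poll loop
```

Because this loop is trivial to replicate across many accounts, it also illustrates the low-marginal-cost, machine-speed point made above.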
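
Chessen’s tailoring scenario can likewise be reduced to a toy sketch: infer an audience member’s dominant concern from a profile, then select a matching message frame. The profile fields and frames below are invented for illustration and stand in for what would, in practice, be a learned model.

```python
# Illustrative sketch only: choosing a message frame from an inferred
# audience profile (a hypothetical, hand-rolled stand-in for an AI system).
FRAMES = {
    "security": "New report: this policy keeps your community safer.",
    "economy":  "New report: this policy protects local jobs.",
    "fairness": "New report: this policy levels the playing field.",
}

def infer_top_concern(profile):
    """Pretend inference step: pick the concern with the highest score."""
    return max(profile["concerns"], key=profile["concerns"].get)

def tailor_message(profile):
    return FRAMES[infer_top_concern(profile)]

profile = {"concerns": {"security": 0.2, "economy": 0.7, "fairness": 0.1}}
print(tailor_message(profile))  # -> the jobs-framed message
```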

Moving Past Folk Theories

• Folk theories, or lay beliefs about how a particular process works, are driving far too many digital strategies. One example is the belief in pervasive online echo chambers: the idea that people are increasingly digitally walled off from one another, engaging only with content that fits their cognitive predispositions and preferences.

• Research suggests that the more users rely on digital platforms (e.g. Twitter and Facebook) for their news and information, the more exposure they have to a multitude of sources and stories. This remains true even among partisans (though to a lesser extent than non-partisans). It turns out we haven’t digitally walled ourselves off after all (see Henick and Walsh, p. 65).

• Despite increased exposure to a pluralistic media ecosystem, we are becoming more ideological and partisan, and more walled off at the interpersonal and physical layers. For example, marriages between two people with similar political views are twice as common today as they were in 1960.

• Understanding this gap between a robustly diverse news environment and an increasingly “siloed” physical environment is crucial to more effectively engaging with target audiences around the world. Interpersonal and in-person engagement, including exchange programs, remain crucial for effective PD moving forward (see Wharton, p. 7).

• Despite this growing ideological divide, people are increasingly willing to trust one another, even complete strangers, when their goals are aligned (see the sharing economy, for example). This creates interesting opportunities for PD practitioners. Targeting strategies based on political attitudes or profiles may overshadow the possibility of aligned goals on important policy and social issues (see Hancock, p. 49).

Rethinking Our Digital Platforms and Metrics

• Virality – the crown jewel in the social media realm – is often overemphasized at the expense of more important metrics like context and longevity. Many of the metrics used to measure the effectiveness of social media campaigns are vulnerable to manipulation and, more importantly, don’t measure engagement in any meaningful way. These metrics were built for an industry reliant on advertising for revenue generation and, as a result, may not be well suited to the context of PD (see Ford, p. 33; Woolley, p. 13). A toy comparison of reach versus longevity follows this section.

• Overemphasizing certain metrics, such as reach or impressions, fails to account for the risks created by relying on the same portals as other, less truthful and more nefarious actors. We need to be aware of the various ways in which the digital media industry is shaping PD content, be mindful of the risks, and think carefully about safeguarding the credibility of U.S. Department of State PD programs operating in this space (see Wharton, p. 7; Ford, p. 33).
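
To make the virality-versus-longevity point concrete, the sketch below contrasts raw reach with a crude longevity measure. The flat list-of-timestamps data model is an assumption made for this example, not a metric defined in the report.

```python
# Illustrative sketch only: raw reach vs. a simple longevity proxy.
from statistics import median

def reach(engagement_times):
    """Raw engagement count: the easily gamed 'virality' number."""
    return len(engagement_times)

def longevity_days(engagement_times, post_time):
    """Median days from posting to each engagement: a rough proxy for
    sustained interest rather than a one-day spike."""
    if not engagement_times:
        return 0.0
    return median((t - post_time) / 86400 for t in engagement_times)

# A post that spikes for two days vs. one that earns steady attention:
spike = [h * 3600.0 for h in range(48)]       # 48 engagements within 2 days
steady = [d * 86400.0 for d in range(1, 31)]  # 30 engagements over a month
print(reach(spike), longevity_days(spike, 0.0))    # 48, ~1 day
print(reach(steady), longevity_days(steady, 0.0))  # 30, 15.5 days
```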

Strategic Narratives

• Strategic narratives are “a means for political actors to construct a shared meaning of the past, present, and future of politics in order to shape the behavior of other actors.” They provide the ideological backdrop against which audiences assess the meaning and significance of current events and breaking news. Put another way, they help people make sense of what would otherwise be a dizzying daily onslaught of news (see Roselle, p. 77; Kounalakis, p. 91).

• Crafting effective narratives requires a genuine consensus – even if limited or temporary – on our policy priorities and their underlying values, as well as a detailed understanding and appreciation of local grievances and concerns about the policy issue at hand (see Wharton, p. 7; Roselle, p. 77). As such, effective strategic narratives must be mutually constructed.

• Rather than focusing on trending news topics and stories alone, we need to develop greater capacity to understand competing public narratives in foreign contexts and track how they adapt over time. Understanding distinctions between system (or governance), value, and identity narratives would allow PD practitioners to construct policy narratives that speak to, or at least acknowledge, the underlying pillars of belief in a given community (see Walker, p. 83; Roselle, p. 77).

• Every new administration creates new opportunities for foreign engagement. A shift toward a more transactional approach to PD, focused less on values and more on shared policy priorities, could allow for improved relations and cooperation with a number of countries previously hostile to American PD efforts and programs (see Kounalakis, p. 91).
