Editor’s note: This article is the ninth in a series, “Full-Spectrum: Capabilities and Authorities in Cyber and the Information Environment.” The series endeavors to present expert commentary on diverse issues surrounding US competition with peer and near-peer competitors in the cyber and information spaces. Read all articles in the series here.

Special thanks to series editors Capt. Maggie Smith, PhD of the Army Cyber Institute and MWI fellow Dr. Barnett S. Koven.

Imagine the following scenario: A group of US military and diplomatic officials meet to discuss a named operation. During the meeting, a heated argument erupts among a uniformed trio about the need to overcome the current doctrinal limits of information operations. After a few clever retorts from a diplomat about the primacy of public diplomacy, a visiting Silicon Valley technocrat chimes in, explaining how his one-click solution might work. The argument goes on, in a circular manner, as one side or another refers to definitions and authorities until, like many similar discussions, the meeting ends in an agreement to revisit the topic at another time. Frustratingly, in the time it takes the officials to reach this unsatisfying conclusion, malign actors have likely initiated an entirely new disinformation campaign to undermine trust in our democratic institutions and values.

We are constantly reminded of the real-world impact of disinformation, from Russia’s long-standing active measures to weaken democratic and international institutions, to terrorist groups’ deceptive recruitment tactics, to the effect of vaccine disinformation on COVID-19 vaccine acceptance rates around the globe. Adding to the problem are deepfakes and other technological advances that are emerging as mainstream disinformation capabilities. Disinformation is a widespread and serious threat, and making meaningful progress against it requires coordination and collaboration across a broad array of tools and actors.

The Department of Defense’s 2014 doctrinal definition of information operations (IO) describes their objective: to “influence, disrupt, corrupt, or usurp the decision making of adversaries and potential adversaries while protecting our own.” In 2018, however, operations in the information environment (OIE) conceptually displaced IO; OIE aims to “change or maintain the perceptions, attitudes, and other elements that drive desired behaviors of relevant actors.” The US Advisory Commission on Public Diplomacy defines public diplomacy (PD) as “activities intended to understand, inform, and influence foreign publics.” These definitions, and their ever-evolving supporting terms—informational power, perception management, psychological operations, strategic communication, information warfare, influence operations—tend to confuse rather than clarify at a time when US organizations face increasingly sophisticated and resolute adversaries in a complex and interconnected information environment.

Traditional US definitions of IO and PD—implying distinct functions between the military and the diplomatic corps—require our attention but should not hinder cooperative action as the information environment becomes riskier due to technological advancements, steep competitor resource expenditures, and brazen adversarial operations. Our separate and distinct governmental authorities, our independent press, and our robust and freewheeling private sector and civil society are America’s greatest strengths—they ensure the protections and the freedoms we tend to associate with being American. And to preserve these American values, we must ultimately be more efficient and effective than our authoritarian competitors who trade protections and freedoms for stronger control and speed of action.

Given the reality of decentralized and underfunded US activity in the counter-disinformation sphere, the community of practice needs to shelve the dictionary and begin sharing functionality among its members and with partners, while simultaneously fostering trust within the community and maintaining compliance with the law. More useful than schoolhouse definitions—and key to enabling coordinated impact—is a comprehensive understanding of the available tools across the full spectrum of actors and functions, and agreement on mutually beneficial roles. In short, we propose a whole-of-society framework. This article works toward identifying the roles and functions available for a comprehensive whole-of-society approach.

In this article, we assume that the proposed framework and outlined capabilities are focused on the shared objective of countering disinformation that undermines US policies, stability, and national security. “Countering disinformation,” a term almost as misunderstood and redefined as IO and PD, refers not merely to counter messaging but also to proactive measures that use facts to inform audiences, reduce the impact of disinformation, and promote freedom of expression—activities that can be functionally categorized under communication, resilience, disruption, and regulation.

Communication

Proactive communication, which means setting the narrative and filling the communication void before mis- or disinformation can distort the truth, is a critical undertaking that requires understanding and implementing the full spectrum of communication capabilities. Increasing transparency and building trust in democratic values and institutions are key to success. While counter messaging should not be the first resort, it has its time and place alongside broader, proactive communication efforts.

The United States, as a society and nation, has an extraordinarily broad array of disparate communicators and voices, ranging from governmental institutions to the private sector to civil society actors and organizations. Communication platforms are also constantly evolving and expanding their reach as production and consumption of social media, print and digital media, radio, and television continuously grow and change. Within the US government alone, communication encompasses a number of activities:

  • Public Affairs: Experienced public affairs professionals adeptly navigate owned, paid, and earned media, building the necessary relationships to secure ideal placement of targeted content in order to inform audience attitudes or behavior. Standard public affairs tools include press briefings, interviews, press releases, media flyovers, media co-ops, advertising campaigns, public events, and branding. Techniques such as narrative, framing, and endorsement are a mainstay in this space.
  • Key Leader Engagements: Key leader, diplomatic, and other official engagements are powerful instruments for impactful communication to specific audiences. The tools in this category, more so than the method of engagement or the talking points, are the communicators themselves, because what is said is often less important than how it is said and who says it.
  • Strategic Signals: The military also communicates through action; in fact, considering its prominence, the US military sends messages through both action and inaction. Strategic signals frequently include freedom of navigation operations, capability demonstrations, exercises, investments, and asset placement, which can, more or less subtly, communicate resolve, commitment, and priority, or project battlefield superiority.
  • Public Diplomacy: Public diplomacy engagements and communications can vary in style and execution as much as the countries in which they take place, but can include speaker series, social and traditional media campaigns, American Spaces, exhibitions, and seminars (additional public diplomacy tools are categorized under resilience).
  • Military Information Support Operations (MISO): One element of doctrinal IO, MISO offers a broad range of communication opportunities depending on the environment, including online or printed magazines or journals, websites, news apps, radio programs, leaflets, WebOps, and text messages.
  • Counter Messaging: The use of this tool assumes the disinformation message is already out. Once spread, disinformation can be difficult to counter; however, one of the more successful counter-messaging tactics is exposure—exposing the source and the deceptive or malign nature of the message rather than focusing on the details of its inaccuracy. Other tactics include denigration, boycotting, and, under specific limited circumstances, deception.

Resilience

Building resilience, reducing vulnerability, and inoculating populations against the effects of mis- and disinformation are among the most effective ways to counter disinformation and are an absolute necessity in the current global information environment. Building resilience involves producers of information, education systems, media associations, public diplomacy practitioners, and other organizations equipping and empowering society to recognize how disinformation is created and spread, and to understand its impact. It encourages individuals and organizations to think critically about how they consume and share information and creates protective barriers against disinformation’s negative effects on society. Responsible digital citizenship comes down to the individual, but this skill can be fostered through the following measures:

  • Digital Literacy: It is vital to integrate media and digital literacy and updated critical thinking skills into educational curricula, adult education and professional development opportunities, and public service announcements or other educational ventures. Games can be used effectively to inoculate populations against the techniques and methods used by malicious actors and to teach populations to recognize disinformation when they see it. This is not a quick fix, but it is critical to a long-term solution.
  • Civics: A renewed effort to broaden and strengthen pro-democratic values and civic education through a range of educational and other civil society organizations is necessary. These efforts should incorporate skills for understanding information related to government and election processes, and focus on America’s pluralistic history, reasoning, and critical thinking.
  • Journalism: Media plays a critical role in a society’s resilience, so it is essential to promote journalistic standards that safeguard independent, fact-based, investigative journalism and to establish and bolster fact-checking standards and norms.
  • Public Diplomacy: Public diplomacy efforts can build resilience to disinformation in foreign populations through a myriad of programs, including facilitating professional, cultural, and educational exchanges; supporting English language education; training journalists; conducting capacity-building workshops; supporting local credible voices; and supporting young leaders initiatives.

Disruption

A more pointed approach leverages technology to stem the flow of disinformation at its source or in transit to the consumer. These technical solutions are often part of the toolbox used by the broader tech sector and internet platforms, while some capabilities are used by law enforcement and by the military overseas, including in the execution of cyberspace operations. The most common technical solutions include:

  • Validation: Identity, content, and site authentication are some of the tools that can support validation of social media accounts, websites, political ads, and other channels increasingly subject to disinformation. Under current law, Section 230 of the Communications Decency Act, responsibility for this kind of work falls largely to the internet platforms, but new tools—such as tools to track who created, updated, deleted, and read information—have potentially broader applications when used in a rights-respecting manner.
  • Blocking: Techniques such as content moderation and algorithmic filtering of content are social media companies’ core means of keeping unwanted content from infecting the information environment. These techniques can be controversial, as some see them as censorship, though internet platforms create and enforce their own terms of service.
  • Destruction: Destruction of adversarial capabilities is a high-risk, high-reward calculation. The military’s destruction of centers of gravity, including ISIS’s media production capabilities and media dissemination sites, for example, yielded at least short-term gains against a flexible media system.
  • Cyberspace Operations: Also high risk and high reward, cyber operations, including hack-and-dump operations, offer temporary disruption of sites, servers, and communication systems, and send a clear message to adversaries that they are vulnerable and that the United States has the advantage. The military’s cyber disruption of ISIS’s online operations is a good example.
  • Enforcement: Recent arrests, indictments, and sanctions of foreign agents involved in disinformation operations against US audiences have served to send a clear message. Federal law enforcement signals the United States’ resolve every time it takes action.

Regulation

Because disinformation is not going anywhere, regulation is a necessary component of any proposed approach to counter the impact of disinformation directed at the United States. Effective regulation should draw input from local and national legislators, media associations, internet platforms and the broader tech sector, and international organizations. Several approaches to regulation are required and should include the following:

  • Legislation: The US Congress and state legislatures have taken a crucial role in managing the US information environment. Recent bills such as SAFE TECH, Honest Ads, and ACCESS demonstrate legislators’ new willingness to engage with this complex issue; however, partisan disagreement stymies enactment of new bills, and recent flawed foreign legislative models set bad precedents.
  • Regulation: American TV networks and journalistic organizations have long maintained standards for acceptable information and reporting. The FCC once attempted to enforce balanced coverage through the fairness doctrine, which was abandoned due to its perceived effect of “chilling” free speech. Similar models have been suggested for the internet.
  • International Cooperation: Diplomatic efforts to counter the effects of disinformation require extraordinarily sophisticated coordination, but they can result in strategic moral gains against adversaries.

Weaving It All Together

Categorizing activities to counter disinformation within the functions of communication, resilience, disruption, and regulation supports a focus on outcomes rather than creating further bureaucratic division. Opening the information aperture to consider a broader spectrum of actors and capabilities allows communicators, strategists, and policymakers to construct impact-based activities by combining disparate capabilities. No single agency and no single tactic is capable of countering disinformation on its own. Therefore, we must be committed to learning, collaborating, and innovating.

Countering disinformation requires creative, collaborative, and coordinated implementation of the full spectrum of activities to achieve the greatest impact. To that end, the US Congress has mandated the Global Engagement Center (GEC) to direct, lead, synchronize, integrate, and coordinate efforts of the federal government to recognize, understand, expose, and counter foreign state and foreign nonstate actors’ propaganda and disinformation efforts.

US departments and agencies must continue to undertake counter-disinformation activities in compliance with their existing authorities—including limits on domestic influence operations, for example—to preserve our democratic values and institutions and reverse the erosive effects of disinformation on society while promoting freedom of expression. But by reorienting planning efforts to capitalize on whole-of-government and whole-of-society functions and strengths, the United States can gain new advantages over adversaries and competitors who leverage strong control and faster adoption of novel influence techniques to undermine our social fabric.

JD Maddox supports the US Department of State’s GEC as CEO of Inventive Insights LLC. JD previously served as deputy coordinator of the GEC, as a branch chief in the Central Intelligence Agency, and as a US Army psychological operations team leader. JD is also an adjunct professor in George Mason University’s Department of Information Sciences and Technology.

Casi Gentzel is a foreign affairs officer with the US Department of State’s GEC who currently serves as the GEC’s representative to US Indo-Pacific Command. Prior to joining the GEC in 2016, she served in the State Department’s Bureau of Diplomatic Security and the Office of the Under Secretary for Civilian Security, Democracy, and Human Rights.

Adela Levis is a civil service officer with the US Department of State’s GEC and serves as the GEC academic and think-tank liaison. Prior to joining the GEC in 2015, she served in the State Department’s Bureau for Democracy, Human Rights, and Labor, and in the Bureau for Educational and Cultural Affairs.

The views expressed are those of the authors and do not reflect the official position of the United States Military Academy, Department of the Army, or Department of Defense.

Image credit: Jorge Franganillo