How to Preempt Disinformation

Just as the U.S. spends hundreds of billions on warships and planes, it should invest in an information warfare machine focused on continually preempting disinformation.

An image of an injured Ukrainian social media influencer staggering from the rubble of a maternity hospital struck by Russian forces has become a new symbol of the global disinformation wars. When the shocking image of Marianna Vishegirskaya, known as @gixie_beauty on Instagram, went viral, a post on a Russian state-controlled site, ironically called “War On Fakes,” claimed that Vishegirskaya was actually a “crisis actress” paid to play the part. Using this post as a launchpad, the Russian embassy in the U.K. tweeted out the link, along with claims of “#fakenews,” numerous times. Twitter quickly took down the tweets, and several Western media outlets published articles debunking the claim.

Some social media watchers hailed this as a victory against disinformation. If so, it is a small one, and it is certainly not a blueprint for larger success. To fix the problem of disinformation, we need to think about the system that makes it so cheap, easy and profitable to sow inaccuracies at scale. The good news is that there are a number of concrete steps governments and social media companies can take to rein in disinformation without reinventing the wheel or killing social media.

The Disinformation War Is a Forever War

Governments have always used propaganda to serve national and ideological ends. The U.S. and its allies created news outlets such as Voice of America (VOA) and Radio Free Europe (RFE) to combat Nazi and Soviet propaganda. World War II and the Cold War were far tidier information wars, with radio towers providing the primary broadcast power. Through these efforts, the West learned one of the key lessons in fighting propaganda: Provide credible, accurate information in an accessible format.

The internet, ubiquitous connectivity and social media have radically changed the rules of information warfare. The internet made it inexpensive to create seemingly credible content. Global internet access made mass distribution easy and cheap. Social media companies changed the way we consume media, applying machine-learning algorithms to shape our news diets and drive us to interact with the news ever more.

The idea of generating interest with sensational news is not new. “If it bleeds, it leads” has long been an accurate description of the media’s priorities. These three factors can combine, however, in ways that harm society. Digital media production made it easy to create disinformation that looked like news. It made it so easy, in fact, that numerous businesses unaffiliated with any government cropped up in places such as Macedonia to create bogus political content solely to generate clicks. The internet and social media algorithms distributed disinformation widely and simultaneously amplified our predictable responses, thereby encouraging the creation of still more disinformation. Russia and China have eagerly embraced the new reality of information warfare, even targeting Canadians with relentless anti-U.S. propaganda.

For example, the Wagner Group, a Russian mercenary company, frequently props up unpopular dictatorships in Africa and other regions by aiming disinformation at political opponents, with the goal of causing political unrest and even violence. In Mali, this is part of Wagner’s business strategy and Russia’s political strategy, which aim both to profit from political strife and to undermine stability in regions that are strategically important to the U.S.

Because disinformation is so inexpensive and easy to create and distribute, the West is now engaged in an unwinnable game of “whack-a-mole.” Asking the large social media companies to radically alter their business models is not an option; these platforms were designed from the ground up for virality. Banning social media, too, is not a solution. Countless people use social media in ways that benefit them, and society relies on it as a tool for collective action and creative expression.

There are, however, ways to tweak the algorithms and handle disinformation that might break disinformation’s hold on our information sphere. The U.S.’s effective “pre-bunking”—releasing intelligence to counter Russian disinformation—has completely changed the information battlefield against Russia and provided crucial lessons on how to fight and win this broader information war. What the West needs now is a cohesive disinformation warfare strategy and machinery that works in partnership with social media companies to reestablish a healthier information sphere.

Deliberation as Choice Architecture

In their book Nudge, Richard Thaler and Cass Sunstein outline a concept of encouraging actions through “choice architecture,” which affords users sufficient freedom but nudges them in a desired direction. In the case of disinformation, the desired outcome is less sharing of factually incorrect content. There are a number of ways to inject friction into the sharing process to slow it down. Twitter already asks users whether they want to share articles they have not actually clicked on and read. Facebook could easily inject an “are you sure you want your mother to read this?” prompt to make users stop and think before they hit “share.” Alternatively, if a user has never interacted with the account whose content they are sharing, social media platforms could ask: “You have never read or retweeted anything originating from this account. Do you trust it?” The general goal would be to encourage people to stop and think before they share.
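As a rough sketch of how such a nudge layer might work, consider the following Python snippet. It is illustrative only; the ShareContext fields, the friction_prompt function and the prompt wording are assumptions for this article, not any platform’s actual API.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class ShareContext:
    """Hypothetical snapshot of a pending share; both fields are illustrative."""
    opened_article: bool              # did the user click through and read it?
    has_interacted_with_source: bool  # any prior reads or retweets of this account?


def friction_prompt(ctx: ShareContext) -> Optional[str]:
    """Return a nudge to display before completing the share, or None.

    No prompt blocks the share; each one only adds a deliberate pause,
    preserving freedom of choice while steering behavior.
    """
    if not ctx.opened_article:
        return "You haven't opened this article. Want to read it before sharing?"
    if not ctx.has_interacted_with_source:
        return ("You have never read or retweeted anything originating "
                "from this account. Do you trust it?")
    return None  # no friction needed; the share proceeds immediately
```

A share flow would call friction_prompt() at submit time and show any returned message as a confirmation dialog, with the user free to proceed either way.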

Introducing Costs to Bad Actors

There are other ways to introduce costs to bad actors, the simplest being removing any content they post from amplification algorithms. For example, social media companies could adopt a “three-strikes” policy that de-amplifies any account with three reported violations for disseminating inaccurate content. This doesn’t mean that such accounts can’t post or that people directly following them can’t read their content. But it throws sand into the gears of virality, and de-amplification would destroy the business models of commercial entities that attract customers through disinformation. A more drastic step would be blocking commerce on these sites or accounts, which often profit by selling T-shirts and other merchandise. Social media platforms are already responding to complaints about disinformation, but content moderation remains low-status, low-pay work; giving moderators full-time jobs, better pay and more power to enact rigorous penalties would go a long way toward imposing costs on disinformation.
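To make this concrete, a three-strikes policy could be modeled as a simple strike counter consulted by the ranking pipeline. This is a sketch under assumed rules; the STRIKE_LIMIT threshold and the class and method names are hypothetical, not any platform’s real moderation system.

```python
from collections import defaultdict
from typing import DefaultDict

STRIKE_LIMIT = 3  # assumed threshold for the "three-strikes" policy


class StrikeTracker:
    """Toy model of three-strikes de-amplification.

    Accounts over the limit can still post, and direct followers can
    still read their content; the only penalty is losing eligibility
    for algorithmic amplification.
    """

    def __init__(self) -> None:
        self._strikes: DefaultDict[str, int] = defaultdict(int)

    def record_violation(self, account_id: str) -> None:
        """Log one confirmed report of disseminating inaccurate content."""
        self._strikes[account_id] += 1

    def is_amplified(self, account_id: str) -> bool:
        """Accounts under the limit stay eligible for recommendation feeds."""
        return self._strikes[account_id] < STRIKE_LIMIT
```

A feed-ranking system would check is_amplified() before boosting a post, so a penalized account keeps its direct audience but loses its viral reach.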

Pre-Bunking Everything, Everywhere

The two suggestions above focus on changing algorithms to reduce disinformation. An equally powerful change would be a policy of global “pre-bunking.” Over the course of the war in Ukraine, the U.S. government has controlled the information space in Europe by publishing intelligence to preempt well-worn Russian disinformation narratives. The very act of saying publicly what intelligence agencies hear secretly altered the equation, defanging Russian disinformation in the West and establishing a baseline of factual information. Pre-bunking robbed Russian intelligence agencies of the ability to sow doubts about the facts on the ground and to shape public perception by spreading mistruths on social media unchecked and unopposed.

This was a discrete preemption effort, but it should become the norm. Wherever bad actors are spreading disinformation, the U.S. is likely gathering intelligence about what is happening. Just as the U.S. spends hundreds of billions on warships and planes, it should invest in an information warfare machine focused on continually preempting disinformation. In comparison with the costs of military gear, the price tag would be a pittance. The loss of secrecy might pose some risks and should always be a considered choice. But in an era of open-source intelligence, when private satellites provide military-grade photographs and information warfare plays out in real time, sharing more, not less, with the public will generate goodwill and credibility while also advancing Washington’s geopolitical goals. For social media algorithms, pre-bunking can serve as a counterbalance against disinformation and as ballast to anchor the information space, rather than allowing a continuous loop of reaction to false posts and stories.

What technology takes away, technology also gives. Unthinking adoption of algorithmically driven content engines has made it easier than ever to create and spread disinformation, and smart changes to those algorithms could yield disproportionately beneficial results. Over time, people may even begin to think differently about how and what they share, and new business models may emerge that are less reliant on blind clicks and the hijacking of raw attention. Disinformation will always be with us; it is part of the world of information exchange. Whether we let it dominate our information sphere and drive the conversation depends on providing better, broader choices and feedback by fixing the algorithms that enabled single actors to dominate that sphere in the first place.

This article was written by Vivek Wadhwa and Alex Salkever. They are the authors of The Driver in the Driverless Car and From Incremental to Exponential: How Large Companies Can See the Future and Rethink Innovation. Their work explains how advancing technologies can be used for both good and evil, to solve the grand challenges of humanity or to destroy it.


Vivek Wadhwa

Vivek Wadhwa is a fellow at Arthur and Toni Rembe Rock Center for Corporate Governance, Stanford University; director of research at the Center for Entrepreneurship and Research Commercialization at the Pratt School of Engineering, Duke University; and distinguished fellow at Singularity University.
