While propaganda is nothing new, the information age has unlocked the possibility of manipulating audiences in ways that could only have been dreamed of by power-hungry regimes of the past. The Internet has connected us through social networks, decentralized the media landscape and allowed vast amounts of information to move across the world with incredible speed. This brings with it a host of opportunities for learning and connecting, but it also offers fertile ground for disinformation.

Since at least 2008, experts have found Russia at the forefront of this 21st-century style of propaganda. Similar techniques were later employed by political figures around the world, including US President Donald Trump and Brazilian President Jair Bolsonaro, and in global disinformation campaigns targeting groups such as women.

Researchers at the RAND Corporation observed repeating patterns in Russian information tactics and, in 2016, dubbed the strategy the “firehose of falsehood” model. “Firehose” because the strategy draws on a huge number of channels, bot networks, sources and messages to shoot as much information to as many places as possible; “falsehood” because, instead of relying on the conventional persuasion tools of rational argument and credibility, the messages are divorced from any commitment to the truth, aggressively disseminating half-truths, outright lies and contradictory claims.

We can see it operating with brazen clarity in Russia’s 2022 invasion of Ukraine. Russian President Vladimir Putin originally presented Russia’s claim to Ukraine in a long essay arguing that “Russians and Ukrainians were one people” in the historical, cultural and spiritual sense. In the initial days of the invasion, Putin then justified his “special military operation” by saying that Russia was liberating Ukraine from a government full of “drug-addled Nazis” that aspired to develop nuclear weapons.

Both narratives, however, failed to catch on widely abroad. So the propaganda quickly shifted. 

After testing out other narratives, such as the claim that Ukraine was on the verge of developing dirty bombs, one finally seemed to stick with Western audiences: that the US was secretly developing biological weapons in Ukraine, a claim that dovetailed with conspiracy theories about the origins of the coronavirus. Although US biolabs were not originally cited as a justification for the invasion, Russian sources had been laying the groundwork for months. The theory gained traction after a QAnon-linked Twitter account claimed that Russia was bombing US biolabs in Ukraine. It spread like wildfire across right-wing US media and began getting picked up by outlets around the world. Eventually, Chinese and Russian officials echoed the claim, with Russian officials even insinuating that the US was experimenting with bat coronaviruses in Ukraine.

Meanwhile, a torrent of false information about the conflict itself was being shared on social media. Misleading or false videos of the war, for instance, sowed confusion over what was really happening, with some even suggesting that the war was being “staged.”

Clearly, consistency and commitment to the truth are not priorities. The firehose approach allows a vast quantity of misinformation to circulate across channels so that the catchiest narrative can stick and be further amplified. It also involves sources operating across internet platforms, television, newspapers, radio and official forums, and across geographies. Sometimes the messages contradict each other, but often they draw on each other to repeat false claims. The sheer volume of falsehoods and half-truths aims to “entertain, confuse and overwhelm the audience” to the point of general cynicism.

“The scariest thing about ‘fake news’ is that all news becomes fake,” said Nic Dawes, the former editor of a major South African newspaper, in an open letter to US journalists after Trump’s election. 

Why people believe 

Why does this style of disinformation seem to work so well? Well, as behavioral psychologists suggest, humans are not entirely rational.

As Nobel Prize laureate Daniel Kahneman breaks it down, human thought falls into two systems. The vast majority of our thinking, he says, operates in system one, which uses mental shortcuts to react quickly and instinctively to the world around us. This is especially true when people are tired, stressed or busy. System two is slower, more logical and requires mental strain. While most people believe they are operating mainly in system two, numerous experiments have shown otherwise.

For instance, studies have found that claims repeated across multiple sources, or simply repeated often, can be highly persuasive, no matter how preposterous they are. Other psychological experiments suggest that messages are more convincing when they come from people perceived to be similar to the receiver, such as influencers or fellow social media users.

Experimental psychology also tells us that first impressions tend to stick: once someone has heard something, it can be hard to convince them otherwise. Even if people initially doubt a source’s credibility, the “sleeper effect” means that over time they forget their doubts about the source and remember only the claim. At the same time, stories that trigger emotional responses are more likely to be shared and believed.

Putting kinks in the hose 

As the researchers behind the firehose of falsehood theory say, “don't expect to counter the firehose of falsehood with the squirt gun of truth.” While it is important to remain credible and refute misinformation, simply dismantling arguments with evidence or logical frameworks isn’t enough. Psychological research, however, offers tools to counter the onslaught of false information. 

One of the most effective strategies, according to the authors, is forewarning audiences about misinformation, because it gives the credible source a first-mover advantage. This tactic was on display before and during the war in Ukraine, with American and NATO officials warning that Russia might stage false flag operations. Along the same lines, the researchers suggest that it may be more productive to teach people how propagandists manipulate audiences than to fight each individual piece of misinformation.

They also suggest that it is more efficient to counter the effects of the propaganda than the propaganda itself. In other words, use narratives to fight back. If, for example, misinformation aims to build support for Russian aggression, then instead of refuting individual claims, push a counternarrative about the importance of democracy and international law. “Don't direct your flow of information directly back at the firehose of falsehood; instead, point your stream at whatever the firehose is aimed at, and try to push that audience in more productive directions,” the authors suggest.

The researchers also stress the importance of using all available tools to turn off or turn down the flow of falsehoods, by which they mean interfering with the ability of propagandists to make their messages heard. This played out during the initial stages of the Ukraine war, when Russian media outlets like RT and Sputnik were banned from dozens of countries and social media platforms, and when platforms blocked misinformation or shut down accounts that repeatedly promoted lies.

How the firehose and its counterattacks will play out over Ukraine remains to be seen. But unprecedented steps have already been taken as the information war continues to be fought far beyond Ukraine’s borders.