Cognitive War Turns the Mind into a Battleground

Cognitive warfare targets how we think, feel, and react to stimuli. Policymakers must not only regulate emerging technologies but also work to identify and address the vulnerabilities in our cognition, writes Irene Pujol.


In an era marked by geopolitical tensions and unprecedented technological advancement and disruption, a new battleground is emerging: our minds. “Cognitive warfare,” a concept explored by NATO’s Innovation Hub since 2021, represents a shift in global conflict in which the strategic aim is to influence human cognition. Indeed, cognitive warfare is a growing global security concern, driven by advances in neuroscience, AI, and other emerging technologies, as well as the proliferation of social media. Understanding this evolving threat, and preparing to combat it, is crucial.

The premise of cognitive warfare is that, while manipulation and deception have historically been integral to broader military tactics, today’s landscape is markedly different. Influencing how individuals and groups think, react, and make decisions has been transformed from a tactical into a strategic objective in its own right, and one that is increasingly within reach. The drivers of this development can be summarized in three words: knowledge, incentives, and means.

The knowledge driver of cognitive warfare, which is often overlooked, stems from our growing understanding of how the human mind works, built on decades of research in neuroscience, behavioral economics, and psychology. According to Harvard Business School professor Gerald Zaltman, only a small fraction of our decisions, around five percent, are the product of conscious, rational deliberation. The rest are made under what Herbert Simon called bounded rationality: they are shaped by unconscious factors such as repetition, automatic responses, biases, and fallacies.

As Nobel laureate Daniel Kahneman argues in his book Thinking, Fast and Slow, our brains are wired to take automatic shortcuts for most of the decisions we make daily, such as choosing what to eat or responding to social cues. This is because deliberate analysis consumes a lot of mental energy. Examples of these mental shortcuts include:

  • The anchoring effect: Our tendency to rely too heavily on the first piece of information we encounter (the “anchor”) when making decisions.
  • Confirmation bias: Our inclination to seek out, interpret, and remember information that confirms our existing beliefs and preconceptions.

The exploitation of cognitive biases is not new. Marketing companies have been leveraging this knowledge for years to sell products (often ones we don’t need). Over the past decade, however, this exploitation has expanded beyond consumer manipulation: disinformation campaigns and other influence operations have become central to both global politics and modern warfare.

Why? Because states and other actors have incentives to do so, which is the second driver of cognitive warfare. Engaging in cognitive warfare offers a relatively safe and cost-effective way to achieve strategic goals, without the risks and expense associated with conventional military action. This approach is exemplified by China’s “intelligentized warfare” strategy towards Taiwan. By using cognitive warfare techniques, China seeks to exert control over Taiwan’s fate without resorting to conventional warfare, an approach partly motivated by concerns about the sustainability of China’s economic growth.


Yet, even in places of active, kinetic war, managing the narrative, which requires a good understanding of how the mind works, has grown increasingly critical to “winning the war.” The ongoing war in Ukraine illustrates this point. At a time when action from Western governments is desperately needed to shift the military balance against the Russian invasion, Russian disinformation campaigns to undermine support for Ukraine continue to intensify. For instance, Ralf Beste, head of the department for culture and communication at Germany’s Federal Foreign Office, told the Financial Times that his team had this year identified a network of more than 50,000 fake accounts generating up to 200,000 posts daily. These accounts aimed to persuade Germans that the government’s support for Ukraine jeopardizes German prosperity and increases the risk of nuclear war by “looking for cracks of doubts and feelings of unease and trying to enlarge them.”

While such strategies and techniques have been common in peace and war for decades, their effectiveness has increased – and will continue to increase – due to technological innovations that provide the means to engage in cognitive warfare with wide reach and impact.

The Cambridge Analytica scandal showed how social media data can be exploited to develop psychological profiles for political microtargeting. In recent years, we’ve also witnessed the strategic use of fake accounts, bots, digital influencers, and the distribution of fake news through messaging apps like Telegram to stoke racial tensions in the US, promote military coups in the Sahel, and boost China’s influence overseas. Yet, while destabilizing, traditional disinformation campaigns on social media have shown mixed results in achieving strategic goals.

Generative AI will further transform this landscape. Not only will it increase the volume and reach of disinformation campaigns and other influence operations by reducing the financial and technical barriers to content creation; it will also improve their quality and effectiveness. The growing sophistication of deepfakes and other AI-generated content will make it harder for people to tell what’s real and what’s not. Moreover, the ability of AI systems to learn and instantly adapt their messages to their interlocutors will enable a new level of microtargeting and personalized disinformation.

The risks of manipulation and reality distortion will also expand as AR/VR technologies, including Apple Vision Pro and Meta Quest Pro, become more widespread. Meanwhile, brain-computer interfaces such as Elon Musk’s Neuralink could give malicious actors unprecedented access to our neural data, offering insights into how we feel, think, and react to particular stimuli. This would allow them to hack and alter our perceived reality, or even influence our moods and behaviors.

While such a scenario may seem distant, it is important to anticipate and understand the risks of ongoing technological developments in light of today’s increasingly tense geopolitical context. Citizens must be aware of how their cognitive biases and data can be used, and exploited, for others’ gain, and must learn to critically evaluate the information they consume and share. Policymakers, in turn, must define and address the cognitive-domain activities that use emerging technologies. This requires a holistic understanding of technology, neuroscience, and geopolitics, and goes well beyond current cybersecurity measures. It is paramount that we adapt our current governance frameworks so that they balance innovation with individual cognitive rights, including by defining which actions in the cognitive domain constitute aggression and establishing the best mechanisms for attribution and accountability.

However, policymakers should not limit themselves to assessing how emerging technologies enable cognitive warfare and how they can be regulated to prevent their use for harmful purposes. They should also work with a wide range of stakeholders, from technology designers to psychologists, to identify the various vulnerabilities in human cognition and how technology can help address them.

What’s clear is that there is no time to waste. Cognitive warfare capabilities will only continue to advance, changing not only the nature of conflict but also how we are able to respond to it. Taking action now might help create a technological landscape that is a benefit, rather than a burden (or worse), to human autonomy and society at large.


© IE Insights.
