This will be a short post, as I have other reading and writing to do, but I wanted to go back to a concept I tried to develop while in the Foreign Office: one that sought to capture and understand the different levels on which information manipulation works.
I very much enjoyed a recent conversation with the excellent academic and writer Ian Garner on the Week Ahead podcast.
Garner argued that Putin considers his messaging in metaphysical terms, in which the factual content matters very little. He added in this context that last month’s interview with Tucker Carlson was impactful through its very appearance, rather than through the content of the actual discussion.
This took me back to my earlier thoughts on the different levels on which information manipulates audiences. Understanding this seems crucial to countering it.
To my mind, information manipulation works first at the factual, or tactical/situational, level, where basic events or single incidents can be exploited, distorted or invented to mislead audiences about a specific incident. For instance, a Russian proxy blogger could post a video claiming that a Ukrainian soldier shot at him, when in fact the blogger might have triggered some kind of tripwire.
More profoundly, and alongside (or beneath) this, information manipulation can work on cultural, sometimes metaphysical and mythical, levels, affecting deeply held worldviews and beliefs. Examples of this are powerful appeals to cultural beliefs, claims of national superiority and identity, and claims of historical inevitability (which Putin and Russia do very well).
I have a sense that the Russians understand very well the differences between these situational and strategic types of information manipulation. They also now understand very well how we try to counter these types of messaging, encouraging us to focus on tactical untruths and diversions while using more strategic messaging to appeal to certain audiences over the long term.
We must better understand the differences between these types of messaging in order to counter them, especially their long-term impact. Governments and partners can get tied up in fact-checking and debunking at the incidental or situational level – and there are suggestions that Russian proxies are using this technique to divert teams and tie up their resources.
What I think requires more thought, however, is how these two strands of messaging interact with each other, and how the Russians understand this interaction. It seems to me that tactical information manipulation is highly effective when it deliberately presses against deeper cultural messages, highlighting and endorsing them. These points of contact appear particularly potent, and the Russians seem adept at experimenting at pace to see how these pressure points work.
For example, it was not enough for Moscow to claim, as it did in June last year, that foreign mercenaries were present in the Kramatorsk pizzeria that Russia destroyed. Russia also insisted that the strike was part of a wider ‘war’ that the ‘West’ had declared against the Russians. Using different levels of messaging in combination can be highly impactful.
Understanding impacts of strategic manipulation
Tactical manipulation can endorse and perpetuate the more strategic, cultural strands of manipulation. Strategic manipulation can appeal to cultural beliefs, worldviews and emotions (perhaps a sense of faith); but people’s mythical thinking or beliefs can also be legitimised or sustained by their understanding (even if flawed) of situational reality.
This is one of the reasons why Russia’s recent capture of Avdiivka could be effective in the information space (even where military analysts debate its battlefield significance). The seeming success in Avdiivka (and the Russians’ portrayal of that operation) feeds into and partially substantiates the wider myth that Russia is winning this war, and that a Russian victory is historically inevitable (I have thought a lot about the role of Providence in Russian thinking, and shall return to this issue later). That strategic manipulation – that Moscow has a just cause in this war and is destined to win in any case – can in turn inspire Russian troops (momentum counts). It also provides the basis for creating more tactical manipulation, which audiences might be more likely to believe immediately because it fits with their deeper mythical worldviews.
Of course, tactical manipulation might be debunked at the factual/situational level, but the increasingly ephemeral nature of messaging, particularly on the favoured Russian platform Telegram, seems to further complicate the difficult relationship between strategic and tactical manipulation.
Countering the metaphysical
In all this work, it is necessary to consider continually the impacts of Russian manipulation and how we can mitigate them.
This is a highly complex area, and measuring impacts scientifically remains very hard; however, there are indications from the Russians of their likely strategic intent, which in turn can guide us towards effective mitigations. If tactical messaging can sustain strategic/mythological messaging, can it also be used to undermine it?
It seems possible. I argued in my last post that we should message hard about Russian losses in Avdiivka, using tactical messaging to undercut Russian myths about their supremacy and the supposed inevitability of their victory. This could be one way to push our messaging forward, as Russians appear incredibly sensitive to information about their losses and the number of men killed in Ukraine. The untimely fate of Z-blogger Andrey Morozov (‘Murz’) suggests that Russia is particularly keen to keep discussions of death tolls out of its information space; we should do all we can to keep inserting them.