In last year’s Defense Intelligence Agency report on Russian military capabilities, the agency devoted a section to information warfare, stating that the “weaponisation of information is a key aspect of Russia’s strategy and is employed in time of peace, crisis, and war”.
“Russia employs a troll army of paid online commentators who manipulate or try to change the narrative of a given story in Russia’s favour. Russia’s Troll Army, also known as the Internet Research Agency, is a state-funded organisation that blogs and tweets on behalf of the Kremlin.
Trolls typically post pro-Kremlin content and facilitate heated discussions in the comments sections of news articles. Their goal is to counter negative media and “Western influence.” While the goal of some trolls is to simply disrupt negative content, other trolls promote completely false content.”
Russia is at the forefront of information warfare in the modern age, using an array of organisations and strategies to spread disinformation in support of national strategy. But how is it doing it?
Every now and then we come across a report from one of the many Russian state broadcasters carrying a remarkable headline about military equipment. It usually seems fairly obvious that the piece has a clear agenda, but why is this being done?
The rumours were false, but they had begun spilling into conventional news media. Numerous analysts and intelligence experts point to Russia as the prime suspect, noting that preventing NATO expansion is a centrepiece of Russian foreign policy.
Even the UK Defence Journal has been contacted by various Russian-based ‘news organisations’ looking for soundbites whenever we publish a story about an MoD blunder or questionable government decision.
The most effective instrument in this effort appears to be Russia Today. The organisation has frequently been described as a propaganda outlet for the Russian government, and media regulator Ofcom has repeatedly found RT to have breached rules on impartiality and to have broadcast “materially misleading” content.
In the paper ‘Computational Propaganda in Russia: The Origins of Digital Misinformation’, Sergey Sanovich argues that the Russian government’s digital propaganda seeks to insulate Putin’s leadership from any domestic challengers and to aid his foreign policy ventures, which increasingly pit Russian interests against those of the West.
The study argues that these propaganda tools, including trolls and bots, were conceived and perfected in the pockets of political competition and globally integrated market economy still left in Putin’s Russia.
“It’s argued that Russia could be on a mission to restore its Soviet or imperial glory and to prevent liberal democratic values from taking root in the Russian political system.
Yet the tools used are precisely the ones developed in the most internationally competitive part of the Russian economy that emerged during the liberal 1990s and (until recently) was not subject to heavy-handed interventions by the government: the online media and tech sector.”
The paper concludes that bots and trolls thrive in a low-trust, anything-goes, prove-it-on-the-spot environment. People share sensational and alarmist headlines with little verification more often on social media than in any other medium.
The paper also advises that building up the reputation of mainstream media, and ensuring that their objectivity, fairness and professional integrity are trusted by the public, would do more than anything else to deny Russian digital propaganda the power it currently wields.
“These external limitations, coupled with the vibrancy and tightness of, and the emphasis on the burden of proof in, the Russian blogosphere, required the government to build sophisticated tools of online propaganda and counter-propaganda.
They combined the ability of bots to jam unfriendly and amplify friendly content and the inconspicuousness of trolls posing as real people and providing elaborate proof of even their most patently false and outlandish claims.
Beyond that, exposing trolls and bots as well as the nuts and bolts of their campaigns could help both to educate the public in how to avoid falling for the misinformation they spread and to find technological means of disrupting their activity.”