This report assesses the extent, nature, purpose and efficacy of online Russian
state-sponsored efforts to promote Russia’s strategic narratives and exercise political
influence in Europe.
Stephen Hutchings is Professor of Russian Studies at the University of Manchester, UK
Alexandr Voronovici is Postdoctoral Research Associate at the University of Manchester, UK
Vera Tolz is Professor of Russian Studies at the University of Manchester, UK
Neil Sadler is Associate Professor of Translation Studies at the University of Leeds, UK
Maxim Alyukov is Leverhulme Early Career Research Fellow at the University of Manchester, UK
Sofia Tipaldou is Assistant Professor in International Relations at Panteion University, Greece
This is only the executive summary. To access the full report please click HERE
Executive Summary
• This report assesses the extent, nature, purpose and efficacy of online Russian
state-sponsored efforts to promote Russia’s strategic narratives and exercise political
influence in Europe, in the aftermath of bans imposed in the West on Russian state media
following the full-scale invasion of Ukraine in 2022.
• Focusing on the recent EU2024 elections, the report offers findings and policy
recommendations based on analysis of the output of five outlets with confirmed
Kremlin sponsorship (Reliable Recent News (RRN); Berliner Tageszeitung (BTZ); Voice
of Europe (VoE); France et EU; and Pravda-EN/Pravda-FR/Pravda-DE).
• Our findings confirm other studies suggesting the existence of a Kremlin strategy to
infiltrate, disrupt and influence Western media spaces. The initiators of these operations
tend to be part of the sprawling Russian state apparatus, which extends into the grey zone of
actors, both Russian citizens and foreigners, including corrupt businesspeople, individual
Kremlin sympathisers, former intelligence operatives and semi-criminal entrepreneurs
whose incentives are largely financial.
• The content of the examined output is highly parasitic. Much of it consists of the
remediation of news stories reported elsewhere, including by Western mainstream media.
Pro-Kremlin biases are reflected through the selection of stories serving Russian strategic
interests: hence the volume of Ukraine war stories selected to fit Kremlin narratives.
• Editorial approaches vary significantly, but include directly deceptive practices such as:
(a) plagiarising without modification Western agencies’ news reports collected via MSN or
Yahoo News, to which fictitious reporters’ names are assigned; (b) modifying the language
of the plagiarised Western mainstream media reports with the addition of propagandistic
statements steering readers towards pro-Kremlin interpretations; (c) distortive citation of
sources, in which claims are attributed to Western media outlets that those outlets did not make.
• Much of the overall output during the monitoring period in June 2024 consisted
of factual and often neutral reporting. This can be said to be part of an authentication
strategy designed to make readers more receptive to the more skewed coverage.
• The output is produced in multiple languages, including English, German, French and Spanish, with varying levels of competence. Pravda’s auto-translations of original
Russian sources are riddled with errors, whereas BTZ’s and France et EU’s output is written
in correct, idiomatic language.
• There is evidence of the use of AI tools to gather content from across the web, alongside
some human curation of content to tailor selections of news to audiences in different
linguistic and cultural settings.
• The domain names and access status of the outlets we studied are unstable and subject
to constant change in what is effectively a cat-and-mouse game between them and
Western media regulators striving to block them (it is possible, therefore, that some of the
outlets studied are no longer accessible).
• Coverage related to the EU2024 elections generally indicates no central coordination,
but does point to a campaign against the President of the EU Commission, owing to her
tough anti-Russian, pro-Ukrainian stance. Recurrent themes include Ukraine war policies; criticism of Europe’s attachment to “decadent” liberal values and US influences; and threats posed to European states by corruption and mass immigration.
• The audience traction and wider spread of the content posted by the proxies appears
to be minimal. Website visitor numbers are at best modest. In the case of the Telegram
channels, audience views and responses are often close to zero. France et EU’s Twitter
account had virtually no following. VoE achieved the best social media performance with
182K Twitter/X followers in August 2024, at which point, however, posting ended with an
invitation to join the VoE Telegram channel.
• Prior to their ban, some VoE and RRN output seems to have found its way into
the media ecosystem of conservative, far-right, and far-left actors, including global
conspiracists. There is very limited evidence of any influence on more mainstream media.
• There is a mismatch between the significant Russian state input into establishing
and maintaining the proxy network, and the level of influence achieved. Therefore,
the question of purpose arises. The network may serve a secondary “swamp and distract”
purpose, with a performative emphasis on demonstrating a continued Russian presence
in Western media space consistent with earlier OSINT analysis suggesting a strategy to
“overwhelm the global disinformation research and fact-checking community”.1 Some
initiatives are also laundering operations to enable payments to Kremlin-friendly actors.
• We make a series of policy recommendations, including the following:
– Policy responses to current Russian influence operations should balance the need to
address the actual threats they pose with the danger that overinflating their efficacy
may give them the oxygen of publicity they demonstrably seek.
– Assessing the precise traction and spread achieved by these operations is also key to
avoiding mistakes involving the misattribution to the Russian state of activities which,
in fact, have their origins elsewhere.
– If the responses are to include bans and access restrictions, then these must be
implemented effectively, especially given their propensity to be exploited by Russian
state actors for propaganda purposes.
– Policy makers should take great care to ensure that the disinformation lexicon is used
consistently and accurately, avoiding the tendency to apply it to areas of activity that
do not fit the description, or that require different labels.
– Great caution should also be exercised when applying the latest tools of detection
and analysis to digital propaganda. Automated tracking software can give highly
misleading results if not accompanied by qualitative, manual analysis.