Russian Propaganda in 2024: Traditions and New Trends

Viktoria Odusanvo

2024 was a year of events and challenges linked to the growing intensity of Russian propaganda. Over the year the Kremlin’s “minions” became more active in political manipulation and in spreading anti-Western narratives around the world.

To summarise this year’s Russian propaganda “trends”, our team has selected a few examples of Moscow’s information influence on the global arena.

Interfering with elections 

This year, Europe and the US saw a jam-packed “election season.” Russia used this period to test new, more aggressive hybrid warfare methods around elections, promoting candidates who serve the Kremlin’s policies and sowing chaos among voters. Russian influence could be traced in the elections in Moldova and Georgia. During the US elections, polling stations received false bomb threats sent from Russian domains. The first round of the presidential election in Romania was annulled amid an investigation into fraudulent activity and the use of bot farms by a pro-Russian candidate.

The purpose is not just to expand Russia’s information influence, but to undermine trust in modern democratic values and institutions, forcing people to doubt their stability. Kaja Kallas, the EU’s new High Representative for Foreign Affairs and Security Policy, stressed at POLITICO’s P28 event that the threat of Russian election interference must be taken seriously. “Democracy is based on trust, and if you can’t trust elections anymore then how can you trust the outcome?” she stated.

Two weeks before voting began in the Romanian presidential election, populist politician Calin Georgescu gained 30,000 new followers on TikTok, and views of his campaign video rose by 300,000, with the platform overflowing with hashtags related to his candidacy.

Using AI and social media 

This year Moscow bet on new technologies, increasing its use of AI, social media, and bot farms to scale up its “propaganda factory”. The US Justice Department, together with partners in Canada and the Netherlands, uncovered a network of over 900 bot accounts used to spread Russian narratives on X (formerly Twitter). Another striking example is Calin Georgescu, the pro-Russian Romanian candidate in the first round of the presidential election, whose pre-election campaign on TikTok was boosted by countless bot farms.

AI-generated video and audio materials pose another threat. Such tools can create the illusion that famous politicians, activists, or influencers support Russia’s policies. For example, during this year’s Olympics, Russia “took offense” at its athletes being banned from participating and launched a campaign to discredit the event. Fake videos using the image of Tom Cruise, who allegedly spoke about the “fall of the Olympics”, were part of this campaign. Kamala Harris also became a target of Russian AI-generated videos ahead of the elections. The Kremlin uses these tactics to damage politicians’ reputations and create chaos during the election process.

A still from the deepfake of Vice President Kamala Harris created by Russians. Source: Microsoft Threat Analysis Center (MTAC)

Threatening with new “interventions”

Since the beginning of the full-scale invasion, the Kremlin has used a strategy of intimidation built around its so-called “red lines”. It serves to lower global support for Ukraine and to cultivate fear of the “retribution” Russia might prepare for countries providing it with military and economic aid. The most popular threats so far have involved the possibility of a nuclear strike and the “who’s next” narrative, which seeks to intimidate the EU countries Russia plans to “teach a lesson” after its fantasised victory in Ukraine. Such threats are often addressed to former USSR countries, which Russia considers its “historical territories”. In particular, the narrative about the importance of avoiding a war with Russia was a significant part of the election campaign of the pro-Russian Georgian Dream party, as well as of the rhetoric used to discredit the EU-membership referendum held in Moldova. This strategy is especially cynical given that both Georgia and Moldova have territories currently occupied by Russia.

The protests in Tbilisi began on 28 November, after Georgia’s ruling party announced its intention not to participate in EU accession negotiations until 2028.

Personalised fakes

Alongside its “new inventions” in disinformation, Russia continued to rely on an old favourite: creating and spreading rumours to discredit the reputations of individuals, companies, and institutions. One of the popular “questions” Russia raised this year was the imagined theft of foreign aid in Ukraine. One example is the narrative about the Zelensky family’s alleged wealth: according to Kremlin propagandists’ lies, the family uses military aid and diplomatic visits to buy luxury vehicles and jewellery. Rumour campaigns such as this are designed to stir outrage among citizens of the US and Europe, pushing them to vote for parties and candidates that seek to reduce military aid to Ukraine altogether.

An example of a personalised fake attack: in September 2024, the Chairman of the US Senate Foreign Relations Committee, Ben Cardin, communicated via Zoom with an unidentified person who used deepfake technology to impersonate Dmytro Kuleba, the former Minister of Foreign Affairs of Ukraine.