Next year is set to be a big year for elections, with 65 national-level elections taking place across 50 countries. In fact, between October 2023 and the end of 2024, 71% of people living in democracies will vote in an election. This has raised concerns about how disinformation and interference might affect these elections, particularly as technology continues to make it easier for those seeking to influence voters.

Here are three areas that are causing particular concern.

Generative AI and the creation of disinformation

According to Bruce Schneier, Adjunct Lecturer in Public Policy at Harvard Kennedy School, generative AI is “a tool uniquely suited to internet-era propaganda.” This is because it has greatly reduced the cost of producing propaganda: generative AI can be prompted to create any type of content on demand and is largely free to use. As a result, election manipulation is no longer limited to those with large budgets, such as state actors, and is now accessible to smaller, domestic threat actors.

Spread of disinformation on social media platforms

The hardest part of election interference is distributing propaganda and disinformation widely enough. Social media platforms have made this easier, as malicious actors can set up multiple fake accounts to share and spread a message. Many of the major platforms have improved their methods of identifying and removing these accounts, but during an election year as big as 2024, the volume of fake accounts is likely to be very high. Some platforms, however, are not doing enough to protect against disinformation. X (formerly Twitter) was recently identified as having the largest proportion of disinformation of the six biggest networks, and was warned by the EU that it needed to do more to comply with the recently introduced Digital Services Act (DSA).

There are ways that social media platforms could do more to protect against the spread of disinformation, particularly around elections: for example, implementing ‘virality circuit breakers’ (sketched below), restricting ‘rampant resharing’ during election season, and creating clear, well-defined strike systems.
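
To make the first of these concrete, here is a minimal sketch of how a ‘virality circuit breaker’ might work, assuming a simple sliding-window reshare count; the class names and thresholds are hypothetical and do not reflect any platform’s actual system. The idea is to pause algorithmic amplification of a fast-spreading post until it has been reviewed, rather than removing it outright.

```python
from dataclasses import dataclass, field
from collections import deque
import time

@dataclass
class Post:
    post_id: str
    reshare_times: deque = field(default_factory=deque)  # timestamps of recent reshares
    amplification_paused: bool = False                    # circuit-breaker state

class ViralityCircuitBreaker:
    """Hypothetical sketch: pause recommendation amplification for posts whose
    reshare velocity crosses a threshold, pending human review."""

    def __init__(self, max_reshares: int = 500, window_seconds: int = 3600):
        self.max_reshares = max_reshares      # illustrative threshold, not a real platform value
        self.window_seconds = window_seconds  # sliding window for counting reshares

    def record_reshare(self, post: Post) -> None:
        now = time.time()
        post.reshare_times.append(now)
        # Drop reshares that have fallen outside the sliding window
        while post.reshare_times and now - post.reshare_times[0] > self.window_seconds:
            post.reshare_times.popleft()
        # Trip the breaker if the post is spreading faster than the threshold allows
        if len(post.reshare_times) > self.max_reshares and not post.amplification_paused:
            post.amplification_paused = True
            self.queue_for_review(post)

    def queue_for_review(self, post: Post) -> None:
        # Placeholder: in practice this would notify moderators or fact-checkers
        print(f"Post {post.post_id} paused from recommendation feeds pending review")
```

The point of such a design is that the breaker only pauses amplification (recommendations, trending placement) rather than deleting content, buying time for review while limiting how far a potentially false claim can spread.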

New forms of media

Finally, the type of content that people share and engage with has changed. Short-form video on platforms such as TikTok is growing in popularity, and the format is well suited to punchy, provocative clips that can often be created using generative AI. Combined with platforms’ recommendation algorithms, this can increase the virality of certain content, allowing it to reach more people more quickly than ever before.

Whilst there is a very real threat of election interference through disinformation and AI-generated propaganda, the good news is that governments and tech companies are aware of it and are taking steps to prevent it. Learning from the computer security world and sharing known methods of attack can help: by studying and understanding the techniques being used in other countries, governments can learn to defend their own elections.