Dominic Saari

Election interference challenges democracies in the election year of 2024

Various influence campaigns are currently taking place all over the world, with the aim of influencing the political climate in the target countries. Artificial intelligence (AI) has already been strongly harnessed to enhance information warfare, writes Project Researcher Dominic Saari from the University of Jyväskylä.
Published 14.5.2024

Text: Dominic Saari | Photos: Petteri Kivimäki

This year can be considered a major year for elections: according to estimates, in 2024, elections will be organised in at least . An optimist could see the situation as a great celebration of democracy: after all, a record-breaking number of citizens from New York to Delhi are heading to the ballot box.

Dark clouds, however, are gathering above the celebrations of democracy. According to Freedom House, global freedom declined for the 18th consecutive year in 2023. According to the 2023 index, the fundamental rights and freedoms of citizens were restricted in 52 countries, while improvements took place in only 21 – the worst balance in Freedom House’s history.

In 2023, the fragile state of democracy was impacted by, for example, populism, coups, armed conflicts, election fraud, and information warfare by foreign actors.

Authoritarian states sow distrust

In January 2024, the World Economic Forum (WEF) published a risk report in which misinformation and disinformation were seen as the greatest short-term threats globally. Misinformation and disinformation are methods used in information warfare.

Through information warfare, authoritarian states aim to create mistrust, manipulate public debate, and steer politics in target countries in their desired direction.

China and Russia, for example, have attempted to increase internal polarisation in the United States by releasing completely different narratives to different parts of the population.

At the heart of these influence strategies are often misleading, emotionally charged messages that are shared widely across social media platforms. The campaigns can include false information, conspiracy theories – or factual information presented in a deliberately misleading way.

In the Western hemisphere, the presidential election in the United States is again being followed with a mixture of anxiety and anguish. The stakes are perceived to involve U.S. support for Ukraine and Taiwan.

There are currently plenty of influence campaigns running across the globe aiming to affect the politics in the target countries. In autumn 2023, the social media giant Meta announced that it had successfully taken down “the largest known cross-platform covert influence operation in the world”, which was closely linked to China. This network consisted of thousands of bot and troll accounts on various social media platforms, which spread narratives favourable to the Chinese regime around the world. During the spring, European authorities have also confirmed uncovering Russian influence networks that sought to bribe members of the European Parliament to spread the Kremlin’s propaganda ahead of the European Parliament election in June.

It can be assumed that these networks are just the tip of the iceberg – a sign that democracy remains under attack everywhere.

AI offers a wide range of opportunities for hostile influence campaigns

While the true potential of AI is still being fiercely debated, it is already strongly integrated as part of information warfare. Artificial intelligence chatbots, such as ChatGPT, can produce text at an amazing speed in various languages and at an extremely low cost.

AI-produced deepfakes are often used together with textual content. Alarming observations have been reported around the world concerning fake audio and video clips circulating on social media. Ahead of the parliamentary election in the Slovak Republic, for example, the far-right Republika party released videos in which the leader of the Progressive Slovakia party, Michal Šimečka, and the President of the Slovak Republic, Zuzana Čaputová, appeared to announce that they would be voting for the extreme right. Another deepfake was shared in which Šimečka seemed to claim he was buying votes. During the election, pro-Russian material was shared actively on Slovak social media, an effort that is suspected to be part of a wider Russian influence campaign.

Deepfakes were also used to interfere with Taiwan’s presidential election in January 2024. During the elections, a fake video featuring Taiwan’s Democratic Progressive Party (DPP) candidate, and subsequently elected vice president, Lai Ching-te was distributed in the media. In it, the candidate appears to acknowledge that opposition views are mainstream in the country. Taiwan has accused China of election interference and cognitive warfare.

Only time will tell how widely AI will impact election interference in the future. One thing is for certain: its use in spreading disinformation will continue to increase as technology advances.

Election interference is a global problem and a threat to the world’s remaining democracies on all continents. The particularly fragile state of democracies exposes them to the risk of hostile influence. Those responsible for carrying out such attacks aim to use our emotions to create wider rifts in our society and erode mutual trust in existing institutions. Democracy needs defending against threats lurking in the information environment. No nation or individual is immune to hostile influence campaigns, which will only become more sophisticated and persuasive in the future.

Project researcher and PhD student Dominic Saari is part of the KILPI project, which aims to enhance research and education on cognitive security in Finland.