Is European democracy threatened by AI and disinformation?

By Amel Saebi
Client Services Director and Co-Lead, ICF Next Europe & Asia
Mar 14, 2024

More than 40% of the world's population will be heading to the polls during 2024 and, leading up to those elections, a new analysis says AI-generated disinformation will target voters on a near-daily basis in at least 50 countries.

The 2024 kick-off conference, “Fake It Until You Make It,” hosted by ICF Next in Brussels, brought together some of the world's leading AI and communications experts to discuss the critical issue of AI-driven disinformation in European elections. Beth Tapper, senior content designer and strategist at ICF Next, was joined by panelists Jeroen Cluckers, George Bara, Iain Brown, and Amel Saebi in a discussion about disinformation and the threat to European democracy in 2024—a “super election” year that will see more than 4 billion people cast their ballots in what could be democracy’s biggest event.

The growing challenge of disinformation

According to Cluckers, there is an urgent need for more critical scrutiny of AI-generated content and its impact on democratic processes. As the creative research lead at Immersive Lab and an artistic researcher at MAXlab, he creates deepfakes that demonstrate how digital images can be seamlessly manipulated, even in real-time video and audio, without the need for high-end servers or specialist expertise.

"We're entering an era in which our enemies can make it look like anyone is saying anything at any point in time, even if they would never say those things,” he explains. Regulation is also going to be required to change the underlying algorithmic structures that drive platforms such as YouTube and TikTok, he argues. These can put the user into what Cluckers describes as an “infinite loop” and an “echo chamber” of disinformation that can damage the democratic process.

Disinformation: Past, present, and future

Disinformation is increasing in both quantity and quality, bringing fresh challenges for those who try to counter it. George Bara, founder and chief strategist at Zetta Cloud AI, makes it clear that, given the sheer volume of AI-generated disinformation, automated verification algorithms will be crucial in countering its proliferation. Fortunately, tools now exist that can detect disinformation by using AI to analyze the content itself.

Considering that we can’t just rely on a “Terminator”-style showdown, where AI fights AI, to save democracy, Bara agrees that regulation is needed to make tech companies themselves more accountable for the content they host on their platforms. “As soon as you have foreign interference and bot farms, that's a legal issue. Candidates, their parties, and their campaign messages should be protected from outside interference. It should be straightforward technically to block this and not allow propaganda. The EU should put oversight and accountability on the platforms that distribute content.”

The risks, rewards, and ethics of AI

AI can also bring benefits to the democratic process, as Iain Brown, head of data science for Northern Europe and adjunct professor of marketing data science at the University of Southampton, points out. Increased efficiency and the hyper-personalization of messaging are just two examples. “We can do more with less, and be more efficient and more productive,” he says. “We can be more specific to the individual.”

Of course, hyper-personalization raises ethical issues, too. “You can play to their interests and their opinions, but at the same time, how do you keep that unbiased?” To address this, Brown stresses the importance of strengthening policies to tackle biased content, as well as the important role played by the organizations that are already verifying and authenticating information online.

“We need to hold organizations accountable,” he says. “My belief is that trust is the new currency. We will only work or provide information to organizations that we trust. I'm an AI optimist. If we're proactive, we have the power in our hands to make changes. If we act on those changes, we can make a difference.”

Strategies for countering disinformation

Amel Saebi, co-division leader and client services director at ICF Next, echoes this call for a proactive approach. She offers practical communication strategies to effectively combat disinformation, outlining a two-step approach focusing on first understanding disinformation campaigns and then implementing targeted communication tactics to address them. These tactics stress the importance of simple, consistent messaging, and the role of inoculation in crafting counter-narratives against disinformation.

“The challenge is not really AI,” Saebi says. “It is how we perceive information and how we check sources. There is a big educational job to be done. We should understand the logic of disinformation, how it works, and its different powers. We can shape single-minded and clear messages. We can monitor social media—listening is very important—so that we can build new narratives and provide facts and figures to the audience to help them debunk information as they go.”

A more collaborative approach to challenging disinformation

AI-driven disinformation campaigns present a growing and increasingly complex challenge, underscoring the need for collaboration among all stakeholders to mitigate the impact on democracy and free elections.

As we deal with the ever-evolving landscape of digital content manipulation, we need a concerted effort to maintain the integrity of democratic processes and to combat the proliferation of disinformation.

With its opening event of 2024, ICF Next continues to play its part as a leading EU communication agency. Watch the full “Fake It Until You Make It. Artificial Intelligence: A threat to our democracy and free elections?” event here.

Meet the author
Amel Saebi, Client Services Director and Co-Lead, ICF Next Europe & Asia

Amel is vice-president and head of client relations at ICF Next, with significant experience in implementing complex event projects, campaigns, and FWCs.
