Just two days before the Slovak elections, an audio recording was published on Facebook. It featured two voices: Michal Šimečka, who heads the liberal Progressive Slovakia party, and Monika Tódová, a journalist at the daily Dennik N. The two appeared to be discussing how to rig the election, in part by buying the votes of the country's marginalized Roma minority.
Šimečka and Dennik N. immediately denounced the audio as fake. News agency AFP's fact-checking department said the audio showed signs of manipulation using AI. But the recording was released during a 48-hour moratorium before voting opened, during which the media and politicians are supposed to remain silent. Under Slovak electoral rules, this made the message difficult to widely discredit. And because the post was audio, it exploited a loophole in Meta's manipulated media policy, which states that only fake videos, in which a person has been edited to say words they never spoke, violate its rules.
The election was a close race between two frontrunners with opposing visions for Slovakia. On Sunday, it was announced that the pro-NATO party Progressive Slovakia had lost to SMER, which was campaigning to withdraw military support from its neighbor Ukraine.
Before the vote, EU digital chief Věra Jourová said Slovakia's elections would be a test of how vulnerable European elections are to the "multi-million euro weapon of mass manipulation" that Moscow uses to interfere in them. Now, in the wake of this crisis, countries around the world will look to what happened in Slovakia for clues about the challenges they too might face. Neighboring Poland, which a recent EU study says is particularly at risk of being targeted by disinformation, goes to the polls in two weeks. Next year, elections will be held in the United Kingdom, India, the European Union and the United States. Fact-checkers trying to combat disinformation on social media in Slovakia say their experience shows AI is already advanced enough to disrupt elections, while they lack the tools to fight back.
“We are not as ready as we should be,” says Veronika Hincová Frankovská, project manager at the fact-checking organization Demagog.
During the elections, Hincová Frankovská’s team worked long hours, splitting their time between fact-checking claims made during televised debates and monitoring social media platforms. Demagog is a fact-checking partner for Meta, meaning it works with the social media company to write fact-checking labels regarding suspected misinformation spreading on platforms like Facebook.