NOS News
Nando Kasteleijn, tech editor
At the beginning of this year, residents of the American state of New Hampshire received a call that seemed to come from President Biden: the voice advised them not to vote in the primary, saying only the actual election in November counted.
Only it wasn’t the president making the call, but an AI voice delivering the message: an example of a deepfake. The calls were traced to a company in Texas, which is now under investigation for illegal activities.
Disinformation has surrounded elections for years, with many past references to Russia. 2024 is an exceptional election year: more than half of the world’s population is going to the polls. Elections are already underway in India, Europe follows early next month, and the US votes this fall.
So the stakes are high, while it is becoming ever easier to create deepfakes thanks to the rise of generative AI (artificial intelligence).
One deepfake is enough
“One audio file generated by AI, a day or two before the election, is enough to influence the vote,” says Tommaso Canetta, who coordinates fact-checking research at the European Digital Media Observatory. It already happened last year in Slovakia, where a fake audio clip was used to put the leader of the liberal party in a bad light.
According to Canetta, audio is currently the most difficult category to detect. In AI-generated images you will often spot (small) flaws. That was clearly visible in a picture of Frans Timmermans that circulated on X last fall: the photo was clearly fake.
AI-generated videos can still be distinguished from the real thing, although Sora, OpenAI’s text-to-video model, may change that. For now, deepfakes are often videos in which the audio is fake and the lips are synced to match.
“Audio deepfakes are the most harmful because the average user cannot easily recognize them, especially without paying close attention to speaking style and grammar,” Canetta says. He emphasizes that there are good ways to identify these deepfakes, but they offer no 100 percent guarantee.
Fake indistinguishable from real: listen to the Joe Biden deepfake here. First you hear Biden’s real voice, then the AI version.
Canetta’s group publishes monthly reports on the number of fact-checks carried out by European fact-checking organizations, and also tracks how many of those concern AI-generated content. In March, 87 of the 1,729 fact-checked items were created with AI, about 5 percent.
According to Canetta, large numbers are not even necessary: either way, voters can be negatively influenced. Tom Dobber, a researcher at the University of Amsterdam, reached a similar conclusion with colleagues after an experiment. They showed a panel a deepfake video of the American Democratic politician Nancy Pelosi, in which she justified the storming of the Capitol.
Democrats were more negative about Pelosi afterward. At the same time, Dobber says it is very difficult to draw a direct link between such a deepfake and an election result.
A limited role
Luc van Bakel, research coordinator at the Flemish broadcaster VRT, expects deepfakes to play a limited role in the European elections in Belgium and the Netherlands. “It’s one more thing on the pile, a new method that has been added.”
Ultimately, disinformation only gains traction when it is widely disseminated, often through social media such as TikTok, Facebook, Instagram and X. “X stands out for a lot of disinformation,” says Canetta. “But I think the other platforms also have plenty of work to do.”
In response, TikTok and YouTube say they remove deepfake videos. TikTok also stresses that it is not always possible to identify material that has been manipulated with AI. Meta (the parent company of Facebook and Instagram) and X did not respond to questions from NOS.
VRT’s Van Bakel also points to an undercurrent that is not publicly visible: private conversations in apps such as WhatsApp. He suspects that video circulates mostly on public social media, while audio spreads more in places where deepfakes are less likely to be noticed.
2024-05-09 15:26:08