Zelensky’s Deepfake Is the Peak of the Misinformation War
How AI can influence a global conflict — and everything else.
A global conflict combined with the impossibility of telling real from fake is a recipe for disaster.
Two weeks ago, Volodymyr Zelensky was making headlines everywhere. But this time, it wasn’t really him. It was a fake video in which Ukraine’s president supposedly appeared declaring he had decided to “return Donbas” and advising civilians to “lay down arms” before the Russian military and “return to [their] families,” as reported by Sky News.
The video was viewed hundreds of thousands of times on YouTube, Facebook, and Twitter before it was taken down under those platforms’ manipulated-media policies. The hackers (no one has claimed authorship) even managed to put it live on the TV channel Ukraine-24. Soon after, the channel posted a message on Facebook saying the video was fake — and a fairly easy one to spot, as the head and body didn’t match in size and the audio was off. Zelensky himself issued a statement on Instagram debunking the deepfake.
This is the first known instance of deepfakes being used in the war between Russia and Ukraine — and could very well be the first in any armed conflict.
We’ve seen other forms of propaganda and false information during this war (e.g. spamming bot accounts or fake news articles from dubious sources). However, deepfakes — which have been at the center of worrying discussions about AI risks in the past — are at the technological forefront of false information, because they threaten one of the sources we’re inherently predisposed to believe: moving faces and talking voices.
Deepfakes: explain like I’m five
What is it?
For those of you who don’t know what a deepfake is, here’s the simplest explanation: a deepfake is a video (or image), built with AI models, that shows a person doing or saying something they never did or said.
A tech-savvy person can easily access deepfake software to modify a real video (e.g. swapping people’s faces) or create a fake one from scratch (e.g. simulating mouth movements that match a prerecorded audio file). Some parts are real, some aren’t. And when merged…