Deepfakes are becoming Russia's most dangerous weapon against Ukraine


During war, disinformation is a powerful tool used to destroy the morale of the opposing force, and in modern times it is easier than ever to create and spread. The ongoing invasion of Ukraine is showing this first-hand as Russia uses AI deepfakes to get into the heads of Ukrainian soldiers.

Russia is making deepfakes of Ukrainian president Volodymyr Zelensky

As reported by The Daily Dot, Russian hackers created a realistic deepfake of Ukrainian president Volodymyr Zelensky. In the video, a Zelensky sound-alike tells Ukrainian soldiers to surrender, while an AI-generated rendering of Zelensky's face mouths the words.

In all honesty, the deepfake of Zelensky isn't particularly convincing. For starters, the stand-in for the Ukrainian president has a noticeably bigger head and blinks far too often. Furthermore, the footage visibly shifts and warps around the face.

However, these issues all come down to time. Professional deepfake creators have finely tuned models that rarely shift (Disney's Luke Skywalker deepfake in The Mandalorian, for example). Rushed AI videos, on the other hand, are riddled with artefacts.
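Those artefacts are also what make rough deepfakes easier to catch. As a minimal illustration only (not how the Zelensky clip was actually analysed), one crude tell is an unnatural blink rate, which can be estimated from per-frame eye landmarks using the well-known eye aspect ratio; the landmark extraction itself is assumed to come from some face-tracking library.

```python
# Minimal sketch: flag an unnatural blink rate from per-frame eye landmarks.
# Assumes a face-tracking library has already produced six (x, y) points per
# eye per frame (the standard eye-aspect-ratio landmark layout).
import numpy as np

def eye_aspect_ratio(eye: np.ndarray) -> float:
    """eye: array of shape (6, 2), landmark points ordered around the eye."""
    vertical = np.linalg.norm(eye[1] - eye[5]) + np.linalg.norm(eye[2] - eye[4])
    horizontal = np.linalg.norm(eye[0] - eye[3])
    return vertical / (2.0 * horizontal)

def blinks_per_minute(eye_frames: list, fps: float,
                      closed_threshold: float = 0.2) -> float:
    """Count closed-to-open transitions and scale to a per-minute rate."""
    closed = [eye_aspect_ratio(eye) < closed_threshold for eye in eye_frames]
    blinks = sum(1 for prev, cur in zip(closed, closed[1:]) if prev and not cur)
    minutes = len(eye_frames) / fps / 60.0
    return blinks / minutes if minutes > 0 else 0.0

# Adults typically blink roughly 15-20 times per minute; a clip far outside
# that range is one weak signal that the footage deserves a closer look.
```

A single heuristic like this proves nothing on its own, but a rushed fake tends to trip several of these simple checks at once.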

Nevertheless, even rushed deepfakes can have an effect. Hackers uploaded the fake Zelensky video to the Ukrainian news website Segodnya alongside text claiming he was surrendering. Because it appeared on a professional news site, some viewers took the video as fact.

Zelensky quickly took to social media to denounce the video as fake. The Ukrainian president responded:

“If I can offer someone to lay down their arms, it's the Russian military. Go home. Because we're home. We are defending our land, our children and our families.”


It was an expected trick

Russia’s use of deepfakes to crush morale had already been anticipated by the Ukrainian government. After all, modern-day Russia is known for its disinformation campaigns and forgeries, making a trick like this not just easy to spot but expected.

Just two weeks ago, the Ukrainian government warned citizens that fake videos would be used to scare them. In a Facebook post, it cautioned:

“Imagine seeing Volodymyr Zelensky on TV making a surrender statement. You see it, you hear it — so it’s true. But it’s not the truth. This is deepfake technology. This will not be a real video, but created with machine learning algorithms. Be aware — this is fake! Its goal is to disorient, sow panic, make citizens disbelieve and incite our troops to retreat. Rest assured: Ukraine will not capitulate!”

Deepfakes are still a new technology, and many people genuinely can't tell the difference between AI generation and reality. That makes them a powerful tool, and one that will likely be used often in disinformation campaigns, especially as the technology becomes more refined.
