Deepfakes are becoming Russia's most dangerous weapon against Ukraine

During war, disinformation is a powerful tool, used to destroy the morale of the opposing force. In modern times, disinformation is easier than ever to create and spread. The ongoing invasion of Ukraine is seeing this first hand as Russia uses AI deepfakes to get into the heads of soldiers.

Russia is making deepfakes of Ukrainian president Volodymyr Zelensky

As reported by The Daily Dot, Russian hackers created a realistic deepfake of Ukrainian president Volodymyr Zelensky. In the video, a Zelensky sound-alike tells Ukrainian soldiers to surrender, paired with AI-generated footage of Zelensky's face mouthing the words.

In honesty, the deepfake of Zelensky wasn't particularly convincing. For starters, the stand-in for the Ukrainian president has a noticeably larger head and blinks far too frequently. Furthermore, the footage shifts and warps visibly.

However, these issues all come down to time. Professional deepfake creators have finely tuned models that rarely shift (for example, Disney's Luke Skywalker deepfake in The Mandalorian). Rushed AI videos, on the other hand, show a multitude of issues.
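Tells like the unnaturally frequent blinking described above are exactly what simple heuristic detectors key on. As a minimal sketch (not how any specific detector actually works): assuming per-frame "eye openness" scores have already been extracted by a face-landmark model, a blink rate far outside the normal human range can flag a clip for closer inspection. All function names and thresholds here are illustrative.

```python
# Sketch of a blink-rate heuristic for flagging possible deepfakes.
# Assumes per-frame eye-openness scores (0.0 = closed, 1.0 = open)
# already extracted by a face-landmark model. Thresholds are illustrative.

def count_blinks(openness, threshold=0.2):
    """Count blinks: transitions from below-threshold (closed) back to open."""
    blinks = 0
    closed = False
    for value in openness:
        if value < threshold:
            closed = True
        elif closed:
            blinks += 1
            closed = False
    return blinks

def blinks_per_minute(openness, fps=30.0):
    """Convert a blink count over the clip into a per-minute rate."""
    seconds = len(openness) / fps
    return count_blinks(openness) / seconds * 60.0

def looks_suspicious(openness, fps=30.0, low=8.0, high=30.0):
    """Humans at rest blink very roughly 10-20 times per minute; rates far
    outside that band (like the fake Zelensky's constant blinking) are a flag."""
    rate = blinks_per_minute(openness, fps)
    return rate < low or rate > high
```

A real pipeline would compute the openness signal from eye landmarks frame by frame, and a blink-rate flag alone proves nothing; it is one weak signal among many (face shifting, lighting mismatches, audio artifacts).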

Nevertheless, even rushed deepfakes can have an effect. Hackers uploaded the fake Zelensky video to the Ukrainian news website Segodnya alongside text claiming he was surrendering. Because it appeared on a professional news site, some took the video as fact.

Zelensky quickly took to social media to denounce the video as fake. The Ukrainian president responded:

"If I can offer someone to lay down their arms, it's the Russian military. Go home. Because we're home. We are defending our land, our children and our families."


It was an expected trick

Russia's use of deepfakes to crush morale was already anticipated by the Ukrainian government. After all, modern-day Russia is known for its disinformation campaigns and forgeries, making a trick like this not just easy to spot, but expected.

Just two weeks earlier, the Ukrainian government warned citizens that fake videos would be used to scare them. In a Facebook post, citizens were told:

"Imagine seeing Volodymyr Zelensky on TV making a surrender statement. You see it, you hear it — so it's true. But it's not the truth. This is deepfake technology. This will not be a real video, but created with machine learning algorithms. Be aware — this is fake! His goal is to disorient, sow panic, disbelieve citizens and incite our troops to retreat. Rest assured: Ukraine will not capitulate!"

Deepfakes are still a new technology, and many people genuinely can't tell the difference between AI generation and reality. It's a powerful tool, and one that will likely be used often in disinformation campaigns, especially as the technology becomes more refined.
