Deepfake attacks trick facial recognition, increasing identity fraud


Facial recognition is a highly controversial technology, especially as companies like Clearview scrape data from social media and sell it to others. However, for those against the tech, there is a way to fool it: deepfake attacks. 

Via The Verge, cybersecurity firm Sensity revealed that deepfake technology is now the leading threat to facial recognition systems. On one hand, this could help people who oppose facial recognition; on the other, it is a tool already being used to commit fraud at scale. 

How deepfake attacks cripple facial recognition


Sensity reports that deepfakes can already help attackers bypass liveness tests, the checks that banks and other institutions use to verify a user's identity. Those checks can now be tricked. 

Using machine learning, Sensity swapped the faces on ID cards as well as in real-time video streams to fool facial recognition. A liveness test may ask the user to look into a device such as a smartphone or CCTV camera to verify themselves; deepfakes can bypass all of these checks. 

Liveness tests are frequently used in Know Your Customer (KYC) checks to stop fraud, so the ease with which they can now be circumvented is alarming. 

“We tested 10 solutions and we found that nine of them were extremely vulnerable to deepfake attacks,” said Sensity COO Francesco Cavalli. “There’s a new generation of AI power that can pose serious threats to companies. Imagine what you can do with fake accounts created with these techniques. And no one is able to detect them.”


Vendors do not care 

Cavalli explained that many of the vendors vulnerable to deepfake attacks simply do not treat the threat as a priority. In fact, upon being told, many vendors were apathetic towards Sensity's warnings. 


“We told them ‘look you’re vulnerable to this kind of attack,’ and they said ‘we do not care,’” he explained. “We decided to publish it because we think, at a corporate level and in general, the public should be aware of these threats.”

Deepfake attacks could become a major issue for banks in the future. Cavalli outlined a number of possible abuses: 

“I can create an account; I can move illegal money into digital bank accounts or crypto wallets. Or maybe I can ask for a mortgage because today online lending companies are competing with one another to issue loans as fast as possible.”

Nevertheless, at this time, banks seem largely unconcerned about the threat. That may change if such an attack ever succeeds.