AI Deepfakes are being used to apply for remote jobs, warns FBI


AI deepfakes have been around for years, but nefarious uses of the technology are becoming more common. Beyond fake revenge porn and political sabotage, the video AI is now being used for something different: job applications.

Why are people using AI Deepfakes for job applications?

A report by the Federal Bureau of Investigation reveals that deepfake job applications are becoming increasingly common. Applicants are using real-time deepfakes during interviews to sneak their way into jobs.


Unfortunately, these applications do not appear to come from people simply trying anything they can to get hired. Instead, as with most things deepfake-related, they're being used for nefarious purposes.

The FBI explained that there has been a massive increase in reports of both video and audio deepfakes during job interviews. In interviews for remote positions, applicants are stealing the likenesses of others in order to gain access to specific jobs.

The report states that all of the targeted jobs relate to computer technology. The FBI explains:

“Positions identified in these reports include information technology and computer programming, database, and software related job functions. Notably, some reported positions include access to customer PII, financial data, corporate IT databases and/or proprietary information.”


This means the applicants' entire purpose is to gain access to private information. Successful deepfakes could result in fake workers stealing large amounts of data.

Read More: Google bans Deepfake Research over fears of wrongdoing

They’re not perfect

Despite the rough quality of some video calls, deepfake flaws are still noticeable to some interviewers. Problems with lip syncing and generated movement were telltale signs that something was amiss.


“In these interviews, the actions and lip movement of the person seen interviewed on-camera do not completely coordinate with the audio of the person speaking,” the report reads. “At times, actions such as coughing, sneezing, or other auditory actions are not aligned with what is presented visually.”

However, not everyone will be able to recognise these issues, and even those who do may assume they're simply internet-related glitches. After all, most people aren't expecting a deepfake to show up in a job interview.