Alexa will soon be able to mimic anyone on Earth, and that’s very troubling

Amazon Alexa has become part of daily life for many people, making everyday tasks that little bit easier. Recently, Amazon announced a new feature that would allow Alexa to mimic the voices of dead loved ones. The announcement quickly raised concerns, though some responded more positively, wondering which celebrities they could make the device sound like.

Celebrity voices

Spy.com points out that Alexa already has a voice-mimicking feature of sorts. Numerous apps let the assistant speak in the voices of Shaquille O’Neal, Melissa McCarthy, Samuel L. Jackson, Deadpool, and R2-D2. However, those voices exist through licensing deals in which the actors partnered with Amazon and allowed the company to use their voices.

Alexa’s new feature is different: it uses deepfake-style AI to synthesize a voice from a short recording, which means anyone could potentially make anyone say anything. That is a genuinely dangerous prospect.

While the technology is being pitched as a way to mimic the voices of dead loved ones, it will almost certainly be used for more than that. We should assume that copying celebrity voices will become easy, as will mimicking anyone who has left behind a sizable catalogue of recorded speech.

There’s no denying that this will be a very cool feature for Alexa, and it might even convince people to buy more of Amazon’s tech. Admittedly, practical hurdles could still keep it from being viable, but there’s little doubt that people would pay for this feature, especially if it isn’t too expensive.

The problem…

It will be interesting to see whether Amazon and celebrities take advantage of this, assuming they can stop it from being used for scams. Fans can already pay celebrities to record greetings for their loved ones via platforms like Cameo, and we’re sure many stars wouldn’t mind lending their voices this way for the right price.

Also, keep in mind that these celebrity voices could be used to fool people. We’ve already seen numerous cases of scammers impersonating celebrities on Twitter and conning fans out of large sums of money. Only time will tell whether Amazon can launch this feature without handing those scammers a new tool.

It’s not just celebrities

Despite Amazon’s presumably good intentions, there’s a very good chance this Alexa feature could be abused to put ordinary people in danger as well. AI deepfakes are already causing real problems, so we can only imagine the trouble that could follow once the voices of ordinary people can be replicated just as easily.

Whether it’s you, your loved ones, or strangers, it seems anyone’s voice could be copied and used to deceive. Scammers are already hard enough to detect online, and they will be harder still if they can speak with our voices. It’s something Amazon really needs to think through, because this could hurt a lot of people.
