We’re all going to die someday; it’s what makes every life precious. Still, loss is always hard to bear, which is why Amazon has decided to make “memories last” with a controversial new Alexa feature that lets the AI read stories in the voice of those we’ve lost.
On one hand, there is an appeal to hearing a departed loved one’s voice read stories to you. On the other, a feature that can copy voices this way is ripe for abuse. It will be interesting to see how this Alexa feature is used and tweaked in the future.
Making memories last
As reported by Sky News, Amazon is working on a new Alexa feature that replicates the voices of loved ones. Announced at a Las Vegas conference on Wednesday, it was pitched as a way to cope with losing our loved ones during the pandemic.
At the conference, a child asked Alexa to read him The Wizard of Oz in his late grandmother’s voice. As requested, the home assistant switched voices and read the classic tale.
The technology quickly proved controversial, with thousands online crying out against the dystopian tool. After all, reviving someone’s voice via AI is a haunting prospect, and one that can be put to nefarious use.
However, we can’t deny that hearing a loved one’s voice can be very comforting, especially if that person meant a lot to us. We hope the feature is being built with good intentions.
Unfortunately, the sad reality is that bad actors may end up exploiting this technology. We already live in a world where deepfakes are used to fool people, and this new Alexa feature, despite its good intentions, could be abused in the worst kinds of ways.
While Amazon’s senior vice president, Rohit Prasad, has tried to assure people that Alexa won’t be abused by those with bad intentions, it’s hard not to be wary. Prasad did say that the aim of Alexa is “not to be confused with the all-knowing, all-capable, uber artificial general intelligence,” but we doubt that has put any fears to rest.
Overall, it’s a fairly noble gesture from Amazon, but one that bad actors could take advantage of.