Zoom Emotion Analyser AI protested by Human Rights Groups


In a bizarre overstep of surveillance, video call company Zoom is working on an AI that will detect the emotions of users. As expected, not many are happy about the in-development Zoom Emotion Analyser AI.

Why are people protesting the Zoom Emotion Analyser?

Announced earlier this year, Zoom’s emotion detection software was aimed at schoolchildren who are still learning remotely. The technology is said to monitor emotions in real time to determine whether a child is angry, upset, confused or otherwise distracted.


After the announcement, 25 human rights organisations banded together to try and stop the Zoom Emotion Analyser. As reported by Protocol, organisations including the American Civil Liberties Union, the Electronic Privacy Information Center and Fight for the Future demanded that Zoom cancel its AI plans.

Activists argue that emotion AI does not account for individual differences, instead grouping everyone together under attributes derived from an established dataset. Such datasets have repeatedly been shown to be biased, frequently producing racist and otherwise harmful outcomes.

“This software is discriminatory, manipulative, potentially dangerous and based on assumptions that all people use the same facial expressions, voice patterns, and body language,” the organisations wrote in an open letter.

“Zoom’s use of this software gives credence to the pseudoscience of emotion analysis which experts agree does not work. Facial expressions can vary significantly and are often disconnected from the emotions underneath such that even humans are often not able to accurately decipher them.”


Past Emotion AI controversies

Zoom’s use of emotion AI technology is not the first implementation of the idea. In fact, emotional datasets have been used for years in tasks such as remote job interviews, and the practice has already seen its fair share of controversy.

For example, a recent BBC Three documentary explored the discriminatory nature of the technology, particularly how it labels the behaviour of neurodivergent and disabled people. In the documentary, autistic people and those with disabilities such as Down syndrome were labelled as poor candidates because of their emotive expressions.

Emotion AI like the Zoom Emotion Analyser could certainly improve some experiences for some children. For many others, however, it will simply make life more difficult, and that’s not hyperbole; it’s already happening.