EPIC: Emotion recognition tech violates EU fundamental rights

The Electronic Privacy Information Center (EPIC) has urged the Dutch Data Protection Authority to protect students and employees from the harms of emotion recognition.

The EU AI Act prohibits the placing on the market, putting into service, and use of emotion recognition systems in workplaces and educational institutions, with limited exceptions for certain medical and safety reasons. However, the Dutch data protection authority, the Autoriteit Persoonsgegevens (AP), has opened a consultation requesting feedback on the implementation of this prohibition.

The Washington, DC-based EPIC has urged the AP to define emotion recognition systems broadly and either to allow no exemptions for their use or to construe the medical and safety exemption narrowly. EPIC bases its recommendation on the “complete lack of scientific evidence that these systems work,” the organization writes, and on its position that they “violate” various protections enshrined in the EU Charter of Fundamental Rights and other EU regulations.

EPIC regularly advocates for the protection of civil liberties and privacy rights, with a focus on biometric surveillance. It has previously complained to the FTC about a job application screening tool that used emotion recognition, advised the United States Department of Education on the harms of emotion recognition, and warned the United States Department of Justice about the invasive nature of emotion recognition technologies.

AI-based emotion recognition systems make predictions about an individual’s emotional state from biometric data such as heart rate, skin moisture, voice tone, gestures or facial expressions. However, the science behind “emotion recognition” can barely be construed as science, for the simple reason that inner emotions are very hard to measure objectively from a person’s external features.

For example, a skilled movie actor can read as sad, anguished or extremely happy without genuinely experiencing those emotions. Researchers have found that a given facial expression can convey varying emotional states, and that expressions vary substantially across cultures, situations, and even across people within a single situation.

Therefore, claims of “objectively” assessing emotions are misleading. Furthermore, such technologies can discriminate based on race, gender and disability. Australian researcher and lawyer Natalie Shard recently explained in a piece for The Conversation why she believes the Australian government should adopt specific regulations on the use of such technologies.

Article Topics

AI Act  |  biometrics  |  data protection  |  emotion recognition  |  EPIC  |  expression recognition  |  face biometrics  |  Netherlands  |  regulation
