Emotional AI and biometric data governance
By Claudia Schettini  |  Sep 08, 2022
Emotional AI has attracted a great deal of both interest and scrutiny, with applications ranging from shaping public opinion and enforcing airport security to predictive policing that aims to prevent crimes before they happen. The ethical concerns these uses raise mean the world must change how it views emotional AI, update the legal framework governing its use, and treat it just as it would other sensitive personal data and privacy rights.

ROME - The public has always been ‘emotional’ in the basic sense that it responds to shifts in public opinion, trends in social behavior, consumption patterns and electoral choices, as well as other expressions of the public will that are in part determined by emotional factors, whether these are seen as residing in the psychological makeup of individuals or in the symbolic content of the media messages to which they are exposed.

Moreover, the place of emotions in civic culture has been changing in recent times, becoming a more visible, explicit, and prominent part of everyday life through a process of emotionalization in political life. Popular and civic cultures, as well as politics, are now deeply emotionalized, and the public presence of emotions has grown with the rise of irrational and sentimental politics, charismatic manipulation, and nationalist and populist sentiments.

Managing such tensions and averting serious conflicts has become a key task for governments, a major part of which is concerned with the emotions triggered or generated by the insecurities and - at least for some - the disappointments of modern life.

However, a question remains: How does the topic of political emotions connect with the field of artificial intelligence (AI)? The answer lies in 'emotional AI,' meaning technologies that use affective computing and AI techniques to learn about and interact with human emotional life, and to regulate and optimize the so-called emotionality of spaces (for example, security screening at an airport).

By looking at two AI devices developed by the United States' Department of Homeland Security to improve US airport security after the 9/11 attacks - the Screening of Passengers by Observation Techniques (SPOT) program and Future Attribute Screening Technology (FAST) - two Critical Security Studies scholars, Frowd and Leite, studied the role of what they
