
UK Data Watchdog Issues Warning on Emotional Detection Tech

UK ICO Promises Biometrics Guidance Document in Spring 2023

The British data watchdog is warning U.K. companies to stay away from emotional recognition technology due to its risk for systemic bias, inaccuracy and discrimination.


Emotional detection technologies "may not work yet, or indeed ever," U.K. Deputy Information Commissioner Stephen Bonner said in a statement.

Artificial intelligence-driven emotional detection that builds on biometric data such as facial recognition already is a multibillion-dollar market expected to reach $43 billion within the next four years.

The Information Commissioner's Office predicts that the commercial sector, in particular, will make greater use of biometric technology within the next two to three years by incorporating behavioral analysis into products. Home internet of things devices may be able to identify users and guests by their voices and offer tailored responses based on their perceived emotional state.

Critics say emotional detection algorithms, especially those that infer emotion from facial expressions, rest on false assumptions about human expressiveness. "How people communicate anger, disgust, fear, happiness, sadness, and surprise varies substantially across cultures, situations, and even across people within a single situation," concluded a 2019 scientific article.

China has emerged as a large marketplace for emotional detection technology. A domestic industry consisting of dozens of companies sells emotional detection tools supposedly capable of flagging students whose attention has wandered, concluded a 2021 report from London-based human rights group Article 19.

Chinese law enforcement researchers tout the value of using artificial intelligence to identify "micro expressions" - fleeting and unconscious facial movements supposedly indicative of true emotions - in order to identify "dangerous people."

The U.K. Information Commissioner's Office says it will release guidance in spring 2023 on corporate use of biometric technology. It flagged emotional detection artificial intelligence as one of four issues the guidance should address. The others are further guidance on data protection compliance issues arising from the use of biometrics; the use of biometrics to classify people, not just identify them; and the pervasive gathering of biometric data without physical contact with a subject.

About the Author

Akshaya Asokan

Senior Correspondent, ISMG

Asokan is a U.K.-based senior correspondent for Information Security Media Group's global news desk. She previously worked with IDG and other publications, reporting on developments in technology, minority rights and education.
