
AI and Facial Recognition: An Attempt to Create Empathetic Systems

Imagine you are in a job interview. As you answer the recruiter’s questions, an artificial intelligence (AI) system scans your face, scoring you for nervousness, empathy, and dependability. It may sound like science fiction, but such systems are increasingly in use, often without people’s knowledge or consent.

Emotion recognition technology (ERT) is in fact a burgeoning multi-billion-dollar industry. It aims to use AI to detect emotions from facial expressions. Yet the science behind emotion recognition is controversial: these systems have biases built into them.
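To make the idea concrete, the final stage of a typical ERT pipeline takes the raw scores a face-image classifier produces for each emotion category and converts them into probabilities, then reports the highest-scoring label. The sketch below illustrates only that last step; the emotion labels and the logit values are illustrative assumptions, not the output of any real vendor’s system.

```python
import math

# Illustrative label set; many ERT products use a similar "basic emotions"
# taxonomy, though the exact categories vary by vendor.
EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise", "neutral"]

def softmax(logits):
    """Convert raw classifier scores into probabilities that sum to 1."""
    m = max(logits)                              # subtract max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def interpret(logits):
    """Map hypothetical classifier logits to a labelled emotion score."""
    probs = softmax(logits)
    scores = dict(zip(EMOTIONS, probs))
    top = max(scores, key=scores.get)            # highest-probability emotion
    return top, scores

# Hypothetical output from a face-image classifier for one frame:
label, scores = interpret([0.2, 0.1, 0.3, 2.5, 0.4, 0.6, 1.0])
print(label)   # the emotion with the largest logit wins
```

Note that everything contestable happens before this step: whether the classifier’s scores actually correspond to what a person feels, and whether they are equally accurate across skin tones, is exactly what the critics discussed below dispute.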

In a 2012 study, Aron K. Barbey et al. found that emotional intelligence and cognitive intelligence share many neural systems that integrate cognitive, social, and affective processes. The study confirms what psychologists have suspected for decades: emotional intelligence and general intelligence are interdependent.

Remember Jarvis from the movie Iron Man, or TARS from the movie Interstellar? Audiences were moved almost to tears when those artificial beings conveyed a sense of love and care.

Many companies use ERT to test customer reactions to their products, from cereal to video games. But it can also be used in situations with much higher stakes: in hiring, in airport security to flag faces as revealing deception or fear, in border control, in policing to identify “dangerous people”, or in education to monitor students’ engagement with their homework.

AI Demonstrating Empathy & Strong Problem-Solving Skills


Artificial intelligence is not only good at problem-solving; it is also beginning to demonstrate something like empathy. The company Cogito, founded by Joshua Feast and Dr. Sandy Pentland, melds machine learning with behavioral adaptation, supported by the latest breakthroughs in behavioral science.

Fortunately, facial recognition technology is receiving public attention. The award-winning film Coded Bias, recently released on Netflix, documents the discovery that many facial recognition technologies fail to accurately detect darker-skinned faces. And the research team managing ImageNet, one of the largest and most important datasets used to train facial recognition, was recently forced to blur 1.5 million images in response to privacy concerns.

With the increasing volume of visual, audio, and text data in commerce, there have been many business applications of emotion AI. For example, Affectiva analyzes viewers’ facial expressions from video recordings while they watch video advertisements, in order to optimize the content design of video ads.

Revelations about algorithmic bias and discriminatory datasets in facial recognition technology have led large technology companies, including Microsoft, Amazon, and IBM, to halt sales. And the technology faces legal challenges regarding its use in policing in the UK. Furthermore, in the EU, a coalition of more than 40 civil society organizations has called for a ban on facial recognition technology entirely.

Lapetus Solutions has developed a model that estimates an individual’s longevity, health status, and disease susceptibility from a face photo. Its technology has been applied in the insurance industry.

Citizen Science Project


ERT has the potential to affect the lives of millions of people, yet there has been little public deliberation about how, and whether, it should be used. This is where the citizen science project comes into play.

On the interactive website (which works best on a laptop, not a phone) you can try out a private and secure ERT for yourself, to see how it scans your face and interprets your emotions. You can also play games comparing human versus AI skills in emotion recognition, and learn about the controversial science of emotion behind ERT.

When it comes to ERT, we need to collectively examine the controversial science of emotion built into these systems and analyze their potential for racial bias. And we need to ask ourselves: even if ERT could accurately read everyone’s inner feelings, do we want such intimate surveillance in our lives? These are questions that require everyone’s deliberation, input, and action.

Most importantly, you can contribute your perspectives and ideas to generate new knowledge about the potential impacts of ERT. As one computer scientist and digital activist has put it: if you have a face, you have a place in the conversation.

By Mayank Vashisht | Technology Journalist | ELE Times
