
    AI and Facial Recognition: An Attempt to Create Empathetic Systems

    Imagine you are in a job interview. As you answer the recruiter’s questions, an artificial intelligence (AI) system scans your face, scoring you for nervousness, empathy, and dependability. It may sound like science fiction, but such systems are increasingly in use, often without people’s knowledge or consent.

    Emotion recognition technology (ERT) is in fact a burgeoning multi-billion-dollar industry. It aims to use AI to detect emotions from facial expressions. Yet the science behind emotion recognition is controversial, and biases are built into the systems themselves.

    In a 2012 study by Aron K. Barbey et al., neuroscientists confirmed that emotional intelligence and cognitive intelligence share many neural systems for integrating cognitive, social, and affective processes. The study supports what psychologists have suspected for decades: emotional intelligence and general intelligence are interdependent.

    Remember Jarvis from the movie Iron Man, or TARS from the movie Interstellar? Audiences were moved when those artificial beings conveyed a sense of love and care.

    Many companies use ERT to test customer reactions to their products, from cereal to video games. But it can also be used in situations with much higher stakes: in hiring, in airport security to flag faces as revealing deception or fear, in border control, in policing to identify “dangerous people,” or in education to monitor students’ engagement with their homework.

    AI Demonstrating Empathy & Strong Problem-Solving Skills


    Artificial intelligence is not only good at problem-solving; some systems are also designed to respond in ways that mimic empathy. The company Cogito, founded by Joshua Feast and Dr. Sandy Pentland, melds machine learning with behavioral adaptation, supported by the latest breakthroughs in behavioral science.

    Fortunately, facial recognition technology is receiving public attention. The award-winning film Coded Bias, recently released on Netflix, documents the discovery that many facial recognition technologies fail to accurately detect darker-skinned faces. And the research team managing ImageNet, one of the largest and most important datasets used to train facial recognition, was recently forced to blur 1.5 million images in response to privacy concerns.

    With the increasing volume of visual, audio, and text data in commerce, there have been many business applications of artificial empathy (AE). For example, Affectiva analyses viewers’ facial expressions from video recordings while they are watching video advertisements in order to optimize the content design of video ads.
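    At their core, systems like the one described above reduce each video frame to per-emotion scores and aggregate them over a clip. The sketch below is a minimal, hypothetical illustration of that scoring step (it is not Affectiva’s actual method or API): raw per-frame logits from some assumed face-analysis model are converted to probabilities with a softmax and averaged across frames to pick a dominant emotion. All names, labels, and numbers here are invented for illustration.

    ```python
    import math

    # Hypothetical emotion labels a toy classifier might output.
    EMOTIONS = ["joy", "surprise", "anger", "sadness", "neutral"]

    def softmax(logits):
        """Convert raw per-emotion scores into a probability distribution."""
        m = max(logits)  # subtract the max for numerical stability
        exps = [math.exp(x - m) for x in logits]
        total = sum(exps)
        return [e / total for e in exps]

    def aggregate_clip(frame_logits):
        """Average per-frame emotion probabilities across a video clip."""
        probs = [softmax(frame) for frame in frame_logits]
        n = len(probs)
        return {
            emotion: sum(p[i] for p in probs) / n
            for i, emotion in enumerate(EMOTIONS)
        }

    # Invented per-frame logits standing in for a face-analysis model's output.
    frames = [
        [2.0, 0.5, -1.0, -0.5, 1.0],
        [1.5, 1.0, -0.5, -1.0, 0.8],
        [2.2, 0.3, -1.2, -0.4, 1.1],
    ]
    scores = aggregate_clip(frames)
    dominant = max(scores, key=scores.get)
    print(dominant)  # the emotion the toy model deems dominant across the clip
    ```

    Even this toy version makes the article’s concern concrete: the output is only as trustworthy as the model that produced the logits, and any bias upstream flows straight into the “dominant emotion” label.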

    Revelations about algorithmic bias and discriminatory datasets in facial recognition technology have led large technology companies, including Microsoft, Amazon, and IBM, to halt sales. And the technology faces legal challenges regarding its use in policing in the UK. Furthermore, in the EU, a coalition of more than 40 civil society organizations has called for a ban on facial recognition technology entirely.

    Lapetus Solutions has developed a model that estimates an individual’s longevity, health status, and disease susceptibility from a face photo. Its technology has been applied in the insurance industry.

    Citizen science project


    ERT has the potential to affect the lives of millions of people, yet there has been little public deliberation about how—and whether—it should be used. This is where the citizen science project comes into play.

    On the interactive website (which works best on a laptop, not a phone), you can try out a private and secure ERT for yourself and see how it scans your face and interprets your emotions. You can also play games comparing human versus AI skills in emotion recognition and learn about the controversial science of emotion behind ERT.

    When it comes to ERT, we need to collectively examine the controversial science of emotion built into these systems and analyze their potential for racial bias. And we need to ask ourselves: even if ERT could accurately read everyone’s inner feelings, do we want such intimate surveillance in our lives? These are questions that require everyone’s deliberation, input, and action.

    Most importantly, you can contribute your perspectives and ideas to generate new knowledge about the potential impacts of ERT. As one computer scientist and digital activist put it: if you have a face, you have a place in the conversation.

    By Mayank Vashisht | Technology Journalist | ELE Times
