Connected Magazine

Emotionally charged technology

By Jacob Harris
30/07/2019

Emotion detection technology is already being used in advertising and automotive applications. But what does the tech have to offer the home integration industry? Jacob Harris investigates.

What if your smart home knew when you’d had a bad day and could predict exactly what music, lighting and temperature would make you feel better? What if your fridge could suggest the perfect meal for every occasion, and your car could prompt you when you’re distracted? This scenario may not be as far away as you might think.

Advances in AI have given rise to technologies that can accurately detect and recognise our emotions, and it seems they’re set to proliferate. According to Econsultancy, the market for emotion sensing technology is estimated to grow to $65 billion by 2023. There is already strong demand for the tech, with companies like Disney and Kellogg’s using it to measure and predict audience responses to their films and advertising campaigns.

The technology relies on a variety of emotional ‘tells’ such as pupil dilation, body temperature and even the chemical composition of our breath, which, via an amalgamation of sensors and machine learning, can provide an accurate picture of how we’re feeling. However, facial recognition technology that detects, measures and analyses micro-expressions is far and away the most common method currently in use.
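
To make the idea concrete, here is a minimal sketch of how several such ‘tells’ might be fused into a single estimate of how someone is feeling. The sensor baselines and weights are invented for illustration; a real system would learn them from labelled data.

```python
# Illustrative sketch only: fusing several emotional 'tells' into one
# rough valence score. All baselines and weights are assumptions.

def clamp(x: float) -> float:
    """Clamp a value to [-1, 1]."""
    return max(-1.0, min(1.0, x))

def fuse_signals(pupil_dilation_mm: float,
                 skin_temp_delta_c: float,
                 smile_intensity: float) -> float:
    """Combine raw cues into a valence in [-1, 1] (negative = distress).

    Assumed baselines: ~3.5 mm pupil diameter, 0 degC temperature change,
    smile_intensity in [0, 1] from a facial-expression detector.
    """
    pupil = clamp((pupil_dilation_mm - 3.5) / 1.5)
    temp = clamp(skin_temp_delta_c / 0.5)
    smile = clamp(smile_intensity * 2 - 1)
    # Hand-picked weights purely for illustration; the facial cue dominates,
    # mirroring the article's point that facial analysis is most common.
    return 0.2 * pupil + 0.2 * temp + 0.6 * smile

print(fuse_signals(pupil_dilation_mm=4.2,
                   skin_temp_delta_c=-0.1,
                   smile_intensity=0.8))  # ~0.41: mildly positive
```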

Affectiva, an offshoot of the MIT Media Lab, is leading the charge in the field with patented technology that measures emotional and cognitive states through facial expressions and voice. The company uses deep learning to detect and analyse 20 independent facial expressions that, in varied combinations, map to hundreds of compound expressions and seven emotional states.
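
As a rough illustration of that last step, the sketch below combines per-expression activations into emotion scores with a simple weighted mapping. The expression names, emotion labels and weights are assumptions for the example, not Affectiva’s actual model.

```python
# Toy mapping from facial-expression activations to emotion scores.
# Expressions, emotions and weights are illustrative assumptions.

EMOTION_WEIGHTS = {
    "joy":      {"smile": 0.9, "cheek_raise": 0.1},
    "anger":    {"brow_furrow": 0.7, "lip_press": 0.3},
    "disgust":  {"nose_wrinkle": 0.8, "brow_furrow": 0.2},
    "surprise": {"brow_raise": 0.6, "jaw_drop": 0.4},
}

def score_emotions(activations: dict) -> dict:
    """Combine per-expression activations (0..1) into emotion scores."""
    return {
        emotion: sum(w * activations.get(expr, 0.0)
                     for expr, w in weights.items())
        for emotion, weights in EMOTION_WEIGHTS.items()
    }

frame = {"smile": 0.85, "brow_raise": 0.30}  # one analysed video frame
scores = score_emotions(frame)
print(max(scores, key=scores.get), scores)   # 'joy' dominates here
```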

According to Affectiva, this can be done using any camera. Of course, the accurate analysis of these expressions relies on complex deep learning algorithms and a massive repository of data.

“We have the world’s largest emotion data repository: from 87 different countries, more than 50,000 hours of AV (6.5 million faces analysed, 2 billion frames, more than 1,100 hours of automotive data). It is also always growing: each week we receive more than 200 hours of video data from our in-market partners and more than 100 hours of automotive in-car video data. Emotion AI algorithms need to be trained to analyse these massive amounts of real world data that are collected and annotated every day,” says Affectiva’s director of applied AI, Jay Turcot, during the company’s Technical Deep Dive webcast.

The company is currently focussed on automotive AI such as driver state monitoring. The aim of this tech is to increase driver safety by effectively detecting the emotional and cognitive state of the driver and responding accordingly. However, the potential applications for this tech extend far beyond the automotive industry.

Dolby Laboratories chief scientist Poppy Crum believes we are in the ‘era of the technological empath’, and that emotion sensing tech can help us bridge the emotional and cognitive divide between people by enabling us to better understand what others are feeling. This could translate to a future where tech like augmented reality can ‘sense’ the emotional states of others and inform the wearer. Indeed, Microsoft patented emotion-sensing technology for its HoloLens back in 2015 that enables the wearer to interpret what subjects in their field of view are feeling.

The ‘catch’ is that many people will no doubt feel conflicted about using (or being observed by) products that essentially strip them of the ability to withhold personal information. However, Crum maintains that this is a necessary cost of a deeper, more authentic understanding of one another.

It may be that emotion-sensing tech gains its first major foothold in the privacy of our homes before we see mainstream usage in public spaces. To this end, several research projects are already underway exploring possible applications in home automation.

Eye 2H is a proposed automated smart home control system that detects human emotions through facial analysis. The study, authored by Lim Teck Boon, Mohd Heikal Husin, Zarul Fitri Zaaba and Mohd Azam Osman of the School of Computer Sciences at Universiti Sains Malaysia, posits that tracking and analysing the user’s mood, and using this to inform settings for lighting and other electronic equipment in the house, will improve mood and raise the occupant’s overall standard of living.

Similarly, a Mexican study, Emotional Domotics: Inhabitable Home Automation System for Emotion Modulation Through Facial Analysis, looked at influencing mood through light hue (among other variables), with the aim of developing a control algorithm for living spaces based on the user’s emotional state.
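
In the spirit of those studies, a control loop of that kind might look something like the sketch below. The emotional states, hue values and device calls are all assumptions for illustration; a real system would drive actual lighting and climate APIs.

```python
# Toy emotion-to-environment control loop, in the spirit of the studies
# above. States, hues and settings are illustrative assumptions.

SCENES = {
    "stressed": {"hue_deg": 210, "brightness": 0.35, "temp_c": 22.0},  # calming blue, dim
    "sad":      {"hue_deg": 40,  "brightness": 0.60, "temp_c": 23.0},  # warm amber
    "neutral":  {"hue_deg": 60,  "brightness": 0.80, "temp_c": 21.5},
    "happy":    {"hue_deg": 55,  "brightness": 0.90, "temp_c": 21.0},
}

def apply_scene(state: str) -> None:
    """Push the scene for a detected state to the (stand-in) devices."""
    scene = SCENES.get(state, SCENES["neutral"])
    # A real system would call lighting and HVAC APIs here.
    print(f"{state:>8}: hue={scene['hue_deg']} deg, "
          f"brightness={scene['brightness']:.0%}, temp={scene['temp_c']} C")

# Stand-in for a stream of states from a facial-emotion classifier.
for detected in ["neutral", "stressed", "happy"]:
    apply_scene(detected)
```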

Mood-enhancing homes are an exciting prospect, not least because they bring us back to what should arguably be the primary motivation behind all technological development, especially in the home automation industry: improving human lives, interactions and relationships.

Affectiva emphasises the importance of taking a human-centric approach to AI and affirms that human perception AI will help to bring the focus of AI back to the people who will benefit from it – that creating AI that understands what it is to be human is the only way to build trust in the technology.

“What comes next is we start looking at our motives. We really want to build a multi-modal system capable of combining multiple channels to detect more complex affective states. The crux of that is we really want to build artificial emotional intelligence,” says Turcot.

Affectiva plans to build on what’s already possible with emotion AI and go a step further, replicating the power of human perception in artificial intelligence. It’s an idea that goes to the heart of many of the concerns around AI in general, such as instrumental convergence: if an AI is unable to understand and value what it means to be human, then even seemingly benign goals can have devastating consequences, as demonstrated in Nick Bostrom’s ‘Paperclip Maximiser’ thought experiment. When a system’s hierarchy of values differs fundamentally from ours, its methods of achieving a goal will also differ from ours, as will the outcomes.

There are many important considerations to address in the ethical development and deployment of AI systems, especially those designed to interact with humans. For now, we can look forward to home control systems that ‘know’ how we’re feeling, and how to make us feel better. And a better understanding of what makes us feel better may in turn help us navigate the more complex steps as we come to them.
