Affective Computing: How Does This Work And Some Use Cases


This post was originally published on EnableX blog page

What did you feel when you saw the price tag of a product in a supermarket? You may not realise it, but machines can read these incredibly nuanced subtleties of human expression, and retailers can use them to their advantage. A device loaded with the right software can predict whether a person is smiling out of frustration or joy. Human-computer interaction has gone beyond the realm of sci-fi fantasy and become reality. The rise of emotionally aware machines has blurred the human-machine divide and is redefining the way people experience technology.

What Is Affective Computing?

Emotion AI, also known as Affective Computing, is about how AI can decode the emotional state of a human being by analysing signals such as facial distortions, head motion, jaw movement and speech patterns. It detects, recognises and emulates human emotions through a trained AI neural network. Without a doubt, humans are still better at analysing and interpreting complex emotional signals. However, the gap is narrowing faster than you might imagine, thanks to advances in big data capability and powerful algorithms.

Understanding Psychological Framework Is Critical

Though technology and AI are essential aspects of Affective Computing, the psychological framework is critical to understanding it. Researchers such as Paul Ekman, Robert Levenson and James Russell proposed credible theories establishing that human emotions can be distinguished by distinct physical signatures. These theories later became the cornerstone of Emotion AI technology. Among them, Ekman's Discrete Model and Russell's dimensional model provided the critical foundation for the development of Affective Computing:

  • Ekman's Discrete Model: Paul Ekman proposed a set of seven categories, six basic emotions (anger, disgust, fear, happiness, sadness and surprise) plus a neutral expression, each produced involuntarily through characteristic changes in facial contours. Ekman's model firmly established a correlation between facial expressions and emotional state.
  • Russell's Dimensional Model For Emotion Classification: If Ekman's model provided the qualitative foundation for Emotion AI, Russell's model laid the quantitative one. It proposes that emotions are distributed across a two-dimensional space defined by arousal and valence. This dimensional distribution makes it possible to gauge both the intensity of engagement, from positive arousal (engagement) to negative arousal (disengagement), and the degree of pleasantness (positive valence) or unpleasantness (negative valence).
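Russell's two-dimensional model lends itself to a very simple sketch: treat an emotion as a point in valence/arousal space and read off a coarse mood from its quadrant. The quadrant labels and ranges below are illustrative assumptions, not taken from any particular Emotion AI product.

```python
def classify_quadrant(valence: float, arousal: float) -> str:
    """Map a (valence, arousal) pair in [-1, 1] x [-1, 1] to a coarse mood label."""
    if valence >= 0 and arousal >= 0:
        return "excited/happy"    # pleasant, high energy
    if valence < 0 and arousal >= 0:
        return "angry/stressed"   # unpleasant, high energy
    if valence < 0:
        return "sad/bored"        # unpleasant, low energy
    return "calm/relaxed"         # pleasant, low energy

print(classify_quadrant(0.8, 0.6))   # -> excited/happy
print(classify_quadrant(-0.7, 0.5))  # -> angry/stressed
```

Real systems output continuous valence and arousal scores rather than hard labels, which is exactly what makes the model quantitative: intensity and pleasantness can be measured, not just named.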

How Emotion AI Technology Works

The human face is a canvas for communicating different emotional expressions. Whether it’s love, anger, joy, surprise, sadness or fear, every human emotion sends facial signals whenever they occur. With the help of Affective Computing technology, these emotional signals can be deciphered precisely and quickly.

Here is how the Emotion AI technology works:

  • A webcam captures micro-level facial expression data, which is held in the device's memory.
  • The computing device gathers cues about emotional state by analysing signals such as head posture, speech patterns and eye movements.
  • These images are analysed using computer vision, big data and AI to extract nine major traits rooted in the models above, such as age, gender, pose, face detection, emotion, arousal, valence, attention and wish.
  • These traits are exposed through a well-designed SDK, and you load the specific API your use case requires. Though the architecture varies across Emotion AI products, the core concept remains the same.
  • Deep learning models run constantly in the backend to produce the necessary predictive output.
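The capture-analyse-predict loop above can be sketched as a minimal skeleton. The model here is a stub, and the type and function names are hypothetical; a real Emotion AI SDK would expose analogous calls backed by a face detector and a trained network.

```python
from dataclasses import dataclass

@dataclass
class EmotionResult:
    emotion: str      # discrete label, Ekman-style (e.g. "happiness")
    valence: float    # pleasantness, Russell-style, in [-1, 1]
    arousal: float    # intensity of engagement, in [-1, 1]

def analyse_frame(frame) -> EmotionResult:
    # A real implementation would run face detection and a deep
    # learning model here; this stub returns a fixed prediction.
    return EmotionResult(emotion="happiness", valence=0.7, arousal=0.4)

def process_stream(frames):
    """Run the per-frame analysis over a sequence of captured frames."""
    return [analyse_frame(f) for f in frames]

# Stand-ins for frames captured from a webcam.
results = process_stream([object(), object()])
print(results[0].emotion)  # -> happiness
```

The point of the sketch is the shape of the pipeline: frames go in, and a combined discrete (Ekman) plus dimensional (Russell) result comes out per frame.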

To train such a system, a specialist labels the emotions in a large number of images (frame by frame in the case of video), and the algorithm is trained to interpret this data accurately. The model's output is then compared with the manual labels, taking error rates into account.
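The evaluation step described above amounts to measuring agreement between the specialist's labels and the model's frame-by-frame predictions. A minimal sketch, with invented data:

```python
# Frame-by-frame labels from a human specialist vs. model predictions.
human_labels = ["happiness", "neutral", "anger", "surprise", "neutral"]
predictions  = ["happiness", "neutral", "sadness", "surprise", "neutral"]

# Share of frames where the model agrees with the manual labelling.
matches = sum(h == p for h, p in zip(human_labels, predictions))
accuracy = matches / len(human_labels)
print(f"agreement: {accuracy:.0%}")  # -> agreement: 80%
```

In practice this comparison uses held-out data the model never saw during training, so the agreement rate estimates how the system will perform on new faces.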

Leveraging Power of Emotion AI: Industry-Specific Use Cases

The entire sphere of Emotion AI is an exciting blend of psychology and technology, and it is producing compelling use cases across a wide gamut of industries, whether retail, finance, healthcare, online gaming, media, marketing or HR and recruitment. According to MarketsandMarkets, the Emotion AI market will be worth $37.1 billion by 2026.

Here are four exciting use cases of Affective Computing:

1. Healthcare

Affective Computing can help doctors devise personalised treatment plans tailored to each patient's requirements. It can be of immense help in mental health, as it can reveal whether a patient is in a stressful or traumatic mental state.

2. Retail & Marketing

According to a Temkin Group research report, consumers are 7.1 times more likely to buy a product when they have a positive emotional association with the brand. Enterprises are leveraging the power of Emotion AI to understand what shoppers feel about their products.

Marketers, in turn, can create more effective campaigns once they understand users' emotional responses and how they feel after watching their ads. These crucial inputs help refine the marketing message.

Download the eBook on how smart retailing with AI and Live Video can drive smarter sales and marketing.

3. Online Education

Identifying learners' emotional responses can have a profound impact in the online education space. Emotion AI can reveal students' interest levels and provide predictive output to help teachers devise and implement better teaching strategies. Beyond this, it can help universities and colleges conduct online exams securely while controlling incidences of cheating and fake attendance.

4. HR & Recruitment

Did you know that global enterprises such as Unilever, Dunkin' Donuts and IBM have already started using emotion recognition technology in hiring? As reported by the Financial Times, Unilever saved more than 50,000 HR work hours in 2019. Recruiters are already using Emotion AI to handle large-scale recruitment drives with ease: it allows them to evaluate the entire pipeline of candidates and weed out unsuitable applicants early in the recruitment process, saving significant productive work hours.

Additionally, HR professionals often carry preconceived notions and unconscious biases related to ethnicity, gender and age; Emotion AI holds the promise of reducing these inherent biases in hiring.

To know more on how Emotion AI can transform recruitment, read here or download the eBook — Recruit with Emotion Recognition AI.

Final Thoughts

Advances in Affective Computing have opened frontiers of immense possibility for humanity. Enterprises recognise the advantage of emotionally intelligent AI systems over traditional AI systems that have only computational and cognitive ability, and such systems could help find solutions faster than ever before. If traditional AI has brought efficiency and speed to operations, leveraging the power of emotional data at scale will be a big opportunity for companies in the future.
