Emotion AI’s progress and implications for marketing

Today, I decided to have some fun with AI image generators and asked them to show me different emotions on a person’s face — anger, happiness, disgust and so on. As expected, the results displayed typical expressions. Anger was shown with a furrowed brow and gritted teeth, happiness with a wide smile, and disgust with a crinkled nose.

But as I looked at these images, something didn’t feel right. Are human emotions that predictable?

Research suggests otherwise. Emotions are incredibly nuanced, and the way we express them varies with who we are, where we are from, and what we are going through at that moment. You might smile through fury because you are in a professional setting, or cry from joy. Context shapes how emotions are expressed, and that’s where things get tricky for Emotion AI, a technology that aims to detect and interpret our feelings from facial expressions, voice or text.

What is Emotion AI?

Emotion AI, or Affective Computing, was first conceptualised in the 1990s by MIT professor Rosalind Picard, who argued that for computers to interact naturally with humans, they needed to understand emotions. What started as sentiment analysis (interpreting whether a piece of text is positive or negative) evolved into far more ambitious attempts to read the full range of human emotions using algorithms, cameras and microphones.

Sentiment analysis is a much simpler process. It looks for keywords like ‘happy’ or ‘frustrated’ in text and assigns a positive or negative score. Emotion AI, on the other hand, attempts to dive deeper into facial expressions, vocal tones or physiological signals to assess emotions in real time. While sentiment analysis tells you how someone might feel based on their language, Emotion AI claims to know what someone feels based on their behaviour, a capability that could come in handy in customer service, sales, marketing and more.
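To make the simpler end of that spectrum concrete, here is a minimal sketch of keyword-based sentiment scoring of the kind described above. The word lists and scoring scheme are tiny illustrative assumptions, not any real lexicon or production system:

```python
# Minimal keyword-based sentiment scorer: count positive words,
# subtract negative words. The word sets are hypothetical examples;
# real systems use far larger lexicons or trained models.

POSITIVE_WORDS = {"happy", "great", "love", "excellent", "delighted"}
NEGATIVE_WORDS = {"frustrated", "angry", "terrible", "hate", "disappointed"}

def sentiment_score(text: str) -> int:
    """Return a positive, negative, or zero score for a piece of text."""
    words = text.lower().split()
    score = sum(w in POSITIVE_WORDS for w in words)
    score -= sum(w in NEGATIVE_WORDS for w in words)
    return score

print(sentiment_score("I am happy with the service"))       # 1
print(sentiment_score("I am frustrated and disappointed"))  # -2
```

The sketch makes the gap obvious: a keyword counter can only react to the words people choose, whereas Emotion AI claims to read signals people may not even intend to send.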

The last time Emotion AI became a hot topic was in 2019, when the AI and ML world was focused on computer vision rather than generative art and content. Back then, a team of researchers published a study concluding that human emotions cannot be accurately determined from facial movements alone. The notion that we can teach AI to recognise human feelings by mimicking how people read facial expressions, body language and tone of voice rests on a flawed assumption.

Big tech companies like Microsoft, IBM and Google have all rolled out their own versions of Emotion AI. Microsoft’s Azure Cognitive Services offers emotion detection as part of its computer vision suite, allowing developers to integrate emotion recognition into their apps. Google’s Cloud Vision API can label emotions such as happiness, anger, surprise or sadness in photos, based on facial analysis. IBM’s Watson likewise dives into emotional analysis, using natural language processing to detect emotions in text.
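As a rough illustration of what such APIs expose, here is a sketch using Google’s Cloud Vision face detection, which returns likelihood ratings for joy, sorrow, anger and surprise per detected face. It assumes the google-cloud-vision Python client is installed, application credentials are configured, and a placeholder image file named face.jpg:

```python
# Sketch: reading emotion likelihoods from Google Cloud Vision's
# face detection. Assumes `pip install google-cloud-vision` and
# configured credentials; "face.jpg" is a placeholder file name.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("face.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each rating is a coarse likelihood bucket (VERY_UNLIKELY to
    # VERY_LIKELY) describing a facial configuration, not a probability
    # and not the person's actual inner state.
    for emotion, likelihood in [
        ("joy", face.joy_likelihood),
        ("sorrow", face.sorrow_likelihood),
        ("anger", face.anger_likelihood),
        ("surprise", face.surprise_likelihood),
    ]:
        print(f"{emotion}: {vision.Likelihood(likelihood).name}")
```

Note what the API actually returns: coarse buckets describing how a face is arranged, which is exactly the kind of surface signal the research below calls into question.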

But despite these advanced tools available today, research shows that emotion recognition isn’t as accurate as it seems. A report from the Association for Psychological Science suggests that there’s no universal expression for emotions. Someone from one part of the world might show happiness differently than someone from another. More importantly, many consumers feel that their emotions are overlooked during digital interactions, leading to a gap in empathy between humans and machines.

But businesses are now deploying AI chatbots and assistants to serve consumers, and that demands accuracy. One industry in particular is excited about detecting human emotions: advertising and marketing.

Emotion AI: The marketer’s dream

Imagine a future where an ad campaign can detect your mood and tailor its content to suit it. If you are feeling down, the AI could comfort you; if you are happy, you would get something upbeat and playful. It could change the way you shop and interact with brands.

Brands are already experimenting with this. Unilever has used Affectiva’s facial coding technology to understand how people respond emotionally to its ads, tracking facial expressions to see if viewers feel happiness, disgust or surprise. Similarly, responses to Coca-Cola’s AI ad were monitored with EmotionTrac, a tool that measures emotional responses to media by capturing facial expressions and other biometric data. The insights were aimed at creating personalised, emotion-driven marketing campaigns.

Recent research also explored the potential of AI to detect human emotions in posts on X (formerly Twitter), particularly their influence on donor behaviour toward non-profit organisations. To improve emotion detection, the researchers used a transformer transfer learning model, pre-trained on extensive datasets, to identify specific emotions in tweets. The findings suggested that the emotions expressed in posts about non-profits significantly affected public sentiment and subsequent donation behaviour.
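The paper’s exact model isn’t reproduced here, but the general recipe it describes (a pre-trained transformer fine-tuned to label emotions in short posts) looks roughly like this sketch using Hugging Face’s transformers library. The checkpoint named below is a publicly available emotion classifier standing in for the study’s model, and the sample posts are invented:

```python
# Sketch: transformer-based emotion detection on short posts.
# Assumes `pip install transformers torch`; the checkpoint is a
# public emotion classifier, not the model from the study.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="j-hartmann/emotion-english-distilroberta-base",
    top_k=None,  # return scores for every emotion label
)

posts = [
    "So proud to support this food bank, they change lives every day!",
    "Heartbroken by the stories in this shelter's latest appeal.",
]

for post, scores in zip(posts, classifier(posts)):
    # Pick the highest-scoring emotion label for each post.
    best = max(scores, key=lambda s: s["score"])
    print(f"{best['label']:>8} ({best['score']:.2f})  {post}")
```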

Most recently, Microsoft updated its Copilot AI assistant with a more engaging voice and the ability to analyse web pages in real time, aimed at enhancing user interaction. The update refined Copilot’s tone, making it feel like an active listener, and introduced features like ‘Think Deeper’ for reasoning through decisions and ‘Copilot Vision’ for discussing what users see in their browser.

The potential of Emotion AI in marketing is undeniable. It offers brands a way to connect with consumers on a deeper level. But what happens when AI knows too much?

The challenges of detecting emotions

It’s one thing for an ad to target consumers’ interests based on past purchases; it’s another entirely to target their current emotional vulnerabilities. The ability to tap into people’s feelings could lead to manipulative practices, perhaps even pressuring them into decisions they would not otherwise make.

There’s also the risk that AI simply gets it wrong. Considering AI is still learning to spell, it might misread a consumer’s anger as calmness. Cultural differences in emotional expression also make it hard for AI to interpret feelings universally. One study revealed that Chinese participants tend to focus more on the eyes when interpreting facial expressions, whereas Western Caucasians give more weight to the eyebrows and mouth. According to the researchers, these differences can lead to misread emotional signals in cross-cultural communication. If humans themselves struggle to understand others’ feelings and cues, how can AI detect them accurately?

Then there’s the issue of bias. AI systems are only as good as the data they are trained on, and if that data skews toward certain populations, so will the results. Studies indicate that some emotion AI systems are more likely to assign negative emotions to the faces of people with dark skin tones. This raises concerns about potential harm if these technologies are used in areas like recruitment, performance assessments, medical diagnostics or law enforcement.

And then there’s the data itself. Companies already have access to your personal data and target you through algorithms. Now consumers will also have to worry about their emotional data being sold to third parties so brands can serve ads based on their mood.

While AI can improve customer experiences, it also risks invading our private feelings and capitalising on them for profit. Given that emotions are personal and subjective, should AI be allowed to know so much about our inner world?

Ultimately, Emotion AI requires careful regulation and ethical guidelines to ensure that our emotions remain our own. The European Union’s AI Act, approved in May 2024, prohibits AI that manipulates behaviour and bans emotion recognition in places like workplaces and schools. It allows systems to identify emotional expressions but not to infer underlying emotional states. For example, a call centre manager could act on AI detecting a grumpy tone of voice but cannot assume that the person is actually grumpy.

As the technology develops, are we ready for a world where machines know how we feel, and for how that could change the way we interact with them and with each other? One thing is for sure: AI has a long way to go before it can accurately detect human emotions. After all, as director and producer Robert De Niro once said, “People don’t try to show their feelings, they try to hide them.”

Decode is a weekly series where we will be decoding what’s happening in the world of social media and technology.
