Emotion AI: The next step in business communication?

May 25, 2021

Though culture and society continue to embrace the digital era, human nature remains stubbornly unchanged from earlier generations. As a result, there is an imperfect fit between our instinct for human contact and the simulated version of that contact offered by technologies like social media. This gap explains why a Zoom meeting, or a phone call answered by an automated system, so often feels emotionally unsatisfying.

Understanding our common humanity

Our need for socialization is just one among hundreds of human universals – traits observed in every known population, and therefore part of our historical identity. This shared nature also includes elements as diverse as language structures, belief systems, social status, forms of emotional expression, and more.

The reality of human nature puts limits on the kinds of social and technological projects that are likely to succeed. But it also gives innovative businesses a powerful tool, provided they know how to use it. Precisely because our human hardware is so similar across countries and cultures, AI-driven software that can successfully read people’s emotions is likely to be effective everywhere.

Building machines that can learn

This is the theory behind Emotion AI, a concept introduced in a previous InnoHub article under the name ‘affective computing’. In essence, this technology gathers observational data through cameras and microphones, which a central program then combines into a personal profile for each individual customer.

Using this method, the algorithm can recognize your emotions in real time as you browse a store or hold a conversation. The collected data is used to learn more about your mood and your interests, so that the program can adapt its behavior accordingly.
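To make the idea more concrete, here is a minimal, purely illustrative sketch of what such a per-customer profile might look like in code. The class name, fields, and scoring scheme are assumptions made for illustration, not a description of any real product.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class CustomerProfile:
    """A hypothetical per-customer profile built from emotional observations."""
    customer_id: str
    mood: float = 0.0  # rolling estimate: -1.0 (negative) to +1.0 (positive)
    interests: List[str] = field(default_factory=list)

    def record_observation(self, cue_score: float, item: Optional[str] = None) -> None:
        """Blend a new emotional cue into the running mood estimate and
        note any item the customer was looking at when the cue occurred."""
        # Exponential smoothing keeps the estimate responsive but stable.
        self.mood = 0.7 * self.mood + 0.3 * cue_score
        if item and item not in self.interests:
            self.interests.append(item)

# Example: a smile (+0.8) while viewing basketball shoes, then a frown (-0.6)
# at the price tag, as in the store scenario described below.
profile = CustomerProfile(customer_id="visitor-001")
profile.record_observation(0.8, item="basketball shoes")
profile.record_observation(-0.6)
print(round(profile.mood, 2), profile.interests)
```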

To understand how this invention might be used in practice, imagine visiting a clothing store and browsing an interactive digital catalog of their selection. The machine asks how it can help, and you say that you’re interested in shoes. A variety of shoes pops up on the display, and as you browse, a camera watches your eyes as well as your facial expression.

You focus on basketball shoes and start to smile when you see a pair of Air Jordans. Then your eyes move toward the price, and your smile disappears. The machine uses this information to instantly build a customer profile for you. It quickly suggests looking at a pair of last year’s Air Jordans, which are cheaper than the new shoes but very high quality.

The computer chats with you, rapidly interpreting cues in your facial expression as well as your tone of voice. The software even adapts its own emotional register in response to yours, just the way a particularly talented salesperson would talk to you.

You buy the shoes.

Digital maps for an emotional world

Human nature can’t be ignored, but it can be fooled. Though Zoom calls struggle to recreate the experience of real human connection, other forms of media – such as movies and recorded music – can feel fully immersive. As people often make decisions based on emotion, businesses have an understandable interest in using Emotion AI for the purposes of persuasion.

The challenge for software developers is to design systems that can engage people’s emotions effectively by responding appropriately to data collected in real time. There are currently three main methods for processing this data:

Natural language text analysis – This method involves the interpretation of plain text, whether collected in written or verbal format. The algorithm attempts to measure the emotional content of this text for clues about the mindset of the person it is interacting with.

Voice analysis – Through vocal pitch, rhythm, emphasis, and other qualities of real-world speech, this method uses auditory data to determine the sentiment of the speaker.

Facial expression analysis – Using camera data, this method picks up on visible cues in the face to estimate the subject’s current emotional state.

Once the system can follow the emotional experience of its user, the next task is to determine an appropriate response. By modifying both content and tone to match the signals it receives, the Emotion AI algorithm can steer the interaction in the desired direction.
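As a rough illustration of how these pieces might fit together, the sketch below fuses the three kinds of analysis into a single sentiment estimate and picks a response style. The analyzer functions are placeholders standing in for real text, voice, and vision models, and the weights and thresholds are assumptions chosen purely for the example.

```python
from typing import Dict

def analyze_text(utterance: str) -> float:
    # Placeholder: a real system would score the sentiment of the words themselves.
    return 0.2

def analyze_voice(pitch_hz: float, speech_rate: float) -> float:
    # Placeholder: a real system would map pitch, rhythm, and emphasis to sentiment.
    return -0.4

def analyze_face(smile_intensity: float) -> float:
    # Placeholder: a real system would map facial cues from camera data to sentiment.
    return 0.6

def fuse(scores: Dict[str, float], weights: Dict[str, float]) -> float:
    """Weighted average of the per-channel sentiment estimates (-1.0 to +1.0)."""
    return sum(scores[k] * weights[k] for k in scores) / sum(weights.values())

def choose_response(sentiment: float) -> str:
    """Adapt both content and tone to the fused emotional signal."""
    if sentiment > 0.3:
        return "enthusiastic: highlight the item the customer reacted to"
    if sentiment < -0.3:
        return "reassuring: acknowledge the concern and offer an alternative"
    return "neutral: ask a clarifying question"

scores = {
    "text": analyze_text("I'm interested in shoes"),
    "voice": analyze_voice(pitch_hz=180.0, speech_rate=3.2),
    "face": analyze_face(smile_intensity=0.7),
}
weights = {"text": 0.3, "voice": 0.3, "face": 0.4}  # assumed channel weighting
fused = fuse(scores, weights)
print(round(fused, 2), "->", choose_response(fused))
```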

Opportunities and barriers

The value of this technology for businesses is considerable. Emotion AI promises to reduce staffing costs while keeping sales numbers high, and to meet customer expectations of a shopping experience that is simultaneously more customized and more digital.

Yet Emotion AI also faces limitations that are both technical and ethical in nature. At a technical level, people live very complex emotional lives and often express their emotions in different ways. Even under ideal conditions, today’s programs do an imperfect job of analyzing our emotional states. In essence, Emotion AI is a work in progress that will need time before it is accurate enough to function at a high level without human assistance.

Deeper questions also surround the use of this technology. What happens to the personal data that is collected? Can it be sold to insurance companies, for example, which could then use data on your individual personality quirks to increase your rates? And what about issues surrounding consent?

These questions will surely be debated as the technology improves – but for now, businesses from multiple sectors are looking to integrate some form of Emotion AI into their customer communications. By blending the personal and the universal into a unique digital experience, Emotion AI may just be the next logical step in our collective embrace of technology.
