Emotion AI’s risks and rewards: 4 tips to use it responsibly

Over the past two weeks, emotions have run high around the evolution and use of emotion AI, which includes techniques such as voice-based emotion analysis and computer vision-based facial expression detection.

For example, the video conferencing platform Zoom came under scrutiny after saying its sales-targeted products could soon include emotion AI features. The nonprofit advocacy group Fight for the Future published an open letter to the company stating that Zoom’s potential offering would be “a major breach of consumer trust,” “inherently biased” and a “marketing ploy.”

Meanwhile, Intel and Classroom Technologies are working on tools that use AI to detect children’s moods in virtual classrooms. This led to media coverage with unfortunate headlines such as “Emotion-Tracking Software Could Ding Your Kid for Looking Bored in Math Class.”

Finally, the conversational AI company Uniphore, headquartered in Palo Alto, California and in India, is enjoying unicorn status after announcing $400 million in new funding and a $2.5 billion valuation in February. In January 2021, the company acquired Emotion Research Lab, which uses advanced facial emotion recognition and eye-tracking technology to capture and analyze video interactions in real time to boost engagement between people. Last month, it unveiled its Q for Sales solution, which “leverages computer vision, tonal analysis, automatic speech recognition and natural language processing to capture and make recommendations on the full emotional spectrum of sales conversations and boost the performance of sales teams.”

But computer scientist Timnit Gebru, famously fired from Google, who founded the independent Distributed AI Research Institute (DAIR) in December 2021, criticized Uniphore’s claims on Twitter. “The trend of embedding pseudoscience into AI systems is such a big one,” she said.

What does this type of pushback mean for the enterprise? How can organizations calculate the risks and rewards of investing in emotion AI? Experts maintain that the technology can be useful in certain use cases, especially when it comes to helping customers and supporting sellers.

Commitment to transparency is key

But, experts add, investing in emotion AI requires a commitment to transparency. Organizations also need a thorough understanding of what the tools can and cannot do, as well as careful consideration of potential bias, data privacy and ROI.

Today’s evolving emotion AI technologies “seem a bit more aggressive,” admits Annette Zimmermann, a Gartner VP analyst who specializes in emotion AI. “For the enterprise, I think transparency must be a top priority.” In December 2021, Zimmermann published Gartner’s competitive landscape report for the emotion AI space. She noted that with the pandemic, organizations are “trying to add more empathy to customer experiences.”

However, organizations also need to ensure that the technology works and that the system is trained in a way that isn’t biased, she told VentureBeat. “For example, computer vision is very good at detecting clear emotions like happiness and deep despair,” she explained. “But for more subtle things, like irony, or slightly annoyed versus very angry, the model needs to be trained, especially on geographic and ethnic differences.”
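In practice, checks of the kind Zimmermann describes usually start with disaggregated evaluation: scoring the model separately for each demographic or geographic group and comparing the results. Below is a minimal, hypothetical Python sketch; the column names, the `predict` callable and the 5-point gap threshold are illustrative assumptions, not any vendor’s actual pipeline.

```python
# Minimal, hypothetical sketch of a disaggregated evaluation: score an
# emotion classifier separately per demographic group to surface accuracy
# gaps before deployment. Column names and the `predict` callable are
# illustrative assumptions, not any vendor's actual pipeline.
import pandas as pd
from sklearn.metrics import accuracy_score

def evaluate_by_group(df: pd.DataFrame, predict) -> pd.Series:
    """Return per-group accuracy for a labeled evaluation set."""
    scores = {}
    for group, subset in df.groupby("group"):
        preds = predict(subset["image_path"].tolist())
        scores[group] = accuracy_score(subset["label"], preds)
    return pd.Series(scores, name="accuracy")

# Usage (hypothetical): flag groups trailing the best group by >5 points.
# results = evaluate_by_group(eval_df, model.predict)
# print(results[results.max() - results > 0.05])
```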

Emotion AI can be a key differentiator

Zimmermann, who highlighted Uniphore in her competitive landscape report, wrote that the company’s combination of computer vision and voice-based emotion analysis “could be a key differentiator.”

In comments emailed to VentureBeat, Patrick Ehlen, VP of AI at Uniphore, said “it’s important to note that in today’s business world, meeting recordings and conversation intelligence applications have become mainstream.” The company’s goal with Q for Sales, he continued, is to “make virtual meetings more engaging, balanced, interactive and valuable for all parties.”

“We make sure there’s no creepiness,” he added. “We ask for consent before the call starts. We don’t profile people on the call and we don’t do facial ID or facial recognition.” Furthermore, he explained, all participants have the choice to opt in rather than opt out, with the full consent of both parties, at the beginning of each video meeting.

Ehlen also wanted to dispel confusion about whether Uniphore claims to have developed an AI that “detects emotions” or knows something about people’s inner emotional states. That is not what Uniphore claims at all, he said: “On the contrary, using combinations of facial expressions and tone of voice, for example, we read the signals that people sometimes use to communicate their feelings.” For example, he explained, the sentence “Nice day, isn’t it?” “might seem to communicate one thing if you just look at the text, but if it comes with a sarcastic tone of voice and an eye roll, it communicates something else.”
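Ehlen’s sarcasm example is, in effect, an argument for fusing cues across modalities before interpreting any one of them. Here is a minimal, hypothetical sketch of that idea; the scores, weights and linear fusion rule are invented for illustration and are not Uniphore’s method.

```python
# Hypothetical sketch of multimodal cue fusion: the words alone read
# positive, but a sarcastic tone and an eye roll flip the fused reading.
# Scores are in [-1, 1]; all values and weights are invented for
# illustration and are not Uniphore's method.
from dataclasses import dataclass

@dataclass
class Cues:
    text_sentiment: float     # e.g., from an NLP model on the transcript
    voice_tone: float         # e.g., from tonal/prosody analysis
    facial_expression: float  # e.g., an eye roll scored strongly negative

def fuse(cues: Cues, weights=(0.4, 0.3, 0.3)) -> float:
    signals = (cues.text_sentiment, cues.voice_tone, cues.facial_expression)
    return sum(w * s for w, s in zip(weights, signals))

# "Nice day, isn't it?" -- positive words, sarcastic delivery.
utterance = Cues(text_sentiment=0.7, voice_tone=-0.6, facial_expression=-0.8)
print(fuse(utterance))  # -0.14: the delivery outweighs the words
```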

AI-driven emotion analysis is increasingly sophisticated

Sentiment analysis for text and voice has been around for years: whenever you call a customer service line or contact center and hear “this call is being recorded for quality assurance,” for example, you are experiencing what has become increasingly sophisticated AI-driven conversation analytics.
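The text side of such conversation analytics is now commodity tooling. As a rough illustration (not the stack of any vendor mentioned here), a few lines with the Hugging Face `transformers` sentiment pipeline can score transcript snippets:

```python
# Rough illustration of text sentiment scoring on call-transcript snippets,
# using an off-the-shelf Hugging Face pipeline (not any vendor's stack).
from transformers import pipeline

classifier = pipeline("sentiment-analysis")  # downloads a default model

transcript_turns = [
    "I've been on hold for forty minutes.",
    "Thanks, that actually fixed my problem.",
]
for turn, result in zip(transcript_turns, classifier(transcript_turns)):
    print(f"{result['label']:>8} ({result['score']:.2f})  {turn}")
```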

Zimmermann also highlighted Boston-based Cogito in Gartner’s competitive landscape report as “a pioneer in audio-based emotion AI technology, providing call agent support/coaching as well as real-time emotion analytics for stress-level monitoring.” The company first provided AI solutions to the U.S. Department of Veterans Affairs, analyzing the voices of military veterans with PTSD to see if they needed immediate help. Then it moved into the contact center space with an AI-powered sentiment analysis system that analyzes conversations and guides customer service agents in the moment.

“We provide real-time guidance to understand how the call is going and the caller’s psychological state,” says Josh Feast, CEO of Cogito. “For example, what is the experience for the parties on the call? What is the level of fatigue? What is the receptivity or motivation?”

The solution then provides the agent with specific cues, perhaps advising them to adjust the pitch or speed of the conversation. Or it may point out that the other party is distressed. “It provides an opportunity to show a little bit of empathy,” he said.
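A heavily simplified sketch of what such in-the-moment guidance might look like: map audio features computed over the current window to coaching cues. The feature names and thresholds below are invented; a production system like Cogito’s would presumably learn these from data rather than hard-code them.

```python
# Heavily simplified, rule-based sketch of in-the-moment agent guidance:
# map audio features computed over the current window to coaching cues.
# Feature names and thresholds are invented; a production system would
# presumably learn these from data rather than hard-code them.
from typing import Dict, Iterator

def cues_for_window(features: Dict[str, float]) -> Iterator[str]:
    if features["agent_words_per_min"] > 180:
        yield "Slow down: you may be speaking too quickly."
    if features["caller_pitch_variance"] > 0.7:
        yield "Caller sounds distressed: acknowledge their frustration."
    if features["overlap_ratio"] > 0.3:
        yield "Frequent crosstalk: pause and let the caller finish."

window = {"agent_words_per_min": 195, "caller_pitch_variance": 0.8,
          "overlap_ratio": 0.1}
for cue in cues_for_window(window):
    print(cue)
```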

What an Enterprise Needs to Know Before Investing in Emotion AI

  • Pay attention to emotion AI at the C-level.

“Executives need to know that emotion AI has big potential that comes with big responsibilities,” said Theresa Kushner, data and analytics practice lead at NTT DATA Services. “Operationalizing these sophisticated AI algorithms is something that needs attention at the C-level and can’t just be delegated to data science teams or operations staff. They need to understand the level of commitment required to implement and operate controversial technologies such as emotion AI, and to work closely together to make sure it doesn’t get out of hand.”

  • Make sure vendors can demonstrate ROI.

Talk to different vendors and make sure they can really show ROI, Zimmermann said: “You need to understand the benefits of investing in this particular technology. Does it help me increase customer satisfaction? Does it help me increase retention and reduce churn?” Ehlen of Uniphore added that organizations should also look for solutions that can deliver immediate ROI. “Solutions in this area should be able to help enhance human interactions in real time, and then become more intelligent and relevant over time,” he explained.

  • Understand algorithms and data collection.

Questions about data aggregation and integration with other vendor solutions should always be top of mind, Kushner said, adding that organizations should make sure the technology doesn’t violate any of their ethical boundaries, especially when it comes to emotion AI. “Consider asking whether they can explain the AI algorithm that generates this emotional response. What data do they use for the emotion side of emotion AI? How is it collected? What do we collect that could enrich that dataset?” Understanding the technology’s real capabilities and limitations is also important, Ehlen added: “Is it single-mode or multimodal AI? Siloed or fused? This will determine the level of context and accuracy you can ultimately have.”

  • Implement a test-and-learn framework.

These days, emotion AI technology has evolved to the point where organizations are undertaking large-scale projects. “It requires careful consideration of change management, the establishment of a steering committee and, critically, the implementation of some type of test-and-learn framework,” Feast said, which can lead to new use case ideas. “For example, we have clients who tested our technology to provide real-time guidance to agents, but then realized they could also use it to signal when agents are fatigued and need a break.”
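As a sketch of what the measurement half of such a test-and-learn framework could look like, the hypothetical Python below compares one KPI (average handle time) between a pilot group of agents using real-time guidance and a control group; all numbers are invented.

```python
# Hypothetical sketch of the measurement half of a test-and-learn rollout:
# compare one KPI (average handle time, in seconds) between agents piloting
# real-time guidance and a control group. All numbers are invented.
from statistics import mean
from scipy.stats import ttest_ind

control = [412, 388, 455, 430, 401, 467, 390, 444]  # no guidance
pilot = [370, 395, 350, 362, 389, 341, 377, 366]    # with guidance

t_stat, p_value = ttest_ind(pilot, control, equal_var=False)
print(f"control {mean(control):.0f}s, pilot {mean(pilot):.0f}s, p={p_value:.3f}")
# Expand the rollout only if the improvement is both statistically and
# operationally meaningful; otherwise iterate on the guidance and re-test.
```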

Balancing the risks and rewards of emotion AI

According to Gartner’s Zimmermann, emotion AI technology still has a long way to go toward adoption, especially when it comes to Big Tech. “I had assumed, given some of the technology advancements announced by Amazon and some of the discussions Google has had, that many more devices would have this functionality, but they don’t. I think they could do it, technology-wise, but perhaps it’s privacy concerns.”

Enterprise customers also have to weigh the risks and rewards of emotion AI. Kushner points out that businesses may think they want to know how a customer really feels about their interactions with an online call center, and use emotion AI technology to find out. “But if the emotion AI technology doesn’t properly represent the customer’s feelings, and customer support responds in a way that doesn’t match the emotions the customer expressed, you run the risk of losing the customer,” she said.

To maintain the right balance, Ehlen of Uniphore said, vendors and customers need to build trust, which, in turn, is based on open communication and choice. “We’re clear on what our solution can and can’t do,” he said. “We give customers the choice of whether or not to incorporate this tool into their engagements. For those who opt in, we adhere to industry best practices for data privacy and security.”

The bottom line, Feast said, is that to succeed with emotion AI, the enterprise needs to use the technology in a way that works for everyone involved: “In each case, I think organizations need to ask themselves: Is it good for the enterprise? Is it good for employees? Is it good for customers?”

