Facial Emotion Analytics

Facial Emotion Analytics identifies facial expressions in video and labels them with an emotion attribute. Viewed alongside the spoken-word transcript, this analysis provides a holistic view of how a speaker feels.

Restriction: LivingLens Analytics are not available for LivingLens Experience Edition. Contact your Medallia Representative for more information.
Note: Facial Emotion Analytics can be activated for Insights Suite customers. Contact your Medallia Representative for pricing and activation information.
Facial Emotion Analytics is a form of Artificial Intelligence (AI), also referred to as artificial emotional intelligence or Emotion AI. It identifies key landmarks on the human face, such as the corners of the eyebrows, the corners of the mouth, and the tip of the nose. A collection of deep learning algorithms analyzes these regions to classify facial expressions, and combinations of these expressions are mapped to specific emotions.
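The pipeline described above can be sketched in a few lines of Python. This is an illustrative toy only: the expression names and the mapping table below are hypothetical and do not reflect the actual LivingLens model or its training data.

```python
# Hypothetical facial-expression combinations mapped to emotion attributes.
# These names are illustrative, not the product's real feature set.
EXPRESSION_TO_EMOTION = {
    frozenset({"lip_corner_pull", "cheek_raise"}): "Joy",
    frozenset({"lip_corner_pull"}): "Social Smile",
    frozenset({"nose_wrinkle", "upper_lip_raise"}): "Disgust",
    frozenset({"unilateral_lip_corner_pull"}): "Contempt",
    frozenset({"brow_furrow", "eye_narrow"}): "Effortful Thought",
}

def classify_emotion(detected_expressions):
    """Return the emotion whose expression combination best matches the
    detections, or None if no combination is fully present."""
    detected = set(detected_expressions)
    best, best_overlap = None, 0
    for combo, emotion in EXPRESSION_TO_EMOTION.items():
        # Require every expression in the combination to be present,
        # and prefer the combination that matches the most expressions.
        if combo <= detected and len(combo) > best_overlap:
            best, best_overlap = emotion, len(combo)
    return best

print(classify_emotion(["lip_corner_pull", "cheek_raise"]))  # Joy
print(classify_emotion(["lip_corner_pull"]))                 # Social Smile
```

Note how "Joy" and "Social Smile" share an expression: a full smile (lip corners plus cheek raise) outranks the partial match, which mirrors the distinction the attribute table below draws between the two.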

Emotion Attribute

  • Contempt: Shows a smug or dismissive expression.

  • Disgust: Shows acute displeasure.

  • Effortful Thought: Shows an intense thought process; it often occurs when respondents are thinking hard about an answer.

  • Emphasis: A measure of expressiveness, either positive or negative, that demonstrates a level of connection to the subject matter.

  • Expressiveness: Demonstrates a level of engagement with the subject matter, either positive or negative.

  • Joy: Demonstrates happiness; the most positive emotion returned by Facial Emotion Analytics.

  • Social Smile: Not a full smile, but typically a gesture of acceptance.

The analysis is based on the world's largest emotion data repository: 5.7 million faces analyzed across 75 countries.

Accuracy is based on several factors, including facial position, orientation, and lighting in a video. Video quality directly impacts how accurately the technology can identify facial expressions and classify emotions.


Emotion attributes identified in video are displayed in the Facial timeline chart on the Media View page in LivingLens. Click anywhere on the Facial timeline to jump to that moment in the video.

A screenshot of the LivingLens Facial timeline chart. It displays three emotions in a clickable timeline.

Facial Emotion Analytics works whether there is a single face or multiple faces on screen. The software can identify emotions in multiple faces on one screen but does not indicate which face is linked to which emotion.

Each emotion attribute appears as a Cognitive filter in the Media Library.

The Analytics page can display an aggregate view of Facial Emotion Analytics data. The chart lists every emotion attribute in the channel and the number of videos with that identified emotion. See Analytics Module display for more details.
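The aggregate view described above amounts to counting, per channel, how many videos contain each emotion attribute at least once. A minimal sketch, assuming hypothetical video IDs and detection results (not real LivingLens data or API output):

```python
from collections import Counter

# Hypothetical example data: each video's detected emotion attributes.
videos = {
    "video_001": ["Joy", "Social Smile", "Joy"],
    "video_002": ["Disgust", "Effortful Thought"],
    "video_003": ["Joy"],
}

# Count how many videos contain each emotion at least once.
# set() deduplicates within a video so repeated detections count once.
video_counts = Counter(
    emotion for emotions in videos.values() for emotion in set(emotions)
)

print(video_counts["Joy"])      # 2
print(video_counts["Disgust"])  # 1
```

Deduplicating per video (rather than counting every detection) matches the chart's definition: the number of videos with that identified emotion, not the total number of detections.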

Example use cases

  • Compare Facial Emotion Analytics data to the Sentiment Analysis of the spoken word.

  • Identify the videos that display an emotional response most strongly.

  • Compare Facial Emotion Analytics data against other available data. Identify respondents whose data (such as NPS or scoring data) contrasts with their emotional data.
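The last use case can be sketched as a simple cross-check. The records, field names, and thresholds below are hypothetical (standard NPS cut-offs are used for illustration); they are not a LivingLens export format.

```python
# Hypothetical respondent records joining an NPS score with the
# dominant facial emotion detected in that respondent's video.
respondents = [
    {"id": "r1", "nps": 9, "emotion": "Joy"},
    {"id": "r2", "nps": 10, "emotion": "Disgust"},
    {"id": "r3", "nps": 2, "emotion": "Joy"},
]

NEGATIVE_EMOTIONS = {"Disgust", "Contempt"}

def contrasting(records):
    """Flag promoters (NPS >= 9) whose dominant facial emotion is negative,
    and detractors (NPS <= 6) whose dominant facial emotion is Joy."""
    return [
        r["id"] for r in records
        if (r["nps"] >= 9 and r["emotion"] in NEGATIVE_EMOTIONS)
        or (r["nps"] <= 6 and r["emotion"] == "Joy")
    ]

print(contrasting(respondents))  # ['r2', 'r3']
```

Respondents flagged this way (a high score delivered with disgust, or a low score delivered with joy) are often the most informative clips to review alongside the transcript.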