FaceReader is the world’s first tool capable of automatically analyzing facial expressions, providing users with an objective assessment of a person’s emotion.

FaceReader 1.0 was released in 2007. Since then, a new FaceReader version has been brought to market every year; the current version, FaceReader 7.0, was released in July 2016. With every purchase of FaceReader you receive a complete software package, with full customer support offered by VicarVision’s partner, Noldus.




FaceReader Clients

FaceReader is used worldwide at more than 500 universities, research institutes, and companies in many markets, such as consumer behavior research, usability studies, psychology, educational research and market research. Please download the client list below for an overview or contact us directly for more information.

Classifications

  • Facial Expressions

    Happy, Sad, Angry, Surprised, Scared, Disgusted, Contempt (new), and Neutral.
  • Valence

    A measure of the attitude of the participant (positive vs negative).
  • Arousal (new)

    A measure of the activity of the participant (active vs inactive).
  • Action Units

    20 of the most common Facial Action Units.
  • Facial States

    Eyes open/closed, Mouth open/closed, Eyebrows lowered/neutral/raised.
  • Global Gaze

    A global gaze direction (left, forward or right) helps to determine attention.
  • Characteristics

    Gender, Age, Ethnicity and the presence of Glasses, a Beard and a Moustache.
  • Head Pose

    Accurate head pose can be determined from the 3D face model.
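
Taken together, these classifications could be thought of as one record per analyzed frame. The sketch below is only an illustration of that idea in Python; the field names and value ranges are assumptions, not FaceReader’s actual export format or API.

```python
from dataclasses import dataclass
from typing import Dict

# Hypothetical container for one analyzed frame; field names and ranges are
# illustrative only and do not reflect FaceReader's actual output format.
@dataclass
class FrameClassification:
    expressions: Dict[str, float]    # e.g. {"Happy": 0.82, "Neutral": 0.10, ...}, values in [0, 1]
    valence: float                   # positive vs. negative attitude, roughly [-1, 1]
    arousal: float                   # active vs. inactive, roughly [0, 1]
    action_units: Dict[str, float]   # e.g. {"AU12": 0.6, ...} for the detected AUs
    facial_states: Dict[str, str]    # e.g. {"Eyes": "open", "Mouth": "closed", "Eyebrows": "raised"}
    gaze: str                        # "left", "forward" or "right"
    characteristics: Dict[str, str]  # e.g. {"Gender": "female", "Glasses": "no"}
    head_pose: Dict[str, float]      # e.g. {"pitch": -4.0, "yaw": 12.5, "roll": 1.2} in degrees
```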

3D Face Modeling

Realtime 3D Modeling

FaceReader uses an advanced 3D face modeling technique with over 500 keypoints. The system models a face in real time, without any manual initialization.

FaceReader Output Visualization

FaceReader offers a wide variety of visualization options to make the data easily accessible to the researcher.

Continuous Expression Intensities

FaceReader outputs the six basic expressions (Happy, Sad, Angry, Surprised, Scared, and Disgusted) plus an extra Neutral state as continuous intensity values between zero and one.

New in FaceReader is the addition of Contempt as a seventh expression.
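
As a minimal illustration of what these continuous intensities look like downstream, the snippet below picks the dominant expression from a made-up set of per-frame values; it is not actual FaceReader output.

```python
# Made-up example intensities for a single frame; each value lies between zero and one.
intensities = {
    "Happy": 0.71, "Sad": 0.03, "Angry": 0.02, "Surprised": 0.10,
    "Scared": 0.01, "Disgusted": 0.02, "Contempt": 0.04, "Neutral": 0.07,
}

# The dominant expression at this moment is simply the one with the highest intensity.
dominant = max(intensities, key=intensities.get)
print(dominant)  # -> "Happy"
```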


Action Unit Detection

The six basic emotions are only a fraction of the possible facial expressions. A widely used method for describing the activation of the individual facial muscles is the Facial Action Coding System (Ekman 2002).

FaceReader can detect the 20 most common Action Units (AUs).
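
To read AU codes alongside FACS (Ekman, Friesen, & Hager, 2002), a lookup from code to muscle-movement name can be handy. The mapping below lists a few well-known FACS Action Units for illustration only; consult the FaceReader documentation for the exact set of 20 AUs it detects.

```python
# A small, illustrative subset of FACS Action Unit names (not FaceReader's full list).
ACTION_UNITS = {
    "AU1": "Inner Brow Raiser",
    "AU2": "Outer Brow Raiser",
    "AU4": "Brow Lowerer",
    "AU6": "Cheek Raiser",
    "AU12": "Lip Corner Puller",
    "AU15": "Lip Corner Depressor",
    "AU25": "Lips Part",
    "AU26": "Jaw Drop",
}

def describe(au_code: str) -> str:
    """Return a readable label for an AU code, e.g. 'AU12 (Lip Corner Puller)'."""
    return f"{au_code} ({ACTION_UNITS.get(au_code, 'unknown')})"

print(describe("AU12"))
```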


Circumplex Model of Affect

The circumplex model of affect describes the distribution of emotions in a 2D circular space, with arousal and valence dimensions.

Circumplex models (Russell 1980) are commonly used to assess liking in marketing, consumer science, and psychology.
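
As a sketch of how an observation is placed in this 2D circular space, the function below converts a valence/arousal pair into an angle, a distance from the neutral centre, and a quadrant label. Scaling both dimensions to [-1, 1] and the quadrant wording are assumptions based on Russell (1980), not FaceReader output.

```python
import math

def circumplex_position(valence: float, arousal: float):
    # Assumes valence and arousal are both scaled to [-1, 1].
    angle = math.degrees(math.atan2(arousal, valence)) % 360  # 0 deg = maximally positive valence
    radius = min(1.0, math.hypot(valence, arousal))           # distance from the neutral centre
    if valence >= 0:
        quadrant = "excited/elated" if arousal >= 0 else "calm/content"
    else:
        quadrant = "tense/distressed" if arousal >= 0 else "bored/depressed"
    return angle, radius, quadrant

print(circumplex_position(0.6, 0.4))  # e.g. a pleasant, fairly active state
```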


Expression Summary

A summary of the expressions during a single analysis can be viewed in an easy-to-understand pie chart showing the overall responses.

Different subparts of the analysis can be selected to view the summary of the expressions.


Characteristics

FaceReader can automatically classify some key characteristics of your participants.


Facial States

FaceReader can automatically classify the state of some key parts of the participant’s face.


Project Analysis




All your participants in one project

In FaceReader you can (re)create your complete experiment, adding all your participants to a single project. The Project Analysis Module allows you to analyze the responses of groups of participants to your stimuli.

Participants can be grouped based on independent variables, such as age, gender, or any manually entered variable.
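
As a rough illustration of this kind of grouping, the snippet below applies pandas to made-up per-participant averages; the column names and values are hypothetical and not a FaceReader export.

```python
import pandas as pd

# Hypothetical per-participant summary data (not an actual FaceReader export).
df = pd.DataFrame({
    "participant": ["P01", "P02", "P03", "P04"],
    "gender":      ["f", "m", "f", "m"],
    "age_group":   ["18-30", "18-30", "31-50", "31-50"],
    "happy_mean":  [0.42, 0.31, 0.55, 0.28],
})

# Group responses by an independent variable, e.g. gender.
print(df.groupby("gender")["happy_mean"].mean())
```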

Statistical Analysis

Using the numerical project analysis window, the significance of differences between groups or between stimuli can be calculated automatically, so you can see at a glance which differences are interesting.
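
The sketch below shows the general idea of such a between-group comparison using an independent-samples t-test in SciPy; the data are made up, and the exact statistical test FaceReader applies may differ.

```python
from scipy import stats

# Made-up mean 'Happy' intensities per participant for two groups.
group_a = [0.42, 0.31, 0.55, 0.28, 0.47]   # e.g. one level of an independent variable
group_b = [0.22, 0.35, 0.19, 0.30, 0.26]   # e.g. the other level

# Independent-samples t-test: a small p-value suggests a reliable group difference.
t_stat, p_value = stats.ttest_ind(group_a, group_b)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```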


Group results can be visualized as interactive bar charts and box plots. Hovering gives detailed information that quantifies the differences.


Temporal Results

Using the temporal project analysis window, you can view your stimuli synced with the average response of your participants.

Temporal Project Analysis: watch your stimuli synced with the responses of a group of participants.
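
The averaging behind such a temporal view can be sketched as follows; the traces and the 30 frames-per-second sampling rate are assumptions for illustration, not FaceReader data.

```python
import numpy as np

# Made-up 'Happy' intensity traces: rows are participants, columns are video frames.
happy_traces = np.array([
    [0.1, 0.2, 0.6, 0.7, 0.3],
    [0.0, 0.1, 0.5, 0.6, 0.2],
    [0.2, 0.3, 0.7, 0.8, 0.4],
])

mean_response = happy_traces.mean(axis=0)            # average intensity per frame across participants
timestamps = np.arange(mean_response.size) / 30.0    # assuming 30 frames per second
print(list(zip(timestamps, mean_response)))
```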


1. Russell, J. A. (1980). A circumplex model of affect. Journal of Personality and Social Psychology, 39(6), 1161.
2. Ekman, P., Friesen, W. V., & Hager, J. C. (2002). Facial Action Coding System: The manual on CD-ROM. Salt Lake City: Network Information Research Co.