A protocol for capturing and statistically analyzing emotional response of a population to beverages and liquefied foods in a sensory evaluation laboratory using automated facial expression analysis software is described.
We demonstrate a method for capturing emotional response to beverages and liquefied foods in a sensory evaluation laboratory using automated facial expression analysis (AFEA) software. Additionally, we demonstrate a method for extracting the relevant emotional data output and plotting the emotional response of a population over a specified time frame. By time-pairing each participant's treatment response to a control stimulus (baseline), the overall emotional response can be quantified over time and across multiple participants. AFEA is a prospective analytical tool for assessing unbiased responses to foods and beverages, although to date most research has focused on beverages. Methodologies and analyses for applying AFEA to beverages and foods have not yet been standardized, and a consistent standard methodology is needed. Optimizing video capture procedures, and the resulting video quality, aids successful collection of emotional responses to foods. Furthermore, the data analysis methodology is novel in extracting the data pertinent to the emotional response. Together, optimized video capture and this data analysis approach will aid in standardizing the protocol for automated facial expression analysis and the interpretation of emotional response data.
Automated facial expression analysis (AFEA) is a prospective analytical tool for characterizing emotional responses to beverages and foods. Emotional analysis can add an extra dimension to existing sensory science methodologies, food evaluation practices, and hedonic scale ratings typically used both in research and industry settings. Emotional analysis could provide an additional metric that reveals a more accurate response to foods and beverages. Hedonic scoring may include participant bias due to failure to record reactions1.
AFEA research has been used in many research applications including computer gaming, user behavior, education/pedagogy, and psychology studies on empathy and deceit. Most food-associated research has focused on characterizing emotional response to food quality and human behavior with food. With the recent trend in gaining insights into food behaviors, a growing body of literature reports use of AFEA for characterizing the human emotional response associated with foods, beverages, and odorants1-12.
AFEA is derived from the Facial Action Coding System (FACS), which discriminates facial movements characterized by action units (AUs) on a 5-point intensity scale13. The FACS approach requires trained review experts and manual coding, demands significant evaluation time, and provides limited data analysis options. AFEA was developed as a rapid evaluation method for determining emotions. AFEA software relies on facial muscular movement, facial databases, and algorithms to characterize the emotional response14-18. The AFEA software used in this study reached a "FACS index of agreement of 0.67 on average on both the Warsaw Set of Emotional Facial Expression Pictures (WSEFEP) and Amsterdam Dynamic Facial Expression Set (ADFES), which is close to a standard agreement of 0.70 for manual coding"19. Universal emotions included in the analysis are happy (positive), sad (negative), disgusted (negative), surprised (positive or negative), angry (negative), scared (negative), and neutral, each on a separate scale of 0 to 1 (0 = not expressed; 1 = fully expressed)20. In addition, psychology literature classifies happy, surprised, and angry as "approach" emotions (toward stimuli) and sad, scared, and disgusted as "withdrawal" emotions (away from aversive stimuli)21.
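The 0-to-1 emotion scales described above lend themselves to simple composite measures. As a minimal illustrative sketch (not part of the AFEA software; the dictionary keys and frame values are hypothetical), the approach/withdrawal grouping could be computed per video frame as:

```python
# Approach/withdrawal grouping of universal emotions per the
# psychology literature cited in the text.
APPROACH = ("happy", "surprised", "angry")
WITHDRAWAL = ("sad", "scared", "disgusted")

def approach_withdrawal(scores):
    """Sum approach and withdrawal emotion intensities (each 0-1)
    for one video frame of AFEA output."""
    return (sum(scores[e] for e in APPROACH),
            sum(scores[e] for e in WITHDRAWAL))

# Hypothetical single-frame scores:
frame = {"happy": 0.8, "sad": 0.05, "disgusted": 0.02,
         "surprised": 0.1, "angry": 0.0, "scared": 0.01, "neutral": 0.3}
a, w = approach_withdrawal(frame)  # a = 0.9, w = 0.08
```

Note that each emotion is scored on its own independent 0-to-1 scale, so these composites are not constrained to sum to 1.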
One limitation of the current AFEA software for characterizing emotions associated with foods is interference from facial movements associated with chewing and swallowing as well as other gross motor motions, such as extreme head movements. The software targets smaller facial muscular motions, relating position and degree of movement, based on over 500 muscle points on the face16,17. Chewing motions interfere with classification of expressions. This limitation may be addressed using liquefied foods. However, other methodology challenges can also decrease video sensitivity and AFEA analysis including data collection environment, technology, researcher instructions, participant behavior, and participant attributes.
A standard methodology has not been developed and verified for optimal video capture and data analysis using AFEA for emotional response to foods and beverages in a sensory evaluation laboratory setting. Many aspects can affect the video capture environment, including lighting, shadowing due to lighting, participant directions, participant behavior, and participant height, as well as camera height, camera angling, and equipment settings. Moreover, data analysis methodologies are inconsistent and lack a standard approach for assessing emotional response. Here, we demonstrate our standard operating procedure for capturing emotional data and processing the data into meaningful results, using beverages (flavored milk, unflavored milk, and unflavored water) for evaluation. To our knowledge, only one peer-reviewed publication, from our lab group, has utilized time series analysis for data interpretation in emotions analysis8; however, the method has been updated for the protocol presented here. Our aim is to develop an improved and consistent methodology to support reproducibility in a sensory evaluation laboratory setting. For demonstration, the objective of the study model is to evaluate whether AFEA can supplement traditional hedonic acceptability assessment of flavored milk, unflavored milk, and unflavored water. The intention of this video protocol is to help establish AFEA methodology, standardize video capture criteria in a sensory evaluation laboratory (sensory booth setting), and illustrate a method for temporal emotional data analysis of a population.
Ethics Statement: This study was pre-approved by Virginia Tech Institutional Review Board (IRB) (IRB 14-229) prior to starting the project.
Caution: Human subject research requires informed consent prior to participation. In addition to IRB approval, consent for use of still or video images is also required prior to releasing any images for print, video, or graphic imaging. Additionally, food allergens are disclosed prior to testing. Participants are asked before the panel starts whether they have any intolerances, allergies, or other concerns.
Note: Exclusion Criteria: Automated facial expression analysis is sensitive to thick-framed glasses, heavily bearded faces, and skin tone. Participants with these attributes are incompatible with software analysis due to an increased risk of failed videos, attributable to the software's inability to find the face.
1. Sample Preparation and Participant Recruitment
2. Preparation of Panel Room for Video Capture
Note: This protocol describes data capture in a sensory evaluation laboratory and is intended to make AFEA data capture workable in a sensory booth setting.
3. Participant Adjustment and Verbal Directions
4. Individual Participant Process for Video Capture
5. Evaluating Automated Facial Expression Analysis Options
Note: Many facial expression analysis software programs exist. Software commands and functions may vary. It is important to follow the manufacturer's user guidelines and reference manual20.
6. Timestamp Participant Videos for Data Analysis
7. Time Series Emotional Analysis
Note: Consider the "baseline" to be the control (i.e., unflavored water in this example). The researcher has the ability to create a different "baseline treatment stimulus" or a "baseline time without stimulus" for paired comparison dependent on the interests of the investigation. The method proposed accounts for a "default" state by using a paired statistical test. In other words, the procedure uses statistical blocking (i.e., a paired test) to adjust for the default appearance of each participant and therefore reduces the variability across participants.
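The time-paired, blocked comparison described above can be sketched as follows. This is an illustrative Python sketch, not the statistical software used in the protocol; the per-frame score vectors and participant data are hypothetical:

```python
from math import sqrt
from statistics import mean

def paired_differences(treatment, baseline):
    """Frame-by-frame difference between a participant's treatment and
    baseline (control) emotion scores, matched on time."""
    return [t - b for t, b in zip(treatment, baseline)]

def paired_t_statistic(diffs):
    """One-sample t statistic on per-participant mean differences,
    i.e., a paired test blocking on participant."""
    n = len(diffs)
    d_bar = mean(diffs)
    s = sqrt(sum((d - d_bar) ** 2 for d in diffs) / (n - 1))
    return d_bar / (s / sqrt(n))

# Hypothetical data: (treatment, baseline) score series per participant.
participants = [
    ([0.4, 0.5, 0.6], [0.1, 0.1, 0.2]),
    ([0.3, 0.2, 0.4], [0.2, 0.2, 0.1]),
    ([0.5, 0.6, 0.5], [0.3, 0.2, 0.3]),
]
mean_diffs = [mean(paired_differences(t, b)) for t, b in participants]
t_stat = paired_t_statistic(mean_diffs)
```

Because each participant serves as their own control, between-participant differences in "default" facial appearance drop out of the differences, which is the variance-reduction point made in the note above.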
The method proposes a standard protocol for AFEA data collection. If the suggested protocol steps are followed, unusable emotional data output (Figure 1) resulting from poor data collection (Figure 2: A; Left Picture) may be limited. Time series analysis cannot be utilized if log files (.txt) predominantly contain "FIT_FAILED" and "FIND_FAILED" entries, as these indicate failed facial modeling (Figure 1). Furthermore, the method includes a protocol for dir...
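As an illustration of the log-file screening mentioned above, the fraction of usable frames in an exported log could be checked as follows. The tab-separated line format shown is an assumption for illustration, not the software's exact export format:

```python
def usable_fraction(log_lines):
    """Fraction of AFEA log lines that carry emotion scores rather
    than 'FIT_FAILED' or 'FIND_FAILED' markers."""
    failures = ("FIT_FAILED", "FIND_FAILED")
    total = len(log_lines)
    bad = sum(1 for line in log_lines if any(f in line for f in failures))
    return (total - bad) / total if total else 0.0

# Hypothetical log excerpt: time, then per-emotion scores (tab-separated).
sample = [
    "00:00.033\t0.62\t0.01\t0.05",
    "00:00.066\tFIT_FAILED",
    "00:00.100\t0.60\t0.02\t0.04",
]
frac = usable_fraction(sample)  # 2 of 3 lines are usable
```

A low usable fraction is a signal to revisit the capture environment (lighting, camera angle, participant position) before attempting time series analysis.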
AFEA application in the literature related to food and beverages is very limited1-11. The application to food is new, creating an opportunity for establishing methodology and data interpretation. Arnade (2013)7 found high variability among individuals' emotional responses to chocolate milk and white milk using area-under-the-curve analysis and analysis of variance. However, even with participant variability, participants generated a happy response longer, while sad and disgusted had shorter time r...
The authors have nothing to disclose.
This project was funded, in part, by ConAgra Foods (Omaha, NE, USA), the Virginia Agricultural Experiment Station, the Hatch Program of the National Institute of Food and Agriculture, U.S. Department of Agriculture, and the Virginia Tech Water INTERface Interdisciplinary Graduate Education Program.
| Name | Company | Catalog Number | Comments |
| --- | --- | --- | --- |
| 2% Reduced Fat Milk | Kroger Brand, Cincinnati, OH or DZA Brands, LLC, Salisbury, NC | na | for solutions |
| Drinking Water | Kroger Brand, Cincinnati, OH | na | for solutions |
| Imitation Clear Vanilla Flavor | Kroger Brand, Cincinnati, OH | na | for solutions |
| Iodized Salt | Kroger Brand, Cincinnati, OH | na | for solutions |
| FaceReader 6 | Noldus Information Technology, Wageningen, The Netherlands | na | For Facial Analysis |
| Sensory Information Management System (SIMS) 2000 | Sensory Computer Systems, Berkeley Heights, NJ | Version 6 | For Sensory Data Capture |
| Rhapsody | Acuity Brands Lighting, Inc., Conyers, GA | na | For Environment Illumination |
| R | R Core Team 2015 | Version 3.1.1 | For Statistical Analysis |
| Microsoft Office | Microsoft | na | For Statistical Analysis |
| JMP | SAS Institute, Cary, NC | Version 9.2 | For Statistical Analysis |
| Media Recorder 2.5 | Noldus Information Technology, Wageningen, The Netherlands | na | For capturing participants' sensory evaluation |
| Axis M1054 Camera | Axis Communications, Lund, Sweden | na | |
| Beverage | na | na | Beverage or soft food for evaluation |