Emotion Recognition using Facial Thermal Images

  • Eom, Jin-Sup (Department of Psychology/Brain Research Institute, Chungnam National University)
  • Sohn, Jin-Hun (Department of Psychology/Brain Research Institute, Chungnam National University)
  • Received : 2012.03.14
  • Accepted : 2012.05.11
  • Published : 2012.06.30

Abstract

The aim of this study is to investigate facial temperature changes induced by facial expression and emotional state in order to recognize a person's emotion from facial thermal images. Background: Facial thermal images have two advantages over visible-light images. First, facial temperature measured by a thermal camera does not depend on skin color, darkness, or lighting conditions. Second, facial thermal images change not only with facial expression but also with emotional state. To our knowledge, no study has concurrently investigated these two sources of facial temperature change. Method: A total of 231 students participated in the experiment. Four kinds of stimuli, inducing anger, fear, boredom, and a neutral state, were presented to the participants, and facial temperatures were measured with an infrared camera. Each stimulus consisted of a baseline period and an emotion period; the baseline period lasted 1 min and the emotion period 1~3 min. In the data analysis, the temperature differences between the baseline and the emotion period were analyzed. The eyes, mouth, and glabella were selected as facial expression features, and the forehead, nose, and cheeks were selected as emotional state features. Results: The temperatures of the eye, mouth, glabella, forehead, and nose areas decreased significantly during the emotional experience, and the changes differed significantly by the kind of emotion. Linear discriminant analysis for emotion recognition yielded a correct classification rate of 62.7% across the four emotions when both facial expression features and emotional state features were used. The accuracy decreased slightly but significantly to 56.7% when only facial expression features were used, and was 40.2% when only emotional state features were used. Conclusion: Facial expression features are essential for emotion recognition, but emotional state features are also important for classifying emotion. Application: The results of this study can be applied to human-computer interaction systems in workplaces or automobiles.
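
To make the analysis pipeline above concrete, the sketch below illustrates the kind of classification step the abstract describes: linear discriminant analysis applied to baseline-to-emotion temperature differences computed per facial region. This is an illustrative reconstruction under stated assumptions, not the authors' code: the data are synthetic, the `simulate_participants` helper is hypothetical, the region groupings are taken from the abstract, and scikit-learn's `LinearDiscriminantAnalysis` with stratified cross-validation stands in for the discriminant analysis actually reported.

```python
# Minimal sketch (not the study's code): LDA on per-region temperature
# differences (emotion period minus baseline), comparing the three feature
# sets discussed in the abstract. All data here are synthetic.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import StratifiedKFold, cross_val_score

rng = np.random.default_rng(0)

# Region groupings as described in the abstract.
EXPRESSION_ROIS = ["eyes", "mouth", "glabella"]   # facial expression features
STATE_ROIS = ["forehead", "nose", "cheeks"]       # emotional state features
ALL_ROIS = EXPRESSION_ROIS + STATE_ROIS
EMOTIONS = ["anger", "fear", "boredom", "neutral"]


def simulate_participants(n=231):
    """Return synthetic temperature-difference features (deg C) and emotion labels."""
    y = rng.integers(len(EMOTIONS), size=n)
    # Class-dependent mean shifts around a general cooling trend (illustrative only).
    class_means = rng.normal(loc=-0.2, scale=0.1, size=(len(EMOTIONS), len(ALL_ROIS)))
    X = class_means[y] + rng.normal(scale=0.15, size=(n, len(ALL_ROIS)))
    return X, y


def cv_accuracy(X, y, roi_names):
    """Cross-validated LDA accuracy using only the named regions as features."""
    cols = [ALL_ROIS.index(name) for name in roi_names]
    cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
    return cross_val_score(LinearDiscriminantAnalysis(), X[:, cols], y, cv=cv).mean()


if __name__ == "__main__":
    X, y = simulate_participants()
    for label, rois in [("expression + state", ALL_ROIS),
                        ("expression only", EXPRESSION_ROIS),
                        ("state only", STATE_ROIS)]:
        print(f"{label:20s} accuracy: {cv_accuracy(X, y, rois):.3f}")
```

On real data, the feature matrix would come from averaging each region's pixel temperatures in the baseline and emotion periods and taking the difference, rather than from simulation; the comparison of the three feature sets then mirrors the 62.7% / 56.7% / 40.2% contrast reported above.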
