
Emotion Recognition Techniques using Facial Images: A Review

Dr. Varun Chand H, Ashley Chrisanthus, Athul Thampi, Dayal S, Dhanup S

Abstract


Emotion recognition has emerged as a critical area of research, with applications in fields such as human-computer interaction, affective computing, and mental health assessment. This paper presents a comprehensive review of state-of-the-art facial emotion recognition technologies. In addition to reviewing unimodal facial emotion recognition approaches, this paper explores the integration of EEG signals with facial expressions to improve the accuracy and robustness of emotion recognition systems. The aim of this study is to analyze and compare the different approaches employed for facial emotion recognition, highlighting their strengths and limitations.
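
As an illustration of how EEG signals and facial expressions can be combined, the sketch below shows a simple decision-level (late) fusion scheme in Python. It is not taken from any of the reviewed systems: the emotion label set, the fixed fusion weight, and the example probabilities are assumptions chosen only to make the idea concrete.

# Minimal, illustrative sketch of decision-level (late) fusion of a facial-expression
# classifier and an EEG classifier. Assumes both models already output class
# probabilities over the same (hypothetical) set of emotion labels.
import numpy as np

EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise", "neutral"]

def fuse_predictions(face_probs: np.ndarray,
                     eeg_probs: np.ndarray,
                     face_weight: float = 0.6) -> str:
    # Weighted-sum fusion of per-modality class probabilities.
    # face_probs, eeg_probs: arrays of shape (len(EMOTIONS),) that each sum to 1.
    # face_weight: hypothetical weight favouring the facial modality.
    fused = face_weight * face_probs + (1.0 - face_weight) * eeg_probs
    return EMOTIONS[int(np.argmax(fused))]

# Example with made-up probabilities: the facial model is split between
# happiness and surprise, and the EEG model tips the fused decision.
face = np.array([0.05, 0.02, 0.03, 0.40, 0.05, 0.40, 0.05])
eeg  = np.array([0.05, 0.05, 0.05, 0.50, 0.10, 0.15, 0.10])
print(fuse_predictions(face, eeg))  # -> "happiness"

In practice the fusion weight would be tuned on validation data, and feature-level fusion (concatenating learned facial and EEG representations before classification) is a common alternative to this decision-level scheme.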




