Emotional Analysis of Candidates During Online Interviews

Open Access
Article
Conference Proceedings
Authors: Alperen Sayar, Tuna Çakar, Tunahan Bozkan, Seyit Ertuğrul, Mert Güvençli

Abstract: Recent empirical findings from related fields including psychology, behavioral sciences, and neuroscience indicate that both emotion and cognition influence decision-making processes and, consequently, the final behavioral outcome. Emotions are largely reflected in facial expressions, which serve as a vital means of communication and are critical for social cognition; in the academic literature this is known as facial action coding. Several AI-based systems analyze facial expressions captured by camera-based systems with respect to the seven basic emotions: happy, sad, angry, disgust, fear, surprise, and neutral. The system we have designed is composed of the following stages: (1) face verification, (2) facial emotion analysis and reporting, and (3) emotion recognition from speech. Users upload an online video in which the participants introduce themselves within a duration of three minutes. In this study, several classification methods were applied during model development, the focus was the emotional analysis of candidates in online interviews, and inferences about their state were drawn from the corresponding face images and voice recordings. The face verification model achieved 98% accuracy. The main target of this paper is the analysis of facial expressions. Distances between facial landmarks are computed from the coordinates of the start and end points of landmark pairs. Face frames were obtained by extracting human faces from the recorded video using the VideoCapture and Haar Cascade functions of the OpenCV library in the Python programming language. The videos contain 24 frames per 1000 milliseconds, and the participant's emotion analysis with respect to facial expressions is reported for every 500-millisecond window throughout the video. Since more than one face may appear in the video, face verification was performed with several algorithms: VGG-Face, Facenet, OpenFace, DeepFace, DeepID, Dlib, and ArcFace. Emotion analysis via facial landmarks was performed on all verified photographs of the participant during the interview. The DeepFace algorithm, which recognizes faces using convolutional neural networks and then analyzes age, gender, race, and emotion, was used to analyze the face frames, classifying emotions into the basic categories. Emotion analysis was performed on all photographs obtained from the verification step; the average mood over the whole interview was computed, the frames with the highest value for each emotion were recorded, and the probability values were extracted for further analyses. Besides these local analyses, global outputs were also produced for the whole video session. The main target has been to introduce different potential features into the feature matrix that could be correlated with the other variables and the labels tagged by the HR expert.
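To make the described pipeline concrete, the sketch below assumes the open-source opencv-python and deepface packages (the libraries named in the abstract); the file names (interview.mp4, reference_face.jpg), the sampling helper, and the aggregation logic are illustrative assumptions, not the authors' exact implementation.

```python
# Minimal sketch of the face-frame extraction, verification, and emotion-analysis
# pipeline described in the abstract. Assumes opencv-python and deepface;
# file names and aggregation details are illustrative only.
import cv2
import numpy as np
from deepface import DeepFace

EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

def analyze_interview(video_path, reference_face_path, step_ms=500):
    """Sample the interview video every step_ms, keep frames whose face matches
    the candidate's reference photo, and accumulate per-window emotion scores."""
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 24.0            # abstract assumes ~24 fps
    step = max(1, int(round(fps * step_ms / 1000.0)))  # frames per 500 ms window

    per_window = []   # local emotion distributions, one row per analyzed window
    idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if idx % step == 0:
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
            for (x, y, w, h) in faces:
                crop = frame[y:y + h, x:x + w]
                # Face verification: keep only the candidate when several
                # people are visible in the frame.
                result = DeepFace.verify(crop, reference_face_path,
                                         model_name="VGG-Face",
                                         enforce_detection=False)
                if not result["verified"]:
                    continue
                analysis = DeepFace.analyze(crop, actions=["emotion"],
                                            enforce_detection=False)
                if isinstance(analysis, list):   # newer deepface versions return a list
                    analysis = analysis[0]
                per_window.append([analysis["emotion"][e] for e in EMOTIONS])
        idx += 1
    cap.release()

    scores = np.array(per_window)
    return {
        "local": scores,                                           # per-500 ms values
        "global_mean": dict(zip(EMOTIONS, scores.mean(axis=0))),   # whole-session average
        "peaks": dict(zip(EMOTIONS, scores.max(axis=0))),          # highest value per emotion
    }
```

The "local" rows correspond to the per-window analyses mentioned in the abstract, while "global_mean" and "peaks" illustrate the whole-session outputs (average mood and highest per-emotion values) that feed the feature matrix.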

Keywords: Emotion Detection, Face Recognition, Deep Learning

DOI: 10.54941/ahfe1003278
