Dolgikh Anatoly (PhD student at the National Research Nuclear University MEPhI, Moscow)
Radostev Eduard (National Research Nuclear University MEPhI, Moscow)
This paper presents a method for automatically recognizing a user's emotional state from facial-expression parameters (blendshape coefficients) captured by the built-in sensors of a VR headset. Emotions are represented in two ways: as one of seven basic categories and as a point in the continuous three-dimensional Valence–Arousal–Dominance (VAD) space. The proposed approach evaluates two families of models: linear models (multivariate regression and SVM) whose VAD predictions are mapped to the nearest emotion in the VAD space, and direct classifiers (logistic regression and a neural network) trained on discrete emotion classes. Experimental data were collected in VR and comprise 52 facial-expression features. Logistic regression achieved an accuracy of approximately 60% and the neural network 56%, both showing a high correlation between predicted and ground-truth VAD coordinates, whereas the linear methods reached only 30–45%. An ensemble classifier did not improve accuracy. In addition, we developed a neural network that, given target (V, A, D) coordinates, generates a corresponding facial blendshape configuration for controlling a virtual avatar. The results are consistent with current trends (state-of-the-art systems for partially occluded faces typically report around 70% accuracy) and support the practical feasibility of emotion recognition in VR. We discuss typical challenges, notably confusions between closely related emotions, and outline directions for integrating VR-based emotion recognition into virtual-agent systems.
Keywords: emotion recognition, virtual reality, virtual environment, virtual avatars, machine learning, facial expressions.
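For illustration, the mapping step used with the linear models can be sketched as a nearest-neighbor lookup in VAD space. The sketch below assumes Euclidean distance and a VAD space normalized to [-1, 1]; the reference coordinates for the seven basic emotions are placeholder values chosen for demonstration, not the ones used in the paper.

```python
import numpy as np

# Illustrative reference coordinates of the seven basic emotions in the
# Valence-Arousal-Dominance space, each axis normalized to [-1, 1].
# These are placeholder values, NOT the coordinates used in the paper.
EMOTION_VAD = {
    "happiness": ( 0.8,  0.5,  0.4),
    "sadness":   (-0.6, -0.4, -0.4),
    "anger":     (-0.5,  0.6,  0.3),
    "fear":      (-0.6,  0.6, -0.5),
    "disgust":   (-0.6,  0.3,  0.2),
    "surprise":  ( 0.3,  0.7, -0.1),
    "neutral":   ( 0.0,  0.0,  0.0),
}

def nearest_emotion(vad_pred):
    """Map a predicted (V, A, D) point to the closest basic emotion
    by Euclidean distance to the reference coordinates."""
    vad_pred = np.asarray(vad_pred, dtype=float)
    labels = list(EMOTION_VAD)
    refs = np.array([EMOTION_VAD[label] for label in labels])
    dists = np.linalg.norm(refs - vad_pred, axis=1)
    return labels[int(np.argmin(dists))]

# A regression model predicts VAD from the 52 blendshape coefficients;
# its continuous output is then discretized to a categorical label.
print(nearest_emotion((0.7, 0.4, 0.3)))  # -> "happiness"
```

The inverse network that generates a blendshape configuration from target (V, A, D) coordinates could be organized as a small multilayer perceptron, as in the following sketch. The layer sizes and activations are assumptions for illustration, since the abstract does not specify the architecture; the sigmoid output reflects the convention that blendshape coefficients lie in [0, 1].

```python
import torch
import torch.nn as nn

class VADToBlendshapes(nn.Module):
    """Illustrative generator: 3 VAD inputs -> 52 blendshape coefficients."""
    def __init__(self, n_blendshapes=52, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(3, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, n_blendshapes),
            nn.Sigmoid(),  # blendshape coefficients are typically in [0, 1]
        )

    def forward(self, vad):
        return self.net(vad)

model = VADToBlendshapes()
coeffs = model(torch.tensor([[0.7, 0.4, 0.3]]))  # target (V, A, D)
print(coeffs.shape)  # torch.Size([1, 52])
```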
Citation link: Dolgikh A., Radostev E. Emotion Recognition from Facial Expression Parameters in Virtual Reality // Современная наука: актуальные проблемы теории и практики. Серия: Естественные и Технические Науки. 2026. №02. С. 59-66. DOI: 10.37882/2223-2966.2026.02.10
|
|