Emotion recognition in Virtual Reality using sensor fusion with eye tracking

dc.contributor.author: Korkmaz, Meral
dc.contributor.author: Sarikaya, Mehmet Ali
dc.contributor.author: Karakas, Tulay
dc.contributor.author: Yildiz Ozkan, Dilek
dc.contributor.author: Demir, Yüksel
dc.contributor.author: Bilen, Ömer
dc.contributor.author: Ince, Gökhan
dc.date.accessioned: 2026-02-08T15:11:10Z
dc.date.available: 2026-02-08T15:11:10Z
dc.date.issued: 2025
dc.department: Bursa Teknik Üniversitesi
dc.description.abstract: Emotion recognition is an emerging field with applications in healthcare, education, and entertainment. This study integrates Virtual Reality (VR) with multi-sensor fusion to enhance emotion recognition. The research comprises two phases: data collection and analysis/evaluation. Ninety-five participants were exposed to curated audiovisual stimuli designed to elicit a wide range of emotions within an immersive VR environment. VR was chosen for its ability to provide controlled conditions and overcome the limitations of current mobile sensor technologies. Physiological data streams from various sensors were integrated for comprehensive emotional analysis. ElectroEncephaloGraphy (EEG) data revealed brain activity linked to emotional states, while eye tracking data provided insights into gaze direction, pupil dilation, and eye movement—factors correlated with cognitive and emotional processes. Peripheral signals, including heart rate variability, ElectroDermal Activity (EDA), and body temperature, were captured via wearable sensors to enrich the dataset. Machine learning models, such as XGBoost, CatBoost, Multilayer Perceptron, Gradient Boosting, and LightGBM, were employed to predict participants’ emotional states. Evaluation metrics, including accuracy, precision, recall, and F1 scores, demonstrated the robustness and precision of the proposed VR-based multi-sensor fusion approach. This research presents a novel approach to emotion recognition, bridging gaps in traditional methods by integrating VR, multi-sensor fusion, and machine learning. © 2025 Elsevier Ltd
dc.description.sponsorship: Türkiye Bilimsel ve Teknolojik Araştırma Kurumu (TUBITAK), (122K260)
dc.identifier.doi: 10.1016/j.compbiomed.2025.111070
dc.identifier.issn: 0010-4825
dc.identifier.pmid: 40967141
dc.identifier.scopus: 2-s2.0-105016084903
dc.identifier.scopusquality: Q1
dc.identifier.uri: https://doi.org/10.1016/j.compbiomed.2025.111070
dc.identifier.uri: https://hdl.handle.net/20.500.12885/5260
dc.identifier.volume: 197
dc.indekslendigikaynak: Scopus
dc.indekslendigikaynak: PubMed
dc.language.iso: en
dc.publisher: Elsevier Ltd
dc.relation.ispartof: Computers in Biology and Medicine
dc.relation.publicationcategory: Makale - Uluslararası Hakemli Dergi - Kurum Öğretim Elemanı
dc.rights: info:eu-repo/semantics/closedAccess
dc.snmz: Scopus_KA_20260207
dc.subject: EEG
dc.subject: Emotion recognition
dc.subject: Eye tracking
dc.subject: Physiological signals
dc.subject: Sensor fusion
dc.subject: Virtual reality
dc.title: Emotion recognition in Virtual Reality using sensor fusion with eye tracking
dc.type: Article