Attention-Emotion-Fatigue Detection Based On Face Mesh Features Using Deep Learning (SET Teknoloji, 2025)
Öz, Emre; Belkan, Ahmet Emir; Yaman, Arif Emre; Gümüş, Kadir; Karabudak, Alperen; Kayaarma, Selma Yılmazyıldız

Extended Abstract

Research Problem/Questions – Modern applications in work safety, traffic management, and psychotechnical evaluation demand reliable real-time assessment of an individual's cognitive state. Despite significant progress in single-parameter analyses (such as fatigue detection or emotion recognition), a gap remains in developing an integrated system that simultaneously evaluates emotion, fatigue, and attention. This study aims to address this gap. In addition, given the amount of paperwork required to deploy the classical, widely used manual D2 Attention Test [18] for measuring attention levels, this study aims to be among the first to automate that test.

Short Literature Review – Earlier studies show that a large portion of occupational and traffic accidents are related to attention deficit [1], [2], fatigue [3], [4], and heightened emotional states [5]. However, research has typically focused on individual aspects: usually only fatigue [6], [7], only attention [8], or only emotion [9] is analyzed in isolation. While these methods have yielded promising results, they often neglect the interdependencies between cognitive factors. Recent advances in deep learning and computer vision have enabled more nuanced detection capabilities; however, studies that merge these modalities remain scarce.

Methodology – The proposed system is developed in Python using several specialized libraries.

– Fatigue Detection: Mediapipe detects facial landmarks in real time, enabling the extraction of parameters such as EAR (Eye Aspect Ratio), MAR (Mouth Aspect Ratio), MOE (Mouth Opening Extent), and PERCLOS. These parameters are then fed to an LSTM model for fatigue classification.
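The paper does not publish its implementation, but the EAR and PERCLOS features it names can be illustrated with a minimal sketch. The EAR formula below is the standard six-landmark definition; the landmark ordering and the 0.2 closed-eye threshold are illustrative assumptions, not the authors' values:

```python
import numpy as np

def eye_aspect_ratio(eye):
    """EAR from six (x, y) eye landmarks ordered p1..p6 around the eye:
    EAR = (|p2 - p6| + |p3 - p5|) / (2 * |p1 - p4|)."""
    eye = np.asarray(eye, dtype=float)
    v1 = np.linalg.norm(eye[1] - eye[5])   # first upper/lower lid pair
    v2 = np.linalg.norm(eye[2] - eye[4])   # second upper/lower lid pair
    h = np.linalg.norm(eye[0] - eye[3])    # horizontal eye corners
    return (v1 + v2) / (2.0 * h)

def perclos(ear_series, closed_thresh=0.2):
    """PERCLOS: fraction of frames in a window whose EAR falls below a
    closed-eye threshold (0.2 is illustrative; the paper's calibration
    phase would derive a per-person baseline instead)."""
    ear_series = np.asarray(ear_series, dtype=float)
    return float(np.mean(ear_series < closed_thresh))

# A wide-open synthetic eye: EAR = (4 + 4) / (2 * 6) ≈ 0.67
open_eye = [(0, 0), (2, 2), (4, 2), (6, 0), (4, -2), (2, -2)]
```

In the full system, per-frame EAR values like these would be windowed into sequences and fed to the LSTM classifier alongside MAR, MOE, and PERCLOS.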
– Emotion Analysis: Deepface, supported by a convolutional neural network (CNN) pretrained on the FER-2013 dataset and fine-tuned with domain-specific images, identifies subtle facial expressions. In addition, a K-Means clustering algorithm groups each emotion class into intensity levels (low, medium, and high).

– Attention Measurement: The system incorporates the D2 attention test framework, wherein eye-tracking data obtained via Mediapipe is used to assess fixation durations and detect attention lapses.

– User Interface and Calibration: A user-friendly interface is developed with PyQt5. A calibration phase establishes individual baseline metrics to account for inter-subject variability.

The integrated approach employs feature extraction, time-series analysis, and clustering techniques (e.g., K-Means) to quantify and categorize cognitive states reliably.

Results and Conclusions – Experimental evaluations demonstrate that the integrated system achieves approximately 94% accuracy in fatigue detection. With its high accuracy, low computational cost, and person-independent generalization, the system offers a practical and effective solution for real-time applications. The attention measurement module shows stable performance and a high correlation with scores from the classical manual deployment of the test; this is promising, especially considering that this study is among the first to automate it. The emotion analysis module, optimized through hyperparameter tuning and CNN architecture search, reaches a mean average precision (mAP) close to 88%. These results confirm that a multi-faceted approach not only enhances detection accuracy but also provides a comprehensive analysis of an individual's cognitive state. The findings show significant potential for deploying such systems in environments where real-time monitoring is critical for safety and performance improvement.
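The K-Means banding of emotion intensities described in the methodology can be sketched as a one-dimensional clustering of per-frame emotion confidence scores. This is a minimal, dependency-light version (quantile-initialised for determinism); the authors' actual feature space and initialisation are not specified in the abstract:

```python
import numpy as np

def intensity_levels(scores, n_iter=20):
    """Band per-frame emotion confidences (0-1) into low/medium/high
    via a minimal 1-D k-means. Centers are seeded at quantiles, so in
    one dimension they stay ordered low -> high across Lloyd updates."""
    x = np.asarray(scores, dtype=float)
    centers = np.quantile(x, [1 / 6, 1 / 2, 5 / 6])  # one seed per band
    for _ in range(n_iter):
        # assign each score to its nearest center, then re-average
        labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
        centers = np.array([x[labels == k].mean() if np.any(labels == k)
                            else centers[k] for k in range(3)])
    names = ["low", "medium", "high"]
    return [names[k] for k in labels]
```

In the full pipeline, a run of Deepface confidence scores for, say, the "angry" class would be banded this way to report not just the emotion but its intensity.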

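The attention module rests on detecting fixations from Mediapipe gaze samples. The abstract does not name the fixation algorithm; the sketch below uses a dispersion-threshold (I-DT style) approach as one plausible choice, with illustrative thresholds in normalised screen units:

```python
def detect_fixations(samples, max_dispersion=0.03, min_duration=0.1):
    """Dispersion-threshold fixation detection sketch. A fixation is a
    run of (t, x, y) gaze samples whose combined x + y spatial spread
    stays under max_dispersion for at least min_duration seconds.
    Returns (start_time, duration, centroid_x, centroid_y) tuples."""
    fixations = []
    i, n = 0, len(samples)
    while i < n:
        j = i
        xs, ys = [samples[i][1]], [samples[i][2]]
        while j + 1 < n:
            nx, ny = samples[j + 1][1], samples[j + 1][2]
            spread = (max(xs + [nx]) - min(xs + [nx])
                      + max(ys + [ny]) - min(ys + [ny]))
            if spread > max_dispersion:
                break          # next sample would leave the window
            xs.append(nx); ys.append(ny); j += 1
        duration = samples[j][0] - samples[i][0]
        if duration >= min_duration:
            fixations.append((samples[i][0], duration,
                              sum(xs) / len(xs), sum(ys) / len(ys)))
            i = j + 1          # resume after the fixation
        else:
            i += 1             # too short: slide the window forward
    return fixations
```

Fixation durations from such a detector, aligned against D2 target positions, would give the per-item timing data the automated test scores against the manual baseline.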











