Abstract: Mental health disorders such as depression and anxiety are often underdiagnosed because current assessments rely heavily on self-report questionnaires and clinical interviews, which are subjective, time-consuming, and difficult to scale. Recent studies show that eye movement behaviour, such as fixation patterns, saccade dynamics, and gaze allocation to emotional stimuli, can serve as objective digital biomarkers for mental health conditions [1], [2].
This paper presents VISION AND EMOTION, a real-time mental health assessment system that leverages eye-tracking data captured using a standard camera. The system records gaze trajectories while users interact with carefully designed visual tasks (emotionally valenced images, reading tasks, and attention-switching trials) and extracts features such as fixation duration, saccade amplitude, blink rate, and gaze distribution across regions of interest. These features are used to train a machine learning classifier (Support Vector Machine) to distinguish between Normal and At-Risk mental health states.
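
To make the feature-extraction step concrete, the sketch below (Python/NumPy) derives mean fixation duration, mean saccade amplitude, and blink rate from a sampled gaze trajectory using a simple dispersion-threshold heuristic; the function name, thresholds, and reduced feature set are illustrative assumptions, not the system's actual implementation.

import numpy as np

def extract_gaze_features(gaze_xy, timestamps, eye_closed,
                          dispersion_thresh=0.02, min_fix_duration=0.1):
    # gaze_xy    : (N, 2) normalized gaze coordinates per sample
    # timestamps : (N,) sample times in seconds
    # eye_closed : (N,) boolean array, True while a blink is detected
    fixation_durations, saccade_amplitudes = [], []
    start = 0
    for end in range(2, len(gaze_xy) + 1):
        window = gaze_xy[start:end]
        # Dispersion-threshold (I-DT style) fixation detection:
        # grow the window until the gaze points spread too far apart.
        dispersion = (window.max(axis=0) - window.min(axis=0)).sum()
        if dispersion > dispersion_thresh or end == len(gaze_xy):
            duration = timestamps[end - 1] - timestamps[start]
            if duration >= min_fix_duration:
                fixation_durations.append(duration)
                if dispersion > dispersion_thresh:
                    # Rough saccade amplitude: jump from the fixation centroid
                    # to the sample that broke the dispersion threshold.
                    saccade_amplitudes.append(float(np.linalg.norm(
                        gaze_xy[end - 1] - window[:-1].mean(axis=0))))
            start = end - 1

    total_time = max(timestamps[-1] - timestamps[0], 1e-6)
    # Blink rate: count blink onsets (closed following open) per minute.
    blink_onsets = np.sum(np.diff(eye_closed.astype(int)) == 1)
    blink_rate = 60.0 * blink_onsets / total_time

    return np.array([
        np.mean(fixation_durations) if fixation_durations else 0.0,
        np.mean(saccade_amplitudes) if saccade_amplitudes else 0.0,
        blink_rate,
    ])

Gaze distribution across regions of interest would simply be appended as additional dimensions of the same per-session feature vector.
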
The proposed framework is lightweight, non-invasive, and deployable on commodity hardware without dedicated infrared eye trackers. Experimental evaluation demonstrates that the system can achieve promising classification performance with low latency, enabling near real-time feedback suitable for preliminary mental health screening. By combining eye-tracking analytics with machine learning, the system contributes toward scalable, objective, and cost-effective digital mental health tools that can complement traditional clinical assessments [2].
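
As an illustration of the classification stage and its near real-time use, the following scikit-learn sketch trains an RBF-kernel SVM on per-session gaze feature vectors and then classifies a single new session; the data here is synthetic and the hyperparameters are assumptions, since the abstract does not report the SVM configuration.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
# Synthetic stand-in for per-session gaze feature vectors
# (mean fixation duration, mean saccade amplitude, blink rate);
# labels: 0 = Normal, 1 = At-Risk.
X = rng.normal(size=(200, 3))
y = rng.integers(0, 2, size=200)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42)

# Standardize the features, then fit an RBF-kernel SVM.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
clf.fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test),
                            target_names=["Normal", "At-Risk"]))

# Near real-time screening: once trained, a single session's feature
# vector is classified in milliseconds.
session_features = rng.normal(size=(1, 3))
print("At-Risk" if clf.predict(session_features)[0] == 1 else "Normal")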

Keywords: Eye Tracking, Mental Health Assessment, Depression Detection, Gaze Analysis, Digital Biomarkers, Machine Learning, Real-Time Monitoring.



How to Cite:

VIJAYKUMAR MS, NONITA SALDANHA, PREKSHITH S, YASHIKA R, YETHISH, "Vision and Emotion: Leveraging Eye Tracking Data for Mental Health Assessment," International Advanced Research Journal in Science, Engineering and Technology (IARJSET), 2025. DOI: 10.17148/IARJSET.2025.121257
