Estimating ‘Happy’ Based on Eye-Behavior Collected from HMD

Mayu Akata, Yoshiki Nishikawa, Toshiya Isomoto, and Buntarou Shizuki

This study presents a method for estimating users' emotions in virtual reality (VR) through eye behaviors collected from a head-mounted display (HMD). The method uses eye-related information available from the HMD, including the gaze direction vector, pupil coordinates, pupil diameter, and eyelid opening width, to estimate whether the user feels happy or another emotion. Using the LightGBM algorithm, the estimation achieved an AUC of 0.84 and an accuracy of 0.78.
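The abstract above lists four eye-signal channels fed to a LightGBM classifier. As a minimal sketch of how such signals might be turned into a fixed-length feature vector per time window, the function below computes per-channel means and standard deviations; the window length, channel shapes, and feature set are assumptions for illustration, not details taken from the paper.

```python
import numpy as np

def extract_eye_features(gaze_dirs, pupil_xy, pupil_diam, eyelid_width):
    """Summarize one time window of HMD eye signals into a feature vector.

    Hypothetical feature set: per-channel mean and standard deviation,
    a common way to prepare time-series eye data for a gradient-boosting
    classifier such as LightGBM.
    """
    gaze_dirs = np.asarray(gaze_dirs, dtype=float)         # (n, 3) gaze direction vectors
    pupil_xy = np.asarray(pupil_xy, dtype=float)           # (n, 2) pupil coordinates
    pupil_diam = np.asarray(pupil_diam, dtype=float)       # (n,) pupil diameter
    eyelid_width = np.asarray(eyelid_width, dtype=float)   # (n,) eyelid opening width

    feats = []
    for channel in (gaze_dirs, pupil_xy,
                    pupil_diam[:, None], eyelid_width[:, None]):
        feats.append(channel.mean(axis=0))
        feats.append(channel.std(axis=0))
    # 3+3 (gaze) + 2+2 (pupil xy) + 1+1 (diameter) + 1+1 (eyelid) = 14 features
    return np.concatenate(feats)
```

Feature vectors from many labeled windows could then be passed to `lightgbm.LGBMClassifier` for binary "happy vs. other" classification and evaluated with ROC AUC and accuracy, mirroring the metrics reported above.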

References
  • Mayu Akata, Yoshiki Nishikawa, Toshiya Isomoto, and Buntarou Shizuki. 2024. Estimating 'Happy' Based on Eye-Behavior Collected from HMD. In Proceedings of the 2024 Symposium on Eye Tracking Research and Applications (ETRA '24), June 4, 2024, Glasgow, UK. ACM, Article 58, pp. 1–2. DOI: https://doi.org/10.1145/3649902.3656364