This study presents a method for estimating users' emotions in Virtual Reality (VR) spaces by collecting eye behaviors. Our method uses eye-related information available from the Head Mounted Display (HMD), including the gaze direction vector, pupil coordinates, pupil diameter, and eyelid opening width, to estimate whether the user is having fun or feeling other emotions. Using the LightGBM algorithm, the estimation achieved an AUC of 0.84 and an accuracy of 0.78.