Eye movements were recorded with a Positive Science mobile eye tracker sampling at 30 Hz, with a calibrated accuracy of approximately one degree of visual angle. The eye-camera and scene-camera videos were combined in Yarbus software, with eye position estimated by feature detection (an algorithm that tracks an ellipse fitted to the darkest part of the eye video, i.e. the pupil) together with the corneal reflection. Participants completed a 9-point calibration presented on a surface at the same working height at which the experiment took place.
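To make the dark-pupil step concrete, the sketch below shows one common way to fit an ellipse to the darkest region of an eye-camera frame using OpenCV. It is only an illustration of the general approach described above, not the Yarbus implementation: the function name, the threshold value, and the omission of the corneal-reflection step are assumptions for the sake of the example.

```python
# Minimal sketch of dark-pupil detection by ellipse fitting (illustrative only;
# not the Positive Science / Yarbus algorithm). Threshold chosen arbitrarily.
import cv2
import numpy as np

def estimate_pupil_center(eye_frame_gray: np.ndarray, dark_threshold: int = 40):
    """Return (cx, cy) of an ellipse fitted to the darkest blob (assumed pupil),
    or None if no suitable contour is found."""
    # Isolate the darkest pixels; the pupil is normally the darkest region of the eye image.
    _, mask = cv2.threshold(eye_frame_gray, dark_threshold, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    if not contours:
        return None
    # Fit an ellipse to the largest dark blob (fitEllipse needs at least 5 points).
    largest = max(contours, key=cv2.contourArea)
    if len(largest) < 5:
        return None
    (cx, cy), (major_axis, minor_axis), angle = cv2.fitEllipse(largest)
    return cx, cy
```

In a full pipeline this pupil centre would be combined with the corneal-reflection position and the calibration mapping to estimate gaze in scene-camera coordinates.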