The wearable frames resemble normal glasses and are 3D printed by an external printing service. The frames contain two eye cameras that image the user's eyes, one scene camera that captures the user's view, and a circuit board with six infrared LEDs and a few resistors. The cameras are connected to a computer with USB cables, one of which also powers the circuit board. Software on the computer computes the user's gaze as a point in the scene camera image.
The software implements advanced gaze tracking algorithms. In brief, our method fits a physical model of the human eye to the captured eye images. This requires detecting the pupil and the LED reflections in the eye images, for which we use various computer vision algorithms and probabilistic methods. Because of the physical model approach, the device can be moved during a recording; this is important especially when making long recordings with children.
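As a minimal illustration of the pupil detection step only (not the full pipeline, which also involves LED reflection detection, probabilistic filtering, and the eye-model fit), the following Python/OpenCV sketch thresholds a grayscale eye image and fits an ellipse to the largest dark blob. The function name and all threshold and kernel values are illustrative assumptions, not the parameters of the actual system.

```python
# Minimal pupil-detection sketch (illustrative; thresholds and kernel
# sizes are assumptions, not the actual system's parameters).
import cv2
import numpy as np

def detect_pupil(eye_gray: np.ndarray):
    """Return an ellipse fitted to the darkest blob, or None."""
    # The pupil appears as the darkest region; smooth, then threshold.
    blurred = cv2.GaussianBlur(eye_gray, (7, 7), 0)
    _, binary = cv2.threshold(blurred, 40, 255, cv2.THRESH_BINARY_INV)

    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_NONE)
    if not contours:
        return None

    # Take the largest dark blob and fit an ellipse to its boundary.
    largest = max(contours, key=cv2.contourArea)
    if len(largest) < 5:  # cv2.fitEllipse needs at least 5 points
        return None
    return cv2.fitEllipse(largest)  # ((cx, cy), (major, minor), angle)
```

In a physical-model pipeline, the fitted ellipse center and the detected LED glints would feed the eye-model fit rather than being used as the gaze estimate directly.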
Annotating mobile gaze tracking data, i.e., classifying the gaze targets, is challenging. To ease the annotation process, we utilize visual markers resembling QR symbols. These are attached in the classroom near expected gaze targets, such as a computer screen. The markers are detected automatically, and their locations can be used to determine whether the gaze point falls inside a predefined target area.
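A sketch of this marker-based check is given below, assuming ArUco-style fiducial markers (the specific marker family is an assumption; the source only states that the markers resemble QR symbols). It detects markers in a scene-camera frame and tests whether the gaze point lies within a margin around a given marker; the function name, marker dictionary, and margin are hypothetical.

```python
# Marker-based annotation sketch (assumes ArUco-style markers; the
# marker dictionary and target layout here are hypothetical).
# Requires an OpenCV build that includes the aruco module.
import cv2
import numpy as np

aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(aruco_dict)

def gaze_on_target(scene_bgr, gaze_xy, target_id, margin_px=50):
    """True if the gaze point lies within margin_px of marker target_id."""
    gray = cv2.cvtColor(scene_bgr, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = detector.detectMarkers(gray)
    if ids is None:
        return False
    for marker_corners, marker_id in zip(corners, ids.flatten()):
        if marker_id != target_id:
            continue
        # Expand the marker's bounding box into a target region and
        # test whether the gaze point falls inside it.
        xs, ys = marker_corners[0][:, 0], marker_corners[0][:, 1]
        x0, x1 = xs.min() - margin_px, xs.max() + margin_px
        y0, y1 = ys.min() - margin_px, ys.max() + margin_px
        gx, gy = gaze_xy
        return x0 <= gx <= x1 and y0 <= gy <= y1
    return False
```

Running such a check per frame yields a per-target hit sequence that can replace much of the manual frame-by-frame annotation.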