The goal of this work is to describe a methodology for synchronizing two eye-tracking goggles and computing measures of joint visual attention (JVA) in a co-located setting. In our study, dyads of students interacted with different versions of a tangible interface designed for students in logistics. The video above shows the recordings captured by each mobile eye-tracker […]
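As a rough illustration of the kind of JVA measure described above, the sketch below aligns two time-stamped gaze streams and computes the proportion of moments where both partners look at roughly the same place. The sample format, thresholds, and function names are assumptions for illustration, not the method actually used in the study.

```python
# Hypothetical sketch: one common JVA measure is the proportion of time
# two synchronized gaze streams land within a spatial threshold of each
# other. Sample format, thresholds, and names below are assumptions.

def align_streams(gaze_a, gaze_b, max_lag=0.05):
    """Pair each sample in gaze_a with the nearest-in-time sample in
    gaze_b, keeping pairs whose timestamps differ by at most max_lag
    seconds. Each sample is a (timestamp, x, y) tuple; both streams
    are assumed sorted by timestamp."""
    pairs, j = [], 0
    for t, x, y in gaze_a:
        # Advance j while the next b-sample is at least as close in time.
        while (j + 1 < len(gaze_b)
               and abs(gaze_b[j + 1][0] - t) <= abs(gaze_b[j][0] - t)):
            j += 1
        if abs(gaze_b[j][0] - t) <= max_lag:
            pairs.append(((x, y), gaze_b[j][1:]))
    return pairs

def jva_proportion(gaze_a, gaze_b, radius=70.0, max_lag=0.05):
    """Fraction of time-aligned sample pairs where the two gaze points
    fall within `radius` (e.g. pixels in a shared scene) of each other."""
    pairs = align_streams(gaze_a, gaze_b, max_lag)
    if not pairs:
        return 0.0
    joint = sum(1 for (ax, ay), (bx, by) in pairs
                if (ax - bx) ** 2 + (ay - by) ** 2 <= radius ** 2)
    return joint / len(pairs)
```

In practice, the spatial radius and temporal lag would need to be tuned to the scene geometry and the eye-trackers' sampling rates; real pipelines also typically allow a small asymmetric time offset, since one partner often looks at a location shortly after the other.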
The EarExplorer Project
One of my goals is to explore the affordances of new technologies for learning STEM disciplines (Science, Technology, Engineering and Mathematics). In collaboration with others, we are building a series of tangible interfaces for teaching science concepts. In this project, we started with the constraint of teaching a highly spatial domain where one could […]
Making sense of collaborative eye-tracking data
Massive datasets are becoming available for a wide range of applications, and education is no exception: cheap sensors can now detect every student movement and utterance, and Massive Open Online Courses (MOOCs) collect every click from users taking classes online. This information can provide crucial insights into […]
Introduction
Previous research has demonstrated that joint attention plays a crucial role in every kind of social interaction: from babies learning from their caregivers, to parents educating their children, to teenagers learning from school teachers, to students collaborating on a project, to any group of adults working toward a common goal, joint attention is a fundamental mechanism […]
A tangible user interface for supporting collaborative learning of logistics.
A collaborative environment for learning phylogenetics.