We developed a method for recording human interactions with 3D virtual objects in real time. Our approach links rotation data from the manipulated objects with behavioral measures such as eye tracking, giving insight into the cognitive processes underlying the task rather than only its final outcome.
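As an illustration of what this linkage can look like in practice (a minimal sketch under our own assumptions: the record fields, the shared clock, and the nearest-preceding-timestamp join are not details of the recorded setup), rotation events and gaze samples can be logged as separate timestamped streams and aligned offline:

```python
from dataclasses import dataclass
from bisect import bisect_right

@dataclass
class RotationSample:
    t: float      # timestamp in seconds (shared clock)
    quat: tuple   # object orientation as a unit quaternion (w, x, y, z)

@dataclass
class GazeSample:
    t: float      # timestamp in seconds (shared clock)
    x: float      # gaze position on screen, normalized 0..1
    y: float

def align_gaze_to_rotation(gaze, rotations):
    """Pair each gaze sample with the most recent rotation state.

    Both streams are assumed to be sorted by timestamp and recorded
    against the same clock.
    """
    times = [r.t for r in rotations]
    paired = []
    for g in gaze:
        i = bisect_right(times, g.t) - 1  # last rotation with t <= gaze time
        paired.append((g, rotations[max(i, 0)]))
    return paired
```

Pairing each gaze sample with the last rotation recorded at or before it keeps the two streams loosely coupled, so each device can sample at its own rate.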
The following topics are considered: the relationship between material rotation and eye movements, individual differences, applications and practical implications, methodological advancements, cognitive models and theories, developmental and clinical perspectives, and neurocognitive mechanisms. This work contributes on the methodological front: to date, there appears to be no straightforward approach to integrating eye-movement data with rotation trajectories.
In this research, we aim to help fill this gap. We devised a straightforward and cost-effective experimental setup to map conventional eye-tracking data onto 3D models during rotation tasks. Future research will focus on advancing the method to study individual differences in how people interpret and interact with virtual and physical three-dimensional objects, for example by comparing students and experts in chemistry.
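As a minimal sketch of the mapping step described above (the quaternion convention, function names, and the assumption that a world-space gaze hit point is already available are ours, not details of the actual setup), the object's current rotation can be inverted so that fixations are expressed in the model's own coordinate frame:

```python
import numpy as np

def quat_to_matrix(q):
    """Rotation matrix from a unit quaternion (w, x, y, z)."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

def gaze_hit_to_model_frame(hit_world, quat, model_origin):
    """Express a world-space gaze hit point in the model's local frame.

    Applying the inverse (transpose) of the current rotation matrix
    undoes the manipulation applied by the participant.
    """
    R = quat_to_matrix(quat)
    return R.T @ (np.asarray(hit_world) - np.asarray(model_origin))
```

Expressing hits in the model frame makes fixations on the same surface region directly comparable, regardless of how the object has been rotated during the trial.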