To begin, launch the eye-tracking software on a computer. Open the iRT webpage, fill in a name and email, and press next. Seat the participant in a comfortable position in front of the eye-tracking system.
Move the chair to ensure an optimal distance between the participant and the camera. Now adjust the camera's height to correctly capture the participant's pupils. Press calibrate to begin calibration.
Ask the participant to look at a series of dots on the screen and follow the dots' movement without moving their head. After the calibration process, a real-time display of the captured screen with gaze data can be seen in the primary display window. Click on the gaze video icon in the main menu to display the user's face as captured by the eye tracker.
Then click on start record to begin the experiment. Return to the previously opened iRT window and ask the participant to click on the go button, after which they will have to rotate the object on the right until it closely matches the one on the left. Click done to conclude the rotation, and repeat this process until all tasks are concluded.
Once the eye tracker data collection has been completed, click on analyze data. Then export a CSV file with all the data recorded for the user. If the online interactive rotation tasks page is being used, open the Google Sheets file used to receive the online data.
Then click on file, followed by download, and select xlsx. Next, download and uncompress the repository that was being used. Ensure that the scripts 1.unpacking_sheets.m, 2.data_merge_and_process.m, and 3.3D rotational trajectory.m, along with the folder models, are inside the downloaded repository in the folder Octave.
Move the data files into the Octave folder. Open the script 1.unpacking_sheets.m with the GNU Octave launcher.
In the editor tab, click on the green save file and run icon to run the script. When two prompts appear in succession, input the name of the downloaded file in the first prompt and the name of the unpackaged file in the second field. A pop-up informing the user of the completion of the process will show up after a few minutes.
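The unpacking itself is implemented inside 1.unpacking_sheets.m. Purely as an illustration, not the published script, a prompt-driven Octave step of this kind could look like the sketch below; the io package, file names, and helper calls are assumptions.

    pkg load io;                                      % io package assumed for reading .xlsx files
    raw_name = input ("Name of the downloaded .xlsx file: ", "s");
    out_name = input ("Name for the unpackaged output file: ", "s");
    [~, ~, raw] = xlsread (raw_name);                 % read the sheet received from Google Sheets
    % ... reshape the packed rows into one row per logged event (script-specific) ...
    cell2csv ([out_name ".csv"], raw);                % write a flat file for the next script
    msgbox ("Unpacking finished.");                   % completion pop-up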
Open and run the script 2.data_merge_and_process.m to merge the data from both the eye tracker and the iRT. Now input the session ID value and task ID value from the saved data sheet.
Then input the unpackaged iRT data file name and the eye tracker data file name.
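The merge logic itself lives in 2.data_merge_and_process.m. Purely as an illustration of the idea, assuming both exports share an epoch-time column and using placeholder file names and column layouts, gaze and pupil samples can be aligned to the iRT event timestamps by interpolation:

    irt = csvread ("irt_unpacked.csv");               % hypothetical layout: [epoch_ms, task columns ...]
    eye = csvread ("eye_tracker_export.csv");         % hypothetical layout: [epoch_ms, pupil_mm, gaze_x, gaze_y]
    pupil_on_irt = interp1 (eye(:, 1), eye(:, 2), irt(:, 1), "linear");  % pupil size at each iRT event
    merged = [irt, pupil_on_irt];                     % one row per iRT event, with aligned pupil data
    csvwrite ("merged_data.csv", merged);             % input for the trajectory script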
Next, launch and run the 3.3D rotation trajectory.m script. When three prompts appear, input the session ID value, task ID value, and unpackaged iRT data file name, or leave them blank.
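As a further illustration, assuming the merged file stores the interactive object's orientation as unit quaternions (the column order below is a placeholder, not the script's actual format), the rotational trajectory can be summarized as the angular distance to the target over time:

    merged = csvread ("merged_data.csv");             % hypothetical merged output from the previous step
    t = merged(:, 1);                                 % timestamps
    q = merged(:, 2:5);                               % interactive object quaternions [qi qj qk qr]
    qt = [0 0 0 1];                                   % target orientation (placeholder)
    theta = 2 * acosd (min (abs (q * qt'), 1));       % angular distance to the target, in degrees
    plot (t, theta); xlabel ("time"); ylabel ("angular distance to target (deg)");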
To replay the participant's task interaction, first go to the interactive task webpage. Start the test, then move the mouse pointer. Next, click on the invisible debug text to enable the debug mode. Press the timer stop icon to interrupt the task, and click on the console to open the JSmol console of the model. Now open the output jmol console .xlsx file and copy the entire page of Jmol commands.
Paste the copied list of commands inside the JSmol console and click the run button to execute it. To generate a GIF animation, write the command capture "filename" script output inside the JSmol console, where "filename" is the name of the GIF file to be created and output is the entire list of commands copied.

The pupil of the participant remained more dilated in the initial and fine-tuning phases.
The long fixation period in the fine-tuning phase corresponded to a plateau in the pupil diameter. The rotation trajectory analysis showed that participant one initially deviated from the target position before finding a definitive path to the solution.