We are trying to understand the mechanisms linking time perception abilities with reading skills. Here, we ask a more specific question: what explains the relation between perceiving the time structure of a visual stimulus and reading skills? Is this association real, or is it a byproduct of visual processing problems in dyslexia? The temporal sampling framework was a breakthrough in this area, proposing that the synchronization of brainwaves with auditory stimuli is necessary to perceive both speech units and their sequence. Associations between time perception for visual stimuli and reading have also been reported, but explanations are still lacking.
A major challenge when studying time perception for visual stimuli is characterizing the cognitive processes that take place during stimulus presentation. Eye tracking is a candidate tool, but the problem is that oculomotor responses do not reflect time perception only; they also react to stimulus characteristics, such as luminance. In this study, we present an eye-tracking protocol that allows us to separate oculomotor indices of stimulus processing from potential indices of time perception.
In the animation software, using a frame duration conversion table, define eight slowdown sequences of two time intervals each, ensuring that the first interval is shorter than the second. For each slowdown sequence, invert the order of the intervals to create a speed-up analog. Define key frames for each sequence, with specific frames for stimulus onset, interval offsets, and stimulus end.
In frame seven, draw a blue circle at the screen center. Copy and paste this image into the adjacent frame. Subsequently, copy and paste this two-frame sequence into the other two key frames.
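Although the sequences themselves are built in the animation software, the interval-to-frame conversion behind the conversion table can be sketched in a few lines of Python; the frame rate and interval durations below are illustrative assumptions, not the values used in the protocol.

FPS = 30                       # assumed animation frame rate
MS_PER_FRAME = 1000 / FPS

# Eight hypothetical slowdown sequences (first interval, second interval) in ms;
# the first interval is always shorter than the second.
slowdown_ms = [(300, 450), (300, 600), (400, 600), (400, 800),
               (500, 750), (500, 1000), (600, 900), (600, 1200)]

def to_frames(duration_ms):
    # Convert a duration in milliseconds to the nearest whole frame count.
    return round(duration_ms / MS_PER_FRAME)

slowdown = [(to_frames(a), to_frames(b)) for a, b in slowdown_ms]
speedup = [(b, a) for a, b in slowdown]   # invert the interval order

for i, (slow, fast) in enumerate(zip(slowdown, speedup), start=1):
    print(f"sequence {i}: slowdown {slow} frames, speed-up {fast} frames")

Printing both lists makes it easy to confirm that each speed-up sequence mirrors its slowdown counterpart before the key frames are placed.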
After opening a file in the animation software with the same specifications used for the flash animations, open the spreadsheet and align the key frame specifications with the frames in which the squashed ball hits the ground. In the fourth frame, draw a blue ball at the top center. Then, using the tween command, generate a continuous change from the ball at the top to the squashed ball.
At the middle points, draw a non-squashed ball vertically above the lowest point of the trajectory. Generate the ascending animation between the interval onset and the highest point, and the animation between the highest point and the next squash. To create an experiment folder, open the Experiment Builder application and choose New from the File menu.
Specify the name of the project and choose the location to save the file. In the project folder, select library. Then open the video folder and upload the Xvid video stimulus files.
Drag the display screen icon to the graph editor window and create a link with the start panel. In the properties of Display_Screen, click on insert Multiline Text Resource and input the calibration procedure instructions. Select the Keyboard and EL button triggers.
Link the display screen to both triggers. Click the camera setup icon and link it to both triggers. Then select and drag the results file to the right side of the flow chart.
Link the sequence icon to the camera setup. Enter the sequence. Then drag and link the Start panel, the Display icon, and the EL button and Keyboard triggers.
Inside the block sequence, drag a new sequence icon to the editor to create the trial sequence. Then in the trial sequence, drag a Prepare_Sequence icon and link the second to the first. Drag the Drift_Correction icon to the interface and link it to the Prepare_Sequence icon.
Next in the trial sequence, drag a new sequence icon to the editor to create the recording sequence. Select the option Record in the properties of this sequence. In Properties, click on Data Source and fill in each row of the table with the exact file name of each stimulus.
Enter the trial type as Practice or Experimental. Set the presentation frequency and specify the expected response button. On the top panel of the interface, click on Randomization Settings and mark the trial randomization boxes to randomize the stimuli within each block.
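For illustration only, the data source table described above could look like the rows below; the column names, file names, and values are hypothetical placeholders rather than the study's actual entries, and this Python sketch simply writes them to a CSV for inspection.

import csv

# Hypothetical data source rows; adjust names and values to the real stimuli.
rows = [
    {"video_file": "flash_slowdown_01.avi", "trial_type": "Practice",
     "frequency": 1, "expected_button": 1},
    {"video_file": "ball_speedup_01.avi", "trial_type": "Experimental",
     "frequency": 1, "expected_button": 2},
]

with open("data_source_example.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=rows[0].keys())
    writer.writeheader()
    writer.writerows(rows)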
In the recording sequence, establish a connection between the Start panel and the Display screen. Within the Display screen, choose the Insert Video Resource button and drag it to the interface. In the Properties, select the Edit Attributes option and type the indicated text.
Link the Keyboard and EL button triggers to the display icon. Drag the Update Attribute icon and add it to the Results File icon. Link the Update Attribute and Results File to the triggers.
Finally, at the top of the main panel, click on the Run arrow icon to run a test of the experiment. To begin, position the participant approximately 60 centimeters away from the stimulation computer, ensuring that the stimulus circle corresponds to two degrees of the visual field. Set the sampling frequency to 1,000 hertz for high resolution, and select the dominant eye for recording. In the visualization window of the recording computer, ensure that the eye tracker is tracking both the target and the dominant eye consistently.
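As a quick sanity check on the viewing geometry (not part of the original narration), the on-screen diameter needed for the circle to subtend two degrees at 60 centimeters follows from the visual angle formula; only the 60-centimeter distance and the 2-degree angle come from the protocol.

import math

def stimulus_size_cm(angle_deg, distance_cm):
    # On-screen size (cm) subtending a given visual angle at a given distance.
    return 2 * distance_cm * math.tan(math.radians(angle_deg) / 2)

print(f"circle diameter ~ {stimulus_size_cm(2, 60):.2f} cm")   # about 2.1 cm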
Open the experiment and run the five-point calibration and validation procedures per the system specifications. Instruct the participant to gaze at a dot appearing on the screen at five different places. After explaining the task to the participant, click Run to start the experiment.
In the Data Viewer software, navigate to File, then Import Data, and select multiple EyeLink data files. In the dialog box, select the files of all participants. Then select one trial and use the square icon to draw an interest area.
To create Time Window_All, click the Draw icon and select the entire screen. In the dialog box, name the interest area as Time Window_All and set a time segment matching the full trial. Click on Save the Interest Area Set and apply this template to all trials of the same length.
According to the interval durations, select one of the 16 time structures and define the interest areas as per the stimulus type. After selection, label the interest area. In the menu bar, click on Analysis, then Report, and then Interest Area Report.
Select the output variables to extract dwell time, number of fixations, and pupil size. After clicking Next, export the matrix as an XLSX file. Finally, check the dwell time measures for Time Window_All and mark trials where signal loss exceeds 30%.
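As a rough illustration of this screening step, a short pandas sketch could flag trials whose Time Window_All dwell time implies more than 30% signal loss; the file name, column names, and the way trial duration is obtained are assumptions that must be matched to the actual Interest Area Report export.

import pandas as pd

# Assumed export file and column names; adjust to the real report.
report = pd.read_excel("interest_area_report.xlsx")
all_win = report[report["IA_LABEL"] == "Time Window_All"].copy()

# Signal loss estimated as the share of the trial not covered by dwell time
# on the full-screen interest area.
all_win["signal_loss"] = 1 - all_win["DWELL_TIME"] / all_win["TRIAL_DURATION"]

flagged = all_win[all_win["signal_loss"] > 0.30]
print(flagged[["RECORDING_SESSION_LABEL", "TRIAL_INDEX", "signal_loss"]])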
Comparisons between time window 1 and time window 12 showed different levels of change according to stimulus, with balls eliciting more time-window-related changes in oculomotor responses than flashes in both groups. Behavioral findings indicated stimulus effects, with lower accuracy for balls compared to flashes, and group effects showing poorer performance in dyslexic readers, but no interaction between group and stimulus.