Establishing a reproducible method for calculating the magnitude of multisensory integration effects is significant, as it will facilitate future translational research across diverse clinical populations. The main advantage of our technique is that we're able to quantify a robust phenotype of multisensory integration that is associated with important cognitive and motor outcomes in aging, such as balance, falls, gait, and executive function. Begin by using stimulus presentation software to program a simple reaction time experiment with three experimental conditions: visual alone, somatosensory alone, and simultaneous visual-somatosensory.
Use a stimulus generator with three control boxes. The left and right control boxes contain bilateral blue light-emitting diodes that illuminate for visual stimulation and bilateral motors with 0.8 G vibration amplitude that vibrate for somatosensory stimulation, as well as plastic housing for the stimulators. Next, place a center dummy control box equidistant from the left and right control boxes, and affix a visual target sticker to serve as the fixation point.
After the experiment has been set up, escort the participant to the testing room. Have the participant sit upright and comfortably rest their hands upon the left and right control boxes. Strategically place the index fingers over the vibratory motors mounted to the back of each control box and the thumbs on the front of the control box, under the LEDs, so that they do not block the light.
Ensure that the somatosensory stimuli are inaudible by providing participants with headphones over which continuous white noise is played at a comfortable level. Have the participant use a foot pedal located under the right foot as the response pad. Finally, have the participant respond to each stimulus as quickly as possible regardless of whether they feel it, see it, or feel it and see it.
Begin analysis by excluding participants who are not able to attain an accuracy of 70% correct or greater on any one stimulus condition. Consider a trial inaccurate if a participant fails to respond to a stimulus within the set response time period, and set the corresponding reaction time, or RT, to infinity rather than excluding the trial from the analysis. Sort the RT data in ascending order within each experimental condition.
Place the visual, somatosensory, and VS conditions in separate columns of the sorted RT data. Ensure each row represents one trial and each cell contains the actual RT. Note: do not employ data-trimming procedures that delete very slow RTs, as this will bias the distribution of the RT data. Instead, ensure that RTs that are clearly outliers are set to infinity.
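The data-preparation steps above can be sketched in Python with NumPy. This is an illustration only; the protocol itself uses a spreadsheet, and the RT values below are invented. Setting omitted or outlying trials to infinity (rather than deleting them) keeps them in the distribution, and infinities naturally sort to the end.

```python
import numpy as np

# Hypothetical RTs in ms for one participant, one column per condition.
# np.inf marks omitted trials and clear outliers (per the protocol),
# rather than deleting those trials.
visual = np.array([412.0, 388.5, np.inf, 455.2, 401.7])
somato = np.array([398.1, np.inf, 372.9, 420.3, 365.8])
vs     = np.array([301.2, 295.8, 350.4, np.inf, 312.6])

# Sort each condition in ascending order; infinite RTs sort to the end.
visual, somato, vs = (np.sort(a) for a in (visual, somato, vs))
```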
Then, to bin the RT data, identify the fastest and the slowest RT across all test conditions. Subtract the fastest RT from the slowest in order to calculate the individual's RT range. Bin the RT data from 0% to 100% in 5% increments by taking the fastest RT and gradually adding 5% of the previously calculated RT range until 100% of the RT data are accounted for, resulting in 21 time bins. Next, within a computer spreadsheet, use a frequency function where array one equals the actual RTs for one of the experimental conditions and array two equals the 21 quantized RT bins previously calculated; then divide the resulting counts by the total number of trials per condition, 45.
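A minimal NumPy sketch of the binning and frequency steps. The function names are illustrative, and the trial count defaults to 45 as in the protocol; the bin edges run from the fastest RT to the slowest in 5% increments of the RT range.

```python
import numpy as np

def quantized_bins(visual, somato, vs, n_bins=21):
    # Fastest and slowest finite RT across all three conditions
    # (infinite RTs from omitted trials are ignored here).
    all_rt = np.concatenate([visual, somato, vs])
    finite = all_rt[np.isfinite(all_rt)]
    fastest, slowest = finite.min(), finite.max()
    # fastest + 0%, 5%, ..., 100% of the RT range -> 21 bin edges.
    return fastest + (slowest - fastest) * np.linspace(0.0, 1.0, n_bins)

def bin_frequencies(rts, bins, n_trials=45):
    # Spreadsheet FREQUENCY analogue: count trials falling in each
    # quantized bin, then divide by the trial count for probabilities.
    # Infinite RTs exceed every edge and contribute no probability mass.
    rts = np.asarray(rts, dtype=float)
    cum = np.array([(rts <= b).sum() for b in bins], dtype=float)
    per_bin = np.diff(cum, prepend=0.0)  # cumulative -> per-bin counts
    return per_bin / n_trials
```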
Then, create the cumulative distribution frequency, or CDF, by summing the running total of probabilities across the quantized bins for each of three experimental conditions. The CDF of the multisensory condition represents the actual CDF. To calculate the predicted CDF, sum the two unisensory CDFs with an upper limit set to one.
Use this formula across each of the 21 quantized time bins, starting at the zeroth percentile for bin one and continuing to the 100th percentile for bin 21. Next, to conduct the test of the race model inequality, or RMI, subtract the predicted CDF from the actual CDF for each of the 21 quantized time bins to obtain the difference values.
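The CDF construction and race model test described above can be expressed compactly. This sketch assumes per-bin probabilities for each condition have already been computed from the frequency step; the function names are illustrative.

```python
import numpy as np

def cdf(per_bin_probs):
    # Running total of probabilities across the 21 quantized bins.
    return np.cumsum(per_bin_probs)

def predicted_cdf(cdf_v, cdf_s):
    # Race-model prediction: sum of the two unisensory CDFs,
    # with an upper limit set to one.
    return np.minimum(cdf_v + cdf_s, 1.0)

def rmi_difference(cdf_vs, cdf_v, cdf_s):
    # Actual (multisensory) minus predicted CDF at each bin; positive
    # values violate the race model inequality, indicating integration.
    return cdf_vs - predicted_cdf(cdf_v, cdf_s)
```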
Plot these 21 values as a line graph, where the x-axis represents each one of the quantized time bins and the y-axis represents the probability difference between the actual and predicted CDFs. Here, positive values at any latency indicate the integration of the unisensory stimuli and reflect a violation of the RMI. To quantify the multisensory effect at a group level, group-average the individual RMI data across all participants.
Use a spreadsheet to assign individuals to rows and time bins to columns: place each participant's previously calculated 21 difference values into a row, and average the values within each time bin to create one group-averaged difference waveform. Then, plot the 21 group-averaged values as a line graph, where the x-axis represents each of the quantized time bins and the y-axis represents the probability difference between the CDFs.
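The group-averaging step amounts to a column-wise mean over a participants-by-bins matrix. A sketch with invented data for three hypothetical participants:

```python
import numpy as np

# Rows = participants, columns = 21 quantized time bins of individual
# RMI difference values (hypothetical data; first three bins nonzero).
diff_waves = np.array([
    [0.04, 0.06, 0.03] + [0.0] * 18,
    [0.01, 0.02, -0.01] + [0.0] * 18,
    [0.05, 0.05, 0.04] + [0.0] * 18,
])

# Average within each time bin (down the columns) to obtain one
# group-averaged difference waveform with 21 values.
group_wave = diff_waves.mean(axis=0)
```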
Finally, calculate the area under the curve, or AUC, for each individual, using participant one's data as an example. Sum the CDF difference value at time bin one with the CDF difference value at time bin two, and then divide by two. Visually inspect the data and repeat this calculation for each consecutive pair of time bins containing positive values.
Then, sum these results to generate the total AUC of the CDF difference wave during the violated percentile range of 0.00 to 0.10. Results indicate a group-averaged violation occurring over the zero to 10% percentile range for a sample of 333 older adults. The total number of positive values (zero, one, two, or three) over those three quantiles (0.00 to 0.10) determines which multisensory classification group a person is assigned to: deficient, poor, good, or superior, respectively.
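The AUC and classification steps above can be sketched as follows. This assumes the difference waveform is a sequence of 21 values and that, as in the group result reported here, only the first three quantiles (the 0.00 to 0.10 percentile range) enter the calculation; the function names are illustrative.

```python
import numpy as np

def auc_violated(diff_wave, n_bins=3):
    # Trapezoidal AUC over the violated 0.00-0.10 percentile range:
    # average each consecutive pair of CDF difference values
    # (e.g., (bin1 + bin2) / 2), then sum the pair averages.
    d = np.asarray(diff_wave[:n_bins], dtype=float)
    return np.sum((d[:-1] + d[1:]) / 2.0)

def classify(diff_wave, n_bins=3):
    # Count positive difference values among the first three quantiles
    # to assign the multisensory classification group.
    labels = {0: "deficient", 1: "poor", 2: "good", 3: "superior"}
    n_positive = int(np.sum(np.asarray(diff_wave[:n_bins]) > 0))
    return labels[n_positive]
```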
As we have previously described, it is critical to avoid data-trimming procedures, as they bias the RT distributions. Slow reaction times and omitted trials need to be set to infinity. The main objective here was to develop a robust phenotype of multisensory integration.
Having said that, we are aware of differential multisensory integration patterns in aging, and our next step will be to uncover the neural networks responsible for such integrative processes while determining how specific structural or functional alterations contribute to the differential integration patterns. We are working on identifying the neural correlates associated with visual-somatosensory integration in aging, and we believe that such developments will provide insights into several diseases, including but not limited to Alzheimer's and Parkinson's.