

Summary

This paper discusses how to build a brain-computer interface by relying on consumer-grade equipment and steady-state visually evoked potentials. For this, a single-channel electroencephalograph exploiting dry electrodes was integrated with augmented reality glasses for stimuli presentation and output data visualization. The final system was non-invasive, wearable, and portable.

Abstract

The present work focuses on how to build a wearable brain-computer interface (BCI). BCIs are a novel means of human-computer interaction that relies on direct measurements of brain signals to assist both people with disabilities and those who are able-bodied. Application examples include robotic control, industrial inspection, and neurorehabilitation. Notably, recent studies have shown that steady-state visually evoked potentials (SSVEPs) are particularly suited for communication and control applications, and efforts are currently being made to bring BCI technology into daily life. To achieve this aim, the final system must rely on wearable, portable, and low-cost instrumentation. Exploiting SSVEPs requires a flickering visual stimulus at fixed frequencies. Thus, in considering daily-life constraints, the possibility of providing visual stimuli by means of smart glasses was explored in this study. Moreover, to detect the elicited potentials, a commercial device for electroencephalography (EEG) was considered. This consists of a single differential channel with dry electrodes (no conductive gel), thus achieving the utmost wearability and portability. In such a BCI, the user can interact with the smart glasses by merely staring at icons appearing on the display. Based on this simple principle, a user-friendly, low-cost BCI was built by integrating extended reality (XR) glasses with a commercially available EEG device. The functionality of this wearable XR-BCI was examined with an experimental campaign involving 20 subjects. The classification accuracy was between 80% and 95% on average, depending on the stimulation time. Given these results, the system can be used as a human-machine interface for industrial inspection, but also for rehabilitation in ADHD and autism.

Introduction

A brain-computer interface (BCI) is a system allowing communication with and/or control of devices without natural neural pathways [1]. BCI technology is the closest thing that humanity has to controlling objects with the power of the mind. From a technical point of view, the system works by measuring induced or evoked brain activity, which can be generated either involuntarily or voluntarily by the subject [2]. Historically, research focused on aiding people with motor disabilities through BCI [3], but a growing number of companies today offer BCI-based instrumentation for gaming [4], robotics [5], industry [6], and other applications involving human-machine interaction. Notably, BCIs may play a role in the fourth industrial revolution, namely Industry 4.0 [7], where cyber-physical production systems are changing the interaction between humans and the surrounding environment [8]. Broadly speaking, the European project BNCI Horizon 2020 identified application scenarios such as replacing, restoring, improving, enhancing, or supplementing lost natural functions of the central nervous system, as well as the usage of BCI in investigating the brain [9].

In this framework, recent technological advances mean that brain-computer interfaces may become applicable in daily life [10,11]. To achieve this aim, the first requirement is non-invasiveness, which is important for avoiding the risks of surgical intervention and for increasing user acceptance. However, it is worth noting that the choice of non-invasive neuroimaging affects the quality of the measured brain signals, and the BCI design must then deal with the associated pitfalls [12]. In addition, wearability and portability are required. These requirements are in line with the need for a user-friendly system but also pose some constraints. Overall, the mentioned hardware constraints are addressed by using an electroencephalographic (EEG) system with gel-free electrodes [6]. Such an EEG-based BCI would also be low-cost. Meanwhile, in terms of software, minimal user training (or ideally no training) is desired; namely, it would be best to avoid lengthy periods of tuning the processing algorithm before the user can operate the system. This aspect is critical in BCIs because of inter-subject and intra-subject non-stationarity [13,14].

Previous literature has demonstrated that the detection of evoked brain potentials is robust with respect to non-stationarity and noise in signal acquisition. BCIs relying on the detection of evoked potentials are termed reactive, and they are the best-performing BCIs in terms of brain pattern recognition [15]. Nevertheless, they require sensory stimulation, which is probably the main drawback of such interfaces. The goal of the proposed method is, thus, to build a highly wearable and portable BCI relying on off-the-shelf instrumentation. The sensory stimuli here consist of flickering lights, generated by smart glasses, that are capable of eliciting steady-state visually evoked potentials (SSVEPs). Previous works have already considered integrating BCI with virtual reality, either alone or in conjunction with augmented reality [16]. For instance, a BCI-AR system was proposed to control a quadcopter with SSVEP [17]. Virtual reality, augmented reality, and other such paradigms are collectively referred to as extended reality. In this scenario, the choice of smart glasses complies with the wearability and portability requirements, and smart glasses can be integrated with a minimal EEG acquisition setup. This paper shows that an SSVEP-based BCI also requires minimal training while achieving acceptable classification performance for low-to-medium-speed communication and control applications. Hence, the technique is applied to BCI for daily-life applications, and it appears especially suitable for industry and healthcare.

Protocol

The study was approved by the Ethical Committee of Psychological Research of the Department of Humanities of the University of Naples Federico II. The volunteers signed informed consent before participating in the experiments.

1. Preparing the non-invasive wearable brain-computer interface

  1. Obtain a low-cost consumer-grade electroencephalograph with dry electrodes, and configure it for single-channel usage.
    1. Short-circuit the unused input channels of the low-cost electroencephalograph, or connect them to an internal reference voltage as specified in the device datasheet. This disables the unused channels so that they do not inject crosstalk noise.
    2. Adjust the electroencephalograph gain (typically through a variable resistor) to obtain an input range on the order of 100 µV.
      NOTE: The EEG signals to be measured are on the order of tens of microvolts. However, dry electrodes are greatly affected by motion artifacts, which result in oscillations on the order of 100 µV due to the variability in the electrode-skin impedance. Increasing the input voltage range helps to limit EEG amplifier saturation, but it does not completely eliminate it. On the other hand, it would be inconvenient to increase the input voltage range further, because this would degrade the voltage resolution in measuring the desired EEG components. Ultimately, the two aspects must be balanced while also taking into account the bit resolution of the analog-to-digital converter on the electroencephalograph board.
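As a quick sanity check of this trade-off, the achievable voltage resolution for a given input range and converter bit depth can be computed. The sketch below is purely illustrative; the 200 µV range and the 10-bit depth are assumptions, not specifications of the device used here:

```python
# Voltage resolution (LSB size) for a given input range and ADC bit depth.
# The values below are illustrative assumptions, not device specifications.

def lsb_microvolts(input_range_uv: float, adc_bits: int) -> float:
    """Return the smallest voltage step (in µV) the ADC can resolve."""
    return input_range_uv / (2 ** adc_bits)

# Example: a 200 µV peak-to-peak input range digitized with a 10-bit ADC.
print(lsb_microvolts(200.0, 10))  # ≈ 0.195 µV per LSB
```

Doubling the input range doubles the LSB size, so robustness to amplifier saturation is bought at the cost of resolution on the microvolt-level EEG components.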
    3. Prepare three dry electrodes to connect to the electroencephalograph board. Use a passive electrode (no pre-amplification) as the reference electrode. The remaining two measuring electrodes should be active ones (i.e., involving pre-amplification and possibly analog filtering).
      NOTE: Electrodes placed on a hairy scalp area require pins to overcome the electrode-skin contact impedance. If possible, solder silver pins with flat heads (to limit discomfort for the user), or ideally use conducting (soft) rubber with an Ag/AgCl coating.
  2. Obtain commercial smart glasses with an Android operating system and a 60 Hz display refresh rate (a lower refresh rate can also be used). A higher refresh rate would be desirable for the stimuli, as it would cause less eye fatigue, but no such solutions are currently available on the market.
    1. Download the source code of an Android application for communication or control, or develop one.
    2. Replace the virtual buttons in the application with flickering icons by changing the corresponding object (usually in Java or Kotlin). White squares covering at least 5% of the screen are recommended. Usually, the bigger the stimulating square, the higher the SSVEP component to detect, but an optimum can be found depending on the specific case. Flickering frequencies of 10 Hz and 12 Hz are recommended. Implement the flicker on the graphics processing unit (GPU) to avoid overloading the central processing unit (CPU) of the smart glasses. For this purpose, use objects from the OpenGL library.
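On a display with a fixed refresh rate, a square-wave flicker can only realize frequencies whose period spans an integer number of frames. A minimal sketch (the 60 Hz refresh rate and the frame counts are assumptions for illustration):

```python
# Flicker frequencies realizable on a fixed-refresh display.
# A stimulus toggling every n_on/n_off frames yields f = refresh / (n_on + n_off).
# The refresh rate and frame counts below are illustrative assumptions.

REFRESH_HZ = 60.0

def flicker_frequency(frames_on: int, frames_off: int,
                      refresh_hz: float = REFRESH_HZ) -> float:
    """Frequency of a square-wave stimulus defined by on/off frame counts."""
    return refresh_hz / (frames_on + frames_off)

print(flicker_frequency(3, 3))  # 10.0 Hz: 3 frames on, 3 frames off
print(flicker_frequency(2, 3))  # 12.0 Hz: 5-frame period, asymmetric duty cycle
```

Note that 12 Hz on a 60 Hz display implies a 5-frame period, so a perfectly symmetric 50% duty cycle is not achievable; 2 frames on and 3 frames off is one workable approximation.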
    3. Implement a module of the Android application for real-time processing of the input EEG stream. The Android USB Service can be added so that the stream is received via USB. The real-time processing may simply apply a sliding window to the EEG stream by considering the incoming packets. Calculate the power spectral densities associated with the 10 Hz and 12 Hz frequencies through a fast Fourier transform function. A trained classifier can thus distinguish whether the user is looking at the 10 Hz or the 12 Hz flickering icon by classifying the power spectral density features.
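The sliding-window feature extraction described above can be sketched as follows (a Python prototype rather than the Android/Java implementation; the sampling rate, window length, and windowing function are assumptions):

```python
# Sliding-window PSD features at the stimulation frequencies (sketch).
# The sampling rate and window length are illustrative assumptions.
import numpy as np

FS = 256          # assumed sampling rate (Hz)
WINDOW_S = 2.0    # assumed sliding-window length (s)

def psd_features(window: np.ndarray, fs: float = FS, freqs=(10.0, 12.0)):
    """Return the power spectral density at the given stimulation frequencies."""
    n = len(window)
    spectrum = np.fft.rfft(window * np.hanning(n))
    psd = (np.abs(spectrum) ** 2) / (fs * n)
    bins = np.fft.rfftfreq(n, d=1.0 / fs)
    # Pick the FFT bin closest to each target frequency.
    return [psd[np.argmin(np.abs(bins - f))] for f in freqs]

# Usage: a synthetic 10 Hz "SSVEP" should yield a larger 10 Hz feature.
rng = np.random.default_rng(0)
t = np.arange(int(FS * WINDOW_S)) / FS
eeg = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(len(t))
p10, p12 = psd_features(eeg)
print(p10 > p12)  # True
```

In the Android application, the same computation runs on each new window of incoming packets, and the two PSD values are passed to the classifier.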

2. Calibrating the SSVEP-based brain-computer interface

NOTE: Healthy volunteers were chosen for this study. Exclude subjects with a history of brain diseases. The involved subjects must have normal or corrected-to-normal vision. Instruct them to remain relaxed during the experiments and to avoid unnecessary movements, especially of the head.

  1. Let the user wear the smart glasses with the Android application.
  2. Let the user wear a tight headband for holding the electrodes.
  3. Connect the low-cost electroencephalograph to a PC via a USB cable while the PC is disconnected from the main power supply.
    1. Initially disconnect all the electrodes from the electroencephalograph acquisition board to start from a known condition.
    2. In this phase, the EEG stream is processed offline on the PC with a script compatible with the processing implemented in the Android application. Start the script to receive the EEG signals and visualize them.
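The PC-side script essentially reads raw samples from the board and converts them to voltages before visualization. A minimal sketch of the conversion step (the 2-byte big-endian sample layout, 10-bit depth, and input range are assumptions; consult the board's documentation for the actual streaming protocol):

```python
# Converting a raw EEG sample to microvolts (sketch).
# The packet layout (2-byte big-endian), ADC bit depth, and input range
# are illustrative assumptions; check the board's documentation.

ADC_BITS = 10
INPUT_RANGE_UV = 200.0  # assumed peak-to-peak input range in microvolts

def parse_sample(raw: bytes) -> float:
    """Convert one raw big-endian sample to a voltage in microvolts."""
    code = int.from_bytes(raw, "big")
    full_scale = 2 ** ADC_BITS - 1
    return (code / full_scale - 0.5) * INPUT_RANGE_UV  # centered on 0 µV

print(parse_sample(b"\x01\xff"))  # mid-scale code 511 → ≈ 0 µV
```

With all electrodes disconnected, the visualized trace obtained this way should show only the quantization noise mentioned in the next step.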
  4. Check the displayed signal that is processed offline. This must correspond to only the quantization noise of the EEG amplifier.
  5. Connect the electrodes.
    1. Apply the passive electrode to the left ear with a custom clip, or use an ear-clip electrode. The output signal must remain unchanged at this step because the measuring differential channel is still an open circuit.
    2. Connect an active electrode to the negative terminal of the differential input of the measuring EEG channel, and apply it to the frontal region (Fpz standard location) with a headband. After a few seconds, the signal should return to zero (quantization noise).
    3. Connect the other active electrode to the positive terminal of the differential input of the measuring EEG channel, and apply it to the occipital region (Oz standard location) with the headband. A brain signal is now displayed, corresponding to the visual activity measured with respect to the frontal brain area (where no visual activity is foreseen).
  6. Acquire signals for system calibration.
    1. Repeatedly stimulate the user with 10 Hz and 12 Hz (and possibly other) flickering icons by starting the flickering icon in the Android application, and acquire and save the corresponding EEG signals for offline processing. Ensure that each stimulation in this phase consists of a single icon flickering for 10 s, and start the flickering icon by pressing the touchpad of the smart glasses while also starting the EEG acquisition and visualization script.
    2. From the 10 s signals associated with each stimulation, extract two features by using the fast Fourier transform: the power spectral density at 10 Hz and at 12 Hz. Alternatively, consider second harmonics (20 Hz and 24 Hz).
    3. Use a representation of the acquired signals in the feature domain to train a support vector machine classifier. Use a tool (in Matlab or Python) to identify the parameters of a hyperplane, with an optional kernel, based on the input features. The trained model will be capable of classifying future observations of EEG signals.
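The training step above can be sketched with scikit-learn (the feature values below are synthetic stand-ins for the calibration recordings; variable names are illustrative):

```python
# Training a support vector machine on the PSD features (sketch).
# The feature values are synthetic stand-ins for calibration recordings.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Assumed feature layout: [PSD@10 Hz, PSD@12 Hz] per 10 s stimulation trial.
X_10hz = rng.normal([1.0, 0.2], 0.1, size=(20, 2))  # 10 Hz stimulation trials
X_12hz = rng.normal([0.2, 1.0], 0.1, size=(20, 2))  # 12 Hz stimulation trials
X = np.vstack([X_10hz, X_12hz])
y = np.array([10] * 20 + [12] * 20)  # labels: attended stimulation frequency

clf = SVC(kernel="linear")  # a kernel (e.g., RBF) can be used if needed
clf.fit(X, y)

# The trained hyperplane parameters can then be ported to the Android app.
print(clf.predict([[0.9, 0.1]]))  # → [10]
```

A linear kernel keeps the trained model down to a weight vector and a bias, which are straightforward to hard-code into the Android application in the next section.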

3. Assembling the final wearable and portable SSVEP-based interface

  1. Disconnect the USB cable from the PC, and connect it directly to the smart glasses.
  2. Insert the parameters of the trained classifier into the Android application. The system is now ready.

Results

A possible implementation of the system described above is shown in Figure 1; this implementation allows the user to navigate in augmented reality through brain activity. The flickering icons on the smart glasses display correspond to actions for the application (Figure 1A), and, thus, these glasses represent a substitute for a traditional interface based on button presses or a touchpad. The efficacy of such an interaction i...

Discussion

The proper functioning of the system involves two crucial aspects: SSVEP elicitation and signal acquisition. Aside from the specific devices chosen for the current study, SSVEP could be elicited with different devices providing a flickering light, though smart glasses are preferred to ensure wearability and portability. Analogously, further commercial electroencephalographs could be considered, but they would have to be wearable, portable, and involve a minimum number of dry electrodes to be user-friendly. Moreover, the ...

Disclosures

The authors have nothing to disclose.

Acknowledgements

This work was carried out as part of the ICT for Health project, which was financially supported by the Italian Ministry of Education, University and Research (MIUR), under the initiative Departments of Excellence (Italian Budget Law no. 232/2016), through an excellence grant awarded to the Department of Information Technology and Electrical Engineering of the University of Naples Federico II, Naples, Italy. The project was also made possible by the support of the Res4Net initiative and the TC-06 (Emerging Technologies in Measurements) of the IEEE Instrumentation and Measurement Society. The authors would also like to thank L. Callegaro, A. Cioffi, S. Criscuolo, A. Cultrera, G. De Blasi, E. De Benedetto, L. Duraccio, E. Leone, and M. Ortolano for their valuable contributions in developing, testing, and validating the system.

Materials

Name | Company | Catalog Number | Comments
Conductive rubber with Ag/AgCl coating | ab medica s.p.a. | N/A | Alternative electrodes – type 2
Earclip electrode | OpenBCI | N/A | Ear clip
EEG-AE | Olimex | N/A | Active electrodes
EEG-PE | Olimex | N/A | Passive electrode
EEG-SMT | Olimex | N/A | Low-cost electroencephalograph
Moverio BT-200 | Epson | N/A | Smart glasses
Snap electrodes | OpenBCI | N/A | Alternative electrodes – type 1

References

  1. Wolpaw, J. R., et al. Brain-computer interface technology: A review of the first international meeting. IEEE Transactions on Rehabilitation Engineering. 8 (2), 164-173 (2000).
  2. Zander, T. O., Kothe, C., Jatzev, S., Gaertner, M., Tan, D. S., Nijholt, A. Enhancing human-computer interaction with input from active and passive brain-computer interfaces. Brain-Computer Interfaces. , 181-199 (2010).
  3. Ron-Angevin, R., et al. Brain-computer interface application: Auditory serial interface to control a two-class motor-imagery-based wheelchair. Journal of Neuroengineering and Rehabilitation. 14 (1), 49 (2017).
  4. Ahn, M., Lee, M., Choi, J., Jun, S. C. A review of brain-computer interface games and an opinion survey from researchers, developers and users. Sensors. 14 (8), 14601-14633 (2014).
  5. Bi, L., Fan, X. A., Liu, Y. EEG-based brain-controlled mobile robots: A survey. IEEE Transactions on Human-Machine Systems. 43 (2), 161-176 (2013).
  6. Arpaia, P., Callegaro, L., Cultrera, A., Esposito, A., Ortolano, M. Metrological characterization of a low-cost electroencephalograph for wearable neural interfaces in industry 4.0 applications. IEEE International Workshop on Metrology for Industry 4.0 & IoT (MetroInd4.0&IoT). , 1-5 (2021).
  7. Rüßmann, M., et al. Industry 4.0: The future of productivity and growth in manufacturing industries. Boston Consulting Group. 9 (1), 54-89 (2015).
  8. Angrisani, L., Arpaia, P., Moccaldi, N., Esposito, A. Wearable augmented reality and brain computer interface to improve human-robot interactions in smart industry: A feasibility study for SSVEP signals. IEEE 4th International Forum on Research and Technology for Society and Industry (RTSI). , 1-5 (2018).
  9. Brunner, C., et al. BNCI Horizon 2020: Towards a roadmap for the BCI community. Brain-Computer Interfaces. 2 (1), 1-10 (2015).
  10. Yin, E., Zhou, Z., Jiang, J., Yu, Y., Hu, D. A dynamically optimized SSVEP brain-computer interface (BCI) speller. IEEE Transactions on Biomedical Engineering. 62 (6), 1447-1456 (2014).
  11. Chen, L., et al. Adaptive asynchronous control system of robotic arm based on augmented reality-assisted brain-computer interface. Journal of Neural Engineering. 18 (6), 066005 (2021).
  12. Ball, T., Kern, M., Mutschler, I., Aertsen, A., Schulze-Bonhage, A. Signal quality of simultaneously recorded invasive and non-invasive EEG. Neuroimage. 46 (3), 708-716 (2009).
  13. Grosse-Wentrup, M. What are the causes of performance variation in brain-computer interfacing. International Journal of Bioelectromagnetism. 13 (3), 115-116 (2011).
  14. Arpaia, P., Esposito, A., Natalizio, A., Parvis, M. How to successfully classify EEG in motor imagery BCI: A metrological analysis of the state of the art. Journal of Neural Engineering. 19 (3), (2022).
  15. Ajami, S., Mahnam, A., Abootalebi, V. Development of a practical high frequency brain-computer interface based on steady-state visual evoked potentials using a single channel of EEG. Biocybernetics and Biomedical Engineering. 38 (1), 106-114 (2018).
  16. Friedman, D., Nakatsu, R., Rauterberg, M., Ciancarini, P. Brain-computer interfacing and virtual reality. Handbook of Digital Games and Entertainment Technologies. , 151-171 (2015).
  17. Wang, M., Li, R., Zhang, R., Li, G., Zhang, D. A wearable SSVEP-based BCI system for quadcopter control using head-mounted device. IEEE Access. 6, 26789-26798 (2018).
  18. Arpaia, P., Callegaro, L., Cultrera, A., Esposito, A., Ortolano, M. Metrological characterization of consumer-grade equipment for wearable brain-computer interfaces and extended reality. IEEE Transactions on Instrumentation and Measurement. 71, 1-9 (2021).
  19. Wolpaw, J. R., Birbaumer, N., McFarland, D. J., Pfurtscheller, G., Vaughan, T. M. Brain-computer interfaces for communication and control. Clinical Neurophysiology. 113 (6), 767-791 (2002).
  20. Duszyk, A., et al. Towards an optimization of stimulus parameters for brain-computer interfaces based on steady state visual evoked potentials. PLoS One. 9 (11), 112099 (2014).
  21. Prasad, P. S., et al. SSVEP signal detection for BCI application. IEEE 7th International Advance Computing Conference (IACC). , 590-595 (2017).
  22. Xing, X., et al. A high-speed SSVEP-based BCI using dry EEG electrodes. Scientific Reports. 8, 14708 (2018).
  23. Luo, A., Sullivan, T. J. A user-friendly SSVEP-based brain-computer interface using a time-domain classifier. Journal of Neural Engineering. 7 (2), 026010 (2010).


Keywords: Brain-Computer Interface, Wearable Technology, Low-Cost EEG, Rehabilitation, Attention-Deficit Hyperactivity Disorder, Autism, Electroencephalograph, Signal Processing, Extended Reality, Smart Glasses, EEG Visualization, Feature Extraction, Neural Activity


Copyright © 2025 MyJoVE Corporation. All rights reserved.