Abstract
The paper proposes a Fuzzy Logic-based LabVIEW application that determines the strength of the voluntary eye-blinks used as commands in a brain-computer interface controlling a mobile robot. Relevant statistical features (standard deviation, root mean square, Kurtosis coefficient, and maximum value of amplitude) of the raw electroencephalographic signal, acquired from the biosensor of a portable NeuroSky headset, define the input linguistic variables. A customized algorithm, developed as a set of custom LabVIEW graphical code sequences, counts the voluntary eye-blinks and generates the various movement commands (move forward, move backward, turn left, turn right, stop). Bluetooth-based communication between the LabVIEW application and an Arduino board allows the commands to be sent to the mobile robot. The proposed experimental BCI system provides an efficient working principle for robust mechatronic systems that help people with neuromotor disabilities regain confidence and independence in performing simple everyday activities.
1 Introduction
The brain-computer interface (BCI) is a multidisciplinary application involving broad knowledge and advanced technical abilities from different research areas: biomedical engineering, mechatronics, computer science, neuroscience, and psychology. The most efficient non-invasive BCI systems involve electroencephalographic (EEG) signals, which can be acquired even with portable commercial headsets.
Processing methods and artificial intelligence techniques are applied to the EEG signal to detect particular patterns associated with the task executed by the user as a command, for example: focusing attention on something [1], keeping a relaxed state of mind [1], executing voluntary eye-blinks, counting the number of times a specific element flashes to elicit the P300 evoked potential [2], imagining something, especially a specific movement [3], or triggering slow cortical potentials [4]. Accomplishing these tasks enables the real-time control of mechatronic devices, such as a robotic arm [5], a robotic hand [6], or an automated wheelchair [7], needed in the everyday assistance of people disabled by neuromotor illnesses, for example, locked-in syndrome, amyotrophic lateral sclerosis, cerebral stroke, spinal cord injuries, and tetraplegia.
A voluntary eye-blink appears as an artifact across the EEG signal. The eye-blink pattern is easily recognizable and precisely identifiable among the other variations of the EEG biopotentials. A voluntary eye-blink lasts 100–400 ms and is frequently chosen as a control signal in BCI applications [8], since neuromotor-disabled patients can usually still execute such a simple gesture. According to previous work, researchers detect voluntary eye-blinks either by thresholding or by analyzing the raw EEG data. Tran D.-K. et al. [9] implemented and tested an offline thresholding-based algorithm for intentional eye-blink detection by identifying peaks across the raw EEG signal acquired from an Emotiv headset. Prem S. et al. [10] presented preliminary research using threshold values measured from the voluntary eye-blinks and the attention level acquired from a NeuroSky headset connected to a smartphone. Miranda M. et al. [11] analyzed the EEG signal acquired from a NeuroSky headset by designing an ad-hoc mother wavelet to precisely locate in time and measure the duration of each eye-blink occurrence, which was also captured with a camera.
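The thresholding approach used in the related work can be illustrated with a minimal sketch. The threshold value and the refractory gap below are hypothetical, chosen only for illustration; the actual algorithms in [9] and [10] tune these per user and per device.

```python
def detect_blink_peaks(raw_eeg, threshold, min_gap=50):
    """Return indices where the signal crosses the threshold, with a
    refractory gap (in samples) so a single blink is not counted twice.
    At 512 samples/s, min_gap=50 corresponds to roughly 100 ms."""
    peaks = []
    last = -min_gap
    for i, value in enumerate(raw_eeg):
        if value > threshold and i - last >= min_gap:
            peaks.append(i)
            last = i
    return peaks
```

Running this over a one-second window of raw samples returns the onset index of each detected blink, which is enough for the simple counting schemes described above.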
The current paper presents a Fuzzy Logic-based LabVIEW application that determines the strength of the voluntary eye-blinks used as BCI commands to control a mobile robot. The novelty of the paper lies in the processing methods applied to the raw EEG signal to identify the most relevant features used to configure a Fuzzy Logic system that runs in real time and classifies the voluntary eye-blinks used as movement commands sent to an Arduino-based mobile robot. Detecting the eye-blinking pattern directly from the raw EEG signal yields a quicker response and makes it possible to identify other significant EEG patterns simultaneously. Across the scientific literature, it is difficult to find explicit evidence of applying fuzzy logic methods to features extracted from EEG signals to recognize voluntary eye-blinks. This paper therefore reveals a novel approach to analyzing the raw EEG signal to measure the eye-blink strength efficiently. In contrast to using the predefined LabVIEW function that accesses the functionality of the ThinkGear chip, the proposed Fuzzy Logic-based LabVIEW application removes time delays and other interruptions, so that other mental processes (for example, attention or meditation) can be measured in parallel with the detection of the voluntary eye-blink.
The structure of the paper is as follows: Sect. 2 describes the hardware system of the proposed BCI solution, Sect. 3 comprises information about the implemented software application, Sect. 4 includes the obtained results and related discussions, and Sect. 5 focuses on conclusion and future research directions.
2 Hardware System – Arduino Based Mobile Robot Controlled by NeuroSky Mindwave Mobile Headset
The brain-computer interface application involved designing a mobile robot (Fig. 1) based on a chassis connecting two wheels and two DC motors to an Arduino Uno board through an L298N motor driver. The mobile robot changes its movement direction according to the received commands, based on the voluntary eye-blinks deciphered from the raw EEG signal acquired from the biosensor embedded in a portable NeuroSky headset. The NeuroSky headset is paired via Bluetooth to the computer running the LabVIEW application, which receives the EEG signal and sends commands to the Arduino Uno board. An HC-06 Bluetooth module enables the communication between the Arduino and the computer.
The NeuroSky Mindwave Mobile headset is one of the most popular portable monitoring devices used in BCI research. It provides developers a toolkit of pre-defined functions for reading specific values characterizing the meditation level, the attention degree, and the eye-blinking strength. Moreover, samples of the raw EEG signal can be acquired and stored in data structures for further processing, such as the extraction of the different EEG frequency bands: delta, theta, low alpha, high alpha, low beta, high beta, low gamma, and high gamma. One of the most convenient ways of accessing the full functionality of the ThinkGear chip of the NeuroSky portable headset is the toolkit offered for NI LabVIEW [12].
3 Software System – LabVIEW Application and Arduino IDE
The software system involves two programming environments: LabVIEW, used to acquire and analyze the EEG signal, and the Arduino IDE, used to program the control of the mobile robot.
3.1 The LabVIEW Based EEG Signal Processing
The communication between the NeuroSky headset and the LabVIEW application (Fig. 2) relies on the NI driver [12], which exposes both the basic and the advanced functionality of the embedded ThinkGear chip. The main function/virtual instrument used is ThinkGear Read, which acquires an array of 512 samples of the raw EEG signal over a time interval of 1 s. Further processing methods targeted the raw EEG signal. For example, a Fast Fourier Transform (FFT) yields the frequency-domain representation of the raw EEG signal. Likewise, different types of digital filters (high-pass, low-pass, band-pass) extract the frequency rhythms, for example, delta (0.5–4 Hz), theta (4–8 Hz), alpha (8–12 Hz), beta (12–30 Hz), and gamma (>30 Hz). The result is a series of arrays, each containing 512 samples of a given type of EEG signal, in both the time and the frequency domain. Ten statistical features (mean, standard deviation, sum, range – the difference between the maximum and minimum value of amplitude, median, Kurtosis coefficient, skewness, root mean square – RMS, mode, and maximum value of amplitude) were computed across the 512 samples of each array. As previously mentioned, 1 s of acquisition yields 512 samples. An experiment was performed in two stages, each involving a 30 s interval of EEG data acquisition.
During the first stage (the first 30 s), the user executed a voluntary eye-blink on each audio cue (a beep) transmitted at one-second intervals. This produced 30 sequences, each consisting of 512 samples of the different types of EEG signal (raw, delta, theta, alpha, beta, gamma), in both the time and frequency domain, showing the pattern of a voluntary eye-blink. The ten statistical features listed above were computed for each of the 30 sequences. Each sequence was also displayed graphically so that the pattern of the voluntary eye-blink could be inspected, and irrelevant sequences that did not offer a correct representation of the eye-blink pattern were removed. This situation occurred when the user mistakenly executed a voluntary eye-blink at the wrong moment, so that it was recorded in a sequence other than the one indicated by the beep. In the test, two such sequences were removed, leaving 28 sequences associated with the detection of a voluntary eye-blink.
During the second stage (the last 30 s), the user avoided voluntary eye-blinking and kept a neutral state of mind. In this way, another 30 sequences of 512 samples each were obtained and taken as a reference for distinguishing the moment when the user executes a voluntary eye-blink, intending to send a command, from any other situation. As in the first stage, the ten statistical features were computed for each of the 30 sequences, and each sequence was displayed graphically to spot any anomaly. In the test, a single irrelevant sequence was removed, leaving 29 sequences corresponding to an EEG signal that does not include any eye-blink.
Finally, the two stages were merged into 57 sequences classified into two categories (1 – one voluntary eye-blink executed, 0 – no eye-blink performed). The resulting numerical data were exported to a .csv file using the Write Delimited Spreadsheet function from the File I/O palette of LabVIEW. Figure 3 shows an example of a .csv file related to the raw EEG signal.
3.2 The Automatic Generation of a Series of Charts Displaying the Variations of Statistical Features Applied on the Raw EEG Dataset
The next objective was to investigate the influence each of the ten statistical features has on each of the 57 sequences of the different types of EEG signal. To this end, an original LabVIEW application was implemented that automatically generates charts displaying the variations of the statistical features over the dataset of 57 sequences belonging to the two classes (1 – one voluntary eye-blink executed, 0 – no eye-blink executed). Figure 4 shows the graphical user interface (GUI) of this LabVIEW application running the automated charts of statistical features applied to the .csv file related to the raw EEG signal (Fig. 3).
Similar charts were obtained by uploading the other .csv files related to the different EEG bands (delta, theta, alpha, beta, and gamma). Visually comparing all the resulting charts led to the conclusion that the most significant differences between the two classes (1 – a voluntary eye-blink detected, 0 – no eye-blink detected) were given by four statistical features (standard deviation, Kurtosis coefficient, root mean square, and maximum value of the amplitude) applied to the raw EEG signal. Moreover, the raw EEG signal proved the most significant type because it retains all the variations needed for an accurate and detailed description of the voluntary eye-blinking pattern. Choosing the raw EEG signal also makes the other EEG processing methods (FFT and filtering) unnecessary, resulting in a simpler and more convenient LabVIEW code design.
As seen in Fig. 4, a Tab graphical window displays two maximum values and two minimum values of each statistical feature: one pair calculated over the first 28 sequences (class 1 – voluntary eye-blink detected) and one pair over the last 29 sequences (class 0 – no eye-blink executed). For example, for the standard deviation measured on the raw EEG signal, there is a maximum/minimum value across the first 28 sequences and a maximum/minimum value across the last 29 sequences. Determining these minimum and maximum thresholds is necessary for configuring the Fuzzy Logic system.
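The per-class min/max pairs can be computed as below. A feature whose two ranges are widely separated discriminates the classes well, which is how the four retained features were singled out; the function names are illustrative, not taken from the paper.

```python
def class_ranges(values, labels):
    """Return ((min, max) of the feature within class 1 - blink,
    (min, max) within class 0 - no blink). Widely separated ranges
    indicate a discriminative feature and supply the thresholds used
    to configure the fuzzy membership functions."""
    blink = [v for v, c in zip(values, labels) if c == 1]
    none = [v for v, c in zip(values, labels) if c == 0]
    return (min(blink), max(blink)), (min(none), max(none))
```

Applied to, say, the standard deviation column of the 57-sequence dataset, the two returned pairs are exactly the four values shown in the Tab window of Fig. 4.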
3.3 LabVIEW Based Fuzzy Logic System Aimed to Measure the Strength of the Eye-Blinking
A Fuzzy Logic system was designed to measure the strength of the eye-blink so that the resulting value can be compared with an established threshold: a voluntary eye-blink is detected if its strength exceeds the threshold. In contrast to Boolean logic, based on total membership, where an input variable belongs either to the first class or to the second class, fuzzy logic enables partial membership [13], assigning the input variable some degree of membership in both classes.
The input linguistic variables are given by the most significant statistical features (standard deviation, root mean square, Kurtosis coefficient, and the maximum amplitude value) computed for the raw EEG signal in the time domain. The Fuzzy System Designer provided by NI LabVIEW [13] was used to implement the algorithm generating the value of the output variable, the strength of the eye-blink. According to Fig. 5, the shapes of the membership functions, called Low, Medium, and High, and their intermediate points were set after experimenting with different mixtures of function types and partial ranges to obtain relevant values of the eye-blinking strength. The resulting value is updated continuously during the real-time execution of the eye-blinking.
According to Fig. 6, the output variable, the eye-blinking strength, ranges from a minimum value of 0 (zero) to a maximum value of 255. The membership functions of the output variable also have the characteristics shown in Fig. 6.
According to Table 1, three of the 12 possible Fuzzy Logic-based rules were tested.
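A hand-rolled sketch of the inference idea is given below. It is not the exact LabVIEW system: the membership shapes, input ranges, and rule base are defined by Figs. 5–6 and Table 1, so the triangular functions, the normalization of inputs to [0, 1], and the three illustrative rules here are assumptions standing in for the tuned configuration.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b over the support [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def blink_strength(std, rms, kurt, amp):
    """Toy Fuzzy Logic inference: four inputs (assumed normalized to [0, 1]),
    one output on [0, 255], rules combined with AND = min and defuzzified
    by a weighted average of output singletons."""
    low = lambda v: tri(v, -0.5, 0.0, 0.5)
    med = lambda v: tri(v, 0.25, 0.5, 0.75)
    high = lambda v: tri(v, 0.5, 1.0, 1.5)
    rules = [
        (min(high(std), high(rms), high(kurt), high(amp)), 255),  # strong blink
        (min(med(std), med(rms)), 128),                           # weak blink
        (min(low(std), low(rms), low(amp)), 0),                   # no blink
    ]
    num = sum(w * out for w, out in rules)
    den = sum(w for w, out in rules)
    return num / den if den else 0.0
```

Even this toy version reproduces the behavior the paper relies on: inputs near the blink-class thresholds yield an output near 255, inputs near the no-blink baseline yield an output near 0, and intermediate inputs yield a graded strength rather than a hard Boolean decision.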
3.4 The Integration Between the Raw EEG Data Acquisition and Fuzzy Logic
The eye-blinking strength, calculated by the Fuzzy Logic system, is then compared with a threshold established during the real-time execution of the LabVIEW application. A state-machine design integrates the algorithm counting the voluntary eye-blinks with the real-time running of the Fuzzy Logic system and the real-time detection of the commands controlling the movement direction of the mobile robot. The state machine consists of a case structure (the equivalent of the if-else or switch statement in procedural programming) with seven sub-diagrams for the following seven tasks: initialization, stop command, move forward, move backward, turn left, turn right, and compute the number of voluntary eye-blinks. At any moment, only the code of a single sub-diagram executes, selected by the value of an enumerated-type variable wired to the case selector. As described below, these sub-diagrams share a similar code structure:
- The acquisition of an array containing 512 samples of raw EEG signal;
- The computing of the statistical features (standard deviation, root mean square, Kurtosis coefficient, maximum value of amplitude) for that array;
- The real-time running of the Fuzzy Logic system controller;
- The generation of the eye-blink strength, calculated as the output variable of the Fuzzy Logic system;
- The comparison of the resulting eye-blink strength with a given threshold, so that one of two alternative code sequences is executed:
- The first sequence corresponds to a favorable condition (a voluntary eye-blink whose strength exceeds the given threshold was detected); it inserts the current value of the eye-blinking strength into a numerical array and triggers the transition to the next state.
- The second sequence corresponds to a false condition (no eye-blink was detected) and enables the transition to the last state.
- In the Move Forward and Move Backward states, a Boolean Flag variable is set or unset to indicate the previously selected movement direction, used when the robot should turn left or right;
- The last state decodes the commands changing the movement direction of the mobile robot, according to the size of the array containing the strengths of the voluntary eye-blinks. The following commands were deciphered based on the array size:
  o One element – stop (one voluntary eye-blink detected);
  o Two elements – move forward (two voluntary eye-blinks executed);
  o Three elements – move backward (three voluntary eye-blinks executed);
  o Four elements – move forward left or move backward left, depending on the value of the Boolean Flag variable (four voluntary eye-blinks executed);
  o Five elements – move forward right or move backward right, depending on the value of the Boolean Flag variable (five voluntary eye-blinks executed).
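The counting logic above can be sketched compactly. The Flag handling for forward/backward turn variants is omitted, and the command names and threshold are placeholders; only the count-to-command mapping follows the list above.

```python
# Mapping from the number of detected blinks to a movement command,
# as decoded in the last state of the machine (turn variants simplified).
COMMANDS = {1: "stop", 2: "forward", 3: "backward", 4: "left", 5: "right"}

def decode_command(strengths, threshold):
    """Collect consecutive blink strengths that exceed the threshold
    (favorable condition: advance to the next state), stop at the first
    one that does not (false condition: jump to the last state), then
    map the count to a command."""
    detected = []
    for s in strengths:
        if s > threshold:
            detected.append(s)
        else:
            break
    return COMMANDS.get(len(detected), "none")
```

For instance, two above-threshold strengths followed by a below-threshold value decode to the move-forward command.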
3.5 The Commands Sent by LabVIEW Application to the Arduino Program Necessary to Change the Movement Directions of the Mobile Robot
Depending on the identified command, a particular character is assigned to a string variable and sent to the Arduino board, which executes the command by changing the movement direction of the mobile robot. The VISA LabVIEW toolkit implements the Bluetooth-based serial communication between the LabVIEW application and the Arduino program.
4 Results and Discussions
A video demonstration showing the proposed software and hardware system in operation is available at this unlisted YouTube link: https://youtu.be/Mh9ibydqd5w. Testing the brain-computer interface application based on the Fuzzy Logic LabVIEW algorithm showed that the values returned for the eye-blinking strength can successfully identify the voluntary eye-blinks after setting a suitable threshold, customizable for each user. Nevertheless, the proposed BCI system is just a prototype for training, simulation, and educational purposes. Disabled people could use it to experiment with controlling a mobile robot without involving muscles or peripheral nerves. The Fuzzy Logic approach can be considered a starting point, based on raw EEG signal processing, for identifying the eye-blinking strength. Currently, the presented BCI application constitutes a proof of concept. Various testing scenarios should be conducted to validate the accuracy and response time of the implemented algorithm. A deeper analysis of the feature extraction methods and testing of all possible Fuzzy Logic rules are also necessary to improve the detection accuracy of the voluntary eye-blinks.
5 Conclusion
This research aims to prove the feasibility of using a Fuzzy Logic system to detect the voluntary eye-blinks that determine the movement commands of a mobile robot. A LabVIEW application and a BCI prototype were developed by acquiring the raw EEG signal from the biosensor of a NeuroSky portable headset, computing the most significant statistical features (standard deviation, root mean square, Kurtosis coefficient, and maximum value of amplitude), and calculating the eye-blinking strength, which must exceed a given threshold for a voluntary eye-blink to be detected. A state-machine-based algorithm counts the voluntary eye-blinks and generates the movement commands sent to the Arduino board controlling the mobile robot.
The proposed Fuzzy Logic system measures the strength of the eye-blinking with good precision and accuracy, provides a quick response and a simple working principle, and integrates efficiently the acquisition of the raw EEG signal, the Fuzzy Logic controller, and the algorithm counting the voluntary eye-blinks. Considering the current status of the scientific literature, implementing a LabVIEW-based Fuzzy Logic system to classify eye-blinks is an underexplored research field; this paper therefore provides a foundation for developing more advanced systems. As future research directions, the proposed experimental BCI system needs testing by different categories of users, in various psychological conditions and environments, to make suitable adjustments and enhance the Fuzzy Logic-based detection of voluntary eye-blinks.
References
Galíndez-Floréz, I., Coral-Flores, A., Moncayo-Torres, E., Mayorca-Torres, D., Guerrero-Chapal, H.: Biopotential signals acquisition from the brain through the mindwave device: preliminary results. In: Botto-Tobar, M., Zambrano Vizuete, M., Torres-Carrión, P., Montes León, S., Pizarro Vásquez, G., Durakovic, B. (eds.) ICAT 2019. CCIS, vol. 1193, pp. 139–152. Springer, Cham (2020). https://doi.org/10.1007/978-3-030-42517-3_11
Wang, Y., Wang, Z., Clifford, W., Markham, C., Ward, T.E., Deegan, C.: Validation of low-cost wireless EEG system for measuring event-related potentials. In: Proceedings of the 2018 29th Irish Signals and Systems Conference (ISSC), Belfast, pp. 1–6 (2018). https://doi.org/10.1109/ISSC.2018.8585297
Rosca, S., Leba, M., Ionica, A., Gamulescu, O.: Quadcopter control using a BCI. IOP Conf. Ser.: Mater. Sci. Eng. 294, 012048 (2018). https://doi.org/10.1088/1757-899X/294/1/012048
Harsono, M., Liang, L., Zheng, X., Jesse, F.F., Cen, Y., Jin, W.: Classification of imagined digits via brain-computer interface based on electroencephalogram. In: Wang, Y., Huang, Q., Peng, Y. (eds.) Image and Graphics Technologies and Applications: 14th Conference on Image and Graphics Technologies and Applications, IGTA 2019, Beijing, China, April 19–20, 2019, Revised Selected Papers, pp. 459–471. Springer Singapore, Singapore (2019). https://doi.org/10.1007/978-981-13-9917-6_44
Kubacki, A., Milecki, A.: Control of the 6-axis robot using a brain-computer interface based on steady state visually evoked potential (SSVEP). In: Trojanowska, J., Ciszak, O., Machado, J.M., Pavlenko, I. (eds.) Advances in Manufacturing II: Volume 1 – Solutions for Industry 4.0, pp. 213–222. Springer International Publishing, Cham (2019). https://doi.org/10.1007/978-3-030-18715-6_18
Reust, A., Desai, D., Gomez, L.: Extracting motor imagery features to control two robotic hands. In: Proceedings of the 2018 IEEE International Symposium on Signal Processing and Information Technology (ISSPIT), pp. 118–122. Louisville, KY, USA (2018). https://doi.org/10.1109/ISSPIT.2018.8642627
Mistry, K.S., Pelayo, P., Anil, D.G., George, K.: An SSVEP based brain computer interface system to control electric wheelchairs. In: Proceedings of the 2018 IEEE International Instrumentation and Measurement Technology Conference (I2MTC), pp. 1–6. Houston, Texas (2018). https://doi.org/10.1109/I2MTC.2018.8409632
Zhi-Hao, W., Hendrick, K., Yu-Fan, C., Chuan-Te, L., Shi-Hao, Gwo-Jia, J.: Controlling DC motor using eye blink signals based on LabVIEW. In: Proceedings of the 2017 5th International Conference on Electrical, Electronics and Information Engineering (ICEEIE), pp. 61–65. Malang (2017). https://doi.org/10.1109/ICEEIE.2017.8328763
Tran, D.-K., Nguyen, T.-H., Nguyen, T.-N.: Detection of EEG-based eye-blinks using a thresholding algorithm. Eur. J. Eng. Technol. Res. 6(4), 6–12 (2021). https://doi.org/10.24018/ejers.2021.6.4.2438
Prem, S., Wilson, J., Varghese, S.M., Pradeep, M.: BCI integrated wheelchair controlled via eye blinks and brain waves. In: Pawar, P.M., Balasubramaniam, R., Ronge, B.P., Salunkhe, S.B., Vibhute, A.S., Melinamath, B. (eds.) Techno-Societal 2020: Proceedings of the 3rd International Conference on Advanced Technologies for Societal Applications—Volume 1, pp. 321–331. Springer International Publishing, Cham (2021). https://doi.org/10.1007/978-3-030-69921-5_32
Miranda, M., Salinas, R., Raff, U., Magna, O.: Wavelet Design for Automatic Real-Time Eye Blink Detection and Recognition in EEG Signals. Int. J. Comput. Commun. Control 14, 375–387 (2019). https://doi.org/10.15837/ijccc.2019.3.3516
NI LabVIEW Toolkit – NeuroSky Brain Computer Interface, https://forums.ni.com/t5/NI-Labs-Toolkits/NeuroSky-LabVIEW-Driver/ta-p/3520085?profile.language=en. Accessed 09 Aug 2021
LabVIEW PID and Fuzzy Logic Toolkit User Manual, http://www.ni.com/pdf/manuals/372192d.pdf. Accessed 09 Aug 2021
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
Rușanu, O.A. (2022). A Fuzzy Logic-Based LabVIEW Implementation Aimed for the Detection of the Eye-Blinking Strength Used as a Control Signal in a Brain-Computer Interface Application. In: Moldovan, L., Gligor, A. (eds) The 15th International Conference Interdisciplinarity in Engineering. Inter-Eng 2021. Lecture Notes in Networks and Systems, vol 386. Springer, Cham. https://doi.org/10.1007/978-3-030-93817-8_66