Abstract
Artificial intelligence coupled with digitally connected technologies is becoming an ever more natural part of everyday life. These developments point to a growing symbiosis between human and machine, marking a new phase of interaction: symbiotic intelligence. In this vein, the human-centred development of technologies is becoming increasingly important. The detection of users' mental states, such as cognitive processes and emotional or affective reactions, offers great potential for the development of intelligent, interactive machines. Neurophysiological signals provide the basis for estimating many facets of subtle mental user states, such as attention, affect and cognitive workload. This has led to extensive progress in brain-based interaction through Brain-Computer Interfaces (BCIs). While most BCI research aims at designing assistive, supportive or restorative systems for severely disabled persons, the current discussion focuses on neuroadaptive control paradigms that use BCIs as a strategy to make technologies more human-centred and usable for non-medical applications as well. The primary goal of our neuroadaptive technology research agenda is to consistently align the increasing intelligence and autonomy of machines with the needs and abilities of the human: a human-centred neuroadaptive technology research roadmap. Due to their far-reaching social implications, our research and developments face not only technological but also social challenges. If neuroadaptive technologies are applied in non-medical areas, they must be consistently oriented to the needs and ethical values of users and society.
Keywords
- Artificial intelligence
- Cognitive enhancement
- Brain-computer interface
- Neuroadaptive technology
- Autonomous systems
- Adaptive systems
- Personalized technology
- Human-centred design
16.1 The Rise of Artificial Intelligence: Technologies for the Interaction Between Human and Machine
Digitally connected systems and the techniques and methods of artificial intelligence (AI) and machine learning (ML) are changing our world. Humans develop such technologies to anticipate and satisfy our needs and thus make life easier. Computers trade our shares, cars park themselves, and flying is almost completely automated. Virtually every area has benefited from the tremendous progress in digitization and AI, from the military to medicine and manufacturing. This progress not only gradually transforms the way we interact with technological products and services, but also influences our sensorimotor and cognitive capacities and skills in new ways. Osiurak et al. [1] summarize these gradual technological developments over time into three levels of human-technology interaction: physical (affordance design), sophisticated (automation and interface design) and symbiotic (embodied and cognitive design) technologies. In the future, this will massively influence people's everyday lives and working environments: interaction with digital products and connected machines, technologies and services is becoming a core competence of the future, requiring new modes of interaction and cognitive abilities. Future trends, like voice-, gesture- or thought-operated technologies, indicate an increasing symbiosis of human and technology, referring to a new phase of interaction called symbiotic intelligence [1]. The authors argue that the sophisticated technology of the future will ultimately recede from conscious awareness and perhaps become one with its users: the goal is the intuitive handling of technology that minimizes the interaction effort with technical products.
Historically, the development and use of tools is strongly related to human evolution and intelligence. With the rise of AI and digitally connected products, we have access to an enormous variety of data and information. This enables us to develop interfaces that support how we think, what we know, and how we decide and act. This transformation can be summarized under the concept of cognitive enhancement or cognitive augmentation [2]. It describes a very broad spectrum of techniques and approaches, such as performance-enhancing drugs, medical implants, prostheses and human-computer interfaces, which lead to improved abilities and may even transcend our existing cognitive boundaries. Nevertheless, the increasing integration of technology into our everyday life and working environments entails new challenges and potential for conflict. Often, humans with their individual preferences, skills and needs are overlooked in the development of future technology. The resulting solutions, while technologically advanced, may offer only limited gains in terms of the productivity, creativity and health of the users in question.
16.2 Embodied and Situated Minds: How We Use, Act and Think with Technology
If smart and adaptive technologies that support or even expand our cognitive abilities are the future, then it is essential to consider an optimal design so that such technologies are geared to the user’s needs and contribute to a human-centred, efficient and accepted technology.
The human-centred development of technologies and interfaces for the interaction between human and machines is becoming more and more important. In order to achieve increased productivity with a concurrent contribution to the subjective well-being of employees, digital equipment needs to be seamlessly and intuitively integrated in everyday working life [3]. Intelligent systems should support the user rather than hamper the interaction due to its inherent complexity. Instead of creating frustration, the system should motivate the user by providing a positive user experience during the interaction [4]. Positive user experiences in daily human-technology interactions are extremely important for both the individual person and the organization: from the human factors point of view, positive user experiences contribute to the subjective perception of competence, have a positive effect on mental health and consequently lead to motivated action, increased productivity and job satisfaction, which are important factors for the enterprise [5,6,7].
Research from the cognitive neurosciences on embodied intelligence shows that intelligence cannot be assigned purely to brain functions without regard to the human in his or her situated surroundings. Thus, intelligent behaviour and decision-making develop primarily from the interaction between brain, body and environment [8]. We use our entire environment, including integrated technologies, as an extended mind or memory storage [9], for example to facilitate knowledge retrieval and to reduce the cognitive demands of a task. Familiar technical aids include our fingers for counting, GPS devices for navigation, or the internet for knowledge retrieval. We use such technical aids to either simplify tasks or surrender them completely. Hence, we are permanently engaged in what can be called cognitive offloading [10].
16.3 Connecting Brain and Machine: Brain-Computer Interfaces
From a technological perspective, computers and machines are increasingly capable of learning, communicating and making decisions. Thus, the interaction between humans and technology gains additional dynamics. Speech, gesture and facial-expression recognition are increasingly replacing former input devices such as mouse and keyboard. There is steady progress in the development of measurement techniques, sensor technologies and miniaturized devices for recording neurophysiological activity, coupled with advanced signal processing, statistics and machine learning. Based on these developments, we have gathered over the last decades a tremendous understanding of the cognitive functions and emotional processes underlying human behaviour, decision-making and social interactions. Thus, brain and physiological signals allow us to derive many facets of subtle mental user states, like attention, affect, movement intention, cognitive workload and many more. One key invention for researching brain-based interaction between humans and machines is the Brain-Computer Interface (BCI) [11, 12]. The BCI is currently the most direct form of interface for the interaction between a user and a technical system.
16.4 Measurement Technologies and Applications of Brain-Computer Interfaces
The backbone of BCIs is technology for the real-time recording of neurophysiological activity, which can be divided into invasive and non-invasive measurement techniques. Invasive recording techniques require brain surgery to implant electrodes directly into the brain. These recordings are further subdivided into brain-surface electrodes, such as electrocorticography (ECoG), and brain-penetrating microelectrodes (for a comprehensive overview, see Thakor [13]). Non-invasive recordings for BCIs are subdivided into (a) portable measurement techniques, like electroencephalography (EEG) and functional near-infrared spectroscopy (fNIRS), and (b) stationary systems like magnetoencephalography (MEG) and functional magnetic resonance imaging (fMRI). EEG and MEG measure the electromagnetic activity of multiple cortical neurons directly, by recording voltage fluctuations via electrodes on the scalp (EEG) or by using very sensitive superconducting sensors (so-called SQUIDs, superconducting quantum interference devices, in MEG) [14, 15]. Both fMRI and fNIRS measure neuronal activity indirectly. These techniques record metabolic processes related to neuronal activity by capturing hemodynamic changes in blood flow. Hence, they enable the precise localization of the activation and deactivation of certain brain regions [16, 17].
Since its beginnings in the 1970s [18], BCI research has focused primarily on clinical and medical applications. The main purpose was to provide users who have physical or perceptual limitations, for example locked-in or stroke patients, with a communication tool or to allow them to control a technical device. In such applications, certain mental states that the user generates voluntarily are decoded while circumventing any muscular activity [19,20,21,22,23,24]. BCIs enable the development of assistive and restorative technologies to control wheelchairs [25], orthoses [26,27,28], prostheses [29], service robots [30] and web applications [31]. Besides active control for users with motor impairments, BCIs are also applicable for neurofeedback training, for example to treat patients with psychiatric disorders such as depression, schizophrenia or attentional deficits [32,33,34,35,36]. Advances in these fields have led to a boost in mobile technologies and sophisticated machine-learning algorithms that can be further exploited for monitoring healthy users, laying the basis for non-medical applications of BCIs [11, 12, 37, 38].
16.5 Neuroadaptive Technology and Its Potential for Future Human-Computer Interaction Applications
In our everyday life, technical systems are becoming more and more prominent and serve the purpose of supporting us in our daily routines. Interactive machines and adaptive systems obtain information from the user's interaction behaviour [39] through integrated sensors (e.g. in smartphones) or environmental sensors (e.g. cameras) [40,41,42]. Such intelligent systems are summarized under the term context-aware systems [43, 44]. Context-aware systems are able to adapt the interaction based on current context information (including information about the purpose of use, the objective to be achieved, and the tasks), thus making machines sensitive to physical environments, locations and situations. Examples range from very simple adaptations, such as adjusting screen brightness to the time of day and ambient lighting, to lane-keeping and distance assistants in (autonomous) vehicles, and to cooperating industrial robots and service robots for domestic use by the elderly.
However, the user with her individual preferences, skills and abilities receives less attention. To provide an optimal interaction between the user and adaptive, autonomous technologies, it is important to appropriately take into account not only environmental and contextual conditions but also the current user state, including preferences and intentions. Machines thus need an understanding of the user: information that goes beyond the bare necessities for controlling the machine. A major prerequisite is therefore that technology reacts sensitively and promptly to its users, to create a collaborative and assistive interaction. Over the last years, the use of machine-learning algorithms for computational user modelling has increased substantially [45,46,47]. The basic idea is that computational user models represent fine-grained aspects of the user, such as skills, preferences and cognitive abilities, as well as contextual changes such as selective attention, working memory load and the current emotional state and mood. Such models allow system adaptation to complex situations without inflexible dependence on predetermined programs [48, 49] and provide the basis for a symbiotic interaction in which user and machine collaborate and cooperate in making collective decisions.
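As one illustration of how such a computational user model might be maintained, the following Python sketch keeps exponentially smoothed estimates of mental-state dimensions that noisy per-trial classifier outputs feed into. The class name, the smoothing rule and all values are hypothetical assumptions for illustration, not taken from the cited works:

```python
from dataclasses import dataclass, field

@dataclass
class UserModel:
    """Hypothetical minimal user model: smoothed estimates of
    mental-state dimensions (e.g. workload, valence) kept up to
    date with an exponential moving average."""
    alpha: float = 0.2                       # smoothing factor (assumed value)
    state: dict = field(default_factory=dict)

    def update(self, observations: dict) -> None:
        # Blend each new classifier output into the running estimate.
        for key, value in observations.items():
            old = self.state.get(key, value)
            self.state[key] = (1 - self.alpha) * old + self.alpha * value

    def estimate(self, key: str, default: float = 0.0) -> float:
        return self.state.get(key, default)

model = UserModel()
for workload in [0.9, 0.8, 0.85]:            # noisy per-trial classifier outputs
    model.update({"workload": workload})
print(round(model.estimate("workload"), 3))  # → 0.874
```

An adaptive system would query such a model instead of reacting to single, noisy classifier outputs, which is one simple way to trade responsiveness against stability.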
For a long time, clinical and medical applications have been the primary goals of BCI research. In classical approaches, active control-based BCIs decode brain activity and map it to commands that drive an application running on a computer or a device. Examples of assistive or restorative BCIs are the P300 speller [50], which uses an event-related potential correlated with attentional resources; menu selection and exoskeleton control using steady-state visual evoked potential (SSVEP) paradigms, which exploit natural responses to visual stimuli at specific frequencies [51, 52]; and binary selection through motor imagery paradigms, in which users voluntarily modulate certain oscillatory sensorimotor rhythms [20, 26,27,28].
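To make the SSVEP principle concrete, here is a deliberately naive decoder sketch: it picks the candidate flicker frequency with the highest spectral power in a simulated occipital trace. The function name, the simulated signal and the FFT-peak decision rule are illustrative assumptions; production SSVEP decoders typically use methods such as canonical correlation analysis:

```python
import numpy as np

def detect_ssvep(eeg: np.ndarray, fs: float, candidates: list) -> float:
    """Return the candidate stimulation frequency with the highest
    spectral power in the signal (naive single-channel sketch)."""
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    # Power at the frequency bin closest to each candidate flicker rate.
    powers = [spectrum[np.argmin(np.abs(freqs - f))] for f in candidates]
    return candidates[int(np.argmax(powers))]

# Simulated 2-second trace: a 15 Hz flicker response buried in noise.
fs = 250.0
t = np.arange(0, 2.0, 1.0 / fs)
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 15.0 * t) + 0.5 * rng.standard_normal(t.size)
print(detect_ssvep(eeg, fs, [10.0, 12.0, 15.0]))  # the 15 Hz target wins
```

In a speller or menu interface, each selectable item would flicker at one of the candidate frequencies, and the detected frequency identifies the item the user is attending to.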
The introduction of passive BCIs [53] as a new concept, coupled with new mobile and deployable sensor technologies for EEG and fNIRS [54,55,56] and advanced signal processing and machine-learning algorithms [37, 38, 57,58,59,60] for artefact correction and classification of cognitive and emotional states, makes BCIs ready for non-clinical usage [11, 12, 61,62,63]. In classical BCIs, the machine-learning algorithms used focus mainly on the number of bits transmitted per minute and on the successful classification of extracted brain patterns. In the passive BCI concept, the bit transmission rate is not the primary interest; the focus is instead on augmenting human-computer interaction. Hence, users do not need to carry out mental actions actively to produce brain patterns that are translated into computer actions. On the contrary, the passive BCI concept is envisioned as a continuous brain-monitoring process used to stratify the user according to his/her cognitive or emotional state. This provides the basis for extended computational user models in a human-machine control loop. Furthermore, this loop requires precise knowledge of the psychological processes and corresponding neurophysiological correlates on which the adaptive computer system depends. Open-loop EEG- or fNIRS-based passive BCIs to monitor psychological processes such as user engagement, user intention, selective attention, emotional engagement and workload in students, drivers, pilots or air traffic controllers have already been introduced [64,65,66,67,68,69,70,71,72,73]. Affective reactions such as valence and arousal are another source of user information that can serve as possible input to adaptive computer systems [61,62,63, 74, 75]. In a closed-loop human-computer interaction paradigm, this information enriches a user model to enable not only a concrete command, but also an adequate system adaptation to the user's preferences, skills and abilities.
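A minimal sketch of such passive monitoring, under common assumptions from the workload literature (frontal theta power tends to rise and parietal alpha power to fall with mental workload): the index, the band limits and all parameters below are illustrative, not the pipeline used in the cited studies:

```python
import numpy as np

def band_power(signal: np.ndarray, fs: float, lo: float, hi: float) -> float:
    """Mean spectral power of `signal` in the [lo, hi] Hz band
    (a common passive-BCI feature; no artefact handling here)."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = (freqs >= lo) & (freqs <= hi)
    return float(spectrum[mask].mean())

def workload_index(frontal: np.ndarray, parietal: np.ndarray, fs: float) -> float:
    """Heuristic index in [0, 1]: high frontal theta (4-7 Hz) relative
    to parietal alpha (8-12 Hz) is read as high workload."""
    theta = band_power(frontal, fs, 4.0, 7.0)
    alpha = band_power(parietal, fs, 8.0, 12.0)
    return theta / (theta + alpha)

# Simulated "high workload" segment: strong 6 Hz theta, weak 10 Hz alpha.
fs = 250.0
t = np.arange(0, 4.0, 1.0 / fs)
rng = np.random.default_rng(1)
frontal = 2.0 * np.sin(2 * np.pi * 6 * t) + 0.3 * rng.standard_normal(t.size)
parietal = 0.4 * np.sin(2 * np.pi * 10 * t) + 0.3 * rng.standard_normal(t.size)
print(workload_index(frontal, parietal, fs) > 0.5)  # index leans toward "high"
```

In a real passive BCI, such features would be computed continuously on sliding windows and fed into a trained classifier rather than a fixed heuristic.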
With the help of sophisticated signal processing and machine-learning techniques, neurophysiological signals can be interpreted as a continuous representation of the user's condition and provide information about psychological processes such as cognition, emotion and motivation. In a control loop, the estimated user model serves as an input variable in order to optimally support the user's interaction goals and needs through intelligent system adaptation.
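The control loop described here can be caricatured as a simple threshold rule that maps an estimated workload onto a level of system support. The automation levels, thresholds and function names below are illustrative assumptions, not values from the literature:

```python
def adapt_automation(workload: float, current_level: int,
                     high: float = 0.7, low: float = 0.3) -> int:
    """Sketch of a neuroadaptive control rule: raise the automation
    level (0 = fully manual, 3 = fully automated) when the estimated
    workload exceeds `high`, lower it when the user is underloaded."""
    if workload > high:
        return min(current_level + 1, 3)  # hand more of the task to the system
    if workload < low:
        return max(current_level - 1, 0)  # give the user more manual control
    return current_level                  # comfortable range: leave unchanged

print(adapt_automation(0.85, current_level=1))  # → 2
print(adapt_automation(0.10, current_level=1))  # → 0
```

Closing the loop then means running this rule on each fresh user-state estimate, so that the system's behaviour tracks the user rather than a fixed schedule; real systems would add hysteresis to avoid oscillating between levels.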
The developments in brain-based interactions enable the design of a neuroadaptive system loop [61,62,63, 76,77,78]. These loops are currently being discussed as a strategy to make adaptive and autonomous technologies more user-oriented and augment human-computer interaction. Possible future neuroadaptive applications, among others, are for example:
- Intelligent vehicles that dynamically adapt the level of automation of the driving task to the driver's current intention, attentional or workload level [67, 79,80,81,82].
- Interactive e-learning programs that adapt speed and difficulty to the cognitive abilities of the user [73, 83].
- Neurofeedback-based interfaces to promote subjective well-being by training concentration and relaxation [84].
- Personalized internet applications that capture affective user reactions to adapt their content, presentation and interaction mechanisms to individual needs and preferences [61,62,63].
- Collaborative robots that react sensitively to user intentions, emotions and attentional levels [85,86,87,88].
There are two main benefits of implicit user interaction via a neuroadaptive control loop for personalized system applications. (1) Complementarity: there is no interference with other activities in the interaction cycle. The neuroadaptive control loop expands the communication between human and machine and thereby contributes to the system's self-learning process towards individual user abilities, skills and preferences. (2) Assistance for automation: the neuroadaptive system can help to increase the user's situation awareness of automated system behaviour. System behaviour becomes transparent as implicit user reactions are taken into account over longer periods of interaction. Consequently, the user can be expected to experience feelings of control over, and trust in, the automated system. Future research will reveal the extent to which these expectations are correct.
The increasing research interest in the still very young field of neuroadaptive technologies can also be observed at human factors engineering (Footnote 1) and affective computing (Footnote 2) conferences, as well as at newly emerging conferences on neuroergonomics (Footnote 3) and neuroadaptive technology (Footnote 4). Innovation-friendly companies, e.g. from the automobile industry, already use brain and physiological methods in their consumer research. Furthermore, Silicon Valley technology companies, like Facebook (Footnote 5) or the Elon Musk-founded Neuralink (Footnote 6), invest in research on invasive closed-loop neuroadaptive technologies.
Although the use and advantages of neuroadaptive interaction are evident, some major gaps and challenges must still be overcome before it can be applied outside of lab conditions. (1) Understanding brain functions outside the lab: there is still a significant lack of basic knowledge about human brain functions in complex real-world situations, where individuals perform activities in natural environments and social contexts. In such environments, signal analysis and machine-learning interpretation remain very challenging. Robust algorithms are still needed to deal with real-world artefacts whose amplitudes strongly exceed the signals of interest. (2) Integration of context information: neurophysiological signals cannot be interpreted in isolation, but must be analysed and classified in the given application context. While context is controlled and known in the lab, real-life applications require combining context information to adapt technical systems to user states and social situations under real-world conditions.
Consequently, considerable research effort is needed to realize closed-loop solutions based on brain signals robust enough to deal with the high complexity of real-life applications [11, 12, 89].
The primary goal of a future neuroadaptive technology research agenda is to consistently align the increasing intelligence and autonomy of technical assistive systems with the emotional needs and cognitive abilities of the human: a human-centred neuroadaptive technology research roadmap. Through their individual adaptability, neuroadaptive technologies can contribute significantly to human-centred, efficient and acceptable technology. Applied research in this field and the possible transfer into real-life applications require a strongly transdisciplinary research agenda. Due to their far-reaching social implications, research and development faces not only technological but also social challenges, including questions about cognitive liberty, mental privacy, mental integrity and psychological integrity. In addition to computer science and neuroscience, further disciplines need to be integrated, such as positive psychology, which aims to foster human flourishing and well-being by researching positive emotions and their influence during human-technology interaction, as well as ethics and the social sciences. If neuroadaptive technologies are applied in non-medical areas, they must be consistently oriented to the needs and ethical values of users and society.
Notes
- 1.
https://www.ahfe2019.org/, accessed 29th July 2019.
- 2.
http://acii-conf.org/2019/, accessed 29th July 2019.
- 3.
http://www.biomed.drexel.edu/neuroergonomics/, accessed 29th July 2019.
- 4.
http://neuroadaptive.org/conference, accessed 29th July 2019.
- 5.
- 6.
https://www.wsj.com/articles/elon-musk-launches-neuralink-to-connect-brains-with-computers-1490642652, accessed 29th July 2019.
References
Osiurak F, Navarro J, Reynaud E. How our cognition shapes and is shaped by technology: a common framework for understanding human tool-use interactions in the past, present, and future. Front Psychol. 2018;9:293. https://doi.org/10.3389/fpsyg.2018.00293.
Moore P. Enhancing me: the hope and the hype of human enhancement (TechKnow). New York: Wiley; 2008.
Kahneman D. Objective happiness. In: Kahneman D, Diener E, Schwarz N, editors. Well-being: the foundations of hedonic psychology. New York: Russell Sage Foundation; 1999. p. 3–25.
Jameson A. Understanding and dealing with usability side effects of intelligent processing. AI Mag. 2009;30:23–40.
Deci EL, Ryan RM, editors. Handbook of self-determination research. Softcover ed. Rochester: University of Rochester Press; 2004.
Hassenzahl M. User experience (UX): towards an experiential perspective on product quality. New York: ACM Press; 2008. p. 11.
Spath D, Peissner M, Sproll S. Methods from neuroscience for measuring user experience in work environments. In: Rice V, editor. Advances in understanding human performance. Boca Raton: CRC Press; 2010. p. 111–21.
Engel AK, Maye A, Kurthen M, König P. Where’s the action? The pragmatic turn in cognitive science. Trends Cogn Sci. 2013;17:202–9.
Wilson M. Six views of embodied cognition. Psychon Bull Rev. 2002;9:625–36.
Risko EF, Gilbert SJ. Cognitive offloading. Trends Cogn Sci. 2016;20:676–88.
Blankertz B, Acqualagna L, Dähne S, Haufe S, Schultze-Kraft M, Sturm I, Ušćumlic M, Wenzel MA, Curio G, Müller K-R. The Berlin brain-computer interface: progress beyond communication and control. Front Neurosci. 2016;10:530. https://doi.org/10.3389/fnins.2016.00530.
Cinel C, Valeriani D, Poli R. Neurotechnologies for human cognitive augmentation: current state of the art and future prospects. Front Hum Neurosci. 2019;13:31. https://doi.org/10.3389/fnhum.2019.00013.
Thakor NV. Translating the brain-machine interface. Sci Transl Med. 2013;5:210ps17.
Nunez PL, Srinivasan R. Electric fields of the brain: the neurophysics of EEG. 2nd ed. Oxford: Oxford University Press; 2006.
Hämäläinen M, Hari R, Ilmoniemi RJ, Knuutila J, Lounasmaa OV. Magnetoencephalography—theory, instrumentation, and applications to noninvasive studies of the working human brain. Rev Mod Phys. 1993;65:413–97.
Logothetis NK, Pauls J, Augath M, Trinath T, Oeltermann A. Neurophysiological investigation of the basis of the fMRI signal. Nature. 2001;412:150–7.
Ferrari M, Quaresima V. A brief review on the history of human functional near-infrared spectroscopy (fNIRS) development and fields of application. NeuroImage. 2012;63:921–35.
Vidal JJ. Toward direct brain-computer communication. Annu Rev Biophys Bioeng. 1973;2:157–80.
Birbaumer N, Ghanayim N, Hinterberger T, Iversen I, Kotchoubey B, Kübler A, Perelmouter J, Taub E, Flor H. A spelling device for the paralysed. Nature. 1999;398:297–8.
Ramos-Murguialday A, Broetz D, Rea M, et al. Brain-machine interface in chronic stroke rehabilitation: a controlled study: BMI in chronic stroke. Ann Neurol. 2013;74:100–8.
Kübler A, Nijboer F, Mellinger J, Vaughan TM, Pawelzik H, Schalk G, McFarland DJ, Birbaumer N, Wolpaw JR. Patients with ALS can use sensorimotor rhythms to operate a brain-computer interface. Neurology. 2005;64:1775–7.
Münßinger JI, Halder S, Kleih SC, Furdea A, Raco V, Hösle A, Kübler A. Brain painting: first evaluation of a new brain–computer interface application with ALS-patients and healthy volunteers. Front Neurosci. 2010;4:182. https://doi.org/10.3389/fnins.2010.00182.
Wolpaw JR, Birbaumer N, McFarland DJ, Pfurtscheller G, Vaughan TM. Brain-computer interfaces for communication and control. Clin Neurophysiol. 2002;113:767–91.
Wolpaw JR. Brain-computer interfaces as new brain output pathways. J Physiol Lond. 2007;579:613–9.
Carlson T, Millán JdR. Brain-controlled wheelchairs: a robotic architecture. IEEE Robot Autom Mag. 2013;20:65–73.
Vukelić M, Gharabaghi A. Oscillatory entrainment of the motor cortical network during motor imagery is modulated by the feedback modality. NeuroImage. 2015;111:1–11.
Brauchle D, Vukelić M, Bauer R, Gharabaghi A. Brain state-dependent robotic reaching movement with a multi-joint arm exoskeleton: combining brain-machine interfacing and robotic rehabilitation. Front Hum Neurosci. 2015;9:564. https://doi.org/10.3389/fnhum.2015.00564.
Vukelić M, Belardinelli P, Guggenberger R, Royter V, Gharabaghi A. Different oscillatory entrainment of cortical networks during motor imagery and neurofeedback in right and left handers. NeuroImage. 2019;195:190–202.
Rohm M, Schneiders M, Müller C, Kreilinger A, Kaiser V, Müller-Putz GR, Rupp R. Hybrid brain–computer interfaces and hybrid neuroprostheses for restoration of upper limb functions in individuals with high-level spinal cord injury. Artif Intell Med. 2013;59:133–42.
Leeb R, Tonin L, Rohm M, Desideri L, Carlson T, Millán JdR. Towards independence: a BCI telepresence robot for people with severe motor disabilities. Proc IEEE. 2015;103:969–82.
Bensch M, Karim AA, Mellinger J, Hinterberger T, Tangermann M, Bogdan M, Rosenstiel W, Birbaumer N. Nessi: an EEG-controlled web browser for severely paralyzed patients. Comput Intell Neurosci. 2007;2007:1–5.
Wyckoff S, Birbaumer N. Neurofeedback and brain-computer interfaces. In: Mostofsky DI, editor. The handbook of behavioral medicine. Oxford: Wiley; 2014. p. 275–312.
Birbaumer N, Ruiz S, Sitaram R. Learned regulation of brain metabolism. Trends Cogn Sci (Regul Ed). 2013;17:295–302.
Ruiz S, Lee S, Soekadar SR, Caria A, Veit R, Kircher T, Birbaumer N, Sitaram R. Acquired self-control of insula cortex modulates emotion recognition and brain network connectivity in schizophrenia. Hum Brain Mapp. 2013;34:200–12.
Choi SW, Chi SE, Chung SY, Kim JW, Ahn CY, Kim HT. Is alpha wave neurofeedback effective with randomized clinical trials in depression? A pilot study. Neuropsychobiology. 2011;63:43–51.
Ehlis A-C, Schneider S, Dresler T, Fallgatter AJ. Application of functional near-infrared spectroscopy in psychiatry. NeuroImage. 2014;85:478–88.
Craik A, He Y, Contreras-Vidal JL. Deep learning for electroencephalogram (EEG) classification tasks: a review. J Neural Eng. 2019;16:031001.
Lotte F, Bougrain L, Cichocki A, Clerc M, Congedo M, Rakotomamonjy A, Yger F. A review of classification algorithms for EEG-based brain–computer interfaces: a 10 year update. J Neural Eng. 2018;15:031005.
Seifert C, Granitzer M, Bailer W, Orgel T, Gantner L, Kern R, Ziak H, Petit A, Schlötterer J, Zwicklbauer S. Ubiquitous access to digital cultural heritage. J Comput Cult Herit. 2017;10:1–27.
Radu V, Lane ND, Bhattacharya S, Mascolo C, Marina MK, Kawsar F. Towards multimodal deep learning for activity recognition on mobile devices. In: Proceedings of the 2016 ACM international joint conference on pervasive and ubiquitous computing adjunct—UbiComp’16. Heidelberg: ACM Press; 2016. p. 185–8.
Sankaran K, Zhu M, Guo XF, Ananda AL, Chan MC, Peh L-S. Using mobile phone barometer for low-power transportation context detection. In: Proceedings of the 12th ACM conference on embedded network sensor systems—SenSys’14. Memphis: ACM Press; 2014. p. 191–205.
Liu H, Wang J, Wang X, Qian Y. iSee: obstacle detection and feedback system for the blind. In: Proceedings of the 2015 ACM international joint conference on pervasive and ubiquitous computing and proceedings of the 2015 ACM international symposium on wearable computers—UbiComp’15. Osaka: ACM Press; 2015. p. 197–200.
Mens K, Capilla R, Cardozo N, Dumas B. A taxonomy of context-aware software variability approaches. In: Companion proceedings of the 15th international conference on modularity—MODULARITY companion 2016. Malaga: ACM Press, Spain; 2016. p. 119–24.
Kaklanis N, Biswas P, Mohamad Y, Gonzalez MF, Peissner M, Langdon P, Tzovaras D, Jung C. Towards standardisation of user models for simulation and adaptation purposes. Univ Access Inf Soc. 2016;15:21–48.
Yan L, Ma Q, Yoshikawa M. Classifying twitter users based on user profile and followers distribution. In: Decker H, Lhotská L, Link S, Basl J, Tjoa AM, editors. Database and expert systems applications. Berlin: Springer; 2013. p. 396–403.
Gao R, Hao B, Bai S, Li L, Li A, Zhu T. Improving user profile with personality traits predicted from social media content. In: Proceedings of the 7th ACM conference on recommender systems—RecSys’13. Hong Kong: ACM Press; 2013. p. 355–8.
Besel C, Schlötterer J, Granitzer M. On the quality of semantic interest profiles for online social network consumers. SIGAPP Appl Comput Rev. 2016;16:5–14.
Licklider JCR. Man-computer symbiosis. IRE Trans Hum Factors Electron HFE. 1960;1:4–11.
Pope AT, Bogart EH, Bartolome DS. Biocybernetic system evaluates indices of operator engagement in automated task. Biol Psychol. 1995;40:187–95.
Krusienski DJ, Sellers EW, McFarland DJ, Vaughan TM, Wolpaw JR. Toward enhanced P300 speller performance. J Neurosci Methods. 2008;167:15–21.
Kwak N-S, Müller K-R, Lee S-W. A lower limb exoskeleton control system based on steady state visual evoked potentials. J Neural Eng. 2015;12:056009.
Yin E, Zhou Z, Jiang J, Chen F, Liu Y, Hu D. A novel hybrid BCI speller based on the incorporation of SSVEP into the P300 paradigm. J Neural Eng. 2013;10:026012.
Zander TO, Kothe C. Towards passive brain-computer interfaces: applying brain-computer interface technology to human-machine systems in general. J Neural Eng. 2011;8:025005.
McDowell K, Lin C-T, Oie KS, Jung T-P, Gordon S, Whitaker KW, Li S-Y, Lu S-W, Hairston WD. Real-world neuroimaging technologies. IEEE Access. 2013;1:131–49.
Zander TO, Andreessen LM, Berg A, Bleuel M, Pawlitzki J, Zawallich L, Krol LR, Gramann K. Evaluation of a dry EEG system for application of passive brain-computer interfaces in autonomous driving. Front Hum Neurosci. 2017;11:78. https://doi.org/10.3389/fnhum.2017.00078.
Piper SK, Krueger A, Koch SP, Mehnert J, Habermehl C, Steinbrink J, Obrig H, Schmitz CH. A wearable multi-channel fNIRS system for brain imaging in freely moving subjects. NeuroImage. 2014;85:64–71.
Haeussinger FB, Dresler T, Heinzel S, Schecklmann M, Fallgatter AJ, Ehlis A-C. Reconstructing functional near-infrared spectroscopy (fNIRS) signals impaired by extra-cranial confounds: an easy-to-use filter method. NeuroImage. 2014;95:69–79.
Schecklmann M, Mann A, Langguth B, Ehlis A-C, Fallgatter AJ, Haeussinger FB. The temporal muscle of the head can cause artifacts in optical imaging studies with functional near-infrared spectroscopy. Front Hum Neurosci. 2017;11:456. https://doi.org/10.3389/fnhum.2017.00456.
Biessmann F, Plis S, Meinecke FC, Eichele T, Müller K-R. Analysis of multimodal neuroimaging data. IEEE Rev Biomed Eng. 2011;4:26–58.
Dähne S, Biessmann F, Meinecke FC, Mehnert J, Fazli S, Müller K-R. Multimodal integration of electrophysiological and hemodynamic signals. IEEE; 2014. p. 1–4.
Bauer W, Vukelić M. EMOIO research project: an interface to the world of computers. In: Neugebauer R, editor. Digital transformation. Berlin: Springer; 2019. p. 129–44.
Vukelić M, Pollmann K, Peissner M. Toward brain-based interaction between humans and technology. In: Neuroergonomics. Amsterdam: Elsevier; 2019. p. 105–9.
Pollmann K, Ziegler D, Peissner M, Vukelić M. A new experimental paradigm for affective research in neuro-adaptive technologies. New York: ACM Press; 2017. https://doi.org/10.1145/3038439.3038442.
Dijksterhuis C, de Waard D, Brookhuis KA, Mulder BLJM, de Jong R. Classifying visuomotor workload in a driving simulator using subject specific spatial brain patterns. Front Neurosci. 2013;7:149. https://doi.org/10.3389/fnins.2013.00149.
Berka C, Levendowski DJ, Lumicao MN, Yau A, Davis G, Zivkovic VT, Olmstead RE, Tremoulet PD, Craven PL. EEG correlates of task engagement and mental workload in vigilance, learning, and memory tasks. Aviat Space Environ Med. 2007;78:B231–44.
Aricò P, Borghini G, Di Flumeri G, Colosimo A, Pozzi S, Babiloni F. A passive brain–computer interface application for the mental workload assessment on professional air traffic controllers during realistic air traffic control tasks. In: Progress in brain research. Amsterdam: Elsevier; 2016. p. 295–328.
Haufe S, Kim J-W, Kim I-H, Sonnleitner A, Schrauf M, Curio G, Blankertz B. Electrophysiology-based detection of emergency braking intention in real-world driving. J Neural Eng. 2014;11:056011.
Lahmer M, Glatz C, Seibold VC, Chuang LL. Looming auditory collision warnings for semi-automated driving: an ERP study. In: Proceedings of the 10th international conference on automotive user interfaces and interactive vehicular applications—AutomotiveUI’18. Toronto: ACM Press; 2018. p. 310–9.
Ihme K, Unni A, Zhang M, Rieger JW, Jipp M. Recognizing frustration of drivers from face video recordings and brain activation measurements with functional near-infrared spectroscopy. Front Hum Neurosci. 2018;12:327.
Dehais F, Roy RN, Scannella S. Inattentional deafness to auditory alarms: inter-individual differences, electrophysiological signature and single trial classification. Behav Brain Res. 2019;360:51–9.
Dehais F, Duprès A, Blum S, Drougard N, Scannella S, Roy R, Lotte F. Monitoring pilot’s mental workload using ERPs and spectral power with a six-dry-electrode EEG system in real flight conditions. Sensors. 2019;19:1324.
Ayaz H, Shewokis PA, Bunce S, Izzetoglu K, Willems B, Onaral B. Optical brain monitoring for operator training and mental workload assessment. NeuroImage. 2012;59:36–47.
Walter C, Rosenstiel W, Bogdan M, Gerjets P, Spüler M. Online EEG-based workload adaptation of an arithmetic learning environment. Front Hum Neurosci. 2017;11:286.
Mühl C, Allison B, Nijholt A, Chanel G. A survey of affective brain computer interfaces: principles, state-of-the-art, and challenges. Brain Comput Interfaces. 2014;1:66–84.
Liberati G, Federici S, Pasqualotto E. Extracting neurophysiological signals reflecting users’ emotional and affective responses to BCI use: a systematic literature review. NeuroRehabilitation. 2015;37:341–58.
Zander TO, Krol LR, Birbaumer NP, Gramann K. Neuroadaptive technology enables implicit cursor control based on medial prefrontal cortex activity. Proc Natl Acad Sci. 2016;113(52):14898–903.
Fairclough SH. Fundamentals of physiological computing. Interact Comput. 2009;21:133–45.
Hettinger LJ, Branco P, Encarnacao LM, Bonato P. Neuroadaptive technologies: applying neuroergonomics to the design of advanced interfaces. Theor Issues Ergon Sci. 2003;4:220–37.
Sonnleitner A, Simon M, Kincses WE, Buchner A, Schrauf M. Alpha spindles as neurophysiological correlates indicating attentional shift in a simulated driving task. Int J Psychophysiol. 2012;83:110–8.
Chavarriaga R, Gheorghe L. Detecting cognitive states for enhancing driving experience. In: Proceedings of the fifth international brain-computer interface meeting: defining the future; June 3–7, 2013; Asilomar Conference Center, Pacific Grove, CA, USA; 2013. https://doi.org/10.3217/978-3-85125-260-6-60.
Unni A, Ihme K, Jipp M, Rieger JW. Assessing the driver’s current level of working memory load with high density functional near-infrared spectroscopy: a realistic driving simulator study. Front Hum Neurosci. 2017;11:167.
Pollmann K, Stefani O, Bengsch A, Peissner M, Vukelić M. How to work in the car of the future?: a neuroergonomical study assessing concentration, performance and workload based on subjective, behavioral and neurophysiological insights. In: Proceedings of the 2019 CHI conference on human factors in computing systems—CHI’19. Glasgow: ACM Press; 2019. p. 1–14.
Spüler M, Krumpe T, Walter C, Scharinger C, Rosenstiel W, Gerjets P. Brain-computer interfaces for educational applications. In: Buder J, Hesse FW, editors. Informational environments. Cham: Springer International Publishing; 2017. p. 177–201.
Kosuru RK, Lingelbach K, Bui M, Vukelić M. MindTrain: how to train your mind with interactive technologies. In: Proceedings of mensch und computer 2019 on—MuC’19. Hamburg: ACM Press; 2019. p. 643–7.
Perrin X, Chavarriaga R, Colas F, Siegwart R, Millán JdR. Brain-coupled interaction for semi-autonomous navigation of an assistive robot. Robot Auton Syst. 2010;58:1246–55.
Chavarriaga R, Sobolewski A, Millán JdR. Errare machinale est: the use of error-related potentials in brain-machine interfaces. Front Neurosci. 2014;8:208. https://doi.org/10.3389/fnins.2014.00208.
Iwane F, Halvagal MS, Iturrate I, Batzianoulis I, Chavarriaga R, Billard A, Millán JdR. Inferring subjective preferences on robot trajectories using EEG signals. In: 2019 9th international IEEE/EMBS conference on neural engineering (NER). San Francisco: IEEE; 2019. p. 255–8.
Edelman BJ, Meng J, Suma D, Zurn C, Nagarajan E, Baxter BS, Cline CC, He B. Noninvasive neuroimaging enhances continuous neural tracking for robotic device control. Sci Robot. 2019;4:eaaw6844.
Brouwer A-M, Zander TO, van Erp JBF, Korteling JE, Bronkhorst AW. Using neurophysiological signals that reflect cognitive or affective state: six recommendations to avoid common pitfalls. Front Neurosci. 2015;9:136. https://doi.org/10.3389/fnins.2015.00136.
© 2021 Springer Nature Switzerland AG
Vukelić, M. (2021). Connecting Brain and Machine: The Mind Is the Next Frontier. In: Friedrich, O., Wolkenstein, A., Bublitz, C., Jox, R.J., Racine, E. (eds) Clinical Neurotechnology meets Artificial Intelligence. Advances in Neuroethics. Springer, Cham. https://doi.org/10.1007/978-3-030-64590-8_16
DOI: https://doi.org/10.1007/978-3-030-64590-8_16
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-64589-2
Online ISBN: 978-3-030-64590-8