Abstract
Purpose of Review
Starting with a technical categorization and an overview of current exoskeletons and orthoses and their applications, this review focuses on robotic exoskeletons and orthoses for neuromotor rehabilitation and relevant research needed to provide individualized adaptive support to people under complex environmental conditions, such as assisted daily living.
Recent Findings
Many different approaches from the field of autonomous robots have recently been applied to the control of exoskeletons. In addition, approaches from the field of brain-computer interfaces for intention recognition are being intensively researched to improve interaction. Finally, besides stimulation, bidirectional feedback and feedback-based learning are recognized as very important to enable individualized, flexible, and adaptive human assistance.
Summary
AI-based methods for adaptation and online learning of robotic exoskeleton control, combined with intrinsic recognition of human intentions and consent, will in particular lead to improving the quality of human–robot interaction and thus user satisfaction with exoskeleton-based rehabilitation interventions.
Introduction
Supporting people with motor impairments with exoskeletons, i.e., creating an external support structure for the human body or providing additional power, is a very old desire. The first patents on mechanisms to increase human performance date back to 1890 (e.g., Yagn [1]). The first exoskeletons were already used for teleoperation in the last century (e.g., Alfven and Kleinwächter [2]), and the support of the human musculature by technical systems [3] has a similarly long history.
Over time, the areas of application have expanded [4]. Power augmentation through exoskeletons became particularly relevant in care (easier lifting of patients) [5], sports,Footnote 1 work [6], and military applications [7]. In comparison, teleoperation is of particular interest in enabling simple remote control of complex, often robotic systems with manipulators. Here, force feedback to the human has been introduced so that the operator can detect when the exoskeleton encounters an obstacle [8,9,10]. More recently, the fields of application have expanded with regard to the use of exoskeletons to relieve tiring, mostly repetitive or continuous activities [6]. Examples include exoskeletons that serve as a mobile portable seat [11] or have arm-holding functions [12]. This trend is driven by demographic change and is based on the use of exoskeletons for medical applications, such as restoring mobility or for rehabilitation [13,14,15]. The effectiveness of exoskeletons for neuromotor rehabilitation, especially after stroke, was shown early on [16] and has been repeatedly confirmed since then [17,18,19].
Since the last century, many different technical solutions have been developed for the various areas of application. A distinction can be made, for example, between exoskeletons and orthoses as well as between passive and active systems. What all these systems have in common is that they have a complicated mechanical construction that encloses the body and, at least in the case of active systems, require the highest performance in the areas of control, software, and hardware architecture. Depending on the application, the use of physiological data and artificial intelligence for intuitive and situational support is moreover of great importance [20,21,22]. We will discuss different developments and research paths, arguing that future robotic exoskeletons will use bidirectional feedback and will no longer be purely programmed, but will intrinsically learn to behave correctly, or subjectively correctly, during interaction to enable true co-adaptation between humans and machines as a prerequisite for support and rehabilitation under complex environmental conditions, such as assisted daily living (ADL).
This review will first give a brief technical categorization of exoskeletons and orthoses, followed by an overview of application areas with a focus on exoskeletons and orthoses that are already commercially available. We then define robotic exoskeletons and orthoses and briefly discuss the relevance of autonomous functions in such systems for different application areas. After a discussion of different possibilities for exoskeletons to adapt to the user, an overview of different approaches for intention recognition is given, with a focus on more complex approaches motivated by hybrid brain-computer interfaces in the field of neuromotor rehabilitation and assistance. Finally, we discuss bidirectional feedback as the last component of human-centered robotic exoskeletons and orthoses before giving a short outlook for future research on bidirectional coadaptive robotic exoskeletons.
A Technical Categorization of Today’s Exoskeletons and Orthoses
In general, a distinction can be made between passive and active exoskeletons and orthoses. According to the Oxford Concise Medical Dictionary, an orthosis is a device that applies external forces to a specific part of the body to protect or support joints and to correct deformities of the musculoskeletal system [23]. Orthoses are also characterized by the fact that they fit closely to the body and are designed to be as light as possible [24]. Exoskeletons, on the other hand, do not always fit closely to the body and are not supported solely by the user's muscle strength (for an example, see [25]). Because orthoses influence the neuromuscular system, they also have a biomechanical effect on the motor functions of the affected limbs and thereby restrict movement in terms of permitted degrees of freedom or range of motion. Orthoses can be divided into active and passive systems. Active systems are equipped with power units and sensors that are used to support or replace the user's muscles. Passive systems, conversely, do not generate any support in the form of mechanical energy from power units, so the user must carry the weight of the system with his or her own muscles. Accordingly, it is important to minimize the system's own weight [26]. Orthoses can be developed for all kinds of joints, for example, the E-MAG Active (Ottobock) [27] for the knee, the MalleoLoc (Bauerfeind) [28] for the ankle, or the OsoTract (Bort medical) [29] for the shoulder. For more examples, see Fig. 1, which can be decoded with Table 1.
As can be seen in Fig. 2, exoskeletons are devices that are not only used in the medical domain for rehabilitation and pain reduction but also in industry for strength enhancement, injury prevention, or teleoperation. Exoskeletons can also be divided into passive and active systems. In addition, if the application domain is teleoperation, force feedback from an active exoskeleton can also be exerted on the user. In this case, the sensors are used to detect joint angles or to measure the existing muscle activity to determine the resulting force for the support [26]. Some examples of active exoskeletons include the EksoNR (Ekso Bionics) [30] for gait rehabilitation, the Capio (DFKI GmbH) [10] for teleoperation, or the Recupera Reha (DFKI GmbH) for upper limb rehabilitation [31] (more examples in Fig. 1). An example of a passive exoskeleton is the Paexo Neck from Ottobock Industrials [32] (see Fig. 1). A special feature of some passive exoskeletons used, for example, for overhead work are spring elements, which can be mechanical or gas springs (see Balser et al. [33], Hyun et al. [34], Kazerooni et al. [6], and Maurice et al. [35] for details on the effect and the mechanism of the springs). During overhead work, these springs are used to balance the weight of the arms. This relieves the shoulder muscles, and the forces are transferred to the lower back [33]. This mechanism is used, for example, in the Ekso EVO (Ekso Bionics) [33], H-VEX (Hyundai) [34], and Skelex 360-XFR (SkeleX) [36]. In the case of gas springs, the force is generated by another part of the body during execution of the movement to be supported. This method is used, for example, with the BackX (SuitX) [6]. Here, when the user leans forward, a torque is generated in response to the gravitational force of the torso, thus relieving the user's lower back.
In summary, there are more passive than active or robotic systems overall (see Fig. 1). It can also be seen from Figs. 1 and 2 that almost all systems in the industry are passive. Furthermore, there are very few devices for full-body application and more systems for the lower body than for the upper body available.
Currently Existing Application Domains
Exoskeletons and orthoses are already applied in many different domains. An overview of the domains of the different devices considered here can be seen in Fig. 2. Orthoses are mainly used in the domain of rehabilitation, for example, the E-MAG Active (Ottobock) [27] or the OsoTract (Bort medical) [29] (see Fig. 2). There are also orthoses that have been developed for elderly people who have age-related problems and need assistance in daily life (e.g., Agilium Freestep 3.0 (Ottobock) [37]). Overall, orthoses are thus found mainly in the medical context, e.g., for rehabilitation and pain reduction. Exoskeletons are found in significantly more domains than orthoses. For industry, for example, there are already many passive and some active devices available today (see Fig. 2). The passive devices are mainly used to prevent injuries to the musculoskeletal system by relieving the musculature, e.g., Moon (Human Mechanical Technologies) [38], SoftExo Carry (Hunic) [39], and Laevo V2 (Laevo) [40]. The active device mentioned in Fig. 1, CrayX (German Bionic) [41], is also designed to relieve the lower back, but additionally has active walking support.
Another domain of exoskeletons is rehabilitation, as well as the (permanent) assistance of people with incurable movement restrictions (e.g., Ekso UE (Ekso Bionics) [42], BoostX Knee (SuitX) [43], Harmony (ReNeu Robotics Lab) [44]). Most devices for rehabilitation provide active support and are intended for recovery and functional compensation of people with physical disorders [45]. Those categorized as passive in Fig. 1 provide support via springs, making them passive devices as defined in the previous section. Passive exoskeletons have also in some cases been developed especially for medical care staff to provide support when transferring or lifting patients (SoftExo Care (Hunic) [46]) or to protect them from X-rays (ShieldX (SuitX) [47]). The active exoskeleton Forge Performance (Roam Robotics) [48] was developed especially for the military. It provides active walking support so that soldiers can walk on uneven terrain for long periods without tiring. In contrast to orthoses, there is no exoskeleton in Fig. 2 that was developed exclusively to support the elderly in daily life and to alleviate age-related complaints; existing devices always focus on injury prevention or on relieving the musculature. Figure 2 shows some active exoskeletons that currently exist only as demonstrators or prototypes and are therefore not yet available for purchase. The BLEEX (U.C. Berkeley) [49], for example, is being developed for the military. It is designed to allow soldiers to carry up to 70 kg in a backpack and have active walking support, making it possible to carry this load over longer distances [49]. There are also demonstrators of active exoskeletons (Capio (DFKI GmbH) [50], VI-Bot (DFKI GmbH) [51], Exodex-Adam (DLR) [52]) that were developed to be used in aerospace and teleoperation. A unique selling point is that they have force feedback, which should simplify operation for the user over a long distance [50].
Teleoperation is thus a domain that could be opened up in a new way, showing completely new possibilities. One prototype, named "The Product" by Tendo [53], is shown in Fig. 2. It is an active exoskeleton designed to assist with grasping by controlling artificial tendons [53].
In summary, there are currently no systems purely for assistance and only a few that work with biosignals (for example, in rehabilitation), as shown in Fig. 2. A variety of passive systems are available for industry, where the daily benefits are most visible.
Robotic Exoskeletons and Orthoses
Robots are, in a general definition, multi-purpose handling devices that not only perform one task but can also be adapted to solve other tasks by being programmable and equipped with various tools. A distinction is often made between robots with fixed work sequences, such as classical industrial robots; robots that can perform several specific tasks in variable sequence; playback or teach-in robots, whose motion sequences are demonstrated by an operator, stored, and replayed; numerically controlled robots, for which the operator can flexibly program various motion sequences; and generally "intelligent" robots. Intelligent robots perceive their environment via sensors and interpret it via various algorithms and (learned) models to solve tasks even if something in the environment changes. The definition of autonomous robots combines some of the aforementioned categories: autonomous robots move independently and complete a not exactly defined task without human assistance by calculating or learning solutions.
Robotic orthoses and exoskeletons, i.e., active systems that can move independently via motors, can either be assigned to one of the mentioned subgroups of robots or even fulfil several criteria of different subgroups. Very complex systems can include both simple algorithms that adapt their own movement, for example, by compensating for their own weight or the weight of parts of the human bodyFootnote 2 in the sense of gravity compensation, and more complex control approaches that allow one part of the system to control another part, for example, one arm or leg controlling the other [21, 54]. In the same system, in another mode, the movements can be trained by a therapist in the sense of a teach-in and then repeated automatically or triggered by the recognition of the person's intention to move from, e.g., their residual muscle activity,Footnote 3 eye movement, or brain activity [21]. The latter requires an intelligent interface, such as a BCI, which can be part of the intelligent robotic exoskeleton or orthosis [55]. In teleoperation, intelligent bidirectional exoskeletons can not only transmit movements from the human to the robot but also feed back tactile information to the operator [10].
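The gravity-compensation mode mentioned above can be illustrated with a minimal sketch. The following is a hypothetical planar two-link arm model (link masses, lengths, and mid-link centers of mass are illustrative assumptions, not parameters of any system discussed here); the exoskeleton applies exactly the joint torques that cancel the weight of its own structure and the supported limb, so the user moves freely.

```python
import math

def gravity_compensation_torques(q1, q2, m1=2.0, m2=1.5,
                                 l1=0.3, l2=0.25, g=9.81):
    """Joint torques that cancel gravity for a planar two-link arm.

    Hypothetical model: q1, q2 are shoulder and elbow angles (rad,
    measured from the horizontal), m1/m2 are link masses (kg) including
    the supported limb segment, l1/l2 are link lengths (m) with the
    center of mass assumed at mid-link.
    """
    # Torque at the elbow joint from the forearm segment's weight
    tau2 = m2 * g * (l2 / 2) * math.cos(q1 + q2)
    # Torque at the shoulder joint from both segments' weight
    tau1 = (m1 * g * (l1 / 2) * math.cos(q1)
            + m2 * g * (l1 * math.cos(q1) + (l2 / 2) * math.cos(q1 + q2)))
    return tau1, tau2
```

With the arm hanging straight down, both torques vanish; with the arm held horizontally, the shoulder torque is largest, which matches the intuition that gravity loads the proximal joint most.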
Hence, intelligent robotic exoskeletons and orthoses that are able to sense and analyze their environment and adapt their behavior are on the one hand autonomous robots, but also have a strong, sometimes bidirectional interactive component to recognize the user's intention or capabilities [56], as will be discussed in the following sections.
Autonomous Adaptation of Behavior
Only very few exoskeletons for neuromotor rehabilitation are designed to support humans with no motor abilities left. For example, the Atalante exoskeleton from Wandercraft can balance itself and a patient with complete spinal cord injury during walking [57]. For efficient rehabilitation and motivating support, exoskeletons and orthoses must adapt their behavior to the abilities of the patient, e.g., not only supporting an arm movement with varying strength depending on the patient's condition, called assist-as-needed [58,59,60], but also enabling natural movements by synchronizing human–robot movements [61, 62]. For guided movement, impedance control allows variable adjustment of support, i.e., from precise guidance along the trajectories with rather passive participation of the patient to very weak support in the direction of the target trajectory with greater personal participation, as shown for gait [63] as well as upper body rehabilitation (see [64] for review). To adapt the control of exoskeletons even better to the patient and his or her needs, biosignals such as the electromyogram (EMG) can be used in the control [65].
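The assist-as-needed idea behind impedance control can be sketched for a single joint. This is a minimal illustrative example, not the controller of any system cited above: a deadband leaves a corridor around the reference trajectory in which the patient moves freely, and the stiffness parameter scales support from weak guidance to firm correction; all parameter values are assumptions.

```python
def assist_as_needed_torque(q, q_ref, dq, stiffness, damping=2.0,
                            deadband=0.05):
    """Impedance-style corrective torque toward a reference trajectory.

    Hypothetical single-joint sketch: `stiffness` (Nm/rad) sets how
    firmly the exoskeleton guides the patient (high = precise guidance,
    low = weak support with more personal participation); `deadband`
    (rad) is a corridor around the reference in which the patient
    receives no assistance at all ("assist-as-needed").
    """
    error = q_ref - q
    # No assistance while the patient stays close to the trajectory
    if abs(error) <= deadband:
        return 0.0
    # Spring-damper pull back toward the corridor edge otherwise
    corrected = error - deadband if error > 0 else error + deadband
    return stiffness * corrected - damping * dq
```

Lowering the stiffness for the same deviation yields a proportionally weaker corrective torque, which is one simple way to grade support to the patient's residual abilities.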
For the support of patients under ADL, it is, for example, not sufficient to adapt movement patterns for gait rehabilitation to resemble able-bodied gait [66], to assure the balance of the user [57], or to automatically recognize and respond to different gaits [61, 62]. Rather, systems that assist under ADL conditions must also be able to recognize and understand the environment, e.g., the position and size of stairs in order to climb them [67]. Such behaviors under ADL conditions were often developed based on research on autonomous systems, here humanoid robot walking. By automated environment recognition (see [68] for a review on environment recognition), patients can thus in the future be re-enabled to operate in a wide variety of environments and perform activities of daily living.
For upper body rehabilitation (see [69] and [70] for reviews on the effect of robot-guided upper limb rehabilitation) and support under ADL, even more autonomous robot functions must be implemented in robotic exoskeletons (see [71•] for an evaluation of potential needs for upper body exoskeletons in ADL). Most such functions, e.g., recognizing a cup that the user wants to drink from, aiding with reaching [72•] and gripping [73], bringing the cup to the mouth, and supporting the drinking itself, are not yet implemented as a complete solution within a single exoskeleton, especially not in rather complex hand exoskeletons, and are often only explored in simulation [74]. Furthermore, how much autonomy is needed always depends on the remaining abilities of the user and on the exoskeleton's ability to autonomously recognize the user's intentions, as will be explained in the next section.
Intention Recognition Using Multimodal Data
The perception of intentions is particularly important in the context of neuromotor rehabilitation [17, 18] or ADL support by an exoskeleton [75,76,77]. An intelligent exoskeleton must recognize the patient's intention, e.g., where he or she wants to move [78, 79]. But also in other applications, such as teleoperation, it can be relevant to understand what a user's intentions are. For example, the human electroencephalogram (EEG) can help to detect the movement intention and thereby increase the sensitivity of the force sensors that measure interaction forces between human and exoskeleton for more transparent behavior [8].
Brain-computer interfaces (BCIs) [80] can be used to detect the user's intention, such as the walking direction when using a lower limb exoskeleton based on steady-state visually evoked potentials (SSVEPs) [78, 79] or the intention to move the arm, for example [81]. In general, BCIs are often used to control the environment and assistive devices [82,83,84,85]. The above given examples of using BCIs in the control of exoskeletons [8, 17, 18, 75,76,77,78,79] show that the analysis of human physiological data can be used to infer intention. However, also other physiological data or data recorded from the exoskeleton, such as motion data recorded from a healthy leg to control the impaired leg [54] or data from force sensors to measure interaction forces [8], can be utilized to infer intentions, to support a human adequately, and to interact with the environment. In recent research on intent recognition for exoskeleton-based support and rehabilitation, multimodal physiological [77, 86] and external data [87] are combined to increase reliability and performance [56].
Therefore, for intention detection and control, signals from the EEG, such as motor-related signals elicited by motor imagery [88] (see the recent review in [89] on brain-computer interfaces based on motor imagery) or by planning intended motor movements, are often not used alone but in combination with other signals using hybrid BCIs (for reviews of hybrid BCIs, see [90] and [91]). Hybrid BCIs can, on the one hand, use two different types of EEG patterns from different sources, such as a specific EEG signal elicited by motor preparation or motor imagery combined with other patterns in the EEG such as the so-called SSVEP [92,93,94], an activity elicited by repetitive visual stimuli. On the other hand, two different EEG patterns, such as frequency- or time-domain related patterns elicited by the same internal or external event such as movement preparation [95, 96], can also be combined in a hybrid BCI. Other approaches use different types of signals, such as EEG and other physiological data like the electrooculogram (EOG) [97, 98], the EMG [86], or the electrocardiogram (ECG) [99], or combine EEG with other non-physiological signals such as eye-tracking data [100] or joystick data [101] (for a comprehensive and systematic review, see [91]). The main reason to combine multimodal data is to improve the accuracy of the BCI [77, 86, 89, 90] or to enhance controllability [89, 98]. It is also possible to improve the adaptability of assistive technical aids to individual needs through the availability of increased information [87] or to enable depth-directional navigation for hand-disabled persons through better recognition of the environment with several sensors [100].
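The multimodal combination described above is often realized as a late fusion of single-modality classifiers. The following is a deliberately simple sketch under assumed names and thresholds (none taken from the cited systems): two classifier outputs, e.g., a motor-imagery EEG classifier and an EMG onset detector, are merged into one confidence value, and support is only triggered when the fused confidence is high enough.

```python
def fuse_intention(p_eeg, p_emg, w_eeg=0.5, w_emg=0.5, threshold=0.7):
    """Late fusion of two single-modality intention classifiers.

    Hypothetical sketch: p_eeg and p_emg are the probabilities of a
    movement intention estimated from, e.g., a motor-imagery EEG
    classifier and an EMG onset detector. A weighted average must
    exceed `threshold` before the exoskeleton triggers support,
    trading sensitivity against false activations.
    """
    # Normalized weighted average of the two modality confidences
    p = (w_eeg * p_eeg + w_emg * p_emg) / (w_eeg + w_emg)
    return p >= threshold, p
```

If both modalities agree, the fused confidence is high and support is triggered; if one modality is uncertain, the fused value drops below the threshold, which is one simple way the redundancy of hybrid BCIs improves reliability.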
Based on the research on hybrid BCIs reviewed above, it can be derived that, for safe and flexible intention recognition for exoskeleton control under ADL conditions, multimodal data such as EEG and EMG data [87] should be combined to reliably detect the intention of the user and be extended by additional data to infer how the user wants to interact with or within the environment [21, 75,76,77,78, 87].
Stimulation, Bidirectional Feedback, and Interactive Learning
Feedback helps to improve human–machine interaction. It improves the user's awareness of specific situations, e.g., whether the user of an exoskeleton is in balance [102], transfers tactile information using force feedback [8,9,10], or directs the user back to a target trajectory as soon as he or she deviates too far from it under assist-as-needed [58,59,60]. Moreover, the combination of an exoskeleton with functional electrical stimulation (FES) has been shown to provide promising results for motor recovery in upper limb [72•, 103] and lower limb gait rehabilitation [104] or during sit-to-stand transitions [105], with reduced muscle fatigue caused by FES when combined with an active exoskeleton (for review, see [106]).
While stimulation offers a way to directly influence muscles using FES, or brain areas, such as the motor area using repetitive transcranial magnetic stimulation (rTMS), to improve neuromotor rehabilitation after stroke [107], providing feedback allows for a more indirect adaptation, as known from biofeedback approaches (see [108] for review). A biofeedback signal can be of two categories, i.e., biomechanical or physiological [108]. Biomechanical biofeedback is most often either combined with or integrated into exoskeleton-based assistance or rehabilitation. Most BCIs, which can be combined with exoskeletons as discussed above, make use of biofeedback, i.e., neurofeedback. Any type of biofeedback is used to support the user's learning by making him or her aware of changes in the body (processes and behavior). Stimulation, too, addresses the user, e.g., by enhancing activity in motor areas of the brain while an exoskeleton is used for stroke rehabilitation, to improve the effectiveness of the therapy [109].
However, feedback should also be given to the machine, e.g., to the exoskeleton, to improve its behavior. This is done, e.g., in "assist-as-needed," by making use of EMG activity [65] as discussed above. Very recent approaches use feedback from humans even more fundamentally than just to adapt the robotic system. Based on learning approaches for autonomous robotic systems, especially those using reinforcement learning (RL) [110, 111], new approaches are being developed to make online learning more robust in complex environments by using human feedback (see [112] for discussion). Feedback can be given explicitly [113, 114] or implicitly inferred from the human during the interaction (discussed in [115]) to avoid additional cognitive load on the human and to obtain more consistent feedback than would be possible with explicit feedback [116,117,118,119].
A recent approach that uses human feedback is interactive reinforcement learning (IRL) (see [120,121,122] for reviews). Since human–robot interaction is crucial in assistive robotics, IRL offers great potential for using human feedback to learn, improve, or adapt a system's behavior to human needs [115]. Here, the use of brain activity as an intrinsic feedback source in IRL, i.e., for intrinsic IRL [116], became of particular interest as it combines approaches of IRL and the intrinsic use of brain activity for learning in robots. Current approaches make use of a user's error awareness, which is correlated with the so-called error-related potential (ErrP) in the EEG (for different types of ErrP, see [123, 124]), to automatically correct or learn appropriate behavior (see [125] for a general discussion). The ErrP not only correlates with error awareness [126], but it is also discussed as correlating with error severity [127]. Intrinsic feedback in RL from ErrPs has already been discussed as a way to learn the control of prostheses online, and it has been shown that a robotic arm can thus learn to reach certain predefined target locations in 2D [118]. In addition, a study with stroke patients showed that unfinished training trials produce a brain pattern similar to the ErrP, which could potentially be used to improve state-of-the-art assistive robotic therapy approaches [128•]. While [128•] discusses that assist-as-needed can be improved by detecting the time of failure during therapy to then trigger, e.g., needed support, we would like to argue that the detection of the ErrP or other error-related activities in the EEG can furthermore be used to improve assistance in relation to subjective needs, which may differ from objectively measured needs.
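The core of intrinsic IRL with ErrPs can be sketched in a few lines. This is an illustrative toy example, not the method of any cited study: a decoded ErrP after an exoskeleton action is mapped to a negative reward, its absence to a mild positive one, and a simple tabular value update shifts the system away from actions the user perceives as errors; the state and action names, learning rate, and reward values are all assumptions.

```python
def errp_q_update(q_table, state, action, errp_detected,
                  alpha=0.2, r_err=-1.0, r_ok=0.1):
    """One tabular value update driven by intrinsic EEG feedback.

    Hypothetical sketch of intrinsic interactive reinforcement
    learning: when an error-related potential (ErrP) is decoded after
    the exoskeleton's action, the action is punished (r_err);
    otherwise it is mildly reinforced (r_ok). A bandit-style update
    without successor states is used for brevity.
    """
    reward = r_err if errp_detected else r_ok
    key = (state, action)
    old = q_table.get(key, 0.0)
    # Move the stored value a fraction alpha toward the observed reward
    q_table[key] = old + alpha * (reward - old)
    return q_table[key]
```

Repeated ErrP detections drive an action's value progressively lower, so the policy learns to avoid support strategies that the user subjectively experiences as wrong, which is exactly the co-adaptive behavior argued for above.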
Conclusion
While mechanically sophisticated and stably controlled robotic exoskeletons and orthoses are already available for various applications, current research is mainly focused on improving human–machine interaction using various data from the user. This research is driven by the enormous need for human intention recognition, adaptation to the user's needs, and, especially in neuromotor rehabilitation, the provision of assistance according to needs. From the different approaches discussed in this review, it is clear that many different challenges are often addressed individually through research in a particular area. In the future, research that combines different research directions is needed to overcome the weaknesses of individual approaches and to capitalize on the strengths of others. For example, using a BCI, it will remain difficult to calculate exact movement trajectories from surface-derived brain signals, e.g., the EEG, even if other data are used. However, solutions from the field of autonomous robots can solve this problem: once the intention and goal are derived from the physiological data, a trajectory can be calculated to reach the goal. In addition, the intrinsic feedback of the human can be used to correct the guidance support offered by the robotic exoskeleton online or to learn preferred support strategies. It is the combination of different approaches in future research that will enable the effective use of bidirectional feedback for transparent behavior and optimal support, as well as robotic exoskeletons that can also be used under ADL conditions. Especially for the latter, increasingly more methods from the area of autonomous robotics need to be integrated into future robotic exoskeletons to enable the support and assistance of a human under complex natural environmental conditions.
By means of these approaches, namely bidirectional feedback and interactive learning for true co-adaptivity between human and machine, a felt fusion between human and exoskeleton will become possible in the future and will allow the integration of such technical devices into the body schema.
Notes
See video at https://youtu.be/dCn1ktzbpZ8 minute 0:58.
See video at https://youtu.be/dCn1ktzbpZ8 minute 1:25.
References
Papers of particular interest, published recently, have been highlighted as: • Of importance
N. Yagn, “Apparatus for facilitating walking”. Patent US 440684 A, 1890.
H. Alfven and H. Kleinwächter, “Syntelmann—und die möglichen Konsequenzen,” Bild der Wissenschaft, 1970.
G. Cobb, “Walking motion”. Patent US 2010482 A, 1934.
C.-J. Yang, J.-F. Zhang, Y. Chen, Y.-M. Dong and Y. Zhang, “A review of exoskeleton-type systems and their key technologies,” Proceedings of the Institution of Mechanical Engineers, Part C: Journal of Mechanical Engineering Science, pp. 1599–1612, 2008, doi: https://doi.org/10.1243/09544062JMES936.
Y. Sankai, “HAL: hybrid assistive limb based on cybernics,” Kaneko M., Nakamura Y. (eds) Robotic Research. Springer Tracts in Advanced Robotics, 2010, https://doi.org/10.1007/978-3-642-14743-2_3.
H. Kazerooni, W. Tung and M. Pillai, “Evaluation of trunk-supporting exoskeleton,” Proceedings of the Human Factors and Ergonomics Society, pp. 1080–1083, 2019, https://doi.org/10.1177/1071181319631261.
A. Zoss, H. Kazerooni and A. Chu, “On the mechanical design of the Berkeley Lower Extremity Exoskeleton (BLEEX),” IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 3465–3472, 2005, doi: https://doi.org/10.1109/IROS.2005.1545453.
M. Folgheraiter, M. Jordan, S. Straube, A. Seeland, S.-K. Kim and E. A. Kirchner, “Measuring the improvement of the interaction comfort of a wearable exoskeleton,” International Journal of Social Robotics, pp. 285–302, 2012, https://doi.org/10.1007/s12369-012-0147-x.
I. Jo, Y. Park and J. Bae, “A teleoperation system with an exoskeleton interface,” IEEE/ASME International Conference on Advanced Intelligent Mechatronics, pp. 1649–1654, 2013, doi: https://doi.org/10.1109/AIM.2013.6584333.
M. Mallwitz, N. Will, J. Teiwes and E. A. Kirchner, “The CAPIO active upper body exoskeleton and its application for teleoperation,” Proceedings of the 13th Symposium on Advanced Space Technologies in Robotics and Automation, 2015.
A. P. Irawan, D. W. Utama, E. Affandi, Michael and H. Suteja, “Product design of chairless chair based on local components to provide support for active workers,” IOP Conference Series: Materials Science and Engineering, 2019, doi:https://doi.org/10.1088/1757-899X/508/1/012054.
S. Spada, L. Ghibaudo, S. Gilotta, L. Gastaldi and M. P. Cavatorta, “Investigation into the applicability of a passive upper-limb exoskeleton in automotive industry,” Procedia Manufacturing, pp. 1255–1262, 2017, https://doi.org/10.1016/j.promfg.2017.07.252.
T. Platz and S. Roschka, “Rehabilitative Therapie bei Armparese nach Schlaganfall,” Neurol. Rehabil., pp. 81–106, 2009.
T. Platz, “Rehabilitative Therapie bei Armlähmungen nach einem Schlaganfall. S2-Leitlinie der Deutschen Gesellschaft für Neurorehabilitation,” NeuroGeriatrie, pp. 104–116, 2011.
J. Nitschke, D. Kuhn, K. Fischer and K. Röhl, “Comparison of the usability of the rewalk, Ekso and HAL,” Special edition from: OrthOpädietechnik, p. 22, 2014.
C. D. Takahashi, L. Der-Yeghiaian, V. Le, R. R. Motiwala and S. C. Cramer, “Robot-based hand motor therapy after stroke,” Brain, pp. 425–437, 2008, doi: https://doi.org/10.1093/brain/awm311.
T. Noda, N. Sugimoto, J. Furukawa, M.-A. Sato, S.-H. Hyon and J. Morimoto, “Brain-controlled exoskeleton robot for BMI rehabilitation,” Proc. 12th IEEE-RAS Int. Conf. Humanoid Robots (Humanoids), pp. 21–27, 2012, doi: https://doi.org/10.1109/HUMANOIDS.2012.6651494.
E. Hortal, D. Planelles and F. e. a. Resquin, “Using a brain-machine interface to control a hybrid upper limb exoskeleton during rehabilitation of patients with neurological conditions,” J NeuroEngineering Rehabil, 2015, https://doi.org/10.1186/s12984-015-0082-9.
N. Singh, M. Saini, N. Kumar, M. V. Padma Srivastava and A. Mehndiratta, “Evidence of neuroplasticity with robotic hand exoskeleton for post-stroke rehabilitation: a randomized controlled trial,” J NeuroEngineering Rehabil, 2021, https://doi.org/10.1186/s12984-021-00867-7.
E. A. Kirchner, J. Albiez, A. Seeland, M. Jordan and F. Kirchner, “Towards assistive robotics for home rehabilitation,” Proceedings of the 6th International Conference on Biomedical Electronics and Devices (BIODEVICES-13), 2013.
E. A. Kirchner, S.-K. Kim, S. Straube, A. Seeland, H. Wöhrle, M. M. Krell, M. Tabie and M. Fahle, “On the applicability of brain reading for predictive human-machine interfaces in robotics,” PLoS ONE, Public Library of Science, p. e81732, 2013, https://doi.org/10.1371/journal.pone.0081732.
E. A. Kirchner, N. Will, M. Simnofske, L. M. Vaca Benitez, B. Bongardt, M. M. Krell, S. Kumar, M. Mallwitz, A. Seeland, M. Tabie, H. Wöhrle, M. Yüksel, A. Heß, R. Buschfort and F. Kirchner, “Recupera-reha: exoskeleton technology with integrated biosignal analysis for sensorimotor rehabilitation,” 2. Transdisziplinäre Konferenz “Technische Unterstützungssysteme, die die Menschen wirklich wollen”, pp. 504–517, 2016.
J. Law and E. Martin, Concise Medical Dictionary, 10 ed., Oxford University Press, 2020.
H. Herr, “Exoskeletons and orthoses: classification, design challenges and future directions,” Journal of NeuroEngineering and Rehabilitation, vol. 6, no. 21, 2009, https://doi.org/10.1186/1743-0003-6-21.
E. Strickland, “Good-bye, wheelchair,” IEEE Spectrum, pp. 30–32, 2012, doi: https://doi.org/10.1109/MSPEC.2012.6117830.
Kirchner, E.A. et al., “Exoskelette der künstlichen Intelligenz in der klinischen Rehabilitation,” in Digitale Transformation von Dienstleistungen im Gesundheitswesen, Wiesbaden, Springer Gabler, 2019, pp. 413–435, https://doi.org/10.1007/978-3-658-23987-9_21.
Otto Bock HealthCare Deutschland GmbH, “Elektronisch gesteuertes Kniegelenksystem E-MAG Active,” 2021. [Online]. Available: https://www.ottobock.de/orthesen/produkte/bein-und-knieorthesen/e-mag-active/. [Accessed 11 February 2022].
Bauerfeind AG, “MalleoLoc,” Bauerfeind AG, 2022. [Online]. Available: https://www.bauerfeind.de/de/produkte/orthesen/fuss/details/product/malleoloc. [Accessed 11 February 2022].
BORT medical, “Produkte—BORT OsoTract Oberarm-Schulter-Orthese,” BORT GmbH, 2022. [Online]. Available: https://www.bort.com/de/produktdetail.html?product=121300. [Accessed 11 February 2022].
eksoBIONICS, “eksoNR,” Ekso Bionics, 2021. [Online]. Available: https://eksobionics.com/eksonr/. [Accessed 11 February 2022].
E. A. Kirchner, N. Will, M. Simnofske, L. M. Vaca Benitez, B. Bongardt, M. M. Krell, S. Kumar, M. Mallwitz, A. Seeland, M. Tabie, H. Wöhrle, M. Yüksel, A. Heß, R. Buschfort and F. Kirchner, “Recupera-reha: exoskeleton technology with integrated biosignal analysis for sensorimotor rehabilitation,” in Zweite Transdisziplinäre Konferenz “Technische Unterstützungssysteme, die die Menschen wirklich wollen”, 2016.
Otto Bock HealthCare Deutschland GmbH, “Paexo Neck,” [Online]. Available: https://paexo.com/wp-content/uploads/2019/11/2019-10363-66-Beileger-PaexoNeck-DL-DE-OBE-20190926.pdf. [Accessed 11 February 2022].
Balser F, Desai R, Ekizoglou A, Bai S. A novel passive shoulder exoskeleton designed with variable stiffness mechanism. IEEE Robotics and Automation Letters. 2022;7(2):2748–54. https://doi.org/10.1109/LRA.2022.3144529.
Hyun DJ, Bae K, Kim K, Nam S, Lee D-H. A light-weight passive upper arm assistive exoskeleton based on multi-linkage spring-energy dissipation mechanism for overhead tasks. Robotics and Autonomous Systems. 2019. https://doi.org/10.1016/j.robot.2019.103309.
Maurice P, Camernik J, Gorjan D, Schirrmeister B, Bornmann J, Tagliapietra L, Latella C, Pucci D, Fritzsche L, Ivaldi S, Babic J. Objective and subjective effects of a passive exoskeleton on overhead work. IEEE Trans Neural Syst Rehabil Eng. 2020;28(1):152–64. https://doi.org/10.1109/TNSRE.2019.2945368.
Skelex, “Skelex 360-XFR,” [Online]. Available: https://www.skelex.com/skelex-360-xfr/. [Accessed 11 February 2022].
Otto Bock HealthCare GmbH, “Agilium Freestep 3.0,” 2021. [Online]. Available: https://www.ottobock.de/orthesen/produkte/bein-und-knieorthesen/agilium-freestep-3.0/. [Accessed 11 February 2022].
HMT, “Moon,” Human Mechanical Technologies, 2022. [Online]. Available: https://www.hmt-france.com/fr/ourExoskeletons/moon. [Accessed 11 February 2022].
Hunic GmbH, “SoftExo Carry,” HUNIC SoftExo, 2022. [Online]. Available: https://hunic.com/softexo-carry/. [Accessed 11 February 2022].
Laevo Exoskeletons, “The Laevo V2,” [Online]. Available: https://www.laevo-exoskeletons.com/en/laevo-v2. [Accessed 11 February 2022].
German Bionic, “CrayX: Exoskelett für manuelles Handling,” German Bionic Systems GmbH, 2022. [Online]. Available: https://www.germanbionic.com/5th-generation/. [Accessed 11 February 2022].
eksoBIONICS, “eksoUE—upper extremity exoskeleton,” Ekso Bionics, 2020. [Online]. Available: https://eksobionics.com/de/eksoue-de/. [Accessed 11 February 2022].
SuitX, “Recreational exoskeleton—BoostX Knee,” suitx, 2021. [Online]. Available: https://www.suitx.com/boostknee. [Accessed 11 February 2022].
B. Kim and A. D. Deshpande, “An upper-body rehabilitation exoskeleton Harmony with an anatomical shoulder mechanism: design, modeling, control, and performance evaluation,” The International Journal of Robotics Research, pp. 414–435, 2017, doi: https://doi.org/10.1177/0278364917706743.
A. F. Ruiz, A. Forner-Cordero, E. Rocon and J. L. Pons, “Exoskeletons for rehabilitation and motor control,” The First IEEE/RAS-EMBS International Conference on Biomedical Robotics and Biomechatronics, pp. 601–606, 2006, doi: https://doi.org/10.1109/BIOROB.2006.1639155.
Hunic GmbH, “Hunic SoftExo Care,” HUNIC SoftExo, 2022. [Online]. Available: https://hunic.com/softexo-care/. [Accessed 11 February 2022].
SuitX, “shieldX | suitX,” suitx, 2021. [Online]. Available: https://www.suitx.com/shieldx. [Accessed 11 February 2022].
Roam Robotics, “Forge Performance - Roam,” 2021. [Online]. Available: https://www.roamrobotics.com/forge. [Accessed 11 February 2022].
H. Kazerooni, J.-L. Racine, L. Huang and R. Steger, “On the control of the Berkeley Lower Extremity Exoskeleton (BLEEX),” Proceedings of the 2005 IEEE International Conference on Robotics and Automation, pp. 4353–4360, 2005, doi: https://doi.org/10.1109/ROBOT.2005.1570790.
German Research Center for Artificial Intelligence and Universität Bremen, “The CAPIO active upper body exoskeleton,” [Online]. Available: https://www.dfki.de/fileadmin/user_upload/import/7383_slides_RoboAssist_2014_Mallwitz.pdf. [Accessed 10 February 2022].
M. Folgheraiter, M. Jordan, L. M. Vaca Benitez, F. Grimminger, S. Schmidt, J. Albiez and F. Kirchner, “A highly integrated low pressure fluid servo-valve for applications in wearable robotic systems,” Proceedings of the 7th International Conference on Informatics in Control, Automation and Robotics, 2010.
H. Beik-Mohammadi, M. Kerzel, B. Pleintinger, T. Hulin, P. Reisich, A. Schmidt, A. Pereira, S. Wermter and N. Y. Lii, “Model mediated teleoperation with a hand-arm exoskeleton in long time delays using reinforcement learning,” 2020 29th IEEE Conference on Robot and Human Interactive Communication (RO-MAN), pp. 713–720, 2020, doi: https://doi.org/10.1109/RO-MAN47096.2020.9223477.
Tendo, “Get a grip | Tendo—for people, not for symptoms,” tendoforpeople, [Online]. Available: https://www.tendoforpeople.se/tendo. [Accessed 11 February 2022].
Zhang C, Liu G, Li C, Zhao J, Yu H, Zhu Y. Development of a lower limb rehabilitation exoskeleton based on real-time gait detection and gait tracking. Advances in Mechanical Engineering. 2016. https://doi.org/10.1177/1687814015627982.
S. Kumar, H. Wöhrle, M. Trampler, M. Simnofske, H. Peters, M. Mallwitz, E. A. Kirchner and F. Kirchner, “Modular design and decentralized control of the recupera exoskeleton for stroke rehabilitation,” Applied Sciences, MDPI, p. 626, 2019, https://doi.org/10.3390/app9040626.
E. A. Kirchner, S. Fairclough and F. Kirchner, “Embedded multimodal interfaces in robotics: applications, future trends, and societal implications,” The Handbook of Multimodal-Multisensor Interfaces, Morgan & Claypool Publishers, pp. 523–576, ISBN: ebook: 978-1-97000-173-0, hardcover: 978-1-97000-175-4, 2019.
J. Kerdraon, J. Previnaire, M. Tucker and et al., “Evaluation of safety and performance of the self-balancing walking system Atalante in patients with complete motor spinal cord injury,” Spinal Cord Ser Cases, vol. 7, no. 71, 2021, https://doi.org/10.1038/s41394-021-00432-3.
A. U. Pehlivan, D. P. Losey and M. K. O’Malley, “Minimal assist-as-needed controller for upper limb robotic rehabilitation,” IEEE Transactions on Robotics, pp. 113–124, 2016, doi: https://doi.org/10.1109/TRO.2015.2503726.
S. Y. A. Mounis, N. Z. Azlan and F. Sado, “Assist-as-needed control strategy for upper-limb rehabilitation based on subject’s functional ability,” Measurement and Control, pp. 1354–1361, 2019, https://doi.org/10.1177/0020294019866844.
L. Zhang, S. Guo and Q. Sun, “An assist-as-needed controller for passive, assistant, active, and resistive robot-aided rehabilitation training of the upper extremity,” Appl. Sci., p. 340, 2021, https://doi.org/10.3390/app11010340.
B. Fang, Q. Zhou, F. Sun, J. Shan, M. Wang, C. Xiang and Q. Zhang, “Gait neural network for human-exoskeleton interaction,” Front. Neurorobot., 29 October 2020, https://doi.org/10.3389/fnbot.2020.00058.
B. Kleiner, N. Ziegenspeck, R. Stolyarov, H. Herr, U. Schneider and A. Verl, “A radar-based terrain mapping approach for stair detection towards enhanced prosthetic foot control,” IEEE International Conference on Biomedical Robotics and Biomechatronics (BIOROB), pp. 105–110.
Y. Zhou, J. She, Z.-T. Liu, C. Xu and Z. Yang, “Implementation of impedance control for lower-limb rehabilitation robots,” 4th IEEE International Conference on Industrial Cyber-Physical Systems (ICPS), pp. 700–704, 2021, doi: https://doi.org/10.1109/ICPS49255.2021.9468210.
A. Q. Keemink, H. van der Kooij and A. H. Stienen, “Admittance control for physical human-robot interaction,” Int. J. Rob. Res. 37, pp. 1421–1444, 2018, doi: https://doi.org/10.1177/0278364918768950.
J. C. Castiblanco, I. F. Mondragon, C. Alvarado-Rojas and J. D. Colorado, “Assist-as-needed exoskeleton for hand joint rehabilitation based on muscle effort detection,” Sensors, p. 4372, 2021, https://doi.org/10.3390/s21134372.
D. Fineberg, P. Asselin, N. Harel, I. Agranova-Breyter, S. Kornfeld, W. Baumann and A. Spungen, “Vertical ground reaction force-based analysis of powered exoskeleton-assisted walking in persons with motor complete paraplegia,” J Spinal Cord Med., pp. 313–321, 2013, doi: https://doi.org/10.1179/2045772313Y.0000000126.
F. Xu, R. Huang, H. Cheng et al., “Stair ascent strategies and performance evaluation for a lower limb exoskeleton,” Int J Intell Robot Appl, pp. 278–293, 2020, https://doi.org/10.1007/s41315-020-00123-6.
B. Laschowski, W. McNally, A. Wong and J. McPhee, “Environment classification for robotic leg prostheses and exoskeletons using deep convolutional neural networks,” Front. Neurorobot., 04 February 2022, https://doi.org/10.3389/fnbot.2021.
G. B. Prange, M. J. A. Jannink, C. G. M. Groothuis-Oudshoorn, H. J. Hermens and M. J. Ijzerman, “Systematic review of the effect of robot-aided therapy on recovery of the hemiparetic arm after stroke,” J Rehabil Res Dev., pp. 171–84, 2006, doi: https://doi.org/10.1682/jrrd.2005.04.0076.
G. Kwakkel, B. J. Kollen and H. I. Krebs, “Effects of robot-assisted therapy on upper limb recovery after stroke: a systematic review,” Neurorehabil Neural Repair, pp. 111–21, 2008, doi: https://doi.org/10.1177/1545968307305457.
• H. Nam, H. Seo, J. Leigh, Y. Kim, S. Kim and M. Bang, “External robotic arm vs. upper limb exoskeleton: what do potential users need?,” Appl. Sci., p. 2471, 2019, https://doi.org/10.3390/app9122471. This work is relevant since it analyzes the demands of different groups of patients with upper body impairments. Such work is most relevant for setting new goals in research and development.
• F. Grimm and A. Gharabaghi, “Closed-loop neuroprosthesis for reach-to-grasp assistance: combining adaptive multi-channel neuromuscular stimulation with a multi-joint arm exoskeleton,” Front. Neurosci., 2016, doi: https://doi.org/10.3389/fnins.2016.00284. This work is relevant since it addresses patients with chronic impairments caused by stroke. This group of patients typically has poor outcomes under classical therapy approaches.
L. Gerez, A. Dwivedi and M. Liarokapis, “A hybrid, soft exoskeleton glove equipped with a telescopic extra thumb and abduction capabilities,” IEEE International Conference on Robotics and Automation (ICRA), pp. 9100–9106, 2020, doi: https://doi.org/10.1109/ICRA40945.2020.9197473.
Topini A, Sansom W, Secciani N, Bartalucci L, Ridolfi A, Allotta B. Variable admittance control of a hand exoskeleton for virtual reality-based rehabilitation tasks. Front Neurorobot. 2022. https://doi.org/10.3389/fnbot.2021.789743.
M. S. Al Maamari, S. S. Al Badi, A. Saleem, M. Mesbah and E. Hassan, “Design of a brain controlled hand exoskeleton for patients with motor neuron diseases,” 10th IEEE International Symposium on Mechatronics and its Applications, 2015, https://doi.org/10.1109/ISMA.2015.7373470.
L. Randazzo, I. Iturrate, S. Perdikis and J. d. R. Millán, “mano: a wearable hand exoskeleton for activities of daily living and neurorehabilitation,” IEEE Robotics and Automation Letters, pp. 500–507, 2018, doi: https://doi.org/10.1109/LRA.2017.2771329.
S. R. Soekadar, M. Witkowski, N. Vitiello and N. Birbaumer, “An EEG/EOG-based hybrid brain-neural computer interaction (BNCI) system to control an exoskeleton for the paralyzed hand,” Biomedical Engineering / Biomedizinische Technik, pp. 199–205, 2015, https://doi.org/10.1515/bmt-2014-0126.
N.-S. Kwak, K.-R. Müller and S.-W. Lee, “A lower limb exoskeleton control system based on steady state visual evoked potentials,” J. Neural Eng., p. 056009, 2015, doi: https://doi.org/10.1088/1741-2560/12/5/056009.
K. Lee, D. Liu, L. Perroud, R. Chavarriaga and J. d. R. Millán, “A brain-controlled exoskeleton with cascaded event-related desynchronization classifiers,” Robotics and Autonomous Systems, pp. 15–23, 2017, https://doi.org/10.1016/j.robot.2016.10.005.
J. R. Wolpaw, N. Birbaumer, D. J. McFarland, G. Pfurtscheller and T. M. Vaughan, “Brain-computer interfaces for communication and control,” Clin Neurophysiol, pp. 767–791, 2002, https://doi.org/10.1016/S1388-2457(02)00057-3.
E. Lew, R. Chavarriaga, S. Stefano and J. d. R. Millán, “Detection of self-paced reaching movement intention from EEG signals,” Frontiers in Neuroengineering, 2012, https://doi.org/10.3389/fneng.2012.00013.
A. Ferreira, T. F. Bastos-Filho, M. Sarcinelli-Filho, J. L. Martín, J. C. García and M. Mazo, “Improvements of a brain-computer interface applied to a robotic wheelchair,” Biomedical Engineering Systems and Technologies - International Joint Conference BIOSTEC, pp. 64–73, 2009.
E. Hortal, D. Planelles, A. Costa, E. Iánez, A. Úbeda, J. M. Azorín and E. Fernández, “SVM-based Brain-Machine Interface for controlling a robot arm through four mental tasks,” Neurocomputing, pp. 116–121, 2014, https://doi.org/10.1016/j.neucom.2014.09.078.
L. Citi, R. Poli, C. Cinel and F. Sepulveda, “P300-based BCI mouse with genetically-optimized analogue control,” IEEE Trans Neural Syst Rehabil Eng., pp. 51–61, 2008, doi: https://doi.org/10.1109/TNSRE.2007.913184.
J. L. Sirvent, E. Iánez, A. Úbeda and J. M. Azorín, “Visual evoked potential-based brain-machine interface applications to assist disabled people,” Expert Syst Appl., pp. 7908–18, 2012, https://doi.org/10.1016/j.eswa.2012.01.110.
Chowdhury A, Raza H, Dutta A, Prasad G. EEG-EMG based hybrid brain computer interface for triggering hand exoskeleton for neuro-rehabilitation. Proceedings of the Advances in Robotics. 2017. https://doi.org/10.1145/3132446.3134909.
E. A. Kirchner, A. Seeland and M. Tabie, “Multimodal movement prediction - towards an individual assistance of patients,” PLoS ONE, Public Library of Science, p. e85060, 2014, https://doi.org/10.1371/journal.pone.0085060.
G. Pfurtscheller and C. Neuper, “Motor imagery and direct brain-computer communication,” Proceedings of the IEEE, pp. 1123–1134, 2001, doi: https://doi.org/10.1109/5.939829.
J. Zhang and M. Wang, “A survey on robots controlled by motor imagery brain-computer interfaces,” Cognitive Robotics, pp. 12–24, 2021, https://doi.org/10.1016/j.cogr.2021.02.001.
Amiri S, Pantazis D, Fazel-Rezai R, Asadpour V. A review of hybrid brain-computer interfaces systems. Advances in Human-Computer Interaction. 2013. https://doi.org/10.1155/2013/187024.
Choi I, Rhiu I, Lee Y, Yun MH, Nam CS. A systematic review of hybrid brain-computer interfaces: taxonomy and usability perspectives. PLoS ONE. 2017. https://doi.org/10.1371/journal.pone.0176674.
G. R. Burkitt, R. B. Silberstein, P. J. Cadusch and A. W. Wood, “Steady-state visual evoked potentials and travelling waves,” Clin. Neurophysiol., pp. 246–58, 2000, doi: https://doi.org/10.1016/s1388-2457(99)00194-7.
Zhu D, Bieger J, Molina GG, Aarts RM. A survey of stimulation methods used in SSVEP-based BCIs. Comput Intell Neurosci. 2010. https://doi.org/10.1155/2010/702357.
Z. Iscan and V. V. Nikulin, “Steady state visual evoked potential (SSVEP) based brain-computer interface (BCI) performance under different perturbations,” PLoS ONE, p. e0191673, 2018, https://doi.org/10.1371/journal.pone.0191673.
A. Seeland, L. Manca, F. Kirchner and E. A. Kirchner, “Spatio-temporal comparison between ERD/ERS and MRCP-based movement prediction,” Proceedings of the 8th International Conference on Bio-inspired Systems and Signal Processing (BIOSIGNALS-15), pp. 219–226, 2015, https://doi.org/10.5220/0005214002190226.
Li H, Huang G, Lin Q, Zhao J-L, Lo W-LA, Mao Y-R, Chen L, Zhang Z-G, Huang D-F, Li L. Combining movement-related cortical potentials and event-related desynchronization to study movement preparation and execution. Front Neurol. 2018. https://doi.org/10.3389/fneur.2018.00822.
S. He, Y. Zhou, T. Yu, R. Zhang, Q. Huang, L. Chuai, M. U. Mustafa, Z. Gu, Z. L. Yu, H. Tan and Y. Li, “EEG- and EOG-based asynchronous hybrid BCI: a system integrating a speller, a web browser, an e-mail client, and a file explorer,” IEEE Transactions on Neural Systems and Rehabilitation Engineering, pp. 519–530, 2020, https://doi.org/10.1109/TNSRE.2019.2961309.
Zhu Y, Ying L, Jinling L, Pengcheng L. A hybrid BCI based on SSVEP and EOG for robotic arm control. Front Neurorobot. 2020. https://doi.org/10.3389/fnbot.2020.583641.
Scherer R, Müller-Putz GR, Pfurtscheller G. Self-initiation of EEG-based brain-computer communication using the heart rate response. J Neural Eng. 2007. https://doi.org/10.1088/1741-2560/4/4/L01.
E. C. Lee, J. C. Woo, J. H. Kim, M. Whang and K. R. Park, “A brain-computer interface method combined with eye tracking for 3D interaction,” J Neurosci Methods, pp. 289–98, 2010, https://doi.org/10.1016/j.jneumeth.2010.05.008.
A. Kreilinger, V. Kaiser, C. Breitwieser, J. Wiliamson, C. Neuper and G. Müller-Putz, “Switching between manual control and brain-computer interface using long term and short term quality measures,” Frontiers in Neuroscience, pp. 1–11, 2012, https://doi.org/10.3389/fnins.2011.00147.
Y.-T. Pan, Z. Lamb, J. Macievich and K. A. Strausser, “A vibrotactile feedback device for balance rehabilitation in EksoGT robotic exoskeleton,” 7th IEEE International Conference on Biomedical Robotics and Biomechatronics (Biorob), pp. 569–576, 2018, doi: https://doi.org/10.1109/BIOROB.2018.8487677.
C. Freeman, E. Rogers, A.-M. Hughes, J. H. Burridge and K. Meadmore, “Iterative learning control in health care: electrical stimulation and robotic-assisted upper-limb stroke rehabilitation,” IEEE Control Syst., pp. 18–43, 2012, doi: https://doi.org/10.1109/MCS.2011.2173261.
A. J. del-Ama, Á. Gil-Agudo, J. L. Pons and et al., “Hybrid FES-robot cooperative control of ambulatory gait rehabilitation exoskeleton,” J NeuroEngineering Rehabil, 2014, https://doi.org/10.1186/1743-0003-11-27.
S. A. Murray, R. J. Farris, M. Goldfarb, C. Hartigan, C. Kandilakis and D. Truex, “FES coupled with a powered exoskeleton for cooperative muscle contribution in persons with paraplegia,” Annu Int Conf IEEE Eng Med Biol Soc, pp. 2788–2792, 2018, doi: https://doi.org/10.1109/EMBC.2018.8512810.
A. J. del-Ama, A. D. Koutsou, J. C. Moreno, A. de-los-Reyes, A. Gil-Agudo and J. L. Pons, “Review of hybrid exoskeletons to restore gait following spinal cord injury,” J Rehabil Res Dev, pp. 497–514, 2012, doi: https://doi.org/10.1682/jrrd.2011.03.0043.
Wenxiu P, Wang P, Xiaohui S, Xiaopei S, Qing X. The effects of combined low frequency repetitive transcranial magnetic stimulation and motor imagery on upper extremity motor recovery following stroke. Front Neurol. 2019. https://doi.org/10.3389/fneur.2019.00096.
O. M. Giggins, U. M. Persson and B. Caulfield, “Biofeedback in rehabilitation,” J NeuroEngineering Rehabil, 2013, https://doi.org/10.1186/1743-0003-10-60.
Miller KJ, Gallina A, Neva JL, Ivanova TD, Snow NJ, Ledwell NM, Xiao ZG, Menon C, Boyd LA, Garland SJ. Effect of repetitive transcranial magnetic stimulation combined with robot-assisted training on wrist muscle activation post-stroke. Clin Neurophysiol. 2019. https://doi.org/10.1016/j.clinph.2019.04.712.
I. Akkaya, M. Andrychowicz, M. Chociej, M. Litwin, B. McGrew, A. Petron, A. Paino, M. Plappert, G. Powell, R. Ribas, J. Schneider, N. Tezak, J. Tworek, P. Welinder, L. Weng and Q. Yuan, “Solving rubik’s cube with a robot hand,” 2019. [Online]. Available: https://arxiv.org/pdf/1910.07113v1.pdf. [Accessed 21 February 2022].
J. Kober, J. A. Bagnell and J. Peters, “Reinforcement learning in robotics: a survey,” The International Journal of Robotics Research, pp. 1238–1274, 2013, https://doi.org/10.1177/0278364913495721.
G. Dulac-Arnold, N. Levine, D. J. Mankowitz, J. Li, C. Paduraru, S. Gowal and T. Hester, “An empirical investigation of the challenges of real-world reinforcement learning,” 2020. [Online]. Available: https://arxiv.org/pdf/2003.11881.pdf. [Accessed 21 February 2022].
C. Daniel, M. Viering, J. Metz, O. Kroemer and J. Peters, “Active reward learning,” Proceedings of Robotics: Science and Systems, 2014.
E. Biyik, D. P. Losey, M. Palan, N. C. Landolfi, G. Shevchuk and D. Sadigh, “Learning reward functions from diverse sources of human feedback: optimally integrating demonstrations and preferences,” The International Journal of Robotics Research, pp. 45–67, 2022, https://doi.org/10.1177/02783649211041652.
J. Lin, Z. Ma, R. Gomez, K. Nakamura, B. He and G. Li, “A review on interactive reinforcement learning from human social feedback,” IEEE Access, pp. 120757–120765, 2020, doi: https://doi.org/10.1109/ACCESS.2021.3096662.
S. K. Kim, E. A. Kirchner, A. Stefes and F. Kirchner, “Intrinsic interactive reinforcement learning—using error-related potentials for real world human-robot interaction,” Scientific Reports, p. 17562, 2017, https://doi.org/10.1038/s41598-017-17682-7.
T. Luo, Y. Fan, J. Lv and C. Zhou, “Deep reinforcement learning from error-related potentials via an EEG-based brain-computer interface,” IEEE International Conference on Bioinformatics and Biomedicine (BIBM), pp. 697–701, 2018, doi: https://doi.org/10.1109/BIBM.2018.8621183.
I. Iturrate, R. Chavarriaga, L. Montesano, J. Minguez and J. del R. Millán, “Teaching brain-machine interface as an alternative paradigm to neuroprosthetics control,” Scientific Reports, p. 13893, 2015, https://doi.org/10.1038/srep13893.
S. K. Ehrlich and G. Cheng, “Human-agent co-adaptation using error-related potentials,” Journal of Neural Engineering, p. 066014, 2018, https://doi.org/10.1088/1741-2552/aae069.
A. L. Thomaz, G. Hoffman and C. Breazeal, “Real-time interactive reinforcement learning for robots,” Proceedings of AAAI Workshop on Human Comprehensible Machine Learning, 2005.
C. Stahlhut, N. Navarro-Guerrero, C. Weber and S. Wermter, “Interaction in reinforcement learning reduces the need for finely tuned hyperparameters in complex tasks,” Kognitive Systeme, 2015, https://doi.org/10.17185/duepublico/40718.
C. Arzate Cruz and T. Igarashi, “A survey of interactive reinforcement learning: design principles and open challenges,” Proceedings of the 2020 ACM Designing Interactive Systems Conference, pp. 1195–1209, 2020, https://doi.org/10.1145/3357236.3395525.
S. K. Kim and E. A. Kirchner, “Classifier transferability in the detection of error related potentials from observation to interaction,” IEEE International Conference on Systems, Man, and Cybernetics (SMC’13), pp. 3360–3365, 2013, doi: https://doi.org/10.1109/SMC.2013.573.
S. K. Kim and E. A. Kirchner, “Handling few training data: classifier transfer between different types of error-related potentials,” IEEE Transactions on Neural Systems and Rehabilitation Engineering, pp. 320–332, 2016, https://doi.org/10.1109/TNSRE.2015.2507868.
R. Chavarriaga, A. Sobolewski and J. d. R. Millán, “Errare machinale est: the use of error-related potentials in brain-machine interfaces,” Frontiers in Neuroscience, 2014, https://doi.org/10.3389/fnins.2014.00208.
Wessel JR. Error awareness and the error-related negativity: evaluating the first decade of evidence. Front Hum Neurosci. 2012. https://doi.org/10.3389/fnhum.2012.00088.
Spüler M, Niethammer C. Error-related potentials during continuous feedback: using EEG to detect errors of different type and severity. Front Hum Neurosci. 2015. https://doi.org/10.3389/fnhum.2015.00155.
• A. Kumar, Q. Fang, J. Fu, E. Pirogova and X. Gu, “Error-related neural responses recorded by electroencephalography during post-stroke rehabilitation movements,” Frontiers in Neurorobotics, 2019, https://doi.org/10.3389/fnbot.2019.00107. Although the experimental setup might need improvement, this work is relevant since it investigates intrinsic EEG signals associated with a person’s self-evaluation of the success of rehabilitation training, which could potentially be used to automatically adapt the support provided by a rehabilitation robot.
Funding
Open Access funding enabled and organized by Projekt DEAL.
Ethics declarations
Conflict of Interest
The authors declare no competing interests.
Human and Animal Rights and Informed Consent
This article does not contain any studies with human or animal subjects performed by any of the authors.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
About this article
Cite this article
Kirchner, E.A., Bütefür, J. Towards Bidirectional and Coadaptive Robotic Exoskeletons for Neuromotor Rehabilitation and Assisted Daily Living: a Review. Curr Robot Rep 3, 21–32 (2022). https://doi.org/10.1007/s43154-022-00076-7