1 Introduction

Studies on rehabilitative therapy have shown that timely intervention can help disabled patients regain their normal abilities quickly (Wang et al. 2009). However, while the number of patients with disabilities keeps increasing, the number of trained medical staff able to provide such care is not growing at the same pace, which results in longer recovery processes (Eurostat 2017). The traditional method for measuring the range of motion is to attach a device called a goniometer to the affected joints of the body (Gajdosik and Bohannon 1987). Apart from being time-consuming, this method also demands the supervision of experienced therapists (Grant et al. 2005). It is also static and provides no motivation or entertainment to help patients complete long therapy sessions. There is therefore a high demand for technology-based adaptive solutions that guide patients through online, low-cost rehabilitation programs and help them live independently at home.

To support home-based hand therapy, various computing platforms such as Microsoft Kinect, Microsoft Digits, Myo, and Leap Motion, to name a few, have recently emerged that help identify physiological parameters from a hand therapy session in real-time (Chang et al. 2011). Recent works have suggested that non-invasive motion tracking systems that do not rely on external wearable devices let the patient move more naturally (Komatireddy et al. 2013; Stone and Skubic 2013). Other approaches in this area favor the analysis of kinematic therapy data to track the disability level of a patient (Zhao et al. 2012). Although Microsoft Kinect 2 is a popular platform for full-body gesture recognition (Khoshelham and Elberink 2012) and can detect basic hand open/close gestures, it cannot track subtle motions of the different joints in the hand with great accuracy. Recently, a 3D motion-based gesture control device called Leap Motion has been introduced. It is a specialized device for tracking hand joints only, and it provides the 3D positions of the joints of two hands at a frame rate of 60–200 fps within a range of 0.6 m. This sensing range and very high resolution make it well suited among state-of-the-art monitoring devices for in-home hand therapies (Weichert et al. 2013). Disability metrics measurable with the Leap Motion include the range of motion (ROM) of the small joints of individual fingers, and the velocity and acceleration of the connected limbs. A tracking layer then creates a stream of data frames containing tracking information about the wrist and fingers. Although the most important factor in the rehabilitation process is the time spent exercising under the supervision of the therapist, home-based e-therapy can become a significant factor in a patient's improvement, given that it is low-cost, flexible, relaxing, and easy to perform.

Meanwhile, serious games have started to gain popularity in diverse areas such as education, national defense, and health (Michael and Chen 2005). Particularly in the areas of mental and physical rehabilitation, serious games have gained considerable popularity in the recent past (Rego et al. 2010). The significant gains obtainable by developing serious games for treating patients with physical disabilities have been shown in Omelina et al. (2012) and Afyouni et al. (2014). Moreover, map-based applications have made navigation a routine activity for mobile phone users of all ages. Children as well as adults usually find driving a car around points of interest or flying a kite at their favorite locations an exciting and motivating activity (Ayala et al. 2011; Jacob et al. 2010; Qamar et al. 2014b). We believe that the ubiquity, economy, and intelligence of this technology can be effectively harnessed to create a personalized therapy-driven serious game framework that provides an entertaining and enjoyable form of therapy.

Fig. 1 Using a Leap Motion device for 3D serious games

In previous work, in order to track the movement of a subject, we used two low-cost sensors, the Leap Motion and the Microsoft Kinect, both equipped with 3D motion-sensing cameras (Qamar et al. 2014a, b). The data streams produced by the two devices can be rendered on screen in real-time or recorded for offline playback and analysis. We also developed an authoring tool that allows the therapist to select joints of interest from an image of the human body in a highly intuitive manner. In Afyouni et al. (2014), the principles and initial modeling for a framework integrating adaptive serious games for patients with hand disabilities were introduced. This paper extends the previously proposed concepts by introducing a fully-fledged gamified hand-therapy framework with several main features.

In this work, we introduce a patient-adaptive environment for the gamification of hand therapies, in order to facilitate and encourage the rehabilitation process. We present an online e-therapy framework integrated with a therapy-driven serious game environment. This framework offers several advantages over state-of-the-art systems for therapy design and monitoring, mainly with respect to automatic game generation and a self-learning methodology for patient adaptation. It provides an encouraging and motivating environment in which physically challenged individuals can perform routine, otherwise tedious physical therapy exercises through patient-adaptive serious games (see Fig. 1). As an extension of our prior efforts in Afyouni et al. (2014), this work introduces and implements the following concepts: (i) a comprehensive data model that defines patients' profiles, preferences, and constraints; game data, including preprocessed as well as session-based patient-generated data; and sensor data streams; (ii) therapy-aware and patient-centered route generation algorithms based on a mapping approach that translates therapy instructions into routes in 3D; (iii) a complete system architecture with a patient-centered approach and a self-learning component that allows for adaptive and personalized games based on patient constraints and performance; and (iv) a fully implemented platform and an extensive evaluation study of our framework, with results of experiments demonstrated in different scenarios. Overall, our framework offers the following features:

  • A serious game environment that generates and recommends 3D navigation routes on a real-world map, with different levels of difficulty based on therapeutic gestures. This allows users to perform different therapeutic exercises in front of a Leap Motion controller while flying to a destination of their choice on the 3D globe. Moreover, a 3D spatio-temporal validation zone is also generated in order to direct and monitor the actual route followed by the patient.

  • Game difficulty levels can be dynamically generated based on training sessions, as well as on updated analytics after each session. The life cycle of our framework consists of continuously learning new semantics about the patient's progress so that the intelligent game generator can adjust the difficulty level accordingly.

  • Real-time and offline data analytics are dynamically displayed to give therapists and experts the ability to monitor a patient's progress. Session data are stored and indexed using cloud-based services. Several criteria are considered in this phase to assess the patient's performance, both from a medical perspective and with respect to their interaction within the game.

  • A comprehensive evaluation study to assess the system's efficiency, effectiveness, and user satisfaction. We designed different benchmarks to evaluate the impact and usability of our gamified hand therapy framework, covering quantitative as well as qualitative evaluation. The quantitative evaluation is divided into two main categories: (a) medical metrics such as ROM, speed, angular speed, and acceleration, and (b) user interaction metrics including pattern similarity and point-to-point analysis, among others. The qualitative evaluation assesses usability through questionnaires in which therapists and patients rate their level of satisfaction with the framework.

To the best of our knowledge, this is the first work that develops adaptive and personalized serious games based on therapeutic gestures for patients with hand disabilities. The adaptive mechanism infers patients' preferences and capabilities in order to guide them towards successfully achieving their assigned therapies at home, within a fully entertaining environment. The remainder of the paper is organized as follows. Section 2 discusses related work from different fields. Section 3 describes the mathematical modeling used to represent therapies and game-generated data, as well as the patient-centered approach for mapping therapies within the serious game environment. Algorithms for mapping therapeutic gestures to navigational movements in the 3D world are introduced in Sect. 4. Section 5 details the system development and usage along with its implementation details. Section 6 presents the evaluation strategy, the experimental setup, and results with real demonstration scenarios, while Sect. 7 discusses the feedback, limitations, and generalization of the framework. Finally, Sect. 8 draws some conclusions and highlights future work.

2 Related work

This section presents a review of related work in the areas of serious games for therapy rehabilitation, adaptation and personalization in serious games, and the use of tangible and intangible devices in current rehabilitation systems.

2.1 Serious games for physical rehabilitation

Serious games are video games that have a goal other than entertainment (Bartolomé et al. 2011; Michael and Chen 2005). Serious games have started to gain popularity in areas as diverse as education, national defense, and health (Mayo 2007; Numrich 2008; Sawyer 2008). Possible benefits of these games in improving the learning skills of subjects have been demonstrated in Rego et al. (2010). In the areas of mental and physical rehabilitation, serious games have gained considerable popularity in recent years (Omelina et al. 2012). Early intervention in many disability cases can help patients return quickly to normal life (Wang et al. 2009). Traditionally, therapists attach various devices to the affected joint on the patient's body to measure the angle of its motion, which is one of the major metrics of disability (Norkin and White 2009). Apart from being time-consuming, this method also demands the supervision and monitoring of experienced professionals. Therefore, the importance of developing home-based systems for physical rehabilitation has recently increased (Khademi et al. 2014; Stone and Skubic 2011).

The human–computer interaction community has made various attempts to enhance physical rehabilitation with new interaction technologies, using tangible and intangible devices along with serious games (Boulanger et al. 2013). Khademi et al. (2014) modified the game Fruit Ninja to be played with the Leap Motion controller for upper-extremity arm rehabilitation of stroke patients. Saini et al. (2012) designed a low-cost game framework for motivating stroke patients to do hand and leg therapies. Hashemian and Wang (2014) designed touch-based video games to motivate body exercises for rehabilitation. Ma and Bechkoum (2008) show that serious games are effective for the physical therapy of stroke patients. These works lack an adaptive, patient-centered approach; rather, they focus on better human interaction with devices and games for rehabilitation.

Meanwhile, the inclusion of accurate spatio-temporal information has proven to be a key ingredient in designing an effective navigation-based serious game environment. Map browsing has become a very common activity for many people, and a number of map-based serious games have been developed to teach geography and history to students (Peng et al. 2015; Crawford et al. 2003). Map-based serious games can also be beneficial for hand physical therapies. However, to the best of our knowledge, no map-based serious game has been developed for hand rehabilitation purposes.

2.2 Adaptation in serious games for rehabilitation

Adaptation in games takes place in different ways. Some games adapt to the user's performance at the beginning of every new session, while others adjust during the session (Chen et al. 2008; Gouaïch et al. 2012). Difficulty levels in a game are either static or dynamic. In the case of static difficulty levels, game parameters are changed according to a pre-set formula to make the game easier or harder at discrete pre-defined levels (Heuser et al. 2006). Dynamic difficulty levels demand a closer follow-up with the user as well as a broad range of adjustable values for different parameters (Hocine et al. 2015). Increasing the number and range of parameters results in an increasingly adaptable game environment. Fully automatic adaptation techniques allow the user to perform in-home rehabilitation more easily (Cameirão et al. 2010); on the other hand, they may prevent therapists from testing their therapeutic strategies in a more independent manner. A semi-automatic approach allows the therapist to control non-game-related parameters and lets the system modify game-related parameters based on the user's performance levels (Pirovano et al. 2012).

Heuser et al. (2006) suggested games for reach and grasp exercises. The games are adaptable based on user feedback, but the adaptation levels are fixed: difficulty is managed by changing the speed, size, and color of objects on the screen in a pre-determined manner. In Chen et al. (2008), feedback based on colors, sounds, and media is used to differentiate between levels of difficulty for upper-hand rehabilitation; the difficulty levels are static and pre-determined. Ma et al. (2007) propose an adaptable game for post-stroke rehabilitation, in which a model of the patient's ability is constructed using different metrics and the patient profile is related to the difficulty levels of the game using a specification matrix.

In Cameirão et al. (2010), the authors describe a game-based rehabilitation system called Spheroids. The aim of the study is to find the effect of training parameters on task performance. The player's task is to intercept spheres coming towards him/her. After every generation of ten consecutive spheres, the performance of the patient is assessed and the difficulty level is adjusted accordingly. Performance is measured using four parameters, namely the speed, interval, range, and size of the spheres, and a psychometric model is used to relate game parameters to task difficulty. In Rabin et al. (2011), an incremental difficulty strategy was used to study the effects of serious games on motor control for upper-limb rehabilitation. Before every session, the patient went through a motor assessment exercise to measure the reach of their arms, and difficulty was increased progressively within the session. The input parameters to the adaptation algorithm were reached zone, wrist weight, grasp pressure, and session date. The change in difficulty levels was expressed by making the target area smaller or by changing the range and speed of targets.

In Pirovano et al. (2012), the authors used a semi-automatic adaptation strategy to give the therapist some control over adaptation parameters while keeping the process automatic. A game of catching falling fruit was developed in which the difficulty level was adapted by adjusting the number, size, weight, and fall frequency of the fruit, as well as the number of baskets used to collect it. The parameters under the control of the therapist were game-independent, such as exercise selection, time allocated for the exercise, number of repetitions, and constraints on movements. Adaptation was based on a model of healthy player data. This platform is built on the Wii Balance Board, so it cannot accurately track the individual joints relevant to hand therapy, and it does not consider game generation from therapeutic gestures. In Mihelj et al. (2012), a virtual rehabilitation environment was proposed to increase patient motivation through game design. The patient was supposed to catch bottles and place them in a basket. Adaptation was based on predefined difficulty levels and the number of bottles caught and placed in the basket. Non-game-specific parameters affecting adaptation were velocity, acceleration, and force, and adaptation also depended on player preferences.

In Gouaïch et al. (2012), the authors used a digital pheromone-based algorithm to build a map of the areas where the patient's hand can move. An interaction matrix is used to follow the movements of the patient, and a simple gradient-based algorithm decides the difficulty level, whereby the gradient of each cell indicates the direction of increase or decrease of the pheromone. The difficulty adaptation technique employed is generic and can be used with a number of therapeutic games for upper-limb rehabilitation; it is based on both the player profile and performance. Hocine et al. (2015) present a brief survey of adaptation in games for rehabilitation. They classified different works based on the objectives behind the adaptation technique, the moment during game play at which the adaptation decision is made, the inputs required for the adaptation process, the game elements affected by the adaptation, and the adaptation model used. They also presented a difficulty adaptation technique based on the player profile and compared it with a random strategy and an incremental difficulty strategy. The authors further introduced a framework for upper-limb rehabilitation based on this adaptation technique, which considers the game, tasks, and objectives as the three main components of the rehabilitation protocol. This framework allows for selecting one of three games based on difficulty levels.

Although these works present very interesting adaptation techniques, they do not focus on the automatic generation of adaptive games from therapeutic instructions based on patient preferences and constraints, which is a major component of our framework. Moreover, medical and HCI analytics and self-adaptivity based on patient performance were not thoroughly discussed. In our work, we also adopt a semi-automatic adaptation strategy whereby we initially allow the therapist to set non-game-related parameters based on his/her experience. For example, the therapist can suggest the number of repetitions and the types of exercises based on knowledge of the medical domain. Our game engine initially records training data for the patient and adapts game difficulty accordingly. The game engine also monitors the user's activity during the game and updates the difficulty level based on the user's performance.

2.3 Sensors used for hand therapy monitoring

Recently, various computing platforms have surfaced that help identify physiological parameters from a therapy session; Microsoft Digits and Microsoft Kinect are two examples. Kinect 2 can perform full-body gesture recognition by tracking 25 joints of the human body. However, it cannot track subtle motions of the different joints in the hand with great accuracy. Leap Motion is a recent addition to the growing list of human motion tracking devices. The Leap Motion controller is a very small and portable device that can detect all joints of the hand at 30–200 fps. Its high accuracy in sensing hand movements makes it an excellent candidate for a home-based therapy device (Weichert et al. 2013). Leap Motion is also more portable than the Kinect and does not require an independent power source.

Microsoft Digits overcomes some limitations of the Kinect for tracking hand gestures and has the potential to be an excellent hand-therapy platform; however, it must be strapped to the patient's hand in a certain orientation, which is hard for hemiplegic patients (Kim et al. 2012). The Vicon camera system has been used by various platforms for gesture recognition based on markers and computer-vision technology. However, such platforms require a complex and expensive setup and tend to be too costly for most patients to use at home. Therefore, we opted for the simple and inexpensive Leap Motion controller in our framework.

In this work, we present an online hand therapy system based on the Leap Motion controller that uses a serious game to record and track the therapy exercises performed by the patient. Recent work in this area favors analyzing patient movement data to help track the disability level of the patient (Zhao et al. 2012). We therefore use the Leap Motion controller together with an adaptive serious game approach to support hand therapy rehabilitation.

2.4 Rehabilitation systems from market perspectives

Home-based therapy rehabilitation environments using tangible and intangible sensors along with serious games are gaining popularity (Boian et al. 2002; Rego et al. 2010). These environments encourage patients to pursue rehabilitation by playing games; they improve the efficiency of patients' self-exercise and also save time and money. A study of the market shows that a number of organizations have developed and tested frameworks for gamifying physical therapy exercises. Mira from Mira Rehab, JRS from Jintronix for Kinect-based rehabilitation, Vera from Reflexion Health, YuGo from BioGaming, and IREX from GestureTek Health are some of the physiotherapy management systems that utilize tangible and intangible sensors for tracking body joints and measuring therapy compliance.

As stated above, various researchers as well as companies are building products for home-based therapies. In this paper, we present a patient-centered serious game environment integrated with a Leap Motion controller to support hand rehabilitation. We believe our work improves on the state of the art in several respects. First, the framework was developed in close collaboration with therapists associated with a local disability center. Second, we generate patient-centered tasks in the game environment while taking the prescribed therapies into consideration. In addition, we define the relevant metrics, compute them, and interactively plot them as required by the therapist to monitor patient performance. Our framework accurately tracks hand joints and can plot a patient's movements and related analytics in real-time and on a per-session basis. The framework embeds different adaptation techniques to tailor the game to patients' preferences and needs. Finally, we evaluated our framework both quantitatively and through user studies.

3 Modeling approach

This section presents a modeling approach to represent therapies in terms of joints and their associated motions, as well as other dynamic parameters, to help generate an appropriate and adaptable game for a given patient. A description of how game data is represented and automatically generated on the fly is also provided. Game data includes preprocessed and session-based patient-generated data, as well as sensor data streams. Moreover, a patient disability model that defines patients' profiles, preferences, and constraints is presented.

3.1 Therapy data model

This modeling approach was first introduced in Afyouni et al. (2014), based on principles defined with the help of medical experts. A summary of these concepts is presented in this section, laying the groundwork for the algorithmic development in Sect. 4. In the medical literature, movements are expressed in terms of joints and primitive gestures (Norkin and White 2009). For example, the bending of the palm down towards the wrist is called flexion, since the act of flexion happens around the wrist joint. The range of motion (ROM) of the wrist joint for flexion hence depicts the range of angles traversed by the palm during the flexion movement. For a normal human being, the range is \(0^{\circ }\) (fully extended wrist) to \(90^{\circ }\) (fully flexed wrist). To model these movements, we have therefore associated each movement with its related joint as well as the normal range of motion. However, a physically impaired person lacks the ability to perform a movement from beginning to end like a healthy person. To enable the system to detect these partial movements and to separate them from other, non-related movements, a movement indicator is needed. We include the direction of change in angle as an indicator of the type of movement taking place. For example, if the angle around the wrist starts decreasing, the wrist flexion movement has started; similarly, an increase in the angle signals the start of the wrist extension movement. For the purpose of modeling, we consider a therapy as

$$\begin{aligned} \mathcal {T}~=~\left\langle g_1,~g_2,~g_3,~\dots ,~g_n\right\rangle \end{aligned}$$

where \(\mathcal {T}\) is the therapy devised by a therapist for a given disability, and \(g_1,~g_2,~g_3,~\dots ,~g_n\) are the primitive gestures required to accomplish that therapy. Also, let \(\mathcal {J}\) be the set of joints of the human body being tracked:

$$\begin{aligned} \mathcal {J}~=~\left\langle j_1,~j_2,~j_3,~\dots ,~j_n\right\rangle \end{aligned}$$

As examples of trackable joints, we consider:

\(j_1\) = left wrist, \(j_2\) = MCP joint (that is, the knuckle between the hand and the finger).

Each joint in the human body has a number of motions associated with it. Our model focuses on the representation of hand joints and movements. A wrist joint has flexion, extension, radial deviation, and ulnar deviation motions (see Fig. 2). The set of all possible motions related to the joints in the human body is represented by \(\mathcal {M}\). As a result, a primitive gesture g can be expressed as follows:

$$\begin{aligned} g~=~\left\langle j,~m,~\textit{nROM},~\textit{dir},~p,~\textit{dur},~n,~r\right\rangle \end{aligned}$$

where \(j \in \mathcal {J}\) is a joint; \(m \in \mathcal {M}\) is a motion associated with \(j\); \(\textit{nROM}\) describes the normal range of motion for the given joint; \(\textit{dir} \in \{\textit{clockwise}, \textit{anti-clockwise}\}\) is the direction of movement in the corresponding plane \(p\); \(p \in \{\textit{frontal}~(\textit{xy-plane}), \textit{transverse}~(\textit{xz-plane}), \textit{sagittal}~(\textit{yz-plane})\}\) represents the plane of movement as explained in Fig. 2; \(\textit{dur}\) is the gesture duration as suggested by the therapist; \(n\) is the number of repetitions of that gesture during the therapy; and finally \(r\) depicts the potential resistance, that is, the weight carried by some parts of the body to develop the muscles that control the patient's hands.

Fig. 2 Explanation of the three Leap Motion planes as frontal (xy-plane), transverse (xz-plane), and sagittal (yz-plane) (https://developer.leapmotion.com/leapjs/api-guide)

As an example, the therapy “wrist bend” can be expressed as a combination of primitive gestures “wrist flexion” and “wrist extension”. This can be represented as follows.

$$\begin{aligned} \mathcal {T}_{\textit{WristBend}}&= \left\langle g_{\textit{WristFlexion}},~g_{\textit{WristExtension}}\right\rangle \\ g_{\textit{WristFlexion}}&= \left\langle \textit{wrist},~\textit{flexion},~90,~\textit{clockwise},~\textit{sagittal},~2~\hbox {s},~5,~0\right\rangle \\ g_{\textit{WristExtension}}&= \left\langle \textit{wrist},~\textit{extension},~90,~\textit{anti-clockwise},~\textit{sagittal},~2~\hbox {s},~10,~0\right\rangle \end{aligned}$$

where the last value, 0, indicates that no resistance is applied since nothing is carried in the hand.

From the gesture durations and numbers of repetitions prescribed by the therapist, the system can then compute the total therapy duration as follows:

$$\begin{aligned} \textit{Therapy duration} = \sum _{g_i \in \mathcal {T}} \textit{dur}_i \times n_i \end{aligned}$$

where \(\textit{dur}_i\) and \(n_i\) are the duration and number of repetitions of gesture \(g_i\), respectively.

For a disabled person, there are two more aspects of a gesture that need to be registered. One is the percentage of motion that a patient can achieve, and the other is the amount of time required to perform that gesture. As an example, consider a patient who can flex his arm to \(40^{\circ }\) only. It is also important to record the time taken by the patient to reach the maximum point, since disability may result in either slow or incomplete movement. This is discussed further in Sect. 3.3.
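To make these definitions concrete, the following Python sketch encodes a primitive gesture and a therapy, including the total-duration computation from the formula above. It is a minimal illustration; the class and field names are ours, not those of the actual implementation.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Gesture:
    """Primitive gesture g = <j, m, nROM, dir, p, dur, n, r> (Sect. 3.1)."""
    joint: str          # j: tracked joint, e.g. "wrist"
    motion: str         # m: motion type, e.g. "flexion"
    n_rom: float        # nROM: normal range of motion, in degrees
    direction: str      # dir: "clockwise" or "anti-clockwise"
    plane: str          # p: "frontal", "transverse", or "sagittal"
    duration_s: float   # dur: suggested duration of one repetition (seconds)
    repetitions: int    # n: number of repetitions during the therapy
    resistance: float   # r: weight carried while performing the gesture (0 = none)

@dataclass
class Therapy:
    """A therapy T = <g1, ..., gn>: an ordered list of primitive gestures."""
    gestures: List[Gesture]

    def total_duration_s(self) -> float:
        # Therapy duration = sum of (gesture duration x number of repetitions)
        return sum(g.duration_s * g.repetitions for g in self.gestures)

# The "wrist bend" example from the text:
wrist_bend = Therapy([
    Gesture("wrist", "flexion", 90, "clockwise", "sagittal", 2, 5, 0),
    Gesture("wrist", "extension", 90, "anti-clockwise", "sagittal", 2, 10, 0),
])
assert wrist_bend.total_duration_s() == 30  # 2 s x 5 + 2 s x 10
```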

It is worth noting that this hand therapy model is presented in a 2D plane for simplicity, but it is actually extended and applied to 3D hand motions. In addition, this hand therapy model is only applied to the wrist joint of the hand; other joints would have their own state modeling.
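The movement indicator described above can likewise be sketched in a few lines; the function name and the noise threshold are illustrative assumptions, not part of the published system.

```python
def classify_wrist_movement(prev_angle_deg: float, curr_angle_deg: float,
                            noise_deg: float = 0.5) -> str:
    """Infer the wrist movement type from the direction of angle change.

    Per Sect. 3.1, a decreasing wrist angle signals that flexion has
    started, while an increasing angle signals extension; changes below
    a small noise threshold are treated as no movement.
    """
    delta = curr_angle_deg - prev_angle_deg
    if delta < -noise_deg:
        return "flexion"
    if delta > noise_deg:
        return "extension"
    return "idle"
```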

3.2 Game generated data model

Leap Motion is among the cheapest sensor devices available for capturing 3D motion data. With such sensors, no part of the body needs to be connected by wire, which makes them convenient for serious games while still recording 3D motion accurately. For hand monitoring, Leap Motion provides data about wrist and arm motions and about every joint of the fingers and thumb. Sensor data as well as game-generated data are represented in our framework as follows.

$$\begin{aligned} \textit{Leap~data} = \left\langle \textit{frameID}, \textit{hands}, \textit{timestamp} \right\rangle \end{aligned}$$

Leap Motion data contains the output produced by the Leap Motion device. It is a series of time-stamped and identified frames, where frameID and timestamp represent the frame identifier and the current timestamp, respectively, and hands represents the set of hands currently detected by the Leap Motion. Hand data is represented by the following tuple:

$$\begin{aligned} \textit{hands}= & {} \left\langle \textit{fingers}, \textit{isRight}, \textit{confidence}, \textit{palmDirection},\right. \\&\left. \textit{palmVelocity}, \textit{armDirection}\right\rangle \end{aligned}$$

where isRight determines whether the detected hand is the right one, confidence indicates the level of confidence in the reported data, palmDirection depicts the direction vector of the palm, palmVelocity is its velocity vector, and armDirection is the direction vector of the arm.

fingers denotes the set of finger data: \(\textit{fingers} = \langle \textit{type}, \textit{bones}\rangle \), where type is the finger type (e.g., 'THUMB'), and bones represents the set of bone types together with their direction vectors: \(\textit{bones} = \langle \textit{boneType}, \textit{direction}\rangle \). The bone types mainly include the phalanges (i.e., the 14 bones found in the fingers of each hand); each finger has three phalanges (distal, middle, and proximal), while the thumb has only two. The metacarpal bones are the five bones that compose the middle part of the hand. Finally, the carpal bones are the eight bones that form the wrist.
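The nested frame structure above can be summarized as follows. This is an illustrative Python transcription of the tuples in this section, not the Leap Motion SDK's own classes.

```python
from dataclasses import dataclass
from typing import List, Tuple

Vec3 = Tuple[float, float, float]  # a 3D direction or velocity vector

@dataclass
class Bone:
    bone_type: str   # e.g. "distal", "middle", "proximal", "metacarpal"
    direction: Vec3

@dataclass
class Finger:
    finger_type: str  # e.g. "THUMB", "INDEX"
    bones: List[Bone]

@dataclass
class Hand:
    fingers: List[Finger]
    is_right: bool
    confidence: float      # confidence level reported by the device
    palm_direction: Vec3
    palm_velocity: Vec3
    arm_direction: Vec3

@dataclass
class LeapFrame:
    """One element of the Leap data stream: <frameID, hands, timestamp>."""
    frame_id: int
    hands: List[Hand]
    timestamp: float
```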

Game data represents the patient's real-time data while playing a given session. Game data frames are synchronized with the Leap Motion frames (i.e., they share the same frame rate and frame id), so that the current hand posture can be compared with the player's position within the game. Each frame contains the following attributes:

$$\begin{aligned} \textit{Game data} = \left\langle \textit{frameID}, \textit{position}, \textit{direction}, \textit{movementID}, \tau \right\rangle \end{aligned}$$

where position is the 3D coordinates reflecting the patient's current location (calculated based on a fixed speed and the current palm direction); direction reflects the player's orientation in 3D based on the current palm direction; movementID is the index of the currently detected movement of a given type (e.g., flexion, extension, etc.); and \(\tau \) is the timestamp.

The therapy-aware and patient-centered path is generated by the game engine and reflects the best path to be followed in real-time by the player. The generated path is represented with the following parameters:

$$\begin{aligned} \textit{Generated path} = \left\langle \textit{mvIdx}, \textit{movementType}, \textit{angle}, \textit{startAngle}, \textit{duration}, \textit{points} \right\rangle \end{aligned}$$

where mvIdx and movementType represent the movement index and type, respectively; angle is the desired angle for this movement (e.g., flexion \(90^{\circ }\)); startAngle is the initial wrist angle for this particular patient; duration is the suggested duration to complete the movement; and points is the set of intermediate point coordinates generated between the first and last positions in 3D space. Generally, the system generates one intermediate point for every \(1^{\circ }\) change along the curved trajectory.
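As an illustration of this 1° sampling, the sketch below generates intermediate points along a circular arc. It is a simplified 2D version; the actual generator produces 3D global coordinates, and the arc radius here is our assumption for illustration.

```python
import math
from typing import List, Tuple

def arc_points(start_angle_deg: float, end_angle_deg: float,
               radius: float, step_deg: float = 1.0) -> List[Tuple[float, float]]:
    """Generate one intermediate point per `step_deg` of angular change,
    mirroring the path generator's one-point-per-degree sampling."""
    n_steps = int(abs(end_angle_deg - start_angle_deg) / step_deg)
    sign = 1 if end_angle_deg >= start_angle_deg else -1
    points = []
    for k in range(n_steps + 1):
        a = math.radians(start_angle_deg + sign * k * step_deg)
        points.append((radius * math.cos(a), radius * math.sin(a)))
    return points

# A flexion movement from 0 to 40 degrees, sampled every degree:
path = arc_points(0, 40, radius=100.0)  # 41 points
```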

3.3 Disabled patient model

For designing the patient model in our framework, we held meetings with three therapists in two local medical centers to observe the procedure and the amount of information they take into account for each patient. During the meetings, we first discussed the standard definition of disability by the World Health Organization as an umbrella term covering impairments (any issue in body function), activity limitations (any problem in performing a task), and participation restrictions (issues encountered in involvement in daily life situations). However, modeling all disabilities is a challenging task and is beyond the scope of this work; we focus on patients with physical hand impairments only and on building a model to represent that particular type of impairment. The therapists told us that they generally fill in a form on a patient's arrival recording his/her personal health records, containing information such as name, age, gender, contact number, and illness history. Then, during a meeting with the therapist, recommendations and prescribed therapies are logged to start the rehabilitation process. We also observed a few therapy sessions, and based on the gathered information, we designed the disabled patient model for our framework.

Overall, this patient model is designed to represent patient characteristics. Customizing the therapeutic plan according to these characteristics is done in three phases: (1) the therapist takes them into account while devising the therapy; (2) training sessions in the game environment detect the abilities and disabilities of the patient; and (3) semantic analysis is applied to the patient's played sessions. Knowledge discovered from these analytics is matched against the saved characteristics in order to develop a more advanced judgment of the current performance and the causes of unexpected results. The model includes the following components.

3.3.1 Patient basic profile

The basic profile contains information such as:

  • Personal info The framework records each patient's personal information, such as name, address, phone number, age, gender, and ID number.

  • Caregiver info The contact information of the caregiver/relative who may be called in case of emergency.

3.3.2 Patient preferences

The patient preferences consist of:

  • Rehabilitation setting Preferences for the initial rehabilitation therapy setting and session intensity.

  • Game preferences The patient can choose various game preferences, such as different layouts and scenes, the source and destination places for generating the therapy path, key places on the map to explore interactively, the with/without-tunnel option, the length of the training session, and different player objects and intermediate coins.

3.3.3 Patient health profile

The patient health profile has attributes such as:

  • Illness history Any previous illness history in the patient or in any immediate family members.

  • Allergies The list of any allergies (drugs, food, environment, etc.).

  • Medications and medical devices Any medication or medical devices currently in use.

  • Physical dysfunctions This includes any physical body impairment that may affect the patient's performance. We list them below:

    • Joint mobility It represents any impairment to joint mobility, including the amount of impairment (low, moderate, or high).

    • Motor control It shows whether there is any impairment in controlling body parts, and how much mobility those parts have with or without external equipment.

    • Pain The parts of the body where the patient feels pain, and to what extent.

    • Uncontrolled movements The parts of the body exhibiting uncontrolled contractions of muscles.

  • Mental dysfunctions Mental disability concerns the perception and cognition of the main task being performed. This attribute contains information about intellectual, memory, or orientation impairments, as well as the patient's ability to communicate his/her preferences, concerns, and constraints to his/her social ties and therapist.

  • Miscellaneous issues This includes any other issue that can affect the patient's performance, such as color blindness, eye or ear problems, or cardiac, respiratory, gastrointestinal, or urinary dysfunctions.

3.3.4 Training session

Initially, the patient uses the system in training mode, where the system records the patient's performance and calculates his/her abilities and disabilities, along with basic movement features such as the range of motion.

3.3.5 Patient constraints

Based on the patient profile, training sessions, and therapist recommendations, the system retrieves an initial set of constraints as defined by the therapist for each patient gesture, so that an adapted and personalized game can be generated by taking these constraints into consideration. For each gesture \(g_i\), we have a set of constraints \(C_i\); one gesture can also have several constraints at the same time. The patient constraints over all gestures (and their corresponding joints) are defined as \(\mathcal {PC} = \langle pc_1, pc_2, \ldots , pc_n\rangle \), where \(pc_i = \langle g_i, C_i\rangle \) represents the set of patient constraints \(C_i\) for a particular gesture \(g_i \in \mathcal {T}\). \(C_i\) is represented as follows:

\(C_i = \langle c_{i,1}, c_{i,2}, c_{i,3}, \ldots , c_{i,k}\rangle \), where each constraint \(c_{i,j}\) includes the following parameters:

$$\begin{aligned} c_{i,j} = \left\langle \textit{CID}, \textit{UID}, \textit{psROM}, \textit{peROM}, \textit{pdur}, \textit{pr}\right\rangle \end{aligned}$$

where CID is the constraint identifier; UID is the user identifier; psROM is the patient's start angle; peROM is the patient's end angle; pdur is the time the patient takes to complete the gesture; and pr is the resistance applied while the patient performs the gesture.

The constraints are also updated by the system upon finishing each session. The system analyzes the session data recorded by the patient and, based on the current performance, updates the difficulty level and other parameters such as speed, ROM, and durations (see Sect. 5.2 for details).

To judge the improvement of the patient during a session, we also define X as the set of expected metrics to be achieved after that session. X is calculated by taking each gesture's properties and the percentage-of-improvement (PI) indicator for that gesture into account, as suggested by the therapist. For instance, it may be suggested that a patient's range of motion should improve by 5% after three sessions. The PI indicator can be the same for the whole exercise, including all the corresponding gestures, or differ from one gesture to another (e.g., a 5% improvement may be expected from the flexion/extension exercise, whereas only 3% is expected from the radial/ulnar one). Consequently, the expected parameters can be recomputed after the suggested number of sessions, so that new expected values are stored in the patient's records. X is represented as follows:

\(\mathcal {X} = \langle x_1, x_2, x_3, \ldots , x_n\rangle \), where each \(x_i\) includes the following parameters:

\(x_i = \langle \textit{xdur}, \textit{psROM}, \textit{peROM}\rangle \), where xdur is the expected duration required to complete the gesture, and psROM and peROM are the expected patient start and end angles.
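As a sketch of how the expected metrics could be recomputed, the function below applies the PI indicator to a gesture's current targets. The compounding rule (ROM span grows by PI% around a fixed start angle, expected duration shrinks proportionally) and the field names are our assumptions; the paper only states that expected values are recomputed after the suggested number of sessions.

```python
from dataclasses import dataclass

@dataclass
class ExpectedMetrics:
    """x_i = <xdur, psROM, peROM>: expected per-gesture targets."""
    xdur: float    # expected duration to complete the gesture (s)
    ps_rom: float  # expected start angle (degrees)
    pe_rom: float  # expected end angle (degrees)

def next_expected(current: ExpectedMetrics, pi_percent: float) -> ExpectedMetrics:
    """Recompute expected targets once the suggested number of sessions
    has elapsed, applying the percentage-of-improvement indicator PI."""
    factor = 1 + pi_percent / 100.0
    span = current.pe_rom - current.ps_rom
    return ExpectedMetrics(
        xdur=current.xdur / factor,             # expect a slightly faster gesture
        ps_rom=current.ps_rom,                  # start angle kept fixed
        pe_rom=current.ps_rom + span * factor,  # ROM span widened by PI%
    )

# A gesture currently spanning 0-40 degrees, with 5% expected improvement:
nxt = next_expected(ExpectedMetrics(xdur=3.0, ps_rom=0.0, pe_rom=40.0), 5.0)
assert round(nxt.pe_rom, 1) == 42.0
```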

We use the attributes defined above to construct the patient model and then use the model to generate the relevant therapies for the patient. For example, we use the model information to determine the extent of the impairments and, based on it, control the difficulty level of the therapies.

3.4 Personalized therapy design

In order to support a personalized therapy with the aid of the game paradigm, we have modeled personalization aspects in therapy design. The patient domain comprises three main components: (i) the patient profile, (ii) patient preferences, and (iii) patient constraints, as described in Sect. 3.3. This patient domain serves as input for defining the patient's therapeutic context: details about the therapy type, the types of game features needed to map each primitive therapy to a game action, the types of motions involved in each game action, the joints and muscles to be tracked in each motion type, the metrics that store joint and muscle values, the normal range of motion for each joint, and the improvement metrics for each disability type, to name a few. A hand motion or gesture involves a subset of hand muscles and joints, and a hand therapy exercise is mapped to one or more quality-of-improvement metrics.

Depending on the patient domain, a personalized game domain is created for that patient, which includes the appropriate features of the game. For instance, a very simple version of the game, with non-disruptive visual features, can be generated for patients prone to mental distraction. Moreover, patients' preferences are treated as key game-changing parameters: places of interest to be visited, source and destination locations, and the visual avatar can all be easily adapted to the patient's request. Furthermore, the patient model, along with the therapeutic instructions defined by the therapist, contributes to filtering the hand therapy domain in order to select the required gestures with a specific configuration (ROM, number of repetitions, etc.). Consequently, adapted therapeutic gestures are generated and stored in the hand gesture domain, with specific quality-of-improvement metrics to be tracked during the session. It is worth noting that, based on the patient's performance throughout the training and official sessions, the system provides additional indicators that are fed back into the patient domain for the generation of future games and difficulty levels, as discussed in Sect. 5.2.

Figure 3 shows the use case diagram that illustrates how the personalized therapeutic games are designed and deployed. As illustrated, the personalized therapy monitoring scenario involves two types of users: a therapist and a patient. In addition, the back-end requires developers, system administrators, and a support person or team to help in the setup and calibration phases. A therapist can create a model therapy session and then allocate the appropriate therapy module, with relevant complexity levels, to a given patient. A therapist can also define high-level therapies and associate a subset of primitive therapies with his/her profile, as well as manage patients and review and annotate patients' therapy sessions. The therapist can view the sessions recorded by a patient at home or at the hospital, or visualize the plots and different metrics to follow the patient's improvement. A patient can visualize the model therapy session and record his/her own session within the game environment; he/she can log into the game environment and replay model therapies uploaded by his/her therapist. A caregiver can view the improvement metrics from past session analytics and can also assist the patient whenever needed.

Fig. 3 Use case diagram showing the personalized therapy design life cycle

4 Intelligent game generation

The modeling approach presented in Sect. 3 serves as a basis for the development of our therapy-aware and patient-centered route generation algorithms. After receiving the input therapies needed for a given patient, the framework engine generates and recommends adaptive 3D routes inside the serious game environment, with different difficulty levels. The game provides a deeply immersive map navigation experience to the players. This section first describes the mapping approach for therapy gesture translation, and then explains how adaptive routes are generated from the therapeutic instructions while taking the patient's preferences and constraints into account.

4.1 Mapping physical therapy to 3D serious games

A number of natural map browsing gestures have been developed in Afyouni et al. (2014) to make the browsing experience simple and intuitive. Figure 4 shows a gesture mapping table, which outlines the different gestures that have been implemented, such as wrist ulnar and radial deviation for moving right and left, and wrist flexion and extension for zooming in and out. We took these gestures from therapies suggested by therapists to patients suffering from motor disability problems. Each gesture is detected by a gesture-specific algorithm that parses the incoming sensory data stream and interprets hand position data to identify the particular gesture. As shown in Fig. 4, we designed both 2D and 3D map browsing gestures such that each primitive gesture can be converted into a primitive therapy. For example, to move around a 2D map, a user has to grab the map by making a clenched-fist gesture; once the map has been grabbed, moving around requires moving the clenched fist to the right or left. To zoom into or out of the map, the clenched fists of both hands are used. A different set of gestures has been designed for 3D map browsing. Figure 4 also shows the normal ROM for each joint movement. For instance, radial deviation of the wrist joint means moving the hand such that the wrist remains fixed while the fingertips head towards the body; in ulnar deviation, the fingertips sway away from the body.

Flexion and extension of the wrist are used to fly low or high, which is equivalent to zooming in and zooming out, respectively. Flexion of the wrist means bending the wrist such that the fingertips sway inwards; extension is the opposite movement. When the fingers are swayed outwards from the anatomical position, the action is called hyperextension. Flexion and extension have a range of \(0^{\circ }\) to \(90^{\circ }\), while hyperextension has a range of \(0^{\circ }\) to \(60^{\circ }\). Abduction is the movement of a part of the body away from the body, while adduction is the opposite. Finger abduction means opening the fingers so that they move away from the middle finger; finger adduction means moving the fingers towards the middle finger. The normal range of motion for abduction and adduction of the fingers is \(0^{\circ }\) to \(20^{\circ }\). Clenching the fist is used to lock onto 2D maps: once the map is locked, moving the clenched fist moves the map in the direction of motion of the fist, and opening the fist releases the map. Once the map is released, moving the hand moves the pointer on the screen. Similarly, clenching both fists allows a user to zoom in or out of the map: increasing the distance between the two fists zooms into the map, while bringing the fists closer results in a zoom-out operation.

Fig. 4 Modeling and mapping physical therapy gestures

Mapping hand gestures to navigational movements in serious games requires specific algorithms for each kind of movement. For instance, an algorithm for mapping hand gestures to 2D map navigation within the serious game environment was developed in Afyouni et al. (2014). This algorithm takes Leap Motion data frames as input and operates on the current and previous frames to compute the differences in angle, duration, and spatial distance needed to generate the next navigational movement. It is applied to 2D maps for left, right, up, and down movements; other algorithms that take Leap Motion data as input have also been designed for the remaining map movements, such as zoom in, zoom out, and combinations thereof. The next section focuses on therapy-aware and patient-centered 3D route generation algorithms, which build on this mapping approach to translate therapy instructions into 3D routes in a fully automated way, while considering patients' preferences and constraints.
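In the spirit of that algorithm, the sketch below shows the frame-differencing step for 2D map navigation. The dictionary keys, thresholds, and left/right orientation are illustrative assumptions, not the published code.

```python
def map_gesture_to_action(prev_frame: dict, curr_frame: dict,
                          threshold_deg: float = 1.0) -> str:
    """Map the change between two consecutive Leap frames to a 2D map action,
    following the gesture table in Fig. 4 (simplified to a single hand).

    Assumes each frame exposes wrist deviation and flexion angles in degrees.
    """
    d_dev = curr_frame["deviation_deg"] - prev_frame["deviation_deg"]
    d_flex = curr_frame["flexion_deg"] - prev_frame["flexion_deg"]
    if abs(d_dev) >= abs(d_flex):            # deviation dominates: pan
        if d_dev > threshold_deg:
            return "pan_right"               # ulnar deviation
        if d_dev < -threshold_deg:
            return "pan_left"                # radial deviation
    else:                                    # flexion dominates: zoom
        if d_flex < -threshold_deg:
            return "zoom_in"                 # flexion -> fly low
        if d_flex > threshold_deg:
            return "zoom_out"                # extension -> fly high
    return "no_action"
```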

4.2 Therapy-aware and patient-centered route generation

The intelligent route generator provides an adaptive path that includes all motions required to complete the therapy. Algorithm 1 describes the main steps for creating therapy-based dynamic routes within the game environment. This algorithm takes as input the therapy gestures \(\langle g_{1},~g_{2},~\dots ,~g_{n}\rangle \), the corresponding set of constraints for each gesture \(\langle C_{1},~C_{2},~\dots ,~C_{n}\rangle \), where \(C_i\) represents the set of patient constraints for gesture \(g_i\), the starting point initialPosition, the destination point endPosition, the percentage of improvement PI, and the expected session-based metrics \(\langle x_{1},~x_{2},~\dots ,~x_{n}\rangle \). The generated path is stored in a cloud-based geo-spatial environment to be played in different sessions and with different levels of difficulty.

Algorithm 1 Therapy-aware and patient-centered route generation (pseudocode)

Each therapy exercise (e.g., wrist flexion/extension) comprises four gestures: (1) moving from the normal position towards the target, (2) holding at the target, (3) moving back from the target to the normal position, and (4) holding at the normal position. As illustrated in Algorithm 1, every standard therapy proposed by the therapist for a given patient has to be adapted according to the constraints defined for that particular patient. As mentioned earlier, the initial set of constraints is defined by the therapist based on the user profile; these constraints are then updated at the end of each session according to the patient's performance. The route generator also takes into account the source and destination requested by the patient to make the game more personalized. The zoom level is also considered, but within certain restrictions; the system recommends valid zoom levels depending on the source and destination and on the total therapy duration derived from the gesture durations, as explained in Sect. 3. The following actions are applied in this intelligent route generator:

  • X is the set of expected session-based parameters required to achieve the target level of improvement. It is calculated by taking the therapy properties \(\mathcal {T}\) and the percentage-of-improvement indicator PI into account, as suggested by the therapist (line 2).

  • We start with an initial position (location, orientation) by computing the altitude of the object based on the number of therapy gestures n, the minimum and maximum travel times, and the speed limits on the Z-axis (MINZSPEED, MAXZSPEED). The initial altitude takes into account the number of flexion/extension movements (i.e., ups and downs) and the maximum speed allowed, so that the airplane neither hits the ground nor exceeds the upper altitude limit (lines 3–5). The (x, y) coordinates of the position are predefined, either as a default place or as the place requested by the user.

  • The average ground speed avgGroundSpeed is calculated from the Euclidean distance eucDistance between the starting location initialPosition and the destination endPosition, and the total therapy time totalTherapyTime. This time is equivalent to the travel time totalTravelTime and is calculated by adding up the durations of the therapy gestures (lines 6–9).

  • The ultimate orientation is the direction of flight towards the destination, also referred to as the course angle course (line 10).

  • For each gesture in the prescribed therapy, we retrieve the set of related constraints and expected session-based parameters, and calculate the average ground speed and actual flying speed for that particular gesture (lines 14–16).

  • Based on the movement of gesture \(g_{i}\), we compute the next 3D global coordinates in the XY and YZ planes, using the expected range of motion, the course, and the actual flying distance (lines 17–20). The pseudocode for obtaining the coordinates of the next point is given below.

    [Pseudocode for computing the coordinates of the next point]
  • We update the total therapy distance by adding the actual length of gesture \(g_{i}\) and append the calculated position; the average flying speed is then obtained from the total therapy time and total therapy distance (lines 21 and 24).

The output of this algorithm is a sequence of 3D global coordinates \(\mathcal {P}~=~\langle p_{0},~p_{1},\dots ,~p_{m}\rangle \) forming an adapted route, the total therapy distance \(d_{th}\), and the average flying speed \(\textit{avgFlyingSpeed}\) based on the therapy gesture movements. This recommended route can be adapted after each session if the game goals are achieved from a medical perspective; in that case, a new route is designed by adjusting the difficulty level according to the suggested percentage-of-improvement indicator. Otherwise, the same route is proposed in the next session.
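The sketch below condenses the main loop of Algorithm 1 under simplifying assumptions: flat geometry, a placeholder vertical-offset rule tied to the expected ROM, and illustrative altitude limits. It conveys the structure of the generator rather than the paper's exact computation.

```python
import math

def generate_route(gestures, start_xy, end_xy, min_z=200.0, max_z=2000.0):
    """Generate 3D waypoints from adapted gestures, i.e. gestures to which
    the patient constraints have already been applied. Each gesture is a
    dict with "movement", "expected_rom_deg", and "duration_s" (our names).
    """
    # Initial altitude: leave headroom for the ups and downs implied by
    # flexion/extension gestures (cf. Algorithm 1, lines 3-5).
    z = (min_z + max_z) / 2.0
    total_time = sum(g["duration_s"] for g in gestures)
    euc = math.dist(start_xy, end_xy)
    avg_ground_speed = euc / total_time                  # lines 6-9
    course = math.atan2(end_xy[1] - start_xy[1],         # line 10: course angle
                        end_xy[0] - start_xy[0])
    x, y = start_xy
    points, total_dist = [(x, y, z)], 0.0
    for g in gestures:                                   # lines 14-20
        ground_advance = avg_ground_speed * g["duration_s"]
        # Vertical offset proportional to the expected range of motion
        # (illustrative scale factor of 100 map units):
        dz = math.sin(math.radians(g["expected_rom_deg"])) * 100.0
        if g["movement"] == "flexion":
            z = max(min_z, z - dz)                       # fly low
        elif g["movement"] == "extension":
            z = min(max_z, z + dz)                       # fly high
        x += ground_advance * math.cos(course)
        y += ground_advance * math.sin(course)
        px, py, pz = points[-1]
        seg = math.hypot(math.dist((px, py), (x, y)), z - pz)
        total_dist += seg                                # lines 21, 24
        points.append((x, y, z))
    return points, total_dist, total_dist / total_time   # avg flying speed
```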

4.3 Real-time behavior

As previously mentioned, in the preprocessing phase the system creates a personalized game for the patient, which includes a route adapted to the prescribed therapy. Whenever the patient decides to play a new session, the latest personalized game is displayed in real-time. The patient tries to follow the predefined route using hand gestures, while non-disruptive instructions (i.e., alerts) are generated on the fly to prevent deviations from the original plan. The system provides an alerting mechanism that guides patients in their navigation experience using multi-modal guidance instructions; these can be visual (shown on the screen) and/or voice instructions, depending on the patient's perception capabilities (cf. Fig. 10). A 3D spatio-temporal tunnel is also generated to discourage sharp deviations from the recommended path. However, the system does not automatically correct the patient's navigational movements, since the purpose is to analyze movement behavior, not to force the patient to always stay on the path.
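A minimal sketch of the tunnel check and its non-disruptive alert follows; the tunnel radius and the message wording are illustrative assumptions.

```python
import math
from typing import Optional, Tuple

Vec3 = Tuple[float, float, float]

def tunnel_alert(player_pos: Vec3, path_pos: Vec3,
                 radius: float = 50.0) -> Optional[str]:
    """Compare the player position with the corresponding point on the
    generated route; return a guidance alert if the player has left the
    3D spatio-temporal tunnel, but never auto-correct the movement."""
    deviation = math.dist(player_pos, path_pos)
    if deviation <= radius:
        return None  # still inside the validation zone
    return f"Off course by {deviation - radius:.0f} units; steer back towards the route"
```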

Real-time parameters are also displayed, including ROM values for the relevant joints, the time spent in seconds, the number of balls/objects hit, etc. In every frame, the current position and other relevant parameters regarding the hand posture are recorded, as discussed in Sect. 3.2. Game-generated data and Leap Motion data are recorded and preprocessed in real-time, so that they can be sent for detailed analysis upon finishing the session. At the end of each session, diverse analytics regarding the behavior of the player are automatically generated and displayed to the therapist. A comparison of the patient's performance across sessions is conducted in order to assess the patient's improvement from a medical perspective, as well as with regard to their interaction and achievements within the game.

5 System development and usage

We present a system that generates adaptive serious games from rough input instructions provided by the therapist, and that employs automated, intelligent trajectory recommendation within a novel gaming environment to be played and followed by the patient. The remainder of this section describes our system requirements and the system design, along with its implementation details.

5.1 Requirements

Our approach relies on the assumption that playing games while performing the assigned therapy is more attractive and engaging. Therefore, adaptive serious games need to be designed based on patient therapeutic data. In particular, such a gaming environment is expected to be self-trained by taking the patient's disability into consideration. Mapping three-dimensional therapeutic hand exercises to actual movements over the real globe requires computing 3D points at different levels of detail to reflect those movements. Flexion and extension of the hand, in addition to its ulnar and radial deviation, are considered for game development and for the experiments. To measure the quality-of-improvement metrics, analytics are extracted from each type of map browsing activity. A subject can navigate the 3D world map through different hand gestures. A 3D path at different levels of detail is generated from the therapy data (range of motion, number of repetitions, source and destination, etc.). A 3D spatio-temporal validation region (i.e., a 3D tunnel) is also generated in order to track and adjust the patient's movement throughout the session. The metrics that need to be collected in real-time include positions as well as linear and angular velocities and accelerations. Additional challenges lie in creating and adding semantics to different types of attractive and repellent objects depending on the difficulty level of the game. This aims at enriching the user's navigation experience while complying with the strict set of health constraints suggested by the physician. At the end of the browsing session, the actual and generated paths are visualized and compared with respect to the mean deviation time and distance.

Other non-functional requirements are related to the performance, reliability/correctness, effectiveness, and usability of the system. Several performance metrics need to be designed to assess the system efficiency with respect to well-known benchmark measures. These include pattern comparison, point-to-point analysis for specific gesture improvement, the mean time needed to complete the exercise, path length, activity duration (how long it takes to complete each gesture), and the number of times the player hits an obstruction or goes out of range. It is also required to pilot the system in its natural settings and observe a sample of real users interacting with it. The second challenge is to measure the effectiveness of the system in providing an easy-to-use and cost-effective framework for therapists to assign therapy sessions to patients and monitor their impact, while keeping patients motivated and entertained. This also entails measuring the reliability of the data captured through the system to be used in decision making. Details on these factors can be found in Sect. 6.

5.2 System design

Our system lifecycle starts when the therapist uses our authoring interface to define a specific therapy. As illustrated in Fig. 5, the interface displays a human body anatomy model, where each of the joints that can be tracked is displayed as a hotspot on the body. When a user clicks on a joint, a list of movements related to that joint is displayed with check boxes on the side. A therapist can associate a subset of body joints with a subset of primitive or high-level actions and other trackable metrics. This exercise is then assigned to any patient suffering from a related disability. The Therapy Database stores details about the disability type, therapy type, types of motions involved in each therapy, joints to be tracked for each motion type, metrics related to those joints, normal ranges for each of the factors and metrics, and improvement metrics for each disability type, to name a few. A therapy-mapping module then translates these gestures into appropriate movements in the game environment. This module is sufficiently generic that navigational movements can be applied to different patients with similar disabilities. The adaptation of those movements to reflect the patient's constraints, however, is done in the next step by the Game Configuration Manager.
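As an illustration of the therapy-mapping step, the sketch below translates primitive wrist gestures into generic navigational movements; the enum and mapping names are hypothetical stand-ins rather than the module's actual interface, and the gesture-to-direction pairing follows the navigation scheme described in Sect. 6.2.

```csharp
using System.Collections.Generic;

// Hypothetical therapy-to-navigation mapping: primitive gestures become
// generic game movements, later personalized by the Game Configuration Manager.
public enum Gesture { WristFlexion, WristExtension, RadialDeviation, UlnarDeviation }
public enum NavMove { Up, Down, Left, Right }

public static class TherapyMapper
{
    private static readonly Dictionary<Gesture, NavMove> Map = new()
    {
        { Gesture.WristFlexion,    NavMove.Up },
        { Gesture.WristExtension,  NavMove.Down },
        { Gesture.RadialDeviation, NavMove.Left },
        { Gesture.UlnarDeviation,  NavMove.Right },
    };

    public static NavMove Translate(Gesture g) => Map[g];
}
```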

Fig. 5
figure 5

System architecture

The core information processing and computational intelligence components are the Game Configuration Manager and the game engine. The Configuration Manager takes different inputs into account, including the generic translated movements, the patient profile, the game profile, and the previous session data, in order to design a game with different levels of difficulty that reflects the preferences and constraints of the patient. The user profile acts as an electronic medical record (EMR), which stores detailed information about disabled patients. A patient record includes family information, the type of hand disability, the name of the overseeing therapist, the types of hand therapy the patient has to conduct, past therapy history, improvement parameters, etc. The game profile contains specific properties of the game that can be enabled/disabled depending on the patient profile. For instance, a visual feature such as particles moving within 3D space can be disabled for patients who are easily distracted. When the patient starts a therapy session using the GIS game, the game engine requests geo-spatial map data from the map server, or any other scene layout data depending on the selected game environment. The gestures performed by the user for playing the game are detected by the sensors and sent to the game engine. While the user enjoys an immersive game-playing experience, the data captured by the sensors is stored in a predefined format for later playback, processing, and mining.

Session data is stored so that it can be played back later in the 3D world. Session data comprises three types of game-generated data: (i) Leap Motion data, (ii) player data, and (iii) generated path data, as described in Sect. 3.2. The output part is made up of different windows showing the Leap Motion data as well as the quality-of-improvement metrics. An interactive graph generation screen is also part of the output interface.

Let us consider a scenario where a patient is registered for the first time; his/her basic disability data is then recorded in his/her profile. Suppose a patient can flex his/her wrist up to \(20^{\circ }\) only. The normal range of motion for wrist flexion is \(75^{\circ }\). Based on the percentage of improvement suggested by the therapist, the system can devise a plan to recover the patient's ability in different steps. For instance, it might be suggested to divide the difference into five steps: \((75-20)/5=11\). This means that a five-level game will be devised by the system with increments of \(11^{\circ }\) at each level. Before the game starts, the system pulls up the recorded metrics of the patient from the initial measurement (or from a previous game session if the patient has played the game before) and displays them on the screen. The system will then ask the user about his/her preferences regarding the game scenes and the source and destination targets for his/her trip.
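The level-planning arithmetic above can be summarized in a few lines; the following sketch is illustrative only, with hypothetical names, and reproduces the worked example of a 20° current ROM, a 75° normal ROM, and five levels.

```csharp
// Divide the gap between the current and normal ROM into equal increments.
public static class LevelPlanner
{
    public static double[] TargetAngles(double currentRom, double normalRom, int levels)
    {
        double step = (normalRom - currentRom) / levels;  // (75 - 20) / 5 = 11
        var targets = new double[levels];
        for (int i = 0; i < levels; i++)
            targets[i] = currentRom + step * (i + 1);     // 31, 42, 53, 64, 75
        return targets;
    }
}
```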

Due to the tight integration with the health framework, the 3D path generation and the difficulty level of the game are controlled by the system based on the current health status of the patient. The input to the 3D path generation algorithm is the primitive therapies devised by the therapist while considering the patient profile. A therapy-to-navigation mapper translates those primitives into navigational movements, which are integrated within the game engine in order to generate and match the navigable route on the 3D globe. The patient can fly the plane over famous monuments within a 3D map environment. Based on the physiotherapist's recommendations, a 3D spatio-temporal tunnel is constructed to delimit a safe region in which the actual route performed by the patient can be considered valid. This tunnel acts as a set of transparent navigation barriers in real-time. Whenever the patient follows a route that deviates from the valid region, the flying plane hits the barriers of this tunnel, which avoids further deviation and rectifies the followed path.
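A simple way to realize the tunnel test, sketched below under the assumption that the recommended route is stored as a polyline of waypoints, is to check whether the player's position lies within a fixed radius of the nearest route segment; this is standard point-to-segment geometry, not necessarily the paper's exact algorithm.

```csharp
using System;
using System.Numerics;

// Hypothetical tunnel validity check: the position is valid if it is within
// `radius` of the closest segment of the recommended route.
public static class TunnelValidator
{
    public static bool IsInsideTunnel(Vector3 pos, Vector3[] route, float radius)
    {
        float best = float.MaxValue;
        for (int i = 0; i + 1 < route.Length; i++)
            best = Math.Min(best, DistToSegment(pos, route[i], route[i + 1]));
        return best <= radius;
    }

    private static float DistToSegment(Vector3 p, Vector3 a, Vector3 b)
    {
        Vector3 ab = b - a;
        float t = Vector3.Dot(p - a, ab) / Vector3.Dot(ab, ab);
        t = Math.Clamp(t, 0f, 1f);              // project p onto the segment
        return Vector3.Distance(p, a + ab * t);
    }
}
```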

As soon as patients achieve their goal by increasing their motion to the required number of degrees after a given session, they are promoted to the next level. The new level has similar movements, albeit a different difficulty level. The scenery and environment are different to add variety to the gaming experience. To perform the same maneuvers, the patient is now required to move the joint to a higher degree. At the end of each game session, the patient is shown his/her progress based on the comparison between the actual and generated paths, as well as between the current and previous game sessions.

It is worth noting that the system is self-learning, meaning that the game configurations change based on the initial training and assessment, and after each played session. The adjustment of the difficulty level varies according to the calculated target of each gesture, which takes into account the performance of the past sessions (i.e., the last three sessions are considered). A performance indicator Y dictates the type of adjustment. For instance, if Y is 100% and the target angle of the previous session for ulnar deviation was \(33^{\circ }\), this implies that the patient has performed this gesture the required number of times, and with the right target angle. Three cases for adapting the difficulty levels are distinguished as follows (a code sketch is given after the list):

  • Case 1: If the mean performance of the last three sessions lags behind by a threshold \(J > 10\%\), then the new target angle is reduced by J (an example value of J is 13%): \(\textit{targetAngle} = \textit{pastTarget} - (J \textit{ of pastTarget})\) (i.e., the adjustment is negative). For example, for a target angle equal to \(33^{\circ }\) and J equal to 13%, the new angle will be approximately \(28.7^{\circ }\).

  • Case 2: If the performance of the last three sessions lags behind by a threshold \(2\% < J < 10\%\), then the delta angle is 0 and the target remains the same (i.e., no adjustment is applied).

  • Case 3: If the performance lags behind by less than 2%, then the exercise is considered complete, and the new target angle is increased by the percentage of improvement PI, as defined in Sect. 3.3.
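The three cases can be condensed into a single update rule, sketched below with hypothetical names; the lag is the shortfall of the mean performance of the last three sessions from 100%, and PI is the prescribed percentage of improvement from Sect. 3.3.

```csharp
using System.Linq;

// Illustrative difficulty adaptation following the three cases above.
public static class DifficultyAdapter
{
    public static double NextTargetAngle(double pastTarget, double[] lastThreeY, double pi)
    {
        double lag = 100.0 - lastThreeY.Average();    // shortfall in percent
        if (lag > 10.0)
            return pastTarget * (1.0 - lag / 100.0);  // Case 1: e.g., 33° -> ~28.7° at J = 13%
        if (lag > 2.0)
            return pastTarget;                        // Case 2: no adjustment
        return pastTarget * (1.0 + pi / 100.0);       // Case 3: increase target by PI
    }
}
```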

Fig. 6
figure 6

The framework lifecycle as a self-learning component

Figure 6 illustrates the life cycle for taking patient characteristics and generating the different levels of difficulty within the game environment. The different phases in the learning process are as follows.

  1. Phase 1: the patient-centered therapy assignment with an expected percentage of improvement (PI) per session. This is the initial phase, where the therapy instructions and the patient's preferences and constraints are considered for the game development.

  2. Phase 2: involves performing one or multiple training sessions to calibrate the device and game parameters before generating the patient-adaptive game.

  3. Phase 3: covers the generation of therapy-aware navigational movements with multiple levels of difficulty having an increased PI value.

  4. Phase 4: comprises the official session recording by the system while playing the assigned game.

  5. Phase 5: covers performing extensive real-time and offline analytics, and assessing the satisfaction level of users as derived from usability tests.

  6. Phase 6: this last phase covers the system adaptation based on learned semantic knowledge. The patient model is also updated accordingly. The system checks whether a new therapy should be devised and, if so, returns to the first phase; otherwise, the system starts again from Phase 3 with an updated level of difficulty.

Fig. 7
figure 7

Game user interface: the left side shows the right-hand skeleton through the Leap Motion hand visualizer, and the right side shows the player within the game environment

5.3 Implementation

The game environment consists of a PC or laptop with a Leap Motion device attached to it to capture hand movement data. Our designed system consists of two major parts: (1) two front-end interfaces, one for assigning therapies and viewing interactive plots and statistics of the results, and one serving as the game interface; and (2) a back-end game engine that generates game tasks based on therapies and collects and saves usage data.

We have used the Unity game engineFootnote 14 to create 3D games related to hand therapy. A MapNav Unity extension was used to incorporate the 3D maps into the game environment. We have developed algorithms to capture raw joint data from different joints of a subject's hand (see Fig. 7). The system relays the live joint data captured by the Leap Motion controller through its SDK to the web front-end, where the movements of the avatar hand representing the subject in the physical world are mapped and synchronized. The front-end interface is designed using WebGL along with HTML5 and PHP. The plots are generated using the jqplotFootnote 15 and plotlyFootnote 16 libraries.

As the patient plays the game, the system collects usage data. The system collects two types of real-time data: (1) Leap Motion data, which contains the output produced by the Leap Motion device, and (2) game data. Leap Motion data is a series of time-stamped frames with IDs. Each frame has a number of attributes such as confidence (a floating point number from 0 to 1), palm direction (unit vector), arm direction (unit vector), joint positions, etc. The game data contains the trajectory path originally computed for the therapies, as well as the user-generated data, including the trajectory, gestures, the number of coins picked, time, etc.
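The following record types are an illustrative rendering of the two streams described above; the field names are hypothetical stand-ins for the stored attributes rather than the system's actual schema.

```csharp
// Per-frame Leap Motion record (names illustrative).
public class LeapFrameRecord
{
    public long FrameId { get; set; }
    public long TimestampMicros { get; set; }
    public float Confidence { get; set; }          // 0..1 tracking confidence
    public float[] PalmDirection { get; set; }     // unit vector (x, y, z)
    public float[] ArmDirection { get; set; }      // unit vector (x, y, z)
    public float[][] JointPositions { get; set; }  // per-joint 3D positions
}

// Per-event game record (names illustrative).
public class GameEventRecord
{
    public double TimeSeconds { get; set; }
    public string Gesture { get; set; }            // e.g., "ulnarDeviation"
    public int CoinsPicked { get; set; }
    public float[] PlayerPosition { get; set; }    // sampled position on the trajectory
}
```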

Our system saves hand motion data, patient game data, and generated data in JSON format. We use the JSON format both for data plotting and for the online animation of historical motion, since we cannot compromise on the quality of the raw data. Motion data analysis and plotting have been implemented using PHP and the plotly JavaScript library. A MySQL database hosted on our local servers is used to store the data.

5.3.1 System usage

Our system can be accessed online by therapists and patients. On first access, they have to register and create a username and a password. On subsequent accesses, depending on their role (therapist or patient), they see the relevant features. The patient can see all the therapies assigned to him, play these therapies in his chosen game, record the session, see the analytics of the recorded data, see historical results for comparison, etc. In the game environment as well, the patient can explore key places interactively, change background tiles and styles, choose different player objects and different intermediate coins, play with or without tunnels, run a training session, set the starting and ending positions in the navigation games, etc. The therapist logs on to the system through the initial login screen. The first menu option allows the therapist to design a therapy. Selecting this option takes her to a screen where she can enter the name and a brief description of the therapy. The next screen shows the image of a human skeleton in a blue outline. The different joints of the body are drawn as hotspots on the screen. When the therapist clicks on a joint, a drop-down list of motions associated with the joint appears as a tooltip on the screen. When the therapist is done selecting the joints and movements of interest, the data is stored in the database. The purpose of storing joints of interest in the database is to inform the system which joints to track for a particular therapy. The therapist can link the therapy to a disability and assign the therapy to any patient registered in the database. The therapist can choose various metrics and can see the plots and statistics to analyze the patient's performance. The therapist can add a new therapy to the database by selecting the joints on a hand skeleton showing all the possible joints. The therapist can also set the number of repetitions for each gesture and the percentage of improvement between sessions.

Once therapies are added in the front-end interface, they are saved on the server for the given patients. The game engine then receives the therapy prescription from the server, defined as a set of movements pertaining to specific joints. The game engine first breaks down and converts this therapy description into an internal representation as an ordered array of gestures. Each gesture is represented by a start angle, an end angle, a duration, a joint, and the movement type of that joint. The array of gestures is then shuffled in accordance with typical physiotherapy convention, such that each movement is followed by its opposite movement; for example, flexion is followed by extension whenever possible. Then, based on this array of gestures, a path in the game is generated. At the intermediary points between gestures, target coins are placed. Touching these targets in the game awards the player with points.
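As a rough sketch of this step, the fragment below turns an ordered gesture array into waypoints with a coin at the end of each gesture; the angle-to-offset mapping is deliberately simplified and the types are hypothetical, not the engine's actual path generator.

```csharp
using System.Collections.Generic;
using System.Numerics;

// A gesture as described above: start/end angles, duration, joint, movement type.
public record GestureSpec(string Joint, string Movement, float StartAngle, float EndAngle, float Duration);

public static class PathBuilder
{
    public static (List<Vector3> Waypoints, List<Vector3> Coins) Build(IEnumerable<GestureSpec> gestures)
    {
        var waypoints = new List<Vector3> { Vector3.Zero };
        var coins = new List<Vector3>();
        Vector3 pos = Vector3.Zero;
        foreach (var g in gestures)
        {
            // Displace the path proportionally to the gesture's angular span;
            // flexion/extension move vertically, radial/ulnar horizontally.
            float span = g.EndAngle - g.StartAngle;
            pos += g.Movement is "flexion" or "extension"
                ? new Vector3(0, span, 0)
                : new Vector3(span, 0, 0);
            waypoints.Add(pos);
            coins.Add(pos);  // a coin rewards completing the gesture
        }
        return (waypoints, coins);
    }
}
```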

We implemented multiple games in our system. One of the games is navigation in 3D maps while flying an airplane or a kite along a suggested therapeutic route to pick up coins. The trail of the user's navigation in the 3D maps is mapped to the therapy devised for that patient. Navigational movements, angles of deviation, and other therapeutic data are recorded for pattern analysis and for producing quality-of-improvement feedback about the patient's performance. Figures 7 and 9 illustrate the user interfaces where a user is navigating through the map browsing game environment, tracked by the Leap Motion to deduce rotational and angular motions of the hand joints. Figure 8 shows the game interface inside the Unity game engine with various attributes such as position, rotation, scale, speed, etc.

Another designed game consists of navigating within 3D space following a computed route, as shown in Fig. 10. The patient has to pick up coins while navigating the therapeutic path, and the system collects game and Leap Motion data for the hand movement. The virtual hand is also drawn on the interface and is synchronized with the patient's hand movement. Visual aids in the form of text (turn right or left) and voice-over messages are also available to the user for easy navigation.

Fig. 8
figure 8

Software development environment within the Unity game engine

Fig. 9
figure 9

Generated path within the game environment

6 Evaluation

After designing the framework, we conducted an extensive evaluation with real patients. For the design of the study, we collaborated with a therapist from a local disability center and two physicians from Al Nour Hospital in Makkah. The user study was divided into two phases. The first phase was a pilot study in which one therapist from Al Nour Hospital wanted to assess the effectiveness of the framework and to find out which metrics were sufficient to give her a good picture of the patient's performance. Based on the results of the pilot study, we updated our framework and then performed a detailed evaluation. The different studies are discussed in detail in the following sections.

6.1 Study parameters

We monitored various therapy sessions of different patients and interviewed several therapists. Based on the data gathered, we defined different parameters to measure the effectiveness of performing therapies within our framework. The therapist could see real-time plots and statistics of all the parameters to assess patient performance.

Fig. 10
figure 10

Screenshot of the outer space navigation game

6.1.1 Measurement details

Based on our discussions with the therapists, we incorporated the ability to record the following gestures in our framework.

  • Wrist flexion Wrist flexion happens when the palm of the hand moves inside towards the front of the forearm.

  • Wrist extension Extension of the wrist is the movement of the back of the hand towards the back of the forearm.

  • Wrist radial deviation This movement is expressed by tilting the thumb side of the hand toward the lateral side of the forearm.

  • Wrist ulnar deviation In this gesture, the wrist tilts towards the little finger side of the hand.

  • Elbow flexion Elbow flexion happens when the hand moves towards the shoulder.

  • Elbow extension Elbow extension is the straightening of the arm at the elbow joint. The elbow in the anatomical position is considered fully extended.

Effectiveness: advanced analytics are designed and drawn as a result of each game session to measure the effectiveness of the framework as a serious game environment and a therapeutic kit. Different metrics have been studied and monitored to analyze the quality of improvement of the patient and the usability of our framework (Chittaro et al. 2011; Hocine et al. 2015). These metrics include:

  • Time Time to accomplish a gesture or a therapy. This is computed for each gesture as well as for the whole session.

  • Joint movement Impact of therapy distribution over different joints of the wrist and fingers. We record the movement data for each joint of four fingers and the thumb, while performing therapy gestures.

  • Incorrect gestures An example of an incorrect gesture is to move the forearm instead of the wrist to achieve the desired result. We record the number and type of incorrect gestures and the time at which they are performed.

  • Hand posture Hand posture while performing therapy gestures.

  • Reaction time Interval between the appearance of an instruction and its execution.

  • Range of motion (ROM) This is one of the most important metrics for the therapist to assess the improvement of the patient. We record the maximum, minimum, and average ROM for each gesture as well as for the whole therapy session (a computation sketch is given after this list).

  • Speed The velocity of the patient movement while playing the game.

  • Movement analysis Trajectory followed by the patient while performing a gesture.

  • Mean time Mean time needed to complete the exercise.

  • Gesture duration Time taken to complete each gesture.

  • Path length Path length for each therapy and for the whole session.

  • Success rate Therapies completed successfully and the number of points gathered in the game.

  • Error rate Number of therapies not performed correctly.

  • Hits Number of obstructions hit.

  • Interest Interest level in the game (how soon the patient becomes bored and distracted).

  • Other metrics We also calculate various other metrics needed by the therapist, including angular speed and acceleration. Muscle power, the Visual Analog Scale (VAS) pain score, and functional status are also determined before and after sessions by traditional means for comparison purposes.
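As referenced in the ROM item above, the per-gesture ROM summary amounts to simple order statistics over the joint angles sampled during that gesture; the sketch below is illustrative.

```csharp
using System.Linq;

// Minimal ROM summary over the angles (in degrees) sampled during one gesture.
public static class RomStats
{
    public static (double Min, double Max, double Avg) Summarize(double[] anglesDeg)
        => (anglesDeg.Min(), anglesDeg.Max(), anglesDeg.Average());
}
```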

Usability: A separate list of questionnaires was prepared for the therapist and the patients to assess their level of satisfaction about the usability and effectiveness of the framework.

6.1.2 Data collection and storage

Our system collects data from two sources. First, it collects data generated by the game as patients play their sessions while performing various therapies. Second, it gathers data generated by the Leap Motion device. The Leap Motion gives detailed raw data of hand movements as described in Sect. 3. The system logs this data in real-time as the patient plays the game. The system also calculates the defined metrics and their statistics, so that interactive plots can be generated immediately at the end of each session.

6.1.3 Apparatus

The studies were conducted on a 13.3-inch MacBook Pro (2.5 GHz dual-core Intel i5, 8 GB of memory), equipped with a 24-inch monitor working at a resolution of \(1600\times 1200\) pixels. The Leap Motion device with the v2 Desktop SDK was used to collect hand movement data.

6.1.4 Design rationale

The main purpose of designing the user study was to assess the patient/therapist satisfaction and to validate the framework for its effectiveness and usability. We wanted to allow therapists and patients to interact with our system, perform therapies, and interact with analytics. We also wanted to find out if a patient can easily interact with the system and perform the prescribed therapies at home with or without help. We also designed questionnaires for the patients and the therapists to collect qualitative results.

6.2 Pilot study

In the first phase of the user study, we conducted a detailed discussion with the therapist at a local disability center in Makkah about the metrics needed to measure and monitor the patient while performing a given therapy. The therapist wanted to see experimental results for the collected metrics of patients to compare them to traditional therapeutic kits. The purpose of the pilot study was to validate the correctness and usability of the system. The therapist wanted to analyze the metrics through different statistics and plots from data recorded in the study. The therapist also wanted to see if the defined metrics were enough to get a complete picture of the patient’s performance. A number of sessions were conducted before starting the real trials with the patients. Special attention was given to the calibration process in order to ensure reliability and correctness of the determined values.

6.2.1 Study details

In collaboration with the therapist at the local disability center, who provides hand-therapy services to diverse types of patients with different levels of diplegia, we finalized the above metrics and added real-time calculations of these parameters to the user interface. We then performed a pilot study with one patient, a 24-year-old female with a right-hand disability (a hand injury with wrist sprains). The therapist prescribed three sessions of therapies, where each session consisted of multiple repetitions of a single gesture. The therapist suggested a therapy consisting of six movements of the hand (i.e., pronation and supination of the forearm as well as flexion, extension, and radial and ulnar deviation of the wrist). The subject started with a flexed forearm and an open palm. The subject performed a number of movements in which the palm faced up or down (supination and pronation) and the tips of the fingers faced straight up or down (flexion and extension of the wrist), so that the patient could follow the game-generated route. The system provided live feedback through hand and skeleton animations, and analytical reports were also sent to the therapist about the state of the respective joints (Fig. 11).

Fig. 11
figure 11

Plots showing kinematic analysis of: a wrist radial–ulnar deviation; b flexion extension/hyperextension of wrist; c pronation–supination of palm surface; and d flexion–extension of elbow joint

For this pilot study, the therapist stored three model therapy sessions in the session repository and then allocated the right therapy module, with a relevant complexity level, to a patient. The game considered in this pilot study is a 3D space navigation game, where a subject navigates the space by going left (radial deviation), right (ulnar deviation), up (wrist flexion), or down (wrist extension/hyperextension), or by rotating the airplane (pronation/supination), along a generated therapeutic path (see Fig. 10). The subject has to collect the coins introduced at the end of each gesture.

The patient first performed a training session while playing the game with the Leap Motion to get accustomed to the environment. The patient then performed three therapy sessions, going through each therapy 30 times. The therapies were also randomly shuffled inside each session. After each session, the patient took some rest, and once she was ready, the next session started. Each session took about 6 min on average. At the beginning of each session, we also attached a goniometer to the patient's hand to calibrate the Leap Motion device and to validate the correctness of the readings recorded by the device. Our system collects data about the different hand therapy modules, the kinds of activities and actions performed during each therapy module, the types of data or metrics recorded, and the types of joints that are tracked. The following section summarizes the detected motion types and the analysis of the captured test data.

6.2.2 Results

The following are important metrics that the therapist analyzed to measure patient performance. The therapist wanted to see the overall hand movement of the patient for the various prescribed gestures. Figure 11a shows a part of the session in which the horizontal movement of the hand is plotted. The x-axis shows the number of frames, while the y-axis shows the normalized range of motion. As seen in Fig. 11a, during this part of the therapy session, the user moved her hand once to the right and once to the left. In other words, a positive Y value shows ulnar deviation, while a negative Y value indicates radial deviation. Hence, the real-time capture of the Leap Motion frames provides great insight into the radial/ulnar deviation movement of the wrist.

Similarly, Fig. 11b shows the flexion and extension hand movement for a certain portion during the session. This figure shows that the wrist was initially hyperextended for a short duration, and then it was in the extension state for the rest of the therapy session. Figure 11c illustrates that the subject’s palm was in a pronation state (palm surface facing downward) most of the time, whereas, at the end, it reverted to a supination state (palm facing up). Finally, Fig. 11d shows the navigation of the flexing and extending elbow in the game environment.

Figure 12 shows the time-based monitoring of different gestures performed by the patient over different sessions. In the wrist pronation plot, the x-axis depicts the time, while the y-axis shows the value of the palm normal. Initially, the palm normal is \({-}1\), which means that the palm is facing downwards. This state is called pronation (Fig. 12a). In the third session (khaki color), the hand is supinated to its fullest, so the graph goes all the way up to \({+}1\). The other sessions show a partial ability to move the hand from the pronated state to the supinated state.
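For illustration, the mapping from the palm-normal reading to the plotted states can be expressed as a simple classification; the ±0.5 thresholds below are illustrative, not values used by the system.

```csharp
// Palm normal near -1 on the vertical axis: palm faces down (pronation);
// near +1: palm faces up (supination); in between: transition.
public static class PalmState
{
    public static string Classify(float palmNormalY) =>
        palmNormalY < -0.5f ? "pronation"
        : palmNormalY > 0.5f ? "supination"
        : "transition";
}
```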

In the wrist ulnar deviation plot (Fig. 12b), the x-axis illustrates the time, while the y-axis shows the wrist range of motion. The zero angle represents the point where the hand makes no angle with respect to the forearm in the horizontal position; thus, the middle finger is pointing straight in this position. Radial and ulnar movements of the hand produce positive and negative values, respectively. Multiple sessions are plotted together for comparison purposes. The therapist told us that they generally measure improvements after multiple sessions, since measuring the improvement with devices like the goniometer takes time. Moreover, there is hardly any noticeable improvement after one or two sessions.

Fig. 12
figure 12

Comparing different sessions of a patient therapy. a Wrist pronation. b Wrist radial/ulnar deviation

6.2.3 Improvements and suggestions

The analysis of the collected data provided a first validation of the system. The therapist and patient were engaged and interested to see interactive plots of the metrics along with statistics. Generally, therapists have to use physical devices such as a goniometer after each session to measure the ROM, and no other metrics are provided; it is quite laborious and time consuming to calculate these improvements with such devices. Therefore, the therapist was excited to see all the results provided to her without any manual involvement, and she commented that it would improve the effectiveness of her work. The system also provides rich results for each and every movement of the joints, as opposed to physical devices, which are used after the end of the session to measure the overall improvement.

Based on the collected data analytics, the therapist also recommended a few other parameters: she wanted to see visual analytical improvements such as 3D trajectory analysis, mean deviations for each gesture and for the whole session, movement-based deviation, point-to-point analysis, and other statistics for each movement as well as for the whole therapy session. The therapist also pointed out that she needed another plot over time in which different repetitions of a therapy gesture were superimposed on each other. This would help her in monitoring the patient's performance, as well as the element of fatigue, while performing multiple repetitions of the same therapy. We complied with her request and added the additional metrics and analytics to the system before our next evaluation phase.

The therapist also suggested a few more interactions in the game in order to reduce the patient's workload. She felt, and the patient confirmed, that following the path on the map adds extra fatigue and mental effort for the patient. She suggested adding directional arrows with textual and voice instructions (such as move up, move down, turn left, turn right) at each navigational movement. She also suggested deactivating dimensions not used during certain therapies. For instance, during radial/ulnar therapies, the patient has to move left or right, so the Z axis can be fixed, whereas for flexion/extension, the patient has to move the hand up or down, so the X axis can be fixed. This gives more control to the patient while navigating in the game. She also asked us to add a virtual hand on the game interface that moves synchronously with the patient's hand to facilitate hand–eye coordination. We updated our game with these interactions for the next study.

6.3 Detailed user study

After the initial study, we updated our system according to the therapist's suggestions. We then conducted a user study of the system together with therapists at the local disability center. The detailed study is described in the following.

6.3.1 Participants

The therapists recruited five female patients with some hand disability. The participants were between 18 and 30 years of age, with a mean age of 25. All participants had partial hand movement and were required to perform therapies for rehabilitation. All of them were right-handed and had a right-hand disability (four dealing with trauma to the finger or the wrist, and one coping with cerebral palsy). Four out of five participants self-reported as computer users with some experience. All participants also reported a basic understanding of map navigation. Furthermore, none of the participants had previously played Leap-Motion-based games. It is worth noting that our system targets subjects who have some freedom of movement in their hand and want to regain their full range of motion. Subjects with completely paralyzed hand movements are not within the current scope of this study.

6.3.2 Study design

For each patient, the therapist enters the type of hand therapy needed, the percentage of improvement, and the number of repetitions. For the experiments, we fixed the number of sessions for each patient to three. Each session has 30 repetitions of gestures, which are randomly shuffled in each session. After each session, the patient took a 30-min break, and once the patient was ready, the next session started. Each patient had to perform four main hand gestures: wrist flexion and extension, and radial and ulnar deviation. In this way, we have \(30 \times 4 = 120\) gestures in each session.

The game considered in this study is again a 3D space navigation game, where the patient has to fly an airplane along a route generated according to the prescribed therapies. The patient has to collect the coins placed at the end of each gesture and is required to follow the generated path as closely as possible. Visual aids in the form of arrows and voice-over messages are provided to the patient to easily follow the path. A virtual hand is also shown to reflect the patient's hand movements. Patients also perform one or more training sessions before the actual trials to ensure that they know how to interact with the Leap Motion device and play the game. As the patient plays the game, we collect and store game data as well as Leap Motion data for the hand movement. We then perform different analytics to calculate the required metrics and store session and movement summaries in the database.

6.3.3 Study details

Therapist session For the user study, the therapist added each of the participants and assigned them the prescribed hand therapy to perform over three sessions. The therapist also selected the movements that she wanted to track for that particular joint by clicking a checkbox next to each movement in the list.

Patient session A total of five subjects used our system. Each one had a motor control problem of the right hand, with two suffering from strokes caused by accidents. All of them had also experienced some kind of psychological or emotional trauma. Written informed consent was obtained from all patients, as well as from the therapist. All participants were informed about the purpose, content, and protocol of the study. They also received sufficient training before the actual experiment. The entire study took three sessions per subject, with an average of 4 min per session. The system measured the different parameters defined in Sect. 6.1.

Patients were asked to perform four main hand gestures: wrist flexion and extension, and radial and ulnar deviation. Several metrics were collected for advanced analytics, such as the percentage of time on the path within range, the mean deviation from the generated path, and the ROM for each gesture. When a patient was told by the therapist to perform a particular exercise, the patient logged in to the game directly and started the adaptive game as prescribed. Using a Leap Motion device, the patient played different sessions, which were uploaded by the system automatically. Once the patient's sessions had been uploaded, the therapist checked the performance and percentage of improvement by comparing different metrics extracted from the session data. The plots were of two types: the first compared the patient's session with that of the therapist to measure fidelity, whereas the second compared different sessions of the patient with each other to measure the quality of improvement.

The patient also reported the visual analog pain score to the therapist on a scale of 1–10 a few minutes after each session. This score indicated whether the patient felt pain as a result of the therapy. Generally, after performing the therapy, the patient did feel some pain, and it is regular medical practice to measure the pain score after a few minutes.

Fig. 13
figure 13

Time taken to complete each session

6.3.4 Observations and results

Most of the quantitative metrics are automatically recorded and plotted online. The qualitative results of the questionnaires and user feedback are computed and reported below. Overall, all participants were able to complete the tasks. They all felt motivated and excited to perform the therapy while interacting with and playing the game, instead of doing the traditional exercises with a therapist.

Quantitative performance Various metrics to analyze the patient performance were developed. Below we explain these results along with their respective figures.

Figure 13 shows the total time required by each patient to complete the therapy session. The best time to complete each session by performing all the required therapy is 4 min, shown by the horizontal red line in Fig. 13. On average, patients completed a session in 4 min and 36 s. For patient 1, the first session was aborted before completion, as the patient was not able to continue playing at that instant. In general, one can notice that the second and third sessions were closer to the ideal time required to perform the therapy.

Fig. 14
figure 14

Game-generated and patient-movement 3D trajectory plots for multiple sessions for two patients

The system can also compare patients' trajectories over sessions with respect to the game-generated path. Figure 14 shows the comparison of the patient trajectory for three sessions. Although following the exact path suggested in the game is not a must from a medical perspective, it shows to what extent a patient can focus on both objectives: performing the prescribed therapy and playing the game in the best way. Results show that two of the patients stuck to the right path with a high percentage, while the others focused less on the path and were more concerned with controlling the game with the Leap Motion while performing the required therapy. The plot is interactive, such that users can rotate the view in any direction for better comparison. The system also shows statistics on the mean deviation between the two trajectories (i.e., the patient path and the game-generated path). The mean deviation is calculated based on how accurately the patient has followed the generated game path (a computation sketch is given below).
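As a minimal sketch of this statistic (assuming both trajectories are stored as dense point samples), the mean deviation can be computed as the average distance from each patient-path point to its closest point on the generated path; the system's exact formulation may differ.

```csharp
using System.Linq;
using System.Numerics;

// Mean deviation of the patient path from the game-generated path.
public static class DeviationStats
{
    public static float MeanDeviation(Vector3[] patientPath, Vector3[] generatedPath)
        => patientPath.Average(p => generatedPath.Min(q => Vector3.Distance(p, q)));
}
```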

Fig. 15
figure 15

Plots showing maximum ROM of three sessions for different patient gestures

The system can also show the minimum, maximum, and average ROM for the different conducted sessions, as well as for each gesture within a session. Figure 15 shows the maximum ROM over the three sessions for one patient with four therapy types. This gives the therapist a clear understanding of how this patient's range of motion is evolving under the prescribed therapy.

On the other hand, Fig. 16a, b show the ROM variations for each gesture repetition during the session as well as over all the sessions. These figures present a filtered version of the large plots of ROM variations over time (showing only a few movements/repetitions of the session), where therapists can analyze these variations for specific gestures per movement. These plots are important for the therapist, as they illustrate the minimum and maximum ROM for each repetition during the session. This helps in identifying moments of distraction or pain over time, thus providing a better diagnosis of patient performance and quality of improvement.

Fig. 16
figure 16

ROM variation over sessions for radial and ulnar gestures. a Maximum ROM per session for radial. b Maximum ROM per session for ulnar

Fig. 17
figure 17

Percentage of achievement and correctness with respect to the prescribed therapy

We also analyzed the number of coins picked by the patients. Overall, patients were able to collect 66% of the available coins. However, this does not directly reflect the correctness of the exercise achieved by the patient. From a medical perspective, the therapist asked to know the percentage of achieved tasks with respect to the prescribed therapy: even if a patient does not hit the ball or does not collect the coin, he might finish the prescribed therapy by performing all the required gestures. For that purpose, Fig. 17 illustrates the percentage of correctness for the five patients, which indicates how much of the prescribed therapy was actually implemented well by the patients. The results in this figure were surprising, as some patients achieved only between 50 and 60%, whereas others performed up to 190%. Patient 4 did almost the complete task correctly, as she followed the path perfectly. The reason we obtained 190% is that some patients performed twice the number of therapies required in this game. Although this parameter shows a good medical achievement, it is poorly reflected by the game metric, since the patient was not hitting the balls: more left and right movements were performed, and not in the correct order.

Fig. 18
figure 18

Patient subjective feedback (the x-axis represents the qualitative score and the y-axis the number of participants for that score)

Qualitative analysis After the user study, we gave each user a questionnaire to report their feedback about the framework. It contained questions about the usability, enjoyability, effectiveness, fatigue, motivation level, mental effort required, etc. while using the system. Scores were given on a scale of 1–5, with 1 being the lowest. Figure 18 shows the results of the patient feedback. It shows that the patients generally liked the system and were all interested. An interesting point here is that the patients felt that more mental effort was required than usual while performing therapies through playing games. However, they also agreed that it provided an engaging and motivating experience.

We also gave a questionnaire to three of the therapists we interacted with for this work to get their feedback. We asked them three questions. The first was: how easy is the system to use? The second was: how comfortable would they be using this system with their patients in the future? The last was: how easy is it to monitor the patient's rehabilitation process with the system? Figure 19 shows the results. Overall, the therapists found the framework useful and would like to use it in the future.

Fig. 19
figure 19

Therapist subjective feedback

7 Discussion

In the results section, we presented only a few important metrics suggested by the therapists. Various other metrics defined above can also be analyzed by the system. Overall, the results of the study are very encouraging, as therapists and patients were engaged and excited about the framework.

It should be noted that there is no way to compare the same therapies across different patients, as they have different profiles and constraints. The paper only presents comparisons of different sessions for the same patient. Figure 17, which puts different patients on the same plot, does not aim to compare them, but only shows them all together to the therapist in one window.

7.1 Analysis of the framework adaptability

Within our framework, we can adapt the path length, curve angle and smoothness, color, curvature, object size, speed of appearance of objects, animations, and guidance instructions, among others. All these parameters are adjusted based on the physical and perception constraints of patients, as well as their preferences. During the training sessions, we first performed the experiment with the patients without any adaptations; that is, the paths and curves were not therapy- and patient-aware. One patient with a minor hand injury could finish the game while staying close to the path, whereas the others had problems. We then incrementally added adaptation parameters by generating therapy-aware paths and adjusting the colors and sizes of on-screen objects to make them easier for patients to deal with. We added an adaptation layer to the game engine in order to parse the results of a game session and adjust the difficulty level of the subsequent session accordingly. The adaptation layer takes into account the number of objects captured by the player as well as the fidelity of the path followed by the user. This training phase, with its incremental adjustments of the game to reflect patients' constraints and therapy requirements, was appreciated in the end, and patients discovered the importance of every added factor.

7.2 Feedback

Overall, therapists and patients found the framework useful. Therapists were excited to see various metric plots in real-time to assess the patient's performance. They said that they generally do not measure improvement until after a few sessions; with this system, they can easily monitor the improvement after any gesture, and from different perspectives. They liked the system's adaptability according to patient performance. They also found the system easy to use for analyzing and maintaining the rehabilitation records of patients. They also noted that, by using such online solutions, patients do not need to wait for appointments or stand in long queues.

Patients also gave good feedback about the framework. It initially felt a bit strange to them to perform therapy while playing games, but they liked it, as it kept them engaged in performing the therapies. One of the patients said, 'a nice framework to do therapies at home while playing games and see improvement metrics'. Another patient commented, 'for next sessions I do not have to come to the hospital'. Another interesting comment was, 'I don't know I can even play games for my hand rehabilitation'. Overall, they were excited to use the framework for their rehabilitation.

The patients also liked the idea that the system adapted to their abilities. They felt that it was necessary to keep their interest in the game. They suggested that the adaptation function should come into play after three sessions instead of one, so that the values of the different parameters can become more stable (something we later integrated into the adaptation model). At present, the system considers the performance indicator with respect to the performed exercise, the number of objects captured, and the fidelity of the path as input parameters to the adaptation function. We are working on adding more parameters to the equation to generate a more comprehensive formula for detecting the correctness of the motion of a patient's hand and its closeness to a normal hand.

7.3 Other therapy-aware games

Different therapy-aware games can be developed within this framework without changing the logic or the back-end development.

  • Driving game Using a real world road network, the system will generate a driving route that may include places of interest to the user. The game can be changed into a 3D city browser using tagged images from Google Street Map. The user will be able to stop by interesting places and get information about them.

  • Treasure hunt This game will change the user’s neighborhood into a treasure-hunting environment. The system will hide treasure items along a certain path and the user will have to navigate to those places to unearth the treasure.

  • Geography education These techniques can also be used to develop games for teaching geography and improving the spatial cognition of children or the elderly with impaired neural functions.

7.4 Limitations

Our system is currently limited to hand-disabled patients. Moreover, it is most useful for patients who retain some hand movement and are performing therapies to recover fully; for severe strokes or paralyzed hands, our system is not directly applicable. For such patients, therapists told us that once they start getting better and have some movement in their hand, the system can be used. Also, the complexity of performing therapies within our framework can be higher for patients with a comorbid diagnosis (e.g., a mental disorder) that might affect their overall cognitive and/or physical performance.

Another limitation of our system is that we currently provide only a few games for performing therapies. These games might not be interesting for all users of different ages, and we will address this limitation by adding multiple games in the future. Also, we evaluated the system with adult participants only, and the results may differ for children. We are planning to design a few more attractive games for children in the future.

We evaluated our system for the right hand only, as all recruited patients had a right-hand disability. However, the framework can be used for either right- or left-hand disabilities in the same manner. The system automatically detects which hand is used for performing therapies and logs that information.

Also, interacting with a Leap Motion device to play games may not be easy initially for users who are accustomed to playing games with standard inputs such as keyboards and mice. They first need to train and get used to it before playing smoothly.

In our evaluation, we did not use any external contraption or arm device to restrict the user from moving his arm during the hand therapy. However, the therapists specifically instructed patients not to move the arm and monitored them throughout the experiment. Also, our system calculates the arm movement along with the hand movement, so that the therapist can see how much the patient's arm was used during the therapies.

7.5 Generalization

Our designed system is generic, so it can be used for other kinds of therapy as well. In this work, we have focused on hand therapy, but our system is designed to tackle therapies for the whole body in the future. From the front-end interface, the therapist can assign any therapy, and the patient can perform those therapies in their home environment. In the current work, we used a Leap Motion to track hand movements, but our system is capable of using other therapy-monitoring devices, such as the Kinect, the Myo armband, etc. The system can collect data from these devices and generate interactive analytics of the collected data for therapists as well as for patients.

Similarly, our current system is designed for mild hemiplegic patients and their hand rehabilitation. However, as described above, our system is generic and can be used with other devices to perform different therapies. In those cases, it can be used by patients affected by strokes, hemiplegia, cerebral palsy, carpal tunnel syndrome, etc.

The therapist told us that she needs the system for performing leg therapies as well; according to her, leg movement is the first priority for most disabled patients, so that the patient can become independent, and only afterwards do they focus on hand rehabilitation. With the use of a Kinect device, we can track the full body and collect data for analytics. We will validate these therapies and device usage within our system in the near future.

8 Conclusions and future work

This paper introduces an adaptive serious game framework that incorporates a patient-centered model into a personalized therapy design in order to facilitate hand rehabilitation using state-of-the-art motion tracking technologies. We have developed a gamified e-therapy framework to provide patients with an entertaining and personalized environment in which to practice physiotherapy under the guidance of therapists and caregivers. Intelligent game generation within a map-browsing environment is implemented, where each game is represented by navigational movements generated according to the patient's preferences and constraints. The implemented system generates metrics and live plots after each game-play. Algorithms to parse game-generated as well as Leap Motion data and to infer high-level analytics from medical and HCI perspectives have also been presented. This paper presented a blended design study detailing the experience gained from designing this system. The system has undergone several development phases, considering different trials performed in the presence of physicians, therapists, patients, and the parents of the recruited patients. We received many valuable suggestions from all the stakeholders, which were made part of the final design.

Our initial test results show that this framework has the potential to support more advanced serious games and detailed clinical data mining and analysis. We will be looking at the current shortcomings and the feedback from the therapists to make the framework robust and complete. We are at the stage of integrating more primitive actions, so that a therapist can compose a wide range of high-level therapies. Also, eliminating environmental as well as hardware noise is another step that we are currently focusing on, since there are many distractions in real-life scenarios, with diverse use cases with respect to patient disabilities and constraints.

Using navigation games for hand therapy is very engaging, but some people may not like such types of games. This is a limitation that we want to address in the future, either by devising other, more diverse games or by incorporating and controlling existing games within the framework. Another limitation is that our framework is designed generically for all ages, including children and the elderly. We will also be investigating different approaches for incorporating existing games within our framework. Patients would thus be able not only to use our developed serious games, but also to play their favorite games while improving their physical disability. This track presents new challenges, as we need to use existing pure games to solve a real-life health problem. This requires adaptations at different control levels: (1) controlling any existing game with the Leap Motion device instead of a mouse or keyboard controller; and (2) overlaying the game user interface with additional instructions and guidance towards performing the therapy while playing a favorite game.