Newton’s laws of motion (presented here for straight-line motion) describe the relationship between displacement (distance), velocity (speed) and acceleration: velocity integrated over time gives distance, and acceleration integrated over time gives the change in velocity. Distance travelled can therefore, in principle, be calculated from acceleration.
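For reference, for an initial velocity $u$ and a constant acceleration $a$ acting over time $t$, the standard straight-line equations of motion are

$$v = u + at, \qquad s = ut + \tfrac{1}{2}at^{2}, \qquad v^{2} = u^{2} + 2as$$

where $v$ is the final velocity and $s$ the distance travelled.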

These equations assume an initial velocity and a constant acceleration; in human motion analysis, however, nothing is constant. Acceleration itself is determined by a multitude of components, which we will investigate in this section.

The challenge in measuring acceleration (linear or rotational) is that it is two derivative steps removed (a double derivative, in fact) from the quantity our minds grasp physically, namely distance. In relating inertial sensors to our familiar 3D physical environment and Cartesian coordinates, we must adjust our thinking from what we can see (position), to what we can infer (velocity, the change in position over time), to acceleration, which cannot be seen at all. A firm grasp that acceleration data is entirely different from (though related to) positional data is critical to using it to gain meaningful and often complementary insights. A simple example is examining heel strike using video (or motion capture) compared with an accelerometer (or a force plate, for that matter): ground contact occurs before force transfer (and the accompanying change of acceleration).
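Why acceleration data cannot simply be treated as position data twice removed is illustrated by a minimal sketch of double integration (all values here are illustrative assumptions): even a tiny, constant sensor bias grows quadratically into a large apparent displacement.

```python
import numpy as np

fs = 100.0                          # assumed sample rate (Hz)
t = np.arange(0, 10, 1 / fs)        # 10 s of data

# A stationary sensor should report zero inertial acceleration, but a small
# constant bias (an illustrative 0.01 m/s^2) is typical of real devices
acc = np.full_like(t, 0.01)

# Naive double integration to recover velocity and then position
vel = np.cumsum(acc) / fs
pos = np.cumsum(vel) / fs

print(f"apparent drift after 10 s: {pos[-1]:.2f} m")   # ~0.5 m from bias alone
```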

The resulting acceleration experienced by a body can be expressed as the sum of four components (Ohta 2005): the inertial changes of the body itself, gravity, centrifugal motion and tangential acceleration. These combine to give the overall acceleration measured by the sensor. In seeking to use sensors for human locomotion, it is essential to understand the role each of these plays in individual cases in order to decouple, remove or make best use of each when interpreting the data. The components differ in magnitude and in their temporal and frequency characteristics, which enables modern filtering techniques (often quite simple to implement) to separate them.
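One common way of writing this decomposition, sketched here from standard rigid-body kinematics rather than quoted from Ohta (2005), is

$$\mathbf{a}_{\mathrm{sensor}} = \mathbf{a}_{\mathrm{body}} + \mathbf{g} + \boldsymbol{\omega} \times (\boldsymbol{\omega} \times \mathbf{r}) + \dot{\boldsymbol{\omega}} \times \mathbf{r}$$

where $\mathbf{r}$ is the position of the sensor relative to the axis of rotation and $\boldsymbol{\omega}$ the angular velocity; the four terms correspond, in order, to the inertial, gravity, centripetal and tangential components.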

It is worth pausing on the concept of an inertial frame of reference: a frame in which velocity is constant, and usually zero in the case of laboratory equipment such as cameras and force plates. Body-worn sensors, by contrast, operate in their own local coordinate frame, whose orientation changes constantly with the movement being measured; decoupling these two frames is the central mental challenge in interpreting inertial data.

3.1 Inertial Component

These are the direct linear accelerations of the body measured by a sensor. In practice, they are often much smaller than the other components, though a few notable exceptions make the inertial component one of the most practical for understanding and decoupling human motion. Rapid motions of limb segments make a contribution, as do brief periods of rapid deceleration such as contact events (hundreds of g), which are useful for measuring percussive forces such as striking in boxing and ground contact events in running and walking gait.

In the case of rapidly moving limbs, it is important to remember that acceleration is neither velocity nor displacement. Peak accelerations in a movement are therefore likely to occur at the initial acceleration and final deceleration, opposite in sign and hovering around zero through the middle of the movement. By contrast, velocity is at its maximum in the middle of the movement, and displacement is at its greatest at the end.
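A minimal numerical sketch of this, assuming a smooth point-to-point movement described by a minimum-jerk-like profile (the profile and values are illustrative, not measured data):

```python
import numpy as np

fs = 200.0                                # assumed sample rate (Hz)
t = np.linspace(0, 1, int(fs))            # 1 s movement
tau = t / t[-1]                           # normalised time

# Smooth point-to-point displacement (minimum-jerk profile, 0 -> 0.4 m)
displacement = 0.4 * (10 * tau**3 - 15 * tau**4 + 6 * tau**5)
velocity = np.gradient(displacement, t)
acceleration = np.gradient(velocity, t)

print("peak velocity at t =", t[np.argmax(velocity)])          # mid-movement
print("peak acceleration at t =", t[np.argmax(acceleration)])  # early in the movement
print("peak deceleration at t =", t[np.argmin(acceleration)])  # late in the movement
```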

For percussive events, deceleration happens during contact and, depending on the shock attenuation (such as the padding on a boxing glove), the contact event can be very short in duration yet very high in magnitude. In such cases, the sensor will usually go over range, and the sampling may not capture enough points on the waveform to represent the event accurately. Such peaks do, however, make very good temporal markers of the event itself, something which aids further analysis, especially when coupled with other sensors. This will be explored further in the case studies.
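A minimal sketch of extracting such temporal markers from an acceleration magnitude trace, using simple peak detection (the threshold, minimum spacing and variable names are illustrative assumptions, not values from the studies cited):

```python
import numpy as np
from scipy.signal import find_peaks

fs = 500.0                                   # assumed sample rate (Hz)
# acc_xyz: N x 3 array of accelerometer samples in g (placeholder data)
acc_xyz = np.random.randn(5000, 3) * 0.1
acc_mag = np.linalg.norm(acc_xyz, axis=1)

# Treat large, well-separated peaks as contact events; even when the sensor
# clips (goes over range), the timing of the peak remains usable
peaks, _ = find_peaks(acc_mag, height=4.0, distance=int(0.2 * fs))
contact_times = peaks / fs                   # seconds from start of recording
```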

3.2 Gravity Component

The earth’s gravitational field creates, in a practical sense, a constant acceleration in a given inertial frame of reference. Any change to this value is then the result of changing forces on the sensor itself. Knowing that this constant component should always be present enables the analysis of changes at the sensor, the most common being a change in the orientation of the sensor relative to the vertical axis. Depending on the form of locomotion, this takes on different forms and is most easily understood through a few examples. In Fig. 3.1, a sensor is attached to the sacrum (small of the back) of a swimmer. As the torso rolls during the swimming action, the gravity component is seen alternately in the sagittal and frontal axes of the transverse plane. It thus forms a convenient measure for counting swimming strokes, ultimately leading to measures of stroke phase and lap turn times as well (but more on that in the case example later on).

Fig. 3.1 Gravity component from swimming strokes on a sacral-mounted sensor
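A minimal sketch of how the gravity component in Fig. 3.1 might be used to count strokes; this is a hedged illustration only, and the cut-off frequency, axis choice, thresholds and variable names are assumptions rather than the published method:

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

fs = 100.0                                        # assumed sample rate (Hz)
t = np.arange(0, 60, 1 / fs)
# acc_ml: mediolateral axis of a sacrum-mounted accelerometer, in g (placeholder data
# standing in for the roll pattern seen in Fig. 3.1)
acc_ml = np.sin(2 * np.pi * 0.8 * t) + 0.05 * np.random.randn(t.size)

# Low-pass filter to isolate the slowly varying gravity component from
# higher-frequency inertial content (torso roll is of the order of 1 Hz)
b, a = butter(2, 2.0 / (fs / 2), btype="low")
gravity_ml = filtfilt(b, a, acc_ml)

# Each roll of the torso moves gravity into and out of this axis;
# counting peaks gives a stroke count
strokes, _ = find_peaks(gravity_ml, height=0.3, distance=int(0.5 * fs))
print("stroke count:", len(strokes))
```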

For human movement where orientation is important in determining a particular activity, detection of the gravity component can be used to classify limb segment orientation and thus determine activity type. Examples such as daily living tasks for health applications (James et al. 2012) and tackling in ball sports (Delaney et al. 2016) show levels of accuracy comparable to gold standards like video or motion capture systems. The chief advantage is that these measures can be derived computationally and used in the ambulatory environment (Gleadhill et al. 2016).
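A minimal sketch of such orientation classification, assuming the gravity component has already been isolated (for example by low-pass filtering as above); the axis convention and angle thresholds are illustrative assumptions:

```python
import numpy as np

def segment_inclination(gravity_xyz):
    """Angle (degrees) between the sensor's longitudinal axis (assumed to be x)
    and the vertical, estimated from the low-pass-filtered gravity vector."""
    g = np.asarray(gravity_xyz, dtype=float)
    return np.degrees(np.arccos(g[0] / np.linalg.norm(g)))

# Illustrative use with a thigh-worn sensor reading (values in g)
angle = segment_inclination([0.95, 0.10, 0.20])
posture = "upright" if angle < 30 else ("horizontal" if angle > 60 else "intermediate")
print(f"segment inclination: {angle:.1f} deg -> {posture}")
```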

Often during human locomotion, the body and attached sensor are no longer in an inertial frame, such as during a jumping motion. In this case, the gravity component does not appear in any of the channels, and its absence can serve as a reliable measure of time of flight (Fig. 3.2) and ultimately of jump height. This has been applied to team sports such as volleyball (Gageler et al. 2015) and to snowboarding (Harding et al. 2007). In the latter example it was validated as being as accurate as high-speed video while being far more convenient. Additionally, the sensor output in the snowboarding study was found to correlate highly with more subjective measures, such as judging of an athlete’s performance during actual competition.

Fig. 3.2 Flight time from takeoff to landing, determined by the acceleration peaks
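Given the flight time identified in Fig. 3.2, and assuming take-off and landing occur at the same height, jump height follows from the constant-acceleration equations given earlier:

$$h = \frac{1}{2} g \left(\frac{t_{\text{flight}}}{2}\right)^{2} = \frac{g\, t_{\text{flight}}^{2}}{8}$$

so that a flight time of 0.6 s, for example, corresponds to a jump height of roughly 0.44 m.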

The gravity component transitioning between non-inertial and inertial frames of reference can be seen clearly in Fig. 3.3, which shows a skydiver undertaking acrobatic manoeuvres. The skydiver is initially in free fall (no gravity component is present) until terminal velocity is reached (the gravity component can be clearly seen); then, as the skydiver performs acrobatic tumbles, the transitions of the component between the various axes clearly delineate the activities. This work (Wixted et al. 2011) was envisaged as an alternative means of judging the sport to the traditional use of a telescope from a distance of ~2 km away.

Fig. 3.3 Skydiver acceleration traces from body-worn sensors

3.3 Centripetal Component

Where a moving body, body segment or a sensor attached to one is constrained to move along a circular path, a centripetal force is exerted towards the centre of rotation. Sometimes this is expressed as an outward force, the centrifugal force, and the correctness of the terminology is often debated among the various scientific disciplines. The point here, however, is that this additional acceleration component is measured by the sensor as one term of the acceleration decomposition presented earlier. The component is found at low magnitudes in arm and leg swing during ambulation, where it can be somewhat swamped by ground contacts or by other limb movement components such as throwing. In the case of throwing, or of bowling in the sport of cricket, the centrifugal component can become the most dominant of the components.
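For a sensor at radius $r$ from the centre of rotation, moving with angular velocity $\omega$ (tangential speed $v$), the magnitude of the centripetal component is

$$a_{c} = \omega^{2} r = \frac{v^{2}}{r}$$

directed towards the centre of rotation, while the tangential component $a_{t} = \dot{\omega}\, r$ is present only while the rotation rate is changing.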

In a study examining the bowling action in cricket (Spratford et al. 2015), in which the ball must be accelerated with a straight arm before release rather than by articulating the elbow as in a throw, the centrifugal components can exceed 50 g, and the degree of phase correlation between the upper and lower arm can accurately assess whether the ball has been bowled or thrown, and hence its adherence to the laws of the game.
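A minimal sketch of one way such a phase comparison might be computed, using the cross-correlation of the two sensors’ traces; this is an illustrative approach with assumed names, not the published algorithm:

```python
import numpy as np

def phase_lag_seconds(upper, lower, fs):
    """Lag (s) of the lower-arm signal relative to the upper-arm signal,
    estimated from the peak of their normalised cross-correlation."""
    upper = (upper - upper.mean()) / upper.std()
    lower = (lower - lower.mean()) / lower.std()
    xcorr = np.correlate(lower, upper, mode="full")
    lag = np.argmax(xcorr) - (len(upper) - 1)
    return lag / fs

# With a legal (straight-arm) delivery the two segments move largely in phase;
# a distinct lag or decorrelation suggests elbow articulation (a throw).
```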

An important consideration in this study was that the dynamic range of the sensor (±50 g) left little resolution available for detecting running activity (±2 g), and so a dual-sensor arrangement was required for this application.