Key Points

Larger American Football players (e.g., linemen) are at increased risk for exertional heat illness and heat stroke compared with smaller players (e.g., backs) because of greater body mass index, increased body fat, lower surface area to body mass ratio, lower aerobic capacity, and the stationary nature of the position, which can reduce heat dissipation.

In general, American Football players exhibit higher sweating rates than athletes in other sports. The sweat sodium concentration of American Football players does not seem to differ from that of athletes in other sports; however, given the high volume of sweat loss, the potential for sodium loss is higher in American Football than in other sports.

Coaches, athletic trainers, and team personnel should be aware of the unique thermoregulation, fluid balance, and sweat loss challenges American Football players face during preseason practice and monitor players more diligently, particularly large linemen within the first 3–4 days of practice.

1 Introduction

Exercise in the heat presents a challenge to maintaining thermal homeostasis and markedly impairs exercise capacity [1]. Multiple factors can disrupt thermal balance, including exercise intensity, the ambient environment, hypohydration, and clothing [1]. Studies have documented the influence of hot environments on body core temperature (Tc), sweating rates, and fluid balance during practice and game play for various team and individual sports [2–4]. Most reports from team and individual sports, however, have examined athletes considerably smaller than the typical American Football player. American Football presents unique challenges for thermoregulation and fluid balance because of the large body size of players, the high-intensity nature of the sport, the required uniform, and preseason training and early competition occurring during the hottest parts of the year. Numerous reports have documented disproportionately higher rates of exertional heat illness (EHI) and heat stroke (EHS) in American Football players compared with other sports [5–7]. Therefore, there is a need to understand the thermoregulatory challenges these athletes face at various levels of competition. The intent of this review is to provide an overview of laboratory and field-based studies and to discuss the thermoregulation, fluid balance, and sweat losses of American Football players. Specifically, we discuss the thermoregulatory challenges American Football players face during the preseason, the inhibiting nature of the required protective equipment, the increased risk for EHI and EHS among American Football players compared with other sports, and why larger American Football players are at greater risk of EHI and EHS. EHI, which includes exercise-associated muscle cramps, heat syncope, heat exhaustion, and EHS, will be discussed in relation to preseason American Football practice. Sweating rates, body mass losses, fluid intakes, and sweat sodium concentrations will also be reviewed. Finally, practical applications, gaps in the literature, and recommendations for future avenues of research with American Football players will be presented.

2 Methods of Literature Search

The literature search was conducted using the PubMed and EBSCO databases. Multiple search phrases pertaining to “American Football,” “thermoregulation,” “heat illness,” “heat stroke,” “sweating rates,” and “fluid balance” were used to identify relevant articles. Bibliographies of relevant articles were searched to identify potential additional studies. The search period was from inception to January 2016. Selection criteria were broad in order to encompass a wide array of studies examining thermoregulation, fluid balance, and sweating rates of American Football players. Titles and abstracts were reviewed against general inclusion criteria: English language, full-length articles published in peer-reviewed journals, and healthy youth and adult athletes. An overview of the studies is provided in Tables 1 and 2.

Table 1 Summary of literature pertaining to American Football for environmental conditions and thermoregulation
Table 2 Summary of field-based studies measuring sweating rate and fluid balance in American Football players during summer training

3 Exertional Heat Illness and Heat Stroke in American Football

3.1 Exertional Heat Illness Rates

Thermal homeostasis is maintained when there is a balance between heat gain and heat dissipation. The body takes multiple measures to preserve Tc within a relatively narrow range, predominantly through redistribution of blood flow and activation of sweat glands [1]. A number of factors can challenge thermal homeostasis, including exercise intensity, the ambient environment, and the clothing ensemble [1]. Exercise in the heat represents a major challenge to maintaining Tc within a tolerable range for the individual athlete. As ambient temperature and humidity increase, the capacity to dissipate heat is diminished. When heat production outpaces heat dissipation, the athlete’s risk of EHI and EHS increases. EHS is defined as a Tc of >40.5 °C associated with central nervous system disturbances and multiple organ system failure [13].
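This balance between heat gain and heat dissipation is commonly summarized by the conceptual heat-balance equation of thermal physiology (a standard formulation, not one taken from the studies reviewed here), in which the rate of body heat storage S is:

$$ S = M - W \pm R \pm C \pm K - E $$

where M is metabolic energy production, W is external work, and R, C, K, and E are radiative, convective, conductive, and evaporative heat exchange, respectively (all in W or W·m−2). Whenever S > 0, heat is stored and Tc rises.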

EHI has been estimated to affect 9000 high school athletes each year [6]. Kerr et al. [6] analyzed data from 2005 to 2011 from the National High School Sports-Related Injury Surveillance System and showed that EHI occurred at a rate of 1.2 incidents per 100,000 athlete exposures. American Football players accounted for 74.4 % of EHIs, at a rate of 4.42 per 100,000 athlete exposures [6]. The highest percentages of EHIs occurred in offensive linemen (35.7 %), followed by defensive linemen (16.9 %) and linebackers (9.7 %) [6]. American Football players were 11.4 times more likely to develop EHI compared with all other sports combined [6]. The next largest numbers of EHI events occurred in girls’ volleyball (4.8 %), girls’ soccer (3.0 %), and boys’ wrestling (3.0 %) [6]. Most American Football EHI events occurred in the southeastern region of the United States, during the month of August (60.3 %), and during the preseason (90.4 %) [6]. Huffman et al. [9] reported that 60 % of all EHIs in high school American Football players occurred during the preseason. The authors also noted that the majority of EHIs took place >1 h into practice [9]. Similarly, Yard et al. [10] showed that most EHIs occurred either 1–2 h (46.6 %) or >2 h (37.2 %) after practice had begun. Kerr et al. [6] reported a similar percentage (36.7 %) of EHIs occurring >2 h into the practice session for American Football players. During a single American Football season in the southeastern United States, Cooper et al. [11] reported 139 EHIs among five universities. Exercise-associated muscle cramps accounted for 70 % of the EHIs, while heat exhaustion and heat syncope accounted for 23 and 7 %, respectively [11]. The vast majority of these EHI incidents (88 %) occurred in August [11]. No cases of EHS were reported in the study [11].

3.2 Exertional Heat Stroke Fatalities

Exercise in the heat has also led to numerous EHS deaths in high school, college, and professional sports. American Football fatalities have been tracked since 1931 by the American Football Coaches Association, but heat stroke deaths were not routinely monitored until 1960 [7]. The National Center for Catastrophic Sport Injury Research (NCCSIR) reported 140 EHS deaths from 1960 to 2014 in American Football [7]. Since the mid-1990s, 54 American Football players have died from EHS, with 90 % of fatalities occurring during practice. Of the 54 deaths, 77 % occurred at the high school level [7]. Fatal EHS cases have doubled within the last 15 years, with 41 cases from 2000 to 2014 compared with 20 deaths from 1985 to 1999 [7]. Although governing bodies have made major efforts, setting forth policies and recommendations to help reduce heat illness in American Football players during preseason practice, improvements are still clearly needed [9, 17]. For example, the National Athletic Trainers’ Association (NATA) has recommended that all states adopt appropriate heat acclimatization guidelines [12]. This is particularly warranted at the high school level, since this is the population in which the majority of EHS fatalities have occurred [7].

4 Thermoregulation in American Football Players

4.1 Anthropometry

The NATA guidelines list several non-environmental risk factors for EHI, including body mass index (BMI) [13]. Specifically, obese players (BMI ≥ 30 kg/m2) are at greater risk for EHI [13]. Multiple studies have reported on the BMI profiles of American Football players [14–20]. Malina et al. [21] investigated 653 youth American Football players (9–14 years) and found that 45 % were classified as either overweight or obese according to their BMI. Kaiser et al. [14] assessed the morphological profiles of 65 Division I freshman American Football players and showed that their average BMI (29.8 kg/m2) classified them as either overweight or obese. They reported similar findings for professional American Football players (30.1 kg/m2). However, it is important to note that the players’ body fat percentages (13 % professional and 15 % collegiate) in this study [14] were within an acceptable range for health status [22]. Recent studies using more sophisticated techniques to measure body composition (i.e., dual-energy X-ray absorptiometry) report that collegiate and professional American Football linemen have greater body fat and BMI compared with non-linemen (i.e., quarterbacks, wide receivers, etc.) [15, 16, 23]. Havenith et al. [24] have suggested that increased body fat mass may exert an additional effect on the response to heat stress because of the increased metabolic cost of carrying the extra fat mass compared with leaner individuals [24].

The rise in EHS fatalities is likely multifactorial, but an increase in player size (BMI) has been proposed as one possible explanation [5]. Several studies have reported an increase in players’ BMI over time [17–20]. Multiple studies also support BMI as a predisposing factor for heat ailments [5, 6, 13]. Of the 58 documented hyperthermia-related deaths of American Football players between 1980 and 2009, Grundstein et al. [5] reported that 79 % of the players were large, with a BMI of >30 kg/m2, and that 86 % of those individuals played the lineman position. All deaths occurred under high (23–28 °C) or extreme (>28 °C) environmental conditions as defined by the American College of Sports Medicine (ACSM) using wet bulb globe temperature (WBGT) [5].
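As a point of reference, the sketch below computes an outdoor WBGT using the standard 0.7/0.2/0.1 weighting of natural wet bulb, black globe, and dry bulb temperatures and flags the ACSM categories cited above; the formula is the conventional outdoor WBGT equation (not taken from the cited studies), and the input temperatures are hypothetical.

```python
# Outdoor WBGT from its component temperatures, classified against the ACSM
# categories referenced above (>28 °C extreme, 23-28 °C high). The weighting is
# the standard outdoor WBGT formula; the example temperatures are hypothetical.

def wbgt_outdoor(t_natural_wet_bulb_c: float, t_globe_c: float, t_dry_bulb_c: float) -> float:
    """Outdoor (solar-load) WBGT in degrees Celsius."""
    return 0.7 * t_natural_wet_bulb_c + 0.2 * t_globe_c + 0.1 * t_dry_bulb_c

def acsm_category(wbgt_c: float) -> str:
    if wbgt_c > 28.0:
        return "extreme"
    if wbgt_c >= 23.0:
        return "high"
    return "below high"

wbgt = wbgt_outdoor(t_natural_wet_bulb_c=26.0, t_globe_c=40.0, t_dry_bulb_c=33.0)
print(f"WBGT = {wbgt:.1f} °C ({acsm_category(wbgt)})")  # WBGT = 29.5 °C (extreme)
```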

Other critical factors for EHS in large players include excess body fat percentage, decreased aerobic capacity, and a lower surface area to body mass ratio [5]. Aerobic capacity, defined as the maximal capacity for oxygen consumption by the body during maximal exertion (VO2max) [27], has been shown to vary among American Football players at the college [25] and professional [26] levels based upon position. Linemen have been reported to have a lower VO2max (43.5–55.9 mL/kg/min) compared with backs (52.4–60.2 mL/kg/min) [25, 26]. The lower surface area to mass ratio of larger players is a main contributing factor to higher Tc [28, 29], because a lower surface area diminishes heat dissipation while a larger body mass increases heat production [28, 29]. Further, the relatively stationary nature of the lineman position, usually played by larger American Football players, may further reduce heat dissipation compared with the more mobile positions played by smaller athletes [30, 31]. Other reported contributing factors in EHS deaths include hot and/or humid environmental conditions and time of year (preseason). Given the evidence that body size (BMI) and extreme environmental conditions can affect players’ health and safety, it is imperative that coaches and athletic trainers understand the heat-illness risks associated with these factors.
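To make the surface area to mass argument concrete, the sketch below computes BMI, an estimated body surface area (using the Du Bois formula, one common estimate and not drawn from the cited studies), and the surface area to mass ratio for two players with invented, purely illustrative anthropometrics.

```python
# BMI, Du Bois body surface area (BSA), and BSA-to-mass ratio for two
# hypothetical players. The anthropometrics are illustrative only and are not
# taken from any study cited in this review.

def bmi(mass_kg: float, height_m: float) -> float:
    return mass_kg / height_m ** 2

def bsa_dubois_m2(mass_kg: float, height_cm: float) -> float:
    """Du Bois & Du Bois estimate of body surface area (m^2)."""
    return 0.007184 * (mass_kg ** 0.425) * (height_cm ** 0.725)

for label, mass_kg, height_m in [("hypothetical lineman", 135.0, 1.93),
                                 ("hypothetical back", 90.0, 1.83)]:
    bsa = bsa_dubois_m2(mass_kg, height_m * 100.0)
    print(f"{label}: BMI {bmi(mass_kg, height_m):.1f} kg/m2, BSA {bsa:.2f} m2, "
          f"BSA/mass {bsa / mass_kg * 100:.2f} m2 per 100 kg")
# hypothetical lineman: BMI 36.2 kg/m2, BSA ~2.6 m2, BSA/mass ~1.9 m2 per 100 kg
# hypothetical back:    BMI 26.9 kg/m2, BSA ~2.1 m2, BSA/mass ~2.4 m2 per 100 kg
```

In this illustration, the heavier player has roughly 18 % less surface area per unit mass, consistent with the diminished heat dissipation described above.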

4.2 The American Football Uniform and Heat Stress

The American Football uniform covers 70 % of the skin surface area [32] and can impede heat loss through convection, radiation, and evaporation [32]. Evaporation of sweat from the skin is the main mechanism for cooling the body and accounts for 80 % of heat loss during exercise [33]. Once the evaporation required to maintain thermal balance surpasses the maximal evaporation possible in the ambient environment, heat storage becomes inevitable, especially during high-intensity exercise [34]. Clothing can reduce the capacity for evaporative heat loss by creating a microenvironment that is hotter and more humid than the ambient environment [34, 35]. The semi-encapsulating American Football uniform increases thermal insulation by inhibiting both dry and evaporative heat exchange with the environment. By reducing heat dissipation, the uniform can potentially increase players’ risk for heat ailments. Therefore, it is critical to understand the impact of the American Football uniform on the maintenance of thermoregulation.
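To give a sense of scale for the evaporative requirement described above, the sketch below estimates how much sweat would have to evaporate to offset a given rate of heat production; the latent heat of vaporization of sweat (~2.43 kJ/g) is a standard physical constant, and the 800 W heat-production figure is hypothetical rather than taken from the cited studies.

```python
# Sweat evaporation needed to offset a given net heat production. The latent
# heat of vaporization of sweat (~2426 J/g) is a standard constant; the 800 W
# example value is hypothetical.

LATENT_HEAT_J_PER_G = 2426.0  # heat removed per gram of sweat that evaporates

def required_evaporation_l_per_h(net_heat_production_w: float) -> float:
    """Evaporation rate (L/h) needed to balance a net heat production (W)."""
    g_per_s = net_heat_production_w / LATENT_HEAT_J_PER_G
    return g_per_s * 3600.0 / 1000.0  # g/s -> L/h (1 g of sweat ~ 1 mL)

print(f"{required_evaporation_l_per_h(800.0):.2f} L/h")  # ~1.19 L/h to offset 800 W
```

Any sweat that the uniform prevents from evaporating drips off without cooling the body, so the corresponding heat is stored and Tc rises.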

Nearly 50 years ago, the first study of American Football uniforms showed that they restricted heat loss even at moderate environmental temperatures [36]. In 1969, Mathews et al. [37] performed a follow-up study in which participants exercised for 30 min at a moderate environmental temperature wearing either a full American Football uniform or shorts. Results showed greater peripheral blood flow and increased heart rate and sweating rate with the full American Football uniform compared with shorts. The American Football uniform also inhibited evaporative cooling, with a final Tc of 39.0 °C compared with 38.4 °C with shorts at the end of the exercise bout. These early studies were foundational and clearly showed the added thermal strain of the American Football uniform. The studies were conducted in response to 12 EHS deaths from 1959 to 1962 in high school and college American Football players [36, 37]. These deaths were attributed to multiple factors, including exercise in a humid environment, players dressed in full uniform, and restricted fluid intake. The majority of fatalities occurred within the first 2 days of practice, when players were not acclimatized to the heat. Mathews et al. [37] made recommendations for preventative measures, which included allowing ad libitum drinking and heat acclimatization. This study provided warnings about the American Football uniform over 30 years prior to the National Collegiate Athletic Association (NCAA) policy for preseason American Football practice [38].

In 2002, thresholds for identifying levels of uncompensable heat stress were established for American Football uniforms under a variety of environmental conditions [39]. Data from this study were used by the ACSM to develop guidelines for wearing uniforms during exercise in the heat [8]. McCullough and Kenney [40] provided quantitative data modeling heat exchange while wearing an American Football uniform. This study showed a progressive increase in evaporative resistance and thermal insulation with the full American Football uniform compared with shorts or a partial uniform [40]. These results were later confirmed by Armstrong et al. [32], who showed that a full modern American Football uniform caused greater thermal and cardiovascular strain compared with a partial uniform or control (shorts and socks) in a hot environment. Elevated Tc with the full uniform has been shown to persist during recovery periods because the uniform retards heat loss [31, 37]. Johnson et al. [41] also showed a higher Tc in a full compared with a partial modern uniform. Interestingly, however, players were unable to perceive the greater thermal strain. This is an important point for athletic trainers, as perceptual responses in American Football players may not be indicative of thermal strain while exercising in the heat [41].

In 2003, the NCAA implemented a policy for preseason American Football practice to help players acclimatize to the heat and to reduce the heat illness commonly associated with the first 7–10 days of practice. The policy provides guidelines for equipment progression before full pads can be worn, the duration of practice, the number of practice sessions per day, and rest time between sessions on days with two practices [38]. Prior to the 2003 NCAA policy, no guidelines existed for progression through the first weeks of preseason practice; progression was left to the coach’s discretion.

4.3 Field-Based Studies

Prior to the 2003 NCAA policy, Godek et al. [42, 43] were the first to examine thermoregulation in American Football players (Division II) in a field-based study. The study compared Tc in American Football players and cross-country runners exercising in a hot and humid environment (Table 1). A variety of American Football positions were selected (i.e., linemen, tight ends, linebackers, running backs, and defensive backs), representing a wide range of body types. Players practiced twice a day for 8 days, with the exception of day 7, on which the team participated in an intrasquad scrimmage. Equipment for American Football players included no pads on day 1 and thereafter half pads in the morning and full pads in the afternoon (days 2–6, 8). The American Football players and cross-country runners continued their normal training sessions throughout the study. Ambient temperature and humidity were similar for the morning and afternoon practices between days 4 and 8. Tc was significantly higher on day 4 compared with day 8 in both American Football players and runners [42]. American Football players showed a higher Tc with pads (either half or full) compared with no pads [42]. A significantly higher Tc was also reported for full pads in the afternoon compared with half pads in the morning [43] (Table 1). The mean difference in Tc between full and half pads was small (0.2 °C, Table 1), with the authors noting that the difference had little biological or clinical value [43]. Tc fluctuated continually throughout practice for American Football players depending on exercise intensity and rest periods. Despite hot and humid conditions (Table 1), no player approached a Tc of 40 °C or exhibited signs or symptoms of EHS.

In 2006, Yeargin et al. [44] investigated the new NCAA policy for heat acclimatization in Division I American Football players during preseason practice [38]. Over an 8-day period, the authors tested 11 players from various positions. They reported post-practice Tc, measured as players were leaving the field, ranging from 37.2 to 40.7 °C. Over the 8-day period, post-practice Tc exceeded 39 °C in 19 measurements, with 79 % of the occurrences in players with a BMI of >30 kg/m2. The authors noted that Tc was <39 °C for players with a mean BMI of ≤32 kg/m2, compared with Tc of >39 °C for players with a BMI of ≥34 kg/m2. A weak but significant relation (r = 0.23) was shown between Tc and BMI. However, no relation was found between playing position and hyperthermia. The greatest physiological strain occurred in the first 2 days of practice, which accounted for 45 % of all reported Tc of >39.0 °C. Despite practices occurring in a warm, humid environment (28.1 °C WBGT and 65 % relative humidity, averaged over the first 2 days of practice), players were able to gradually acclimatize to the heat with no reports of EHI under the new NCAA policy (i.e., progressive increases in equipment, duration, and the number of practices throughout the week).

Previous field studies have not reported a relation between player position and hyperthermia [42–45], potentially because of the low number of players in each position. A field study by Godek et al. [31] addressed this by comparing Tc between National Football League (NFL) linemen (i.e., offensive and defensive linemen) and backs (i.e., wide receivers, running backs, corner backs, tight ends, and linebackers) during twice-daily preseason practices. They observed a significantly greater Tc in NFL linemen compared with backs (Table 1). Linemen showed higher Tc during individual drills and during team and live scrimmages. Due to the intermittent nature of the sport, Tc fluctuated throughout practice, rising during activity and decreasing during breaks, as shown in previous studies [42]. Waligum and Paolone [29] also reported higher Tc in linemen compared with backs in a laboratory-based study. As previously mentioned (Sect. 4.1), the differences in Tc between linemen and backs are multifactorial. Potential factors include anthropometric differences (i.e., BMI, percentage of body fat, body surface area, and surface area to mass ratio) and position-specific requirements that influence metabolic heat production or reduce heat dissipation [31, 46, 47], as first proposed by Godek et al. [31].

One limitation of most field-based studies is the lack of data quantifying exercise intensity among players [31, 42, 44, 48]. A recent field study by DeMartini et al. [30] showed that linemen engaged in more isometric work and covered less distance at lower velocities than backs, although this does not imply a lower metabolic rate or heat production in linemen compared with backs [30]. For example, backs covered greater distances at higher velocities than linemen, yet average heart rate over the total practice was the same for both groups [30]. Due to the stationary nature of the lineman position, it is plausible that reduced heat dissipation could be a primary factor, among others, for the higher Tc. This was demonstrated by Deren et al. [46, 47] in a series of studies in which linemen and non-linemen exercised at a fixed metabolic heat production per unit of body surface area (350 W·m−2) for 60 min at 32 °C. The results showed a significantly higher Tc and sweating rate in linemen despite a 25 % lower heat production per unit mass. This suggests that linemen have a compromised potential for heat dissipation via a diminished sweating efficiency due to a lower skin wettedness and lower sweat gland density [46].
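The sketch below illustrates the arithmetic behind this type of fixed heat production protocol: matching heat production per unit surface area (350 W·m−2, as described above) yields a lower per-kilogram rate for the larger player. The body masses and surface areas are hypothetical and are not values from the Deren et al. studies.

```python
# Fixed heat production per unit body surface area (350 W/m2), as in the
# protocol described above. Masses and surface areas are hypothetical; the
# point is that the same areal rate implies a lower per-kilogram rate for the
# larger player.

TARGET_W_PER_M2 = 350.0

for label, mass_kg, bsa_m2 in [("hypothetical lineman", 135.0, 2.6),
                               ("hypothetical back", 90.0, 2.1)]:
    total_w = TARGET_W_PER_M2 * bsa_m2
    print(f"{label}: {total_w:.0f} W total, {total_w / mass_kg:.1f} W/kg")
# hypothetical lineman: 910 W total, 6.7 W/kg
# hypothetical back:    735 W total, 8.2 W/kg  (the lineman's per-kg rate is ~18 % lower)
```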

In a follow-up field study, Deren et al. [47] reported that self-generated air movements were significantly lower in linemen than in backs during preseason practice. The study showed that convective and evaporative heat loss were lower for linemen during wind sprints, individual drills, and team scrimmages. Air flow, whether from self-generated movements or from the ambient environment (i.e., wind), helps to move warm air away via convective currents and can also help to evaporate sweat by increasing the evaporative capacity relative to still air [49]. Since linemen run at lower velocities, less ambient air circulates in the microenvironment, thereby reducing convective and evaporative heat loss [50]. Deren et al. [47] proposed using mechanical misting fans to artificially induce air flow and moisture, helping to increase convective and evaporative cooling for offensive and defensive linemen during stationary drills or breaks in practice [46, 47]. Previous laboratory-based studies [28, 29] are consistent with recent laboratory and field studies [46, 47] showing heat dissipation to be diminished in linemen compared with backs [29, 46, 47] and in larger versus smaller subjects [28]. Investigators have also examined differences in Tc between offensive and defensive linemen [48]. However, no difference in Tc between offensive and defensive linemen was shown during twice-daily preseason practices [48], potentially because of the similar characteristics of the two positions.

Considering the success of the NCAA policy for progression of preseason practice with American Football athletes [44], it is interesting that not every level of competition has adopted or implemented appropriate guidelines. In 2009, the NATA published preseason heat acclimatization guidelines for secondary school athletes [12]. However, numerous states at the high school level still have inadequate recommendations in place for heat acclimatization and progression through preseason practices [51]. Only 14 states have implemented all of the heat acclimatization guidelines set forth by the NATA, with the remaining states being deficient [51]. Limited research exists on Tc during preseason practice in high school players. Yeargin et al. [52] examined Tc in adolescent American Football players (Table 1) over 10 days of preseason practice in late August. Players practiced once a day on days 1–5 and 8–10 and twice daily on days 6 and 7. There was a main effect of time (days) on maximum Tc, with Tc higher on days 1–3 and lower on days 6 and 7. This is in agreement with previous field-based studies showing greater physiological strain during the first 2–3 days of practice in collegiate players [42, 44]. Maximal daily Tc was correlated with maximum WBGT, in line with previous research [44]. The authors provided a model in which adolescent American Football players can safely participate in preseason practice without excessive Tc (i.e., remaining <40 °C) [52].

4.4 Work to Rest Ratio

The work to rest ratio should also be considered in relation to environmental conditions during the preseason [13]. Tc in American Football players has been shown to fluctuate during practice based upon periods of activity and rest [31], with exercise intensity being one of the primary factors influencing Tc [31]. Guidelines from the NATA recommend that the work to rest ratio be adjusted on the basis of the environmental conditions and the intensity of the exercise [13]. As previously mentioned (Sect. 4.2), elevated Tc with the full uniform has been shown to persist during recovery periods (i.e., rest breaks) because the uniform retards heat loss [37, 53]. Therefore, athletes should be permitted and encouraged to remove equipment (i.e., helmets) during rest breaks [13]. The NATA further recommends that breaks be planned in advance and located in the shade, with athletes having enough time to consume fluids [13].

5 Sweating Rate and Body Fluid Balance

5.1 Sweating Rate

As mentioned previously (Sect. 4.2), thermoregulatory sweating occurs during exercise and/or heat stress in an attempt to dissipate body heat and regulate Tc. With sweating, heat is transferred from the body to water (sweat) on the surface of the skin. When this water gains sufficient heat, it is converted to a gas (water vapor), thereby removing heat from the body [54]. Many intrinsic (e.g., genetics, aerobic capacity, body size, and heat acclimatization) and extrinsic (e.g., environmental temperature/humidity, practice intensity, clothing/protective equipment) factors modify the rate of sweating; thus, the variability in sweat loss between athletes can be considerable [55]. Metabolic heat production is a major determinant of the rise in Tc, and sweating rate therefore increases in proportion to exercise intensity [56]. Large body size [31, 57–62], high environmental temperatures [56], and wearing clothing or protective equipment [32, 37, 63] can also increase heat strain and sweating rate [55, 64].

The commonly cited range in sweating rate across a wide range of individuals during exercise is 0.5–2.0 L/h [55]. However, sweating rates above this range have been observed in American Football players. Several studies of collegiate and professional American Football teams have reported mean sweating rates of ~2.0 L/h or more [31, 65, 66]. Furthermore, sweating rates of 3.0 L/h or more have been reported in some larger adult players [45, 65, 67, 68]. Several studies have compared sweating rates among groups of athletes to determine whether differences in body size, playing position, level of play, age, uniform configuration, and other factors are associated with differences in sweat loss. The following sections discuss the impact of these various factors on sweating rate in American Football players (see Table 2 for a summary).

5.2 Body Size

Several authors have discussed the impact of body size (body mass or body surface area) on thermoregulation [31, 57–62]. It is well established that heavier athletes produce and store more heat at a given absolute work rate (e.g., running velocity) than their lighter counterparts [57–59]. It therefore stands to reason that body size significantly impacts thermoregulatory responses such as sweating rate. As such, a direct significant relation between sweating rate and body surface area [31, 60, 61] or body mass [60, 62] is commonly reported in American Football studies. In addition, the large differences in sweating rate reported between American Football and other sports are usually nullified by accounting for differences in body mass or body surface area (i.e., expressing sweating rate relative to body mass in kg or body surface area in m2) [55, 65, 67].
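A minimal sketch of the field estimate of sweating rate underlying the studies in Table 2 (pre- to post-practice body mass change corrected for fluid intake and urine output), together with the normalizations to body mass and body surface area mentioned above, is shown below; all input values are hypothetical, and individual studies differ in which correction terms they apply.

```python
# Field estimate of sweating rate from body mass change, fluid intake, and
# urine output, then normalized to body mass and body surface area. All input
# values are hypothetical; studies vary in which corrections they include.

def sweat_loss_l(pre_mass_kg: float, post_mass_kg: float,
                 fluid_intake_l: float, urine_l: float = 0.0) -> float:
    """Estimated sweat loss (L), assuming 1 kg of body mass change ~ 1 L of fluid."""
    return (pre_mass_kg - post_mass_kg) + fluid_intake_l - urine_l

pre, post = 135.0, 133.8                          # kg, hypothetical 2-h practice
intake, urine, hours, bsa_m2 = 2.5, 0.3, 2.0, 2.6

loss = sweat_loss_l(pre, post, intake, urine)     # 3.4 L
rate = loss / hours                               # 1.7 L/h
print(f"Sweat loss {loss:.1f} L, rate {rate:.2f} L/h")
print(f"Relative to body mass: {rate * 1000 / pre:.1f} mL/kg/h")
print(f"Relative to BSA: {rate / bsa_m2:.2f} L/m2/h")
```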

5.3 Playing Position and Level of Play

As discussed previously (Sect. 4.1), there are considerable size and body composition differences between certain playing positions, such as linemen versus backs. In addition to the effects of body size, a higher fat mass also increases body heat storage, due to inhibition of heat dissipation [57]. Given the impact of body size and composition on heat strain, it is not surprising that linemen have been shown to exhibit significantly higher sweating rates than backs during summer training [60, 61, 68]. For example, in one study with professional players, sweating rate was significantly higher in linemen (2.25 ± 0.68 L/h) and linebackers/quarterbacks (1.98 ± 0.48 L/h) than in backs (1.40 ± 0.45 L/h) [68]. Only one study has compared sweating rates in American Football players of different playing levels. Godek et al. [61] tested collegiate and professional players matched for body mass, height, and body surface area playing under the same outdoor environmental conditions and found no differences in sweating rates between levels. In addition to body size, exercise intensity is another important determinant of sweating rate. One laboratory study has compared regional sweating rates of linemen and backs cycling at a fixed metabolic heat production per unit body surface area [46]. In this study, linemen exhibited significantly greater sweating rates on the forehead, chest, shoulder, and forearm compared with subjects who played a back position. Interestingly, sweating efficiency was found to be lower in the linemen, as Tc was higher in linemen despite their greater sweating rates compared with backs [46]. More work in this area is needed to better understand the effect of player position and position-specific factors on heat strain, thermoregulatory sweating, and fluid replacement needs.

5.4 Age

As shown in Table 2, the sweating rates of youth American Football players are generally lower than those of collegiate and professional players. For instance, during summer training, the mean sweating rate in youths ranged from 0.6 to 1.3 L/h across three studies [52, 62, 69], while adults exhibited mean sweating rates of 1.0–2.9 L/h across nine studies [31, 43, 60, 61, 65, 66, 68, 70, 71]. In addition, in one study, Yeargin et al. [52] found that sweating rate (averaged across 10 days of preseason training) was significantly lower in younger (14–15 years, 0.6 ± 0.4 L/h) versus older (16–17 years, 0.8 ± 0.3 L/h) heat-acclimatized youth American Football players. It is important to note that these age/maturation differences in American Football players’ sweating rates could be due in part to unmatched exercise intensities within and between studies. Nonetheless, laboratory studies controlling for exercise intensity have also reported age/maturation differences in sweating rate [72–75]. In addition, the lower sweating rates in younger/pre-pubertal athletes remain even after accounting for differences in body mass or body surface area [67, 72–74].

5.5 Practice Versus Game

Only one American Football study has compared sweating rates during practices and games. McDermott et al. [62] tested 9- to 15-year-old players over the course of a 5-day summer camp and found that the mean sweating rate during games (1.30 ± 0.57 L/h) was about two times that of practices (0.65 ± 0.35 L/h). The differences in sweating rate could be attributed to differences in exercise intensity or work to rest ratio, since ratings of perceived exertion (on the Borg 6–20 scale) were significantly higher during games (19 ± 2) compared with practices (16 ± 2) [26]. Additional work is needed to help determine how the hydration needs of American Football players at all levels of play vary between practices and games.

5.6 Uniform Configuration

As discussed previously (Sect. 4.2), protective equipment impairs heat loss capacity because of its high insulation and low water vapor permeability, and it therefore leads to increased thermal strain [63]. In addition, the extra weight load from wearing equipment increases metabolic heat gain [37, 63]. Consequently, wearing American Football pads during exercise has been shown to increase the thermoregulatory sweating response [32, 37]. Armstrong et al. [32] compared the sweating rates of men wearing a full uniform, a partial uniform (without helmet and shoulder pads), and control clothing (shorts, socks, and shoes only) while performing a standardized American Football-simulated laboratory protocol in the heat (33 °C). In this study, sweating rate was significantly higher in the full (2.05 ± 0.34 L/h) and partial (1.86 ± 0.25 L/h) uniforms versus the control clothing (1.24 ± 0.16 L/h). However, there seems to be no significant difference in sweating rate between uniform configurations (i.e., full versus partial pads). In the study by Armstrong et al. [32], no differences in sweating rate were found between the full and partial uniforms. Similarly, in a field study, Godek et al. [43] reported no differences in collegiate American Football players’ sweating rates when wearing full (2.01 ± 0.66 L/h) versus half pads (1.97 ± 0.53 L/h) during summer training. These results are not surprising, since in both studies there were no differences in the players’ Tc between uniform configurations [32, 43].

5.7 Fluid Balance

When sweat output exceeds fluid intake, a body water deficit (hypohydration) accrues, which can lead to impaired circulatory and thermoregulatory function during endurance exercise. For example, laboratory studies have shown that ≥3 % hypohydration causes a decrease in plasma volume and therefore a decrease in stroke volume and a compensatory increase in heart rate to maintain a given cardiac output [76–78]. In addition, ≥3 % hypohydration has been shown to delay the onset and decrease the sensitivity of the sweating and skin blood flow response to hyperthermia [79–81]. The physiological strain associated with hypohydration is exacerbated by greater body water deficits and when exercise is performed in hot-humid environments [76, 82]. For more details on the physiology and performance effects of hypohydration, the reader is referred to a recent comprehensive review [83].

Despite high sweating rates in American Football players, the observed disturbances in fluid balance have generally been mild (see Table 2). For example, most field studies with professional [31, 60, 68], collegiate [43, 44, 70, 71], and high school [69] American Football players reported mean body mass losses of ≤2 %. These results suggest that the drinking breaks in most studies provided athletes with enough fluid to offset sweat losses. In fact, some studies in youth players have shown better in-practice hydration (i.e., replacing sweat losses) than off-field rehydration (i.e., pre- and in-between practice) habits [52, 62]. This is an interesting finding considering that youth athletes’ perceived sweat losses often underestimate their actual sweat losses, but it can perhaps be explained by a combination of adequate fluid availability and relatively low sweating rates (making it easier to drink enough to offset losses during practice) [52]. Given the structure of the game, which includes the allowance of player substitutions, frequent stoppages of play (timeouts, breaks after each quarter, etc.), and the fact that most college and NFL athletes play only one side of the ball (offense or defense), American Football players should have ample opportunity to consume fluids.

Only one study, by Horswill et al. [66], has reported mean pre- to post-practice losses of >2 % body mass (see Table 2). This was one of the few studies in which athletes remained in full pads for the duration of practice. In addition, the location of this testing (southeastern United States) provided some of the highest levels of heat stress (WBGT 29.4–32.2 °C) among field studies to date. Sweat losses in these athletes may have occurred at a rate high enough that their fluid intake could not keep pace [66].

A few studies have also tracked baseline (pre-practice) body mass changes throughout consecutive days of summer training camp [43, 44, 69]. These study results showed that mild body mass deficits accrued after the first day of training, but hypohydration levels did not worsen over the remainder of camp. That is, fluid intake between practice sessions was apparently sufficient to avoid cumulative hypohydration from consecutive days of training. For example, in collegiate players, baseline body mass decreased by 0.8, 1.5, 0.9, 1.1, and 0.5 kg (i.e., ~0.4–1.3 %), respectively, during days 2, 3, 4, 6, and 8 compared with that of day 1 [43]. Similarly, Stover et al. [69] reported that the pre-practice body mass of high school players was consistently ~0.5 kg (i.e., 0.6 %) less on days 2–5 of training camp versus day 1. There also seems to be no significant effect of player position on fluid balance. That is, athletes with higher sweating rates consumed more fluid than their counterparts who sweated less, leading to no differences in body mass deficits between linemen and backs [60, 61].

Another potential concern related to fluid balance is overdrinking relative to sweat losses, which can increase the risk for symptomatic exercise-associated hyponatremia (i.e., blood sodium dilution to <125–130 mmol/L) [84]. Hyponatremia is a serious, potentially life-threatening condition and has been reported in American Football [84–86]. Thus, hydration education and behaviors should be aimed at preventing significant fluid imbalances, including overdrinking (body mass gain) as well as underdrinking (≥3 % body mass loss). Knowing individual sweating rates (in various training and environmental conditions) and monitoring body mass changes, urine output (volume and concentration), and thirst can help inform appropriate personalized fluid replacement strategies [55, 84].
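As a rough illustration of such a personalized strategy, the sketch below bounds in-practice fluid intake between two limits: no body mass gain (to avoid overdrinking) and a chosen maximum body mass loss. The 2 % cap and all input values are illustrative assumptions, not thresholds taken from the cited guidelines.

```python
# Bound in-practice fluid intake between "no body mass gain" and "no more than
# a chosen body mass loss", given a measured sweating rate. The 2 % cap and the
# example inputs are illustrative assumptions only.

def intake_range_l(sweat_rate_l_per_h: float, duration_h: float,
                   body_mass_kg: float, max_loss_fraction: float = 0.02):
    """Return (min_intake_l, max_intake_l) for one practice session."""
    sweat_loss = sweat_rate_l_per_h * duration_h        # L lost to sweat
    max_deficit = body_mass_kg * max_loss_fraction      # L (~1 kg per 1 L)
    min_intake = max(0.0, sweat_loss - max_deficit)     # avoid excessive loss
    max_intake = sweat_loss                             # avoid body mass gain
    return min_intake, max_intake

low, high = intake_range_l(sweat_rate_l_per_h=2.0, duration_h=2.5, body_mass_kg=135.0)
print(f"Target intake: {low:.1f}-{high:.1f} L over the session")  # 2.3-5.0 L
```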

6 Sweat Sodium Concentration

Thermoregulatory sweat is composed of water and electrolytes, predominantly sodium and chloride. The total amount of sodium lost in sweat depends on the rate and duration of sweating as well as the sweat sodium concentration. The mean whole-body sweat sodium concentration during exercise across a variety of athletes is ~35–40 mmol/L, but it ranges from ~10 mmol/L to as high as ~90 mmol/L [55, 87]. The considerable inter-individual variability in sweat sodium concentration is due to multiple factors, including genetics, diet, heat acclimatization status, and sweating rate [55]. The “gold-standard” method of measuring whole-body sweat sodium concentration is the whole-body washdown technique [88]. However, in field studies, regional (e.g., forearm) sweat collections have been used as a more practical means to estimate sweat sodium concentration, and regression equations [87, 89] can be used to predict whole-body sweat sodium concentration.

Only a few studies have measured the sweat sodium concentration of American Football players during summer training [52, 67, 68]. The sweat sodium concentration of American Football players does not seem to differ significantly from that of athletes in other sports. For example, Baker et al. [67] found no difference in the mean forearm sweat sodium concentration of competitive/professional athletes in American Football (44 ± 19 mmol/L) versus other team/skill sports, such as baseball (36 ± 18 mmol/L), basketball (46 ± 17 mmol/L), tennis (40 ± 19 mmol/L), and soccer (36 ± 13 mmol/L). In addition, Godek et al. [68] measured forearm sweat sodium concentration in professional players and found no differences between backs/receivers and quarterbacks/linebackers (group mean 50 ± 16 mmol/L, range 15–99 mmol/L). Finally, in agreement with others [72] testing the impact of age/maturation on sweat composition, Yeargin et al. [52] found that forearm sweat sodium concentration was ~30 % lower in 14- to 15-year-old (27.3 ± 17.2 mmol/L) versus 16- to 17-year-old (40.4 ± 19 mmol/L) players during summer training (although this difference did not reach statistical significance). To the best of our knowledge, no studies have measured sweat sodium concentration during the competitive season or determined intra-subject variability with changes in exercise intensity, weather, heat-acclimatization or training status, or other factors that change throughout the course of an American Football season.

Although the sweat sodium concentrations observed in American Football players are similar to those in other sports, it is expected that total sweat sodium losses would be higher in American Football because of the very high sweating rates commonly observed, particularly in larger players (e.g., linemen) [67]. However, the impact of these sodium losses in American Football players is currently unclear. There are insufficient data to inform clear recommendations on how much sodium needs to be replaced for the maintenance of physiological or performance outcomes. It has been suggested that sodium should be consumed during exercise when total sodium losses reach 3–4 g or more [90, 91]. For an American Football player with a whole-body sweat sodium concentration of 40 mmol/L (920 mg/L) and a sweating rate of 2 L/h, it would take 1.6–2.2 h of training to lose 3–4 g of sodium. Since American Football practices are commonly ~2 h or more, many adult players would be expected to lose ≥3–4 g of sodium during summer training camps. Indeed, Godek et al. [68] reported total sodium losses of 9.8 ± 6.0 g (group mean ± standard deviation) on days when professional American Football players practiced 4.5 h during the preseason.
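The arithmetic in the example above can be made explicit as follows; the 40 mmol/L and 2 L/h inputs come from the example in the text, and the molar mass of sodium (~23 mg/mmol) is a standard constant.

```python
# Time needed to lose 3-4 g of sodium in sweat, using the example values from
# the text (40 mmol/L, 2 L/h). The molar mass of sodium (~23 mg/mmol) is a
# standard constant.

NA_MG_PER_MMOL = 22.99

def hours_to_lose_sodium(target_na_g: float,
                         sweat_na_mmol_per_l: float,
                         sweat_rate_l_per_h: float) -> float:
    na_mg_per_l = sweat_na_mmol_per_l * NA_MG_PER_MMOL         # ~920 mg/L at 40 mmol/L
    na_loss_g_per_h = na_mg_per_l * sweat_rate_l_per_h / 1000.0
    return target_na_g / na_loss_g_per_h

for grams in (3.0, 4.0):
    t = hours_to_lose_sodium(grams, sweat_na_mmol_per_l=40.0, sweat_rate_l_per_h=2.0)
    print(f"{grams:.0f} g of sodium lost after ~{t:.1f} h")    # ~1.6 h and ~2.2 h
```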

The potential benefits of, or rationale for, sodium ingestion during exercise include the replacement of sweat electrolyte losses [55] and better maintenance of sodium balance [92, 93] and plasma volume [91]. However, there is currently a lack of well-controlled studies of intermittent high-intensity exercise measuring the impact of sodium loss or replacement in American Football. There is some indication that higher sweat sodium losses may occur in players with a history of heat-related whole-body muscle cramps [71]. However, this association with high sweat sodium losses is not consistently found in all cramp-prone players [66]. More work is needed to understand the impact of sodium losses on hydration and/or performance outcomes during exercise. It is clear, however, that sodium ingestion significantly improves post-exercise rehydration. Ingestion of sodium with water helps stimulate more complete rehydration, including better plasma volume restoration and whole-body fluid balance, compared with ingestion of plain water [94–98]. The increase in blood sodium concentration and osmolality with sodium ingestion stimulates renal water reabsorption, such that urine output is inversely related to the sodium content of the ingested fluid [94]. Rapid and complete restoration of fluid balance via sodium and water ingestion is particularly important during twice-daily practice sessions. In most other situations, water and sodium can be consumed with normal eating and drinking practices (i.e., with meals) [91].

7 Study Limitations, Gaps in the Literature, and Future Directions

Considering the disproportionate death rates from EHS in American Football compared with other sports, it is interesting that only five field-based studies have investigated Tc responses in American Football players (Table 1). Further, only one study has investigated Tc responses among American Football players at the high school level [52]. This is surprising, considering that deaths from EHS have been routinely tracked in American Football players since the 1960s and that the largest percentage of EHS deaths has occurred in high school players [7]. There is still work to be done at the high school level, as not all states have adopted appropriate policies to date. High school athletic directors should consider adopting appropriate policies for preseason practice even if adequate policies are not in place for their state.

Another gap in the literature is that very few studies have been conducted in the hottest geographical areas of the United States (i.e., the southeast, south, and southwest regions). Of the five field-based studies investigating Tc responses, only one has been conducted in the southeastern United States (i.e., Florida) [48], with the majority of work being done in the northeast [31, 42–44, 52]. Understanding the challenges players face in hotter parts of the United States will provide valuable insight into the risk associated with preseason practice in these areas. More research is also needed across a range of player ages and regions of the USA to gain a more comprehensive understanding of the fluid intake behaviors of American Football players. In addition, future work is warranted to inform clear recommendations on how much sodium needs to be replaced during training for the maintenance of physiological or performance outcomes in American Football.

8 Conclusion

Field-based studies have been conducted in the preseason with American Football players competing at the high school, college, and professional levels. A consistent finding across studies is the significant relation between body size (surface area or body mass) and sweating rate, such that larger players (e.g., linemen and linebackers) exhibit higher sweating rates than smaller players (e.g., backs). Wearing an American Football uniform, whether full or half pads, leads to significantly higher sweating rates compared with wearing standard athletic clothing (e.g., shorts). Mean sweating rates ranged from 0.6 to 1.3 L/h in studies with youth American Football players and from 1.0 to 2.9 L/h in college and professional American Football players. Despite high sweating rates in American Football players, the observed disturbances in fluid balance have generally been mild (mean body mass loss ≤2 %). The sweat sodium concentration of American Football players does not seem to differ from that of athletes in other sports, but given the high volume of sweat loss in American Football players, the potential for sodium loss is higher in American Football than in other sports. It is well established that sodium ingestion plays an important role in stimulating more complete rehydration following exercise, which is particularly important during twice-daily training sessions. However, future work is needed to better understand how much sodium needs to be replaced during training for the maintenance of physiological or performance outcomes in American Football. In general, youth players exhibit lower sweating rates and lower sweat sodium concentrations than adult American Football players.

Epidemiological studies report disproportionately higher rates of EHI and EHS in American Football compared with other sports, with the majority of incidents occurring during preseason practice at the hottest times of the year. Field- and laboratory-based studies agree that larger individuals are at greater risk for heat ailments early in the preseason, specifically within the first 3–4 days of practice. Larger players are at increased risk for heat ailments compared with smaller players because of a greater BMI, increased body fat, a lower surface area to body mass ratio, lower aerobic capacity, and the stationary nature of the position they may play, which can reduce heat dissipation. Coaches, athletic trainers, and team personnel should be aware of the unique thermoregulation, fluid balance, and sweat loss challenges American Football players face during preseason practice and should monitor players more diligently, particularly large linemen.