Introduction

The contemporary student of human food intake has every reason to wonder at the spectacular developments in our knowledge of the mechanisms commanding ingestion. The anatomy and physiology of brain structures and their interactions with peripheral neural and hormonal mechanisms in the digestive tract and adipose tissue have now been described in impressive works illustrating their functions in health and disease, including intake-related conditions such as obesity and metabolic disorders (a brief review is presented in [1]). In spite of so many brilliant advances, contemporary science has little to offer to counter the worldwide epidemic of obesity and metabolic diseases. The proportions of individuals affected by various intake-associated diseases have reached unprecedented levels and are still rising in most parts of the world [2].

Food intake is a key contributor to the obesity epidemic, although other aspects of the contemporary lifestyle are also influential [3]. One important task, in order to improve our understanding of the mechanisms that determine intake under real-life conditions, is to characterize intake behavior in all its aspects, since these aspects potentially reflect critical underlying influences. However, obtaining objective measures of food intake in humans is a very challenging task.

There are many reasons for this situation. Food ingestion in humans is a highly complex set of behaviors occurring under multiple influences and in a variety of circumstances [4]. From a physiological perspective, intake is a key actor in physiological regulatory mechanisms (energy balance, glycemia, body adiposity, and others) [5] and responds to internal control signals [6]. Food ingestion in humans also responds to numerous aspects of the environment, from sensory stimulation to complex environmental constraints, including the socio-cultural demands of human societies [3]. The satiety cascade, originally presented by John Blundell in 1987 [7] and regularly updated [8] to include newly identified factors, illustrates the impact and interactions of numerous factors at the time of meals (satiation mechanisms) and between meals (satiety mechanisms). The contemporary lifestyle in an obesogenic environment makes a scientist’s task increasingly difficult. Among other difficulties, people now frequently eat “on the go” as opposed to the traditional context of culturally defined “main meals” [9].

Ideally, in the context of the worldwide epidemic of obesity, tools should be developed to allow the objective measurement of human food intake responses and the exploration of underlying mechanisms that determine the adjustment of intake to needs and permit adequate weight control (or fail to do so). Objective measures of energy intake over one day, or even over one meal, are useful. In addition, studies of the microstructure of meals, in so far as it reflects the interplay of decisive factors at the time of ingestion, can open windows for fine-resolution analyses.

Measures of human eating in a laboratory context

In order to obtain a precise quantification of the behaviors of interest, laboratory studies offer a rigorous, if simplified, context. Under laboratory settings, the conditions of intake, and the various factors thought to influence it, can be manipulated in a rigorous, reproducible fashion. In addition, the characteristics of the ingestive process itself can be specifically measured with adequate instruments.

Interestingly, early studies of objectively measured food intake responses in humans described sucking behavior in newborn infants [10, 11]. A continuous graphic record of the sucking pressures developed within the mouth of the infant was used to study the day-to-day changes in sucking within the first week of life [11]. The number of sucks, the average pressure per suck, and the amount of nutrient consumed per minute were related to the chronological age of the infant, revealing an increase in the amount of nutrient consumed between 24 and 48 h of age [11].

Inspired by the Skinner box used in animal studies, Hashim and Van Itallie attempted to reduce the eating process to its simplest components [12, 13]. They developed a “feeding machine” consisting of an electronically monitored reservoir filled with a bland but nutritionally adequate liquid formula. Subjects could self-feed by operating a syringe-type pump to deliver a predetermined volume of formula directly into their mouth. The amounts consumed were converted into cumulative intake graphs with calories plotted against time. Several other teams examined cumulative food intake over the course of a single meal using a variety of liquid food dispensers [14,15,16]. One later development of these early devices, the universal eating monitor (UEM) developed by Kissileff et al. [17] (see article in this same Special Issue), allowed the recording of cumulative intake of solid as well as liquid foods.

Many hypotheses about the mechanisms controlling food intake could be generated from the observation of cumulative intake curves. Early analyses suggested that cumulative intake curves fit a quadratic equation with a linear coefficient corresponding to the initial rate of eating and a quadratic coefficient reflecting prandial acceleration or deceleration of eating [18]. Changes in stimulatory factors (such as palatability) were theorized to affect the initial eating rates, whereas inhibitory processes (the presence of satiation hormones, for example) would affect the rate of deceleration [18]. Meyer and Pudel [16] identified two types of cumulative intake curves. A decelerating type, seen most often in normal-weight subjects, was called a “biological satiation curve,” whereas a linear type, which was reported predominantly in individuals with obesity, was hypothesized to reflect a distorted perception of satiation signals. In this view, overeating was attributed to a person’s blunted perception of or responsiveness to satiation signals [18]. Cumulative intake curves did not, however, permit the identification of the specific aspects of mealtime responses responsible for changes in eating rates.
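For illustration, the quadratic model described above can be written (with notation assumed here rather than taken from the original reports) as $I(t) = r_0\,t + a\,t^2$, where $I(t)$ is cumulative intake at time $t$ after meal onset. The instantaneous eating rate is then $dI/dt = r_0 + 2at$, so $r_0$ corresponds to the initial eating rate, a negative $a$ produces the decelerating “biological satiation curve,” and $a \approx 0$ produces the linear pattern.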

The Edogram method was developed to investigate eating responses at the time of meals consumed under laboratory settings. The term was used for the first time in a 1969 article by Pierson and Le Magnen [19]. An unorthodox neologism, it combines a Greek root (“gram”: recording) and a Latin root (“edere”: to eat). Beyond addressing the meal as a whole (its size, duration, volume, total energy, and nutrient content), it allowed an analysis of what was called the “microstructure” of the meal in terms of chewing, swallowing, and pausing responses. Edograms were used to quantify the impact of a number of factors at discrete moments within the meal: what is the influence of hunger? What is the influence of sensory factors such as palatability and variety? Is their influence the same at the beginning versus the end of meals? The Edogram method was also used to detect potential differences in intake behavior between individuals with normal weight and individuals with obesity.

In the scientific literature, measuring intake is often equated with measuring the energy, weight, or volume of food consumed, not with measuring the underlying behavioral responses that generate this value. The edogram, unlike instruments that represent intake as a single value (calories or grams), is a direct record of the nervous and muscular activity that reflects underlying mechanisms. It offers the possibility to quantify these influences and to examine how internal and external influences combine into a common behavioral output. The neural and physiological pathways that translate a change in the hedonic dimension of food or in deprivation into altered intrameal eating patterns are beyond the scope of the present review. Research in this field is abundant and updates on the role of the nervous system between input and output are published regularly [5, 6].

The edographic studies addressed various hypotheses linking external and internal stimuli, via physiological pathways, to changes in chewing and swallowing as measured by the edogram. The following hypotheses were tested:

Hypothesis 1: Pleasure drives rates of consumption. Predicted outcomes: increasing palatability (from lowest to highest levels based on prior assessment independent of the edogram) will increase eating rates as well as total intake, and be reflected in edogram parameters: shorter chewing time, fewer swallows, and shorter intrameal pauses.

Hypothesis 2: Internal state drives consumption. Predicted outcomes: increasing the duration of premeal deprivation will increase eating rate at the beginning of meals, as evidenced by shorter chewing, fewer swallows, shorter pauses. As the meal progresses and the internal state goes from initial deprivation to satiation, the edograms will reflect the change in internal state and show a progressive increase in chewing time and pauses, and therefore a slower eating rate.

Hypothesis 3: Mixed meals counter sensory-specific satiation (SSS). Predicted outcomes: when several foods are presented, SSS will be minimized. Therefore, mixed meals will show the highest level of stimulation: fastest eating rates, shortest chewing, and shortest pauses.

Hypothesis 4: Body weight status affects consumption. Faster eating rates are reported in individuals with overweight/obesity, and cumulative intake curves are linear rather than decelerating. Predicted outcomes: underlying the higher eating rate, chewing time will be shorter in participants with overweight/obesity than in normal-weight individuals; intrameal pauses will be shorter. These parameters will remain relatively constant from the beginning to the end of the meal.

Recording edograms

In a pilot phase [20], electromyographic recordings of masseter muscles during chewing established that texture is a critical determinant of chewing responses, both in terms of the force of chewing movements and the duration of chewing before complete swallowing of a food piece. Foodstuffs were shown to fall on a continuum of textures, from hard/dry to soft/fluid, that critically determines chewing activity [20]. Therefore, in order to study the influence of deprivation and palatability, it appeared important to work under fixed texture conditions. Keeping texture constant, early edographic recordings investigated the influence of preprandial deprivation duration on the ad libitum intake of standardized food stimuli [19]. Later studies [21,22,23] obtained edograms under varying conditions of deprivation and palatability, using standardized food units: typically, small open sandwiches (about 4 cm2) made of sliced white bread spread with a thin layer of one type of food. Many spreads could be used (from mustard to meat, cheese, or jam) in single-food or mixed-flavor meals. The food stimuli used in edographic studies varied in visual aspect as well as flavor. Their texture and energy value, being determined largely by the bread, were approximately the same for all flavors (average weight 7 g, average energy value 25 kcal; about 350 kcal per 100 g). Before the start of edographic recordings, sensory evaluation tests were conducted in order to assess the palatability of the food stimuli, under the same laboratory conditions as the edographic tests (time of day, day of week, previous deprivation time) [21,22,23]. Participants were invited to taste the test foods, presented in random order, and to rate their palatability on a 100 mm visual analog scale. For each participant, it was then possible to obtain a score representing the palatability of each test food and to establish a hierarchy of the palatability ratings. This procedure made it possible to examine how palatability ratings correlated with edographic parameters and how edographic responses tracked each individual’s palatability hierarchy across the different food stimuli (a hierarchy that could differ considerably between participants).
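As a minimal sketch of this ranking step (participant labels, foods, and ratings are hypothetical, and this is not the original analysis code), the per-participant hierarchy can be derived directly from the visual analog scale (VAS) scores:

```python
# Illustrative sketch: derive each participant's palatability hierarchy
# from 100 mm VAS ratings (hypothetical data).
vas_ratings = {
    "P01": {"mustard": 22, "cheese": 61, "jam": 83, "meat": 47},
    "P02": {"mustard": 55, "cheese": 78, "jam": 34, "meat": 66},
}

def palatability_hierarchy(ratings):
    """Return foods ordered from least to most palatable for one participant."""
    return sorted(ratings, key=ratings.get)

for participant, ratings in vas_ratings.items():
    print(participant, "->", " < ".join(palatability_hierarchy(ratings)))
```

Note how the hierarchy differs between the two hypothetical participants even though the same foods are rated, which is why edographic responses were examined against each individual’s own ranking rather than a group average.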

The edogram consisted of parallel recordings of chewing and swallowing activity during a standardized meal ingested under controlled laboratory conditions. Ingestive patterns were recorded using a two-channel oscillograph (Teckmar, George Washington Ltd., England) coupled with very simple pieces of equipment. A 60 ml surgical balloon, filled with water, was held against the participant’s throat by means of an adjustable elastic collar (14 g total). Each swallow induced a change in pressure in the balloon that was transmitted through a liquid pressure transducer to a strain gauge coupler, producing a large deflection of the pen of the oscillograph. A light headset, terminated on one side by a strain gauge resting on the participant’s cheek just in front of the ear, picked up the jaw movements associated with chewing. The signals arising from chewing movements were translated into graphic deflections on the second channel of the oscillograph. Figure 1 displays a segment of an edogram, showing ingestive responses during the intake of a few food units. The intake of each unit triggers a bout of chewing responses and then a swallow. A short pause takes place between successive ingestive bouts.

Fig. 1: The edogram is a simultaneous recording of chewing and swallowing activity during the intake of standardized food units.

The upper line represents three swallows (deep deflections indicated by arrows) following the chewing of successive food units. Smaller movements of the pen are artefacts due to movements of the skin during chewing. The lower line displays chewing bouts separated by pauses. Durations of a few elements of the sequence are indicated.

Edograms allowed a quantitative analysis of many intraprandial parameters: chewing time per food unit, number of chewing movements per food unit, rate of chewing movements, number of swallows per food unit, pause duration between two successive chewing bouts, drinking time, and number of swallows per drink. Such parameters were examined over the meal as a whole and at specific moments of the meal.
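To make these definitions concrete, the sketch below computes a few of the listed parameters from hypothetical timestamped chewing bouts and swallows; the original edograms were read from paper oscillograph traces, so this code is illustrative only:

```python
# Illustrative sketch (hypothetical event data): each food unit is described
# by the start/end times of its chewing bout (s), the number of chewing
# movements, and the time of the terminal swallow.
food_units = [
    {"chew_start": 0.0,  "chew_end": 7.8,  "n_chews": 10, "swallow_t": 8.1},
    {"chew_start": 10.2, "chew_end": 17.5, "n_chews": 9,  "swallow_t": 17.9},
    {"chew_start": 21.0, "chew_end": 29.4, "n_chews": 11, "swallow_t": 29.8},
]

chew_times = [u["chew_end"] - u["chew_start"] for u in food_units]           # s per unit
chew_rates = [u["n_chews"] / t for u, t in zip(food_units, chew_times)]      # chews/s
pauses = [b["chew_start"] - a["chew_end"] for a, b in zip(food_units, food_units[1:])]
eating_rate = len(food_units) / (food_units[-1]["swallow_t"] / 60)           # units/min

print(f"mean chewing time per unit: {sum(chew_times)/len(chew_times):.1f} s")
print(f"mean chewing rate: {sum(chew_rates)/len(chew_rates):.2f} chews/s")
print(f"mean intrameal pause: {sum(pauses)/len(pauses):.1f} s")
print(f"overall eating rate: {eating_rate:.1f} food units/min")
```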

In addition to highly standardized food stimuli, edograms required strict laboratory conditions in order to keep the recording equipment optimally responsive throughout the meals. Recording sessions took place at lunch time in a quiet room where participants were tested alone. At the beginning of test meals, participants were invited to sit at the dining table and equipped with the recording instruments. The experimenter made sure that the subjects were comfortable and that the recording instruments were in no way intrusive. The food stimuli were presented on a tray, in sufficient amounts to avoid ceiling effects. Within easy reach of the participant, a cup of water was placed on a postal scale from which water intake could be monitored. The oscillograph was placed on another table, invisible to the subjects. The experimenter stayed in the room during the tests but remained outside of the visual field of the subjects and avoided interfering in any way with the participant’s spontaneous ingestion. Meal size and duration were determined by the participant’s responses.

The edogram: how the microstructure of meals reflects appetite

Table 1 shows the main characteristics and outcomes of the published edogram studies.

Table 1 Effects of various experimental manipulations on the microstructure of meals as revealed by edogram studies.

Edographic studies addressed the influence of sensory factors on lunch intake [19, 21,22,23]. Standardized food units were presented in laboratory lunches either in single-flavor meals of diverse palatability levels, or in mixed-flavor meals. As expected, the individual hierarchy of palatability levels (as established by visual analog ratings obtained in prior sensory tests) directly influenced total meal intake and meal duration. In addition, the edograms showed how several parameters of the microstructure of the meals were affected: overall mealtime eating rate (number of food pieces per minute) increased with palatability, a phenomenon resulting from a decrease in chewing time (and number of chews) per food unit and, in persons with obesity, a shorter duration of intrameal pauses between two successive food units. By contrast, chewing rate (about 1.3 chews/s) remained stable across conditions. One swallow per food unit was observed regardless of palatability level. See Fig. 1.

A mixed-flavor condition (food units of five different flavors presented simultaneously) stimulated intake more than the most palatable single-flavor meal, with increased meal size and duration, faster eating rates, and decreased chewing time and number of chews per food unit [22]. These effects were observed both in normal-weight individuals and in persons with obesity [22].

In normal-weight participants only, changes in the eating patterns were noticed from the beginning to the end of meals: the eating rate decelerated due to both an increase in chewing activity per standard food unit and an increase in the duration of intrameal pauses. These changes were hypothesized to reflect the development of inhibitory signals, from initial hunger to satiation before the spontaneous, participant-determined termination of the meal. No such intrameal changes appeared in persons with obesity [22]. These observations were consistent with previous reports [16] and provided a quantitative characterization of the aspects of eating responses responsible for the variations in intraprandial eating rates described in earlier studies.

Increasing the duration of premeal deprivation also induced a stimulation of total intake at mealtime, illustrated by an increase in meal size and duration [23]. These effects were cumulative with those of palatability. By contrast, the microstructure of meals proved to be more sensitive to palatability than to deprivation levels: chewing and swallowing did not vary with deprivation over the whole meal [23]. However, a closer analysis of the edograms revealed an effect of deprivation at the beginning of meals, when higher deprivation levels induced shorter chewing times. These differences had disappeared by the end of the meals, just before satiation.

Mealtime intake of fluid was also studied via edograms [22, 23]. Mealtime drinking is important since dietary surveys have consistently established that most of the daily intake of fluid occurs at meal times [24, 25]. Edograms revealed that most prandial drinking occurs in the last quarter of meals and, to a lesser degree, just before the first bite of food. The amount of prandial drinking varied directly with meal size and duration and, consequently, increased during higher-palatability meals. In mixed-flavor meals, drinking also occurred between successive food pieces of different flavors.

The edogram had important limitations. The recording equipment limited its use to laboratory conditions, with a very narrow range of potential food stimuli. The number of food pieces ingested in the various edographic studies remained relatively small (10–30). Although this could represent a realistic lunch-time energy intake (up to 750 kcal), the meals were very short (less than 10 min) owing to the small amount of chewing required. Importantly, meals were smaller and shorter in people with obesity than in participants with normal weight. Under these conditions, the development of satiation cues remained limited, and the differences observed between participants of different weight status (although concordant with other results [16]) may be at least partly artefactual. In addition, the ethological validity of the laboratory equipped for the recording of edograms is questionable: very few of the numerous factors that affect free-living eating can be reproduced in the laboratory. In particular, social interactions were impossible.

Edograms and later studies of the microstructure of meals

Since the last edographic publications, the observations extracted from edograms have generally been confirmed by other approaches to measuring mealtime intake in humans. The scientific advances can be reviewed from two standpoints: firstly, the relationships of intrameal parameters with appetite and satiation, and their possible causal influence on the amount of food ingested at a particular eating event; and secondly, the idiosyncratic differences in eating styles between individuals, particularly in the context of normal weight, overweight, or obesity.

The microstructure of meals, from appetition to satiation

Studies of intrameal intake responses, using a variety of measuring instruments, confirmed the influence of palatability and deprivation on mealtime ingestive responses and the decrease of eating rates between the beginning and the end of ad libitum meals. The ingestion rate of small standardized solid food units (SFUs) retrieved from an opaque food dispenser was shown to decrease from the beginning to the end of meals, with a more rapid initial rate of eating after longer deprivation times and with increasing palatability of the SFUs [26]. Numerous experimental trials using the UEM [17, 18, 27,28,29,30] confirmed that both deprivation and palatability increase the linear coefficient of the cumulative intake curve, translating into faster eating rates. UEM studies also explored the role of a range of other factors liable to affect the quadratic coefficient, and hence intrameal changes in eating rate, such as physical state, gender, and the presence of obesity or an eating disorder. (UEM studies are discussed in another paper in this same IJO issue.)
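As a minimal sketch of this kind of curve analysis (the intake values below are invented, and numpy stands in for whatever software the cited studies actually used), a quadratic fit recovers the two coefficients discussed above:

```python
# Sketch only: fit the quadratic cumulative-intake model to hypothetical
# UEM-style data and read off the linear and quadratic coefficients.
import numpy as np

t = np.array([0, 1, 2, 3, 4, 5, 6, 7, 8], dtype=float)         # minutes from meal onset
intake = np.array([0, 55, 105, 150, 190, 225, 255, 280, 300.0])  # cumulative grams consumed

a, r0, c = np.polyfit(t, intake, deg=2)  # intake ≈ a*t**2 + r0*t + c
print(f"initial eating rate (linear coefficient): {r0:.1f} g/min")
print(f"quadratic coefficient (deceleration if negative): {a:.2f} g/min^2")
```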

Various methods were developed to obtain accurate measurements of variables like chewing activity, swallows, and bite size in order to investigate their specific relationships with appetite. Sipping liquid versus semisolid food through a straw led to larger bite sizes and longer oral processing with the semisolid food; under these conditions, 47% more liquid food than solid was required to reach the same level of satiation [31]. In video recorded meals, slower eating in response to differences in food texture facilitated lower energy intake [32, 33].

Forde et al. [34] analyzed video recordings to code separate bites, chews, and swallows, during experimental trials in which subjects ingested 50 g portions of 35 foods, at their usual rate. In agreement with the early report by Pierson and Le Magnen [20], texture properties appeared to have more effect than taste properties on ingestive responses. While number of chews, number of bites per portion, eating rate and oral exposure time varied considerably between foods, chewing rate was relatively constant, with an overall average of 1 chew/s. The average bite size was 7–8 g per bite. These observations are consistent with edographic studies in which standard food units (7 g) generated 1.3 chews/s [23]. Chewing rate in humans appears relatively stereotyped [35] and shows little sensitivity to the sensory or physical–chemical characteristics of the foods. The comparison of 35 foods varying in texture showed that smaller bite size and longer oral-sensory exposure time contributed to higher satiation for a given energy content [34].

The microstructure of meals as a causal influence on energy intake

The association between faster eating rates in response to increasing palatability/deprivation and larger meal sizes under those conditions raises the issue of causal effects. It has been hypothesized that fast eating rates not only reflect higher palatability or deprivation-induced stimulation, but can also, per se, facilitate overeating. Different mechanisms can contribute to such an effect. Fast eating allows more food to be ingested before satiation signals can be generated from the gastro-intestinal tract (gastric distention, etc.) [36]. By contrast, longer oro-sensory exposure could enhance satiating efficiency as a result of nutrient sensing in the mouth via the taste system [37]. Slower eating also results in a higher release of satiety hormones such as glucagon-like peptide-1 [38], peptide YY [38], and cholecystokinin [39], and in a stronger postmeal suppression of levels of the orexigenic hormone ghrelin [39]. SSS is a potent inhibition mechanism whereby the sensory characteristics of a food progressively lose their hedonic appeal as ingestion progresses [40, 41]. SSS is one of the very likely contributors to the “biological satiation curve” [16] and the quadratic coefficient of cumulative intake curves [18]. Eating at a slow rate also improves episodic memory for the meal and promotes satiety [42].

In order to demonstrate causation, experimental studies have manipulated the parameters of the microstructure of meals as independent variables and examined their influence on subjective appetite and intake during one eating episode (satiation effects) or following the end of an eating episode (satiety effects). Experimental manipulations requested participants to consume standardized food pieces (for example almonds, sandwiches, pasta, pizza rolls, etc.) and to vary either the number of chews per piece [39, 43, 44], chewing time [44, 45], duration of pauses between mouthfuls [29, 45], eating rate [46], or bite size [47, 48].

These studies demonstrated a significant influence of the parameters of the microstructure of meals on total intake in the same meal. A systematic review of 40 experimental manipulations [49] concluded that increases in oral processing time and chewing activity, obtained via a variety of techniques, enhance satiation and decrease mealtime intake, while post-intake sensations of hunger and desire to eat tend to decrease. Another systematic review and meta-analysis of 22 controlled laboratory-based interventions concluded that slower eating induces a significant reduction in food intake and provided evidence that effect size is sensitive to the size of change in eating rate, with larger decreases in eating rate producing larger decreases in food intake [36]. Interestingly, there was no [36] or little [49] effect of eating rate on hunger sensations at the end of the meal so that the lower intake obtained with slow eating did not induce weaker satiety after the meals.

Instructing people to eat more rapidly has the converse effect: normal-weight participants invited to “binge” under laboratory settings consumed more when a rapid eating rate was imposed versus a slower rate, in contrast with patients with bulimia nervosa whose binge intake was unaffected by eating rate [50].

Many experts have proposed strategies to help consumers or patients reduce their eating rate in everyday life in order to limit excess consumption [36, 51,52,53]. Beyond the individual level, it has been proposed that the design of functional foods for managing appetite and weight could use the demonstrated benefits of altering intrameal eating responses (by making longer chewing necessary, for example) [34, 54].

The microstructure of meals and idiosyncratic eating styles

Oral processing varies not only in response to the characteristics of foods but also under the influence of many stable or transient individual characteristics such as age and dental health [55]. According to Forde et al. [34], “Eating rate is not only a property of a food, but can also be considered as a property of a person.” The edographic studies identified eating rate as a stable individual feature across palatability conditions [21].

Studies using surface electromyography, which monitors the electrical activity of skeletal muscles, identified idiosyncratic response styles characterized, in particular, by a “naturally preferred eating duration” [55]. Total muscle effort and total number of chews are higher in normal-weight “long duration eaters” than in “short duration eaters.” Short duration eaters do not compensate for their reduced eating time by chewing more efficiently but swallow a less broken down bolus than long duration eaters. Longer eating duration results in more saliva incorporation into the bolus. The bolus properties at the moment of swallowing thus differ considerably between short versus long duration eaters [55].

Although these differences in preferred eating duration were reported in normal-weight individuals, the contribution of fast eating to weight gain is a legitimate concern.

Differences in the microstructure of meals and body weight status

A systematic review and meta-analysis of 23 observational studies in adults showed that BMI is higher in individuals who eat quickly than in those who eat slowly [56]. The reported difference amounted to 1.78 kg/m2, and the pooled odds ratio of eating quickly for the presence of obesity was 2.15 [56]. One obvious limitation of this work is its reliance on self-reported eating rates (objective data obtained with an eating monitor were available for only 46 of the 106,898 respondents).

As discussed above, objective laboratory observations also suggest that a faster eating rate, associated with shorter chewing or pausing, could contribute to higher energy intake, at least at the moment of the meal. Whether such short-term satiation/satiety effects could be potent enough to induce weight change over the long term depends on the mobilization of regulatory mechanisms. In recent years, the concept of energy regulation has undergone deep revisions whose outcome is a renewed emphasis on the factors affecting intake at mealtime [5, 57, 58]. In human societies, the daily number of meals and the moments when eating is possible or appropriate are determined largely by social constraints, thereby conferring a major role to the factors present at the moment of intake [58]. This is precisely what the edogram addressed.

Edograms were obtained from a small group of women with obesity [22] and revealed faster eating rates, lower chewing activity per standard food unit, and shorter intrameal pauses than in normal-weight individuals, particularly in conditions of increased palatability. In addition, the progressive deceleration in eating rate observed in the normal-weight participants did not occur in persons with obesity.

Convergent observations were reported in UEM studies. A laboratory study reported a higher initial eating rate in persons with obesity than in normal-weight subjects, but no difference in the rate of intrameal deceleration [30]. A modified UEM was used in a cafeteria study of ad libitum food intake and revealed that average bite size increased linearly with BMI [59]. Although the reported effect was small (0.20 g per BMI unit), this response could facilitate overeating since bite size directly affects total intake in a meal [48, 60].

If the microstructure of meals can act in a causal way to affect total intake, then it is legitimate to expect that modifying some of the critical responses could facilitate weight loss. As reviewed above, several experimental studies reported a decrease in intake following the manipulation of eating responses. Beyond these immediate effects, weight loss was reported in adults with overweight after a 15-week intervention in which they learned to eat more slowly, with longer pauses between bites, via the use of an “augmented fork” that provided vibrotactile feedback about eating rates [61]. Clinically significant reductions in body weight were observed 12 months after training adolescents with obesity to use a “Mandometer” (a portable weighing scale connected to a small computer) to monitor their eating rate and eat more slowly [51]. These observations raise the issue of how prandial responses could be monitored, and possibly manipulated, over the long term in free-living individuals in order to facilitate weight control and, possibly, contribute to the management of overweight/obesity.
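As a purely illustrative sketch of the general principle behind such feedback devices (the target curve, tolerance, and data below are assumptions, not the actual algorithms of the Mandometer or the augmented fork), a monitor could compare measured cumulative intake against a target decelerating curve and cue the eater when intake runs ahead of it:

```python
# Hypothetical eating-rate feedback: compare measured cumulative intake
# against an assumed target decelerating curve and emit a cue when the
# participant is eating faster than the target allows.

def target_intake(t_min, total_g=300.0, duration_min=12.0):
    """Target decelerating curve reaching total_g at duration_min (assumed shape)."""
    x = min(t_min / duration_min, 1.0)
    return total_g * (2 * x - x ** 2)  # quadratic curve that flattens toward meal end

def feedback(t_min, measured_g, tolerance_g=20.0):
    """Return a cue if measured intake exceeds the target by more than the tolerance."""
    if measured_g > target_intake(t_min) + tolerance_g:
        return "slow down"
    return "ok"

for t, grams in [(2, 120), (4, 170), (6, 230), (8, 260)]:
    print(f"t={t} min, eaten={grams} g -> {feedback(t, grams)}")
```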

Edograms in free-living persons: an unresolved challenge

Early edographic studies were carried out under laboratory conditions. The laboratory is a very precious environment where demanding scientific tests can be efficiently designed and carried out [62]. In the case of the edogram and related approaches, the laboratory gave scientists the opportunity to make precise measurements of micro-responses or to manipulate these micro-responses to influence intake in highly controlled conditions. Early experimental designs controlled for many influential factors that are known to affect intake in free-living situations: time of day, day of week, nature of food presented, food physical properties, and bite size, among many others.

An obvious crucial question remains: how does this knowledge generalize out of the laboratory? How do the parameters of the microstructure of meals respond to appetite conditions in everyday circumstances? The relevance of laboratory observations remains to be validated in free-living circumstances. For example, while meal size responds to the energy content of a preload similarly under laboratory and cafeteria conditions, meal duration is predominantly affected by environmental setting [63]. In Meiselman’s words [64], a greater comprehension of human eating behavior requires the precision of a laboratory combined with the realism of a social setting.

Various efforts have been made over the years to take edograms and other objective measurement tools out of the laboratory and address spontaneous ad libitum mealtime responses in free-living individuals. The “oral sensor” developed by Eliot Stellar promised to be an unobtrusive and objective method for measuring all meals, snacks, and nibbling throughout the day [65]. Unfortunately, it never developed beyond an original prototype. The “Mandometer” [51] and the “augmented fork” [61] have been used to monitor ad libitum intake. An algorithm has recently been proposed to analyze cumulative intake data obtained from a UEM recording of unrestricted eating at a cafeteria meal [66], with prospects for use in clinical settings or in free-living populations. Such a device could be used to investigate the effects of monitored changes in eating rates (or even anorexigenic drugs) on intake and weight control.

Teams of engineers have achieved significant improvements over the original laboratory edogram and, through the use of portable sensors, have recorded intake responses at mealtime in free-living subjects [67]. These developments are addressed in other contributions in this Special Issue.

The microstructure of meals in the 21st century obesogenic world

In the present worldwide struggle against the rising epidemic of obesity and intake-associated metabolic diseases, it seems more important than ever to understand what the parameters of the microstructure of meals tell us about the determinants of intake in human consumers. The everyday circumstances of food intake are extremely complex. Surveys using the “Weekly Food Diary” method illustrated the influence of a myriad of compensated (hunger, stomach filling, etc.) or uncompensated (time of day, presence of others, etc.) factors that exert a significant but minute effect on intake [4]. While their individual influence may be modest, their combined impact is powerful. The eating occasions of today occur in more diverse circumstances than did the traditional eating pattern composed of a number of “regular” meals. The present situation suggests more or less constant nibbling: a survey of food intake in America concluded that “adults in the US eat around the clock” [9]. This evolution of eating practices has consequences for appetite: intake is likely to be affected by concurrent distraction, for example by the use of media or smartphone at meal times [68,69,70].

Nowadays a large part of daily intake takes place not only when distracting stimuli are present at meal times, but more generally when people are doing just about anything else (driving, answering emails, working, etc.) and happen to be consuming food at the same moment, i.e., when the main activity is not eating [71, 72]. As a consequence, the number of relevant influences is likely to be even higher today than the many already included in de Castro’s model [4]. Taking the edographic apparatus outside of the lab means not only developing reliable instruments that can record eating activities when a large variety of foods are selected, but also developing instruments that remain reliable under a large variety of circumstances. The edograms of tomorrow may look more like the 24 h cumulative intake curves recorded in laboratory animals [73] than like recordings of a single meal if all relevant aspects of food intake are to be considered.

The science of the 21st century benefits from the spectacular development of research tools in many areas. We are now able to identify a large number of hormonal or neural signals from the gastro-intestinal tract and assess their action on brain mechanisms [74]. The brain itself is now under direct scrutiny thanks to imaging techniques that have already added much to our knowledge of structures and functions involved in the stimulation and inhibition of responses to food. For example, the identification of the “hungry brain” [75] and its exaggerated responses to visual food stimulation after weight loss is a spectacular example of the advances made possible by the progress of scientific methods since the early, rudimentary but clever attempts to identify brain appetite centers in the mid-20th century [76].

Other recently developed methods are available to take the actual measurement of human food intake into the 21st century: almost everyone is now equipped with a smartphone that can provide rich information about the circumstances of the various daily eating occasions (time of day, day of the week, location), which can be complemented using the phone camera to obtain pictures of settings, company, foods on the plate, and leftovers. These instruments can be connected with food composition tables capable of computing the energy and nutrient contents of the meals. Smartphone cameras are already being used to obtain and analyze pictures of free-living food intake [77, 78]. Further technical refinements could complement these data with wearable sensors capable of providing information about chewing, swallowing, oral processing time, and other potentially sensitive parameters, as other contributions to this Special Issue illustrate. Concurrent measures of relevant physiological parameters such as body temperature or glycemia (unobtrusive continuous glucose monitoring systems are now available for use by patients with diabetes) can be added. Simultaneous measurements of energy expenditure (metabolic and behavioral), which appears to be the key determinant of a “biological drive to eat” recently highlighted by Blundell et al. [79], may be of special importance. Algorithms can be developed to analyze the intake data and, hopefully, sort out the critical factors or combinations of factors that affect intake in various circumstances or in different populations.
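As a minimal sketch of how such data streams might be combined (all values, column names, and sensor outputs below are hypothetical), per-meal sensor features could be aligned with smartphone meal logs before any further analysis:

```python
# Hypothetical sketch: align wearable chew/swallow features with smartphone
# meal-log entries recorded at nearby times, using pandas.
import pandas as pd

meal_log = pd.DataFrame({
    "time": pd.to_datetime(["2024-05-02 12:31", "2024-05-02 19:05"]),
    "location": ["office", "home"],
    "company": ["alone", "family"],
})
sensor_features = pd.DataFrame({
    "time": pd.to_datetime(["2024-05-02 12:33", "2024-05-02 19:02"]),
    "n_chews": [412, 655],
    "mean_pause_s": [2.1, 3.4],
    "eating_rate_g_min": [38.0, 27.5],
})

# Match each logged eating occasion to the nearest sensor-detected meal
# within a 10-minute window.
merged = pd.merge_asof(
    meal_log.sort_values("time"),
    sensor_features.sort_values("time"),
    on="time",
    direction="nearest",
    tolerance=pd.Timedelta("10min"),
)
print(merged)
```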

The combined use of these tools will allow a very fine-grained quantitative analysis of ingestive responses at the time of eating and open an unprecedented perspective on the status of the numerous factors contributing to appetition and satiation in free-living humans [80]. Hopefully the emerging picture will suggest efficient levers to modify intake responses and improve nutritional status and weight control.