Introduction

It is currently debated whether morbidity in European populations generally remains stable across generations or whether there are trends towards compression or expansion of morbidity [1,2,3]. As morbidity trends can hardly be measured directly, various measures of healthy life expectancy (HLE) are used as indicators. Already today, healthy life expectancy varies widely across European countries, with Germany lying below the European average [4]. Analyses comparing German birth cohorts from 1911 to 1926 and 1917 to 1932 found a decrease in healthy life expectancy from the older to the younger cohort [5]. While this indicates an expansion of morbidity, which, combined with declining birth rates, would pose a substantial organizational and financial challenge to modern societies, the underlying mechanisms are still incompletely understood. As a consequence, it remains unclear whether the potential expansion of morbidity in Germany between 1911 and 1932 should be seen as part of a continued trend accompanying the rise in life expectancy throughout the twentieth century, or whether historical developments interfered with these trends. For example, the expansion of morbidity in the first third of the twentieth century may have resulted from progress in the treatment of formerly fatal diseases without progress in preventing their onset ("compression of mortality") [6]. Both quantitative and qualitative trend changes further along the twentieth century are conceivable, such as an intensified expansion of morbidity due to adverse political and economic conditions related to World War II on the one hand, or a compression of morbidity due to progress in prevention and postponed disease onset on the other.

Explanations for morbidity trends have also drawn on the critical and sensitive periods model of life course epidemiology [7]. The model posits that there are periods in life that are most relevant for the development of health deficits in older age, such as gestation or early life. During a critical period, the developing organism adapts especially readily to its environment. From a biological perspective, this may result in epigenetic changes, such as an increased susceptibility to inflammatory processes, which contribute to allostatic load in late life. From a social-behavioral perspective, lasting behavioral patterns (e.g. eating habits) are developed in these periods of life [8, 9]. Sensitive periods, like critical periods, are times of rapid adaptation of the organism to its environment, but in contrast to critical periods, changes during sensitive periods are less likely to be irreversible. Although critical and sensitive developmental periods are not limited to gestation and early childhood, and may also vary depending on the outcome of interest, it is uncontroversial that gestation and early childhood represent some of the most critical developmental periods over the life course [7, 10, 11].

When testing the critical period model in European countries, it is useful to keep in mind that older adults in Europe are heterogeneous regarding their life experiences. The experience of children born during or shortly after World War II differed dramatically from that of children born earlier or later with regard to trauma from migration and loss, famine, and poor living conditions in many European countries [12]. Some of the most severe famines of the twentieth century in Europe included those in the Soviet Union during the siege of Leningrad in 1941–1944, in Greece during the German occupation with a peak in 1941–1942 [8], and the Dutch Hunger Winter in the Western Netherlands caused by a blockade by the German army in 1944–1945 [13]. In Germany, the nutritional situation did not deteriorate until after the end of World War II, when the formerly centralized food production and distribution system collapsed, leading to a severe food crisis [13]. The average energy intake per person in Germany, which had been kept at about 2500 calories per day until 1944, dropped to 2000 calories in spring 1945 and subsequently to 1550 calories, further decreasing to its lowest level of around 1050–1250 calories in 1946. Thereafter, average official rations remained at about 1550 calories per day [14]. The nutritional situation improved markedly after June 1948 with a currency reform accompanied by a good harvest and the uptake of the Marshall Plan, with average energy intake rising back to over 1800 calories [8]. For comparison, the recommended daily energy intake at the lowest physical activity level is at least 2450 (1950) calories for 30–59-year-old men (women) with population-average height and weight; higher physical activity levels, pregnancy, lactation and growth elicit higher energy requirements [15]. Apart from the food crisis, many other structural and societal challenges were associated with this early reconstruction period on the way to democracy, for example the arrival of almost 10 million refugees by October 1946 in the four occupation zones that later formed West and East Germany.

In sum, it remains unclear how much of the currently observed association of health status with age [16] is actually due to age effects and how much is contributed by specific cohort effects.

Adults born in Germany between 1937 and 1950, who have only recently reached retirement age, include cohorts that were relatively well supplied during their critical developmental age before and during World War II (i.e. up to June 1945). They also comprise cohorts that were heavily undersupplied during their critical developmental age in the early reconstruction and food crisis (ERFC) period (June 1945 to June 1948), as well as the again well-supplied post currency reform cohorts born after 1948. These differences in exposure to unique circumstances during critical developmental age [12] make them specifically interesting for investigating questions on future health trends in older adults in Germany.

Following the critical period model, we hypothesize that the cohort which was in critical developmental age during the ERFC period has, on average, a worse age-specific health status in older age than a birth cohort whose critical developmental age preceded this period. Secondly, we hypothesize that cohorts with critical developmental age before and after this period are, on average, comparable regarding their age-specific health status.

The objective of this study was to compare the health status of older adults aged 65–71 years born before, during, and after the early reconstruction and food crisis period after World War II in Germany, adjusting for later-life demographic and socio-economic characteristics and health behaviors.

Methods

Study design, participants, and data collection procedures

Data for this study originate from two independent assessments of participants from the KORA (Cooperative Health Research in the Region of Augsburg)-Age study in Southern Germany. Participants for the KORA-Age study were drawn from the population-representative samples of four surveys conducted between 1984 and 2001 in the city of Augsburg and two surrounding counties: the first three of these surveys were conducted in 1984/85 (S1), 1989/90 (S2) and 1994/95 (S3) as part of the WHO MONICA (Monitoring of Trends and Determinants in Cardiovascular Diseases) project. In 1999/2001, after the MONICA project had officially concluded, an additional survey (S4) using the same population-representative sampling mechanisms was conducted under the name of KORA by the Helmholtz Zentrum München. For the KORA-Age baseline assessment in 2008, all former MONICA/KORA participants aged 65 years and older (i.e. born 1943 or earlier) were invited. In 2015, a younger enrichment sample (all former MONICA/KORA participants aged 65–71 years in 2015, i.e. born 1944–1950) was added to the KORA-Age study population.

For the KORA-Age assessment in 2008, of the 5990 eligible former MONICA/KORA participants, 4123 persons (response rate: 68.8%) completed a self-administered health questionnaire and participated in a structured telephone interview. For the 2015 enrichment sample, 1929 former MONICA/KORA participants born between 1944 and 1950 were eligible. Of these, 1457 participated in the structured telephone interview and returned the paper-based questionnaire (response rate: 75.5%).

This paper is based on all KORA-Age participants who were aged 65–71 years either in 2008 or in 2015 and thus born between 1937 and 1950.

For the main analysis of this paper we included the sub-group of participants for whom information on place of birth was available, effectively excluding participants who may have been born outside of Germany and for whom exposure to the ERFC period during critical developmental age could thus not be ascertained. Information on place of birth for former MONICA S1 (1984/85), S2 (1989/90) and S3 (1994/95) participants was derived from the following question included in these surveys: "How long have you lived at your current place of residence?". Only participants who indicated the response option "since birth" were included in the main analysis. In the MONICA/KORA survey S4 (1999/2001) this question was not asked, but information on place of birth could be derived from the question: "Were you born within the current defined borders of Germany?". From this survey, only participants who indicated the response option "yes" were included in the main analysis. The larger data set, which also comprised participants for whom birth in Germany was possible but could not be ascertained, was additionally used for sensitivity analyses.

Further details about study design, sampling method, data collection and response rates for the MONICA/KORA and KORA-Age studies can be found elsewhere [17,18,19]. A flow chart of participant recruitment for this analysis can be found in Online Resource 1.

Approval for KORA-Age was obtained from the Ethics Committee of the Bavarian Medical Association (No. 08064). Written informed consent was obtained from all participants.

With study participants aged 65 years and older in 2008 and the enrichment sample aged 65–71 years in 2015, KORA-Age offers the first opportunity in Germany to simultaneously compare health status in older age across the pre-war and war, early reconstruction and food crisis, and post currency reform birth cohorts.

Health status

To measure health status, we constructed a Frailty Index (FI) following established methods, using deficit variables collected in both relevant KORA-Age waves. Deficits that are potential candidates to enter an FI include diseases, measures of functioning, and (pre-)clinical signs and symptoms [20].

This KORA-Age FI includes in total 33 items, covering 10 diseases, 13 measures of functioning and 10 signs and symptoms. Details on the FI item selection process can be found elsewhere [21, 22]. An updated list of included FI items and their cut-offs for deficit definition can be found in Online Resource 2.

The FI for a person is calculated as the number of person-specific deficits divided by the total number of listed deficits. The resulting FI scores range from 0 (= no deficits present) to 1 (= all deficits present). If a participant scored missing on one or more of the deficit items, the denominator of the FI was reduced accordingly. If information on more than 20% of the FI items was missing for a participant, the FI value was set to missing [23].
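
For illustration, this calculation can be expressed as a minimal R sketch (the function and variable names are ours, not part of the study code; "deficits" is assumed to be a participant-by-item matrix coded 1 = deficit present, 0 = absent, NA = missing):

    # Illustrative FI calculation for a participant-by-item deficit matrix.
    frailty_index <- function(deficits, max_missing = 0.20) {
      n_items   <- ncol(deficits)                  # total number of listed deficits
      n_missing <- rowSums(is.na(deficits))        # missing items per participant
      n_present <- rowSums(deficits, na.rm = TRUE) # deficits present per participant
      fi <- n_present / (n_items - n_missing)      # denominator reduced for missing items
      fi[n_missing / n_items > max_missing] <- NA  # FI set to missing if > 20% items missing
      fi
    }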

Exposures: age, cohort, and sex

Three cohorts were defined based on being in critical developmental age during the early reconstruction and food crisis period in Germany, which lasted from June 1945 until June 1948 [14]. Following the literature, critical developmental age was defined as comprising the prenatal 9-month gestation period and the time from birth until the age of 2 years [10, 11].

Time of birth was measured by birth quarter of the year (1, 2, 3, or 4), and the gestation period was defined as the quarter of birth plus the two preceding quarters. Exact birth dates were unavailable for analysis due to data protection considerations. Time of birth in quarter years was calculated based on age at reference date (December 31st 2008 for those born ≤ 1943 and December 31st 2015 for those born 1944–1950). In combination with information on the exact quarter of the respective birth year, time of birth was calculated as: year of data collection − age at reference date − (1 − 0.25 × quarter of birth).
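
Transcribed into R, the calculation reads as follows (a direct transcription of the formula above; variable names are ours, shown with a worked example):

    # Time of birth in quarter years, as defined above.
    time_of_birth <- survey_year - age_at_reference - (1 - 0.25 * birth_quarter)
    # Example: aged 65 on December 31st 2008 and born in quarter 3:
    # 2008 - 65 - (1 - 0.25 * 3) = 1942.75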

Thus, participants from the pre-war and war (PWW) cohort were older than 2 years (and up to 8 years old), i.e. already past critical developmental age, at the beginning of the early reconstruction and food crisis period in June 1945; this cohort includes all participants born between Q1 1937 and Q2 1943.

The early reconstruction and food crisis (ERFC) cohort was defined as those participants for whom the ERFC period occurred during gestation or the first 2 years of life, including participants born between Q3 in 1943 and Q1 in 1949.

The post currency reform (PCR) cohort was defined as those who were conceived and born after the currency reform in June 1948, which marked the end of the ERFC period. This cohort thus included participants born in Q2 in 1949 and thereafter (see Fig. 1).
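
A minimal sketch of this categorization in R, assuming each birth quarter is encoded by its end point in calendar years (e.g. Q2 1943 = 1943.5; this encoding is ours, chosen for readability, and not necessarily the exact study coding):

    # Assign birth cohorts from birth year and quarter; cut points follow Fig. 1.
    quarter_end <- birth_year + 0.25 * birth_quarter     # e.g. Q2 1943 -> 1943.5
    cohort <- cut(quarter_end,
                  breaks = c(1937.00, 1943.50, 1949.25, Inf),  # PWW | ERFC | PCR
                  labels = c("PWW", "ERFC", "PCR"),
                  include.lowest = TRUE)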

Fig. 1 Graphical display of the quarter-based birth cohort categorization for the main analysis

This exposure definition did not imply a minimum exposure time. It thus covered both participants who may have been exposed to the ERFC period for only one day of their critical developmental age (either the first day of gestation or the last day of their second year of life) and participants who were exposed to the ERFC period throughout gestation and their first 2 years of life. We therefore developed several alternative exposure operationalizations for sensitivity analyses (sketched below): a minimum exposure of 3 months, a minimum exposure of 6 months, and a subdivision of exposure into three dummy variables (exposure during gestation, during the first year of life, and during the second year of life). For an additional sensitivity analysis, the PWW cohort was subdivided into a war cohort (i.e. participants born between Q4 1937 and Q2 1943, who had been exposed during their critical developmental age to the war period starting in Q4 1939, but not to the ERFC period) and a pre-war cohort (i.e. those born between Q1 and Q3 1937, who had completed their critical developmental age before the start of World War II). For a graphical display of these categorizations see Online Resource 3.
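
Under the same illustrative quarter-end encoding as above, the alternative operationalizations can be sketched as interval overlaps between the ERFC period (mid-1945 to mid-1948) and each developmental window (all names are ours; 3 and 6 months correspond to 0.25 and 0.5 years):

    # Overlap (in years) between intervals [a1, a2] and [b1, b2], vectorized.
    overlap <- function(a1, a2, b1, b2) pmax(0, pmin(a2, b2) - pmax(a1, b1))

    erfc_start <- 1945.5; erfc_end <- 1948.5        # June 1945 to June 1948
    b <- quarter_end                                # end of birth quarter (see above)
    gestation <- overlap(b - 0.75, b, erfc_start, erfc_end)   # 9 months before birth
    year1     <- overlap(b, b + 1, erfc_start, erfc_end)      # first year of life
    year2     <- overlap(b + 1, b + 2, erfc_start, erfc_end)  # second year of life

    exposure   <- gestation + year1 + year2
    exposed_3m <- exposure >= 0.25                  # minimum exposure of 3 months
    exposed_6m <- exposure >= 0.50                  # minimum exposure of 6 months
    # Dummy variables capturing age at exposure:
    dummy_gestation <- gestation > 0
    dummy_year1     <- year1 > 0
    dummy_year2     <- year2 > 0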

Age at data collection was used as continuous variable centered around the mean (68 years) [21].

Covariate selection

Covariates were selected based on their respective associations with the exposure of interest, i.e. cohort membership, in either the sensitivity or main analysis data set, in separate multinomial generalized linear models controlling for age.

We tested adult-life demographic (marital status) and socioeconomic (education) variables as well as health risk behaviors (physical activity, smoking status, alcohol intake and body mass index [BMI]) as potential predictors of cohort membership [9]. These variables were selected as they had also been shown previously to be associated with health status as measured by the FI or one of its components such as functioning, age-related diseases or signs and symptoms [24,25,26] and might thus be able to explain part of the association between cohort membership and older-age health status.
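
One such covariate screening model can be sketched with the nnet package (an illustration; "dat", the variable names, and the use of a likelihood ratio test are our assumptions):

    library(nnet)
    # Does a candidate covariate (here: smoking status) predict cohort
    # membership once age is controlled for?
    fit0 <- multinom(cohort ~ age, data = dat)
    fit1 <- multinom(cohort ~ age + smoking_status, data = dat)
    anova(fit0, fit1)   # likelihood ratio test between the nested models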

Education was measured as the combination of years spent at school and years spent in vocational training (resulting values: 8, 10, 11, 12, 13, 15, 16, or 17 years). We defined a maximum of eight educational years as "low education", 10 or 11 years as "lower intermediate education", 12 or 13 years as "higher intermediate education" and 15–17 years as "high education". Information on this variable was carried over from earlier MONICA/KORA surveys and was assumed to have remained stable throughout adult life. All other covariates were measured in 2008 (for participants born ≤ 1943) or 2015 (for participants born 1944–1950).

Marital status was categorized as single, married, divorced, and widowed.

Physical activity (PA) was estimated by means of two separate four-category interview questions asking about the time per week spent on sports activities during leisure time (including cycling) in summer and in winter (0, < 1, 1–2, and > 2 h of sport/week). The winter and summer responses were combined into one variable of leisure-time physical activity: "no activity" was defined as less than 1 h of sport in summer or winter; "low activity" as irregular participation in sports for about 1 h per week in at least one season; "moderate activity" as regular participation in sports for about 1 h per week in at least one season; and "high activity" as regular sports in summer and winter for more than 2 h per week in both seasons [27]. Participants were classified according to their smoking habits as smokers, ex-smokers, and never-smokers, and according to their alcohol consumption frequency as "daily or nearly daily", "several times a week", "once a week", "less than once a week" or "hardly ever or never".

BMI was categorized into four groups according to World Health Organization (WHO) thresholds: underweight or normal weight (BMI < 25 kg/m²), overweight (25 ≤ BMI < 30 kg/m²), obesity grade I (30 ≤ BMI < 35 kg/m²), and obesity grade II or III (BMI ≥ 35 kg/m²) [28].
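
In code, this amounts to a simple cut on the continuous BMI variable (a sketch; "bmi" and the labels are ours):

    # WHO-based BMI categories (kg/m²); intervals are [lower, upper).
    bmi_cat <- cut(bmi,
                   breaks = c(-Inf, 25, 30, 35, Inf),
                   right  = FALSE,
                   labels = c("under-/normal weight", "overweight",
                              "obesity grade I", "obesity grade II/III"))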

Descriptive statistics

The three cohorts (PWW, ERFC, and PCR) were compared with regard to their categorical covariate characteristics in the main and sensitivity analysis data sets using absolute and relative frequencies and a chi-squared test for differences between cohorts. For continuous variables, means and standard deviations were calculated for each cohort and compared by a Kruskal–Wallis test; an R sketch of these comparisons follows below. Additionally, the distribution of single health deficit items across cohorts was described using absolute and relative frequencies.
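
In R, these comparisons correspond to calls such as the following (illustrative variable names):

    # Categorical covariates: chi-squared test across the three cohorts.
    chisq.test(table(dat$cohort, dat$smoking_status))
    # Continuous variables, e.g. the Frailty Index: Kruskal-Wallis test.
    kruskal.test(fi ~ cohort, data = dat)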

Time-lag analysis

Age-specific mean FI scores with confidence intervals were graphically presented separately for the three cohorts stratified by sex. Differences in age- and sex-specific mean FI scores between cohorts were assessed visually.
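
A plot of this kind can be sketched with ggplot2 (a sketch under our naming assumptions; mean_cl_normal additionally requires the Hmisc package):

    library(ggplot2)
    # Age- and sex-specific mean FI values with confidence intervals, by cohort.
    ggplot(dat, aes(x = age, y = fi, colour = cohort)) +
      stat_summary(fun.data = mean_cl_normal, geom = "pointrange") +
      facet_wrap(~ sex)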

Regression model

The effects of cohort membership, age, and sex on the Frailty Index in adults aged 65–71 years were analyzed using negative binomial generalized linear models (GLMs) with a log link, the number of present health deficits as outcome, and the number of possible deficits as offset term. The resulting effect estimates are presented as Frailty Index ratios. Additional covariates were added to the model in a second step. We used a complete case analysis. All analyses were computed using RStudio Version 1.1.423 [29]. For all analyses, the significance level was set to 0.05.
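
A minimal sketch of the main model using the MASS package (the number of possible, i.e. non-missing, deficit items enters as a log offset, and exponentiated coefficients yield the Frailty Index ratios; data frame and variable names are ours):

    library(MASS)
    # Negative binomial GLM with log link; the offset accounts for the
    # varying number of non-missing FI items per participant.
    fit <- glm.nb(n_deficits ~ cohort + age_centered + sex +
                    offset(log(n_items_available)), data = dat)
    exp(cbind(FI_ratio = coef(fit), confint(fit)))  # ratios with confidence limits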

Sensitivity analyses

To verify whether cohort effects were actually due to differences between cohorts rather than to period effects (i.e. measurement in 2008 or 2015), and whether the linearity assumption for cohort effects was reasonable, we conducted two sensitivity analyses using generalized linear mixed models (GLMMs): one with cohort membership as a random effect, and one with cohort membership as a random effect nested in period. An additional sensitivity analysis was run with the PWW cohort further divided into a pre-war and a war cohort. Due to sample size considerations, all abovementioned models were fitted in the larger data set including those participants for whom place of birth in Germany could not be ascertained. The GLMMs were also repeated in the (smaller) main analysis data set, but without adjustment for additional covariates due to power considerations.
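
The corresponding GLMMs can be sketched with the lme4 package (a sketch under the same naming assumptions as above, not the exact study specification):

    library(lme4)
    # Cohort membership as random intercept:
    m1 <- glmer.nb(n_deficits ~ age_centered + sex +
                     offset(log(n_items_available)) + (1 | cohort), data = dat)
    # Cohort membership nested in measurement period (2008 vs. 2015):
    m2 <- glmer.nb(n_deficits ~ age_centered + sex +
                     offset(log(n_items_available)) + (1 | period / cohort),
                   data = dat)
    anova(m1, m2)   # does allowing for period effects improve model fit?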

Furthermore, we repeated the main analysis in both the smaller and the larger data set using the three abovementioned alternative exposure operationalizations: a minimum exposure of 3 and 6 months, respectively, and three dummy variables capturing age at exposure more precisely (gestation, first year of life, second year of life).

Results

Study participants

FI values were available for 1800 PWW cohort participants (48% male), 1168 ERFC cohort participants (47% male) and for 407 (47% male) PCR cohort participants (Online Resource 1). Of these, 1774 PWW cohort participants (48% male), 1149 ERFC cohort participants (47% male) and 404 (47% male) PCR cohort participants had complete covariate information. Place of birth in Germany could be ascertained for 590 PWW, 475 ERFC and 171 PCR cohort participants. Thus, the sample size for the main analysis was 1236 and the sample size for sensitivity analysis was 3327 participants.

Descriptive statistics

The three cohorts (PWW, ERFC, and PCR) differed significantly in mean FI values, sex, mean age, marital status, education, BMI, and smoking status. No differences were found with regard to physical activity and alcohol consumption (Table 1). For descriptive statistics on the complete sample (including participants for whom place of birth could not be ascertained) see Online Resource 4. At the level of single deficit items and after adjustment for age, cohort differences appeared to be driven by significant differences in deficits concerning arising, dressing, walking, taking stairs, stooping, lung and joint diseases, anxiety, fatigue and pain (Online Resource 5).

Table 1 Descriptive statistics by birth cohort for the smaller data set including only participants for whom place of birth in Germany could be ascertained (n = 1236)

Time-lag analysis

Descriptive plots revealed that age- and sex-specific FI values were higher for women than for men in the PWW and ERFC cohorts. For participants from the PCR cohort there was no clear visual difference. In addition, age- and sex-specific FI values were slightly higher for the ERFC than for the PWW cohort, with the largest visual discrepancy for women born in 1944 and 1945 and, to a lesser extent, for men born 1944–1947 (see Fig. 2 and Online Resource 6).

Fig. 2 Mean sex- and age-specific Frailty Index values stratified by cohort

Covariate selection

Adjusted for age, cohorts differed significantly with regard to sex, education, smoking status, marital status and BMI, but not with regard to alcohol consumption and physical activity.

Regression models

In the restricted GLM, the Frailty Index ratio was significantly higher for the ERFC cohort (ratio 1.14, CL [1.05, 1.23]) than for the PWW cohort. The PCR cohort did not differ significantly from the PWW cohort (ratio 1.04, CL [0.92, 1.18]). Higher age (ratio per additional life year 1.04; CL [1.02, 1.06]) and female sex (ratio 1.13; CL [1.05, 1.22]) were also independently and significantly associated with higher FI values.

When adjusting for the pre-selected covariates associated with cohort membership, both the age and ERFC cohort effects remained unchanged, and the sex effect estimate was reduced to 1.09 (CL [1.00, 1.17]). For more details, see Table 2.

Table 2 Resulting Frailty Index ratios from the negative binomial models with restricted and comprehensive covariate adjustment sets, including all 65–71-year-old participants with complete covariate information for whom birth in Germany could be ascertained, measured in 2008 or 2015 (depending on cohort membership) (n = 1236)

Sensitivity analyses

Taking into account potential period random effects in addition to random effects for cohort membership did not significantly improve model fit as compared to a model with only cohort membership as a random effect. Also, there was no considerable change in covariate effect estimates when comparing the GLMs using cohort membership as a fixed effect and the GLMMs using cohort membership as a random effect. Thus, we retained the GLMs as our final models.

When using the variables with a minimum exposure of 3 or 6 months, respectively, effect estimates for the ERFC cohort as compared to the PWW cohort remained significant and comparable in size (1.15, CL [1.07, 1.25] and 1.13, CL [1.04, 1.23], respectively). In the overall sample including participants for whom birth in Germany could not be ascertained, the ERFC cohort effect was reduced to 1.06 (CL [1.01, 1.11]) but remained significant. Covariate effect estimates generally remained stable in size, significance and direction in all sensitivity analyses, with the exception of education (stronger protective effects in the main analysis sample) and being widowed (a significant risk factor only in the larger overall KORA-Age sample).

When we used the exposure dummy variables representing age at exposure (ERFC period in utero, ERFC period in the first year of life, ERFC period in the second year of life), only exposure in the first year of life showed a significant effect, and only in the overall sample including participants for whom birth in Germany could not be ascertained (ratio 1.12, CL [1.02, 1.22]). For more details, see Table 3 and Online Resources 7 and 8.

Table 3 Resulting Frailty Index ratios from negative binomial models with restricted and comprehensive covariate adjustment sets, including participants with complete covariate information measured in 2008 or 2015 (depending on birth cohort): sensitivity analyses using different parametrizations of ERFC period exposure in the smaller data set including only participants for whom place of birth in Germany could be ascertained (n = 1236)

Discussion

Our analysis is among the first to explore the effects of early-life exposure to the German early reconstruction and food crisis period after World War II on health status in older age. We found that, when taking into account both age and cohort effects on health status in adults aged 65–71 years, age remained the most influential factor. On the one hand, co-occurrence of the ERFC period with the gestation period or the first 2 years of life further increased the number of accumulated health deficits in older age as compared to co-occurrence of the food crisis with ages above 24 months. These differences between cohorts could not be explained by differences in socio-economic or socio-demographic status or health behaviors in older age, indicating a direct effect of the co-occurrence of critical developmental age and the ERFC period on older-age health. On the other hand, the accumulated health deficit levels of participants born after the currency reform in June 1948 did not differ significantly from those of participants who were already older than 24 months during the early reconstruction and food crisis. Our results were significant independently of the duration of exposure to the ERFC period during critical developmental age. The ERFC period effect was stronger in the main analysis sample, which comprised only those participants for whom place of birth in Germany could be ascertained.

For health projections in Germany, this may indicate that an expansion of morbidity, which has been reported earlier for the 1911–1932 birth cohorts [5] and which our analysis additionally suggests for cohorts born between 1937 and the first quarter of 1949 (the PWW and ERFC cohorts), cannot simply be extrapolated to post-war generations (such as the PCR cohort). The determinants underlying the two instances of morbidity expansion may also differ: whereas the first expansion of morbidity from 1911 to 1932 may have resulted from increased life expectancy combined with stable disease onset, the results of our analysis may reflect a shift towards earlier disease onset from the 1937–1943 to the 1943–1949 cohorts.

Plots of age- and sex-specific mean FI values by cohort suggested that the detrimental effect of being born in the aftermath of World War II on health status was especially pronounced for women born in 1944 and 1945. Potential explanations are that these cohorts suffered some of the longest exposures to the food crisis (i.e. from birth or age 1 until age 3 or age 4, respectively). In addition, their birth years coincide with the peak of reported hunger prevalence in the German population (close to 25% in 1945) [8]. Although these birth cohorts were not exposed to the food crisis during gestation, our results are in line with the results of previous research suggesting that undersupply in infancy and childhood has a higher impact on health in older age than undersupply in utero [10].

At first sight, these findings seem to contradict the literature on fetal programming, i.e. the negative health effects of adverse exposures during gestation [13, 30]. This apparent discrepancy may be explained by differential fertility in parents and differential mortality in children [31]. Studies investigating older-age outcomes of war-related in utero famine exposure during shorter famines, such as the 6-month Dutch Hunger Winter of 1944/45, where no impact on fertility is assumed, do report significant effects of in utero exposure [13]. These effects are less stable when there is reason to believe that fertility effects may have occurred (as in Greece) [10]. For the ERFC period in Germany, fertility effects have been ascertained: the adverse living conditions in the three-year aftermath of World War II resulted in lower birth rates, higher numbers of miscarriages and increased infant mortality [32]. Thus, it can be expected that the 2008 and 2015 survivors from the early reconstruction and food crisis (ERFC) cohort are generally healthier and more resilient than the average pre-war and war (PWW) or post currency reform (PCR) cohorts. Consequently, our results on the effect of the post-World War II turbulence on older-age health may even be biased downwards [10, 33], which may have produced the non-significant effect of in utero exposure. Selection effects, especially the so-called "male vulnerability" (boys seem to be more strongly affected by adverse early-life conditions) [8] and the male-female health-survival paradox (men die earlier in a better health state whereas women live longer, though in worse health) [34], may also explain why we found outliers only in women, and not in men, born 1944–1945.

A second reason why our results may be biased downwards is that during war-induced famines only a fraction of the population is exposed to adverse living conditions, with populations in large urban areas usually being more affected than those in smaller cities such as Augsburg and its surrounding rural counties. Thus, one can assume that the effect would have been stronger if all actually non-exposed individuals from the ERFC cohort (e.g. those better off because of collaboration with the allied forces or those living in rural areas with access to self-produced food) could have been singled out from the data set [8].

We are confident that the effect of age and cohort membership on health in older age was not confounded by period effects, as taking measurement period into account did not significantly improve the model nor change the results. In other words, being aged 65–71 years and interviewed in 2008 did not have a significantly different health effect than being aged 65–71 years and interviewed in 2015.

As the KORA-Age participants were sampled during their adult life and information on place of birth or place of residence during the first 2 years of life was not available for all of them, some participants may have moved to Germany after the food crisis period. We therefore ran our main analysis using only those participants for whom birth in Germany could be ascertained. Nevertheless, the results were also supported by our sensitivity analyses in the larger data set including participants for whom information on place of birth was not available.

Our study has the following limitations. First, we had individual-level data neither on the participants' mothers' exposure to hunger or other war- or early reconstruction-related stressful life events, morbidities and living conditions during gestation, nor on perinatal outcomes such as birth weight or preterm delivery. On a macro level, though, it can be confirmed that these conditions were far more frequent during the ERFC period than before and thereafter: for example, the number of births in the city of Dresden dropped from 5100 to 1900 in 1945, with a slowly increasing trend starting again only after 1949. Low-weight births (< 2500 g) in the city of Leipzig increased by 57% in 1946 as compared to the previous year, reaching earlier low levels again only after 1950. Congenital malformations such as neural tube defects were also more frequent, more than doubling in Berlin in the ERFC period as compared to the PWW period [32]. The ERFC period also coincides with the period for which the largest share of respondents recalls having suffered from hunger [8, 12]. In addition, we did not have information on the participants' own respective experience between birth and 24 months of age. Even if information on this question had been collected, it would have been subject to information bias, as it is improbable that participants would have remembered the presence of these conditions in their first years of life. In this situation, it has been postulated that the best way to define exposure in critical developmental periods is to rely on macro-level information [13]. We chose the German early reconstruction and food crisis period as indicator for the most formative period in Germany related to the aftermath of World War II, as it has a well-defined start and end date and affected a large share of the population. These characteristics apply only to a much smaller extent to bombings (different timing in different places) and to separation from family members or absence of fathers (different timing, independent of place). Still, when interpreting our results, it should be taken into account that the effects of what we called the "early reconstruction and food crisis period" are potentially combined effects of hunger and other experiences introduced by the aftermath of World War II, such as psychological stress, displacement, poverty, separation from close family members, and interrupted education [8], all of which may also have affected our participants' parents and thus have had epigenetic effects on older-age health [35].

In conclusion, co-occurrence of critical developmental age with the early reconstruction period and the associated food crisis in Germany increased the risk of higher numbers of accumulated health deficits in adults aged 65–71 years. These effects were not explained by the selected covariates and were not found in a cohort born after the first quarter of 1949, suggesting a direct link from the experience of early childhood adversities to older-age health and the potential for a change in morbidity trends towards the second half of the twentieth century. Thus, it is imperative that research on future morbidity trends continuously reviews its conclusions based on data from the most recent birth cohorts. At the same time, historical circumstances such as war and famine have to be taken into account, as they may exert their negative effects well into subsequent generations.