
Cardiovascular consequences of famine in the young

Annet F.M. van Abeelen, Sjoerd G. Elias, Patrick M.M. Bossuyt, Diederick E. Grobbee, Yvonne T. van der Schouw, Tessa J. Roseboom, Cuno S.P.M. Uiterwaal
DOI: http://dx.doi.org/10.1093/eurheartj/ehr228 First published online: 25 August 2011


Aims The developmental origins hypothesis proposes that undernutrition during foetal life, infancy, or childhood is associated with an increased risk of cardiovascular disease in adulthood. As data on postnatal developmental programming are scarce, we investigated whether exposure to undernutrition during childhood, adolescence, or young adulthood is related to coronary heart disease (CHD) and stroke in adult life.

Methods and results We studied 7845 women from the Prospect-EPIC cohort who had been exposed to various degrees to the 1944–45 Dutch famine when they were aged between 0 and 21 years. We used Cox proportional hazard regression models to explore the effect of famine on the risk of CHD and stroke, overall and within exposure age categories (0–9, 10–17, ≥18 years). We adjusted for potential confounders, including age at famine exposure, smoking, and level of education as a proxy for socio-economic status. Overall, stronger famine exposure was associated with higher CHD risk. Among those who experienced the famine between ages 10 and 17 years, CHD risk was significantly higher among severely exposed women compared with unexposed women (HR 1.38; 95% CI 1.03–1.84), which only slightly attenuated after adjustment for confounding (HR 1.27; 95% CI 0.94–1.71). We observed a lower stroke risk among famine exposed women (HR 0.79; 95% CI 0.61–1.02). Adjustment for potential confounders produced similar results (HR 0.77; 95% CI 0.59–0.99).

Conclusion Exposure to undernutrition during postnatal periods of development, including adolescence, may affect cardiovascular health in adult life.

  • Developmental programming
  • Dutch famine
  • Undernutrition
  • Coronary heart disease
  • Stroke


Nutritional influences early in life may relate to the risk of cardiovascular disease (CVD) in adult life, according to the developmental origins of chronic disease hypothesis.1–3 This hypothesis proposes that undernutrition during foetal life, infancy, and childhood changes the structure and function of the body permanently.4 It further postulates that these changes, while beneficial for short-term survival, lead to chronic diseases in adult life.

Much evidence supports the developmental origins hypothesis for intrauterine determinants. Studies in both developing and developed countries have demonstrated an association between low birth weight and an increased risk of CVD.5–8 Furthermore, associations between small size at birth and known biological risk factors for CVD, such as hypertension, hypercholesterolaemia, and type 2 diabetes, have been described.9 In the Dutch famine Birth Cohort Study, we demonstrated that undernutrition during gestation is associated with an increased risk of obesity,10 an atherogenic lipid profile,11 and coronary heart disease (CHD) in adult life.12

Although less well studied than intrauterine development, disturbances of postnatal development may also have disease consequences in later life. In most research, however, the disturbed postnatal development is itself a consequence of disturbances during prenatal development. Early childhood catch-up growth in children who were small at birth,13 characterized by a disproportionate increase in adipose tissue when compared with lean body mass,14 is associated with insulin resistance.15,16 Studies in the Helsinki Birth Cohort demonstrated that the combination of a small body size at birth, low weight gain during infancy, and rapid gain in BMI during childhood is associated with the metabolic syndrome, a body composition with low muscle mass in relation to fat mass, hypertension, and CHD in adult life.4,17,18 There is circumstantial evidence that stunted growth in later childhood poses similar chronic disease threats. In women with anorexia nervosa, weight normalization has been associated with body fat redistribution towards visceral adiposity.19 Therefore, growth hampered by undernutrition in later childhood, and the subsequent recovery, might also have metabolic consequences resulting in increased chronic disease risk.

Human studies on the association between undernutrition during childhood and CVD risk in adult life are scarce.20–22 We have previously studied undernutrition from early childhood through young adolescence in relation to subsequent cancer risk,23,24 using the unique circumstances during the 1944–45 Dutch famine. As far as we know, there are no individual subject data showing a direct relation between childhood or young adult undernutrition and manifest CVD risk in adult life. Here, we report on our findings relating severe and moderate caloric restriction during childhood, adolescence, or young adulthood to the risk of CHD and stroke in adult life, using the Prospect-EPIC cohort with information on individual exposure to the 1944–45 Dutch famine.


The Prospect-EPIC cohort

Prospect-EPIC consists of 17 357 women aged 49–70 years at recruitment between 1993 and 1997 (response rate 35%). It is one of the two Dutch cohorts participating in the European Prospective Investigation into Cancer and Nutrition (EPIC). Its design has been described in detail elsewhere.25 Briefly, women residing in the city of Utrecht or its vicinity were recruited through a breast cancer screening program. All women signed informed consent before study inclusion. The study complies with the Declaration of Helsinki and was approved by the Institutional Review Board of the University Medical Center Utrecht. At enrolment, participants underwent non-fasting blood sampling and filled out an extensive food frequency questionnaire and a general questionnaire. By means of these questionnaires, we obtained information on demographic and lifestyle factors and past and current morbidity, including smoking, level of education, and whether they had ever been treated for hypertension and/or hypercholesterolaemia. Furthermore, all participants underwent physical examination. Trained assistants measured height, weight, waist and hip circumference, systolic and diastolic blood pressure, and checked the questionnaires for missing information.

Famine exposure

The Dutch famine

The Dutch famine evolved from an accumulation of circumstances. The advance of the Allied forces to the north of the Netherlands came to a halt when the attack to capture the Rhine bridge at Arnhem (Operation 'Market Garden') failed. In order to support the Allied offensive, the Dutch government in exile had called for a railroad strike to thwart German transport of troops and ammunition. As a reprisal, the German occupier banned all food transports. In early November 1944, food transport across water was permitted again. However, the winter of 1944–45 had started unusually early and was extremely severe. Since most canals and waterways were frozen, it had become impossible to bring in food from the rural east to the urban west of the Netherlands. Consequently, the food situation in the west of the Netherlands deteriorated rapidly from October 1944 onwards. The official daily rations for the general adult population dropped from about 1400 kcal in October 1944 to below 1000 kcal in late November 1944. At the height of the famine, from December 1944 to April 1945, the official daily rations varied between 400 and 800 kcal.26 The relative amounts of proteins, fats, and carbohydrates remained essentially unchanged during this period.27 After 6 months of starvation, the Netherlands was liberated, ending the famine abruptly.

Famine exposure assessment

The self-administered general questionnaire, which was filled in at time of enrolment by all study participants, contained questions about place of residence and experiences of hunger and weight loss during the 1944–45 Dutch famine. Women could respond to these last two questions using one of three answer categories: ‘hardly’, ‘little’, or ‘very much’. Women who had answered ‘not applicable’ and ‘I don't know’ to one or both famine questions were excluded. We combined the answers into a three-point subjective hunger score: women who reported having been ‘very much’ exposed to both hunger and weight loss were categorized as ‘severely exposed’. Women who reported having been ‘hardly’ exposed to either hunger or weight loss were categorized as ‘unexposed’, and all others as ‘moderately exposed’.
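The combination rule above can be sketched as a small function. This is an illustrative reconstruction of the scoring logic as described in the text, not the study's own code; the answer strings and function name are assumptions.

```python
VALID_ANSWERS = {"hardly", "little", "very much"}

def hunger_score(hunger, weight_loss):
    """Combine the two questionnaire answers into the three-point
    subjective hunger score described in the text. Responses outside
    the three answer categories were excluded from the analysis."""
    for answer in (hunger, weight_loss):
        if answer not in VALID_ANSWERS:
            raise ValueError(f"excluded response: {answer!r}")
    # 'very much' on BOTH questions -> severely exposed
    if hunger == "very much" and weight_loss == "very much":
        return "severely exposed"
    # 'hardly' on either question -> unexposed (as stated in the text)
    if hunger == "hardly" or weight_loss == "hardly":
        return "unexposed"
    # everything else -> moderately exposed
    return "moderately exposed"
```

Note the order of the rules matters: the severe-exposure check is applied first, and only the remaining combinations fall into the unexposed or moderate categories.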

Exposure age categories

Age at famine exposure was assessed taking 1 October 1944, the start of the famine, as reference. Exposure age was classified into three categories: childhood (0–9 years), adolescence (10–17 years), and young adulthood (18 years or older), according to the seven stages in the postnatal human life cycle as defined by Bogin.28 We defined pre-adolescent childhood, a period of rapid growth with many developmental milestones in physiology, behaviour, and cognition, as the period between 0 and 9 years, just before the growth spurt in women.28,29 Adolescence runs from the start of the growth spurt, at around 10 years, through age 17, and is characterized by the growth spurt, including sexual development.28,29 From 18 years of age, we considered persons young adults, gradually reaching physiological homeostasis.
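The age classification can be sketched as follows. This is an illustrative sketch under the assumption that age was computed as completed years at the reference date; the crude division by 365 days is an approximation, not the study's actual method.

```python
from datetime import date

FAMINE_START = date(1944, 10, 1)  # reference date: start of the famine

def exposure_age_category(birth_date):
    """Classify age at the start of the famine into the three
    exposure age categories used in the analysis."""
    age = (FAMINE_START - birth_date).days // 365  # approximate whole years
    if age < 0:
        raise ValueError("born after the start of the famine: excluded")
    if age <= 9:
        return "childhood (0-9 years)"
    if age <= 17:
        return "adolescence (10-17 years)"
    return "young adulthood (>=18 years)"
```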

Outcome assessment

Data on cardiovascular events until 31 December 2007 were provided by linking the cohort with the National Medical Registry (hospital discharge diagnosis) and with Statistics Netherlands (cause of death). Events were coded according to the International Classification of Disease (ICD) coding system version 9 or 10: (i) hospital admission ICD-9 codes 410–414 for CHD, and 431–434 and 436–438 for stroke; (ii) cause of death ICD-9 codes 410–414 for CHD, and 431–434 and 436–438 for stroke—cause of death ICD-10 codes I20–I25 for CHD, and I61–I67 and I69 for stroke.
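The outcome coding above can be sketched as a lookup over three-character code prefixes. This is an illustrative reconstruction of the code ranges listed in the text; the function name and the prefix-matching approach are assumptions.

```python
# Three-character code sets built from the ranges given in the text
CHD_ICD9 = {str(c) for c in range(410, 415)}                       # 410-414
STROKE_ICD9 = {str(c) for c in range(431, 435)} | \
              {str(c) for c in range(436, 439)}                    # 431-434, 436-438
CHD_ICD10 = {f"I{c}" for c in range(20, 26)}                       # I20-I25
STROKE_ICD10 = {f"I{c}" for c in range(61, 68)} | {"I69"}          # I61-I67, I69

def classify_event(icd9=None, icd10=None):
    """Map a hospital-discharge or cause-of-death code to the study
    outcomes (CHD or stroke); returns None for non-study events."""
    if icd9 is not None:
        prefix = icd9[:3]
        if prefix in CHD_ICD9:
            return "CHD"
        if prefix in STROKE_ICD9:
            return "stroke"
    if icd10 is not None:
        prefix = icd10[:3]
        if prefix in CHD_ICD10:
            return "CHD"
        if prefix in STROKE_ICD10:
            return "stroke"
    return None
```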

Data analysis

We excluded women who were born after the famine (n = 2559) and who resided outside the occupied Netherlands during the famine (n = 1732). For 8091 (62%) of the remaining 13 066 women, the hunger score could be calculated. Women not permitting data retrieval from the municipal administration registries, the National Medical Registry, or Statistics Netherlands (n = 246) were also excluded, leaving 7845 women for our analyses.

Characteristics at enrolment, including demographics, anthropometry, and lifestyle, were first tabulated against timing and severity of famine exposure, in order to evaluate potential confounders. We used Cox proportional hazard regression models to explore the effect of famine exposure on the risk of CHD and stroke separately. For both CHD and stroke, composite outcome events were defined as disease occurrence manifested by either hospital admission or death. Subjects who remained free of the event of interest were censored at the date of death due to other causes, the date of loss to follow-up, or 31 December 2007, whichever came first. Women with both manifest CHD and stroke contributed to both the CHD and stroke analyses, but with follow-up times matching the respective outcomes.
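The censoring rule can be made concrete with a short sketch. This is an illustrative reconstruction of the rule as stated, with hypothetical function and parameter names; it is not the study's code.

```python
from datetime import date

END_OF_FOLLOW_UP = date(2007, 12, 31)  # administrative end of follow-up

def follow_up_end(event_date=None, death_other_cause=None, lost_date=None):
    """Return (end_date, event_occurred) for one woman: follow-up ends
    at the event of interest, or is censored at the earliest of death
    from another cause, loss to follow-up, or 31 December 2007."""
    censor_candidates = [d for d in (death_other_cause, lost_date)
                         if d is not None]
    censored_at = min(censor_candidates + [END_OF_FOLLOW_UP])
    if event_date is not None and event_date <= censored_at:
        return event_date, True   # event observed before censoring
    return censored_at, False     # censored without the event
```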

For analysing stroke, we combined the severe and moderate famine exposure groups into one group because of the smaller number of stroke cases. For CHD, trend tests were used to explore dose-response relations by introducing the famine exposure score as an ordinal variable (one for ‘unexposed’, two for ‘moderately exposed’, and three for ‘severely exposed’). We analysed the relation between famine exposure and CHD and stroke within each of the exposure age categories.

First, we analysed the crude association between famine exposure and CHD and stroke. In Model 1, we adjusted for potential confounders including age at start of the famine (years), smoking (pack years), and level of education (low/high; socio-economic status proxy). For CHD, we additionally adjusted for waist circumference (cm) in Model 2, as visceral adiposity is a risk factor for CVD, and possibly an intermediate variable linking childhood undernutrition to later CVD. For stroke, we additionally adjusted for systolic and diastolic blood pressure (mmHg) in Model 2, as blood pressure is one of the most important risk factors for stroke and possibly an intermediate variable. Continuous variables were introduced as such in the different models; for categorical variables, we created indicator variables. We evaluated the proportionality of the hazards over time with log minus log plots. Results are reported as hazard ratios (HR) with 95% confidence intervals (CI). We performed all statistical analyses with PASW Statistics version 17.0 (SPSS, Chicago, IL, USA). P-values were based on two-sided tests with a cut-off level for statistical significance of 0.05.
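The hazard ratios and 95% confidence intervals reported in the tables follow from the Cox model's log-hazard coefficient and its standard error via the usual Wald construction, HR = exp(beta) with CI exp(beta ± 1.96·SE). A minimal sketch, with illustrative inputs back-calculated from the reported estimate for severe adolescent exposure:

```python
import math

def hazard_ratio_ci(beta, se, z=1.96):
    """Convert a Cox log-hazard coefficient and its standard error
    into a hazard ratio with a Wald-type 95% confidence interval."""
    hr = math.exp(beta)
    lower = math.exp(beta - z * se)
    upper = math.exp(beta + z * se)
    return hr, lower, upper

# With beta = ln(1.38) ~ 0.322 and SE ~ 0.148, this reproduces
# approximately the reported HR 1.38 (95% CI 1.03-1.84).
hr, lower, upper = hazard_ratio_ci(0.322, 0.148)
```

A CI whose lower bound exceeds 1 corresponds to statistical significance at the two-sided 0.05 level used in the analyses.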


At the end of follow-up on 31 December 2007, 7116 (91%) women were still alive, 667 (9%) had died, and 62 (1%) were lost to follow-up. During follow-up, a total of 604 (8%) women experienced a CHD event (560 hospital admissions and 62 deaths; 558 252 observation years), and 235 (3%) women experienced a stroke (208 hospital admissions and 51 deaths; 560 965 observation years).

Table 1 shows baseline characteristics at recruitment. Of the total of 7845 women, 4280 (55%) had experienced the famine in childhood, 3087 (39%) in adolescence, and 478 (6%) in young adulthood. In total, 3577 (46%) reported no, 2976 (38%) moderate, and 1292 (16%) severe exposure to famine. Women who were older at the start of the famine more often reported having been severely exposed to famine. Overall, severely famine-exposed women had higher BMI, waist circumference, and total cholesterol levels, and smoked more, than unexposed women.

Table 1

Baseline characteristics of the study population according to age at famine (0–9 years, 10–17 years, or ≥18 years) and level of famine exposure (none, moderate, or severe)

Coronary heart disease

Table 2 shows the relation between famine exposure and subsequent CHD risk, both for all ages during the famine and within exposure age categories. Overall, CHD risk was slightly higher among those moderately famine exposed and significantly higher among those severely famine exposed, compared with those unexposed. Among women aged 10–17 years at the start of the famine, those severely famine exposed had a significantly higher CHD risk in adult life (HR 1.38; 95% CI 1.03–1.84), whereas those moderately famine exposed had no increased CHD risk compared with unexposed women (P for trend = 0.07). Adjustment for the potential confounders age at the start of the famine, smoking, and level of education as a proxy for socio-economic status slightly attenuated the risk estimates, as did the additional inclusion of waist circumference, although the effect was no longer significant in these analyses. Additionally including systolic and diastolic blood pressure produced similar results (data not shown).

Table 2

(Un)adjusted hazard ratios and 95% confidence intervals (CI) for the risk of coronary heart disease for all women (all ages; three exposure age categories combined) and for women within each of the exposure age categories: 0–9 years, 10–17 years, and ≥18 years who reported to be moderately or severely exposed to famine compared with those who reported to be unexposed to famine


Table 3 shows the effect of famine exposure on the risk of stroke, both for all ages during the famine and within exposure age categories. Overall, we observed a lower risk of stroke among the famine-exposed compared with those unexposed that was statistically significant after adjustment for confounding (HR 0.77; 95% CI 0.59–0.99). Both women who were famine-exposed during young childhood (0–9 year exposure age category) and women who were famine exposed during young adulthood (≥18 year exposure age category) seemed to have a lower stroke risk compared with unexposed women. This lower stroke risk was the strongest among women who were famine exposed during young adulthood (HR 0.65; 95% CI 0.34–1.23). After including the potential confounders age at start of the famine, smoking, and level of education as a proxy for socio-economic status, the relative risk estimates for famine exposure were slightly stronger. Additionally including systolic and diastolic blood pressure produced similar results.

Table 3

(Un)adjusted hazard ratios and 95% confidence intervals (CI) for the risk of stroke for all women (all ages; three exposure age categories combined) and for women within each of the exposure age categories: 0–9 years, 10–17 years, and ≥18 years who reported to be exposed to famine compared with those who reported to be unexposed to famine


Our findings suggest that a relatively short period of severe undernutrition is associated with an increased CHD risk in adult life, in a dose-dependent manner. Famine exposure was associated with a lower risk of stroke, with a trend towards stronger relations in early childhood (0–9 years of age) and in young adulthood (over 18 years of age).

Before further discussion, some aspects of our study, besides its observational nature precluding causal inference, require consideration. The Dutch famine of 1944–45 is a 'natural experiment' in history, which gave us the unique possibility to study the long-term effects of acute undernutrition during childhood, adolescence, and young adulthood in otherwise well-nourished girls and women. We used individual exposure data instead of classifying populations according to place of residence or time,20–22 thus enhancing the precision of exposure assessment. Our exposure classification agrees with rationing practices at that time. The allocated individual amount of calories was based on age: young children (1–3 years) received about 50%, whereas adults received about 25%, of the distributed amount of calories at the start of the famine.26 Furthermore, children were relatively protected within families and by special committees, such as the Interchurch Organization.26,30 Our data reflect these historical facts, showing that the older women were at the start of the famine, the higher the proportion that reported having been exposed to famine, which supports the quality of our exposure data.

Nevertheless, our individual famine score is still susceptible to misclassification, since it was based on recollection. This may especially be true for the youngest age group, although it is conceivable that these young women learned about their famine experiences from their parents and family. However, such recall error is unlikely to differ between women who did and did not develop CHD or stroke, since the vast majority of hospital admissions for CHD and stroke occurred after exposure measurement at study inclusion. Therefore, the consequence of such recall error is likely to be an underestimation of the true famine effects.

Another issue may be the presence of selection bias. First, the overall response to the famine questionnaire was 62%. We cannot exclude selection there by exposure status, probably with a higher degree of non-response among those unexposed to the famine. Nevertheless, it is difficult to conceive of selective non-response based on both exposure status and CHD or stroke risk (e.g. leading to a preferential non-response among those unexposed women who have a high CHD or stroke risk), making selection bias due to non-response unlikely. Secondly, our findings are conditional on survival until examination between 1993 and 1997. We can only speculate about how this could have affected our findings. The famine caused an estimated 22 000 direct deaths among an estimated population of 4.3 million people who suffered from the famine.30,31 Data from Amsterdam, a city that was severely struck by the famine, showed that 75% of all fatalities were male, and 79% were babies or over 65 years of age.30 Since we studied a female cohort with <10% babies, we suspect a minimal direct increase in mortality due to the famine in the source population of our study. Furthermore, expected cardiovascular mortality before inclusion at age 50 among women is also very low. In the Netherlands in 1996, only 241 out of 5 419 598 women aged between 0 and 50 years died due to CHD or stroke.32,33 Thus, although the famine may have led to some increased mortality, perhaps especially in women and girls with low fat reserves and less efficient metabolism, and famine survivors may constitute a group of women with somewhat better constitution and social resources for health, the above makes the extent of potential selective survival before study inclusion negligible.

The strength of our study is the precision with which lifestyle factors and co-morbidity were measured and analysed. Trained assistants performed physical examination and checked the questionnaires for missing information. When adjusting for covariables in the various regression models, we took special care to provide accurate fit of the data. However, we cannot completely exclude the possibility of residual confounding.

We studied only women, recruited through a breast cancer screening programme. Since there is a growing body of evidence showing sex-specific differences in programming, the generalizability of these results to men is unknown.34

Coronary heart disease

We found a significantly higher CHD risk among women who reported severe famine exposure during adolescence compared with unexposed women in the same age group, but not for moderate exposure. Still, a borderline significant trend test may suggest dose dependency. Important risk factors for CHD, including smoking and high cholesterol levels, were more prevalent among severely exposed women in the 10–17 year exposure age category than among those unexposed. Likewise, higher proportions of smokers were observed in categories of increasing famine exposure. Indeed, adjustment for such risk factors produced slightly lower risk estimates.

The increased waist circumference among famine exposed women may be compatible with previous findings that weight normalization among women with anorexia nervosa was associated with body fat redistribution towards visceral adiposity.19 Our results might suggest that the increased CHD risk among women in the 10–17 year exposure age category is partially mediated by an altered body fat distribution after undernutrition during adolescence. As cholesterol data were only available for a 10% random sample, we could not adjust for this CVD risk factor.

We could not demonstrate associations in young childhood. Young children were relatively protected against undernutrition during the Dutch famine. Their official daily rations never fell below 1000 kcal, and the specific nutrient components were always above the standards used by the Oxford Nutritional Survey.26 Therefore, early childhood exposure may have been less severe than in older age groups, resulting in lower exposure contrast in early childhood and less power to detect an association. However, we have previously shown that this age group in particular was more prone to develop breast cancer after severe famine exposure.23

We lacked information about the experience of stress during the famine, which hampers any distinction between the effects of undernutrition and of war- and famine-related stress. A Finnish study showed higher hypothalamic-pituitary-adrenocortical axis reactivity to a psychosocial stress test in childhood war evacuees.35 Furthermore, it also showed that early life traumatic experience was associated with a two-fold increased risk of being diagnosed with CVD later in life.36 There is also other evidence that early life stress can, through glucocorticoid signalling, have lasting psychological effects.37 Such signalling can also have inflammatory effects on the cardiovascular system, so early life stress may, through inflammation, increase the risk of CVD.36,38 As an alternative explanation for our findings, stress may have induced behavioural changes which in turn changed levels of classical risk factors for CHD.

There have been two other studies on the association between undernutrition during childhood, adolescence, and young adulthood and the risk of later CVD.20–22 Men who experienced the siege of Leningrad around the ages of 6–8 years and 9–15 years had increased mortality due to ischaemic heart disease and stroke.21,22 These results do not seem to be in full agreement with ours, but the essentially different circumstances during the two famines hamper a direct comparison. The Dutch famine lasted only 6 months, whereas the Leningrad famine lasted more than 2 years. As a result, more than 20% of the population of Leningrad died from hunger-related causes during the siege,39 probably resulting in much more selective survival than in the Dutch famine, in which 0.5% of the affected population died.31 Furthermore, unlike in the Netherlands, the famine in Leningrad was preceded and followed by periods of relative food shortage. Finally, the Russian standard of living remained rather poor for a long period after World War II, whereas the Dutch people grew up in a period of increasing affluence.40 In the second study, the German occupation of the Channel Islands was used as an indicator of undernutrition,20 demonstrating that postnatal exposure (around the ages of 7–21 years) to the occupation was associated with a 2.5-fold increased CVD risk in later life.20 These findings may also seem different from ours, possibly owing to the exposure being defined by geographical rather than individual data.


Stroke

Surprisingly, famine exposure was associated with a lower risk of stroke in adult life compared with no exposure, with a trend towards stronger relations in early childhood and in young adulthood. Systolic and diastolic blood pressure were lower among exposed young adult women than among the unexposed, despite less use of anti-hypertensive medication. High blood pressure is, besides age, the most important risk factor for stroke.41 To the best of our knowledge, this is the first study describing this inverse association. The developmental origins hypothesis would suggest that only women who were famine exposed during periods of rapid growth and development should be sensitive to programming and the concomitant increased chronic disease risk. Since women from the age of 18 years onwards gradually reach physiological homeostasis and no longer undergo periods of rapid growth, it is unlikely that these women are still sensitive to programming through undernutrition. Alternatively, young adult women who experienced the famine may have permanently changed their eating habits towards a healthier pattern as a result of the traumatic experience, in turn leading to a lower stroke risk. However, this explanation fits neither our finding of a lower stroke risk after exposure in young children nor the observation that young adult exposure did not affect CHD risk. Furthermore, we must point out that the stroke data were relatively scarce, potentially leading to less robust risk estimates.


Our findings support the notion that disturbed postnatal development, particularly in adolescence, can have important implications for adult health. The contemporary relevance of our findings is that famine and undernutrition are still major problems worldwide; the first Millennium Development Goal is to eradicate extreme poverty and hunger. Since CVD is the number one cause of death globally,42 and rising in many parts of the world, further research into the impact of undernutrition during sensitive periods of growth and maturation is warranted.


This study provides the first direct evidence that exposure to undernutrition during postnatal periods of development, including adolescence, may affect cardiovascular health in adult life.


The Prospect-EPIC study was supported by ‘Europe Against Cancer’ Programme of the European Commission (SANCO); the Dutch Ministry of Health; the Dutch Cancer Society; ZonMw, the Netherlands Organisation for Health Research and Development; World Cancer Research Fund (WCRF).

Conflict of interest: none declared.


We thank the PHARMO Institute and Statistics Netherlands for follow-up data on CVDs.

