
Maximising benefits and minimising adverse effects of micronutrient interventions in low- and middle-income countries

Published online by Cambridge University Press:  11 March 2019

Kaleab Baye*
Affiliation:
Center for Food Science and Nutrition, Addis Ababa University, Addis Ababa, Ethiopia
*
Corresponding author: Kaleab Baye, email [email protected]

Abstract

Micronutrient deficiencies are widespread and disproportionately affect women and children in low- and middle-income countries (LMIC). Among various interventions, food fortification and supplementation with micronutrients have proven to be cost-effective. The aim of the present paper is to review the existing literature to assess risks of excessive intake in LMIC and to highlight the programmatic changes required to maximise the benefits of micronutrient interventions while minimising the risks of adverse effects. While very few LMIC have national food consumption surveys that can inform fortification programmes, many more are implementing mandatory fortification programmes. Risks of inadequate micronutrient intake were common, but risks of excessive intake were also present for iodine, vitamin A, folic acid and iron. Excessive salt consumption, high iodine concentrations in ground-water and over-iodisation of salt were linked with excessive iodine intake. For vitamin A, overlapping interventions were the main risk for excessive intake; for iron, contamination from soil and from the wear of milling screws, together with high iron concentrations in drinking-water, increased the risk of excessive intake, which could be further exacerbated by fortification. Before implementing micronutrient interventions, the basic principle of documenting evidence that the deficiency in question exists and that fortification will correct it must be adhered to. This can be supported with dietary intake assessments and biochemical screening that help diagnose nutrient deficiencies. Targeting micronutrient interventions, although programmatically challenging, should be considered whenever possible. Moreover, closer monitoring of the appropriate fortification of foods and of overlapping interventions is needed.

Type
Conference on ‘Multi-stakeholder nutrition actions in Africa: Translating evidence into policies, and programmes for impact’
Copyright
Copyright © The Author 2019 

About two billion people in the world suffer from micronutrient deficiencies(Reference Bailey, West and Black1). The most common micronutrient deficiencies are those of iron, iodine, zinc, vitamin A and folate. Women and children in low- and middle-income countries (LMIC) are the most affected(Reference Black, Allen and Bhutta2, Reference Muthayya, Rah and Sugimoto3). These deficiencies compromise several essential bodily and cellular functions, leading to a range of adverse outcomes including poor pregnancy outcomes, poor growth, impaired cognitive development, compromised immunity and even death(Reference Christian and Stewart4, Reference Kennedy, Nantel and Shetty5). Consequently, the scale-up of proven interventions addressing micronutrient deficiencies and anaemia is being promoted to achieve the global nutrition targets and the nutrition-related sustainable development goals(6).

Among the proven evidence-based interventions proposed in the Lancet Nutrition Series, fortification of staple foods and supplementation are considered to be cost-effective in preventing and controlling micronutrient deficiencies(Reference Bhutta, Das and Rizvi7). Fortification of staple foods was introduced in industrialised countries as early as the 1920s(Reference Allen, De Benoist and Dary8), and micronutrient deficiencies that were then widespread have since been greatly reduced or even eliminated(Reference Mannar and Hurrell9). To replicate this success in LMIC, food fortification is now being initiated and promoted in many African and South-east Asian countries where the burden of multiple micronutrient deficiencies is believed to be high(Reference Osendarp, Martinez and Garrett10).

This review revisits the basic principles for effective fortification, highlights current fortification practices and illustrates the possible risks of adverse effects related to excessive intakes. The focus on the risks of excessive intake is deliberate, because the benefits of fortification in preventing micronutrient deficiencies are widely known and many reviews already exist on the subject.

Basic principles for effective fortification

Effective implementation of a micronutrient fortification programme requires documented evidence that the deficiency in question exists and that fortification will correct this deficiency(Reference Allen, De Benoist and Dary8). Micronutrient deficiencies are best determined through biochemical measurements, but determining whether a confirmed deficiency can be addressed through fortification requires food consumption surveys that identify and quantify the nutrient gap to be filled(Reference Allen, De Benoist and Dary8, Reference Gibson, Carriquiry and Gibbs11). Ideally, nationally representative data should provide food and nutrient intake information for different population and physiological groups, because fortified foods can indiscriminately reach all of these groups(Reference Allen, De Benoist and Dary8).

The main goal of an effective fortification programme is to correct inadequate nutrient intake while ensuring that excessive intake remains acceptably low(Reference Allen, De Benoist and Dary8). This requires at least 2 d of intake data from a subset of the population to estimate the usual intake distribution, which fortification should then shift upwards so that the intake of each nutrient meets the estimated average requirement for 97·5 % of the population, while no more than 2·5 % of the population reaches or exceeds the tolerable upper intake level(Reference Allen, De Benoist and Dary8) (Fig. 1). Although these basic principles should govern every fortification programme, very few LMIC have even one national food consumption survey(Reference Huybrechts, Aglago and Mullee12). For example, in 2017 only four sub-Saharan African countries (Ethiopia, Nigeria, South Africa and Uganda) had at least one nationally representative food consumption survey (Fig. 2(a)). In contrast, many more countries without a national food consumption survey have introduced mandatory food fortification programmes or have legislation mandating grain fortification (Fig. 2(b)). The implementation of fortification programmes in the absence of data on micronutrient intake may have been precipitated by the assumptions that fortification is a proven strategy, that micronutrient deficiencies in the diet are likely to exist, and that the risk of any adverse effect is rare or non-existent.

Fig. 1. Shift in intake distribution desired with fortification (a) and balance required to ensure adequacy and safety (b).

Fig. 2. (Colour online) Countries with at least one National Individual Food Consumption Survey by 2017 (a) and those with mandatory fortification legislation in 2018 (b). Source: (a) adapted from Huybrechts et al.(Reference Huybrechts, Aglago and Mullee12); (b) taken from the Food Fortification Initiative (http://www.FFInetwork.org).
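To make the balance shown in Fig. 1 concrete, the sketch below simulates a usual intake distribution and estimates the proportions of the population falling below the estimated average requirement and above the tolerable upper intake level, before and after fortification. All values are illustrative assumptions for a hypothetical nutrient, not survey data.

```python
import numpy as np

# Minimal sketch (illustrative values only): approximate the usual intake
# distribution of a nutrient as log-normal, then estimate the proportion of
# the population below the estimated average requirement (EAR) and above the
# tolerable upper intake level (UL), before and after a fixed amount is added
# through a fortified staple.
rng = np.random.default_rng(42)

EAR = 8.1   # hypothetical EAR (mg/d) for the target group
UL = 40.0   # hypothetical UL (mg/d)
usual_intake = rng.lognormal(mean=np.log(9.0), sigma=0.5, size=100_000)

def adequacy_profile(intake, label):
    inadequate = np.mean(intake < EAR) * 100  # EAR cut-point method
    excessive = np.mean(intake > UL) * 100    # proportion above the UL
    print(f"{label}: {inadequate:.1f}% below EAR, {excessive:.1f}% above UL")

adequacy_profile(usual_intake, "Unfortified diet")

# Shift the distribution upwards by the extra nutrient delivered through the
# fortified vehicle (here a flat 6 mg/d for every consumer, for simplicity).
adequacy_profile(usual_intake + 6.0, "With fortification")
```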

Evidence from existing intake and biochemical data

The few existing nationally representative intake data confirm a high prevalence of inadequate intake of one or more micronutrients. However, they also indicate that excessive intakes of iodine, vitamin A, iron and folic acid exist.

Risk of excessive iodine intake

A closer look at the Nigerian Food Consumption Survey reveals that not only does inadequate iodine intake exist, but a significant proportion of the population consumes more than adequate amounts of iodine: about 30 % of children, 20 % of women and close to 10 % of pregnant women have possibly excessive intakes. These estimates were based on urinary iodine concentration assays(Reference Maziya-Dixon, Akinyele and Oguntona13). A more recent study conducted in Djibouti, Kenya and Tanzania showed that urinary iodine concentration exceeded the threshold for excessive iodine intake among school-age children, whereas thyroglobulin, the thyroid-specific glycoprotein on which iodine is stored and from which thyroid hormone is formed, was high in all population groups(Reference Farebrother, Zimmermann and Abdallah14). Indeed, according to the Iodine Global Network, excess iodine intake is present in eleven countries(Reference Network15), primarily because of over-iodisation of salt, possibly high salt intake and high iodine concentrations in ground water(Reference Farebrother, Zimmermann and Abdallah14).

The effects of such excessive iodine intake on thyroid function have not been well studied. Acute excess iodine intake is well tolerated because of homeostatic regulation that transiently shuts down thyroid hormone synthesis(Reference Leung and Braverman16); however, chronic exposure to excessive iodine intake is less well tolerated and can lead to overt hyperthyroidism among individuals with sub-clinical hyperthyroidism(Reference Cooper and Biondi17). This risk is particularly high in historically iodine-deficient regions where nodular goitre is more prevalent(Reference Pearce18), and thus requires attention.
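As an illustration of how such survey data are interpreted, the short sketch below classifies population iodine status in school-age children from the median urinary iodine concentration. The cut-offs are the commonly cited WHO epidemiological criteria and should be verified against current guidance; the input values are hypothetical.

```python
import numpy as np

# Minimal sketch: classify population iodine status in school-age children
# from the median urinary iodine concentration (UIC, µg/l). Cut-offs follow
# the commonly cited WHO epidemiological criteria (an assumption to verify).
CUTOFFS = [
    (300, "excessive intake"),
    (200, "above requirements"),
    (100, "adequate"),
    (50, "mild deficiency"),
    (20, "moderate deficiency"),
    (0, "severe deficiency"),
]

def classify_median_uic(uic_values):
    median_uic = float(np.median(uic_values))
    for threshold, label in CUTOFFS:
        if median_uic >= threshold:
            return median_uic, label

# Hypothetical survey values, not real data.
uic_sample = [310, 280, 450, 190, 350, 410, 220, 330]
median_uic, status = classify_median_uic(uic_sample)
print(f"Median UIC = {median_uic:.0f} µg/l -> {status}")
```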

Risk of excessive vitamin A intake

Vitamin A deficiency, a risk factor for blindness and for mortality from measles and diarrhoea, remains a public health concern in much of Africa and south Asia(Reference Stevens, Bennett and Hennocq19). However, recent evidence suggests a significant decreasing trend in the proportion of children aged 6–59 months affected by vitamin A deficiency(Reference Stevens, Bennett and Hennocq19). This decreasing trend can largely be ascribed to vitamin A supplementation, fortification, reduced infection rates and possibly improved economic status and knowledge that enable households to consume vitamin A-rich foods(Reference Stevens, Bennett and Hennocq19). However, an emerging risk of excessive vitamin A intake is present as a result of overlapping interventions. Multiple foods are now fortified with vitamin A (e.g. oil, sugar, cereal flours), which substantially increases the risk of excessive intake compared with the consumption of only one vitamin A-fortified food vehicle(Reference Tanumihardjo, Kaliwile and Boy20) (Fig. 3).

Fig. 3. Overlapping vitamin A interventions and liver vitamin A accumulation. Source: Tanumihardjo(Reference Tanumihardjo, Gannon and Kaliwile24).

Excessive vitamin A intake and increased liver vitamin A accumulation have been reported from studies in countries such as Zambia and Guatemala, where concurrent vitamin A interventions, such as fortification of sugar, cooking oil and cereal flour together with vitamin A supplementation, are provided to children(Reference Mondloch, Gannon and Davis21–Reference Tanumihardjo23). These interventions are happening with little or no assessment of vitamin A status, thereby increasing the risk of adverse effects(Reference Tanumihardjo, Gannon and Kaliwile24). Children turning yellow during the mango season as a result of excessive vitamin A intake have been reported(Reference Tanumihardjo, Gannon and Kaliwile25), yet very little effort has been made to guide a more careful implementation of vitamin A interventions(Reference Tanumihardjo, Mokhtar and Haskell26). This is unfortunate, given that excessive vitamin A (retinol) intake has been linked with increased bone fracture risk in men(Reference Michaëlsson, Lithell and Vessby27) and women(Reference Feskanich, Singh and Willett28). Given that the problem of excessive vitamin A intake grows with the increased coverage of the various vitamin A interventions, monitoring population vitamin A intake is critical(Reference Tanumihardjo, Kaliwile and Boy20).
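The arithmetic behind this concern is straightforward: the contributions of each overlapping intervention add up. The sketch below sums a young child's assumed daily preformed vitamin A from several sources and compares the total with a tolerable upper intake level; every figure is an illustrative assumption rather than programme data.

```python
# Minimal sketch (all figures are illustrative assumptions, not programme
# data): add up a young child's daily preformed vitamin A from overlapping
# interventions and compare the total with a tolerable upper intake level.
UL_RETINOL_UG_PER_DAY = 600  # often cited UL for children aged 1-3 years (µg/d)

# Hypothetical daily contributions in µg retinol activity equivalents.
sources = {
    "diet (unfortified foods)": 250,
    "fortified sugar": 200,
    "fortified cooking oil": 150,
    "high-dose capsule, averaged over 6 months": 330,
}

total = sum(sources.values())
print(f"Total preformed vitamin A: {total} µg/d "
      f"({total / UL_RETINOL_UG_PER_DAY:.1f} x the assumed UL)")
for name, amount in sources.items():
    print(f"  {name}: {amount} µg/d")
```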

Risk of excessive folic acid intake

Folate (the organic form found in food) is needed for several bodily functions: folate deficiency impairs erythrocyte synthesis, leading to macrocytic anaemia, and increases the risk of neural tube defects in fetuses(Reference Reynolds, Biller and Ferro29, Reference Berry, Li and Erickson30). Consequently, fortification with folic acid (the synthetic form) has been implemented in over seventy-five countries worldwide(Reference Santos, Lecca and Cortez-Escalante31), and was reported to have decreased the incidence of neural tube defects in the USA by 19–32 %(Reference Boulet, Yang and Mai32, Reference Williams, Mai and Mulinare33). This success has led many countries in Africa and Asia, where folate deficiencies are still expected to be high, to plan or implement folic acid fortification.

As previously illustrated, many of these countries have neither adequate intake data nor sufficient information on folate concentrations in their food composition tables. In the absence of such information, the implementation of mandatory folic acid fortification can be risky if excessive amounts are consumed(Reference Patel and Sobczyńska-Malefora34). This is because, unlike the natural form (folate), folic acid must undergo a two-step reduction to tetrahydrofolate via dihydrofolate reductase before it can be used in metabolic processes, and this reduction is not always efficient(Reference Scaglione and Panzavolta35). Moreover, the absorption and biotransformation of folic acid to its active folate form, 5-methyltetrahydrofolate, become saturated when folic acid intake is excessive, leading to the appearance of unmetabolised folic acid in the circulation(Reference Powers36).
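A simple way to see how fortified foods and supplements can push synthetic folic acid intake towards the upper limit is sketched below. The conversion factor and upper limit are the commonly cited adult values and are assumptions here; all intakes are hypothetical.

```python
# Minimal sketch, using commonly cited conversion factors (assumptions to be
# checked against local guidance): express combined folate/folic acid intake
# in dietary folate equivalents (DFE) and flag synthetic folic acid intakes
# that approach the often cited adult UL of 1000 µg/d.
UL_FOLIC_ACID_UG = 1000        # UL applies to synthetic folic acid only
DFE_FACTOR_FORTIFICANT = 1.7   # 1 µg folic acid taken with food ~ 1.7 µg DFE

food_folate_ug = 220             # hypothetical natural folate from the diet
fortificant_folic_acid_ug = 400  # hypothetical intake from fortified flour
supplement_folic_acid_ug = 400   # hypothetical supplement dose

synthetic_total = fortificant_folic_acid_ug + supplement_folic_acid_ug
dfe_total = food_folate_ug + synthetic_total * DFE_FACTOR_FORTIFICANT

print(f"Total intake: {dfe_total:.0f} µg DFE/d")
print(f"Synthetic folic acid: {synthetic_total} µg/d "
      f"({synthetic_total / UL_FOLIC_ACID_UG:.0%} of the assumed UL)")
```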

Similar to inadequate folate intake, excess folic acid intake is associated with adverse effects. The prolonged presence of unmetabolised folic acid has been associated with increased risk of cancer and with insulin resistance in children(Reference Krishnaveni, Veena and Karat37, Reference Mason, Dickstein and Jacques38). Moreover, excess folic acid intake can mask vitamin B12 deficiency and, at very high concentrations, has been associated with hepatotoxicity(Reference Patel and Sobczyńska-Malefora34). Given that high levels of unmetabolised folic acid are increasingly detected in countries with mandatory folic acid fortification(Reference Pfeiffer, Hughes and Lacher39, Reference Pasricha, Drakesmith and Black40), ensuring that interventions are guided by evidence is critical.

Risk of excessive iron intake

Iron is a key nutrient that fortification and supplementation programmes often target to prevent or treat anaemia. This is informed by the wealth of information on the health benefits of iron fortification and supplementation, which include reduced adverse pregnancy outcomes, lower risk of anaemia and improved cognitive function(Reference Bhutta, Das and Rizvi7, Reference Gera, Sachdev and Boy41). Consequently, fortification or supplementation with iron has been listed as a proven, efficacious intervention that, if scaled up, could substantially reduce morbidity and mortality(Reference Bhutta, Das and Rizvi7). In line with the global nutrition targets for 2025, which aim to reduce the prevalence of anaemia by 50 % (relative to 2012), the WHO proposes daily iron supplementation for pregnant women and fortification of wheat flour, maize flour and rice in settings where these are staples(6, 42). Moreover, the WHO strongly recommends home fortification of complementary foods with multiple micronutrient powders to improve iron status and reduce anaemia among infants and young children (6–23 months of age)(42).

In Ethiopia, a national food consumption survey was conducted to inform the design and implementation of fortification programmes(43). While dietary diversity and consumption of animal source foods were extremely low, iron intake was high. Among women of reproductive age, the prevalence of inadequate iron intake was about 15 %, whereas the prevalence of excessive intake was estimated at between 60 and 80 %. Similarly, smaller studies from different parts of the country confirm the low prevalence of inadequate intake and iron deficiency, even among children and pregnant women whose physiological demands are high(Reference Baye, Guyot and Icard-Verniere44–Reference Gebreegziabher and Stoecker47). Simulation of fortification with iron, even at doses (about 6 mg) lower than the WHO recommendation (12·5 mg), significantly increased the risk of excessive intake among infants and young children(Reference Abebe, Haki and Baye45).
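A minimal version of such a simulation is sketched below: a hypothetical usual iron intake distribution for young children is shifted by two alternative micronutrient powder doses, and the proportion exceeding an assumed upper intake level is compared. The inputs are illustrative, not the Ethiopian survey data.

```python
import numpy as np

# Minimal sketch of the kind of simulation described above (all inputs are
# illustrative assumptions): add a daily micronutrient-powder dose to
# children's usual iron intake and compare the proportion exceeding the UL
# under two dosing options.
rng = np.random.default_rng(0)

UL_IRON_MG = 40.0  # often cited UL for young children (mg/d); an assumption here
usual_iron = rng.lognormal(mean=np.log(18.0), sigma=0.45, size=50_000)

print(f"No fortification: {np.mean(usual_iron > UL_IRON_MG) * 100:.1f}% above the UL")
for dose in (12.5, 6.0):  # WHO-style full dose vs a hypothetical lower dose
    pct_over_ul = np.mean(usual_iron + dose > UL_IRON_MG) * 100
    print(f"Dose {dose:>4} mg/d: {pct_over_ul:.1f}% of children above the UL")
```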

A closer look at the sources of this high iron intake, despite relatively low consumption of iron-fortified or animal source foods, revealed that much of the iron came from soil contamination of staple cereals during traditional threshing, which is performed under the hooves of cattle(Reference Baye, Mouquet-Rivier and Icard-Vernière48, Reference Guja and Baye49). Although little is known about the bioavailability of this iron, Guja and Baye(Reference Guja and Baye49), using a haemoglobin depletion–repletion rat assay, showed that extrinsic iron from soil contamination can contribute to haemoglobin regeneration. This is supported by observations from Malawi of low iron deficiency among rural women whose foods were contaminated by acidic soils(Reference Gibson, Wawer and Fairweather-Tait50). Additional sources of iron, such as the wear of metal screws during milling(Reference Icard-Vernière, Hama and Guyot51) and iron in groundwater(Reference Rahman, Ahmed and Rahman52, Reference Ahmed, Khan and Shaheen53), can also contribute to high iron intakes in LMIC.

There is mounting evidence of possible adverse effects associated with providing iron in excess or to iron-replete individuals. In iron-replete infants and young children, exposure to iron-fortified foods has been associated with decreased growth(Reference Lönnerdal54, Reference Idjradinata, Watkins and Pollitt55), impaired cognitive development(Reference Hare, Arora and Jenkins56, Reference Wessling-Resnick57) and increased diarrhoea, possibly due to a shift of the gut microbiota towards more pathogenic bacteria(Reference Paganini and Zimmermann58, Reference Jaeggi, Kortman and Moretti59). In pregnant women, emerging evidence suggests that high iron status, as reflected by high haemoglobin or serum ferritin values, can be associated with adverse effects including preterm birth, impaired fetal growth and gestational diabetes(Reference Dewey and Oaks60–Reference Tamura, Goldenberg and Johnston65). More research in this area is urgently needed to better understand this emerging evidence, which comes largely from observational studies.

Summary and programmatic implications

Both inadequate and excessive micronutrient intakes may co-exist in LMIC, and both are associated with adverse health outcomes. This calls for better design and implementation of micronutrient interventions to maximise benefits and minimise adverse outcomes. Before implementing micronutrient interventions, the basic principle of documenting evidence that the deficiency in question exists and that fortification will correct it must be adhered to. This can be supported with dietary intake assessments and biochemical screening that help diagnose nutrient deficiencies, estimate the gap to be filled and gauge the risk of excessive intake. Targeting micronutrient interventions, although programmatically challenging, should be considered whenever possible. To this end, recent advances in point-of-care diagnostics such as the Cornell NutriPhone(Reference Mehta, Colt and Lee66) and the density-based fractionation of erythrocytes(Reference Hennek, Kumar and Wiltschko67) hold promise as minimally invasive screening tools with potential for field application. Closer monitoring of the appropriate fortification of foods and of any possible risks from overlapping interventions is also needed.

Conclusion

Many LMIC do not have even one nationally representative food consumption survey; hence, the extent and magnitude of excessive intakes and related adverse effects remain largely unknown. Moreover, even when intake data exist, the proportion of the population exceeding the tolerable upper intake level is rarely reported. Future research should focus not only on inadequate intakes but also on the risks of excessive intakes. In addition, the validity of upper limits for specific age groups such as infants and young children is still debatable and thus requires further study. Operational research is also needed on how best to target micronutrient interventions and on whether more bioavailable fortificants provided at lower doses are effective and safe.

Acknowledgements

I would like to thank the ANEC VIII organisers.

Financial Support

None.

Conflict of Interest

None.

Authorship

The author had sole responsibility for all aspects of preparation of this paper.

References

1.Bailey, RL, West, KP Jr & Black, RE (2015) The epidemiology of global micronutrient deficiencies. Ann Nutr Metab 66, 22–33.
2.Black, RE, Allen, LH, Bhutta, ZA et al. (2008) Maternal and child undernutrition: global and regional exposures and health consequences. Lancet 371, 243–260.
3.Muthayya, S, Rah, JH, Sugimoto, JD et al. (2013) The global hidden hunger indices and maps: an advocacy tool for action. PLoS ONE 8, e67860.
4.Christian, P & Stewart, CP (2010) Maternal micronutrient deficiency, fetal development, and the risk of chronic disease. J Nutr 140, 437–445.
5.Kennedy, G, Nantel, G & Shetty, P (2003) The scourge of ‘hidden hunger’: global dimensions of micronutrient deficiencies. Food Nutr Agric 5, 8–16.
6.World Health Organization (2014) Global Nutrition Targets 2025: Anaemia Policy Brief (WHO/NMH/NHD/14·4). Geneva: World Health Organization. https://www.who.int/nutrition/publications/globaltargets2025_policybrief_anaemia/en/ (accessed September 2018).
7.Bhutta, ZA, Das, JK, Rizvi, A et al. (2013) Evidence-based interventions for improvement of maternal and child nutrition: what can be done and at what cost? Lancet 382, 452–477.
8.Allen, LH, De Benoist, B, Dary, O et al. (2006) Guidelines on Food Fortification With Micronutrients. Geneva: World Health Organization.
9.Mannar, MV & Hurrell, RF (2018) Food fortification: past experience, current status, and potential for globalization. In Food Fortification in a Globalized World, pp. 3–11 [MV Mannar and RF Hurrell, editors]. New York: Academic Press.
10.Osendarp, SJM, Martinez, H, Garrett, GS et al. (2018) Large-scale food fortification and biofortification in low- and middle-income countries: a review of programs, trends, challenges, and evidence gaps. Food Nutr Bull 39, 315–331.
11.Gibson, RS, Carriquiry, A & Gibbs, MM (2015) Selecting desirable micronutrient fortificants for plant-based complementary foods for infants and young children in low-income countries. J Sci Food Agric 95, 221–224.
12.Huybrechts, I, Aglago, EK, Mullee, A et al. (2017) Global comparison of national individual food consumption surveys as a basis for health research and integration in national health surveillance programmes. Proc Nutr Soc 76, 549–567.
13.Maziya-Dixon, B, Akinyele, O, Oguntona, EB et al. (2004) Nigeria Food Consumption and Nutrition Survey 2001–2003: Summary. Ibadan: IITA.
14.Farebrother, J, Zimmermann, MB, Abdallah, F et al. (2018) Effect of excess iodine intake from iodized salt and/or groundwater iodine on thyroid function in nonpregnant and pregnant women, infants, and children: a multicenter study in East Africa. Thyroid 28, 1198–1210.
15.Iodine Global Network (2017) Global Scorecard of Iodine Nutrition in 2017 in the General Population and in Pregnant Women (PW). Zurich, Switzerland: IGN.
16.Leung, AM & Braverman, LE (2014) Consequences of excess iodine. Nat Rev Endocrinol 10, 136.
17.Cooper, DS & Biondi, B (2012) Subclinical thyroid disease. Lancet 379, 1142–1154.
18.Pearce, EN (2018) Iodine nutrition: recent research and unanswered questions. Eur J Clin Nutr 72, 1226.
19.Stevens, GA, Bennett, JE, Hennocq, Q et al. (2015) Trends and mortality effects of vitamin A deficiency in children in 138 low-income and middle-income countries between 1991 and 2013: a pooled analysis of population-based surveys. Lancet Glob Health 3, e528–e536.
20.Tanumihardjo, SA, Kaliwile, C, Boy, E et al. (2018) Overlapping vitamin A interventions in the United States, Guatemala, Zambia, and South Africa: case studies. Ann NY Acad Sci. [Epublication ahead of print version].
21.Mondloch, S, Gannon, BM, Davis, CR et al. (2015) High provitamin A carotenoid serum concentrations, elevated retinyl esters, and saturated retinol-binding protein in Zambian preschool children are consistent with the presence of high liver vitamin A stores. Am J Clin Nutr 102, 497–504.
22.Bielderman, I, Vossenaar, M, Melse-Boonstra, A et al. (2016) The potential double-burden of vitamin A malnutrition: under- and overconsumption of fortified table sugar in the Guatemalan highlands. Eur J Clin Nutr 70, 947.
23.Tanumihardjo, SA (2018) Nutrient-wise review of evidence and safety of fortification: vitamin A. In Food Fortification in a Globalized World, pp. 247–253 [MV Mannar and RF Hurrell, editors]. New York: Academic Press.
24.Tanumihardjo, SA, Gannon, B & Kaliwile, C (2016) Controversy regarding widespread vitamin A fortification in Africa and Asia. Adv Nutr 7, 5.
25.Tanumihardjo, S, Gannon, B, Kaliwile, C et al. (2015) Hypercarotenodermia in Zambia: which children turned orange during mango season? Eur J Clin Nutr 69, 1346.
26.Tanumihardjo, SA, Mokhtar, N, Haskell, MJ et al. (2016) Assessing the safety of vitamin A delivered through large-scale intervention programs: workshop report on setting the research agenda. Food Nutr Bull 37, S63–S74.
27.Michaëlsson, K, Lithell, H, Vessby, B et al. (2003) Serum retinol levels and the risk of fracture. New Engl J Med 348, 287–294.
28.Feskanich, D, Singh, V, Willett, WC et al. (2002) Vitamin A intake and hip fractures among postmenopausal women. JAMA 287, 47–54.
29.Reynolds, E (2014) The neurology of folic acid deficiency. In Handbook of Clinical Neurology, vol. 120, pp. 927–943 [J Biller and JM Ferro, editors]. Elsevier.
30.Berry, RJ, Li, Z, Erickson, JD et al. (1999) Prevention of neural-tube defects with folic acid in China. New Engl J Med 341, 1485–1490.
31.Santos, LMP, Lecca, RCR, Cortez-Escalante, JJ et al. (2016) Prevention of neural tube defects by the fortification of flour with folic acid: a population-based retrospective study in Brazil. Bull World Health Organ 94, 22.
32.Boulet, SL, Yang, Q, Mai, C et al. (2008) Trends in the postfortification prevalence of spina bifida and anencephaly in the United States. Birth Defects Res A Clin Mol Teratol 82, 527–532.
33.Williams, J, Mai, CT, Mulinare, J et al. (2015) Updated estimates of neural tube defects prevented by mandatory folic acid fortification – United States, 1995–2011. MMWR Morb Mortal Wkly Rep 64, 1–5.
34.Patel, KR & Sobczyńska-Malefora, A (2017) The adverse effects of an excessive folic acid intake. Eur J Clin Nutr 71, 159–163.
35.Scaglione, F & Panzavolta, G (2014) Folate, folic acid and 5-methyltetrahydrofolate are not the same thing. Xenobiotica 44, 480–488.
36.Powers, HJ (2007) Folic acid under scrutiny. Br J Nutr 98, 665–666.
37.Krishnaveni, GV, Veena, SR, Karat, SC et al. (2014) Association between maternal folate concentrations during pregnancy and insulin resistance in Indian children. Diabetologia 57, 110–121.
38.Mason, JB, Dickstein, A, Jacques, PF et al. (2007) A temporal association between folic acid fortification and an increase in colorectal cancer rates may be illuminating important biological principles: a hypothesis. Cancer Epidemiol Biomarkers Prev 16, 1325–1329.
39.Pfeiffer, CM, Hughes, JP, Lacher, DA et al. (2012) Estimation of trends in serum and RBC folate in the US population from pre- to postfortification using assay-adjusted data from the NHANES 1988–2010. J Nutr 142, 886–893.
40.Pasricha, S-R, Drakesmith, H, Black, J et al. (2013) Control of iron deficiency anemia in low- and middle-income countries. Blood 121, 2607–2617.
41.Gera, T, Sachdev, HS & Boy, E (2012) Effect of iron-fortified foods on hematologic and biological outcomes: systematic review of randomized controlled trials. Am J Clin Nutr 96, 309–324.
42.World Health Organization (2011) Guideline: Use of Multiple Micronutrient Powders for Home Fortification of Foods Consumed by Infants and Children 6–23 Months of Age. Geneva: World Health Organization.
43.EPHI (2013) Ethiopian National Food Consumption Survey. Addis Ababa: Ethiopian Public Health Institute.
44.Baye, K, Guyot, J-P, Icard-Verniere, C et al. (2013) Nutrient intakes from complementary foods consumed by young children (aged 12–23 months) from North Wollo, northern Ethiopia: the need for agro-ecologically adapted interventions. Public Health Nutr 16, 1741–1750.
45.Abebe, Z, Haki, GD & Baye, K (2018) Simulated effects of home fortification of complementary foods with micronutrient powders on risk of inadequate and excessive intakes in West Gojjam, Ethiopia. Matern Child Nutr 14, e12443.
46.Gashu, D, Stoecker, BJ, Adish, A et al. (2016) Ethiopian pre-school children consuming a predominantly unrefined plant-based diet have low prevalence of iron-deficiency anaemia. Public Health Nutr 19, 1834–1841.
47.Gebreegziabher, T & Stoecker, BJ (2017) Iron deficiency was not the major cause of anemia in rural women of reproductive age in Sidama zone, southern Ethiopia: a cross-sectional study. PLoS ONE 12, e0184742.
48.Baye, K, Mouquet-Rivier, C, Icard-Vernière, C et al. (2014) Changes in mineral absorption inhibitors consequent to fermentation of Ethiopian injera: implications for predicted iron bioavailability and bioaccessibility. Int J Food Sci Technol 49, 174–180.
49.Guja, H & Baye, K (2018) Extrinsic iron from soil contributes to Hb regeneration of anaemic rats: implications for foods contaminated with soil iron. Br J Nutr 119, 880–886.
50.Gibson, RS, Wawer, AA, Fairweather-Tait, SJ et al. (2015) Dietary iron intakes based on food composition data may underestimate the contribution of potentially exchangeable contaminant iron from soil. J Food Compos Anal 40, 19–23.
51.Icard-Vernière, C, Hama, F, Guyot, J-P et al. (2013) Iron contamination during in-field milling of millet and sorghum. J Agric Food Chem 61, 10377–10383.
52.Rahman, S, Ahmed, T, Rahman, AS et al. (2016) Determinants of iron status and Hb in the Bangladesh population: the role of groundwater iron. Public Health Nutr 19, 1862–1874.
53.Ahmed, F, Khan, MR, Shaheen, N et al. (2018) Anemia and iron deficiency in rural Bangladeshi pregnant women living in areas of high and low iron in groundwater. Nutrition 51, 46–52.
54.Lönnerdal, B (2017) Excess iron intake as a factor in growth, infections, and development of infants and young children. Am J Clin Nutr 106, 1681S–1687S.
55.Idjradinata, P, Watkins, WE & Pollitt, E (1994) Adverse effect of iron supplementation on weight gain of iron-replete young children. Lancet 343, 1252–1254.
56.Hare, DJ, Arora, M, Jenkins, NL et al. (2015) Is early-life iron exposure critical in neurodegeneration? Nat Rev Neurol 11, 536–544.
57.Wessling-Resnick, M (2017) Excess iron: considerations related to development and early growth. Am J Clin Nutr 106, 1600S–1605S.
58.Paganini, D & Zimmermann, MB (2017) The effects of iron fortification and supplementation on the gut microbiome and diarrhea in infants and children: a review. Am J Clin Nutr 106, 1688S–1693S.
59.Jaeggi, T, Kortman, GA, Moretti, D et al. (2015) Iron fortification adversely affects the gut microbiome, increases pathogen abundance and induces intestinal inflammation in Kenyan infants. Gut 64, 731–742.
60.Dewey, KG & Oaks, BM (2017) U-shaped curve for risk associated with maternal hemoglobin, iron status, or iron supplementation. Am J Clin Nutr 106, 1694S–1702S.
61.Taylor, CL & Brannon, PM (2017) Introduction to workshop on iron screening and supplementation in iron-replete pregnant women and young children. Am J Clin Nutr 106, 1547S–1554S.
62.Fu, S, Li, F, Zhou, J et al. (2016) The relationship between body iron status, iron intake and gestational diabetes: a systematic review and meta-analysis. Medicine (Baltimore) 95, e2383.
63.Hwang, J-Y, Lee, J-Y, Kim, K-N et al. (2013) Maternal iron intake at mid-pregnancy is associated with reduced fetal growth: results from Mothers and Children's Environmental Health (MOCEH) study. Nutr J 12, 38.
64.Bao, W, Chavarro, JE, Tobias, DK et al. (2016) Long-term risk of type 2 diabetes in relation to habitual iron intake in women with a history of gestational diabetes: a prospective cohort study. Am J Clin Nutr 103, 375–381.
65.Tamura, T, Goldenberg, RL, Johnston, KE et al. (1996) Serum ferritin: a predictor of early spontaneous preterm delivery. Obstet Gynecol 87, 360–365.
66.Mehta, S, Colt, S, Lee, S et al. (2017) Rainer Gross Award Lecture 2016: a laboratory in your pocket: enabling precision nutrition. Food Nutr Bull 38, 140–145.
67.Hennek, JW, Kumar, AA, Wiltschko, AB et al. (2016) Diagnosis of iron deficiency anemia using density-based fractionation of red blood cells. Lab Chip 16, 3929–3939.