Multiple micronutrient nutrition is an idea that originated in the 1940s and exemplifies the iterative nutritional paradigm. In the first four decades of the 20th century, scientists sought to isolate and characterize the vitamins responsible for xerophthalmia, rickets, pellagra, scurvy, and beriberi. Dietary requirements for the different micronutrients began to be established in the early 1940s. Surveys showed that multiple micronutrient deficiencies were widespread in industrialized countries, and the problem was addressed through cod-liver oil, iodized salt, fortified margarine, flour fortification with multiple micronutrients, and, with rising living standards, the increased availability and consumption of animal source foods. After World War II, surveys showed that multiple micronutrient deficiencies were also widespread in developing countries. Approaches to eliminating multiple micronutrient deficiencies include periodic vitamin A supplementation, iodized salt, targeted iron/folate supplementation, fortified flour and other fortified foods, home fortification with micronutrient powders, and homestead food production. Given the important effects of micronutrients on health and survival, the prevention of multiple micronutrient malnutrition is a key factor in achieving the Millennium Development Goals. © 2012 American Society for Nutrition.
Semba, R. D. (2012). The historical evolution of thought regarding multiple micronutrient nutrition. Journal of Nutrition, 142(1). https://doi.org/10.3945/jn.110.137745