In 1922, one of the great puzzles of nutrition was solved by a Swiss doctor on a mission to prevent the people of Appenzell Ausserrhoden from developing enlarged thyroid glands—a condition better known as goiter. People weren’t getting enough iodine in their diets, and Hans Eggenberger, the chief physician of the district hospital, in the town of Herisau, knew it.
After careful observation of the dietary habits of the Swiss, Eggenberger introduced iodized salt—table salt with sodium iodide or potassium iodide added—to Appenzell Ausserrhoden. Within a year, the local incidence of goiter dropped precipitously. The United Swiss Rhine Salt Works immediately began producing iodized salt, and in 1922 the Swiss Federal Office of Public Health formed the Swiss Goiter Commission to ensure that, moving forward, the Swiss would only have iodized salt at their tables.
Across the Atlantic, in the U.S., goiter remained an issue of almost epidemic proportions in the Great Lakes area. David Murray Cowie, a professor of pediatric medicine at the University of Michigan, looked at the Swiss success and asked, “why not here?” By May 1, 1924, the Morton Salt Company was distributing the cure for goiter to households nationwide: iodized salt.
Today, iodized salt has lowered the incidence of goiter considerably around the world. Not even three decades after the discovery, goiter rates among Michigan schoolchildren had fallen from a high of 66 percent in some cities down to just 0.2 percent. In England, by the turn of the 21st century, goiter incidence had dropped from roughly 1 percent to 0.041 percent. Iodized salt has also minimized risks related to pregnancy, such as stillbirth and spontaneous abortion, and cognitive defects in a child’s early developmental years, such as cretinism. Just over a decade after UNICEF and the World Health Organization (WHO) officially recommended in 2003 that the world’s salt supplies be iodized, 120 countries had programs in place.
While Eggenberger’s revelation has led to huge nutritional successes, other micronutrient deficiencies still plague people all over the world. For example, more than one-quarter of the world’s population is anemic, with children most at risk. Vitamin A deficiency, the leading cause of preventable blindness in children, afflicts roughly 250 million preschool-age children. These illnesses and others like them crop up in both underdeveloped countries and industrialized ones—where food is either bountiful but monotonous, or, in the case of food deserts, absent.
That’s why groups like WHO and the Global Alliance for Improved Nutrition are trying to solve modern deficiencies of other micronutrients in the same way we solved the iodine problem nearly 100 years ago: through people’s food.
The first step, says Luis Mejia, one of the scientists leading the charge, is deciding which foods get fortified. That depends heavily on how often the food is consumed, in what quantities it’s consumed, and who’s doing the consuming. “You can’t just choose any food,” says Mejia, a professor of food science and nutrition at the University of Illinois. It wouldn’t make sense, for example, to fortify wasabi or Old Bay Seasoning—those spices aren’t consumed by enough people. The vehicle must be quick, convenient and used pretty much daily. In 2010, for instance, Mejia and fellow scientists at INCAP, a nutritional organization in Central America and Panama, saw major success bringing vitamin A and iron to the region’s supplies of sugar.
However, some experts suggest Mejia’s victory in fortifying sugar is a best-case scenario and addresses only a few of the many challenges likely to be experienced in other countries. Iron fortification is a great example. While each country necessarily has its own nutritional challenges, due to unique profiles of disease and deficiency, across the board, iron intake is problematic. In 2010, food scientists convened to analyze how well iron-fortified wheat flour had been helping the populations of 78 countries. The short answer: It wasn’t. Only nine countries were judged to have benefited from their current programs. The other 69 were either using the wrong type of iron powders, the wrong concentrations or both.
In large part, this is because of other endemic health problems in those countries. “If you get down to the nitty-gritty, one of the main problems is trying to give nutrients to populations that have a lot of inflammation and infection,” says Richard Hurrell, professor emeritus at ETH Zurich and co-editor of the 2006 WHO report “Guidelines on Food Fortification With Micronutrients.” In sub-Saharan Africa, especially, these maladies prevent iron fortification programs from taking hold because the body’s natural inflammatory response is to block absorption of the nutrient. “You can’t just go with food fortification,” says Hurrell. “You have to go with a hygiene program and a program to treat malaria or to treat worms.”
Mejia suspects vitamin A fortification may encounter similar logistical hurdles. While Southeast Asia has nowhere near the level of infectious disease found in Africa, people there have micronutrient deficiencies because their diets are too simple, which impedes their bodies’ ability to absorb certain nutrients. Unlike water-soluble vitamins such as the B-complex vitamins and vitamin C, vitamin A is fat-soluble, so its absorption depends on the presence of dietary fat. In countries with rice-heavy diets, fortifying condiments like soy sauce and fish sauce must take these fat scarcities into account. Fortunately, says Mejia, “there is usually a minimum amount of fat from nuts or some fruits that may help the intestinal absorption of vitamin A. True, the absorption may not be optimal, but it is better than nothing.”
Finally, there is the matter of stability. Fortification programs are successful when they take existing foods and preserve qualities like taste, odor, texture and color, so that people don’t need to make any mental leaps to incorporate a strange new food into their diets. If all goes to plan, nothing on the consumer’s end needs to change. Cowie’s mission to iodize America’s salt might not have worked if his plans required modifying it into a bitter blue powder or turning it into chalky pink rocks. Fortification’s big sell is that it is expansive and automatic—people who eat salt eat salt, and fortifying something complex, like mustard or chocolate, leaves too much room for naysayers.
Then there’s the cost, which stands in the way of many blue-sky solutions to global problems, including the micronutrient fix. Understanding how a nutrient can get to a population requires vast amounts of fieldwork, data collection and analysis, and man-hours, all of which eat up millions of dollars and precious time that few countries can afford. Luckily, Hurrell says, the fortification industry recently has had some help from donors like the Bill and Melinda Gates Foundation, USAID, nongovernmental organizations, academic institutions and other private sector companies.
If funding can keep pace with the advances made in both food and implementation sciences, Mejia suspects countries will be able to roll out new programs within a handful of years: fortified curry powder in India and sauces in the Philippines and Vietnam may be a reality before the decade’s out. Compared to other global public health programs, which often project decades into the future, fortification holds a unique power: to help billions of people get healthier in the same generation that hope first arrived.
“If you can do that,” Mejia says, “I think you have a winner.”