
The repeating history of objections to the fortification of bread and alcohol: from iron filings to folic acid

Max Kamien
Med J Aust 2006; 184 (12): 638-640. doi: 10.5694/j.1326-5377.2006.tb00422.x
Published online: 19 June 2006

In the early 1970s, I worked in Bourke, 800 km west of Sydney. The town's population was 3500, of whom 25% were Aboriginal people. More than 30% of the Aboriginal people had clinical signs of vitamin deficiency, such as angular stomatitis, glossitis and skin xerosis. Fifty per cent of the children and 15% of adult women also had laboratory evidence of iron deficiency. Their blood vitamin levels and daily food vitamin intakes were worse than those found in any previous survey in Australia.1 Their staple diet was white bread eaten with Golden Syrup, jam or honey, and washed down with large quantities of sweetened tea.2 This dietary pattern was acquired in the 1920s, when state government policy was to segregate Aboriginal people and issue them a weekly ration of eight pounds of flour, two pounds of sugar and a quarter pound of tea.3

I thought that the quickest way to improve the health of Aboriginal people in Bourke was to fortify their bread with iron, thiamine, niacin and riboflavin. The local baker agreed. The amounts added were well within the levels permitted by the Pure Food Act 1908 (NSW), and about half those currently found in breakfast cereals.

I was then an unpublished novice researcher, naively unaware of the difference between legal regulation and community consent. My research mentors had asked the Bread Research Institute to act as an independent monitor of the study. A spokesman mentioned it to a reporter from a weekly newspaper, the National Times, who, to our surprise, ran a front-page story under the headline “The guinea-pigs of Bourke: the secret bread tests”.4 The article provoked considerable correspondence. Most writers were supportive, but some haematologists maintained that fortification would put people with thalassaemia and haemochromatosis at further risk of iron overload. That was a reasonable generalisation, even though there were no known cases of either disorder in Bourke Shire. They also contended that the added iron could mask the early diagnosis of bowel cancer by slowing the appearance of iron-deficiency anaemia. Some nutritionists and public health doctors also argued that education of the population was far preferable to fortification of foods.5

The local baker had a low tolerance for this controversy and stopped fortifying his bread. However, re-examination of the Aboriginal people previously tested showed significant improvement in blood levels of the vitamins that had been added to the bread.6 At the same time, blood levels of vitamins not added to the bread remained unchanged or worsened, while physical signs of B-group vitamin deficiency had virtually disappeared.6 An editorial in the Lancet pointed out that this was one of the few studies providing objective evidence of the benefits of fortification in nutritionally compromised communities.7 The work also contributed to state health ministers instituting the supply of fortified flour to isolated Aboriginal communities.8

History of fortification

Bread has been a staple food for about 10 000 years, and yeast-risen bread for 4000 years. White bread became widely available from about 1870, when roller milling of wheat made it possible to remove the outer layers of bran and pollard. This led to a range of vitamin B deficiency diseases in people whose staple food was white flour, but not in those who consumed brown flour. Research identified and synthesised the vitamins removed in milling, and led to the idea of fortifying white flour to replace what had been lost during refining.9

The earliest known nutrient supplementation of food was in 400 BC, when the physician Melampus added iron filings to wine to increase Persian soldiers’ strength.10 In 1833, the French agricultural chemist Jean-Baptiste Boussingault urged his government to add iodine to salt to prevent goitre. It took until 1920 for his suggestion to be implemented, but it proved so successful in reducing goitre and cretinism in affected populations that it prepared the way for the fortification of other foods.11

Between 1924 and 1944, iodine was added to salt, vitamins A and D to margarine, vitamin D to milk, and vitamins B1, B2, niacin and iron to flour and bread. The major impetus to fortification was World War II. Both the United Kingdom and the United States governments recommended voluntary fortification of flour and bread. By 1943, the US Army would not purchase any flour or bread that was not fortified. In the UK, calcium, thiamine, niacin and iron were added to bread. In some cases, this was poorly absorbable powdered iron obtained from grinding up old railway lines.12

Many countries, including Russia, the UK and the US, have added iron and vitamins to flour, bread and other foodstuffs with the intent of preventing or alleviating nutritional deficiencies in their populations. Bread has been fortified with various preparations of iron, vitamins, amino acids, sesame, mung, soy and broad bean flours, pumpkin seed, iodine, calcium, magnesium and zinc. The first Australian experiment in fortification was in 1956, when iodine was added to bread in Canberra to prevent goitre and cretinism.13 Tasmania added iodine to bread between 1966 and 1974,14 ceasing the practice because of concern about iodine toxicity from milk contaminated with iodine-based antiseptic solutions. Newer sanitisers have since reduced the iodine content of milk, leading to the re-emergence of mild iodine deficiency and calls for mandatory iodisation of salt.15

Thiamine

In the 1980s, the Adelaide psychiatrist Peter Yellowlees and others rekindled the fortification debate by pointing out that alcohol-related brain damage was a major Australian public health problem requiring a national policy of thiamine fortification of flour, bread and alcoholic beverages.16,17 Yellowlees was unaware of the earlier studies in Bourke, which might have warned him of the controversy that can arise from apparently simple dietary interventions. And history was indeed repeated: in 1987, the National Health and Medical Research Council recommended the addition of thiamine to beer and flagon wine, but this was opposed by both brewers and anti-alcohol groups.18,19

The brewers argued that thiamine changed the flavour of beer. In fact, the taste of thiamine can merge with the flavour of beer, but not with that of table wine. The brewers also argued that adding thiamine to alcoholic beverages would harm their export trade, because such beverages would not be acceptable in other countries, and asserted that if Australian beers had to contain added thiamine, so should beers imported into Australia. Yellowlees challenged these objectors to provide information about the number of alcoholics who developed brain damage through drinking expensive German beer, and commented that their “stance was essentially the same maladaptive psychological defence mechanism as that of many alcoholics — denial”.20

Temperance groups and nutritionists stated that it was philosophically unsound to add good food to a bad product and that making alcohol into a super food was likely to encourage greater consumption. The compromise solution was to add thiamine only to bread, and this became mandatory in 1991.19

Unfortunately, no monitoring system was set up to examine the effect of this public health measure. Indirect evidence suggests that it has decreased the incidence of Wernicke’s encephalopathy and Korsakoff’s psychosis.19

Folic acid

By the 1950s, obstetricians had begun to suspect that diet played a part in neural tube defects, as the incidence of the condition was higher in low socioeconomic populations and in babies conceived in winter, when fresh food was scarce.21 Folic acid, a member of the B group of vitamins, takes its name from the Latin folium (leaf); it was isolated from spinach in 1941.22

A number of studies have examined whether folic acid supplementation protects against neural tube defects. The best-known Australian study was by Bower and Stanley in 1989.23 The two most definitive studies were the British Medical Research Council Vitamin Study Research Group report on the prevention of neural tube defects in 1991, and a Hungarian study in 1992.24,25

A neural tube defect is a catastrophic condition for a child and his or her parents. Neural tube defects affect about one child per 500 births (about 400 children in Australia per year), and folate fortification could reduce these figures by 75%.24,26 Consequently, there has been much debate about the best method of ensuring that women of childbearing age consume at least 400 μg of folic acid per day. Paediatricians and dieticians urge women of childbearing age to increase their dietary intake of folate by eating dark-green leafy vegetables and citrus fruits. The nutritional supplement and vitamin industry maintains that it is better for women to take multivitamin supplements routinely before and during pregnancy. The problem with both approaches is that the information has to reach women, women have to act on it, and they have to know when they wish to become pregnant. Many, if not most, pregnancies are unplanned, and information about folic acid seems to have reached only about 50% of fertile women, and not the 50% most in need.26-28

In 1994, an NHMRC Expert Panel recommended voluntary fortification of bread, flour and cereals as the preferred strategy.29 But by 2001, only a few of the recommended foods were being fortified with folate.27 Recently, the Australian and New Zealand Food Regulation Ministerial Council recommended that folate fortification of flour become mandatory, but this is yet to happen.26 One factor in the delay is the persisting fear that folic acid will mask the development of pernicious anaemia and thus lead to neurological damage. This concerns geriatricians, who warn that folic acid fortification may adversely affect the 15%–20% of elderly patients who have low vitamin B12 levels.30 Truswell has traced the historical antecedents of this anxiety. Before vitamin B12 was isolated in 1950, high doses of folic acid were used to treat pernicious anaemia. This often improved the blood picture, but had no effect on the neurological complications of vitamin B12 deficiency. With the much lower intakes of folic acid now proposed and the easy measurement of serum vitamin B12, such anxieties are misplaced.31 Experience in 50 countries with mandatory folic acid fortification has shown no apparent increase in neurological disorders.32

Historical patterns of objection to food fortification

The overall history of fortification has been beneficial, particularly with regard to iodine in salt and bread to prevent goitre and cretinism; vitamin D in milk to prevent rickets; thiamine, niacin and riboflavin in bread to prevent beri-beri and pellagra; and fluoride in drinking water to reduce dental caries. Despite this positive history, any attempt to improve the public’s health by “tampering” with their food or water invariably provokes a predictable and repetitive pattern of opposition.

Some nutritionists oppose fortification on the grounds that education about a well balanced diet is a more logical approach. The nutritional supplement and vitamin industry promotes the view that it is better for people to consume multivitamin supplements. Both groups “use the rhetoric of the anti-fluoridationists of the last century” to ask whether it is ethical to medicate a conscious and mentally competent adult without obtaining that adult’s informed consent.28 They do not ask whether it is ethical to deny potential benefits to populations and socioeconomic classes at risk.

The task of expert committees and policy makers in this area is to obtain the scientific information needed to weigh the potential benefits against the real and theoretical risks of vitamin or mineral overload. Where such information is lacking, research and monitoring are mandatory and should be financially supported. This is the only logical approach to resolving scientific and ethical uncertainty.33

  • Max Kamien

  • Discipline of General Practice, University of Western Australia, Perth, WA.


Correspondence: mkamien@cyllene.uwa.edu.au

Competing interests:

None identified.

  • 1. Nobile S, Woodhill JM. A survey of the vitamin content of some 2000 foods as they are consumed by selected groups of the Australian population. Food Technol Aust 1973; 25: 80-100.
  • 2. Kamien M, Nobile S, Cameron P, Rosevear P. Vitamin and nutritional status of a part-Aboriginal community. Aust N Z J Med 1974; 4: 126-137.
  • 3. Harrison L. Food, nutrition and growth in Aboriginal communities. In: Reid J, Trompf P, editors. The health of Aboriginal Australia. Sydney: Harcourt Brace Jovanovich, 1991: 123-172.
  • 4. Ross M. The guinea pigs of Bourke: the secret bread tests. The National Times 1974; 25 Feb: 1.
  • 5. Kamien M. The secret bread tests: selective primary health care or experimentation on human beings? Soc Sci Med 1987; 24: 445-448.
  • 6. Kamien M, Woodhill JM, Nobile S, et al. Nutrition in the Australian Aborigines — effects of the fortification of white flour. Aust N Z J Med 1975; 5: 123-133.
  • 7. Fortified flour [editorial]. Lancet 1975; 1: 401-402.
  • 8. Fortified flour for SA Aborigines [editorial]. Food Technol Aust 1973; 25: 612-613.
  • 9. Bread. In: Microsoft Encarta Online Encyclopedia, 2005. Available at: http://encarta.msn.com (accessed Feb 2006).
  • 10. Mejia LA. Fortification of foods: historical development and current practices. Food Nutr Bull 1994; 15. Available at: http://www.unu.edu/unupress/food/8F154e/8F154E03.htm (accessed Feb 2006).
  • 11. Rosenfeld L. Discovery and early uses of iodine. J Chem Educ 2000; 77: 984-987.
  • 12. Brown EB. Irony. Br J Haematol 1972; 23 Suppl: 97-100.
  • 13. Hipsley EA. A new method of preventing goitre in Canberra. Med J Aust 1956; 1: 532-533.
  • 14. Gibson HB. The history of thyroid research in Tasmania from 1949 until the present day. In: King H, editor. Epidemiology in Tasmania. Canberra: Brolga Press, 1987: 35-59.
  • 15. Li M, Waite KV, Ma G, Eastman CJ. Declining iodine content of milk and re-emergence of iodine deficiency in Australia [letter]. Med J Aust 2006; 184: 307.
  • 16. Yellowlees PM. Thiamin deficiency and prevention of Wernicke–Korsakoff syndrome. A major public health problem. Med J Aust 1986; 145: 216-219.
  • 17. Price J, Kerr R, Hicks M, Nixon PF. The Wernicke–Korsakoff syndrome: a reappraisal in Queensland with special reference to prevention. Med J Aust 1987; 147: 561-565.
  • 18. National Health and Medical Research Council. Report of the 104th session, November 1987. Canberra: NHMRC, 1988.
  • 19. Drew LR, Truswell AS. Wernicke’s encephalopathy and thiamine fortification of food: time for a new direction? Med J Aust 1998; 168: 534-535.
  • 20. Yellowlees P. Thiamine in our bread and wine [letter]. Med J Aust 1990; 153: 567.
  • 21. Hibbard BM. The role of folic acid in pregnancy with particular reference to anaemia, abruption and abortion. J Obstet Gynaecol Br Commonw 1964; 71: 529-542.
  • 22. Davis RE, Nicol DJ. Folic acid. Int J Biochem 1988; 20: 133-139.
  • 23. Bower C, Stanley FJ. Dietary folate as a risk factor for neural tube defects: evidence from a case–control study in Western Australia. Med J Aust 1989; 150: 613-619.
  • 24. MRC Vitamin Study Research Group. Prevention of neural tube defects: results of the MRC vitamin study. Lancet 1991; 338: 131-137.
  • 25. Czeizel AE, Dudás I. Prevention of the first occurrence of neural-tube defects by periconceptional vitamin supplementation. N Engl J Med 1992; 327: 1832-1835.
  • 26. Maberly GF, Stanley FJ. Mandatory fortification of flour with folic acid: an overdue public health opportunity. Med J Aust 2005; 183: 342-343.
  • 27. Abraham B, Webb K. Interim evaluation of the voluntary folate fortification policy. National Food and Nutrition Monitoring and Surveillance Project. Canberra: Commonwealth of Australia, 2001. Available at: http://www.health.gov.au/internet/wcms/publishing.nsf/Content/health-pubhlth-strateg-food-monitoring.htm-copy2 (accessed Apr 2006).
  • 28. Junod SW. Folic acid fortification: fact and folly. Rockville, Md: Food and Drug Administration, 2001. Available at: http://www.fda.gov/oc/history/makinghistory/folicacid.html (accessed Apr 2006).
  • 29. National Health and Medical Research Council. Folate fortification. Report of the expert panel on fortification. Canberra: NHMRC, 1995.
  • 30. Flicker LA, Vasikaran SD, Thomas J, et al. Homocysteine and vitamin status in older people in Perth. Med J Aust 2004; 180: 539-540.
  • 31. Truswell AS. Could a general increase in folate intake mask or accelerate the development of neurological disease due to vitamin B12 deficiency? In: National Health and Medical Research Council. Folate fortification. Report of the expert panel on fortification. Canberra: NHMRC, 1995: 43-47.
  • 32. Wald NJ, Bower C. Folic acid, pernicious anaemia, and prevention of neural tube defects. Lancet 1994; 343: 307.
  • 33. Lawrence M. Assessing the case for mandatory folate fortification: policy-making in the face of scientific uncertainties. Aust N Z J Public Health 2005; 29: 328-330.
