Issue 61
Public Health Newsletter of Water Quality Research Australia
In this Issue:

Water, Sanitation and Hygiene
Östersund Boil Water Ends
Desalination Decision for Israel
Mosquito-borne Diseases in Australia
News Items
From The Literature
Web Bonus Articles: Arsenic, Chemical Contamination*, Disinfection Byproducts, Endemic Gastroenteritis, Fluoride*, Giardia*, Hardness*, Household Interventions*, Lead*, Legionella*, Metals, Outbreaks*, Rainwater, Recreational Water, Risk Assessment*, Water Softening, Water Quality*
Mailing List Details
Editor: Martha Sinclair
Assistant Editor: Pam Hayes

* Summaries of web bonus articles on these topics are contained in the PDF version of Health Stream on the WQRA website: www.wqra.com.au
HEALTH STREAM
March 2011
Water, Sanitation and Hygiene

A series of four articles recently published in the journal PLoS Medicine has highlighted the continuing urgency of efforts to provide safe and adequate water supplies and sanitation for improving global health. The papers are authored by several eminent international researchers with extensive experience in water, sanitation and hygiene (WSH) programs around the world. As well as citing data from recent studies in developing countries, the authors note many parallels with the situation which existed a century ago in developed nations. At that time diseases transmitted by unsafe drinking water and poor sanitation and hygiene were common, and the fundamental links between WSH and public health were well recognised by the medical community and government, leading to effective coordinated action. The authors argue that the fundamental role of WSH in improving and maintaining public health has been forgotten by many current members of the public health community, and that a new level of engagement and advocacy by the health sector is needed to address this pressing issue.

The first article notes that while tuberculosis, malaria and HIV/AIDS are generally perceived as the three most high profile issues for the international public health community, diarrhoeal illness still kills more young children each year than these three diseases combined (1). Nearly 90% of deaths from infectious diarrhoea occur in children under 5 years of age, and the bulk of this mortality is concentrated in 15 developing countries. In addition to the direct burden of deaths from enteric pathogens, lack of safe water and adequate sanitation is responsible for an even larger disease burden from the long term sequelae of such infections. Recurrent episodes of diarrhoeal
illness and intestinal parasite infection in children also contribute to malnutrition, which in turn increases susceptibility to other infectious diseases including tuberculosis and pneumonia, adding to the health burden.

The body of research evidence shows that where pre-existing conditions are poor, well implemented improvements in one or more of water quality, water quantity, sanitation or hygiene will reduce diarrhoeal illness by around one-third. Even more significant benefits in the order of 60% reduction can be obtained by providing safe water to individual properties, thereby reducing the risk of contamination of shared supplies and post-collection contamination. Providing water at or close to homes enables use of larger quantities for personal and household hygiene, as well as removing the need to spend significant amounts of time each day to collect water from distant supplies. Improvements in personal hygiene not only reduce rates of enteric infection but also lower the risks of respiratory and skin diseases, neonatal infections, trachoma and schistosomiasis.

Collectively, the effects of poor hygiene, inadequate sanitation, and insufficient and/or unsafe drinking water are estimated to be responsible for 7% of the total annual disease burden and 19% of child mortality worldwide. Yet most of this burden is preventable, and the cost-effectiveness of interventions to reduce diarrhoeal diseases is at least on par with better supported interventions aimed at tuberculosis, malaria and HIV/AIDS. While the direct health benefits of water and sanitation interventions are massive, they are outweighed by indirect economic benefits. Malnutrition and intestinal helminth infection are associated with stunting, impaired school performance, the need to repeat school years and late entry into the labour force.
Water and sanitation provision within schools as well as homes is also important, and it has been suggested that availability of improved school sanitation facilities may improve attendance and retention rates, especially for female students. Economic benefits also arise from releasing time for productive activity when water and sanitation facilities are provided close to home. Estimates in Ghana and Pakistan have indicated that the total economic costs of inadequate water and
sanitation are in the order of 9% of Gross Domestic Product. This analysis shows that the benefits of addressing water and sanitation needs in developing countries far exceed the costs, but the priority given to these issues and the level of investment in improving the situation remains inadequate.

The United Nations' Millennium Development Goal (MDG) 7 is 'to reduce by half the proportion of people without sustainable access to safe drinking water and basic sanitation by 2015'. As noted above, the wider societal impact of diarrhoeal illness means that achievement of this MDG is a key factor in supporting other MDGs relating to child mortality, poverty, education and gender equality. Current estimates suggest the goal relating to safe water provision will be met, but the sanitation goal is likely to fall 1 billion people short of the target figure. However even if the MDG is achieved, 800 million people (one tenth of the world's population) will still be relying on water from distant and/or unprotected sources, and 1.6 billion (one quarter of the world's population) will still lack access to even a simple latrine. Population growth also means that the number of people in the unserved categories continues to increase even if the percentage coverage remains the same. Furthermore the MDG concept, while providing a simple metric against which to judge progress, fails to take account of the many gradations in safety and reliability of water and sanitation systems that occur in reality.

The second and third papers in the series look in more detail at the health and economic burden of unsafe water supply and lack of sanitation respectively (2, 3). They also summarise a range of successful intervention approaches as well as constraints and challenges which hamper progress.
The fourth paper discusses the reasons why progress has been slow and outlines the actions needed to make significant progress towards ensuring universal access to water, sanitation and hygiene (4). Although advances in all three areas are essential for developing countries to secure good health for their citizens, in practical terms their implementation is generally best approached through separate programs. At the international level, WSH is a low priority area for funding, and available funds are not
necessarily being directed towards countries with the poorest coverage. At the national level, budget allocations are low and funding is often heavily reliant on external aid. In many recipient nations, policy and planning are weak and there is poor coordination of effort both within and outside government. However, there is no guarantee that increased funding alone will automatically lead to increased progress, and lessons need to be learned from past failures which have worsened rather than improved local situations.

The authors propose an agenda for action, taking into account the roles of key actors including households, local and central governments, and external agencies. Individual households in developing countries are already major investors in locally implemented water and sanitation facilities, and encouraging this type of investment is an effective way of producing improvements for larger numbers of people with simple and relatively cheap technologies that are able to be locally maintained. The responsiveness of local government to community needs must be improved, and they must work better with local service providers to provide effective delivery of improvements. Central governments need to target adequate financial resources to local governments in high need areas, and work with them to ensure that regulation enhances rather than impedes delivery of advances. The role of external agencies is not only to increase funding levels for WSH but also to ensure that funds are used effectively to leverage investments by householders, local and national governments. The authors call for the principles set forth in this series of papers for improving water and sanitation supplies to be formally incorporated into aid policies and agreements by international donor agencies. The role and responsibilities of the health sector are also discussed.
The health sector plays a direct role in provision of WSH in health care facilities, investigation of outbreaks of WSH-related disease, and the integration of WSH into other health programmes. In addition it is essential that the health sector and health professionals are involved in advocacy for resource allocation for WSH improvements, and that health data are used to
support this need. Action is also needed to ensure that environmental health (including WSH) is given appropriate priority in national health policies, and the health sector must lead dialogue with other sectors of government to promote WSH interventions in homes, schools, medical facilities and communities. Health professionals must also play a role in developing environmental health regulations relating to provision of sanitation and water quality, including the quality of service for water supplies. Health promotion in the community is also essential to increase the demand for sanitation and improve hygiene behaviours.

These papers were published in PLoS Medicine 7(11), 16 November 2010.

(1) Hygiene, Sanitation, and Water: Forgotten Foundations of Health. Bartram J and Cairncross S. doi:10.1371/journal.pmed.1000367
(2) Water Supply and Health. Hunter PR, MacDonald AM and Carter RC. doi:10.1371/journal.pmed.1000361
(3) Sanitation and Health. Mara D, Lane J, Scott B and Trouba D. doi:10.1371/journal.pmed.1000363
(4) Hygiene, Sanitation, and Water: What Needs to Be Done? Cairncross S, Bartram J, Cumming O and Brocklehurst C. doi:10.1371/journal.pmed.1000365
Östersund Boil Water Ends

The boil water order issued for the Swedish city of Östersund officially ended on 18 February, twelve weeks after it was imposed as a result of a waterborne cryptosporidiosis outbreak. About 12,700 people are reported to have experienced gastroenteritis during the outbreak. As reported in the December 2010 issue of Health Stream, the outbreak was attributed to human faecal contamination of the surface water source serving the city via the illegal connection of a residential sewage outlet to the stormwater system. Drinking water treatment processes were inadequate to remove or inactivate the chlorine-resistant protozoan parasite, resulting in a widespread outbreak affecting about one quarter of the population of the city.

The outbreak is believed to have begun in early to mid-November, and the boil water notice was issued on 26 November. The municipal water company was able to rapidly obtain a UV disinfection system from another town which had purchased but not yet installed a new system, and this significantly shortened the period required to restore the drinking water supply to
potable condition. Installation of the UV system at the water treatment plant was completed late in December, followed by a commissioning and testing period of several weeks. Finally, an extensive flushing program was undertaken to remove contamination from the distribution system. Water samples were sent for testing in the United States to verify that any oocysts remaining in treated water had been inactivated by the UV treatment and were not capable of causing infection.

On 17 March an announcement was made that after consideration of the legal issues, the municipality had concluded that it had no legal obligation to pay compensation for damages suffered by individuals or businesses affected by the outbreak. The municipality plans to hold a citizens' meeting about the water contamination incident on 21 March. Information will be presented about water quality, Cryptosporidium infection, the handling of the outbreak, and decontamination of the water distribution system. A comparison will also be made with other Cryptosporidium outbreaks around the world. Residents will be invited to raise questions and discuss their ideas and concerns with experts and representatives of the municipality.

In the wake of the outbreak, the Swedish National Food Agency (which is also responsible for regulation of water quality) has issued new recommendations to reduce the risk of transmission of Cryptosporidium and Giardia through drinking water supplies. The recommendations are advisory rather than legally binding, and were developed jointly by the National Food Agency, the Swedish Infectious Diseases Institute and the water industry body Swedish Water.
In essence they conform to internationally recognised risk management approaches for protection of drinking water quality:
• characterisation of raw water sources, identification of faecal impacts, catchment protection measures and water quality monitoring
• evaluation of treatment barriers and adequacy of treatment performance to ensure they are capable of handling potential challenges in the microbial quality of source water
Desalination Decision for Israel

The Israeli Ministry of Health has announced that it intends to require water providers to supplement desalinated water with magnesium in order to prevent the potential adverse effects of magnesium deficiency. Demand for potable water has been steadily rising in Israel and it is estimated that demand will begin to outstrip the capacity of current supply sources by around 2012. Desalinated water currently provides about 20% of Israel's drinking water supply, and major expansions to cope with rising demand are predicted to increase this proportion to around 50% by 2020. The question of whether a minimum level of magnesium in desalinated water should be required was previously considered by the Ministry as recently as 2009, but after some debate, it was not included in revisions to relevant water quality regulations. The new decision was made following delivery late last year of the report of a multidisciplinary panel which reviewed the issue of the mineral content of drinking water and cardiovascular disease risks.

This question was first raised by epidemiological studies published in the late 1950s when it was observed that mortality rates from cardiovascular disease were lower in areas with 'hard' water (water with relatively high calcium and/or magnesium content) than in areas with 'soft' water. The hypothesis was advanced that high levels of one or both minerals in drinking water had a protective effect leading to lower death rates from heart attack, and perhaps stroke and other related diseases. However early studies were of an ecological design comparing statistics at the population level and could not determine individual water exposures or assess whether other differences also existed between the areas being compared.
Research on the topic has continued for several decades and more recent studies have used stronger study designs and looked at individual exposures and health outcomes, but unfortunately many have lacked assessment of important cardiovascular risk factors such as smoking and diet which may have influenced disease rates. Some studies have also not reported separate concentrations of calcium and magnesium but only assessed the composite measure of hardness.
The question has been reviewed several times by the World Health Organisation (WHO), but no decision has yet been made to include recommendations for minimum concentrations of calcium or magnesium in the WHO Drinking-water Guidelines. The most recent round of WHO reviews was prompted by requests from member countries to provide guidance on the use of desalinated water for drinking. The question was asked whether any health considerations regarding mineral content needed to be taken into account, as the composition of desalinated water may be different from previously used water sources. The desalination process removes most of the mineral content of the source water and it is necessary to deliberately add some mineral content (often in the form of calcium salts) after desalination to avoid problems of corrosion in the water distribution system; at present, however, this is done for operational reasons rather than for health benefits.

An expert meeting convened by the World Health Organisation in 2006 concluded that considerable knowledge gaps still existed that hampered definitive resolution of the question (1). There is no doubt that both calcium and magnesium play important roles in cardiovascular health and it is biologically plausible that increased intake could reduce morbidity and/or mortality. It was agreed that the body of evidence tended to suggest any beneficial effect was more likely to be associated with magnesium rather than calcium. Although magnesium intake even from water with high mineral content is generally much lower than from food, absorption from water may be more efficient. However the expert meeting stopped short of endorsing adoption of guidelines for minimum levels of these minerals for drinking water supplies, and recommended a range of research approaches to provide better evidence to support decision making.
While the question was raised in the context of desalinated water, the adoption of health-based guideline values for minimum levels of calcium and/or magnesium in drinking water would have obvious implications for drinking water supplies drawn from conventional sources with low mineral content. Therefore this issue is of importance in the broader international context, as well as for those countries currently utilising desalination processes.
The Israeli proposal would see desalinated supplies supplemented with magnesium to a concentration of 20-30 mg of magnesium/litre. The decision was opposed by the representative of the Israeli water authority who argued that the health evidence was not yet sufficiently resolved and the costs of water supplementation had been considerably underestimated. The Ministry of Health decision was discussed at a meeting of the Knesset Labor, Welfare and Health Committee on 1 January 2011, and this committee made the following recommendations:
• the requirement for magnesium addition to desalinated water be added to new water regulations currently being formulated
• implementation of magnesium supplementation should be accompanied by comprehensive long term research to evaluate the effects on public health
• the Ministry of Health should explore the issue of regulation for a minimum magnesium concentration in water from sources other than desalination
• regular reports on implementation should be made to the Labor, Welfare and Health Committee by the Ministry of Health

It is not yet clear to what extent this decision will result in an increased magnesium intake from water for the Israeli population, and whether changes in cardiovascular health (if they occur) might be detectable in research studies. Israel currently uses surface water from the Sea of Galilee (Lake Kinneret) and groundwater from natural springs and wells situated along the Mediterranean coastal aquifer and an inland mountain aquifer, in addition to desalinated water. An extensive network of channels and pipelines throughout the country interconnects these supplies and moves water from the wetter northern region to the arid south of the country. Blending of magnesium-supplemented desalinated water with water from other sources will produce both spatial and temporal variation in magnesium levels at the tap, and individual exposures may be difficult to estimate.
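The blending arithmetic behind this variability can be sketched as a simple volume-weighted mass balance. The sketch below is illustrative only: the source fractions and magnesium concentrations are invented for the example, not actual Israeli supply data.

```python
# Illustrative sketch: tap-water magnesium concentration as a
# volume-weighted blend of supply sources. All figures are hypothetical.

def blended_concentration(sources):
    """Volume-weighted mean concentration (mg/L) of a blended supply.

    sources: list of (volume_fraction, concentration_mg_per_L) tuples;
    the volume fractions must sum to 1.
    """
    total_fraction = sum(f for f, _ in sources)
    if abs(total_fraction - 1.0) > 1e-9:
        raise ValueError("volume fractions must sum to 1")
    return sum(f * c for f, c in sources)

# Example: 50% desalinated water supplemented to 25 mg/L Mg, blended with
# 30% surface water at 20 mg/L and 20% groundwater at 35 mg/L.
mix = [(0.5, 25.0), (0.3, 20.0), (0.2, 35.0)]
print(round(blended_concentration(mix), 2))  # prints 25.5 (mg/L at the tap)
```

Because the blend fractions shift from place to place and season to season, the concentration at any given tap would move between the source concentrations, which is why individual exposure estimation is difficult.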
According to media reports, the Ministry of Health has stated that no information exists on current magnesium concentrations in different towns as levels are measured only at the water source.
There are several postulated biological mechanisms for a beneficial effect of increased magnesium intake on cardiovascular disease incidence or mortality, and these may operate over different time frames which will need to be accounted for in any research studies:
• reduction of the risk of sudden death from heart attack, due to reduction in arrhythmias and/or coronary artery spasms. Such effects would be expected to occur in a relatively short time frame, perhaps in the order of only a few months.
• reduction in systemic high blood pressure. This mechanism would be expected to operate over a medium to long time frame, perhaps in the order of months to years.
• changes in blood lipid levels which reduce atherosclerotic plaque formation. This mechanism would be expected to relate to longer term exposure to magnesium, perhaps in the order of several years or more, since the build-up of plaque in arteries is a gradual process.

In addition, a recent survey indicates that only one third of Israel's residents drink water straight from the tap. A similar number filter or treat tap water before drinking, while the remainder drink bottled water. Rates of tap water drinking vary markedly in different religious and cultural groups, in different geographic areas, and between age groups. Clearly all of these factors will complicate the assessment of health impacts following the introduction of magnesium supplementation for desalinated water.

(1) Calcium and Magnesium in Drinking Water: Public Health Significance. World Health Organisation, 2009. ISBN: 9789241563550. Available from www.who.int

Refer also to Health Stream Issue 41 for a report on the 2003 WHO meeting on Nutrients in Drinking Water.
Mosquito-borne Diseases in Australia

Recent floods and heavy rainfall in the eastern states of Australia have triggered an increase in mosquito-borne viral diseases this year. Several such viral diseases are endemic to Australia, including Ross River virus, Barmah Forest virus and Murray Valley Encephalitis virus. In addition, public health authorities in northern Queensland wage a continuing campaign to detect and contain outbreaks of Dengue virus. This disease is not considered endemic in Australia but is repeatedly introduced when
international travellers bring in the infection, and are bitten by local mosquito species which are able to host the Dengue virus. These mosquitoes can then bite and infect other people, triggering an outbreak. All states also maintain vigilance for imported mosquito-borne diseases such as malaria which can be brought into Australia by international travellers and subsequently transmitted by mosquitoes. Wet humid conditions provide abundant breeding sites for a range of mosquito species across Australia, increasing the risk of vector-borne disease.

Significant increases in the number of people diagnosed with Ross River virus infection have been reported in South Australia and Victoria. This virus causes no illness or only mild symptoms in children and in about 70% of infected adults. However in the remaining 30% of adults, symptoms of fever, rash, joint pains, swelling, and joint stiffness occur for up to 6 weeks. Some people also experience debilitating lethargy and fatigue, and a minority have persistent symptoms. This is the most commonly occurring notifiable vector-borne disease in Australia, with 5111 cases reported nationwide in 2010.

Cases of Barmah Forest virus infection in both states have also risen. This virus produces similar symptoms to Ross River virus but is less well characterised. It is the second most common vector-borne disease in Australia, with 1466 cases reported in 2010.

In addition, monitoring of sentinel chicken flocks has detected increasing occurrence of Murray Valley Encephalitis virus in mosquito populations in northern Victoria and western New South Wales. This virus rarely causes clinical illness in humans, with only 18 cases being reported nationally over the last 10 years. Studies indicate that about 1000 asymptomatic infections occur for every clinical case identified. However when illness occurs, it may be severe and life threatening.
The Victorian Health Department has investigated whether the virus could be responsible for the death of a man who died of encephalitis in early March, but laboratory tests proved inconclusive. Health authorities have issued warnings for members of the public to wear insect repellent and protective clothing to reduce the risk of bites, and to remove breeding sources for mosquitoes near homes, wherever practical.
News Items
WHO Invites Comment on Uranium Document

The World Health Organisation has released a revised version of the background document for Uranium in Drinking-water as part of the preparations for the fourth edition of the WHO Guidelines for Drinking-water Quality. The document outlines new evidence on exposed human populations published since the last assessment, and discusses possible revision of the provisional guideline value for this element. The current provisional guideline value (15 micrograms/litre) was derived from a 91-day study in rats, however several studies have since been published showing no apparent nephrotoxic or carcinogenic effects in human populations with chronic exposure to levels of several hundred micrograms/litre. In view of these new data, the possibility of raising the provisional guideline to 30 micrograms/litre is discussed. Comments on the document are invited until 8 April.
www.who.int/water_sanitation_health/dwq/chemicals/uranium/en/

2011 Water and Health: Where Science Meets Policy Conference

Abstracts are invited for the 2011 Water and Health Conference, to be held October 3-7 at UNC Chapel Hill, North Carolina, USA. The submission deadline is May 1. The Conference will feature themes ranging from Freshwater Availability and Climate Change Adaptation to Human Rights and Ethics. Other themes for 2011 include Perspectives on WaSH for Small Communities and Peri-urban Areas, and Southeastern US Water Challenges. For a complete list of conference themes and abstract submission details, visit: whconference.unc.edu

In association with the Conference, the Water Institute at UNC is offering a prize for the best paper published in the new IWA Journal of Water, Sanitation and Hygiene for Development by an emerging author from a low income country. The prize will fund up to $1500 of the costs of the author's participation in the 2011 Conference.
www.waterinstitute.unc.edu/emerging-author-prize
Web-bonus articles

Summaries of these articles are available only in the PDF version of Health Stream on the WQRA website: www.wqra.com.au

An audit improves the quality of water within the dental unit water lines of general dental practices across the East of England. Chate, R.A.C. (2010) British Dental Journal, Online article number E11, DOI: 10.1038/sj.bdj.2010.885

Fluoride in still bottled water in Australia. Mills, K., Falconer, S. and Cook, C. (2010) Australian Dental Journal, 55(4); 411-416.

Drinking water is possibly the risk factor for Giardia lamblia infection among Libyan children. BenRashed, M.B. (2010) Jamahiriya Medical Journal, 10(3); 206-209.

Solar disinfection of drinking water in the prevention of dysentery in South African children aged under 5 years: The role of participant motivation. Du Preez, M., McGuigan, K.G. and Conroy, R.M. (2010) Environmental Science and Technology, 44(22); 8744-8749.

Reported care giver strategies for improving drinking water for young children. McLennan, J.D. and Farrelly, A. (2010) Archives of Disease in Childhood, 95(11); 898-902.

Association between children's blood lead levels, lead service lines, and water disinfection, Washington, DC, 1998-2006. Jean Brown, M., Raymond, J., Homa, D., et al. (2010) Environmental Research,

Controlling Legionella in hospital drinking water: An evidence-based review of disinfection methods. Lin, Y.E., Stout, J.E. and Yu, V.L. (2011) Infection Control and Hospital Epidemiology, 32(2); 166-173.

Waterborne norovirus outbreak in a municipal drinking-water supply in Sweden. Riera-Montes, M., Brus Solander, K., Allestam, G., et al. (2011) Epidemiology and Infection, 1-8.

Perchlorate, nitrate, and iodide intake through tap water. Blount, B.C., Alwis, K.U., Jain, R.B., et al. (2010) Environmental Science and Technology, 44(24); 9564-9570.

Private drinking water wells as a source of exposure to perfluorooctanoic acid (PFOA) in communities surrounding a fluoropolymer production facility. Hoffman, K., Webster, T.F., Bartell, S.M., et al. (2011) Environmental Health Perspectives, 119(1); 92-97.

Dose-dependent Na and Ca in fluoride-rich drinking water - Another major cause of chronic renal failure in tropical arid regions. Chandrajith, R., Dissanayake, C.B., Ariyarathna, T., et al. (2011) Science of the Total Environment, 409(4); 671-675.

Evaluation of Risk Assessment Tools to Predict Canadian Waterborne Disease Outbreaks. Summerscales, I.M. and McBean, E.A. (2010) Water Quality Research Journal of Canada, 45(1); 1-11.
From the Literature
Arsenic

Arsenic in drinking water and stroke hospitalizations in Michigan. Lisabeth, L.D., Ahn, H.J., Chen, J.J., Sealy-Jefferson, S., Burke, J.F. and Meliker, J.R. (2010) Stroke, 41(11); 2499-2504.

This study was conducted to investigate the association between low-level arsenic exposure in drinking water and hospital admission for ischemic stroke in Michigan, USA. Data were analysed at the county level for the whole state, and at zip code level for Genesee County where arsenic concentrations are elevated and variable. Ischemic stroke admissions for people aged 45 years and older were identified from the Michigan Health and Hospital Association's Michigan Inpatient Database for 1994 through 2006. To serve as the comparison groups, two nonvascular outcomes (duodenal ulcer and hernia) were also identified. The hospital admission data also included age, race, sex, year of hospital discharge and county and zip code of residence. US Census data were used as the source for population counts, median income and percent black population. Drinking water data were obtained from a Michigan Department of Environmental Quality arsenic database for 1983 to 2002. Population-weighted county- and zip code-level average arsenic concentrations were calculated.

There were 294,095 ischemic stroke admissions in Michigan over the study period with complete data. The median age of the admissions was 74 years. The median arsenic level among the 83 counties was 1.83 micrograms/L. The unadjusted relative risk (RR) between arsenic levels and stroke admissions was 1.010 (95% CI, 1.006 to 1.014; P = 0.01). After adjusting for potential confounders, the association between arsenic levels and stroke admissions remained statistically significant (RR, 1.011; 95% CI, 1.002 to 1.019; P = 0.01). At the county level, arsenic was associated with hernia admission in both the unadjusted and adjusted models.
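The population-weighted averaging step described above can be illustrated with a short sketch; the populations and arsenic concentrations below are invented for the example and are not data from the study.

```python
# Illustrative sketch of a population-weighted average arsenic
# concentration, as described for the county-level Michigan analysis.
# The (population, concentration in micrograms/L) pairs are hypothetical.

def population_weighted_mean(records):
    """Population-weighted mean concentration.

    records: list of (population, concentration) tuples, e.g. one tuple
    per water system within a county or zip code.
    """
    total_pop = sum(p for p, _ in records)
    if total_pop <= 0:
        raise ValueError("total population must be positive")
    return sum(p * c for p, c in records) / total_pop

# Three hypothetical water systems within one county:
systems = [(430_000, 1.2), (120_000, 7.8), (55_000, 3.4)]
print(round(population_weighted_mean(systems), 2))  # prints 2.71
```

Weighting by population ensures that a small community on a high-arsenic supply does not dominate the county average, which is the point of the approach used in the study.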
Arsenic was also associated with duodenal ulcer admissions in the unadjusted model but not the adjusted model. There were 14,033 ischemic stroke admissions in Genesee County in the study period
with complete data. The median age of admissions was 72 years. The median arsenic level among 27 zip codes in Genesee County was 7.78 µg/L. In the unadjusted model, the association between arsenic levels and stroke admissions was not significant (P = 0.16), but after adjusting for age, sex, median zip code-level income and percent black residents in the zip code, the association became significant (RR, 1.031; 95% CI, 1.011 to 1.051; P = 0.002). Arsenic was negatively associated with hernia admissions at the zip code level in the unadjusted model, but there was no association in the adjusted model. Arsenic was not associated with duodenal ulcer admissions in the unadjusted or adjusted models.

This study demonstrated a statistical association between low-level arsenic exposure in drinking water and hospital admissions for ischemic stroke after adjusting for age and sex, the county-level factors of median income and percent of black residents, and temporal trends in stroke admissions. However, there were similar associations for nonvascular outcomes, which suggests that the observed association could be the result of unmeasured confounders and/or ecological bias. In the analysis of Genesee County, an association was observed between low-level arsenic exposure and ischemic stroke admission in adjusted analyses, and a significant trend was also found. Overall the findings suggest an association between arsenic and stroke that may only be apparent at more elevated levels of arsenic such as those found in Genesee County. This finding requires further epidemiological study with individual-level data on arsenic exposure and incident stroke as well as biological and sociodemographic stroke risk factors.

Comment

In this ecological study, arsenic exposures from water at the individual level were not determined and there was no information on major risk factors such as smoking.

Chemical Contamination

Drinking water incidents due to chemical contamination in England and Wales, 2006-2008.
Paranthaman, K. and Harrison, H. (2010) Journal of Water and Health, 8(4); 735-740.
The quality of drinking water in the UK is generally good; however, occasional breaches due to microbiological or chemical contamination do occur. In England and Wales, the Chemical Hazards and Poisons Division (CHaPD) of the Health Protection Agency (HPA) provides expert toxicological advice on the risks and consequences to human health of chemical contamination incidents affecting drinking water. By regularly reviewing such reported incidents, trends can be monitored, informed policy decisions can be made and the public can be assured of the safety of drinking water. This study presents the results of a review of all drinking water incidents with potential or actual chemical contamination that were referred to the CHaPD over a 3-year period.

As part of the HPA's Chemical Incident Surveillance program, all chemical-related enquiries and incidents are routinely logged on the secure online National Database. A chemical incident is defined as: an acute event in which there is, or could be, exposure of the public to chemical substances which cause, or have the potential to cause, illness. The National Database was searched for records of water contamination reported from January 2006 to December 2008. An email request was also sent to the water leads for all five regional CHaPD centres asking for information about any incident(s) that they may have dealt with which may not have been reported in the database.

After exclusions there were 82 incidents identified with confirmed chemical contamination of drinking water. Of these incidents, 77 were from England and 5 from Wales. Of the 82 incidents, data on the source of contamination were available in 70 cases. Approximately 40% (28/70) of the incidents were related to contamination of drinking water provided by private suppliers, and nitrate was the most commonly identified chemical contaminant.
Nearly a third (22/70) of incidents were attributed to contamination occurring close to the point of consumption, most being due to hydrocarbon contamination. In the remainder (20/70), public water supplies were identified as the source of contamination, with pesticide contamination the most common cause.
For most of the incidents, little or no information was available on critical exposure variables such as the duration of contamination and actual or estimated numbers of the population affected. Although exact numbers were lacking, the estimated numbers of potentially exposed persons were highest for incidents involving contamination of public water supplies and lowest for those involving private suppliers or local contamination of public supplies, as would be expected given the differing sizes of the distribution networks. Incidents identified in the review included elevated aluminium in a public water supply due to malfunction of dosing equipment following a power disruption, high levels of manganese and aluminium in separate private supplies, lead contamination from household plumbing, tetrachloroethylene contamination from a defective pipework connection at a dry cleaning facility, and copper contamination from stagnation of water in pipes. The copper incident reportedly caused gastrointestinal symptoms in several children attending a camp. For most of the incidents investigated, the levels of chemical contamination were considered unlikely to cause significant immediate or long-term ill health based on short-term exposures.

Overall, the number of chemical contamination incidents was relatively small. Private water supplies were frequently implicated, which highlights the need for tighter regulation to assure the safety and quality of the water these supplies provide to consumers. Although no significant adverse health effects were found or expected for most incidents, there were a few serious incidents that raised concerns and highlighted the dangers of chemical contamination of drinking water. There needs to be improved recording of exposure data, including the estimated number of consumers affected, duration of contamination and chemical testing results. This type of data is essential to assess whether short- or long-term health effects are likely for consumers when accidental or deliberate contamination by specific chemicals is reported to public health authorities.
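The source-of-contamination breakdown quoted above (28, 22 and 20 of the 70 incidents with a known source) can be checked with a simple tally; the category labels are paraphrases of the review's groupings, not the authors' exact terms.

```python
from collections import Counter

# One entry per incident with a known source, per the counts in the review.
sources = (["private supply"] * 28
           + ["near point of consumption"] * 22
           + ["public supply"] * 20)

counts = Counter(sources)
total = sum(counts.values())
shares = {src: round(100 * n / total) for src, n in counts.items()}
print(shares)  # {'private supply': 40, 'near point of consumption': 31, 'public supply': 29}
```

This reproduces the "approximately 40%", "nearly a third" and remainder figures in the text.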
Disinfection Byproducts

Urinary trichloroacetic acid levels and semen quality: A hospital-based cross-sectional study in Wuhan, China. Xie, S.H., Li, Y.F., Tan, Y.F., Zheng, D., Liu, A.L., Xie, H. and Lu, W.Q. (2011) Environmental Research, doi:10.1016/j.envres.2010.12.010

Toxicological studies have consistently reported associations between exposure to disinfection by-products (DBPs) and impaired male reproductive health in animals. Only two epidemiological studies in humans have addressed exposure to DBPs and male reproductive health, and their results were inconclusive. Accurate characterisation of exposure to DBPs is still a significant challenge for epidemiological studies, and biomarkers of exposure are an alternative means of improving exposure assessment. Urinary trichloroacetic acid (TCAA) has been shown in several studies to correlate consistently with ingestion exposure to TCAA from drinking water. In addition, the excretion half-life of urinary TCAA is considerably longer than that of other biomarkers of exposure to DBPs. The collection of urine samples is noninvasive and therefore preferable for conducting large-scale epidemiological studies.

A hospital-based cross-sectional study was conducted in Wuhan, China, to evaluate the effect of exposure to DBPs on semen quality in humans. Exposure to DBPs was estimated based on urinary TCAA levels. The participants were male partners in sub-fertile couples seeking infertility advice or assisted reproduction services from the Tongji Hospital in Wuhan between May 2008 and July 2008. The final analysis included 418 subjects who completed a face-to-face interview which included questions on demographics, medical history, reproductive history, lifestyle factors, occupational exposures and water-use behaviours. Water-use questions included use of cold tap water, the number and size of glasses of tap water consumed per day, and frequency and duration of bathing and showering.
Semen specimens were collected after two to seven days of abstinence and analysed for sperm concentration, motility, morphology and
components (percentage of sperm cells with abnormal head, midsection and tail, and percentage acrosomal integrity). On the same day as the semen sample collection a spot urine sample was collected. The concentrations of creatinine-adjusted TCAA in urine samples were analysed by gas chromatography with electron capture detection. Linear regression was performed for semen parameters and urinary creatinine-adjusted TCAA concentrations.

There were 33 men (8%) below the reference sperm concentration (less than 20 million/mL), 151 men (36%) below the reference sperm motility (less than 50% motile), 6 men (1%) below the reference sperm morphology (less than 9% normal morphology), and 265 men (63%) with all parameters at or above the reference values. Urinary creatinine-adjusted TCAA concentrations ranged from 0.3 to 147.5 µg/g creatinine, with a mean of 9.2 µg/g creatinine. Creatinine-adjusted urinary TCAA concentrations were not significantly associated with sperm concentration, sperm count or morphology parameters, but a significant association was found between decreased percent motility and increased creatinine-adjusted urinary TCAA concentration in the univariate model. However, after adjustment for age, abstinence time and smoking status, this association became non-significant. With the lowest quartile of creatinine-adjusted urinary TCAA concentrations as the reference group, the adjusted mean differences in percent motility for the second, third and top quartiles were -4.6% (95% CI: -9.2%, 0.0%), -4.3% (95% CI: -8.9%, 0.3%) and -3.7% (95% CI: -8.4%, 0.9%), respectively.

This study did not find any significant association of decreased sperm concentration, sperm count or sperm morphology with elevated urinary TCAA levels. There was some suggestive but inconclusive evidence of decreases in sperm motility with increased urinary TCAA levels.
The effects of exposure to DBPs on human male reproductive health still require further investigation in large, well-designed studies.
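The quartile comparison described above — grouping subjects by biomarker concentration and comparing against the lowest quartile — can be sketched as follows. The TCAA values here are invented (only the range endpoints echo the study), and the cut points depend on the interpolation method used.

```python
import statistics

def quartile_group(value, cutpoints):
    """Return the quartile index (1-4) for a value, given the three cut points."""
    for i, cut in enumerate(cutpoints, start=1):
        if value <= cut:
            return i
    return 4

# Hypothetical creatinine-adjusted TCAA concentrations (µg/g creatinine):
tcaa = [0.3, 1.1, 2.5, 4.0, 6.8, 9.2, 14.7, 33.0, 147.5]

cuts = statistics.quantiles(tcaa, n=4)  # three cut points [Q1, Q2, Q3]
groups = [quartile_group(v, cuts) for v in tcaa]
```

In the study, the regression then estimated the mean difference in percent motility for groups 2-4 relative to group 1.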
Endemic Gastroenteritis

Water use and acute diarrhoeal illness in children in a United States metropolitan area. Gorelick, M.H., McLellan, S.L., Wagner, D. and Klein, J. (2010) Epidemiology and Infection, doi:10.1017/S0950268810000828

This study examined the association between water exposures and acute diarrhoeal illness (ADI) in children under non-outbreak conditions in a major US metropolitan area. It was hypothesised that risks would vary depending on the child's water source, suggesting endemic waterborne disease, and that use of bottled water or water filters would reduce the risk.

This nested case-control study was conducted in an urban/suburban paediatric emergency department (ED) which serves a metropolitan area with a population of 1.5 million. The cohort consisted of all patients aged less than 18 years seen in the ED during the 24-month study period (1 January 2007 to 31 December 2008) during the hours of 8am to midnight on weekdays and 12 noon to midnight on weekends. Eligible cases were children with ADI and eligible controls were children with a non-gastrointestinal complaint. Cases and controls were matched on age and date of visit. A parent or caregiver of each subject completed a previously validated water-use survey with questions about exposure to water for drinking, hygiene and recreation. Drinking water questions included the child's usual use of bottled and tap water (including use of water to make infant formula) and use of filters. Water source was categorised as surface or groundwater based on the zip code of the child's primary residence. There were also questions about the presence of ill contacts at home with diarrhoea and attendance at day care. Family income was estimated on the basis of median income for zip code of residence. Stratum-specific adjusted odds ratios (aOR) were calculated for the three main water effects: water source (surface vs. ground), drinking-water type (tap vs. bottled), and use of water filters.
There were 1209 cases and 1263 controls in the study. It was estimated that at least 46% of potentially eligible cases attending the ED were recruited. The median age was 1.8 years for cases
and 1.9 years for controls. Complete data were available for 90.4% of subjects. The majority of residences (nearly 80%) were served by municipal surface water and the remaining 20.5% of homes were served by groundwater. Overall, 45% of subjects drank mostly or only bottled water, and 42.3% drank mostly or only tap water.

Groundwater use was associated with increased odds of ADI compared to surface water (aOR 1.38, 95% CI 1.01-1.87). For children living in homes served by surface water, use of primarily bottled water was associated with increased odds of illness compared to use of primarily tap water (aOR 1.27, 95% CI 1.03-1.58). For those in households served by well water, bottled water was not significantly associated with diarrhoeal illness (aOR 1.17, 95% CI 0.82-1.68), although the aOR was elevated. Use of some type of filter was reported by 17.7% of subjects, but filter use did not affect the odds of illness. Of the risk factors considered, only the presence of ill contacts at home with diarrhoea was independently associated with increased odds of illness. Attendance at school or day care, and socioeconomic status, were not associated with diarrhoeal illness.

This study showed that well water use and bottled water use are associated with increased odds of ADI in children in a major US metropolitan area served by municipal treated water systems. The results for groundwater are consistent with other research that detected human viral contamination in a substantial proportion (8-42%) of wells. The findings for bottled water use were not anticipated. It is possible that bottled water carries relatively greater contamination than municipal tap water; contamination may also occur after the product is opened by the consumer. The results suggest that endemic waterborne transmission of gastroenteritis is occurring in this setting and further work is required to confirm these findings and explore the possible causes.
Comment

The authors note they were unable to distinguish whether homes were served by private wells (probably untreated) or municipal wells (chlorinated); however, the majority probably received municipal groundwater. No data were available on the pathogens causing ADI in cases.
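Odds ratios of the kind reported in this case-control study come from a 2×2 table of exposure by case status. The sketch below uses hypothetical counts (not the study's data) and the standard Woolf log-method confidence interval, which is a simplification of the stratified, matched analysis the authors performed.

```python
import math

def odds_ratio(exposed_cases, exposed_controls, unexposed_cases, unexposed_controls):
    """Odds ratio with an approximate 95% CI (Woolf log method)."""
    or_ = (exposed_cases * unexposed_controls) / (exposed_controls * unexposed_cases)
    se = math.sqrt(1 / exposed_cases + 1 / exposed_controls
                   + 1 / unexposed_cases + 1 / unexposed_controls)
    lo, hi = or_ * math.exp(-1.96 * se), or_ * math.exp(1.96 * se)
    return or_, (lo, hi)

# Hypothetical counts: cases/controls by groundwater vs surface-water residence.
or_, ci = odds_ratio(exposed_cases=120, exposed_controls=95,
                     unexposed_cases=1089, unexposed_controls=1168)
print(round(or_, 2))  # 1.35
```

An OR above 1 with a CI excluding 1 would indicate an exposure associated with illness, as for groundwater and ill home contacts in this study.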
Hardness

Effect of water hardness on cardiovascular mortality: An ecological time series approach. Lake, I.R., Swift, L., Catling, L.A., Abubakar, I., Sabel, C.E. and Hunter, P.R. (2010) Journal of Public Health, 32(4); 479-487.

Numerous studies have suggested an inverse relationship between drinking water hardness and cardiovascular disease. However, the weight of evidence is insufficient for the World Health Organization (WHO) to implement a health-based guideline for water hardness, calcium or magnesium. If there are associations between cardiovascular disease and water hardness, then the public health benefits of any subsequent interventions could be large, as cardiovascular disease is the leading cause of death in many developed countries. The WHO has recommended ecological studies of cardiovascular mortality rates in populations exposed to step changes in drinking water hardness through natural experiments, such as a water supply changing from a hard groundwater to a softer river source, as the preferred method of providing more conclusive evidence on whether a link exists.

This study aimed to investigate associations between cardiovascular mortality and levels of water hardness, calcium or magnesium using time series data from populations experiencing a change in drinking water, and to provide methodological recommendations for those attempting similar studies. The study included 14 areas in England and Wales where changes in drinking water hardness had occurred between 1981 and 2005. Within each of the areas, monthly mortality counts for the same period, subdivided by gender and age, were obtained from the Office for National Statistics. The main outcome of interest was cardiovascular mortality.
For each of the areas investigated, the temperature for each month between 1981 and 2005 was obtained from the nearest weather station, as ambient temperature is associated with cardiovascular mortality (higher mortality occurring at increasing extremes of low and high temperatures). Flu is also known to be associated with cardiovascular mortality and a major flu epidemic occurred nationally in December 1989
so an indicator variable for flu was created. A water hardness indicator variable was created to define months before and after the change in water hardness. In England and Wales, cardiovascular mortality showed a non-linear downward trend between 1981 and 2005, so two approaches were used to control for this. The first approach involved the change area acting as its own control. This approach assumes that any trend in mortality is smooth, whereas any effect due to a water hardness change is almost immediate. Monthly cardiovascular mortality counts were modelled using linear or Poisson regression.

The second approach used control areas. Data from a nearby area were used to control for the trend in the change area. As well as controlling for the long-term trend, this had the benefit of potentially controlling for other factors, such as smoking rates and diet, which are assumed to show similar trends in the control areas. The log of the relative risk between case and control areas was regressed onto the water hardness indicator variable. The resulting coefficient of water hardness provided an estimate of the risk ratio after the water hardness changed compared with before, adjusted for the factors assumed common to both series. This method can only be applied to larger areas, as the risk ratio cannot be calculated when there are zero mortality counts in the control series; for smaller areas, the monthly mortality data were aggregated to annual data and the analysis focused on the two subgroups with the highest mortality rates (65-74 years of age and over 75 years).

The relative risks of mortality after, compared with before, the water hardness change were calculated for deaths amongst males and females in the 65-74 and 75-plus age groups. The results were highly varied, with generally wide 95% confidence intervals. About 95% of the confidence intervals included 1 and provided no evidence of an effect due to the water hardness change.
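The control-area approach described above can be sketched numerically: with a single 0/1 hardness indicator as the only regressor, regressing the log case/control mortality ratio on the indicator reduces to a difference of group means, and exponentiating the coefficient gives the after-vs-before risk ratio. The monthly counts below are invented for illustration.

```python
import math

def risk_ratio_after_vs_before(case_counts, control_counts, after_flags):
    """exp(mean log-ratio after change minus mean log-ratio before)."""
    log_ratios = [math.log(c / k) for c, k in zip(case_counts, control_counts)]
    after  = [lr for lr, a in zip(log_ratios, after_flags) if a]
    before = [lr for lr, a in zip(log_ratios, after_flags) if not a]
    coeff = sum(after) / len(after) - sum(before) / len(before)
    return math.exp(coeff)

# Six hypothetical months of cardiovascular deaths; hardness change after month 3.
cases    = [40, 38, 42, 36, 35, 37]  # change area
controls = [41, 39, 40, 40, 39, 41]  # nearby control area
flags    = [0, 0, 0, 1, 1, 1]

rr = risk_ratio_after_vs_before(cases, controls, flags)
```

A risk ratio near 1 (with a confidence interval including 1, not computed here) would correspond to the null findings the study reports.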
Relative risks were combined for areas experiencing an increase/decrease in water hardness, calcium and magnesium but approximately 95% of the confidence intervals also included 1. A variety of time lags were also examined between mortality and the independent variables but again
non-significant results were found. The monthly time series for the ratio of mortality between case and control areas were calculated for the two areas with the largest populations. There was no evidence of any association between changes in drinking water hardness, calcium or magnesium and cardiovascular mortality. When the mortality counts were aggregated to annual data, no evidence of a relationship with cardiovascular mortality was found, and the confidence intervals were wide, reflecting the low number of annual observations within each time series.

This study found no evidence of an association between changes in water hardness, calcium or magnesium and cardiovascular mortality. These results were consistent across different age groups and genders and between the two statistical approaches used. Three areas experienced large changes in calcium levels (greater than 50 mg/L), three had changes between 25 and 50 mg/L and the remainder had smaller changes. However, only two areas had major changes in magnesium levels (more than 10 mg/L) and these had relatively small populations, so definitive conclusions cannot be drawn about the effect of changes in this mineral. Future studies need to be conducted at the individual level, involving only those in the population who are magnesium/calcium deficient and therefore presumably more susceptible to water hardness changes.

Household Interventions

Health gains from solar water disinfection (SODIS): Evaluation of a water quality intervention in Yaounde, Cameroon. Graf, J., Togouet, S.Z., Kemka, N., Niyitegeka, D., Meierhofer, R. and Pieboji, J.G. (2010) Journal of Water and Health, 8(4); 779-796.

Solar water disinfection (SODIS) is a cost-effective and simple household water treatment method that requires only direct sunlight and polyethylene terephthalate (PET) bottles, which are generally available in low-income countries. SODIS involves filling contaminated water into empty, transparent and unlabelled PET bottles (volume up to two litres); the bottles are tightly closed and exposed horizontally to direct sunlight for at least six hours. If the skies are very cloudy, exposure time needs to be extended to two days. To prevent recontamination, it is recommended that water be consumed directly from the bottle or poured into a clean cup. This paper evaluates the implementation of SODIS in four slum areas of Yaounde, Cameroon. The main aims of the research were to assess the health impact of the intervention and to identify some of the conditions that were favourable for the adoption of the method.

Two cross-sectional surveys were conducted: a pre-intervention survey prior to the SODIS promotional activities, and an evaluation survey after the intervention which included an intervention group and a control group. The pre-intervention survey was carried out between 16 and 21 July 2007 in randomly selected households having at least one child under five years of age in the four selected areas. The survey covered water consumption patterns and diarrhoea incidence among children under five, and a total of 2,193 interviews were conducted. The intervention involved training 60 people to promote the SODIS method among the target population. Public workshops were organised and the mass media were informed about SODIS. An intensive door-to-door campaign was undertaken, and 2,911 households (mainly with children under five) were visited and trained in SODIS use during the intervention phase from July 2007 to April 2008. The evaluation survey interviews were conducted from 21 April to 7 May 2008 using a questionnaire. There were 783 households randomly selected with at least one child under five years (369 intervention households and 414 control households). The evaluation survey questionnaire focused on the sociodemographic situation, health status (diarrhoea, stomach pain) and issues pertaining to SODIS. Questions were also included on consumption of liquids (types and amounts), hygiene (on the basis of an index containing four indicators) and cleanliness of the surrounding area (using a 4-point scale).
To assess household compliance with the intervention, a score of SODIS use was developed
with a minimum score value of 0 and a highest possible value of 5. The score was used to assess SODIS application by the households, with a rating of 0-3 points = non-user, 4 points = irregular user, 5 points = regular user. To assess the relative difference in socio-economic status of the interviewed households, an index based on five indicators was used: type of building, equipment in the household, kind of energy used for cooking, number of people sharing one bed/mattress and the interviewer's appraisal of the economic situation. The index ranged from 1 (low) to 4 (relatively high).

The pre-intervention survey of the households showed a 34.3% prevalence of diarrhoea among children under five. There were 112 households excluded from the control group in the evaluation survey analysis as they had already heard of the SODIS method, so the final control group consisted of 302 households. After the intervention, 31.8% of households in the control group reported diarrhoea. There was no statistical difference between the prevalence rates observed in the pre-intervention survey and the evaluation survey control group. However, diarrhoea prevalence in the intervention group was lower (22.8%). Compared to the pre-intervention survey, a significant difference in diarrhoea incidence was found (χ2 = 19.18, p < 0.001, odds ratio = 1.77). A significant difference was also found when the intervention and control groups were compared regarding diarrhoea occurrence (χ2 = 6.89, p = 0.009, odds ratio = 1.58).

The intervention group was subdivided according to SODIS compliance [non-users (n=129), irregular users (n=71) and regular users (n=169)] and compared regarding diarrhoea prevalence. The households of the control group and the non-users of the intervention group showed exactly the same diarrhoea rate (31.8%). The irregular users reported diarrhoea in 16.9% of cases, and the regular SODIS users had a diarrhoea rate of 18.3%.
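The score bands used in the study, and a check that the reported odds ratio of 1.58 follows from the quoted prevalences (31.8% in the control group vs 22.8% in the intervention group), can be sketched as follows. The function and variable names are ours, not the authors'.

```python
def user_category(score):
    """Map a 0-5 SODIS-use score to the study's user bands."""
    if score <= 3:
        return "non-user"
    if score == 4:
        return "irregular user"
    return "regular user"

def odds_ratio_from_prevalence(p1, p2):
    """Odds ratio of prevalence p1 relative to prevalence p2."""
    return (p1 / (1 - p1)) / (p2 / (1 - p2))

# Control-group vs intervention-group diarrhoea prevalence, as reported:
or_ = odds_ratio_from_prevalence(0.318, 0.228)
print(round(or_, 2))  # 1.58
```

Converting prevalences to odds before taking the ratio is what distinguishes an odds ratio from a simple ratio of percentages.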
Diarrhoea prevalence among regular users was significantly lower than among non-users of the intervention group and the control group and this was also true of irregular users. The evaluation survey data was included in logistic regression models to examine the relationship
between SODIS application and diarrhoea while including the influence of other factors. When the risk factors hygiene and cleanliness of the surrounding area were included in the model, SODIS application still had predictive power for reducing diarrhoea. When the variable percentage of unsafe drinks was included, SODIS application lost its significance. This is a logical consequence, as SODIS is a method of providing safe water and its effect is therefore plausibly mediated through the percentage of unsafe drinks. Socio-economic status was also integrated into the model but did not reach significance.

On the basis of the results found here it can be concluded that SODIS introduction had positive health benefits among the target population in Cameroon. The results agree with most of the previous SODIS health impact studies conducted in other developing countries. The further promotion and dissemination of SODIS as an effective water treatment method among poor populations in low-income countries is recommended. The social aspects of SODIS also need further research, such as strategies to inform beneficiaries about SODIS and why some households adopt the practice of water treatment and others do not.

Comment

The analysis does not present any data on whether compliance with SODIS declined over time, although some households would have received the intervention 12 months before evaluation while others would have been introduced to SODIS shortly before the evaluation.

Metals

Exposure assessment of metal intakes from drinking water relative to those from total diet in Japan. Ohno, K., Ishikawa, K., Kurosawa, Y., Matsui, Y., Matsushita, T. and Magara, Y. (2010) Water Science and Technology, 62(11); 2694-2701.

This study investigated the daily metal intake (DMI) of 17 metals (toxic, essential yet toxic, and essential metals) from total diet and drinking water in six cities in Japan. The contribution to tolerable daily intake
(TDI) of toxic elements, as proposed by the World Health Organization (WHO) or the Joint FAO/WHO Expert Committee on Food Additives (JECFA), was estimated. The contribution to the recommended dietary allowances (RDAs) proposed by the Japanese Ministry of Health, Labour and Welfare (JMHLW) was also assessed.

At each sampling location in the six cities across Japan, about 150 kinds of foods were purchased from grocery stores. Food items that are usually cooked before consumption were cooked by the usual methods, such as boiling or baking. All food items were categorised into 13 groups in accordance with the classification of the National Nutrition Survey, and food samples of each group were homogenised to make a composite sample. Drinking water samples were collected from the tap where the food preparation was performed. In all six cities, drinking water was processed in municipal drinking water treatment plants that use surface source water. The concentrations of 14 metals (boron, aluminium, chromium, manganese, nickel, copper, zinc, arsenic, selenium, molybdenum, cadmium, antimony, lead and uranium) in drinking water and in the digested solutions of the food composite samples were determined using an inductively coupled plasma-mass spectrometer.

The National Nutrition Survey provided daily intake data for each of the food sample groups. The daily drinking water intake was assumed to be 2 L/day. To calculate DMI, the metal concentration in the food composite sample or drinking water was multiplied by the daily intake of each group. Values of TDI and RDA were calculated for an average Japanese adult. The values of TDI (mg/kg/day) were derived either from the TDI reported in the WHO Guidelines for Drinking-water Quality or from that proposed by JECFA. To convert these TDI values to units of weight per capita, the average weight per capita was assumed to be 50 kg.
The average RDA values provided by JMHLW of both males and females from 18 to 69 years old were used as the RDA for an average Japanese person.
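The intake arithmetic described above can be sketched as follows. Only the 2 L/day water consumption and 50 kg body weight assumptions come from the text; the concentrations, intake weights and TDI value are invented for illustration.

```python
def daily_metal_intake(food_conc, food_intake, water_conc, water_litres=2.0):
    """DMI (mg/day) = sum(food-group concentration x intake) + water conc x volume."""
    food = sum(c * w for c, w in zip(food_conc, food_intake))
    return food + water_conc * water_litres

def percent_of_tdi(dmi_mg, tdi_mg_per_kg, body_weight_kg=50.0):
    """Express a daily intake as a percentage of the per-capita TDI."""
    return 100 * dmi_mg / (tdi_mg_per_kg * body_weight_kg)

# Hypothetical values for a single metal across three food groups:
conc_mg_per_kg = [0.02, 0.005, 0.01]  # concentration in each composite sample
intake_kg      = [0.3, 0.5, 0.2]      # daily intake of each food group
water_mg_per_l = 0.001

dmi = daily_metal_intake(conc_mg_per_kg, intake_kg, water_mg_per_l)
share = percent_of_tdi(dmi, tdi_mg_per_kg=0.001)
```

In this hypothetical case the water term (0.002 mg/day of the 0.0125 mg/day total) contributes about 16% of the intake, illustrating how the study could compare water's share against the WHO's default 10-20% allocation.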
The mean DMIs of 10 toxic metals (excluding selenium, molybdenum and arsenic) were less than 50% of TDI. The six-city average of daily arsenic intake was 280% of the TDI, and the highest intake was 359% of the TDI. Of the 13 food groups, two groups (“Other vegetables and seaweeds” and “Fish and shellfish”) accounted for high arsenic ratios. The mean molybdenum intake was 99% of the TDI and the maximum ratio was 134%. The excess intake of molybdenum does not pose an immediate health risk to humans because the TDI represents a tolerable intake over a lifetime, and short-term exposure to levels exceeding the TDI is not a concern. There are uncertainties, however, about long-term exposure. The mean dietary intakes of magnesium, calcium and iron were found to be less than the RDA or AI (adequate intake), although the differences were not very large. Iron intake was lower than the RDA in all six cities, and manganese and zinc intakes were lower in one city. Daily intakes of chromium, selenium and molybdenum were far greater than the RDA values. In terms of toxicity, the DMI values of selenium and molybdenum were near or slightly higher than the TDI. The JMHLW does not provide a tolerable upper intake level for chromium and the WHO does not describe the quantitative risk of chromium intake. For the 13 toxic metals analysed, the contribution of drinking water to TDI was 2% or less in all six cities. In the WHO Drinking-water Guidelines, water is assumed by default to contribute 10% to 20% of daily exposure to chemical substances. However, based on the finding of this study that the mean daily intake from food and water was less than 50% of TDI for 10 of the 13 toxic metals, a case can be made for increasing the allocation ratio of intake from drinking water in establishing drinking water quality standards. The drinking water contribution to RDA or AI was less than 10% for all essential metals in all six cities, which indicates that drinking water does not contribute very much to essential metal intake.
On average, drinking water was found to contribute only about 5% of the AI of calcium. Along with the mean intakes, the variation of individual intakes for essential metals in the areas studied should be further investigated and considered.
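The intake comparisons above reduce to a simple ratio calculation. A minimal sketch (the function name and example figures are illustrative only; the 280% value echoes the study's six-city arsenic average):

```python
def percent_of_tdi(daily_intake, tdi):
    """Express a daily metal intake as a percentage of the tolerable
    daily intake (TDI). Units cancel, so any consistent unit works."""
    return 100.0 * daily_intake / tdi

# Illustrative: an intake of 2.8 units against a TDI of 1.0 unit
# reproduces the 280%-of-TDI six-city average reported for arsenic.
print(percent_of_tdi(2.8, 1.0))  # 280.0
```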
HEALTH STREAM
MARCH 2011
PAGE 15
WATER QUALITY RESEARCH AUSTRALIA
Outbreaks
Giardiasis outbreak at a camp after installation of a slow-sand filtration water-treatment system. Karon, A.E., Hanni, K.D., Mohle-Boetani, J.C., Beretti, R.A., Hill, V.R., Arrowood, M., Johnston, S.P., Xiao, L. and Vugia, D.J. (2010) Epidemiology and Infection, 1-5. doi:10.1017/S0950268810001573 In July and August 2007, an outbreak of giardiasis occurred among attendees of a boys’ scouting camp in central California. Prior to 2007, the campsite used water from a natural spring for drinking, toilets and showers, however in April 2007, a boil-water order was issued after routine testing by the local health department identified total coliforms (16 c.f.u./100 ml) in the spring water. During the summer 2007 season, imported tanks of potable water were used for cooking and attendees drank bottled water. In May 2007, installation of a slow-sand filtration system and chlorinator for spring water began. The filtration system was put into operation on 6 July and the local health department began weekly testing of unfiltered and sand-filtered spring water for coliforms and turbidity. At that time the boil-water order for drinking remained in effect, but the system’s effluent was used in the bathrooms for washing hands and showering. Multiple groups attended the camp for periods of 4 to 7 days. It appears that gastrointestinal illness was experienced by several people following the 5-8 July session, and this triggered an outbreak investigation.
Children and adults who had attended three consecutive camp sessions on 24-28 June, 5-8 July or 9-12 July were contacted and asked to contact their local health department if they had experienced symptoms and to submit stool specimens for analysis. In addition, a telephone survey was used to collect demographic data, clinical data and data on exposures to recreational and drinking water and foods served at camp. Camp staff were asked to fill in a written version of the survey. A site visit was conducted on 14 August to inspect the spring and slow-sand filtration system and interview the installer. Water and sand samples were collected, and records of total coliform, faecal coliform and turbidity testing of unfiltered and sand-filtered spring water were reviewed. Kitchen staff were interviewed about food and beverage handling and preparation.
Of the 316 scouts, parents and staff at the camp during the three camp sessions, 255 (81%) responded. Of the 30 stool samples tested, 26 were positive for Giardia. An additional 24 probable cases were identified on the basis of symptoms. The attack rate for the 5-8 July session was 28%, versus 0% for the preceding session and 11% for the subsequent session. Multivariate analysis of survey data from 34 cases and 56 controls showed showering to be the only exposure statistically associated with illness (adjusted odds ratio 3.1, 95% CI 1.1-9.3). The environmental investigation indicated that the unsterilized sand purchased for use in the filtration system was the most likely source of contamination, and inadvertent water ingestion during showering led to infection. Although the water was chlorinated at 1.0 mg/L, this may have been insufficient to inactivate Giardia cysts. The study highlights the importance of using sterile sand in slow-sand water filtration systems, and of verifying filtration adequacy through pre-installation pilot testing and post-installation performance testing. Water should not be consumed or used for bathing, showering or any other purpose which may inadvertently lead to ingestion until testing has consistently confirmed the safety of slow-sand-filtered water.
Rainwater
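The case-control comparison described above was summarised as an adjusted odds ratio with a 95% confidence interval. As a rough illustration, an unadjusted odds ratio and Woolf confidence interval can be computed from a 2x2 table (the counts below are hypothetical; the paper's 3.1 came from a multivariate model and cannot be reproduced this way):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio with a Woolf 95% CI from a 2x2 table:
    a = exposed cases, b = exposed controls,
    c = unexposed cases, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_) - z * se)
    upper = math.exp(math.log(or_) + z * se)
    return or_, lower, upper

# Hypothetical counts for showering exposure among 34 cases / 56 controls.
or_, lower, upper = odds_ratio_ci(25, 30, 9, 26)
```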
Health risk from the use of roof-harvested rainwater in Southeast Queensland, Australia, as potable or nonpotable water, determined using quantitative microbial risk assessment. Ahmed, W., Vieritz, A., Goonetilleke, A. and Gardner, T. (2010) Applied and Environmental Microbiology, 76(22); 7382-7391. This study used binary PCR (presence/absence) and quantitative PCR (qPCR)-based assays to first detect and then quantify zoonotic pathogens in samples from roof-harvested rainwater in Southeast Queensland (SEQ) residential houses. Quantitative microbial risk assessment (QMRA) analysis was
applied in order to estimate the risk of infection from exposure to specific pathogens found in roof-harvested rainwater. The pathogens Campylobacter jejuni, Legionella pneumophila, Salmonella spp., Giardia lamblia and Cryptosporidium parvum were selected as these pathogens could be present in the faeces of birds, mammals and reptiles that have access to roofs. Following rain events, faecal matter could potentially be transported to tanks via roof runoff. A total of 214 water samples were collected from 82 residential houses in the Brisbane, Gold Coast and Sunshine Coast regions. Water samples were collected from the outlet taps located close to the bases of the tanks. In the first phase of the study, 100 samples were collected from 82 tanks soon after rainfall events and screened for the presence/absence of the selected pathogens using binary PCR assays. In phase two, water samples were collected from a subset of tanks (n=19) which were PCR positive for the selected pathogens in phase one. Samples were collected from the 19 tanks every 2 weeks over a 3 month period (April to June 2009, 6 sampling occasions) commencing with a rainfall event, and were tested using binary PCR. For all positively identified samples in phases one and two, quantitative PCR methods were then used to quantify these pathogens. Of the 214 samples tested during the entire study, the Salmonella invA, G. lamblia beta-giardin and L. pneumophila mip genes were detected in 23 (10.7%), 21 (9.8%) and 12 (5.6%) of the rainwater samples, respectively. The C. jejuni mapA gene was detected in one sample but was non-quantifiable. None of the samples were positive for C. parvum oocyst wall protein (COWP) genes. The estimated numbers of Salmonella, G. lamblia and L. pneumophila organisms ranged from 6.5 x 10^1 to 3.8 x 10^2 cells, 0.6 x 10^0 to 3.6 x 10^0 cysts, and 6.0 x 10^1 to 1.7 x 10^2 cells per 1,000 ml of water, respectively.
During phase two of the sampling, pathogens were found to be present in individual tanks between 0 and 32% of the time during the 3 month period. Salmonella spp. were found in 4.4% of samples, G. lamblia in 5.3%, and L. pneumophila in 3.5%. Overall, pathogens were present approximately 5% of the time.
There were six risk scenarios considered for exposure to Salmonella spp., G. lamblia and L. pneumophila. For salmonellosis and giardiasis risk, the scenarios were (i) liquid ingestion due to drinking of rainwater (1 litre/day) on a daily basis, (ii) accidental liquid ingestion (1 ml/event) due to garden hosing twice a week, (iii) aerosol ingestion due to showering (1.9 ml/event) on a daily basis, and (iv) aerosol ingestion (1.9 microlitre/event) due to hosing twice a week. For legionellosis risk the scenarios were (i) aerosol inhalation due to showering (0.84 microlitre/event) on a daily basis and (ii) aerosol inhalation (0.5 microlitre/event) due to hosing twice a week. For QMRA, all organisms were assumed to be viable and infectious. The calculated infection risk per 10,000 exposed persons per event for aerosol inhalation of L. pneumophila was low (up to 8.6 x 10^-2) compared to that calculated for Salmonella spp. and G. lamblia for the routes of liquid ingestion via drinking (up to 6.8 x 10^2), liquid ingestion via hosing (up to 7.1 x 10^-1) and aerosol ingestion via showering (up to 1.3 x 10^0). Very low risks of infection were calculated for aerosol ingestion via hosing for Salmonella spp. and G. lamblia (up to 1.3 x 10^-3) due to the extremely low volumes ingested. If the percentage of positive tanks for each pathogen is multiplied by the proportion of the urban population that have a tank and use the water for drinking and/or hosing, the fraction of the urban SEQ population potentially exposed to each pathogen ranged from 0.46% to 4.76% across the different scenarios. The risk of infection per year from the selected pathogens per 10,000 persons in urban SEQ was then calculated. Both Salmonella spp. and G. lamblia showed infection risks of similar orders of magnitude for each scenario, with values up to 1.2 x 10^2 for liquid ingestion via drinking, 2.4 x 10^-1 for aerosol ingestion via showering, 1.8 x 10^-1 for liquid ingestion via hosing and 3.4 x 10^-4 for aerosol ingestion via hosing. L. pneumophila showed much lower infection risks for aerosol inhalation, with a maximum value of 7.1 x 10^-3. If a threshold value of one extra infection per 10,000 persons per year is used, then it is apparent that of all the scenarios considered, only those involving liquid ingestion via drinking present an unacceptable level
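The per-event and annual risks quoted above are typical outputs of a QMRA dose-response step. A hedged sketch of how such numbers are commonly produced (the exponential model and the Giardia parameter r = 0.0199 are standard QMRA assumptions, not values stated in this summary):

```python
import math

def per_event_infection_risk(organisms_per_litre, litres_ingested, r):
    """Exponential dose-response model: P = 1 - exp(-r * dose),
    with dose = concentration (organisms/L) * volume ingested (L)."""
    return 1.0 - math.exp(-r * organisms_per_litre * litres_ingested)

def annual_infection_risk(p_event, events_per_year):
    """Probability of at least one infection over independent events."""
    return 1.0 - (1.0 - p_event) ** events_per_year

# Illustrative: 3.6 G. lamblia cysts per litre (the study's upper bound
# per 1,000 ml), 1 L/day drinking, assumed exponential parameter r = 0.0199.
p = per_event_infection_risk(3.6, 1.0, 0.0199)
annual = annual_infection_risk(p, 365)
```

Results of this kind are then compared against a tolerable-risk benchmark such as one extra infection per 10,000 persons per year.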
of risk from infection. Based on ‘worst-case’ assumptions, if undisinfected rainwater is routinely used for drinking, then the infection incidence is expected to range from 9.8 x 10^0 to 5.4 x 10^1 (Salmonella spp.) and from 2.0 x 10^1 to 1.3 x 10^2 (G. lamblia) cases per 10,000 persons in urban SEQ per year. The authors conclude that the only likely risk from roof-harvested rainwater was from drinking water contaminated with Salmonella spp. and G. lamblia. L. pneumophila did not present a threat for use of tank water as potable water. Therefore it would be advisable to disinfect roof-harvested rainwater by installation of a UV disinfection unit, boiling, or other forms of disinfection before it is used as potable water, especially for drinking. This is particularly important for the elderly and immunocompromised. Also, given that G. lamblia was found in rainwater tank samples, inclusion of giardiasis in the notifiable disease list in Queensland should be considered. Rainwater tanks have been installed in Australia at unprecedented rates due to water restrictions in several capital cities. In south east Queensland over 260,000 state government subsidies were granted to households for installing rainwater tanks up to December 2008, when the subsidy scheme ended. Therefore it is essential that a robust methodology be developed for assessing the possible health risks from roof-harvested rainwater. Further information is required on the occurrence of pathogens throughout the year and the viability of pathogens in roof-harvested tanks. Comment This study deliberately made assumptions which maximised risk estimates, and actual risks may be lower due to loss of pathogen viability in stored rainwater between rain events. Australian health authorities do not recommend drinking untreated rainwater if a potable tap water supply is available.
A recently released report from the Australian Bureau of Statistics indicates that although the proportion of households using water from rainwater tanks has increased from 19.3% to 26.4% between 2007 and 2010, the percentage of households drinking rainwater has not increased.
Recreational Water Disease outbreaks associated with untreated recreational water use. Schets, F.M., De Roda Husman, A.M. and Havelaar, A.H. (2010) Epidemiology and Infection, 1-12. In 1991 an epidemiological surveillance system was set up in The Netherlands to record reported health complaints associated with untreated recreational water. Recreational exposure to surface water may result in negative health effects when water quality is microbiologically poor, and may possibly result in outbreaks of disease; however, it is often difficult to attribute reported cases of presumptive waterborne illness to recreational water contact. This paper provides an overview of the types and number of outbreaks that were reported and investigated from 1991 to 2007. Surveillance activities may help to characterise the epidemiology of the illness and to identify trends in aetiological agents. Also, major deficiencies in providing safe recreational water may be identified and the data collected may be used to develop interventions to prevent future illness. Outdoor recreational water settings covered by this surveillance program included all official bathing sites in untreated fresh and marine water for which water quality data were reported to the European Commission. In 2007, 641 official bathing sites existed in The Netherlands, including 555 inland water bodies and 86 coastal water sites. The National Institute for Public Health and the Environment annually asks provinces and public health services to provide overviews of the outbreaks and single cases of illness associated with untreated recreational water use that they were notified of during the most recent preceding bathing season (1 May to 1 October). Provinces and public health services were requested to complete a standard form and subdivide the outbreaks and single cases into six categories: (I) gastroenteritis, (II) skin, (III) ear conditions, (IV) eye conditions, (V) leptospirosis and (VI) other health complaints.
From 2004 onwards, provinces and public health services were also asked to report presumptive outbreaks involving 10 or more cases with similar illness from the same bathing site as soon as they were notified. During the bathing season
water quality was tested at official bathing sites fortnightly for compliance with the standards for faecal indicator bacteria set out in the European Bathing Water Directive. Weather data were also obtained for the bathing seasons of 1991-2007. Outbreaks were classified according to the Centers for Disease Control and Prevention classification scheme for waterborne disease outbreaks, which uses the strength of evidence implicating water as the transmission route on the basis of available epidemiological and water quality data. There are four classes: (I) adequate epidemiological data (data provided about exposed and unexposed persons with relative risk or odds ratio greater than or equal to 2, or P values less than or equal to 0.05) and adequate water quality data provided; (II) adequate epidemiological data but water quality data not provided or inadequate; (III) epidemiological data provided but limited (does not meet the criteria for Class I, or a claim was made that ill persons had no other exposures in common than water, but no data provided) and adequate water quality data provided; and (IV) epidemiological data limited and water quality data not provided or inadequate. From 1991 to 2007 a total of 1055 reports of untreated recreational water-related illnesses were received from provinces and public health services. Of these reports, 313 were not included as they involved only one patient. The remaining 742 outbreaks mainly comprised skin conditions (48%) or gastroenteritis (31%). Incidents involving ear conditions and mixed outbreaks of gastroenteritis and skin conditions were less common. Leptospirosis was reported infrequently and eye conditions were rare. From 2004 to 2007 there were 42 outbreaks that were directly reported during the bathing season; 24 (57%) were skin conditions, 10 (24%) gastroenteritis and five (12%) otitis externa.
The majority of the outbreaks (78%) were classified as Class IV as they lacked water quality data and no epidemiological investigation was performed. There were 160 outbreaks classified as Class III, and these mainly comprised outbreaks of skin conditions or gastroenteritis. There were no Class II outbreaks and less than 1% of the outbreaks fell into Class I. The number of people affected was poorly reported in
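The four-class CDC scheme described above is essentially a decision table over two kinds of evidence. A sketch (the function and argument names are ours, not the CDC's):

```python
def cdc_outbreak_class(epi_adequate, epi_provided, water_adequate):
    """Assign a CDC strength-of-evidence class from two evidence axes:
    epi_adequate   - epidemiological data meet the RR/OR or P-value criteria
    epi_provided   - some (possibly limited) epidemiological data exist
    water_adequate - adequate water quality data were provided."""
    if epi_adequate:
        return "I" if water_adequate else "II"
    if epi_provided and water_adequate:
        return "III"
    return "IV"

# Most Dutch outbreaks lacked both kinds of data, hence Class IV:
print(cdc_outbreak_class(False, False, False))  # IV
```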
21% of outbreaks, so estimation of total numbers was difficult. Assuming only 2 people were involved in such outbreaks, the total number affected in the 742 outbreaks was at least 5623. The number of outbreaks per bathing season varied greatly and ranged from nine to 97, with a median of 43. The number of outbreaks was strongly correlated (r = 0.8-0.9) with the average monthly temperature in June, July and August and with the number of summer days (i.e. maximum temperature greater than or equal to 25 degrees C) and tropical days (i.e. maximum temperature greater than 30 degrees C) in a summer. Although the degree of compliance with European bathing-water quality standards increased over time, this did not correlate (r = 0.1) with the number of outbreaks reported. During the study period there were 170 reports of cyanobacterial scums in recreational waters, but none of these was accompanied by reports of cases of illness. Occasionally some of the reported outbreaks of gastroenteritis or skin conditions were suspected to be related to exposure to cyanobacteria; however, none of these was further investigated. The limited number of bathing sites in The Netherlands which had recurrent outbreaks of the same illness (4%) and the low frequency of recurrence (2-4 times) at these sites imply that, in general, water quality problems resulting in disease outbreaks were due to incidental (faecal) contamination events or environmental conditions that favoured the growth of indigenous pathogens, which were the presumed aetiological agents in most of the outbreaks. Therefore measures to protect public health include: identification of all possible contamination sources and high-risk situations, regular updating of bathing water profiles, alerting the public to changes that might have a negative effect on water quality, and the provision of adequate and updated information to the public.
People need to be informed about the possible negative health impacts of swimming in surface water although it should also be emphasised that illness is generally mild and self-limiting and the positive health effects of swimming should be stressed.
Water Softening A Randomised Controlled Trial of Ion-Exchange Water Softeners for the Treatment of Eczema in Children. Thomas KS, Dean T, O’Leary C, Sach TH, Koller K, Frost A, Williams HC, the SWET Trial Team. (2011) PLoS Medicine 8(2) e1000395. Atopic eczema (also known as atopic dermatitis) is believed to affect up to 20% of school age children in developed countries. Some studies have suggested the condition may be associated with or exacerbated by hard water (water high in divalent cations, mainly calcium and magnesium). This paper reports the outcome of a multi-centre randomised trial on water softening involving 336 children in the UK. All children had been diagnosed with eczema and met severity criteria on a recognised clinical assessment scale. Children were aged 6 months to 16 years at enrolment, and all lived in hard water areas (200 mg/L calcium carbonate equivalent or above). Only one child was enrolled per household. Households were randomised to receive an ion-exchange water softener plus normal eczema care, or normal eczema care alone, for 12 weeks. Eczema severity was assessed at baseline, 6, 12 and 16 weeks by research nurses who were blinded to water softener assignment. A pilot study had shown participants could not be
blinded due to the effects of softening on soap lathering. Information was also recorded on itchiness, sleep loss and use of topical medications. Movement during sleep was assessed using a wrist accelerometer. Dermatitis Family Impact and child’s Quality of Life questionnaires were administered. Eleven children in the intervention group and two in the control group withdrew from the study, leaving data for 159 in the intervention group and 164 in the control group for analysis. Both groups of children showed improvement in eczema severity as scored by research nurses by the end of the 12 week period (water softener group 20% improvement, control group 22% improvement), and there was no significant difference between the two groups. No differences were seen in the use of topical eczema medications or movement during sleep. Some significant but small differences were seen in subjective outcomes, but these may have been attributable to reporting bias. The authors conclude that water softeners cannot be recommended for symptom relief in children with existing eczema. Disclaimer Whilst every effort is made to reliably report the data and comments from the journal articles reviewed, no responsibility is taken for the accuracy of articles appearing in Health Stream, and readers are advised to refer to the original papers for full details of the research.
Health Stream is the quarterly newsletter of Water Quality Research Australia. Health Stream provides information on topical issues in health research which are of particular relevance to the water industry, along with news and updates on the recent literature. This newsletter is available free of charge to the water industry, public health professionals and others with an interest in water quality issues. A PDF version of the newsletter including Web-bonus articles is available on the WQRA website. To be placed on the email notification list for Health Stream, please email: pam.hayes@monash.edu To be placed on the print mailing list for Health Stream, please send your postal address details to: Pam Hayes Epidemiology and Preventive Medicine Monash University - SPHPM Alfred Hospital, Melbourne VIC 3004 AUSTRALIA
Phone +61 (0)3 9903 0571 Fax +61 (0)3 9903 0556 Email pam.hayes@monash.edu
© Copyright WQRA. Health Stream articles may be reproduced and communicated to third parties provided WQRA is acknowledged as the source. Literature summaries are derived in part from copyright material by a range of publishers. Original sources should be consulted and acknowledged.
Web Bonus Articles
Chemical Contamination Perchlorate, nitrate, and iodide intake through tap water. Blount, B.C., Alwis, K.U., Jain, R.B., Solomon, B.L., Morrow, J.C. and Jackson, W.A. (2010) Environmental Science and Technology, 44(24); 9564-9570. Perchlorate is widespread in the environment from both anthropogenic and natural sources. Pharmacological dosages of perchlorate (mg/kg-day) can interfere with iodide uptake by the thyroid gland and therefore reduce thyroid hormone production. Additional data are needed to assess the relative importance of low level perchlorate exposure from drinking water as compared with exposure from food. Nitrate is ubiquitous in the environment and, due to the use of nitrogenous fertilisers, levels have significantly increased in groundwater over the past two decades. The U.S. EPA has set a maximum contaminant level of 44,300 µg/L for drinking water nitrate, based on preventing methemoglobinemia in infants. Elevated nitrate intake has been found to be associated with increased risk of cancer and adverse reproductive outcomes. Nitrate may also interact additively with perchlorate to inhibit iodide uptake at the thyroid. The relative importance of nitrate intake in tap water can be evaluated by measuring nitrate from this source. Iodine deficiency is an important public health problem globally. Iodine is required by the thyroid gland to produce thyroid hormones. Some sensitive life stages, e.g. infants and pregnant/lactating women, may benefit from increased iodine intake. This study measured perchlorate, nitrate and iodide in tap water collected from a representative sample of U.S. residents and estimated anion intakes attributable to direct and indirect tap water consumption. The National Health and Nutrition Examination Survey (NHANES) collected tap water samples and water consumption data in 2005-2006 from 3262 U.S.
residents aged 12 years and older, residing in 30 locations across the U.S. Water consumption data were collected by dietary
questionnaire. Direct tap water consumption was defined as the amount of tap water consumed by an individual on the first day of 24-h dietary recall. Indirect tap water consumption was defined as water added to foods or beverages during final preparation (e.g. brewing coffee) and was calculated from the same dietary questionnaire. Perchlorate, nitrate and iodide were measured in all the water samples available (n=3262), of which only 3084 had matching study participant data (water consumption, body weight, age and race/ethnicity). The median perchlorate, nitrate and iodide levels in the tap water were 1.16, 758 and 4.55 µg/L, respectively. None of the perchlorate tap water levels were above the U.S. EPA drinking water equivalent level of 24.5 µg/L. Tap water nitrate levels were weakly correlated with both iodide levels (r = 0.17, p less than 0.0001, n = 3261) and perchlorate levels (r = 0.25, p less than 0.0001, n = 3261). Only 47% of the study participants reported directly consuming any tap water in the 24 h dietary recall. Indirect consumption of tap water was reported by 78% of study participants, and 89% of study participants had total tap water intake (direct + indirect) greater than zero. Direct and indirect tap water consumption were not found to be correlated (r = 0.02). For adults aged greater than or equal to 20 yr, the median and 95th percentiles of total tap water intake were 11.6 and 45.5 mL/kg-day, respectively. Median perchlorate intakes for adults and for women of reproductive age were 9.11 and 6.56 ng/kg-day, respectively, and account for only 14% and 12%, respectively, of the median total perchlorate dose previously reported for adults (64 ng/kg-day) and women of reproductive age (57 ng/kg-day) in NHANES 2001-2002. Estimated perchlorate intake doses from tap water were found to be much lower than the U.S. EPA reference dose (RfD) of 700 ng/kg-day.
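The intake estimates above follow directly from concentration times per-body-weight water consumption; since 1 µg/L equals 1 ng/mL, the units work out neatly. A sketch using the study's median figures (note the study's reported median dose is the median of individual doses, not the product of medians, so the two need not agree):

```python
def intake_dose_ng_per_kg_day(conc_ug_per_l, water_ml_per_kg_day):
    """Daily dose in ng/kg-day from tap water.
    1 ug/L == 1 ng/mL, so the concentration multiplies the
    per-body-weight water intake (mL/kg-day) directly."""
    return conc_ug_per_l * water_ml_per_kg_day

# Median tap-water perchlorate (1.16 ug/L) x median adult total tap
# water intake (11.6 mL/kg-day) -- well below the 700 ng/kg-day RfD.
dose = intake_dose_ng_per_kg_day(1.16, 11.6)
```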
The median nitrate intake for all study participants was 10.3 µg/kg-day, which is about 0.15% of the U.S. EPA reference dose for nitrate (7.1 mg/kg-day). The 95th percentile nitrate intake for all study participants (287 µg/kg-day) was only 4% of the nitrate reference dose. None of the study participants were found to exceed the U.S. EPA
reference dose for nitrate. The median iodide intake from tap water was estimated to be 39.7 ng/kg-day, which is 2% of the daily intake recommended by the World Health Organization (WHO) for adolescents and adults (2 µg/kg-day). The geometric mean iodide intake attributable to tap water is typically less than or equal to 1% of the average intake attributable to food. It is possible that intakes of perchlorate by infants consuming formula prepared with tap water containing perchlorate at the 95th percentile of this study distribution (1.89 µg/L) could exceed the EPA reference dose of 0.7 µg/kg-day. This study found that drinking water is not a major source of intake of perchlorate, nitrate and iodide. It is likely that diet accounts for a larger portion of aggregate perchlorate intake than does water ingestion. The information found here will assist in decisions about strategies for reducing perchlorate and nitrate exposure. Private drinking water wells as a source of exposure to perfluorooctanoic acid (PFOA) in communities surrounding a fluoropolymer production facility. Hoffman, K., Webster, T.F., Bartell, S.M., Weisskopf, M.G., Fletcher, T. and Vieira, V.M. (2011) Environmental Health Perspectives, 119(1); 92-97. Perfluorooctanoic acid (PFOA or C8) is a synthetic chemical used in the manufacture of fluoropolymers. These polymers possess unique properties including oil, stain, grease and water repellency, and are used in non-stick cookware, weather- and stain-resistant clothing and textiles, building and construction materials and electronics. PFOA has been detected in the environment globally and is extremely resistant to environmental and metabolic degradation. It has also been detected in most serum samples from U.S. and world populations. PFOA is readily absorbed via inhalation and ingestion, and once absorbed it is eliminated from the human body very slowly.
PFOA exposure has been linked to various health impacts in animals, including increased cancer risk, adverse reproductive outcomes and liver damage, however the health
impacts of exposure in humans remain largely unknown. DuPont began using PFOA in the manufacture of Teflon at its Washington Works plant in the early 1950s. Emissions to air and the Ohio River peaked in the late 1990s, and large reductions in these emissions have been reported by the company in recent years. The primary source of exposure for individuals in the surrounding communities is contaminated groundwater that is used for drinking water. In 2001, a group of residents in communities surrounding the facility filed a class action lawsuit against DuPont for alleged health damages after PFOA was detected in public drinking water. The settlement established the C8 Health Project, a baseline survey conducted from 2005 to 2006 to investigate potential links between PFOA and human disease in the area surrounding the facility. The current study examined the relationship between PFOA concentrations in serum and in drinking water using data collected from private drinking water wells contaminated by industrial emissions, in a subset of C8 Health Project participants. The C8 Health Project included approximately 69,000 adults who lived in one of six public water districts in West Virginia and Ohio that surround DuPont’s Washington Works facility. Data were collected from each participant using questionnaires and clinical examinations, and demographic information and residential, occupational and medical histories were obtained. Serum samples were taken from each participant and concentrations of 10 perfluorinated compounds were determined. Water monitoring was conducted by DuPont for public and private wells surrounding the Washington Works facility beginning in 2001. Private well monitoring reports included PFOA measurements as well as the primary use of each well and the name and address of each well’s owner.
Well monitoring data for 63 private wells used primarily for drinking water were linked to C8 Health Project participants based on name and address. There were 115 participants, who used 62 different private wells, included in this study. The relationship between PFOA levels in drinking water and in serum was examined using robust regression methods. For comparison with the regression analyses, the ratio of
serum to drinking-water PFOA concentrations was predicted using a simple first-order, single-compartment pharmacokinetic model. The final sample consisted of 108 participants after exclusions. Serum PFOA levels ranged from 0.9 to 4751.5 µg/L, with a median concentration of 75.7 µg/L. The median PFOA concentration in the drinking water wells included in the analysis was 0.2 µg/L. There was considerable variability between wells, and PFOA concentrations ranged from below the limit of quantification (LOQ = 0.006 µg/L) farthest from the Washington Works facility to 13.3 µg/L closest to the facility. In the adjusted robust regression models, each microgram per litre increase in drinking water PFOA concentration was associated with a 141.5 µg/L (95% CI = 134.9-148.1) increase in serum concentration. Growing one’s own vegetables, being male and being employed at DuPont were associated with elevated serum PFOA levels, however these associations did not reach statistical significance at the 0.05 level. When the analysis was restricted to those with residency greater than 15 years, results were similar (Beta = 140.2 µg/L; 95% CI 132.1-148.4 µg/L; n = 67). Using the simple steady-state first-order pharmacokinetic model in the study population, after scaling for body weight and sex of participants, a serum:drinking water concentration ratio of 114 was obtained, which is close to the estimate of effect obtained in the regression analysis. This result suggests that the pharmacokinetic model provides a reasonable estimate. The serum PFOA concentrations in users of private wells in the area surrounding DuPont were much greater than those observed in the general U.S. population (4 µg/L).
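The serum:drinking-water ratio described above is the steady-state solution of a first-order, single-compartment model. A sketch with assumed literature-style parameters (the PFOA half-life, volume of distribution and water intake below are our illustrative assumptions, not values reported in this summary):

```python
import math

def serum_to_water_ratio(intake_l_per_kg_day, half_life_days,
                         vd_l_per_kg, absorbed_fraction=1.0):
    """Steady-state serum:drinking-water concentration ratio for a
    first-order, single-compartment model:
        C_ss = intake * C_water * f_abs / (k_e * V_d)
    so C_ss / C_water is independent of the water concentration."""
    k_e = math.log(2) / half_life_days  # first-order elimination rate
    return intake_l_per_kg_day * absorbed_fraction / (k_e * vd_l_per_kg)

# Assumed: half-life ~2.3 years, V_d ~0.17 L/kg, intake ~0.016 L/kg-day.
ratio = serum_to_water_ratio(0.016, 2.3 * 365, 0.17)  # roughly 110-115
```

With parameters of this order, the model lands near the ratio of 114 the study reports, illustrating why a long elimination half-life produces serum levels far above the water concentration.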
Private drinking water wells in the area were contaminated with PFOA, with levels in some wells much greater than those observed in public drinking water supplies in the same area, which ranged from 0.03 µg/L in Mason County to 3.5 µg/L in Little Hocking. The adjusted regression analysis indicated that PFOA levels in drinking water are a significant predictor of PFOA levels in serum. The results may also be applicable to other areas with point-source PFOA contamination.
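The steady-state reasoning behind the serum:drinking water ratio can be sketched numerically. The parameter values below (water intake, absorbed fraction, PFOA serum half-life, volume of distribution, body weight) are illustrative assumptions drawn from typical literature ranges, not the study's actual inputs, so the result only illustrates that a ratio of this order is plausible.

```python
import math

# Steady-state, first-order, one-compartment sketch of the serum:drinking-water
# PFOA ratio. All parameter values are hypothetical, for illustration only.
def serum_water_ratio(intake_l_per_day=1.0,      # assumed drinking water intake
                      absorbed_fraction=1.0,     # assumed gut absorption
                      half_life_days=2.3 * 365,  # assumed PFOA serum half-life
                      vd_l_per_kg=0.17,          # assumed volume of distribution
                      body_weight_kg=70.0):
    k_elim = math.log(2) / half_life_days              # first-order elimination rate
    clearance = vd_l_per_kg * body_weight_kg * k_elim  # L/day
    # At steady state, intake rate = clearance x serum concentration, so:
    # C_serum / C_water = absorbed_fraction x intake / clearance
    return absorbed_fraction * intake_l_per_day / clearance

ratio = serum_water_ratio()
print(round(ratio))  # on the order of 100, comparable to the reported 114
```

With these assumed parameters the ratio comes out near 100, the same order as the 114 reported by the study; the exact value shifts with the half-life and distribution volume chosen.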
HEALTH STREAM
Fluoride Fluoride in still bottled water in Australia. Mills, K., Falconer, S. and Cook, C. (2010) Australian Dental Journal, 55(4); 411-416. It has been speculated that there may be an association between increased consumption of bottled water and a recent increase in tooth decay rates in Australian children, with consumption of (fluoridated) reticulated water being replaced by bottled water to the detriment of dental health. As of January 2009, more than 80% of the Australian population had access to fluoridated water. The recommended level of fluoride in reticulated water supplies is from 0.6 to 1.1 mg/L. The rise in bottled water consumption is a worldwide phenomenon; bottled water is purchased for its convenience and taste, for the perception that it does not contain chemicals or harmful bacteria, and as a status symbol. International studies have shown that the majority of bottled waters contain negligible fluoride in terms of dental health benefits, but there is limited information about the fluoride concentration of bottled water sold in Australia. This study aimed to measure the fluoride concentration of still bottled water available in Australia. One hundred different brands of still bottled water were purchased between 2006 and 2007, and three bottles of each brand were tested. The volume of the containers varied from 300 mL to 12 litres and most of the samples were packaged in plastic bottles. The majority (72%) of waters were described as spring water. Two analytical methods were used to determine the fluoride levels of the water: ion chromatography and ion-selective electrode. The two methods produced comparable results, differing by less than 0.1 mg/L. The fluoride levels of the 300 samples tested ranged from less than 0.1 to 1.6 mg/L, a more than 16-fold difference. For 16 brands the water originated from countries outside Australia.
Of the samples tested, 85% had fluoride levels below 0.1 mg/L and a further 5% had levels from 0.1 up to 0.5 mg/L; thus 90% of the samples had fluoride levels less than 0.5 mg/L, below the levels recommended to confer a dental benefit. Of the remaining samples (from 10 brands), 27 had fluoride levels from 0.6 up to and including 1.1 mg/L (a range of 0.5 to 1.3 mg/L allowing for test result uncertainty). Of the 10 brands with fluoride levels within the recommended range for reticulated water, seven were from Australia, two were from Malaysia and one was from Europe. One per cent of samples recorded fluoride values in excess of this range. No trends were seen between water source and fluoride level; however, differences between individual samples within brands indicated significant variability in fluoride composition within the source supply. Fluoride levels were recorded on the labels of 32 samples, and none reported a level which (when adjusted for test uncertainty) could be considered to exceed the label claim. This study supports the belief that bottled water does not contain the same levels of fluoride as fluoridated reticulated drinking water supplies. The need for fluoride concentrations to be listed on bottled water labels has been recognised in the guidelines for the use of fluorides in Australia. Better information about fluoride levels in bottled water is of most relevance to those without access to a fluoridated reticulated water supply, and to health professionals. For these communities, being able to select a particular brand of bottled water on the basis of its fluoride level could, if sustained, have a long-term dental health benefit. This information could also help individuals and health professionals to better assess the influence of bottled water consumption on dental health, particularly where it is the main source of drinking water in unfluoridated areas.
Comment A national survey conducted by the Australian Bureau of Statistics in March 2010 showed that 6.6% of Australian households use purchased bottled water as their main source of drinking water. A further 9.8% use water from a rainwater tank, which also lacks fluoride.
Dose-dependent Na and Ca in fluoride-rich drinking water - Another major cause of chronic renal failure in tropical arid regions. Chandrajith, R., Dissanayake, C.B., Ariyarathna, T., Herath, H.M.J.M.K. and Padmasiri, J.P. (2011) Science of the Total Environment, 409(4); 671-675. In Sri Lanka the prevalence of chronic kidney disease (CKD) has been increasing over the past 8-10 years, especially in particular geographic regions. The disease in these regions does not appear to be associated with known risk factors such as diabetes, hypertension or chronic glomerulonephritis. "Chronic Kidney Disease of unknown etiology" (CKDu) is more prevalent in certain parts of the dry zone regions of the country, where the annual rainfall is less than 1000 mm and much of the year is dry. The disease commonly manifests in young male farmers of low socioeconomic class and mostly affects those between 30 and 60 years of age. In almost all the affected regions, the main source of potable water for CKDu subjects is groundwater from shallow and deep wells. The regions of high CKDu prevalence overlap with the high groundwater fluoride zone of Sri Lanka, suggesting that, at least to some extent, the fluoride content of drinking water may contribute to CKDu. In some parts of the dry zone, however, the disease is not recorded even though drinking water fluoride content is significantly higher, indicating that fluoride is not acting alone. This paper describes a detailed hydrogeochemical study covering endemic and non-endemic regions. Well water samples were randomly collected from the CKDu-prevalent regions of Giradurukotte (n=46), Nikawewa (n=52), Medawachchiya (n=10) and Padaviya (n=34) and from the non-endemic regions of Huruluwewa (n=29) and Wellawaya (n=8). Only wells in use for more than 10 years were sampled, in order to obtain information on the effect of long-term human consumption.
In endemic regions, samples were taken randomly, but in Medawachchiya all samples were taken from wells used by CKDu patients. Samples were collected in duplicate from each site and the pH, electrical conductivity (EC) and alkalinity were measured in-situ. Water samples were tested for a variety of hydrochemical characteristics
including fluoride concentration, nitrate, phosphate, sodium, sulphate, chloride, calcium, potassium, iron and total hardness. The pH of all water samples was neutral to alkaline in the regions studied. There was extremely large variation in EC among water samples. High hardness was common in most of the water samples, except for water from the Wellawaya region, where higher alkalinity was prominent compared to the other regions studied. The nitrate-nitrogen content of the water was well below the WHO recommended limit of 10 mg/L, except for one sample from Medawachchiya. Ca-bicarbonate type water was predominant in endemic CKDu regions, whereas Na-K non-dominant anion type water was common in the non-endemic regions. There was a clear difference in Na and Ca activities in drinking water between affected and non-affected regions, as was apparent from their Na/Ca ratios. High fluoride levels were common in all studied regions; however, non-endemic areas were characterised by much higher Na/Ca ratios. When the Na/Ca ratio increases, the geochemical behaviour of drinking water favours complexation of fluoride with Na+, which reduces both the toxicity of fluoride ions in the human body and the absorption of Ca2+. When Ca2+ activity is higher, on the other hand, this aggravates the damage caused by fluoride intake and can cause considerable nephrotoxic effects on human proximal tubular cells; the toxicity depends strongly on Na+ and Ca2+ activity. Nutritional state and genetic predisposition also play a major role in CKDu in the north central dry zone of Sri Lanka, as even though all families are exposed to similar geoenvironmental conditions, only those who are susceptible are affected. The farming community in such regions is more vulnerable to the disease, as farmers drink more (ground-derived) water when working in the warm humid conditions that prevail.
The mean Na/Ca ratios in the endemic regions ranged from 1.6 to 6.6, while in the non-endemic regions they ranged from 34 to 469. Mean fluoride concentrations ranged from 0.62 to 1.42 mg/L in endemic areas, and from 0.72 to 1.05 mg/L in the non-endemic regions.
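The Na/Ca contrast above can be reproduced from mass concentrations with a short molar-ratio calculation. The study works with ionic activities; the sketch below ignores activity coefficients, and the example concentrations are hypothetical, chosen only to be plausible for a Ca-rich (endemic) and a Na-rich (non-endemic) water.

```python
# Na/Ca molar ratio from mass concentrations (mg/L). The input values are
# hypothetical illustrations, not measurements from the study.
NA_G_PER_MOL = 22.99
CA_G_PER_MOL = 40.08

def na_ca_molar_ratio(na_mg_per_l, ca_mg_per_l):
    # Convert each mass concentration to a molar concentration, then divide.
    return (na_mg_per_l / NA_G_PER_MOL) / (ca_mg_per_l / CA_G_PER_MOL)

endemic = na_ca_molar_ratio(na_mg_per_l=40.0, ca_mg_per_l=60.0)      # Ca-rich water
non_endemic = na_ca_molar_ratio(na_mg_per_l=200.0, ca_mg_per_l=5.0)  # Na-rich water
print(round(endemic, 1), round(non_endemic, 1))
```

A Ca-rich water gives a ratio near 1, while a Na-rich water with little calcium gives a ratio in the tens, mirroring the order-of-magnitude gap between the endemic (1.6-6.6) and non-endemic (34-469) ranges reported.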
The information gained from this study is of particular importance to the millions of people living in poverty in arid zones of some tropical countries. The groundwater used for drinking and domestic purposes in these countries may have unusual fluoride, sodium and calcium ion concentrations, which may lead to fatal chronic kidney disease. Comment This study raises a new hypothesis based on ecological comparison of populations in different areas. Additional studies collecting exposure and health outcome data at an individual level are needed to provide a stronger level of evidence. This should include assessment of dietary intake of the minerals of concern in the different regions, in addition to the contribution of water. Giardia Drinking water is possibly the risk factor for Giardia lamblia infection among Libyan children. BenRashed, M.B. (2010) Jamahiriya Medical Journal, 10(3); 206-209. Giardia is a common cause of gastrointestinal disease, especially in children. Contamination of water supplies with Giardia cysts has been increasingly recognised over the last 10 years as a cause of waterborne disease in humans, and several large outbreaks of giardiasis have resulted from contamination of municipal water supplies with human waste. This study was undertaken to determine the Giardia lamblia infection rate and its possible association with drinking water source among asymptomatic children living in Tripoli, Libya. The study included 260 children aged from 3 to 14 years (mean 6.9 years). The children had no diarrhoea or any other gastrointestinal symptoms related to Giardia infection. The children were divided into three groups according to their drinking water source: 121 children drank municipal domestic tap water, 68 drank from private domestic tube pipe wells, and 71 drank from both wells and rain water reservoirs or tanks (feskya).
Stool samples were collected from all children and examined for the
presence of G. lamblia trophozoites or cysts using normal saline and iodine smears. G. lamblia cysts were detected in 25 samples (9.6%). The number of positive samples was lowest in the group that drank municipal tap water alone (5 positive samples, 4%); among those that drank from wells alone there were 6 (8.8%) positive samples, and among those that drank from both wells and rain water there were 14 (19.7%) positive samples. Compared with children drinking municipal tap water only, Giardia infection was significantly associated with drinking from private domestic well reservoirs only (p=0.006) and with drinking from both private domestic well reservoirs and rain water (p=0.004). In addition to Giardia, another intestinal protozoan, Entamoeba histolytica (5.7%), was found, along with three nonpathogenic protozoan species: Entamoeba hartmanni, Entamoeba coli and Endolimax nana. The results show that private domestic water reservoirs and tanks may be subject to contamination with Giardia and are therefore a possible risk factor for Giardia infection. It is not possible to verify that the acquired Giardia infections came from these sources, as no water samples were analysed, and other modes of Giardia transmission, such as food and person-to-person contact, need to be considered. Small private water supplies, including pipe well water reservoirs and rain tanks, are not regulated by drinking water standards and there is a lack of inspection measures, so owners must test and treat the water as needed to avoid possible health risks. Preventing infection from these private water supplies will require health education along with environmental protection, mandatory inspection measures and regulation by drinking water standards. Household Water Treatment Solar disinfection of drinking water in the prevention of dysentery in South African children aged under 5 years: The role of participant motivation.
Du Preez, M., McGuigan, K.G. and Conroy, R.M. (2010) Environmental Science and Technology, 44(22); 8744-8749.
Household water treatment (HWT) is an attractive option when the cost of universally supplying piped water is prohibitive. Solar disinfection (SODIS) is one of the simplest and cheapest HWT techniques, and recent laboratory studies have shown that exposing water to sunlight significantly reduces microbial contamination. However, there are limited data from controlled field trials showing that this reduction in bacterial levels translates into a reduced risk of disease in people, and previous studies have had several deficiencies. A randomised, unblinded controlled trial of solar disinfection was conducted in a periurban area of the Gauteng Province of South Africa from October 2007 to November 2008, examining the relationship between participant motivation and the effectiveness of solar disinfection in preventing childhood dysentery. The study was conducted in the four periurban subdistricts of Soshanguve, Legonyane, Fafung and Kwarriekraal in the Tshwane Municipality of the Gauteng Province. Households were eligible to participate if they had no in-house piped water and at least one resident child aged 6 months to 5 years. Households were randomly allocated to control and SODIS groups. Field staff were recruited from the local community and visited households every 2 weeks to collect and distribute diarrhoeal diaries, assist with difficulties experienced in completing the diaries, and encourage households to use SODIS. Households were selected 3 months before the main survey follow-up to ensure a well-established SODIS-related health effect. Follow-up began in December 2008 and ended in December 2009. Parents and carers participating in the study were given verbal and written information on the disease concept, along with a simple explanation of the solar disinfection process, its effect on the microbial quality of their drinking water, and the resulting effect on the health of their children.
They were trained in using SODIS and in how to complete the pictorial diarrhoea diaries. Baseline information was collected about basic hygiene, sanitation and water use practices. Children in the intervention group drank from a bottle which had been solar exposed on the previous day and were advised where possible to drink directly from the bottle rather than filling a cup
or other container. Carers of the children in the control group were not provided with SODIS bottles and were instructed to maintain their usual practices. The diarrhoea diaries recorded diarrhoea incidence daily in both the control and intervention groups and were collected monthly. Water from the storage containers and SODIS bottles was collected every 3 months and the Escherichia coli count recorded. Motivation was measured by calculating the proportion of days on which diarrhoea diaries had been completed, and was classified as low (less than 75% of diarrhoeal diary information completed) or high (75% or more complete). A total of 649 households were recruited, with 386 children in the control group and 438 children in the intervention group. At baseline, 62.4% of the water samples from the study households met the World Health Organisation guideline of zero thermotolerant coliforms per 100 mL, and a further 21.8% had levels under 10 per 100 mL. There were 121 children lost to follow-up during the study, leaving 335 children in the control group and 383 in the intervention group available for analysis. Diarrhoeal illness was defined as dysentery (any loose stool containing blood or mucus) or non-dysentery diarrhoea (3 or more loose or watery stools during a 24 hour period). The annual incidence of dysentery was 4.9 days (95% CI 4.6-5.3) in the control group and 2.5 days (95% CI 2.3-2.7) in the intervention group. A generalised negative binomial regression model was used to examine the effect of SODIS on the incidence of dysentery. Incidence rates were lower in households drinking water from a standpipe than from any other source (IRR 0.38, 95% CI 0.12-1.2, P = 0.091) and lower in those drinking solar disinfected water (IRR 0.64, 95% CI 0.39-1.0, P = 0.071); however, both effects were of borderline statistical significance. Overall, 25.3% of participants kept diarrhoea diaries for 75% or more of the trial days, with no difference between control and intervention groups.
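As a quick arithmetic check on the annual incidences reported above, the crude (unadjusted) incidence rate ratio can be computed directly. It differs from the adjusted IRR of 0.64 because the paper's estimate comes from a generalised negative binomial regression, not a simple ratio of rates.

```python
# Crude incidence rate ratio from the reported annual dysentery incidences
# (days per child-year); a simple check, not the paper's adjusted analysis.
control_days_per_year = 4.9
sodis_days_per_year = 2.5

crude_irr = sodis_days_per_year / control_days_per_year
print(f"crude IRR = {crude_irr:.2f}")  # ~0.51 before model adjustment
```

The gap between the crude 0.51 and the adjusted 0.64 reflects the covariates (such as water source) included in the regression model.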
Overall, those who had the poorest level of data recording had the highest annual incidences of dysentery with rates of 12.9 days per year in those not on standpipe water sources and 4.8 in those on standpipe sources. After adjustment for these effects, those in the SODIS
group with the highest level of motivation (75% or more of data recorded) had a significantly lower incidence of dysentery (incidence rate ratio 0.36, 95% CI 0.16-0.81, P = 0.014). There was no significant effect of SODIS in any of the lower motivation categories. Solar disinfection was not significantly associated with the risk of non-dysentery diarrhoea overall (P = 0.419). Effective solar disinfection was classified as a reduction of 1 log10 unit or better in bacterial concentration. Of the 200 follow-up water samples in which bacteria were detected in the storage water, the quality of water in the SODIS bottle was improved in 20% of low-motivation households and 34% of high-motivation households; overall, the highly motivated households were more likely to achieve this than the other households in the SODIS group (OR 2.2, P = 0.034, adjusted for clustering by household). The results show that where motivation levels are low, solar disinfection of drinking water does not produce a worthwhile reduction in the risk of dysentery in children. However, even in the socially deprived setting of this study, almost a quarter of households achieved 75% compliance or better, and children in these households showed a marked reduction in dysentery rate over the study year. This study highlights the need for a shift from efficacy research to effectiveness research, and the need for strategies to identify the factors most likely to affect uptake of HWT within a specific setting and community. Reported care giver strategies for improving drinking water for young children. McLennan, J.D. and Farrelly, A. (2010) Archives of Disease in Childhood, 95(11); 898-902. The quality of drinking water is an important contributor to child health, particularly as young children are susceptible to diarrhoeal mortality, which is due in part to unsafe drinking water.
In low- and middle-income countries, care givers may engage in a variety of household water treatment (HWT) strategies to try to improve drinking water quality for children where the water quality is in doubt. The pattern of these practices is not well known, particularly for young children in high-risk situations.
This study examines strategies used by care givers of young children for improving drinking water quality within two samples drawn from high-risk populations in the Dominican Republic. The two high-risk groups of children considered in this study were: (1) children living in a poor, unplanned peri-urban community and (2) children attending a clinic for the treatment of undernutrition. The study objectives were to determine (1) the extent of use of different strategies to improve drinking water by care givers of young children, (2) the variation in practice as a function of the child's age and (3) the changes in frequency of the strategies used compared to a decade ago. The study was undertaken in a district on the outskirts of the city of Santo Domingo, Dominican Republic. The Recent Community Sample (RCS, n=25) was recruited between February 2007 and August 2008 from one of the unplanned neighbourhoods through door-to-door inquiry for a care giver of a child 6 years of age or under in the residence. Only one young child per household was selected, at random. The Past Community Sample (PCS, n=266) was recruited between June 1997 and November 1997 from six unplanned neighbourhoods in this district, including that used for the RCS; recruitment was also door-to-door, with every 10th home on each road visited to identify care givers of a child 5 years of age or under in the household. Both the Recent Nutrition Clinical Sample (RNCS) and Past Nutrition Clinical Sample (PNCS) were recruited from a nutrition clinic based in a nongovernmental hospital in this same district. The RNCS (n=312) comprised care givers recruited from consecutive admissions to the child nutrition program between July 2004 and November 2009. The PNCS (n=75) comprised care givers recruited from consecutive admissions to the child nutrition program between April 1997 and December 1997.
The care giver of each eligible child was invited to participate in a structured interview conducted by a researcher. The interview included questions about the types of water (boiled, chlorinated, bottled, untreated) given to the child and the frequency with which each was given as drinking water.
Bottled water was the most frequent water improvement strategy reported by both recent samples, and was reported at both "always/almost always" and "sometimes" usage levels, suggesting that bottled water serves as both a main and a supplementary approach to drinking water improvement. Rates of bottled water consumption reported in the recent samples were much higher than those from the 1997 samples. A reduction in the use of chlorination and of non-treated water over time was also seen. Within the clinic population there was a major reduction in the use of boiling, which was replaced by bottled water use. The recent community sample reported higher use of 'any' water improvement strategy, mainly driven by bottled water. The relationship between water improvement strategies and child age was also examined: boiling was the most common strategy for children under 1 year of age and markedly decreased with age, while bottled water use increased with age. Untreated water use was rare in the youngest children and increased with age. These patterns were apparent in both recent samples. This study identifies an increased reliance by care givers on bottled water in the Dominican Republic as a key drinking water improvement strategy for young at-risk children. While this may represent a positive trend in protecting children from waterborne disease, it may not be a sound financial approach to the provision of safe drinking water for low-income families. Mean weekly expenditure on water was reported to be 59 (SD 55) pesos, representing a mean of 5.2 (SD 6.9) per cent of income for the subgroup that reported typical household income (47% of the recent samples). The study also identified the child's age as an important factor influencing the type of household-level drinking water improvement strategy used, particularly the use of boiling.
The information collected in this study may inform health education efforts, however first there is a need to determine what is most favourable for these communities and what should be promoted.
Lead Association between children's blood lead levels, lead service lines, and water disinfection, Washington, DC, 1998-2006. Jean Brown, M., Raymond, J., Homa, D., Kennedy, C. and Sinks, T. (2010) Environmental Research, 111; 67-74. The adverse health effects of lead exposure are well known; in children they include developmental delay and behaviour disorders at low lead levels, and seizures and, in rare cases, death when levels are high. Lead in drinking water is known to contribute to children's blood lead levels (BLLs), and in the United States cases of childhood lead poisoning have been associated with drinking water. Lead most commonly enters finished water through corrosion of plumbing materials containing lead. Three factors influence the level of lead in drinking water: the presence of lead in plumbing materials, the pH of finished water, and the presence or absence of mineral scale in plumbing. Lead service lines (LSLs) connect homes to a central water main or run from the water meter to the home, and are known to contribute to lead found in household tap water. Chloramine, a chlorine-ammonia combination, has been adopted by a number of water suppliers as it produces fewer disinfection by-products than chlorine alone. However, in the absence of specific anticorrosive treatments such as orthophosphate, chloramine degrades accumulated mineral scale, resulting in lead leaching into drinking water. An increase in the average BLL of children after the water disinfectant had been changed from chlorine to chloramine has been reported. This study examined BLL results among children tested for lead in Washington, DC, between 1998 and 2006, and assessed how the BLLs of tested children were affected by water disinfectant type and LSLs while adjusting for the effect of housing age.
Homes built before the 1980s may have LSLs or copper pipes with lead-soldered joints whereas homes built after the 1986 Safe Drinking Water Amendments have “lead-free” pipes and fixtures. Also examined were the effect of both partial and no replacement of LSLs on the BLLs of children tested between 2004 and 2006.
Three cross-sectional analyses were conducted. The study population was derived from the Washington, DC, Childhood Lead Poisoning Prevention Program (CLPPP) blood lead surveillance system, which collected laboratory-based reports of the BLL results of tested individuals reported between January 1, 1998 and December 31, 2006. The Washington, DC Water and Sewer Authority (WASA) provided CLPPP and the Centers for Disease Control and Prevention with a list of 26,155 homes presumed to have an LSL. Street addresses from blood lead tests reported to CLPPP and WASA address data were standardised and matched on the complete street address. Complete street addresses could be found for 63,854 of the 67,831 unique children less than 6 years of age with at least one BLL reported, and these comprised the final sample. Water disinfectant type was coded according to the dates when the different types of water disinfection were used by WASA: (1) chlorine if the BLL test was conducted between January 1, 1998 and October 31, 2000; (2) chloramine if the BLL test was conducted between November 1, 2000 and June 30, 2004; or (3) chloramine with orthophosphate if the BLL test was conducted between July 1, 2004 and December 31, 2006. Age of housing was coded as pre-1950, 1950-1978, and post-1978; these periods coincide with changes in the lead concentration of residential paint. Lead paint use was greatest pre-1950, moderate from 1950 through 1978, and after 1978 lead in residential paint was banned. Data on age of houses were obtained for 37,322 (58.5%) children with validated addresses. Child's age was categorised as 16 months of age or less, and greater than 16 months to 6 years. A relationship was seen between BLL quartile status and the probability of living in a house with an LSL for every year between 1998 and 2006, including those years when WASA was in compliance with the EPA action level of 15 ppb.
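The quartile comparisons in this study are expressed as odds ratios. A generic sketch of an odds ratio with a Wald 95% confidence interval is below; the 2x2 counts are hypothetical, for illustration only, and are not the study's data.

```python
import math

# Odds ratio with a Wald 95% CI from a 2x2 table: a generic sketch of the
# kind of comparison made in this study (top-quartile BLL, LSL vs no LSL).
def odds_ratio_ci(a, b, c, d, z=1.96):
    # a, b: exposed with / without the outcome; c, d: unexposed with / without.
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts: 250 of 600 LSL children in the top BLL quartile
# versus 150 of 900 non-LSL children.
or_, lo, hi = odds_ratio_ci(250, 350, 150, 750)
print(f"OR = {or_:.1f} (95% CI {lo:.1f}-{hi:.1f})")
```

The study's published ORs additionally control for age of housing, which a raw 2x2 calculation like this cannot do.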
When chloramine was used as a water disinfectant, the adjusted odds ratio (OR) of a BLL in the highest versus the lowest quartile for children living in homes with an LSL was 2.5 (95%
CI, 2.2-2.9), controlling for age of housing. The risk was highest in 2003, when the adjusted OR of a BLL in the highest versus the lowest quartile for children living in homes with an LSL was 3.2 (95% CI 2.4-4.4). When chloramine and orthophosphate were used together (2004-2006), the odds of a BLL in the highest quartile versus the lowest quartile remained elevated, but were somewhat lower than when chloramine alone was used. Chloramine and orthophosphate were used for water disinfection during the period when the LSL replacement program was conducted. When households with no LSL were compared to households with partial LSL replacement, those with partial LSL replacement had an elevated OR for a child's BLL of 5-9 µg/dL (OR = 1.9; 95% CI, 1.5-2.3) and for a BLL greater than or equal to 10 µg/dL (OR = 3.3; 95% CI, 2.2-4.9). No significant difference in risk was found between children in households with partially replaced and intact LSLs for either BLL level. This study showed that in Washington, DC, between November 2000 and December 2006, children living in homes with an LSL were at increased risk of having higher BLLs than children living in homes without an LSL. Associations were strongest during 2003, when chloramine alone was used as the water disinfection method, and remained after controlling for age of housing. Partial replacement of LSLs did not decrease the association between LSLs and elevated BLLs. As there is no safe blood lead level threshold for children, and lead in water contributes to BLLs, prompt and effective action is required by utilities to comply with existing drinking water standards. Legionella Controlling Legionella in hospital drinking water: An evidence-based review of disinfection methods. Lin, Y.E., Stout, J.E. and Yu, V.L. (2011) Infection Control and Hospital Epidemiology, 32(2); 166-173.
In the early 1980s, the epidemiological link between the presence of Legionella pneumophila in hospital drinking water and the occurrence of hospital-acquired legionellosis was first made. In order to prevent hospital-acquired Legionella infection, the
drinking water system needs to be disinfected. The efficacy of any disinfection measure needs to be validated in a stepwise fashion, from laboratory assessment to controlled multiple-hospital evaluation over a long period of time. This review evaluates systemic disinfection methods (copper-silver ionisation, chlorine dioxide, monochloramine, ultraviolet light and hyperchlorination), a focal disinfection method (point-of-use filtration) and short-term disinfection methods for outbreak situations (superheat-and-flush with or without hyperchlorination). Standardised evaluation criteria were formulated for the disinfection methods: demonstrated efficacy in vitro against Legionella; anecdotal reports of efficacy in controlling Legionella contamination in individual hospitals; peer-reviewed, published reports of controlled studies of prolonged duration (years) showing efficacy in controlling Legionella growth and preventing cases of hospital-acquired Legionnaires' disease in individual hospitals; and confirmatory reports from multiple hospitals with prolonged duration of follow-up (validation step). Copper and silver are bactericidal in vitro against Legionella and some other waterborne pathogens. Copper-silver ionisation is the only disinfection technology that has been validated by the 4-step standardised evaluation criteria recommended here, and at present it appears to be the best available technology for controlling Legionella colonisation in hospital water systems. A number of vendors now offer ionisation systems; recommendations and assessments from other hospitals using ionisation should be sought before making a purchase. To ensure long-term success of copper-silver ionisation, rigorous maintenance plans with regular monitoring of both ion concentrations and the percentage of sites positive for Legionella are necessary.
Chlorine dioxide is a gas, normally generated in solution on site at the facility, which has been used in water treatment in Europe since the 1940s. It is a promising disinfection method; however, it has not yet fulfilled the 4 criteria required for validation of efficacy. At present it is recommended for use in smaller secondary
distribution systems with a low cold water temperature, non-galvanised piping, and a low total organic carbon content in the hospital water. Future published studies should report chlorine dioxide concentrations along with Legionella positivity rates. Many vendors offer varying types of chlorine dioxide generators and many hospitals have had only marginal success with them; it should therefore be mandatory to seek recommendations and assessments from other hospitals with chlorine dioxide experience. Monochloramine is effective against Legionella in vitro and against biofilm-associated Legionella in model plumbing systems. The efficacy of on-site monochloramine treatment in individual hospitals over a prolonged period has not yet been studied. Monochloramine provides a stable residual that can penetrate biofilm and has a wider working pH range than copper-silver ionisation and chlorine. Monochloramine disinfection appears to be a promising approach for decreasing Legionella colonisation; however, long-term studies need to be undertaken. Hyperchlorination has been found to be the most unreliable and the most expensive disinfection method. It has fallen out of favour because of inadequate penetration of the agent into biofilms in piping, persistence of Legionella organisms in hyperchlorinated systems, corrosion of water distribution systems leading to pinhole leaks over time, and the introduction of carcinogens into the drinking water. Point-of-use filters (0.2 µm pore size) (AquaSafe; Pall Medical) have been used for the prevention of nosocomial infections due to Legionella and Pseudomonas aeruginosa, particularly in high-risk areas such as intensive care units and transplant units. In a controlled study, the filter was found to completely eliminate Legionella and Mycobacterium organisms from water.
In an outbreak situation, some hospitals restrict water use by having patients use bottled water exclusively and by prohibiting all patients from showering. The use of
filters in such a situation is more cost-effective and better tolerated by patients. UV light is an attractive option for disinfection as no chemicals are added to the drinking water, but experience in two hospitals showed that UV alone was ineffective in eradicating Legionella at distal sites. A combination of UV and other disinfection methods has been found to be effective for individual hospital units. The efficacy of UV disinfection appears to be optimised if the system is installed on the incoming water main of a new hospital in which no biofilm has yet become established. UV light may play a role where the area requiring disinfection is small, for example a transplant unit. In cases of hospital-acquired Legionnaires’ disease, emergency disinfection methods need to be used. Hospitals may use superheat-and-flush disinfection, with or without shock chlorination, as a short-term systemic control measure. Water temperatures at distal sites need to be rigorously maintained and monitored. If superheat-and-flush disinfection cannot be used because hot water lines are not available at every distal site, shock chlorination may be the only option. Shock chlorine dioxide disinfection is feasible in theory, but clinical experience with this method as a short-term measure is limited. The World Health Organisation recommends that drinking water cultures for Legionella be performed every 3 months to verify the efficacy of disinfection. In many hospitals the use of waterless hand cleansers and decreased water usage mean that water fixtures are exposed to disinfectant less frequently, and this has resulted in increased Legionella colonisation rates. The solution is periodic flushing of the outlets (20 minutes once per month) to increase disinfectant exposure. Hospital units that are closed for renovations are also vulnerable to recolonisation; all lines need to be flushed and cultured for Legionella before patients are housed there again.
The selection of a vendor for the installation of a systemic disinfection method in a hospital requires careful consideration, and objective assessments from other hospitals that have used the vendor’s products over a long period of time should be examined. Ongoing needs for maintenance and water testing
also need to be considered, costed and fully documented. There are many examples of expensive installations that have failed to prevent hospital patients from acquiring Legionella infections. A consistent theme in such cases was that the choice of system had been made primarily by engineering personnel, with little clinical input. The authors strongly recommend that infection control practitioners lead the team charged with selecting a disinfection system, and that decisions be based on rigorous assessment of the evidence.

Outbreaks

Waterborne norovirus outbreak in a municipal drinking-water supply in Sweden.
Riera-Montes, M., Brus Sjölander, K., Allestam, G., Hallin, E., Hedlund, K.O. and Löfdahl, M. (2011) Epidemiology and Infection, 1-8, DOI: 10.1017/S0950268810003146.

Noroviruses have a low infectious dose and high transmissibility and can cause gastrointestinal illness in humans. Waterborne norovirus outbreaks mostly occur through faecal contamination of drinking water, although outbreaks related to recreational waters have also been documented. Detection of noroviruses in water remains difficult despite improvements in methodology, and confirmation of water contamination by this virus is often not obtained in outbreak investigations. This study used epidemiological and molecular investigations to examine a large waterborne outbreak caused by contaminated municipal drinking water in Sweden. On 15 April 2009, during the Easter weekend, several cases of vomiting and diarrhoea were reported in a village (population around 400) in the mountains of western Sweden. Over 60 households reported at least one member ill with vomiting and/or diarrhoea in the initial investigation. Water was suspected as the source given the extent of the outbreak and the lack of an obvious common exposure event.
An outbreak control team was formed and an investigation began to assess the extent of the outbreak, identify the cause and confirm the mode and vehicle of transmission so appropriate long-term control measures could be implemented. A
boil water notice was issued for the period 16 April to 26 June. A retrospective cohort study was conducted. A questionnaire was delivered to all households in the village, with questions divided into two sections: one with questions about the household as a whole, and a second with questions to be answered individually by each household member. A case was defined as a resident of the village who developed vomiting and/or diarrhoea (defined as three or more loose stools per day) from 7 to 17 April 2009. Exposure was defined as consumption of unboiled water in various locations (home, village community centre, work and school/day-care centre). Information was collected on the average number of glasses of water consumed per day in each location, as well as on the source of the household water (municipal water network or own well). An environmental investigation was started by the municipal Environmental Office (EO) to detect possible anomalies in the water network that could have caused the contamination. Stool samples from six ill individuals were collected and investigated for bacterial enteropathogens (including Salmonella, Shigella, Campylobacter, Yersinia) and intestinal parasites (Cryptosporidium, Giardia). The six stool samples were also analysed for enteric viruses by electron microscopy and for norovirus and sapovirus with single-round multiplex capsid RT-PCR. Sixty-seven water samples, taken at different locations and time points, were investigated for E. coli, coliforms, intestinal enterococci and Clostridium perfringens, and seven water samples were investigated for the presence of somatic coliphages and norovirus. The study cohort consisted of 270 individuals (including guests) from 116 households who returned the completed questionnaire. Of these, 173 individuals fulfilled the case definition.
Ninety-five households (82%) reported having at least one member ill, and 27 (23%) reported having guests who became ill during the same period. The majority (90%) of households were connected to the public water network. The first case appeared on 7 April, case numbers peaked on 14 April and then
decreased thereafter, with the last case reported on 17 April. Of the cases, 98% (169/173) reported living in a house connected to the municipal water network, compared to 81% of non-cases (78/96). Households connected to the public water network were at increased risk of disease (relative risk (RR) 4.80, 95% CI 1.68-13.73) compared to those with no connection to the public network. There was also an increased risk of disease for residents who drank unboiled water at home, but this did not reach statistical significance (RR 1.76, 95% CI 0.60-5.18). There were 217 residents for whom information was available on the number of glasses of water consumed at home; the risk of disease was higher the more glasses of unboiled water consumed at home, although this trend did not reach statistical significance. The village was served by two different water sources. Source A was a borehole situated 2 km from the village, where the water was treated in an on-site water plant by rapid sand filtration and UV-light disinfection. Source B was a well situated on the outskirts of the village, with water treated in an on-site plant by the same method as for source A. Source B was used only at times of increased demand in the area, and its last period of use had been for an annual cross-country ski competition from 18 February to 1 March 2009. It had not been used at all during 2008 and had been reconnected to the municipal water network without any water quality testing. Source B was found to have inadequate physical barrier protection, and may have been subject to contamination by snowmelt. The village also had a water reservoir with two tanks situated uphill from the village, served by the same water pipe network as the village. Water from this reservoir would discharge into the pipe network when demand was not met by the two water sources.
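The relative risk figures quoted above follow the standard 2×2 cohort calculation, with the confidence interval computed on the log scale. A minimal sketch of the method (the counts below are illustrative only, not the study's actual data, and the function name is our own):

```python
from math import exp, log, sqrt

def relative_risk(a, b, c, d, z=1.96):
    """Relative risk and approximate 95% CI for a cohort 2x2 table.

    a: exposed and ill    b: exposed and not ill
    c: unexposed and ill  d: unexposed and not ill
    """
    rr = (a / (a + b)) / (c / (c + d))
    # Standard error of ln(RR); exponentiate the log-scale limits.
    se = sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
    return rr, exp(log(rr) - z * se), exp(log(rr) + z * se)

# Illustrative counts only: 20/100 exposed ill vs 5/100 unexposed ill.
rr, lo, hi = relative_risk(20, 80, 5, 95)
print(f"RR = {rr:.2f}, 95% CI {lo:.2f}-{hi:.2f}")  # RR = 4.00
```

An interval excluding 1 (as for the network-connection RR of 4.80) indicates a statistically significant association, while an interval spanning 1 (as for the unboiled-water RR of 1.76, CI 0.60-5.18) does not.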
An interruption in the water service, caused by a fall in water pressure, was reported on 10 April and affected households in the upper part of the village. A leak in the water pipe network was suspected to be the cause. It was also noted that the reservoir was not working correctly and all of its water had discharged into the water pipe network.
All six stool samples were negative for enteropathogenic bacteria, Cryptosporidium and Giardia. Norovirus was identified by EM in five of the six stool samples, and norovirus RNA was detected in all six samples by RT-PCR; genotype GI.3 was identified in all the stool samples. Evidence of faecal contamination was found in one sample of treated water from source B and one sample taken from a household connected to the municipal network, as well as in 4/9 samples of raw water from source B. Norovirus GI.3 was also detected in a carboy of water that had been stored since 12 April in one household. Water source B was identified as the probable cause of the outbreak, with five samples showing evidence of faecal contamination and the source lacking adequate protection barriers. In the week prior to the outbreak considerable amounts of snow had melted, which could have overflowed into the system. The contamination of both the raw water source and the water pipe network could have occurred through the leaks identified in the pipe network; noroviruses are also known to migrate through the ground and contaminate groundwater. Loss of water pressure could have amplified the contamination of the pipe network by allowing ingress of contaminated groundwater. Water source B was initially disconnected from the public water network but was reconnected a month later owing to the lack of microbiological evidence that water was the source of the outbreak. The water pipe network was promptly repaired and cleaning of the water reservoir was completed within a week of the outbreak alert. After the identification of virus in the stored carboy sample, water source B was closed down. This is the first waterborne outbreak in a municipal drinking water supply in Sweden in which noroviruses have been isolated from both patients and drinking water.
This study shows the importance of inter-agency collaboration, and the value of using molecular methods in an outbreak investigation to ensure that adequate and timely control measures are undertaken.
Risk Assessment Tools

Evaluation of Risk Assessment Tools to Predict Canadian Waterborne Disease Outbreaks.
Summerscales, I.M. and McBean, E.A. (2010) Water Quality Research Journal of Canada, 45(1); 1-11.

A number of drinking water risk assessment tools have been developed by regulatory and non-governmental bodies to enable risk assessment of drinking water systems. In this study, three existing risk assessment tools were applied to drinking water systems that have been implicated in waterborne disease outbreaks, to determine whether these tools could identify the hazards and vulnerabilities that resulted in the respective outbreaks. The tools were selected because they include specific risk assessment surveys addressing different aspects of the water system. The three tools assessed were the British Columbia Drinking Water Source-to-Tap Screening Tool (B.C. Tool), the Montana Water Center Microbial Risk Assessment Ranking Tool (MRA Tool) and the risk assessment forms in the Scottish Private Water Supplies: Technical Manual (PWS). The waterborne disease outbreaks that occurred in North Battleford, Saskatchewan and Walkerton, Ontario were used to test the utility of the three risk assessment tools in identifying the issues that led to the failure of these water systems. The available information on the condition of the water systems prior to the contamination events that led to the respective outbreaks was used as input to the risk assessment tools. The B.C. Tool represents the first tier of the assessment process, used to identify and assess threats to drinking water quality. It compiles information about the operation and management of the water system, the quality of tap water supplied by the system, the water system infrastructure, and chemical and microbial hazards that could enter the water supply. The B.C. Tool is completed by the water supplier and submitted to the drinking water officer (DWO) for review.
If the DWO feels that potentially significant risks have been identified then they may issue an order for the supplier to complete selected modules of the B.C.
comprehensive drinking water source-to-tap assessment process, in order to further investigate and assess the risks identified. The modules are completed by a multidisciplinary team of qualified professionals with experience relevant to drinking water systems. The hazards identified after completing any of modules 1 through 6 are then ranked using a qualitative risk assessment procedure as part of module 7. The MRA Tool is intended to assist small water system operators and managers to identify potential microbial risks and the corrective actions that can be taken to manage those risks. It includes a number of survey forms specifically tailored to alternative water supplies (i.e., streams and rivers, lakes and impoundments, wells or springs) and treatment systems, as well as survey forms addressing other water system infrastructure and water quality monitoring activities. Each survey question receives a numerical risk score from 0 to 1. The risk scores for the questions in a particular survey are weighted and summed to calculate a total microbial risk score for that survey, which ranges from 0 to 1; the scores for all surveys are then weighted and summed again to determine the overall risk score for the water system. The PWS risk assessment process is completed by the local authority with input and assistance from the “Relevant Person” responsible for the water supply. The PWS includes separate risk assessment forms for different water supplies (surface water supplies, springs, wells and boreholes). Each form has common sections that collect relevant contact information, historic water quality data, and the results of past risk assessments. Each survey has two scores associated with it: the risk characterisation and the hazard assessment score.
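The MRA Tool's two-level scoring described above is a weighted sum applied twice: once over the questions within each survey, then over the survey totals. A minimal sketch with hypothetical scores and weights (the actual tool supplies its own weightings):

```python
def weighted_score(items):
    """Weighted average of (score, weight) pairs; stays on the 0-1
    scale because the weighted sum is divided by the total weight."""
    total_weight = sum(w for _, w in items)
    return sum(s * w for s, w in items) / total_weight

# Hypothetical per-question risk scores (0-1) with weights, per survey.
source_survey = [(0.8, 2.0), (0.3, 1.0), (0.5, 1.0)]   # survey score 0.6
treatment_survey = [(0.2, 1.0), (0.9, 3.0)]            # survey score 0.725

# Second level: survey totals are weighted again for the overall score.
overall = weighted_score([(weighted_score(source_survey), 0.6),
                          (weighted_score(treatment_survey), 0.4)])
print(f"overall microbial risk score = {overall:.2f}")  # 0.65
```

One consequence of this design, noted later in the summary, is that heavily weighted questions (such as those on historical monitoring data) can dominate the overall score and mask specific treatment vulnerabilities.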
All of the risk assessment tools successfully identified the hazards that led to contamination of the water source in both the North Battleford, Saskatchewan and Walkerton, Ontario case studies. The tools, however, had differing levels of success in identifying vulnerabilities associated with the water source, treatment, and water quality monitoring activities.
Even though the B.C. Tool did not directly identify vulnerabilities associated with water treatment, in both cases the input to the B.C. Tool would have led the DWO to order the water supplier to complete the source-to-tap assessment modules necessary to identify the risks that led to the outbreaks. A concern is that this source-to-tap assessment process may be too onerous for a small water system operator to complete in a reasonable amount of time. The MRA Tool is a single-tier risk assessment tool that requires more information about the design of the water treatment systems than the B.C. Tool. However, relatively high risk scores are assigned to questions regarding historical treated water quality monitoring data, and these can overshadow the risk scores associated with the treatment system vulnerabilities that led to the outbreak, which may decrease the likelihood that the water supplier will take direct action to address those vulnerabilities. The PWS risk assessment forms do not include questions on water treatment or monitoring equipment, apart from two questions regarding point-of-entry or point-of-use treatment systems. The questions are designed for very small water systems and are therefore in general not applicable to larger systems serving residential or mixed-use developments. A major inadequacy of all the risk assessment tools considered is that they do not reflect the interdependent nature of the barriers to drinking water contamination: the survey questions addressing the source, treatment and distribution components of the water system are considered independently of one another. Ideally, vulnerabilities in an upstream barrier would affect the weights or risk scores assigned to survey questions regarding the relevant downstream barriers, and elevated weights or risk scores would be assigned to survey questions regarding a hazard when there is only one process or barrier capable of removing or inactivating it.
None of the three risk assessment tools addresses the redundancy of key equipment or processes capable of removing pathogens. An elevated risk score needs to be
assigned if there is no standby equipment that can be brought into service in the event that a process fails.

Water Quality

An audit improves the quality of water within the dental unit water lines of general dental practices across the East of England.
Chate, R.A.C. (2010) British Dental Journal, online article number E11, DOI: 10.1038/sj.bdj.2010.885.

Biofilms begin to form within dental unit water lines (DUWLs) soon after installation and connection to a water supply. The biofilms consist mostly of aerobic, gram-negative, non-coliform water bacteria which have limited pathogenic potential in immunocompetent people. Despite the routine use of anti-retraction valves, salivary blood-borne hepatitis B and HIV have been shown experimentally to be sucked back into handpieces and have been recovered distally in dental waterlines. In 1995 the American Dental Association (ADA) recommended that by the year 2000, DUWLs should have no more than 200 colony forming units (cfu) per ml in water samples. In December 2003, the United States Centers for Disease Control and Prevention (CDC) recommended that DUWL water should meet the United States 1999 Environmental Protection Agency (EPA) regulatory drinking water standard, which states that no more than 5% of water samples should be contaminated with total coliforms and that they should have less than 500 cfu/ml of heterotrophic water bacteria. In July 2004 the ADA adopted the CDC’s DUWL recommendations. The European Union’s (EU’s) standard for potable (drinking) water states that water samples should contain neither Escherichia coli nor any other faecal coliforms, and that the aerobic colony count should be less than 100 cfu/ml at 22 degrees C after 72 hours of culturing. A number of studies have surveyed compliance with these targets among general dental practices both in the UK and in Europe, and high levels of contamination have been found.
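The three numerical benchmarks cited above nest from strictest to most lenient (EU less than 100 cfu/ml, ADA no more than 200 cfu/ml, CDC/EPA less than 500 cfu/ml). A simple classification sketch, deliberately ignoring the separate coliform criteria (the function name and category labels are our own):

```python
def classify_duwl_sample(cfu_per_ml):
    """Grade an aerobic colony count (cfu/ml at 22 C, 72 h) against
    the three DUWL benchmarks cited in the text. The coliform and
    E. coli criteria are omitted from this simplified sketch."""
    if cfu_per_ml < 100:       # EU potable water standard
        return "meets EU potable standard"
    if cfu_per_ml <= 200:      # 1995 ADA target
        return "meets 1995 ADA target"
    if cfu_per_ml < 500:       # CDC/EPA drinking water standard
        return "meets CDC/EPA standard"
    return "fails all cited standards"

# The pre-audit mean of 60,892 cfu/ml reported below fails every benchmark:
print(classify_duwl_sample(60892))  # fails all cited standards
```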
Despite the high levels of contamination generally reported, there is still no
evidence that DUWL biofilms represent a public health risk. Non-tuberculous mycobacteria (NTM) have been recovered from DUWLs, as has Legionella pneumophila. Immunodeficient patients are more susceptible to infection from waterborne opportunistic pathogens, and heavy exposure to Legionella species is a potential health risk for both dental personnel and their immunocompromised patients. Another potential risk to clinical staff is the development of asthma from contaminated aerosols: endotoxin exposure can exacerbate asthma, and gram-negative bacteria, which contain cell wall endotoxins, predominate in the DUWL flora. This study was undertaken to evaluate the quality of water emanating from the DUWLs supplying irrigation for dental handpieces and triple spray syringes in general dental practices across the East of England, and to undertake an audit to improve upon it. In 2006 all of the general dental practitioners in the East of England were given the opportunity to take part in a regional audit on cross-infection control. There were 124 dentists who initially registered to participate in the audit; by 2007, 72 had begun and by 2008, 68 had completed the audit. Dentists were required to answer a questionnaire on their current knowledge of water quality standards, the type and age of the dental units they worked with, and what (if any) cross-infection protocols they currently applied to their DUWLs. Each of the participating dentists was asked to collect DUWL water samples both before the start and mid-way through a morning session. Water samples were analysed for aerobic colony counts at 22 degrees C after 72 hours and at 37 degrees C after 24 hours, and also for the presence of coliforms.
The standards set for the audit were that, as a minimum, the samples should meet the United States CDC guideline/EPA regulatory drinking water standard as endorsed by the ADA, but preferably they should meet the EU standard for potable water. Dentists were instructed to complete both the pre-audit survey and the audit assessments of their DUWL contamination levels, regardless of whether their pre-audit water samples were found to already comply with either of the suggested audit
quality standards. If the water samples did not comply, dentists were asked to begin with an initial shock purge, preferably of either Alpron or Sterilox disinfectant, followed by at least 1 month of continuous application before their DUWLs were retested. If a dentist’s units were already compliant, they were asked either to continue with their existing regimes for at least a further month before retesting for the audit, or to convert to using either Alpron or Sterilox if they preferred. Before the audit, 56% of the DUWLs were reportedly flushed through for 2 minutes at the start of the day, 29% were purged for 20 seconds between each patient, 50% were treated with a wide range of different disinfection solutions, 44% were drained down dry at the end of the day, and 9% had no cross-infection control measures applied to them. In the audit, all dentists used a disinfection solution alone, predominantly either Alpron or Sterilox. In the pre-audit survey, half of the 72 dental units had never had their DUWLs disinfected, while the other half had had a wide range of disinfectants applied. None of the 72 DUWL samples was contaminated with E. coli; however, coliforms were recovered from five of them (7%). The mean aerobic colony count after 72 hours of culturing at 22 degrees C for these water samples was 60,892 cfu/ml. Only 25% achieved the EU potable water standard, of which 11% had no planktonic bacterial contamination. A further 3% were above the EU standard but below the CDC guideline/EPA regulatory drinking water standard, while 72% failed to reach this minimum audit standard. Of the 68 units that completed the audit study, 80.9% reached the EU potable water standard, of which over half (54.4%) had zero planktonic bacterial contamination.
There were 4 units (5.9%), all disinfected with Alpron, that were below the CDC/EPA standard but above the 1995 ADA standard, while only nine units (13.3%) still failed to reach any DUWL water quality standard. Audit levels of DUWL bacterial contamination in water samples taken in the morning before the start of the session showed that all 68 dental units remained uncontaminated by E.
coli, and only one had traces of coliforms in the DUWL water. The mean aerobic colony count after 72 hours at 22 degrees C for these water samples was dramatically reduced to 3,338 cfu/ml (SD 15,823 cfu/ml). Of the 68 units, 46 (69.1%) had zero planktonic bacterial contamination and none of them had heavy levels of contamination. Audit levels of DUWL bacterial contamination for mid-session water samples showed the mean aerobic colony count taken after 72 hours of culturing at 22 degrees C had dropped to 1,150 cfu/ml (SD 3,845 cfu/ml) while 41 (60.3%) of them had zero planktonic bacteria and none of the colony counts were excessive.
There were 10 dentists who subsequently re-exposed their dental units to another cycle of purging and disinfection for at least a month before retesting. Of the eight units disinfected with Alpron, two reached zero levels of bacterial contamination, one met the 1995 ADA DUWL quality standard and two met the EU potable water standard. Of the remaining three units using Alpron, two still failed to reach audit standards while one reached the EU standard only for the mid-treatment-session water samples. The units of both dentists who continued to use either sodium hypochlorite plus hydrogen peroxide or Sterilox during the second cycle of DUWL disinfection still failed to reach audit standards. Three of the four dentists whose units remained non-compliant chose to re-expose their units to a third cycle of DUWL disinfection. Of these, the ADA-compliant Alpron user’s dental unit reached zero levels of DUWL contamination, the previously failed Alpron user’s dental unit reached the EU standard, but the Sterilox user’s DUWLs still remained non-compliant. This study found that 75% of dental units initially failed to reach the EU DUWL water quality guideline during the pre-audit survey, even though most of the participants had used either distilled or sterile water as the irrigant for their units or had used a range of physical interventions to cleanse their waterlines. The study has therefore shown that neither the choice of water nor the flushing through or drying of DUWLs makes an acceptable difference to the levels of bacterial contamination. It was encouraging that the majority of non-compliant dental units reached the minimum audit standard after a second cycle of DUWL disinfection, thereby minimising both the risk of cross-infection to vulnerable patients and the chronic exposure of dental staff to contaminated aerosols. Some dental units, however, remained unacceptably contaminated even after a third cycle of disinfection. This highlights the importance of water testing by an accredited laboratory to verify that adequate disinfection is occurring, rather than merely accepting performance claims made by the manufacturer regarding the efficacy of their DUWL disinfectant product.