Effects of Drinking-Water Filtration on Cryptosporidium Seroepidemiology, Scotland - Volume 20, Number 1—January 2014 - Emerging Infectious Disease journal - CDC
Effects of Drinking-Water Filtration on Cryptosporidium Seroepidemiology, Scotland
Author affiliations: Health Protection Scotland, Glasgow, Scotland, UK (C.N. Ramsay, C. Robertson, K.G.J. Pollock); University of Strathclyde, Glasgow (A.P. Wagner, C. Robertson); International Prevention Research Institute, Lyon, France (C. Robertson); Scottish Parasite Diagnostic Laboratory, Glasgow (H.V. Smith)
Each year since 2005, Health Protection Scotland has received reports of 500–700 laboratory-confirmed cases of cryptosporidiosis (10–14 cases/100,000 population/year); seasonality is usually markedly biphasic, peaking in spring and early autumn. Cryptosporidiosis is caused by >1 species/genotype of the protozoan parasite in the genus Cryptosporidium, which infects a wide variety of animals, including humans. The most common human pathogens are Cryptosporidium hominis and C. parvum. Characteristic signs of infection are profuse, watery diarrhea, often accompanied by bloating, abdominal pain, and nausea or vomiting. Illness is typically self-limiting but can last 2–3 weeks; studies suggest an association with long-term health sequelae, such as reactive arthritis and postinfection irritable bowel syndrome (1,2). Moreover, severe, chronic diarrhea or even life-threatening wasting and malabsorption can develop in persons with severely compromised immune systems, particularly those with reduced T-lymphocyte counts, in the absence of immunotherapy (3).
Drinking water contaminated with Cryptosporidium oocysts is a recognized risk factor for human illness (4–6). Water can be contaminated before or after treatment by a variety of sources, including livestock, feral animals, and humans (7). Oocysts can remain infectious in the environment for prolonged periods and are resistant to standard drinking-water disinfection treatments. To prevent human exposure, oocysts must be physically removed from water supplies; inadequate water filtration can therefore leave persons at risk for infection from viable oocysts (8–11).
Where drinking-water filtration has been enhanced to reduce oocyst counts, the incidence of reported clinical Cryptosporidium infection has been reduced (6,11). However, reported rates of infection are subject to variation, depending on factors such as local laboratory testing criteria, and exposure source attribution depends on the quality of risk-factor exposure data. Therefore, assessing trends in clinical infection rates might not be sufficiently sensitive for detecting changes in single-exposure risks. Variations in other risk factors (e.g., foreign travel, direct animal contact) can also obscure an effect associated with reduced exposure to oocysts in drinking water. Assessment of the effects of changes in environmental oocyst exposure would ideally be based on measuring population-level indicators, rather than relying on reported (self-selected) cases of laboratory-confirmed cryptosporidiosis. Alternatively, longitudinal variation in levels of antibody to Cryptosporidium oocyst proteins could be used to detect associations with variations in oocyst exposure.
The association between seropositivity and exposure to Cryptosporidium oocysts in drinking water has been investigated. Low levels of oocysts have been detected in 65%–97% of surface-water supplies, suggesting that many populations may be at risk (12). Elevated serologic responses have been detected in those whose drinking-water source is surface water rather than groundwater. The risk for oocyst exposure might therefore be higher for surface water than for groundwater sources (13–15), even after conventional filtration (13). However, chronic low-level exposure to oocysts in environmental sources, including drinking water, can stimulate protective immunity. Strong serologic responses to oocyst antigens have been associated with such environmental exposures (16,17).
To decrease the risk for waterborne Cryptosporidium infection from drinking-water supplies, the water industry has established multiple-barrier water treatment systems. In Scotland, water treatment has substantially reduced the concentration of Cryptosporidium oocysts in final (posttreatment) tap water. Before September 2007, however, the Loch Katrine system, which supplies Glasgow and Clyde, did not include such filtration. The risk from drinking unfiltered water was demonstrated in 2000, when an outbreak of cryptosporidiosis occurred among Glasgow residents living within the Loch Katrine supply area (18). To decrease this risk, in September 2007, enhanced treatment (coagulation and rapid gravity filtration) was introduced to the Loch Katrine supply system. This new system provided an opportunity to assess the public health effects of improving the standard of water filtration.
We investigated the prevalence of antibodies to the 27-kDa Cryptosporidium oocyst antigen among residents living in the Loch Katrine supply area (Glasgow) before and after the introduction of filtration and compared these levels with those in a control population (Dundee), where no such change to drinking-water treatment occurred. Our main objective was to determine whether an association exists between prevalence of antibody response to the 27-kDa antigen and the standard of drinking-water treatment (filtered vs. unfiltered).
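A before/after comparison in an intervention area against a control area is, in effect, a difference-in-differences design: the quantity of interest is the change in seropositivity in Glasgow relative to the concurrent change in Dundee. The sketch below illustrates this logic with entirely hypothetical counts (all group names and numbers are illustrative assumptions, not data from this study):

```python
# Illustrative difference-in-differences comparison of seropositivity,
# using hypothetical counts only (not data from the Loch Katrine study).

def odds(seropositive, seronegative):
    """Odds of seropositivity in one survey group."""
    return seropositive / seronegative

# Hypothetical (seropositive, seronegative) counts per survey.
glasgow_before = (120, 280)   # intervention area, pre-filtration
glasgow_after  = (80, 320)    # intervention area, post-filtration
dundee_before  = (90, 310)    # control area, same periods
dundee_after   = (88, 312)

# Within-area change: odds ratio (OR) of after vs. before.
or_glasgow = odds(*glasgow_after) / odds(*glasgow_before)
or_dundee  = odds(*dundee_after) / odds(*dundee_before)

# Ratio of ORs: the change in the intervention area net of the
# background change seen in the control area.
ratio_of_ors = or_glasgow / or_dundee

print(f"Glasgow OR (after/before): {or_glasgow:.2f}")
print(f"Dundee OR (after/before):  {or_dundee:.2f}")
print(f"Ratio of ORs:              {ratio_of_ors:.2f}")
```

With these made-up counts, seropositivity falls in Glasgow while staying roughly flat in Dundee, so the ratio of odds ratios is below 1, which is the pattern one would expect if filtration reduced oocyst exposure. A real analysis would additionally need confidence intervals and adjustment for covariates such as age and travel history.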