Persistence and Decontamination of Bacillus atrophaeus subsp. globigii Spores on Corroded Iron in a Model Drinking Water System

2007 ◽  
Vol 73 (8) ◽  
pp. 2451-2457 ◽  
Author(s):  
Jeffrey G. Szabo ◽  
Eugene W. Rice ◽  
Paul L. Bishop

ABSTRACT Persistence of Bacillus atrophaeus subsp. globigii spores on corroded iron coupons in drinking water was studied using a biofilm annular reactor. Spores were inoculated at 10⁶ CFU/ml in the dechlorinated reactor bulk water. The dechlorination allowed for observation of the effects of hydraulic shear and biofilm sloughing on persistence. Approximately 50% of the spores initially adhered to the corroded iron surface were not detected after 1 month. Addition of a stable 10 mg/liter free chlorine residual after 1 month led to a 2-log10 reduction of adhered B. atrophaeus subsp. globigii, but levels on the coupons quickly stabilized thereafter. Increasing the free chlorine concentration to 25 or 70 mg/liter had no additional effect on inactivation. B. atrophaeus subsp. globigii spores injected in the presence of a typical distribution system chlorine residual (∼0.75 mg/liter) resulted in a steady reduction of adhered B. atrophaeus subsp. globigii over 1 month, but levels on the coupons eventually stabilized. Adding elevated chlorine levels (10, 25, and 70 mg/liter) after 1 month had no effect on the rate of inactivation. Decontamination with elevated free chlorine levels immediately after spore injection resulted in a 3-log10 reduction within 2 weeks, but the rate of inactivation leveled off afterward. This indicates that free chlorine did not reach portions of the corroded iron surface where B. atrophaeus subsp. globigii spores had adhered. B. atrophaeus subsp. globigii spores are capable of persisting for an extended time in the presence of high levels of free chlorine.
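The log10 reductions reported above are ratios of initial to surviving spore counts; a minimal sketch of the arithmetic (illustrative counts, not the study's data):

```python
import math

def log10_reduction(initial_cfu: float, final_cfu: float) -> float:
    """Log10 reduction = log10(N0 / N); a 2-log10 reduction means 99% inactivation."""
    return math.log10(initial_cfu / final_cfu)

# Hypothetical counts: 10^6 CFU adhered initially, 10^4 CFU detected after treatment
print(log10_reduction(1e6, 1e4))  # 2.0
```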

2014 ◽  
Vol 9 (4) ◽  
pp. 491-501 ◽  
Author(s):  
Jennie L. Rand ◽  
Graham A. Gagnon ◽  
Alisha Knowles

Distribution system data from a Nova Scotia municipal drinking water supply were collected over four years, including free chlorine residual concentration, heterotrophic plate count (HPC) bacteria, and temperature. These data were analyzed for occurrences of HPC bacteria greater than 500 colony forming units (CFU)/mL. The municipality was interested in determining whether its secondary chlorination practices were sufficient to maintain microbial health in its distribution system. Coliform data (total coliforms and Escherichia coli) were non-detect in the distribution system over this period, and thus heterotrophic bacteria were used to assess microbial health. Results were compared to similar data collected from pilot-scale studies that had used the same municipal water as the source. Analysis showed a similar trend between pilot- and full-scale samples. Full-scale data analysis revealed that the minimum disinfection requirement of 0.2 mg/L did not consistently prevent heterotrophic bacteria from exceeding 500 CFU/mL. By comparison, maintaining a concentration of 0.3 mg/L or above, particularly in warm-weather conditions, kept heterotrophic bacteria below 500 CFU/mL. Fortunately, the majority of samples collected in the full-scale distribution system (>89%) had a free chlorine residual concentration greater than 0.3 mg/L. While this system had 100% compliance for E. coli, this work will help utilities understand how to use microbial data to inform operational disinfection targets for their distribution systems.
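The exceedance analysis described here amounts to splitting monitoring samples at a chlorine-residual threshold and comparing how often HPC exceeds 500 CFU/mL in each group; a minimal sketch with hypothetical monitoring pairs (not the study's data):

```python
def exceedance_rates(samples, cl_threshold=0.3, hpc_limit=500):
    """Fraction of samples with HPC > hpc_limit, split at a chlorine residual threshold.

    `samples` is a list of (free_chlorine_mg_L, hpc_cfu_mL) pairs.
    Returns (rate below threshold, rate at/above threshold).
    """
    below = [hpc for cl, hpc in samples if cl < cl_threshold]
    above = [hpc for cl, hpc in samples if cl >= cl_threshold]
    rate = lambda grp: sum(h > hpc_limit for h in grp) / len(grp) if grp else 0.0
    return rate(below), rate(above)

# Hypothetical monitoring data (chlorine mg/L, HPC CFU/mL):
data = [(0.1, 900), (0.2, 600), (0.25, 300), (0.4, 40), (0.5, 20), (0.35, 700)]
print(exceedance_rates(data))
```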


2007 ◽  
Vol 55 (5) ◽  
pp. 161-168 ◽  
Author(s):  
T.H. Heim ◽  
A.M. Dietrich

Pipe relining via in situ epoxy lining is used to remediate corroded plumbing or distribution systems. This investigation examined the effects on odour, TOC, THM formation and disinfectant demand in water exposed to epoxy-lined copper pipes used for home plumbing. The study was conducted in accordance with the Utility Quick Test, a migration/leaching method that allows utilities to conduct sensory analysis of materials in contact with drinking water. The test was performed using water with no disinfectant and with chlorine and monochloramine at levels representative of those found in the distribution system. Panelists repeatedly and consistently described a "plastic/adhesive/putty" odour in the water from the pipes. The odour intensity remained relatively constant over each of two subsequent flushes. Water samples stored in the epoxy-lined pipes showed a significant increase in the leaching of organic compounds (as TOC), and this TOC was demonstrated to react with free chlorine to form trichloromethane. Water stored in the pipes also showed a marked increase in disinfectant demand relative to water stored in glass control flasks. A study conducted at a full-scale installation in an apartment building demonstrated that, after installation and regular use, the epoxy lining did not yield detectable differences in water quality.


2001 ◽  
Vol 1 ◽  
pp. 39-43 ◽  
Author(s):  
V. Zitko

Many countries require the presence of free chlorine at about 0.1 mg/l in their drinking water supplies. For various reasons, such as cast-iron pipes or long residence times in the distribution system, free chlorine may decrease below detection limits. In such cases it is important to know whether the water was chlorinated or whether nonchlorinated water entered the system by accident. Changes in the UV spectra of natural organic matter in lakewater were used to assess qualitatively the degree of chlorination during treatment to produce drinking water. The changes were more obvious in the first-derivative spectra. In lakewater, the derivative spectra have a maximum at about 280 nm. This maximum shifts to longer wavelengths by up to 10 nm, decreases, and eventually disappears with an increasing dose of chlorine. The water treatment system was monitored by this technique for over 1 year, and the UV spectra of water samples were compared with those of experimental samples treated with known amounts of chlorine. The changes of the UV spectra with the concentration of added chlorine are presented. On several occasions, water that had received very little or no chlorination may have entered the drinking water system. The results show that first-derivative spectra are potentially a tool to determine, in the absence of residual chlorine, whether surface water was chlorinated during treatment to produce potable water.
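The first-derivative technique described here amounts to differentiating absorbance with respect to wavelength and tracking the position of the derivative peak; a minimal numpy sketch with synthetic Gaussian absorbance bands (purely illustrative, and modelling chlorination only as a shift of the band centre, which is a loud simplification of the real chemistry):

```python
import numpy as np

def derivative_peak_wavelength(wavelengths, absorbance):
    """Wavelength at which the first-derivative spectrum dA/dλ is largest."""
    dA = np.gradient(absorbance, wavelengths)  # central differences on a nonuniform-capable grid
    return wavelengths[np.argmax(dA)]

# Synthetic absorbance bands on a 240-340 nm grid:
wl = np.linspace(240, 340, 401)
band = lambda centre: np.exp(-((wl - centre) / 25.0) ** 2)

# Shifting the band centre by 10 nm shifts the derivative peak by the same amount:
shift = derivative_peak_wavelength(wl, band(290)) - derivative_peak_wavelength(wl, band(280))
print(shift)  # ≈ 10 nm
```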


2017 ◽  
Vol 18 (2) ◽  
pp. 391-398 ◽  
Author(s):  
A. M. M. Batista ◽  
P. Meynet ◽  
G. P. P. Garcia ◽  
S. A. V. Costa ◽  
J. C. Araujo ◽  
...  

Abstract This study evaluated the microbiological safety of the water distribution system of a city in the state of Minas Gerais (Brazil) with a population of 120,000. During the study, the city suffered a severe drought that had a significant impact on water availability and quality in the river that supplies the city. Samples (2 liters) were collected from the distribution system over a period of six months, covering wet and dry months, from three points: the lowest-altitude point in the distribution network, the point farthest from the water treatment works, and an intermediate point. Free chlorine was measured in situ using a Hach kit. DNA was extracted using a FastDNA Spin Kit for Soil (Qbiogene). Next-generation sequencing (Ion Torrent) was used to identify and quantify the relative abundance of potentially pathogenic bacteria present in the samples. Coliforms and Escherichia coli, indicators currently used worldwide to assess the microbiological safety of drinking water, were measured in all samples using an enzyme substrate method (ONPG-MUG Colilert®). Sequencing retrieved 16S rRNA sequences of E. coli and some potentially pathogenic bacteria, even in the presence of free chlorine. Operational taxonomic units related to pathogenic bacteria were present in all samples from the drinking water distribution system (DWS) and, in general, at high relative abundance (up to 5%). A total of 19 species related to bacterial pathogens were detected. Inadequate operational practices that could affect the microbiological safety of the DWS were identified and discussed. This paper is the first to evaluate the community of potentially pathogenic bacteria in a real DWS.
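Relative abundance in amplicon sequencing is each taxon's read count divided by the sample's total reads; a minimal sketch (hypothetical read counts, not the study's data):

```python
def relative_abundance(otu_counts):
    """Relative abundance (%) of each OTU from raw read counts."""
    total = sum(otu_counts.values())
    return {otu: 100.0 * n / total for otu, n in otu_counts.items()}

# Hypothetical read counts for one sample:
print(relative_abundance({"Escherichia": 50, "Pseudomonas": 150, "other": 800}))
# → {'Escherichia': 5.0, 'Pseudomonas': 15.0, 'other': 80.0}
```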


2018 ◽  
Vol 115 (8) ◽  
pp. E1730-E1739 ◽  
Author(s):  
Sammy Zahran ◽  
Shawn P. McElmurry ◽  
Paul E. Kilgore ◽  
David Mushinski ◽  
Jack Press ◽  
...  

The 2014–2015 Legionnaires’ disease (LD) outbreak in Genesee County, MI, and the outbreak resolution in 2016 coincided with changes in the source of drinking water to Flint’s municipal water system. Following the switch in water supply from Detroit to Flint River water, the odds of a Flint resident presenting with LD increased 6.3-fold (95% CI: 2.5, 14.0). This risk subsided following boil water advisories, likely due to residents avoiding water, and returned to historically normal levels with the switch back in water supply. During the crisis, as the concentration of free chlorine in water delivered to Flint residents decreased, their risk of acquiring LD increased. When the average weekly chlorine level in a census tract was <0.5 mg/L or <0.2 mg/L, the odds of an LD case presenting from a Flint neighborhood increased by a factor of 2.9 (95% CI: 1.4, 6.3) or 3.9 (95% CI: 1.8, 8.7), respectively. During the switch, the risk of a Flint neighborhood having a case of LD increased by 80% per 1 mg/L decrease in free chlorine, as calculated from the extensive variation in chlorine observed. In communities adjacent to Flint, the probability of LD occurring increased with the flow of commuters into Flint. Together, the results support the hypothesis that a system-wide proliferation of legionellae was responsible for the LD outbreak in Genesee County, MI.
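The per-unit odds ratios reported here compound multiplicatively under a log-linear (logistic regression) dose-response; a minimal sketch of that arithmetic (the 1.8 figure is the study's reported odds increase per 1 mg/L chlorine decrease; the 0.5 mg/L drop is a hypothetical input):

```python
def odds_change(or_per_unit: float, delta_units: float) -> float:
    """Multiplicative change in odds for a delta_units change in exposure,
    assuming a log-linear (logistic regression) dose-response."""
    return or_per_unit ** delta_units

# OR = 1.8 per 1 mg/L decrease in free chlorine; effect of a 0.5 mg/L drop:
print(round(odds_change(1.8, 0.5), 2))  # 1.34
```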


Chemosphere ◽  
2016 ◽  
Vol 153 ◽  
pp. 521-527 ◽  
Author(s):  
Danielle M. West ◽  
Qihua Wu ◽  
Ariel Donovan ◽  
Honglan Shi ◽  
Yinfa Ma ◽  
...  

2002 ◽  
Vol 2 (4) ◽  
pp. 105-110
Author(s):  
R. Lake ◽  
S. Driver

Coliforms are used as indicators of faecal pollution in water. Therefore, the presence of coliforms in drinking water causes concern, as it indicates the potential presence of other bacteria. Coliforms have been observed in water within the Vivendi Water UK area during the summer months, and their presence has previously been attributed to localised pipe renovation. In this study, the influence of the algal bloom on the presence of coliforms was assessed. A strong link was shown between the end of the algal bloom and coliforms being found in the distribution system. The algal bloom itself does not allow coliforms to pass through the treatment works. However, the high level of total organic carbon (TOC) in the treated water, made up of algal breakdown products, provides a good nutritional source for regrowth in the distribution system. Where TOC levels are high, coliforms can grow even at high chlorine concentrations; where there is little TOC, even a very low chlorine residual is adequate to prevent coliform growth.


1997 ◽  
Vol 35 (11-12) ◽  
pp. 289-292 ◽  
Author(s):  
D. P. Sartory ◽  
P. Holmes

Coliform bacteria isolated from treated drinking water supplies can be derived from a range of sources (e.g. infiltration, breakthrough at the treatment works, or the biofilm established within the pipework). The sensitivity of these bacteria to chlorine may be related to their source and metabolic status. Strains of coliforms were isolated from sewage works effluents, river and reservoir waters, and the bulk water and biofilms of distribution systems. These were assayed for sensitivity to free and total chlorine using two assay procedures. For E. coli, the isolates from distribution system bulk water showed greater resistance to free chlorine than those from sewage effluents, and resistance equivalent to that of river water isolates. For non-E. coli coliforms (mainly strains of Klebsiella, Enterobacter and Citrobacter), those from distribution system biofilms showed the greatest sensitivity to free and total chlorine, whilst those from river water had the greatest resistance.


2006 ◽  
Vol 72 (9) ◽  
pp. 5864-5869 ◽  
Author(s):  
Elizabeth D. Hilborn ◽  
Terry C. Covert ◽  
Mitchell A. Yakrus ◽  
Stephanie I. Harris ◽  
Sandra F. Donnelly ◽  
...  

ABSTRACT There is evidence that drinking water may be a source of infections with pathogenic nontuberculous mycobacteria (NTM) in humans. One method by which NTM are believed to enter drinking water distribution systems is by their intracellular colonization of protozoa. Our goal was to determine whether we could detect a reduction in the prevalence of NTM recovered from an unfiltered surface drinking water system after the addition of ozonation and filtration treatment and to characterize NTM isolates by using molecular methods. We sampled water from two initially unfiltered surface drinking water treatment plants over a 29-month period. One plant received the addition of filtration and ozonation after 6 months of sampling. Sample sites included those at treatment plant effluents, distributed water, and cold water taps (point-of-use [POU] sites) in public or commercial buildings located within each distribution system. NTM were recovered from 27% of the sites. POU sites yielded the majority of NTM, with >50% recovery despite the addition of ozonation and filtration. Closely related electrophoretic groups of Mycobacterium avium were found to persist at POU sites for up to 26 months. Water collected from POU cold water outlets was persistently colonized with NTM despite the addition of ozonation and filtration to a drinking water system. This suggests that cold water POU outlets need to be considered as a potential source of chronic human exposure to NTM.

