Upper Limit of Residual Chlorine in Reclaimed Wastewater

2006 ◽  
Vol 1 (2) ◽  
Author(s):  
M. Ji ◽  
N. Zhang ◽  
K.A. Zhang ◽  
H.Sh. Zhang ◽  
Zh. Huo

Chlorination is the most common form of tertiary-treatment disinfection in municipal wastewater treatment plants in China. Excess residual chlorine in reclaimed wastewater is harmful to the growth of lawn grass; however, no upper limit on residual chlorine in reclaimed wastewater reused for urban green-land irrigation has been promulgated. Lab-scale potted experiments with seedlings of tall fescue (Festuca arundinacea Schreb.) (TF) and Kentucky bluegrass (Poa pratensis L.) (KBG) were performed to evaluate the effects of residual chlorine in reclaimed wastewater on these two lawn grasses. The results showed that relative aboveground biomass, photosynthetic rate, and total chlorophyll (Tchl) concentration decreased markedly with increasing residual chlorine. Catalase (CAT) activity in TF rose at low concentrations and declined at higher ones, in contrast to the monotonic decline observed in KBG. Considering both the growth of the two turf grasses and the disinfection of the reclaimed water, the upper limit of residual chlorine in reclaimed water for landscape irrigation should be 1.0 mg/L for TF and 0.8 mg/L for KBG.

1993 ◽  
Vol 27 (7-8) ◽  
pp. 53-61 ◽  
Author(s):  
A. Kanarek ◽  
A. Aharoni ◽  
M. Michail

Groundwater recharge for wastewater reuse, developed and practiced successfully in the Dan Region Project, is essentially a soil aquifer treatment (SAT) system and should be considered an integral part of the municipal wastewater treatment process. SAT consists of the controlled passage of effluent through the unsaturated zone and the aquifer, mainly for purification purposes. The recharge operation is carried out by means of spreading basins surrounded by adequately spaced recovery wells, which permit segregation of the recharge zone from the rest of the aquifer. Reclaimed water of very high quality is obtained after SAT, suitable for a variety of non-potable uses such as unrestricted agricultural, industrial, non-potable municipal, and recreational uses.


1992 ◽  
Vol 26 (7-8) ◽  
pp. 1513-1524 ◽  
Author(s):  
T. Asano ◽  
L. Y. C. Leong ◽  
M. G. Rigby ◽  
R. H. Sakaji

The State of California's Wastewater Reclamation Criteria are under review and will be revised and expanded to include several new regulations on the use of reclaimed municipal wastewater. To provide a scientific basis for evaluating the existing and proposed Criteria, enteric virus monitoring data from secondary and tertiary effluents were evaluated. These virus data were obtained from special studies and monitoring reports covering the period from 1975 to 1989 and including ten municipal wastewater treatment facilities in California. Based on the enteric virus data from these reports, and using the current Criteria as a guide, four exposure scenarios were developed to determine the risk of waterborne enteric virus infection to humans as a consequence of wastewater reclamation and reuse. The exposure assessments included food-crop irrigation, landscape irrigation for golf courses, recreational impoundments, and groundwater recharge. The virus enumeration and the resulting risk assessments described in this paper provide a comparative basis for addressing the treatment and fate of enteric viruses in wastewater reclamation and reuse. The analyses show that the annual risk of infection from exposure to chlorinated tertiary effluent containing 1 viral unit/100 L in recreational activities such as swimming or golfing is in the range of 10⁻² to 10⁻⁷, while the risks from food-crop irrigation or groundwater recharge with reclaimed municipal wastewater are in the range of 10⁻⁶ to 10⁻¹¹. The risk analyses are also used to demonstrate that the probability of infection can be further mitigated by controlling exposure to reclaimed wastewater in the use area.
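Risk assessments of this kind typically scale a per-exposure infection probability up to an annual risk over repeated, independent exposures. The sketch below shows that standard conversion; the exposure frequency and per-event risk used in the example are illustrative assumptions, not values reported in the study.

```python
def annual_risk(per_event_risk: float, events_per_year: int) -> float:
    """Annual probability of at least one infection from repeated,
    independent exposure events: 1 - (1 - p)^n."""
    return 1.0 - (1.0 - per_event_risk) ** events_per_year

# Hypothetical example: a per-swim infection risk of 1e-5 over
# 50 swimming events per year.
print(annual_risk(1e-5, 50))
```

For small per-event risks, the annual risk is close to the product of the per-event risk and the number of events, which is why low per-exposure risks can still accumulate with frequent use.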


1990 ◽  
Vol 115 (4) ◽  
pp. 608-611 ◽  
Author(s):  
Jennifer M. Johnson-Cicalese ◽  
C.R. Funk

Studies were conducted on the host plants of four billbug species (Coleoptera: Curculionidae: Sphenophorus parvulus Gyllenhal, S. venatus Chitt., S. inaequalis Say, and S. minimus Hart) found on New Jersey turfgrasses. A collection of 4803 adults from pure stands of various turfgrasses revealed all four billbugs on Kentucky bluegrass (Poa pratensis L.), tall fescue (Festuca arundinacea Schreb.), and perennial ryegrass (Lolium perenne L.), and S. parvulus, S. venatus, and S. minimus on Chewings fescue (F. rubra L. ssp. commutata Gaud.). Since the presence of larvae, pupae, or teneral adults more accurately indicates the host status of a grass species, immature billbugs were collected from plugs of the various grass species and reared to adults for identification. All four species were reared from immature billbugs found in Kentucky bluegrass turf; immatures of S. venatus, S. inaequalis, and S. minimus were found in tall fescue; S. venatus and S. minimus in perennial ryegrass; and S. inaequalis in strong creeping red fescue (F. rubra L. ssp. rubra). A laboratory experiment was also conducted in which billbug adults were confined in petri dishes with either Kentucky bluegrass, perennial ryegrass, tall fescue, or bermudagrass (Cynodon dactylon Pers.). Only minor differences were found among the four grasses in billbug survival, number of eggs laid, and amount of feeding. In general, bermudagrass was the least favored host, and the other grasses were equally adequate hosts. The results of this study indicate a need to update the host-plant lists of these four billbug species.


2007 ◽  
Vol 55 (1-2) ◽  
pp. 449-457 ◽  
Author(s):  
G. De Feo ◽  
M. Galasso ◽  
V. Belgiorno

The aim of this paper was to evaluate groundwater pollution in an endorheic basin in southern Italy. Groundwater circulation occurred at two different levels: a shallow groundwater body, with a water table at about 10 m, and a deep groundwater body in a karst aquifer, with a water table at 140–190 m. Reclaimed municipal wastewater and surface water collected in the catchment area were both drained into a swallow hole connected with the deep groundwater. Agricultural practice in the endorheic basin produced an excess of nitrate in the soil, which was subsequently washed out and displaced into the groundwater. With regard to the EU Drinking Water Directive (98/83/EC), the research conducted during 2003 showed the absence of pollution in the deep groundwater used for drinking water supply. The shallow groundwater, by contrast, was strongly influenced by agricultural and pasture activities, with detectable levels of nitrates and bacteria. In order to reduce the pollution load entering the swallow hole and then the deep groundwater, the realisation of a constructed wetland plant was proposed to improve the quality of the reclaimed wastewater and to enable wastewater reuse in agriculture.


Membranes ◽  
2020 ◽  
Vol 10 (6) ◽  
pp. 131 ◽  
Author(s):  
Jiaqi Yang ◽  
Mathias Monnot ◽  
Lionel Ercolei ◽  
Philippe Moulin

Wastewater reuse, as a sustainable, reliable, and energy-recovering concept, is a promising approach to alleviating worldwide water scarcity. However, the water reuse market still requires long-term development, as less than 4% of total wastewater worldwide is currently treated for reuse. In addition, reclaimed water should fulfill criteria of health safety, appearance, environmental acceptance, and economic feasibility based on local water reuse guidelines. Municipal wastewater, as an alternative water resource for non-potable or potable reuse, has been widely treated by various membrane-based treatment processes for reuse applications. By collecting as many lab-scale and pilot-scale reuse cases as possible, this review aims to provide a comprehensive summary of membrane-based treatment processes, focusing on hydraulic filtration performance, contaminant removal capacity, reuse purpose, fouling resistance, resource recovery, and energy consumption. The advances and limitations of different membrane-based processes, alone or coupled with other processes such as disinfection and advanced oxidation processes, are also highlighted. Challenges still facing membrane-based technologies for water reuse applications, including institutional barriers, financial allocation, and public perception, are identified as areas in need of further research and development.


Author(s):  
Brian R. McMahon ◽  
Robert C. J. Koo ◽  
H. Williams Persons

In 1986 the City of Orlando, Florida; Orange County; and area citrus growers implemented an innovative program to reclaim municipal wastewater for irrigation of citrus trees. This program, known as Water Conserv II, is planned to ultimately provide up to 50 million gallons per day of reclaimed water to as much as 15,000 acres of citrus grove land. In this paper, the authors present the program concept; identify public health issues that were considered; describe the facilities that were constructed to treat, transmit and distribute the reclaimed water; discuss operational factors and summarize initial observations of the project’s performance after the first two years of operation. Paper published with permission.


2015 ◽  
Vol 72 (4) ◽  
pp. 616-622 ◽  
Author(s):  
Defang Ma ◽  
Baoyu Gao ◽  
Yan Wang ◽  
Qinyan Yue ◽  
Qian Li

A hybrid process combining a membrane bioreactor (MBR) with powdered activated carbon (PAC), PAC/MBR, was used for real municipal wastewater treatment and reuse. The roles of chlorine dose, contact time, pH, and bromide in trihalomethane (THM) formation and speciation during chlorination of the reclaimed water were investigated. Total trihalomethane (TTHM) yield increased exponentially to a maximum with increasing chlorine dose (correlation coefficient R² = 0.98). Prolonging substrate–chlorine contact time significantly promoted TTHM formation. Less than 40% of THMs formed in the first 24 h, indicating that the PAC/MBR effluent organic matter was mostly composed of slow-reacting precursors. Increasing pH and bromide concentration facilitated THM formation. Higher chlorine doses and longer contact times enhanced chloro-THM formation, while bromo-THM formation was favored under near-neutral conditions. Despite variations in chlorine dose, contact time, and pH, the yield of THM species was usually in the order CHCl3 > CHBrCl2 > CHBr2Cl > CHBr3. However, THM speciation shifted from chlorinated to brominated species with increasing bromide concentration.
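An "exponential rise to a maximum" dose-response of the kind described above is commonly written as TTHM = Ymax·(1 − e^(−k·dose)). The sketch below implements that model together with a simple coefficient-of-determination check; the parameter values (Ymax = 120 µg/L, k = 0.35 L/mg) and the dose points are illustrative assumptions, not fitted values from the paper.

```python
import math

def tthm_yield(dose_mg_l: float, y_max: float, k: float) -> float:
    """Saturating model of total trihalomethane yield (ug/L) versus
    chlorine dose: yield rises exponentially toward the plateau y_max."""
    return y_max * (1.0 - math.exp(-k * dose_mg_l))

def r_squared(observed, predicted):
    """Coefficient of determination R^2 between observed and model values."""
    mean_obs = sum(observed) / len(observed)
    ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - ss_res / ss_tot

# Illustrative doses (mg/L) and noiseless synthetic yields from the model.
doses = [2.0, 4.0, 6.0, 8.0, 10.0]
obs = [tthm_yield(d, 120.0, 0.35) for d in doses]
pred = [tthm_yield(d, 120.0, 0.35) for d in doses]
print(r_squared(obs, pred))  # 1.0 for a perfect (noiseless) fit
```

With real measurements, the model parameters would be estimated by nonlinear least squares and the resulting R² would fall below 1, as in the value of 0.98 reported in the abstract.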


Plant Disease ◽  
2006 ◽  
Vol 90 (2) ◽  
pp. 246-246
Author(s):  
G. Polizzi ◽  
A. Vitale ◽  
I. Castello

Tall fescue (Festuca arundinacea Schreb.) and Kentucky bluegrass (Poa pratensis L.) are the main turfgrass species cultivated in Sicily (southern Italy) for ready lawn (sod) for ornamental purposes. In July 2004 and May 2005, a widespread disease was noticed in two turf nurseries on the eastern side of Sicily on a ready lawn mixture of F. arundinacea cv. Safari (94%) + P. pratensis cv. Cabaret (6%). Numerous yellow, circular and crescent-shaped patches up to 30 to 40 cm in diameter were observed. The turf usually died around the perimeter of the patch, while the grass remained green in the center of the ring, leaving a tuft of green grass in the center (frog eye). Affected turf was initially reddish brown and turned brown as it died. Small, round, off-white or tan seed-like structures were dispersed on mycelial strands at the outer edge of the ring in the mat at the base of the grasses. The pathogen was identified as Sclerotium rolfsii Sacc. The fungus was isolated directly as aerial mycelium or sclerotia, or following surface disinfection (2 min in 0.5% NaOCl) and plating of diseased tissues on potato dextrose agar (PDA). Sclerotia were observed in vitro in 7-day-old cultures. Pathogenicity was tested by inoculating two commercial ready lawn strips (80 × 100 cm) of each of the two healthy turfgrass species with three isolates of the fungus. Thirty sclerotia were placed at the base of the stems. Noninoculated ready lawn strips served as controls. All plants were covered with plastic bags, exposed to diffused daylight for 5 days, and then maintained in a growth chamber at 25 to 28°C under fluorescent light. Disease symptoms and signs of southern blight like those observed in the field appeared 2 weeks after inoculation. S. rolfsii was reisolated from affected tissues. Symptoms were not detected on any of the noninoculated ready lawn strips. The disease was serious enough that chemical treatments were required for its control.
Southern blight was previously detected on bermudagrass and other cool-season turfgrass genera (1). To our knowledge, this is the first report of southern blight on tall fescue and Kentucky bluegrass in Italy. Reference: (1) R. W. Smiley. Common Names of Plant Diseases: Diseases of Turfgrasses. Online publication. The American Phytopathological Society, St. Paul, MN.


2017 ◽  
Vol 2 (3) ◽  
pp. 162-170
Author(s):  
Kenneth Lynn Diesburg ◽  
Ronald F. Krausz

This research was conducted to determine the degree of success, by month, in seeding establishment of tall fescue (Festuca arundinacea Schreb.), Kentucky bluegrass (Poa pratensis L.), bermudagrass (Cynodon dactylon [L.] Pers. var. dactylon), and zoysiagrass (Zoysia japonica Steud.) at two locations in the moist, Midwest, continental transition zone on a prepared seed bed without irrigation or cover. The four species were planted every month of the year starting in September 2005. Starter fertilizer and siduron were applied the same day as seeding, with no subsequent management except mowing. Percent cover of living turfgrass was recorded in each of the 24 months after seeding. Tall fescue (80%) and bermudagrass (73%) provided the best percent cover over all planting dates; Kentucky bluegrass provided 65% and zoysiagrass 24% cover. The cool-season grasses performed best in the July-to-March plantings: tall fescue 88% and Kentucky bluegrass 72%. Bermudagrass (94%) established best in the January-to-April plantings, while zoysiagrass (32%) established best in the November-to-March plantings. Germination and seedling survival after germination of all species were inhibited by limited moisture during summer. The warm-season grasses were further limited by winterkill in the August, September, and October seedings. These results emphasize the risk of spring seeding as well as the value of dormant seeding of both warm- and cool-season turfgrasses for low-input, nonirrigated establishment.

