Riverbank filtration for drinking water supply - a proven method, perfect to face today's challenges

2002 ◽  
Vol 2 (5-6) ◽  
pp. 1-8 ◽  
Author(s):  
R. Irmscher ◽  
I. Teermann

Hygiene standards and parasites have been a special focus of drinking water utilities for several years. In this context, the development of new, high-tech water treatment methods is often considered. However, we have been applying riverbank filtration as an inexpensive, natural method in Düsseldorf for over 130 years. Indeed, it was introduced for "hygiene reasons" at the time and, in our experience, riverbank filtration is well suited to meet these "new" hygiene challenges. We have intensively examined the infiltration of river water into the aquifer. We view this core process as the prerequisite for the sustained function of riverbank filtration. It is closely linked with the retention of turbid matter in the riverbed and the shear forces acting on the riverbed surface. In addition, we have investigated the effectiveness of bank filtration with regard to the elimination of microorganisms over recent years. According to these examinations, bacteria are reduced by an average of 3 log orders by bank filtration; individual breakthroughs correlate with high-water events. According to our measurements, Giardia and Cryptosporidium were completely eliminated during riverbank passage. Retention of the three virus types examined was also found to be almost complete.
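The 3-log figure reported above is a log-removal value (LRV). As a minimal sketch (not from the paper), the following Python snippet shows how an LRV is computed from paired river-water and bank-filtrate counts; the concentrations used are purely illustrative.

```python
import math

def log_removal(c_source: float, c_filtrate: float) -> float:
    """Log10 removal value (LRV) between river water and bank filtrate."""
    return math.log10(c_source / c_filtrate)

# Illustrative counts only: a 3-log reduction is a 1000-fold decrease,
# e.g. 100,000 CFU/100 ml in the river down to 100 CFU/100 ml in the well.
print(log_removal(1e5, 1e2))  # -> 3.0
```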

2005 ◽  
Vol 5 (2) ◽  
pp. 123-134 ◽  
Author(s):  
R. Miller ◽  
B. Whitehill ◽  
D. Deere

This paper comments on the strengths and weaknesses of different risk assessment methodologies suitable for use by Australian water utilities in drinking water source protection areas, with the intention that a suggested methodology be recommended as a national approach to catchment risk assessment. Catchment risk management is a process for setting priorities for protecting drinking water quality in source water areas. It is structured as a series of steps for identifying water quality hazards, assessing the threat they pose, and prioritising actions to address them. Water management organisations around Australia are at various stages of developing catchment risk management programs. While much conceptual work has been done on the individual components of catchment risk management, these components have not previously been combined into a management tool for source water protection.

A key driver for this project has been the requirements of the National Health and Medical Research Council Framework for the Management of Drinking Water Quality (DWQMF) included in the draft 2002 Australian Drinking Water Guidelines (ADWG). The Framework outlines a quality management system of steps for the Australian water industry to follow, with checks and balances to ensure water quality is protected from catchment to tap. The key steps in the Framework that relate to this project are:

Element 2: Assessment of the Drinking Water Supply System
• Water supply system analysis
• Review of water quality data
• Hazard identification and risk assessment

Element 3: Preventive Measures for Drinking Water Quality Management
• Preventive measures and multiple barriers
• Critical control points

This paper evaluates the following risk assessment techniques: Hazard Analysis and Critical Control Points (HACCP); World Health Organisation Water Safety Plans; Australian Standard AS 4360; and the Australian Drinking Water Guidelines Drinking Water Quality Management Framework. These methods were selected because together they cover the different approaches used across Australia by water utilities that vary in the scale of the water management organisation, the type of water supply system management, and the land use and activity-based risks in the catchment area of the source. Different risk assessment methodologies were first identified and reviewed; examples of their application were then assessed, based on several key water utilities across Australia and overseas, and the strengths and weaknesses of each approach were identified. In general, the approaches group into those that: cover the full catchment-to-tap drinking water system; cover only the catchment area of the source and do not recognise downstream barriers or processes; use water quality data or land use risks as the key driving component; and are based primarily on the hazard rather than on a hazardous event. An initial process of screening water quality data is considered very valuable in determining key water quality issues, guiding the risk assessment, and building an overall understanding of the catchment and water source area, consistent with the intentions behind the ADWG DWQM Framework.

It is therefore suggested that the recommended national risk assessment approach have two key introductory steps: initial screening of key issues via water quality data, followed by a scenario- and event-based, HACCP-style risk assessment of land uses and activities. The importance of the roles that uncertainty and bias play in risk assessments was also highlighted, and it was deemed necessary to develop and integrate uncertainty guidelines for information used in the risk assessment process. A hybrid risk assessment methodology was developed, based on the HACCP approach but with key additions and modifications to make it applicable to varying catchment risks, water supply operation needs and environmental management processes.
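As an illustration of the likelihood-by-consequence scoring that AS 4360-style and HACCP-style assessments rely on, here is a minimal Python sketch; the 1-5 scales, the banding thresholds and the example event are assumptions for demonstration, not the hybrid methodology the paper develops.

```python
# Illustrative likelihood-by-consequence scoring of the kind used in
# AS 4360 / ADWG-style catchment risk assessments. The 1-5 scales and
# the banding thresholds are assumptions for demonstration only.
LIKELIHOOD = {"rare": 1, "unlikely": 2, "possible": 3,
              "likely": 4, "almost certain": 5}
CONSEQUENCE = {"insignificant": 1, "minor": 2, "moderate": 3,
               "major": 4, "catastrophic": 5}

def risk_rating(likelihood: str, consequence: str) -> tuple[int, str]:
    score = LIKELIHOOD[likelihood] * CONSEQUENCE[consequence]
    if score <= 4:
        band = "low"
    elif score <= 9:
        band = "moderate"
    elif score <= 16:
        band = "high"
    else:
        band = "very high"
    return score, band

# Example hazardous event: pathogen washoff from grazing land after a storm.
print(risk_rating("likely", "major"))  # -> (16, 'high')
```

Scoring hazardous events rather than bare hazards, as the paper recommends, simply means each (likelihood, consequence) pair is attached to a concrete scenario like the one in the comment above.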


2016 ◽  
Vol 16 (4) ◽  
pp. 922-930 ◽  
Author(s):  
L. Richard ◽  
E. Mayr ◽  
M. Zunabovic ◽  
R. Allabashi ◽  
R. Perfler

The implementation and evaluation of biological nitrification as a possible treatment option for the small-scale drinking water supply of a rural Upper Austrian community were investigated. The drinking water supply of this community (average system input volume: 20 m3/d) is based on deep anaerobic groundwater with a high ammonium content of geogenic origin (up to 5 mg/l), which must be treated to prevent the formation of nitrites in the drinking water supply system. This paper describes the implementation and operation of biological nitrification despite several constraints, including limited space, site conditions, and financial and staffing resources. A pilot drinking water treatment plant with biological nitrification implemented in sand filters was designed and constructed for a maximum treatment capacity of 1.2 m3/h. Online monitoring of selected physicochemical parameters has provided continuous treatment performance data. The treatment performance of the plant was evaluated under standard operation as well as during selected malfunction events.
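For a sense of scale, the back-of-envelope Python calculation below estimates the oxygen demand of full nitrification, assuming the reported 5 mg/l is measured as ammonium (NH4+) and using the textbook stoichiometry of roughly 4.57 g O2 per g of ammonium nitrogen; the figures are illustrative and not taken from the paper.

```python
# Back-of-envelope oxygen demand for full nitrification. Assumes the
# reported 5 mg/l is measured as ammonium (NH4+), and uses the textbook
# stoichiometry of ~4.57 g O2 per g of ammonium nitrogen (NH4+ -> NO3-).
NH4_MG_L = 5.0            # raw-water ammonium, mg/l (from the abstract)
N_FRACTION = 14.0 / 18.0  # mass fraction of N in NH4+ (14 of 18 g/mol)
O2_PER_N = 4.57           # g O2 required per g NH4-N oxidised

nh4_n_mg_l = NH4_MG_L * N_FRACTION        # ~3.9 mg/l as N
o2_demand_mg_l = nh4_n_mg_l * O2_PER_N    # ~17.8 mg/l O2
daily_o2_g = o2_demand_mg_l * 20.0        # mg/l equals g/m3; 20 m3/d intake
print(f"{o2_demand_mg_l:.1f} mg/l O2 needed, ~{daily_o2_g:.0f} g O2 per day")
```

Since the source is deep anaerobic groundwater with essentially no dissolved oxygen, this entire demand has to be met by aeration upstream of the nitrifying sand filters.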


2007 ◽  
Vol 7 (5-6) ◽  
pp. 219-225
Author(s):  
G. Gangl ◽  
D. Fuchs-Hanusch ◽  
E. Stadlober ◽  
P. Kauch

According to national standards, water utilities must guarantee the supply of water to their consumers in appropriate quality, quantity and pressure, which makes foresighted rehabilitation planning necessary. In some cases, parts of the buried water supply system are more than 100 years old but still in use. Utilities observe the ageing of pipe materials as failures in daily operation, with several factors influencing life expectancy. While the first failure of a pipe section is nearly impossible to predict, once a failure caused by ageing processes has occurred, subsequent failures can be described statistically. Statistical analyses can therefore support the decision whether a pipe section should stay in use and be repaired locally, or be replaced.
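One common way to make "subsequent failures can be described statistically" concrete is to fit a distribution to inter-failure times; the sketch below fits a Weibull distribution with SciPy. The data are invented and the choice of a Weibull renewal model is an assumption for illustration, not the authors' method.

```python
# A minimal sketch, not the authors' model: once an age-related failure
# has occurred, inter-failure times are often described with a Weibull
# renewal model; a fitted shape parameter k > 1 indicates a failure rate
# that increases with pipe age. The observations below are invented.
import numpy as np
from scipy import stats

inter_failure_years = np.array([6.2, 3.8, 2.9, 2.1, 1.6, 1.2])  # hypothetical
k, loc, scale = stats.weibull_min.fit(inter_failure_years, floc=0)
print(f"Weibull shape k = {k:.2f}, scale = {scale:.2f} years")
```

A repair-or-replace rule can then compare the expected cost of the next predicted repairs against the cost of replacing the section.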


1984 ◽  
Vol 3 (5) ◽  
pp. 383-392 ◽  
Author(s):  
J.C. Sherlock ◽  
D. Ashby ◽  
H.T. Delves ◽  
G.I. Forbes ◽  
M.R. Moore ◽  
...  

1. The water supply in Ayr (Scotland, UK) was plumbosolvent and many dwellings in Ayr contained lead pipes. In 1981, treatment of the water supply to reduce its plumbosolvency was initiated. Measurements of water and blood lead concentrations were made before and after the treatment began; most were made on water samples from the same dwellings and blood samples from the same women.
2. Water treatment produced a sharp fall in water lead concentrations and a decrease in the median blood lead concentration from 21 to 13 μg/100 ml.
3. Two women had higher than expected blood lead concentrations; both had been removing old paint.
4. Women who had lead pipes removed from their dwellings all showed substantial decreases in their blood lead concentrations.
5. The curvilinearity of the relation between blood lead and water lead concentrations is confirmed. Even relatively low (<40 μg/l) water lead concentrations may make a substantial contribution to blood lead concentrations.
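The curvilinearity noted in point 5 is often approximated with a power-law fit; the Python sketch below fits blood_pb = a * water_pb**b to invented data (the model choice and the data are illustrative assumptions, not the study's analysis). An exponent b < 1 captures the finding that low water lead concentrations contribute disproportionately to blood lead.

```python
# Illustrative only: the curvilinear blood-lead/water-lead relation is
# often approximated by a power law, blood_pb = a * water_pb**b with
# b < 1, so even low water lead levels contribute disproportionately.
# All data points below are invented for demonstration.
import numpy as np
from scipy.optimize import curve_fit

water_pb = np.array([10.0, 40.0, 100.0, 300.0, 700.0])  # ug/l, hypothetical
blood_pb = np.array([8.0, 12.0, 16.0, 23.0, 30.0])      # ug/100 ml, hypothetical

def power_law(x, a, b):
    return a * x ** b

(a, b), _ = curve_fit(power_law, water_pb, blood_pb, p0=(3.0, 0.3))
print(f"blood_pb ~= {a:.2f} * water_pb^{b:.2f}")  # expect b < 1
```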


2015 ◽  
Vol 7 (1) ◽  
pp. 1-15 ◽  
Author(s):  
I. Delpla ◽  
A. Scheili ◽  
S. Guilherme ◽  
G. Cool ◽  
M. J. Rodriguez

In Québec, Canada, shifts in climate patterns (i.e., rainfall increase) could affect source water quality through the intensification of surface/groundwater runoff contamination events, leading to a decline in drinking water treatment efficiency and ultimately to greater disinfection by-product (DBP) formation following treatment. To assess the impacts of climate change (CC) scenarios on DBP formation, a suite of models linking climate to DBPs was used. This study applies three emissions scenarios (B1, A1B and A2) for three 30-year horizons (2020, 2050 and 2080) to produce inputs for several DBP models (total trihalomethanes (TTHMs), haloacetic acids and haloacetonitriles). An annual increase is estimated for all DBPs under each CC scenario and horizon, with the highest seasonal increases estimated in winter for all DBP groups and species. In the worst-case scenario (A2-2080), TTHMs would be affected most in winter (+34.0%), followed by spring (+16.1%) and fall (+4.4%), whereas summer concentrations would remain stable (−0.3 to +0.4%). Small water utilities applying only a disinfection step could be more affected by CC-related rises in TTHM concentrations than utilities with a complete water treatment process (coagulation–flocculation, filtration and disinfection), with increases of +13.6% and +8.2% respectively (A2-2080).
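To make the seasonal percentages concrete, the short Python sketch below applies the reported A2-2080 changes to a hypothetical baseline TTHM concentration of 40 μg/l; only the percentages come from the study, while the baseline and the use of the summer upper bound (+0.4%) are assumptions.

```python
# Applying the reported A2-2080 seasonal changes to a hypothetical
# baseline TTHM concentration. Only the percentages come from the study;
# the 40 ug/l baseline and the summer upper bound (+0.4%) are assumptions.
baseline_tthm_ug_l = 40.0
seasonal_change_pct = {"winter": 34.0, "spring": 16.1, "fall": 4.4, "summer": 0.4}

for season, pct in seasonal_change_pct.items():
    projected = baseline_tthm_ug_l * (1 + pct / 100)
    print(f"{season:>6}: {projected:5.1f} ug/l ({pct:+.1f}%)")
```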


Author(s):  
Victor Khoruzhy ◽  
Tetiana Khomutetska ◽  
Igor Nedashkovskiy

Surface water bodies that serve as sources of drinking water supply receive a significant pollution load from wastewater. This degrades the ecological condition of water resources and threatens the health and sanitary well-being of the population. The main pollutants of surface sources are: domestic and industrial wastewater containing organic pollutants, surfactants and heavy metal ions; oil products from industrial sites and urban areas; effluents from livestock farms and waste storage ponds; and mineral fertilizers and pesticides washed off agricultural land. Regulation of surface water sources further contributes to the deterioration of their water quality. Existing water treatment technologies therefore cannot always provide the required degree of drinking water purification: according to monitoring studies, more than 38% of water samples taken at centralized water supply facilities did not meet regulatory requirements. This situation encourages the search for approaches that would allow water supply systems to operate more efficiently. Modernization of existing water supply facilities and the application of new water treatment technologies can help solve the problem. The article illustrates design schemes for bank-side and channel water intake and treatment facilities whose use makes it possible to reduce the solids load on the main treatment facilities, increase the reliability of fish fry protection and improve the ecological condition of reservoirs at water intake sites. For effective removal of organic matter at water treatment plants, it is advisable to use bioreactors and contact-clarifying filters. Such solutions not only increase the capacity of the water treatment plant but also significantly reduce its construction cost, simplify the operation of the facilities and reduce annual operating costs.


2018 ◽  
Vol 2 (2) ◽  
pp. 39-48
Author(s):  
Hayder Mohammed Issa ◽  
Reem Ahmed Alrwai

A safe source of drinking water is always considered an essential factor in water supply for cities and urban areas. As part of this, drinking water quality is monitored via a useful scheme: a drinking water quality index (DWQI). A DWQI is preferred because it summarizes the whole set of physicochemical and bacteriological properties of a drinking water sample in a single, simple term. In this study, three drinking water treatment plants (DWTPs) supplying drinking water to Erbil City were evaluated: Efraz 1, Efraz 2 and Efraz 3. The assessment tested thirteen physicochemical and two bacteriological parameters over the period 2003–2017. Turbidity, electrical conductivity (EC), total alkalinity, total hardness, total coliforms and fecal coliforms were found to have the greatest influence on drinking water quality. The DWQI results showed that the quality of drinking water supplied by the three DWTPs in Erbil City fell within the good level, except for occasional periods when it varied between good and fair; the quality of the drinking water supply never dropped to the marginal or poor level over the period investigated. Hierarchical cluster analysis (HCA) classified the drinking water dataset into three major clusters, reflecting the diverse sources of the physicochemical and bacteriological parameters: natural, agricultural and urban discharges.
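The abstract does not spell out which DWQI formulation was used; one widely used option is the CCME water quality index, sketched below in Python under that assumption. It combines scope (F1, share of parameters that ever fail), frequency (F2, share of failed tests) and amplitude (F3, magnitude of exceedances); the sample data and objectives are hypothetical.

```python
# One widely used index formulation (the CCME WQI), shown here as an
# assumption since the abstract does not spell out its DWQI formula.
# Inputs are (parameter, measured value, objective) triples, where a
# measurement above its objective counts as a failed test.
import math

def ccme_wqi(tests: list[tuple[str, float, float]]) -> float:
    variables = {name for name, _, _ in tests}
    failed_vars = {name for name, value, obj in tests if value > obj}
    excursions = [value / obj - 1 for _, value, obj in tests if value > obj]
    f1 = 100 * len(failed_vars) / len(variables)   # scope
    f2 = 100 * len(excursions) / len(tests)        # frequency
    nse = sum(excursions) / len(tests)             # normalised excursion sum
    f3 = nse / (0.01 * nse + 0.01)                 # amplitude
    return 100 - math.sqrt(f1**2 + f2**2 + f3**2) / 1.732

# Hypothetical turbidity (NTU) and total coliform (CFU/100 ml) tests;
# with so few tests F1 dominates, so real use needs long records
# like the 2003-2017 series assessed in this study.
samples = [("turbidity", 0.8, 1.0), ("turbidity", 1.2, 1.0),
           ("coliform", 0.0, 1.0), ("coliform", 0.0, 1.0)]
print(f"WQI = {ccme_wqi(samples):.1f}")
```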


Water Policy ◽  
2021 ◽  
Author(s):  
L. Bross ◽  
J. Bäumer ◽  
I. Voggenreiter ◽  
I. Wienand ◽  
A. Fekete

The drinking water supply is a core element of national regulations for normal and emergency supply as well as for coping with crisis events. The interdependence of critical infrastructures in particular means that water supply failures can have far-reaching consequences and endanger the safety of a society, e.g., by impairing hospital operations. In case of an emergency in the drinking water infrastructure, minimum supply standards, e.g., for patients in hospitals, become important for emergency management during crisis situations. However, wider recognition of this issue is still lacking, particularly in countries facing comparably minor water supply disruptions. Several international agencies provide guideline values for minimum water supply standards for hospitals in case of a disaster. Acknowledging that these minimum standards were developed for humanitarian assistance or civil protection, it remains to be analyzed whether they apply to disaster management in countries with high water and healthcare supply standards. Based on a literature review of scientific publications and humanitarian guidelines, as well as policies from selected countries, the current processes, contents, and shortcomings of emergency water supply planning are assessed. To close the identified gaps, this paper indicates potential improvements for emergency water supply planning in general as well as for the supply of hospitals, and identifies future fields of research.

