Wednesday, September 5, 2012

The Drinking Water Standards (Part 1)

When the objective of water treatment is to provide drinking water, we need to select technologies that are not only the best available but will also meet local and national quality standards. The primary goals of a water treatment plant have remained practically the same for over a century: to produce water that is biologically and chemically safe, appealing to the consumer, and noncorrosive and nonscaling. Today, plant design has become very complex because of the discovery of seemingly innumerable chemical substances, the multiplication of regulations, and the effort to satisfy more discriminating palates. In addition to the basics, designers must now keep in mind all manner of legal mandates, as well as public concerns and environmental considerations. This series is intended to provide an initial perspective on water works engineering planning, design, and operation.

The growth of community water supply systems in the United States started in the early 1800s. By 1860, over 400 major water systems had been built, and by the turn of the century over 3,000, serving major cities and towns. Many of the older plants were equipped with slow sand filters. In the mid-1890s, the Louisville Water Company introduced the technologies of coagulation with rapid sand filtration.

Chlorine was first applied to potable water in the 1830s, for taste and odor control; at that time, diseases were thought to be spread by odors. It was not until the 1890s, and the advent of the germ theory of disease, that the importance of disinfection of potable water was understood. Chlorination was first introduced on a practical scale in 1908 and soon became a common practice.
Federal authority to establish standards for drinking water systems originated with the enactment by Congress of the Interstate Quarantine Act of 1893, which authorized the Surgeon General of the United States Public Health Service (USPHS) to establish and enforce regulations to prevent the introduction, transmission, or spread of communicable diseases.

Today, resource limitations have caused the United States Environmental Protection Agency (USEPA) to reassess its schedules for new rules. A 1987 USEPA survey indicated there were approximately 202,000 public water systems in the United States. About 29 percent of these were community water systems, which serve approximately 90 percent of the population. Of the 58,908 community systems that serve about 226 million people, 51,552 were classified as "small" or "very small." Each of these systems serves, on average, a population of fewer than 3,300 people; the total population served by all of them is approximately 25 million. These figures convey the scale of the task of meeting drinking water demands in the United States.

Compliance with drinking water standards is not uniform; small systems are the most frequent violators of federal regulations. Microbiological violations, together with failures to monitor and report, account for the vast majority of cases, and violations of SDWA maximum contaminant levels (MCLs) are also quite common. Bringing small water systems into compliance requires applicable technologies, operator ability, financial resources, and institutional arrangements. The 1986 SDWA amendments authorized USEPA to set the best available technology (BAT) that can be incorporated in a design for the purpose of complying with the National Primary Drinking Water Regulations (NPDWR). The current BATs for maintaining standards are as follows:


For turbidity, color and microbiological control in surface water treatment: filtration. Common variations of filtration are conventional, direct, slow sand, diatomaceous earth, and membranes.

For inactivation of microorganisms: disinfection. Typical disinfectants are chlorine, chlorine dioxide, chloramines, and ozone.

For organic contaminant removal from surface water: packed-tower aeration, granular activated carbon (GAC), powdered activated carbon (PAC), diffused aeration, advanced oxidation processes, and reverse osmosis (RO).

For inorganic contaminant removal: membranes, ion exchange, activated alumina, and GAC.

For corrosion control: typically, pH adjustment or corrosion inhibitors.

The implications of the 1986 amendments to the SDWA and the new regulations have driven rapid development and introduction of new technologies and equipment for water treatment and monitoring over the last two decades. Biological processes in particular have proven effective at removing biodegradable organic carbon that may sustain the regrowth of potentially harmful microorganisms in the distribution system, at controlling taste and odor, and at reducing chlorine demand and DBP formation potential. In many cases, biologically active sand or carbon filters provide more cost-effective treatment of micro-contaminants than do physicochemical processes. Pertinent to the subject matter covered in this volume, membrane technology has been applied in drinking water treatment, partly because of affordable membranes and the demand for removal of many contaminants. Microfiltration, ultrafiltration, nanofiltration, and others have become common names in the water industry. Membrane technology is being tested for the removal of microbes, such as Giardia and Cryptosporidium, and for selective removal of nitrate. In other instances, membrane technology is applied for removal of DBP precursors, VOCs, and other contaminants.

Other treatment technologies that have potential for full-scale adoption are photochemical oxidation using ozone and UV radiation or hydrogen peroxide for destruction of refractory organic compounds. One example of a technology that was developed outside North America and later emerged in the U.S. is the Haberer process. This process combines contact flocculation, filtration, and powdered activated carbon adsorption to meet a wide range of requirements for surface water and groundwater purification.

Utilities are seeking not only to improve treatment but also to monitor their supplies for microbiological contaminants more effectively. Electro-optical sensors allow early detection of algal blooms in a reservoir, diagnosis of problems, and guidance for operational changes. Gene-probe technology was first developed in response to the need for improved identification of microbes in the field of clinical microbiology. Attempts are now being made to combine radiolabeled and nonradioactive gene-probe assays with traditional detection methods for enteric viruses and protozoan parasites, such as Giardia and Cryptosporidium. This technique has the potential for monitoring water supplies for increasingly complex groups of microbes.

Continue to part 2

