First published in Water e-Journal Vol 3 No 2 2018.
Chemical disinfection of drinking water has been the single most important reason for improved public health. However, potential chronic health impacts that may be associated with the consumption of disinfected water require health regulators and water utilities to consider the overall safety of drinking water. Throughout regulated jurisdictions worldwide, control of disinfection by-products (DBPs) is currently managed by setting drinking water guideline values for (or regulating) specific compounds. However, the currently regulated DBPs do not adequately account for the potential health impacts, and identification of the responsible compounds is difficult. Therefore, alternative approaches to disinfection management are warranted. This paper discusses alternatives that could be implemented, in the short and longer term, to reduce DBP formation and improve overall public health associated with disinfected drinking water.
Chlorine is the most common primary disinfectant applied in drinking water supplies across Australia, and in many regions throughout the world. While disinfection with chlorine has provided substantial public health benefits, its interaction with natural organic matter (NOM) and inorganic precursors such as bromide and iodide that may be present in the water can generate many disinfection by-products (DBPs).
Although chlorinated drinking water has been linked to adverse health effects and more than 600 DBPs have been identified to date (Weinberg et al., 2002; Richardson et al., 2007), the behaviour of only approximately 20 of these is adequately understood (Sadiq and Rodriguez, 2004). Currently, throughout regulated jurisdictions worldwide, control of DBPs in drinking water is managed by setting guidelines for (or regulating) specific compounds. This is based on, and limited by, identification of the compounds formed, development of analytical capability and the available toxicological data.
For most DBPs, not all of this information is available, so potential health impacts must be extrapolated from incomplete data. This lack of data delays guideline development and implementation and can mean that DBPs with greater health impact remain without guidelines. For example, only three of the possible nine chloro/bromo haloacetic acids (HAAs) are included in the Australian Drinking Water Guidelines (ADWG), the Japanese Standards and the World Health Organisation (WHO) Guidelines (Table 1).
The HAAs with Australian guideline values are the three chlorinated HAAs; however, there is evidence that the brominated HAAs are of greater health significance (Richardson et al., 2007).
A possible approach to addressing the total risk would be to include a more comprehensive measure of the actual DBPs being formed. One such technique is the total or adsorbable organic halogen concentration, referred to as TOX or AOX. In this method, organic compounds are adsorbed onto activated carbon columns, which are then pyrolysed at high temperature, converting the organic carbon to carbon dioxide and the bound halogens to hydrogen halides; the latter are then measured by silver-ion precipitation. It is also possible to measure just the total adsorbable brominated compounds (AOBr), by ion chromatography, as these have been shown to be of greater health impact. Whilst AOX is a more comprehensive measure of chlorinated DBPs, it does not capture DBPs that do not contain halogens, such as the nitrosamines associated with chloramination.
Relationships between various DBP classes have been developed over the years; some studies have linked total AOX formation with other DBP classes such as trihalomethanes (THMs) or HAAs (Singer and Chang, 1989; Obolensky and Singer, 2005). More information on the extent of total DBP formation would be of value, both in terms of informing utilities about their overall DBP production and as a potential regulatory/compliance approach. Limiting the total DBPs formed, as measured by AOX, would provide an overall cap on DBP formation, but does this actually link to health impacts? Or is it too broad a measure, one that will not necessarily produce a better health outcome?
A recent study (Sawade et al., 2016) showed that although AOX more effectively represented overall halogenated DBP formation, it inadequately described the degree of toxic response in bioassays, as the majority of the effect is believed to be due to individual, highly toxic compounds that only form a small fraction of the total DBPs.
Introduction of additional DBP guidelines may not necessarily be the best approach in terms of encouraging proactive and effective management of water treatment processes and distribution systems. As has been seen from establishing THM guidelines in the ADWG, this can encourage operational practices that achieve compliance with the regulations but do not necessarily improve overall water quality or reduce health risk. An example of this is the use of aeration to remove THMs following disinfection. This practice assists compliance with THM guidelines but only affects volatile compounds and does not prevent formation of other DBPs.
A more effective approach would be to establish regulations that encourage changes to operational practice to reduce overall risk. This could occur by selecting regulations that target water quality prior to disinfection. This is akin to the establishment of a turbidity value for filter operation within conventional treatment plants as a surrogate to reduce the risk of Cryptosporidium entering the water supply, as discussed in the ADWG. To date, there is very little direct evidence linking turbidity <0.2 NTU with a specific reduction in Cryptosporidium occurrence in treated water. However, implementing this approach encourages a focus on optimising treatment operation to improve treated water quality, rather than focussing only on reducing the number of detected Cryptosporidium oocysts in treated water. Whilst measurement of Cryptosporidium oocysts in raw and treated water enables quantification of removal, and auditing of compliance, detection of low oocyst levels is challenging and, alone, is not an effective mechanism for encouraging improved treatment performance. Whilst there may also be additional surrogates within the treatment process that could further improve overall performance and reduce Cryptosporidium risk, the introduction of filtered water turbidity regulations has certainly been a step in the right direction.
The factors that impact DBP formation during chlorination include chlorine dose, contact time, pH, temperature and the presence of precursors: natural organic matter (NOM), bromide and iodide. DBPs are formed through reactions of hypochlorous acid (HOCl) with NOM in the presence or absence of bromide/iodide. The concentration and character of the NOM are critical in determining the extent and rate of reaction with chlorine and hence the amount and type of DBPs that are formed. The extent of formation of chlorinated versus brominated (or iodinated) DBPs depends on both the concentration of bromide and iodide and on the relative chlorine (HOCl) to bromide ratio (Westerhoff et al., 2004; Sadiq and Rodriguez, 2004).
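For reference, this competition can be summarised schematically (standard aqueous halogen chemistry, not drawn from the studies cited above): HOCl rapidly oxidises bromide to hypobromous acid (HOBr), and both halogenating agents then react with NOM:

\[
\mathrm{HOCl} + \mathrm{Br^-} \rightarrow \mathrm{HOBr} + \mathrm{Cl^-}
\]
\[
\mathrm{HOCl}/\mathrm{HOBr} + \mathrm{NOM} \rightarrow \text{THMs, HAAs and other halogenated DBPs}
\]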
Understanding how NOM reacts with chlorine has been a key focus of researchers worldwide for many years. Measurement of total NOM is usually undertaken using organic carbon analysers and, in Australia, drinking water utilities normally analyse samples after filtration through 0.45 µm filters to provide a measure of dissolved organic carbon (DOC). In the United States (US), total organic carbon (TOC) is measured, and the USEPA D/DBP Rule has adopted the approach of reducing DBPs by improving removal of TOC through conventional treatment. The rationale behind the introduction of enhanced coagulation (defined as the addition of excess coagulant for improved removal of DBP precursors by conventional treatment) was that only a fraction of the chlorination by-products and associated health risks have been identified, and hence an increase in precursor removal would reduce both known and unknown public health risks.
This approach requires a level of TOC removal determined by the raw water alkalinity and raw water TOC concentration, according to Table 2 (Crozes et al., 1995). Reduction in pH to achieve the required TOC removal may also be necessary in some waters. While there has been some investigation into the DBP formation potential of DOC fractions (Kristiana et al., 2010; Marhaba and Van, 2000), these fractions are still not so clearly defined that the outcomes translate to all other water sources, and there is still a great deal of site specificity to consider. Ultimately, conventional treatment is not a precise treatment technology, allowing only limited potential for targeted removal of DOC fractions.
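As an illustration of how such a requirement can be operationalised, the sketch below encodes a TOC removal lookup of the kind used in the USEPA Stage 1 D/DBP Rule. The percentage matrix shown is the commonly cited USEPA enhanced coagulation matrix and is included here for illustration only; it is assumed rather than taken from Table 2, and the authoritative values should be confirmed against the Rule.

```python
# Illustrative sketch of an enhanced coagulation TOC removal lookup.
# The percentages below follow the commonly cited USEPA Stage 1 D/DBP Rule
# matrix (raw TOC band x raw alkalinity band) and are assumed for illustration
# only; consult the Rule (or Table 2) for the authoritative values.

def required_toc_removal(raw_toc_mg_l: float, alkalinity_mg_l_caco3: float) -> float:
    """Return the required TOC removal (percent) for enhanced coagulation."""
    # Rows: raw TOC bands; columns: alkalinity bands (0-60, >60-120, >120 mg/L as CaCO3).
    matrix = {
        (2.0, 4.0): (35.0, 25.0, 15.0),
        (4.0, 8.0): (45.0, 35.0, 25.0),
        (8.0, float("inf")): (50.0, 40.0, 30.0),
    }
    if raw_toc_mg_l <= 2.0:
        return 0.0  # waters below 2 mg/L TOC are not subject to the requirement
    col = 0 if alkalinity_mg_l_caco3 <= 60 else (1 if alkalinity_mg_l_caco3 <= 120 else 2)
    for (lo, hi), removals in matrix.items():
        if lo < raw_toc_mg_l <= hi:
            return removals[col]
    return 0.0

# Example: raw water with 5.5 mg/L TOC and 80 mg/L alkalinity -> 35% removal required.
print(required_toc_removal(5.5, 80))
```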
This is a good approach that focuses on improving overall treatment and could be adopted within Australia. It could be enhanced further by utilising a modelling approach to determine the optimal DOC that can be removed within a particular raw water source.
There are a number of models available that can predict the coagulant dose required to maximise DOC removal using limits defined by a point of diminishing returns such as <0.15 mg/L DOC change per 10 mg/L coagulant (van Leeuwen et al., 2005). Similar to the Cryptosporidium analogy, this would ensure that conventional treatment processes are optimised to remove the maximum amount of DOC possible by that treatment process.
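A minimal sketch of how the point-of-diminishing-returns criterion quoted above (<0.15 mg/L DOC removed per additional 10 mg/L of coagulant) could be applied to jar-test data is shown below. The data points and function name are hypothetical and are not taken from van Leeuwen et al. (2005); the sketch simply illustrates the selection logic.

```python
# Hypothetical jar-test results for a single water: coagulant dose (mg/L)
# versus residual DOC (mg/L) after coagulation and settling.
doses = [10, 20, 30, 40, 50, 60, 70, 80]
residual_doc = [6.0, 4.8, 4.0, 3.5, 3.2, 3.06, 2.98, 2.93]

def dose_at_diminishing_returns(doses, residual_doc, threshold=0.15, step=10.0):
    """Lowest dose beyond which the incremental DOC removal per `step` mg/L of
    additional coagulant falls below `threshold` mg/L (diminishing-returns criterion)."""
    for i in range(len(doses) - 1):
        incremental_removal = (residual_doc[i] - residual_doc[i + 1]) / (doses[i + 1] - doses[i]) * step
        if incremental_removal < threshold:
            return doses[i]  # further dosing yields diminishing returns
    return doses[-1]

print(dose_at_diminishing_returns(doses, residual_doc))  # 50 mg/L for the data above
```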
In 2007, the Health Ministry in France went one step further and recommended a target of 2 mg/L TOC for drinking water (personal communication). Whilst this is not a regulated limit, but only expressed as a “reference”, the value is recognised by French water utilities as an operational target for efficient management and control of THM formation below the regulated limit of 100 µg/L at the customer tap. Striving for treatment targets that exceed the capacity of conventional techniques is acknowledged as ambitious and potentially costly to achieve in many parts of the world; however, reaching a future state begins with a goal, and efforts to overcome the technical and economic difficulties need to continue in order to achieve improved public health protection.
The extent of reaction of chlorine with DOC is not based solely on the concentration of the DOC but, more specifically, on the type of DOC. NOM consists of a range of compounds, including humic substances (humic and fulvic acids) and non-humic matter such as proteins, amino acids, sugars and polysaccharides. The precursors that produce the greatest proportion of DBPs have been identified as those with strong UV absorption that contain aromatic functional groups or unsaturated bonds, such as hydrophobic acids (humic substances). Activated aromatic functional groups, referred to collectively as polyhydroxyphenolic acid (PHA) moieties (aromatic rings substituted with one or more hydroxyl groups and incorporated into larger NOM molecules), are the predominant reaction sites in humic substances (Li et al., 2000). Other aliphatic functional groups, such as esters and ketones, can also contribute to the formation of DBPs, but to a lesser extent than PHA moieties (Li et al., 2002; Westerhoff et al., 2004).
Many researchers have shown a strong correlation between UV absorbance at 254 nm (UV254) and the formation of total THMs. If a specific UV254 value can be found that limits THM formation below the regulated values, this could provide a worthwhile operational/regulatory target and is worth further investigation. This would direct operational practice towards improving treated water quality prior to disinfection, and would reduce formation not only of THMs but of all DBPs, leading to reduced health risk.
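As an illustration of how a system-specific UV254 target might be derived, the sketch below fits a simple linear regression of total THM formation against pre-disinfection UV254 and inverts it at the ADWG THM guideline of 250 µg/L. The paired data are hypothetical; any real relationship would need to be established for the individual supply.

```python
# Hypothetical paired observations for one supply: UV254 of the water prior to
# chlorination (/cm) and total THMs measured after a fixed contact time (ug/L).
import numpy as np

uv254 = np.array([0.020, 0.035, 0.050, 0.065, 0.080, 0.095, 0.110])
tthm = np.array([45.0, 78.0, 110.0, 142.0, 176.0, 205.0, 240.0])

# Least-squares fit: TTHM ~ slope * UV254 + intercept
slope, intercept = np.polyfit(uv254, tthm, 1)

# Invert the fit at the ADWG THM guideline (250 ug/L) to obtain an indicative
# operational UV254 target for this hypothetical supply.
guideline_ug_l = 250.0
uv254_target = (guideline_ug_l - intercept) / slope
print(f"Indicative UV254 target: {uv254_target:.3f} /cm")
```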
Whilst the correlation between UV254 and THM formation applies generically across different water qualities, THM formation (and that of other DBPs) is also impacted by detention time and secondary disinfection within distribution systems. Strong linear correlations have also been found between UV254 prior to chlorination and THM formation at the ends of distribution systems for individual supplies, which may provide an additional means of predicting THM formation within a specific distribution system (SA Water, unpublished data). This would assist with setting operational guidelines for managing THMs (and other DBPs) within specific systems.
Chemical coagulation is very effective for removal of UV254-absorbing NOM compounds; hence, selection of an operational/regulatory target for UV254 to increase removal of organic DBP precursors may not necessarily require significant upgrades for treatment plants that already incorporate coagulation.
Korshin et al. (1997) also showed that the amount of AOX formed when a sample was chlorinated was strongly related to the change in the UV absorbance. This applied over a wide range of wavelengths, with the relationship most reliable and easiest to apply at wavelengths where the magnitude of absorbance change was greatest. Typically, the peak value of this change is at or near 272 nm, so it is at this wavelength that the relationship has been explored in most detail.
Therefore, monitoring the change in absorbance at this wavelength could provide an alternative means of monitoring and controlling DBP formation.
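A minimal sketch of this differential absorbance approach is given below, assuming absorbance at 272 nm is measured before chlorination and again after a fixed contact time. The calibration constant k is hypothetical; in practice a site-specific relationship between ΔA272 and AOX would be established from paired laboratory measurements, as in Korshin et al. (1997).

```python
# Differential absorbance at 272 nm as a surrogate for halogenated DBP (AOX)
# formation. The absorbance values and calibration constant are hypothetical.

def delta_a272(abs_before_272: float, abs_after_272: float) -> float:
    """Change in absorbance at 272 nm caused by chlorination (1 cm cell)."""
    return abs_before_272 - abs_after_272

# Site-specific calibration of the form AOX = k * dA272, established from paired
# laboratory measurements (the value of k here is illustrative only).
k_ug_l_per_abs_unit = 10000.0

dA = delta_a272(abs_before_272=0.085, abs_after_272=0.062)
estimated_aox = k_ug_l_per_abs_unit * dA
print(f"dA272 = {dA:.3f}, estimated AOX ~ {estimated_aox:.0f} ug/L")
```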
Recent studies have also identified correlations between DBPs and a range of fluorescence spectroscopy descriptors. A strong correlation has been identified between chloroform and fluorescence-based measurements, such as the excitation-emission pair (λEx/Em) of λ278/506 (humic acid-like) (Pifer et al., 2014). Excitation-emission matrices (EEMs) were also processed with parallel factor analysis, which resulted in six component fluorophore groups (C1-C6), each with a maximum intensity (FMAX). Strong correlations were also found between chloroform and C1 FMAX (fulvic acid-like) (Pifer et al., 2014). In summary, FMAX values produced an accurate DBP-precursor surrogate parameter because the algorithm is able to isolate components that are strongly correlated with THM formation potential.
Relationships between the formation of DBPs and changes of the fluorescence of natural organic matter (NOM) in chlorinated water were also quantified using two fluorescence indexes by Roccaro et al. (2009). They were defined as the change of the wavelength that corresponds to 50% of the maximum intensity of fluorescence (Δλem 0.5) and the differential ratio of fluorescence intensities measured at 500 and 450 nm (Δ(I500/I450)). As expected, variations of chlorine doses, reaction times and temperatures affected the kinetics of chlorine consumption and DBP formation. However, correlations between chlorine consumption, concentration, and speciation of THMs, HAAs and haloacetonitriles, with Δ(I500/I450) and Δλem 0.5 values remained unaffected by chlorination conditions and, to some extent, NOM properties.
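A minimal sketch of how these two descriptors could be computed from emission scans recorded before and after chlorination is given below. The spectra are hypothetical, the half-maximum wavelength is taken on the long-wavelength side of the emission band, and implementation details (interpolation, baseline handling) are simplified.

```python
# Sketch: computing the two chlorination-sensitive fluorescence descriptors
# from emission spectra at a fixed excitation wavelength. Data are hypothetical.
import numpy as np

def half_max_emission_wavelength(wavelengths, intensities):
    """Longest wavelength at which intensity remains at or above 50% of its maximum."""
    half = 0.5 * np.max(intensities)
    above = np.where(intensities >= half)[0]
    return wavelengths[above[-1]]

def intensity_ratio_500_450(wavelengths, intensities):
    """Ratio of fluorescence intensities at 500 nm and 450 nm."""
    i500 = np.interp(500.0, wavelengths, intensities)
    i450 = np.interp(450.0, wavelengths, intensities)
    return i500 / i450

# Hypothetical emission scans (380-600 nm) before and after chlorination.
wl = np.arange(380, 601, 5, dtype=float)
before = np.exp(-((wl - 460.0) / 55.0) ** 2)
after = 0.7 * np.exp(-((wl - 450.0) / 50.0) ** 2)

d_lambda_half = half_max_emission_wavelength(wl, before) - half_max_emission_wavelength(wl, after)
d_ratio = intensity_ratio_500_450(wl, before) - intensity_ratio_500_450(wl, after)
print(f"d(lambda_em,0.5) = {d_lambda_half:.1f} nm, d(I500/I450) = {d_ratio:.3f}")
```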
A later study by the same researchers (Roccaro et al., 2011) also found very strong relationships between concentrations of trichloronitromethane (chloropicrin) and dichloroacetonitrile and spectroscopic indexes based on both absorbance and fluorescence measurements, namely ΔA272, Δ(I500/I450) and Δλem 0.5. Since these spectroscopic indexes are indicators of the degradation of reactive aromatic groups, it was concluded that the formation of nitrogenous DBPs (N-DBPs) is also associated with the chlorination of activated aromatic groups in NOM. Other research groups have also reported that the major N-DBP precursors are amino acids and N-containing heterocyclic aromatic rings. The authors therefore concluded that the changes in NOM absorbance and fluorescence are fundamental descriptors of the formation of halogenated N-DBPs and may also be suitable for real-time monitoring of emerging N-DBPs and for studying the formation pathways of emerging DBPs.
Roccaro et al. (2009) have suggested that, for the water sources they examined, the fluorescence indicators better represent the more complex mechanisms occurring during chlorination, notably the oxidative and/or incorporative degradation of NOM aromatic groups by halogen species and the structural transformation of NOM molecules. In contrast, the differential absorbance quantifies changes in NOM aromaticity caused by the incorporation of halogens. Additional benefits of the use of NOM fluorescence to track DBP formation are its very high sensitivity, its ability to generate information-rich 3D EEM spectra and its lack of interference from many species (e.g. chlorine, nitrate and nitrite) whose presence complicates the use of absorbance spectroscopy.
The use of a fluorescence-based approach for NOM characterisation and the link with DBP formation is still in development, with significant potential to assist DBP management in the future.
Studies investigating mechanisms of DBP formation, based on UV/visible and solid-state C-NMR spectroscopy, found that aqueous bromine substitutes into organic structures, whereas chlorine tends to cleave carbon bonds and thus has a more significant impact on NOM structure (Westerhoff et al., 2004). Other researchers have reported a preference for bromo-derivatives in the presence of aliphatic precursors versus a preferred chloro-substitution in the case of aromatic compounds. Therefore, the speciation and concentration of DBPs formed during chlorination processes are mainly dominated by the ratio of bromide to reactive NOM as well as the ratio of bromide to chlorine concentrations (Westerhoff et al., 2004; Ates et al., 2007).
Research into the bromine substitution among different DBP classes has found that the molar percent halogen incorporated into DBP can be used as an unbiased measure and be expressed as a dimensionless factor called the bromine incorporation factor (BIF). BIF is defined as the ratio of molar concentration of bromine incorporated into a given class of DBP to the total molar concentration of chlorine and bromine in that class (Hua et al., 2006; Wang et al., 2010).
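As a concrete example, the sketch below computes the BIF for the THM class directly from the definition above, using the number of bromine and chlorine atoms in each of the four chloro/bromo THM species; the sample concentrations are hypothetical.

```python
# Bromine incorporation factor (BIF) for the THM class: moles of Br incorporated
# divided by total moles of (Cl + Br) incorporated. Example concentrations are
# hypothetical (nmol/L).

# (Br atoms, Cl atoms) per molecule for each trihalomethane species.
THM_HALOGENS = {
    "CHCl3": (0, 3),
    "CHBrCl2": (1, 2),
    "CHBr2Cl": (2, 1),
    "CHBr3": (3, 0),
}

def thm_bif(molar_conc: dict) -> float:
    """BIF = sum(n_Br * C_i) / sum((n_Br + n_Cl) * C_i) over THM species i."""
    br = sum(THM_HALOGENS[s][0] * c for s, c in molar_conc.items())
    total = sum(sum(THM_HALOGENS[s]) * c for s, c in molar_conc.items())
    return br / total

sample = {"CHCl3": 300.0, "CHBrCl2": 120.0, "CHBr2Cl": 60.0, "CHBr3": 20.0}
print(f"THM BIF = {thm_bif(sample):.2f}")  # 0 = fully chlorinated, 1 = fully brominated
```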
The BIFs for both THMs and HAAs increased substantially with increasing bromine-to-chlorine consumption ratio; however, the efficiency of bromine substitution into different DBP groups has been found to vary, with higher substitution efficiency of bromine into THMs and dihaloacetonitriles than into dihaloacetic acids and trihaloacetic acids (Obolensky and Singer, 2005).
The results for two different water sources (Sawade et al., 2016) suggested that the concentrations of bromide (Br) and DOC present are more significant for DBP speciation than the ratio itself, as the DOC concentration determines the chlorine dose and hence the HOCl:Br ratio. It is also important to note that not all bromide will be utilised during chlorination. A study by Wang and Huang (2006) reported that at higher chlorine dosages (5 mg/L of free chlorine), about 60% of the bromide is transformed by chlorine into the reactive forms (hypobromous acid and hypobromite ion). At a low chlorine dosage (1 mg/L), however, only 20% of the bromide is transformed.
Sawade et al. (2016) found that addition of bromide to increase the Br:DOC ratio increased the production of toxic DBPs (as measured by mammalian cell cytotoxicity) to the extent that the chlorinated coagulated water was almost as toxic as the chlorinated untreated water. Since the overall DBP production was lower in the coagulated water due to the reduced level of NOM, this indicates that the DBPs with higher bromide substitution were considerably more toxic than the mainly chlorinated forms produced in lower bromide waters. This is consistent with other findings (Richardson et al., 2007).
These studies suggest that water quality regulations need to account for the bromide content in addition to the DOC (or some key characteristic of the DOC) to minimise health risks.
Another, more direct, approach to measuring the health impact may be to measure the “toxicity” of the disinfected water using a range of appropriate bioassays. Traditionally, only cytotoxicity and DNA damage are assessed when studying DBPs in vitro; however, significant sample concentration (up to 1000 times) is required to elicit any acute responses. In contrast, Farré et al. (2013) have proposed using more sensitive early warning signs, such as the onset of repair and defence mechanisms in cells, rather than assays that are based on the ultimate manifestation of effect (e.g. mutagenicity). While the latter are relevant for a comprehensive human health risk assessment, the adaptive cellular stress responses may be superior for monitoring purposes, as they are activated before actual harm occurs but still demonstrate the presence of the associated chemical stressors. Given their sensitivity to DBPs, the bioluminescence inhibition test and the induction of oxidative stress were recommended as indicator tests for monitoring purposes.
However, in the short term, the aforementioned approach will not assist in setting operational targets for treated water that would allow utilities to improve management of their treatment processes, although it could provide an appropriate auditing tool. In the future, identification of a correlation between water quality parameters and a “toxicity” measure of disinfected water may highlight the need for alternative or advanced treatment to remove specific precursors and prevent formation of the DBPs of greater health impact.
Whilst disinfection has delivered the most significant improvement in the public health performance of drinking water supplies by preventing disease caused by pathogenic microorganisms, potential chronic health impacts that may be associated with the consumption of disinfected water require health regulators and water utilities to be vigilant in ensuring the overall safety of drinking water. Current regulations that limit specific DBP compounds are not able to adequately account for the potential health impacts, and identification of the specific compounds responsible for toxic effects is difficult. Improved approaches that reduce the potential health effects are required, including the use of more comprehensive measures of the DBPs that are formed, such as AOX.
However, a more effective approach would be to establish regulations that encourage changes to operational practices to reduce overall health risk. Prevention of DBP formation by regulating for greater removal of the key DBP precursors, NOM and bromide, prior to disinfection should be considered. These water quality targets could include specific DOC limits or limits based on specific organic character, such as UV absorbance or fluorescence, which are more representative of the potential DBP formation than DOC concentration alone. This approach could be implemented in the short term in conjunction with measurement of total DBP formation, by measuring AOX, rather than a wide range of specific DBP compounds.
In the future, the development and application of appropriate bioassays that directly measure the impact of water treatment and disinfection on key aspects of health would enable selection of appropriate treatment to minimise DBP formation.
The authors would like to thank Andrew Humpage, Senior Toxicologist, SA Water Corporation for his critical review and comments.
Mary Drikas | Mary is the Manager of Water Science at SA Water Corporation. She has been leading water treatment studies at SA Water since 1987, has over 35 years’ experience in this field and has co-authored over 100 scientific papers. She was also the Program Leader of the Water Technology Program nationally within the Cooperative Research Centre for Water Quality and Treatment, from July 1995 to June 2008.
Rolando Fabris | Rolando is a Senior Scientist with SA Water Corporation, with 20 years of water industry experience. He is a specialist in water treatment processes and characterisation of natural organic matter (NOM), but has also been involved in online monitoring, distribution systems and disinfection by-product research. Rolando has been a regular presenter at national and international water conferences since 2003 and has over 50 papers in peer-reviewed journals.