The Evolution of UV
All technology must evolve over time as devices get smaller, more efficient and faster. Ultraviolet (UV) technology is no exception to this rule. When UV systems were first introduced for disinfection more than 20 years ago, they were large devices used in high-flow wastewater applications. In time, UV became accepted for drinking water uses, with systems built around large stainless steel chambers housing many UV lamps.
The drinking water industry today has access to more compact UV systems that use a minimal number of UV lamps in both municipal and residential applications. This is primarily due to the development of more powerful, more stable UV lamps, as well as UV reactors designed to be as hydraulically efficient as possible, with a uniform dose distribution throughout. This is the evolution of UV technology as we know it.
UV technology is now accepted as a primary disinfectant under drinking water regulations worldwide; however, it must undergo third-party validation in order to prove the system’s performance. The question is: If UV technology has evolved and continues to evolve, how can UV validation follow suit?
Third-party validation of a UV system is simply a test of the equipment to prove that it can provide sufficient disinfection according to the equipment rating. The test is conducted by adding a known concentration of a challenge microorganism to water of known quality, passing it through the UV disinfection system at a predetermined flow rate, and measuring how many of those microorganisms remain viable in the effluent. This is called a biodosimetry test or bioassay.
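The core measurement behind a bioassay is a log-inactivation calculation: the log₁₀ reduction in viable challenge organisms across the reactor. A minimal sketch in Python, using hypothetical influent and effluent plaque counts (the function name and the example counts are illustrative, not from any standard):

```python
import math

def log_inactivation(influent_pfu_per_ml: float, effluent_pfu_per_ml: float) -> float:
    """Log10 reduction of viable challenge organisms across the UV reactor."""
    return math.log10(influent_pfu_per_ml / effluent_pfu_per_ml)

# Hypothetical counts: 1e6 PFU/mL entering, 1e3 PFU/mL leaving
# -> a 3-log (99.9%) inactivation of the challenge organism
print(log_inactivation(1e6, 1e3))  # 3.0
```

Comparing the measured log inactivation against the challenge organism's known dose-response curve is what lets the test back-calculate the dose the reactor actually delivered.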
The challenge microorganism is crucial in this analysis with many parameters to take into consideration. The challenge microorganism must be:
- At least as resistant to UV light as Cryptosporidium (a targeted pathogen for UV inactivation credits);
- Nonpathogenic in nature as it is used for testing purposes;
- Easy to produce with high titers of at least 10⁹ PFU/mL; and
- Stable when added to various conditions of test water. The water can be from a surface or groundwater source with UVT inhibitors added.
Traditionally, the challenge microorganism of choice in North America has been MS2 bacteriophage. MS2 has long been used because it was found to have a closer UV response to Cryptosporidium than any other challenge microorganism available. The downside of MS2 is that it is a great deal more resistant to UV than Cryptosporidium, causing an overestimation of the dose level needed for inactivation when it is used for bioassay purposes.
Using MS2 for UV validation purposes will ultimately cause an increase in system size and, in turn, an increase in UV equipment cost. Identifying a surrogate challenge microorganism with a closer response to Cryptosporidium than MS2 could greatly decrease the amount of equipment necessary for drinking water disinfection needs.
There are many alternative challenge microorganisms that have been tested in UV validation studies with the same goal in mind: greater system capacity with less equipment.
Less equipment not only saves cost; it also occupies a smaller footprint, allowing existing water treatment facilities to simply modify their treatment trains rather than expand.
One particular challenge microorganism that has had great success is called T1 bacteriophage. This phage has been found to resist UV disinfection in a manner more similar to Cryptosporidium than MS2. MS2 is accepted for use in UV validation in the U.S. Environmental Protection Agency’s 2006 UV Disinfection Guidance Manual (DGM), but they also make room for alternate challenge microorganisms.
According to the UV DGM 2006 version, “male-specific-2 bacteriophage (MS2) phage and B. subtilis spores historically have been used for validation testing to receive treatment credit for Cryptosporidium and Giardia. Because their UV resistance is notably greater than that of Cryptosporidium and Giardia, other, more sensitive microorganisms such as T1 and T7 phage are gaining favor.”
The misconception in the drinking water industry, especially in Canada, is that a 40 mJ/cm² dose is equivalent to a 3-log inactivation credit of Cryptosporidium. That rationale leaves out a great deal of information. The reported dose needed for a specific log inactivation of Cryptosporidium depends heavily on the challenge microorganism used in the validation of the UV equipment. Traditionally, the reported dose of 40 mJ/cm² was equivalent to a 3-log inactivation credit of Cryptosporidium, but only when MS2 is the challenge microorganism.
If a UV manufacturer were to use T1 phage as its challenge microorganism, a dose of only 15 mJ/cm² would be needed to claim a 3-log Cryptosporidium credit, a very big difference. This dose difference can raise a system's rating by almost two-thirds. Table 1 comes directly from the UV DGM 2006 document. Notice the reported dose for each challenge microorganism.
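The "almost two-thirds" figure follows directly from the two reported doses above. A quick sketch of the arithmetic (the 40 and 15 mJ/cm² values come from the text; the framing as a fractional reduction is a simplification):

```python
ms2_dose = 40.0  # mJ/cm², reported 3-log Crypto dose when validated with MS2
t1_dose = 15.0   # mJ/cm², reported 3-log Crypto dose when validated with T1

# Fractional drop in the required validated dose
reduction = 1 - t1_dose / ms2_dose
print(f"Required dose drops by {reduction:.1%}")  # 62.5%, i.e. almost two-thirds
```

To first order, a proportionally smaller validated dose requirement translates into proportionally less installed UV equipment for the same flow, which is the cost argument the article makes next.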
Imagine an application such as New York City, which needs to treat almost 2.5 billion gal of water a day. If the amount of equipment could be reduced by two-thirds, the cost savings are easy to imagine. And do not just think about the equipment; consider the building size, power costs and energy savings as well.
When you think of the evolution of UV technology, the reduction in lamp count and reactor size comes to mind, and smaller is better: smaller is more cost-effective and economical. If you really break it down, the evolution of UV validation goes hand in hand with the evolution of UV technology, something to keep in mind when reviewing how a UV system is validated.