Incoming Quality Control of Water
Securing Water as an Essential Ingredient Rather than a Commodity
by Dan Kroll
Today’s food processing industries are heavily reliant upon water, both as an ingredient and as an integral part of their preparation and processing functions. While in some instances the water used is further processed and treated by the food manufacturer, in many cases this supply is obtained from local municipal sources and undergoes no further monitoring or processing beyond what is done by the local utility that supplies it. Unfortunately, in most cases the water provider performs little or no quality monitoring beyond the water treatment plant. The vast labyrinth of pipes known as the distribution system, which actually delivers the water, is for the most part unmonitored.
It is well known and widely accepted in the water supply industry that the distribution system is not secure. Due to the potential for backflow events, cross connections, corroding pipes, groundwater infiltration and biofilm events, the water contained in the distribution system is not immune to becoming recontaminated after leaving the treatment plant. An added danger is the potential for deliberate contamination by terrorists through so-called backflow attacks (see sidebar). Much attention has recently been placed on the vulnerability of U.S. drinking water supplies to assault by terrorists, and the fact that our water supply systems, as currently configured, are vulnerable to attack has been widely recognized. (Hickman, 1999; Hoffbuhr, 2002; OSTP, 2003; Kroll, 2006) While most supply sources are limited in their vulnerability due to the massive volumes of water involved, the distribution system remains a vulnerable and tempting target, as was clearly stated in a recent GAO report to Congress that listed the vulnerability of the distribution system to attack as the largest security risk to water supplies. (GAO, 2003) Terrorists could compromise a system at almost any point in the distribution network by introducing any one of a large number of possible threat agents via a backflow attack. Such attacks are not just theoretical; they have already been attempted. Here are a few examples:
- May 1983 – Israel uncovers Arab plot to poison Galilee water with “an unidentified powder.”
- February 2002 – Al Qaeda arrested with plans to attack U.S. embassy water in Rome with a cyanide-based compound.
- December 2002 – Al Qaeda operatives arrested with plans to attack water networks surrounding the Eiffel Tower neighborhoods in Paris.
- April 2003 – Jordan foils Iraqi plot to poison drinking water supplies from Zarqa feeding U.S. military bases along the Eastern desert.
- September 2003 – FBI bulletin warns of Al Qaeda plans found in Afghanistan to poison U.S. food and water supplies.
A system that could effectively detect such incursions into the distribution system, whether accidental or terror related, would be a valuable tool for enhancing the safety and quality of any products manufactured or processed with the water. The large diversity of potential agents that could find their way into the distribution system precludes monitoring for them on an individual basis. One approach that has been developed and has found success is bulk parameter monitoring combined with advanced chemometrics.
A Real Time Monitoring System
One such system makes use of five common bulk parameters that are monitored simultaneously in real time: pH, conductivity, total organic carbon (TOC), turbidity and residual chlorine. When measured in real time, these parameters can show considerable variability for a given water supply (see Figure 2). That is why such a system requires a baseline estimator that is sensitive to small perturbations and yet resilient enough not to alarm constantly on normal fluctuations.
In the system as designed, the signals from all five instruments are combined into a single value by an event monitor computer that runs the algorithm. The signal first passes through the crucial proprietary baseline estimator, and the deviation of the signal from the estimated baseline is derived. A gain matrix is then applied that weights the various parameters based on experimental data for a wide variety of probable threat agents. The magnitude of the resulting deviation signal is compared to a preset threshold level set by the operator; if the signal exceeds the threshold, the trigger is activated. Figure 3 shows the data from Figure 2 processed through the algorithm.
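The processing chain described above can be sketched in a few lines. This is a minimal illustration, not the vendor's implementation: the actual baseline estimator is proprietary, so an exponentially weighted moving average stands in for it here, and the gain-matrix weights and smoothing factor are invented for the example.

```python
import numpy as np

# Parameter order assumed throughout: pH, conductivity, TOC, turbidity,
# residual chlorine. GAIN values are illustrative only, not the real
# experimentally derived weights.
GAIN = np.diag([1.0, 0.8, 1.2, 0.9, 1.1])
ALPHA = 0.05  # smoothing factor for the stand-in EWMA baseline (assumed)

def trigger_signal(readings, baseline, threshold=1.0):
    """Process one 5-parameter sample; return (signal, new_baseline, triggered)."""
    readings = np.asarray(readings, dtype=float)
    deviation = readings - baseline              # deviation from estimated baseline
    weighted = GAIN @ deviation                  # weight parameters via gain matrix
    signal = np.linalg.norm(weighted)            # magnitude of the deviation vector
    new_baseline = (1 - ALPHA) * baseline + ALPHA * readings  # update baseline
    return signal, new_baseline, signal > threshold
```

Feeding the function a sample identical to the baseline yields a signal of zero, while a sample with a large shift in, say, conductivity and TOC drives the signal well past a threshold of 1 and activates the trigger.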
Even with extremely noisy data, the system does not trigger at a threshold level set at one (1). In other words, during normal operation with no threats present, the process deviation is not large enough to push the trigger signal above one (1) in this case.
However, when the data for a cyanide incursion at 1 percent of the LD50, approximately 2.8 mg/L, is superimposed on the system, the trigger level of 1 is easily exceeded (see Figure 4). Other contaminants exhibit similar results.
The deviation vector derived from the trigger algorithm contains significantly more information than is needed simply to trigger the system. The vector’s magnitude relates to concentration and to the trigger signal, while its direction relates to the characteristics of the agent. Because of this, laboratory agent data can be used to build a threat agent library of deviation vectors. A deviation vector from the water monitor can then be compared to the agent vectors in the library to see if there is a match within a tolerance, and the system can classify what agent is present (see Figure 5). Each vector has an angle in n-space that, from the research conducted so far, appears to be unique to the class of agent present. The fact that the direction of the vector is unique for a given agent type allows an algorithm to classify the cause of a trigger. When the event trigger is set off, the library search begins. The agent library is given priority and is searched first; if a match is made, the agent is identified. If no match is found, the plant library is searched, and the event is identified if it matches one of the vectors there. If no match is found in either library, the data is saved and the operator can enter an ID when one is determined. The agent library is provided with the system, and the plant library is learned on site.
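The two-stage library search can be illustrated with a direction-only comparison. As an assumption for this sketch, the "match within a tolerance" is expressed as an angular tolerance in degrees between the observed deviation vector and each stored signature; the library entries and tolerance value are invented for the example.

```python
import numpy as np

def classify(deviation, agent_lib, plant_lib, tol_deg=10.0):
    """Match a deviation vector against the libraries by direction.

    agent_lib and plant_lib map names to stored deviation vectors.
    The agent library is searched first, mirroring its priority.
    """
    def angle_deg(u, v):
        cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
        return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

    for lib in (agent_lib, plant_lib):           # agents get priority
        for name, vec in lib.items():
            if angle_deg(deviation, vec) <= tol_deg:
                return name
    return "unknown"                             # saved for later operator ID
```

Because only the direction is compared, a stronger dose of the same agent (the same vector scaled up) still matches its library entry, which is consistent with magnitude carrying concentration and direction carrying identity.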
The unknown alarm rate when the system is tracking real-world data is also quite low. The system is equipped with a learning algorithm, so as unknown alarm events occur over time, it can store the signature generated during each event. The operator can then go into the program, identify that signature and associate it with a known cause, such as the turning on of a pump or the switching of water sources. The next time that event occurs, it will be recognized and identified appropriately.
Over time, as the system learns, the probability of an unknown alarm that has not been previously encountered and identified will continue to decrease and will eventually approach zero. The probability of an unknown alarm due to a given event depends upon the frequency of that event and the time the algorithm has had to learn it. Events that occur frequently will be learned quickly, while rare or singular events will take longer to be learned and stored. This should result in a fairly rapid drop-off in the number of unknown alarms as common events are quickly learned.
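The store-then-label loop described above can be sketched as a small class. This is an illustrative reconstruction under assumptions, not the actual learning algorithm: stored signatures are matched by direction with an invented angular tolerance, and the operator labeling step is reduced to a single method call.

```python
import numpy as np

class EventLearner:
    """Stores unknown event signatures until an operator labels them."""

    def __init__(self, tol_deg=10.0):
        self.library = []                        # entries: [label, unit vector]
        self.tol = np.radians(tol_deg)           # angular match tolerance (assumed)

    def observe(self, deviation):
        """Return the label of a matching stored event, or store as unknown."""
        unit = deviation / np.linalg.norm(deviation)
        for label, vec in self.library:
            if np.arccos(np.clip(np.dot(unit, vec), -1.0, 1.0)) <= self.tol:
                return label
        self.library.append(["unknown", unit])   # saved pending operator ID
        return "unknown"

    def label_last_unknown(self, name):
        """Operator assigns a cause (e.g. 'pump start') to the newest unknown."""
        for entry in reversed(self.library):
            if entry[0] == "unknown":
                entry[0] = name
                return
```

On first occurrence an event alarms as unknown; once the operator labels it, any later deviation pointing in the same direction is recognized automatically, which is how frequent events are learned quickly while rare ones take longer.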
A number of similar multi-parameter measurement platforms, most without the addition of intelligent algorithms, have been evaluated for such applications by the EPA Environmental Technology Verification (ETV) program. See the full reports at http://www.epa.gov/etv/verifications/vcenter1-35.html. These systems appear to be a good choice for detecting water quality excursions that could be linked to water security events, and they offer a number of advantages. Chief among them is that the instruments are not new: they measure common, everyday parameters with which the average industry worker is quite familiar, adding a degree of comfort in operations not afforded by newer technologies. As existing technologies, these instruments have proven robust and dependable in prior field deployments. They also provide measurements that are of interest and use to water utility personnel and food industry process control engineers above and beyond their role as water security devices.
Process Improvement Capabilities
Through many years of experience, the best old hands at plant operations have developed “a sense” for knowing when something in the system is amiss. It can be a smell, a color, clarity (or lack thereof), a sound or just a tingling at the nape of the neck. One gains these senses only through extensive experience in a particular facility. With the shrinking workforce and the loss of institutional knowledge at many facilities, bulk parameter monitoring in the distribution system with interpretive algorithms has the potential to become an artificial “sense.” The system can quickly “learn” the quirks of the distribution system and have those quirks labeled by those with extensive experience, so that less experienced employees gain the benefit of that knowledge without having to wait five, 10 or more years. A good phrase to describe this knowledge base would be “institutional intuition.” (Englehardt, 2005; Kroll, 2006)
With the aging of the workforce and rapid employee turnover, “institutional intuition” risks quickly dying out. Algorithms could be a way to circumvent this loss of knowledge and to build a knowledge base where none previously existed. Patterns of different water quality profiles could be correlated to process or quality problems. This may be especially crucial in industries where the quality of the water used has a direct effect on finished product quality, such as the bottled beverage and brewing industries. These correlations could in turn allow improvements in system operation that may result in cost savings and will certainly result in a higher quality product being delivered to the consumer.
One of the largest advantages of this type of monitoring system is the multi-parameter array’s ability to detect such a wide variety of potential threat agents, from metals to organics to bio-agents. The ability to trigger on unique unknown events is also a major plus. A disadvantage is that some events occurring during normal operation may trigger an unknown alarm. This, however, can become an advantage if the information is used to build the institutional intuition discussed above. Nonetheless, this learning phase is not free and requires an investment of time and effort to investigate and classify these alarms so they can be placed into the database.
Many local utilities are in the process of establishing such monitoring stations in their distribution systems. One of the drawbacks for the utilities is site location: the deployment site needs adequate space for the instrumentation, a water supply to be tested, a drain, power and communications. Many water providers have a limited number of utility-owned sites that can be used for these purposes. There is thus an opportunity for collaboration in which the food processing plant provides the deployment site to the local utility in exchange for data sharing. While not a traditional arrangement, such a scheme could be mutually beneficial.
The described system makes use of an integrated array of robust, common water quality sensors coupled with interpretive algorithms to recognize and classify significant water quality deviations. Extensive in-house and third-party verification testing, as well as extensive deployment at field sites, has demonstrated the suite’s ability to fill the analytical gap that currently exists for distribution network monitoring and to serve as an early warning system in the water distribution network. Hopefully, the system’s unique ability to learn and classify will not only increase safety from terror-related events but will also evolve into an operational tool that finds everyday use in improving water quality operations and ensuring a better quality product for consumers.
- Allman, Timothy. 2003. Drinking water distribution system modeling for predicting the impact and detection of intentional contamination. Master’s thesis, Colorado State University, Department of Engineering, Fort Collins, Colorado.
- Englehardt, Terry. 2005. E-mail message to author, November 7.
- GAO. 2003. “Drinking Water Security: Experts’ Views on How Future Federal Funding Can Best Be Spent to Improve Security.” GAO-04-29.
- Hickman, Maj. Donald C., USAF, BSC. 1999. “A Chemical and Biological Warfare Threat: USAF Water Systems at Risk.” Counterproliferation Paper No. 3, USAF Counterproliferation Center, Air War College.
- Hoffbuhr, J.W. 2002. “Waterscape: An Executive Perspective. Water Follies.” Journal of the American Water Works Association 94:6.
- Kroll, Dan. 2006. Securing the Water Supply: Protecting a Vulnerable Asset. PennWell Publishing, Tulsa, OK.
- Office of Science and Technology Policy, The White House. 2003. The National Strategy for the Physical Protection of Critical Infrastructures and Assets.
Dan Kroll is chief scientist for Hach Homeland Security Technologies, a division of Hach Co. (Loveland, Colo.). He is also the author of the book “Securing Our Water Supplies: Protecting a Vulnerable Resource.” Reach him at 970-663-1377 x2637 or DKROLL@hachhst.com.