Applying Quality Methods to Water Treatment Processes & Problem Solving

Typically, commercial water systems are subject to considerable variation. Makeup water characteristics can change over time; the abruptness and degree of change depend on the source of the water. Water losses from a recirculating system, changes in production rates, and changes in chemical feed rates all introduce variation into the system and thereby influence the ability to maintain proper control of the system.

Other variables inherent in such systems include:

  • water flow/velocity
  • water temperature
  • process temperature/atmospherics
  • process demands
  • evaporation rates
  • operator skill/training
  • water characteristics (suspended solids, hardness, pH swings)
  • treatment product quality

Every industrial water system is unique, not only in the production operations it supports and the sources of water it receives, but also in the degree of inherent variation encountered due to the factors listed above. While a very sensitive treatment program that must operate within a narrow control range may be suitable for one system, another system requiring the same degree of protection may be incapable of maintaining the required control. Consequently, inferior results must be accepted unless the system is improved to support the sensitive program.

In operating systems, proper treatment of water supply, boiler, cooling, and effluent waters often requires constant adjustment of the chemistry to meet the requirements of rapidly changing system conditions. A well-designed program is essential to maintaining proper control. The program should include proper control limits and the ability to troubleshoot problems that interfere with control of water chemistry. Success in troubleshooting depends on the knowledge, logic, and skills of the troubleshooter. In order to improve operations, it is necessary to recognize the importance of continuous improvement and to be familiar with the tools and procedures that support this effort.

Adequate and reliable data are essential if variation in a system is to be measured and reduced. Specialized software can assist efforts to manage, summarize, and use data effectively. Process data can be stored in a database and retrieved and analyzed as needed in a variety of formats. Live monitoring provides nearly instantaneous access to many months or years of process data. Using these data, one can graph and analyze the process in a variety of formats, such as statistical process control (SPC) charts, trend analysis, and histograms. The operator is able to troubleshoot the system based on these analyses without spending large amounts of time manually researching and analyzing the data.
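As a sketch of the kind of summary such software can produce automatically, the following snippet bins a set of logged feedwater hardness readings (hypothetical values) into a simple text histogram using only the Python standard library:

```python
# Minimal sketch with hypothetical data: summarize logged feedwater
# hardness readings as a text histogram, binned to the nearest 0.5 ppm.
from collections import Counter

readings_ppm = [0.8, 1.1, 0.9, 1.4, 2.2, 1.0, 0.7, 1.3, 1.9, 1.1]

# Bin each reading to the nearest 0.5 ppm and count occurrences per bin.
bins = Counter(round(r * 2) / 2 for r in readings_ppm)

for level in sorted(bins):
    print(f"{level:4.1f} ppm | {'#' * bins[level]}")
```

A histogram like this makes the pattern of variation visible at a glance; a reading far from the main cluster (such as the 2.2 ppm value here) stands out as a candidate for investigation.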


Although the performance of a process varies from day to day, the average performance and the range of variation are fairly constant over time. This level of performance is inherent in the process and is provided for in the system design. Quality control limits identify the accepted average and accepted range of variation in, for example, feedwater hardness. These limits are often adopted as the standard of performance. Sometimes, performance falls outside the accepted, or standard, range of variation. The goal of problem solving within quality control limits is to reestablish performance within the standard.

This involves the following steps:

  • understanding the process
  • product knowledge (materials)
  • detecting the change (sporadic spike)
  • identifying the cause of the change
  • taking corrective action to restore the status quo
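The accepted average and accepted range of variation referred to above can be derived directly from historical data. The following sketch, using hypothetical feedwater hardness values, computes both with the Python standard library:

```python
# Minimal sketch with hypothetical data: derive the accepted average
# and observed range of variation for feedwater hardness.
import statistics

hardness_ppm = [1.0, 1.2, 0.9, 1.1, 1.3, 1.0, 0.8, 1.1, 1.2, 1.0]

average = statistics.mean(hardness_ppm)           # accepted average
low, high = min(hardness_ppm), max(hardness_ppm)  # accepted range

print(f"average = {average:.2f} ppm, range = {low:.1f}-{high:.1f} ppm")
```

Once adopted as the standard, these figures define "normal" performance; a reading outside the range signals the kind of sporadic change the steps above are meant to detect and correct.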


Problem solving in quality improvement can have an even greater impact. The goal of quality improvement is to reject the status quo as the standard and reach a level of performance never before achieved. This level, the "new quality control," represents the achievement of lower costs and/or better performance. For example, significantly lower feedwater hardness decreases scaling potential and improves boiler reliability.

This step extends the scope of problem solving beyond the correction of obvious problems. While it is important to "make the system work," it is often more important to view the entire system to identify areas of potential improvement. Some systems are poorly planned; others have not been updated to keep pace with changing requirements and progressing technology. In either case, it is often the system that causes control and operational problems, not the people working within the system.

Quality Improvement Tools

While a proper mindset must exist for continuous improvement, certain problem solving procedures and tools can add structure and consistency to the effort. The following quality improvement tools provide the means to summarize and present meaningful data in a way that adds significance to the successful resolution of chronic problems.

Flow Diagrams. A flow diagram provides a graphic presentation of the steps required to produce a desired result. For example, this tool may be used to clarify the procedures used to regenerate a softener or the steps to be taken in the event of an upset in a cooling tower. Flow diagrams are used in problem solving to give all parties a common understanding of the overall process.

Brainstorming. In diagnosing a problem, new and useful ideas can result when all of the people familiar with the process meet to share their experiences and ideas. Possible causes are discussed and possible solutions are presented and evaluated.

Cause-Effect Diagrams. An important first step in quality improvement is the identification of the root causes of a problem. A cause-effect diagram provides an effective way to organize and display the various ideas of what those root causes might be.

Scatter Diagrams. A scatter diagram is useful in providing a clear, graphic representation of the relationship between two variables. For example, boiler feedwater iron levels might be plotted as a function of feedwater pH to confirm or rule out a cause-effect relationship.
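The relationship a scatter diagram shows can also be quantified. The following sketch, using hypothetical paired readings of feedwater pH and iron, computes a Pearson correlation coefficient from first principles:

```python
# Minimal sketch with hypothetical data: quantify the relationship
# between feedwater pH and iron with a Pearson correlation coefficient.
import math

def pearson(xs, ys):
    """Pearson correlation: covariance over the product of std deviations."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

ph   = [8.4, 8.6, 8.8, 9.0, 9.2, 9.4]
iron = [0.09, 0.08, 0.06, 0.05, 0.03, 0.02]  # ppm Fe

r = pearson(ph, iron)
print(f"r = {r:.2f}")  # strongly negative: iron falls as pH rises
```

A value of r near -1 or +1 supports a cause-effect hypothesis worth pursuing; a value near zero suggests the two variables should be ruled out as directly related.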

Pareto Analysis. Pareto analysis is a ranked comparison of factors related to a quality problem, or a ranking of the cost of various problems. It is an excellent graphic means of identifying and focusing on the vital few factors or problems.
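The ranked comparison behind a Pareto analysis is straightforward to compute. The following sketch, using hypothetical counts of causes of control excursions, ranks the causes and reports cumulative percentages:

```python
# Minimal sketch with hypothetical data: Pareto ranking of causes of
# control excursions, with cumulative percentage of total occurrences.
causes = {
    "pump feed failure": 14,
    "makeup hardness swing": 9,
    "probe fouling": 4,
    "operator test error": 2,
    "other": 1,
}

total = sum(causes.values())
ranked = sorted(causes.items(), key=lambda kv: kv[1], reverse=True)

cumulative = 0
for cause, count in ranked:
    cumulative += count
    print(f"{cause:24s} {count:3d}  {100 * cumulative / total:5.1f}%")
```

In this hypothetical table, the top two causes account for over three quarters of all excursions; these are the "vital few" on which improvement effort should be focused.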

Meaningful Data Collection. Meaningful collection of data and facts is fundamental to every quality improvement effort. Quality improvement is an information intensive activity. In many cases, problems remain unsolved for long periods of time due to a lack of relevant information. A good data collection system must be carefully planned in order to provide the right information with a minimum of effort and with minimal chance of error.

In order to plan for data collection, it is necessary to identify potential sources of bias and develop procedures to address them:

  • Exclusion bias. If a part of the process being investigated has been left out, the result will be biased if the data is intended to represent the entire process. For example, if data on attemperating water purity is not included in an evaluation of a steam turbine fouling problem, the cause could be missed.
  • Interaction bias. The process of collecting the data itself can affect the process being studied. For example, if an operator knows that cooling tower treatment levels are being monitored by the central laboratory, he may be more careful conducting his own tests.
  • Perception bias. The attitudes and beliefs of the data collectors can influence what they perceive and how they record it. If an operator believes that swings in steam header pressure are his responsibility, he may record that operation was normal at the time of boiler water carryover.
  • Operational bias. Failure to follow the established procedures is a common operational bias. For example, failure to cool a boiler water sample to 25 °C (77 °F) often leads to an erroneous pH measurement.

Graphs and Charts. Pictorial representations of quantitative data, such as line charts, pie charts, and bar graphs, can summarize large amounts of data in a small area and communicate complex situations concisely and clearly.

Statistical Process Control. Statistical process control (SPC) is the use of statistical methods to study, analyze, and control the variation in any process. It is a vehicle through which one can extract meaningful information about a process so that corrective action, where necessary, can be implemented. While a histogram is a pictorial representation of patterns of variation, SPC is used to quantify this variation and determine mathematically whether the process is stable or unstable, predictable or erratic.

With statistical process control, the actual historical data is used to calculate the upper and lower statistical limits as a guideline for future operation. Anything falling outside of the statistical limits is considered to be a special cause of variation requiring immediate attention. Of course, if the common causes of variation are excessive for either engineering or economic reasons, constant improvement to the process is necessary until the statistical limits are narrowed to the point of acceptability.

Data is key to success; without it, an operator is working blind. To solve a problem quickly and effectively, all stakeholders must be involved in the process, regardless of whether the issue lies in their own area. If you are in the process line, you are involved.