Leverage the Power of Information
Fragmented processes, disconnected systems lead to loss of quality
by Michael Gay
The proliferation of new information technologies has brought numerous benefits to the food processing industry, including improvements in overall productivity and efficiency. At the same time, the industry continues to experience major lapses in safety and quality, magnified in recent months by several highly publicized product recalls.
Fragmented processes and disconnected systems, each with its own data, are problems at many plants. Because these organizations rely on a paper-based approach, resources are overloaded trying to track safety and quality processes, while inconsistencies and waste abound.
Unconnected data sources and manually tracked quality processes lead to a lack of real-time information. Quality issues go unaddressed, and the root causes of quality deviations go unidentified. Additionally, critical production decisions are frequently based on assumptions instead of accurate and reliable information.
These operational deficiencies may be key contributors to recent recalls. They also point to a need for improved management of the entire supply chain, implementation of tighter control and monitoring, and delivery of real-time information on production processes. To achieve optimum performance, manufacturers need timely information about the production process so they can analyze and detect undesirable trends and take immediate corrective action when needed. Once best practices are defined, manufacturers can enforce them and, where possible, build quality directly into the solution so that the product is manufactured correctly the first time.
An integrated quality management tool can provide substantial dividends in that type of situation. Despite the natural hesitation of the industry to tinker with a proven, albeit cumbersome, paper-based system of quality management, the underlying benefits of this integrated solution have become too promising to ignore.
Step One: Turn Your Data Into Intelligence
At the core of a robust quality management system are tools that allow users to aggregate information from multiple applications and transform the data into highly visible, actionable, performance-oriented intelligence. Without defined measures and procedures that connect data sources, analyze performance, and enforce processes, the identification and correction of root causes is nothing more than guesswork.
A central component in an effective quality management strategy is the ability to gather and correlate information from multiple sources, allowing decision makers to see diverse views and maintain key relationships. Reports, key performance indicators (KPIs), and operational metrics can then be assembled quickly into dashboards so that performance can be measured throughout the facility.
By connecting disparate data sources, quality control staff can access information that provides exception alerts through live information connections, and managers can determine precisely where, when, and why mistakes are occurring. This real-time intelligence will automatically highlight exception conditions, missed targets, and plan deviations.
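At its core, this kind of exception alerting amounts to comparing each live metric against a defined target range. The sketch below illustrates the idea in Python; the KPI names, target ranges, and readings are hypothetical, not taken from any particular vendor's system.

```python
# Minimal sketch of KPI exception alerting. All metric names and
# thresholds below are invented for illustration.

# Target ranges for each KPI: (lower bound, upper bound).
KPI_TARGETS = {
    "first_pass_yield_pct": (97.0, 100.0),
    "oven_temp_c": (175.0, 185.0),
    "batch_cycle_min": (0.0, 42.0),
}

def find_exceptions(readings):
    """Return (kpi, value, low, high) for every reading outside its target range."""
    alerts = []
    for kpi, value in readings.items():
        low, high = KPI_TARGETS[kpi]
        if not (low <= value <= high):
            alerts.append((kpi, value, low, high))
    return alerts

# Example: one live snapshot from the plant floor.
snapshot = {"first_pass_yield_pct": 95.2, "oven_temp_c": 181.0, "batch_cycle_min": 44.5}
for kpi, value, low, high in find_exceptions(snapshot):
    print(f"ALERT: {kpi} = {value} outside target [{low}, {high}]")
```

A real dashboard layers visualization and notification on top, but the exception logic itself stays this simple: data plus a defined target equals an actionable alert.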
When these disparate data sources are connected, operators use the intelligence dashboards to make quality improvements. By putting data into context, operators can make process corrections in real time, instead of after the fact, resulting in significant improvements in output, yield, and first-pass quality.
Because dashboards display metrics in rich graphics, operators understand more quickly how to respond to the data. For example, a quick glance at a trend-oriented graphic, as compared to raw numbers, can provide powerful insight into performance history and status. Users can more effectively compare multiple data sources using the dimension of time or a production run.
At one large food processing operation, a manufacturing intelligence application allows the company to aggregate data from all of its control and historical systems. This provides KPI exception reporting and root cause analysis that can be shared in a Web portal. This capability presents one version of the truth for all quality information and provides a basis for better decision making. Operators can easily view, for example, how varying the mixers, ovens, or ingredient suppliers will impact the final product.
Production supervisors can build comparisons, assessing how batches are running at each site and comparing that information to other locations or to corporate standards. In this way, a plant can standardize the processes of a particular product at the optimal recipe and performance level.
This same manufacturer can apply manufacturing intelligence to determine, for example, why one machine requires more time to dry a product. By looking at multiple data sources—from the humidity in the air to changes in raw material—and using the advanced analytics to determine correlation, the manufacturer calculates and compares process trends over time. The manufacturer uses performance equations to derive information that was not obvious, helping the company determine the root cause of a problem.
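The correlation step described above can be sketched in a few lines: compute the Pearson correlation between two historical series, such as ambient humidity and dryer cycle time. The data values below are invented for illustration and do not come from the manufacturer's systems.

```python
# Hypothetical sketch of correlating dryer cycle time with ambient
# humidity across historical runs. All data values are invented.
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

humidity_pct = [38, 42, 45, 51, 55, 60, 63]   # ambient humidity per run
dry_time_min = [31, 33, 34, 37, 40, 43, 44]   # dryer cycle time per run

r = pearson(humidity_pct, dry_time_min)
print(f"correlation(humidity, dry time) = {r:.2f}")
```

A strong correlation like this one does not prove causation on its own, but it tells the quality team exactly where to look first when one dryer lags the others.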
Step Two: Define and Enforce Your Quality Process
Step two in improving food quality is ensuring that the product is repeatedly made with the same quality process. Using the correlated information gathered during step one, you can make improvements in the overall production process and establish best practices.
Once the best practice is defined, its use must be enforced so that product is created the same way every time. A procedural control application will ensure the use of best practices related to the raw materials, the processing equipment, and the manual and automatic operational procedures.
One food manufacturer applied a procedural control application to create manual spice kits. This control procedure combined manual operations and automatic weighing equipment to reduce quality-related expenses by more than 10% by enforcing specific requirements for ingredient type and amount, kit processing, and proper identification and tracking of the completed spice kits.
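The enforcement logic behind a procedural control like the spice-kit example can be sketched as a simple validation pass: every kit must contain exactly the specified ingredients, each within a weight tolerance. The ingredient names, target weights, and tolerance below are hypothetical.

```python
# Hedged sketch of procedural enforcement for a spice kit. The spec
# values and the 2% tolerance are invented for illustration.

KIT_SPEC = {"paprika_g": 120.0, "cumin_g": 45.0, "garlic_g": 30.0}
TOLERANCE = 0.02  # +/- 2% of the specified weight

def validate_kit(measured):
    """Return a list of violations; an empty list means the kit passes."""
    problems = []
    for ingredient, target in KIT_SPEC.items():
        if ingredient not in measured:
            problems.append(f"missing {ingredient}")
        elif abs(measured[ingredient] - target) > TOLERANCE * target:
            problems.append(f"{ingredient} out of tolerance: {measured[ingredient]}g")
    for extra in set(measured) - set(KIT_SPEC):
        problems.append(f"unexpected ingredient {extra}")
    return problems

print(validate_kit({"paprika_g": 119.0, "cumin_g": 45.5, "garlic_g": 30.2}))  # []
print(validate_kit({"paprika_g": 110.0, "cumin_g": 45.0}))
```

In a production system this check would be wired to the automatic weighing equipment so a kit cannot be released until the violation list is empty.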
Step Three: Apply Predictive Quality
The third step for improving food quality involves applying predictive quality. Predictive control technologies use advanced modeling techniques based on timely in-process measurements to simulate processes, run what-if scenarios, and determine how changes will impact output. Steady-state optimizations can also be performed by setting output-variable targets and letting the models determine the optimum input targets.
One of the world’s leading dairy producers wanted to increase throughput to handle growing raw milk supplies. The company looked for ways to improve the operating efficiency of existing evaporators and dryers before making new capital investments. The wealth of historical data for the wide product mix produced in the dryer allowed for creation of a model that accurately reflected moisture ranges for each product. The simulation was enhanced by installing an application that biased the prediction hourly with in-process testing data from the online grading analysis system.
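The hourly biasing step amounts to shifting the raw model prediction by the most recently observed prediction error. A minimal sketch, with invented numbers standing in for the grading-analysis data:

```python
# Sketch of hourly bias correction: adjust the model's moisture
# forecast by the latest observed error. All numbers are illustrative.

def biased_prediction(model_pred, last_measured, last_pred):
    """Correct the raw model output by the most recent prediction error."""
    bias = last_measured - last_pred
    return model_pred + bias

# Last hour: the model predicted 3.8% moisture but the online grading
# analysis measured 4.1%, so this hour's forecast shifts up by 0.3.
corrected = biased_prediction(model_pred=3.6, last_measured=4.1, last_pred=3.8)
print(f"bias-corrected moisture forecast: {corrected:.2f}%")
```

This simple feedback keeps the model anchored to reality even as raw materials and ambient conditions drift between full recalibrations.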
Audit results revealed that the predictive control solution exceeded its objective, reducing the variation in the chamber outlet temperature by approximately 43% and the sifter moisture by 52%. The dairy producer successfully increased the yield and quality of its nutritional, whole milk, and skim milk powders.
Bringing It All Together
Food manufacturers increasingly see the inherent value and tangible returns of integrated quality management systems. The information derived not only empowers employees with improved visibility, but also helps them correlate and trend data and achieve marked improvements through prediction and control—all of which help reduce risk and improve profitability. ■