by Greg Burnham, PhD, Donald W. Schaffner, PhD, and Steven C. Ingham, PhD
Predictive food microbiology, a well-established subdiscipline of food microbiology used for nearly 100 years, is reemerging. Its progress and impact on food safety practices and hazard analysis and critical control point (HACCP) systems will require the cooperation of industry, academia, and regulatory agencies.
About a century before Svante Arrhenius and Jan Belehrádek—the early fathers of predictive microbiology—were considering the best mathematical approach to quantifying microbial behavior, the industry that spurred their continuing debate was born. Early in the 1800s, Nicolas Appert had discovered that food heated in sealed containers would not spoil during extended storage. His discovery earned Appert a large cash award and allowed for the feeding of Napoleon’s vast armies. Appert didn’t understand why food spoiled, and the causes of spoilage remained unknown until the discoveries of Louis Pasteur some 50 years later.
Predictive microbiology considers such factors as bacterial heat resistance, the heat-transfer properties of food, and time/temperature history. Ever since the important scientific and technological discoveries of Appert and others, the canning—or more appropriately, the thermo-stabilization—industry has employed predictive microbiology to ensure the quality and safety of its products. Today, nearly 100 years after Arrhenius and Belehrádek, we have a well-established form of thermal processing for low-acid foods (pH > 4.6) that is accepted by regulatory authorities.
The 12-D process, as it is often called, demonstrates the benefits of predictive microbiology in action. This temperature-specific process is based on assumptions of first-order microbial inactivation kinetics and a decimal reduction time (D-value). It is intended to achieve a 12-D or 12 log10 reduction of the most heat-stable microorganism capable of causing human illness (usually Clostridium botulinum spores) or spoilage of the product under normal storage conditions. For example, if the time (D-value) in minutes at 250°F (121°C) for the inactivation of C. botulinum spores is 0.2 (1-D or 1 log10 reduction), then the 12-D (12 log10 reduction) equivalent would be 2.4 minutes.
Because initial spore levels cannot be adequately determined for each container of food, the 12-D process offers a degree of overkill that reduces potential risk to an acceptable level. For example, if we assume a can of food initially contains 1,000 (10^3) C. botulinum spores, a 12-D process will result in a 10^9-fold risk reduction, resulting in a one-in-a-billion chance of a can containing a surviving C. botulinum spore. This practice has been business-as-usual for several decades now and, during this time, the low-acid canned food industry has achieved an enviable safety record.
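The D-value arithmetic behind the 12-D example can be sketched in a few lines. This is an illustration only: the 0.2-minute D-value and the 1,000-spore initial load are the hypothetical figures from the text, not process specifications.

```python
# Sketch of the D-value (decimal reduction time) arithmetic from the
# 12-D example above. Values are illustrative, not process specs.

def process_time(d_value_min, log_reductions):
    """Time at the reference temperature for the target log10 reduction."""
    return d_value_min * log_reductions

def expected_survivors(initial_count, log_reductions):
    """Expected surviving population after the given log10 reduction."""
    return initial_count / 10 ** log_reductions

# C. botulinum spores at 250 F: D = 0.2 min, so a 12-D process takes 2.4 min
print(round(process_time(0.2, 12), 2))   # 2.4
print(expected_survivors(1e3, 12))       # 1e-09, i.e. one chance in a billion
```

The one-in-a-billion figure is simply the initial load (10^3) divided by the 12-log reduction (10^12).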
Fast forward to the late twentieth century. The shift in predictive microbiology is toward modeling the growth and survival of microorganisms rather than inactivating them. The mathematical methodology used to describe these biochemical processes has also evolved, and many methods, some of which are quite complex, have been described. In 1993, R.C. Whiting and R.L. Buchanan proposed the further classification of these mathematical models as primary, secondary, or tertiary; this serves as the framework for understanding the basic structure of the predictive microbiology software packages available today.
Primary models describe microbial response (e.g., lag phase duration, growth rate, inactivation rate) to a specific condition or conditions over time. Secondary models describe microbial response over a range of primary model conditions, while tertiary models are an assembly of primary and/or secondary models into an end-user software package. Some examples of primary models include the Gompertz function, the Baranyi model, the Buchanan three-phase linear model, McKellar’s heterogeneous population model, and Natick’s quasi-chemical model. These methods follow similar approaches; for example, most predictive microbiology tools used in the food industry are kinetic rather than probabilistic and empirical (or semi-mechanistic) rather than completely mechanistic. Many excellent reviews on predictive microbiology are available.
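To make the primary-model idea concrete, the following is a minimal sketch of one widely cited primary model, the Gompertz function in its common reparameterization for bacterial growth (asymptote, maximum growth rate, and lag phase duration as parameters). The parameter values below are hypothetical, chosen only to illustrate the sigmoidal shape.

```python
import math

def gompertz_log_growth(t, A, mu_max, lag):
    """Reparameterized Gompertz primary model: log10(N/N0) at time t (h).
    A      : asymptotic log10 increase
    mu_max : maximum growth rate (log10 units per hour)
    lag    : lag phase duration (hours)
    """
    return A * math.exp(-math.exp((mu_max * math.e / A) * (lag - t) + 1.0))

# Hypothetical parameters, for illustration only: near zero during lag,
# then sigmoidal growth toward the asymptote A.
for t in (0, 6, 12, 18, 24):
    print(t, round(gompertz_log_growth(t, A=6.0, mu_max=0.5, lag=4.0), 2))
```

Secondary models would then describe how A, mu_max, and lag vary with temperature, pH, or water activity, and a tertiary tool packages both layers behind a user interface.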
The United States Department of Agriculture’s (USDA) Agricultural Research Service’s Pathogen Modeling Program (PMP) is probably the most recognized predictive microbiology tool in the United States. The PMP is a tertiary model that can be used to predict the growth or inactivation of a number of foodborne pathogens exposed to combinations of specific environmental (temperature, pH, sodium nitrite concentration, and so on) or processing (heat/irradiation) conditions. Most PMP predictions, however, are based on microbial responses observed experimentally in sterile laboratory growth media rather than in a specific food, a shortcoming identified in reviews of predictive microbiology. The PMP has other limitations as well: most of its models are isothermal, and it can be difficult to use and interpret for those without training in the field.
A similar tool, the Institute of Food Research’s (Norwich, U.K.) ComBase Predictor (www.combase.cc/), is also based on microbial behavior in liquid microbiological media and can be difficult to use and interpret. The ComBase database, however, offers an extensive resource for experimentally observed microbial responses in food environments. A recently developed tertiary model, THERM v.2 (Temperature History Evaluation of Raw Meat, www.meathaccp.wisc.edu/THERM/calc.aspx), has addressed some of these limitations for predicting Escherichia coli O157:H7, Salmonella serovars, and Staphylococcus aureus behavior in raw beef, pork, and poultry.
This tool is only applicable, however, to raw meat products that contain no ingredients that might inhibit pathogen growth, such as salt or sodium nitrite. Many other tools are available to predict microbial responses in food, including the Seafood Spoilage and Safety Predictor, made available by the Danish Institute for Fisheries Research and the Technical University of Denmark, which can predict the shelf life of seafood at constant or changing temperatures.
Validating HACCP Systems
When using predictive microbiology tools to support HACCP systems, predictions of microbial behavior made by tools must be validated with experimental observations of microbial behavior in the given food system. Although this validation has its own unique set of inherent difficulties, it is important to make these observations whether one is validating predictions from food-specific tools, like THERM v.2, or predictions from the often more conservative laboratory media-based tools.
HACCP arrived in the early 1960s during a collaboration between the Pillsbury Company, U.S. Army Natick Laboratories, and the U.S. Air Force Space Laboratory Project Group, in cooperation with the National Aeronautics and Space Administration, to develop rations for the U.S. space program. HACCP as a widely accepted food safety system gained momentum when the National Academy of Sciences published “An evaluation of the role of microbiological criteria for foods and food ingredients” in 1985. In 1988, the National Advisory Committee on Microbiological Criteria for Foods published their HACCP principles. These principles included conducting a hazard analysis, establishing critical control points, setting critical limits, monitoring those limits, establishing corrective actions for deviations, verifying that the system is working, and documenting all appropriate procedures and records.
The first regulatory mandates for HACCP came from the Food and Drug Administration (FDA) in their low-acid canned food regulations and seafood HACCP regulations. But HACCP truly arrived in the food industry in 1996, when the USDA adopted the “Pathogen Reduction; HACCP Systems; Final Rule,” which required that all meat and poultry processors use HACCP as their main food safety system.
Predictive microbiology tools are useful in several parts of HACCP systems, including in the areas of conducting a thorough hazard analysis, providing scientifically valid information in establishing critical limits at critical control points, and evaluating system deviations (corrective actions). Let’s consider some examples. The first two are brief and hypothetical, but they help illustrate predictive microbiology’s potential influence on HACCP systems. The third example is a more detailed description of an actual process deviation in which the use of predictive microbiology tools might have reduced the economic burden.
In a hazard analysis of the production of chicken cordon bleu, Salmonella associated with the raw chicken is a hazard reasonably likely to occur. During the preparation step, the raw chicken is exposed to temperatures up to 55°F (12.8°C) for less than four hours. Both the PMP and THERM estimate that less than 30% of the lag phase duration will elapse during a four-hour period at this temperature. Further, even if the lag phase is disregarded, the time needed for Salmonella populations to increase by 0.3 log colony forming units (CFU)—one doubling or generation time—is greater than six hours at this temperature. Applying predictive microbiology in this example provides evidence that this step in the hazard analysis should not be regarded as a critical control point.
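The arithmetic in this example is easy to reproduce. The 13.5-hour lag and six-hour generation time below are hypothetical values chosen only to be consistent with the figures quoted in the text (under 30% of lag consumed in four hours; more than six hours per doubling at 55°F); a real analysis would take these from the PMP or THERM.

```python
# Sketch of the lag-fraction and generation-time arithmetic in the
# chicken cordon bleu example; parameter values are hypothetical.

def lag_fraction_used(exposure_h, lag_h):
    """Fraction of the lag phase consumed by an isothermal exposure."""
    return exposure_h / lag_h

def log_increase(exposure_h, generation_time_h):
    """log10 CFU increase ignoring lag; one doubling is about 0.301 log10."""
    return 0.301 * exposure_h / generation_time_h

print(round(lag_fraction_used(4, 13.5), 2))  # 0.3  -> under 30% of lag elapsed
print(round(log_increase(4, 6), 2))          # 0.2  -> <0.3 log even with no lag
```

Either way the calculation is run, the exposure falls short of one doubling, which supports the conclusion that the step need not be a critical control point.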
In establishing critical limits for a ready-to-eat (RTE) deli sandwich prepared ahead of time by hand, the question is what cold-holding temperature will allow a maximum eight-hour holding time before consumption that can be justified by scientific information rather than current regulatory guidance? The FDA’s Food Code allows for time only (four hours, no temperature control) as a public health control of RTE foods intended for immediate consumption. Since our RTE deli sandwich is prepared by hand, the microbial hazard most likely to occur is the transfer of S. aureus.
Using the PMP, and assuming no lag phase, we can estimate that S. aureus populations held at an ambient temperature of 75°F (24°C) would increase by about 1.5 log CFU during the currently allowed four hours. If this population increase is acceptable based on existing guidance, then adjusting the cold-holding temperature to establish a longer storage time is possible. For example, we can obtain our eight-hour storage time by cold-holding our RTE deli sandwich at 67°F (19.5°C); if we cold hold at 60°F (15.5°C), we can stay below this potential level of S. aureus growth for up to 17 hours.
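The hold-time reasoning here can be sketched as a rate calculation. The growth rates below are hypothetical, back-calculated to match the PMP-derived figures quoted above (1.5 log in four hours at 75°F, eight hours at 67°F, 17 hours at 60°F); in practice the rates would come from the PMP itself.

```python
# Sketch of the cold-holding time calculation in the deli sandwich
# example; growth rates are hypothetical illustrations.

MAX_INCREASE_LOG = 1.5  # acceptable S. aureus increase, log10 CFU

# Exponential-phase growth rates in log10 CFU per hour, by holding temp (F)
rates = {75: 1.5 / 4, 67: 1.5 / 8, 60: 1.5 / 17}

def max_hold_hours(temp_f):
    """Holding time before the acceptable log10 increase is reached."""
    return MAX_INCREASE_LOG / rates[temp_f]

for t in (75, 67, 60):
    print(f"{t} F: {max_hold_hours(t):.1f} h")  # 4.0, 8.0, and 17.0 h
```

The design point is that the acceptable increase is fixed first, and the temperature/time pair is then chosen to stay under it.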
Another potential application for predictive microbiology tools, only recently considered, is reducing economic losses associated with the condemnation of foods exposed to short-term temperature abuse due to refrigeration failure at retail and food service operations. Estimated losses to a large retail or food service company may be tens of thousands of dollars each year. Most often, condemnation results from mechanical refrigeration failure that allows the product temperature to rise above the 41°F (5°C) cold-holding requirement enforced by many regulatory authorities. The primary reference for this cold-holding requirement is the FDA’s Food Code.
Another criterion often linked to this cold-holding requirement is that exposure of potentially hazardous foods—raw meat and poultry, for example—to an out-of-temperature condition should not exceed four hours, although this is not specifically detailed in the section of the Food Code covering cold holding of potentially hazardous food. The four-hour limit likely comes from another section of the Food Code that addresses the use of time only as a public health control rather than in conjunction with temperature. This section, however, applies specifically to RTE foods or to a working supply of raw foods just before cooking, both of which are intended for immediate consumption. Thus, the criterion does not apply to situations such as refrigeration failures, in which raw meat and poultry have been exposed to temperatures above 41°F (5°C) for any period of time.
Because the Food Code is written in a manner that provides inflexible limits for regulatory control, it does not offer the deviation guidance required to make appropriate disposition decisions in these out-of-temperature situations. Recent research suggests that the four-hour limit commonly imposed may be unnecessarily conservative for raw meat and poultry products, especially at the lower end of the temperature range. Using a predictive microbiology tool such as THERM v.2 to more accurately predict pathogen behavior in raw meat and poultry could drastically reduce the economic losses associated with condemnation of these temperature-abused foods—without compromising consumer safety.
Using data from a recent regulatory authority report on temperature-abused fresh raw meat and poultry items that resulted in the condemnation of several hundred dollars’ worth of product, we used THERM v.2 to evaluate the risk associated with the noted deviation. Several time and temperature measurements were available in the report. Internal (half-inch below the surface) product temperatures at two, four, eight, and 12 hours into the refrigeration failure were 42, 48, 60, and 38°F, respectively. The lower temperature limit for THERM v.2 is 50°F (10°C), so user-entered temperatures below this limit are calculated using the 50°F (10°C) lag phase duration and growth rate values, a conservative feature of the tool.
For all meat types—beef, pork, and poultry—THERM v.2 predicted that <70% of lag phase had elapsed for E. coli O157:H7, and <60% of lag phase had elapsed for Salmonella serovars. No lag phase duration or growth rate values are given for S. aureus at temperatures below 60°F; this pathogen did not grow during 24-hour experiments reported by Ingham et al. Therefore, THERM v.2 did not predict any lag phase elapsing for S. aureus during this out-of-temperature situation. Using predictive microbiology in this example may have reduced economic losses associated with this refrigeration failure.
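The bookkeeping a tool like THERM applies to a nonisothermal history can be sketched as a cumulative lag fraction: each time interval consumes a share of the lag phase equal to its length divided by the lag phase duration at that interval's temperature. The interval breakdown below is a simplification of the report's 2/4/8/12-hour readings, and the lag durations are hypothetical; THERM's values are pathogen- and product-specific, and it substitutes its 50°F values for any lower temperature, as noted above.

```python
# Sketch of cumulative lag-fraction accounting over a non-isothermal
# temperature history; lag durations below are hypothetical.

def cumulative_lag_fraction(history, lag_at):
    """history: list of (hours, temp_F) intervals; lag_at: temp_F -> lag (h).
    Growth phase is assumed not to begin until the total reaches 1.0."""
    return sum(hours / lag_at(temp) for hours, temp in history)

# Intervals loosely reconstructed from the report's readings
history = [(2, 42), (2, 48), (4, 60), (4, 38)]

def lag_hypothetical(temp_f):
    # Illustrative lag durations; temperatures below 60 F are clamped
    # to a single conservative 50 F value, mimicking THERM's lower limit.
    table = {50: 30.0, 60: 10.0}
    return table[60] if temp_f >= 60 else table[50]

frac = cumulative_lag_fraction(history, lag_hypothetical)
print(f"{frac:.0%} of lag phase elapsed")  # 67% of lag phase elapsed
```

Because the cumulative fraction stays below 100%, the model predicts that the pathogen never enters its growth phase, which is the logic behind the disposition argument above.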
Acceptance of Tools a Necessity
Using predictive microbiology tools in conjunction with managing HACCP systems seems to be a natural fit; however, because HACCP systems have regulatory oversight, regulators must accept predictive microbiology tools. The FDA, with HACCP oversight of the juice and seafood industries, has no formal policy statement on the use of predictive microbiology tools. FDA officials do, however, have a long history of predictive microbiology acceptance within the thermo-stabilization industry and have often used predictive models to make policy decisions.
The USDA, on the other hand, has offered its opinion on predictive microbiology tools, specifically in USDA Food Safety and Inspection Service (FSIS) Notice 25-05, the Listeria compliance guidelines, as well as in USDA FSIS, Appendix B to “Compliance Guidelines for Cooling Heat-treated Meat and Poultry Products (Stabilization).” In these documents, the USDA recognizes that predictive microbiology tools are beneficial in HACCP systems for hazard analysis, development of critical limits, and evaluation of process deviations. They also point out many drawbacks, however, such as the presence/growth of indigenous microbes that affect predictions, stress response reactions that are not properly addressed, and unknown biological variability.
The USDA still accepts predictive microbiology information as one—but not the sole—source of HACCP documentation. The way the model was developed, validated, and used to produce predictions must be considered, however. The USDA also encourages consulting an expert in predictive microbiology modeling to ensure appropriate use.
The USDA has just launched the Predictive Microbiology Information Portal (PMIP, http://portal.arserrc.gov/PMIPHome.aspx) to assist small and very small food companies in the use of predictive models and food microbiology information. The PMIP is especially useful for locating and retrieving predictive models and research data for use in HACCP systems.
State regulators are also accepting predictive microbiology information, along with expert consultations, to resolve many noncompliance issues. Cindy Klug, a meat scientist with the Wisconsin Department of Agriculture, says she is a firm believer in using validated science to determine food safety rather than the “we haven’t killed anyone yet” approach. She recalls that the first version of THERM, developed at the University of Wisconsin-Madison, was “a bit difficult to use and interpret,” but adds that the revised version is easier to use. Klug and Steven C. Ingham, PhD, a professor there, used the new online version of THERM to assist an establishment in determining carcass safety when its coolers went down during a 100°F heat wave.
“The plant owner had done a good job collecting cooler and carcass time and temperature data,” she says. “Professor Ingham plugged the data into THERM, which predicted that neither Salmonella nor E. coli O157:H7 had gone into growth phase. The plant owner printed the graph and presented it to his inspector as validation that the carcasses were still safe, even though a deviation from a critical limit had occurred.
“Using THERM was quick, based on scientifically validated parameters, and straightforward in its reported information. Any tool that works this well makes the regulatory/industry partnership less stressful, with both sides working toward a common goal of food safety, and should be used whenever possible.”
Predictive microbiology and HACCP have been intertwined from the start. With further development, refinement, and validation, we should soon see wider acceptance of validated predictive microbiology tools to enhance HACCP systems, thus ensuring the safety of the consumer and the nation’s food supply using 21st-century tools.
The authors would like to acknowledge the efforts of Christopher Doona, PhD; Cheryl Baxa, PhD; and Lt. Col. Timothy Stevenson, DVM, PhD for their review of, and contributions to, this article.
Dr. Burnham is veterinary liaison, combat feeding directorate, at the U.S. Army’s Natick Soldier Research, Development, and Engineering Center. Reach him at firstname.lastname@example.org. Dr. Schaffner is a professor and food science extension specialist at Rutgers, The State University of New Jersey. Reach him at schaffner@aesop.rutgers.edu. Dr. Ingham is a professor and food safety extension specialist at the University of Wisconsin-Madison. Reach him at email@example.com.
- Burnham GM, Ingham SC, Fanslau MA, et al. Using predictive microbiology to evaluate risk and reduce economic losses associated with raw meats and poultry exposed to temperature abuse. US Army Med Dep J. 2007;PB8-07-7/8/9:57-65.
- Doona CJ, Feeherry FE, Ross EW. A quasi-chemical model for the growth and death of microorganisms in foods by non-thermal and high-pressure processing. Int J Food Microbiol. 2005;100(1):21-32.
- Ingham SC, Fanslau MA, Burnham GM, et al. Predicting pathogen growth during short-term temperature abuse of raw pork, beef, and poultry products: use of an isothermal-based predictive tool. J Food Prot. 2007;70:1446-1456.
- McDonald K, Sun DW. Predictive food microbiology for the meat industry: a review. Int J Food Microbiol. 1999;52:1-27.
- McKellar RC, Lu X. Modeling Microbial Responses in Food. Boca Raton, Fla.: CRC Press; 2004.
- McMeekin TA, Olley JN, Ross T, et al. Predictive Microbiology: Theory and Application. Taunton, U.K.: Research Studies Press Ltd; 1993.
- National Academy of Sciences. An evaluation of the role of microbiological criteria for foods and food ingredients. Washington, D.C.: National Academies Press; 1985. Available at: www.nap.edu/openbook.php?isbn=0309034973. Accessed March 11, 2008.
- Peleg M. Advanced Quantitative Microbiology for Foods and Biosystems. Boca Raton, Fla.: CRC Press; 2006.
- United States Department of Agriculture, Food Safety and Inspection Service. Pathogen Reduction; Hazard Analysis and Critical Control Point (HACCP) Systems; Final Rule. 1996.