Maybe not so slightly pink: Properly cooked pork chops may still pose a Listeria and Salmonella threat to consumers

If you are eating leftover pork chops that have not been cooked well-done, you’re putting yourself at risk of Salmonella and Listeria exposure. While many individuals prefer to consume their pork medium, a new study published in Risk Analysis: An International Journal revealed that cooking pork chops to an acceptable temperature does not always completely eliminate pathogens, leaving surviving cells the opportunity to multiply during storage and harm consumers.

The study, “Impact of cooking procedures and storage practices at home on consumer exposure to Listeria monocytogenes and Salmonella due to the consumption of pork meat,” found that only pork loin chops cooked well-done in a static oven (the researchers also tested cooking on a gas stove top) completely eliminated the Listeria and Salmonella pathogens. The other levels of cooking, rare and medium, reached product temperatures of at least 73.6 degrees Celsius and reduced pathogen levels, but left behind a few surviving cells that could then multiply during storage before the meat was consumed.

It is generally believed that when meat is heat treated at 70 degrees Celsius for two minutes, a one-million-fold (6-log) reduction of E. coli, Salmonella, and Listeria is achieved and the meat is therefore free of pathogens and safe to eat. However, a report by the European Food Safety Authority revealed that more than 57 percent of Salmonella outbreaks in 2014 originated in the household/kitchen, and 13 percent were associated with inadequate heat treatment.

“The results of this study can be combined with dose response models and included in guidelines for consumers on practices to be followed to manage cooking of pork meat at home,” says Alessandra De Cesare, PhD, lead author and professor at the University of Bologna.  

In order to assess pathogen levels in cooked pork, the researchers, from the University of Bologna, the Institute of Food Engineering for Development and the Istituto Zooprofilattico delle Venezie, tested 160 packs of pork loin chops. The samples were experimentally contaminated with 10 million cells of L. monocytogenes and Salmonella to assess the reduction in pathogens after cooking, in accordance with the Food Safety and Inspection Service (FSIS) and British Retail Consortium (BRC) specifications (which require reductions of at least 100,000-fold and 1,000,000-fold, i.e. 5-log and 6-log, respectively). The samples were contaminated on the surface to mimic contamination during slaughter and cutting.
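To make those targets concrete, here is a minimal sketch (Python, not part of the study) that converts the 10-million-cell inoculum and the FSIS/BRC reduction requirements quoted above into log reductions and the number of cells that could still survive at each threshold.

```python
import math

inoculum = 10_000_000  # experimental surface contamination per sample, as described above

# FSIS and BRC specifications quoted above: at least 100,000-fold (5-log)
# and 1,000,000-fold (6-log) reductions, respectively.
for name, fold_reduction in [("FSIS", 100_000), ("BRC", 1_000_000)]:
    log_reduction = math.log10(fold_reduction)
    max_survivors = inoculum / fold_reduction
    print(f"{name}: {log_reduction:.0f}-log reduction -> up to {max_survivors:.0f} of the 10^7 cells survive")
```

Even a 6-log reduction of a 10-million-cell inoculum can leave on the order of 10 viable cells, which is why the storage phase matters.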

The samples were divided into groups to be cooked either on gas with a non-stick pan or in a static oven. In each setting, the pork chops were cooked to rare, medium, and well-done. For each cooking combination, 40 repetitions were performed for a total of 240 cooking tests.  

The researchers also interviewed 40 individuals between the ages of 20 and 60 to determine household consumer habits regarding doneness preferences. Prior published research was referenced to define meat storage practices and the probability that consumers store their leftovers at room temperature, store them in the refrigerator, or discard them immediately. Growth rate data for the pathogens at each storage temperature were obtained using the software tool ComBase.

The only cooking treatment able to completely inactivate the pathogens was oven well-done, which achieved a reduction of between one million and 10 million cells. Statistical analyses of the data showed significant differences related to the level of cooking and the cooking procedure. However, the researchers explained that factors such as moisture, water activity, fat levels, salts, carbohydrates, pH, and proteins can affect the effectiveness of the cooking treatment and, as a consequence, bacterial survival. These results emphasize the need to consider the form of pork being cooked (such as whole muscle versus ground), in addition to the final temperature necessary to inactivate pathogens.

The results show that a reduction of between one million and 10 million pathogen cells was achieved with all of the tested cooking treatments, with product temperatures always reaching 73.6 degrees Celsius or greater. However, according to the simulations using the obtained cell growth rates, the few surviving cells can multiply during storage, both in the refrigerator and at room temperature, reaching concentrations dangerous for both vulnerable and healthy consumers.

After storing leftovers, the probability of the pathogen concentration reaching 10 cells ranged between 0.031 and 0.059 for all combinations except oven well-done. Overall, the mean level of exposure to Listeria and Salmonella at the time of consumption was one cell per gram of meat. The results of this study can be incorporated into guidelines for consumers on practices to follow when cooking pork meat at home.
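As a rough illustration of why those few survivors matter, the sketch below applies simple exponential growth, the kind of rate one might pull from ComBase, to one surviving cell per gram. The growth rates and storage times are hypothetical placeholders, not values reported by the study.

```python
import math

def cells_after_storage(n0_per_g: float, mu_per_h: float, hours: float) -> float:
    """Exponential growth N(t) = N0 * exp(mu * t); lag phase ignored for simplicity."""
    return n0_per_g * math.exp(mu_per_h * hours)

n0 = 1.0  # mean exposure at consumption reported above: ~1 cell per gram

# Hypothetical maximum specific growth rates (per hour), for illustration only
scenarios = {
    "refrigerator (5 C) for 48 h": (0.03, 48),
    "room temperature (25 C) for 6 h": (0.60, 6),
}
for label, (mu, hours) in scenarios.items():
    print(f"{label}: ~{cells_after_storage(n0, mu, hours):.0f} cells/g")
```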

 

Cook it: Toxo in pork

Toxoplasma gondii is one of the leading foodborne pathogens in the United States.

The Centers for Disease Control and Prevention (CDC) reported that T. gondii accounts for 24% of deaths due to foodborne illness in the United States.

Consumption of undercooked pork products in which T. gondii has encysted has been identified as an important route of human exposure. However, little quantitative evaluation of risk due to different pork products as a function of microbial quality at the abattoir, during the production process, and due to consumer handling practices is available to inform risk management actions.

The goal of this study was to develop a farm-to-table quantitative microbial risk assessment (QMRA) model to predict the public health risk associated with consumption of fresh pork in the United States.

T. gondii prevalence in pigs was derived through a meta-analysis of existing data, and the concentration of the infectious life stage (bradyzoites) was calculated for each pork cut from an infected pig. Logistic regression and log-linear regression models were developed to predict the reduction of T. gondii during further processing and consumer preparation, respectively. A mouse-derived exponential dose-response model was used to predict infection risk in humans. The estimated mean probability of infection per serving of fresh pork products ranges from 3.2 × 10−7 to 9.5 × 10−6, corresponding to approximately 94,600 predicted new infections annually in the U.S. population due to fresh pork ingestion. Approximately 957 new infections per year were estimated to occur in pregnant women, corresponding to 277 cases of congenital toxoplasmosis per year due to fresh pork ingestion.
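The risk engine here is an exponential dose-response model fitted to mouse data. The sketch below shows the general shape of such a calculation; the r parameter, mean dose, and serving count are illustrative assumptions, not the study’s fitted values.

```python
import math

def p_infection(dose_bradyzoites: float, r: float) -> float:
    """Exponential dose-response: P(infection) = 1 - exp(-r * dose)."""
    return 1.0 - math.exp(-r * dose_bradyzoites)

# Illustrative placeholders only (the study derived these from data):
r = 1e-4                  # assumed per-bradyzoite probability of initiating infection
mean_dose = 0.05          # assumed mean surviving bradyzoites per cooked serving
servings_per_year = 2e10  # assumed annual fresh pork servings in the U.S.

risk_per_serving = p_infection(mean_dose, r)
print(f"risk per serving ~ {risk_per_serving:.1e}")
print(f"predicted annual infections ~ {risk_per_serving * servings_per_year:,.0f}")
```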

In the context of available data, sensitivity analysis suggested that cooking is the most important parameter impacting human health risk. This study provides a scientific basis for risk management and also could serve as a baseline model to quantify infection risk from T. gondii and other parasites associated with meat products.

Quantifying the risk of human Toxoplasma gondii infection due to consumption of fresh pork in the United States

Food Control, Volume 73, Part B, March 2017, Pages 1210–1222

Miao Guo, Elisabetta Lambertini, Robert L. Buchanan, Jitender P. Dubey, Dolores E. Hill, H. Ray Gamble, Jeffrey L. Jones, Abani K. Pradhan

http://www.sciencedirect.com/science/article/pii/S0956713516305825

Nerd alert: Risk assessment in Europe

The European Food Safety Authority says a special issue of the EFSA Journal presents the main outcomes of EFSA’s 2nd Scientific Conference “Shaping the Future of Food Safety, Together,” held in Milan on 14-16 October 2015.

The event was a unique opportunity for stakeholders in Europe’s food regulatory system – policy makers, risk assessors, scientists and NGOs – to identify future challenges for food safety risk assessment in Europe. Over 1000 delegates came together in what proved to be a stimulating and insightful debate on global food safety concerns. The discussions covered an impressive range of topics and provided inspiration for EFSA’s Strategy 2020. The conclusions will help EFSA and the wider risk assessment community to chart a course in food safety risk assessment in the coming years.

The special issue of the EFSA Journal reflects the conference’s three plenary and nine parallel sessions and is accompanied by a Foreword from EFSA’s Executive Director, Bernhard Url.

All the conference material that was published on the conference’s dedicated microsite will be archived on EFSA’s website. This includes the programme, webcasts, recordings and video clips which will continue to be publicly available and linked to the special issue of the EFSA Journal. 


Norovirus: Best way to assess risk?

The application of quantitative microbial risk assessments (QMRAs) to understand and mitigate risks associated with norovirus is increasingly common, given the high frequency of outbreaks worldwide.

A key component of QMRA is the dose–response analysis, which is the mathematical characterization of the association between dose and outcome. For norovirus, multiple dose–response models are available that assume either a disaggregated or an aggregated intake dose. This work reviewed the dose–response models currently used in QMRA, and compared predicted risks from waterborne exposures (recreational and drinking) using all available dose–response models.

The review found that the majority of published QMRAs of norovirus use the 1F1 hypergeometric dose–response model with α = 0.04, β = 0.055. This dose–response model predicted relatively high risk estimates compared to other dose–response models for doses in the range of 1–1,000 genomic equivalent copies. The difference in predicted risk among dose–response models was largest for small doses, which has implications for drinking water QMRAs where the concentration of norovirus is low.
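For readers who want to reproduce the comparison, here is a minimal sketch of the 1F1 hypergeometric (exact beta-Poisson) model, assuming the standard form P(infection) = 1 − 1F1(α, α + β, −dose) with the α = 0.04, β = 0.055 parameters cited above; the dose values are arbitrary points in the 1–1,000 GEC range discussed.

```python
from scipy.special import hyp1f1  # Kummer's confluent hypergeometric function 1F1

def p_infection(dose: float, alpha: float, beta: float) -> float:
    """Exact beta-Poisson dose-response: P = 1 - 1F1(alpha, alpha + beta, -dose)."""
    return 1.0 - hyp1f1(alpha, alpha + beta, -dose)

alpha, beta = 0.04, 0.055  # parameters of the commonly used norovirus model cited above

for dose in (1, 10, 100, 1000):  # genomic equivalent copies per exposure
    print(f"dose {dose:>4} GEC -> P(infection) = {p_infection(dose, alpha, beta):.2f}")
```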

Based on the review, a set of best practices was proposed to encourage the careful consideration and reporting of important assumptions in the selection and use of dose–response models in QMRA of norovirus.

Finally, in the absence of one best norovirus dose–response model, multiple models should be used to provide a range of predicted outcomes for probability of infection.

Comparison of risk predicted by multiple norovirus dose–response models and implications for quantitative microbial risk assessment

Nicole Van Abel, Mary E. Schoen, John C. Kissel, J. Scott Meschke

Risk Analysis, June 2016, DOI: 10.1111/risa.12616

http://onlinelibrary.wiley.com/doi/10.1111/risa.12616/abstract

Listeria and raw milk cheese: A risk assessment involving sheep

Semisoft cheese made from raw sheep’s milk is traditionally and economically important in southern Europe. However, raw milk cheese is also a known vehicle of human listeriosis and contamination of sheep cheese with Listeria monocytogenes has been reported.

In the present study, we have developed and applied a quantitative risk assessment model, based on available evidence and challenge testing, to estimate risk of invasive listeriosis due to consumption of an artisanal sheep cheese made with raw milk collected from a single flock in central Italy.

In the model, contamination of milk may originate from the farm environment or from mastitic animals, with potential growth of the pathogen in bulk milk and during cheese ripening. Based on the 48-day challenge test of a local semisoft raw sheep’s milk cheese we found limited growth only during the initial phase of ripening (24 hours) and no growth or limited decline during the following ripening period. In our simulation, in the baseline scenario, 2.2% of cheese servings are estimated to have at least 1 colony forming unit (CFU) per gram. Of these, 15.1% would be above the current E.U. limit of 100 CFU/g (5.2% would exceed 1,000 CFU/g). Risk of invasive listeriosis per random serving is estimated in the 10−12 range (mean) for healthy adults, and in the 10−10 range (mean) for vulnerable populations.

When small flocks (10–36 animals) are combined with the presence of a sheep with undetected subclinical mastitis, risk of listeriosis increases and such flocks may represent a public health risk.
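To see how per-serving risks in the 10−12 to 10−10 range arise from low contamination levels, here is a hedged sketch using an exponential dose-response in the spirit of the FAO/WHO model for L. monocytogenes. The serving size, contamination level, and r values are assumptions for illustration, not the paper’s inputs.

```python
import math

def p_invasive_listeriosis(dose_cfu: float, r: float) -> float:
    """Exponential dose-response: P(illness) = 1 - exp(-r * dose)."""
    return 1.0 - math.exp(-r * dose_cfu)

serving_g = 50.0               # assumed serving of semisoft cheese, grams
concentration_cfu_per_g = 1.0  # assumed contamination at consumption, CFU/g

# Assumed dose-response parameters for two sub-populations (illustrative only)
r_values = {"healthy adults": 2.4e-14, "vulnerable populations": 1.1e-12}

dose = serving_g * concentration_cfu_per_g
for group, r in r_values.items():
    print(f"{group}: risk per serving ~ {p_invasive_listeriosis(dose, r):.1e}")
```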

Risk assessment of human listeriosis from semisoft cheeses made from raw sheep’s milk in Lazio and Tuscany

Roberto Condoleo, Ziad Mezher, Selene Marozzi, Antonella Guzzon, Roberto Fischetti, Matteo Senese, Stefania Sette, Luca Bucchini

Risk Analysis, June 2016, doi:10.1111/risa.12649

http://onlinelibrary.wiley.com/doi/10.1111/risa.12649/abstract

Taking it further: Quantitative microbiological risk assessment and source attribution for Salmonella

The current issue of Risk Analysis contains several papers regarding QMRA and Salmonella.

In a recent report from the World Health Organisation, the global impact of Salmonella in 2010 was estimated to be 65–382 million illnesses and 43,000–88,000 deaths, resulting in a disease burden of 3.2–7.2 million disability-adjusted life years (DALYs).[3] Controlling Salmonella in the food chain will require intervention measures, which come at a significant cost but these should be balanced with the cost of Salmonella infections to society.[5]

Despite a wealth of published research relating to Salmonella, many countries still struggle to identify the best ways to prevent and control foodborne salmonellosis. Two questions are particularly important to answer in this respect: (1) What are the most important sources of human salmonellosis within the country? and (2) When a (livestock) source is identified as important, how do we best prevent and control the dissemination of Salmonella through that farm-to-consumption pathway? The articles presented in this series continue the effort to answer these questions and hence eventually contribute to reducing the disease burden of salmonellosis in humans.

Risk Analysis, 36: 433–436

Snary, E. L., Swart, A. N. and Hald, T.

http://onlinelibrary.wiley.com/doi/10.1111/risa.12605/abstract

Application of molecular typing results in source attribution models: the case of multiple locus variable number tandem repeat analysis (MLVA) of Salmonella isolates obtained from integrated surveillance in Denmark

Risk Analysis, 36: 571–588

de Knegt, L. V., Pires, S. M., Löfström, C., Sørensen, G., Pedersen, K., Torpdahl, M., Nielsen, E. M. and Hald, T.

http://onlinelibrary.wiley.com/doi/10.1111/risa.12483/abstract

Salmonella is an important cause of bacterial foodborne infections in Denmark. To identify the main animal-food sources of human salmonellosis, risk managers have relied on a routine application of a microbial subtyping-based source attribution model since 1995. In 2013, multiple locus variable number tandem repeat analysis (MLVA) replaced phage typing as the subtyping method for surveillance of S. Enteritidis and S. Typhimurium isolated from animals, food, and humans in Denmark.

The purpose of this study was to develop a modeling approach applying a combination of serovars, MLVA types, and antibiotic resistance profiles for Salmonella source attribution, and to assess the utility of the results for food safety decision-makers. Full and simplified MLVA schemes from surveillance data were tested, and model fit and consistency of results were assessed using statistical measures. We conclude that the loci schemes STTR5/STTR10/STTR3 for S. Typhimurium and SE9/SE5/SE2/SE1/SE3 for S. Enteritidis can be used in microbial subtyping-based source attribution models. Based on the results, we discuss how the discriminatory level of the subtyping method will often need to be adjusted to fit the purpose of the study and the available data. The issues discussed are also considered highly relevant when applying, e.g., extended multi-locus sequence typing or next-generation sequencing techniques.

Assessing the effectiveness of on-farm and abattoir interventions in reducing pig meat–borne Salmonellosis within E.U. member states

Risk Analysis, 36: 546–56

Hill, A. A., Simons, R. L., Swart, A. N., Kelly, L., Hald, T. and Snary, E. L.

http://onlinelibrary.wiley.com/doi/10.1111/risa.12568/abstract

As part of the evidence base for the development of national control plans for Salmonella spp. in pigs for E.U. Member States, a quantitative microbiological risk assessment was funded to support the scientific opinion required by the EC from the European Food Safety Authority. The main aim of the risk assessment was to assess the effectiveness of interventions implemented on-farm and at the abattoir in reducing human cases of pig meat–borne salmonellosis, and how the effects of these interventions may vary across E.U. Member States. Two case study Member States were chosen to assess the effect of the interventions investigated. Reducing both breeding herd and slaughter pig prevalence was effective in achieving reductions in the number of expected human illnesses in both case study Member States. However, there is scarce evidence to suggest which specific on-farm interventions could achieve consistent reductions in either breeding herd or slaughter pig prevalence.

Hypothetical reductions in feed contamination rates were important in reducing slaughter pig prevalence for the case study Member State where prevalence of infection was already low, but not for the high-prevalence case study. The most significant reductions were achieved by a 1- or 2-log decrease of Salmonella contamination of the carcass post-evisceration; a 1-log decrease in average contamination produced a 90% reduction in human illness. The intervention analyses suggest that abattoir intervention may be the most effective way to reduce human exposure to Salmonella spp. However, a combined farm/abattoir approach would likely have cumulative benefits. On-farm intervention is probably most effective at the breeding-herd level for high-prevalence Member States; once infection in the breeding herd has been reduced to a low enough level, feed and biosecurity measures would become increasingly effective.
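The 1-log/90% relationship quoted above is what one expects when risk is roughly proportional to dose at low exposures. A hedged sketch of that arithmetic, using an arbitrary low-dose infectivity and dose rather than the model’s values:

```python
# At low doses the dose-response is approximately linear: P ~ r * dose,
# so a 1-log (10-fold) drop in carcass contamination cuts expected illnesses by ~90%.
r = 1e-6             # assumed low-dose probability of illness per CFU (illustrative)
dose_before = 100.0  # assumed mean CFU ingested per exposure before intervention
dose_after = dose_before / 10.0  # 1-log decrease post-evisceration

relative_reduction = 1 - (r * dose_after) / (r * dose_before)
print(f"expected reduction in human illness: {relative_reduction:.0%}")  # -> 90%
```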

Toxo in meat – US edition

Toxoplasma gondii is a global protozoan parasite capable of infecting most warm-blooded animals. Although healthy adult humans generally have no symptoms, severe illness does occur in certain groups, including congenitally infected fetuses and newborns, and immunocompromised individuals such as transplant patients.

Epidemiological studies have demonstrated that consumption of raw or undercooked meat products is one of the major sources of infection with T. gondii. The goal of this study was to develop a framework to qualitatively estimate the exposure risk to T. gondii from various meat products consumed in the United States.

Risk estimates of various meats were analyzed by a farm-to-retail qualitative assessment that included evaluation of farm, abattoir, storage and transportation, meat processing, packaging, and retail modules. It was found that exposure risks associated with meats from free-range chickens, nonconfinement-raised pigs, goats, and lamb are higher than those from confinement-raised pigs, cattle, and caged chickens. For fresh meat products, risk at the retail level was similar to that at the farm level unless meats had been frozen or moisture enhanced.

Our results showed that meat processing steps such as salting, freezing, commercial hot air drying, long fermentation times, hot smoking, and cooking are able to reduce T. gondii levels in meat products, whereas nitrite and/or nitrate, spice, low pH, and cold storage have no effect on the viability of T. gondii tissue cysts. Raw-fermented sausage, cured raw meat, meat that is not hot-air dried, and fresh processed meat were associated with higher exposure risks compared with cooked and frozen meat.

This study provides a reference for meat management control programs to determine critical control points and serves as the foundation for future quantitative risk assessments.

Qualitative assessment for Toxoplasma gondii exposure risk associated with meat products in the United States

Journal of Food Protection, Volume 78, Number 12, December 2015

Miao Guo, Robert L. Buchanan, Jitender P. Dubey, Dolores E. Hill, Elisabetta Lambertini, Yuqing Ying, H. Ray Gamble, Jeffrey L. Jones, and Abani K. Pradhan

http://www.ingentaconnect.com/content/iafp/jfp/2015/00000078/00000012/art00013

Chlorine is still a friend: Gastro from water wells in Canada

Waterborne illness related to the consumption of contaminated or inadequately treated water is a global public health concern.

Although the magnitude of drinking water-related illnesses in developed countries is lower than that observed in developing regions of the world, drinking water is still responsible for a proportion of all cases of acute gastrointestinal illness (AGI) in Canada.

The estimated burden of endemic AGI in Canada is 20.5 million cases annually – this estimate accounts for under-reporting and under-diagnosis. About 4 million of these cases are domestically acquired and foodborne, yet the proportion of waterborne cases is unknown. There is evidence that individuals served by private systems and small community systems may be more at risk of waterborne illness than those served by municipal drinking water systems in Canada. However, little is known regarding the contribution of these systems to the overall drinking water-related AGI burden in Canada.

Private water supplies serve an estimated 12% of the Canadian population, or ~4.1 million people. An estimated 1.4 million (4.1%) people in Canada are served by small groundwater (2.6%) and surface water (1.5%) supplies. The objective of this research is to estimate the number of AGI cases attributable to water consumption from these supplies in Canada using a quantitative microbial risk assessment (QMRA) approach. This provides a framework for others to develop burden of waterborne illness estimates for small water supplies. A multi-pathogen QMRA of Giardia, Cryptosporidium, Campylobacter, E. coli O157 and norovirus, chosen as index waterborne pathogens, for various source water and treatment combinations was performed. It is estimated that 103,230 AGI cases per year are due to the presence of these five pathogens in drinking water from private and small community water systems in Canada.
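The overall shape of that burden estimate is simple: for each pathogen, multiply the population served by private and small systems by a mean daily probability of AGI, then sum across pathogens and days. The sketch below illustrates this; the daily risks are placeholders for illustration, not the study’s Monte Carlo outputs.

```python
# Population figures quoted above: ~4.1 million on private supplies plus
# ~1.4 million on small groundwater/surface water systems.
population_served = 4_100_000 + 1_400_000

# Assumed mean daily probability of AGI per person, per pathogen (illustrative only)
daily_risk = {
    "Giardia": 1e-6,
    "Cryptosporidium": 8e-7,
    "Campylobacter": 2e-5,
    "E. coli O157": 5e-7,
    "norovirus": 3e-5,
}

annual_cases = sum(population_served * p * 365 for p in daily_risk.values())
print(f"estimated AGI cases per year ~ {annual_cases:,.0f}")
```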

In addition to providing a mechanism to assess the potential burden of AGI attributed to small systems and private well water in Canada, this research supports the use of QMRA as an effective source attribution tool when there is a lack of randomized controlled trial data to evaluate the public health risk of an exposure source. QMRA is also a powerful tool for identifying existing knowledge gaps on the national scale to inform future surveillance and research efforts. 

Estimating the burden of acute gastrointestinal illness due to Giardia, Cryptosporidium, Campylobacter, E. coli O157 and norovirus associated with private wells and small water systems in Canada

Epidemiology and Infection, August 2015, pages 1-16, DOI: 10.1017/S0950268815002071

H.M. Murphy, M.K. Thomas, P.J. Schmidt, D.T. Medeiros, S. McFadyen, and K.D.M. Pintar

http://journals.cambridge.org/action/displayAbstract?fromPage=online&aid=10034786&fulltextType=RA&fileId=S0950268815002071

Listeria uses alternative metabolic pathways to grow on cold salmon

Listeria monocytogenes grows on refrigerated smoked salmon by way of different metabolic pathways from those it uses when growing on laboratory media. This discovery could lead to reduced incidences of foodborne illness and death, said principal investigator Teresa Bergholz. The research appeared in Applied and Environmental Microbiology, a journal of the American Society for Microbiology.

In the study, the investigators showed that L. monocytogenes grows on cold smoked salmon using different metabolic pathways to obtain energy than those it uses on laboratory media, even when the media were modified to have the same salt content and pH as the salmon. To grow on the salmon, the bacterium upregulates genes that enable it to use two compounds from cell membranes — ethanolamine and propanediol — as energy sources.

L. monocytogenes, as well as Salmonella, is known to use those same genes to grow within a host — in the gastrointestinal tract and in macrophages. “There may be ways we can use this information to control the pathogen both in foods as well as in infected people,” said Bergholz, assistant professor in the Department of Veterinary and Microbiological Sciences at North Dakota State University, Fargo. “Understanding how a foodborne pathogen adapts to environmental stresses it encounters on a specific food could allow food microbiologists to develop inhibitors of metabolic or stress response pathways that are necessary for the pathogen to grow or survive on that product.”

“The information may also enable improved risk assessments, as virulence of a pathogen may be affected considerably by the stress responses and/or metabolic pathways used to survive on the food,” said Bergholz.

Bergholz noted that ready-to-eat products typically have very low levels of contamination with L. monocytogenes, and that the bacterium must be able to grow on the product during refrigerated storage in order to reach an infectious dose. “In many cases, the addition of organic acids will slow or stop the growth of this pathogen on ready-to-eat meats and seafood.”

I volunteer: Study on THC in animal products

The European Food Safety Authority (EFSA) was asked to deliver a scientific opinion on the risks for human health related to the presence of tetrahydrocannabinol (THC) in milk and other food of animal origin.

THC, more precisely delta-9-tetrahydrocannabinol (Δ9-THC), is derived from the hemp plant Cannabis sativa. In fresh plant material, up to 90% of total Δ9-THC is present as the non-psychoactive precursor Δ9-THC acid. Since few data on Δ9-THC levels in foods of animal origin were available, the Panel on Contaminants in the Food Chain (CONTAM Panel) estimated acute human dietary exposure to Δ9-THC by combining different scenarios for the presence of Δ9-THC in hemp seed-derived feed materials.

Acute exposure to Δ9-THC from the consumption of milk and dairy products ranged between 0.001 and 0.03 µg/kg body weight (b.w.) per day in adults, and between 0.006 and 0.13 µg/kg b.w. per day in toddlers. From human data, the CONTAM Panel concluded that 2.5 mg Δ9-THC/day, corresponding to 0.036 mg Δ9-THC/kg b.w. per day, represents the lowest observed adverse effect level. By applying an overall uncertainty factor of 30, an acute reference dose (ARfD) of 1 μg Δ9-THC/kg b.w. was derived. The exposure estimates are at most 3% and 13% of the ARfD in adults and toddlers, respectively.
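The panel’s arithmetic can be retraced in a few lines; note that the 70 kg adult body weight is an assumption used here to connect the 2.5 mg/day figure to 0.036 mg/kg b.w. per day.

```python
# Retracing the ARfD derivation quoted above (70 kg body weight is an assumption)
loael_mg_per_day = 2.5
body_weight_kg = 70.0
uncertainty_factor = 30

loael_mg_per_kg = loael_mg_per_day / body_weight_kg            # ~0.036 mg/kg b.w. per day
arfd_ug_per_kg = loael_mg_per_kg / uncertainty_factor * 1000   # ~1.2, rounded down to 1 ug/kg b.w.

arfd = 1.0  # ug/kg b.w., as derived by the CONTAM Panel
print(f"ARfD ~ {arfd_ug_per_kg:.1f} ug/kg b.w. (rounded to {arfd:.0f})")
print(f"adults:   {0.03 / arfd:.0%} of the ARfD at the upper exposure estimate")
print(f"toddlers: {0.13 / arfd:.0%} of the ARfD at the upper exposure estimate")
```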

The CONTAM Panel concluded that exposure to Δ9-THC via consumption of milk and dairy products, resulting from the use of hemp seed-derived feed materials at the reported concentrations, is unlikely to pose a health concern.

A risk assessment resulting from the use of whole hemp plant-derived feed materials is currently not feasible due to a lack of occurrence data. The CONTAM Panel could also not conclude on the possible risks to public health from exposure to Δ9-THC via consumption of animal tissues and eggs, due to a lack of data on the potential transfer and fate of Δ9-THC.

Scientific Opinion on the risks for human health related to the presence of tetrahydrocannabinol (THC) in milk and other food of animal origin

EFSA Journal 2015;13(6):4141[125 pp.]

EFSA

http://www.efsa.europa.eu/en/efsajournal/pub/4141.htm?utm_source=feed&utm_medium=rss&utm_campaign=ej