Improving food inspections through effective scheduling

Properly assessing a food establishment for compliance with local food safety regulations is both a science and an art, and it takes time and energy.
The science is applying risk assessment to determine the severity of a public health violation; the art is effectively communicating the findings to the operator or Person-in-Charge. On-site training about cited violations is an additional effort inspectors undertake, time permitting.
A recent study, “How Scheduling Can Bias Quality Assessment: Evidence from Food Safety Inspections,” co-written by Maria Ibáñez and Mike Toffel, looks at how scheduling affects workers’ behavior and how that affects quality and productivity. The authors suggest reducing the number of inspections conducted in a day, because fatigue negatively affects the quality of successive inspections; as such, a cap on daily inspections should be implemented. As much as I agree with this suggestion, the problem stems from inadequate resources to hire more inspection staff. Many inspectors are generalists, meaning that on any given day they may be required to inspect a restaurant, an on-site sewage system, a playground and a pool, and deal with any environmental health issues that arise. Unfortunately, quality is sometimes sacrificed for quantity simply due to a lack of staff.

Carmen Nobel reports:

Simple tweaks to the schedules of food safety inspectors could result in hundreds of thousands of currently overlooked violations being discovered and cited across the United States every year, according to new research about how scheduling affects worker behavior.

The potential result: Americans could avoid 19 million foodborne illnesses, nearly 51,000 hospitalizations, and billions of dollars of related medical costs.

Government health officers routinely drop in to inspect restaurants, grocery stores, schools, and other food-handling establishments, checking whether they adhere to public health regulations. The rules are strict. Food businesses where serious violations are found must clean up their acts quickly or risk being shut down.

Yet each year some 48 million Americans get sick, 128,000 are hospitalized, and 3,000 die due to foodborne illnesses, according to the Centers for Disease Control and Prevention.

The research is detailed in the paper “How Scheduling Can Bias Quality Assessment: Evidence from Food Safety Inspections,” co-written by Maria Ibáñez, a doctoral student in the Technology and Operations Management Unit at Harvard Business School, and Mike Toffel, the Senator John Heinz Professor of Environmental Management at HBS, experts in scheduling and in inspections, respectively.

“The more inspections you have done earlier in the day, the more tired you’re going to be and the less energy you’re going to have to discover violations”

“This study brought together Maria’s interest in how scheduling affects workers’ behavior and how that affects quality or productivity, and my interest in studying the effectiveness of inspections of global supply chains and of factories in the US,” Toffel says.

Timing is everything

Previous research (pdf) showed that the accuracy of third-party audits is affected by factors such as the inspector’s gender and work experience. Ibáñez and Toffel wanted to look at the effect of scheduling because it’s relatively easy for organizations to fix those problems.

The researchers studied a sampling of data from Hazel Analytics, which gathers food safety inspections from local governments across the United States. The sample included information on 12,017 inspections by 86 inspectors over several years; the inspected establishments included 3,399 restaurants, grocers, and schools in Alaska, Illinois, and New Jersey. The information contained names of the inspectors and establishments inspected, date and time of the inspection, and violations recorded.

In addition to studying quantitative data, Ibáñez spent several weeks accompanying food safety inspectors on their daily rounds. This allowed her to see firsthand how seriously inspectors took their jobs, how they made decisions, and the challenges they faced in the course of their workdays. “I’m impressed with inspectors,” she says. “They are the most dedicated people in the world.”

Undetected violations

Analyzing the food safety inspection records, the researchers found significant inconsistencies. Underreporting violations creates health risks, and also unfairly gives some establishments better inspection scores than they deserve. According to the data, inspectors found an average of 2.4 violations per inspection. Thus, citing just one fewer or one more violation amounts to a 42 percent decrease or increase from the average, with great potential for unfair assessments across the food industry, where establishments are judged on their safety records by consumers and inspectors alike.
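That 42 percent figure is just the reciprocal of the 2.4-violation average; a quick sketch of the arithmetic:

```python
# Average violations cited per inspection, per the study's data.
avg_violations = 2.4

# Citing one fewer (or one more) violation shifts an establishment's
# count by 1/2.4 relative to the average.
relative_change = 1 / avg_violations

print(f"one violation is a {relative_change:.0%} swing from the average")
```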

On average, inspectors cited fewer violations at each successive establishment inspected throughout the day, the researchers found. In other words, inspectors tended to find and report the most violations at the first place they inspected and the fewest violations at the last place.

The researchers chalked this up to gradual workday fatigue; it takes effort to notice and document violations and communicate (and sometimes defend) them to an establishment’s personnel.

“The more inspections you have done earlier in the day, the more tired you’re going to be and the less energy you’re going to have to discover violations,” Ibáñez says.

They also found that when conducting an inspection risked making the inspector work later than usual, the inspection was conducted more quickly and fewer violations were cited. “This seems to indicate that when inspectors work late, they are more prone to rush a bit and not be as meticulous,” Toffel says.

The level of inspector scrutiny also depended on what had been found at the prior inspection that day. In short, finding more violations than usual at one place seemed to induce inspectors to exhibit more scrutiny at the subsequent place.

“This seems to indicate that when inspectors work late, they are more prone to rush a bit and not be as meticulous”

For example, say an inspector is scheduled to inspect a McDonald’s restaurant and then a Whole Foods grocer. Suppose McDonald’s had two violations the last time it was inspected. If the inspector now visits that McDonald’s and finds five or six violations, the inspector is likely to be particularly meticulous at the Whole Foods next on the schedule, reporting more violations than she otherwise would.

That behavior may be because inspectors put much effort into helping establishments learn the rules, create good habits, and improve food safety practices.

“It can be frustrating when establishments neglect these safety practices, which increases the risk of consumers getting sick,” Ibáñez says. “When inspectors discover that a place has deteriorated a lot, they’re disappointed that their message isn’t getting through, and because it poses a dangerous situation for public health.”

On the other hand, finding fewer violations than usual at one site had no apparent effect on what the inspector uncovered at the subsequent establishment. “When they find that places have improved a lot since their last inspection, they just move on without letting that affect their next inspection.”

Changes could improve public safety

The public health stakes are high for these types of errors in food safety inspections. The researchers estimate that tens of thousands of Americans could avoid food poisoning each year simply by reducing the number of establishments an inspector visits on a single day. Often, inspectors will cluster their schedule to conduct inspections on two or three days each week, saving the other days for administrative duties in the office. While this may save travel time and costs, it might be preventing inspectors from doing their jobs more effectively.

One possible remedy: Managers could impose a cap on the maximum number of inspections per day, and rearrange schedules to disperse inspections throughout the week—a maximum of one or two each day rather than three or four.

In addition, inspectors could plan early-in-the-day visits to the highest-risk facilities, such as elementary school cafeterias or assisted-living facilities, where residents are more vulnerable to the perils of foodborne illnesses than the general public.

On the plus side, tens of thousands of hospital bills are likely avoided every year, thanks to inspectors inadvertently applying more scrutiny after an unexpectedly unhygienic encounter at their previous inspection.

“Different scheduling regimes, new training, or better awareness could raise inspectors’ detection to the levels seen after they observe poor hygiene, which would reduce errors even more and result in more violations being detected, cited and corrected,” Ibáñez says.

The authors estimate that, if the daily schedule effects that erode an inspector’s scrutiny were eliminated and the establishment spillover effects that increase scrutiny were amplified by 100 percent, inspectors would detect many violations that are currently overlooked, citing 9.9 percent more violations.

“Scaled nationwide, this would result in 240,999 additional violations being cited annually, which would in turn yield 50,911 fewer foodborne illness-related hospitalizations and 19.01 million fewer foodborne illness cases per year, reducing annual foodborne illness costs by $14.20 billion to $30.91 billion,” the authors write.

Lessons for inspections

While the study focuses on food safety inspections, it offers broad lessons for anyone who manages or deals with inspections.

“One implication is that bias issues will arise, so take them into account as you look at the inspection reports as data,” Ibáñez says. “And another is that we should try to correct them. We should be mindful about the factors that may bias our decisions, and we should proactively change the system so that we naturally make better decisions.”

 

Risk assessment ‘tolerable’ for tree that killed woman

Risk assessments are fraught with value judgements scientists make when choosing the upper and lower boundaries of numerical ranges and the assumptions made, especially those involving human behavior.

Conrad Brunk (right) and co-authors explored this in the 1991 book, Value Judgements in Risk Assessment.

For the many food safety risk assessors and analysts out there, a New Zealand tree may offer a lesson.

A tree in Rotorua, known as Spencer’s Oak, was deemed to be of a “tolerable” level of risk when it came down in a Jan. 2018 storm and killed a woman.

The 150-year-old oak, believed to be around 23m tall, blocked Amohia St and trapped 56-year-old Trish Butterworth in her car. She died at the scene.

The risk assessment of the tree has been revealed in documents released by Rotorua Lakes Council to Stuff under the Local Government Official Information and Meetings Act.

Benn Bathgate reports that in a tree assessment report from an arboricultural contractor dated February 28, 2017, Spencer’s Oak and a second tree were assessed.

“The assessed risk levels for these trees all fall within the tolerable level,” the report said.

“There is some decay evident in some of the buttress roots and in some old pruning wounds. Sounding the trunk and buttress roots with a plastic mallet did not indicate any major areas of concern.”

The report also found several old wire rope cables installed in the tree which were described as “under a lot of tension”, with one frayed and unravelling.

“Although the cables appear to be under tension there are no signs that these cables are required by the tree.”

The tree was also described as showing signs of decline.

The report also outlines three bands of risk level; broadly acceptable, tolerable and unacceptable.

“This inspection and report will give the trees a risk rating and options for mitigation,” the report said.

“It is up to the tree owners to decide what if any action is to be taken depending on their tolerance of risk.”

The report’s conclusion said if the examined trees had major deadwood removed, their risk level would be considered as low as reasonably practicable.

The only thing certain is more uncertainty: Europe tries new uncertainty approach

The European Food Safety Authority (EFSA) has developed a harmonised approach to assessing and taking account of uncertainties in food safety, and animal and plant health. This approach will boost the transparency of the resulting scientific advice and make it more robust for decision-making.

Maybe.

The EFSA Scientific Committee guidance on uncertainty in scientific assessments offers a diverse toolbox of scientific methods and technical tools for uncertainty analysis. It is sufficiently flexible to be implemented in such diverse areas as plant pests, microbiological hazards and chemical substances.

Prof Tony Hardy, Chair of the Scientific Committee said: “Since 2016, we have tested, refined and tailored our new approach to uncertainty analysis, benefiting from open consultations with EFSA’s partners and the wider public. Crucially, we learnt a great deal about how to apply the new approach by trialling it across all EFSA’s scientific areas of activity.

The approach is described in two separate documents: a short user-friendly (says who?) guidance with practical instructions and tips, and a supporting scientific opinion with all the detailed scientific reasoning and methods.

The long-term goal is that the new guidance on uncertainty will be an integral step in all EFSA’s scientific assessments.

Prof Hans Verhagen is head of EFSA’s department for risk assessment. He said: “The trial showed that in areas like plant health, an explicit uncertainty analysis is already being used, with positive feedback from risk managers who say this helps them with their decision-making. In other areas, where uncertainty analysis is not yet integrated in the assessment process, the testing phase has helped give a clearer idea how to develop tailored approaches.”

EFSA will implement the approach in two stages. In general scientific areas, the guidance will apply from autumn 2018 after the renewal of the Authority’s scientific panels.

In regulated products areas such as pesticides, food additives or food contact materials it will be phased in later on, in light of the experience gained in the ‘non-regulated’ areas.

In parallel, EFSA is developing practical guidance for communication specialists on how to communicate the results of uncertainty analysis to different target audiences, including the public. A public consultation will be held on a draft of the communication approach in 2018.

Others have been working on this for 40 years. When the goal is public health – so more people don’t barf – we already know it’s better to go public early and often.

Going public: Early disclosure of food risks for the benefit of public health

March 2017

NEHA, Volume 79.7, Pages 8-14

Benjamin Chapman, Maria Sol Erdozaim, Douglas Powell

http://www.neha.org/node/58904

Often during an outbreak of foodborne illness, there are health officials who have data indicating that there is a risk prior to notifying the public. During the lag period between the first public health signal and some release of public information, there are decision makers who are weighing evidence with the impacts of going public. Multiple agencies and analysts have lamented that there is not a common playbook or decision tree for how public health agencies determine what information to release and when. Regularly, health authorities suggest that how and when public information is released is evaluated on a case-by-case basis without sharing the steps and criteria used to make decisions. Information provision on its own is not enough. Risk communication, to be effective and grounded in behavior theory, should provide control measure options for risk management decisions. There is no indication in the literature that consumers benefit from paternalistic protection decisions to guard against information overload. A review of the risk communication literature related to outbreaks, as well as case studies of actual incidents, are explored and a blueprint for health authorities to follow is provided.

Modeling to reduce risks of Salmonella in alfalfa sprouts

We developed a risk assessment of human salmonellosis associated with consumption of alfalfa sprouts in the United States to evaluate the public health impact of applying treatments to seeds (0–5-log10 reduction in Salmonella) and testing spent irrigation water (SIW) during production.

The risk model considered variability and uncertainty in Salmonella contamination in seeds, Salmonella growth and spread during sprout production, sprout consumption, and Salmonella dose response.

Based on an estimated prevalence of 2.35% for 6.8 kg seed batches and without interventions, the model predicted 76,600 (95% confidence interval (CI) 15,400–248,000) cases/year. Risk reduction (by 5- to 7-fold) predicted from a 1-log10 seed treatment alone was comparable to SIW testing alone, and each additional 1-log10 seed treatment was predicted to provide a greater risk reduction than SIW testing. A 3-log10 or a 5-log10 seed treatment reduced the predicted cases/year to 139 (95% CI 33–448) or 1.4 (95% CI <1–4.5), respectively. Combined with SIW testing, a 3-log10 or 5-log10 seed treatment reduced the cases/year to 45 (95% CI 10–146) or <1 (95% CI <1–1.5), respectively. If the SIW coverage was less complete (i.e., less representative), a smaller risk reduction was predicted, e.g., a combined 3-log10 seed treatment and SIW testing with 20% coverage resulted in an estimated 92 (95% CI 22–298) cases/year.
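Taken at face value, the point estimates above imply very large relative risk reductions. A quick sketch using only the reported numbers (the confidence intervals around each estimate are wide, so these factors are illustrative):

```python
baseline_cases = 76_600  # predicted cases/year with no interventions

# Predicted cases/year under each intervention, from the model.
scenarios = {
    "3-log10 seed treatment": 139,
    "5-log10 seed treatment": 1.4,
    "3-log10 treatment + SIW testing": 45,
}

for name, cases in scenarios.items():
    factor = baseline_cases / cases
    print(f"{name}: ~{factor:,.0f}-fold fewer predicted cases")
```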

Analysis of alternative scenarios using different assumptions for key model inputs showed that the predicted relative risk reductions are robust. This risk assessment provides a comprehensive approach for evaluating the public health impact of various interventions in a sprout production system.

Risk assessment of salmonellosis from consumption of alfalfa sprouts and evaluation of the public health impact of sprout seed treatment and spent irrigation water testing

January 2018, Risk Analysis

Yuhuan Chen, Regis Pouillot, Sofia Farakos, Steven Duret, Judith Spungen, Tong-Jen Fu, Fazila Shakir, Patricia Homola, Sherri Dennis, Jane Van Doren

DOI: 10.1111/risa.12964

http://onlinelibrary.wiley.com/doi/10.1111/risa.12964/epdf

Maybe not so slightly pink: Properly cooked pork chops may contain threat of Listeria and Salmonella for consumers

If you are eating leftover pork chops that have not been cooked well-done, you’re putting yourself at risk for Salmonella and Listeria exposure. While many individuals prefer to consume their pork medium, a new study published in Risk Analysis: An International Journal revealed that cooking pork chops to an acceptable temperature does not completely eliminate pathogens, providing these cells with the opportunity to multiply during storage and harm consumers.  

The study, “Impact of cooking procedures and storage practices at home on consumer exposure to Listeria monocytogenes and Salmonella due to the consumption of pork meat,” found that only pork loin chops cooked well-done in a static oven (the researchers also tested cooking on a gas stove top) completely eliminated the Listeria and Salmonella pathogens. Other levels of cooking, i.e. rare and medium, while satisfying the requirements of the product temperature being greater than or equal to 73.6 degrees Celsius and decreasing the pathogen levels, did leave behind a few surviving cells which were then given the opportunity to multiply during food storage before being consumed.  

It is generally believed that when meat is heat treated at 70 degrees Celsius for two minutes, a million-fold (6-log) reduction of E. coli, Salmonella, and Listeria is achieved and the meat is therefore free of pathogens and safe to eat. However, a report by the European Food Safety Authority revealed that more than 57 percent of Salmonella outbreaks in 2014 originated in the household/kitchen, and 13 percent were associated with inadequate heat treatment.

“The results of this study can be combined with dose response models and included in guidelines for consumers on practices to be followed to manage cooking of pork meat at home,” says Alessandra De Cesare, PhD, lead author and professor at the University of Bologna.  

In order to assess the pathogen levels in cooked pork, the researchers, from the University of Bologna, the Institute of Food Engineering for Development and the Istituto Zooprofilattico delle Venezie, tested 160 packs of loin chop. The samples were experimentally contaminated with 10 million cells of L. monocytogenes and Salmonella to assess the reduction in pathogens after cooking, in accordance with the Food Safety and Inspection Service (FSIS) and British Retail Consortium (BRC) specifications (ensuring a reduction of at least 100,000 and 1,000,000 cells, respectively). The samples were contaminated on the surface, to mimic contamination via slaughter and cutting.  
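The FSIS and BRC specifications quoted above (reductions of 100,000 and 1,000,000 cells from a 10-million-cell inoculum) correspond to 5-log10 and 6-log10 reductions. A small helper makes the conversion explicit; the survivor count below is hypothetical, chosen only to illustrate "a few surviving cells":

```python
import math

inoculum = 10_000_000  # cells of L. monocytogenes or Salmonella per sample

def log10_reduction(before: float, after: float) -> float:
    """Log10 reduction achieved by a cooking treatment."""
    return math.log10(before / after)

# Hypothetical survivor count for illustration only.
survivors = 5
achieved = log10_reduction(inoculum, survivors)
print(f"{achieved:.1f}-log10 reduction")
```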

The samples were divided into groups to be cooked either on gas with a non-stick pan or in a static oven. In each setting, the pork chops were cooked to rare, medium, and well-done. For each cooking combination, 40 repetitions were performed for a total of 240 cooking tests.  

The researchers also interviewed 40 individuals between the ages of 20 and 60 to determine household consumer habits regarding doneness preferences. Prior published research was referenced to define meat storage practices and the probability that consumers store their leftovers at room temperature, in the refrigerator or discard them immediately. Growth rate data for the pathogens at each temperature were obtained using the software tool ComBase.  

The only cooking treatment able to completely inactivate the pathogens was oven well-done, which achieved a reduction between one and 10 million cells. Statistical analyses of the data showed significant differences related to level of cooking and cooking procedure. However, the researchers explained that factors such as moisture, water activity, fat levels, salts, carbohydrates, pH, and proteins can affect the effectiveness of the cooking treatment and, as a consequence, bacterial survival. These results emphasize the need to consider the form of pork (such as whole muscle versus ground) being cooked, in addition to the final temperature necessary to inactivate pathogens.

The results show that a reduction of between one and 10 million pathogen cells was reached with all of the tested cooking treatments, with product temperatures always reaching 73.6 degrees Celsius or greater. However, according to the simulation results using the obtained cell growth rates, the few surviving cells can multiply during storage, both in the refrigerator and at room temperature, reaching concentrations dangerous for both vulnerable and regular consumers.

After storing leftovers, the probability of the pathogen concentration reaching 10 cells ranged between 0.031 and 0.059 for all combinations except oven well-done. Overall, the mean level of exposure to Listeria and Salmonella at the time of consumption was one cell per gram of meat. The results of this study can be incorporated into guidelines for consumers on how to manage cooking pork meat at home.

 

Cook it: Toxo in pork

Toxoplasma gondii is one of the leading foodborne pathogens in the United States.

The Centers for Disease Control and Prevention (CDC) reported that T. gondii accounts for 24% of deaths due to foodborne illness in the United States.

Consumption of undercooked pork products in which T. gondii has encysted has been identified as an important route of human exposure. However, little quantitative evaluation of risk due to different pork products as a function of microbial quality at the abattoir, during the production process, and due to consumer handling practices is available to inform risk management actions.

The goal of this study was to develop a farm-to-table quantitative microbial risk assessment (QMRA) model to predict the public health risk associated with consumption of fresh pork in the United States.

T. gondii prevalence in pigs was derived through a meta-analysis of existing data, and the concentration of the infectious life stage (bradyzoites) was calculated for each pork cut from an infected pig. Logistic regression and log-linear regression models were developed to predict the reduction of T. gondii during further processing and consumer preparation, respectively. A mouse-derived exponential dose-response model was used to predict infection risk in humans. The estimated mean probability of infection per serving of fresh pork products ranges from 3.2 × 10^-7 to 9.5 × 10^-6, corresponding to a predicted approximately 94,600 new infections annually in the U.S. population due to fresh pork ingestion. Approximately 957 new infections per year were estimated to occur in pregnant women, corresponding to 277 cases of congenital toxoplasmosis per year due to fresh pork ingestion.
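The exponential dose-response model named above has a simple closed form, P(infection | dose d) = 1 − exp(−r·d). A minimal sketch follows; the infectivity parameter r here is purely illustrative, not the mouse-derived value fitted in the paper:

```python
import math

def p_infection_exponential(dose: float, r: float) -> float:
    """Exponential dose-response model: probability of infection from a
    mean ingested dose of `dose` organisms (e.g. bradyzoites), where r is
    the per-organism probability of initiating infection."""
    return 1.0 - math.exp(-r * dose)

# r is a placeholder chosen for illustration only.
r = 1e-4
for dose in (1, 100, 10_000):
    print(f"dose {dose}: P(infection) = {p_infection_exponential(dose, r):.2e}")
```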

In the context of available data, sensitivity analysis suggested that cooking is the most important parameter impacting human health risk. This study provides a scientific basis for risk management and also could serve as a baseline model to quantify infection risk from T. gondii and other parasites associated with meat products.

Quantifying the risk of human Toxoplasma gondii infection due to consumption of fresh pork in the United States

Food Control, Volume 73, Part B, March 2017, Pages 1210–1222

Miao Guo, Elisabetta Lambertini, Robert L. Buchanan, Jitender P. Dubey, Dolores E. Hill, H. Ray Gamble, Jeffrey L. Jones, Abani K. Pradhan

http://www.sciencedirect.com/science/article/pii/S0956713516305825

Nerd alert: Risk assessment in Europe

The European Food Safety Authority says a special issue of the EFSA Journal presents the main outcomes of EFSA’s 2nd Scientific Conference “Shaping the Future of Food Safety, Together” held in Milan, on 14-16 October 2015. 

The event was a unique opportunity for stakeholders in Europe’s food regulatory system – policy makers, risk assessors, scientists and NGOs – to identify future challenges for food safety risk assessment in Europe. Over 1,000 delegates came together in what proved to be a stimulating and insightful debate on global food safety concerns. The discussions covered an impressive range of topics and provided inspiration for EFSA’s Strategy 2020. The conclusions will help EFSA and the wider risk assessment community to chart a course in food safety risk assessment in the coming years.

The special issue of the EFSA Journal reflects the conference’s three plenary and nine parallel sessions and is accompanied by a Foreword from EFSA’s Executive Director, Bernhard Url.

All the conference material that was published on the conference’s dedicated microsite will be archived on EFSA’s website. This includes the programme, webcasts, recordings and video clips which will continue to be publicly available and linked to the special issue of the EFSA Journal. 


Norovirus: Best way to assess risk?

The application of quantitative microbial risk assessments (QMRAs) to understand and mitigate risks associated with norovirus is increasingly common as there is a high frequency of outbreaks worldwide.

A key component of QMRA is the dose–response analysis, which is the mathematical characterization of the association between dose and outcome. For norovirus, multiple dose–response models are available that assume either a disaggregated or an aggregated intake dose. This work reviewed the dose–response models currently used in QMRA, and compared predicted risks from waterborne exposures (recreational and drinking) using all available dose–response models.

The review found that the majority of published QMRAs of norovirus use the 1F1 hypergeometric dose–response model with α = 0.04, β = 0.055. This dose–response model predicted relatively high risk estimates compared to other dose–response models for doses in the range of 1–1,000 genomic equivalent copies. The difference in predicted risk among dose–response models was largest for small doses, which has implications for drinking water QMRAs where the concentration of norovirus is low.
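The 1F1 hypergeometric model referred to above is the so-called exact beta-Poisson: P(infection | mean dose d) = 1 − 1F1(α; α+β; −d). A pure-Python sketch using the parameters reported in the review; the direct series is numerically adequate only for the low-to-moderate doses shown here (very large doses would need log-domain summation):

```python
import math

def hyp1f1_series(a: float, b: float, x: float, tol: float = 1e-12) -> float:
    """Kummer confluent hypergeometric 1F1(a; b; x) by direct series.
    For negative x, the Kummer transformation
    1F1(a;b;x) = exp(x) * 1F1(b-a;b;-x) keeps all terms positive."""
    if x < 0:
        return math.exp(x) * hyp1f1_series(b - a, b, -x, tol)
    term, total, n = 1.0, 1.0, 0
    while abs(term) > tol * abs(total):
        term *= (a + n) * x / ((b + n) * (n + 1))
        total += term
        n += 1
    return total

def p_infection_norovirus(dose: float, alpha: float = 0.04,
                          beta: float = 0.055) -> float:
    """Exact beta-Poisson dose-response: P(inf) = 1 - 1F1(alpha; alpha+beta; -dose),
    with the alpha and beta values most often cited for norovirus."""
    return 1.0 - hyp1f1_series(alpha, alpha + beta, -dose)

for dose in (1, 10, 100):
    print(f"dose {dose} GEC: P(infection) = {p_infection_norovirus(dose):.3f}")
```

Note how flat the curve is: even a single genomic equivalent copy yields a substantial infection probability, which is why this model gives relatively high risk estimates at the low doses typical of drinking water.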

Based on the review, a set of best practices was proposed to encourage the careful consideration and reporting of important assumptions in the selection and use of dose–response models in QMRA of norovirus.

Finally, in the absence of one best norovirus dose–response model, multiple models should be used to provide a range of predicted outcomes for probability of infection.

Comparison of risk predicted by multiple norovirus dose–response models and implications for quantitative microbial risk assessment

Nicole Van Abel, Mary E. Schoen, John C. Kissel, J. Scott Meschke

Risk Analysis, June 2016, DOI: 10.1111/risa.12616

http://onlinelibrary.wiley.com/doi/10.1111/risa.12616/abstract

Listeria and raw milk cheese: A risk assessment involving sheep

Semisoft cheese made from raw sheep’s milk is traditionally and economically important in southern Europe. However, raw milk cheese is also a known vehicle of human listeriosis and contamination of sheep cheese with Listeria monocytogenes has been reported.

In the present study, we have developed and applied a quantitative risk assessment model, based on available evidence and challenge testing, to estimate risk of invasive listeriosis due to consumption of an artisanal sheep cheese made with raw milk collected from a single flock in central Italy.

In the model, contamination of milk may originate from the farm environment or from mastitic animals, with potential growth of the pathogen in bulk milk and during cheese ripening. Based on the 48-day challenge test of a local semisoft raw sheep’s milk cheese we found limited growth only during the initial phase of ripening (24 hours) and no growth or limited decline during the following ripening period. In our simulation, in the baseline scenario, 2.2% of cheese servings are estimated to have at least 1 colony forming unit (CFU) per gram. Of these, 15.1% would be above the current E.U. limit of 100 CFU/g (5.2% would exceed 1,000 CFU/g). Risk of invasive listeriosis per random serving is estimated in the 10^-12 range (mean) for healthy adults, and in the 10^-10 range (mean) for vulnerable populations.
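The 15.1% and 5.2% figures are conditional on a serving being contaminated at all (at least 1 CFU/g), so the unconditional fractions of servings over the limits are much smaller; a quick sketch:

```python
p_contaminated = 0.022  # servings with at least 1 CFU/g (baseline scenario)
p_over_100 = 0.151      # of those, fraction above the E.U. limit of 100 CFU/g
p_over_1000 = 0.052     # of those, fraction above 1,000 CFU/g

print(f"{p_contaminated * p_over_100:.2%} of all servings exceed 100 CFU/g")
print(f"{p_contaminated * p_over_1000:.2%} of all servings exceed 1,000 CFU/g")
```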

When small flocks (10–36 animals) are combined with the presence of a sheep with undetected subclinical mastitis, risk of listeriosis increases and such flocks may represent a public health risk.

Risk assessment of human listeriosis from semisoft cheeses made from raw sheep’s milk in Lazio and Tuscany

Roberto Condoleo, Ziad Mezher, Selene Marozzi, Antonella Guzzon, Roberto Fischetti, Matteo Senese, Stefania Sette, Luca Bucchini

Risk Analysis, June 2016, doi:10.1111/risa.12649

http://onlinelibrary.wiley.com/doi/10.1111/risa.12649/abstract;jsessionid=519D74728E4A34E1CE300B856B99D54B.f04t04

Taking it further: Quantitative microbiological risk assessment and source attribution for Salmonella

The current issue of Risk Analysis contains several papers regarding QMRA and Salmonella.

In a recent report from the World Health Organisation, the global impact of Salmonella in 2010 was estimated to be 65–382 million illnesses and 43,000–88,000 deaths, resulting in a disease burden of 3.2–7.2 million disability-adjusted life years (DALYs).[3] Controlling Salmonella in the food chain will require intervention measures, which come at a significant cost but these should be balanced with the cost of Salmonella infections to society.[5]

Despite a wealth of published research work relating to Salmonella, many countries still struggle with identifying the best ways to prevent and control foodborne salmonellosis. Two questions are particularly important to answer in this respect: (1) What are the most important sources of human salmonellosis within the country? and (2) When a (livestock) source is identified as important, how do we best prevent and control the dissemination of Salmonella through that farm-to-consumption pathway? The articles presented in this series continue the effort to answer these questions and hence, eventually, to contribute to reducing the disease burden of salmonellosis in humans.

Risk Analysis, 36: 433–436

Snary, E. L., Swart, A. N. and Hald, T.

http://onlinelibrary.wiley.com/doi/10.1111/risa.12605/abstract

Application of molecular typing results in source attribution models: the case of multiple locus variable number tandem repeat analysis (MLVA) of Salmonella isolates obtained from integrated surveillance in Denmark

Risk Analysis, 36: 571–588

de Knegt, L. V., Pires, S. M., Löfström, C., Sørensen, G., Pedersen, K., Torpdahl, M., Nielsen, E. M. and Hald, T.

http://onlinelibrary.wiley.com/doi/10.1111/risa.12483/abstract

Salmonella is an important cause of bacterial foodborne infections in Denmark. To identify the main animal-food sources of human salmonellosis, risk managers have relied on a routine application of a microbial subtyping-based source attribution model since 1995. In 2013, multiple locus variable number tandem repeat analysis (MLVA) substituted phage typing as the subtyping method for surveillance of S. Enteritidis and S. Typhimurium isolated from animals, food, and humans in Denmark.

The purpose of this study was to develop a modeling approach applying a combination of serovars, MLVA types, and antibiotic resistance profiles for Salmonella source attribution, and to assess the utility of the results for food safety decision-makers. Full and simplified MLVA schemes from surveillance data were tested, and model fit and consistency of results were assessed using statistical measures. We conclude that loci schemes STTR5/STTR10/STTR3 for S. Typhimurium and SE9/SE5/SE2/SE1/SE3 for S. Enteritidis can be used in microbial subtyping-based source attribution models. Based on the results, we discuss why the discriminatory level of the subtyping method applied will often need to be adjusted to fit the purpose of the study and the available data. The issues discussed are also considered highly relevant when applying, e.g., extended multi-locus sequence typing or next-generation sequencing techniques.
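The core idea behind microbial-subtyping source attribution can be illustrated very simply: human cases of each subtype are apportioned to animal-food sources in proportion to how often that subtype is isolated from each source. (Production models such as the Hald model go further, estimating source- and subtype-specific weights by Bayesian inference.) The subtype and source names below are invented for illustration, not Danish surveillance data.

```python
def attribute_cases(human_cases, source_counts):
    """Apportion human cases of each subtype to sources in proportion
    to the subtype's relative occurrence across sources.

    human_cases   -- {subtype: number of human cases}
    source_counts -- {source: {subtype: isolate count in that source}}
    Returns {source: attributed human cases}.
    """
    attributed = {source: 0.0 for source in source_counts}
    for subtype, cases in human_cases.items():
        weights = {s: counts.get(subtype, 0)
                   for s, counts in source_counts.items()}
        total = sum(weights.values())
        if total == 0:
            continue  # subtype never seen in any source: left unattributed
        for source, w in weights.items():
            attributed[source] += cases * w / total
    return attributed

# Hypothetical MLVA-type counts
human = {"MLVA-A": 100, "MLVA-B": 50}
sources = {
    "pigs":    {"MLVA-A": 80, "MLVA-B": 10},
    "poultry": {"MLVA-A": 20, "MLVA-B": 40},
}
print(attribute_cases(human, sources))
```

The choice of subtyping scheme matters precisely because it sets the resolution of the `weights` step: too coarse and sources are indistinguishable, too fine and human types rarely match any source isolate.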

Assessing the effectiveness of on-farm and abattoir interventions in reducing pig meat–borne Salmonellosis within E.U. member states

Risk Analysis, 36: 546–56

Hill, A. A., Simons, R. L., Swart, A. N., Kelly, L., Hald, T. and Snary, E. L.

http://onlinelibrary.wiley.com/doi/10.1111/risa.12568/abstract

As part of the evidence base for the development of national control plans for Salmonella spp. in pigs for E.U. Member States, a quantitative microbiological risk assessment was funded to support the scientific opinion required by the European Commission from the European Food Safety Authority. The main aim of the risk assessment was to assess the effectiveness of interventions implemented on-farm and at the abattoir in reducing human cases of pig meat–borne salmonellosis, and how the effects of these interventions may vary across E.U. Member States. Two case study Member States were chosen to assess the effect of the interventions investigated. Reducing both breeding herd and slaughter pig prevalence was effective in achieving reductions in the number of expected human illnesses in both case study Member States. However, there is scarce evidence to suggest which specific on-farm interventions could achieve consistent reductions in either breeding herd or slaughter pig prevalence.

Hypothetical reductions in feed contamination rates were important in reducing slaughter pig prevalence for the case study Member State where prevalence of infection was already low, but not for the high-prevalence case study. The most significant reductions were achieved by a 1- or 2-log decrease of Salmonella contamination of the carcass post-evisceration; a 1-log decrease in average contamination produced a 90% reduction in human illness. The intervention analyses suggest that abattoir intervention may be the most effective way to reduce human exposure to Salmonella spp. However, a combined farm/abattoir approach would likely have cumulative benefits. On-farm intervention is probably most effective at the breeding-herd level for high-prevalence Member States; once infection in the breeding herd has been reduced to a low enough level, feed and biosecurity measures would become increasingly effective.
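The "1-log decrease produces a 90% reduction" result is direct arithmetic if risk scales roughly linearly with dose (a good approximation of 1 − exp(−r·N) at low doses): a k-log reduction in carcass contamination leaves 10⁻ᵏ of the original exposure. A one-line check:

```python
def illness_reduction(log_decrease: float) -> float:
    """Fractional reduction in expected illnesses from a k-log decrease
    in contamination, assuming illness risk scales linearly with dose."""
    return 1.0 - 10.0 ** (-log_decrease)

print(f"1-log: {illness_reduction(1):.0%}")  # 90% fewer illnesses
print(f"2-log: {illness_reduction(2):.0%}")  # 99% fewer illnesses
```

This is why abattoir interventions that act multiplicatively on carcass contamination can dominate farm-level prevalence reductions, which act only on the fraction of carcasses contaminated.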