Exposure was quantified with stochastic models at the population level, which incorporated measures of frequency, quantity ingested, prevalence, and concentration, using data from FoodNet Canada surveillance, the peer-reviewed and gray literature, other Ontario data, and data that were specifically collected for this study. Models were run with @Risk software using Monte Carlo simulations.
The mean number of cells of Campylobacter ingested per Ontarian per day during the summer, ranked from highest to lowest, is as follows: household pets, chicken, living on a farm, raw milk, visiting a farm, recreational water, beef, drinking water, pork, vegetables, seafood, petting zoos, and fruits.
The study results identify knowledge gaps for some transmission routes and indicate that some transmission routes for Campylobacter, such as household pets and raw milk, are underestimated in the current literature. Many data gaps were identified for future data collection consideration, especially for the concentration of Campylobacter in all transmission routes.
A comparative exposure assessment of Campylobacter in Ontario, Canada
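The paper's models were built in @Risk, but the core mechanics of a stochastic exposure model can be sketched in a few lines. A minimal Monte Carlo sketch, assuming hypothetical distributions for prevalence, concentration, and serving size — the study's actual inputs came from FoodNet Canada surveillance and the other sources listed, not these made-up parameters:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000  # Monte Carlo iterations

# Hypothetical input distributions for a single transmission route.
prevalence = rng.beta(5, 45, N)                 # fraction of servings contaminated
concentration = 10 ** rng.normal(1.0, 0.8, N)   # CFU per gram, lognormal
serving_size = rng.triangular(50, 100, 200, N)  # grams ingested per serving

# A serving carries cells only if it happens to be contaminated.
contaminated = rng.random(N) < prevalence
dose = np.where(contaminated, concentration * serving_size, 0.0)

print(f"mean cells ingested per exposure: {dose.mean():.1f}")
print(f"95th percentile: {np.percentile(dose, 95):.1f}")
```

Ranking routes then amounts to running one such model per route and comparing the mean daily doses.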
The current issue of Risk Analysis contains several papers regarding quantitative microbial risk assessment (QMRA) and Salmonella.
In a recent report from the World Health Organisation, the global impact of Salmonella in 2010 was estimated to be 65–382 million illnesses and 43,000–88,000 deaths, resulting in a disease burden of 3.2–7.2 million disability-adjusted life years (DALYs). Controlling Salmonella in the food chain will require intervention measures, which come at a significant cost, but these costs should be balanced against the cost of Salmonella infections to society.
Despite a wealth of published research work relating to Salmonella, many countries still struggle with identifying the best ways to prevent and control foodborne salmonellosis. Two questions are particularly important to answer in this respect: (1) What are the most important sources of human salmonellosis within the country? and (2) When a (livestock) source is identified as important, how do we best prevent and control the dissemination of Salmonella through that farm-to-consumption pathway? The articles presented in this series continue the effort to answer these questions and hence eventually contribute to the reduction in the disease burden of salmonellosis in humans.
Application of molecular typing results in source attribution models: the case of multiple locus variable number tandem repeat analysis (MLVA) of Salmonella isolates obtained from integrated surveillance in Denmark
Risk Analysis, 36: 571–588
de Knegt, L. V., Pires, S. M., Löfström, C., Sørensen, G., Pedersen, K., Torpdahl, M., Nielsen, E. M. and Hald, T.
Salmonella is an important cause of bacterial foodborne infections in Denmark. To identify the main animal-food sources of human salmonellosis, risk managers have relied on a routine application of a microbial subtyping-based source attribution model since 1995. In 2013, multiple locus variable number tandem repeat analysis (MLVA) substituted phage typing as the subtyping method for surveillance of S. Enteritidis and S. Typhimurium isolated from animals, food, and humans in Denmark.
The purpose of this study was to develop a modeling approach applying a combination of serovars, MLVA types, and antibiotic resistance profiles for Salmonella source attribution, and to assess the utility of the results for food safety decision makers. Full and simplified MLVA schemes from surveillance data were tested, and model fit and consistency of results were assessed using statistical measures. We conclude that loci schemes STTR5/STTR10/STTR3 for S. Typhimurium and SE9/SE5/SE2/SE1/SE3 for S. Enteritidis can be used in microbial subtyping-based source attribution models. Based on the results, we discuss that an adjustment of the discriminatory level of the subtyping method applied will often be required to fit the purpose of the study and the available data. The issues discussed are also considered highly relevant when applying, e.g., extended multilocus sequence typing or next-generation sequencing techniques.
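The abstract doesn't reproduce the model itself, but the general idea behind microbial subtyping-based source attribution can be illustrated with the simplest variant, a proportional (Dutch-style) split; the Danish model is a more elaborate Bayesian Hald-type model, and all counts below are invented for illustration:

```python
import numpy as np

sources = ["pigs", "broilers", "layers"]

# Hypothetical isolate counts: rows are subtypes, columns are sources.
subtype_in_source = np.array([
    [40.0,  5.0, 55.0],
    [10.0, 80.0, 10.0],
    [50.0, 30.0, 20.0],
])
human_cases = np.array([120.0, 300.0, 80.0])  # human cases per subtype

# Proportional attribution: cases of each subtype are split across
# sources according to that subtype's relative occurrence in them.
p = subtype_in_source / subtype_in_source.sum(axis=1, keepdims=True)
attributed = (human_cases[:, None] * p).sum(axis=0)

for s, a in zip(sources, attributed):
    print(f"{s}: {a:.0f} cases attributed")
```

The choice of subtyping scheme matters because it changes how sharply the rows of this matrix distinguish the sources — which is exactly the discriminatory-level question the paper raises.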
Assessing the effectiveness of on-farm and abattoir interventions in reducing pig meat–borne salmonellosis within E.U. member states
Risk Analysis, 36: 546–56
Hill, A. A., Simons, R. L., Swart, A. N., Kelly, L., Hald, T. and Snary, E. L.
As part of the evidence base for the development of national control plans for Salmonella spp. in pigs for E.U. Member States, a quantitative microbiological risk assessment was funded to support the scientific opinion required by the EC from the European Food Safety Authority. The main aim of the risk assessment was to assess the effectiveness of interventions implemented on-farm and at the abattoir in reducing human cases of pig meat–borne salmonellosis, and how the effects of these interventions may vary across E.U. Member States. Two case study Member States have been chosen to assess the effect of the interventions investigated. Reducing both breeding herd and slaughter pig prevalence were effective in achieving reductions in the number of expected human illnesses in both case study Member States. However, there is scarce evidence to suggest which specific on-farm interventions could achieve consistent reductions in either breeding herd or slaughter pig prevalence.
Hypothetical reductions in feed contamination rates were important in reducing slaughter pig prevalence for the case study Member State where prevalence of infection was already low, but not for the high-prevalence case study. The most significant reductions were achieved by a 1- or 2-log decrease of Salmonella contamination of the carcass post-evisceration; a 1-log decrease in average contamination produced a 90% reduction in human illness. The intervention analyses suggest that abattoir intervention may be the most effective way to reduce human exposure to Salmonella spp. However, a combined farm/abattoir approach would likely have cumulative benefits. On-farm intervention is probably most effective at the breeding-herd level for high-prevalence Member States; once infection in the breeding herd has been reduced to a low enough level, then feed and biosecurity measures would become increasingly more effective.
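The claim that a 1-log decrease in average carcass contamination produces roughly a 90% reduction in human illness follows from the near-linearity of single-hit dose-response models at low doses. A sketch with a hypothetical exponential dose-response parameter and dose distribution (not the values used in the EFSA assessment):

```python
import numpy as np

rng = np.random.default_rng(1)
N = 200_000

def expected_illness_risk(log_reduction, r=1e-4):
    """Mean illness probability under an exponential dose-response
    P(ill) = 1 - exp(-r * dose); r is a hypothetical single-hit
    parameter, not the EFSA model's value."""
    dose = 10 ** rng.normal(2.0, 1.0, N)  # CFU per serving, lognormal
    dose *= 10.0 ** -log_reduction        # intervention shifts every dose down
    return float((1 - np.exp(-r * dose)).mean())

baseline = expected_illness_risk(0.0)
reduced = expected_illness_risk(1.0)
print(f"illness reduction from a 1-log decrease: {100 * (1 - reduced / baseline):.0f}%")
```

Because P(ill) ≈ r × dose when r × dose is small, a tenfold drop in dose translates to close to a tenfold drop in expected illnesses, with saturation in the high-dose tail pulling the reduction slightly below 90%.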
Stacy Stevens, who leads the Issues & Crisis Navigation practice at FoodMinds LLC in Chicago, writes in O’Dwyer’s Communications & New Media that microbiological pathogens lurk around every drain, every ceiling tile that collects condensation and every box of ingredients unpacked in a restaurant kitchen.
Food and beverage companies are speaking out about the healthfulness and wholesomeness of their products, as well as the integrity of their supply chains and their commitments to farm animal care and environmental sustainability, to maintain consumer confidence and trust. That carefully constructed bond with consumers can dissipate in an instant with the emergence of a food safety concern. Here’s how communication pros can work in lock-step with their operations counterparts to prevent a food safety compromise and keep their hard-earned reputations intact.
Most of us practitioners of public relations don’t claim to understand the finer distinctions between Listeria monocytogenes, Clostridium botulinum and Escherichia coli. So, how can we ensure our companies and clients stay out of the “tag, CDC says you’re it” spotlight?
Dairy Forum 2016, a gathering of 1,100 food industry executives from around the globe hosted by International Dairy Foods Association (IDFA) in late January, explored this question in the session “Caution: Company Under Pressure,” which I had the honor of moderating.
According to panel member Joe Levitt, partner at Hogan Lovells and former director of FDA’s Center for Food Safety and Applied Nutrition, the single biggest problem is thinking this can’t or won’t happen to you.
Communicators: it’s our job to get our companies and clients past that mindset, and once we do, the path to preparedness — and ideally prevention — is clear:
Your QA team is reexamining and reinventing its system from top to bottom.
There’s no such thing as zero risk. And pledging to adopt the very highest food safety standards represents a major investment of time and resources. At The Ice Cream Club, headquartered in Boynton Beach, Fla., company owners watched the Listeria outbreaks linked to ice cream in 2015 with trepidation. But they didn’t just watch; they sprang into action. They brought in outside auditors, instructing them, “We’re not looking for a gold star; we want you to thoroughly review our facility for any areas of risk and opportunities to improve!” They installed new equipment, joined IDFA’s Listeria Task Force, updated protocols and methodically retrained their employees. Importantly, they walked the production facility floor day in and day out, visibly modeling the behaviors they wanted employees to follow. This was a critical success factor in helping employees acclimate to the dynamic new food safety culture.
Communicators have pressing media interviews to conduct, content strategies to approve and executives to prep for the next earnings call. But the historic legislation that directed FDA to build a new system of food safety oversight – one focused on applying the best available science, along with common sense, to prevent outbreaks of foodborne illness – must be part of our jobs too. Make sure you understand what your operations team is doing in light of the multi-year implementation of Food Safety Modernization Act (FSMA) regulations, and update your food safety and quality messaging and proof points accordingly. Then, take it upon yourself to make sure your quality assurance, supply chain management and science/regulatory experts appreciate the importance of reputation management, issues management and crisis communication to their efforts. And make sure your FSMA-compliant Recall Plan includes the communications plan and assets you’ll need to deploy in order to properly notify customers and the public when necessary.
The key to the success of any response-mode communication effort is that your stakeholders and the general public already know your name and enough about you to give you the benefit of the doubt when something negative surfaces. The best way for members of the food industry to ensure this is the case, is to visibly engage in corporate social responsibility, responsible sourcing, nutrition, health & wellness and sustainability efforts. Make meaningful commitments and talk to the public – online and offline – about them in an authentic and passionate way.
Your crisis plan should be a living, breathing arsenal of strategies, checklists and tactics so you and your colleagues can respond without losing time to internal deliberations (“is this a four-alarm fire, or a three-alarm fire?”) when something hits. You’re going to need to marshal external resources quickly as well, so your plan should map out your network of legal, scientific, communications and operational advisors. And remember that scenario-based tabletop exercise that got cut from last year’s budget? It certainly would have been helpful if the key players had gotten a practice session in before the playbook was put to use!
Your third-party academic experts know you, and can speak to your track record.
The list of third-party experts compiled by your summer intern isn’t going to do much good in a crisis if you haven’t built and maintained relationships with everyone on it. Invite scientific and academic experts to tour your facilities, make an effort to visit their institutions, and update them regularly on company events and milestones.
When your brand or company reputation is called into question, you’re not alone. Industry associations such as IDFA have communication resources — social media monitoring dashboards, for example — and they employ technical experts who maintain strong relationships with federal regulators. They can advise you on preventive controls and communication strategies to shore up your prevention plan and are well equipped to buoy your team in the event of an escalated issue or crisis.
The stakes for food and beverage executives are higher than ever as FDA becomes increasingly aggressive in using the criminal sections of the Federal Food Drug and Cosmetic Act (FDCA) in the wake of food safety incidents. There have been more criminal prosecutions in the past five years of food company managers than in the prior two decades combined. And it’s important to realize that a misdemeanor conviction under the FDCA does not require proof of fraudulent intent, and doesn’t even require that managers were aware of a potential safety issue.
With a culture of food safety excellence and crisis preparedness in place, you may not avoid “tag, you’re it” entirely, but you’ll be in a much better position to get back into the game, and back to business, as quickly as possible.
In the summer of 1994, Intel types discovered a flaw with their Pentium computer chip, but thought the matter trivial; it was not publicly disclosed until Oct. 30, 1994, when a mathematician at Lynchburg College in Virginia, Thomas Nicely, posted a warning on the Internet.
As perceived problems and complaints rose through the weekend, Andrew S. Grove, Intel’s chairman and CEO, composed an apology to be posted on an Internet bulletin board. But because he was at home with no direct Internet access, he asked Intel scientist Richard Wirt to post the message from his home account; because it bore Mr. Wirt’s electronic address, the note’s authenticity was challenged, which only added to the fury of the Internet attacks on Intel.
At 8 a.m. the following Monday inside the company’s Santa Clara, Calif., headquarters, Intel officials set to work on the crisis the way they attacked any large problem—like an engineering problem. Said Paul Otellini, senior vice-president for worldwide sales: “It was a classic Intellian approach to solving any big problem. We broke it down into smaller parts; that was comforting.”
By the end of week two, the crisis looked to be subsiding. Then on Monday, Dec. 12, 1994, the International Business Machines Corp. abruptly announced that its own researchers had determined that the Pentium flaw would lead to division errors much more frequently than Intel said. IBM said it was suspending shipments of personal computers containing the Pentium chip.
Mr. Grove was stunned. The head of IBM’s PC division, Richard Thoman, had given no advance warning. A fax from Thoman arrived at Intel’s HQ on Monday morning after the IBM announcement, saying he had been unable to find Grove’s number during the weekend. Mr. Grove, whose number is listed, called directory assistance twice to ask for his own number to ensure he was listed.
After the IBM announcement, the number of calls to Santa Clara overwhelmed the capacity of AT&T’s West Coast long-distance telephone switching centres, blocking calls. Intel stock fell 6.5 per cent.
As John Markoff of the N.Y. Times wrote on the front-page in Dec. 1994, the reluctance of Intel to act earlier, according to Wall Street analysts, was the result of a corporate culture accustomed to handling technical issues rather than addressing customers’ hopes and fears.
Only then, Mr. Grove said, did he begin to realize that an engineer’s approach was inappropriate for a consumer problem.
According to one op-ed writer, Intel’s initial approach to the problem—prove you are doing sophisticated calculations if you want a replacement chip—was like saying “until you get to be cardinal, any internal doubts about the meaning of life are your own problem, a debate that has been going on since before Martin Luther.
Intel’s doctrine of infallibility was facing an old-fashioned Protestant revolt.” (John Hockenberry, Pentium and our Crisis of Faith, N.Y. Times, Dec. 28, 1994, A11; this is how things were referenced before hot links)
Why and how did Intel go wrong? The answer is rooted in Intel’s distinctive corporate culture, and suggests that Intel went wrong in much the same way as other big and unresponsive companies before it.
Intel has traditionally valued engineering over product marketing. Inward-looking and wary of competitors (from experience with the Japanese), it developed a bunker mentality, a go-for-the-jugular attitude and a reputation for arrogance.
According to one former engineer, Federico Faggin, a co-inventor of Intel’s first microprocessor, “The attitude at Intel is, ‘We’re better than everyone else and what we do is right and we never make mistakes.’”
Finally, on Dec. 20, Grove apparently realized that he and his company were standing at Ground Zero for an incoming consumer relations meteor. Intel announced that it would replace the defective chips—and pay for the labor—no questions asked, for the life of the original PC.
Discussing Intel’s previous position, Grove said, “To some people, this seemed arrogant and uncaring. We apologize for that.”
So what does a consumer with a Pentium do? Teach Intel that this isn’t about white paper. It’s about green paper—the money you paid and the performance you didn’t get. Replace that chip. After all, consumers deserve to be treated with respect, courtesy and a little common sense.
As recently as this week, other media were still reporting the number of victims as just 98.
And the internal document says the real number of victims of Chipotle’s Simi Valley outbreak could be higher still. Sandy Murray, who did the analysis for the division, wrote: “In reviewing the food logs provided by Chipotle for both 8/18/15 and 8/19/15, it is estimated at least 1500+ entrees were sold each day. Thus, the actual number of customers and employees ill from this outbreak is likely to be substantially higher than the reported number of 234.”
Chipotle ran print advertisements Wednesday in 60 newspaper markets with an apology from Steve Ells, the burrito chain’s founder and co-chief executive. His apology, though, went only to the victims of the current nine-state E. coli O26 outbreak and the Boston College outbreak.
“From the beginning, all of our food safety programs have met or exceeded industry standards,” Ells says (Pinto defense). “But recent incidents, an E. coli outbreak that sickened 52 people and a Norovirus outbreak that sickened approximately 140 people at a single Chipotle restaurant in Boston, have shown us that we need to do better, much better.”
No mention was made of the other foodborne outbreaks.
The publicly traded Chipotle also had one of its better days since its troubles began. CMG stock was up 2.49 percent or $13.79 per share, closing at $568.65 per share.
Recent findings on construal level theory (CLT) suggest that abstract thinking leads to a lower estimated probability of an event occurring compared to concrete thinking.
We applied this idea to the risk context and explored the influence of construal level (CL) on the overestimation of small and underestimation of large probabilities for risk estimates concerning a vague target person (Study 1 and Study 3) and personal risk estimates (Study 2).
We were specifically interested in whether the often-found overestimation of small probabilities could be reduced with abstract thinking, and whether the often-found underestimation of large probabilities could be reduced with concrete thinking.
The results showed that CL influenced risk estimates. In particular, a concrete mindset led to higher risk estimates compared to an abstract mindset for several adverse events, including events with small and large probabilities.
This suggests that CL manipulation can indeed be used for improving the accuracy of lay people’s estimates of small and large probabilities. Moreover, the results suggest that professional risk managers’ risk estimates of common events (thus with a relatively high probability) could be improved by adopting a concrete mindset.
However, the abstract manipulation did not lead managers to estimate extremely unlikely events more accurately. Potential reasons for different CL manipulation effects on risk estimates’ accuracy between lay people and risk managers are discussed.
Thinking concrete increases the perceived likelihood of risks: The effect of construal level on risk estimation
Wiley Online Library, Risk Analysis, 26 June 2015, DOI: 10.1111/risa.12445
Eva Lermer, Bernhard Streicher, Rainer Sachs, Martina Raue, and Dieter Frey
Ronald L. Doering, the first president of the Canadian Food Inspection Agency and currently counsel in the Ottawa offices of Gowlings, writes:
One of the most persistent regulatory myths is the notion that politics can and should be kept out of science-based regulatory decision making. But as Covello and Merkhofer have clearly shown: “In practice, assumptions that have policy implications enter into risk assessment at virtually every stage of the process. The idea of a risk assessment that is free, or nearly free, of policy considerations is beyond the realm of possibility.” It is surprising how much our public discourse is still dominated by the quaint utopian view that science and policy can be strictly separated.
This enduring myth is the basis of the current kerfuffle regarding the government’s “war on science,” the allegation that decisions are based on ideology, not science, that politicizing science is a very bad thing and that all decisions must be “evidence-based.” Ironically, these same critics make a virtue out of being skeptical of mainstream science by opposing, for example, fluoridation, GM food, irradiation and vaccinations. But their basic misunderstanding is that they believe or pretend to believe that science and policy can be separated. Their whole concept of “evidence-based” is flawed. It is the legitimate and necessary role of politicians to take the science-based risk assessment and then carry out the policy-based risk management function by weighing the social, political, economic, ethical and environmental factors in order to arrive at the appropriate regulatory decision. In our democratic system scientists cannot, and should not, carry out what is the legitimate role of elected politicians and their senior advisers.
What should be the acceptable level of PCBs in farmed salmon? What should be the appropriate mix of rules to prevent the importation of BSE into Canada? What is the acceptable level of phthalates in plastic toys? What are the best regulations to prevent the importation of FMD into Canada? What is the right regulatory regime for the approval of genetically modified traits in seeds? What is the acceptable level of GM corn in wheat products? What should be the necessary rules for the storage of high level nuclear waste? What is the safe level of BPA in water bottles? How should the level of salt in processed food products be regulated? Should it continue to be illegal to sell raw milk? What should be the rules for raw milk cheese? This is just a small sample of the science-based public policy issues with which I was directly involved in recent years.
In all of these cases it was the regulator’s task to protect the public health and safety of Canadians through a complex process of weighing the many factors involved without, may I say again, the aid of some quantitative cost-benefit analysis; the factors were too complex to be monetized in a way that would be useful for decision making. In all these cases the science was relevant but not determinative. And yet in all these cases the parties argued that the basic question was one of science: if only we could get the science right, the public policy answer would follow. If only the world were that simple.
When I was president of Canada’s largest science-based regulator, I dealt regularly with scientists who were seemingly unaware of how much their science advice was imbued with unstated policy considerations, and how much the uncertainty of their science required the consideration of other factors. Many academic and government scientists and their public sector unions still shamelessly march in the streets arguing that decisions must only be “evidence-based.” My nutrition and food science students seem genuinely unaware of, even uncomfortable with, the idea that science-based health risk assessments are replete with policy considerations.
We need to engender a broad public debate about the role of science and scientists in policy making. For starters we need to debunk the myth that politics can and should be taken out of science-based regulation making.
Recently, two approaches have been applied to derive such criteria and to analyse their potential impact in terms of human health risk reduction: the risk-based version of the established microbiological criteria approach, which applies a microbiological limit (ML) for sample data, and the Danish “case-by-case” risk assessment approach, which applies a limit for the relative risk estimate (relative risk limit, RRL) based on sample data.
In this study, data sets from Sweden and Denmark are used to compare the performance of the two approaches in terms of efficiency, i.e. the balance between the residual risk after implementation of the criterion and the percentage of non-complying batches, and the attending uncertainty.
The analysis shows that the two approaches are equally efficient, and suggests that the RRL criterion is attended with less uncertainty. The two approaches are compared and their advantages and disadvantages are discussed. Given the uncertainties attending the results of the analysis, more research in terms of data collection, risk assessment and uncertainty analysis would be needed to develop these risk-based criteria further.
Food Control, Volume 53, Pages 177-184
Maarten Nauta, Jens Kirk Andersen, Pirkko Tuominen, Jukka Ranta, Roland Lindqvist
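As a rough illustration of the microbiological-limit side of the comparison, here is a sketch of a two-class sampling plan applied to simulated batches; the limit, sample sizes, and contamination distributions below are invented for illustration and are not taken from the Swedish or Danish data sets:

```python
import numpy as np

rng = np.random.default_rng(0)

def batch_complies(log_counts, ml_log=2.0, n=5, c=0):
    """Two-class sampling plan: the batch fails if more than c of the
    n tested units exceed the microbiological limit (log10 CFU/g)."""
    samples = rng.choice(log_counts, size=n, replace=False)
    return int((samples > ml_log).sum()) <= c

# Simulate 1,000 batches, each with its own within-batch mean contamination.
batch_means = rng.normal(1.0, 0.8, 1000)
results = [
    batch_complies(rng.normal(mu, 0.5, 25))  # 25 units available per batch
    for mu in batch_means
]

pct_noncomplying = 100 * (1 - np.mean(results))
print(f"non-complying batches: {pct_noncomplying:.1f}%")
```

The efficiency question the paper poses is the trade-off visible here: tightening `ml_log` or the sampling plan lowers the residual risk in accepted batches but raises the percentage of non-complying batches.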
Maybe this is why Europe is somewhat messed up over food.
The European Food Safety Authority says “the decision to separate the tasks of risk assessment and risk management just over a decade ago has transformed the safety of Europe’s food. And while there is wide recognition that this change has strengthened the safety of the food chain, uncertainty can still exist over the difference in roles and responsibilities of risk assessors and risk managers. … Risk assessors provide independent scientific advice on potential threats in the food chain. Risk managers use this advice as a basis for making decisions to address these issues. At a European level, this separation of roles is fundamental and enshrined in law. It was introduced to make clear the distinction between science and politics; to place independent science-based assessment at the heart of policy making.”
Maybe there’s “wide recognition” in Europe, but in the U.S. and Canada risk assessment, management and communication are recognized as interdependent roles, forming the overall risk analysis approach.
Because value judgements are an inherent part of human activity, the U.S. National Academy of Sciences recommended in 1997 that risk assessors expand risk characterization beyond the current practice of merely translating the results of a risk analysis into non-technical terms. This limited approach is “seriously deficient” and should be replaced with an analytical-deliberative approach that involves stakeholders from the very inception of a risk assessment.
Michael Batz writes:
A friend of mine who played guitar in a punk band once told me to “always leave them wanting more.” He was talking about how some bands played for too long, leaving the audience bored by the end. Their approach, by contrast, was to play about 15 songs in 12 minutes.
Sometimes it made people a little angry.
I guess I learned a lesson from him, since it took my colleagues and me 10 years to publish the results of our foodborne illness risk ranking work in a pair of journal articles in 2012. Or maybe I’m just lazy. In any case, as much as we tried to get it all in there, we definitely left folks wanting more. Which is to say, we got a lot of questions about the details. And I guess that’s half the point of these things, right? So we decided to go further into the weeds. Three papers in 12 years? Pretty punk rock. Or lazy. Whatever.
US: Disease-outcome trees, EQ-5D scores, and estimated annual losses of quality-adjusted life years (QALYs) for 14 foodborne pathogens in the United States
Foodborne Pathogens and Disease doi:10.1089/fpd.2013.1658
Michael Batz, Sandra Hoffmann, and J. Glenn Morris Jr
Measures of disease burden such as quality-adjusted life years (QALYs) are increasingly important to risk-based food safety policy. They provide a means of comparing relative risk from diverse health outcomes. We present detailed disease-outcome trees and EQ-5D scoring for 14 major foodborne pathogens representing over 95% of foodborne illnesses, hospitalizations, and deaths due to specified agents in the United States (Campylobacter spp., Clostridium perfringens, Cryptosporidium parvum, Cyclospora cayetanensis, Escherichia coli O157:H7, Shiga toxin–producing E. coli non-O157, Listeria monocytogenes, nontyphoidal Salmonella enterica, Shigella, Toxoplasma gondii, Vibrio vulnificus, Vibrio parahaemolyticus and other noncholera Vibrio, and Yersinia enterocolitica). We estimate over 5,800 QALYs lost per 1,000 cases of L. monocytogenes and V. vulnificus, compared to 125 QALYs lost per 1,000 cases of T. gondii, 26 for E. coli O157:H7, 16 for Salmonella and Campylobacter, and 14 for Y. enterocolitica. The remaining seven pathogens are estimated to cause less than 5 QALYs lost per 1,000 cases. In total, these 14 pathogens cause over 61,000 QALYs lost annually, more than 90% of it due to outcomes beyond the acute infection: premature mortality is responsible for 65% of total QALY loss, with morbidity due to chronic and congenital illness responsible for another 28%. These estimates of the burden of chronic sequelae are likely conservative; additional epidemiological research is needed to support more accurate burden estimates. This study shows the value of using integrated metrics for comparing disease burden, and the need to consider chronic and congenital illness when prioritizing foodborne pathogens.
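The per-1,000-case figures in the abstract combine with annual incidence to give total burden. A sketch of that arithmetic, using the abstract's QALY rates but round, hypothetical case counts — these are NOT the paper's incidence estimates:

```python
# QALYs lost per 1,000 cases, as reported in the abstract.
qaly_per_1000 = {
    "L. monocytogenes": 5800, "V. vulnificus": 5800, "T. gondii": 125,
    "E. coli O157:H7": 26, "Salmonella": 16, "Campylobacter": 16,
    "Y. enterocolitica": 14,
}

# Round, hypothetical annual case counts for illustration only.
cases_per_year = {
    "L. monocytogenes": 1_600, "V. vulnificus": 100, "T. gondii": 87_000,
    "E. coli O157:H7": 63_000, "Salmonella": 1_000_000,
    "Campylobacter": 845_000, "Y. enterocolitica": 98_000,
}

total = sum(qaly_per_1000[p] * cases_per_year[p] / 1000 for p in qaly_per_1000)
print(f"illustrative total annual QALY loss: {total:,.0f}")
```

The arithmetic makes the ranking logic plain: rare-but-severe pathogens (high QALYs per 1,000 cases) and common-but-milder ones (high case counts) can contribute comparably to the total.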