From the time I started teaching at university in 1984, I never once recommended a textbook.
They were a student rip-off.
Tim Wu of The New York Times writes that as the semester ends, instructors at universities and community colleges around the country will begin placing their orders for next year’s textbooks. But not all professors will pay enough attention to something that students complain about: the outlandish prices of the books we assign. Having grown at many times the rate of inflation, the cost of a leading economics book can be over $250; a law school casebook plus supplement can cost $277. Adding to such prices is the dubious trend of requiring students to obtain digital access codes, averaging $100, to complete homework assignments.
Professors love tough questions. Here’s one we need to ask ourselves: Are we helping rip off our students?
Diagnostic Laboratory Practices Tool: Find out how diagnostic testing practices in FoodNet’s surveillance area have changed over time for 10 pathogens: Campylobacter, Cryptosporidium, Cyclospora, Listeria, norovirus, Salmonella, STEC, Shigella, Vibrio, and Yersinia.
Hemolytic Uremic Syndrome (HUS) Surveillance Tool: HUS is a life-threatening condition, most often triggered by STEC infection. See how rates of pediatric HUS and STEC infection have changed in FoodNet’s surveillance area since 1997.
My friend Ronald Doering, the first president of the Canadian Food Inspection Agency, writes persuasively in this Food in Canada column last week:
In September, there were several media reports of a survey by 3M that found that 32 per cent of Canadians are “skeptical of science.” The results were universally treated as “worrisome,” “alarming” and “depressing” because such a lack of trust in scientists might skew policy discussions to non-science considerations (bad) and perhaps, as well, undermine funding for scientists (very bad).
As readers of this column over the years will know, I have a different view. While, of course, it depends on what you mean by “science,” generally my opinion is that everyone should be more skeptical of science. I’m not saying that science is not important. CFIA scientists and their 10 laboratories are critical to the work of the agency. We can never have too much good science.
What I am saying is that there are many reasons why ordinary citizens, and especially consumers, should always be skeptical of science:
Most science is a lot more uncertain than is usually acknowledged. In food and nutrition science, for example, you name the issue and I can give you conflicting science. Over the years in this column, I have demonstrated vastly conflicting science on, for example, genetically engineered foods, food irradiation, the safety of BPA in food packaging, the safety of farmed salmon, the safety of water fluoridation and food additives. We have seen that Canada’s top two scientists on the safe level of salt in our diets disagree so intensely that they routinely resort to vicious name-calling. Canada and the U.S. consider the science on folic acid so clear that they require mandatory fortification of certain foods, while every EU country interprets the science to be so dangerous that they refuse to fortify; both groups insist their policy is “science-based.” It is illegal to sell raw milk in Canada and Australia but legal in England, Wales and Northern Ireland; both sides insist their policies are “science-based.” Nutrition science vacillates wildly. With such pervasive uncertainty, isn’t it just common sense to be skeptical?
Consumers get their science information on food and nutrition from newspapers, magazines, television and social media, none of which have trained science reporters anymore and all of which trade in alarmist “investigations,” food company bashing, celebrity advice and 45-second clips. Most consumers cannot understand most food labels. Health claims are more about marketing than health. Scientific illiteracy and innumeracy abound. As Mark Twain observed, if you don’t read magazines and newspapers you are uninformed, and if you do, you are misinformed. (Of course, this column is an exception). In the face of such widespread misinformation, isn’t it just common sense to be skeptical?
One of the most pervasive myths is that science and policy can be separated. When I was president of Canada’s largest science-based regulator, I dealt regularly with scientists who were seemingly unaware of how much their science advice was imbued with unstated policy considerations. Policy implications enter into the risk assessment at virtually every stage of the process. Moreover, in our system, scientists don’t make policy. After the scientist does the science-based risk assessment, elected politicians and their senior advisors carry out the policy-based risk management responsibility by weighing the science with the economic, political, legal, environmental, and ethical considerations. This is not the politicization of science; this is evidence-based policymaking. These two separate functions are often conflated and the outcome presented as driven purely by science. Isn’t it just common sense to be skeptical of this “science?”
A scientist friend recently highlighted another reason to be skeptical. The university system still insists that professors publish or perish, which accounts for why so much published science is both unread and unreadable, contributing nothing of value to the public that pays for it. It is certainly common sense, he says, to be skeptical of this science. Given the growing recognition of the importance of diet for health and the growing threat of foodborne illness, we need more and better science to aid in policymaking. Having said that, the public should always be skeptical of the science that comes their way.
Human norovirus (HuNoV) is a foremost cause of domestically acquired foodborne acute gastroenteritis and outbreaks. Despite industrial efforts to control HuNoV contamination of foods, its prevalence in foodstuffs at retail is significant. HuNoV infections are often associated with the consumption of contaminated produce, including ready-to-eat (RTE) salads.
Decontamination of produce by washing with disinfectants is a consumer habit that could significantly help mitigate the risk of infection. The aim of our study was to measure the effectiveness of chemical sanitizers in inactivating genogroup I and II HuNoV strains on mixed salads using a propidium monoazide (PMAxx)-viability RT-qPCR assay. Addition of sodium hypochlorite, peracetic acid, or chlorine dioxide significantly enhanced viral removal compared with water alone. Peracetic acid provided the highest effectiveness, with log10 reductions in virus levels of 3.66 ± 0.40 and 3.33 ± 0.19 for genogroups I and II, respectively. Chlorine dioxide showed lower disinfection efficiency.
Our results provide information useful to the food industry and final consumers for improving the microbiological safety of fresh products in relation to foodborne viruses.
Effectiveness of consumers washing with sanitizers to reduce human norovirus on mixed salad
Eduard Anfruns-Estrada, Marilisa Bottaro, Rosa Pinto, Susana Guix, Albert Bosch
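For readers not used to log10 reductions: each log is a tenfold drop in virus levels, so the reductions reported above translate to percentages. A minimal sketch of the conversion (the helper name and the loop are mine, not from the paper):

```python
def log10_reduction_to_percent(lr: float) -> float:
    """Convert a log10 reduction to the percent of virus removed.

    A 1-log reduction leaves 10% behind (90% removed); a 3-log
    reduction leaves 0.1% behind (99.9% removed), and so on.
    """
    return (1 - 10 ** (-lr)) * 100

# Log10 reductions reported for peracetic acid in the abstract above
for label, lr in [("genogroup I", 3.66), ("genogroup II", 3.33)]:
    print(f"{label}: {log10_reduction_to_percent(lr):.2f}% removed")
```

By this arithmetic, the peracetic acid figures correspond to removal of roughly 99.98% (genogroup I) and 99.95% (genogroup II) of detectable virus.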
No good journal does that. They have lots of submissions.
The spam emails highlight the wild west of predatory journals, often with names that try to imitate real journals. Today’s was the “New American Journal of Medicine”, a not-so-subtle variation of the New England Journal of Medicine or the American Journal of Medicine. It looks like that journal has published a total of 8 papers in 2019. I looked at one of them and ‘crap’ is my generous assessment. It’s a paper that recommends a treatment for pregnant women and it’s one page long, does not disclose the funding source, fails to fulfill pretty much every standard reporting requirement for a clinical trial and reports essentially no specific data or analysis. But, it’s ‘published data’ and on someone’s CV.
The state of the scientific literature is pretty messed up. “Show me the study” has been a common refrain, but it’s not as useful these days because anything can get published.
Too many journals.
Good journals screen out the weak articles. High impact journals publish a minority (5-25% of submissions…and most often people only send their best papers to those journals). Some journals are still good quality and take lower impact papers that are still good science. Some journals take whatever they can get, trying to screen out the bad science.
Others…they take whatever they can get, as long as the authors can pay. Sadly, there are literally thousands of those.
Some people don’t realize we don’t get paid to write scientific papers. Some journals publish at no cost, but increasingly there are publication fees that may range from a few hundred to a few thousand dollars. That, in itself, isn’t necessarily the problem. Some journals charge fees so that the papers can be open access (available to anyone, without a need for a subscription). However, some journals charge a couple thousand dollars, make a nice profit and don’t particularly care about the science.
As someone who’s an associate editor, editorial board member and frequent reviewer for many journals, I see the good and bad.
I see papers that should be published get accepted.
I see good quality papers rejected by good journals, knowing they’ll still end up in another good journal.
I see bad papers rejected.
However, I also see…
Horrible quality papers rejected that I know will end up published somewhere.
It’s frustrating to be reviewing a paper that’s complete crap, knowing it will find a home in a journal eventually. Yes, it will most likely be in a bottom-feeder journal that many of us in the scientific community know is dodgy. However, not everyone will realize that, and there will still be ‘published data’ to refer back to. Sometimes, that’s just frustrating, because poor quality science shouldn’t be published. However, when it deals with clinical matters (e.g. diagnosis, treatment…) it can be harmful, since poor quality or invalid data shouldn’t form the basis of decisions. Yet, it happens.
There have been a couple ‘stings’, where fake (and clearly garbage) papers have been submitted to journals. The highest profile was one that was published in Science (Bohannon, 2013). The author submitted a paper to various journals, with the following set-up: “Any reviewer with more than a high-school knowledge of chemistry and the ability to understand a basic data plot should have spotted the paper’s shortcomings immediately. Its experiments are so hopelessly flawed that the results are meaningless.” More than 50% of open access journals accepted it.
There are many reasons these dodgy journals are used.
“Publish or perish” isn’t quite true but it’s pretty close. Junior faculty need to show productivity to keep their positions or move into the increasingly elusive tenured positions. Scientific papers are a key metric, because they’re easy to count.
Some people get taken advantage of, not realizing the journal is predatory (or that fees are so high, until after the paper is accepted).
Commercial profit. Companies want to say their products are supported by published data. If the data aren’t any good, the amount of money that it takes to get something published is inconsequential for most companies.
Open access isn’t inherently bad. There are excellent open access journals that charge a couple thousand dollars per paper but have high standards. Open access is ideal as it means the science is available to everyone. It just has to be acceptable science, and that’s where things start to fall apart.
Anyway…enough ranting. I always like to say “don’t talk about a problem without talking about a solution” but I don’t have an easy solution. More awareness is the key, which is why sites that track predatory journals, such as Beall’s List, are important. It’s a good update on a sad state of affairs.
I love Mondays in Australia because it’s Sunday in the U.S., football and hockey are on TV for background, the kid is at school when not in France, and I write (Sorenne painting in France).
Fourteen years ago, Chapman and I went on a road trip from Prince George (where Ben thought he would be eaten by bears) to Seattle, then to Manhattan, Kansas, where in the first week I met a girl, got a job, and then spinach happened.
Leafy greens are still covered in shit.
I am drowning in nostalgia, but things haven’t changed, and, as John Prine wrote, all the news just repeats itself.
Same with relationships.
Former U.S. Food and Drug Administration food safety chief, David Acheson, writes that on October 31, 2019, FDA announced a romaine lettuce E. coli O157:H7 outbreak for which the active investigation had ended and the outbreak appeared to be over. As such, FDA stated there was no “current or ongoing risk to the public” and no avoidance of the produce was recommended.
Since that announcement, however, I have seen a number of articles condemning FDA and CDC. Why? Because the traceback investigation of the outbreak began in mid-September when CDC notified FDA of an illness cluster that had sickened 23 people across 12 states. So why the delay in announcing it to the public?
Despite the critical (and rather self-serving; always self-serving) stance on the “inexcusable” delay taken by a prominent foodborne illness attorney and his Food Safety “News” publication – which blasted a headline FDA “hid” the outbreak – my stance, having been an FDA official myself involved in outbreak investigations, is that the delay was practical and sensible.
Why? As FDA states right in its announcement:
When romaine lettuce was identified as the likely source, the available data indicated that the outbreak was not ongoing and romaine lettuce eaten by sick people was past its shelf life and no longer available for sale.
Even once romaine was identified as the likely cause, no common source or point of contamination was identified that could be used to further protect the public.
During the traceback investigation, the outbreak strain was not detected in any of the samples collected from farms, and there were no new cases.
Thus, neither FDA nor CDC identified any actionable information for consumers.
So, if it is not in consumers’ best interest to publicize an issue that no longer exists, why should they be driven away from a healthy food alternative? Why should unfounded unease be generated that will damage the industry, providing no benefit for consumers but ultimately impacting their pockets? There is just no upside to making an allegation without information. We’ve seen the impact on consumers and the industry when an announcement of a suspected food turns out to be incorrect; specifically “don’t eat the tomatoes” when it turned out to be jalapeno and serrano peppers. Having learned from such incidents, FDA’s approach is: If we don’t have a message that will help protect the public, then there is no message to be imparted.
So, rather than condemn FDA and CDC, I would commend them for getting the balance correct. And, perhaps, instead of any condemning, we should be working together to get the answers faster, to get outbreak data through better, faster, more efficient and coordinated traceability. Our entire system is too slow – a topic we have discussed many times in these newsletters.
The public and the scientific community need to be informed to prevent additional people from barfing.
I also rarely eat lettuce of any sort because it is overrated and the hygiene controls are not adequate.
Greek salad without lettuce is my fave.
Going public: Early disclosure of food risks for the benefit of public health
NEHA, Volume 79.7, Pages 8-14
Benjamin Chapman, Maria Sol Erdozaim, Douglas Powell
Often during an outbreak of foodborne illness, there are health officials who have data indicating that there is a risk prior to notifying the public. During the lag period between the first public health signal and some release of public information, there are decision makers who are weighing evidence with the impacts of going public. Multiple agencies and analysts have lamented that there is not a common playbook or decision tree for how public health agencies determine what information to release and when. Regularly, health authorities suggest that how and when public information is released is evaluated on a case-by-case basis without sharing the steps and criteria used to make decisions. Information provision on its own is not enough. Risk communication, to be effective and grounded in behavior theory, should provide control measure options for risk management decisions. There is no indication in the literature that consumers benefit from paternalistic protection decisions to guard against information overload. A review of the risk communication literature related to outbreaks, as well as case studies of actual incidents, is presented, and a blueprint for health authorities to follow is provided.
I’ll leave the summary of two antimicrobial resistance reports to my friend and hockey colleague (and he’s a professor/veterinarian) Scott Weese of the Worms & Germs Blog (he’s the semi-bald dude behind me in this 15-year-old pic; I’m the goalie; too many pucks to the head):
Two reports came out this week, both detailing the scourge of antibiotic resistance.
They’re both comprehensive, with a combined >400 pages explaining that this is a big problem.
I’m not going to try to summarize the reports. I’ll just pick out a few interesting tidbits.
From the CCA report (Canada):
According to their modelling, first-line antimicrobials (those most commonly used to treat routine infections) helped save at least 17,000 lives in 2018 while generating $6.1 billion in economic activity in Canada. “This contribution is at risk because the number of effective antimicrobials is running out.”
Antimicrobial resistance was estimated to reduce Canada’s GDP by $2 billion in 2018. That’s only going to get worse unless we get our act together. It’s estimated that by 2050, if resistance rates remain unchanged, the impact will be $13 billion per year. If rates continue to increase, that stretches to $21 billion. Remember, that’s just for Canada, a relatively small country from a population standpoint.
Healthcare costs due to resistance (e.g. drugs, increased length of stay in hospital) accounted for $1.4 billion in 2018. But remember that people who die from resistant infections can actually cost less. If I get a serious resistant infection and die quickly, my healthcare costs are pretty low since I didn’t get prolonged care. All that to say that dollar costs alone don’t capture all the human aspects. Regardless, this cost will likely increase to $20-40 billion per year by 2050.
In terms of human health, resistant infections were estimated to contribute to 14,000 deaths in Canada in 2018, with 5,400 of those directly attributable to the resistant infection (i.e. those deaths would not have occurred if the bug was susceptible to first line drugs). That makes resistance a leading killer, and it’s only going to get worse.
From the CDC report (U.S.), the dedication says a lot. “This report is dedicated to the 48,700 families who lose a loved one each year to antibiotic resistance or Clostridioides difficile, and the countless healthcare providers, public health experts, innovators, and others who are fighting back with everything they have.”
The foreword has some great messages too:
To stop antibiotic resistance, our nation must:
Stop referring to a coming post-antibiotic era—it’s already here. You and I are living in a time when some miracle drugs no longer perform miracles and families are being ripped apart by a microscopic enemy. The time for action is now and we can be part of the solution.
Stop playing the blame game. Each person, industry, and country can affect the development of antibiotic resistance. We each have a role to play and should be held accountable to make meaningful progress against this threat.
Stop relying only on new antibiotics that are slow getting to market and that, sadly, these germs will one day render ineffective. We need to adopt aggressive strategies that keep the germs away and infections from occurring in the first place.
Stop believing that antibiotic resistance is a problem “over there” in someone else’s hospital, state, or country—and not in our own backyard. Antibiotic resistance has been found in every U.S. state and in every country across the globe. There is no safe place from antibiotic resistance, but everyone can take action against it. Take action where you can, from handwashing to improving antibiotic use.
Some might say it’s alarmist. However, I don’t think it’s alarmist when someone really should be raising the alarm. We need to talk about it more, not less. We need to get people (including the general public, healthcare workers, farmers, veterinarians, policymakers) on board, to realize it’s a big issue that needs to be addressed now. “Short term pain for long-term gain” certainly applies here. We can keep delaying and the numbers will keep going up, or we can invest in solutions.
The numbers are scary but specific numbers don’t really matter in many ways. “Lots” is all we should have to know to get motivated. However, decision-makers like numbers, so these numbers hopefully will be useful to show the impact and potential benefits of investing in this problem, and motivate them to put money into antimicrobial stewardship. Saving lives should be enough, but that often doesn’t cut it. Antibiotic resistance doesn’t have a good marketing campaign. Everyone knows why people were wearing pink last month and why there are some pretty dodgy moustaches this month. Those are important issues, for sure. However, considering the overall impact, antibiotic stewardship needs to get more people behind it if we’re going to effect change.
The kids in my lab had me buy a video camera in 1999 so we could film stuff and put it on the Intertubes long before youtube existed (and film my 2000 Ivan Parkin lecture at IAFP when I got turned away at the U.S. border).
Twenty years later, Leesburg, Ind.-based Maple Leaf Farms is offering a behind-the-scenes look at its duck farms with its new #MLFarmToFork campaign that focuses on transparency and the company’s commitment to operating responsibly.
According to Rita Jane Gabbett of Meating Place, Maple Leaf Farms will highlight its farm-to-fork process on social media through behind-the-scenes videos, farmer interviews and more.
“We want consumers to know the story behind our duck and our desire for continuous improvement,” explained Maple Leaf Farms Duck Marketing Manager Olivia Tucker. “We’re proud of our animal husbandry practices, our facilities and our people, and we want to showcase how vertical integration allows Maple Leaf Farms to produce the highest quality duck on the market.”
To explain vertical integration and how it benefits the entire supply chain, Maple Leaf Farms has created an animated video that outlines the production process and how products get to consumers’ tables. You can view the video at www.tinyurl.com/MLFarmtoFork.
In an ongoing effort to understand sources of foodborne illness in the United States, the Interagency Food Safety Analytics Collaboration (IFSAC) collects and analyzes outbreak data to produce an annual report with estimates of foods responsible for foodborne illnesses caused by pathogens. The report estimates the degree to which four pathogens – Salmonella, E. coli O157, Listeria monocytogenes, and Campylobacter – and specific foods and food categories are responsible for foodborne illnesses.
The Centers for Disease Control and Prevention (CDC) estimates that, together, these four pathogens cause 1.9 million foodborne illnesses in the United States each year. The newest report (PDF), entitled “Foodborne illness source attribution estimates for 2017 for Salmonella, Escherichia coli O157, Listeria monocytogenes, and Campylobacter using multi-year outbreak surveillance data, United States,” can be found on the IFSAC website.
The updated estimates, combined with other data, may help shape agency priorities and inform the creation of targeted interventions that can help to reduce foodborne illnesses caused by these pathogens. As more data become available and methods evolve, attribution estimates may improve. These estimates are intended to inform and engage stakeholders and to improve federal agencies’ abilities to assess whether prevention measures are working.
Foodborne illness source attribution estimates for 2017 for Salmonella, Escherichia coli O157, Listeria monocytogenes, and Campylobacter using multi-year outbreak surveillance data, United States, Sept. 2019
Tyson Foods has, according to KATV, negotiated a settlement with the U.S. Department of Agriculture (USDA) for $1 million.
The settlement is linked to a lawsuit in which the meat processor said a federal meat inspector lied about inspecting hogs at its Storm Lake, Iowa, plant, forcing the company to destroy 8,000 carcasses and resulting in $2.4 million in losses and expenses.
Tyson Foods filed suit against the government agency in May after an inspector signed inspection cards for 4,622 hogs at the Storm Lake facility. The antemortem inspections were never actually conducted by the agency in person as the report stated.
The meat giant was able to show the courts the inspector never left her car but signed the cards without seeing the hogs.
Tyson Foods said it incurred losses of $2.48 million from the false reports. By the time it learned of the alleged actions, the negligently inspected hogs had been intermingled into a larger group of some 8,000 hog carcasses and therefore could no longer be positively identified and the entire group had to be destroyed.
“This was an unfortunate situation and we appreciate the USDA for working with us to address our losses. We take our commitment to food safety very seriously and look forward to a continued partnership with the USDA,” Tyson Foods spokesman Worth Sparkman told Talk Business & Politics in an email statement.