Telling people there’s no risk is irresponsible

There’s some dumb stuff in this interview with author Jack Gilbert (who wrote Dirt Is Good: The Advantage of Germs for Your Child’s Developing Immune System) about eating dirt and the hygiene hypothesis.

I get it: expose your kids to lots of things, boost their immune system. But why say things like this:

Unless you dropped it in an area where you think they could be a high risk of extremely dangerous pathogens, which in every modern American home is virtually impossible, then there’s no risk to your child.

and

As long as they’re properly vaccinated, there’s no threat, and they will actually get a stronger, more beneficial exposure.

There’s always a threat. There’s no such thing as zero risk. There’s a pretty good chance that foodborne pathogens, which sometimes kill folks, are in every kitchen.

Hygiene hypothesis variation: The parasite underground

Vik was in his late 20s when blood started appearing in his stool. He found himself rushing to the bathroom as many as nine times a day, and he quit his job at a software company.

He received a diagnosis of severe ulcerative colitis, an inflammatory condition of the colon. Steroids, which suppress inflammation, didn’t work for him. Sulfasalazine suppositories offered only the slightest relief. A year and a half after his diagnosis, Vik’s gastroenterologist warned him that because his disease was poorly controlled, he risked developing a condition called toxic megacolon: His inflamed intestines might rupture, leading to blood infection, septic shock or death.

The doctor recommended infusions of cyclosporine, a powerful immune-suppressant drug. Vik looked it up and learned that the drug, often given to transplant recipients, in rare instances can increase the risk of fatal infection and certain cancers. And if cyclosporine didn’t work, the next intervention would probably be the surgical removal of his colon. Vik might have to wear a colostomy bag for the rest of his life.

“I had a feeling there had to be a better way,” he told me recently. (Worried about being stigmatized, Vik asked that I identify him only by his first name.) He began researching ulcerative colitis and discovered that the prevalence of inflammatory bowel disease — an umbrella term that includes both ulcerative colitis and Crohn’s disease — had increased markedly in the United States over the 20th century. Yet the disease was less common in the developing world. He learned that exposure to dirt and unsanitary conditions early in life seemed to protect against these and other inflammatory diseases later. And then he encountered an explanation for the correlations in the research of a scientist named Joel Weinstock.

Weinstock, a gastroenterologist now at Tufts University, thought that parasites were to blame. But it wasn’t their presence in the human digestive system that was causing the rise; it was their absence. To survive for years in another animal, parasitic worms, known as helminths, counter their hosts’ defenses. Because an out-of-control immune response against native bacteria was thought to drive inflammatory bowel disease, Weinstock’s insight was that parasites’ ability to disarm the immune system might prevent the disorder. The broader implication was that the disappearance of parasites — largely eradicated from American life in the early 20th century through improvements in sanitation — might have left our immune systems unbalanced, increasing our vulnerability to all types of inflammatory disorders.

To Vik, Weinstock’s idea was the first cogent explanation for his disease. It also pointed toward a solution. Weinstock was already experimenting with “re-parasitizing” people with inflammatory bowel disease, using a helminth called Trichuris suis, the pig whipworm. He had chosen the species because, in theory, it can’t reach sexual maturity in humans and spread from one person to another. Early, small studies yielded impressive results, with 43 percent of colitis patients seeing improvement after 12 weeks of whipworm eggs, but Vik thought the use of pig whipworm had a flaw. It required continual dosing, and it could cost tens of thousands of dollars a year (a German company was producing the eggs for human consumption; in the United States, selling them to treat a disease is illegal). And most important, if he expected a parasite to change his immune system, he believed, a species adapted to humans, not pigs, was likely to do a better job.

Vik wanted human whipworm. This helminth, which reaches about 1.5 inches in length, fixes itself into the wall of the large intestine and feeds off the organ’s secretions for perhaps two years. The potential results of severe whipworm infection include anemia, clubbed fingers and, in children, stunted growth. But after exhausting his other options, Vik began to think of infecting himself with parasites as the most rational course of action. After all, the parasite had been with people since prehistory; Ötzi the Iceman, the 5,300-year-old mummy found frozen in the Italian Alps, had whipworm. Besides, the worst possible outcome of a whipworm infection was a kind of inflammatory bowel disease. And he already had that.

His doctor was dead set against the idea, Vik told me. So was his wife, a doctor in training. (They later divorced.) Vik is a driven, entrepreneurial type, though, and undeterred, he began emailing any expert who “would listen to my crazy ideas.” In 2004, he flew to Bangkok to meet a parasitologist who agreed to hear him out. He brought along his father, a professor and internist, for “gravitas.” (Vik’s father, who worked in Southeast Asia as a young doctor, told me it was common then to leave light whipworm infections untreated.)

The Thai parasitologist later handed him a vial of fluid containing whipworm eggs. Microscopic in size, they had come from an 11-year-old girl in southern Thailand, he was told. Vik flew home.

Next began what Vik describes as “the most difficult part” of his life. He set up a lab in his parents’ Southern California home and stocked it with a microscope, petri dishes, slides and flasks purchased on eBay. But he couldn’t get the eggs to “embryonate.” Just as chicken eggs need to incubate to hatch, helminth eggs require “embryonation” to produce infective larvae. Parasite eggs are excreted in feces, and in their native tropics, that embryonation occurs naturally after the eggs spend time in warm, humid conditions. But reproducing those conditions in his parents’ house proved difficult. He tried various conditions — warm, wet, cool, dry, light, dark — to no avail: The eggs remained inert. Swallowed in this state, they would pass right through his gut without hatching.

The breakthrough came when, imagining defecation under a tree, Vik abandoned his goal of antiseptic incubation and began using nonsterile tap water in the petri dish. Now the eggs, football-shaped and translucent under the microscope, began to display a knotted, ropy shape within — developing larvae — indicating embryonation. Months after returning from Thailand, he finally drank a glass of water containing a few hundred whipworm eggs.

Three months later, he swallowed another thousand eggs. Ova began showing up in his stool, indicating that his body now hosted living, breeding parasites. When he tapered off his drugs, his colitis remained quiescent. Instead of triumph, though, Vik felt doubt. Was this real? Or was it a natural ebb in his disease? “I wanted proof,” he told me.

The story goes on to say that over the past decade, thousands of people around the world have introduced parasites into their bodies on purpose, hoping to treat immune-related disorders. Some have drawn inspiration directly from Vik’s case study, which appeared in the journal Science Translational Medicine in 2011. But many more have been inspired by the same research that inspired Vik. A confluence of factors is driving what is essentially an amateur quest to “rewild” the modern body and restore it to an imagined prelapsarian state. The internet has facilitated the sharing of information, both reliable and not. But maybe more important, scientists are wrestling with germ theory, a cornerstone of modern medicine, and beginning to articulate a more nuanced idea: that the organisms in our bodies not only make us sick but also keep us healthy. Participants in the parasite underground see themselves as acting on this new emerging paradigm.

BS: Less disinfectant, more Rioja

Tal Abbary, a freelance writer, writes in the quickly diminishing N.Y. Times that she recently moved back to South Florida after seven years in Spain, where supermarket shelves are curiously empty of antibacterial products and superbug threats have not yet become the stuff of media commentary.

Spaniards (or rather, their maids) can scour a home clean like no one else, but bleach is the product of choice, and in recent years, the public has focused its measured fears on high unemployment rates, home evictions or government corruption. Kitchen counters are generally considered innocuous.

A common cultural motto is the psychologically cool “no pasa nada” (roughly — no big deal), meant to take the wind out of the sails of just about any of life’s problems. This is a hard-won ethos in a country that has endured, over mere decades, a bloody civil war followed by dictatorship, transition to democracy, meteoric economic growth, rising immigration and the current financial slump.

This is food safety idiocy.

While culturally correct in the right social circles that also are anti-vaxx, anti-GMO, and can afford to live in New York City, the data suggest otherwise.

My mother was four years old when she suffered a bout of undulant fever.

Gramps got rid of the cows the next day.

Even now, with whole genome sequencing and other molecular tools, we humans fail at the most basic microbiological tests: the hygiene hypothesis leaves a lot of bodies.

Nosestretcher alert: Invite some germs to dinner

When Michael Pollan endorses an article, I know it’s BS.

So it is with Kate Murphy’s piece in the New York Times on Sunday, which says the U.S. food supply is “arguably the safest in the world” and asks “whether our food could perhaps be too clean.”

I’ve been hearing this for 25 years. It’s a tantalizing belief but at this point that’s all it is – a belief.

Cherry-picking data to support a pre-existing theory remains a belief.

I could tell an equal number of stories about my mother who got undulant fever from raw milk as a child, or my aunt who suffered with cyclospora from basil in Florida, or Chapman who spent a weekend in our toilet from Campylobacter in Kansas, but it’s not science.

There are research areas worth exploring, but we humans don’t know much about applying this germ theory, especially to the genetically susceptible.

The theory that there might be such a thing as “too clean” food stems from the hygiene hypothesis, which has been gaining traction over the last decade. It holds that our modern germaphobic ways may be making us sick by harming our microbiome, which comprises all the microscopic beasties — bacteria, viruses, fungi, mites, etc. — that live in and on our bodies.

Research so far has focused primarily on the detrimental effects of cesarean births and not breast-feeding, which may inhibit the formation of a robust microbiome, and the use of antibacterial soaps and antibiotics, which diminish the microbiome once it is established.

A result is an immune system that essentially gets bored, spoiling for a fight and apt to react to harmless substances and even attack the body’s own tissues. This could explain the increasing incidence of allergies and autoimmune disorders such as asthma, rheumatoid arthritis and inflammatory bowel syndrome.

It could also explain my latest fart.

There is also the suggestion that a diminished microbiome disrupts hormones that regulate hunger, which can cause obesity and metabolic disorders.

When it comes to foodborne illness, the idea is that fewer good bacteria in your gut means there is less competition to prevent colonization of the bad microbes, leading to more frequent and severe bouts of illness.

Moreover, your underutilized immune system may lose its ability to discriminate between friend and foe, so it may marshal its defenses inappropriately (e.g., against gluten and lactose) or not at all.

All of this is hard to prove.

That should be the headline.

Anyone who has visited a country with less than rigorous sanitation knows the locals don’t get sick from foods that can cause tourists days of toilet-bound torment.

That’s because the susceptible ones have died off.

“We have these tantalizing bits of evidence that to my mind provide pretty good support for the hygiene hypothesis, in terms of foodborne illness,” said Guy Loneragan, an epidemiologist and professor of food safety and public health at Texas Tech.

Yes, it’s tantalizing.

This is not to say we’d be better off if chicken producers eased up on the salmonella inspections, we ate recalled ice cream sandwiches and didn’t rinse our produce.

Rinsing produce ain’t going to do much either way, but it may make the consumer feel cleaner.

Murphy says it is worth noting that serious foodborne diseases — the ones that make it into the news, like listeria, salmonella, E. coli, cryptosporidium and campylobacter — are mainly diseases of immuno-compromised populations.

Nonsense.

The E. coli O104 outbreak in sprouts in Germany in 2011 that killed 53 and sickened 4,400 primarily struck middle-aged, healthy women, because they eat more salads.

That’s science.

Zoonoses: are we too clean?

Hygiene hypothesis: we really don’t know much. A recent episode of the TVO current affairs show “The Agenda with Steve Paikin” explores the topic of “Our Relationship with Cleanliness” – an informative yet fun look at the topic of germs. Panelists take a cultural, historical, psychological and sociological look at the microorganisms on us and around us – and how we respond to them (including some points on contact with pets).

Video is available here: http://www.wormsandgermsblog.com/2014/12/articles/miscellaneous/are-we-too-clean/index.html

Picking your nose and eating it may be good for you

It is called barfblog, and anyone with kids knows they do gross things. So do the adults.

I’ve known people who picked their nose and subtly ate it, but we all saw.

The four-year-old daughter also thinks no one is watching as she prepares to snack down, to which both parents say, use a tissue.

But despite everything you may have heard from your mom, picking your nose and eating what you find may have some health benefits, according to a biochemistry professor at the University of Saskatchewan in Saskatoon.

“By consuming those pathogens caught within the mucus, could that be a way to teach your immune system about what it’s surrounded with?” is the hypothesis Scott Napper posed to his students.

CBC cited Napper as noting that snot has a sugary taste and that may be a signal to the body to consume it and derive information for the immune system.

“I’ve got two beautiful daughters and they spend an amazing amount of time with their fingers up their nose,” he said. “And without fail, it goes right into their mouth afterwards. Could they just be fulfilling what we’re truly meant to do?”

Failings with the hygiene hypothesis

My mother contracted undulant fever as a child in the 1940s.

My grandfather, to his credit, promptly got rid of the dairy cows, and went into potatoes, and then became the asparagus baron of Canada.

Scott Weese, an OK hockey player and vet prof at the University of Guelph, where I used to ply my trade, delves into the hygiene hypothesis on his Worms & Germs Blog.

Yes, I do say, don’t eat poop, but the amendment would be, if you do, cook it.

And like Weese, I’ve heard the same question for 20 years on the speaker and blog circuit: is our food too clean?

How clean do we want things to be, asks Weese, and can we be too clean? Furthermore, does reduction in our exposure to microorganisms predispose us to various diseases, such as allergic and inflammatory diseases? The answer to both of these is presumably yes. However, what level of clean is good and what level is excessive?

In a hospital, we want clean… very clean. We have a highly susceptible population and lots of bad bugs in circulation. We want close attention paid to disinfection and thorough hand hygiene in hospitals, no doubt about it. But what about in the general population? Antibacterial soaps are not generally recommended for households because there’s no evidence they are needed and they might increase the likelihood of antibiotic resistance (since bacteria that become resistant to antibacterial agents in soaps can also be resistant to some antibiotics). We don’t need high-level disinfection as a routine practice all over the house. At certain times and in certain areas, sure, it’s certainly still a good idea. For example, if you’re working with raw chicken, careful attention to hygiene and surface disinfection is important because of the high likelihood of exposure to some important pathogens (e.g. Salmonella). But do we need to be spraying disinfectants around the rest of the house on a routine basis (as some TV commercials indicate)? Probably not.

Being a germaphobe can be good, but maybe it can also be bad. We need to think about the role of this complex and massive (yet still poorly defined) microbial population that lives with us. How much exposure to bacteria from different sources is actually needed for health, especially in kids? How much is harmful? There has to be a middle ground.

Watching daughter Sorenne slowly recover from whatever made her stronger back in April via 14 vomits and five diarrheal episodes reinforced, to me, how little is known.

The concept of exposing people to germs at an early age to build immunity is known as the hygiene hypothesis.

I’m not an immunologist, but the idea makes biological sense; I do, however, get concerned with the details, and generalizations.

We know immune systems take several years to develop in young children, and things start to go downhill after 55 (Freedom 55?). A little dirt may be good for kids, but there will always be some who, through genetics, environment and other unknowns, will be more susceptible to disease than others. And we’re not smart enough to know who those individuals are. The good ole’ days usually included stories about a family that lost a kid.

My mother came close.

Then there’s the policy. I can’t imagine the agriculture minister or secretary announcing that investments in a lot of this food safety stuff would be better spent on other societal priorities: “We’ve done a cost-benefit analysis and decided it’s better for everyone to get a little sick. We’re going to lose a few, and we don’t know who those few (or many) are, but it’s a cost-effective approach.”


If the hygiene hypothesis is real, does it matter?

The most frequently asked question from public and scientific crowds at any food safety jamfest I’ve done over the past 20 years: Is food too clean?

It comes from that adage, what doesn’t kill you makes you stronger.

But what if it kills you? Or causes irreparable damage, like 8-year-old Brit, Elisabeth Willoughby, who contracted toxocariasis, probably from contact with dog doo while crawling in the park as an infant. Her right eye was permanently scarred by the roundworm parasite.

Watching daughter Sorenne slowly recover from whatever made her stronger the other night via 14 vomits and five diarrheal episodes reinforced, to me, how little is known.

The concept of exposing people to germs at an early age to build immunity is known as the hygiene hypothesis.

I’m not an immunologist, but the idea makes biological sense; I do, however, get concerned with the details, and generalizations.

Medical types have suggested that the hygiene hypothesis explains the global increase of allergic and autoimmune diseases in urban settings. It has also been suggested that the increase reflects changes in society and environmental exposures, such as giving antibiotics early in life.

Researchers at Brigham and Women’s Hospital (BWH) reported in Science last month that exposing germ-free mice to microbes during their first weeks of life, but not later in adult life, led to a normalized immune system and prevention of disease.

Moreover, the protection provided by early-life exposure to microbes was long-lasting, as predicted by the hygiene hypothesis.

"These studies show the critical importance of proper immune conditioning by microbes during the earliest periods of life," said Richard Blumberg, MD, chief for the BWH Division of Gastroenterology, Hepatology and Endoscopy, and co-senior study author, in collaboration with Dennis Kasper, MD, director of BWH’s Channing Laboratory and co-senior study author. "Also now knowing a potential mechanism will allow scientists to potentially identify the microbial factors important in determining protection from allergic and autoimmune diseases later in life."

Does that mean if your kid gets an infectious disease later in life, parents are negligent for not exposing them to a little infectious disease earlier in life?

It all sounds romantically agrarian – a little dirt is good for you – until specifics get in the way; specifics like, it’s your kid.

My answer to questioning minds goes something like this:

We know immune systems take several years to develop in young children, and things start to go downhill after 55. (Freedom 55?) A little dirt may be good for kids, but there will always be some who, through genetics, environment and other unknowns, will be more susceptible to disease than others. And we’re not smart enough to know who those individuals are. The good ole’ days usually included stories about a family that lost a kid. And it was probably some kind of infectious disease. Western societies have enough science and enough affluence to decide, one is too many.

Then there’s the policy. I can’t imagine the agriculture minister or secretary announcing that investments in a lot of this food safety stuff would be better spent on other societal priorities: “We’ve done a cost-benefit analysis and decided it’s better for everyone to get a little sick. We’re going to lose a few, and we don’t know who those few (or many) are, but it’s a cost-effective approach.”

T. Olszak, D. An, S. Zeissig, M. P. Vera, J. Richter, A. Franke, J. N. Glickman, R. Siebert, R. M. Baron, D. L. Kasper, R. S. Blumberg. Microbial exposure during early life has persistent effects on natural killer T cell function. Science, 2012; DOI: 10.1126/science.1219328

Food allergies linked to hygiene hypothesis? ‘If the price of fewer allergies is more infection, no parent would expose their child to more infection’

People from well-educated families are almost twice as likely to suffer from some dangerous food allergies as others — possibly because their bodies’ natural defences have been lowered by rigorous hygiene and infection control, suggests a new Canadian study.

The research from McGill University also found that immigrants were about half as likely to be afflicted by the allergies, perhaps reflecting differences in diet and environment between their countries of origin and Canada.

The study, just published in the Journal of Allergy, was meant to address an enduring medical mystery: Why have so many people in certain industrialized countries developed violent reactions to peanuts, shellfish and other foods in recent decades?

The link to higher education may be explained by what is called the hygiene hypothesis, the unproven idea that smaller families, cleaner homes, more use of antibiotics to treat infections and vaccines to prevent them have curbed development of the immune system, said Dr. Moshe Ben-Shoshan, who led the research. That in turn could make some people more susceptible to allergy.

If the hypothesis does actually explain some food reactions, though, parents may not be able to do much about it, admitted the allergist at Montreal Children’s Hospital. The benefits of such health products as antibiotics and vaccines easily outweigh the risk of children developing serious allergies, said Dr. Ben-Shoshan.

“We can’t suggest we become dirtier and expose our children to more bacteria,” he said. “If the price of having fewer allergies is more infection, I don’t know any parent who would expose their child to more infection.”

The study’s findings are far from conclusive but they, and the hygiene hypothesis as an explanation, seem plausible, said Dr. Stuart Carr, president of the Canadian Society of Allergy and Clinical Immunology. He also cautioned, however, that translating the knowledge into preventive action would be complicated.

Local food is not inherently safer food

The idea that food grown and consumed locally is somehow safer than other food, either because it contacts fewer hands or any outbreaks would be contained, is the product of wishful thinking.

Barry Estabrook of Gourmet magazine is the latest to invoke the local-is-pure fantasy, writing:

“There is no doubt that our food-safety system is broken. But with the vast majority of disease outbreaks coming from industrial-scale operations, legislators should have fixed the problems there instead of targeting small, local businesses that were never part of the problem in the first place.”

As soon as someone says there’s “no doubt” I am filled with doubt about the quality of the statement that is about to follow.

Foodborne illness is vastly underreported — it’s known as the burden of reporting foodborne illness. Someone has to get sick enough to go to a doctor, the doctor has to be bright enough to order the right test, the person has to live in a state where that foodborne illness is a reportable disease, and then the case has to get registered by the feds. For every known case of foodborne illness, there are 10 to 300 other cases, depending on the severity of the bug.
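
A back-of-the-envelope way to see what that 10-to-300 multiplier means (a minimal Python sketch; the only numbers taken from the text are the multiplier bounds, and the reported-case count is a made-up placeholder):

```python
# Illustrative sketch only: scale reported cases by the 10-300x
# under-reporting range mentioned above to bound the true burden.

def estimated_true_cases(reported, low_multiplier=10, high_multiplier=300):
    """Return (low, high) estimates of actual illnesses for a reported count."""
    return reported * low_multiplier, reported * high_multiplier

if __name__ == "__main__":
    reported = 1_000  # hypothetical number of lab-confirmed cases
    low, high = estimated_true_cases(reported)
    print(f"{reported:,} confirmed cases could mean {low:,} to {high:,} actual illnesses")
```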

Most foodborne illness is never detected. It’s almost never the last meal someone ate, or whatever other mythologies are out there. A stool sample linked with some epidemiology or food testing is required to make associations with specific foods.

Newsweek has an excellent article this week about the U.S. Centers for Disease Control and its Disease Detective Camp, where teenagers learn how to form a hypothesis about a disease outbreak and conduct an investigation. The key lies only partly in state-of-the-art technology. At least half the challenge is figuring out the right questions to ask. Who has contracted the disease? Where have they been? Why were they exposed to this pathogen?

Maybe the vast majority of foodborne outbreaks come from industrial-scale operations because the vast majority of food and meals comes from industrial-scale operations. To accurately compare local and other food, a database would have to be constructed so that illnesses could be compared on a per-meal or even per-ingredient basis.
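
To see why raw outbreak counts need a denominator, here is a toy per-meal comparison (a minimal sketch; every number in it is an invented placeholder). The only point is that the source with more total illnesses can still have the lower per-meal rate if it serves vastly more meals:

```python
# Toy comparison of food sources by illnesses per million meals served.
# Every number below is invented for illustration; no real data implied.

def illnesses_per_million_meals(illnesses, meals_served):
    """Crude rate: illnesses per million meals."""
    return illnesses / meals_served * 1_000_000

if __name__ == "__main__":
    sources = {
        # source: (hypothetical illnesses, hypothetical meals served)
        "industrial-scale": (9_000, 10_000_000_000),
        "local": (100, 50_000_000),
    }
    for name, (ill, meals) in sources.items():
        rate = illnesses_per_million_meals(ill, meals)
        print(f"{name}: {ill:,} illnesses over {meals:,} meals "
              f"= {rate:.2f} per million meals")
```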