If the hygiene hypothesis is real, does it matter?

The question most frequently asked by public and scientific crowds at any food safety jamfest I’ve done over the past 20 years: is food too clean?

It comes from that adage: what doesn’t kill you makes you stronger.

But what if it kills you? Or causes irreparable damage, as it did for 8-year-old Brit Elisabeth Willoughby, who contracted toxocariasis, probably from contact with dog doo while crawling in the park as an infant. Her right eye was permanently scarred by the roundworm parasite.

Watching daughter Sorenne slowly recover the other night from whatever was making her stronger, via 14 vomits and five diarrheal episodes, reinforced, for me, how little is known.

The concept of exposing people to germs at an early age to build immunity is known as the hygiene hypothesis.

I’m not an immunologist, but the idea makes biological sense; I do, however, get concerned about the details, and the generalizations.

Medical types have suggested that the hygiene hypothesis explains the global increase of allergic and autoimmune diseases in urban settings. The hypothesis has also been linked to changes in society and in environmental exposures, such as giving antibiotics early in life.

Researchers at Brigham and Women’s Hospital (BWH) reported in Science last month that exposing germ-free mice to microbes during their first weeks of life, but not later in adulthood, led to a normalized immune system and protection from disease.

Moreover, the protection provided by early-life exposure to microbes was long-lasting, as predicted by the hygiene hypothesis.

"These studies show the critical importance of proper immune conditioning by microbes during the earliest periods of life," said Richard Blumberg, MD, chief for the BWH Division of Gastroenterology, Hepatology and Endoscopy, and co-senior study author, in collaboration with Dennis Kasper, MD, director of BWH’s Channing Laboratory and co-senior study author. "Also now knowing a potential mechanism will allow scientists to potentially identify the microbial factors important in determining protection from allergic and autoimmune diseases later in life."

Does that mean that if your kid gets an infectious disease later in life, the parents were negligent for not exposing them to a little infectious disease earlier?

It all sounds romantically agrarian – a little dirt is good for you – until specifics get in the way; specifics like, it’s your kid.

My answer to questioning minds goes something like this:

We know immune systems take several years to develop in young children, and things start to go downhill after 55. (Freedom 55?) A little dirt may be good for kids, but there will always be some who, through genetics, environment and other unknowns, will be more susceptible to disease than others. And we’re not smart enough to know who those individuals are. The good ol’ days usually included stories about a family that lost a kid, probably to some kind of infectious disease. Western societies have enough science and enough affluence to decide that one is too many.

Then there’s the policy. I can’t imagine the agriculture minister or secretary announcing that investments in a lot of this food safety stuff would be better spent on other societal priorities: "We’ve done a cost-benefit analysis and decided it’s better for everyone to get a little sick. We’re going to lose a few, and we don’t know who those few (or many) are, but it’s a cost-effective approach."

T. Olszak, D. An, S. Zeissig, M. P. Vera, J. Richter, A. Franke, J. N. Glickman, R. Siebert, R. M. Baron, D. L. Kasper, R. S. Blumberg. Microbial exposure during early life has persistent effects on natural killer T cell function. Science, 2012; DOI: 10.1126/science.1219328