Stray cat blues and zoonoses

Although trained in molecular biology, genetics and food science, I always had a desire to additionally express my creativity as a writer. In 1986, I put aside my fears and made my way to the second floor of the university centre, home to all things student, including the University of Guelph student newspaper, The Ontarion.

I approached the editor-in-chief, who had issued a call for additional writers, and boldly – it seemed bold at the time – said, "I want to write."

But rather than do record reviews or movie reviews, which everyone wanted to write, I said I wanted to write about science. I was an MSc student at the time.

After a few published pieces, I was awarded a weekly science column. The next year, through a series of weird events, I became editor-in-chief and lost interest in grad school (or was it the other way around?).

For a newbie journo, it was a struggle to come up with a weekly column while running my experiments to understand the basis of disease resistance – Verticillium wilt – in tomatoes.

My cats led the way.

My first warm-blooded pets – whom I named Clark and Kent – had been provided by my veterinary student girlfriend. I became fascinated with cat behavior, and thought the 25,000 potential readers would share my cat voyeurism.

Some did – enough to secure my weekly column and let me write about other sciencey things.

Researchers from the Centers for Disease Control and Prevention, Atlanta, Georgia, (B. Breedlove) and the British Library, London, UK (J. Igunma) write in the Dec. 2020 issue of Emerging Infectious Diseases that Felis catus, the only domesticated species of cat in the family Felidae, flourishes on every continent except Antarctica. Able to thrive in almost any climate and habitat, it is among the world’s most invasive species. Current estimates of the global cat population, including pet, stray, and feral cats, range from 200 million to 600 million. Where there are humans, more than likely there are also cats.

Humans living in agricultural villages in northern Africa and the Near East are believed to have domesticated the African wildcat (Felis lybica) between 8,000 and 12,000 years ago. Archaeologist Magdalena Krajcarz and colleagues noted, “The cat’s way to domestication is a complex and still unresolved topic with many questions concerning the chronology of its dispersal with agricultural societies and the nature of its evolving relationship with humans.” Stored grains and trash piles in villages likely attracted rodent pests, which in turn lured local wildcats and initiated a nascent mutualistic relationship that has since flourished.

From those villages, cats found their way around the world. Authors Lee Harper and Joyce L. White wrote that ancient sailors “were quick to see the advantage of having cats aboard ship during long voyages to protect their food supplies from damage by rodents.” Trade and commerce helped spread cats from the Middle East to various ports of call in Europe, the Far East and the Americas. Throughout this common history, cats have been both reviled and revered by humans.

During the Middle Ages in Europe, some religious institutions considered cats evil, leading to thousands being killed. Later, however, the Black Death spread by fleas on rats contributed to cats’ redemption. Harper and White noted, “The cat’s skill as a hunter of vermin was desperately needed. Its reputation was salvaged. Owning a cat was back in style.”

Ancient Egyptians ascribed to cats many characteristics shared with deities they worshipped. Freyja, Norse goddess of love, beauty, and fertility, rode on a cat-drawn chariot. Temples in medieval Japan often kept a pair of cats to protect precious manuscripts from being ruined by mice. In the Kingdom of Siam, which is modern-day Thailand, Buddhist monks welcomed cats into their temples, where they were protected as Maeo Wat (Temple Cats).

This month’s cover art, “two lucky cats to support leadership,” is the second folio from A Thai Treatise on Cats, created in the 19th century in central Thailand and acquired by the British Library in 2011. Such manuscripts about cats were made for breeders in Thailand at least from the 18th century on, although it is believed that cat breeding goes back to the beginnings of the ancient Thai Kingdom of Ayutthaya in the 14th century.

The positive side of cat ownership, as celebrated in those cat treatises, is acknowledged on the CDC website, “Research has shown that cats can provide emotional support, improve moods, and contribute to the overall morale of their owners. Cats are also credited with promoting socialization among older individuals and physically or mentally disabled people.” Cats, as noted earlier, have also historically helped control the spread of rodent-borne diseases among humans.

Nonetheless, living in close quarters with cats carries some health risks. Cats can transfer various zoonotic diseases, including Campylobacter infection, cat-scratch disease, cryptosporidiosis, hookworm infection, plague, rabies, and salmonellosis. Cats are the only animal in which the Toxoplasma gondii parasite completes its life cycle, and humans in close contact with cat litter, for example, are at risk of developing toxoplasmosis, which pregnant women can potentially transmit to a fetus. Much less common is transfer of disease from humans to animals, such as the suspected case of human-to-cat transmission of severe acute respiratory syndrome coronavirus 2 reported in this issue. (Both owner and cat recovered.)

Detecting, responding to, and preparing for emerging zoonotic infections―which, like cats, have made their way around the world with our help―are major challenges for public health leaders. Even if cats are not actual talismans or have the power to improve leadership, spending a few minutes considering these lucky cats may provide public health officials a brief respite or serendipitous insight.

In consideration of our mutual relationship with cats

Emerging Infectious Diseases, vol. 26 no. 12

Byron Breedlove and Jana Igunma

Foodborne disease outbreaks in the United States: A historical overview

Understanding the epidemiology of foodborne disease outbreaks (FBDOs) is important for informing investigation, control, and prevention methods.

We examined annual summary FBDO data in the United States from 1938 to 2015 to help understand the epidemiology of outbreaks over time. Owing largely to changes in reporting procedures, the mean number of annual outbreaks was 378 before 1998 and 1,062 afterward.

A mean of 42% of outbreaks had a known etiology during 1961–1998; since then, an etiology has been identified in ∼65%, with a marked increase in the number of norovirus outbreaks. From 1967 to 1997, a mean of 41% of FBDOs occurred in restaurant settings, increasing to 60% in 1998–2015. Concurrently, the proportion of outbreaks occurring at a home decreased from 25% to 8%.

The mean size of outbreaks has decreased over time, and the number of multistate outbreaks has increased. Many social, economic, environmental, technological, and regulatory changes have dramatically affected the epidemiology of foodborne disease over time.

Foodborne Pathogens and Disease, Vol. 15 Issue 1

January 2018

Timothy Jones and Jane Yackley

History matters: Gordie Howe edition

One of the things that Doug and I often lament is that, for all of the focus on technology and progress, many food safety problems that pop up are recurring issues: not understanding how the bugs move/grow, poor management, poor execution or just not caring.

Same stuff almost every time.

A lot can be learned by paying attention to the bad stuff that has happened in the past.

History matters in hockey too. Last night my kids stayed up with hopes of seeing the Stanley Cup awarded live for the first time. Neither made it past the ten-minute mark of the third period, and San Jose forced a game six (so we have another chance Sunday).

During the game we talked a while about NHL players’ jersey numbers; Jack wears number 9 after finding out Gretzky wore it as a kid.

Gretzky chose that number because his idol Gordie Howe wore it.

Gordie died today at age 88.

Top 10 UK toilets through time

A Scottish food safety friend sent along this story from English Heritage which has some great pics.

1. Housesteads Roman Fort, Hadrian’s Wall: All together now…

The best-preserved Roman loos in Britain are at Housesteads Roman Fort on Hadrian’s Wall. At its height, the fort was garrisoned by 800 men, who would use the loo block you can still see today. There weren’t any cubicles, so men sat side by side, free to gossip on the events of the day. They didn’t have loo roll either, so many used a sponge on a stick, washed and shared by many people.

Visit Housesteads Roman Fort

2. Old Sarum, Wiltshire: Luxury facilities, until you have to clean them…

These deep cesspits sat beneath the Norman castle at Old Sarum, probably underneath rooms reached from the main range, like private bathrooms. In the medieval period luxury castles were built with indoor toilets known as ‘garderobes’, and the waste dropped into a pit below. It was the job of the ‘gongfarmer’ to remove it.

Visit Old Sarum

3. Dover Castle, Kent: The royal wee

Henry II made sure that Dover Castle was well provided with garderobes. He had his own en-suite facilities off the principal bed-chamber. As with many castles of the era, chutes beneath the garderobes were built so that the waste fell into a pit which could be emptied from outside the building.

Visit Dover Castle

4. Goodrich Castle, Herefordshire: The toilet tower

At Goodrich Castle there’s a whole tower dedicated to doing your business.

Visit Goodrich Castle

5. Orford Castle, Suffolk: A Norman urinal

Garderobes are quite common in medieval castles, but urinals are a little more unusual. Henry II’s Orford Castle was built as a show of royal power, and to guard the busy port of Orford.

Visit Orford Castle

6. Muchelney Abbey, Somerset: Thatched loo for monks

Many medieval abbey ruins across the country include the remains of the latrines, or ‘reredorter’ (meaning literally ‘at the back of the dormitory’), including Muchelney Abbey, Castle Acre Priory and Battle Abbey. At Muchelney the building survives with a thatched roof, making it the only one of its kind in Britain. The monks would enter the loo block via their dormitory and take their place in a cubicle – you can still see the fixings for the bench and partitions between each seat.

Visit Muchelney Abbey

7. Jewel Tower, London: The Privy Palace

A precious survival from the medieval Palace of Westminster, Jewel Tower was part of the ‘Privy Palace’, the residence of the medieval kings and their families from the 11th to the 16th century. It was well supplied with garderobes, with one on each of the three floors.

Visit Jewel Tower

8. Old Wardour Castle, Wiltshire: ‘A new discourse of a stale subject’

The forerunner to our modern flushing toilet was invented at Old Wardour Castle. The inventor Sir John Harington met with five others at the castle to discuss his idea for the first time in 1592.

Visit Old Wardour Castle

9. Audley End House, Essex: Feeling flush

Along with many other technological advancements, Audley End was one of the first country houses in England to have flushing toilets. The first of Joseph Bramah’s new hinged-valve water closets was purchased in 1775, and a further four were bought in 1785 at a cost equivalent to the wages of two servants for a whole year.

Visit Audley End

10. Brodsworth Hall, South Yorkshire: Thunderboxes

Inside the elegant Victorian country house of Brodsworth Hall almost everything has been left exactly as it was when it was still a family home. So as well as the grand furniture, there’s also everything from the commodes of the 1840s to a modern pink bathroom from the 1960s/70s.

Visit Brodsworth Hall

One hundred years of food safety extension

Ellen Thomas, a PhD candidate in the Department of Food, Bioprocessing and Nutrition Sciences at NC State, writes,

When I was growing up, I made occasional trips with my dad to the local extension office to drop off soil samples (we lived on a farm). Up until about 5 years ago, this was really my only experience with Cooperative Extension. It wasn’t until I began graduate school that I was introduced to the far-reaching world of extension. This year marks 100 years of Cooperative Extension in the United States. A United States Department of Agriculture extension webpage details the Congressional acts that initially created extension, as well as the primary goals of extension today. I also dug into numerous universities’ cooperative extension pages to learn more about how extension has evolved over the past century, and found numerous examples of agricultural courses offered to consumers, research conducted to improve food safety and communicate those steps to consumers, and technologies developed to vastly improve efficiency and opportunity for growers.

Ellen took the lead on an article for Food Safety Magazine detailing some of the history of food safety as it relates to the food industry, reprinted below.

Land-grant universities in the United States were established with the Morrill Acts of 1862 and 1890. Their mission was to educate the public on subjects of agriculture, home economics and other practical tasks in the home—to literally extend research and help families across the country. While food safety was not initially within the mission’s scope, food safety has a strong and intertwined history within land-grant universities and Cooperative Extension.

In 1890, Professor Stephen M. Babcock at the University of Wisconsin invented a device that tested the butterfat content of milk quickly and efficiently. He shared this technology with the university and dairy industry throughout the state, creating an open and engaging relationship between the university and the public that continues to this day.

In the early 1900s, advocates began to call for better-quality milk, milk sanitation laws and inspectors trained to enforce regulations consistently. This led to the creation of the International Association of Dairy and Milk Inspectors in 1912 (the precursor to the International Association for Food Protection). One of the nation’s greatest challenges was how to obtain the most technical, up-to-date information and effectively communicate it to dairy farmers.

In addition to teaching and research, land-grant universities have a long tradition of connecting academics and research to the masses, originally in largely rural areas through a delivery mechanism known as extension; 2014 marks 100 years of the Cooperative Extension system in the United States. The Smith-Lever Act in 1914 further solidified the role of extension in land-grant universities by creating a partnership with the U.S. Department of Agriculture (USDA), in which USDA would provide funds to each state to carry out extension work.

In North Carolina, strong extension programs emerged from canning clubs and corn clubs. These organizations were effective in providing useful information for those interested in home preservation, increasing crop yields, volunteerism and community fellowship. The clubs later developed into 4-H. The structure and overall group principles of 4-H were defined in 1919 at a meeting in Kansas City. Today, 4-H reaches 7 million American children and includes groups in rural, urban and suburban communities in every state; youth are exposed to a wide variety of topics in agriculture and STEM (science, technology, engineering and mathematics).

During World War I, extension helped increase crop yields and home preservation, and organized groups to fill gaps in the labor force. Extension helped create farming cooperatives and provided instruction on home practices to aid families during the Depression. During World War II, extension dramatically increased food production as part of the Victory Garden program.

In the 1960s, Rutgers University extension agricultural engineer William Roberts revolutionized greenhouse farming with the innovation of pumping air between plastic films. Approximately 65 percent of commercial greenhouses throughout the world use this technology today. Further similar greenhouse technology developments continued under Roberts in the years that followed.

In 1969, President Lyndon Johnson began the Expanded Food Nutrition Extension Program (EFNEP) as part of his War on Poverty. Program assistants were trained to teach nutrition and food safety, and to promote overall wellness. EFNEP now operates in all 50 states, Puerto Rico, American Samoa, Guam, the Virgin Islands, Northern Marianas and Micronesia. There are both adult and youth programs with the goal of promoting high-quality diets among audiences with lower incomes and limited access to resources.

The Master Gardener program began at Washington State University in the 1970s with the idea of training volunteers in horticulture who could in turn educate the public, reaching a larger audience. The curriculum included culturing plants, fruits and vegetables, and grasses; how to deal with pests, diseases and weeds; and how to safely administer pesticides. The curriculum was administered by state- and county-based faculty. Over time, the program has grown, gaining more recognition; it is now sponsored across the United States and Canada. The program structure has also been extended to other portions of extension, such as food preservation.

In 1998, listeriosis, a potentially serious illness caused by the bacterium Listeria, was linked to hot dogs and deli meats. The tragic outbreak included 108 cases, with 14 deaths and 4 miscarriages or stillbirths. Researchers at Colorado State University conducted extensive experiments to characterize Listeria and explore methods of mitigating its prevalence in foods. High-risk groups, particularly pregnant women, were the focus, and suggestions for reducing risk, such as heating deli meats before consumption, were distributed in extension fact sheets nationally.

Kansas State University enjoys a strong relationship with a variety of meat producers, which has been building over the past few decades. Meat science faculty engage in research related to meat quality, sensory evaluation, meat safety, color stability, packaging and numerous other factors related to meat from slaughter to handling at home. The department offers courses on campus and through distance learning, Hazard Analysis and Critical Control Points courses throughout the Midwest and value-added services for meat processors. The department has been releasing papers on optimal equipment for small producers, handling wild game and other technologies since the 1990s.

With the increase of foodborne illness associated with produce in the 1990s and 2000s, the University of California, Davis, established the Center for Produce Safety in 2007, which integrates industry, government and academic research with the ultimate goal of maximizing produce yields while maintaining the best quality and safety of product. The center provides short courses, workshops and certifications for produce growers, related to quality, postharvest technology and safety, and has funded numerous research projects.

Extension faces many challenges and opportunities as the system moves into its second century. While state and federal appropriations and other funding streams have decreased recently, agriculture has also changed—less than 2 percent of Americans are farmers today. The Food Safety Modernization Act, with the goal of making food safer, will provide many food businesses with new regulations to comply with. Cooperative Extension will continue to play an integral part in assisting businesses (especially the small and very small) to assess and manage food safety issues—and help consumers understand what goes into making food safe. Extension programs across the country have also increased their social media presence, continue to provide evidence-based recommendations and conduct applied research that affects food from farm to fork. Extension has adapted to numerous changes over the past century, taking the lead in bringing new food safety technologies to agriculture and food production worldwide.

We like the social media stuff: barfblog is now active on Facebook

Someone asked me about the history of barfblog this week – stuff like how it started and where the name came from.

Here’s how I remember it: Doug had been editing a bunch of daily listservs (FSNet, Agnet, Animalnet and FFnet) in some form since 1993. These were a big source of food safety-related news for risk managers (folks in industry, academia and the regulatory agencies) before Google Alerts, RSS feeds and Twitter existed. Beyond sharing what was going on in the food safety world, Doug encouraged the students and staff who worked for him to write evidence-based commentary and submit op-eds and letters to the major publications (back when there were actual newspapers).

I came along in 2000 and became a news junkie and jumped into the whole share-your-thoughts-in-an-interesting-way thing. Even with my grammar, spelling and general logic challenges. In 2005, when self-publishing was all the rage, we decided to start a forum to post stories about food safety experiences, the stuff that others didn’t publish or didn’t fit the format of the traditional newspapers.

And we started a blog. It wasn’t really a blog at the start, but a forum. And it got bombarded by porn spam. So we left it for a while and relaunched the whole thing in 2007.

But it needed a name.

Christian, a particularly creative undergraduate, came up with the name – barfblog (all in lowercase, as Dave Stanley always told Doug uppercase was a waste in e-mail, and he agrees) – and then created a video of him guzzling vermouth and actually barfing.

The idea was (and still is) to write stories about what makes people barf and take current news items and highlight what we thought was important – based on the literature and our experiences.

Doug’s more concise description is this:

Every time I talk to someone on a plane, train or automobile, they find out what I do, and then proceed to tell me their worst barf story. barfblog was created to capture those stories, except most people don’t want to be bothered writing, so we did it for them.

Since 2007 we’ve embraced social media as a channel to carry out that dialogue and increase discussion. But we’ve really sucked at Facebook. Until now. We’ve got a somewhat new, but now active space where we’ll be posting our, uh, blog posts as well as pictures and links. And we’re looking for folks to jump in on the discussions.
Check out barfblog on Facebook at

Food fraud ain’t nothing new

Whenever I’m confused I watch TV. And then I reach for history.

A new examination by the U.S. Pharmacopeial Convention (USP) found rising numbers of fake ingredients in products from olive oil to spices to fruit juice.

“Food products are not always what they purport to be,” Markus Lipp, senior director for Food Standards for the independent lab in Maryland, told ABC News.

In a new database, USP warns consumers, the FDA and manufacturers that the amount of food fraud it found is up by 60 percent this year.

USP tells ABC News that liquids and ground foods in general are the easiest to tamper with:

Olive oil: often diluted with cheaper oils

Lemon juice: cheapened with water and sugar

Tea: diluted with fillers like lawn grass or fern leaves

Spices: like paprika or saffron adulterated with dangerous food colorings that mimic the colors

Milk, honey, coffee and syrup are also listed by the USP as being highly adulterated products.

Also high on the list: seafood. The number one fake is escolar, an oily fish that can cause stomach problems, mislabeled as white tuna or albacore and frequently found on sushi menus.

The National Consumers League did its own testing on lemon juice this past year and found four different products labeled 100 percent lemon juice were far from pure.

And I now have the luxury of having my own lemon and Tahitian lime trees on my concrete balcony.

Food fraud is nothing new.

Historian Madeleine Ferrieres, until recently Professor of Modern History at the University of Avignon and the author of my favorite food book, 2002’s Mad Cow, Sacred Cow, said in a recent interview, “we still live with the illusion of modernity, with the false idea that what happens to us is new and unbearable. These are not risks that have arisen, but our consumer behavior has changed.”

What’s new is better tools to detect fraud, which also presents an opportunity: those who use the real deal should be able to prove it through DNA testing and brag about it.

The days of faith-based food safety are coming to a protracted close.

As Ferrieres wrote in her book,

“All human beings before us questioned the contents of their plates. … And we are often too blinded by this amnesia to view our present food situation clearly. This amnesia is very convenient. It allows us to reinvent the past and construct a complaisant, retrospective mythology.”

30 dead, 139 sick from listeria in cantaloupe; a history of US food safety disasters

Before Al Gore invented the Internet in 1994, there was this thing called paper, which was useful for keeping records.

Those with a fetish for the macabre or statistics may care that the listeria-in-cantaloupe outbreak, which has now killed 30 and sickened 139, pales in comparison to past outbreaks.

Robert Tauxe, deputy director of the U.S. Centers for Disease Control and Prevention’s Division of Foodborne, Waterborne and Environmental Diseases, told Elizabeth Weise of USA Today that the deadliest documented foodborne illness outbreak in the United States was in the winter of 1924-1925, when typhoid in raw oysters from New York City killed approximately 150 people and sickened more than 1,500.

The second-largest outbreak linked to food occurred in Boston in 1911. Then, streptococcus in raw, unpasteurized milk killed 48 people and sickened more than 2,000. The disease was described as "septic sore throat" at the time, Tauxe says. Similar but smaller outbreaks led the U.S. Public Health Service to begin a national move toward milk pasteurization in 1924.

In 1922 in Portland, Ore., another outbreak of "septic sore throat" killed 22 people and sickened 487. That round of streptococcus was also linked to raw, unpasteurized milk.

And in 1919, an outbreak of botulism from olives put up in glass jars in California killed at least 15 people in three states. It resulted in a major change in how items were canned so that botulism would no longer be a problem.

But, Americans don’t want medical care like that practiced in 1919, nor should food production be rooted in some nostalgic past. Every death and illness from food is tragic, especially if preventable; what can be done to prevent this happening again? Telling consumers to wash cantaloupes in bleach is not a solution.

A table of cantaloupe-related outbreaks is available at

Sprouts still suck even if The Packer thinks the problem is new

I still don’t like sprouts. Never have. When I inadvertently eat them (like when someone sneaks them into my sandwich, often at a food safety conference) I find myself picking them out of my teeth.

The consumption of raw sprouts has been linked to more than 40 outbreaks of foodborne illness internationally going back to 1988, including a whopping 648 people sickened in a salmonella-in-sprouts outbreak in Ontario (that’s in Canada) in 2005.

So it’s somewhat baffling that one of the flagship publications of the produce industry, The Packer, would come out with an editorial today that opens with,

“Ensuring food safety in fresh produce has been the highest-profile concern for the industry since 2006’s outbreak linked to spinach.”

The 2006 E. coli O157:H7 outbreak in spinach was the 29th identified outbreak linked to leafy greens. Lots of people, including the U.S. Food and Drug Administration, started paying attention to microbial food safety concerns in fresh produce beginning with the E. coli O157:H7 outbreak in unpasteurized Odwalla cider in 1996.

The Packer then states, and they are apparently serious, “Though they haven’t garnered as much concern — yet — sprouts have been a recent and recurring source of illness.”

The first consumer warning about sprouts was issued by the U.S. Centers for Disease Control (CDC) in 1997. By July 9, 1999, FDA had advised all Americans to be aware of the risks associated with eating raw sprouts. Consumers were informed that the best way to control the risk was to not eat raw sprouts. The FDA stated that it would monitor the situation and take any further actions required to protect consumers.

In Jan. 2002, CDC issued a renewed call for Americans to avoid fresh alfalfa or other sprouts, warning that people, particularly young children, the elderly and those with weak immune systems, should avoid eating raw sprouts. Dr. Mark Beatty of CDC’s National Center for Infectious Diseases said at the time, "Immunocompromised people could develop shock and die from the infection," although healthy people were at a lower risk for such complications. Beatty was further quoted as saying that a 2001 outbreak in four western states revealed a "misconception" that sprouts were a healthy food. At least three of the people involved in the outbreak ate sprouts partly for health reasons.

Because of continued outbreaks, the sprout industry, regulatory agencies and the academic community pooled their efforts in the late 1990s to improve the safety of the product, including the implementation of good manufacturing practices, establishing guidelines for safe sprout production and chemical disinfection of seeds prior to sprouting.

But are such guidelines actually being followed? And is anyone checking? In response to the 2001 outbreak, the California Department of Health Services and the California Department of Education recommended that schools stop serving uncooked sprouts to young children.

There is a lot of turnover at trade magazines, and it’s proving harder to find decent writers who know the history of a topic rather than tracers who regurgitate whatever is out there. So why would The Packer close with,

“Much has been learned about food safety best practices since the industry’s wakeup call with spinach.”

Best practices for fresh produce were first published by FDA in 1998. Trade rags do no one any favors with memories of convenience.