Babies Have a Microbial Window of Opportunity [Excerpt]

The microbiome of infants just after birth can have important health consequences later in life, researchers explain in a new book

Excerpted from Let Them Eat Dirt: Saving Your Child from an Oversanitized World by B. Brett Finlay, Ph.D., and Marie-Claire Arrieta, Ph.D. © 2016 by B. Brett Finlay, Ph.D., and Marie-Claire Arrieta, Ph.D. Reprinted with permission of Algonquin Books of Chapel Hill. All rights reserved.

Until very recently, whenever we thought of microbes — especially around babies — we considered them only as potential threats and were concerned with getting rid of them, and it is no surprise why. In the past century, most human communities have experienced the benefits of medical advances that have reduced the number and the severity of infections we suffer throughout life. These advances include antibiotics, antivirals, vaccinations, chlorinated water, pasteurization, sterilization, pathogen-free food, and even good old-fashioned hand washing. The quest of the past 100 years has been to get rid of microbes — “the only good microbe is a dead one.” This strategy has served us remarkably well; nowadays, dying from a microbial infection is a very rare event in developed countries, whereas only 100 years ago 75 million people died worldwide over a span of two years from infection with the H1N1 influenza virus, also known as the Spanish flu. At first glance, our war on microbes, along with other medical advances, has truly paid off. The average life span in the US in 1915 was 52 years, approximately 30 years shorter than it is today. For better or for worse, there are almost four times more humans on this planet than only 100 years ago, an incredibly accelerated rate of growth on our historical timeline. Evolutionarily speaking, we have hit the jackpot. But at what price?

The prevalence of infectious diseases declined sharply after the emergence of antibiotics, vaccines, and sterilization techniques. However, there has been an explosion in the prevalence of chronic non-infectious diseases and disorders in developed countries. One hears about these in the news all the time, since they’re very common in industrialized nations, where alterations to our immune system play an important role in their development. They include diabetes, allergies, asthma, inflammatory bowel diseases (IBD), autoimmune diseases, autism, certain types of cancer, and even obesity. The incidence of some of these disorders is doubling every ten years, and they are starting to appear earlier in life, often in childhood. They are our new epidemics, our modern-day bubonic plague. (By contrast, these diseases have remained at much lower levels in developing countries, where infectious diseases and early childhood mortality are still the major problems.) Most of us know someone suffering from at least one of these chronic illnesses; given this prevalence, researchers have focused their attention on identifying the factors that cause them. What we know now is that although all of these diseases have a genetic component, their increased prevalence cannot be explained by genetics alone. Our genes simply have not changed that much in just two generations—but our environment sure has.


About twenty-five years ago a short scientific article published by an epidemiologist from London attracted a lot of attention. Dr. David Strachan proposed that a lack of exposure to bacteria and parasites, specifically during childhood, may be the cause of the rapid increase in allergy cases, since it prevents proper development of the immune system. This concept was later termed the “hygiene hypothesis,” and an increasing number of studies have explored whether the development of many diseases, not just allergies, can be explained by this hypothesis. There is now a large body of solid evidence supporting Dr. Strachan’s proposal as generally correct. What remains less clear is exactly which factors are responsible for this lack of microbial exposure. For his study on allergies, Dr. Strachan concluded that “declining family size, improvements in household amenities, and higher standards of personal cleanliness” contributed to this reduced contact with microbes. While this may be true, there are many other modern-life changes that have an even stronger impact on our exposure to microbes.

One of these changes can be attributed to the use, overuse, and abuse of antibiotics—chemicals designed to kill bacteria indiscriminately. One of the greatest discoveries of the twentieth century, if not the greatest, antibiotics marked a watershed moment in modern medicine. Prior to their advent, 90 percent of children who contracted bacterial meningitis died; now most fully recover if treated early. Back then, a simple ear infection could spread to the brain, causing extensive damage or even death, and most modern surgeries would have been impossible to contemplate. The use of antibiotics, however, has become far too commonplace. Between the years 2000 and 2010 alone there was a 36 percent increase in the use of antibiotics worldwide, a phenomenon that appears to follow the economic growth trajectory of countries such as Russia, Brazil, India, and China. One troubling aspect of these numbers is that antibiotic use peaks during influenza outbreaks, even though antibiotics are not effective against viral infections (they are designed to kill bacteria, not viruses).

Antibiotics are also widely used as growth supplements in agriculture. Giving cattle, pigs, and other livestock low doses of antibiotics causes significant weight gain in the animals and, consequently, an increase in the meat yield per animal. This practice is now banned in Europe but is still legal in North America. It seems that antibiotic overuse in humans, especially in children, is inadvertently mimicking what occurs in farm animals: increased weight gain. A recent study of 65,000 children in the US showed that more than 70 percent of them had received antibiotics by age two, and that those children averaged eleven courses of antibiotics by age five. Disturbingly, children who received four or more courses of antibiotics in their first two years were at a 10 percent higher risk of becoming obese. In a separate study, epidemiologists from the Centers for Disease Control and Prevention (CDC) found that states in the US with higher rates of antibiotic use also have higher rates of obesity.

While these studies didn’t prove that antibiotics directly cause obesity, the consistency of these correlations, as well as those observed in livestock, prompted scientists to take a closer look. What they found was astonishing. A simple transfer of intestinal bacteria from obese mice into sterile (“germ-free”) mice made the recipient mice obese, too! We’ve heard before that many factors lead to obesity: genetics, high-fat diets, high-carb diets, lack of exercise, etc. But bacteria—really? This raised skepticism even among the biggest fanatics in microbiology, those of us who tend to think that bacteria are the center of our world. However, these types of experiments have been repeated in several different ways, and the evidence is very convincing: the presence or absence of certain bacteria early in life helps determine your weight later in life. Even more troubling, additional research shows that altering the bacterial communities that inhabit our bodies affects not just weight gain and obesity but many other chronic diseases in which we previously had no clue that microbes might play a role.

Let’s take asthma and allergies as an example. We have all witnessed the rapid increase in the number of children suffering from these two related diseases. Just a generation ago it was rather unusual to see children with asthma inhalers in schools. Nowadays, 13 percent of Canadian children, 10 percent of US children, and 21 percent of Australian children suffer from asthma. Peanut allergies? They used to be incredibly rare, but they are now so frequent and so serious that they have led to peanut-free schools and airplanes. As with the obesity research, it is now evident that receiving antibiotics during childhood is associated with an increased risk of asthma and allergies.

Our laboratory at the University of British Columbia became very interested in this concept and decided to do a simple experiment. As had been observed in humans, giving antibiotics to baby mice made them more susceptible to asthma, but what we observed next left us in awe. If the same antibiotics were given after the mice were weaned and no longer in the care of their mothers, there was no effect on susceptibility to asthma. There appeared to be a critical window of time, early in life, during which antibiotics had an effect on the development of asthma. When given orally, the antibiotic we chose, vancomycin, kills only intestinal bacteria and does not get absorbed into the blood, lungs, or other organs. This finding implied that the antibiotic-driven change in the intestinal bacteria caused the increase in the severity of asthma, a disease of the lungs! This experiment, as well as others from several different labs, came to the same conclusion: modifying the microbes that live within us at the beginning of our lives can have drastic and detrimental health effects later on. The discovery that this early period in life is so vulnerable and so important tells us that it is crucial to identify the environmental factors that disturb the microbial communities inhabiting us during childhood.

We went on to study this in humans instead of mice. Our question was simple: are there differences in the intestinal microbes of babies who go on to develop asthma later in life compared with babies who do not? We sequenced the intestinal microbiota in samples of babies’ feces and found something that really surprised us. Three-month-old babies at high risk of developing asthma later in life were missing four types of bacteria that babies at low risk had, but when we looked at the microbiota at one year of age, the differences were essentially gone. All of this—plus other studies about antibiotics, farms, etc.—points to a critical window of time in which microbial changes in the intestine have long-term immune consequences in the lung. In addition, we found that the differences were not limited to the types of bacteria found in feces but extended to some of the compounds they produce. Interestingly, one of these compounds was acetate, a bacterial product made in the gut, and many of them were bacterial compounds detected in the urine of these babies, further proof that bacterial metabolites travel everywhere in our bodies.

Our lab is still trying to figure out how the absence of these four bacteria (which are present in low numbers and which we nicknamed FLVR) leads to asthma, but one additional set of experiments in mice suggests that FLVR is directly involved in mediating this effect, as opposed to merely being correlated with it. We gave one group of germ-free mice a fecal transplant from a baby who had no FLVR in his feces (and who developed asthma later in life), whereas a second group of mice received the same fecal transplant plus added FLVR. When the mice became adults, the group that received FLVR had much less lung inflammation and fewer other markers of asthma.

We’re still far from considering this a preventive therapy for human asthma, but it opens a few doors that might give us a major advantage in combating the asthma epidemic. The first is that, assuming our findings hold true in other populations, we should be able to identify infants who are at very high risk of developing asthma. Even more exciting, we may be able to give those high-risk infants certain microbes or microbial products as a way to prevent asthma.