As dusk fell on the Finnish city of Lahti on a still chilly day in May 2016, a crew of workers let themselves into the yard of an empty daycare center. Underneath the swings and jungle gyms, they installed squares of forest floor—scruffy shrubs, shin-high berry bushes, wispy meadow grasses, and velvety mounds of moss—harvested from the woods somewhere in a less developed part of the country. Around the edges, they put in soft green sod. In the morning, when the children arrived, they found their playground—formerly a drab patchwork of asphalt, gravel, and sand—transformed overnight into micro-oases of wilderness.
This scenario played out three more times that month at daycares in Lahti and, about 70 miles away, in the city of Tampere. It wasn’t the work of some nature-loving guerrilla artists, but the start of an ambitious science experiment to find out if the lack of microbes in paved-over urban environments could be turning people’s immune systems against them. “There is this ‘biodiversity hypothesis,’ that in the absence of diverse environmental microbiota, people are more likely to get immune-mediated diseases,” says Aki Sinkkonen, an evolutionary ecologist at the University of Helsinki. “But no one had really tested this with children.”
You’re probably more familiar with the “hygiene hypothesis.” First proposed by the British epidemiologist David Strachan in 1989, it posits that the rise in chronic disorders related to overreactive immune systems—such as asthma, diabetes, and allergies—is driven by children growing up in increasingly sterile bubbles. Immune systems are, at their most basic, classifiers. Their job is to recognize what’s self and what’s other. Microbes encountered early in life are the first tutors in this process, helping the developing immune system decipher what’s dangerous and what is not. The more families have used antibacterial soaps and gels, sealed themselves into high-rise apartments, and driven cars through concrete jungles, the less habitat there has been for bacteria, protozoa, fungi, and viruses to thrive—and the less likely it’s been for kids’ developing immune systems to run into them. Less exposure has meant fewer opportunities to train. A poorly trained immune system can fail when it comes time to distinguish between a body’s own cells and food allergens, or gut microbes, or pollen in the air.
Lab experiments on rodents in the early 2000s supported this idea: Wild rats had immune systems well-tuned to fight dangerous pathogens, but not minor irritants, whereas their lab-raised counterparts went into overdrive at the smallest stimulus. Human epidemiological studies lent circumstantial evidence too—allergy and asthma rates tend to be higher in more industrialized areas than in rural ones. To counteract these supposed negative effects of urban, modern lifestyles, dozens of companies have sprung up to hawk immune-boosting probiotics—pills and drinks and creams filled with cocktails of live bacterial cultures. In the Covid-19 era, thousands of posts tagged #immuneboost promoting these and other home remedies show up on Instagram each week. So far, there’s little evidence any of it has worked.
Which is why, in recent years, scientists like Sinkkonen have taken this idea one step further. People are increasingly living in microbial-diversity deserts, they observed, missing out on exposure to a variety of harmless bugs. “The immune system doesn’t recognize microbes by species, but by their type,” says Sinkkonen. “Probiotics usually contain only one or two types of bacteria, so they’re unlikely to activate the whole immune system. We wanted to see what would happen if we brought in a whole diverse microbial environment.” Hence, the forest floors in the playgrounds—the first randomized controlled trial to test the biodiversity hypothesis in kids. Biohacking, but make it cute.