Imagine a world in which everyone stopped showering. Disgusting, right? But wait. What if that world was actually better for our health?
Of course, we should still wash our hands to stop the spread of infection, especially during a pandemic. But our fixation on almost clinical cleanliness may well be compromising our immune systems.
Why did we start washing ourselves so thoroughly, anyway? Like so many things, the rise of soap coincided with the birth of capitalism.
Nearly two centuries ago, clever marketing convinced us that we needed to “fight germs.” Since then, advertisers have only grown more sophisticated at convincing us to buy skin care products. They promise us cleanliness, health, and beauty.
But medical researchers are now beginning to understand the importance of our skin’s natural microbiome – the community of bacteria that lives on the outside of our bodies.
Science already knows that a diverse microbiome is crucial for our gut. But it’s also important for the skin. And there’s no way to build up this diversity if you constantly scrub it away with soap.
Modern ideas of “cleanliness” have led us to overwash ourselves
Developments in medicine and technology mean we spend more time indoors and clean ourselves more often. We are far less likely to die of an infectious disease. But rates of chronic disease have skyrocketed.
Some of these chronic conditions may well be linked to washing ourselves too often. One example is atopic dermatitis, or eczema. It makes the skin red and itchy.
Sandy Skotnicki, a dermatologist and professor at the University of Toronto, advises patients who experience eczema flares to forgo hot showers and throw away soaps and gels. These products are, after all, mostly made of detergents that can be harmful to the skin. She recommends that patients simply wash their armpits, groin, and feet.
This “soap minimalism” helps the skin do what it does best: maintain equilibrium, something it has evolved to do over millions of years. Scientists are now studying how the microbes that inhabit the skin – its microbiome – interact with our environments.
New research has revealed more about the role of the apocrine sweat glands. They sit in the groin and armpits and produce the oily secretions that cause body odor. But they also do something incredibly useful: they sustain the trillions of microbes that live in and on us.
It might sound gross, but these microbes may actually act as an invisible top layer of our skin. They foster its dynamic relationship with the outside world.
Throughout human history, people have had different reasons for cleaning themselves
We weren’t always obsessed with sterile cleanliness. Before germ theory, people bathed for different reasons, including entertainment and general wellness.
Let’s look at the ancient Roman baths, for example. They were a place for socializing and relaxing. Washing came last. Those baths had no circulation system, so water almost certainly carried a film of human sweat and scum. Hygiene couldn’t have been much of a concern.
And in ancient Jerusalem, people washed themselves to avoid spiritual contamination. Jewish law required hand and foot washing before entering the Temple, as well as handwashing before and after meals.
Rabbis taught that physical cleanliness leads to spiritual purity. Islam likewise requires ritual washing before each of the five daily prayers. As a result, the Arab world built complex water systems long before Europe did.
Christians, on the other hand, viewed excessive bathing as a sinful luxury. There were religious reasons for this: after all, Jesus taught that inner purity mattered more than religious ceremony. So the European attitude toward hygiene was, to put it mildly, cavalier. In the 14th century, this neglect became one of the reasons the Black Death ravaged the continent, killing one in three Europeans.
It wasn’t until 1854 that the link between living conditions and disease was discovered. John Snow, a physician from London, traced an outbreak of cholera to just one location: a well that sat next to a cesspit of human feces.
But the government didn’t take Snow seriously. After all, if he was right, they would have to rebuild London’s entire infrastructure. It would take another 30 years for Snow’s work to be corroborated. This happened when German physician Robert Koch saw cholera-causing microbes under a microscope. Combined with Snow’s work and other subsequent observations, this confirmed the link between cholera and contaminated water.
Finally, germ theory took hold: the idea that infectious diseases are spread by microscopic living organisms. To stop these organisms, governments began to invest in preventive infrastructure such as water treatment facilities and sewage systems. Social norms changed, too – being ungroomed came to be seen as a mark of danger.
Rich people began referring to the working class as “the great unwashed,” and cleanliness became an indicator of status. This created a huge market for soap.
Advertising created a new perception of cleanliness in the soap industry
At the end of the nineteenth century, America and Britain were living through a soap boom. One company that stood out was Lever Brothers. Their soap-making wasn’t especially innovative, but their advertising was. They claimed that their product, Sunlight Soap, was a lifesaver. This marketing technique soon made Lever Brothers the world’s largest soap distributor.
The elder brother, William Lever, was the brains behind the operation. As one historian put it, “Lever didn’t advertise so much as paint the world with his brand.” Ads were everywhere. William even founded a newspaper and published a healthcare book, both carrying the brand’s name: the paper was called the Sunlight Almanac, and the book, the Sunlight Year Book.
Lever was one of the first to realize that the emergence of a new middle class meant he could now sell to every household. Mass manufacturing drove costs down, and soap was becoming universally affordable.
Early soap sellers were ingenious in their use of media to spread their message. Sponsored content, which we now see everywhere, was their invention: Procter & Gamble published a parenting handbook and plastered it with advice to use Ivory Soap.
From there, soap companies went on to dominate radio, and later TV. They didn’t just run ads – they revolutionized broadcasting by creating the daytime serial. These programs were geared toward housewives and came to be known as soap operas.
Titans of the soap industry were also among the first users of marketing jargon. Colgate & Company advertised their Cashmere luxury soap as “hard milled” and therefore “safer” – even though there’s no such thing as hard or soft milling.
And Palmolive invoked recommendations of unnamed doctors. Here’s one, from a 1943 ad: “You can have a lovelier complexion in 14 days with Palmolive Soap. Doctors prove!”
Soap was now in every household, but companies wanted more. They needed to expand their product lines. So marketers came up with a new message.
One product – soap – was no longer enough. Instead, you needed to buy more products to undo its effects. Using soap gives you dry skin? Buy moisturizer. This set the stage for the emergence of skincare empires.
Skin care is slowly infiltrating the medical field, but isn’t regulated as strictly as drugs
Fast forward to today, and “indie” brands are now shaking up the skin care industry.
At the Indie Beauty Expo in New York, the current trend is using words like “clean,” “cruelty-free,” and “pure.” Indie brands also often highlight a specific “new” ingredient – one that hasn’t been emphasized before.
But the difference between indie and mainstream brands has less to do with the size of the company, and more with their marketing and aesthetics.
Small indie brands tend to take more risks with their claims. In the U.S., it’s perfectly legal to say whatever you want about the benefits of a cosmetic product, so long as you’re not promising a disease cure. This means that there’s a low barrier to entry. A product can rise to the top by word-of-mouth or through social media.
Both independent and mainstream companies are treading the fine line between cosmetics and drugs. They use fancy scientific terms, and this means that consumers find it tricky to tell medical fact from cosmetic fiction.
Take collagen, the protein that keeps skin looking taut and therefore “young.” Applying collagen directly to your face would be pointless: the molecule is too big to penetrate the skin. But at the Indie Beauty Expo, collagen products were everywhere. Sellers would tell you that it firms up the skin, makes it smoother and plumper.
Meanwhile, retinoids, derived from vitamin A, are approved as drugs but are also sold over the counter as cosmetics. There is actually some scientific evidence that they stimulate the production of collagen. But it’s up to consumers to do their own research.
Compare this free-for-all with how drugs are regulated. Before introducing a new medicine, pharmaceutical companies spend years trialing it for safety and efficacy. And yet we tend to be cautious about drugs but trusting of skin care products. Why is this?
Many people believe skin care brands because they feel wronged by the medical establishment. It hasn’t met their needs, they think, so they start looking for alternatives.
Skin care enthusiasts accept that there might be some scammers around. But they’re also convinced that sharing personal experiences online will help the community tell good products from duds.
Skin care offers a sense of control.
Bacteria on our skin isn’t necessarily a bad thing
Since the Industrial Revolution, we have been withdrawing from nature. And yet, in August 2016, a study showed that early exposure to the microbial world outside could be great for our immune systems.
Researchers focused on two groups of people: the Amish and the Hutterites. These communities are genetically similar, and they live similar lives, largely unchanged since the eighteenth century. But there’s one key difference.
In the Hutterite community, children never accompany their fathers to the communal farm. The Amish, meanwhile, work with babies strapped to their backs. So their children grow up interacting with soil, animals, and microbes.
Scientists, led by allergist and immunologist Mark Holbreich, wanted to see if this made any difference. And it certainly did. In the Amish community, far fewer children suffered from asthma and allergies. Four to six times fewer, in fact.
So microbes can be good for us. Why? To understand this, let’s first consider how the immune system works. Immune cells pass back and forth between the circulatory system – our blood vessels – and the lymphatic system, which carries lymph around our bodies. Lymph is a fluid full of immune cells called lymphocytes.
Their main job is to look out for foreign material that gets into or onto our bodies. These foreign particles are known as antigens, and as soon as lymphocytes sound the alarm, your body begins a counterattack. This is what we call inflammation, and the process keeps us safe from disease. But sometimes the immune system misfires: it attacks particles that are entirely harmless, or even turns against our own cells. This is how you get an autoimmune disease.
So how can we avoid such dangerous mistakes? One way is to train the immune system to respond properly. We can do this by exposing it to bacteria, and this approach works best when we’re very young.
This exposure can start in the very first seconds of our lives. With vaginal birth, as the baby travels from the womb to the outside world, she picks up some of her mother’s microbes. She is then breastfed – and, with each meal, receives her mother’s immune cells.
Children continue to form their microbiome through everyday contact. They receive these microbes from family members, dirt, animals, and even other children’s toys.
Antibacterial additives in soap are probably harming our health even more than excessive washing
Remember the Lever Brothers from an earlier section? Their influence on modern hygiene didn’t end with Sunlight Soap. In fact, they introduced a marketing idea which has stayed with us for more than 100 years.
In 1894 they released a new soap called Lifebuoy. They claimed it was as good as medicine: a cure for fevers and colds. Why? Because its main ingredient was an antiseptic called carbolic acid, which destroyed harmful microbes. In a society that had only just discovered germ theory, this sales pitch really worked.
Later, the makers of Lifebuoy Health Soap coined another now-familiar term, “body odor,” or B.O. This, too, was an advertising ploy. Here’s how Lever Brothers explained it: body odor is caused by bacteria, so you need soap to kill those bacteria. There was no science behind the claim, but the fear-based marketing resonated, and sales quadrupled. It was only a matter of time before soap makers began adding antibacterial chemicals to their products.
In 1948, a new deodorant soap hit the market. It was called Dial, and it contained an antibacterial compound called hexachlorophene. Similar products followed, and eventually hexachlorophene became an accepted ingredient in cosmetics.
But 30 years later, studies revealed that this chemical could penetrate the skin and affect the nervous system. That could have been the end of antibacterials in soap, but makers of cosmetics found a workaround: they substituted hexachlorophene with another microbe-killing compound, triclosan.
Problem solved? Well, not quite. Recent studies show that constant exposure to triclosan can promote the development of tumors, alter the way hormones work, and possibly even cause allergies.
This sounds worrying, especially given how widespread our exposure is: a 2009 study found triclosan in the urine of three in four Americans.
So are all these risks worth it? Do antibacterial cosmetics really fight disease better than just plain old soap and water? In 2013, the US government decided to check, and the Food and Drug Administration asked soap makers for evidence. They offered almost none. So regulators banned triclosan, hexachlorophene, and 17 other antimicrobial ingredients from soaps.
In an ironic turn of events, the latest skin care trends suggest actively adding bacteria to our bodies. Many indie skin care brands are now selling products with probiotics and prebiotics – compounds that foster the growth of microbial populations. It might not be long before big companies hop on the bandwagon.
Our skin microbiome may contain important information about our health
In 2009, Claire Guest was researching the idea that dogs could smell cancer. One of the dogs involved in the study, a golden retriever named Daisy, became Claire’s pet. The scientist remembers walking home from the park when Daisy started acting strangely. “She was a bit wary of me,” Claire recalled.
Suddenly, Claire remembered: a few days earlier, she’d felt a little lump in her breast. At the time, she hadn’t really considered the possibility of breast cancer.
Since then, Claire, whose cancer is now in remission, has been working full-time with dogs who can detect signs of cancer and other illnesses.
We all know that dogs have a great sense of smell. But just how acute is it? Well, as it turns out, they can pick up small changes in our volatiles, the complex cocktail of chemicals we all emit. This makes dogs great at detecting illness.
So far, dogs have been trained to react to high blood sugar levels; they’re also showing promise in pointing out Parkinson’s disease, which is linked with changes to our skin.
Claire Guest wonders what’s behind this smell of disease. What are the dogs sniffing out? Could it be our microbiome?
There is growing evidence to support this theory, and if research proves Guest right, we could learn to read changes in our skin microbes as early warning signs. Perhaps that would allow doctors to catch, and treat, disease earlier.
In another study, British researchers wanted to use dogs to detect malaria.
They tested hundreds of Gambian schoolchildren for the disease; and then each kid was given a brand new pair of socks. A few hours later, scientists collected the socks and sent them to London, where medical detection dogs got to work. In seven out of ten cases, dogs correctly identified socks worn by infected children.
What’s clear is that the compounds our skin produces aren’t random. Perhaps it’s time for us to direct our energies toward learning more about our skin, instead of trying to scour it clean.
Maintaining good health requires a balance between hygiene and exposure to microbes
In the 1800s, Florence Nightingale led a team of nurses at a military hospital in Crimea, where the British were fighting back Russian expansion. When Nightingale arrived at the warzone, hospital care was shambolic. Ten times more soldiers were dying from infectious diseases than were killed in battle.
Wards were dank and swarming with lice and fleas. Nightingale believed the men needed air to get better, so she proposed opening up new doors and windows to improve circulation. Her plan worked: one report suggests that the death rate fell by nearly 40%.
What Nightingale did changed hospitals around the world. Her achievements encouraged medics to let more air into their clinics.
And then germ theory took hold. Despite the evidence that Nightingale’s approach worked, modern hospitals began dividing patients into small rooms with little ventilation. The windows were shut once again.
The history of human cleanliness has been a tale of extremes. Perhaps it’s time to consider what we’ve learned from both approaches.
Two things stand out. First, let’s think about how we can maintain microbial diversity. One way to achieve it is through communal living. A 2017 study from the University of Waterloo found that people who cohabitate develop similar microbiomes. They also have greater microbial diversity. The same is true of people who have pets, drink less alcohol, and exercise outside.
And here’s something else: we also need to address the global imbalance in hygiene practices. Richer countries have embraced sterility – sometimes, to their own detriment. But elsewhere in the world, people don’t even have access to clean drinking water. More than 30% of us have no way to wash our hands at home.
Millions of people around the world still die from infections every year. In Haiti, after the 2010 earthquake, 8,000 people died of cholera. Clean water and good hygiene could have prevented those deaths.
Health is a very personal issue, and it’s only right that we question the systems that influence our wellness and hygiene habits. But healthcare is also a public concern, and so, instead of hoarding resources, we should be working together to promote global health initiatives.
Basic hygiene practices, like washing our hands with soap and clean water, are important to prevent the spread of infectious disease. But we shouldn’t go overboard with cleaning. Less is more.
Cultivating a diverse microbiome through exposure to the outside world will benefit our health far more than excessive showering.