Humans vs bacteria: A timeline of science, suspicion, trial and error
Updated 23:59, 10-Jan-2020
By Gary Parkinson

Humans have fought against bacteria since long before we actually knew what we were doing. To the ancients, the spread of disease and infection must have been as mystifying as it was terrifying. However, long before modern science started to explain the secrets of microscopic life, various civilizations had found remedies through empirical evidence: they tried something, it worked, and they lived to recommend it to others.

For two millennia, Chinese herbalists had fought malaria with the widespread herb Artemisia annua – known variously as sweet wormwood, sweet annie, sweet sagewort, annual mugwort or annual wormwood. However, it was only in 1972 that Chinese scientist Tu Youyou isolated the active ingredient artemisinin, allowing for mass production of synthetic antimalarial drugs, which saved millions of lives; she was later awarded a Nobel Prize.

The ancient Chinese weren't alone in noticing remedies that can now be explained by microbiologists. Treating wounds with the topical application of moldy bread, an "accidental" antibiotic, was advised in ancient Egypt, Greece and Rome, long before Alexander Fleming's untidy laboratory led to the fortuitous discovery of penicillin's antibiotic properties. But let's not get ahead of ourselves...

 

The appliance of 17th century science: A physician wearing a 17th century plague-preventive outfit (Credit: Wellcome Collection)

Imagining the invisible

Thinkers had postulated the idea of invisibly tiny life for thousands of years. In the fifth or sixth century BC, the teachings of the ancient Indian religion Jainism suggested the existence of unseen microscopic creatures. In the first century BC, Roman scholar Marcus Terentius Varro warned against living near swamps "because there are bred certain minute creatures which cannot be seen by the eyes, which float in the air and enter the body through the mouth and nose and thereby cause serious diseases."

In around 1025, The Canon of Medicine by Persian polymath Ibn Sina considered the nature of epidemics, combining the millennium-old miasma theory (which blamed communicable disease on polluted air) with his own ideas on contagion via breath, water and dirt. 

The discussion went from philosophical to urgent in the late 1340s, with the Black Death, which is estimated to have killed around half of Europe's population and 100 million worldwide. Arab physicians Ibn Khatima and Ibn al-Khatib suggested that infectious diseases were caused by "minute bodies" transmitted through clothing and earrings. 

But to test such theories, technology had to catch up – via a linen merchant's obsession with quality control.

 

A recreation of Leeuwenhoek's groundbreaking microscope (Credit: Jeroen Rouwkema)

A Dutch draper and a germ theory

Antonie van Leeuwenhoek was a Dutch draper who wanted to check the quality of his merchandise at a level beyond the ability of the human eye. Dissatisfied with available magnifying lenses, he experimented with glass processing until, in 1676, he hit upon a lens-making technique that revealed the world in microscopic detail. 

Fascinated with what he found, he became the first to document microscopic observations of red blood cells, muscle fibers, spermatozoa – and bacteria, although he called them animalcules (from the Latin animalculum, "tiny animal"). That name didn't stick, but his did: he's still known with gratitude as the Father of Microbiology.

Scientific studies intensified and theories were proposed, but old beliefs die hard unless disproved by rigorous practice, solid evidence – and perhaps a little theater. In 1854, John Snow took direct action to save lives: amid a worldwide cholera pandemic, the English physician distrusted the still-dominant miasma theory and suspected the disease was waterborne.

One nasty outbreak centered on a specific public water pump on Broad Street in London's Soho. And although Snow's microscopic examination of a water sample couldn't prove its guilt, his exhaustive analysis of infection patterns – kick-starting the discipline now known as epidemiology – was enough to persuade the council to remove the pump handle. The outbreak abated and, in time, the argument was won: the miasma theory gave way to the rapidly evolving germ theory, which held that diseases were spread not by general "bad air" but by specific pathogenic microorganisms.

The question was: what were they, and how could they be stopped?

 

An oversight helped Louis Pasteur to a breakthrough (Credit: AP Photo/Francois Mori)

Europe vs death

The race was on and Europe's finest minds worked to find – and combat – the culprits for various diseases.

In France, Louis Pasteur had already demonstrated in the 1860s that milk spoiled because of the growth of microorganisms, and that this could be prevented by heating the liquid to kill the bacteria ("pasteurization"). In 1879, he was helped to a breakthrough by an absent-minded assistant. Leaving for a holiday, Pasteur instructed his sidekick Charles Chamberland to inject chickens with chicken cholera, but Chamberland forgot. By the time he did it a month later, the cholera culture had weakened and the chickens survived.

Chamberland was apologetic but Pasteur was delighted, especially when the chickens fought off a more virulent sample that had killed other chickens. Pasteur realized the birds had been accidentally immunized by exposure to the weaker strain. Although working mostly with animals, he had found a way to vaccinate them against cholera – and went on to do the same for anthrax and rabies. 

 

Edward Jenner performing the first vaccination (Credit: Wellcome Collection)

(A short diversion on terminology. The word vaccination comes from the Latin vacca, for cow, because Edward Jenner realized he could prevent smallpox infections by injecting the cowpox virus – related to smallpox but much less lethal. Inoculation comes from the Latin oculus, for eye – it is an older term originally used for the grafting of a bud, or "eye," of one plant into another. Vaccination and inoculation are now used more or less interchangeably and are subsets of immunization, which also includes the use of antitoxins and antibodies.)

While some scientists sought cures and prevention, others strove to isolate the culprits. In 1876, German microbiologist Robert Koch became the first to prove a specific bacterium caused a disease, namely anthrax from Bacillus anthracis. In 1882, he proved tuberculosis, previously thought an inherited disease, was caused by Mycobacterium tuberculosis and in 1884, he proved cholera was caused by Vibrio cholerae. But for all his groundbreaking success, he wasn't immune to a midlife crisis: in 1893, just before turning 50, he divorced his wife of 26 years and married a 21-year-old actress.

 

Waldemar Haffkine (seated, center) doles out the cholera vaccine (Credit: Wellcome Collection)

Risks and blunders

In 1897, Queen Victoria knighted Waldemar Haffkine – born in what is now Ukraine – for his work in creating vaccines. Building on Louis Pasteur's breakthroughs with animals – Haffkine had joined the newly created Pasteur Institute in 1889 as a librarian, the only job available there – he put his life on the line in 1892 by injecting himself with his unproven cholera vaccine.

The stunt impressed the newspapers but not the medical establishment, so Haffkine moved to cholera-stricken India, where possible remedies were more readily accepted: 40,000 recipients came forward between 1893 and 1896. Then bubonic plague struck Mumbai and the authorities pleaded for his help: setting up a makeshift laboratory in a college corridor, he worked on a vaccine for three punishing months – two assistants quit, another had a nervous breakdown – before again testing the resultant vaccine upon himself. It reduced the mortality rate by 50 percent and earned him that knighthood.

In the rush to identify and inoculate, there were mis-steps. In 1879, the German-Swiss Edwin Klebs and the Italian Corrado Tommasi-Crudeli confidently declared that malaria was caused by a bacterium; the medical establishment readily agreed – so readily that it ignored French army physician Charles Alphonse Laveran when, a year later, he correctly identified the cause as a parasite. It took the best part of two decades to overturn that misapprehension, especially as in 1883 Klebs correctly identified Corynebacterium diphtheriae as the cause of diphtheria.

 

In this watercolour by Richard Tennant Cooper, a sick child is strangled by a ghostly skeleton representing diphtheria (Credit: Wellcome Collection)

Into the 20th century

History will remember the 20th century as the era in which humankind truly began to understand and control the effects of bacteria… for the most part. 

Tuberculosis is a case in point. From the early 1900s, French physician Albert Calmette worked with veterinarian Camille Guerin, weakening a virulent strain of the bovine tuberculosis bacterium over years of subculturing until it was safe to use as a vaccine – the attenuated strain was later named Bacillus Calmette-Guerin, which is why the TB vaccine is known as the BCG. First used on humans in 1921, the vaccine was slow to catch on, especially after 72 infants died in 1930 because the mild inoculating strain had been contaminated with a virulent strain ill-advisedly stored in the same incubator.

However, science pressed on and the tragic failures were outweighed by successes, with vaccines developed for diphtheria in 1923 and tetanus in 1927. And then, in 1928, Alexander Fleming returned from holiday to his London laboratory and discovered that a blue-green mold had inhibited bacterial growth on a petri dish he had inadvertently left out of an incubator. 

Naming the active ingredient penicillin, he investigated its use as an antiseptic to deter bacterial growth and wondered if it might be useful in chemotherapy; it was others who purified penicillin and developed it as the world's first therapeutically used natural antibiotic. Science had now developed enough for humans to knowingly use natural microorganisms as agents in a microscopic bacterial battle. 

 

Penicillin was pushed into service during World War II (Credit: Science Museum, London)

Winning hearts and minds

The larger-scale battle was to persuade the public that vaccination was a good thing. The antipathy was not just rooted in a cartoonish fear of needles. To a population that had spent millennia terrified of virulent diseases wiping out towns and villages, the idea of injecting the very thing that could kill you – albeit a safely weakened version to kick-start your body's defenses against a serious infection – must have been bewilderingly counterintuitive. 

Indeed, it is entirely possible that vaccination wouldn't have reached the tipping point required for mass immunity had the decision been left to individuals. From the middle of the 20th century, governments began to promote immunization programs.

That push began with antibiotics, which saved lives in the aftermath of a horrific incident. In November 1942, a fire at Boston's Cocoanut Grove nightclub killed 492 people and hospitalized many more. At the time, burns victims frequently died from subsequent infection, so the authorities turned for the first time to penicillin – then still new and scarce. Its success prompted the US government to mass-produce and distribute penicillin to the armed forces; the "wonder drug" saved many lives and changed many minds.

After the world war came another kind of global campaign: the first coordinated attempts to eradicate disease via vaccination. From 1948, the United Nations International Children's Emergency Fund (Unicef) and World Health Organization (WHO) prioritized mass immunization against tuberculosis, backed by funding from various governments. Within three years, they had vaccinated 14 million people in 23 countries. The era of mass intervention, frequently funded by governments and supported by the media, had begun.

 

The public queue for mass vaccination in New York, 1947, during the last US smallpox outbreak to date (Credit: AP)

Peak and backlash

Success creates trust, and as deaths dwindled, the queues grew. Once-common infections receded into memory after vaccination reached the levels required for herd immunity – the point at which enough of the population is immune that a disease can no longer spread effectively.
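
As a rough aside (a back-of-the-envelope sketch using standard textbook figures, not numbers from this article): epidemiologists estimate that threshold from the basic reproduction number R0 – the average number of people one infected person goes on to infect in a fully susceptible population. In the simplest model, assuming uniform mixing and a fully effective vaccine, the critical immune fraction is

\[ p_c = 1 - \frac{1}{R_0} \]

For measles, R0 is commonly put at roughly 12 to 18, which gives a threshold of about 92 to 94 percent – one reason why only very high vaccination coverage keeps measles from spreading.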

However, that success can create its own problems. Herd immunity and receding memories can lead to complacency, while misinformation can cause distrust. A 1998 research paper linking the MMR vaccine to autism has been thoroughly discredited and retracted, but public suspicion has lingered – frequently fueled by the online theorizing of "anti-vaxxers" – and immunization rates have been dropping.

Tragic accidents have increased the danger. In July 2018, two children in Samoa died after the incorrect application of a measles vaccine. Under pressure from anti-vaxxers – and against the advice of the WHO – the government suspended vaccinations, with coverage plummeting from 74 percent the previous year to 34 percent. 

Predictably enough, people started to die of measles. When the outbreak started in August, anti-vaxxers attempted to blame poverty and malnutrition but the evidence, like the bodies, started to pile up. By November, the government had declared a state of emergency and made vaccination mandatory. By December, amid thousands of cases and dozens of deaths, they had to cancel Christmas and impose a curfew, with unvaccinated families forced to display a red flag outside their homes. 

As of early January, vaccination levels were back up to 94 percent, almost at the threshold for herd immunity. The outbreak will recede, but the lesson is a harsh one, and it's not the only one. In 2015, the Spanish city of Olot suffered an outbreak of diphtheria. Thankfully, it wasn't an epidemic – only 10 cases were reported, and just one death: of a six-year-old boy whose parents had chosen not to vaccinate him. 

 

December 2019: Samoa succumbs to a fatal outbreak of a preventable disease (Credit: TVNZ via AP)

Medieval medics

The idea of homesteads being marked red for danger might seem unnecessarily medieval – a return to a past we should have left long behind us. But it's worth remembering that humans have used microscopic cures for centuries, even if they may not have correctly identified the science behind them. 

In 2015, the American Society for Microbiology published research by scientists who had tested a thousand-year-old remedy for bacterial infection. Bald's Leechbook, a 10th century Anglo-Saxon medical text, recommended treating a "wen" – an infection of the eyelash follicle, nowadays known to be linked to the Staphylococcus aureus bacterium, which is still among the most common causes of hospital-acquired infections such as MRSA – with crushed garlic and either onion or leek, combined with wine and bovine bile and left to stand in a brass or bronze vessel for nine nights.

Recreating Bald's millennium-old eye salve, the scientists were delighted to find it worked, completely removing all trace of the Staphylococcus aureus bacterium and retaining antistaphylococcal activity during 30 days of storage. Fascinatingly, they also found that removing any of the ingredients (except the brass pot, which may have been necessary for relative cleanliness) reduced its efficacy – in some cases to zero. 

Our predecessors may not have known why or how things worked, but they knew what did: carefully cultivated and tested "ancientbiotics." In many ways, modern scientists are simply following in their footsteps in humankind's ongoing battle with bacteria.