1. Bloodletting—Draining Patients to ‘Balance’ Their Health
For centuries, bloodletting was considered the cornerstone of medical treatment, from ancient civilizations to the height of Georgian and Victorian medicine. Physicians believed that the body’s health depended on balancing four bodily humors—blood, phlegm, black bile, and yellow bile. If a patient suffered from fever, headaches, or even mental illness, the solution was often the same: remove “excess” blood to restore harmony. The practice, deeply rooted in Hippocratic and Galenic teachings, persisted despite its often disastrous consequences.

Doctors employed various methods to draw blood, each more unsettling than the last. Venesection, the most common technique, involved cutting open a vein, usually in the arm, to drain several ounces of blood into a bowl. Cupping, another popular method, used heated glass cups to create suction on the skin, drawing blood to the surface before small incisions released it. Scarification involved making multiple shallow cuts, while leeches were favored for more delicate procedures, such as treating eye conditions or reducing inflammation. In extreme cases, doctors drained several pints over repeated sessions, removing a dangerous fraction of a patient’s total blood volume and leaving them severely weakened or dead; George Washington, bled of an estimated five pints during his final illness in 1799, is perhaps the practice’s most famous casualty.
Despite its widespread use, bloodletting lacked scientific merit. By the 19th century, medical pioneers such as Pierre Charles Alexandre Louis began challenging its effectiveness; his statistical studies of the 1830s showed that bleeding did little for pneumonia patients and that excessive blood loss did more harm than good. The rise of germ theory and advances in pathology signaled the beginning of the end for this archaic practice. Yet, astonishingly, bloodletting persisted in some medical circles well into the early 20th century, a testament to how deeply entrenched medical traditions can be, even in the face of progress.
2. Arsenic and Mercury—Deadly ‘Cures’ for Common Ailments
In the Georgian and Victorian eras, arsenic and mercury were not just poisons; they were medicine. Physicians prescribed these toxic substances for a wide range of ailments, from syphilis to skin conditions, with little understanding of their devastating effects. Mercury, in particular, was the go-to treatment for syphilis, a rampant disease in 18th- and 19th-century Britain. Patients undergoing “mercurial salivation” endured weeks of excruciating treatment, ingesting or inhaling mercury until they produced copious saliva, which was thought to purge the illness. Instead, it left them with rotting teeth, neurological damage, and, often, an early death.
Arsenic, meanwhile, was marketed as a cure-all. Physicians prescribed it for fevers and respiratory issues, and even as a beauty treatment to achieve the pale complexion fashionable among the upper classes. Some over-the-counter medicines contained arsenic in small doses, and long-term use led to chronic poisoning, causing organ failure, hair loss, and severe digestive problems. Despite these dangers, arsenic remained in medical use well into the 20th century, with Fowler’s solution, a potassium arsenite tonic, prescribed for decades; it fell out of favor only gradually as pharmacology advanced and safer alternatives emerged. These hazardous “cures” serve as a grim reminder that medicine, before the advent of modern pharmacology, often did more harm than good.
3. Trepanning—Drilling Holes in the Skull to Release ‘Evil Spirits’
Trepanning, or trephination, is one of the oldest known surgical procedures, dating back at least 7,000 years. By the Georgian and Victorian eras, this ancient practice was still in use, though often based on misguided beliefs rather than medical necessity. Physicians of the time believed that drilling a hole into the skull could relieve pressure, cure epilepsy, or even expel the “evil spirits” thought to cause mental illness. Despite the absence of anesthesia and the crudeness of the available instruments, some patients survived, as evidenced by skulls showing signs of healing.
The procedure involved using a hand-cranked trepan—a circular saw-like instrument—to bore into the cranium. Without antiseptics or proper sterilization, the risk of infection was extraordinarily high, often leading to fatal complications. Yet, some doctors swore by its effectiveness, particularly in cases of head trauma. As medical knowledge advanced in the late 19th century, trepanning was gradually abandoned in favor of more scientific approaches to neurology and psychiatry.
While trepanation is no longer practiced in mainstream medicine, modern neurosurgery employs similar techniques—albeit with precise instruments and sterile conditions—to relieve cranial pressure after severe brain injuries. The procedure remains a haunting testament to the lengths early medicine went to in its attempts to heal the human body.
4. The Use of Leeches for Everything from Fevers to Mental Illness
For centuries, leeches were a staple of medical practice, particularly during the Georgian and Victorian eras. Physicians relied on these bloodsucking creatures to treat a staggering range of ailments, from fevers and inflammation to mental illness. The practice, known as hirudotherapy, was rooted in the ancient belief that diseases stemmed from an imbalance of bodily “humors.” By removing blood, doctors hoped to restore equilibrium—a theory that persisted well into the 19th century.

Leeches were particularly valued for their ability to draw blood in a controlled manner. Unlike traditional bloodletting with lancets, leeches could extract blood from precise locations, making them ideal for treating localized ailments. Their saliva contained hirudin, an anticoagulant that kept wounds bleeding for hours after detachment, prolonging the perceived therapeutic effect. By the early 19th century, demand for medicinal leeches had skyrocketed—France alone imported an estimated 40 million leeches annually to meet medical needs.
Patients suffering from mental illness often had leeches applied behind their ears or on their temples, based on the belief that excessive blood in the brain contributed to madness. Similarly, those with fevers or joint pain had leeches placed on the afflicted areas to “draw out” the illness. Apothecaries sold leeches for home use, and professional leech collectors—often women—waded into ponds to harvest them.
Despite its widespread use, leech therapy began to decline in the late 19th century as medical science advanced. Pasteur’s germ theory and the rise of antiseptic medicine rendered routine bloodletting obsolete, though leeches have seen a modest resurgence in modern reconstructive and reattachment surgery, where they relieve venous congestion and restore blood flow in delicate tissue.
5. Surgery Without Anesthesia—The Brutality of Early Operations
Before the advent of anesthesia in the late 1840s, surgery was a gruesome ordeal that tested both the skill of the surgeon and the endurance of the patient. Operations were performed with the patient fully conscious, restrained by assistants, and subjected to excruciating pain. Speed was paramount; surgeons like Robert Liston became famous for rapid amputations, with Liston said to have removed a leg in around two and a half minutes, and by some accounts in under thirty seconds. This was not merely a feat of dexterity: the less time a procedure took, the lower the chance of the patient dying from shock or blood loss.
However, speed did not compensate for the risks of infection. In an era before germ theory, surgical tools were often reused without sterilization, and operating theaters were breeding grounds for bacteria. It was not uncommon for patients to survive the operation only to succumb to sepsis days later. The psychological toll extended beyond the patient—observers, including medical students and even figures like Charles Darwin, were often horrified by the suffering they witnessed.
The introduction of ether anesthesia in 1846, first publicly demonstrated by William Morton in Boston and performed in Europe by Liston only weeks later, revolutionized surgery, allowing for more precise and less traumatic procedures. This, combined with Joseph Lister’s antiseptic techniques in the 1860s, drastically reduced mortality rates. Yet, despite these advancements, surgery remained a perilous experience well into the Victorian era, with urban hospitals struggling to combat post-operative infections and high patient fatality rates.