Throughout much of human history, medicine was a practice as much rooted in tradition and superstition as in science. Few treatments exemplify this better than bloodletting — a once-common medical procedure that involved withdrawing blood from a patient to cure or prevent illness. Practiced for over two millennia, bloodletting is now largely discredited, yet its history provides a fascinating, and often grim, lens into the evolution of medical thought. By examining the origins, rationale, practices, and consequences of bloodletting, we gain deeper insight into how ancient societies understood the human body, disease, and healing.
The Origins of Bloodletting: Humoral Theory and Ancient Beliefs
The practice of bloodletting can be traced back to ancient civilizations, including the Egyptians, Greeks, and Chinese. In Western medicine, its roots are deeply intertwined with the humoral theory associated with Hippocrates (c. 460–370 BCE) and his school, which became the dominant framework for understanding health for centuries.
According to this theory, the body was composed of four humors: blood, phlegm, black bile, and yellow bile. Health was believed to result from a balance of these fluids, while disease stemmed from an imbalance. Bloodletting was used to reduce what was thought to be an excess of blood, thereby restoring equilibrium.
Hippocrates himself was cautious about bloodletting, but Galen of Pergamon (129–c. 200 CE), writing centuries later, greatly expanded its application. Galen, whose works shaped European and Islamic medicine for over a thousand years, advocated specific bloodletting techniques based on astrology, the seasons, and the location of illness in the body.
Techniques and Tools: From Leeches to Lancets
While the concept of removing blood remained largely the same, the tools and techniques used in bloodletting varied widely across cultures and time periods. The most direct method was venesection — the cutting of a vein to drain blood. Physicians often targeted the median cubital vein in the arm, although different veins were chosen depending on the illness being treated.
Scarification, another method, involved making small superficial cuts in the skin and drawing blood using cupping glasses that created suction. This technique was less invasive but often considered less effective.
Leeches, belonging to the species Hirudo medicinalis, were another popular tool, especially in medieval and early modern Europe. These creatures would be applied to the skin to suck blood slowly and were thought to be particularly suitable for delicate areas like the face or near the eyes.
Despite the differences in methods, all shared a common goal: the deliberate removal of blood to treat illness or maintain health.
Who Got Bled and Why? The Diseases and the Patients
Bloodletting was used to treat an astonishing range of ailments — everything from fevers, headaches, and pneumonia to mental illness, epilepsy, and even acne. It wasn’t limited to the sick, either. In some regions, especially in Islamic medicine, it was also used prophylactically — a form of routine maintenance to prevent disease.
In the Middle Ages, bloodletting became a widespread public ritual. Barbers, who often doubled as surgeons, performed bloodletting alongside tooth extractions and haircuts. The iconic red-and-white barber pole symbolizes blood (red) and bandages (white) — a relic of this grisly past.
The logic behind which illnesses called for bloodletting, how much blood to remove, and from which part of the body to take it often varied wildly. Physicians made decisions based on complex charts that combined astrology, the humors, and localized symptoms. The results were inconsistent at best — and deadly at worst.
The Consequences: When Medicine Does More Harm Than Good
Although bloodletting was intended to cure, it frequently weakened patients, worsened their conditions, or hastened death. The sheer volume of blood sometimes removed — up to several pints in one session — could induce shock, anemia, or organ failure. In many cases, it deprived patients of the strength they needed to fight off infection.
Some of history’s most notable figures may have suffered from excessive bloodletting. George Washington, the first U.S. president, died in 1799 after being bled multiple times in a single day while suffering from a throat infection. His physicians removed about 40% of his blood — a likely contributor to his death.
Despite mounting evidence of its dangers, bloodletting persisted well into the 19th century. This resilience is a stark reminder of how medical dogma can persist in the face of contradictory evidence when institutional authority and tradition outweigh empirical observation.
The Fall of Bloodletting and the Rise of Evidence-Based Medicine
The 19th century brought profound changes to medicine, as scientific methods began to challenge long-standing traditions. Physicians like Pierre Charles Alexandre Louis in France began using statistical analysis to evaluate the effectiveness of treatments, and bloodletting did not hold up under scrutiny. Louis’s research showed that patients with pneumonia who were not bled often fared better than those who were — a turning point in the shift toward evidence-based medicine.
Simultaneously, discoveries in microbiology, such as the germ theory of disease developed by Louis Pasteur and Robert Koch, revolutionized the understanding of illness. Diseases were no longer seen as imbalances of fluids but as infections caused by specific microorganisms. This dramatically changed treatment approaches and rendered humoral theory — and practices like bloodletting — obsolete.
By the early 20th century, bloodletting had all but disappeared from mainstream medicine, surviving only in rare cases and alternative therapies. Today, therapeutic phlebotomy — the modern descendant of bloodletting — is still used, but only for specific conditions like hemochromatosis or polycythemia vera, where blood removal has proven therapeutic benefit.
Bloodletting is one of the most enduring — and cautionary — chapters in the history of medicine. It illustrates how deeply cultural beliefs and medical practices can become entwined, often at the expense of patient welfare. The story of bloodletting reminds us of the importance of skepticism, scientific inquiry, and humility in the face of medical certainty. Though we may view the practice with horror today, it also serves as a testament to humanity’s ongoing quest to understand and heal the body — a journey marked by both darkness and discovery.