I’m truly at a point where I love doing this blog more than I ever have. No longer a lone voice that’s hit & miss, I have a group of wonderful collaborators who somehow manage to put up with me—perhaps, because I truly strive to promote the fruits of their collaborative efforts as best as I can, and without the over-the-top marketing hype that’s become so common everywhere.
Here’s the entire bank of my knowledge on bloodletting, up until a couple of months ago: a primitive, superstitious practice that killed George Washington, the first president of these United States (which may not even be true). I always love being wrong because then, minimally, I’m less wrong than before.
I love slaying dragons or, in this context, questioning icons, bromides, and slogans of “truth.”
I once again give you The Duck Dodgers.
In our previous article, Iron, Food Enrichment and The Theory of Everything, we hypothesized the link between the rise in modern chronic disease and the rise in iron intakes during the 20th century, through both food fortification and increased meat consumption.
Careful readers are well aware that chronic diseases of civilization began to rise well before iron fortification entered the food supply. When we investigated this further, we found to our surprise that our not-so-distant ancestors were bloodletting far, far more than we ever imagined.
From the time of antiquity to the late 19th century—for at least 2,000 years—bloodletting was extremely common. In fact, it became so common during the 19th century that its abuse ultimately led to its downfall—along with it being discredited by prominent physicians.
Bloodletting was popular with the major religions (seasonal/ritual bloodlettings, relief from afflictions). Bloodletting was common in Ancient Egypt, China, Greece and Rome. And it expanded to prophylactic bloodletting during ancient times and into pre-modern times.
In medieval times, the major religious institutions practiced a great deal of periodic bloodletting, on scheduled and regulated days. People could also choose to fast or bloodlet, and the pious did a lot of both—bloodletting as much as five times a year if they were healthy. If you chose to bloodlet, you were given a three-day break from services and perhaps some meat as you recovered.
By the 19th century, you were bled for virtually every conceivable condition. If you had a headache, a cough, a fever, were pregnant, or had anxiety—or anything—you were bled until you fainted…and you did it standing up, to speed up the process. If you needed surgery, you were bled before the operation as a form of anesthesia. The practice did become abused (but that raises a question: why?). It was the most common medical practice and the de facto treatment for virtually every ailment. Bi-annual prophylactic bloodlettings were very common.
How common was the spring and fall practice of bloodletting during the first half of this [the 19th] century, and how disastrous were its effects, are illustrated by a statement of Dr. Wilks. He said that he had often asked the late Mr. Monson Hills, who for many years was cupper and surgery attendant, and for all practical purposes house-surgeon, at Guy’s Hospital, as to his experience of the time when persons came to the hospital, especially at the “spring and fall,” to be bled by the dozen or twenty in the morning. When he supposed that they would walk in and as quietly walk out after the operation, he would answer, “No such thing; they commonly fainted, and they might be seen lying in rows on the surgery floor like so many slaughtered sheep.” Dr. Markham quoted the late Dr. Stokes, of Dublin, as saying that “when I was a student of the Meath Hospital hardly a morning passed when some twenty or thirty unfortunate creatures were not phlebotomised. The floor was running with blood to such an extent that it was difficult to cross the prescribing hall for fear of slipping. Patients were seen wallowing in their own blood.”
Not only was bloodletting constantly practiced by every physician, but your local “barber surgeon” could give you a shave, a haircut and a bloodletting whenever you felt like it. They were in very high demand.
In theory, barber-surgeons intervened only under a physician’s order, as part of a prescribed cure. In reality, matters were quite different. Seasonal bloodlettings were commonly self-prescribed as part of everyday health management. In addition to performing phlebotomies, barber-surgeons were authorized to set broken bones, treat wounds, and medicate abscesses and skin diseases. They were, in other words, authorized to treat the outer body. (The physiological domain of physicians, by contrast, was the body beneath the skin.) Because their cures were more accessible and less expensive than those of physicians, surgeons routinely treated a much broader range of illnesses than they were officially empowered to.
It’s commonly believed that it was the doctors and barbers who preyed on unsuspecting patients and convinced them that they needed unnecessary bloodlettings. In reality bloodlettings were said to have had a noticeable effect that the patients demanded—much like patients demanding antibiotics today.
An Essay on the Remittent and Intermittent Diseases, by John Macculloch (1830):
“It is said, and indeed it is matter of daily experience, that in all such cases, immediate relief is procured by blood-letting in either of these forms; and as the same relief is similarly produced in cases of decided inflammation, as in others in which it is an acknowledged remedy, the analogy seems sufficiently perfect to form a justifiable argument. Unfortunately, still more unfortunately, patients themselves become so convinced; so conscious in fact of this relief, that they are always ready to demand it, and, still more, to resort to it without advice, or against that, on their own notions and opinions.”
Although Hippocratic therapies, including bloodletting, survived into the 1920s, by 1875 bloodletting had fallen by the wayside, and many doctors had lamented its passing:
In 1879 an American doctor, T. H. Buckler, acknowledged that ‘the lancet, by the common consent of the profession at large, had been sheathed never to be drawn again’. Yet he was writing ‘A Plea for the Lancet’. In 1875 an English doctor, W. Mitchell Clarke, wrote ‘we are most decidedly living in one of the periods when the lancet is carried idly in its silver case; no one bleeds; and yet from the way in which my friends retain their lancets, and keep them from rusting, I cannot help thinking they look forward to a time when they will employ them again’. Bloodletting had largely been abandoned because statistical studies had shown that it did not work, and recent developments in physiology had been able to show that it resulted in reduced haemoglobin concentrations, which hardly seemed likely to be beneficial. But doctors clearly regretted sheathing their lancets. The lancet was a symbol of their profession and of their status as doctors—the leading English medical journal is still called The Lancet.
Worst of all though, the abandonment of the lancet was not compensated for by the introduction of any new therapy that could replace it in general practice. A gap was left, and something was needed to fill the gap. By 1892 the leading American physician of his day, William Osler, was writing ‘During the first five decades of this century we have certainly bled too little.’ And he proceeded to advocate bloodletting for pneumonia: done early it could ‘save life’. Similarly in 1903 Robert Reyburn, an American, was asking ‘Have we not lost something of value to our science in our entire abandonment of the practice of venesection?’ The Lancet of 1911 contained an article entitled ‘Cases illustrating the use of venesection’—the cases included high blood pressure and cerebral haemorrhage. Bloodletting was also recommended for various types of poisoning, from carbon monoxide to mustard gas. In the trenches in 1916, venesection was the approved method of treating the victims of gas attacks. Heinrich Stern, publishing The Theory and Practice of Bloodletting in New York in 1915, declared that ‘like a phoenix, the fabulous bird, bloodletting has outlasted the centuries and has risen, rejuvenated, and with new vigor, from the ashes of fire which threatened its destruction’—he thought bloodletting a useful treatment for drunkenness and homosexuality. Others recommended it for typhoid, influenza, jaundice, arthritis, eczema, and epilepsy.
While many factors are obviously involved, doctors were said to have observed a considerable rise in arteriosclerosis after 1880. Diets had changed too. By that point in time, the nation was also experiencing widespread dyspepsia—a sort of national stomach ache—that some had blamed on a lack of fiber due to people replacing their traditional whole wheat with white flour.
By the late 19th century, some physicians were publicly lamenting the complete abandonment of bloodletting for therapeutic purposes.
“But a few years ago it was customary to bleed too frequently, and almost every morbid condition was thought to demand bloodletting. Practically, we never resort to the measure now, perhaps because we do not consider to their full extent the advantages to be derived from it. From one excess we have fallen into the other. The disciples of the lancet bled according to a system; it was a formula. Their adversaries abstained by convention, not always by conviction; that, too, was a formula. There was error on either side. Therapeutical truth does not lie in a mere formula; it is to be found in facts proven clinically and experimentally, not in mere systems.”
What’s interesting is not whether bloodletting actually was therapeutic. What’s interesting is that virtually all of humanity went from a civilization where bloodletting was common and iron-rich meats were a luxury to a society that rarely bled, where food was enriched with iron and iron-rich meats were eaten with regularity. And such a dramatic change happened over the course of about 100 years. It’s a complete 180° that has rarely been considered in the context of modern chronic diseases.
The Great Depression and WWII may have played an unsuspected role in swinging this pendulum. As the nation continued to consume nutrition-less white flour, while meat was rationed or scarce, key micronutrients became a challenge for many to obtain in the years leading up to WWII. Metabolic issues were seen, which may now be linked to imbalances in the micronutrients—such as manganese and copper—needed to metabolize carbohydrates and manage iron efflux. Instead of promoting whole wheat flour, fortification of white flour was used to crudely solve rampant anemia and pellagra—convincing much of the nation that people can never have too much iron.
Before long, Popeye was promoting iron-rich foods, and iron-fortified Geritol™ became the success of Madison Avenue. And even though it was argued to be unnecessary, the FDA significantly increased fortification levels in 1983, which coincided with skyrocketing rates of metabolic health issues in countries that fortify their foods. To this day, most Americans erroneously believe that you can never have too much iron.
As more and more research continues to implicate excess iron in a wide range of chronic diseases, hopefully people will begin to notice that skyrocketing iron intakes and the cessation of traditional bloodletting may in fact be related to how we got here.