The great thing about icons is that sooner or later, they always fall. It’s the job of general iconoclasts, like me, to laugh and ridicule (sometimes even at myself)—to hasten the process—all the while folks are busy holding on for dear life: banking on, lifestyling, worshiping, and making livings off their iconic idols. The strong survive, though, so it’s really just a process of evolutionary natural selection. Nothing to be afraid of. Adapt, or face scorn and dismissal.
Recent jaunts into this area have been the posts on iron “enrichment” (here and here), Paleo getting it wrong on grains, and a brief history of the astounding popularity and abuse of bloodletting (that’s highly related to the iron posts). A forthcoming post will expand on Paleo getting it completely wrong on grains, by means of serious conflation—that’s tantamount to holding a position against eating eggs because so many people eat just the whites.
Recently a new study came out: The Importance of Dietary Carbohydrate in Human Evolution.
We propose that plant foods containing high quantities of starch were essential for the evolution of the human phenotype during the Pleistocene. Although previous studies have highlighted a stone tool-mediated shift from primarily plant-based to primarily meat-based diets as critical in the development of the brain and other human traits, we argue that digestible carbohydrates were also necessary to accommodate the increased metabolic demands of a growing brain. Furthermore, we acknowledge the adaptive role cooking played in improving the digestibility and palatability of key carbohydrates. We provide evidence that cooked starch, a source of preformed glucose, greatly increased energy availability to human tissues with high glucose demands, such as the brain, red blood cells, and the developing fetus. We also highlight the auxiliary role copy number variation in the salivary amylase genes may have played in increasing the importance of starch in human evolution following the origins of cooking. Salivary amylases are largely ineffective on raw crystalline starch, but cooking substantially increases both their energy-yielding potential and glycemia. Although uncertainties remain regarding the antiquity of cooking and the origins of salivary amylase gene copy number variation, the hypothesis we present makes a testable prediction that these events are correlated.
Here’s a Science Daily review.
Yawn. I’m being facetious, of course, but c’mon. Anyone who’s been paying attention to the science rather than salivating over the next Paleo guru contest giveaway deal must have been aware of the huge breakthrough in 2013: A Grassy Trend in Human Ancestors’ Diets. It even got the intransigent Loren “100% Right Since 2000” Cordain’s knickers in a bunch.
Interestingly, Cordain tried to refute the June 2013 revelation from the National Academy of Sciences that multiple studies had shown increased C4 intake from eating sedges. But his argument was fairly weak: while he acknowledged that the researchers had concluded that plants likely contributed the bulk of the C4 intake, he responded, “Nevertheless, when the isotopic data is triangulated from archaeological, physiological and nutrition evidence, it is apparent that the C4 signature in ancestral African hominin enamel almost certainly is resultant from increased consumption of animals that consumed C4 plants.”
Well, no. Unfortunately for Cordain, anthropologists had already tossed aside the idea of a carnivorous hominid: dental morphology did not present as carnivorous, and hominid tools were too primitive for butchering at the point in the timeline where C4 intake jumped significantly. And just a few months after he wrote a formal letter to the Proceedings of the National Academy of Sciences complaining about the findings, researchers from Oxford University discovered that it was indeed Tigernuts that contributed to the higher C4 intake in P. boisei, undercutting his scrambled rebuttal. If I remember correctly, the researchers showed evidence that these early hominids ate nutrient-dense sedge tubers and termites, and likely scavenged animals when they could obtain them. So, I guess you could say Cordain tried to refute the National Academy of Sciences, but he quit soon after the evidence piled up against him.
More stuff Loren Cordain has been wrong about, while steadfastly refusing to budge on any position he’s held for close to two decades—with the possible exception of backing down from his position that canola oil is better for you than animal fat:
- The danger of saturated fat.
- Legumes. He literally goes after Chris Kresser’s Dr. Oz appearance because Kresser said beans are OK for some people.
- Honey. He gets some hack with a dozen letters after his name to take down HFCS as some surrogate for honey.
As a consequence, he’s also gotten a lot wrong with his hand-waving over various “toxins” and “anti-nutrients”—as if he doesn’t understand that all plants except fruit have evolved chemical defenses against being eaten. It turns out that a lot of them are evolutionary push-pull, kinda like how an antibiotic that kills certain bacteria can eventually become a nutrient for them. Remember how phytate was once Satan’s Spawn? Well, it’s not anymore, since a recent study showed that “…in one of the most prominent gut bacteria species, [is] an enzyme able to break down phytate. […] They then went on to characterise the enzyme, showing it was highly effective at processing phytate into the nutrients the body needs.”
And if that’s not enough, here’s Mat “The Kraken” Lalonde, PhD (Harvard), on Chris Kresser’s podcast, covering wheat germ agglutinin and other so-called anti-nutrients.
Mat Lalonde: “It turns out that most lectins, especially the most well-studied ones like wheat germ agglutinin, PHA, which is in legumes, which is phytohaemagglutinin, they are deactivated by heat. These proteins are very sensitive to heat, and they’re destroyed. So people waving their hands in the air like, ‘Oh my God, these things are really toxic!’ and whatnot. And it’s true. They are very toxic. We have the research to show that they are toxic in animals in vitro when they’re fed to animals, but it turns out that they’re feeding raw legumes or pure isolated proteins to these things, not cooked food.”
That’s a great podcast episode. Mat talks about a lot of the things that the Paleo™ narrative got wrong. For instance…
Mat Lalonde: “But I had been giving a nutrition talk for a long time, and I’d been relying on antinutrient information that came from a talk that was given to me by Loren Cordain, and it turns out that most of the information that was in there that I never bothered to double check—I should’ve double checked—was wrong. And when I was standing up there and I was really condescending, that was to him. I was like: Listen, dude, we need to have a match right here, right now, because you have been putting out this information and it’s completely wrong, and I can discuss where the failings are.”
This was in 2012. Now, show me a single post from Cordain since then suggesting he may have jumped the gun on any of this stuff. In fact, what you’ll find are lots of posts asserting the same things he’s always asserted, incorporating nothing new, and using all the various ploys to discredit any research that calls his assertions into question or exposes them as incomplete.
So now, back to the original point of the post: the same hand-waving, diversionary tactics, red herrings, and non sequiturs from Cordain to dismiss science that’s actually moving forward rather than stuck forever in his money-making enterprise.
And how many times does he address Tigernuts (Cyperus esculentus)? Zero, and you can see above how he went crickets when the C4 grasses research uncovered the real source of the isotopes. It wasn’t from eating grass, and it wasn’t from eating animals that eat grass—it was from a raw starchy tuber that humans and primates eat to this day, one that grows wild and propagates like a weed. Rather than address it head on, he engages in the diversion of arguing over when humans controlled fire. That’s beside the point, because you can eat Tigernuts right out of the ground. Moreover, it’s easy to harvest enough in a few hours to meet all caloric needs.
Baboons pull them up and eat them to this day, which is how researchers got on the trail in the first place.
The other fun fact is that in terms of macronutrient breakdown (protein, fat, carbohydrate), it’s very similar to breast milk. And what’s more, it edges out red meat in terms of vitamins and minerals, averaged out.
So, all you nitwits argue endlessly about cooked starches, when it’s pretty clear everyone relevant over there in Africa could get their hands on edible raw starch—and did—for about the last 2 million years…and pound for pound, Tigernuts have twice the starch of potatoes.
The issue is not whether carbohydrates were essential for the growth of big brains. The issue is that they were almost surely part of the equation going back upwards of 2.3 million years, in far greater quantities than we’d thought, so it’s a moot, endless “well, what if” point—and frankly, it’s dishonest, disingenuous, and manipulative to keep framing the argument in those terms.
Nor is the issue whether starch is essential for our already-big brains today, but whether the levels of starch found in whole foods are healthful—and I submit that the history of human fecundity, survival, and prosperity is proof enough, and that it’s laughable to hold otherwise. The Blue Zones offer further profound evidence that longevity can be added to that list.
Until Cordain et al. take on Tigernuts—and other starchy raw sedge tubers—specifically, concretely, and exhaustively, they should by all rights be laughed right off the Paleosphere Stage as a bunch of money-hungry, dishonest pip-squeaks and clowns.