Did Paleo (and Loren Cordain) Get It Wrong On Carbohydrates? Of Course, But We Knew That Already.

The great thing about icons is that sooner or later, they always fall. It’s the job of general iconoclasts, like me, to laugh and ridicule (sometimes at myself, even)—to hasten the process—all the while folks are busy holding on for dear life: banking on, lifestyling, worshiping, and making livings off their iconic idols. The strong survive, though, so it’s really just a process of evolutionary natural selection. Nothing to be afraid of. Adapt, or face scorn and dismissal.

Recent jaunts into this area have been the posts on iron “enrichment” (here and here), Paleo getting it wrong on grains, and a brief history of the astounding popularity and abuse of bloodletting (that’s highly related to the iron posts). A forthcoming post will expand on Paleo getting it completely wrong on grains, by means of serious conflation—that’s tantamount to holding a position against eating eggs because so many people eat just the whites.

Recently a new study came out: The Importance of Dietary Carbohydrate in Human Evolution.

Abstract

We propose that plant foods containing high quantities of starch were essential for the evolution of the human phenotype during the Pleistocene. Although previous studies have highlighted a stone tool-mediated shift from primarily plant-based to primarily meat-based diets as critical in the development of the brain and other human traits, we argue that digestible carbohydrates were also necessary to accommodate the increased metabolic demands of a growing brain. Furthermore, we acknowledge the adaptive role cooking played in improving the digestibility and palatability of key carbohydrates. We provide evidence that cooked starch, a source of preformed glucose, greatly increased energy availability to human tissues with high glucose demands, such as the brain, red blood cells, and the developing fetus. We also highlight the auxiliary role copy number variation in the salivary amylase genes may have played in increasing the importance of starch in human evolution following the origins of cooking. Salivary amylases are largely ineffective on raw crystalline starch, but cooking substantially increases both their energy-yielding potential and glycemia. Although uncertainties remain regarding the antiquity of cooking and the origins of salivary amylase gene copy number variation, the hypothesis we present makes a testable prediction that these events are correlated.

Here’s a Science Daily review.

Yawn. I’m being facetious, of course, but c’mon. Anyone who’s been paying attention to the science rather than salivating over the next Paleo guru contest giveaway deal must have been aware of the huge breakthrough in 2013: A Grassy Trend in Human Ancestors’ Diets. It even got the intransigent Loren “100% Right Since 2000” Cordain’s knickers in a bunch.

Interestingly, Cordain tried to refute the June 2013 revelation by the National Academy of Sciences that multiple studies had shown increased C4 intake from eating sedges. But his argument was fairly weak: while he acknowledged that the researchers had concluded that plants likely contributed the bulk of C4 intake, he responded, “Nevertheless, when the isotopic data is triangulated from archaeological, physiological and nutrition evidence, it is apparent that the C4 signature in ancestral African hominin enamel almost certainly is resultant from increased consumption of animals that consumed C4 plants.”

Well, no. Unfortunately for Cordain, anthropologists had already tossed aside the idea of a carnivorous hominid, since dental morphology did not present as carnivorous, and hominid tools were too primitive for butchering at the point where the timeline shows a significant jump in C4. And just a few months after he wrote a formal letter to the Proceedings of the National Academy of Sciences complaining about the findings, researchers from Oxford University discovered that it was indeed tiger nuts that contributed to the higher C4 intake in P. boisei, making his scrambled rebuttal look even weaker. If I remember correctly, researchers showed evidence that these early hominids ate nutrient-dense sedge tubers, ate termites, and likely scavenged animals when they could obtain them. So, I guess you could say Cordain tried to refute the National Academy of Sciences, but he soon quit once the evidence piled up against him.

There’s more stuff Loren Cordain has been wrong about, while steadfastly refusing to budge on any position he’s held for getting close to two decades—with the possible exception of backing down on his position that canola oil is better for you than animal fat.

As a consequence, he’s also gotten a lot wrong in his hand-waving over various “toxins” and “anti-nutrients”—he doesn’t seem to understand that all plants except fruit have evolved chemical defenses against being eaten. It turns out that a lot of them are evolutionary push-pull, kinda like how an antibiotic that kills certain bacteria can eventually become a nutrient for them. Remember how phytate was once Satan’s Spawn? Well, it’s not anymore, since a recent study found, “…in one of the most prominent gut bacteria species, [is] an enzyme able to break down phytate. […] They then went on to characterise the enzyme, showing it was highly effective at processing phytate into the nutrients the body needs.”

And if that’s not enough, here’s Mat “The Kraken” Lalonde, PhD (Harvard), on Chris Kresser’s podcast, covering wheat germ agglutinin and other so-called anti-nutrients.

Mat Lalonde: “It turns out that most lectins, especially the most well-studied ones like wheat germ agglutinin, PHA, which is in legumes, which is phytohaemagglutinin, they are deactivated by heat. These proteins are very sensitive to heat, and they’re destroyed. So people waving their hands in the air like, ‘Oh my God, these things are really toxic!’ and whatnot. And it’s true. They are very toxic. We have the research to show that they are toxic in animals in vitro when they’re fed to animals, but it turns out that they’re feeding raw legumes or pure isolated proteins to these things, not cooked food.”

That’s a great podcast episode. Mat talks about a lot of the things that the Paleo™ narrative got wrong. For instance…

Mat Lalonde: “But I had been giving a nutrition talk for a long time, and I’d been relying on antinutrient information that came from a talk that was given to me by Loren Cordain, and it turns out that most of the information that was in there that I never bothered to double check—I should’ve double checked—was wrong. And when I was standing up there and I was really condescending, that was to him. I was like: Listen, dude, we need to have a match right here, right now, because you have been putting out this information and it’s completely wrong, and I can discuss where the failings are.”

This was in 2012. Now, show me a single post from Cordain since then suggesting he may have jumped the gun on a lot of this stuff. In fact, what you’ll find are lots of posts asserting the same things he’s always asserted, incorporating nothing new, and using all the various ploys to discredit any research that calls his assertions into question or exposes them as incomplete.

So now, back to the original point of the post. Same hand-waving, diversionary tactics, red herrings, and non sequiturs from Cordain to dismiss science that is actually moving forward, not stuck in his money-making enterprise forever.

And how many times does he address tiger nuts (Cyperus esculentus)? Zero, and you can see above in this post how he went crickets when the C4 grasses research uncovered the real source of the isotopes. It wasn’t from eating grass, and wasn’t from eating animals that eat grass—but rather from a raw starchy tuber that humans and primates eat to this day, one that grows wild and propagates like weeds. Rather than address it head on, he engages in the diversion of arguing over when humans controlled fire. It’s beside the point, because you can eat tiger nuts right out of the ground. Moreover, it’s easy to harvest enough in a few hours to meet all caloric needs.

Baboons pull them up and eat them to this day, which is how researchers got on the trail in the first place.

The other fun fact is that in terms of macronutrient breakdown (protein, fat, carbohydrate), it’s very similar to breast milk. And what’s more, it edges out red meat in terms of vitamins and minerals, averaged out.

So, all you nitwits argue endlessly about cooked starches, when it’s pretty clear everyone relevant over there in Africa could get their hands on an edible raw starch, and did, for about the last 2 million years…and pound for pound, tiger nuts have twice the starch of potatoes.

The issue is not whether carbohydrates were essential for the growth of big brains. The issue is that they were most surely part of the equation going back upwards of 2.3 million years, in far greater quantities than we’d thought, so it’s a moot, endless ‘well, what if’ point, and frankly, it’s dishonest, disingenuous, and manipulative to keep framing the argument in those terms.

Also, the issue is not whether starch is essential for our already-big brains today, but whether whole-food levels of starch are healthful, and I submit that the history of human fecundity, survival, and prosperity is proof enough, and that it’s laughable to hold otherwise. The Blue Zones offer further profound evidence that longevity can be added to that list.

Until Cordain, et al., take on tiger nuts—and other starchy raw sedge tubers—specifically, concretely, and exhaustively, they should by all rights be laughed right off the Paleosphere Stage as a bunch of money-hungry, dishonest pip-squeaks and clowns.

Richard Nikoley

I'm Richard Nikoley. Free The Animal began in 2003 and as of 2021, contains 5,000 posts. I blog what I wish...from health, diet, and food to travel and lifestyle; to politics, social antagonism, expat-living location and time independent—while you sleep—income. I celebrate the audacity and hubris to live by your own exclusive authority and take your own chances. Read More

41 Comments

  1. John Rhoades on August 21, 2015 at 16:08

    Ex-type II diabetic and ex-way overweight, now no symptoms, good blood sugar control, and about 25 BMI with more muscle than average for my age (70). About 25-35% of my diet is carbs (I no longer count). n=1 sure, but then the n is me so I consider it highly relevant. I pretty much eat the amount of carbs my body/brain uses, so no excess glucose to pile up as fat and sicken my cells with excess energy they can’t use. Transitioned from SAD to Paleo in 2010, then Paleo to Jaminet PHD in 2011. Paul Jaminet has always advocated sufficient carbs to meet but not exceed body/brain requirements, generally way more than the low-carb crowd recommends.

    • Bret on August 26, 2015 at 05:52

      The low carb crowd seems to be unable or unwilling to distinguish whole food carbs from refined carbs, and until they do, they’re a joke in my book.



  2. David on August 21, 2015 at 17:18

    Richard, do you have any predictions on the results of the NuSI experiments?

    • John on August 22, 2015 at 14:12

      You mean, other than finding that carbs are bad, bad, bad and that calories don’t matter?

      Although I still like Tim Ferriss, and I think he does bring up interesting ideas (like cold thermogenesis and Ray Cronise), I don’t trust ANY of his claims. On page 23 of The Four Hour Body, he claims that a 220 pound man doing an hour on the stairmaster would only burn 7 more calories than sitting on the couch watching the Simpsons. That’s ludicrous; it would be much closer to 400 or 500, depending on level of intensity. If he’s that far off base on something that simple, how could he be trusted on anything more complicated?
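      [Ed.: the commenter’s figure can be sanity-checked with the standard MET estimate, kcal/min = MET × 3.5 × kg ÷ 200. This is only a sketch: the MET values chosen below (~6 for a moderate stair machine, ~1.3 for sitting watching TV) are assumed, typical compendium-style figures, not anything from Ferriss’s book.]

```python
# Rough sanity check of the "7 extra calories" claim using the common
# MET formula: kcal/min = MET * 3.5 * body_mass_kg / 200.
# MET values are assumptions (approximate compendium-style figures).

def kcal_burned(met: float, mass_kg: float, minutes: float) -> float:
    """Estimate calories burned at a given MET level."""
    return met * 3.5 * mass_kg / 200 * minutes

mass_kg = 220 / 2.2046                     # 220 lb man, about 99.8 kg
sitting = kcal_burned(1.3, mass_kg, 60)    # sitting watching TV, ~1.3 MET
stairs = kcal_burned(6.0, mass_kg, 60)     # moderate stair machine, ~6 MET

print(round(sitting), round(stairs), round(stairs - sitting))
# → 136 629 492  (hundreds of extra calories, not 7)
```

      Under these assumptions the hour on the stairmaster burns roughly 500 calories more than the couch, in line with the commenter’s “400 or 500” and nowhere near 7.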



    • Bret on August 26, 2015 at 20:40

      You mean, other than finding that carbs are bad, bad, bad and that calories don’t matter?

      Absolutely. My own carelessness did not enable me to eat 1900 calories in a single meal…it was my biochemistry that overcame my will power, because carbs.



  3. John on August 22, 2015 at 13:10

    “A forthcoming post will expand on Paleo getting it completely wrong on grains, by means of serious conflation—that’s tantamount to holding a position against eating eggs because so many people eat just the whites.” I don’t think this is a good analogy. I think the biggest problem with grains is that a cumulative toxin (iron) is being added, not that certain ones are being eaten refined. It’s more like holding a position against eggs because eating eggs along with antifreeze causes serious problems.

    I also don’t think that eating just egg whites, in and of itself, is unhealthy. In fact, the whites are where the iron binding proteins of eggs exist. Egg whites could be a very useful addition to a diet whose goal is to lower iron stores of the body.

    In the case of wheat and rice (the only grains that can be refined), I still believe that refined and unfortified are superior to whole. I believe this because the refined versions cause fewer GI problems than whole, and clinical trials have shown superior mineral retention when consuming refined rice and wheat, as opposed to whole.

    • gabkad on August 22, 2015 at 14:14

      John, an egg yolk contains about 2% of the iron requirements of a human being. Eating only egg whites is….. oh, never mind.



    • John on August 22, 2015 at 14:44

      Were you going to say silly? Because I would tend to agree with that. I think the yolks are packed with vitamins, and that cholesterol is nothing to be afraid of in food. But that doesn’t make egg whites “unhealthy,” and they could certainly be useful in certain contexts.



    • gabkad on August 22, 2015 at 19:34

      No John, not silly. A bloody waste is what I think it is when people eat only the whites when the nutrient bomb is the yolk.

      I eat duck eggs. Huge tasty yolks.



    • John on August 23, 2015 at 14:01

      “Fewer GI problems to whom? To people whose digestive systems have been destroyed by a lifetime of eating nutrient depleted refined grains?” The GI problems I’m referring to are generally minor and unpleasant, things like gas and bloating. I’ve personally gotten this from brown rice in the past, and it seems fairly common. I haven’t noticed this when I eat white rice. At the 2:10 mark of this video, this trainer mentions that he knows a lot of people that get bloated from brown rice, himself included- https://www.youtube.com/watch?v=h1OWk6R97hw

      “Which minerals are better retained from refined grains? And how can they be better retained if they’re not there in the first place?” Calcium, Magnesium, and Zinc. And they don’t have to come from the grains themselves, since no one eats a 100% whole grain diet for their entire life. Also, even though mineral content of refined grains are reduced, they aren’t completely devoid of minerals.

      Here’s a study that found superior mineral retention when white rice was eaten- http://journals.cambridge.org/action/displayAbstract?fromPage=online&aid=800920&fileId=S0007114550000278

      This study found similar results-

      This study from Iran found better retention of calcium, magnesium, phosphorus and zinc in two male subjects when eating white bread as opposed to whole wheat Bazari bread- http://www.scribd.com/doc/275705538/J-Nutr-1976-Reinhold-493-503-pdf#scribd



    • Jane Karlsson on August 23, 2015 at 08:05

      “In the case of wheat and rice (the only grains that can be refined,) I still believe that refined and unfortified are superior to whole. The reason I believe this is due to the fact that refined versions cause fewer GI problems than whole, and clinical trials have shown superior mineral retention when consuming refined rice and wheat, as opposed to whole.”

      Fewer GI problems to whom? To people whose digestive systems have been destroyed by a lifetime of eating nutrient depleted refined grains?

      Which minerals are better retained from refined grains? And how can they be better retained if they’re not there in the first place?



    • Duck Dodgers on August 23, 2015 at 20:28

      “The GI problems I’m referring to are generally minor and unpleasant, things like gas and bloating”

      Some might say that’s a sign that unbalanced flora are adjusting to fibers.

      “superior mineral retention”

      Does anyone in a developed country, who can obtain virtually any mineral-rich food (mollusks, hemp, tubers, mushrooms, etc), really need to worry about “superior mineral retention”? Citing a study about a “poor vegetarian diet” isn’t exactly proof of anything related to the diet of developed populations. Recent studies on anti-nutrients tend to treat developed and non-developed populations as generally unrelated.

      The problem is that the benefits of whole grains in developed populations are often attributed to the antioxidants and phytochemicals that are refined out by modern milling. For instance, and this is just one example, phytates have antioxidant properties for developed populations while they are anti-nutrients to people with poor vegetarian diets. The compound has dual roles.

      On the flip side of the coin, many phytochemicals appear to stress our bodies in ways that make us stronger. It’s like a cocktail of antioxidants, hormetic pro-oxidants, and unique fibers.

      In our current research, we are finding that pre-industrial revolution populations not only thrived on whole grains and whole wheat, but worshipped it and believed that whole wheat in particular was the most important and most nourishing vegetable that they could eat. The written history and the opinions of those who were around during the rise of modern milling, suggests that wheat only lost its title as the “staff of life” once it was refined and purified by milling machines in ways that could not have been achieved prior to the industrial revolution.

      Truth be told, I don’t think many Westerners these days have ever actually eaten true whole wheat, because it’s fairly difficult to buy. If you buy “whole wheat” flour or “whole wheat” bread in a supermarket, you’re usually buying something once known as “bran flour,” which is white flour with the bran added back in. This is done to remove the germ, which has a shorter shelf life.

      Anyhow, it seems hard to believe that true whole wheat, which was the main early nourishing staple of Western civilization, needed to be refined by machines in order to achieve some kind of perfection. It makes no sense when you see that the benefits of whole grains are only partially related to minerals. There are likely many compounds in whole wheat that don’t make it into industrially milled products—therefore, it’s not a whole food. Meanwhile, the critics of grains mainly focus on the impaired mineral status of people with poor vegetarian diets—as if it had any relevance to those with mineral-replete diets.

      If one is so worried about mineral status, one can easily source mineral-rich foods with very little effort. I’ve not yet seen any evidence that phytates steal micronutrients from the mitochondria—antinutrients seem to mainly sweep away excess. If you know of any evidence otherwise, please share it!



    • FrenchFry on August 24, 2015 at 01:34

      @John
      You have been reading Colpo, haven’t you 😉

      @Duck : what about Graham’s wheat flour ? I vaguely remember it is better than simple “whole wheat flour” but I suppose that for a proper shelf-life, the flour is also devoid of the germ ?



    • Jane Karlsson on August 24, 2015 at 05:02

      @John
      Yes I think gas and bloating from brown rice is quite common. It isn’t the brown rice’s fault.

      The studies you linked are nearly all very short term. Gut bacteria need time to adapt. The authors of your first paper realised this, and extended the experiment to 18 weeks.

      “It is evident that the calcium and magnesium balances [on the brown rice diet] improved for each subject so that, for example, originally negative balances in time became positive. This is definite evidence of an adaptation on the part of the organism to adverse dietary conditions. …
      ….It is not possible to say what is the essential nature of this process of adaptation. It is achieved, apparently, by decreasing the quantities of calcium and magnesium in the faeces. This may be due to an alteration in the intestinal flora producing increased breakdown in phytate, or to altered rates of absorption.”

      If you eat food low in minerals like white rice, your gut transport systems will be upregulated to increase absorption as much as possible. In fact you may absorb too much, in which case excretion will go up, as happened in 9 of the 12 subjects.

      “The urinary excretion of calcium was, in nine subjects, greater on the polished rice diet.”



    • Duck Dodgers on August 24, 2015 at 13:32

      We are going to briefly cover Sylvester Graham in the upcoming post. As I understand it, Graham was correct in realizing that the toxic alum and chlorine being added to white flour in the early 19th century was problematic. His flour was additive free.

      But, Graham’s main belief was that people needed to consume lots of irritating fibers that rasped and scratched the gut lining to provoke a kind of hormesis to activate peristalsis and relieve the constipation that became prevalent with industrialized flours. He also believed that modern flours led to venereal excess and masturbation, which depleted vital energy and nutrients. Graham died at the age of 57, leaving people in doubt of his theories—later to be revived by J.H. Kellogg. But Graham’s followers, “Grahamites” were rumored to have also eaten ground up coats and beards to simulate whole grain Graham flour.

      I don’t plan on dwelling too much on Graham, or his beliefs, but what’s interesting is that he was one of the early proponents of a fiber-rich diet.

      As we look into the history of wheat, we see that the dietary gurus of the 19th and early 20th century actually knew what the widespread dietetic problem was (ignoring their odd preoccupation with masturbation). Amazingly, they even knew how to use probiotic foods and fibers to reverse the problem. But, more importantly, they knew that industrialized flours were not the same as real whole wheat. In the end, industrial food reformers won the battle over our taste buds and the fashion shifted to modern/refined flours, which were later fortified and then hybridized, etc.

      The historical literature suggests that widespread wheat-related digestive issues are only about 200 years old. Prior to that, it appears that whole grain bread—and in particular, whole wheat—was the de facto superfood that agriculturalists were meant to eat.

      We’ve genuinely lost the infrastructure to manufacture true whole wheat, and so it’s rarely ever consumed these days, and few people ever make the effort to exclusively source it.



    • Duck Dodgers on August 25, 2015 at 13:17

      Ah… I now see why I was confused on the industrial flour process. Here’s a video on how industrial flour is made. They do in fact take pure white flour and then add the rest of the “whole grain” back in, estimated to the original proportions. It’s not like grinding coffee. Some people argue that the industrial milling of the germ and the purifying process result in a very different product from grist-mill flour, but I am unable to confirm that.

      Canadian flour is where the confusion comes in as they allow a partial reconstruction of flour and can still call it “whole wheat” even though it’s not. Canadians need to buy “whole grain whole wheat” in order to get whole grain.



    • FrenchFry on August 25, 2015 at 02:22

      Thanks DD!


      We’ve genuinely lost the infrastructure to manufacture true whole wheat, and so it’s rarely ever consumed these days, and few people ever make the effort to exclusively source it.

      If your bulk calories were wheat, it would be bad news. However, today one can simply base the bulk of one’s calories on some other staple and have the sub-optimal but still reasonably OK whole organic wheat on occasion (as long as you are not celiac or sensitive to anything wheat related—I am not, and I don’t mind eating wheat at all these days; my foray into “paleo” 2-3 years ago ended quite a while ago when I grew very uneasy with all the marketing and salesman tricks from the paleo/primal gurus, and I also remembered that my recent ancestors were doing just fine on grains). If anything, I tend to eat rather low fat, as I base my diet on natural carb sources and don’t seek to eat animal foods every day. So grains are definitely a big part of my current diet and I am quite happy about it.

      In retrospect, it is amazing how much baloney you can swallow if you are not very critical of all these “ancestral” prescriptions, including the myths of the low-carb high-fat diets advertised everywhere these days. Even WAPF stuff is to be taken with a very critical eye (cf. the FCLO critics—I’d rather eat fresh cod livers than swallow a rancid, near-putrefied oil that probably delivers very little of its promised benefits).



    • Duck Dodgers on August 25, 2015 at 07:03

      “If your bulk calories was wheat, it would be bad news. However, today one can simply base the bulk calories on some other staple and have the sub-optimal but still reasonably OK whole organic wheat on occasions (as long as you are not celiac or sensitive to anything wheat related”

      True. But, I think you may be missing my point—probably because I haven’t expressed it fully. 🙂

      In my own recent n=1 with true whole wheat, and after re-adapting to it, I’m finding that I respond to real wheat much like I have with other superfoods. For instance, when taking Reishi mushrooms, it’s common to just feel very relaxed and focussed. And keep in mind that even mushrooms require specific preparations (tea, cooking, extraction, etc). Well, that’s the same effect I’m getting from just incorporating a few daily servings of whole wheat into my diet.

      Every successful culture had its superfood. For Western civilization it was wheat, hands down. We don’t really think of wheat as a superfood these days (it’s actually become a liability), but the ancients believed that wheat had magical properties. There are lots of examples of this in the early literature. I think there may be something to those sentiments.

      So, while it’s true that from a caloric standpoint it’s sort of irrelevant, from a health and well-being perspective I think something’s been lost there. After all, wheat is not just calories; it’s full of all sorts of phenolics, carotenoids, sterols, β-glucan, RS, inulin, oligosaccharides, lignans, and other compounds (see Table III, here).

      I think we probably need to get out of the mindset that wheat is some kind of sub-par caloric filler. That’s only true of industrial milled wheat. True whole wheat, and other farinaceous seeds, are the superfood of agriculturalists. I suspect it’s about time people started re-discovering these lost superfoods for their own n=1s and see how they do.

      So, I see the loss of that infrastructure as something more than a re-allocation of staple calories. I think I’m beginning to see it as a loss of our culture’s own superfood.



    • Duck Dodgers on August 25, 2015 at 07:37

      Need to clarify. Looks like I was wrong about whole wheat flour being rare.

      Wikipedia: Whole-wheat flour

      In the United States, “whole wheat flour” must contain the whole grain—the bran, the germ, and the endosperm—in the naturally occurring proportions.[2]

      By contrast, in Canada, “whole wheat flour” may have up to 5 percent of the kernel removed and is thus not necessarily whole grain.[3] Thus, “whole wheat” flour commonly has 70% of the germ removed to prevent rancidity, and as such cannot be labeled “whole grain.” Only “whole grain whole wheat flour” must contain the whole grain.

      Outside North America “whole-wheat flour” is sold as “Wholemeal flour” or various type numbers or regional identifications.

      So, depending on which country you are in, you just have to find the right designation. Old grist mills preserve more of the compounds, but in general regular whole wheat flour in a traditional preparation should be relatively good.

      My own n=1 is largely based on a bakery that mills its own organic flour each morning, so hard for me to say what the main factor is.



    • Richard Nikoley on August 25, 2015 at 11:57

      Good intro to the upcoming post.



    • Duck Dodgers on August 25, 2015 at 13:30

      And, I’ll just add that more of the confusion comes from this Canadian paper from McGill, part of which is about Canadian-specific standards discussed, above.

      The paper also discusses the chemicals that can be legally added during the industrial milling process (chlorine, chlorine dioxide, benzoyl peroxide, potassium bromate, ammonium persulfate, ammonium chloride, acetone peroxide, azodicarbonamide, ascorbic acid, l-cysteine, mono-calcium phosphate). And I believe many of these are legal in the US as well.

      So, yeah… industrial flour is a pretty heavily processed product. Would be interesting to hear if the French allow any of these chemicals in their flour.



  4. gabkad on August 22, 2015 at 14:18

    Richard: good stuff. Cordain is stuck in his own rut.

  5. marie on August 23, 2015 at 10:57

    You know, I wonder whether the specifics of what Cordain does or doesn’t believe within the evolutionary paradigm of nutrition actually matter; it’s the evolutionary framework itself that is, well, revolutionary. Yes, he wasn’t the only one nor the first, but he made it widely understandable.

    So science keeps finding new information and refining our understanding, within that evolutionary framework. If he disagrees with many of the new findings, isn’t that trivial compared to the very idea that’s caused so many people to look at nutrition through an evolutionary prism in the first place?

    Not being “nice” here, just weighing relevance. Science doesn’t need an impassioned defence against his or anyone’s arguments. It tends to progress inexorably. So the only reasons to argue are to kvetch or, generously, to influence opinions in a helpful way. However, if people would rather read opinions than learn enough to evaluate findings for themselves, that’s a great opportunity to let natural selection do its thing, no? 🙂

    Bisous, toujours…

  6. Michael Ruscio on August 24, 2015 at 20:24

    I really think we need to stop the high-carb versus low-carb argument, which only divides this movement, and rather acknowledge that a spectrum of carb intake exists and that different people will thrive at different points on the spectrum.

    • G in Northeast US on August 29, 2015 at 11:02

      How does your view of high carb – low carb diets as a spectrum relate to ketosis?

      I have more energy, and much more consistent energy levels when my carb intake is low enough to produce ketosis than otherwise.

      Also, what is the simplest and best way to consume the potato starch for increasing resistant starch in the diet? I don’t like the mixing with glass of water idea at all — maybe something like greek yogurt makes more sense?



  7. Marius on August 24, 2015 at 11:15
    • Richard Nikoley on August 24, 2015 at 12:50

      Well, Nora is painting herself into a buggy-whip box.

      Should I help her?



    • FrenchFry on August 25, 2015 at 04:09

      This woman’s opinion is irrelevant. Eventually, people will forget about “paleo”, LCHF diets, nutrition as religion, etc. A matter of time really.



    • Duck Dodgers on August 25, 2015 at 05:10

      At this point, it’s best to ignore the Paleo™ gurus and their blatant disregard for the ever-improving scientific literature. I wonder how much longer they can go on having to deny more and more conflicting literature being published each year. (For instance, even carbon isotope evidence is now considered to be murky). It’s becoming obvious that their only motivation in defending their position is to maintain their own livelihood, rather than seeking a better understanding of anything.

      The Paleo™ response is nothing more than selective interpretation.

      Wikipedia: Cognitive dissonance theory: Selective interpretation

      Selective interpretation: a method for reducing dissonance by interpreting ambiguous information so that it seems consistent with one’s beliefs, thoughts, or actions.

      The snake oil genius of Paleo™ is to justify a bizarre diet with ambiguous dietary clues from a time well before written records existed, allowing Nora—or anyone—to invent any diet they please. Those ambiguous and conflicting dietary clues are the perfect setup for a scam.



    • Richard Nikoley on August 25, 2015 at 07:14

      I like the third comment: “Thanks for this.”

      Might as well have said, “Whew, now I don’t have to think for myself.”



    • FrenchFry on August 25, 2015 at 07:59

      @Richard

      I enjoyed reading the comment from Dean; it made me laugh quite hard!

      So you see a bunch of squirrels eat some stuff in some way, and you get the brilliant idea that yeah, that’s the way I should eat too! … Really hilarious. Why not imitate, say, a sewage rat’s diet? What about a cow’s diet? Yeah, a good one: if eating healthy cows is so great, maybe eating like them would be even better!!

      OK, I’m out …



    • Bret on August 26, 2015 at 20:44

      Holy shit @ Gedgaudas. Is she flat out aiming to be dismissed and ridiculed by people with brains?

      She’s in good company. Jimmy Moore seems eager as ever to double down on specious bullshit.

      And Cordain…every time I read anything with his name attached to it, I become more convinced he is exactly what Richard described in this post.



  8. Gemma on August 25, 2015 at 03:43

    Hahaha

    my guess is that the authors of the study — Karen Hardy, Jennie Brand-Miller, Katherine D. Brown, Mark G. Thomas, and Les Copeland — were busy reading FTA…

  9. Jesrad on August 27, 2015 at 06:27

    The Hardy paper mis-cites the study that is supposedly their evidence of starch in the mid-Pleistocene: http://high-fat-nutrition.blogspot.com/2015/08/starchy-stable-isotopes-i-dont-think-so.html

    The diet difference between sapiens and Neanderthals/Denisovans that the isotope comparison highlights actually shows that sapiens ate more fish and shellfish, not more starch.

    Widespread use of fire starts around 780 ky ago, and the AMY1 gene starts duplicating after 350 ky ago (it’s duplicated in neither Neanderthals nor Denisovans). Dental cavities appear even later.

    The whole hypothesis of big brains from cooked carbs is ridiculous, and the citations used don’t support it.

    • Duck Dodgers on August 27, 2015 at 07:56

      It’s probably from the honey.



    • Richard Nikoley on August 27, 2015 at 08:01

      Well, yes, the cooked starch thing is a red herring, because there are tigernuts and other sedge tubers that are easy to harvest, incredibly nutrient-dense, and preferably consumed raw.

      So easy a baboon can do it, and they still do.



    • Jesrad on August 28, 2015 at 03:06

      Indeed, many primates do. They don’t seem to get bigger brains from it, though?



    • Jesrad on August 28, 2015 at 07:55

      I do recall that human consumption of honey was significant enough to drive the evolution of two species of honeyguide birds: https://en.wikipedia.org/wiki/Honeyguide



    • Richard Nikoley on August 29, 2015 at 06:50

      The point is that raw starch is PART of the picture, as one might expect. And other things too. We’re omnivores.



    • Jesrad on September 1, 2015 at 03:39

      Eugene McCarthy argues that the reason non-human apes do not evolve bigger brains than they already have is that they hit a cap on thermoregulation capacity, starch notwithstanding.

      As for me, I eat reheated rice and potatoes regularly.



  10. Jesrad on August 27, 2015 at 06:29

    While we’re talking about ridiculous ideas and evolution of big brains, I think you might like to have a look at this: http://www.macroevolution.net/hybrid-hypothesis-section-3.html
