I’ve been amusing myself by examining the WHO meat-eating-causes-cancer report, which was widely and rightly ridiculed when it appeared a few weeks ago. In my last post we asked how a bunch of supposed experts from all over the world could have produced such a travesty, and we concluded it was inevitable because WHO was engaged not in a scientific inquiry but in an exercise in groupthink.(1)

But now let’s turn to some of the things that are wrong with the WHO report, not because that report deserves serious discussion but because its errors are characteristic of what passes for public health science these days. In other words, as I’ll try to show, almost everything that comes out of the field of public health deserves as much respect as Donald Trump’s opinion of himself.

WHO didn’t do any original research before they put out their report. Instead, they reviewed the existing literature: hundreds and hundreds of wildly conflicting observational studies focused on meat-eating and cancer. An “observational” study simply looks at two different groups of people (say, people who eat meat and people who don’t) and observes how they differ in the characteristic you’re interested in. In this case, what was of interest was the incidence of cancer, especially colon cancer, in the two groups.

Two prominent types of observational studies are so-called “cohort” studies and so-called “case-control” studies. In a cohort study the subjects are followed over time, with some subjects exposed and some not exposed to the factor being examined (meat eating). The investigator then observes whether the two groups differ on the outcome of interest – cancer. Case-control studies, by contrast, begin with subjects who already have a specific outcome (colon cancer) and compare them to subjects who don’t have colon cancer, specifically in terms of whether or not they ate meat. Got all that?
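For the technically inclined, here is a minimal sketch of how the two designs are usually summarized – a cohort study yields a relative risk, a case-control study an odds ratio. The numbers below are purely hypothetical and have nothing to do with the WHO report; they’re just there to show the mechanics.

```python
# A minimal sketch of how cohort and case-control studies are summarized.
# All counts are made up for illustration only.

def relative_risk(exposed_cases, exposed_total, unexposed_cases, unexposed_total):
    """Cohort study: compare the rate of cancer among meat eaters
    to the rate among non-meat-eaters, followed forward in time."""
    risk_exposed = exposed_cases / exposed_total
    risk_unexposed = unexposed_cases / unexposed_total
    return risk_exposed / risk_unexposed

def odds_ratio(cases_exposed, cases_unexposed, controls_exposed, controls_unexposed):
    """Case-control study: start with people who have cancer (cases) and
    people who don't (controls), then compare the odds of past meat eating."""
    odds_cases = cases_exposed / cases_unexposed
    odds_controls = controls_exposed / controls_unexposed
    return odds_cases / odds_controls

# Hypothetical numbers only:
print(relative_risk(18, 1000, 15, 1000))  # cohort: about 1.2x the risk
print(odds_ratio(60, 40, 50, 50))         # case-control: odds ratio of 1.5
```

The point of the sketch is only that the two designs answer the question from opposite ends: the cohort study starts from exposure and waits for the disease, while the case-control study starts from the disease and looks back at exposure.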

Observational studies like those examined by WHO are part of the broader field of epidemiology, which is the alpha and omega of public health science. Whenever you hear a bunch of self-appointed experts telling you that you should do X or shouldn’t do Y, nine times out of ten that advice is based on a bunch of (highly suspect) epidemiological studies.

Which brings us to the first problem with the WHO report and, indeed, almost all studies in the public health field. You will probably think I’m joking, but the problem is that while there is certainly a field called epidemiology, there is no such thing as a scientifically trained epidemiologist.

You can’t go to Harvard or Southwest Louisiana State and major in epidemiology. In fact, almost all people performing epidemiological studies (or observational studies) are physicians. They are obviously well-trained professionals, but they are not trained in the field that matters.(2) It’s as though we had a field called physics but no physicists. Or a field called philosophy but no philosophers.(3)

What we have, then, is a bunch of quasi-trained quasi-amateurs running around conducting studies that are almost always completely bogus. Recent headlines pointed out that roughly half the clinical studies published in major scientific journals can’t be replicated, which is more than slightly appalling. But consider that clinical studies are the gold standard in health science, the best-of-the-best. Observational studies are, from the get-go, a poor country cousin, so fraught with potential error that no one would even think about doing them if there were an ethical way to do a clinical study on the same issue.(4)

So if half of all clinical studies are bogus, care to guess what percentage of observational studies are bogus? 60%? 70%? 80%? Sorry, you’re way off. According to the WHO experts themselves, the correct answer is something like 97%.

The WHO report brags that “The Working Group [i.e., the panel of global groupthinkers WHO had assembled] assessed more than 800 epidemiological studies that investigated the association of cancer with consumption of red meat…” Well, that certainly sounds impressive. But of these “more than 800” studies, how many did the experts at WHO consider worthwhile, so well-designed and well-conducted that WHO could rely on them? The answer (I’m not making this up, folks) is 14 cohort studies and 15 case-control studies. Are you effing kidding me? The other 97% of the studies were worthless?
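For anyone who wants to check the arithmetic, here’s the back-of-the-envelope version. I’m assuming roughly 800 studies, since the report says only “more than 800.”

```python
# Back-of-the-envelope check of the "97%" figure.
kept = 14 + 15            # cohort + case-control studies WHO relied on
reviewed = 800            # "more than 800" reviewed; exact count not given
print(kept / reviewed)    # ~0.036, i.e., roughly 3-4% kept, ~96-97% set aside
```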

Well, yes, of course they were worthless, for the reasons noted above: they were designed and conducted and reported on by people who had no idea what they were doing. And that’s only the first thing that’s wrong with the WHO report – and public health “science” in general. I’m just getting warmed up here.

(1) The real tragedy, to the extent that anything associated with WHO can be considered tragic, is that we still have no clue whether eating meat causes cancer.

(2) Most of the rest of the so-called epidemiologists – those who aren’t docs – have something like a Master’s in Public Health degree.

(3) The only other institution I can think of that operates similarly – assigning amateurs to do professional jobs – is the United States Army. Back in the dark days of Vietnam I was drafted into the army out of law school. The army noticed “law” on my resume and promptly made me a military policeman. I was every bit as well qualified to be a cop as most docs are to be epidemiologists.

(4) The (unethical) clinical counterpart of the observational studies on which the WHO report was based would involve shoving meat down some people’s throats until they got colon cancer, while shoving veggies down other people’s throats until they didn’t. The veggie eaters would, instead, get brain, prostate, breast and stomach cancers as a result of all the pesticides they’d ingested.

Next up: Why We Don’t Take Our Meds, Part 3

[To subscribe or unsubscribe, drop me a note at GregoryCurtisBlog@gmail.com.]

