It’s a provocative headline—maybe too provocative: “What Should We Eat to Stay Healthy? Why Experts Actually Have No Idea.”
So goes the title of a recent Reuters op-ed by David Seres, an associate professor at the Institute of Human Nutrition at Columbia University in New York. It’s the type of headline meant to grab your attention, a hallmark of an age in which the findings of even small-scale nutrition studies are packaged and hyped by the media into tantalizing tidbits that routinely shoot to the top of any given website’s “most-emailed” list. You know the type: “Eating Blueberries Reverses Memory Loss!”; “Diet Soda Linked to Kidney Disease!”; “Live Longer, Live Healthier: Eat Nuts!”
Therein lies the problem. With such a headline, Seres (or, to be fair, perhaps his editor at Reuters) resorts to the same brand of overblown sensationalism that he seems to criticize in his piece. Do experts really have “no idea” what we should eat to stay healthy? Hardly the case.
The professor trains his academic’s eye on the way results of nutritional studies are often conveyed to the public, and much of what he says is eminently valid if decidedly not share-worthy—the equivalent of humdrum exhortations to eat your veggies and exercise when what the public wants is its health news in quick, sweet, easily digestible bites.
Seres outlines the problems and pitfalls of most nutrition studies, starting with the true-but-hardly-gee-whiz fact that it’s both complicated and expensive to conduct the research. Diet studies often fail, for example, because not enough test subjects stay on the diet. Furthermore, researchers may have to follow subjects for a number of years, if not decades, to get meaningful results. It can even be hard to target one aspect of a diet, as changing how people eat in one way can produce a domino effect on other health and lifestyle habits.
Which leads us to a helpful little lecture on the difference between randomized and observational studies. Really, if you’re going to consume any sort of health-related news, it’s a distinction you should know.
“In a randomized study, we recruit a group of subjects with a desired set of similarities, and randomly assign them to a treatment, which in this case is a diet,” Seres writes. “Researchers then monitor the subject to see how the different treatments have affected him or her. Because the subjects are relatively similar, and the treatments randomly assigned, researchers can establish cause and effect.”
In observational studies, by contrast, researchers look at a broad sample of people to see whether two things occur together with a high degree of frequency. But there’s no way to determine whether one thing causes the other. Hence, we’re talking correlation as opposed to causation, and that’s why you have so many health headlines employing the word “linked.”
To use Seres’ example, an observational study might find a correlation between smiling and happiness, and thus your headline might be “Researchers Link Smiling to Happiness.” No matter the implication, that doesn’t mean smiling causes happiness; the causation could just as easily run the other way, or some third factor could be driving both.
“While less difficult and less expensive—and therefore much more popular—this type of research can only generate hypotheses about cause and effect,” Seres writes. “Most of our dietary guidance is based upon this kind of research.”
So if all those do-gooder nutrition recommendations are so much B.S., I guess that means we can all go back to a steady diet of pizza, burgers, and soda, right?
That’s the opposite kind of sensationalism, of course. Seres seems to hunger for the same thing the public does: certainty. He just wants to find more scientific rigor behind the clicky headline. But while it’s worth reminding all of us that scientific research is an imperfect and constantly unfolding process that’s open to error and misinterpretation, it seems ludicrous to all but throw the proverbial baby out with the bathwater for want of definitive answers.
No diet, let alone any single “superfood,” is going to ensure that you never develop cancer or die of a heart attack. Researchers may never be able to establish the kind of ironclad cause-and-effect relationship between, say, eating too much bacon and early mortality, as they did with smoking a generation ago. But does that mean we should all go bacon-heavy Paleo? Or dive headlong back into the carb-counting Atkins craze? If the state of nutrition research is as dire as Seres seems to imply, then where are the studies touting the supposedly miraculous health benefits of the processed-food-laden, sugary, fatty, french-fries-as-vegetable “Western” diet?
When I come across one, I’ll be sure, dear reader, to let you know.