We Need More Headlines Like "No Evidence For Flossing"
The New York Times told us yesterday that we don't have to floss, and the Internet has been celebrating ever since.
But what did dental science actually discover? The news yesterday was that the federal government "had quietly dropped any mention of flossing" from its dietary guidelines, because it had never rigorously studied flossing's benefits. Furthermore, the Times reports that five years ago, a 12-study meta-analysis found only unreliable evidence that flossing reduces plaque, and no data at all regarding cavities or tooth loss. The Times headline was accurate: "Flossing? Maybe There's No Need." This was big health news about a lack of scientific knowledge!
We need more health news like this.
You see, science is currently experiencing a replication crisis: in a recent survey, most scientists reported having failed to replicate other scientists' results, and more than half had failed even to replicate their own. That implies that much of our "knowledge" from recent studies might not actually be based on evidence. The likely cause is that studies with new, positive results are rewarded with publication and news coverage, while the job of trying to replicate studies is thankless.
Imagine a hundred studies of oranges and cancer. The first 98 studies, perhaps, find no relationship. They are not prominently published or covered in the New York Times. The 99th study gets a positive result, by chance: it's published in a major journal and the New York Times headlines a story, "Oranges linked to cancer." Also by chance, the hundredth study finds a negative correlation: "Eating oranges prevents cancer." Since the two studies with significant outcomes are rewarded with publication and press coverage, while the inconclusive studies are not, we are left thinking that the scientific opinion about oranges is contradictory.
In fact, it's likely that oranges have no measurable effect. But there is little incentive in academia or the media to publish that fact.
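The thought experiment above is easy to check numerically. Here is a minimal sketch in Python (the numbers are hypothetical: two groups of 100 subjects per study, a true effect of exactly zero, and the conventional p < 0.05 threshold) showing that a run of null studies still produces "significant" results by chance alone:

```python
import math
import random

random.seed(0)

def one_study(n=100):
    """Simulate one study where oranges truly have no effect:
    both groups are drawn from the same distribution."""
    a = [random.gauss(0, 1) for _ in range(n)]  # orange eaters
    b = [random.gauss(0, 1) for _ in range(n)]  # everyone else
    mean_a, mean_b = sum(a) / n, sum(b) / n
    # Two-sample z statistic (known variance of 1 in each group).
    z = (mean_a - mean_b) / math.sqrt(2 / n)
    # Two-sided p-value from the standard normal CDF.
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# Run the hundred studies from the thought experiment.
results = [one_study() for _ in range(100)]
significant = sum(p < 0.05 for p in results)
print(f"{significant} of 100 null studies reached p < 0.05")
```

On average, about five of the hundred studies clear the threshold even though the true effect is zero. If only those few are published, the literature on oranges looks both positive and contradictory.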
Furthermore, there is not enough incentive to replicate studies. Let's say that only the "oranges cause cancer" study was published. Other researchers ought to redo the study and see if they get the same result. But for the sake of their careers, researchers create new studies to ask new questions instead. In this scenario, we think oranges cause cancer based on a single study.
A very real example of this crisis is "Growth in a Time of Debt", a famous 2010 economics paper by Carmen Reinhart and Ken Rogoff. They claimed that when a nation's debt exceeds 90% of GDP, its growth suddenly slows. Since the paper agreed with right-wing opinion about national debt, Paul Ryan and other Republicans cited it often during the 2012 US election, claiming it supported austerity policies that would have slashed aid to the poor.
Luckily, a humble grad student in 2013 tried to replicate their result, and couldn't. When Thomas Herndon asked for a copy of Reinhart and Rogoff's Excel file, he found an actual bug—they'd selected only part of a column as input to a formula—plus design flaws and missing data that undermined their thesis. Their real title should have been: "No conclusive link between debt and slow growth."
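The partial-column mistake is worth seeing concretely. Here is a toy illustration in Python (the country names and growth figures are invented, not Reinhart and Rogoff's actual data): a formula that stops partway down a column silently drops observations, and the average shifts.

```python
# Hypothetical growth rates for ten countries (invented numbers).
growth = {
    "Australia": 3.1, "Austria": 1.9, "Belgium": 2.6, "Canada": 2.2,
    "Denmark": 1.5, "Finland": 2.7, "France": 2.0, "Germany": 1.8,
    "Greece": -0.5, "Ireland": 2.4,
}
countries = list(growth)

# The intended calculation: average over the whole column.
full_avg = sum(growth.values()) / len(growth)

# The bug: the formula's range covers only the first five rows,
# so the last five countries never enter the average.
partial = [growth[c] for c in countries[:5]]
partial_avg = sum(partial) / len(partial)

print(f"average over all rows:     {full_avg:.2f}")
print(f"average over partial rows: {partial_avg:.2f}")
```

In a spreadsheet there is no error message for this: the formula computes a perfectly valid average of the wrong cells, which is why the mistake survived until someone asked for the file and checked.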
Reinhart and Rogoff were contrite, and they congratulated Thomas Herndon on his find. It's telling that Ken Rogoff was quoted this week in the Times in favor of debt increases, saying that targeted deficit spending could lead to growth. For this one study, the scientific method worked.
So let's see more headlines like "Flossing? Maybe There's No Need." Science reporters should give fair attention to inconclusive studies. Lack of evidence should be big news. If researchers are rewarded for trying to replicate studies, including when they fail to replicate them, then we'll put our scientific understanding on firmer ground. Meanwhile, common sense rules: I'll keep flossing.
Images: The Devil's Artisan.