Wednesday, November 4, 2015

Parsing the Popular Press

Science, especially nutritional science, usually moves in small steps forward, then small steps backward, then more steps forward and a few steps sideways, until we eventually arrive at the truth. One experiment or one nutritional study proves nothing. It has to be confirmed, preferably several times, in order to be accepted as true.

Unfortunately, the popular press has no understanding of this. The popular press also has no understanding of statistics and what “significant” means in statistical terms: a statistically significant result is simply one that is unlikely to be due to chance, not necessarily one with a large or important effect. So a single study is often blown up and spread all over the popular press. The conclusions are simplified. And if that single study confirms the average person’s preconceptions, the study will be blown up even more and simplified even more.

The obvious recent example is the study about red and processed meat and cancer.

This was a meta-analysis, and such analyses have known problems. Nutritional studies also have known problems, as most of them use questionnaires, which require people to remember how many times they ate Food X over the last week or month. I often can’t remember what I had for breakfast, and my diet changes with the seasons and with what foods are on special that week. I haven’t the slightest idea how often I ate spinach in the last month.

Furthermore, the meat study was about only colorectal cancer (CRC). The conclusion was that “the state of the epidemiologic science on red meat consumption and CRC is best described in terms of weak associations, heterogeneity, an inability to disentangle effects from other dietary and lifestyle factors, lack of a clear dose-response effect, and weakening evidence over time.”

But the popular press ran articles with titles like “Meat as dangerous as smoking.” This is the sort of soundbite people will remember. No one will remember “weak associations” or “lack of a clear dose-response effect” (that is, no clear pattern of more meat meaning more cancer), even if they know what those terms mean, or the fact that the study was limited to CRC.

Numerous diet bloggers, including Zoe Harcombe and Stephan Guyenet, have pointed out the limitations of this red meat study. I don’t want to do that here, just point out how the popular press misleads the public in order to get sensational stories.

Another story that captured the attention of the popular press claimed that red wine can improve cardiovascular health in people with type 2 diabetes. This is the type of story the general public can understand, and it’s appealing to people who like to drink, so it was played large in a lot of the popular press.

These news stories are often based on just one study, and just one study proves nothing, especially if it’s a nutritional study. So the probability is high that further research will contradict the study the press has blown up. When that happens, some readers (or TV watchers) decide that no health information is trustworthy and may ignore even advice based on good research that has been confirmed many times.

Here is an example of a study of chimpanzee communication whose authors claim their results disprove previous results by another group. And here is a study contesting the source of DNA samples supposed to be from ancient grains; its authors claim the earlier results stemmed from contamination with modern grains. And here is a story refuting the idea that sitting for long periods of time is bad for your health even if you get exercise at other times.

One might conclude that no scientific studies are worth reading about because next week someone will claim the opposite. That’s not true. Often studies are confirmed by other researchers. It’s just that we have to interpret any study as tentative, especially if it’s funded by a group that sells the product, for example, a study by the US Gloopyberry Association showing that gloopyberries reduce cholesterol levels when you eat the equivalent of five pounds a day. The popular press would run stories with headlines like “Gloopyberries reduce cholesterol,” and people would buy more gloopyberries, which of course is why the Gloopyberry Association funded the study.

Many of the foods we eat have effects on the compounds in our blood, especially when eaten in excess, and most of them have never been tested. In fact, some of them might have a greater effect on cholesterol than the gloopyberries, but we won’t know until someone studies them in reasonable amounts, and that requires funding for a study.

Derek Lowe has a blog post about another problem with popular press articles: the tendency to label the results of studies as “breakthrough” or “miracle” cures when in fact they represent baby steps toward the solution of some problem. Most articles on science news sites like Science Daily or EurekAlert are simply press releases from the institutions whose scientists did the experiments. Quite often you’ll see two or even three articles about the same research, each one emphasizing the researchers from its own institution.

The job of the people who write these releases is to put the best possible spin on their institutions, and “breakthrough” studies are a good way to do this. They also provide suggested headlines, which these science news sites usually use; it’s easier than writing their own.

The popular press then reads the Science Daily or EurekAlert stories and simplifies even more, and that’s what most people read or hear on the evening news.

So what can we as intelligent readers do? We can’t ignore all these studies. But we have to file them away as “interesting and perhaps with some truth in them.” Then we can wait to see if the work is confirmed.

This isn’t as exciting as thinking that eating gloopyberries will solve all our problems and do our taxes for us. But it’s closer to reality. We can’t control the popular press, but we can control our reactions to its articles. Sometimes ignoring them is the best solution, I think.

5 comments:

  1. http://emboj.embopress.org/content/34/22/2721

    This is a good editorial on the problems of reproducing complex modern research. It's long, but worthwhile. The author mentions that space in journals is limited, so authors often provide only skimpy documentation of their methods. This is certainly true of some nutritional studies, which just say that mice were on a high-fat diet without giving more details.

  2. I've noticed that there is a hierarchy of dogma. Even some well-done research may have a significant weakening between the Data and the Conclusions (see especially Harvard), the Abstract is a further step away, the Press release hardly describes the same study, and what journalists write in the popular press (including the likes of WebMD) is almost completely disconnected.

    This works the other way too: often the public commenters have a much greater awareness than the journalist who wrote the original article.

    When research grants and reputations are at stake, science often takes a back seat. Journalists are divided between sensationalism and supporting the status quo depending on which is more likely to please the advertisers.

    There's now a critical mass of competent bloggers and commentators who frankly do a much better job than many peer reviewers.

    1. Many journalists write articles on the basis of press releases. Because the full text is often behind a paywall, because today's methods are extremely complex and sometimes vaguely described (a "high fat diet" in mice usually means high fat on a background of low-quality carbs), and because journalists aren't given time for a thorough review, it's not surprising that their stories are misleading.

      We're fortunate that we have bloggers who have the time to drill down in more detail. But some of them come to the research with preconceived ideas, so one has to read their commentary with a grain of salt too.

      Press releases are usually written by PR people whose job is to make their institutions look good. And of course they always point out that the research suggests where new drugs might help, hoping the investigator will attract interest and grants from big pharma.

      Getting to the truth isn't easy. It reminds me of the character in one of Solzhenitsyn's novels who read Pravda and tried to ferret out what was really going on by what *wasn't* mentioned.

  3. A scientist in a completely different field had a good take: he reckoned that while scientists didn't want to lie, neither did they want to lose their research grants, so they learned how to hide their results in plain sight, where other scientists could find them but accounting clerks would miss them...

  4. That's like Calvin Trillin saying reporters at Time Magazine added extraneous material to their stories, hoping that the editors, who needed to justify their jobs, would delete it and not something important.
