Press Releases exaggerate Research and Journalists are happy to uncritically repeat the exaggerations

A study published today in the British Medical Journal investigates the often unhealthy relationship between biomedical and health-related studies, the press releases about those studies, and the resulting news articles. There's a widespread feeling among scientifically minded people that “the media gets it wrong”. While this is hardly controversial, it's always good to have some scientific data on the details. The study is titled “The association between exaggeration in health related science news and academic press releases: retrospective observational study”; the main authors are Petroc Sumner and Christopher Chambers.

The authors took press releases from 20 major UK universities. They then checked each press release and the resulting news articles for three very common types of exaggeration in the field: claiming causation where the study only shows correlation, drawing inferences about humans from animal studies, and giving practical advice about behavior change. There is one important limitation the authors point out: they didn't ask whether the studies themselves were already exaggerated; they only tried to measure the exaggerations that go beyond the study itself.

The main results are unsettling, but to be expected: press releases exaggerate a lot (between 36 % and 40 %, depending on the type of exaggeration). If the press release is exaggerated, journalists are much more likely to exaggerate as well (around 80 % for all three types). If the press release does not exaggerate, there is still a substantial chance that the journalist will do so. Journalists especially like to exaggerate consumer advice.

More exaggeration does not mean more news articles

There is one result that is a bit more difficult to interpret. The authors found that whether or not a press release is exaggerated makes hardly any difference to media uptake. One has to be careful here not to jump to conclusions too fast and make the very exaggeration mistakes this study is about. The finding could be read as a sign that scientists don't have to exaggerate in press releases to get media coverage. But another very plausible explanation is that the more interesting studies are less likely to be exaggerated, while the less interesting studies successfully compensate for their lack of appeal by exaggerating their results.

I wondered whether a causal relationship could be checked with a different study design. It would certainly be possible to run some kind of randomized controlled trial, though I'm not sure this would be ethical, as you'd have to deliberately produce exaggerated press releases to do so.

Who's to blame?

Apart from the data itself, the study has already led to some discussion about who's to blame and what to do about it. Interestingly, both the study itself and an editorial by Ben Goldacre tend to argue that scientists are to blame and should change. Both say they don't believe journalism will change (certainly something for me and my colleagues to think about).

Science journalist Ed Yong made a strong statement on Twitter, arguing that all the blame should go to the journalists: “We are meant to be the bullshit filters. That is our job.” I can't argue with that.

It's certainly interesting that the scientists seem to put the blame on science while the journalist blames his own profession. However, in the end I think there's neither an excuse for writing exaggerated news articles nor an excuse for writing exaggerated press releases.

Ben Goldacre makes some very practical suggestions for changing science press releases. He argues that press releases should name both the PR people and the scientists involved in and responsible for writing them, to improve accountability. He also proposes that press releases be integrated much more tightly into the scientific publishing process: they should be linked from the study itself, and they should be open to post-publication review and criticism from the scientific community. I think these are good ideas, though probably not sufficient to tackle the problem. (By the way, here is the press release about this study, and it is not linked from the study itself. They could lead by example.)