Media reporting on research presented at scientific meetings: more caution needed

Steven Woloshin and Lisa M Schwartz
Med J Aust 2006; 184 (11): 576-580. doi: 10.5694/j.1326-5377.2006.tb00384.x
Published online: 5 June 2006

Abstract

Objective: To examine media stories on research presented at scientific meetings to see if they reported basic study facts and cautions, and whether they were clear about the preliminary stage of the research.

Design and setting: Three physicians with clinical epidemiology training analysed front-page newspaper stories (n = 32), other newspaper stories (n = 142), and television/radio stories (n = 13) identified in LexisNexis and ProQuest searches for research reports from five scientific meetings in 2002–2003 (American Heart Association, 14th Annual International AIDS Conference, American Society of Clinical Oncology, Society for Neuroscience, and the Radiological Society of North America).

Main outcome measures: Media reporting of basic study facts (size, design, quantification of results); cautions about study designs with intrinsic limitations (animal/laboratory studies, studies with < 30 people, uncontrolled studies, controlled but not randomised studies) or downsides (adverse effects in intervention studies); warnings about the preliminary stage of the research presented at scientific meetings.

Results: 34% of the 187 stories did not mention study size, 18% did not mention study design (another 35% were so ambiguous that expert readers had to guess the design), and 40% did not quantify the main result. Only 6% of news stories about animal studies mentioned their limited relevance to human health; 21% of stories about small studies noted problems with the precision of the finding; 10% of stories about uncontrolled studies noted it was not possible to know if the outcome really related to the exposure; and 19% of stories about controlled but not randomised studies raised the possibility of confounding. Only 29% of the 142 news stories on intervention studies noted the possibility of any potential downside. Twelve stories mentioned a corresponding “in press” medical journal article; two of the remaining 175 noted that findings were unpublished, might not have undergone peer review, or might change.

Conclusions: News stories about scientific meeting research presentations often omit basic study facts and cautions. Consequently, the public may be misled about the validity and relevance of the science presented.

Methods
Sample
Sample frame

We analysed media coverage of research presentations at high profile scientific meetings. We chose scientific meetings previously identified as “high profile” (ie, likely to attract media attention) based on advice from science writers, editors,2 and media database searches. The five meetings were the American Heart Association, the International AIDS Conference, the American Society of Clinical Oncology, the Society for Neuroscience, and the Radiological Society of North America.

Outcome measures

We used an explicit coding scheme (available at: http://www.vaoutcomes.org/research_tools.php) to analyse each news story. The one-page coding scheme was organised into the following three domains.

Basic study facts: Did the news story report study size, identify study subjects (eg, cells, animals, live humans) and study design, and quantify main results (and if so, were any absolute risks reported)? Most coding choices were framed as yes/no questions, although in some cases coders were asked if the information was stated explicitly or if they had to guess.

Cautions: Were relevant cautions provided about study designs with intrinsic limitations (ie, animal/lab studies, small [human] studies, uncontrolled studies, and controlled but not randomised studies), and were possible downsides noted for intervention studies?

Coders first indicated whether news stories noted (explicitly or by implication) any cautions about study design (eg, that the study was limited because of its small size) or about the interpretation of study results (eg, the study was preliminary and no one should change behaviour based on the findings). We further coded all caution statements to see if they addressed the key limitations intrinsic to particular study designs. Box 1 defines the study design-specific cautions we looked for (eg, did news stories on animal studies highlight that the results might not apply to humans). We applied these definitions liberally to give credit for any effort to provide the caution, even if the relevant statement was subtle. For example, a news story about an observational study was given credit for raising the possibility of confounding because it included the statement “this link is not proven by this study”.

Preliminary stage of research: Were there warnings about the preliminary stage of the research? By preliminary, we meant research presentations not associated with an “in press” or published journal article at the time of the meeting. Specifically, we looked for statements about whether the preliminary research was unpublished, had not undergone peer review, might change as the study matured, or that these might not be the final study results.

Results
Description of meeting presentations garnering media attention

Eighteen per cent of the newspaper stories appeared on page 1; 17% appeared in the top five circulation US newspapers (Box 2). The median length of newspaper stories was 473 words (interquartile range, 255–631 words).

Box 3 summarises the kinds of meeting presentations receiving media coverage. Most (89%) were identifiable as studies of live humans (9% were animal or laboratory studies; in 2% of the reports, study subjects could not be determined). Study size varied from just one subject to 75 000 subjects (median, 502); 17% of studies were small (< 30 subjects). Twenty per cent of meeting presentations were identified as being drug or industry funded. Among the 166 news stories about human studies, 39% covered observational studies (18% uncontrolled; 21% controlled). Only 3% described the most definitive kind of study: large (n ≥ 1000), human, randomised trials. About two-thirds of the 153 stories on human studies (other than surveys or unknown study designs) reported an intermediate outcome measure (eg, tumour size) rather than a patient outcome (eg, death).

Cautions

Important cautions about study designs with intrinsic limitations were rarely noted. Only 6% (1/17) of news stories about animal studies included a statement that the results might not apply to human health. Only 21% (5/24) of news stories about studies involving fewer than 30 people alerted readers to the imprecision of small studies. Ten per cent (3/31) of news stories about uncontrolled studies noted (or implied) that without a control group it is not possible to know if the outcome really relates to the exposure, and 19% (7/36) of stories about controlled but not randomised studies raised the possibility of confounding. Cautions about possible downsides of interventions were also missing: 142 news stories covered intervention studies, but only 29% noted any possible downsides (eg, side effects or other harms) or stated that there were none.

Box 5 lists cautions reported about study designs with intrinsic limitations. Although any attempt at caution is worthwhile, some are more helpful than others. Vague cautions (eg, “a larger, follow-up study is planned”11 about an uncontrolled study of 11 patients) may not raise sufficient concern about whether the findings are really true (or worth acting on). More specific and explicit cautions, like those reported in the news story “Cholesterol drugs cut cancer risk”,18 which raises the possibility of confounding and the need for a definitive study before acting on the results, are likely to be more helpful to readers.

Discussion

Work presented at scientific meetings is generally not ready for public consumption: results change, fatal problems emerge, and hypotheses fail to pan out. Nonetheless, the presentations are often big news. The five meetings we analysed received extensive coverage in the highest profile media outlets in the US. Unfortunately, the news stories often failed to report basic study facts and important cautions needed to avoid misleading the public about the meaning, validity and importance of the science highlighted.

Our study has two limitations. First, we examined only five meetings. It is possible that the coverage of other scientific meetings might have been better. We think this is unlikely, because these are extremely prominent meetings and the coverage appeared in well-known media outlets. Second, as with any content analysis, some subjectivity is inherent in coding. We tried to minimise subjectivity by creating and pretesting an explicit coding scheme, using two independent coders (one blinded to our hypotheses), and reporting only elements with very high inter-rater reliability (ie, κ > 0.7). In addition, we did not evaluate accuracy — our goal was to see whether key elements were reported; we did not check whether the journalists reported correct facts.
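To illustrate the inter-rater reliability threshold mentioned above, the minimal sketch below computes Cohen's kappa for two coders. The codes and variable names are hypothetical, invented purely for illustration; they are not our study data.

```python
# A minimal sketch of an inter-rater agreement check using Cohen's kappa.
# The coder labels below are invented for illustration only.

from collections import Counter


def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters assigning categorical codes to the same items."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)

    # Observed agreement: proportion of items coded identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n

    # Expected chance agreement, from each rater's marginal frequencies.
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in freq_a.keys() | freq_b.keys())

    return (p_o - p_e) / (1 - p_e)


# Hypothetical yes/no codes for whether 10 stories reported study size.
coder_1 = ["yes", "yes", "no", "yes", "no", "yes", "yes", "no", "yes", "yes"]
coder_2 = ["yes", "yes", "no", "yes", "yes", "yes", "yes", "no", "yes", "yes"]

print(f"kappa = {cohens_kappa(coder_1, coder_2):.2f}")
```

With these invented codes, observed agreement is 90% and chance agreement is 62%, giving κ ≈ 0.74, just above the 0.7 threshold used for reporting an element.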

It is not hard to understand why research presented at scientific meetings garners extensive media attention. Researchers benefit from the attention because it is a mark of academic success, their academic affiliates benefit because good publicity attracts patients and donors, and research funders (public and private) benefit when they can show a good return on their investments. The meeting organisers also benefit; extensive media coverage attracts more advertisers and higher profile scientists for the following year, guaranteeing more dramatic reports and, ultimately, more press. The importance of publicity is reflected in the fact that meeting organisers often pay more attention to courting the media (ie, issuing press releases, holding news briefings, and organising investigator interviews) than to vetting the science itself.2 Most importantly, the public has a strong appetite for medical news — particularly about new, “breakthrough” treatments and technologies. Scientific meetings provide the media with an easy source of such stories.

Unfortunately, the public does not always benefit from preliminary findings.22 When they turn out not to be true, patients can be hurt by exposure to ineffective or harmful treatments or by forgoing good alternatives. Consequently, it is important for the public to understand the inherent limitations of preliminary work.

The most direct way to improve the media coverage of scientific meetings would be to have less of it. This will not happen, of course, as too many interests are served by turning preliminary reports into health news. The next best thing would be to improve the way in which these stories are reported. Here, good reporting means providing basic study facts, highlighting cautions about study designs with intrinsic limitations, and being clear about the preliminary stage of the work under discussion.

Ideally, this effort would begin with improving the media’s sources. Press releases issued by meeting organisers, granting agencies and academic institutions should routinely include balanced data presentations (we favour tables with the absolute risks of outcomes for each study group) and study cautions. When interviewed, researchers should clearly and repeatedly note the preliminary nature of their work, the need to interpret results with caution and the importance of waiting for their work to undergo scientific peer review.
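As a rough illustration of the balanced data presentation favoured above, the sketch below reports the same result both as absolute risks for each study group and as a relative risk reduction. The event counts, group sizes and function name are hypothetical, chosen only to show the contrast between the two framings.

```python
# A minimal sketch contrasting relative and absolute presentations of the same
# hypothetical trial result. All counts are invented for illustration only.

def summarise(events_treated, n_treated, events_control, n_control):
    risk_treated = events_treated / n_treated
    risk_control = events_control / n_control

    relative_reduction = 1 - risk_treated / risk_control
    absolute_reduction = risk_control - risk_treated

    # Table of absolute risks per study group.
    print(f"{'Group':<12}{'Events':>8}{'N':>8}{'Absolute risk':>16}")
    print(f"{'Treatment':<12}{events_treated:>8}{n_treated:>8}{risk_treated:>15.1%}")
    print(f"{'Control':<12}{events_control:>8}{n_control:>8}{risk_control:>15.1%}")

    print(f"Relative risk reduction: {relative_reduction:.0%}")
    print(f"Absolute risk reduction: {absolute_reduction:.1%}")


# A "50% lower risk" headline can correspond to a one percentage point difference.
summarise(events_treated=10, n_treated=1000, events_control=20, n_control=1000)
```

In this invented example, a headline figure of a 50% relative reduction corresponds to an absolute difference of only one percentage point (2% v 1%), the kind of context that a relative figure alone does not convey.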

Although it is encouraging that more than half of the news stories reported basic study facts, there is much room for improvement. These facts are readily available, and journalists and their editors can ensure the facts are routinely reported. This problem is not specific to the coverage of scientific meeting presentations. For example, studies of the media coverage of medications in the US23 and Canada24 have found that 40% and 80%, respectively, of news stories failed to quantify benefit.

Highlighting cautions is more challenging, because this entails an appreciation of the inherent limitations of different study designs. In the news stories we analysed, cautions were routinely absent. When present, the cautions may have been too vague to be useful. We hope that Box 1 will help reporters (and those writing press releases for researchers, journals and meetings) focus on these issues, and that they will routinely use (or adapt) the language provided in the table to make these cautions explicit. Standardised language to describe common issues should help make the journalists’ job a little easier, and, more importantly, would help educate readers about these fundamental issues.

Of course, highlighting cautions is far less interesting than focusing on the novelty, human interest or possible implications of the research. Readers might even skip these stories altogether. But ignoring a preliminary report about a weak study is preferable to being misled.

Box 5: Selected cautions in news reports about study designs with important limitations

Animal or laboratory study

Exercising the body stimulates brain.10 In all, 24 monkeys participated in the study . . . Cameron cautioned, however, the study doesn’t prove exercise makes humans brighter.

Small study

Still depressed? Go work out.11 Eleven patients who completed a 12-week exercise program of 3 hours of exercise per week all went into full remission. A larger, follow-up study is planned, researchers said.

Experiment uses patients’ cells to heal heart.12 Incredible, if indeed the results last and can be repeated in larger studies. The Arizona team has only done this procedure on 18 patients.

Harvard researchers present sobering report on AIDS.13 Now this is only a single case report, but just this week, federal researchers published an article on two men in Thailand who got second HIV infections.

Uncontrolled study

Devices that read human thought now possible.14 That’s a far cry from proving that a workable long-term implant would be safe and effective. Nicolelis said it was much too soon to “even think about” moving any particular device into full-blown clinical trials.

Drug-coated stent is found safe and effective for arteries.15 But use of taxol in the heart stent . . . is experimental. There is no indication yet that it is safer or more effective than another type of drug-coated stent, already in use . . . Dr Stone said that if the FDA approved the Taxus device, which stent doctors choose would depend largely on marketing campaigns until a head-to-head study of both devices was completed.

Glaxo AIDS drug boost.16 Glaxo said 24 healthy volunteers had taken the new drug, currently known only as either S-1360 or GW810781, without any serious side effects. As a result, Glaxo last month initiated a “proof of concept” phase II trial involving 100 HIV-infected patients in the US.

Controlled but not randomised

Cardiac arrest deadlier at night.17 Stryer says patients should not be alarmed by the “provocative, but really preliminary” findings of the Virginia researchers. But they should be sure there is someone around to respond quickly if they need help . . . He said there might also be differences in the patients’ underlying illnesses. “It might be that regardless of how skillfully they were resuscitated, they were at higher risk. These are interesting questions that require more exploration.”

Cholesterol drugs cut cancer risk.18 The study factored out diabetes, prior medical history and the use of many categories of drugs in comparing the groups of patients. But it did not consider lifestyle conditions such as smoking . . . Cauley’s research and the new Dutch study reviewed patient records to identify trends. That is the same approach that led to initial findings saying hormone replacement therapy in post-menopausal women reduces the risk of heart disease and dementia. But more rigorous trials — selecting 2 groups of women and giving one real medication and the other a fake pill — found an increase in those conditions . . . But this is not enough data for me to start telling patients, “let’s give you statins to prevent cancer”.

Women smokers at double the risk of lung cancer.19 Other experts, however, were skeptical of the figures, which are based on 77 cases. Sir Richard Peto . . . said “This is a very small study and its conclusions may well be wrong. It’s simply not true that men and women who smoke have very different lung cancer rates”.

  • Steven Woloshin
  • Lisa M Schwartz

  • VA Outcomes Group, Dartmouth Medical School, White River Junction, Vt, USA.


Correspondence: lisa.schwartz@dartmouth.edu

Acknowledgements: 

We would like to thank H Gilbert Welch and Alex Kallen for helpful comments on earlier drafts. The authors contributed equally to this report; the order of their names is arbitrary. Steven Woloshin and Lisa Schwartz were supported by Veterans Affairs Career Development Awards in Health Services Research and Development and Robert Wood Johnson Generalist Faculty Scholar Awards. This study was supported by a grant from the National Cancer Institute and by a Research Enhancement Award from the US Department of Veterans Affairs. The views expressed do not necessarily represent the views of the Department of Veterans Affairs or the United States Government.

Competing interests:

None identified.

  • 1. Fontanarosa P, Flanagin A. Prepublication release of medical research. JAMA 2000; 284: 2927-2929.
  • 2. Schwartz L, Woloshin S, Baczek L. Media coverage of scientific meetings: too much, too soon? JAMA 2002; 287: 2859-2863.
  • 3. Van Der Weyden M, Armstrong R. Australia’s media reporting of health and medical matters: a question of quality. Med J Aust 2005; 183: 188-189. <MJA full text>
  • 4. Giordano S, Duan Z, Kuo Y-F, et al. Impact of a scientific presentation on community treatment patterns for primary breast cancer. J Natl Cancer Inst 2006; 98: 382-388.
  • 5. Toma M, McAlister F, Bialy L, et al. Transition from meeting abstract to full-length journal article for randomized controlled trials. JAMA 2006; 295: 1281-1287.
  • 6. Landis R, Koch G. The measurement of observer agreement for categorical data. Biometrics 1977; 33: 159-174.
  • 7. Forrow L, Taylor W, Arnold R. Absolutely relative: how research results are summarized can affect treatment decisions. Am J Med 1992; 92: 121-124.
  • 8. Malenka D, Baron J, Johansen S, et al. The framing effect of relative and absolute risk. J Gen Intern Med 1993; 8: 543-548.
  • 9. Naylor C, Chen E, Strauss B. Measured enthusiasm: does the method of reporting trial results alter perceptions of therapeutic effectiveness? Ann Intern Med 1992; 117: 916-921.
  • 10. Kirkey S. Exercising the body stimulates brain: study. Working out increases blood flow. US researchers find monkeys on treadmill became more alert and more interested. Gazette (Montreal) 2003; 9 Nov: A5.
  • 11. Science Briefs: Still depressed? Go work out. San Diego Union-Tribune 2003; 12 Nov: F-2. Available at: http://archives.signonsandiego.com/index.html (accessed May 2006).
  • 12. Canadian Broadcasting Corporation News. Experiment uses patients’ cells to heal heart. 2003; 11 Nov. Available at: http://www.cbc.ca/story/science/national/2003/11/10/heart_stem031110.html (accessed May 2006).
  • 13. Knox R. Harvard researchers present sobering report on AIDS [radio news transcript]. National Public Radio 2002; 10 Jul. Available at: http://www.npr.org/transcripts/ (accessed May 2006).
  • 14. Hall CT. Devices that read human thought now possible, study says. San Francisco Chronicle 2003; 10 Nov: A5. Available at: http://sfgate.com/cgi-bin/article.cgi?f=/c/a/2003/11/10/MNGK82U4MV1.DTL&hw (accessed May 2006).
  • 15. Altman LK. Drug-coated stent is found safe and effective for arteries. New York Times 2003; 11 Nov: A19. Available at: http://query.nytimes.com/gst/fullpage.html?sec=health&res=9F0CE7DB1E39F932A25752C1A9659C8B63 (accessed May 2006).
  • 16. Turpin A. Glaxo AIDS drug boost. Scotsman (Edinburgh) 2002; 10 Jul: 2. Available at: http://business.scotsman.com/technology.cfm?id=742352002 (accessed May 2006).
  • 17. Dembner A. Cardiac arrest deadlier at night. Boston Globe 2003; 18 Nov: C1. Available at: http://www.boston.com (accessed May 2006).
  • 18. Wahlberg D. Study: cholesterol drugs cut cancer risk. Atlanta Journal-Constitution 2003; 2 Jun: A3. Available at: http://nl.newsbank.com/sites/ajc/ (accessed May 2006).
  • 19. Women smokers at double the risk of lung cancer. Times (London) 2003; 2 Dec: 1. Available at: http://www.newsint-archive.co.uk/pages/main.asp (accessed May 2006).
  • 20. Connor S. HIV vaccine could be on the market in five years. Independent (London) 2002; 9 Jul: 5. Available at: http://news.independent.co.uk/world/science_technology/article183435.ece (accessed May 2006).
  • 21. Pollack A, Altman L. Large trial finds AIDS vaccine fails to stop infection. New York Times 2003; 24 Feb: A1. Available at: http://query.nytimes.com/gst/fullpage.html?sec=health&res=9502E4D61E3DF937A15751C0A9659C8B63 (accessed May 2006).
  • 22. Woloshin S, Schwartz L. What’s the rush? The dissemination and adoption of preliminary research results. J Natl Cancer Inst 2006; 98: 372-373.
  • 23. Moynihan R, Bero L, Ross-Degnan D, et al. Coverage by the media of the benefits and risks of medications. N Engl J Med 2000; 342: 1645-1650.
  • 24. Cassells A, Hughs M, Cole C, et al. Drugs in the news: an analysis of Canadian newspaper coverage of new prescription drugs. Can Med Assoc J 2003; 168: 1133-1137.
