If you’re one of our regular readers I’ll bet your ears echo from hearing us bang these drums over and over:
Finding examples illustrating the first bullet point is like falling off a log for us. Here are a few misleading headlines from observational studies published within just the past few weeks (click on the links to read our critiques):
Can smartphones trigger ADHD symptoms in teens? (HealthDay)
Not only do observational studies lend themselves to exaggerated headlines like these that can mislead readers, they can also lead those who should know better to promote glowing outcomes that just happen to align with their agendas. These are outcomes that, when tested more rigorously via RCTs, don't stand up.
A clear and clever illustration of this is highlighted in a provocative article written by pediatrician Aaron Carroll, MD, and published in the New York Times earlier this week:
In the article Carroll tells us about a unique study published earlier this summer looking into the effectiveness of the wellness program at the University of Illinois at Urbana-Champaign. As Carroll points out, most evaluations of wellness programs are observational by design, which makes them quite prone to selection bias. So when results suggest participation in such programs leads to healthier outcomes, it's hard to tease out whether those benefits are, as one of his sources points out, "due to differences in the people rather than differences from the [wellness] program."
But the Illinois Workplace Wellness Study did something clever: It evaluated the program using both an observational AND a randomized controlled study approach.
In a nutshell, every time the observational study suggested a benefit from participating in the program, the RCT analysis did not; in other words, no cause-and-effect benefit. Here are some of those findings:
Gym attendance: The observational approach suggested participants went to the gym 7.4 days/yr vs. 3.8 days/yr for nonparticipants … The RCT findings? 5.8 and 5.9 days/yr, respectively.
Healthcare spending: The observational approach suggested participants spent much less ($525 vs. $657) … The RCT findings? $576 vs. $568.
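The pattern above is exactly what selection bias predicts, and it can be made concrete with a toy simulation. The sketch below is entirely hypothetical (the numbers are invented, not the Illinois study's data): each simulated person has a latent "health motivation" score that drives both joining the program and going to the gym, while the program itself has zero true effect. The observational comparison still shows a large gap; the randomized comparison shows none.

```python
import random

random.seed(0)

# Hypothetical population: latent motivation drives BOTH program
# participation and gym attendance. The program's true effect is zero.
people = [random.gauss(0, 1) for _ in range(100_000)]

def gym_days(motivation):
    # Gym use depends only on motivation (plus noise), never on the program.
    return max(0.0, 4.0 + 2.0 * motivation + random.gauss(0, 1))

# Observational design: motivated people self-select into the program.
joined = [m > 0.5 for m in people]
obs_in  = [gym_days(m) for m, j in zip(people, joined) if j]
obs_out = [gym_days(m) for m, j in zip(people, joined) if not j]

# Randomized design: assignment is a coin flip, independent of motivation.
assigned = [random.random() < 0.5 for _ in people]
rct_in  = [gym_days(m) for m, a in zip(people, assigned) if a]
rct_out = [gym_days(m) for m, a in zip(people, assigned) if not a]

mean = lambda xs: sum(xs) / len(xs)
print(f"Observational: {mean(obs_in):.1f} vs {mean(obs_out):.1f} gym days")
print(f"Randomized:    {mean(rct_in):.1f} vs {mean(rct_out):.1f} gym days")
```

Run it and the observational gap is large while the randomized gap is essentially zero, mirroring the 7.4-vs-3.8 versus 5.8-vs-5.9 contrast reported above: the "benefit" was a difference in the people, not a difference made by the program.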
It’s not hard to imagine a journalist (or a university public relations writer) taking these observational findings and writing an article with headlines like these:
‘Wellness program participants exercise nearly twice as much as non-participants’
‘Need to save money? Join a wellness program’
Of course, these headlines are hypothetical, and you might think: so what?
At issue is that nearly every day, of every week, for 12 years, we've come across headlines just like these: headlines that seduce readers with misinformation based on observational findings that very likely would have been shown to be inaccurate had they been subjected to a more rigorous study approach [see our tips for writing better health headlines].
That’s the elegance and beauty of the Illinois Workplace Wellness Study and why we give kudos to Carroll for writing about it, and why we wish more mainstream news outlets had covered it. It’s an important study.
And we were glad to see Carroll bring up these two (paraphrased) caveats:
A final thought worth bearing in mind: not infrequently, when observational data support an institution's agenda, they are selectively highlighted; when they do not, they're often selectively downplayed [here's an example].
This is a choice made far upstream at the source of much of our health care news. If journalists choose to report such imbalanced results without careful scrutiny (as in the 3 very real headlines listed above), it all but ensures that polluted health care information will be more widely disseminated.
That’s why we’d be well served to always keep the Illinois Workplace Wellness Study in mind.
Just as we’d be well served to wait and see if further RCTs support or refute its findings.
Over 50 million Americans are enrolled in wellness programs. If you’d like to learn more, check out our podcast from last year: