No surprise: Repatha reality is messier than initial reports suggested

Ever since Amgen teased the public with claims of a landmark study showing that its new heart drug Repatha significantly reduced the risk of cardiovascular events, anticipation has been building for the day when Amgen would release the data to back up its bravado.

That day finally came on Friday, and the wall-to-wall news coverage suggests that Amgen’s public relations strategy was quite effective.

But what about the drug itself? How big was the “significant” reduction that Amgen touted in its news release (which some news outlets reported uncritically in the weeks preceding publication of the data)?

And more importantly, how carefully did news stories evaluate the results and did they provide consumers with the information they need to make informed decisions about treatment?

The coverage runs the gamut and I can only paint in broad strokes.

  • The New York Times’ somewhat breathless coverage leads with the claim that this new class of drugs “has the potential to improve the health and longevity of millions of Americans with heart disease, the nation’s leading killer, accounting for one in four deaths.” It’s not until the ninth paragraph that we learn, no, the drug “did not show a benefit in overall death rates from cardiovascular causes.”
  • NPR’s piece, written by Yale cardiologist Harlan Krumholz, had much more balanced framing: “Pricey New Cholesterol Drug’s Effect On Heart Disease Is More Modest Than Hoped.” The relatively small reduction in cardiovascular events receives prominent emphasis in NPR’s story. “[Repatha] over about two years of study, reduced the risk of cardiovascular events, including heart attacks and stroke, by about 15 percent. For about every 66 people treated, one person avoided one of these events. There was no reduction, however, in the risk of death.”
  • I was pleased to see that cost featured prominently in many stories, including Kaiser Health News’ piece in USA Today that was headlined, “Cholesterol drug prevents heart attacks — but costs $14K a year.” Journalist Larry Husten, writing on the Cardiobrief blog, wondered if “the modest efficacy of the drugs is worth their immodest cost.”

What should patients and the public take away from this?

The patient perspective was covered very thoroughly by patient advocate Dave deBronkart, also known as e-Patient Dave. His excellent round-up of news stories about the study encourages patients to “avoid relative risk reduction (headlines about percentages) and look instead for actual (absolute) numbers of patients helped.”

Here’s what those numbers look like, and how they translate into costs:

- Major heart problems or strokes happened to 11.3% of patients WITHOUT the new drug, and 9.8% of patients WITH the new drug. In other words, 1.5% of patients avoided a problem event, but 9.8% still experienced one despite taking the drug.

- That 1.5% means that, on average, 1 patient in 67 benefits from the drug. That’s the NNT: the number of patients needed to treat for one to get any benefit.

- Note, though, that the drug did not save lives: the same percentage died whether or not they got the drug. So it prevented 1.5% of these major cardiac events, but didn’t alter death rates, at least not during the time of this study.

- The drug costs $14,000/year, and these patients were followed for a median of 2.2 years, so the cost was about $30,800 per patient.

- The 67:1 ratio means each prevented event came at a cost of 67 × $30,800 ≈ $2.06 million.

- No new side effects were discovered. That’s good; many new drugs bring new risks, too. (But this study was fairly short, so more news about side effects may emerge later.)
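
The arithmetic above is simple enough to verify yourself. Here is a quick sanity check using only the figures quoted in this article (11.3% vs. 9.8% event rates, $14,000/year, 2.2-year median follow-up); the variable names are mine, not from the study:

```python
# Figures quoted in the article (not the full trial dataset).
event_rate_without_drug = 0.113
event_rate_with_drug = 0.098

# Absolute risk reduction and number needed to treat.
arr = event_rate_without_drug - event_rate_with_drug
nnt = round(1 / arr)

# Cost per patient over the median follow-up, and cost per prevented event.
annual_cost = 14_000
median_followup_years = 2.2
cost_per_patient = annual_cost * median_followup_years
cost_per_event_avoided = nnt * cost_per_patient

print(f"Absolute risk reduction: {arr:.1%}")                        # 1.5%
print(f"NNT: {nnt}")                                                # 67
print(f"Cost per patient: ${cost_per_patient:,.0f}")                # $30,800
print(f"Cost per prevented event: ${cost_per_event_avoided:,.0f}")  # $2,063,600
```

That last figure is where the roughly $2 million per prevented event comes from: 67 patients must each be treated for the full follow-up period for one of them to avoid an event.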

deBronkart adds that “this study included very high risk patients, and if your risk isn’t as bad, then the benefits of the drug would not be comparable. You’d be much less likely to benefit, so the NNT for patients like you would be much larger.”
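
deBronkart’s caveat can be illustrated with one explicit assumption: that the relative risk reduction implied by the trial (roughly 1.5 / 11.3, about 13%) would hold unchanged for lower-risk patients. That is an assumption made here for illustration, not something the study showed; the lower baseline risks below are hypothetical:

```python
# Relative risk reduction implied by the trial figures quoted above.
trial_rrr = 0.015 / 0.113

# NNT at the trial's baseline risk and at two hypothetical lower risks,
# assuming (for illustration only) the relative reduction stays constant.
nnt_by_risk = {
    risk: round(1 / (trial_rrr * risk))
    for risk in (0.113, 0.05, 0.02)
}

for risk, nnt in nnt_by_risk.items():
    print(f"baseline risk {risk:.1%} -> NNT {nnt}")  # 67, 151, 377
```

The pattern is the point: halve the baseline risk and the NNT roughly doubles, so for lower-risk patients far more people must take the drug (and pay for it) for one to benefit.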

Beware of claims made without data

Bottom line: Patients, doctors and policymakers will have to make their own decisions about whether treatment with this drug makes sense for individuals and for society more generally. But those decisions should be based on complete information that has been thoroughly analyzed and reported.

Claims of “landmarks” and “breakthroughs” should be viewed skeptically in the absence of data. The evidence-based reality is almost always messier than initial reports would have you believe.


Read the rest at HealthNewsReview.




