New Vaccine Data Is Coming: Watch Out for These 3 Claims

The last few weeks have seen dramatic headlines on the efficacy of several new coronavirus vaccines. But coverage to this point has been based on press releases—tiny snippets of results trickled out from clinical trials. This week, the real data deluge will begin. The Food and Drug Administration has been analyzing thousands of pages of data for the BioNTech-Pfizer vaccine, in preparation for an all-day meeting on December 10 to decide whether to authorize its emergency use—and detailed summaries of that data could be released as soon as Tuesday. Then the agency will repeat the process next week for the Moderna vaccine. The first publications of the vaccines’ efficacy data in medical journals should be coming out soon too.

It’s going to be a lot for experts and the news media to digest. If the past is any guide, there will be plenty of errors, misunderstandings, and communication snafus in coverage, as well as active disinformation campaigns on both sides to boost hype or spread fear. Here are three tricky or misleading claims to keep an eye on:

Misleading Claim 1:

The trials were so humongous, all the results must be iron-clad. Or, alternatively: The trials included so few people who actually got sick, the results must be unreliable.

We’ve been conditioned to think a study’s size is just about the most important thing about it. That’s how studies are always described to us: “A study of 50,000 people found so-and-so.” But we shouldn’t be so easily impressed by how large a trial sounds. When it comes to measuring effectiveness, a trial’s statistical power depends on the number of “events” that occur during the study—in this case, the number of people who got sick with Covid.

Take the famous 1954 field trial for the Salk polio vaccine, which included an incredible 1.8 million children! But the number of events used in the most critical analysis—a placebo comparison involving a subset of more than 400,000 children—was just 143. That’s how many children in the study developed paralytic polio, and it was enough to be sure of the all-important finding that the Salk vaccine worked. The BioNTech-Pfizer trial had 43,000 participants and 170 events; the Moderna trial had 30,000 participants and 196 events. Those numbers represent an efficient means of getting urgent answers. They suggest neither overkill nor risky corner-cutting.
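
To see why the event count matters more than the headcount, here is a minimal sketch, in Python, of how vaccine efficacy and a rough confidence interval can be computed from the case counts in each arm. The Wald-style interval on the log risk ratio is a textbook approximation, not the Bayesian analysis the trial actually used; the 8-versus-162 split of the 170 events comes from the BioNTech-Pfizer press release, and the arm sizes of roughly 21,500 each are rounded for illustration.

```python
import math

def vaccine_efficacy(cases_vax, cases_placebo, n_vax, n_placebo):
    """Efficacy = 1 - relative risk, with a rough 95% Wald interval on the log risk ratio."""
    rr = (cases_vax / n_vax) / (cases_placebo / n_placebo)
    # The standard error of ln(RR) is driven almost entirely by the case counts,
    # not by the tens of thousands of participants who never got sick.
    se = math.sqrt(1 / cases_vax + 1 / cases_placebo - 1 / n_vax - 1 / n_placebo)
    rr_lo = math.exp(math.log(rr) - 1.96 * se)
    rr_hi = math.exp(math.log(rr) + 1.96 * se)
    return 1 - rr, 1 - rr_hi, 1 - rr_lo  # point estimate, lower bound, upper bound

# Roughly the BioNTech-Pfizer numbers: 170 events split 8 vs. 162, ~21,500 per arm.
est, lo, hi = vaccine_efficacy(8, 162, 21_500, 21_500)
print(f"efficacy ~{est:.1%}, 95% CI roughly {lo:.1%} to {hi:.1%}")
```

Run with those inputs, the sketch lands at about 95 percent efficacy with an interval of roughly 90 to 98 percent, and the interval would barely move if each arm had twice as many healthy participants, because the uncertainty comes almost entirely from the 170 events.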

At the same time, it’s important to remember that not every number that’s reported will be based on a full analysis of all the data. The BioNTech-Pfizer vaccine is said to have 95 percent efficacy; Moderna’s, 94.5 percent. Those are the top-line findings. As more data comes out, though, it’s all but certain that more fine-grained—but potentially less reliable—analyses will make the news and be presented as though they were equally strong. We already saw this happen a few weeks ago, when a BioNTech-Pfizer press release reported that “efficacy in adults over 65 years of age was over 94 percent.” Outlets such as ABC News passed that finding straight on to their readers. But we can’t be as certain about it as we can about the overall efficacy. When a little more data about the vaccine was released upon its authorization by the UK government last Wednesday, it showed that there weren’t quite enough participants in the study’s oldest age group (over 75) to be as sure about that result. It’s still an important finding; it’s just a more tentative one. The same may apply to other statistics that get reported in the days to come. How effective is the vaccine at preventing infection, for example, or at protecting people with chronic diseases from Covid? Don’t assume that secondary findings like these will carry the same degree of confidence as the main ones.
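
The same arithmetic shows why a small subgroup deserves extra caution. The counts below are invented purely for illustration (the real age-group tallies sit in the FDA briefing documents), but they capture the problem: with only a handful of events, the confidence interval becomes enormous.

```python
import math

# Hypothetical subgroup: 5 events total (1 vaccine, 4 placebo) out of 2,000 per arm.
# These numbers are made up for illustration, not taken from the trial.
a, b, n1, n2 = 1, 4, 2_000, 2_000
rr = (a / n1) / (b / n2)
se = math.sqrt(1 / a + 1 / b - 1 / n1 - 1 / n2)
lo = 1 - math.exp(math.log(rr) + 1.96 * se)  # lower bound on efficacy
hi = 1 - math.exp(math.log(rr) - 1.96 * se)  # upper bound on efficacy
print(f"efficacy ~{1 - rr:.0%}, 95% CI roughly {lo:.0%} to {hi:.0%}")
```

A point estimate of 75 percent sounds reassuring, but the interval runs from well below zero to nearly 97 percent, consistent with the vaccine working wonderfully in that subgroup or not at all.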

Misleading Claim 2:

Now we know that claims of “95 percent efficacy” were hype.

Be prepared to see a bunch of new estimates of effect in the days and weeks to come—and some of them could make the vaccines sound less useful at preventing Covid than the first reports did. That’s not a sign that the first analyses are unreliable. The calculations of 95 percent efficacy were based on data only from participants who received both vaccine injections, with some extra time allowed for the fullest possible immunity to kick in. That means a bunch of people were left out: There may be some who got Covid before that point was reached; some who dropped out of the study before getting their second injection; and some who stopped providing the researchers with follow-up, so it won’t be clear whether they ever got sick. We’ll soon have analyses that take these people into account, and they will show somewhat different levels of efficacy. Those may offer some insight into how well the vaccines will perform in practice, once they’re rolled out in our communities.
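
As a concrete, hypothetical illustration of how that plays out: when equal-sized arms are compared, efficacy is just one minus the ratio of case counts, so folding in cases that occurred before full immunity had built up pulls the number down even though nothing about the vaccine has changed. The 8-versus-162 per-protocol split comes from the press release; the extra early cases in the sketch below are invented.

```python
def efficacy(cases_vax, cases_placebo):
    # With equal-sized arms, the risk ratio reduces to a simple ratio of case counts.
    return 1 - cases_vax / cases_placebo

# Per-protocol analysis: only cases confirmed from 7 days after the second dose.
print(f"{efficacy(8, 162):.0%}")            # ~95%

# Broader analysis (hypothetical counts): also include cases that began
# between the first dose and that cutoff, before immunity had fully developed.
print(f"{efficacy(8 + 40, 162 + 60):.0%}")  # ~78%, a lower number answering a different question
```

Both figures can be true at once; they simply describe different windows of protection and different groups of participants.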
