The FDA’s standards began to slide in the late 1980s and early ’90s. AIDS activists, desperate to slow a devastating and mysterious illness, had pushed for the creation of new pathways for approving treatments more quickly—and, effectively, on thinner evidence. A new program in 1992 allowed for “accelerated approval” on the basis of surrogate markers, which are indirect measures of a drug’s benefit, assessed via laboratory or imaging tests, that stand in for more meaningful outcomes such as life expectancy. But the implementation of these accelerated processes drew criticism from some scientists and patients, even at the time. In 1994, for example, The New York Times cited skeptics who worried that “no one can tell if the drugs work.” Eight months later, the AIDS activist organization ACT UP San Francisco called Anthony Fauci a “pill-pushing pimp” for supporting CD4 immune-cell counts and viral loads as surrogate markers. The markers were completely invalid, the activists wrote, and nothing more than “a marketing exec’s wet dream.”
That early change in standards worked out for the best. We now know that CD4 counts and viral load are excellent markers of HIV/AIDS status, and the types of HIV drugs that were being questioned at the time—reverse-transcriptase inhibitors and protease inhibitors—did turn out to be effective. Newer versions still form the backbone of lifesaving, multidrug regimens. In fact, their success has been used to wave off criticism of the FDA’s declining standards for years.
But that level of success is not at all the norm. Most treatments in medicine will prove only modestly effective and come with real risks.