By Derek Thompson

Last year, I called America a “rich death trap.” Americans are more likely to die than Europeans or other citizens of similarly rich nations at just about every given age and income level. Guns, drugs, and cars account for much of the difference, but record-high health-care spending hasn’t bought much safety from the ravages of common pathogens. Whereas most of the developed world saw its mortality rates improve in the second year of the coronavirus pandemic, more Americans died of COVID after the introduction of the vaccines than before.

But this week, America finally got some good news in the all-important category of keeping its citizens alive. Since the early 1990s, the U.S. cancer-mortality rate has fallen by one-third, according to a new report from the American Cancer Society.

When I initially read the news in The Wall Street Journal, my assumption was that this achievement in health outcomes was principally due to medical breakthroughs. Since the War on Cancer was declared by President Richard Nixon in 1971, the U.S. has spent hundreds of billions of dollars on cancer research and drug development. We’ve conducted tens of thousands of clinical trials for drugs to treat late-stage cancers in that time. Surely, I thought, these Herculean research efforts are the primary drivers of the reduction in cancer mortality.

As it turns out, however, behavioral changes and screenings seem just as important as treatments, if not more so.

Let’s start with an obvious but crucial point: There is no individual disease called “cancer.” (Relatedly, nothing like a singular “cure for cancer” is likely to materialize anytime soon, if ever.) Rather, what we call cancer is a large group of diseases in which uncontrolled growth of abnormal cells makes people sick and possibly brings about their death. Different cancers have different causes and screening protocols, and as a result, progress can be fast for one cancer and depressingly slow for another.

The decline in cancer mortality for men over the past 30 years is attributable almost entirely to a handful of cancers: lung, prostate, and colorectal. Little progress has been made against other lethal cancers.

Consider the diverging histories of two cancers. In 1930, death rates for lung cancer and pancreatic cancer were similarly low among American men. By the 1990s, however, lung-cancer mortality had exploded, and that disease became one of the leading causes of death for American men. Since 1990, the lung-cancer death rate has declined by more than half. Meanwhile, pancreatic-cancer death rates rose steadily into the 1970s and have basically plateaued since then.

What explains these different trajectories? In the case of lung cancer, Americans in the 20th century participated en masse in behaviors (especially cigarette smoking) that dramatically increased their risk of developing the disease. Scientists discovered and announced that risk, then public-health campaigns and policy changes encouraged a large reduction in smoking, which gradually pulled down lung-cancer mortality. In the case of pancreatic cancer, however, the causes are mysterious, and the disease is tragically and notoriously difficult to screen for.

Treatments for late-stage lung cancers have improved in the past few decades, according to the American Cancer Society report. But for all the money we’ve spent on treatments, most of the decline in deaths in the past three decades seems to be the result of behavioral changes. Smoking in America declined from a historic high of about 4,500 cigarettes per person per year in 1963—enough for every adult to have more than half a pack a day—to less than 2,000 by the end of the century. It’s fallen further since then.

Another possible factor in declining cancer mortality is better screening, though the question of how much to screen is still contentious. In the early 1990s, doctors started using blood tests that measured prostate-specific antigen, or PSA. This period coincided with a decline in prostate-cancer mortality. But many positive results from these tests were false alarms, turning up asymptomatic cases that never would have developed into serious cancers. As a result, the federal government discouraged these prostate-cancer tests for men in the 2010s. Since then, diagnoses of advanced prostate cancer have surged, and mortality rates have stopped falling—suggesting that the previous testing regime may have been better after all.

This cancer-screening debate could define the next generation of medicine. As I wrote in last year’s “Breakthroughs of the Year,” companies such as Grail now offer blood tests that look for circulating tumor DNA in order to detect 50 types of cancer. As these kinds of tests become cheaper and more available, they could reduce the mortality of more cancers, just as PSA tests have helped reduce the death rate of prostate cancer. On their face, these advances sound simply miraculous. But deploying them effectively will require a delicate balancing act on the part of regulators. After all, how much information is too much information for patients if many cancer tests are false alarms? “They sound wonderful, but we don’t have enough information,” Lori Minasian of the National Cancer Institute has said of these tests. “We don’t have definitive data that shows that they will reduce the risk of dying from cancer.”

The Biden administration’s Cancer Moonshot Initiative should heed the lessons of this latest report. Much of the decline in cancer mortality since the 1990s comes from upstream factors, such as behavioral changes and improved screening, even though the overwhelming majority of cancer research and clinical-trial spending is on late-stage cancer therapies. A cure for cancer might be elusive. But a moonshot for cancer screenings and tests might be the most important front in the future war on cancer.