Not too long after the Soviet flag was hauled down from the Kremlin, a startling number of Russian men started dying. Young and middle-age men began to drown, get run over and suffer asphyxiation and heart attacks in shocking numbers. There were all manner of suspicious, gruesome deaths, the details of which suggested alcohol abuse and suicide. The life expectancy for Russian men was plummeting; between 1986 and 1996, it dropped from 65 to 57.
For years it was a source of great perplexity and despair, and when journalists and academics finally began to make sense of what was happening, the answers were knotty. The fall of the Soviet Union had created what the United Nations Development Programme called a “demographic collapse” brought on in large part by a “rise in self-destructive behavior, especially among men.” But all that alcoholism and drug use didn’t come out of nowhere. Many saw it as a direct result of the worsening economic conditions in Russia, where poverty and unemployment had been sharply rising since the dissolution of the USSR. The combination of no job and no foreseeable better future was driving men to drink. And the vodka was killing them, by way of liver disease, alcohol poisoning and fatal accidents. It was a gallows humor version of “It’s the economy, stupid.”
Roughly a quarter-century later, a similarly grim narrative of self-destruction and death is filling graveyards. But this time, it’s happening in the United States.
In November 2015, two Princeton economists, Anne Case and Angus Deaton, published a report analyzing mortality rates among Americans from 1999 to 2013. Their findings violently overturned our fundamental expectations for life expectancy in 21st-century America: For 14 years, the mortality rate among white Americans age 45 to 54 rose, half a percent every year. According to Case and Deaton, if mortality rates had simply held steady at their 1998 number, there would have been 96,000 fewer deaths from 1999 to 2013. Further, if the mortality rates had continued to steadily decline as they had in the second half of the 20th century—and as is typical of industrialized countries—488,500 deaths could have been prevented over that 14-year period. As Deaton told The Washington Post, “Half a million people are dead who should not be dead.”
In a commentary on the study, Dartmouth economists Jonathan Skinner and Ellen Meara observed that “it is difficult to find modern settings with survival losses of this magnitude.” Skinner says the U.S. mortality figures are unique even among other epidemiological crises: Compared with, say, the crack cocaine epidemic of the 1980s, the HIV crisis, and even the mass deaths of Russian men in the 1990s, the current trend is unprecedented in its abrupt and unforeseen arrival. “There were a few studies that kind of hinted at it, but to find this rise in mortality where people didn’t even know why, there’s nothing that you can point to where you can say, ‘Oh my gosh, this is why this is happening,’” he says.
When Case and Deaton’s study was published in Proceedings of the National Academy of Sciences in early November, the intelligentsia slipped into a kind of paranoid rapture. Journalists, pundits and op-ed potentates came out in droves to offer their takes on the dismaying statistics. There were some clear hints as to what was going on: The data show that the uptick in deaths was primarily from drug and alcohol poisonings and suicides, with liver disease a somewhat distant third culprit. But there was no clear explanation for why middle-age white Americans were overdosing and killing themselves at such unprecedented rates. So many treated the study as a canvas upon which any and all of the popular American end-of-days narratives could be painted: loss of religion; decline in marriages; disintegration of good middle-class jobs; the end of the blue collar–led household due to wage stagnation; even, more quixotically, the broken promise of the American dream.
But many of the factors pointed to—especially economic considerations like frozen wages, unemployment and the disappearance of well-paying jobs that didn’t require a college degree—affected blacks and Hispanics in the U.S. even worse than they did whites. Yet mortality rates in those demographic groups have continued to fall. White middle-age Americans still have a lower mortality rate than, for example, middle-age blacks—415 per 100,000, compared with 581. But that difference is significantly smaller than it was 15 years ago, as black mortality in the 45-to-54 age group has fallen 2.6 percent per year since 1999, while that of their white counterparts rose. And European countries were racked by arguably even worse economic hardship than the U.S. in the past decade—but their mortality rates have likewise declined, in keeping with historical trends. Middle-age white Americans now have a markedly higher mortality rate than both Hispanics in the U.S. and corresponding age groups in France, Germany, Canada, the U.K. and other industrialized countries. Apply a bit of analytical rigor and the economics argument doesn’t hold up.
Speculators were also quick to interpret the mortality figures as specifically a white man’s problem—all the better to suit journalists’ characterization of disillusioned former breadwinners made impotent by growing income inequality. But the numbers undercut that argument: Columbia statistics professor Andrew Gelman sifted Case and Deaton’s data to separate the mortality rates by gender, and he found that mortality among women ages 45 to 54 has been rising faster than among men since 1999, with the most pronounced spike coming after 2006. This data crunch sabotaged the neat and widely popularized idea that the dying were grievously disaffected middle-age white men, broken by the revelation that the American dream was a lie.
There is, however, something that does make white men and women in the U.S. unique compared with other demographics around the world: their consumption of prescription opioids. Although the U.S. constitutes only 4.6 percent of the world’s population, Americans use 80 percent of the world’s opioids. As Skinner and Meara point out in their commentary, a disproportionate number of these opioid users are white, and past studies have shown that doctors are much more willing to treat pain in white patients than in blacks.
Birth of a Plague
“My daughter was a schoolteacher with a master’s degree, her own home—yet she was a heroin addict,” says Donna Shackett.
It all happened so quickly it didn’t seem real to Donna. Jill Shackett, a 34-year-old elementary teacher in Bristol, Connecticut, underwent neck fusion surgery in 2012. The opioid prescriptions she received afterward sucked her into a vicious cycle of rehab and addiction—she had three separate stints in rehab facilities. After the last one, in 2013, Donna picked up her daughter at the airport and felt encouraged by her determination to get her life back together. Jill wanted her mother to take control of her finances and was confident she would get back on her feet. But within hours of being dropped off at home, Jill was hawking her Kindle online, trying to score cash for heroin. She died the next day of an overdose.
The Shacketts’ story is hardly unique. Prescription opioid use has been on the rise in the U.S. since the late 1990s, and heroin has not been far behind. From 2001 to 2014, the rate of heroin-related fatal overdoses increased sixfold. A recent CDC report found that more people died from drug overdoses in the U.S. in 2014 than in any other year on record—and over 60 percent of those deaths came from opioids. And as the media coverage, town hall meetings and local legislative hand-wringing over the past 18 months have shown, things are only getting worse. A new heroin scourge has risen out of the ruins of the 2000s opioid craze, and, unlike previous incarnations in the late 1960s and ’70s, it’s no longer confined to the seedy alleyways of the nation’s big cities. This time it’s sweeping through working- and middle-class America. “It’s the guy standing behind you in Starbucks,” Donna Shackett says. “It’s your kid’s teacher. It’s your next-door neighbor.”
Prior to the 1990s, prescription opioids—synthetic opiates designed to mimic the effects of opium—were almost exclusively reserved for cancer patients in chronic, often excruciating pain. Doctors chose from a fleet of powerful opioids, including fentanyl, produced by Johnson & Johnson’s Janssen Pharmaceuticals; Vicodin, produced and distributed by Abbott Labs; and Endo Pharmaceuticals’ Percocet and Opana. Because these drugs were chemically perilous kin to heroin, they were reserved for patients experiencing the uniquely severe pain that comes with tumors pressing against nerve and bone. But a movement in the mid-’90s led by pain advocacy groups and doctors specializing in pain management started pushing for the use of opioids for chronic non-cancer pain. By the late ’90s, laws and regulations expanded the use of opioids to encompass that pain.
No longer restricted to the small, highly specific group of patients with painful, often terminal cancer diagnoses, opioids could now be used on everything from post-op recovery to back pain, sports injuries to migraines.
In 1996, OxyContin arrived in this brave new world of pain medication. Developed by Purdue Pharma, OxyContin was approved by the U.S. Food and Drug Administration in 1995 largely because it promised to be a safer alternative to the prescription drugs that were being increasingly abused. An FDA presentation from 2008, “History of OxyContin: Labeling and Risk Management Program,” explained that “delayed absorption, as provided by OxyContin tablets, is believed to reduce the abuse liability of a drug.” This became one of the key arrows in Purdue’s marketing quiver: Because of the drug’s time-release formula, it was abuse-resistant and therefore posed less long-term risk to patients.
Purdue Pharma knew that it would have to distinguish OxyContin from the glut of other effective opioids on the market. Purdue’s first wedge into the fray would be the claim that because of OxyContin’s time-release formula, patients would not need to take as many pills and would not get the powerful, potentially addictive high caused by the opioids that are immediately released into the bloodstream. Meanwhile, Purdue downplayed OxyContin’s risks. At one point, the company claimed that the likelihood of addiction was “less than 1 percent,” and sales reps allegedly told health care providers that the drug didn’t even cause a buzz.
OxyContin was advertised with scads of promotional materials, including literature for prescribers, audiotapes, videos and a benevolently titled website, “Partners Against Pain.” Purdue also showered doctors with the kind of promotional swag—fishing hats, coffee mugs, and even a ’50s music mix titled “Get in the Swing With OxyContin”—that today feels laced with an ugly irony, like the paradisiacal cigarette ads from the 1940s. From 1996 to 2000, Purdue Pharma grew its sales force from 318 to 671. It also ramped up its marketing push in medical journals, increasing spending on advertisements from $700,000 in 1996 to $4.6 million in 2001. By that year, the company was spending around $200 million a year marketing OxyContin. But Purdue’s most insidious tactic might have been the secret data it kept on health care providers all across the U.S. (whose existence the company finally acknowledged in 2013, 11 years after the effort’s launch). Through its sophisticated database, Purdue Pharma kept track of prescribing patterns among doctors, aiming its marketing push at those who used opioids most frequently with their patients.
The carpet-bomb marketing paid off: The rise of OxyContin was meteoric. Sales of the new opioid analgesic went from $45 million in 1996, its first year on the market, to $1.1 billion in 2000. From 1997 to 2002, OxyContin prescriptions for non-cancer pain increased nearly tenfold, going from 670,000 to 6.2 million. In 2007, Purdue pleaded guilty to federal charges that it misbranded OxyContin and misled doctors and patients about the risk of addiction and abuse. In a statement regarding the lawsuit, Purdue acknowledged that “some employees made, or told other employees to make, certain statements about OxyContin to some health care professionals that were inconsistent with the FDA-approved prescribing information.” The mea culpa notwithstanding, by 2010 the sales numbers would swell still further, to $3.1 billion.
OxyContin led the way to a fourfold increase in opioid prescriptions during the 2000s. An oft-cited statistic is that in 2010 there were enough painkiller prescriptions in the U.S. to medicate every single American adult all day for an entire month. During the first decade of the 21st century, the country was so awash in opioid prescriptions that OxyContin pills dropped into black and gray markets, used recreationally by everyone from teenage girls to suburban moms to unemployed middle-age men. And the sharp hike in opioid prescriptions had a virulent doppelgänger: The fourfold increase in prescriptions was matched by a quadrupling of overdose deaths over the exact same period. In 1999, there were 4,030 fatal overdoses from opioids. By 2010, that number had ballooned to 16,651.
Heroin Winter Is Coming
Caylee (her name has been changed) grew up in Newtown, Connecticut. She had what she describes as a “really normal childhood”—a little brother; parents who are still together; fantastic grades in school. Her only hang-up, it seemed, was a lack of direction. After high school, she started working at an upscale home goods store and taking classes at a local college. By then she was already experimenting with prescription opioids. So many of her friends in Newtown were taking them that it was hardly thought of as taboo, let alone dangerous.
Caylee says pills made heroin seem normal, by helping it shed the odious DARE—the anti-drug and -violence education program—stereotypes of gaunt men skulking down New York City streets. Caylee progressed from pills to snorting heroin to eventually shooting it, and by 24 she was a heroin addict. It was inevitable, she says, because of the high cost of maintaining the habit. “Shooting up is always a financial decision, because your habit is so bad that you can’t physically sustain it by snorting it.” Because the high from injecting heroin is so much more powerful than the effects of snorting, a user’s money can stretch much further if she’s shooting up. And then you’re really hooked: Caylee says she knew people who were so badly addicted that immediately after buying the drug, they would start shooting up while driving away from the pickup.
Statistics suggest that the narrative beats of Caylee’s drug addiction story are shared by many in the U.S.: A July Centers for Disease Control and Prevention (CDC) report found that people who abuse opioids are 40 times more likely to become heroin users. As Caylee succinctly puts it, “Painkillers open the door to heroin.” Sometimes the initial foray into opioids was through legitimate prescriptions. In other cases, teens filched from parents’ medicine cabinets. The extraordinary surplus of opioids beginning in the 1990s, pushed on doctors through relentless marketing, deceptive statistics and secret databases, set the stage for the worst episode of prescription drug abuse in this country’s history. Then came the new decade, and two things happened that pushed Americans even more forcefully toward the heroin epidemic of today.
First, black market demand for OxyContin, the most highly coveted of all the prescription opioids, went way up, rapidly outstripping supply, and it became prohibitively expensive. An 80-milligram OxyContin pill—a dose common on the street—cost $32 to $40 in the mid-2000s; by 2009, in some areas of the country the going rate for the pills doubled to $80. At the hideous peak of an addiction, when sky-high tolerance makes the craving nearly impossible to satiate, opiate addicts can burn through six or seven 80-milligram pills a day. In the late 2000s, that meant blowing through $500 a day to feed a habit. It would be an unsustainable expense for most, let alone people rapidly backsliding into the life of a junkie. So many turned to the far cheaper and increasingly available alternative: heroin.
The second major factor in the shift from prescription pain pills to heroin was Purdue Pharma’s reformulation of OxyContin in 2010. After over a decade in which its golden-goose pill precipitated an exponential increase in opioid abuse and fatal overdoses, Purdue released a “tamper-resistant” version of OxyContin. According to the April 5, 2010, FDA press release, the reformulated pill was “intended to prevent the opioid medication from being cut, broken, chewed, crushed or dissolved to release more medication.” This reformulation might very well have decreased abuse by those not yet experienced with opioids. It’s also what Purdue points to when asked about its responsibility for the opioid and heroin addiction epidemics. “For more than a decade, Purdue has been working with policymakers, law enforcement and public health experts to address the risks associated with prescription opioids,” the company said in a written statement. “We believe the pharmaceutical industry has the responsibility and unique ability to help evolve the opioid market, which is why we’ve taken a leadership role in developing opioids with abuse-deterrent properties.” What few understood at the time the reformulated pills were introduced, though, was how they would affect the millions of Americans already dependent on the oxycodone inside them and accustomed to snorting and shooting it to get the best high.
No longer able to “break into the safe,” as one addict put it, droves of addicts moved on to the black tar and brown powder heroin that was already primed to shoot, snort or smoke. A 2012 study in The New England Journal of Medicine surveyed 2,566 opioid-dependent individuals and found that after Purdue released its abuse-deterrent pill, 66 percent of OxyContin abusers switched to another opioid. Heroin was by far the most popular choice. As one respondent to the 2012 study put it, “Most people that I know don’t use OxyContin to get high anymore. They have moved on to heroin [because] it is easier to use, much cheaper and easily available.”
These accounts and accompanying data explain how heroin has become endemic in parts of America—the suburbs, the upper–middle class, New England—where the drug was previously just a skid row nightmare, chilling but remote. Somewhat paradoxically, addiction came first, and heroin followed. White Americans, already dying from OxyContin, fentanyl and Opana (oxymorphone, another synthetic opioid) abuse, are switching to the poison that’s cheaper, stronger and more deadly.
Today, drug overdose deaths from both prescription opioids and heroin continue their sharp climb in every age group. But the OxyContin Wild West of the 2000s was not just about skyrocketing overdoses—the overprescription of OxyContin, Vicodin and Percocet also spread the intractable disease of addiction. As Case and Deaton point out in their study, for every fatal painkiller overdose, there are 130 people addicted to prescription opioids.
“Mortality is the canary in the coal mine,” says Skinner. The fact that heroin overdoses nationwide increased 28 percent from 2013 to 2014 (with an accompanying 16 percent hike in prescription painkiller deaths) means there are hundreds of thousands of addicts behind those fatalities, who are not only one wrong fix from death but are also saddled with addiction for life. And heroin addiction taken as a whole, Skinner says, is arguably even more pernicious than the deaths it can cause: It can tear families and communities apart, harming many more people than just the actual addicts.
Efforts are underway to fight the onslaught of prescription drugs and the sprawling heroin epidemic. Michael Botticelli, director of the Office of National Drug Control Policy in Washington, says his office has led an aggressive expansion of state-based prescription drug monitoring systems, allowing health care providers to identify potential abusers jumping from doctor to doctor to feed their addiction. The office focuses on educating prescribers about the perils of opioids. Botticelli has also led efforts to improve access to treatment for addicts, including bolstering distribution of naloxone, which reverses the effects of an opioid overdose. But perhaps most promising are the recently drafted CDC guidelines for opioid prescribers, urging doctors to weigh the risks of dependency and abuse whenever prescribing opioids. For acute pain, the CDC recommends “three or fewer days” of opioid treatment under most circumstances—a long way from the 30- and even 90-day supplies patients have been able to obtain—and adds that “non-pharmacologic therapy and non-opioid pharmacologic therapy are preferred for chronic pain.” While CDC guidelines are not binding, they are oft-cited and widely followed in the medical community.
Of course, while there is always a place for both triage and more stringent prescriber guidelines, such efforts won’t cut off these drugs at the source. And pharmaceutical companies like Purdue, Endo, Johnson & Johnson and Abbott Labs have little incentive to reduce the sales of their pain pills: They’ve been lavishly profiting from the opioid epidemic for nearly two decades. It’s also too early to tell how the opioid epidemic is affecting the livelihoods of men and women in their 20s and 40s. It may take years for us to fully comprehend the scope of its devastation. And there’s a good chance it’ll get worse before it gets better: In August, the FDA approved the use of OxyContin for children ages 11 to 16.
This article was updated with additional details from the recent CDC draft guidelines for opioid prescription.