Ensure That Every American Has Health Insurance
Everyone in America should have health insurance that covers at least a basic level of care. This would give them access to medical services and ensure that health-care providers are paid. Improving access to care this way would actually slow the growth in health-care spending because people who become seriously ill—who have a heart attack, for example—wind up receiving costly medical care, subsidized by society, even if they are uninsured. Preventing that heart attack by removing barriers to health care not only improves health, it saves money.
A strong economy will not get us to universal care; the robust economic growth and tight labor markets of the late 1990s did not substantially reduce the number of uninsured. And we cannot expect significant numbers of the uninsured to pay for insurance on their own—just over half of the uninsured today have family incomes below $30,000, while premiums for self-purchased family policies typically run $10,000 or more. Many of the uninsured will need highly subsidized insurance. On the other hand, three out of 10 uninsured have incomes above the 2006 median household income of $48,200, putting them in the middle class. Many of these people might buy private coverage if they could find affordable policies.
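A back-of-envelope check makes the affordability point concrete. Using the article's own figures of a $10,000 family premium and a family income below $30,000 (the resulting share of income is my calculation, not a figure from the article):

```latex
\[
\text{premium share of income} \;\geq\; \frac{\$10{,}000}{\$30{,}000} \;\approx\; 33\%
\]
```

A family at that income would have to devote at least a third of its pre-tax earnings to coverage, which is why heavy subsidies are needed for lower-income households.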
The presidential candidates differ in the level of detail they offer about what they would do to reduce the number of uninsured, but they all assume that employer-sponsored health insurance will continue to cover a majority of Americans (currently 60 percent). The problem with this basic strategy is that globalization is forcing American companies to become more competitive in what they pay for labor. Employers increasingly resist paying more than a defined amount for health care; most cannot continue to pay health costs that have doubled in the last decade. In just the last seven years, the fraction of employers sponsoring health insurance has declined from 69 percent to 61 percent, and those that continue to provide it are asking employees to pay an increasing share of medical costs. We need to create a new financing structure that includes employers, before a majority pull out of paying for health insurance altogether.
A strategy worth examining is that of the Netherlands. Two years ago the Dutch created a new system that uses a combination of premiums and taxes to finance health insurance for everyone. People have a choice of 33 health plans, each of which covers a common set of health services. Each adult pays a modest annual premium (about $1,500) that depends on which health plan he or she chooses. Everyone also pays an income-related premium tax, so low-income people are subsidized and all children are covered. The income-related tax is reimbursed by workers' employers. Some of the tax revenues are used to adjust payments to health plans that enroll people likely to have higher medical expenses—thereby reimbursing plans for higher-cost individuals and ensuring that they receive quality medical care.
—Katherine Swartz, PH.D. Professor of Health Economics and Policy at Harvard School of Public Health
Eliminate Racial Disparities
As a practicing physician, I know firsthand that health care can work wonders for those who need it. Equal opportunities for good health are among the greatest benefits society can provide. Many research studies have found, however, that African-Americans, Latinos and American Indians often do not receive these benefits, for treatments ranging from primary care of diabetes to high-tech heart surgery.
For African-Americans, current disparities in health have deep roots in slavery and segregation. Even as overt discrimination in health care has been largely extinguished, the embers of inequality smolder and reignite. In Chicago, for example, death rates from breast cancer were nearly identical for white and African-American women in 1996. By 2003 the death rate had fallen 35 percent for white women but actually risen 12 percent for African-American women.
Why do such disparities persist? Minority Americans are more likely to fall through the cracks of complex systems of care. Too often they lack insurance coverage, live in poor communities, experience language barriers and face subconscious biases of health professionals—factors that contribute to unacceptable disparities in care.
What should be done to eliminate these inequities? First, understand the specific barriers to high-quality care. Confronted with breast-cancer disparities in Chicago, local health-care and community leaders are improving access to mammograms and developing systems to ensure all women receive effective treatments without delay. Leaders in every community can do likewise—root out reasons that disparities persist for major illnesses and develop effective partnerships to eliminate them.
—John Z. Ayanian, M.D., M.P.P. Professor of Medicine and Health Care Policy at Harvard Medical School and a doctor at Brigham and Women's Hospital in Boston
Fix the Medicare Drug Benefit
Before launching any bold new initiatives, let's plug the leaks in the Medicare drug-benefit law. High drug costs can force elderly and disabled people to go hungry, to skip and split pills, and to suffer costly hospitalizations. In response, Congress passed legislation in 2003 giving Medicare beneficiaries insurance for prescription drugs. The drug benefit is controversial because beneficiaries have to pay high out-of-pocket expenses and choose from hundreds of private drug plans, each covering different drugs. While the program has helped many afford their medications, it won't achieve its original goals unless the new Congress and administration fix a few glaring defects.
First, plug the "doughnut hole." That is the gap in drug coverage between $2,250 and $5,100 in total annual drug costs that was created as a result of political compromise. Such coverage gaps are almost as hazardous as no coverage at all, especially for the several million people with costly illnesses who have fallen into the doughnut hole this year. The estimated $5 billion annual cost of eliminating this gap is only 1.3 percent of overall Medicare expenditures. This additional expense would improve health and might actually reduce overall expenses by lowering hospitalization costs.
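The relative size of that fix can be checked from the figures above (the implied total is my arithmetic, not a number from the article). If $5 billion is 1.3 percent of overall Medicare spending, then total Medicare expenditures work out to roughly

```latex
\[
\frac{\$5\ \text{billion}}{0.013} \;\approx\; \$385\ \text{billion per year,}
\]
```

which is broadly consistent with Medicare's annual budget in the late 2000s, so the two figures hang together.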
Second, automatically enroll the 3 million to 4 million near-poor individuals who are already eligible for very low-cost drug coverage, but didn't sign up. The administration hasn't promoted the best coverage to this vulnerable group. Reaching them is as easy as looking up their tax and Social Security information.
—Stephen B. Soumerai, SC.D. Professor of Ambulatory Care and Prevention at Harvard Medical School
Use Quality-of-Care Report Cards
If Zagat's can rate Chinese restaurants and Greek tavernas, and Consumer Reports can rate skateboards and digital cameras, why can't we rate doctors? The answer is that we can. And, increasingly, we are doing so. Scores of Internet databases now provide more information than anyone can readily process on the quality of care provided by U.S. hospitals and health plans—and, sometimes, even by individual doctors.
Some of these "quality report cards" provide data on preventive services, such as the percentage of age-eligible women receiving mammograms or Pap smears. Others focus directly on life-and-death matters, including mortality rates after serious illness or high-risk surgeries. Performance can vary enormously. In the several states now reporting risk-adjusted mortality rates following coronary-artery bypass surgery, for example, top-performing surgeons often have mortality rates half those of the lowest performers.
With such striking differences, one might expect quality report cards to play a key role in medical decision-making. Yet patients rarely use them, and many doctors question their reliability, especially when data pertain to individual physicians. Doctors also worry that calling public attention to inadequacies will impede professional efforts to improve care. Nonetheless, quality report cards work: repeatedly we see that when poor quality of care is made public, providers intensify efforts to improve it.
So how do we take the best advantage of this medical Zagat's? Most important, we stay the course. Yes, work must continue to improve fairness and accuracy, but shedding sunlight on medical practice is unquestionably healthy medicine for patients.
—Arnold M. Epstein, M.D. Chair of the Department of Health Care Policy and Management at Harvard School of Public Health and professor of Medicine at Harvard Medical School and Brigham and Women's Hospital
Wire American Medicine
Polls show that most Americans think their doctors routinely use computers to care for them. After all, this is supposed to be the most technologically advanced medical-care system in the world. But when it comes to patients' records, the great majority of U.S. doctors still rely on technology that predates the printing press: scrawled, often illegible notes that have an uncanny tendency to go missing just when they're needed most.
Experts believe that if hospitals and doctors used electronic health records (EHRs)—systems that store and manage patients' medical information—they could vastly improve the quality of care and reduce costs. For example, such records would routinely warn doctors against giving medicines to patients who are allergic to them. In the typical paper-based record, test results can be hard to find—leading to unnecessary repeat tests and added costs. President Bush is so convinced of the value of EHRs that he wants all doctors to adopt them by 2014.
But progress has been slow. The first reason is money. To wire our entire health-care system would cost an estimated $160 billion, and many small physician practices and hospitals are balking. They also fear that installing EHRs will disrupt their work and increase malpractice liability.
Meanwhile, virtually every Western health-care system is light-years ahead of ours. In Denmark, every primary-care doctor is wired. The United States will eventually catch up. It has to. And patients can speed the process by avoiding doctors and hospitals that stubbornly resist the Information Age.
—David Blumenthal, M.D. Professor of Medicine and Health Care Policy at Harvard Medical School and Massachusetts General Hospital
Make Sure New Treatments Are Properly Studied
When you have a medical problem, you and your doctor usually have treatment options. If you have breast cancer, will it be lumpectomy or mastectomy? If a narrowed coronary artery needs opening, should you go with a "drug eluting" or a "bare metal" stent? If you're a smoker, should you have a screening CT scan to look for lung cancer, or not?
As you decide on a course of treatment, you may well think that medical studies comparing your options have revealed the best choice. Unfortunately, that often is not the case. The time-tested way to compare options—a randomized clinical trial—requires a lot of time and money, and often involves patients who are younger and healthier than the people who may ultimately need the new treatment. Today there are more new diagnostic and therapeutic approaches than can practically be studied in such trials. From 2004 to 2006, approximately 90 truly new medicines were approved by the FDA, multiple new radiology tests to detect disease were introduced and several new kinds of laparoscopic surgery were launched. While many were compared against a placebo or against doing nothing, very few were compared with other already available options.
Every year millions of patients have a new test or treatment that has not been systematically compared with other options. Did they benefit? Our society needs to do a better job of recording and aggregating pertinent information from those millions of people. The effort must be rigorous and transparent. It will cost a substantial amount of money, but it will benefit everyone—patients, doctors, employers and insurers.
—Barbara J. McNeil, M.D., PH.D. Head of the Department of Health Care Policy at Harvard Medical School and professor of Radiology at the School and Brigham and Women's Hospital
Curb Drug Spending Without Hurting Drug Development
New drugs can protect us against life-threatening and disabling diseases, but they also place big demands on the nation's finances. By 2016 we can expect to spend roughly half a trillion dollars per year on prescription drugs. Health spending is stressing household and public budgets, and almost 20 percent of the growth in that spending is due to prescription drugs.
The collision of growing costs and expanding science is forcing the institutions that pay the bills for health care to try to restrict the use of expensive medicines to those people for whom they clearly are necessary. The risk is that an overly zealous approach will harm drugmakers' revenues and, as a result, stem the flow of new treatments.
Payers and regulators need to balance the competing goals of curbing the growth in drug spending and promoting innovative treatments. All parties, government and private, should agree that insurance coverage will be linked to the demonstrated value of expensive new medicines. Congress should develop legislation allowing the FDA to promote competition among biopharmaceutical products when their patents expire, in a way that continues high standards of safety. In some specific circumstances (for example, unique drugs used primarily by the elderly), the Medicare program must be prepared to negotiate prices with drug companies. In sum, payers and regulators need to make decisions that involve great scientific and economic complexity, in order to make valuable new medicines affordable.
—Richard G. Frank, PH.D. Professor of Health Care Policy at Harvard Medical School