|"Does Romneycare save lives?"|
Research on that topic has been somewhat murky. Most studies have used Medicaid as a surrogate for all insurance; some suggest it has no impact on health outcomes, while others say it can lead to unnecessary and even dangerous care.
That's why a study just published in the Annals of Internal Medicine is important: it says insurance saves lives.
As Population Health Blog readers may recall, Massachusetts required its citizens to buy into "Romneycare" health insurance long before we had even heard of the controversial term "mandate."
Years later, researchers wanted to know if Romneycare - and by implication, its mandate - made any difference in the most important outcome of all: death rates.
The researchers used a "quasi experimental" design that contrasted the county death rates in Massachusetts counties before (2001 through 2005) and after (2007 through 2010) the advent of Romneycare to a set of "propensity matched" counties from New England states that had no health reform.
Mortality data were obtained from the CDC. The analysis was limited to adults aged 20 to 64 years and adjusted for county-level age, gender, race, poverty rates, income, baseline mortality rates and unemployment rates.
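At its core, propensity matching pairs each "treated" unit with the control unit whose estimated probability of treatment (given covariates like those above) is closest. Here is a minimal sketch of 1:1 nearest-neighbor matching; the county names and scores are made up for illustration, and the study's actual matching algorithm and covariates may differ:

```python
# Hypothetical propensity scores: each county's estimated probability of
# being in the "reform" group, given covariates such as poverty, income,
# and unemployment. (Names and values are illustrative, not from the paper.)
treated = {"MA-county-A": 0.62, "MA-county-B": 0.48}
controls = {"NH-1": 0.60, "VT-1": 0.35, "CT-1": 0.50, "ME-1": 0.70}

def nearest_neighbor_match(treated, controls):
    """Greedy 1:1 nearest-neighbor matching on the propensity score,
    without replacement: each treated unit takes the closest remaining
    control."""
    available = dict(controls)
    matches = {}
    for name, score in sorted(treated.items()):
        best = min(available, key=lambda c: abs(available[c] - score))
        matches[name] = best
        del available[best]
    return matches

print(nearest_neighbor_match(treated, controls))
# {'MA-county-A': 'NH-1', 'MA-county-B': 'CT-1'}
```

The point of the matching step is that the resulting control counties resemble the Massachusetts counties on observed covariates, so later mortality differences are harder to attribute to baseline imbalance.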
During the baseline "before" years, there were no statistically significant differences in mortality between the Massachusetts counties and the control counties. That changed. During the "after" years, mortality, compared to the control counties, statistically significantly declined by 2.9% or by 8.2 persons per 100,000. As further evidence of the impact of insurance reform, elderly populations from the same counties - who presumably had before and after access to Medicare - showed no differences over time.
The paper has a graph that displays mortality rates year after year, and while Massachusetts had a slightly lower (and statistically nonsignificant) baseline mortality rate, there is a small but credible divergence downward over time compared to the control counties.
The Population Health Blog finds the study credible. Propensity matching is the next best thing to a randomized clinical trial, and this study uses a valid concurrent control group to support the notion that health insurance saves lives. Nothing else seems to have accounted for the drop in the death rate.
1) In clinical medicine, one gauge of treatment effectiveness is the "number needed to treat" (or "NNT"). "High value" NNTs fall in the 20 to 100 range (i.e., a doctor has to "treat" 100 patients with a particular condition to "cure" one). While every life is precious, Massachusetts has taught us that Romneycare's NNT is 830.* In other words, we have to mandate insurance for over 800 persons to save one life. That's not unreasonable, but after mishaps like this, we should be open to finding better ways to accomplish it.
2) Prior to the institution of Romneycare, Massachusetts maintained a fund that could be used to compensate hospitals for the care of uninsured persons. Since that was a de facto form of insurance, the Population Health Blog is less confident that the 2.9% difference in mortality rates is a black-and-white narrative about the transition from "no" insurance to "full" insurance. Rather, it's about a transition from one financing mechanism to another. That being said, real insurance would seem to "beat" other forms of health care financing.
3) Can the life-saving track record of a Romneycare mandate be applied to Obamacare's mandate? While there are some important similarities, that doesn't necessarily mean that what works in urban Boston will work in rural Mississippi. More research will be needed, and the Population Health Blog predicts much of it will involve propensity matching.
4) Last but not least, a large part of Romneycare's mandate facilitated the expansion of commercial insurance. This paper doesn't help the Population Health Blog compare the relative life-saving merits of government-run Medicaid vs. a private not-for-profit like Blue Cross Blue Shield. That'll also take more research.
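The NNT arithmetic behind point 1 is just the reciprocal of the absolute risk reduction. A quick sketch of why a calculation off the raw 8.2-per-100,000 population-wide difference yields a much larger (spurious) figure than the paper's 830; the note about the 830 being computed among those who actually gained coverage is an assumption, not something stated in this post:

```python
def nnt(absolute_risk_reduction):
    """Number needed to treat: 1 / absolute risk reduction."""
    return 1.0 / absolute_risk_reduction

# Naive NNT from the population-wide 8.2 per 100,000 mortality difference:
print(round(nnt(8.2 / 100_000)))  # 12195

# The paper's NNT of 830 implies a much larger absolute risk reduction,
# roughly 1/830, i.e. about 120 per 100,000 (presumably measured among
# those who actually gained coverage -- an assumption for illustration):
print(round(100_000 / 830))  # 120
```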
Image from Wikipedia
*An astute reader alerted the PHB that it had initially posted an NNT spuriously calculated off the 8.2 per 100K difference described above. The authors of the Annals paper correctly give the number as 830.