Two recent Post-Standard stories come to mind. One story was about death rates associated with open-heart surgery, and it was based on statistics from the New York State Department of Health. Another story concerned the federal government’s new reporting on hospital patient satisfaction, about which I commented.
Who are these quality watchdogs? Besides the state and federal governments, they include non-profit organizations, such as Leapfrog and the Niagara Health Quality Coalition, and for-profit companies, such as HealthGrades (which licenses its endorsements to hospitals and sells its data directly to the public). Health insurance companies also purport to report on the quality of physicians and hospitals.
How good are the reports from these watchdog groups? That is the question that the Healthcare Association of New York State (HANYS) has started to answer.
Last year, the HANYS Quality Institute (in which I participated) reported that
…hospital quality “report cards”… have not yet fully lived up to their promise of informing consumers and helping providers improve care. Problems with the accuracy, clarity, timeliness, and comparability of quality measures persist…. Hospital quality reports contain useful information, but the reports are different in the way they examine quality data, and are at times contradictory.
Now, believing that turnabout is fair play, HANYS has published its own ratings of the watchdogs' report cards. HANYS scores the watchdogs based on specific criteria. For example, does the organization disclose the methodology it uses? Can its methodology be verified by third parties? Does the organization use statistical methods to adjust for significant differences in illness severity for patients at different hospitals?
Based on its criteria, HANYS ranks eight publicly available hospital report cards from A to D with A as the best rating.
The Centers for Medicare and Medicaid Services, A
The New York State Department of Health, A
The Joint Commission, B
The Leapfrog Group, C
The Niagara Health Quality Coalition on Hospital Quality, C
HealthGrades, D
Solucient, D
U.S. News and World Report, D

HANYS calls its report-card-on-report-cards “a first step toward providing consistent and understandable comparative information about hospital quality.”
Public reporting of hospital data for comparison purposes is a good idea. I support it. But it’s a confusing picture out there, as I have discussed elsewhere,[1] and HANYS has made an important contribution to help consumers better understand exactly what is involved in the public reports they may read about or consult.
[1] See my previous postings: Of quality and board oversight, Hospital mortality rates, and Mortality & hospital report cards.
1 comment:
We absolutely must have uniform reporting standards. The benefits are too important: hospitals will be better able to identify areas needing improvement, and patients will be better able to make informed decisions about where they get their care. Those goals (especially the latter), however, are still distant. Until we have reliable uniform comparative information, the benefits of these reports will remain largely elusive except to those rating organizations that profit from their enterprise.