
Saturday, May 3, 2008

Turnabout is fair play

Every few months, it seems, there’s a story about the quality of care at one or more local hospitals, based on the report of one or another watchdog organization.

Two recent Post-Standard stories come to mind. One story was about death rates associated with open-heart surgery, and it was based on statistics from the New York State Department of Health. Another story concerned the federal government’s new reporting on hospital patient satisfaction, about which I commented.

Who are these quality watchdogs? Besides the state and federal governments, some are non-profit organizations, such as Leapfrog or the Niagara Health Quality Coalition. Others are for-profit companies, such as HealthGrades (which licenses its endorsements to hospitals and sells its data directly to the public). Health insurance companies also purport to report on the quality of physicians and hospitals.

How good are the reports from these watchdog groups? That is the question that the Healthcare Association of New York State (HANYS) has started to answer.

Last year the HANYS’ Quality Institute (in which I participated) reported that
…hospital quality “report cards”… have not yet fully lived up to their promise of informing consumers and helping providers improve care. Problems with the accuracy, clarity, timeliness, and comparability of quality measures persist…. Hospital quality reports contain useful information, but the reports are different in the way they examine quality data, and are at times contradictory.
Now, believing that turnabout is fair play, HANYS has published its own ratings of the watchdogs' report cards. HANYS scores the watchdogs based on specific criteria. For example, does the organization disclose the methodology it uses? Is its methodology able to be verified by third parties? Does the organization use statistical methods to adjust for significant differences in illness severity for patients at different hospitals?


Based on its criteria, HANYS ranks eight publicly available hospital report cards from A to D, with A as the best rating:
The Centers for Medicare and Medicaid Services, A
The New York State Department of Health, A
The Joint Commission, B
The Leapfrog Group, C
The Niagara Health Quality Coalition, C
HealthGrades, D
Solucient, D
U.S. News and World Report, D
HANYS calls its report-card-on-report-cards “a first step toward providing consistent and understandable comparative information about hospital quality.”

Public reporting of hospital data for comparison purposes is a good idea. I support it. But it’s a confusing picture out there, as I have discussed elsewhere,[1] and HANYS has made an important contribution to help consumers better understand exactly what is involved in the public reports they may read about or consult.

Saturday, August 11, 2007

Of quality & board oversight

I recently made a presentation to the hospital’s quality committee that displayed 293 quality and safety measures tracked by Community General Hospital. It was a confusing, hard-to-understand picture of the ongoing surveillance we perform.

A few months ago I sat at a table with hospital CEOs from across the state as we struggled to get our arms around the number of quality measures each of us tracks. How, we asked ourselves, can we use such data more effectively to improve performance at our hospitals?

“We are at our limit in making sense of so many indicators,” said one CEO. Another observed, “We can’t go in a thousand directions. We have to focus on a limited set of meaningful measures.” “We’re struggling too,” added a third.

What is included in our 293 data measures? Data we report to the federal and state governments about heart disease, pneumonia, and infection control. Data we examine internally, including medication problems, falls, and near-misses. Measures we report to insurers in compliance with pay-for-performance contracts. And measures reported about us by outside organizations, such as HealthGrades®, the Niagara Health Quality Coalition, or Thomson Healthcare (formerly Solucient).

A recent article in the Journal of Patient Safety examined the performance of some 200 hospitals and concluded that “better quality outcomes were associated with hospitals at which the board spends more than 25% of their time on quality issues.” The Institute for Healthcare Improvement (IHI) recently recommended that “[b]oards in all hospitals…[should] spend at least 25% of their meeting time on quality and safety issues.”

A few years back, Community General’s board agenda was rearranged so that reports on quality came before reports on finances. But how much time does the board actually devote to quality issues? I reviewed last year’s board minutes and estimated the time spent on quality issues at about 30%. That was higher than I had thought. But does it mean we are working on the right things? Are we doing enough? I asked such questions of members of the board’s quality committee – and they think we can improve.

Serving on a hospital board is quite a responsibility. As a director, you have the overall responsibility for hospital performance. Section 405.2 of the New York State hospital code holds the governing body responsible for quality performance and for process improvement. The board exercises its responsibility, of course, by hiring management, by assuring proper policies are in place, and by overseeing the hospital’s performance.

In New York, most hospital boards are composed of volunteer representatives who devote their time (and often donate money too) to assure their community has excellent hospital care and services. What education and information do they need to provide quality oversight?

Chaired by board member Dr. David Tyler, the quality committee has held two meetings on this subject, and it will make recommendations to the full board later this year. Management is working with the committee in developing its recommendations. As part of the process, some members of VHA Empire State[1] generously shared with us their own board-level quality reports, and we have learned from them. We will probably borrow ideas from some of these boards, but our work is very much organization-specific. Where should Community General focus its attention in order to make more progress in areas that will best serve patients?

This week Community General was honored for having “one of the fastest and most consistent rates of improvement in the nation” by Thomson Healthcare. It’s reassuring that our improvement has been better-than-average in recent years as we set the stage for the next series of safety and quality initiatives.


[1] Community General Hospital is a member of VHA Empire State, which is part of VHA, Inc., a health care provider alliance of more than 2,400 not-for-profit health care organizations.