
AHS plays with numbers


Timing is everything.

I had just accepted an invitation to speak on performance measurement and reporting at the International Deming Research Seminar at Fordham University in New York when Alberta Health Services (AHS) released its second quarter performance report. When giving technical presentations on stuff like performance reporting, recent real-world examples can really help.

The AHS performance report details results for over 50 performance measures — everything from knee replacement wait times to immunization rates to staff absenteeism. It’s worth checking out. AHS calls it a “thorough analysis of our performance” but it’s really just 90 pages of childish nonsense.

Nowhere is this more apparent than in the report’s Performance Dashboard. It compares the quarter’s results for all 50-plus performance measures with results from the previous quarter, the previous year and a performance target. For example, the second quarter wait time for radiation therapy (first consult) of 5.0 weeks is compared to the 4.4 weeks of the previous quarter. The report concludes that it’s now taking longer to see the radiologist.

If the sophisticated mathematics and interpretations of this are beyond you, relax. AHS provides symbols to help, in this instance, adding a little red circle to the dashboard indicating that the longer wait time is ‘bad.’ If our radiologists don’t know that 5.0 weeks is longer than 4.4 weeks, maybe it’s time for new radiologists.

The report also compares the quarterly result of 5.0 weeks with last year’s second quarter result of 6.0 weeks. Because the current performance is better than the previous year’s, a little green square for ‘good’ is placed on the dashboard. This apparent tie is broken by another red circle because the 5.0 weeks is still above the ‘acceptable target’ wait time of 3.0 weeks. Acceptable to whom isn’t specified.
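In fact, the dashboard’s entire analytical method fits in a few lines of code. Here is a minimal sketch in Python of the pairwise comparison logic described above; the tie-handling rule and the symbol names are my assumptions, and only the wait-time figures come from the report as quoted:

```python
# Minimal sketch of the dashboard's two-number comparison logic.
# Assumption: a tie is shown as a yellow triangle; the report doesn't say.

def symbol(current: float, reference: float, lower_is_better: bool = True) -> str:
    """Return the dashboard symbol for a single pairwise comparison."""
    if current == reference:
        return "yellow triangle"
    improved = current < reference if lower_is_better else current > reference
    return "green square" if improved else "red circle"

# Radiation therapy (first consult) wait times in weeks, as quoted above.
current, previous_quarter, previous_year, target = 5.0, 4.4, 6.0, 3.0

print(symbol(current, previous_quarter))  # red circle: 5.0 > 4.4
print(symbol(current, previous_year))     # green square: 5.0 < 6.0
print(symbol(current, target))            # red circle: 5.0 > 3.0
```

That a 90-page “thorough analysis” reduces to a three-way greater-than test is exactly the point.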

All this creates a mishmash of ‘bad’ red circles and ‘good’ green squares with some ‘we have no idea’ yellow triangles thrown in for good measure. Collectively, all these numbers and colourful symbols demonstrate nothing more about performance than an understanding of elementary school arithmetic.

This should help my New York presentation because these two-numbers-at-a-time comparisons always get big laughs at conferences.

But while good for laughs, the juvenile approach to reporting can have serious consequences, not the least of which is misleading the public on the state of health-care system performance. Here’s how that happens.

The whole point of performance measurement is identifying signals amongst the noise. Most kindergarten students can tell you that 5.0 weeks is less than 6.0 weeks. That’s hardly insightful.

What managers and the public need to know is whether this drop in wait time is important. Is it the result of a significant or important change in the system itself — a signal?

Or is it simply the result of a bunch of little things coming together at different times and in different ways, producing small, random fluctuations in year-to-year results — noise? (Nate Silver’s aptly titled recent bestseller, The Signal and the Noise, discusses this at length and is worth a read.) Performance measurement that can’t tell the difference is useless.

Which brings us back to the AHS Performance Dashboard. It can’t distinguish signals from random noise and, because of this, treats every difference between the numbers as a signal. Meaningless noise is sold to the public as important findings. Kindergarten arithmetic is spun as thorough performance analysis, giving the mistaken impression that system leadership has a handle on things.
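For contrast, the standard tool for making that distinction, one with deep roots in the Deming tradition, is the process behaviour (XmR) chart. Here is a minimal sketch in Python; the quarterly wait-time series is invented for illustration, and only the 2.66 scaling constant for individuals charts is standard:

```python
# Sketch of the analysis the dashboard omits: an XmR (individuals) control
# chart in the Shewhart/Deming tradition. The wait-time series below is
# invented for illustration; only the 2.66 constant is standard.

quarterly_waits = [5.2, 4.8, 6.0, 4.4, 5.5, 4.9, 6.0, 4.4, 5.0]  # weeks (hypothetical)

mean = sum(quarterly_waits) / len(quarterly_waits)
moving_ranges = [abs(b - a) for a, b in zip(quarterly_waits, quarterly_waits[1:])]
avg_mr = sum(moving_ranges) / len(moving_ranges)

# Natural process limits: mean +/- 2.66 * average moving range.
upper = mean + 2.66 * avg_mr
lower = mean - 2.66 * avg_mr

for quarter, wait in enumerate(quarterly_waits, start=1):
    verdict = "SIGNAL" if wait > upper or wait < lower else "noise"
    print(f"Q{quarter}: {wait:.1f} weeks -> {verdict}")
```

On a series like this one, every point falls inside the natural process limits: a move from 4.4 to 5.0 weeks is indistinguishable from routine variation, and no red circle is warranted.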

In fairness to AHS, it’s hardly unique in reporting performance in this way. Government reporting on system, process or program performance is often of similar quality. The analyses of municipal performance in the Ontario Municipal Benchmarking Initiative and of hospital performance at the Canadian Institute for Health Information are two examples that come immediately to mind.

If this is the best performance measurement and reporting government can deliver, then two things are true. One, we have no idea how well government is performing, and two, performance isn’t going to get better anytime soon.

Troy Media columnist Robert Gerst is a partner in charge of operational excellence and research and statistical methods at Converge Consulting Group Inc. He is author of The Performance Improvement Toolkit: The Guide to Knowledge-Based Improvement. For more, see the website troymedia.com.