Special report: Quality indicators program rolls out, but debate continues

As more facilities are encouraged to participate in the national rollout of quality indicators in aged care, stakeholders continue to debate key aspects of the program.

From this month Australia’s program of quality indicators in residential aged care moves from pilot phase to a voluntary national rollout.

Around 350 residential facilities were involved in the pilot, which ran from May to September last year and trialled three clinical indicators – pressure injuries, unplanned weight loss and the use of physical restraint.

However, the government is yet to release the findings of the pilot evaluation. The department and Sussan Ley’s office did not respond to Australian Ageing Agenda’s requests for the report.

The Department of Health is now seeking to recruit additional facilities to take part.

Participating facilities will collect and report their data quarterly through the My Aged Care provider portal, and from April 2016 an icon on the My Aged Care website will indicate that a facility is a participant in the national program.

Yet as the national scheme is rolled out, stakeholders continue to debate key aspects of the program.

Voluntary versus mandatory

Adrian Morgan

A controversial question the government and stakeholders have grappled with is whether facilities should be compelled to participate as part of a mandatory program or whether a voluntary system is preferable.

Chair of the NACA Quality Indicators Reference Group Adrian Morgan said there was no consensus within the advisory group on whether participation should become mandatory over time, and he believed that proposition would be “very vigorously debated” among stakeholders.

“Given the enthusiasm with which the pilot was approached, we could assume that there will be a very high level of take-up,” Mr Morgan told Australian Ageing Agenda.

The Victorian quality indicator program, which involved government-owned residential aged care facilities, has provided an example of how a voluntary system can still achieve close to 100 per cent industry participation.

Ian Yates, chief executive of COTA Australia, said it was his peak body’s view that participation should be mandatory after an initial period of robust testing.

He said he was confident a natural momentum would build around industry participation, and that those who did not participate would leave themselves open to consumers and others asking why.

It was also COTA’s view that facilities should not be allowed to pull out once they have joined the program. “We also want to be very clear that we don’t end up with a situation where good results go up and bad results don’t. If you’re in it, you’re in it; you have to report your results all the time,” said Mr Yates.

Perverse incentives

Dr Lee-Fay Low

While quality indicators provide an opportunity for aged care facilities and staff to reflect on their clinical practice, international evidence has shown they can also create perverse incentives for providers, especially when performance measures are publicly reported or have sanctions attached.

Dr Lee-Fay Low, Associate Professor in Ageing and Health at the University of Sydney, told AAA that international studies showed that published quality indicators can lead to the cherry picking of clients or providers diverting resources to areas being measured.

In some cases aged care facilities and hospitals selected healthier residents – a practice known as “cream skimming” – to improve their performance on published measures of quality, she said.

Professor Joseph Ibrahim

Professor Joseph Ibrahim from the Faculty of Medicine, Nursing and Health Sciences at Monash University agreed that perverse outcomes could arise.

He recalled the US league tables of hospital mortality published in the late 1980s and early 1990s, which led some hospitals to transfer patients out of their institution before they died, or to recode patients, in order to improve their results.

Alarmingly, Australia’s national pilot has already yielded examples of provider gaming.

Last November, a provider attending an industry conference in Sydney admitted to the audience that their organisation was amending its admission criteria to improve performance on quality indicators. During a discussion on quality indicators the person told the audience:

“We are already amending our admission criteria documents in residential aged care to look at the frequent fallers or resistance to care clients … because we know they are going to be outliers on our quality indicators.”

Ian Yates

Mr Yates said it was critical to closely examine any unintended behavioural consequences that emerged from the quality indicator program, such as the focus on physical restraint resulting in an unacceptable risk of falls.

“[Stakeholders] haven’t thought through what the minimum set of quality indicators is that would be mutually supportive and reinforcing and have the least unintended consequences,” he said.

Mr Morgan agreed it was important the program move to include a sufficiently broad set of indicators so as to reduce the incentive to “game the system”, and pointed to the value of consumer experience and quality of life indicators in providing a more holistic picture.

The Productivity Commission’s 2011 report into aged care, which recommended a program of quality indicators, also warned that too few indicators could distort care priorities and mean there were unmeasured aspects of care. This needed to be balanced against the costs and time required to collect data on a wider range of indicators, it said.

Read our full report into the quality indicators program – including the latest on the quality of life and consumer experience measures and the planned rollout in home care – in the January-February 2016 issue of Australian Ageing Agenda magazine.


4 thoughts on “Special report: Quality indicators program rolls out, but debate continues”

  1. As a statistician with a focus on methods for measuring quality of care, I read the resource manual for the quality indicator programme with great interest. Through my work with MOA Benchmarking, it has been our intent, where appropriate, to align MOA’s indicators with these definitions; however, I have a number of concerns. For example, homes have been asked to count the number of physical restraints in use with the intent to physically restrain someone. Whilst the agency correctly identifies adverse outcomes associated with the use of physical restraints, it incorrectly assumes a linear association between the number of restraints a person has and those adverse outcomes. For evaluating performance, it treats a 50-bed home that has a single person with three physical restraints the same as a 50-bed home with three residents each with a single physical restraint. This doesn’t make sense from an epidemiological point of view, because the person with three restraints is unlikely to be at three times the risk of an adverse event compared with having only one. It doesn’t make sense from a clinical point of view, because it assumes that restraint use is independent of the care needs of the person. And it doesn’t make sense from a person-centred view of care, because it assumes the deprivation of some autonomy for three people is equivalent to the deprivation of autonomy for one.

    Furthermore, the attempts to use this information to demonstrate performance are rudimentary. There is no attempt to draw sound statistical inference from the data – it simply shows higher or lower. Whilst I encourage any progress toward assisting homes in improving quality of care, I think any such efforts need to be undertaken within the context of sound epidemiological practice.

  2. Sadly, the entire project looks like a perverse outcome.

    As Dr Straney notes above, the methodology is unsound.

    The creative manipulation of pressure injury rates defies explanation. You would be forgiven for assuming this would simply entail counting every injury that occurs over a given time period.

    Instead, the resource manual instructs participants to conduct just one count per quarter (so, for example, if someone develops a pressure injury after the initial count, then dies within the three-month period before the next count, they’re not counted). Participants are also advised to identify any palliative care residents with pressure injuries within the count. (No reason is given for this, but why include this unless it’s intended to affect the data in some way…perhaps it just doesn’t matter if dying people receive poor care and develop painful decubitus ulcers?)

    After lodging your tally, it’s multiplied by 1000 then divided by your total occupied bed days. Rather than presenting the public with a simple percentage rate of pressure injuries per resident population, they get a number with 3 decimal places that reveals…well, it’s hard to know what it reveals.

    Why measure restraint – a practice that has no place in contemporary aged care? Mr Yates’ concern regarding the link between restraint and fall rates is misguided; this outdated way of thinking only perpetuates the use of restraint in aged care. The only way to eliminate falls is to invent an anti-gravity machine. Surely it’s time to re-examine our interpretation of unacceptable falls risk and stop tying up grandma “for her own good”.

    I’ve no idea how they deal with unplanned weight loss…I couldn’t bring myself to read any further.

    The advisory group has taken five years to select three “mutually supportive indicators with the least unintended consequences”. The delay wasn’t due to the rigorous demands of developing any ground-breaking techniques; these are all established and well-researched indicators. Mr Morgan also informs us the advisory group hasn’t reached a consensus on voluntary versus mandatory participation in the program. Decisiveness doesn’t appear to be the group’s strong point.

    How are we doing so far?

    The My Aged Care website informs us that “QI will give consumers, their families and carers greater access to information about the quality of aged care services and inform their choices about the type of aged care services they use”. Mission accomplished?

    And what about the provider who publicly admitted they were massaging the system to get better results? (They actually said this to an audience at an industry conference!)

    Is it OK to let that slide, or should we be outraged and demand they be named and shamed? The absence of any response from our industry further illustrates why we have a problem with public credibility and integrity.

    Our peak bodies and expert panels consistently let us down with these absurd performances. As long as we keep creating ineffectual focus groups stacked with provider representatives and academic pseudo-clinicians, rational people will continue to view our sector as the embarrassing drunk uncle of the Health Care family.

  3. This appears to be little more than yet another distraction. We do not need another set of rules by which to operate; the aged care standards, embedded in legislation, are already in place. Why seek to add another layer of bureaucracy taking up yet more time and money?

    Here’s a novel idea: spend some time and money attracting quality staff to work in aged care facilities. How about more staff at the coalface to provide the actual care, which would eliminate all three issues identified – restraint, pressure areas and weight loss?

    Frustrated does not even begin to cover it. Seriously.
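[Editor’s note: the pressure injury rate calculation described in the second comment above – a quarterly tally multiplied by 1,000 and divided by total occupied bed days – can be sketched as follows. This is a minimal illustration based solely on the commenter’s description of the resource manual; the function name and the example figures (a full 50-bed home over a 91-day quarter) are hypothetical.]

```python
def pressure_injury_rate(injury_count: int, occupied_bed_days: int) -> float:
    """Rate per 1,000 occupied bed days, per the formula described
    in the comment: tally * 1000 / total occupied bed days."""
    if occupied_bed_days <= 0:
        raise ValueError("occupied_bed_days must be positive")
    return round(injury_count * 1000 / occupied_bed_days, 3)

# Illustrative only: 3 injuries counted in a quarter at a full 50-bed
# home, i.e. 50 beds x 91 days = 4,550 occupied bed days.
rate = pressure_injury_rate(3, 4550)
print(rate)  # 0.659
```

As the commenter notes, the resulting three-decimal figure is harder for consumers to interpret than a simple percentage of residents affected would be.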
