Michael Culhane, Department of Social Services

The national rollout of quality indicators covering key clinical areas in aged care is on track, but the department concedes work on a ‘quality of life’ measure is proving challenging

The outcomes from the pilot evaluating the use of quality indicators in residential aged care are currently being collated into the final report, which will be provided to the National Aged Care Alliance (NACA) special reference group later this month.

Initial feedback from the almost 300 facilities that participated had been “broadly positive” and it was expected they would come on-board with the full program from January 2016, according to Michael Culhane from the Department of Social Services.

The department will also be encouraging more facilities to participate from April next year, and by this time next year it hopes to have a “fair portion of the sector” participating in the program, Mr Culhane told the Leading Age Services Australia National Congress on Monday.

However, while the pilot, which took place from May to September, focussed on three indicators – unplanned weight loss, pressure injuries and the use of physical restraint – work on developing indicators covering consumer experience and quality of life had proven more difficult, he acknowledged.

The department and KPMG, which is overseeing the project, had been working with the NACA reference group for nine months on the measures, and had hoped to conduct a pilot in 2015.

However, quality of life was a complex notion and the factors that influenced it were subjective, therefore developing a tool that could effectively capture that information was “quite a challenge,” Mr Culhane said.

Options for a quality of life tool were discussed with the reference group in June and, while there was strong commitment to bringing them into the program, agreement could not be reached. In September the group agreed to test three tools over late 2015 and possibly pilot them in early 2016, he said.

Home care indicators

Proving similarly complex were the quality indicators for home care, he said. “The OECD found very little international convergence for measuring quality in that environment,” he said, adding it was more complex than residential care as the provider was present for a lot less of the consumer’s time and therefore linking outcomes to the provider could be difficult.

The department aimed to run a pilot program to test indicators in home care early next year and, like the residential care pilot, it would seek a nationally representative sample that was open to all providers to participate in, he said.

The areas it was looking at were social connectedness, quality of life and consumer experience, goal attainment, potentially preventable hospitalisation and residential access, and the quality of care, Mr Culhane said.

Part of broader movement

Quality indicators were being used in a number of countries, including the US, the UK and Canada, while the majority of state-run facilities in Victoria had been participating in a similar program.

The Federal Government was taking a similar step nationally by introducing quality indicators in residential aged care, followed by home care, he said.

This was part of a much broader renewed focus on quality, with recent initiatives including the Aged Care Quality Agency holding forums exploring quality, the government’s commitment to a single quality framework across residential and community care, the consolidation of the aged care standards, and the transfer of complaints to the Aged Care Commissioner.


Join the Conversation

1 Comment

  1. On track? This initiative was announced in April 2012 and due to establish a set of indicators by May 2013…it seems the reference group has confused caution with inertia.

    With a plethora of existing indicators to choose from (they’re not developing any new ones) this should have been completed by now. To date, only three indicators have been trialled and the jury is still out on what to do next.

    A cynic would question the involvement of so many ‘Peak Bodies’ in the development of this initiative. Considering the results to date, it’s the only explanation for self-interest triumphing over common sense.

    Take Pressure Injuries, for example.

    One would assume this is simply a count of all pressure injuries that developed at each facility over a defined time period, expressed as a simple percentage of the resident population. Wrong.

    The actual number that will be presented to the public comes from doing a count ONCE every quarter, multiplying it by 1000 and then dividing it by the number of occupied bed days per quarter.

    This is quite brilliant as it produces a figure with three decimal places that means absolutely nothing. Never mind the injuries that develop between the quarterly counts and are never recorded because they either heal or the resident dies.

    The Indicator Resource Manual also includes some helpful information in Appendix 7 on using the indicator data to set quality improvement targets. Despite acknowledging a 26–42% incidence of pressure injuries among residents, their introduction to setting targets suggests a 10% reduction in 12 months is an ‘achievable target’ and that an ‘absolute target’ acknowledges that we already have the skills, knowledge, equipment and resources to ensure that all pressure injuries are preventable.

    It’s also curious they’ve chosen to measure the use of restraint, particularly since restraint has no place in contemporary aged care and the department has published guides to establishing restraint free environments for many years.

    Unplanned weight loss also gets a run. The resource manual refers to BMI and frequently references papers that are up to 15 years old.

    They actually have all this in print! I can’t decide whether to applaud the reference group for their transparency or to deride them for treating the public with such contempt.

    The national trial is ‘struggling with complexity’ due to the incompetence of those charged with its development. We’ve got KPMG, the department and a host of provider representatives running the show…if nothing else, at least we can be confident the final product will obfuscate the data and tell us nothing of importance.
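[Editor’s note: for readers wanting to check the commenter’s arithmetic, the quarterly rate calculation described above (injury count multiplied by 1,000, divided by occupied bed days) can be sketched as follows. The facility size and injury count are purely illustrative, not figures from the program.]

```python
def pressure_injury_rate(injury_count: int, occupied_bed_days: int) -> float:
    """Rate per 1,000 occupied bed days, per the formula the comment describes."""
    return injury_count * 1000 / occupied_bed_days

# A hypothetical 100-bed facility, fully occupied across a 91-day quarter,
# with 5 pressure injuries recorded at the single quarterly count:
rate = pressure_injury_rate(5, 100 * 91)
print(round(rate, 3))  # 0.549
```

As the commenter notes, any injury that appears and resolves between quarterly counts never enters the numerator, so the published figure understates incidence.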
