Measuring quality requires quality in measuring

Hopefully the quality of life and care experience surveys continue to evolve with learnings and feedback from providers delivering them, writes Elyssia Clark.


When the federal government announced the introduction of surveys to monitor the care and quality of life for aged care residents, we identified two potential challenges.

Not only were there challenges around how we would deliver the Quality of Care Experience – Aged Care Consumers (QCE-ACC) and the Quality of Life – Aged Care Consumers (QOL-ACC) surveys, but also concerns about how residents felt about participating in yet another survey.

We ran early pilot surveys at two homes to help us work through any issues before the program went live in April this year. Up to 38 per cent of residents refused to participate in the pilot, with many claiming they had “already done the survey”.


Residents felt they were being asked to complete survey after survey, a phenomenon known as survey fatigue, which had the potential to affect both the volume and the quality of feedback.

Residents found the survey language and phrasing to be a challenge with complex wording making the process daunting. Many asked team members to read out the questions and provide simplification or further explanation to help them understand.

Timelines were also an issue. Our residents lead busy, active lives thanks to the number of activities available. This made it difficult to not only survey all residents but also catch them when they were most alert, which tends to be in the morning.

To combat this, our residential managers helped us identify residents with no or mild cognitive impairment who would be better able to self-complete the survey. We also increased the font size and included visual answer codes to help residents respond more effectively.

Those unable to complete surveys on their own were given the option of either an electronically assisted interview or an interview by proxy. Assisted interviews were prioritised as part of morning scheduling.

Unfortunately, the extended text in the interviewer-assisted surveys continued to prove challenging for residents.

We also tried to help residents understand the difference in surveys by providing them and their family members with letters about the research as well as a contact point for all questions. Posters were also installed in each home and all staff were comprehensively briefed so they could answer any questions about what we were doing, and why.

Despite these measures, we still expect response rates from proxy survey respondents, already only around 15 to 35 per cent, to decrease. We anticipate this will continue to fall unless proxies have seen changes in the care provided, which is unlikely given the short timeframe between surveys.

We found that some questions were not applicable to all residents. For example, some residents explained that they were not experiencing any pain, therefore could not answer questions on this topic.

Our culturally and linguistically diverse residents were also excluded from the process as there were no survey translations available at the time of interviewing.

Australia’s population of overseas-born residents has grown by almost six per cent in the past 10 years. This means there will be a corresponding increase in the number of CALD residents in the future, who will need to be included in any planning for the sector.

While we continue to refine processes, we hope the surveys will continue to evolve with these learnings and feedback from providers as we deliver them on the ground.

Elyssia Clark is general manager of customer, insights and marketing at Benetas

