Operational lessons from CareTrack

The team behind the landmark CareTrack study into the level of appropriate care provided in Australia, published in yesterday's Medical Journal of Australia, is hoping to develop a set of agreed tools for data extraction and to set up expert groups to develop clinical standards for common conditions using a Wiki-like approach.

CareTrack, which highlighted disparities in the standards of care provided by medical practitioners for 22 common conditions, proved to be a difficult study to undertake due to problems in gathering population-based data from multiple sources, the researchers say.

In a Perspectives piece accompanying the study, the CareTrack authors outline the difficulties they faced in gathering information to conduct the project, pinpointing the lack of easy access to healthcare data as one of the main barriers to making similar studies more routine and prospective.

CareTrack is only the second such study to be conducted in the world, with the first undertaken in the US 10 years ago. One of the main barriers to such population-based studies is the difficulty in accessing medical records, the researchers say. The study required a large amount of manual data extraction, several rounds of ethics approval and a lot of paper-based forms.

Two reasonably simple ways to overcome these barriers are the wider use of commercially available tools to extract data from general practice and hospital-based medical records, and the creation of expert groups to set up clinical standards using a Wiki-like approach.

One of the study authors, Enrico Coiera, director of the Centre for Health Informatics at the University of NSW's Australian Institute of Health Innovation, said a shorter-term approach should be considered to develop tools that permit extraction of key data fields from local electronic record systems.

“It would be good now to have an agreed standard way for data extraction to proceed nationally, so that any record system that met this standard could be queried,” Professor Coiera said.

“There are already several commercial and public domain tools designed to extract data from different primary care record systems. These are used for example to assist in clinical audit at a practice or Division level.

“There are many models for such monitoring – our CareTrack study for example works as a high level indicator of the degree of variation in care for agreed conditions, and we could use the extraction tools on a selected sample of practices to get this kind of high level data.

“If it was thought useful, and individual practices wanted to see how well they tracked against standard, then they could elect to report, and would make sure the record system they used was conformant with standards for extraction.”
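As a rough illustration of what an “agreed standard way for data extraction” might look like in practice, the sketch below shows a conformant record system exposing a small set of atomic fields that an audit tool could query in the same way regardless of vendor. The interface, classes and field names are illustrative assumptions only; they are not part of CareTrack or of any existing Australian standard.

```python
# Hypothetical sketch of a standardised extraction interface.
# The protocol, classes and field names are illustrative assumptions,
# not an existing national standard or any CareTrack artefact.
from dataclasses import dataclass
from typing import Iterable, Protocol


@dataclass
class ExtractedRecord:
    """A minimal set of de-identified, atomic fields an audit might need."""
    patient_id: str       # de-identified study ID, not a real identifier
    condition_code: str   # e.g. a SNOMED CT or ICD-10 code for the condition
    indicator_id: str     # which clinical indicator this observation relates to
    value: str            # the recorded value (e.g. a blood pressure reading)
    recorded_on: str      # ISO 8601 date


class ConformantRecordSystem(Protocol):
    """Any practice or hospital system meeting the standard would implement this."""
    def extract(self, condition_code: str) -> Iterable[ExtractedRecord]:
        ...


def audit_sample(systems: Iterable[ConformantRecordSystem],
                 condition_code: str) -> list[ExtractedRecord]:
    """Query a selected sample of practices for one condition, vendor-agnostically."""
    records: list[ExtractedRecord] = []
    for system in systems:
        records.extend(system.extract(condition_code))
    return records
```

The point of the sketch is simply that, once every record system answers the same query in the same shape, the high-level variation-in-care monitoring described above no longer depends on manual extraction from each vendor's database.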

The researchers say some of these difficulties might be overcome if such studies could be carried out over a national shared electronic health record system. However, national systems like the PCEHR are not likely to be able to allow this for many years, if ever.

“In an ideal world, we would have a national system where clinical record systems were directly interfaced to a national backbone, designed to support population health analyses, and where we could query practice records for specific atomic data ...” Professor Coiera said.

“As that sort of functionality is likely a long way off in Australia, we can still use simpler data extraction tools to get the sort of information needed for monitoring. The PCEHR today does not seem to be architected to allow the extraction of atomic record data, but that may change with time.”

The study also highlighted the enormous range of existing guidelines and indicators for evidence-based care, some of which were out of date and most of which came from a number of different sources. The authors recommended that a series of clinical standards be developed to ensure practitioners have easier access to up-to-date, nationally agreed standards for different conditions.

“We believe that the term ‘clinical standard’ implies that its development has been subject to strict criteria … that there is national agreement on the content, that it would be kept up to date, and that it would be backed up by unambiguous indicators and an easy-to-use tool developed by both providers and consumers,” they wrote.

They recommend that a national approach be taken, with groups of experts formed to oversee the adoption and development of clinical standards, indicators and tools for each condition and to keep them up to date. They also propose that these experts, in collaboration with relevant national bodies, develop a draft of proposed national clinical standards that could be incorporated into a collaborative wiki, such as that being developed by the MJA and the Cancer Council Australia.

“Wikis are powerful public resources enabling anyone with an interest to develop online documents, reach rapid consensus, and finetune them over time as circumstances or evidence change,” they wrote.

“Comments will be invited from interested parties both at the meeting and for a defined period of time afterwards. Responses by the oversight group to these comments will be posted on the wiki. After an iterative improvement process, and receipt of final comments from the relevant interest groups, the standards, indicators and tools will be published in peer-reviewed literature.”

They also propose that apps be developed as tools for use on handheld electronic devices and for incorporation into medical records.

Professor Coiera said the team was already scoping the possibility of developing this collaborative effort. “We are collaborating with the MJA to convene expert groups to develop basic clinical indicators for a selection of major conditions,” he said.

“Unlike guideline development, the process we envisage is to develop very simple clinical standards of care, which capture basic good practice. For example, if your patient is hypertensive, then do measure and record their blood pressure. If your patient is asthmatic, make sure they have an asthma plan.

“Achieving consensus on such indicators of care is more feasible than doing the same for complete practice guidelines, which are always changing, and often contentious. Our goal is to narrow things down to a set of non-contentious basics that most everyone agrees with.

“Apart from the professions, we also think consumers need to be engaged in the process as they are another important part of the equation. If patients have access to clinical indicators, then they can help make sure the appropriate things get done. If they had an app that could access the current recommended indicators for their condition, they could use it as a checklist - and have an informed discussion with their treating clinician about their care.”
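To make the idea of very simple clinical standards concrete, the sketch below shows how such indicators might be represented as data and checked against a patient record to produce the kind of checklist a consumer app could display. The indicator wording paraphrases the examples quoted above; the record fields, functions and thresholds are hypothetical assumptions, not drawn from the study.

```python
# Illustrative only: indicator text paraphrases the article's examples, and the
# patient-record fields are invented for the sketch, not CareTrack indicators.
from dataclasses import dataclass
from typing import Callable


@dataclass
class Indicator:
    """One simple, non-contentious statement of basic good practice."""
    condition: str
    text: str
    is_met: Callable[[dict], bool]   # check against a simple patient record dict


INDICATORS = [
    Indicator("hypertension", "Blood pressure is measured and recorded",
              lambda rec: bool(rec.get("bp_readings"))),
    Indicator("asthma", "Patient has a written asthma plan",
              lambda rec: bool(rec.get("asthma_plan"))),
]


def checklist(patient: dict) -> list[tuple[str, bool]]:
    """Return (indicator text, met?) pairs for the patient's conditions."""
    relevant = [i for i in INDICATORS if i.condition in patient.get("conditions", [])]
    return [(i.text, i.is_met(patient)) for i in relevant]


if __name__ == "__main__":
    patient = {"conditions": ["asthma"], "asthma_plan": False}
    for text, met in checklist(patient):
        print(("met" if met else "not met"), "-", text)
```

Because each indicator is a short, checkable statement rather than a full guideline, the same data could drive a clinician-facing audit report or the patient-facing checklist app described above.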

The CareTrack study also involved William Runciman, Tamara Hunt, Natalie Hannaford, Peter Hibbert, Johanna Westbrook, Richard Day, Diane Hindmarsh, Elizabeth McGlynn and Jeffrey Braithwaite.
