Tale of a 10-year journey for SA Health

In September this year, South Australian Health CIO David Johnston handed over the baton after a decade in the role, just as his team's long-term strategy to implement a statewide integrated EHR was coming to a head with the rollout of the Allscripts EPAS system.

While people may have a different opinion on whether a centralised system for health IT decision-making is the right way to go – Victoria, for example, has had its share of problems with this approach – this is the strategy that South Australia has taken for its acute care sector.

In a wide-ranging and amusing presentation to the Centre for Health Innovation (CHI) conference in Melbourne recently, Mr Johnston outlined the work that needed to be done in terms of basic infrastructure before clinical systems could be rolled out.

When he began, the final report of the 2003 South Australian Generational Health Review had been handed down, showing that healthcare would consume the entire state budget by 2032. “It has since been revised back to 2025, and it consumes about 40 per cent of the budget now,” Mr Johnston said.

The review had a few recommendations to make to avoid this coming tsunami, but the one that interested him was the idea to develop a plan to enable the establishment of a single electronic health record for each patient.

“It would seem pretty obvious in most industries that that would be a good idea, but when I started looking into it, it became more complicated,” he said.

Describing the SA health system at the time as a $5 billion cottage industry, he said the system was using archaic, paper-based technologies to transfer information around and doing so incredibly inefficiently.

“What did I inherit? Oh, it was fabulous. I look back on it and think I'll never do that again. There were 75 information silos: eight metropolitan hospitals that were independent, competing for money and reporting directly to the minister; 65 country hospitals, some of them very small; mental health services that were completely isolated from the acute sector; and community health with its own systems.”

Mr Johnston said he had tried to count all of the systems used but gave up at about 3500, and that didn't include databases that clinicians had developed on their own. “We did a scan last year and there were about 92,000 Access databases in use, 14,000 of which were in constant use and contained patient information,” he said.

“The hospitals had their own boards with fiercely independent cultures. The two I like to pick on are Royal Adelaide and Flinders. There is some competition between the two: Royal Adelaide built its own mainframe system – it had to have a mainframe because they are expensive and very impressive – and Flinders bought a Unix system because Royal Adelaide had a mainframe system.

“There were many home-grown information systems, and they bought systems and then customised them so they were all configured differently. They were paying top dollar to support them but many were 25 years old and were about to fall over. There were many legacy systems and we actually found a few using languages that nobody had ever heard of.

“Then we got into the really serious bit which was no standardisation. You go in as a patient and you are given a number that is unique to that institution, you have a number of tests that use a particular code that is only used at that institution.

“The drug database or the pathology database or the imaging database would all be different. Therefore, there was no ability whatsoever to transfer information apart from couriering a big load of paper which you can't read half the time and it has codes that you can't understand.

“And there was also no investment budget. The standard response was 'we have no money', regardless of what the question was.”

Looming catastrophe

Mr Johnston described the situation as “an absolute mess”, but said it was typical of most hospitals and health systems around the world. The decision to implement an EHR was to alleviate that mess in some way, but more importantly to avoid the catastrophic increase in activity that faces all health systems globally.

“It is not so much reducing costs but the ability to absorb activity,” he said. “Victoria, for example, is 2800 beds short of its four-year forward projections. That is $3 billion right there, without the running costs, and none of it is in the forward budgets.

“It's not so much the opportunity to reduce healthcare costs at a macro level, but more the opportunity to increase its efficiency and drop errors. It is things like patient safety that are a great driver of cost in the clinical environment. The key to total quality management is to reduce variance.”

He said there were a number of obvious areas that needed to be fixed. This included a common patient identifier, systems that could actually talk to one another, common languages for catalogues and databases, access to devices and proper security of access.

“Security is a fascinating topic, as it can be scary stuff in hospitals,” he said. “As long as you are wearing a suit, and preferably with a stethoscope, you can ask medical records to see a patient's record and often they will give it to you.

“If you go into a hospital and turn a keyboard upside down, in 50 per cent of cases you'll find a password taped underneath it. If something goes wrong, they'll say it wasn't me – it was someone else using my password. There is no legal grounding in that: a user name and password is not a legal signature.

“You need access to devices where they are needed – you'll have a nurses' workstation with people crowded around it waiting to use the computer and getting distracted – and you need security of health information exchange. You can't email clinical documents. Well, you can, but you shouldn't.”

In 2003, he said, it was obvious that the state needed to have a rethink about how the whole system was designed. “Health is a federation – there shouldn't be a whole load of small businesses acting independently.”

The conclusion was that everyone had to use the same system and there should be single identifiers for patients, staff and other providers of care.

“The overarching strategy was to look at large single-instance enterprise systems – systems that can cope with 30,000 concurrent users. Single instance means that everyone is on the same system and there are no changes to it. I was told that one system was used and there were 75 instances of it – that means it is not single instance. Each instance is like a completely different product.”

What was instituted was a hub-and-spoke systems architecture, which knocked out the point-to-point connections between the systems and instead funnelled them through what SA Health calls its corporate health information broker (HIB).

“That is very specialised software,” he said. “Its job is to pass messages around the system. So you take individual systems and they only have one interface, which is the broker. That way, if one system needs to send information to another, you send it through the broker and the broker passes it down. What that lets you do is to start pulling systems out and putting systems in, and you have one interface regardless of how many systems it needs to talk to.”
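
As a rough illustration of the topology Mr Johnston describes, the sketch below shows a single hub through which every system sends and receives messages, so that each system has exactly one interface. The system names, message types and routing rules are invented for illustration only; the real broker is a commercial integration engine passing clinical messages at scale, not a few lines of Python.

```python
# A minimal sketch of hub-and-spoke message routing. All names and fields
# are hypothetical; this only illustrates why swapping a system in or out
# does not disturb the others.
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class Message:
    source: str     # sending system, e.g. "patient_admin"
    msg_type: str   # e.g. "ADT" (admission/discharge/transfer)
    payload: dict   # message body


class HealthInformationBroker:
    """Single hub: systems register a handler and subscribe to message types;
    they never connect to each other directly."""

    def __init__(self) -> None:
        self._handlers: Dict[str, Callable[[Message], None]] = {}
        self._routes: Dict[str, List[str]] = {}  # msg_type -> subscribers

    def register(self, system: str, handler: Callable[[Message], None]) -> None:
        self._handlers[system] = handler

    def subscribe(self, system: str, msg_type: str) -> None:
        self._routes.setdefault(msg_type, []).append(system)

    def send(self, message: Message) -> None:
        # Fan the message out to every subscriber except the sender.
        for system in self._routes.get(message.msg_type, []):
            if system != message.source:
                self._handlers[system](message)


broker = HealthInformationBroker()
broker.register("imaging", lambda m: print("imaging received", m.payload))
broker.register("pathology", lambda m: print("pathology received", m.payload))
broker.subscribe("imaging", "ADT")
broker.subscribe("pathology", "ADT")
broker.send(Message(source="patient_admin", msg_type="ADT",
                    payload={"patient": "SA0001234", "event": "admit"}))
```

Replacing one of the spoke systems then means re-registering a single handler with the broker, rather than rewiring every point-to-point connection it used to have.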

Infrastructure requirements

Part of the plan was also to introduce twin data centres, which Mr Johnston said was essential when using a large enterprise system. “It's very well-established architecture but very new from a health perspective,” he said.

Strong governance was also required. “We had these business units which would buy all kinds of things. The vendors were having a field day. A system would go into one hospital and the others would take a look over the fence and say we'd like that too.

“The investment was coming from all over the place … What gets forgotten is that it all has to be maintained, and then the company that provided it would go bust and you were in all kinds of trouble. So part of it was to choke the incompatible investment off, and that meant basically centralising the IT function across the entire business.”

Even something as basic as new PCs was a struggle. Invariably, those higher up the food chain got first dibs on new PCs, while those who actually needed them were instead given hand-me-downs. In many hospitals, 40 per cent of IT staff were in support and maintenance, he said.

“We needed to standardise the desktops. We took all of the PCs, transferred them to the department for ownership, and then rented them back. They were on a four-year rental, so at the end of that four years we would come and get them and rent them new ones for another four years. What that meant was that the fleet of devices stayed current and under control, and support costs started to drop away.

“Then we went through bedside computers, and then to mobile devices, which are currently underway. That is the iPads and the tablets and iPhones and how that all works with the wireless networks.

“There were redundant links being run out to the data centres, and twin routers needed to go in at the front of each hospital so that if one fell over, the other would keep things up.

“The networks were terrible, so [we needed to] invest in the network, put in wireless networking, virtualise the LAN. All of those things needed to be done and that was about $200 million worth of infrastructure that needed to go in behind the walls that nobody could see.”

Mr Johnston's team identified 65 different projects that needed to be done to enable a single EHR, and then calculated the cost. “We went to cabinet and asked them to give us the money. If you want an electronic health record to help avoid the tsunami that is coming, we need $375.6m, which got laughed at by a lot of people until cabinet said yes.

“So, we got stuck into it. First off it was to build the internal project capability. That is hard. You need a standard project methodology, you need to have skilled project people, you need an entire governance and project tracking system, you need steering committees and project boards, you need an entire culture change around professional project management and that took a number of years to get in place.

“That was established and the machine then kicked off. At its peak it was running 33 projects concurrently.”

These projects included the installation of InterSystems' Ensemble integration platform as the health information broker, and the installation of an enterprise master patient index using IBM's Initiate software.

“That is an absolutely critical piece of software,” Mr Johnston said. “It has a matrix [of identifiers], so you can type in the state number and it goes and grabs the information, comes back and presents it to you, and it looks like it has all come from one system. The other key is that it also stores the IHI, which became important later with the PCEHR.”
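
The cross-referencing job the master patient index performs can be pictured with the sketch below: one state-wide number resolves to each hospital's local unit record number plus the national IHI, so a viewer can pull records from every source system and present them together. The identifiers and data structures here are invented; IBM Initiate's real matching is probabilistic and far more sophisticated.

```python
# A hypothetical, hard-coded master patient index used purely to show the
# idea of identifier cross-referencing. No real identifiers are used.
EMPI = {
    "SA0001234": {                              # state number
        "ihi": "8003608166690503",              # example-format national identifier
        "local_ids": {
            "royal_adelaide": "RAH-445566",     # local unit record numbers
            "flinders": "FMC-778899",
        },
    },
}


def lookup(state_number: str) -> dict:
    """Return every linked identifier for a patient."""
    record = EMPI.get(state_number)
    if record is None:
        raise KeyError(f"no EMPI entry for {state_number}")
    return record


ids = lookup("SA0001234")
for hospital, local_id in ids["local_ids"].items():
    print(f"fetch records from {hospital} using {local_id}")
print("IHI for PCEHR interactions:", ids["ihi"])
```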

Secure messaging for information encryption and decryption was installed, as was a directory of all 37,000 staff that identified them and allocated user rights depending on their roles.
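
As a simple way of picturing that directory-driven security, the sketch below allocates rights from a staff member's role rather than granting them ad hoc at each site; the roles, permissions and staff identifiers are hypothetical.

```python
# A minimal sketch of role-based access rights sourced from a central staff
# directory. Roles, permissions and identifiers are invented.
ROLE_PERMISSIONS = {
    "nurse":      {"view_record", "record_observations", "order_meals"},
    "pharmacist": {"view_record", "view_medications", "dispense"},
    "doctor":     {"view_record", "view_medications", "prescribe", "order_tests"},
    "clerk":      {"register_patient"},
}

STAFF_DIRECTORY = {
    "S10001": "nurse",   # staff id -> role, as held in the directory
    "S10002": "doctor",
}


def can(staff_id: str, action: str) -> bool:
    """True only if the directory role for this staff member carries the right."""
    role = STAFF_DIRECTORY.get(staff_id)
    return role is not None and action in ROLE_PERMISSIONS.get(role, set())


print(can("S10001", "prescribe"))  # False: a nurse cannot prescribe
print(can("S10002", "prescribe"))  # True
```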

It is only when all of that infrastructure is in place that clinical applications can begin to go in, he said. “My analogy is the floors of the building – you put in the eight floors of infrastructure and the ninth floor is the application layer.”

The first large enterprise system installed was iSoft's iPharmacy system. That will be followed by a statewide imaging system from Carestream Health, to go online by April or May next year, and then pathology. Money has been allocated for a pathology system but a tender has not yet been issued, with Mr Johnston estimating that installation is 18 months or so away.

The ability to send electronic discharge summaries to the PCEHR is now up and running – SA developed its own system to enable this, which has now been licensed by NEHTA to other states – and they are also being sent to GPs via secure messaging.

Point of care

What came next was a decision on how to give clinicians access to these information systems at the point of care. It is all very well to have the information systems installed, Mr Johnston said, but how are you going to access the information if you've got PCs scattered around the place?

“Information at the point of care became critical for us so we did something a little radical,” Mr Johnston said. “We partnered with Telstra and we rolled out 3500 bedside computers and that is the largest roll-out in the world.

“Its first wave was to become a patient infotainment system ... so the patient can access the internet, they can read their email, watch movies on demand. But the reason it was put in was not for that. That was a method of cross-subsidising so that we could actually get the computers in and get the patients to pay for some of it. The reason that we put it in was point of care at the bedside.”

The bedside devices also allow better food management, which has flow-on effects not just on the bottom line in terms of wastage but also in ensuring scheduled surgery is not delayed. Mr Johnston said up to 30 per cent of food in Australian hospitals is wasted, typically because food will turn up even though the patient has been discharged. That food cannot be reissued.

“The issue of food wastage is terrible,” he said. “By using that screen, you can put that into the clinical system and the PAS, so if you have a patient that is diabetic they don't have certain foods appear. If they are going into surgery in the next six hours, they get the choice of a glass of water or a glass of water. You don't have to cancel surgery because someone's eaten.

“There are a whole pile of safety benefits attached to it. It is also real-time so it goes straight back into the food management system. The estimates are that you can cut food wastage from 30 per cent to 10 per cent simply by getting that order done by the patient.”
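
The kind of safety rule the bedside ordering screen can apply once it is linked to the clinical system and the PAS might look like the sketch below: no tray for a discharged patient, water only within six hours of theatre, and the menu filtered against recorded dietary restrictions. The field names and rules are invented for illustration, not taken from the SA Health system.

```python
# A hypothetical meal-ordering rule check, illustrating the links to the
# clinical system (restrictions, theatre time) and the PAS (discharge status).
from datetime import datetime, timedelta


def allowed_menu(patient: dict, full_menu: list, now: datetime) -> list:
    if patient["discharged"]:
        return []  # no tray is sent for a discharged patient
    surgery = patient.get("surgery_time")
    if surgery and now >= surgery - timedelta(hours=6):
        return ["water"]  # fasting before theatre
    restricted = set(patient.get("restricted_tags", []))  # e.g. {"high_sugar"}
    return [item["name"] for item in full_menu
            if not restricted & set(item["tags"])]


menu = [
    {"name": "roast chicken", "tags": []},
    {"name": "sticky date pudding", "tags": ["high_sugar"]},
    {"name": "water", "tags": []},
]
patient = {
    "discharged": False,
    "surgery_time": datetime(2013, 12, 12, 14, 0),
    "restricted_tags": ["high_sugar"],
}
print(allowed_menu(patient, menu, now=datetime(2013, 12, 12, 6, 0)))
# -> ['roast chicken', 'water']  (surgery is still more than six hours away)
```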

The computer roll-out was completed in March, and the food management module was added in June. In August, video libraries were added to allow for patient education.

Mr Johnston said this month, “some of the fun will start”, including replacing telephone handsets with models that have a barcode scanner embedded on the back. “What that means is you can now move into some medication management,” he said.

“I went to a place called Jefferson Medical Centre in Arkansas in the US and they've put in the system that we've put in. They had a PC in the ward with a Bluetooth scanner, and they scanned wristbands and scanned medication to make sure they match.

“They admitted a 13 per cent medication error rate in the past but it went from 13 per cent to zero overnight because it was impossible to administer the wrong medication to the wrong patient. We decided that we would take it but use handsets to do that.”
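
The scan-and-match check itself is conceptually simple, as the sketch below suggests: the wristband and the medication barcode must both resolve to an active order before administration can proceed. The barcodes and the order store shown are invented.

```python
# A hypothetical positive-patient-identification check for medication
# administration. Barcodes and orders are invented for illustration.
ACTIVE_ORDERS = {
    # (wristband barcode, medication barcode) -> order details
    ("PT-0001234", "MED-AMOX-500"): {"drug": "amoxicillin 500mg", "route": "oral"},
}


def verify_administration(wristband: str, med_barcode: str) -> dict:
    """Return the order if patient and medication match; otherwise refuse."""
    order = ACTIVE_ORDERS.get((wristband, med_barcode))
    if order is None:
        raise ValueError("no matching order: do not administer, escalate to the prescriber")
    return order


print(verify_administration("PT-0001234", "MED-AMOX-500"))
try:
    verify_administration("PT-0001234", "MED-WARF-5")
except ValueError as err:
    print("blocked:", err)
```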

EPAS

Now that the basic infrastructure part of the strategy is in place, the enterprise patient administration system can go in, and the most important element – clinical decision support – can begin, Mr Johnston said.

While the roll-out of EPAS has been delayed somewhat, mainly due to problems that Allscripts has had with Australia's intricate billing systems, the first site at Noarlunga Hospital went live in August and others will come on stream over the next two years. “The majority of the work is complete, the big systems are in, and the sites are being activated,” Mr Johnston said.

By centralising the whole system, SA Health now has the largest information system of its kind in the world for healthcare, he said.

“There is the increase in safety that comes with that, the increase in throughput, and then you can get into business intelligence, because there's now a stable base for health information. You can start to look for all of the correlations and deep mining and do all of that sort of stuff, which you can't do unless you've got that stable information base in the first place.

“What will be interesting to watch will be the KPIs over the next few years – that is, the safety KPIs and the cost KPIs. It shows that government agencies can actually move. There was a clear goal that drove that massive change over that 10-year period and, for an organisation of that size, it shows that it can actually be done provided the goal doesn't change.

“I had people ringing up saying they wanted an update on what the strategy is, and I'd say it's the same strategy. The same strategy that has been in place for 10 years, because that is what a strategy is. I have avoided a lot of meetings that way.”

Mr Johnston said there is no doubt that introducing EPAS will be very challenging, but he believes it is past the point of no return, which is why he is content to move on after 10 and a half years.

“With the big systems up and the sites starting to plug in, it is now a case of plug the next site in, plug the next site in, and that is for operational people. Some of the sites will be challenging, more challenging than others because they've got their own cultures, but ultimately it will all be in within the next few years, so I kind of feel like my job is done.

“I would never do it again mind you but certainly I have enough scars that I can now go and talk to other states that have all got the same challenges. It's not necessarily the right answer for each state – it is not the right answer for Victoria, that's clear; a hybrid solution will probably work in Victoria – but it is really about working with people to say what can be done and saying here is what I've learnt along the way. The hard way.”


Comments   

# Daniel Byrne 2013-12-12 22:57
Great article overall but a bit of fact checking will prove that discharge summaries are not being sent by secure messaging to GPs. Sending it encrypted to a practices email address is not secure messaging. Unfortunately the guys at the top of the tree often have no idea of what happens on the ground.
# Simon James 2013-12-12 23:07
Hi Daniel,

Can you elaborate on the distinction you are drawing between 'sending it encrypted to an email address' and secure messaging?

Thanks
Simon
# Oliver Frank 2013-12-13 11:12
I think that Daniel is saying that discharge summaries are not being delivered into GPs' clinical computer systems via Argus, despite SA Health having bought an Argus licence some years ago.
# Oliver Frank 2013-12-13 11:14
Discharge summaries sent to the practice's ordinary email address still have to be handled manually by the practice, to get them into the patient's record. This is not acceptable.
# Daniel Byrne 2013-12-13 11:38
My practice is one of the minority that receives discharge summaries from SA Health "electronically ". We did it as a trial. They are sent to our public practice email address encrypted - but we then have to print them out for doctors to read and manually import them into our clinical software. Double handling, unsafe process. SA Health refuse to use any of the standard commercially available systems that link straight into our clinical software. It annoys me that for years they have been claiming they use secure messaging. It is impossible for a GP to send a referral electronically into the SA Health system. The wonderful EPAS is 100% closed to anyone outside the SA Health firewall. A very insular way of caring for patients.
# Harriet McDonald 2013-12-13 14:45
EPAS is a mess... only very low profile minor sites are being allowed to go live before the impending SA state election... this gives the project a semblance of moving forward without exposing the government to the risk of a major public relations disaster. The full scope of this impending IT catastrophe will become evident mid next year after the new government has settled in. Watch this space!
