Skin in the game: a different approach to guaranteeing health data privacy
A popular branch of technology at the moment is AI, though the name is only half right. It's certainly artificial, but there's not much that's intelligent about it. Automated pattern recognition is a more accurate term because AI systems are getting very good at recognising patterns, but they can't do much else.
Still, pattern recognition is useful. For example, you can drive cars with it. Uber, Tesla and Google are dead keen on building a self-driving future in which our roads will be safer and pleasantly free of congestion. We'll also have more time to look at our devices.
Unfortunately, self-driving cars have already been involved in accidents, some fatal. It's too early to tell whether their accident rate is higher or lower than that of human-driven cars, but if the experience from aviation is applicable, we can expect an initial increase in adverse events before the technology becomes safer.
The tech companies can blog all they like about the positive statistics but, people being human, they aren't that reassured. Apparently, we think there's a fair chance Herbie might turn into Christine. We can tolerate drunks and speeders killing people every day, but the thought of death by car robot is too dystopian.
One thing companies could do to alleviate fears is to take a different approach to testing. Uber was testing on public roads without any announcement about what it was doing, which is not just discourteous but also shifts all the risk onto people who don't know they're taking it.
What if Uber, Tesla and Google were only allowed to test self-driving cars in areas where their children were going to school? Nothing would focus the mind harder. It's not hard to imagine a change in approach.
Another technology that's being worked on is using personal health data for purposes other than direct care. Like self-driving cars, this could make our future better. The expectation is of safer, more effective treatments, ill-health identified quicker, and more effective ways to prevent onset. Doing this well depends on people with different types of expertise working together, clinical and statistical most importantly, but also informatics and computer science. Good practitioners will bring with them wisdom and a dollop of philosophy, plus a deep connection to community.
There is probably little about this technology that is hazardous to life, though misinterpretation is a constant danger and could cause harm. Instead, the biggest concerns are around privacy, consent and trust. Most people are more than happy for their data to be used for the greater good, but only on condition that their participation is respected and not exploited. As with all technology, great attention must be paid to the non-technological dimension before there is any chance of success.
The government is hoping to succeed with just such a project. The Department of Health recently published a framework for the secondary use of My Health Record data, which will be the mechanism for making data from the My Health Record accessible to researchers. The data could be de-identified or identified, the latter only under certain conditions, with ethics approval and patient consent.
While the framework outlines guiding principles for how this will all happen, it doesn't have much detail on the processes. DoH is working on the detail now, and it will have much to think about. There are many intricate tasks involved in transporting data from a very locked-down system to a third party.
To alleviate fears about any of these steps going wrong, the government will no doubt develop compliance, regulation, monitoring and auditing mechanisms. Plenty of money will be spent (mostly on consultants) and piles of documentation written.
But will all of this guarantee much against slippage? It is often far too easy for those involved to find ways to shift accountability somewhere else if something goes wrong, with the resulting finger-pointing resembling the Mexican stand-off scene in Reservoir Dogs.
Consultants can wash their hands of it once the contract has ended. If it's easy for party A to say it was party B's fault, they will. As with self-driving cars though, perhaps there is a way to drastically reduce the expenditure and guarantee focus.
The My Health Record can only work if other systems can unambiguously find the correct record for an individual. This is made possible by the Healthcare Identifiers Service. Every Australian has an identifier called an Individual Healthcare Identifier (IHI). In theory (and hopefully in practice), one IHI equals one My Health Record.
One key step of the secondary use process will be to develop the criteria for determining the subset of the My Health Record data to transmit to the researcher. What if the IHIs of everyone involved in secondary use were saved into a list, and this step used the list to force those records to also be included, whether they met the criteria or not?
Everyone means everyone. For life. The members of the data governance board. The department director who signed off on the process. The politicians who voted for it (should it need legislative change). The database administrators of My Health Record. The members of the ethics committee that approved the research. The managers and staff in the data custodian teams at the Australian Institute of Health and Welfare. None of them would be allowed to opt out of the My Health Record, and any who already had would have to opt back in.
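The forced-inclusion step can be sketched in a few lines of code. To be clear, this is an illustration only: the record shape, function names and example IHIs are invented for this sketch and bear no relation to how the actual My Health Record extract process works.

```python
# Sketch of "forced inclusion": stakeholders' records are added to the
# research extract whether or not they meet the selection criteria.
# All names and data shapes here are hypothetical.

def select_records(records, criteria, stakeholder_ihis):
    """Return records matching the research criteria, plus the records
    of everyone on the stakeholder list (governance board, ethics
    committee, administrators, and so on)."""
    extract = [r for r in records if criteria(r)]
    already_in = {r["ihi"] for r in extract}
    # Force-include stakeholder records that the criteria missed.
    extract += [
        r for r in records
        if r["ihi"] in stakeholder_ihis and r["ihi"] not in already_in
    ]
    return extract


# Hypothetical usage: a study selects patients over 60, but the
# 30-year-old board member's record is swept in regardless.
records = [
    {"ihi": "8003-609-900-000-001", "age": 30},  # a governance board member
    {"ihi": "8003-609-900-000-002", "age": 72},  # meets the criteria
]
extract = select_records(records, lambda r: r["age"] > 60,
                         {"8003-609-900-000-001"})
```

The point of the mechanism is that the stakeholder list sits inside the selection step itself, so there is no separate process to neglect or bypass: every extract that leaves the system carries the decision-makers' own records with it.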
Whether the data is identified or de-identified, most of these people would care enough to make a difference and ensure the public's privacy is protected. After all, they would also be protecting their own. There is plenty of evidence that enforcement and compliance structures are far less effective and much more costly than simple mechanisms of trust and accountability. Will someone take a brave step out of the top-down, command-and-control paradigm and try this out?
We've seen some pretty poor behaviour lately. HealthEngine apparently doesn't understand the difference between "express consent" and "informed consent". Other government initiatives haven't gone as well as ministers would like. The traditional methods for avoiding failure don't seem to be working. Perhaps it's time we tried a different approach.
Brendon Wickham is a health informatician who works in primary care.
Are you a CHIA member? Reading this Pulse+IT article entitles you to CPD points. Click here to record your participation.