
Prediction in fear: AI in healthcare

28 Sep 2018, by Amy Sarcevic


Disruptive technologies, AI and machine learning can be powerful tools to drive better health outcomes. But in a fear-based economy, AI-enabled predictions need to be fully understood within the context of each individual person and their circumstances, says MedicalDirector’s CEO, Matthew Bardsley, ahead of the AI, Machine Learning & Robotics in Health Conference.

“Universally, we transact in two things – fear or pleasure. Every decision we make comes back to one of these two things, and within each of those two utilities there’s a moral compass.

“But being equitable in pleasure isn’t as greatly valued as being equitable in fear.

“Let’s take this example. You go on holiday to Fiji and stay on a beautiful island at a five-star resort. Your colleague also goes on holiday to Fiji, but stays on the mainland in cheaper accommodation. Back at work, you happily share your very different experiences with each other, along with holiday snaps and stories.

“Now let’s take an example in healthcare. You are diagnosed with a rare cancer; it’s treatable, but the drugs aren’t available on the PBS. You meet with a support group to discuss the impact this is having on your life, only to find out that many people in the group have been able to afford the drugs and are in recovery. Your experiences are inequitable and, well, it just doesn’t feel right.

“The moral question arises as to why someone should be disadvantaged, in their most vulnerable state, because of their economic position. So equitability is a real issue when it comes to the moral compass in healthcare in a fear-based economy.

“This takes us to the issue of prediction and fear. There is a real moral issue with prediction in fear, compared to prediction in pleasure.

“Prediction in pleasure is easy to digest. You go on Amazon, search for outdoor activity ideas, and the CX machine learning capabilities suggest rollerblades. You say ‘these look great!’ and buy them, but later find you didn’t actually use them that much. It’s not a big issue; you find something else you like.

“Prediction in a fear-based economy, however, is a very different experience. Imagine if an AI-enabled health prediction tool said to you, ‘you’re going to die when you’re 70.’ You then base all your life decisions on that prediction, but come your 70th birthday you’re still around – alone and broke, with no future plans in place.

“The impact prediction has in a fear-based economy is very real, and it carries far more implications for our individual decisions. If the fear doesn’t manifest itself, or the reality is worse than what was initially predicted, these outcomes are much harder for the human mind to cope with and take on board. So it has to be managed a lot more carefully.

“And it is this need to manage predictions effectively in a fear-based economy that we must bear in mind as we innovate in healthcare.

“Disruptive technologies, AI and machine learning can be powerful tools to drive better health outcomes. But in a fear-based economy, the AI-enabled predictions need to be fully understood within the context of each individual person and their circumstances. The message then needs to be delivered with the right empathy and sensitivity.

“One example is a simple birth control alert or reminder, which would seem quite straightforward to communicate to a patient. But if that patient is in an abusive relationship, that message needs to be conveyed in a very different way.

“For AI to really be a powerful tool in healthcare and an enabler of better health outcomes, we need to make sure it is used respectfully, that there is a level of empathy in the system, and that there is a layer of emotional artificial intelligence, before it can be unleashed into the healthcare ecosystem. Otherwise, just one health prediction delivered incorrectly can lead to a life-or-death situation for a patient, and stifle innovation.

“Moving forward, innovators in healthcare need to understand and respect the layer of emotional AI required, really lean into this problem and understand its repercussions, so healthcare can enjoy the same capabilities as other parts of the market.”

Matthew Bardsley will discuss this topic further at the AI, Machine Learning & Robotics in Health Conference – due to take place 20-21 November in Melbourne.

Learn more and register.
