
As ambulance leaders turn to technology, how will the NHS navigate the ‘Wild West’ of AI?

After diagnosing the NHS as “broken”, the government has placed a big bet on tech being the key treatment for its ailing system, promising that it will become the most “AI-enabled” health system in the world.

With services facing a battle over finances, as well as a lack of staff able to meet patients’ needs, health leaders have been exploring the use of AI for some time. The evidence is already there for its use in reading patients’ scans. But, more broadly, how does the use of AI tools translate into emergency care?

Here, ambulance leaders tell The Independent about the realities of using AI in a complex, fast-paced and potentially dangerous environment.

‘We’ve got to get it right first time’

Guiding drones, traffic light prediction, helping with diagnoses and live language translation – these are just a few of the ways in which AI could be used within the UK’s ambulance sector.

Graham Norton, digital transformation lead for the Northern Ambulance Alliance, believes that AI will “absolutely” become an everyday tool for ambulance staff.

“There is absolutely no reason why AI will not be a routine part of the day-to-day activities across the ambulance sector. It should be,” he said.

Mr Norton and Johnny Sammut, director of digital services for the Welsh Ambulance Services University NHS Trust, both agree that AI has huge potential to help health workers battling an increasingly challenging environment.

But the pair say this comes with a heavy safety warning.

“The reason that we’re different [in ambulance services, compared to the rest of the health service] is that this is genuine life and death, and a lot of the time, certainly over the phones, you can’t even eyeball the patient. So, it’s not to say there isn’t a huge enthusiasm [for AI] and huge, huge potential. But we’ve got to get it right first time,” says Mr Norton.

In areas of the NHS such as diagnostic services, AI is being used to read patient scans. But, if a concern is flagged, these readings are usually looked at afterwards by a health professional, creating a safety net.

But Mr Norton warned: “If you’re using AI at an emergency care level – I’m talking about 999 and 111 calls, for example – by the nature of what you’re trying to do, you don’t have the same level of safety net.”

Tackling health inequality

The Yorkshire Ambulance Service is currently one of a handful of trusts testing the use of AI within services, with the main focus on safe AI transcription tools.

These are so-called “ambient AI” tools, which can listen to, record and transcribe notes for paramedics on scene or for call handlers. Mr Norton said the devices could even be used to translate for patients who don’t speak English, using a Google Translate-style tool.

“If we can have AI helping us with translation and transcription, we’re going to be able to deal with real health inequality. There’s a real health inequality for people who don’t speak English as a first language,” he said.

Meanwhile, in Wales, Mr Sammut said the service was already seeing “immediate time saving benefits”, in terms of reducing admin burden for staff, by using AI.

Last month, the trust soft-launched a 111 online virtual agent, similar to an AI chat function, which provides patients with a conversational way to ask about symptoms.

In a quite different application, Mr Sammut said work is under way to link AI-enabled drones with hazardous area response teams – the teams which respond to complex and major emergencies.

“So this provides situational awareness in the sky on particularly complex or dangerous scenes. We’ve got AI now embedded into technology and those drones will have things like intelligent tracking. They’ll be able to pull thermal and non-thermal imaging together and then they’re able to survey and track particular areas of a scene using AI. It develops its own situational awareness in the sky.”

The service also hopes to develop AI which can assist with predicting ambulance demand, and which could help paramedics in the field, for example by interpreting electrocardiograms (ECGs) or spotting anomalies in a patient’s skin.

“The risk of not doing this [using AI] is far greater [than the risk of doing it]. When you think about the NHS, where we are today, the burden that sits on staff and the levels of funding… to not follow through with AI is quite frankly dangerous.”

However, in such a high-risk and fast-moving area, the ambulance executive did point out some risks.

“The other thing that I’ve got in my mind at the minute is: what downstream risk do we create with AI? I’m thinking from a cybersecurity perspective. So one of the very real concerns that I do have with AI is how do we avoid, track and mitigate against AI poisoning.

“AI poisoning is whereby someone will feed one of your AI models a whole heap of fake information and fake data and… you know the price of us getting AI wrong isn’t money alone. It’s life. So if someone is able to poison those models, that is a very real risk to the public.”

News stories over the past two years, including major cybersecurity attacks on the NHS and individual hospitals, show how precarious an area this is.

In terms of risk management, Mr Norton also points out that there needs to be a way of quality assessing AI providers.

The potential is “phenomenal”, he said, but the service must “slow down a little bit”. “You’ve got to avoid the Wild West here,” he adds.
