Over recent weeks I have met all sorts of people from inside and outside the NHS with plans to use artificial intelligence in innovative ways to transform the health service. What makes Haris Shuaib different is that he’s not just talking about AI or piloting it, he’s actually using it in two major hospitals.
Haris runs a 20-strong AI team at the Guy’s and St Thomas’ NHS Trust providing applications for the two London hospitals. He is one of a new breed spanning the worlds of medicine and computer science, with a background as a physicist working with MRI scanners and a PhD in the use of machine learning to diagnose brain tumours.
He says the team was formed after a period when he was doing ‘AI by stealth’ in the hospital: “I was becoming the de facto expert locally - when different departments wanted to deploy AI, they would call on me.” So he lobbied Trust bosses for an AI unit and, with the help of some research grants, the team was assembled three years ago.
One of their first projects was to use AI to automate paediatric bone scanning, where x-rays of children’s hands are examined to determine their developmental age. Haris explains that this involves a radiologist paging through an encyclopedia of hand scans to find one matching the x-ray. “It's incredibly boring. And it's very well suited for AI.”
They used an algorithm designed by a French company, and it seemed to work well. But after a few months came the first incident to cast doubt on the deployment of AI, when some consultants complained that the system was giving wrong answers on developmental age. An investigation was mounted, and it found that the discrepancies related to one particular aspect of bone assessment:
“If the child is under four years of age, you're not supposed to use the bones in the wrist to make the assessment because they can be misleading. And the AI was correctly ignoring wrist bones in its assessment for children under four and that group of consultants were not.”
In other words, the AI had remembered something that the doctors had forgotten, and was therefore more, not less, accurate than these humans.
Two more examples of AI in action also involve automating unglamorous processes which are nevertheless vital in the efficient operation of hospitals. The first is the huge volume of correspondence from GPs referring patients to the hospital for treatment: “We essentially get paper forms, then an army of administrators need to figure out which department needs to see this patient.”
Similarly, the process of setting up appointments is on the path to automation, starting with the endoscopy department. Haris Shuaib explains that reports come back after an endoscopy with results that could mean either that the patient does not need to be seen again, that they should be given an appointment in three months’ time, or should be called back the next day because something worrying has been found.
“Because the administrator doesn't understand the report or the guidelines, it means a consultant has to sit with them and basically tell them when to book these patients.”
In both of these cases the AI team is using a Large Language Model - the technology behind ChatGPT - to read the referral forms or the endoscopy reports and understand them in the way a doctor would.
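The endoscopy triage described above boils down to a three-way routing decision. A minimal sketch of that decision logic is below; the `classify_report` function here is a crude keyword stand-in for the Large Language Model the team actually uses (the real system, its prompts, and its guideline rules are not public), so everything in it is an illustrative assumption rather than the deployed method.

```python
from enum import Enum


class Triage(Enum):
    """The three outcomes Haris Shuaib describes for an endoscopy report."""
    DISCHARGE = "no follow-up needed"
    ROUTINE = "appointment in three months"
    URGENT = "recall the next day"


def classify_report(report_text: str) -> Triage:
    # Stand-in for the LLM call: in the real system the report would be
    # sent to a language model instructed with the clinical guidelines.
    # These keyword rules are purely illustrative.
    text = report_text.lower()
    if "suspicious lesion" in text or "urgent" in text:
        return Triage.URGENT
    if "polyp" in text or "surveillance" in text:
        return Triage.ROUTINE
    return Triage.DISCHARGE


def book_appointment(report_text: str) -> str:
    """Turn a report into the booking action an administrator would take."""
    return classify_report(report_text).value
```

Whatever does the classifying, the point of the design is the same: the model replaces the consultant who currently has to sit with the administrator and interpret each report before an appointment can be booked.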
That is the theory at least, but these two projects are at an early stage, and even if they promise to rid doctors of a whole lot of tedious admin, one can imagine they will want to be convinced their AI replacements are up to the job. Then there are steep regulatory hurdles to clear. Many others involved in trying to bring AI into the NHS have told me their biggest headache is getting approval from the MHRA for what is regarded as a software medical device.
But Haris reveals that Guy’s and St Thomas’ benefits from a loophole: “If a health institution builds any kind of medical device and uses it on its own patients, it’s outside the purview of the regulator,” he explains, “because the MHRA regulates the market, and the inside of a hospital is not a marketplace.”
That means his team can move more quickly, but there is a downside. They cannot easily take their innovations to other hospitals, which makes it hard to scale up and earn the commercial return that they and their suppliers need to keep investing, or to spread best practice across the NHS.
But the AI team at Guy’s and St Thomas’ has found another way of distributing its ideas across the health service. An internal education programme teaching doctors about AI has become a national scheme, Fellowships in Clinical Artificial Intelligence, directed by Haris Shuaib. He says training needs to be at the heart of any project to bring AI into the NHS:
“Even if you buy really good technology, if we're not investing in helping doctors understand what the AI is saying, how to integrate it into clinical decision making, either it's going to be ineffective or at worst will be unsafe.”
After a half hour call in which Haris ranged far and wide, fizzing with ideas about the practical, managerial and ethical challenges of bringing AI into the NHS, I felt both exhilarated and exhausted. I imagine the managers and clinicians he works with feel the same, but if the NHS really is to be transformed by AI we will need dozens of enthusiastic innovators like him to lead this revolution.
Getting the quality right with AI tooling in a health context is very hard, especially as you scale up. You need to match receiver operating characteristic (ROC) performance to the use case, which means understanding both the prevalence of the condition and the skills and experience of the users.
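The prevalence point is worth making concrete: a classifier with impressive sensitivity and specificity can still produce mostly false alarms when the condition is rare. The sketch below applies Bayes' rule to show how the positive predictive value (the chance a flagged patient actually has the condition) collapses at low prevalence; the specific numbers are illustrative, not from any NHS deployment.

```python
def ppv(sensitivity: float, specificity: float, prevalence: float) -> float:
    """Positive predictive value via Bayes' rule.

    P(disease | positive test) = TP / (TP + FP), where
    TP = sensitivity * prevalence and
    FP = (1 - specificity) * (1 - prevalence).
    """
    true_positives = sensitivity * prevalence
    false_positives = (1 - specificity) * (1 - prevalence)
    return true_positives / (true_positives + false_positives)


# A model operating at 95% sensitivity and 95% specificity (an assumed
# operating point on its ROC curve) behaves very differently depending
# on how common the condition is in the screened population:
screening = ppv(0.95, 0.95, 0.01)  # rare condition: most positives are false
referrals = ppv(0.95, 0.95, 0.20)  # pre-selected referrals: positives reliable
```

At 1% prevalence, the positive predictive value is only about 16%, so five out of six alerts are false; at 20% prevalence, the same model's alerts are right more than four times out of five. This is why the operating point, the population, and the user's ability to interpret the output all have to be chosen together.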
That said, the opportunities are large, so let's hope that the relevant leadership and organisational structures get the deployment approach right, and don't fall into the kind of ignorance trap that caught out the Post Office with Horizon.