The stereotype of the crusty old doctor who wants nothing to do with new technology is not entirely inaccurate. Still, some of the most passionate evangelists for the idea that artificial intelligence can transform healthcare are members of the medical profession. And they have had to learn a key lesson - some challenges may prove just too hard for AI.
Take Dr Tom Day, a consultant specialising in heart problems in unborn babies. He got in touch after I posted a social media message looking for people involved in trying to deploy AI in the NHS. Dr Day is one of a new breed of doctors who see patients, work on blue-sky research at a university, and also have an entrepreneurial drive.
He works at Guy’s and St Thomas’ Hospital, is a clinical research scientist at King’s College London, and with colleagues has a start-up company. All of these activities focus on one question - why does the ultrasound scan that expectant mothers get at 20 weeks somehow fail to spot around half of all serious heart defects?
This failure has serious consequences. When a heart defect is spotted at 20 weeks the baby’s parents face a terrible dilemma. “With very serious heart defects,” explains Dr Day, “then a significant proportion of people would have a termination of pregnancy.” But medicine has made big advances over the last twenty years, and with advance knowledge of a problem the birth can take place at a specialist hospital, with doctors on hand to operate. “In the modern era, not quite all but almost all heart problems are operable. We can start giving life-saving treatment as soon as the baby's born.”
But with no prior warning, the existence of a heart defect may only become evident some hours after the birth, making the risk of death or at least brain damage much higher. A decade ago, a project got underway at Guy’s and St Thomas’, with a big grant from the Wellcome Trust, to try to improve matters.
It was called iFind and its aim was “to substantially improve the diagnostic accuracy of scans carried out by less expert sonographers [ultrasound operators]” so that routine scans could be carried out without the need for a high level of expertise.
The scheme involved trying a number of technologies. One approach was using MRI scans, which Dr Day says has been very successful; another was using robotics to automate the scanning process, which has proved challenging: “it's really, really difficult to get robotics to work in this context.”
And finally there is the attempt to use AI to detect heart defects, which is where Dr Day comes in. Like just about any project based on machine learning, this was always going to depend on the collection and labelling of vast amounts of data. This came in the form of 10,000 videos of ultrasound scans - effectively screen recordings of what is quite a lengthy process.
Having heard about other uses of AI to examine scans for everything from brain tumours to eye conditions, I thought I knew what came next - an algorithm trained on images of heart defects that had been missed first time around would be developed, providing an essential new tool for doctors treating sick babies.
But I was wrong. “The problem is heart defects may be the most common problem affecting babies but they still are rare,” says Dr Day. So even 10,000 scans might produce fewer than 100 undiagnosed heart defects of different kinds, and that was nowhere near enough data to build a reliable AI system.
Instead the team decided they might be able to use AI to make the lengthy and complex process of carrying out an ultrasound scan faster. Tom says that, as well as looking out for heart defects, the sonographers have to take a whole series of routine measurements, stopping frequently to take notes: “What we thought might happen was that if you could automate that - a bit like putting the aeroplane in autopilot - you would improve the detection rate indirectly.”
They have produced a product which reduces the average time taken to do a scan from 20 to 12 minutes and, according to Dr Day, “lightens the cognitive load” of the sonographer - in other words, it frees them from the distractions they face as they go through the routine stuff and allows them to focus on what is important.
But, but, but…
“What we haven't shown is evidence that it changes the rate of diagnosis,” he admits. To achieve that would mean running a huge trial involving thousands of babies, a very expensive process for which they are currently seeking a grant.
But in the meantime Dr Day and his colleagues have set up a business called Fraiya to try to commercialise their ideas. As they try to sell their existing product as a practical way for hospitals to speed up ultrasound, one approach is to market the system as a way of making life easier for the doctors who often have to be the second pair of eyes reviewing the scan:
“The AI can take a 20 minute ultrasound and organise it in a way such that an expert can then review it very quickly.”
For now, he says, this would be a halfway house with a human in the loop just using the AI as a tool, but over time the system would get better at detecting heart defects on its own.
The ambition is still to create a system that can detect far more heart defects than the typical human ultrasound operator but without producing too many false positives which would clog up hospitals while placing unnecessary and unbearable strains on parents.
Tom Day’s story tells us that, amidst all the hype about AI and healthcare, we need to realise that this stuff is hard: it makes big demands in terms of the data and investment required, and after years of hard work you may still find that it doesn’t really work.
So I asked Tom what kept him going. Here’s what he told me:
“I regularly have to care for babies with severe heart disease who have been missed antenatally and born in the wrong hospital, with no treatment until they become very sick. It’s a real reminder that what we are trying to do will have a tangible benefit for the most fragile and vulnerable members of our society.”