When I started thinking about how artificial intelligence might be deployed in the NHS, I have to admit that mental health was one of the last places I expected it to be useful. After all, what is there to automate about an area of healthcare which is surely all about medical staff interacting with patients?
Then I met Ross O’Brien and he summed up in one word the opportunity for AI in mental health - chatbots. Ross is just finishing a job as UK Managing Director of the global health tech company Wysa, having been asked in 2021 to set up a UK branch with a mission, as he puts it, “to sell AI mental health products in the NHS.”
In just three years he has succeeded in making Wysa one of the biggest suppliers of AI to the NHS, with contracts with more than a dozen trusts. He joined the company after more than a decade as an NHS manager, commissioning mental health services and latterly leading the London Digital Talking Therapies Programme.
He is clear about the strains on the NHS when it comes to ‘talking therapies’ - what many of us know as psychotherapy. “My service in Kensington and Chelsea used to receive somewhere in the region of 20,000 referrals per year. And that's just one borough in London.” He describes what would happen when someone was either referred to the service by a GP or filled in a form on a website to try to get seen:
“They wouldn't hear anything for a number of weeks. Then, after a short telephone consultation, they would be put on a waiting list that can be anything from two months to a year. And in that time, they would hear nothing.”
He describes the nightmare of bureaucracy that awaits patients: asked to answer 83 questions on a form which takes an hour to complete, then, when they press “send”, having no idea whether it has been read or has just disappeared into the ether.
“It's been rubbish and everyone has known it's been rubbish, but it's been the NHS so everyone's thankful that there is a service there.”
So Wysa’s first innovation was a chatbot that guided people through that complex form which had to be navigated before they could be referred for talking therapy. Ross says it offers something more conversational than just a form, allowing people to ask questions such as ‘when is the service going to see me?’ or ‘what are the opening hours?’ or ‘where is it based?’ or ‘am I going to be seen by a male doctor?’ “All that sort of stuff you can throw at it and it will respond appropriately, so it's a nicer experience.”
In this kind of service, where there may still be a long wait once the patient has been through the referral process and been given a date for an appointment, the doctor will want the prospective patient to do some preparatory work in advance, exploring, for instance, what depression is and what its symptoms are. Here Wysa has another chatbot to act as a friendly tutor as you learn about your illness.
“That's available 24/7. It's not a human, it is an AI. So it's non-stigmatised. You can ask the difficult questions, you can ask the embarrassing questions, knowing that there's nobody reading your answers. This is your own personal reflection room.”
So up until now in the process, the AI isn’t replacing humans because there weren’t any - in fact it is introducing a degree of personalisation that wasn’t there before. Clinicians won’t mind if the chatbots are smoothing the patient’s path to them. But the question is what happens next - could an AI actually be a therapist?
Ross O’Brien is clear that will happen: “In the future, the AI companies are going to start to offer therapy as well. So rather than seeing a human and having a talking therapy intervention with a human therapist, you'll be able to do it with the AI.”
That throws up all sorts of ethical issues, and it seems likely that many patients will resist the idea of talking to a machine rather than a doctor. But Ross says there is a sizeable minority who might prefer to talk to an AI about some matters - he points to some people with autism who find conversation with a doctor difficult.
He also believes that we are all going to get more comfortable talking to chatbots in other areas of our lives, and so may become less resistant to an AI therapist - especially one that can see us straight away when an appointment with a human doctor is months away.
As he moves on from Wysa to focus on running his own health tech consultancy, he reflects on the challenges faced by any business wanting to engage with the NHS and get it to pay for an innovative product:
“The process in order to get to market is an excruciatingly long one. And I think companies genuinely need to have deep pockets to be able to weather that storm and get into the NHS.” And that makes it hard, says Ross O’Brien, for some of the best ideas to break through: “True innovation comes from the doctor who's fed up with not being able to help his patients or her patients … they’re then finding that they're having to invest a lot of their own personal money or finding money from connections.”
Many of these doctor-led startups won’t survive and quite a few that do may find it is easier to get their ideas off the ground in overseas markets than in the UK.
But Ross O’Brien is still optimistic that artificial intelligence can make a big difference to mental health services. After all, he says, the scale of the challenge demands a radical approach: “The waiting lists are too long, the processes are too repetitive - all of that can be helped and supported by AI. Equally the patient's experience is not good enough yet - there's so much more that can be done with AI to support the patient experience.”
In the end, it is economic necessity which may drive the adoption of AI in mental health services and in many other parts of the NHS. But let’s hope that the needs of patients aren’t forgotten in the rush to draft in robot therapists.