The therapists using AI to make therapy better
Researchers are learning more about how therapy works by examining the language therapists use with clients. It could lead to more people getting better, and staying better.
Kevin Cowley remembers many things about April 15, 1989. He had taken the bus to the Hillsborough soccer stadium in Sheffield, England, to watch the FA Cup semifinal between Nottingham Forest and Liverpool. He was 17. It was a beautiful, sunny afternoon. The fans filled the stands.
He remembers being pressed between people so tightly that he couldn’t get his hands out of his pockets. He remembers the crash of the safety barrier collapsing behind him when his team nearly scored and the crowd surged.
Hundreds of people fell, toppled like dominoes by those pinned in next to them. Cowley was pulled under. He remembers waking up among the dead and dying, crushed beneath the weight of bodies. He remembers the smell of urine and sweat, the sound of men crying. He remembers locking eyes with the man struggling next to him, then standing on him to save himself. He still wonders if that man was one of the 94 people who died that day.
These memories have tormented Cowley his whole adult life. For 30 years he suffered from flashbacks and insomnia. He had trouble working and was too ashamed to talk to his wife about it. He blocked out the worst of it by drinking. In 2004 one doctor referred him to a trainee therapist, but it didn’t help, and he dropped out after a couple of sessions.
But two years ago he spotted a poster advertising therapy over the internet, and he decided to give it another go. After dozens of regular sessions in which he and his therapist talked via text message, Cowley, now 49, is at last recovering from severe post-traumatic stress disorder. “It’s amazing how a few words can change a life,” says Andrew Blackwell, chief scientific officer at Ieso, the UK-based mental health clinic treating Cowley.
What’s crucial is delivering the right words at the right time. Blackwell and his colleagues at Ieso are pioneering a new approach to mental-health care in which the language used in therapy sessions is analyzed by an AI. The idea is to use natural-language processing (NLP) to identify which parts of a conversation between therapist and client—which types of utterance and exchange—seem to be most effective at treating different disorders.
The aim is to give therapists better insight into what they do, helping experienced therapists maintain a high standard of care and helping trainees improve. Amid a global shortfall in care, an automated form of quality control could be essential in helping clinics meet demand.
Ultimately, the approach may reveal exactly how psychotherapy works in the first place, something that clinicians and researchers are still largely in the dark about. A new understanding of therapy’s active ingredients could open the door to personalized mental-health care, allowing doctors to tailor psychiatric treatments to particular clients much as they do when prescribing drugs.
A way with words
The success of therapy and counseling ultimately hinges on the words spoken between two people. Despite the fact that therapy has existed in its modern form for decades, there’s a surprising amount we still don’t know about how it works. It’s generally deemed crucial for therapist and client to have a good rapport, but it can be tough to predict whether a particular technique, applied to a particular illness, will yield results or not. Compared with treatment for physical conditions, the quality of care for mental health is poor. Recovery rates have stagnated and in some cases worsened since treatments were developed.
Researchers have tried for years to study talking therapy and unlock the secrets of why some therapists get better results than others. Therapy can be as much art as science, relying on the experience and gut instinct of qualified therapists. It’s been virtually impossible to fully quantify what works and why—until now. Zac Imel, a psychotherapy researcher at the University of Utah, remembers trying to analyze transcripts from therapy sessions by hand. “It takes forever, and the sample sizes are embarrassing,” he says. “And so we didn’t learn very much even over the decades we’ve been doing it.”
AI is changing that equation. The type of machine learning that carries out automatic translation can quickly analyze vast amounts of language. That gives researchers access to an endless, untapped source of data: the language therapists use.
Researchers believe they can use insights from that data to give therapy a long-overdue boost. The result could be that more people get better, and stay better.
Blackwell and his colleagues are not the only ones chasing this vision. A US company called Lyssn is developing similar tech. Lyssn was cofounded by Imel and CEO David Atkins, who studies psychology and machine learning at the University of Washington.
Both groups train their AIs on transcripts of therapy sessions. To train the NLP models, a few hundred transcripts are annotated by hand to highlight the role therapists’ and clients’ words are playing at that point in the session. For example, a session might start with a therapist greeting a client and then move to discussing the client’s mood. In a later exchange, the therapist might empathize with problems the client brings up and ask if the client practiced the skills introduced in the previous session. And so on.
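To make that annotation step concrete, here is a minimal sketch of what a hand-labeled transcript could look like, written as a small Python structure. The format and label names are hypothetical, not the schema either company actually uses.

```python
# Hypothetical annotation format for one therapy-session transcript.
# The label set is illustrative only; it is not Ieso's or Lyssn's real schema.
annotated_session = [
    {"speaker": "therapist", "text": "Hi, how have you been feeling this week?", "label": "greeting"},
    {"speaker": "client", "text": "A bit low, but better than last week.", "label": "mood_report"},
    {"speaker": "therapist", "text": "Did you manage the breathing exercise we set as homework?", "label": "review_homework"},
    {"speaker": "client", "text": "I tried it twice, and it helped a little.", "label": "homework_feedback"},
]

# A few hundred sessions labeled like this give the model examples of what
# each kind of utterance looks like.
for turn in annotated_session:
    print(f"{turn['speaker']:>9}: [{turn['label']}] {turn['text']}")
```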
The technology works in a similar way to a sentiment-analysis algorithm that can tell whether movie reviews are positive or negative, or a translation tool that learns to map between English and Chinese. But in this case, the AI translates from natural language into a kind of bar code or fingerprint of a therapy session that reveals the role played by different utterances.
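The sketch below shows the general idea in miniature: a text classifier assigns a role label to each therapist utterance, and the resulting sequence of labels becomes the session’s fingerprint. It uses an off-the-shelf TF-IDF and logistic-regression pipeline with made-up training data and hypothetical label names, not the deep-learning models Ieso and Lyssn actually use.

```python
# Minimal sketch: classify each therapist utterance into a role label, then
# read the sequence of labels as a "fingerprint" of the session. A TF-IDF +
# logistic-regression pipeline stands in for the deep-learning NLP models
# described in the article; the texts and labels below are made up.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_texts = [
    "Hi, how has your mood been this week?",
    "Did you manage to do the homework we set last time?",
    "Did you watch anything good on TV at the weekend?",
    "What goal would you like to work toward next week?",
]
train_labels = ["check_mood", "review_homework", "general_chat", "planning"]

model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
model.fit(train_texts, train_labels)

# "Fingerprint" a new session: one predicted role label per utterance.
session = [
    "How has your mood been since we last spoke?",
    "Shall we go over the homework from last time?",
]
print(model.predict(session))  # a sequence of role labels for the session
```

Even this toy version shows how a session can be reduced to a readout of which roles its utterances played, which is the kind of summary a therapist or supervisor can scan at a glance.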
A fingerprint for a session can show how much time was spent in constructive therapy versus general chitchat. Seeing this readout can help therapists focus more on the former in future sessions, says Stephen Freer, Ieso’s chief clinical officer, who oversees the clinic’s roughly 650 therapists.
Looming crisis
The problems that both Ieso and Lyssn are addressing are urgent. Cowley’s story highlights two major shortcomings in the provision of mental-health care: access and quality. Cowley suffered for 15 years before being offered treatment, and the first time he tried it, in 2004, it didn’t help. It was another 15 years before he got treatment that worked.
Cowley’s experience is extreme, but not uncommon. Warnings of a looming mental-health crisis ignore a basic truth: we’re already in one. Despite slowly receding stigma, most of the people who need help for a mental-health issue still don’t get it. About one in five of us has a mental illness at any given time, yet 75% of mentally ill people aren’t receiving any form of care.
And of those who do, only around half can expect to recover. That’s in the best mental-health systems in the world, says Blackwell. “If we went to a hospital with a broken leg and we were told there was a 50-50 chance of it being fixed, somehow that wouldn’t seem acceptable,” he said in a TED talk last year. “I think we can challenge ourselves to have higher expectations.”
The pandemic has exacerbated the problem but didn't create it. The issue is fundamentally about supply and demand. The demand comes from us, our numbers swelled by one of the most taxing collective experiences in living memory. The problem on the supply side is a lack of good therapists.
This is what Ieso and Lyssn are addressing. According to Freer, people typically come at the supply problem with the assumption that you can have more therapists or better therapists, but not both. “I think that’s a mistake,” he says. “I think what we’re seeing is you can have your cake and eat it.” In other words, Ieso thinks it can increase access to care and use AI to help manage its quality.
Ieso is one of the largest providers backed by the UK’s National Health Service (NHS) that offer therapy over the internet by text or video. Its therapists have so far delivered more than 460,000 hours of cognitive behavioral therapy (CBT)—a commonly used and effective technique that helps people manage their problems by changing the way they think and behave—to around 86,000 clients, treating a range of conditions including mood and anxiety disorders, depression, and PTSD.
Ieso says its recovery rate across all disorders is 53%, compared with a national average of 51%. That difference sounds small—but with 1.6 million referrals for talking therapy in the UK every year, it represents tens of thousands of people who might otherwise still be ill. And the company believes it can do more.
Since 2013, Ieso has focused on depression and generalized anxiety disorder, and used data-driven techniques—of which NLP is a core part—to boost recovery rates for those conditions dramatically. According to Ieso, its recovery rate in 2021 for depression is 62%—compared to a national average of 50%—and 73% for generalized anxiety disorder—compared to a national average of 58%.
Ieso says it has focused on anxiety and depression partly because they are two of the most common conditions. But they also respond better to CBT than conditions such as obsessive-compulsive disorder do. It’s not yet clear how far the clinic can extend its success, but it plans to start focusing on more conditions.
In theory, using AI to monitor quality frees up clinicians to see more clients because better therapy means fewer unproductive sessions, although Ieso has not yet studied the direct impact of NLP on the efficiency of care.
"Right now, with 1,000 hours of therapy time, we can treat somewhere between 80 and 90 clients,” says Freer. “We’re trying to move that needle and ask: Can you treat 200, 300, even 400 clients with the same amount of therapy hours?”
Unlike Ieso, Lyssn does not offer therapy itself. Instead, it provides its software to other clinics and universities, in the UK and the US, for quality control and training.
In the US, Lyssn’s clients include a telehealth opioid treatment program in California that wants to monitor the quality of care being given by its providers. The company is also working with the University of Pennsylvania to set up CBT therapists across Philadelphia with its technology.
In the UK, Lyssn is working with three organizations, including Trent Psychological Therapies Service, an independent clinic, which—like Ieso—is commissioned by the NHS to provide mental-health care. Trent PTS is still trialing the software. Because the NLP model was built in the US, the clinic had to work with Lyssn to make it recognize British regional accents.
Dean Repper, Trent PTS’s clinical services director, believes that the software could help therapists standardize best practices. “You’d think therapists who have been doing it for years would get the best outcomes,” he says. “But they don’t, necessarily.” Repper compares it to driving: “When you learn to drive a car, you get taught to do a number of safe things,” he says. “But after a while you stop doing some of those safe things and maybe pick up speeding fines.”
Improving, not replacing
The point of the AI is to improve human care, not replace it. The lack of quality mental-health care is not going to be resolved by short-term quick fixes. Addressing that problem will also require reducing stigma, increasing funding, and improving education. Blackwell, in particular, dismisses many of the claims being made for AI. “There is a dangerous amount of hype,” he says.
For example, there’s been a lot of buzz about things like chatbot therapists and round-the-clock monitoring by apps—often billed as Fitbits for the mind. But most of this tech falls somewhere between “years away” and “never going to happen.”
“It’s not about well-being apps and stuff like that,” says Blackwell. “Putting an app in someone’s hand that says it’s going to treat their depression probably serves only to inoculate them against seeking help.”
One problem with making psychotherapy more evidence-based, though, is that it means asking therapists and clients to open up their private conversations. Will therapists object to having their professional performance monitored in this way?
Repper anticipates some reluctance. “This technology represents a challenge for therapists,” he says. “It’s as if they’ve got someone else in the room for the first time, transcribing everything they say.” To start with, Trent PTS is using Lyssn’s software only with trainees, who expect to be monitored. When those therapists qualify, Repper thinks, they may accept the monitoring because they are used to it. More experienced therapists may need to be convinced of its benefits.
The point is not to use the technology as a stick but as support, says Imel, who used to be a therapist himself. He thinks many will welcome the extra information. “It’s hard to be on your own with your clients,” he says. “When all you do is sit in a private room with another person for 20 or 30 hours a week, without getting feedback from colleagues, it can be really tough to improve.”
Freer agrees. At Ieso, therapists discuss the AI-generated feedback with their supervisors. The idea is to let therapists take control of their professional development, showing them what they’re good at (things other therapists can learn from) and what they’re not so good at (things they might want to work on).
Ieso and Lyssn are just starting down this path, but there’s clear potential for learning things about therapy that are revealed only by mining sufficiently large data sets. Atkins mentions a meta-analysis published in 2018 that pulled together around 1,000 hours’ worth of therapy without the help of AI. “Lyssn processes that in a day,” he says. New studies published by both Ieso and Lyssn analyze tens of thousands of sessions.
For example, in a paper published in JAMA Psychiatry in 2019, Ieso researchers described a deep-learning NLP model that was trained to categorize utterances from therapists in more than 90,000 hours of CBT sessions with around 14,000 clients. The algorithm learned to discern whether different phrases and short sections of conversation were instances of specific types of CBT-based conversation—such as checking the client’s mood, setting and reviewing homework (where clients practice skills learned in a session), discussing methods of change, planning for the future, and so on—or talk not related to CBT, such as general chat.
The researchers showed that higher ratios of CBT talk correlate with better recovery rates, as measured by standard self-reported metrics used across the UK. They claim that their results provide validation for CBT as a treatment. CBT is widely considered effective already, but this study is one of the first large-scale experiments to back up that common assumption.
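As a rough illustration of that kind of session-level analysis (not the researchers’ actual method or data), the sketch below computes a CBT-talk ratio for each client and correlates it with a binary recovery outcome, using made-up numbers.

```python
# Sketch of the kind of analysis described above, with made-up numbers:
# does the fraction of CBT-coded talk relate to recovery?
from scipy.stats import pointbiserialr

# Hypothetical per-client records: ratio of utterances coded as CBT-related,
# and whether the client met the self-reported recovery threshold (1) or not (0).
cbt_talk_ratio = [0.72, 0.35, 0.64, 0.41, 0.80, 0.55, 0.30, 0.68]
recovered = [1, 0, 1, 0, 1, 1, 0, 1]

# Point-biserial correlation between a binary and a continuous variable.
r, p_value = pointbiserialr(recovered, cbt_talk_ratio)
print(f"correlation r = {r:.2f}, p = {p_value:.3f}")
```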
In a paper published this year, the Ieso team looked at clients’ utterances instead of therapists’. They found that more of what they call “change-talk active” responses (those that suggest a desire to change, such as “I don’t want to live like this anymore”) and “change-talk exploration” (evidence that the client is reflecting on ways to change) were associated with greater odds of reliable improvement and engagement. Not seeing these types of statements could be a warning sign that the current course of therapy is not working. In practice, it could also be possible to study session transcripts for clues to what therapists say to elicit such behavior, and train other therapists to do the same.
This is valuable, says Jennifer Wild, a clinical psychologist at the University of Oxford. She thinks these studies help the field, making psychotherapy more evidence-based and justifying the way therapists are trained.
“One of the benefits of the findings is that when we’re training clinicians, we can now point to research that shows that the more you stick to protocol, the more you’re going to get symptom change,” says Wild. “You may feel like improvising, but you need to stick to the treatment, because we know it works and we know how it works. I think that’s the important thing—and I think that’s new.”
These AI techniques could also be used to help match prospective clients with therapists and work out which types of therapy will work best for an individual client, says Wild: “I think we’ll finally get more answers about which treatment techniques work best for which combinations of symptoms.”
This is just the start. A large health-care provider like Kaiser Permanente in California might offer 3 million therapy sessions a year, says Imel—“but they have no idea what happened in those sessions, and that seems like an awful waste.” Consider, for example, that if a health-care provider treats 3 million people for heart disease, it knows how many got statins and whether or not they took them. “We can do population-level science on that,” he says. “I think we can start to do similar things in psychotherapy.”
Blackwell agrees. “We might actually be able to enter an era of precision medicine in psychology and psychiatry within the next five years,” he says.
Ultimately, we may be able to mix and match treatments. There are around 450 different types of psychotherapy that you can get your insurer to pay for in the US, says Blackwell. From the outside, you might think each was as good as another. “But if we did a kind of chemical analysis of therapy, I think we’d find that there are certain active ingredients, which probably come from a range of theoretical frameworks,” he says. He imagines being able to pull together a selection of ingredients from different therapies for a specific client. “Those ingredients might form a whole new type of treatment that doesn’t yet have a name,” he says.
One intriguing possibility is to use the tools to look at what therapists with especially good results are doing, and teach others to do the same. Freer says that 10 to 15% of the therapists he works with “do something magical.”
“There’s something that they’re doing consistently, with large volumes of clients, where they get them well and the clients stay well,” he says. “Can you bottle it?”
Freer believes the person who treated Kevin Cowley is just that type of therapist. “That’s why I think Kevin’s story was such a powerful one,” he says. “Think of how many years he’s been suffering. Now imagine if Kevin had had access to care when he was 17 or 18.”