Record numbers of people are turning to AI chatbots for therapy, reports Anthony Cuthbertson. But recent incidents have uncovered some deeply worrying blind spots in a technology that is out of control
Is it better to have an AI therapist than none at all?
The evidence so far shows that the answer is a resounding “no”. LLM bots have suggested means of suicide to people in crisis and encouraged unhealthy behaviour in people with eating disorders. They are dangerous in such roles and should never be used in place of a therapist.
No therapy is better than a “therapist” that tries to murder you.
I was a physiotherapist, and the AI recommendations for physical/mechanical health feel like someone grabbed a diagnosis from a lucky dip of options. It sounds very professional but doesn’t specifically diagnose the client’s issues.
No. Would you take an untested drug?