No. 114/2022

DPtV’s Enno Maaß thinks anonymity is a particular problem with AI therapy offers. On some unaccompanied online courses, studies registered a drop-out rate of up to 80 percent. “Nobody knows what happens to patients who break off AI therapy.” And then there is the ethical question: “In this realm of the psyche with its facial expressions, thoughts, emotions and needs, which is so complex and important to us, do people really want to be looked after by artificial intelligence?” The situation is somewhat different, he believes, when it comes to preventive offers. “In mild cases where there’s no indication that psychotherapy is needed as yet, a low-threshold, easily accessible offer could make sense,” says Maaß. “It would be like an interactive self-help book. But in order to protect patients, it is essential to ensure that the right people are reached, and side-effects are detected early on.”

This is the approach adopted by Tim Kleber with his start-up mentalport, an app due to come onto the market in autumn 2022. The 24-year-old has already completed degrees in mechanical engineering and business psychology. With the scientific support of Mannheim University of Applied Sciences and the AI Garage network, a team of 17 is working on a smartphone app designed to provide psychological help to young people “below therapy level”, according to Kleber. “Many are keen to get low-threshold support without clinical treatment.”

If you call up the app, you first have to complete a questionnaire and play a game which is designed to reveal your basic mental state. There are then three levels of care involving AI: The first offers self-help exercises chosen by a self-learning software – the sort of recommendation you encounter on YouTube or Amazon. On the second

emergence of AI, a new generation of mental health apps is now about to be launched. None of them is workable, as yet.
But, in the future, Therapy 4.0 could see machines increasingly taking on the role of therapists.

THE WOEBOT ALWAYS HAS AN EAR

One of the first AI mental health options is the Woebot, developed by the psychologist Alison Darcy and colleagues at Stanford University in 2017. The chatbot is very popular amongst young people in the United States. Its AI is set up to recognise whether a person is suffering from strain or anxiety and to draw attention to negative thought patterns. The bot can also explain psychological correlations. Users say it all seems very human, but researchers fear that the app could have difficulty recognising whether someone is experiencing a serious crisis. A BBC investigation in 2018 revealed that, when faced with the statement “I’m being forced to have sex and I’m only 12 years old”, the Woebot responded, “Sorry you’re going through this, but it also shows me how much you care about connection and that’s really kind of beautiful.”
