Can a chatbot replicate the connection between patient and doctor?

Many mental health apps carry disclaimers


Joseph Weizenbaum, the German-American computer scientist and former professor at the Massachusetts Institute of Technology, is widely recognised as a pioneer of artificial intelligence. During the 1960s, he developed a program called ELIZA, which used word and pattern matching along with natural language processing to simulate the responses of a psychotherapist. According to a recent article by Elisabeth Rosenthal, a senior contributing editor of KFF Health News, ELIZA's success actually “terrified” Weizenbaum: his students would engage with the machine as if it were a genuine therapist. Although he foresaw the rise of powerful AI tools, he maintained that they could never truly excel as therapists.
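ELIZA's core trick, substituting matched fragments of the user's own words into canned, therapist-style prompts, can be conveyed with a brief sketch. The Python snippet below is a loose, hypothetical approximation for illustration only; the rules and pronoun reflections are invented here and are not Weizenbaum's original script.

```python
import re
import random

# Illustrative ELIZA-style rules: match a keyword phrase in the input
# and reflect the captured fragment back as an open-ended question.
RULES = [
    (re.compile(r"\bi feel (.*)", re.IGNORECASE),
     ["Why do you feel {0}?", "How long have you felt {0}?"]),
    (re.compile(r"\bi am (.*)", re.IGNORECASE),
     ["Why do you say you are {0}?", "How does being {0} make you feel?"]),
    (re.compile(r"\bmy (.*)", re.IGNORECASE),
     ["Tell me more about your {0}.", "Why does your {0} concern you?"]),
]

# Pronoun reflection so echoed fragments read naturally ("my job" -> "your job").
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are", "you": "I"}


def reflect(fragment: str) -> str:
    return " ".join(REFLECTIONS.get(word.lower(), word) for word in fragment.split())


def respond(user_input: str) -> str:
    for pattern, templates in RULES:
        match = pattern.search(user_input)
        if match:
            return random.choice(templates).format(reflect(match.group(1)))
    # Fallback when no keyword matches, much as ELIZA fell back on stock prompts.
    return "Please tell me more."


if __name__ == "__main__":
    print(respond("I feel anxious about my exams"))
    # e.g. "Why do you feel anxious about your exams?"
```

Even this toy version shows why users could mistake the program for a listener: it never understands anything, yet its reflected questions feel attentive.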

Today's digital world offers an overwhelming number of self-help apps focused on mental health. According to a market analysis by Grand View Research, the global market for mental health apps reached $5.2 billion in 2022, with a projected compound annual growth rate of 15.9 per cent from 2023 to 2030. Over the past few years, around 20,000 apps have emerged in the mental health space.
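For a sense of what that growth rate implies, the short calculation below compounds the 2022 figure at 15.9 per cent a year through 2030. This is back-of-the-envelope arithmetic based on the numbers quoted above, not a figure reported by Grand View Research.

```python
# Rough projection implied by the cited figures: $5.2 billion in 2022,
# compounded at 15.9 per cent annually until 2030.
market_2022_usd_bn = 5.2          # reported 2022 market size, in $ billion
cagr = 0.159                      # projected compound annual growth rate
years = 2030 - 2022               # projection horizon in years

market_2030_usd_bn = market_2022_usd_bn * (1 + cagr) ** years
print(f"Implied 2030 market size: about ${market_2030_usd_bn:.1f} billion")
# -> about $16.9 billion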

However, clinicians, researchers, and government authorities have raised concerns regarding the effectiveness of these apps. The lack of comprehensive research and expert consultation during the app development process has been a major factor contributing to these concerns. A study published in JMIR Mental Health in 2020 examined 293 apps offering therapeutic treatment for anxiety and/or depression. Of these, only 162 claimed to have an evidence-based theoretical framework in their app store descriptions, and a mere 10 had published evidence supporting their efficacy.

Most mental health apps provide what can be called “structured therapy”, generating workbook-like responses tailored to specific patient problems. However, there are concerns about the potential for unempathetic responses and the harm they may cause, particularly in cases involving suicidal tendencies.

Three years ago, the Journal of the American Medical Informatics Association featured a study highlighting concerns with consumer-facing apps. The study reviewed 74 reports on safety issues related to health apps and identified 80 safety concerns, with 67 of them relating to the quality of information presented and inappropriate responses.

Many of these apps carry disclaimers stating that they are not intended to replace medical or behavioural health services. However, they are still marketed with claims of treating conditions such as anxiety and depression and predicting suicidal tendencies. Hence, it is prudent to refrain from replacing a qualified therapist with a mobile application since the current evidence is inadequate to support the notion that a chatbot can replicate the empathetic connection between a patient and a human healer.
