Can AI be trusted with our mental health?
Covid-19 has amplified the need for easy access to mental health resources, and some think artificial intelligence is the answer: With the proliferation of mental health apps and websites, individuals seeking mental health support are more likely to find themselves in conversation with a chatbot than a real human. These apps are both convenient and affordable, and are making therapy accessible to a larger subset of society. As a cheap and handy alternative, will therapy apps eventually make human therapists redundant? And can AI be trusted with our mental health? Experts interviewed by the Wall Street Journal broadly agreed that there are potential uses for AI in mental health services, but that bots are unlikely to replace humans any time soon.
AI-powered therapy tools can make mental health services more accessible and affordable: Financial troubles are a common source of stress and mental health issues, with studies pointing to a direct correlation between low household income and mental health problems, meaning that the people who need this kind of care are often unable to afford it. Factor in the busyness of modern life, and sometimes even those who can afford it feel they do not have the time to go to a physical appointment. Interestingly, research has also shown that people are more willing to open up when they believe they are interacting with a bot rather than a real human, possibly because they feel less anxious about being judged.
But leaving a vulnerable person in the care of a bot may not be entirely responsible: the risk is that AI-powered bots could dispense inappropriate information, amplifying bad advice to people in a suggestible state of mind. At the moment, both the UK’s NHS and the American Psychiatric Association in the US recommend using apps only as an “adjunct” to therapy, not as a main treatment, which should be overseen by a human. But that could soon change. “If chatbots prove safe and effective, we could see a world where patients access treatment and decide if and when they want another person involved,” said the Stanford School of Medicine’s Adam Miner.
And then there are the privacy concerns: Research has shown that some smartphone apps used for depression and smoking cessation have, in the past, shared data with commercial third parties without accurately disclosing that this would happen, which is a major “red flag” for the mental health app industry, according to John Torous, director of the digital-psychiatry division at Beth Israel Deaconess Medical Center. Though most countries have developed robust laws guarding patient confidentiality in a conventional therapy setting, how those protections can be extended to apps and websites has yet to be determined.
Passive smartphone-sensing apps can, however, be used to gather data that can help doctors understand why some people develop mental health issues in the first place, and to detect critical moments that may require intervention and alert humans to them. “Using new data combined with AI will likely help us unlock the potential of creating new personalized and even preventive treatments,” said Torous.
Instead of replacing therapists, AI can help “guide” the therapy process by providing tools for self-reflection between therapy sessions, or by monitoring and tracking patterns in users’ mood changes, which could in turn lead to more accurate diagnoses. Apps that use cognitive behavioral therapy (CBT) to guide users out of negative thought patterns, often through a conversation with an AI chatbot, are especially popular.
And easier access to mental health apps can increase the demand for in-person care: the accessibility and anonymity of apps may ease people who were uncomfortable seeking mental health support into the idea of regular therapy. AI-powered mental health resources may actually boost, rather than chip away at, the demand for human therapists, as the practice becomes more normalized and mental health apps seek to bolster their services by adding human psychologists and psychiatrists to their teams.