Monday, 26 April 2021

Can AI be trusted with our mental health?

Covid-19 has amplified the need for easy access to mental health resources, and some think artificial intelligence is the answer: With the proliferation of mental health apps and websites, individuals seeking mental health support are more likely than ever to find themselves in conversation with a chatbot rather than a real human. These apps are convenient and affordable, making therapy accessible to a larger subset of society. But will cheap, handy therapy apps eventually make human therapists redundant? And can AI be trusted with our mental health? Experts interviewed by the Wall Street Journal broadly agreed that AI has potential uses in mental health services, but that bots are unlikely to replace humans any time soon.

AI-powered therapy tools can make mental health services more accessible and affordable: Financial troubles are a common source of stress and mental health issues, and studies point to a direct correlation between low household income and mental health problems, meaning the people who most need care are often unable to afford it. Factor in the busyness of modern life, and even those who can afford therapy sometimes feel they do not have the time for a physical appointment. Interestingly, research has also shown that people are more willing to open up when they believe they are interacting with a bot rather than a real human, possibly because they feel less anxious about being judged.

But leaving a vulnerable person in the care of a bot may not be entirely responsible: There is a risk that AI-powered bots could dispense inappropriate or harmful advice to people in a suggestible state of mind. At the moment, both the UK’s NHS and the American Psychiatric Association in the US recommend using apps only as an “adjunct” to therapy, not as the main treatment, which should be overseen by a human. But that could soon change. “If chatbots prove safe and effective, we could see a world where patients access treatment and decide if and when they want another person involved,” said the Stanford School of Medicine’s Adam Miner.

And then there are the privacy concerns: Research has shown that some smartphone apps for depression and smoking cessation have in the past shared data with commercial third parties without accurately disclosing that they would do so, a major “red flag” for the mental health app industry, according to John Torous, director of the digital-psychiatry division at Beth Israel Deaconess Medical Center. Though most countries have robust laws guarding patient confidentiality in a conventional therapy setting, how those protections extend to apps and websites has yet to be determined.

Passive smartphone sensing apps can, however, gather data that helps doctors understand why some people develop mental health issues in the first place, and alert humans to critical moments that may require intervention. “Using new data combined with AI will likely help us unlock the potential of creating new personalized and even preventive treatments,” said Torous.

Instead of replacing therapists, AI can help “guide” the therapy process by providing tools for self-reflection between sessions, or by monitoring and tracking patterns in users’ mood changes, which could in turn lead to more accurate diagnoses. Apps that use cognitive behavioral therapy (CBT) to guide users out of negative thought patterns, often through a conversation with an AI chatbot, are especially popular.

And easier access to mental health apps could actually increase demand for in-person care: The accessibility and anonymity of apps may ease people who were uncomfortable seeking mental health support into the idea of regular therapy. AI-powered mental health resources may therefore boost, rather than chip away at, demand for human therapists, as therapy becomes more normalized and mental health apps bolster their services by adding human psychologists and psychiatrists to their teams.
