According to the National Institute of Mental Health (NIMH), major depression is one of the most commonly diagnosed mental health conditions in the United States. Depression may be common, but many people are not able to access treatment. According to a report from the Substance Abuse and Mental Health Services Administration (SAMHSA), only 63% of adults identified as having had at least one major depressive episode reported receiving any kind of treatment. For teens with major depression, the numbers are even more concerning: According to SAMHSA’s survey, only 40% of teens who had at least one major depressive episode received treatment. This gap between how many people have depression and how many receive treatment has prompted the development of alternative pathways to mental health care, many of which leverage technological advancements. Here are a few ways AI could be used to help people manage depression, as well as what the technology can’t do. For more mental health resources, see our National Helpline Database.
What AI Can Offer to Mental Health Treatment
In the model of emotionally focused therapy, a person who is emotionally safe is accessible, responsive, and engaged. These are the qualities researchers and programmers in the field of AI hope to bring to technology-based mental health care.
Accessibility and Responsiveness
AI apps can provide unprecedented accessibility by being available 24/7 at little to no cost. The programs collect data that allow them to build a level of therapeutic rapport with users and offer relevant responses. This engagement deepens as the program learns more about the user and collects additional data. Through this process, the app can more accurately detect a user’s individual behavioral and emotional health needs and work toward meeting their goals.
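To make the idea of data-driven personalization concrete, here is a minimal, hypothetical sketch in Python of how such an app might store check-ins and use them to choose a more relevant reply over time. The names, keyword rules, and suggested replies are invented for illustration and are not taken from any real product.

```python
# Hypothetical sketch: how an AI wellness app might accumulate check-in data
# over time and use it to choose an increasingly relevant response.
# All names and rules here are illustrative, not drawn from any real app.
from dataclasses import dataclass, field
from typing import List


@dataclass
class UserProfile:
    """Grows as the app learns more about the user."""
    check_ins: List[str] = field(default_factory=list)

    def record(self, message: str) -> None:
        self.check_ins.append(message)

    def mentions(self, keyword: str) -> int:
        # Count how often a theme has come up in past check-ins.
        return sum(keyword in m.lower() for m in self.check_ins)


def choose_response(profile: UserProfile, message: str) -> str:
    profile.record(message)
    # The more data collected, the more specific the suggestion can be.
    if profile.mentions("sleep") >= 3:
        return "Sleep keeps coming up. Want to try a wind-down exercise tonight?"
    if profile.mentions("lonely") >= 2:
        return "You've mentioned feeling lonely a few times. Want to talk about it?"
    return "Thanks for checking in. How has your mood been today?"


profile = UserProfile()
print(choose_response(profile, "I barely got any sleep again"))
```

Real apps replace the keyword counts above with trained language models, but the underlying loop of collecting data and tailoring responses is the same basic idea.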
Privacy and Anonymity
Many people value their privacy and appreciate the anonymity technology can provide. Given the sensitive and personal nature of what is shared in counseling and therapy, AI-based programs that can be used anonymously can be especially appealing.
Access and Stigma
AI-based depression resources also hope to address other barriers to mental health treatment, such as a lack of providers in rural areas, the stigma associated with mental illness, and feelings of fear, guilt, or shame that may stop someone with depression from reaching out to others.
Benefits of AI
There are several potential benefits to using AI to help manage depression, from improved access to more coordinated care, but the technology also has some limitations.
Convenience
Traditional outpatient counseling typically involves meeting with a therapist for a 50-minute session once or twice per week. To make an appointment, you need to find a time in a therapist’s schedule that suits your own. Attending sessions consistently may require asking for time off from work or school or making other arrangements (such as childcare). You also need to factor in the time it takes to commute to and from an office, which will vary depending on how close counseling resources are to where you live.

Most mental health apps and platforms provide resources users can access anywhere from a smartphone, tablet, or laptop, and at any time. You can use the apps day or night, on weekends, holidays, or whenever works for your schedule. Compared with therapy fees, plus the indirect costs of missed work, commuting, and other needs, these apps are also low-cost or free alternatives.
Connection
In the field of mental health, a major predictor of a positive counseling experience is the rapport developed between clients and counselors. The relationship built in therapy establishes trust and creates a safe space to express challenging emotions and discuss difficult experiences. Forging connection is a key part of the healing process, and those developing artificial intelligence-based apps draw on a wealth of data and research to create programs that help people feel connected and understood.

In addition to building an emotional connection, AI can connect people to help in situations where they otherwise wouldn’t have access. In rural and remote areas, mental health resources can be few and far between, and the resources that are available may not have the capacity to meet the demands of the whole community. Mental health resources designed using artificial intelligence can be a lifeline for people in areas where there is little to no affordable, accessible, and available help.
Anonymity
As much as humans are wired for connection and can recognize the benefits of personal counseling and therapy for treating depression, obstacles can prevent people from getting the help they need. The stigma attached to mental illness and its treatment has lessened to a degree but is still present in society, and it can make it harder for people to feel empowered and safe in seeking treatment.

Mobile apps and artificial intelligence platforms allow people to receive mental health support within their own living space, which reduces the chance of others learning they are receiving care, such as by crossing paths with an acquaintance on the way to a counselor’s office. Much of what would be discussed in counseling for depression involves emotional hurt, painful experiences, fear, and other tough topics that leave people feeling vulnerable. The ability to maintain one’s privacy when discussing mental health, coupled with the flexibility to do so when it’s convenient and from one’s own space, makes these AI-designed apps for depression appealing and potentially useful. However, there are some limitations to the technology that should be kept in mind.
Limitations of AI
Many of the obstacles that can prevent people from seeking and receiving help for depression are removed, or at least minimized, by AI-driven apps and programs. Still, these tools cannot diagnose conditions or replace the clinical judgment of a trained provider, and it’s essential that clinicians are educated on how to use AI technology with their patients. A global survey of more than 700 psychiatrists found that while the majority believe AI will become a regular part of their clinical practice (particularly for documentation), many were not convinced of the technology’s therapeutic benefits, especially when weighed against the potential risks.
AI Bots and Apps for Depression
Below are a few examples of popular AI-based technologies available to help people manage depression. Although these apps cannot diagnose or treat a mental health condition, they may be a useful complement for people working with a doctor or mental health practitioner.
WoeBot
WoeBot launched in the summer of 2017 and is referred to as an automated conversational agent, also known as a chatbot. It is designed to offer convenient care to those struggling with depression by mimicking human conversation, offering self-help guidance and companionship to its users. The program can share information and resources with you, such as videos and exercises, based on what it determines you need at that time. WoeBot can be used anonymously on iOS, and you can also chat with it through Facebook Messenger. As you continue to chat with WoeBot, it collects data through natural language processing (NLP) and uses this information to get to know you better. This allows the program to more accurately detect and meet your emotional needs at a given time, offering personalized resources, self-help guidance, information, and support related to your concerns.
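As a rough illustration of how a chatbot can map what a user writes to a suggested resource, the sketch below uses simple keyword rules in Python. This is not how WoeBot’s NLP actually works; real systems rely on trained language models, so treat this only as a conceptual example with made-up phrases and resource names.

```python
# Simplified, hypothetical sketch of routing a user's message to a resource.
# Keyword rules stand in for the NLP a real chatbot would use.
RESOURCES = {
    "negative_self_talk": "Video: spotting and reframing all-or-nothing thoughts",
    "low_energy": "Exercise: schedule one small, pleasant activity for today",
    "general": "Check-in: rate your mood from 1 to 10",
}


def detect_need(message: str) -> str:
    text = message.lower()
    if any(phrase in text for phrase in ("i'm a failure", "i always mess up", "worthless")):
        return "negative_self_talk"
    if any(phrase in text for phrase in ("no energy", "can't get out of bed", "exhausted")):
        return "low_energy"
    return "general"


def suggest_resource(message: str) -> str:
    return RESOURCES[detect_need(message)]


print(suggest_resource("I always mess up everything I try"))
# -> Video: spotting and reframing all-or-nothing thoughts
```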
Wysa
Wysa is an artificial intelligence-based, “emotionally intelligent” bot that the company says can “help you manage your emotions and thoughts.” Like WoeBot, Wysa is designed around principles of cognitive-behavioral therapy (CBT) to help users challenge and change thoughts and behaviors. Wysa also incorporates dialectical behavior therapy (DBT), meditation practices, and motivational interviewing into chats. Wysa can be used anonymously, though as with other AI apps, it collects data as users chat to improve the accuracy with which it interprets a user’s behavioral and mental health needs and goals. Although the chatbot service is free, the company that developed Wysa offers a paid monthly subscription that allows users to interact with a human Wysa coach.
Tess
X2AI’s Tess is described as “a psychological AI that administers highly personalized psycho-education and health-related reminders on demand.” The program is a text-based messaging conversation that users can access through Facebook Messenger, SMS texting, web browsers, and smartphone apps. Through consistent messaging, the data Tess accumulates informs its responses, with the aim of providing guidance that is most relevant to the user. In addition to tracking a user’s goals, Tess suggests interventions influenced by the framework of cognitive-behavioral therapy.
Youper
This free app for iOS and Android uses AI chatbot technology to help users talk through their symptoms, behaviors, and patterns. The company calls Youper an “emotional health assistant” that provides personalized feedback and insights based on what it learns in daily text-based conversations with users. At the beginning of a chat session, users can communicate what they need from Youper. For example, a user might want help managing chronic depression or may need tips for feeling less anxious in the moment. As users chat with Youper, the bot draws on various psychological techniques (such as CBT) to provide helpful guidance based on their needs and goals. At the end of a conversation, Youper provides a summary that can help users track symptoms, moods, and patterns over time.
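The sketch below illustrates, in a simplified and purely hypothetical form, what an end-of-conversation mood summary could look like under the hood. The data model and numbers are invented for the example and do not reflect how Youper actually stores or reports information.

```python
# Illustrative sketch of an end-of-conversation mood summary.
# The log entries are made-up example data, not real user records.
from datetime import date
from statistics import mean

# Each entry: (date of chat, self-reported mood on a 1-10 scale)
mood_log = [
    (date(2024, 3, 1), 4),
    (date(2024, 3, 2), 5),
    (date(2024, 3, 3), 3),
    (date(2024, 3, 4), 6),
]


def summarize(log):
    scores = [score for _, score in log]
    return (
        f"Check-ins this week: {len(scores)} | "
        f"Average mood: {mean(scores):.1f}/10 | "
        f"Trend: {'up' if scores[-1] > scores[0] else 'down or flat'}"
    )


print(summarize(mood_log))
```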
A Word From Verywell
Apps, platforms, and other technologies that use artificial intelligence to help people manage depression and other mental health conditions offer many benefits, but they also have limitations. Although these products address gaps in access and may help people overcome barriers to seeking mental health support, they are not intended to replace clinicians in diagnosing conditions or prescribing medical or psychiatric treatment.