COVID-19 may never make it to everyone’s lungs, but it will leave no heart untouched. According to a recent study, more than 40% of the global population have experienced mental health issues due to the pandemic. Isolation, unemployment, travel restrictions, remote working, and terror on the news are taking their toll.
Crisis call centers are at capacity. In April, a U.S. federal emergency hotline reported a more than 1,000% increase in calls compared with the same period last year.
Even before the pandemic, the limited accessibility and inefficiency of counseling were a serious problem. A U.S. federal survey shows that fewer than half of the people suffering from mental illness were able to get help in the past year.
The disaster has boosted the search for solutions, and artificial intelligence (AI) looks especially promising.
Let’s take a look at the current state and potential of AI in improving mental health help.
1. Conversational support
The ability of AI to imitate humans by conducting a meaningful dialogue is invaluable for counseling hotlines.
Conversational AI runs on a predefined flow and two AI techniques: natural language processing (NLP) and machine learning (ML). A bot powered by this technology can recognize a person’s voice or text input, ask questions to gather information, and reply to requests.
You can text with it (a chatbot) or speak with it (a voicebot). The latter is particularly interesting for universal hotlines: the elderly might not have the technical knowledge required to use texting apps, and the underprivileged often have no access to the internet.
Using ML, expert psychiatrists can train such bots for tasks like helping a caller calm down or performing initial diagnostics.
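To make this concrete, here is a minimal sketch of how a text-based support bot might combine simple intent recognition with a predefined flow. The intents, keywords, and responses are illustrative placeholders, not clinical guidance or any hotline’s actual script.

```python
# A minimal sketch of a scripted support chatbot; intents, keywords, and
# responses are hypothetical examples, not clinical guidance.
INTENTS = {
    "panic": (["panic", "can't breathe", "heart racing"],
              "Let's slow down together. Breathe in for four counts, hold for "
              "four, and breathe out for six. I'm right here with you."),
    "talk_to_human": (["counselor", "real person", "human"],
                      "I'm connecting you with a counselor now. You won't lose "
                      "your place in the queue."),
}
FALLBACK = "I'm here to listen. Can you tell me a bit more about how you're feeling?"

def reply(message):
    """Keyword-based intent detection; a real bot would use a trained NLP model."""
    text = message.lower()
    for keywords, response in INTENTS.values():
        if any(keyword in text for keyword in keywords):
            return response
    return FALLBACK

print(reply("My heart is racing and I can't breathe"))
print(reply("I just want to talk to a real person"))
```

In a production system, the keyword lookup would be replaced by an ML intent classifier, but the overall flow of recognize, respond, escalate stays the same.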
A voicebot can speak with a caller to match them with the right counselor and keep them engaged while waiting for an operator. Most interactive voice response (IVR) systems are annoying because they run you through audio menus. By contrast, an AI bot can provide gentle, human-like guidance in a call steering mode.
This technique is known as conversational IVR.
Consumer AI products also help lessen the load on crisis hotlines. Replika, a popular self-help AI chatbot, has recently reported a 35% increase in traffic.
2. Intelligent process automation
Intelligent process automation (IPA) can help hotline operators focus on counseling and process more requests.
The IPA concept is an AI-driven upgrade of robotic process automation (RPA). RPA automates repetitive tasks based on preprogrammed rules. IPA improves on it by adding AI capabilities such as reading, learning, analysis, and decision making.
Here is one simple example: an AI voicebot listens to the caller and asks questions. An NLP algorithm transcribes the caller’s input and outlines the key points. RPA mechanisms then enter this information into the patient’s electronic medical record in a structured format.
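As a rough illustration of that pipeline, the sketch below reduces the NLP step to simple pattern matching over an already-transcribed call and the RPA step to filing a structured record. The field names, symptom list, and record format are assumptions made for the example.

```python
import re

# Illustrative only: a real pipeline would use a speech-to-text service and a
# proper NLP model; the field names and record format here are assumptions.
def extract_key_points(transcript):
    """Pull a few structured fields out of a call transcript."""
    record = {"raw_transcript": transcript}
    age = re.search(r"\b(\d{1,2})\s*(?:years old|y/?o)\b", transcript, re.I)
    if age:
        record["age"] = int(age.group(1))
    for symptom in ("insomnia", "anxiety", "panic attacks", "low mood"):
        if symptom in transcript.lower():
            record.setdefault("reported_symptoms", []).append(symptom)
    return record

def write_to_emr(record):
    """Stand-in for the RPA step that files the record into the EMR system."""
    print("Filing structured record:", record)

transcript = "Caller says she is 34 years old and has had insomnia and panic attacks for two weeks."
write_to_emr(extract_key_points(transcript))
```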
Hotline operators are often volunteers, non-professionals who have received essential counseling training. IPA can help ensure their adherence to best practices and guidelines.
During the call, an AI listens to what the caller says and interprets it as one of the pre-scripted requests. RPA then matches the request with a relevant response and displays suggestions for the operator on their screen. This helps keep the conversation within a pre-scripted flow developed by experts.
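A simplified version of that matching step might look like the sketch below, which compares the caller’s words against a small, hypothetical playbook of pre-scripted requests using TF-IDF similarity and surfaces the closest suggestion for the operator.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical playbook: pre-scripted caller requests mapped to suggested guidance.
PLAYBOOK = {
    "I feel like hurting myself": "Ask directly about safety and means; stay on the line.",
    "I lost my job and can't cope": "Acknowledge the loss; explore immediate stressors.",
    "I can't sleep and feel anxious all the time": "Normalize the reaction; suggest grounding techniques.",
}

requests = list(PLAYBOOK)
vectorizer = TfidfVectorizer().fit(requests)
request_vectors = vectorizer.transform(requests)

def suggest_response(caller_utterance):
    """Match what the caller said to the closest pre-scripted request."""
    sims = cosine_similarity(vectorizer.transform([caller_utterance]), request_vectors)[0]
    best = sims.argmax()
    return f"Suggested guidance: {PLAYBOOK[requests[best]]} (match score {sims[best]:.2f})"

print(suggest_response("I haven't slept in days and my anxiety is through the roof"))
```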
However, IPA has yet to be widely adopted by counseling services. As they explore its possibilities, they will find ever more ways to improve the quality and efficiency of their operations.
3. Voice and text-based diagnostics
Aside from understanding the intent of a caller or texter, AI can hear and read what a human cannot.
Intelligent algorithms can analyze language and voice for patterns linked to mental disorders and certain emotional states.
According to a study by the University of California, the vocal range of patients with depression is lower than normal. Their speech also features more pauses, starts, and stops. A “breathy” voice quality is a sign that a person is likely to re-attempt suicide.
NYU School of Medicine has developed an algorithm that identifies post-traumatic stress disorder (PTSD) from speech with 89% accuracy. The system captures over 40,000 speech features associated with the disorder.
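For a sense of what such speech analysis involves, here is an illustrative sketch that extracts two of the markers mentioned above, pitch range and pause count, from an audio file. The parameters and the file path are assumptions; this is not the University of California or NYU algorithm.

```python
import librosa
import numpy as np

# Illustrative feature extraction only; the features and parameters are
# assumptions, not a reproduction of any published diagnostic algorithm.
def voice_markers(path):
    y, sr = librosa.load(path, sr=16000)
    # Fundamental-frequency (pitch) track; a narrow range can be a marker of interest.
    f0, voiced_flag, _ = librosa.pyin(y, fmin=65.0, fmax=400.0, sr=sr)
    pitch_range = float(np.nanmax(f0) - np.nanmin(f0)) if np.any(voiced_flag) else 0.0
    # Count pauses as gaps between non-silent intervals.
    intervals = librosa.effects.split(y, top_db=30)
    pauses = max(len(intervals) - 1, 0)
    return {"pitch_range_hz": pitch_range,
            "pause_count": pauses,
            "duration_s": len(y) / sr}

print(voice_markers("call_recording.wav"))  # hypothetical file path
```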
NLP can also quickly spot lethal words: those that denote the ideation, the plan, the timing, or the means to harm oneself or others. Callers and texters using such words are one step away from a tragedy and need to be attended to first. Crisis Text Line, a text-based counseling service, applies AI to identify such users and responds to them in less than 38 seconds.
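A toy version of this triage logic is sketched below: messages containing higher-risk phrases jump the queue. The phrase lists are placeholders, not a clinical lexicon, and real systems rely on trained models rather than keyword lists.

```python
import heapq
import itertools

# Illustrative risk triage: the phrase lists are placeholders, not a clinical lexicon.
HIGH_RISK = ["kill myself", "end it tonight", "have the pills", "bought a gun"]
MEDIUM_RISK = ["can't go on", "no way out", "hopeless"]

def risk_score(message):
    text = message.lower()
    if any(phrase in text for phrase in HIGH_RISK):
        return 2
    if any(phrase in text for phrase in MEDIUM_RISK):
        return 1
    return 0

# Higher-risk texters jump the queue; ties keep arrival order.
queue, counter = [], itertools.count()
for msg in ["I feel hopeless lately",
            "I have the pills in my hand right now",
            "Just need someone to talk to"]:
    heapq.heappush(queue, (-risk_score(msg), next(counter), msg))

while queue:
    _, _, msg = heapq.heappop(queue)
    print("Next in line:", msg)
```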
4. Visual diagnostics
AI systems can also see what the eye cannot, thanks to computer vision.
We once created a computer vision algorithm that analyzes low-quality photographs of road signs and recognizes the text on them. But AI can also recognize emotions, helping assess mental health and personality traits.
In addition, computer vision can find body language patterns that signal a particular disorder.
Consider this: people suffering from depression don’t move their heads as often and don’t hold a smile as long as others do. Imagery can reveal depression in other ways, too: a research team led by Andrew Reece and Christopher Danforth analyzed 43,950 Instagram photos from 166 users and identified depression with 70% accuracy.
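As a rough sketch of the kind of pixel-level features such studies examine (hue, saturation, brightness), the snippet below summarizes a single photo; the file path is hypothetical and any downstream classifier is left out.

```python
import numpy as np
from PIL import Image

# Summarize simple color features of a photo; illustrative only, not the
# actual feature set or model used in the study.
def photo_features(path):
    hsv = np.asarray(Image.open(path).convert("HSV"), dtype=float) / 255.0
    hue, saturation, brightness = hsv[..., 0], hsv[..., 1], hsv[..., 2]
    return {
        "mean_hue": float(hue.mean()),
        "mean_saturation": float(saturation.mean()),
        "mean_brightness": float(brightness.mean()),
    }

print(photo_features("instagram_post.jpg"))  # hypothetical image file
```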
This feature of AI is especially useful as crisis hotlines increasingly turn to video calls as a communication channel.
5. Reading internal signals
Wearable-mounted sensors powered by AI can improve remote medical supervision for counseling patients.
Our team had a chance to create an ML algorithm that does exactly that: it recognizes emotions by analyzing data from a sensor mounted in a wearable.
By analyzing heart rate and other vitals, AI-powered sensors can identify when a person is in a crisis. The wearable then sends an alert to the doctor so they can take action. Intervening in time can help prevent patients from harming themselves or others.
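Here is a minimal sketch of how such crisis detection might work on a stream of heart-rate readings, flagging values far above a patient’s recent baseline. The window size, threshold, and alert mechanism are assumptions, not a validated clinical rule.

```python
import statistics

# Illustrative crisis detection on a heart-rate stream; the threshold and the
# alert mechanism are assumptions, not a validated clinical rule.
BASELINE_WINDOW = 20   # number of recent readings used as the baseline
Z_THRESHOLD = 3.0      # how far above baseline counts as anomalous

def check_reading(history, new_bpm):
    """Return True if the new reading should trigger an alert."""
    if len(history) < BASELINE_WINDOW:
        return False
    window = history[-BASELINE_WINDOW:]
    mean = statistics.mean(window)
    stdev = statistics.pstdev(window) or 1.0
    return (new_bpm - mean) / stdev > Z_THRESHOLD

def send_alert(bpm):
    print(f"ALERT: heart rate {bpm} bpm is far above this patient's baseline.")

history = [72, 74, 71, 73, 75, 70, 72, 74, 73, 71,
           72, 75, 74, 73, 72, 71, 74, 73, 72, 75]
for bpm in [76, 74, 118]:          # only the last reading triggers an alert
    if check_reading(history, bpm):
        send_alert(bpm)
    history.append(bpm)
```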
Another way this may work is through self-monitoring. Recognizing that you need help is the first step in mental healthcare, and not an easy one. A wearable can tell you when you should seek counseling based on your body’s signals.
Pharmaceutical company Otsuka offers an interesting approach: aripiprazole tablets with a tiny ingestible sensor made of natural ingredients, combined with a wearable patch and digital interfaces.
Once a patient swallows the pill, the sensor pairs with a patch placed on the body. The patch sends the information to the patient’s mobile app and to a web dashboard used by healthcare providers.
Using AI, the system analyzes the patient’s mood and tracks adherence to the medication schedule.
6. Disorder prediction and prevention
Finally, timely care can reduce hotlines’ workload in the first place, and the predictive capabilities of AI can boost efforts to prevent mental health issues.
An algorithm can analyze huge amounts of medical records, looking for patterns of disorder development. Non-emergency counselors can then reach out to patients flagged as at-risk and take preventive measures. A 2019 study demonstrated 80% accuracy for this method.
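To illustrate the idea, the sketch below trains a simple risk-scoring model on synthetic records and flags high-scoring patients for outreach. The features, labels, and threshold are invented for the example and carry no clinical meaning.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Illustrative risk scoring on synthetic data; features and labels are made up
# for the sketch and carry no clinical meaning.
rng = np.random.default_rng(0)
# Features per record: [prior ER visits, missed appointments, months since last visit]
X = rng.integers(0, 10, size=(500, 3))
y = (X[:, 0] + X[:, 1] > 10).astype(int)   # synthetic "at-risk" label

model = LogisticRegression(max_iter=1000).fit(X, y)

new_patients = np.array([[1, 0, 2], [8, 7, 1]])
risk = model.predict_proba(new_patients)[:, 1]
for record, score in zip(new_patients, risk):
    action = "flag for outreach" if score > 0.5 else "routine follow-up"
    print(record, f"risk={score:.2f}", action)
```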
Two non-profit organizations, Thresholds and DataKind, have manually analyzed information from multiple sources, such as Thresholds’ internal databases, the Illinois Department of Healthcare and Family Services, and the Cook County Jail.
The team managed to create a foundation for risk scoring and prediction, as well as automated reporting and visualization. Adding AI to the process can streamline risk prevention even further.
Conclusion
Pushed by the pandemic, institutions and corporations are increasingly uncovering the benefits of AI for mental health management.
As AI reduces the effort required of human operators, it allows them to process more requests. Today, New York’s NYC Well hotline has had to expand its staff from 104 to 191 to meet demand. Imagine how quickly it could scale with a proper IPA system in place.
Working toward better AI solutions in mental health management will have an impact on everyone: a political gain for authorities, cost-effective operations for counseling services, and, most importantly, healing for people who are in dire need of attention right now.