Using Artificial Intelligence in Mental Healthcare: Assessing the Pros and Cons

Authored by: Teena Thomas

Research assistance by: Diksha Pandey

Edited by: Kausumi Saha

 

ABSTRACT

This commentary explores the use of Artificial Intelligence (AI) in providing mental healthcare services through chatbot applications. Mental health chatbots, being inexpensive and easily accessible, are increasingly gaining ground as an alternative to therapists and other mental health professionals. The commentary assesses the effectiveness of these chatbots in mitigating mental health crises and weighs their pros and cons. Further, it examines the merit of utilising AI to supplement, as opposed to replace, existing mental healthcare resources and service providers, and the need for caution in doing so.

 

INTRODUCTION

Mental health is a sensitive issue garnering increasing attention and awareness in India and around the world. In spite of this, mental illness remains a taboo subject, with a majority of the population facing a lack of awareness or an unwillingness to discuss the issue within their households. The stigma associated with seeking treatment dissuades people from approaching therapists and accessing other available care resources, despite the fact that mental illnesses such as depression and anxiety cause significant harm to patients' personal and professional lives [1]. This leads to a high treatment gap for mental health problems in India [2].

The treatment gap is a useful indicator for assessing the quality, accessibility and utilisation of healthcare services in a society. In India, the National Mental Health Survey 2015-16 reported an overall treatment gap of 83% for any mental health problem; only 1 in 10 people with mental health disorders receive evidence-based treatment [3]. According to a report released by the World Health Organization (WHO) in 2017, India has only 1.93 mental health workers for every 1,00,000 people. The number of mental health professionals, both governmental and private, was 25,312 for a population of 1.3 billion, less than 0.002% of the population [4]. To put this in perspective, Brazil, with a population of 20,59,62,108, has 6,53,829 (0.31%), and China, with 1,39,70,28,553 people, has 1,22,309 (0.008%) mental health professionals [5] [6]. Even on the expenditure front, the Indian government's total mental healthcare expenditure is INR 4 per person, accounting for a mere 1.93% of the health budget.
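
The per-capita arithmetic behind these comparisons is simple to verify. The short Python sketch below recomputes the shares from the figures quoted above; the population and workforce numbers are the ones cited in this commentary, not independent estimates.

```python
# Back-of-the-envelope check of the WHO Mental Health Atlas 2017 figures quoted above.
countries = {
    # name: (mental health professionals, population)
    "India":  (25_312, 1_300_000_000),
    "Brazil": (653_829, 205_962_108),
    "China":  (122_309, 1_397_028_553),
}

for name, (workers, population) in countries.items():
    share = workers / population * 100          # professionals as a % of population
    per_lakh = workers / population * 100_000   # professionals per 1,00,000 people
    print(f"{name}: {share:.4f}% of population, ~{per_lakh:.1f} per 1,00,000 people")
```

Running this reproduces the ratios cited above: roughly 1.9 professionals per 1,00,000 people for India, against about 317 for Brazil and about 9 for China.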

Some of the reasons for such a high treatment gap in India are a low perceived need for treatment due to limited awareness, socio-cultural beliefs that reinforce shame and stigma around mental health, and insufficient and inefficiently distributed resources [7]. The treatment gap is particularly large in rural areas, where 70% of the population resides but only 25% of the country's health infrastructure is situated [8]. Gender stereotypes, the lack of platforms for dialogue and the absence of certified therapists aggravate the issue.

Financial barriers are also significant in hindering access to mental healthcare in India. Added to the cost of treatment are expenses towards travelling, accommodation, food, medication and follow-up appointments, making it difficult for a large section of the population to afford treatment. Thus, seeking mental healthcare often requires significant class privilege in the current scenario.

While professional treatment is irreplaceable for severe mental health issues such as schizophrenia or dementia, people dealing with problems such as anxiety or depression have begun to look for more feasible options. This is where Artificial Intelligence (AI) makes its foray, in the form of chatbot applications, which simulate human conversation through voice commands, text chats, or both.

 

USE OF AI IN PROVIDING MENTAL HEALTH CARE SERVICES

AI and mental healthcare may seem like opposite ends of a spectrum. Mental healthcare is a domain where human intervention seems inevitable, but this is slowly changing. Virtual counsellors are becoming the go-to for many individuals around the world who cannot afford the services of certified professionals. Mental health chatbots such as Woebot and Wysa utilise AI, Cognitive Behavioural Therapy (CBT), Dialectical Behavioural Therapy (DBT) and Natural Language Processing (NLP) to script responses while interacting with people dealing with stress and anxiety [9] [10] [11] [12] [13]. They also host a range of mindfulness and meditation exercises, and information on better sleep and calming techniques. Several of these features can be accessed free of cost, making them particularly accessible to young students and people without an income.
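
To make the mechanics concrete, the sketch below shows, in a few lines of Python, how an NLP-driven chatbot might map a user's message to a pre-scripted response. It is an illustrative toy only: the intents, keywords and replies are hypothetical and are not taken from Woebot or Wysa, whose production systems rely on far richer language models and clinician-reviewed scripts.

```python
# Illustrative sketch only: a minimal intent-matching chatbot in the spirit of
# CBT-style apps. Intents, keywords and responses are hypothetical, NOT drawn
# from any real product.
from dataclasses import dataclass

@dataclass
class Intent:
    name: str
    keywords: set[str]
    response: str  # pre-scripted; would be clinician-reviewed in a real product

INTENTS = [
    Intent("anxiety", {"anxious", "worried", "panic"},
           "It sounds like you're feeling anxious. Would you like to try a short breathing exercise?"),
    Intent("low_mood", {"sad", "down", "hopeless"},
           "I'm sorry you're feeling low. Can you tell me a bit more about what's on your mind?"),
]

CRISIS_PHRASES = {"suicide", "kill myself", "self harm"}
CRISIS_MESSAGE = ("I can't help with a crisis. Please contact a local helpline "
                  "or emergency services right away.")
FALLBACK = "I'm not sure I understood. Could you say that in a different way?"

def reply(message: str) -> str:
    text = message.lower()
    # Crisis language is escalated first, mirroring the disclaimers such apps carry.
    if any(phrase in text for phrase in CRISIS_PHRASES):
        return CRISIS_MESSAGE
    # Otherwise pick the intent with the most keyword overlap.
    words = set(text.split())
    best = max(INTENTS, key=lambda i: len(i.keywords & words))
    return best.response if best.keywords & words else FALLBACK

if __name__ == "__main__":
    print(reply("I feel anxious and worried about exams"))
```

A real product would, among other things, use a statistical language model rather than keyword overlap, log conversations for clinical review, and escalate crisis language far more conservatively than this toy does.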

The extent of use of such applications is visible in the fact that Wysa alone has over 1 million downloads on the Google Play Store [14]. Wysa also claims a 50% month-on-month customer repeat rate and has reportedly conducted over 19 million conversations to date. Currently, almost half of Wysa's users come from the US, the UK and Canada, while Indians make up about 20% of its user base [15]. Initially, the majority of users were in the 24-30 age group; however, the app is now seeing more users aged 35 years and above [16].

The chief selling point of virtual therapy is anonymity. People can download mental health applications in their personal capacity without needing to inform family members. Wysa, for instance, markets itself as a “4 am friend” which ensures privacy and refrains from judgement and opinion [17]. These features make the platforms particularly appealing to adolescents and young adults, who may find it easier to share their troubles in virtual therapy without fear of judgement.

Further, if trained well, AI can flag signals and patterns in speech that can be difficult for humans to detect. For instance, speaking in a monotone may be a sign of depression, fast speech may indicate mania, and disjointed word choice can be a sign of schizophrenia. Alongside analysing speech data, physical manifestations like 'changes in the brain' can also be studied more closely by AI algorithms. Facebook uses AI-based technology to flag posts that may suggest suicidal thoughts or expressions, which are then analysed by human reviewers [18]. The other advantage of algorithms is their convenience and inconspicuousness. These chatbot-based applications are ready to listen and engage in a conversation anytime and anywhere, with no complications concerning appointment times with therapists. In-person CBT may be helpful for people who have generalised anxiety disorder, but it can also be stressful to invest the time and money without a clear sense of progression.
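
The flagging workflow described above, in which an algorithm screens text and routes concerning content to human reviewers, can be illustrated with a simple classifier. The sketch below is not Facebook's system and uses made-up training examples; it merely shows the shape of such a pipeline, assuming scikit-learn is available and that a properly labelled dataset exists.

```python
# Illustrative sketch of a text-flagging pipeline: a classifier screens posts and
# routes likely-concerning ones to human reviewers. Training examples are
# hypothetical toy data, purely for demonstration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

posts = [
    "I can't see the point of anything anymore",
    "Had a great time at the beach today",
    "Nobody would miss me if I were gone",
    "Looking forward to the weekend",
]
labels = [1, 0, 1, 0]  # 1 = flag for human review, 0 = do not flag

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(posts, labels)

def needs_review(post: str, threshold: float = 0.5) -> bool:
    """Return True if the post should be routed to a human reviewer."""
    prob = model.predict_proba([post])[0][1]
    return prob >= threshold

print(needs_review("I feel like giving up"))
```

In practice, the decisive elements are the quality and breadth of the labelled data and the human review step at the end, not the particular model used.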

Virtual chatbot applications, however, cannot replace the services of medically licensed therapists, and those who own and design these applications are well aware of this. At Wysa, for example, when a conversation or tool concept is written, the first script is reviewed by a clinician, and safety issues are identified where Wysa's responses could be contradicted or could act as a further trigger for people dealing with anxiety or stress [19]. Applications also carry disclaimers about their services. For instance, Wysa's cautionary message is as follows:

“Wysa is not designed to assist with crises such as abuse, severe mental health conditions that may cause feelings of suicide, harm to self and any other medical emergencies. Wysa cannot and will not offer medical or clinical advice. It can only suggest that users seek advanced and professional medical help. Please reach out to your country-specific suicide emergency number at http://www.suicide.org/international-suicide-hotlines.html or crisis helplines and emergency numbers.”  [20]

 

LIMITATIONS

The intersection of Artificial Intelligence and mental healthcare remains controversial. AI is only as strong as the data it is trained on. Machines follow predefined scripts and do not have minds of their own, and hence they cannot empathise or rationalise like human beings. Nor can they be creative with their responses; they can only perform the tasks they are designed to do. As a result, users of Wysa, for instance, often leave reviews such as "You're not getting me" or "You're not understanding me", even though its founders are constantly trying to reduce such instances [21].

Furthermore, voice assistants such as Apple's Siri and Amazon's Alexa are often criticised for responding insensitively when users mention serious and disturbing issues such as domestic violence or rape. These applications are not equipped to respond appropriately to such disclosures and can be potentially harmful to people looking for help. More importantly, increased dependency on artificial intelligence may provide further impetus for mentally ill people to distance themselves socially, even though positive social interaction is a key factor in maintaining sound mental health [22]. As conditions like hikikomori become more prevalent, anything that reduces the need for human interaction must be treated with an appropriate level of scepticism [23].

Of particular concern is the absence of mechanisms to regulate mental healthcare applications. There is also the issue of how securely the sensitive personal details shared with these apps are held. Virtual conversational applications record and store the information divulged to them, and their privacy and confidentiality clauses are dubious at best.

Soumitra Pathare, an Indian psychiatrist who was involved in shaping the Mental Healthcare Act, 2017, has also voiced concerns about allowing such applications into the market without thorough checks. Wysa, in its defence, points to a score of 93% given by ORCHA, an organisation that evaluates the safety and efficacy of health and care applications [24]. ORCHA rated Wysa 86% on data privacy, 100% on clinical assurance and 85.7% on user experience, and notes that a suitably qualified professional was involved in the development of the application. However, under 'user experience', it observes that the application does not appear to have a clear approach to dealing with user issues [25].

 

CONCLUSION

Given these limitations, it is apparent that chatbot applications cannot be relied upon to make accurate diagnoses. They can only be used to bridge the gap or provide minimal support to people who need an outlet to vent their feelings. This support, again, is strictly not meant for severe mental health issues and mostly takes the form of redirection or simple coping mechanisms.

The arrival of such applications also throws into relief the Indian government's failure to address the pressing issues of inadequate, inaccessible and unaffordable healthcare services. In a deficient structure, people resort to other means to address their problems, and chatbot applications provide one such platform. The government needs to provide mental health services at subsidised rates and ensure that a greater number of psychiatrists, psychologists, therapists, counsellors and social workers are available to help people in mental health crises.

When using AI to mitigate mental health problems, awareness is the most important factor: people should fully comprehend what an application can do before using it, and not expect more than it can offer. Secondly, the treatment gap needs to be acknowledged and bridged by allocating sufficient resources for mental health treatment.

The technological singularity refers to a hypothetical point in the future at which technological growth becomes uncontrollable and irreversible. The more aspects of life once handled by humans are handed over to artificial intelligence, the closer we move towards such a future.

 

ENDNOTES

[1] A therapist is a person who treats psychological problems or is skilled in a particular kind of therapy, e.g., a psychologist, psychiatrist or counsellor.

[2] Treatment gap is defined as the number of people with active disease who are not receiving treatment.

[3] P. 122, http://indianmhs.nimhans.ac.in/Docs/Report2.pdf.

[4] https://www.who.int/mental_health/evidence/atlas/profiles-2017/IND.pdf?ua=1.

[5] https://www.who.int/mental_health/evidence/atlas/profiles-2017/BRA.pdf?ua=1.

[6] https://www.who.int/mental_health/evidence/atlas/profiles-2017/CHN.pdf?ua=1.

[7] P. 122, http://indianmhs.nimhans.ac.in/Docs/Report2.pdf.

[8] https://www.hrw.org/report/2014/12/03/treated-worse-animals/abuses-against-women-and-girls-psychosocial-or-intellectual.

[9] Woebot is a “fully automated conversational agent” developed by Woebot Labs in San Francisco in 2017.

[10] Wysa is an AI-enabled ‘Life Coach’ for mental and emotional wellness launched in Mumbai in 2017 by Jo Aggarwal and Ramakant Vempati.

[11] Cognitive Behavioural Therapy (CBT) is a psycho-social intervention that aims to improve mental health.

[12] Dialectical Behavioural Therapy (DBT) is an evidence-based psychotherapy that seeks to treat borderline personality disorder.

[13] Natural Language Processing (NLP) is a branch of artificial intelligence that enables computers to interpret and process human speech and text.

[14] As of 23rd April 2020.

[15] https://inc42.com/startups/how-wysa-is-using-ai-to-solve-the-growing-risk-of-mental-health-problems/.

[16] https://www.livemint.com/companies/start-ups/ai-startup-wysa-bags-2mn-for-its-conversation-bot-on-mental-health-1561005087653.html.

[17] https://www.wysa.io/.

[18] https://medicalfuturist.com/artificial-intelligence-in-mental-health-care/.

[19] http://maneeshjuneja.com/blog/2018/12/12/an-interview-with-jo-aggarwal-building-a-safe-chatbot-for-mental-health.

[20] https://www.wysa.io/.

[21] https://thepolitic.org/an-interview-with-jo-aggarwal-co-inventor-of-wysa/.

[22] https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3150158/.

[23] Hikikomori is a condition which has become very prevalent in Japan. It is a Japanese word which means ‘social withdrawal’ or ‘staying indoors’. It describes a situation that mainly affects adolescents or young adults who isolate themselves from the world, sequestered in their parents’ homes, locked in their bedrooms for days, months, and even years on end. They refuse to be in touch with even their family. These patients use the Internet profusely, and only venture out to deal with their most vital bodily needs. More information can be found here: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4776119/.

[24] https://thecorrespondent.com/89/techs-final-frontier-your-mind/11782182764-6ea44ec8.

[25] https://appfinder.orcha.co.uk/review/209172/.