An Innovative AI-Integrated Emotional Wellness Strategy

Empowering Chatbots and Virtual Therapists to Emulate Mental Health Professionals



by Mark D. Lerner, Ph.D.

Chairman, The National Center for Emotional Wellness


AI-Integrated Emotional Wellness (AIEW) refers to the broad interface between the cognitive abilities of artificial intelligence (AI) and the complexity of human emotions. Acknowledging the irreplaceable importance of authentic, in-person communication, AIEW focuses on how artificial intelligence can foster emotional well-being.



The National Center defines emotional wellness as an awareness, understanding, and acceptance of our feelings and the ability to navigate challenges and change. This article presents an innovative strategy to empower chatbots with mental health responses that foster emotional wellness.


There is a growing number of mechanisms through which AI can provide support for people—such as chatbots and virtual therapists. However, addressing raw emotion while the user is experiencing adversity presents a unique challenge.


As I have described in other AI-related articles, feelings are more complicated than thoughts. For example, try to define what a feeling is. While engaging with clients in psychotherapy, I often described feelings as "the stuff that's hardest to talk about." On other occasions, I simply described feelings as "the truth." Today, I have chosen a definition congruent with my role as an emotional wellness consultant: feelings, or emotions, are complex, multifaceted states that greatly influence our decision-making, actions, and overall well-being.


A decade ago, I mounted a theatrical production, "SESSIONS with Dr. Mark Lerner," where unscripted real-life stories from the past were enacted in the sanctuary of a therapist's office—demonstrating how people benefited from articulating their thoughts and feelings while grappling with adversity. The audience became proverbial flies on the wall, keenly observing the utilization of evidence-based therapeutic strategies. In Act II, set in the present day, the client elucidated what they did to overcome—and become who they are today.


As I reflect on this production and its potential to inform and inspire audiences on stage and screen, I wonder whether it foreshadowed something much more significant—an innovative mechanism whereby AIEW could help machines acquire the language, nonverbal cues, and specific strategies needed to address the complexity of human emotions encountered in mental health intervention.


Consider the potential of the following:


AIEW could offer numerous "SESSIONS" with diverse "clients"—each enacting a unique, challenging experience in a therapeutic milieu. By leveraging AI technology, the specific verbal content of these SESSIONS could be analyzed using natural language processing. Furthermore, nonverbal cues could be interpreted using computer vision and machine learning techniques. This AIEW process would yield more human, emotion-driven responses than extant AI-powered support systems currently provide. Chatbots could be empowered with responses consistent with mental health care.
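As a rough illustration of how the verbal content of a SESSION transcript might be tagged for emotional content, the toy sketch below uses a small keyword lexicon. The lexicon, labels, and function names are my illustrative assumptions—a stand-in for the trained natural-language-processing models the strategy envisions, not a clinical tool.

```python
# Toy sketch: tagging session utterances with candidate emotions.
# The lexicon and labels below are illustrative stand-ins for a
# trained natural-language-processing model, not clinical guidance.

EMOTION_LEXICON = {
    "afraid": "fear", "scared": "fear", "terrified": "fear",
    "angry": "anger", "furious": "anger",
    "sad": "sadness", "hopeless": "sadness", "alone": "sadness",
    "relieved": "relief", "hopeful": "hope",
}

def tag_emotions(utterance: str) -> list[str]:
    """Return the emotions whose keywords appear in the utterance."""
    words = utterance.lower().replace(",", " ").replace(".", " ").split()
    found = [EMOTION_LEXICON[w] for w in words if w in EMOTION_LEXICON]
    # Preserve order of first appearance, drop duplicates.
    return list(dict.fromkeys(found))

transcript = [
    "I was terrified when I got the call.",
    "Now I mostly feel sad and alone.",
]
for line in transcript:
    print(line, "->", tag_emotions(line))
```

In a real system, this keyword lookup would be replaced by a model trained on the analyzed SESSIONS themselves, so that the tags reflect how feelings are actually expressed in a therapeutic milieu.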


A critical component of this strategy is understanding that the quality of a chatbot's or virtual therapist's responses can only be as good as the SESSIONS from which they learn.


This is important: There are two common forms of listening when helping people experiencing adversity. I recall teaching these in a university counseling psychology course nearly forty years ago. Active listening involves understanding a speaker's exact words and meaning—and communicating that meaning back to the speaker. Empathic listening goes beyond words. It focuses on understanding the feelings behind the speaker's words—experiencing them as if one were the other person—and communicating those feelings back to them.
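The distinction between the two forms of listening can be made concrete with a toy sketch: an active-listening response restates the content of what was said, while an empathic-listening response names the feeling behind it. The response templates and the small feeling lexicon below are my illustrative assumptions, not clinical guidance or a proposed implementation.

```python
# Toy sketch contrasting the two forms of listening described above.
# Templates and lexicon are illustrative assumptions only.

FEELING_WORDS = {
    "overwhelmed": "overwhelmed",
    "scared": "frightened",
    "alone": "isolated",
}

def active_listening_response(statement: str) -> str:
    """Restate the speaker's exact words to confirm their meaning."""
    return f"What I hear you saying is: {statement.rstrip('.')}."

def empathic_listening_response(statement: str) -> str:
    """Go beyond the words and reflect the feeling behind them."""
    for word, feeling in FEELING_WORDS.items():
        if word in statement.lower():
            return f"It sounds like you're feeling {feeling} right now."
    return "It sounds like this has been very hard for you."

s = "I feel so alone since the accident."
print(active_listening_response(s))
print(empathic_listening_response(s))
```

A system trained on analyzed SESSIONS would, of course, learn these reflections from observed therapeutic exchanges rather than from hand-written templates; the sketch only shows how the two response types differ in kind.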


My proposed strategy will integrate active and empathic listening and other evidence-based approaches used in mental health interventions.


This article only scratches the surface of an innovative strategy by which AIEW can empower machines to interface with human emotions during challenges and change—with SESSIONS. I look forward to consulting with the AI community regarding implementing this strategy.


Finally, while AI-powered helping mechanisms like chatbots and virtual therapists can become adept at interfacing with human emotions, these efforts must be conducted responsibly and ethically. Furthermore, while AI can provide support, it must always be an adjunct to professional mental health care that fosters authentic interpersonal communication—empathy, warmth, genuineness, compassion, trust, intimacy, nonverbal cues, shared experience, history, confidentiality, human connectivity, and more.


