Can AI Deliver Compassion?
How Chatbots Are Redefining Patient Counseling in Medicine
July 30, 2024 – by Rebecca Handler
The art of practicing medicine often comes with a profoundly human challenge: delivering bad news. Despite advancements in technology, no machine has ever been able to truly replicate the empathy and nuanced communication required to tell a patient their prognosis is grim. But what if that is changing? Jonathan H. Chen, MD, PhD, a physician and computer scientist, decided to put this question to the test by challenging an AI chatbot with some of the most emotionally charged scenarios in medicine.
He began with a scenario discussing a feeding tube for a patient with advancing dementia, a situation full of emotional complexity and ethical dilemmas, making it an ideal test for the AI's capabilities.
“I asked the chatbot to assume the role of a clinician providing supportive counseling,” he explains, “while I played the role of a concerned family member.”
AI's Role in Mental Health Support
Chen’s experience underscored the possibility that AI could play a significant role in aspects of patient care such as providing therapy and counseling, meeting the vast demand for these services that often outstrips the supply of available professionals.
Tina Hernandez-Boussard, PhD, a renowned expert in biomedical informatics at Stanford, weighs in: "We've seen mental health issues rising at unexplainable rates, and there's a huge need for mental health services. I see a future where AI can help meet that need," she says.
AI-driven chatbots can provide immediate assistance, offering a lifeline when human professionals may not be available. Hernandez-Boussard highlights a critical gap: "One of the biggest times we see mental health needs is around 4 AM, but there’s often no one available to help. AI can potentially fill that gap with savvy chatbots that can provide the necessary support."
Additionally, through the analysis of clinical notes and patient communications via electronic health records, AI can help identify patients at high risk of mental health issues such as depression and suicidal ideation.
In fact, in recent research, Chen and his colleagues developed a machine-learning system that quickly identifies urgent messages, improving response times from 9 hours to just 8-13 minutes. The system, trained on thousands of messages, accurately flags potential crises, helping specialists provide timely support.
This is done with natural language processing (NLP), a subfield of computer science and artificial intelligence that uses machine learning to enable computers to understand and work with human language.
Hernandez-Boussard also points to the benefit of using NLP to detect phrases and patterns indicative of mental health concerns; her team has been working to identify language that may predict depression. “By doing so, AI can aid in early detection and intervention, potentially preventing severe outcomes and improving resource allocation,” she says.
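At its core, this kind of message triage can be thought of as a text classifier: the system learns which words and phrases appear more often in urgent messages than in routine ones, then ranks incoming messages so the most concerning are seen first. The sketch below is a deliberately simplified, standard naive-Bayes-style illustration of that idea, using invented example messages; it is not the Stanford team's actual model, data, or vocabulary.

```python
import math
import re
from collections import Counter

# Invented toy training examples (NOT real patient data or the study's dataset).
URGENT = [
    "I can't stop thinking about hurting myself",
    "chest pain and I feel like I might pass out",
    "I have been thinking it would be better if I were not here",
]
ROUTINE = [
    "Can I get a refill on my blood pressure medication",
    "What time is my appointment next Tuesday",
    "Please send my lab results to my new pharmacy",
]

def tokenize(text):
    """Lowercase and split a message into word tokens."""
    return re.findall(r"[a-z']+", text.lower())

def word_counts(messages):
    """Count how often each word appears across a set of messages."""
    counts = Counter()
    for message in messages:
        counts.update(tokenize(message))
    return counts

urgent_counts = word_counts(URGENT)
routine_counts = word_counts(ROUTINE)
vocab = set(urgent_counts) | set(routine_counts)

def urgency_score(message):
    """Sum of per-word log-odds with add-one smoothing; higher = more urgent-looking."""
    score = 0.0
    for word in tokenize(message):
        p_urgent = (urgent_counts[word] + 1) / (sum(urgent_counts.values()) + len(vocab))
        p_routine = (routine_counts[word] + 1) / (sum(routine_counts.values()) + len(vocab))
        score += math.log(p_urgent / p_routine)
    return score

def triage(messages):
    """Order messages so the most urgent-looking are reviewed first."""
    return sorted(messages, key=urgency_score, reverse=True)
```

A production system would use far richer models and clinician-labeled training data, but the principle is the same: language statistics prioritize the queue, and a human specialist still reviews and responds to every flagged message.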
Despite AI's promising applications, Hernandez-Boussard and Chen both stress the importance of maintaining human involvement and having clinicians supervise and work alongside AI to ensure effective healthcare delivery.
Training Clinicians with AI-Driven Simulations
Hernandez-Boussard's and Chen's experiences have inspired them to consider how human-computer interactions could serve as a training platform for clinicians to learn empathy in tough situations.
Reflecting on the experiment, Chen shares, “With the ability to practice high-stakes conversations in a low-stakes environment, I hope such computer systems will make us better in our next human–human interactions.”
For a detailed exploration of Chen's findings, read his full essay here.
Tips for Practitioners Using ChatGPT in Patient Care
Chen provides a checklist of recommendations and warnings when using AI systems in patient care:
Recognize that people are the most important scarce resource in an overstretched healthcare system but also that humans do not have a monopoly on empathy, counseling, and communication.
Embrace computer systems to not only automate our mundane paperwork to recover critical human time but also allow us to practice and enhance our most human skills.
Constantly test all AI (and human intelligence) systems in healthcare to ensure safe, reliable, and compassionate counseling and advice for all.