Can AI Teach You How to Communicate With More Empathy?

Emily Boynton

Since ChatGPT went public, the world has been abuzz with talk of artificial intelligence, or AI: machines and computer systems that simulate human intelligence. And Seattle, with its tech companies and masses of software engineers, is at the beating heart of it all.

It feels like every day we’re seeing new glimpses of how AI might change the way we live and work, from its ability to detect cancer to its capacity to write stories, generate images and even pass portions of the bar exam.  

Tim Althoff, an assistant professor of computer science at the University of Washington, is researching yet another application: using AI to help mental health peer supporters communicate more empathically.  

If implemented effectively and safely, this technology could help provide much-needed mental health resources. One in five Americans experiences mental illness every year, and more than 150 million people live in federally designated mental health professional shortage areas. Peer support in online chats and forums like mental health subreddits, where people talk about mental illness with peers who aim to listen and empathize, is often used to address this gap. However, many of these peer supporters are untrained and, in turn, not always confident in their responses — and this is where Althoff believes AI can help.

The idea of AI teaching empathy might make you balk, but stick with us. In his recent study, Althoff found that, when used as a collaboration tool, AI increased people’s ability to communicate empathically and their confidence in doing so.

How AI can help teach empathic communication  

Althoff and a team of computer scientists and clinical psychologists worked together to create AI software that provides real-time, sentence-level feedback on text, including suggestions to make it more empathic.  

“It’s like Grammarly for empathy,” Althoff says. (Or the red squiggly line that offers edits to your doc, only this time it’s for compassion, not grammar nitpicks.)
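For the technically curious, here is a rough sketch of what that “squiggly line for compassion” could look like under the hood: a program splits a draft reply into sentences, scores each one for empathy and attaches a suggested rewrite to anything that falls short. The scoring and rewrite functions below are hypothetical placeholders, not the study’s actual model.

```python
import re

def score_empathy(sentence: str) -> float:
    """Placeholder scorer: a trained model would return a value from 0 (flat) to 1 (highly empathic)."""
    reflective_cues = ("it sounds like", "that must", "i hear you")
    return 1.0 if any(cue in sentence.lower() for cue in reflective_cues) else 0.2

def suggest_rewrite(sentence: str) -> str:
    """Placeholder suggester: a generative model would propose a warmer phrasing."""
    return "It sounds like this has been really hard. " + sentence

def review_draft(draft: str, threshold: float = 0.5) -> list:
    """Flag low-empathy sentences and attach a suggested edit -- the 'red squiggly line' step."""
    feedback = []
    for sentence in re.split(r"(?<=[.!?])\s+", draft.strip()):
        score = score_empathy(sentence)
        if score < threshold:
            feedback.append({"sentence": sentence, "score": score, "suggestion": suggest_rewrite(sentence)})
    return feedback

print(review_draft("Just stop worrying so much. Things will work out."))
```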

The concept is intriguing but also raises the question: How does the AI know what is the most empathic thing to say, and do we trust it? 

“So, software — even AI software — doesn’t understand empathy, not at a human experiential level. But AI is really good at pattern matching,” says David Atkins, CEO of Lyssn.io and clinical psychologist and affiliate professor at the UW School of Medicine Department of Psychiatry and Behavioral Sciences, who collaborated on the study. “Over time with many examples, the AI can learn what types of words, vocal tone and inflection, and interactions back and forth in the conversation are indicative of high versus low empathy — where really we mean high or low empathic communication.”  

The research team used a pre-existing, evidence-based definition of empathy, then had multiple psychologists evaluate whether numerous example texts met the threshold for empathy. To do this, they broke down empathic communication into three core skills: asking open-ended questions, reflecting back what someone says and avoiding judgment.

They then trained the AI on that data so it could learn how to distinguish and evaluate text based on these empathic communication skills.  

“Given a lot of high-quality training data like this, we can develop a reliable AI system for evaluating empathy,” Atkins says. 
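To make that concrete, here is a toy illustration in Python (emphatically not the team’s actual system) of how a classifier can learn to tell high- from low-empathy responses once psychologists have labeled enough examples. The handful of made-up texts below stand in for the much larger annotated dataset the researchers describe.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labeled examples; the real training data would contain
# thousands of peer-support responses rated by psychologists.
texts = [
    "It sounds like you've been carrying a lot lately. What has been hardest?",  # reflection + open question
    "That must feel exhausting. I'm glad you reached out.",
    "Just get over it, everyone has problems.",  # judgmental, dismissive
    "You should simply stop being sad.",
]
labels = ["high", "high", "low", "low"]

# Bag-of-words features plus logistic regression: crude, but enough to show
# how pattern matching over labeled text can approximate an empathy rating.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(texts, labels)

print(model.predict(["I hear you. What would help most right now?"]))
```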

A safety- and ethics-first approach to AI in mental health care  

Both Althoff and Atkins recognize some people might be wary of AI that teaches empathy, but they emphasize they designed the model to center ethics and safety above all else.  

“We prioritized safety at every step,” Atkins says. “Both benefits to safety and dealing with risks doesn’t happen unless you design for it at the very beginning of every stage.”     

This meant starting with psychologists to define and evaluate empathy, then designing the AI app alongside mental health stakeholders so that its feedback wouldn’t push peer supporters to adopt the AI’s response wholesale or leave them discouraged if their original response needed multiple edits.

“The human-to-human interaction in peer support is extremely meaningful, and we didn’t want to disrupt that,” Althoff says.  


Once the app was designed, they tested it with peer supporters in a sandbox environment — meaning the supporters were able to use the AI but their responses were never sent to the original person seeking support. The algorithm also filtered out comments that were inappropriate (vulgar, etc.) to ensure the safety and well-being of supporters and flagged comments containing safety concerns (such as content about self-harm or suicide), so that they could be sent to therapists.  
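A simplified sketch of that safety layer might look something like the triage function below; the keyword lists and routing rules are illustrative assumptions rather than the study’s actual filters, which would rely on trained moderation and risk-detection models.

```python
# Illustrative keyword lists only; a production system would not use
# hand-written terms like these.
PROFANITY = {"damn", "idiot"}
RISK_TERMS = {"suicide", "self-harm", "hurt myself"}

def triage(message: str) -> str:
    """Route an incoming post: escalate safety concerns, drop inappropriate content, else allow peer support."""
    text = message.lower()
    if any(term in text for term in RISK_TERMS):
        return "flag_for_therapist"   # safety concern: send to a clinician, not a peer
    if any(word in text for word in PROFANITY):
        return "filter_out"           # inappropriate: never shown to supporters
    return "ok_for_peer_support"

print(triage("Lately I've been thinking about self-harm."))  # -> flag_for_therapist
```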

Using AI to improve access to care and create a more empathic world 

The study was a smashing success. Based on an empathy classification model, the human and AI collaborative responses led to a 19.6% increase in empathy overall and a 38.9% increase for peer supporters who had difficulty expressing empathy in writing. Plus, 63.3% of peer supporters found the feedback helpful, 69.8% felt more confident providing support after using the AI and 77.7% wanted the feedback system to be added to peer-to-peer support platforms.

Most supporters chose to collaborate with the AI to create more empathic responses, or they took inspiration from the AI’s prompts and rewrote the response in their own words. As a result, the AI-human collaboration created responses high in both empathy and authenticity. 

For Althoff and Atkins, this shows it is possible to use AI to increase the empathy of peer support responses without increasing the risk of harm.  

“When we think about computers and AI, empathy isn’t the first thing people typically think about. A really important aspect is that the computer doesn’t have empathy for your feelings. Instead, the AI has looked at a lot of words and situations, so if someone wants to express empathy but struggles to find the words to say, the AI can help them express their empathy,” Althoff says.  

Ultimately, this collaboration between AI and peer supporters could be a game changer for mental health support by allowing peer supporters access to training in empathic communication at a large scale.  

“In a world where most people don’t have access to mental health professionals, due to capacity or insurance, sometimes peer support is all you have,” Atkins says. “Technology can help raise the quality of support people can receive access to regardless of the setting.”