


Most of us have had the experience of sending a text or email that came across sounding insensitive or angry, even though that wasn’t our intent.

Unfortunately, the lack of social cues in such messaging makes it much easier to be misinterpreted. Depending on the communication, this can lead to misunderstandings, hurt feelings or worse. That’s a shortcoming Bellevue, Wash.-based mpathic wants to correct using empathic AI.

Drawing on insights and datasets assembled over the past decade, mpathic has set out to promote human connection and understanding in the workplace.

To do this, the company has created plugins that tie into its cloud-based empathy-as-a-service, or EaaS, to help humans talk to humans using real-time text corrections. This way, texts and emails can be reviewed and changes can be suggested before hitting “Send.” By adding these capabilities to platforms like Slack or Gmail, mpathic hopes to bring more empathy to the corporate communication landscape.

mpathic CEO and co-founder Grin Lord. (mpathic Photo)

“We realized this could all be mediated with an AI empathy engine, almost like Grammarly for empathy,” said co-founder Grin Lord. “We’ve had amazing developments in AI that allow us to do this now in real time, making this the first time in human history that we can get real-time empathy correction that’s dynamic.”

In an example from a recent pitch, the service suggested replacing an inflammatory message like, “Why does Nic schedule these meetings always at the last minute? Am I right?” with a more open question: “How do you feel about the meeting change?”
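mpathic has not published its API in this article, but the correction flow it describes resembles a simple flag-and-suggest loop. The sketch below is purely illustrative: the cue list, function names, and the canned replacement are stand-ins, not mpathic's actual engine, which presumably uses trained models rather than keyword rules.

```python
# Illustrative sketch of a Grammarly-for-empathy flow. All names and
# heuristics here are hypothetical stand-ins for mpathic's real service.

# Absolutes and rhetorical jabs often read as blame in text.
INFLAMMATORY_CUES = ("always", "never", "am i right")

def needs_rewrite(text: str) -> bool:
    """Flag messages containing blame-laden phrasing."""
    lowered = text.lower()
    return any(cue in lowered for cue in INFLAMMATORY_CUES)

def suggest_rewrite(text: str) -> str:
    """Offer an open-ended question in place of a loaded one."""
    if needs_rewrite(text):
        return "How do you feel about the meeting change?"
    return text

draft = ("Why does Nic schedule these meetings always at the "
         "last minute? Am I right?")
print(suggest_rewrite(draft))  # the open question from the pitch example
```

A real engine would score the whole message in context and generate the replacement dynamically; the fixed string here simply mirrors the pitch example above.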

Based on years of research on human interaction, mpathic offers a unique approach to guiding users. Lord, who has a Doctor of Psychology degree, initially based mpathic’s dataset on insights she gained doing research in the early 2000s at Harborview Medical Center in Seattle, the only Level I trauma center in Washington state.

During that time, Lord was part of a group researching empathic listening. DUI drivers were frequently brought into Harborview following car crashes. Rather than handing a driver pamphlets, lecturing them or shaming them, the researchers would listen to them, perhaps for 15 or 20 minutes, following specific protocols. In a randomized controlled trial, they saw a measurable drop in drinking by those drivers that lasted for up to three years, as well as a 48% reduction in hospital readmissions. Not only did this help the subjects toward recovery, it led to significant cost reductions, as well as greater public safety.

Since then, Lord has been involved with other startups including Lyssn, a platform for assessing behavioral health provider empathy and engagement during clinical sessions.

Prior to its launch, the team behind mpathic started Empathy Rocks, which builds human connection using empathic AI through a gamified platform. The platform allows practitioners to improve their empathic listening skills while earning continuing education credits.

But it was during the early seed funding stage for Empathy Rocks that Lord and co-founder Nic Bertagnolli became aware they already had a viable product in the underlying empathy engine for that platform. Pivoting, they launched mpathic to make the engine more readily and widely available.

Developing both “Grammarly for empathy” and an API, mpathic wants to do more than simply promote good relations between employees. Given the expanding globalization of many corporations and the growing pool of employees from other parts of the country and the world, the company wants to give human resources departments a tool that can help smooth onboarding. Since different regions have different ideas and attitudes about what constitutes civil and sensitive behavior, mpathic can be used to help integrate new hires into their new team more rapidly.

Lord is quick to point out that mpathic doesn’t just suggest text corrections but makes other kinds of behavioral suggestions, too. In this way, the user builds an understanding of empathic communication and behavior through context, use and repetition.

“We actually make corrections that are very behavioral,” said Lord. “So, it may not even be a replacement of a word or transformation of the text. Instead, the AI may suggest calling a meeting or getting on the phone, because certain things don’t need to be in an email.”
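The distinction Lord draws, between rewriting text and recommending a different medium altogether, suggests a suggestion model with more than one kind of output. The following sketch is a hypothetical illustration of that idea; the schema, field names, and the length/keyword heuristic are assumptions, not mpathic's actual design.

```python
# Hypothetical sketch: a suggestion can be a text rewrite or a
# behavioral nudge (e.g., "take this to a call"). Not mpathic's schema.

from dataclasses import dataclass

@dataclass
class Suggestion:
    kind: str      # "rewrite" or "behavioral"
    message: str

def advise(text: str) -> Suggestion:
    # Long or visibly heated messages may be better handled
    # synchronously than by polishing the wording.
    if len(text) > 400 or "frustrated" in text.lower():
        return Suggestion("behavioral",
                          "Consider a quick call instead of an email.")
    return Suggestion("rewrite", text)
```

The point of the two-branch design is that some messages should not be sent at all, no matter how the wording is softened.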

Though mpathic grew out of Empathy Rocks, the gamified training platform continues to provide empathic listening training as it acquires new data that’s used to train mpathic’s EaaS. The platform was designed by the team’s empathy designer, Dr. Jolley Paige, who notes the many factors that need to be considered at a time when AI bias is such a concern.

“We were thinking about gender, age, culture, where you’re located in the country, but also about different abilities, too,” Jolley said. “So, if somebody has a language processing disorder, how would that impact how they interact with this game?”

While some people may have concerns about using AI to modify human behavior, lots of companies see value in such an approach. “Some of our early enterprise partners are looking at plugging mpathic into their Slack, Gmail or whatever, primarily because they’re interested in this idea of quickly onboarding cross-cultural and global teams,” Lord said. “I think it can be useful for unifying mission values language for a company.”

Last month, mpathic was one of 14 startups that pitched at PIE Demo Day. PIE (Portland Incubator Experiment) is led by general manager Rick Turoczy and seeks to provide founders — often first-time entrepreneurs — with access to mentorship and networks.

Empathy Rocks and mpathic intentionally source and curate their data to include underrepresented voices and are part of All Tech is Human as well as other communities committed to ethical AI development.

Empathic AI is part of a much broader field of computer science, originally known as affective computing and more recently referred to as emotion AI or artificial emotional intelligence. Originating out of MIT Media Lab and other research institutes about 25 years ago, emotion AI involves systems that can read, interpret and interact with human emotions. Since emotion and especially empathy are central to the human condition, such work has the potential to make our technologies interact more easily, humanely and responsibly with people, both at home and in the workplace.




