The one thing AI can’t do for us? Care. Authentically.

About me: I study compassion and empathy and created a program for leaders called 'Leading with Empathy' at Google, where my roles have included Global Head of Resilience. As a tech employee I have benefited firsthand from the incredible AI revolution we’re experiencing, and as someone obsessed with fostering compassion in the workplace I’m heavily invested in understanding how AI can help - or hinder - authentic empathy.

If AI can help me write an essay, it can help me break up with someone, right? Multiple ads in my TikTok feed would say so (I’m very happily married, so the algorithm needs adjusting). If leaders are taught by people like me that empathic communication makes employees more driven, productive, innovative and creative, more likely to stay with your company, to attract other top talent and to produce better business results, then surely, on a day when your calendar has you chain-smoking meetings and time is at a premium, it’s easier to ask AI to "write a thank you and great job email" than to waste time composing it yourself?

One of the key tenets of my empathy work with senior leaders is demonstrating understanding, effectively dialing into someone else’s frequency, especially if that someone is below you in the reporting chain. In an age where the majority of knowledge workers' communication happens over text/email/ping, the urge to save time and ask AI to "generate a sensitive paragraph to send to my Black colleagues to show I care about Juneteenth", while shocking, is also a reality. I’ve seen it. If you’re honest with yourself, you’ve at least considered it. It’s a vending machine for empathic sentiment, and I wonder how prevalent this shortcut to compassion already is.

What’s worse than an absence of empathy? Fake empathy. 

In a fascinating new paper by Anat Perry, ‘AI will never convey the essence of human empathy’, research from an online emotional-support chat service using GPT-3 responses shows that while the AI-generated replies are initially well received, the moment recipients realize a message was artificially generated, all of the positives are negated. Empathy comes from a place of shared experience, so finding out a message was artificial can be more damaging than not receiving one at all. A well-intended email from a university showing compassion to its students after a mass shooting had the exact opposite effect when it was discovered to have been AI-generated.

We’ve likely all received a gift sent via Amazon with the pre-populated “Enjoy your gift!” text option, and we know how much more meaningful it is when the sender has entered something personal instead. That’s because the value of an act of compassion or empathy isn’t in ‘what’ the act is, but in the doing of the act itself. You can't hack compassion.

Leadership is hard, but leading with empathy isn’t something you can fake your way to, so resist the urge to lean on generative AI shortcuts. The authentic message or conversation may take a few seconds or minutes longer, but by skipping the most important step - caring - you run the risk of doing harm rather than good.
