'Empathetic' software is set to change the way we shop


Date: 2024-03-19


Empathy has always been an inherently human trait, but already several AIs claim at least some degree of it.

It’s grocery shopping day and you enter your favorite big store. One of the shop’s social robots walks up to you: a shiny black-and-white humanoid with a round disc bearing a single big blue eye for a face and a cylindrical base for a body.

“Hey, is that how you pronounce your name?” it says. You are in a hurry, and the last thing you want is a robot talking to you. You curtly answer “Yes,” and the android immediately takes off. Another day, though, you wander around the shop wearing a puzzled expression. This time your yes to your silicon conversationalist carries much friendlier undertones. “Can I help you?” the bot asks. “Sure,” you say, and the automaton gives you a tour of the latest products, deals, and offers, until you are sure you have navigated the big grocery store in the most satisfying way.

The above is a scene from the retail environment that could be our reality a few years from now, according to Rana el Kaliouby, visionary computer scientist, entrepreneur, and author of Girl Decoded, the book in which she recounts her mission to humanize technology with empathy. It is what the arrival of emotionally and socially literate artificial intelligence (AI) agents in retail would look like: in other words, the advent of AIs with empathy.

 

So far, empathy has been an inherently human characteristic

Empathy is a highly praised component of emotional intelligence; it is the ability to put yourself in another person’s shoes, to understand and identify with them, and to carefully consider their perspective. It is an inherently human characteristic that machines currently cannot claim in any real sense. Futurist author and computer scientist Ray Kurzweil predicts, though, that empathetic agents will be mainstream in less than 10 years. And even at the time of writing, there are several AIs on the market claiming at least some degree of empathy.

Take Shift, for example, an online, peer-to-peer marketplace for buying and selling used cars. “We are the first business to ever use Pulsar AI, and this is for customers to interact with us through email and text message,” says George Arison, cofounder of Shift. Arison says that what sets his AI apart from others currently on the market is the spot-on quality of Pulsar’s replies.

“We get a lot of emails that have multiple questions or requests in them at the same time. Most AI bots break down in that type of an environment - they can’t answer that (try to ask Siri a ton of questions at once),” he says.  He is confident that their own AI not only understands multi-question inputs, but can also give educated and accurate responses to each and every one of these questions and engage the customer in every step of the communication. “This type of technology is what you would need if you want to have a bot with empathy,” Arison says.
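To make the idea concrete, here is a minimal, hypothetical sketch in Python of what handling a multi-question message can look like: split the message into question-sized chunks, match each chunk against a small intent table, and stitch the answers back into one reply. The intent table, trigger phrases, and canned answers are invented for illustration and are not Pulsar AI's actual implementation.

    # Hypothetical sketch of multi-question handling; NOT Pulsar AI's real code.
    import re

    # Toy intent table: trigger phrases (as a regex) per intent and a canned answer.
    INTENTS = {
        "price":      ("how much|price|cost", "The car is listed at $14,500."),
        "test_drive": ("test drive|come by",  "You can book a test drive online for any day this week."),
        "warranty":   ("warranty|guarantee",  "Every car comes with a 30-day/1,000-mile warranty."),
    }

    def split_questions(message):
        """Break an email or text into sentence-sized chunks."""
        chunks = re.split(r"(?<=[.?!])\s+", message.strip())
        return [c for c in chunks if c]

    def answer(message):
        """Answer each chunk separately, falling back when no intent matches."""
        replies = []
        for chunk in split_questions(message):
            for pattern, canned in INTENTS.values():
                if re.search(pattern, chunk, flags=re.IGNORECASE):
                    replies.append(canned)
                    break
            else:
                replies.append("Let me check on that and get back to you.")
        return " ".join(replies)

    print(answer("How much does it cost? And can I book a test drive for Saturday?"))

The point of the sketch is the structure rather than the matching itself: by answering chunk by chunk instead of treating the whole email as one query, the bot can respond to every question in the message, which is where simpler assistants tend to break down.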

 

Right now, digital assistants severely lack contextual understanding

A few years ago, Wolf Paulus, principal software engineer at business and financial software company Intuit, was driving on a highway when he received a notification from his digital assistant, delivered in an upbeat tone, that “there was traffic ahead because of a fatality!” “My digital assistant was basically telling me somebody had died in a happy way...” Paulus recalls. “This made me realize the importance of empathy and contextual understanding in the messages these AIs are relaying.”

He was immediately tempted to infuse his own software with empathy. Intuit’s QuickBooks Assistant is now claiming a piece of the empathetic conversational design pie: it features a “small talk” mode that “listens” for direct expressions of emotions like anger, excitement, or sadness in your voice, and the company says it is evolving the chatbot to explore sentiment analysis in its text-based tool.
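As an illustration of what “listening” for emotion in text can look like, here is a minimal, hypothetical sketch; the lexicon, phrasing, and function names are invented and do not represent Intuit's actual QuickBooks Assistant. A tiny keyword lexicon flags anger, excitement, or sadness so the bot can prepend an empathetic acknowledgement to its functional answer.

    # Hypothetical "small talk" emotion listener; NOT Intuit's real implementation.
    EMOTION_LEXICON = {
        "anger":      ["furious", "angry", "annoyed", "fed up", "ridiculous"],
        "excitement": ["excited", "thrilled", "awesome", "great news"],
        "sadness":    ["sad", "stressed", "worried", "frustrated"],
    }

    ACKNOWLEDGEMENTS = {
        "anger":      "I'm sorry this has been so frustrating.",
        "excitement": "Glad to hear it!",
        "sadness":    "That sounds stressful; let's sort it out together.",
    }

    def detect_emotion(text):
        """Return the first emotion whose trigger words appear in the text, else None."""
        lowered = text.lower()
        for emotion, words in EMOTION_LEXICON.items():
            if any(word in lowered for word in words):
                return emotion
        return None

    def reply(text, functional_answer):
        """Prepend an empathetic acknowledgement when an emotion is detected."""
        emotion = detect_emotion(text)
        if emotion:
            return ACKNOWLEDGEMENTS[emotion] + " " + functional_answer
        return functional_answer

    print(reply("I'm fed up, my invoice totals look wrong again.",
                "Here is how to rebuild the invoice report."))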

It goes without saying that empathy is good value for money too. A recent study found that good customer service increases the likelihood of a purchase by 42 percent, while bad service translates into a 52 percent chance of losing that sale forever. Simply put, more than half of us will stop shopping at a store after a disappointing customer service interaction. And because AIs don’t get sad, burned out, or angry, and an empathetic AI is physically and “emotionally” available to us 24/7, it is easy to gauge how valuable an asset these socially intelligent AIs could be for businesses, says Peter Diamandis, well-known futurist, engineer, and author of The Future Is Faster Than You Think, a book predicting a compelling, technology-abundant future for humanity.

 

But for this to work properly, we need to shift our focus to the ethics behind the tech

But for Rayid Ghani, a professor in the machine learning department at the Heinz College of Information Systems and Public Policy at Carnegie Mellon, what we are looking at is an AI responding with words empathetically tailor-made for our ears. That does not mean the humanoid, voice assistant, or chatbot possesses real empathy. “The machine isn't going to understand how I feel, and its emotional state won’t change because of that,” Ghani says. Even though he agrees that a human agent is not guaranteed to be genuinely empathetic either, Ghani thinks we should focus our attention more on the ethics of using AIs with empathy in retail than on the technology behind them.

He asks us to visualize a scenario in which we are managers of a business and want to build a system that optimizes resolution times for customers in the top 10 percent of our revenue. “These top customers are rich and male and white,” says Ghani. “Do you build a system that is empathetic toward them and leaves out women? Who is deciding on how empathy is evaluated and who is deciding on what metrics exist for success?” he asks. For Ghani, it is inevitable that our newly minted empathetic agent will improve the retail experience for some people, in this case rich white men, while simultaneously making it miserable for every other group.

It might be possible to bypass this problem by feeding the AI fair and inclusive data, though. “The algorithm and the neural nets that power this AI aren’t biased. It's the data you feed it that is biased,” says Diamandis. “If you only feed it data about white, rich men, then it doesn't work for women who are not rich and white. If you only feed it photos of cats and you show it a dog, it's going to call it a cat.” But before the “democratization” of data comes the diversity of the creators. “We have to be super intentional about ensuring that people who create these AIs have a diversity of ethnicity, gender, age, and more,” says el Kaliouby. All going well, she predicts empathetic AIs will be the gateway to how we shop in the near future.
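Diamandis's cats-and-dogs analogy can be made literal with a tiny, purely illustrative sketch, not drawn from any production system: a nearest-neighbour “classifier” trained only on cats has no choice but to call a dog a cat, because cats are all it has ever seen. The features and numbers below are made up for the example.

    # Toy illustration of biased training data; every example in TRAINING is a cat.
    from math import dist

    # Training data: (weight in kg, ear length in cm) -> label.
    TRAINING = [
        ((4.0, 6.5), "cat"),
        ((3.5, 7.0), "cat"),
        ((5.0, 6.0), "cat"),
    ]

    def predict(features):
        """Label a new animal with the label of its nearest training example."""
        nearest = min(TRAINING, key=lambda item: dist(item[0], features))
        return nearest[1]

    dog = (30.0, 12.0)        # clearly not a cat...
    print(predict(dog))       # ...but the model can only ever answer "cat"

The failure here is not in the distance calculation or the algorithm; it is entirely in what the training set leaves out, which is the heart of Diamandis's argument about “democratizing” the data an AI learns from.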

 

The likeability of AI agents is far more nuanced than you'd think

Ultimately, the retail industry will not cease its efforts to infuse emotional and social awareness into the AI agents it experiments with, because it has understood the tremendous value likeability has in a robotic partner. “Take a look at Alexa and Google Home,” Paulus says.  “They are priced about the same. They look about the same. Google Home is more accurate, more capable, but Alexa is more popular because she is more likeable.”  

For him, the likeability of future chatbots, voice assistants, and humanoids (in other words, how much they remind us of... us) will be what separates success from failure in the very near future.