We Want Chatbots to Act More Human

There is something particularly disturbing about watching footage of Atlas, the robot developed by Boston Dynamics. Its human-like movements suggest the kind of body awareness and intuition inherent in human beings, yet it is clearly not human.

The technologies behind AI and robotics are continuing to advance.

As they become increasingly sophisticated, we must ask ourselves: how human should AI be? Where should we allow boundaries to continue to blur, and where should we draw a clear line in the sand?

It’s a thorny question, complicated even further by headlines about robot citizenship and speculation about an impending apocalypse.

By examining the evolving role of AI in the customer experience, we can begin to answer this question. The early application of chatbots serves as a small window into the world of human-bot interactions, and a case study in how the technology should be shaped going forward.

Acting human makes sense.

The early buzz around chatbots was met with irritated groans from the many consumers who encountered bots that had been introduced prematurely. Those initial bots were legitimately criticized for being ineffective and often unable to handle the basic tasks for which they were designed.

However, what was probably most frustrating for customers dealing with these bots was their lack of empathy.

If a customer takes the time to contact a brand to get help with something, they really want to feel understood.

The paradox here is that machines are not particularly good at understanding feelings (in their defense, I know plenty of people who aren’t, either).

As technology develops

AI must become more emotionally aware to truly understand people’s demands. Empathy is essential as companies increasingly communicate with consumers through automated solutions. Chatbots have come a long way since those early days.

An estimated 16 percent of Americans (that is, 39 million people) now own a smart speaker. But even in these more advanced solutions, there is a fairly chronic problem of tone deafness.

Consider that when you make a request to Alexa, she won’t say “you’re welcome” if you thank her. On the one hand, it’s reassuring to know that she is no longer “listening” after a command; on the other, many worry that we are setting a precedent of rudeness and thanklessness for a future generation.

A more troubling example is the lack of consequences for being rude to bots, and more specifically, the way bots respond to things like sexual harassment.

That’s a problem considering that more complex bots like Sophia will be among us soon.

To avoid bots that perpetuate a tone-deaf society, we need to train AI to be empathetic. This is not a simple task, but it is possible.

Empathy is a “soft skill,” but it is a skill nonetheless. Training AI in empathy can therefore be approached the same way we train AI in anything: with a digestible data set.

This would include training the system to “listen” to data points such as tone of voice (both written and spoken), words that express feelings and emotions, and even how a person’s responses to an interaction shift in the moment or over time.
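As a minimal sketch of what that kind of “listening” might look like for written text, the Python snippet below counts emotion-bearing words and uses capitalization and exclamation marks as crude stand-ins for tone. The word lists and signal names are illustrative assumptions, not a real emotion lexicon.

```python
import re

# Illustrative, hand-picked word lists; a real system would rely on a
# curated emotion lexicon or a trained model rather than these guesses.
AGITATED_WORDS = {"angry", "furious", "annoyed", "frustrated", "ridiculous"}
HAPPY_WORDS = {"thanks", "great", "love", "perfect", "awesome"}

def emotion_signals(message: str) -> dict:
    """Extract crude emotion signals from one written message."""
    words = re.findall(r"[a-z']+", message.lower())
    # Written-tone proxies: shouting (capital letters) and punctuation.
    shouting_ratio = sum(ch.isupper() for ch in message) / max(len(message), 1)
    return {
        "agitated_words": sum(w in AGITATED_WORDS for w in words),
        "happy_words": sum(w in HAPPY_WORDS for w in words),
        "shouting_ratio": round(shouting_ratio, 2),
        "exclamations": message.count("!"),
    }

print(emotion_signals("This is RIDICULOUS, I've been waiting an hour!!"))
```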

Is the person transitioning from agitated to happy, or vice versa?

A bot needs to know the difference so it can moderate its response: taking care not to agitate a calm person, and working to calm an agitated one.
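One hedged way to detect such a transition is to score each conversation turn for sentiment and look at the direction of change across recent turns. In the sketch below, sentiment_score is a toy stand-in for a trained sentiment model, and the tone labels are illustrative.

```python
import re
from typing import List

def sentiment_score(message: str) -> float:
    """Placeholder scorer in [-1, 1]; a real bot would call a trained model."""
    negatives = {"broken", "waiting", "ridiculous", "refund", "angry"}
    positives = {"thanks", "great", "solved", "perfect", "helpful"}
    words = re.findall(r"[a-z']+", message.lower())
    raw = sum(w in positives for w in words) - sum(w in negatives for w in words)
    return max(-1.0, min(1.0, raw / 3))

def choose_tone(turns: List[str]) -> str:
    """Pick a response tone from the trajectory of the last few turns."""
    scores = [sentiment_score(t) for t in turns[-3:]]
    if len(scores) < 2:
        return "neutral"
    trend = scores[-1] - scores[0]
    if scores[-1] < 0 and trend < 0:
        return "de-escalate"  # growing agitation: apologize, slow down
    if scores[-1] < 0:
        return "reassure"     # unhappy but improving: keep the momentum
    return "match"            # calm or happy: mirror the customer's tone

turns = [
    "Hi, quick question about my order",
    "It still hasn't arrived",
    "This is ridiculous, I want a refund",
]
print(choose_tone(turns))  # -> de-escalate
```

The design choice here is that the trajectory, not any single message, drives the bot’s tone: the same unhappy sentence warrants a different response depending on whether the customer is cooling down or heating up.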

Understanding emotions and feelings is a fundamental part of understanding people: how they act, what they want, and how to respond to them appropriately.

If a machine cannot learn empathy, it cannot understand how emotions shape people’s demands and actions, and it cannot create better outcomes.

Training AI in empathy means using machine learning to extract subjective emotions, moods and feelings from conversations with people. AI models can then learn, just as individuals do, how those feelings qualitatively impact needs, responses, actions and outcomes. An angry customer who uses strong language, for example, has different needs than a calm one, and warrants a different response.
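As a rough illustration of that training loop, the sketch below fits a small scikit-learn text classifier on a handful of hand-labeled utterances and predicts an emotion for a new message. The inline data set is an invented stand-in for the “digestible data set” described above; a real model would need far more labeled conversations.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny, made-up training set of (utterance, emotion label) pairs.
training = [
    ("I've been on hold for an hour, this is unacceptable", "angry"),
    ("Why was I charged twice? Fix this now", "angry"),
    ("Thanks so much, that solved my problem", "happy"),
    ("You've been really helpful, I appreciate it", "happy"),
    ("Can you tell me when my order will ship?", "neutral"),
    ("I'd like to update my shipping address", "neutral"),
]
texts, labels = zip(*training)

# TF-IDF features plus logistic regression: a simple, digestible baseline.
model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
model.fit(list(texts), list(labels))

# On a toy set like this the prediction is only suggestive, not reliable.
print(model.predict(["This is unacceptable, I want my money back"]))
```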
