
Empathy for Robots: The Importance of Sharing Personal Information to Build Trust and Confidence in AI

Feeling empathy for a robot matters more than you might think: it increases the confidence we have in an AI. But how do you elicit that empathy? Much the same way as with humans: by having the robot share ‘personal’ information.

You don’t build a relationship with other people through small talk alone; it is precisely personal matters that bring you closer together. That is apparently no different with a robot. New Japanese research shows that people feel more empathy for an online AI when it shares personal details while chatting with them.

More acceptance
Due to major staff shortages, care robots and service robots have become increasingly common in Japan. A sticking point, however, remains the trust people place in them. That is why researchers are investigating factors that can foster greater acceptance of, and empathy towards, AI. Previous studies have shown that people are more willing to accept robots that manage to elicit empathy. This applies to chatbots on websites, but also to cleaning robots and pet-like robots. Earlier research has also highlighted the importance of sharing personal information in building human relationships.

One and one equals two, the researchers must have thought: if a humanoid robot shares personal information, it should elicit empathy from its users. To test this idea, they conducted an experiment in which participants chatted online with an AI represented by the image of either a human or a human-like robot. During the chat, the participant and the AI spoke to each other as if they were colleagues on their lunch break. In each conversation, the AI revealed personal work-related information, less relevant information about a hobby, or no personal information at all.

Work-related information
In the end, the researchers evaluated the empathy of 918 participants for the AI with a standard questionnaire. Personal work-related information turned out to be the most effective: it generated the most empathy, more than information about hobbies and certainly more than the complete absence of a personal note. “In this experimental situation, the participant and the robot were colleagues. The personal information about work enhanced the feelings of empathy. So the study makes clear that personal information appropriate to the situation elicits the most empathy,” researcher Takahiro Tsumura from Tokyo told Scientias.nl.

It made no difference whether the AI was depicted as a human or as a humanoid robot. This did not surprise the researcher. “There will probably be differences in empathy compared to a cleaning robot or a robot arm in a factory, but here the AI had a human-like appearance in both cases. Then it no longer matters much for the feelings of empathy.”

Educational purposes
The discovery that robots can be made to elicit more empathy could be important for the future development of AI. Let a robot tell you something about itself and people will feel better about it, which also increases confidence in the device. “Our study can change the negative image of robots in society and contribute to future social relationships between humans and AI.”

For example, by sharing personal information, an AI could encourage children to learn or adults to perform certain tasks. “One of the most surprising aspects of this study was that revealing personal information in the right circumstances not only changes human empathy, but also behavior. In our study, participants lent more money to the AI when it shared information relevant to the situation. These results can be useful in educational or interpersonal simulations.”

Published: 2023-05-13
