Robot taught table etiquette can explain why it won't follow the rules

   We use so-called inner speech, where we talk to ourselves, to evaluate situations and make better decisions. Now a robot has been trained to speak its internal decision-making process aloud, giving us insight into how it prioritizes competing demands.
   Arianna Pipitone and Antonio Chella at the University of Palermo, Italy, programmed a humanoid robot called Pepper, built by SoftBank Robotics in Japan, with software that models human cognitive processes, together with a text-to-speech processor. This allows Pepper to voice its decision-making process aloud while completing a task. "With inner speech, we can better understand what the robot wants to do and what its plan is," says Chella.
   The software allows Pepper to pull relevant information from its memory and figure out the correct way to execute human commands.
   The researchers asked Pepper to set the table according to etiquette rules they had coded into the robot. Inner speech was switched on or off to see how it affected Pepper's ability to do what was asked.
   When instructed to place the napkin on the fork with inner speech enabled, Pepper questioned the order to itself and concluded that the request went against the rules it had been given. It then asked the researchers whether placing the napkin on the fork was the correct action. When told that it was, Pepper said, "Okay, I'll do what you want," and explained how it would place the napkin on the fork.
   When asked to perform the same task with inner speech switched off, Pepper knew that the request contradicted the etiquette rules, so it neither carried it out nor explained why.
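   The behaviour in those two runs boils down to a simple decision procedure. The following is a minimal Python sketch of that logic, not the Palermo team's actual software: the rule set, the speak and handle_request names and the confirmation step are all invented for illustration.

# Minimal, illustrative sketch of the inner-speech behaviour described above.
ETIQUETTE_RULES = {"napkin": "to the left of the plate", "fork": "on the napkin"}

def speak(text):
    # Stand-in for the text-to-speech step: the inner speech is voiced aloud.
    print(f"Pepper (inner speech): {text}")

def handle_request(item, target, inner_speech, confirm):
    # Compare the request with the coded etiquette rule for that item.
    expected = ETIQUETTE_RULES.get(item)
    if expected == target:
        return f"Placing the {item} {target}."
    if not inner_speech:
        # Inner speech off: a conflicting request is refused without explanation.
        return "Request refused."
    # Inner speech on: voice the conflict, then check with the human.
    speak(f"The {item} should go {expected}, but I was asked to put it {target}.")
    speak("That goes against the rules I was given, so I will ask.")
    if confirm(f"Is placing the {item} {target} really what you want?"):
        speak("Okay, I'll do what you want.")
        return f"Placing the {item} {target}."
    return f"Placing the {item} {expected}, as the rules say."

# With inner speech, the conflict is voiced and confirmed with the human.
print(handle_request("napkin", "on the fork", True, lambda question: True))
# Without inner speech, the same conflicting request is simply refused.
print(handle_request("napkin", "on the fork", False, lambda question: False))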
   "The inner speech" is now just a basic program, says Pipitone. “At this point, it's a narrative of Pepper's process,” she said.
   Sarah Sebo at the University of Chicago says that as robots become more common in the near future, this kind of programming could help people understand what robots can do and what their limitations are. "It maintains our trust over time and allows for seamless human-robot collaboration and interaction," she says.
   However, the experiment used only a single human participant, says Sebo. "It's not clear how their approach would compare across many different participants," she says.
   Hearing a robot voice its decision-making process increases transparency between humans and robots, says Pipitone. That could be useful for collaborative tasks, such as working alongside a medical robot, or for resolving a deadlock situation with a robot. "It can be very important to understand why a robot makes one decision rather than another," she says.