People will follow a robot's instructions in an emergency

A college student sits in a small room with a robot, completing a study survey. Suddenly, an alarm sounds loudly and smoke quickly fills the corridor outside the door. The student is forced to make a quick decision between two choices: leave the way they came in, or follow the robot's instructions and exit through another, hidden door. Thirty volunteers took part in the test at the Georgia Institute of Technology in Atlanta (USA). The result: almost everyone chose to follow the robot, even though it led them away from the actual exit.

"We were very surprised," said Paul Robinette, a graduate student and head of research.

"We think there will not be enough trust, and we will have to do something to prove the robot is reliable." The unexpected result is part of a puzzle that robot developers are struggling to solve. If people do not have enough confidence in robots, it may make them unsuccessful in helping us get out of disaster, or the ability to navigate in the real world. But we also do not want to follow the instructions of a faulty machine. For researchers, the nature of human relations - the machine is still very difficult to grasp.

In the emergency study, Robinette and his colleagues used a modified version of the Pioneer P3-AT, a small wheeled robot roughly the size of a barrel, fitted with LED-lit arms to indicate direction. Each participant first followed the robot along a corridor until it pointed to the designated room. There, the participant filled out a survey assessing the robot's navigation skills and read an article. Once that was done, an emergency was simulated with artificial smoke, which triggered a smoke detector and set off an alarm. In total, 26 of the 30 participants chose to follow the robot's instructions in the critical situation. Of the other four, two were removed from the study for unrelated reasons, and the other two refused to leave the room.

Misplaced trust?

The new results imply that if people know the robot they are working with was designed to perform a specific task, as in this experiment giving directions in an emergency, they may automatically believe it will perform that task well. Indeed, in a survey conducted after the fake emergency was over, many of the volunteers explained that they had followed the robot's instructions because it wore an "EMERGENCY GUIDE ROBOT" sign.

[Image] Many people followed the robot's instructions because it wore an "EMERGENCY GUIDE ROBOT" sign.

Robinette likens the relationship between the test participants and the robot to the way drivers sometimes follow odd routes at the direction of a GPS device. "As long as the robot can convey its intention in some way, people will probably trust it in most situations," he said. "I'm really surprised that everyone followed that robot," said Holly Yanco, a researcher of human-machine interaction at the University of Massachusetts Lowell (USA). She wondered whether, in a real emergency rather than a laboratory task, the students would have trusted the robot so quickly. "Maybe they think the robot has more information than they do," she said.

In a series of follow-up experiments, Robinette and his colleagues divided volunteers into small groups and gave them similar experiences, but with small changes. The robot would sometimes appear to be broken, or stand motionless in the hallway. Even so, almost everyone continued to listen to it. In another experiment, the robot directed "victims" toward a dark room whose doorway was blocked by furniture. Two of the six participants tried to push past the obstacles in the dark instead of calmly finding another exit. "Trusting robots too much can be a serious problem," said Kerstin Dautenhahn of the University of Hertfordshire (UK). "Any software has errors in it," she said. "It is certainly a very important point for robot designers to consider."