Isaac Asimov's 1st Law Test
Isaac Asimov, the famous science fiction writer, set out the "Three Laws of Robotics", the first of which stipulates: "A robot may not injure a human being or, through inaction, allow a human being to come to harm." That sounds simple, but a recent experiment has shown that obeying this law turns out to be extremely complicated: a robot has to weigh, and even agonize over, its decision before acting.
Robot scientist Alan Winfield of the Bristol Robotics Laboratory in the UK recently published a small experiment to investigate whether a robot can follow the First Law. His group used a small table with three robots on it: one robot under test and two others playing the role of humans. The problem arises when both "humans" head toward a hole as if intending to fall in. What should the robot do? The law clearly says it must not, through inaction, allow a human to come to harm, that is, it cannot simply let them perish without trying to save them.
At first, the robot passed the test: when only one "human" appeared and deliberately walked toward the hole, the robot moved out and blocked that person. But when two "human" objects appeared and both attempted to fall in, the robot became confused, not knowing whom to save first; sometimes it tried to save both at once and failed. In 14 of 33 trials, the robot took so long to decide whom to save that it lost the chance entirely and both "humans" fell into the hole.
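The dilemma above can be sketched as a tiny decision rule. This is purely an illustration, not Winfield's actual controller: the robot estimates each "human's" time until it reaches the hole, keeps only those it can still intercept in time, and picks the most urgent one. All positions, speeds, and function names here are assumptions made up for the example.

```python
def time_to(pos, target, speed):
    """Steps for an agent at `pos` moving at `speed` to reach `target` (1-D toy world)."""
    return abs(target - pos) / speed

def choose_rescue(robot_pos, humans, hazard, robot_speed=2.0, human_speed=1.0):
    """Index of the human the robot should intercept, or None if none is reachable.

    Greedy First Law rule: among humans the robot can reach before they fall,
    save whichever will fall first. When two humans are equally urgent, the
    tie-break is arbitrary -- the analogue of the real robot's dithering.
    """
    reachable = []
    for i, h in enumerate(humans):
        t_fall = time_to(h, hazard, human_speed)       # when this human falls in
        t_reach = time_to(robot_pos, h, robot_speed)   # when the robot gets there
        if t_reach <= t_fall:                          # interception still possible
            reachable.append((t_fall, i))
    if not reachable:
        return None                                    # harm through inaction
    return min(reachable)[1]                           # most urgent reachable human

print(choose_rescue(0.0, [5.0], 10.0))        # one human: clear decision (0)
print(choose_rescue(0.0, [8.0, 5.0], 10.0))   # human 0 already unreachable, save 1
print(choose_rescue(0.0, [9.5], 10.0))        # too late for anyone: None
```

Even this toy rule shows why the real experiment is hard: the moment the "humans" keep moving, `t_fall` and `t_reach` change every cycle, so a robot that re-evaluates the rule continuously can flip between targets until neither remains reachable.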
This test is considered very important for the robotics industry, as well as for self-driving cars. For example, if someone intentionally steps in front of a moving car to commit suicide, how should the vehicle react to protect both the passengers inside and the person outside?
In 1942, Isaac Asimov published the science fiction short story Runaround, which introduced the three laws that laid the foundation for later robot development:
Law 1: A robot may not injure a human being or, through inaction, allow a human being to come to harm.
Law 2: A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
Law 3: A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.