Whom will self-driving cars choose to hit, and whom will they avoid?

Within about 10 years, self-driving cars will be everywhere

Self-driving cars are the inevitable future of the automobile. Within about a decade, they will appear in large numbers on the streets. That raises a question: if an accident is unavoidable, whom will a self-driving car choose to hit?

According to tech site TechCrunch, scientist Iyad Rahwan and his colleagues at the Massachusetts Institute of Technology are studying the ethics of self-driving cars. As Rahwan put it: "Every time a self-driving car performs a complex maneuver, it also has to weigh the possible risks to different parties."

Rahwan described a hypothetical, unavoidable crash. A person suddenly falls into the path of a self-driving car travelling at high speed, while a concrete barrier sits just ahead on the car's left. How should the car respond? Should it swerve away from the fallen person and crash into the barrier, likely killing its occupant, or should it keep going straight to save its occupant and run over the person in the road?

[Image: How should the self-driving car respond in this situation?]

Rahwan notes that, for now, a self-driving car killing a person is only a hypothetical, but it becomes entirely possible once the streets are flooded with autonomous vehicles. That is why ethical questions about self-driving cars are being raised more and more often. For example, how should a self-driving car behave when passing a cyclist or a pedestrian?

Ryan Jenkins, a professor of philosophy at California Polytechnic State University, said: "Whenever you drive on the street, you are putting the people around you at risk. When we pass a cyclist or a pedestrian, we usually leave them a generous safety margin, even if we are confident our car will not hit them, because things can happen suddenly that no one anticipates: a cyclist may fall, a pedestrian may slip."

To ensure safety, self-driving cars will have to slow down whenever they detect pedestrians near the roadway, in case someone steps into the vehicle's path. That is the view of Noah Goodall, a researcher at the Virginia Transportation Research Council.

A human driver handles these situations by intuition, but for artificial intelligence it is not so simple. Self-driving software needs clearly defined rules for handling unexpected situations, or at least rules derived from general traffic regulations. Hopefully lawmakers will soon draw up such rules for self-driving cars.
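As a rough illustration only, the kind of explicit slow-down rule Goodall describes might look like the Python sketch below. The object classes, distance thresholds and speed caps are assumptions invented for this example, not anything a manufacturer has published.

```python
from dataclasses import dataclass

@dataclass
class DetectedObject:
    kind: str          # e.g. "pedestrian", "cyclist", "car"
    distance_m: float  # distance from the vehicle in metres

def target_speed_kmh(cruise_kmh: float, objects: list[DetectedObject]) -> float:
    """Return a speed cap that shrinks as vulnerable road users get closer."""
    speed = cruise_kmh
    for obj in objects:
        if obj.kind in ("pedestrian", "cyclist"):
            if obj.distance_m < 10:
                speed = min(speed, 10)   # crawl: someone could step into our path
            elif obj.distance_m < 30:
                speed = min(speed, 30)   # slow down and keep a wide safety margin
    return speed

# Example: a pedestrian 8 m away forces the car down from 50 km/h to a crawl.
print(target_speed_kmh(50, [DetectedObject("pedestrian", 8.0)]))  # prints 10
```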

[Image: To ensure safety, self-driving cars will have to slow down when they detect pedestrians near the road.]

Are manufacturers not ready?

How are car manufacturers handling this ethical question? In many cases, they simply do not respond. Although the ethics of self-driving cars attracts plenty of interest, the car industry is trying to steer around it. An executive at Daimler AG, when asked, said that Mercedes-Benz's self-driving models would protect people at all costs, adding: "No software or self-driving system is entitled to make judgements about the value of human life." A Daimler representative insisted that the moral dilemma of self-driving cars is not a real problem, and that the company "is focusing on risk-avoidance measures so that its cars never end up in such dilemma situations."

Scientists, however, maintain that such risks are unavoidable for self-driving cars: brakes fail, other vehicles crash into them, and cyclists, pedestrians and pets suddenly appear in front of the car. Situations in which a self-driving car must make a hard choice are therefore very real.

Since Daimler claims to value every life equally, one might conclude that the company has no clear rules for handling situations in which human lives are at stake.

[Image: Google's self-driving cars will soon be available on the market.]

Google, meanwhile, has more detailed guidance on how its self-driving cars should handle accidents. In 2014, Sebastian Thrun, founder of Google X, said that its self-driving cars would choose to hit the smaller of two objects: "If a collision is unavoidable, the car will hit the smaller object."

A Google patent from 2014 describes a similar situation: the self-driving car steers away from a truck in its own lane, moves into the next lane and toward a smaller car, on the reasoning that a collision with the smaller vehicle would be safer.

Aiming for the smaller object is itself a moral decision, a choice to protect the car's passengers by minimizing the damage of the crash. But such a car shifts the risk onto pedestrians or the occupants of smaller vehicles. As Professor Patrick Lin of California Polytechnic State University wrote: "The smaller object could be a baby stroller or a small child."

In March 2016, Chris Urmson, then head of Google's self-driving car project, described a newer rule in an interview with the Los Angeles Times: "Our cars will try their hardest to avoid unprotected road users, that is, cyclists and pedestrians; after that, they will try to avoid other moving objects on the road." Compared with the rule of hitting the smaller object, this approach is more realistic: the car steers away so as to protect the people most vulnerable in an accident. Of course, this does not necessarily benefit the owner, who wants the car to protect their life at all costs.
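To make the contrast with the "hit the smaller object" rule concrete, here is a minimal, hypothetical Python sketch of the priority ordering Urmson describes. The class names and ranking are assumptions made for illustration, not Google's actual implementation.

```python
# Lower rank = avoided more urgently (worse to hit).
HARM_PRIORITY = {
    "pedestrian": 0,      # unprotected road users: avoid first
    "cyclist": 0,
    "moving_vehicle": 1,  # then other moving objects
    "static_object": 2,   # then static things such as barriers
}

def choose_collision_target(options: list[str]) -> str:
    """When a collision is unavoidable, pick the option ranked least harmful."""
    return max(options, key=lambda kind: HARM_PRIORITY[kind])

# Example: forced to choose between a cyclist and a concrete barrier,
# the rule sacrifices the barrier (and some of the car) to spare the cyclist.
print(choose_collision_target(["cyclist", "static_object"]))  # prints static_object
```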

Should self-driving cars distinguish the sex and age of passersby?

Another question arises: should self-driving cars distinguish the gender, age and social position of road users in order to act accordingly? Imagine two groups of passersby, one of doctors and pregnant women, the other of elderly people and children. If a collision cannot be avoided, which group should the car spare? It will probably take a long time before a self-driving car can even tell such groups apart.

[Image: In the long run, the strongest ethical argument for self-driving cars lies in the nature of the vehicle itself.]

More discussion is needed

How will we resolve the ethics of self-driving cars? Everyone agrees that more discussion is needed among scientists, legal experts and carmakers.

Last September, the US National Highway Traffic Safety Administration (NHTSA) issued a report which states: "Manufacturers and other entities should work with regulators and stakeholders (such as drivers, passengers...) to address these potential risk situations, to ensure that such decisions are made intentionally."

Wayne Simpson, an expert at a consumer rights organization, agrees with the NHTSA. As Simpson put it: "The public has a right to know: when a self-driving car is on the street, whose life will it prioritize? The passengers', the driver's or the pedestrians'? Which factors does it consider? If these questions are not answered in full, manufacturers will program self-driving cars by their own rules, which may not be consistent with social norms, ethics or the law."

Some technology firms seem to have taken this advice on board. Apple says it is conducting "thoughtful surveys" to gather input from industry leaders, consumers, federal agencies and experts. Ford says it is "working with a number of major universities and industry partners" on its self-driving vehicles. At the same time, Ford considers the hypothetical scenarios above a little too theoretical; its approach to self-driving ethics is to build good cars rather than agonize over unrealistic dilemmas that cannot really be solved.

[Image: Tesla offers a self-driving system on all of its models.]

In the long run, the strongest ethical argument for self-driving cars lies in the nature of the vehicle itself: self-driving cars are far safer than human drivers. Experts predict they could eliminate up to 90% of traffic accidents.

Still, to get there we need to create good enough rules and avoid mistakes that lead to controversy or litigation. As Rahwan and his colleagues Azim Shariff and Francois Bonnefon wrote in the New York Times: "The sooner self-driving cars are adopted, the more lives will be saved. But we also need to understand the psychological as well as the technical challenges, so that self-driving cars can free us from the tedious, time-consuming and dangerous driving we have been doing for more than a century."

Updated: 12 December 2018