Autonomous armed robots could be an 'endless disaster'

Robots and autonomous weapons are increasingly common in the modern world. But scientists warn that, if not properly controlled, they could become an "endless disaster".

A report entitled "Autonomous Weapons and Operational Risk", published by the Center for a New American Security (CNAS) in the US earlier this week, raised concerns about the decision-making capabilities of autonomous weapons systems, including unmanned aircraft.

[Photo: an unmanned aerial vehicle (illustration)]
Autonomous weapons, such as unmanned aerial vehicles or advanced guided missile systems, can not only make wrong decisions but can also pose incalculable dangers if attacked by hackers.

Paul Scharre, a former Pentagon official and the lead author of the report, notes that armed robots can reduce the burden on people in war; however, those responsible must be wary of being lulled into a false sense of security.

Autonomous machines, such as drones, advanced guidance systems, and missiles, can not only make wrong decisions; they can also be turned against us if attacked by hackers.

Experts say that if either of these scenarios occurs, autonomous machinery could become an "endless disaster".

Hollywood has long issued similar warnings in its films. In the 1987 film RoboCop, for example, a security robot fails to correctly identify a civilian target and destroys it.

[Photo: scene from RoboCop (1987)]
Hollywood has long put this subject on film. In the 1987 RoboCop, a security robot mistakenly destroys a civilian target because it cannot identify it correctly.

Dr. Sarah Kreps, a drone warfare expert at Cornell University, warned that developing autonomous weapons raises two main problems: the lack of the subjective judgment needed to distinguish friend from enemy, and the threat posed by hackers.

Explaining the limitations of smart machines in recognizing targets, security experts have emphasized the need to keep humans in the loop. The inherent chaos of a war zone makes it difficult to select targets, especially if soldiers have been captured as prisoners.

"You cannot encode in an algorithm the subjective decision about who is a combatant and who is an ordinary person. A person, or many people, must analyze behavior to see whether someone is directly and actively taking part in the fight or not," Dr. Kreps said.

[Photo: illustration]
Machines cannot make subjective judgments the way humans do, and this ability cannot be programmed, so we cannot rely on them to recognize who is an enemy.

"Combatant status is often subjective and context-dependent, and it is not easy to program into an autonomous weapon. We should not be lulled into thinking that technology can make these decisions easier."

Beyond the lack of subjective judgment, autonomy brings the threat of cyber attack. If the security systems protecting autonomous technologies can be overridden by hackers, the result could be destruction on the battlefield.

"If a group of hackers can break through gaps in the Pentagon's security systems, they can almost certainly hack into the control systems of autonomous weapons. And that would bring an endless disaster," Dr. Kreps said.

The report concludes that, for security reasons, countries with access to autonomous weapons need to be transparent and clear with the rest of the world about how they deploy them.