The 'king of physics' Stephen Hawking warns of 'killer robots'
Physics professor Stephen Hawking, billionaire Elon Musk, and many other scientists warn that a global arms race based on artificial intelligence will almost certainly happen unless the world prohibits the development of this kind of weapon.
The 'king of physics' warns of 'killer robots'
Billionaire Elon Musk, chief executive of Space Exploration Technologies Corporation (SpaceX), the 'king of physics' Stephen Hawking, and many other high-tech experts have signed an open letter warning of and protesting against a global arms race in artificial intelligence (AI) weapons, unless the UN supports a ban on weapons that humans "cannot control."
A robot powered by artificial intelligence. (Artwork: Digital Storm / Shutterstock)
The Future of Life Institute presented the letter at the International Joint Conference on Artificial Intelligence, held in Buenos Aires, Argentina, on July 27.
" The key question for mankind today is to start a global AI weapon race or stop it from the start. As long as any military power is promoting AI weapons development, won't it pictures from a worldwide arms race, "Live Science quotes the letter.
The letter's signatories also said that the risk posed by AI weapons could be far greater than that of nuclear weapons.
The rise of machines
Robots already do a great deal of work, from self-driving cars to sex robots. The inevitable development of AI machines suggests both "utopian" and "apocalyptic" scenarios for the future.
The threat of malevolent artificial intelligence to humanity is a prominent theme in sci-fi films such as "The Matrix" and "2001: A Space Odyssey." This fear, however, is growing beyond the movies: artificial intelligence researchers themselves have expressed concern about advances in the field.
" Automatic AI weapons, such as unmanned search and murder aircraft using face recognition algorithms, are technologies that can emerge in the next few years ," the letter's authors wrote. comment.
In addition, the materials for building AI killing machines are neither expensive nor hard to find, so every military power in the world could own them, and criminals and terrorists could easily buy them on the black market for nefarious purposes.
" Automated weapons are ideal for assassination missions, destabilizing the nation, suppressing the people and destroying the selection of a particular group of people. So we believe the military weapons race AI. not for mankind, " the letter wrote.
This is not the first time leading scientists and technologists have warned of the dangers of AI. In 2014, physicist Stephen Hawking said, "The development of full artificial intelligence could spell the end of the human race."
Hawking and Musk also signed a Future of Life Institute letter in January 2015 warning that AI poses a huge danger unless humanity can ensure that AI systems will "do what we want them to do."