"Do not make killer robots," urge Dr. Hawking and over 1,000 researchers
By Nathan Rupert
In the "Terminator" movie series, a computer called "Skynet" appears and attacks the humans who try to stop it, treating them as enemies. In reality, no Skynet-class "computer with an ego" has yet appeared, but warning that artificial intelligence could be more dangerous than nuclear weapons, more than 1,000 artificial intelligence and robotics researchers, including theoretical physicist Dr. Stephen Hawking, Tesla CEO Elon Musk, Apple co-founder Steve Wozniak, and developers from Google's DeepMind, have published an open letter online.
FLI - Future of Life Institute
http://futureoflife.org/AI/open_letter_autonomous_weapons
Musk, Wozniak and Hawking urge ban on warfare AI and autonomous weapons | Technology | The Guardian
http://www.theguardian.com/technology/2015/jul/27/musk-wozniak-hawking-ban-ai-autonomous-weapons
The letter argues that "killer robots," robotic weapons that move autonomously using onboard artificial intelligence, are dangerous, and calls for an end to the competitive development of artificial intelligence for military purposes.
According to the letter, artificial intelligence is the third revolution in warfare, following "gunpowder" and "nuclear weapons." Nuclear weapons require raw materials that are not easy to obtain, so not just anyone can possess them; autonomous robotic weapons equipped with artificial intelligence, by contrast, use materials that are easy to source and can even be mass-produced. They would naturally flow into the black market, where terrorists could easily acquire them, and they could come to play the same role as the AK-47 (Kalashnikov), the automatic rifle found in every conflict zone today.
A campaign with a similar aim, the "Campaign to Stop Killer Robots," has already been running since April 2013.
Campaign to Stop Killer Robots
http://www.stopkillerrobots.org/
Although military authorities explain that humans will make the final decision on attacks that could cause fatal harm, Human Rights Watch points out that one cannot rule out robots one day being given the ability to use force on their own. Such robotic weapons are dangerous wherever they fail to comply with international humanitarian law, and they increase the risk of civilian injury or death in armed conflict.
Losing Humanity | Human Rights Watch
https://www.hrw.org/report/2012/11/19/losing-humanity/case-against-killer-robots
Ban 'Killer Robots' Before It's Too Late | Human Rights Watch
https://www.hrw.org/news/2012/11/19/ban-killer-robots-its-too-late
Back in August 2014, Mr. Elon Musk tweeted that artificial intelligence is potentially more dangerous than nuclear weapons and requires careful handling:
Worth reading Superintelligence by Bostrom. We need to be super careful with AI. Potentially more dangerous than nukes.
- Elon Musk (@elonmusk) August 3, 2014
Dr. Hawking has also warned of the dangers of artificial intelligence.
Artificial intelligence is potentially more dangerous than nuclear weapons; Dr. Hawking warns that "artificial intelligence will exceed humans within 100 years" - GIGAZINE
Mr. Steve Wozniak has also agreed with the two, voicing the worry that although smart devices were developed to take care of humans, as the devices grow ever more intelligent, humans may be pushed aside and end up as the robots' pets.
Apple co-founder Steve Wozniak on the Apple Watch, electric cars and the surpassing of humanity | afr.com
http://www.afr.com/technology/apple-cofounder-steve-wozniak-on-the-apple-watch-electric-cars-and-the-surpassing-of-humanity-20150323-1m3xxk
Incidentally, the United Nations has also been discussing this type of robotic weapon, but at a conference held in Geneva in April, despite strong campaigning from groups such as the Campaign to Stop Killer Robots, the UK expressed its opposition to halting the development of autonomous robotic weapons.
in Note, Posted by logc_nt