What crisis do AI weapons and autonomous drones bring to the battlefield?
Drones have been used in warfare for years, but they were generally remotely controlled by humans. By combining image-recognition software with autopilot software, however, autonomous drones can now be mass-produced at low cost. The Washington Post examines autonomous drones used as weapons and weapons equipped with AI.
Autonomous weapons already exist and are playing a role on battlefields like Libya and Armenia --The Washington Post
In March 2020, Libyan government forces, backed by Turkish support, used autonomous drones to attack extremist fighters. In May 2021, Israel carried out the world's first fully autonomous group drone flight to collect strategic intelligence. In addition to attack-capable drones, Turkey and Israel have also deployed drones with a self-destruct function that specialize in gathering intelligence.
A total ban on autonomous lethal weapons, so-called 'killer robots', has long been demanded by human rights activists and was supported by 30 countries as of 2021. The world's leading military powers, however, insist that a total ban is unnecessary. The United States has stated that 'concerns have been exaggerated and humans can effectively control autonomous weapons,' while the Kremlin has said that 'there are no true AI weapons yet and they cannot be banned.'
But the Washington Post argues that, in reality, technological advances mean weapons that make their own decisions have already killed people, as in the Libyan Civil War. Because these drones support both remote control and autonomous operation, it is impossible to tell from the outside whether a human made the final decision to strike an individual target.
Autonomous drones are becoming a main force on the battlefield, and dozens of government projects to develop them are now underway. Countries such as the United States, China, and Russia continue development even while participating in discussions on treaties limiting autonomous drones.
Over the last decade, computers that can process large data sets in a short time have become easier to obtain, and researchers have made great strides in designing programs that handle vast amounts of information. Advances in AI have made it possible for machines to write poetry, translate languages accurately, and even help scientists develop new drugs.
However, the debate about the dangers of relying on computers for decision-making is intensifying. Companies such as Google, Amazon, Apple, and Tesla are spending heavily on the technology, efforts are underway to keep AI free of bias, and NeurIPS, a major international AI conference, has announced that submitted papers must include a description of their potential impact on society.
In some countries, AI technologies such as facial recognition are already deployed in autonomous weapons. As early as 2010, Samsung's weapons division developed a sentry gun that uses image recognition to detect humans and open fire. A similar gun was deployed on the border between Israel and the Gaza Strip, but the South Korean and Israeli governments say, 'The system certainly works automatically, but it is humans who control it.'
'Technology makes weapons smarter, but it also makes it easier for humans to control them remotely,' said Paul Scharre, a former US Special Operations Command soldier and vice president of the Center for a New American Security. 'If you realize after launching a missile that you could hit a civilian, you can stop it.'
However, Dahn Kaiser, an autonomous drone expert at the international peace organization PAX, countered, 'Still, at the speed required on the battlefield, more decisions will inevitably be left to machines. It is not so unrealistic that war will proceed at a speed we humans can no longer control.'