
116 Experts Urge UN to Ban ‘Autonomous Weapons’

Unlike its movie counterpart, the real-world Skynet does not kill people. Or does it?
Image: Automated weapons

Skynet, from the Terminator movie franchise, has a namesake at the NSA. Unlike its movie counterpart, the real-world Skynet does not kill people. Or does it? Skynet helps the NSA prepare a ‘kill list’, which is then ratified by human beings. Drones take over the execution. Today, it would not be science fiction to imagine drones making these decisions by themselves, without any human intervention. ‘Autonomous weapons’, weapons that do not require human intervention, are a reality today. Just like ‘autonomous automobiles’, a.k.a. ‘driverless cars’, these weapons can make their own decisions and require very little human input, if any.

As fiction turns to reality, there are widespread concerns about the consequences of such technologies. Two years ago, in 2015, over 1,000 researchers signed an open letter urging the United Nations to act and stop the development of weaponised artificial intelligence. The UN responded by convening a group of experts in 2016, under the Convention on Certain Conventional Weapons (CCW). However, this group is yet to meet – a meeting scheduled for this month was postponed due to financial problems. The UN’s slow progress has not kept pace with the rapid growth of robotics and AI, and we are getting ever closer to the use of AI in weapons. In the opinion of experts, this will allow wars to be waged at a scale and speed that humans cannot even comprehend.

Shortly after the meeting was cancelled, a group of 116 experts on robotics and AI, from 26 countries, penned another open letter. They urged the UN to act quickly and ensure support for the Group of Governmental Experts (GGE) formed to look into the matter. The GGE is now scheduled to meet in November 2017. Leading figures from the AI research community as well as from companies working on AI are among the signatories to this letter. Well-known names include Elon Musk, Mustafa Suleyman (co-founder of Google’s DeepMind, where he heads Applied AI) and Fahad Azad (founder of Robosoft Systems, India).

The signatories, 7 of whom are from India, raised an alarm about a possible ‘third revolution’ in weapons (the first two being gunpowder and nuclear weapons), which could quickly escalate into an arms race. The group cautioned against the misuse of these technologies, which are particularly vulnerable to hacking and could put countless innocent lives at risk.

A problem with AI technologies, as pointed out in an earlier article on this site (https://newsclick.in/artificial-intelligence-and-threat-humanity), is that they are neither auditable nor answerable. No one, not even the designer of the algorithms, has a clue as to why the program made a certain decision. While this might still be acceptable for deciding the fare of your next Uber ride, it is absolutely unacceptable to have such algorithms decide the fate of people. As a general principle, AI should be regulated and made accountable. It should be possible to design these systems in such a way that one can understand why a particular decision was made or how it was arrived at. Artificial intelligence, like nuclear technology, will have to be regulated. Misuse of these technologies could prove too costly for the human race itself.

Full text of the open letter below:

An Open Letter to the United Nations Convention on Certain Conventional Weapons

As companies building the technologies in Artificial Intelligence and Robotics that may be repurposed to develop autonomous weapons, we feel especially responsible in raising this alarm. We warmly welcome the decision of the UN’s Conference of the Convention on Certain Conventional Weapons (CCW) to establish a Group of Governmental Experts (GGE) on Lethal Autonomous Weapon Systems. Many of our researchers and engineers are eager to offer technical advice to your deliberations. We commend the appointment of Ambassador Amandeep Singh Gill of India as chair of the GGE. We entreat the High Contracting Parties participating in the GGE to work hard at finding means to prevent an arms race in these weapons, to protect civilians from their misuse, and to avoid the destabilizing effects of these technologies.

We regret that the GGE’s first meeting, which was due to start today, has been cancelled due to a small number of states failing to pay their financial contributions to the UN. We urge the High Contracting Parties therefore to double their efforts at the first meeting of the GGE now planned for November.

Lethal autonomous weapons threaten to become the third revolution in warfare. Once developed, they will permit armed conflict to be fought at a scale greater than ever, and at timescales faster than humans can comprehend. These can be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways. We do not have long to act. Once this Pandora’s box is opened, it will be hard to close.

We therefore implore the High Contracting Parties to find a way to protect us all from these dangers.
