AI News, Use of 'killer robots' in wars would breach law, say campaigners

The use of fully autonomous weapons in a theatre of war would breach international law, campaigners and experts say, as longstanding calls for a ban on “killer robots” intensify.

These AI-powered guns, planes, ships and tanks could fight future wars without being subject to any human control, as high-tech nations step up investment in the weapons and inch towards full autonomy.

Twenty-six countries explicitly support a prohibition on fully autonomous weapons, with Austria, Belgium and China recently joining thousands of scientists and artificial intelligence experts and more than 20 Nobel peace prize laureates in declaring their support.

“The groundswell of opposition among scientists, faith leaders, tech companies, nongovernmental groups, and ordinary citizens shows that the public understands that killer robots cross a moral threshold.”

If states recommend that negotiations begin in 2019, it would help pave the way for formal approval in November; at the last meeting, in April, almost every country agreed that some form of human control should be maintained over the use of force.

“The idea of delegating life-and-death decisions to cold, compassionless machines without empathy or understanding cannot comply with the Martens Clause, and it makes my blood run cold,” said Noel Sharkey, a roboticist who wrote about the reality of robot war as far back as 2007 and has acted as a spokesperson for the Campaign to Stop Killer Robots.

As rapid technological advances bring “killer robots” ever closer to reality, Amnesty International is calling on states to support the negotiation of new international law to ban fully autonomous weapons systems.

“We are calling on states present in Geneva this week to act with the urgency this issue demands, and to come up with an ambitious mandate to address the numerous risks posed by autonomous weapons.”

At the last meeting of the Convention on Certain Conventional Weapons (CCW), in April 2018, the majority of states emphasized the importance of retaining human control over weapons systems and the use of force, and expressed support for developing new international law on lethal autonomous weapons systems.

Although it is unclear how sophisticated future technologies will become, it is very unlikely that fully autonomous weapons systems could replicate the full range of inherently human characteristics necessary to comply with international law. These include the ability to analyse the intentions behind people’s actions, to assess and respond to often dynamic and unpredictable situations, and to make complex decisions about the proportionality or necessity of an attack.

“We are calling on states to take concrete steps to halt the spread of these dangerous weapons, both on the streets and on the battlefield, before it’s too late.”

Background

Amnesty International and its partners in the Campaign to Stop Killer Robots are calling for a total ban on the development, production and use of fully autonomous weapons systems, in light of the human rights and humanitarian risks they pose.

Fully Autonomous Weapons

The use of artificial intelligence in armed conflict poses a fundamental challenge to the protection of civilians and to compliance with international human rights and humanitarian law.

Fully autonomous weapons are distinct from remote-controlled weapon systems such as drones—the latter are piloted by a human remotely, while fully autonomous weapons would have no human guidance after being programmed.

Important questions raised by the development of fully autonomous weapons

Ongoing research and development in the field of fully autonomous weapons has reached a critical stage, requiring in-depth reflection on the further technical development of such weapon systems.

The debate on fully autonomous weapons raises fundamental ethical questions and questions of principle, chief among them whether human abilities, such as the assessment of proportionality and military necessity and the capability to distinguish between civilians and combatants, can be transferred to a machine.

Bearing in mind that most of today’s armed conflicts are intra-state conflicts without clear boundaries between a variety of armed groups and civilians, it is questionable whether a robot could be effectively programmed to avoid civilian casualties when humans themselves struggle to make such distinctions and to resolve these dilemmas in such conflict settings.

As the UN Special Rapporteur on extrajudicial, summary or arbitrary executions pointed out in his report to the Human Rights Council, the removal of humans from the selection and execution of attacks on targets marks a critical moment in a technology widely described as a “revolution in modern warfare”.

He urged states to think carefully about the implications of such weapon systems, noting that the technology makes states more likely to engage in armed conflict because it reduces the prospect of military casualties.

Supporters of fully autonomous weapons argue that these systems would help overcome human emotions such as panic, fear, or anger, which lead to misjudgment and incorrect choices in stressful situations.

Pakistan, Morocco, Mexico, Argentina (on behalf of the Group of Latin American and Caribbean Countries, GRULAC), Cuba, Sierra Leone, Switzerland, Algeria, and Egypt raised deep concerns about the future implications of such weapons and argued that these weapons should be discussed from the perspectives of both human rights law and international humanitarian law.

The campaign seeks to establish a coordinated civil society call for a ban on the development of fully autonomous weapons systems and to address the challenges these weapons pose to civilians and to international law.


Autonomous weapons could change battlefields of the future [Advertiser content from ICRC]

Robots will fight the wars of tomorrow.

The Threat of AI Weapons

Will artificial intelligence weapons cause World War III?

Lethal Autonomous Weapons

Biography: Stuart Russell received his B.A. with first-class honours in physics from Oxford University in 1982 and his Ph.D. in computer science from Stanford in ...

Autonomous Weapons Systems & the Role of Law

The development of autonomous weapons systems continues unabated. In the broadest sense, these are systems that can operate outside of direct human control.

The Dawn of Killer Robots (Full Length)

In INHUMAN KIND, Motherboard gains exclusive access to a small fleet of US Army bomb ..

34C3 - Regulating Autonomous Weapons

The time-travelling android isn’t even our biggest problem. Depending on the definition, ..


The short, disturbing film is the latest attempt by campaigners and concerned scientists to highlight the dangers of developing autonomous weapons that can ...

RT International: use of autonomous weapons and commercial drones

Physicist and member of the International Committee for Robot Arms Control Mark Gubrud speaks about the dangers of autonomous weapons.

Moral Math of Robots: Can Life and Death Decisions Be Coded?

A self-driving car has a split second to decide whether to turn into oncoming traffic or hit a child who has lost control of her bicycle. An autonomous drone needs ...