AI News, Artificial intelligence researchers boycott South Korean university amid fears it is developing killer robots

Leading artificial intelligence researchers have boycotted South Korea’s top university after it teamed up with a defence company to develop “killer robots” for military use.

An open letter sent to the Korea Advanced Institute of Science and Technology (KAIST) stated that the 57 signatories from nearly 30 different countries would no longer visit or collaborate with the university until autonomous weapons were no longer developed at the institute.

“If we combine powerful burgeoning AI technology with insecure robots, the Skynet scenario of the famous Terminator films all of a sudden seems not nearly as far-fetched as it once did,” Lucas Apa, a senior security consultant from the cybersecurity firm IOActive, told The Independent.

“If robot ecosystems continue to be vulnerable to hacking, robots could soon end up hurting instead of helping us.”

KAIST president Sung-Chul Shin responded to the open letter, claiming that the university had 'no intention' of developing lethal autonomous weapons.

Autonomous Weapons: an Open Letter from AI Robotics Researchers

Such weapons might include, for example, armed quadcopters that can search for and eliminate people meeting certain pre-defined criteria, but they do not include cruise missiles or remotely piloted drones for which humans make all targeting decisions.

Artificial Intelligence (AI) technology has reached a point where the deployment of such systems is — practically if not legally — feasible within years, not decades, and the stakes are high: autonomous weapons have been described as the third revolution in warfare, after gunpowder and nuclear arms.

Just as most chemists and biologists have no interest in building chemical or biological weapons, most AI researchers have no interest in building AI weapons — and do not want others to tarnish their field by doing so, potentially creating a major public backlash against AI that curtails its future societal benefits.

Indeed, chemists and biologists have broadly supported international agreements that have successfully prohibited chemical and biological weapons, just as most physicists supported the treaties banning space-based nuclear weapons and blinding laser weapons.

Elon Musk leads 116 experts calling for outright ban of killer robots

Some of the world’s leading robotics and artificial intelligence pioneers are calling on the United Nations to ban the development and use of killer robots.

While AI can be used to make the battlefield a safer place for military personnel, experts fear that offensive weapons that operate on their own would lower the threshold of going to battle and result in greater loss of human life.

The letter, launching at the opening of the International Joint Conference on Artificial Intelligence (IJCAI) in Melbourne on Monday, has the backing of high-profile figures in the robotics field and strongly stresses the need for urgent action, after the UN was forced to delay a meeting that was due to start Monday to review the issue.

The founders call for “morally wrong” lethal autonomous weapons systems to be added to the list of weapons banned under the UN’s convention on certain conventional weapons (CCW), brought into force in 1983, which covers weapons such as intentionally blinding lasers.

“We need to make decisions today choosing which of these futures we want.”

Musk, one of the signatories of the open letter, has repeatedly warned of the need for proactive regulation of AI, calling it humanity’s biggest existential threat. But while AI’s destructive potential is considered by some to be vast, it is also thought to be distant.

Ryan Gariepy, the founder of Clearpath Robotics, said: “Unlike other potential manifestations of AI which still remain in the realm of science fiction, autonomous weapons systems are on the cusp of development right now and have a very real potential to cause significant harm to innocent people along with global instability.”

This is not the first time the IJCAI, one of the world’s leading AI conferences, has been used as a platform to discuss lethal autonomous weapons systems.

An Open Letter to the United Nations Convention on Certain Conventional Weapons

As companies building the technologies in Artificial Intelligence and Robotics that may be repurposed to develop autonomous weapons, we feel especially responsible in raising this alarm.

We warmly welcome the decision of the UN’s Conference of the Convention on Certain Conventional Weapons (CCW) to establish a Group of Governmental Experts (GGE) on Lethal Autonomous Weapon Systems.

Many of our researchers and engineers are eager to offer technical advice to your deliberations.

We commend the appointment of Ambassador Amandeep Singh Gill of India as chair of the GGE.

We entreat the High Contracting Parties participating in the GGE to work hard at finding means to prevent an arms race in these weapons, to protect civilians from their misuse, and to avoid the destabilizing effects of these technologies.

We regret that the GGE’s first meeting, which was due to start today (August 21, 2017), has been cancelled due to a small number of states failing to pay their financial contributions to the UN.

We urge the High Contracting Parties therefore to double their efforts at the first meeting of the GGE now planned for November.

Lethal autonomous weapons threaten to become the third revolution in warfare.

Once developed, they will permit armed conflict to be fought at a scale greater than ever, and at timescales faster than humans can comprehend.

These can be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways.

We do not have long to act.

Once this Pandora’s box is opened, it will be hard to close.

We therefore implore the High Contracting Parties to find a way to protect us all from these dangers.

Signatories include Tiberio Caetano, Yoshua Bengio (founder of Element AI), Ryan Gariepy (founder and CTO of Clearpath Robotics), Henri Valpola (CTO of ZenRobotics) and Esben Østergaard, among the 116 founders and executives of AI and robotics companies who signed the letter.

Musk, Wozniak and Hawking urge ban on warfare AI and autonomous weapons

Over 1,000 high-profile artificial intelligence experts and leading researchers have signed an open letter warning of a “military artificial intelligence arms race” and calling for a ban on “offensive autonomous weapons”.

The letter states: “AI technology has reached a point where the deployment of [autonomous weapons] is – practically if not legally – feasible within years, not decades, and the stakes are high: autonomous weapons have been described as the third revolution in warfare, after gunpowder and nuclear arms.”

The authors argue that AI can be used to make the battlefield a safer place for military personnel, but that offensive weapons that operate on their own would lower the threshold of going to battle and result in greater loss of human life.

Should one military power start developing systems capable of selecting targets and operating autonomously without direct human control, it would start an arms race similar to the one for the atom bomb, the authors argue. Unlike nuclear weapons, however, AI requires no specific hard-to-create materials and will be difficult to monitor.

Killer robots: Experts warn of 'third revolution in warfare'

More than 100 leading robotics experts are urging the United Nations to take action in order to prevent the development of 'killer robots'.

'Once developed, they will permit armed conflict to be fought at a scale greater than ever, and at timescales faster than humans can comprehend,' the letter says.

'These can be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways,' it adds.

Those in favour of killer robots believe the current laws of war may be sufficient to address any problems that might emerge if they are ever deployed, arguing that a moratorium, not an outright ban, should be called if this is not the case.


Artificial intelligence researchers subsequently ended their boycott of KAIST after the university gave assurances that it would not develop lethal autonomous weapons.
