Do We Want Robot Warriors to Decide Who Lives or Dies?
- On Sunday, June 3, 2018
Lately, fears of fiction turning to fact have been stoked by a confluence of developments, including important advances in artificial intelligence and robotics, along with the widespread use of combat drones and ground robots in Iraq and Afghanistan.
But it’s likely, and some say inevitable, that future AI-powered weapons will eventually be able to operate with complete autonomy, leading to a watershed moment in the history of warfare: For the first time, a collection of microchips and software will decide whether a human being lives or dies.
The poles of the debate are represented by those who fear that robotic weapons could start a world war and destroy civilization and others who argue that these weapons are essentially a new class of precision-guided munitions that will reduce, not increase, casualties.
Last year, the debate made news after a group of leading researchers in artificial intelligence called for a ban on “offensive autonomous weapons beyond meaningful human control.” In an open letter presented at a major AI conference, the group argued that these weapons would lead to a “global AI arms race” and be used for “assassinations, destabilizing nations, subduing populations and selectively killing a particular ethnic group.” The letter was signed by more than 20,000 people, including such luminaries as physicist Stephen Hawking and Tesla CEO Elon Musk, who last year donated US $10 million to a Boston-based institute whose mission is “safeguarding life” against the hypothesized emergence of malevolent AIs.
Three of the letter’s organizers, among them Toby Walsh from the University of New South Wales, Australia, expanded on their arguments in an online article for IEEE Spectrum, envisioning, in one scenario, the emergence “on the black market of mass quantities of low-cost, antipersonnel microrobots that can be deployed by one person to anonymously kill thousands or millions of people who meet the user’s targeting criteria.” The three added that “autonomous weapons are potentially weapons of mass destruction.
While some nations might not choose to use them for such purposes, other nations and certainly terrorists might find them irresistible.” It’s hard to argue that a new arms race culminating in the creation of intelligent, autonomous, and highly mobile killing machines would well serve humanity’s best interests.
“It might be a high-intensity straight-on conflict when there’s no time for humans to be in the loop, because it’s going to play out in a matter of seconds.” The U.S. military has detailed some of its plans for this new kind of war in a road map for unmanned systems, but its intentions on weaponizing such systems remain vague.
Asked about autonomous weapons, U.S. Deputy Secretary of Defense Robert Work insisted that the military “will not delegate lethal authority to a machine to make a decision.” But when pressed on the issue, he added that if confronted by a “competitor that is more willing to delegate authority to machines than we are...we’ll have to make decisions on how we can best compete.”
(In a video released after the demonstration, the robot is shown riding an ATV at a speed only slightly faster than a child on a tricycle.) China’s growing robotic arsenal includes numerous attack and reconnaissance drones.
The three countries’ approaches to robotic weapons, introducing increasing automation while emphasizing a continuing role for humans, point to a major challenge for any prohibition: a ban on fully autonomous weapons would not necessarily cover weapons that are nearly autonomous.
“But that seems like probably the last way that militaries want to employ autonomous weapons.” Much more likely, he adds, will be robotic weapons that target not people but military objects like radars, tanks, ships, submarines, or aircraft.
“Because humans are better at being flexible and adaptable to new situations that maybe we didn’t program for, especially in war when there’s an adversary trying to defeat your systems and trick them and hack them.” It’s not surprising, then, that DoDAAM, the South Korean maker of sentry robots, imposed restrictions on its robots’ lethal autonomy.
Arkin argues that autonomous weapons, just like human soldiers, should have to follow the rules of engagement as well as the laws of war, including international humanitarian laws that seek to protect civilians and limit the amount of force and types of weapons that are allowed.
Eventually, Arkin arrived at a set of algorithms, and using computer simulations and very simplified combat scenarios—an unmanned aircraft engaging a group of people in an open field, for example—he was able to test his methodology.
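The constraint-checking idea behind Arkin’s approach, sometimes described in his published work as an “ethical governor” that can veto a lethal action, can be illustrated with a minimal sketch. Everything below is invented for illustration (the class, field names, and the proportionality threshold are assumptions, not Arkin’s actual algorithms): the core idea is simply that a candidate engagement is permitted only if every constraint derived from the laws of war and the rules of engagement passes, and any single failure vetoes it.

```python
# Illustrative sketch only -- NOT Arkin's actual system. It shows the
# veto-style constraint check: an engagement proceeds only if all
# conditions derived from the laws of war / rules of engagement hold.

from dataclasses import dataclass

@dataclass
class Engagement:
    target_is_military_objective: bool  # distinction: only lawful military targets
    expected_civilian_harm: float       # estimated collateral harm (arbitrary units)
    expected_military_advantage: float  # estimated military value of the strike
    weapon_permitted: bool              # weapon type allowed under current ROE

def governor_permits(e: Engagement, proportionality_limit: float = 1.0) -> bool:
    """Return True only if every constraint passes; any failure is a veto."""
    if not e.target_is_military_objective:
        return False  # fails the distinction requirement
    if not e.weapon_permitted:
        return False  # weapon type barred by the rules of engagement
    # Proportionality: collateral harm must not be excessive relative to
    # the expected military advantage (threshold is an invented placeholder).
    if e.expected_civilian_harm > proportionality_limit * e.expected_military_advantage:
        return False
    return True

# A strike on a lawful target with low expected collateral harm passes...
print(governor_permits(Engagement(True, 0.2, 1.0, True)))   # True
# ...but the same strike is vetoed if the target is not a military objective.
print(governor_permits(Engagement(False, 0.2, 1.0, True)))  # False
```

The design choice worth noting is that the check is purely restrictive: it can only suppress actions the weapon system proposes, never initiate them, which mirrors how Arkin frames machine compliance with the rules of engagement.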
Christof Heyns, then the U.N. special rapporteur for human rights, wrote an influential report noting that the world’s nations had a rare opportunity to discuss the risks of autonomous weapons before such weapons were fully developed.
Reflecting on the meetings held so far, Heyns says that “if I look back, to some extent I’m encouraged, but if I look forward, then I think we’re going to have a problem unless we start acting much faster.” This coming December, the U.N.’s Convention on Certain Conventional Weapons will hold a five-year review conference, and the topic of lethal autonomous robots will be on the agenda.
- On Wednesday, June 26, 2019
MICRO DRONES KILLER ARMS ROBOTS - AUTONOMOUS ARTIFICIAL INTELLIGENCE - WARNING !!
Killer drone arms and artificial intelligence, an increasingly real fiction: social and smartphone facial recognition, smart swarms. Warning!
Automated Machine Gun Targets People from 1.5 miles
Feb 14 - South Korea has developed an automated, turret-based weapon platform capable of locking onto a human target three kilometers away. Tara Cleary ...
5 Robots that Will Take Over the World
Westworld's robots aren't the only ones learning to be more "alive." Just in time for the Westworld season finale, Dark5 examines 5 scary new skills being ...
Armies of the Future: AI Bots or “Enhanced” Humans?
In this week's edition of Mind Hack, Jeff DeRiso discusses an effort by the Army to train their autonomous combat robots to better recognize targets. He also talks ...
Future weapon used by intelligence(Mini Drone)
An AI-based drone weapon that can be launched to target individuals using facial recognition.
Israel's Killer Robots
Israel is the world's biggest exporter of military drones, used around the world for everything from surveillance to precision rocket attacks on speeding cars in ...
Top 10 FUTURE WEAPONS Already Being Used
Welcome to Top10Archive! While peace is ideal, conflict always seems inevitable. While not at war, most nations will continue their research into devastating ...
Rules of war (in a nutshell)
Yes, even wars have laws. To find out more, visit ******** Since the beginning, humans have resorted to ..
Phalanx CIWS Close-in Weapon System In Action - US Navy's Deadly Autocannon
Footage of the Phalanx CIWS Close-in Weapon System in various target practicing exercises. The Phalanx Close-In Weapons System (CIWS) was developed as ...
How to regulate Autonomous Weapon Systems
EU Non-Proliferation and Disarmament Conference 2015 Special Session 10 Chair: Beth Stevenson, Aerospace and Defence reporter, Flightglobal Noel ...