$500 RC Truck Is an IED Detecting Robot That (Should Be) Affordable for Everyone

Robots like iRobot PackBots are great tools for (among other things) detecting IEDs, and they've managed to save the lives of countless soldiers, often by sacrificing themselves.

At this rate, it will be a while before every soldier can rely on a top-of-the-line EOD robot. But in many cases a top-of-the-line robot (one that costs a hundred thousand dollars) is overkill, or at the very least not strictly necessary: a far cheaper machine can still make a valuable contribution to a squad of soldiers.

A couple of weeks ago, the little truck was vaporized when it set off a 500-pound IED that might otherwise have been triggered by the Humvee itself. It was the fifth IED the truck had detected, though the first one it actually set off.

Military Robots and the Laws of War

More than just conventional wisdom, it has become almost a cliché to say that the wars in Afghanistan and Iraq have proved “how technology doesn’t have a big place in any doctrine of future war,” as one security analyst told me in 2007.

The American military efforts in those countries (or so the thinking goes) have dispelled the understanding of technology-dominated warfare that was prevalent just a few years ago—the notion that modern armed conflict would be fundamentally changed in the age of computers and networks.

Amid this ecstatic hype, it is not surprising that many security studies experts, both in and out of the defense establishment, latched onto the notion that linking up all our systems via electronic networks would “lift the fog of war,” allow war to be done on the cheap, and even allow the United States to “lock out” competition from the marketplace of war, much as they saw Microsoft doing to Apple at the time.

As one robotics executive put it at a demonstration of new military prototypes a couple of years ago, “The robots you are seeing here today I like to think of as the Model T.

We are seeing the very first stages of this technology.” And just as the Model T exploded on the scene—selling only 239 cars in its first year and over one million a decade later—the demand for robotic warriors is growing very rapidly.

During World War I, for example, all sorts of recent inventions, from airplanes dropping bombs to cannons shooting chemical weapons, were introduced before anyone agreed on the rules for their use—and, as to be expected, the warring sides sometimes took different interpretations on critical questions.

Most of the uses of unmanned systems were classified and thus there was little public knowledge of their relative successes, as well as no field tests or data collection to solve the problems they incurred (16 percent of the Fire Flys crashed).

As veteran robotics scientist Robert Finkelstein has pointed out, “It took decades for UAVs to recover from Vietnam misperceptions.” The next big U.S. military spending on unmanned planes didn’t come until 1979, with the Army’s Aquila program.

“The Iraqis came to learn that when they heard the buzz of a Pioneer overhead, all heck would break loose shortly thereafter because these sixteen-inch rounds would start landing all around them,” said Steve Reid, an executive at the Pioneer’s maker, AAI.

In one case, a group of Iraqi soldiers saw a Pioneer flying overhead and, rather than wait to be blown up, waved white bed sheets and undershirts at the drone—the first time in history that human soldiers surrendered to an unmanned system.

The Gulf War was the first war in history to involve widespread computers, used for everything from organizing the movement of hundreds of thousands of troops to sorting through reams of satellite photos looking for targets for missiles to hit.

Calling it a “technology war,” the victorious commanding general, “Stormin’” Norman Schwarzkopf, said, “I couldn’t have done it all without the computers.” Over the rest of the 1990s, as sensors and computer processors improved, unmanned systems became ever more capable.

“That’s when it really came together.” Now widely accessible by devices in automobiles, the GPS is a constellation of military satellites that can provide the location, speed, and direction of a receiver, anywhere on the globe.

(As one advertisement for an unmanned plane put it, “Can you keep your eyes open for thirty hours without blinking?”) They can operate in dirty environments, such as battle zones filled with biological or chemical weapons, or under other dangerous conditions, such as in space, in rough seas, or in flights with very high gravitational pressures.

The rising interest in robots in the late 1990s coincided with changing political winds—a shrinking U.S. military as part of the post-Cold War so-called “peace dividend,” and an increasing belief that public tolerance for military risk and casualties had dropped dramatically after the relatively costless victory in the Gulf War.

In 2000, this was the main factor that led Senator John Warner (R.-Va.), then chairman of the Armed Services Committee, to mandate in the Pentagon’s budget that by 2010, one-third of all the aircraft designed to attack behind enemy lines be unmanned, and that by 2015, one-third of all ground combat vehicles be driverless.

The annual national defense budget since 9/11 has risen to $515 billion (an increase of 74 percent between 2002 and 2008), not counting the cost of operations in Afghanistan and Iraq.

As the Washington Post put it, “The undertaking has attracted not only the country’s top weapons makers but also dozens of small businesses…all pitching a science-fiction gallery of possible solutions.” Robert Finkelstein recalled a time when he personally knew most of the engineers working on military robotics.

PackBot’s articulated track “flippers” allow it not only to roll forward and backward like regular tank tracks, but also to climb stairs, rumble over rocks, squeeze down twisting tunnels, and even swim in under six feet of water.

The designers at iRobot view their robots as “platforms.” PackBot has eight separate payload bays and hookups that allow its users to swap in whatever they need: mine detector, chemical and biological weapons sensor, or just extra power packs.

iRobot has new and improved versions of the PackBot as well as a host of plans to convert any type of vehicle into a robot, be it a car or ship, using a universal control unit that you plug into the engine and steering wheel.

It is really just a mobile platform, with a USB port—a universal connector—on top that can be used to plug in sensors, a gun, and a TV camera for battle, or an iPod and loudspeakers for a mobile rave party.

In technology circles, new products that change the rules of the game, such as what the iPod did to portable music players, are called “killer applications.” Foster-Miller’s new product gives this phrase a literal meaning.

SWORDS is basically the Talon’s tougher big brother, with its gripping arm replaced by a gun mount that can carry pretty much any weapon that weighs under three hundred pounds, ranging from an M-16 rifle and .50-caliber machine gun to a 40mm grenade launcher or an antitank rocket launcher.

The SWORDS can’t reload itself, but it can carry two hundred rounds of ammunition for the light machine guns, three hundred rounds for the heavy machine guns, six grenades, or four rockets.

At iRobot, researchers describe their rivals as thinking, “We hear that robots are trendy, so let’s do that.” At Foster-Miller, they retort, “We don’t just do robots and we don’t suck dirt.” (Indeed, thanks to its Roomba vacuum cleaner, iRobot may be the only company that sells at both Pentagon trade shows and Bed Bath & Beyond.)

If robots were pickup trucks, Foster-Miller represents the Ford model, stressing how the Talon is “Built Tough.” Its promotional materials describe the Talon as “The Soldier’s Choice.” Foster-Miller executives love to recount tales of how the Talon has proven it “can take a punch and stay in the fight.” The iRobot team, meanwhile, bristles at the idea that its systems are “agile but fragile.” They insist that the PackBot is tough, too, citing various statistics on how it can survive a 400 g-force hit, what they describe as the equivalent of being tossed out of a hovering helicopter onto a concrete floor.

They are most proud of the fact that their robots have a 95 percent out-of-the-box reliability rate, higher than any other in the marketplace, meaning that when the soldiers get them in the field, they can trust the robot will work as designed.

For testing purposes, iRobot’s PackBot has been equipped with a shotgun that can fire a variety of ammunition, including non-lethal rubber bullets, rounds that can blow down a door, and even more powerful “elephant killer” bullets.

Another version of PackBot is the Robotic Enhanced Detection Outpost with Lasers (REDOWL), which uses lasers and sound detection equipment to find any sniper who dares to shoot at the robot or accompanying troops, and instantly targets him with an infrared laser beam.

And the SWORDS itself is being replaced by a new version named after the Roman god of war—the MAARS (Modular Advanced Armed Robotic System), which carries a more powerful machine gun, 40mm grenade launchers, and, for non-lethal settings, a green laser “dazzler,” tear gas, and a loudspeaker, perfect for warning any insurgents that “Resistance is futile.” We will also soon see entirely new armed UGVs hit the battlefield.

About the size of a golf cart, the first version of the vehicle was operated by a soldier wielding a PlayStation video game controller, but software plug-ins will allow it to be upgraded to semiautonomous and then fully autonomous modes.

REX has an arm with six joints to drag a soldier to safety, while REV has a life-support pod that even comes with a flat-screen TV facing the wounded soldier’s face so that operators can see and communicate with the human on the other end if he is conscious.

It involves everything from replacing tens of thousands of armored vehicles with a new generation of manned and unmanned vehicles to writing the many millions of lines of software code for the new computers that will link them all together.

Lieutenant General Walter Buchanan, the U.S. Air Force commander in the Middle East, recalled the run-up to the Iraq war in an interview with Air Force Magazine: In March of 2002, [during] the mission briefings over Southern Iraq at that time, the mission commander would get up and he’d say, “OK, we’re going to have the F-15Cs fly here, the F-16s are going to fly here, the A-6s are going to fly here, tankers are going to be here today.” Then they would say, “And oh by the way, way over here is going to be the Predator.” We don’t go over there, and he’s not going to come over here and bother us….

Given its price tag—at just $4.5 million, it costs a fraction of what a manned fighter jet costs—the Predator can be used for missions where it might be shot down, such as traveling low and slow over enemy territory.

It was originally designed for reconnaissance and surveillance, and about a quarter of the cost of each Predator actually goes into the “Ball,” a round mounting under the nose that carries two variable-aperture TV cameras, one for seeing during the day and an infrared one for night, as well as a synthetic-aperture radar that allows the Predator to peer through clouds, smoke, or dust.

At forty feet long, Global Hawk has been described as looking like “a flying albino whale.” Originally conceived as an unmanned replacement for the half-century-old U-2 spy plane, Global Hawk can stay aloft up to thirty-five hours.

Like its older sibling the Hunter, which is twice as big and can stay in the air twice as long, the Shadow is used on a variety of tactical missions in support of ground forces, including reconnaissance, target acquisition, and battlefield damage assessment.

That same year, an Air Force lieutenant general forecast that “given the growth trends, it is not unreasonable to postulate future conflicts involving tens of thousands.” Among the UAVs likely to see action in those future conflicts will be Predators reconfigured for electronic warfare, submarine hunting, and even air-to-air combat.

For example, airships could literally be “parked” in the air, as high as one hundred thousand feet up, for weeks, months, or years, serving as a communications relay, spy satellite, hub for a ballistic missile defense system, floating gas station, or even airstrip for other planes and drones.

And a broad new set of robots is being introduced for war at sea, too, where the main differentiation is whether they are designed to operate on the surface, like a boat, or underwater, like a submarine.

The poet John Donne wrote in 1621 of how the invention of better cannons would help limit the cruelty and crimes of war, “and the great expence of bloud is avoyed.” Richard Gatling hoped his new fast-firing gun would serve to reduce the bloodshed of war, while Alfred Nobel believed the explosives he invented would make war unthinkable.

For example, even as recently as the 1999 Kosovo war, NATO pilots spotting for Serbian military targets on the ground had to fly over the suspected enemy position, then put their plane on autopilot while they wrote down the coordinates of the target on their lap with a grease pencil.

During that Kosovo campaign, for example, such a premium was placed on not losing any NATO pilots that planes were restricted from flying below fifteen thousand feet so that enemy fire couldn’t hit them.

Soldiers describe how one of the toughest aspects of fighting in cities is how you have to burst into a building and, in a matter of milliseconds, figure out who is an enemy and who is a civilian and shoot the ones that are a threat before they shoot you, all the while avoiding hitting any civilians.

Marc Garlasco of Human Rights Watch told me how “the single most distinguishing weapons I have seen in my career were Israeli UAVs.” He described how, unlike jet fighters that had to swoop in fast and make decisions on what targets to bomb in a matter of seconds, the UAVs he observed during the 2006 Lebanon war could loiter over a potential target for minutes or even hours, and pick and choose what to strike or not.

Robots, on the other hand, might live up to the sniper motto of “one shot, one kill.” As journalist Michael Fumento put it in describing SWORDS, the operator “can coolly pick out targets as if playing a video game.” But as journalist Chuck Klosterman put it, a person playing video games is usually “not a benevolent God.” We do things in the virtual world, daring and violent things, that we would never do if we were there in person.

Many studies, like Army psychologist Dave Grossman’s seminal book On Killing (1995), have shown how disconnecting a person, especially via distance, makes killing easier and abuses and atrocities more likely.

Keith Shurtleff, an Army chaplain and the ethics instructor for the Soldier Support Institute at Fort Jackson in South Carolina, worries that “as war becomes safer and easier, as soldiers are removed from the horrors of war and see the enemy not as humans but as blips on a screen, there is a very real danger of losing the deterrent that such horrors provide.” Participation via the virtual world also seems to affect not merely how people look at the target, but also how they look at themselves (which is why people in online communities, for example, take on identities and try out behaviors they never would in real life).

Research shows that this sort of “externalization” allows something called “doubling.” Otherwise nice and normal people create psychic doubles that carry out sometimes terrible acts their normal identity never would.

One Air Force lieutenant colonel who led a Predator operation noticed how the virtual setting could make it easy for the drone operators to forget that they were not gods from afar and that there are real humans on both ends.

“You have guys running the UAV saying, ‘Kill that one, don’t kill that one.’” Each new military technology, from the bow and arrow to the bomber plane, has moved soldiers farther and farther from their foes.

Even if a robot has software that follows all the various rules of engagement, and even if it were somehow absolutely free of software bugs and hardware failures (a big assumption), the very question of figuring out who an enemy is in the first place—that is, whether a target should even be considered for the list of screening questions—is extremely complicated in modern war.

Even more worrisome, the concept of keeping human beings in the loop is already being eroded by policymakers and by the technology itself, both of which are rapidly moving toward pushing humans out.

First, since it will be very difficult to guarantee that autonomous robots can, as required by the laws of war, discriminate between civilian and military targets and avoid unnecessary suffering, they should be allowed the autonomous use only of non-lethal weapons.

This sounds simple enough, but oddly the Pentagon has already pushed the legal interpretation that our drones have an inherent right to self-defense, including even to preemptively fire on potential threats, such as an anti-aircraft radar system that lights them up.

(Frankenstein shouldn’t get a free pass for his monster’s misdeeds.) If a programmer gets an entire village blown up by mistake, he should be criminally prosecuted, not get away scot-free or merely be punished with a monetary fine his employer’s insurance company will end up paying.

Similarly, if some future commander deploys an autonomous robot and it turns out that the commands or programs he authorized the robot to operate under somehow contributed to a violation of the laws of war, or if his robot were deployed into a situation where a reasonable person could guess that harm would occur, even unintentionally, then it is proper to hold the commander responsible.

To ensure that responsibility falls where it should, there should be clear ways to track the authority in the chain of design, manufacture, ownership, and use of unmanned systems, all the way from the designer and maker to the commanders in the field.

Not merely scientists, but everyone from theologians (who helped create the first laws of war) to the human rights and arms control communities must start looking at where this technological revolution is taking both our weapons and laws.

Over time, some sort of consensus might emerge—if not banning the use of all autonomous robots with lethal weapons, then perhaps banning just certain types of robots (such as ones not made of metal, which would be hard to detect and thus of most benefit to terrorist groups).

As Steven Metz of the Army War College says, “You have to remember that many consider international law to be a form of asymmetric warfare, limiting our choices, tying us down.” Yet history tells us that, time and again, the society that builds an ethical rule of law and stands by its values is the one that ultimately prevails on the battlefield.

As we begin to wrestle with the problems that robots present for the laws of war, we might find instructive the wisdom from a past generation that grappled with a revolutionary and fearsome new technology (in that case, atomic weapons).

Amid this ecstatic hype, it is not surprising that many security studies experts, both in and out of the defense establishment, latched onto the notion that linking up all our systems via electronic networks would “lift the fog of war,” allow war to be done on the cheap, and even allow the United States to “lock out” competition from the marketplace of war, much as they saw Microsoft doing to Apple at the time.

During World War I, for example, all sorts of recent inventions, from airplanes dropping bombs to cannons shooting chemical weapons, were introduced before anyone agreed on the rules for their use—and, as to be expected, the warring sides sometimes took different interpretations on critical questions.

Most of the uses of unmanned systems were classified and thus there was little public knowledge of their relative successes, as well as no field tests or data collection to solve the problems they incurred (16 percent of the Fire Flys crashed).

As veteran robotics scientist Robert Finkelstein has pointed out, “It took decades for UAVs to recover from Vietnam misperceptions.” The next big U.S. military spending on unmanned planes didn’t come until 1979, with the Army’s Aquila program.

“The Iraqis came to learn that when they heard the buzz of a Pioneer overhead, all heck would break loose shortly thereafter because these sixteen-inch rounds would start landing all around them,” said Steve Reid, an executive at the Pioneer’s maker, AAI.

In one case, a group of Iraqi soldiers saw a Pioneer flying overhead and, rather than wait to be blown up, waved white bed sheets and undershirts at the drone—the first time in history that human soldiers surrendered to an unmanned system.

The Gulf War was the first war in history to involve widespread computers, used for everything from organizing the movement of hundreds of thousands of troops to sorting through reams of satellite photos looking for targets for missiles to hit.

Calling it a “technology war,” the victorious commanding general, “Stormin’” Norman Schwarzkopf, said, “I couldn’t have done it all without the computers.” Over the rest of the 1990s, as sensors and computer processors improved, unmanned systems became ever more capable.

(As one advertisement for an unmanned plane put it, “Can you keep your eyes open for thirty hours without blinking?”) They can operate in dirty environments, such as battle zones filled with biological or chemical weapons, or under other dangerous conditions, such as in space, in rough seas, or in flights with very high gravitational pressures.

The rising interest in robots in the late 1990s coincided with changing political winds—a shrinking U.S. military as part of the post-Cold War so-called “peace dividend,” and an increasing belief that public tolerance for military risk and casualties had dropped dramatically after the relatively costless victory in the Gulf War.

In 2000, this was the main factor that led Senator John Warner (R.-Va.), then chairman of the Armed Services Committee, to mandate in the Pentagon’s budget that by 2010, one-third of all the aircraft designed to attack behind enemy lines be unmanned, and that by 2015, one-third of all ground combat vehicles be driverless.

As the Washington Post put it, “The undertaking has attracted not only the country’s top weapons makers but also dozens of small businesses…all pitching a science-fiction gallery of possible solutions.” Robert Finkelstein recalled a time when he personally knew most of the engineers working on military robotics.

These allow PackBot not only to roll forward and backward like regular tank tracks, but also to climb stairs, rumble over rocks, squeeze down twisting tunnels, and even swim in under six feet of water.

The designers at iRobot view their robots as “platforms.” PackBot has eight separate payload bays and hookups that allow its users to swap in whatever they need: mine detector, chemical and biological weapons sensor, or just extra power packs.

Meanwhile, iRobot has new and improved versions of the PackBot as well as a host of plans to convert any type of vehicle into a robot, be it a car or ship, using a universal control unit that you plug into the engine and steering wheel.

In technology circles, new products that change the rules of the game, such as what the iPod did to portable music players, are called “killer applications.” Foster-Miller’s new product gives this phrase a literal meaning.

SWORDS is basically the Talon’s tougher big brother, with its gripping arm replaced by a gun mount that can carry pretty much any weapon that weighs under three hundred pounds, ranging from an M-16 rifle and .50-caliber machine gun to a 40mm grenade launcher or an antitank rocket launcher.

The SWORDS can’t reload itself, but it can carry two hundred rounds of ammunition for the light machine guns, three hundred rounds for the heavy machine guns, six grenades, or four rockets.

At iRobot, researchers describe their rivals as thinking, “We hear that robots are trendy, so let’s do that.” At Foster-Miller, they retort, “We don’t just do robots and we don’t suck dirt.” (Indeed, thanks to its Roomba vacuum cleaner, iRobot may be the only company that sells at both Pentagon trade shows and Bed Bath

If robots were pickup trucks, Foster-Miller represents the Ford model, stressing how the Talon is “Built Tough.” Its promotional materials describe the Talon as “The Soldier’s Choice.” Foster-Miller executives love to recount tales of how the Talon has proven it “can take a punch and stay in the fight.” The iRobot team, meanwhile, bristles at the idea that its systems are “agile but fragile.” They insist that the PackBot is tough, too, citing various statistics on how it can survive a 400 g-force hit, what they describe as the equivalent of being tossed out of a hovering helicopter onto a concrete floor.

They are most proud of the fact that their robots have a 95 percent out-of-the-box reliability rate, higher than any other in the marketplace, meaning that when the soldiers get them in the field, they can trust the robot will work as designed.

For testing purposes, iRobot’s PackBot has been equipped with a shotgun that can fire a variety of ammunition, including non-lethal rubber bullets, rounds that can blow down a door, and even more powerful “elephant killer” bullets.

Another version of PackBot is the Robotic Enhanced Detection Outpost with Lasers (REDOWL), which uses lasers and sound detection equipment to find any sniper who dares to shoot at the robot or accompanying troops, and instantly targets him with an infrared laser beam.

And the SWORDS itself is being replaced by a new version named after the Roman god of war—the MAARS (Modular Advanced Armed Robotic System), which carries a more powerful machine gun, 40mm grenade launchers, and, for non-lethal settings, a green laser “dazzler,” tear gas, and a loudspeaker, perfect for warning any insurgents that “Resistance is futile.” We will also soon see entirely new armed UGVs hit the battlefield.

About the size of a golf cart, the first version of the vehicle was operated by a soldier wielding a PlayStation video game controller, but software plug-ins will allow it to be upgraded to semiautonomous and then fully autonomous modes.

REX has an arm with six joints to drag a soldier to safety, while REV has a life-support pod that even comes with a flat-screen TV facing the wounded soldier’s face so that operators can see and communicate with the human on the other end if he is conscious.

It involves everything from replacing tens of thousands of armored vehicles with a new generation of manned and unmanned vehicles to writing the many millions of lines of software code for the new computers that will link them all together.

Lieutenant General Walter Buchanan, the U.S. Air Force commander in the Middle East, recalled the run-up to the Iraq war in an interview with Air Force Magazine: In March of 2002, [during] the mission briefings over Southern Iraq at that time, the mission commander would get up and he’d say, “OK, we’re going to have the F-15Cs fly here, the F-16s are going to fly here, the A-6s are going to fly here, tankers are going to be here today.” Then they would say, “And oh by the way, way over here is going to be the Predator.” We don’t go over there, and he’s not going to come over here and bother us….

Given its price tag—at just $4.5 million, it costs a fraction of what a manned fighter jet costs—the Predator can be used for missions where it might be shot down, such as traveling low and slow over enemy territory.

It was originally designed for reconnaissance and surveillance, and about a quarter of the cost of each Predator actually goes into the “Ball,” a round mounting under the nose that carries two variable-aperture TV cameras, one for seeing during the day and an infrared one for night, as well as a synthetic-aperture radar that allows the Predator to peer through clouds, smoke, or dust.

At forty feet long, Global Hawk has been described as looking like “a flying albino whale.” Originally conceived as an unmanned replacement for the half-century-old U-2 spy plane, Global Hawk can stay aloft up to thirty-five hours.

Like its older sibling the Hunter, which is twice as big and can stay in the air twice as long, the Shadow is used on a variety of tactical missions in support of ground forces, including reconnaissance, target acquisition, and battlefield damage assessment.

That same year, an Air Force lieutenant general forecast that “given the growth trends, it is not unreasonable to postulate future conflicts involving tens of thousands.” Among the UAVs likely to see action in those future conflicts will be Predators reconfigured for electronic warfare, submarine hunting, and even air-to-air combat;

For example, airships could literally be “parked” in the air, as high as one hundred thousand feet up, for weeks, months, or years, serving as a communications relay, spy satellite, hub for a ballistic missile defense system, floating gas station, or even airstrip for other planes and drones.

And a broad new set of robots is being introduced for war at sea, too, where the main differentiation is whether they are designed to operate on the surface, like a boat, or underwater, like a submarine.

The poet John Donne told in 1621 how the invention of better cannons would help limit the cruelty and crimes of war, “and the great expence of bloud is avoyed.” Richard Gatling hoped his new fast-firing gun would serve to reduce the bloodshed of war, while Alfred Nobel believed the explosives he invented would make war unthinkable.

For example, even as recently as the 1999 Kosovo war, NATO pilots spotting for Serbian military targets on the ground had to fly over the suspected enemy position, then put their plane on autopilot while they wrote down the coordinates of the target on their lap with a grease pencil.

During that Kosovo campaign, for example, such a premium was placed on not losing any NATO pilots that planes were restricted from flying below fifteen thousand feet so that enemy fire couldn’t hit them.

Soldiers describe how one of the toughest aspects of fighting in cities is how you have to burst into a building and, in a matter of milliseconds, figure out who is an enemy and who is a civilian and shoot the ones that are a threat before they shoot you, all the while avoiding hitting any civilians.

Marc Garlasco of Human Rights Watch told me how “the single most distinguishing weapons I have seen in my career were Israeli UAVs.” He described how, unlike jet fighters that had to swoop in fast and make decisions on what targets to bomb in a matter of seconds, the UAVs he observed during the 2006 Lebanon war could loiter over a potential target for minutes or even hours, and pick and choose what to strike or not.

Robots, on the other hand, might live up to the sniper motto of “one shot, one kill.” As journalist Michael Fumento put it in describing SWORDS, the operator “can coolly pick out targets as if playing a video game.” But as journalist Chuck Klosterman put it, a person playing video games is usually “not a benevolent God.” We do things in the virtual world, daring and violent things, that we would never do if we were there in person.

Many studies, like Army psychologist Dave Grossman’s seminal book On Killing (1995), have shown how disconnecting a person, especially via distance, makes killing easier and abuses and atrocities more likely.

Keith Shurtleff, an Army chaplain and the ethics instructor for the Soldier Support Institute at Fort Jackson in South Carolina, worries that “as war becomes safer and easier, as soldiers are removed from the horrors of war and see the enemy not as humans but as blips on a screen, there is a very real danger of losing the deterrent that such horrors provide.” Participation via the virtual world also seems to affect not merely how people look at the target, but also how they look at themselves (which is why people in online communities, for example, take on identities and try out behaviors they never would in real life).

Research shows that this sort of “externalization” allows something called “doubling.” Otherwise nice and normal people create psychic doubles that carry out sometimes terrible acts their normal identity never would.

“You have guys running the UAV saying, ‛Kill that one, don’t kill that one.’” Each new military technology, from the bow and arrow to the bomber plane, has moved soldiers farther and farther from their foes.

Even if a robot has software that follows all the various rules of engagement, and even if it were somehow absolutely free of software bugs and hardware failures (a big assumption), the very question of figuring out who an enemy is in the first place—that is, whether a target should even be considered for the list of screening questions—is extremely complicated in modern war.

First, since it will be very difficult to guarantee that autonomous robots can, as required by the laws of war, discriminate between civilian and military targets and avoid unnecessary suffering, they should be allowed the autonomous use only of non-lethal weapons.

This sounds simple enough, but oddly the Pentagon has already pushed the legal interpretation that our drones have an inherent right to self-defense, including even to preemptively fire on potential threats, such as an anti-aircraft radar system that lights them up.

(Frankenstein shouldn’t get a free pass for his monster’s misdeeds.) If a programmer gets an entire village blown up by mistake, he should be criminally prosecuted, not get away scot-free or merely be punished with a monetary fine his employer’s insurance company will end up paying.

Similarly, if some future commander deploys an autonomous robot and it turns out that the commands or programs he authorized the robot to operate under somehow contributed to a violation of the laws of war, or if his robot were deployed into a situation where a reasonable person could guess that harm would occur, even unintentionally, then it is proper to hold the commander responsible.

To ensure that responsibility falls where it should, there should be clear ways to track the authority in the chain of design, manufacture, ownership, and use of unmanned systems, all the way from the designer and maker to the commanders in the field.

Not merely scientists, but everyone from theologians (who helped create the first laws of war) to the human rights and arms control communities must start looking at where this technological revolution is taking both our weapons and laws.

Over time, some sort of consensus might emerge—if not banning the use of all autonomous robots with lethal weapons, then perhaps banning just certain types of robots (such as ones not made of metal, which would be hard to detect and thus of most benefit to terrorist groups).

As Steven Metz of the Army War College says, “You have to remember that many consider international law to be a form of asymmetric warfare, limiting our choices, tying us down.” Yet history tells us that, time and again, the society that builds an ethical rule of law and stands by its values is the one that ultimately prevails on the battlefield.

“The more society adheres to ethical norms, democratic values, and individual rights, the more successful a warfighter that society will be.”

So as we begin to wrestle with the problems that robots present for the laws of war, we might find instructive the wisdom from a past generation that grappled with a revolutionary and fearsome new technology (in that case, atomic weapons).

Who Stole the Plans for iRobot's Battle Bots?

The PackBot — iRobot's military droid — has special oblong paddles that help it flip over.

Jameel Ahed was preoccupied with trying to solve a few electrical problems on the robots he hoped his company would sell to the US Army for as much as $300 million, one of the largest robot orders in history.

So he didn't pay much attention to the fact that iRobot, his former employer and chief competitor for the Army contract, was suing him for stealing their designs.

He stuffed the box into a green duffel bag, carried it outside, and put it in the trunk of a white Saturn registered to the parents of Kimberly Hill, his girlfriend and the chief operating officer of his company, Robotic FX.

On the way back, Ahed stepped out of the Saturn, opened the trunk, and emptied the contents of the bag into a curbside dumpster a block from Hill's apartment.

Ahed was 20 years old when he joined iRobot in May 1999, a biomedical engineering student at the University of Illinois on a summer internship.

The only criticism anyone could level was that he had the arrogance of youth — he seemed to think he was the smartest kid in the room, that he could solve any problem.

"We'd sleep in the office, by the front door, so the first people arriving in the morning would hit us on the head and wake us up."

and Urbie, a small, rugged military robot with flippers that enabled it to climb stairs and right itself when turned upside down.

The next day, an email was sent at 10:18 pm from his still-active iRobot account to his new Robotic FX address detailing how the PackBot's batteries were made.

Near the mountainous Pakistani border, the Army's 101st Airborne sent the squat, rectangular machines to map suspected terrorist compounds and find weapons caches.

Stateside, Frost and his Army liaison became popular figures on the Pentagon briefing circuit, even pitching the robots to defense secretary Donald Rumsfeld.

In 2002, engineers applied what they'd learned building the SC Johnson floor-cleaner to a disc-shaped bot they called the Roomba.

The quirky $200 robovac became the must-have gadget that holiday season, eventually selling more than 2.5 million units.

A private detective photographed Jameel Ahed unloading iRobot-related equipment from his car. (Photo: Courtesy of Xconomy)

Ahed, meanwhile, was working out of a small office and basement workshop attached to his father's dental practice.

He designed custom fixtures so he could injection-mold thermoplastic rubber parts — much lighter than the PackBot's aluminum components.

It weighed just 20 pounds and cost less than $30,000 — half what iRobot was charging for a comparable early version of the PackBot.

The Negotiator's driving system also beat the PackBot's, which at the time used two hockey-puck-like discs mounted on a console — a nonintuitive and complicated setup.

The so-called xBot deal would be worth up to $300 million and cover as many as 3,000 units — a surge.

Most of the bots in the field were tiny — really just souped-up, radio-controlled toy trucks with cameras attached.

A former helicopter pilot and business consultant, Ward was famous for establishing a cheap, reliable repair shop for bomb-handling robots in Iraq in just a few weeks.

He wanted to bring that same kind of economy and speed to the robots themselves, and he thought $120,000 a pop for the upgraded, fully loaded PackBot was a lot to pay.

Rather than go through a months-long bidding process, the JPO announced a drive-off, a sort of robot ropes course, to be held in August.

The xBot specs called for a robot weighing 50 pounds or less that could hurdle obstacles and scope out potential bombs.

"We'd been to each other's homes." (Photo: Rainer Hosch)

The company plowed ahead, eventually entering two machines — a modified PackBot and a model tailored for the Future Combat Systems program.

There were so many modifications to be made, like switching to off-the-shelf encoders, gears, and cameras, that engineers began working around-the-clock shifts.

They led the PackBot crew into the 104-degree Alabama summer heat and onto the test track, an old firing range ringed with hills.

The PackBot followed a zigzagging series of green flags against grassy fields — a challenge for electronic eyes — and identified mock IEDs in trash piles along the way.

Sitting in 6 inches of water, the robot lifted a 5-pound simulated pipe bomb and spun it around, showing how well its arm could move.

It then drove up a ramp and into the back of a tractor-trailer, where it picked out coat hangers, bowling pins, and rocket launchers amid the shadows.

In fact, no one will describe the Negotiator's performance on the record: Army officials won't even reveal how many teams competed in the xBot trial.

That was odd — how much wartime experience do you need to distinguish a rocket launcher from a bowling pin?

Meanwhile, as Frost's team was packing to return to Alabama for this second audition, iRobot's lawyers emailed Ahed to notify him of their lawsuits filed in Alabama and Massachusetts for patent and trade secret theft.

Ahed spent the rest of that weekend scrubbing files from his hard drives and destroying the CD-ROMs he had collected from the office.

On Monday, August 20, a district court judge in Alabama issued a temporary restraining order against Robotic FX, demanding that it preserve "all evidence, information, data, and documents."

On September 14, 2007, the Army awarded the five-year xBot contract to Ahed for $279.9 million (Ahed knocked off the extra $5.1 million to sweeten the deal).

The company filed official protests with the Army and petitioned the Massachusetts judge for a preliminary injunction to put the brakes on the xBot contract.

"The number one cause of Soldier and Marine casualties are IEDs, and this system allows the Soldier and Marine to complete their mission while keeping them out of harm's way."

He admitted to sweeping his office for old discs and circuit boards — because he no longer wanted "to have any iRobot memorabilia."

Yes, Jameel Ahed almost stopped his former bosses from getting an important military contract, and he probably used some of their designs to do it.

Ward had been a Negotiator fan since June 2006, smitten by its easy-to-use controller, its sharp-eyed camera, and, of course, its minuscule price tag.

The JPO ordered several of the machines just to play around with, though it wouldn't agree to a later request from Robotic FX that the government protect the company from future patent lawsuits.

I can say that the firm is big enough to make robots in quantity — something iRobot had only barely figured out how to do after years of supplying the government.

A month later, when the xBot contract was formally announced, he went back for a personal demo in Ahed's basement workshop.

The executive put his hand into the puppet controller, pinched his fingers, picked up the coin with the Negotiator's claw, and dropped the quarter into his shirt pocket.

If he won the xBot competition, the executive's firm would buy Robotic FX and use its own, larger assembly lines to crank out robots by the hundreds.

It would be a triple win: the government would get bomb spotters cheap enough to outfit the whole infantry, Ahed would get rich, and the defense company would gain a giant new market.

But in the world of defense contracting, it was business as usual — and depending on your perspective, kind of admirable.

(iRobot itself likely benefited from such tuned specs in earlier contracts.) If anything, the rigid performance thresholds and reverse-auction format of the xBot deal made it fairer than many Pentagon awards.

One line of their defense went like this: Because iRobot had developed the PackBot in part with money from the US military, its design could arguably be considered public property.

When Ahed got the email from iRobot's attorneys announcing the lawsuit, it couldn't have been a surprise — he had received the cease-and-desist letter months earlier.