Giving drones a conscience

What the helicopter was to the Vietnam war, the drone is becoming to the Afghan conflict: both a crucial weapon in the American armory and a symbol of technological might pitted against stubborn resistance. Pilotless aircraft can hit targets without placing a pilot in harm's way.

They have proved particularly useful for assassinations. On Feb. 17, for example, Sheikh Mansoor, an al-Qaida leader in the Pakistani district of North Waziristan, was killed by a drone-borne Hellfire missile. The United States wants to increase drone operations.

Assassinating "high value targets," such as Mansoor, often involves a moral quandary. A certain amount of collateral damage has always been accepted in the rough-and-tumble of the battlefield, but direct attacks on civilian sites, even if they have been commandeered for military use, causes queasiness in thoughtful soldiers.

Errors are not only tragic, but also counterproductive. Sympathetic local politicians will be embarrassed and previously neutral noncombatants may take the enemy's side. Moreover, the operators of drones, often on the other side of the world, are far removed from the sight, sound and smell of the battlefield. They may make decisions to attack that a commander on the ground might not, treating warfare as a video game.

Ronald Arkin of the Georgia Institute of Technology's School of Interactive Computing has a suggestion that might ease some of these concerns. He proposes involving the drone itself — or, rather, the software that is used to operate it — in the decision to attack. In effect, he plans to give the machine a conscience.

The software conscience that Arkin and his colleagues have developed is called the Ethical Architecture. Its judgment may be better than a human's because it operates so fast and knows so much. And — like a human but unlike most machines — it can learn.

The drone would initially be programmed to understand the blast effects of the weapon it is armed with. It would be linked to both the Global Positioning System and the Pentagon's Global Information Grid, a vast database that contains, among many other things, the locations of buildings in military theaters and what is known about their current use.
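Neither the article nor Arkin's team publishes the software itself, but a minimal sketch of the kind of pre-strike check described above might look like the Python below. The building records, blast radius figure and function names are invented stand-ins for the Global Information Grid data, not the real system.

```python
from dataclasses import dataclass
from math import dist

# Hypothetical stand-in for a Global Information Grid record:
# a building's location and what is known about its current use.
@dataclass
class Building:
    name: str
    position: tuple[float, float]  # ground coordinates in meters
    protected: bool                # mosque, hospital, cemetery, etc.

# Assumed blast radius in meters; a real system would carry the
# weapon-effects model the article says the drone is programmed with.
BLAST_RADIUS = {"hellfire": 20.0}

def strike_permitted(target, weapon, nearby_buildings):
    """Veto the strike if the weapon's expected blast would reach
    any protected site near the target."""
    radius = BLAST_RADIUS[weapon]
    for b in nearby_buildings:
        if b.protected and dist(target, b.position) <= radius:
            return False, f"expected blast reaches {b.name}"
    return True, "no protected site within expected blast radius"

# Example: a mosque 15 m from the aim point vetoes a Hellfire strike.
ok, reason = strike_permitted((0.0, 0.0), "hellfire",
                              [Building("mosque", (15.0, 0.0), True)])
print(ok, "-", reason)  # False - expected blast reaches mosque
```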

After each strike the drone would be updated with information about the actual destruction caused. It would note any damage to nearby buildings and would subsequently receive information from other sources, such as soldiers in the area, fixed cameras on the ground and other aircraft. Using this information, it could compare the level of destruction it expected with what actually happened. If it did more damage than expected — for example, if a nearby cemetery or mosque was harmed by an attack on a suspected terrorist safe house — then it could use this information to restrict its choice of weapon in the future. It could also pass the information to other drones.
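As a purely illustrative sketch of that feedback loop (the margin table and update rule are assumptions, not a description of Arkin's method): one simple way to encode "more damage than expected" is a per-weapon safety margin that only ever grows and is copied out to the rest of the fleet.

```python
# Illustrative post-strike update: if observed destruction exceeded
# the prediction, widen this weapon's safety margin and share the
# revised figure with other drones.
SAFETY_MARGIN = {"hellfire": 1.0}  # multiplier on the expected radius

def update_after_strike(weapon, expected_radius, observed_radius,
                        other_drones):
    """Compare expected with actual damage; ratchet the margin up
    (never down) and broadcast it to the fleet's margin tables."""
    if observed_radius > expected_radius:
        ratio = observed_radius / expected_radius
        SAFETY_MARGIN[weapon] = max(SAFETY_MARGIN[weapon], ratio)
        for margins in other_drones:  # each drone's own margin table
            margins[weapon] = max(margins.get(weapon, 1.0),
                                  SAFETY_MARGIN[weapon])

# Example: the strike damaged buildings out to 30 m instead of 20 m.
fleet = [{"hellfire": 1.0}, {}]
update_after_strike("hellfire", 20.0, 30.0, fleet)
print(SAFETY_MARGIN["hellfire"], fleet)
# 1.5 [{'hellfire': 1.5}, {'hellfire': 1.5}]
```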

No commander is going to give a machine a veto, of course, so the Ethical Architecture's decisions could be overridden. That, however, would take two humans: both the drone's operator and his commanding officer. That might not save a target from destruction, but it would at least allow a pause for reflection before the "fire" button is pressed.
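That two-signature rule is simple enough to put in code. The sketch below assumes only what the article states: a machine veto stands unless both humans concur.

```python
def authorize_strike(machine_vetoes: bool,
                     operator_overrides: bool,
                     commander_overrides: bool) -> bool:
    """A strike vetoed by the Ethical Architecture proceeds only if
    two humans, the operator and the commanding officer, both
    countermand the veto; an unvetoed strike needs no override."""
    if not machine_vetoes:
        return True
    return operator_overrides and commander_overrides

# One approval alone is not enough to override the machine's veto.
assert authorize_strike(True, True, False) is False
assert authorize_strike(True, True, True) is True
```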

Forever the fog of war

From the Economist's Democracy in America blog

Any effort to cut through the fog of war is welcome (see accompanying story), but let us not forget how thick that fog is. Andrew Sullivan (a blogger for the Atlantic) flags one of the most tragic, disturbing and riveting videos I have seen out of Iraq. (The video, which is difficult to watch, can be viewed at links.tampabay.com) And while Sullivan quickly concludes that what we're seeing is a war crime, I think that is an oversimplified reaction to a complicated event.

I've watched this video a couple of times now, first with outrage similar to Sullivan's, then with an eye towards seeing what the pilots saw. I don't mean to turn this tragedy into a psychological experiment, but I'm reminded of a selective attention test in which, by way of the text and markers in the video (decrypted and released by a whistleblower group called WikiLeaks.org), we are prodded to see one side of the event. (The helicopter pilots do not help matters with their casual approach to killing and intermittent laughter.)...

I think the scene is more ambiguous than it first appears, but my broader point is that no matter how precise our weaponry gets, no matter how much information we feed into our targeting systems, the decision to fire will always be based on incomplete information and come down to fallible human judgment. So while it is normal to react to these tragedies with varying degrees of moral repugnance, let us not be shocked. This is the nature of war and there is only one truly effective way to avoid such incidents.
