March 29, 2024

Scholars, Contractors Debate Drone Ethics

Timothy Heely

By Jay Friess
Editor

When opening the first discussion panel at the 7th Annual Pax Defense Forum at St. Mary’s College Tuesday, retired Vice Adm. Joseph Dyer paraphrased a quote often attributed to the ancient Greek historian Thucydides: “A nation that makes a great distinction between its scholars and its warriors will have its laws made by cowards and its wars fought by fools.”

Hence, Mr. Dyer said it was critical that The Patuxent Partnership and the college’s Center for the Study of Democracy had gathered both warriors and scholars to debate the thorny question of the ethical use of armed robots in future combat.

“There is a place for armed robots,” Mr. Dyer asserted, but he said the world is a long way from Hollywood’s vision of an autonomous killing machine. “Movies have not been kind to robots in America. … Don’t be too quick to jump 100 years into the future.” He added that in Japan, robots are seen primarily as helpful entities rather than as menaces like the Terminator or HAL 9000.

However, in his opening remarks, Dr. Michael Cain, head of the Center for the Study of Democracy, noted that military development of drones is accelerating rapidly, with the Pentagon asking for $5 billion in drone support in fiscal 2012 alone. The speed of drone development has outpaced the speed of the legal, academic and cultural institutions that examine and limit their use.

How Autonomous Do We Want Drones to Be?

For Mr. Dyer, the “über question” that faces military observers and thinkers today is man-in-the-loop versus autonomy – human-controlled robots versus ones that can act and respond on their own.

Currently, armed drones in the United States’ arsenal are remotely controlled, either by satellite or by direct radio link. Against a foe with limited resources, such as the insurgencies in Iraq and Afghanistan, these vehicles have operated with little interference. Against a more technologically equipped foe, however, the story might be very different.

“These Predators and things will be of no use,” said Dr. Noel Sharkey, professor of robotics at the University of Sheffield in the UK.

Against a foe that can effectively jam its communication and control links, drones must be able to act autonomously, making engagement decisions on their own if they are to be effective.

“[Jamming] is an obvious driver for autonomous operation,” Dr. Sharkey said, adding that he opposes allowing drones to do their own lethal targeting and wants to limit the technology’s proliferation, much as nations have limited the use of nuclear weapons, land mines and cluster munitions.

Lethal targeting would involve autonomous decision-making, Dr. Sharkey said, which would boil down to a series of what programmers call “if-then statements.”

“There’s nothing smart about ‘if that’s a heat signature, then fire on it,’” Dr. Sharkey said. He added that robots are just now getting to the point where they can reliably distinguish a car from a lion, much less combatants from civilians. He argued that they cannot make proportionality judgments and cannot be held accountable. Moreover, the multi-threaded, complex code required for robots to make even the simplest decisions is already unpredictable and prone to unintended consequences.
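To make Dr. Sharkey’s point concrete, a hypothetical rule of the kind he criticizes might look like the following Python sketch. Every name and threshold here is invented for illustration; this is only a minimal sketch of a brittle “if-then” rule, not real targeting logic:

```python
# A minimal, hypothetical sketch of the brittle "if-then" rule
# Dr. Sharkey criticizes. All names and thresholds are invented.

def should_engage(heat_signature: float, threshold: float = 0.8) -> bool:
    """Naive rule: 'if that's a heat signature, then fire on it.'

    Note what the rule cannot express: whether the source is a
    combatant or a civilian, whether force would be proportional,
    or who is accountable if the answer is wrong.
    """
    return heat_signature > threshold

# A truck engine, a cooking fire and a weapons emplacement can all
# read above the threshold; the rule cannot tell them apart.
for source, reading in [("truck engine", 0.85),
                        ("cooking fire", 0.90),
                        ("weapons emplacement", 0.95)]:
    print(source, "->", "engage" if should_engage(reading) else "hold")
```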

As for Hollywood’s Terminator scenario, Dr. Sharkey said, “I don’t see that happening at all. … [Artificial Intelligence] never gets as far and quickly as we like to think it does.” He noted that US Special Forces came to nickname one of their armed ground drones the “TRV” – Taliban Resupply Vehicle – since insurgents would flank it, kick it over and strip it of its weapons and ammunition.

However, Dr. Sharkey questioned the wisdom of sending a machine, unsupervised, to complete a mission in enemy territory.

Mr. Timothy Dunigan, a former Navy captain now with SAIC, said, “We have very little autonomy right now. … The level of autonomy to go off and do that, we do not have today.”

Mr. Timothy Heely, a retired rear admiral now with Cobham LLC, asserted that a similar capability does exist today in the form of the Tomahawk cruise missile.

Networked Warfare and Diffused Accountability

For the former military officers on Tuesday’s panels, the idea of a drone autonomously targeting and using lethal force was not as troublesome as the way in which drones, and other military weapons, are increasingly being controlled. Mr. Dyer asserted that network-centric warfare, in which the intelligence comes from one source, the targeting from another and the firing from yet another, is blurring accountability in the command structure.

Mr. Dyer posed the hypothetical question of where responsibility lies when a satellite-guided bomb does or doesn’t hit its intended target, adding, “I’ll promise you, it’s no longer in the cockpit.”

Mr. Heely agreed, saying, “[Networked warfare] does dilute responsibility.” He took the idea a step further, questioning whether there was any difference between an F-18 pilot firing a missile from 15,000 feet using coordinates radioed to him and a drone pilot firing a missile from a base in Nevada. Neither pilot ever physically sees the target. He argued that a drone is no different from any other long-range standoff weapon.

“Who controls this?” Mr. Heely asked. “Who runs these things? … We’re getting to the point here where the blame is going to be spread.”
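As a purely illustrative sketch of the diffusion Mr. Dyer and Mr. Heely describe, consider how a networked strike decision might be assembled from independent nodes. All of the names and roles below are hypothetical, invented only to show why the resulting record points to a chain of contributors rather than a single decision-maker:

```python
# Hypothetical sketch of a network-centric strike decision. Every
# name and role is invented; the point is only that the record of
# the decision spreads across nodes rather than resting with one.
from dataclasses import dataclass, field

@dataclass
class StrikeRecord:
    contributors: list = field(default_factory=list)

    def add(self, role: str, source: str) -> None:
        # Each node appends its contribution; none owns the outcome.
        self.contributors.append((role, source))

record = StrikeRecord()
record.add("intelligence", "satellite analysis cell")  # one source finds the target
record.add("targeting", "remote targeting team")       # another assigns coordinates
record.add("firing", "standoff platform crew")         # a third releases the weapon

# If the weapon misses, the record shows a chain of contributors,
# not a single accountable decision-maker in a cockpit.
for role, source in record.contributors:
    print(f"{role}: {source}")
```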

The panelists also raised concern about remote pilots developing a “Gameboy mentality,” finding it easier to use lethal force without hesitation when removed from the physical danger of the battlefield.

Mr. Dunigan argued that drone pilots have better intelligence, more time on station, more comfort and no threat of death influencing their decisions, allowing them to think more clearly. He admitted, “There is a little bit of Gameboy mentality. It’s something we need to be concerned about.”

However, Mr. Dunigan said that no matter how a weapon is delivered, whether by a human or a drone, it will not solve the problem of incomplete intelligence. He said, “There is always going to be bad information.”

“I’m more concerned about the Gameboy mentality of your politicians, honestly,” Dr. Sharkey said, noting that the perceived accuracy and surgical precision of drone strikes has encouraged the White House to expand the battlespace without committing ground forces. He added that the diffuse command structure and physical locations of networked warfare are a concern. “There are people distributed all over the place, and it gets very confusing.”

Dr. Michael Noone, a professor at Catholic University of America’s Columbus School of Law, argued that there will be people to be held accountable in the event of a mishap. He said, “Robots are not legally recognized as persons. They cannot sue or be sued. … There’s going to be responsible adults here. They may be hard to find, and there may be collective action.”

In order to be held accountable, the humans in the loop of controlling robotic weapons need to have enough information to understand the potential consequences of their actions, according to Peter Asaro of The New School and the International Committee for Robot Arms Control.

“What we need is human consideration and deliberation,” Mr. Asaro said. “We don’t want to leave that to an automated process.”

Is It Fair?

The use of drones raises new, complex ethical questions, such as whether using drones against a lesser-equipped enemy is fair fighting.

“I’m not an advocate of ‘fair,’” Mr. Dunigan said, noting that the history of military technology has always been about being able to affect your enemy while staying out of harm’s way.

And Guy Seidman, professor at Israel’s Radzyner School of Law, questioned whether the law and fairness are even factors in war.

“Law is a lot less relevant than many of us lawyers would like to admit,” Mr. Seidman said. “There is no accountability [in war]. This is not about accountability. … This is more about morality.”

From a moral standpoint, Mr. Seidman argued that stand-off warfare allows combatants time to make better decisions, although he said, “If anything, it might make [political] leaders more trigger-happy.”

“Body bags are good inhibitors,” Dr. Sharkey countered, arguing that standoff, asymmetrical warfare encourages terrorist tactics by the disadvantaged side.

Mr. Seidman and Dr. Sharkey’s exchange highlighted an earlier rhetorical question from Mr. Heely, who asked whether a UAV pilot, stationed on a domestic American base, could become a legitimate military target for a terrorist.

However, Mr. Seidman concluded that worrying about who possesses advanced technology and weaponry is less important than being able to convince those who have it to keep it to themselves.
