Predators and Robots at War September 19, 2011

Posted by rogerhollander in Pakistan, Science and Technology, War.
Roger’s note: read about your tax dollars at work to provide deadly war games for young marines on the PlayStation killing machines.  Since virtually every country in the world has terrorists within and since the US is at war with terrorism, it can “legally” in the name of self-defense bombard at will.  And unmanned predators may be coming soon to a police station near you!

by Christian Caryl, http://www.opednews.com/Quicklink/Predators-and-Robots-at-Wa-in-General_News-110919-829.html

Wired for War: The Robotics Revolution and Conflict in the Twenty-first Century
by P.W. Singer
Penguin, 499 pp., $17.00 (paper)

Predator: The Remote-Control Air War over Iraq and Afghanistan: A Pilot’s Story
by Lieutenant Colonel Matt J. Martin with Charles W. Sasser
Zenith, 310 pp., $28.00

[Photo] Max Becherer/Polaris

The US Air Force’s 62nd Expeditionary Reconnaissance Squadron launching an
unmanned Predator drone with laser-guided Hellfire missiles mounted on its
wings, Kandahar Air Field, Afghanistan, November 2009

Drones are in the headlines. We read daily about strikes against terrorist
targets in the tribal areas of Pakistan using unmanned aerial vehicles
(UAVs)—remote-controlled aircraft equipped with elaborate sensors and sometimes
weapons as well. Earlier this summer the US sent
Predator drones into action against militants in Somalia, and plans are
reportedly afoot to put the CIA in charge of a drone
offensive against al-Qaeda operatives in Yemen. NATO has
dispatched UAVs to Libya. State-of-the-art stealth drones cased the house where
Osama bin Laden was living before US Navy SEALs staged
their now famous raid. And in a speech a few weeks ago, White House
counterterrorism chief John Brennan made it clear that drones will continue to
figure prominently in the Obama administration’s counterterrorism strategy. On
August 22, a CIA drone killed the number-two al-Qaeda
leader in the mountains of Pakistan.

Most of us have probably heard by now how extraordinary this technology is.
Many of the UAV strikes in South Asia are actually
orchestrated by operators sitting at consoles in the United States. US Air Force Colonel Matt Martin gives a unique first-person
account of the strange split consciousness of this new type of warfare in his
book Predator. Even as his body occupies a seat in a control room in
Nellis Air Force Base in Nevada, his mind is far removed, following a suspicious
SUV down a desert road in Iraq or tailing Taliban
fighters along a mountain ridge in Afghanistan. “I was already starting to refer
to the Predator and myself as ‘I,’ even though the airplane was thousands of
miles away,” Martin notes ruefully.

Notifying Marines on the ground that he’s arriving on the scene in
Afghanistan, he has to remind himself that he’s not actually arriving
anywhere—he’s still in his seat on the base. “Although it was only shortly after
noon in Nevada,” he writes, “I got the yawns just looking at all that snow and
darkness” on the ground outside Kabul. He can hardly be blamed for the
confusion. The eerie acuity of vision afforded by the Predator’s multiple
high-powered video cameras enables him to watch as the objects of his interest
light up cigarettes, go to the bathroom, or engage in amorous adventures with
animals on the other side of the world, never suspecting that they are under
observation as they do.

Even though home and wife are just a few minutes’ drive down the road from
his battle station, the peculiar detachment of drone warfare does not
necessarily insulate Martin from his actions. Predator attacks are
extraordinarily precise, but the violence of war can never be fully tamed, and
the most gripping scenes in the book document Martin’s emotions on the occasions
when innocent civilians wander under his crosshairs in the seconds just before
his Hellfire missile arrives on target. Allied bomber pilots in World War II killed millions of civilians but rarely had occasion to
experience the results on the ground. Drone operators work with far greater
accuracy, but the irony of the technology is that its operators can see their
accidental victims—two little boys and their shattered bikes, in one especially
heartrending case Martin describes—in excruciating detail. Small wonder that
studies by the military have shown that UAV operators
sometimes end up suffering the same degree of combat stress as other
warfighters.1

And yet the US military does little to discourage the
notion that this peculiar brand of long-distance warfare has a great deal in
common with the video-gaming culture in which many young UAV operators have grown up. As one military robotics
researcher tells Peter Singer, the author of Wired for War, “We modeled
the controller after the PlayStation because that’s what these eighteen-,
nineteen-year-old Marines have been playing with pretty much all of their
lives.” And by now, of course, we also have video games that incorporate drones:
technology imitating life that imitates technology.

Drones are not remarkable because of their weaponry. There is
nothing especially unusual about the missiles they carry, and even the largest
models are relatively lightly armed. They are not fast or nimble. What makes
them powerful is their ability to see and think. Most of the bigger drones now
operated by the US military can take off, land, and fly
by themselves. The operators can program a destination or a desired patrol area
and then concentrate on the details of the mission while the aircraft takes care
of everything else. Packed with sensors and sophisticated video technology, UAVs
can see through clouds or in the dark. They can loiter for hours or even days
over a target—just the sort of thing that bores human pilots to tears. Of
course, the most significant fact about drones is precisely that they do not
have pilots. In the unlikely event that a UAV is shot
down, its operator can get up from his or her console and walk away.
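
For readers who want a concrete sense of that division of labor, here is a
minimal sketch in Python, assuming nothing about any real UAV software: the
operator supplies only a patrol center and radius, and a simulated (entirely
hypothetical) autopilot works through the resulting loiter waypoints on its
own, leaving the operator free to watch the sensor feed.

    import math
    from dataclasses import dataclass

    @dataclass
    class Waypoint:
        lat: float
        lon: float
        alt_m: float

    class SimAutopilot:
        """A toy stand-in for an onboard autopilot; every name here is invented."""
        def __init__(self):
            self.position = Waypoint(0.0, 0.0, 0.0)

        def goto(self, wp: Waypoint):
            # A real autopilot would close a guidance loop; the toy just arrives.
            self.position = wp
            print(f"holding at lat={wp.lat:.4f}, lon={wp.lon:.4f}, alt={wp.alt_m:.0f} m")

    def circular_patrol(center: Waypoint, radius_deg: float, legs: int = 8):
        """Trace a simple circle of loiter waypoints around a patrol center."""
        for i in range(legs):
            theta = 2 * math.pi * i / legs
            yield Waypoint(center.lat + radius_deg * math.cos(theta),
                           center.lon + radius_deg * math.sin(theta),
                           center.alt_m)

    # The operator's entire "flying" workload: choose a center and a radius.
    autopilot = SimAutopilot()
    for wp in circular_patrol(Waypoint(36.2, -115.0, 5000.0), radius_deg=0.05):
        autopilot.goto(wp)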

So far, so good. But there are also quite a few things about drones that you
might not have heard yet. Most Americans are probably unaware, for example, that
the US Air Force now trains more UAV operators each year than traditional pilots. (Indeed, the
Air Force insists on referring to drones as “remotely piloted aircraft” in order
to dispel any suspicions that it is moving out of the business of putting humans
into the air.) As I write this, the US aerospace
industry has for all practical purposes ceased research and development work on
manned aircraft. All the projects now on the drawing board revolve around
pilotless vehicles. Meanwhile, law enforcement agencies around the country
eagerly await the moment when they can start operating their own UAVs. The
Federal Aviation Administration is considering rules that will allow police
departments to start using them within the next few years (perhaps as early as
2014). Soon, much sooner than you realize, your speeding tickets will be issued
electronically to your cell phone from a drone hovering somewhere over the
interstate. The US Customs Service has already used UAVs
to sneak up on drug-smuggling boats that easily evade noisier conventional
aircraft.

Robots that fly get most of the attention. In fact, though, UAVs represent
only one small part of the action in military robotics. As Singer recently told
me, there are already more robots operating on the ground (15,000) than in the
air (7,000). The US Army uses its mechanical warriors to
find and disarm roadside bombs, survey the battlefield, or shoot down incoming
artillery shells. Though these land-based robots may seem a bit more primitive
than their airborne cousins, they are catching up quickly. The models in
development include the bizarre BigDog, an eerily zoomorphic quadruped designed
to help soldiers carry heavy loads over difficult terrain, and BEAR, a vaguely humanoid machine on caterpillar tracks that
can lift loads of up to 500 pounds.

The US Navy is experimenting with machines of its
own. It recently unveiled a robot jet ski designed to sniff out attackers who
might try to sneak up on US ships underwater. The Navy
has developed harmless-looking (and environmentally friendly) sailboats packed
with high-tech surveillance gear that can pilot themselves around the world, if
need be. Robot submersibles, too, are in the works. Unconstrained by the
life-support requirements of manned submarines, these automated spies could
spend months on underwater patrol, parking themselves at the bottom of enemy
harbors and observing everything that goes in or out. Battery life then becomes
the main constraint, and some scientists are trying to overcome it by enabling
the underwater drone to feed off organic matter lying on the sea floor (a
so-called “mud battery”).

So far none of these water-borne robots seem to be carrying torpedoes. The
army, however, is already experimenting with robots that can shoot. In his book,
Singer describes SWORDS, a tracked vehicle equipped with
a suite of cameras that see farther than the human eye even while covering
multiple angles. The machine can be armed with a .50-caliber machine gun or a
variety of other weapons. The SWORDS zoom camera and its
weapon can be perfectly synchronized, and the machine makes for a much more
stable platform than a soft, breathing, frightened human body lying prone in the
midst of a battlefield. Singer writes:

In an early test of its guns, the robot hit the bull’s-eye of a
target seventy out of seventy tries. In a test of its rockets, it hit the target
sixty-two out of sixty-two times. In a test of its antitank rockets, it hit the
target sixteen out of sixteen times. A former navy sniper summed up its
“pinpoint precision” as “nasty.” …Since it is a precisely timed machine pulling
the trigger, the “one shot” mode means that any weapon, even a machine gun, can
be turned into a sniper rifle.

Singer described this system two years ago. In the feverish world of military
robotics, 2009 already feels like a distant era, so we can only surmise how far
SWORDS has progressed since then. Researchers are now
testing UAVs that mimic hummingbirds or seagulls; one model under development
can fit on a pencil eraser. There is much speculation about linking small drones
or robots together into “swarms”—clouds or crowds of machines that would share
their intelligence, like a hive mind, and have the capability to converge
instantly on identified targets. This might seem like science fiction, but it is
probably not that far away. At ETH in Zurich,
Switzerland’s equivalent of MIT, engineers have linked
miniature quadrocopters (drones equipped with four sets of rotors for maximum
maneuverability) into small networks that can deftly toss balls back and forth
to each other without any human commands.
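
To give a rough, purely computational flavor of what sharing intelligence
“like a hive mind” could mean, here is a toy Python sketch, again assuming
nothing about any fielded swarm system: each simulated agent holds a noisy
estimate of a target’s position, the group repeatedly agrees on the average of
those shared estimates, and every agent steers toward that consensus point.

    import random

    def swarm_converge(num_agents=5, target=(5.0, 5.0), rounds=40, step=0.2):
        """Toy consensus loop: agents pool noisy estimates of a target's position
        and steer toward the group average. Purely illustrative, not a real system."""
        # Each agent starts somewhere random, holding its own noisy fix on the target.
        positions = [[random.uniform(0.0, 10.0), random.uniform(0.0, 10.0)]
                     for _ in range(num_agents)]
        estimates = [[target[0] + random.gauss(0.0, 0.5),
                      target[1] + random.gauss(0.0, 0.5)]
                     for _ in range(num_agents)]
        for _ in range(rounds):
            # "Hive mind" step: everyone shares, and the swarm agrees on the mean estimate.
            cx = sum(e[0] for e in estimates) / num_agents
            cy = sum(e[1] for e in estimates) / num_agents
            # Movement step: each agent closes part of the distance to the consensus point.
            for p in positions:
                p[0] += step * (cx - p[0])
                p[1] += step * (cy - p[1])
        return positions

    print(swarm_converge())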

The technology transfixes. The capabilities are seductive; so,
too, is the lure of seeming invulnerability. The Taliban has no air force. Its
foot soldiers do not have night vision or the ability to see through overcast
skies, but they can sometimes hear the drones circling in the sky above. David
Rohde, the New York Times correspondent who was held captive by the
Taliban for seven months in 2009, described in his account of the experience
what it is like to be on the ground while Predators and Reapers are on the
prowl. “Two deafening explosions shook the walls of the compound where the
Taliban held us hostage,” he writes. “My guards and I dived to the floor as
chunks of dirt hurtled through the window.” A missile fired by a US drone had
obliterated two cars a few hundred yards away:

It was March 25, and for months the drones had been a terrifying
presence. Remotely piloted, propeller-driven airplanes, they could easily be
heard as they circled overhead for hours. To the naked eye, they were small dots
in the sky. But their missiles had a range of several miles. We knew we could be
immolated without warning….

Later, I learned that one guard called for me to be taken to the
site of the attack and ritually beheaded as a video camera captured the moment.
The chief guard overruled him.2

This particular strike, it turned out, had killed seven militants and no
civilians. Most of the attacks are remarkably precise, as Rohde writes. Yet this
is almost beside the point: “The Taliban were able to garner recruits in their
aftermath,” he writes, “by exaggerating the number of civilian casualties.”

His point is borne out by a recent study conducted by Peter Bergen and
Katherine Tiedemann, two analysts at the New America Foundation in Washington
who have been tracking drone strikes in the tribal areas of Pakistan ever since
the US began conducting attacks there in June 2004.
Though reliable information from that part of the world is extremely hard to
come by—the story of Rohde’s kidnapping explains why foreign journalists tend to
steer clear of the area—Bergen and Tiedemann have carefully analyzed media
reports for the details of each attack. While acknowledging the difficulties of
obtaining reliable data (and the wildly divergent information issued by American
and Pakistani official sources), they conclude that the attacks have grown
steadily more accurate. According to Bergen and Tiedemann, “During the first two
years of the Obama administration, around 85 percent of those reported killed by
drone strikes were militants; under the Bush administration, it was closer to 60
percent.”3 At the same time the authors
note that the strikes have probably been far less successful than US officials claim at killing militant leaders. Most of the
dead, Bergen and Tiedemann conclude, are likely rank-and-file fighters. (A newer
study by the Bureau of Investigative Journalism in London arrives at a somewhat
higher overall civilian casualty rate.)

Though such statistics are remarkable when measured against the history of
warfare, they are, of course, little consolation to the families of those
innocent bystanders who have been killed along with the jihadis. And, as Bergen
and Tiedemann rightly note, the precision of the killing is only one small part
of the story. Polls show, just as Rohde suspected, that Pakistanis
overwhelmingly believe that most of those who die in the attacks are civilians—a
perception that is undoubtedly aggravated by the impunity with which the drones
stage their raids on Pakistani territory. Dennis Blair, director of national
intelligence from 2009 to 2010, recently made a similar observation in The
New York Times: “Our reliance on high-tech strikes that pose no risk to our
soldiers is bitterly resented in a country that cannot duplicate such feats of
warfare without cost to its own troops.” (While the Pakistani government
publicly expresses its disapproval of the strikes, in private Pakistani leaders
have provided intelligence and logistical support for the campaign—a fact that
they are eager to conceal from the public.) The number of terrorist attacks in
Pakistan has risen sharply as the drone campaign has accelerated. Bergen and
Tiedemann conclude that the broader political effects of the UAV campaign may well cancel out some of its tactical
benefits.

One remedy they propose is to take control of the drone program away from the
CIA, which currently runs the campaign in the tribal
areas, and transfer it to the military.4
This offers several advantages. In contrast to the CIA,
which denies the very existence of the program and accordingly reveals nothing
about the criteria by which it chooses its targets, the US Department of Defense can at least be held publicly
accountable for its conduct and is much more likely to respond to pressure to
keep its use of UAVs within the bounds of international law. This cannot be said
of the CIA’s use of drones for the purposes of “targeted
killing”—particularly given that the strikes are being secretly conducted
against targets in Pakistan, a country with which the United States is not at
war, under ill-defined and murky circumstances.

The legal issues involved are complex. Philip Alston, an expert in
international law appointed by the United Nations to examine the question,
asserted in a report that, “Outside the context of armed conflict, the use of
drones for targeted killing is almost never likely to be legal.”5 The trick, of course, is how we define “armed conflict”
in an age of non-state-affiliated terrorist and insurgent groups operating from
places where the writ of a central government does not extend. International
law, some experts say, gives the US the right to protect
its forces in Afghanistan against attacks staged by al-Qaeda and its allies in
the tribal areas; whether the drone strikes violate Pakistani sovereignty
depends largely on agreements we have with the Pakistani government, a point
that remains somewhat mysterious.

The Obama administration might help matters by providing an explanation of
the legal rationale for the program. But so far it has declined to do so, aside
from a brief statement by a leading State Department legal adviser that cited
the internationally recognized right to self-defense.6 In this respect it is only to be welcomed that scholars
around the world are engaged in an active debate about the legal implications of
the drone campaigns. Given that more than forty countries around the world are
now experimenting with military robots of their own, the United States cannot
rest on the assumption that it will retain a monopoly over this technology
forever. The day when US forces are attacked by a
drone—perhaps even one operated by a terrorist—is not far away.

Many of the recent books on UAVs predictably dwell on the
technical specs and astonishing capabilities of these new weapons systems.
Singer provides us with plenty of the same, but the great virtue of his book is
precisely that he also devotes space to the broader questions raised by the
breakneck expansion of military robotics. As he writes, the US government is using drones to conduct a military campaign
against the sovereign state of Pakistan. Yet no one in Congress has ever pressed
the President for any sort of legal declaration of hostilities—for the simple
reason that the lives of American military personnel are not at stake when the
Predators set off on their missions.

In fact, as Singer shows, the ethical and legal implications of the new
technology already go far beyond the relatively circumscribed issue of targeted
killing. Military robots are on their way to developing considerable autonomy.
As noted earlier, UAVs can already take off, land, and fly themselves without
human intervention. Targeting is still the exclusive preserve of the human
operator—but how long will this remain the case? As sensors become more powerful
and diverse, the amount of data gathered by the machines is increasing
exponentially, and soon the volume and velocity of information will far exceed
the controller’s capacity to process it all in real time, meaning that more and
more decision-making will be left to the robot.

A move is already underway toward systems that allow a single operator to
handle multiple drones simultaneously, and this, too, will tend to push the
technology toward greater autonomy. We are not far from the day when it will
become manifest that our mechanical warriors are better at protecting the lives
of our troops than any human soldier, and once that happens the pressure to let
robots take the shot will be very hard to resist. Pentagon officials who have
been interviewed on the subject predictably insist that the decision to kill
will never be ceded to a machine. That is reassuring. Still, this is an easy
thing to say at a point when robots are not yet in the position to take the
initiative against the enemy on a battlefield. Soon, much sooner than most of us
realize, they will be able to do just that.

We have only just begun to explore what this means. Singer quotes Marc
Garlasco, a recognized expert on the law of war at Human Rights Watch. “This new
technology creates pressure points for international law,” Garlasco says. “You
will be trying to apply international law written for the Second World War to
Star Trek technology.” Singer continues:

Another fundamental premise of the human rights group, and for
broader international law, is that soldiers in the field and the leaders who
direct them must be held accountable for any violations of the laws of war.
Unmanned systems, though, muddy the waters surrounding war crimes. “War crimes
need both a violation and intent,” says Garlasco. “A machine has no
capacity to want to kill civilians, it has no desires…. If they are incapable of
intent, are they incapable of war crimes?” And if the machine is not
responsible, who does the group seek to hold accountable, and where exactly do
they draw the line? “Who do we go after, the manufacturer, the software
engineer, the buyer, the user?”

Later Singer notes that the US has consistently
applied an expanded right of self-defense for its aircraft operating in
conflicts around the world. When an enemy radar “lights up” a US plane, the pilot has the right to fire first without
waiting to be attacked. All fine and good. But then imagine that the aircraft
involved is not a plane but a UAV:

If an unmanned plane flying near the border of another nation is
fired on, does it have the right to fire back at that nation’s missile sites and
the humans behind them, even in peacetime? What about the expanded
interpretation, the right to respond to hostile intent, where the drone is just
targeted by radar? Is the mere threat enough for the drone to fire first at the
humans below?

The answers depend on how broadly the “self” in self-defense is
defined.

It turns out, Singer explains, that the US Air Force
currently operates according to the principle that a pilotless aircraft, as an
entity representing the people who sent it on its mission, “has the same rights
as if a person were inside it,” and that this “interpretation of robot rights is
official policy for unmanned reconnaissance flights over the Persian Gulf.” But
the situation is evolving rapidly. The next generation of military robots is
likely to have a high degree of operational independence without yet achieving
the kind of intelligent self-awareness that entails responsibility. Luckily
there is already something of a legal precedent for handling similar situations.
“As odd as it sounds,” Singer writes, “pet law might then be a useful resource
in figuring out how to assess the accountability of autonomous systems.”

This is a particularly thought-provoking conclusion given that the
researchers now working on military robots seem especially eager to ransack the
biological world for elegant solutions to the design problems that have to be
overcome. There is a snake-shaped robot that can rear itself up in the grass
when it wants to scan its surroundings. Tiny surveillance robots scuttle up
walls like bugs, and robot flyers flap their wings. The Navy is testing
submersibles that swim like fish. Researchers in the UK
have developed a robot whose sensors mimic rat whiskers—since so far no engineer
has managed to come up with a sensor system that is better at navigating in
total darkness.

Whether we like it or not, war has often been a powerful goad to
technological innovation. Now technology is on the verge of supplanting the
human soldier altogether—with consequences that can only be guessed. The
question in the case of military robotics, even at this relatively early stage,
is the extent to which we will manage to retain control over the process.
Whether we are ready or not, the answer will soon be clear.

—August 30, 2011

1. See Scott Lindlaw, “Remote-Control Warriors Suffer War Stress,” Associated
Press, August 7, 2008.

2. “A Drone Strike and Dwindling Hope,” Part Four of “Held by the Taliban,”
The New York Times, October 20, 2009.

3. Peter Bergen and Katherine Tiedemann, “Washington’s Phantom War: The
Effects of the US Drone Program in Pakistan,” Foreign Affairs, July/August 2011.

4. The CIA operates its drones from control stations in or around its
headquarters in Langley, Virginia. It is likely that many of the operators are
actually civilian contractors.

5. Philip Alston, “Report of the Special Rapporteur on Extrajudicial, Summary
or Arbitrary Executions,” United Nations, Human Rights Council, May 28, 2010.
See also David Kretzmer, “Targeted Killing of Suspected Terrorists:
Extra-Judicial Executions or Legitimate Means of Defence?” The European
Journal of International Law, Vol. 16, No. 2 (2005).

6. Harold Koh, the legal adviser to the State Department, devoted a few brief
remarks to the subject in a speech last year, available at
http://www.state.gov/s/l/releases/remarks/139119.htm.
