In
the last years of the twentieth century, two weapons changed the way
that America fights air wars: smart bombs (bombs that “see” a target
using a television camera or a radiation sensor, or that head for a
programmed location) and UAVs (unmanned aerial vehicles). Smart bombs
came into their own in the first Gulf War. Reconnaissance UAVs proved
their worth in Bosnia and Kosovo in the late 1990s, and offensive UAVs
began firing missiles in Iraq, Afghanistan, Pakistan, and elsewhere a
few years later.
The American public got its first look at smart
bombs on January 17, 1991. Iraq had invaded Kuwait five months earlier,
and President George H. W. Bush had put together a UN-backed coalition to
force its withdrawal. Iraq had the world’s fourth-largest army, at
955,000 men, and it faced a coalition force only two-thirds that size.
America’s last experience of a real war had been the long disaster of
Vietnam—nineteen years from start to finish, 58,000 American dead, and
153,000 wounded—and even experienced military officers feared that the
Gulf War might be a reprise of Vietnam. Gen. Edward Meyer, a former Army
chief of staff, predicted that America would suffer ten thousand to
thirty thousand casualties in driving Iraq out of Kuwait. Saddam Hussein
was counting on exactly that and reportedly told U.S. ambassador April
Glaspie, “Americans cannot stand 10,000 dead.”
Americans watched
the war in their living rooms. Tomahawk cruise missiles flew by
journalists’ Baghdad hotel windows and blew up government buildings. One
after another, American fighter planes “plinked” Iraqi tanks with
Maverick missiles, and CNN replayed the video clips: the pilot locked
the missile’s sensor onto the tank’s image, pushed a button, and the
missile did the rest. By the time the coalition’s ground attack began in
mid-February, the Iraqi army had already been seriously degraded. One
Iraqi general said, “During the Iran war, my tank was my friend because I
could sleep in it and know I was safe. . . . During this war my tank
became my enemy. . . . [N]one of my troops would get near a tank at
night because they just kept blowing up.” Although only 8 percent of the
bombs dropped were smart bombs, they did 75 percent of the damage.
Gen.
Meyer and Saddam Hussein vastly overestimated U.S. casualties—only 346
Americans died in the Gulf War, and fewer than half of those in combat.
On a statistical basis, American soldiers in the war zone were safer
than they would have been had they stayed at home in civilian life. Iraqi casualties, both
military and civilian, were much higher, but even they were low by the
standards of Vietnam—four thousand Iraqi civilians and thirty-five
thousand soldiers dead, while about one million Vietnamese civilians and
two million soldiers had died. Smart bombs made that reduction
possible. One Iraqi battalion commander reported that only one of his
soldiers was killed in the air war, but that all his vehicles were hit.
The Gulf War coalition destroyed Iraq’s military capabilities, but it
left Baghdad standing—unlike Tokyo, Hiroshima, Nagasaki, Dresden,
Hamburg, or Berlin in World War II. The Air Force was finally able to
deliver what it had promised in the 1930s: striking military targets
while avoiding homes, schools, and hospitals. Where the Norden bombsight
had failed, smart bombs succeeded.
The Gulf War taught America
to expect that future wars would be nearly bloodless, at least for its own
soldiers. (On average from September 11, 2001, through 2012, about 540
Americans died each year in Iraq and Afghanistan. More died on average
every two days in World War II, and that from an America with less than
half the 2012 population.) And world opinion would no longer tolerate
the widespread civilian casualties of Korea or Vietnam. On February 13,
1991, two fighter-bombers used laser-guided smart bombs to attack Baghdad’s
Amiriyah shelter, which had been mistaken for a military command center.
A bomb went down the shelter’s airshaft and killed 408 civilians,
provoking outrage in the Arab world and protests in Europe and America.
*
Engineers
have been tinkering with UAVs since the early days of aviation. In
World War I, the Naval Consulting Board, chaired by Thomas Edison,
funded a gyroscopic autopilot for an anti-ship “aerial torpedo” to be
developed by Elmer Sperry and Peter Hewitt. The torpedo was designed to
fly a preset magnetic course at a fixed altitude, wait until an engine
revolution counter determined that it had achieved the desired range,
and then dive onto an enemy ship that was calculated to be
below. In flight tests, an autopilot-controlled seaplane flew a
thirty-mile course and automatically dropped a bag of sand that missed
the target by two miles, which was not bad for 1917. The Navy placed an
order for six aerial torpedoes—stripped-down airplanes without seats or
pilot controls that could carry a payload of a thousand pounds of
explosive. The torpedo’s initial flight tests were unsuccessful, and the
war ended before it saw service. Sperry also contributed an autopilot
to the Army for a UAV, the “Kettering Bug,” named after its designer,
Charles Kettering. The Bug could carry two hundred pounds of explosives
seventy-five miles. After a successful flight test, the Army ordered a
hundred planes. Like the Navy’s aerial torpedo, the Bug did not see
combat.
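The torpedo’s range logic was dead reckoning at its simplest: count engine revolutions, and when the count implies the preset distance has been flown, begin the dive. A minimal sketch of that idea in modern code, with calibration numbers invented purely for illustration:

# Illustrative dead-reckoning cutoff in the spirit of the 1917 aerial torpedo.
# The calibration figures below are invented for illustration, not historical data.

REVS_PER_MILE = 1200        # assumed engine revolutions per mile of flight
PRESET_RANGE_MILES = 30     # distance to the ship's calculated position

def should_dive(revolution_count: int) -> bool:
    """True once the revolution counter implies the preset range has been flown."""
    return revolution_count / REVS_PER_MILE >= PRESET_RANGE_MILES

# The autopilot holds the magnetic course and altitude; this check runs as the
# counter advances and triggers the terminal dive when it trips.
for revs in range(0, 40_000, 100):
    if should_dive(revs):
        print(f"dive after {revs} revolutions (about {revs / REVS_PER_MILE:.0f} miles)")
        break

The weakness is visible in the sketch: any error in the revolutions-per-mile calibration, or any wind along the way, translates directly into a miss.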
Neither the Navy’s aerial torpedo nor the Army’s Bug had
any external guidance, and both services saw the need for radio control
if UAVs were to hit a specific target such as a ship or an artillery
emplacement. The Navy lost interest in radio-controlled planes in the
mid-1920s, while the Air Corps persisted into the 1930s, when it
abandoned investment in UAVs in favor of the Norden bombsight and the
B-17 heavy bomber.
Germany’s World War II V-1 “buzz bomb,” like
America’s World War I aerial torpedo, was an unguided UAV flying on
autopilot. Its mission was to hit any populated area in southern
England, a task that required no intelligence. The most successful smart
bomb of World War II was the Japanese Kamikaze plane. Its guidance
system was a human pilot, but it proved what a guided bomb could do.
Kamikaze attacks sank 34 U.S. ships, damaged 384 others, and killed
4,900 sailors. Fourteen percent of the Kamikazes survived intense
anti-aircraft fire and fighter defenses to strike a ship, and they sank
8.5 percent of those they struck.
“Operation Aphrodite” was a plan
to turn worn-out B-17 and B-24 heavy bombers into smart bombs: strip
out all guns, armor, seats, and other unnecessary gear; stuff the bomber
with thirty thousand pounds of high explosive; put a television camera
in the nose; and fly it by radio from a mother ship, which would direct
the plane to its target. Twenty were launched, and all failed—shot down,
crashed because of control problems, or exploded prematurely.
Lieutenant Joseph P. Kennedy Jr., John F. Kennedy’s older brother, died in an
Aphrodite explosion on August 12, 1944. The Allied generals abandoned
Aphrodite as unworkable in late January 1945.
Throughout its
history, the Air Force has shown more interest in new aircraft than in
new munitions. The Korean air war was mostly fought with World War II
weaponry, with the exception of new jet fighters. After Korea, the
Eisenhower administration’s New Look military strategy emphasized
nuclear weapons. The Air Force entered the Vietnam era with an array of
nuclear missiles, a fleet of B-52 strategic bombers designed to carry
four nuclear bombs each, fighter-bombers designed for high-speed,
low-level nuclear attack, conventional “iron” bombs that were little
advanced from those it had possessed in 1945, and only two smart bombs,
both developed by the Navy.
Aware that it needed better weapons,
the Air Force enlisted Texas Instruments and the Army’s Redstone Arsenal
to develop what would become the Paveway laser-guided smart bomb: one
plane would shine a pulsed, invisible, infrared laser beam on a target,
and another plane, flying at approximately twelve thousand feet, would
drop a bomb anywhere in a one-thousand-foot-diameter imaginary “basket”
around the target, which was reflecting infrared radiation from the
first plane’s laser beam. The bomb would look for infrared radiation at
the right frequency, pulsing in synchrony with the designating laser’s
beam. When it found that combination, it would lock on, head for the
target, and destroy it. Texas Instruments had not yet designed a defense
system, so it faced credibility problems in a competition against a
rival system proposed by a more experienced North American Aviation
subsidiary. The responsible Air Force officer bypassed the normal
contracting process and convened a “generals board” that included
recently retired Air Force chief of staff Curtis LeMay. The Air Force
approved Paveway and sent units to Vietnam for combat testing in
1968—just as President Johnson announced a halt to bombing of North Vietnam.
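Stripped of hardware detail, the seeker’s job described above is pulse matching: accept reflected infrared energy only when its pulse timing agrees with the designating laser’s, and ignore everything else. A toy sketch of that matching step, with pulse intervals and tolerance invented for illustration rather than taken from any real seeker:

# Illustrative pulse-matching check for a laser seeker.
# Pulse spacing and tolerance are invented for illustration, not Paveway parameters.

EXPECTED_INTERVALS = [0.10, 0.12, 0.10, 0.14]   # assumed designator pulse spacing, seconds
TOLERANCE = 0.005                               # allowed timing error per interval

def matches_designator(pulse_times: list[float]) -> bool:
    """True if the most recent pulse spacings match the expected pattern."""
    if len(pulse_times) < len(EXPECTED_INTERVALS) + 1:
        return False                            # not enough pulses seen yet
    gaps = [b - a for a, b in zip(pulse_times, pulse_times[1:])]
    recent = gaps[-len(EXPECTED_INTERVALS):]
    return all(abs(g - e) <= TOLERANCE for g, e in zip(recent, EXPECTED_INTERVALS))

# Reflections whose timing does not fit the pattern are ignored; a matching
# train of pulses triggers lock-on.
seen = [0.00, 0.10, 0.22, 0.32, 0.46]
print("lock on" if matches_designator(seen) else "ignore")

Matching on timing rather than raw brightness is what keeps a seeker of this kind from chasing sun glints or flares.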
From
1965 to 1968, the United States had rained bombs on North Vietnam in
Operation Rolling Thunder. The Thanh Hoa Bridge across the Song Ma
River, for example, was the target of eight hundred American sorties
that dumped ten thousand tons of explosives. The bombs had scarred the
bridge, but the anti-aircraft guns and surface-to-air missiles (SAMs)
that surrounded it shot down 104 American pilots, and the bridge
remained standing. The Long Bien Bridge across the Red River in Hanoi
was another apparently impregnable target—three hundred anti-aircraft
guns and eighty-five SAM sites kept twenty-six supply trains crossing
the bridge every day from China and the port of Haiphong. In 1972, when
Nixon renewed bombing of North Vietnam with the Linebacker campaign,
smart bombs took out both bridges in a matter of days.
Political
success did not follow military success. As a bombing exercise,
Linebacker was enormously successful, but the United States lost the
war. Linebacker did convince the Air Force to continue to invest in
improved short-range smart bombs such as Paveway. These were launched
from warplanes in combat, and they fit solidly into the Air Force’s
precision bombing doctrine. Long-range cruise missiles, developed in the
1970s and introduced in the early 1980s, were another story. These
jet-propelled, subsonic unmanned airplanes are descendants of the German
V-1 buzz bomb. They fly at low altitude to evade enemy radar, are
self-guided to fly a programmed route, have a range of about fifteen
hundred miles, and can carry either a nuclear or conventional warhead of
up to two thousand pounds. They are smart bombs, but they make the
pilot less important. They do not need a sophisticated bomber—subsonic
B-52s, submarines, or surface ships could launch them from a distance.
In 1977, President Jimmy Carter cancelled the Air Force’s prized
supersonic B-1A bomber when cruise missiles became available. A French
general said in an interview, “The B-1 is a formidable weapon, but not
terribly useful. For the price of one bomber, you can have 200 cruise
missiles.” Air Force officers groused that the United States might as
well subcontract the next war to Pan Am. But at a cost of more than $1
million each, cruise missiles are not weapons for routine use. Their
advantage is that they can be launched from afar, but they cost
significantly more than short-range smart bombs, carry a smaller
payload, and are somewhat less accurate, so the Air Force’s pilots and
warplanes kept a role in aerial combat.
Immediately after the Gulf
War, the Air Force and Navy began development of the Joint Direct
Attack Munition (JDAM) guidance kit, which could be bolted onto
conventional bombs. JDAM bombs are ideal for fixed targets such as
airfields, oil refineries, or power plants. GPS navigation systems are
susceptible to jamming, so JDAM couples GPS guidance with an inertial
guidance system that determines the bomb’s position by measuring its
acceleration, similar to the guidance systems used in ICBMs. JDAM is
inexpensive (about $20,000 per kit) and, unlike laser- or
television-guided smart bombs, does not require target visibility—feed
it the coordinates of the target, and cloud cover and dust are no
impediment.
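Underneath, the coupling described here is dead reckoning corrected by occasional fixes: the inertial unit integrates measured acceleration into a running position estimate, and each valid GPS fix pulls that estimate back toward truth. A toy one-dimensional version of such blending, with the gains, timing, and flight path all invented for illustration (this is not the actual JDAM algorithm):

# Toy 1-D blend of inertial dead reckoning with occasional GPS fixes.
# Gains, update rates, and the trajectory below are invented for illustration.

class GpsInsBlend:
    def __init__(self, dt: float = 0.1, gps_gain: float = 0.3):
        self.dt = dt                  # seconds between inertial updates
        self.gps_gain = gps_gain      # how strongly a GPS fix corrects the estimate
        self.position = 0.0
        self.velocity = 0.0

    def inertial_step(self, accel: float) -> None:
        # Dead-reckon by integrating measured acceleration.
        self.velocity += accel * self.dt
        self.position += self.velocity * self.dt

    def gps_correct(self, gps_position: float) -> None:
        # Nudge the estimate toward the fix (a simple complementary filter).
        self.position += self.gps_gain * (gps_position - self.position)

nav = GpsInsBlend()
for step in range(50):
    nav.inertial_step(accel=2.0)            # pretend constant 2 m/s^2 along track
    if step % 10 == 9:                      # fixes arrive only occasionally; if GPS
        t = (step + 1) * nav.dt             # is jammed, this branch simply never runs
        nav.gps_correct(gps_position=0.5 * 2.0 * t * t)
print(f"estimated position after 5 s: {nav.position:.1f} m")

If jamming silences the GPS branch, the loop keeps dead reckoning on inertia alone, trading some accuracy for immunity, which is the point of carrying both systems.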
Smart bombs became standard munitions. In the 1995
NATO bombing campaign in Bosnia, 98 percent of the bombs dropped were
smart. In the 1999 Kosovo campaign, precision bombing
finally won a war without the need for a land invasion: Yugoslav president
Slobodan Milosevic gave up when he lost popular support after NATO hit
Belgrade government buildings, the telephone system, and the Yugoslav
power grid. “Precision” did not mean that the operation was bloodless or
free from blunders, however. NATO hit an Albanian refugee column that
it mistook for a Serbian convoy, and a JDAM bomb destroyed the Chinese
embassy in Belgrade when someone entered the wrong GPS coordinates.
In
the 1970s, the Pentagon assigned the Army the job of developing a
battlefield reconnaissance drone code-named “Aquila.” The drone’s
specifications kept growing: night vision, laser designation for smart
bomb attacks, a secure data link, armor. Only a few expensive prototypes
were built, and the program was canceled in 1987. But when Israeli
drones proved their worth in scouting Syrian radar sites in the Bekaa
Valley in 1982, the Navy took note and acquired the Pioneer drone from
an Israeli defense firm. The Pioneer was a simple reconnaissance drone,
much like the original specification for the Aquila. The Navy used it
successfully as a spotter for its battleships’ sixteen-inch guns in the
1991 Gulf War.
The Air Force was less interested in UAVs than
was the Navy, and it invested in them later than it did in smart bombs.
It was the CIA, congressional Republicans, an Israeli engineer, and a
small San Diego defense firm—not the Air Force—that would make the UAV
an important American weapon.
Abraham Karem, an Israeli designer
of drone aircraft, moved to the United States in the 1970s but was
unable to find a job with a defense firm. So he started his own company,
Leading Systems, and worked above his garage. He received seed money
from the Pentagon to develop a drone aircraft. His UAV
exceeded its contract’s specifications, flying 650 hours without a
crash, but the contract for further development went to another Israeli
defense firm, and Leading Systems went broke.
Two brothers, Neal
and Linden Blue, owned a cocoa and banana plantation in Nicaragua in the
1970s. They became friendly with the Nicaraguan dictator Anastasio
Somoza, who was opposed in a guerrilla war by the Sandinista National Liberation
Front, and the Blues saw UAVs as a way to attack the Sandinistas’
gasoline storage tanks. They bought a small defense contractor, General
Atomics, from Chevron in 1986. General Atomics purchased Leading
Systems’ assets in bankruptcy and kept Karem working on an improved
version of his drone, the GNAT-750, which made its first flight in 1989.
The CIA and the Turkish government bought multiple GNAT-750s.
The
Blues were looking to the long term: they believed that once the Air
Force saw that it was in danger of losing control of a growing segment
of military aviation, it would bend to the inevitable, just as it had in
the 1950s when it took up missiles despite seeing them as a threat to
its prized bomber fleet. And rather than just wait for the Air Force to
come to its senses, the Blues pushed. General Atomics spent more on
political contributions as a percentage of sales than did any other
defense contractor. Its specialty was offering junkets to key
congressional staffers (a practice that is now illegal). Its
congressional supporters included conservative Southern Californian
Republican representatives Jerry Lewis and Randy “Duke” Cunningham.
In 2005, Cunningham pled guilty to federal charges relating to bribery
(not by General Atomics), and Lewis’s reputation was sullied by charges
of favoritism toward General Atomics and other contractors. Lewis was
not indicted, but when the Republicans regained control of Congress in
2010, his party did not offer him his old post as chairman of the House
Appropriations Committee.
General Atomics developed the Predator
as the GNAT-750’s successor, and it first saw service in Bosnia in 1995.
Predators were at that time reconnaissance-only airplanes, roughly the
size of a small Cessna. They were underpowered (Rotax, the company that
manufactured the engine, was best known for snowmobiles), were not
equipped with radar to see through clouds, had no de-icing equipment,
and were difficult to land. Lewis had forced the Predator on the Air
Force in 1994 with an earmark. “If it had not been for an earmark, the
Predator would not have been in Bosnia,” Lewis told Fox News in 2006.
“And that mission served our country very, very well. A classic
illustration of earmarks at their best.” General Atomics’ political
strategy worked—Congress forced the Air Force to invest in drones. In
2000, Republican senator John Warner laid out his goal for the Pentagon:
one-third of its purchased aircraft should be unmanned by 2010. The
Congressional Unmanned Systems Caucus remains a potent political force
as of this writing, with sixty representatives who are members.
There
was some resistance. Most Air Force generals come from the ranks of
fighter pilots, and as Hap Arnold pointed out in 1944, drone aircraft
threaten to make fighter pilots obsolete. When the Air Force did
reluctantly take up UAVs, it hired a civilian contractor to control
them. Only when it found that its contractor was hiring retired Navy
pilots did it assign its own pilots to unmanned aircraft, and even then
it paid them less than “real” pilots and gave them no career advancement
credit for flight hours. By September 2001, nineteen of the sixty-eight
Predators that had been delivered to the Air Force had been lost, many
due to operator error. The Air Force viewed them as toys. General
Atomics fixed the Predator’s performance problems—bigger turboprop
engine, de-icing equipment, higher ceiling, greater payload. But it
still had customer problems with the Air Force.
Gen. John Jumper
was named Air Force chief of staff the week before the 9/11 attack on
the World Trade Center. He had commanded U.S. and Allied air forces in
the Bosnia and Kosovo campaigns. Unlike many in the Air Force, Jumper
saw the potential of UAVs, though their limitations frustrated him. A
Predator operator could spot an enemy tank, for example, but that was
it. The operator would then have to send the location of the tank to a
bombing coordinator, who would send two planes—a designator plane to
“paint” the tank with a laser beam, and a second plane to destroy it
with a smart bomb. During that delay, the tank might have fired on
American troops or left the area entirely. Jumper’s solution was to add a
“laser ball” to his Predators so that an operator could designate a
target, keep the laser beam on it even if it moved, and then call in a
plane for a laser-bomb strike. But for “fleeting, perishable targets
that don’t require a big warhead and that we can just go ahead and take
care of,” Jumper saw even that as an unnecessary delay. He armed the
Predator with its own laser-guided missiles—a pair of hundred-pound
Hellfire anti-armor missiles. That solution married a UAV to a smart
bomb; Predators could fly for hours, their operators sitting in cubicles
in trailers near Las Vegas, taking breaks so that their attention did
not flag, going home to their families as they handed the planes—still
in flight—over to the next shift. When a target appeared, an operator
could designate it with his laser ball and destroy it with his Hellfire.
The
post-9/11 war in Afghanistan showed what an armed Predator could
do—kill Al Qaeda leaders. Smart bombs can hit a target, but a Predator
armed with a smart bomb can often identify what is inside the target. It
can hover above a building for hours, watching people entering and
leaving, and it can follow an automobile down a highway. Commanders
could make fine distinctions about acceptable “collateral damage” to
civilians—should a car carrying an Al Qaeda leader and an unknown
companion be destroyed? What about a leader and his wife? What about a
leader and his three children? In the past, bombs had been made bigger
to compensate for their inaccuracy, but smart bombs reversed that trend.
Their precision meant that bombs could be made smaller, just big enough
to destroy a targeted house but leave the neighbor’s house standing.
As
of this writing, the drone’s operator, not a computer, decides when an
American weapon will be fired. The operator examines the video feed and
determines whether an attack is authorized under his orders. (Although
visual confirmation is no guarantee, as leaked 2007 footage of a
mistaken and deadly helicopter attack on civilians in Iraq shows.) But
computers will become more involved, and the idea that humans can
effectively oversee computers is illusory: in 1988, the guided-missile
cruiser USS Vincennes shot down Iran Air Flight 655, killing all 290 on
board. The airliner was ascending, was flying its scheduled route, and
had its civilian transponder operating, but the Vincennes’s Aegis
computer system mistook the Airbus A300 for an Iranian F-14
fighter-bomber. Pressed for time, and believing the Vincennes was under
attack, the crew accepted the computer’s “advice” and fired two missiles
at the plane, destroying it.
As image-recognition and artificial
intelligence software improve, computers will demonstrate an improved
ability to distinguish tanks from taxis and terrorists’ vans from school
buses. Operators will learn to trust the computers, and when the
computer says to fire, operators will obey. As UAVs proliferate,
American drones will face enemy drones on the battlefield, and delays to
call a human operator will be seen as intolerable. Computers will be
given authority to fire, just as computers have been given the authority
to risk billions of dollars in flash trading against other computers
despite the occasional disastrous loss.
Allowing computers to make
life-and-death decisions may be inevitable, but it is frightening. In
his 1953 short story “Second Variety,” science fiction author Philip K.
Dick imagined a war in which autonomous American killer robots could
distinguish and kill enemy soldiers. Then the military took the next
step—giving the computerized factories that produced the killer robots
autonomy to design improved models. Dick’s story does not end well for
the human race.
UAVs and smart bombs are not yet an incarnation of
Dick’s nightmare scenario, but they have transformed America’s arsenal.
Older weapons were unusable: chemical weapons were simultaneously
horrifying and militarily ineffective, and nuclear bombs are
disproportionate—like arming bank guards with dynamite. Combinations of
smart bombs with UAVs, however, have shown themselves to be both usable
and adaptable.
When to use them is another question. After
Vietnam, the United States embraced what came to be known as the
“Weinberger-Powell doctrine,” named after Reagan’s secretary of defense,
Caspar Weinberger, and Joint Chiefs of Staff chairman Colin Powell.
According to the doctrine, before the United States would initiate
military action, national security must be threatened; all political,
economic, and military means must be exhausted; a clear objective and a
plausible exit strategy must exist; and the action must have broad
public and international support. The 1991 Gulf War fit the
Weinberger-Powell doctrine, but later interventions in Somalia, Bosnia,
and Kosovo arguably did not. In the run-up to Bosnia, Madeleine Albright,
then the U.S. ambassador to the United Nations, asked Powell, “What’s the point of having this
superb military that you’re always talking about if we can’t use it?”
After
9/11, with the availability of smart bombs and UAVs, the
Weinberger-Powell doctrine is effectively obsolete. Military force is often the
first choice for the United States, supplanting diplomacy or other
efforts. America is supposedly not at war in Yemen or Pakistan or
Somalia, but Air Force drones strike there regularly. Because there is
no risk to the pilots, there is little public scrutiny. And the CIA
operates its own drones, with no public scrutiny at all. Legal and
ethical questions remain unanswered: Should a Predator attack on a known
terrorist in his car be considered an act of war or an assassination?
What about terrorists who are American citizens? Who decides on
legitimate targets?
At present, the United States has a
technological lead in both smart bombs and UAVs. Historically, however,
no nation has been able to maintain a weapons monopoly indefinitely—the
American monopoly of the atomic bomb lasted only four years, and its
monopoly of the hydrogen bomb less than that. Once other nations begin
to use drones routinely, America may have to rethink its position on
cross-border anti-terrorist attacks. What, for example, would the United
States say about Russian UAV attacks on Chechen rebels in the mountains
of neighboring Georgia, or a drone attack that the Chinese considered
launching against a drug lord in Burma?
China has offered its
drones for sale at an air show, and other countries have doubtless
produced them as well. Export controls are unlikely to be effective in
controlling proliferation. The United States sells UAVs and smart bombs
to its allies, and the weapons are lost on the battlefield.
Reverse-engineering the hardware of captured weapons would be relatively
simple, although re-creating the firmware, which is certainly
encrypted, would be more difficult. (The Iranians, however, claim they
decrypted the video of a crashed American drone.) But America’s enemies
have competent programmers and hackers, and digital espionage requires
nothing more than access to the right computer.
Iran and North
Korea waste their time trying to make seventy-year-old nuclear weapons
and fifty-year-old ICBMs. They are repeating Saddam Hussein’s
mistake—developing weapons that oppose the United States symmetrically.
Tanks and airplanes failed Hussein, but insurgents have used
suicide bombers and IEDs, decidedly asymmetric weapons, far more
effectively against coalition forces in both Iraq and Afghanistan. A
better R&D strategy for America’s enemies would be to develop
robotic IEDs that combine off-the-shelf technologies—an
explosive-stuffed model airplane guided by GPS, for example, or an IED
built using a radio-controlled car with a video camera in its nose. The
next arms race has only just begun.
Excerpted from “American Arsenal: A Century of Waging War” by Patrick Coffey. Copyright 2013 Oxford University Press. All rights reserved.