  Soldiers will not be left behind completely, and not all human functions will be automated. None of the robots in operation today operate fully autonomously—which is to say, without any human direction—and, as we’ll discuss later, there are important aspects of combat, like judgment, that robots will not be capable of exercising for many years to come. To better understand how technology will soon enhance the capabilities of human soldiers, we asked a now-inactive Navy SEAL, who, incidentally, participated in the Osama bin Laden raid in May 2011, what he anticipated for combat units in the future. First, he told us, he envisioned units equipped with highly sophisticated and secure tablet devices that will allow soldiers to tap into live video feeds from UAVs, download relevant intelligence analysis and maintain situational awareness of friendly troop movements. These devices will have unique live maps loaded with enough data about the surrounding environment—the historical significance of a street or building, the owners of every home, and the interior infrared heat movements captured by drones overhead—to provide soldiers with a much clearer sense of what to target and what to avoid.

  Second, the clothes and gear that soldiers wear will change. Haptic technologies—those based on touch and feeling—will produce uniforms that allow soldiers to communicate through pulses, sending out signals to one another that result in a light pinch or vibration in a particular part of the body. (For instance, a pinch on the right calf could indicate a helicopter is inbound.) Helmets will have better visibility and built-in communications, allowing commanders to see what the soldiers see and “backseat drive,” directing the soldiers remotely from the base. Camouflage will allow soldiers to change their uniform’s color, texture, pattern or scent. Uniforms might even be able to emit sounds to drown out noises soldiers want to hide—sounds of nature masking footsteps, for example. Lightweight and durable power sources will be integrated as well, so that none of the devices or wearable technologies will fail at crucial moments due to heat, water or distance from a charger. Soldiers will have the additional ability to destroy all of this technology remotely, so that capture or theft will not yield valuable intelligence secrets.

  And, of course, wrapping all of this together will be a hefty layer of cybersecurity—more than any civilian would use—that enables instant data transmission within a cocoon of electronic protection. Without security, none of the advantages above will be worth the considerable cost that will be required to develop and deploy them.

  Alas, military contractors’ procedures will hold back many of these developments. In the United States, the military-industrial complex is working on some of the initiatives mentioned above—DARPA has led the development of many of the robots now in operation—but it is by nature ill-equipped to handle innovation. Even DARPA, while relatively well funded, is predictably stalled by elaborate contracting structures and its position in the Department of Defense bureaucracy. The innovative edge that is the hallmark of the American technology sector is largely walled off from the country’s military by an anarchic and byzantine acquisitions system, and this represents a serious missed opportunity. Without reforms that allow military agencies and contractors to behave more like small private companies and start-ups (with maneuverability and the option to move quickly) the entire industry is likely to retrench rather than evolve in the face of fiscal austerity.

  The military is well aware of the problems. As Singer told us, “It’s a big strategic question for them: How do they break out of this broken structure?” Big defense projects languish in the prototype stage, over budget and behind schedule, while today’s commercial technologies and products are conceived of, built and brought to market in volume in record time. For example, the Joint Tactical Radio System, which was supposed to be the military’s new Internet-like radio-communications network, was conceived of in 1997, then shut down in September 2012, only to have acquisitions functions transferred to the Army under what is now called the Joint Tactical Networking Center. By the time it was shut down as its own operation, it had cost billions of dollars and was still not fully deployed on the battlefield. “They just can’t afford that kind of process anymore,” Singer said.

  One recourse for the military and its contractors is to use commercial, off-the-shelf (COTS) products, which means buying commercially available technologies and devices rather than developing everything in-house. The integration of such outside products, however, is not an easy process; meeting military specifications alone (for ruggedness, utilization and security) can introduce damaging delays. According to Singer, the bureaucracy and inefficiency of the military contracting system have actually generated an unprecedented degree of ground-level ingenuity in building functional work-arounds. Some involve buying quick-need systems outside the normal Pentagon acquisitions process; that is how MRAP (mine-resistant, ambush-protected) vehicles were rushed to the front after the scourge of IEDs began in Iraq. And troops often adapt commercial technologies that they take on a deployment themselves.

  Even military leaders have recognized the advantages that such inventiveness can bring. “The military was, in some ways, aided by the demands of the battlefields in Iraq and Afghanistan,” Singer explained. “In Afghanistan, Marine attack helicopter pilots have taken to strapping iPads onto their knees as they fly, and using those for maps instead of the built-in system in their crafts.”6 He added that as the pressure of an active battlefield ends, military leaders are worried that innovative work-arounds might evaporate. It remains to be seen if innovation will drive change in a problematic contracting system.

  • • •

  Technological breakthroughs have offered the United States major strategic advantages in the past. For many years after the first laser-guided missiles were developed, no other country could match their lethality over long distances. But technological advantages generally tend to equalize over time, as technologies are spread, leaked or reverse-engineered, and sophisticated weaponry is no exception. The market for drones is already international: Israel has been at the forefront of that technology for years; China is very active in promoting and selling its drones; and Iran unveiled its first domestically built drone bomber in 2010. Even Venezuela has joined the club, utilizing its military alliance with Iran to create an “exclusively defensive” drone program that is operated by Iranian missile engineers. When asked to confirm reports of this program, the Venezuelan president Hugo Chávez remarked, “Of course we’re doing it, and we have the right to. We are a free and independent country.” Unmanned drones will get smaller, cheaper and more effective over time. As with most technologies, once a product is released into the environment—be it a drone or a desktop application—it’s impossible to put it back in the box.

  We asked the former DARPA director Regina Dugan how the United States approaches the high level of responsibility that comes with building such things, knowing that the ultimate consequences are out of its control. “Most advances in technology, particularly big ones, tend to make people nervous,” she said. “And we have both good and bad examples of developing the societal, ethical and legal framework that goes with those kinds of technological advances.” Dugan pointed to the initial concerns people expressed about human genome sequencing when that breakthrough was announced: If it could be determined that you had a predisposition toward Parkinson’s disease, how would that affect how employers and insurance companies treated you? “What came to pass was the understanding that the advance that would allow you to see that predisposition was not the thing that we should shy away from,” Dugan explained, “but rather we should create the legal protections that ensure that people couldn’t be denied health care because they had a genetic predisposition.” The development of technological advances and the protections they will ultimately require must grow in tandem for the right balance to be struck.

  Dugan described her former agency’s role in stark terms: “You can’t undertake a mission like the invention and prevention of strategic surprise if you’re unwilling to do things that initially make people feel uncomfortable.” Rather, the obligation is to handle that job responsibly—which, critically, requires input and help from other people. “The agency can’t do it by itself. One has to involve other branches of government, other parties, in the debate about those things,” she said.

  It is comforting to hear how seriously DARPA takes its responsibility for these new technologies, but the problem is, of course, that not all governments will approach them with similar consideration and caution. The proliferation of drones presents a particularly worrisome challenge, given the enormous benefits they bestow upon even the smallest armies. Not every government or military in the world has the technical infrastructure or human capital to support its own fleet of unmanned vehicles; only those with deep pockets will find it easy to buy that capability, openly or otherwise. Owning military robots—particularly unmanned aerial vehicles—will become a strategic prerogative for every country; some will acquire them to gain an edge, and the rest will acquire them just to maintain their sovereignty.

  Underneath this state-level competition, there will be an ongoing race by civilians and other non-state actors to acquire or build drones and robots for their own purposes. Singer reminded us that “non-state actors that range from businesses like media groups and agricultural crop dusting to law enforcement, to even criminals and terrorists, have all used drones already.” The controversial private military firm Blackwater, now called Academi, LLC, unveiled its own special service—unmanned drones, available to rent for surveillance and reconnaissance missions—in 2007. In 2009, it was contracted to load bombs onto CIA drones.

  There is also plenty of private development and use of drones outside the context of military procurement. For example, some real-estate firms are now using private drones to take aerial photographs of their larger properties. Several universities have their own drones for research purposes; Kansas State University has established a degree for unmanned aviation. And in 2012 we learned about Tacocopter (a service allowing anyone craving a taco to order on a smart phone, punch in his location and receive his tacos by drone), which proved to be a hoax, but is both technically possible and not far off.

  As we mentioned earlier, lightweight and inexpensive “everyman” drones engineered for combat purposes will become particularly popular at the global arms bazaar and in illicit markets. Remotely piloted model planes, cars and boats that can conduct surveillance, intercept hostile targets and carry and detonate bombs will pose serious challenges for soldiers in war zones, adding a whole other dimension to combat operations. If the civilian version of armed drones becomes sophisticated enough, we could well see military and civilian drones meeting in battle, perhaps in Mexico, where drug cartels have the will and the resources to acquire such weapons.

  Governments will seek to restrict access to the key technologies making drones easy to mass-produce for the general populace, but regulating the proliferation and sale of these everyman drones will be very difficult. An outright ban is simply unrealistic, and even modest attempts to control civilian use in peaceful countries will have limited success. If, for example, the U.S. government required people to register their small unmanned aircraft, restricted the spaces in which drones could fly (not near airports or high-value targets, for example) and banned their transport across state lines, it’s not hard to imagine determined individuals finding ways around the rules by reconfiguring their devices, anonymizing them or building in some kind of stealth capacity. Still, we might see international treaties around the proliferation of these technologies, perhaps banning the sale of larger drones outside official state channels. Indeed, states with the greatest capacity to proliferate UAVs may even pursue the modern-day version of the Strategic Arms Limitation Talks (SALT), which sought to curtail the number of U.S. and Soviet arms during the Cold War.

  States will have to work hard to maintain the security of their shores and borders from the growing threat of enemy UAVs, which, by design, are hard to detect. As autonomous navigation becomes possible, drones will become mini cruise missiles, which, once fired, cannot be stopped by interference. Enemy surveillance drones may be more palatable than drones carrying missiles, but both will be considered a threat since it won’t be easy to tell the two apart. The most effective way to target an enemy drone might not be with brute force but electronically, by breaching the UAV’s cybersecurity defenses. Warfare then becomes, as Singer put it, a “battle of persuasion”—a fight to co-opt and persuade these machines to do something other than their mission. In late 2011, Iran proudly displayed a downed but intact American drone, the RQ-170 Sentinel, which it claimed to have captured by hacking into its defenses after detecting it in Iranian airspace. (The United States, for its part, would say only that the drone had been “lost.”) An unnamed Iranian engineer told The Christian Science Monitor that he and his colleagues were able to make the drone “land on its own where we wanted it to, without having to crack the remote-control signals and communications” from the U.S. control center because of a known vulnerability in the plane’s GPS navigation. The technique of implanting new coordinates, known as spoofing, while not impossible, is incredibly difficult (the Iranians would have had to get past the military’s encryption to reach the GPS, by spoofing the signals and jamming the communications channels).

  Diplomatic solutions might involve good-faith treaties between states not to send surveillance drones into each other’s airspace or implicit agreements that surveillance drones are an acceptable offense. It’s hard to say. Perhaps there might emerge international requirements that surveillance drones be easily distinguishable from bomber drones. Some states might join together in a sort of “drone shield,” not unlike the nuclear alliance of the Cold War, in which case we would see the world’s first drone-based no-fly zone. If a small and poor country cannot afford to build or buy its own bomber drones, yet it fears aerial attacks from an aggressive neighbor, it might seek an alliance with a superpower to guarantee some measure of protection. It seems unlikely, however, that states without drones will remain bereft for long: The Sentinel spy drone held by the Iranians cost only around $6 million to make.

  The proliferation of robots and UAVs will increase conflict around the world—whenever states acquire them, they’ll be eager to test out their new tools—but it will decrease the likelihood of all-out war. There are a few reasons for this. For one, the phenomenon is still too new; the international treaties around weapons and warfare—the Nuclear Nonproliferation Treaty, the Anti-Ballistic Missile Treaty, and the Chemical Weapons Convention, to name a few—have not caught up to the age of drones. Boundaries need to be drawn, legal frameworks need to be developed and politicians must learn how to use these tools responsibly and strategically. There are serious ethical considerations that will be aired in public discourse (as is taking place in the United States currently). These important issues will lead states to exhibit caution in the early years of drone proliferation.

  We must also consider the possibility of a problem with loose drones, similar to what we see with nuclear weapons today. In a country such as Pakistan, for example, there are real concerns about the state’s capacity to safeguard its nuclear stockpiles (estimated to be a hundred nuclear weapons) from theft. As states develop large fleets of drones, there will be a greater risk that one of these could fall into the wrong hands and be used against a foreign embassy, military base or cultural center. Imagine a future 9/11 committed not by hijackers on commercial airliners, but instead by drones that have fallen out of state hands. These fears are sufficient to spur future treaties focused on establishing requirements for drone protection and safeguarding.

  States will have to determine, separately or together, what the rules around UAVs will be and whether they will be subject to the same rules as regular planes regarding violations of sovereign airspace. States’ mutual fears will guard against a rapid escalation of drone warfare. Even when it was revealed that the American Sentinel drone had violated Iranian airspace, the reaction in Tehran was boasting and display, not retaliation.

  The public will react favorably to the reduced lethality of drone warfare, and that will forestall outright war in the future. We already have a few years of drone-related news cycles in America from which to learn. Just months before the 2012 presidential election, government leaks resulted in detailed articles about President Obama’s secret drone operations. Judging by the reaction to drone strikes in both official combat theaters and unofficial ones like Somalia, Yemen and Pakistan, lethal missions conducted by drones are far more palatable to the American public than those carried out by troops, generating fewer questions and less outrage. Some of the people who advocate a reduced American footprint overseas even support the expansion of the drone program as a legitimate way to accomplish it.

  We do not yet understand the consequences—political, cultural and psychological—of our newfound ability to exploit physical and emotional distance and truly “dehumanize” war to such a degree. Remote warfare is taking place more than at any other time in history and it is only going to become a more prominent feature of conflict. Historically, remote warfare has been thought of mostly in terms of weapons delivered via missiles, but in the future it will be both commonplace and acceptable to further separate the actor from the scene of battle. Judging from current trends, we can assume that one effect of these changes will be less public involvement on the emotional and political levels. After all, casualties on the other side are rarely the driving factor behind foreign policy or public sentiment; if American troops are not seen to be in harm’s way, the public interest level drops dramatically. This, in turn, means a more muted population on matters of national security; both hawks and doves become quieter with a smaller threat to their own soldiers on the horizon. With more combat options that do not inflame public opinion, the government can pursue its security objectives without having to consider declaring war or committing troops, decreasing the possibility of outright war.