Robotics Revolutionary

April 10, 2017 | Army AL&T Magazine

Ex-Army Ranger Paul Scharre, formerly in OSD and now with the Center for a New American Security, fears DOD bureaucratic resistance could pump the brakes on progress in machine intelligence.

by Ms. Margaret C. Roth

Paul Scharre

It couldn’t be a much bigger leap from Southwest Asia to downtown Washington. Yet, for Paul Scharre, the two hardly could be more closely connected. What Scharre experienced as an Army Ranger deployed to Iraq and Afghanistan—his first look at how robots could mitigate the huge toll that improvised explosive devices (IEDs) were taking on Soldiers—led him directly to what he’s doing now as a civilian: senior fellow and director of the Future of Warfare Initiative at the Center for a New American Security.

In just 10 years, Scharre (rhymes with “sorry” but with “sh” instead of “s”) has seen warfare from three distinct vantage points: the battlefield, as a graduate of the Army’s Airborne, Ranger and Sniper schools and honor graduate of the 75th Ranger Regiment’s Ranger Indoctrination Program; the bureaucracy (the Office of the Secretary of Defense (OSD) from 2008 to 2013); and now the more bookish community of analysts in Washington who aim to make sense of the big picture and influence defense policy. At OSD, he played a leading role in establishing policies on unmanned and autonomous systems and emerging weapons technologies, heading the working group that drafted DOD Directive 3000.09, which established policies on autonomy in weapon systems. Scharre also led DOD efforts to establish policies on intelligence, surveillance and reconnaissance (ISR) programs and directed energy technologies.

With an M.A. in political economy and public policy and a B.S. in physics, Scharre is wholly engrossed in how new technologies translate to warfighting doctrine and acquisition—and he is passionately aware of how long that can take.

With the increased freedom he now has as a former DOD insider looking more broadly at the defense establishment from the outside, Scharre talked with Army AL&T magazine in February about what the Pentagon needs to do to take appropriate advantage of the rapid advances in robotics, artificial intelligence (AI) and autonomous weapon systems. As he perhaps understated it, “I’m just saying, as an observer here, these might be things that the U.S. military can do to be more effective and stay competitive.”

Army AL&T: We were intrigued by your operational background and the amount of thought you’ve given to the topic of robotics and artificial intelligence. How did you get from there to here?

Scharre: When I was in the Army, I saw how decisions in Washington and the Pentagon really affected people downrange. When I first came to the Pentagon, we were working on a suite of different capabilities to try to make the Pentagon’s sluggish bureaucracy more responsive to the warfighters in the field. Things like intelligence, surveillance and reconnaissance were huge issues at the time, and unmanned vehicles are a part of that.

But over time, robotics became a bigger and bigger issue. I think the people inside DOD began to realize the potential of what I would describe as kind of an accidental robotics revolution that happened—the Predator [unmanned aerial vehicle (UAV)] and Gray Eagle, and then large numbers of smaller unmanned aircraft or drones, like the Wasp and Raven, thousands of those things that gave troops the ability to look over hills and around corners. I worked on the receiving end of this [demand], and there was just this tremendous appetite for more ISR, what Secretary Gates [Dr. Robert M. Gates, secretary of defense from December 2006 to July 2011] described as this “insatiable demand.”

And what I saw—which was really disheartening but also educational for me—was the immense resistance within the bureaucracy to respond to the needs of the warfighter on this issue. Secretary Gates had to direct a standalone ISR task force to respond to the needs.

The needs from the COCOMs [combatant commands] were massive and just swamped the ability of the bureaucracy to understand. And rather than say, OK, here’s a legitimate need by warfighters for emerging technology that’s really valuable, and our current processes don’t really make it possible, feasible or affordable to respond to these needs, so we need to find better ways of doing business (which there are lots of opportunities to do, because it’s a new technology), the response of the bureaucracy was basically to reject the warfighters’ needs, to just say no. And it was really only because Secretary Gates forced it on the U.S. Air Force that the Air Force grew the number of Predator or Reaper air patrols from initial small numbers like 12 up to 50 and 60, 65 and 70 [24/7 orbits] over time.

IN IT TO WIN IT
Then-Staff Sgt. Paul Scharre poses with Iraqi children in Diyala province, Iraq, as part of the opening of an elementary school in Baqubah in 2008. (Photo courtesy of Paul Scharre)

As soon as Gates left, there was pushback within the bureaucracy. The Air Force in particular was taking its foot off the pedal and doing less. And I think it’s an indictment of the bureaucracy that we’ve [also] seen across other areas like MRAPs [Mine Resistant Ambush Protected vehicles]. The Air Force is not unique in this. I think the Army’s failure to respond in a timely fashion on MRAPs is just unconscionable and a disgrace.

I think this is a continual problem that the bureaucracy has. The system is designed to think long term about what the future force might need in some unknown, nebulous time frame. When there are immediate needs today, people in the bureaucracy—it’s not that they don’t care; they don’t think that it’s their job to respond to those needs. And the system is so slow that it’s not easy to [respond]. So I’m getting off the topic of robotics, but it’s something that I’m passionate about.

I think speed is really fundamental in this type of international environment we’re living in today. We have a very different military than we had almost 30 years ago at the end of the Cold War, but we’re dealing with bureaucracies that are an outgrowth of institutions that we created in the Cold War. Today we have a wider set of possible challenges. We’re competing against actors like terrorist groups that don’t have the kinds of bureaucracies we have.

That’s going to be a challenge in future wars as well. Whether it’s a big war or small war, whether it’s a war against a terrorist group or another nation-state, you’ve got to be constantly adapting and evolving.

And that’s a really vital lesson that we need to be imparting in our institutions: that the types of threats that we face in the future will be different, and the types of adaptations will be different, and we’ll need the ability to have institutions that can rapidly adapt to whatever those things are. That’s really fundamental, particularly for technologies like robotics that are moving so rapidly. The progress in machine intelligence driven by deep learning and neural networks is just mind-blowing. These deep-learning neural networks are solving problems that have bedeviled AI researchers for decades, things that people just had no idea how to solve.

And so we’re at the beginning of an explosion in machine intelligence that’s likely to unfold. It’s really hard for the U.S. to stay competitive in that environment, in part because things are moving quickly and in part because a lot of the innovation in robotics is happening outside of traditional defense actors. It’s coming from Google and IBM and Microsoft and Facebook and Apple, and they don’t want to work with DOD. It’s not worth the headache. I’ve heard from people in venture capital firms, “I won’t let my companies work with the U.S. military,” because they’re just going to bog you down into a lengthy multiyear process of futzing around with requirements. They’re going to try to over-specify what they need, they’re going to give you a bunch of government red tape. And at the end of the day, the profit margins aren’t even going to be there.

LET ME GET THAT FOR YOU
A remotely piloted explosive ordnance disposal (EOD) robot hefts a 150-pound package during the May 2016 Raven’s Challenge exercise held at the New York State Preparedness Training Center in Oriskany, New York. Scharre’s experience with a similar EOD robot crystallized his thinking that the Army could do more to use robots, as well as AI and other intelligent machines, to do some of the dangerous and difficult work that often falls to Soldiers. (U.S. Army National Guard photo by Sgt. J.P. Lawrence)

And so what we’re seeing is, there’s this model where DOD uses tools like DARPA [the Defense Advanced Research Projects Agency] and the Office of Naval Research [ONR] to fund basic innovation in various technologies, and the concept is that they take this stuff to a commercial market and they mature these technologies, and then they spin back into the defense sector. That’s a great model, [but] I’m not sure how much things are actually coming back in.

Army AL&T: You mean what they call transitioning?

Scharre: Well, there’s two different kinds of concepts. One is, you have a place like DARPA develop something that’s a really appealing proof of concept. And then they throw it over the transom or use some means that’s supposed to cross the “valley of death” that people describe to get into a program of record. And that often fails. There isn’t necessarily an institution of bureaucracy that is designed to grab ahold of those things and then transition them.

Army AL&T: I think the new Army Rapid Capabilities Office has that intent.

Scharre: Yeah, the Rapid Capabilities Office seems exactly like the kind of thing the Army should be doing, and it has a lot of potential. The Army needs that kind of capability from a bureaucratic standpoint. I think it remains to be seen if they’re going to have the bureaucratic clout and the funding and the autonomy to do what they need to do.

And then there’s this smaller issue, that there are some technologies that aren’t even right for transitioning yet. So DOD makes a fundamental investment, and it’s just not mature enough to be really transitioned to a military application, and the company takes it to market in the commercial side and they might mature it. And you hope that over time, that [technology] comes back in.

People are trying to create ad hoc processes to do that, and we need more of those kinds of things. It’s especially vital for technologies like robotics and automation, where they’re moving rapidly and so much of the innovation is happening out in the commercial sector.

I will say I’ve seen tremendous interest in the last several years—and not just concepts about human-machine teaming in physical ways and cognitive ways, but also people really thinking hard about, OK, what does it mean to be innovative? How do we find ways of increasing experimentation and war gaming and competition of ideas so that we’re meeting at the forefront of new operational concepts in relation to adversaries?

WATSON, CAN YOU HEAR ME?
IBM’s Watson for Cyber Security uses cognitive capabilities to improve cyber security investigations. Scharre notes that many of the developments in AI come from the private sector, which is often reluctant to work with the government. However, he also notes that many AI tools are open source and therefore publicly available. (Photo by John Mottern/Feature Photo Service for IBM)

Now the Army has the opportunity to take basically a cadre of leaders—junior and midgrade officers and NCOs who’ve been able to have that freedom to be innovative out in the field and have autonomy—and say, OK, we want you to take the sort of intellectual capital you had and the skill set of problem-solving and apply it to new problems: How will we fight a war against Russia? How will we project power in the Pacific? How will we respond to adversaries’ challenges in cyberspace and electronic warfare and other things?

The way those wars were fought, particularly in Afghanistan, where the geography and people are so dispersed, we gave a lot of autonomy to junior leaders, and brigades and divisions were in support of people at lower levels. That’s just incredibly good in terms of maturing our leaders in their critical thinking. One of the challenges the Army has going forward is, for people who grew up in that environment, how do you continue that in garrison? So you get the squad leader engaged in finding solutions. You can’t do those things from the headquarters.

Read the full article in the April-June issue of Army AL&T magazine.


ONLINE EXTRAS

“Why Poker Is a Big Deal for Artificial Intelligence,” MIT Technology Review, Jan. 23, 2017

“Perspectives on Research in Artificial Intelligence and Artificial General Intelligence Relevant to DoD,” MITRE Corp., January 2017

“US Air Force F/A-18 released 103 Perdix micro drones” video, Jan. 9, 2017

“Special Ops’ ‘Iron Man’ Suit on Track for 2018,” National Defense magazine, May 2016

“Directed-Energy Weapons: Promise and Prospects,” report for Center for a New American Security, April 7, 2015