Our interview for the April – June Critical Thinking feature with Paul Scharre—a Ranger veteran who did tours in Iraq and Afghanistan and is now a robotics expert at the Center for a New American Security—covered more good material than we could fit in print, and we didn’t want to leave it on the cutting-room floor. Here’s Scharre on the challenges of automation and artificial intelligence (AI).
Army AL&T: If the Army’s main challenge is finding the right places to use robots, automation and AI … what are those places?
Scharre: The fundamental problem with automation is that it’s brittle. You can design really specific machine intelligence tools to solve very specific problems, like playing chess or playing Go. Poker is something AI has defeated humans at just recently—the latest in a series of things that people said machines can’t do that machines now do better than humans. But all those things are very narrow. So the poker AI that can beat any human in the world can’t go into a person’s house and make a pot of coffee, which we wouldn’t think of as a particularly complicated cognitive task, but it is. It just happens to be one that is easy for us because of the way our brains are wired.
But a lot of things in war depend upon context, and a machine just can’t do that today. So machines could identify things, like is this person carrying an AK-47, or does he have a shovel in his hand? We could probably do that with a machine better than a person. But trying to look at someone approaching you and saying, is this person hostile? What’s in their mind? What are their intentions? It’s hard for people, and it’s really hard for machines. Taking it to the next level—trying to intuit intent—machines can’t do that right now. Maybe that will change in the next five to 10 years. It’s hard to say, but they certainly can’t do it right now. The problem with self-driving cars [is], you put a self-driving car on the road, but you don’t control the environment and you don’t get to decide what situations it might encounter. You can drive it for hundreds of thousands of miles, and there might still be situations that you could never imagine it would find itself in.
A human in those situations can be flexible. You can give people broad guidance. You can tell people things like, we’re here to win hearts and minds. That’s kind of ambiguous: You can always use force to defend yourself, but we’re not here to kill people, we’re here to win over these villagers. People actually understand that. People can actually make sense of those things, even though it’s super ambiguous. You can’t program that into a machine, right?
That’s not to say that machines aren’t valuable. It’s just that we need to think very carefully about how we apply them and to what kinds of problems. They’re better in different ways and for different kinds of tasks. And the hard thing is that that line is shifting.
Read the full article at http://usaasc.armyalt.com/#folio=90.
This article was originally published in the April – June issue of Army AL&T magazine.
Subscribe to Army AL&T News, the premier online news source for the Acquisition, Logistics and Technology (AL&T) Workforce.