Basics Matter in Killer Robots Debate

121126-N-PL185-082

Last year, Human Rights Watch launched a campaign to ban the development of autonomous weapons systems, which it dysphemistically calls “killer robots.” The report and email campaign have several problems, which Greg McNeal lays out ably here — namely, that they are based entirely on an oversimplified campaign of fear and don’t take into account the reality of the technology, the current policies in place, or even basics like current human fallibility.

Indeed, there is a fascinating science fiction element to this discussion that bears exploration in a longer, separate post. The appeal to fiction and fear instead of facts has the potential to upend the debate about the laws of armed conflict and how weapons are used — surely the opposite of what they’d prefer.

Nonsense like this anti-robot video HRW produced is a key example. It is based on half-truths, outright non-truths, and leaps of logic that are just silly.

Noel Sharkey, who’s famous for his anti-drone rants on the BBC, is a professor of artificial intelligence and robotics at the University of Sheffield. Yet he is apparently unaware of the basics of active pattern recognition for understanding human behavior, measurement, and image-processing research: all very important factors in designing autonomous weapons.

Then the HRW senior researcher asserts that the X-47B, a Navy drone that can take off from and land on an aircraft carrier without human input (an incredibly difficult and complex task), will carry weapons. The U.S. Navy has said categorically, on the record, that it will not.

Lastly, almost as a coup de grâce, a Nobel laureate appeals to her success in getting international treaties banning land mines and cluster bombs, and says the world should duplicate that success for autonomous weapons. The thing is, the U.S. has signed neither the land mine treaty nor the cluster munitions treaty. Neither has Russia, China, or Israel. So what good are those treaties?

This video — and the campaign against autonomous robots — is based on a whole lot of false fears and misleading presentations of easily googleable facts. That is a real shame, because there are serious challenges to the development and use of these weapons systems that need to be debated. This campaign, however, prevents such a discussion.

Besides which, humans aren’t that great in warfare anyway. Even highly trained U.S. troops make mistakes and kill children. The police mistakenly shoot unarmed people with depressing regularity. Really: how much worse could machines, with strict rules and a factory’s precision, be?