Human Rights Watch’s New “Killer Robots” Report Calls for an International Ban on the Development, Production, and Use of Autonomous Weapons Systems

Human Rights Watch has released a new report (co-authored by the Harvard Law School Human Rights Clinic) on autonomous weapons systems that might emerge over the next several decades, titled “Losing Humanity: The Case Against Killer Robots.”  The report calls for a multilateral treaty that would preemptively ban “development, production, and use” of fully autonomous weapons by all states.  It would be hard to be more sweeping than the report’s language in calling for a comprehensive ban.  Here is the language from the recommendations, directed to states:

Prohibit the development, production, and use of fully autonomous weapons through an international legally binding instrument.

Adopt national laws and policies to prohibit the development, production, and use of fully autonomous weapons.

Commence reviews of technologies and components that could lead to fully autonomous weapons. These reviews should take place at the very beginning of the development process and continue throughout the development and testing phases.

It happens that Matthew Waxman and I have a policy essay appearing on this topic in the December-January issue of Policy Review, “Law and Ethics for Robot Soldiers” (the link goes to a special SSRN version with footnotes).  While of course sharing the concern that any new weapons system meet the requirements of the laws of war, our conclusions run in the opposite direction from Human Rights Watch’s.  Over at Lawfare, we discuss reasons why this kind of sweeping, prohibitory approach seems to us both wrong on substance and unworkable in practice.  It’s a complicated topic, and I imagine we’ll post some more detailed and specific critiques of the report, and discuss it at Lawfare, at the Opinio Juris international law blog, and here at Volokh. (I should add that while we take largely the opposite view, there are people calling for this kind of comprehensive treaty ban whose work and views we take very seriously – Noel Sharkey in particular.)

I’d add for my own part – this is not a point Matt and I have taken a view on together – that the report is nothing if not sweeping in the prohibitions it calls for, to be enforced by international treaty.  I find it confusing – confused – in the framing of its demands.  It calls for prohibitions on the development, production, and use of fully autonomous weapons systems, along with reviews of “technologies and components that could lead to fully autonomous weapons.” (Emphasis added.)  I don’t know quite how to square this with the report’s proposed “code of conduct” for roboticists:

To Roboticists and Others Involved in the Development of Robotic Weapons

Establish a professional code of conduct governing the research and development of autonomous robotic weapons, especially those capable of becoming fully autonomous, in order to ensure that legal and ethical concerns about their use in armed conflict are adequately considered at all stages of technological development.

I take it this is intended to address concerns about a point that Law and Ethics for Robot Soldiers puts at the center of discussion – the fact that all these technologies of automation develop incrementally, and there is not necessarily a single obvious moment when the technology should be deemed “autonomous.”  I suppose this means that roboticists are supposed to evaluate their work as they go, to determine the point – in keeping with the prohibition on “development” of “technologies or components” that “could lead” to fully autonomous weapons – at which they are supposed to put down their research as violating the treaty ban on development.  They are supposed to be conducting this review from the very beginning of their work.  Given that the legal standard imposed by treaty would say that any “development” of systems “capable of becoming fully autonomous” is illegal per se, including “components and technologies” that “could lead” to full autonomy, it is hard to understand what genuinely independent role this code of conduct is supposed to play.  At bottom, though, the whole approach of prejudging future technology, and doing so by a sweeping and preemptive treaty, is simply the wrong approach for providing genuinely needed legal and ethical guidance in this matter of weapons development.
