Active international discussions are under way on the ethics and governance of Autonomous Weapon Systems (AWS): robots that can kill without direct human intervention or oversight. It is therefore imperative to critically examine the role and nature of the public engagement intended to inform decision makers.
The Martens Clause, included in the Additional Protocols of the Geneva Conventions, makes explicit room for the public to have a say on what is deemed permissible in matters of armed conflict, especially where new technologies are concerned. However, many measures of public opinion, gathered through methods such as surveys and polls, have been designed in ways that leave them open to potential biasing effects. For example, some consider only specific applications rather than the general features unique to the technology in question. In this paper, we survey studies conducted to gauge public opinion on the use of military drones (autonomous and remotely operated), including the recent international poll conducted by the Open Roboethics initiative (ORi).
Drawing on evidence from moral psychology, we highlight potential biasing effects that particular question framings can have on outcomes, and we outline considerations for designing public opinion measures and determining their applicability to questions of AWS governance. Such considerations can help public engagement efforts live up to the spirit of the Martens Clause.