Dozens of countries are holding a multilateral disarmament conference at the United Nations in Geneva today to discuss a new and disturbing threat to humanity. Military powers from across the world are developing technology that could lead to the creation of fully autonomous weapons—that is, weapons that would select targets and fire without “meaningful human control.” The diplomats in Geneva need to decide how to deal with these “killer robots” in international law before it is too late.
As technology moves toward ever greater autonomy, the loss of human control over weapons systems threatens to revolutionize the ways wars are fought. Giving machines the power to determine when to use lethal force could have an impact as great as, or greater than, the introduction of gunpowder or nuclear weapons. Artificial intelligence experts say the development of these weapons is only years—not decades—away.
The possibility that these “lethal autonomous weapons systems” could be added to the arsenals of modern militaries raises a host of legal, ethical and security problems. Almost all of these concerns directly relate to the loss of human control over the weapons’ use. To prevent these risks, countries should adopt an international treaty banning fully autonomous weapons.
A new report, released at the Geneva conference by Human Rights Watch and the Harvard Law School International Human Rights Clinic, explains the need for human control, documents the growing support for such a requirement, and examines legal precedent in treaty law for banning weapons that lack human control.
Ensuring that humans retain control over life-and-death decisions about using weapons, whether in war or for law enforcement, would avoid crossing a moral red line. Keeping human control over targeting would also promote compliance with international humanitarian law and international human rights law. It would increase the likelihood that a weapon’s use is “proportionate”—that is, that the harm to civilians doesn’t outweigh the military advantage. People can better deal with complex and unforeseeable scenarios because they can apply human judgment to situations a robot could not be pre-programmed to assess.
Mandating that a human determine when to fire on a specific target means that someone could be held accountable for an unlawful attack. That requirement would avoid the accountability gap the use of fully autonomous weapons could create and make those potentially responsible more likely to think before they act.
The countries are meeting this week under the auspices of an existing treaty, the Convention on Conventional Weapons, as part of a series of discussions that began three years ago. By the end of the first session, the concept of human control had already emerged as a central issue, with many speakers explicitly expressing support for requiring human control or for moving discussions of the term forward. Even those who did not use the term “control” seemed to recognize the importance of maintaining some kind of human involvement in the use of lethal force. The United States and Israel, for example, referred to “appropriate levels of human judgment,” a narrower but related term.
“There is broad agreement that human control over weapon systems and the use of force must be retained,” a Red Cross spokesperson said during the discussion. The gathering consensus suggests that it’s time to formalize a clear and unambiguous prohibition of weapons without human control.
There is no existing international law dedicated to the topic of fully autonomous weapons, but adopting a new prohibition on such weapons is feasible. International law has employed the concept of control in a wide range of areas. For example, it assigns liability for an unlawful act to a country or individual that has “effective control” of the perpetrator. International environmental law obligates states to “control” pollution.
Historically, disarmament law has found weapons that lack human control unacceptable and banned them. The Mine Ban Treaty prohibited antipersonnel landmines that lacked control because they were triggered by victims rather than a human operator. The Biological and Chemical Weapons Conventions were adopted in part to end the horror caused by weapons that were uncontrollable.
There is thus precedent for the assembled nations to follow as they determine how the Convention on Conventional Weapons could address lethal autonomous weapons systems. But after three years of informal meetings of experts, it is time for countries to move beyond talk to action.
There is an opportunity this year to do just that. The Convention on Conventional Weapons will hold its five-year review conference in December to set its agenda for future sessions. This week’s meeting can make recommendations that would influence the review conference’s agenda.
The recommendations should call for formalizing deliberations on lethal autonomous weapons by forming a “Group of Governmental Experts” to begin negotiations for a new legally binding protocol attached to the treaty to prohibit the development, production, and use of these weapons. The recommendations should also make clear that human control must be maintained over selecting and striking targets.
The international community has the chance to preempt a third revolution in warfare, one that could have disastrous effects. But to seize that opportunity, it must act now. Any delay could allow the specter of fully autonomous weapons to become a reality.
Bonnie Docherty is a senior researcher in the Arms Division of Human Rights Watch and a lecturer at Harvard Law School International Human Rights Clinic.