Stephen Goose, director of Human Rights Watch's Arms Division, was instrumental in bringing about the 2008 convention banning cluster munitions, the 1997 treaty banning antipersonnel mines, the 1995 protocol banning blinding lasers, and the 2003 protocol requiring clean-up of explosive remnants of war. He and Human Rights Watch co-founded the International Campaign to Ban Landmines (ICBL), which received the 1997 Nobel Peace Prize. Goose created the ICBL’s Landmine Monitor initiative, the first sustained, coordinated effort by non-governmental organizations around the world to monitor compliance with an international disarmament or humanitarian law treaty. In 2013, he and Human Rights Watch co-founded the Campaign to Stop Killer Robots. Before joining Human Rights Watch in 1993, Goose was a US congressional staffer and a researcher at the Center for Defense Information. He has a master's degree in International Relations from the Johns Hopkins School of Advanced International Studies and a B.A. in History from Vanderbilt University.

At the United Nations in Geneva, the Campaign to Stop Killer Robots called on governments not to allow the development of weapons systems that would select and attack targets without any human intervention.

© 2018 Mary Wareham

Opposition to the creation of so-called “killer robots” – weapons systems that could select and attack targets without any human intervention – is growing. The issue has been hotly debated since 2013, when the Campaign to Stop Killer Robots began calling for a preemptive ban on the development, production, and use of such technology. The group has repeatedly warned that these weapons could have devastating consequences for civilian populations around the world.

The only way to stop the development of fully autonomous weapons is through national laws and an international ban treaty. But the current diplomatic talks at the United Nations on this challenge are based on consensus – which allows just a few states, or even a single state, to block an agreement sought by a majority – and often result in lowest-common-denominator decision-making.

This is effectively what happened in Geneva last week at the sixth Convention on Conventional Weapons (CCW) meeting on lethal autonomous weapons systems. There was strong convergence among the 88 participating states on the need to retain some form of human control over weapons systems and the use of force. Many countries recommended a preemptive ban on the development and use of these weapons.

But a handful of states – namely Australia, Israel, Russia, South Korea, and the United States – strongly opposed any new treaty. Alarmingly, they instead suggested exploring the potential humanitarian “benefits” of developing and using lethal autonomous weapons systems.

Ultimately, the CCW meeting participants could only agree to recommend continuing their deliberations into next year. But the longer it takes states to negotiate a new international ban, the greater the chance that killer robots will become reality, and forever change the face of warfare. The world must not continue down this dangerous path.

Thank you, Mr. President.

Compliance by States Parties with the Convention on Cluster Munitions has been very impressive. Indeed, compliance with the core prohibitions has been perfect thus far. There have been no instances or even allegations of use, production, or transfer of cluster munitions by any State Party. The first stockpile destruction deadline was 1 August 2018, and every State Party with that deadline met it, some far in advance. In fact, most States Parties with upcoming deadlines have already completed destruction of their stocks. On this 10-year anniversary of the adoption and signing of the convention, we can say with great certainty that this is a convention that is working, and working well.

However, there are some compliance concerns, related to transparency and national laws.

Thirteen States Parties are late in providing their initial transparency reports; four of those reports were due in 2011. That is an 89% compliance rate, but it should be 100%. These reports are needed, among other reasons, to establish officially whether a country has stocks of cluster munitions and whether it is contaminated. Moreover, far too many States Parties are late in submitting their annual updated reports.

No State Party has enacted new implementation legislation since December 2016, and too few overall have enacted new laws or other national implementation measures. By our count, more than one-quarter of States Parties have yet to fulfill their Article 9 obligations.

In addition, we encourage all States Parties to elaborate their views on certain important issues related to interpretation and implementation of the convention, issues which are relevant to ensuring compliance. Of those States Parties that have commented on these matters, the vast majority have agreed with the following interpretations:

The Convention on Cluster Munitions prohibits (1) any intentional or deliberate assistance with activities banned by the convention, including during joint military operations with states not party; (2) any transit of cluster munitions by a state not party across the territory of a State Party; (3) any foreign stockpiling of cluster munitions by states not party in the territory of a State Party; and (4) any direct or indirect investment in producers of cluster munitions.

The convention is having a powerful impact even on nations that have not yet joined, as most are in de facto compliance with key provisions, such as no use, no production, no trade. An international norm rejecting any use of cluster munitions is clearly emerging.

Cluster Munition Monitor reports confirmed use in the past year in just two countries: in Syria, by Syrian government forces supported by Russia, and in Yemen, by the Saudi Arabia-led coalition.

We expect every State Party to firmly condemn any use of cluster munitions by any actor and to call for an immediate halt to such use. And please follow up bilaterally after your initial reactions.

In closing, let me reiterate that at the 10-year mark, we should all feel good about the Convention, about the record of compliance, and about the strength of the growing norm. But there is no room for complacency. These gains take constant care, for the long haul.

Thank you.

When we released Cluster Munition Monitor 2018 last Thursday, we highlighted the untarnished compliance record regarding the convention’s core prohibitions. One of the most visible examples is seen in stockpile destruction, where all of the first states with cluster munitions to ratify the convention destroyed their stocks within the convention’s eight-year deadline.

This achievement shows the world that States Parties take their obligations seriously and are committed to implementation. It also demonstrates to states considering joining the Convention on Cluster Munitions that the provisions are not overly burdensome or impossible to implement. 

Four States Parties completed destruction of their stockpiled cluster munitions during the previous year: Croatia, Cuba, Slovenia, and Spain. We warmly welcome this achievement. We encourage them to report in detail on the process and appreciate Croatia’s detailed PowerPoint presentation here. We did not hear from Cuba today and urge it to provide details on the exact quantity and types of cluster munitions and submunitions destroyed.

During 2017, seven States Parties—those four plus Peru, Slovakia, and Switzerland—destroyed a collective total of 33,551 cluster munitions and 1.7 million submunitions. This is the lowest number destroyed since the creation of the Convention on Cluster Munitions.

However, the reason for that is positive: the vast majority—99 percent—of the total reported global stocks of cluster munitions once held by States Parties have now been destroyed. As the Monitor reports, 33 States Parties have completed destruction of their stocks, collectively destroying 1.4 million cluster munitions containing more than 177 million submunitions.

State Party Slovakia destroyed a substantial number of cluster munitions over the past year and we hope to hear from Slovakia at this meeting. Switzerland is on track to complete the destruction of its cluster munition stocks by the end of this year. With the technical support of Norwegian People’s Aid, Botswana and Peru have made substantial progress over the past year to plan and prepare for the destruction of their stockpiled cluster munitions within the convention’s deadlines.

Yet, as always, it is not all good news. We therefore bring the following concerns to your attention for our collective follow-up:

  • Bulgaria has reported the possession of a substantial number of cluster munitions, but still has not started destroying them. There is just one year left until its 1 October 2019 stockpile destruction deadline. We appreciate the update provided today, but are disappointed to hear that Bulgaria is considering requesting an extension of that deadline.
  • Guinea-Bissau has indicated several times that it needs financial and technical assistance to destroy its stockpiled cluster munitions by its 1 May 2019 deadline. Yet it still has not disclosed the quantity and types of its stockpiled cluster munitions, as it is nearly seven years late in delivering its Article 7 transparency report. Guinea-Bissau last participated in a meeting of the convention in 2015.
  • There is evidence that Guinea imported cluster munitions in 2000, prior to joining the convention. Yet it still has not provided its transparency report for the convention, which was due in April 2015, so it is not possible to know whether it still has stockpiled cluster munitions to destroy.
  • Signatories that possess cluster munitions, such as Indonesia and Nigeria, appear to have taken few, if any, steps to ratify the convention or to declare and destroy their cluster munitions.
  • Cyprus is the last European Union member state to have signed, but not yet ratified, the convention. It has not disclosed any information on its cluster munition stocks, but we have learned that 3,760 mortar projectiles and 2,559 submunitions that it transferred in 2014 for the purposes of destruction still have not been destroyed.

Under the convention’s cooperative compliance measures, States Parties and others, including our campaign members, stand ready to help States Parties requiring assistance. It is clear that several now need help to overcome financial, technical, and other challenges that are preventing them from swiftly destroying their cluster munition stocks.

Before concluding, we would like to highlight that most States Parties have chosen not to retain any cluster munitions for training and research purposes. Nonetheless, 13 States Parties are retaining cluster munitions, including two of the newer States Parties, Bulgaria and Slovakia.

We are pleased to hear just now from the Netherlands that it intends to significantly reduce the number of cluster munitions it has retained for research and training, none of which it has consumed for those purposes since 2011. We were disappointed that Cameroon decided to retain all six of its stockpiled cluster munitions for research and training purposes.

Last year, Italy announced that it had destroyed the cluster munitions it initially retained for research and training purposes and would not replenish those stocks. Several States Parties retaining cluster munitions have significantly reduced the number retained since making their initial declarations, including Belgium, France, Germany, Switzerland, and Spain.

That suggests the initial amounts retained were likely too high, and it is still not clear whether current holdings constitute the “minimum number absolutely necessary” for the permitted purposes, as the convention requires.

We applaud the States Parties that have destroyed their cluster munition stocks and are not retaining any. It’s clear that most States Parties agree with the CMC that there is no compelling reason to retain live cluster munitions and explosive submunitions for the purposes of research and training.

Finally, the Cluster Munition Coalition supports both the guidelines on extension requests and the voluntary declaration of completion submitted to this meeting.

A BLU-61 submunition marked for destruction in-place in the Basra governorate of Iraq, March 2018. 

© 2018 UNMAS
 
(Geneva) – No state party to the 2008 treaty prohibiting cluster munitions has violated the core prohibitions on use, production, transfer, and stockpiling of these weapons, resulting in an untarnished compliance record, Human Rights Watch said today during the release of the Cluster Munition Monitor 2018 report.
 
Cluster Munition Monitor 2018 is the ninth annual report of the Cluster Munition Coalition (CMC), the global coalition of nongovernmental organizations co-founded and chaired by Human Rights Watch. The group works to ensure that all countries join and adhere to the 2008 treaty banning cluster munitions and requiring clearance and victim assistance. The report details how some non-signatories, particularly Israel, Russia, and the United States, hardened their defense of cluster munitions during the past year.
 
“Full compliance is essential to ensuring that the treaty banning cluster munitions prevents further human suffering from these widely discredited weapons,” said Mary Wareham, arms division advocacy director at Human Rights Watch and an editor of the report. “The treaty members are showing the holdouts that they have nothing to lose and everything to gain by renouncing cluster munitions and coming on board without delay.”

In the US, a November 30, 2017 Defense Department policy directive abandons a longstanding policy that would have barred the US, after 2018, from using cluster munitions that leave more than 1 percent of their submunitions as unexploded ordnance. Human Rights Watch has condemned the new policy for halting a long-planned move away from inaccurate cluster munitions. The US claims that cluster munitions have military utility, but, apart from a single attack in Yemen in 2009, it last used them during the 2003 invasion of Iraq. There is no evidence that the US or its coalition partners have used cluster munitions against the Islamic State (also known as ISIS) in Syria and Iraq.

Cluster munitions can be fired from the ground by artillery systems, rockets, and projectiles, or dropped from aircraft. They typically open in the air, dispersing multiple bomblets or submunitions over a wide area. Many submunitions fail to explode on initial impact, leaving dangerous duds that can maim and kill like landmines for years.

Currently, there are 103 states parties to the Convention on Cluster Munitions, while 17 countries have signed, but not yet ratified. There has been no new use, production, or transfer of cluster munitions by any state party since the convention was adopted on May 30, 2008. All states parties facing the first eight-year stockpile destruction deadline – August 1, 2018 – successfully destroyed their stocks in time, including Croatia, Slovenia, and Spain in the past year. Cuba, a new state party, also completed its stockpile destruction, while Switzerland is expected to announce completion imminently.

The destruction to date of a collective total of 1.4 million cluster munitions and more than 177 million submunitions means that 99 percent of the total reported global stocks held by states parties have now been destroyed. During 2017, seven countries destroyed a total of 33,551 cluster munitions and 1.7 million submunitions.

However, use of cluster munitions by Syrian government forces on anti-government-held areas of the country, which began in 2012, continued throughout 2017 and the first half of 2018. The number of recorded cluster munition attacks fell over the past year, in part due to the decreasing number of areas that remain outside the government’s control. In Yemen, far fewer cluster munition attacks were reported over the last year by the Saudi-led coalition that has conducted a military operation against Houthi forces since March 2015. That decrease follows strong public outcry, global media coverage, and widespread condemnation. There is evidence that cluster munitions may have been used in Egypt and Libya, but it has not been possible to independently confirm these allegations. None of these countries are party to the Convention on Cluster Munitions.

According to the Cluster Munition Monitor, there were 289 new cluster munition casualties in 2017; civilians accounted for 99 percent of the casualties whose status was recorded. The total included 187 casualties in Syria and 54 in Yemen, from both new attacks and explosive remnants. There were 32 new casualties in Laos, all from unexploded submunitions used by the US in the 1960s and 1970s. The number of new victims in 2017 is a sharp decrease from the 971 reported in 2016, but many casualties go unrecorded or lack sufficient documentation.

Since the publication of last year’s report, Sri Lanka has been the only country to ratify or accede to the convention, doing so on March 1, 2018.

In December 2017, for the third consecutive year, Russia joined Zimbabwe in voting against a United Nations General Assembly resolution promoting the convention, while 32 non-signatories voted for it. Russia has participated in a joint military operation with Syrian forces since September 30, 2015, in which cluster munitions have caused extensive civilian harm.

According to the Cluster Munition Monitor, 26 countries, including 12 states parties and two signatories, are contaminated by cluster munition remnants. Around the world, at least 153,000 submunitions were destroyed during 2017 in clearance operations. Under the convention, eight states parties have completed clearance of their contaminated land.

Most states parties have formally declared that they are not retaining any cluster munitions for training or research, as the treaty permits, though 12 treaty members retain some. Thirty have enacted national laws to carry out the convention, and another 20 are in the process of doing so.

“Several states parties still have significant work to do to clear contaminated areas, assist victims, report on their implementation, and ensure they have laws and other measures to punish any violations,” Wareham said. “Countries needing assistance should not hesitate to request help, as cooperative compliance is the bedrock of this treaty.”

Cluster Munition Monitor 2018 will be presented at the Eighth Meeting of States Parties to the Convention on Cluster Munitions, which opens at the United Nations in Geneva on September 3.

Thank you, Mr. Chair, and thank you for your work chairing this Group of Governmental Experts, including your consultations with civil society. I am speaking in my capacity as coordinator of the Campaign to Stop Killer Robots, the rapidly growing coalition of 76 non-governmental organizations in 32 countries working to preemptively ban weapons systems that, once activated, would select and attack targets without human intervention.

The serious legal, operational, moral, technical, proliferation, security, and other challenges raised by fully autonomous weapons have gained widespread attention since the first CCW meeting on the topic in May 2014. Yet states have still not agreed on the regulatory response these challenges demand.

It’s increasingly obvious that the public strongly objects to allowing machines to select targets and use force without any meaningful human control. Doing so would be abhorrent, immoral, and an affront to the concept of human dignity and the principles of humanity. It is high time for governments to heed the mounting calls for a new international law prohibiting killer robots and to start negotiating one.

The Campaign to Stop Killer Robots urges states at this sixth international meeting on lethal autonomous weapons systems to recommend a negotiating mandate for such a ban treaty. We hope that states heed the calls not only from us, but also from the African Group of states, the Non-Aligned Movement, Brazil, Austria, Chile, Colombia, Panama, and others to begin negotiations on a new treaty to retain human control over weapons systems or prohibit lethal autonomous weapons.

Momentum is building rapidly for states to start negotiating a legally binding instrument. Requests for more time to explore this challenge further may seem valid, but they increasingly sound like excuses aimed at delaying inevitable regulation.

Promises of greater transparency, codes of conduct, meek political declarations and more committees are insufficient to deal with the far-reaching consequences of creating fully autonomous weapons. Nothing less than a ban treaty will be needed to effectively constrain the development of autonomy in the critical functions of weapons systems and avoid dehumanizing the use of force.

Thank you, Mr. Chairman.

Human Rights Watch is a co-founder of the Campaign to Stop Killer Robots, and Mary Wareham of Human Rights Watch is the global coordinator of the Campaign.

We are pleased that the GGE has shifted its focus to options for the way forward on lethal autonomous weapons systems (LAWS). For five years, states have highlighted the host of problems with these weapons, including legal, moral, accountability, technical, and security concerns. It is time to move on and take action. As Brazil noted, the world is watching and there are high expectations for the CCW to produce a strong, concrete outcome.

Human Rights Watch supports the proposal for a mandate to begin negotiations in 2019 of a legally binding instrument to require meaningful human control over the critical functions of lethal autonomous weapons systems. Such a requirement is effectively the same as a prohibition on weapons that lack such control.

We were pleased to hear so many states—indeed, the vast majority—express support for a legally binding instrument prohibiting lethal autonomous weapons systems. We hope that High Contracting Parties set aside significant time in 2019 to fulfill that mandate—at least four weeks, so that the negotiations can be concluded within one year.

Several states have said the CCW’s discussions should focus on the compliance of lethal autonomous weapons systems with international law and particularly international humanitarian law. We agree that compliance with rules of proportionality and distinction is critical, and we question whether this technology could comply.

But another provision of international humanitarian law must also be considered. The Martens Clause—which appears in the Geneva Conventions, Additional Protocol I, and the preamble of the CCW—creates a legal obligation for states to consider moral implications when assessing new technology. The clause applies when there is no specific existing law on a topic, which is the case with lethal autonomous weapons systems, also called fully autonomous weapons.

The Martens Clause requires in particular that emerging technology comply with the principles of humanity and dictates of public conscience. As we have outlined in a new report distributed this week, fully autonomous weapons would fail this test on both counts.

The principles of humanity require humane treatment of others and respect for human life and dignity. Weapons that lack meaningful human control over the critical functions would be unable to comply with these principles.

Fully autonomous weapons would lack compassion, which motivates humans to minimize suffering and killing. They would also lack the legal and ethical judgment necessary to determine the best means for protecting civilians on a case-by-case basis in complex and unpredictable combat environments.

As inanimate machines, fully autonomous weapons could also not appreciate the value of human life and the significance of its loss. They would base life-and-death determinations on algorithms, objectifying their human targets—whether civilians or combatants. They would thus fail to respect human dignity.

The development of weapons without meaningful human control would also run counter to the dictates of public conscience. In national and regional group statements, a majority of states at CCW have called for the negotiation of a legally binding instrument on lethal autonomous weapons systems. Many have expressly called for a prohibition on the weapons. Virtually all states have stressed the need to maintain human control over the use of force. Collectively, these statements provide evidence that the public conscience favors human control and objects to fully autonomous weapons.

Experts and the general public have reached similar conclusions. As was discussed in yesterday’s side event sponsored by the Campaign to Stop Killer Robots, thousands of AI and robotics researchers along with companies and industry representatives have called for a ban on fully autonomous weapons. Traditional voices of conscience—faith leaders and Nobel Peace Laureates—have echoed those calls, expressing moral outrage at the prospect of losing human control over the use of force. Civil society and the ICRC have also emphasized that law and ethics require human control over the critical functions of a weapon.

In conclusion, the rules of law and morality demand the negotiation of a new legally binding instrument on fully autonomous weapons. An assessment of the technology under the Martens Clause shows there is a gap in international law that needs to be filled. Concerns related to the principles of humanity and dictates of public conscience show that the new instrument should ensure that meaningful human control over the use of force is maintained and the development, production, and use of fully autonomous weapons are prohibited.

Thank you.

Global launch of the Campaign to Stop Killer Robots in London on April 23, 2013.

© 2013 Campaign to Stop Killer Robots

The next revolution in warfare threatens to undermine fundamental principles of morality and law. Fully autonomous weapons, already under development in a number of countries, would have the power to select targets and fire on them without meaningful human control. In so doing, they would violate basic humanity and the public conscience.

International humanitarian law obliges countries to take these factors into account when evaluating new weapons. A longstanding provision known as the Martens Clause creates a legal duty to consider the moral implications of emerging technology. The Martens Clause states that when no existing treaty provision specifically applies, weapons should comply with the “principles of humanity” and the “dictates of public conscience.”

A new report from Human Rights Watch and Harvard Law School’s International Human Rights Clinic, of which I was the lead author, shows why fully autonomous weapons would fail both prongs of the test laid out in the Martens Clause. We conclude that the only adequate solution for dealing with these potential weapons is a preemptive ban on their development, production, and use.

More than 70 countries will convene at the United Nations in Geneva from August 27 to 31 to discuss what they refer to as lethal autonomous weapons systems. They will meet under the auspices of the Convention on Conventional Weapons, a major disarmament treaty. To avert a crisis of morality and a legal vacuum, countries should agree to start negotiating a treaty prohibiting these weapons in 2019.

With the rapid development of autonomous technology, the prospect of fully autonomous weapons is no longer a matter of science fiction. Experts have warned they could be fielded in years, not decades.

While opponents of fully autonomous weapons have highlighted a host of legal, accountability, security, and technological concerns, morality has been a dominant theme in international discussions since they began in 2013. At a UN Human Rights Council meeting that year, the UN expert on extrajudicial killing warned that humans should not delegate lethal decisions to machines that lack “morality and mortality.”

The inclusion of the Martens Clause in the Convention on Conventional Weapons underscores the need to consider morality in that forum, too.   

Fully autonomous weapons would violate the principles of humanity because they could not respect human life and dignity. They would lack both compassion, which serves as a check on killing, and human judgment, which allows people to assess unforeseen situations and make decisions about how best to protect civilians. Fully autonomous weapons would make life-and-death decisions based on algorithms that objectify their human targets without bringing to bear an understanding of the value of human life based on lived experience.

Fully autonomous weapons also run counter to the dictates of public conscience, which reflect an understanding of what is right and wrong. Traditional voices of conscience, including more than 160 religious leaders and more than 20 Nobel Peace Laureates, have publicly condemned fully autonomous weapons. Scientists, technology companies, and other experts by the thousands have joined the chorus of objectors. In July, a group of artificial intelligence researchers released a pledge, since signed by more than 3,000 people and 237 organisations, not to assist with the development of such weapons.

Governments have also expressed widespread concern about the prospect of losing control over the use of force. In April, for example, the African Group, one of the five regional groups at the United Nations, called for a preemptive ban on fully autonomous weapons, stating, “it is inhumane, abhorrent, repugnant, and against public conscience for humans to give up control to machines, allowing machines to decide who lives or dies.”

To date, 26 countries have endorsed the call for a prohibition on fully autonomous weapons. Dozens more have emphasised the need to maintain human control over the use of force. Their shared commitment to human control provides common ground on which to initiate negotiations for a new treaty on fully autonomous weapons.

Countries debating fully autonomous weapons at the United Nations next week must urgently heed the principles reflected in the Martens Clause. Countries should both reiterate their concerns about morality and law and act on them before this potential revolution in weaponry becomes a reality.

 

Summary

In cases not covered by this Protocol or by other international agreements, civilians and combatants remain under the protection and authority of the principles of international law derived from established custom, from the principles of humanity and from the dictates of public conscience.
— Martens Clause, as stated in Additional Protocol I of 1977 to the Geneva Conventions

Fully autonomous weapons are one of the most alarming military technologies under development today. As such, there is an urgent need for states, experts, and the general public to examine these weapons closely under the Martens Clause, the unique provision of international humanitarian law that establishes a baseline of protection for civilians and combatants when no specific treaty law on a topic exists. This report shows how fully autonomous weapons, which would be able to select and engage targets without meaningful human control, would contravene both prongs of the Martens Clause: the principles of humanity and the dictates of public conscience. To comply with the Martens Clause, states should adopt a preemptive ban on the weapons’ development, production, and use.

The rapid development of autonomous technology and artificial intelligence (AI) means that fully autonomous weapons could become a reality in the foreseeable future. Also known as “killer robots” and lethal autonomous weapons systems, they raise a host of moral, legal, accountability, operational, technical, and security concerns. These weapons have been the subject of international debate since 2013. In that year, the Campaign to Stop Killer Robots, a civil society coalition, was launched and began pushing states to discuss the weapons. After holding three informal meetings of experts, states parties to the Convention on Conventional Weapons (CCW) began formal talks on the topic in 2017. In August 2018, approximately 80 states will convene again for the next meeting of the CCW Group of Governmental Experts.

As CCW states parties assess fully autonomous weapons and the way forward, the Martens Clause should be a central element of the discussions. The clause, which is a common feature of international humanitarian law and disarmament treaties, declares that in the absence of an international agreement, established custom, the principles of humanity, and the dictates of public conscience should provide protection for civilians and combatants. The clause applies to fully autonomous weapons because they are not specifically addressed by international law. Experts differ on the precise legal significance of the Martens Clause, that is, whether it reiterates customary law, amounts to an independent source of law, or serves as an interpretive tool. At a minimum, however, the Martens Clause provides key factors for states to consider as they evaluate emerging weapons technology, including fully autonomous weapons. It creates a moral standard against which to judge these weapons.

The Principles of Humanity

Due to their lack of emotion and legal and ethical judgment, fully autonomous weapons would face significant obstacles in complying with the principles of humanity. Those principles require the humane treatment of others and respect for human life and human dignity. Humans are motivated to treat each other humanely because they feel compassion and empathy for their fellow humans. Legal and ethical judgment gives people the means to minimize harm; it enables them to make considered decisions based on an understanding of a particular context. As machines, fully autonomous weapons would not be sentient beings capable of feeling compassion. Rather than exercising judgment, such weapons systems would base their actions on pre-programmed algorithms, which do not work well in complex and unpredictable situations.

Showing respect for human life entails minimizing killing. Legal and ethical judgment helps humans weigh different factors to prevent arbitrary and unjustified loss of life in armed conflict and beyond. It would be difficult to recreate such judgment, developed over both human history and an individual life, in fully autonomous weapons, and they could not be pre-programmed to deal with every possible scenario in accordance with accepted legal and ethical norms. Furthermore, most humans possess an innate resistance to killing that is based on their understanding of the impact of loss of life, which fully autonomous weapons, as inanimate machines, could not share.

Even if fully autonomous weapons could adequately protect human life, they would be incapable of respecting human dignity. Unlike humans, these robots would be unable to appreciate fully the value of a human life and the significance of its loss. They would make life-and-death decisions based on algorithms, reducing their human targets to objects. Fully autonomous weapons would thus violate the principles of humanity on all fronts.

The Dictates of Public Conscience

Increasing outrage at the prospect of fully autonomous weapons suggests that this new technology also runs counter to the second prong of the Martens Clause, the dictates of public conscience. These dictates consist of moral guidelines based on a knowledge of what is right and wrong. They can be ascertained through the opinions of the public and of governments.

Many individuals, experts, and governments have objected strongly to the development of fully autonomous weapons. The majority of respondents in multiple public opinion surveys have registered opposition to these weapons. Experts, who have considered the issue in more depth, have issued open letters and statements that reflect conscience even better than surveys do. International organizations and nongovernmental organizations (NGOs), along with leaders in disarmament and human rights, peace and religion, science and technology, and industry, have felt compelled, particularly on moral grounds, to call for a ban on fully autonomous weapons. They have condemned these weapons as “unconscionable,” “abhorrent … to the sacredness of life,” “unwise,” and “unethical.”

Governments have cited compliance with the Martens Clause and moral shortcomings among their major concerns with fully autonomous weapons. As of July 2018, 26 states supported a preemptive ban, and more than 100 states had called for a legally binding instrument to address concerns raised by lethal autonomous weapons systems. Almost every CCW state party that spoke at their last meeting in April 2018 stressed the need to maintain human control over the use of force. The emerging consensus for preserving meaningful human control, which is effectively equivalent to a ban on weapons that lack such control, shows that the public conscience is strongly against fully autonomous weapons.

The Need for a Preemptive Ban Treaty

An assessment of fully autonomous weapons under the Martens Clause underscores the need for new law that is both specific and strong. Regulations that allowed for the existence of fully autonomous weapons would not suffice. For example, limiting use to certain locations would neither prevent the risk of proliferation to actors with little regard for humane treatment or human life, nor ensure respect for the dignity of civilians or combatants. Furthermore, the public conscience reveals widespread support for a ban on fully autonomous weapons, or its equivalent, a requirement for meaningful human control. To ensure compliance with both the principles of humanity and the dictates of public conscience, states should therefore preemptively prohibit the development, production, and use of fully autonomous weapons.

 

Recommendations

To avert the legal, moral, and other risks posed by fully autonomous weapons and the loss of meaningful human control over the selection and engagement of targets, Human Rights Watch and Harvard Law School’s International Human Rights Clinic (IHRC) recommend:

To CCW states parties

  • Adopt, at their annual meeting in November 2018, a mandate to negotiate a new protocol prohibiting fully autonomous weapons systems, or lethal autonomous weapons systems, with a view to concluding negotiations by the end of 2019.
  • Use the intervening Group of Governmental Experts meeting in August 2018 to present clear national positions and to reach agreement on the need to adopt a negotiating mandate at the November annual meeting.
  • Develop national positions and adopt national prohibitions as key building blocks for an international ban.
  • Express opposition to fully autonomous weapons, including on the legal and moral grounds reflected in the Martens Clause, in order to further develop the existing public conscience.

To experts in the private sector

  • Oppose the removal of meaningful human control from weapons systems and the use of force.
  • Publicly express explicit support for the call to ban fully autonomous weapons, including on the legal and moral grounds reflected in the Martens Clause, and urge governments to start negotiating new international law.
  • Commit, through codes of conduct, statements of principles, and other measures, not to design or develop AI for use in fully autonomous weapons, thereby ensuring that the private sector does not advance the development, production, or use of such weapons.

 

I. Background on Fully Autonomous Weapons

Fully autonomous weapons would be able to select and engage targets without meaningful human control. They represent an unacceptable step beyond existing armed drones because a human would not make the final decision about the use of force in individual attacks. Fully autonomous weapons, also known as lethal autonomous weapons systems and “killer robots,” do not exist yet, but they are under development, and military investments in autonomous technology are increasing at an alarming rate.

The risks of fully autonomous weapons outweigh their purported benefits. Proponents highlight that the new technology could save the lives of soldiers, process data and operate at greater speeds than traditional systems, and be immune to fear and anger, which can lead to civilian casualties. Fully autonomous weapons, however, raise a host of serious concerns, many of which Human Rights Watch has highlighted in previous publications. First, delegating life-and-death decisions to machines crosses a moral red line. Second, fully autonomous weapons would face significant challenges complying with international humanitarian and human rights law. Third, they would create an accountability gap because it would be difficult to hold anyone responsible for the unforeseen harm caused by an autonomous robot. Fourth, fully autonomous weapons would be vulnerable to spoofing and hacking. Fifth, these weapons would threaten global security because they could lead to an arms race, proliferate to actors with little respect for international law, and lower the threshold to war.[1]

This report focuses on yet another concern, which straddles law and morality—that is, the likelihood that fully autonomous weapons would contravene the Martens Clause. This provision of international humanitarian law requires states to take into account the principles of humanity and dictates of public conscience when examining emerging weapons technology. A common feature in the Geneva Conventions and disarmament treaties, the clause represents a legal obligation on states to consider moral issues.

The plethora of problems presented by fully autonomous weapons, including those under the Martens Clause, demand urgent action. A handful of states have proposed a wait-and-see approach, given that it is unclear what technology will be able to achieve. The high stakes involved, however, point to the need for a precautionary approach. Scientific uncertainty should not stand in the way of action to prevent what some scientists have referred to as the “third revolution in warfare, after gunpowder and nuclear arms.”[2] Countries should adopt a preemptive ban on the development, production, and use of fully autonomous weapons.

 

II. History of the Martens Clause

While the Martens Clause originated in a diplomatic compromise, it has served humanitarian ends. It states that in the absence of specific treaty law, established custom, the principles of humanity, and the dictates of public conscience provide protection for civilians and combatants. Since its introduction, the Martens Clause has become a common feature of the core instruments of international humanitarian law. The clause also appears in numerous disarmament treaties. The protections the Martens Clause provides and the legal recognition it has received highlight its value for examining emerging weapons systems that could cause humanitarian harm on the battlefield and beyond.

Origins of the Martens Clause

The Martens Clause first appeared in the preamble of the 1899 Hague Convention II containing the Regulations on the Laws and Customs of War on Land. In that iteration, the Martens Clause reads:

Until a more complete code of the laws of war is issued, the High Contracting Parties think it right to declare that in cases not included in the Regulations adopted by them, populations and belligerents remain under the protection and empire of the principles of international law, as they result from the usages established between civilized nations, from the laws of humanity, and the requirements of the public conscience.[3]

The clause thus provides a baseline level of protection to civilians and combatants when specific law does not exist.

Russian diplomat and jurist Fyodor Fyodorovich Martens proposed the Martens Clause as a way to break a negotiating stalemate at the 1899 Hague Peace Conference, which had been convened to adopt rules restraining war, reduce arms spending, and promote peace.[4] The great powers and lesser powers disagreed about how much authority occupying forces could exercise over the local population. The great powers insisted on a new treaty clarifying the rights and obligations of occupying forces, while the lesser powers opposed codifying provisions of an earlier political declaration that they believed did not adequately protect civilians. The Martens Clause gave fighters against foreign occupation the option of arguing that, even if specific provisions of the treaty did not cover them, they were entitled at least to the protection offered by principles of international law derived from custom, “the laws of humanity,” and “the requirements of the public conscience.”[5]

Modern Use of the Martens Clause

In the nearly 120 years since the adoption of the 1899 Hague Convention, the Martens Clause has been applied more broadly and become a staple of efforts to extend humanitarian protections during armed conflict. Seeking to reduce the impact of hostilities, numerous instruments of international humanitarian law and disarmament law have incorporated the provision.

Geneva Conventions and Additional Protocol I

When drafting the 1949 Geneva Conventions, the cornerstones of international humanitarian law,[6] negotiators wanted to ensure that certain protections would continue if a state party decided to withdraw from any of the treaties. The four Geneva Conventions contain the Martens Clause in their articles on denunciation, which address the implications of a state’s withdrawal from the treaties.[7] In its authoritative commentary on the conventions, the International Committee of the Red Cross (ICRC), the arbiter of international humanitarian law, explains:

[I]f a High Contracting Party were to denounce one of the Geneva Conventions, it would continue to be bound not only by other treaties to which it remains a Party, but also by other rules of international law, such as customary law. An argumentum e contrario, suggesting a legal void following the denunciation of a Convention, is therefore impossible.[8]

Additional Protocol I, which was adopted in 1977, expands the protections afforded to civilians by the Fourth Geneva Convention.[9]  The protocol contains the modern iteration of the Martens Clause, and the version used in this report:

In cases not covered by this Protocol or by other international agreements, civilians and combatants remain under the protection and authority of the principles of international law derived from established custom, from the principles of humanity and from the dictates of public conscience.[10]

By incorporating this language in its article on “General Principles and Scope of Application,” rather than confining it to a provision on denunciation, Additional Protocol I extends the application of the Martens Clause. According to the ICRC commentary:

There were two reasons why it was considered useful to include this clause yet again in the Protocol. First ... it is not possible for any codification to be complete at any given moment; thus, the Martens clause prevents the assumption that anything which is not explicitly prohibited by the relevant treaties is therefore permitted. Secondly, it should be seen as a dynamic factor proclaiming the applicability of the principles mentioned regardless of subsequent developments of types of situation or technology.[11]

The Martens Clause thus covers gaps in existing law and promotes civilian protection in the face of new situations or technology.

Disarmament Treaties

Since 1925, most treaties containing prohibitions on weapons have also included the Martens Clause.[12] The clause is referenced in various forms in the preambles of the 1925 Geneva Gas Protocol,[13] 1972 Biological Weapons Convention,[14] 1980 Convention on Conventional Weapons,[15] 1997 Mine Ban Treaty,[16] 2008 Convention on Cluster Munitions,[17] and 2017 Treaty on the Prohibition of Nuclear Weapons.[18] Although a preamble does not establish binding rules, it can inform interpretation of a treaty and is typically used to incorporate, by reference, the context of already existing law. The inclusion of the Martens Clause indicates that if a treaty’s operative provisions present gaps, they should be filled by established custom, the principles of humanity, and the dictates of public conscience. By incorporating the Martens Clause into this line of disarmament treaties, states have reaffirmed its importance to international humanitarian law generally and weapons law specifically.

The widespread use of the Martens Clause makes it relevant to the current discussions of fully autonomous weapons. The clause provides a standard for ensuring that civilians and combatants receive at least minimum protections from such problematic weapons. In addition, most of the diplomatic discussions of fully autonomous weapons have taken place under the auspices of the CCW, which includes the Martens Clause in its preamble. Therefore, an evaluation of fully autonomous weapons under the Martens Clause should play a key role in the deliberations about a new CCW protocol.

 

III. Applicability and Significance of the Martens Clause

The Martens Clause applies in the absence of specific law on a topic. Experts disagree on its legal significance, but at a minimum, it provides factors that states must consider when examining new challenges raised by emerging technologies. Its importance to disarmament law in particular is evident in the negotiations that led to the adoption of a preemptive ban on blinding lasers. States and others should therefore take the clause into account when discussing the legality of fully autonomous weapons and how best to address them.

Applicability of the Martens Clause

The Martens Clause, as set out in Additional Protocol I, applies “[i]n cases not covered” by the protocol or by other international agreements.[19] No matter how careful they are, treaty drafters cannot foresee and encompass all circumstances in one instrument. The Martens Clause serves as a stopgap measure to ensure that an unanticipated situation or emerging technology does not subvert the overall purpose of humanitarian law merely because no existing treaty provision explicitly covers it.[20]

The Martens Clause is triggered when existing treaty law does not specifically address a certain circumstance. As the US Military Tribunal at Nuremberg explained, the clause makes “the usages established among civilized nations, the laws of humanity and the dictates of public conscience into the legal yardstick to be applied if and when the specific provisions of [existing law] do not cover specific cases occurring in warfare.”[21] It is particularly relevant to new technology that drafters of existing law may not have predicted. Emphasizing that the clause’s “continuing existence and applicability is not to be doubted,” the International Court of Justice highlighted that it has “proved to be an effective means of addressing the rapid evolution of military technology.”[22] Given that there is often a dearth of law in this area, the Martens Clause provides a standard for emerging weapons.

As a rapidly developing form of technology, fully autonomous weapons exemplify an appropriate subject for the Martens Clause. Existing international humanitarian law applies to fully autonomous weapons only in general terms. It requires that all weapons comply with the core principles of distinction and proportionality, but it does not contain specific rules for dealing with fully autonomous weapons.[23] Drafters of the Geneva Conventions could not have envisioned the prospect of a robot that could make independent determinations about when to use force without meaningful human control. Given that fully autonomous weapons present a case not covered by existing law, they should be evaluated under the principles articulated in the Martens Clause.

Legal Significance of the Martens Clause

Interpretations of the legal significance of the Martens Clause vary.[24] Some experts adopt a narrow perspective, asserting that the Martens Clause serves merely as a reminder that if a treaty does not expressly prohibit a specific action, the action is not automatically permitted. In other words, states should refer to customary international law when treaty law is silent on a specific issue.[25] This view is arguably unsatisfactory, however, because it addresses only one aspect of the clause—established custom—and fails to account for the role of the principles of humanity and the dictates of public conscience. Under well-accepted rules of legal interpretation, a clause should be read to give each of its terms meaning.[26] Treating the principles of humanity and the dictates of public conscience as simply elements of established custom would make them redundant and violate this rule.

Others argue that the Martens Clause is itself a unique source of law.[27] They contend that the plain language of the Martens Clause elevates the principles of humanity and the dictates of public conscience to independent legal standards against which to judge unanticipated situations and emerging forms of military technology.[28] On this basis, a situation or weapon that conflicts with either standard is per se unlawful.

Public international law jurist Antonio Cassese adopted a middle approach, treating the principles of humanity and dictates of public conscience as “fundamental guidance” for the interpretation of international law.[29] Cassese wrote that “[i]n case of doubt, international rules, in particular rules belonging to humanitarian law, must be construed so as to be consonant with general standards of humanity and the demands of public conscience.”[30] International law should, therefore, be understood not to condone situations or technologies that raise concerns under these prongs of the Martens Clause.

At a minimum, the Martens Clause provides factors for states to consider as they approach emerging weapons technology, including fully autonomous weapons. In 2018, the ICRC acknowledged the “debate over whether the Martens Clause constitutes a legally-binding yardstick against which the lawfulness of a weapon must be measured, or rather an ethical guideline.”[31] It concluded, however, that “it is clear that considerations of humanity and public conscience have driven the evolution of international law on weapons, and these notions have triggered the negotiation of specific treaties to prohibit or limit certain weapons.”[32] If concerns about a weapon arise under the principles of humanity or dictates of public conscience, adopting new, more specific law that eliminates doubts about the legality of a weapon can increase protections for civilians and combatants.

The Martens Clause also makes moral considerations legally relevant. It is codified in international treaties, yet it requires evaluating a situation or technology according to the principles of humanity and dictates of public conscience, both of which incorporate elements of morality. Peter Asaro, a philosopher of science and technology, writes that the Martens Clause invites “moral reflection on the role of the principles of humanity and the dictates of public conscience in articulating and establishing new [international humanitarian law].”[33] While a moral assessment of fully autonomous weapons is important in its own right, the Martens Clause also makes it a legal requirement in the absence of specific law.

Precedent of the Preemptive Ban on Blinding Lasers

States, international organizations, and civil society have invoked the Martens Clause in previous deliberations about unregulated, emerging technology.[34] They found it especially applicable to discussions of blinding lasers in the 1990s. These groups explicitly and implicitly referred to elements of the Martens Clause as justification for preemptively banning blinding lasers. CCW Protocol IV, adopted in 1995, codifies the prohibition.[35]

During a roundtable convened by the ICRC in 1991, experts highlighted the relevance of the Martens Clause. ICRC lawyer Louise Doswald-Beck argued that “[d]ecisions to impose specific restrictions on the use of certain weapon[s] may be based on policy considerations,” and “that the criteria enshrined in the Martens clause [should] be particularly taken into account.”[36] Another participant said that “the Martens clause particularly addresses the problem of human suffering so that the ‘public conscience’ refers to what is seen as inhumane or socially unacceptable.”[37]

Critics of blinding lasers spoke in terms that demonstrated the weapons raised concerns under the principles of humanity and dictates of public conscience. Several speakers at the ICRC-convened meetings concurred that “weapons designed to blind are … socially unacceptable.”[38] The ICRC itself “appealed to the ‘conscience of humanity’” in advocating for a prohibition.[39] At the CCW’s First Review Conference, representatives of UN agencies and civil society described blinding lasers as “inhumane,”[40] “abhorrent to the conscience of humanity,”[41] and “unacceptable in the modern world.”[42] A particularly effective ICRC public awareness campaign used photographs of soldiers blinded by poison gas during World War I to emphasize that permanently blinding soldiers is cruel and inhumane.

Such characterizations of blinding lasers were linked to the need for a preemptive ban. For example, during debate at the First Review Conference, Chile expressed its hope that the body “would be able to establish guidelines for preventative action to prohibit the development of inhumane technologies and thereby to avoid the need to remedy the misery they might cause.”[43] In a December 1995 resolution urging states to ratify Protocol IV, the European Parliament declared that “deliberate blinding as a method of warfare is abhorrent.”[44] Using the language of the Martens Clause, the European Parliament stated that “deliberate blinding as a method of warfare is … in contravention of established custom, the principles of humanity and the dictates of the public conscience.”[45] The ICRC welcomed Protocol IV as a “victory of civilization over barbarity.”[46]

The discussions surrounding CCW Protocol IV underscore the relevance of the Martens Clause to the current debate about fully autonomous weapons. They show that CCW states parties have a history of applying the Martens Clause to controversial weapons. They also demonstrate the willingness of these states to preemptively ban a weapon that they find counter to the principles of humanity and dictates of public conscience. As will be discussed in more depth below, fully autonomous weapons raise significant concerns under the Martens Clause. The fact that their impact on armed conflict would be exponentially greater than that of blinding lasers should only increase the urgency of filling the gap in international law and explicitly banning them.[47]

 

IV. The Principles of Humanity

The Martens Clause divides the principles of international law into established custom, the principles of humanity, and the dictates of public conscience. Given that customary law is applicable even without the clause, this report assesses fully autonomous weapons under the latter two elements. The Martens Clause does not define these terms, but they have been the subject of much scholarly and legal discussion.

The relevant literature illuminates two key components of the principles of humanity. Actors are required: (1) to treat others humanely, and (2) to show respect for human life and dignity. Due to their lack of emotion and judgment, fully autonomous weapons would face significant difficulties in complying with either.

Humane Treatment

Definition

The first principle of humanity requires the humane treatment of others. The Oxford Dictionary defines “humanity” as “the quality of being humane; benevolence.”[48] The obligation to treat others humanely is a key component of international humanitarian law and international human rights law.[49] It appears, for example, in common Article 3 and other provisions of the Geneva Conventions, numerous military manuals, international case law, and the International Covenant on Civil and Political Rights.[50] Going beyond these sources, the Martens Clause establishes that human beings must be treated humanely, even when specific law does not exist.[51]

In order to treat other human beings humanely, one must exercise compassion and make legal and ethical judgments.[52] Compassion, according to the ICRC’s fundamental principles, is the “stirring of the soul which makes one responsive to the distress of others.”[53] To show compassion, an actor must be able to experience empathy—that is, to understand and share the feelings of another—and be compelled to act in response.[54] This emotional capacity is vital in situations when determinations about the use of force are made.[55] It drives actors to make conscious efforts to minimize the physical or psychological harm they inflict on human beings. Acting with compassion builds on the premise that “capture is preferable to wounding an enemy, and wounding him better than killing him; that non-combatants shall be spared as far as possible; that wounds inflicted be [as] light as possible, so that the injured can be treated and cured; and that the wounds cause the least possible pain.”[56]

While compassion provides a motivation to act humanely, legal and ethical judgment provides a means to do so. To act humanely, an actor must make considered decisions as to how to minimize harm.[57] Such decisions are based on the ability to perceive and understand one’s environment and to apply “common sense and world knowledge” to a specific circumstance.[58] Philosophy professor James Moor notes that actors must possess the capacity to “identify and process ethical information about a variety of situations and make sensitive determinations about what should be done in those situations.”[59] In this way, legal and ethical judgment helps an actor weigh relevant factors to ensure treatment meets the standards demanded by compassion. Judgment is vital to minimizing suffering: one can only refrain from harming humans if one both recognizes the possible harms and knows how to respond.[60]

Application to Fully Autonomous Weapons

Fully autonomous weapons would face significant challenges in complying with the principle of humane treatment because compassion and legal and ethical judgment are human characteristics. Empathy, and the compassion for others that it engenders, come naturally to human beings. Most humans have experienced physical or psychological pain, which drives them to avoid inflicting unnecessary suffering on others. Their feelings transcend national and other divides. As the ICRC notes, “feelings and gestures of solidarity, compassion, and selflessness are to be found in all cultures.”[61] People’s shared understanding of pain and suffering leads them to show compassion towards fellow human beings and inspires reciprocity that is, in the words of the ICRC, “perfectly natural.”[62]

Regardless of the sophistication of a fully autonomous weapon, it could not experience emotions.[63] There are some advantages associated with being impervious to emotions such as anger and fear, but a robot’s inability to feel empathy and compassion would severely limit its ability to treat others humanely. Because they would not be sentient beings, fully autonomous weapons could not know physical or psychological suffering. As a result, they would lack the shared experiences and understandings that cause humans to relate empathetically to the pain of others, have their “souls stirred,” and be driven to exercise compassion towards other human beings. Amanda Sharkey, a professor of computer science, has written that “current robots, lacking living bodies, cannot feel pain, or even care about themselves, let alone extend that concern to others. How can they empathize with a human’s pain or distress if they are unable to experience either emotion?”[64] Fully autonomous weapons would therefore face considerable difficulties in guaranteeing their acts are humane and in compliance with the principles of humanity.

Robots would also not possess the legal and ethical judgment necessary to minimize harm on a case-by-case basis.[65] Situations involving use of force, particularly in armed conflict, are often complex and unpredictable and can change quickly. Fully autonomous weapons would therefore encounter significant obstacles to making appropriate decisions regarding humane treatment. After examining numerous studies in which researchers attempted to program ethics into robots, Sharkey found that robots exhibiting behavior that could be described as “ethical” or “minimally ethical” could operate only in constrained environments. Sharkey concluded that robots have limited moral capabilities and therefore should not be used in circumstances that “demand moral competence and an understanding of the surrounding social situation.”[66] Complying with international law frequently requires subjective decision-making in complex situations. Fully autonomous weapons would have limited ability to interpret the nuances of human behavior, understand the political, socioeconomic, and environmental dynamics of the situation, and comprehend the humanitarian risks of the use of force in a particular context.[67] These limitations would compromise the weapons’ ability to ensure the humane treatment of civilians and combatants and comply with the first principle of humanity.

Respect for Human Life and Dignity

Definition

A second principle of humanity requires actors to respect both human life and human dignity. Christof Heyns, former special rapporteur on extrajudicial, summary or arbitrary executions, highlighted these related but distinct concepts when he posed two questions regarding fully autonomous weapons: “[C]an [they] do or enable proper targeting?” and “Even if they can do proper targeting, should machines hold the power of life and death over humans?”[68] The first considers whether a weapon can comply with international law’s rules on protecting life. The second addresses the “manner of targeting” and whether it respects human dignity.[69]

In order to respect human life, actors must take steps to minimize killing.[70] The right to life provides that “[n]o one shall be arbitrarily deprived of his life.”[71] It limits the use of lethal force to circumstances in which it is absolutely necessary to protect human life, constitutes a last resort, and is applied in a manner proportionate to the threat.[72] Codified in Article 6 of the International Covenant on Civil and Political Rights, the right to life has been recognized as the “supreme right” of international human rights law, which applies under all circumstances. During times of armed conflict, international humanitarian law determines what constitutes arbitrary or unjustified deprivation of life. It requires that actors comply with the rules of distinction, proportionality, and military necessity.[73]

Judgment and emotion promote respect for life because they can serve as checks on killing. The ability to make legal and ethical judgments can help an actor determine which course of action will best protect human life in the infinite number of potential unforeseen scenarios. An instinctive resistance to killing provides a psychological motivation to comply with, and sometimes go beyond, the rules of international law in order to minimize casualties.

Under the principles of humanity, actors must also respect the dignity of all human beings. This obligation is premised on the recognition that every human being has inherent worth that is both universal and inviolable.[74] Numerous international instruments—including the Universal Declaration of Human Rights, the International Covenant on Civil and Political Rights, the Vienna Declaration and Programme of Action adopted at the 1993 World Human Rights Conference, and regional treaties—enshrine the importance of dignity as a foundational principle of human rights law.[75] The African Charter on Human and Peoples’ Rights explicitly states that individuals have “the right to the respect of the dignity inherent in a human being.”[76]

While respect for human life involves minimizing the number of deaths and avoiding arbitrary or unjustified ones, respect for human dignity requires an appreciation of the gravity of a decision to kill.[77] The ICRC explained that it matters “not just if a person is killed or injured but how they are killed or injured, including the process by which these decisions are made.”[78] Before taking a life, an actor must truly understand the value of a human life and the significance of its loss. Humans should be recognized as unique individuals and not reduced to objects with merely instrumental or no value.[79] If an actor kills without taking into account the worth of the individual victim, the killing undermines the fundamental notion of human dignity and violates this principle of humanity.

Application to Fully Autonomous Weapons

It is highly unlikely that fully autonomous weapons would be able to respect human life and dignity. Their lack of legal and ethical judgment would interfere with their capacity to respect human life. For example, international humanitarian law’s proportionality test requires commanders to determine whether anticipated military advantage outweighs expected civilian harm on a case-by-case basis. Given the infinite number of contingencies that may arise on the battlefield, fully autonomous weapons could not be preprogrammed to make such determinations. The generally accepted standard for assessing proportionality is whether a “reasonable military commander” would have launched a particular attack,[80] and reasonableness requires making decisions based on ethical as well as legal considerations.[81] Unable to apply this standard to the proportionality balancing test, fully autonomous weapons would likely endanger civilians and potentially violate international humanitarian law.[82]

Fully autonomous weapons would also lack the instinctual human resistance to killing that can protect human life beyond the minimum requirements of the law.[83] An inclination to avoid killing comes naturally to most people because they have an innate appreciation for the inherent value of human life. Empirical research demonstrates the reluctance of human beings to take the lives of other humans. For example, a retired US Army Ranger who conducted extensive research on killing during armed conflict found that “there is within man an intense resistance to killing their fellow man. A resistance so strong that, in many circumstances, soldiers on the battlefield will die before they can overcome it.”[84] As inanimate objects, fully autonomous weapons could not lose their own life or understand the emotions associated with the loss of the life of a loved one. It is doubtful that a programmer could replicate in a robot a human’s natural inclination to avoid killing and to protect life with the complexity and nuance that would mirror human decision-making.

Fully autonomous weapons could not respect human dignity, which relates to the process behind, rather than the consequences of, the use of force.[85] As machines, they could truly comprehend neither the value of individual life nor the significance of its loss. They would base decisions to kill on algorithms without considering the humanity of a specific victim.[86] Moreover, these weapons would be programmed in advance of a scenario and could not account for the necessity of lethal force in a specific situation. In a CCW presentation as special rapporteur, Christof Heyns explained that:

to allow machines to determine when and where to use force against humans is to reduce those humans to objects; they are treated as mere targets. They become zeros and ones in the digital scopes of weapons which are programmed in advance to release force without the ability to consider whether there is no other way out, without a sufficient level of deliberate human choice about the matter.[87]

Mines Action Canada similarly concluded that “[d]eploying [fully autonomous weapons] in combat displays the belief that any human targeted in this way does not warrant the consideration of a live operator, thereby robbing that human life of its right to dignity.”[88] Allowing a robot to take a life when it cannot understand the inherent worth of that life or the necessity of taking it disrespects and demeans the person whose life is taken. It is thus irreconcilable with the principles of humanity enshrined in the Martens Clause.

When used in appropriate situations, AI has the potential to provide extraordinary benefits to humankind. Allowing robots to make determinations to kill humans, however, would be contrary to the Martens Clause, which merges law and morality. Limitations in the emotional, perceptive, and ethical capabilities of these machines significantly hinder their ability to treat other human beings humanely and to respect human life and dignity. Consequently, the use of these weapons would be incompatible with the principles of humanity as set forth in the Martens Clause.

 

V. The Dictates of Public Conscience

The Martens Clause states that in the absence of treaty law, the dictates of public conscience along with the principles of humanity protect civilians and combatants. The reference to “public conscience” instills the law with morality and requires that assessments of the means and methods of war account for the opinions of citizens and experts as well as governments. The reactions of these groups to the prospect of fully autonomous weapons make it clear that the development, production, and use of such technology would raise serious concerns under the Martens Clause.

Definition

The dictates of public conscience refer to shared moral guidelines that shape the actions of states and individuals.[89] The use of the term “conscience” indicates that the dictates are based on a sense of morality, a knowledge of what is right and wrong.[90] According to philosopher Peter Asaro, conscience implies “feeling compelled by, or believing in, a specific moral obligation or duty.”[91] The adjective “public” clarifies that these dictates reflect the concerns of a range of people and entities. Building on the widely cited work of jurist and international humanitarian law expert Theodor Meron, this report looks to two sources in particular to determine what qualifies as the public conscience: the opinion of the public and the opinions of governments.[92]

Polling data and expert views provide evidence of public opinion.[93] Surveys reveal the perspectives and beliefs of ordinary individuals. They can also illuminate nuanced differences in the values and understandings of laypeople. While informative, polls, by themselves, are not sufficient measures of the public conscience, in part because the responses can be influenced by the nature of the questions asked and do not necessarily reflect moral consideration.[94] The statements and actions of experts, who have often deliberated at length on the questions at issue, reflect a more in-depth understanding.[95] Their specific expertise may range from religion to technology to law, but they share a deep knowledge of the topic. The views they voice can thus shed light on the moral norms embraced by the informed public.[96]

Governments articulate their stances through policies and in written statements and oral interventions at diplomatic meetings and other public fora. Their positions reflect the perspectives of countries that differ in economic development, military prowess, political systems, religious and cultural traditions, and demographics. Government opinion can help illuminate opinio juris, an element of customary international law, which refers to a state’s belief that a certain practice is legally obligatory.[97]

Application to Fully Autonomous Weapons

The positions of individuals and governments around the world have demonstrated that fully autonomous weapons are highly problematic under the dictates of public conscience. Through opinion polls, open letters, oral and written statements, in-depth publications, and self-imposed guidelines, members of the public have shared their distress and outrage at the prospect of these weapons. Government officials from more than 100 countries have expressed similar concerns and spoken in favor of imposing limits on fully autonomous weapons.[98] While public opposition to fully autonomous weapons is not universal, these voices collectively show that it is both widespread and growing.[99]

Opinion of the Public

Public opposition to the development, production, and use of fully autonomous weapons is significant and spreading. Several public opinion polls have revealed individuals’ resistance to these weapons.[100] These findings are mirrored in statements made by leaders in the relevant fields of disarmament and human rights, peace and religion, science and technology, and industry. While not comprehensive, the sources discussed below exemplify the nature and range of public opinion and provide evidence of the public conscience.

Surveys

Public opinion surveys conducted around the world have documented widespread opposition to the development, production, and use of these weapons. According to these polls, the majority of people surveyed found the prospect of delegating life-and-death decisions to machines unacceptable. For example, a 2013 survey of Americans, conducted by political science professor Charli Carpenter, found that 55 percent of respondents opposed the “trend toward using” fully autonomous weapons.[101] This position was shared roughly equally across genders, ages, and political ideologies. Interestingly, active-duty military personnel, who understand the realities of armed conflict firsthand, were among the strongest objectors; 73 percent expressed opposition to fully autonomous weapons.[102] The majority of respondents to this poll also supported a campaign to ban the weapons.[103] A more recent national survey of about 1,000 Belgians, released on July 3, 2018, found that 60 percent of respondents believed that “Belgium should support international efforts to ban the development, production and use of fully autonomous weapons.” Only 23 percent disagreed.[104]

International opinion polls have produced similar results. In 2015, the Open Roboethics initiative surveyed more than 1,000 individuals from 54 countries and found that 56 percent of respondents opposed the development and use of what it called lethal autonomous weapons systems.[105] Thirty-four percent of all respondents rejected development and use because “humans should always be the one to make life/death decisions.”[106] Other motivations cited less frequently included the weapons’ unreliability, the risk of proliferation, and lack of accountability.[107] An even larger survey by Ipsos of 11,500 people from 25 countries produced similar results in 2017.[108] This poll explained that the United Nations was reviewing the “strategic, legal and moral implications of autonomous weapons systems” (equivalent to fully autonomous weapons) and asked participants how they felt about the weapons’ use. Fifty-six percent recorded their opposition.[109]

Nongovernmental and International Organizations

Providing further evidence of concerns under the dictates of public conscience, experts from a range of fields have felt compelled, especially for moral reasons, to call for a prohibition on the development, production, and use of fully autonomous weapons. The Campaign to Stop Killer Robots, a civil society coalition of 75 NGOs, is spearheading the effort to ban fully autonomous weapons.[110] Its NGO members are active in more than 30 countries and include groups with expertise in humanitarian disarmament, peace and conflict resolution, technology, human rights, and other relevant fields.[111] Human Rights Watch, which co-founded the campaign in 2012, serves as its coordinator. Over the past six years, the campaign’s member organizations have highlighted the many problems associated with fully autonomous weapons through dozens of publications and statements made at diplomatic meetings and UN events, on social media, and in other fora.[112]

While different concerns resonate with different people, Steve Goose, director of Human Rights Watch’s Arms Division, highlighted the importance of the Martens Clause in his statement to the April 2018 CCW Group of Governmental Experts meeting. Goose said:

There are many reasons to reject [lethal autonomous weapons systems] (including legal, accountability, technical, operational, proliferation, and international security concerns), but ethical and moral concerns—which generate the sense of revulsion—trump all. These ethical concerns should compel High Contracting Parties of the Convention on Conventional Weapons to take into account the Martens Clause in international humanitarian law, under which weapons that run counter to the principles of humanity and the dictates of the public conscience should not be developed.[113]

The ICRC has encouraged states to assess fully autonomous weapons under the Martens Clause and observed that “[w]ith respect to the public conscience, there is a sense of deep discomfort with the idea of any weapon system that places the use of force beyond human control.”[114] The ICRC has repeatedly emphasized the legal and ethical need for human control over the critical functions of selecting and attacking targets. In April 2018, it made clear its view that “a minimum level of human control is required to ensure compliance with international humanitarian law rules that protect civilians and combatants in armed conflict, and ethical acceptability in terms of the principles of humanity and the public conscience.”[115] The ICRC explained that international humanitarian law “requires that those who plan, decide upon and carry out attacks make certain judgements in applying the norms when launching an attack. Ethical considerations parallel this requirement—demanding that human agency and intention be retained in decisions to use force.”[116] The ICRC concluded that a weapon system outside human control “would be unlawful by its very nature.”[117]

Peace and Faith Leaders

In 2014, more than 20 individuals and organizations that had received the Nobel Peace Prize issued a joint letter stating that they “whole-heartedly embrace [the] goal of a preemptive ban on fully autonomous weapons” and find it “unconscionable that human beings are expanding research and development of lethal machines that would be able to kill people without human intervention.”[118] The individual signatories to the letter included American activist Jody Williams, who led the civil society drive to ban landmines, along with heads of state and politicians, human rights and peace activists, a lawyer, a journalist, and a church leader.[119] The Pugwash Conferences on Science and World Affairs, an organizational signatory, and the Nobel Women’s Initiative, which helped spearhead the letter, are both on the steering committee of the Campaign to Stop Killer Robots.

Religious leaders have similarly united against fully autonomous weapons. In 2014, more than 160 faith leaders signed an “interreligious declaration calling on states to work towards a global ban on fully autonomous weapons.”[120] In language that implies concerns under the principles of humanity, the declaration describes such weapons as “an affront to human dignity and to the sacredness of life.”[121] The declaration further criticizes the idea of delegating life-and-death decisions to a machine because fully autonomous weapons have “no moral agency and, as a result, cannot be held responsible if they take an innocent life.”[122] The list of signatories encompassed representatives of Buddhism, Catholicism, Islam, Judaism, Protestantism, and Quakerism. Archbishop Desmond Tutu signed both this declaration and the Nobel Peace Laureates letter.

Science and Technology Experts

Individuals with technological expertise have also expressed opposition to fully autonomous weapons. The International Committee for Robot Arms Control (ICRAC), whose members study technology from various disciplines, raised the alarm in 2013 shortly after it co-founded the Campaign to Stop Killer Robots.[123] ICRAC issued a statement endorsed by more than 270 experts calling for a ban on the development and deployment of fully autonomous weapons.[124] Members of ICRAC noted “the absence of clear scientific evidence that robot weapons have, or are likely to have in the foreseeable future, the functionality required for accurate target identification, situational awareness or decisions regarding the proportional use of force” and concluded that “[d]ecisions about the application of violent force must not be delegated to machines.”[125] While the concerns emphasized in this statement focus on technology, as discussed above, the inability to make proportionality decisions can also run counter to respect for human life and the principles of humanity.

In 2015, an even larger group of AI and robotics researchers issued an open letter. As of June 2018, more than 3,500 scientists, as well as more than 20,000 other individuals, had signed this call for a ban.[126] The letter warns that these machines could become the “Kalashnikovs of tomorrow” if their development is not prevented.[127] It states that while the signatories “believe that AI has great potential to benefit humanity in many ways,” they “believe that a military AI arms race would not be beneficial for humanity. There are many ways in which AI can make battlefields safer for humans, especially civilians, without creating new tools for killing people.”[128]

In addition to demanding action from others, thousands of technology experts have committed not to engage in actions that would advance the development of fully autonomous weapons. At a world congress held in Stockholm in July 2018, leading AI researchers issued a pledge to “neither participate in nor support the development, manufacture, trade, or use of lethal autonomous weapons.”[129] By the end of the month, more than 2,850 AI experts, scientists, and other individuals, along with 223 technology companies, societies, and organizations from at least 36 countries, had signed. The pledge, which cites moral, accountability, proliferation, and security-related concerns, finds that “the decision to take a human life should never be delegated to a machine.” It states, “There is a moral component to this position, that we should not allow machines to make life-taking decisions for which others—or nobody—will be culpable.”[130] According to the Future of Life Institute, which houses the pledge on its website, the pledge is necessary because “politicians have thus far failed to put into effect” any regulations and laws against lethal autonomous weapons systems.[131]

Industry

High-profile technology companies and their representatives have criticized fully autonomous weapons on various grounds. A Canadian robotics manufacturer, Clearpath Robotics, became the first company publicly to refuse to manufacture “weaponized robots that remove humans from the loop.”[132] In 2014, it pledged to “value ethics over potential future revenue.”[133] In a letter to the public, the company stated that it was motivated by its belief “that the development of killer robots is unwise, unethical, and should be banned on an international scale.” Clearpath continued:

[W]ould a robot have the morality, sense, or emotional understanding to intervene against orders that are wrong or inhumane? No. Would computers be able to make the kinds of subjective decisions required for checking the legitimacy of targets and ensuring the proportionate use of force in the foreseeable future? No. Could this technology lead those who possess it to value human life less? Quite frankly, we believe this will be the case.[134]

The letter shows that fully autonomous weapons raise problems under both the principles of humanity and dictates of public conscience.

In August 2017, the founders and chief executive officers (CEOs) of 116 AI and robotics companies published a letter calling for CCW states parties to take action on autonomous weapons.[135] The letter opens by stating, “As companies building the technologies in Artificial Intelligence and Robotics that may be repurposed to develop autonomous weapons, we feel especially responsible in raising this alarm.”[136] The letter goes on to highlight the dangers to civilians, risk of an arms race, and possibility of destabilizing effects. It warns that “[o]nce this Pandora’s box is opened, it will be hard to close.”[137] In a similar vein, in 2018 Scott Phoenix, CEO of Vicarious, a prominent AI development company, described developing autonomous weapons as among the “world’s worst ideas” because of the likelihood of defects in their code and vulnerability to hacking.[138]

Google and the companies under its Alphabet group have been at the center of the debate about fully autonomous weapons on multiple occasions. DeepMind is an AI research company that was acquired by Google in 2014. In 2016, it submitted evidence to a UK parliamentary committee in which it described a ban on autonomous weapons as “the best approach to averting the harmful consequences that would arise from the development and use of such weapons.”[139] DeepMind voiced particular concern about the weapons’ “implications for global stability and conflict reduction.”[140] Two years later, more than 3,000 Google employees protested the company’s involvement with “Project Maven,” a US Department of Defense program that aims to use AI to autonomously process video footage taken by surveillance drones. The employees argued that the company should “not be in the business of war,”[141] and more than 1,100 academics supported them in a separate letter.[142] In June 2018, Google agreed to end its involvement in Project Maven once the contract expires in 2019, and it issued ethical principles for its AI work. The principles state that Google is “not developing AI for use in weapons” and “will not design or deploy AI” for technology that causes “overall harm” or “contravenes widely accepted principles of international law and human rights.”[143]

Investors in the technology industry have also started to respond to the ethical concerns raised by fully autonomous weapons. In 2016, the Ethics Council of the Norwegian Government Pension Fund announced that it would monitor investments in the development of these weapons to decide whether they are counter to the council’s guidelines.[144] Johan H. Andresen, council chairman, reiterated that position in a panel presentation for CCW delegates in April 2018.[145]

Opinions of Governments

Governments from around the world have increasingly shared the views of experts and the broader public that the development, production, and use of weapons without meaningful human control is unacceptable. As of April 2018, 26 nations—from Africa, Asia, Europe, Latin America, and the Middle East—have called for a preemptive ban on fully autonomous weapons.[146] In addition, more than 100 states, including those of the Non-Aligned Movement (NAM), have called for a legally binding instrument on such weapons. In a joint statement, members of NAM cited “ethical, legal, moral and technical, as well as international peace and security related questions” as matters of concern.[147] While a complete analysis of government interventions over the past five years is beyond the scope of this report, overall the statements have demonstrated that countries oppose the loss of human control on moral as well as legal, technical, and other grounds. The opinions of these governments, reflective of public concerns, bolster the argument that fully autonomous weapons violate the dictates of public conscience.

The principles embedded in the Martens Clause have played a role in international discussions of fully autonomous weapons since they began in 2013. In that year, Christof Heyns, then UN special rapporteur on extrajudicial, summary or arbitrary executions, submitted a report to the UN Human Rights Council on fully autonomous weapons, which he referred to as “lethal autonomous robotics.”[148] Emphasizing the importance of human control over life-and-death decisions, Heyns explained that “[i]t is an underlying assumption of most legal, moral and other codes that when the decision to take life or to subject people to other grave consequences is at stake, the decision-making power should be exercised by humans.”[149] He continued: “Delegating this process dehumanizes armed conflict even further and precludes a moment of deliberation in those cases where it may be feasible. Machines lack morality and mortality, and should as a result not have life and death powers over humans.”[150] Heyns also named the Martens Clause as one legal basis for his determination.[151] The 2013 report called for a moratorium on the development of fully autonomous weapons until the establishment of an “internationally agreed upon framework.”[152] A 2016 joint report by Heyns and Maina Kiai, then UN special rapporteur on freedom of peaceful assembly and of association, went a step further, recommending that “[a]utonomous weapons systems that require no meaningful human control should be prohibited.”[153]

In May 2013, in response to Heyns’s report, the UN Human Rights Council held the first discussions of the weapons at the international level.[154] Of the 20 nations that voiced their positions, many articulated concerns about the emerging technology. They often used language related to the Martens Clause or morality more generally. Ecuador explicitly referred to elements of the Martens Clause and stated that leaving life-and-death decisions to machines would contravene the public conscience.[155] Indonesia raised objections related to the principles of humanity discussed above. It criticized the “possible far-reaching effects on societal values, including fundamentally on the protection and the value of life” that could arise from the use of these weapons.[156] Russia recommended that “particular attention” be paid to the “serious implications for societal foundations, including the negating of human life.”[157] Pakistan called for a ban based on the precedent of the preemptive ban on blinding lasers, which was motivated in large part by the Martens Clause.[158] Brazil also addressed issues of morality; it said, “If the killing of one human being by another has been a challenge that legal, moral, and religious codes have grappled with since time immemorial, one may imagine the host of additional concerns to be raised by robots exercising the power of life and death over humans.”[159] While Human Rights Council member states also addressed other important risks of fully autonomous weapons, especially those related to security, morality was a dominant theme.[160]

Since the Human Rights Council’s session in 2013, most diplomatic discussions have taken place under the auspices of the Convention on Conventional Weapons.[161] States parties to the CCW held three informal meetings of experts on what they refer to as “lethal autonomous weapons systems” between 2014 and 2016.[162] At their 2016 Review Conference, they agreed to formalize discussions in a Group of Governmental Experts, a forum that is generally expected to produce an outcome such as a new CCW protocol.[163] More than 80 states participated in the most recent meeting of the group in April 2018. At that meeting, Austria noted that “CCW’s engagement on lethal autonomous weapons stands testimony to the high level of concern about the risk that such weapons entail.”[164] This engagement also serves as an indication that the public conscience is against this technology.

CCW states parties have highlighted the relevance of the Martens Clause at each of their meetings on lethal autonomous weapons systems. At the first meeting in May 2014, for example, Brazil described the Martens Clause as a “keystone” of international humanitarian law, which “‘allows us to navigate safely in new and dangerous waters’ and to feel confident that a human remains protected under the principles of humanity and the dictates of public conscience.”[165] Mexico found “there is absolutely no doubt that the development of these new technologies have to comply with [the] principles” of the Martens Clause.[166] At the second CCW experts meeting in April 2015, Russia described the Martens Clause as “an integral part of customary international law.”[167] Adopting a narrow interpretation of the provision, the United States said that “the Martens Clause is not a rule of international law that prohibits any particular weapon, much less a weapon that does not currently exist.” Nevertheless, it acknowledged that “the principles of humanity and the dictates of public conscience provide a relevant and important paradigm for discussing the moral or ethical issues related to the use of automation in warfare.”[168]

Several CCW states parties have based their objections to fully autonomous weapons on the Martens Clause and its elements. In a joint statement in April 2018, the African Group said that the “principles of humanity and dictates of public conscience as enunciated in the [Martens] Clause must be taken seriously.”[169] The African Group called for a preemptive ban on lethal autonomous weapons systems, declaring that its members found “it inhumane, abhorrent, repugnant, and against public conscience for humans to give up control to machines, allowing machines to decide who lives or dies, how many lives and whose life is acceptable as collateral damage when force is used.”[170] The Holy See condemned fully autonomous weapons because they “could never be a morally responsible subject. The unique human capacity for moral judgment and ethical decision-making is more than a complex collection of algorithms and such a capacity cannot be replaced by, or programed in, a machine.” The Holy See warned that autonomous weapons systems could find normal and acceptable “those behaviors that international law prohibits, or that, albeit not explicitly outlined, are still forbidden by dictates of morality and public conscience.”[171]

At the April 2018 meeting, other states raised issues under the Martens Clause more implicitly. Greece, for example, stated that “it is important to ensure that commanders and operators will remain on the loop of the decision making process in order to ensure the appropriate human judgment over the use of force, not only for reasons related to accountability but mainly to protect human dignity over the decision on life or death.”[172]

CCW states parties have considered a host of other issues surrounding lethal autonomous weapons systems over the past five years. They have highlighted, inter alia, the challenges of complying with international humanitarian law and international human rights law, the potential for an accountability gap, the risk of an arms race and a lower threshold for war, and the weapons’ vulnerability to hacking. Combined with the Martens Clause, these issues have led to convergence of views on the imperative of retaining some form of human control over weapons systems and the use of force. In April 2018, Pakistan noted that “a general sense is developing among the High Contracting Parties that weapons with autonomous functions must remain under the direct control and supervision of humans at all times, and must comply with international law.”[173] Similarly, the European Union stated that its members “firmly believe that humans should make the decisions with regard to the use of lethal force, exert sufficient control over lethal weapons systems they use, and remain accountable for decisions over life and death.”[174]

The year 2018 has also seen increased parliamentary and UN calls for human control. In July, the Belgian Parliament adopted a resolution asking the government to support international efforts to ban the use of fully autonomous weapons.[175] The same month, the European Parliament voted to recommend that the UN Security Council:

work towards an international ban on weapon systems that lack human control over the use of force as requested by Parliament on various occasions and, in preparation of relevant meetings at UN level, to urgently develop and adopt a common position on autonomous weapon systems and to speak at relevant fora with one voice and act accordingly.[176]

In his 2018 disarmament agenda, the UN secretary-general noted, “All sides appear to be in agreement that, at a minimum, human oversight over the use of force is necessary.” He offered to support the efforts of states “to elaborate new measures, including through politically or legally binding arrangements, to ensure that humans remain at all times in control over the use of force.”[177] While the term remains to be defined, requiring “human control” is effectively the same as prohibiting weapons without such control. Therefore, the widespread agreement about the necessity of human control indicates that fully autonomous weapons contravene the dictates of public conscience.

 

VI. The Need for a Preemptive Ban Treaty

The Martens Clause fills a gap when existing treaties fail to specifically address a new situation or technology. In such cases, the principles of humanity and the dictates of public conscience serve as guides for interpreting international law and set standards against which to judge the means and methods of war. In so doing, they provide a baseline for adequately protecting civilians and combatants. The clause, which is a provision of international humanitarian law, also integrates moral considerations into legal analysis.

Existing treaties regulate fully autonomous weapons only in general terms, and thus an assessment of the weapons should take the Martens Clause into account. Because fully autonomous weapons raise concerns under both the principles of humanity and the dictates of public conscience, the Martens Clause points to the urgent need to adopt a specific international agreement on the emerging technology. To eliminate any uncertainty and comply with the elements of the Martens Clause, the new instrument should take the form of a preemptive ban on the development, production, and use of fully autonomous weapons.

No regulation of fully autonomous weapons short of a ban would ensure compliance with the principles of humanity. Fully autonomous weapons would lack the compassion and legal and ethical judgment that facilitate humane treatment of humans. They would face significant challenges in respecting human life. Even if they could comply with legal rules of protection, they would not have the capacity to respect human dignity.

Limiting the use of fully autonomous weapons to certain locations, such as those where civilians are rare, would not sufficiently address these problems. “Harm to others,” which the principle of humane treatment seeks to avoid, encompasses harm to civilian objects, which might be present where civilians themselves are not. The requirement to respect human dignity applies to combatants as well as civilians, so the weapons should not be permitted where enemy troops are positioned. Furthermore, allowing fully autonomous weapons to be developed and to enter national arsenals would raise the possibility of their misuse. They would likely proliferate to actors with no regard for human suffering and no respect for human life or dignity. The 2017 letter from technology company CEOs warned that the weapons could be “weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways.”[178] Regulation that allowed for the existence of fully autonomous weapons would open the door to violations of the principles of humanity.

A ban is also necessary to promote compliance with the dictates of public conscience. An overview of public opinion shows that ordinary people and experts alike have objected to the prospect of fully autonomous weapons on moral grounds. Public opinion surveys have illuminated significant opposition to these weapons based on the problems of delegating life-and-death decisions to machines. Experts have continually called for a preemptive ban on fully autonomous weapons, citing moral along with legal and security concerns. Regulation that allows for the existence of fully autonomous weapons, even if they could only be used in limited circumstances, would be inconsistent with the widespread public belief that fully autonomous weapons are morally wrong.

The statements of governments, another element of the public conscience, illuminate that opposition to weapons that lack human control over the selection and engagement of targets extends beyond individuals to countries. More than two dozen countries have explicitly called for a preemptive ban on these weapons,[179] and consensus is emerging regarding the need for human control over the use of force. As noted above, the requirement for human control is effectively equivalent to a ban on weapons without it. Therefore, a ban would best ensure that the dictates of public conscience are met.

The principles of humanity and the dictates of public conscience bolster the case against fully autonomous weapons, although, as discussed above, they are not the only matters of concern. Fully autonomous weapons are also problematic under other legal provisions and raise accountability, technological, and security risks. Collectively, these dangers to humanity more than justify the creation of new law that maintains human control over the use of force and prevents fully autonomous weapons from coming into existence.

Acknowledgments

Bonnie Docherty, senior researcher in the Arms Division of Human Rights Watch, was the lead writer and editor of this report. She is also the associate director of armed conflict and civilian protection and a lecturer on law at the International Human Rights Clinic (IHRC) at Harvard Law School. Danielle Duffield, Annie Madding, and Paras Shah, students in IHRC, made major contributions to the research, analysis, and writing of the report. Steve Goose, director of the Arms Division, and Mary Wareham, advocacy director of the Arms Division, edited the report. Dinah PoKempner, general counsel, and Tom Porteous, deputy program director, also reviewed the report. Peter Asaro, associate professor at the School of Media Studies at the New School, provided additional feedback on the report.

This report was prepared for publication by Marta Kosmyna, senior associate in the Arms Division, Fitzroy Hepkins, administrative manager, and Jose Martinez, senior coordinator. Russell Christian produced the cartoon for the report cover.

 

 

[1] For a more in-depth discussion of the pros and cons of fully autonomous weapons, see Human Rights Watch and the Harvard Law School International Human Rights Clinic (IHRC), Making the Case: The Dangers of Killer Robots and the Need for a Preemptive Ban, December 2016, https://www.hrw.org/report/2016/12/09/making-case/dangers-killer-robots-....

[2] Future of Life Institute, “Autonomous Weapons: An Open Letter from AI & Robotics Researchers,” opened on July 28, 2015, https://futureoflife.org/open-letter-autonomous-weapons/ (accessed July 22, 2018).

[3] Convention (II) with Respect to the Laws and Customs of War on Land and its Annex: Regulations concerning the Laws and Customs of War on Land, The Hague, adopted July 29, 1899, entered into force September 4, 1900, pmbl., para. 8.

[4] “The three conventions adopted at the 1899 Conference represented the three broad areas … [of] pacific settlement of international disputes, arms limitation, and the laws of war.” Betsy Baker, “Hague Peace Conferences: 1899 and 1907,” Max Planck Encyclopedia of Public International Law, updated November 2009, http://opil.ouplaw.com/view/10.1093/law:epil/9780199231690/law-978019923... (accessed July 14, 2018), para. 3.

[5] See Antonio Cassese, “The Martens Clause: Half a Loaf or Simply Pie in the Sky?” European Journal of International Law, vol. 11, no. 1 (2000), pp. 193-195; Theodor Meron, “The Martens Clause, Principles of Humanity, and Dictates of Public Conscience,” American Journal of International Law, vol. 94, no. 1 (2000), p. 79 (noting, “[t]he clause was originally designed to provide residual humanitarian rules for the protection of the population of occupied territories, especially armed resisters in those territories.”). A minority of scholars question the conventional narrative that the clause served as a fair compromise. See Rotem Giladi, “The Enactment of Irony: Reflections on the Origins of the Martens Clause,” European Journal of International Law, vol. 25, no. 3 (2014), p. 853 (“His [Martens’] response to the objections raised by Belgium was anything but conciliatory. It was calculated, naturally, to advance legal rules on occupation that suited the interests of the expanding Russian empire he represented…. Martens’ response remains a classic example of power politics veiled by humanitarian rhetoric; it also cunningly harped on the political and professional sensitivities besetting his audience.”).

[6] As of August 6, 2018, the four Geneva Conventions had 196 states parties. See International Committee of the Red Cross (ICRC), “Treaties, States Parties and Commentaries, Geneva Conventions of 12 August 1949 and Additional Protocols, and their Commentaries,” https://ihl-databases.icrc.org/applic/ihl/ihl.nsf/vwTreaties1949.xsp (accessed August 6, 2018).

[7] Geneva Convention for the Amelioration of the Condition of the Wounded and Sick in Armed Forces in the Field, adopted August 12, 1949, 75 U.N.T.S. 31, entered into force October 21, 1950, art. 63; Geneva Convention for the Amelioration of the Condition of Wounded, Sick and Shipwrecked Members of Armed Forces at Sea, adopted August 12, 1949, 75 U.N.T.S. 85, entered into force October 21, 1950, art. 62; Geneva Convention relative to the Treatment of Prisoners of War, adopted August 12, 1949, 75 U.N.T.S. 135, entered into force October 21, 1950, art. 142; Geneva Convention relative to the Protection of Civilian Persons in Time of War, adopted August 12, 1949, 75 U.N.T.S. 287, entered into force October 21, 1950, art. 158 (stating, “The denunciation shall have effect only in respect of the denouncing Power. It shall in no way impair the obligations which the Parties to the conflict shall remain bound to fulfil by virtue of the principles of the law of nations, as they result from the usages established among civilized peoples, from the laws of humanity and the dictates of the public conscience.”).

[8] ICRC, “Commentary of 2016 on Convention (I) for the Amelioration of the Condition of the Wounded and Sick in Armed Forces in the Field, Geneva, 12 August 1949: Article 63: Denunciation,” https://ihl-databases.icrc.org/applic/ihl/ihl.nsf/Comment.xsp?action=ope... (accessed July 14, 2018), para. 3330.

[9] As of August 6, 2018, Additional Protocol I had 174 states parties. See ICRC, “Protocol Additional to the Geneva Conventions of 12 August 1949, and relating to the Protection of Victims of International Armed Conflicts (Protocol I), 8 June 1977,” https://ihl-databases.icrc.org/ihl/INTRO/470 (accessed August 6, 2018).

[10] Protocol Additional to the Geneva Conventions of August 12, 1949 and relating to the Protection of Victims of International Armed Conflicts (Protocol I), adopted June 8, 1977, 1125 U.N.T.S. 3, entered into force December 7, 1978, art. 1(2).

[11] ICRC, “Commentary of 1987 on Protocol Additional to the Geneva Conventions of August 12, 1949 and relating to the Protection of Victims of International Armed Conflicts (Protocol I): Article 1, General Principles and Scope of Application,” https://ihl-databases.icrc.org/applic/ihl/ihl.nsf/Comment.xsp?action=openDocument&documentId=7125D4CBD57A70DDC12563CD0042F793 (accessed July 15, 2018), para. 55.

[12] A notable exception is the 1992 Chemical Weapons Convention. See Convention on the Prohibition of the Development, Production, Stockpiling and Use of Chemical Weapons and on their Destruction, adopted September 3, 1992, 1974 U.N.T.S. 45, entered into force April 29, 1997.

[13] The 1925 Geneva Gas Protocol incorporates elements of the Martens Clause in its preamble. Protocol for the Prohibition of the Use in War of Asphyxiating, Poisonous or Other Gases, and of Bacteriological Methods of Warfare, adopted June 17, 1925, 94 L.N.T.S. 65, entered into force February 8, 1928, pmbl., paras. 1-3 (“Whereas the use in war of asphyxiating, poisonous or other gases … has been justly condemned by the general opinion of the civilized world; and Whereas the prohibition of such use has been declared in Treaties to which the majority of Powers of the world are Parties; and to the end that this prohibition shall be universally accepted[] … binding alike the conscience and the practice of nations”).

[14] Convention on the Prohibition of the Development, Production and Stockpiling of Bacteriological (Biological) and Toxin Weapons and on Their Destruction, opened for signature April 10, 1972, 1015 U.N.T.S. 163, entered into force March 26, 1975, pmbl., para. 10 (“Convinced that such use would be repugnant to the conscience of mankind and that no effort should be spared to minimize this risk”).

[15] Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons which May be Deemed to be Excessively Injurious or to Have Indiscriminate Effects (CCW), adopted October 10, 1980, 1342 U.N.T.S. 137, entered into force December 2, 1983, pmbl., para. 5 (“Confirming their determination that in cases not covered by this Convention and its annexed Protocols or by other international agreements, the civilian population and the combatants shall at all times remain under the protection and authority of the principles of international law derived from established custom, from the principles of humanity and from the dictates of public conscience”).

[16] Convention on the Prohibition of the Use, Stockpiling, Production and Transfer of Anti-Personnel Mines and on Their Destruction, adopted September 18, 1997, 2056 U.N.T.S. 241, entered into force March 1, 1999, pmbl., para. 8 (“Stressing the role of public conscience in furthering the principles of humanity as evidenced by the call for a total ban of anti-personnel mines”).

[17] Convention on Cluster Munitions, adopted May 30, 2008, 2688 U.N.T.S. 39, entered into force August 1, 2010, pmbl., para. 11 (“Reaffirming that in cases not covered by this Convention or by other international agreements, civilians and combatants remain under the protection and authority of the principles of international law, derived from established custom, from the principles of humanity and from the dictates of public conscience”).

[18] Treaty on the Prohibition of Nuclear Weapons, adopted July 7, 2017, C.N.475.2017.TREATIES-XXVI.9, pmbl., para. 11 (“Reaffirming that any use of nuclear weapons would also be abhorrent to the principles of humanity and the dictates of public conscience”).

[19] Protocol I, art. 1(2).

[20] For instance, in a legal paper on fully autonomous weapons, Switzerland argued: “Accordingly, not everything that is not explicitly prohibited can be said to be legal if it would run counter the principles put forward in the Martens clause. Indeed, the Martens clause may be said to imply positive obligations where contemplated military action would result in untenable humanitarian consequences.” Switzerland, “A ‘Compliance-Based’ Approach to Autonomous Weapon Systems,” U.N. Doc. CCW/GGE.1/2017/WP.9, November 10, 2017, https://www.unog.ch/80256EDD006B8954/(httpAssets)/6B80F9385F6B505FC12581D4006633F8/$file/2017_GGEonLAWS_WP9_Switzerland.pdf (accessed July 15, 2018), para. 18.

[21] In re Krupp, Judgment of July 31, 1948, in Trials of War Criminals before the Nuremberg Military Tribunals: “The Krupp Case,” vol. IX, (Washington: US Government Printing Office, 1950), p. 1340.

[22] Advisory Opinion on the Legality of the Threat or Use of Nuclear Weapons, International Court of Justice, July 8, 1996, http://www.icj-cij.org/files/case-related/95/095-19960708-ADV-01-00-EN.pdf (accessed July 15, 2018), para. 78.

[23] Some critics argue that international humanitarian law would adequately cover fully autonomous weapons and note the applicability of disarmament treaties on antipersonnel landmines, cluster munitions, and incendiary weapons. These instruments do not provide specific law on fully autonomous weapons, however. For critics’ view, see Michael N. Schmitt and Jeffrey S. Thurnher, “‘Out of the Loop’: Autonomous Weapon Systems and the Law of Armed Conflict,” Harvard National Security Journal, vol. 4 (2013), p. 276.

[24] See Rupert Ticehurst, “The Martens Clause and the Laws of Armed Conflict,” International Review of the Red Cross, no. 317 (1997), https://www.icrc.org/eng/resources/documents/article/other/57jnhy.htm (accessed July 15, 2018), p. 1 (noting, “The problem faced by humanitarian lawyers is that there is no accepted interpretation of the Martens Clause. It is therefore subject to a variety of interpretations, both narrow and expansive.”).

[25] For example, the British government advanced this interpretation in its briefing before the International Court of Justice during the 1996 Nuclear Weapons Advisory Opinion process, stating: “While the Martens Clause makes clear that the absence of a specific treaty provision on the use of nuclear weapons is not, in itself, sufficient to establish that such weapons are capable of lawful use, the Clause does not, on its own, establish their illegality. The terms of the Martens Clause themselves make it necessary to point to a rule of customary international law which might outlaw the use of nuclear weapons.” Meron, “The Martens Clause, Principles of Humanity, and Dictates of Public Conscience,” American Journal of International Law, p. 85.

[26] France v. Greece, Permanent Court of International Justice, Judgement No. 22, March 17, 1934, http://www.worldcourts.com/pcij/eng/decisions/1934.03.17_lighthouses.htm (accessed July 15, 2018), para. 106 (Separate Opinion of M. Anzilotti) (“[I]t is a fundamental rule in interpreting legal texts that one should not lightly admit that they contain superfluous words: the right course, whenever possible, is to seek for an interpretation which allows a reason and a meaning to every word in the text.”).

[27] See, for example, Michael Salter, “Reinterpreting Competing Interpretations of the Scope and Potential of the Martens Clause,” Journal of Conflict and Security Law, vol. 17, no. 3 (2012), p. 421.

[28] See, for example, In re Krupp, Judgment of July 31, 1948, in Trials of War Criminals before the Nuremberg Military Tribunals: “The Krupp Case,” p. 1340 (asserting that the Martens Clause “is much more than a pious declaration”). See also Cassese, “The Martens Clause,” European Journal of International Law, p. 210 (asserting that most of the states that appeared before the International Court of Justice with regard to the Nuclear Weapons Advisory Opinion “suggested—either implicitly or in a convoluted way—the expansion of the scope of the clause so as to upgrade it to the rank of a norm establishing new sources of law”); ICRC, A Guide to the Legal Review of New Weapons, Means and Methods of Warfare: Measures to Implement Article 36 of Additional Protocol I of 1977 (2006), http://www.icrc.org/eng/resources/documents/publication/p0902.htm (accessed July 15, 2018), p. 17 (stating, “A weapon which is not covered by existing rules of international humanitarian law would be considered contrary to the Martens clause if it is determined per se to contravene the principles of humanity or the dictates of public conscience.”).

[29] Cassese, “The Martens Clause,” European Journal of International Law, p. 212.

[30] Ibid. See also Jochen von Bernstorff, “Martens Clause,” Max Planck Encyclopedia of Public International Law, updated December 2009, http://opil.ouplaw.com/search?sfam=&q=Martens+Clause+&prd=EPIL&searchBtn... (accessed July 15, 2018), para. 13 (“A second reading sees the clause as an interpretative device according to which, in case of doubt, rules of international humanitarian law should be interpreted according to ‘principles of humanity’ and ‘dictates of public conscience.’”).

[31] ICRC, “Ethics and Autonomous Weapons Systems: An Ethical Basis for Human Control?” April 3, 2018, https://www.icrc.org/en/document/ethics-and-autonomous-weapon-systems-et... (accessed July 15, 2018), p. 6. The ICRC has elsewhere acknowledged that states must take the Martens Clause into account when conducting weapons reviews. ICRC, A Guide to the Legal Review of New Weapons, Means and Methods of Warfare, p. 17.

[32] ICRC, “Ethics and Autonomous Weapons Systems: An Ethical Basis for Human Control?” p. 6.

[33] Peter Asaro, “Jus nascendi, Robotic Weapons and the Martens Clause,” in Robot Law, eds. Ryan Calo, Michael Froomkin, and Ian Kerr (Cheltenham, UK: Edward Elgar Publishing, 2016), https://www.elgaronline.com/view/9781783476725.00024.xml (accessed July 15, 2018), p. 386.

[34] ICRC, “Ethics and Autonomous Weapons Systems: An Ethical Basis for Human Control?” p. 6.

[35] CCW Protocol on Blinding Lasers (CCW Protocol IV), adopted October 13, 1995, entered into force July 30, 1998, art. 1. For a discussion of the negotiating history of this protocol and its relationship to discussions about fully autonomous weapons, see Human Rights Watch and IHRC, Precedent for Preemption: The Ban on Blinding Lasers as a Model for a Killer Robots Prohibition, November 2015, https://www.hrw.org/sites/default/files/supporting_resources/robots_and_..., pp. 3-7.

[36] ICRC, Blinding Weapons: Reports of the Meetings of Experts Convened by the ICRC on Battlefield Laser Weapons, 1989-1991 (Geneva: ICRC, 1993), p. 342 (emphasis in original removed).

[37] Ibid., p. 341.

[38] Ibid., p. 85.

[39] Louise Doswald-Beck, “New Protocol on Blinding Laser Weapons,” International Review of the Red Cross, no. 312 (1996), https://www.icrc.org/eng/resources/documents/article/other/57jn4y.htm (accessed July 15, 2018).

[40] Summary of Statement by Human Rights Watch, CCW First Review Conference, “Summary Record of the 6th Meeting,” CCW/CONF.I/SR.6, September 28, 1995, para. 60.

[41] Summary of Statement by the UN Development Programme, CCW First Review Conference, “Summary Record of the 5th Meeting,” CCW/CONF.I/SR.5, September 27, 1995, para. 50.

[42] Summary of Statement by Christoffel Blindenmission (CBM), CCW First Review Conference, “Summary Record of the 6th Meeting,” CCW/CONF.I/SR.6, September 28, 1995, para. 51. CBM is an international Christian development organization.

[43] Summary of Statement by Chile, CCW First Review Conference, “Summary Record of the 14th Meeting,” CCW/CONF.I/SR.13, May 3, 1996, para. 69.

[44] European Parliament, Resolution on the Failure of the International Conference on Anti-Personnel Mines and Laser Weapons, December 4, 1995, https://eur-lex.europa.eu/legal-content/EN/TXT/HTML/?uri=CELEX:51995IP13... (accessed July 15, 2018).

[45] Ibid.

[46] Doswald-Beck, “New Protocol on Blinding Laser Weapons,” International Review of the Red Cross.

[47] Blinding lasers and fully autonomous weapons would differ in some respects. For example, blinding lasers are a specific type of weapon, while fully autonomous weapons constitute a broad class. Instead of undermining the calls for a ban, however, the unique qualities of fully autonomous weapons make a preemptive prohibition even more pressing. See Human Rights Watch and IHRC, Precedent for Preemption, pp. 17-18.

[48] English Oxford Living Dictionaries, “humanity,” https://en.oxforddictionaries.com/definition/humanity (accessed July 15, 2018). See also Merriam-Webster, “humanity,” https://www.merriam-webster.com/dictionary/humanity (accessed July 15, 2018) (defining humanity as “compassionate, sympathetic, or generous behavior or disposition; the quality or state of being humane”).

[49] See ICRC, “Rule 87: Humane Treatment,” Customary International Humanitarian Law Database, https://ihl-databases.icrc.org/customary-ihl/eng/docs/v1_rul_rule87 (accessed July 15, 2018).

[50] Ibid. See also V.V. Pustogarov, “The Martens Clause in International Law,” Journal of the History of International Law, vol. 1 (1999), p. 133 (noting, “the principles of humanity are expressed concretely in the provisions prescribing ‘humane treatment’ of the wounded, the sick, prisoners of war and other persons falling beneath the protection of the Geneva Conventions of 1949 and the Protocols of 1977. One can say that ‘humane treatment’ is the main content of humanitarian law.”). In addition, Article 10(1) of the International Covenant on Civil and Political Rights provides, “All persons deprived of their liberty shall be treated with humanity”; International Covenant on Civil and Political Rights (ICCPR), adopted December 16, 1966, G.A. Res. 2200A (XXI), 21 U.N. GAOR Supp. (No. 16) at 52, U.N. Doc. A/6316 (1966), 999 U.N.T.S. 171, entered into force March 23, 1976, art. 10(1).

[51] ICRC, “The Fundamental Principles of the Red Cross: Commentary,” January 1, 1979, https://www.icrc.org/eng/resources/documents/misc/fundamental-principles... (accessed July 15, 2018).

[52] English Oxford Living Dictionaries, “humane,” https://en.oxforddictionaries.com/definition/humane (accessed July 15, 2018) (defining “humane” as “having or showing compassion or benevolence”).

[53] ICRC, “The Fundamental Principles of the Red Cross: Commentary,” January 1, 1979.

[54] In this report, the term “actor” is used to describe an agent deployed in situations, including armed conflict, where they are charged with making moral decisions, i.e., those in which a person’s actions have the potential to harm or help others. Thus, both human beings and fully autonomous weapons are actors for the purposes of this paper. All actors are required, pursuant to the Martens Clause, to comply with the principles of humanity. For a definition of empathy, see English Oxford Living Dictionaries, “empathy,” https://en.oxforddictionaries.com/definition/empathy (accessed July 15, 2018). Anneliese Klein-Pineda has also commented that “both empathy and sympathy require the ability to interpret actions and perceive the motivations or feelings of others.” Anneliese Klein-Pineda, “The Ethics of Robots: Is There an Algorithm for Morality?” Stashlearn, December 22, 2016, https://learn.stashinvest.com/robot-ethics-morality (accessed July 15, 2018).

[55] Christof Heyns, then special rapporteur on extrajudicial, summary, or arbitrary executions, wrote that “[d]ecisions over life and death in armed conflict may require compassion and intuition.” Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions, Christof Heyns, Lethal Autonomous Robotics and the Protection of Life, UN Human Rights Council, A/HRC/23/47, April 9, 2013, https://www.ohchr.org/Documents/HRBodies/HRCouncil/RegularSession/Sessio... (accessed July 15, 2018), para. 55.

[56] Jean Pictet, Development and Principles of International Humanitarian Law (Geneva: Martinus Nijhoff and Henry Dunant Institute, 1985), p. 62.

[57] English Oxford Living Dictionaries, “judgement,” https://en.oxforddictionaries.com/definition/judgement (accessed July 15, 2018) (defining “judgement” as “the ability to make considered decisions or come to sensible conclusions”).

[58] James H. Moor, “The Nature, Importance, and Difficulty of Machine Ethics,” IEEE Intelligent Systems, vol. 21, no. 4, (2006), https://ieeexplore.ieee.org/document/1667948/ (accessed July 15, 2018), p. 21.

[59] James H. Moor, “Four Kinds of Ethical Robot,” Philosophy Now, vol. 72 (2007), p. 12.

[60] Moor, “The Nature, Importance, and Difficulty of Machine Ethics,” IEEE Intelligent Systems, p. 21.

[61] ICRC, “The Fundamental Principles of the Red Cross and Red Crescent,” 1996, https://www.icrc.org/eng/assets/files/other/icrc_002_0513.pdf (accessed July 15, 2018), p. 2.

[62] Ibid.

[63] English Oxford Living Dictionaries, “emotion,” https://en.oxforddictionaries.com/definition/emotion (accessed July 15, 2018) (defining emotion as “a strong feeling deriving from one's circumstances, mood, or relationships with others” and “instinctive or intuitive feeling as distinguished from reasoning or knowledge”).

[64] Amanda Sharkey, “Can We Program or Train Robots to be Good?” Ethics and Information Technology (2017), accessed August 3, 2018, doi.org/10.1007/s10676-017-9425-5, p. 8.

[65] See Olivia Goldhill, “Can We Trust Robots to Make Moral Decisions?” Quartz, April 3, 2016, https://qz.com/653575/can-we-trust-robots-to-make-moral-decisions/ (accessed July 15, 2018) (noting, “it’s unlikely robots will be able to address the most sophisticated ethical decisions for the foreseeable future.”).

[66] Sharkey, “Can We Program or Train Robots to be Good?” Ethics and Information Technology, p. 1.

[67] Mary Wareham (Human Rights Watch), “It’s Time for a Binding, Absolute Ban on Fully Autonomous Weapons,” commentary, Equal Times, November 9, 2017, https://www.hrw.org/news/2017/11/09/its-time-binding-absolute-ban-fully-....

[68] Christof Heyns, “Autonomous Weapons in Armed Conflict and the Right to a Dignified Life: An African Perspective,” South African Journal on Human Rights, vol. 33 (2017), accessed July 1, 2018, doi.org/10.1080/02587203.2017.1303903, p. 51.

[69] Ibid., p. 58.

[70] ICRC, “The Fundamental Principles of the International Red Cross and Red Crescent Movement,” August 2015, https://www.icrc.org/sites/default/files/topic/file_plus_list/4046-the_fundamental_principles_of_the_international_red_cross_and_red_crescent_movement.pdf (accessed July 21, 2018), p. 3.

[71] ICCPR, art. 6(1).

[72] For a more detailed analysis of the requirements for use of force under the right to life, see Human Rights Watch and IHRC, Shaking the Foundations: The Human Rights Implications of Killer Robots, May 2014, https://www.hrw.org/report/2014/05/12/shaking-foundations/human-rights-i..., pp. 8-16.

[73] ICRC, “Rule 1: The Principle of Distinction between Civilians and Combatants,” Customary International Humanitarian Law Database, https://ihl-databases.icrc.org/customary-ihl/eng/docs/v1_rul_rule1 (accessed July 21, 2018); ICRC, “Rule 14: Proportionality in Attack,” Customary International Humanitarian Law Database, https://ihl-databases.icrc.org/customary-ihl/eng/docs/v1_rul_rule14 (accessed August 6, 2018); “Military Necessity,” in ICRC, How Does Law Protect in War?, https://casebook.icrc.org/glossary/military-necessity (accessed July 21, 2018).

[74] English Oxford Living Dictionaries, “dignity,” https://en.oxforddictionaries.com/definition/dignity (accessed July 21, 2018) (defining “dignity” as “the quality of being worthy or honourable”). See also Jack Donnelly, “Human Dignity and Human Rights,” in Swiss Initiative to Commemorate the 60th Anniversary of the UDHR, Protecting Dignity: Agenda for Human Rights, June 2009, https://www.legal-tools.org/doc/e80bda/pdf/ (accessed July 21, 2018), p. 10; Human Rights Watch and IHRC, Shaking the Foundations, p. 3; José Pablo Alzina de Aguilar, “Human Dignity according to International Instruments on Human Rights,” Revista Electrónica de Estudios Internacionales, vol. 22 (2011), p. 8 (“[International human rights instruments] also say that rights which stem from that dignity, or at least the most important ones, are universal and inviolable.”).

[75] For example, the preamble of the Universal Declaration of Human Rights asserts that “recognition of the inherent dignity and of the equal and inalienable rights of all members of the human family is the foundation of freedom, justice and peace in the world.” Universal Declaration of Human Rights (UDHR), adopted December 10, 1948, G.A. Res. 217A(III), U.N. Doc. A/810 at 71 (1948), pmbl., para. 1.

[76] African Charter on Human and Peoples’ Rights (Banjul Charter), adopted June 27, 1981, CAB/LEG/67/3 rev. 5, 21 I.L.M. 58, entered into force October 21, 1986, art. 7.

[77] There is an overlap between the types of respect to the extent that an actor who truly respects human dignity is more likely to take the actions required in order to protect human life.

[78] ICRC, “Ethics and Autonomous Weapons Systems: An Ethical Basis for Human Control?” p. 10.

[79] Heyns, “Autonomous Weapons in Armed Conflict and the Right to a Dignified Life,” South African Journal on Human Rights, pp. 62-63 (stating, “A central thrust of the notion of human dignity is the idea that humans should not be treated as something similar to an object that simply has an instrumental value (as is the case e.g. with slavery or rape) or no value at all (as with many massacres).”).

[80] Human Rights Watch and IHRC, Making the Case, p. 6. See also Final Report to the Prosecutor by the Committee Established to Review the NATO Bombing Campaign against the Federal Republic of Yugoslavia, International Criminal Tribunal for the Former Yugoslavia, http://www.difesa.it/SMD_/CASD/IM/ISSMI/Corsi/Corso_Consigliere_Giuridic... (accessed July 21, 2018), para. 50.

[81] Human Rights Watch and IHRC, Making the Case, p. 7. See also Olivier Corten, “Reasonableness in International Law,” Max Planck Encyclopedia of Public International Law, updated March 2013, http://opil.ouplaw.com/view/10.1093/law:epil/9780199231690/law-978019923... (accessed July 22, 2018), para. 1 (noting, “Reasonableness is also generally perceived as opening the door to several ethical or moral, rather than legal, considerations.”).

[82] Fully autonomous weapons would face the same difficulties determining whether force is necessary and proportionate in law enforcement situations and avoiding it when possible, which are requirements for upholding the right to life. See Human Rights Watch and IHRC, Shaking the Foundations, pp. 8-14. See also Peter Asaro, “‘Hands Up, Don’t Shoot!’ HRI and the Automation of Police Use of Force,” Journal of Human-Robot Interaction, vol. 5, no. 3, (2016), http://humanrobotinteraction.org/journal/index.php/HRI/article/view/301/... (accessed July 22, 2018).

[83] Heyns, “Autonomous Weapons in Armed Conflict and the Right to a Dignified Life,” South African Journal on Human Rights, p. 64.

[84] Lt. Col. Dave Grossman, On Killing: The Psychological Cost of Learning to Kill in War and Society (New York: Little, Brown and Company, 1995), p. 4. Similarly, Armin Krishnan wrote that “One of the greatest restraints for the cruelty in war has always been the natural inhibition of humans not to kill or hurt fellow human beings. The natural inhibition is, in fact, so strong that most people would rather die than kill somebody.” Armin Krishnan, Killer Robots: Legality and Ethicality of Autonomous Weapons (Farnham: Ashgate Publishing Limited, 2009), p. 130.

[85] ICRC, “Ethics and Autonomous Weapons Systems: An Ethical Basis for Human Control?” pp. 10, 12.

[86] The ICRC has stated that the importance of respecting the individual personality and dignity of the individual is vital to the principle of humanity. See ICRC, “The Fundamental Principles of the Red Cross: Commentary,” January 1, 1979.

[87] Christof Heyns, “Autonomous Weapon Systems: Human Rights and Ethical Issues” (presentation to the CCW Meeting of Experts on Lethal Autonomous Weapon Systems, April 14, 2016), transcript on file with Human Rights Watch.

[88] Erin Hunt and Piotr Dobrzynski, “The Right to Dignity and Autonomous Weapons Systems,” CCW Report, April 11, 2018, http://reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/20... (accessed July 22, 2018), p. 5.

[89] Larry May, “Hobbes, Law, and Public Conscience,” Critical Review of International Social and Political Philosophy, vol. 19 (2016), accessed July 22, 2018, doi.org/10.1080/13698230.2015.1122352. See generally Heping Dang, International Law, Human Rights, and Public Opinion: The Role of the State in Educating on Human Rights Standards (London: Taylor & Francis Group, 2017).

[90] English Oxford Living Dictionaries, “conscience,” https://en.oxforddictionaries.com/definition/conscience (accessed July 22, 2018) (defining “conscience” as “[a] person's moral sense of right and wrong, viewed as acting as a guide to one's behavior”). See also Merriam-Webster, “conscience,” https://www.merriam-webster.com/dictionary/conscience (accessed July 22, 2018) (defining “conscience” as “the sense or consciousness of the moral goodness or blameworthiness of one's own conduct, intentions, or character together with a feeling of obligation to do right or be good”).

[91] Asaro, “Jus nascendi, Robotic Weapons and the Martens Clause,” pp. 374-375.

[92] Meron notes that looking at a range of opinions helps guard against the potentially immoral views of both governments and people. Meron also notes some have argued that international human rights law, which was discussed in Chapter IV of this report, can play a role in determining the public conscience. Meron, “The Martens Clause, Principles of Humanity, and Dictates of Public Conscience,” American Journal of International Law, pp. 83-84 (noting, “[t]hat public opinion—so influential in our era—has a role to play in the development of international law is not an entirely new phenomenon.”).

[93] Asaro, “Jus nascendi, Robotic Weapons and the Martens Clause,” pp. 374-375. See also V.V. Pustogarov, “The Martens Clause in International Law,” Journal of the History of International Law, pp. 132-133.

[94] Asaro, “Jus nascendi, Robotic Weapons and the Martens Clause,” pp. 373-374.

[95] Ibid., p. 375 (“Indeed, the best place to look for emerging norms and the dictates of public conscience are in the public forums in which states and individuals attempt to grapple with, and articulate that conscience.”).

[96] Ibid. (“That content should also be elicited through public discussion, as well as academic scholarship, artistic and cultural expressions, individual reflection, collective action, and additional means, by which society deliberates its collective moral conscience.”).

[97] Michael Wood and Omri Sender, “State Practice,” Max Planck Encyclopedia of Public International Law, updated January 2017, http://opil.ouplaw.com/abstract/10.1093/law:epil/9780199231690/law-97801... (accessed July 22, 2018) (“In essence, the practice must be general (that is, sufficiently widespread and representative, as well as consistent), and accompanied by a recognition that a rule of law or legal obligation is involved.”).

[98] With the support of the Non-Aligned Movement, which counts more than 100 states as members, the number of supportive states far surpasses 100. See, for example, Statement by Venezuela on behalf of the Non-Aligned Movement, CCW Group of Governmental Experts (GGE) on Lethal Autonomous Weapons Systems, Geneva, March 28, 2018, https://www.unog.ch/80256EDD006B8954/(httpAssets)/E9BBB3F7ACBE8790C125825F004AA329/$file/CCW_GGE_1_2018_WP.1.pdf (accessed July 22, 2018); Statement by the African Group, CCW GGE on Lethal Autonomous Weapons Systems, Geneva, April 9-13, 2018, http://reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/20... (accessed July 22, 2018).

[99] For examples of states opposed to a ban, see Statement by the United States, CCW GGE on Lethal Autonomous Weapons Systems, Geneva, April 13, 2018, https://geneva.usmission.gov/2018/04/17/u-s-statement-on-the-outcome-of-... (accessed July 22, 2018); Statement by Israel, CCW Meeting of States Parties, Geneva, November 15, 2017, http://embassies.gov.il/UnGeneva/priorities-statements/Disarmament/Docum... (accessed July 22, 2018).

[100] Charli Carpenter, “US Public Opinion Poll on Lethal Autonomous Weapons,” June 2013, http://duckofminerva.dreamhosters.com/wp-content/uploads/2013/06/UMass-S... (accessed July 22, 2018); Open Roboethics Initiative, “The Ethics and Governance of Lethal Autonomous Weapons Systems: An International Public Opinion Poll,” November 9, 2015, http://www.openroboethics.org/wp-content/uploads/2015/11/ORi_LAWS2015.pdf (accessed July 22, 2018); Chris Jackson, “Three in Ten Americans Support Using Autonomous Weapons,” Ipsos, February 7, 2017, https://www.ipsos.com/en-us/news-polls/three-ten-americans-support-using... (accessed July 22, 2018).

[101] Of the 55 percent, 39 percent said they “strongly oppose” and 16 percent “somewhat oppose” using fully autonomous weapons. To assess the effects of language, the survey described the weapons as “completely autonomous [robotic weapons/lethal robots].” Charli Carpenter, “US Public Opinion on Autonomous Weapons.” These figures are based on a nationally representative online poll of 1,000 Americans conducted by YouGov.com. Respondents were an invited group of Internet users (YouGov Panel) matched and weighted on gender, age, race, income, region, education, party identification, voter registration, ideology, political interest, and military status. The margin of error for the results is +/- 3.6 percent.
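As a rough check on these figures (our own illustrative arithmetic, not a calculation taken from the survey documentation): for a simple random sample of n = 1,000, the worst-case margin of error at 95 percent confidence is

\[
\mathrm{MoE} = z\sqrt{\frac{\hat{p}(1-\hat{p})}{n}} = 1.96\sqrt{\frac{0.5 \times 0.5}{1000}} \approx 0.031 \quad (\pm 3.1\ \text{percent}),
\]

so the reported +/- 3.6 percent is plausibly this simple-random-sample margin inflated by a modest design effect from the weighting described above, on the order of (3.6/3.1)^2 ≈ 1.35.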

[102] Ibid.

[103] Ibid. According to the survey, 33 percent said they “strongly support” a campaign and 20 percent said they “somewhat support” it.

[104] Willem Staes, “Nieuw onderzoek: 60% van de Belgen wil internationaal verbod op ‘killer robots,’” Pax Christi Vlaanderen, July 3, 2018, https://www.paxchristi.be/nieuws/nieuw-onderzoek-60-van-de-belgen-wil-in... (accessed July 22, 2018) (unofficial translation).

[105] Open Roboethics Initiative, “The Ethics and Governance of Lethal Autonomous Weapons Systems: An International Public Opinion Poll,” pp. 4, 8.

[106] Ibid., p. 7.

[107] Ibid.

[108] Chris Jackson, “Three in Ten Americans Support Using Autonomous Weapons.”

[109] Ibid. The countries most strongly opposed to the use of these weapons were Russia (69% opposed), Peru (67% opposed), Spain (66% opposed), and Argentina (66% opposed). The countries that viewed their use somewhat favorably were India (60% in favor), China (47% in favor), and the US (34% in favor).

[110] Campaign to Stop Killer Robots, “Who We Are,” April 2018, https://www.stopkillerrobots.org/coalition/ (accessed July 22, 2018).

[111] Ibid.

[112] See Campaign to Stop Killer Robots, “Bibliography,” https://www.stopkillerrobots.org/bibliography/ (accessed July 22, 2018).

[113] Statement by Human Rights Watch, CCW GGE on Lethal Autonomous Weapons Systems, Geneva, April 9, 2018, https://www.hrw.org/news/2018/04/09/statement-human-rights-watch-convent....

[114] Statement by the ICRC, CCW GGE on Lethal Autonomous Weapons Systems, Geneva, November 15, 2017, https://www.icrc.org/en/document/expert-meeting-lethal-autonomous-weapon... (accessed July 22, 2018).

[115] Statement by the ICRC, CCW GGE on Lethal Autonomous Weapons Systems, Geneva, April 11, 2018, http://reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/20..., p. 1.

[116] Ibid.

[117] Ibid.

[118] Nobel Women's Initiative, “Nobel Peace Laureates Call for Preemptive Ban on Killer Robots,” May 12, 2014, http://nobelwomensinitiative.org/nobel-peace-laureates-call-for-preempti... (accessed July 22, 2018).

[119] Ibid.

[120] PAX, “Religious Leaders Call for a Ban on Killer Robots,” November 12, 2014, https://www.paxforpeace.nl/stay-informed/news/religious-leaders-call-for... (accessed July 22, 2018).

[121] Campaign to Stop Killer Robots, “Who Supports the Call to Ban Killer Robots?” June 2017, http://www.stopkillerrobots.org/wp-content/uploads/2013/03/KRC_ListBanEn... (accessed July 22, 2018), p. 1.

[122] PAX, “Religious Leaders Call for a Ban on Killer Robots.”

[123] Frank Sauer, International Committee for Robot Arms Control (ICRAC), “The Scientists’ Call … to Ban Autonomous Lethal Robots,” November 11, 2012, https://www.icrac.net/the-scientists-call/ (accessed July 22, 2018).

[124] Campaign to Stop Killer Robots, “Scientists Call for a Ban,” October 16, 2013, https://www.stopkillerrobots.org/2013/10/scientists-call/ (accessed July 22, 2018). The signatories hailed from 37 different countries and included numerous university professors.

[125] “Computing Experts from 37 Countries Call for Ban on Killer Robots,” ICRAC press release, October 15, 2013, https://www.icrac.net/wp-content/uploads/2018/06/Scientist-Call_Press-Re... (accessed July 22, 2018), p. 1; ICRAC, “As Computer Scientists, Engineers, Artificial Intelligence Experts, Roboticists and Professionals from Related Disciplines, We Call for a Ban on the Development and Deployment of Weapon Systems in which the Decision to Apply Violent Force is Made Autonomously,” June 2018, https://www.icrac.net/wp-content/uploads/2018/06/List-of-Signatories-ICR... (accessed July 22, 2018).

[126] The signatories included: Tesla CEO Elon Musk, Apple co-founder Steve Wozniak, Skype co-founder Jaan Tallinn, Professor Stephen Hawking, Professor Noam Chomsky, DeepMind leader Demis Hassabis (and 21 of his engineers), IBM Watson design leader Kathryn McElroy, Facebook Head of AI Research Yann LeCun, Twitter CEO Jack Dorsey, Nobel Laureate in physics Frank Wilczek, former Canadian Minister of Defense Hon. Jean Jacques Blais, and numerous professors. Future of Life Institute, “Autonomous Weapons: An Open Letter from AI & Robotics Researchers.”

[127] Ibid.

[128] Ibid.

[129] Future of Life Institute, “Lethal Autonomous Weapons Pledge,” https://futureoflife.org/lethal-autonomous-weapons-pledge/ (accessed July 30, 2018).

[130] Ibid.

[131] Ariel Conn, “AI Companies, Researchers, Engineers, Scientists, Entrepreneurs, and Others Sign Pledge Promising Not to Develop Lethal Autonomous Weapons,” Future of Life Institute press release, July 18, 2018, https://futureoflife.org/2018/07/18/ai-companies-researchers-engineers-s... (accessed July 30, 2018).

[132] Meghan Hennessy, “Clearpath Robotics Takes Stance Against ‘Killer Robots,’” Clearpath Robotics press release, August 13, 2014, https://www.clearpathrobotics.com/2014/08/clearpath-takes-stance-against... (accessed July 22, 2018).

[133] Ibid.

[134] Ibid.

[135] Samuel Gibbs, “Elon Musk Leads 116 Experts Calling for Outright Ban of Killer Robots,” Guardian, August 20, 2017, https://www.theguardian.com/technology/2017/aug/20/elon-musk-killer-robo... (accessed July 22, 2018).

[136] Future of Life Institute, “An Open Letter to the United Nations Convention on Certain Conventional Weapons,” https://futureoflife.org/autonomous-weapons-open-letter-2017/ (accessed July 22, 2018).

[137] Ibid.

[138] Barbara Booth, “‘Autonomous Weapons are among the World's Dumbest Ideas’: A.I. CEO,” CNBC, March 15, 2018, https://www.cnbc.com/2018/03/15/autonomous-weapons-are-among-the-worlds-dumbest-ideas-a-i-ceo.html (accessed July 22, 2018).

[139] Parliament of the United Kingdom, Science and Technology Committee, Robotics and Artificial Intelligence Inquiry, “Written Evidence Submitted by Google DeepMind, ROB0062,” May 2016, http://data.parliament.uk/writtenevidence/committeeevidence.svc/evidence... (accessed July 22, 2018), para. 5.3.

[140] Ibid.

[141] Scott Shane and Daisuke Wakabayashi, “‘The Business of War’: Google Employees Protest Work for the Pentagon,” New York Times, April 4, 2018, https://www.nytimes.com/2018/04/04/technology/google-letter-ceo-pentagon... (accessed July 22, 2018).

[142] ICRAC, “Open Letter in Support of Google Employees and Tech Workers,” June 25, 2018, https://www.icrac.net/open-letter-in-support-of-google-employees-and-tec... (accessed July 22, 2018).

[143] Sundar Pichai, “AI at Google: Our Principles,” https://www.blog.google/technology/ai/ai-principles/ (accessed July 30, 2018). Other, similar sets of ethical guidelines that seek to ensure AI benefits, rather than harms, human beings include: “Montreal Declaration for Responsible AI,” https://www.montrealdeclaration-responsibleai.com/the-declaration (accessed July 22, 2018); Future of Life, “Asilomar AI Principles,” 2017, https://futureoflife.org/ai-principles/ (accessed July 24, 2018) (signed by 1,273 AI and robotics researchers and 2,541 others); George Dvorsky, “UK Government Proposes Five Basic Principles to Keep Humans Safe From AI,” Gizmodo, April 16, 2018, https://gizmodo.com/uk-government-proposes-five-basic-principles-to-keep... (accessed July 22, 2018).

[144] “Norwegian Fund Considers Killer Robots,” Campaign to Stop Killer Robots press release, March 19, 2016, https://www.stopkillerrobots.org/2016/03/norwayfund/ (accessed July 22, 2018).

[145] “Ban on Killer Robots Gaining Ground,” PAX press release, April 16, 2018, https://www.paxforpeace.nl/stay-informed/news/ban-on-killer-robots-gaini... (accessed August 3, 2018).

[146] Campaign to Stop Killer Robots, “Country Views on Killer Robots,” April 13, 2018, https://www.stopkillerrobots.org/wp-content/uploads/2018/04/KRC_CountryV... (accessed July 22, 2018). These nations are: Algeria, Argentina, Austria, Bolivia, Brazil, Chile, China (calls for a ban on use only), Colombia, Costa Rica, Cuba, Djibouti, Ecuador, Egypt, Ghana, Guatemala, the Holy See, Iraq, Mexico, Nicaragua, Pakistan, Panama, Peru, the State of Palestine, Uganda, Venezuela, and Zimbabwe.

[147] Statement by Venezuela on behalf of the Non-Aligned Movement, CCW GGE on Lethal Autonomous Weapons Systems, Geneva, March 28, 2018.

[148] See generally Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions, Christof Heyns, Lethal Autonomous Robotics and the Protection of Life.

[149] Ibid., para. 89.

[150] Ibid., para. 94.

[151] Ibid., paras. 89, 100.

[152] Ibid., para. 113.

[153] Joint Report of the Special Rapporteur on the Rights to Freedom of Peaceful Assembly and of Association and the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions on the Proper Management of Assemblies to the UN Human Rights Council, A/HRC/31/66, February 4, 2016, http://www.refworld.org/docid/575135464.html (accessed August 6, 2018), para. 67(f).

[154] Campaign to Stop Killer Robots, “Chronology,” https://www.stopkillerrobots.org/chronology/ (accessed July 22, 2018).

[155] Statement by Ecuador, CCW Meeting of States Parties, Geneva, October 25, 2013, http://www.reachingcriticalwill.org/images/documents/Disarmament-fora/1c... (accessed July 22, 2018), p. 2.

[156] Statement by Indonesia, Interactive Dialogue with the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions, UN Human Rights Council, 23rd Session, Geneva, May 30, 2013, http://stopkillerrobots.org/wp-content/uploads/2013/05/HRC_Indonesia_09_... (accessed July 22, 2018).

[157] Statement by Russia, Interactive Dialogue, UN Human Rights Council, 23rd Session, Geneva, May 30, 2013, translated in Campaign to Stop Killer Robots, “Report on Outreach on the UN Report on ‘Lethal Autonomous Robotics,’” http://stopkillerrobots.org/wp-content/uploads/2013/03/KRC_ReportHeynsUN... (accessed July 22, 2018), p. 19.

[158] Statement by Pakistan, Interactive Dialogue, UN Human Rights Council, 23rd Session, Geneva, May 30, 2013, http://stopkillerrobots.org/wp-content/uploads/2013/05/HRC_Pakistan_09_3... (accessed July 22, 2018), p. 2. See also Campaign to Stop Killer Robots, “Chronology.”

[159] Statement by Brazil, Interactive Dialogue, UN Human Rights Council, 23rd Session, Geneva, May 30, 2013, http://stopkillerrobots.org/wp-content/uploads/2013/05/HRC_Brazil_09_30M... (accessed July 22, 2018), p. 2.

[160] With regard to security concerns, Indonesia noted the potential impacts on “international stability and security” that could arise from the use of such weapons. Statement by Indonesia, Interactive Dialogue, UN Human Rights Council, 23rd Session, Geneva, May 30, 2013. The Group of Latin American and Caribbean Countries (GRULAC) emphasized that “these systems could lead to a ‘normalization of conflict’, and to a possible arms race that would create divisions among States and weaken international law.” Statement by Argentina on behalf of GRULAC, Interactive Dialogue, UN Human Rights Council, 23rd Session, Geneva, May 30, 2013, http://stopkillerrobots.org/wp-content/uploads/2013/05/HRC_Argentina_09_... (accessed July 22, 2018), p. 2.

[161] Campaign to Stop Killer Robots, “Chronology.”

[162] Ibid.

[163] Ibid.

[164] Statement by Austria, CCW GGE on Lethal Autonomous Weapons Systems, Geneva, April 9-13, 2018, https://www.unog.ch/80256EDD006B8954/(httpAssets)/AA0367088499C566C1258278004D54CD/$file/2018_LAWSGeneralExchang_Austria.pdf (accessed July 22, 2018), p. 1.

[165] Summary of Statement by Brazil, CCW Informal Meeting of Experts on Lethal Autonomous Weapons Systems, Geneva, May 13, 2014, in Campaign to Stop Killer Robots, “Report on Activities: CCW Informal Meeting of Experts on Lethal Autonomous Weapons Systems,” May 13-16, 2014, http://www.stopkillerrobots.org/wp-content/uploads/2013/03/KRC_CCWreport... (accessed July 22, 2018), p. 27.

[166] Summary of Statement by Mexico, CCW Informal Meeting of Experts on Lethal Autonomous Weapons Systems, Geneva, May 13, 2014, in Campaign to Stop Killer Robots, “Report on Activities,” p. 30.

[167] Summary of Statement by Russia, CCW Informal Meeting of Experts on Lethal Autonomous Weapons Systems, Geneva, April 16, 2015, in Campaign to Stop Killer Robots, “Report on Activities,” p. 18.

[168] Statement by the United States, CCW Informal Meeting of Experts on Lethal Autonomous Weapons Systems, Geneva, April 16, 2015, https://geneva.usmission.gov/2015/04/16/ccw-informal-meeting-of-experts-... (accessed July 22, 2018).

[169] Statement by the African Group, CCW GGE on Lethal Autonomous Weapons Systems, Geneva, April 9, 2018.

[170] Ibid.

[171] Statement by the Holy See, CCW GGE on Lethal Autonomous Weapons Systems, Geneva, April 9, 2018, http://reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/20... (accessed July 22, 2018), p. 1.

[172] Statement by Greece, CCW GGE on Lethal Autonomous Weapons Systems, Geneva, April 9, 2018, http://reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/20... (accessed July 22, 2018), p. 2.

[173] Statement by Pakistan, CCW GGE on Lethal Autonomous Weapons Systems, Geneva, April 9, 2018, http://reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/20... (accessed July 22, 2018), p. 2.

[174] Statement by the European Union, CCW GGE on Lethal Autonomous Weapons Systems, Geneva, April 9, 2018, http://reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/20... (accessed July 22, 2018), p. 2.

[175] “Belgium Votes to Ban Killer Robots,” PAX press release, July 23, 2018, https://www.paxforpeace.nl/stay-informed/news/belgium-votes-to-ban-kille... (accessed August 9, 2018).

[176] “European Parliament Recommendation of 5 July 2018 to the Council on the 73rd session of the United Nations General Assembly,” P8_TA-PROV(2018)0312, adopted July 5, 2018, http://www.europarl.europa.eu/sides/getDoc.do?pubRef=-//EP//NONSGML+TA+P... (accessed July 24, 2018), art. 1(av).

[177] UN Secretary-General António Guterres, Securing Our Common Future: An Agenda for Disarmament (New York: UN Office of Disarmament Affairs, 2018), https://front.un-arm.org/documents/SG+disarmament+agenda_1.pdf (accessed July 22, 2018), p. 55.

[178] Future of Life Institute, “An Open Letter to the United Nations Convention on Certain Conventional Weapons.”

[179] Campaign to Stop Killer Robots, “Country Views on Killer Robots.”

Author: Human Rights Watch
Posted: January 1, 1970, 12:00 am

Growing opposition to fully autonomous weapons from various quarters shows how the public conscience supports banning weapons systems that lack meaningful human control. 

© 2018 Russell Christian/Human Rights Watch
 
(Geneva) – Basic humanity and the public conscience support a ban on fully autonomous weapons, Human Rights Watch said in a report released today. Countries participating in an upcoming international meeting on such “killer robots” should agree to negotiate a prohibition on the weapons systems’ development, production, and use.

The 46-page report, Heed the Call: A Moral and Legal Imperative to Ban Killer Robots, finds that fully autonomous weapons would violate what is known as the Martens Clause. This long-standing provision of international humanitarian law requires emerging technologies to be judged by the “principles of humanity” and the “dictates of public conscience” when they are not already covered by other treaty provisions.

“Permitting the development and use of killer robots would undermine established moral and legal standards,” said Bonnie Docherty, senior arms researcher at Human Rights Watch, which coordinates the Campaign to Stop Killer Robots. “Countries should work together to preemptively ban these weapons systems before they proliferate around the world.”

The 1995 preemptive ban on blinding lasers, which was motivated in large part by concerns under the Martens Clause, provides precedent for prohibiting fully autonomous weapons as they come closer to becoming reality.

The report was co-published with the Harvard Law School International Human Rights Clinic, for which Docherty is associate director of armed conflict and civilian protection.

Countries should:

- Begin negotiations in 2019 on a treaty preemptively banning fully autonomous weapons.
- Adopt national policies and laws prohibiting fully autonomous weapons.

Technology experts and companies should:

- Support the call for a ban and pledge not to help develop fully autonomous weapons.

More than 70 governments will convene at the United Nations in Geneva from August 27 to 31, 2018, for their sixth meeting since 2014 on the challenges raised by fully autonomous weapons, also called lethal autonomous weapons systems. The talks under the Convention on Conventional Weapons, a major disarmament treaty, were formalized in 2017, but they are not yet directed toward a specific goal.

Human Rights Watch and the Campaign to Stop Killer Robots urge states party to the convention to agree to begin negotiations in 2019 for a new treaty that would require meaningful human control over weapons systems and the use of force. Fully autonomous weapons would select and engage targets without meaningful human control.

To date, 26 countries have explicitly supported a prohibition on fully autonomous weapons. Thousands of scientists and artificial intelligence experts, more than 20 Nobel Peace Laureates, and more than 160 religious leaders and organizations of various denominations have also demanded a ban. In June, Google released a set of ethical principles that includes a pledge not to develop artificial intelligence for use in weapons.

At the Convention on Conventional Weapons meetings, almost all countries have called for retaining some form of human control over the use of force. The emerging consensus for preserving meaningful human control, which is effectively equivalent to a ban on weapons that lack such control, reflects the widespread opposition to fully autonomous weapons.

Human Rights Watch and the Harvard clinic assessed fully autonomous weapons under the core elements of the Martens Clause. The clause, which appears in the Geneva Conventions and is referenced by several disarmament treaties, is triggered by the absence of specific international treaty provisions on a topic. It sets a moral baseline for judging emerging weapons.

The groups found that fully autonomous weapons would undermine the principles of humanity, because they would be unable to apply either compassion or nuanced legal and ethical judgment to decisions to use lethal force. Without these human qualities, the weapons would face significant obstacles in ensuring the humane treatment of others and showing respect for human life and dignity.

Fully autonomous weapons would also run contrary to the dictates of public conscience. Governments, experts, and the broader public have widely condemned the loss of human control over the use of force.

Partial measures, such as regulations or political declarations short of a legally binding prohibition, would fail to eliminate the many dangers posed by fully autonomous weapons. In addition to violating the Martens Clause, the weapons raise other legal, accountability, security, and technological concerns.

In previous publications, Human Rights Watch and the Harvard clinic have elaborated on the challenges that fully autonomous weapons would present for compliance with international humanitarian law and international human rights law, analyzed the gap in accountability for the unlawful harm caused by such weapons, and responded to critics of a preemptive ban.

The 26 countries that have called for the ban are: Algeria, Argentina, Austria, Bolivia, Brazil, Chile, China (use only), Colombia, Costa Rica, Cuba, Djibouti, Ecuador, Egypt, Ghana, Guatemala, the Holy See, Iraq, Mexico, Nicaragua, Pakistan, Panama, Peru, the State of Palestine, Uganda, Venezuela, and Zimbabwe.

The Campaign to Stop Killer Robots, which began in 2013, is a coalition of 75 nongovernmental organizations in 32 countries that is working to preemptively ban the development, production, and use of fully autonomous weapons. Docherty will present the report at a Campaign to Stop Killer Robots briefing for CCW delegates scheduled for August 28 at the United Nations in Geneva.

“The groundswell of opposition among scientists, faith leaders, tech companies, nongovernmental groups, and ordinary citizens shows that the public understands that killer robots cross a moral threshold,” Docherty said. “Their concerns, shared by many governments, deserve an immediate response.”

Posted: January 1, 1970, 12:00 am

Bonnie Docherty, senior researcher at Human Rights Watch, with a “friendly robot” at the launch of the Campaign to Stop Killer Robots, April 2013.

© 2013 Campaign to Stop Killer Robots

When drafting a treaty on the laws of war at the end of the 19th century, diplomats could not foresee the future of weapons development. But they did adopt a legal and moral standard for judging new technology not covered by existing treaty language.

This standard, known as the Martens Clause, has survived generations of international humanitarian law and gained renewed relevance in a world where autonomous weapons are on the brink of making their own determinations about whom to shoot and when. The Martens Clause calls on countries not to use weapons that depart “from the principles of humanity and from the dictates of public conscience.”

I was the lead author of a new report by Human Rights Watch and the Harvard Law School International Human Rights Clinic that explains why fully autonomous weapons would run counter to the principles of humanity and the dictates of public conscience. We found that to comply with the Martens Clause, countries should adopt a treaty banning the development, production and use of these weapons.

Representatives of more than 70 nations will gather from August 27 to 31 at the United Nations in Geneva to debate how to address the problems with what they call lethal autonomous weapon systems. These countries, which are parties to the Convention on Conventional Weapons, have discussed the issue for five years. My co-authors and I believe it is time they took action and agreed to start negotiating a ban next year.

Making rules for the unknowable

The Martens Clause provides a baseline of protection for civilians and soldiers in the absence of specific treaty law. The clause also sets out a standard for evaluating new situations and technologies that were not previously envisioned.

Fully autonomous weapons, sometimes called “killer robots,” would select and engage targets without meaningful human control. They would be a dangerous step beyond current armed drones because there would be no human in the loop to determine when to fire and at what target. Although fully autonomous weapons do not yet exist, China, Israel, Russia, South Korea, the United Kingdom and the United States are all working to develop them. They argue that the technology would process information faster and keep soldiers off the battlefield.

The possibility that fully autonomous weapons could soon become a reality makes it imperative for those and other countries to apply the Martens Clause and assess whether the technology would offend basic humanity and the public conscience. Our analysis finds that fully autonomous weapons would fail the test on both counts.

Principles of humanity

The history of the Martens Clause shows that it is a fundamental principle of international humanitarian law. Originating in the 1899 Hague Convention, versions of it appear in all four Geneva Conventions and Additional Protocol I. It is cited in numerous disarmament treaties. In 1995, concerns under the Martens Clause motivated countries to adopt a preemptive ban on blinding lasers.

The principles of humanity require humane treatment of others and respect for human life and dignity. Fully autonomous weapons could not meet these requirements because they would be unable to feel compassion, an emotion that inspires people to minimize suffering and death. The weapons would also lack the legal and ethical judgment necessary to ensure that they protect civilians in complex and unpredictable conflict situations.

In addition, as inanimate machines, these weapons could not truly understand the value of an individual life or the significance of its loss. Their algorithms would translate human lives into numerical values. By making lethal decisions based on such algorithms, they would reduce their human targets – whether civilians or soldiers – to objects, undermining their human dignity.

Dictates of public conscience

The growing opposition to fully autonomous weapons shows that they also conflict with the dictates of public conscience. Governments, experts and the general public have all objected, often on moral grounds, to the possibility of losing human control over the use of force.

To date, 26 countries have expressly supported a ban, including China. Most countries that have spoken at the U.N. meetings on conventional weapons have called for maintaining some form of meaningful human control over the use of force. Requiring such control is effectively the same as banning weapons that operate without a person who decides when to kill.

Thousands of scientists and artificial intelligence experts have endorsed a prohibition and demanded action from the United Nations. In July 2018, they issued a pledge not to assist with the development or use of fully autonomous weapons. Major corporations have also called for the prohibition.

More than 160 faith leaders and more than 20 Nobel Peace Prize laureates have similarly condemned the technology and backed a ban. Several international and national public opinion polls have found that a majority of people who responded opposed developing and using fully autonomous weapons.

The Campaign to Stop Killer Robots, a coalition of 75 nongovernmental organizations from 42 countries, has led opposition by nongovernmental groups. Human Rights Watch, for which I work, co-founded and coordinates the campaign.

Other problems with killer robots

Fully autonomous weapons would threaten more than humanity and the public conscience. They would likely violate other key rules of international law. Their use would create a gap in accountability because no one could be held individually liable for the unforeseeable actions of an autonomous robot.

Furthermore, the existence of killer robots would spark widespread proliferation and an arms race – dangerous developments made worse by the fact that fully autonomous weapons would be vulnerable to hacking or technological failures.

Bolstering the case for a ban, our Martens Clause assessment highlights in particular how delegating life-and-death decisions to machines would violate core human values. Our report finds that there should always be meaningful human control over the use of force. We urge countries at this U.N. meeting to work toward a new treaty that would save people from lethal attacks made without human judgment or compassion. A clear ban on fully autonomous weapons would reinforce the longstanding moral and legal foundations of international humanitarian law articulated in the Martens Clause.

Author: Human Rights Watch
Posted: January 1, 1970, 12:00 am

A mother shows a picture of her son, who was detained by authorities in the northern Syrian province of Idlib, Syria, March 20, 2016. She has not heard any news about her son since then.

© 2016 Reuters

The endgame of the war in Syria is likely to come down to the northwestern province of Idlib, on the Turkish border, where some 2.3 million people are now trapped. As Russian-Syrian forces finish retaking the smaller southwestern province of Daraa, Idlib will be the last significant enclave in anti-government hands. If Russian-Syrian forces resume pummeling the city and surrounding area from the air, its civilians could face the horrible choice of bunkering in place or desperately trying to cross the Turkish border, which has been effectively closed since 2015.

Recently, however, there is some evidence that Russia might be willing to act more constructively. Russian officials have been seeking reconstruction aid for Syria from Western donors. According to sources close to United Nations-brokered negotiations among the parties to the Syrian conflict, Russia has floated the idea of stopping the military advance on Idlib, and perhaps handing over to Turkey a degree of control similar to that now exercised by Turkey over the neighboring region of Afrin, in return for a major Western commitment to help reconstruct Syria’s devastated cities and infrastructure. That may give the West new leverage to stop the atrocities taking place in Syria. The question is how to use it.

The Syrian war has been so extraordinarily ugly because Russian-Syrian air forces have been attacking civilians indiscriminately—and in some cases directly targeting them along with schools and hospitals. Syrian forces—with Russian backing—have also regularly used prohibited weapons such as cluster munitions, incendiary devices, and chemical weapons. There is compelling evidence that Russian forces themselves have used incendiary bombs.

The laws of war flatly prohibit these attacks, declaring them war crimes, but Presidents Bashar al-Assad of Syria and Vladimir Putin of Russia have, by their actions, ripped those laws up. This military conduct is a major reason why an estimated half a million people have been killed and more than 50 percent of Syria’s pre-war population has been displaced.

The Russian air force, in particular, has been an indispensable partner in creating this carnage, fighting alongside Syrian aircraft since 2015 and significantly bolstering the effectiveness of pro-government forces. It is a crucial reason why Assad, whose battlefield position had been tenuous, now looks likely to prevail.

Russia played an important part, for example, in the Syrian government’s aerial bombing campaigns, which led to the recapture, in 2016 and 2018 respectively, of Eastern Aleppo and Eastern Ghouta—two of the most populous enclaves once held by anti-government forces. Airstrikes killed hundreds of civilians in each area. Meanwhile, pro-government forces on the ground used crippling sieges to keep humanitarian supplies and aid workers from reaching civilians. The suffering and death toll were sufficient to force both enclaves to fall.

Russia’s official arms exporter, Rosoboronexport, is the biggest weapons supplier to the Syrian military. Russian diplomats give Assad overt political support, vetoing efforts to refer Syria to the International Criminal Court and trying to block, ultimately unsuccessfully, official investigations to identify which forces are using chemical weapons. Russian state-affiliated media such as RT and Sputnik have been at the forefront of whitewashing atrocities committed by the Russian-Syrian military alliance.

Until now, Idlib has provided a refuge for some Syrians. As anti-government enclaves fell, Syrian forces gave survivors a choice between the indignity of boarding the government’s notorious green buses to be dumped in Idlib and life in government-controlled areas, where suspected government opponents risked reprisals—detention, torture, and execution. For obvious reasons, many chose Idlib.

Roughly half of Idlib’s civilian population today is displaced from elsewhere in Syria. They are joined by a collection of anti-government militias that are themselves often abusive—committing summary executions, mistreating detainees, restricting humanitarian aid, and kidnapping for ransom. Now that Idlib is squeezed by pro-government troops on the ground and bombed from the air by the Russian-Syrian military alliance, there are few places left in Syria to flee.

In the past, civilians seeking to escape Russian-Syrian attacks might have crossed Idlib province’s border with Turkey, where some 3.5 million Syrian refugees now live. Since October 2015, however, Turkish security forces have routinely intercepted hundreds, and at times thousands, of asylum-seekers at the border and summarily deported them to Idlib. Fences line the border, and Turkish security forces have been firing at asylum-seekers trying to cross it irregularly, killing many and wounding others.

Whether Turkey will continue to keep its border closed to newcomers if thousands of Syrians are being slaughtered on the other side remains to be seen. But if Turkey were to experience a large new influx of asylum-seekers, few of whom would be eager to return to life under the Assad government, Ankara could face pressure from inside the country, where anti-refugee sentiment is growing, to suspend the deal it made with the European Union to curtail the flow of asylum-seekers across the Aegean Sea to Greece. Preventing a massacre in Idlib to begin with is a far better option.

Still, the Russian reconstruction proposal is controversial for several reasons, even if European governments could be persuaded to pay to rebuild cities that Russian and Syrian forces were largely responsible for destroying. There are significant concerns that, rather than allocating reconstruction aid on the basis of need, the Syrian government will prioritize areas where it perceives the residents remained loyal during the war. It has also divulged little about how reconstruction and recovery funds are being spent—a problem compounded by its insistence on restricting access for independent private humanitarian organizations to areas it has retaken. Syrian military and intelligence forces have already diverted large sums of humanitarian aid to line their own pockets and fund their operations, so there is every reason to fear that they would similarly divert reconstruction assistance.

What’s more, Russia has been silent about restrictions imposed by the Syrian government on the return of displaced residents to certain neighborhoods, even those that were retaken several years ago. Nor has Russia publicly opposed urban-planning schemes such as Law 10 of 2018, which allows the Syrian government to confiscate and redevelop residents’ property without due process or compensation. And Russia has done far too little to end Syria’s lawless and deadly detention practices—an enormous obstacle to return for the millions of Syrians who have fled the fighting.

In any event, the lives of Syrian civilians shouldn’t depend on payoffs and backroom deals. The alternative is to call out Russian complicity in Syria’s criminal military strategy and to vigorously press the Kremlin to end these atrocities.

Russia clearly has the necessary leverage over the Assad government to avoid a bloodbath in Idlib. Its aircraft could refuse to participate in joint offensives that indiscriminately bomb civilians and civilian infrastructure. Russia’s arms exporter could stop supplying weapons until the atrocities are halted. Its diplomats could stop shielding Syrian officials from international prosecution for their war crimes.

The key is getting Russia to use that leverage. Assad’s reputation is beyond repair—his main aspiration is to stay in power and avoid prosecution—but Putin still aspires to be treated as a respected global leader. He must be persuaded that he will fail in that quest so long as he continues to underwrite Assad’s atrocities.

This is not something that the US government under Donald Trump has shown any inclination to do, as illustrated recently by President Trump’s courting of Putin’s favor at the summit in Helsinki. The European Union is in a better position to act. If Russia wants better relations with the EU—any prospect of easing sanctions and improving its economic outlook—it should show a real willingness to end the bloodshed in Syria. It could start by protecting the 2.3 million Syrians in Idlib.

Author: Human Rights Watch
Posted: January 1, 1970, 12:00 am

Imports sit destroyed in a damaged warehouse at the port in Hodeida city, Yemen.

© November 2016 Kristine Beckerle / Human Rights Watch

(Beirut) – All parties to the conflict in Yemen should minimize civilian harm during military operations against the western port city of Hodeida, Human Rights Watch said today. The Saudi-led coalition, backed by the United States, and Yemeni government-aligned forces, backed by the United Arab Emirates, stepped up attacks on Houthi forces controlling the port in June 2018.

About 70 percent of Yemen’s aid and commercial imports enter through Hodeida and the nearby Saleef port, providing food, fuel, and medicine that the population needs for survival. To comply with international humanitarian law, or the laws of war, warring parties should take immediate steps to provide safe passage and adequate support to civilians fleeing fighting and facilitate the flow of aid and commercial supplies to the broader population and access by humanitarian agencies.

“The coalition and Houthi forces now fighting for Hodeida have atrocious records abiding by the laws of war,” said Sarah Leah Whitson, Middle East director at Human Rights Watch. “The UN Security Council should urgently warn senior officials on both sides to provide civilians access to desperately needed aid.”

Up to 600,000 civilians remain in the vicinity of densely populated Hodeida, which the Houthis took over in late 2014. On June 9, the UAE, which has led coalition operations along Yemen’s western coast, informed humanitarian organizations that they had three days to evacuate the city. The United Nations and other humanitarian organizations withdrew many of their staff, but the UN aid chief, Mark Lowcock, said, “It is our plan, intention and hope to stay and deliver.”

The parties to the conflict are bound under the laws of war to take all feasible precautions to minimize civilian casualties and damage to civilian structures. Since the coalition began its military campaign in Yemen in March 2015, coalition forces have committed numerous unlawful airstrikes and used inherently indiscriminate cluster munitions. The Houthis have unlawfully carried out indiscriminate missile strikes, used antipersonnel landmines, and deployed children in combat, as have pro-government Yemeni forces.

The UN has referred to Yemen as the world’s worst and largest humanitarian crisis. As of January 2018, at least eight million Yemenis were on the brink of famine, which is linked directly to the armed conflict. Hodeida’s port is the “single most important point of entry for the food and basic supplies needed to prevent famine and a recurrence of a cholera epidemic,” the UN has said.

UN Security Council Resolutions 2140 and 2216 established a sanctions regime that authorizes the Yemen Sanctions Committee to designate those responsible for serious abuses, including obstructing humanitarian assistance, for travel bans and asset freezes. The Security Council should announce that it will impose these penalties on individuals who commit offenses during the Hodeida offensive that meet the sanctions committee’s designation criteria, Human Rights Watch said.

The US became a party to the Yemen conflict soon after fighting began by providing intelligence and refueling for coalition bombing missions. In 2017, the US sent special forces to assist in locating and destroying Houthi caches of ballistic missiles and launch sites, according to the New York Times. The US has also sold thousands of advanced bombs and rockets to Saudi Arabia, which leads the coalition. The US could become complicit in violations of the laws of war by assisting coalition forces during the Hodeida offensive. The United Kingdom, France and other countries have also sold weapons to coalition forces. US and other foreign officials may be exposed to potential legal liability by selling weapons likely to be used in unlawful attacks.

“The battle for Hodeida could have a devastating impact on civilians both in the city and elsewhere in Yemen,” Whitson said. “Both sides need to seek to minimize civilian harm at all times, whether in carrying out attacks or by allowing families to flee to safety.”

A worker is pictured in a government hospital’s drug store in Sanaa, Yemen, August 16, 2017.

© 2017 Reuters/Khaled Abdullah

Facilitate Humanitarian Access
Health professionals and humanitarian workers in Yemen have described how parties to the conflict have restricted access to aid and essential goods, and the impact of those restrictions on the civilian population. The warring parties have repeatedly failed to abide by laws-of-war requirements to facilitate the delivery of aid to civilians, exacerbating Yemen’s humanitarian crisis.

Further disruption of humanitarian and commercial imports through Hodeida and Saleef will have predictably devastating consequences for civilians, Human Rights Watch said. It would also be disastrous if essential supplies could not reach their final destinations, including areas under Houthi or Yemeni government control.

On June 8, Lise Grande, Yemen’s UN humanitarian coordinator, said: “In a prolonged worst case, we fear that as many as 250,000 people may lose everything – even their lives.… Cutting off imports through Hodeidah for any length of time will put Yemen’s population at extreme, unjustifiable risk.”

Since the current conflict began, the coalition has imposed a naval and air blockade on Yemen that has severely and disproportionately restricted the flow of food, fuel, and medicine to civilians, in violation of international humanitarian law. The coalition closed all of Yemen’s entry points in response to a Houthi missile strike on Riyadh’s international airport in Saudi Arabia on November 4, 2017, keeping the Hodeida and Saleef ports closed for several weeks. The closures immediately and predictably exacerbated existing food shortages. The port of Aden in the south, which is controlled by the Yemeni government, does not have the capacity to receive the hundreds of thousands of metric tons of food, fuel, medicine, and other imported goods Yemenis depend on for survival.

Houthi forces have blocked and confiscated food and medical supplies and denied access to civilians in need. They have imposed onerous restrictions on aid workers, interfered with aid delivery, and restricted the movement of ill civilians. The cumulative impact of Houthi obstruction and interference with humanitarian assistance has significantly harmed the civilian population.

Parties to the conflict must facilitate the rapid passage of humanitarian aid for civilians in need and not arbitrarily interfere with it. They must also ensure the freedom of movement of humanitarian workers, which can only be restricted temporarily for reasons of imperative military necessity. Warring parties are also prohibited from carrying out attacks on objects that are indispensable to the civilian population, such as food stores or drinking water installations intended for civilians. The laws of war prohibit using starvation as a method of warfare, and require parties to a conflict not to “provoke [starvation] deliberately” or deliberately cause “the population to suffer hunger, particularly by depriving it of its sources of food or of supplies.”

All parties to the conflict should:

  • Take all necessary steps to keep Hodeida and Saleef ports open and operational without interruption to humanitarian and essential commercial goods;
  • Take all necessary steps to ensure that aid and essential commercial imports can be received at the ports and transported to civilians throughout Yemen;
  • Cease targeting, damaging, or destroying objects indispensable to the survival of the civilian population; and
  • Ensure the safety and security of humanitarian workers at all times.

Provide Safe Passage to Fleeing Civilians

Thousands of civilians have been displaced as fighting moved up Yemen’s western coast. In Aden in February, families displaced from their homes said they fled because they did not trust the warring parties to distinguish between civilians and combatants in their attacks. They described their fear when “the war came” and that the next mortar attack or airstrike would hit their or their neighbors’ homes. Some said that Houthi or UAE-backed fighters had restricted their flight.

A three-story house in Souq al-Hinood, a crowded residential area in Hodeida city, that was hit by an airstrike on the evening of September 21, 2016. A single bomb killed at least 28 civilians, including 8 children.

© 2016 Priyanka Motaparthy / Human Rights Watch

One man, “Talal,” said that after pro-government fighters pushed the Houthis out of Khawka in late 2017, about a dozen Houthi fighters deployed in the woods near his farm. He was worried that coalition forces would attack, but for two weeks the Houthis refused to allow his family to leave. “They said, ‘We will live together or we will die together.’ It was an order,” said Talal. “We heard the sound of airplanes all the time, stuck between the coalition and the Houthis.” His 4-year-old son “would wake up at night, crying, and yell, ‘The airplanes are coming!’” In December, after the Houthis had retreated to the woods, coalition airstrikes hit the family vehicle and their home. Eleven family members, including Talal’s nine children, were at the farm, and his 12-year-old son was severely wounded. They fled on motorbikes to the city of Hays.

Thousands of people displaced from fighting on the western coast have fled to Aden. “Ahmed,” 27, said UAE-backed forces at checkpoints occasionally refused entry to Aden or sought bribes from displaced people. His brother said, “Some of the people who are coming pay at the checkpoint, but it depends where you are from.” People from Houthi areas in the north pay larger bribes, he said. When Ahmed traveled from Taizz to Aden in early 2018, UAE-backed forces held him and a family traveling on the same bus at a checkpoint overnight with nothing to eat – the soldiers took their food and water. The soldiers let Ahmed go in the morning, but the family from the northern city of Saada, whom the soldiers accused of being Houthis, were still being interrogated when he left.

The laws of war require parties to the conflict to take all feasible steps to evacuate civilians from areas of fighting or where fighters are deployed and not to block or impede the evacuation of those wishing to leave. The creation of “humanitarian corridors” and the issuance of effective advance warnings of attack to the civilian population do not relieve attacking forces of their obligation to distinguish at all times between combatants and civilians and to take all feasible precautions to protect civilians.

Deliberately using the presence of civilians to protect military forces from attack is the war crime of “human shielding.”

The parties to the conflict should:

 

  • Remove civilians to the extent feasible from areas in the vicinity of military targets;
  • Allow civilians to flee areas of fighting for safety and to obtain aid; and
  • Never use civilians as “human shields.”

 

Avoid Civilian Casualties and Damaging Civilian Structures
Human Rights Watch and others have documented dozens of indiscriminate or disproportionate airstrikes by coalition forces and missile attacks by the Houthis that have caused thousands of civilian casualties. Coalition airstrikes have killed civilians in attacks on markets, homes, schools, hospitals, and mosques. In March, the UN human rights office said that coalition airstrikes caused 61 percent of verified civilian casualties in Yemen.

The coalition has repeatedly hit infrastructure critical to the civilian population, including damaging essential port infrastructure in Hodeida. The coalition has carried out airstrikes using explosive weapons with wide-area effect in densely populated areas, including in Hodeida city.

The coalition has repeatedly pledged to minimize civilian harm in its aerial campaign, but has continued to carry out unlawful attacks. In April, the coalition bombed a wedding in Hajjah, killing 22 civilians and wounding at least another 54, about half of whom were children. As in 23 other apparently unlawful coalition airstrikes that Human Rights Watch has documented, the coalition used a US weapon – a Joint Direct Attack Munition (JDAM) satellite guidance kit.

Houthi and government-aligned forces have carried out indiscriminate artillery attacks that have struck populated neighborhoods, killing and wounding civilians. The Houthis have repeatedly fired artillery indiscriminately into Yemeni cities, including after withdrawing from city centers. They have also failed to take all feasible precautions to protect civilians, deploying military forces in civilian facilities – including a school and a civilian detention facility.

A so-called dual-use object – one that normally serves both civilian and military purposes – may be attacked as a military target only if the expected concrete and direct military advantage from the attack is greater than the anticipated loss of civilian life and property. Any coalition attacks on the Hodeida and Saleef port facilities, which serve a military purpose, would need to take into account the extreme importance of those facilities to the civilian population and the anticipated impact of their destruction, Human Rights Watch said.

All parties to the conflict should:

  • Take all feasible measures to ensure the protection of civilians and civilian objects during military operations;
  • Act to ensure that attacks on military targets do not cause disproportionate harm to civilians and civilian objects;
  • Take all feasible steps, when operating in areas where civilians and combatants are commingled, to minimize the harm to civilians and civilian objects, including by selecting weapons and specific munitions to minimize civilian casualties; and
  • Cease use of munitions with wide-area destructive effect in heavily populated areas.

Cease Use of Landmines; Increase Efforts to Clear Explosive Remnants of War

Landmines have killed and maimed civilians, disrupted civilian life, hindered humanitarian access, and prevented civilians’ safe return home in affected areas in Yemen. In any battle for Hodeida, the Houthis, as well as pro-government forces, should not use antipersonnel mines, which pose a threat to civilians long after a conflict ends.

Houthi forces have repeatedly laid antipersonnel, antivehicle, and improvised mines as they withdrew from areas in Aden, Taizz, Marib and, more recently, along Yemen’s western coast. In February, the Yemen Executive Mine Action Center’s southern branch found antipersonnel mines, antivehicle mines, and other types of explosive ordnance, including improvised antipersonnel mines, near the towns of Mokha, Khawka, and Hays. One deminer said, “Areas that have been mined have never been signed [marked with warning signs].” Three years after the Houthis withdrew from Aden, mine survey and clearance operations there continue.

Use of antipersonnel landmines violates the laws of war, and those involved are committing war crimes. The indiscriminate use of antivehicle mines and failure to minimize civilian casualties also violate the laws of war. Yemen suffers from a shortage of equipped and trained personnel who can systematically clear landmines and explosive remnants of war.

All parties to the conflict should:

  • Cease using antipersonnel mines, destroy any antipersonnel mines in their possession, and appropriately punish those using them;
  • Raise awareness among the displaced about the threat of mines, including improvised mines, and develop capacity to rapidly clear homes and residential areas of mines and remnants of war to facilitate the return of the civilian population; and
  • Direct international assistance to equip, train, and assist clearance personnel to systematically survey, clear, and destroy Yemen’s mines and explosive remnants of war. International donors should provide assistance for landmine victims, including medical care, prosthetics, and ongoing rehabilitation.

Ensure that No Children Take Part in Fighting

Houthi forces, pro-government forces, and other armed groups have used child soldiers, who are estimated to make up one-third of the fighters in Yemen. By August 2017, the UN had documented 1,702 cases of child recruitment since March 2015, 67 percent of which were attributable to the formerly aligned Houthi-Saleh forces. About 100 of the verified cases involved children younger than 15; recruiting or using children under 15 is a war crime.

The Houthis have recruited, trained, and deployed children in the current conflict. “Yasser,” 20, said that the fear of forced recruitment drove him and two of his close friends to flee Sanaa in 2017. The Houthis had recruited his younger brother, 15 or 16, who then patrolled neighborhoods, worked at checkpoints, and received training. “He takes our father’s weapon when he goes.” The Houthis provided him with food and qat, but no pay. “He goes with seven of his friends around the same age,” Yasser said. Houthi supporters came to their neighborhood almost daily and talked to young men and boys about becoming fighters.

“Salem,” who was displaced by fighting on the western coast, said that children as young as 13 “were ruling us, they have the guns,” while his family remained in Houthi-controlled Mafraq. His family was first displaced to Houthi-controlled Ibb, but they fled to Aden in early 2018 after a local leader warned them the Houthis might forcibly recruit their children.

The local leader, who was also trying to send his children away to protect them, said the Houthis were going around to people’s homes, asking how many men and boys lived there, and registering names for fighting. Their neighbors said families had to provide either money or a person to fight, often a child: “They paid, or they take the kids, so they paid… They would take people who were 16, 17, if they can carry a weapon.” His relative, “Ali,” 16, said that in December 2017 Houthi men on a truck came to his secondary school. “I saw three Houthis with guns trying to take the kids… some of the kids were crying.” He and his friend scaled the school’s wall and ran away. Salem, Ali and about 70 other members of 13 related families were staying together in an abandoned school in Aden when Human Rights Watch interviewed them.

International law sets 18 as the minimum age for direct participation in hostilities, which includes using children as scouts or couriers or deploying them at checkpoints. Under Yemeni law, 18 is the minimum age for military service.

Forces that capture child soldiers should treat them as victims of rights violations, not simply as captured fighters, and abide by international standards. The Paris Principles on Children Associated with Armed Forces or Armed Groups state: “The release, protection and reintegration of children unlawfully recruited or used must be sought at all times, without condition and must not be dependent on any parallel release or demobilization process for adults.”

All parties to the conflict should:

  • Ensure that no children take part in fighting in Hodeida;
  • Clarify to affiliated forces that recruiting children is unlawful even if they are not serving a military function;
  • Appropriately investigate and punish officers who allow children in their units or are responsible for the war crime of recruiting or using children under 15; and
  • Provide former child soldiers all appropriate assistance for their physical and psychological recovery and social reintegration.

Investigate Unlawful Attacks

None of the countries that make up the Saudi-led coalition have publicly clarified which alleged unlawful attacks their forces participated in. The investigative body the coalition created in 2016 does not meet international standards and has absolved coalition forces of responsibility in the vast majority of attacks investigated. The coalition has not compensated victims of unlawful strikes or their families, as far as Human Rights Watch has been able to determine.

In response to Human Rights Watch letters in 2016 and 2017, the Houthi-controlled Foreign Affairs Ministry expressed a willingness to investigate reports of landmine use, but not until the conflict ended. Human Rights Watch has not been able to identify any concrete steps the Houthis have taken to investigate potentially unlawful attacks or hold anyone to account for violations.

Countries that are parties to the armed conflict should:

  • Impartially and transparently investigate credible reports of alleged violations of the laws of war and make public their findings. Individuals implicated in war crimes should be appropriately prosecuted;
  • Conduct investigations using a full range of tools, including interviews with witnesses, surveillance and targeting videos, and forensic analyses. Public findings should include accountability measures taken against individual personnel, redress provided to victims or their families, and an explanation of the process used;
  • Provide compensation for wrongful civilian deaths, injuries and harm. The coalition should develop effective systems for civilians to file claims for condolence or ex gratia payments and to evaluate the claims; and
  • The US, UK, and France should cease all weapons transfers to Saudi Arabia because of Saudi Arabia’s widespread violations of the laws of war, and weapons transfers to other coalition members that are likely to use them unlawfully.
Posted: January 1, 1970, 12:00 am

Thank you.

The requirement to destroy all stockpiled antipersonnel landmines within a firm and relatively short deadline, with no possibility of extension, is a remarkable provision of the Mine Ban Treaty. With a few notable and regrettable exceptions, States Parties have successfully implemented this obligation, collectively destroying more than 53 million stockpiled mines over the past two decades.

According to Landmine Monitor, all except two of the 33 states not party to the Mine Ban Treaty currently stockpile antipersonnel landmines. Yet these states have also been taking steps to destroy their stocks, and the collective estimate of antipersonnel mines stockpiled by them has decreased over the past 20 years from approximately 160 million to perhaps fewer than 50 million.

Today, up to six of the treaty’s 164 States Parties continue to stockpile antipersonnel mines. Greece, Oman, and Ukraine have formally declared stocks. Somalia and Sri Lanka may have stocks, but we await their formal declarations in their Article 7 transparency reports. Tuvalu is not likely to have stocks, but must also submit its transparency report.

The newest State Party, Palestine, has stated that it does not possess a stockpile and that it will not retain any mines for training purposes.

Three States Parties still must destroy a collective total of more than 5.5 million antipersonnel mines: Ukraine (4.9 million), Greece (640,761), and Oman (7,630).

We welcome the update from Oman today that it should complete destruction of its stockpiled antipersonnel mines by the end of this year.

We appreciate the update from Greece on its progress in stockpile destruction and cooperation with Bulgaria in this regard.

Yet we are deeply disappointed at Ukraine’s non-compliance with this core obligation of the Mine Ban Treaty. As of today, Ukraine has not submitted a transparency report for calendar year 2017, which makes it impossible to properly assess its progress. Most disturbingly, Ukraine also did not intervene on its stockpile destruction status at the 16MSP in December or here today at the intersessional meetings. We urge Ukraine to accelerate its efforts and complete the task as soon as possible, or by the Fourth Review Conference at the latest.

New State Party Sri Lanka is obligated to declare any stockpiles in its initial transparency report, which is due by 28 November 2018. Sri Lanka did not mention any stocks in its 2005 voluntary transparency report, but media reports indicate that some landmines have been destroyed from stocks in recent months.

Somalia stated in 2013 that it is working to “verify if in fact it holds antipersonnel mines in its stockpile.” Somalia has admitted that “large stocks are in the hands of former militias and private individuals.”

Additionally, non-state armed groups in Afghanistan, Iraq, Libya, Myanmar, Nigeria, Pakistan, Syria, Ukraine, and Yemen, as well as in Western Sahara, were reported to possess stocks of factory-made antipersonnel mines and/or components to manufacture improvised landmines.

Finally, the destruction of stockpiles of improvised antipersonnel mines requires some attention from States Parties in this forum, but more information needs to be researched and assessed. For example, do authorities in places where improvised mines are produced, stored, and used have any lessons learned to share? Are there requirements for possible international cooperation and assistance? Is treaty compliance even considered when improvised antipersonnel mines are found? Are there obstacles at the national level that hinder consistent transparency reporting? The ICBL welcomes further action on this issue.

Stockpile destruction has potentially saved millions of lives, as a mine destroyed from stocks can never claim a victim. Implementation of this treaty obligation has also potentially saved hundreds of millions of dollars, as it is much cheaper to destroy a stockpiled mine than to clear one or to care for its victim. States Parties should be proud of the achievements to date, but we urge them to work collectively to help ensure that those yet to complete this essential task do so as swiftly as possible.

Thank you.

Posted: January 1, 1970, 12:00 am

Thank you for the floor.

We thank the Committee on Cooperative Compliance for its important work, and also thank it for the opportunity for the ICBL and its members to contribute to the process.

This is a good time to remind ourselves how special this Convention is, and how special the cooperative compliance approach is. There are lessons to be learned for the many nations that are struggling to deal with Syria’s willful violations of the Chemical Weapons Convention, and for the many nations that are questioning how to deal with compliance in a future world of autonomous weapons.

The most serious compliance concern, of course, is the possible use of antipersonnel mines by a State Party. There has been one confirmed instance of this since entry into force: by Yemen in 2011-2012. We have just heard from the Committee about its actions with respect to Yemen, as well as about allegations of use by three other States Parties: South Sudan, Sudan, and Ukraine.

It is worth noting that there have been no new allegations of use by these States Parties or any other States Parties since the Committee began its inquiries, some four years ago. After receiving a written report from South Sudan, the Committee is recommending “case closed” status for that investigation, just as it did previously for Turkey.

It is unfortunate that we did not hear updates today from Sudan, Ukraine, and Yemen. Ukraine has not provided an update on its efforts to investigate allegations of use since February 2017, and Yemen has not done so since June 2017. Sudan told the Committee earlier this year that it still needs to investigate allegations of use in three regions. Full transparency with respect to these investigations and their findings is crucial.

The Compliance Committee has noted that in all of these cases the mined areas lie under the States Parties’ jurisdiction but outside their control, and that the cases will have to remain open until those states conclude appropriate investigations in those areas.

While the most serious, use allegations are by no means the only compliance issue of concern.

Greece and Ukraine have still not completed stockpile destruction, long after missing their deadlines.

We must again ask, why are States Parties not asking questions of those that are keeping mines under the Article 3 exception without ever using them for any of the permitted purposes? These are in essence stockpiled mines, not mines retained for training or development.

On Article 5, there are far too many mine clearance extension requests, and too little respect for the “as soon as possible” requirement and the ten-year deadline. The most egregious case, of course, is that of Ukraine. Ukraine has been in violation of the treaty since 1 June 2016 for missing its clearance deadline without having requested an extension in time. We continue to hope Ukraine will move to remedy this situation as soon as possible, as it is clearly in its own best interests to do so. As Switzerland noted, this situation not only affects Ukraine but undermines the credibility and integrity of the Convention.

Ukraine has the dubious distinction of being in double violation for missing both its stockpile destruction and mine clearance deadlines. But while States Parties and the ICBL have acknowledged multiple understandable reasons and shared blame for missing the stockpile destruction deadline, such is not the case for clearance and the failure to submit an extension request. Instead, Ukraine has apparently made a decision to ignore a legal requirement and to ignore the multitude of entreaties from States Parties to urgently submit an extension request.

Extension requests are required, and submission of a request is not a matter for negotiation or for pre-conditions. Ukraine cannot unilaterally declare that its obligation no longer exists; no reservations are allowed under this treaty. Given the very difficult conflict situation it is enduring, it is hard to understand why Ukraine is needlessly bringing harsh criticism upon itself by failing to abide by the treaty’s legal requirements. Perhaps the time has come to ask whether Ukraine is willfully violating the treaty; that is, whether it believes it gains some benefit from its failure to submit an extension request. We certainly hope not.

Turning to another compliance matter, the rate of compliance with the legal obligation of transparency reporting remains disturbingly low, at around 40 percent.

In closing, it is vital to promote compliance with the norm being established by the Mine Ban Treaty: that there should not be any use of antipersonnel mines by any actor under any circumstance.

It is striking that thus far in 2018 there has been no confirmed use of antipersonnel mines by any government force. We are investigating allegations concerning several past users, such as Myanmar and Syria.

Non-state armed groups have used antipersonnel mines in a significant number of countries – mostly improvised antipersonnel mines, also called victim-activated improvised explosive devices, which are prohibited by the Mine Ban Treaty. There have been incidents of use or unconfirmed allegations of use by non-state armed groups in Afghanistan, Cameroon, DR Congo, Iraq, Libya, Myanmar, Nigeria, Pakistan, the Philippines, Syria, Ukraine, and Yemen. We continue to assess these situations.

Improvised antipersonnel mines can also be a compliance issue. States Parties need to treat them as they do other antipersonnel mines: report areas contaminated by them in their transparency reports and report on clearance in accordance with treaty-mandated deadlines.

States Parties should condemn any new use of improvised antipersonnel mines by non-state armed groups as well as government forces, and States Parties should seek out new ways to stigmatize and stop the use of improvised antipersonnel mines.

Thank you.

Posted: January 1, 1970, 12:00 am