Stephen Goose, director of Human Rights Watch's Arms Division, was instrumental in bringing about the 2008 convention banning cluster munitions, the 1997 treaty banning antipersonnel mines, the 1995 protocol banning blinding lasers, and the 2003 protocol requiring clean-up of explosive remnants of war. He and Human Rights Watch co-founded the International Campaign to Ban Landmines (ICBL), which received the 1997 Nobel Peace Prize. Goose created the ICBL’s Landmine Monitor initiative, the first time that non-governmental organizations around the world have worked together in a sustained and coordinated way to monitor compliance with an international disarmament or humanitarian law treaty. In 2013, he and Human Rights Watch co-founded the Campaign to Stop Killer Robots. Before joining Human Rights Watch in 1993, Goose was a US congressional staffer and a researcher at the Center for Defense Information. He has a master's degree in International Relations from the Johns Hopkins School of Advanced International Studies and a B.A. in History from Vanderbilt University.


Saudi-led coalition aircraft struck three apartment buildings in Faj Attan, a densely populated neighborhood in Sanaa, on August 25, 2017. Two of the buildings were completely destroyed and the third suffered extensive damage.

© 2017 Mohammed al-Mekhlafi

(Paris) – President Emmanuel Macron of France should raise serious concerns with Abu Dhabi’s crown prince regarding laws-of-war violations in Yemen, Human Rights Watch said today. Crown Prince Mohammed bin Zayed al Nahyan of the United Arab Emirates (UAE) will visit Paris on November 21, 2018.

The UAE plays a prominent role in the Saudi-led coalition’s military operations in Yemen. Since March 2015, the coalition has indiscriminately bombed homes, markets, and schools; impeded the delivery of humanitarian aid; and used widely banned cluster munitions. Human Rights Watch has documented nearly 90 apparently unlawful coalition attacks, some of them likely war crimes. The UAE and UAE-led proxy forces have arbitrarily detained, forcibly disappeared, and tortured Yemenis in southern and eastern Yemen, including Yemeni activists who have criticized coalition abuses.

“As the UAE’s de facto leader and deputy commander of its armed forces, Crown Prince Mohammed bin Zayed al Nahyan could have acted to stop grave abuses in Yemen, but instead war crimes have mounted,” said Bénédicte Jeannerod, France director at Human Rights Watch. “If President Macron is truly concerned about the humanitarian crisis in Yemen, he should tell the crown prince that France will stop selling weapons to the UAE if there’s a real risk of their unlawful use.”

Despite Saudi Arabia’s and the UAE’s records of abuse, France, along with the United States and the United Kingdom, continues to sell weapons to both countries. In June, the French newspaper Le Figaro reported that French special forces were on the ground in Yemen, alongside UAE forces.

Macron should press the UAE to investigate alleged serious violations by its armed forces and Yemeni forces it supports, to appropriately prosecute those responsible for war crimes, and to provide reparation to victims of violations, Human Rights Watch said. France should stop supplying weapons and munitions to the UAE if there is a substantial risk that these arms are being used in Yemen to commit or facilitate serious violations of international humanitarian law or international human rights law.

Despite leading considerable efforts to present the UAE as progressive and tolerant, Crown Prince Mohammed bin Zayed al Nahyan, the nation’s de facto leader, has largely failed to improve his country’s human rights record.

Domestically, UAE authorities have carried out a sustained assault on freedom of expression and association since 2011. In 2014, the UAE issued a counterterrorism law that gives authorities the power to prosecute peaceful critics, political dissidents, and human rights activists as terrorists. UAE residents who have spoken about human rights issues are at serious risk of arbitrary detention, enforced disappearance, imprisonment, and torture. Many are serving long prison terms or have left the country under pressure.

In March 2017, the UAE detained Ahmed Mansoor, an award-winning human rights defender, on speech-related charges and held him incommunicado for more than a year. He was sentenced to 10 years in prison on May 29, 2018 for crimes that appear to violate his right to free expression.

UAE courts also imposed a 10-year prison sentence in March 2017 on a prominent academic, Nasser bin Ghaith, whom authorities forcibly disappeared in August 2015, for charges that included peaceful criticism of the UAE and Egyptian authorities.

On October 4, the European Parliament adopted a strongly worded resolution calling for the immediate release of Mansoor and all other “prisoners of conscience” in the UAE. The resolution expressed concern that “attacks on members of civil society including efforts to silence, imprison, or harass human rights activists, journalists, lawyers, and others has become increasingly common in recent years.” It said that European institutions should make respect for human rights activists “a precondition to any further development of relation between the EU and UAE.”

In addition, despite some reforms, many low-paid migrant workers remain acutely vulnerable to forced labor. The kafala (visa-sponsorship) system ties migrant workers to their employers. Those who leave their employers can face punishment for “absconding,” including fines, prison, and deportation. A 2017 law extended key labor protections to domestic workers, previously excluded from such guarantees, but its provisions remain weaker than those of the country’s national labor law.

Yet over the past several years, the UAE and France have strengthened their bilateral relations across a range of areas, including security, trade, and cultural exchanges. In 2017, France increased its arms sales to the UAE and opened the Louvre Abu Dhabi museum amid serious concerns regarding labor abuses in building the museum. On October 11, the UAE joined the International Organization of the Francophonie, which promotes the spread of French language and values, as an associate member, although human rights and democratic principles are at the heart of the organization’s charter.

“By failing to address the UAE authorities’ serious rights violations in Yemen, France risks glossing over a dark reality,” Jeannerod said. “Despite outward appearances, the UAE has repeatedly shown itself to be resistant to improving its human rights record at home and abroad.” 


Remnant of a wing assembly from a US-made GBU-12 Paveway II laser-guided 500-pound bomb, found at the Arhab water drilling site, Sanaa governorate, where at least 31 civilians were killed in an airstrike on September 10, 2016. According to the manufacturing date and the national stock number, this wing assembly was produced by Raytheon Company, a US defense contractor, in October 2015.

© 2016 Priyanka Motaparthy / Human Rights Watch

On November 11, 30 senior Obama administration officials issued a statement calling on the Trump administration to end all support for Saudi Arabia in the war in Yemen. This was a positive and thoughtful effort, given America’s participation in a war that has had catastrophic outcomes for the people of Yemen. But it was, ultimately, a failed reckoning with the Obama administration’s role in risking American complicity in Saudi-led coalition abuses in the first place.

The statement by the former senior officials attempts to acknowledge that America’s participation in the war — providing intelligence, refueling, and logistical assistance to the Saudi-led coalition — has clearly proved a mistake, given the coalition’s failure to limit its myriad violations and end the war. But they justify the Obama administration’s initial decision to support the war as based on “a legitimate threat posed by missiles on the Saudi border and the Houthi overthrow of the Yemeni government, with support from Iran.”

A more honest reckoning with how the US got to where it is in this war would start with a fuller admission of the Obama administration’s motivations and mistakes in joining it. In their letter, the Obama officials try to distinguish their administration’s support for the war as “conditional,” versus Trump’s “unconditional” support. Of course, this matters little to the Yemeni people, because the outcome has been the same: death and destruction, very often by US bombs.

Video

Video: US-Made Bombs Used in Unlawful Airstrikes in Yemen

The Saudi Arabia-led coalition killed several dozen civilians in three apparently unlawful airstrikes in September and October 2016. The coalition’s use of United States-supplied weapons in two of the strikes, including a bomb delivered to Saudi Arabia well into the conflict, puts the US at risk of complicity in unlawful attacks.

The Obama administration’s stated justifications for joining the war effort obscure what actually led it to war. Other Obama administration officials had already stated that their support for the war, coupled with a $1 billion arms deal, was first and foremost payback for Saudi Arabia’s grudging tolerance of the Iran nuclear deal, and reassurance that the US remained a reliable ally despite that deal. The extent of Iranian support to the Houthis has been debated, of course, but the evidence is thin and murky; better known is the fact that the Houthis are a fiercely independent group with a long history of waging war in Yemen.

As the war has evolved, Iran’s involvement with the Houthis has certainly grown, filling the vacuum created by the Houthis’ desperate search for allies and effectively creating a self-fulfilling prophecy. What is clear is that the former Yemeni president, Ali Abdullah Saleh – long supported by the US – and various Yemeni defense forces controlled by his son and nephew supported the Houthis to such a degree that international observers formally dubbed them the “Houthi-Saleh forces.” The Houthis had been at war with Saleh’s government for years over long-simmering grievances as a minority group in the country. They had supported the uprising against Saleh and were active participants in the “National Dialogue” to reshape the country’s governance. When Houthi armed groups marched on the capital, it was to rebel against the newly drafted constitution and a proposed federal structure they believed would weaken them. They negotiated an agreement with President Hadi to resolve their differences, but soon found themselves in control of the capital when Saleh-backed defense forces stood down from defending key government buildings, including the parliament and the presidential palace.

The Obama administration, not having learned enough from past foreign military experiences in Yemen, accepted baseless assurances from the Saudis — including then-Deputy Crown Prince Mohammed bin Salman and the inexperienced Saudi military — that they would overthrow the Houthis within months. The US decidedly looked the other way from Saleh’s strong support for the Houthis, including vast stores of weaponry from the Defense Ministry, an institution that remained loyal to Saleh. The war dragged on, with limited military gains by the Saudi-led coalition but a rising toll of unnecessary and unlawful death and destruction.

Well before President Trump’s appearance, we at Human Rights Watch and others had documented well over 100 apparently indiscriminate or disproportionate aerial attacks by the Saudi-led coalition on civilians and civilian infrastructure in Yemen, causing devastation to Yemenis in their homes, markets, schools, hospitals, and even during their weddings and their funerals. In case after case, we showed that US weapons had been used in these attacks, including widely banned cluster munitions in populated areas. False denials and cover-ups by Saudi military authorities were clear signs that they were not trustworthy partners. We repeatedly provided this evidence to Obama administration officials, but they insisted, despite the obvious evidence to the contrary, that the support they were providing was reining in the Saudis and helping improve their ability to comply with the laws of war. This is not a case of hindsight knowing best. The Obama administration should have known back then.

Also well before Trump adviser — and son-in-law — Jared Kushner’s conspicuous friendship with Mohammed bin Salman, the Saudi-led coalition’s arbitrary and excessive delays on imports to Yemen were exacerbating health and nutrition crises, as diseases like cholera spread like wildfire. The coalition’s closure of a critical airport meant that many Yemenis couldn’t travel to get the healthcare they needed. UN humanitarian agencies and global relief organizations warned repeatedly about the harm caused by the coalition restrictions, to little avail.

Meanwhile, the Obama administration was providing Saudi Arabia not only with ongoing military support (which the former officials mention) but also diplomatic cover (which they omit), especially at the UN. When the UN finally named Saudi Arabia on its “Global List of Shame” of the worst offenders against children, for its attacks on children in Yemen, the US stood silent as Saudi Arabia strong-armed the UN. Then-Secretary General Ban Ki-moon resisted for a while, but finally caved and removed Saudi Arabia from the list, admitting that Riyadh had threatened to cut its funding to various UN agencies. Twice during the Obama administration, the US had the opportunity to push for a UN inquiry into abuses by all sides in the Yemen conflict, and twice it did not — the Saudi-led coalition didn’t want one. Despite repeated queries about whether the US supported the first proposed UN inquiry, Obama officials responded with silence or deflection, which spelled the political demise of the initiative.

The cost of the Obama administration’s support for Saudi Arabia’s war went beyond Yemen. The juxtaposition of the Obama administration decrying Syrian/Russian attacks on civilians and Assad’s ongoing blockade of humanitarian goods, while the United States was defending Saudi-led coalition attacks on civilians and the impact of the blockade in Yemen, undermined the credibility of the Obama team’s efforts to restrain the Syrian government. The Russians openly mocked then-UN Ambassador Samantha Power for this hypocrisy. The US should have condemned and acted to curb both coalitions, equally and fairly.

Whatever conditionality the Obama administration thought it had created — holding up the transfer of precision munitions near the tail end of Obama’s term and suspending cluster munition transfers earlier — ultimately had no meaningful impact in reining in the continued Saudi-led coalition attacks on civilians. Nor were these steps robust enough to protect the US and US officials from risking complicity in war crimes.

Despite its supposedly “unconditional” support, the Trump administration, too, has reacted strongly to some excesses: condemning the total blockade, pushing Saudi Arabia to permit cranes into Yemen, and ending refueling of coalition planes. But, like the steps of the previous administration, these measures are not anywhere close to enough. As Yemenis remember the pain and suffering the US has helped inflict on their country, as they surely must, they will not look more kindly on the Obama administration’s merely “conditional” support. And that is to say nothing of the several dozen Yemeni civilians killed in drone strikes in the pursuit of Al Qaeda fighters.

The responsibility for these failed policies does not fall equally on all senior Obama officials, and some individuals made every effort to steer the ship in a far better direction. But that’s not the point here. The point is an honest, full appreciation of the reasons for these policies and their consequences. The statement by former senior officials fails in that task.

The tragic fact is that the US can play a less destructive role in Yemen — building on what we’ve seen these last few weeks. The US could end arms sales to Saudi Arabia, push for the UN to call out Riyadh for its role in Yemen’s nightmare, and investigate the US role in war crime after war crime so that the US, too, can ensure it does not keep making the same deadly mistakes.

The bipartisan Senate bill introduced on November 15 calling for sanctions and restrictions on Saudi Arabia for the harm it has caused in this war is the strongest effort to date for taking serious action.

The question is: Will they? Will US officials be able to look back in a few years and write a letter saying they did all they could to stop famine, to prevent more atrocities, to ensure countless Yemenis don’t go without justice or redress? Or will those officials, too, only be able to see there was so much more that could have been done when it is far too late and many more have died?

And finally, the broader reckoning Obama administration officials should provide is with their failure to seize the opportunity presented by the Arab Uprisings for a new orientation of US interests in line with the interests of the region’s people and their rights, rather than their dictators and outright tyranny. This is not predominantly about Syria, or the ongoing debate about whether the Obama administration’s intervention — including providing arms and assistance to a variety of little-known armed groups — and at other times nonintervention helped or hurt the Syrian people.

It is, however, squarely about where the US plays a leading role in supporting, arming, and protecting abusive governments in the Middle East – namely Egypt, Israel, Saudi Arabia, and the UAE – so that America is much more directly responsible for the conduct of these foreign partners. The Obama administration had an opportunity to broadly rethink America’s historically problematic role in the region, and ultimately failed to do so. A fuller reckoning is the only way to keep current policy debates, and future policies, from remaining misguided.

Author: Human Rights Watch

Incendiary weapons were used in Eastern Ghouta, Syria, in March 2018, causing more than 260 civilian casualties.
 

© 2018 Syria Civil Defense

(Geneva) – Countries at an upcoming United Nations disarmament conference, faced with evidence of 30 new incendiary weapons attacks in Syria, should agree to strengthen the international law that governs their use, Human Rights Watch said in a report released today.

The 13-page report, “Myths and Realities about Incendiary Weapons,” counters common misconceptions that have slowed international progress in this area. Incendiary weapons produce heat and fire through the chemical reaction of a flammable substance. While often designed for marking and signaling or producing smokescreens, incendiary weapons can burn human flesh to the bone, leave extensive scarring, and cause respiratory damage and psychological trauma. They also start fires that destroy civilian objects and infrastructure.

Countries at UN treaty meeting should:

• Revisit existing international law on incendiary weapons in 2019;
• Work to close loopholes in that law; and
• Condemn the use of incendiary weapons in populated areas.

“The excruciating burns and lifelong disabilities inflicted by incendiary weapons demand a global response,” said Bonnie Docherty, senior arms researcher at Human Rights Watch and lead author of the report. “Simple changes in international law could help save civilian lives during wartime.”

The report details the exceptionally cruel harm caused by incendiary weapons, explains the shortcomings of existing law, and lays out steps countries should take in response. The report, designed as an accessible overview of the incendiary weapons issue, was co-published by Harvard Law School’s International Human Rights Clinic.

Incendiary rockets rain fire over farmland outside a town in western Idlib, Syria, on July 30, 2018. 

© 2018 SMART News Agency

Countries that are party to the Convention on Conventional Weapons (CCW) are scheduled to address incendiary weapons at the UN in Geneva from November 19 to 23. Protocol III to this treaty imposes some restrictions on the use of incendiary weapons, but it does not provide sufficient protections for civilians.

In 2018, the Syrian-Russian military alliance used incendiary weapons in at least 30 attacks across six governorates of Syria, based on Human Rights Watch research. The majority of these attacks involved ground-launched rockets, but air-dropped weapons have also caused harm. For example, an incendiary airstrike on March 16 in Eastern Ghouta killed at least 61 people and injured more than 200.

As flames from incendiary rockets rain down, smoke billows over farmland in western Idlib, Syria, on July 30, 2018. 

© 2018 SMART News Agency

Human Rights Watch documented an additional 90 incendiary weapons attacks in Syria from November 2012 through 2017. The total number is most likely higher. Syria has not joined Protocol III, but Russia has.

The countries at the UN meeting should address the weaknesses of Protocol III as well as articulate their policies and practices. They should also establish a forum dedicated to reviewing the protocol more formally in 2019 with the intention of strengthening its protections for civilians.

Local people struggle to extinguish fires from an incendiary weapons attack in western Idlib, Syria, on July 30, 2018. 

© 2018 SMART News Agency

Government support for action against incendiary weapons has grown significantly in recent years, although a small number of countries that view existing law as adequate have opposed proposals to amend the protocol.


Protocol III has two major loopholes that have weakened its impact. First, its definition excludes multipurpose weapons, such as those with white phosphorus, which may be primarily designed to provide smokescreens or illumination, but which can inflict the same horrific injuries as other incendiary weapons. White phosphorus, for example, can continue to smolder in bandaged wounds and reignite days after treatment if exposed to oxygen. In 2017, the US-led coalition used white phosphorus while fighting to retake Raqqa in Syria and Mosul in Iraq from the Islamic State. The United States is party to Protocol III.

Second, while the protocol prohibits the use of air-dropped incendiary weapons in populated areas, it allows the use of ground-delivered models in certain circumstances. Because all incendiary weapons cause the same effects, this arbitrary distinction should be eliminated. A complete ban on incendiary weapons would have the greatest humanitarian benefits.

“Nations should make strengthening international law on these weapons a priority for the disarmament agenda,” said Docherty, who is also associate director of armed conflict and civilian protection at the Harvard clinic. “Stronger obligations would limit the conduct of treaty countries and, by increasing stigmatization of incendiary weapons, influence the behavior of other countries and non-state armed groups.”

Docherty will present the report’s findings at a side event at the United Nations in Geneva at 1:15 p.m. on November 20 in Conference Room XXII. 


Local people struggle to extinguish fires from an incendiary weapons attack in western Idlib, Syria, on July 30, 2018. 

© 2018 SMART News Agency

Related Content

Incendiary weapons are among the cruelest weapons used in contemporary armed conflict. These weapons, which produce heat and fire through the chemical reaction of a flammable substance, cause excruciating burns and destroy homes and other civilian structures. They are regulated by Protocol III to the Convention on Conventional Weapons (CCW), but that instrument has loopholes that reduce its legal and normative power.[1] CCW states parties have increasingly expressed support for revisiting and strengthening Protocol III, and they have set aside time at their annual Meeting of States Parties in November 2018 to address the topic.

Human Rights Watch and the Harvard Law School International Human Rights Clinic (IHRC) call on states to hold a robust exchange of views in November, and to dedicate additional time in 2019 for further discussions, possibly in the form of an informal meeting of experts. States should not only condemn ongoing use, but should also work toward closing Protocol III’s loopholes and building the stigma against incendiary weapons. A complete ban on incendiary weapons would have the greatest humanitarian benefits.

To bolster the case for reviewing and amending Protocol III, this paper rebuts common myths about incendiary weapons and the law that regulates them.

Myth #1: The harm caused by incendiary weapons is comparable to that of other conventional weapons.

 

Reality: Incendiary weapons inflict exceptionally cruel injuries, including horrific burns, which can produce immediate and long-term suffering and, in many cases, a painful death.

  • Incendiary weapons can cause fourth- and fifth-degree burns that damage skin, muscles, ligaments, tendons, nerves, blood vessels, and even bones. Burns can also lead to severe infections and shock.[2]  
  • Victims face an excruciating treatment process. Dressings for burns must be changed daily and dead skin removed, a painful process that has been described as being “flayed alive.”[3]
  • Incendiary weapons can also cause carbon monoxide poisoning and respiratory damage. Victims may be unable to breathe due to inflammation of the lungs or other tissues.
  • Individuals who survive an initial attack often experience organ failure, lowered resistance to disease, lifelong disability, muscle weakness, and psychological trauma. Survivors sometimes find that they are also shunned due to severe scarring and disfigurement, which can drive them to withdraw from society. 
  • In addition to inflicting physical injury, incendiary weapons can cause socioeconomic harm and displacement because they destroy homes, hospitals, schools, farmland, and other civilian infrastructure.

As flames from incendiary rockets rain down, smoke billows over farmland in western Idlib, Syria, on July 30, 2018. 

© 2018 SMART News Agency

Myth #2: Incendiary weapons are not a common tool in contemporary armed conflict.

 

Reality: Incendiary weapons have been used repeatedly in recent armed conflicts, most notably in Syria.

  • Human Rights Watch documented 30 incidents involving incendiary weapons in Syria over the first seven months of 2018. The attacks, by the Syrian-Russian military alliance, took place in six governorates: Aleppo, Damascus, Damascus Countryside, Daraa, Hama, and Idlib. Syria Civil Defense reported that on March 16, 2018, an incendiary weapon attack on Kafr Batna, in Eastern Ghouta, killed at least 61 people and injured more than 200.[4]
  • From November 2012 through 2017, Human Rights Watch documented more than 90 incendiary weapons attacks by the Syrian-Russian military alliance in Syria.[5] The total number of such attacks is likely higher because some go unreported or are not captured on visual media and so cannot be investigated.
  • Human Rights Watch documented use of incendiary weapons in Ukraine in July 2014, although it could not determine who fired the weapons.[6]
  • White phosphorus munitions, which have comparable effects to incendiary weapons regulated under international law, have also been used repeatedly over the past 15 years, including by US-led coalition forces against the Islamic State in Iraq and Syria in 2017;[7] by Saudi Arabia-led coalition forces in Yemen in 2016;[8] by Israel in Gaza in 2008-2009;[9] by both the International Security Assistance Force and the Taliban in Afghanistan between 2005 and 2011;[10] by Ethiopian forces in Somalia in 2007;[11] and by the United States in Iraq in 2004.[12]

Incendiary rockets rain fire over farmland outside a town in western Idlib, Syria, on July 30, 2018. 

© 2018 SMART News Agency

Myth #3: Existing international humanitarian law is adequate to protect civilians from incendiary weapons; preventing problematic use is a matter of compliance and universalization.

 

Reality: While states should join and comply with the Convention on Conventional Weapons, Protocol III governing incendiary weapons has two loopholes that interfere with its ability to protect civilians.

  • Article 1 of Protocol III narrowly defines an incendiary weapon as “any weapon or munition primarily designed to set fire to objects or cause burn injury to persons….” This definition excludes multipurpose munitions, notably those containing white phosphorus, which set fire and cause burns but are “primarily designed” for other uses, such as marking, obscuring, or signaling.
  • Article 2’s restrictions on use arbitrarily differentiate between incendiary weapons based on their delivery system. Article 2 comprehensively prohibits the use of air-dropped incendiary weapons in concentrations of civilians, but it allows the use of ground-launched incendiary weapons in concentrations of civilians when the military target is “clearly separated from the concentration of civilians and all feasible precautions are taken” to limit the incendiary effects and minimize injury or loss of life to civilians.
  • The drafters of Protocol III focused on regulating the incendiary weapons most troubling at the time of its negotiation: air-dropped weapons specifically designed to burn and set fires, notably those containing napalm. This narrow scope, however, is a legacy of the 1970s and no longer appropriate today.[13]

The aftermath of an airstrike using incendiary weapons on Khan Sheykhoun in Idlib Province, April 16, 2017.

© 2017 Syrian Civil Defense in Idlib

Myth #4: Protocol III’s definition of incendiary weapons appropriately excludes multipurpose munitions that have only incidental incendiary effects.

 

Reality: By excluding multipurpose munitions, such as those containing white phosphorus, Protocol III fails to regulate munitions that cause the same harm in the same manner as those it defines as incendiary weapons.

  • White phosphorus is a chemical substance that ignites when exposed to oxygen. The chemical reaction creates intense heat of about 815 degrees Celsius and produces light and a thick smoke.[14]
  • White phosphorus munitions operate in the same way as the incendiary weapons covered by Protocol III: by setting fires and causing burns “through the action of flame, heat, or combination thereof, produced by a chemical reaction of a substance delivered on the target.” As noted above, they fall outside of Protocol III’s definition of incendiary weapons because they are “primarily designed” to be obscurants.
  • White phosphorus causes severe thermal and chemical burns, often down to the bone, that are slow to heal and likely to develop infections. If not all fragments of white phosphorus are removed, they can exacerbate wounds after treatment and reignite when exposed to oxygen. White phosphorus burns on only 10 percent of the body are often fatal.
  • In 2008-2009, for example, Israel’s use of white phosphorus munitions in Gaza killed at least 12 civilians and left dozens suffering from deep burns and respiratory damage.[15]
  • The use of white phosphorus munitions between 2005 and 2011 in Afghanistan also severely burned civilians, including an eight-year-old girl who went through fifteen surgeries due to the extensive burns on her face, head, neck, and arms. Her recovery took months because whenever doctors tried to “scrape the dead tissue, flames leapt out.” While she and five other family members ultimately survived the attack, two of her sisters died.[16]

Screenshot of a video taken in April 2017 in Saraqeb, northwestern Syria, showing the bright trails produced by incendiary weapons. 

Myth #5: Armed forces need white phosphorus munitions because of their effectiveness as obscurants.

 

Reality: The humanitarian harm caused by white phosphorus munitions outweighs their potential usefulness on the battlefield, and less dangerous alternatives exist.

  • Protocol III should regulate white phosphorus munitions, despite claims about their military utility, because of the cruel and indiscriminate harm they cause.  In the past, weapons that inflict unacceptable harm, such as antipersonnel landmines and cluster munitions, have been banned or regulated regardless of purported military benefits.
  • In addition, there are alternative obscurants to white phosphorus munitions, such as 155mm smoke projectiles, which produce comparable visual screening properties without destructive incendiary effects.[17]
  • Some states have already turned to such alternatives in response to public pressure. While Israel used white phosphorus munitions in Gaza in 2008-2009, it began developing alternative smoke shells after it was widely condemned for the harm its munitions caused. There have been no reports of Israeli use of white phosphorus in Gaza since 2009, although it has conducted further military operations there.[18]

Footage showing what Human Rights Watch arms experts have identified as RBK-500 ZAB-2.5SM incendiary bombs mounted on a Russian attack aircraft at a Russian air base in Syria, June 18, 2016.

© 2016 Russia Today/YouTube

Myth #6: Protocol III’s distinction between air-dropped and ground-launched incendiary weapons reflects meaningful differences based on the types of delivery mechanism and frequency of use.

 

Reality: The distinction between air-dropped and ground-launched incendiary weapons is arbitrary because they cause the same type and magnitude of harm and have both been used in recent conflicts.

  • The delivery system of an incendiary weapon is irrelevant because the nature and extent of harm is the same. An individual will suffer cruel injury regardless of how the incendiary warhead reached them. Protocol III itself states that all incendiary weapons “set fire to objects or … cause burn injury to persons.”
  • Both air-dropped and ground-launched incendiary weapons have been used in populated areas in recent conflicts. In 2017, most of the 22 incendiary weapons attacks that Human Rights Watch documented in Syria involved air-dropped models.[19] In 2018, Human Rights Watch documented that the Syrian-Russian military alliance used ground-launched incendiary Grad rockets in at least 20 instances and air-dropped incendiary weapons in at least 10 instances.[20]   
  • Ground-launched incendiary weapons, in particular 9M22S Grad rockets, were also used in July-August 2014 in at least two towns in Ukraine, Ilovaisk and Luhansk. They burned several homes and endangered civilians.[21]
  • Because they generally lack aircraft to drop incendiary weapons, non-state armed groups are more likely to have access to ground-launched models. For example, the US military reported 11 cases from 2007-2009 in which insurgents used white phosphorus delivered by rockets or mortars in Afghanistan.[22]

Remnants of ZAB2.5SM submunitions from an August 7, 2016 incendiary weapon attack on Idlib city.

© 2016 Syria Civil Defense Idlib

Myth #7: Amending Protocol III would be complicated and time consuming.

 

Reality: Strengthening Protocol III, which requires only small changes to the text, would be legally and procedurally straightforward.

  • Expanding Article 1 to cover multipurpose munitions with incidental incendiary effects would simply require shifting from a design-based to an effects-based definition. The language could be changed to “any weapon or munition that has the effect of setting fire to objects or causing burn injury to persons….”
  • Precedent for adopting an effects-based weapons definition exists in CCW Protocol I on Non-Detectable Fragments,[23] Additional Protocol I to the Geneva Conventions,[24] and the Nuclear Weapons Advisory Opinion issued by the International Court of Justice.[25] Each of these documents looks to the effects of particular types of weapons in order to determine their legality.[26]
  • Strengthening restrictions on use would simply require eliminating the distinction in Article 2 between delivery systems. Article 2(2) could be amended to remove “air-delivered,” so that it prohibits making “any military objective located within a concentration of civilians the object of attack by incendiary weapons.” Article 2(3) could then be deleted.
  • Treaties regulating conventional weapons generally do not distinguish based upon delivery system. For example, Amended Protocol II to the CCW, which covers mines, booby-traps, and other devices, defines a “remotely-delivered mine” as a mine “delivered by artillery, missile, rocket, mortar, or similar means, or dropped from an aircraft.”[27]   
  • The process to amend Protocol III could be similarly straightforward and completed in a timely manner. States should set aside time to review Protocol III in 2019 and then agree to a negotiating mandate with the goal of strengthening the protocol the following year.

At least four incendiary submunitions burn on the ground of a narrow street in the al-Mashhad neighborhood of opposition-held east Aleppo city immediately after an incendiary weapon attack on August 7, 2016.

© 2016 Malek Tarboush

Myth #8: States are resistant to amending Protocol III.

 

Reality: Growing state support for strengthening the law, along with condemnation of recent use, demonstrates that the time is ripe for CCW states parties to revisit Protocol III.

  • Since 2010, at least 35 states, along with other international actors including the International Committee of the Red Cross (ICRC) and the UN secretary-general, have publicly recognized the problems of incendiary weapons.[28]
  • In November 2017, CCW states parties engaged in particularly robust discussions when the topic became a separate agenda item at their annual meeting. Almost all of the 26 states that spoke expressed concerns about incendiary weapons, and the majority recommended CCW states parties take some action in response.
  • At least nine states supported amending Protocol III.[29] For example, Costa Rica described ongoing use of incendiary weapons as an “alert” to evaluate and expand the scope of Protocol III. Austria said it “continues to see value in strengthening Protocol III,”[30] while Chile promised to work with “like-minded States and civil society in order to bring about an effective prohibition of this type of weapon and to strengthen Protocol III.”[31] 
  • At least 13 states called for further discussions on Protocol III.[32] For example, the Holy See urged “[a]n honest technical and legal review of the provisions.”[33] Croatia stated, “[T]he time is right to discuss the relevance of standards set by Protocol III,”[34] while the Philippines agreed that “review and reflection is timely and consistent with the objective of keeping the convention and its protocols relevant.”[35]
  • Switzerland proposed an informal meeting of experts regarding Protocol III.[36] While this proposal was ultimately blocked by a few states parties, at least five states expressly supported it.[37]
  • At least 17 states plus the European Union condemned or expressed concerns about reports of recent and ongoing use of incendiary weapons in concentrations of civilians at the CCW’s annual meeting in 2017.[38] For example, Zambia “condemn[ed] in the strongest terms the use of incendiary weapons in populated areas regardless of the method of deployment.”[39] The United States said it was “deeply concerned over the continued reports of air-delivered incendiary weapons being used in areas near civilians,”[40] while Ireland described such reports as “disturbing.”[41]
  • Final reports from CCW annual meetings, which are adopted by consensus, have noted concerns regarding incendiary weapons with increasing urgency since 2011.[42] At the end of the CCW’s 2017 Meeting of States Parties, the final report “condemned any use of incendiary weapons against civilians or civilian objects, and any other use incompatible with the relevant rules of international humanitarian law, including the provisions of Protocol III where applicable.”[43]

A ZAB-2.5 incendiary submunition from a December 5, 2016 attack on Maarat al-Numan south of Idlib.

© 2016 SMART News Agency

Myth #9: CCW states parties should not make Protocol III a priority for discussion because attention would be better spent on tackling new issues than revisiting agreed-on protocols.

 

Reality: CCW states parties should prioritize revisiting and strengthening Protocol III because the convention is intended to be a living document and states parties have not revisited Protocol III since it was adopted in 1980.

  • At the 2017 Meeting of States Parties, several CCW states parties remarked that a review of Protocol III was long overdue. While, in the words of Costa Rica, the CCW was designed to be “a convention that is dynamic and flexible,”[44] Protocol III has remained static for almost four decades.
  • States parties have, by contrast, expanded and strengthened other elements of the convention and its three original protocols several times since adoption. States parties have amended Protocol II, added two new protocols, and expanded the scope of the convention to encompass non-international armed conflicts.[45]
  • Work on incendiary weapons need not distract from progress on lethal autonomous weapons systems because states parties to the CCW have demonstrated their ability to work on multiple issues at the same time. For example, while states parties were negotiating Protocol IV on blinding laser weapons in 1995, they were also amending Protocol II. In 2003, states parties both adopted Protocol V on explosive remnants of war and agreed to a political commitment on mines other than antipersonnel mines.[46]
  • The evidence of ongoing use of incendiary weapons and growing calls for a response underscore the urgency of revisiting Protocol III.

Local people struggle to extinguish fires from an incendiary weapons attack in western Idlib, Syria, on July 30, 2018. 

© 2018 SMART News Agency

Myth #10: Amending Protocol III will not have a significant impact.

 

Reality: Closing the loopholes in Protocol III will better protect civilians by more strictly regulating states parties’ use of incendiary weapons and by creating a more powerful norm against their use.

  • A stronger protocol would bind states parties, meaning they could not lawfully engage in use that falls into the current loopholes. Eliminating ambiguity in Protocol III would also facilitate enforcement because with clearer rules, breaches are easier to recognize and condemn.
  • Strengthening Protocol III could also influence the conduct of actors not bound by its provisions by increasing the stigma against incendiary weapons. Stigmatization has already contributed to changes in domestic policies. For example, growing opposition to incendiary weapons, at the international and national levels, helped pressure Israel, which is not party to Protocol III, to alter its policies on white phosphorus in 2013 in order to dramatically restrict use.[47]
  • Stigmatization can also influence the conduct of non-state armed groups, especially those that seek to be viewed as responsible actors.

 

Annex I. Relevant Publications

A series of reports published over the past decade by Human Rights Watch and the Harvard Law School International Human Rights Clinic (IHRC) examine the issue of incendiary weapons in more depth. Approaching the topic from a variety of angles, these reports make the case that existing international law is inadequate and should be strengthened. The reports also provide annual updates of the use of incendiary weapons and the evolution of government positions.

To download the full reports, please visit: https://goo.gl/yHJQC8

 

An Overdue Review: Addressing Incendiary Weapons in the Contemporary Context
November 2017         

This 30-page report examines how the outdated regulations of Protocol III reflect concerns about incendiary weapons use at the time of the protocol’s negotiation. It argues that the law must evolve to respond to a changed military and political landscape.

 

Time to Act against Incendiary Weapons
December 2016         

This 32-page report highlights the urgency of action at the CCW’s Fifth Review Conference and calls on states to set aside time to revisit Protocol III.

 

From Condemnation to Concrete Action: A Five-Year Review of Incendiary Weapons
November 2015

This 27-page report analyzes the past five years of the incendiary weapons debate. It also discusses recent use of incendiary weapons in Syria and Ukraine, allegations of use in Libya and Yemen, and the evolution of national views on Protocol III.

 

Incendiary Weapons: Recent Use and Growing Opposition
November 2014 

This 16-page report details the latest harm caused by incendiary weapons in Syria and Ukraine while showing the influence of growing stigma on the practice of states, such as Israel.

 

Syria’s Use of Incendiary Weapons*
November 2013

This 25-page report documents new use of incendiary weapons in Syria and the civilian harm that resulted.

 

Government Positions on Protocol III on Incendiary Weapons
November 2012

This 18-page report updates an April 2012 report on countries’ views of Protocol III.

 

Incendiary Weapons: Government Positions and Practices
April 2012

This 22-page report analyzes government statements on Protocol III and provides evidence of the use, production, and stockpiling of incendiary weapons, including white phosphorus.

 

Q&A on Incendiary Weapons and CCW Protocol III
November 2011

This 3-page Q&A defines incendiary weapons, describes the harms they cause, lays out the shortcomings of Protocol III, and offers ways to strengthen the law.

 

Strengthening the Humanitarian Protections of Protocol III on Incendiary Weapons
August 2011

This 15-page report urges states parties to discuss Protocol III at the CCW’s Fourth Review Conference and proposes specific amendments to close its loopholes. The report argues that a blanket prohibition of incendiary munitions would most effectively protect civilians.

 

The Human Suffering Caused by Incendiary Munitions
March 2011

This 16-page report details the horrific harms caused by incendiary weapons, including napalm and white phosphorus, and provides a history of use since states adopted Protocol III.

 

The Need to Re-Visit Protocol III on Incendiary Weapons
November 2010

This 10-page report introduces the inadequacies of Protocol III and calls on states parties to revisit the protocol. It also examines how the US reservation has exacerbated the protocol’s shortcomings and hindered its ability to build norms.

 

Rain of Fire: Israel’s Unlawful Use of White Phosphorus in Gaza*
March 2009

This 71-page report, based on an in-depth field investigation, documents Israel’s use of white phosphorus during its 2009 military operations in Gaza.

 

*Unless marked with an asterisk, Human Rights Watch published these reports jointly with IHRC.

 

 

[1] Convention on Conventional Weapons (CCW) Protocol III on Incendiary Weapons (Protocol III), adopted October 10, 1980, entered into force December 2, 1983.

[2] For more information on the suffering caused by incendiary weapons, see Human Rights Watch and the Harvard Law School International Human Rights Clinic (IHRC), Memorandum to Convention on Conventional Weapons Delegates: The Human Suffering Caused by Incendiary Munitions, March 2011, https://www.hrw.org/news/2011/03/31/human-suffering-caused-incendiary-mu..., p. 3.

[3] Denise Chong, The Girl in the Picture: The Story of Kim Phuc, the Photograph, and the Vietnam War (New York: Penguin Group, 1999), p. 94.

[4] Syria Civil Defense, “A horrific massacre including unconscienable [sic] Napalm air strikes killed at least 61 civilians in #Kafr_Bata Town,” Twitter, March 16, 2018, https://twitter.com/SyriaCivilDef/status/974660689502629889 (accessed October 21, 2018).

[5] Human Rights Watch documented more than 68 incendiary weapons attacks from November 2012 to 2016, and 22 attacks in 2017. Human Rights Watch and IHRC, An Overdue Review: Addressing Incendiary Weapons in the Contemporary Context, November 2017, https://www.hrw.org/news/2017/11/20/overdue-review-addressing-incendiary..., pp. 14-15. A YouTube video, published by Russia Today in June 2016, showed a Russian aircraft with incendiary bombs at Russia’s airbase in Syria, suggesting that Russia has also been using incendiary weapons in Syria. Mary Wareham, “Incendiary Weapons Pose Civilian Threat in Syria,” Human Rights Watch dispatch, June 21, 2016, https://www.hrw.org/news/2016/06/21/dispatches-incendiary-weapons-pose-civilian-threat-syria.

[6] Human Rights Watch and IHRC, Incendiary Weapons: Recent Use and Growing Opposition, November 2014, https://www.hrw.org/news/2014/11/10/incendiary-weapons-recent-use-and-gr..., p. 6; Yuri Lyamin and Michael Smallwood, “9M22S Incendiary Rocket Components Documented in Eastern Ukraine,” post to “The Hoplite” (blog), Armament Research Services, October 14, 2014, http://armamentresearch.com/9m22s-incendiary-rocket-components-documente... (accessed October 21, 2018).

[7] “Iraq/Syria: Danger from US White Phosphorus,” Human Rights Watch news release, June 14, 2017, https://www.hrw.org/news/2017/06/14/iraq/syria-danger-us-white-phosphorus.

[8] Thomas Gibbons-Neff, “Saudi Arabia Appears to be Using U.S.-Supplied White Phosphorus in its War in Yemen,” Washington Post, September 19, 2016, https://www.washingtonpost.com/news/checkpoint/wp/2016/09/19/saudi-arabia-appears-to-be-using-u-s-supplied-white-phosphorus-in-its-war-in-yemen/?utm_term=.fd4007f43775 (accessed October 21, 2018).

[9] Human Rights Watch, Rain of Fire: Israel’s Unlawful Use of White Phosphorus in Gaza, March 2009, https://www.hrw.org/report/2009/03/25/rain-fire/israels-unlawful-use-whi....

[10] C.J. Chivers, “10 Years into Afghan War, a Thunderous Duel,” New York Times, October 7, 2011, https://www.nytimes.com/2011/10/08/world/asia/attacks-rock-us-outposts-n... (accessed October 21, 2018); Charlotte Aagaard, “Leaked Documents Show NATO Use of White Phosphorous against Afghan Insurgents,” Dagbladet information (Denmark), April 19, 2011, http://www.information.dk/265810 (accessed October 21, 2018). For further discussion of Taliban use, attempted use, and storage of white phosphorus, see “Reported Insurgent White Phosphorus Attacks and Caches,” US Central Command press release, 20090511–002, May 12, 2009, http://www.centcom.mil/MEDIA/PRESS-RELEASES/Press-Release-View/Article/9... (accessed October 21, 2018).

[11] The Ethiopian government denied having used white phosphorus. See Monitoring Group on Somalia, “Report of the Monitoring Group on Somalia pursuant to Security Council resolution 1724 (2006),” S/2007/436, July 18, 2007, http://www.un.org/ga/search/view_doc.asp?symbol=S/2007/436 (accessed October 21, 2018), paras. 30-34.

[12] Andrew Buncombe and Solomon Hughes, “The Fog of War: White Phosphorus, Fallujah, and Some Burning Questions,” The Independent, November 15, 2005, https://www.independent.co.uk/news/world/americas/the-fog-of-war-white-p... (accessed October 21, 2018).

[13] Human Rights Watch and IHRC, An Overdue Review, pp. 11-12.

[14] For more information on white phosphorus and its effects, see Human Rights Watch and IHRC, From Condemnation to Concrete Action: A Five-Year Review of Incendiary Weapons, November 2015, https://www.hrw.org/sites/default/files/supporting_resources/incendiarie..., pp. 4-5.

[15] Human Rights Watch, Rain of Fire, pp. 3-4.

[16] Jason Straziuso and Evan Vucci, “Burned Afghan Girl Learns to Smile Again,” Associated Press, June 23, 2009, http://www.nbcnews.com/id/31509214/ns/world_news-south_and_central_asia/... (accessed October 21, 2018). It is unclear which parties used white phosphorus munitions in this particular incident. The US military documented 44 alleged uses by the Taliban. The spokesman for the commander of NATO and US troops in Afghanistan, Brig. Gen. Richard Blanchette, however, told Human Rights Watch that NATO and US forces also used white phosphorus munitions in Afghanistan. See “Taleban ‘Used White Phosphorus,’” BBC, May 11, 2009, http://news.bbc.co.uk/2/hi/8045012.stm (accessed October 21, 2018); “Afghanistan: NATO Should ‘Come Clean’ on White Phosphorus,” Human Rights Watch news release, May 8, 2009, https://www.hrw.org/news/2009/05/08/afghanistan-nato-should-come-clean-w....

[17] Human Rights Watch, Rain of Fire, p. 4.

[18] “Israel: Strengthen White Phosphorus Phase-Out,” Human Rights Watch news release, May 18, 2013, https://www.hrw.org/news/2013/05/18/israel-strengthen-white-phosphorus-p...; Gili Cohen, “IDF to Stop Using Shells with White Phosphorus in Populated Areas, State Tells High Court,” Haaretz, May 13, 2013, https://www.haaretz.com/.premium-white-phosphorus-ban-in-towns-1.5242691.

[19] Human Rights Watch, An Overdue Review, p. 14.

[20] Human Rights Watch, “Incendiary Weapons,” in Reaching Critical Will, First Committee Briefing Book, 2018, http://reachingcriticalwill.org/images/documents/Disarmament-fora/1com/1... (accessed October 30, 2018), p. 25.

[21] Human Rights Watch and IHRC, Incendiary Weapons: Recent Use and Growing Opposition, p. 6.

[22] “Reported Insurgent White Phosphorus Attacks and Caches,” US Central Command press release.

[23] CCW Protocol I on Non-Detectable Fragments (Protocol I), adopted October 10, 1980, 1342 U.N.T.S. 168, entered into force December 2, 1983 (“It is prohibited to use any weapon the primary effect of which is to injure by fragments which in the human body escape detection by X-rays.” (emphasis added)).

[24] Additional Protocol I to the Geneva Conventions prohibits the use of weapons “of a nature to cause superfluous injury or unnecessary suffering.” Protocol Additional to the Geneva Conventions of August 12, 1949, and Relating to the Protection of Victims of International Armed Conflicts (Additional Protocol I), adopted June 8, 1977, 1125 U.N.T.S. 3, entered into force December 7, 1978, art. 35(2) (emphasis added). That protocol additionally forbids attacks using “means of combat the effects of which cannot be limited as required by this Protocol.” Ibid., art. 51(4)(c) (emphasis added).

[25] Advisory Opinion on the Legality of the Threat or Use of Nuclear Weapons, International Court of Justice (ICJ) Reports 226, July 8, 1996, https://www.icj-cij.org/files/case-related/95/095-19960708-ADV-01-00-EN.pdf (accessed October 21, 2018), para. 55 (noting that whether a weapon violated the prohibitions on poison or asphyxiating weapons depends on whether the weapon’s “prime, or even exclusive, effect is to poison or asphyxiate”) (emphasis added).

[26] The 2008 Convention on Cluster Munitions demonstrates how even a convention with a design-based definition can take effects into account. The convention defines cluster munition as being “designed to disperse or release explosive submunitions….” but looks to the humanitarian effects of weapons to determine which are safe to exclude. It states that its definition of cluster munition does not include munitions with certain specific technical characteristics because they “avoid indiscriminate area effects and the risks posed by unexploded submunitions.” Convention on Cluster Munitions, adopted May 30, 2008, Diplomatic Conference for the Adoption of a Convention on Cluster Munitions, CCM/77, entered into force August 1, 2010, art. 2(2)(c) (emphasis added).

[27] CCW Protocol II on Prohibitions or Restrictions on the Use of Mines, Booby-Traps and other Devices, as amended on May 3, 1996 (Amended Protocol II), adopted May 3, 1996, entered in force December 3, 1998, art. 2(2).

[28] Human Rights Watch and IHRC, An Overdue Review, p. 21.

[29] Argentina, Austria, Chile, Costa Rica, the Holy See, Jordan, Mexico, Panama, and Zambia. For these statements and all audio recordings referenced below, see UN Office at Geneva (UNOG), Meeting of High Contracting Parties to the CCW, Geneva, November 23, 2017, https://conf.unog.ch/digitalrecordings/# (accessed October 21, 2018) (audio recording).

[30] Statement of Austria, Meeting of High Contracting Parties to the CCW, Geneva, November 23, 2017, p. 1.

[31] Statement of Chile, Meeting of High Contracting Parties to the CCW, Geneva, November 23, 2017 (audio recording).

[32] Argentina, Austria, Chile, Costa Rica, Croatia, the Holy See, Ireland, Mexico, New Zealand, Panama, the Philippines, Switzerland, and Zambia. See UNOG, Meeting of High Contracting Parties to the CCW, Geneva, November 23, 2017 (audio recording).

[33] Statement of the Holy See, Meeting of High Contracting Parties to the CCW, Geneva, November 22, 2017, p. 2.

[34] Statement of Croatia, Meeting of High Contracting Parties to the CCW, Geneva, November 23, 2017, p. 1.

[35] Statement of the Philippines, Meeting of High Contracting Parties to the CCW, Geneva, November 23, 2017 (audio recording).

[36] Switzerland proposed that the Meeting of High Contracting Parties decide “to convene an informal meeting of experts to discuss issues related to the universalization and implementation of the Protocol III in light of the humanitarian concerns expressed.” Statement of Switzerland, Meeting of High Contracting Parties to the CCW, Geneva, November 23, 2017, p. 2.

[37] Austria, Croatia, Ireland, Mexico, and New Zealand. See UNOG, Meeting of High Contracting Parties to the CCW, Geneva, November 23, 2017 (audio recording).

[38] Austria, Chile, Costa Rica, Croatia, Germany, the Holy See, Ireland, Jordan, Mexico, New Zealand, Panama, the Philippines, Sri Lanka, Switzerland, United Kingdom, United States, Zambia, and the European Union. Ibid.

[39] Statement of Zambia, Meeting of High Contracting Parties to the CCW, Geneva, November 23, 2017 (audio recording).

[40] Statement of the United States, Meeting of High Contracting Parties to the CCW, Geneva, November 23, 2017 (audio recording).

[41] Statement of Ireland, Meeting of High Contracting Parties to the CCW, Geneva, November 23, 2017 (audio recording).

[42] For more on the final reports’ language on incendiary weapons, see Human Rights Watch and IHRC, An Overdue Review, p. 24.

[43] Meeting of the High Contracting Parties to the CCW, Final Report, CCW/MSP/2017/8, December 11, 2017, https://www.unog.ch/80256EDD006B8954/(httpAssets)/8A3BE602D1E4142CC12581E70054D0F4/$file/CCW_MHCP+2017_FinalReport_Advance+Version+(003)_ES.pdf (accessed November 12, 2018), para. 35.

[44] Statement of Costa Rica, Meeting of High Contracting Parties to the CCW, Geneva, November 23, 2017 (audio recording).

[45] United Nations Office at Geneva, “Convention on Conventional Weapons,” https://www.unog.ch/80256EE600585943/(httpPages)/4F0DEF093B4860B4C1257180004B1B30?OpenDocument (accessed October 21, 2018).

[46] Ibid.

[47] “Israel: Strengthen White Phosphorus Phase-Out,” Human Rights Watch news release; Cohen, “IDF to Stop Using Shells with White Phosphorus in Populated Areas,” Haaretz.


At the United Nations in Geneva the Campaign to Stop Killer Robots called on governments to not allow the development of weapons systems that would select and attack targets without any human intervention.

© 2018 Mary Wareham

(Paris) – The French members of the Campaign to Stop Killer Robots published a report on November 7, 2018, "Why France Must Oppose the Development of Killer Robots," which outlines the risks posed by the development of lethal autonomous weapons systems. A few days before the first Paris Peace Forum, the campaign urged France to support the opening of negotiations for an international treaty preemptively banning fully autonomous weapons.

"The Paris Peace Forum aims to promote peace and security and to reaffirm the importance of multilateralism and collective action in the face of current challenges,” said Bénédicte Jeannerod, France director at Human Rights Watch. “The risk of developing killer robots is a challenge to peace and we hope that France will actively support a multilateral solution to address it, with the adoption of a preventive ban treaty on fully autonomous weapons. Time is running out to prevent their emergence, and we cannot wait for them to make their first victims to act."

Killer robots are weapons systems that, once activated, could select and attack targets without human control. For the Campaign to Stop Killer Robots, letting a machine decide whether a human being lives or dies crosses a moral red line and threatens respect for international humanitarian law and human rights.

"These weapons systems are devoid of moral judgment and compassion, and in this respect, allowing them to kill is contrary to the principles of humanity and the requirements of public conscience," said Anne-Sophie Simpere, author of the report. "In addition, fully autonomous weapons cannot comply with international humanitarian law, in particular the rules requiring the differentiation between combatants and non-combatants, or the principle of proportionality. Only a human being has the precision of analysis to apply these principles in complex and changing combat situations.”

 

Nongovernmental organizations are also concerned about the difficulty of establishing clear accountability for crimes, as well as the "high and systematic" risks that killer robots will attack the wrong people.

"Computer programs are imperfect, but robots have tenfold capabilities compared to humans,” said François Warlop, from Sciences Citoyennes, a French member of the Campaign to Stop Killer Robots. “They can act on a large scale, without the ability to assess the morality of their action. Their proliferation could therefore pose a threat to international security, especially since they require fewer human resources and could therefore lower the cost of engaging in a war. "

From Stephen Hawking to Steve Wozniak, thousands of scientists, as well as religious leaders and Nobel Peace Prize laureates, have denounced the dangers of killer robots and demanded their prohibition before they are developed.

For the Campaign to Stop Killer Robots, a treaty is urgent and necessary to prevent the development and proliferation of fully autonomous weapons. Since 2013, states have been debating the subject at the United Nations. At the annual meeting of the Convention on Conventional Weapons, from November 21 to 23, they are expected to adopt the terms of reference for the group of experts on lethal autonomous weapons for the coming year. One issue at stake is whether to open a mandate to negotiate a treaty on these new weapons.

"France has a contradictory position in these discussions: on the one hand Emmanuel Macron has declared himself categorically opposed to autonomous weapons, while on the other hand, France does not want to start negotiations for a preventive prohibition treaty,” said Tony Fortin, from l’Observatoire des armements. “The Ministry of Armed Forces maintains that killer robots will not be allowed to emerge, while developing weapons programs over which human control is increasingly reduced. One example is the Man Machine Teaming program of Dassault and Thales, one of the announced objectives of which is to "provide the various machine systems with more autonomy and artificial intelligence."

The Campaign to Stop Killer Robots urges France to support the opening of negotiations for an international treaty to ban fully autonomous weapons.

We are talking here about robots that are fully autonomous in their movements and actions, including the decision to open fire, without any human control. Imagine such weapons in the hands of governments with little regard for human rights.

— HRW en français (@hrw_fr) November 7, 2018


Remnant of a UK-produced missile found at the location of an air strike at Radfan Ceramics Factory, west of Sanaa, Yemen, on September 23, 2015.

© 2015 Ali Muhammad al-Sawari

(London) – Human Rights Watch, Amnesty International, and Rights Watch (UK) have received permission to intervene in a court case challenging the United Kingdom’s continued sale of arms to Saudi Arabia. The case will be heard by the Court of Appeal in April 2019.

The landmark legal case, brought by Campaign Against Arms Trade (CAAT), seeks to establish that the UK government is breaking its own arms export licensing criteria by continuing to sell weapons to Saudi Arabia, given the clear risk that the weapons would be used to commit serious violations of international humanitarian law in Yemen. The High Court in London dismissed the case in 2017, but the Campaign Against Arms Trade won the right to appeal, and the three groups again received permission to intervene.

“The October 2 murder of Jamal Khashoggi inside the Saudi consulate only highlights the lack of credible investigations and accountability the Saudi government has demonstrated during its years-long military campaign in Yemen,” said Clive Baldwin, senior legal adviser at Human Rights Watch. “The UK has contributed through its arms sales to a campaign that has killed or wounded thousands of civilians and brought the country to the brink of disaster.”

Since the coalition began its aerial campaign in Yemen in 2015, the UK has licensed at least £4.7 billion (about US$6.1 billion) worth of arms sales to Saudi Arabia. Human Rights Watch researchers have regularly visited Yemen and documented the use of weapons, including weapons made in the UK, in strikes that appear to be unlawful. Human Rights Watch, Amnesty International, the UN, and Yemeni rights groups have repeatedly documented attacks by the Saudi-led coalition, some of which are most likely war crimes, that have hit markets, schools, hospitals, and homes, and killed thousands of civilians.

Since 2016, Human Rights Watch has called for all countries to end arms sales to Saudi Arabia until the Saudi-led coalition ends its unlawful attacks and credibly investigates those that have already occurred. A growing number of European countries have halted sales of weapons to Saudi Arabia, including Germany, the Netherlands, and Austria. On October 25, the European Parliament called for a common EU position banning arms sales to Saudi Arabia.

Meanwhile, violations in Yemen continue. On August 9, a Saudi-led coalition airstrike killed at least 26 children and wounded at least 19 more in or near a school bus in the busy Dhahyan market, in northern Yemen. The United Nations reported that on October 24, the coalition struck a vegetable packaging facility and killed 21 civilians, the latest in a series of attacks on civilian structures and yet another blow to the country’s precarious economy. The Houthi armed group, which controls much of northern Yemen and is the target of the Saudi-led coalition’s attacks, has also committed serious violations of the laws of war, including laying antipersonnel landmines, recruiting children, and taking civilians hostage and torturing them.

“The UK should not wait for the court hearing to finally stop selling weapons to Saudi Arabia,” Baldwin said. “It should stop selling weapons now until Saudi Arabia ends unlawful attacks and holds war criminals accountable.”


At the United Nations in Geneva, the Campaign to Stop Killer Robots called on governments not to allow the development of weapons systems that would select and attack targets without any human intervention.

© 2018 Mary Wareham

Opposition to the creation of so-called “killer robots” – weapons systems that could select and attack targets without any human intervention – is growing. The issue has been hotly debated since 2013, when the Campaign to Stop Killer Robots began calling for a preemptive ban on the development, production, and use of such technology. The group has repeatedly warned that these weapons could have devastating consequences for civilian populations around the world.

The only way to stop the development of fully autonomous weapons is through national laws and an international ban treaty. But current diplomatic talks at the United Nations on this challenge are based on consensus – which allows just a few states, or even a single state, to block an agreement sought by the majority – and often result in lowest-common-denominator decision-making.

This is effectively what happened in Geneva last week at the sixth Convention on Conventional Weapons (CCW) meeting on lethal autonomous weapons systems. There was strong convergence among the 88 participating states on the need to retain some form of human control over weapons systems and the use of force. Many countries recommended a preemptive ban on the development and use of these weapons.

But a handful of states – namely Australia, Israel, Russia, South Korea, and the United States – strongly opposed any new treaty. Alarmingly, they instead suggested exploring the potential humanitarian “benefits” of developing and using lethal autonomous weapons systems.

Ultimately, the CCW meeting participants could only agree to recommend continuing their deliberations into next year. But the longer it takes states to negotiate a new international ban, the greater the chance that killer robots will become reality, and forever change the face of warfare. The world must not continue down this dangerous path.

Author: Human Rights Watch

Thank you, Mr. President.

Compliance by States Parties with the Convention on Cluster Munitions has been very impressive. Indeed, compliance with the core prohibitions has been perfect thus far. There have been no instances or even allegations of use, production, or transfer of cluster munitions by any State Party. The first stockpile destruction deadline was 1 August, and every State Party with that deadline met it, some far in advance. In fact, most States Parties with upcoming deadlines have already completed destruction of their stocks. On this 10-year anniversary of the adoption and signing of the convention, we can say with great certainty that this is a convention that is working and working well.

However, there are some compliance concerns, related to transparency and national laws.

Thirteen States Parties are late in providing their initial transparency report. Four of those were due in 2011. This is an 89 percent compliance rate, but it should be 100 percent. These reports are needed, among other reasons, to establish officially whether a country has stocks of cluster munitions and whether it is contaminated. Moreover, far too many States Parties are late in submitting their annual updated reports.

No State Party has enacted new implementation legislation since December 2016. Too few overall have enacted new laws or other national implementation measures. By our count, more than one-quarter of States Parties have yet to implement their Article 9 obligations.

In addition, we encourage all States Parties to elaborate their views on certain important issues related to interpretation and implementation of the convention, issues which are relevant to ensuring compliance. Of those States Parties that have commented on these matters, the vast majority have agreed with the following interpretations:

The Convention on Cluster Munitions prohibits (1) any intentional or deliberate assistance with activities banned by the convention, including during joint military operations with states not party; (2) any transit of cluster munitions by a state not party across the territory of a State Party; (3) any foreign stockpiling of cluster munitions by states not party in the territory of a State Party; and (4) any direct or indirect investment in producers of cluster munitions.

The convention is having a powerful impact even on nations that have not yet joined, as most are in de facto compliance with key provisions, such as no use, no production, no trade. An international norm rejecting any use of cluster munitions is clearly emerging.

Cluster Munition Monitor reports confirmed use in the past year in just two countries—in Syria, by Syrian government forces supported by Russia, and in Yemen, by the Saudi-led coalition.

We expect every State Party to firmly condemn any use of cluster munitions by any actor, and to call for an immediate halt to such use. And please follow up bilaterally after your initial reactions.

In closing, let me reiterate that at the 10-year mark, we should all feel good about the Convention, about the record of compliance, and about the strength of the growing norm. But there is no room for complacency. These gains take constant care, for the long haul.

Thank you.


When we released Cluster Munition Monitor 2018 last Thursday, we highlighted the untarnished compliance record regarding the convention’s core prohibitions. One of the most visible examples is seen in stockpile destruction, where all of the first states with cluster munitions to ratify the convention destroyed their stocks within the convention’s eight-year deadline.

This achievement shows the world that States Parties take their obligations seriously and are committed to implementation. It also demonstrates to states considering joining the Convention on Cluster Munitions that the provisions are not overly burdensome or impossible to implement. 

Four States Parties completed destruction of their stockpiled cluster munitions during the previous year: Croatia, Cuba, Slovenia and Spain. We warmly welcome this achievement. We encourage them to report in detail on the process and appreciate Croatia’s detailed PowerPoint presentation here. We did not hear from Cuba today and urge it to provide details on the exact quantity and types of cluster munitions and submunitions destroyed.

During 2017, seven States Parties—those four plus Peru, Slovakia, and Switzerland—destroyed a collective total of 33,551 cluster munitions and 1.7 million submunitions. This is the lowest number destroyed since the creation of the Convention on Cluster Munitions.

However, the reason for that is positive: the vast majority—99 percent—of the total reported global stocks of cluster munitions once held by States Parties have now been destroyed. As the Monitor reports, 33 States Parties have completed destruction of their stocks, collectively destroying 1.4 million cluster munitions containing more than 177 million submunitions.

State Party Slovakia destroyed a substantial number of cluster munitions over the past year and we hope to hear from Slovakia at this meeting. Switzerland is on track to complete the destruction of its cluster munition stocks by the end of this year. With the technical support of Norwegian People’s Aid, Botswana and Peru have made substantial progress over the past year to plan and prepare for the destruction of their stockpiled cluster munitions within the convention’s deadlines.

Yet, as always, it is not all good news. We therefore bring the following concerns to your attention for our collective follow-up:

  • Bulgaria has reported the possession of a substantial number of cluster munitions, but still has not started destroying them. There is just one year left until its 1 October 2019 stockpile destruction deadline. We appreciate the update provided today, but are disappointed to hear that Bulgaria is considering making an extension request to its pending stockpile destruction deadline.
  • Guinea-Bissau has indicated several times that it needs financial and technical assistance to destroy its stockpiled cluster munitions by its 1 May 2019 deadline. Yet it still has not disclosed the quantity and types of stockpiled cluster munitions as it is nearly seven years late delivering its Article 7 transparency report. Guinea-Bissau last participated in a meeting of the convention in 2015.
  • There is evidence that Guinea imported cluster munitions back in the year 2000, prior to joining the convention. Yet it still has not provided its transparency report for the convention, which was due April 2015. Therefore, it is not possible to know if it still has stockpiled cluster munitions left to destroy.
  • Signatories that possess cluster munitions, such as Indonesia and Nigeria, appear to have taken few, if any, steps to ratify the convention or to declare and destroy their cluster munitions.
  • Cyprus is the last European Union member state to have signed, but not yet ratified the convention. It has not disclosed any information on its cluster munition stocks, but we have learned that 3,760 mortar projectiles and 2,559 submunitions that it transferred in 2014 for the purposes of destruction still have not been destroyed.

Under the convention’s cooperative compliance measures, States Parties and others, including our campaign members, stand ready to help States Parties requiring assistance. It is clear that several now need help to overcome financial, technical, and other challenges that are preventing them from swiftly destroying their cluster munition stocks.

Before concluding, we would like to highlight the fact that most States Parties have chosen not to retain any cluster munitions for training and research purposes. Nonetheless, 13 States Parties are retaining cluster munitions. This includes two of the newer States Parties, Bulgaria and Slovakia.

We are pleased to hear just now from the Netherlands that it intends to significantly reduce the number of cluster munitions it has retained for research and training but has not consumed for these purposes since 2011. We were disappointed that Cameroon decided to retain all six of its stockpiled cluster munitions for research and training purposes.

Last year, Italy announced that it had destroyed the cluster munitions it initially retained for research and training purposes and would not replenish those stocks. Several States Parties retaining cluster munitions have significantly reduced the number retained since making their initial declarations, including Belgium, France, Germany, Switzerland, and Spain.

That shows how the initial amounts retained were likely too high, but it still is not clear if current holdings constitute the “minimum number absolutely necessary” for the permitted purposes, as required by the convention.

We applaud the States Parties that have destroyed their cluster munition stocks and are not retaining any. It’s clear that most States Parties agree with the CMC that there is no compelling reason to retain live cluster munitions and explosive submunitions for the purposes of research and training.

Finally, the Cluster Munition Coalition supports both the guidelines on extension requests and the voluntary declaration of completion submitted to this meeting.


A BLU-61 submunition marked for destruction in-place in the Basra governorate of Iraq, March 2018. 

© 2018 UNMAS
 
(Geneva) – No state party to the 2008 treaty prohibiting cluster munitions has violated the core prohibitions on use, production, transfer, and stockpiling of these weapons, resulting in an untarnished compliance record, Human Rights Watch said today during the release of the Cluster Munition Monitor 2018 report.
 
Cluster Munition Monitor 2018 is the ninth annual report of the Cluster Munition Coalition (CMC), the global coalition of nongovernmental organizations co-founded and chaired by Human Rights Watch. The group works to ensure that all countries join and adhere to the 2008 treaty banning cluster munitions and requiring clearance and victim assistance. The report details how some non-signatories, particularly Israel, Russia, and the United States, hardened their defense of cluster munitions during the past year.
 
“Full compliance is essential to ensuring that the treaty banning cluster munitions prevents further human suffering from these widely discredited weapons,” said Mary Wareham, arms division advocacy director at Human Rights Watch and an editor of the report. “The treaty members are showing the holdouts that they have nothing to lose and everything to gain by renouncing cluster munitions and coming on board without delay.”


In the US, a November 30, 2017 Defense Department policy directive abandoned the longstanding policy that, after 2018, would have barred the US from using cluster munitions with an unexploded ordnance rate of more than 1 percent. Human Rights Watch has condemned the directive for halting a long-planned move away from inaccurate cluster munitions. The US claims that cluster munitions have military utility, but it last used them during the 2003 invasion of Iraq, with the exception of a single 2009 attack in Yemen. There is no evidence that the US or its coalition partners have used cluster munitions against the Islamic State (also known as ISIS) in Syria and Iraq.

Cluster munitions can be fired from the ground by artillery systems, rockets, and projectiles, or dropped from aircraft. They typically open in the air, dispersing multiple bomblets or submunitions over a wide area. Many submunitions fail to explode on initial impact, leaving dangerous duds that can maim and kill like landmines for years.

Currently, there are 103 states parties to the Convention on Cluster Munitions, while 17 countries have signed, but not yet ratified. There has been no new use, production, or transfers of cluster munitions by any state party since the convention was adopted on May 30, 2008. All states parties facing the first eight-year stockpile destruction deadline – August 1, 2018 – successfully destroyed their stocks in time, including Croatia, Slovenia, and Spain in the past year. Cuba, a new state party, also completed its stockpile destruction, while Switzerland is expected to announce completion imminently.

The destruction to date of a collective total of 1.4 million cluster munitions and more than 177 million submunitions means that 99 percent of the total reported global stocks held by states parties have now been destroyed. During 2017, seven countries destroyed a total of 33,551 cluster munitions and 1.7 million submunitions.

However, use of cluster munitions by Syrian government forces in anti-government-held areas of the country, which began in 2012, continued throughout 2017 and the first half of 2018. The number of recorded cluster munition attacks fell over the past year, in part due to the decreasing number of areas that remain outside of the government’s control. In Yemen, far fewer cluster munition attacks were reported over the last year by the Saudi-led coalition, which has conducted a military operation against Houthi forces in Yemen since March 2015. That decrease comes after strong public outcry, global media coverage, and widespread condemnation. There is evidence that cluster munitions may have been used in Egypt and Libya, but it has not been possible to independently confirm these allegations. None of these countries are party to the Convention on Cluster Munitions.

According to the Cluster Munition Monitor, there were 289 new casualties in 2017; 99 percent of those whose status was recorded were civilians. That included 187 casualties in Syria and 54 in Yemen from both new attacks and explosive remnants. There were 32 new casualties in Laos, all from unexploded submunitions used by the US in the 1960s and 1970s. The number of new victims in 2017 is a sharp decrease from the 971 reported in 2016, but many casualties go unrecorded or lack sufficient documentation.

Since the publication of last year’s report, Sri Lanka was the only country to ratify or accede to the convention, on March 1.

For the third consecutive year, Russia voted with Zimbabwe in December 2017 against a United Nations General Assembly resolution promoting the convention, though 32 non-signatories voted for the resolution. Russia has participated in a joint military operation with Syrian forces since September 30, 2015, in which cluster munitions have caused extensive civilian harm.

According to the Cluster Munition Monitor, 26 countries, including 12 states parties and two signatories, are contaminated by cluster munition remnants. Around the world, at least 153,000 submunitions were destroyed during 2017 in clearance operations. Under the convention, eight states parties have completed clearance of their contaminated land.

Most states parties have formally declared they are not retaining any cluster munitions for training or research, as the treaty permits, though 12 treaty members are. Thirty have enacted national laws to carry out the convention, and another 20 are in the process of doing so.  

“Several states parties still have significant work to do to clear contaminated areas, assist victims, report on their implementation, and ensure they have laws and other measures to punish any violations,” Wareham said. “Countries needing assistance should not hesitate to request help, as cooperative compliance is the bedrock of this treaty.”

Cluster Munition Monitor 2018 will be presented at the Eighth Meeting of States Parties to the Convention on Cluster Munitions, which opens at the United Nations in Geneva on September 3.


Thank you Mr. Chair, and thank you for your work chairing this Group of Governmental Experts, including your consultations with civil society. I am speaking in my capacity as coordinator of the Campaign to Stop Killer Robots, the rapidly growing coalition of 76 non-governmental organizations in 32 countries working to preemptively ban weapons systems that, once activated, would select and attack targets without human intervention.

The serious legal, operational, moral, technical, proliferation, security and other challenges raised by fully autonomous weapons have gained widespread attention since the first CCW meeting on the topic in May 2014. However, states still have not agreed on the regulatory response needed to address the serious challenges raised.

It’s increasingly obvious that the public strongly objects to allowing machines to select targets and use force without any meaningful human control. Doing so would be abhorrent, immoral, and an affront to the concept of human dignity and principles of humanity. It’s high time governments heed the mounting calls for a new international law to prohibit killer robots and start negotiating one.

The Campaign to Stop Killer Robots urges states at this sixth international meeting on lethal autonomous weapons systems to recommend a negotiating mandate to create such a ban treaty. We hope that states heed the calls from not only us, but the African Group of states, the Non-Aligned Movement, Brazil, Austria, Chile, Colombia, Panama, and others to begin negotiations on a new treaty to retain human control over weapons systems or prohibit lethal autonomous weapons.

Momentum is starting to build rapidly for states to start negotiating a legally-binding instrument. Requests for more time to further explore this challenge may seem valid, but increasingly sound like excuses aimed at delaying the inevitable regulation that’s coming.

Promises of greater transparency, codes of conduct, meek political declarations and more committees are insufficient to deal with the far-reaching consequences of creating fully autonomous weapons. Nothing less than a ban treaty will be needed to effectively constrain the development of autonomy in the critical functions of weapons systems and avoid dehumanizing the use of force.


Thank you, Mr. Chairman.

Human Rights Watch is a co-founder of the Campaign to Stop Killer Robots, and Mary Wareham of Human Rights Watch is the global coordinator of the Campaign.

We are pleased that the GGE has shifted its focus to options for the way forward on lethal autonomous weapons systems (LAWS). For five years, states have highlighted the host of problems with these weapons, including legal, moral, accountability, technical, and security concerns. It is time to move on and take action. As Brazil noted, the world is watching and there are high expectations for the CCW to produce a strong, concrete outcome.

Human Rights Watch supports the proposal for a mandate to begin negotiations in 2019 of a legally binding instrument to require meaningful human control over the critical functions of lethal autonomous weapons systems. Such a requirement is effectively the same as a prohibition on weapons that lack such control.

We were pleased to hear so many states—the vast majority of states—express support for a legally binding instrument prohibiting lethal autonomous weapons systems. We hope that High Contracting Parties set aside significant time in 2019 to fulfill that mandate—at least four weeks, so that the negotiations could be concluded within one year.

Several states have said the CCW’s discussions should focus on the compliance of lethal autonomous weapons systems with international law and particularly international humanitarian law. We agree that compliance with rules of proportionality and distinction is critical, and we question whether this technology could comply.

But another provision of international humanitarian law must also be considered. The Martens Clause—which appears in Additional Protocol I to the Geneva Conventions and in the preamble of the CCW—creates a legal obligation for states to consider moral implications when assessing new technology. The clause applies when there is no specific existing law on a topic, which is the case with lethal autonomous weapons systems, also called fully autonomous weapons.

The Martens Clause requires in particular that emerging technology comply with the principles of humanity and dictates of public conscience. As we have outlined in a new report distributed this week, fully autonomous weapons would fail this test on both counts.

The principles of humanity require humane treatment of others and respect for human life and dignity. Weapons that lack meaningful human control over the critical functions would be unable to comply with these principles.

Fully autonomous weapons would lack compassion, which motivates humans to minimize suffering and killing. They would also lack the legal and ethical judgment necessary to determine the best means for protecting civilians on a case-by-case basis in complex and unpredictable combat environments.

As inanimate machines, fully autonomous weapons could also not appreciate the value of human life and the significance of its loss. They would base life-and-death determinations on algorithms, objectifying their human targets—whether civilians or combatants. They would thus fail to respect human dignity.

The development of weapons without meaningful human control would also run counter to the dictates of public conscience. In national and regional group statements, a majority of states at CCW have called for the negotiation of a legally binding instrument on lethal autonomous weapons systems. Many have expressly called for a prohibition on the weapons. Virtually all states have stressed the need to maintain human control over the use of force. Collectively, these statements provide evidence that the public conscience favors human control and objects to fully autonomous weapons.

Experts and the general public have reached similar conclusions. As was discussed in yesterday’s side event sponsored by the Campaign to Stop Killer Robots, thousands of AI and robotics researchers along with companies and industry representatives have called for a ban on fully autonomous weapons. Traditional voices of conscience—faith leaders and Nobel Peace Laureates—have echoed those calls, expressing moral outrage at the prospect of losing human control over the use of force. Civil society and the ICRC have also emphasized that law and ethics require human control over the critical functions of a weapon.

In conclusion, the rules of law and morality demand the negotiation of a new legally binding instrument on fully autonomous weapons. An assessment of the technology under the Martens Clause shows there is a gap in international law that needs to be filled. Concerns related to the principles of humanity and dictates of public conscience show that the new instrument should ensure that meaningful human control over the use of force is maintained and the development, production, and use of fully autonomous weapons are prohibited.

Thank you.


Global launch of the Campaign to Stop Killer Robots in London on April 23, 2013.

© 2013 Campaign to Stop Killer Robots

The next revolution in warfare threatens to undermine fundamental principles of morality and law. Fully autonomous weapons, already under development in a number of countries, would have the power to select targets and fire on them without meaningful human control. In so doing, they would violate basic humanity and the public conscience.

International humanitarian law obliges countries to take these factors into account when evaluating new weapons. A longstanding provision known as the Martens Clause creates a legal duty to consider the moral implications of emerging technology. The Martens Clause states that when no existing treaty provision specifically applies, weapons should comply with the “principles of humanity” and the “dictates of public conscience”.

A new report from Human Rights Watch and Harvard Law School’s International Human Rights Clinic, of which I was the lead author, shows why fully autonomous weapons would fail both prongs of the test laid out in the Martens Clause. We conclude that the only adequate solution for dealing with these potential weapons is a preemptive ban on their development, production, and use.

More than 70 countries will convene at the United Nations in Geneva from August 27 to 31 to discuss what they refer to as lethal autonomous weapons systems. They will meet under the auspices of the Convention on Conventional Weapons, a major disarmament treaty. To avert a crisis of morality and a legal vacuum, countries should agree to start negotiating a treaty prohibiting these weapons in 2019.

With the rapid development of autonomous technology, the prospect of fully autonomous weapons is no longer a matter of science fiction. Experts have warned they could be fielded in years, not decades.

While opponents of fully autonomous weapons have highlighted a host of legal, accountability, security, and technological concerns, morality has been a dominant theme in international discussions since they began in 2013. At a UN Human Rights Council meeting that year, the UN expert on extrajudicial killing warned that humans should not delegate lethal decisions to machines that lack “morality and mortality.”

The inclusion of the Martens Clause in the Convention on Conventional Weapons underscores the need to consider morality in that forum, too.   

Fully autonomous weapons would violate the principles of humanity because they could not respect human life and dignity. They would lack both compassion, which serves as a check on killing, and human judgment, which allows people to assess unforeseen situations and make decisions about how best to protect civilians. Fully autonomous weapons would make life-and-death decisions based on algorithms that objectify their human targets without bringing to bear an understanding of the value of human life based on lived experience.

Fully autonomous weapons also run counter to the dictates of public conscience, which reflect an understanding of what is right and wrong. Traditional voices of conscience, including more than 160 religious leaders and more than 20 Nobel Peace Laureates, have publicly condemned fully autonomous weapons. Scientists, technology companies, and other experts by the thousands have joined the chorus of objectors. In July, a group of artificial intelligence researchers released a pledge, since signed by more than 3,000 people and 237 organizations, not to assist with the development of such weapons.

Governments have also expressed widespread concern about the prospect of losing control over the use of force. In April, for example, the African Group, one of the five regional groups at the United Nations, called for a preemptive ban on fully autonomous weapons, stating, “it is inhumane, abhorrent, repugnant, and against public conscience for humans to give up control to machines, allowing machines to decide who lives or dies.”

To date, 26 countries have endorsed the call for a prohibition on fully autonomous weapons. Dozens more have emphasized the need to maintain human control over the use of force. Their shared commitment to human control provides common ground on which to initiate negotiations for a new treaty on fully autonomous weapons.

Countries debating fully autonomous weapons at the United Nations next week must urgently heed the principles reflected in the Martens Clause. Countries should both reiterate their concerns about morality and law and act on them before this potential revolution in weaponry becomes a reality.

Author: Human Rights Watch

Summary

In cases not covered by this Protocol or by other international agreements, civilians and combatants remain under the protection and authority of the principles of international law derived from established custom, from the principles of humanity and from the dictates of public conscience.
— Martens Clause, as stated in Additional Protocol I of 1977 to the Geneva Conventions

Fully autonomous weapons are one of the most alarming military technologies under development today. As such, there is an urgent need for states, experts, and the general public to examine these weapons closely under the Martens Clause, the unique provision of international humanitarian law that establishes a baseline of protection for civilians and combatants when no specific treaty law on a topic exists. This report shows how fully autonomous weapons, which would be able to select and engage targets without meaningful human control, would contravene both prongs of the Martens Clause: the principles of humanity and the dictates of public conscience. To comply with the Martens Clause, states should adopt a preemptive ban on the weapons’ development, production, and use.

The rapid development of autonomous technology and artificial intelligence (AI) means that fully autonomous weapons could become a reality in the foreseeable future. Also known as “killer robots” and lethal autonomous weapons systems, they raise a host of moral, legal, accountability, operational, technical, and security concerns. These weapons have been the subject of international debate since 2013. In that year, the Campaign to Stop Killer Robots, a civil society coalition, was launched and began pushing states to discuss the weapons. After holding three informal meetings of experts, states parties to the Convention on Conventional Weapons (CCW) began formal talks on the topic in 2017. In August 2018, approximately 80 states will convene again for the next meeting of the CCW Group of Governmental Experts.

As CCW states parties assess fully autonomous weapons and the way forward, the Martens Clause should be a central element of the discussions. The clause, which is a common feature of international humanitarian law and disarmament treaties, declares that in the absence of an international agreement, established custom, the principles of humanity, and the dictates of public conscience should provide protection for civilians and combatants. The clause applies to fully autonomous weapons because they are not specifically addressed by international law. Experts differ on the precise legal significance of the Martens Clause, that is, whether it reiterates customary law, amounts to an independent source of law, or serves as an interpretive tool. At a minimum, however, the Martens Clause provides key factors for states to consider as they evaluate emerging weapons technology, including fully autonomous weapons. It creates a moral standard against which to judge these weapons.

The Principles of Humanity

Due to their lack of emotion and legal and ethical judgment, fully autonomous weapons would face significant obstacles in complying with the principles of humanity. Those principles require the humane treatment of others and respect for human life and human dignity. Humans are motivated to treat each other humanely because they feel compassion and empathy for their fellow humans. Legal and ethical judgment gives people the means to minimize harm; it enables them to make considered decisions based on an understanding of a particular context. As machines, fully autonomous weapons would not be sentient beings capable of feeling compassion. Rather than exercising judgment, such weapons systems would base their actions on pre-programmed algorithms, which do not work well in complex and unpredictable situations.

Showing respect for human life entails minimizing killing. Legal and ethical judgment helps humans weigh different factors to prevent arbitrary and unjustified loss of life in armed conflict and beyond. It would be difficult to recreate such judgment, developed over both human history and an individual life, in fully autonomous weapons, and they could not be pre-programmed to deal with every possible scenario in accordance with accepted legal and ethical norms. Furthermore, most humans possess an innate resistance to killing that is based on their understanding of the impact of loss of life, which fully autonomous weapons, as inanimate machines, could not share.

Even if fully autonomous weapons could adequately protect human life, they would be incapable of respecting human dignity. Unlike humans, these robots would be unable to appreciate fully the value of a human life and the significance of its loss. They would make life-and-death decisions based on algorithms, reducing their human targets to objects. Fully autonomous weapons would thus violate the principles of humanity on all fronts.

The Dictates of Public Conscience

Increasing outrage at the prospect of fully autonomous weapons suggests that this new technology also runs counter to the second prong of the Martens Clause, the dictates of public conscience. These dictates consist of moral guidelines based on a knowledge of what is right and wrong. They can be ascertained through the opinions of the public and of governments.

Many individuals, experts, and governments have objected strongly to the development of fully autonomous weapons. The majority of respondents in multiple public opinion surveys have registered opposition to these weapons. Experts, who have considered the issue in more depth, have issued open letters and statements that reflect conscience even better than surveys do. International organizations and nongovernmental organizations (NGOs), along with leaders in disarmament and human rights, peace and religion, science and technology, and industry, have felt compelled, particularly on moral grounds, to call for a ban on fully autonomous weapons. They have condemned these weapons as “unconscionable,” “abhorrent … to the sacredness of life,” “unwise,” and “unethical.”

Governments have cited compliance with the Martens Clause and moral shortcomings among their major concerns with fully autonomous weapons. As of July 2018, 26 states supported a preemptive ban, and more than 100 states had called for a legally binding instrument to address concerns raised by lethal autonomous weapons systems. Almost every CCW state party that spoke at their last meeting in April 2018 stressed the need to maintain human control over the use of force. The emerging consensus for preserving meaningful human control, which is effectively equivalent to a ban on weapons that lack such control, shows that the public conscience is strongly against fully autonomous weapons.

The Need for a Preemptive Ban Treaty

An assessment of fully autonomous weapons under the Martens Clause underscores the need for new law that is both specific and strong. Regulations that allowed for the existence of fully autonomous weapons would not suffice. For example, limiting use to certain locations would neither prevent the risk of proliferation to actors with little regard for humane treatment or human life, nor ensure respect for the dignity of civilians or combatants. Furthermore, the public conscience reveals widespread support for a ban on fully autonomous weapons, or its equivalent, a requirement for meaningful human control. To ensure compliance with both the principles of humanity and the dictates of public conscience, states should therefore preemptively prohibit the development, production, and use of fully autonomous weapons.

 

Recommendations

To avert the legal, moral, and other risks posed by fully autonomous weapons and the loss of meaningful human control over the selection and engagement of targets, Human Rights Watch and Harvard Law School’s International Human Rights Clinic (IHRC) recommend:

To CCW states parties

  • Adopt, at their annual meeting in November 2018, a mandate to negotiate a new protocol prohibiting fully autonomous weapons systems, or lethal autonomous weapons systems, with a view to concluding negotiations by the end of 2019.
  • Use the intervening Group of Governmental Experts meeting in August 2018 to present clear national positions and to reach agreement on the need to adopt a negotiating mandate at the November annual meeting.
  • Develop national positions and adopt national prohibitions as key building blocks for an international ban.
  • Express opposition to fully autonomous weapons, including on the legal and moral grounds reflected in the Martens Clause, in order to further develop the existing public conscience.

To experts in the private sector

  • Oppose the removal of meaningful human control from weapons systems and the use of force.
  • Publicly express explicit support for the call to ban fully autonomous weapons, including on the legal and moral grounds reflected in the Martens Clause, and urge governments to start negotiating new international law.
  • Commit, through codes of conduct, statements of principles, and other measures, not to design or develop AI for use in fully autonomous weapons, thereby ensuring the private sector does not advance their development, production, or use.

 

I. Background on Fully Autonomous Weapons

Fully autonomous weapons would be able to select and engage targets without meaningful human control. They represent an unacceptable step beyond existing armed drones because a human would not make the final decision about the use of force in individual attacks. Fully autonomous weapons, also known as lethal autonomous weapons systems and “killer robots,” do not exist yet, but they are under development, and military investments in autonomous technology are increasing at an alarming rate.

The risks of fully autonomous weapons outweigh their purported benefits. Proponents highlight that the new technology could save the lives of soldiers, process data and operate at greater speeds than traditional systems, and be immune to fear and anger, which can lead to civilian casualties. Fully autonomous weapons, however, raise a host of serious concerns, many of which Human Rights Watch has highlighted in previous publications. First, delegating life-and-death decisions to machines crosses a moral red line. Second, fully autonomous weapons would face significant challenges complying with international humanitarian and human rights law. Third, they would create an accountability gap because it would be difficult to hold anyone responsible for the unforeseen harm caused by an autonomous robot. Fourth, fully autonomous weapons would be vulnerable to spoofing and hacking. Fifth, these weapons would threaten global security because they could lead to an arms race, proliferate to actors with little respect for international law, and lower the threshold to war.[1]

This report focuses on yet another concern, which straddles law and morality—that is, the likelihood that fully autonomous weapons would contravene the Martens Clause. This provision of international humanitarian law requires states to take into account the principles of humanity and dictates of public conscience when examining emerging weapons technology. A common feature in the Geneva Conventions and disarmament treaties, the clause represents a legal obligation on states to consider moral issues.

The plethora of problems presented by fully autonomous weapons, including those under the Martens Clause, demands urgent action. A handful of states have proposed a wait-and-see approach, given that it is unclear what technology will be able to achieve. The high stakes involved, however, point to the need for a precautionary approach. Scientific uncertainty should not stand in the way of action to prevent what some scientists have referred to as the “third revolution in warfare, after gunpowder and nuclear arms.”[2] Countries should adopt a preemptive ban on the development, production, and use of fully autonomous weapons.

 

II. History of the Martens Clause

While the Martens Clause originated in a diplomatic compromise, it has served humanitarian ends. It states that in the absence of specific treaty law, established custom, the principles of humanity, and the dictates of public conscience provide protection for civilians and combatants. Since its introduction, the Martens Clause has become a common feature of the core instruments of international humanitarian law. The clause also appears in numerous disarmament treaties. The protections the Martens Clause provides and the legal recognition it has received highlight its value for examining emerging weapons systems that could cause humanitarian harm on the battlefield and beyond.

Origins of the Martens Clause

The Martens Clause first appeared in the preamble of the 1899 Hague Convention II containing the Regulations on the Laws and Customs of War on Land. In that iteration, the Martens Clause reads:

Until a more complete code of the laws of war is issued, the High Contracting Parties think it right to declare that in cases not included in the Regulations adopted by them, populations and belligerents remain under the protection and empire of the principles of international law, as they result from the usages established between civilized nations, from the laws of humanity, and the requirements of the public conscience.[3]

The clause thus provides a baseline level of protection to civilians and combatants when specific law does not exist.

Russian diplomat and jurist Fyodor Fyodorovich Martens proposed the Martens Clause as a way to break a negotiating stalemate at the 1899 Hague Peace Conference, which had been convened to adopt rules restraining war, reduce arms spending, and promote peace.[4] The great powers and lesser powers disagreed about how much authority occupying forces could exercise over the local population. The great powers insisted on a new treaty clarifying the rights and obligations of occupying forces, while the lesser powers opposed codifying provisions of an earlier political declaration that they believed did not adequately protect civilians. The Martens Clause provided fighters against foreign occupation the option of arguing that if specific provisions of the treaty did not cover them, they were entitled to at least such protection offered by principles of international law derived from custom, “the laws of humanity,” and “the requirements of the public conscience.”[5]

Modern Use of the Martens Clause

In the nearly 120 years since the adoption of the 1899 Hague Convention, the Martens Clause has been applied more broadly and become a staple of efforts to extend humanitarian protections during armed conflict. Seeking to reduce the impact of hostilities, numerous instruments of international humanitarian law and disarmament law have incorporated the provision.

Geneva Conventions and Additional Protocol I

When drafting the 1949 Geneva Conventions, the cornerstones of international humanitarian law,[6] negotiators wanted to ensure that certain protections would continue if a state party decided to withdraw from any of the treaties. The four Geneva Conventions contain the Martens Clause in their articles on denunciation, which address the implications of a state’s withdrawal from the treaties.[7] In its authoritative commentary on the conventions, the International Committee of the Red Cross (ICRC), the arbiter of international humanitarian law, explains:

[I]f a High Contracting Party were to denounce one of the Geneva Conventions, it would continue to be bound not only by other treaties to which it remains a Party, but also by other rules of international law, such as customary law. An argumentum e contrario, suggesting a legal void following the denunciation of a Convention, is therefore impossible.[8]

Additional Protocol I, which was adopted in 1977, expands the protections afforded to civilians by the Fourth Geneva Convention.[9]  The protocol contains the modern iteration of the Martens Clause, and the version used in this report:

In cases not covered by this Protocol or by other international agreements, civilians and combatants remain under the protection and authority of the principles of international law derived from established custom, from the principles of humanity and from the dictates of public conscience.[10]

By incorporating this language in its article on “General Principles and Scope of Application,” rather than confining it to a provision on denunciation, Additional Protocol I extends the application of the Martens Clause. According to the ICRC commentary:

There were two reasons why it was considered useful to include this clause yet again in the Protocol. First ... it is not possible for any codification to be complete at any given moment; thus, the Martens clause prevents the assumption that anything which is not explicitly prohibited by the relevant treaties is therefore permitted. Secondly, it should be seen as a dynamic factor proclaiming the applicability of the principles mentioned regardless of subsequent developments of types of situation or technology.[11]

The Martens Clause thus covers gaps in existing law and promotes civilian protection in the face of new situations or technology.

Disarmament Treaties

Since 1925, most treaties containing prohibitions on weapons have also included the Martens Clause.[12] The clause is referenced in various forms in the preambles of the 1925 Geneva Gas Protocol,[13] 1972 Biological Weapons Convention,[14] 1980 Convention on Conventional Weapons,[15] 1997 Mine Ban Treaty,[16] 2008 Convention on Cluster Munitions,[17] and 2017 Treaty on the Prohibition of Nuclear Weapons.[18] Although a preamble does not establish binding rules, it can inform interpretation of a treaty and is typically used to incorporate, by reference, the context of already existing law. The inclusion of the Martens Clause indicates that if a treaty’s operative provisions present gaps, they should be filled by established custom, the principles of humanity, and the dictates of public conscience. By incorporating the Martens Clause into this line of disarmament treaties, states have reaffirmed its importance to international humanitarian law generally and weapons law specifically.

The widespread use of the Martens Clause makes it relevant to the current discussions of fully autonomous weapons. The clause provides a standard for ensuring that civilians and combatants receive at least minimum protections from such problematic weapons. In addition, most of the diplomatic discussions of fully autonomous weapons have taken place under the auspices of the CCW, which includes the Martens Clause in its preamble. Therefore, an evaluation of fully autonomous weapons under the Martens Clause should play a key role in the deliberations about a new CCW protocol.

 

III. Applicability and Significance of the Martens Clause

The Martens Clause applies in the absence of specific law on a topic. Experts disagree on its legal significance, but at a minimum, it provides factors that states must consider when examining new challenges raised by emerging technologies. Its importance to disarmament law in particular is evident in the negotiations that led to the adoption of a preemptive ban on blinding lasers. States and others should therefore take the clause into account when discussing the legality of fully autonomous weapons and how best to address them.

Applicability of the Martens Clause

The Martens Clause, as set out in Additional Protocol I, applies “[i]n cases not covered” by the protocol or by other international agreements.[19] No matter how careful they are, treaty drafters cannot foresee and encompass all circumstances in one instrument. The Martens Clause serves as a stopgap measure to ensure that an unanticipated situation or emerging technology does not subvert the overall purpose of humanitarian law merely because no existing treaty provision explicitly covers it.[20]

The Martens Clause is triggered when existing treaty law does not specifically address a certain circumstance. As the US Military Tribunal at Nuremberg explained, the clause makes “the usages established among civilized nations, the laws of humanity and the dictates of public conscience into the legal yardstick to be applied if and when the specific provisions of [existing law] do not cover specific cases occurring in warfare.”[21] It is particularly relevant to new technology that drafters of existing law may not have predicted. Emphasizing that the clause’s “continuing existence and applicability is not to be doubted,” the International Court of Justice highlighted that it has “proved to be an effective means of addressing the rapid evolution of military technology.”[22] Given that there is often a dearth of law in this area, the Martens Clause provides a standard for emerging weapons.

As a rapidly developing form of technology, fully autonomous weapons exemplify an appropriate subject for the Martens Clause. Existing international humanitarian law applies to fully autonomous weapons only in general terms. It requires that all weapons comply with the core principles of distinction and proportionality, but it does not contain specific rules for dealing with fully autonomous weapons.[23] Drafters of the Geneva Conventions could not have envisioned the prospect of a robot that could make independent determinations about when to use force without meaningful human control. Given that fully autonomous weapons present a case not covered by existing law, they should be evaluated under the principles articulated in the Martens Clause.

Legal Significance of the Martens Clause

Interpretations of the legal significance of the Martens Clause vary.[24] Some experts adopt a narrow perspective, asserting that the Martens Clause serves merely as a reminder that if a treaty does not expressly prohibit a specific action, the action is not automatically permitted. In other words, states should refer to customary international law when treaty law is silent on a specific issue.[25] This view is arguably unsatisfactory, however, because it addresses only one aspect of the clause—established custom—and fails to account for the role of the principles of humanity and the dictates of public conscience. Under well-accepted rules of legal interpretation, a clause should be read to give each of its terms meaning.[26] Treating the principles of humanity and the dictates of public conscience as simply elements of established custom would make them redundant and violate this rule.

Others argue that the Martens Clause is itself a unique source of law.[27] They contend that the plain language of the Martens Clause elevates the principles of humanity and the dictates of public conscience to independent legal standards against which to judge unanticipated situations and emerging forms of military technology.[28] On this basis, a situation or weapon that conflicts with either standard is per se unlawful.

Public international law jurist Antonio Cassese adopted a middle approach, treating the principles of humanity and dictates of public conscience as “fundamental guidance” for the interpretation of international law.[29] Cassese wrote that “[i]n case of doubt, international rules, in particular rules belonging to humanitarian law, must be construed so as to be consonant with general standards of humanity and the demands of public conscience.”[30] International law should, therefore, be understood not to condone situations or technologies that raise concerns under these prongs of the Martens Clause.

At a minimum, the Martens Clause provides factors for states to consider as they approach emerging weapons technology, including fully autonomous weapons. In 2018, the ICRC acknowledged the “debate over whether the Martens Clause constitutes a legally-binding yardstick against which the lawfulness of a weapon must be measured, or rather an ethical guideline.”[31] It concluded, however, that “it is clear that considerations of humanity and public conscience have driven the evolution of international law on weapons, and these notions have triggered the negotiation of specific treaties to prohibit or limit certain weapons.”[32] If concerns about a weapon arise under the principles of humanity or dictates of public conscience, adopting new, more specific law that eliminates doubts about the legality of a weapon can increase protections for civilians and combatants.

The Martens Clause also makes moral considerations legally relevant. It is codified in international treaties, yet it requires evaluating a situation or technology according to the principles of humanity and dictates of public conscience, both of which incorporate elements of morality. Peter Asaro, a philosopher of science and technology, writes that the Martens Clause invites “moral reflection on the role of the principles of humanity and the dictates of public conscience in articulating and establishing new [international humanitarian law].”[33] While a moral assessment of fully autonomous weapons is important in its own right, the Martens Clause also makes it a legal requirement in the absence of specific law.

Precedent of the Preemptive Ban on Blinding Lasers

States, international organizations, and civil society have invoked the Martens Clause in previous deliberations about unregulated, emerging technology.[34] They found it especially applicable to discussions of blinding lasers in the 1990s. These groups explicitly and implicitly referred to elements of the Martens Clause as justification for preemptively banning blinding lasers. CCW Protocol IV, adopted in 1995, codifies the prohibition.[35]

During a roundtable convened by the ICRC in 1991, experts highlighted the relevance of the Martens Clause. ICRC lawyer Louise Doswald-Beck argued that “[d]ecisions to impose specific restrictions on the use of certain weapons may be based on policy considerations,” and “that the criteria enshrined in the Martens clause [should] be particularly taken into account.”[36] Another participant said that “the Martens clause particularly addresses the problem of human suffering so that the ‘public conscience’ refers to what is seen as inhumane or socially unacceptable.”[37]

Critics of blinding lasers spoke in terms that demonstrated the weapons raised concerns under the principles of humanity and dictates of public conscience. Several speakers at the ICRC-convened meetings concurred that “weapons designed to blind are … socially unacceptable.”[38] The ICRC itself “appealed to the ‘conscience of humanity’” in advocating for a prohibition.[39] At the CCW’s First Review Conference, representatives of UN agencies and civil society described blinding lasers as “inhumane,”[40] “abhorrent to the conscience of humanity,”[41] and “unacceptable in the modern world.”[42] A particularly effective ICRC public awareness campaign used photographs of soldiers blinded by poison gas during World War I to emphasize the fact that permanently blinding soldiers is cruel and inhumane.

Such characterizations of blinding lasers were linked to the need for a preemptive ban. For example, during debate at the First Review Conference, Chile expressed its hope that the body “would be able to establish guidelines for preventative action to prohibit the development of inhumane technologies and thereby to avoid the need to remedy the misery they might cause.”[43] In a December 1995 resolution urging states to ratify Protocol IV, the European Parliament declared that “deliberate blinding as a method of warfare is abhorrent.”[44] Using the language of the Martens Clause, the European Parliament stated that “deliberate blinding as a method of warfare is … in contravention of established custom, the principles of humanity and the dictates of the public conscience.”[45] The ICRC welcomed Protocol IV as a “victory of civilization over barbarity.”[46]

The discussions surrounding CCW Protocol IV underscore the relevance of the Martens Clause to the current debate about fully autonomous weapons. They show that CCW states parties have a history of applying the Martens Clause to controversial weapons. They also demonstrate the willingness of these states to preemptively ban a weapon that they find counter to the principles of humanity and dictates of public conscience. As will be discussed in more depth below, fully autonomous weapons raise significant concerns under the Martens Clause. The fact that their impact on armed conflict would be exponentially greater than that of blinding lasers should only increase the urgency of filling the gap in international law and explicitly banning them.[47]

 

IV. The Principles of Humanity

The Martens Clause divides the principles of international law into established custom, the principles of humanity, and the dictates of public conscience. Given that customary law is applicable even without the clause, this report assesses fully autonomous weapons under the latter two elements. The Martens Clause does not define these terms, but they have been the subject of much scholarly and legal discussion.

The relevant literature illuminates two key components of the principles of humanity. Actors are required: (1) to treat others humanely, and (2) to show respect for human life and dignity. Due to their lack of emotion and judgment, fully autonomous weapons would face significant difficulties in complying with either.

Humane Treatment

Definition

The first principle of humanity requires the humane treatment of others. The Oxford Dictionary defines “humanity” as “the quality of being humane; benevolence.”[48] The obligation to treat others humanely is a key component of international humanitarian law and international human rights law.[49] It appears, for example, in common Article 3 and other provisions of the Geneva Conventions, numerous military manuals, international case law, and the International Covenant on Civil and Political Rights.[50] Going beyond these sources, the Martens Clause establishes that human beings must be treated humanely, even when specific law does not exist.[51]

In order to treat other human beings humanely, one must exercise compassion and make legal and ethical judgments.[52] Compassion, according to the ICRC’s fundamental principles, is the “stirring of the soul which makes one responsive to the distress of others.”[53] To show compassion, an actor must be able to experience empathy—that is, to understand and share the feelings of another—and be compelled to act in response.[54] This emotional capacity is vital in situations when determinations about the use of force are made.[55] It drives actors to make conscious efforts to minimize the physical or psychological harm they inflict on human beings. Acting with compassion builds on the premise that “capture is preferable to wounding an enemy, and wounding him better than killing him; that non-combatants shall be spared as far as possible; that wounds inflicted be as light as possible, so that the injured can be treated and cured; and that the wounds cause the least possible pain.”[56]

While compassion provides a motivation to act humanely, legal and ethical judgment provides a means to do so. To act humanely, an actor must make considered decisions as to how to minimize harm.[57] Such decisions are based on the ability to perceive and understand one’s environment and to apply “common sense and world knowledge” to a specific circumstance.[58] Philosophy professor James Moor notes that actors must possess the capacity to “identify and process ethical information about a variety of situations and make sensitive determinations about what should be done in those situations.”[59] In this way, legal and ethical judgment helps an actor weigh relevant factors to ensure treatment meets the standards demanded by compassion. Judgment is vital to minimizing suffering: one can only refrain from harming humans if one both recognizes the possible harms and knows how to respond.[60]

Application to Fully Autonomous Weapons

Fully autonomous weapons would face significant challenges in complying with the principle of humane treatment because compassion and legal and ethical judgment are human characteristics. Empathy, and the compassion for others that it engenders, come naturally to human beings. Most humans have experienced physical or psychological pain, which drives them to avoid inflicting unnecessary suffering on others. Their feelings transcend national and other divides. As the ICRC notes, “feelings and gestures of solidarity, compassion, and selflessness are to be found in all cultures.”[61] People’s shared understanding of pain and suffering leads them to show compassion towards fellow human beings and inspires reciprocity that is, in the words of the ICRC, “perfectly natural.”[62]

Regardless of the sophistication of a fully autonomous weapon, it could not experience emotions.[63] There are some advantages associated with being impervious to emotions such as anger and fear, but a robot’s inability to feel empathy and compassion would severely limit its ability to treat others humanely. Because they would not be sentient beings, fully autonomous weapons could not know physical or psychological suffering. As a result, they would lack the shared experiences and understandings that cause humans to relate empathetically to the pain of others, have their “souls stirred,” and be driven to exercise compassion towards other human beings. Amanda Sharkey, a professor of computer science, has written that “current robots, lacking living bodies, cannot feel pain, or even care about themselves, let alone extend that concern to others. How can they empathize with a human’s pain or distress if they are unable to experience either emotion?”[64] Fully autonomous weapons would therefore face considerable difficulties in guaranteeing their acts are humane and in compliance with the principles of humanity.

Robots would also not possess the legal and ethical judgment necessary to minimize harm on a case-by-case basis.[65] Situations involving use of force, particularly in armed conflict, are often complex and unpredictable and can change quickly. Fully autonomous weapons would therefore encounter significant obstacles to making appropriate decisions regarding humane treatment. After examining numerous studies in which researchers attempted to program ethics into robots, Sharkey found that robots exhibiting behavior that could be described as “ethical” or “minimally ethical” could operate only in constrained environments. Sharkey concluded that robots have limited moral capabilities and therefore should not be used in circumstances that “demand moral competence and an understanding of the surrounding social situation.”[66] Complying with international law frequently requires subjective decision-making in complex situations. Fully autonomous weapons would have limited ability to interpret the nuances of human behavior, understand the political, socioeconomic, and environmental dynamics of the situation, and comprehend the humanitarian risks of the use of force in a particular context.[67] These limitations would compromise the weapons’ ability to ensure the humane treatment of civilians and combatants and comply with the first principle of humanity.

Respect for Human Life and Dignity

Definition

A second principle of humanity requires actors to respect both human life and human dignity. Christof Heyns, former special rapporteur on extrajudicial, summary or arbitrary executions, highlighted these related but distinct concepts when he posed two questions regarding fully autonomous weapons: “[C]an [they] do or enable proper targeting?” and “Even if they can do proper targeting, should machines hold the power of life and death over humans?”[68] The first considers whether a weapon can comply with international law’s rules on protecting life. The second addresses the “manner of targeting” and whether it respects human dignity.[69]

In order to respect human life, actors must take steps to minimize killing.[70] The right to life provides that “[n]o one shall be arbitrarily deprived of his life.”[71] It limits the use of lethal force to circumstances in which it is absolutely necessary to protect human life, constitutes a last resort, and is applied in a manner proportionate to the threat.[72] Codified in Article 6 of the International Covenant on Civil and Political Rights, the right to life has been recognized as the “supreme right” of international human rights law, which applies under all circumstances. During times of armed conflict, international humanitarian law determines what constitutes arbitrary or unjustified deprivation of life. It requires that actors comply with the rules of distinction, proportionality, and military necessity in situations of armed conflict.[73]

Judgment and emotion promote respect for life because they can serve as checks on killing. The ability to make legal and ethical judgments can help an actor determine which course of action will best protect human life in the infinite number of potential unforeseen scenarios. An instinctive resistance to killing provides a psychological motivation to comply with, and sometimes go beyond, the rules of international law in order to minimize casualties.

Under the principles of humanity, actors must also respect the dignity of all human beings. This obligation is premised on the recognition that every human being has inherent worth that is both universal and inviolable.[74] Numerous international instruments—including the Universal Declaration of Human Rights, the International Covenant on Civil and Political Rights, the Vienna Declaration and Programme of Action adopted at the 1993 World Human Rights Conference, and regional treaties—enshrine the importance of dignity as a foundational principle of human rights law.[75] The African Charter on Human and Peoples’ Rights explicitly states that individuals have “the right to the respect of the dignity inherent in a human being.”[76]

While respect for human life involves minimizing the number of deaths and avoiding arbitrary or unjustified ones, respect for human dignity requires an appreciation of the gravity of a decision to kill.[77] The ICRC explained that it matters “not just if a person is killed or injured but how they are killed or injured, including the process by which these decisions are made.”[78] Before taking a life, an actor must truly understand the value of a human life and the significance of its loss. Humans should be recognized as unique individuals and not reduced to objects with merely instrumental or no value.[79] If an actor kills without taking into account the worth of the individual victim, the killing undermines the fundamental notion of human dignity and violates this principle of humanity.

Application to Fully Autonomous Weapons

It is highly unlikely that fully autonomous weapons would be able to respect human life and dignity. Their lack of legal and ethical judgment would interfere with their capacity to respect human life. For example, international humanitarian law’s proportionality test requires commanders to determine whether anticipated military advantage outweighs expected civilian harm on a case-by-case basis. Given the infinite number of contingencies that may arise on the battlefield, fully autonomous weapons could not be preprogrammed to make such determinations. The generally accepted standard for assessing proportionality is whether a “reasonable military commander” would have launched a particular attack,[80] and reasonableness requires making decisions based on ethical as well as legal considerations.[81] Unable to apply this standard to the proportionality balancing test, fully autonomous weapons would likely endanger civilians and potentially violate international humanitarian law.[82]

Fully autonomous weapons would also lack the instinctual human resistance to killing that can protect human life beyond the minimum requirements of the law.[83] An inclination to avoid killing comes naturally to most people because they have an innate appreciation for the inherent value of human life. Empirical research demonstrates the reluctance of human beings to take the lives of other humans. For example, a retired US Army Ranger who conducted extensive research on killing during armed conflict found that “there is within man an intense resistance to killing their fellow man. A resistance so strong that, in many circumstances, soldiers on the battlefield will die before they can overcome it.”[84] As inanimate objects, fully autonomous weapons could not lose their own life or understand the emotions associated with the loss of the life of a loved one. It is doubtful that a programmer could replicate in a robot a human’s natural inclination to avoid killing and to protect life with the complexity and nuance that would mirror human decision making.

Fully autonomous weapons could not respect human dignity, which relates to the process behind, rather than the consequences of, the use of force.[85] As machines, they could truly comprehend neither the value of individual life nor the significance of its loss. They would base decisions to kill on algorithms without considering the humanity of a specific victim.[86] Moreover, these weapons would be programmed in advance of a scenario and could not assess the necessity of lethal force in a specific situation. In a CCW presentation as special rapporteur, Christof Heyns explained that:

to allow machines to determine when and where to use force against humans is to reduce those humans to objects; they are treated as mere targets. They become zeros and ones in the digital scopes of weapons which are programmed in advance to release force without the ability to consider whether there is no other way out, without a sufficient level of deliberate human choice about the matter.[87]

Mines Action Canada similarly concluded that “[d]eploying [fully autonomous weapons] in combat displays the belief that any human targeted in this way does not warrant the consideration of a live operator, thereby robbing that human life of its right to dignity.”[88] Allowing a robot to take a life when it cannot understand the inherent worth of that life or the necessity of taking it disrespects and demeans the person whose life is taken. It is thus irreconcilable with the principles of humanity enshrined in the Martens Clause.

When used in appropriate situations, AI has the potential to provide extraordinary benefits to humankind. Allowing robots to make determinations to kill humans, however, would be contrary to the Martens Clause, which merges law and morality. Limitations in the emotional, perceptive, and ethical capabilities of these machines significantly hinder their ability to treat other human beings humanely and to respect human life and dignity. Consequently, the use of these weapons would be incompatible with the principles of humanity as set forth in the Martens Clause.

 

V. The Dictates of Public Conscience

The Martens Clause states that in the absence of treaty law, the dictates of public conscience along with the principles of humanity protect civilians and combatants. The reference to “public conscience” instills the law with morality and requires that assessments of the means and methods of war account for the opinions of citizens and experts as well as governments. The reactions of these groups to the prospect of fully autonomous weapons make it clear that the development, production, and use of such technology would raise serious concerns under the Martens Clause.

Definition

The dictates of public conscience refer to shared moral guidelines that shape the actions of states and individuals.[89] The use of the term “conscience” indicates that the dictates are based on a sense of morality, a knowledge of what is right and wrong.[90] According to philosopher Peter Asaro, conscience implies “feeling compelled by, or believing in, a specific moral obligation or duty.”[91] The adjective “public” clarifies that these dictates reflect the concerns of a range of people and entities. Building on the widely cited work of jurist and international humanitarian law expert Theodor Meron, this report looks to two sources in particular to determine what qualifies as the public conscience: the opinion of the public and the opinions of governments.[92]

Polling data and experts’ views provide evidence of public opinion.[93] Surveys reveal the perspectives and beliefs of ordinary individuals. They can also illuminate nuanced differences in the values and understandings of laypeople. While informative, polls by themselves are not sufficient measures of the public conscience, in part because responses can be influenced by the nature of the questions asked and do not necessarily reflect moral consideration.[94] The statements and actions of experts, who have often deliberated at length on the questions at issue, reflect a more in-depth understanding.[95] Their specific expertise may range from religion to technology to law, but they share a deep knowledge of the topic. The views they voice can thus shed light on the moral norms embraced by the informed public.[96]

Governments articulate their stances through policies and in written statements and oral interventions at diplomatic meetings and other public fora. Their positions reflect the perspectives of countries that differ in economic development, military prowess, political systems, religious and cultural traditions, and demographics. Government opinion can help illuminate opinio juris, an element of customary international law, which refers to a state’s belief that a certain practice is legally obligatory.[97]

Application to Fully Autonomous Weapons

The positions of individuals and governments around the world have demonstrated that fully autonomous weapons are highly problematic under the dictates of public conscience. Through opinion polls, open letters, oral and written statements, in-depth publications, and self-imposed guidelines, members of the public have shared their distress and outrage at the prospect of these weapons. Government officials from more than 100 countries have expressed similar concerns and spoken in favor of imposing limits on fully autonomous weapons.[98] While public opposition to fully autonomous weapons is not universal, these voices collectively show that it is both widespread and growing.[99]

Opinion of the Public

Public opposition to the development, production, and use of fully autonomous weapons is significant and spreading. Several public opinion polls have revealed individuals’ resistance to these weapons.[100] These findings are mirrored in statements made by leaders in the relevant fields of disarmament and human rights, peace and religion, science and technology, and industry. While not comprehensive, the sources discussed below exemplify the nature and range of public opinion and provide evidence of the public conscience.

Surveys

Public opinion surveys conducted around the world have documented widespread opposition to the development, production, and use of these weapons. According to these polls, the majority of people surveyed found the prospect of delegating life-and-death decisions to machines unacceptable. For example, a 2013 survey of Americans, conducted by political science professor Charli Carpenter, found that 55 percent of respondents opposed the “trend toward using” fully autonomous weapons.[101] This position was shared roughly equally across genders, ages, and political ideologies. Interestingly, active duty military personnel, who understand the realities of armed conflict first hand, were among the strongest objectors; 73 percent expressed opposition to fully autonomous weapons.[102] The majority of respondents to this poll also supported a campaign to ban the weapons.[103] A more recent national survey of about 1,000 Belgians, which was released on July 3, 2018, found that 60 percent of respondents believed that “Belgium should support international efforts to ban the development, production and use of fully autonomous weapons.” Only 23 percent disagreed.[104]

International opinion polls have produced similar results. In 2015, the Open Robotics Initiative surveyed more than 1,000 individuals from 54 different countries and found that 56 percent of respondents opposed the development and use of what it called lethal autonomous weapons systems.[105] Thirty-four percent of all respondents rejected development and use because “humans should always be the one to make life/death decisions.”[106] Other motivations cited less frequently included the weapons’ unreliability, the risk of proliferation, and lack of accountability.[107] An even larger survey by Ipsos of 11,500 people from 25 countries produced similar results in 2017.[108] This poll explained that the United Nations was reviewing the “strategic, legal and moral implications of autonomous weapons systems” (equivalent to fully autonomous weapons) and asked participants how they felt about the weapons’ use. Fifty-six percent recorded their opposition.[109]

Nongovernmental and International Organizations

Providing further evidence of concerns under the dictates of public conscience, experts from a range of fields have felt compelled, especially for moral reasons, to call for a prohibition on the development, production, and use of fully autonomous weapons. The Campaign to Stop Killer Robots, a civil society coalition of 75 NGOs, is spearheading the effort to ban fully autonomous weapons.[110] Its NGO members are active in more than 30 countries and include groups with expertise in humanitarian disarmament, peace and conflict resolution, technology, human rights, and other relevant fields.[111] Human Rights Watch, which co-founded the campaign in 2012, serves as its coordinator. Over the past six years, the campaign’s member organizations have highlighted the many problems associated with fully autonomous weapons through dozens of publications and statements made at diplomatic meetings and UN events, on social media, and in other fora.[112]

While different concerns resonate with different people, Steve Goose, director of Human Rights Watch’s Arms Division, highlighted the importance of the Martens Clause in his statement to the April 2018 CCW Group of Governmental Experts meeting. Goose said:

There are many reasons to reject [lethal autonomous weapons systems] (including legal, accountability, technical, operational, proliferation, and international security concerns), but ethical and moral concerns—which generate the sense of revulsion—trump all. These ethical concerns should compel High Contracting Parties of the Convention on Conventional Weapons to take into account the Martens Clause in international humanitarian law, under which weapons that run counter to the principles of humanity and the dictates of the public conscience should not be developed.[113]

The ICRC has encouraged states to assess fully autonomous weapons under the Martens Clause and observed that “[w]ith respect to the public conscience, there is a sense of deep discomfort with the idea of any weapon system that places the use of force beyond human control.”[114] The ICRC has repeatedly emphasized the legal and ethical need for human control over the critical functions of selecting and attacking targets. In April 2018, it made clear its view that “a minimum level of human control is required to ensure compliance with international humanitarian law rules that protect civilians and combatants in armed conflict, and ethical acceptability in terms of the principles of humanity and the public conscience.”[115] The ICRC explained that international humanitarian law “requires that those who plan, decide upon and carry out attacks make certain judgements in applying the norms when launching an attack. Ethical considerations parallel this requirement—demanding that human agency and intention be retained in decisions to use force.”[116] The ICRC concluded that a weapon system outside human control “would be unlawful by its very nature.”[117]

Peace and Faith Leaders

In 2014, more than 20 individuals and organizations that had received the Nobel Peace Prize issued a joint letter stating that they “whole-heartedly embrace [the] goal of a preemptive ban on fully autonomous weapons” and find it “unconscionable that human beings are expanding research and development of lethal machines that would be able to kill people without human intervention.”[118] The individual signatories to the letter included American activist Jody Williams, who led the civil society drive to ban landmines, along with heads of state and politicians, human rights and peace activists, a lawyer, a journalist, and a church leader.[119] Organizational signatories the Pugwash Conferences on Science and World Affairs and the Nobel Women’s Initiative, which helped spearhead the letter, are both on the steering committee of the Campaign to Stop Killer Robots.

Religious leaders have similarly united against fully autonomous weapons. In 2014, more than 160 faith leaders signed an “interreligious declaration calling on states to work towards a global ban on fully autonomous weapons.”[120] In language that implies concerns under the principles of humanity, the declaration describes such weapons as “an affront to human dignity and to the sacredness of life.”[121] The declaration further criticizes the idea of delegating life-and-death decisions to a machine because fully autonomous weapons have “no moral agency and, as a result, cannot be held responsible if they take an innocent life.”[122] The list of signatories encompassed representatives of Buddhism, Catholicism, Islam, Judaism, Protestantism, and Quakerism. Archbishop Desmond Tutu signed both this declaration and the Nobel Peace Laureates letter.

Science and Technology Experts

Individuals with technological expertise have also expressed opposition to fully autonomous weapons. The International Committee for Robot Arms Control (ICRAC), whose members study technology from various disciplines, raised the alarm in 2013 shortly after it co-founded the Campaign to Stop Killer Robots.[123] ICRAC issued a statement endorsed by more than 270 experts calling for a ban on the development and deployment of fully autonomous weapons.[124] Members of ICRAC noted “the absence of clear scientific evidence that robot weapons have, or are likely to have in the foreseeable future, the functionality required for accurate target identification, situational awareness or decisions regarding the proportional use of force” and concluded that “[d]ecisions about the application of violent force must not be delegated to machines.”[125] While the concerns emphasized in this statement focus on technology, as discussed above, the inability to make proportionality decisions can run counter to respect for life and the principles of humanity.

In 2015, an even larger group of AI and robotics researchers issued an open letter. As of June 2018, more than 3,500 scientists, as well as more than 20,000 individuals, had signed this call for a ban.[126] The letter warns that these machines could become the “Kalashnikovs of tomorrow” if their development is not prevented.[127] It states that while the signatories “believe that AI has great potential to benefit humanity in many ways,” they “believe that a military AI arms race would not be beneficial for humanity. There are many ways in which AI can make battlefields safer for humans, especially civilians, without creating new tools for killing people.”[128]

In addition to demanding action from others, thousands of technology experts have committed not to engage in actions that would advance the development of fully autonomous weapons. At a world congress held in Stockholm in July 2018, leading AI researchers issued a pledge to “neither participate in nor support the development, manufacture, trade, or use of lethal autonomous weapons.”[129] By the end of the month, more than 2,850 AI experts, scientists, and other individuals, along with 223 technology companies, societies, and organizations from at least 36 countries, had signed. The pledge, which cites moral, accountability, proliferation, and security-related concerns, finds that “the decision to take a human life should never be delegated to a machine.” It states, “There is a moral component to this position, that we should not allow machines to make life-taking decisions for which others—or nobody—will be culpable.”[130] According to the Future of Life Institute, which houses the pledge on its website, the pledge is necessary because “politicians have thus far failed to put into effect” any regulations and laws against lethal autonomous weapons systems.[131]

Industry

High-profile technology companies and their representatives have criticized fully autonomous weapons on various grounds. A Canadian robotics manufacturer, Clearpath Robotics, became the first company to publicly refuse to manufacture “weaponized robots that remove humans from the loop.”[132] In 2014, it pledged to “value ethics over potential future revenue.”[133] In a letter to the public, the company stated that it was motivated by its belief “that the development of killer robots is unwise, unethical, and should be banned on an international scale.” Clearpath continued:

[W]ould a robot have the morality, sense, or emotional understanding to intervene against orders that are wrong or inhumane? No. Would computers be able to make the kinds of subjective decisions required for checking the legitimacy of targets and ensuring the proportionate use of force in the foreseeable future? No. Could this technology lead those who possess it to value human life less? Quite frankly, we believe this will be the case.[134]

The letter shows that fully autonomous weapons raise problems under both the principles of humanity and the dictates of public conscience.

In August 2017, the founders and chief executive officers (CEOs) of 116 AI and robotics companies published a letter calling for CCW states parties to take action on autonomous weapons.[135] The letter opens by stating, “As companies building the technologies in Artificial Intelligence and Robotics that may be repurposed to develop autonomous weapons, we feel especially responsible in raising this alarm.”[136] The letter goes on to highlight the dangers to civilians, risk of an arms race, and possibility of destabilizing effects. It warns that “[o]nce this Pandora’s box is opened, it will be hard to close.”[137] In a similar vein in 2018, Scott Phoenix, CEO of Vicarious, a prominent AI development company, described developing autonomous weapons as among the “world’s worst ideas” because of the likelihood of defects in their codes and vulnerability to hacking.[138]

Google and the companies under its Alphabet group have been at the center of the debate about fully autonomous weapons on multiple occasions. DeepMind is an AI research company that was acquired by Google in 2014. In 2016, it submitted evidence to a UK parliamentary committee in which it described a ban on autonomous weapons as “the best approach to averting the harmful consequences that would arise from the development and use of such weapons.”[139] DeepMind voiced particular concern about the weapons’ “implications for global stability and conflict reduction.”[140] Two years later, more than 3,000 Google employees protested the company’s involvement with “Project Maven,” a US Department of Defense program that aims to use AI to autonomously process video footage taken by surveillance drones. The employees argued that the company should “not be in the business of war,”[141] and more than 1,100 academics supported them in a separate letter.[142] In June 2018, Google agreed to end its involvement in Project Maven once the contract expires in 2019, and it issued ethical principles committing not to develop AI for use in weapons. The principles state that Google is “not developing AI for use in weapons” and “will not design or deploy AI” for technology that causes “overall harm” or “contravenes widely accepted principles of international law and human rights.”[143]

Investors in the technology industry have also started to respond to the ethical concerns raised by fully autonomous weapons. In 2016, the Ethics Council of the Norwegian Government Pension Fund announced that it would monitor investments in the development of these weapons to decide whether they are counter to the council’s guidelines.[144] Johan H. Andresen, council chairman, reiterated that position in a panel presentation for CCW delegates in April 2018.[145]

Opinions of Governments

Governments from around the world have increasingly shared the views of experts and the broader public that the development, production, and use of weapons without meaningful human control is unacceptable. As of April 2018, 26 nations—from Africa, Asia, Europe, Latin America, and the Middle East—have called for a preemptive ban on fully autonomous weapons.[146] In addition, more than 100 states, including those of the Non-Aligned Movement (NAM), have called for a legally binding instrument on such weapons. In a joint statement, members of NAM cited “ethical, legal, moral and technical, as well as international peace and security related questions” as matters of concern.[147] While a complete analysis of government interventions over the past five years is beyond the scope of this report, overall the statements have demonstrated that countries oppose the loss of human control on moral as well as legal, technical, and other grounds. The opinions of these governments, reflective of public concerns, bolster the argument that fully autonomous weapons violate the dictates of public conscience.

The principles embedded in the Martens Clause have played a role in international discussions of fully autonomous weapons since they began in 2013. In that year, Christof Heyns, then special rapporteur on extrajudicial, summary or arbitrary executions, submitted a report to the UN Human Rights Council on fully autonomous weapons, which he referred to as “lethal autonomous robotics.”[148] Emphasizing the importance of human control over life-and-death decisions, Heyns explained that “[i]t is an underlying assumption of most legal, moral and other codes that when the decision to take life or to subject people to other grave consequences is at stake, the decision-making power should be exercised by humans.”[149] He continued: “Delegating this process dehumanizes armed conflict even further and precludes a moment of deliberation in those cases where it may be feasible. Machines lack morality and mortality, and should as a result not have life and death powers over humans.”[150] Heyns also named the Martens Clause as one legal basis for his determination.[151] The 2013 report called for a moratorium on the development of fully autonomous weapons until the establishment of an “internationally agreed upon framework.”[152] A 2016 joint report by Heyns and Maina Kiai, then UN special rapporteur on freedom of peaceful assembly and of association, went a step further, recommending that “[a]utonomous weapons systems that require no meaningful human control should be prohibited.”[153]

In May 2013, in response to Heyns’s report, the UN Human Rights Council held the first discussions of the weapons at the international level.[154] Of the 20 nations that voiced their positions, many articulated concerns about the emerging technology. They often used language related to the Martens Clause or morality more generally. Ecuador explicitly referred to elements of the Martens Clause and stated that leaving life-and-death decisions to machines would contravene the public conscience.[155] Indonesia raised objections related to the principles of humanity discussed above. It criticized the “possible far-reaching effects on societal values, including fundamentally on the protection and the value of life” that could arise from the use of these weapons.[156] Russia recommended that “particular attention” be paid to the “serious implications for societal foundations, including the negating of human life.”[157] Pakistan called for a ban based on the precedent of the preemptive ban on blinding lasers, which was motivated in large part by the Martens Clause.[158] Brazil also addressed issues of morality; it said, “If the killing of one human being by another has been a challenge that legal, moral, and religious codes have grappled with since time immemorial, one may imagine the host of additional concerns to be raised by robots exercising the power of life and death over humans.”[159] While Human Rights Council member states also addressed other important risks of fully autonomous weapons, especially those related to security, morality was a dominant theme.[160]

Since the Human Rights Council’s session in 2013, most diplomatic discussions have taken place under the auspices of the Convention on Conventional Weapons.[161] States parties to the CCW held three informal meetings of experts on what they refer to as “lethal autonomous weapons systems” between 2014 and 2016.[162] At their 2016 Review Conference, they agreed to formalize discussions in a Group of Governmental Experts, a forum that is generally expected to produce an outcome such as a new CCW protocol.[163] More than 80 states participated in the most recent meeting of the group in April 2018. At that meeting, Austria noted that “CCW’s engagement on lethal autonomous weapons stands testimony to the high level of concern about the risk that such weapons entail.”[164] It also serves as an indication that the public conscience is against this technology.

CCW states parties have highlighted the relevance of the Martens Clause at each of their meetings on lethal autonomous weapons systems. At the first meeting in May 2014, for example, Brazil described the Martens Clause as a “keystone” of international humanitarian law, which “‘allows us to navigate safely in new and dangerous waters’ and to feel confident that a human remains protected under the principles of humanity and the dictates of public conscience.”[165] Mexico found “there is absolutely no doubt that the development of these new technologies have to comply with [the] principles” of the Martens Clause.[166] At the second CCW experts meeting in April 2015, Russia described the Martens Clause as “an integral part of customary international law.”[167] Adopting a narrow interpretation of the provision, the United States said that “the Martens Clause is not a rule of international law that prohibits any particular weapon, much less a weapon that does not currently exist.” Nevertheless, it acknowledged that “the principles of humanity and the dictates of public conscience provide a relevant and important paradigm for discussing the moral or ethical issues related to the use of automation in warfare.”[168]

Several CCW states parties have based their objections to fully autonomous weapons on the Martens Clause and its elements. In a joint statement in April 2018, the African Group said that the “principles of humanity and dictates of public conscience as enunciated in the [Martens] Clause must be taken seriously.”[169] The African Group called for a preemptive ban on lethal autonomous weapons systems, declaring that its members found “it inhumane, abhorrent, repugnant, and against public conscience for humans to give up control to machines, allowing machines to decide who lives or dies, how many lives and whose life is acceptable as collateral damage when force is used.”[170] The Holy See condemned fully autonomous weapons because they “could never be a morally responsible subject. The unique human capacity for moral judgment and ethical decision-making is more than a complex collection of algorithms and such a capacity cannot be replaced by, or programed in, a machine.” The Holy See warned that autonomous weapons systems could find normal and acceptable “those behaviors that international law prohibits, or that, albeit not explicitly outlined, are still forbidden by dictates of morality and public conscience.”[171]

At the April 2018 meeting, other states raised issues under the Martens Clause more implicitly. Greece, for example, stated that “it is important to ensure that commanders and operators will remain on the loop of the decision making process in order to ensure the appropriate human judgment over the use of force, not only for reasons related to accountability but mainly to protect human dignity over the decision on life or death.”[172]

CCW states parties have considered a host of other issues surrounding lethal autonomous weapons systems over the past five years. They have highlighted, inter alia, the challenges of complying with international humanitarian law and international human rights law, the potential for an accountability gap, the risk of an arms race and a lower threshold for war, and the weapons’ vulnerability to hacking. Combined with the Martens Clause, these issues have led to convergence of views on the imperative of retaining some form of human control over weapons systems and the use of force. In April 2018, Pakistan noted that “a general sense is developing among the High Contracting Parties that weapons with autonomous functions must remain under the direct control and supervision of humans at all times, and must comply with international law.”[173] Similarly, the European Union stated that its members “firmly believe that humans should make the decisions with regard to the use of lethal force, exert sufficient control over lethal weapons systems they use, and remain accountable for decisions over life and death.”[174]

The year 2018 has also seen increased parliamentary and UN calls for human control. In July, the Belgian Parliament adopted a resolution asking the government to support international efforts to ban the use of fully autonomous weapons.[175] The same month, the European Parliament voted to recommend that the UN Security Council:

work towards an international ban on weapon systems that lack human control over the use of force as requested by Parliament on various occasions and, in preparation of relevant meetings at UN level, to urgently develop and adopt a common position on autonomous weapon systems and to speak at relevant fora with one voice and act accordingly.[176]

In his 2018 disarmament agenda, the UN secretary-general noted, “All sides appear to be in agreement that, at a minimum, human oversight over the use of force is necessary.” He offered to support the efforts of states “to elaborate new measures, including through politically or legally binding arrangements, to ensure that humans remain at all times in control over the use of force.”[177] While the term remains to be defined, requiring “human control” is effectively the same as prohibiting weapons without such control. Therefore, the widespread agreement about the necessity of human control indicates that fully autonomous weapons contravene the dictates of public conscience.

 

VI. The Need for a Preemptive Ban Treaty

The Martens Clause fills a gap when existing treaties fail to specifically address a new situation or technology. In such cases, the principles of humanity and the dictates of public conscience serve as guides for interpreting international law and set standards against which to judge the means and methods of war. In so doing, they provide a baseline for adequately protecting civilians and combatants. The clause, which is a provision of international humanitarian law, also integrates moral considerations into legal analysis.

Existing treaties only regulate fully autonomous weapons in general terms, and thus an assessment of the weapons should take the Martens Clause into account. Because fully autonomous weapons raise concerns under both the principles of humanity and the dictates of public conscience, the Martens Clause points to the urgent need to adopt a specific international agreement on the emerging technology. To eliminate any uncertainty and comply with the elements of the Martens Clause, the new instrument should take the form of a preemptive ban on the development, production, and use of fully autonomous weapons.

No regulation of fully autonomous weapons short of a ban would ensure compliance with the principles of humanity. Fully autonomous weapons would lack the compassion and legal and ethical judgment that facilitate humane treatment of humans. They would face significant challenges in respecting human life. Even if they could comply with legal rules of protection, they would not have the capacity to respect human dignity.

Limiting the use of fully autonomous weapons to certain locations, such as those where civilians are rare, would not sufficiently address these problems. “Harm to others,” which the principle of humane treatment seeks to avoid, encompasses harm to civilian objects, which might be present where civilians themselves are not. The requirement to respect human dignity applies to combatants as well as civilians, so the weapons should not be permitted where enemy troops are positioned. Furthermore, allowing fully autonomous weapons to be developed and to enter national arsenals would raise the possibility of their misuse. They would likely proliferate to actors with no regard for human suffering and no respect for human life or dignity. The 2017 letter from technology company CEOs warned that the weapons could be “weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways.”[178] Regulation that allowed for the existence of fully autonomous weapons would open the door to violations of the principles of humanity.

A ban is also necessary to promote compliance with the dictates of public conscience. An overview of public opinion shows that ordinary people and experts alike have objected to the prospect of fully autonomous weapons on moral grounds. Public opinion surveys have illuminated significant opposition to these weapons based on the problems of delegating life-and-death decisions to machines. Experts have continually called for a preemptive ban on fully autonomous weapons, citing moral along with legal and security concerns. Regulation that allows for the existence of fully autonomous weapons, even if they could only be used in limited circumstances, would be inconsistent with the widespread public belief that fully autonomous weapons are morally wrong.

The statements of governments, another element of the public conscience, make clear that opposition to weapons that lack human control over the selection and engagement of targets extends beyond individuals to countries. More than two dozen countries have explicitly called for a preemptive ban on these weapons,[179] and consensus is emerging regarding the need for human control over the use of force. As noted above, the requirement for human control is effectively equivalent to a ban on weapons without it. Therefore, a ban would best ensure that the dictates of public conscience are met.

The principles of humanity and dictates of public conscience bolster the case against fully autonomous weapons, although, as discussed above, they are not the only matters of concern. Fully autonomous weapons are also problematic under other legal provisions and raise accountability, technological, and security risks. Collectively, these dangers to humanity more than justify the creation of new law that maintains human control over the use of force and prevents fully autonomous weapons from coming into existence.

Acknowledgments

Bonnie Docherty, senior researcher in the Arms Division of Human Rights Watch, was the lead writer and editor of this report. She is also the associate director of armed conflict and civilian protection and a lecturer on law at the International Human Rights Clinic (IHRC) at Harvard Law School. Danielle Duffield, Annie Madding, and Paras Shah, students in IHRC, made major contributions to the research, analysis, and writing of the report. Steve Goose, director of the Arms Division, and Mary Wareham, advocacy director of the Arms Division, edited the report. Dinah PoKempner, general counsel, and Tom Porteous, deputy program director, also reviewed the report. Peter Asaro, associate professor at the School of Media Studies at the New School, provided additional feedback on the report.

This report was prepared for publication by Marta Kosmyna, senior associate in the Arms Division, Fitzroy Hepkins, administrative manager, and Jose Martinez, senior coordinator. Russell Christian produced the cartoon for the report cover.

 

 

[1] For a more in-depth discussion of the pros and cons of fully autonomous weapons, see Human Rights Watch and the Harvard Law School International Human Rights Clinic (IHRC), Making the Case: The Dangers of Killer Robots and the Need for a Preemptive Ban, December 2016, https://www.hrw.org/report/2016/12/09/making-case/dangers-killer-robots-....

[2] Future of Life Institute, “Autonomous Weapons: An Open Letter from AI & Robotics Researchers,” opened on July 28, 2015, https://futureoflife.org/open-letter-autonomous-weapons/ (accessed July 22, 2018).

[3] Convention (II) with Respect to the Laws and Customs of War on Land and its Annex: Regulations concerning the Laws and Customs of War on Land, The Hague, adopted July 29, 1899, entered into force September 4, 1900, pmbl., para. 8.

[4] “The three conventions adopted at the 1899 Conference represented the three broad areas … [of] pacific settlement of international disputes, arms limitation, and the laws of war.” Betsy Baker, “Hague Peace Conferences: 1899 and 1907,” Max Planck Encyclopedia of Public International Law, updated November 2009, http://opil.ouplaw.com/view/10.1093/law:epil/9780199231690/law-978019923... (accessed July 14, 2018), para. 3.

[5] See Antonio Cassese, “The Martens Clause: Half a Loaf or Simply Pie in the Sky?” European Journal of International Law, vol. 11, no. 1 (2000), pp. 193-195; Theodor Meron, “The Martens Clause, Principles of Humanity, and Dictates of Public Conscience,” American Journal of International Law, vol. 94, no. 1 (2000), p. 79 (noting, “[t]he clause was originally designed to provide residual humanitarian rules for the protection of the population of occupied territories, especially armed resisters in those territories.”). A minority of scholars question the conventional narrative that the clause served as a fair compromise. See Rotem Giladi, “The Enactment of Irony: Reflections on the Origins of the Martens Clause,” European Journal of International Law, vol. 25, no. 3 (2014), p. 853 (“His [Martens’] response to the objections raised by Belgium was anything but conciliatory. It was calculated, naturally, to advance legal rules on occupation that suited the interests of the expanding Russian empire he represented…. Martens’ response remains a classic example of power politics veiled by humanitarian rhetoric; it also cunningly harped on the political and professional sensitivities besetting his audience.”).

[6] As of August 6, 2018, the four Geneva Conventions had 196 states parties. See International Committee of the Red Cross (ICRC), “Treaties, States Parties and Commentaries, Geneva Conventions of 12 August 1949 and Additional Protocols, and their Commentaries,” https://ihl-databases.icrc.org/applic/ihl/ihl.nsf/vwTreaties1949.xsp (accessed August 6, 2018).

[7] Geneva Convention for the Amelioration of the Condition of the Wounded and Sick in Armed Forces in the Field, adopted August 12, 1949, 75 U.N.T.S. 31, entered into force October 21, 1950, art. 63; Geneva Convention for the Amelioration of the Condition of Wounded, Sick and Shipwrecked Members of Armed Forces at Sea, adopted August 12, 1949, 75 U.N.T.S. 85, entered into force October 21, 1950, art. 62; Geneva Convention relative to the Treatment of Prisoners of War, adopted August 12, 1949, 75 U.N.T.S. 135, entered into force October 21, 1950, art. 142; Geneva Convention relative to the Protection of Civilian Persons in Time of War, adopted August 12, 1949, 75 U.N.T.S. 287, entered into force October 21, 1950, art. 158 (stating, “The denunciation shall have effect only in respect of the denouncing Power. It shall in no way impair the obligations which the Parties to the conflict shall remain bound to fulfil by virtue of the principles of the law of nations, as they result from the usages established among civilized peoples, from the laws of humanity and the dictates of the public conscience.”).

[8] ICRC, “Commentary of 2016 on Convention (I) for the Amelioration of the Condition of the Wounded and Sick in Armed Forces in the Field, Geneva, 12 August 1949: Article 63: Denunciation,” https://ihl-databases.icrc.org/applic/ihl/ihl.nsf/Comment.xsp?action=ope... (accessed July 14, 2018), para. 3330.

[9] As of August 6, 2018, Additional Protocol I had 174 states parties. See ICRC, “Protocol Additional to the Geneva Conventions of 12 August 1949, and relating to the Protection of Victims of International Armed Conflicts (Protocol I), 8 June 1977,” https://ihl-databases.icrc.org/ihl/INTRO/470 (accessed August 6, 2018).

[10] Protocol Additional to the Geneva Conventions of August 12, 1949 and relating to the Protection of Victims of International Armed Conflicts (Protocol I), adopted June 8, 1977, 1125 U.N.T.S. 3, entered into force December 7, 1978, art. 1(2).

[11] ICRC, “Commentary of 1987 on Protocol Additional to the Geneva Conventions of August 12, 1949 and relating to the Protection of Victims of International Armed Conflicts (Protocol I): Article 1, General Principles and Scope of Application,” https://ihl-databases.icrc.org/applic/ihl/ihl.nsf/Comment.xsp?action=openDocument&documentId=7125D4CBD57A70DDC12563CD0042F793 (accessed July 15, 2018), para. 55.

[12] A notable exception is the 1992 Chemical Weapons Convention. See Convention on the Prohibition of the Development, Production, Stockpiling and Use of Chemical Weapons and on their Destruction, adopted September 3, 1992, 1974 UNTS 45, entered into force April 29, 1997.

[13] The 1925 Geneva Gas Protocol incorporates elements of the Martens Clause in its preamble. Protocol for the Prohibition of the Use in War of Asphyxiating, Poisonous or Other Gases, and of Bacteriological Methods of Warfare, adopted June 17, 1925, 94 L.N.T.S. 65, entered into force February 8, 1928, pmbl., paras. 1-3 (“Whereas the use in war of asphyxiating, poisonous or other gases … has been justly condemned by the general opinion of the civilized world; and Whereas the prohibition of such use has been declared in Treaties to which the majority of Powers of the world are Parties; and to the end that this prohibition shall be universally accepted[] … binding alike the conscience and the practice of nations”).

[14] Convention on the Prohibition of the Development, Production and Stockpiling of Bacteriological (Biological) and Toxin Weapons and on Their Destruction, opened for signature April 10, 1972, 1015 UNTS 163, entered into force March 26, 1975, pmbl., para. 10 (“Convinced that such use would be repugnant to the conscience of mankind and that no effort should be spared to minimize this risk”).

[15] Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons which May be Deemed to be Excessively Injurious or to Have Indiscriminate Effects (CCW), adopted December 10, 1980, 1342 UNTS 137, entered into force December 2, 1983, pmbl., para. 5 (“Confirming their determination that in cases not covered by this Convention and its annexed Protocols or by other international agreements, the civilian population and the combatants shall at all times remain under the protection and authority of the principles of international law derived from established custom, from the principles of humanity and from the dictates of public conscience”).

[16] Convention on the Prohibition of the Use, Stockpiling, Production and Transfer of Anti-Personnel Mines and on Their Destruction, adopted September 18, 1997, 2056 UNTS 241, entered into force March 1, 1999, pmbl., para. 8 (“Stressing the role of public conscience in furthering the principles of humanity as evidenced by the call for a total ban of anti-personnel mines”).

[17] Convention on Cluster Munitions, adopted May 30, 2008, 2688 UNTS 39, entered into force August 1, 2010, pmbl., para. 11 (“Reaffirming that in cases not covered by this Convention or by other international agreements, civilians and combatants remain under the protection and authority of the principles of international law, derived from established custom, from the principles of humanity and from the dictates of public conscience”).

[18] Treaty on the Prohibition of Nuclear Weapons, adopted July 7, 2017, C.N.475.2017.TREATIES-XXVI.9, pmbl., para. 11 (“Reaffirming that any use of nuclear weapons would also be abhorrent to the principles of humanity and the dictates of public conscience”).

[19] Protocol I, art. 1(2).

[20] For instance, in a legal paper on fully autonomous weapons, Switzerland argued: “Accordingly, not everything that is not explicitly prohibited can be said to be legal if it would run counter the principles put forward in the Martens clause. Indeed, the Martens clause may be said to imply positive obligations where contemplated military action would result in untenable humanitarian consequences.” Switzerland, “A ‘Compliance-Based’ Approach to Autonomous Weapon Systems,” U.N. Doc. CCW/GGE.1/2017/WP.9, November 10, 2017, https://www.unog.ch/80256EDD006B8954/(httpAssets)/6B80F9385F6B505FC12581D4006633F8/$file/2017_GGEonLAWS_WP9_Switzerland.pdf (accessed July 15, 2018), para. 18.

[21] In re Krupp, Judgment of July 31, 1948, in Trials of War Criminals before the Nuremberg Military Tribunals: “The Krupp Case,” vol. IX (Washington: US Government Printing Office, 1950), p. 1340.

[22] Advisory Opinion on the Legality of the Threat or Use of Nuclear Weapons, International Court of Justice, July 8, 1996, http://www.icj-cij.org/files/case-related/95/095-19960708-ADV-01-00-EN.pdf (accessed July 15, 2018), para. 78.

[23] Some critics argue that international humanitarian law would adequately cover fully autonomous weapons and note the applicability of disarmament treaties on antipersonnel landmines, cluster munitions, and incendiary weapons. These instruments do not provide specific law on fully autonomous weapons, however. For critics’ view, see Michael N. Schmitt and Jeffrey S. Thurnher, “‘Out of the Loop’: Autonomous Weapon Systems and the Law of Armed Conflict,” Harvard National Security Journal, vol. 4 (2013), p. 276.

[24] See Rupert Ticehurst, “The Martens Clause and the Laws of Armed Conflict,” International Review of the Red Cross, no. 317 (1997), https://www.icrc.org/eng/resources/documents/article/other/57jnhy.htm (accessed July 15, 2018), p. 1 (noting, “The problem faced by humanitarian lawyers is that there is no accepted interpretation of the Martens Clause. It is therefore subject to a variety of interpretations, both narrow and expansive.”).

[25] For example, the British government advanced this interpretation in its briefing before the International Court of Justice during the 1996 Nuclear Weapons Advisory Opinion process, stating: “While the Martens Clause makes clear that the absence of a specific treaty provision on the use of nuclear weapons is not, in itself, sufficient to establish that such weapons are capable of lawful use, the Clause does not, on its own, establish their illegality. The terms of the Martens Clause themselves make it necessary to point to a rule of customary international law which might outlaw the use of nuclear weapons.” Meron, “The Martens Clause, Principles of Humanity, and Dictates of Public Conscience,” American Journal of International Law, p. 85.

[26] France v. Greece, Permanent Court of International Justice, Judgement No. 22, March 17, 1934, http://www.worldcourts.com/pcij/eng/decisions/1934.03.17_lighthouses.htm (accessed July 15, 2018), para. 106 (Separate Opinion of M. Anzilotti) (“[I]t is a fundamental rule in interpreting legal texts that one should not lightly admit that they contain superfluous words: the right course, whenever possible, is to seek for an interpretation which allows a reason and a meaning to every word in the text.”).

[27] See, for example, Michael Salter, “Reinterpreting Competing Interpretations of the Scope and Potential of the Martens Clause,” Journal of Conflict and Security Law, vol. 17, no. 3 (2012), p. 421.

[28] See, for example, In re Krupp, Judgment of July 31, 1948, in Trials of War Criminals before the Nuremberg Military Tribunals: “The Krupp Case,” p. 1340 (asserting that the Martens Clause “is much more than a pious declaration”). See also Cassese, “The Martens Clause,” European Journal of International Law, p. 210 (asserting that most of the states that appeared before the International Court of Justice with regards to the Nuclear Weapons Advisory Opinion “suggested—either implicitly or in a convoluted way—the expansion of the scope of the clause so as to upgrade it to the rank of a norm establishing new sources of law”); ICRC, A Guide to the Legal Review of New Weapons, Means and Methods of Warfare: Measures to Implement Article 36 of Additional Protocol I of 1977 (2006), http://www.icrc.org/eng/resources/documents/publication/p0902.htm (accessed July 15, 2018), p. 17 (stating, “A weapon which is not covered by existing rules of international humanitarian law would be considered contrary to the Martens clause if it is determined per se to contravene the principles of humanity or the dictates of public conscience.”).

[29] Cassese, “The Martens Clause,” European Journal of International Law, p. 212.

[30] Ibid. See also Jochen von Bernstorff, “Martens Clause,” Max Planck Encyclopedia of Public International Law, updated December 2009, http://opil.ouplaw.com/search?sfam=&q=Martens+Clause+&prd=EPIL&searchBtn... (accessed July 15, 2018), para. 13 (“A second reading sees the clause as an interpretative device according to which, in case of doubt, rules of international humanitarian law should be interpreted according to ‘principles of humanity’ and ‘dictates of public conscience.’”).

[31] ICRC, “Ethics and Autonomous Weapons Systems: An Ethical Basis for Human Control?” April 3, 2018, https://www.icrc.org/en/document/ethics-and-autonomous-weapon-systems-et... (accessed July 15, 2018), p. 6. The ICRC has elsewhere acknowledged that states must take the Martens Clause into account when conducting weapons reviews. ICRC, A Guide to the Legal Review of New Weapons, Means and Methods of Warfare, p. 17.

[32] ICRC, “Ethics and Autonomous Weapons Systems: An Ethical Basis for Human Control?” p. 6.

[33] Peter Asaro, “Jus nascendi, Robotic Weapons and the Martens Clause,” in Robot Law, eds. Ryan Calo, Michael Froomkin, and Ian Kerr (Cheltenham, UK: Edward Elgar Publishing, 2016), https://www.elgaronline.com/view/9781783476725.00024.xml (accessed July 15, 2018), p. 386.

[34] ICRC, “Ethics and Autonomous Weapons Systems: An Ethical Basis for Human Control?” p. 6.

[35] CCW Protocol on Blinding Lasers (CCW Protocol IV), adopted October 13, 1995, entered into force July 30, 1998, art. 1. For a discussion of the negotiating history of this protocol and its relationship to discussions about fully autonomous weapons, see Human Rights Watch and IHRC, Precedent for Preemption: The Ban on Blinding Lasers as a Model for a Killer Robots Prohibition, November 2015, https://www.hrw.org/sites/default/files/supporting_resources/robots_and_..., pp. 3-7.

[36] ICRC, Blinding Weapons: Reports of the Meetings of Experts Convened by the ICRC on Battlefield Laser Weapons, 1989-1991 (Geneva: ICRC, 1993), p. 342 (emphasis in original removed).

[37] Ibid., p. 341.

[38] Ibid., p. 85.

[39] Louise Doswald-Beck, “New Protocol on Blinding Laser Weapons,” International Review of the Red Cross, no. 312 (1996), https://www.icrc.org/eng/resources/documents/article/other/57jn4y.htm (accessed July 15, 2018).

[40] Summary of Statement by Human Rights Watch, CCW First Review Conference, “Summary Record of the 6th Meeting,” CCW/CONF.I/SR.6, September 28, 1995, para. 60.

[41] Summary of Statement by the UN Development Programme, CCW First Review Conference, “Summary Record of the 5th Meeting,” CCW/CONF.I/SR.5, September 27, 1995, para. 50.

[42] Summary of Statement by Christoffel Blindenmission (CBM), CCW First Review Conference, “Summary Record of the 6th Meeting,” CCW/CONF.I/SR.6, September 28, 1995, para. 51. CBM is an international Christian development organization.

[43] Summary of Statement by Chile, CCW First Review Conference, “Summary Record of the 14th Meeting,” CCW/CONF.I/SR.13, May 3, 1996, para. 69.

[44] European Parliament, Resolution on the Failure of the International Conference on Anti-Personnel Mines and Laser Weapons, December 4, 1995, https://eur-lex.europa.eu/legal-content/EN/TXT/HTML/?uri=CELEX:51995IP13... (accessed July 15, 2018).

[45] Ibid.

[46] Doswald-Beck, “New Protocol on Blinding Laser Weapons,” International Review of the Red Cross.

[47] Blinding lasers and fully autonomous weapons would differ in some respects. For example, blinding lasers are a specific type of weapon, while fully autonomous weapons constitute a broad class. Instead of undermining the calls for a ban, however, the unique qualities of fully autonomous weapons make a preemptive prohibition even more pressing. See Human Rights Watch and IHRC, Precedent for Preemption, pp. 17-18.

[48] English Oxford Living Dictionaries, “humanity,” https://en.oxforddictionaries.com/definition/humanity (accessed July 15, 2018). See also Merriam Webster, “humanity,” https://www.merriam-webster.com/dictionary/humanity (accessed July 15, 2018) (defining humanity as “compassionate, sympathetic, or generous behavior or disposition; the quality or state of being humane”).

[49] See ICRC, “Rule 87: Humane Treatment,” Customary International Humanitarian Law Database, https://ihl-databases.icrc.org/customary-ihl/eng/docs/v1_rul_rule87 (accessed July 15, 2018).

[50] Ibid. See also V.V. Pustogarov, “The Martens Clause in International Law,” Journal of the History of International Law, vol. 125 (1999), p. 133 (noting, “the principles of humanity are expressed concretely in the provisions prescribing ‘humane treatment’ of the wounded, the sick, prisoners of war and other persons falling beneath the protection of the Geneva Conventions of 1949 and the Protocols of 1977. One can say that ‘humane treatment’ is the main content of humanitarian law.”). In addition, Article 10(1) of the International Covenant on Civil and Political Rights provides, “All persons deprived of their liberty shall be treated with humanity”; International Covenant on Civil and Political Rights (ICCPR), adopted December 16, 1966, G.A. Res. 2200A (XXI), 21 U.N. GAOR Supp. (No. 16) at 52, U.N. Doc. A/6316 (1966), 999 U.N.T.S. 171, entered into force March 23, 1976, art. 10(1).

[51] ICRC, “The Fundamental Principles of the Red Cross: Commentary,” January 1, 1979, https://www.icrc.org/eng/resources/documents/misc/fundamental-principles... (accessed July 15, 2018).

[52] English Oxford Living Dictionaries, “humane,” https://en.oxforddictionaries.com/definition/humane (accessed July 15, 2018) (defining “humane” as “having or showing compassion or benevolence”).

[53] ICRC, “The Fundamental Principles of the Red Cross: Commentary,” January 1, 1979.

[54] In this report, the term “actor” is used to describe an agent deployed in situations, including armed conflict, where they are charged with making moral decisions, i.e., those in which a person’s actions have the potential to harm or help others. Thus, both human beings and fully autonomous weapons are actors for the purposes of this paper. All actors are required, pursuant to the Martens Clause, to comply with the principles of humanity. For a definition of empathy, see English Oxford Living Dictionaries, “empathy,” https://en.oxforddictionaries.com/definition/empathy (accessed July 15, 2018). Anneliese Klein-Pineda has also commented that “both empathy and sympathy require the ability to interpret actions and perceive the motivations or feelings of others.” Anneliese Klein-Pineda, “The Ethics of Robots: Is There an Algorithm for Morality?” Stashlearn, December 22, 2016, https://learn.stashinvest.com/robot-ethics-morality (accessed July 15, 2018).

[55] Christof Heyns, then special rapporteur on extrajudicial, summary, or arbitrary executions, wrote that “[d]ecisions over life and death in armed conflict may require compassion and intuition.” Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions, Christof Heyns, Lethal Autonomous Robotics and the Protection of Life, UN Human Rights Council, A/HRC/23/47, April 9, 2013, https://www.ohchr.org/Documents/HRBodies/HRCouncil/RegularSession/Sessio... (accessed July 15, 2018), para. 55.

[56] Jean Pictet, Development and Principles of International Humanitarian Law (Geneva: Martinus Nijoff and Henry Dunant Institute, 1985), p. 62.

[57] English Oxford Living Dictionaries, “judgement,” https://en.oxforddictionaries.com/definition/judgement (accessed July 15, 2018) (defining “judgement” as “the ability to make considered decisions or come to sensible conclusions”).

[58] James H. Moor, “The Nature, Importance, and Difficulty of Machine Ethics,” IEEE Intelligent Systems, vol. 21, no. 4 (2006), https://ieeexplore.ieee.org/document/1667948/ (accessed July 15, 2018), p. 21.

[59] James H. Moor, “Four Kinds of Ethical Robot,” Philosophy Now, vol. 72 (2007), p. 12.

[60] Moor, “The Nature, Importance, and Difficulty of Machine Ethics,” IEEE Intelligent Systems, p. 21.

[61] ICRC, “The Fundamental Principles of the Red Cross and Red Crescent,” 1996, https://www.icrc.org/eng/assets/files/other/icrc_002_0513.pdf (accessed July 15, 2018) p. 2.

[62] Ibid.

[63] English Oxford Living Dictionaries, “emotion,” https://en.oxforddictionaries.com/definition/emotion (accessed July 15, 2018) (defining emotion as “a strong feeling deriving from one's circumstances, mood, or relationships with others” and “instinctive or intuitive feeling as distinguished from reasoning or knowledge”).

[64] Amanda Sharkey, “Can We Program or Train Robots to be Good?” Ethics and Information Technology (2017), accessed August 3, 2018, doi.org/10.1007/s10676-017-9425-5, p. 8.

[65] See Olivia Goldhill, “Can We Trust Robots to Make Moral Decisions?” Quartz, April 3, 2016, https://qz.com/653575/can-we-trust-robots-to-make-moral-decisions/ (accessed July 15, 2018) (noting, “it’s unlikely robots will be able to address the most sophisticated ethical decisions for the foreseeable future.”).

[66] Sharkey, “Can We Program or Train Robots to be Good?” Ethics and Information Technology, p. 1.

[67] Mary Wareham (Human Rights Watch), “It’s Time for a Binding, Absolute Ban on Fully Autonomous Weapons,” commentary, Equal Times, November 9, 2017, https://www.hrw.org/news/2017/11/09/its-time-binding-absolute-ban-fully-....

[68] Christof Heyns, “Autonomous Weapons in Armed Conflict and the Right to a Dignified Life: An African Perspective,” South African Journal on Human Rights, vol. 33 (2017), accessed July 1, 2018, doi.org/10.1080/02587203.2017.1303903, p. 51.

[69] Ibid., p. 58.

[70] ICRC, “The Fundamental Principles of the International Red Cross and Red Crescent Movement,” August 2015, https://www.icrc.org/sites/default/files/topic/file_plus_list/4046-the_fundamental_principles_of_the_international_red_cross_and_red_crescent_movement.pdf (accessed July 21, 2018), p. 3.

[71] ICCPR, art. 6(1).

[72] For a more detailed analysis of the requirements for use of force under the right to life, see Human Rights Watch and IHRC, Shaking the Foundations: The Human Rights Implications of Killer Robots, May 2014, https://www.hrw.org/report/2014/05/12/shaking-foundations/human-rights-i..., pp. 8-16.

[73] ICRC, “Rule 1: The Principle of Distinction between Civilians and Combatants,” Customary International Humanitarian Law Database, https://ihl-databases.icrc.org/customary-ihl/eng/docs/v1_rul_rule1 (accessed July 21, 2018); ICRC, “Rule 14: Proportionality in Attack,” Customary International Humanitarian Law Database, https://ihl-databases.icrc.org/customary-ihl/eng/docs/v1_rul_rule14 (accessed August 6, 2018); “Military Necessity,” in ICRC, How Does Law Protect in War?, https://casebook.icrc.org/glossary/military-necessity (accessed July 21, 2018).

[74] English Oxford Living Dictionaries, “dignity,” https://en.oxforddictionaries.com/definition/dignity (accessed July 21, 2018) (defining “dignity” as “the quality of being worthy or honourable”). See also Jack Donnelly, “Human Dignity and Human Rights,” in Swiss Initiative to Commemorate the 60th Anniversary of the UDHR, Protecting Dignity: Agenda for Human Rights, June 2009, https://www.legal-tools.org/doc/e80bda/pdf/ (accessed July 21, 2018), p. 10; Human Rights Watch and IHRC, Shaking the Foundations, p. 3; José Pablo Alzina de Aguilar, “Human Dignity according to International Instruments on Human Rights,” Revista Electrónica de Estudios Internacionales, vol. 22 (2011), p. 8. (“[International human rights instruments] also say that rights which stem from that dignity, or at least the most important ones, are universal and inviolable.”).

[75] For example, the preamble of the Universal Declaration of Human Rights asserts that “recognition of the inherent dignity and of the equal and inalienable rights of all members of the human family is the foundation of freedom, justice and peace in the world.” Universal Declaration of Human Rights (UDHR), adopted December 10, 1948, G.A. Res. 217A(III), U.N. Doc. A/810 at 71 (1948), pmbl., para. 1.

[76] African Charter on Human and Peoples’ Rights (Banjul Charter), adopted June 27, 1981, CAB/LEG/67/3 rev. 5, 21 I.L.M. 58, entered into force October 21, 1986, art. 7.

[77] There is an overlap between the types of respect to the extent that an actor who truly respects human dignity is more likely to take the actions required in order to protect human life.

[78] ICRC, “Ethics and Autonomous Weapons Systems: An Ethical Basis for Human Control?” p. 10.

[79] Heyns, “Autonomous Weapons in Armed Conflict and the Right to a Dignified Life,” South African Journal on Human Rights, pp. 62-63 (stating, “A central thrust of the notion of human dignity is the idea that humans should not be treated as something similar to an object that simply has an instrumental value (as is the case e.g. with slavery or rape) or no value at all (as with many massacres).”).

[80] Human Rights Watch and IHRC, Making the Case, p. 6. See also Final Report to the Prosecutor by the Committee Established to Review the NATO Bombing Campaign against the Federal Republic of Yugoslavia, International Criminal Tribunal for the Former Yugoslavia, http://www.difesa.it/SMD_/CASD/IM/ISSMI/Corsi/Corso_Consigliere_Giuridic... (accessed July 21, 2018), para. 50.

[81] Human Rights Watch and IHRC, Making the Case, p. 7. See also Olivier Corten, “Reasonableness in International Law,” Max Planck Encyclopedia of Public International Law, updated March 2013, http://opil.ouplaw.com/view/10.1093/law:epil/9780199231690/law-978019923... (accessed July 22, 2018), para. 1 (noting, “Reasonableness is also generally perceived as opening the door to several ethical or moral, rather than legal, considerations.”).

[82] Fully autonomous weapons would face the same difficulties determining whether force is necessary and proportionate in law enforcement situations and avoiding it when possible, which are requirements for upholding the right to life. See Human Rights Watch and IHRC, Shaking the Foundations, pp. 8-14. See also Peter Asaro, “‘Hands Up, Don’t Shoot!’ HRI and the Automation of Police Use of Force,” Journal of Human-Robot Interaction, vol. 5, no. 3 (2016), http://humanrobotinteraction.org/journal/index.php/HRI/article/view/301/... (accessed July 22, 2018).

[83] Heyns, “Autonomous Weapons in Armed Conflict and the Right to a Dignified Life,” South African Journal on Human Rights, p. 64.

[84] Lt. Col. Dave Grossman, On Killing: The Psychological Cost of Learning to Kill in War and Society (New York: Little, Brown and Company, 1995), p. 4. Similarly, Armin Krishnan wrote that “One of the greatest restraints for the cruelty in war has always been the natural inhibition of humans not to kill or hurt fellow human beings. The natural inhibition is, in fact, so strong that most people would rather die than kill somebody.” Armin Krishnan, Killer Robots: Legality and Ethicality of Autonomous Weapons (Farnham: Ashgate Publishing Limited, 2009), p. 130.

[85] ICRC, “Ethics and Autonomous Weapons Systems: An Ethical Basis for Human Control?” pp. 10, 12.

[86] The ICRC has stated that the importance of respecting the individual personality and dignity of the individual is vital to the principle of humanity. See ICRC, “The Fundamental Principles of the Red Cross: Commentary,” January 1, 1979.

[87] Christof Heyns, “Autonomous Weapon Systems: Human Rights and Ethical Issues” (presentation to the CCW Meeting of Experts on Lethal Autonomous Weapon Systems, April 14, 2016), transcript on file with Human Rights Watch.

[88] Erin Hunt and Piotr Dobrzynski, “The Right to Dignity and Autonomous Weapons Systems,” CCW Report, April 11, 2018, http://reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/20... (accessed July 22, 2018), p. 5.

[89] Larry May, “Hobbes, Law, and Public Conscience,” Critical Review of International Social and Political Philosophy, vol. 19 (2016), accessed July 22, 2018, doi.org/10.1080/13698230.2015.1122352. See generally Heping Dang, International Law, Human Rights, and Public Opinion: The Role of the State in Educating on Human Rights Standards (London: Taylor & Francis Group, 2017).

[90] English Oxford Living Dictionaries, “conscience,” https://en.oxforddictionaries.com/definition/conscience (accessed July 22, 2018) (defining “conscience” as “[a] person's moral sense of right and wrong, viewed as acting as a guide to one's behavior”). See also Merriam-Webster, “conscience,” https://www.merriam-webster.com/dictionary/conscience (accessed July 22, 2018) (defining “conscience” as “the sense or consciousness of the moral goodness or blameworthiness of one's own conduct, intentions, or character together with a feeling of obligation to do right or be good”).

[91] Asaro, “Jus nascendi, Robotic Weapons and the Martens Clause,” pp. 374-375.

[92] Meron notes that looking at a range of opinions helps guard against the potentially immoral views of both governments and people. Meron also notes that some have argued that international human rights law, which was discussed in Chapter IV of this report, can play a role in determining the public conscience. Meron, “The Martens Clause, Principles of Humanity, and Dictates of Public Conscience,” American Journal of International Law, pp. 83-84 (noting, “[t]hat public opinion—so influential in our era—has a role to play in the development of international law is not an entirely new phenomenon.”).

[93] Asaro, “Jus nascendi, Robotic Weapons and the Martens Clause,” pp. 374-375. See also V.V. Pustogarov, “The Martens Clause in International Law,” Journal of the History of International Law, pp. 132-133.

[94] Asaro, “Jus nascendi, Robotic Weapons and the Martens Clause,” pp. 373-374.

[95] Ibid., p. 375 (“Indeed, the best place to look for emerging norms and the dictates of public conscience are in the public forums in which states and individuals attempt to grapple with, and articulate that conscience.”).

[96] Ibid. (“That content should also be elicited through public discussion, as well as academic scholarship, artistic and cultural expressions, individual reflection, collective action, and additional means, by which society deliberates its collective moral conscience.”).

[97] Michael Wood and Omri Sender, “State Practice,” Max Planck Encyclopedia of Public International Law, updated January 2017, http://opil.ouplaw.com/abstract/10.1093/law:epil/9780199231690/law-97801... (accessed July 22, 2018) (“In essence, the practice must be general (that is, sufficiently widespread and representative, as well as consistent), and accompanied by a recognition that a rule of law or legal obligation is involved.”).

[98] With the support of the Non-Aligned Movement, which counts more than 100 states as members, the number of states backing this position far surpasses 100. See, for example, Statement by Venezuela on behalf of the Non-Aligned Movement, CCW Group of Governmental Experts (GGE) on Lethal Autonomous Weapons Systems, Geneva, March 28, 2018, https://www.unog.ch/80256EDD006B8954/(httpAssets)/E9BBB3F7ACBE8790C125825F004AA329/$file/CCW_GGE_1_2018_WP.1.pdf (accessed July 22, 2018); Statement by the African Group, CCW GGE on Lethal Autonomous Weapons Systems, Geneva, April 9-13, 2018, http://reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/20... (accessed July 22, 2018).

[99] For examples of states opposed to a ban, see Statement by the United States, CCW GGE on Lethal Autonomous Weapons Systems, Geneva, April 13, 2018, https://geneva.usmission.gov/2018/04/17/u-s-statement-on-the-outcome-of-... (accessed July 22, 2018); Statement by Israel, CCW Meeting of States Parties, Geneva, November 15, 2017, http://embassies.gov.il/UnGeneva/priorities-statements/Disarmament/Docum... (accessed July 22, 2018).

[100] Charli Carpenter, “US Public Opinion Poll on Lethal Autonomous Weapons,” June 2013, http://duckofminerva.dreamhosters.com/wp-content/uploads/2013/06/UMass-S... (accessed July 22, 2018); Open Roboethics Initiative, “The Ethics and Governance of Lethal Autonomous Weapons Systems: An International Public Opinion Poll,” November 9, 2015, http://www.openroboethics.org/wp-content/uploads/2015/11/ORi_LAWS2015.pdf (accessed July 22, 2018); Chris Jackson, “Three in Ten Americans Support Using Autonomous Weapons,” Ipsos, February 7, 2017, https://www.ipsos.com/en-us/news-polls/three-ten-americans-support-using... (accessed July 22, 2018).

[101] Of the 55 percent, 39 percent said they “strongly oppose” and 16 percent “somewhat oppose” using fully autonomous weapons. To assess the effects of language, the survey described the weapons as “completely autonomous [robotic weapons/lethal robots].” Charli Carpenter, “US Public Opinion Poll on Lethal Autonomous Weapons.” These figures are based on a nationally representative online poll of 1,000 Americans conducted by YouGov. Respondents were an invited group of internet users (YouGov Panel) matched and weighted on gender, age, race, income, region, education, party identification, voter registration, ideology, political interest, and military status. The margin of error for the results is +/- 3.6 percent.

[102] Ibid.

[103] Ibid. According to the survey, 33 percent said they “strongly support” a campaign and 20 percent said they “somewhat support” it.

[104] Willem Staes, “Nieuw onderzoek: 60% van de Belgen wil internationaal verbod op ‘killer robots,’” Pax Christi Vlaanderen, July 3, 2018, https://www.paxchristi.be/nieuws/nieuw-onderzoek-60-van-de-belgen-wil-in... (accessed July 22, 2018) (unofficial translation).

[105] Open Roboethics Initiative, “The Ethics and Governance of Lethal Autonomous Weapons Systems: An International Public Opinion Poll,” pp. 4, 8.

[106] Ibid., p. 7.

[107] Ibid.

[108] Chris Jackson, “Three in Ten Americans Support Using Autonomous Weapons.”

[109] Ibid. The countries most strongly opposed to the use of these weapons were Russia (69% opposed), Peru (67% opposed), Spain (66% opposed), and Argentina (66% opposed). The countries that viewed their use most favorably were India (60% in favor), China (47% in favor), and the US (34% in favor).

[110] Campaign to Stop Killer Robots, “Who We Are,” April 2018, https://www.stopkillerrobots.org/coalition/ (accessed July 22, 2018).

[111] Ibid.

[112] See Campaign to Stop Killer Robots, “Bibliography,” https://www.stopkillerrobots.org/bibliography/ (accessed July 22, 2018).

[113] Statement by Human Rights Watch, CCW GGE on Lethal Autonomous Weapons Systems, Geneva, April 9, 2018, https://www.hrw.org/news/2018/04/09/statement-human-rights-watch-convent....

[114] Statement by the ICRC, CCW GGE on Lethal Autonomous Weapons Systems, Geneva, November 15, 2017, https://www.icrc.org/en/document/expert-meeting-lethal-autonomous-weapon... (accessed July 22, 2018).

[115] Statement by the ICRC, CCW GGE on Lethal Autonomous Weapons Systems, Geneva, April 11, 2018, http://reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/20..., p. 1.

[116] Ibid.

[117] Ibid.

[118] Nobel Women's Initiative, “Nobel Peace Laureates Call for Preemptive Ban on Killer Robots,” May 12, 2014, http://nobelwomensinitiative.org/nobel-peace-laureates-call-for-preempti... (accessed July 22, 2018).

[119] Ibid.

[120] PAX, “Religious Leaders Call for a Ban on Killer Robots,” November 12, 2014, https://www.paxforpeace.nl/stay-informed/news/religious-leaders-call-for... (accessed July 22, 2018).

[121] Campaign to Stop Killer Robots, “Who Supports the Call to Ban Killer Robots?” June 2017, http://www.stopkillerrobots.org/wp-content/uploads/2013/03/KRC_ListBanEn... (accessed July 22, 2018), p. 1.

[122] PAX, “Religious Leaders Call for a Ban on Killer Robots.”

[123] Frank Sauer, International Committee for Robot Arms Control (ICRAC), “The Scientists’ Call … to Ban Autonomous Lethal Robots,” November 11, 2012, https://www.icrac.net/the-scientists-call/ (accessed July 22, 2018).

[124] Campaign to Stop Killer Robots, “Scientists Call for a Ban,” October 16, 2013, https://www.stopkillerrobots.org/2013/10/scientists-call/ (accessed July 22, 2018). The signatories hailed from 37 different countries and included numerous university professors.

[125] “Computing Experts from 37 Countries Call for Ban on Killer Robots,” ICRAC press release, October 15, 2013, https://www.icrac.net/wp-content/uploads/2018/06/Scientist-Call_Press-Re... (accessed July 22, 2018), p. 1; ICRAC, “As Computer Scientists, Engineers, Artificial Intelligence Experts, Roboticists and Professionals from Related Disciplines, We Call for a Ban on the Development and Deployment of Weapon Systems in which the Decision to Apply Violent Force is Made Autonomously,” June 2018, https://www.icrac.net/wp-content/uploads/2018/06/List-of-Signatories-ICR... (accessed July 22, 2018).

[126] The signatories included: Tesla CEO Elon Musk, Apple co-founder Steve Wozniak, Skype co-founder Jaan Tallinn, Professor Stephen Hawking, Professor Noam Chomsky, DeepMind leader Demis Hassabis (and 21 of his engineers), IBM Watson design leader Kathryn McElroy, Facebook Head of AI Research Yann LeCun, Twitter CEO Jack Dorsey, Nobel Laureate in physics Frank Wilczek, former Canadian Minister of Defense Hon. Jean Jacques Blais, and numerous professors. Future of Life Institute, “Autonomous Weapons: An Open Letter from AI & Robotics Researchers.”

[127] Ibid.

[128] Ibid.

[129] Future of Life Institute, “Lethal Autonomous Weapons Pledge,” https://futureoflife.org/lethal-autonomous-weapons-pledge/ (accessed July 30, 2018).

[130] Ibid.

[131] Ariel Conn, “AI Companies, Researchers, Engineers, Scientists, Entrepreneurs, and Others Sign Pledge Promising Not to Develop Lethal Autonomous Weapons,” Future of Life Institute press release, July 18, 2018, https://futureoflife.org/2018/07/18/ai-companies-researchers-engineers-s... (accessed July 30, 2018).

[132] Meghan Hennessy, “Clearpath Robotics Takes Stance Against ‘Killer Robots,’” Clearpath Robotics press release, August 13, 2014, https://www.clearpathrobotics.com/2014/08/clearpath-takes-stance-against... (accessed July 22, 2018).

[133] Ibid.

[134] Ibid.

[135] Samuel Gibbs, “Elon Musk Leads 116 Experts Calling for Outright Ban of Killer Robots,” Guardian, August 20, 2017, https://www.theguardian.com/technology/2017/aug/20/elon-musk-killer-robo... (accessed July 22, 2018).

[136] Future of Life Institute, “An Open Letter to the United Nations Convention on Certain Conventional Weapons,” https://futureoflife.org/autonomous-weapons-open-letter-2017/ (accessed July 22, 2018).

[137] Ibid.

[138] Barbara Booth, “‘Autonomous Weapons are among the World's Dumbest Ideas’: A.I. CEO,” CNBC, March 15, 2018, https://www.cnbc.com/2018/03/15/autonomous-weapons-are-among-the-worlds-dumbest-ideas-a-i-ceo.html (accessed July 22, 2018).

[139] Parliament of the United Kingdom, Science and Technology Committee, Robotics and Artificial Intelligence Inquiry, “Written Evidence Submitted by Google DeepMind, ROB0062,” May 2016, http://data.parliament.uk/writtenevidence/committeeevidence.svc/evidence... (accessed July 22, 2018), para. 5.3.

[140] Ibid.

[141] Scott Shane and Daisuke Wakabayashi, “‘The Business of War’: Google Employees Protest Work for the Pentagon,” New York Times, April 4, 2018, https://www.nytimes.com/2018/04/04/technology/google-letter-ceo-pentagon... (accessed July 22, 2018).

[142] ICRAC, “Open Letter in Support of Google Employees and Tech Workers,” June 25, 2018, https://www.icrac.net/open-letter-in-support-of-google-employees-and-tec... (accessed July 22, 2018).

[143] Sundar Pichai, “AI at Google: Our Principles,” https://www.blog.google/technology/ai/ai-principles/ (accessed July 30, 2018). Other similar sets of ethical guidelines that seek to ensure AI benefits, not harms, human beings include: “Montreal Declaration for Responsible AI,” https://www.montrealdeclaration-responsibleai.com/the-declaration (accessed July 22, 2018); Future of Life, “Asilomar AI Principles,” 2017, https://futureoflife.org/ai-principles/ (accessed July 24, 2018) (signed by 1,273 AI and robotics researchers and 2,541 others); George Dvorsky, “UK Government Proposes Five Basic Principles to Keep Humans Safe From AI,” Gizmodo, April 16, 2018, https://gizmodo.com/uk-government-proposes-five-basic-principles-to-keep... (accessed July 22, 2018).

[144] “Norwegian Fund Considers Killer Robots,” Campaign to Stop Killer Robots press release, March 19, 2016, https://www.stopkillerrobots.org/2016/03/norwayfund/ (accessed July 22, 2018).

[145] “Ban on Killer Robots Gaining Ground,” PAX press release, April 16, 2018, https://www.paxforpeace.nl/stay-informed/news/ban-on-killer-robots-gaini... (accessed August 3, 2018).

[146] Campaign to Stop Killer Robots, “Country Views on Killer Robots,” April 13, 2018, https://www.stopkillerrobots.org/wp-content/uploads/2018/04/KRC_CountryV... (accessed July 22, 2018). These nations are: Algeria, Argentina, Austria, Bolivia, Brazil, Chile, China (calls for a ban on use only), Colombia, Costa Rica, Cuba, Djibouti, Ecuador, Egypt, Ghana, Guatemala, the Holy See, Iraq, Mexico, Nicaragua, Pakistan, Panama, Peru, the State of Palestine, Uganda, Venezuela, and Zimbabwe.

[147] Statement by Venezuela on behalf of the Non-Aligned Movement, CCW GGE on Lethal Autonomous Weapons Systems, Geneva, March 28, 2018.

[148] See generally Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions, Christof Heyns, Lethal Autonomous Robotics and the Protection of Life.

[149] Ibid., para. 89.

[150] Ibid., para. 94.

[151] Ibid., paras. 89, 100.

[152] Ibid., para. 113.

[153] Joint Report of the Special Rapporteur on the Rights to Freedom of Peaceful Assembly and of Association and the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions on the Proper Management of Assemblies to the UN Human Rights Council, A/HRC/31/66, February 4, 2016, http://www.refworld.org/docid/575135464.html (accessed August 6, 2018), para. 67(f).

[154] Campaign to Stop Killer Robots, “Chronology,” https://www.stopkillerrobots.org/chronology/ (accessed July 22, 2018).

[155] Statement by Ecuador, CCW Meeting of States Parties, Geneva, October 25, 2013, http://www.reachingcriticalwill.org/images/documents/Disarmament-fora/1c... (accessed July 22, 2018), p. 2.

[156] Statement by Indonesia, Interactive Dialogue with the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions, UN Human Rights Council, 23rd Session, Geneva, May 30, 2013, http://stopkillerrobots.org/wp-content/uploads/2013/05/HRC_Indonesia_09_... (accessed July 22, 2018).

[157] Statement by Russia, Interactive Dialogue, UN Human Rights Council, 23rd Session, Geneva, May 30, 2013, translated in Campaign to Stop Killer Robots, “Report on Outreach on the UN Report on ‘Lethal Autonomous Robotics,’” http://stopkillerrobots.org/wp-content/uploads/2013/03/KRC_ReportHeynsUN... (accessed July 22, 2018), p. 19.

[158] Statement by Pakistan, Interactive Dialogue, UN Human Rights Council, 23rd Session, Geneva, May 30, 2013, http://stopkillerrobots.org/wp-content/uploads/2013/05/HRC_Pakistan_09_3... (accessed July 22, 2018), p. 2. See also Campaign to Stop Killer Robots, “Chronology.”

[159] Statement by Brazil, Interactive Dialogue, UN Human Rights Council, 23rd Session, Geneva, May 30, 2013, http://stopkillerrobots.org/wp-content/uploads/2013/05/HRC_Brazil_09_30M... (accessed July 22, 2018), p. 2.

[160] With regard to security concerns, Indonesia noted the potential impacts on “international stability and security” that could arise from the use of such weapons. Statement by Indonesia, Interactive Dialogue, UN Human Rights Council, 23rd Session, Geneva, May 30, 2013. The Latin American Network GRULAC emphasized that “these systems could lead to a ‘normalization of conflict’, and to a possible arms race that would create divisions among States and weaken international law.” Statement by Argentina on behalf of GRULAC, Interactive Dialogue, UN Human Rights Council, 23rd Session, Geneva, May 30, 2013, http://stopkillerrobots.org/wp-content/uploads/2013/05/HRC_Argentina_09_... (accessed July 22, 2018), p. 2.

[161] Campaign to Stop Killer Robots, “Chronology.”

[162] Ibid.

[163] Ibid.

[164] Statement by Austria, CCW GGE on Lethal Autonomous Weapons Systems, Geneva, April 9-13, 2018, https://www.unog.ch/80256EDD006B8954/(httpAssets)/AA0367088499C566C1258278004D54CD/$file/2018_LAWSGeneralExchang_Austria.pdf (accessed July 22, 2018), p. 1.

[165] Summary of Statement by Brazil, CCW Informal Meeting of Experts on Lethal Autonomous Weapons Systems, Geneva, May 13, 2014, in Campaign to Stop Killer Robots, “Report on Activities: CCW Informal Meeting of Experts on Lethal Autonomous Weapons Systems,” May 13-16, 2014, http://www.stopkillerrobots.org/wp-content/uploads/2013/03/KRC_CCWreport... (accessed July 22, 2018), p. 27.

[166] Summary of Statement by Mexico, CCW Informal Meeting of Experts on Lethal Autonomous Weapons Systems, Geneva, May 13, 2014, in Campaign to Stop Killer Robots, “Report on Activities,” p. 30.

[167] Summary of Statement by Russia, CCW Informal Meeting of Experts on Lethal Autonomous Weapons Systems, Geneva, April 16, 2015, in Campaign to Stop Killer Robots, “Report on Activities,” p. 18.

[168] Statement by the United States, CCW Informal Meeting of Experts on Lethal Autonomous Weapons Systems, Geneva, April 16, 2015, https://geneva.usmission.gov/2015/04/16/ccw-informal-meeting-of-experts-... (accessed July 22, 2018).

[169] Statement by the African Group, CCW GGE on Lethal Autonomous Weapons Systems, Geneva, April 9, 2018.

[170] Ibid.

[171] Statement by the Holy See, CCW GGE on Lethal Autonomous Weapons Systems, Geneva, April 9, 2018, http://reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/20... (accessed July 22, 2018), p. 1.

[172] Statement by Greece, CCW GGE on Lethal Autonomous Weapons Systems, Geneva, April 9, 2018, http://reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/20... (accessed July 22, 2018), p. 2.

[173] Statement by Pakistan, CCW GGE on Lethal Autonomous Weapons Systems, Geneva, April 9, 2018, http://reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/20... (accessed July 22, 2018), p. 2.

[174] Statement by the European Union, CCW GGE on Lethal Autonomous Weapons Systems, Geneva, April 9, 2018, http://reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/20... (accessed July 22, 2018), p. 2.

[175] “Belgium Votes to Ban Killer Robots,” PAX press release, July 23, 2018, https://www.paxforpeace.nl/stay-informed/news/belgium-votes-to-ban-kille... (accessed August 9, 2018).

[176] “European Parliament Recommendation of 5 July 2018 to the Council on the 73rd session of the United Nations General Assembly,” P8_TA-PROV(2018)0312, adopted July 5, 2018, http://www.europarl.europa.eu/sides/getDoc.do?pubRef=-//EP//NONSGML+TA+P... (accessed July 24, 2018), art. 1(av).

[177] UN Secretary-General António Guterres, Securing Our Common Future: An Agenda for Disarmament (New York: UN Office of Disarmament Affairs, 2018), https://front.un-arm.org/documents/SG+disarmament+agenda_1.pdf (accessed July 22, 2018), p. 55.

[178] Future of Life Institute, “An Open Letter to the United Nations Convention on Certain Conventional Weapons.”

[179] Campaign to Stop Killer Robots, “Country Views on Killer Robots.”

Author: Human Rights Watch