In Paris this week on an official visit, Azerbaijan’s autocratic President Ilham Aliyev has already scored one photo op. Anyone reading yesterday’s Azeri media could see dozens of photos of Aliyev posing with leaders of top French companies, including Airbus, Suez, and Credit Agricole.

Azerbaijan's President Ilham Aliyev (L) shakes hands with his French counterpart Francois Hollande as they visit a local French school under construction in Baku, May 11, 2014.

© 2014 Reuters

Today, President Hollande will receive President Aliyev and host an official dinner at the Palais de l’Élysée. Again, Parisian photo ops abound. But amid the flashing cameras, one has to wonder where Azerbaijan’s repression of critics and jailing of opponents fits into the new relationship between Paris and Baku.

In the past few years, Azerbaijani authorities have aggressively gone after the country’s once vibrant civil society, jailing dozens of activists, journalists, and political opponents. They have also adopted draconian legislation making it virtually impossible for independent non-governmental organizations to operate.

One year ago, as Azerbaijan’s economy started to suffer from falling oil prices, several of those detained on political grounds were released. That was an important first step, but hopes for progress were short-lived.

Many of those released face travel bans or obstacles to their activities. Dozens are still locked up on political grounds, including opposition activist Ilgar Mammadov, despite repeated calls by the Strasbourg-based Council of Europe for his immediate release. And more activists have been thrown in jail. Recently, one of the country’s most popular journalists and bloggers, Mehman Huseynov, was sentenced to two years in prison for allegedly defaming the police, in retaliation for his brave public denunciation of the police abuses he suffered.

When visiting Paris, Brussels, or other European capitals, President Aliyev hopes to get more business opportunities and investment in Azerbaijan. But he prefers to ignore that the people of Azerbaijan want human rights protections, transparency, and good governance. Those standing up for these values are routinely exposed to attacks and harassment.

Yet last week’s decision by the Extractive Industries Transparency Initiative (EITI), an international coalition promoting better governance in resource-rich countries, to suspend Azerbaijan precisely because of its actions against civil society sends the clearest possible message to potential investors that the crackdown cannot be ignored.

President Hollande should reject a narrative that only finance and the economy matter in Azerbaijan. Human rights should be as central to France’s foreign policy as any other interest.

Hollande should publicly call for the release of Ilgar Mammadov and all those detained in retaliation for their activism and criticism. A failure to explicitly support human rights principles would be the worst message to those unjustly waiting behind bars.

Author: Human Rights Watch

Arvind Ganesan is the director of Human Rights Watch’s Business and Human Rights Division. He leads the organization’s work to expose human rights abuses linked to business and other economic activity, hold institutions accountable, and develop standards to prevent future abuses. This work has included research and advocacy on a wide range of issues including the extractive industries; public and private security providers; international financial institutions; freedom of expression and information through the internet; labor rights; supply chain monitoring and due diligence regimes; corruption; sanctions; and predatory practices against the poor. Ganesan’s work has covered countries such as Angola, Azerbaijan, Burma, China, Colombia, the Democratic Republic of Congo, Equatorial Guinea, India, Indonesia, the United States, and Nigeria. His recent research has focused on predatory lending practices and governance issues on Native American reservations in the United States. He has written numerous reports, op-eds, and other articles and is widely cited by the media.

Ganesan has also worked to develop industry standards to ensure companies and other institutions respect human rights. He is a founder of the Voluntary Principles on Security and Human Rights for the oil, gas, and mining industries and is a founding member of the Global Network Initiative (GNI) for the internet and telecommunications industries, where he also serves on the board. Ganesan has helped to develop standards for international financial institutions such as the World Bank, and regularly engages governments in an effort to develop mandatory rules or strengthen existing standards such as the Kimberley Process. He serves on the board of EGJustice, a nongovernmental organization that promotes good governance in Equatorial Guinea, and is a member of the International Corporate Accountability Roundtable (ICAR)’s steering committee.

Before joining Human Rights Watch, Ganesan worked as a medical researcher. He attended the University of Oklahoma.


Garment workers are leading a global fight against poor factory inspections. A couple of weeks ago, Pakistani worker representatives and European rights groups filed a complaint with Italian authorities against RINA, an auditing firm that issued a report certifying the Ali Enterprises factory. Weeks after RINA issued an SA8000 certificate—a standard for labor and social compliance in the industry—the factory burned down, killing more than 250 workers.

So how can a factory that is certified burn down? And are “social audits” and certifications working?

A couple of months ago, Andrew S. and I sat at a roadside eatery in one of Asia’s apparel hubs. A plateful of starters, a pot of rice, and some iced coffee, and three hours flew by as he described his experiences inspecting garment factories for labor conditions and treatment of workers. The process is known as a “social audit,” and it is a huge industry.

Andrew (not his real name) had spent more than 15 years conducting these audits across five countries, first visiting garment factories for an apparel company and later as a freelance auditor. Apparel companies or factories hire auditing firms or freelance auditors to inspect social and labor conditions and report back with their findings and how to fix problems they identify.

Women in the sewing division of a factory in Phnom Penh, Cambodia’s capital. Women constitute about 90 percent of the workforce in Cambodia’s garment industry.

© 2014 Samer Muscati/ Human Rights Watch
Consumers who are concerned about the conditions in garment factories rely on claims that brands monitor the factories that produce for them through credible social audits. They have a right to know whether the social audit process actually works.   

That’s why a recent case brought before German authorities is so important and why it’s so troubling to know that these audits are riddled with problems. The German authorities’ findings in a June 26 report came at the end of a two-year process in response to complaints about a social audit conducted by TUV India and its parent company, TUV Rheinland, at a factory in Bangladesh.

It brought me back to my conversation with Andrew. I would have expected him to defend the social audit process. But the frank conversation I had with him provided an in-depth look at just how bad the problems are, and how urgent the need for change.

In 2016, survivors of the Rana Plaza disaster and human rights groups filed a complaint against these companies with the German National Contact Point for the Organization for Economic Co-operation and Development (OECD). National Contact Points for OECD members hear disputes about the application of the OECD Guidelines for Multinational Enterprises, a key set of standards defining rights-respecting business conduct.

The complainants questioned TUV India’s social audit methods and findings for Phantom Apparel Ltd., saying that its social audit had failed to detect the illegal use of child labor and violations of workers’ freedom of association.

Phantom Apparel Ltd. was one of the five factories housed in the eight-story Rana Plaza building in Bangladesh. The building collapsed in 2013, killing 1,135 workers and injuring more than 2,000 others. Among other things, TUV said after the disaster that it used audit tools and procedures from the Business and Social Compliance Initiative, a leading business association, but that its system did not cover fire and building safety.

Over the last few months, I’ve spoken to a good number of people in the auditing industry and other industry experts. Some of the most scathing criticism and candid insights about the “social audit” industry have come from experienced auditors themselves. Andrew minced no words. He had repeatedly seen the system fail workers.

German authorities have identified numerous concerns with the auditing industry. These include auditors’ technical expertise, the methodology for audits, and how the audits are paid for. The authorities also expressed concerns about whether factories actually fix the problems and what brands do to monitor the process, and whether workers have ways of raising grievances after the auditors leave the premises. The German report identified many of the challenges that auditors like Andrew themselves have raised in my conversations with them.

Social compliance auditing is a multi-billion dollar industry. At one end of the spectrum are huge corporate behemoths, including TUV Rheinland, TUV Nord, TUV SUD, SGS, UL, BV, and Intertek. These companies provide a range of services, including product and quality testing, and social audits. At the other end of the spectrum are smaller firms, many of them providing specialized services. Social audits are conducted across various industries and not just the apparel industry.

Germany is well placed to lead a dialogue that would fundamentally alter the social audits business: its Partnership for Sustainable Textiles brings together apparel companies and nongovernmental organizations and aims to improve garment industry policies and accountability for the factories they source from. Italian authorities, too, should push strongly for an overhaul of the system. Consumers should be able to feel confident that the workers who make their clothing work in safe conditions and are treated with decency and respect.

Author: Human Rights Watch

More than 60 institutional investors urged jewelry manufacturers and retailers to address human rights risks in their supply chains this week, showing that environmental and human rights advocates aren’t the only ones concerned about abuses in gold and diamond mining.

The investors – which include asset managers, pension funds, and faith-based institutions – expressed concern about child and forced labor in mining, environmental damage, and displacement of indigenous peoples. They called for stronger action from the jewelry industry to ensure their products do not contribute to human rights abuses, saying that responsible sourcing is not only the ethical thing to do, but also good business practice. When businesses fail to respect human rights, they risk reputational harm and expensive litigation.          

The investors are part of the Investor Alliance for Human Rights. Launched in May of this year, the Alliance promotes coordinated investor action on human rights and business risks. Collectively, its members have more than US$2 trillion in assets.

“Precious metals and gems are often used to commemorate the most sacred and momentous events in the human experience,” an Alliance spokesperson said. “It is unconscionable that these gifts are built on human suffering and injustice. As investors, we need to see greater action—including improved standards and certifications—from the manufacturers and retailers in this space.”

The investors recognized that some companies in the jewelry industry are taking important steps to address human rights risks in their supply chains, but that most fall short of meeting international standards. In February, Human Rights Watch released a report scrutinizing the practices of over a dozen jewelry and watch brands.

The investor statement was sent directly to 32 individual companies, including jewelry and watch companies as well as large department stores and other retailers. It urged these companies to conduct robust due diligence over their supply chains, ensure full chain of custody over gold and diamonds, report publicly on their efforts, and take other steps to set a high bar for responsible sourcing. The statement also urged the Responsible Jewellery Council, an industry body, to strengthen its certification standards, make its audit process more transparent, and include civil society in its governance structure. 

Author: Human Rights Watch

Indonesia's President-elect Joko Widodo (C) speaks with journalists at city hall in Jakarta, August 12, 2014.

© 2014 Reuters

Last week, Indonesian President Joko “Jokowi” Widodo announced a moratorium, in place until 2021, on issuing permits to develop state forests into new oil palm plantations. The timing of the moratorium created a buzz at the launch of the Global Land Forum in Indonesia this week and renewed focus on the country’s reform efforts on land and agriculture.

Indonesia has made numerous reform commitments in recent years. These reforms, if implemented well, would add political weight to the government’s long-standing agrarian reform agenda.

In 2015, the government set a target to acquire and redistribute 9 million hectares of land as a mid-term development goal for 2015-2019. This was aimed at land that was previously granted for cultivation but not used, where the grant had expired or was not renewed. According to Moeldoko, the head of the president’s Executive Office, the government issued 5 million land certificates to smallholder farmers in 2017 and will grant more by 2019.

The government also announced a target guaranteeing smallholder farmers the right to use 12.7 million hectares of state forests. President Jokowi has made similar pledges, stating that indigenous peoples will get certificates over the lands they live on and their customary forests. But the process has been slow, with critics calling for reform that is fair and just.

In February, Jokowi launched a program to accelerate land registration, with the goal of registering all land in Indonesia by 2025. And this week, the president signed another decree providing guidance on how the government’s ambitious land redistribution and certification targets will be implemented.

Taken together, these new measures could breathe new life into the Indonesian government’s decades-long agrarian reform agenda. The national agrarian reform, which began as far back as the 1950s to address problematic land ownership dating from the Dutch colonial era, aims to redistribute agricultural land to close the economic gap and reduce the country’s inequality.

Implementing these policies will be a challenge given the complexities around land rights in Indonesia. Land disputes between peasants, indigenous peoples, companies and government are widespread, due to legal uncertainty over ownership, use, and procurement.

Fighting inequality means addressing these land disputes. Fair and just reforms would secure rights to land for the most marginalized and protect indigenous peoples’ right to their customary forests.

Author: Human Rights Watch

A girl works in an artisanal diamond mine in Sosso Nakombo, Central African Republic, near the border with Cameroon, in August 2015.

© 2015 Marcus Bleasdale for Human Rights Watch

This week, hundreds of jewellers from around the world attended International Jewellery London. Last year, the event drew about 9,000 industry representatives from 71 countries, who discussed everything from jewellery design and manufacturing to marketing. This year, jewellery free of human rights abuses was also on the agenda.

Gold mining has been tainted by serious human rights abuses, including child labor, deadly working conditions, forced evictions, and harmful pollution. Human Rights Watch has documented such abuses in the Philippines, Papua New Guinea, Ghana, Mali, Nigeria, Tanzania, Uganda, and Eritrea.

We have also investigated how jewelry companies are trying to avoid contributing to human rights abuses in their gold and diamond supply chains. We recently took a closer look at 13 leading jewellery brands, with a combined annual revenue of over £20 billion. Many companies don’t know where their gold and diamonds are coming from, and don’t do enough to assess human rights risks. Some jewellery companies publish sparse, general information about human rights risks in their supply chains, which is nowhere near enough for a consumer to make an informed choice.

Many of the companies we contacted pointed to the Responsible Jewellery Council (RJC) and their certification against its standards. These companies considered their RJC certification proof of responsible sourcing. But this industry group’s standard is broad and imprecise, and does not require companies to fully trace the source of their minerals so they know what happened from the mine to the finished product. And there is little monitoring to ensure that companies are actually following the RJC’s code.

There are a few leading companies among those we examined. Tiffany & Co. stands out for its ability to track its gold back to the mine, and for its thorough assessments of human rights impacts. UK jeweler Boodles has pledged to take steps to better assess its supply chain and to revise and expand its code of conduct for diamond and gold suppliers. It recently conducted an audit of its sourcing practices, and while it has yet to make the results public, the company has committed to producing public reports on its findings.

Other jewellery companies, including many small jewelers, are increasingly making efforts to ensure that the gold they buy directly from small-scale mines is produced under rights-respecting conditions. A number of small jewellers in the UK have formed a group called Fair Luxury, or FLUX, with the goal of promoting responsible sourcing from rights-respecting mines. Many FLUX members source their gold from Fairtrade or “Fairmined” certified mines; these schemes go further than other voluntary standards, obliging mines to respect clearly defined labor rights requirements and monitoring conditions regularly for compliance.

At International Jewellery London, FLUX hosted a series of presentations on human rights and jewellery supply chains, helping put responsible sourcing on the agenda of jewelers. At the standing-room-only presentations, jewellers were keen to learn about and discuss strategies for improving sourcing practices, and to communicate their efforts to the public. They also requested updated and ongoing reporting on efforts by the jewellery industry to address human rights in their supply chains.

All jewellery companies, big or small, have a responsibility to ensure human rights are respected in their supply chains. Many consumers —particularly younger ones — increasingly expect companies to act responsibly. International norms also make clear that companies should assess human rights risks in their product chains and ask their suppliers to provide them with detailed information about every step of the production process.

Ultimately, consumers need to know what they are buying, and that is why companies should report publicly about the human rights due diligence they are undertaking. All the jewellery companies at International Jewellery London should be transparent about where their gold and diamonds originate, and about what they are doing to address human rights abuses in their supply chains.

Author: Human Rights Watch

Dear Mr Pichai,

(cc: Ben Gomes, Vice President of Search; Kent Walker, Senior Vice President of Global Affairs)

Like many of Google’s own employees, we are extremely concerned by reports that Google is developing a new censored search engine app for the Chinese market. The project, codenamed “Dragonfly”, would represent an alarming capitulation by Google on human rights.  The Chinese government extensively violates the rights to freedom of expression and privacy; by accommodating the Chinese authorities’ repression of dissent, Google would be actively participating in those violations for millions of internet users in China.

We support the brave efforts of Google employees who have alerted the public to the existence of Dragonfly, and voiced their concerns about the project and Google’s transparency and oversight processes.

In contrast, company leadership has failed to respond publicly to concerns over Project Dragonfly, stating that it does not comment on “speculation about future plans”. Executives have also refused to answer basic questions about how the company will safeguard the rights of users in China as it seeks to expand its business in the country.

Since Google publicly exited the search market in China in 2010, citing restrictions to freedom of expression online, the Chinese government has strengthened its controls over the internet and intensified its crackdown on freedom of expression. We are therefore calling on Google to:

  • Reaffirm the company’s 2010 commitment not to provide censored search engine services in China; 
  • Disclose its position on censorship in China and what steps, if any, Google is taking to safeguard against human rights violations linked to Project Dragonfly and its other Chinese mobile app offerings;
  • Guarantee protections for whistle-blowers and other employees speaking out where they see the company is failing its commitments to human rights.

Our concerns about Dragonfly are set out in detail below.

Freedom of expression and privacy in China and Google’s human rights commitments

It is difficult to see how Google would currently be able to relaunch a search engine service in China in a way that would be compatible with the company’s human rights responsibilities under international standards, or with its own commitments. In other words, were it to do so, there is a high risk that the company would be directly contributing to, or complicit in, human rights violations.

The Chinese government runs one of the world’s most repressive internet censorship and surveillance regimes. Human rights defenders and journalists are routinely arrested and imprisoned solely for expressing their views online. Under the Cybersecurity Law,[1] internet companies operating in China are obliged to censor users’ content in a way that runs counter to international obligations to safeguard the rights of access to information, freedom of expression and privacy. Thousands of websites and social media services in the country remain blocked, and many phrases deemed to be politically sensitive are censored.[2] Chinese law also requires companies to store Chinese users’ data within the country and facilitate surveillance by abusive security agencies.

According to confidential Google documents obtained by The Intercept, the new search app being developed under Project Dragonfly would comply with China’s draconian rules by automatically identifying and filtering websites blocked in China, and “blacklisting sensitive queries”. Offering services through mobile phone apps, including Google’s existing Chinese apps, raises additional concerns because apps enable access to extraordinarily sensitive data. Given the Cybersecurity Law’s data localization and other requirements, it is likely that the company would be enlisted in surveillance abuses and their users’ data would be much more vulnerable to government access.

Google has a responsibility to respect human rights that exists independently of a state’s ability or willingness to fulfil its own human rights obligations.[3] The company’s own Code of Conduct promises to advance users’ rights to privacy and freedom of expression globally. In Google’s AI Principles, published in June, the company pledged not to build “technologies whose purpose contravenes widely accepted principles of international law and human rights”. The company also commits, through the Global Network Initiative, to conduct human rights due diligence when entering markets or developing new services. Project Dragonfly raises significant, unanswered questions about whether Google is meeting these commitments.

Transparency and human rights due diligence

Google’s refusal to respond substantively to concerns over its reported plans for a Chinese search service falls short of the company’s commitment to accountability and transparency.[4]   

In 2010, the human rights community welcomed Google’s announcement that it had “decided we are no longer willing to continue censoring our results on Google.cn”, citing cyber-attacks against the Gmail accounts of Chinese human rights activists and attempts by the Chinese government to “further limit free speech on the web”.

If Google’s position has indeed changed, then this must be stated publicly, together with a clear explanation of how Google considers it can square such a decision with its responsibilities under international human rights standards and its own corporate values. Without these clarifications, it is difficult not to conclude that Google is now willing to compromise its principles to gain access to the Chinese market.

There also appears to be a broader lack of transparency around due diligence processes at Google. In order to “know and show” that they respect human rights, companies are required under international standards to take steps to identify, prevent and mitigate adverse impacts linked to their products – and communicate these efforts to key stakeholders and the public.[5] The letter from Google employees published on 16 August 2018 demonstrates that some employees do not feel Google’s processes for implementing its AI Principles and ethical commitments are sufficiently meaningful and transparent.[6]

Protection of whistle-blowers

Google has stated that it cannot respond to questions about Project Dragonfly because reports about the project are based on “leaks”.[7] However, the fact that the information has been publicly disclosed by employees does not lessen its relevance and rights impact.

In relation both to Project Dragonfly and to Google’s involvement in the US government’s drone programme, Project Maven, whistle-blowers have been crucial in bringing ethical concerns over Google’s operations to public attention. The protection of whistle-blowers who disclose information that is clearly in the public interest is grounded in the rights to freedom of expression and access to information.[8] The OECD Guidelines for Multinational Enterprises recommend that companies put in place “safeguards to protect bona fide whistle-blowing activities”.[9]

We are calling on Google to publicly commit to protect whistle-blowers in the company and to take immediate steps to address the concerns employees have raised about Project Dragonfly.

As it stands, Google risks becoming complicit in the Chinese government’s repression of freedom of speech and other human rights in China. Google should heed the concerns raised by human rights groups and its own employees and refrain from offering censored search services in China.

Signed, the following organizations:

Access Now
Amnesty International
Article 19
Center for Democracy and Technology
Committee to Protect Journalists
Electronic Frontier Foundation
Human Rights in China
Human Rights Watch 
Independent Chinese PEN Centre
International Service for Human Rights (ISHR)
PEN International
Privacy International
Reporters Without Borders (RSF)

Signed in individual capacity (affiliations for identification purposes only):

Ronald Deibert
Professor of Political Science and Director of the Citizen Lab
University of Toronto

Rebecca MacKinnon
Director, Ranking Digital Rights

Xiao Qiang
Research Scientist
Founder and Director of the Counter-Power Lab
School of Information, University of California at Berkeley

Lokman Tsui
Assistant Professor at the School of Journalism and Communication
The Chinese University of Hong Kong


[1] See Cybersecurity Law of the People's Republic of China (2016), unofficial translation, and Human Rights Watch, “China: Abusive Cybersecurity Law Set to be Passed,” November 6, 2016.

[2] See Online Censorship in China.

[3] UN Guiding Principles on Business and Human Rights.

[4] For example, the Global Network Initiative Principles on Freedom of Expression and Privacy.

[5] UN Guiding Principles on Business and Human Rights.

[6] Kate Conger and Daisuke Wakabayashi, "Google Employees Protest Secret Work on Censored Search Engine for China," New York Times, August 16, 2018.

[7] Amnesty International meeting with Google, August 2018.

[8] UN Special Rapporteur on freedom of expression, Report to the General Assembly on the Protection of Sources and Whistleblowers, September 2015.

[9] OECD Guidelines, para 13.


(San Francisco) – Google should not offer censored search services in China and should protect employee whistleblowers who raise ethical concerns, Human Rights Watch and other organizations and advocates said in a letter released today.

“Google has promised to respect human rights and only develop technology that benefits society,” said Cynthia Wong, senior internet researcher at Human Rights Watch. “Yet Google has failed to explain how it will shield users from the Chinese government’s efforts to monitor and suppress dissent.”

The joint letter to Google CEO Sundar Pichai follows media reports on August 1, 2018, that the company is developing a mobile search app as part of a project codenamed Dragonfly that would comply with Chinese censorship and other legal requirements. The letter, signed by over a dozen human rights groups and individuals, calls on Pichai to clarify Google’s approach to China and what steps it is taking to safeguard users from the Chinese government’s abusive censorship and surveillance regimes.

Human Rights Watch previously called on Google to refrain from exchanging user rights for access to China’s market. On August 3, six US senators also sent a bipartisan letter to Pichai stating that if reports about Project Dragonfly were true, it is “deeply troubling and risks making Google complicit in human rights abuses related to China’s rigorous censorship regime.” The letter warned that Google risks “set[ting] a worrying precedent for other companies seeking to do business in China without compromising their core values.”

Google has declined to respond publicly to questions from the media and rights organizations, stating that it will not “comment on speculation about future plans.”

Company executives have not been forthcoming with their own employees, according to multiple media reports. Internal employee discussions seen by BuzzFeed News indicate that several team members quit Dragonfly over ethical concerns about the project and the surrounding secrecy.

After the project became public, a broader range of employees asked Pichai to address their concerns at a company-wide meeting. According to media reports, Pichai stated that the product is in early stages and not close to launching, but that tech companies can have a positive impact where they do business. According to a Google spokesperson, Pichai also stated that transparency early in a project can “cause issues,” but that Google is “more committed to transparency than probably any company in the world.” 

However, Pichai’s statements contradict reports citing anonymous Google employees who worked on Project Dragonfly, as well as broader concerns about the lack of transparency. According to internal documents seen by The Intercept, the Dragonfly team was told in July to have the project in “launch-ready state” pending approval from Chinese officials. Internal discussions cited by BuzzFeed indicate that at least some Google employees were concerned that their work was unknowingly contributing to Project Dragonfly. Many more employees did not know that Google was actively pursuing censored search in China until details were disclosed by media reports.

In April, Google employees raised similar concerns about Project Maven, a US Department of Defense contract to build artificial intelligence (AI)-assisted drone technology. Following internal and external pressure, Google announced in June that it would end its involvement in Project Maven once the contract expires in 2019. The company also issued its AI Principles, which pledge to ensure Google’s AI-driven applications are “socially beneficial” and to not pursue “technologies whose purpose contravenes widely accepted principles of international law and human rights.”

Following the public disclosure of Project Dragonfly, over 1,400 Google employees signed a letter to management stating that “Dragonfly and Google’s return to China raise urgent moral and ethical issues,” and expressing concern that the project proceeded in secret and in spite of the AI Principles. They demanded an ethics review structure that includes employee representatives, greater transparency, and an ethical assessment of Dragonfly and Maven.

Google is a member of the Global Network Initiative, where it has committed to human rights principles to advance freedom of expression and privacy.

In the letter sent to Pichai, human rights groups and advocates also called on Google to protect whistleblowers and other employees who raise concerns about its human rights responsibilities.

“The disclosures about Projects Dragonfly and Maven have prompted urgent discussions about Google’s approach to human rights, and its commitments to socially beneficial AI,” said Wong. “Google should commend employees who raise human rights concerns and protect them from retaliation.”


On Friday, the identity matching services bill will be discussed at a hearing by the parliamentary intelligence and security committee. It has serious implications for human rights.

Should the government be able to track your every move when you walk down the street, join a protest, or enter your psychiatrist’s building? Facial recognition technology may make that a reality for Australians. Parliament should refuse to expand its use until the government can demonstrate it won’t be used to violate human rights or turn us all into criminal suspects.

The bill would create a nationwide database of people’s physical characteristics and identities, linking facial images and data from states and territories and integrating them with a facial recognition system.

The system would initially enable centralised access to passport, visa, citizenship, and driver license images, though states and territories may also link other information, for example, marine licenses or proof-of-age cards. Government agencies and some private companies would then be allowed to submit images to verify someone’s identity. Government agencies would also be able to use it to identify an unknown person. The Department of Home Affairs would manage the system.

Prime Minister Malcolm Turnbull describes the proposal as a “modernisation” and “automation” of existing data-sharing practices between law enforcement agencies, making facial recognition “available in as near as possible real time.” But the proposal is too broad, enables using facial recognition for purposes far beyond fighting serious crime, and leaves significant details to departmental discretion or future interpretation. The lack of safeguards combined with the centralisation of a massive amount of information raises the potential for abuse and ever-expanding mission creep.

For example, the bill contains insufficient limits on how officials might use information shared through the system. Home Affairs would also have broad powers to define new kinds of “identity matching services” and information sharing, including perhaps fingerprints and iris scans.

The stated purposes for the system are either too minor to justify such a serious intrusion on liberty or so broad in addressing law enforcement and national security that they may cast a wide net affecting many innocent people.

The bill raises immediate alarms about privacy and other rights. With scant limits on future data collection and use, the amount of data is likely to grow over time. It also obliterates notions of consent since information people disclose for one purpose—obtaining a fishing license—could be easily used for entirely different ones like targeting “jaywalkers or litterers.”

Proponents contend that the system will not involve “surveillance” or direct integration with CCTV cameras. Nonetheless, the bill has the potential to facilitate broad tracking and profiling, especially when images are combined with other data. Imagine the chilling effect if officials ran photos taken from surveillance cameras at a demonstration or outside a union hall. Or the assumptions that could be made if you’re caught on cameras outside of a drug treatment centre, abortion clinic, or marriage counsellor’s office.

Notably, the proposal doesn’t require law enforcement agencies to get a warrant before using the system to identify someone, which is critical to preventing abuse. And what would prevent the government from integrating it with CCTV once the technologies are in place?

Facial recognition technology is far from perfect. Independent studies have found these systems often have a racial or ethnic bias. Yet the government has not disclosed enough information about the accuracy of the system it intends to use. What are its error rates and are they higher for racial and ethnic minorities? This is not a trivial issue. False positives mean people are wrongly accused or placed under unwarranted suspicion. False negatives mean criminals may continue to walk free.

Errors shift the burden onto individuals to show they are not who the system says they are, undermining the presumption of innocence. And this may disproportionately impact already vulnerable communities if the system misidentifies them at higher rates. Indigenous Australians are already significantly overrepresented in the criminal justice system. And what recourse would a person have if a bank denied them services because the system failed to verify their identity correctly?

Errors aside, facial recognition still raises significant human rights concerns. Combined with other data, facial images can be used to draw (potentially flawed) conclusions about who you are, what you believe, what you have done—and what you might do in the future.

The next generation of artificial-intelligence-driven facial recognition systems may be used in even more pernicious ways, from inferring your sexual orientation, IQ, or political beliefs, to predicting your propensity to commit crime or automatically detecting and punishing trivial infractions. This is already happening in China.

The lack of explicit safeguards in the bill means that information could be abused by government officials, police officers, or even private companies against people in unpredictable ways. Australia’s patchwork of data protection laws provides insufficient safeguards against these risks.

The extraordinary intrusiveness of facial recognition should not be underestimated. Parliament should scrap the bill until the government fully addresses the threats the system poses to a free society and provides real safeguards for people’s rights.

Author: Human Rights Watch

A Chinese national flag sways in front of Google China's headquarters in Beijing on January 14, 2010.

© 2010 Reuters

(New York) – Stakeholders and shareholders in Google and Facebook should urge the companies not to exchange user rights for access to China’s market, Human Rights Watch said. According to reporting in The Intercept, Google has been developing a search engine app to comply with China’s expansive censorship requirements. Facebook previously developed a censored version of its service for China, though never launched it.

The US Congress, European Parliament, and other legislatures around the world should express concern at US companies who are cooperating with China’s censorship and surveillance, Human Rights Watch said.

“Technology companies should be challenging China’s censorship – not complicit in it,” said Cynthia Wong, senior internet researcher at Human Rights Watch. “Shareholders in Google and Facebook who care about human rights should urge these companies not to compromise them for access to China’s market.”

Leaked documents examined by The Intercept describe the company’s plans to launch a censored version of its search engine as an Android app. According to the media report, Google has already demonstrated the app to Chinese officials and is waiting for approval for launch. The project, code-named Dragonfly, has been in development since spring 2017. According to reporting by The Intercept, work on the project accelerated following a meeting between Google CEO Sundar Pichai and Chinese government officials in December 2017, and the app could launch in the next six to nine months. The company is also in talks with potential Chinese partners to provide other cloud services inside the country, according to separate media reports.

Human Rights Watch reached out to Google to ask how it proposes to safeguard human rights as it seeks to expand its products and services in China. HRW had not received a response for the record at the time of publication.

China’s extensive censorship regime restricts a wide range of peaceful expression that officials deem politically sensitive, including criticism of government policy and information that does not conform to official narratives. China’s Great Firewall Internet filtering system blocks websites at the national level, including Google and Facebook services. Broadly drafted laws also require social media services, search engines, and websites that host user content to censor politically sensitive information on the government’s behalf. The government issues vaguely worded censorship orders and expects companies to proactively restrict access to broad categories of information.

“Google withdrew from China in 2010 because the human rights and cybersecurity environment was too precarious,” Wong said. “Since then, China renewed its crackdown on rights and enacted new laws that conscript tech firms in censorship and surveillance, but the company hasn’t explained how this time will be any better.”

According to media reports, Google’s custom Chinese search app would comply with the censorship regime by automatically identifying and filtering sites blocked by the Great Firewall. Filtered sites would not be shown in response to searches, and the company would notify the user that some results may have been removed. Examples of websites that would be censored include the British Broadcasting Corporation (BBC) and Wikipedia, according to documents seen by The Intercept.

Google is not the only US internet company considering whether to censor to seek access to the Chinese market. In November 2016, the New York Times reported that Facebook was developing software “to suppress posts from appearing in people’s news feeds in specific geographic areas,” specifically “to help Facebook get into China.” The report states that Facebook would “offer the software to enable a third party—in this case, most likely a partner Chinese company—to monitor popular stories and topics,” and would allow that third party to “have full control to decide whether those posts should show up in users’ feeds.”

Facebook’s formal entry into China would raise many of the same human rights concerns faced by Google. Facebook holds highly sensitive information about its users’ networks and affiliations, which the government may demand the company disclose. Online activists could be particularly at risk because of Facebook’s policy of requiring users to employ an “authentic identity” – a name that is commonly used by family and friends, that might also be found on certain types of identity documents. Human rights organizations and officials, including Human Rights Watch and United Nations special rapporteur on freedom of expression David Kaye, have long criticized this policy, because it can chill online expression and is also likely to be disproportionately enforced against those who use pseudonyms because they are at risk of reprisals.

In 2016, Human Rights Watch wrote to Facebook to ask whether the proposed system would proceed, how Facebook intended to avoid complicity with Chinese state censorship, and how it would protect users from abusive surveillance and reprisals for their online activity if it launched a version of its service that complied with Chinese law. In a written response, Facebook stated that “at this time we have not concluded how or when access to Facebook could be restored for people in China, recognizing the principal role the Chinese government plays in making this decision” and that “as we continue to study this market, we will consider the important points you raise.”

In May 2017, Facebook quietly launched a photo-sharing app, Colorful Balloons, in China through a local company without a public connection to Facebook. The company has also unsuccessfully sought to open an innovation hub and subsidiaries in China.

In August 2018, Human Rights Watch again contacted Facebook for an update on its approach to China. HRW had not received a response for the record at the time of publication.

From 2006 to 2010, Google ran a censored version of its search engine in China. In March 2010, the company announced it would stop censoring search results in China, citing concerns about online censorship, surveillance, and cyber-attacks directed at the Gmail accounts of Chinese human rights activists. As a result, the search engine has remained inaccessible to mainland Chinese users, along with other Google services.

Since 2010, the Chinese government has only broadened and intensified its crackdown on human rights, especially after President Xi Jinping took power in 2013. In recent years, authorities have tightened censorship requirements, restricted access to censorship circumvention tools, and strengthened ideological control over all media. In 2017, the government shut down dozens of social media accounts, called on internet companies to “actively promote socialist core values,” and passed stricter regulations requiring real-name registration, preventing people from protecting their identities if they engaged in disfavored speech. Authorities have also subjected more human rights defenders, including foreigners, to show trials and torture, often holding them incommunicado for months.

The government has significantly broadened mass surveillance efforts using big data and artificial intelligence-driven technology across China, particularly in the minority region of Xinjiang. The government also recently enacted laws that impose new requirements on companies to facilitate online surveillance. The Cybersecurity Law requires certain technology companies to retain, store, and disclose user data inside China and monitor and report “network security incidents.” Other new rules require app providers to keep user logs for 60 days to reduce the spread of “illegal information.” Under Chinese law, “security incidents” and “illegal information” are often defined broadly to encompass peaceful criticism of the government.

The Chinese government’s intensified offensive against human rights makes the timing of Google’s and Facebook’s actions particularly troubling and disappointing, Human Rights Watch said.

Google already provides two apps in China, Google Translate and file management app Files Go, though its own app store, Google Play, remains blocked. However, offering services via mobile phone apps raises additional human rights concerns that were not present when Google first entered China in 2006, when smart phones were not ubiquitous. Mobile applications can access extraordinarily sensitive data stored on phones, including contact lists, files, messages, photos, device identifiers, and location information, and can also turn on a phone’s camera and microphone if given permission by the user. Often users approve access without fully understanding the personal data that would be available. Such personal data would be more vulnerable to monitoring and collection by mobile service providers and public security agencies in China.

“US tech companies shouldn’t enter China until they can show they won’t become repression’s helping hands,” said Wong. “In the current human rights environment, that seems unlikely.”


Police officers check the identity cards of people as security forces keep watch in a street in Kashgar, Xinjiang Uyghur Autonomous Region, China on March 24, 2017.

© 2017 Thomas Peter/Reuters

Global concern is finally on the rise about Xinjiang, a region of China in which ethnically Turkic Muslims have long endured shocking repression. As governments grapple for ways to put pressure on Beijing over these abuses, attention is turning toward whether international firms doing business in the region are complying with the United Nations Guiding Principles on Business and Human Rights, which set out companies’ responsibilities to respect human rights.

Chinese authorities are creating a profiling and policing infrastructure to identify people considered disloyal to the Chinese Communist Party under the ruse of promoting “social stability.” Human Rights Watch has documented the Xinjiang police’s abusive mass surveillance projects, including gathering DNA and other biometric information, often without people’s knowledge or consent. China lacks privacy protections against state surveillance, or an independent judicial system. Ethnic minorities have no real power to question – much less resist – authorities’ demands.

In June 2017, Human Rights Watch discovered that Thermo Fisher, a Massachusetts-based biotechnology company, had sold DNA processing technology to the Xinjiang Public Security Bureau. Human Rights Watch wrote repeatedly to the company about abuses in Xinjiang, asking whether it is complying with the Guiding Principles to ensure that its business operations aren’t furthering abuses. Its replies provide little comfort: that it is legal to sell the equipment; that, “given the global nature of our operations, it is not possible for us to monitor the use or application of all products we manufactured;” and that they “expect all of our customers to act in accordance with appropriate regulations and industry-standard best practices.”

In June, Commerce Secretary Wilbur Ross asserted in a letter to the Congressional-Executive Commission on China that Thermo Fisher’s sales were legal. At a July commission hearing, Senator Marco Rubio aptly pointed out that, “China lacks the kinds of legal safeguards that other countries implement to manage their DNA databases,” and added, “These are the same companies that are up here every day in Washington, D.C. lobbying for us not to raise these issues so they can have access to China’s 1.3-billion-person marketplace...This is sick.”

All governments concerned about the increasingly dire situation in Xinjiang should ask companies from their countries a critical question: what you are doing in Xinjiang may not break the law – but is it defensible?

Author: Human Rights Watch

Andrew Wheeler, the former coal lobbyist who became acting administrator of the U.S. Environmental Protection Agency (EPA) after Scott Pruitt resigned, is weakening rules that protect people who live near coal-fired plants from the toxic ash left from burning coal.

Coal-fired power plants across the United States produce more than 100 million tons of toxic ash every year. For decades, many states have allowed power companies to dump most of this ash into watery pits that are, on average, the size of nearly 40 football fields.

These pits, created with little or no regulatory oversight, pose a risk of a catastrophic spill flooding nearby communities with toxic sludge and widespread leaching of dangerous heavy metals into groundwater. Half of the U.S. population relies on groundwater for drinking, and private wells near power plants are especially vulnerable to contamination from coal ash.

Three days before Christmas in 2008, a dam broke on a coal ash pond in Kingston, Tennessee, spilling more than a billion gallons of black sludge into the Emory River. The spill, which took five years and cost more than $1 billion to clean up, was a dramatic reminder of the public health threat posed by the more than 1,000 coal-ash disposal sites that dot nearly every U.S. state.

Coal ash from the Kingston Fossil Plant spill piles up along an inlet that empties into the Emory River, near Kingston, Tennessee in 2008.

© Brian Stansberry

A subsequent EPA review found that the majority of coal-ash disposal sites are more than three decades old, which poses a problem for their structural integrity. Groundwater tests near ponds found high levels of such hazardous metals as arsenic, lead, mercury, and hexavalent chromium.

In 2015, under President Barack Obama, the EPA finalized new rules for coal ash after receiving 450,000 public comments and conducting eight public hearings. The rules set protections for people living near coal-fired power stations while allowing coal ash to be sold for reuse in concrete and other materials, and imposed new monitoring requirements on power companies.

On March 1, power companies revealed for the first time the extent of coal-ash contamination of groundwater. That same day, the EPA, then under Pruitt, released its proposal to weaken the federal coal-ash rule by allowing state regulators to suspend groundwater monitoring and increase maximum permitted contaminant levels, among other measures.

On July 18, Wheeler finalized these rule changes, touting a $30 million annual savings in regulatory costs. For the multibillion-dollar power industry, these savings hardly seem like a plausible reason to eviscerate existing rules. Perhaps the greater motive is keeping the extent of pollution hidden from public view.

In the single public hearing the EPA held before finalizing this rule change, Lee McCarty, the mayor of a small town in Alabama that has a leaking coal-ash pond, exhorted the EPA not to weaken the rule. “My county voted 72 percent for the Republican administration,” he told the EPA representatives, “yet I have not had a single person” who supports this rule change. “If this is the best that the Environmental Protection Agency can do, I would say please, at least for transparency reasons, change your name to the UPA, the Utilities Protection Agency.”

In announcing the final rule, the EPA promised even more changes to the coal-ash rule later this year. Members of Congress on both sides of the aisle should protect the health of the people they represent and insist that the EPA safeguard their constituents’ drinking water and ensure their right to information about possible contamination.

Author: Human Rights Watch

An inmate makes a phone call from his cell at the Orange County jail in Santa Ana, California, May 24, 2011.

© 2011 Reuters

The New York City Council voted this week to provide free telephone calls for anyone held in the city’s correctional facilities. The move will save New York City inmates and their families an estimated US$8 million a year.

Until this law takes effect, phone services in the city’s jails will continue to be provided by Securus Technologies which, like other private companies, charges inmates and their families, the ultimate ‘captive customers’, at higher than average rates.

Currently, Securus charges an initial fee of 50 cents, plus 5 cents per minute for calls within New York, as well as fees for depositing funds. The cost of calls can quickly become prohibitive. With black people and Latinos making up over 86% of inmates in New York City, communities of color primarily bear these high costs. Many already lived in poverty before their arrests, and are held in pre-trial detention simply because they cannot afford to pay bail.

The new law will also prohibit the city from generating revenue from phone call fees. The city government had a revenue-sharing agreement with Securus, which guaranteed the city at least $5 million in income each year.

Phone calls provide an important link between people in custody and their families. Inmates who maintain family ties have an easier re-entry process, and may be less likely to reoffend. Children also benefit from having regular contact with parents in prison. The city should not look to raise revenues by charging people for these vital connections, nor should low-income families be forced to sacrifice basic necessities to pay for calls.

Last year, a court ruled that the Federal Communications Commission cannot set caps on the cost of in-state prison phone calls, leaving it to each state and local jurisdiction to set the fees. New York City is among the first jurisdictions to eliminate fees altogether, and prohibit revenue sharing contracts. In the absence of national legislation, other cities and states should follow suit. People in custody should be able to stay in touch with their families without paying an exorbitant price.

Author: Human Rights Watch

(Sydney, July 26, 2018) – Australia’s proposed modern slavery law needs revisions to be effective in preventing and ending labor rights abuses, Human Rights Watch said today in a submission to Australia’s Parliamentary Legal and Constitutional Affairs Committee. 

The Modern Slavery Bill 2018 defines “modern slavery” to include the worst forms of child labor, human trafficking, and criminal offenses including forced labor, forced marriage, and slavery-like practices. 

“Australia’s modern slavery bill takes some critical steps toward holding companies to account for serious abuses in their supply chains, but to be truly effective the law needs teeth,” said Elaine Pearson, Australia director at Human Rights Watch. “The bill should lower the threshold for reporting, require the government to publicize a list of companies required to report on their practices, and impose penalties for noncompliance.”

The bill is the result of extensive consultation with nongovernmental groups and businesses. 

It requires companies with annual revenue of at least AU$100 million (US$74 million) to submit statements describing their operations and supply chains, risks for modern slavery, attempts to assess and address those risks, and the effectiveness of those actions. Human Rights Watch urged lowering the threshold to AU$25 million (US$18 million), and requiring the government to publish a list of companies covered by the law to make it harder for companies to ignore the requirement.

In addition, the bill should require companies to conduct a systematic examination of their operations and supply chains for modern slavery risks, and should provide for penalties when companies fail to comply with the law.

“Companies that fail to identify and address forced labor in their supply chains should face legal consequences,” Pearson said. “Greater transparency also helps Australian consumers to know that their dollars are not going to support trafficking or child labor.”


Fueled by access to large data sets and powerful computers, machine learning and artificial intelligence can offer significant benefits to society. At the same time, left unchecked, these rapidly expanding technologies can pose serious risks to human rights by, for example, replicating biases, hindering due process and undermining the laws of war.

To address these concerns, Human Rights Watch and a coalition of rights and technology groups recently joined in a landmark statement on human rights standards for machine learning.

Known as the Toronto Declaration, the statement calls on governments and companies to ensure that machine learning applications respect the principles of equality and non-discrimination. The document articulates the human rights norms that the public and private sector should meet to ensure that algorithms used in a wide array of fields – from policing and criminal justice to employment and education – are applied equally and fairly, and that those who believe their rights have been violated have a meaningful avenue for redress.

While there has been a robust dialogue on ethics and artificial intelligence, the Declaration emphasizes the centrality and applicability of human rights law, which is designed to protect rights and provide remedies where human beings are harmed.

The Declaration focuses on machine learning and the rights to equality and non-discrimination, but many of the principles apply to other artificial intelligence systems. In addition, machine learning and artificial intelligence both impact a broad array of human rights, including the right to privacy, freedom of expression, participation in cultural life, the right to remedy, and the right to life. More work is needed to ensure that all human rights are protected as artificial intelligence increasingly touches nearly all aspects of modern life.

Drafted by rights groups, technologists, and researchers, the Toronto Declaration was finalized and announced on May 16, 2018 at the RightsCon conference in Toronto.