Time to Compel Social Media Companies to Protect HRDs

By Brian Dooley 

Today I joined Waris Husain and Shannon Raj Singh for a discussion at RightsCon about how social media companies (SMCs) are failing to protect Human Rights Defenders (HRDs).

Waris Husain is Senior Legal Advisor at the American Bar Association Center for Human Rights, and Shannon Raj Singh is Special Advisor on Social Media & Conflict at the Centre for Humanitarian Dialogue. The event was hosted by the American Bar Association Center for Human Rights and moderated by Waris, who noted that “While there was a short period during which SMCs had hired human rights specialists to interface with HRDs and to respond to their requests for assistance, from the second half of 2022 onwards, companies like Twitter and Meta have pared down their human rights teams and often stepped back from intervening to protect HRDs.”

Shannon had been a senior member of the human rights team at Twitter before she and many others were let go in the drastic job cuts that followed Elon Musk’s takeover of the company in 2022. “After I left Twitter, Iranian activists were still contacting me, desperately appealing for help, but I just felt powerless because not only had I been let go but so too had so many of the others at Twitter who I would have turned to for help.”

HRDs and others have started pushing for the companies to do more than just take down harmful content. 

The Philippines is one of the most dangerous countries for human rights activists, who are regularly targeted and threatened online. Last month, prominent Filipino journalist Leonardo “Cong” Corrales filed a complaint against Facebook owner Meta to compel the company to reveal information about anonymous accounts that have attacked him online.

The information he wants to get from Meta would enable him to take legal action against his attackers.

Meta shouldn’t need a complaint to be filed before acting. It should, as a matter of course, be providing this kind of support to activists who want to pursue such cases.

Taking down harmful content isn’t enough – not that they’re doing a good job of that either. In a move to push them to do more than take down scam ads on their platforms, bankers in Britain have recently called for SMCs to also reimburse victims of online fraud.

Removing bad stuff is important, but companies have to do much more. Academic Marc Owen Jones has for years been criticizing the performance of SMCs in combating digital authoritarianism in the Middle East.

Other experts, including HRDs, have been telling SMCs how harmful the attacks on their platforms are, and what’s needed to prevent them.

In her 2021 report to the United Nations Human Rights Council, the UN Special Rapporteur on HRDs, Mary Lawlor, noted how HRDs murdered for their work are often targeted on social media before they are killed.

The report, “Final warning: death threats and killings of human rights defenders,” recommended that SMCs should not just “Establish and publicize easy to access, public, rapid response mechanisms to remove threatening content [and] close down accounts of those making the threats,” but also “provide all necessary data to assist legal investigations into online threats.”

Richard Wilson at the University of Connecticut has also been detailing the effects of these attacks on HRDs, documenting how HRDs “targeted online report negative psychological and health outcomes and identify a nexus between online harassment and the criminalization of human rights work. Many take protective measures, engage in self-censorship, abandon human rights work, and leave the country.”

SMCs, he says, “must implement stronger human rights-protective measures in at-risk countries, including expediting urgent requests for physical protection, adopting context-specific content moderation policies, and publicly documenting state abuses.”

There are more things they should be required to do, including offering live, real-time assistance to those intimidated and harassed on their platforms, better informing users about unlawful harassment and intimidation, and helping people to filter, report, and block malicious content.

These are all things SMCs could readily do, things HRDs have been asking for, and things the companies largely fail to do.

There’s nothing very new about realizing regulation is needed. At the first UN conference on human rights, in Tehran in 1968, Jamaica noted that “Already the advances in technology… have begun to present us with new human rights problems … for which new standards of conduct may need to be enunciated.”

What’s needed are standards enforceable by law. The SMCs have had time to implement such protections, but have largely failed to do so. It’s time for strict regulation to compel them to better protect HRDs.


Published on June 8, 2023
