International Coalition of Rights Groups Call on Internet Infrastructure Providers to Avoid Content Policing – EFF

Posted under Cybercommunity, Technology by James Steward

San Francisco—Internet infrastructure services—the heart of a secure and resilient internet where free speech and expression flow—should continue to focus their energy on making the web an essential resource for users and, with rare exceptions, avoid content policing. Such intervention often causes more harm than good, EFF and its partners said today.

EFF and an international coalition of 56 human and digital rights organizations from around the world are calling on technology companies to “Protect the Stack.” The global effort aims to educate users, lawmakers, regulators, companies, and activists about why companies that make up basic internet infrastructure—internet service providers (ISPs), certificate authorities, domain name registrars, hosting providers, and the like—along with other critical services such as payment processors, can harm users, especially less powerful groups, and put human rights at risk when they intervene to take down speech and other content.

EFF today launched the Protect the Stack website at the Internet Governance Forum in Addis Ababa, Ethiopia. The website introduces readers to “the stack” and explains how content policing practices can put, and have put, human rights at risk. It is currently available in English, Spanish, Arabic, French, German, Portuguese, Hebrew, and Hindi.

“Internet infrastructure companies help make the web a safe and robust space for free speech and expression,” said EFF Legal Director Corynne McSherry. “Content-based interventions at the infrastructure level often cause collateral damage that disproportionately harms less powerful groups. So, except in rare cases, stack services should stay out of content policing.”

“We have seen a number of cases where content moderation applied at the internet’s infrastructural level has threatened the ability of artists to share their work with audiences,” said Elizabeth Larison, Director of the Arts and Culture Advocacy Program at the National Coalition Against Censorship. “The inconsistency of those decisions and the opaque application of vague terms of service have made it clear that infrastructure companies have neither the expertise nor the resources to make decisions on content.”

Infrastructure companies are key to online expression, privacy, and security. Because of the vital role they play in keeping the internet and websites up and running, they are increasingly under pressure to play a greater role in policing online content and participation, especially when harmful and hateful speech targets individuals and groups.

But doing so can have far-reaching effects and lead to unintended consequences that harm users. For example, when governments force ISPs to disrupt the internet for an entire country, people can no longer message their loved ones, get news about what’s happening around them, or speak out.

Another example is domain name system (DNS) abuse, where the suspension and deregistration of domain names is used as a means to stifle dissent. ARTICLE 19 has documented multiple instances of “DNS abuse” in Kenya and Tanzania.

Moreover, at the platform level, companies that engage in content moderation consistently reflect and reinforce bias against marginalized communities. Examples abound: Facebook decided, in the midst of the #MeToo movement’s rise, that the statement “men are trash” constitutes hateful speech. In addition, efforts to police “extremist” content by social media platforms have caused journalists’ and human rights defenders’ work documenting terrorism and other atrocities to be blocked or erased. There’s no reason to expect that things will be any different at other levels of the stack, and every reason to expect they will be worse.

A safe and secure internet helps billions of people around the world communicate, learn, organize, buy and sell, and speak out. Stack companies are the building blocks behind the web, and have helped keep the internet buzzing for businesses, families, and students during the COVID-19 pandemic, and for Ukrainians and Russians during the war in Ukraine. We need infrastructure providers to stay focused on their core mission: supporting a robust and resilient internet.
For more information: https://protectthestack.org/