
Platforms Should Preserve Data about Content Censored During COVID-19

The Dangerous Speech Project is one of 75 organizations and individuals voicing concerns about how social media and content-sharing platforms are handling content removed by automation during the COVID-19 pandemic.

The letter is available in Spanish here!
The letter is available in Arabic here.

Dear Social Media and Content-Sharing Platforms:

As the COVID-19 pandemic spreads across the globe, the importance of your platforms and their real-world impact has never been clearer. Your platforms are being used to communicate, assemble, research the virus, provide mutual aid, and more. We understand that many platforms have increased their reliance on automated content moderation during the pandemic, while simultaneously removing misinformation and apparently inaccurate information about COVID-19 at an unprecedented rate.

The importance of accurate information during this pandemic is clear. But knowledge about the novel coronavirus is rapidly evolving. This is also an unprecedented opportunity to study how online information flows ultimately affect health outcomes, and to evaluate the macro- and micro-level consequences of relying on automation to moderate content in a complex and evolving information environment. But such studies rely on information that your companies control, including information you are automatically blocking and removing from your services. It is essential that platforms preserve this data so that it can be made available to researchers and journalists and included in your transparency reports. The data will be invaluable to those working in public health, human rights, science, and academia. It will be crucial to develop safeguards to address the privacy issues raised by new or longer data retention and by the sharing of information with third parties, but the need for immediate preservation is urgent.

We, the undersigned organizations, institutions, and researchers, urge you to:

  1. Immediately commit to preserving all data on content removal during the COVID-19 pandemic, including but not limited to information about which takedowns did not receive human review, whether users tried to appeal the takedown (when that information is available), and reports that were not acted upon.
  2. Preserve all content that the platform is automatically blocking or removing, including individual posts, videos, images, and entire accounts.
  3. Produce transparency reports that include information about content blocking and removal related to COVID-19.
  4. Provide access to this data in the future to researchers and journalists, recognizing that privacy will need to be ensured.

Signed,

Organizations

Access Now
Africa Media Development Foundation (AMDF)
AlgorithmWatch
ARTICLE 19
Association for Progressive Communications (APC)
Balkan Investigative Reporting Network
Bangladesh NGOs Network for Radio & Communication
Carnegie Mellon University Center for Human Rights Science
Center for Democracy & Technology
Collaboration on International ICT Policy for East and Southern Africa (CIPESA)
Committee to Protect Journalists
Cook Islands Internet Action Group
CREOpoint
Dangerous Speech Project
Democracy Reporting International
Derechos Digitales
Digital Trade and Data Governance Hub, GWU
Electronic Frontier Foundation
EU DisinfoLab
Foundation for Media
Free Press Unlimited
Gatef organization
Global Forum for Media Development (GFMD)
Gulf Centre for Human Rights
Hellerstein & Associates
Institute for Strategic Dialogue
Internet Sans Frontières
IPANDETEC
MediaLab ISCTE-IUL
Media Matters for Democracy
Media Monitoring Africa
New America’s Open Technology Institute
New York University Stern Center for Business and Human Rights
Paradigm Initiative
PEN America
PersonalData.IO
Ranking Digital Rights
Reporters Without Borders (RSF)
RNW Media
South African National Editors Forum
Stiftung Neue Verantwortung (SNV)
Syrian Archive
TEDIC
WITNESS

Individuals (Institutions listed for identification purposes)

Agustina Del Campo, Centro de Estudios en Libertad de Expresión y Acceso a la Información (CELE), Universidad de Palermo, Argentina
Alexa Koenig, Executive Director, Human Rights Center, Berkeley Law
Anthony Fargo, Center for International Media Law and Policy Studies, Indiana University
Dr Argyro Karanasiou, Director of LETS Lab, University of Greenwich
Chinmayi Arun, Resident Fellow, Information Society Project, Yale Law School
Claudio Fogu, President, UCSB Faculty Association
Constance Penley, University of California, Santa Barbara
Damian Loreti, Lecturer, Faculty of Social Sciences, Universidad de Buenos Aires
Daphne Keller, Stanford Cyber Policy Center
David Morar, Visiting Scholar, GWU Elliott School, Digital Trade & Data Governance Hub
Deirdre K. Mulligan, School of Information, University of California, Berkeley
Eileen Donahoe, Executive Director, Stanford Global Digital Policy Incubator
Elaine Monaghan, Professor of Practice, Journalism, Indiana University-Bloomington
Ellen P. Goodman, Rutgers Law
Emma L. Briant, Associate Researcher at Bard College
Enrique Piracés, Program Manager, Center for Human Rights Science, Carnegie Mellon University
Filippo Menczer, Observatory on Social Media at Indiana University
Hannah Bloch-Wehba, Drexel University Thomas R. Kline School of Law
Jay David Aronson, Professor of Science, Technology, and Society, Carnegie Mellon University
Jennifer Holt, Associate Professor, University of California, Santa Barbara
Jessica Fjeld, Harvard Law School Cyberlaw Clinic at the Berkman Klein Center
Jun Liu, Associate Professor, Department of Communication, University of Copenhagen
Lisa Parks, Professor, MIT
Marianne Franklin, Professor of Global Media and Politics
Marietje Schaake, Stanford Cyber Policy Center and Stanford Institute for Human-Centered Artificial Intelligence
Michael Karanicolas, Wikimedia Fellow, Information Society Project, Yale Law School
Molly Land, UConn Human Rights Institute
Robin Mejia, Department of Statistics & Data Science, Carnegie Mellon University
Sebastian Schwemer, Associate Professor, Centre for Information and Innovation Law (CIIR), University of Copenhagen
Wafa Ben-Hassine, Human Rights Lawyer
Yong Liu, Hebei Academy of Social Sciences

As social media platforms rely more on automation, they must preserve data about their decisions and processes for later study.
