The Santa Clara Principles on Transparency and Accountability of Content Moderation Practices

As the culmination of one of the IPO’s “research for impact” projects, this declaration builds on research and extensive consultations on best practices for social media platforms to provide transparency and accountability in their content moderation practices. The project engaged civil society organizations, industry representatives, policymakers, and academic researchers to create a priority list of the information users and researchers need to better understand commercial content moderation on social media platforms.

To develop these recommendations, partners at Queensland University of Technology, the University of Southern California, and the Electronic Frontier Foundation undertook a thematic analysis of 380 survey responses submitted to EFF’s onlinecensorship.org by users who had been adversely affected by the removal of content they posted on social media platforms or by the suspension of their accounts. This research was used to identify the information gaps users reported about what content is moderated, which rule was breached, and what human and automated processes are responsible for identifying content and making moderation decisions.

These published Santa Clara Principles build on this wider research process, as well as the deliberative sessions at the All Things in Moderation conference at UCLA (6–7 December 2017), to offer guidance to internet platforms on how to provide users with meaningful due process when their posts are taken down or their accounts are suspended, and to help ensure that the enforcement of company content guidelines is fair, unbiased, and respectful of users’ free expression rights. The three principles urge companies to:

  • publish the numbers of posts removed and accounts permanently or temporarily suspended due to violations of their content guidelines;
  • provide clear notice to all users about what types of content are prohibited, and clear notice to each affected user about the reason for the removal of their content or the suspension of their account; and
  • enable users to engage in a meaningful and timely appeals process for any content removals or account suspensions.

The wider research project has informed not only the development of these principles but also an academic paper currently under peer review.

For more information on the project, please contact the authors at n.suzor@qut.edu.au, sarahmye@usc.edu, a.quodling@qut.edu.au, and jillian@eff.org.