This project sought to identify discrete priorities for enhanced transparency in the content moderation practices of social media platforms and other internet intermediaries. The Electronic Frontier Foundation (EFF), in partnership with researchers Sarah Myers West (University of Southern California), Nicolas Suzor, and Andrew Quodling (Queensland University of Technology), used an existing dataset generated through EFF's monitoring efforts to better understand social media users' experiences with the content moderation policies of various platforms.

EFF's onlinecensorship.org has been collecting user reports of account suspension and content removal on social media platforms for two years, and currently holds a total of 610 reports from users in more than 26 countries, submitted in three languages. These reports include information about the type and nature of content takedowns, as well as users' perceptions of the experience and its impact on their lives.

Recognizing that much of the policymaking on social media content regulation takes place at the company level, EFF sought to use this research to advocate for better content regulation and takedown policies, building these arguments from the user experience perspective.

The final Santa Clara Principles on Transparency and Accountability of Content Moderation Practices build on this research and on extensive consultations about best practices for social media platforms to provide transparency and accountability in their content moderation practices. This project engaged civil society organizations, industry representatives, policymakers, and academic researchers to create a priority list of the information users and researchers need to better understand commercial content moderation on social media platforms.

To develop these recommendations, partners at Queensland University of Technology, the University of Southern California, and the Electronic Frontier Foundation undertook a thematic analysis of 380 of the survey responses submitted to EFF's onlinecensorship.org by users who had been adversely affected by the removal of content they posted on social media platforms or by the suspension of their accounts. This research was used to identify the information gaps users reported about what content is moderated, which rule was breached, and which human and automated processes are responsible for identifying content and making content moderation decisions.

The published Santa Clara Principles build on this wider research process, as well as the deliberative sessions at the All Things in Moderation conference at UCLA (6–7 December 2017), to offer guidance to internet platforms on how to provide users with meaningful due process when their posts are taken down or their accounts are suspended, and to help ensure that the enforcement of company content guidelines is fair, unbiased, and respectful of users' free expression rights. The three principles urge companies to:

  • publish the numbers of posts removed and accounts permanently or temporarily suspended due to violations of their content guidelines;
  • provide clear notice to all users about what types of content are prohibited, and clear notice to each affected user about the reason for the removal of their content or the suspension of their account; and
  • enable users to engage in a meaningful and timely appeals process for any content removals or account suspensions.

The wider research project has informed not only the development of these principles but also an academic paper currently under peer review. For more information on the project, please contact the authors at n.suzor@qut.edu.au, sarahmye@usc.edu, a.quodling@qut.edu.au, and jillian@eff.org.

To read more about this project, please see blog posts about the project here and here. 

To read the full principles document, please click below.