
The Santa Clara Principles on Transparency and Accountability in Content Moderation

An Open Letter to Mark Zuckerberg

Read Facebook's response below.

Dear Mark Zuckerberg:

What do the Philadelphia Museum of Art (https://www.thedailybeast.com/facebooks-most-famous-banned-images), a Danish member of parliament (https://www.bbc.com/news/blogs-news-from-elsewhere-35221329), and a news anchor from the Philippines (http://www.gmanetwork.com/news/hashtag/content/569097/facebook-explains-takedown-of-posts-by-ed-lingao-ejap/story/) have in common? They have all been subject to a misapplication of Facebook's Community Standards. But unlike the average user, each of these individuals and entities received media attention, was able to reach Facebook staff, and, in some cases, received an apology and had their content restored. For most users, content that Facebook removes is rarely restored, and some users may be banned from the platform even when the removal was made in error.

When Facebook first came onto our screens, users who violated its rules and had their content removed or their account deactivated were sent a message telling them that the decision was final and could not be appealed. It was only in 2011, after years of advocacy from human rights organizations, that your company added a mechanism to appeal account deactivations (https://jilliancyork.com/2011/07/19/facebook-appeals/), and only in 2018 that Facebook initiated a process for remedying wrongful takedowns of certain types of content (https://newsroom.fb.com/news/2018/04/comprehensive-community-standards/). Those appeals are available for posts removed for nudity, sexual activity, hate speech, or graphic violence.

This is a positive development, but it doesn't go far enough.

Today, we, the undersigned civil society organizations, call on Facebook to provide a mechanism for all of its users to appeal content restrictions and, in every case, to have the appealed decision re-reviewed by a human moderator.

Facebook's stated mission is to give people the power to build community and bring the world closer together. With more than two billion users and a wide variety of features, Facebook is the world's premier communications platform. We know that you recognize the responsibility you have to prevent abuse and keep users safe. As you know, social media companies, including Facebook, have a responsibility to respect human rights (https://www.article19.org/wp-content/uploads/2018/06/Regulating-speech-by-contract-WEB-v2.pdf), and international and regional human rights bodies have made a number of specific recommendations for improvement, notably concerning the right to remedy.

Facebook remains far behind its competitors when it comes to affording its users due process.[1]
We know from years of research and documentation that human content moderators, as well as machine learning algorithms, are prone to error (https://cdt.org/files/2017/11/Mixed-Messages-Paper.pdf), and that even low error rates can result in millions of silenced users when moderation operates at massive scale. Yet Facebook users are able to appeal content decisions only in a limited set of circumstances, and it is impossible for users to know how pervasive erroneous content takedowns are without increased transparency on Facebook's part.[2]

While we acknowledge that Facebook can and does shape its Community Standards according to its values, the company nevertheless has a responsibility to respect its users' expression to the best of its ability. Furthermore, civil society groups around the globe have criticized the way that Facebook's Community Standards exhibit bias and are unevenly applied across different languages and cultural contexts. Offering a remedy mechanism, as well as more transparency, will go a long way toward supporting user expression.

Earlier this year, a group of advocates and academics put forward the Santa Clara Principles on Transparency and Accountability in Content Moderation (https://santaclaraprinciples.org/), which recommend a set of minimum standards for transparency and meaningful appeal. This set of recommendations is consistent with the work of the UN Special Rapporteur on the promotion of the right to freedom of expression and opinion, David Kaye, who recently called for a "framework for the moderation of user-generated online content that puts human rights at the very center" (https://freedex.org/a-human-rights-approach-to-platform-content-regulation/). It is also consistent with the UN Guiding Principles on Business and Human Rights (https://www.ohchr.org/Documents/Publications/GuidingPrinciplesBusinessHR_EN.pdf), which articulate the human rights responsibilities of companies.

Specifically, we ask Facebook to incorporate the Santa Clara Principles into its content moderation policies and practices and to provide:

Notice: Clearly explain to users why their content has been restricted.

- Notifications should include the specific clause from the Community Standards that the content was found to violate.
- Notice should be sufficiently detailed to allow the user to identify the specific content that was restricted and should include information about how the content was detected, evaluated, and removed.
- Individuals must have clear information about how to appeal the decision.

Appeals: Provide users with a chance to appeal content moderation decisions.

- Appeals mechanisms should be easily accessible and easy to use.
- Appeals should be subject to review by a person or panel of persons that was not involved in the initial decision.
- Users must have the right to propose new evidence or material to be considered in the review.
- Appeals should result in a prompt determination and reply to the user.
- Any exceptions to the principle of universal appeals should be clearly disclosed and compatible with international human rights principles.
- Facebook should collaborate with other stakeholders to develop new independent self-regulatory mechanisms for social media that will provide greater accountability.[3]
Numbers: Issue regular transparency reports on Community Standards enforcement.

- Present complete data describing the categories of user content that are restricted (text, photo, or video; violence, nudity, copyright violations, etc.), as well as the number of pieces of content that were restricted or removed in each category.
- Incorporate data on how many content moderation actions were initiated by a user flag, by a trusted flagger program, or by proactive Community Standards enforcement (such as through the use of a machine learning algorithm).
- Include data on the number of decisions that were effectively appealed or otherwise found to have been made in error.
- Include data reflecting whether the company performs any proactive audits of its unappealed moderation decisions, as well as the error rates the company found.

Article 19, Electronic Frontier Foundation, Center for Democracy and Technology, and Ranking Digital Rights

7amleh - Arab Center for Social Media Advancement
Access Now
ACLU Foundation of Northern California
Adil Soz - International Foundation for Protection of Freedom of Speech
Africa Freedom of Information Centre (AFIC)
Albanian Media Institute
Alternatif Bilisim
American Civil Liberties Union
Americans for Democracy & Human Rights in Bahrain (ADHRB)
Arab Digital Expression Foundation
Asociación Mundial de Radios Comunitarias América Latina y el Caribe (AMARC ALC)
Association for Progressive Communications
Bits of Freedom
Brennan Center for Justice at NYU School of Law
Bytes for All (B4A)
CAIR-Minnesota
CAIR San Francisco Bay Area
CALAM
Cartoonists Rights Network International (CRNI)
Cedar Rapids, Iowa Collaborators
Center for Independent Journalism - Romania
Center for Media Studies & Peace Building (CEMESP)
Child Rights International Network (CRIN)
Committee to Protect Journalists (CPJ)
CyPurr Collective
Digital Rights Foundation
EFF Austin
El Instituto Panameño de Derecho y Nuevas Tecnologías (IPANDETEC)
Electronic Frontier Finland
Elektronisk Forpost Norge
epicenter.works
Eyebeam
Facebook Users & Pages United Against Facebook Speech Suppression
Fight for the Future
Florida Civil Rights Coalition
Foro de Periodismo Argentino
Foundation for Press Freedom - FLIP
Freedom Forum
Fundación Acceso
Fundación Ciudadano Inteligente
Fundación Datos Protegidos
Fundación Internet Bolivia.org
Fundación Vía Libre
Fundamedios - Andean Foundation for Media Observation and Study
Garoa Hacker Club
Global Voices Advocacy
Gulf Center for Human Rights
HERMES Center for Transparency and Digital Human Rights
Hiperderecho
Homo Digitalis
Human Rights Watch
Idec - Brazilian Institute of Consumer Defense
Independent Journalism Center (IJC)
Index on Censorship
Initiative for Freedom of Expression - Turkey
Instituto Nupef
International Press Centre (IPC)
Internet without borders
Intervozes - Coletivo Brasil de Comunicação Social
La Asociación para una Ciudadanía Participativa ACI Participa
Lucy Parsons Labs
MARCH
May First/People Link
Media Institute of Southern Africa (MISA)
Media Rights Agenda (MRA)
Mediacentar Sarajevo
New America's Open Technology Institute
NYC Privacy
Open MIC (Open Media and Information Companies Initiative)
OpenMedia
OutRight Action International
Pacific Islands News Association (PINA)
Panoptykon Foundation
PEN America
PEN Canada
Peninsula Peace and Justice Center
People Over Politics
Portland TA3M
Privacy Watch
Prostasia Foundation
Raging Grannies Action League
ReThink LinkNYC
Rhode Island Rights
SFLC.in
SHARE Foundation
SMEX
Son Tus Datos (Artículo 12 A.C.)
South East Europe Media Organisation
Southeast Asian Press Alliance (SEAPA)
SumOfUs
Syrian Archive
Syrian Center for Media and Freedom of Expression (SCM)
t4tech
TA3M Seattle
Techactivist.org
The Association for Freedom of Thought and Expression
The Rutherford Institute
Viet Tan
Vigilance for Democracy and the Civic State
Visualizing Impact
Witness
Xnet

Facebook's Response

Thank you for your November 13 letter to Mark Zuckerberg addressing notice, appeal, and data transparency for violations of Facebook's Community Standards. You've raised important questions about how Facebook is approaching these issues.

Your letter gives us an opportunity to summarize the work we've been doing over the past year in these areas. Please find details below, using the headings in your letter. We have also noted areas where we aren't currently in line with your recommendations, or where we are working in that direction.

Please bear in mind that much of this is work in progress, and we will provide further updates as our policies and enforcement develop. We look forward to continuing this dialog with you.

Notice

Our procedures for notifying users of Community Standards violations have become more detailed over time. In April, we released the internal guidelines that we use to enforce our Community Standards (https://newsroom.fb.com/news/2018/04/comprehensive-community-standards/), so that people have clarity on exactly where we draw the line. As part of this process, we have been working to improve the notifications that we send users when we remove content that goes against our Community Standards.

In the majority of cases, when we tell people their content goes against our standards, we cite the policy violated. In a few cases, where concerns about safety or gaming of our policies are high, such as with our policies on sexual exploitation and terrorist propaganda, we provide general notice that the content in question has violated our Community Standards. We also identify for users the specific piece of content that violates our standards.

With regard to how content is detected, evaluated, and removed (if it violates our policies), two points:

First, Facebook's Community Standards Enforcement Report (https://transparency.facebook.com/community-standards-enforcement) contains numbers showing how much violating content we have detected on our service. It's an important part of our effort to be transparent, so that people can judge for themselves how well we are doing.
As outlined further below, this report provides aggregate data for most major policies on the percentage of violating content that we proactively detected, as compared to content that was reported by users. For six of the eight policies for which we report data in the enforcement report, we identified over 95% of the violating content ourselves. This figure is significant because it indicates that we are successfully removing large volumes of violating content before we receive user reports. User reports nonetheless remain important, not least because they provide a signal that users perceive the reported content as harmful.

Second, when it comes to individual pieces of content, we don't disclose how we are made aware of violating content, since doing so could undermine the confidentiality of user reporting, an important principle underlying our enforcement approach.

Appeals

When we talk about appeals, we're referring to ways for people to request re-review of a content decision we've made. Prior to this year, such re-review was available to people whose profiles, Pages, or Groups had been taken down. In April 2018, we introduced the option to request re-review of individual pieces of content that were removed for adult nudity or sexual activity, hate speech, or graphic violence. We've subsequently extended this option so that re-review is now available for additional content areas, including dangerous organizations and individuals (which includes our policies on terrorist propaganda), bullying and harassment, regulated goods, and spam. We are continuing to roll out re-review for additional types of violations. We also plan to launch this option for individuals who have reported content that was not removed.

When we announced the ability to seek re-review of content removals, we also said that we would like to set up a system where users can provide more information and context on decisions that they think we got wrong. Along these lines, we are currently experimenting with the best ways to solicit context from users.

There are some violation types, for example severe safety policy violations, for which we don't offer re-review. For all other types of content, in order to request re-review of a content decision we've made, you are given the option to "Request Review." We make the opportunity to request this review clear, either via a notification or an interstitial. Re-review is conducted by a different human reviewer than the one who made the original decision. In a limited number of cases, where we have very high confidence in our original decision and believe that human review is not efficient, for example spam and cases involving "banked" content, we may also rely on automation for re-review. If we find we have made a mistake, we will restore the content. Typically, the re-review takes place within 24 hours.

Please take a look at the three screenshots appearing at the bottom of this message. These images show, respectively, notice of a Community Standards violation; the option to seek re-review of our content decision; and acknowledgment of a request for re-review.

We welcome the opportunity for Facebook to collaborate with stakeholders on innovative approaches to self-regulation and accountability. We engage regularly with external groups and experts on our policy development and enforcement, and we are looking to do more.
For example, our CEO recently laid out a path for appealing content decisions to an independent body (https://www.facebook.com/notes/mark-zuckerberg/a-blueprint-for-content-governance-and-enforcement/10156443129621634/; see "Independent Governance and Oversight"). In 2019, we'll undertake a process of stakeholder consultation on this idea, to bring external voices into the decision-making process. We would like to include your voices in that process.

Data

Our first ever Community Standards Enforcement Report, released in May 2018 as part of our larger biannual Transparency Report, provided data on the volume of content we actioned (including both content removed and content to which an interstitial or warning screen was added) and the percentage of violations we found before users reported them, in the following areas: adult nudity and sexual activity, fake accounts, hate speech, spam, terrorist propaganda, and violent and graphic content.

We received your letter just two days before the release of our second Community Standards Enforcement Report (https://transparency.facebook.com/community-standards-enforcement). This report contains a range of information on topics raised in the letter, covering the period from April to September 2018. The report includes updates to the data we shared in our May report, as well as new data in the areas of bullying and harassment, and child nudity and sexual exploitation of children. For each of these policies, we provided data on how much content we took action on, and the percentage of violations that we found before users reported them. Where possible, we also provided information on the prevalence of this content on Facebook. The report further highlights data we plan to provide in the future, including an indication of how quickly we took action on Community Standards violations and how often we restore content upon re-review.

In many policy areas, the percentage of removed content first identified by Facebook's automated systems is above 95%. In the areas of hate speech and bullying and harassment, where context is critical, proactive detection rates are lower (~52% and ~15%, respectively). At this point, we do not provide a further breakdown of reporting source.

We also don't provide data on the format of the content being removed, whether text, photo, or video, because we do not believe this data provides critical information to our users or civil society about our content review practices.

We audit the quality and accuracy of reviewer decisions on an ongoing basis, to understand where our policies or training can be improved, and to follow up with reviewers where errors are being made. Reducing these errors is one of our most important priorities.

We hope this response is useful, and we appreciate the opportunity to convey these updates. We are exploring ways to make our updates more easily accessible. In the meantime, please feel free to share this message with your colleagues and other members of civil society.

Footnotes

[1] See EFF's Who Has Your Back? 2018 Report, https://www.eff.org/who-has-your-back-2018, and Ranking Digital Rights Indicator G6, https://rankingdigitalrights.org/index2018/indicators/g6/.
[2] See Ranking Digital Rights, Indicators F4, https://rankingdigitalrights.org/index2018/indicators/f4/, and F8, https://rankingdigitalrights.org/index2018/indicators/f8/; and New America's Open Technology Institute, "Transparency Reporting Toolkit: Content Takedown Reporting," https://www.newamerica.org/oti/reports/transparency-reporting-toolkit-content-takedown-reporting/.

[3] For example, see Article 19's policy brief, "Self-regulation and 'hate speech' on social media platforms," https://www.article19.org/wp-content/uploads/2018/03/Self-regulation-and-%E2%80%98hate-speech%E2%80%99-on-social-media-platforms_March2018.pdf.
