At Pinterest, our mission is to bring everyone the inspiration to create a life they love, and it’s our guiding light in drafting our content policies. Not everything on the internet is inspiring, so we have guardrails for what’s acceptable on Pinterest and what isn’t allowed.
We’re committed to providing greater transparency into how we keep Pinterest safe and inspiring. In this transparency report, you’ll get insight into how many information and deactivation requests we received from law enforcement and government agencies between July and December 2020. We’ve reported on this information twice per year since 2013.
But this year, we’re expanding the report to share more information. Now, our biannual transparency report will also include data on the actions we take to moderate user and merchant content on Pinterest beyond those requested by law enforcement, such as the number of policy violations and deactivations. This release specifically covers the actions we took under our policies between October and December 2020. We'll continue to iterate on this report going forward.
Pinterest’s industry-leading policies and practices are something we’re proud of. But more importantly, they’re the right thing for the people on our platform. They help to keep Pinterest a more positive and inspiring place online—for example, our longstanding policy prohibiting medical misinformation, which also challenged the industry to go further. We want to advance the industry on these issues so that—together—we can create a more inspiring internet.
Our content moderation philosophy
Pinterest’s Community guidelines are designed to keep our platform safe for all users. They govern what we do and don't allow on Pinterest, and all users must abide by them.
We have additional guidelines for merchants and advertisers to set clear expectations about what is and is not acceptable for product Pins and advertisements. We have especially high standards for safety for all audiences who use Pinterest: consumers, advertisers, creators, merchants and more. We believe you can't feel inspired if you don't first feel safe.
To help us cultivate a safe and inspired community, we develop and enforce content policies to ensure that our platform is a positive place for people to find inspiration for their next new recipe or home renovation. We work hard to identify and deactivate harmful content from our site, and our content policies and moderation practices are always evolving.
We may block, limit the distribution of, or deactivate content, as well as the accounts, individuals and groups that create or spread that content, based on how much harm it poses. If a user believes a deactivation was made in error, Pinterest provides options to appeal it.
Every day, millions of people all over the world come to Pinterest to create, discover and save new ideas that are shared in Pins. To understand how we approach content moderation, it’s helpful to differentiate between two types of Pins: organic Pins and ads. Our Community guidelines apply to both.
Organic Pins include all Pins created and saved on Pinterest that are not promoted as ads. For example, this could include merchants’ product Pins, which aren’t always ads, and may appear organically to people who are searching for products on Pinterest. We have additional requirements, like disclosing shipping and return policies, for merchants and their product Pins. All types of organic Pins are included in this report.
Ads are Pins that advertisers have chosen to promote as ads on Pinterest. We have additional policies for advertisers that set higher standards for the quality of ads. Because ad content is enforced differently than organic content, it is not included in this report.
Reach of policy-violating Pins
People often ask: before a Pin is deactivated for violating policy, how many people saw it? In most cases, the answer is: not a lot.
For example, consider Pins deactivated for medical misinformation during this reporting period. 85% of the Pins we deactivated for medical misinformation were never seen by users in this reporting period, even though more than 440 million people visit Pinterest each month.
Reach of deactivated Pins, Oct–Dec 2020*
| Policy | Seen by 0 people | Seen by <10 people | Seen by 10-100 people | Seen by 100+ people |
| --- | --- | --- | --- | --- |
| Adult content | 76% | 17% | 4% | 2% |
| Adult sexual services | 10% | 25% | 28% | 36% |
| Civic misinformation | 64% | 27% | 5% | 3% |
| Conspiracy theories | 91% | 7% | 0.9% | 0.3% |
| Dangerous goods and activities | 50% | 37% | 6% | 5% |
| Graphic violence and threats | 50% | 13% | 14% | 22% |
| Harassment and criticism | 78% | 16% | 3% | 3% |
| Hateful activities | 73% | 10% | 6% | 10% |
| Medical misinformation | 85% | 7% | 1% | 6% |
| Self-injury and harmful behavior | 90% | 8% | 0.8% | 0.5% |
| Spam | 85% | 5% | 3% | 6% |
* Calculated based on the number of unique users that saw a policy-violating Pin between October and December 2020 for at least 1 second, before it was deactivated. Rows may not add up to 100% due to rounding.
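The buckets in the table above can be reproduced with a small helper. This is an illustrative sketch, not Pinterest's actual reporting code; in particular, the handling of the exact boundaries (10 and 100 unique viewers) is our assumption, since the table labels leave them ambiguous.

```python
def reach_bucket(unique_viewers: int) -> str:
    """Map a deactivated Pin's unique-viewer count to a reach bucket.

    Viewers are counted only if they saw the Pin for at least 1 second
    before deactivation. Boundary handling at exactly 10 and 100 is an
    assumption; the published table does not specify it.
    """
    if unique_viewers == 0:
        return "0"
    if unique_viewers < 10:
        return "<10"
    if unique_viewers < 100:
        return "10-100"
    return "100+"
```

A per-policy row is then just the distribution of `reach_bucket` over all Pins deactivated under that policy, expressed as percentages (which may not sum to exactly 100 due to rounding).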
Actioned user reports
Users can report any content they find objectionable by clicking on the three small dots on any Pin and hitting “Report Pin.” Once we confirm it’s a policy violation and take action on the reported content, we consider the report an actioned user report.
The total number of actioned user reports tells us a lot about the user experience on Pinterest. So does the number of reporters: less than 0.02% of monthly active users reported a Pin that resulted in a Pin deactivation in this reporting period.
While the majority of actioned user reports concerned adult content, the number of users who reported that content represents a very small fraction of the people on Pinterest. More importantly, few people saw the adult content, regardless of whether it was deactivated due to a user report or another enforcement mechanism. In fact, 98% of the adult content deactivated on Pinterest was seen by fewer than 100 people during the reporting period.
Actioned user reports that resulted in a deactivated Pin, Oct–Dec 2020
| Policy* | Actioned reports |
| --- | --- |
| Adult content | 155,313 |
| Adult sexual services | 749 |
| Civic misinformation | 782 |
| Conspiracy theories | 786 |
| Dangerous goods and activities | 2,296 |
| Graphic violence and threats | 825 |
| Harassment and criticism | 3,865 |
| Hateful activities | 421 |
| Self-injury and harmful behavior | 1,854 |
* For some policies, actioned user reports are not a useful metric, for instance because the primary enforcement mechanism involves processing reports in aggregate. For these policies, the reach of deactivated Pins is a more relevant indicator of the user experience.
There are more than 300 billion Pins on Pinterest, and each of those Pins also has an image associated with it. Just because two Pins show the same image doesn't mean that we count them as the same Pin within our systems, or even that the image came from the same source.
This is important when it comes to content moderation: if we determine that the image in one Pin is policy-violating, our tools need to be able to detect and act on matching images amongst the billions of other Pins on Pinterest. So while we detect and deactivate a lot of Pins, those Pins comprise a much smaller number of distinct images. That’s why we’re sharing the number of deactivations for both distinct images and Pins. Each provides a different kind of insight into our moderation practices for this type of content.
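The relationship between distinct images and Pins described above can be sketched as an index from image signatures to Pin IDs. This is a hypothetical illustration, not Pinterest's actual system: production image matching would use perceptual hashing rather than the exact byte hash used here, and all names (`PinIndex`, `image_signature`, `flag_image`) are invented.

```python
import hashlib


def image_signature(image_bytes: bytes) -> str:
    """Stand-in for a real image-matching signature (e.g. a perceptual hash).

    An exact byte hash is used here purely to keep the sketch self-contained;
    it only matches byte-identical images.
    """
    return hashlib.sha256(image_bytes).hexdigest()


class PinIndex:
    """Hypothetical index illustrating 'one distinct image, many Pins'."""

    def __init__(self):
        self.pins_by_signature = {}   # signature -> set of Pin IDs
        self.flagged_signatures = set()

    def add_pin(self, pin_id: str, image_bytes: bytes) -> bool:
        """Index a Pin; returns False if its image is already flagged."""
        sig = image_signature(image_bytes)
        self.pins_by_signature.setdefault(sig, set()).add(pin_id)
        return sig not in self.flagged_signatures

    def flag_image(self, image_bytes: bytes) -> set:
        """Flag one distinct image; returns every Pin sharing that image."""
        sig = image_signature(image_bytes)
        self.flagged_signatures.add(sig)
        return self.pins_by_signature.get(sig, set())
```

Flagging one distinct image can therefore deactivate many Pins at once, which is why the Pin counts in the table below are much larger than the distinct-image counts.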
For example, we took proactive steps to moderate both new and pre-existing content leading up to the US election in November 2020. Those efforts included deactivating a lot of distinct images that violated our conspiracy theory policy. In doing so, we also deactivated the Pins that used those images.
Distinct image and Pin deactivations, Oct–Dec 2020*
| Policy | Distinct images | Total Pins |
| --- | --- | --- |
| Adult content | 2,100,253 | 49,855,681 |
| Adult sexual services | 707 | 714 |
| Civic misinformation | 3,238 | 15,809 |
| Conspiracy theories | 52,863 | 1,512,221 |
| Dangerous goods and activities | 5,501 | 42,310 |
| Graphic violence and threats | 1,754 | 3,750 |
| Harassment and criticism | 3,763 | 46,371 |
| Hateful activities | 1,980 | 8,397 |
| Medical misinformation | 5,938 | 18,184 |
| Self-injury and harmful behavior | 3,499 | 175,584 |
| Spam | 1,378,472 | 3,375,169 |
* Does not include distinct images or Pins that were deactivated because they were on a board that was deactivated, or belonged to a user that was deactivated.
When people find Pins they like or want to come back to, they save them to boards they’ve created. Over time, people have created more than 6 billion boards.
Pinterest deactivates a board if a predetermined amount of the content on it has been identified as policy-violating. We intentionally don't share the specific threshold, because that information could be used to undermine our content moderation efforts. The most important takeaways about board moderation are that Pinterest will deactivate boards that are determined to be policy-violating, and that when a board is deactivated, all the Pins on that board are also deactivated.
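The board-level rule described above can be sketched as a simple threshold check. This is a hypothetical illustration: `VIOLATION_THRESHOLD` is an invented placeholder, since Pinterest deliberately does not publish the real value, and the function names are ours.

```python
# Invented placeholder: the real threshold is intentionally undisclosed.
VIOLATION_THRESHOLD = 0.5


def should_deactivate_board(total_pins: int, violating_pins: int) -> bool:
    """Deactivate a board once enough of its content is policy-violating."""
    if total_pins == 0:
        return False
    return violating_pins / total_pins >= VIOLATION_THRESHOLD


def deactivate_board(board_pins: list) -> list:
    """When a board is deactivated, every Pin on it is deactivated too."""
    return [(pin_id, "deactivated") for pin_id in board_pins]
```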
Board deactivations, Oct–Dec 2020*
| Policy | Boards |
| --- | --- |
| Adult content | 50,767 |
| Adult sexual services | 488 |
| Civic misinformation | 456 |
| Conspiracy theories | 1,877 |
| Dangerous goods and activities | 956 |
| Graphic violence and threats | 381 |
| Harassment and criticism | 795 |
| Hateful activities | 4,604 |
| Medical misinformation | 345 |
| Self-injury and harmful behavior | 898 |
| Spam | 2 |
* Does not include boards that were deactivated because they belonged to a user account that was deactivated.
Think of “accounts” on Pinterest as “profiles” or, most often, individual users. If someone saves Pins and creates boards, that content becomes associated with their account. This is also true for business, advertiser, merchant and creator accounts.
Any account may be deactivated for violating our policies. When a user is deactivated, all of their Pins and boards are also deactivated. That means that if you search for them or click on an old link to their profile, that profile won’t show up anymore. Their Pins won’t appear anywhere on Pinterest. And they won’t be able to access their own Pins or boards, either.
Account deactivations, Oct–Dec 2020
| Policy | Accounts |
| --- | --- |
| Adult content | 7,754 |
| Adult sexual services | 494 |
| Civic misinformation | 24 |
| Conspiracy theories | 219 |
| Dangerous goods and activities | 179 |
| Graphic violence and threats | 11 |
| Harassment and criticism | 816 |
| Hateful activities | 2,487 |
| Medical misinformation | 13 |
| Self-injury and harmful behavior | 26 |
| Spam | 3,115,438 |
Account appeals and reinstatements
If people believe their accounts have been deactivated by mistake, they can follow an appeals process to have their accounts reinstated. We review appeal requests and grant the appeal if we decide we made a mistake, or in some cases to give people a second chance to abide by our Community guidelines.
Account appeals and reinstatements, Oct–Dec 2020
| Policy | Appeals | Reinstatements* |
| --- | --- | --- |
| Adult content | 1,813 | 847 |
| Adult sexual services | 38 | 2 |
| Civic misinformation | 3 | 0 |
| Conspiracy theories | 22 | 12 |
| Dangerous goods and activities | 6 | 0 |
| Graphic violence and threats | 7 | 3 |
| Harassment and criticism | 24 | 13 |
| Hateful activities | 31 | 8 |
| Medical misinformation | 3 | 0 |
| Self-injury and harmful behavior | 1 | 0 |
| Spam | 99,839 | 64,777 |
We also process appeals for deactivated Pins and boards, and expect to include that data in reports in the future.
* Updated on September 28, 2021 to reflect revised reinstatement figures.
This section provides insight into the volume of information and deactivation requests received from law enforcement and government agencies.
Please note that this data encompasses requests from July to December 2020. For more information on how we respond to requests for account information, refer to our Law enforcement guidelines.
Government information requests
Pinterest receives legal requests from law enforcement and government agencies for Pinterest account information. We diligently review each request, and only produce data for those that meet the requirements of law and our policies. Our policy is to notify users of government requests for their information prior to disclosure, unless we are prohibited by law or in exceptional circumstances.
| Types | Requests | Some information produced | Accounts identified | Accounts notified** |
| --- | --- | --- | --- | --- |
| Subpoena | 41 | 28 | 45 | 4 |
| Court order | 2 | 1 | 1 | 0 |
| Warrant | 27 | 19 | 28 | 9 |
| Other* | 6 | 3 | 5 | 0 |
| Total | 76 | 51 | 79 | 13 |
* Law enforcement requests such as wiretap orders, pen registers, trap and trace and emergency disclosure requests.
** The account owner was notified before production.
| Country | Requests | Some information produced | Accounts identified | Accounts notified** |
| --- | --- | --- | --- | --- |
| Australia | 1 | 0 | 1 | 0 |
| Belgium | 1 | 0 | 0 | 0 |
| Brazil | 1 | 0 | 1 | 0 |
| Germany | 4 | 0 | 3 | 0 |
| India | 3 | 0 | 1 | 0 |
| Switzerland | 1 | 0 | 1 | 0 |
| United Kingdom | 1 | 0 | 1 | 0 |
| Total | 12 | 0 | 8 | 0 |
** The account owner was notified before production.
National security requests*
| Time period | No. of requests |
| --- | --- |
| July to December 2020 | 0-249 |
* Any national security letters and orders issued under the US Foreign Intelligence Surveillance Act for user information.
Government content deactivation requests
We sometimes receive requests from government agencies to deactivate content on Pinterest that may be illegal in their country and/or a violation of our Community guidelines.
We review the requests to determine if the content identified violates our Community guidelines. Our teams take action on violations, ranging from deactivating the content globally to restricting access to the content within the relevant country if it violates local law but does not violate our policies.
Pinterest received a total of 4,078 requests from July to December 2020. We removed content for 3,684 of those requests for violating our Community guidelines and restricted the content on the remaining 394 requests.
| Country | Requests | Community guidelines deactivation* | Local law deactivation** |
| --- | --- | --- | --- |
| India | 14 | 0 | 14 |
| South Korea | 2,314 | 2,048 | 266 |
| Russia | 1,622 | 1,548 | 74 |
| Turkey | 128 | 88 | 40 |
| Total | 4,078 | 3,684 | 394 |
* Content violated our Community guidelines and was removed from the platform.
** Content was reported by a government agency but did not violate our Community guidelines and was restricted from appearing only in the country where the request originated, based on local law.
Pinterest does not tolerate child sexual abuse material (CSAM). We have a strict zero-tolerance policy for any content—imagery or text—that exploits or endangers minors. Detecting and deactivating this content is extremely important to us, and we work closely with the National Center for Missing and Exploited Children (NCMEC) to combat this type of activity.
Pinterest proactively identifies CSAM images and videos using image-matching tools, including PhotoDNA, along with machine learning tools that detect additional possible CSAM. Our team of specialists is trained to identify and review CSAM, and was responsible for 1,794 CyberTipline reports to NCMEC from July to December 2020.
| Time period | CyberTipline reports |
| --- | --- |
| July to December 2020 | 1,794 |
Creating the most positive space online doesn't happen by accident: It happens through proactive policy and product decisions. We have industry-leading positions on content safety that are informed by inputs and advice from outside experts, civil society and government. We also invest heavily in measures like machine learning technology to maintain a safe and positive space for the people on Pinterest. We’re proud of what we’re doing to keep Pinterest safe and to move the broader industry forward.
Let’s create a safer, more inspired internet, together.