Opinion | What People at Pornhub Were Thinking When It Shared Videos of Child Rape



What goes through the minds of people working at porn companies profiting from videos of children being raped?

Thanks to a filing error in a Federal District Court in Alabama, which released thousands of pages of internal Pornhub documents that were meant to be sealed, we now know. The documents, mostly dating from 2020 or earlier, show some employees laughing off what’s on their site.

“I hope I never get in trouble for having those vids on my computer LOOOOL,” one messaged another.

Others are somber, with one messaging another, “There is A LOT of very, very obvious and disturbing CSAM here.” CSAM stands for child sexual abuse material.

One internal document indicates that Pornhub as of May 2020 had 706,000 videos available on the site that had been flagged by users for depicting rape or assaults on children or for other problems. That was partly because, the documents suggest, Pornhub did not necessarily review a video for possible removal until it had been flagged at least 16 times.

The company also made it much more difficult to flag problem videos by allowing only registered users to do so. One internal message noted that this “will greatly reduce overall flag volume.”

Pornhub and other “tube” websites that are part of the same company — like Redtube, Tube8 and YouPorn — don’t make sex videos themselves. Rather, they provide a platform for users to post videos.


