
How to Protect Rape Victims from PornHub

An excellent article in the New York Times highlights a significant problem: PornHub is infested with rape videos. Here we offer a solution.

The author of the article mentions:

It monetizes child rapes, revenge pornography, spycam videos of women showering, racist and misogynist content, and footage of women being asphyxiated in plastic bags.

The article is short on solutions, though. As a company dealing with stock photography and AI processing of the photos, we have a couple of things to add.

I think there could be a combination of administrative and tech solutions.

1. Require a signed model release form

For years, stock photography has handled this with a standardized contract accepted by multiple (if not all) stock photo websites.

My colleagues and I used to sign it with our models (for our stock photography, not porn). It requires a brief conversation explaining what we do: “Your pictures could appear anywhere our clients want to use them. The policy only prohibits the use for anything pornographic,” then answering follow-up questions, sometimes reading the contract aloud, and signing. There are apps for this, where people pass through several steps before signing and receive a copy.

Usually, we sign it before shooting (a good time is while we do a model’s hair). A real story: once a model was late, the whole crew was waiting, and we started quickly. I didn’t bring this up before the shooting session ended — that was my first mistake. It turned out that I hadn’t warned her we were shooting stock photography — that was my second mistake. In the end, we had to discard the material while paying her the full fee.

Summary: model release forms are straightforward, well planned, and effective. They work in stock photography and stock videos. They could work with other photos and videos.

2. Detect the faces with AI

It’s easy and works well already.

Phase 1 would be banning people who don’t want to be published. I see it as an app called “Remove me from porn,” which makes sure it’s you and stores a fingerprint of your face in a blacklist. All pornographic websites should check the videos against this blacklist.

Phase 2 would be allowing only people who explicitly consented. Again, stock photography works like this, and it’s all good.
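The blacklist check in Phase 1 can be pictured with face embeddings: a fingerprint extracted from each video frame is compared against the stored fingerprints of people who opted out. Below is a minimal sketch using random vectors in place of real face embeddings; the function names and the 0.8 similarity cutoff are illustrative assumptions, not an existing API — a production system would use a trained face-recognition model and a carefully tuned threshold.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def matches_blacklist(face_embedding, blacklist, threshold=0.8):
    """Return True if the embedding matches any blacklisted fingerprint.

    The threshold is illustrative; real systems tune it to balance
    false matches against missed matches.
    """
    return any(cosine_similarity(face_embedding, entry) >= threshold
               for entry in blacklist)

# Toy example: random 128-dim vectors stand in for real embeddings.
rng = np.random.default_rng(0)
blacklist = [rng.normal(size=128) for _ in range(3)]

# A frame containing a blacklisted face must match...
print(matches_blacklist(blacklist[0], blacklist))
# ...while an unrelated face almost certainly will not.
stranger = rng.normal(size=128)
print(matches_blacklist(stranger, blacklist))
```

In practice the fingerprints would be stored as embeddings rather than raw photos, so the blacklist itself reveals nothing about the people on it.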

3. Anonymize the videos with deepfakes

Our company generates people (as reported in another New York Times article). I feel that generating faces similar to the person’s would be super helpful.

A picture of Sasha Grey anonymized with generated photos. These non-existent people loosely resemble her but make her unrecognizable.

I believe combining our anonymization engine with deepfakes could resolve the problem once and for all.

Combined with the first two measures, we could anonymize only the faces of people who didn’t consent. This way, we avoid removing the vast majority of content, harming neither the industry nor its customers and making the measure easier to implement.
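One way to picture this anonymization step is in embedding space: nudge a face representation toward a generated one until a recognizer, modeled here as plain cosine similarity, would no longer match the original identity. This is a toy illustration of the thresholding idea only, not our actual engine; all names and values are hypothetical.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def anonymize_embedding(original, rng, recognition_threshold=0.5, step=0.3):
    """Blend the original identity with generated noise until a
    recognizer (modeled as cosine similarity) would no longer match it.

    Both the threshold and the blending step are illustrative values.
    """
    candidate = original.copy()
    while cosine_similarity(candidate, original) >= recognition_threshold:
        noise = rng.normal(size=original.shape)  # stands in for a generated face
        candidate = (1 - step) * candidate + step * noise
    return candidate

rng = np.random.default_rng(1)
identity = rng.normal(size=128)
anon = anonymize_embedding(identity, rng)
# The anonymized vector no longer matches the original identity.
print(cosine_similarity(anon, identity))
```

A real pipeline would decode the resulting embedding back into a photorealistic face, which is where the deepfake machinery comes in; the sketch only shows the stopping criterion — keep changing the face until recognition fails.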

CEO Column by Ivan Braun, the founder of Icons8 and Generated Photos

Title image from Moose Photo Stock

