How to Protect Rape Victims from PornHub

An excellent article in the New York Times highlights a significant problem: PornHub is infested with rape videos. Here we offer a solution.

The author of the article mentions:

It monetizes child rapes, revenge pornography, spycam videos of women showering, racist and misogynist content, and footage of women being asphyxiated in plastic bags.

The article is short on solutions, though. As a company dealing with stock photography and AI processing of photos, we have a couple of things to add.

I believe the answer is a combination of administrative and tech measures.

1. Require a signed model release form

For years, stock photography has handled this with a standardized contract accepted by multiple (if not all) stock photo websites.

My colleagues and I used to sign it with our models (for our stock photography, not porn). It requires a brief conversation explaining what we do: “Your pictures could appear anywhere our clients want to use them. The policy only prohibits use in anything pornographic,” then answering follow-up questions, sometimes reading it aloud, and signing. There are apps for that, where people pass through several steps before signing, and they get a copy.

Usually, we sign it before shooting (a good time is while we do a model’s hair). A real story: once a model was late, the whole crew was waiting, and we started quickly. I didn’t bring up the release until after the shooting session ended, which was my first mistake. It turned out I hadn’t warned her that we were shooting stock photography, which was my second mistake. In the end, we had to discard the material while still paying her the full fee.

Summary: model release forms are straightforward, well established, and effective. They work in stock photography and stock video; they could work for other photos and videos as well.

2. Detect the faces with AI

It’s easy and works well already.

Phase 1 would be banning content featuring people who don’t want to be published. I see it as an app called “Remove me from porn,” which verifies your identity and stores a fingerprint of your face in a blacklist. All pornographic websites would then check their videos against this blacklist.

Phase 2 would be allowing only people who have explicitly consented. Again, this is how stock photography already works.
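To make the Phase 1 idea concrete, here is a minimal sketch of a blacklist check, assuming each face has already been converted into a fixed-length embedding vector by a face recognition model; the threshold, function names, and tiny 4-dimensional vectors are all illustrative assumptions, not any site’s actual API.

```python
import math

# Illustrative similarity cutoff; a real system would tune this carefully
# to balance false matches against missed matches.
MATCH_THRESHOLD = 0.9

def cosine_similarity(a, b):
    """Cosine similarity between two face-embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def is_blacklisted(face_embedding, blacklist):
    """True if the face matches any opted-out person's stored fingerprint."""
    return any(cosine_similarity(face_embedding, entry) >= MATCH_THRESHOLD
               for entry in blacklist)

# Toy example: 4-d vectors stand in for real (e.g. 128-d) embeddings.
blacklist = [[1.0, 0.0, 0.0, 0.0]]
print(is_blacklisted([0.99, 0.01, 0.0, 0.0], blacklist))  # True: near match
print(is_blacklisted([0.0, 1.0, 0.0, 0.0], blacklist))    # False: no match
```

The same check, run against a whitelist instead of a blacklist, would implement Phase 2.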

3. Anonymize the videos with deepfakes

Our company generates people (as reported in another New York Times article). I feel that generating faces similar to the original person’s would be super helpful.

A picture of Sasha Grey anonymized with generated photos. These non-existent people loosely resemble her face but make her unrecognizable.

I believe combining our anonymization engine with deepfakes could resolve the problem once and for all.

Combined with the first two measures, we could replace the faces of people who didn’t consent. This way, the vast majority of content stays up instead of being removed, harming neither the industry nor its customers and making the measure easier to implement.
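The combined pipeline could be sketched as follows. This is a toy illustration under heavy assumptions: frames and faces are plain dictionaries standing in for real detector and generator output, and every name is a hypothetical placeholder, not an existing API.

```python
# Toy sketch: for each detected face, keep it if the person consented,
# otherwise mark it for replacement with a generated look-alike.

def anonymize_frame(frame, consented_ids):
    """Replace every non-consenting face with a generated stand-in."""
    for face in frame["faces"]:
        if face["person_id"] not in consented_ids:
            face["replaced"] = True   # a real system would run the deepfake swap here
            face["person_id"] = None  # identity is no longer recoverable
    return frame

# Person "A" consented; person "B" did not.
frame = {"faces": [{"person_id": "A"}, {"person_id": "B"}]}
result = anonymize_frame(frame, consented_ids={"A"})
print(result["faces"])
```

The point of the sketch is the decision flow: detect, check consent, and swap rather than delete, so non-consenting faces disappear without taking the whole video down.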

CEO Column by Ivan Braun, the founder of Icons8 and Generated Photos

Title image from Moose Photo Stock
