The internet’s largest platform for nonconsensual deepfake pornography, Mr. Deepfakes, has permanently ceased operations after a “critical service provider” terminated its support, according to an announcement on the site’s homepage this week.
“A critical service provider has terminated service permanently. Data loss has made it impossible to continue operation,” the site’s notice reads. “We will not be relaunching. Any website claiming this is fake. This domain will eventually expire and we are not responsible for future use.”
The shutdown marks a significant victory in the fight against AI-generated nonconsensual pornography. Before closing, Mr. Deepfakes had amassed more than 43,000 explicit videos with over 1.5 billion views. Most videos targeted female celebrities, though private individuals were also featured, according to research data.
The timing of the shutdown appears linked to the Take It Down Act, which Congress passed last week. The law criminalizes "non-consensual intimate imagery," including AI-generated content, and requires websites to remove such material within 48 hours of a victim's request. Similar laws have been enacted or proposed in the UK, the Netherlands, and the EU.
“While this is an important victory for victims of non-consensual intimate imagery (NCII), it is far too little and far too long in the making,” said Hany Farid, a professor at UC Berkeley and leading expert on digitally manipulated images, in comments to 404 Media.
The site launched in 2018, after Reddit and Pornhub banned deepfakes, and quickly became the central hub for creating and sharing nonconsensual deepfake pornography. Users could commission custom videos from creators who charged between $50 and $1,500 per request, often paid in cryptocurrency.
The site's forums also served as a training ground for aspiring deepfake makers, where users shared tips, tools, and facial data sets to help create realistic fake videos of specific individuals.
While the site’s anonymous operator—reportedly a 36-year-old Toronto hospital worker—has lost his platform, much of the community has already migrated to encrypted messaging platforms like Telegram. The tools the community built, such as DeepFaceLab (reportedly used to make about 95% of all deepfake videos), remain available online.
The shutdown demonstrates that even when site operators remain anonymous, pressure on service providers can effectively disrupt harmful platforms. For victims of nonconsensual intimate imagery, this represents a rare win in an ongoing battle against technology-facilitated abuse.