Pornhub owner Aylo will pay a $5 million settlement to the FTC and Utah over allegations that the company knowingly profited from child sexual abuse material (CSAM) and nonconsensual material (NCM).
Formerly known as MindGeek, Aylo made significant changes to the way it moderates content in late 2020, after The New York Times published an exposé showing how Pornhub — Aylo’s most popular portfolio website — failed to prevent and remove uploads of CSAM and NCM. Only after pressure from credit card operators did the company begin verifying the ages of all actors in uploaded videos and requiring documentation proving the actors’ consent.
But the FTC and Utah allege that even after Aylo enacted these safeguards, the company continued to host illegal content and irresponsibly manage consumer data.
According to the FTC, Aylo failed to disclose that after performers verified their identities through a third-party vendor, Aylo obtained the data from the vendor to hold indefinitely. This data, which the FTC alleges was not stored safely, could include Social Security numbers, addresses, birthdates, and other information that could be found on government IDs.
“Aylo also told its models that they could ‘trust that their personal data remains secure’ yet failed to use standard security measures to protect the data,” the FTC said in a press release. “For example, Aylo did not encrypt the personal data it stored, failed to limit access to the data, and did not store the data behind a firewall.”
The FTC also claims that Aylo failed to keep its promise that it would ban users who attempted to upload CSAM. The complaint says that Aylo only prohibited these users from making a new account using the same username or email address.
Aylo also committed to “fingerprinting” videos suspected to be CSAM so that any attempt to re-upload them to one of its hundreds of porn sites would be flagged. But the FTC alleges that this technology was not effective from at least 2017 to August 2021, resulting in hundreds of videos previously identified as CSAM being re-uploaded.
Aylo willingly entered into this settlement, which the company says “reaffirms [its] efforts to prevent the publication of CSAM and NCM,” per a statement emailed to TechCrunch.
As part of the settlement, Aylo must verify the consent and identity of anyone who appears in uploaded photos or videos. The company is also ordered to enact policies, procedures, and technical measures to block the publication of CSAM and NCM and remove content posted prior to the implementation of this system.
“The resolution reached involved enhancements to existing measures but did not introduce any new substantive requirements that were not either already in place or in progress,” Aylo said.
For the next decade, Aylo will face independent third-party audits to make sure the company adheres to the terms of the settlement.