Apple allowed child sex abuse materials to proliferate, according to class action lawsuit

The class action suit may pit Apple against its own user privacy promises.
By Chase DiBenedetto
Apple is being sued for a second time over its underreporting of CSAM. Credit: SOPA Images / Contributor / LightRocket via Getty Images

Apple is once again facing a billion-dollar lawsuit, as thousands of victims come forward to accuse the company of alleged complicity in spreading child sex abuse materials (CSAM).

In a lawsuit filed Dec. 7, the tech giant is accused of reneging on mandatory reporting duties — which require U.S.-based tech companies to report instances of CSAM to the National Center for Missing & Exploited Children (NCMEC) — and allowing CSAM to proliferate. In failing to institute promised safety mechanisms, the lawsuit claims, Apple has sold "defective products" to specific classes of customers (CSAM victims).

Some of the plaintiffs argue they have been continuously re-traumatized by the spread of content long after they were children, as Apple has chosen to focus on preventing new cases of CSAM and the grooming of young users.

"Thousands of brave survivors are coming forward to demand accountability from one of the most successful technology companies on the planet. Apple has not only rejected helping these victims, it has advertised the fact that it does not detect child sex abuse material on its platform or devices thereby exponentially increasing the ongoing harm caused to these victims," wrote lawyer Margaret E. Mabie.


The company has retained tight control over its iCloud product and user libraries as part of its wider privacy promises. In 2022, Apple scrapped its plans for a controversial tool that would automatically scan and flag iCloud photo libraries for abusive or problematic material, including CSAM. The company cited growing concern over user privacy and mass surveillance by Big Tech in its choice to no longer introduce the scanning feature, and Apple's choice was widely supported by privacy groups and activists around the world. But the new lawsuit argues that the tech giant merely used this cybersecurity defense to skirt its reporting duties.

"Child sexual abuse material is abhorrent and we are committed to fighting the ways predators put children at risk," wrote Apple spokesperson Fred Sainz in response to the lawsuit. "We are urgently and actively innovating to combat these crimes without compromising the security and privacy of all our users. Features like Communication Safety, for example, warn children when they receive or attempt to send content that contains nudity to help break the chain of coercion that leads to child sexual abuse. We remain deeply focused on building protections that help prevent the spread of CSAM before it starts."

Tech companies have struggled to control the spread of abusive material online. A 2024 report by UK watchdog the National Society for the Prevention of Cruelty to Children (NSPCC) accused Apple of vastly underreporting the amount of CSAM shared across its products, with the company submitting just 267 worldwide reports of CSAM to NCMEC in 2023. Competitors Google and Meta reported more than 1 million and 30 million cases, respectively. Meanwhile, growing concern over the rise of digitally altered or synthetic CSAM has complicated the regulatory landscape, leaving tech giants and social media platforms racing to catch up.

While Apple faces a potential billion-dollar judgment should the suit proceed to trial and a jury rule in the plaintiffs' favor, the outcome has even wider repercussions for the industry and for privacy efforts at large. The court could force Apple to revive its photo library scanning tool or implement other industry features to remove abusive content, paving a more direct path toward government surveillance and dealing another blow to Section 230 protections.

Chase DiBenedetto
Social Good Reporter

Chase joined Mashable's Social Good team in 2020, covering online stories about digital activism, climate justice, accessibility, and media representation. Her work also touches on how these conversations manifest in politics, popular culture, and fandom. Sometimes she's very funny.

