Apple knowingly ignoring child porn is a "never-ending nightmare," lawsuit says.
Thousands of victims have sued Apple over its alleged failure to detect and report illegal child pornography, also known as child sex abuse materials (CSAM).
The proposed class action comes after Apple scrapped a controversial CSAM-scanning tool last fall that was supposed to significantly reduce the spread of CSAM in its products. Apple defended its decision to kill the tool after dozens of digital rights groups raised concerns that the government could seek to use the functionality to illegally surveil Apple users for other reasons. Apple was also concerned that bad actors could use the functionality to exploit its users and sought to protect innocent users from false content flags.
Child sex abuse survivors suing have accused Apple of using the cybersecurity defense to ignore the tech giant's mandatory CSAM reporting duties. If they win over a jury, Apple could face more than $1.2 billion in penalties. And perhaps most notably for privacy advocates, Apple could also be forced to "identify, remove, and report CSAM on iCloud and implement policies, practices, and procedures to prevent continued dissemination of CSAM or child sex trafficking on Apple devices and services." That could mean a court order to implement the controversial tool or an alternative that meets industry standards for mass-detecting CSAM.
In a statement, Apple did not address survivors' key complaints regarding detection of known CSAM. Some survivors are in their late 20s now but were victimized when they were only infants or toddlers, and they have been traumatized by the crime notifications they have received for decades, including some showing that images of their abuse were found on Apple devices and services. But Apple's current safety efforts seem to focus more on detecting grooming or new CSAM than on stopping the re-traumatization caused by known CSAM spreading.
"Child sexual abuse material is abhorrent and we are committed to fighting the ways predators put children at risk," Apple's spokesperson said. "We are urgently and actively innovating to combat these crimes without compromising the security and privacy of all our users. Features like Communication Safety, for example, warn children when they receive or attempt to send content that contains nudity to help break the chain of coercion that leads to child sexual abuse. We remain deeply focused on building protections that help prevent the spread of CSAM before it starts."
One of the plaintiffs suing anonymously to prevent further harm accused Apple of turning a "blind eye" to CSAM for years while she endures a "never-ending nightmare." One survivor told The New York Times that she's "suing Apple because she says it broke its promise to protect victims like her" when Apple failed to implement the CSAM detector.
Apple profits off ignoring CSAM, lawsuit says
To build the case, survivors' lawyers dug through 80 cases in which law enforcement found CSAM on Apple products, identifying a group of 2,680 survivors as potential class members. The majority of the CSAM was found on iCloud, which the lawsuit alleged has become a "significant profit center" for Apple despite ongoing criticism of its apparent failures to address CSAM.
As survivors see it, Apple profits from allowing CSAM on iCloud, as child predators view its products as a safe haven to store CSAM that most other Big Tech companies mass report. Where Apple reported only 267 known instances of CSAM in 2023, four other "leading tech companies submitted over 32 million reports," the lawsuit noted. And if Apple's allegedly lax approach to CSAM continues unchecked, survivors fear that AI could exponentially increase the amount of CSAM that goes unreported.
When Apple devices are used to spread CSAM, it's a huge problem for survivors, who allegedly face a range of harms, including "exposure to predators, sexual exploitation, dissociative behavior, withdrawal symptoms, social isolation, damage to body image and self-worth, increased risky behavior, and profound mental health issues, including but not limited to depression, anxiety, suicidal ideation, self-harm, insomnia, eating disorders, death, and other harmful effects." One survivor told The Times she "lives in constant fear that someone might track her down and recognize her."
Survivors suing have also incurred medical and other expenses due to Apple's inaction, the lawsuit alleged. And those expenses will keep piling up if the court battle drags on for years and Apple's practices remain unchanged.
Apple could win, a lawyer and policy fellow at the Stanford Institute for Human-Centered Artificial Intelligence, Riana Pfefferkorn, told The Times, as survivors face "significant hurdles" seeking liability for mishandling content that Apple says Section 230 shields. And a win for survivors could "backfire," Pfefferkorn suggested, if Apple proves that forced scanning of devices and services violates the Fourth Amendment.
Survivors, some of whom own iPhones, think that Apple has a responsibility to protect them. In a press release, Margaret E. Mabie, a lawyer representing survivors, praised survivors for raising "a call for justice and a demand for Apple to finally take responsibility and protect these victims."
"Thousands of brave survivors are coming forward to demand accountability from one of the most successful technology companies on the planet," Mabie said. "Apple has not only rejected helping these victims, it has advertised the fact that it does not detect child sex abuse material on its platform or devices thereby exponentially increasing the ongoing harm caused to these victims."