Key Highlights
- A woman is suing Apple for abandoning its planned CSAM detection system for iCloud
- The lawsuit seeks over $1.2 billion in damages
- Apple faces criticism for not doing enough to protect victims
A complaint filed by a 27-year-old woman, who alleges that Apple failed to safeguard victims of child sexual abuse, has put the company under legal scrutiny. According to the lawsuit, Apple abandoned its plan to detect child sexual abuse material (CSAM) on iCloud, leaving victims at risk of further harm.
What Happened?
- In 2021, Apple unveiled a feature designed to scan iCloud photos for CSAM using on-device technology.
- In 2022, Apple dropped the CSAM detection plan over privacy concerns raised by experts and advocacy groups, although it kept a nudity-detection tool in the Messages app.
The Lawsuit Details
- The 27-year-old victim of childhood abuse alleges that Apple’s decision to abandon the CSAM detection tool left her vulnerable. Law enforcement found images of her abuse stored on iCloud via a MacBook.
- She argues that Apple broke its promise to protect abuse victims by selling products that do not safeguard users from exploitation.
- The lawsuit seeks compensation for victims and calls for changes in Apple’s practices. The plaintiff’s legal team estimates that up to 2,680 victims could join, with damages potentially exceeding $1.2 billion.
Comparison With Other Companies
Other tech giants like Google and Meta continue to use CSAM-scanning tools that detect more illegal material than Apple’s nudity-detection feature.
Other Legal Challenges For Apple
- North Carolina Lawsuit: Apple is also facing a separate lawsuit from a nine-year-old victim who claims strangers used iCloud links to send her CSAM videos. Apple has attempted to dismiss the case, citing Section 230 protections, which typically shield companies from user-uploaded content.
- Court Rulings: Recent legal rulings suggest that these protections may not apply if companies fail to actively moderate harmful content.
Apple’s Defense
Apple has defended its actions by saying it remains committed to protecting user privacy and combating child exploitation. It points to features such as the ability for users to report harmful content and the detection of nudity in Messages.
The plaintiff’s attorneys counter that these steps are insufficient, citing more than 80 instances in which the plaintiff’s abuse images were shared, including by a California-based user on iCloud.

