
Apple Abandons Controversial Plans to Detect Known CSAM in iCloud Photos

In addition to making end-to-end encryption available for iCloud Photos, Apple today announced that it has abandoned its controversial plans to detect known Child Sexual Abuse Material (CSAM) stored in iCloud Photos, according to a statement shared with WIRED.

Apple's full statement:

After extensive consultation with experts to gather feedback on child protection initiatives we proposed last year, we are deepening our investment in the Communication Safety feature that we first made available in December 2021. We have further decided to not move forward with our previously proposed CSAM detection tool for iCloud Photos. Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all.

In August 2021, Apple announced plans for three new child safety features: a system to detect known CSAM images stored in iCloud Photos, a Communication Safety option that blurs sexually explicit photos in the Messages app, and expanded child exploitation resources for Siri. Communication Safety launched in the U.S. with iOS 15.2 in December 2021 and has since expanded to the U.K., Canada, Australia, and New Zealand, and the Siri resources are also available, but CSAM detection never launched.

Apple initially said CSAM detection would be implemented in an update to iOS 15 and iPadOS 15 by the end of 2021, but the company ultimately postponed the feature based on "feedback from customers, advocacy groups, researchers, and others." Now, after a year of silence, Apple has abandoned the CSAM detection plans altogether.

Apple promised its CSAM detection system was "designed with user privacy in mind." The system would have performed "on-device matching using a database of known CSAM image hashes" from child safety organizations, which Apple would transform into an "unreadable set of hashes that is securely stored on users' devices."
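The matching step Apple described amounts to an on-device set-membership check against a database of image hashes. Below is a minimal sketch of that idea; it is illustrative only. The function names and hash values are hypothetical, and Apple's actual design used NeuralHash, a perceptual hash that tolerates resizing and re-encoding, together with cryptographic blinding so the database stays unreadable on the device, whereas this toy version uses a plain SHA-256 lookup.

```python
# Minimal sketch of on-device hash matching, loosely modeled on the design
# described above. Illustrative only: Apple's system used NeuralHash (a
# perceptual hash) and a blinded database, not SHA-256 and a plain set.
import hashlib

# Hypothetical hash database distributed to the device (placeholder value).
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def image_hash(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash of the image contents. A cryptographic
    # hash only matches byte-identical files, which is why the real system
    # used a perceptual hash instead.
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_hash(image_bytes: bytes) -> bool:
    # The on-device check: is this image's hash in the known set?
    return image_hash(image_bytes) in KNOWN_HASHES
```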

Apple planned to report iCloud accounts with known CSAM image hashes to the National Center for Missing and Exploited Children (NCMEC), a non-profit organization that works in collaboration with U.S. law enforcement agencies. Apple said there would be a "threshold" that would ensure "less than a one in one trillion chance per year" of an account being incorrectly flagged by the system, plus a manual review of flagged accounts by a human.
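The "one in one trillion" figure follows from requiring many independent matches before an account is flagged: treating false matches as independent events, the chance of an account crossing the threshold is a binomial tail probability. In the back-of-the-envelope sketch below, the per-image false-match rate is an assumed value, and the threshold of 30 comes from Apple's published threat-model summary rather than this article.

```python
# Rough illustration of how a match threshold suppresses false positives.
# The per-image false-match rate (1e-6) and the threshold (30) are assumed
# values, not figures from the article.
from math import exp, lgamma, log

def log_binom_pmf(n: int, k: int, p: float) -> float:
    # log of C(n, k) * p^k * (1 - p)^(n - k), computed in log space to
    # avoid overflow for large photo libraries.
    return (lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)
            + k * log(p) + (n - k) * log(1 - p))

def p_account_flagged(n_photos: int, p_false: float, threshold: int) -> float:
    # P(at least `threshold` false matches) under a Binomial(n, p) model.
    # Terms decay so quickly that a few hundred are more than enough.
    upper = min(threshold + 500, n_photos)
    return sum(exp(log_binom_pmf(n_photos, k, p_false))
               for k in range(threshold, upper + 1))

# A 100,000-photo library, a one-in-a-million per-image false-match rate,
# and a 30-match threshold:
print(p_account_flagged(100_000, 1e-6, 30))  # ~4e-63, far below 1e-12
```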

Apple's plans were criticized by a wide range of individuals and organizations, including security researchers, the Electronic Frontier Foundation (EFF), politicians, policy groups, university researchers, and even some Apple employees.

Some critics argued that the feature would have created a "backdoor" into devices, which governments or law enforcement agencies could use to surveil users. Another concern was false positives, including the possibility of someone intentionally adding CSAM imagery to another person's iCloud account to get their account flagged.

Note: Due to the political or social nature of the discussion regarding this topic, the discussion thread is located in our Political News forum. All forum members and site visitors are welcome to read and follow the thread, but posting is limited to forum members with at least 100 posts.

Top Rated Comments

Populus
45 months ago
This is the right decision for Apple to make, in my opinion. I’m glad they recognized that there are better ways to prevent the spread of this type of content.

I’m sincerely surprised Apple backtracked on something as big as this (and under such heavy pressure from governments).
Score: 64 Votes
TheYayAreaLiving 🎗️
45 months ago

"Yeah, but now we can't catch the pedophiles."
That's law enforcement's and the government's job, not Apple's.
Score: 63 Votes
TheYayAreaLiving 🎗️
45 months ago
Thank you, Apple. CSAM detection was a joke. "If privacy matters in your life, it should matter to the phone your life is on." Long live!

Score: 44 Votes
Realityck
45 months ago

Apple today announced that it has abandoned its plans to detect known CSAM stored in iCloud Photos, according to a statement shared with WIRED (https://www.wired.com/story/apple-photo-scanning-csam-communication-safety-messages/).
Everybody should be happy CSAM is DOA
Score: 30 Votes
aPple nErd
45 months ago
Great news but they lost all my trust with this. I will continue to use local backups on my Mac and keep my photos out of iCloud for good.
Score: 27 Votes
aPple nErd
45 months ago

[S]After extensive consultation with experts[/S] After extensive public pressure...

Never should have succumbed to the public feedback! Perhaps a botched introduction but no one else would have done it 'right' like Apple because of the scrutiny they are under.
Truly an unhinged take
Score: 26 Votes