Apple's Proposed Phone-Scanning Child Safety Features 'Invasive, Ineffective, and Dangerous,' Say Cybersecurity Researchers in New Study

More than a dozen prominent cybersecurity experts hit out at Apple on Thursday for relying on "dangerous technology" in its controversial plan to detect child sexual abuse images on iPhones (via The New York Times).

The damning criticism came in a new 46-page study in which researchers examined plans by Apple and the European Union to monitor people's phones for illicit material, calling the efforts ineffective and dangerous strategies that would embolden government surveillance.

Announced in August, the planned features include client-side (i.e. on-device) scanning of users' iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search.

According to the researchers, documents released by the European Union suggest that the bloc's governing body is seeking a similar program that would scan encrypted phones both for child sexual abuse material and for signs of organized crime and terrorism-related imagery.

"It should be a national-security priority to resist attempts to spy on and influence law-abiding citizens," the researchers said.

"The expansion of the surveillance powers of the state really is passing a red line," said Ross Anderson, a professor of security engineering at the University of Cambridge and a member of the group.

Aside from surveillance concerns, the researchers said, their findings indicated that the technology was not effective at identifying images of child sexual abuse. Within days of Apple's announcement, they said, people had pointed out ways to avoid detection by editing the images slightly.
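The evasion the researchers describe stems from a basic property of image matching: an exact cryptographic hash changes completely under even a one-bit edit, which is why detection systems like Apple's rely instead on perceptual hashes that tolerate small changes, and those proved susceptible to crafted near-misses and collisions. A minimal sketch of the exact-hash half of that trade-off, using ordinary SHA-256 on stand-in image bytes (not Apple's NeuralHash):

```python
import hashlib

# Stand-in for an image file's raw bytes.
original = bytes(range(256)) * 4

# A "slight edit": flip a single bit in one byte.
edited = bytearray(original)
edited[0] ^= 1

h1 = hashlib.sha256(original).hexdigest()
h2 = hashlib.sha256(bytes(edited)).hexdigest()

# The two digests share no useful similarity, so exact hashing
# cannot match an edited copy of a known image.
print(h1 == h2)  # False
```

Perceptual hashing closes this gap by design, but the same tolerance that lets it match edited copies is what researchers exploited to produce benign images that collide with flagged ones.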

"It's allowing scanning of a personal private device without any probable cause for anything illegitimate being done," added another member of the group, Susan Landau, a professor of cybersecurity and policy at Tufts University. "It's extraordinarily dangerous. It's dangerous for business, national security, for public safety and for privacy."

The cybersecurity researchers said they had begun their study before Apple's announcement, and were publishing their findings now to inform the European Union of the dangers of its own similar plans.

Apple has faced significant criticism from privacy advocates, security researchers, cryptography experts, academics, politicians, and even employees within the company for its decision to deploy the technology in a future update to iOS 15 and iPadOS 15.

Apple initially endeavored to dispel misunderstandings and reassure users by releasing detailed information, sharing FAQs and other new documents, and giving interviews with company executives.

However, when it became clear that this wasn't having the intended effect, Apple acknowledged the negative feedback and announced in September a delay to the rollout of the features to give the company time to make "improvements" to the CSAM system, although it's not clear what those improvements would involve or how they would address concerns.

Apple has also said it would refuse demands by authoritarian governments to expand the image-detection system beyond pictures of children flagged by recognized databases of child sex abuse material, although it has not said that it would pull out of a market rather than obey a court order.


Top Rated Comments

Wanted797 · 55 months ago
Good.

If Apple want to promote themselves as privacy focused, they deserve every bit of criticism for this ridiculous idea.

They brought it on themselves.
Score: 171 Votes
_Spinn_ · 55 months ago
Quoting the article (https://www.macrumors.com/2021/08/09/apple-faq-csam-detection-messages-scanning/): "Apple has also said it would refuse demands by authoritarian governments to expand the image-detection system beyond pictures of children flagged by recognized databases of child sex abuse material, although it has not said that it would pull out of a market rather than obeying a court order."

I just don't see how Apple thinks this is even feasible. How do they expect to ignore the laws of a local government?

The whole idea of scanning content locally on someone's phone is a terrible idea that will eventually be abused.
Score: 88 Votes
MathersMahmood · 55 months ago
Quoting Wanted797: "Good. If Apple want to promote themselves as privacy focused, they deserve every bit of criticism for this ridiculous idea. They brought it on themselves."

This. 100% this.
Score: 63 Votes
LV426 · 55 months ago
Quoting the article (https://www.macrumors.com/2021/08/09/apple-faq-csam-detection-messages-scanning/): "Apple has also said it would refuse demands by authoritarian governments to expand the image-detection system"

Apple cannot refuse such demands if they are written into a nation's law, so this is a worthless promise. The UK government has had the power since 2016 to compel Apple, amongst others, to provide technical means of obtaining the information it wants. But, worse than that, Apple is not permitted to divulge that any such compulsion order has been made; by law, it must keep those measures secret. It's all very, very Big Brother.
Score: 58 Votes
iGobbleoff · 55 months ago
Quoting an earlier comment: "Lots of confusion/scaremongering here; they are not scanning phones. They are looking at photos that are on Apple servers. How can anyone object to this unless you have something to hide? Why would you object when it helps children?"

Because it's the beginning of a slippery slope of scanning your device for anything else that some group can think of. Phones are now nothing but trackers for big tech and governments to abuse.
Score: 54 Votes
H2SO4 · 55 months ago
Quoting an earlier comment: "Lots of confusion/scaremongering here; they are not scanning phones. They are looking at photos that are on Apple servers."

Are you sure? Quoting the article (https://www.macrumors.com/2021/08/05/apple-new-child-safety-features/): "Announced in August, the planned features include client-side (i.e. on-device) scanning of users' iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search."
Score: 45 Votes