Apple Facing Lawsuit for 'Unlawful and Intentional' Recording of Confidential Siri Requests Without User Consent - MacRumors

Apple Facing Lawsuit for 'Unlawful and Intentional' Recording of Confidential Siri Requests Without User Consent

Apple is facing a class action lawsuit [PDF] for employing contractors to listen to and grade some anonymized Siri conversations for the purpose of quality control and product improvement.

Apple's ‌Siri‌ practices were highlighted in a recent report where one of the contractors claimed that Apple employees evaluating ‌Siri‌ recordings often hear confidential medical information, drug deals, and other private information when ‌Siri‌ is activated accidentally.

The lawsuit, filed in a Northern California court today (and shared by CNBC's Kif Leswing), accuses Apple of "unlawful and intentional recording of individuals' confidential communications without their consent," violating California privacy laws when accidental ‌Siri‌ activations are recorded and evaluated by humans.

Siri Devices are only supposed to record conversations preceded by the utterance of "Hey Siri" (a "wake phrase") or through a specific gesture, such as pressing the home button on a device for a specified amount of time. California law prohibits the recording of oral communications without the consent of all parties to the communication.

Individuals who have purchased or used Siri Devices and interacted with Siri have not consented to Apple recording conversations where "Hey Siri" was not uttered or where they did not otherwise perform a gesture intending to activate Siri, such as pressing and holding down the home button on a device for a certain period of time.

As outlined in its privacy policies, Apple collects some anonymized ‌Siri‌ recordings for the purpose of improving ‌Siri‌ and, presumably, cutting down on accidental ‌Siri‌ activations. These recordings are analyzed by humans and can include details recorded when ‌Siri‌ mishears a "Hey ‌Siri‌" trigger word.

The lawsuit claims that Apple has not informed consumers that they are "regularly being recorded without consent," though it also highlights Apple's privacy policy where Apple does state that such data can be used for improving its services.

The plaintiffs in the case, one of whom is a minor, claim to own an iPhone XR and an iPhone 6 that they would not have purchased had they known that their ‌Siri‌ recordings were stored for evaluation. The plaintiffs are seeking class action status for all individuals who were recorded by a ‌Siri‌ device without their consent from October 12, 2011 to the present.

The lawsuit asks for Apple to obtain consent before recording a minor's ‌Siri‌ interactions, to delete all existing recordings, and to prevent unauthorized recordings in the future. It also asks for $5,000 in damages per violation.

Apple has temporarily suspended its Siri evaluation program while it reviews the processes that are in place in light of the contractor's claims. Prior to the suspension, Apple said that a small, random subset (less than 1 percent) of daily ‌Siri‌ requests is analyzed for improving ‌Siri‌ and dictation, with requests not associated with a user's Apple ID.

Apple plans to release a future software update that will let ‌Siri‌ users opt out of having their ‌Siri‌ queries included in the evaluation process, something that is not currently possible. All collected ‌Siri‌ data can be cleared from an iOS device by turning ‌Siri‌ off and then on again, while accidental recordings can be prevented by disabling "Hey ‌Siri‌."


Top Rated Comments

nacheen
87 months ago
Guess they should have read the terms of service better...
Score: 18 Votes
nsayer
87 months ago
All of that quality assurance work and Siri is still comically bad at understanding what she's asked.
Score: 14 Votes
87 months ago
This lawsuit is going nowhere.
Score: 10 Votes
87 months ago
"What happens on your iPhone stays on your iPhone" ... apparently not.

Apple has made HUGE statements about privacy, "Privacy is King", and it turns out that is 'misleading' at best.

Will I opt in? Nope, because you have shown that you are no more trustworthy than any other company when it suits you.
Score: 6 Votes
uajafd
87 months ago
"Apple in the future plans to release a software update that will let Siri users opt out of having their Siri queries included in the evaluation process, something that's not possible at the current time."

Not nearly good enough. This should be opt-in, not opt-out. Also, I imagine this is some grounds for a GDPR lawsuit.

I also read the whole privacy statement displayed when you try to enable Siri. Nowhere in it does it say that humans listen to snippets of your conversations.
Score: 6 Votes
Zaft
87 months ago
That was quick.
Score: 5 Votes