Apple’s $95M Siri Privacy Payout: Here’s How to Snag Your $20 Before It’s Gone
Another day, another tech giant cutting checks to dodge a PR nightmare. This time, Apple is coughing up $95 million for allegedly eavesdropping via Siri. Your slice? A cool $20 per device… if you jump through the right hoops.
Who qualifies? U.S. residents who owned a Siri-enabled device (iPhone 6 or later, HomePod, Apple Watch, and more) between September 17, 2014, and December 31, 2024. No proof of harm required, just an attestation that Siri activated when it shouldn't have.
How to claim: File online via the settlement portal by July 2, 2025, and expect your payout via check or digital transfer… eventually. Pro tip: If you think $20 covers the existential dread of corporate surveillance, wait until you see what they charge for a replacement charger.
How to get paid
To qualify for the settlement, you’ll need to be a U.S. resident who owned one or more qualifying devices between September 17, 2014, and December 31, 2024. The process requires submitting a claim by July 2, 2025, and verifying under oath that Siri activated without your permission.
Claims can be submitted on the Lopez Voice Assistant Settlement website, which is available in both English and Spanish.
The settlement covers a wide range of Apple devices, including iPhone 6 and newer models, iPads released since 2014, all generations of the Apple Watch, the HomePod and HomePod Mini, as well as MacBooks and iMacs manufactured since 2014.
Under the settlement terms, users can receive $20 per qualifying device, with a maximum payout of $100 per household for up to five devices. The final payment could increase if fewer claims are filed than expected. The legal team representing the plaintiffs will receive approximately $30 million from the settlement fund.
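For anyone tallying their own devices, the payout terms above reduce to a simple capped multiplication. A minimal sketch (the function name and structure are illustrative only, not part of the settlement paperwork):

```python
def estimated_payout(qualifying_devices: int) -> int:
    """Estimate a claimant's base payout under the reported terms:
    $20 per qualifying device, capped at $100 (i.e., five devices).
    Illustrative only: the per-device amount could rise if fewer
    claims are filed than expected."""
    PER_DEVICE = 20
    CAP = 100  # maximum payout, reached at five devices
    return min(qualifying_devices * PER_DEVICE, CAP)

print(estimated_payout(2))  # → 40
print(estimated_payout(7))  # → 100, capped at five devices' worth
```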
Before starting, users should gather their device serial numbers or proof of purchase, since claimants will need that information to complete the online form. From there, provide the requested documentation, select a preferred payment method, and submit the claim before the July 2 deadline.
Hey Siri, stop listening
The lawsuit stems from a 2019 exposé by The Guardian, which revealed that Apple contractors regularly accessed private Siri recordings. According to the claims, contractors reported hearing medical appointments, business deals, and intimate moments, which were also allegedly shared with advertisers.
Lead plaintiff Fumiko Lopez’s experience highlights the potential privacy breach. As reported by the BBC, shortly after discussing Air Jordan shoes at home, she and her daughter noticed targeted advertisements for the exact models they mentioned. Another plaintiff reported seeing ads for specific medical treatments shortly after discussing them with their doctor.
"Apple has at all times denied and continues to deny any and all alleged wrongdoing and liability," the court filing states. The company maintains that Siri data collection serves only to improve the service and remains anonymized.
Besides the $95 million payment, the settlement also requires Apple to confirm the permanent deletion of all Siri audio recordings collected before October 2019.
This settlement arrives amid growing concerns about AI-powered voice assistants, and AI in general. Similar lawsuits have targeted other tech giants, with Google facing a parallel class action suit, also in California.
“Plaintiffs in the lawsuit allege that Google Assistant can activate and record communications even when a user does not intentionally trigger Google Assistant with a hot word, like ‘Okay Google,’ or manually activate Google Assistant on their device,” the official site for the class action lawsuit reads.
Amazon agreed in 2023 to pay $25 million for similar privacy violations tied to its Alexa devices, with the FTC's statement noting that its "complaint alleges that Amazon retained children’s voice recordings indefinitely by default" in violation of the Children's Online Privacy Protection Act (COPPA).
Of course, all of these companies have previously claimed to respect and protect their users’ privacy. That claim matters more than ever now that each of them is developing its own generative AI models to improve the user experience, and doing so requires tons and tons of data.
If you want to be extra careful and protect your privacy, you can prevent Siri from automatically activating, or stop using AI assistants altogether. Not ideal, but that’s the world we live in.
Edited by Andrew Hayward
Editor’s note: This story was originally published on January 3, 2025 and last updated with new details on May 9.