Apple just can’t seem to catch a break when it comes to Siri privacy concerns. Fresh off agreeing to a substantial $95 million settlement in the U.S., the tech giant now faces a criminal complaint in France over similar allegations. A French human rights organization has filed the complaint with Paris prosecutors, claiming Apple’s voice assistant violated European privacy laws through unauthorized data collection.
This parallel timing isn’t coincidental. It reveals how Apple’s decade-long data practices are now generating legal consequences across multiple jurisdictions. The U.S. settlement covers alleged privacy violations spanning from September 2014 to December 2024, essentially the entire lifespan of modern Siri. Now French authorities are examining whether these same practices constitute criminal violations under GDPR, not just civil privacy breaches.
For Apple users worldwide, this dual-front legal battle raises some uncomfortable questions. What exactly has Siri been recording, and who’s been listening to it? The French case, in particular, could set significant precedent for how voice assistant privacy violations are prosecuted beyond U.S. borders.
The whistleblower who opened Pandora’s box
The French complaint centers on testimony from Thomas Le Bonniec, a former contractor who worked for Globe Technical Services in Cork, Ireland. Starting in spring 2019, Le Bonniec was part of a team tasked with improving Siri’s multilingual responses by listening to, transcribing, and tagging recordings captured by Apple’s voice assistant.
Day one set the tone. "On the very first day, we were told we were going to work on recordings of people talking to their assistant Siri or on recordings captured without their knowledge when the machine was triggered by mistake," Le Bonniec told Radio France. That direct acknowledgment of unauthorized recording undercuts any claim that these issues were accidental or unknown to operational teams.
His role went beyond simple transcription. His responsibilities included checking Siri’s transcriptions for accuracy and determining whether they were accidental recordings, and the scope of private conversations he encountered was staggering. During his time at GTS, he and his colleagues listened to "a considerable number of very private conversations triggered by mistake": intimate conversations that users never consented to share with anyone, let alone Apple contractors.
The most invasive aspect involved systematic data profiling. Some team members had "labeling duties" that went far beyond voice recognition improvement. They had to compare keywords spoken during recordings with data stored on users’ devices, including contacts, geolocation, music, and films, then tag this personal information with relevant keywords. This process effectively created behavioral profiles by linking accidental voice recordings with personal digital information, precisely the type of unauthorized data pairing that GDPR was designed to prevent.
France draws the legal line
The French complaint, filed with the Paris prosecutor on February 13, 2025, marks a step up from routine privacy disputes. The Ligue des droits de l’Homme (LDH) is not just seeking regulatory fines; it is pursuing criminal charges. The organization accuses Apple of violations including privacy breaches, unlawful personal data processing, and deceptive commercial practices, as first reported by Radio France and Le Monde.
LDH president Nathalie Tehio boiled the allegations down to two core offenses. The complaint focuses on two main offenses: invasion of privacy through recordings made without individuals’ consent, and violation of EU personal data protection law. Her description is blunt: "It’s not just spied on, it’s recorded. There is listening, recording, and even sending."
The criminal framework matters. “There is recording without people’s knowledge. This is an infraction. On the other hand, there is a violation of the GDPR, that is to say, the fact that we have not given our informed consent for this aspiration of personal data. These are two crimes,” Tehio emphasized. Her word choice, "aspiration," suggests a vacuuming up of personal information rather than incidental collection, a detail that may support criminal intent allegations.
The legal strategy also leans on France’s Sapin 2 law, which provides whistleblower protections, allowing Le Bonniec to claim official whistleblower status while supporting LDH’s criminal complaint. That framework could turn what might otherwise be a civil privacy fight into a criminal investigation in which corporate leaders face personal liability.
Apple’s defense strategy under pressure
Apple’s response follows a familiar playbook: minimize the immediate legal threat and spotlight post-2019 fixes. An Apple spokesperson pointed out that the French case remains only a privacy complaint at the time of writing, with no investigation opened yet. Accurate, but it does not engage with the substance of the allegations.
The company points to technical reforms since 2019 as proof of course correction. Apple explains it made changes to ensure Siri’s compliance with the company’s privacy commitments, including no longer retaining audio recordings of Siri interactions. On top of that, users can now opt in or out of allowing Siri to improve by learning from audio samples of their requests.
Apple’s January 2025 statement takes aim at rumor-mill chatter about data monetization. The company declared: "Apple has never used Siri data to build marketing profiles, never made it available for advertising, and never sold it to anyone for any purpose."
The company also leans on its current architecture. The iPhone maker emphasized that Siri’s design prioritizes on-device processing to protect user privacy. When server access is required, "Siri uses as little data as possible to deliver an accurate result." The company says its privacy architecture uses random identifiers instead of Apple Account information during data processing, which is "unique among digital assistants in use today."
Still, these protections highlight rather than resolve the exposure created by historical practices. The gap between Apple’s current system and the methods Le Bonniec described forms a clear timeline that both U.S. and French actions are now probing.
The $95 million settlement’s global implications
Apple’s U.S. settlement provides context for the potential global ripple effects of the French complaint. The agreement encompasses U.S. users of Siri-enabled devices from September 17, 2014, to December 31, 2024, and eligible individuals can claim up to $20 per device, with a maximum of five devices per person, for a total payout of up to $100 per claimant.
The remedial measures show how far Apple had to go to reset. Key measures include deleting individual Siri audio recordings collected before October 2019 and publishing instructions for users on how to opt into Siri’s ‘Improve Siri’ program. Deleting pre-2019 recordings reads less like a tweak and more like a purge.
The underlying allegations are straightforward. The lawsuit centers on claims that Siri inadvertently recorded private conversations and shared those recordings with third-party contractors without user consent. The issue dates back to 2014, when Apple introduced hands-free activation for Siri, creating a long runway for the alleged violations.
Advertising concerns added fuel. Some users received targeted ads for products mentioned in private conversations, prompting questions about whether Apple shared these recordings with advertisers. Apple flatly denies monetizing Siri data, yet the settlement’s timing and scale suggest the handling of that data raised enough commercial privacy worries to warrant a sizable resolution.
The settlement’s reach is narrow. Only about 3% to 5% of eligible U.S. users are expected to file claims, while millions of international users affected by the same practices receive no compensation. The French investigation could set precedent for criminal accountability beyond U.S. civil settlements, especially under GDPR’s stricter rules.
What this means for voice assistant privacy
The convergence of Apple’s U.S. settlement and France’s criminal complaint is a watershed moment for voice assistant privacy. It surfaces a basic tension: AI systems need data to improve, but users expect that data to stay private.
The post-2019 changes at Apple offer a blueprint for others. Since 2019, Apple has ceased retaining audio recordings by default and shifted to using computer-generated transcripts for service improvements. Apple subsequently made audio recording retention opt-in only and discontinued third-party contractor access to such recordings. Translation: privacy-preserving improvement is possible; it just takes work.
France’s criminal lens could reshape corporate accountability. Fines can be a cost of doing business; criminal charges come with reputational and operational risk that travels far beyond French borders.
For consumers, the lesson is not subtle. Privacy violations can generate long-lasting legal consequences even after companies roll out fixes. Apple has already taken steps to address these privacy concerns, including changes to how Siri processes voice recordings and offering users the ability to opt out of data sharing, yet past practices continue to invite scrutiny.
And this is global. Improvements in one country do not shield a company from prosecutors in another. The French case could shape how voice assistant privacy violations are charged under different frameworks, creating a tricky map of potential liability for tech firms to navigate.
Users who want to protect themselves should review voice assistant settings and data permissions, with the understanding that current controls may not cover what was collected years ago. As these cases show, privacy, once compromised, is difficult to restore.
As this legal drama unfolds across two continents, one thing is clear: the era of "collect first, ask permission later" for voice assistants is over. Whether through civil settlements or criminal prosecutions, the privacy reckoning for AI-powered assistants has arrived, and the consequences are reshaping how these technologies handle our most private conversations.