Apple is under criminal investigation in France, where the Paris prosecutor’s office is examining its handling of Siri voice recordings. The probe follows a formal complaint alleging illicit data collection and breaches of user privacy, reigniting scrutiny of how major tech companies manage sensitive audio data.
The Origin of the Investigation
The investigation stems from a complaint filed by the Ligue des droits de l’Homme (LDH), a prominent French human rights organization. The complaint relies heavily on the testimony of Thomas Le Bonniec, a former Apple subcontractor in Ireland, who revealed that he and others were tasked with listening to and analyzing Siri recordings as part of a “quality control” program.
Le Bonniec alleged that the recordings often contained sensitive personal information, such as medical details and private conversations, and that many were captured accidentally, without users’ consent.
Privacy Concerns and Allegations
The central concern is whether Apple collected, reviewed, and analyzed user conversations without clear and informed consent. According to the complaint, audio snippets were sometimes classified as intentional interactions even when they resulted from accidental activations, potentially inflating the dataset used for training purposes.
These practices raise significant questions under European privacy law, particularly the General Data Protection Regulation (GDPR), which requires a valid legal basis, such as freely given and informed consent, for processing personal data, and explicit consent for special categories such as health information.
Investigating Authority
The case has been referred to the Office for Combating Cybercrime (OFAC), a specialized unit of the French National Police that investigates cybercrime and data-related offenses. The referral underscores the seriousness with which French authorities are treating the complaint.
Apple’s Response
Apple maintains that it suspended the grading program in 2019 and moved to an explicit opt-in system. The company says it retains audio recordings only from users who have opted in to having them reviewed for service improvement, and that it does not sell the data or use it for marketing purposes.
Apple’s response highlights the tension between AI development and user privacy protections, particularly as digital assistants like Siri rely on large volumes of audio data to function effectively.
Broader Implications for Tech Companies
The French investigation intensifies scrutiny of Apple’s data-handling practices, which have previously drawn class-action lawsuits and regulatory fines in multiple countries. The case could set a precedent for how tech companies handle voice data under European law and may influence similar investigations across the EU.
Privacy advocates argue that ensuring robust consent mechanisms and transparent data policies is critical to protecting users in an age of AI-powered personal assistants.
The French criminal investigation into Apple’s Siri practices underscores the ongoing friction between technological innovation and user privacy. As voice assistants become increasingly integrated into daily life, companies like Apple must navigate complex legal and ethical challenges while maintaining public trust.