Jury Finds Meta Violated Wiretap Law on Period Tracker Data

Overview of the Verdict
On August 5, 2025, a federal jury in the Northern District of California found that Meta intentionally intercepted sensitive health data from users of the Flo period- and pregnancy-tracking app without obtaining valid consent—thereby violating the California Invasion of Privacy Act (CIPA). Plaintiffs in this class-action suit demonstrated by a preponderance of the evidence that Meta’s integration of its Facebook SDK amounted to eavesdropping and recording private in-app communications, undermining users’ reasonable expectation of privacy.
Case Background and Procedural History
The lawsuit was originally filed in 2021 against Flo Health for its onboarding questionnaire, which required users to disclose their reproductive goals and detailed menstrual or pregnancy data. Meta, Google, and analytics provider Flurry were later added as defendants when it emerged they had integrated Flo’s Custom App Events (CAEs) into their advertising pipelines.
- Flo settled just before trial, claiming its privacy policy allowed third-party analytics.
- Google and Flurry reached settlements, with Flurry agreeing to a $3.5 million payment pending approval.
- Meta stood alone at trial, vehemently denying knowledge or misuse of health data.
Despite Meta’s arguments that CAEs are opaque coded payloads lacking a decoding key—thus making them unintelligible—the jury sided with plaintiffs, concluding that internal Meta communications and technical documentation proved the company knew exactly what it was receiving.
Technical Mechanism: SDK Telemetry and Custom App Events
The heart of the dispute centers on the Facebook Software Development Kit (SDK) embedded in the Flo app. Between November 2016 and February 2019:
- Flo’s onboarding survey generated CAEs that tagged user inputs (e.g., “pregnant” or “tracking_period”).
- These events were dispatched via HTTPS to Meta’s telemetry servers alongside device identifiers (IDFA/GAID).
- Meta ingested the data into its ad-targeting graph, correlating reproductive health markers with demographic and behavioral cohorts.
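The telemetry flow described above can be sketched in a few lines of Python. This is a minimal, hypothetical illustration only: the event name, field names, and helper function are invented for clarity and do not reflect Meta's actual SDK schema.

```python
import json
import uuid

def build_custom_app_event(event_name: str, params: dict, ad_id: str) -> dict:
    """Assemble a payload resembling a custom app event (hypothetical schema).

    Real SDKs batch events and attach many more fields; this sketch shows only
    how an app-defined survey answer and a device advertising identifier can
    end up in the same payload.
    """
    return {
        "event": event_name,      # app-defined event label
        "custom_params": params,  # app-defined key/value pairs
        "advertiser_id": ad_id,   # IDFA (iOS) or GAID (Android)
    }

# A Flo-style onboarding answer paired with a (randomly generated) ad ID:
payload = build_custom_app_event(
    "onboarding_answer",
    {"question": "goal", "answer": "pregnant"},
    str(uuid.uuid4()),
)
wire_format = json.dumps(payload)  # the string that would travel over HTTPS
```

The point of the sketch is the pairing: because the advertising identifier rides alongside the custom parameters, the receiving server needs no "decoding key" to associate a health-related answer with a targetable device.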
“Our SDK received structured strings labeled ‘lifecycle_event’ which we later parsed to optimize ad delivery,” wrote a Meta engineer in 2018 internal Slack logs, introduced as evidence.
Privacy experts note that while SDKs accelerate feature development, they also create “black boxes” in the data supply chain. Without robust audit trails or end-to-end encryption of CAE contents, third parties may retroactively decode and repurpose sensitive payloads.
Legal and Regulatory Context
CIPA is among the strictest state-level statutes governing electronic surveillance. It prohibits any unauthorized “eavesdropping” or “recording” of private communications using an electronic device. This verdict builds on recent enforcement trends:
- FTC Health Data Guidance (2024): Issued best practices for digital health platforms to obtain explicit opt-in consent.
- GDPR Fines in Europe (2023–2025): Multiple health-tech firms penalized for inadequate data minimization.
- Pending American Data Privacy and Protection Act: May establish federal guardrails similar to CIPA if enacted.
Implications for AI-Driven Ad Targeting
Meta and Google feed machine learning models with telemetry from partner apps to refine user segments. The jury’s decision signals heightened scrutiny over:
- Data Provenance: Platforms must verify that all input data respects user consent flows.
- Model Training Ethics: Incorporating sensitive health signals without explicit permission could violate both civil law and emerging AI regulations.
- Accountability Mechanisms: Calls are growing for third-party audits of SDK data pipelines and cryptographic attestations of data schemas.
Jane Doe, a digital privacy researcher at the Electronic Frontier Foundation, commented: “This verdict underscores that algorithms cannot be divorced from their data sources. Responsible AI demands transparent provenance, especially when handling intimate user profiles.”
Expert Perspectives and Future Outlook
Legal analysts predict Meta will appeal, arguing CIPA doesn’t apply to metadata strings. However, the mixed use of CAEs for both app analytics and ad targeting complicates the defense. Meanwhile, Congress is reviewing bills to strengthen digital health data protections and require stronger SDK disclosures.
Meta’s statement asserts: “We did not want or process health or other sensitive information. Our terms prohibit developers from sending it, and we rely on them to comply.” Critics counter that relying on developer self-certification is insufficient without independent verification.
Key Takeaways for Developers and Platforms
- Implement end-to-end encryption for all CAEs containing PII or health data.
- Maintain a data lineage registry to track event schemas and consent timestamps.
- Conduct periodic third-party audits of embedded SDKs to ensure policy compliance.
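The lineage-registry recommendation above can be made concrete with a small sketch. All names and the record schema here are illustrative assumptions, not a reference to any existing registry product; the schema hash serves as a simple attestation that downstream consumers received exactly the fields the user consented to.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
import hashlib
import json

@dataclass(frozen=True)
class LineageRecord:
    """One entry in a hypothetical data lineage registry."""
    event_name: str
    schema_fields: tuple       # field names the event is permitted to carry
    consent_obtained_at: str   # ISO-8601 timestamp of the user's opt-in
    schema_hash: str           # SHA-256 over the sorted field list

def register_event(event_name: str, schema_fields: list, consent_ts: str) -> LineageRecord:
    # Hash the canonical (sorted) field list so any party can later verify
    # that an event's shape matches what was registered at consent time.
    digest = hashlib.sha256(json.dumps(sorted(schema_fields)).encode()).hexdigest()
    return LineageRecord(event_name, tuple(sorted(schema_fields)), consent_ts, digest)

record = register_event(
    "onboarding_answer",
    ["question", "answer"],
    datetime.now(timezone.utc).isoformat(),
)
```

In a real deployment the registry would be append-only and independently auditable; the sketch only shows how an event schema and a consent timestamp can be bound together in one verifiable record.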
Upcoming Legislative Hearings
Next month, the House Energy & Commerce Subcommittee on Innovation, Data, and Commerce will hold hearings on digital health privacy, where Meta executives are expected to testify. Observers anticipate proposals to mandate transparent SDK manifests and user-accessible logs of data-sharing events.