Big Tech Continues Voluntary CSAM Scanning in Europe After EU Law Expiration
What Happened — The EU regulation that gave tech companies a legal basis to proactively scan private communications for child sexual abuse material (CSAM) expired on Saturday. Despite losing that statutory authority, Microsoft, Google, Meta, and Snapchat issued a joint statement saying they will keep their scanning programs running on a voluntary basis.
Why It Matters for TPRM —
- Ongoing scanning may breach EU privacy law, exposing your organization to regulatory fines and litigation.
- Continued use of CSAM‑detection tools raises reputational risk and could trigger data‑subject rights challenges.
- Vendors’ unilateral policy shifts can affect contract compliance and data‑processing agreements.
Who Is Affected — Cloud‑based communication platforms, SaaS collaboration tools, social‑media services, and any downstream enterprises that rely on these providers for messaging, video calls, or content sharing.
Recommended Actions —
- Review existing vendor contracts for clauses on lawful data processing and privacy‑by‑design.
- Request updated compliance attestations from the affected providers confirming how they will meet GDPR requirements without the statutory mandate.
- Update your risk register to reflect heightened regulatory and reputational exposure; consider alternative providers if risk cannot be mitigated.
Technical Notes — The scanning methodology relies on hash‑matching against a curated database of known CSAM hashes; no new software vulnerability is disclosed. The primary concern is the legal justification for deploying this technology in private communications. Source: The Record
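The hash-matching approach described in the technical notes can be illustrated with a minimal sketch. All names and sample data here are hypothetical, and SHA-256 is used only for simplicity; production CSAM detection relies on perceptual hashing (e.g., Microsoft PhotoDNA), which tolerates re-encoding and resizing, rather than exact cryptographic hashes.

```python
import hashlib

# Hypothetical stand-in for a curated database of known-content hashes.
# Real systems ingest hash lists from clearinghouses such as NCMEC.
KNOWN_HASHES = {
    hashlib.sha256(b"known-sample-1").hexdigest(),
    hashlib.sha256(b"known-sample-2").hexdigest(),
}

def is_flagged(content: bytes) -> bool:
    """Return True if the content's hash matches an entry in the known-hash set."""
    return hashlib.sha256(content).hexdigest() in KNOWN_HASHES

print(is_flagged(b"known-sample-1"))   # True: exact match against the database
print(is_flagged(b"benign message"))   # False: no match, content passes through
```

Note that exact hashing only detects byte-identical copies, which is why deployed systems use perceptual hashes; the legal questions raised above apply to both variants, since each requires inspecting private message content.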