Signal Enhances Desktop Privacy, Blocks Windows Recall with DRM API

Dan Goodin – May 21, 2025
Signal Messenger today announced that its Windows Desktop client will, by default, block Microsoft’s AI-powered screenshot tool, Windows Recall, from capturing any content displayed inside the app. The move comes after months of community criticism and a recent re-rollout of Recall in Windows 11, which still poses privacy risks for end-to-end encrypted messaging.
What Is Windows Recall and Why It Matters
First introduced in May 2024 and reintroduced in April 2025 after significant redesigns, Windows Recall is an AI tool that automatically:
- Captures screenshots every three seconds (default interval).
- Processes images with an OCR engine to extract text content.
- Indexes and stores results in a local SQLite database.
Recall’s purpose is to help users search and recall past on-screen activities — from code snippets to chat logs — without manual archiving. However, early builds stored all imagery and plain-text OCR outputs unencrypted, creating a high-value target for malware and insiders.
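Recall's code is not public, but the capture-OCR-index loop described above is straightforward to picture. The sketch below reconstructs it with off-the-shelf npm packages (screenshot-desktop, tesseract.js, better-sqlite3); these are illustrative stand-ins, not components Recall actually uses:

```typescript
// Illustrative capture -> OCR -> index loop; NOT Recall's actual code.
import screenshot from 'screenshot-desktop'; // stand-in capture module
import Tesseract from 'tesseract.js';        // stand-in OCR engine
import Database from 'better-sqlite3';       // stand-in local index

const db = new Database('snapshots.db');
db.exec('CREATE TABLE IF NOT EXISTS snapshots (ts INTEGER, text TEXT)');
const insert = db.prepare('INSERT INTO snapshots (ts, text) VALUES (?, ?)');

async function captureOnce(): Promise<void> {
  const png: Buffer = await screenshot({ format: 'png' }); // grab the screen
  const { data } = await Tesseract.recognize(png, 'eng');  // extract text
  insert.run(Date.now(), data.text);                       // index locally
}

// Fire at the three-second cadence described above.
setInterval(() => captureOnce().catch(console.error), 3000);
```

Everything lands in a plaintext local database, which is exactly the property that made early Recall builds such an attractive target.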
Signal’s Default Blockade and User Overrides
Effective immediately, Signal Desktop for Windows will leverage a little-known Windows capability originally intended to protect copyrighted video content. By enabling content protection on its windows (implemented on Windows via the SetWindowDisplayAffinity API with the WDA_EXCLUDEFROMCAPTURE flag), Signal instructs the operating system to treat its UI as DRM-protected, so any system or third-party component, including Recall, captures only a black rectangle in place of the app's contents.
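Signal has not published the exact code path it uses, but Signal Desktop is built on Electron, which exposes this protection as a single window-level setting. A minimal sketch, assuming a stock Electron main process (the window options and file name are illustrative):

```typescript
// Electron main process: opt a window out of screen capture.
import { app, BrowserWindow } from 'electron';

app.whenReady().then(() => {
  const win = new BrowserWindow({ width: 1024, height: 768 });

  // On Windows, Electron implements this via the Win32 call
  // SetWindowDisplayAffinity(hwnd, WDA_EXCLUDEFROMCAPTURE), the same
  // affinity used to black out DRM-protected video. Recall and other
  // capture tools then record a black rectangle instead of the UI.
  win.setContentProtection(true);

  win.loadFile('index.html');
});
```

Because the flag is enforced by the compositor rather than by the app, it blocks Recall, screen recorders, and ordinary screenshots alike, which is the usability trade-off Signal's announcement alludes to.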
“Although Microsoft made several adjustments over the past twelve months in response to critical feedback, the revamped version of Recall still places any content that’s displayed within privacy-preserving apps like Signal at risk,” Signal engineers wrote. “We are enabling an extra layer of protection by default on Windows 11 even though it introduces some usability trade-offs.”
Users who require screenshots for compliance or accessibility can manually disable this setting under Settings > Advanced > Screenshots, though Signal warns that doing so reinstates the Recall risk profile.
Recap of Recall’s Security and Privacy Issues
Despite being opt-in in its latest incarnation, Recall continues to:
- Capture confidential data—payment cards, passwords entered visually, and private chat windows—without per-app consent.
- Allow decryption of its encrypted store using only a Windows Hello PIN or fingerprint, offering limited resistance to sophisticated malware that can exploit OS-level access.
- Lack a developer API for apps to exclude their own UIs, forcing workarounds like Signal’s DRM hack.
As Ars Technica’s Andrew Cunningham and researcher Kevin Beaumont have independently documented, these gaps leave users vulnerable if Recall’s secure enclave (a virtualization-based security enclave backed by the TPM) is compromised or if a rogue process gains SYSTEM privileges.
Technical Architecture of Recall: A Deep Dive
Recall’s pipeline involves several components:
- Capture Module: Injects into the DWM (Desktop Window Manager) compositor to grab framebuffers at the configured cadence.
- OCR Engine: Based on the open-source Tesseract library but augmented with Azure Cognitive Services for handwriting and multilingual support.
- Encryption Layer: Uses AES-256-GCM with keys sealed in a secure enclave tied to the Windows Hello key store.
- Index Service: Built on top of the Windows Search indexer but extended to parse images and attachments.
While encryption at rest is a positive step, security experts warn that if an attacker can hijack the enclave or bypass Credential Guard, Recall’s data vault becomes accessible. Furthermore, the nominally opt-in feature can come back on after major Windows upgrades unless users explicitly opt out again.
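For a concrete picture of what "AES-256-GCM with sealed keys" means in practice, the sketch below uses Node's built-in crypto module. The randomly generated key is a stand-in for a key sealed in the enclave; stealing or invoking that key is precisely the failure mode the experts above describe:

```typescript
import { randomBytes, createCipheriv, createDecipheriv } from 'node:crypto';

// Stand-in for an enclave-sealed key. In the scheme described above,
// this key would never leave the secure enclave in plaintext.
const key = randomBytes(32); // 256-bit key

function encryptRecord(plaintext: Buffer) {
  const iv = randomBytes(12); // fresh 96-bit nonce per record
  const cipher = createCipheriv('aes-256-gcm', key, iv);
  const ciphertext = Buffer.concat([cipher.update(plaintext), cipher.final()]);
  return { iv, ciphertext, tag: cipher.getAuthTag() }; // tag detects tampering
}

function decryptRecord(iv: Buffer, ciphertext: Buffer, tag: Buffer): Buffer {
  const decipher = createDecipheriv('aes-256-gcm', key, iv);
  decipher.setAuthTag(tag);
  return Buffer.concat([decipher.update(ciphertext), decipher.final()]);
}
```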
Developer Ecosystem and the Hunt for APIs
Signal’s repurposing of the DRM API highlights a broader issue: Microsoft has yet to provide a first-class developer API for controlling Recall. In community forums, app makers have proposed:
- Per-window Opt-Out: A manifest attribute to mark windows or controls as “private.”
- Declarative Privacy Labels: Marking certain UIs—like password fields or video frames—as non-indexable.
- Callback Hooks: Allowing apps to intercept capture events and return a “no-capture” flag.
Until Microsoft adopts such controls, developers will have to keep relying on repurposed mechanisms like Signal’s DRM flag or on external sandboxing.
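To make those proposals concrete, here is one hypothetical shape such an API could take, written as a TypeScript declaration. Nothing below exists in Windows today; every name is invented for illustration:

```typescript
// HYPOTHETICAL sketch of the community proposals; no such API ships today.
interface RecallCaptureEvent {
  windowId: number;
  timestamp: number;
  preventCapture(): void; // callback hook: return a "no-capture" decision
}

interface RecallPrivacyApi {
  // Per-window opt-out: mark an entire window as private.
  setWindowPrivate(windowId: number, isPrivate: boolean): void;

  // Declarative privacy labels: mark UI regions (password fields,
  // video frames) as non-indexable.
  markNonIndexable(windowId: number, regionSelector: string): void;

  // Callback hooks: let apps intercept capture events before a snapshot.
  onBeforeCapture(handler: (event: RecallCaptureEvent) => void): void;
}
```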
Regulatory and Compliance Perspectives
With GDPR, CCPA, and other privacy regulations tightening their grip on data handling practices, enterprises may face compliance challenges if Recall captures Protected Health Information (PHI) or Personally Identifiable Information (PII) without auditing. Legal experts suggest that:
- Organizations should verify that all users consent to Recall’s indexing, documented via group policy or MDM (Mobile Device Management).
- Data retention policies for Recall stores must align with industry standards (e.g., HIPAA’s six-year requirement for health records).
- Incident response plans need to incorporate potential Recall data breaches as a vector.
Future Outlook and Community Reaction
Signal hopes that its public workaround will pressure Microsoft to expose proper APIs. In a statement, the company wrote:
“Apps like Signal shouldn’t have to implement ‘one weird trick’ to maintain the privacy and integrity of their services. We urge Microsoft’s AI and platform teams to provide granular developer controls so that privacy and accessibility go hand in hand.”
Industry observers expect further debate at Microsoft’s next Build conference, where privacy controls for AI features could become a marquee topic. Meanwhile, security researchers are exploring whether similar DRM repurposing could safeguard other sensitive applications.