Microsoft Defends Windows Recall After Researcher Demonstrates Data Extraction Bypass
Microsoft has rejected claims of a new security vulnerability in its AI-powered Windows Recall feature, following research showing that data can be extracted from the system under certain conditions.
Recall, introduced in Windows 11, enables users to search past activity on their devices by capturing periodic screenshots, extracting text, and indexing browsing history. The feature has faced scrutiny since its initial unveiling, particularly around privacy and data security concerns.
Research Highlights Potential Weakness
Security researcher Alexander Hagenah has released an updated proof-of-concept tool, TotalRecall Reloaded, demonstrating how data from Recall’s SQLite database can be accessed by a standard user account.
Importantly, Hagenah did not break Recall's core security protections. He described Microsoft's encryption (AES-256-GCM) and Virtualization-Based Security (VBS) enclaves as robust. Instead, the bypass targets a separate process, AIXHost.exe, which renders the Recall timeline for users.
According to the research, this process operates outside the protected enclave and lacks certain safeguards, such as code integrity enforcement and sandbox isolation. By injecting code into AIXHost.exe, an attacker with access to a logged-in user session could retrieve screenshots, text, and metadata in plaintext, capturing them at the point where the process handles the decrypted data.
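Conceptually, once captures leave the enclave and sit in plaintext inside a rendering process, mining them reduces to ordinary queries over an indexed store. The following minimal Python sketch illustrates that point with a hypothetical schema; the table and column names are illustrative assumptions, not Recall's actual database layout:

```python
import sqlite3

# Hypothetical schema: illustrative only, NOT Recall's actual layout.
# Models an indexed store of decrypted captures as a rendering
# process might hold them after the enclave hands over plaintext.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE captures (
        id INTEGER PRIMARY KEY,
        captured_at TEXT,       -- timestamp of the screenshot
        window_title TEXT,      -- foreground window at capture time
        extracted_text TEXT     -- OCR output indexed for search
    )
""")
conn.executemany(
    "INSERT INTO captures (captured_at, window_title, extracted_text) "
    "VALUES (?, ?, ?)",
    [
        ("2025-01-10T09:14:00", "Banking - Browser", "account number 1234"),
        ("2025-01-10T09:20:00", "Email - Browser", "meeting moved to 3pm"),
        ("2025-01-10T09:25:00", "Banking - Browser", "transfer confirmation"),
    ],
)

# Once the data is plaintext, harvesting sensitive records
# is a trivial keyword query over the index.
rows = conn.execute(
    "SELECT captured_at, extracted_text FROM captures "
    "WHERE window_title LIKE ? ORDER BY captured_at",
    ("%Banking%",),
).fetchall()
for captured_at, text in rows:
    print(captured_at, text)
```

The sketch shows why the encryption boundary matters: the demonstrated technique never touches keys or ciphertext, it simply reads records after another component has already decrypted them.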
The method does not require administrator privileges; it can be executed with standard user-level permissions within a session where the user has already authenticated via Windows Hello.
Microsoft: “Not a Vulnerability”
Microsoft has disputed the classification of the issue as a security flaw. After reviewing Hagenah’s disclosure through the Microsoft Security Response Center (MSRC), the company concluded that the behavior aligns with Recall’s intended design.
In its response, Microsoft stated that processes outside VBS enclaves are not granted direct access to encrypted data or keys, and that the architecture is designed to mitigate risks from unauthorized access. The company emphasized that the demonstrated technique requires local access as an authenticated user, which falls within expected system behavior.
Ongoing Concerns Around Data Exposure
Beyond the primary bypass, Hagenah identified further concerns, including the ability to delete Recall data without authentication and to access certain metadata without Windows Hello verification. These weaknesses carry potential implications for forensic analysis and data integrity.
Recall was previously delayed in 2024 after earlier findings exposed weaknesses in how screenshot data was stored. Microsoft subsequently rebuilt the feature with stronger protections, including hardware-backed security, biometric authentication, and process isolation mechanisms.
Balancing Utility and Security
The latest findings highlight the ongoing tension between usability and security in AI-driven operating system features. While Microsoft maintains that Recall’s design remains secure within its intended parameters, the research underscores the complexity of safeguarding sensitive, continuously captured user data.
As AI-powered system features become more deeply integrated into everyday computing, scrutiny around how data is accessed, processed, and protected is likely to intensify.