Windows Recall is one of the most controversial features Microsoft has built into Windows 11. It uses on-device AI to create a searchable timeline of everything you’ve seen or done on your computer, which — according to the company — should help you rediscover forgotten data and boost productivity.
Microsoft first introduced Recall in June 2024 as a preview that opted users in by default and couldn’t be removed. Worse, it stored screenshots and the text extracted from them in an unencrypted plaintext database (including passwords and other sensitive information), leaving them exposed to potential attacks.
After public backlash and security warnings, Microsoft pulled the feature, then released an updated version in April 2025 for Windows Insider testers. The new build added encryption, biometric authentication, and clearer privacy controls, and it’s now being gradually rolled out as an optional Windows 11 system update.
But some privacy and security concerns still remain. Here’s how this AI-powered feature works, what risks it introduces at home or at the workplace, and how to disable Windows Recall if you don’t want your computer to remember everything you do.
- How to disable Microsoft Recall on Windows 11
- What is Windows Recall?
- How to check if you have Microsoft Recall
- How does Windows Recall work?
- What are the privacy risks of using Windows Recall?
- Take ownership of your data
How to disable Microsoft Recall on Windows 11
If you’d rather not have this Microsoft AI tool quietly recording your screen, you can disable Microsoft Recall or remove it from your Windows 11 PC:
Pause or stop Recall snapshots (for individuals)
Go to Settings → Privacy & security → Recall & snapshots and toggle off Save snapshots.
Recall won’t take new screenshots anymore, but any existing ones remain stored locally. To remove them, select Delete snapshots.
Remove Recall as a Windows feature (for individuals)
- Open Windows Search and type Windows features.
- Select Turn Windows features on or off.
- Find and uncheck Recall.
Your snapshots will be automatically deleted from your computer.
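The same removal can be done from an elevated PowerShell or Command Prompt session with DISM. The snippet below assumes the optional feature is named Recall, as it appears in current Windows 11 builds; the first command lets you confirm that before disabling anything:

```powershell
# List the Recall optional feature and its current state (run elevated)
Dism /Online /Get-FeatureInfo /FeatureName:Recall

# Disable and remove the Recall feature; stored snapshots are deleted with it
Dism /Online /Disable-Feature /FeatureName:Recall
```

A restart may be required for the change to take full effect.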
For organizations
In managed environments, IT administrators can disable Recall system-wide:
- Open the Local Group Policy Editor (gpedit.msc).
- Go to Computer Configuration → Administrative Templates → Windows Components → Windows AI.
- Double-click Allow Recall to be enabled and select Disabled.
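For scripted deployments, the same policy can be applied through the registry. The value name below reflects Microsoft’s AllowRecallEnablement policy CSP as documented around the 2025 rollout; verify it against your organization’s current ADMX templates before deploying:

```reg
Windows Registry Editor Version 5.00

; Disables the "Allow Recall to be enabled" policy device-wide.
; Value name per Microsoft's AllowRecallEnablement policy CSP --
; confirm against your current ADMX templates before deploying.
[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\WindowsAI]
"AllowRecallEnablement"=dword:00000000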
What is Windows Recall?
Windows Recall is an AI-powered tool you can use to search for things you’ve seen or done on your computer — such as a file you opened, a website you visited, or an app you used. The idea is to create a photographic memory for your workflow, which you can search by what you remember, like “tax form,” “email from my bank,” or “last conversation with my kid.”
The tool periodically takes snapshots of your entire screen, capturing new content each time you switch windows or open a new app. Those screenshots are stored locally on your device, creating a searchable timeline of your activity. You can scroll through it or use natural language to find something, similar to chatting with an AI that remembers your conversations.
How to check if you have Microsoft Recall
To check if Recall is available on your computer, open Settings → Privacy & security and look for the Recall & snapshots option.
You can also verify Microsoft’s official Recall requirements. If your computer meets them but Recall isn’t showing up, check Settings → Windows Update → Optional updates, as Microsoft is gradually rolling it out as an optional Windows 11 update. If you prefer not to use it, you may want to skip updates that mention Recall in their description.
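You can also query the feature’s state from an elevated PowerShell session using the built-in Get-WindowsOptionalFeature cmdlet (this assumes, as in current builds, that the optional feature is named Recall):

```powershell
# Run in an elevated PowerShell session on Windows 11.
# State will read Enabled or Disabled; the command errors out
# if your build doesn't ship the feature at all.
Get-WindowsOptionalFeature -Online -FeatureName Recall
```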
Windows Recall only runs on Copilot+ PCs, Microsoft’s latest generation of high-end laptops equipped with neural processing units (NPUs) that handle complex AI tasks. The catch? Hardly anyone owns one. Copilot+ PCs made up less than 2% of all Windows laptops sold in early 2025 — a sign that Microsoft’s push for AI computers hasn’t caught on.
How does Windows Recall work?
Recall relies on small, locally run AI models built into Windows — not on Copilot, ChatGPT, or Microsoft’s cloud services. It combines OCR (which converts text from your screenshots into searchable words) with Semantic Indexing (the same AI feature that enhances the Windows 11 Search) and on-disk vector databases (to organize information by meaning rather than just keywords).
In simpler terms, your computer builds its own private memory of what you’ve seen, so you can later ask it questions in natural language — the same way you’d ask Google Photos to show everything you captured on a trip.
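The pipeline described above can be sketched in a few lines of Python. This is a toy illustration, not Microsoft’s implementation: plain strings stand in for OCR output, bag-of-words counts stand in for neural embeddings, and a list stands in for an on-disk vector database, but the search-by-meaning flow has the same shape.

```python
import math
from collections import Counter

def embed(text):
    """Toy embedding: bag-of-words term counts (real systems use neural embeddings)."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse bag-of-words vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# "Snapshots": text that OCR might have extracted from periodic screenshots
snapshots = [
    "tax form 1040 draft in PDF viewer",
    "email from my bank about statement",
    "chat conversation about weekend plans",
]

# Index: (text, vector) pairs -- a stand-in for an on-disk vector database
index = [(s, embed(s)) for s in snapshots]

def search(query):
    """Return snapshots ranked by similarity to a natural-language query."""
    q = embed(query)
    return sorted(index, key=lambda item: cosine(q, item[1]), reverse=True)

best, _ = search("tax form")[0]
print(best)  # the snapshot mentioning the tax form ranks first
```

The point of the vector step is that ranking happens by overlap in meaning rather than exact keyword match, which is what lets you search a timeline by what you vaguely remember.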
What are the privacy risks of using Windows Recall?
For Recall to work on your Copilot+ PC, Microsoft requires you to enable BitLocker (used to encrypt your snapshot database) and Windows Hello Enhanced Sign-in Security (ESS), which asks you to authenticate with at least one biometric (face or fingerprint) before launching Recall or accessing your screenshots.
Simply put, BitLocker protects your data while it’s stored, and ESS ensures only authorized users can unlock it. That sounds like a solid foundation for security, but it doesn’t erase the bigger privacy questions around what it records in the first place.
Here’s where Windows Recall still raises concerns:
Sensitive data and data you don’t own
Microsoft says Recall automatically filters sensitive information — like credit card numbers, government IDs, and passwords — so they don’t appear in snapshots. The filter is on by default (though it can be turned off), and Microsoft published a list of data types it’s designed to exclude. But that list is narrow, and independent tests have shown that it doesn’t catch everything.
Anything visible on your screen can be captured, including details that Microsoft doesn’t classify as “sensitive” but that can still reveal personal information, such as your child’s name, their school, your home address, or your date of birth.
You can manually exclude certain apps and websites from Recall, but that takes constant oversight — something most people don’t have the time or energy for. And even then, you can’t control what others share with you in private conversations — Signal, Brave, and AdGuard have taken steps to shield user data from Recall.
That’s where the problem deepens. Recall can also capture other people’s data, even though they never consented to being recorded. For example, a WhatsApp chat with a friend or a client’s NDA could end up indexed inside your Recall database.
Microsoft’s troubled relationship with user trust
The company says Recall doesn’t share your screenshots and associated data — including the content inside them and your search queries — with Microsoft or its partners. Microsoft’s broader privacy statement still permits the company to use data to improve and develop products (including to train AI) and to share that data with partners for targeted ads. While this doesn’t strictly apply to Recall, policies change often, and Microsoft’s history doesn’t inspire much confidence. Here are just a few examples:
- Outlook (Microsoft’s email service) has become a data-collection platform that shares your information with hundreds of third parties, including advertisers and analytics firms.
- Microsoft-owned LinkedIn uses your public data to train AI models by default. You can opt out, but your information can still be processed indirectly when other LinkedIn users share or upload your data.
- Austria’s data protection authority found that Microsoft illegally tracked students through Microsoft 365 Education, violating EU privacy regulations for children.
- Gaming Copilot (Microsoft’s new AI assistant for Windows gaming) takes screenshots to understand what’s happening in a game and to offer tips, similar to how Recall indexes your desktop. The company says it doesn’t use Gaming Copilot snapshots to train AI models, but other data — such as your voice commands and in-game interactions — is fair game. It’s also unclear if those screenshots are processed locally or sent to an external server. You can turn off Gaming Copilot, but you can’t uninstall it.
US jurisdiction and warrantless access
Microsoft is based in the US, which means it must respect laws that allow government agencies to demand or intercept your data — often without a warrant — even if you live outside the US. The company itself has acknowledged this.
Even if Recall data remains on your device, Microsoft’s infrastructure — Windows accounts, telemetry, authentication systems — still gives the company a technical and legal foothold in your computer. If the US government wanted to access your Recall data, it could legally force Microsoft to cooperate.
Unclear regulatory status
Currently, there’s no public statement confirming that Microsoft’s Recall feature complies with major privacy regulations such as the EU’s GDPR — a notable omission given how strict those rules are about collecting, processing, and storing personal data. Without that clarity, individuals and organizations using Windows Recall could expose themselves to legal liability.
For instance, once Recall is enabled at work, anything that appears on an employee’s computer display is automatically captured and indexed — including internal emails, financial dashboards, HR portals, and confidential documents. For regulated industries such as law, finance, or healthcare, this can conflict with rules around confidentiality, data minimization principles, and how incidents are reported.
In May 2024, the UK’s Information Commissioner’s Office (ICO) said it was “making enquiries with Microsoft” to assess the privacy safeguards built into Recall.
Consent gets complicated at work
Within organizations, IT administrators can enable Recall at the system level, but only end users can activate it on their computers, as the feature remains off by default and requires explicit consent on each device. But in practice, that consent may not be truly voluntary: In many workplaces, the employer holds a dominant position, and employees can feel pressured to enable features pushed by their company.
If the company you work for launches a corporate investigation (for example, to trace an insider threat or data leak), IT or legal teams may need to inspect Recall data from multiple devices. And if you’ve ever used your work computer for something personal — like chatting with your family, checking your private email, or reading a medical report — parts of your private life could suddenly become visible to people at work.
Take ownership of your data
Windows Recall might be framed as a tool for convenience, but it raises deeper questions about who controls your digital memory: you or the company behind your operating system. Microsoft’s intentions might be benign right now, but its history and multibillion-dollar investment in OpenAI (the creator of ChatGPT) show how quickly “helpful” features can turn into surveillance and monetization tools.
We take the opposite approach at Proton, protecting your data with end-to-end encryption so even we can’t access it. We never store your information in a way that can be read, use it to train AI models, or share it with anyone.
Our apps are open source and independently audited, built under the protection of strict Swiss privacy laws. And as part of the growing Eurostack movement, we’re committed to building technology that keeps Europe’s data in Europe: private, secure, and beyond the reach of Big Tech and foreign surveillance.
If privacy matters to you, think twice before opening the door to Windows Recall and choose tools that are designed to forget — not remember everything you do.