With Windows 10 support officially ending in October 2025, many users are upgrading to Windows 11 to stay protected with security updates. Some may be surprised, however, to find a new AI companion waiting for them: Copilot.

Positioned as Microsoft’s most ambitious push into generative AI, Copilot is now deeply integrated across Windows 11, Edge, Bing, and the Microsoft 365 suite. It’s the centerpiece of Microsoft’s multibillion-dollar partnership with OpenAI (the creator of ChatGPT and ChatGPT Atlas), whose GPT models power Copilot’s intelligence. Combined with the launch of Copilot+ PCs — equipped with a dedicated Copilot key — this all shows just how serious Microsoft is about steering Windows toward an AI-powered future.

However, not everyone welcomes this always-present helper, reminiscent of Clippy but far more persistent. Copilot’s tight integration raises privacy and security concerns, much like when Microsoft-owned LinkedIn started training AI on public profiles and activity data, or when Google planned to deeply embed its AI assistant Gemini into Android to replace Google Assistant.

Only Microsoft 365 enterprise users and IT administrators can fully remove Copilot from Windows 11. In all other cases — such as users with personal or family subscriptions, or no Microsoft 365 subscription at all — there are only ways to limit its features and tone down its visibility.

How to turn off Copilot AI

Depending on how you use Copilot, here’s how you can remove it from your computer, adjust its settings, or dial back its presence:

How to remove Microsoft 365 Copilot (enterprise subscription)

If you’re an enterprise user, go to Settings → Apps → Installed apps, select Copilot, and click Uninstall.

IT administrators can remove the Copilot app by running the following commands in PowerShell:

# Look up the full package name of the installed Copilot app
$packageFullName = Get-AppxPackage -Name "Microsoft.Copilot" | Select-Object -ExpandProperty PackageFullName
# Uninstall the package for the current user
Remove-AppxPackage -Package $packageFullName
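
If you’re cleaning up a shared or managed machine, a broader sketch is below. It assumes the package is still named "Microsoft.Copilot" on your build; the -AllUsers and deprovisioning steps require an elevated PowerShell session:

# Remove the Copilot app for every existing user profile
Get-AppxPackage -AllUsers -Name "Microsoft.Copilot" | Remove-AppxPackage -AllUsers

# Deprovision it so newly created user accounts don't get it preinstalled
Get-AppxProvisionedPackage -Online |
    Where-Object { $_.DisplayName -like "*Copilot*" } |
    Remove-AppxProvisionedPackage -Online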

How to disable Microsoft 365 Copilot (personal or family subscription)

  1. Open any Microsoft 365 app (Word, Excel, PowerPoint, or OneNote).
  2. Go to File → Options → Copilot.
  3. Clear the Enable Copilot checkbox.

Repeat these steps for each Microsoft app you want to disable Copilot in.

If the Enable Copilot setting doesn’t appear in your Microsoft 365 apps:

  1. Open Word, Excel, PowerPoint, or OneNote.
  2. Go to File → Account → Account Privacy → Manage Settings → Connected experiences.
  3. Clear the Turn on experiences that analyze your content checkbox.
  4. Restart the app.

You only need to do this once, as the new privacy setting will automatically apply across all Microsoft 365 apps linked to your account.
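
If you’d rather enforce this with a script than click through the menus, the connected experiences setting can also be set as a policy registry value. A minimal PowerShell sketch, assuming the usercontentdisabled value (2 = disabled) that Microsoft documents for Microsoft 365 Apps privacy controls; verify it against your Office version before relying on it:

# Path of the Office privacy policy key (assumes Office 16.0, i.e., Microsoft 365 Apps)
$key = "HKCU:\Software\Policies\Microsoft\Office\16.0\Common\Privacy"
# Create the policy key if it doesn't exist yet
New-Item -Path $key -Force | Out-Null
# 2 = disable connected experiences that analyze your content (assumed policy value)
Set-ItemProperty -Path $key -Name "usercontentdisabled" -Value 2 -Type DWord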

How to hide Copilot icons in Windows 11

You can turn off Copilot icons in Windows, but you can’t completely remove the feature from your computer. These steps only disable Copilot’s visual and personalization elements, so you’ll still be able to access the app.
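
On Windows Pro, Enterprise, and Education editions, there’s also a “Turn off Windows Copilot” group policy, though it was written for the earlier Copilot preview and may have no effect on the current Copilot app. A sketch of the equivalent per-user registry value, under that assumption:

# Path of the legacy Windows Copilot policy key
$key = "HKCU:\Software\Policies\Microsoft\Windows\WindowsCopilot"
# Create the policy key if it doesn't exist yet
New-Item -Path $key -Force | Out-Null
# 1 = turn off Windows Copilot under the legacy policy; newer builds may ignore this value
Set-ItemProperty -Path $key -Name "TurnOffWindowsCopilot" -Value 1 -Type DWord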

To hide the Copilot button from your taskbar:

  1. Go to Settings → Personalization → Taskbar.
  2. Find and toggle off Copilot.

If you use apps that have Copilot built in, you’ll need to disable it separately in each app. For example, in Notepad, go to Settings → AI features and toggle off Copilot.

Restart your computer for the changes to take effect.
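
The taskbar toggle in step 2 maps to a per-user registry value, so you can script it as well. A minimal sketch, assuming the ShowCopilotButton value behaves the same on your Windows 11 build:

# 0 hides the Copilot button on the taskbar, 1 shows it
Set-ItemProperty -Path "HKCU:\Software\Microsoft\Windows\CurrentVersion\Explorer\Advanced" -Name "ShowCopilotButton" -Value 0 -Type DWord
# Restart Explorer so the taskbar picks up the change
Stop-Process -Name explorer -Force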

How to disable data and personalization features in the Copilot web app

  1. Select your profile name or photo → your account → Privacy.
  2. Toggle off the following options:
    • Model training on text
    • Model training on voice
    • Personalization and memory
  3. Select Delete memory to immediately remove what Copilot remembers about you.
  4. Click Export or delete history. This will open a browser page where you’ll need to sign in with the Microsoft account you use for Copilot.
  5. Select Delete all activity history for each of these categories:
    • Copilot app activity history
    • Copilot in Microsoft 365 apps
    • Copilot in Windows apps
  6. Select your profile name or photo → Connectors.
  7. Toggle off everything, including OneDrive, Outlook, Gmail, Google Drive, and Google Calendar.

Note that if your data has already been used for model training, there’s no way to undo it.

How to disable Gaming Copilot model training

  1. On Windows, press Win + G to open the Game Bar.
  2. Go to Settings → Privacy settings → Gaming Copilot.
  3. Toggle off Model training on text.
  4. To hide the Gaming Copilot widget, remove it from the Game Bar widgets list.

Currently, there’s no way to fully remove Gaming Copilot from the Xbox Game Bar. Plus, you cannot prevent it from using your voice interactions to help train its AI models. If you care about privacy, it’s safest not to interact with Gaming Copilot at all.

How to delete your Copilot account

In the Copilot web app, select your profile name or photo → your account → Delete account.

Deleting your Copilot account doesn’t delete your Microsoft account, and you can still use Copilot without signing in.

If you’d rather not rely on Copilot at all, consider switching to a private AI assistant that never logs your data, trains on it, or shares it with third parties.

What is Microsoft Copilot?

Copilot is Microsoft’s line of AI assistants integrated across its products and services, such as Word, Excel, PowerPoint, Outlook, Teams, and Windows. It uses large language models (LLMs) based on Microsoft’s own AI technologies and OpenAI’s GPTs — like GPT-4o, the same model that powers ChatGPT.

Like other LLM-based assistants, Copilot can help you be more productive, creative, and efficient by understanding natural language prompts, analyzing data, generating and summarizing content, and providing suggestions based on context.

Microsoft provides Copilot in several versions, depending on whether you’re using it personally or professionally, including:

  • Microsoft Copilot — a free personal AI assistant for everyday tasks. It’s available as a Windows 11 app (installed by default), a web app, and a macOS app.
  • Gaming Copilot — a free personal AI assistant for Windows gaming, which takes screenshots to understand what is happening in a game and to offer tips, similar to how Recall takes snapshots of your desktop activity.
  • Microsoft 365 Copilot — integrates into Word, Excel, PowerPoint, Outlook, and Teams to use your data via Microsoft Graph, a platform that connects your data like emails, calendar, and documents.

What are the privacy risks of Microsoft Copilot?

Microsoft maintains strong privacy protections for Copilot in enterprise settings such as Microsoft 365, but consumer accounts face a very different reality:

Your data may be used for ads and profiling

Copilot services in consumer Microsoft accounts are connected to Microsoft’s advertising ecosystem. This means that your interactions can influence the ads and recommendations you see. For example, if you ask Copilot about travel deals, you might later see flight or hotel ads.

Although you can turn off personalized ads in your Microsoft account privacy settings, the company still retains that interaction data and may use it in anonymized form for other purposes, such as improving ad targeting for other Copilot users with profiles similar to yours.

You keep ownership, but Microsoft can still use your data

Although Microsoft says you retain ownership of your content, using Copilot means you grant the company broad rights over it, including to “copy, distribute, transmit, publicly display, publicly perform, edit, translate and reformat” your data, and to pass those rights to third parties.

For example, if you’re a writer who adds excerpts from an unpublished story into Copilot to ask for proofreading or stylistic feedback, that text technically becomes part of the material Microsoft is licensed to process and use under its service terms. Your unreleased intellectual property may still be used beyond your direct control, even if it never becomes public.

Your data may be processed and used for AI training

Data from your Copilot prompts and responses may be collected, logged, and analyzed by Microsoft for AI training. There’s no guarantee that your sensitive or personal data will be excluded from training datasets.

As with most AI assistants, Copilot needs vast amounts of information to keep learning and refining its responses — even when that process can drift into intellectual property gray zones. A current example is a series of US lawsuits filed by authors and major news outlets who allege that Microsoft and its partner OpenAI used their copyrighted works without permission to train AI models such as Copilot and ChatGPT. OpenAI itself has publicly acknowledged that it is “impossible to train today’s leading AI systems without using copyrighted materials.”

People may see your sensitive data

Microsoft employees may review your Copilot inputs and outputs — including anything you type or upload as file attachments — to improve the service, moderate harmful content, or when you choose to submit feedback. If the thought of it makes you uncomfortable, you should avoid entering any sensitive information, such as personally identifiable data or trade secrets.

Connectors expose your data to third parties

Connectors allow Copilot to link up with Microsoft products (like OneDrive or Outlook) and third-party services (like Google Drive), so it can pull in information from multiple sources. For example, if you ask Copilot to summarize all meeting notes stored in Google Drive, it will search your Drive files and return a summary.

But when you enable a connector, you’re also allowing Copilot to exchange information with that other service. In Google Drive’s case, while Copilot gains access to your Drive’s file structure, names, timestamps, and contents, Google also receives certain details associated with your Microsoft account, such as your profile photo, name, or email address. Your valuable data now lives in both ecosystems (Microsoft and Google), creating a clearer, more complete profile of you across platforms.

Your data may cross borders under US jurisdiction

Depending on where you live, Microsoft may store your Copilot data and interactions in your region. For example, Europeans may have their data stored within the EU. But your data may not always stay there. Despite the company’s EU Data Boundary commitments, some processing may still occur outside your region due to technical or capacity reasons. And because Microsoft is a US company, it is subject to US laws such as the CLOUD Act, which means US authorities could request access to your data (no matter where it’s stored), sometimes without requiring a warrant.

What are the security risks of Microsoft Copilot?

Since Copilot can access an average of three million sensitive data records for each organization, it’s important to understand the security risks it may introduce, especially if you plan to use it for your business:

  • In June 2025, researchers discovered EchoLeak (CVE-2025-32711), a zero-click vulnerability in Microsoft 365 Copilot that let attackers steal data without any user action. By hiding malicious instructions and links inside a normal email, attackers could trick Copilot into following those commands, accessing those links, and sending parts of the user’s data to an external server. The vulnerability was patched, but it’s an example of how integrating AI assistants like Copilot into core systems — and giving them broad access to emails, documents, and internal data — can turn them into insider threats if they’re manipulated or misconfigured.
  • In August 2024, a cybersecurity firm discovered a critical information-disclosure flaw in Microsoft Copilot Studio, which allowed custom copilots to exploit weak protections and obtain sensitive information (like service tokens and database keys) to move further inside an internal system. Microsoft patched the issue.
  • In October 2025, security researchers uncovered a phishing technique called CoPhish that abuses Microsoft Copilot Studio to make malicious login pages look completely legitimate. Attackers can create a Copilot Studio agent whose Login button redirects victims to a fake OAuth consent page, then share the agent link (hosted on a trusted Microsoft domain) to lure users into granting access. Once consent is given, the attacker can obtain tokens that let them read, write, or send data, such as emails, chats, calendars, and notes. Microsoft is still working on this issue.

A private alternative to Copilot and Microsoft

If you’re uneasy about how deeply Copilot is integrated into Windows and how much data Microsoft collects through its AI services — but still want the convenience of an AI chatbot — switch to Lumo, our private AI assistant. Based in Europe, Lumo never harvests or trains on your data, never shows you ads, keeps no logs, and never shares your information with anyone.

And if you’re ready to step outside the Microsoft ecosystem entirely, you can build your own private digital workspace with our encrypted ecosystem.

Every Proton service is built to guard your privacy by default, using end-to-end encryption, so no one but you can access your data — not even us.