The challenge with sensitive data and access to Copilot
Microsoft 365 Copilot empowers users with AI-driven assistance across Microsoft 365 apps, but it also raises concerns about accidental oversharing of sensitive information. In response, Microsoft has extended its Purview Data Loss Prevention (DLP) capabilities to Microsoft 365 Copilot, allowing organizations to enforce information protection policies within AI workflows. DLP for Microsoft 365 Copilot has been in preview for some time, and Microsoft has now announced its general availability (GA), bringing new capabilities such as alerting and policy simulation along with it.
Key details:
- Rollout Timeline: The rollout began in June 2025 and should be completed worldwide by late July 2025.
- Scope: Initially, DLP for Copilot was available only for Copilot Chat scenarios. With GA, coverage is expanding to Copilot in the core Office apps (Word, Excel, and PowerPoint) as well, ensuring that DLP protections apply consistently whether users are chatting with Copilot in Teams or using Copilot within Office documents.
- New Capabilities: DLP alerting and policy simulation specifically for Copilot-related policies are being introduced as part of GA. With the alerting feature, admins can configure policies so that if a Copilot request triggers a DLP rule, an alert is generated in the compliance portal. Policy simulation allows for "trial runs" that surface potential rule matches before a policy is enforced across the tenant.
- Existing Preview Policies: If an organization already had DLP policies for Copilot during the preview, those will carry forward into GA (see the sketch after this list for one way to verify which policies carried over).
- License requirements:
- Microsoft 365 Copilot license
- Microsoft 365 E5/A5/F5 license, or an E3/A3/F3 license with the Compliance Add-On
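If you want to verify which Copilot DLP policies carried over from the preview, Security & Compliance PowerShell offers a quick way to list them. This is a minimal sketch: the admin account is a placeholder, and the cmdlets shown (Connect-IPPSSession, Get-DlpCompliancePolicy) come from the ExchangeOnlineManagement module.

```powershell
# Minimal sketch: list existing DLP policies to see what carried over from preview.
# Requires the ExchangeOnlineManagement module: Install-Module ExchangeOnlineManagement
Import-Module ExchangeOnlineManagement

# Connect to Security & Compliance PowerShell (the UPN below is a placeholder).
Connect-IPPSSession -UserPrincipalName admin@contoso.onmicrosoft.com

# Show each policy's name and mode (e.g. Enable, TestWithNotifications).
Get-DlpCompliancePolicy | Select-Object Name, Mode
```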
Why is this important?
Implementing DLP-based restrictions on Copilot can significantly strengthen your data protection and compliance posture as you prepare for a broader Copilot rollout within the company:
- Preventing Accidental Data Exposure: The primary benefit is a significant reduction in the risk of accidental data leaks via AI. Copilot will not output content from files or emails that are labeled as sensitive per organizational policy. This means even if a user has access to a confidential document, they cannot unknowingly have Copilot disseminate that information in a broader context (like copying it into an email or a chat) without tripping a policy.
- AI Compliance by Design: With DLP enforcement integrated, Copilot interactions now fall under the same compliance umbrella as other data activities. The DLP policies inspect "enterprise grounding data" (data stored within your tenant) for sensitivity labels and restrict access within Copilot, changing how the AI processes and accesses that data.
- Administrator Oversight and Alerts: Another positive implication is improved oversight. With alerting capabilities, attempts to use Copilot in ways that conflict with DLP policies can generate real-time alerts to IT or security administrators. For instance, if a user repeatedly tries to have Copilot summarize a confidential file, each attempt can trigger an alert event. This gives security teams visibility into potential risky behavior or training gaps.
- Simulation for Policy Optimization: The addition of simulation mode for Copilot DLP policies allows compliance teams to assess policy impact before full enforcement. Admins can run a simulation to see, for example, how often a policy would have blocked Copilot in the last week and what content would have been involved (see the sketch below).
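For readers who script their compliance configuration, the closest PowerShell analogue to this workflow is running a policy in one of the documented test modes before enabling it. Whether the portal's new Copilot policy simulation maps exactly onto these modes is my assumption, and the policy name below is hypothetical.

```powershell
# Run a policy in test mode: matches are recorded for review but not enforced.
# "Copilot DLP Policy" is a hypothetical name for illustration.
Set-DlpCompliancePolicy -Identity "Copilot DLP Policy" -Mode TestWithNotifications

# Once the simulated matches look right, switch the policy to enforcement.
Set-DlpCompliancePolicy -Identity "Copilot DLP Policy" -Mode Enable
```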
How to configure Copilot restrictions based on Information Protection
- Microsoft Purview Compliance Portal: The Purview compliance portal is where everything is configured. Navigate to https://purview.microsoft.com as a "Compliance Administrator" to get started. If "Data Loss Prevention" is not listed on the start page, you can find it under the "Solutions" menu on the far left. Select "Data Loss Prevention" to find your existing policies or to create a new one. (Prerequisite: you need sensitivity labels already deployed in Information Protection.)
- Select custom policy: Make sure to select the "Custom" category and the "Custom policy" template in the policy builder to get the desired options.
- “Microsoft 365 Copilot” location: In the policy creation wizard, Microsoft 365 Copilot is now available as a policy location. When this is selected, all other locations (Exchange, SharePoint, etc.) are automatically disabled for that policy. This ensures the policy specifically governs Copilot’s use of data.
- Define conditions: Make sure you select the correct sensitivity label as a condition in the policy conditions, using “Content contains - Sensitivity labels” as the filter. It is possible to specify multiple labels. For example, an organization might choose labels like “Highly Confidential” or “Personal” as conditions. Any content (documents or emails) carrying those labels will be targeted by the rule.
- Restrict Copilot: For policies targeting Copilot, the key action is “Prevent Copilot from processing content”. This action means that if Copilot encounters content with the specified sensitivity labels, it will not use the content in generating an answer. The protected content’s details will be excluded from Copilot’s response, although the item might still be referenced as a citation if the user has access. In practice, Copilot will either omit the sensitive material from its output or display an apology/error message indicating it cannot use that content.
- Configure notifications (optional): You may enable the policy to generate alerts for Copilot DLP events in the Purview portal. For example, an admin might enable an alert so that whenever Copilot refuses to summarize a document due to a DLP policy, the event is logged and flagged for review. (A PowerShell sketch of the full setup follows below.)
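For completeness, here is a rough PowerShell sketch of the same setup. New-DlpCompliancePolicy and New-DlpComplianceRule are documented Security & Compliance cmdlets, but the -M365CopilotLocation parameter reflects the preview-era documentation and the label name is just an example, so verify both against the current Microsoft Learn documentation before relying on this.

```powershell
# Sketch only: create a DLP policy scoped to the Microsoft 365 Copilot location.
# NOTE: -M365CopilotLocation follows the preview-era docs and may have changed
# at GA; confirm the parameter name on Microsoft Learn before using it.
New-DlpCompliancePolicy -Name "Copilot - Block Highly Confidential" `
    -M365CopilotLocation All `
    -Mode TestWithNotifications    # start in test mode; switch to Enable later

# Add a rule matching content that carries a given sensitivity label.
# "Highly Confidential" is the example label used earlier in this article;
# some tenants may require the label GUID instead of the display name.
New-DlpComplianceRule -Name "Block labeled content in Copilot" `
    -Policy "Copilot - Block Highly Confidential" `
    -ContentContainsSensitiveInformation @{
        operator = "And"
        groups   = @(
            @{
                operator = "Or"
                name     = "Labeled content"
                labels   = @(
                    @{ name = "Highly Confidential"; type = "Sensitivity" }
                )
            }
        )
    }

# The "Prevent Copilot from processing content" action itself is selected in
# the Purview portal wizard; this sketch covers only scoping and the condition.
```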
Final Thoughts
Governance of sensitive information is a topic I always emphasize when working with customers implementing Microsoft 365 Copilot. As powerful as AI can be in boosting productivity, it must be deployed responsibly, especially in environments where data privacy and compliance are critical. With the general availability of DLP for Microsoft 365 Copilot, organizations now have the tools to ensure that AI-driven workflows respect the same information protection boundaries as any other data interaction. It’s a crucial step toward building trust in AI and ensuring its safe, scalable adoption across the enterprise.
I hope you found this information helpful. Please feel free to subscribe or follow me on LinkedIn for future updates.