“Bring your AI to work” is here: Microsoft edition - What Multiple Account Access to Copilot means

Multiple Account Access to Copilot

On October 1, 2025, Microsoft published a blog post explaining how employees can now use Copilot from their personal Microsoft 365 plans to work on organizational data. This is, of course, an extension of the existing multi-account feature that was released for corporate accounts a couple of months earlier.

In other words, “bring your own Copilot” is now a real thing in Word, Excel, PowerPoint, Outlook, and OneNote on desktop and mobile, with enterprise protections intact.

“Bring your AI to work” is an important topic, and banning AI altogether might not be the answer.

Whether sanctioned or shadow, AI has already entered everyday knowledge work. Microsoft’s new multi‑account access offers a safer path where employees can use Copilot from their personal Microsoft 365 subscriptions on work files, while the file’s access, auditing, and compliance still flow through the work identity and tenant. That’s better than users copy‑pasting sensitive content into unknown third‑party tools with opaque data handling.

Personally, I think this is a better option than denying employees the use of AI altogether. The question isn’t “if” employees will use AI; it’s “where and how”. At least with multi‑account access, enterprise data protection remains governed by the identity used to open the file, not by the account that funds Copilot. You keep your controls; users get productivity. That’s a pragmatic balance, but organizations still need to take control of their sensitive data.

What the feature actually does (and does not) access

Let's be clear: Copilot can act only on what the user’s active (work/school) identity can open, and Copilot’s “grounding” respects that identity’s settings. If you’ve disabled web grounding for that identity, it stays disabled, even if the Copilot entitlement comes from a personal subscription. Enterprise data protection is always tied to the identity that accesses the file.

Microsoft also documents specific capability limits that apply when people use Copilot on work/school content via a personal plan; the full table of limitations is in Microsoft’s official documentation.

Admin control: Cloud Policy to allow or restrict “Multiple account access to Copilot”

If your risk posture calls for restrictions, Microsoft provides a Cloud Policy setting named “Multiple account access to Copilot for work documents.” You can scope it to all users or specific groups. These policies roam with the user across Windows, macOS, iOS, and Android.

My input to security & compliance teams

  • This is not a “back door” to your tenant: Copilot does not expand what a user can open; it works only on the open file under the work/school identity and honors that identity’s policies (e.g., web grounding).
  • Controlled governance beats prohibition: providing a sanctioned path reduces the urge to paste sensitive content into unsanctioned tools, which are harder to monitor or control.
  • Auditing & oversight remain in your domain: actions occur within the Microsoft 365 cloud and under the enterprise controls associated with the work identity.

What organizations should do now (your short checklist)

  • Most "Cloud Policies" are "Not Configured" by default. Decide your policy stance, and configure it. Don’t leave it implicit.
  • Harden your identity‑based controls: make sure you control who can and cannot use web grounding, if this is important to your organization.
  • Double‑check your Sensitivity labels and DLP. Multi‑account access doesn’t bypass your protections, but it will surface gaps if your data isn’t labeled or your DLP rules aren’t tuned correctly.
  • Update your AI-use policies and inform your users.