As organisations embrace Copilot for Microsoft 365, they unlock a powerful AI assistant that enhances productivity and creativity. However, they must consider data security before adopting Copilot. Microsoft Copilot for Microsoft 365 is a sophisticated processing and orchestration engine that enhances productivity through AI-powered capabilities, coordinating three key components: Large Language Models (LLMs), content in Microsoft Graph, and the Microsoft 365 apps.
The Need for Data Security for Copilot Adoption
Copilot interacts with sensitive organisational data, including emails, chats, and documents, so ensuring robust data security is paramount. Let’s delve into how Copilot handles your proprietary organisational data.
Microsoft Copilot for Microsoft 365 adheres to existing privacy, security, and compliance commitments for Microsoft 365 commercial customers. This includes compliance with the General Data Protection Regulation (GDPR) and the European Union (EU) Data Boundary.
Prompts, responses, and data accessed through Microsoft Graph are not used to train foundation LLMs, including those used by Copilot, so your organisational data remains protected.
Copilot can generate responses anchored in your organisational data, such as user documents, emails, calendar, chats, meetings, and contacts. Microsoft Copilot for Microsoft 365 combines this content with the user’s working context, such as the meeting a user is in now, the email exchanges the user had on a topic, or the chat conversations the user had last week. It uses this combination of content and context to help provide accurate, relevant, and contextual responses.
What is Access Control?
Access control refers to managing user permissions and defining who can access specific resources. It’s about granting the right level of access to the right people at the right time. For Copilot, this means controlling access to data and ensuring only authorised users interact with it.
Microsoft Copilot for Microsoft 365 only surfaces organisational data to which individual users have at least view permissions. It’s important that you’re using the permission models available in Microsoft 365 services, such as SharePoint, to help ensure the right users or groups have the right access to the right content within your organisation.
This includes permissions you give to users outside your organisation through inter-tenant collaboration solutions, such as shared channels in Microsoft Teams.
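This "security trimming" behaviour can be pictured with a minimal Python sketch. To be clear, this is a conceptual illustration, not how Microsoft 365 is implemented: the `Document` and `retrievable_for` names are hypothetical, and the real service enforces permissions through Microsoft Graph and each workload's own permission model.

```python
from dataclasses import dataclass, field

@dataclass
class Document:
    name: str
    # Users or groups holding at least view permission on this item
    viewers: set = field(default_factory=set)

def retrievable_for(user: str, corpus: list) -> list:
    """Mimic security trimming: only items the user can already view
    are eligible to ground a Copilot-style response."""
    return [d.name for d in corpus if user in d.viewers]

corpus = [
    Document("Q3-forecast.xlsx", viewers={"alice", "finance-team"}),
    Document("all-hands-notes.docx", viewers={"alice", "bob"}),
    Document("salary-review.docx", viewers={"hr-team"}),
]

print(retrievable_for("bob", corpus))  # ['all-hands-notes.docx']
```

The point of the sketch is that the retrieval step filters on existing permissions before any content reaches the model, which is why over-broad sharing in SharePoint or Teams directly widens what Copilot can surface.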
When you enter prompts using Microsoft Copilot for Microsoft 365, the information contained within your prompts, the data they retrieve, and the generated responses remain within the Microsoft 365 service boundary, in keeping with our current privacy, security, and compliance commitments.
Access Control Best Practices for Copilot
To safeguard your data while adopting Copilot, consider the following best practices:
Activate just-in-time access for privileged roles: Just-in-time (JIT) access is a security concept that focuses on granting users temporary and specific access privileges based on their immediate needs. This approach enhances data security by reducing the risk of unauthorised access and potential breaches. Here’s how JIT access can help ensure data security when adopting Copilot:
- Least Privileged Access: JIT access aligns with the principle of least privilege. Instead of granting broad and continuous access, it allows users to access resources only when necessary. For Copilot adoption, this means that users have access to Copilot features and data only during active sessions when they are actively using the service.
- Reduced Attack Surface: By limiting access to essential functions, JIT access minimises the attack surface. In the context of Copilot, it ensures that only authorised users can interact with the service, reducing the risk of unauthorised code execution or data exposure.
- Temporary Permissions: JIT access provides time-bound permissions. Users receive access only for the duration needed (e.g., during a specific task or session). For Copilot, this translates to granting access to Copilot features only when users are actively collaborating or creating content.
- Automated Policies: Implementing JIT access involves setting up automated policies that grant access based on specific conditions. For Copilot, organisations can define policies that allow access during specific work hours or based on project requirements.
- Auditability: Organisations can track when and why users accessed Copilot features, enhancing accountability and helping to identify any suspicious activity.
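The time-bound grants described above can be sketched as a small Python model. This is an assumption-laden illustration of the JIT concept, not an Entra ID or Privileged Identity Management API: `JitAccessStore`, `grant`, and `is_authorised` are hypothetical names invented for this sketch.

```python
import datetime as dt
from dataclasses import dataclass

@dataclass
class AccessGrant:
    user: str
    role: str
    expires_at: dt.datetime

class JitAccessStore:
    """Toy JIT model: access exists only between grant and expiry,
    and every grant is recorded, giving an audit trail for free."""

    def __init__(self):
        self._grants = []

    def grant(self, user: str, role: str, duration: dt.timedelta,
              now: dt.datetime) -> AccessGrant:
        g = AccessGrant(user, role, now + duration)
        self._grants.append(g)  # retained for auditability
        return g

    def is_authorised(self, user: str, role: str, now: dt.datetime) -> bool:
        return any(
            g.user == user and g.role == role and now < g.expires_at
            for g in self._grants
        )

store = JitAccessStore()
start = dt.datetime(2024, 1, 1, 9, 0)
store.grant("alice", "copilot-user", dt.timedelta(hours=8), start)

store.is_authorised("alice", "copilot-user", start + dt.timedelta(hours=1))  # True
store.is_authorised("alice", "copilot-user", start + dt.timedelta(hours=9))  # False (grant expired)
```

Note how least privilege falls out of the design: there is no standing access at all, only grants that expire by default.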
Data Loss Prevention (DLP): Implement controls to prevent data leakage, so that Copilot’s generative AI capabilities do not inadvertently expose sensitive information. Administrators can govern Copilots within their organisation using DLP policies, including policies that require authentication, ensuring that only authenticated users can interact with Copilot features.
- Combine DLP with just-in-time (JIT) access to grant temporary and specific access privileges to Copilot features.
- Remember that DLP policies provide a safety net, ensuring that Copilot interactions align with security best practices while maximising productivity.
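The core DLP idea of scanning content against sensitive-information patterns before it is shared can be sketched in a few lines of Python. This is a deliberately crude illustration under stated assumptions: the pattern names and the `dlp_scan`/`allow_response` helpers are hypothetical, and real DLP engines such as Microsoft Purview use much richer classifiers than regular expressions.

```python
import re

# Hypothetical patterns a DLP policy might flag; real policies use
# managed sensitive information types, not hand-written regexes.
SENSITIVE_PATTERNS = {
    "credit_card": re.compile(r"\b(?:\d[ -]?){15}\d\b"),
    "uk_nino": re.compile(r"\b[A-CEGHJ-PR-TW-Z]{2}\d{6}[A-D]\b"),
}

def dlp_scan(text: str) -> list:
    """Return the names of any sensitive-info patterns found in text."""
    return [name for name, pattern in SENSITIVE_PATTERNS.items()
            if pattern.search(text)]

def allow_response(text: str) -> bool:
    """A response is released only if the scan finds nothing."""
    return not dlp_scan(text)

print(dlp_scan("Card: 4111 1111 1111 1111"))  # ['credit_card']
print(allow_response("Quarterly summary attached."))  # True
```

The design choice worth noting is that the scan sits between generation and delivery: the safety net catches sensitive content regardless of which prompt or data source produced it.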
Conditional Access (CA): Conditional Access is a powerful security feature within Microsoft 365 that allows organisations to control access to resources based on specific conditions. It enables organisations to define policies that determine who can access what, under which conditions, and from where, providing granular control over access to applications, data, and services.
How CA Helps with Copilot Adoption:
- Context-Aware Access: Conditional Access evaluates contextual factors such as user location, device health, and sign-in risk. For Copilot, this means that access can be restricted based on these factors.
- Just-In-Time Access: Conditional Access aligns with the principle of least privilege. It grants access only when needed, reducing the risk of unauthorised interactions with Copilot.
- Risk-Based Policies: Conditional Access can enforce policies based on risk levels. For example, if a user’s sign-in is flagged as risky, access to Copilot features can be restricted.
- Multi-Factor Authentication (MFA): Conditional Access can require MFA for Copilot access, adding an extra layer of security.
- Device Compliance: Conditional Access ensures that only compliant devices can interact with Copilot.
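The policy factors above can be combined into a single access decision, sketched here as a toy Python policy engine. This mirrors how the listed signals interact but is in no way the Entra ID Conditional Access implementation; `SignInContext` and `evaluate_copilot_access` are names invented for this sketch.

```python
from dataclasses import dataclass

@dataclass
class SignInContext:
    user: str
    mfa_satisfied: bool
    device_compliant: bool
    sign_in_risk: str  # "low", "medium", or "high"

def evaluate_copilot_access(ctx: SignInContext) -> str:
    """Evaluate the contextual signals in a fixed order:
    risk first, then device compliance, then MFA."""
    if ctx.sign_in_risk == "high":
        return "block"          # risk-based policy: flagged sign-in
    if not ctx.device_compliant:
        return "block"          # device compliance requirement
    if not ctx.mfa_satisfied:
        return "require_mfa"    # grant, but only after MFA
    return "grant"

evaluate_copilot_access(
    SignInContext("alice", mfa_satisfied=True,
                  device_compliant=True, sign_in_risk="low"))  # 'grant'
```

A useful property of this structure is that the outcome is not just allow/deny: a policy can respond to a missing control by demanding it (here, MFA) rather than blocking outright.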
The Risks of Poor Access Control
When adopting Microsoft Copilot for Microsoft 365, it’s crucial to address the risks associated with poor access control. Copilot, with its AI-powered capabilities, promises to enhance productivity within M365 applications like Word, PowerPoint, Teams, and Outlook.
Insufficient access control poses several risks:
- Data Exposure: Over-permissioned access can lead to unintended data exposure and leakage.
- Data Espionage: Copilot’s AI capabilities may inadvertently surface private information if not properly controlled.
- Privileged Account Compromise: Unprotected admin accounts and resource entitlements are high-value targets; protect them using tools like Microsoft Entra ID Governance.
Ready to Get Started?
Empower your organisation with Copilot while maintaining data security. Explore Microsoft Entra for access controls and follow best practices. Remember, responsible adoption ensures a harmonious balance between productivity and protection. Reach out to us today and find out how we can support you.