You are in a meeting. Your phone rings. Copilot answers it, works out what the caller wants, decides whether it is urgent enough to interrupt you, and either puts them through or books them in for later. That is what Microsoft has just switched on.
Included in the company’s April 2026 Teams update, published on 30 April, Copilot call delegation is now live for organisations enrolled in Frontier, Microsoft’s early-access programme for Microsoft 365 Copilot features, with a Microsoft 365 Copilot licence required. Licensing details and service limits are subject to change before general availability.
How Microsoft Copilot Call Delegation Works in Teams Phone
Once a user enables Copilot call delegation in Teams Call settings, Microsoft 365 Copilot picks up incoming calls and speaks with the caller. It gathers context about the reason for the call and tries to work out whether it is time-sensitive. If it judges the call to be urgent, it attempts a live transfer. If not, it offers the caller the option to leave a voicemail or book a follow-up appointment through Microsoft Bookings.
After each screened call, Copilot produces a written summary covering the reason for the call, key topics, and suggested next steps.
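As a rough mental model of the flow described above (not Microsoft’s implementation — the function names and the keyword-based urgency check are invented for illustration; the real system uses an AI judgement, not a keyword list), the routing logic amounts to something like:

```python
from dataclasses import dataclass

# Toy stand-in for Copilot's urgency judgement. Invented for illustration;
# the real feature makes an AI-based assessment, not a keyword match.
URGENT_MARKERS = ("outage", "emergency", "deadline", "asap")

def classify_urgency(reason: str) -> bool:
    return any(marker in reason.lower() for marker in URGENT_MARKERS)

@dataclass
class ScreeningResult:
    action: str   # "live_transfer" or "voicemail_or_booking"
    summary: str  # written summary produced after the call

def screen_call(caller: str, reason: str, user_accepts: bool) -> ScreeningResult:
    """Sketch of the flow: gather context, judge urgency, route, summarise."""
    urgent = classify_urgency(reason)
    if urgent and user_accepts:
        action = "live_transfer"
    else:
        action = "voicemail_or_booking"
    summary = (f"Caller: {caller}. Reason: {reason}. "
               f"Urgent: {urgent}. Next step: {action}.")
    return ScreeningResult(action=action, summary=summary)
```

The point of the sketch is the branch: everything that is not judged urgent, or that the user does not pick up, falls through to voicemail or a booking, and every path ends in a written summary.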
The feature is part of a push by Microsoft to move Copilot from assistant to autonomous agent. Speaking at the AI Agent & Copilot Summit in March 2026, James Oleinik, Microsoft’s partner director of product management, described the direction: “Copilot is our one AI app. It is where we’re transforming that knowledge work. It’s the one app that employees log into to interface with AI and delegate work to their agents.” On Microsoft’s Q3 FY2026 earnings call last week, CEO Satya Nadella explained:
“We now have a complete new form factor where you delegate the task; you are not even interactively working but delegating the task with CoWorkers.”
Call delegation is the phone system expression of that same idea.
Copilot Call Delegation and Caller Consent Under GDPR
The first compliance question is caller disclosure. When Copilot answers a Teams Phone call, the caller is speaking to an AI agent. Copilot processes and summarises that conversation. Depending on jurisdiction and configuration, it may also record it. Under GDPR, voice recordings constitute personal data. The regulation’s transparency requirements mean the caller needs to know their data is being processed and why.
Microsoft’s own support documentation confirms that “callers see an avatar and hear an announcement that they are speaking with an attendant, not a person,” which addresses the most basic transparency requirement. Whether that satisfies GDPR’s more granular obligations around data processing and retention is a separate question, and one Microsoft’s current documentation does not answer.
GDPR fines for non-compliance can reach €20 million or 4% of global annual revenue. Organisations deploying Copilot call delegation in EU contexts need to ensure adequate disclosure at the point of connection and check that data retention and access controls for generated summaries meet their obligations.
Voice biometric data, including voiceprints derived from recordings, qualifies as special-category data under GDPR when processed to uniquely identify a person, and requires explicit consent or another Article 9 basis. That matters wherever call delegation processes audio in a way that generates or relies on voice-characteristic analysis.
The EU AI Act’s Workplace Emotion Inference Ban: Does Copilot Call Delegation Fall Within Scope?
A more specific regulatory question involves the EU AI Act’s Article 5, in force since 2 February 2025. The Act prohibits AI from inferring emotions in workplace settings, with narrow exceptions for medical and safety use cases. Fines reach up to €35 million or 7% of global annual turnover.
Whether Copilot call delegation falls within scope depends on what the AI does when it classifies a call as urgent.
The European Commission’s guidelines from February 2025 clarified that sentiment analysis on written text does not involve biometric data and sits outside the prohibition.
Voice analysis is a different matter. AI systems that draw inferences from acoustic features, such as tone, pitch, speech rate, or stress patterns, work with biometric data. The Commission’s guidelines specifically flag that “AI systems monitoring the emotional tone in hybrid work teams by identifying and inferring emotions from voice” are prohibited. The prohibition covers both the caller and the recipient. An individual who receives a work call falls within a workplace context for these purposes.
The AI Act’s own Recital 44 is direct on why the ban exists, noting that AI systems inferring emotions from biometric data “may lead to discriminatory outcomes and can be intrusive to the rights and freedoms of the concerned persons”, and pointing to the “limited reliability” of such systems across different cultures and individuals.
Microsoft has not published technical documentation on whether Copilot call delegation’s urgency detection uses semantic content only, or also draws on acoustic voice features. That distinction determines whether organisations with employees or callers in the EU can lawfully deploy the feature. Organisations should ask Microsoft directly before enabling it.
The Commission declined to soften the prohibited practices list in its November 2025 review. Enforcement will intensify from August 2026 when the remaining high-risk AI provisions take effect. As one recent analysis put it, the fine tier for workplace emotion recognition “sits right there, next to social scoring and subliminal manipulation”: the most serious category of breach the Act defines.
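To make the distinction concrete, here is a hypothetical contrast — both functions are invented for illustration, not anything Microsoft has documented. The first infers urgency from what the caller says, working on text alone; the second derives a measure from how they say it, operating on acoustic voice characteristics, which is the kind of biometric processing the prohibition targets.

```python
import statistics

def semantic_urgency(transcript: str) -> bool:
    # Text-only analysis: the Commission's guidance places sentiment
    # analysis on written text outside the biometric prohibition.
    return any(w in transcript.lower() for w in ("urgent", "emergency", "now"))

def acoustic_stress_score(pitch_samples_hz: list[float]) -> float:
    # Inferring stress from pitch variability operates on acoustic voice
    # features, i.e. biometric data -- the territory Article 5 restricts
    # when the output is an emotional inference in a workplace setting.
    return statistics.pstdev(pitch_samples_hz)
```

Both functions could feed the same “urgent or not” decision, which is exactly why the regulatory outcome turns on an implementation detail Microsoft has not disclosed.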
What to Consider Before Enabling Copilot Call Delegation in Your Organisation
There are also operational questions worth working through before deployment.
Urgency detection is only as useful as it is accurate. If Copilot call delegation misjudges a time-sensitive call as non-urgent, the caller is steered to voicemail or a booking rather than a live transfer, and the user may miss the call entirely.
Caller experience matters too, particularly in client-facing, legal, financial, or healthcare contexts. Callers who expect to speak to a person and reach a screening agent instead may not respond well. That is a business decision, but one worth making deliberately.
Call summaries will build up as records within the Microsoft 365 environment. In regulated industries, those records may face discovery, audit, or regulatory inspection. Summary retention policies need to align with the rules that govern an organisationβs other communications data.
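A minimal sketch of what “align with the rules that govern other communications data” can mean in practice — the retention window and data shapes here are illustrative assumptions, not Microsoft defaults: a periodic sweep that flags screened-call summaries older than the policy window so they can be reviewed or disposed of.

```python
from datetime import datetime, timedelta, timezone

# Example only: a 7-year window, matching a typical communications-records
# policy in regulated industries. Substitute your organisation's own rule.
RETENTION = timedelta(days=365 * 7)

def summaries_past_retention(summaries: list[tuple[str, datetime]],
                             now: datetime) -> list[str]:
    # Each entry is (summary_id, created_at). Return the IDs that have
    # outlived the retention window and need review or disposal.
    return [sid for sid, created in summaries if now - created > RETENTION]
```

The design point is that summaries are records from the moment they are generated, so they should enter the same retention machinery as email and chat rather than accumulating indefinitely.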
Copilot call delegation is currently available to organisations enrolled in the Microsoft Frontier programme with a Microsoft 365 Copilot licence.