Frequently asked questions about Copilot in Microsoft Teams
Applies To
Microsoft365.com Microsoft Office Microsoft Teams

Last updated: June 13, 2025

Copilot in Teams empowers you to communicate without barriers by combining the power of large language models (LLMs) with Microsoft's enterprise-grade security and privacy.

Chats and channels: With Copilot, catching up on conversations in chats and channels has never been easier. You can ask Copilot for a summary or use it to understand open questions, decisions made, and more. Learn more about using Copilot in Microsoft Teams chat and channels.

Compose: You can harness the power of Copilot to communicate effectively and effortlessly with your coworkers. Copilot goes beyond spelling and grammar issues: it can rewrite and adjust your messages to your desired tone and length. With the new Custom Tone feature, you can instruct Copilot to fine-tune the tone, translate the message, or add context. Learn more about rewriting and adjusting your messages with Copilot in Microsoft Teams.

Meetings and calls: Copilot unlocks new ways to stay engaged in your meetings and calls. You can ask for custom summaries, synthesize information across the conversation and screen-shared content, or get deep insights into unanswered questions, all in real time. Learn more about Copilot in Microsoft Teams meetings.

Copilot in Teams is designed to reliably provide high quality, complete, and legible responses with low latency. It has been evaluated through extensive manual and automatic testing of Microsoft internal usage, public data, and online measurements to ensure it meets or exceeds our standards. Additionally, rigorous evaluations were performed over custom datasets for offensive and malicious prompts (user questions) and responses.

Copilot in Teams is constantly evaluated through user feedback (thumbs up/thumbs down), as well as automatic and manual evaluations by our product teams. Sharing feedback helps us improve the Copilot experience. We value your privacy: your data is not shared with third parties or used to train the model. Learn more about sharing feedback for Copilot

Follow these guidelines to maximize the performance of Copilot in Teams:

  1. When you write your own prompt, make sure you're providing enough details about the goal, context, expectations, and sources to get the response you want. You may want to iterate on a prompt until you get a satisfactory response. For more details, visit Learn about Copilot prompts. You can also use prompt suggestions from Microsoft, which have instructions built-in to support high quality responses.

  2. Limit questions to topics covered in the chat or meeting. Copilot will not answer unrelated questions.

  3. Speak or chat in supported languages. Copilot will respond in supported languages; however, English inputs produce the best responses. For meetings, make sure the spoken-language setting is accurate.

  4. Ensure that a recent, substantive volume of content is available. If there is not enough data, Copilot will show an error message.

  5. Consider breaking up long meetings of approximately two hours or more into smaller segments to maintain low latency and high accuracy.

  6. Help make Copilot better by giving your feedback on responses.

Copilot in Teams supports English, Spanish, Japanese, French, German, Portuguese, Italian, and Chinese Simplified. To learn more, see Supported languages for Microsoft Copilot.

Teams Generative AI features strive to provide accurate and informative responses based on the data available. However, answers may not always be accurate, as they are generated based on patterns and probabilities in language data. Use your own judgment and double-check the facts before making decisions or taking action based on the responses. While these features have mitigations in place to avoid sharing unexpected offensive content in results and take steps to prevent displaying potentially harmful topics, you may still see unexpected results. We're constantly working to improve our technology to proactively address issues in line with our Responsible AI principles.

LLMs naturally give slightly different responses for each query, just like a human does. If you ask someone the same question on different days, their answer will not be exactly the same. During a meeting and in an active chat, Copilot gives an answer based on the most recent content available. This content evolves over time: a topic that Copilot found relevant for an earlier summary might not be as relevant five minutes later.

By nature, Copilot summaries for meetings are concise and will not always cover 100% of the meeting content. However, you can always follow up with questions about specific details or topics to get the most out of your meetings. Please consider helping us improve Copilot by giving your feedback (thumbs up or thumbs down) as you use it in Teams.

Currently, Copilot uses the first string in the attendee's display name. If an attendee's display name is reversed (Last Name, First Name), Copilot may display the attendee's name differently. We are rolling out changes to ensure Copilot will use the correct first name. Please share feedback to allow us to improve further.

To provide insights about Teams meetings, Copilot needs access to what is being said. In an ongoing meeting, Copilot can be used if it has been turned on during the meeting or if a transcript has been started. After the meeting, Copilot will answer questions using the most recent available transcript. If no transcript is available, Copilot will only be available for the meeting chat. The meeting organizer can restrict who has access to Copilot and the transcript. Answers may be limited or subject to longer latency in meetings longer than approximately two hours.

In chat scenarios, Copilot will reference the chat where it is opened. Unless you specify otherwise, it will only reference messages sent in the last 30 days from the most recent message sent. Copilot cannot summarize images, Loop components, or files shared in the chat.

Microsoft is committed to Responsible AI. If Copilot detects that a query violates Responsible AI guidelines, it will not provide a response.

If you find a generated response that is incorrect or if you encounter harmful or inappropriate content, please provide that feedback by clicking thumbs down in the summary and providing additional comments in the feedback form. This feedback helps us to improve and minimize this content in the future. 

No. Copilot in Teams does not share data with third parties.

No. Copilot in Teams does not use user data to train the model. 

Copilot and Microsoft 365 are built on Microsoft's comprehensive approach to security, compliance, and privacy.

For more information about privacy, see the following resources:
