Frequently asked questions about Copilot in Microsoft Teams

Last updated: July 12, 2023

Copilot combines the power of large language models (LLMs) with Teams data to provide summaries and answer questions in real time, helping you stay productive in the workplace.


  • Copilot for chats summarizes chat messages. Its inputs are messages from the conversation it is invoked from, and its outputs are a high-level summary and key takeaways. When opened, Copilot automatically provides a default summary based on the most recent conversation history. Users can also choose to summarize the last 1, 7, or 30 days of conversation. The user can enter a query (suggested or free text), and Copilot will generate answers based on the chat history. Common use cases are summarizing the chat and getting answers to questions such as what decisions were made or which items remain open.


  • Copilot for meetings is based on the meeting transcript, so it works only if the meeting is recorded or transcribed. The user can enter a query (suggested or free text) at any time during or after the meeting, and Copilot will generate answers based on the transcript. Common uses include generating meeting notes and action items, answering questions, listing the main ideas discussed, and surfacing unresolved questions.

Copilot in Teams is available in chats and meetings.

  • Chats: Users can manually access Copilot by pressing the Copilot button in the chat header. Additionally, if a licensed user returns to a chat after being away for a significant period, a clickable Smart Action may appear to suggest that a summary could be helpful.

  • Meetings: Users can access Copilot during a meeting via the Copilot button, and after a meeting (on the meetings details tab). Meeting transcription needs to be turned on for Copilot to function.

Copilot in Teams was evaluated through extensive manual and automatic testing on top of Microsoft internal usage and public data. Additional evaluation was performed on custom datasets of offensive and malicious prompts (user questions) and responses. In addition, Copilot in Teams is continually evaluated against online user feedback (the thumbs-up and thumbs-down feedback function).

In chat scenarios, Copilot is limited in how much data it can process and how far back it will go to fetch an answer. The farthest back Copilot can process is 30 days from the most recent message sent; retention policies may limit this further. In meeting scenarios, long meetings (longer than two hours) may experience higher latency than shorter ones.

Many languages are supported in Copilot scenarios. Quality is expected to be better when inputs are in English; quality in other languages is expected to improve over time.

Teams Copilot cannot be customized. Admins can choose which Copilots to enable via the Microsoft 365 admin center and may enable or disable Copilot in Teams. Users can choose what questions to ask Copilot based on their needs.

Teams Copilot will best respond when users do the following:

  1. Limit questions to topics covered in the chat or meeting. Copilot will not answer unrelated questions.

  2. Speak or chat in supported languages. Copilot will respond in supported languages; however, as noted above, English inputs produce better responses. For meetings, make sure the defined language of the transcript matches the spoken language.

  3. Ensure a recent and substantive volume of content is available. Copilot will display an error message if these requirements are not met.

Copilot in Teams supports many languages. To learn more, see Supported languages for Microsoft Copilot.

Copilot and Microsoft 365 are built on Microsoft's comprehensive approach to security, compliance, and privacy.


Teams generative AI features strive to provide accurate and informative responses based on the data available. However, answers may not always be accurate, as they are generated from patterns and probabilities in language data. Use your own judgment and double-check the facts before making decisions or taking action based on the responses.

While these features have mitigations in place to avoid sharing unexpected offensive content in results and take steps to prevent displaying potentially harmful topics, you may still see unexpected results. We’re constantly working to improve our technology to proactively address issues in line with our responsible AI principles.

If you find a generated response that is incorrect, or if you encounter harmful or inappropriate content, please provide that feedback by selecting thumbs down on the summary and adding comments in the feedback form. This feedback helps us improve and minimize such content in the future.
