Frequently asked questions about Microsoft 365 Copilot

Last updated: August 2023

Copilot combines the power of large language models (LLMs) with the intelligence of the Microsoft Graph. Using your business data, Copilot synthesizes information from documents, emails, and messages to give you summaries, answer your business questions, and generate draft content from your files, messages, and more.

Copilot uses a large language model that mimics natural language based on patterns from large amounts of training data. The model is optimized for conversation by using Reinforcement Learning with Human Feedback (RLHF)—a method that uses human demonstrations to guide the model toward a desired behavior.
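
To make the RLHF idea concrete, here is a minimal, illustrative sketch (not Microsoft's implementation) of the pairwise-preference objective commonly used to train a reward model: the response a human labeled as preferred should score higher than the rejected one.

```python
import math

# Illustrative sketch of the pairwise-preference idea behind RLHF.
# A reward model assigns a score to each candidate response; training
# pushes the score of the human-preferred response above the score of
# the rejected one (a Bradley-Terry style objective).

def preference_loss(score_preferred: float, score_rejected: float) -> float:
    """Negative log-probability that the preferred response wins."""
    # P(preferred beats rejected) = sigmoid(score_preferred - score_rejected)
    p_win = 1.0 / (1.0 + math.exp(-(score_preferred - score_rejected)))
    return -math.log(p_win)

# A correctly ranked pair yields a low loss; a misranked pair yields a
# high loss, nudging the model toward the human-demonstrated behavior.
print(preference_loss(2.0, 0.5))   # correctly ranked -> ~0.20
print(preference_loss(0.5, 2.0))   # misranked -> ~1.70
```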

When you submit a text prompt to Copilot, the model generates a response by predicting what text should come next in a string of words. You use plain natural language, not a specialized query syntax, to specify what kind of information you want Copilot to search and synthesize from your Microsoft 365 data. It's important to keep in mind that the system was designed to mimic natural human communication, but the output may be inaccurate or out of date.
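
As a toy illustration of that next-word mechanism (purely illustrative; real models learn distributions over subword tokens with a neural network, not a lookup table), consider:

```python
import random

# Toy sketch of next-word prediction: given the words so far, sample a
# likely next word and append it. A hand-written table stands in for
# the learned probability distribution of a real language model.

NEXT_WORD = {
    ("summarize",): {"unread": 0.6, "the": 0.4},
    ("summarize", "unread"): {"emails": 0.9, "messages": 0.1},
    ("summarize", "unread", "emails"): {"from": 1.0},
}

def generate(prompt: list[str], steps: int = 3) -> list[str]:
    words = list(prompt)
    for _ in range(steps):
        dist = NEXT_WORD.get(tuple(words))
        if dist is None:          # no known continuation for this context
            break
        choices, weights = zip(*dist.items())
        words.append(random.choices(choices, weights=weights)[0])
    return words

print(" ".join(generate(["summarize"])))  # e.g. "summarize unread emails from"
```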

Use natural language to ask Copilot work questions, ask for help generating content, or request a summary of a topic, document, or chat conversation. The more specific your prompt, the better the results.

Here are some example prompts to help you get started (a sketch of sending a prompt programmatically follows the list):

  • Summarize unread emails from John

  • Draft a message to my team with action items from my last meeting

  • What’s our vacation policy?

  • Who am I meeting with tomorrow?
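
If you prefer to send prompts like these from a script, the sketch below shows the general shape of posting one to a chat-style HTTP endpoint. The URL, JSON fields, and token here are hypothetical placeholders, not a documented Microsoft 365 Copilot API.

```python
import json
import urllib.request

# Hypothetical sketch: POSTing a natural-language prompt to a chat-style
# endpoint. The URL, payload fields, and auth token are placeholders, so
# this call only works against a real endpoint you substitute in.

ENDPOINT = "https://example.com/copilot/chat"   # placeholder URL
TOKEN = "YOUR_ACCESS_TOKEN"                     # placeholder credential

def ask(prompt: str) -> str:
    payload = json.dumps({"prompt": prompt}).encode("utf-8")
    request = urllib.request.Request(
        ENDPOINT,
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {TOKEN}",
        },
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)["reply"]     # placeholder response shape

print(ask("Summarize unread emails from John"))
```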

Copilot is designed to provide accurate and informative responses based on the knowledge and data available to it in the Microsoft Graph. However, answers may not always be accurate, because they are generated from patterns and probabilities in language data. Responses include references when possible, but it's important to verify the information. While Copilot has safeguards to avoid sharing offensive content or potentially harmful topics, you may still see unexpected results. We're continuously working to improve our technology to prevent harmful content.

If you find an answer is incorrect or if you encounter harmful or inappropriate content, please provide feedback by selecting the Thumbs Down icon and describing the issue in detail.

To ensure quality, Copilot is given test questions, and its responses are evaluated against criteria such as accuracy, relevance, and tone. Those evaluation scores are then used to improve the model. Providing feedback via the Thumbs Up and Thumbs Down icons helps teach Copilot which responses are and are not helpful to you. We use this feedback to improve Copilot, just as we use customer feedback to improve other Microsoft 365 services and apps. We don't use this feedback to train the foundation models used by Copilot. Customers can manage feedback through admin controls. For more information, see Manage Microsoft feedback for your organization.
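
As a minimal sketch of how rubric-style evaluation like this can work (illustrative only; the criteria names mirror the ones above and the scores are made up), per-criterion averages over a batch of test responses highlight where a model needs improvement:

```python
from statistics import mean

# Each test response gets a 1-5 score per criterion; averaging per
# criterion shows which dimension of quality is lagging.

CRITERIA = ("accuracy", "relevance", "tone")

evaluations = [
    {"accuracy": 4, "relevance": 5, "tone": 4},
    {"accuracy": 3, "relevance": 4, "tone": 5},
    {"accuracy": 5, "relevance": 4, "tone": 4},
]

for criterion in CRITERIA:
    avg = mean(e[criterion] for e in evaluations)
    print(f"{criterion}: {avg:.2f} / 5")
```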

You no longer need to link your personal account to your work account to use Copilot. To use Copilot in Bing for Business, sign in to Microsoft Edge with your Microsoft work account; to use Bing for Consumers, sign in to Microsoft Edge with your personal Microsoft account. If you want to use both at the same time, open Microsoft Edge in two separate windows, sign in to one with your work account, and sign in to the other with your personal account.

Copilot and Microsoft 365 are built on Microsoft's comprehensive approach to security, compliance, and privacy.

As AI is poised to transform our lives, we must collectively define new rules, norms, and practices for the use and impact of this technology. Microsoft has been on a Responsible AI journey since 2017, when we first defined our principles and our approach to ensuring this technology is used in a way that puts people first. Read more about Microsoft's framework for building AI systems responsibly, the ethical principles that guide us, and the tooling and capabilities we've created to help customers use AI responsibly.

Copilot presents only the data that each individual can access, using the same underlying controls for data access used in other Microsoft 365 services. The permissions model within your Microsoft 365 tenant helps ensure that data doesn't leak between users and groups. For more information on how Microsoft protects your data, see the Microsoft Privacy Statement.
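
To illustrate what permission-trimmed access means in practice, here is a minimal sketch (the data structures are hypothetical stand-ins, not Microsoft's permissions model): before any content is summarized, each document is checked against the requesting user's access rights, so an answer can only draw on data that user could already open.

```python
from dataclasses import dataclass, field

# Hypothetical stand-in for an access-controlled document store.

@dataclass
class Document:
    title: str
    allowed_users: set[str] = field(default_factory=set)

DOCUMENTS = [
    Document("Q3 sales summary", {"alice", "bob"}),
    Document("HR vacation policy", {"alice", "bob", "carol"}),
    Document("M&A due diligence", {"alice"}),
]

def accessible_documents(user: str) -> list[Document]:
    """Return only the documents this user is already permitted to read."""
    return [doc for doc in DOCUMENTS if user in doc.allowed_users]

# Bob's answer can draw on the sales summary and the vacation policy,
# but never on the due-diligence file he has no access to.
for doc in accessible_documents("bob"):
    print(doc.title)
```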
