What can Copilot in Whiteboard do?
Copilot combines the power of large language models (LLMs) with Whiteboard to help you visually collaborate with your peers. Using natural language, you can ask Copilot to generate ideas, organize them into themes, and summarize whiteboard content.
Copilot in Whiteboard was evaluated through extensive manual and automated testing on top of Microsoft internal usage and public data. Additional evaluation was performed on custom datasets of offensive and malicious prompts (user questions) and responses. Copilot in Whiteboard is also continually evaluated through online user feedback (the thumbs up and thumbs down function).
Currently, Copilot in Whiteboard is aware only of what is written in the Copilot block and of note content on the board. The content it creates is not grounded in any external page content or linked content.
Copilot in Whiteboard cannot be customized. Admins can choose to enable Copilot for specific users, and users can choose which questions to ask Copilot based on their needs.
Copilot in Whiteboard will best respond when users do the following:
- Provide enough context or content for Copilot to process.
- Use supported languages. Copilot will respond in supported languages.
Copilot in Whiteboard supports many languages. To learn more, see Supported languages for Microsoft Copilot.
Copilot and Microsoft 365 are built on Microsoft's comprehensive approach to security, compliance, and privacy.
For more information about privacy, see the following information:
- If you’re using Microsoft 365 Copilot in your organization (with your work or school account), see Data, Privacy, and Security for Microsoft 365 Copilot.
- If you’re using Copilot in Microsoft 365 apps at home as part of Copilot Pro (with your personal Microsoft account), see Copilot Pro: Microsoft 365 apps and your privacy.
Generative AI features strive to provide accurate and informative responses based on the data available. However, answers may not always be accurate because they are generated from patterns and probabilities in language data. Use your own judgment and double-check the facts before making decisions or taking action based on the responses.
While these features include mitigations to avoid sharing unexpected offensive content and to prevent displaying potentially harmful topics, you may still see unexpected results. We’re constantly working to improve our technology to proactively address issues in line with our responsible AI principles.
If you find a generated response that is incorrect, or if you encounter harmful or inappropriate content, please provide feedback by selecting thumbs down and adding comments in the feedback form. This feedback helps us improve and minimize such content in the future.
Learn more
Welcome to Copilot in Whiteboard
Microsoft 365 Copilot help & learning