Frequently Asked Questions about Copilot in Whiteboard
What can Copilot in Whiteboard do?
Copilot combines the power of large language models (LLMs) with Whiteboard to help you visually collaborate with your peers. Using natural language, you can ask Copilot to generate ideas, organize ideas into themes, and summarize whiteboard content.
How was Copilot in Whiteboard evaluated?
Copilot in Whiteboard was evaluated through extensive manual and automated testing, building on Microsoft internal usage and public data. Additional evaluation was performed over custom datasets of offensive and malicious prompts (user questions) and responses. In addition, Copilot in Whiteboard is continuously evaluated through in-product user feedback (the thumbs-up and thumbs-down controls).
What are the limitations of Copilot in Whiteboard and how can users minimize the impact of these limitations when using the system?
Currently, Copilot in Whiteboard is only aware of the text in the Copilot block and the note content on the board. The content it creates is not grounded in any external page content or linked content.
Can I customize the settings of Copilot in Whiteboard and how will this impact the system behavior?
Copilot in Whiteboard cannot be customized. Admins can choose to enable Copilot for specific users, and users can choose what questions to ask Copilot based on their needs.
What can users do to improve performance?
Copilot in Whiteboard responds best when users do the following:
- Provide enough context or content for Copilot to process.
- Write prompts in a supported language. Copilot responds in supported languages.
What languages does Copilot in Whiteboard support?
Copilot in Whiteboard supports many languages. To learn more, see Supported languages for Microsoft Copilot.
Where can I learn more about privacy?
Copilot and Microsoft 365 are built on Microsoft's comprehensive approach to security, compliance, and privacy.
For more information about privacy, see the following information:
- If you're using Microsoft 365 Copilot in your organization (with your work or school account), see Data, Privacy, and Security for Microsoft 365 Copilot.
- If you're using Copilot in Microsoft 365 apps for home (with your personal Microsoft account), see Copilot in Microsoft 365 apps for home: your data and privacy.
Can I trust that the answers are always accurate?
Generative AI features strive to provide accurate and informative responses based on the data available. However, because answers are generated from patterns and probabilities in language data, they may not always be accurate. Use your own judgment and double-check the facts before making decisions or taking action based on the responses.
While these features include mitigations to avoid surfacing unexpected offensive content and take steps to prevent displaying potentially harmful topics, you may still see unexpected results. We're constantly working to improve our technology to proactively address issues in line with our responsible AI principles.
What should I do if I see inaccurate, harmful, or inappropriate content?
If a generated response is incorrect, or if you encounter harmful or inappropriate content, please provide feedback by selecting thumbs down and adding comments in the feedback form. This feedback helps us improve and minimize such content in the future.
Learn more
Welcome to Copilot in Whiteboard