Different users will approach the Copilot Notebook Study Guide in different ways. Below are some onboarding tips and best practices tailored for students, educators, and IT administrators.
As a student, Copilot’s Study Guide can become an invaluable part of your study routine. Once your school or university has enabled Microsoft 365 Copilot for you, here’s how you can make the most of the feature:
- Create personal study guides for your courses: Get into the habit of compiling your lecture notes, textbook chapters, and any reference articles into a Copilot Notebook for each course or unit. For example, you might have separate notebooks like “History 101 – Midterm Study Guide” or “Biology – Chapter 5 Review.” Periodically use the Study Guide button to generate updated study materials from those sources. This is like having a personalized “teacher’s summary” of everything you’ve covered.
- Use the interactive features to study actively: Don’t just read the summary – engage with the content. Answer the thought questions on the Topic pages (you can even write your answers in your notebook for later review). Quiz yourself with the Flashcards and take the Quiz page at the end. Active participation reinforces your learning far more than passive reading.
- Combine Copilot with your own notes: Feel free to type additional notes or insights into the Study Guide pages. If Copilot’s summary sparks a memory or an idea, jot it down. You can also ask Copilot follow-up questions in the chat right alongside your notebook. For instance, if a Topic page mentions something you don’t fully understand, use the Copilot chat pane in OneNote to ask, “Can you explain this concept in another way?” The AI responds based on your notebook’s content, and you can add that explanation to your notes.
- Use it ethically and wisely: Remember that Copilot is drawing from materials you already have access to – using it to study is not cheating; it’s a learning aid. However, always practice academic integrity: the Study Guide’s purpose is to help you learn core concepts, not to generate assignments for you. Double-check any critical facts or interpretations with your official sources or textbook, especially if something in the AI summary seems inconsistent with what was taught in class. Using AI thoughtfully is a skill – treat Copilot as a partner and verify its output when in doubt to ensure you truly understand the material.
- First-time use and guidance: The first time you select Study Guide, you might see a brief explanation or tooltip about what the feature does (noting that it uses AI on your content). Read those onboarding notes carefully. If your institution provides any orientation or training on AI tools like Copilot, make sure to attend it or ask your teacher for guidelines. Microsoft is committed to helping you use the tool responsibly, but it’s important to use your own judgment too.
The Study Guide can be a powerful assistant for teachers and faculty members, whether in K-12 or higher education:
- Generate revision materials effortlessly: Save time creating study aids. Feed Copilot the unit’s readings, slides, or any other relevant content, then let it draft a Study Guide for your students. Review the AI-generated pages for accuracy, make any necessary edits (for example, adjust phrasing or add school-specific references), and then share the guide with your class if your goal is a single shared guide for all your students. Students will not receive individual copies of the guide, so before sharing, consider privacy and whether students may prefer to use the guide as a personal, private learning space. Individual students will need to generate their own Flashcard and Quiz pages to check their understanding, and they will be able to see each other’s Flashcards and Quizzes.
- Encourage self-directed learning: Promote the use of Study Guide among your students for independent study. For example, you could have a classroom exercise where students compile their projects or weekly notes into a Copilot Notebook and generate a Study Guide. This helps students practice synthesizing information and identifying key points – since they’ll see what the AI highlights as important. It can also serve as a starting point for group discussions; students can compare their study guides or quiz each other using flashcards generated from their individual notes.
- Use AI literacy guidance: Materials linked from https://aka.ms/AILiteracy can help educators, students and learners, and institution leaders use AI effectively and responsibly.
- Stay in control of content: It’s crucial to try the Study Guide yourself before recommending it widely. This hands-on experience will let you understand its capabilities and limitations. You’ll find that Copilot’s guidance is limited to what content the student provides – it won’t magically include material that wasn’t input. So you might advise students on what to include in their notebooks: if they skip certain chapters, those topics won’t appear in the guide. Use this insight to support students in gathering complete and relevant materials for studying.
- Address concerns openly: Some educators may worry that students will rely on AI summaries and not engage with the full course content. Be clear that Copilot’s outputs are only as comprehensive as the inputs – it is not a substitute for reading and learning, but rather a tool for reviewing. Encourage students to treat the Study Guide as reinforcement (after they’ve gone through the lessons) instead of a replacement. Consider setting class policies for AI usage: for example, students may use Study Guides for exam preparation but must still complete original assignments without AI assistance. This maintains academic integrity while leveraging the benefits of the technology.
- Managing access: If you teach younger students or have concerns about usage, work with your IT admin to manage who has access. By default, Microsoft’s education licensing and Entra ID settings will restrict Copilot features for accounts classified as under 13. Make sure you communicate with students about when and how they should use these tools. Microsoft’s Education resources (see Support and Resources below) provide guides on best practices for integrating AI into teaching.
Deployment of the Copilot Notebook Study Guide feature requires planning around licensing and compliance in your organization (school or enterprise). Key things for admins:
- Licensing requirements: Verify that each intended user has a Microsoft 365 Copilot license assigned. The Study Guide is not a separate product; it is included as part of the Microsoft 365 Copilot license entitlement. Education customers need a valid Microsoft 365 Copilot license for their tenant (such as Microsoft 365 A1, A3, or A5 with the Copilot add-on for Education) for faculty and students (13+) to use the feature. Enterprise customers likewise need Microsoft 365 Copilot licensing enabled for their users. No special “Study Guide” SKU or license exists – it’s bundled with the overall Copilot offering. (For consumer users, when supported, the feature would be available to those with a Microsoft 365 Copilot subscription on eligible plans.)
- Consent and age controls: Microsoft provides granular control for education tenants via Entra ID (Azure AD) user fields like Age Group and Consent Provided. Ensure student accounts are marked appropriately – only users flagged as “Non-adult” (13-18) or “Adult” are allowed to use Copilot features, including Study Guide. Accounts set as “Minor” are automatically blocked from Copilot by the service for safety compliance. These controls help schools maintain COPPA and GDPR compliance and ensure Responsible AI usage at scale. See additional details here.
- Data and storage: No special storage requirements. The Study Guide pages are stored in the user’s OneNote (which in Microsoft 365 is backed by SharePoint/OneDrive). This means all content generated by the Study Guide is retained within your tenant’s existing infrastructure and is subject to the same data retention and compliance policies as other Office documents. There is no separate external database storing user study data; the service acts as a pass-through generator that does not persist user content or answers outside of the user’s notebook. If a user deletes their notebook or data, it’s gone from the service as well. For auditing, any logs or telemetry collected are minimal and typically contain only non-identifiable events (for example, usage counts) needed for service health. This design aligns with Microsoft’s privacy commitments – user prompts and outputs are not used to train the AI model.
- Safety and compliance: The Study Guide feature was reviewed against Microsoft’s privacy, security, and Responsible AI standards before release. It adheres to data protection standards by keeping student content private to them unless they share it; the data is stored under the user’s identity and owned by the tenant. The AI’s behavior is constrained to educational use cases: for example, it will refuse to generate inappropriate content. All generated content passes through Azure AI Content Safety filters to screen out profanity, harassment, sexual content, violence, and other undesired material. The feature is labeled as available to users 13 and older and is disabled for younger students by default. As an admin, you should monitor initial usage and gather feedback; if any problematic outputs occur, they can be reported through your Microsoft support channels.
- Use AI literacy guidance: Materials linked from https://aka.ms/AILiteracy can help educators, students and learners, and institution leaders use AI effectively and responsibly.
- Tips for rollout: Provide guidance to both teachers and students on how to use Study Guide in educationally effective ways. Emphasize to users that this is an aid for learning and revision, not a cheating tool, and that usual academic honesty policies still apply. On the technical side, stay updated with Microsoft’s documentation for any changes during the preview period and prior to general availability, and ensure your tenant’s policies on data privacy are compatible with Copilot’s requirements (since the feature uses cloud AI services, internet connectivity and M365 compliance settings must allow it).
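Admins planning a rollout may want to audit accounts against the licensing and age-group requirements above before enabling the feature. The sketch below is a minimal illustration, assuming you have already fetched user records from Microsoft Graph (the `ageGroup` property plus each user's `licenseDetails`); the `"COPILOT"` substring match and the sample `servicePlanName` value are illustrative assumptions, not official plan identifiers, so verify the exact service plan names for your tenant's SKUs.

```python
"""Pre-rollout audit sketch: flag accounts not ready for the Copilot
Notebook Study Guide. Assumes user dicts shaped like Microsoft Graph
`user` objects with `ageGroup` and `licenseDetails` already fetched."""

def copilot_blockers(user: dict) -> list[str]:
    """Return the reasons this account would be blocked from Study Guide."""
    reasons = []
    # Entra ID age controls: only "Adult" or "NotAdult" (13-18) accounts
    # may use Copilot; "Minor" accounts are blocked by the service.
    age_group = user.get("ageGroup")
    if age_group not in ("Adult", "NotAdult"):
        reasons.append(f"ageGroup is {age_group!r} (must be Adult or NotAdult)")
    # License check: look for a Copilot service plan among assigned licenses.
    # The "COPILOT" substring match is an assumption for illustration only.
    plans = [
        plan.get("servicePlanName", "")
        for lic in user.get("licenseDetails", [])
        for plan in lic.get("servicePlans", [])
    ]
    if not any("COPILOT" in name.upper() for name in plans):
        reasons.append("no Microsoft 365 Copilot service plan assigned")
    return reasons

if __name__ == "__main__":
    # Hypothetical sample records; servicePlanName value is illustrative.
    users = [
        {"userPrincipalName": "teacher@contoso.edu", "ageGroup": "Adult",
         "licenseDetails": [{"servicePlans": [
             {"servicePlanName": "M365_COPILOT"}]}]},
        {"userPrincipalName": "student@contoso.edu", "ageGroup": "Minor",
         "licenseDetails": []},
    ]
    for u in users:
        problems = copilot_blockers(u)
        print(f"{u['userPrincipalName']}: "
              + ("ready" if not problems else "; ".join(problems)))
```

In practice you would feed this from a paged Graph query rather than hard-coded records, and reconcile any flagged accounts (adjusting Age Group fields or assigning licenses) before announcing the feature to users.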