Note: This article covers age limits and parental controls for Microsoft Copilot when you're signed in with a personal Microsoft account. It does not apply to Microsoft 365 Copilot; for more information on that, read this article.

Your safety and privacy matter. When you use Microsoft Copilot ("Copilot"), we want you to feel confident and informed. This page is designed to help young people understand how Copilot works, what protections are in place, and how to use it responsibly. It also includes tips your parents and guardians may find helpful. 

What is Artificial Intelligence (AI)?

Artificial Intelligence, or AI, is a machine-based application that makes predictions, recommendations, or decisions, generates content, or takes actions based on goals set by people. Think of it as smart software that can learn, solve problems, understand what you say or type, and make sense of images or sounds. Copilot is an AI that can do things like respond to your voice instructions, analyze information, write text, create images, and a lot more.

Can you use Copilot?

Users between 13 and 18 years old, subject to minimum age requirements that vary by country or region, can use Copilot in a variety of ways. For example, they can use Copilot to ask questions, get help with homework, summarize information, brainstorm ideas, create images, write stories, or learn new things.

For young people, Copilot may use data differently or limit certain features. For example, Copilot doesn’t train AI models on young people’s data, and features like personalization are not available.

How can you use Copilot safely?

Copilot can be helpful, but it's not a person. Copilot doesn't have feelings, life experience, or human judgment. It can sometimes get things wrong, make mistakes, or misunderstand what you mean. When Copilot provides information, it is designed to use high-quality sources and include citations so you can check where the information came from.

Copilot includes extra protections for young people, but it's still important to remember that AI is a tool, not a friend or a substitute for advice from a trusted adult. This is why Copilot reminds you that it's an AI and encourages you to check for mistakes.

If you’re ever unsure about something Copilot generates, or you’re dealing with sensitive or important information, talk to a trusted adult first. This could be a parent, guardian, teacher, or another adult you rely on. They can help you think things through and provide emotional support in a way an AI can’t.

Remember: AI is not a substitute for seeking help from trusted adults in your life.

How does Copilot use your information?

Copilot uses what you type to generate a response to your question. For young people, Copilot does not personalize results, does not show personalized ads, and does not use your conversations for model training. This means your chats aren’t used to build profiles about you.

While we take steps to protect your data from misuse, there's always risk in sharing data online. Also keep in mind that some Copilot features, like Groups, may allow other users to see your chats if you've shared your conversation or joined a group.

Please be cautious about what content you upload to Copilot, and avoid sharing sensitive, confidential, or identifying information, like your address, logins and passwords, phone number, or school information, as well as personal or identifying information about other people.

How does Copilot handle sensitive topics?

Copilot is designed to give safe, reliable answers to everyone, but for younger users it adds extra protections. For certain sensitive topics, such as mental health, substance use, or adult content, Copilot tailors responses to be age-appropriate, including by offering general guidance, directing you to safer information sources, or encouraging you to consult a trusted adult.

Copilot also has built-in systems that help keep you safe. It checks for certain types of risky or harmful content and is designed to reply in a way that avoids exposing you to inappropriate or unsafe content.

If you encounter unsafe, inappropriate, or illegal content, you may also report it here.

How does Copilot protect your privacy?

Your privacy really matters. When you use Copilot, it analyzes what you say or type in order to answer your questions, help you learn, or get things done. Microsoft works hard to protect this information and use it responsibly. It keeps this information to make Copilot work properly, keep it safe, and fix problems when something goes wrong.

You are in control of your Copilot conversations. Copilot doesn't have access to your photos, messages, or anything else on your device unless you upload it to Copilot. You can delete your chats if you don't want them saved anymore.

If you are under 18, your chats are not used to train AI models. You may see ads while using Copilot, and these ads are based on what you are asking about right now, not on other information about you.

To protect yourself and others, avoid sharing personal information like your full name, address, passwords, school name, personal details about other people, or images of yourself or others. If you’re unsure whether something is appropriate to discuss or upload, it’s always a good idea to check with a trusted adult first. 

What does Microsoft do to keep Copilot safe?

Copilot is built using Microsoft’s responsible AI approach, which means the system is designed, tested, and monitored with safety as a priority. Before Copilot is released, it goes through extensive testing by experts who look for risks, including attempts to misuse the system or generate harmful content. Copilot uses safety filters and clear rules to reduce the chance of unsafe, offensive, or misleading content appearing. Every message you send is checked through these safeguards before Copilot responds. 

Microsoft also monitors how Copilot performs, reviews user feedback, and updates its systems regularly to improve accuracy and safety over time. The goal is to make Copilot helpful, safe, and transparent for everyone, including young people.

You can read more about this in our Transparency Note for Copilot.

How do we monitor Copilot for abuse?

To help keep Copilot safe, Microsoft uses systems that help detect harmful or abusive behavior, such as trying to generate dangerous content or breaking our Terms of Use.

If something like that happens, Copilot may block your request or limit your access to Copilot or certain features. Humans may review your conversations in Copilot for safety, legal, product improvement, or troubleshooting purposes. These measures help protect our users and maintain a respectful, safe environment.

What parental controls does Copilot have?

Your parent or guardian may be able to manage or limit your access to Copilot using tools like Microsoft Family Safety or other device-based parental controls. This might include screen-time limits or blocking the Copilot app. These kinds of guardrails are intended to help keep you safe and can help you and your family have conversations about your life online.
