Your School's AI Policy: What Every Parent Needs to Know and Ask
Last updated: March 2026
Most schools are making up their AI policies as they go. Only 45% of principals report having a formal AI policy at their school or district [1]. Meanwhile, 62% of students are already using AI for homework [2], and 96% of elementary school families say their school has told them nothing about AI policies at all [3].
Your school may have a thoughtful, detailed AI framework. Or it may have a vague paragraph buried in the student handbook. Or nothing at all. Here's what to look for, what to ask, and what a good policy actually looks like — so you can advocate for your kids whether your school is ahead of the curve or behind it.
What's Happening Right Now
The NYC Framework: A National Model
On March 24, 2026, the New York City Department of Education released the most comprehensive K-12 AI policy from any major U.S. school system [4]. It uses a traffic-light framework that's become a reference point for districts nationwide:
- Red (Prohibited): AI cannot be used for grading, disciplinary decisions, creating IEPs (Individualized Education Programs), or behavioral monitoring of students.
- Yellow (Use with Caution): Student use for research and projects — permitted but with grade-level-appropriate guidelines and teacher oversight.
- Green (Encouraged): Brainstorming, lesson planning, drafting communications, and supporting students with disabilities.
This matters because NYC's system serves over 1 million students. When the largest school district in the country sets a standard, other districts pay attention.
The State Landscape
At least 28 states have now published some form of AI guidance for K-12 schools [5]. But "guidance" varies wildly:
- Ohio requires every district to have an AI policy in place by July 1, 2026 — one of the first mandates with a hard deadline [6].
- Tennessee has required districts to create AI policies since March 2024, making it one of the earliest states to act [7].
- California, Virginia, and North Carolina have published advisory frameworks, but compliance is voluntary.
- Many states have issued no AI-specific education guidance at all.
The Federal Picture: Almost Nothing
Federal regulation of AI in education is "virtually nonexistent," according to Education Week's January 2026 analysis [8]. In April 2025, President Trump signed an executive order creating a White House Task Force on AI in Education, focused on promoting AI adoption rather than regulating it [9]. There is no federal law specifically governing how AI is used in K-12 classrooms.
This means your school district is largely on its own. Some districts are doing excellent work. Many are doing nothing.
What Europe Is Doing (and Why It Matters)
The EU AI Act classifies education as a "high-risk" sector, with compliance deadlines beginning in August 2026 [10]. Any AI tool used in European schools will need to meet strict transparency, accuracy, and human-oversight requirements. U.S. companies that sell education technology internationally are already adapting their products to meet these rules — which may eventually raise the bar for American schools too.
A Parent's Traffic Light Framework
Until your school has a clear policy, borrow the traffic-light idea for conversations with your kids and their teachers: green for AI that explains, quizzes, or brainstorms alongside your child; yellow for AI help on graded work, which should be disclosed and discussed; red for AI doing the assignment itself.
The key question for any AI use: "Did the student learn?" If AI helped them understand something better, it's working. If AI did the work so the student didn't have to think, it's not.
10 Questions to Ask Your School
Whether you're at a PTA meeting, parent-teacher conference, or emailing your principal, these questions will tell you where your school stands.
- "Does our school or district have a written AI policy?" If the answer is no, that's the most important thing to push for. A school without an AI policy in 2026 is like a school without a smartphone policy in 2015: behind and vulnerable.
- "Where can I read the policy?" A policy that exists but isn't publicly available to parents is almost as bad as no policy at all. It should be posted on the school website or included in the student/parent handbook.
- "Which AI tools are approved for classroom use, and how were they evaluated?" Good schools vet tools for privacy, accuracy, and age-appropriateness before teachers use them. Ask whether there's a review process.
- "What student data do these AI tools collect, and who has access to it?" AI tools often collect conversation logs, usage patterns, and performance data. You deserve to know what data your child generates and where it goes.
- "How are teachers being trained to use AI and teach about AI?" Half of teachers have had at least one AI professional development session, but 85% still feel unprepared [11]. Ask what training looks like and how often it happens.
- "What's the policy on students using AI for homework and assignments?" Look for specifics, not just "use your best judgment." Good policies define what's acceptable, what requires disclosure, and what's prohibited — by grade level.
- "How does the school handle academic integrity when AI is involved?" AI detection tools are unreliable. Ask whether the school is focused on detection (flawed approach) or on assignment design that makes AI shortcuts less useful (better approach).
- "Is the AI policy different for students with disabilities or IEPs?" AI can be a powerful accessibility tool. NYC's framework specifically greenlights AI for supporting students with disabilities. Your school should have thought about this.
- "How often is the AI policy reviewed and updated?" AI changes fast. A policy written in 2024 may already be outdated. Look for at least annual reviews with input from teachers, parents, and — for older students — students themselves.
- "How does the school communicate AI policy changes to families?" Given that 96% of elementary families report no school communication about AI policies [3], this question alone can spark change.
What a Good AI Policy Looks Like vs. a Lazy One
Not all AI policies are created equal. Here's how to evaluate what your school has — or what you should advocate for.
The reality: A total AI ban doesn't protect students — it just ensures they use AI without guidance. And a no-guardrails approach leaves teachers and families without the structure they need. The goal is thoughtful, specific, and regularly updated policies.
The Privacy Landscape: What Parents Need to Know
COPPA: New Rules, New Protections
The Children's Online Privacy Protection Act (COPPA) was amended in January 2025, with full compliance required by April 22, 2026 [12]. The key change: AI companies that train their models on children's data now need separate, explicit parental consent to do so. This is a significant shift — previously, general terms-of-service acceptance was often considered sufficient.
What this means for you: if your child uses an AI tool, the company must get your specific permission before using your child's conversations or data to train their AI models.
FERPA: Important but Incomplete
The Family Educational Rights and Privacy Act (FERPA) governs access to education records, but it was written decades before AI existed. It lacks clear cybersecurity requirements and doesn't specifically address AI-generated data about students [13]. If your school uses AI tools that create performance profiles, learning assessments, or behavioral predictions about your child, it's not entirely clear how FERPA applies.
What AI Tools Actually Collect
When your child uses an AI chatbot or AI-powered educational tool, the platform may collect:
- Conversation logs — everything your child types or says
- Usage patterns — when they use it, how long, what features
- Performance data — what they get right and wrong, learning pace
- Device information — location, browser, operating system
- Behavioral inferences — AI-generated predictions about learning style, attention, and ability
What to Watch For
- Ask whether the school has signed a Student Data Privacy Agreement (SDPA) with every AI vendor. These agreements specify what data is collected, how it's used, and when it's deleted.
- Check whether the AI tool has been vetted by a third party. Organizations like the Student Data Privacy Consortium review edtech tools for privacy compliance.
- Find out if your child can use the tool without creating a personal account. School-managed accounts with SSO (single sign-on) are safer than kids signing up with personal email addresses.
- Ask about data retention. How long is your child's data kept? Is it deleted when they leave the school?
Parent's AI Policy Checklist
Use this checklist to evaluate your school's AI readiness. Print it, bring it to your next PTA meeting, or email it to your principal.
School AI Policy Checklist for Parents
Policy Basics
- Does the school have a written AI policy? Yes / No / Don't Know
- Is the policy available to parents online or in the handbook?
- Does it cover both teacher use AND student use of AI?
- Does it include grade-level-appropriate guidelines (not one-size-fits-all)?
- Has it been updated within the last 12 months?
Student Rules
- Are specific AI tools approved for classroom use?
- Are there clear rules for when students can and cannot use AI on assignments?
- Does the policy address AI and academic integrity without relying solely on detection software?
- Are there separate guidelines for students with disabilities or IEPs?
Privacy and Data
- Has the school signed data privacy agreements with AI vendors?
- Do parents receive notification before new AI tools are introduced?
- Can you opt your child out of AI tool usage?
- Is there a clear data retention and deletion policy?
Teacher Support
- Have teachers received AI-specific training?
- Is training ongoing (not a one-time workshop)?
- Do teachers have guidance for AI-inclusive assignment design?
Communication
- Has the school communicated its AI policy to families?
- Is there a process for parents to give feedback on the policy?
- Are parents notified when the policy changes?
Scoring: Count your "yes" answers across all 19 questions. 15+ = your school is ahead of most. 10-14 = good foundation, room to improve. Under 10 = time to advocate for change.
How to Advocate for Better AI Policies
If your school scored low on the checklist — or if no policy exists at all — here's how to push for change without being dismissed as "that parent."
Step 1: Start with Questions, Not Demands
Approach your school administration from a place of partnership. Emails like "I noticed our school doesn't have a public AI policy yet — is one in development?" get better responses than "Why haven't you done anything about AI?"
Step 2: Bring a Constructive Proposal
Don't just point out the problem — offer a starting point. Share NYC's traffic-light framework as a model. Reference your state's guidance (if it exists). Offer to help form a parent-teacher AI committee.
Step 3: Make It a PTA/PTO Issue
Request a 15-minute slot at the next PTA meeting to discuss school AI policy. Frame it around parent concerns: "How many of you know what your child's school policy on AI is?" (Most hands will stay down.) Present the checklist from this article and propose next steps as a group.
Step 4: Connect with Other Parents
Eight in 10 parents want more guardrails on AI in education [14]. You're not alone. A request from one parent is easy to sideline. A request from a group of parents gets a meeting with the superintendent.
Step 5: Request Transparency Reports
Ask the school to publish an annual "AI in Education" transparency report covering: which AI tools are in use, what student data they collect, what training teachers have received, and how the policy has evolved. Public reporting creates accountability.
Step 6: Use Existing Resources
You don't have to build a framework from scratch. These organizations provide free AI education policy resources:
- TeachAI — a collaborative effort involving ISTE, ETS, Khan Academy, and others. Provides free AI policy guidance and frameworks for schools.
- ISTE (International Society for Technology in Education) — publishes AI policy recommendations and runs training programs for educators.
- Center for Democracy & Technology — focuses on student privacy rights in the age of AI.
- Your state's Department of Education — check whether your state has published AI guidance. If it has, your district should be following it.
Step 7: Follow Up
Policies don't happen overnight, but they also shouldn't take forever. Set a specific follow-up date: "Can we revisit this at the October PTA meeting?" Polite persistence works.
The Equity Problem No One's Talking About
There's a training gap between high-poverty and low-poverty school districts [11]. Schools with more resources are training teachers, vetting tools, and creating thoughtful policies. Schools with fewer resources are either banning AI entirely or ignoring it — neither of which serves students.
Meanwhile, wealthy families are paying for AI tutoring tools and giving their kids a head start. This means AI policy isn't just a technology issue — it's an equity issue. If your school doesn't provide structured access to AI tools, the students who benefit most will be the ones whose parents can afford to provide access at home.
Advocating for a good AI policy at your school isn't just about your child. It's about making sure every student gets guided, thoughtful access to the most important technology shift in education since the internet.
What to Do Next
- Find out if your school has an AI policy. Check the school website, student handbook, or email your principal directly.
- Use the checklist above to evaluate what exists (or identify what's missing).
- Have the AI conversation at home. Read our guide on How to Talk to Your Kids About AI: A Practical Age-by-Age Guide for specific conversation starters by age group.
- Understand the tools your kids are using. Check out 5 AI Study Tools That Actually Help Kids Learn (Not Just Cheat) for honest reviews of the most popular options.
- Raise it at your next PTA meeting. Bring the 10 questions and the checklist. You'll be surprised how many other parents share your concerns.
Sources
[1] EdWeek Research Center, "Principals and AI Adoption Survey" (2025) — edweek.org
[2] RAND Corporation, "Student Use of Generative AI" (December 2025) — rand.org
[3] National Parents Union, "Parents and AI in Schools Survey" (February 2026) — nationalparentsunion.org
[4] NYC Department of Education, "Artificial Intelligence Guidance for NYC Schools" (March 24, 2026) — schools.nyc.gov
[5] AI Policy Observatory, "State AI Guidance Tracker for K-12 Education" (2026) — teachai.org
[6] Ohio Revised Code, H.B. 315 — AI in Education Act (2025) — legislature.ohio.gov
[7] Tennessee Senate Bill 2651, "AI in Education Policy Requirement" (March 2024) — tn.gov
[8] Education Week, "Federal AI Regulation in Education: The State of Play" (January 2026) — edweek.org
[9] White House, Executive Order on AI in Education (April 2025) — whitehouse.gov
[10] European Commission, "EU AI Act: High-Risk Classification for Education" — artificialintelligenceact.eu
[11] RAND Corporation, "Teacher AI Professional Development: Readiness and Gaps" (Fall 2025) — rand.org
[12] Federal Trade Commission, "COPPA Amendments: AI and Children's Data" (January 2025) — ftc.gov
[13] U.S. Department of Education, "FERPA and Emerging Technology" — studentprivacy.ed.gov
[14] National Parents Union, "Parent Attitudes on AI in Education" (February 2026) — nationalparentsunion.org
Some links in this article are affiliate links. If you sign up through them, we earn a small commission at no extra cost to you. We only recommend tools we genuinely believe in. See our full Affiliate Disclosure.