Avery Balasbas
Jan 6, 2026
Key takeaways
FERPA covers the records your schools create; COPPA kicks in the moment students under 13 interact with third-party online platforms
Since 2025, vendors can no longer assume consent for advertising; they must ask parents explicitly and document every decision
A simple shared spreadsheet tracking AI tools by data type, storage location, and retention period prevents compliance gaps from hiding in plain sight
Vendors who hesitate to provide compliance documentation are telling you everything you need to know about their privacy practices
Staff, students, and parents each need different privacy training; a single all-hands session won't build real understanding. Tailored training also ensures that multilingual families, diverse cultural communities, and students with varying digital access levels receive information that is relevant, accessible, and affirming.
Your IT director reviews the district's new AI writing assistant and notices the privacy policy mentions "data sharing with third-party analytics partners." The superintendent asks if this complies with student privacy laws, but the policy documents read like legal contracts. Understanding which protections apply and what questions to ask shouldn't require a law degree, and educators deserve clear explanations that honor their professional expertise, cultural knowledge, and the diverse communities they serve.
FERPA and COPPA work together to protect student information, but they focus on different areas. When your district implements AI tools, both laws shape what data you can collect and how vendors must handle it. This guide breaks down what changed under the 2025 COPPA amendments and provides practical steps to ensure your district's AI infrastructure protects student privacy.
These steps also help districts examine how privacy protections can advance equity, particularly for students whose data has historically been misused, misunderstood, or undervalued.
Understanding FERPA and COPPA in your district
These two federal laws serve complementary yet distinct roles in protecting student privacy. FERPA focuses on educational records your schools already maintain, while COPPA governs what happens when students interact with online platforms and services. Knowing where each applies helps your administrative team ask the right questions when evaluating AI tools and drafting vendor agreements.
It also empowers teams to consider how these laws protect students' identities, especially students who have been historically underserved by a widening digital divide.
What FERPA protects
FERPA protects education records at any school that receives federal funding. It gives parents the right to access their children's records, request corrections, and control who sees this information. Gradebooks, attendance records, and disciplinary notes are all protected under FERPA.
These records also hold important narratives about students’ experiences; protecting them ensures families retain agency over how their children’s stories are shared and interpreted.
What COPPA regulates
COPPA regulates how websites and online services collect personal information from children under 13. When students log in to an AI tutoring platform or submit assignments through a digital tool, COPPA determines what data these services can collect and how they must obtain parental consent.
Educators can use this as an opportunity to teach students critical digital literacy, helping them understand who designs digital tools, whose perspective shapes them, and how to navigate online spaces responsibly.
The 2025 COPPA amendments
The 2025 amendments shifted the default from opt-out to opt-in consent. Vendors must now obtain specific parental permission before using student data for advertising or sharing it with third parties. They also need to document every consent decision and justify any data they retain beyond immediate educational purposes.
For example, imagine your district's elementary curriculum team wants to pilot an AI-powered reading comprehension tool. Under the new rules, the vendor must get explicit parental permission before using student reading data for personalized advertisements or selling that information to publishers. The tool can still track reading progress for instructional purposes, but commercial use requires separate, documented consent. These distinctions help families make informed choices about how their children’s data is used beyond learning.
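If your team wants to see what that documentation trail might look like in practice, here is a minimal sketch of a per-decision consent record in Python. The field names, the student ID format, and the ReadingHelper AI tool are hypothetical illustrations, not a format required by the rule; the point is simply that each decision captures who, what, why, and when.

```python
# Sketch of one documented consent decision, matching the amendments'
# "document every consent decision" requirement. All field names and
# values are illustrative assumptions.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class ConsentRecord:
    student_id: str       # district ID, not the student's name
    tool: str
    purpose: str          # "instructional" vs. "advertising/third-party sharing"
    parent_decision: str  # "opt-in" or "declined"
    recorded_at: str      # timestamp, so the decision can be audited later

record = ConsentRecord(
    student_id="S-10482",
    tool="ReadingHelper AI",
    purpose="advertising/third-party sharing",
    parent_decision="declined",
    recorded_at=datetime.now(timezone.utc).isoformat(),
)
print(record)
```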
Mapping your district AI data lifecycle with regular check-ins
Start by creating a simple data flow diagram that shows where student information enters your AI systems, how it moves between platforms, and when it is deleted. Your IT team can sketch this on paper or use free diagramming tools. Once you have that baseline, build a rhythm of regular reviews to keep it current.
Each week, have your technology coordinator list every AI tool students used and document three things for each:
What data it collects (names, grades, writing samples, behavior patterns)
Where that data lives (vendor servers, district network, cloud storage)
Who can access it (teachers, administrators, the vendor's support team)
Consider also documenting whether each tool includes culturally responsive features, multilingual options, or accessibility settings such as text-to-speech, so all students benefit from the technology. This quick audit surfaces issues before they become compliance problems.
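Here is one way that weekly audit could live as a shared log, sketched in Python. The file name ai_tool_inventory.csv, the ToolRecord fields, and the ReadingHelper AI example are all illustrative assumptions rather than a required format; a plain spreadsheet with the same columns works just as well.

```python
# Sketch of the weekly AI-tool audit as an append-only CSV log.
# File name, columns, and example values are illustrative assumptions.
import csv
import os
from dataclasses import dataclass, asdict, fields
from datetime import date

@dataclass
class ToolRecord:
    tool_name: str
    data_collected: str       # e.g., "names, grades, writing samples"
    storage_location: str     # e.g., "vendor cloud servers"
    who_can_access: str       # e.g., "teachers, vendor support team"
    retention_period: str     # e.g., "deleted at end of school year"
    accessibility_notes: str  # e.g., "text-to-speech, Spanish interface"

LOG_FILE = "ai_tool_inventory.csv"  # hypothetical shared file

def append_weekly_audit(records: list[ToolRecord]) -> None:
    """Append this week's rows, writing a header if the file is new."""
    fieldnames = [f.name for f in fields(ToolRecord)] + ["reviewed_on"]
    write_header = not os.path.exists(LOG_FILE)
    with open(LOG_FILE, "a", newline="") as fh:
        writer = csv.DictWriter(fh, fieldnames=fieldnames)
        if write_header:
            writer.writeheader()
        for record in records:
            writer.writerow({**asdict(record),
                             "reviewed_on": date.today().isoformat()})

append_weekly_audit([
    ToolRecord("ReadingHelper AI", "names, reading scores", "vendor cloud",
               "teachers, vendor support", "deleted at end of school year",
               "text-to-speech"),
])
```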
Monthly, update your diagram when schools adopt new tools or discontinue old ones. Check that retention schedules align with district policy; most student work should be deleted within one school year unless parents request longer retention for IEP documentation or portfolio requirements. This is also a good time to verify encryption standards: if an AI tool doesn't mention "256-bit AES encryption" or a similar security standard, have your contracts team ask the vendor directly.
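To make the monthly retention check concrete, here is a minimal sketch that flags any tool holding student data past a one-school-year window. The tool names, dates, and the 365-day window are illustrative assumptions; your district's retention policy sets the real numbers.

```python
# Sketch of a monthly retention check against a one-school-year default.
# Tool names, dates, and the 365-day window are illustrative assumptions.
from datetime import date, timedelta

DEFAULT_RETENTION = timedelta(days=365)  # "deleted within one school year"

# Hypothetical: the oldest student data each tool still holds.
tools = {
    "ReadingHelper AI": date(2025, 9, 2),
    "EssayCoach": date(2024, 8, 26),
}

today = date.today()
for name, oldest_data in tools.items():
    if today - oldest_data > DEFAULT_RETENTION:
        print(f"FLAG: {name} holds student data past the retention window; "
              "confirm deletion or a documented parent-requested exception "
              "(IEP documentation or portfolio requirements).")
```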
Vet AI vendors with this focused checklist
Before signing contracts or approving pilot programs, review vendor privacy practices. The process typically requires several hours spread across your administrative team:
Encryption verification: Confirm the vendor uses industry-standard encryption (256-bit AES or stronger) for data at rest and in transit. Request their security whitepaper if the contract doesn't spell out the details.
Retention policy review: Check how long the vendor keeps student data after your district stops using their service. Best practice is to delete immediately upon contract termination, though some vendors may need 30-60 days for technical processing.
FERPA and COPPA compliance documentation: Request copies of recent compliance audits, certifications, or third-party assessments. Look for vendors who undergo annual reviews and publish transparency reports.
Parental consent process: Verify that the vendor provides explicit, customizable consent forms. The process should explain what data gets collected, why it's needed, and how parents can revoke consent.
Incident response plan: Request documentation showing their typical timeline for breach notification (best practice is within 72 hours) and what support they provide during incidents.
As part of your vetting, ask how the vendor accounts for algorithmic bias, data representation across diverse cultural and linguistic groups, and safeguards that prevent misuse or misinterpretation of student data.
Picture a district curriculum committee evaluating AI writing assistants. They discover that one vendor stores students' essays indefinitely for "product improvement," raising serious concerns about student authorship and the risk that PII or student work could be absorbed into datasets without consent; that vendor is removed from consideration. Another provides independent auditor certificates confirming FERPA compliance and offers immediate data deletion; that one makes the shortlist.
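If your committee wants to record these evaluations consistently, the five checklist items above translate naturally into a structured record. The sketch below uses hypothetical vendor names (EssayCoach, WriteBot) and a strict all-items-must-pass rule; your district might instead weight items or track partial findings.

```python
# Sketch of the five-point vendor checklist as a structured record.
# Vendor names and answers are hypothetical.
CHECKLIST = [
    "encryption_256bit_aes",
    "deletes_data_on_termination",
    "ferpa_coppa_audit_docs",
    "explicit_parental_consent_forms",
    "breach_notification_within_72h",
]

vendors = {
    "EssayCoach": {item: True for item in CHECKLIST},
    "WriteBot": {**{item: True for item in CHECKLIST},
                 "deletes_data_on_termination": False},  # keeps essays indefinitely
}

def shortlist(candidates: dict) -> list[str]:
    """A vendor must satisfy every checklist item to advance."""
    return [name for name, answers in candidates.items()
            if all(answers.get(item, False) for item in CHECKLIST)]

print(shortlist(vendors))  # ['EssayCoach']
```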
Building your written data security program step by step
The 2025 COPPA amendments require formal, written security programs tailored to the sensitivity of data your district handles. You can develop this incrementally over several weeks, starting with the foundation and adding layers as your team builds capacity.
First, designate two security coordinators: one from IT and one from administration. The IT coordinator handles technical safeguards and encryption, while the admin coordinator manages consent forms and parent communication. Having clear ownership prevents tasks from falling through the cracks.
Next, document the specific protections your district has implemented (a structured sketch follows this list):
Access controls: Only staff with legitimate educational needs can view student data
Encryption standards: All data uses 256-bit AES encryption or equivalent
Password requirements: Staff must use unique passwords with multi-factor authentication
Physical security: Devices with student data stay in locked classrooms overnight
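One low-effort way to keep this documentation auditable is to store the program skeleton as a structured file your team keeps under version control. The sketch below is a minimal illustration; the section names, wording, and the security_program.json file name are assumptions, not a mandated format.

```python
# Sketch of the written security program as a structured, versionable record.
# Section names, wording, and the output file name are assumptions.
import json

security_program = {
    "coordinators": {
        "technical": "IT coordinator: technical safeguards, encryption",
        "administrative": "Admin coordinator: consent forms, parent communication",
    },
    "safeguards": {
        "access_controls": "staff with legitimate educational need only",
        "encryption": "256-bit AES or equivalent",
        "passwords": "unique passwords with multi-factor authentication",
        "physical_security": "devices with student data locked overnight",
    },
    "reviews": {
        "quarterly": "evaluate new threats, update protections, log findings",
        "annual": ["October", "March"],
    },
}

# Save to the shared compliance folder so audits can see revision history.
with open("security_program.json", "w") as fh:
    json.dump(security_program, fh, indent=2)
```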
Finally, build an ongoing assessment into your calendar. Plan quarterly reviews where your security team evaluates new threats and updates protections, documenting findings in a shared folder so you can demonstrate continuous improvement during audits.
Set two annual review dates (October and March work well) to evaluate overall program effectiveness, review incidents, and update policies for new tools or regulations. Consider assigning a staff member to monitor news on education privacy. When new guidance emerges from the Department of Education or FTC, your team can assess whether existing policies need adjustment before issues arise.
Training staff, students, and parents effectively
Practical privacy training reaches everyone who interacts with student data without requiring all-day workshops. Break training into focused sessions tailored to each audience.
Staff training (90 minutes, August PD day):
First 30 minutes: Review FERPA and COPPA basics with real scenarios from your schools
Next 30 minutes: Practice writing AI prompts that avoid student names or identifying information
Final 30 minutes: Walk through your data security program and daily responsibilities
Create a one-page decision tree for staff: Does the tool collect student work? Does it track behavior? Does it share data with third parties? Each "yes" triggers specific privacy steps.
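That decision tree translates directly into simple logic that staff can mirror on paper. The sketch below is a minimal illustration of the flow; the specific triggered steps are examples, not a complete legal checklist.

```python
# Sketch of the one-page staff decision tree. The triggered steps are
# illustrative examples, not a complete legal checklist.
def privacy_steps(collects_student_work: bool,
                  tracks_behavior: bool,
                  shares_with_third_parties: bool) -> list[str]:
    steps = []
    if collects_student_work:
        steps.append("Strip student names/IDs; confirm FERPA-compliant storage.")
    if tracks_behavior:
        steps.append("Verify the retention schedule and who can view behavior data.")
    if shares_with_third_parties:
        steps.append("Require documented opt-in parental consent (2025 COPPA rules).")
    return steps or ["No extra steps; log the tool in the weekly inventory."]

print(privacy_steps(collects_student_work=True,
                    tracks_behavior=False,
                    shares_with_third_parties=True))
```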
Student training (45-minute class period): Work with teachers to help middle and high school students recognize when online services request unnecessary personal data. Engage students in discussions about why certain platforms collect data, whose experiences are prioritized in AI design, and how they can advocate for their digital rights. For elementary students, use age-appropriate stories and scenarios about online privacy.
Parent training (45-minute evening session): Avoid legal jargon: explain that FERPA protects report cards and attendance, while COPPA regulates online quizzes and educational games. Publish a FAQ document on your district website covering common questions about data retention, advertising, and breach procedures. Ensure that these materials are translated and culturally adapted so multilingual families can participate fully and confidently in privacy conversations.
How SchoolAI supports your compliance efforts
SchoolAI helps your district maintain FERPA and COPPA compliance through built-in privacy controls and transparent data practices. The platform is FERPA and COPPA compliant, with SOC 2 and 1EdTech certifications demonstrating regular security audits.
Mission Control provides administrators and teachers with visibility into students' interactions with AI without storing unnecessary personal information. The platform automatically documents consent decisions and provides parents with clear explanations of data collection practices.
When teachers create Spaces, privacy settings let your district control what information the AI can access, set automatic deletion schedules aligned with retention policies, and receive immediate notification when settings require attention.
Protect student privacy without slowing down innovation
FERPA and COPPA compliance doesn't mean avoiding AI tools entirely; it means choosing vendors carefully, documenting data practices, and keeping everyone informed about privacy protections. The 2025 amendments strengthen student privacy by requiring explicit consent and justified data retention, providing your district with more straightforward guidelines for evaluating new technologies.
Start with your weekly data lifecycle review, build your vendor vetting checklist, and schedule focused training sessions for each audience. These practices create a foundation where innovation and privacy protection work together. They ensure that innovation benefits all learners and that privacy protections uphold student agency and equity.
Ready to explore how privacy-first AI tools can support your schools? Explore SchoolAI to see compliance features designed specifically for K-12 environments.