Education Tech & Student Privacy: AI Under FERPA
Schools are experimenting with AI tutors, chatbots, and grading tools. But when student data is involved, the Family Educational Rights and Privacy Act (FERPA) sets strict limits.
FERPA in plain English
FERPA is the U.S. law that protects the privacy of student education records. It gives parents control over access to those records, with rights transferring to students once they turn 18 or enroll in a postsecondary institution ("eligible students"). Schools must keep the records secure and may not disclose them without consent, except under specific exceptions.
Where AI creates new risks
- AI tutors storing student grades – If those records are stored on third-party servers, they may count as education records under FERPA.
- Chatbots handling school forms – Data entered into an AI helpdesk may include identifiable student information.
- Training data reuse – If vendors use student data to train general AI models, it could count as unauthorized disclosure.
Vendor responsibilities
Schools can share data with edtech vendors only if the vendors qualify as "school officials" under FERPA, which requires that the school retain direct control over the vendor's use and maintenance of the data — in practice, through a written contract. Without one, the vendor's use of student data may not be legally permitted. This applies equally to AI companies offering "free" or consumer-grade tools, which often lack such agreements. For more on vendor liability, see our guide on cyber vs. AI liability coverage.
Overlap with other laws
FERPA is the baseline, but in practice, it overlaps with other student privacy laws:
- COPPA – Governs how online services collect personal information from children under 13, requiring verifiable parental consent.
- GDPR (EU) – If a school processes the personal data of students located in the EU, GDPR obligations may apply alongside FERPA.
Takeaway
AI in education can be powerful, but student data is highly sensitive. Schools should treat AI vendors like any other data processor: require contracts, confirm data use restrictions, and ensure parents and students know how their data is being handled. Before implementing AI tools, consider asking your insurer these five key questions about AI risk coverage.