FERPA Doesn't Cover What Your Professor Just Pasted Into ChatGPT

Consumer AI tools are third parties. Almost nobody in higher education is connecting these dots.

Most people have heard of HIPAA. Fewer have heard of FERPA, the Family Educational Rights and Privacy Act, which does the same job for student records that HIPAA does for health data.

Here's the problem: FERPA and consumer AI tools are fundamentally at odds, and almost nobody in higher education is talking about it.

What FERPA actually does

Think of FERPA as a lock on your school file. Grades, transcripts, financial aid records, disability paperwork, disciplinary history. Anything in your academic record that could identify you. Schools that take federal funding (which is nearly all of them) have to protect that data and get written consent before sharing it with outside parties.

"Outside parties" is where AI tools walk in.

When a professor uploads a class roster to ChatGPT to draft feedback, that's student data leaving the school's control. When an administrator pastes a student's housing request into Claude to summarize it, that's a potential violation. When a counselor uses a consumer AI tool to draft notes on a student's probation case, those records have just been shared without consent.

FERPA's bar for what counts as a "disclosure" is low. And consumer AI tools (ChatGPT, Gemini, Claude, Copilot in their free and standard tiers) won't sign the data agreements the law requires for outside access.

The gap between policy and practice

A handful of schools have signed enterprise deals with AI providers that include data protection terms. Institutional versions of Copilot or Google Workspace that come with written promises about how data gets used. Those deals can build a real FERPA defense.

But they only cover school-issued accounts on school-managed platforms. They don't cover the professor grading essays on a personal ChatGPT account, the adjunct who doesn't have a school login, the TA running discussion boards through free Claude, or the department chair using Gemini on a personal laptop during finals week.

In every one of those cases, the school's FERPA duties don't vanish because the tool is personal rather than institutional. The data is still student data. The sharing is still sharing.

Nobody checks whether a teaching assistant's free AI account has a data processing agreement with the university. (Nobody even asks the question.)

When the AI builds the app, too

In February 2026, security researcher Taimur Khan found that an app built on Lovable (a platform that lets users create software through AI prompts, no coding needed) had exposed records for 18,697 users. That included 4,538 student accounts from K-12 schools and universities including UC Berkeley and UC Davis.

The cause: the AI-generated code had flipped the access logic. It blocked logged-in users and let everyone else in. The digital version of locking the front door and removing the back wall.
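To make the failure mode concrete, here is a minimal sketch of what an inverted access check looks like. The function names and shape are illustrative assumptions, not the actual Lovable-generated code; the point is how a single misplaced negation flips who gets in.

```python
# Hypothetical sketch of an inverted access check. Names and structure
# are illustrative, not taken from the actual breached app.

def intended_check(user_is_authenticated: bool) -> bool:
    """What the developer wanted: only logged-in users may read records."""
    return user_is_authenticated

def generated_check(user_is_authenticated: bool) -> bool:
    """What shipped: the condition is negated, so anonymous visitors
    pass the gate and logged-in users are turned away."""
    return not user_is_authenticated

# Anonymous visitor gets in; authenticated user is blocked.
assert generated_check(user_is_authenticated=False) is True
assert generated_check(user_is_authenticated=True) is False
```

A one-character logic error like this passes the most casual manual test ("the page loads") while failing the one that matters, which is why AI-generated auth code needs the same security review as hand-written code.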

The app had been featured on Lovable's discovery page with over 100,000 views. The developer hadn't written any of the security code themselves. The AI had. And the AI had gotten it backwards.

This isn't strictly a FERPA story. But it points at a trend higher education should be watching. AI tools are now building apps that handle student data. Those apps ship without security review. When they break, it's student records that spill.

The question that matters

The right question isn't "are our official AI vendors FERPA-compliant?" Most compliance teams are already on that. The right question is "what are our faculty actually using, and do we know what happens to student data when they use it?"

At most schools, the answer to that second question is no. The tools faculty use day-to-day aren't the ones IT has approved. They're personal accounts, free tiers, browser extensions that nobody vetted.

FERPA doesn't have an exception for AI tools the school doesn't know about.


BeatMask detects student records, names, ID numbers and other FERPA-covered data before it reaches AI tools. On your device, before the sharing happens.
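For a rough sense of what on-device detection means, here is a deliberately naive sketch that scrubs a few obvious FERPA-covered identifiers from text before it could be sent anywhere. The patterns and function names are assumptions for illustration only; this is not BeatMask's implementation, which would need far more than three regexes.

```python
# Illustrative only: a naive pattern-based redactor for a few obvious
# identifiers. The patterns below are assumptions, not a product spec.
import re

PATTERNS = {
    "STUDENT_ID": re.compile(r"\b\d{7,9}\b"),            # e.g. 7-9 digit campus IDs
    "EMAIL":      re.compile(r"\b[\w.+-]+@[\w-]+\.edu\b"),
    "SSN":        re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def redact(text: str) -> str:
    """Replace matches with labeled placeholders before any data leaves the device."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

print(redact("Jane Doe (ID 904112358, jdoe@berkeley.edu) requested housing."))
# → Jane Doe (ID [STUDENT_ID], [EMAIL]) requested housing.
```

Even this toy version shows the core idea: the redaction happens locally, so the AI tool only ever sees placeholders, and the question of whether the tool has a data agreement becomes moot for what was scrubbed.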

Data Privacy · FERPA · Education · AI Security