
The AI-Built App That Leaked 4,500 Student Records

An AI-built education app exposed 4,500 student records from K-12 schools and universities. The AI wrote the login code backwards. Nobody checked.

[Image: Guy Fawkes mask lying face-up on rain-wet pavement in a dark alley]

An education app that handled student accounts from K-12 schools, UC Berkeley, and UC Davis had a security flaw so basic it almost sounds made up. The login system was backwards. It blocked real users and let strangers in.

The app was built on Lovable, a platform that lets people create software by typing prompts instead of writing code. No developer wrote the login logic. The AI did. And the AI built a lock that worked in reverse.
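To make "backwards" concrete, here's a minimal sketch of what an inverted access check can look like. The names, types, and logic are illustrative assumptions, not the app's actual code:

```typescript
// Hypothetical sketch only — not the app's real code.
// Intended rule: a student may view a record only if they own it.
// A single inverted comparison flips that rule: the owner is blocked,
// and every other logged-in user is let through.

interface Session {
  userId: string;
}

interface StudentRecord {
  ownerId: string;
  name: string;
  email: string;
}

function canViewRecord(session: Session, record: StudentRecord): boolean {
  // BUG: "!==" should be "===". The code reads plausibly at a glance,
  // and every page still renders in a quick click-through test.
  return session.userId !== record.ownerId;
}
```

The failure is one flipped operator in otherwise plausible-looking code, which is why clicking around never catches it: the app loads, the buttons work, and the broken rule only surfaces when someone asks who the check actually admits.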

Security researcher Taimur Khan found 16 flaws in the app, six of them critical. The exposed data covered 18,697 users total, including 4,538 student accounts. Names, emails, school records. The app had been featured on Lovable's own discovery page with over 100,000 views. It was a showcase.

Lovable's response: users are responsible for their own security. Which is true in the same way that handing someone a car with no brakes and saying "drive safe" is true.

This is what happens when AI writes the code and nobody reviews it. The app worked. It looked right. It passed the kind of check where you click around and everything seems fine. But underneath, the most basic security question (who gets in and who doesn't?) was answered wrong. Not subtly wrong. Exactly wrong.

The tools that build software are getting faster. The part where someone checks whether that software is safe hasn't kept up. You'd never know the locks were broken just by using the app. That's the part that should worry parents most.


BeatMask detects student records and personal data before they reach AI tools. On your device, before anything gets shared.

Data Privacy · AI Security · Education