State Audit Exposes Severe Data Privacy Gaps in NYC Schools

A new state audit reveals major student data privacy gaps in New York City public schools. Learn what decentralized tracking and new AI tools mean for families.

Monday, May 4, 2026

Key Takeaways

  • A 2026 state audit found 141 data security incidents in New York City public schools from 2023 to 2025 and determined that the district lacks centralized tracking for its educational software.
  • Decentralized management of technology assets leaves districts vulnerable to cyberattacks. Officials cannot identify which students are affected when third-party vendors suffer a breach.
  • Federal laws like FERPA give parents the right to review student records. However, enforcement is weak. The government has never revoked federal funding for a school because of a violation.
  • As schools adopt artificial intelligence, security experts warn that districts must verify vendor data policies. Schools need to ensure student information is not harvested to train external AI models.

New York City public schools struggle to track and secure personal data collected on nearly 900,000 students. A recent state audit reveals systemic gaps in how the nation’s largest district manages educational software and third-party vendors.

What Happened

A new audit by State Comptroller Thomas DiNapoli found the New York City Department of Education lacks a centralized tracking system for software. This prevents district leaders from identifying compromised data during cyberattacks.

Auditors identified 141 data security incidents between January 2023 and February 2025. When hackers breached the student information platform PowerSchool in 2024, exposing the names and birth dates of over 3,000 students, central officials did not discover the breach until January 2025. Because the district lacks a central inventory, administrators had to contact schools individually to determine who was affected.

This follows a major attack on the grading platform Illuminate, which compromised the personal information of roughly 820,000 students during the 2021-22 school year. Auditors warned that these gaps leave the district at high risk of noncompliance with data privacy requirements.

The Bigger Picture

New York City’s vulnerability is a national problem. Managing software through fragmented spreadsheets creates liability, and experts recommend centralized asset management platforms so districts can maintain a defensible inventory of every application in use.

Data over-collection is a frequent issue. A playbook on data minimization suggests that districts collect more information than necessary when they fail to define a clear purpose before gathering data.

Vetting EdTech vendors requires more than paperwork. Technology leaders note that standard security checklists often result in a false sense of safety unless districts maintain ongoing control over their vendors. Many smaller software companies lack a deep understanding of federal privacy requirements. As we previously reported regarding the $17.25 million Naviance settlement, relying on third-party vendors without strict oversight exposes families to privacy violations.

The need to secure this data is increasing as schools adopt artificial intelligence. New York City released a preliminary AI framework that may increase the use of automated tools in classrooms. Before approving AI software, schools must verify what data is collected and whether student information is used to train external AI models.

What This Means for Families

Parents cannot assume that a school district knows where their child’s data resides. When a system relies on individual teachers or principals to manage software access without central oversight, the risk of a data breach increases.

While the Family Educational Rights and Privacy Act (FERPA) gives parents the right to inspect records, its enforcement mechanism is weak. The federal government has never revoked a school's funding over a FERPA violation. The responsibility for preventing data leaks falls on local school boards and district leaders.

What You Can Do

  • Request a software inventory: Ask your child's principal or school board for a complete, centralized list of all third-party applications used in the classroom.
  • Opt out of unnecessary data sharing: Review the district's annual privacy notices and opt out of directory information sharing to limit exposure.
  • Demand AI transparency: If your school uses AI for grading or lesson planning, ask administrators to confirm in writing that student data is not used to train third-party models.