Minnesota Schools Pilot AI Cameras to Evaluate Teachers

A Minnesota district is testing AI-powered cameras to observe teachers. Learn how this impacts privacy, feedback quality, and student data safety.

Wednesday, February 4, 2026

A Minnesota school district is turning to artificial intelligence to ease a critical shortage of administrator time for teacher evaluations. By using 360-degree cameras and AI analysis, administrators hope to provide faster, more detailed feedback to the new educators who need it most.

What Happened

Southwest Metro Intermediate District 288 is piloting an AI system called Evelyn to analyze classroom footage and automate the feedback process. The decision addresses a significant logistical challenge: 69 percent of the district's staff are in their first three years of teaching. These probationary teachers require frequent support, but human administrators often lack the capacity to observe every lesson. The district uses cameras from Owl Labs to capture audio and video, which the AI then compares against teaching rubrics.

Internal data from the district's beta phase showed that the gap between AI-generated scores and human administrator evaluations was remarkably small: only 0.07 points. This aligns with broader research into automated assessment, which has found that modern deep learning models can measure student engagement with variance consistently below 0.07.

The Bigger Picture

This pilot represents a growing trend where schools use technology to replicate expert judgment. The Danielson Framework, a widely used standard for teacher evaluation, has recently begun validating AI tools to ensure they assist rather than replace school principals. The goal is to maintain the integrity of professional feedback while handling the sheer volume of data required for effective coaching.

The pressure to automate is driven largely by state mandates that stretch school resources thin. Minnesota law imposes strict training requirements for paraprofessionals and requires intensive literacy screening and instruction. These compliance duties leave administrators with limited hours for in-person classroom observations, creating a "coaching bottleneck" that districts are desperate to clear.

However, the shift to AI is not without technical risks. While some systems show high stability, others are susceptible to generating fabricated responses or oversimplified feedback. Experts argue that for these tools to work safely, teachers must develop specific data-AI competence to understand how algorithms interpret their professional performance.

What This Means for Families

Privacy and Surveillance

Recording classrooms introduces complex privacy concerns. Under federal law (FERPA, the Family Educational Rights and Privacy Act), recordings that identify students can qualify as education records that require strict handling. While authorized systems can be secure, the rise of "Shadow AI," where teachers use unvetted free tools, poses a threat. In one reported case, 74 percent of teachers were found to be sharing student data with public AI platforms before their districts implemented secure workflows.

A New Kind of Oversight

Unlike a principal who might visit for 15 minutes, AI systems can analyze teacher and student activities continuously. This could lead to more objective data on how students engage with lessons, but it also raises questions about the scope of surveillance in learning environments. Parents should be aware that their child's behavior patterns could be part of the data set used to grade their teacher.

What You Can Do

  • Ask about data retention: Request your school district's policy on how long video footage is kept and whether it is deleted immediately after analysis.
  • Check the "human loop": Ensure that a human administrator reviews all AI-generated feedback before it becomes part of a teacher's permanent record.
  • Verify security: Ask if the AI tools used are "closed" systems hosted by the district or if data is sent to third-party servers for processing.