Google Brings Custom AI 'Gems' to K-12 Classrooms

Google recently added custom AI assistants called Gemini Gems to K-12 education accounts. Learn how teachers use them and what this means for student privacy.

Monday, March 9, 2026

In March 2025, Google added a new artificial intelligence tool called Gemini Gems to its education platform. The feature allows K-12 teachers and administrators to build custom AI assistants tailored to their specific classroom needs and district policies.

What Happened

Google initially launched Gemini Gems in August 2024 for standard users and recently expanded the feature into Google Workspace for Education accounts. Educators use the tool to program specific instructions and context into the AI. For example, a teacher can upload their district curriculum standards and ask the AI to generate daily activities that align with those rules.

Teachers use Gems to streamline repetitive tasks like creating quizzes and exit tickets. Physical education teachers build custom AI bots to generate accommodations for students with disabilities, finding alternative exercises that still meet lesson goals. School leaders also build Gems loaded with district board policies so new school board members can ask governance questions and get immediate answers. Educators now share their custom AI bots on community websites like EduGems, allowing teachers worldwide to copy and adapt successful tools.

The Bigger Picture

As we previously reported, Google is aggressively training millions of teachers to use artificial intelligence. Research highlights clear benefits for special education. AI acts as an on-demand accommodation tool by simplifying complex language on tests and generating personalized study guides. By automating the heavy administrative burden of drafting Individualized Education Program (IEP) documentation, AI frees teachers to reclaim hours for direct student interaction. For students with severe physical challenges, combining large language models with eye-tracking devices is a game changer for communication, allowing nonverbal students to participate fully in class discussions.

However, trusting AI with legal and administrative tasks introduces serious risks. Generative AI tools frequently invent information, a flaw known as "hallucination." Studies show that AI tools struggle to match human accuracy when interpreting legally enforceable policies. If a school board member relies on an AI assistant for governance decisions, systemic inaccuracies can lead to procedural errors. Recognizing these risks, some school districts are rolling out new AI policies to guide staff usage.

The rise of shareable AI tools also demands better vetting. Traditional software evaluations ask if a program works, but AI vetting must ask how it reaches its conclusions and whether its outputs align with state standards. Technology directors must adopt AI-specific evaluation frameworks to audit shared AI assistants before they reach students, while relying on management controls within the Google Admin console to oversee deployments.

What This Means for Families

Privacy is the top concern when schools adopt AI. Google classifies the education version of the Gemini app as a Core Service, which places it under the stricter data terms of the school's Workspace for Education agreement. Under those terms, Google does not use data entered by students and teachers to train its public AI models or to create advertising profiles.

While Google's terms address data privacy within its own platform, AI introduces other classroom challenges. AI detectors used by schools to grade student work carry a significant false positive risk. These detection tools often incorrectly flag human-written content as AI-generated, creating unfair academic disputes for students.

What You Can Do

  • Ask your school board if they have an AI-specific evaluation process to screen teacher-created tools for bias and accuracy.
  • Verify that your district technology team maintains centralized controls over Google Workspace AI features rather than leaving settings on default.
  • Monitor how your school integrates AI into your child's Individualized Education Program to ensure medical and legal decisions receive professional human oversight.