The National Catholic Educational Association has partnered with Google for Education to train teachers nationwide on artificial intelligence. Administrators intend to use AI for lesson planning, but the move comes as evidence mounts that heavy reliance on digital tools can harm student learning.
What Happened
On April 24, the NCEA announced a new artificial intelligence training initiative to teach educators how to use generative AI tools like Google's Gemini. The program, developed with education technology nonprofit ISTE+ASCD, launches May 13.
NCEA President Steven Cheeseman stated the goal is to give teachers a foundation in AI competence. To maintain alignment with Catholic teachings, the NCEA is forming a specialized Google Educator Group to lead K-12 instruction. Cheeseman said the initiative focuses on professional development, but he acknowledged that trained teachers and principals will likely begin using these tools in their schools.
As we previously reported, schools are exploring how Google Gemini can generate rubrics and manage student data.
The Bigger Picture
The NCEA's announcement continues the widespread adoption of Google products in education. Google Classroom serves over 150 million students globally, making the company's ecosystem a fixture of daily instruction.
However, the push for more classroom technology faces pressure from recent research. A review of a twenty-year, $30 billion digital classroom initiative found that one-to-one laptop programs coincided with measurable declines in academic performance.
Studies indicate that giving students unrestricted access to AI can hinder cognitive development. A recent Wharton School study found that students with on-demand access to AI tools achieved less than half the learning gains (30 percent) of peers whose access was system-regulated (64 percent). The researchers identified a "self-regulation paradox": students struggle to moderate their reliance on AI even when they know it hurts their long-term development. When students outsource their problem-solving to software, they skip the mental effort required to build skills.
What This Means for Families
For parents, the integration of AI tools like Gemini into the Google Workspace ecosystem raises questions about data privacy and classroom screen time. While Google offers a specialized framework for administrative compliance, protecting student data is not automatic.
Cybersecurity experts note that compliance requires school administrators to actively manage user access permissions. Google recently introduced an AI control center to help districts manage how third-party AI agents interact with student data, but the responsibility for configuring these safeguards remains with local schools.
Beyond data concerns, parents must consider how these tools change the learning environment. As schools test new AI capabilities, families are left to balance screen time against learning and to ensure digital convenience does not replace critical thinking.
What You Can Do
- Ask your school principal or IT director how they configure data access permissions within Google Workspace regarding new AI tools.
- Request to see your school's formal guidelines on student AI use, specifically whether access is system-regulated or available on-demand.
- Talk to your child's teachers about how much of their daily instruction relies on screens versus traditional, off-device learning methods.