Duolingo recently ended an internal policy that evaluated employees based on how often they used artificial intelligence. This shift suggests that forcing technology into educational tools does not automatically improve learning outcomes.
What Happened
In early 2026, Duolingo CEO Luis von Ahn confirmed the company will no longer tie performance reviews to staff AI usage. The original policy, launched in 2025, required employees to incorporate AI into their daily workflows. According to Business Insider, staff members pushed back against the mandate, questioning whether they were using the technology merely to satisfy a requirement. Von Ahn agreed with the feedback, stating that the mandate produced "activity theater" rather than actual outcomes.
Financial analysts at Simply Wall St note that this culture reset comes as investors evaluate whether the platform's AI tools deepen user engagement or dilute the core structured learning experience. Writing for LinkedIn Pulse, management expert Gerald Kane argues that while moving away from "compliance theater" is a smart move, making AI entirely optional risks letting some product teams fall behind industry standards.
The Bigger Picture
The debate inside Duolingo mirrors the broader conversation about AI in education. When implemented thoughtfully, the technology can measurably improve learning: a meta-analysis of 51 studies published by Springer Nature found that generative AI produces statistically significant gains in language proficiency and cognitive development.
Research published in Humanities and Social Sciences Communications supports this. In a 10-week study, students using AI-powered voice assistants outperformed peers in traditional speaking activities, reporting lower anxiety and higher emotional engagement. As we previously reported, tools like Duolingo Max use AI video calls to simulate real-world conversations and boost student confidence.
However, limitations exist. The same 10-week study found that learners perceived a lack of authentic interaction when practicing solely with AI. Evaluations of the platform by New Literacies point out that while the AI adapts to the user, it rarely teaches students how to self-regulate, relying instead on gamification to keep them clicking. According to AI Scanner, the effectiveness of these tools depends on a student's existing digital literacy and personal motivation. Schools are integrating these platforms into their classrooms, as we covered in reports on school tech budgets, but relying on algorithmic progression restricts a student's ability to engage in self-directed inquiry.
What This Means for Families
Duolingo's decision to drop its internal AI mandate is a positive signal for parents and educators. It suggests the company is prioritizing software quality over technology metrics. When developers force AI into products to meet a quota, pedagogical rigor often suffers.
Families should understand that AI language tools are supplements to, not replacements for, traditional instruction. The technology reduces the fear of making mistakes, which helps kids speak up. Yet the reliance on gamified streaks and rigid lesson paths means students may not learn how to direct their own study habits.
What You Can Do
- Use AI for practice, not replacement. Encourage students to use AI language apps for low-stakes speaking practice to build confidence without the pressure of a human audience.
- Supplement with real conversation. Since AI lacks authentic human interaction, ensure learners have opportunities to speak their new language with actual people.
- Monitor for passive learning. Watch how children interact with gamified apps. Ask them to explain what they learned rather than just celebrating a daily login streak.