New data from educational publisher Pearson suggests that artificial intelligence might be doing more than just helping students write essays—it could be teaching them how to read. An analysis of nearly 80 million student interactions found that students who used embedded AI tools were far more likely to exhibit behaviors associated with deep learning and comprehension.
What Happened
Pearson analyzed data from approximately 400,000 college students using its digital materials during the semester beginning January 2025. The results indicate that when AI is built directly into textbooks rather than used as a separate chatbot, students engage more deeply with the material.
According to Pearson’s analysis, students using AI study tools in instructor-led courses were 23 times more likely to be classified as "active readers." For those who used the tools repeatedly, that likelihood rose to 24 times. Even in standalone digital textbooks without instructor guidance, a single use of the AI tool tripled the likelihood of active reading.
The company defines active reading as highlighting, note-taking, asking clarification questions, and retrieving information from memory. These are critical skills for academic success, distinct from the passive scrolling often associated with digital media.
Contrary to fears that students use AI solely to cheat, the data showed that 97 percent of students used the tools responsibly. Only 3 percent attempted to paste homework or assessment questions into the tool to get quick answers.
The Bigger Picture
This shift toward active engagement comes at a time when college readiness is a major concern. National data reveals that only 39 percent of students taking the ACT in 2025 met college-level reading benchmarks. Educators have noted that incoming freshmen often struggle with the close reading and analysis required for higher education.
The distinction between "open" AI tools like ChatGPT and "walled garden" tools embedded in curriculum is becoming clearer. While general chatbots can sometimes encourage shortcuts, curriculum-specific tools appear to keep students focused. Pearson found that one in three students used the AI's question-asking feature to pose questions that went beyond simple recall, instead attempting to apply or analyze the content.
However, technology is rarely a silver bullet. As we previously reported, educational software often requires dedicated human supervision to be effective. The Pearson data supports this, showing that the impact of AI tools was significantly higher in instructor-led courses compared to standalone use.
What This Means for Families
For parents and educators, this data offers a counter-narrative to the fear that AI will inevitably lead to cognitive atrophy. It suggests that the design of the tool matters more than the mere presence of AI.
"Cognitive offloading"—letting the computer do the thinking—remains a risk. However, tools designed to prompt students with questions rather than just providing answers can act as a scaffold for learning. When AI is integrated responsibly, it can help students move from passive consumption to active engagement.
What You Can Do
- Ask about the tool design. When schools introduce AI, ask if the tools are "walled gardens" restricted to course content or open generative models. Tools tied to specific textbooks often have better guardrails.
- Monitor for offloading. Watch how your student uses these tools. Are they using them to summarize long texts they haven't read, or are they using them to clarify difficult paragraphs they are currently reading?
- Encourage active habits. Regardless of the technology, reinforce the basics of active reading: taking notes, highlighting key terms, and pausing to ask, "Do I understand what I just read?"