This page summarizes studies published in September 2025. It reflects findings from a specific time window and is not intended as a comprehensive review of the literature on this topic.
AI tools are becoming more common in schools to support personalized learning and engagement. Research shows they can help, but they also come with trade-offs.
Features that increase engagement can also raise cognitive load, reduce student autonomy, and create equity challenges.
Using AI well depends on more than the tool itself. It requires thoughtful design, practical classroom support, and clear governance to ensure tools work for all learners.
This research is relevant for school and district leaders, instructional technology leaders, inclusion leads, and data and privacy leaders evaluating AI tools in schools.
Key findings from the research
Research across four areas highlights consistent patterns that challenge common assumptions about AI in schools.
1. Nudges and personalization work, but only to a point
Features like reminders, badges, progress indicators, and personalized prompts can support engagement when used carefully. However, persuasive design features applied broadly, such as streaks, social comparisons, and platform-wide reward systems, can increase cognitive load and reduce students’ sense of control, even when well-intentioned.
Personalization tends to support both motivation and autonomy, but it does not reliably improve participation on its own, and its effectiveness depends on whether students have the digital confidence to navigate and benefit from it.
2. Adoption for students with disabilities depends on design and support
One study of university students with disabilities found that adoption of AI tools is more likely when the tools are easy to use, clearly useful, and supported by teachers or peers.
Adoption depends on several factors, including design, digital confidence, teacher guidance, and how well the tool fits the learning context. Tools that are not adapted to real learning contexts are less likely to be used consistently.
3. Design fit matters more than print versus digital
The key question is not whether learning happens on paper or screen. What matters is how well the design matches the learner’s skills and the task.
If a tool requires too much navigation, attention, or self-management, it can become a barrier rather than a support.
4. Governance is essential for AI to scale fairly in schools
Research on EdTech data governance shows that many school systems still lack transparent rules for how student data can be collected, used, shared, and governed.
Without strong governance, AI tools can introduce bias, limit opportunities, or create privacy risks. Governance is not separate from innovation; it is part of using AI responsibly at scale.
Why this matters for school leaders
AI tools are often judged by how engaging or innovative they appear in early use. Research suggests leaders need to look beyond this and ask under what conditions these tools actually work well for all students.
Four key tensions stand out:
- Engagement vs autonomy: Engagement features can support participation, but too many prompts or notifications can overwhelm students or reduce independence
- Personalization vs equity: Personalization works best when students have the confidence and support to use it. Without this, it may widen gaps rather than reduce them
- Scale vs learner variability: What works for one group may not work for another. Flexible tools and multiple ways to engage are important for school-wide use
- Innovation vs governance: Responsible AI depends on clear rules around data, access, and oversight. Governance is part of implementation, not an afterthought

What this looks like across school settings
The research reviewed here points to several recurring implementation patterns.
| Practical takeaway | What to do and how |
|---|---|
| Keep cognitive demands manageable | Choose tools with simple interfaces. Avoid designs that require too much tracking, navigation, or multitasking |
| Balance engagement with student autonomy | Use features like reminders and progress indicators carefully. Support participation without creating pressure or over-reliance |
| Build support around the tool, not just the tool itself | Provide guidance, onboarding, and time for both students and teachers to build confidence using the tool |
| Plan for learner variability from the start | Look for tools that offer flexibility and different ways to engage, rather than assuming one approach works for all students |
| Treat governance as part of implementation | Be clear about what data is collected, how it is used, and who has access. Build oversight into planning |
| Check whether personalization is usable in practice | Ensure students can understand and use personalized features independently; personalization is most useful when students can navigate and benefit from it without being left behind |
Successful implementation in schools
Across these sources, the strongest implementation pattern is not a single best AI tool or model, but a set of conditions that make responsible use more possible.
Successful implementation depends on whether tools are usable, manageable for learners, supported by teachers, and governed in ways that protect students and reduce risk.
- AI tools are introduced with clear expectations for students and teachers
- Interfaces are simple enough that students can focus on learning and not tool navigation
- Engagement features are used carefully and do not create unnecessary pressure
- Students receive support when digital confidence or accessibility needs vary
- Teachers have time and guidance to use the tool in classroom routines
- Schools review data, privacy, and oversight before scaling use
- Tool use is monitored over time to see whether it is helping students in the way intended
How this shows up in practice
The examples below reflect patterns described in the research and reporting. They are not meant as prescribed steps.
Engagement features used carefully
Some schools use reminders and progress indicators but avoid overuse. The focus is on supporting learning without creating pressure.
What research suggests:
Engagement features are helpful when they are well-timed, limited, and give students control.
Support built around the tool
Students and teachers receive guidance on how to use the tool effectively in real classroom settings.
What research suggests:
Adoption improves when users feel confident, supported, and understand how the tool fits into learning.
Interface demands kept manageable
Schools focus on how much effort a tool requires from students, not just whether it is digital.
What research suggests:
Learning outcomes depend on the match between the learner, the task, and the interface, not simply on whether a tool is digital.
Governance built in before scale
Some schools review data use, privacy, and oversight before expanding AI use.
What research suggests:
Many systems lack clear governance. Strong policies support responsible and fair use.
Long-term results
The research reviewed here does not suggest that AI tools are effective simply because they are innovative or personalized. Over time, the stronger pattern is this: responsible use is more likely when schools keep cognitive demands manageable, support different learners well, and put clear governance in place from the start.
This means long-term success depends less on early enthusiasm and more on whether a tool can be used consistently, fairly, and in ways that teachers and students can realistically sustain.
What this means for school leaders
| Role | What to focus on |
|---|---|
| School leaders | Review AI tools not only for innovation or engagement, but also for whether they are usable across a wide range of learners, manageable in classrooms, and supported by expectations for use |
| Instructional technology leaders | Look closely at interface design, cognitive load, and how features such as reminders, prompts, and personalization affect student use over time |
| Inclusion leads | Focus on whether tools can be used by students with different levels of digital confidence, support needs, and accessibility requirements. Adoption depends on more than technical access alone |
| Data and privacy leaders | Treat governance as part of implementation planning. Review what data are collected, how they are used, who can access them, and what oversight is in place before scaling |
- Balaskas, S., Yfantidou, I., Nikolopoulos, T., & Komis, K. (2025). The Psychology of EdTech Nudging: Persuasion, Cognitive Load, and Intrinsic Motivation. European Journal of Investigation in Health, Psychology and Education (EJIHPE), 15(9), 179. https://doi.org/10.3390/ejihpe15090179
- Weng Ma, K., Pramudita Julianton, R., Yang Chan, X., Teng Chai, Y., Mukred, M., Wong Ei Leen, M., & Gumaei, A. H. (2025). A Model for the Adoption of Artificial Intelligence in Inclusive Education: An Exploratory Study of Key Factors and Expert Insights. Journal of Information Technology Education, 24, 27. https://doi.org/10.28945/5612
- Segers, E., Cho, B.-Y., & Naumann, J. (2025). Digital reading and what makes it hard for whom: Individual differences in learning from digital texts. Learning and Individual Differences, 102801. https://doi.org/10.1016/j.lindif.2025.102801
- UNICEF Innocenti – Global Office of Research and Foresight. (2025). Data Governance for EdTech: Policy Recommendations. UNICEF Innocenti, Florence, September 2025.