The UK has made headlines by officially permitting teachers to use generative AI tools to grade assignments, write lesson plans, and even draft letters to parents. It’s a policy move that could transform the education system by streamlining workloads for teachers and modernizing classroom practices. But it also raises critical questions: Where should we draw the line to preserve the creativity, trust, and accountability that lie at the heart of learning?
We asked members of the AI Think Tank—leaders advancing enterprise AI and governance—to weigh in on this evolving relationship between education and intelligent automation.
Regulations Set a New Standard for Teacher Productivity
Aravind Nuthalapati, Cloud Technology Leader for Data and AI at Microsoft, sees the UK’s decision as a formal embrace of AI’s capacity to meaningfully reduce educator burden.
“AI excels at automating routine tasks—grading, lesson planning, administrative work—allowing teachers to focus on mentorship, creativity, and deeper student engagement,” he says. But he cautions against unbounded use. “AI shouldn’t independently evaluate subjective work or replace nuanced human interaction. The future depends on ethical frameworks that prioritize human judgment.”
“Assigning work to AI does not mean abdicating responsibility.”
Human Insight Remains Key
Peter Guagenti, CEO of EverWorker, applauds the move, noting that AI is becoming a personal assistant for professionals across industries—and teachers should be no exception.
“Agentic AI has proven to be an invaluable assistant in the execution of tasks. Having a personalized AI worker that can manage these time-consuming tasks means that a teacher can focus on providing better, more personalized instruction to their students.” However, he cautions that “assigning work to AI does not mean abdicating responsibility, as teachers must still be able to crisply articulate what they want their AI assistants to do and will remain accountable for the output.”
AI Should Only Be Used To Amplify, Not Flatten, Learning
Gordon Pelosse, EVP of Partnerships at AI CERTs, believes AI can reduce teacher workload without sacrificing values—if it’s used right.
“AI should be a helper, not a replacement,” he says. “Using AI to cut teacher workload (through feedback and planning) is a start. But we can’t outsource judgment of trust, creativity and student growth.”
“Automate the routine, not the relationship.”
Allowing teachers to tailor the educational experience to their individual classrooms remains a critical component of educational success. A study published in the Journal Evaluation in Education (JEE) found a direct correlation between teacher creativity and student learning: teachers who were given the freedom to adjust how they presented material saw greater achievement from their students than those who weren’t given that freedom.
Suri Nuthalapati, Data and AI Leader at Cloudera, echoes this point by emphasizing that education is built on trust. “Automate the routine, not the relationship,” he says. “AI should assist, not replace, especially in areas requiring empathy and nuance.”
AI May Empower Teachers, Not Replace Them
Anand Santhanam, Global Principal Delivery Leader at AWS, argues that this moment is about empowerment.
“The UK’s policy shift reflects what many in the field already know: AI is becoming part of modern work and life, and education can’t ignore that. Teachers are stretched thin,” he says. “Thoughtful AI use eases that load. But the real conversation isn’t about which tasks AI can take on. It’s about making sure the human side of teaching—judgment, empathy, inspiration—stays front and center.”
The official guidance that AI tools should be used by educators, rather than in place of them, is further reinforced by NASUWT, one of the largest teachers’ unions in the UK. Its guidance states, “AI-enabled technologies must not replace direct human interaction between the teacher and learner as the teacher plays a critical role in supporting the social and emotional dimensions of learning, and in personalising education to meet the diverse needs of learners.”
“Teachers have to retain the final authority over grades, especially for work requiring cultural context, creative interpretation, or nuanced judgment.”
There Are Clear Gains, But Also Clear Boundaries
Jim Liddle, Chief Innovation Officer at Nasuni, sees this policy shift as a milestone: AI is no longer a threat, but a tool. In his opinion, it can handle objective assessments, basic feedback, and lesson scaffolding. But Liddle issues a warning: “Teachers have to retain the final authority over grades, especially for work requiring cultural context, creative interpretation, or nuanced judgment. Prior MIT research into this area spells out the dangers, warning that relying solely on AI for complex assignments risks superficial, formulaic feedback that could undermine the learning process and erode trust.”
Adoption And Use Must Emphasize Ethics First, Always
Nikhil Jathar, CTO at AvanSaber, envisions AI as an intelligent assistant for productivity gains: grading multiple-choice tests, tailoring worksheets, and streamlining communications.
But he draws a line at creativity and connection. “Critical boundaries must protect human-centric educational elements: subjective assessment of creative writing, sensitive parent conversations about student challenges, and inspirational mentoring that builds student confidence,” he says. “Optimal implementation maintains the teacher as the final decision-maker, with AI serving as an intelligent assistant.”
Vishal Bhalla, CEO of AnalytAIX, highlights AI’s potential outside the classroom. “AI can support teachers by identifying student learning styles, enabling more personalized and effective instruction,” he says. Still, he, too, believes the teacher must stay at the center of learning. “As I always say, we must ask, ‘In whose service?’ And the answer must be: in service of humans.”
These limitations on how AI may be used in education are further underscored by protections outlined in the UK’s General Data Protection Regulation (UK GDPR) and similar regulations, which restrict when and how AI systems can use personal data. These regulations make clear that while an AI can streamline classroom planning or act as a take-home tutor, teachers should also be aware of the risks of sharing too much of their students’ personal data with an AI system.
Don’t Let Automation Erode Trust
Rodney Mason, Head of Marketing at LTK, believes that AI can help teachers focus on what matters most—students. But he warns against over-automation.
“AI systems will homogenize learning and significantly diminish teaching that addresses students’ individual needs,” he says. “Teachers, as stewards of education, must ensure AI amplifies equity and clarity rather than introducing opacity in decision-making. The future of AI in education lies in its ethical and thoughtful application, where it complements rather than supersedes the irreplaceable human connection at the heart of learning.”
Takeaways for School Leaders and EdTech Innovators
- Automate routine tasks like grading, planning, and communications to free up teaching time.
- Preserve human judgment for creative work, sensitive evaluations, and student interactions.
- Design AI tools to support—not replace—educators.
- Build transparency, ethical oversight, and bias mitigation into every system.
- Use AI to expand personalized learning, not standardize it.
The UK’s move signals that AI in education is no longer speculative—it’s policy. But if this shift is to succeed, it must prioritize the teacher-student relationship above all else. AI should serve as a powerful ally in the classroom, helping educators focus on what matters most: inspiring minds, nurturing growth, and guiding the next generation forward.