Schools and educators are being subjected to the latest contrivance that promises to reduce the most tedious aspects of being a teacher. Too often, educators are swept into a tyranny of efficiency that tech companies repeatedly pitch to the unsuspecting. The latest? Artificial intelligence (AI)-enabled “personal teaching assistants,” which appear as add-ons to many school-wide information systems (curriculum planning and reporting software and the like). These AI features claim to eliminate or lessen the need to work through the following: writing student reports, collaborating on lesson planning, developing personalized learning for students, analyzing data, and creating assessments. These niceties are the first step in a relentless march toward dehumanizing education.
At the very core of our educational endeavor is the necessity of highly trained teachers who possess the knowledge, skills, and talents to be effective. Most importantly, teachers must possess deep human qualities: a passionate commitment to the care and development of children (teachers spark inspiration, turn lives around, and attend to individual student needs, repeatedly). To truly “know thy learner,” a teacher must be fully engaged in their students’ learning growth. That undertaking carries a litany of responsibilities, which may include poring over observational notes, grappling with a colleague over a student’s learning, listening intently to a student’s ideas, and writing assessments with an understanding of who the students are as learners. No facet of the teacher/student relationship ought to be reduced to rudimentary and impersonal forms. Education is principally grounded in humanism.
Let us unpack what is happening now that we are allowing these AI-enabled components to be tested by teachers. The sales pitch, remember, is that AI will trim the burdensome aspects of teaching. For example, one company says we should use its “teaching assistant” because it will save hours of writing those arduous report card comments about students’ achievement. One even offers a magic wand in its app that “generates a hyper-personalized comment” for progress reports. Teachers simply click the icon to avoid having to gather their observations and work through the writing of a personal passage. Once AI has written the comment, teachers can click another button to select the appropriate voice and tone for the message: firm, witty, or serious (to name just a few).
As a school leader, I am familiar with the wave of complaints from teachers around reporting time. I deflect those complaints unabashedly, knowing that the difficult process of reporting on learning results in a deeper understanding of each student. Before AI, teachers would not have fathomed having a friend or even a family member write their comments for them. Why, then, would we outsource the task to a bot, which is several shades removed from the teacher, who is accountable and responsible for personally knowing their learners?
The rollout of ChatGPT certainly scared educators. The fear was that students would never learn to compose even a paragraph if they simply turned to ChatGPT to do it for them. It was a legitimate worry, addressed by ensuring teachers assigned work differently and, more importantly, knew their students as learners . . . and knew them well. Now a form of ChatGPT (in the shape of AI-enabled software) will allow teachers to avoid their own annoyances with writing—yes, it can even be used to write messages and letters to parents. As a parent, I would be quite disappointed to learn that my child’s teacher couldn’t be bothered to write a personalized comment about my child’s learning. Or that those “witty” letters home did not represent the quiet, kind teacher I met at the beginning of the school year.
And what about the labor of curriculum planning? The former art of taking the adopted educational program and planning in teacher teams will no longer be necessary with AI-enabled curriculum planning software. It promises to streamline lesson planning, thereby taking the teacher, once again, out of the learning equation. Curriculum planning is a collaborative affair; when it is conducted in isolation, or by someone (or something) else, it can result in a school where teachers work in silos or, worse yet, have no allegiance to the lessons they are delivering. The more teachers allow AI to do their lesson planning, the more they will place unyielding trust in its algorithmic output. I worry that, in the long run, teachers will begin to trust AI over their colleagues and eventually even mistrust their own work.
Even the few drudgeries described above are essential to strengthening teacher/student relationships. By the time you read this, educational software companies will have introduced a slew of additional AI-enabled accessories to eliminate these onerous tasks, all of which will accelerate the divorce between teachers and students. I implore educational leaders to scrutinize the use of these AI-enabled components to ensure the human connections between teachers and students are not compromised. Innocent overuse of AI-enabled tools will only goad designers to create more of these embellishments, which will wedge themselves firmly between students and teachers.
--------------------------------------------------------------
Deron Marvin is a Head of School at an independent school in California. He has worked with international and independent schools for over 24 years.