AI and Teaching

During my usual EduTwitter procrastination, I recently came across the following article:

TES: We Prefer Robot Teachers

In the article, John Hattie is reported as saying, at a recent education conference in Edinburgh, that students he had met in Asia preferred robot teachers to their own teachers. As I read the article, I could almost hear the cogs of SMT across the country turning as they worked out the maths of what they would need to cut to afford an AI teacher.

‘Do we really need art, music or humanities? If we cut those teachers we could get one of those?’

‘They don’t need sick days…or parental leave…or sleep.’

When the guru of evidence-based practice speaks, SMT leap into action. It’s not really their fault: we work in an industry saturated with a need for scientific, objective data, and Hattie’s work provides exactly that.

Nor do I disapprove of Hattie’s methods. For my Master’s research, I looked in depth at Hattie’s meta-analyses and, from an observational viewpoint, a lot of his findings ring true with my experience of education. There are critics of his methodology, of course, but I get why Hattie is so revered in education. He’s taken research and turned it into practical applications teachers can use to improve. Thanks, John.

Nor do I have any problem with using technology. It’s not that I fear a future where robot overlords take over and we end up living close to the core of the Earth because we scorched the sky to cut off their power source…sorry, that’s the plot of The Matrix. Nor do I believe that SMT will necessarily replace teachers with Terminator clones (although for some classes that might not be the worst idea). My issue with technology is its inability to adapt to the subjective nature of humans.

When voice recognition, in the form of Siri, first came out, I was living in Scotland. It did not go well, as this clip from BBC Scotland illustrates:

Voice Recognition Lift

Of course, technology has advanced since those days. Voice recognition is now the norm with smart technology, Google Home and Amazon Echo/Alexa. Recently, I caught my five-year-old conversing with Alexa and getting incredibly frustrated. It turns out she’d asked it to ‘poo’ and Alexa had responded with ‘I am not programmed to do that.’ Imagine the requests that a class full of 13-year-old boys might make of that AI teacher. I suppose the Terminator model might be useful here.

The problem, for me, is the objective approach to teaching: the assumption that taking a scientific and objective approach to something as inherently subjective and organic as children can somehow be beneficial. Of course it would be…but not to teachers or children; rather to SMT, data analysts and, of course, OFSTED. An AI teacher could record the number of responses a child has made over a lesson, a week, a term. It could calculate average response times and work out the percentages of correct and incorrect answers given. It could differentiate to micro levels that most teachers do not even know exist (levels set by the DfE, of course), and all of this would be presented in RAG-rated categories, with students ranked against each other, possibly even nationally. Imagine knowing that your child ranks 11,345th out of 55,638 in response time to direct questions on the use of homophones at KS2, sub level 5!

It’s not that I am against data. Some of it is useful, but data for data’s sake is a form of instrumental reasoning that the likes of Adorno argued against. It’s the type of rationale that leads to the labelling of children and is, overall, detrimental to their mental health, well-being and development.

Technology has its place in education – you’d expect that from somebody blogging and vlogging and tweeting about education. But so does humanity. The claim from Sir Anthony Seldon that teachers will essentially be classroom assistants within 10 years should be resisted at all costs. Hattie’s research finds that the largest impacts on teaching and learning come from qualities rooted in human interaction, and, despite the possible negative implications of this, my own research leads me to believe it is the human understanding of the subjective nature of students that brings about positive change.

Teachers understand the unpredictable nature of humans. They know when their students are having bad days. They can recognise when students have problems at home, problems with other students, problems in their own minds, and that is something I don’t think AI has the capability to understand just yet. Maybe, instead of trying to make teaching more objective and scientific, we should be trying to make education more human.

 
