AI and the joy of early adopters

What will schools be like when AI is embedded into our lives? Much like they are now, or radically transformed? Perhaps we can find clues to the future by looking into the past.

I was interested to hear Satya Nadella, the CEO of Microsoft, describing these days as “exciting… like the 90s are back.” I remember that feeling. My involvement with ICT in schools began in May 1990 with the release of Windows 3.0. It was an exciting time of innovation with Windows and the internet. The internet was revealed slowly, line by line, through Netscape and a screeching modem. We paid for the connection by the minute.

At the time, Windows was marketed as a tool to improve business productivity, much as AI is marketed today.

However, the arrival of the internet and Nicholas Negroponte’s ‘Being Digital’ persuaded me that there was something much more significant going on. I think that is probably true of AI today, too.

It took a couple of years to get a version of Windows stable enough to use reliably. Then several more before it really lived up to its billing as a proper ‘operating system’. People buying into Windows 3.0 thought they were purchasing a product. In fact, they were subscribing to a vision. This continues today in Windows 11 and into the future with Windows 12 and Copilot. 

It is certain that the future of AI will be rented. Those with the deepest pockets will get the best AI. I fear that people whose livelihoods depend on routine administrative jobs could lose out. Education across the world is deeply concerned with social equity, and it remains to be seen how AI will be provided to schools. Hopefully, it will be used to reduce the inequalities in education provision, not increase them.

Since the 1990s, Windows and its rivals have transformed the whole of commercial life, from banking through manufacturing to publishing. Only education has remained (relatively) unaffected. The reasons for this are interesting and will be considered in future blog posts. 

For our early adopters of AI, this is a further clue. What they do today will be a platform for growth, and what they are doing in five years’ time may well look very different: it is the journey and the process that matter. Mature and established working practices will emerge from today’s experiments. I salute the early adopters for their enthusiasm and patience.

However, there are real differences between then and now. Back in the 1990s we did not have Bill Gates and Steve Ballmer saying that their Windows software represented an existential threat to mankind that should be treated as seriously as the threat of climate change. This is what makes this new wave of creativity different, and rather alarming.

It is good that the educational community is moving quickly to think about these issues. The AI and Education project, part of the Bourne-Epsom protocol, is a powerful and timely UK-based contribution to the debate.

Follow this link to its website and you will see the bubbles of joy of enthusiastic practitioners who see AI making positive contributions to their planning, administration and teaching.

Leavening this is a smattering of reflective opinion pieces that try to put all of this into a wider context: the ‘Yes, but…’ or the ‘Have you tried…’ pieces that come from experienced critical friends.

So, is there a dark side to any of this? Well, for everything gained, something is lost. I did not see this back in the 1990s, although Neil Postman was warning us about living in a world where there was so much information that none of it had any meaning, value or longevity. Marshall McLuhan’s global village has not turned out to be a comfortable place to live, as he himself had warned. 

As we move forward, then, we will celebrate the good and wonder about what is living in the shadows that are ever present in the distance.

Can AI become a school tutor?

One of the most dramatic claims for the “new” AI is that it will be able to act as a tutor (or counsellor) for students. A recent manual for higher education by UNESCO suggests that AI could become a “personal tutor… providing personalised feedback based on information provided by students or teachers.”

Can AI become a school counsellor?

Some secondary schools are experimenting with AI, using it to provide individualised feedback on answers to specific examination questions. The secret of their success seems to lie in the training given to the AI before the session begins.

Could AI take on other tutoring tasks? The highly respected educators Hamilton, Wiliam and Hattie have recently suggested that “AI digital tutors [could be used] instead of expensive high-intensity human tuition for learners with additional needs”. Again, the ability to do this will depend upon the real-time training given to the system.

A third role of a tutor is to give advice on careers, life choices and, often, relationships. Might AI’s reach extend to these more sensitive areas of tutoring?

People are not born as effective school tutors: they learn their craft. Their skills are shaped and honed by initial teacher training courses and by working in schools. Tutors work within defined boundaries of values and ethics. 

I have experience of training new teacher tutors as a senior lecturer on the PGCE programme at the University of Bristol and as a head of department in a secondary school. In this essay I reflect on whether AI might be trained to act as a tutor in the same ways as trainee teachers are. I conclude with a consideration of whether this is a desirable aim.

It is important to remind ourselves continually that these AIs are “large language models”: they take text from a user and predict the words most closely associated with it. They do this by recognising patterns and relationships between words. They are trained on massive amounts of text, and the most recent versions can be trained further by their users.

History and physics require different kinds of understanding

The results are impressive, but it is important to realise that AIs do not understand the ideas beneath the words. They do not understand the causes of the English Civil War or the phenomenon of quantum entanglement in the ways that historians and physicists do.

It remains important that new teachers are educated in the subjects that they teach at both primary and secondary levels. Such teachers are trained to think critically about their subjects, even if they have forgotten the exact details that they crammed into their university essays. It is this training that enables them to think about the connections between ideas and develop sequences of thoughts that become meaningful lessons for students. 

We cannot send an AI to university to learn the nuances of a subject. Instead, we must develop training materials that we supply to the AI at runtime. Prompted with these materials, the AI can use the ideas to develop answers that go beyond repeating the words in the training materials; a sketch of what this might look like follows.
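As a minimal sketch of what “supplying materials at runtime” might look like, the fragment below passes an excerpt from a scheme of work to a chat model as part of the prompt. It assumes the OpenAI Python library; the file name, model name and wording are illustrative only, not a recommendation of any particular system.

```python
# Sketch: supplying curriculum material to a chat model at runtime.
# Assumes the OpenAI Python library and an API key in the environment.
from openai import OpenAI

client = OpenAI()

# Illustrative file: an excerpt from a Year 6 science scheme of work.
with open("year6_science_scheme_of_work.txt", "r", encoding="utf-8") as f:
    scheme_of_work = f.read()

response = client.chat.completions.create(
    model="gpt-4o",  # illustrative model name
    messages=[
        {
            "role": "system",
            "content": (
                "You are a tutor for Year 6 science. Base your feedback only "
                "on the following scheme of work:\n\n" + scheme_of_work
            ),
        },
        {
            "role": "user",
            "content": "Mark this answer and suggest one improvement: "
                       "'Evaporation is when a liquid turns into a gas.'",
        },
    ],
)

print(response.choices[0].message.content)
```

The point of the sketch is simply that the “training” here is nothing more mysterious than text placed alongside the student’s question each time the AI is used.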

The key training documents are those that teachers use when preparing to teach courses: published curricula, examination specifications, schemes of work, in-service training materials, examination questions and mark schemes, examiners’ reports and the like.

Other documents, such as descriptions of the nature of the scientific method, the rules of historical inquiry, vocabulary lists and formulae sheets, are also significant.

These documents can be assembled into folders that could represent, say, Science in Year 6 or French in Year 12. Some AI systems (e.g. ChatGPT) allow them to be fed into the system using scripts coded in Python. If this could be done at the level of the school or academy trust, the material would be available every time the AI was accessed, saving teachers the time spent retraining the system.
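As a hedged illustration of how such a folder might be handled, the sketch below uses the OpenAI Python library’s file-upload endpoint to send every document in a local folder to the service; attaching the uploaded files to a particular assistant or custom GPT is a separate step, and the folder name and file types are assumptions rather than a prescription.

```python
# Sketch: uploading a folder of curriculum documents so they can later be
# attached to an AI assistant. Assumes the OpenAI Python library.
from pathlib import Path
from openai import OpenAI

client = OpenAI()

# Illustrative folder assembled by a school or academy trust.
folder = Path("year6_science")

uploaded_ids = []
for doc in folder.glob("*.pdf"):  # e.g. schemes of work, mark schemes
    with doc.open("rb") as f:
        uploaded = client.files.create(file=f, purpose="assistants")
    uploaded_ids.append(uploaded.id)
    print(f"Uploaded {doc.name} as {uploaded.id}")

# The file IDs collected here would then be attached to an assistant or
# retrieval store, so the documents are available whenever the AI is used.
```

Run centrally, a script along these lines is what would make the same curated set of documents available to every teacher in the school or trust without each of them repeating the upload.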

Avoiding inaccurate or incorrect statements is a core prerequisite of successful teaching. Schools become deeply concerned when trainee teachers make mistakes that students copy. But accuracy is only a part of what good teacher tutors share with their students.

School tutors share standards and values with their students

Alongside the knowledge and skills, tutors also transmit the values, standards, rules and ethics of the school. These are designed to offer a social education that allows students to fit into the organisation of the school. 

How do trainee teachers find out what these “approved” codes of behaviour are? Policy documents produced at national, academy trust and school levels give guidance and can be used to train AIs. Equally importantly, trainee teachers observe other teachers in action and discuss issues arising in their classroom practice. The training is “on the job” because it is reflective and is determined by context. Uploaded training files can never make up for this, and this will inevitably limit the ability of AI to be an effective school tutor.  

Some students have educational plans identifying their special educational needs and these are supported by a wider literature offering strategies for successful teaching and learning. If the goal is to offer personalised support to these students, then the AI will need to be aware of their individuality. There are compliance and safeguarding implications to this. One can envisage a time when maintaining the compliance of the AI systems is a recognised role in schools. 

None of the above is intended to discourage teachers from trying to use AI: the ability of AI to simplify, translate and produce engaging text is already evident and will only get better as future versions reduce its unfortunate habit of hallucinating false information. 

An AI offering careers advice might be a real advantage in preparing students for a face-to-face discussion with a careers teacher, providing the AI was trained with up-to-date information about the rapidly changing employment sector. 

How far can this go? Can AI ever really offer effective tutoring in life choices and relationships? Tutors learn to become sensitive to context because they work alongside their students. They know when a student could try harder and when troubles at home require a gentler conversation. They can spot signs of distress and possible safeguarding issues. Can such experience ever be reduced to a training code? 

This leads to a wider question for society. Do we really want a machine to meet such basic human needs? Might such a system create a new form of psychological dependence, especially for students developing their identities through screens, amidst social media?

I have little doubt that, since there is a commercial advantage, attempts at such systems will arise. The question that schools and society need to ask is, where should any red lines be drawn? 

About this blog

This is the blog of Neil Ingram and reflects a variety of my interests over the years as a biology teacher, university academic, examiner and author.

I have been deeply interested in the use of IT in schools since Windows 3.0 came out in May 1990.

About ten years ago, I was invited to be part of the Hewlett Packard Catalyst initiative, and I developed a model of how school pedagogy could work in the Web 3.0 world. Some of this has been published, but quite a lot has not.

The development of Artificial Intelligence systems is generating the same levels of excitement as Windows 3.0. As the CEO of Microsoft said, “It feels like the 1990s again!”

I have developed an introduction to the thinking:

Pedagogy AI is a roadmap for the pedagogy of a lesson using AI with students.

This is based on the ideas in a series of ten linked posts, called Teaching and learning with AI. Part one is here.

To show you round the rest of the site:

Stories from Nowhere was a lockdown project, trying to use stories to bring important ideas into middle-years biology lessons. It is built on observations in a wood and on work we were doing on a 16-19 curriculum framework for the Royal Society of Biology.

Exploring the epigenetic landscape is a microsite about how genes interact with the environment; it uses the ideas of Conrad (Hal) Waddington.

The home page contains a mixture of posts relating to a university course I ran on genetics, society and education.

There are also posts on evolution relating to a book I co-authored for Oxford University Press.

The title “tools for clear thinking” is based on a book by Conrad (Hal) Waddington, whose ideas run through every article on this site. The cover image was designed by DALL-E 3, and the prompt included Waddington’s term “epigenetic landscape”. I was delighted that its rendering resembled the original conception by the painter John Piper.