Teaching and learning with AI (part 10)

The learning community

The learning community refers to the impact of the wider community on teaching and learning. This community could include friends and family of students who have interesting life stories, or professionals engaged in the public understanding of their subjects.

It is said that “It takes a whole village to raise a child.”

To see how AI might be a useful part of the learning community, we need to ask whether AI is a tool, an agent, or an actor.

A tool is a machine that does a job for us, like a sat nav that gives us directions to our destination. 

An agent has a greater level of autonomy. If the sat nav monitors traffic in real time and recommends alternative routes to avoid traffic delays, then it is showing some level of agency. 

An actor shows the greatest degree of autonomy and independence. If an AI notices that a friend has not left their house for three days and has made no contact with anyone else, and asks if you want to visit, then it is being an actor – a digital personal assistant – especially if it then makes arrangements for the trip, including booking a hotel room.

An actor does not have to be human. The French sociologist Bruno Latour defines non-human entities as actors, as long as they influence social situations.

At present, AI acts as a tool for most of the uses discussed in these articles. Thinking tools are useful, as long as they do not prevent people from thinking for themselves. More sophisticated uses of AI, such as being a study buddy, may allow the AI some degree of agency to direct the learning.

Future developments in AI might facilitate the development of a device that acts like a personal tutor. The news that Microsoft is patenting a technology to offer an “AI-powered emotional management system” as part of an emotion-centred journaling feature suggests that AI might become a motivating actor sooner than we thought.

The AI and Education post, Enhancing Counselling Education Through AI: A Progressive Approach, presents an AI application that develops the counselling skills of mature students, who “role-play both counsellor and client, developing empathy and counselling techniques. ‘Call Annie’ [an AI bot] offers face-to-face interaction via video calls”. Some students like the system so much that they are using the application as “a virtual therapist”.

These are early days for the project, and it is likely that the application is acting as a training tool with some level of agency. More sophisticated iterations in the future might become so autonomous that they become social actors.

Throughout these articles we have stressed that pre-training AI is the best control teachers have to ensure that its use is productive and safe. This also applies to actor-level uses of AI. 

The wider discussion for society is whether we want to enter such intimate relationships with machines, given the risks of psychological or emotional dependence. Many of us can become addicted to our screens. For others, such technology extends possibilities and horizons in the most extraordinary ways.

Q. This article considers whether we, as teachers or as members of society, should draw boundaries to limit the possible uses of AI in education.

Where do you think the boundaries should be drawn?


Updated 16/01/24 to include end of article question.

AI and how we teach in schools

Teaching looks so easy: all we have to do is stand up and talk. Lots of us can do that! So, if this is all there is, could an AI chatbot one day become an effective classroom teacher?

Obviously, a fundamental part of teaching is instruction: the transmission of knowledge and skills from an expert to a novice student.

Teachers learn to select the appropriate knowledge and skills, and to use appropriate language, lesson sequencing and techniques of communication.

These allow ideas to be received and understood by their students.

AI can help teachers prepare for each of these areas, although how teachers control these processes is crucial to their chances of using AI successfully, as we shall see in future posts. 

However, content is not the only thing that is taught in lessons.

A second discourse runs in parallel alongside the instruction.

This regulates students’ behaviour and develops their social attitudes.

Instruction and regulation are so completely intertwined that they are inseparable.

This post looks more carefully at how teachers regulate students’ behaviour and its implications for how AI can be instructed to plan lessons.

The regulation of students’ behaviour in schools in England is increasingly controlled by the Academy or even the Academy Trust. Individual teachers have much less control over this than they did a generation ago. Similar trends can be seen across the world.

The regulation is a series of rules that shape how students should behave in school:

  • how they should dress,
  • how they should behave in corridors and classrooms,
  • how they should walk between rooms,
  • how they should talk to teachers and to each other.

These behaviours create compliance in students. Students who cannot or will not comply can expect a series of sanctions of increasing severity.  These behaviours can be taught to AI easily enough, since they are usually enshrined in published school policies. AI would be discouraged from saying things that were against the school rules.

It would also be relatively easy to train AI to adopt certain pedagogical styles used by teachers, such as scaffolding, retrieval practice, modelling, recall and interleaving, by including a description of the practices as part of the initial prompts. Later prompts such as “be a classroom teacher that uses the technique of retrieval practice to produce…”, could be used to create useful classroom materials.
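A prompt of this kind can be sketched as a small helper that combines a technique description with a topic-specific request. The technique wordings and function names below are illustrative assumptions, not a tested prompt library:

```python
# Hypothetical sketch: embedding a pedagogical technique in an initial prompt.
# The technique descriptions here are simplified for illustration.

PEDAGOGY_PROMPTS = {
    "retrieval practice": (
        "Be a classroom teacher that uses the technique of retrieval "
        "practice: begin by quizzing students on prior material before "
        "introducing new content."
    ),
    "scaffolding": (
        "Be a classroom teacher that uses scaffolding: break the task "
        "into small steps and offer hints before giving full answers."
    ),
}

def build_lesson_prompt(technique: str, topic: str) -> str:
    """Combine a technique description with a topic-specific request."""
    style = PEDAGOGY_PROMPTS[technique]
    return f"{style}\nProduce classroom materials on the topic: {topic}."

prompt = build_lesson_prompt("retrieval practice", "photosynthesis")
print(prompt)
```

Keeping the technique descriptions in one place like this means a school could agree the wording once and reuse it across subjects.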

Giving students unrestricted access to AI could cause problems, if the topic being studied involves ethical or moral discussions.

All teachers are expected to abide by the “ethos” of the school, which includes upholding shared values. These include agreed attitudes to relationships education, citizenship, and social attitudes.  These are included as part of the regulation discourse and should be embedded in all lessons.  

Unless AI is specifically trained to produce answers consistent with a school’s ideology, it might produce answers that would make class teachers and students feel uncomfortable or embarrassed. See my earlier post ‘Can AI become a school tutor?’ for more on this.

AI has a huge potential to become a reliable assistant for teachers, but first it needs to be taught what a teacher’s job actually involves, which is far more than standing up and talking. 

This is the first of a multi-part series that uses the ideas of the pedagogic device developed by Basil Bernstein to explore the potential impacts of AI on teaching in schools. It draws on work undertaken by Neil Ingram on Pedagogy 3.0 for the Hewlett Packard Catalyst initiative.

AI and the joy of early adopters

What will schools be like when AI is embedded into our lives?  Much like they are now or perhaps something radically transformed? Perhaps we can see clues to the future by looking into the past? 

I was interested to hear Satya Nadella, the CEO of Microsoft, describing these days as being as “exciting… like the 90s are back.” I remember that feeling. My involvement with ICT in schools began in May 1990 with the release of Windows 3.0. It was an exciting time of innovation with Windows and the internet. The internet was revealed slowly line by line through Netscape and a screeching modem. We paid for the connection by the minute.

At the time, Windows was marketed as a tool to improve business productivity, much as AI is marketed today.

However, the arrival of the internet and Nicholas Negroponte’s ‘Being digital’ persuaded me that there was something much more significant going on. I think that is probably true of AI today, too. 

It took a couple of years to get a version of Windows stable enough to use reliably. Then several more before it really lived up to its billing as a proper ‘operating system’. People buying into Windows 3.0 thought they were purchasing a product. In fact, they were subscribing to a vision. This continues today in Windows 11 and into the future with Windows 12 and Copilot. 

It is certain that the future of AI will be rented. Those with the deepest pockets will get the best AI. I fear that people relying on routine admin jobs could lose out.  Education across the world is deeply concerned with social equity and we will wait to see how AI will be provided to education. Hopefully, it will be used to reduce the inequalities in education provision, not increase them. 

Since the 1990s, Windows and its rivals have transformed the whole of commercial life, from banking through manufacturing to publishing. Only education has remained (relatively) unaffected. The reasons for this are interesting and will be considered in future blog posts. 

For our early adopters of AI, this is a further clue. What they will do today will be a platform for growth and what they will be doing in five years’ time may well look very different: it is the journey and the process that matter. Mature and established working practices will emerge from today’s experiments. I salute the early adopters for their enthusiasm and patience.

However, there are real differences between then and today. Back in the 1990s we did not have Bill Gates and Steve Ballmer saying that their Windows software represented an existential threat to mankind that should be treated as seriously as the threat of climate change. This is what makes this new wave of creativity different and rather alarming.

It is good that the educational community is moving quickly to think about these issues. The AI and Education project, part of the Bourne-Epsom protocol, is a powerful and timely UK-based contribution to the debate.

Follow this link to its website and you will see the bubbles of joy of enthusiastic practitioners who see AI making positive contributions to their planning, administration and teaching.

Leavening this is a smattering of reflective opinion pieces that try to put all of this into a wider context. The ‘Yes, but…’ or the ‘Have you tried…’ pieces that come from experienced critical friends. 

So, is there a dark side to any of this? Well, for everything gained, something is lost. I did not see this back in the 1990s, although Neil Postman was warning us about living in a world where there was so much information that none of it had any meaning, value or longevity. Marshall McLuhan’s global village has not turned out to be a comfortable place to live, as he himself had warned. 

As we move forward, then, we will celebrate the good and wonder about what is living in the shadows that are ever present in the distance.

Can AI become a school tutor?

One of the most dramatic claims for the “new” AI is that it will be able to act as tutors (or counsellors) for students. A recent manual for higher education by UNESCO suggests that AI could become a “personal tutor… providing personalised feedback based on information provided by students or teachers.” 

Can AI become a school counsellor?

Some secondary schools are experimenting with AI to provide individualised feedback on answers to specific examination questions. The secret of their success seems to lie in the training given to the AI before the session begins.

Could AI do other tutoring tasks? The highly respected educators Hamilton, Wiliam and Hattie have recently suggested that “AI digital tutors [could be used] instead of expensive high-intensity human tuition for learners with additional needs”. Again, the ability to do this will depend upon the real-time training given to the system.

A third role of a tutor is to give advice on careers, life choices and, often, relationships. Might AI’s reach extend to these more sensitive areas of tutoring?

People are not born as effective school tutors: they learn their craft. Their skills are shaped and honed by initial teacher training courses and by working in schools. Tutors work within defined boundaries of values and ethics. 

I have experience of training new teacher tutors as a senior lecturer in the PGCE programme at the University of Bristol and as a head of department in a secondary school.  In this essay I reflect on whether AI might be trained to act as a tutor in the same ways as trainee teachers. I conclude with a consideration of whether this is a desirable aim. 

It is important to continually remind ourselves that AIs use “large language models”, which take text from a user and predict the words most associated with it. They do this by recognising patterns and relationships between words. They are trained using massive amounts of information, and the most recent versions can be trained further by users.
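As a toy illustration of this idea (far simpler than a real large language model, which considers much longer contexts), a bigram counter predicts each word purely from the word before it, using co-occurrence patterns in its training text:

```python
# Toy illustration of next-word prediction from word co-occurrence.
# This is NOT how modern LLMs are built; it only shows the underlying
# idea of predicting words from patterns in training text.
from collections import Counter, defaultdict

training_text = (
    "the causes of the english civil war "
    "the causes of quantum entanglement "
    "the causes of the war"
)

# Count, for each word, which words follow it and how often.
follows = defaultdict(Counter)
words = training_text.split()
for current, nxt in zip(words, words[1:]):
    follows[current][nxt] += 1

def predict_next(word: str) -> str:
    """Return the most frequent follower of `word` in the training text."""
    return follows[word].most_common(1)[0][0]

print(predict_next("causes"))  # every "causes" above is followed by "of"
```

The point of the toy model is the same as the article’s: the counter “knows” that “of” follows “causes” without understanding anything about causation, civil wars or physics.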

History and physics require different kinds of understanding.

The results are impressive, but it is important to realise that AIs do not understand the ideas beneath the words. They do not understand the causes of the English Civil War or the phenomenon of quantum entanglement in the ways that historians and physicists do.

It remains important that new teachers are educated in the subjects that they teach at both primary and secondary levels. Such teachers are trained to think critically about their subjects, even if they have forgotten the exact details that they crammed into their university essays. It is this training that enables them to think about the connections between ideas and develop sequences of thoughts that become meaningful lessons for students. 

We cannot send AI to university to learn the nuances of their subjects. We must develop training materials that we supply to AI at runtime.  By using prompts, AI can learn to use the ideas to develop answers that go beyond repetition of the words in the training materials. 

The key training documents are those that teachers use when preparing to teach courses: published curricula, examination specifications, schemes of work, in service training materials, examination questions and mark schemes, examiners reports and the like. 

Other documents, such as descriptions of the nature of the scientific method, rules of historical inquiry, vocabulary lists and formulae sheets, are also significant.

These documents can be assembled into folders that could represent Science in Year 6 or French in Year 12. Some AI systems (e.g. ChatGPT) allow these to be fed into the system using scripts coded in Python. If this could be done at the level of the school or academy trust, then the material would be available every time the AI was accessed. This would save teachers time in retraining the system.
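A minimal sketch of the assembly stage of such a script, assuming hypothetical folder and file names and plain-text documents; the step that actually uploads the result to an AI system is omitted, since it varies between providers:

```python
# Hypothetical sketch: gather a subject folder's documents into one
# system prompt that could be supplied to an AI at the start of a session.
from pathlib import Path

def assemble_system_prompt(folder: Path) -> str:
    """Concatenate all text documents in a subject folder, with headers,
    into one string suitable for use as an AI's initial prompt."""
    sections = ["You are a subject tutor. Base your answers on these documents:"]
    for doc in sorted(folder.glob("*.txt")):
        sections.append(f"--- {doc.name} ---\n{doc.read_text()}")
    return "\n\n".join(sections)

# Example with a hypothetical "Science in Year 6" folder.
folder = Path("science_year_6")
folder.mkdir(exist_ok=True)
(folder / "scheme_of_work.txt").write_text("Term 1: light and shadows.")
(folder / "mark_scheme.txt").write_text("Award 1 mark for each correct label.")

prompt = assemble_system_prompt(folder)
print(prompt)
```

Because the folder, not the script, holds the subject knowledge, teachers could update a scheme of work without touching any code, which is what would make trust-level maintenance practical.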

Avoiding inaccurate or incorrect statements is a core prerequisite of successful teaching. Schools become deeply concerned when trainee teachers make mistakes that students copy. But accuracy is only a part of what good teacher tutors share with their students.

School tutors share standards and values with their students

Alongside the knowledge and skills, tutors also transmit the values, standards, rules and ethics of the school. These are designed to offer a social education that allows students to fit into the organisation of the school. 

How do trainee teachers find out what these “approved” codes of behaviour are? Policy documents produced at national, academy trust and school levels give guidance and can be used to train AIs. Equally importantly, trainee teachers observe other teachers in action and discuss issues arising in their classroom practice. The training is “on the job” because it is reflective and is determined by context. Uploaded training files can never make up for this, and this will inevitably limit the ability of AI to be an effective school tutor.  

Some students have educational plans identifying their special educational needs and these are supported by a wider literature offering strategies for successful teaching and learning. If the goal is to offer personalised support to these students, then the AI will need to be aware of their individuality. There are compliance and safeguarding implications to this. One can envisage a time when maintaining the compliance of the AI systems is a recognised role in schools. 

None of the above is intended to discourage teachers from trying to use AI: the ability of AI to simplify, translate and produce engaging text is already evident and will only get better as future versions reduce its unfortunate habit of hallucinating false information. 

An AI offering careers advice might be a real advantage in preparing students for a face-to-face discussion with a careers teacher, providing the AI was trained with up-to-date information about the rapidly changing employment sector. 

How far can this go? Can AI ever really offer effective tutoring in life choices and relationships? Tutors learn to become sensitive to context because they work alongside their students. They know when a student could try harder and when troubles at home require a gentler conversation. They can spot signs of distress and possible safeguarding issues. Can such experience ever be reduced to a training code? 

This leads to a wider question for society. Do we really want a machine to meet such basic human needs? Might such a system create a new form of psychological dependence, especially for students developing their identities through screens, amidst social media?

I have little doubt that, since there is a commercial advantage, attempts at such systems will arise. The question that schools and society need to ask is, where should any red lines be drawn? 

About this blog

This is the blog of Neil Ingram and reflects a variety of my interests over the years as a biology teacher, university academic, examiner and author.

I have been deeply interested in the use of IT in schools since Windows 3.0 came out in May 1990.

About ten years ago, I was invited to be part of the Hewlett Packard Catalyst initiative and I developed a model about how school pedagogy could work in the Web 3.0 world. Some of this has been published, but quite a lot has not.

The development of Artificial Intelligence systems is generating the same levels of excitement as Windows 3.0. As the CEO of Microsoft said, “It feels like the 1990s again!”

I have developed an introduction to the thinking:

Pedagogy AI is a roadmap for the pedagogy of a lesson using AI with students.

This is based on the ideas in a series of ten linked posts, called Teaching and learning with AI. Part one is here.

To show you round the rest of the site:

Stories from Nowhere was a lockdown project, trying to use stories to bring important ideas into middle years biology lessons. It is built on observations in a wood and on work we were doing on a 16-19 curriculum framework for the Royal Society of Biology.

Exploring the epigenetic landscape is a microsite about how genes interact with the environment and uses the ideas of Conrad (Hal) Waddington.

The home page contains a mixture of posts relating to a university course I ran on genetics, society and education.

There are also posts on Evolution relating to a book I co-authored for Oxford University Press.

The title “tools for clear thinking” is based on a book by Conrad (Hal) Waddington, whose ideas run through every article on this site. The cover image was designed by DALL-E 3, and the prompt included Waddington’s term “epigenetic landscape”. I was delighted that its rendering resembled the original conception by the painter John Piper.