Teaching and learning with AI (part 10)

The learning community

The learning community refers to the impact of the wider community on teaching and learning. Its members could be friends and family of students, who have interesting life stories, or professionals engaged in the public understanding of their subjects.

It is said that “It takes a whole village to raise a child.”

To see how AI might become a useful part of the learning community, we need to ask whether AI is a tool, an agent, or an actor.

A tool is a machine that does a job for us, like a sat nav that gives us directions to our destination. 

An agent has a greater level of autonomy. If the sat nav monitors traffic in real time and recommends alternative routes to avoid traffic delays, then it is showing some level of agency. 

An actor shows the greatest degree of autonomy and independence. If an AI notices that a friend has not left their house for three days and has made no contact with anyone else, and asks whether you want to visit, then it is being an actor – a digital personal assistant – especially if it then makes arrangements for the trip, including booking a hotel room.

An actor does not have to be human. The French sociologist Bruno Latour counts non-human entities as actors as long as they influence social situations.

At present, AI acts as a tool in most of the uses discussed in these articles. Thinking tools are useful, as long as they do not prevent people from thinking for themselves. More sophisticated uses of AI, such as acting as a study buddy, may allow the AI some degree of agency to direct the learning.

Future developments in AI might produce a device that acts like a personal tutor. The news that Microsoft is patenting a technology to offer an “AI-powered emotional management system” as part of an emotion-centred journaling feature suggests that AI might become a motivating actor sooner than we thought.

The AI and Education post: Enhancing Counselling Education Through AI: A Progressive Approach presents an AI application that develops the counselling skills of mature students, who “role-play both counsellor and client, developing empathy and counselling techniques. ‘Call Annie’ [an AI bot] offers face-to-face interaction via video calls”. Some students like the system so much that they are using it as “a virtual therapist”.

These are early days for the project, and it is likely that the application is acting as a training tool with some level of agency. More sophisticated iterations in the future might become so autonomous that they become social actors.

Throughout these articles we have stressed that pre-training AI is the best control teachers have to ensure that its use is productive and safe. This also applies to actor-level uses of AI. 

The wider discussion for society is whether we want to enter such intimate relationships with machines, given the risks of psychological or emotional dependence. Many of us can become addicted to our screens. For others, such technology extends possibilities and horizons in the most extraordinary of ways.

Q. This article considers whether we, as teachers or as members of society, should draw boundaries to limit the possible uses of AI in education.

Where do you think the boundaries should be drawn?

Previous article

Updated 16/01/24 to include end of article question.

AI and how we teach in schools

Teaching looks so easy: all we have to do is stand up and talk. Lots of us can do that! So, if this is all there is, could an AI chatbot one day become an effective classroom teacher?

Obviously, a fundamental part of teaching is instruction: the transmission of knowledge and skills from an expert to a novice student.

Teachers learn to select appropriate knowledge and skills, and to use appropriate language, lesson sequencing, and techniques of communication.

These allow ideas to be received and understood by their students.

AI can help teachers prepare for each of these areas, although how teachers control these processes is crucial to their chances of using AI successfully, as we shall see in future posts. 

However, content is not the only thing that is taught in lessons.

A second discourse runs in parallel alongside the instruction.

This regulates students’ behaviour and develops their social attitudes.

Instruction and regulation are so completely intertwined that they are inseparable.

This post looks more carefully at how teachers regulate students’ behaviour and its implications for how AI can be instructed to plan lessons.

The regulation of students’ behaviour in schools in England is increasingly controlled by the Academy or even the Academy Trust. Individual teachers have much less control over this than they did a generation ago. Similar trends can be seen across the world.

The regulation is a series of rules that shape how students should behave in school:

  • how they should dress,
  • how they should behave in corridors and classrooms,
  • how they should walk between rooms,
  • how they should talk to teachers and to each other.

These behaviours create compliance in students. Students who cannot or will not comply can expect a series of sanctions of increasing severity. These rules can be taught to AI easily enough, since they are usually enshrined in published school policies. AI would be discouraged from saying things that were against the school rules.

It would also be relatively easy to train AI to adopt certain pedagogical styles used by teachers, such as scaffolding, retrieval practice, modelling, recall and interleaving, by including a description of the practices as part of the initial prompts. Later prompts such as “be a classroom teacher that uses the technique of retrieval practice to produce…”, could be used to create useful classroom materials.
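The idea above – embedding a school's rules and a description of a pedagogical technique in the initial prompt – can be sketched in code. The sketch below is illustrative, not a real system: the policy text, the technique descriptions, and the `build_messages` helper are all hypothetical, and the output simply follows the common "system + user" chat-message convention used by most chat-based AI services.

```python
# Illustrative sketch: combining school rules and a pedagogical style
# into an initial prompt for a chat-based AI. All text here is a
# placeholder, not a real school policy.

SCHOOL_POLICY = (
    "Students address teachers politely, behave calmly in corridors "
    "and classrooms, and follow the published uniform code."
)

# Short, hypothetical descriptions of two techniques named in the article.
PEDAGOGY = {
    "retrieval practice": (
        "Begin by asking students to recall key facts from memory "
        "before presenting any new material."
    ),
    "scaffolding": (
        "Break the task into small steps and gradually remove support "
        "as students gain confidence."
    ),
}

def build_messages(technique: str, task: str) -> list[dict]:
    """Assemble a system prompt (rules + technique) and a user request."""
    system = (
        "You are a classroom assistant. Follow these school rules: "
        f"{SCHOOL_POLICY} "
        f"Teach using {technique}: {PEDAGOGY[technique]}"
    )
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": task},
    ]

messages = build_messages(
    "retrieval practice",
    "Produce a starter activity on photosynthesis for Year 9.",
)
print(messages[0]["content"])
```

The design point is simply that the regulative discourse (the rules) and the instructional technique are stated once, up front, in the system message, so that every later prompt such as "produce a starter activity…" inherits them.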

Giving students unrestricted access to AI could cause problems if the topic being studied involves ethical or moral discussions.

All teachers are expected to abide by the “ethos” of the school, which includes upholding shared values. These include agreed attitudes to relationships education, citizenship, and social attitudes.  These are included as part of the regulation discourse and should be embedded in all lessons.  

Unless AI is specifically trained to produce answers consistent with a school’s ideology, it might produce answers that would make class teachers and students feel uncomfortable or embarrassed. See my earlier post on ‘Can AI become a school tutor?’ for more on this.

AI has a huge potential to become a reliable assistant for teachers, but first it needs to be taught what a teacher’s job actually involves, which is far more than standing up and talking. 

This is the first of a multi-part series that uses the ideas of the pedagogic device developed by Basil Bernstein to explore the potential impacts of AI on teaching in schools. It draws on work undertaken by Neil Ingram on Pedagogy 3.0 for the Hewlett Packard Catalyst initiative.