
The importance and place of the person in the gen AI age
By Steve Campitelli
Academic Skills Adviser, The University of Melbourne
In late 2022, the most disruptive education change agent since the advent of the Internet emerged in the form of ChatGPT. It was the first of a series of such generative AI technologies, so called because of their ability to generate responses to a text input, in platforms that are often freely available and easy to use. Gen AI enables any user with an internet connection and the ability to form a question or text ‘prompt’ to get an answer, often within seconds. Ask it to write a critical essay on a Year 12 English novel, with references, and it will do that. Prompt it to make the essay’s language more formal, suitable for university level, and it will. Ask it to recast the essay as a poem in Elizabethan English and it can, in seconds. In many respects, it adds a completely different dimension to learning, bringing many advantages to the educative context.
For those with a computer, it is an accessible, anytime, anywhere resource, available whenever it is needed. It doesn’t get tired or sick, doesn’t get annoyed at you for your mistakes, answers any question you have and is relatively easy to use – if you can form a prompt and read the answer, you can use it. It can be a support for students with language challenges and for those with social or learning anxieties; it is non-judgemental, can adjust answers to level, and can reframe them in the language or text type you want. It can summarise readings, explain complex ideas or terms, proofread and provide feedback on your work, and correct mistakes and rework your language – information-transaction tasks it is very good at. In short, it can be an active and available education partner, something especially appealing to tech-adjacent young adults.
So, what then is the place of the person here? If gen AI can do all these things with a minimum of fuss, then why have people involved?
There are very clear reasons education needs to keep people at its heart.
Education is fundamentally a social activity, a context in which, for the most part, learning is ideally co-constructed between students and teachers who learn from each other. Under this view, learning and development occur within a shared, active, two-way process between people, one which requires both parties to recognise a present ‘other’. There is an emotional aspect to this context, in which people recognise other people as being happy, excited, nervous or anxious and frame their interactions accordingly. We use our situational and social-emotional understanding to congratulate, celebrate, empathise, support, guide and question, to laugh, to correct, to put an arm over a shoulder. Learning, in human-to-human terms, is therefore grounded in a dialogue between people, to which we bring a sense of ‘self and other’ in a mutually recognitive context.
A gen AI platform cannot perform this role. It has no sense of self or other; it cannot ‘recognise’ you, it doesn’t ‘notice’ if you are stressed, and it isn’t ‘sorry’ for a mistake it has made in the information it has given you. It doesn’t have consciousness or emotions; it isn’t aware of you or of itself at all. It produces answers by pulling together predicted language patterns based on the data it has been trained on. It has no awareness of, or care for, the truth of its statements, in the same way that Microsoft Word has no awareness of or care for the truth of this article.
This matters to us because, especially in an increasingly tech-dominated world, people’s voices matter. We are concerned for the truth of what is being said, for what others think, and for their opinions and feelings. Central to this is the implicit recognition of each other as people. Teachers recognise students as people who are at school to learn academically and to develop socially and emotionally, while students recognise teachers as people invested with the authority, knowledge and understandings to support them on that journey. Students also recognise each other as young people with agency, vulnerability, hopes, dreams and aspirations. This recognition can only be conferred by other people; it makes no sense to seek it from a non-human agent that does not recognise you at all.
Further, gen AI also makes mistakes. There are a variety of reasons for this, but gen AI does generate ‘hallucinations’ – fabricated or outright incorrect information. Not all the time, not even most of the time, but it does do this, raising concerns about its trustworthiness and about the unquestioned acceptance of its outputs. Students need to engage critically with the information they are working with, and this requires mental effort. That effort is one of the central planks of person-centred education as we understand it, in which learning is co-constructed within the active, vibrant, dialogic, social contexts outlined earlier. Handing that mental effort over to a tool such as gen AI is what is termed ‘cognitive offloading’. This isn’t problematic in and of itself, of course, but overuse of, or dependence on, the tool to do this mental processing all the time does not help students become the critical enquirers and curious thinkers we hope will drive learning and thinking forward.
So, gen AI needs to be used critically, with care and for what it is good at. It is a tool, an extremely capable and impressive one, but like any tool it exists to support us. We can use it for a range of non-recognitive, transactional tasks; however, we also need to be active, enquiring agents, people interacting with and recognising others in the social, educative world, and a person-centred education remains central to how that occurs.
