Generative artificial intelligence is giving rise to “useful and fruitful” conversations in universities, as students and academics grapple with the “high-speed evolution” of education.
UNSW Sydney’s adoption of an AI detection tool has not revealed “huge spikes” in student misconduct, with those found to have engaged in contract cheating often “stung twice” because the assignments they had paid people to produce proved to have been written by ChatGPT.
But the detection tool has exposed considerable use of AI to correct grammar or translate assignments from students’ primary languages into English – applications the university had been “anecdotally” aware of, according to the university's acting pro vice-chancellor for education and student experience, Alex Steel.
“This tool now allows us to see the extent of it,” he told the university’s inaugural NSW Higher Education Summit. “It allows us to have conversations with ourselves about whether or not we think that’s acceptable, and with students about whether it’s the right way to do things.”
Students are also using tools like ChatGPT to write résumés, complaints and emails to academics “so we understand what they’re actually asking”, Professor Steel said. They harness the chatbot to summarise their notes, interpret their lectures, find topics for assignments and “structure” initial drafts.
“If a student has never seen an answer for a topic before, they’re in the dark with their first attempt,” he said. “They’re not writing their answer with [AI] but it’s really informing the way they then approach their answer.”
Students recognised that they were “shortcutting” learning by using ChatGPT to summarise their notes, spurring them into an “intelligent conversation” – with academics and among themselves – about doing “the hard work…to get the learning reward”.
Meanwhile, attempts to design AI-proof assessment activities – from posters, videos and vivas to mandated mathematical techniques or “particular types of critiques” – had forced academics to question their fundamental objectives.
“What’s the learning outcome from the assessment? What’s the course all about? [AI detection] tools are…encouraging us to rethink our assignments [by providing] a feedback mechanism [on] how effective our assessments are.”
Professor Steel said students would be able to produce “perfectly polished” assignments through constant AI refinement. “That completely turns around our notion of what we’re trying to do in assessment,” he said. “Everybody’s answer will be perfect, but their underlying idea might be flawed. What we’ll actually be looking at is the underlying…concepts.”
He predicted that many universities would be “mainstreaming AI as part of assessment” by the end of the year. Students could be required to “write with AI and critique it”, “draw on” AI-produced drafts or include the drafts in appendices.
Academics were already harnessing AI in exam preparation, testing their multiple-choice questions against ChatGPT and using only those it could not answer.
“If you really want to show the students…an average answer to the problem you’ve just set, get ChatGPT to produce an answer in class [using] just a few key points,” Professor Steel added. “Then you can all pick it apart. At the end of it all, the students realise that they’re smarter than the AI.”
He said AI had triggered rapid evolution rather than a revolution in teaching and learning. “With revolution, you throw out everything,” he explained. “Evolution builds on what you have.”
Nevertheless, ChatGPT had proven challenging for staff returning from “a relaxing Christmas break to discover the world had changed on them”, he conceded. “I don’t think we would have coped with this year if we hadn’t been through Covid.”