Preparing for an AI-future
Written by Miranda Cook
Australia’s education system continues to grapple with how to respond to artificial intelligence, with schools taking vastly different approaches, from embracing it to imposing blanket bans.
Bridget Pearce is an English and Academic Services teacher at BGS. She is passionate about supporting neurodivergent students, such as those with ADHD and dyslexia.
Ms Pearce became cautiously optimistic about the positive impact AI software, such as ChatGPT, could have on students who struggle to express themselves through text.
“What used to be a barrier won’t always be a barrier, and I was really excited by that,” she told Grammar News.
Over her career, she has seen students fail exams, face limited tertiary options and suffer from low self-esteem because they struggled with written language.
“My first experiment with ChatGPT made it apparent that difficulties in written expression will no longer hold people back after they graduate and go into professions.”
While in awe of the technology’s ability to “even the playing field”, Ms Pearce had concerns. She noted issues surrounding plagiarism, personal data breaches and over-reliance on technology.
On a mission to learn as much as possible about the impact — good and bad — on the education industry, she has spent countless hours exploring the technology, listening to podcasts and reading articles.
Last year, Ms Pearce published her own comparative study, Beyond the Hype: Critical Questions about the Impact of AI in Education, after holding a three-day innovation summit in the Great Hall with Year 7 students.
Students were divided into two groups: those encouraged to use AI, and those who had to rely on their own thinking. The teams were tasked with finding a solution to save humanity from an AI apocalypse.
Both teams designed, prototyped and tested their strategies while keeping a process log in a journal. They created a video to persuade the public to act on their strategies.
“I suspected the students who were using AI would produce more compelling strategies and videos, even if they failed to fully understand them,” Ms Pearce said.
Much to her surprise, however, a panel of external judges deemed the work of the human-only group to be of the higher quality.
“The students who weren’t using AI said they didn’t feel limited because they could work closer together and build on each other’s ideas.
“They produced better products when they had ownership over the process,” Ms Pearce said.
The results of the summit sparked conversations among BGS students about whether AI should be used at schools.
A student said: “Ultimately, we need to learn AI because our future will be full of AI, and if we are to be best prepared for an AI-enabled future, then we need to develop those skills now.”
That stance was shared by all participants, who also agreed AI-generated work should not be passed off as original, Ms Pearce said.
Last year, education ministers formally backed a national framework guiding the use of the new technology in Australian schools from 2024.
Ms Pearce, however, said the results of her study raised questions about the framework: how much do educators really know about the extent to which AI improves or confuses human cognition? And how much time should schools invest in AI if it proves less beneficial for learning than high-impact teaching strategies?
As the AI space continues to evolve at a rapid pace, Ms Pearce holds regular meetings with BGS teachers to discuss how AI can be used in a way that still helps students develop as learners. She was also the chairperson of the three-day Generative AI for Education Leaders Summit, held in March.
“We want to empower teachers to have meaningful conversations with students about AI.
“It is important we don’t take a hard line that generative AI is prohibited at the School; we don’t think that sets students up for success.”