Artificial intelligence is the newest student in the class, and it’s not going anywhere

Image generated by Canva AI Image Generator.
In classrooms across the country, artificial intelligence (AI) is quietly reshaping the future of education, offering personalized learning experiences that once seemed like the stuff of science fiction.
The above sentence was written by ChatGPT, after being prompted to write the lede — or opening sentence — for this story. The large language model, which generates answers to user queries by drawing on vast amounts of text, was made available to the public in late November 2022.
So far, the robot isn’t wrong — AI is here to stay. And while “personalized learning experiences” can certainly happen through its use, AI also presents dire consequences when it comes to academic integrity.
A series of documents released by the provincial government last spring titled “Digital literacy and the use of AI in education” addresses issues related to AI, and how the tool can be implemented in classrooms. One document is dedicated to connecting curriculum goals with AI topics. It mentions courses like Computer Studies 10 and Robotics 11, pointing to AI connections in digital literacy competencies for younger grades.
But AI poses a threat that reaches beyond specific courses and competencies. According to some teachers, as AI grows better at producing writing that convincingly passes as a student's own, it may have negative impacts on our educational systems.
Digital literacy starts in K-12 classrooms, but the consequences of not learning this skill set don't end in high school. Universities are also struggling to keep up with the unprecedented number of AI-plagiarized assignments being submitted by students.
Last year, a study of nearly 4,000 university students from 16 countries found that 86 per cent use AI in their studies, with nearly a quarter using it every day.
In order to tackle AI at post-secondary institutions, we need to start at the source: the elementary and high school systems that are supposed to prepare students for our ever-changing world.
AI growing pains

Image generated by Canva AI Image Generator.
Classroom teachers are grappling with how to manage the spread of AI and encourage its use in an ethical way. Teacher and librarian Wendy Burleson at Victoria High School says that AI has forced her to rethink the way she structures her assignments.
When teaching research skills to Grade 12 students, she invites students to use ChatGPT to brainstorm research ideas, but only after they’ve conducted research on their own.
“It’s not a blanket, terrible thing,” she tells the Martlet from the brand new Victoria High School library, which was reopened in April 2024 after seismic upgrades.
Students sit at computers in a brightly-lit computer lab, working on homework after Burleson reminds them not to use their cellphones during class.
Burleson feels that managing expectations of academic integrity and AI adds more work to the already overloaded schedules of teachers.
“I’m intimidated by it and I’m a longtime teacher,” she says. “I find that … teachers are left to figure out a lot of things for themselves with few resources.”
She adds that professional development days are valuable, but concerns about AI can’t be addressed in such a short time.
“It’s the elephant in the room that no one wants to talk about,” she says. Burleson says she believes fear of the unknown is what made administrators hesitant to start addressing the issue when ChatGPT was first released.
Burleson says the first time she heard discussions about AI, it was among colleagues, not administrators. She conducted a workshop with staff when the program first came out, in an attempt to become familiar with the tool that her students were already using.
She says that one strategy employed by her school's English department is having students complete writing samples early in the year using pencil and paper. This establishes a baseline writing ability, which can help detect plagiarism later on.
Burleson says that other teachers ask students to complete work in Google Docs, where instructors can review a document's version history to see how quickly the writing was produced. If a student suddenly pastes an entire paragraph into the document, this might indicate that the writing came from somewhere other than their own mind.
When it comes to the daunting transition to university, Burleson says that students need to be taught that using AI for plagiarism will affect them negatively in the long term.
“If they use it now as a crutch, it’s going to harm them later, because they’re not going to be leaving with fundamental skills.”
Preparing prospective teachers for the AI classroom

Image generated by Canva AI Image Generator.
Dr. Michael Paskevicius is an associate professor of educational technology at UVic. One of his roles is teaching the next generation of educators about how AI will shape their careers.
One course, “Technology and Innovation in Education,” is mandatory for students pursuing undergraduate degrees in education at UVic, along with those in the post-degree elementary, middle, and secondary education programs. Among the topics it covers are AI, its applications, and how students might use it.
“You could probably run a whole class on this,” he says. Paskevicius feels that the rapid growth and capabilities of AI make it difficult to pinpoint exactly what student teachers need to know about the technology right now. Therefore, he says teacher education aims to foster curiosity, resilience, and grit — qualities he feels will help future educators to navigate the fast-moving world of AI.
Paskevicius thinks there is a future where AI-specific learning competencies are written into teacher education curricula, but he also warns that the focus of these curricula should not be placed on how each new AI model functions.
“It’s a career where you’re always learning,” he says.
Paskevicius says that the key to actually reducing extra workload for teachers grappling with AI plagiarism is to address the concept of AI openly with students.
He feels that tracking changes in Google Docs may work for some teachers, but that this approach also has the capacity to create a “culture of mistrust.”
And, he adds, “there’s a fair bit of labour involved.”
Paskevicius suggests an alternative way to discourage the use of AI: assessing student work in a variety of ways rather than relying on written work alone. One way to do this is by assessing the process of a student's learning, rather than only their finished product.
He encourages the use of process-based questions, like “When did you change directions?” “What made you think that way?” and “Who did you talk to that gave you that new idea?”
This prevents students from using AI as a shortcut, since a chatbot can't answer reflective questions about a student's own learning process.
Paskevicius also raises environmental and ethical concerns about the energy used to fuel AI computing. He says that it’s important to know who is behind many of the large language models that we’re used to using, such as ChatGPT.
“We must recognize that [AI programs are] also run by large technology corporations. They’re not unbiased,” he says. “They’ve all got interest in profiting.” For example, note the positive spin ChatGPT put on AI when asked to create this article’s opening sentence; I didn’t ask it to do that.
He feels that “it’s only a matter of time” before advertisements are introduced to AI platforms, citing companies like Netflix and Uber as precedents.
Paskevicius says that guidelines from the ministry are good to see. “Teachers need that. They often need examples or ideas — especially new teachers — of how this could work,” he adds.
In a statement to the Martlet, a B.C. Ministry of Education and Child Care spokesperson says that other school districts have started to adopt these guidelines as well, to strengthen their response to AI usage.
The B.C. Ministry adds that “international jurisdictions have engaged with the ministry to learn from [our] approach, adapting [our] document to align with their own contexts.”
“It’s a tool that we have available to us,” says Paskevicius, comparing the introduction of AI to the invention of the calculator. “[Calculators were] eventually integrated as part of the tools that help humans be productive, creative, and thriv[ing].”
Faith-based AI education

Image generated by Canva AI Image Generator.
In some private schools, faith comes into play when deciding how AI will be implemented and addressed in education.
The Vatican released a document on Jan. 28 called “Antiqua et Nova: Note on the Relationship Between Artificial Intelligence and Human Intelligence,” which has a dedicated section addressing education.
Paul Rossetti, superintendent of Island Catholic Schools (ICS), says that this document helps guide his schools’ approach when it comes to AI.
“I like that it isn’t prohibitive,” he tells the Martlet. Rossetti feels the document effectively highlights the role of prudence, or good judgement, a concept passed down through Church leadership which he feels is not addressed sufficiently in contemporary society.
“We can’t stick our heads in the ground,” he says. “We have to be knowledgeable. We have to be aware, and then we have to use prudence in terms of how we share this and prepare our students.”
Rossetti points to the Canadian Conference of Catholic Bishops’ recent guidelines, which present social media as another tool for Catholic educators facing challenges with changing technology.
“Don’t forget who you are, stay connected to human beings, and use good judgement and our Catholic ethos — in whatever we do — and that includes engaging with AI,” he says.
When it comes to protecting students from the dangers of AI, Rossetti says that fostering an environment of critical thinking is important. “That’s really ultimately what we need to be prioritizing, especially among high school students,” he emphasizes.
The ICS held a diocese-wide AI professional development day this past November, says Rossetti, where teachers learned how the tool can be used in their classrooms.
Rossetti feels that peer-to-peer learning among teachers is the most powerful way for AI skills to be taught. He says there are certain teachers in each of the diocese’s five schools who are eager to share about their experiences using AI.
In fact, the diocese encourages its teachers to use Magisterium AI, a Catholic AI tool that provides customized answers to faith-related questions, like finding Bible verses relevant to user queries.
Rossetti notes that safety remains a priority when students and teachers are engaging with AI. He consults with his IT team consistently to make sure that the tools being implemented in classrooms are secure.
Part of his students’ overall internet safety involves limiting cellphone use for certain grades, something Rossetti says ICS implemented ahead of the ministry’s fall 2024 ban.
“It’s a balance,” he says. “We want [students] to know about this stuff, and we want to teach them through it, but we also don’t allow unfettered access, because we want them to learn the human skills that will help them.”
Rossetti says that allowing students access to AI while preserving their education is “a dance that [he’s] still figuring out.”
Potential positive impact — for students and teachers alike

Image generated by Canva AI Image Generator.
Some teachers feel that AI presents an opportunity to level the playing field for students with unique learning needs.
Dr. Graeme Mitchell, a teacher at Claremont Secondary and sessional instructor at UVic, says that AI helps him adjust assignments, tailoring the level of difficulty to individual students.
Mitchell says he uses ChatGPT to simplify reading materials, like “current events” articles in his social studies class, so that students in younger grades can understand them more easily.
“There’s ways that we can personalize and really target learning and tutoring,” he tells the Martlet, describing AI as “a learning tool for kids who have challenges.”
Paul Rossetti at ICS echoes this idea, describing AI as a “beautiful opportunity” for students with learning disabilities. He says that AI platforms make already widely used accessibility tools, like speech-to-text, even more effective.
Mitchell says that AI is also a great tool for incoming teachers. “For developing lesson plans and rubrics, it’s a game changer,” he says, adding that AI tools can help to generally reduce teacher workloads.
“In a period where teachers are always being asked to do more work with less time, I think it’s going to be something that helps balance things a little bit,” he adds.
Mitchell feels that the ministry should provide more guidance regarding AI detection policies and academic integrity guidelines. He says that AI-based learning competencies should be added to the provincial education curriculum as well.
“Unless you have teachers who are going out of their way to use these platforms and are passionate about teaching themselves, you’re going to have pockets where teachers know what they’re doing, and then a ton of schools or programs where they’re still in the dark ages,” he says.
In their statement, a ministry spokesperson says that “there are no current plans to introduce additional curriculum learning standards regarding AI,” stating that digital literacy standards already address the new technology.
Mitchell estimates that 60 to 70 per cent of all assignments he reads are completed with the help of AI. To combat this, he tracks changes in Google Docs, holds Socratic seminars to gauge learning in real time, and asks students to write in class with pen and paper.
While he looks forward to AI teaching workshops at Claremont next year, he feels that there aren’t enough resources for teachers.
“A lot of educators are clamouring — like, we need this yesterday,” he says. “I don’t know of anyone at my school that’s looking [to] the ministry for their leadership on this.”
“For a whole cohort of students that are going through high school in the next four or five years, this is going to be a bit of a tricky time with a lot of uncertainty,” says Mitchell.
He says that some students are already using AI to extend their learning, whereas others are feeling a decrease in writing ability from overreliance on the tool.
“I’m hoping that in the end it will be something that will enhance outcomes, but I’m not fully confident about that at this time,” Mitchell says.
It’s clear that AI has already disrupted the teaching landscape, for better and for worse. If I were to feed ChatGPT my interview transcriptions right now, it could rewrite this entire feature story for me in moments.
And it might only be a matter of time before AI can conduct interviews for us, too. If that were the case now, however, I wouldn’t be growing and learning as a student journalist.
Michael Paskevicius thinks that good will come from this disruption. “If we can enable students to make good use of [AI], ethically and sensibly and justly, to thrive and live a good life, that would be the ultimate outcome.”
It’s an interesting paradox, which is beginning to play out in every busy classroom across the globe.