AI powerful tool for educators, but comes with risks: AU prof

Dr. Levina Yuen of Athabasca University highlights peaks and pitfalls of generative AI in the educational world as students return to classrooms
The university held consultations on its Athabasca campus Jan. 25.

Students across the province have returned to their classrooms for another year of learning, which means parents and educators are back in the ever-present cycle of trying to stay up to date on technology trends.

Today’s students have a variety of powerful tools at their disposal, but none represent a bigger opportunity — or headache — than AI.

“AI is a fantastic tool when it’s used appropriately, and I think it’s really important for us to understand and develop AI literacy so students can understand when AI can, and should, be used,” said Dr. Levina Connie Yuen, an associate professor at Athabasca University who specializes in integrating technology into the classroom.

“AI has the potential to revolutionize things and open up the doors with the sharing we do when it comes to learning.”

Generative AI, the type of AI Yuen is referring to, has been experiencing a boom in recent years. ChatGPT — a chatbot developed by OpenAI — is one example of the technology, which can respond to user prompts and create quasi-original content ranging from book summaries to images.

Yuen said it is important for students to learn how to properly use the technology. Sticking with ChatGPT, the tool can be used to brainstorm project ideas, set up a schedule, or act as an outside perspective if peers aren’t available.

The technology can also give biased answers, and its output can be passed off as a student's own work.

“It’s really important for students to get into this habit of acknowledging when AI tools are being used — it ties in well with academic integrity and giving credit where credit is due,” said Yuen.

It’s also important to fact check the results a generative AI produces. Because the programs learn through what is effectively guided trial and error on a massive scale, they are prone to making up answers, a process known as hallucination.

“It’s something that you need to double check; is it biased or unbiased? Is it relevant to the question you’re asking?” said Yuen. “It goes back to using AI to complement a lot of work that we do, rather than trying to replace some of our own efforts in the learning process.”

Technological downsides

Yuen also specializes in “digital etiquette,” or how society teaches young people to be responsible citizens in the online world. Being transparent about AI usage is one component, but it also includes cyberbullying, critical thinking, and more.

“Most parents, their top two concerns shown by research deal with this idea of a lack of critical thinking as AI takes over that role, and then the idea of cyberbullying,” said Yuen. “These are concerns parents have in general, so they don’t just apply to AI, but these come to the forefront when we talk about use as an educator.”

Some youths have used generative AI programs for a process called “deepfaking,” where a fake, but realistic video or picture of a person is generated, often in a compromising situation.

Technology can often outpace regulations around its use — Yuen pointed to the recent provincial decision to ban cellphones in classrooms as one example — and fear of what a technology can be used for can also be prevalent in educational settings.

Despite that, Yuen said she hopes teachers, parents, and administrators can learn to blend the tech into the education system in a responsible manner.

“AI isn’t going away — it’s here to stay, and it’s important for us not to be fearful of it.”
