Rethinking Education in the Era of AI

Winn Wing-Yiu Chow, Senior Lecturer, School of Computing and Information Systems, The University of Melbourne


AI is undoubtedly a revolutionary force, and its advancements are happening at extraordinary speed. We are witnessing, in particular, the rise of advanced generative AI language models like ChatGPT and Gemini, which not only understand text but also generate content that closely mirrors human writing. Nobel laureate Geoffrey Hinton recently noted that AI language models are not merely predicting the next symbol but are actually reasoning and understanding in ways similar to humans, and that they will continue to improve as they grow more complex. AI should therefore not be seen as merely another new technology like Virtual or Augmented Reality; it is poised to be a transformative force, much like the steam engine and machinery during the Industrial Revolution or computers and the Internet during the Information Age. Just as machinery ushered in the era of “labour” and computers brought the “information” age, AI is now bringing us the age of “intelligence.” This new form of intelligence is not just the narrow, specialized intelligence seen when AlphaGo defeated the world’s best Go players; it is a broader, more human-like reasoning capability that is on track to surpass human intelligence in most areas.

Start from the beginning: What should be the intended learning outcomes?

As AI continues to revolutionize the world, it is set to disrupt our future in profound ways, influencing how we live and work. Silicon Valley billionaire Vinod Khosla predicts that AI will handle 80% of the work in 80% of jobs. If this is indeed our future, how effectively are we preparing our students for it? This reality calls for a departure from the traditional “teaching as usual” approach and requires us to stop and rethink our educational design from the ground up. Are our intended learning outcomes, established years ago, still relevant in the era of AI?

  • What is the purpose of education in an AI-driven world?
  • How can AI skills enhance employability?
  • What discipline-specific knowledge and skills can be supported by AI?
  • What graduate attributes are necessary in an AI-driven world?
  • How do we address AI proficiency and ethics?

The answers to these questions are essential for guiding our revision of the intended learning outcomes for our courses and subjects.

While many employees are already using AI tools in the workplace, and data indicates that AI enhances productivity and leads to higher-quality work, many universities, by default, prohibit students from using generative AI for their assessment submissions. Universities do have good reasons for this approach, which we will discuss later. However, this disconnect should prompt university educators to pause and reconsider the intended learning outcomes and assessments for their courses.

AI is not a calculator and should be thought of as a “Human” partner

We have likely all heard discussions about whether to allow AI in the classroom, with some comparing it to the introduction of calculators. Those who draw this comparison often recall their own experiences with calculators, suggesting that AI could likewise be integrated into our teaching without significantly disrupting the overall educational approach. However, unlike calculators, which possess narrow intelligence and are limited to performing mathematical operations, AI demonstrates “human”-like expert performance across a wide range of general tasks, making it inappropriate to view AI as a simple, specialized tool.

A new conceptual approach to understanding AI was recently proposed by A/Prof Kate Tregloan and her collaborators and presented at ASCILITE 2024. She suggests we think of AI as a “human” partner in learning, given its ability to demonstrate human-level capabilities in many areas. This perspective encourages us to view AI not just as a tool, but as a cooperator or collaborator in the learning process, capable of actively supporting students and educators. For example, AI could act as a “group member” in a cooperative assessment alongside students. In this scenario, students would take on the role of project leaders, assigning specific tasks to the AI, while the AI generates sections of the final output based on their guidance. This new way of thinking opens up exciting possibilities for redefining the learning and working relationship between students and AI.

AI can be overly powerful for certain learning purposes

There are good reasons why universities, by default, prohibit students from using generative AI for their assessment submissions. If you want your son or daughter to learn the fundamentals of physics, you would not want them relying heavily on an AI tool that possesses expert subject knowledge. AI tools like ChatGPT will not restrict themselves from providing complete answers, and it goes against human nature for students not to seek those answers when AI use is allowed. As a result, the likely outcome is that AI would dominate the learning process, depriving students of the opportunity to engage deeply with the material and develop their own understanding and skills.

On the other hand, consider when your child is approaching graduation and needs to complete a capstone project, one that involves solving a real-world industry problem. In this case, the industry partner is likely to expect students to be proficient in using AI tools like ChatGPT to assist with tasks such as research, data analysis, or brainstorming solutions. This would mirror the AI-enhanced collaboration they will likely encounter in the workplace. In such contexts, AI becomes a vital tool that supports critical thinking and problem-solving rather than diminishing the learning process.

Hence, there is no one-size-fits-all solution, and ChatGPT is no exception to this rule. The integration of AI must be tailored to each educational context, whether it involves learning foundational skills or addressing complex, real-world problems. This nuanced understanding has led to the development of an AI assessment scale, which indicates that different levels of AI use should be permitted across various assessments. However, at the lower levels of AI use, there is a notable lack of appropriate AI tools that can provide the right level of learning support. ChatGPT, for instance, is too powerful and complex for many foundational learning purposes. Therefore, further research and development are necessary to create specialized AI tools that align more closely with educational goals, ensuring that students receive appropriate support without compromising their learning. Perhaps it is time to consider how we can transform ChatGPT into a more targeted educational tool, akin to a “calculator,” that enhances learning rather than overshadowing it.
