Teaching and learning in the AI era

EducationDaily
The University of Sydney says it's committed to helping students explore the potential of generative AI ethically.

The University of Sydney was early to embrace generative AI. In February 2023, the institution declared that “AI tools will become part of every workplace”, and said it wanted its students to be those who master the technology and “learn how to build on the work produced by AI”.

Today, says Frank Grippi, the University's Director, Strategy & Architecture, it is as committed to embracing new technologies and driving innovation as it is to academic excellence.

“The University of Sydney is committed to ensuring that equity and inclusion are at the heart of every learning experience we deliver, regardless of the physical location or mode of delivery,” he says.

“Leveraging technology underpins our approach here – it allows us to effectively support around 70,000 students and close to 26,000 academic and support staff, including part-timers. It also helps us create a more connected learning environment across multiple teaching and research locations, including a station in the Great Barrier Reef Marine Park.”

With this in mind, the University wants to take “everything it knows about effective pedagogy and think about how AI can support this to make teaching and research easier and improve students’ learning”, says the institution’s Innovation Lead, Jim Cook.

Adopting AI responsibly

The University’s AI journey began about eight years ago when its Digital Innovation team started exploring the potential of large language models to benefit educational approaches by offering personalised and interactive learning experiences for students and teachers alike.

“Our computer science research teams have been exploring new horizons in AI for decades and doing outstanding academic work. But we’ve been waiting for the technology to hit a pivot point where it would be fiscally responsible to implement AI tools more broadly across the organisation,” says Cook. “That moment has now arrived with the emergence of powerful generative AI tools that are also widely accessible.”

Today, he says the University is taking proactive steps to equip every member of its community to use AI responsibly and productively. This has included running workshops and consultations for staff and students and developing guidelines and educational resources on the appropriate use of AI for learning and assessments.

Empowering staff and students to harness AI’s potential

The university’s eagerness to embrace generative AI has led to considerable interest from staff and students in developing new teaching and learning solutions that harness the technology’s capabilities. Support staff have also expressed interest in finding ways to make administrative processes more efficient, such as onboarding students.

Anticipating this response, and to meet demand for new use cases, the university's ICT division began developing a large-language-model-as-a-service platform based on Microsoft Azure in late 2022.

“The university uses a shopfront model whereby ICT acts as the enabler by providing a common platform that other University groups focused on specific education, research and operational needs can use,” says Grippi.

“Using our approach, it’s possible to create an application in about 20 minutes – it’s one-click, based on deploying simple code and gets a basic product into people’s hands for trialling and testing very quickly.”

But as Cook explains, creating the platform was a complex job:

“We spent about six months figuring out how we’d put it all together, then we built it in just three months, launching in March 2023. It was pretty complex because when we started, the tools for what we wanted to do didn’t really exist,” he says.
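The article doesn't describe the platform's internals, but a "one-click" chat application on an Azure-based LLM service typically amounts to a thin wrapper around an Azure OpenAI deployment. The Python sketch below illustrates that general pattern only; the endpoint, deployment name and environment variables are assumptions for illustration, not the University's actual configuration.

```python
# Minimal sketch: a single-file chat helper against an Azure OpenAI deployment.
# Endpoint, key and deployment names are illustrative placeholders.
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],  # e.g. https://<resource>.openai.azure.com
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-15-preview",
)

def ask(question: str) -> str:
    """Send one question to a deployed chat model and return the reply text."""
    response = client.chat.completions.create(
        model="gpt-4o",  # the name of your Azure deployment (assumed here)
        messages=[
            {"role": "system", "content": "You are a helpful assistant for university staff."},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(ask("Summarise the steps for onboarding a new student."))
```

Keeping this wrapper thin is what makes the "shopfront" approach work: each new use case reuses the same platform plumbing and only changes the prompt and data it is pointed at.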

Developing innovative use cases

So far, the university’s ICT team has built 33 minimum viable products on its Azure platform. One of the most successful solutions is Cogniti, an AI assistant for students, developed by Danny Liu, Associate Professor of Educational Innovation.

Cogniti allows teachers to create their own AI chatbot ‘agents’ that can be steered with specific instructions and resourced with specific contextual information from units of study. These agents can be embedded into the University’s learning management system, providing a seamless experience for students – they do not have to sign up for a separate account, and the institution provides their AI access.

Teachers have full visibility over conversations with Cogniti agents, and students can flag and give feedback on AI messages.
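Cogniti's own code isn't shown in this article, but the behaviour described above follows a familiar pattern: the teacher's instructions and unit-of-study material are combined into a system prompt that constrains every exchange with the student. The sketch below illustrates that pattern only; the instructions, unit context and deployment name are invented for illustration and are not Cogniti's implementation.

```python
# Sketch of a teacher-steered agent: instructions plus unit context form the system prompt.
# All names and content here are illustrative assumptions, not Cogniti's code.
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],  # institution-provided access, not a student account
    api_version="2024-02-15-preview",
)

TEACHER_INSTRUCTIONS = (
    "You are a study coach for an occupational therapy unit. "
    "Challenge students to justify their intervention plans rather than writing the plans for them."
)
UNIT_CONTEXT = "This week's materials cover goal-setting frameworks and evidence-based intervention design."

def agent_reply(conversation: list[dict]) -> str:
    """Answer the latest student turn within the teacher's guardrails and unit context."""
    system = {"role": "system", "content": f"{TEACHER_INSTRUCTIONS}\n\nUnit context:\n{UNIT_CONTEXT}"}
    response = client.chat.completions.create(
        model="gpt-4o",                    # assumed Azure deployment name
        messages=[system, *conversation],  # full history is retained, so teachers can review the log
    )
    return response.choices[0].message.content
```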

“Using Cogniti, students can get instant, personalised support, guidance and feedback, including explanations of key concepts and coaching on study techniques,” says Cook.

“It can also boost staff productivity by helping teachers with time-consuming tasks such as creating rubrics to establish criteria for assessment.”

Since Cogniti’s soft launch in October 2023, educators from 30 institutions in Australia, New Zealand and Singapore have created more than 600 AI agents using the solution. These AI agents have engaged in over 31,500 conversations with more than 10,000 users and answered thousands of syllabus and content questions.

Cogniti has also improved personalised feedback for thousands of students, including those who used Cogniti agents during a workshop to develop occupational therapy intervention plans in the University’s Faculty of Medicine and Health.

“I thought the AI was very good,” says one student. “There was a balance between challenging our suggestions that encourages us to think and back up our ideas, and affirming our suggestions with add-ons that improve the strategy delivery.”

Cogniti is currently used across 300 units of study and the University expects this number to double by the end of 2024.

Another successful project on the University’s Azure platform is the generative AI policy navigator developed by the University’s Digital Innovation team. This tool, built using Microsoft’s public sector Information Assistant accelerator, simplifies staff access to the university’s 360 policies.

“Navigating our extensive policy library can be challenging for our 26,000 staff. However, as they become literate in working with generative AI and ask the policy navigator the right questions, they can gain some real value from it,” says Cook.

“A big reason this project has been so successful is the flexibility of Microsoft’s Information Assistant accelerator, which allowed us to provide a robust and safe tool customised to our needs.”
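Microsoft's Information Assistant accelerator is, in essence, a retrieval-augmented generation (RAG) pattern: documents are indexed, the passages most relevant to a question are retrieved, and the model is instructed to answer only from them. The sketch below shows that general pattern over a policy index; the index name, field names and endpoints are assumptions for illustration rather than the navigator's actual configuration.

```python
# Minimal RAG sketch: retrieve relevant policy passages, then answer strictly from them.
# Index name, field names and endpoints are illustrative placeholders.
import os
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient
from openai import AzureOpenAI

search = SearchClient(
    endpoint=os.environ["AZURE_SEARCH_ENDPOINT"],
    index_name="policy-library",  # assumed index over the ~360 policy documents
    credential=AzureKeyCredential(os.environ["AZURE_SEARCH_KEY"]),
)
llm = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-15-preview",
)

def answer_policy_question(question: str) -> str:
    """Ground the model's answer in the top-ranked policy passages for the question."""
    passages = [doc["content"] for doc in search.search(search_text=question, top=3)]  # "content" is an assumed field
    instructions = (
        "Answer using only the policy extracts below. If they do not cover the question, say so.\n\n"
        + "\n\n".join(passages)
    )
    response = llm.chat.completions.create(
        model="gpt-4o",  # assumed deployment name
        messages=[
            {"role": "system", "content": instructions},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content
```

Constraining the model to retrieved extracts is what keeps a tool like this "robust and safe": answers stay grounded in the policy library rather than in whatever the model happens to remember.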

Reimagining assessment and feedback

The university’s dedication to the responsible and ethical use of AI means it is continuously evolving its governance approaches.

“As our use of the technology expands and matures, we’re constantly reviewing our robust standards and policies, including quality assurance processes for AI outputs, and assessing what the next level of governance is that we need to embed,” Cook says.

This stance reflects the university’s commitment to complementing human skills with technology and ensuring a harmonious integration of AI into its community.

“With generative AI becoming ubiquitous in everything we use, like Microsoft 365, search engines and social media, universities need to help students learn how to use it well.”

The University’s Educational Innovation team has recently introduced a ‘two-lane’ model for assessments, including tests and exams, to ensure students develop the ability to work ethically with AI technologies while maintaining academic integrity.

“We offer ‘Lane 1’ assessments with two options. In Option 1, generative AI is not allowed – this applies to typical scenarios like tests, exams and oral assessments. In Option 2, generative AI is allowed but always under supervised conditions,” says Grippi. “This helps our educators confirm certain levels of student attainment.”

Supporting Australia to be an ethical AI leader

As the university continues its AI journey, it is excited to explore additional use cases for the technology. For example, it rolled out Microsoft Copilot in early 2024 to boost productivity and creativity for around 130,000 students and staff members. Importantly, this move ensured equitable access to generative AI technology among students, particularly benefitting those who cannot afford a Copilot subscription.

However, it also has a much broader mission, with partnership and collaboration at its core.

“Our advice for any educational institution embarking on an AI journey is to see it as a coalition,” says Grippi. “Build a community of practice within your university and with fellow universities, developers and stakeholders. Be transparent about how you’re formulating your principles for safe, responsible AI use. The more you’re willing to share your knowledge about how you’re doing that, the better outcomes you’ll achieve.”

As Cook points out, the university’s long-term vision is to support Australia in becoming a leader in ethical AI teaching and learning, as well as research and applications.

“Our journey here is evolving in a very fast-moving, international context,” he says.

“But while we want to go fast, we also want to go safely.”