This guide introduces five principles to help academics transition their assessments in response to the new opportunities that GenAI opens up, as well as the academic integrity challenges it poses.
In February 2023, as it became clear that generative AI (GenAI) implied the need for a major, but as yet poorly understood, shift in teaching and assessment, TEQSA provided guidance on the appropriate use of generative AI aligned with the Higher Education Standards Framework 2021 (HESF):
The HESF states that providers assure the quality of teaching, learning, research, and research training, with respect to content and skills developed, assessment and determination of learning outcomes, and the mitigation of foreseeable risks to academic and research integrity. Providers should document their decisions and monitor their progress in addressing generative AI.
HESF
UTS took a position of promoting ‘Effective Ethical Engagement’ and the following principles provide guidance for staff to translate this into practice. We’re suggesting principles rather than recipes, recognising that effective, ethical engagement can take many forms to suit the contexts in which we teach. You and your colleagues will undoubtedly test many creative recipes to suit the needs and tastes of diverse students in diverse contexts.
The principles are informed by:
This is clearly a significant transition for all educational institutions, and we’re all trying to make sense of the implications. To understand whether GenAI will enhance or undermine learning, and what scaffolding students may need to use it effectively and ethically, you’ll need to explore the apps yourself, track how others in your field are adapting assessments, and test your assessment questions in GenAI apps before the teaching session starts.
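If you want to test a batch of assessment questions quickly, a short script can complement manual checks in the web apps. The sketch below is one possible approach rather than a UTS-endorsed tool: it assumes the openai Python package (v1+), an API key stored in the OPENAI_API_KEY environment variable, and an illustrative model name, and the sample questions are placeholders. Review each generated answer and ask yourself how close it comes to a passing submission.

```python
# Minimal sketch: batch-test assessment questions against a GenAI model.
# Assumptions: the `openai` package (v1+) is installed, OPENAI_API_KEY is set,
# and the model name below is illustrative only.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Placeholder questions; substitute your own assessment tasks.
assessment_questions = [
    "Critically evaluate the role of X in Y, citing two recent studies.",
    "Outline a test plan for the scenario described in the case study.",
]

for question in assessment_questions:
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative; use whichever model you are evaluating
        messages=[{"role": "user", "content": question}],
    )
    answer = response.choices[0].message.content
    # Inspect each answer manually: how close is it to a passing submission?
    print(f"QUESTION: {question}\nMODEL ANSWER: {answer}\n{'-' * 60}")
```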
The principles are framed in terms of the student experience we want to create, to help you determine what is appropriate and consistent for each course and subject, within institutional guidelines.
The five student-centred principles below guide the use of GenAI.
GenAI is reshaping society, the workplace, and university study — for good and for ill. It will soon be present in every conceivable productivity tool and website. Students need to be introduced to this early on, so they appreciate how GenAI connects to their professions and to each course and subject.
Update course and subject content to include information about GenAI and how it is impacting disciplines and professions, including likely future changes.
Discuss GenAI openly with your students right at the start of your classes. Talk about how it is already being widely used in the workplace and society and how it is impacting your discipline, provide illustrations, and invite their questions. You might model how you have used it yourself.
Explain to students that learning to use and critique GenAI is key to their education. Ask questions to help students understand the importance of developing the capability to think about, reason with and make judgements around the use of GenAI in preparation for future work.
Discuss the importance of integrity as an aspect of their education and a central part of their future professional lives. Address the issue of cheating.
Slides for the classroom (see above)
How should we talk to students about AI? (Monash University)
Students need clear guidance on how to ethically and effectively engage with GenAI for a given assignment, without crossing the academic integrity line.
Examples of Student Misconduct using AI:
“Copying or reworking any material (e.g., text, images, music, video) from generative AI tools, and claiming this work as your own without declaring use of the relevant tool.
Using generative AI tools, unless permitted use is specified for that assessment.”
If you are a Course Director or Subject Coordinator, your curriculum and assessment choices have direct implications for any academic misconduct questions that may arise.
Given the rich diversity in the University’s teaching practices and the tools we use, one size cannot fit all: clear guidance must be given on what is permitted for each assessment.
Slides for the classroom (see above)
Examples of UTS assignments and guidance:
36118 Applied NLP assignment & ChatGPT Guidelines (Antonette Shibani, TD School)
41059 Mechanical Design Fundamentals Studio 1 assignment (Anna Lidfors Lindqvist, FEIT)
TEQSA website on Artificial Intelligence
Ethical ways to use ChatGPT as a student (Open Universities Australia)
Student-staff forums on AI (University of Sydney)
Using AI tools for study (Flinders University Library)
Using generative AI (Deakin University Library)
Students must be helped to develop an awareness of when and how to use different apps, effectively and ethically, to advance their learning.
GenAI can be impressive, but it’s far from perfect. Moreover, even if working well in a technical sense, it can undermine learning if students ‘outsource’ too much work to it.
While students must take responsibility for their learning, they expect UTS to help them develop this AI literacy, just as they expect any other educational technology to be used to advance learning.
GenAI functionality is appearing in mainstream productivity tools and search engines, as well as myriad web apps.
Discuss how and when the use of GenAI will deepen learning and when it will undermine it. This is not necessarily straightforward, and UTS will provide support to help academics think this through.
AI literacy for learning may well differ from AI literacy for a professional: a power tool for an expert is not necessarily an effective tool for a beginner. Is learning being undermined through premature or poor use of GenAI — or can learning be accelerated through effective use?
What does critical engagement with ChatGPT look like? Video excerpt, 4 mins: Insights from 4 UTS academics on the diversity of ability they’re seeing in their students’ use of ChatGPT (Simon Buckingham Shum, UTS)
GenAI literacy can be thought of as a mix of knowledge, skills and dispositions, e.g.
Students must learn how to harness the strengths of an AI tool, and understand its limits.
A conversational agent such as ChatGPT has strengths and weaknesses. Both can be leveraged so that students learn to appreciate them and, in doing so, engage in deeper learning.
Browse the resources to see some of the ways that UTS academics have integrated GenAI into their assessment tasks, plus other creative and evidence-based proposals for its use. Do any of these look like they could translate well into your context?
AI-resilience diagnostic for assessment tasks
What does critical engagement with ChatGPT look like? Video excerpt, 4 mins: Insights from 4 UTS academics on the diversity of ability they’re seeing in their students’ use of ChatGPT (Simon Buckingham Shum, UTS)
Assessment and GenAI (Deakin University)
ChatGPT is old news: How do we assess in the age of AI writing co-pilots? (Danny Liu & Adam Bridgeman, University of Sydney)
Roles ChatGPT can play (Mike Sharples, Open University)
Thinking companion, companion for thinking (Ethan Mollick, University of Pennsylvania)
Exploiting ChatGPT’s limitations: Concept Transfer Activity — an extract from: How to… use AI to teach some of the hardest skills: When errors, inaccuracies, and inconsistencies are actually very useful (Ethan Mollick, University of Pennsylvania)
AI may change what students need to learn, how they learn, and how you assess them.
GenAI can now perform high-level cognitive work formerly considered the sole province of human cognition.
You must decide whether students should still master the same knowledge and skills you have always taught pre-AI, or whether this requires a rethink of what students need to learn and how they can best evidence it.
A 24/7 conversational aid may also open exciting new ways of learning that would be impossible to scale with human tutors alone.
Understanding these strengths and weaknesses will take time and experimentation.
Course Directors and Subject Coordinators should update curricula to include GenAI as a professional tool.
Assessments should reduce the risk of using GenAI for misconduct but also allow students to demonstrate they know how to use it in the context of their course.
Do students still need to demonstrate mastery without AI tools? Should authentic performance with AI tools become an explicit goal? Or are both now needed?
AI-resilience diagnostic for assessment tasks
Examples of robust assignment design that are AI-resilient
36103 Statistical Thinking for Data Science (Kirsty Kitto, Connected Intelligence Centre)
Examples of UTS assignments integrating ChatGPT:
36118 Applied NLP assignment & ChatGPT Guidelines (Antonette Shibani, TD School)
41059 Mechanical Design Fundamentals Studio 1 assignment (Anna Lidfors Lindqvist, FEIT)
Assessment and GenAI (Deakin University)
Access resources such as slide kits and templates for communicating with your students about GenAI and academic integrity in the Academic Integrity Communication Toolkit.