Belonging, automatically: video feedback to boost engagement

How do you nurture belonging and engagement with diverse student cohorts, at a distance? Keith Heggart from the Faculty of Arts and Social Sciences (FASS) shares learnings from a trial of automated video feedback in the new Graduate Certificate in Learning Design.

New course, new mode, new challenge

Launched in 2021, the Graduate Certificate in Learning Design was converted from blended to fully online in response to student needs and the ongoing impact of COVID-19. It runs concurrently with UTS microcredential offerings, with short (6-week) subjects delivered in series rather than in parallel.

This new course structure allows learners across Australia to participate and encourages a kind of ‘try before you buy’, beginning with a microcredential, then transferring into the certificate. This online modality brings advantages, but also presents challenges in terms of nurturing belonging, engagement and motivation among learners.

I dubbed this the ‘screen-distancing’ effect; my challenge was to encourage a sense of belonging within the cohort as it currently existed, but to do so in a way that remained sustainable as the course became better known and student numbers increased.

Starting with OnTask

I had previously used OnTask to send personalised, automated emails based on student participation in discussion forums. This was valuable and students generally appreciated it, but limited data from the LMS meant that my personalised emails were based on a ‘best guess’ of each student’s participation, rather than something I could be certain of.

The process also felt less personal than I wanted. In some ways, it felt as though I was reducing the student’s experience to a simple question of whether they had responded to a reading. To then send a personalised email based on that (incomplete) data felt punitive, so I wanted to come up with a solution that:

  • Made better use of all of the data I had in the course (including the end of module reflections);
  • Was more personal and useful for the students – that is, it helped them, rather than just recapped their work for the week;
  • Turned my insights about student learning into action.   

Personalising video feedback, automatically

One solution to tackle these challenges was to create a set of personalised videos. Students completed a module each week, then answered a survey reflecting on their own understanding of that week’s subject content. This information was exported to OnTask, where I had prepared personalised emails that were sent to students based on the scores they had given themselves. These emails included a link to a specific video (hosted on YouTube) that matched each student’s indicated level.

The videos were offered to students enrolled in the Award course as well as the microcredentials. The cohorts were segmented into four main groups (sketched in rough form after this list), comprising students who either:

  • Provided no response to the survey
  • Indicated low levels of confidence
  • Indicated medium levels of confidence
  • Indicated they felt very confident
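For those curious about the mechanics, here is a minimal sketch (in Python) of the kind of segmentation logic sitting behind these emails. The confidence scale, thresholds, email wording and video links are illustrative assumptions rather than the actual OnTask configuration used in the subject.

```python
# Hypothetical sketch of the rule logic behind the personalised video emails.
# Scores, thresholds and URLs are illustrative placeholders, not course data.

# Self-rated confidence exported from the end-of-module survey (1-5 scale assumed).
students = [
    {"name": "Alex", "email": "alex@example.edu", "confidence": None},  # no response
    {"name": "Sam", "email": "sam@example.edu", "confidence": 2},
    {"name": "Priya", "email": "priya@example.edu", "confidence": 4},
]

# One short YouTube summary per segment (placeholder links).
VIDEOS = {
    "no_response": "https://youtu.be/placeholder-no-response",
    "low": "https://youtu.be/placeholder-low",
    "medium": "https://youtu.be/placeholder-medium",
    "high": "https://youtu.be/placeholder-high",
}

def segment(confidence):
    """Map a self-rated confidence score to one of the four cohort segments."""
    if confidence is None:
        return "no_response"
    if confidence <= 2:
        return "low"
    if confidence <= 3:
        return "medium"
    return "high"

for s in students:
    group = segment(s["confidence"])
    print(f"To: {s['email']}")
    print(f"Hi {s['name']}, here is this week's video for you: {VIDEOS[group]}\n")
```

In OnTask itself, this lives in the conditions attached to a personalised email action rather than in standalone code, but the mapping from self-rated score to video is roughly the same idea.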

Videos were two-minute summaries, deliberately informal and casual – I wanted students to feel like we were having a one-to-one conversation, not watching a pre-recorded lecture. For the low confidence students, I focused on the main points. For the more confident students, I suggested questions to ponder and further sources to explore.

Short videos with high impact

For each module, between 80% and 100% of students viewed a video. In survey feedback, most students commented on how much they enjoyed the videos, especially their personalisation. Students agreed with the statements that ‘this feedback and support improved my overall learning experience’ and ‘this feedback made me feel more supported by my instructor’.

Interestingly, some students wanted to watch all levels of videos, not just the one that was sent to them. Other students felt that the videos needed to be longer – or, conversely, that they took too long to watch and they’d prefer to read an email!

Where to next?

This was a small-scale trial, with only 20 students enrolled in the subject. Next steps are likely to include:

  • Managing increasing numbers of students in the course;
  • Spending some time crafting better messages within OnTask, and possibly including multimedia content, too;
  • Exploring how to make movement between email and Canvas straightforward – for example, if I am writing about a particular activity, I should hyperlink directly to that activity, rather than expecting students to log in and find it (a rough sketch of how such links might be built follows this list).
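To illustrate that last point, here is a minimal sketch of how direct links to Canvas activities might be assembled for use in OnTask email templates. The host, course ID and activity identifiers are placeholders, and the URL patterns are assumptions based on common Canvas paths rather than anything specific to this course.

```python
# Hypothetical sketch: build deep links to specific Canvas activities so that an
# email can point students straight at the task being discussed.
# CANVAS_HOST, COURSE_ID and the activity identifiers below are placeholders.

CANVAS_HOST = "https://canvas.example.edu"
COURSE_ID = 12345  # placeholder course ID

def activity_link(activity_type: str, activity_id: str) -> str:
    """Return a deep link to one Canvas activity instead of the course home page."""
    paths = {
        "page": f"/courses/{COURSE_ID}/pages/{activity_id}",  # wiki page slug
        "discussion": f"/courses/{COURSE_ID}/discussion_topics/{activity_id}",
        "assignment": f"/courses/{COURSE_ID}/assignments/{activity_id}",
    }
    return CANVAS_HOST + paths[activity_type]

# For example, an email about a discussion activity could link straight to it:
print(activity_link("discussion", "67890"))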

Intrigued or already experimenting with automated feedback in your own teaching? Take a look at this summary of four UTS events on automated feedback for more inspiration and ideas to follow up.

Feature image by Sara Kurfeß 
