Remote oral presentations: bringing the OSCE online

This blog post is co-authored by Dimity Wehr and Leslie McInnes. It is the second part in a two-part series, and you can read the first post here: Authentic, practical alternatives to face-to-face presentations.

The OSCE

The third and final panel in the webinar series on transforming oral assessments featured the conversion of an on-campus objective structured clinical examination (OSCE) in the subject Advanced Assessment and Diagnosis. An OSCE is a live, real-time exam in which students demonstrate competency under simulated conditions; in this case, the candidates were students from the Master of Advanced Nursing and Master of Nurse Practitioner programs. Staff from the School of Nursing and Midwifery were able to realise their vision of remote delivery via Zoom breakout rooms, with support from the LX.lab (IML). Our panellists in the webinar, Gail Forrest (Lecturer, School of Nursing and Midwifery) and Mais Fatayer (Learning Design and Technology Specialist, LX.lab), were key collaborators in this process.

The technologies

Zoom was a straightforward choice for the examiners, as the breakout room function allowed students to be allocated in turn to four exam stations. Alongside Zoom, the LMS quiz facility allowed quizzes to be taken under observation. In addition, a Doodle Poll booking system let students nominate an entry time, and the examiners used WhatsApp to communicate behind the scenes on the day.

The collaboration

Gail and Mais highlighted the successful collaboration between Nursing academics and the LX.lab to create the remote experience. The system had to be practical, adaptable, and reliable. Mais mentioned that the cohort was relatively small but indicated it would be possible to recreate the process for larger groups. The only downside noted in the preparation phase was that a planned mock examination, intended to give students practice, could not go ahead because of problems with Zoom on the scheduled day; however, expectations were clearly explained to the students ahead of the examination date.

The process

The main Zoom room was used as a waiting room and the breakout rooms were named for each station.

  • In Station 1, students were required to take a medical and social history from a role-playing patient, in the presence of an examiner. 
  • Station 2 required students to describe to the examiner how they would perform a targeted physical assessment for their patient, indicating how they would proceed and what signs they would look for. During the task, candidates could request findings such as blood pressure, heart rate and pallor in order to build a picture and arrive at a differential diagnosis. This was conducted as a viva or ‘think aloud’ task.
  • At Station 3, students listed the investigations they would like to request for their patient based on their diagnosis and justified these requests. While under observation, they then took a password-protected quiz in the LMS. The quiz provided visuals of results including X-ray images, ECG readouts and blood results. For each of these, students had to identify key features and then write up whether their initial diagnosis had changed.
  • In Station 4, students wrote a plan for the patient, including who the patient should be referred to, their final diagnosis and any initial management required for the condition. This final task also required a description of how they would communicate findings to the patient.
  • Station 5 was a separate Zoom session providing a mandatory but non-assessable debrief for 3 to 4 students at a time.

Reflections on the remote OSCE

Despite the lack of a practice exam, the allocation of two well-organised ‘managers’ for the main room led to a seamless administration of the OSCE. One manager moved the students in and out of the breakout room stations on time, while the other monitored activity and acted as a troubleshooter, keeping in touch with those staffing each breakout room via WhatsApp.

Gail reflected that from the academic perspective, it was a real success. The assessment overall was robust, the patient role-play was quite authentic, and examiners’ assessments of each student’s overall performance were positively correlated. She noted that it was a key factor that the academics at each station were highly skilled and experienced in the area they were assessing. This enabled them to ask the right questions of the students, accurately determine their level of expertise, and answer their questions. The non-assessable debrief sessions, enabling 3-4 students to debrief together, were also an important feature of the exam. Interestingly, students who attended the debrief up to an hour post-exam had already debriefed in their own way, and their debrief was less rich than for those who attended very shortly after their exam.

Student feedback was generally very similar to the feedback given after a conventional OSCE; for example, students experienced heightened anxiety, felt they had not done well or reported that they ran out of time. In Gail’s opinion, the remote nature of the experience did not add negatively to these typical responses. Some students who had experienced both face-to-face and online versions of an OSCE reported that they actually preferred the online version. One drawback of the online experience was a lack of image clarity for some students working on a laptop. The timing of each station also drew some criticism: some stations were seen as well-timed while others, especially Station 3, were seen as too short. Some students also reported that articulating their clinical assessment and describing their actions made them more aware of the reasons they often automatically perform certain examinations. Given that this assessment aimed to demonstrate critical thinking and reasoning, Gail found this quite encouraging, though the verbal element did pose an additional challenge for students with English as an additional language. The academic team reported the exam as a positive and even enjoyable experience, and Gail indicated that after some fine-tuning they will definitely run the remote OSCE again.

Conclusions

As noted in the post describing the first two Oral Assessment Webinars in the series, common themes emerged across diverse disciplines, and these also apply to the OSCE experience.

No doubt the organisers of this OSCE would agree with earlier webinar panellists that the design and implementation of online oral assessment tasks involves a lot of thought, time and, often, collaborative effort. They have also demonstrated that careful planning and judicious use of available technologies can facilitate creative, innovative and highly authentic solutions. The positive student response provides evidence that this ground-breaking online clinical assessment model can be re-used with confidence, and that the approach can be applied to other disciplines.

Feature image by Drobotdean.
