References

Alconero-Camarero AR, Sarabia-Cobo CM, Catalán-Piris MJ, González-Gómez S, González-López JR. Nursing students' satisfaction: a comparison between medium- and high-fidelity simulation training. Int J Environ Res Public Health. 2021; 18:(2)804-815 https://doi.org/10.3390/ijerph18020804

Al-Ghareeb A, McKenna L, Cooper S. The influence of anxiety on student nurse performance in a simulated clinical setting: A mixed methods design. Int J Nurs Stud. 2019; 98:57-66 https://doi.org/10.1016/j.ijnurstu.2019.06.006

Alluri RK, Tsing P, Lee E, Napolitano J. A randomized controlled trial of high-fidelity simulation versus lecture-based education in preclinical medical students. Med Teach. 2016; 38:(4)404-409 https://doi.org/10.3109/0142159X.2015.1031734

Association for the Simulated Practice in Healthcare. Simulation-based education in healthcare. Standards framework and guidance. 2016. https://tinyurl.com/2s4vzk4c (accessed 9 June 2022)

Au ML, Lo MS, Cheong W, Wang SC, Van IK. Nursing students' perception of high-fidelity simulation activity instead of clinical placement: a qualitative study. Nurse Educ Today. 2016; 39:16-21 https://doi.org/10.1016/j.nedt.2016.01.015

Burbach BE, Struwe LA, Young L, Cohen MZ. Correlates of student performance during low stakes simulation. J Prof Nurs. 2019; 35:(1)44-50 https://doi.org/10.1016/j.profnurs.2018.06.002

Cant RP, Cooper SJ. Use of simulation-based learning in undergraduate nurse education: an umbrella systematic review. Nurse Educ Today. 2017; 49:63-71 https://doi.org/10.1016/j.nedt.2016.11.015

Gantt LT, Overton SH, Avery J, Swanson M, Elhammoumi CV. Comparison of debriefing methods and learning outcomes in human patient simulation. Clinical Simulation in Nursing. 2018; 17:7-13 https://doi.org/10.1016/j.ecns.2017.11.012

INACSL Standards Committee. Healthcare Simulation Standards of Best Practice™ evaluation of learning and performance. Clinical Simulation in Nursing. 2021a; 58:54-56 https://doi.org/10.1016/j.ecns.2021.08.016

INACSL Standards Committee. Healthcare Simulation Standards of Best Practice™ the debriefing process. Clinical Simulation in Nursing. 2021b; 58:27-32 https://doi.org/10.1016/j.ecns.2021.08.011

INACSL Standards Committee. Healthcare Simulation Standards of Best Practice™ simulation design. Clinical Simulation in Nursing. 2021c; 58:14-21 https://doi.org/10.1016/j.ecns.2021.08.009

Johnston S, Parker CN, Fox A. Impact of audio-visual storytelling in simulation learning experiences of undergraduate nursing students. Nurse Educ Today. 2017; 56:52-56 https://doi.org/10.1016/j.nedt.2017.06.011

Jones et al. Mind the gap: exploring the needs of early career nurses and midwives in the workplace. 2015. https://tinyurl.com/2p8rh39f (accessed 9 June 2022)

Kirkham AL. Exploring the use of high-fidelity simulation training to enhance clinical skills. Nurs Stand. 2018; 32:(24)44-53 https://doi.org/10.7748/ns.2018.e10693

Lee BO, Liang HF, Chu TP, Hung CC. Effects of simulation-based learning on nursing student competences and clinical performance. Nurse Educ Pract. 2019; 41 https://doi.org/10.1016/j.nepr.2019.102646

Massoth C, Röder H, Ohlenburg H et al. High-fidelity is not superior to low-fidelity simulation but leads to overconfidence in medical students. BMC Med Educ. 2019; 19:(1) https://doi.org/10.1186/s12909-019-1464-7

Newton RH, Krebs A. Bridging the theory-practice gap using simulation to teach care of patients with learning disabilities. Teaching and Learning in Nursing. 2020; 15:(4)233-236 https://doi.org/10.1016/j.teln.2020.04.003

NHS Institute for Innovation and Improvement. Safer care. SBAR. Situation-Background-Assessment-Recommendation. Implementation and training guide. 2010. https://tinyurl.com/2wycp2nw (accessed 9 June 2022)

Nursing and Midwifery Council. Realising professionalism: Standards for education and training. Part 3: Standards for pre-registration nursing programmes. 2018. https://tinyurl.com/2p8ftu2b (accessed 9 June 2022)

Nursing and Midwifery Council. Current emergency and recovery programme standards. 2022. https://tinyurl.com/mwfe6ryd (accessed 9 June 2022)

Reierson IÅ, Haukedal TA, Hedeman H, Bjørk IT. Structured debriefing: what difference does it make? Nurse Educ Pract. 2017; 25:104-110 https://doi.org/10.1016/j.nepr.2017.04.013

Resuscitation Council UK. The ABCDE approach. 2021. https://www.resus.org.uk/library/abcde-approach (accessed 9 June 2022)

Roberts E, Kaak V, Rolley J. Simulation to replace clinical hours in nursing: a meta-narrative review. Clinical Simulation in Nursing. 2019; 37:5-13 https://doi.org/10.1016/j.ecns.2019.07.003

Romli MH, Cheema MS, Mehat MZ, Md Hashim NF, Abdul Hamid H. Exploring the effectiveness of technology-based learning on the educational outcomes of undergraduate healthcare students: an overview of systematic reviews protocol. BMJ Open. 2020; 10:(11) https://doi.org/10.1136/bmjopen-2020-041153

Ryall T, Judd BK, Gordon CJ. Simulation-based assessments in health professional education: a systematic review. J Multidiscip Healthc. 2016; 9:69-82 https://doi.org/10.2147/JMDH.S92695

Sadka N. Simulation in healthcare: the possibilities. Emerg Med Australas. 2021; 33:(2)367-368 https://doi.org/10.1111/1742-6723.13758

Secheresse T, Nonglaton S. The ‘Timeline Debriefing Tool’: a tool for structuring the debriefing description phase. Adv Simul (Lond). 2019; 4 https://doi.org/10.1186/s41077-019-0119-4

Shinnick MA, Woo MA. Learning style impact on knowledge gains in human patient simulation. Nurse Educ Today. 2015; 35:(1)63-67 https://doi.org/10.1016/j.nedt.2014.05.013

Stewart M, Kennedy N, Cuene-Grandidier H. Undergraduate interprofessional education using high-fidelity paediatric simulation. Clin Teach. 2010; 7:(2)90-96 https://doi.org/10.1111/j.1743-498X.2010.00351.x

Strom S, Anderson C, Yang L et al. Correlation of simulation examination to written test scores for advanced cardiac life support testing: prospective cohort study. West J Emerg Med. 2015; 16:(6)907-912 https://doi.org/10.5811/westjem.2015.10.2697

Turnbull. Nursing degree applications down 30% as RCN warns Long-term Plan in danger. 2019. https://tinyurl.com/3dfyzxdw (accessed 9 June 2022)

Can a high-fidelity simulation tutorial improve written examination results? Review of a change in teaching practice

07 July 2022
Volume 31 · Issue 13

Abstract

Background:

Undergraduate nursing students prefer technology-based learning. Simulation has been used in nursing education to provide skills acquisition and clinical exposure. Can high-fidelity simulation (HFS) be used to teach tutorial content to prepare students for a written examination?

Aims:

To design a pilot HFS tutorial and review its impact on students' written examination results.

Method:

A total of 203 second-year undergraduate nursing students were timetabled to attend an HFS tutorial. Examination results at first attempt were compared with those of the previous cohort.

Results:

81% of the students from the HFS tutorial cohort passed at the first attempt compared with 85% from the previous cohort.

Conclusion:

The HFS tutorial needs further development, incorporating simulation standards, to assess its ability to improve students' written examination results. Students found the post-simulation discussion difficult and wanted guidance on how to participate. Involvement of the university's skills and simulation team is recommended for future cohorts to assist with design and facilitation.

The current undergraduate student nurse population is drawn from generations Y and Z, identified by Jones et al (2015) as a student population that prefers technology-enhanced learning. Within the UK, since the introduction of tuition fees for pre-registration nursing programmes in 2017 and the removal of government-funded student nurse bursaries, there has been a reduction in the number of mature students applying for pre-registration nursing courses (Turnbull, 2019). Generation Z prefers technology-based learning because it provides greater ownership of, and flexibility in, their learning (Romli et al, 2020). As technology advances, technologically savvy students will expect their education to incorporate more non-traditional elements of education delivery (Romli et al, 2020).

This article reviews a change in how undergraduate nursing students were taught their tutorial content, with the new approach incorporating the use of high-fidelity simulation (HFS), to prepare them for their written examination. The module was designed to teach students how to assess and safely care for adult patients with a variety of acute illnesses using the ABCDE (airway, breathing, circulation, disability, exposure) assessment approach (Resuscitation Council UK (RCUK), 2021). The summative assessment was a written examination based on patient scenarios that students had been given previously. Within the examination the students had to answer three questions:

  • Describe the assessment you would undertake for the patient in relation to your observations
  • Explain the pathophysiology of the patient's clinical condition
  • Provide the rationale for a particular aspect of nursing care related to the patient scenario.

Historically, the students had evaluated the module well, but had poor examination results at first submission. HFS had previously been used on the nursing course to provide exposure to clinical situations and was well evaluated by students.

Literature review

Simulation is an active learning approach that enables nurse educators to provide students with exposure to complex clinical situations within a safe environment to develop and practise their clinical, critical thinking and reasoning skills (Newton and Krebs, 2020). Simulation, in conjunction with other teaching methods, has been found to increase the knowledge gain of undergraduate nursing students who identify a preference for different learning styles (Shinnick and Woo, 2015). It is an approach that is beneficial for the development of clinical skills, problem-solving and team working within health care (Sadka, 2021).

Simulation can be divided into two categories: low fidelity and high fidelity; the greater the fidelity, the more believable the situation presented (Cant and Cooper, 2017). The fidelity of the simulation session only needs to be as high as necessary to achieve the teacher's learning outcomes (Kirkham, 2018). Low-fidelity simulation (LFS) can assist students with learning clinical skills, such as blood pressure measurement (Cant and Cooper, 2017). In contrast, HFS may involve the use of an actor or a computerised manikin to demonstrate realistic clinical scenarios and patient physiological parameters. The manikin can also be used to demonstrate a patient's deterioration, and may also be able to interact with the learners to assist them with their communication skills (Au et al, 2016).

The use of HFS in nursing education has increased since the 1990s (Au et al, 2016), with the use of simulation found to increase nursing students' knowledge, self-confidence, and their critical thinking and assessment skills (Lee et al, 2019). Simulation has also been used as a teaching method for bridging the gap between taught theoretical content and practice within a healthcare setting (Newton and Krebs, 2020). However, it can also induce high levels of anxiety, which can diminish a student's performance and detract from their overall learning (Al-Ghareeb et al, 2019). The Association for the Simulated Practice in Healthcare (ASPiH) (2016) recommends that the psychological safety of the learner should be considered by the simulation facilitators because learners can experience heightened levels of anxiety, especially if the simulation is used as an assessment method. However, no relationship between anxiety and HFS performance was recorded by Burbach et al (2019) in their study of 120 US participants. Anxiety can be difficult for people to admit to and to assess, which could explain the difference in findings.

Nursing students are required to have 2300 hours of clinical practice exposure, of which 300 hours can be spent within a simulation environment (Nursing and Midwifery Council (NMC), 2022). Simulation has gained popularity within nursing education as a way to provide clinical exposure when clinical placements within health care are limited (Roberts et al, 2019).

HFS is frequently used within medical education because it enables the learner to see physiological feedback in relation to their medical decisions without putting patients at risk (Alluri et al, 2016). However, there is research suggesting that the learning achieved through HFS does not differ significantly from that achieved through LFS (Massoth et al, 2019). In fact, HFS has been associated with a worse performance in knowledge gained than when using LFS; it can also lead medical students to feel overconfident in their skills and performance (Massoth et al, 2019). Simulation continues to be used today not only to train medical and nursing students to acquire clinical skills, but also to prepare them for interprofessional working relationships and team working, which are vital when caring for acutely ill patients (Stewart et al, 2010; Kirkham, 2018).

Traditionally, written assessments have been accepted as a method to assess a student's ability to care for acutely ill patients (Strom et al, 2015). Simulation within health professional training provides a safe environment to practise clinical skills without compromising patient safety (Ryall et al, 2016). Previous literature reviews have found that written and simulation assessments differ in their ability to assess a learner's knowledge and practical skill (Ryall et al, 2016). For example, simulation can be used formatively to identify and close gaps in participants' knowledge and monitor progress towards achieving set learning outcomes (International Nursing Association for Clinical Simulation and Learning (INACSL) Standards Committee, 2021a). No literature was found discussing the use of HFS as a teaching method in preparation for a written examination.

Method

The HFS tutorial was incorporated into the second-year undergraduate module for the 2017/2018 academic year. In the last lecture before the HFS tutorials commenced, the lecturing team brought a manikin and equipment into the lecture theatre and demonstrated an ABCDE assessment (RCUK, 2021) to students, pausing after each step so that students had the opportunity to ask questions and clarify what was happening. The scenario involved a patient showing signs of sepsis, which was one of the students' module examination scenarios. The other HFS tutorial scenarios were drawn from the remaining module examination scenarios. These involved patients experiencing post-surgical pain, a myocardial infarction and a small bowel obstruction, and were chosen because they are commonly encountered in clinical practice.

There were nine HFS tutorials and the students were allocated one session to attend. Each HFS tutorial had approximately 22 students. The learning outcomes for the HFS tutorial were:

  • To work as a team to complete an ABCDE assessment on the patient (RCUK, 2021)
  • To document the assessment on appropriate paperwork
  • To provide a handover of their assessment using the SBAR (situation, background, assessment, recommendation) communication tool (NHS Institute for Innovation and Improvement, 2010).

At the beginning of each HFS tutorial the students were split into three groups of about seven. Each group rotated around the three manikin stations that were facilitated by a lecturer, who was a registered general nurse. Within each group, two or three students would volunteer to work together to undertake an ABCDE assessment (RCUK, 2021) within the patient scenario, while the remaining students in their group observed them. This task was chosen because the first question within the students' summative examination was to write about an ABCDE assessment (RCUK, 2021) they would have completed in relation to the patient scenario. This was also the first examination question asked of the previous cohort (2016/2017).

The manikins were high fidelity in that their physiological parameters responded to the interventions made by the students. The voice of the manikin was provided by the lecturer.

When the students were not involved with undertaking an ABCDE assessment (RCUK, 2021) on the manikin, their role was to make observation notes to aid their participation in the post-simulation discussion. Each manikin scenario lasted about 10-20 minutes and ended with students providing a verbal handover of their assessment to the observing students and facilitator. Some students did not complete the full assessment and therefore handed over what they had managed to complete. The simulation was followed by a 20-minute unstructured discussion facilitated by the lecturer. During this discussion the students were asked how they felt the scenario had gone and to explain the rationale behind some of their decisions, and were given the chance to ask any questions they might have in relation to the scenario.

The format and timings of the session and discussion were designed to be familiar to lecturers and students, because it resembled the skills and simulation teaching previously taught within the pre-registration nursing course. At the end of the HFS tutorial, the students were given an evaluation form to complete. This was all conducted in line with the university's clinical skills and simulation teaching strategy.

Results

Both the cohorts involved in the subsequent comparison were undergraduate adult student nurses in their second year of a three-year course (academic years 2016/2017 and 2017/2018). The majority of the module's theory content was taught in the first semester (September to December). For the 2017/2018 cohort the HFS tutorial was delivered at the beginning of the second semester (January-April). The 2016/2017 cohort had a lecture to prepare them for the summative examination at the beginning of the second semester. Both cohorts had clinical placements throughout the academic year and at least one of these was in an acute setting. The summative examination for both cohorts was scheduled after the holiday break in April.

The HFS tutorial did not show an improvement in students' examination results at first attempt compared with those taught on the older version of the module the previous year (Table 1). A comparison across grade boundaries also showed no significant differences between the cohort years (Figure 1). However, a direct comparison between the results of the two cohorts is difficult due to a number of factors. For the 2016/2017 cohort, results data are incomplete for 55 students. Between the two academic years the format of the summative examination changed from two questions worth 50% each (2016/2017) to three questions worth 50%, 25% and 25% (2017/2018). Describing the ABCDE assessment (RCUK, 2021) for the patient scenario was still the first question for both cohorts. Finally, during the 2016/2017 assessment moderation process the examination marks were increased across the board because the moderators deemed that the students' submissions had been under-marked by the examiners. This did not happen with the 2017/2018 results.


Table 1. Exam results, based on marks released post-moderation, for students taking the module without/with high-fidelity simulation tutorial

Academic year cohort | No of students taking module | Passed on first attempt | Minimum mark | Maximum mark | Average mark
2016/2017            | 223                          | 189 (85%)               | 14%          | 96%          | 59.01%
2017/2018            | 203                          | 165 (81%)               | 14%          | 90%          | 49.49%
Figure 1. Spread of students' marks across the grade boundaries

Discussion

On reviewing the student evaluation forms after the HFS tutorials, one theme that emerged was that students had found the HFS tutorial helpful, but not the unstructured discussion afterwards, due to the lack of guidance. This was unexpected, since the format of the HFS and the discussion was similar to what the students had completed previously within the course, and there had been no unsatisfactory feedback.

The debrief discussion is the most important aspect of simulation-based learning (ASPiH, 2016). It is where the learner gains new understanding and learning, linking theory to practice through reflecting on what they have participated in and observed (INACSL Standards Committee, 2021b). Although there is no specific guidance on the duration of the debriefing process, adequate time should be allocated to assist the learner to achieve the learning outcomes, address any elements they wish to discuss, and identify gaps in their knowledge (INACSL Standards Committee, 2021b). Due to time constraints in the HFS tutorial, the post-simulation discussion was only allocated about 20 minutes. On reflection, facilitators felt that this was not sufficient time to enable all groups to have a meaningful discussion.

Although the terms are used interchangeably, it should be noted that feedback and debriefing are different: feedback is a one-way conversation (Reierson et al, 2017). It emerged from discussion with other lecturers who facilitated the HFS sessions that the post-simulation discussion also tended to be centred on lecturer feedback rather than being student-led. Some of the student evaluations commented on this and observed that different lecturers provided different information regarding the scenarios and correct care. Gantt et al's (2018) study comparing different debriefing methods found that staff and students preferred a facilitated debrief, in which students and the facilitator discussed the simulation events fully, giving students an opportunity to express their emotions regarding the experience and ask questions.

Because nine sessions were needed to enable all students to attend an HFS tutorial, and each session required three lecturers to facilitate, it was difficult to ensure consistency between all facilitators. A structured debriefing tool, rather than the unstructured discussion that took place, would have been beneficial in providing consistency (Secheresse and Nonglaton, 2019). Using a structured debriefing tool instead of an unstructured discussion could also provide participants with comprehensive, reflective feedback (Reierson et al, 2017). The ASPiH (2016) recommends that novice faculty members are provided with debriefing training, as the skills of the debriefer correlate closely with participants' satisfaction with the experience of simulation. However, no such training was provided for the HFS tutorial debriefing.

A further theme from the students' evaluations of the HFS tutorial was frustration with some of the equipment. There were problems with one manikin's network connection, meaning it did not reflect its programmed physiological parameters and changes. This detracted from the fidelity of the simulation and could have been a reason for reduced student engagement (Cant and Cooper, 2017). The INACSL Standards Committee (2021c) recommends using various types of fidelity to create a realistic simulation experience. However, Johnston et al (2017) found that some students still struggle to engage in simulation in an appropriate manner despite efforts to make the experience as realistic as possible. Alconero-Camarero et al (2021) found that students had greater satisfaction with LFS, although this might be because HFS generally involves more complex clinical cases and technology, which can be a barrier to learning.

Within the School of Nursing at Kingston University there is a dedicated simulation and skills team that leads on the teaching and development of simulation content taught to students throughout the course. However, the HFS tutorial had not been designed by this team, so they were not involved in setting up or facilitating any of the tutorial sessions. The lecturers who facilitated the sessions and set up the manikins each day work as part of a theoretical nursing team; although they help facilitate some of the simulation department's study days, they are not involved with the design and set-up of each session and so were unfamiliar with troubleshooting some of the equipment. The ASPiH (2016) recommends that simulation technicians are involved with the design of simulation scenarios and evaluating their effectiveness. It would have been helpful to have a member of the simulation and skills team involved in designing and facilitating the HFS tutorial.

Limitations

The INACSL Standards Committee (2021c) and the ASPiH (2016) have produced standards of best practice regarding simulation design, outcomes and objectives, assessment and staff involvement, which were not incorporated when designing the HFS tutorial due to a lack of awareness of them. These standards recommend using simulation as a formative assessment to assess skill sets such as teamwork, communication and leadership. Assessing the effectiveness of the HFS tutorial via the students' written examination results was perhaps too broad an approach and subject to other factors, such as students' individual learning styles, affecting correlation. The intention of a formative assessment is to improve learner performance (ASPiH, 2016); therefore, perhaps this review should have used participants' evaluation of knowledge before and after the HFS tutorial rather than the written examination results in their summative assignment.

The results from the HFS tutorial were taken from the 2017/2018 academic year. This was the first and only year that the HFS tutorial was used; due to university staff shortages it was not used with the 2018/2019 cohort. For the 2019/2020 academic year, the curriculum was revised in response to the changes introduced to the standards for pre-registration nursing programmes (NMC, 2018). The module still incorporates simulation, as the experience was evaluated positively by the students, and it is now a formative assessment; however, the students also have to reflect on their ABCDE assessment (RCUK, 2021) in their summative written report. Due to the impact of COVID-19 and subsequent lockdowns, 2021/2022 will be the first cohort of students to complete the formative simulation and write a reflective report. Student results will be evaluated to review the changes made to the module and assessment.

Conclusion

This article has discussed using an HFS tutorial to prepare undergraduate nursing students for a written summative examination. The use of the HFS tutorial did not show an improvement in the student pass rate at assessment, although various confounding variables were identified, making a direct comparison of results difficult.

The INACSL Standards Committee (2021c) and the ASPiH (2016) publish standards for simulation design, which would need to be incorporated into the design of any further HFS tutorials. Student evaluations indicated that they enjoyed the teaching session. However, some areas could have been improved, such as using a debriefing tool to provide a more structured format for the debriefing discussion and student feedback, and allocating sufficient time to do this. A member of the skills and simulation team should also be involved in designing and facilitating future tutorials that use this format.

The use of HFS tutorials needs to be developed further, incorporating the changes in design discussed in this article, to enable firmer conclusions to be drawn about the impact of these adjustments on students' learning. The participants' results and evaluation data following HFS tutorials would need to be reviewed in order to facilitate continuous improvement in the HFS tutorials (ASPiH, 2016).

KEY POINTS

  • The undergraduate nursing student population is changing, and students prefer technology-enhanced learning
  • Students evaluated the high-fidelity simulation (HFS) tutorials well. Although there was no improvement in comparison with the exam results of the previous cohort, confounding factors prevented a direct comparison of the results
  • The literature indicates that the debriefing after HFS is the most valuable learning experience and sufficient time should be allocated to this
  • Students can struggle with an unstructured discussion, so a structured debriefing tool should be used

CPD reflective questions

  • What other innovative teaching methods could have been used to prepare students for their summative assessment?
  • From your own experience, how have you found debriefing discussions post-simulation activities that you have been involved in?
  • What other reasons could there be for students' lack of participation in the post-simulation discussion? What are the possible solutions to address this?