Synchronous Screen-Based Simulation in Anesthesia Distance Education

Authors Swerdlow B, Soelberg J, Osborne-Smith L

Received 6 June 2021

Accepted for publication 12 August 2021

Published 26 August 2021 Volume 2021:12 Pages 945—956

DOI https://doi.org/10.2147/AMEP.S323569

Barry Swerdlow,1 Julie Soelberg,1,2 Lisa Osborne-Smith1,2

1Nurse Anesthesia Program, Oregon Health & Science University, Portland, OR, USA; 2Department of Anesthesiology and Perioperative Medicine, Oregon Health & Science University, Portland, OR, USA

Correspondence: Barry Swerdlow
Nurse Anesthesia Program, Oregon Health & Science University, SON 521, 3455 SW US Veterans Hospital Road, Portland, OR 97239, USA
Tel +1 503 494 6468
Fax +1 503 346 8296
Email [email protected]

Purpose: The aim of the present study was to evaluate the feasibility, acceptability, and utility of synchronous online screen-based simulation (SBS) in anesthesia education.
Methods: The investigational cohort consisted of 12 second-year nurse anesthesia students enrolled in a Doctor of Nursing Practice (DNP) program. Pairs of students worked online with a single instructor on the same SBS, employing a cloud-based peer-to-peer platform and high-fidelity software involving a graphical avatar. During each session, the instructor initially manipulated the avatar through the software scenario with educational pauses to communicate learning content. Thereafter, students proceeded through the same SBS by stating their desired actions, which were then implemented by the instructor. At the conclusion of each session, students were evaluated by an integrated software scoring system, and thereafter they completed a questionnaire rating their distance SBS experience.
Results: Synchronous online SBS was performed in this manner without difficulty; it was accepted by students as a valuable adjunct to their in-person mannequin-based simulation (MBS) training; and it was perceived as a useful addition to their anesthesia education. Students identified significant value in the isolation of the cognitive component of learning by this teaching methodology. Lack of haptic learning, however, also was seen as a disadvantage of SBS compared to MBS. Students’ criticisms of SBS were largely unrelated to use of this technique with synchronous online education, but rather related to general limitations associated with SBS technology. There was a positive correlation between the students’ mean post-SBS rating and the automated SBS score (r = 0.832).
Conclusion: Synchronous online SBS can effectively supplement MBS in an anesthesia training program. Its major perceived advantage appears to be an ability to isolate and reinforce appropriate cognitive skills related to intraoperative care including crisis management. Students who had higher mean post-SBS ratings also had higher automated SBS scores.

Keywords: simulation, anesthesia, distance education, screen-based, crisis management

Introduction

Simulation represents a critical technique in the field of anesthesia both for educational and evaluative purposes.1–4 The most common anesthesia simulations involve face-to-face activities employing task trainers, “in-situ simulation” in actual operating rooms, and computer-driven electromechanical mannequins in mock operating rooms.5 In recent years, screen-based simulation (SBS) using digital technology to represent patient scenarios on visual display units has played an escalating role in anesthesia education.6–8

SBS constitutes the overlap of high-fidelity simulation with computer-based learning.2 In a manner similar to recreational computer games, SBS in anesthesia education presents hypothetical patient scenarios using kinetic graphical images and supplemental text.7 Users interact with these scenarios via standard computer interface devices including keyboards, joysticks, touchpads, or mouse controls, and choose dialogue and management options from selection menus.9,10 SBS has been increasingly utilized in academic anesthesia largely due to its affordability, availability, simplicity, repeatability, and scorability.2,8

In comparison with mannequin-based simulation (MBS), SBS has a number of advantages.8–11 It is cost-effective and considerably less resource and personnel dependent.12 Mannequin simulators and mock operating rooms are expensive to purchase and house, and MBS sessions frequently monopolize several individuals for the management of teaching scenarios.8 Furthermore, multiple studies have shown that SBS improves cognitive skills in anesthesia trainees.9,13,14 Until recently, however, SBS in anesthesia exclusively has involved self-directed, asynchronous activities without instructor involvement.8,15

Despite the fact that both self-directed, asynchronous SBS and asynchronous or synchronous distance (online) education in anesthesia are common, no published data exist concerning the use of instructor-directed, synchronous SBS in anesthesia distance education.8 Essentially, while synchronous online learning occurs frequently in medicine including anesthesia16 – partially due to its ability to reach large groups of individuals seamlessly, its potential for flexible scheduling, and its lack of need for institutional buildings and infrastructure – the use of simulation for this purpose (ie synchronous online SBS) represents a distinct and relatively uncommon form of medical instruction.8 This novel method of teaching brings together distance education, synchronicity, and SBS, and in so doing, it may maximize the advantages of each of these elements and permit group educational experiences that maintain social distancing, a desirable feature in the current pandemic environment.

The present study examines the feasibility, acceptability, and utility of synchronous SBS as a method of teaching in anesthesia. Specifically, it attempts to address whether student satisfaction and/or achievement of educational goals with instructor-led training using SBS at a distance are comparable to the student acceptance and pedagogic value that characterize in-person MBS. In addition, while competency assessment in anesthesia largely has involved task trainers and MBS, assessment of critical thinking skills using SBS has been employed in the anesthesia resident selection process17 and for purposes of maintenance of anesthesia board certification.3 In this context, many anesthesia SBS platforms have integrated scoring systems (with instantaneous quantitative evaluation functions) that allow the learners’ decision-making processes to be captured and tracked for evaluative purposes and to provide effective feedback, a quality important to successful teaching strategies.9,10,13,18 The present study also seeks to understand whether such automated scoring can assist in identifying those students who perceive maximum value in synchronous online SBS.

Methods

The study complied with the Declaration of Helsinki and received approval from the Oregon Health & Science University (OHSU) Institutional Review Board. Twelve nurse anesthesia students (7 women and 5 men, ages 28–45 years) who had completed 17 months of the Doctor of Nursing Practice (DNP) nurse anesthesia program (NAP) and were enrolled in their Anesthesia and Co-existing Disease course constituted the research cohort. At this point in their 36-month curriculum, students had completed the traditional didactic NAP courses as well as approximately 550 clinical hours.

One week prior to the study, the SBS exercise was explained in detail to the entire cohort – including the fact that non-participation in the study would not be associated with any adverse academic consequences – and written consent was obtained from all participants. All SBS sessions were conducted using a secure, high-quality, reliable, cloud-based team collaboration application that allows videoconferencing, text messaging, and whiteboarding (www.webex.com). Each session involved two students and one instructor, and the same instructor conducted all SBS sessions. Six sessions (to accommodate the cohort of twelve students) were completed over two consecutive days. Each session lasted approximately two hours. This study constituted the first experience for all students with screen-based anesthesia simulation: no student had previous exposure to this technology.

SBS sessions used Anesthesia SimSTAT (“SimSTAT”) software18 – a high-fidelity, avatar-based system developed by the American Society of Anesthesiologists – that replicates an intraoperative adverse event (AE) in real time. The specific SBS scenario utilized in this study involved the onset of a hypercarbic AE in a patient undergoing a laparoscopic appendectomy. The general appearance of the SBS screen with this scenario is shown in Figure 1.

Figure 1 Screen-based simulation screen appearance. Real-time dynamic graphics are used to depict operating room events. Dialogue options (readable and audible) allow the user’s avatar to communicate in this environment. All anesthesia devices are interactive. Excerpted with permission from Anesthesia SimSTAT – Appendectomy course from the American Society of Anesthesiologists. Information related to the course can be requested from ASA, 1061 American Lane Schaumburg, IL 60173–4973 or online at www.asahq.org.

At the start of each session, the instructor introduced the technology, and thereafter manipulated the program avatar through the software scenario with scripted educational pauses to communicate learning content. Because the SBS software scenario was run in real time, use of a “pause” option was essential to allow for such instructor-led teaching opportunities. A small portion of the scripted instructor session for this scenario with a pause to teach important anesthesia considerations is shown in Figure 2. Examples of these considerations included best practice diagnostic algorithms and interventions for increasing hypercarbia, increased intra-abdominal pressures, and elevated airway pressures during laparoscopic surgery.

Figure 2 Example of script for instructor’s screen-based simulation session. The screen-based simulation scenario was paused to allow discussion of important anesthesia considerations.

Abbreviation: SBS, screen-based simulation.

During the initial instructor-led SBS, students were asked to apply a specific cognitive template for diagnosis and management of adverse intraoperative events. This mental framework was developed by one of the authors based on well-defined principles of anesthesia crisis management and the role of differential diagnoses in perioperative dynamic decision making.19,20 The template had been introduced and practiced during the students’ didactic instruction and consisted of consecutive generic instructions guiding students in how to construct their responses to intraoperative AEs. It involved sequentially appraising the likelihood of an AE being artifact, responding generically to the AE, outlining the broad differential diagnosis (DDX) of the AE, narrowing the DDX predicated on the presence of coexisting AEs, situational data, and events immediately preceding the AE, and then responding specifically to the most likely AE or AEs. In essence, this template directed students to shrink the list of likely conditions and adjust epistemic confidences in order to respond optimally to a great variety of intraoperative problems in real time (Figures 3 and 4).

Figure 3 Cognitive template for diagnosis and management of an intraoperative adverse event.

Abbreviations: AE, adverse event; DDX, differential diagnosis.

Figure 4 Application of cognitive template to hypercarbia in screen-based simulation.

Abbreviations: AE, adverse event; ABG, arterial blood gas; DDX, differential diagnosis; CO2, carbon dioxide; FiCO2, fractional inspired concentration of carbon dioxide.
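To make the sequential logic of this template explicit, the following minimal Python sketch walks a single AE through the five steps described above. It is purely illustrative and was not part of the study; all function names, data structures, and example values are hypothetical.

```python
# Minimal, illustrative sketch of the cognitive template for intraoperative
# adverse event (AE) management described above. Not part of the study;
# all names and example data are hypothetical.

from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class AdverseEvent:
    """A simplified representation of an intraoperative AE."""
    finding: str
    coexisting_aes: List[str] = field(default_factory=list)
    situational_data: Dict[str, List[str]] = field(default_factory=dict)
    preceding_events: List[str] = field(default_factory=list)


def apply_template(ae: AdverseEvent) -> List[str]:
    """Walk one AE through the five sequential steps of the template."""
    actions = []

    # 1. Appraise the likelihood that the AE is artifact.
    actions.append(f"Rule out artifact for '{ae.finding}' (check sensor and waveform).")

    # 2. Respond generically to the AE (diagnosis-independent safety measures).
    actions.append("Generic response: confirm oxygenation, ventilation, and circulation.")

    # 3. Outline the broad differential diagnosis (DDX).
    broad_ddx = ["equipment", "drug effect", "patient physiology", "surgical cause"]
    actions.append(f"Broad DDX: {', '.join(broad_ddx)}.")

    # 4. Narrow the DDX using coexisting AEs, situational data, and preceding events.
    favored = ae.situational_data.get("favored", broad_ddx)
    narrowed = [dx for dx in broad_ddx if dx in favored]
    actions.append(
        f"Narrowed DDX given {ae.coexisting_aes} and {ae.preceding_events}: "
        f"{', '.join(narrowed) or 'reassess'}."
    )

    # 5. Respond specifically to the most likely cause(s).
    actions.append(
        f"Specific response directed at: {narrowed[0] if narrowed else 'reassess the DDX'}."
    )
    return actions


if __name__ == "__main__":
    hypercarbia = AdverseEvent(
        finding="rising end-tidal CO2",
        coexisting_aes=["new hypoxemia", "worsening hyperpyrexia"],
        situational_data={"favored": ["patient physiology", "surgical cause"]},
        preceding_events=["CO2 insufflation for laparoscopy"],
    )
    for step in apply_template(hypercarbia):
        print(step)
```

In the hypercarbia scenario used in this study, the narrowing step would weigh the coexisting hypoxemia and hyperpyrexia together with the ongoing CO2 insufflation for laparoscopy, as in the example input above.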

Following the instructor’s teaching simulation conducted with selected pauses, students working in pairs proceeded consecutively through the same SBS by verbally directing the instructor’s avatar manipulations: the students chose avatar actions and communicated them to the instructor, who then directed the avatar accordingly. This latter student-led process, unlike the instructor session, was without pauses or interruptions.

At the end of each SimSTAT session, the students’ performances were tracked via an integrated scoring system that involved component and overall ratings. Component ratings were based on the user’s actions (or inactions) in specific domains that included Professionalism, Medical Knowledge, Systems-based Practice, Interpersonal and Communication Skills, and Patient Care. In this manner, the integrated scoring system assessed the students’ strengths and weaknesses and identified areas for improvement. Both component scores and overall scores were reviewed with each pair of students. An example of SimSTAT automated scoring is shown in Figure 5.

Figure 5 Example of automated screen-based simulation scoring. Several rating categories are shown as examples (Systems-based Practice, Professionalism, Interpersonal and Communication Skills, and Patient Care). Excerpted with permission from Anesthesia SimSTAT – Appendectomy course from the American Society of Anesthesiologists. Information related to the course can be requested from ASA, 1061 American Lane Schaumburg, IL 60173–4973 or online at www.asahq.org.
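For orientation, the sketch below shows one way such a component-plus-overall score report might be represented. The domain names are taken from the text above; the numeric values and the equal-weight overall calculation are assumptions made purely for illustration.

```python
# Illustrative representation of a SimSTAT-style automated score report.
# Domain names come from the study text; the values and the equal-weight
# overall calculation are assumptions for illustration only.

component_scores = {
    "Professionalism": 0.70,
    "Medical Knowledge": 0.62,
    "Systems-based Practice": 0.55,
    "Interpersonal and Communication Skills": 0.68,
    "Patient Care": 0.60,
}

# Assume the overall rating is an unweighted mean of the component ratings.
overall_score = sum(component_scores.values()) / len(component_scores)

for domain, score in component_scores.items():
    print(f"{domain}: {score:.0%}")
print(f"Overall: {overall_score:.0%}")
```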

After concluding their SBS exercise, students completed a post-SBS questionnaire (Table 1) in which they were asked to evaluate their distance SBS experience using quantitative 5-point Likert scales (with 5 indicating strong agreement and 1 indicating strong disagreement) and qualitative responses. These data reflected the students’ perceptions of the educational usefulness and physiological fidelity of the experience. The questionnaire asked students to compare SBS at a distance with their other anesthesia-related simulation experiences (in-person MBS), and it provided insight into process improvement as well as learning gains.

Table 1 Post-Screen-Based Simulation Questionnaire Content. Relative agreement with statements was evaluated using a 5-point Likert scale.

Analysis

Quantitative answers were averaged for each Likert-scale response on the questionnaire, both according to question and according to each individual student. Students’ mean post-SBS ratings were compared with their overall automated SBS scores using a Pearson’s correlation test to address the question of whether there was an association between students’ perceptions of the educational value of the SBS exercise and the results of their SBS numerical assessments. Responses to open-ended inquiries (including why or why not questions and suggestions for process improvement) were merged to create themes related to the feasibility, acceptability, and utility of the educational experience for students.
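A minimal sketch of this analysis is shown below, assuming the Likert responses are arranged with one row per student pair and one column per questionnaire item; because one automated score was generated per student pair (see Results), the correlation is computed on pair-level mean ratings. All numbers are placeholders, not study data.

```python
# Minimal sketch of the quantitative analysis described above.
# The arrays below are placeholders, not study data.

import numpy as np
from scipy.stats import pearsonr

# Hypothetical Likert responses (rows = student pairs, columns = questionnaire items).
likert = np.array([
    [4, 5, 4, 3, 4],
    [5, 4, 4, 4, 5],
    [3, 4, 3, 3, 4],
    [4, 4, 5, 4, 4],
    [5, 5, 4, 4, 5],
    [3, 3, 4, 3, 3],
])

# Hypothetical overall automated SBS scores (one per student pair, in percent).
sbs_scores = np.array([62, 68, 55, 60, 71, 51])

# Mean response per questionnaire item and mean post-SBS rating per pair.
item_means = likert.mean(axis=0)
pair_means = likert.mean(axis=1)

# Pearson's correlation between mean post-SBS ratings and automated SBS scores.
r, p = pearsonr(pair_means, sbs_scores)
print("Item means:", np.round(item_means, 2))
print(f"Pearson r = {r:.3f} (p = {p:.3f})")
```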

Results

Feasibility

Synchronous online SBS using a reliable, cloud-based team collaboration application and high-fidelity software involving a graphical avatar (SimSTAT) was feasible when performed in the manner detailed above with pairs of students and one instructor. All SBS sessions were completed approximately in the time allocated (2 hours), which included adequate time for both the instructor and student sessions and for analysis of student performances based on the automated SBS rating system. Overall, parameters integral to the feasibility of instructor-led SBS sessions included students’ willingness to participate, a high-quality, reliable cloud-based collaboration application, high-fidelity SBS software, a low faculty-to-student ratio, and adequate time allocation for the SBS sessions. Despite the relatively small size of this cohort, it should be noted that the instructor time burden involved roughly 14 hours over two consecutive days (dedicated to exercise performance) plus additional time for event preparation. This workload had the potential to induce instructor fatigue and thereby negatively impact the exercise, and the number of students engaged in the exercise per day (6) likely represents a maximum workable figure for a single instructor for this length of simulation.

Acceptability

Students perceived significant educational value in synchronous online SBS. The mean response (standard deviation) to the statement “The SimSTAT exercise was a valuable educational experience” was 4.3 (0.8) on a scale of 1–5 (see above) (Table 1). Furthermore, the mean responses to all of the other questionnaire elements – wherein students were provided with statements of specific potential positive attributes of the experience and asked to disagree or agree – ranged from 3.5 (0.8) to 4.4 (0.7). Notably, all students desired to add SimSTAT to their curriculum as a “learning adjunct,” with a mean Likert-scale score of 3.9 (1.1) in response to the statement “I would like more SimSTAT exercises incorporated into the nurse anesthesia educational curriculum.”

A common theme related to educational value was student appreciation that the synchronous online SBS isolated the cognitive component of learning, without the need to focus on technical skills. Compared with their MBS experience, students commented that they “had more time to think through why (they were) performing interventions.” An additional common theme related to the value of the pause option in teaching: “SimSTAT was a very valuable experience. First, the experience was great with the stops and discussing what we would do and answering questions about the case.” Also, “… it was great to be able to stop along the way, have discussions about a particular point of patient management, and get real time feedback.” Overall, the instructor-led session with educational pauses was perceived as having major value. In addition, the pause option reduced stress for students. Other factors that reduced the stress of online synchronous SBS in this study (wherein students rated SimSTAT only slightly less stressful than in-person simulation, with a mean score of 3.7 (0.9)) included the lack of need to focus on technical skills, a limitation on sensory input, and the inability to multitask (the latter two factors adversely affecting the SimSTAT fidelity rating; see below).

In general, students found the exercise easy to perform (mean score 3.8 (1.1)) only because they did not need to learn the specifics of software manipulation: “it was a matter of communicating the intervention to the instructor.” Otherwise, they perceived the software platform to be “cumbersome with a significant learning curve” and “not intuitive.”

Insofar as fidelity was concerned, the students were critical of multiple aspects of the software performance, including the inability to multitask (a repeated student observation), the fact that only one character could speak at a time, limited decision-making options including dialogue choices, and the lack of the entire panoply of distractions that occurs in an operating room (music, instrument noise, voices, relative darkness during laparoscopy, etc.). Likely for this reason, the Likert-scale response to the statement “SimSTAT accurately reproduced a real intraoperative environment as well or better than in-person simulation” had the lowest mean value (3.1 (1.0)).

On the other hand, specifically considering fidelity related to the anesthesia machine and monitor, the students agreed that synchronous online SimSTAT fidelity was superior to their in-person MBS experiences. “The simulation in the lab sometimes lacks the congruent monitor reflection of patient’s status (especially ventilator settings).” Several students commented that they could better trust physiologic data generated in the SBS compared with similar data during MBS. Student responses to the statement “SimSTAT accurately reproduced adverse physiologic changes as well or better than in-person simulation” generally reflected this opinion, and the mean Likert-scale response associated with this statement was 4.1 (0.7).

Utility

Students agreed that synchronous online SBS has utility. They perceived this utility as supplemental to rather than as a substitute for in-person simulation: “This is a good tool, but it … should complement in-person sim not replace (it).” The cohort stressed that learning best-practice cognitive approaches to intraoperative issues was the outstanding value of this methodology. For this reason, they desired “… more regular SimSTAT events (during the course of their education) with real time debriefing and discussion throughout the case.” In addition, however, a common theme was the fact that in-person simulation was essential to teach technical aspects of anesthesia care, because their “job is a tactile job,” and as such, requires a “physical sim lab.”

Students strongly believed that their SimSTAT experience assisted them in the acquisition of knowledge related to the management of intraoperative AEs (mean score of 4.3 (0.5)). In the context of this learning objective, a common theme referenced the AE template “reinforc(ing) the broad outline to approaching adverse events (artifact, generic response, broad DDX, narrow DDX, overlap, intervention, etc.).” The response to the statement “SimSTAT taught the proper real-time cognitive response to physiologic changes in the operating room as well as or better than in-person simulation” was associated with a similarly high Likert-scale score (4.3 (0.6)).

In addition, 75% of students (n=9) replied that the exercise reinforced the importance of early communication with the surgical team and calling for help at appropriately early times (responses evaluated by automated SBS scoring). With respect to specific management strategies, students learned the cardinal importance of prioritizing therapeutic interventions, treating critical electrolyte disorders early, and requesting timely desufflation of a pneumoperitoneum with conversion to laparotomy during a laparoscopic procedure in patients with worsening hypercarbia and hemodynamic instability.

I feel this truly cemented all of the steps and why and how we perform them for this specific scenario. Rather than reading a list of steps, this put it all together in a way lecture cannot.

Even though ease of performance was maximized (and stress minimized) by instructor-mediated student manipulation of the SBS, 58% of students (n=7) commented on both a delay in response due to this approach and a desire to perform the simulations directly – without an instructor intermediary: “Have the learners be the program users.” Students noted that they would be amenable to learning the technology needed to achieve this latter level of performance, especially if provided with a software tutorial prior to the exercise. Additional suggested improvements related to a desire to utilize the pause option without forfeiting their view of the operating room environment (a feature of SimSTAT), a desire to employ additional AE scenarios, and a request to practice asynchronously on their own (necessitating learning the intricacies of software use). The latter two options can be useful strategies to reinforce important themes in successful anesthesia crisis management.

SBS Automated Scoring

The range of automated SBS scores for student performances was 51–71%. All simulated SBS scenarios ended in successful virtual patient resuscitation. These numbers and outcomes compare very favorably with the initial attempts of NAP faculty with the same SimSTAT program (faculty uniformly achieved scores < 50% during their first encounters with this system and uniformly failed to resuscitate their patients during these initial attempts), a finding that likely relates in part to the fact that students, unlike NAP faculty members, were not required to be proficient with the software since they worked through an instructor interface.

A Pearson’s correlation test was performed to assess the relationship between the students’ mean post-SBS ratings and the automated SBS scores. The students performed the SBS exercise in pairs, and one automated SBS score was generated for each pair. Therefore, the mean post-SBS rating for each student pair was calculated and used in the analysis. There was a positive correlation between the students’ mean post-SBS ratings and their SimSTAT automated SBS scores (r = 0.832). Visual analysis of the scatterplot was consistent with this finding. In this small sample of nurse anesthesia students, it appears that students who had higher mean post-SBS ratings also had higher SBS automated scores.

Discussion

Anesthesia training programs’ needs for synchronous online simulation have changed in the era of coronavirus disease 2019 (COVID-19). Mandatory social isolation imposed in response to the viral pandemic has radically altered the dynamics of anesthesia education – much as it has altered the options for all traditional educational forums21 – and online distance teaching has assumed a major role in substituting for former in-person curricular activities.22,23 This same social pressure has motivated the use of distance anesthesia education using SBS to fill gaps in educational programs that formerly employed solely face-to-face techniques such as MBS.8

The current prospective, small-scale study considered the use of synchronous online SBS as part of an existing course in the OHSU DNP NAP. This study demonstrates that such distance teaching is feasible, is accepted by students as a valuable educational addition to their curriculum, and is perceived by students as having significant utility as an adjunct to in-person MBS. A major advantage of SBS in this setting appears to be its ability to focus on cognitive processes without distraction and to teach aspects of critical thinking involved in the delivery of anesthesia care. This is undoubtedly true of in-person SBS as well, but the fact that such focus can be maintained with online learning is critical to the usefulness of this teaching methodology.

These critical thinking skills lie at the heart of effective crisis resource management, and teaching them represents a major objective for simulation in anesthesia.7,24 Cognitive exercises to accomplish this task appear to be precisely the elements of anesthesia simulation that can be best replicated using synchronous online SBS. Crises are intrinsic to the practice of anesthesiology, and the ability to use diagnostic algorithms to interpret and appropriately react to adverse events is essential for their rapid and successful resolution.19 Furthermore, SBS is ideal for teaching anesthesia trainees to employ metacognition – learners come to understand how they think and to recognize when they do or do not understand a concept. Such metacomprehension is a key element of efficient, successful learning25 and ultimately may allow anesthesia professionals to modulate their own thinking and thereby improve patient safety.26

In this study, a cognitive template for management of intraoperative AEs was taught synchronously during the initial instructor cycle using the framework of the SBS scenario to provide the skeletal elements for discussion (Figure 3). This template provided a sequential approach to diagnosis and management of common crises by considering:

  • The likelihood of an AE being artifact
  • The appropriate generic response to the AE
  • The broad DDX of the AE
  • A narrowed DDX predicated on the presence of coexisting AEs, situational parameters, and a consideration of events immediately preceding the AE
  • The appropriate specific response to the likely cause of the AE

During the SimSTAT exercise, the specific AE addressed in this manner involved increasing hypercarbia in the setting of new hypoxemia and worsening hyperpyrexia during general anesthesia for an emergent laparoscopic appendectomy. Students consistently found the synchronous online SBS platform to be an effective method to practice and reinforce the appropriate application of this template to the AE (Figure 4) and to demonstrate the utility of this cognitive approach to intraoperative crisis management.

In general, many of the positive educational attributes of SBS appear to apply to the use of synchronous online SBS at a distance. A number of studies have shown that SBS improves both cognitive and psychomotor skills in anesthesia trainees.8,9,13,22 These skills include utilization of cognitive aids,27 a process reinforced by SimSTAT and evaluated in the automated SBS rating reviewed synchronously with students at the end of their exercise. SBS also has been shown to improve teamwork skills.11,28 This finding is crucial in anesthesia since good teamwork, including effective communication and leadership, allows for successful management of intraoperative adverse events.27 In this context, it is noteworthy that multiple students commented that the need for early communication with colleagues (summoning help) and the surgical team was reinforced by this online synchronous SBS. Such teamwork and communication skills are fundamental elements of effective perioperative crisis management.27

Similarly, many of the broad limitations of SBS appear to apply to synchronous online SBS. Foremost among these limitations is the inability of any form of SBS to provide tactile learning. Unlike virtual reality technology, SBS has no haptic component, and as such, teaching is relegated to “thinking processes” rather than “doing processes.” In addition, students noted limitations to the fidelity of the SBS offered by SimSTAT, an example of state-of-the-art anesthesia software designed for simulation purposes. These shortcomings included the inability to multitask (a repeated student observation), the fact that only one character could speak at a time, limited decision-making options including dialogue choices, and lack of the entire panoply of distractions that occurs in an operating room. Predominantly due to its relative inability to teach motor skills and some of the specific fidelity problems associated with the SimSTAT platform, rather than the distance component of the exercise or its synchronous nature, students favored the option of synchronous online SBS as a useful adjunct rather than as a replacement for MBS. They viewed its utility predominantly in terms of instructor-facilitated teaching of critical thinking skills and training for real-time cognitive responses to patient care scenarios including intraoperative crisis management.

While SimSTAT’s fidelity clearly has its limitations, students recognized its sophistication (including an ability to reproduce some physiologic responses superior to MBS). They also acknowledged its steep learning curve. Because of this learning curve, the study was designed so that students first synchronously viewed an instructor-performed session and then verbally directed the actions of the instructor as the latter interacted directly with the software avatar. The major advantage of this method was that it allowed students who had not previously encountered the simulation software to use it relatively effortlessly – students did not need to master the complex program prior to the SBS. However, this option also reduced students’ interaction with the simulation (since they worked via the instructor to control the avatar), and thereby may have compromised experiential learning, particularly in comparison with MBS. Several students commented on this specific drawback, and while utilizing an instructor interface reduced stress and allowed students to focus on non-technical aspects of the SBS, it also limited the potential learning associated with the exercise. Future studies may provide an opportunity for asynchronous software mastery in advance of the synchronous exercise (for example, with an online tutorial) and thereafter employ a technique wherein the instructor first runs the entire avatar-based scenario with selected pauses for teaching (as in the current study), but students then manipulate the software directly (rather than via the instructor).

At the conclusion of their exercises, feedback on student performances was provided both by the instructor and by an integrated software scoring system. One of the determinants of students’ perceptions of educational quality is their success in the academic process.29 Therefore, we conjectured that those students who scored higher using the software’s automated system would be more likely to perceive increased value in the SBS exercise. We expected to see a correlation between these two parameters, and indeed, there was a positive correlation between the students’ mean post-SBS rating and their automated SBS score (r = 0.832). Visual analysis of the scatterplot of these variables was consistent with this finding. As such, it is possible that a SBS automated system such as the one employed in this study may assist in identifying those students who will obtain maximum benefit from this educational technology. A study utilizing a larger cohort and allowing students to perform the SBS exercise individually, rather than in pairs, may better address this possibility.

Limitations

The results of this investigation need to be interpreted in the context of its small scale. The study cohort consisted of only 12 students. As such, despite the fact that the post-SBS responses were consistent and reflected the acceptability and utility of online synchronous SBS in this setting, the current findings need to be confirmed by studies involving larger numbers of participants. A similar limitation applies to the study’s finding of a positive correlation between the students’ mean post-SBS rating and their automated SBS score. Furthermore, the SBS software (SimSTAT) employed in this study was designed specifically for asynchronous, individual learner use without group participation and without an instructor intermediary. As such, the study results may not be generalizable to independent learners or even to learner groups employing the software directly in the manner for which it was designed. Another limitation of this investigation is that students performed the simulation in pairs, rather than individually, largely due to time constraints (although working with a partner likely reduced the stress level associated with this novel and unfamiliar endeavor). In addition, the cognitive template that provided an organizational framework for teaching management of perioperative AEs in this study, while founded upon well-established principles of anesthesia crisis management, has not previously been validated.

Conclusion

Online synchronous SBS may provide a partial solution to the need for continued simulation teaching during the COVID-19 crisis that mandates social distancing.8 Furthermore, even before the current viral pandemic ends, as simulation centers reopen with initial relaxation of quarantine restrictions, continued mandated physical distancing between learners may restrict the full reinstitution of MBS. High demands on MBS in the emerging post-COVID-19 era undoubtedly will complicate and limit the use of that modality in many training programs. For this reason, especially given the feasibility, acceptability, and educational utility of this technique demonstrated by the current prospective, small-scale study, online synchronous SBS may provide an attractive method of distance teaching. As a result, and especially with the prospect of repeat pandemics in the foreseeable future, distance education involving SBS software platforms of increasingly high fidelity likely will become integral parts of future anesthesia training programs.

Disclosure

The authors report no conflicts of interest in this work.

References

1. Boulet JR, Murray DJ. Simulation-based assessment in anesthesiology: requirements for practical implementation. Anesthesiology. 2010;112(4):1041–1052. doi:10.1097/ALN.0b013e3181cea265

2. Saddawi-Konefka D, Cooper JB. Anesthesia and simulation: an historic relationship. In: Mahoney B, Minehart R, Pian-Smith MCM, editors. Comprehensive Healthcare Simulation: Anesthesiology. Switzerland: Springer; 2020:3–13.

3. Shah A, DeMaria S, Goldberg A. Competency testing. In: Mahoney B, Minehart R, Pian-Smith MCM, editors. Comprehensive Healthcare Simulation: Anesthesiology. Switzerland: Springer; 2020:61–71.

4. Yunoki K, Sakai T. The role of simulation training in anesthesiology resident education. J Anesth. 2018;32(3):425–433. doi:10.1007/s00540-018-2483-y

5. Schaff J, Russell C. Mannequin-based simulators and part-task trainers. In: Mahoney B, Minehart R, Pian-Smith MCM, editors. Comprehensive Healthcare Simulation: Anesthesiology. Switzerland: Springer; 2020:107–115.

6. Nelson CK, Schwid HA. Screen-based simulation for anesthesiology. Int Anesthesiol Clin. 2015;53(4):81–97. doi:10.1097/AIA.0000000000000076

7. Green M, Tariq R, Green P. Improving patient safety through simulation training in anesthesiology: where are we? Anesthesiol Res Pract. 2016;4237523. doi:10.1155/2016/4237523

8. Swerdlow B, Soelberg J, Osborne-Smith L. Distance education in anesthesia using screen-based simulation – a brief integrative review. Adv Med Educ Pract. 2020;11:563–567. doi:10.2147/AMEP.S266469

9. Edwards DA, Lampotang S. Computer- and web-based simulators and virtual environments. In: Mahoney B, Minehart RD, Pian-Smith MCM, editors. Comprehensive Healthcare Simulation: Anesthesiology. Switzerland: Springer; 2020:117–125.

10. Ventre KM, Schwid HA. Computer and web-based simulators. In: Levine AI, DeMaria S Jr, Schwartz AD, Sim AJ, editors. The Comprehensive Textbook of Healthcare Simulation. New York: Springer; 2013:191–208.

11. Liaw S, Chan SW, Chen FG, Hooi SC, Siau C. Comparison of virtual patient simulation with mannequin-based simulation for improving clinical performances in assessing and managing clinical deterioration: randomized controlled trial. J Med Internet Res. 2014;16(9):e214. doi:10.2196/jmir.3322

12. Schwid HA, Souter KJ. Resident perceptions and cost analysis of a virtual patient application for anesthesia-related critical incidents. J Educ Perioper Med. 2014;16(11):E077.

13. Erlinger LR, Bartlett A, Perez A. High-fidelity mannequin simulation versus virtual simulation for recognition of critical events by student registered nurse anesthetists. AANA J. 2019;87(2):105–110.

14. Schwid HA, Rooke GA, Ross BK, Sivarajan M. Use of a computerized advanced cardiac life support simulator improves retention of advanced cardiac life support guidelines better than a textbook review. Crit Care Med. 1999;27(4):821–824. doi:10.1097/00003246-199904000-00045

15. CAE Healthcare. CAE Healthcare supports the ASA in anesthesiology training simulation. Available from: https://caehealthcare.com/blog/cae-healthcare-supports-the-asa-in-anesthesiology-training-simulation/. Accessed November 17, 2020.

16. Pecka SL, Kotcherlakota S, Berger AM. Community of inquiry model: advancing distance learning in nurse anesthesia education. AANA J. 2014;82(3):212–218.

17. Kulig AW, Blanchard RD. Use of cognitive simulation during anesthesiology resident application interviews to assess higher order thinking. J Grad Med Educ. 2016;8(3):417–421. doi:10.4300/JGME-D-15-00367.1

18. American Society of Anesthesiologists. Simulation training for physician anesthesiologists is now available anywhere, anytime. Available from: https://www.asahq.org/simulation. Accessed November 17, 2020.

19. Gaba DM, Fish KJ, Howard SK, Burden AR. Fundamentals of dynamic decision making in anesthesia. In: Gaba DM, Fish KJ, Howard SK, Burden AR, editors. Crisis Management in Anesthesiology. 2nd ed. Philadelphia: Elsevier; 2015:6–24.

20. Bowe EA. Clinical reasoning. In: Bowe EA, Schell RM, Dilorenzo AN, editors. Education in Anesthesia: How to Deliver the Best Learning Experience. Cambridge: Cambridge University Press; 2018:17–24.

21. Rajab MH, Gazal AM, Alkattan K. Challenges to online medical education during the COVID-19 pandemic. Cureus. 2020;12(7):e8966.

22. Sneyd JR, Mathoulin SE, O’Sullivan EP, et al. Impact of the COVID-19 pandemic on anesthesia trainees and their training. Br J Anaesth. 2020;125(4):450–455. doi:10.1016/j.bja.2020.07.011

23. Shah AP, Falconer R, Watson AJM, Walker KG. Teaching surgical residents in the COVID-19 era: the value of a simulation strategy. J Surg Educ. 2020. doi:10.1016/j.surg.2020.08.043

24. Gaba DM, Fish KJ, Howard SK, Burden AR. Teaching anesthesia crisis resource management. In: Gaba DM, Fish KJ, Howard SK, Burden AR, editors. Crisis Management in Anesthesiology. 2nd ed. Philadelphia: Elsevier; 2015:54–64.

25. Weidman J, Baker K. The cognitive science of learning: concepts and strategies for the educator and learner. Anesth Analg. 2015;121(6):1586–1599. doi:10.1213/ANE.0000000000000890

26. Stiegler MP, Neelankavil JP, Canales C, Dhillon A. Cognitive errors detected in anesthesiology: a literature review and pilot study. Br J Anaesth. 2012;108(2):229–235. doi:10.1093/bja/aer387

27. Gaba DM, Fish KJ, Howard SK, Burden AR. Principles of anesthesia crisis resource management. In: Gaba DM, Fish KJ, Howard SK, Burden AR, editors. Crisis Management in Anesthesiology. 2nd ed. Philadelphia: Elsevier; 2015:25–53.

28. Youngblood P, Harter PM, Srivastava S, Moffett S, Heinrichs WL, Dev P. Design, development, and evaluation of an online virtual emergency department for training trauma teams. Simul Healthc. 2008;3(3):146–153. doi:10.1097/SIH.0b013e31817bedf7

29. Akareem HS, Hossain SS. Determinants of education quality: what makes students’ perception different? Open Rev Educ Res. 2016;3(1):52–67. doi:10.1080/23265507.2016.1155167
