
Simulation: How do we train in the future?

Prof. Dr. Anthony Gallagher, ORSI Academy, Melle (BE)

ag.gallagher@ogcmetrics.com


“The art of progress is to preserve order amid change and to preserve change amid order.” – Alfred North Whitehead (1861-1947). From the series Great Ideas of Western Man.

The acquisition, maintenance and application of skill in surgery and procedure-based medicine are matters of great importance. There is continued debate and public interest in the methods that may be used for quality assurance of surgical performance.

Agents of change: Traditionally, procedural skills have been acquired in a formal, structured apprenticeship training system. This was based on a model developed by William Stewart Halsted at Johns Hopkins at the end of the 19th and beginning of the 20th century. The apprenticeship phase in surgery and procedure-based medicine is unlike others because trainees are being prepared to carry out interventional procedures on sick patients; these procedures are frequently life-threatening and almost always pose some risk of morbidity and mortality.

There is now an accumulating body of evidence which suggests that the safety of a procedure is directly correlated with the skill of the operator. [1,2] Furthermore, reduced work hours and changing work practices, e.g., image-guided interventional procedures (laparoscopic, robotic, endovascular, endoscopic, etc.), make the flaws in this approach to training very apparent.

Simulation training: The Halstedian apprenticeship approach to training is no longer fit for purpose. It is inefficient, lacks transparency, and assessment is subjective, which can be (unfairly) used against the trainee to constrain training progression or indeed completion. There also seems to be unanimous agreement amongst the different procedure-based disciplines that simulation-based training is a better way to train. However, there is disagreement on how best to use simulation. There is even more disagreement amongst the different professional groups as to what constitutes an adequate level of simulation fidelity for it to be useful and usable.

Effectiveness of simulation training: Quantitative evidence already exists which demonstrates that simulation is a better way to train. [3] The optimal application of this approach (i.e., proficiency-based progression or PBP) has demonstrated the power of simulation to dramatically improve suturing skills, laparoscopic surgical skills, interventional cardiology skills, orthopaedic surgery skills, and anaesthetists' skills for childbirth.

In the next 12-24 months, ORSI Academy and ERUS will report compelling simulation training data for robot-assisted procedure skills. A recent systematic review and meta-analysis of published, peer-reviewed, prospective, randomised and blinded clinical studies showed that a PBP simulation-based approach to training resulted in a 60% reduction in objectively assessed performance errors in comparison to a quality-assured traditional approach. [4] However, publications on simulation to date have only demonstrated how superficially simulation is understood, with scant attention paid to the underlying science of what makes for effective simulation training.

A revolution in computer technology has led to the problems now faced by surgeons; the same technology, however, offers a very powerful training solution. Aviation has used computer-generated virtual reality (VR) simulations to train pilots for decades. However, unlike aeroplanes and airports with standardised features, real patients are all different. Furthermore, the aviation industry has over decades worked out precise protocols for dealing with different aeroplanes, airport terrains and flight scenarios; no such standardisation existed for surgical procedure performance. To utilise simulations for training, surgeons first had to develop surgical procedure templates (for a reference approach), including, for example, the individual steps of the procedure and the choice of instruments.

They also had to identify optimal performance and deviations from it (i.e., errors, critical/sentinel errors) so that engineers and computer scientists could build the simulation and accurately characterise the operation, making performance quantifiable.

The science of simulation training: PBP simulation training is a scientific approach to surgical skills training that is objective, transparent and fair to the trainee as well as the trainer. The performance metrics (i.e., procedure steps, errors and critical/sentinel errors) are the cornerstone of PBP training. These are derived from, validated by, and benchmarked on experienced surgeons who are actually good at performing the procedure in question. The metrics are developed initially during a detailed procedure characterisation with three to five experienced surgeons. [5,6]

The metrics explicitly identify observable performance before, during, and after the surgical procedure. They are then validated, initially at a Delphi consensus meeting and subsequently for construct validity. The latter requires that the metrics can be scored reliably by independent raters (i.e., with an inter-rater reliability (IRR) > 0.8) and that the performance assessments reliably discriminate between the objectively assessed performance of experienced and less experienced surgeons. Only when all of these validation criteria are met can a proficiency benchmark be quantitatively defined, based on the mean performance of the experienced practitioners.
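For readers who want to see what these validation and benchmarking steps look like as calculations, the short Python sketch below is purely illustrative: the rater scores, the number of metric items, the use of simple percent agreement in place of a formal reliability coefficient, and the expert error counts are all assumptions, not PBP data. It shows only that the reliability check and the proficiency benchmark are explicit, reproducible computations rather than matters of opinion.

```python
# Illustrative sketch only: the scores, the number of metric items and the
# use of simple percent agreement (rather than a formal reliability
# coefficient such as kappa) are assumptions for illustration, not PBP data.

def percent_agreement(rater_a, rater_b):
    """Proportion of metric items on which two independent raters agree."""
    assert len(rater_a) == len(rater_b)
    matches = sum(1 for a, b in zip(rater_a, rater_b) if a == b)
    return matches / len(rater_a)

def proficiency_benchmark(expert_error_counts):
    """Benchmark defined as the mean error count of experienced surgeons."""
    return sum(expert_error_counts) / len(expert_error_counts)

# Binary scores (1 = error observed, 0 = not observed) for one recorded
# performance, scored independently by two raters across 20 error metrics.
rater_a = [0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0]
rater_b = [0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0]

print(f"Inter-rater agreement: {percent_agreement(rater_a, rater_b):.2f}")  # should exceed 0.8

# Total error counts for five (hypothetical) experienced surgeons.
expert_errors = [3, 2, 4, 3, 2]
print(f"Proficiency benchmark (mean expert errors): {proficiency_benchmark(expert_errors):.1f}")
```

In practice the published PBP studies use formal IRR statistics and far richer metric sets; the point here is simply that both the reliability check and the benchmark are defined by measurement, not by assertion.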

In addition, the performance metrics should be explicit, binary scores, not Likert-scale assessments. Despite the voluminous reports on Likert-scale assessments, they have been demonstrated to be unreliable [7] and thus, by definition, not valid. Validated PBP metrics are then used to give the trainee explicit formative performance feedback during training, thus accelerating their learning through deliberate practice [8] rather than simply requiring the trainee to engage in repeated practice.
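In the same illustrative spirit, the hypothetical sketch below shows how binary metric scores could feed a proficiency-based progression decision. The step names, the benchmark value and the single-trial pass rule are invented for illustration; a real PBP curriculum defines these during procedure characterisation and validation.

```python
# Hypothetical sketch of a proficiency-based progression check. The step
# names, the benchmark value and the single-trial progression rule are
# assumptions for illustration, not a published PBP curriculum.

BENCHMARK_ERRORS = 3  # illustrative: mean error count of the experienced group
REQUIRED_STEPS = {"port placement", "dissection", "suturing", "closure"}

def meets_benchmark(steps_completed, error_count):
    """Pass only if every procedure step is completed and the binary-scored
    error total is at or below the proficiency benchmark."""
    return REQUIRED_STEPS.issubset(steps_completed) and error_count <= BENCHMARK_ERRORS

# A trainee's simulator trials: (steps completed, summed binary error scores).
trials = [
    ({"port placement", "dissection", "suturing"}, 6),             # step omitted, above benchmark
    ({"port placement", "dissection", "suturing", "closure"}, 4),  # all steps, still above benchmark
    ({"port placement", "dissection", "suturing", "closure"}, 2),  # meets the benchmark
]

for i, (steps, errors) in enumerate(trials, start=1):
    verdict = "meets benchmark" if meets_benchmark(steps, errors) else "continue deliberate practice"
    print(f"Trial {i}: {errors} errors -> {verdict}")
```

The design point is that the feedback is formative: each failed trial tells the trainee exactly which steps were missed and how many errors were made, which is what turns repeated practice into deliberate practice.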

Furthermore, PBP training is delivered by faculty who know and can score the metrics with an IRR > 0.8 and have been taught (in a train-the-trainers course) how to use the metrics for deliberate rather than repeated practice.

Deliberate practice and standardisation: VR, computer-based and other types of simulation mean that surgeons can now learn how to perform specific skills or procedures using the exact same devices and in the exact same way. In the past they learned these skills (and made mistakes) on real patients; on a virtual patient or a simulation they can perform the exact same procedure repeatedly and learn what not to do, as well as what to do.

This type of learning with performance feedback (i.e., deliberate practice) constitutes a very powerful approach to training that contrasts with the traditional apprenticeship model, where performance feedback and learning were much more hit-and-miss. This scientific, metric-based approach means that simulation training and proficiency benchmarking can be standardised and implemented across training centres, the EU and wider afield. Furthermore, the metrics, curriculum and proficiency benchmarks are not based on the opinions of a few key opinion leaders but on consensus between practicing clinicians at formal modified-Delphi meetings. Likewise, proficiency benchmarks are based on the actual measured performance levels of practicing clinicians. This approach to training necessitates a standardised curriculum and a systematic, agreed approach to delivering it. Such an approach has the potential to considerably reduce performance heterogeneity amongst trainees.

Order amid change: Agents of change have forced surgery and medicine to consider how future doctors are optimally prepared for safe and effective clinical practice. This will unavoidably mean a change to the way doctors are trained. The ‘Scientific Method’ has, as stated by Whitehead, the capacity to preserve order amid change. Proposals and ideas about training can be quantitatively evaluated in a scientific way, with robust empirical evidence underpinning decisions. The leadership of ERUS, the EAU and ORSI Academy are well advanced in this scientific ‘conversation’ and are very aware of the stakes involved. They also know that the scientific method and the data derived from studies in robotics, endourology, train-the-trainers etc. will guide and underpin their decision-making, thus preserving change amid order. Good-quality scientific data can also mitigate the risk that change gets bogged down in endless deliberations.

Training must be more than an interesting educational experience: This scientific and evidence-based approach to the acquisition of skills for the operating room relies on a systematic, simulation-based skills curriculum for training and education. [6] It means that surgeons (and other health care workers) can be optimally prepared for the operating room, with their performance benchmarked against that of other surgeons, before operating in vivo. Research has now shown that surgeons trained using this approach perform significantly better and make fewer errors than traditionally trained surgeons. [3,9-12]

Conclusions: Training with metric-based simulation ensures learning to a quantitatively defined performance level and greater homogeneity in trainee skill-sets. [6] Evidence from prospective, randomised studies shows that a PBP approach to education and training produces trainees with skill-sets that are 40-60% better than those of trainees using a traditional approach to training. These studies also show that trainees who receive the exact same curriculum but without the quantitatively defined performance benchmark perform only marginally better than those receiving traditional training. [11]

These results clearly demonstrate that simulation training is effective for skills acquisition but the simulation training must be more than an interesting educational experience. A PBP approach to training may be conceptually and intellectually appealing but it represents a paradigm shift in how surgeons and doctors are educated and trained. [13-17]

Figure 1a-d: (1a) The sculpture at the front of ORSI Academy, representing the ambition to scientifically measure performance to augment and enhance robotic surgical skills learning; (1b-d) three different ORSI faculty surgeons training robotic surgical skills using the exact same metric-based, deliberate practice curriculum for all trainees.

References

1. Curtis NJ, Foster JD, Miskovic D, Brown CS, Hewett PJ, Abbott S, et al. Association of surgical skill assessment with clinical outcomes in cancer surgery. JAMA Surg. 2020;155:590-8.
2. Birkmeyer JD, Finks JF, O'Reilly A, Oerline M, Carlin AM, Nunn AR, et al. Surgical skill and complication rates after bariatric surgery. N Engl J Med. 2013;369:1434-42.
3. Seymour NE, Gallagher AG, Roman SA, O'Brien MK, Bansal VK, Andersen DK, et al. Virtual reality training improves operating room performance: results of a randomised, double-blinded study. Ann Surg. 2002;236:458-63; discussion 63-4.
4. Mazzone E, Puliatti S, Amato M, Bunting B, Rocco B, Montorsi F, et al. A systematic review and meta-analysis on the impact of proficiency-based progression simulation training on performance outcomes. Ann Surg. 2021;274:281-9.
5. Gallagher AG, Ritter EM, Champion H, Higgins G, Fried MP, Moses G, et al. Virtual reality simulation for the operating room: proficiency-based training as a paradigm shift in surgical skills training. Ann Surg. 2005;241:364-72.
6. Gallagher AG, O'Sullivan GC. Fundamentals of surgical simulation: principles & practices. London: Springer Verlag; 2011.
7. Satava RM, Stefanidis D, Levy JS, Smith R, Martin JR, Monfared S, et al. Proving the effectiveness of the Fundamentals of Robotic Surgery (FRS) skills curriculum: a single-blinded, multispecialty, multi-institutional randomised control trial. Ann Surg. 2019.
8. Ericsson KA, Krampe RT, Tesch-Römer C. The role of deliberate practice in the acquisition of expert performance. Psychol Rev. 1993;100:363-406.
9. Ahlberg G, Enochsson L, Gallagher AG, Hedman L, Hogman C, McClusky DA 3rd, et al. Proficiency-based virtual reality training significantly reduces the error rate for residents during their first 10 laparoscopic cholecystectomies. Am J Surg. 2007;193:797-804.
10. Van Sickle K, Ritter EM, Baghai M, Goldenberg AE, Huang IP, Gallagher AG, et al. Prospective, randomised, double-blind trial of curriculum-based training for intracorporeal suturing and knot tying. J Am Coll Surg. 2008;207:560-8.
11. Angelo RL, Ryu RK, Pedowitz RA, Beach W, Burns J, Dodds J, et al. A proficiency-based progression training curriculum coupled with a model simulator results in the acquisition of a superior arthroscopic Bankart skill set. Arthroscopy. 2015;31:1854-71.
12. Cates CU, Lönn L, Gallagher AG. Prospective, randomised and blinded comparison of proficiency-based progression full-physics virtual reality simulator training versus invasive vascular experience for learning carotid artery angiography by very experienced operators. BMJ Simul Technol Enhanc Learn. 2016;2:1-5.
13. Gallagher AG, Ritter EM, Champion H, Higgins G, Fried MP, Moses G, et al. Virtual reality simulation for the operating room: proficiency-based training as a paradigm shift in surgical skills training. Ann Surg. 2005;241:364-72.

Due to space constraints, the entire reference list can be made available to interested readers upon request by sending an email to: communications@uroweb.org.

Saturday, 2 July, 14:15-18:00: Meeting of the EAU Robotic Urology Section (ERUS), Purple Area, Room Elicium 1
