Simulation in radiology

Simulation training has positively affected the teaching and training of professional pilots by providing consistent exposure to challenging, high-risk situations under simulated conditions, without exposing the pilot or passengers to real risk. Simulation serves as a technology aid that, when properly utilised, maximises the value and efficiency of teaching. Radiology training, like many specialities in medicine, has been affected by reimbursement changes in the US, and these changes threaten the quality of radiology education and potentially the quality of trained radiologists. Radiology training may well benefit from the implementation of simulation training.

The traditional radiology training model has emerged over time as an ‘apprenticeship’ model, in which radiology trainees learn the craft by working closely with a practising radiologist or group of radiologists. This model requires a considerable time investment by the teacher and a close association between trainee and teacher. Although radiology training includes additional curricular elements such as didactic lectures, self-directed learning including reading, and case conferences, the primary method of learning for trainees is directly from teacher-practitioner to apprentice. Through close association, observation, and emulation, the resident learns how radiologists integrate their knowledge of the field into the three major segments of practice: image interpretation, which consists of analysing studies and documenting findings; radiology procedures, which range from barium swallows to biopsies and vascular interventions; and consultation, in which referring physicians are guided to the specific test that will best answer their patient’s clinical need. The apprenticeship model typically lasts the four-year duration of current residency training, and it is threatened as a direct result of financial pressures on the field. Indeed, the apprenticeship model has inadvertently and dramatically changed over the past 10 years.

Alex Norbash

In the traditional model, a radiology resident would first review, on their own, a series of studies accumulated over the previous several hours and formulate an initial set of impressions. The resident would then sit with the attending physician, who would patiently review all of the studies with the trainee and, as part of apprenticeship-teaching, correct any misconceptions or misinterpretations the resident might have. The resident would then dictate every single study, and once a study was transcribed, the attending physician would review the report, make any necessary corrections, and go over the transcribed dictations and suggested corrections with the trainee. The amount of time the attending physician spent with the trainee was high, while the efficiency of the process with regard to patient throughput and image visualisation was low. Additional technical challenges existed in the traditional model, including lost films, prolonged transcription times for dictations, and a lack of certainty regarding the transcribed report, since the workflow did not provide for the interpreting physician to review the transcribed report a second time in the presence of the study.

The current model in most academic institutions has evolved as we have moved away from sheets of film: technology has permitted the creation of workstation-based picture archiving and communication systems (PACS), which allow studies from multiple sites to be viewed on computer monitors. Almost simultaneously, voice dictation has contributed to changes in the training model by allowing high throughput, which has itself become a necessity as reimbursement per study has progressively fallen. In the current model, a radiology resident reviews a number of studies as images on a computer monitor and then intermittently engages the attending physician for a batch review. While the radiology resident is reviewing their studies, the attending physician is typically reading a separate series of studies that will likely never be reviewed by a radiology resident; this is a development in distinct contrast to the traditional model, and it deprives the resident of seeing a significant percentage of potentially educational studies. Of greater significance, the attending physician reviews the studies with the radiology resident in a relatively hurried manner before returning to their own workload, leaving the radiology resident to voice dictate the report. The voice-dictated report then goes into a queue that the attending physician reviews and signs off as the final version. In the current system, therefore, the attending physician reviews only some rather than all of their studies with the resident, is under greater productivity pressure and tends to rush through teaching, and, with a view to throughput, rarely has an opportunity to go over the resident’s transcribed dictations with the resident to help them learn how to craft a more refined report.

The advantages of the current system include high throughput, increased work efficiency, and markedly more rapid turnaround of the final report. As profit margins have diminished over the years, however, the overwhelming pressure on radiology departments has been to maximise throughput, and the teaching effort is now significantly compromised. In summary, the generous margins previously available from clinical work subsidised the teaching mission; with diminishing margins, no substitute source of subsidy for the teaching mission has been identified.

Given the technological capabilities of current computing systems, a natural question arises: is high-fidelity diagnostic radiology simulation possible? Such a simulation system would duplicate the workflow of a radiologist and would include multiple cases graded on performance. This would necessitate the creation of dictionaries so that the radiology resident could dictate and voice-transcribe their impressions of a particular case; once a dictation was completed, either individual dictations or batches of dictations would be graded. The grading would reflect a spectrum of performance, allowing a new resident to know how they perform against their peers, as distinct from how an experienced resident performs against theirs. Ideally, feedback from such an educational simulation system would include comments on findings that should never be missed, and would also point out interpretations reflecting exceptionally high performance. The intent is to categorise performance against one’s peers and delineate performance expectations.
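To make the grading idea concrete, the following is a minimal sketch in Python of how a dictated report might be scored against a reference case and benchmarked against peers at the same level of training. The case structure, findings, weights and names (ReferenceCase, score_dictation, peer_percentile) are hypothetical illustrations, not a description of any existing simulator.

```python
from dataclasses import dataclass, field

# Hypothetical reference case: findings a grader expects in the dictated report,
# with "never-miss" findings flagged so their absence can be highlighted in feedback.
@dataclass
class ReferenceCase:
    case_id: str
    expected_findings: dict                      # finding phrase -> weight
    never_miss: set = field(default_factory=set)

def score_dictation(case: ReferenceCase, transcript: str):
    """Return a score in [0, 1] plus any missed never-miss findings."""
    text = transcript.lower()
    earned = sum(w for f, w in case.expected_findings.items() if f in text)
    total = sum(case.expected_findings.values())
    missed_critical = [f for f in case.never_miss if f not in text]
    return (earned / total if total else 0.0), missed_critical

def peer_percentile(score: float, peer_scores: list) -> float:
    """Place a trainee's score against peers at the same training level."""
    if not peer_scores:
        return 100.0
    below = sum(1 for s in peer_scores if s <= score)
    return 100.0 * below / len(peer_scores)

# Example: a first-year resident's dictation of an illustrative head CT case.
case = ReferenceCase(
    case_id="head-ct-001",
    expected_findings={"subdural haematoma": 3, "midline shift": 2, "skull fracture": 1},
    never_miss={"subdural haematoma"},
)
score, missed = score_dictation(case, "There is midline shift and a skull fracture.")
print(score, missed)                              # 0.5 ['subdural haematoma']
print(peer_percentile(score, [0.3, 0.5, 0.8]))    # standing among same-year peers
```

In this kind of scheme the same case could carry different expected scores for a first-year and a fourth-year resident simply by comparing against different peer distributions.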

Radiologist reading CAT scan

The current curriculum for trainees includes didactic sessions with lectures, case conferences that may be multidisciplinary, involving a variety of physicians, or limited to radiologists, and a considerable amount of independent reading. One could easily envision a future educational paradigm that retains didactics and self-learning but incorporates a radiology simulator into daily work, so that individuals can more clearly identify their gaps in knowledge and ability. This type of gap assessment would direct the educational participant to media and enduring materials designed to address their personal gaps.
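As a rough sketch of how such a gap assessment might work, the fragment below aggregates simulator scores by case category and suggests material for a trainee's weakest areas. The categories, threshold and resource lists are assumptions chosen purely for illustration.

```python
# Gap assessment sketch: average simulator scores per category, flag weak areas,
# and point the trainee at curated material. All values are placeholders.
from collections import defaultdict

RESOURCES = {
    "neuro": ["Neuroradiology core lecture series", "Head CT teaching file"],
    "chest": ["Chest radiograph pattern primer"],
    "msk":   ["Musculoskeletal trauma case conference archive"],
}

def identify_gaps(graded_cases, threshold=0.7):
    """graded_cases: iterable of (category, score). Returns weak categories with materials."""
    by_category = defaultdict(list)
    for category, score in graded_cases:
        by_category[category].append(score)
    gaps = {}
    for category, scores in by_category.items():
        if sum(scores) / len(scores) < threshold:
            gaps[category] = RESOURCES.get(category, [])
    return gaps

print(identify_gaps([("neuro", 0.55), ("neuro", 0.6), ("chest", 0.9), ("msk", 0.4)]))
# {'neuro': [...], 'msk': [...]} -- directing self-study toward personal gaps
```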

One can readily envision advantages and disadvantages to such a training methodology. The advantages would include a consistent method for measuring performance and, potentially, the initiation of lifelong learning using integrated educational systems. The disadvantages could include early abandonment of apprenticeship, over-utilisation of the simulation system at the expense of didactic and reading opportunities, and excessive reliance on simulation, which would deprive the trainee of exposure to individual attending practitioners’ approaches to the practice of radiology. The contributory challenges in creating such a system include developing dictionaries that cover the full variety of necessary cases within a domain, grading the cases to reflect performance expectations that vary with the seniority of the examinee, and creating a sufficiently simple system to encourage the entry of cases into an extractable database from which educational simulators could be populated. These are not simple tasks. Although an initial version of such a simulator has been created, challenges remain in the ergonomic creation of a system that allows simple case entry, taxonomy, and grading.
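One way to picture the case-entry and taxonomy challenge is as a structured record that an attending radiologist could fill in quickly and that a simulator could later query. The sketch below is illustrative only; every field name and taxonomy value is an assumption and does not reflect the prototype mentioned above.

```python
# Illustrative case record for an extractable teaching-case database. Field names,
# taxonomy values and example content are assumptions for the sake of the sketch.
from dataclasses import dataclass, field, asdict
from typing import List

@dataclass
class SimulatorCase:
    case_id: str
    modality: str                 # e.g. "CT", "MRI", "mammogram"
    organ_system: str             # e.g. "neuro", "breast", "msk"
    expected_level: str           # e.g. "R1".."R4" -- seniority the grading assumes
    image_refs: List[str] = field(default_factory=list)      # pointers to stored images
    expected_findings: List[str] = field(default_factory=list)
    never_miss: List[str] = field(default_factory=list)

def to_record(case: SimulatorCase) -> dict:
    """Flatten a case so simulators can query the database by taxonomy fields."""
    return asdict(case)

entry = SimulatorCase(
    "breast-001", "mammogram", "breast", "R2",
    image_refs=["pacs://example/study-123"],
    expected_findings=["spiculated mass, upper outer quadrant"],
    never_miss=["spiculated mass, upper outer quadrant"],
)
print(to_record(entry))
```

The point of keeping the record this simple is ergonomic: the easier the entry form, the more likely busy attending radiologists are to populate the database at all.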

As we consider the greatest needs in creating such a simulation system, particular organ systems may be the ones that could most easily benefit from early prototypes. As an example, if there is a particularly high demand for breast imaging practitioners, and if there are consistent terminology and batches of recognised cases in the clinical domain of breast imaging, it may be a comparatively simple matter to create an educational simulator for breast imaging. In terms of other needs, an “emergency imaging” simulation system may serve as the ideal examination tool to confirm and grade residents’ readiness for independent call. An additional opportunity for the practical implementation of such tools is the ongoing performance assessment of practitioners, or the rejuvenation of skills that may have lapsed. As an example, if I chose to read head and neck MRIs following a one- or two-year hiatus, I would greatly value an opportunity to use a simulation tool to rejuvenate my skills and grade them against performance expectations. In an envisioned future where performance measurement becomes commonplace and expected, we may seek continuous objective measures of practitioner performance; we may be able to integrate test cases into the daily workflow of every radiologist to ensure that acceptable, defined performance standards are actually being met.
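As a sketch of how test cases might be woven into routine work, the fragment below seeds a daily worklist with a small fraction of known test cases and compares accuracy on those cases with a defined standard. The seeding rate, threshold and data shapes are assumptions for illustration only.

```python
# Continuous performance measurement sketch: mix known test cases into a routine
# worklist and check accuracy on them against a defined standard. Illustrative only.
import random

def seed_worklist(clinical_studies, test_cases, rate=0.02, rng=random.Random(0)):
    """Insert known test cases into the routine worklist at roughly the given rate."""
    worklist = list(clinical_studies)
    n_tests = max(1, int(rate * len(worklist)))
    for case in rng.sample(test_cases, min(n_tests, len(test_cases))):
        worklist.insert(rng.randrange(len(worklist) + 1), ("TEST", case))
    return worklist

def meets_standard(test_results, threshold=0.9):
    """test_results: list of booleans -- was each embedded test case read correctly?"""
    if not test_results:
        return True  # nothing to judge yet
    return sum(test_results) / len(test_results) >= threshold

mixed = seed_worklist([f"study-{i}" for i in range(100)], ["test-head-ct", "test-chest-xr"])
print(len(mixed))                                                    # 102 studies to read
print(meets_standard([True] * 9 + [False]))                          # 0.9 -> True
```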

As with many simulation examples, we have much to learn from the airline industry. New pilots undergo extensive simulation training as part of their education. All pilots undergo routine, recurrent simulator training and performance measurement, and if their performance falls below acceptable standards, they are not allowed to serve as airline pilots until remedial training brings them back to standard. Pilot performance is measured, and benchmarks and targets are established. Passengers’ lives are entrusted to pilots only once they have established their abilities and documented their performance, and this performance review takes place on a recurring basis.

Patients’ lives are just as valuable as airline passengers’ lives.
