Description
Abstract
Ultrasound-guided regional anaesthesia (UGRA) minimises the risk of systemic toxicity, allows real-time identification of anatomy and needle position, and reduces procedural time. However, existing training may be insufficient to equip trainees with the knowledge they need to become experts¹. Assessment is typically subjective, and it is unknown how educational methods affect students' retention of UGRA skills. Cadaveric and non-cadaveric simulators using visualisation technologies such as augmented reality (AR) and virtual reality (VR) offer simulator training opportunities with the ability to collect objective assessment metrics. AR enables surgeons and anaesthetists to view 3D objects superimposed on the surface of the body, creating a view of the organs and anatomical structures inside². VR has been used in plastic, maxillofacial and breast surgery, amongst others, for clinical practice and training³. Yet the mechanisms underpinning performance need to be understood if optimal performance is to be achieved. The Expert Performance Approach (EPA)⁴ involves capturing observable expert performance on representative tasks, using metrics including eye movements and other sensory measures, verbal reports and experimental procedures, to identify the specific physiological or cognitive mechanisms that account for the expert performance advantage over less-skilled performers.
In my talk I will discuss the application of eye tracking to UGRA to explore how visual attention and cognitive intent can be quantitatively measured. Eye tracking, which can be built into AR and VR systems or operate independently using mobile glasses, has been deployed for training in laparoscopy, radiology, pathology and ultrasound-guided regional anaesthesia⁵, and helps us to understand the attentional mechanisms underpinning performance. In our studies comparing novice anaesthetists with experts in interscalene block performance, novices had 2 to 3 times longer procedure times, 2 to 3 times more glances away from the region of interest during scanning, and 3 times more fixations but 3 times shorter fixation times during needling⁶. Our group has created eye-tracking software that instantly calculates learning curves, validated our simulator, developed new metrics, and translated knowledge and skills from simulators to patients. The evolving landscape of visualisation technologies in anaesthesia will be central to the talk.
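Metrics such as fixation counts, fixation durations and glances away from a region of interest are typically derived from the raw gaze samples exported by an eye tracker. The sketch below is a minimal, hypothetical Python illustration of how such metrics might be computed; it is not the group's own software, and the sample format, dispersion threshold, minimum fixation duration and region-of-interest rectangle are assumptions made purely for illustration.

```python
# Hypothetical sketch: deriving simple gaze metrics from raw eye-tracking samples.
# Not the authors' software; sample format, thresholds and ROI are illustrative assumptions.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class GazeSample:
    t: float  # timestamp in seconds
    x: float  # horizontal gaze position in pixels
    y: float  # vertical gaze position in pixels

def detect_fixations(samples: List[GazeSample],
                     dispersion_px: float = 30.0,
                     min_duration_s: float = 0.1) -> List[Tuple[float, float]]:
    """Dispersion-threshold (I-DT style) fixation detection.
    Returns (start_time, end_time) for each detected fixation."""
    fixations, start = [], 0
    for end in range(len(samples)):
        window = samples[start:end + 1]
        xs, ys = [s.x for s in window], [s.y for s in window]
        if (max(xs) - min(xs)) + (max(ys) - min(ys)) > dispersion_px:
            # Dispersion exceeded: close the window at the previous sample.
            if len(window) > 1 and window[-2].t - window[0].t >= min_duration_s:
                fixations.append((window[0].t, window[-2].t))
            start = end
    # Flush any fixation still open at the end of the recording.
    window = samples[start:]
    if len(window) > 1 and window[-1].t - window[0].t >= min_duration_s:
        fixations.append((window[0].t, window[-1].t))
    return fixations

def glances_outside_roi(samples: List[GazeSample],
                        roi: Tuple[float, float, float, float]) -> int:
    """Count transitions from inside to outside a rectangular region of interest."""
    x0, y0, x1, y1 = roi
    inside = [x0 <= s.x <= x1 and y0 <= s.y <= y1 for s in samples]
    return sum(1 for prev, cur in zip(inside, inside[1:]) if prev and not cur)

if __name__ == "__main__":
    # Synthetic example at 100 Hz: steady gaze, a brief glance away, steady gaze again.
    data = ([GazeSample(t / 100, 500, 300) for t in range(30)] +
            [GazeSample(t / 100, 900, 600) for t in range(30, 40)] +
            [GazeSample(t / 100, 510, 305) for t in range(40, 70)])
    fixations = detect_fixations(data)
    durations = [end - start for start, end in fixations]
    if durations:
        print(f"fixations: {len(fixations)}, "
              f"mean duration: {sum(durations) / len(durations):.2f} s")
    print(f"glances away from ROI: {glances_outside_roi(data, (400, 200, 700, 450))}")
```

In practice these metrics would be aggregated per trial and plotted against attempt number to produce the learning curves described above.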
References
1. Cheung JJ, Chen EW, Al-Allaq Y, et al. Acquisition of technical skills in ultrasound-guided regional anesthesia using a high-fidelity simulator. Stud Health Technol Inform 2011; 163: 119-24.
2. Ha H-G, Hong J. Augmented reality in medicine. Hanyang Med Rev 2016; 36: 242-7. https://doi.org/10.7599/hmr.2016.36.4.242
3. Sayadi LR, Naides A, Eng M, et al. The new frontier: a review of augmented reality and virtual reality in plastic surgery. Aesthet Surg J 2019; 39: 1007-16.
4. Ericsson KA. Deliberate practice and the acquisition and maintenance of expert performance in medicine and related domains. Acad Med 2004; 79(10 Suppl): S70-81. PMID: 15383395.
5. Ashraf H, Sodergren MH, Merali N, Mylonas G, Singh H, Darzi A. Eye-tracking technology in medical education: a systematic review. Med Teach 2018; 40: 62-9.
6. McLeod G, McKendrick M, Taylor A, et al. Validity and reliability of metrics for translation of regional anaesthesia performance from cadavers to patients. Br J Anaesth 2019; 123: 368-77.
Period | 24 Aug 2022
---|---
Event title | Edinburgh Festival of Anaesthesia
Event type | Conference
Location | Edinburgh, United Kingdom
Degree of Recognition | International
Keywords
- Eye tracking
- Regional Anaesthesia