investigation into behavioral changes and attention shifts created through real-time navigation needs further preclinical study.28 Initial clinical use should be restricted to surgeons who perform advanced cases and already have experience with, and a good working knowledge of, IGS systems and their potential errors.

CONCLUSION

Real-time surgical navigation systems such as our LIVE-IGS prototype may enhance spatial awareness while reducing task workload during endoscopic skull base surgery. High spatial demand, compromised visual landmarks, and proximity to critical structures combine to create an environment where such technology may be beneficial. Multimodal feedback with novel alarms, such as structure- and proximity-specific auditory icons, could reduce visual stimuli and enhance awareness while limiting distraction.

Acknowledgments

This study could not have been performed without assistance from staff at the University of Toronto Surgical Skills Centre, Mount Sinai Hospital, Toronto. The authors thank the surgeons (including F. Gentili, D. Sommer, S. Kilty, J. Lee, and P. Goetz) who made time to participate in the trial.

BIBLIOGRAPHY

1. Castelnuovo P, Dallan I, Battaglia P, Bignami M. Endoscopic endonasal skull base surgery: past, present and future. Eur Arch Otorhinolaryngol 2010;267:649–663.
2. Schlosser RJ, Bolger WE. Image-guided procedures of the skull base. Otolaryngol Clin North Am 2005;38:483–490.
3. Kockro RA, Tsai YT, Ng I, et al. Dex-ray: augmented reality neurosurgical navigation with a handheld video probe. Neurosurgery 2009;65:795–807; discussion 807–808.
4. Sanderson PM, Watson MO, Russell WJ. Advanced patient monitoring displays: tools for continuous informing. Anesth Analg 2005;101:161–168.
5. Marescaux J, Rubino F, Arenas M, Mutter D, Soler L. Augmented-reality-assisted laparoscopic adrenalectomy. JAMA 2004;292:2214–2215.
6. Kawamata T, Iseki H, Shibasaki T, Hori T. Endoscopic augmented reality navigation system for endonasal transsphenoidal surgery to treat pituitary tumors: technical note. Neurosurgery 2002;50:1393–1397.
7. Daly MJ, Chan H, Prisman E, et al. Fusion of intraoperative cone-beam CT and endoscopic video for image-guided procedures. Proc SPIE 2010;7625:762503.
8. Cleary K, Peters TM. Image-guided interventions: technology review and clinical applications. Annu Rev Biomed Eng 2010;12:119–142.
9. Yushkevich PA, Piven J, Hazlett HC, et al. User-guided 3D active contour segmentation of anatomical structures: significantly improved efficiency and reliability. Neuroimage 2006;31:1116–1128.
10. Siewerdsen JH, Moseley DJ, Burch S, et al. Volume CT with a flat-panel detector on a mobile, isocentric C-arm: pre-clinical investigation in guidance of minimally invasive surgery. Med Phys 2005;32:241–254.
11. Nithiananthan S, Brock KK, Daly MJ, Chan H, Irish JC, Siewerdsen JH. Demons deformable registration for CBCT-guided procedures in the head and neck: convergence and accuracy. Med Phys 2009;36:4755–4764.
12. Hart SG, Staveland LE. Development of NASA-TLX: Results of Empirical and Theoretical Research. Amsterdam, the Netherlands: Elsevier Science; 1987.
13. Dixon BJ, Daly MJ, Chan H, Vescan A, Witterick IJ, Irish JC. Augmented image guidance improves skull base navigation and reduces task workload in trainees: a preclinical trial. Laryngoscope 2011;121:2060–2064.
14. Nakamura M, Stover T, Rodt T, et al. Neuronavigational guidance in craniofacial approaches for large (para)nasal tumors involving the anterior skull base and upper clival lesions. Eur J Surg Oncol 2009;35:666–672.
15. Strauss G, Koulechov K, Rottger S, et al. Evaluation of a navigation system for ENT with surgical efficiency criteria. Laryngoscope 2006;116:564–572.
16. Ukimura O, Gill IS. Image-fusion, augmented reality, and predictive surgical navigation. Urol Clin North Am 2009;36:115–123.
17. Volonte F, Pugin F, Bucher P, Sugimoto M, Ratib O, Morel P. Augmented reality and image overlay navigation with OsiriX in laparoscopic and robotic surgery: not only a matter of fashion. J Hepatobiliary Pancreat Sci 2011;18:506–509.
18. Livingston MA. Evaluating human factors in augmented reality systems. IEEE Comput Graph Appl 2005;25:6–9.
19. Regenbrecht H, Baratoff G, Wilke W. Augmented reality projects in the automotive and aerospace industries. IEEE Comput Graph Appl 2005;25:48–56.
20. Carswell CM, Clarke D, Seales WB. Assessing mental workload during laparoscopic surgery. Surg Innov 2005;12:80–90.
21. Yurko YY, Scerbo MW, Prabhu AS, Acker CE, Stefanidis D. Higher mental workload is associated with poorer laparoscopic performance as measured by the NASA-TLX tool. Simul Healthc 2010;5:267–271.
22. Donmez B, Boyle LN, Lee JD. Mitigating driver distraction with retrospective and concurrent feedback. Accid Anal Prev 2008;40:776–786.
23. Donmez B, Boyle LN, Lee JD. Safety implications of providing real-time feedback to distracted drivers. Accid Anal Prev 2007;39:581–590.
24. Wickens CD, Goh J, Helleberg J, Horrey WJ, Talleur DA. Attentional models of multitask pilot performance using advanced display technology. Hum Factors 2003;45:360–380.
25. Nakamura K, Naya Y, Zenbutsu S, et al. Surgical navigation using three-dimensional computed tomography images fused intraoperatively with live video. J Endourol 2010;24:521–524.
26. Edworthy J, Hellier E. Alarms and human behaviour: implications for medical alarms. Br J Anaesth 2006;97:12–17.
27. Donmez B, Boyle LN, Lee JD. The impact of distraction mitigation strategies on driving performance. Hum Factors 2006;48:785–804.
28. Dixon BJ, Daly MJ, Chan H, Vescan AD, Witterick IJ, Irish JC. Surgeons blinded by enhanced navigation: the effect of augmented reality on attention. Surg Endosc 2013;27:454–461.
Laryngoscope 124: April 2014
Dixon et al.: Real-Time Navigation for Endoscopic Surgery