Novel Wireless Technology Platform for Bedside Image Navigation in the Endovascular Suite: Initial Clinical Experience in a Case Series

Clinical Review

Authors

Sofia Vianna, MD;1,2 Timothy Yates, MD;1,2 Brandon Olivieri, MD;1,2 and Robert E. Beasley, MD1-3

 

1Vascular and Interventional Radiology, Mount Sinai Medical Center, Miami Beach, FL

2Vascular and Limb Preservation Center, Mount Sinai Medical Center, Miami Beach, FL

3Evanescence Vein Center, Mount Sinai Medical Center, Miami Beach, FL 

Citation
VASCULAR DISEASE MANAGEMENT 2019;16(2):E28-E31.
Abstract

Immediate access to patients’ anatomical image data is key to successful endovascular interventions. However, a limitation of the angiographic suite is that the controls for such image data are located in the control room, outside the operative space. We present a newly developed, wireless technology platform that enables physicians to control the viewing and manipulation of images from within the sterile operative space. The technology is enabled by a handheld, palm-sized device that tracks finger movements using electric-field–sensing techniques and translates those movements into commands that drive image manipulation on the image viewer. The technology was tested in 15 consecutive cases, demonstrating clinical applicability and usability for controlling angiographic and computed tomography images in a variety of endovascular procedures. In particular, the technology improved workflow by reducing the number of visits to the control room (episodes of scrubbing out) and procedural time, while decreasing contrast agent use and radiation exposure in certain scenarios. These observations suggest a potential role for routine use of the technology in the endovascular suite.


Key words: touchless image navigation, human computer interaction, PACS, endovascular suite, workflow 

Introduction 

Interventional specialists rely heavily on interacting with a multitude of imaging equipment, devices, and patient records across a variety of procedures. However, the current process of acquiring and interpreting patient data for real-time decisions is cumbersome and disruptive. An example is intraprocedural access to and manipulation of radiology images, which is routinely performed to confirm or refine a treatment plan. To accomplish this, the operator may break scrub multiple times midprocedure and visit the workstation in the control room, where a mouse and keyboard are available for interacting with the imagery. Alternatively, the operator may delegate the task to an assistant,1,2 but the assistant may be unfamiliar with the patient’s anatomy and history or with the operator’s internal decision-making process. For these reasons, there is increasing recognition of the need to address this aspect of clinical practice in order to improve procedural workflow.3

Here, we present a newly developed, wireless technology platform that enables physicians to control viewing and manipulation of images in the sterile operative space. Additionally, we present a case series in which the technology was evaluated for clinical applicability in the endovascular suite for a variety of endovascular interventions (Figure 1).

MATERIALS AND METHODS

The technology is a human-computer interface (HCI) platform that tracks finger movements by employing electric-field–sensing techniques and translates such movements to drive image manipulation on the image viewer. Functions are enabled by a remote-type controller, which is a handheld, wireless, palm-sized device with 4 distinct regions representing 4 commonly used image-manipulation functions (scroll, magnify, pan, and window/level) (TIPSO AirPad, NZ Technologies Inc) (Figure 1). Tapping any of the 4 distinct regions on the device surface activates that particular mode. A touchless, finger-twirling motion over the surface then controls the manipulation of the images in that mode (eg, in magnify mode, a clockwise twirl increases the magnification, while a counter-clockwise twirl decreases it). The device accomplishes these manipulations using built-in computer algorithms that process the captured motions of the fingers relative to these regions and transmit the signals wirelessly via a WiFi router to a USB dongle connected to the workstation. These signals, in turn, drive the corresponding functions that are otherwise activated by mouse/keyboard commands in the imaging software. High-resolution tracking and low-latency signal processing allow the physician to activate the image control functions and view images on demand in a responsive and fluid manner (Figure 2).
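The vendor’s on-board algorithms are proprietary, but the tap-to-select-mode, twirl-to-adjust interaction model described above can be illustrated with a minimal sketch. The mode names mirror the 4 regions named in the text; the class, method names, and step sizes are hypothetical and for illustration only.

```python
from enum import Enum


class Mode(Enum):
    """The 4 image-manipulation modes mapped to regions on the device."""
    SCROLL = "scroll"
    MAGNIFY = "magnify"
    PAN = "pan"
    WINDOW_LEVEL = "window/level"


class ImageNavigator:
    """Hypothetical model of the tap/twirl interaction logic.

    Tapping a region selects a mode; a touchless twirl then adjusts
    the parameter that mode controls (clockwise increases it,
    counter-clockwise decreases it).
    """

    def __init__(self):
        self.mode = None      # no mode active until a region is tapped
        self.zoom = 1.0       # magnification factor
        self.slice_index = 0  # current image in the stack

    def tap(self, region: Mode) -> None:
        """Tapping one of the 4 regions activates that mode."""
        self.mode = region

    def twirl(self, clockwise: bool) -> None:
        """A twirl gesture adjusts the active mode's parameter."""
        if self.mode is Mode.MAGNIFY:
            # Illustrative 10% zoom step per twirl increment.
            self.zoom *= 1.1 if clockwise else 1 / 1.1
        elif self.mode is Mode.SCROLL:
            self.slice_index += 1 if clockwise else -1
        # Pan and window/level would adjust offsets analogously.
```

In the actual system, these parameter changes would be emitted as the equivalent mouse/keyboard commands to the fluoroscopy or PACS workstation software rather than held in local state.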

The device was integrated with an angiography suite for evaluation in consecutive cases at our institution. Depending on the needs of each case, the device was connected to either the live fluoroscopy workstation (Artis; Siemens Healthcare GmbH) or the PACS workstation (Synapse Cardiovascular, FUJIFILM Medical Systems). Training in the use of this technology was provided to the physicians in advance. Prior to the start of each case, the device was draped using a sterile ultrasound probe cover and placed near or over the operating table on standby, ready for use by the physician (Figure 1).

Case descriptions and physician feedback on system utility were documented at the completion of each case. This study was in compliance with the Helsinki Declaration.

RESULTS 

The technology was evaluated in 15 consecutive procedures by 3 interventional specialists; a series of 5 representative cases is reported in Table 1 (Figure 3). In all cases, the physician was able to activate and adjust the various modes intuitively and without errors.

DISCUSSION

The novel technology platform examined here employs advanced techniques in 3D motion sensing and processing algorithms to enable interaction with imaging data in the confines of the sterile interventional operative space. Using the device placed over the drape, the physician is able to control image manipulation otherwise offered only by the mouse/keyboard at the workstation. 

Our experience demonstrated that the technology worked well in consecutive cases in the angiography suite, allowing us to manipulate digital subtraction angiography and computed tomography records, acquired live or stored in the picture archiving and communication system (PACS), as needed during the case. The technology was especially useful in situations where three-dimensional reconstructions were needed and prior imaging carried more information than fluoroscopy could generate during complex cases. Using the technology, workflow was improved by reducing the number of visits to the control room (episodes of scrubbing out) and procedural time. The ability to conveniently access live and historical images at the bedside eliminated the need to re-shoot fluoroscopy images in some cases, thus decreasing use of iodinated contrast agent and exposure to radiation.

Procedural workflow is a broad concept that drives the development of medical technologies in the operative workspace, as it is a determinant of procedural efficiency and patient outcomes.1–6 In the modern angiography suite, there remains a workflow issue associated with the separation between the control room and the sterile operative area. Because of this setup, many procedures require physicians to scrub out and back in, often on multiple occasions, to review and manipulate images.7–10 This is undesirable because it may lead to distractions and cumbersome communication that negatively affect efficiency and decision-making during image-guided procedures, while adding procedural time and cost with each episode of scrubbing out to visit the control room.1,2,9,11–13 Furthermore, practice guidelines recommend limiting traffic (and door openings) in the endovascular suite to reduce transmission of microorganisms into the suite and potentially into the sterile field.3 While scanning equipment (eg, the C-arm) may provide a bedside panel console for image viewing and minor image interactions, it may not offer the controls physicians need to fully exploit readily available images (Table 1).

HCI represents a growing area of clinical research in intraprocedural image navigation. A recent review documented the different approaches undertaken by various research groups,8 which largely employ commercially available camera-based platforms, such as the Microsoft Kinect (developed for gaming) (Microsoft Corp) or the Leap Motion Controller (LMC) (Leap Motion Inc), that use hand and limb gestures as commands for image navigation.14–17 Other techniques, such as voice recognition and eye tracking, have also been evaluated as mechanisms for image interaction.8 Overall, these studies have shown feasibility of implementation, particularly with the Kinect and LMC; however, few systems have been tested in a real clinical environment, and the experiments have reported mixed results. On one hand, these studies reported short training times and ease-of-use responses from participants. On the other hand, findings included false-positive activation in dark environments, confinement to a fixed area of interaction within the camera’s line of sight, and unclear benefit over existing modes of image interaction, such as the conventional keyboard and mouse.8 These limitations were not observed with the image navigation technology during testing in the present case series.

While the image navigation technology we have described in our case series provides effective and intuitive control of a number of image manipulation functions, the current release that was tested does not enable the user to perform drawing, measuring, or segmentation. Such functionality would be particularly useful in radioembolization cases, where 3-dimensional reconstructions are needed for spatial planning. Nonetheless, our observations in the present study show the promise of the platform as a translatable and useful technology in the endovascular suite.

Conclusion

Procedural workflow has been the focus of recent improvement efforts, and emerging technologies are giving physicians new alternatives. Handheld remote control of the PACS, such as that provided by the TIPSO AirPad, is a valuable resource for the operative workspace.

 

Disclosure: The authors have completed and returned the ICMJE Form for Disclosure of Potential Conflicts of Interest. The authors report no conflicts of interest regarding the content herein.

Manuscript submitted January 1, 2019; accepted January 3, 2019.

Address for correspondence: Robert Beasley, MD, FSIR, FSCAI, Section Chief, Vascular and Interventional Radiology, Mount Sinai Medical Center, 4300 Alton Road, Miami Beach, FL 33140, Email: Robert.beasley@msmc.com

 

References

1. O'Hara K, Gonzales G, Sellen A, et al. Touchless interaction in surgery. Communications of the ACM. 2014;57(1):71-77.

2. Johnson R, O'Hara K, Sellen A, Cousins C, Criminisi A. Exploring the potential for touchless interaction. Proceedings of ACM CHI. 2011;2011:3323-3332.

3. Chan D, Downing D, Keough CE, et al. Joint practice guideline for sterile technique during vascular and interventional radiology procedures. From the Society of Interventional Radiology, Association of periOperative Registered Nurses, and Association for Radiologic and Imaging Nursing, for the Society of Interventional Radiology [corrected] Standards of Practice Committee, and endorsed by the Cardiovascular Interventional Radiological Society of Europe and the Canadian Interventional Radiology Association. J Vasc Interv Radiol. 2012;23(12):1603-1612.

4. Wanta BT, Glasgow AE, Habermann EB, et al. Operating room traffic as a modifiable risk factor for surgical site infection. Surg Infect (Larchmt). 2016;17(5):755-760.

5. Mendez B, Requena M, Aires A, et al. Direct transfer to angio-suite to reduce workflow times and increase favorable clinical outcome. Stroke. 2018;49(11):2723-2727.

6. Settecase F, McCoy DB, Darflinger R, et al. Improving mechanical thrombectomy time metrics in the angiography suite. Stroke cart, parallel workflows, and conscious sedation. Interv Neuroradiol. 2018;24(2):168-177.

7. Ratib O. Imaging informatics. From image management to image navigation. Yearb Med Inform. 2009:167-172.

8. Mewes A, Hensen B, Wacker F, Hansen C. Touchless interaction with software in interventional radiology and surgery. A systematic literature review. Int J Comput Assist Radiol Surg. 2017;12(2):291-305.

9. Iannessi A, Marcy PY, Clatz O, Fillard P, Ayache N. Touchless intra-operative display for interventional radiologist. Diagn Interv Imaging. 2014;95(3):333-337.

10. Rosset A, Spadola L, Pysher L, Ratib O. Informatics in radiology (infoRAD). Navigating the fifth dimension: innovative interface for multidimensional multimodality image navigation. Radiographics. 2006;26(1):299-308.

11. Grätzel C, Fong T, Grange S, Baur C. A non-contact mouse for surgeon-computer interaction. Technol Health Care. 2004;12(3):245-257.

12. Firth-Cozens J. Why communication fails in the operating room. Qual Saf Health Care. 2004;13(5):327.

13. Lingard L, Espin S, Whyte S, et al. Communication failures in the operating room. An observational classification of recurrent types and effects. Qual Saf Health Care. 2004;13(5):330-334.

14. Tan JH, Chao C, Zawaideh M, Roberts AC, Kinney TB. Informatics in Radiology. Developing a touchless user interface for intraoperative image control during interventional radiology procedures. Radiographics. 2013;33(2):E61-E70.

15. Strickland M, Tremaine J, Brigley G, Law C. Using a depth-sensing infrared camera system to access and manipulate medical imaging from within the sterile operating field. Can J Surg. 2013;56(3):E1-E6.

16. Bizzotto N, Costanzo A, Bizzotto L, et al. Leap motion gesture control with OsiriX in the operating room to control imaging. First experiences during live surgery. Surg Innov. 2014;21(6):655-656.

17. Weichert F, Bachmann D, Rudak B, Fisseler D. Analysis of the accuracy and robustness of the leap motion controller. Sensors (Basel). 2013;13(5):6380-6393.