This study investigated computer-assisted instrument guidance as an educational tool for residents in a simulation environment. Primarily, the study evaluated residents’ procedural speed and accuracy using ultrasound with and without the guidance device.
Needle Guidance using Handheld Stereo Vision and Projection for Ultrasound-based Interventions
With real-time instrument tracking and in-situ guidance projection directly integrated in a handheld ultrasound imaging probe, needle-based interventions such as biopsies become much simpler to perform than with conventionally-navigated systems. Stereo imaging with needle detection can be made sufficiently robust and accurate to serve as primary navigation input. We describe the low-cost, easy-to-use approach used in the Clear Guide ONE generic navigation accessory for ultrasound machines, outline different available guidance methods, and provide accuracy results from phantom trials.
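The core operation behind such stereo needle tracking is triangulation: a needle point detected in both camera images is lifted to a 3D, probe-relative position. As a minimal sketch (not the Clear Guide implementation; the camera matrices and geometry below are invented for illustration), standard linear (DLT) triangulation with two calibrated pinhole cameras looks like:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two camera views.

    P1, P2 : 3x4 projection matrices of the stereo pair
    x1, x2 : (u, v) pixel coordinates of the same needle point
    Returns the 3D point in the common (probe-relative) frame.
    """
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)        # null vector of A is the homogeneous point
    X = Vt[-1]
    return X[:3] / X[3]

# Toy stereo rig: two identical pinhole cameras with a 40 mm baseline.
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-40.0], [0], [0]])])

X_true = np.array([10.0, -5.0, 150.0])         # simulated needle tip, mm
x1h = P1 @ np.append(X_true, 1.0)
x1 = x1h[:2] / x1h[2]
x2h = P2 @ np.append(X_true, 1.0)
x2 = x2h[:2] / x2h[2]
print(triangulate(P1, P2, x1, x2))
```

In a real system the detected pixel coordinates are noisy, so the SVD solves the four projection constraints in a least-squares sense rather than exactly.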
Conventional Versus Computer Assisted Stereoscopic Ultrasound Needle Guidance for Renal Access: A Randomized Bench-Top Crossover Trial
During urologic surgery, ultrasound (US) is an established method for needle guidance, but difficulty in visualizing the needle trajectory may add technical complexity to the procedure. Needle guidance systems may simplify these procedures. The purpose of this randomized bench-top crossover trial was to compare conventional ultrasound and a computer-assisted stereoscopic needle guidance system for obtaining renal access and mass biopsy.
Visual tracking for multi-modality computer-assisted image guidance
With optical cameras, many interventional navigation tasks previously relying on EM, optical, or mechanical guidance can be performed robustly, quickly, and conveniently. We developed a family of novel guidance systems based on wide-spectrum cameras and vision algorithms for real-time tracking of interventional instruments and multi-modality markers. These navigation systems support localization of anatomical targets, placement of the imaging probe and instruments, and fusion imaging. The unique architecture – low-cost, miniature, in-hand stereo vision cameras fitted directly to imaging probes – allows for an intuitive workflow that fits a wide variety of specialties such as anesthesiology, interventional radiology, interventional oncology, emergency medicine, urology, and others, many of which see increasing pressure to utilize medical imaging and especially ultrasound, but have yet to develop the requisite skills for reliable success. We developed a modular system, consisting of hardware (the Optical Head containing the mini cameras) and software (components for visual instrument tracking with or without specialized visual features, fully automated marker segmentation from a variety of 3D imaging modalities, visual observation of meshes of widely separated markers, instant automatic registration, and target tracking and guidance on real-time multi-modality fusion views). From these components, we implemented a family of distinct clinical and pre-clinical systems (for combinations of ultrasound, CT, CBCT, and MRI), most of which have international regulatory clearance for clinical use. We present technical and clinical results on phantoms, ex- and in-vivo animals, and patients.
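The "instant automatic registration" from matched multi-modality marker positions can be illustrated with the classic least-squares rigid alignment (Kabsch/Arun) of two corresponding point sets. The sketch below is a generic textbook method, not the system's actual algorithm, and all names and numbers are illustrative:

```python
import numpy as np

def rigid_register(A, B):
    """Least-squares rigid transform (R, t) mapping point set A onto B.

    A, B : (N, 3) arrays of corresponding marker positions, e.g. markers
    segmented from CT (A) and the same markers located by the cameras (B).
    Classic Kabsch/Arun solution via SVD of the cross-covariance matrix.
    """
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    H = (A - ca).T @ (B - cb)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
    R = Vt.T @ np.diag([1, 1, d]) @ U.T
    t = cb - R @ ca
    return R, t

# Synthetic check: rotate/translate a marker mesh and recover the transform.
rng = np.random.default_rng(0)
A = rng.uniform(-50, 50, size=(6, 3))          # hypothetical CT-space markers, mm
theta = np.radians(30)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
t_true = np.array([5.0, -3.0, 12.0])
B = A @ R_true.T + t_true                      # camera-space markers
R, t = rigid_register(A, B)
print(np.allclose(R, R_true), np.allclose(t, t_true))
```

With noise-free correspondences the transform is recovered exactly; with real marker localization noise the same solution minimizes the mean squared residual over the markers.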
Computer-Assisted Instrument Guidance: Enhanced Procedural Efficacy and Safety
The Clear Guide ONE (Clear Guide Medical, Baltimore, MD) is a Computer-Assisted Instrument Guidance (CAIG) device, which optically tracks a procedure needle and calculates a “projected” path, which is displayed on a screen for live guidance. We hypothesize that CAIG will enhance the efficiency and decrease the risk of complication for ultrasound-guided procedures, especially for less-experienced operators.
Evaluation of the Clear Guide ONE System versus Conventional Ultrasound Guided Vessel Cannulation in Swine
This study explores the potential of a new device, the Clear Guide ONE system (CLEAR GUIDE MEDICAL, Baltimore, MD), for improved percutaneous ultrasound-guided vascular access.
Fast, Intuitive, Vision-Based Performance Metrics for Visual Registration, Instrument Guidance, and Image Fusion
We characterize the performance of an ultrasound+computed tomography image fusion and instrument guidance system on phantoms, animals, and patients. The system is based on a visual tracking approach. Using multi-modality markers, registration is unobtrusive, and standard instruments do not require any calibration. A novel deformation estimation algorithm shows externally-induced tissue displacements in real time.
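Registration performance of such fusion systems is commonly summarized by fiducial registration error (FRE, on the markers that drive the alignment) and target registration error (TRE, on held-out targets). A generic sketch of these metrics, with invented example numbers and not the paper's actual evaluation code:

```python
import numpy as np

def registration_errors(T, fixed_fids, moving_fids, fixed_tgts, moving_tgts):
    """Mean fiducial (FRE) and target (TRE) registration errors in mm.

    T : 4x4 homogeneous transform mapping moving-space points into fixed space.
    Fiducials drive the registration; targets are held out for validation.
    """
    def apply(T, pts):
        return pts @ T[:3, :3].T + T[:3, 3]
    fre = np.linalg.norm(apply(T, moving_fids) - fixed_fids, axis=1).mean()
    tre = np.linalg.norm(apply(T, moving_tgts) - fixed_tgts, axis=1).mean()
    return fre, tre

# Toy example: registration assumed perfect (identity transform), markers
# localized with ~0.5 mm noise; both metrics then reflect localization error.
rng = np.random.default_rng(1)
fixed_fids = rng.uniform(-40, 40, (5, 3))
moving_fids = fixed_fids + rng.normal(0, 0.5, (5, 3))
fixed_tgts = np.array([[0.0, 0.0, 60.0]])
moving_tgts = fixed_tgts + rng.normal(0, 0.5, (1, 3))
fre, tre = registration_errors(np.eye(4), fixed_fids, moving_fids,
                               fixed_tgts, moving_tgts)
print(round(fre, 2), round(tre, 2))
```

Note that a small FRE does not guarantee a small TRE, which is why targets held out from the registration are the more clinically meaningful check.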
Clear Guide ONE: Local Optical Tracking of Instruments and Probe for Interventional Ultrasound Imaging
Background: Ultrasound guidance for needle-based interventions is a widespread approach to assist in precise targeting of anatomical structures for diagnostic (e.g., biopsy) or therapeutic (e.g., tumor ablation or anesthesia) purposes. However, limited imaging quality (due to attenuation or suboptimal structure/instrument echogenicity), the hand/eye coordination required for probe and instrument placement, and anatomical constraints make instrument localization and targeting difficult. Therefore, systems based on mechanical, optical, or electromagnetic tracking aim to provide guidance for the operator. However, these systems are cumbersome, expensive, difficult to learn and use, and time-consuming to set up, and have therefore not achieved widespread physician acceptance or use.
Objectives: Intervention-agnostic, unobtrusive guidance that works immediately after startup, at any time during the intervention, without additional setup steps, for any needle without required markers, and without the need for special environments (including aspects such as reference bases, EM compatibility, etc.) can significantly improve adoption of ultrasound guidance. The use of ultrasound guidance for needle procedures is the gold standard of care and is associated with high-quality medicine and superior clinical outcomes.
Methods: Using probe-mounted miniature stereo cameras, the described novel local instrument tracking approach (Clear Guide ONE™, Clear Guide Medical, Baltimore, MD) provides probe-relative localization of needle-like instruments. Real-time observation, detection, and 3D reconstruction of instruments allow the overlay of predicted needle trajectories on ultrasound video streams. To ensure a uniform, clear target, a 2.4 mm steel ball (BB) was suspended on a crosswire and embedded at a depth of 6 cm in a Sigma-Aldrich gelatin made from porcine skin. The BB target is highly visible in ultrasound, thus removing any ambiguity about the target location. For comparison, biopsy lesions in liver and kidney discovered by CT are usually 5–10 cm deep and have a minimum diameter of 5 mm. Imaging was performed using linear (L14-5/38) and convex (C5-2/60) handheld ultrasound probes on an Ultrasonix SonixTablet (Ultrasonix Inc., Richmond, BC, Canada). The augmented ultrasound-plus-guidance video streams were shown to two physicians (Fellows in Interventional Radiology at Johns Hopkins University). The subjects were not familiar with the device beforehand and performed 41 guided needle insertions in total. They were instructed to align the needles with the target BB before insertion and then minimize directional adjustments during placement. This is unlike conventional “blind insertion”, where the needle is reoriented under ultrasound visualization to reach the target, which causes tissue trauma and needle bending. After each trial, two orthogonal digital photographs were taken of the BB and the placed needle as they appeared in the transparent phantom. These images were then manually analyzed to triangulate the distance of the needle tip to the target.
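The evaluation step of triangulating needle-tip-to-target distance from two orthogonal photographs can be sketched as follows. This is an illustrative reconstruction, not the study's actual analysis: each view yields a 2D pixel offset, the shared depth axis appears in both views, and a pixel scale (for instance calibrated from the known 2.4 mm BB diameter) converts to millimeters:

```python
import numpy as np

def needle_target_distance(front_px, side_px, mm_per_px_front, mm_per_px_side):
    """Reconstruct the 3D needle-tip-to-BB distance from two orthogonal photos.

    front_px : (dx, dz) tip-to-target offset in pixels in the frontal photo
    side_px  : (dy, dz) offset in the orthogonal (side) photo
    The z offset is visible in both views, so the two measurements of it
    are averaged before combining into a Euclidean distance.
    """
    dx = front_px[0] * mm_per_px_front
    dy = side_px[0] * mm_per_px_side
    dz = 0.5 * (front_px[1] * mm_per_px_front + side_px[1] * mm_per_px_side)
    return float(np.sqrt(dx**2 + dy**2 + dz**2))

# Hypothetical calibration: the 2.4 mm BB spans 20 px, i.e. 0.12 mm/px.
d = needle_target_distance((10, 5), (-8, 6), 0.12, 0.12)
print(round(d, 2))   # -> 1.67
```

Averaging the redundant depth measurement is one simple way to use both views; a manual analysis could equally keep the two depth readings as a consistency check.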
Results: Needle placement was successfully completed in all cases, with minimal training required (approximately 15 min of “free trialing” to become familiar with both the ultrasound machine and the Clear Guide ONE). Overall target accuracy was 3.27/2.85 mm average/median (range 0.3–10.5 mm; 2.28 mm standard deviation). As expected, accuracy for the lower-frequency convex probe (3.8/4.21 mm avg./med.; range 0.3–7.3 mm; 2.17 mm std. dev.; N = 15) was lower than for the high-frequency linear probe (2.96/2.11 mm avg./med.; range 0.7–10.5 mm; 2.33 mm std. dev.; N = 26).
Conclusion: The described fully-handheld local tracking system allows precise and repeatable interventional needle placement with minimal training and hardware footprint. We would expect greater adoption of ultrasound guidance by all physicians inserting needles or interventional tools into the body and therefore better clinical outcomes.
The Kinect as an interventional tracking system
This work explores the suitability of low-cost sensors for “serious” medical applications, such as tracking of interventional tools in the OR, for simulation, and for education. Although such tracking – i.e., the acquisition of pose data for ultrasound probes, tissue manipulation tools, needles, but also tissue, bone, etc. – is well established, it relies mostly on external devices such as optical or electromagnetic trackers. Both mandate the use of special markers or sensors attached to each entity whose pose is to be recorded, and also require calibration to the tracked entity, i.e., the determination of the geometric relationship between the marker’s and the object’s intrinsic coordinate frames. The Microsoft Kinect sensor is a recently introduced device for full-body tracking in the gaming market, but – due to its wide range of tightly integrated sensors (RGB camera, IR depth and greyscale camera, microphones, accelerometers, and basic actuation) – it was quickly hacked and used beyond this area. As its field of view and accuracy are within reasonable usability limits, we describe a medical needle-tracking system for interventional applications based on the Kinect sensor and standard biopsy needles, requiring no attachments and thus saving both cost and time. Its twin cameras are used as a stereo pair to detect needle-shaped objects, reconstruct their pose in four degrees of freedom, and provide information about the most likely candidate.
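A line in 3D space has exactly four degrees of freedom, which is why a needle axis can be recovered from one detected 2D line per camera: each image line back-projects to a plane through its camera center, and the needle axis is the intersection of the two planes. A minimal sketch of this two-view line reconstruction (the camera matrices and needle geometry are invented for illustration, and this is not the described system's code):

```python
import numpy as np

def line_from_stereo(P1, P2, l1, l2):
    """3D needle axis from one detected 2D line per camera.

    Each image line l = (a, b, c) (ax + by + c = 0) back-projects through
    projection matrix P to the plane pi = P.T @ l; the needle axis is the
    intersection of the two planes. Returns (point, unit direction) --
    the four degrees of freedom of a line in space.
    """
    pi1, pi2 = P1.T @ l1, P2.T @ l2
    pi1 = pi1 / np.linalg.norm(pi1[:3])        # Hessian normal form
    pi2 = pi2 / np.linalg.norm(pi2[:3])
    d = np.cross(pi1[:3], pi2[:3])             # direction lies in both planes
    d /= np.linalg.norm(d)
    # Any point on both planes satisfies n_i . X = -c_i (2 eqs, 3 unknowns).
    A = np.vstack([pi1[:3], pi2[:3]])
    b = -np.array([pi1[3], pi2[3]])
    X0 = np.linalg.lstsq(A, b, rcond=None)[0]  # minimum-norm point on the axis
    return X0, d

# Toy stereo rig (invented intrinsics, 60 mm baseline) observing a needle.
K = np.array([[600.0, 0, 320], [0, 600.0, 240], [0, 0, 1]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-60.0], [0], [0]])])
tip, hub = np.array([0.0, 0, 400]), np.array([30.0, 20, 500])

def image_line(P, A3, B3):
    a, b = P @ np.append(A3, 1), P @ np.append(B3, 1)
    return np.cross(a, b)                      # homogeneous line through both

X0, d = line_from_stereo(P1, P2, image_line(P1, tip, hub),
                         image_line(P2, tip, hub))
```

This degrades when the needle lies near the epipolar plane (the two back-projected planes become nearly parallel), which is one reason real systems also exploit the needle's appearance in both images rather than geometry alone.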