Numerous studies have demonstrated the efficacy of interstitial ablative approaches for the treatment of renal and hepatic
tumors. Despite these promising results, current systems remain highly dependent on operator skill, and cannot treat
many tumors because there is little control of the size and shape of the zone of necrosis, and no control over ablator
trajectory within tissue once insertion has taken place. Additionally, tissue deformation and target motion make it
extremely difficult to accurately place the ablator device into the target. Irregularly shaped target volumes typically
require multiple insertions and several sequential thermal ablation procedures. This study demonstrated the feasibility of
spatially tracked image-guided conformal ultrasound (US) ablation for percutaneous directional ablation of diseased
tissue. Tissue was prepared by suturing the liver within a pig belly, and 1 mm BBs were placed to serve as needle targets. The image-guided system used integrated electromagnetic tracking and cone-beam CT (CBCT) with conformable needle-based high-intensity US ablation in the interventional suite. Tomographic images from CBCT were transferred
electronically to the image-guided tracking system (IGSTK). Paired-point registration was used to register the target
specimen to CT images and enable navigation. Path planning was performed by selecting the target BB on the GUI of the real-time tracking system and adjusting the skin entry location until an optimal path was obtained. Power was applied to create the desired ablation extent within 7-10 minutes at a thermal dose greater than 300 equivalent minutes at 43 °C. The system was successfully used to
place the US ablator in planned target locations within ex-vivo kidney and liver through percutaneous access. Targeting
accuracy was 3-4 mm. Sectioned specimens demonstrated uniform ablation within the planned target zone. Subsequent
experiments were conducted for multiple ablator positions based upon treatment planning simulations. Ablation zones in
liver were 73 cc, 84 cc, and 140 cc for 3, 4, and 5 placements, respectively. These experiments demonstrate the feasibility
of combining real-time spatially tracked image guidance with directional interstitial ultrasound ablation. Interstitial
ultrasound ablation delivered on multiple needles permits the size and shape of the ablation zone to be "sculpted" by modifying the angle and intensity of the active US elements in the array. This paper summarizes the design and development of the first system to combine thermal treatment planning, a novel interstitial acoustic ablation device, and an integrated 3D electromagnetic tracking and guidance strategy.
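For readers unfamiliar with the paired-point registration step mentioned above, the following is a minimal sketch of the standard SVD-based (Arun/Kabsch) landmark registration that maps tracked fiducial positions onto their CT-space counterparts. It is written with Eigen purely for illustration; the function name and the use of Eigen are assumptions, not the system's actual code.

```cpp
// Illustrative paired-point (landmark) rigid registration via the
// Arun/Kabsch SVD method; not the authors' implementation.
#include <Eigen/Dense>
#include <vector>

// Returns the rigid transform (R, t) that maps tracker-space points onto
// the corresponding CT-space points in a least-squares sense.
Eigen::Isometry3d RegisterPairedPoints(const std::vector<Eigen::Vector3d>& tracker,
                                       const std::vector<Eigen::Vector3d>& ct)
{
  const size_t n = tracker.size();
  Eigen::Vector3d cTracker = Eigen::Vector3d::Zero();
  Eigen::Vector3d cCT = Eigen::Vector3d::Zero();
  for (size_t i = 0; i < n; ++i) { cTracker += tracker[i]; cCT += ct[i]; }
  cTracker /= static_cast<double>(n);
  cCT /= static_cast<double>(n);

  // Cross-covariance of the demeaned point sets.
  Eigen::Matrix3d H = Eigen::Matrix3d::Zero();
  for (size_t i = 0; i < n; ++i)
    H += (tracker[i] - cTracker) * (ct[i] - cCT).transpose();

  Eigen::JacobiSVD<Eigen::Matrix3d> svd(H, Eigen::ComputeFullU | Eigen::ComputeFullV);
  Eigen::Matrix3d R = svd.matrixV() * svd.matrixU().transpose();
  if (R.determinant() < 0) {                 // guard against a reflection
    Eigen::Matrix3d V = svd.matrixV();
    V.col(2) *= -1.0;
    R = V * svd.matrixU().transpose();
  }

  Eigen::Isometry3d T = Eigen::Isometry3d::Identity();
  T.linear() = R;
  T.translation() = cCT - R * cTracker;
  return T;
}
```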
One of the key technical challenges in developing an
extensible image-guided navigation system is that of interfacing with external proprietary hardware. These challenges arise from the constraints placed on the navigation system's hardware and software. Extending a navigation system's functionality by interfacing with an external hardware device may require modifications to internal hardware components. In some cases, it may also require porting the complete code base to a different operating system that is compatible with the manufacturer-supplied application programming interface libraries and drivers. In this paper we describe our experience extending a multi-platform navigation system, implemented using the Image-Guided Surgery Toolkit (IGSTK), to
support real-time acquisition of 2-D ultrasound (US) images acquired with the Terason portable US system. We describe the required hardware and software modifications imposed by the proposed extension and how the OpenIGTLink network communication protocol enabled us to minimize the changes to the system's hardware and software. The resulting navigation system retains its platform independence with the added capability for real-time image acquisition independent of the image source.
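As an illustration of how OpenIGTLink decouples image acquisition from the navigation workstation, the sketch below shows a generic receive loop for IMAGE messages built on the public OpenIGTLink C++ library. It follows the pattern of the library's published examples; the host address, port, and frame handling are placeholders rather than the Terason-specific implementation.

```cpp
// Minimal OpenIGTLink client receive loop for IMAGE messages, adapted from
// the public OpenIGTLink examples; host, port, and handling are placeholders.
#include "igtlClientSocket.h"
#include "igtlMessageHeader.h"
#include "igtlImageMessage.h"
#include <cstring>

int main()
{
  igtl::ClientSocket::Pointer socket = igtl::ClientSocket::New();
  if (socket->ConnectToServer("127.0.0.1", 18944) != 0)   // default OpenIGTLink port
    return 1;

  igtl::MessageHeader::Pointer header = igtl::MessageHeader::New();
  while (true)
  {
    header->InitPack();
    int r = socket->Receive(header->GetPackPointer(), header->GetPackSize());
    if (r <= 0) break;                                     // connection closed
    header->Unpack();

    if (std::strcmp(header->GetDeviceType(), "IMAGE") == 0)
    {
      igtl::ImageMessage::Pointer img = igtl::ImageMessage::New();
      img->SetMessageHeader(header);
      img->AllocatePack();
      socket->Receive(img->GetPackBodyPointer(), img->GetPackBodySize());
      if (img->Unpack(1) & igtl::MessageHeader::UNPACK_BODY)  // 1 = verify CRC
      {
        int dims[3];
        img->GetDimensions(dims);  // hand the frame to the display pipeline here
      }
    }
    else
    {
      socket->Skip(header->GetBodySizeToRead(), 0);        // ignore non-image messages
    }
  }
  socket->CloseSocket();
  return 0;
}
```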
Vertebroplasty is a minimally invasive procedure in which bone cement is pumped into a fractured vertebral
body that has been weakened by osteoporosis, long-term steroid use, or cancer. In this therapy, a trocar (large
bore hollow needle) is inserted through the pedicle of the vertebral body, a narrow passage that requires great skill on the part of the physician to avoid straying outside the pathway. In clinical practice, this procedure
is typically done using 2D X-ray fluoroscopy. To investigate the feasibility of providing 3D image guidance, we
developed an image-guided system based on electromagnetic tracking and our open source software platform,
the Image-Guided Surgery Toolkit (IGSTK). The system includes path planning, interactive 3D navigation, and
dynamic referencing. This paper describes the system and our initial evaluation.
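The dynamic referencing mentioned above amounts to re-expressing the tracked trocar pose relative to a reference sensor fixed to the patient, so that patient or table motion cancels out. A minimal sketch, assuming Eigen transforms and hypothetical names rather than the IGSTK interface:

```cpp
// Illustrative dynamic referencing: re-express the tracked tool pose in the
// coordinate frame of a reference sensor fixed to the patient, so that
// patient motion cancels out. Hypothetical names; not the IGSTK API.
#include <Eigen/Geometry>

// T_tracker_tool : pose of the trocar sensor in tracker coordinates
// T_tracker_ref  : pose of the dynamic reference sensor in tracker coordinates
// returns        : pose of the tool relative to the reference frame
Eigen::Isometry3d ToolInReferenceFrame(const Eigen::Isometry3d& T_tracker_tool,
                                       const Eigen::Isometry3d& T_tracker_ref)
{
  return T_tracker_ref.inverse() * T_tracker_tool;
}

// The registration between the reference frame and the CT volume (T_ct_ref)
// then maps the tool into image space for 3D navigation:
//   T_ct_tool = T_ct_ref * T_ref_tool
```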
We have developed an image-guided navigation system using electromagnetically-tracked tools, with potential
applications for abdominal procedures such as biopsies, radiofrequency ablations, and radioactive seed placements. We
present the results of two phantom studies using our navigation system in a clinical environment. In the first study, a
physician and medical resident performed a total of 18 targeting passes in the abdomen of an anthropomorphic phantom
based solely upon image guidance. The distance between the target and the needle tip was measured on confirmatory scans, giving an average of 3.56 mm. In the second study, three foam nodules were placed at different depths in a gelatin phantom. Ten targeting passes were attempted at each of the three depths. Final distances between the target and the needle tip were measured, giving an average of 3.00 mm. In addition to these targeting studies, we discuss
our refinement to the standard four-quadrant image-guided navigation user interface, based on clinician preferences. We
believe these refinements increase the usability of our system while decreasing targeting error.
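For concreteness, the targeting error reported above is simply the Euclidean distance between the planned target and the confirmed needle-tip position, averaged over passes. A small illustrative helper, with hypothetical names:

```cpp
// Sketch of the targeting-error metric: mean Euclidean distance between the
// planned target and the confirmed needle-tip position over all passes.
// Hypothetical helper, shown only to make the reported numbers concrete.
#include <Eigen/Dense>
#include <vector>

double MeanTargetingError(const std::vector<Eigen::Vector3d>& targets,
                          const std::vector<Eigen::Vector3d>& tips)
{
  double sum = 0.0;
  for (size_t i = 0; i < targets.size(); ++i)
    sum += (targets[i] - tips[i]).norm();    // per-pass error in mm
  return sum / static_cast<double>(targets.size());
}
```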
KEYWORDS: Video, Ultrasonography, Image-guided intervention, Medical imaging, 3D video streaming, Computed tomography, 3D image processing, Imaging systems, Visualization, Surgery
The image-guided surgery toolkit (IGSTK) is an open source C++ library that provides the basic components required
for developing image-guided surgery applications. While the initial version of the toolkit has been released, some
additional functionalities are required for certain applications. With increasing demand for real-time intraoperative image
data in image-guided surgery systems, we are adding a video grabber component to IGSTK to access intraoperative
imaging data such as video streams. Intraoperative data could be acquired from real-time imaging modalities such as
ultrasound or endoscopic cameras. The acquired image could be displayed as a single slice in a 2D window or integrated
in a 3D scene. For accurate display of the intraoperative image relative to the patient's preoperative image, proper
interaction and synchronization with IGSTK's tracker and other components is necessary. Several issues must be
considered during the design phase: 1) the functions of the video grabber component; 2) the interaction of the video grabber component with existing and future IGSTK components; and 3) the layout of the state machine in the video grabber
component. This paper describes the video grabber component design and presents example applications using the video
grabber component.
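To illustrate the kind of state-machine layout discussed above, the sketch below encodes a video grabber's states and inputs as a small transition table. The states, inputs, and transitions are assumptions chosen for illustration and do not reproduce the actual IGSTK component:

```cpp
// Illustrative state-machine layout for a video grabber component: a small
// transition table in the spirit of IGSTK's state-machine architecture.
// States, inputs, and the table itself are assumptions for illustration only.
#include <map>
#include <utility>

enum class State { Idle, Initialized, Grabbing, Error };
enum class Input { Initialize, StartGrabbing, FrameReady, StopGrabbing, Failure };

class VideoGrabberSketch
{
public:
  VideoGrabberSketch()
  {
    // (current state, input) -> next state
    transitions_ = {
      {{State::Idle,        Input::Initialize},    State::Initialized},
      {{State::Initialized, Input::StartGrabbing}, State::Grabbing},
      {{State::Grabbing,    Input::FrameReady},    State::Grabbing},
      {{State::Grabbing,    Input::StopGrabbing},  State::Initialized},
      {{State::Initialized, Input::Failure},       State::Error},
      {{State::Grabbing,    Input::Failure},       State::Error},
    };
  }

  // Inputs with no entry in the table are ignored, so the component never
  // reaches an undefined state -- the robustness property the state-machine
  // architecture is meant to guarantee.
  void ProcessInput(Input input)
  {
    auto it = transitions_.find({state_, input});
    if (it != transitions_.end())
      state_ = it->second;
  }

  State CurrentState() const { return state_; }

private:
  State state_ = State::Idle;
  std::map<std::pair<State, Input>, State> transitions_;
};
```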
Minimally invasive procedures are increasingly attractive to patients and medical personnel because they can reduce
operative trauma, recovery times, and overall costs. However, during these procedures, the physician has a very limited
view of the interventional field and the exact position of surgical instruments. We present an image-guided platform for
precision placement of surgical instruments based upon a small four degree-of-freedom robot (B-RobII; ARC
Seibersdorf Research GmbH, Vienna, Austria). This platform includes a custom instrument guide with an integrated
spiral fiducial pattern as the robot's end-effector, and it uses intra-operative computed tomography (CT) to register the
robot to the patient directly before the intervention. The physician can then use a graphical user interface (GUI) to select
a path for percutaneous access, and the robot will automatically align the instrument guide along this path. Potential
anatomical targets include the liver, kidney, prostate, and spine. This paper describes the robotic platform, workflow,
software, and algorithms used by the system. To demonstrate the algorithmic accuracy and suitability of the custom
instrument guide, we also present results from experiments as well as estimates of the maximum error between target
and instrument tip.
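As an illustration of the path-alignment step, the sketch below converts a planned entry/target pair (already expressed in the robot's registered coordinate frame) into a pose for the instrument guide, rotating the guide's home axis onto the entry-to-target direction. Names and the use of Eigen are assumptions, not the system's control code:

```cpp
// Sketch of aligning the instrument guide with a planned path: the GUI
// supplies an entry point and a target in registered CT coordinates, and the
// guide axis must be rotated onto the entry-to-target direction.
#include <Eigen/Geometry>

struct GuidePose
{
  Eigen::Vector3d position;     // where the guide should sit (entry point)
  Eigen::Quaterniond rotation;  // orientation taking the guide's home axis onto the path
};

GuidePose AlignGuideToPath(const Eigen::Vector3d& entry,
                           const Eigen::Vector3d& target,
                           const Eigen::Vector3d& guideHomeAxis = Eigen::Vector3d::UnitZ())
{
  const Eigen::Vector3d pathDir = (target - entry).normalized();
  GuidePose pose;
  pose.position = entry;
  // Shortest rotation taking the guide's home axis onto the planned direction.
  pose.rotation = Eigen::Quaterniond::FromTwoVectors(guideHomeAxis, pathDir);
  return pose;
}
```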
PET (Positron Emission Tomography) scanning has become a dominant force in oncology care because of its ability to
identify regions of abnormal function. The current generation of PET scanners is focused on whole-body imaging and does not address the needs of surgeons or other practitioners interested in the function of particular
body parts. We are therefore developing and testing a new class of hand-operated molecular imaging scanners designed
for use with physical examinations and intraoperative visualization. These devices integrate several technological
advances, including (1) nanotechnology-based quantum photodetectors for high performance at low light levels, (2)
continuous position tracking of the detectors so that they form a larger 'virtual detector', and (3) novel reconstruction
algorithms that do not depend on a circular or ring geometry. The first incarnations of this device will be in the form of
a glove with finger-mounted detectors or in a "sash" of detectors that can be draped over the patient. Potential
applications include image-guided biopsy, surgical resection of tumors, assessment of inflammatory conditions, and
early cancer detection. Our first prototype is in development now along with a clinical protocol for pilot testing.
The Image-Guided Surgery Toolkit (IGSTK) is an open source C++ software library that provides the basic components
needed to develop image-guided surgery applications. The focus of the toolkit is on robustness, achieved through a state machine architecture. This paper presents an overview of the project based on a recent book, which can be downloaded from
igstk.org. The paper includes an introduction to open source projects, a discussion of our software development process
and the best practices that were developed, and an overview of requirements. The paper also presents the architecture
framework and main components. This presentation is followed by a discussion of the state machine model that was
incorporated and the associated rationale. The paper concludes with an example application.
Open source software has tremendous potential for improving the productivity of research labs and enabling the development of new medical applications. The Image-Guided Surgery Toolkit (IGSTK) is an open source software toolkit based on ITK, VTK, and FLTK, and uses the cross-platform tools CMake and DART to support common operating systems such as Linux, Windows, and MacOS. IGSTK integrates the basic components needed in surgical guidance applications and provides a common platform for rapid prototyping and development of robust image-guided applications. This paper gives an overview of the IGSTK framework and the current status of development, followed by an example needle biopsy application that demonstrates how to develop an image-guided application using this toolkit.