Although dysesthesia is a common and persistent surgical complication, there is no
accepted method for quantitatively tracking affected skin. To address this, two types
of computer vision technologies were tested in a total of four configurations. Surface
regions on plastic models of limbs were delineated with colored tape, imaged, and
compared with computed tomography scans. The most accurate system used visually
projected texture captured by a binocular stereo camera, capable of measuring areas
to within 0.05% of the ground-truth areas with 1.4% variance. This simple, inexpensive
technology shows promise for postoperative monitoring of dysesthesia surrounding
surgical scars.
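The abstract does not describe the computation itself, but area measurement from a stereo reconstruction typically reduces to summing triangle areas over the reconstructed surface patch. A minimal sketch under that assumption (the function name and mesh layout are illustrative, not the authors' pipeline):

```python
import numpy as np

def mesh_area(vertices, faces):
    """Total area of a triangulated surface patch: half the norm of each
    triangle's edge cross product, summed over all triangles."""
    tri = vertices[faces]                               # (n_tri, 3, 3)
    cross = np.cross(tri[:, 1] - tri[:, 0], tri[:, 2] - tri[:, 0])
    return 0.5 * np.linalg.norm(cross, axis=1).sum()

# Unit square in the z=0 plane, split into two triangles -> area 1.0
verts = np.array([[0., 0., 0.], [1., 0., 0.], [1., 1., 0.], [0., 1., 0.]])
faces = np.array([[0, 1, 2], [0, 2, 3]])
area = mesh_area(verts, faces)
```

The same sum applies unchanged to curved patches, since each reconstructed triangle is planar.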
A successful image-guided surgical intervention requires accurate measurement of coordinate systems. Uncertainty is
introduced every time a pose is measured by the optical tracking system. When we transform a measured pose into a
different coordinate system, the covariance (which encodes the uncertainty of the pose) must be propagated to this new
coordinate system. In this paper, we describe a method for propagating covariances estimated from registration, tracking,
and instrument calibration into the tip of the surgical tool. This is clinically important, since it is at the tool tip that the
clinician cares about uncertainty. We demonstrate that the propagation method, which is computed in real time as the tool
moves through space, reliably computes the propagated covariance by comparing our estimate to true covariances from
Monte Carlo simulations.
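As a rough illustration of the idea, first-order covariance propagation through a rigid pose can be checked against Monte Carlo sampling, as the abstract describes. This is a minimal sketch, not the paper's method; the 6-vector ordering (axis-angle rotation first, then translation) and all numeric values are assumptions:

```python
import numpy as np

def skew(v):
    """3x3 matrix such that skew(v) @ u == np.cross(v, u)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def rodrigues(w):
    """Rotation matrix for an axis-angle vector w (Rodrigues' formula)."""
    th = np.linalg.norm(w)
    if th < 1e-12:
        return np.eye(3)
    K = skew(w / th)
    return np.eye(3) + np.sin(th) * K + (1.0 - np.cos(th)) * (K @ K)

def tip_covariance(R, p_tip, pose_cov):
    """First-order propagation of a 6x6 pose covariance to the 3x3
    covariance of the tool-tip position. For a right-perturbation
    R @ rodrigues(dw), the tip moves by -R @ skew(p_tip) @ dw + dt,
    so the Jacobian is [-R @ skew(p_tip) | I]."""
    J = np.hstack([-R @ skew(p_tip), np.eye(3)])
    return J @ pose_cov @ J.T

# Monte Carlo check against the closed-form propagation
rng = np.random.default_rng(0)
R, p_tip = np.eye(3), np.array([0.0, 0.0, 100.0])   # tip 100 mm down the shaft
pose_cov = np.diag([1e-6] * 3 + [0.01] * 3)          # rad^2 and mm^2 (assumed)
tips = np.empty((20000, 3))
for i in range(tips.shape[0]):
    dw = rng.multivariate_normal(np.zeros(6), pose_cov)
    tips[i] = R @ rodrigues(dw[:3]) @ p_tip + dw[3:]
mc_cov = np.cov(tips.T)
analytic = tip_covariance(R, p_tip, pose_cov)
```

Note how the 100 mm lever arm inflates the lateral tip variance well beyond the translational noise alone, which is why uncertainty matters most at the tool tip.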
Image-guided interventions using intraoperative 3D imaging can be less cumbersome than systems dependent on preoperative
images, in particular because they require neither a potentially invasive image-to-patient registration nor a lengthy
process of segmenting and generating a 3D surface model. In this study, a method for computer-assisted surgery using direct navigation
on intraoperative imaging is presented. In this system the registration step of a navigated procedure was divided into
two stages: preoperative calibration of images to a ceiling-mounted optical tracking system, and intraoperative tracking
during acquisition of the 3D medical image volume. The preoperative stage used a custom-made multi-modal calibrator
that could be optically tracked and also contained fiducial spheres for radiological detection; a robust registration algorithm
was used to compensate for the very high false-detection rate that was due to the high physical density of the optical
light-emitting diodes. Intraoperatively, a tracking device was attached to plastic bone models that were also instrumented
with radio-opaque spheres; a calibrated pointer was used to contact the latter spheres to validate the registration.
Experiments showed that the fiducial registration error of the preoperative calibration stage was approximately 0.1 mm.
The target registration error in the validation stage was approximately 1.2 mm. This study suggests that direct registration,
coupled with procedure-specific graphical rendering, is potentially a highly accurate means of performing image-guided
interventions in a fast, simple manner.
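The robust registration algorithm itself is not specified in the abstract, but underlying any fiducial-based step is the standard least-squares rigid point registration from which FRE is computed. A hedged sketch (function names, point counts, and noise levels are illustrative assumptions):

```python
import numpy as np

def register_rigid(src, dst):
    """Least-squares rigid transform (R, t) mapping src points onto dst:
    the SVD (Kabsch/Arun) solution of the orthogonal Procrustes problem."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    U, _, Vt = np.linalg.svd((src - cs).T @ (dst - cd))
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # no reflection
    R = Vt.T @ D @ U.T
    return R, cd - R @ cs

def fre(src, dst, R, t):
    """Root-mean-square fiducial registration error after alignment."""
    resid = dst - (src @ R.T + t)
    return np.sqrt((resid ** 2).sum(axis=1).mean())

# Synthetic fiducials: known rotation about z, translation, 0.1 mm noise
rng = np.random.default_rng(1)
th = np.deg2rad(30.0)
R_true = np.array([[np.cos(th), -np.sin(th), 0.0],
                   [np.sin(th),  np.cos(th), 0.0],
                   [0.0,         0.0,        1.0]])
t_true = np.array([5.0, -3.0, 12.0])
src = rng.uniform(-50.0, 50.0, size=(8, 3))
dst = src @ R_true.T + t_true + 0.1 * rng.standard_normal((8, 3))
R_est, t_est = register_rigid(src, dst)
```

A robust variant, as the abstract describes, would wrap such a solver in an outlier-rejection loop to cope with false detections; that loop is omitted here.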
Navigation of a flexible endoscope is a challenging surgical task: the shape of the endoscope's end effector, interacting
with surrounding tissues, determines the surgical path along which the endoscope is pushed. We present a navigational
system that visualizes the shape of the flexible endoscope tube to assist gastrointestinal surgeons in performing Natural
Orifice Translumenal Endoscopic Surgery (NOTES). The system used an electromagnetic positional tracker, a catheter
embedded with multiple electromagnetic sensors, and a graphical user interface for visualization. Hermite splines were used
to interpolate the position and direction outputs of the endoscope sensors. We conducted NOTES experiments on live swine
involving 6 gastrointestinal and 6 general surgeons. Participants who used the device in their first session were 14.2%
faster than without it, and participants who used it in their second session were 33.6% faster than in their first. The trend suggests that
spline-based visualization is a promising adjunct during NOTES procedures.
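A cubic Hermite segment between two consecutive sensor readings can be sketched as follows; the chord-length tangent scaling and all numbers are illustrative assumptions, not details from the paper:

```python
import numpy as np

def hermite_segment(p0, d0, p1, d1, n=50):
    """Sample a cubic Hermite curve between two sensor readings.
    p0, p1 are 3-D positions; d0, d1 are unit direction (tangent) vectors.
    Tangents are scaled by the chord length, a common heuristic that keeps
    curve speed comparable to the sensor spacing."""
    scale = np.linalg.norm(p1 - p0)
    m0, m1 = d0 * scale, d1 * scale
    t = np.linspace(0.0, 1.0, n)[:, None]
    h00 = 2 * t**3 - 3 * t**2 + 1        # standard Hermite basis polynomials
    h10 = t**3 - 2 * t**2 + t
    h01 = -2 * t**3 + 3 * t**2
    h11 = t**3 - t**2
    return h00 * p0 + h10 * m0 + h01 * p1 + h11 * m1

# Two sensors on a tube bending from +x toward +y
p0, d0 = np.array([0.0, 0.0, 0.0]),   np.array([1.0, 0.0, 0.0])
p1, d1 = np.array([30.0, 25.0, 0.0]), np.array([0.0, 1.0, 0.0])
curve = hermite_segment(p0, d0, p1, d1)
```

Chaining one such segment per adjacent sensor pair yields a smooth polyline of the whole tube, suitable for real-time rendering.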
This paper presents a real-time, freehand ultrasound (US) calibration system, with automatic accuracy control
and incorporation of US section thickness. Intended for operating-room use, the system featured a fully
automated calibration method that required minimal human interaction, and an automatic accuracy control
mechanism based on a set of ground-truth data. We have also developed a technique to quantitatively evaluate
and incorporate US section thickness to improve the calibration precision. The experimental results demonstrated
that the calibration system was able to consistently and robustly achieve high calibration accuracy with real-time
performance and efficiency. Further, our preliminary results from incorporating the elevation beam profile
demonstrated a promising reduction in the uncertainty of the estimated elevation-related parameters.
KEYWORDS: Sensors, Finite element methods, Solids, Mechanics, Chemical elements, Data conversion, Object recognition, Robotic systems, Visualization, Systems modeling
Our recent work indicates that normal strain data generally provides insufficient information for reconstructing object geometry. For some classes of tactile tasks, the problem of object recognition is underdetermined and, even when fully determined by the addition of shear data, is not stably invertible. Using both traditional theoretical analysis and finite-element methods to study the solid mechanics of a contact, a series of geometric indentors was applied to a tactile sensor model. In underdetermined cases, adding tangential (shear) components to the normal components of the sensed strains may allow discrimination of fine-form geometries. This indicates that, in providing tactile displays to a human operator, both tangential and normal forces or displacements should be considered.
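The underdetermined-versus-determined distinction can be illustrated with a toy linear sensing model (entirely illustrative; the matrices below are not derived from the paper's contact mechanics): with normal-strain rows alone the system is rank-deficient, and appending a shear row restores full rank.

```python
import numpy as np

# Toy linear sensing model eps = A @ u relating 3 surface unknowns u
# to measured strains eps. These matrices are illustrative only.
A_normal = np.array([[1.0, 1.0, 0.0],     # two normal-strain measurements
                     [0.0, 1.0, 1.0]])    # cannot determine three unknowns
A_shear = np.array([[1.0, 0.0, 1.0]])     # one added shear measurement
A_full = np.vstack([A_normal, A_shear])

rank_normal = np.linalg.matrix_rank(A_normal)   # underdetermined
rank_full = np.linalg.matrix_rank(A_full)       # uniquely invertible
cond_full = np.linalg.cond(A_full)              # finite: stably invertible here
```

In the paper's setting the corresponding operator comes from contact mechanics rather than a hand-built matrix, and the further point is that even a full-rank operator can be so ill-conditioned that inversion is unstable.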