A hybrid imaging system is proposed for cancer detection, diagnosis and therapy monitoring by integrating
three complementary imaging techniques: ultrasound, photoacoustic and elasticity imaging. Indeed, simultaneous
imaging of the anatomy (ultrasound imaging), cancer-induced angiogenesis (photoacoustic imaging) and changes in
the biomechanical properties of tissue (elasticity imaging) exploits many synergistic features of these modalities and
may result in a uniquely informative imaging tool. To facilitate the design and development of a real-time imaging
system for clinical applications, we have investigated the core components of the imaging system using numerical
simulations. The differences and similarities among the three imaging techniques were compared and contrasted. The results
of our study suggest that ultrasound, photoacoustic and elasticity imaging can be integrated in a custom-designed
imaging system.
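As a minimal sketch of why photoacoustic imaging highlights angiogenesis (our illustration, not part of the proposed system; all parameter values are hypothetical), the initial acoustic pressure under stress and thermal confinement follows the standard relation p0 = Gamma * mu_a * F, so strongly absorbing, blood-rich regions dominate the photoacoustic signal:

    import numpy as np

    # Sketch of the photoacoustic source term: p0 = Gamma * mu_a * F, where
    # Gamma is the dimensionless Grueneisen parameter, mu_a the optical
    # absorption coefficient (1/cm) and F the local laser fluence (mJ/cm^2).
    # All values below are hypothetical placeholders.
    def initial_pressure(grueneisen, mu_a, fluence):
        """Initial photoacoustic pressure map (relative units)."""
        return grueneisen * mu_a * fluence

    mu_a_map = np.array([[0.2, 0.2],      # background tissue (hypothetical)
                         [0.2, 5.0]])     # blood-rich region (hypothetical)
    p0 = initial_pressure(grueneisen=0.2, mu_a=mu_a_map, fluence=10.0)
    print(p0)  # the absorbing region yields a proportionally larger signal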
Tissue engineering is an interdisciplinary field that combines various aspects of engineering and the life sciences and
aims to develop biological substitutes to restore, repair or maintain tissue function. Currently, quantitative functional
assessment of engineered tissues is limited to invasive methods such as biopsy. Hence, an
imaging tool for non-invasive and simultaneous evaluation of the anatomical and functional properties of the engineered
tissue is needed. In this paper we present an advanced in vivo imaging technology, ultrasound biomicroscopy combined
with complementary photoacoustic and elasticity imaging techniques, capable of accurately visualizing both structural
and functional changes in engineered tissues, sequentially monitoring tissue adaptation and/or regeneration, and potentially
assisting drug delivery and treatment planning. The combined imaging at microscopic resolution was evaluated on
tissue-mimicking phantoms imaged with a 25 MHz single-element focused transducer. The results of our study
demonstrate that the ultrasonic, photoacoustic and elasticity images synergistically complement each other in detecting
features otherwise imperceptible using the individual techniques. Finally, we illustrate the feasibility of the combined
ultrasound, photoacoustic and elasticity imaging techniques in accurately assessing the morphological and functional
changes occurring in engineered tissue.
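As a minimal sketch of how elasticity imaging can recover tissue motion (an assumed processing chain, not necessarily the one used here), axial displacement between pre- and post-compression RF A-lines may be estimated by windowed normalized cross-correlation, with local strain approximated by the depth gradient of the displacement:

    import numpy as np

    # Sketch of displacement estimation for elasticity imaging (assumed approach):
    # windowed normalized cross-correlation between pre- and post-compression RF lines.
    def axial_displacement(rf_pre, rf_post, win=64, search=16):
        """Return the best-matching lag (in samples) for each depth window."""
        lags = []
        for start in range(0, len(rf_pre) - win - search, win):
            ref = rf_pre[start:start + win]
            best_lag, best_score = 0, -np.inf
            for lag in range(-search, search + 1):
                if start + lag < 0:
                    continue
                seg = rf_post[start + lag:start + lag + win]
                if len(seg) < win:
                    continue
                score = np.dot(ref, seg) / (np.linalg.norm(ref) * np.linalg.norm(seg) + 1e-12)
                if score > best_score:
                    best_score, best_lag = score, lag
            lags.append(best_lag)
        return np.array(lags)

    # Local axial strain is approximately the depth gradient of the displacement:
    # strain = np.gradient(axial_displacement(rf_pre, rf_post))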
Understanding human behavior in video is essential in numerous applications including smart surveillance, video annotation/retrieval, and human-computer interaction.
However, recognizing human interactions is a challenging task due to ambiguity in body articulation, variations in body size and appearance, loose clothing, mutual occlusion, and shadows.
In this paper we present a framework for recognizing human actions and interactions in color video, and a hierarchical graphical model that unifies multiple levels of processing in video computing: the pixel level, blob level, object level, and event level. A mixture-of-Gaussians (MOG) model is used at the pixel level to train on and classify individual pixel colors. Relaxation labeling with an attribute relational graph (ARG) is used at the blob level to merge the pixels into coherent blobs and to register inter-blob relations. At the object level, the poses of individual body parts are recognized using Bayesian networks (BNs). At the event level, the actions of a single person are modeled using a dynamic Bayesian network (DBN). The object-level descriptions for each person are juxtaposed along a common timeline to identify an interaction between two persons. The linguistic 'verb argument structure' is used to represent human action in terms of triplets, and a meaningful semantic description in terms of these triplets is obtained. Our system achieves semantic descriptions of positive, neutral, and negative interactions between two persons: hand-shaking, standing hand-in-hand, and hugging as the positive interactions; approaching, departing, and pointing as the neutral interactions; and pushing, punching, and kicking as the negative interactions.
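As a minimal sketch of the pixel-level stage only (using scikit-learn's GaussianMixture as a stand-in for the paper's MOG classifier; the class names and data below are hypothetical), one mixture can be fit per color class and each pixel assigned to the class with the highest likelihood:

    import numpy as np
    from sklearn.mixture import GaussianMixture

    # Sketch of per-pixel MOG color classification (assumed realization of the
    # pixel-level stage, not the authors' implementation).
    def train_pixel_models(training_pixels, n_components=3):
        """training_pixels: dict mapping class name -> (N, 3) array of RGB samples."""
        return {cls: GaussianMixture(n_components=n_components).fit(px)
                for cls, px in training_pixels.items()}

    def classify_pixels(models, pixels):
        """pixels: (M, 3) array of RGB values -> list of class labels."""
        names = list(models)
        scores = np.stack([models[n].score_samples(pixels) for n in names], axis=1)
        return [names[i] for i in scores.argmax(axis=1)]

    # Hypothetical usage with random stand-in data:
    rng = np.random.default_rng(0)
    train = {"skin": rng.normal(180, 10, (500, 3)),
             "background": rng.normal(60, 20, (500, 3))}
    models = train_pixel_models(train)
    labels = classify_pixels(models, rng.normal(170, 15, (10, 3)))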