Accurate motion tracking of the left ventricle is critical for detecting wall motion abnormalities in the heart after an injury such as a myocardial infarction. We propose an unsupervised motion tracking framework with physiological constraints to learn dense displacement fields between sequential pairs of 2-D B-mode echocardiography images. Current deep-learning motion-tracking algorithms either require large amounts of ground-truth data, which are difficult to obtain for in vivo datasets (such as patient data and animal studies), or fail to track motion between echocardiographic images due to inherent ultrasound properties (such as low signal-to-noise ratio and various image artifacts). We design a U-Net inspired convolutional neural network that uses manually traced segmentations as a guide to learn displacement estimations between a source and target image without ground-truth displacement fields, by minimizing the difference between a transformed source frame and the original target frame. We then penalize divergence in the displacement field to enforce incompressibility within the left ventricle. We demonstrate the performance of our model on synthetic and in vivo canine 2-D echocardiography datasets by comparing it against a non-rigid registration algorithm and a shape-tracking algorithm. Our results show favorable performance of our model against both methods.
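The unsupervised objective described above can be sketched in a minimal form: warp the source frame by the predicted displacement field, score it against the target frame, and add a divergence penalty on the field to encourage incompressibility. This is an illustrative sketch only, not the paper's implementation; the function names (`warp`, `divergence`, `unsupervised_loss`) and the weighting parameter `lam` are assumptions for demonstration, and the real model would produce `disp` with a U-Net-style network rather than take it as an input.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def warp(source, disp):
    """Warp a 2-D source image by a dense displacement field.

    disp has shape (2, H, W): per-pixel (dy, dx) offsets (an assumed layout).
    """
    h, w = source.shape
    yy, xx = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    coords = np.stack([yy + disp[0], xx + disp[1]])
    # Bilinear interpolation of the source at the displaced coordinates.
    return map_coordinates(source, coords, order=1, mode="nearest")

def divergence(disp):
    """Finite-difference divergence of the displacement field."""
    dy = np.gradient(disp[0], axis=0)  # d(u_y)/dy
    dx = np.gradient(disp[1], axis=1)  # d(u_x)/dx
    return dy + dx

def unsupervised_loss(source, target, disp, lam=0.1):
    """Image-matching term plus a divergence penalty (incompressibility).

    lam is a hypothetical trade-off weight between the two terms.
    """
    recon = np.mean((warp(source, disp) - target) ** 2)
    incomp = np.mean(divergence(disp) ** 2)
    return recon + lam * incomp
```

With a zero displacement field and identical source and target frames, both terms vanish, which is a quick sanity check that the loss is grounded at the identity transform.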