Multi-temporal collaborative analysis of port scenes can enhance the representation of image scenes, and image registration is a prerequisite for multi-temporal analysis. This paper proposes E-SuperGlue, an image registration network with enhanced feature matching, to address the difficulty of extracting feature points and matching feature descriptors in multi-temporal port image registration. The network takes SuperGlue as its basic framework. First, a Focus module is introduced into the feature extraction network to increase the number and detection rate of feature points. Second, an LFPE module is added to the encoding module of the feature matching network to improve the encoding efficiency of feature descriptors. Finally, an improved multi-layer perceptron structure, E-MLP, is added to the feature matching network to improve the utilization of channel information.
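The Focus operation mentioned above is the pixel-slicing downsampling block popularized by YOLOv5-style backbones; how it is wired into the SuperGlue feature extraction stage, and the internals of LFPE and E-MLP, are not given in the abstract. The sketch below is therefore only a minimal illustration of the Focus slicing itself, with assumed module and parameter names rather than the authors' implementation.

```python
import torch
import torch.nn as nn

class Focus(nn.Module):
    """Slice the input into four pixel-interleaved sub-images and stack them
    along the channel axis, so spatial detail is preserved in channels before
    the first convolution (intended to retain more candidate feature points)."""
    def __init__(self, in_channels: int, out_channels: int, kernel_size: int = 3):
        super().__init__()
        self.conv = nn.Conv2d(in_channels * 4, out_channels,
                              kernel_size, stride=1, padding=kernel_size // 2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (B, C, H, W) with even H and W
        sliced = torch.cat(
            [x[..., ::2, ::2],     # top-left pixels
             x[..., 1::2, ::2],    # bottom-left pixels
             x[..., ::2, 1::2],    # top-right pixels
             x[..., 1::2, 1::2]],  # bottom-right pixels
            dim=1)                 # (B, 4C, H/2, W/2)
        return self.conv(sliced)
```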
With the development of computer vision and deep learning, convolutional neural networks have been widely used in image processing tasks such as object detection and semantic segmentation, and have achieved breakthrough results. However, when training samples are insufficient, conventional neural networks usually lack robustness. To address this problem, we improve the generalization of few-shot detectors by focusing on the object center, enabling the detector to identify novel categories. The paper proposes a new attention mechanism based on an auxiliary circle feature map of the object center: a circle centered at the object center, with the smaller of the box height and width as its diameter, is added to the anchor-free CenterNet network as soft attention to guide training. Experiments on the PASCAL VOC 2007/2012 datasets show that the proposed method achieves state-of-the-art accuracy and standard deviation for few-shot object detection, demonstrating the algorithm's effectiveness.
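As a rough illustration of the auxiliary circle feature map described above, the sketch below builds a circular mask centered on the object center with diameter min(box height, box width) and applies it as soft attention to a feature map. The function name, the hard (non-Gaussian) mask, and the residual-style fusion are assumptions for illustration; the abstract does not specify how the map is fused into CenterNet.

```python
import numpy as np

def circle_attention_map(height: int, width: int,
                         cx: float, cy: float,
                         box_h: float, box_w: float) -> np.ndarray:
    """Return an (height, width) map equal to 1 inside the circle of diameter
    min(box_h, box_w) centered at (cx, cy) and 0 outside (a hard mask; a
    Gaussian falloff could be substituted for softer attention)."""
    radius = min(box_h, box_w) / 2.0
    ys, xs = np.mgrid[0:height, 0:width]
    dist2 = (xs - cx) ** 2 + (ys - cy) ** 2
    return (dist2 <= radius ** 2).astype(np.float32)

# Hypothetical usage: weight a CenterNet feature map by the circle mask.
feat = np.random.rand(128, 128).astype(np.float32)            # stand-in feature map
mask = circle_attention_map(128, 128, cx=64, cy=40, box_h=30, box_w=50)
attended = feat * (1.0 + mask)                                 # residual-style soft attention
```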