Anchor-based two-stage object detectors such as Faster R-CNN are widely used for detection tasks across many fields. Because these networks are built on pre-trained classification backbones, their performance depends heavily on the backbone's properties, which can limit their generalization ability on certain datasets. To overcome this problem and enhance the model's representation ability, we propose a Variational Information Bottleneck Based Feature Enhancement Object Detection Network (VFEDet). In the first stage, we design a spatial-wise feature enhancement module that highlights the critical targets in the images, using a weighting map generated from the original features through an information bottleneck (i.e., a Variational Information Bottleneck, VIB). This effectively suppresses overfitting and makes the features carry more discriminative information for recognition and bounding-box regression. In the second stage, we insert a VIB after the first fully connected layer to improve the model's robustness. With these two components added to the original detection model, we achieve a 39.34% improvement on a thyroid nodule ultrasound image dataset corrupted by a special kind of noise introduced in a previous work. The effectiveness of the proposed method is also evaluated on the COCO dataset.
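The abstract's core mechanism, a VIB-style spatial weighting map applied to backbone features, can be sketched as follows. This is a minimal illustrative sketch in NumPy, not the authors' implementation: the 1×1-convolution heads are modeled as channel-wise matrix products, and all parameter names (`w_mu`, `w_logvar`, `beta`) are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def vib_spatial_enhance(feat, w_mu, w_logvar, beta=1e-3):
    """Sketch of VIB-based spatial feature enhancement.

    feat:     backbone feature map of shape (C, H, W)
    w_mu:     (1, C) weights predicting the per-location mean
    w_logvar: (1, C) weights predicting the per-location log-variance
    Returns the re-weighted feature map and a KL regularization term.
    """
    C, H, W = feat.shape
    x = feat.reshape(C, -1)                    # flatten spatial dims: (C, H*W)
    mu = w_mu @ x                              # per-location mean: (1, H*W)
    logvar = w_logvar @ x                      # per-location log-variance

    # Reparameterization trick: z = mu + sigma * eps, eps ~ N(0, 1)
    eps = rng.standard_normal(mu.shape)
    z = mu + np.exp(0.5 * logvar) * eps

    # Squash the stochastic code into a spatial weighting map in (0, 1)
    weight = 1.0 / (1.0 + np.exp(-z))
    enhanced = feat * weight.reshape(1, H, W)  # broadcast over channels

    # KL( N(mu, sigma^2) || N(0, 1) ), averaged over spatial locations;
    # this is the information-bottleneck penalty that discourages overfitting
    kl = 0.5 * np.mean(np.exp(logvar) + mu**2 - 1.0 - logvar)
    return enhanced, beta * kl
```

At training time, the scaled KL term would be added to the detection losses so the weighting map keeps only the information useful for recognition and regression.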
We propose the design of a retinal-projection-based near-eye display that achieves an ultra-large field of view, vision correction, and occlusion. Our solution is highlighted by a contact lens combo, a transparent organic light-emitting diode panel, and a twisted nematic liquid crystal panel. Its design rules are set forth in detail, followed by results and discussion covering the field of view, angular resolution, modulation transfer function, contrast ratio, distortion, and simulated imaging.