The demand for running iris segmentation models on mobile devices has been growing rapidly. Most current segmentation networks have an enormous number of parameters and are hence unsuitable for mobile devices, while other small-memory-footprint models follow the spirit of classification networks and ignore the inherent characteristics of segmentation. To address this challenge, we propose a lightweight segmentation network (LiSeNet) for iris segmentation of noisy images. Unlike previous studies that focus only on improving the accuracy of segmentation masks, LiSeNet simultaneously produces segmentation masks and parameterized pupillary and limbic boundaries of the iris, enabling CNN-based iris segmentation to be applied in any conventional iris recognition system. We first propose a multiscale concatenate (MSC) block, which densely connects convolutions of multiple kernel sizes, gradually reduces the dimension of the feature maps, and aggregates the resulting features for image representation. Based on the MSC block, we develop a two-stage refinement encoder that aggregates discriminative features through subnetwork feature reuse and substage feature reassessment, thus obtaining a sufficient receptive field and enhancing the model's learning ability. To exploit object contextual information more efficiently, we further devise a grouped spatial attention module that emphasizes important features and suppresses irrelevant noise through a gating mechanism in the decoder. Extensive experiments on three challenging iris datasets show that LiSeNet, without any complicated postprocessing, achieves competitive or state-of-the-art performance with only 2.2M parameters, 14× fewer than the previous best method. Code will be publicly available.
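To make the MSC block's design concrete, a minimal PyTorch sketch is given below. The specific kernel sizes (3, 5, 7), the channel-halving reduction ratio, and the 1×1 aggregation projection are assumptions; the abstract states only that convolutions of multiple kernel sizes are densely connected, feature-map dimensions are gradually reduced, and the outputs are aggregated.

```python
# Hypothetical sketch of the multiscale concatenate (MSC) block.
# Kernel sizes, the reduction ratio, and the 1x1 projection are assumed,
# not taken from the paper.
import torch
import torch.nn as nn


class MSCBlock(nn.Module):
    """Densely connects convolutions of several kernel sizes and
    aggregates their progressively narrower outputs."""

    def __init__(self, in_ch, out_ch, kernel_sizes=(3, 5, 7)):
        super().__init__()
        self.branches = nn.ModuleList()
        cum_ch = in_ch        # channels visible to the next branch (dense wiring)
        branch_ch = in_ch     # per-branch output width
        for k in kernel_sizes:
            branch_ch = max(branch_ch // 2, 8)  # gradual channel reduction (assumed ratio)
            self.branches.append(nn.Sequential(
                nn.Conv2d(cum_ch, branch_ch, k, padding=k // 2, bias=False),
                nn.BatchNorm2d(branch_ch),
                nn.ReLU(inplace=True),
            ))
            cum_ch += branch_ch  # later branches see all earlier outputs
        # Aggregate every branch output with a 1x1 projection.
        self.project = nn.Conv2d(cum_ch - in_ch, out_ch, kernel_size=1, bias=False)

    def forward(self, x):
        feats, outs = [x], []
        for branch in self.branches:
            y = branch(torch.cat(feats, dim=1))  # dense connectivity
            feats.append(y)
            outs.append(y)
        return self.project(torch.cat(outs, dim=1))
```

As a usage example, `MSCBlock(64, 128)(torch.randn(1, 64, 56, 56))` yields a `(1, 128, 56, 56)` tensor; the dense wiring lets each larger kernel refine features already extracted at smaller scales.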
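The grouped spatial attention could plausibly be realized as sketched below. The per-group average/max spatial descriptors and the 7×7 gate convolution are assumptions borrowed from common spatial-attention designs; only the grouping and the sigmoid gating that emphasizes important features and suppresses noise follow from the abstract.

```python
# Hypothetical sketch of a grouped spatial attention with a gating
# mechanism. The descriptor construction (channel-wise avg + max) and
# the 7x7 gate convolution are assumptions.
import torch
import torch.nn as nn


class GroupedSpatialAttention(nn.Module):
    """Splits channels into groups and gates each group with its own
    spatial attention map."""

    def __init__(self, channels, groups=4):
        super().__init__()
        assert channels % groups == 0
        self.groups = groups
        # Each group's 2-channel descriptor (avg + max over its channels)
        # is mapped to a single-channel gate by a grouped convolution.
        self.gate = nn.Conv2d(2 * groups, groups, kernel_size=7,
                              padding=3, groups=groups, bias=False)

    def forward(self, x):
        b, c, h, w = x.shape
        g = self.groups
        xg = x.view(b, g, c // g, h, w)
        avg = xg.mean(dim=2)   # (b, g, h, w)
        mx = xg.amax(dim=2)    # (b, g, h, w)
        desc = torch.stack((avg, mx), dim=2).view(b, 2 * g, h, w)
        attn = torch.sigmoid(self.gate(desc))    # per-group spatial gates in (0, 1)
        out = xg * attn.view(b, g, 1, h, w)      # emphasize or suppress each location
        return out.view(b, c, h, w)
```

Placed in the decoder as the abstract describes, each group's gate rescales its feature maps location by location, so responses outside the iris region can be suppressed before upsampling.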