Few-shot class-incremental learning (FSCIL), which incrementally learns novel classes from few samples without forgetting previously learned classes, is crucial for deploying artificial intelligence in the real world. However, FSCIL faces two significant challenges: catastrophic forgetting of old classes and overfitting to new ones. We focus on convolutional neural network (CNN)-based FSCIL and propose a human-cognition-inspired method in which the knowledge of novel classes is learned under the guidance of previously acquired knowledge. Specifically, we learn a discriminative and generalizable CNN feature extractor from the base classes in the first task. We then generate representations of both base and novel classes in a unified feature space without training on the novel classes, thereby avoiding forgetting of old classes. For novel classes in long sequential tasks, beyond representation generation, we enhance the representations by exploiting their correlations with previously learned classes, which alleviates overfitting and ensures that the novel classes adapt to the feature space. Experimental results show that our method achieves very competitive results on the MiniImageNet and CIFAR-100 datasets.
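The abstract does not detail the method, but the described pipeline (class representations computed in a unified feature space without novel-class training, then novel-class representations enhanced via correlations with base classes) can be sketched as a prototype-based scheme. The following is a minimal illustration, not the paper's implementation: the feature extractor is stood in by pre-computed feature vectors, and the cosine-similarity weighting, softmax combination, and mixing coefficient `alpha` are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 64  # hypothetical feature dimension

# Base-class prototypes: mean feature over many samples per class,
# as produced by a frozen feature extractor (features are random stand-ins here).
base_feats = {c: rng.normal(size=(100, DIM)) for c in range(5)}
base_protos = {c: f.mean(axis=0) for c, f in base_feats.items()}

# Novel-class prototype from few shots (e.g., 5 samples), with no training.
novel_shots = rng.normal(size=(5, DIM))
novel_proto = novel_shots.mean(axis=0)

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

# Enhance the novel prototype using its correlations with previously
# learned base prototypes (a guessed instantiation of "guidance from
# previously learned knowledge"): softmax-weighted mixture of base prototypes.
sims = np.array([cosine(novel_proto, p) for p in base_protos.values()])
weights = np.exp(sims) / np.exp(sims).sum()
base_mix = sum(w * p for w, p in zip(weights, base_protos.values()))

alpha = 0.8  # hypothetical mixing coefficient
enhanced_proto = alpha * novel_proto + (1 - alpha) * base_mix

# Classification is nearest-prototype in the unified feature space;
# base prototypes are untouched, so old classes are not forgotten.
all_protos = {**base_protos, 5: enhanced_proto}
query = rng.normal(size=DIM)
pred = max(all_protos, key=lambda c: cosine(query, all_protos[c]))
print(pred)
```

Because the base prototypes are never updated and the novel prototype is pulled toward the structure of the learned feature space, this kind of scheme addresses forgetting and few-shot overfitting at the same time, which matches the trade-off the abstract describes.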