Authors
Yabin Wang, Zhiheng Ma, Zhiwu Huang, Yaowei Wang, Zhou Su, Xiaopeng Hong
Publication date
2023/6/26
Journal
Proceedings of the AAAI Conference on Artificial Intelligence
Volume
37
Issue
8
Pages
10209-10217
Description
This paper addresses the prevalent stage interference and stage performance imbalance in incremental learning. To avoid obvious stage learning bottlenecks, we propose a new incremental learning framework that leverages a series of stage-isolated classifiers to perform the learning task at each stage without interference from the others. Concretely, to aggregate the multiple stage classifiers impartially into a uniform one, we first introduce a temperature-controlled energy metric to indicate the confidence levels of the stage classifiers. We then propose an anchor-based energy self-normalization strategy to ensure the stage classifiers work at the same energy level. Finally, we design a voting-based inference augmentation strategy for robust inference. The proposed method is rehearsal-free and works for almost all incremental learning scenarios. We evaluate it on four large datasets, and extensive results demonstrate its superiority, setting a new state of the art in overall performance. Code is available at https://github.com/iamwangyabin/ESN.
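The abstract mentions a temperature-controlled energy metric for comparing confidence across stage classifiers. A minimal sketch is given below, assuming the standard free-energy form E(x; T) = -T * logsumexp(logits / T); the function names, temperature value, and stage-selection step are illustrative assumptions, not the ESN implementation itself.

import torch

def energy_score(logits: torch.Tensor, temperature: float = 1.0) -> torch.Tensor:
    # Temperature-controlled energy of classifier logits, computed per sample
    # over the class dimension. Lower energy corresponds to higher confidence.
    # Assumed form: E(x; T) = -T * logsumexp(logits / T).
    return -temperature * torch.logsumexp(logits / temperature, dim=-1)

# Illustrative use (hypothetical shapes): pick, for each sample, the stage
# classifier whose energy indicates the highest confidence.
stage_logits = [torch.randn(4, 10) for _ in range(3)]                              # 3 stage classifiers, batch of 4
energies = torch.stack([energy_score(l, temperature=2.0) for l in stage_logits])   # shape (3, 4)
best_stage = energies.argmin(dim=0)                                                # lowest energy = most confident stage

In the paper's framework, such per-stage energies would additionally be self-normalized against anchors so that all stage classifiers operate at a comparable energy level before aggregation; that step is omitted here.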