Quentin Ferdinand, PhD 2023
Mitigating catastrophic forgetting via feature transfer and knowledge consolidation for deep class-incremental learning

Context
Defense
Board of Examiners
Abstract

Deep learning methods are designed for closed-set recognition, where a predefined set of known classes is assumed. In real-world scenarios, however, open-set recognition is more realistic, allowing for the possibility of encountering unknown or novel classes during testing. Class-incremental learning specifically addresses this problem by focusing on continuously improving models through the incorporation of new categories over time. Unfortunately, deep neural networks trained in this manner suffer from catastrophic forgetting, resulting in significant performance degradation on previously encountered classes. While various methods have been explored to alleviate this issue by rectifying the classification layer of deep incremental models, an equally important aspect lies in the degradation of feature quality, which can impede downstream classification performance. This thesis focuses on investigating diverse approaches to enhance feature quality, facilitating adaptation to new classes while preserving discriminative features for past classes during incremental training. Specifically, two methods have been proposed and rigorously evaluated on widely established benchmarks, attaining performance comparable or superior to the state of the art. The first approach investigates the use of contrastive methods during incremental learning to improve the features extracted by incremental models, while the second uses an expansion and compression scheme to greatly reduce the forgetting that occurs at the feature level.

Keywords: incremental learning, catastrophic forgetting, knowledge distillation, convolutional neural network, supervised learning
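To make the two ingredients mentioned in the abstract more concrete, the sketch below illustrates, in PyTorch, how a supervised contrastive term and a feature-level knowledge-distillation term might be combined in a single class-incremental training step. This is only a minimal illustration of the general ideas, not the method proposed in the thesis: the names new_model, old_model, classifier and lambda_kd are placeholders, and both models are assumed to map images to feature vectors of the same dimension.

import torch
import torch.nn.functional as F


def supervised_contrastive_loss(features, labels, temperature=0.1):
    """Pull embeddings of same-class samples together, push the others apart."""
    features = F.normalize(features, dim=1)               # unit-norm feature vectors
    sim = features @ features.t() / temperature           # pairwise similarities
    self_mask = torch.eye(len(features), dtype=torch.bool, device=features.device)
    sim = sim.masked_fill(self_mask, -1e9)                # exclude self-similarity
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    positives = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~self_mask
    pos_counts = positives.sum(dim=1).clamp(min=1)
    return -(log_prob * positives.float()).sum(dim=1).div(pos_counts).mean()


def incremental_step_loss(new_model, old_model, classifier, images, labels, lambda_kd=1.0):
    """Loss for one incremental step: classification on the new data,
    contrastive shaping of the feature space, and feature-level distillation."""
    feats = new_model(images)                              # features from the current model
    loss_ce = F.cross_entropy(classifier(feats), labels)   # learn the new classes
    loss_con = supervised_contrastive_loss(feats, labels)  # keep features discriminative
    with torch.no_grad():
        old_feats = old_model(images)                      # frozen model from the previous task
    loss_kd = F.mse_loss(feats, old_feats)                 # limit drift of the old feature space
    return loss_ce + loss_con + lambda_kd * loss_kd

In an expansion-and-compression scheme of the kind evoked in the abstract, the distillation target could instead come from a temporarily enlarged network that is later compressed back into a compact model; the sketch above only shows the simpler case of distilling from the previous-task model.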
Publications