RehearMixup: Improving Rehearsal-Based Continual Learning
22 Pages · Posted: 24 Sep 2024
There are 3 versions of this paper
Abstract
Neural networks often suffer from catastrophic forgetting when learning new tasks, losing previously acquired knowledge. Rehearsal-based methods address this issue by storing a subset of data from previous tasks and replaying it while learning new ones. Current rehearsal-based methods focus on selecting representative samples to store in memory; however, little work has explored how to exploit the stored data itself, or the correlations between tasks and between past and new knowledge, to improve performance. We therefore propose a simple yet effective approach, RehearMixup, which adapts the Mixup technique to rehearsal-based methods by synthesizing new training samples through interpolation of data from past or current tasks. Specifically, we introduce three strategies, namely Cross-Mixup, Intra-Memory-Mixup, and Intra-Current-Mixup, based on the inherent structure of rehearsal-based methods, which involves both the memory buffer and the new task. Through empirical evaluations on various benchmark scenarios, we compare our approach against several rehearsal-based baselines. The results demonstrate that our strategies, particularly Intra-Current-Mixup, improve accuracy, backward transfer, and forward transfer, and enhance the model's robustness.
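The abstract describes interpolating pairs of samples drawn from the memory buffer, the current task, or one of each. The following is a minimal NumPy sketch of that idea, based only on the abstract's description and the standard Mixup formulation (lambda drawn from a Beta distribution); the function names, signatures, and the alpha value are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def mixup(x_a, y_a, x_b, y_b, alpha=0.2, rng=None):
    """Standard Mixup: convexly interpolate two inputs and their
    one-hot labels with lambda ~ Beta(alpha, alpha)."""
    rng = rng or np.random.default_rng()
    lam = rng.beta(alpha, alpha)
    return lam * x_a + (1 - lam) * x_b, lam * y_a + (1 - lam) * y_b

def rehearmixup(memory, current, strategy, alpha=0.2, rng=None):
    """Illustrative sketch of the three strategies named in the abstract.

    memory / current: tuples (X, Y) holding samples from the rehearsal
    buffer and from the current task, respectively.
    """
    rng = rng or np.random.default_rng()
    (xm, ym), (xc, yc) = memory, current
    if strategy == "cross":  # Cross-Mixup: one memory sample, one current sample
        i, j = rng.integers(len(xm)), rng.integers(len(xc))
        return mixup(xm[i], ym[i], xc[j], yc[j], alpha, rng)
    if strategy == "intra_memory":  # Intra-Memory-Mixup: two memory samples
        i, j = rng.integers(len(xm), size=2)
        return mixup(xm[i], ym[i], xm[j], ym[j], alpha, rng)
    if strategy == "intra_current":  # Intra-Current-Mixup: two current samples
        i, j = rng.integers(len(xc), size=2)
        return mixup(xc[i], yc[i], xc[j], yc[j], alpha, rng)
    raise ValueError(f"unknown strategy: {strategy}")
```

In each case the synthesized pair (x, y) would be appended to the training batch; the mixed label remains a valid probability distribution because the interpolation weights sum to one.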
Keywords: Continual learning, Rehearsal-based method, Mixup technique, RehearMixup, Memory and new tasks
Suggested Citation