
Dataset Information


Alleviating catastrophic forgetting using context-dependent gating and synaptic stabilization.


ABSTRACT: Humans and most animals can learn new tasks without forgetting old ones. However, training artificial neural networks (ANNs) on new tasks typically causes them to forget previously learned tasks. This phenomenon is the result of "catastrophic forgetting," in which training an ANN disrupts connection weights that were important for solving previous tasks, degrading task performance. Several recent studies have proposed methods to stabilize connection weights of ANNs that are deemed most important for solving a task, which helps alleviate catastrophic forgetting. Here, drawing inspiration from algorithms that are believed to be implemented in vivo, we propose a complementary method: adding a context-dependent gating signal, such that only sparse, mostly nonoverlapping patterns of units are active for any one task. This method is easy to implement, requires little computational overhead, and allows ANNs to maintain high performance across large numbers of sequentially presented tasks, particularly when combined with weight stabilization. We show that this method works for both feedforward and recurrent network architectures, trained using either supervised or reinforcement-based learning. This suggests that using multiple, complementary methods, akin to what is believed to occur in the brain, can be a highly effective strategy to support continual learning.
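The abstract describes an algorithmic idea concrete enough to sketch: for each task, a fixed, sparse, mostly non-overlapping binary gate is applied to the hidden layer, optionally combined with a quadratic penalty that stabilizes weights deemed important for earlier tasks. Below is a minimal NumPy sketch of that scheme; the layer sizes, the 20% keep fraction, the random mask construction, and the exact penalty form are illustrative assumptions, not the authors' implementation.

import numpy as np

# --- context-dependent gating, minimal sketch ---
rng = np.random.default_rng(0)

n_input, n_hidden, n_output = 784, 2000, 10   # assumed layer sizes
n_tasks, keep_frac = 100, 0.2                 # assumed: 20% of units active per task

# Draw one fixed, sparse binary gate vector per task before training begins.
gates = np.zeros((n_tasks, n_hidden))
for t in range(n_tasks):
    active = rng.choice(n_hidden, size=int(keep_frac * n_hidden), replace=False)
    gates[t, active] = 1.0

W_in = rng.normal(0.0, 0.05, size=(n_input, n_hidden))
W_out = rng.normal(0.0, 0.05, size=(n_hidden, n_output))

def forward(x, task_id):
    """Forward pass with the current task's gating signal on the hidden layer."""
    h = np.maximum(x @ W_in, 0.0)   # ReLU hidden activity
    h = h * gates[task_id]          # context-dependent gating: silence the other ~80% of units
    return h @ W_out

def stabilization_penalty(W, W_prev, importance, c=0.1):
    """EWC/SI-style quadratic penalty: changes to weights deemed important for
    previous tasks are penalized (the importance estimate itself is omitted here)."""
    return c * np.sum(importance * (W - W_prev) ** 2)

x = rng.normal(size=(1, n_input))
print(forward(x, task_id=3).shape)  # -> (1, 10)

Because a gated unit's activity is zero, gradients through both its incoming and outgoing weights vanish, so each task's updates are confined to its own sparse subnetwork. Per the abstract, this gating works best when paired with a weight-stabilization method such as elastic weight consolidation or synaptic intelligence.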

SUBMITTER: Masse NY 

PROVIDER: S-EPMC6217392 | biostudies-literature | 2018 Oct

REPOSITORIES: biostudies-literature


Publications

Alleviating catastrophic forgetting using context-dependent gating and synaptic stabilization.

Nicolas Y. Masse, Gregory D. Grant, David J. Freedman

Proceedings of the National Academy of Sciences of the United States of America, 2018 Oct 12 (issue 44)


Similar Datasets

| S-EPMC7440920 | biostudies-literature
| S-EPMC5690421 | biostudies-literature
| S-EPMC7787001 | biostudies-literature
| S-EPMC7244383 | biostudies-literature
| S-EPMC8258431 | biostudies-literature
| S-EPMC2787089 | biostudies-literature
| S-EPMC4007033 | biostudies-literature
| S-EPMC7174989 | biostudies-literature
| S-EPMC2430213 | biostudies-literature
| S-EPMC8137710 | biostudies-literature