Dataset Information

Gated recurrence enables simple and accurate sequence prediction in stochastic, changing, and structured environments.


ABSTRACT: From decision making to perception to language, predicting what is coming next is crucial. It is also challenging in stochastic, changing, and structured environments; yet the brain makes accurate predictions in many situations. What computational architecture could enable this feat? Bayesian inference makes optimal predictions but is prohibitively difficult to compute. Here, we show that a specific recurrent neural network architecture enables simple and accurate solutions in several environments. This architecture relies on three mechanisms: gating, lateral connections, and recurrent weight training. Like the optimal solution and the human brain, such networks develop internal representations of their changing environment (including estimates of the environment's latent variables and the precision of these estimates), leverage multiple levels of latent structure, and adapt their effective learning rate to changes without changing their connection weights. Being ubiquitous in the brain, gated recurrence could therefore serve as a generic building block to predict in real-life environments.
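The abstract describes prediction via a gated recurrent architecture whose gates adapt the network's effective learning rate without changing its connection weights. As a rough illustration of that general idea (not the authors' specific network: the gating scheme, parameter names, and shapes below are assumptions), here is a minimal sketch of a standard gated recurrent unit in Python, where the update gate controls how strongly new observations overwrite the stored state.

```python
# Minimal sketch of a generic gated recurrent unit (GRU), for illustration only.
# This is NOT the paper's architecture; all parameter names and shapes are assumptions.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GatedRecurrentCell:
    """Single gated recurrent cell: gates modulate how much of the previous
    hidden state is kept versus overwritten by the new input."""

    def __init__(self, n_in, n_hidden, seed=0):
        rng = np.random.default_rng(seed)
        s = 1.0 / np.sqrt(n_hidden)
        # Input-to-hidden (W) and recurrent hidden-to-hidden (U) weights for the
        # update gate (z), reset gate (r), and candidate state (h~).
        self.W = rng.uniform(-s, s, (3, n_hidden, n_in))
        self.U = rng.uniform(-s, s, (3, n_hidden, n_hidden))
        self.b = np.zeros((3, n_hidden))

    def step(self, x, h):
        """Advance the hidden state by one observation x."""
        z = sigmoid(self.W[0] @ x + self.U[0] @ h + self.b[0])          # update gate
        r = sigmoid(self.W[1] @ x + self.U[1] @ h + self.b[1])          # reset gate
        h_tilde = np.tanh(self.W[2] @ x + self.U[2] @ (r * h) + self.b[2])
        # The gate z sets how much the state is updated on this step,
        # i.e. an input-dependent effective learning rate.
        return (1.0 - z) * h + z * h_tilde

# Example: feed a binary sequence and read out a prediction of the next item.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    cell = GatedRecurrentCell(n_in=1, n_hidden=8, seed=0)
    w_out = rng.normal(scale=0.1, size=8)   # hypothetical untrained readout
    h = np.zeros(8)
    for x in rng.integers(0, 2, size=20).astype(float):
        h = cell.step(np.array([x]), h)
        p_next = sigmoid(w_out @ h)         # predicted probability of the next "1"
```

In this sketch the readout is left untrained; in practice the recurrent and readout weights would be trained on sequences from the target environment, after which adaptation to changes is handled by the gates alone, consistent with the abstract's point that no weight changes are needed at prediction time.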

SUBMITTER: Foucault C 

PROVIDER: S-EPMC8735865 | biostudies-literature

REPOSITORIES: biostudies-literature

Similar Datasets

| S-EPMC5980395 | biostudies-literature
| S-EPMC8227059 | biostudies-literature
| S-EPMC10555843 | biostudies-literature
| S-EPMC6923510 | biostudies-literature
| S-EPMC4605288 | biostudies-literature
| S-EPMC5424156 | biostudies-literature
| S-EPMC7593340 | biostudies-literature
| S-EPMC3358312 | biostudies-literature
| S-EPMC9592829 | biostudies-literature
2002-12-08 | GSE88 | GEO