
Dataset Information


Neuronal message passing using Mean-field, Bethe, and Marginal approximations.


ABSTRACT: Neuronal computations rely upon local interactions across synapses. For a neuronal network to perform inference, it must integrate information from locally computed messages that are propagated among elements of that network. We review the form of two popular (Bayesian) message passing schemes and consider their plausibility as descriptions of inference in biological networks. These are variational message passing and belief propagation - each of which is derived from a free energy functional that relies upon different approximations (mean-field and Bethe respectively). We begin with an overview of these schemes and illustrate the form of the messages required to perform inference using Hidden Markov Models as generative models. Throughout, we use factor graphs to show the form of the generative models and of the messages they entail. We consider how these messages might manifest neuronally and simulate the inferences they perform. While variational message passing offers a simple and neuronally plausible architecture, it falls short of the inferential performance of belief propagation. In contrast, belief propagation allows exact computation of marginal posteriors at the expense of the architectural simplicity of variational message passing. As a compromise between these two extremes, we offer a third approach - marginal message passing - that features a simple architecture, while approximating the performance of belief propagation. Finally, we link formal considerations to accounts of neurological and psychiatric syndromes in terms of aberrant message passing.
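As a rough illustration (not taken from the paper or its simulations), the sketch below contrasts the two schemes on a toy Hidden Markov Model: belief propagation recovers exact posterior marginals with one forward and one backward sweep of messages, while the mean-field (variational) scheme iterates simpler fixed-point updates that only approximate those marginals. All names here (A, B, D, belief_propagation, variational_message_passing) are illustrative assumptions, not identifiers from the authors' code.

```python
# Minimal sketch, assuming a 2-state HMM with categorical observations.
import numpy as np

A = np.array([[0.9, 0.2],
              [0.1, 0.8]])      # transitions P(s_t | s_{t-1}); columns sum to 1
B = np.array([[0.8, 0.3],
              [0.2, 0.7]])      # likelihood P(o_t | s_t); columns sum to 1
D = np.array([0.5, 0.5])        # prior over the initial hidden state
obs = [0, 1, 1]                 # an example observation sequence
T, S = len(obs), len(D)

def belief_propagation(obs):
    """Exact marginals via forward-backward (sum-product) messages on the chain."""
    fwd = np.zeros((T, S))
    bwd = np.ones((T, S))
    fwd[0] = B[obs[0]] * D
    for t in range(1, T):
        fwd[t] = B[obs[t]] * (A @ fwd[t - 1])
    for t in range(T - 2, -1, -1):
        bwd[t] = A.T @ (B[obs[t + 1]] * bwd[t + 1])
    post = fwd * bwd
    return post / post.sum(axis=1, keepdims=True)

def variational_message_passing(obs, iters=16):
    """Mean-field updates: each approximate marginal q(s_t) is refreshed using
    only the current marginals of its neighbours, accumulated in log space."""
    q = np.full((T, S), 1.0 / S)
    for _ in range(iters):
        for t in range(T):
            log_q = np.log(B[obs[t]] + 1e-16)
            log_q += np.log(D + 1e-16) if t == 0 else np.log(A + 1e-16) @ q[t - 1]
            if t < T - 1:
                log_q += np.log(A + 1e-16).T @ q[t + 1]
            q[t] = np.exp(log_q - log_q.max())
            q[t] /= q[t].sum()
    return q

print("BP  marginals:\n", belief_propagation(obs))
print("VMP marginals:\n", variational_message_passing(obs))
```

The contrast in architecture is visible in the code: the variational updates pass only current marginal beliefs between neighbouring nodes, which keeps the scheme simple, whereas belief propagation maintains direction-specific (forward and backward) messages in exchange for exact marginals on this tree-structured model.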

SUBMITTER: Parr T 

PROVIDER: S-EPMC6374414 | biostudies-literature | 2019 Feb

REPOSITORIES: biostudies-literature


Publications

Neuronal message passing using Mean-field, Bethe, and Marginal approximations.

Thomas Parr, Dimitrije Markovic, Stefan J Kiebel, Karl J Friston

Scientific Reports, 2019 Feb 13



Similar Datasets

| S-EPMC2767368 | biostudies-literature
| S-EPMC6876225 | biostudies-literature
| S-EPMC6484029 | biostudies-literature
| S-EPMC4280643 | biostudies-literature
| S-EPMC6915101 | biostudies-literature
| S-EPMC10769188 | biostudies-literature
| S-EPMC11366765 | biostudies-literature
| S-EPMC8979305 | biostudies-literature
| S-EPMC4699204 | biostudies-literature
| S-EPMC11267574 | biostudies-literature