Dataset Information

Nonlinear Information Bottleneck


ABSTRACT: Information bottleneck (IB) is a technique for extracting information in one random variable X that is relevant for predicting another random variable Y. IB works by encoding X in a compressed “bottleneck” random variable M from which Y can be accurately decoded. However, finding the optimal bottleneck variable involves a difficult optimization problem, which until recently has been considered for only two limited cases: discrete X and Y with small state spaces, and continuous X and Y with a Gaussian joint distribution (in which case optimal encoding and decoding maps are linear). We propose a method for performing IB on arbitrarily-distributed discrete and/or continuous X and Y, while allowing for nonlinear encoding and decoding maps. Our approach relies on a novel non-parametric upper bound for mutual information. We describe how to implement our method using neural networks. We then show that it achieves better performance than the recently-proposed “variational IB” method on several real-world datasets.
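The abstract describes the IB trade-off: encode X into a bottleneck M that stays predictive of Y while compressing away the rest. The sketch below is not the paper's nonlinear method; it is a minimal illustration of the IB Lagrangian I(Y;M) − β·I(X;M) for small discrete variables, where the encoder p(m|x), the toy joint distribution, and the function names are all choices made here for illustration.

```python
import numpy as np

def mutual_information(p_joint):
    """I(A;B) in nats for a joint distribution given as a 2-D array."""
    p_a = p_joint.sum(axis=1, keepdims=True)
    p_b = p_joint.sum(axis=0, keepdims=True)
    mask = p_joint > 0
    return float(np.sum(p_joint[mask] * np.log(p_joint[mask] / (p_a * p_b)[mask])))

def ib_objective(p_xy, enc, beta):
    """IB Lagrangian I(Y;M) - beta * I(X;M).

    p_xy: joint distribution over (X, Y), shape (nx, ny).
    enc:  encoder p(m|x), shape (nx, nm), rows sum to 1.
    """
    p_x = p_xy.sum(axis=1)      # marginal p(x)
    p_xm = p_x[:, None] * enc   # joint p(x, m)
    p_ym = p_xy.T @ enc         # joint p(y, m) = sum_x p(x, y) p(m|x)
    return mutual_information(p_ym) - beta * mutual_information(p_xm)

# Toy example: X uniform over 4 states, Y = X mod 2 (deterministic).
p_xy = np.zeros((4, 2))
for x in range(4):
    p_xy[x, x % 2] = 0.25

identity_enc = np.eye(4)        # M = X: no compression
parity_enc = np.zeros((4, 2))   # M = X mod 2: keeps only what predicts Y
for x in range(4):
    parity_enc[x, x % 2] = 1.0

beta = 0.5
print(ib_objective(p_xy, identity_enc, beta))  # pays for all of I(X;M) = ln 4
print(ib_objective(p_xy, parity_enc, beta))    # same I(Y;M), half the compression cost
```

Both encoders achieve the same predictive term I(Y;M) = ln 2, but the parity encoder discards the bit of X that is irrelevant to Y, so it scores higher on the IB objective — the trade-off the abstract describes.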

SUBMITTER: Kolchinsky A 

PROVIDER: S-EPMC7514526 | biostudies-literature | 2019 Nov

REPOSITORIES: biostudies-literature

Similar Datasets

| S-EPMC9806839 | biostudies-literature
| S-EPMC10369960 | biostudies-literature
| S-EPMC7971903 | biostudies-literature
| S-EPMC9825246 | biostudies-literature
| S-EPMC6085859 | biostudies-other
| S-EPMC10473284 | biostudies-literature
| S-EPMC2944202 | biostudies-literature
| S-EPMC2527865 | biostudies-literature
| S-EPMC7486391 | biostudies-literature
| S-EPMC6444992 | biostudies-literature