Dataset Information

Exact Partial Information Decompositions for Gaussian Systems Based on Dependency Constraints.


ABSTRACT: The Partial Information Decomposition, introduced by Williams P. L. et al. (2010), provides a theoretical framework to characterize and quantify the structure of multivariate information sharing. A new method (I_dep) has recently been proposed by James R. G. et al. (2017) for computing a two-predictor partial information decomposition over discrete spaces. A lattice of maximum entropy probability models is constructed based on marginal dependency constraints, and the unique information that a particular predictor has about the target is defined as the minimum increase in joint predictor-target mutual information when that particular predictor-target marginal dependency is constrained. Here, we apply the I_dep approach to Gaussian systems, for which the marginally constrained maximum entropy models are Gaussian graphical models. Closed form solutions for the I_dep PID are derived for both univariate and multivariate Gaussian systems. Numerical and graphical illustrations are provided, together with practical and theoretical comparisons of the I_dep PID with the minimum mutual information partial information decomposition (I_mmi), which was discussed by Barrett A. B. (2015). The results obtained using I_dep appear to be more intuitive than those given by other methods, such as I_mmi, in which the redundant and unique information components are constrained to depend only on the predictor-target marginal distributions. In particular, it is proved that the I_mmi method generally produces larger estimates of redundancy and synergy than does the I_dep method. In discussion of the practical examples, the PIDs are complemented by the use of tests of deviance for the comparison of Gaussian graphical models.
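For a concrete sense of the quantities being compared, the I_mmi decomposition discussed by Barrett (2015) has a simple closed form for Gaussian systems: redundancy is the smaller of the two single-predictor mutual informations, and unique and synergistic terms follow by subtraction from the joint mutual information. The sketch below is an illustrative implementation of that baseline (not of the paper's I_dep method, whose closed forms are derived in the article itself); the covariance matrix and function names are hypothetical examples.

```python
import numpy as np

def gaussian_mi(cov, idx_a, idx_b):
    # Mutual information (in nats) between jointly Gaussian blocks A and B:
    # I(A;B) = 0.5 * log( det(S_A) * det(S_B) / det(S_AB) )
    a = np.ix_(idx_a, idx_a)
    b = np.ix_(idx_b, idx_b)
    ab = np.ix_(idx_a + idx_b, idx_a + idx_b)
    return 0.5 * np.log(np.linalg.det(cov[a]) * np.linalg.det(cov[b])
                        / np.linalg.det(cov[ab]))

def immi_pid(cov, x1, x2, y):
    # I_mmi PID for two predictor blocks X1, X2 and target block Y.
    # Redundancy = min single-predictor MI; the rest follows by subtraction.
    i1 = gaussian_mi(cov, x1, y)          # I(X1;Y)
    i2 = gaussian_mi(cov, x2, y)          # I(X2;Y)
    i12 = gaussian_mi(cov, x1 + x2, y)    # I(X1,X2;Y)
    red = min(i1, i2)
    return {"redundancy": red,
            "unique_1": i1 - red,
            "unique_2": i2 - red,
            "synergy": i12 - max(i1, i2)}

# Hypothetical 3-variable covariance: X1, X2, Y with unit variances.
cov = np.array([[1.0, 0.5, 0.6],
                [0.5, 1.0, 0.4],
                [0.6, 0.4, 1.0]])
pid = immi_pid(cov, x1=[0], x2=[1], y=[2])
```

By construction the four components are non-negative and sum to the joint mutual information I(X1,X2;Y); the abstract's result that I_mmi generally yields larger redundancy and synergy than I_dep can be checked against such values once the I_dep closed forms from the paper are implemented.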

SUBMITTER: Kay JW 

PROVIDER: S-EPMC7512755 | biostudies-literature | 2018 Mar

REPOSITORIES: biostudies-literature


Publications

Exact Partial Information Decompositions for Gaussian Systems Based on Dependency Constraints.

Kay, Jim W. (JW); Ince, Robin A. A. (RAA)

Entropy (Basel, Switzerland), 2018 Mar 30; (4)

Similar Datasets

| S-EPMC6901079 | biostudies-literature
| S-EPMC4593074 | biostudies-literature
| S-EPMC3974646 | biostudies-literature
| S-EPMC8424921 | biostudies-literature
| S-EPMC9910319 | biostudies-literature
| S-EPMC10898342 | biostudies-literature
| S-EPMC3340063 | biostudies-literature
| S-EPMC8608426 | biostudies-literature
| S-EPMC5324576 | biostudies-literature
| S-EPMC11294562 | biostudies-literature