Dataset Information

Reverse engineering gene networks using global-local shrinkage rules.


ABSTRACT: Inferring gene regulatory networks from high-throughput 'omics' data has proven to be a computationally demanding task of critical importance. Frequently, the classical methods break down owing to the curse of dimensionality, and popular strategies to overcome this are typically based on regularized versions of the classical methods. However, these approaches rely on loss functions that may not be robust and usually do not allow for the incorporation of prior information in a straightforward way. Fully Bayesian methods are equipped to handle both of these shortcomings quite naturally, and they offer the potential for improvements in network structure learning. We propose a Bayesian hierarchical model to reconstruct gene regulatory networks from time-series gene expression data, such as those common in perturbation experiments of biological systems. The proposed methodology uses global-local shrinkage priors for posterior selection of regulatory edges and relaxes the common normal likelihood assumption in order to allow for heavy-tailed data, which several of the cited references show can severely impact network inference. We provide a sufficient condition for posterior propriety and derive an efficient Markov chain Monte Carlo algorithm via Gibbs sampling in the electronic supplementary material. We describe a novel way to detect multiple scales based on the corresponding posterior quantities. Finally, we demonstrate the performance of our approach in a simulation study and compare it with existing methods on real data from a T-cell activation study.
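The global-local shrinkage idea described in the abstract can be illustrated with a minimal sketch. This is not the authors' model (which targets time-series network data and a heavy-tailed likelihood); it is a standard Gibbs sampler for linear regression under a horseshoe prior, using the inverse-gamma auxiliary-variable conditionals of Makalic & Schmidt (2016). Each coefficient gets its own local scale lambda_j, while a single global scale tau shrinks all edges toward zero; all variable names and settings below are illustrative assumptions.

```python
import numpy as np

def horseshoe_gibbs(X, y, n_iter=1500, burn=500, seed=0):
    """Gibbs sampler for y = X @ beta + noise with a horseshoe
    (global-local shrinkage) prior: beta_j ~ N(0, sigma2 * tau2 * lam2_j),
    half-Cauchy priors on tau and lambda_j handled via inverse-gamma
    auxiliary variables (nu_j, xi). Returns the posterior mean of beta."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    XtX, Xty = X.T @ X, X.T @ y
    lam2 = np.ones(p)          # local variances lambda_j^2
    tau2, sigma2, xi = 1.0, 1.0, 1.0
    nu = np.ones(p)
    draws = []
    for it in range(n_iter):
        # beta | rest ~ N(A^{-1} X'y, sigma2 * A^{-1}), A = X'X + D^{-1}
        A = XtX + np.diag(1.0 / (tau2 * lam2))
        L = np.linalg.cholesky(A)
        mean = np.linalg.solve(A, Xty)
        beta = mean + np.sqrt(sigma2) * np.linalg.solve(L.T, rng.standard_normal(p))
        # sigma2 | rest ~ InvGamma((n+p)/2, (RSS + sum beta^2/(tau2*lam2))/2)
        resid = y - X @ beta
        rate = 0.5 * (resid @ resid + np.sum(beta**2 / (tau2 * lam2)))
        sigma2 = 1.0 / rng.gamma(0.5 * (n + p), 1.0 / rate)
        # local scales and their auxiliaries
        lam2 = 1.0 / rng.gamma(1.0, 1.0 / (1.0 / nu + beta**2 / (2 * tau2 * sigma2)))
        nu = 1.0 / rng.gamma(1.0, 1.0 / (1.0 + 1.0 / lam2))
        # global scale and its auxiliary
        rate_t = 1.0 / xi + np.sum(beta**2 / lam2) / (2 * sigma2)
        tau2 = 1.0 / rng.gamma(0.5 * (p + 1), 1.0 / rate_t)
        xi = 1.0 / rng.gamma(1.0, 1.0 / (1.0 + 1.0 / tau2))
        if it >= burn:
            draws.append(beta.copy())
    return np.mean(draws, axis=0)

# Toy example: one strong "regulatory edge" among ten candidates.
rng = np.random.default_rng(1)
n, p = 100, 10
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[0] = 5.0
y = X @ beta_true + rng.standard_normal(n)
beta_hat = horseshoe_gibbs(X, y)
```

The heavy-tailed local scales let the one real signal escape shrinkage while the global scale pulls the remaining coefficients toward zero, which is the mechanism behind posterior edge selection in this family of priors.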

SUBMITTER: Panchal V 

PROVIDER: S-EPMC6936010 | biostudies-literature | 2020 Feb

REPOSITORIES: biostudies-literature

Publications

Reverse engineering gene networks using global-local shrinkage rules.

Panchal V, Linder DF

Interface Focus, 13 Dec 2019, Issue 1

Similar Datasets

| S-EPMC5388190 | biostudies-literature
| S-EPMC3592423 | biostudies-literature
| S-EPMC156306 | biostudies-literature
| S-EPMC8584672 | biostudies-literature
| S-EPMC3847961 | biostudies-literature
| S-EPMC2813854 | biostudies-other
| S-EPMC1959566 | biostudies-literature
| S-EPMC10870138 | biostudies-literature
| S-EPMC7015990 | biostudies-literature
| S-EPMC3716858 | biostudies-literature