Empirical Comparison of Publication Bias Tests in Meta-Analysis.
ABSTRACT: BACKGROUND: Decision makers rely on meta-analytic estimates to trade off benefits and harms. Publication bias impairs the validity and generalizability of such estimates. The performance of various statistical tests for publication bias has largely been compared using simulation studies and has not been systematically evaluated in empirical data. METHODS: This study compares seven commonly used publication bias tests (i.e., Begg's rank test, trim-and-fill, Egger's, Tang's, Macaskill's, Deeks', and Peters' regression tests) based on 28,655 meta-analyses available in the Cochrane Library. RESULTS: Egger's regression test detected publication bias more frequently than other tests (15.7% in meta-analyses of binary outcomes and 13.5% in meta-analyses of non-binary outcomes). The proportion of statistically significant publication bias tests was greater for larger meta-analyses, especially for Begg's rank test and the trim-and-fill method. The agreement among Tang's, Macaskill's, Deeks', and Peters' regression tests for binary outcomes was moderately strong (most κ's were around 0.6). Tang's and Deeks' tests had fairly similar performance (κ > 0.9). The agreement among Begg's rank test, the trim-and-fill method, and Egger's regression test was weak or moderate (κ
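Egger's regression test, the most frequently significant test in this comparison, regresses each study's standardized effect (effect estimate divided by its standard error) on its precision (the reciprocal of the standard error) and tests whether the intercept differs from zero. The following is a minimal sketch of that idea in Python, not the implementation used in the study; the function name and the simulated inputs are illustrative assumptions.

```python
import numpy as np
from scipy import stats

def eggers_test(effects, ses):
    """Sketch of Egger's regression test for funnel-plot asymmetry.

    Regresses standardized effects (effect / SE) on precision (1 / SE).
    A non-zero intercept suggests small-study effects, which may
    indicate publication bias. Returns (intercept, two-sided p-value).
    """
    effects = np.asarray(effects, dtype=float)
    ses = np.asarray(ses, dtype=float)
    y = effects / ses          # standardized effects
    x = 1.0 / ses              # precisions
    res = stats.linregress(x, y)
    # linregress reports a p-value for the slope; the intercept is
    # tested here with a t-statistic on n - 2 degrees of freedom.
    t_stat = res.intercept / res.intercept_stderr
    p = 2.0 * stats.t.sf(abs(t_stat), df=len(y) - 2)
    return res.intercept, p

# Illustrative data: standard errors for 10 studies and effects built
# so that effect = 0.5 + 0.3 * SE, i.e. a known asymmetry of 0.3,
# plus a little noise (fixed seed for reproducibility).
rng = np.random.default_rng(0)
ses = np.linspace(0.05, 0.5, 10)
effects = 0.5 + 0.3 * ses + rng.normal(0.0, 0.001, 10)

intercept, p_value = eggers_test(effects, ses)
print(intercept, p_value)  # intercept near 0.3, small p-value
```

A significant intercept is evidence of funnel-plot asymmetry, but (as the abstract's disagreement among tests illustrates) asymmetry can arise from sources other than publication bias, such as genuine heterogeneity across study sizes.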
SUBMITTER: Lin L
PROVIDER: S-EPMC6082203 | biostudies-literature | 2018 Aug
REPOSITORIES: biostudies-literature