Methodological tools and sensitivity analysis for assessing quality or risk of bias used in systematic reviews published in high-impact anesthesiology journals.
ABSTRACT:
BACKGROUND: A crucial element of systematic review (SR) methodology is the appraisal of included primary studies using tools for assessing methodological quality or risk of bias (RoB). SR authors can conduct sensitivity analyses to explore whether their results are sensitive to the exclusion of studies with low quality or high RoB. However, it is unknown which tools SR authors use for assessing quality/RoB and how they set quality/RoB thresholds in sensitivity analyses. The aim of this study was to assess the quality/RoB assessment tools, the types of sensitivity analyses, and the quality/RoB thresholds for sensitivity analyses used in SRs published in high-impact pain/anesthesiology journals.
METHODS: This was a methodological study. We analyzed SRs published from January 2005 to June 2018 in the 25% highest-ranking journals within the Journal Citation Reports (JCR) "Anesthesiology" category. We retrieved the SRs from PubMed. Two authors independently screened records and full texts and extracted data on quality/RoB tools, types of sensitivity analyses, and the quality/RoB thresholds used in them.
RESULTS: Of 678 analyzed SRs, 513 (76%) reported the use of quality/RoB assessments. The most commonly reported tools for assessing quality/RoB in the included studies were the Cochrane risk of bias tool (N = 251; 37%) and the Jadad scale (N = 99; 15%). Meta-analysis was conducted in 451 (66%) SRs and sensitivity analysis in 219/451 (49%). Most commonly, sensitivity analysis was conducted to explore the influence of study quality/RoB on the results (90/219; 41%). The quality/RoB thresholds used in those sensitivity analyses were clearly reported in 47 of the 90 (52%) articles. The thresholds were highly heterogeneous and inconsistent, even when the same tool was used.
CONCLUSIONS: A quarter of SRs did not report using quality/RoB assessments, and some of those that did cited tools that are not intended for assessing quality/RoB. Authors who use quality/RoB to explore the robustness of meta-analysis results apply highly heterogeneous quality/RoB thresholds in sensitivity analyses. Better methodological consistency in quality/RoB sensitivity analyses is needed.
SUBMITTER: Marusic MF
PROVIDER: S-EPMC7236513 | biostudies-literature | 2020 May
REPOSITORIES: biostudies-literature