
Dataset Information


Inter-rater agreement in evaluation of disability: systematic review of reproducibility studies.


ABSTRACT:

Objectives

 To explore agreement among healthcare professionals assessing eligibility for work disability benefits.

Design

 Systematic review and narrative synthesis of reproducibility studies.

Data sources

 Medline, Embase, and PsycINFO searched up to 16 March 2016, without language restrictions, and review of bibliographies of included studies.

Eligibility criteria

 Observational studies investigating reproducibility among healthcare professionals performing disability evaluations using a global rating of working capacity and reporting inter-rater reliability by a statistical measure or descriptively. Studies could be conducted in insurance settings, where decisions on ability to work include normative judgments based on legal considerations, or in research settings, where decisions on ability to work disregard normative considerations.

Data extraction and synthesis

 Teams of paired reviewers identified eligible studies, appraised their methodological quality and generalisability, and abstracted results with pretested forms. As heterogeneity of research designs and findings precluded quantitative analysis, a descriptive synthesis stratified by setting (insurance or research) was performed.

Results

 From 4562 references, 101 full text articles were reviewed. Of these, 16 studies conducted in an insurance setting and seven in a research setting, performed in 12 countries, met the inclusion criteria. Studies in the insurance setting involved medical experts assessing actual disability claimants, claimants played by actors, hypothetical cases, or short written scenarios. Conditions were mental (n=6, 38%), musculoskeletal (n=4, 25%), or mixed (n=6, 38%). Applicability of findings from studies conducted in an insurance setting to real life evaluations ranged from generalisable (n=7, 44%) and probably generalisable (n=3, 19%) to probably not generalisable (n=6, 37%). Median inter-rater reliability among experts was 0.45 (intraclass correlation coefficient range 0.86 to -0.10). Inter-rater reliability was poor in six studies (37%) and excellent in only two (13%). This contrasts with studies conducted in the research setting, where the median inter-rater reliability was 0.76 (range 0.91 to 0.53) and five of seven studies (71%) achieved excellent inter-rater reliability. Reliability between assessing professionals was higher when the evaluation was guided by a standardised instrument (23 studies, P=0.006). No such association was detected for subjective or chronic health conditions or for the studies' generalisability to real world evaluation of disability (P=0.46, 0.45, and 0.65, respectively).
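 The agreement statistic reported above is the intraclass correlation coefficient (ICC). Purely as an illustration of how such a coefficient is computed (nothing below is code or data from the study), the following is a minimal sketch of a two-way random-effects, absolute-agreement, single-rater ICC, often written ICC(2,1); the function name and the example ratings are hypothetical.

    import numpy as np

    def icc2_1(x):
        """ICC(2,1): two-way random effects, absolute agreement, single rater.

        x: array of shape (n_targets, k_raters), e.g. work-capacity scores
        assigned by k assessors to n claimants.
        """
        n, k = x.shape
        grand = x.mean()
        row_means = x.mean(axis=1)  # per-claimant means
        col_means = x.mean(axis=0)  # per-rater means

        ss_total = ((x - grand) ** 2).sum()
        ms_r = k * ((row_means - grand) ** 2).sum() / (n - 1)  # between targets
        ms_c = n * ((col_means - grand) ** 2).sum() / (k - 1)  # between raters
        ms_e = (ss_total - ms_r * (n - 1) - ms_c * (k - 1)) / ((n - 1) * (k - 1))

        return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

    # Hypothetical data: 5 claimants rated by 3 assessors on a 0-100 scale.
    ratings = np.array([
        [70, 65, 75],
        [40, 45, 35],
        [90, 85, 95],
        [55, 60, 50],
        [20, 30, 25],
    ], dtype=float)

    print(round(icc2_1(ratings), 2))  # ~0.96: these raters agree closely

 An ICC near 1 indicates that assessors rank and score claimants almost identically, whereas values near or below 0 (as observed in some insurance-setting studies) indicate that differences between assessors swamp differences between claimants.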

Conclusions

 Despite the widespread use of medical evaluations of disability for work and their far reaching consequences for workers claiming disabling injury or illness, research on their reliability is limited and indicates high variation in judgments among assessing professionals. Standardising the evaluation process could improve reliability. Development and testing of instruments and structured approaches to improve the reliability of disability evaluation are urgently needed.

SUBMITTER: Barth J 

PROVIDER: S-EPMC5283380 | biostudies-literature | 2017 Jan

REPOSITORIES: biostudies-literature


Publications

Inter-rater agreement in evaluation of disability: systematic review of reproducibility studies.

Barth Jürgen, de Boer Wout E L, Busse Jason W, Hoving Jan L, Kedzia Sarah, Couban Rachel, Fischer Katrin, von Allmen David Y, Spanjer Jerry, Kunz Regina

BMJ (Clinical research ed.), 2017 Jan 25



Similar Datasets

| S-EPMC6864748 | biostudies-literature
| S-EPMC6097668 | biostudies-literature
| S-EPMC6607597 | biostudies-literature
| S-EPMC3552943 | biostudies-other
| S-EPMC6386303 | biostudies-literature
| S-EPMC7856064 | biostudies-literature
| S-EPMC7391354 | biostudies-literature
| S-EPMC4802863 | biostudies-literature
| S-EPMC3586147 | biostudies-literature
| S-EPMC4821828 | biostudies-literature