
Dataset Information


Validity of Online Screening for Autism: Crowdsourcing Study Comparing Paid and Unpaid Diagnostic Tasks.


ABSTRACT:

Background

Obtaining a diagnosis of neuropsychiatric disorders such as autism requires long waiting times that can exceed a year and can be prohibitively expensive. Crowdsourcing approaches may provide a scalable alternative that can accelerate general access to care and permit underserved populations to obtain an accurate diagnosis.

Objective

We aimed to perform a series of studies to explore whether paid crowd workers on Amazon Mechanical Turk (AMT) and citizen crowd workers on a public website shared on social media can provide accurate online detection of autism, conducted via crowdsourced ratings of short home video clips.

Methods

Three online studies were performed: (1) a paid crowdsourcing task on AMT (N=54) in which crowd workers were asked to classify 10 short video clips of children as "Autism" or "Not autism," (2) a more complex paid crowdsourcing task (N=27) open only to those raters who correctly rated ≥8 of the 10 videos in the first study, and (3) a public unpaid study (N=115) identical to the first study.
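The Study 1 to Study 2 filtering step described above (keeping only workers who correctly rated at least 8 of the 10 clips) can be sketched as follows. This is an illustrative reconstruction, not code from the paper; the gold-standard labels and function names are hypothetical.

```python
# Hypothetical gold-standard labels for the 10 Study 1 clips (illustrative only).
GOLD = ["Autism", "Not autism", "Autism", "Autism", "Not autism",
        "Autism", "Not autism", "Autism", "Not autism", "Autism"]

def score_worker(ratings, gold=GOLD):
    """Count how many of the 10 clips a worker classified the same as the gold standard."""
    return sum(r == g for r, g in zip(ratings, gold))

def qualify_for_study2(workers, threshold=8):
    """Keep only workers who scored >= threshold out of 10 in Study 1."""
    return {worker_id: ratings for worker_id, ratings in workers.items()
            if score_worker(ratings) >= threshold}
```

With this filter, a worker matching all 10 gold labels qualifies, while one matching only 4 does not, mirroring the 27 of 54 AMT workers retained for Study 2.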

Results

For Study 1, the mean score of the participants who completed all questions was 7.50/10 (SD 1.46). When analyzing only the workers who scored ≥8/10 (n=27/54), there was a weak negative correlation between the time spent rating the videos and sensitivity (ρ=-0.44, P=.02). For Study 2, the mean score of the participants rating new videos was 6.76/10 (SD 0.59). The average deviation between the crowdsourced answers and gold standard ratings provided by two expert clinical research coordinators was 0.56 (SD 0.51; the maximum possible deviation is 3). All paid crowd workers who scored 8/10 in Study 1 either expressed enjoyment in performing the task in Study 2 or provided no negative comments. For Study 3, the mean score of the participants who completed all questions was 6.67/10 (SD 1.61). There were weak correlations between age and score (r=0.22, P=.014), age and sensitivity (r=-0.19, P=.04), number of family members with autism and sensitivity (r=-0.195, P=.04), and number of family members with autism and precision (r=-0.203, P=.03). A two-tailed t test between the scores of the paid workers in Study 1 and the unpaid workers in Study 3 showed a significant difference (P<.001).
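The per-worker metrics reported above (sensitivity and precision with "Autism" as the positive class, and the mean absolute deviation between crowd and expert ratings) can be computed as in this minimal sketch. The functions and example data are illustrative assumptions, not the authors' analysis code.

```python
def sensitivity(ratings, gold):
    """TP / (TP + FN): fraction of gold 'Autism' clips the rater correctly flagged."""
    tp = sum(r == g == "Autism" for r, g in zip(ratings, gold))
    positives = sum(g == "Autism" for g in gold)
    return tp / positives

def precision(ratings, gold):
    """TP / (TP + FP): fraction of the rater's 'Autism' calls that were correct."""
    tp = sum(r == g == "Autism" for r, g in zip(ratings, gold))
    called = sum(r == "Autism" for r in ratings)
    return tp / called

def mean_abs_deviation(crowd, expert):
    """Average |crowd - expert| over ordinal question ratings (e.g., a 0-3 scale)."""
    return sum(abs(c - e) for c, e in zip(crowd, expert)) / len(crowd)
```

On this view, the reported average deviation of 0.56 on a scale whose maximum deviation is 3 corresponds to the "<20% from professional gold standard raters" figure in the conclusions.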

Conclusions

Many paid crowd workers on AMT enjoyed answering screening questions from videos, suggesting intrinsic motivation, beyond payment, to make quality assessments. Paid crowdsourcing provides promising screening assessments of pediatric autism, with an average deviation of less than 20% from professional gold standard raters, which is potentially a clinically informative estimate for parents. Parents of children with autism likely overfit their intuition to their own affected child. This work provides preliminary demographic data on raters who may have a higher ability to recognize and measure features of autism across its wide range of phenotypic manifestations.

SUBMITTER: Washington P 

PROVIDER: S-EPMC6552453 | biostudies-literature | 2019 May

REPOSITORIES: biostudies-literature


Publications

Validity of Online Screening for Autism: Crowdsourcing Study Comparing Paid and Unpaid Diagnostic Tasks.

Peter Washington, Haik Kalantarian, Qandeel Tariq, Jessey Schwartz, Kaitlyn Dunlap, Brianna Chrisman, Maya Varma, Michael Ning, Aaron Kline, Nathaniel Stockham, Kelley Paskov, Catalin Voss, Nick Haber, Dennis Paul Wall

Journal of Medical Internet Research, 2019 May 23, Issue 5


Similar Datasets

| S-EPMC6096331 | biostudies-literature
| S-EPMC4575153 | biostudies-literature
| S-EPMC7613535 | biostudies-literature
| S-EPMC9245227 | biostudies-literature
| S-EPMC8661776 | biostudies-literature
| S-EPMC4837201 | biostudies-literature
| S-EPMC10044163 | biostudies-literature
| S-EPMC5547290 | biostudies-other
| S-EPMC6059390 | biostudies-literature
| S-EPMC10798499 | biostudies-literature