
Dataset Information


Lessons Learned from Crowdsourcing Complex Engineering Tasks.


ABSTRACT:

Crowdsourcing

Crowdsourcing is the practice of obtaining needed ideas, services, or content by requesting contributions from a large group of people. Amazon Mechanical Turk is a web marketplace for crowdsourcing microtasks, such as answering surveys and tagging images. We explored the limits of crowdsourcing by using Mechanical Turk for a more complicated task: the analysis and creation of wind simulations.

Harnessing crowdworkers for engineering

Our investigation examined the feasibility of using crowdsourcing for complex, highly technical tasks, in order to determine whether the benefits of crowdsourcing could be harnessed to contribute accurately and effectively to solving complex real-world engineering problems. Of course, untrained crowds cannot be used as a mere substitute for trained expertise. Rather, we sought to understand how crowd workers can be used as a large pool of labor for a preliminary analysis of complex data.

Virtual wind tunnel

We compared the skill of anonymous crowd workers from Amazon Mechanical Turk with that of civil engineering graduate students in making a first pass at analyzing wind simulation data. In the first phase, we posted analysis questions to Amazon crowd workers and to two groups of civil engineering graduate students. In the second phase of our experiment, we instructed crowd workers and students to create simulations on our Virtual Wind Tunnel website to solve a more complex task.
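The abstract does not include the study's task-posting code; as an illustration only, the following minimal Python sketch shows how a first-phase analysis question could be posted as a Mechanical Turk HIT using the boto3 MTurk client. The task URL, reward, and timing values are hypothetical placeholders, not the study's actual settings.

# Minimal sketch (not from the paper): posting an analysis question
# to Mechanical Turk via the boto3 MTurk client. The endpoint, reward,
# and task URL below are illustrative placeholders.
import boto3

mturk = boto3.client(
    "mturk",
    region_name="us-east-1",
    # Sandbox endpoint for testing; omit endpoint_url for the live marketplace.
    endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com",
)

# An ExternalQuestion embeds a requester-hosted task page (here a
# hypothetical wind-simulation analysis form) in an iframe for the worker.
question_xml = """
<ExternalQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchema/2006-07-14/ExternalQuestion.xsd">
  <ExternalURL>https://example.org/wind-analysis-task</ExternalURL>
  <FrameHeight>600</FrameHeight>
</ExternalQuestion>
"""

hit = mturk.create_hit(
    Title="Analyze wind simulation output",
    Description="Answer questions about data from a wind simulation.",
    Keywords="analysis, engineering, survey",
    Reward="0.50",                     # USD per assignment (illustrative)
    MaxAssignments=10,                 # number of distinct workers
    LifetimeInSeconds=7 * 24 * 3600,   # how long the HIT stays listed
    AssignmentDurationInSeconds=3600,  # time allowed per worker
    Question=question_xml,
)
print("HIT ID:", hit["HIT"]["HITId"])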

Conclusions

With a sufficiently comprehensive tutorial and compensation similar to typical crowdsourcing wages, we were able to enlist crowd workers to complete longer, more complex tasks effectively, with competence comparable to that of graduate students who had more comprehensive, expert-level knowledge. Furthermore, more complex tasks require increased communication with the workers, and as tasks grow more complex, the employment relationship comes to resemble outsourcing more than crowdsourcing. Through this investigation, we were able to stretch and explore the limits of crowdsourcing as a tool for solving complex problems.

SUBMITTER: Staffelbach M 

PROVIDER: S-EPMC4575153 | biostudies-literature | 2015

REPOSITORIES: biostudies-literature


Publications

Lessons Learned from Crowdsourcing Complex Engineering Tasks.

Staffelbach Matthew, Sempolinski Peter, Kijewski-Correa Tracy, Thain Douglas, Wei Daniel, Kareem Ahsan, Madey Gregory

PLoS ONE, 2015-09-18, issue 9



Similar Datasets

| S-EPMC150369 | biostudies-literature
| S-EPMC7304729 | biostudies-literature
| S-EPMC7581702 | biostudies-literature
| S-EPMC3976103 | biostudies-literature
| S-EPMC2953368 | biostudies-literature
| S-EPMC3140138 | biostudies-literature
| S-EPMC4581358 | biostudies-other
| S-EPMC4079170 | biostudies-literature