Authors
Travis D Breaux, Florian Schaub
Publication date
2014/8/25
Conference
2014 IEEE 22nd International Requirements Engineering Conference (RE)
Pages
163-172
Publisher
IEEE
Description
Natural language text sources have increasingly been used to develop new methods and tools for extracting and analyzing requirements. To validate these new approaches, researchers rely on a small number of trained experts to perform a labor-intensive manual analysis of the text. The time and resources needed to conduct manual extraction, however, have limited the size of case studies and thus the generalizability of results. To begin to address this issue, we conducted three experiments to evaluate crowdsourcing a manual requirements extraction task to a larger number of untrained workers. In these experiments, we carefully balance worker payment and overall cost, as well as worker training and data quality, to study the feasibility of distributing requirements extraction to the crowd. The task consists of extracting descriptions of data collection, sharing, and usage requirements from privacy policies. We …
Total citations
[Citations-per-year chart, 2014–2024]