Rating crowdsourced annotations: evaluating contributions of variable quality and completeness
Crowdsourcing has become a popular, inexpensive means of acquiring data about the Earth and its environment, but the datasets obtained are typically imperfect and of unknown quality. Two common imperfections in crowdsourced data are contributions from cheats or spammers and missing cases. The...
Main Author: |
Format: | Article
Language: | English
Published: | Taylor & Francis, 2013
Online Access: | http://eprints.nottingham.ac.uk/2448/ http://eprints.nottingham.ac.uk/2448/1/Line_22_Rating_crowdsourced_annotations....pdf