Rating crowdsourced annotations: evaluating contributions of variable quality and completeness
Crowdsourcing has become a popular means of acquiring data about the Earth and its environment inexpensively, but the datasets obtained are typically imperfect and of unknown quality. Two common imperfections in crowdsourced data are contributions from cheats or spammers and missing cases. The...
| Main Author: | |
|---|---|
| Format: | Article |
| Published: | Taylor & Francis, 2013 |
| Online Access: | https://eprints.nottingham.ac.uk/2448/ |