Popular crowdsourcing techniques mostly focus on evaluating workers' labeling quality before adjusting their weights during label aggregation. Recently, another cohort of models regards crowdsourced annotations as incomplete tensors and recovers unfilled labels by tensor completion. However, mixed strategies combining the two methodologies have never been comprehensively investigated, leaving them as rather independent approaches. In this work, we propose MiSC (Mixed Strategies Crowdsourcing), a versatile framework integrating arbitrary conventional crowdsourcing and tensor completion techniques. In particular, we propose a novel iterative Tucker label aggregation algorithm that outperforms state-of-the-art methods in extensive experiments.
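The "incomplete tensor" view the abstract refers to can be made concrete with a toy sketch. The snippet below is not the paper's MiSC algorithm or its iterative Tucker aggregation; it is a minimal illustration, under assumed toy data, of encoding crowdsourced annotations as a workers × items × classes tensor with missing entries, filling them with a generic low-rank (SVD-based) completion, and then aggregating labels.

```python
import numpy as np

# Hypothetical toy data (not from the paper): 4 workers, 5 items, 2 classes.
# labels[w, i] is worker w's class for item i, or -1 if unlabeled.
labels = np.array([
    [0, 1, 1, -1, 0],
    [0, 1, -1, 0, 0],
    [-1, 1, 1, 0, 1],
    [0, -1, 1, 0, 0],
])
W, I = labels.shape
K = 2  # number of classes

# One-hot encode into a workers x items x classes tensor; missing entries
# stay zero and are tracked by a mask -- the "incomplete tensor" view.
T = np.zeros((W, I, K))
mask = labels >= 0
T[np.where(mask)[0], np.where(mask)[1], labels[mask]] = 1.0

# Generic low-rank completion sketch (NOT the paper's method): unfold the
# tensor along the class mode and iteratively re-impute the missing fibers
# from a rank-1 SVD approximation.
X = T.reshape(W * I, K).copy()
obs_rows = mask.reshape(-1)
X[~obs_rows] = X[obs_rows].mean(axis=0)      # initial imputation
for _ in range(50):
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    low_rank = (U[:, :1] * s[:1]) @ Vt[:1]   # rank-1 reconstruction
    X[~obs_rows] = low_rank[~obs_rows]       # re-impute only missing fibers

T_hat = X.reshape(W, I, K)

# Aggregate: average the (completed) worker scores per item, then take the
# most likely class for each item.
consensus = T_hat.mean(axis=0).argmax(axis=1)
print(consensus)
```

On this toy matrix the observed one-hot votes dominate the imputed fibers, so the consensus matches the per-item majority vote; the point of completion-based methods is that the recovered entries also carry information when observations are sparse.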
Citation:
Ko, C. Y., Lin, R., Li, S., & Wong, N. (2019). MiSC: Mixed strategies crowdsourcing. In IJCAI International Joint Conference on Artificial Intelligence (Vol. 2019-August, pp. 1394–1400). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2019/193