Expert crowdsourcing marketplaces (e.g., Upwork.com) promise productivity gains for employers and flexible working arrangements for workers. Yet realizing these benefits hinges on a persistent challenge: hiring effectively at scale. Current approaches, such as reputation systems and standardized competency tests, develop weaknesses such as score inflation over time, degrading market quality. This paper presents HirePeer, a novel approach to hiring at scale that leverages peer assessment to elicit honest evaluations of fellow workers' job application materials, which it then aggregates using an impartial ranking algorithm. We report on three studies that investigate both the costs and the benefits of impartial peer-assessed hiring to workers and employers. First, we find that to solicit honest assessments, algorithms must be communicated in terms of their impartial effects. Second, in practice, peer assessment is highly accurate, and impartial rank aggregation algorithms incur only a small accuracy cost in exchange for their impartiality guarantee. Third, workers report that peer-assessed hiring is useful for receiving targeted feedback on their job materials.
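To make the core idea concrete, the sketch below illustrates one simple way rank aggregation can be made impartial: workers are split into two groups, each group is scored only by the other group's assessments, and final ranking slots are pre-committed to groups before any review is read, so no worker's own assessments can move their own position. This is a minimal illustration under assumed names (`impartial_rank`, `assess` are hypothetical), not the algorithms evaluated in the paper, which achieve stronger accuracy guarantees.

```python
import random

def impartial_rank(workers, assess, seed=0):
    """Rank workers from peer assessments so that no worker's own
    reviews can affect their own final position.

    assess(reviewer, candidate) -> float is a hypothetical stand-in
    for a reviewer's numeric assessment of a candidate's materials.
    """
    rng = random.Random(seed)
    pool = list(workers)
    rng.shuffle(pool)  # grouping is fixed before any review is read
    n = len(pool)
    groups = [pool[: n // 2], pool[n // 2:]]

    # Each group is scored only by the *other* group, so a worker's
    # own assessments never feed into their own score.
    ordered = []
    for i, group in enumerate(groups):
        reviewers = groups[1 - i]
        score = {w: sum(assess(r, w) for r in reviewers) for w in group}
        ordered.append(sorted(group, key=score.get, reverse=True))

    # Final positions are pre-committed to groups by alternation
    # (even slots go to group 1, which may be one worker larger), so a
    # worker's attainable slots are independent of their own reviews.
    queues = [iter(ordered[0]), iter(ordered[1])]
    return [next(queues[1 if pos % 2 == 0 else 0]) for pos in range(n)]

# Toy usage: six workers with hidden quality and noisy peer scores.
rng = random.Random(1)
quality = {"a": 6, "b": 5, "c": 4, "d": 3, "e": 2, "f": 1}

def noisy_assess(reviewer, candidate):
    return quality[candidate] + rng.gauss(0, 0.5)

print(impartial_rank(list(quality), noisy_assess))
```

Note that pre-committing slots discards cross-group comparisons, which illustrates the small accuracy cost that impartial aggregation pays for its impartiality guarantee, as described above.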
Citation: Kotturi, Y., Kahng, A., Procaccia, A. D., & Kulkarni, C. (2020). HirePeer: Impartial peer-assessed hiring at scale in expert crowdsourcing markets. In Proceedings of the 34th AAAI Conference on Artificial Intelligence (AAAI 2020) (pp. 2577–2584). AAAI Press. https://doi.org/10.1609/aaai.v34i03.5641