Strategyproof peer selection using randomization, partitioning, and apportionment

Haris Aziz, Omer Lev, Nicholas Mattei*, Jeffrey S. Rosenschein, Toby Walsh

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

Peer reviews, evaluations, and selections are a fundamental aspect of modern science. Funding bodies the world over employ experts to review and select the best proposals from those submitted for funding. The problem of peer selection, however, is much more general: a professional society may want to give a subset of its members awards based on the opinions of all members; an instructor for a Massive Open Online Course (MOOC) or an online course may want to crowdsource grading; or a marketing company may select ideas from group brainstorming sessions based on peer evaluation. We make three fundamental contributions to the study of peer selection, a specific type of group decision-making problem, studied in computer science, economics, and political science. First, we propose a novel mechanism that is strategyproof, i.e., agents cannot benefit by reporting insincere valuations. Second, we demonstrate the effectiveness of our mechanism by a comprehensive simulation-based comparison with a suite of mechanisms found in the literature. Finally, our mechanism employs a randomized rounding technique that is of independent interest, as it solves the apportionment problem that arises in various settings where discrete resources such as parliamentary representation slots need to be divided proportionally.
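The apportionment idea mentioned in the abstract can be illustrated with a small, self-contained sketch. The Python snippet below shows a generic systematic randomized-rounding scheme for dividing indivisible units in proportion to fractional quotas; it is offered only as an illustration of the general technique, not as the paper's mechanism, and the function name randomized_apportionment and all implementation details are assumptions of this sketch. It rounds each quota up or down, preserves the total number of units exactly, and matches each quota in expectation.

```python
import math
import random

def randomized_apportionment(quotas, rng=random):
    """Hedged sketch: systematic randomized rounding of fractional quotas.

    Each quota q_i is rounded to floor(q_i) or ceil(q_i), the grand total
    is preserved (the quotas are assumed to sum to an integer), and the
    expected allocation of every entry equals its quota.  This is a
    generic illustration of randomized rounding for apportionment, not
    the exact rule from the paper.
    """
    floors = [math.floor(q) for q in quotas]
    fracs = [q - f for q, f in zip(quotas, floors)]
    leftover = round(sum(fracs))        # number of extra units to distribute

    allocation = list(floors)
    u = rng.random()                    # single uniform offset in (0, 1)
    grid = u                            # grid points u, u+1, u+2, ...
    cumulative = 0.0
    for i, f in enumerate(fracs):
        cumulative += f
        # entry i receives an extra unit iff a grid point lands in its
        # fractional interval; this happens with probability exactly f
        while grid < cumulative and leftover > 0:
            allocation[i] += 1
            leftover -= 1
            grid += 1.0

    # floating-point safety net: hand any unassigned unit(s) to the
    # largest remaining fractional parts
    if leftover:
        for i in sorted(range(len(fracs)), key=fracs.__getitem__, reverse=True):
            if leftover == 0:
                break
            if allocation[i] == floors[i]:
                allocation[i] += 1
                leftover -= 1
    return allocation

# Example: divide 10 parliamentary seats among quotas that sum to 10.
print(randomized_apportionment([3.4, 3.3, 3.3]))   # e.g. [4, 3, 3] or [3, 4, 3]
```

Matching the quotas in expectation while never deviating from them by more than one unit is what makes this style of randomized rounding attractive for settings such as dividing parliamentary representation slots proportionally.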

Original language: English
Pages (from-to): 295-309
Number of pages: 15
Journal: Artificial Intelligence
Volume: 275
DOIs
State: Published - Oct 2019

Bibliographical note

Publisher Copyright:
© 2019 Elsevier B.V.

Keywords

  • Algorithms
  • Allocation
  • Crowdsourcing
  • Peer review
