Perceptions of Efficiency and Bias in Peer Review: Algorithmic versus Human Decision Making

Information

  • NSF Award
  • 2316034
  • Award Id
    2316034
  • Award Effective Date
    9/1/2023
  • Award Expiration Date
    8/31/2025
  • Award Amount
    $399,922.00
  • Award Instrument
    Standard Grant

This project will develop improved methods and concepts to guide the development and application of new digital technologies that could be used in peer review processes for evaluating scientific publishing and funding outcomes. The research team will seek to compare perceptions of peer review decisions assisted by algorithms to those made by humans. The focus of the research is on the ethics and value of using algorithms in peer review. On the one hand, algorithmic peer review serves an instrumental purpose, purportedly offering the ability to make more efficient decisions. On the other hand, algorithms can produce biased and discriminatory decisions, which can raise ethical concerns about their use. This study will expand knowledge of how algorithms relate to the norms, values, and institutional imperatives that dictate how science as a human and machine endeavor should be conducted.

The research team will carry out a factorial survey based on experiments in which research participants are presented with vignettes regarding human and algorithmic peer review decision making. The participants will be asked to assess the legitimacy of each scenario in light of bias and efficiency. The team will employ various techniques, principally multi-level econometric methods, to analyze data drawn from the survey and other sources, including sociology publications. The team will use an institutionalist approach to frame and delineate key concepts and relationships and to formulate research hypotheses that are empirically meaningful and theoretically appealing. The project will offer insights on measuring both pragmatic and moral legitimacy as they pertain to peer review. The project will also provide an adaptable survey tool and approach for gauging perceptions about algorithmic versus human peer review decisions and other scholarly communication activities. Accordingly, it addresses foundational issues in the philosophy of science, sociology of science and technology, and science communication.

This project is jointly funded through the ER2 program by the Directorate for Social, Behavioral and Economic Sciences and the Directorate for Computer and Information Science and Engineering.

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
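
In a factorial vignette survey of the kind described above, each respondent rates several vignettes, so ratings are clustered within respondents and lend themselves to multi-level modeling. The following is a minimal sketch of such an analysis, not the project's actual instrument or data: the variable names (legitimacy, decision_maker, framing, respondent_id) are hypothetical and the data are simulated purely for illustration.

```python
# Sketch of a multilevel (mixed-effects) model for factorial-survey vignette
# ratings, using simulated data and hypothetical variable names.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Simulated data: 200 respondents, each rating 8 vignettes that cross the
# decision maker (human vs. algorithm) with the framing (efficiency vs. bias).
n_respondents, n_vignettes = 200, 8
n_obs = n_respondents * n_vignettes
df = pd.DataFrame({
    "respondent_id": np.repeat(np.arange(n_respondents), n_vignettes),
    "decision_maker": rng.choice(["human", "algorithm"], size=n_obs),
    "framing": rng.choice(["efficiency", "bias"], size=n_obs),
    # Hypothetical legitimacy rating on a 1-7 scale.
    "legitimacy": rng.integers(1, 8, size=n_obs),
})

# Random intercept per respondent captures within-respondent clustering of
# ratings; the fixed effects estimate the crossed vignette factors and their
# interaction.
model = smf.mixedlm(
    "legitimacy ~ decision_maker * framing",
    data=df,
    groups=df["respondent_id"],
)
result = model.fit()
print(result.summary())
```

The interaction term corresponds to the crossed dimensions of a factorial design, so the sketch would show whether the perceived legitimacy gap between algorithmic and human decisions differs under efficiency versus bias framings.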

  • Program Officer
    Jason D. Borenstein (jborenst@nsf.gov, 703-292-4207)
  • Min Amd Letter Date
    7/12/2023
  • Max Amd Letter Date
    7/12/2023
  • ARRA Amount

Institutions

  • Name
    George Mason University
  • City
    FAIRFAX
  • State
    VA
  • Country
    United States
  • Address
    4400 UNIVERSITY DR
  • Postal Code
    22030-4422
  • Phone Number
    703-993-2295

Investigators

  • First Name
    Connie L
  • Last Name
    McNeely
  • Email Address
    cmcneely@gmu.edu
  • Start Date
    7/12/2023
  • First Name
    Laurie
  • Last Name
    Schintler
  • Email Address
    lschintl@gmu.edu
  • Start Date
    7/12/2023
  • First Name
    James
  • Last Name
    Witte
  • Email Address
    jwitte@gmu.edu
  • Start Date
    7/12/2023

Program Element

  • Text
    ER2-Ethical and Responsible Research
  • Text
    HCC-Human-Centered Computing
  • Code
    7367