Collaborative Research: HNDS-I:SweetPea: Automating the Implementation and Documentation of Unbiased Experimental Designs

Information

  • Award Id
    2318549
  • Award Effective Date
    9/1/2023
  • Award Expiration Date
    8/31/2026
  • Award Amount
    $361,441.00
  • Award Instrument
    Standard Grant

Collaborative Research: HNDS-I:SweetPea: Automating the Implementation and Documentation of Unbiased Experimental Designs

Two important issues facing modern empirical research are transparency (how well others can figure out how the research was done) and replicability (whether the outcomes will be the same if someone else runs the same experiments). Lack of transparency and failures of experiments to replicate stifle scientific progress and lead to mistrust of the scientific method. This project develops an open-source programming language, SweetPea, that automates experimental design. It will help researchers understand and replicate the results of others, minimize errors and bias, and increase the efficiency and accuracy of the entire experimental process. Consequently, this technology will contribute to advancing scientific discoveries, make them more accessible and reliable, and provide researchers in the empirical sciences with a valuable tool to aid their work.

Many replication problems in the behavioral sciences arise from the challenges of implementing accurate and appropriately balanced experimental designs while avoiding confounding factors. Additionally, the lack of clear documentation reduces transparency and the ability to replicate experimental results. The SweetPea programming language is designed to facilitate reproducible experimental design. This project extends SweetPea's core functionality to support additional design strategies, automates the documentation process, and expands the community of users and contributors. SweetPea combines an intuitive interface for the declarative expression of experimental designs with advanced computational algorithms for sampling and analysis. The software ensures that experimental designs are properly implemented without introducing unexpected confounds. In addition, the project leverages large language models for robust documentation of experimental designs. The project includes outreach activities to engage psychologists, neuroscientists, behavioral economists, and machine learning experts. Overall, this project improves the accuracy, transparency, and replicability of experimental designs, offering researchers an accessible and powerful tool for scientific investigation. This project is jointly funded by the Human Networks and Data Science - Infrastructure program and the Established Program to Stimulate Competitive Research (EPSCoR).

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
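
For illustration, here is a minimal sketch of how a fully crossed two-factor design might be declared in SweetPea's Python interface. The factor names and levels are invented for this example, and the API calls (Factor, CrossBlock, synthesize_trials, print_experiments) reflect one recent release of the library, so exact names and signatures may differ across versions.

    from sweetpea import Factor, CrossBlock, synthesize_trials, print_experiments

    # Hypothetical Stroop-like design: two factors, each with two levels.
    color = Factor("color", ["red", "blue"])
    word = Factor("word", ["red", "blue"])

    design = [color, word]     # all factors used in the experiment
    crossing = [color, word]   # factors to fully counterbalance

    # Build a fully crossed block with no additional constraints, then sample
    # one trial sequence that satisfies the crossing without hidden confounds.
    block = CrossBlock(design, crossing, [])
    experiments = synthesize_trials(block, 1)
    print_experiments(block, experiments)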

  • Program Officer
    Patricia Van Zandt (pvanzand@nsf.gov, (703) 292-7437)
  • Min Amd Letter Date
    8/16/2023
  • Max Amd Letter Date
    10/18/2023
  • ARRA Amount

Institutions

  • Name
    Brown University
  • City
    PROVIDENCE
  • State
    RI
  • Country
    United States
  • Address
    1 PROSPECT ST
  • Postal Code
    02912-9127
  • Phone Number
    (401) 863-2777

Investigators

  • First Name
    Michael
  • Last Name
    Frank
  • Email Address
    michael_frank@brown.edu
  • Start Date
    10/18/2023
  • First Name
    Sebastian
  • Last Name
    Musslick
  • Email Address
    sebastian_musslick@brown.edu
  • Start Date
    8/16/2023

Program Element

  • Text
    Program Planning and Policy De
  • Text
    Human Networks & Data Sci Infrastructure
  • Text
    EPSCoR Co-Funding
  • Code
    9150

Program Reference

  • Text
    Human Networks & Data Sci Infrastructure
  • Text
    Established Program to Stimulate Competitive Research (EPSCoR)
  • Code
    9150