A web-based framework for multi-modal visualization and annotation of neuroanatomical data

Information

  • Research Project
  • Application ID
    10365435
  • Core Project Number
    RF1MH128776
  • Full Project Number
    1RF1MH128776-01
  • Serial Number
    128776
  • FOA Number
    RFA-MH-19-147
  • Sub Project Id
  • Project Start Date
    9/16/2021
  • Project End Date
    9/14/2024
  • Program Officer Name
    ZHAN, MING
  • Budget Start Date
    9/16/2021
  • Budget End Date
    9/14/2024
  • Fiscal Year
    2021
  • Support Year
    01
  • Suffix
  • Award Notice Date
    9/16/2021

PROJECT SUMMARY/ABSTRACT

Modern experimental approaches allow researchers to collect a variety of whole-brain data from the same animal via different anatomical labels, including tracers, genetic markers, and fiducial marks from recording electrodes. Unfortunately, viewing and analysis methods have not kept pace with the complexity of these datasets, which can be as large as several terabytes. This limitation makes it time- and resource-intensive to view and manipulate light-microscopy data or to share these datasets with distant laboratories. Currently available software solves some aspects of this problem, but no existing program provides a user-friendly way to visualize, annotate, and compare large neuroanatomical datasets across research sites with minimal investment of computational resources.

We propose to develop a web-based tool, named BrainSharer, to allow researchers to access, visualize, align, share, and semi-automatically annotate brain-wide data within a common framework. The foundation for this tool will be provided by Neuroglancer, a generic web-based volumetric viewer first developed at Google and then adapted for use in electron microscopy laboratories. While some of its current features are useful across applications, existing versions of Neuroglancer are not optimized for light-microscopy data. In particular, they do not realize the potential for sharing, viewing, and editing data across multi-laboratory collaborations, such as U19 projects. To enable BrainSharer to serve data rapidly and to save and restore sessions, we will add a modular distributed database to synchronize metadata across laboratories. In addition, we will tailor BrainSharer for light microscopy by displaying data in formats independent of the imaging modality, adding semi-automatic means to segment cell bodies and processes, adding tools for annotation (with special attention to defining cytological boundaries in three dimensions and tracing projection pathways), and adding ways to incorporate auxiliary data such as electrode tracks. We will also integrate alignment tools into BrainSharer, so that separate datasets can be co-registered, visualized, and annotated in the same framework, along with established and emerging atlases.

As test beds for development of BrainSharer, we will use three types of datasets from our U19 projects: whole-brain disynaptic and polysynaptic tracing, activity-based staining with c-fos, and neurovascular data. All software, training datasets, and video tutorials for BrainSharer will be made freely available to the community, hosted on our website, along with a slice histology dataset and an electrophysiology dataset with probes implanted throughout the brain. To orient new users, we will also provide a Jupyter notebook for converting raw, intermediate, and registered light-sheet data, along with detected cells and brain atlases, to the precomputed format, so they can be loaded into BrainSharer. When complete, BrainSharer will make it straightforward for researchers to use their laptops to combine and compare large datasets from different anatomical labels for viewing and analysis relative to reference atlases, and to share this information across performance sites, thus increasing the ease of use and interoperability of big data in neuroscience.
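The abstract mentions a Jupyter notebook for converting light-sheet volumes to the precomputed format used by Neuroglancer-based viewers such as BrainSharer. The grant text does not specify the conversion tooling; the sketch below is a minimal illustration of one common route using the cloud-volume Python package, and the input file name, output path, and voxel resolution are placeholders, not values from the project.

```python
# Illustrative sketch (not the project's notebook): convert a registered
# light-sheet stack to Neuroglancer's precomputed format with cloud-volume.
import numpy as np
import tifffile
from cloudvolume import CloudVolume

# Load a registered stack; a typical multi-page TIFF reads in (z, y, x) order.
stack = tifffile.imread("registered_brain.tif").astype(np.uint16)  # placeholder file

# Describe the layer. Resolution is in nanometers per voxel
# (1.8 x 1.8 x 2.0 um assumed here for illustration).
info = CloudVolume.create_new_info(
    num_channels=1,
    layer_type="image",
    data_type="uint16",
    encoding="raw",
    resolution=[1800, 1800, 2000],
    voxel_offset=[0, 0, 0],
    chunk_size=[128, 128, 64],
    volume_size=stack.shape[::-1],  # precomputed stores dimensions as (x, y, z)
)

vol = CloudVolume("file:///data/precomputed/registered_brain", info=info)  # placeholder path
vol.commit_info()

# CloudVolume indexes as [x, y, z, channel]; transpose the stack to match before upload.
vol[:, :, :] = np.transpose(stack, (2, 1, 0))[..., np.newaxis]
```

The resulting directory can be served over HTTP and opened in the viewer as a precomputed:// data source; the same approach extends to segmentation layers (detected cells, atlas labels) by changing the layer type and data type.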

  • IC Name
    NATIONAL INSTITUTE OF MENTAL HEALTH
  • Activity
    RF1
  • Administering IC
    MH
  • Application Type
    1
  • Direct Cost Amount
    $1,299,123
  • Indirect Cost Amount
    $335,405
  • Total Cost
    $1,634,528
  • Sub Project Total Cost
  • ARRA Funded
    False
  • CFDA Code
    242
  • Ed Inst. Type
    GRADUATE SCHOOLS
  • Funding ICs
    NIMH: $1,634,528
  • Funding Mechanism
    Non-SBIR/STTR RPGs
  • Study Section
    ZMH1
  • Study Section Name
    Special Emphasis Panel
  • Organization Name
    PRINCETON UNIVERSITY
  • Organization Department
    NEUROSCIENCES
  • Organization DUNS
    002484665
  • Organization City
    PRINCETON
  • Organization State
    NJ
  • Organization Country
    UNITED STATES
  • Organization Zip Code
    08543-0036
  • Organization District