This project continues the development and expansion of Project VICTOR, the Volcanology Infrastructure for Computational Tools and Resources, which provides a novel cyberinfrastructure serving the volcanology community. Volcanology is transforming into a computer-savvy, data-driven, quantitative discipline that requires a matching cyberinfrastructure. Specifically, VICTOR provides a platform for executing numerical simulations of volcanic processes, including lava flows, ash and tephra dispersal, and pyroclastic density currents. During the preceding pilot phase, the VICTOR platform was initiated and a preliminary set of codes and workflows was developed. The new project will expand VICTOR's capabilities by collaborating with model and database developers in the community to connect their tools to the platform. The central purpose of VICTOR is to catalyze the volcanology modeling community to advance model quality and accessibility and to promote model literacy and broader collaboration. Thus, the project will place special emphasis on education and training. It will take a multi-faceted approach that combines: (1) inclusion and integration of community software codes, (2) workshops, (3) educator training and teaching modules for undergraduate- and graduate-level classes, and (4) establishment of a community governance structure and effective communication channels.

VICTOR is built on a JupyterHub platform, with access through a central web portal. All components are cloud-based, allowing demand-based resource management, workflow portability, and reproducibility, and offering high-performance computing to a broader community. The project will develop computational workflows that use new capabilities and libraries of models to simplify model verification, validation, and benchmarking, and to streamline access to required external datasets, such as topography and environmental conditions, through public Application Programming Interfaces (APIs). Workflows will use modern computing tools such as Jupyter Notebooks, minimizing the time-intensive steps of locating, installing, running, and testing models. Workflows will also standardize model inputs and outputs, facilitating studies of linked- and multi-hazard scenarios. The reproducibility and reliability of the modeling process will be enhanced through capabilities to save, re-run, edit, and test workflows. Ultimately, the combination of open-access models, data science tools, and low-barrier access to provisioned computing resources will increase usability by the community and accelerate the transition to a culture of open science.

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
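To make the workflow idea concrete, the following minimal sketch illustrates how a single notebook step might retrieve topography through a public API. It is an assumption for illustration only, not VICTOR's actual implementation: the OpenTopography global-DEM endpoint stands in for whatever data services the platform ultimately uses, and the fetch_dem helper is hypothetical.

```python
# Illustrative sketch only (not VICTOR's actual code): fetch a digital
# elevation model (DEM) for a volcano's bounding box from a public API,
# as one standardized input step in a Jupyter-based workflow.
# The OpenTopography endpoint and parameters are assumptions for
# demonstration; any public topography API could play this role.
import requests

def fetch_dem(south, north, west, east, api_key, demtype="SRTMGL1"):
    """Download a GeoTIFF DEM for the given bounding box (degrees)."""
    url = "https://portal.opentopography.org/API/globaldem"
    params = {
        "demtype": demtype,              # global DEM product to sample
        "south": south, "north": north,  # latitude bounds
        "west": west, "east": east,      # longitude bounds
        "outputFormat": "GTiff",
        "API_Key": api_key,
    }
    resp = requests.get(url, params=params, timeout=120)
    resp.raise_for_status()
    return resp.content                  # raw GeoTIFF bytes

# Hypothetical usage: topography around Mount Etna
# dem_bytes = fetch_dem(37.6, 37.9, 14.8, 15.2, api_key="YOUR_KEY")
# with open("etna_dem.tif", "wb") as f:
#     f.write(dem_bytes)
```

Encapsulating data access in a step like this is what lets a saved workflow be re-run, edited, and tested reproducibly: the topography a simulation used is fully determined by the recorded bounding box and DEM product rather than by a file someone downloaded by hand.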