Modern sensors and imaging systems can capture far richer information than ever before, but with this capability come certain challenges. First, the measurements provided by these systems are often not directly interpretable; a specially designed algorithm must process them to extract the information of interest and create, for example, the cleanest possible image. Second, the measurement systems themselves often support multiple configurations for collecting information, and it may not be evident in advance which configuration is best in a given scenario. To address these challenges, this project focuses on (i) developing new theories and algorithms for processing measurements and extracting the underlying information and (ii) providing guidance on how to optimally design such measurement systems and adaptively reconfigure them during the measurement process. Much of the research in this project is general and intended to apply across broad classes of measurement systems, so its advances have significant potential benefits in diverse areas such as biomedical imaging, astronomy, computation, and communications. The project also involves an integrated outreach and education plan focused on promoting accessibility and awareness of STEM concepts for K-12 students through virtual and in-person outreach events.

Pushing the frontiers of what can be measured about the physical world requires going far beyond direct linear measurements. Instead, by design or necessity, many modern sensors acquire only indirect nonlinear or probabilistic observations of some object of interest. This project focuses on three motivating applications that involve such indirect measurements: (i) phase imaging, in which simple imaging systems use defocus to convert (classically unmeasurable) phase information into (measurable) intensity variations; (ii) sparse super-resolution imaging, in which direct imaging (photon counting) is combined with various types of quantum measurements to overcome "Rayleigh's curse" and achieve super-resolution of scenes with multiple point sources; and (iii) quantum state and quantum process tomography, in which probabilistic measurement techniques are used to infer the states of quantum systems and the quantum processes that act on them. Integrated with these applications, the project is organized around two overarching research thrusts: (i) structured inference, in which new representations and algorithms are developed for efficiently recovering low-rank matrices, structured factors, and other parameters of interest from indirect measurements; and (ii) adaptive measurement design, which aims to answer questions such as how to optimally design and configure the measurement system, how to incorporate structured inference algorithms, and how to adaptively adjust the state of the measurement system based on the measurements collected so far. Using atomic norm minimization as a foundation, the project develops new analyses of the nonconvex geometry of low-rank matrix recovery from probabilistic measurements, together with new techniques for sparse density estimation that allow structured factors and other parameters to be inferred from probabilistic measurements.
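To make these two thrusts concrete, here is a minimal sketch on a toy problem, under assumptions the abstract does not itself specify: Gaussian sensing matrices stand in for the project's indirect probabilistic measurements, and an ad hoc batch-and-stop rule stands in for principled adaptive measurement design. The inner routine recovers a low-rank matrix through a nonconvex Burer-Monteiro factorization, the kind of formulation whose geometry the structured-inference thrust analyzes, while the outer loop acquires data only until the estimate stabilizes.

```python
import numpy as np

# Illustrative sketch only. Structured inference: recover a rank-r PSD matrix
# X* from indirect scalar measurements y_i = <A_i, X*> + noise via a nonconvex
# Burer-Monteiro factorization X = U U^T and gradient descent. Adaptive
# measurement design: acquire measurements in batches and stop once the
# estimate stabilizes, spending only as much sensing budget as needed. The
# Gaussian A_i, step size, batch size, and stopping rule are assumptions made
# for illustration, not the project's actual measurement models or algorithms.

rng = np.random.default_rng(0)
n, r = 12, 2                                  # matrix size and rank
U_true = rng.standard_normal((n, r)) / np.sqrt(n)
X_true = U_true @ U_true.T                    # ground-truth low-rank matrix

def measure(batch):
    """Simulate `batch` new indirect measurements of the unknown X_true."""
    A = rng.standard_normal((batch, n, n))    # random sensing matrices A_i
    y = np.einsum('bij,ij->b', A, X_true) + 1e-3 * rng.standard_normal(batch)
    return A, y

def recover(A, y, iters=800, step=0.05):
    """Gradient descent on f(U) = (1/2m) sum_i (<A_i, U U^T> - y_i)^2."""
    U = 0.1 * rng.standard_normal((n, r))     # small random initialization
    for _ in range(iters):
        resid = np.einsum('bij,ij->b', A, U @ U.T) - y
        G = np.einsum('b,bij->ij', resid, A) / len(y)
        U -= step * (G + G.T) @ U             # gradient of f with respect to U
    return U @ U.T

# Adaptive acquisition: keep adding batches until the estimate stops changing.
A, y = measure(60)
X_hat = recover(A, y)
for _ in range(8):
    A_new, y_new = measure(60)
    A, y = np.concatenate([A, A_new]), np.concatenate([y, y_new])
    X_next = recover(A, y)
    converged = np.linalg.norm(X_next - X_hat) < 1e-2 * np.linalg.norm(X_next)
    X_hat = X_next
    if converged:
        break

print('measurements used:', len(y))
print('relative error:', np.linalg.norm(X_hat - X_true) / np.linalg.norm(X_true))
```

The factored parameterization X = U Uᵀ avoids lifting to an n-by-n convex program, which is what makes the nonconvex landscape analysis relevant; in the project's applications the Gaussian model above would be replaced by the physics of the actual measurement system.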
The project also leverages modern machine learning and optimization-based signal recovery techniques to develop a cohesive, general framework that enables indirect measurement systems to adaptively reconfigure themselves during acquisition and ultimately reconstruct signals with enhanced signal-to-noise ratio, improved resolution, and a reduced overall sensing budget.

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.