Navigation in complex natural environments poses significant challenges for both biological organisms and human-made systems. This project aims to understand how echolocating bats adapt their echolocation calls and flight paths in real time based on the echoes they receive. Bats dynamically modify the timing and structure of their vocalizations to navigate through cluttered environments such as dense forests. By studying the neural circuits responsible for this adaptive control, the researchers aim to uncover principles that can be applied to develop advanced, bioinspired navigation systems for autonomous drones and robots. The research will integrate neurophysiological methods, deep learning, and biomimetic robotics to understand how bats' brains process and react to auditory information. The ultimate goal is to replicate these biological strategies in robotic systems, enhancing their ability to navigate and interact efficiently with complex surroundings. The project also includes educational outreach initiatives to introduce students, especially those from underrepresented groups, to neuroscience and engineering. By developing exhibits and participating in outreach programs, the project aims to foster a deeper appreciation for scientific research and its real-world applications, demonstrating the value of federal funding in advancing knowledge and technology. Through this multifaceted approach, the project seeks to bridge the gap between fundamental science and practical technological advances, highlighting the relevance of studying natural systems to solve modern engineering challenges.<br/><br/>This research investigates the neural dynamics of the auditory-frontal system in echolocating bats. 
The primary objective is to understand how acoustic information from echoes is processed by the auditory cortex (AC) and conveyed to the frontal auditory field (FAF) to control the time-frequency properties of echolocation calls. The project will use multi-electrode silicon probes to record neural activity simultaneously in the AC and FAF of actively vocalizing bats presented with complex targets. This approach aims to describe the neural dynamics of the AC-FAF circuit, providing insight into how auditory information modulates vocal production. Optogenetic perturbations will be used to manipulate AC projections, testing their role in FAF activity and vocal output. Deep learning techniques will be applied to simplify and link the complex inputs (echoes) and outputs (vocalizations) of this neural control system, leading to a computational model. This model will be implemented in a biomimetic system and optimized with deep reinforcement learning for a range of navigation tasks. The research aims to uncover how neural dynamics enable adaptive control of vocalizations, with potential applications in autonomous navigation systems. By integrating neurophysiological data with computational modeling and robotics, the project addresses significant gaps in understanding the neural mechanisms of vocal production and auditory feedback, with the potential to advance both neuroscience and engineering. 
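The abstract does not specify the algorithms or environments that will be used; purely to illustrate the reinforcement-learning idea of steering from echo-like feedback, the toy sketch below trains a tabular Q-learning agent (a much simpler stand-in for deep reinforcement learning) to traverse a two-lane corridor using only a coarse "echo" cue that signals an obstacle directly ahead. The corridor, obstacle position, reward values, and all names are assumptions made for this example, not details from the award.

```python
import random

random.seed(0)

LENGTH = 10          # corridor length (illustrative assumption)
OBSTACLE = (6, 0)    # obstacle at x=6 in lane 0 (illustrative assumption)
GOAL_X = LENGTH - 1

def echo(x, lane):
    """Coarse echo-like cue: 1 if an obstacle occupies the cell directly ahead."""
    return 1 if (x + 1, lane) == OBSTACLE else 0

def step(x, lane, action):
    """action 0 = move forward, action 1 = switch lane. Returns (x, lane, reward, done)."""
    if action == 1:
        return x, 1 - lane, -0.01, False
    if (x + 1, lane) == OBSTACLE:          # collision: penalty, no progress
        return x, lane, -1.0, False
    x += 1
    if x == GOAL_X:                        # reached the end of the corridor
        return x, lane, 1.0, True
    return x, lane, -0.01, False           # small cost per step

Q = {}
def q(s, a):
    return Q.get((s, a), 0.0)

# Tabular Q-learning over (position, lane, echo) observations.
ALPHA, GAMMA, EPS = 0.5, 0.95, 0.2
for _ in range(2000):
    x, lane = 0, 0
    for _ in range(40):
        s = (x, lane, echo(x, lane))
        a = random.randrange(2) if random.random() < EPS else max((0, 1), key=lambda b: q(s, b))
        x, lane, r, done = step(x, lane, a)
        s2 = (x, lane, echo(x, lane))
        target = r + GAMMA * max(q(s2, 0), q(s2, 1)) * (not done)
        Q[(s, a)] = q(s, a) + ALPHA * (target - q(s, a))
        if done:
            break

# Greedy rollout after learning: the agent uses the echo cue to dodge the obstacle.
x, lane, done = 0, 0, False
for _ in range(40):
    s = (x, lane, echo(x, lane))
    a = max((0, 1), key=lambda b: q(s, b))
    x, lane, _, done = step(x, lane, a)
    if done:
        break
print("reached goal:", done)
```

The point of the sketch is only the feedback loop: the agent's observation includes an echo-derived signal, and the learned policy changes behavior (switching lanes) when that signal indicates an obstacle, loosely analogous to a bat adjusting its flight path based on returning echoes.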
This interdisciplinary approach promises to provide comprehensive insights and novel solutions for adaptive navigation in complex environments.<br/><br/>This project is being co-funded by the Neural Systems Cluster within the Division of Integrative Organismal Systems (IOS) in the Biological Sciences Directorate (BIO) and the Division of Information and Intelligent Systems (IIS) in the Directorate of Computer and Information Science and Engineering (CISE).<br/><br/>This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.