This EArly-concept Grant for Exploratory Research (EAGER) award will leverage cutting-edge technology, including virtual reality and wearable devices, to study how individuals with visual impairment navigate through crowds. Most individuals with visual impairment live in urban areas, where job opportunities and healthcare services are more accessible. Navigating dynamic city environments, particularly crowded spaces such as subway stations and bustling streets, is challenging for individuals with visual impairment, who rely on auditory and tactile cues for navigation. These challenges are exacerbated in emergency situations, where swift and accurate responses become essential for physical safety. By creating simulated urban environments enriched with advanced auditory and haptic cues in virtual reality, the project aims to elucidate the sensorimotor interactions and cognitive processes underpinning the navigation of persons with visual impairment through crowded spaces. The insights gained from this research will inform the development of wearable technologies that enhance the safety and independence of individuals with visual impairment, thereby promoting their health, prosperity, and welfare.

This research project fills crucial knowledge gaps in understanding how persons with visual impairment navigate dynamic urban environments. By studying their cognition and behavior in immersive virtual reality settings, the project will provide insights for developing better assistive technologies and emergency resilience strategies. The lived experiences of patients with glaucoma will inform the design of multisensory tasks that reflect real-world challenges faced by persons with visual impairment. In these realistic virtual reality environments, participants will interact with synthetic actors that form a virtual crowd and experience various urban scenarios enriched with visual, tactile, and auditory cues. The project will emphasize physics-based integration of these cues within the virtual reality environment to create immersive experiences mirroring real-world challenges. All three sensory cues will play a vital role in guiding navigation decisions and interactions within virtual crowds. Data collected in the experiments will be analyzed with a novel network-inference approach to identify the cognitive and behavioral processes each participant undergoes to make effective and efficient navigation decisions.

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.