This EArly-concept Grant for Exploratory Research (EAGER) project will investigate a physics-based environment called Dronevision that will enable its users to see illuminated objects with the naked eye and to have haptic interactions with them using their bare hands. It will use microdrones to materialize objects and synthetic actors for sensorimotor interaction with one or more human subjects. If successful, this work will fundamentally transform how researchers and the general public think about human-computer interaction. Applications of a Dronevision are diverse, ranging from education to manufacturing, healthcare, and entertainment. An immersive version of a Dronevision has the potential to revolutionize how people work, learn and educate, play and entertain, communicate, and socialize, potentially ushering in a whole new branch of science. The project includes the participation of PhD students, training them on a high-risk, high-payoff research topic with the opportunity to identify interdisciplinary dissertation topics. A conscious effort will be made to recruit students from underrepresented groups.

Dronevision will use miniature drones called Flying Light Specks, configured with light sources, computation, and networking capability, to implement embodied reasoning about objects and their properties. A swarm of drones will illuminate objects and provide physics-based sensorimotor interactions. This project will attempt to address several unsolved research challenges. First, a drone should be configured with lights bright enough to render the drone itself invisible while it flies and accommodates human interaction (e.g., poking). Second, drones should localize themselves in the Dronevision space with millimeter accuracy; this accuracy is required both to provide realistic illuminations and to quantify the force exerted by a human subject accurately. Third, in response to the force exerted by the human, a swarm of drones should exert force back against the user in a manner consistent with Newtonian physics, obeying the laws of motion, mass, and gravity. This project will work to design and develop transformative techniques that address these challenges and implement translation, rotation, collision, and friction between objects illuminated in a Dronevision. It will evaluate these techniques quantitatively and, through studies with human participants, assess the qualitative tradeoffs from the human perspective.

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
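
As an illustrative aside on the third challenge, the sketch below shows one simple way a swarm could compute a Newtonian counter-force to a user's poke: a spring-damper contact model converted to per-drone accelerations via F = ma. This is not the project's method; every function name, constant, and parameter here is a hypothetical assumption chosen only to make the physics concrete.

```python
# Illustrative sketch (not from the award): a spring-damper model of the
# reaction force an illuminated drone surface might apply when a user pokes it.
# All names and constants are hypothetical placeholders.

import numpy as np

def reaction_force(penetration_mm, penetration_velocity_mm_s,
                   stiffness_n_per_mm=0.05, damping_n_s_per_mm=0.002):
    """Spring-damper contact model: force opposing the user's poke, in newtons."""
    # Hooke-like restoring term plus a damping term; clamp to push-only contact.
    force_n = (stiffness_n_per_mm * penetration_mm
               + damping_n_s_per_mm * penetration_velocity_mm_s)
    return max(force_n, 0.0)

def drone_accelerations(total_force_n, drone_masses_kg):
    """Split the counter-force equally across the swarm and convert each share
    to a commanded acceleration via Newton's second law (a = F / m)."""
    masses = np.asarray(drone_masses_kg, dtype=float)
    share_n = total_force_n / len(masses)   # equal force share per drone
    return share_n / masses                 # m/s^2 commanded per drone

if __name__ == "__main__":
    # Example: a user pokes the illuminated surface 3 mm deep at 10 mm/s.
    f = reaction_force(penetration_mm=3.0, penetration_velocity_mm_s=10.0)
    accels = drone_accelerations(f, drone_masses_kg=[0.025] * 4)  # four 25 g drones
    print(f"counter-force: {f:.3f} N, per-drone accelerations (m/s^2): {accels}")
```

In such a model the millimeter-accurate localization named as the second challenge would supply the penetration depth and velocity; without it, the computed counter-force would be unreliable.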