EAGER: TaskDCL: Dronevision, A Physics-Based Environment using Flying Light Specks

Information

  • Award Id
    2425754
  • Award Effective Date
    9/1/2024
  • Award Expiration Date
    8/31/2026
  • Award Amount
    $300,000.00
  • Award Instrument
    Standard Grant

This EArly-concept Grant for Exploratory Research (EAGER) project will investigate a physics-based environment called Dronevision that will enable its users to see illuminated objects with their naked eyes and to have haptic interactions with them using their bare hands. It will use microdrones to materialize objects and synthetic actors for sensorimotor interaction with one or more human subjects. If successful, this work will fundamentally transform how researchers and the general public think about human-computer interaction. Applications of a Dronevision are diverse, ranging from education to manufacturing, healthcare, and entertainment. An immersive version of a Dronevision has the potential to revolutionize how people work, learn and educate, play and entertain, communicate, and socialize, potentially ushering in a whole new branch of science. The project includes the participation of PhD students, training them on a "high risk-high payoff" research topic with the opportunity to identify interdisciplinary dissertation topics. A conscious effort will be made to recruit students from underrepresented groups.

Dronevision will use miniature-sized drones called Flying Light Specks, configured with light sources, computation, and networking capability, to implement embodied reasoning about objects and their properties. A swarm of drones will illuminate objects and provide physics-based sensorimotor interactions. This project will attempt to address several unsolved research challenges. First, a drone should be configured with light sources bright enough to render the drone itself invisible while flying and while accommodating human interaction (e.g., poking). Second, drones should be able to localize in the Dronevision space with millimeter accuracy. This accuracy is required to provide realistic illuminations and to quantify the force exerted by a human subject accurately. Third, in response to the force exerted by a human, a swarm of drones should exert force back against the user consistent with Newtonian physics, using the laws of motion, mass, and gravity. This project will work to design and develop transformative techniques for these challenges to implement translation, rotation, collision, and friction between objects illuminated in a Dronevision. It will evaluate these techniques quantitatively and, through studies with human participants, assess qualitative tradeoffs from the human perspective.

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
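The third challenge above can be made concrete with a small worked example. The sketch below is hypothetical and not from the award: it shows one simple way a swarm of N drones might split the reaction to a user-exerted force equally while each drone also compensates for its own weight, per Newton's second and third laws. The function name, parameters, and equal-load-sharing assumption are all illustrative.

```python
# Hypothetical sketch (not the project's method): computing per-drone thrust
# so a swarm counters a user-exerted force and compensates for gravity.

G = 9.81  # gravitational acceleration, m/s^2

def per_drone_thrust(user_force_n, drone_mass_kg, num_drones):
    """Split the counter-force to a user's push equally across the swarm.

    Each drone supplies its share of the reaction force (Newton's third
    law) plus enough lift to offset its own weight (F = m * g). Assumes
    equal load sharing among identical drones.
    """
    reaction_share = user_force_n / num_drones  # push back against the user
    weight = drone_mass_kg * G                  # per-drone gravity compensation
    return reaction_share + weight              # total thrust needed, in newtons

# Example: a user pokes with 0.6 N against a swarm of four 20-gram drones.
thrust = per_drone_thrust(0.6, 0.020, 4)  # 0.15 N reaction + 0.1962 N weight
```

In practice the load would not be shared equally: drone placement, contact geometry, and thrust limits would all shape the distribution, which is part of what makes this an open research challenge.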

  • Program Officer
    Alexandra Medina-Borja
    amedinab@nsf.gov
    (703) 292-7557
  • Min Amd Letter Date
    8/5/2024
  • Max Amd Letter Date
    8/5/2024
  • ARRA Amount

Institutions

  • Name
    University of Southern California
  • City
    Los Angeles
  • State
    CA
  • Country
    United States
  • Address
    3720 S FLOWER ST FL 3
  • Postal Code
    90033
  • Phone Number
    (213) 740-7762

Investigators

  • First Name
    Shahram
  • Last Name
    Ghandeharizadeh
  • Email Address
    shahram@usc.edu
  • Start Date
    8/5/2024

Program Element

  • Text
    M3X - Mind, Machine, and Motor
  • Text
    Special Initiatives
  • Code
    164200

Program Reference

  • Text
    HUMAN-ROBOT INTERACTION
  • Code
    7632
  • Text
    EAGER
  • Code
    7916