In the context of virtual reality, creating, perceiving, and editing three-dimensional (3D) shapes are at the core of activities such as product design (creating or evaluating objects for manufacturing or personal fabrication), online shopping (experiencing furniture in a room or trying on clothing), and specialized training (gaining familiarity with a remote tool). Yet today's approaches for interacting with virtual 3D shapes are strictly visual, requiring precise manipulation and interpretation of digital designs on a screen. This project's goal is to create algorithms and interfaces that make 3D modeling easier and more effective, even in the absence of visual cues: auto-correct for 3D drawing, the ability to hear shapes, and the ability to edit 3D shapes verbally. By relying on senses that do not require a screen, namely body awareness and sound, this project aims to untether people from their screens, enabling virtual 3D perception from anywhere. The outcomes of this project are expected to have far-reaching impacts, including increased accessibility for people with visual impairments, enhanced interface techniques for low-visibility scenarios, and new opportunities for underrepresented groups in research and do-it-yourself fabrication.

The research focuses on three main objectives: developing accurate "in-air" 3D drawing tools, designing sonification (conveying information through sound) techniques for non-visual shape perception and editing, and creating verbal 3D shape editing tools and interactions. These aims will be pursued through auto-correct algorithms that account for the limits of proprioceptive (a person's sense of their body pose and movement) accuracy, techniques to sonify shapes based on hand pose, and methods for verbal shape modification. This research sets the stage for future studies on incorporating sound and speech into 3D modeling, as well as non-visual user interfaces.

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.