Visually impaired (VI) students are challenged to learn highly visual content in Science, Technology, Engineering, and Math (STEM) courses using nonvisual technologies that convey only limited detail and cannot present information in real time. Though a rising number of VI individuals seek higher education, just under a third have attained a high school diploma. This situation is exacerbated in STEM fields, where much of the content relies heavily on graphical information presented visually. These outcomes are not due to VI individuals being incapable of high academic achievement, but rather to the inadequate technologies and resources provided to them. Consequently, there exists a pressing need for a portable, refreshable, real-time display of graphical information via nonvisual feedback that can be easily integrated within a classroom setting. This Innovation Corps for Learning Team project aims to assess the commercial viability, market potential, and technological impact of a touchscreen-based educational curriculum that enables real-time graphical information presentation through visual, auditory, and vibratory feedback. The proposed innovation leverages commercially available touchscreens and custom software, created in the form of Android applications, to translate visual content displayed on-screen into content that can be felt (through vibrations) and heard (through sound).

The proposed innovation has the potential to bridge this technological gap and open new learning opportunities for VI students. The infrastructure and software in this work will promote a transformed classroom in which students with VI independently interact with their sighted peers and primary classroom teacher, in real time, via touchscreens. Each student would have a touchscreen wirelessly networked to their peers' and teacher's touchscreens (or smart board or laptop). As the teacher draws a graph or figure on his or her input device, the same image will immediately "appear," in a multimodal sense, on all of the students' touchscreens. While sighted students may primarily use the visual display of the information, students with VI can leverage vibratory and/or auditory feedback (using headphones) to explore the graphical content. This work lays the foundation for the creation of touchscreen-based educational curriculum in the form of Android applications that will (1) meet the customer need for a portable, multimodal, real-time platform for graphical information presentation, (2) enable classroom integration through the use of existing hardware, and (3) encourage widespread adoption through distribution in online application markets. The impacts of this work will not only address several barriers to graphical information transfer for VI students (a significantly underrepresented population in STEM), but will also lay the foundation for a universally designed, learner-centered framework for touchscreens that may benefit all people, enhancing the fidelity of learning and creating new dimensions of learning right at our fingertips.
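To make the visual-to-multimodal translation concrete, the sketch below shows one way an Android application might map a finger's position over a drawn figure to vibratory and auditory feedback. It is a minimal illustration using Android's standard Vibrator and ToneGenerator APIs only; the class name, the pixel-based hit test, and the white-background assumption are hypothetical choices for illustration, not the project's actual implementation.

```java
import android.content.Context;
import android.graphics.Bitmap;
import android.graphics.Color;
import android.media.AudioManager;
import android.media.ToneGenerator;
import android.os.Vibrator;
import android.view.MotionEvent;
import android.view.View;

/**
 * Hypothetical sketch: a view that keeps an off-screen bitmap of the
 * shared figure and, on touch, vibrates and beeps whenever the finger
 * is over a drawn pixel, letting a VI student trace the graph by feel
 * and sound. Requires the VIBRATE permission in the app manifest.
 */
public class MultimodalGraphView extends View {
    private final Vibrator vibrator;
    private final ToneGenerator tones =
            new ToneGenerator(AudioManager.STREAM_MUSIC, /* volume */ 80);
    private Bitmap rendered;  // off-screen copy of the drawn figure

    public MultimodalGraphView(Context context) {
        super(context);
        vibrator = (Vibrator) context.getSystemService(Context.VIBRATOR_SERVICE);
    }

    /** Called by the app whenever the shared figure is (re)drawn. */
    public void setRenderedFigure(Bitmap bitmap) {
        rendered = bitmap;
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        if (rendered == null) return false;
        int x = (int) event.getX();
        int y = (int) event.getY();
        boolean inBounds = x >= 0 && y >= 0
                && x < rendered.getWidth() && y < rendered.getHeight();
        // Treat any non-white pixel as part of the figure (an assumption;
        // a real system might hit-test against stroke geometry instead).
        if (inBounds && rendered.getPixel(x, y) != Color.WHITE) {
            vibrator.vibrate(25);                               // brief pulse
            tones.startTone(ToneGenerator.TONE_PROP_BEEP, 25);  // short beep
        }
        return true;
    }
}
```

In a classroom deployment, a VI student wearing headphones could sweep a finger across the screen and perceive the figure's shape from where the pulses and beeps occur, while sighted peers use the same view visually.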
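The real-time sharing step, in which the teacher's drawing "appears" on every student's touchscreen, could be sketched as a simple broadcast of sampled stroke points over the classroom's local network. The one-line text protocol ("strokeId x y"), the port number, and the class name below are illustrative assumptions, not the project's actual wire format.

```java
import java.io.IOException;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.charset.StandardCharsets;

/**
 * Hypothetical sketch: the teacher's device broadcasts each sampled
 * point of the stroke being drawn so that every student tablet on the
 * same network can redraw the figure immediately.
 */
public class StrokeBroadcaster {
    private static final int PORT = 49152;  // assumed classroom port
    private final DatagramSocket socket;
    private final InetAddress broadcastAddress;

    public StrokeBroadcaster(String broadcastIp) throws IOException {
        socket = new DatagramSocket();
        socket.setBroadcast(true);
        broadcastAddress = InetAddress.getByName(broadcastIp);
    }

    /** Sends one sampled point of the stroke the teacher is drawing. */
    public void sendPoint(int strokeId, float x, float y) throws IOException {
        byte[] payload = (strokeId + " " + x + " " + y)
                .getBytes(StandardCharsets.UTF_8);
        socket.send(new DatagramPacket(payload, payload.length,
                broadcastAddress, PORT));
    }
}
```

Each student tablet would run a matching receiver that parses incoming points, redraws the stroke on screen, and hands the updated figure to a multimodal view like the one sketched above, so the visual, vibratory, and auditory presentations stay synchronized in real time.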