DESCRIPTION (provided by applicant): Handsight is a mobile phone service that offers an affordable, extensible set of automated sight-assistant functions to millions of blind and low-vision persons at $30/month atop standard voice and data fees. To the user, Handsight is simply a mobile phone application, requiring no special equipment or updating. By aiming a mobile telephone's camera in roughly the right direction and pressing just one button, a Handsight user can snap a picture, send it to the Handsight computer center along with location data, and have a response within seconds. Moving the key computation and data from the handset to web servers cuts costs, eases technology upgrades, and enables the numerous data and technology partnerships needed to rapidly solve a broad spectrum of tasks. To allow widespread adoption, Handsight uses voice, keypad, and vibration for user interaction; for extensibility, it is built on open-source software both on the handset and in the web application infrastructure.

The range of tasks encountered by the blind and low-vision requires an array of components to solve: some tasks are heavily text-oriented and involve no location/navigational feedback (distinguishing and identifying different medicines; finding the phone bill in a stack of letters); some are specifically navigational (locating the exact store entrance, finding the bus stop); yet others are informational (finding out when the next bus is due). Since we aim to provide a broadly useful tool, Handsight must accommodate the whole range of task types. We therefore propose to build an architecture that integrates a set of components addressing the various task types, from text-detection and recognition software to navigational databases. Handsight's application programming interface (API) will enable third parties to add capabilities that solve new tasks. As a web service, Handsight can evolve as computer vision and navigation technologies advance, without requiring users to upgrade handsets or buy new versions of software.

Phase I of this project was funded by the Dept. of Education / NIDRR and completed successfully between 10/01/2007 and 04/01/2008 under grant # H133S070044. It demonstrated not only the viability of the proposed project but also the likely demand for such a service. (NIDRR's immutable deadlines precluded us from pursuing a Phase II with that agency.) Phase II consists of building the cell phone application and the remote processing infrastructure, with APIs that enable plug-in of the basic and future features. Subcontractor Smith-Kettlewell Institute for Eye Research will assure usability by blind and low-vision users; data partner NAVTEQ will provide a range of leading-edge digital map data. Blindsight already owns fast, accurate text detection and recognition software and has experience building scalable web services. A minimum set of features that makes for a system with widespread appeal will be developed and/or licensed, and integrated into the service.
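To make the proposed plug-in architecture concrete, the following is a minimal sketch of how server-side task components might be registered and dispatched for a single photo-plus-location request. All names here (SightRequest, register, handle_request) and the two placeholder handlers are illustrative assumptions for exposition, not Handsight's actual API.

```python
# Hypothetical sketch of the plug-in dispatch pattern described above.
from dataclasses import dataclass
from typing import Callable, Dict, Optional

@dataclass
class SightRequest:
    image: bytes                # JPEG bytes captured by the handset camera
    latitude: Optional[float]   # GPS fix sent along with the photo, if any
    longitude: Optional[float]
    task: str                   # task type chosen by the user, e.g. "read_text"

# Registry mapping task types to server-side components. Third parties
# could add entries here without any change to the handset software.
_handlers: Dict[str, Callable[[SightRequest], str]] = {}

def register(task: str):
    """Decorator that plugs a new capability into the service."""
    def wrap(fn: Callable[[SightRequest], str]):
        _handlers[task] = fn
        return fn
    return wrap

@register("read_text")
def read_text(req: SightRequest) -> str:
    # Stand-in for the text detection/recognition component.
    return "Label reads: take one tablet daily."

@register("next_bus")
def next_bus(req: SightRequest) -> str:
    # Stand-in for a transit lookup against navigational map data.
    return "Route 38 arrives in 4 minutes."

def handle_request(req: SightRequest) -> str:
    """Dispatch one request and return the text spoken back on the handset."""
    handler = _handlers.get(req.task)
    if handler is None:
        return "Sorry, that task is not supported yet."
    return handler(req)

if __name__ == "__main__":
    req = SightRequest(image=b"...", latitude=37.77, longitude=-122.42,
                       task="next_bus")
    print(handle_request(req))  # -> "Route 38 arrives in 4 minutes."
```

In the deployed service, the handler bodies would call the actual recognition and map-data components; the point of the registry pattern is that partners can add new task types on the server side while the one-button handset application stays unchanged.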
PUBLIC HEALTH RELEVANCE: The proposed Handsight service falls squarely within the NEI mission to support research and programs addressing visual disorders and the special health problems and requirements of the blind. It aims to provide maximum mobility and independence to blind and low-vision persons using today's mobile phone infrastructure via automated web services, with a minimum of specialized hardware and training. The 3 million-plus blind and low-vision persons in the U.S. encounter barriers to such activities of daily living as travel, navigation, shopping, reading, and social interaction. The difficulty of recognizing when a friend is nearby, finding a product in a grocery store, making change, or locating a destination building often requires sighted friends or assistants to make these tasks possible. Such help is expensive, difficult to arrange, and increases dependence. For many users, the Handsight system will simplify, or make possible for the first time, a set of tasks that previously required human assistance or special-purpose devices.