The broader impact/commercial potential of this project is improvement in cost-efficiency, energy-efficiency, and quality in manufacturing automation, increasing worker productivity and reducing repetitive motion injuries. This integrated visual-tactile system will cost one-third to one-quarter as much as existing systems ($20,000 purchase cost vs. $65,000-$80,000 for existing vision systems), improve the speed and accuracy of current robotic handling systems, and facilitate the automation of repetitive, injury-prone manual tasks. By enabling new robotic applications and increasing productivity in current automation, this solution will help the U.S. maintain a competitive domestic manufacturing sector. In 2009 there were 36,190 logged repetitive motion injuries in the U.S., with a median of 21 days of missed work per injury (U.S. Bureau of Labor Statistics). The immediate commercial applications are in industrial robotics, specifically robotics in agile manufacturing. In the long term, the technology will be applied in personal, healthcare, and military robots. The current market potential for tactile sensors in industrial robots is estimated at $576 million to $1.15 billion and is expected to more than double by 2025.

This Small Business Innovation Research (SBIR) Phase 2 project will result in a combined visual-tactile system that gives robots an integrated sense of touch and vision, much like the hand-eye coordination of humans. It incorporates a technically novel compliant tactile sensing solution: a rubber "skin" that can be molded into any form factor and is inexpensive and durable. This advanced skin technology can resolve the shape of, contact/slip events with, and forces exerted by contacted objects.
It will uniquely fuse visual and tactile information for object handling and pose estimation, resulting in a flexible robotic system that handles objects more as humans do. This approach addresses key weaknesses of vision-based robotic manufacturing, such as occlusion and the dislodging of parts as they are grasped. Current industrial robots are restricted in their ability to handle small, irregularly shaped, soft, or fragile parts, and existing solutions rely on expensive, complex 3D-vision systems or on repetitive manual labor. This solution is two-fold: (1) a new flexible tactile sensor that can be tailored to a wide variety of form factors; and (2) software that fuses the tactile data with a vision system to estimate the pose of objects in pick-and-place tasks.