I-Corps: Translation Potential of a Multimodal, Human-Robot Teaching-Learning-Collaboration Framework to Advance Manufacturing Flexibility and Productivity

Information

  • Award Id
    2432565
  • Award Effective Date
    7/1/2024
  • Award Expiration Date
    6/30/2025
  • Award Amount
    $50,000.00
  • Award Instrument
    Standard Grant

Abstract

The broader impact of this I-Corps project is based on the development of innovative human teaching and robot learning methods aimed at advancing robot-assisted manufacturing systems to enhance manufacturing flexibility and productivity for stakeholders. The technology could mitigate the limitations of traditional robot programming and control approaches by developing an easy-to-use teaching-learning-collaboration framework, in which the robot can be efficiently programmed by human demonstrations. This solution enables the human-robot dyad to act in a sociable partnership to facilitate semi-automation of customized, collaborative, industrial tasks, such as manufacturing and food processing. The solution is human-compatible and quickly transferable, allowing industry sectors to rapidly expand automation of their operations without long implementation periods.

This I-Corps project utilizes experiential learning coupled with a first-hand investigation of the industry ecosystem to assess the translation potential of the technology. The solution is based on the development of a multimodal teaching-learning-collaboration framework in which a robot actively learns from human demonstrations and participates with humans in collaborative tasks. In this framework, the human worker can intuitively teach the robot to perform tasks using natural, multimodal information such as natural language, gestures, gaze, and vision sensing. The manufacturing task operations are then characterized through this multimodal information. The robot learns parameterized human demonstrations and builds task strategies using artificial intelligence-driven algorithms. The robot is then able to assist its human partner in shared tasks through its learned knowledge and the developed human-robot collaboration model. The solution could improve robot programming efficiency and collaboration quality for human-robot teams in smart manufacturing and other robot-assisted collaborative contexts.

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
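To make the framework description above more concrete, the sketch below illustrates one plausible way multimodal demonstration frames (speech, gesture, gaze, and vision) could be parameterized into an ordered task strategy. This is an illustrative assumption, not the awardees' actual method or code; all class and function names (DemonstrationFrame, TaskStep, parameterize, learn_strategy) are hypothetical, and the keyword-based action resolution stands in for the AI-driven learning algorithms the abstract refers to.

```python
# Illustrative sketch only: a toy pipeline turning multimodal human
# demonstrations into a parameterized task strategy. Not the project's code.
from dataclasses import dataclass
from typing import Dict, List, Tuple


@dataclass
class DemonstrationFrame:
    """One time step of a human demonstration, fused from several channels."""
    speech: str                                          # e.g. "pick up the red part"
    gesture: Tuple[float, float, float]                  # hypothetical 3-D pointing vector
    gaze_target: str                                     # label of the object the worker looks at
    object_poses: Dict[str, Tuple[float, float, float]]  # object positions from vision sensing


@dataclass
class TaskStep:
    """A parameterized action the robot can later execute or assist with."""
    action: str                      # e.g. "pick", "place", "hold"
    target_object: str
    target_pose: Tuple[float, float, float]


def parameterize(frame: DemonstrationFrame) -> TaskStep:
    """Map one multimodal frame to a parameterized step (toy keyword rules)."""
    if "pick" in frame.speech:
        verb = "pick"
    elif "place" in frame.speech:
        verb = "place"
    else:
        verb = "hold"
    # Resolve the referent: prefer the gazed-at object if vision has localized it.
    if frame.gaze_target in frame.object_poses:
        target = frame.gaze_target
    else:
        target = next(iter(frame.object_poses))
    return TaskStep(action=verb, target_object=target,
                    target_pose=frame.object_poses[target])


def learn_strategy(demo: List[DemonstrationFrame]) -> List[TaskStep]:
    """Collapse a demonstration into an ordered strategy, dropping repeated steps."""
    strategy: List[TaskStep] = []
    for frame in demo:
        step = parameterize(frame)
        if not strategy or step != strategy[-1]:
            strategy.append(step)
    return strategy


if __name__ == "__main__":
    demo = [
        DemonstrationFrame("pick up the red part", (0.1, 0.0, 0.9), "red_part",
                           {"red_part": (0.4, 0.2, 0.05), "fixture": (0.6, 0.0, 0.0)}),
        DemonstrationFrame("place it on the fixture", (0.0, 0.1, 0.9), "fixture",
                           {"red_part": (0.4, 0.2, 0.05), "fixture": (0.6, 0.0, 0.0)}),
    ]
    for step in learn_strategy(demo):
        print(step)
```

In a real system, the keyword rules would be replaced by learned models over the speech, gesture, gaze, and vision channels, and the resulting strategy would feed a human-robot collaboration controller; the sketch only shows the demonstration-to-parameterized-plan structure the abstract describes.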

  • Program Officer
    Molly Wasko, mwasko@nsf.gov, (703) 292-4749
  • Min Amd Letter Date
    6/25/2024
  • Max Amd Letter Date
    6/25/2024
  • ARRA Amount

Institutions

  • Name
    Montclair State University
  • City
    MONTCLAIR
  • State
    NJ
  • Country
    United States
  • Address
    1 NORMAL AVE
  • Postal Code
    07043-1624
  • Phone Number
    (973) 655-6923

Investigators

  • First Name
    Weitian
  • Last Name
    Wang
  • Email Address
    wangw@montclair.edu
  • Start Date
    6/25/2024

Program Element

  • Text
    I-Corps
  • Code
    802300

Program Reference

  • Text
    HUMAN-ROBOT INTERACTION
  • Code
    7632