Systems and methods for providing dynamic robotic control systems

Abstract
An articulated arm system is disclosed that includes an articulated arm including an end effector, and a robotic arm control system including at least one sensor for sensing at least one of the position, movement or acceleration of the articulated arm, a main controller for providing computational control of the articulated arm, and an on-board controller for providing, responsive to the at least one sensor, a motion signal that directly controls at least a portion of the articulated arm.
Description
BACKGROUND

The invention generally relates to robotics, and relates in particular to robotic control systems that are designed to accommodate a wide variety of unexpected conditions and loads.


Most industrial robotic systems operate in a top-down manner, generally as follows: a controller samples a variety of sensors, and then logic on that same controller computes whether or not to take action. The benefit of this logic flow (usually referred to as “polling”) is that all of the control logic is in the same place. The disadvantage is that in practical robotic systems, the signals are often sampled quite slowly. Also, all sensors must be wired to the control cabinet, leading to long and error-prone cable runs.


A specific example of this traditional architecture would generally be implemented with equipment from a legacy robot supplier such as ABB Robotics, Inc. of Auburn Hills, Mich., Kuka Roboter GmbH of Germany, or Fanuc America Corporation of Rochester Hills, Mich., or by one of their top-tier integrators. All of these suppliers generally encourage the same architecture, and have similar form factors. For example, a welding cell used in an automotive facility might have an ABB IRC5 control cabinet, an ABB IRB2600 1.85 m reach six-degree-of-freedom robot, a Miller GMAW welding unit wired over an industrial bus (DeviceNet/CANbus) to the IRC5, and an end-of-arm tooling package mounting a GMAW torch (e.g., a Tregaskiss Tough Gun). All programming is done on the IRC5; the end effector has no knowledge of the world, and events such as crashes can only be observed or prevented on the IRC5, which is itself quite limited.


Again, in such systems, however, the signals are often sampled relatively slowly and sensors must generally be wired to the control cabinet. There remains a need, therefore, for a robotic control system that is able to efficiently and reliably provide dynamic control and responsiveness to conditions in the environment of the robot.


SUMMARY

In accordance with an embodiment, the invention provides an articulated arm system that includes an articulated arm including an end effector, and a robotic arm control system including at least one sensor for sensing at least one of the position, movement or acceleration of the articulated arm, a main controller for providing computational control of the articulated arm, and an on-board controller for providing, responsive to the at least one sensor, a motion signal that directly controls at least a portion of the articulated arm.


In accordance with another embodiment, the invention provides an articulated arm system including an articulated arm including an end effector, and an articulated arm control system including at least one sensor for sensing at least one of the position, movement or acceleration of the articulated arm, a main controller for providing computational control of the articulated arm, and an on-board controller for providing, responsive to the at least one sensor, a control signal to the main controller.


In accordance with another embodiment, the invention provides a method of providing a control signal to an end effector of an articulated arm. The method includes the steps of providing a main control signal from a main controller to the end effector of the articulated arm, receiving a sensor input signal from at least one sensor positioned proximate the end effector, and at least partially modifying the main control signal responsive to the sensor input signal.


In accordance with a further embodiment, the invention provides a method of providing a control signal to an end effector of an articulated arm. The method includes the steps of providing a main control signal from a main controller to the end effector of the articulated arm, receiving a sensor input signal from a sensor positioned proximate the end effector, and overriding the main control signal responsive to the sensor input signal.





BRIEF DESCRIPTION OF THE ILLUSTRATED EMBODIMENTS

The following description may be further understood with reference to the accompanying drawings in which:



FIG. 1 shows an illustrative diagrammatic view of an end effector used in a robotic system in accordance with an embodiment of the invention;



FIG. 2 shows an illustrative diagrammatic view of an on-board controller used in the end effector of FIG. 1;



FIG. 3 shows an illustrative diagrammatic view of processing steps used by a robotic control system in accordance with an embodiment of the invention;



FIG. 4 shows an articulated arm system in accordance with an embodiment of the invention;



FIG. 5 shows an illustrative block diagram of a robotic control system in accordance with an embodiment of the invention;



FIGS. 6A and 6B show illustrative diagrammatic views of processing steps used by the robotic control system of FIG. 5;



FIG. 7 shows an illustrative diagrammatic view of the articulated arm system of FIG. 4 with the end effector rotated 180 degrees; and



FIGS. 8A and 8B show illustrative diagrammatic views of end effectors for use in further embodiments of the invention.





The drawings are shown for illustrative purposes only.


DETAILED DESCRIPTION

In accordance with an embodiment, the invention provides an architecture for robotic end effectors that allows the end effector to alter the state of the robot. In accordance with certain embodiments, the end effector may observe the environment at a very high frequency and compare local sensor data and observations to a set of formulas or trigger events. This allows for robot-agnostic, low latency motion primitive routines, such as for example move until suction and move until force, without requiring the full response time of the robotic main controller. A robotic end effector is therefore provided that can alter the state of the robot, and that may further be modified during run time based on a variety of control policies. In accordance with further embodiments, the invention provides a multifaceted gripper design strategy for multimodal gripping without tool changers.


A majority of industrial robotic systems execute their programming logic control in one place only: in the robot controller. The robot controller in these systems is often a large legacy controller with an obscure (and sometimes poorly featured) programming language. In contrast, the majority of modern and emerging robotic systems distribute logic between a robot controller and several workstation computers running a modern operating system and software stack, such as the Ubuntu operating system as sold by Canonical Ltd. of Isle of Man, the Linux operating system as provided by The Linux Foundation of San Francisco, Calif., and the ROS robotic operating environment as provided by the Open Source Robotics Foundation of San Francisco, Calif.


A positive aspect of these architectures is that they provide tremendous, even arbitrary, amounts of computing power that may be directed towards problems like motion planning, localization, computer vision, etc. The downsides of this architecture are primarily that going through high-level middleware such as ROS adds significant latency, and evaluating a control policy in a loop may see round trip times of well over 100 ms.


As a unifying solution to this problem, a gripper control system has been developed with onboard electronics, sensors, and actuators, to which the high level logic controlling the system uploads a set of ‘triggers’ at runtime. These are control policies, such as stop the robot when a force above X Newtons is observed, or when an object is observed by the depth sensor, slow down the trajectory. The end effector may then evaluate the policy natively at the kHz level, and trigger actions in situations where the gripper should take action.
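
By way of a non-limiting illustration, the following Python sketch shows one way that such runtime-uploaded triggers might be represented and evaluated on the on-board controller; the class names, sensor keys, thresholds and actions are hypothetical assumptions rather than details of the disclosed system.

from dataclasses import dataclass
from typing import Callable, Dict, List

SensorFrame = Dict[str, float]  # e.g., {"force_z": 3.1, "vacuum_kpa": 40.0, "depth_m": 0.35}

@dataclass
class Trigger:
    name: str
    condition: Callable[[SensorFrame], bool]  # evaluated locally at kHz-level rates
    action: Callable[[], None]                # e.g., stop the robot, slow the trajectory

class OnBoardController:
    def __init__(self) -> None:
        self.triggers: List[Trigger] = []

    def upload(self, triggers: List[Trigger]) -> None:
        # High-level logic installs or replaces the control policies at runtime.
        self.triggers = list(triggers)

    def evaluate(self, frame: SensorFrame) -> None:
        # Called once per sensor sample; fires any trigger whose condition holds.
        for trigger in self.triggers:
            if trigger.condition(frame):
                trigger.action()

# Example policies from the description: stop above a force threshold, and
# slow down when the depth sensor reports a nearby object.
controller = OnBoardController()
controller.upload([
    Trigger("stop_on_force", lambda f: f["force_z"] > 25.0, lambda: print("STOP")),
    Trigger("slow_on_obstacle", lambda f: f["depth_m"] < 0.10, lambda: print("SLOW")),
])
controller.evaluate({"force_z": 3.1, "vacuum_kpa": 40.0, "depth_m": 0.35})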



FIG. 1 shows a portion of an articulated arm assembly that includes a force sensor system 1, on-board control electronics 2, a vacuum end effector 3, a three dimensional depth sensor system 4, an input pressure sensor 5, an output pressure sensor 6, and another vacuum end effector 7. The articulated arm therefore includes on-board control electronics 2 as well as multiple end effectors 3, 7. In certain embodiments, the articulated arm may include a further end effector similar to end effector 3 that is adjacent end effector 3 (and is therefore not shown in FIG. 1).



FIG. 2 shows the on-board control electronics 2, which includes connectors 11 for the force sensors, connectors 12 for the robot, connectors 13 for the pressure sensors, connectors 14 for LEDs such as RGB LEDs, and connector 15 for a microcontroller with serial and wireless connections.


In accordance with an embodiment, the invention provides an articulated arm control system that includes an articulated arm with an end effector, at least one sensor for sensing at least one of the position, movement or acceleration of the articulated arm, a main controller for providing computational control of the articulated arm, and an on-board controller for providing, responsive to the at least one sensor, a control signal to the main controller.



FIG. 3 shows, for example, a pre-programmed robot control routine that begins (step 300), executes a first batch program (step 302), polls sensors for inputs (step 304), executes a second batch program (step 306), polls the sensors again for inputs (step 308), executes a third batch program (step 310), and then ends (step 312). If the system is relying on sensor inputs to cause a change in the program (e.g., to stop due to readings of a force sensor), the system must wait for that sensor to be polled. In accordance with embodiments of the present invention, on the other hand, interrupt signals may be provided to the main robot controller to cause pre-defined specific responses. As diagrammatically shown in FIG. 3, such interrupt signals may be received at any time and immediately processed.
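
A minimal Python sketch of the difference follows, assuming hypothetical step names and timings: in the polled routine a sensor event is only seen at the next polling step, whereas an interrupt-style signal can abort a running batch program immediately.

import threading
import time

interrupt = threading.Event()  # set asynchronously by the on-board electronics

def run_batch(name: str, duration_s: float) -> bool:
    # Run a batch program, aborting early if an interrupt arrives instead of
    # waiting for the next scheduled polling step.
    deadline = time.monotonic() + duration_s
    while time.monotonic() < deadline:
        if interrupt.is_set():
            print(f"{name}: interrupted, executing pre-defined response")
            return False
        time.sleep(0.001)  # ~1 kHz check
    return True

def on_force_spike() -> None:
    interrupt.set()  # e.g., force sensor reading exceeded its threshold

threading.Timer(0.05, on_force_spike).start()  # simulated sensor event
run_batch("batch program 2", duration_s=0.5)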



FIG. 4 shows a robotic system 20 in accordance with an embodiment of the present invention in which the articulated arm portion of FIG. 1 (including the force sensor system 1, on-board control electronics 2, the vacuum end effector 3, the three dimensional depth sensor system 4, the input pressure sensor 5, the output pressure sensor 6, and the other vacuum end effector 7) is attached to further articulated arm sections 22, 24, 26, 28 and 30. The articulated arm section 30 is attached to a robot base 32, which is coupled to a main robot controller 34 by connector cables 36. An interrupt signal may be provided from the on-board control electronics 2 to the main robot controller 34 either by direct wire connection or wirelessly.


This solution conveys several tremendous advantages: First, one may add the advanced behaviors one generates to any robot, as long as the robot complies with a relatively simple API. Second, one may avoid long cable runs for delicate signals, from the end effector to the robot control box (which is often mounted some distance away from a work cell). Third, one may respond to changes in the environment at the speed of a native control loop, often thousands of times faster than going exclusively through high level logic and middleware. Fourth, one may alter these policies at runtime, switching from move until suction to stop on loss of suction, as well as chaining policies.


In accordance with a further embodiment, the invention provides a method of altering or overriding a control signal from a main controller to an end effector.



FIG. 5, for example, shows an implementation of the on-board control electronics 2. The electronics 2 receives, at 40, control signals from the main robot controller 34 (shown in FIG. 4), which cause motors M1, M2, M3 (shown at 42, 44 and 46) and the vacuum (shown at 48) of the articulated arm to move. The motors may be, for example, elbow, wrist and gripper motors of the articulated arm. In the absence of any feedback signals from the environment, the control signals 40 are routed to the appropriate motors for control of the articulated arm in accordance with the program in the main controller.


The electronics 2, however, is also coupled to input sensors including pressure sensors 50, 52 and 54, a camera 56, force/torque sensors 58, 60, a deflection/deformation sensor 62 and a flow sensor 63. These sensors are coupled to an on-board controller 64 that determines whether to send an interrupt signal to the main robotic controller, and determines whether to immediately take action by overriding any of the output signals to motors M1-M3 and the vacuum. This is achieved by having the on-board controller 64 coupled to control junctions 66, 68, 70 and 72 in the control paths of the signals 42, 44, 46 and 48.
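
A minimal sketch of the control-junction arrangement follows, assuming hypothetical class and signal names: each junction passes the main controller's command through unchanged unless the on-board controller has injected an override.

from typing import Dict, Optional

class ControlJunction:
    def __init__(self, label: str) -> None:
        self.label = label
        self.override_value: Optional[float] = None

    def set_override(self, value: Optional[float]) -> None:
        # The on-board controller injects a value, or None to restore pass-through.
        self.override_value = value

    def output(self, main_command: float) -> float:
        # The signal actually delivered to the motor or vacuum actuator.
        return main_command if self.override_value is None else self.override_value

junctions: Dict[str, ControlJunction] = {
    name: ControlJunction(name) for name in ("M1", "M2", "M3", "vacuum")
}

print(junctions["M1"].output(0.8))   # normal operation: 0.8 passes through

junctions["M1"].set_override(0.0)    # e.g., an unexpected force impulse: stop M1
print(junctions["M1"].output(0.8))   # 0.0, regardless of the main command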


The robot, for example, may be working in very cluttered, dynamic environments. In order to manipulate objects in these conditions, much more sensing is needed than a typical, more structured, open-loop robotic system would need. The grippers are therefore instrumented with absolute pressure sensors, a 3D RGBD camera, a force-torque sensor, and suction cup deflection sensing. By sensing and processing the sensor data directly at the wrist via a microcontroller, hardware interrupts may be set (via digital inputs) immediately (at hundreds or thousands of Hz). There is much more overhead in the alternative approach of communicating the sensor data back to the main robotic controller for analysis, which would be significantly slower. Sensing at the wrist allows one to modify robot motion/execution significantly faster, which in turn allows one to move the robot significantly faster, adapting at speeds not possible otherwise. In these dynamic and unpredictable environments, adapting and recovering quickly is vitally important.


The pressure sensors, for example, may provide binary gripping/not-gripping detection, and threshold comparisons (>grip pressure, <required retract pressure, <drop pressure). The pressure sensors may also map material properties/selected grasps to expected pressure readings and, in real time, modify trajectory execution (speeds, constraints) in order to ensure successful transportation. The pressure sensors may also provide real-time monitoring of upstream pressure (pressure from the source) to ensure that the expected air pressure is available, and modify expected suction measurements from downstream accordingly.
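
One possible reading of these comparisons is sketched below in Python, using absolute pressure (a lower reading indicating a stronger seal); the threshold values, units and function names are illustrative assumptions only.

def pressure_state(measured_kpa: float,
                   grip_pressure_kpa: float = 70.0,
                   retract_pressure_kpa: float = 55.0,
                   drop_pressure_kpa: float = 90.0) -> dict:
    # Binary gripping/not-gripping plus the threshold comparisons listed above,
    # expressed against an absolute pressure reading at the suction cup.
    return {
        "gripping": measured_kpa < grip_pressure_kpa,
        "safe_to_retract": measured_kpa < retract_pressure_kpa,
        "at_risk_of_drop": measured_kpa > drop_pressure_kpa,
    }

def trajectory_speed(measured_kpa: float, expected_kpa: float,
                     nominal_speed: float = 1.0) -> float:
    # Slow the trajectory when the seal is weaker (higher absolute pressure)
    # than the reading expected for this material and selected grasp.
    seal_quality = min(1.0, expected_kpa / measured_kpa)
    return nominal_speed * max(0.25, seal_quality)

print(pressure_state(50.0))                        # strong seal on a sealed surface
print(trajectory_speed(75.0, expected_kpa=50.0))   # porous item: slow to ~0.67x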


The camera may be an RGBD camera that provides data regarding environment registration, automated localization of expected environment components (conveyor, out shelves, out-bin stack) to remove hand tuning, and detection of expected/unexpected objects/obstacles in the environment, allowing trajectory execution to be modified accordingly.


The force-torque sensors may provide impulse interrupts. When an unusual or unexpected force or torque is encountered, trajectory execution may be stopped and recovery initiated, where the robot would otherwise have continued its motion in collision with that object, causing damage to the object or the robot. The force-torque sensors may also provide mass/COM estimates. Model-free mass estimates may inform trajectory execution to slow down when dealing with higher mass and inertias at the endpoint, which are more likely to be dropped due to torqueing off. Model-based mass estimates may also be used to ensure quality of grasp above the COM, to make sure that the correct item is grasped, that the item is singulated, and that the item is not damaged (unexpected mass).
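
The sketch below illustrates an impulse interrupt and a model-free mass estimate of the kind described above; the constants, tool weight and speed policy are hypothetical.

G = 9.81  # m/s^2

def impulse_detected(prev_force_n: float, force_n: float,
                     impulse_threshold_n: float = 15.0) -> bool:
    # Stop trajectory execution when an unexpected force step is observed.
    return abs(force_n - prev_force_n) > impulse_threshold_n

def estimated_mass_kg(vertical_force_n: float, tool_weight_n: float) -> float:
    # Model-free estimate of the payload mass from the wrist force reading.
    return max(0.0, (vertical_force_n - tool_weight_n) / G)

def speed_for_mass(mass_kg: float, nominal_speed: float = 1.0,
                   heavy_kg: float = 1.0) -> float:
    # Slow down for higher endpoint mass/inertia, which is more likely to be
    # dropped due to torqueing off.
    return nominal_speed if mass_kg <= heavy_kg else nominal_speed * heavy_kg / mass_kg

print(impulse_detected(4.0, 28.0))                                  # True: interrupt and recover
print(speed_for_mass(estimated_mass_kg(27.0, tool_weight_n=7.4)))   # ~0.5x for a ~2 kg payload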


The deflection/deformation sensor may observe suction cup contact with the environment (typically when one wants to interrupt motion), as the bellows deflect before the pressure readings change and before a noticeable force impulse is displayed. The deflection sensor at its simplest may be used for interrupting motion to avoid robot Force Protective Stops by providing the earliest measurement of contact. The deflection/deformation sensor may also measure the floppiness of the picks, which allows trajectory execution to again be modified in real time, slowing down or constraining the motions to ensure successful transport, or returning the item to the bin if the floppiness exceeds a threshold at which the item may be safely transported.
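
A rough Python sketch of these two uses of the deflection signal is given below; the millimeter thresholds and the floppiness measure are illustrative assumptions, not measured values.

from typing import List

def contact_detected(deflection_mm: float, contact_threshold_mm: float = 1.5) -> bool:
    # Interrupt motion on the first bellows deflection, before pressure or
    # force readings change, to avoid a Force Protective Stop.
    return deflection_mm > contact_threshold_mm

def floppiness(deflection_history_mm: List[float]) -> float:
    # Crude floppiness measure: swing of the deflection signal during transport.
    return max(deflection_history_mm) - min(deflection_history_mm)

def transport_decision(deflection_history_mm: List[float],
                       max_safe_floppiness_mm: float = 6.0) -> str:
    swing = floppiness(deflection_history_mm)
    if swing > max_safe_floppiness_mm:
        return "return item to bin"
    return "slow transport" if swing > max_safe_floppiness_mm / 2 else "normal transport"

print(contact_detected(2.1))                     # True: earliest measurement of contact
print(transport_decision([0.5, 2.8, 4.1, 1.0]))  # "slow transport"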


The flow sensors may detect changes in the amount of airflow as compared to expected air flow values or changes. For example, upon grasping an object, it is expected that the airflow would decrease. Once an object is grasped and is being carried or just held, a sudden increase in air flow may indicate that the grasp has been compromised or that the object has been dropped. The monitoring of weight in combination with air flow may also be employed, particularly when using high flow vacuum systems.
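
The following sketch shows how such airflow comparisons might be expressed, optionally corroborated by a weight reading for high flow vacuum systems; the flow units, fractions and thresholds are assumed for illustration.

def grasp_confirmed(flow_before: float, flow_after: float,
                    min_drop_fraction: float = 0.4) -> bool:
    # Upon grasping an object the airflow is expected to decrease noticeably.
    return flow_after < flow_before * (1.0 - min_drop_fraction)

def grasp_compromised(flow_now: float, flow_at_grasp: float,
                      weight_now_n: float, weight_at_grasp_n: float,
                      rise_fraction: float = 0.5,
                      weight_loss_n: float = 2.0) -> bool:
    # A sudden airflow increase while carrying, corroborated by a loss of
    # measured weight, suggests the grasp is compromised or the object dropped.
    flow_rose = flow_now > flow_at_grasp * (1.0 + rise_fraction)
    weight_fell = (weight_at_grasp_n - weight_now_n) > weight_loss_n
    return flow_rose and weight_fell

print(grasp_confirmed(flow_before=30.0, flow_after=12.0))   # True: seal established
print(grasp_compromised(28.0, 12.0, 7.0, 16.0))             # True: recover or re-grasp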


With reference to FIG. 6A, the program begins (step 600) by applying the end effector to an object at a selected grasp location (step 602). A vacuum is applied to the end effector (step 604), and the sensors are polled (step 606). Responsive to the sensor inputs, the system determines whether it should try to pick up the object (step 608). For example, if too much vacuum flow is detected, the system may determine that the grasp is insufficient for picking up the object. In this case, the system will determine (step 610) whether there have already been too many tries to pick up this particular object (possibly involving the main controller). If there have not already been too many retries, the system may select another grasp location for the object (step 612) and return to step 602 above. If the system determines that there have already been too many retries, the system will select a new object and a new associated grasp location (step 614) and return to step 602 above.
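
A compact Python sketch of this retry logic (steps 602 through 614) follows; the actuation and sensing calls are hypothetical stubs standing in for the hardware, and the flow threshold is illustrative.

def apply_end_effector(grasp):   # step 602: move the cup to the selected grasp point
    pass

def apply_vacuum():              # step 604: apply vacuum to the end effector
    pass

def poll_sensors():              # step 606: read flow (and other) sensor values
    return {"flow": 8.0, "flow_threshold": 10.0}

def acquire_object(objects, max_retries_per_object=3):
    for obj in objects:                                                  # step 614: move to a new object
        for grasp in obj["grasp_locations"][:max_retries_per_object]:    # step 610: retry limit
            apply_end_effector(grasp)
            apply_vacuum()
            readings = poll_sensors()
            if readings["flow"] < readings["flow_threshold"]:            # step 608: seal is sufficient
                return obj["name"], grasp                                # proceed to lift (step 616)
            # too much flow: select another grasp location (step 612)
    return None, None

print(acquire_object([{"name": "item-1", "grasp_locations": [(0.10, 0.20, 0.00)]}]))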


If the system determines that the object should be picked up (step 608), the system will then lift the object (step 616) and read the sensors (step 618). If the orientation of the end effector needs to be adjusted, the system adjusts the orientation of the end effector (step 620), for example to cause a heavy object to be held in tension (vertically) by the end effector, as opposed to a combination of a vertical and horizontal grasp that would cause a shear force to be applied. In another example, the system may choose to hold a lighter object with a combination of a vertical and horizontal grasp to accommodate a high speed rotation movement, so that when the object is being moved, a centrifugal force will be applied in the direction aligned with the grasp of the object. Once the orientation of the end effector is chosen (step 620), the system will choose a trajectory path (step 622), and then begin execution of the trajectory, e.g., the batch program N (step 624).


With reference to FIG. 6B, the execution of the batch program N may begin by polling the one or more sensors for inputs (step 626). If none of the inputs exceeds a defined threshold for the main control command (step 628), e.g., to move along a certain vector, then the system will continue to execute the batch program (step 630) until done (whereupon the system returns to step 614). If the batch program is not done, the system returns to step 626, polling the sensor(s) for inputs. If any of the inputs from the sensor(s) does exceed a threshold (step 628), then the system will determine whether the main control command should be altered (e.g., movement slowed or the path changed) (step 632), and if so, the program will so alter the main control command (step 634). If the main control command is not altered, the system will determine whether the main control command should be overridden (step 636), e.g., because movement of the end effector should be stopped, the object should be put down for a new grasp attempt, or the object has been dropped, in which case the system will proceed to pick up a new object and signal for cleaning by a human that an object has been dropped. In any of these exemplary cases, the program will so override the main control command (step 638). In either case, the system then returns to executing the batch program as either altered or overridden, returning to step 626 until done. If the main control signal for a batch program is changed (altered or overridden), the main controller is also promptly notified.
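
A minimal sketch of this monitoring loop (steps 626 through 638) is shown below; the command representation, thresholds and alter/override policies are hypothetical assumptions chosen only to make the flow concrete.

def should_alter(readings):
    # Hypothetical policy: small excursions only alter the command (step 632).
    return max(readings.values()) < 2.0

def altered(command):
    return {**command, "speed": command["speed"] * 0.5}   # step 634: e.g., slow down

def overridden(command):
    return {**command, "speed": 0.0, "action": "stop"}    # step 638: e.g., stop motion

def execute_batch(steps, read_sensors, threshold, notify_main_controller):
    for command in steps:                            # step 630: continue the batch program
        readings = read_sensors()                    # step 626: poll the sensor(s)
        if max(readings.values()) > threshold:       # step 628: a threshold was exceeded
            command = altered(command) if should_alter(readings) else overridden(command)
            notify_main_controller(command)          # the main controller is promptly notified
        yield command

commands = execute_batch([{"speed": 1.0}], lambda: {"force": 2.5},
                         threshold=1.0, notify_main_controller=print)
print(list(commands))   # [{'speed': 0.0, 'action': 'stop'}]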


In accordance with another embodiment, the invention provides an articulated arm control system that includes an articulated arm with an end effector, at least one sensor for sensing at least one of the position, movement or acceleration of the articulated arm, a main controller for providing computational control of the articulated arm, and an on-board controller for providing, responsive to the at least one sensor, a motion signal that directly controls at least a portion of the articulated arm.



FIG. 7, for example, shows the robotic system 20 of FIG. 4, except that the articulated arm portion of FIG. 1 is rotated with respect to the articulated arm section 22 such that the vacuum end effector 3 is now positioned to engage the work environment, while the vacuum end effector 7 is moved out of the way.


A unique contribution of the articulated arm is its multiple facets for multimodal gripping, e.g., having multiple grippers packaged on a single end effector in such a way that the robot can use different grippers by orienting the end effector differently. These facets can be used in combination as well as individually. Other, more common approaches use tool changers, which switch a single tool out with a different one on a rack. Multimodal gripping of the present invention reduces cycle time significantly compared to tool changers, and also allows multiple aspects of a single end effector to be combined to pick up unique objects.


The gripper designs in the above embodiments, which involve the use of up to three vacuum cups, may be designed specifically for picking items of less than a certain weight, such as 2.2 lbs., out of a clutter of objects, and for grasping and manipulating the bins in which the objects are provided.


The same approach to instrumentation of a vacuum grasping end effector may be applied to any arbitrary configuration of vacuum cups as well. For example, if the robotic system needs to handle boxes such as might be used for shipping, then arbitrary N×M arrangements of the suction cells may be created to handle the weight ranges of such packages. FIG. 8A, for example, shows an end effector 70 that includes a 3 by 3 array of end effector sections 72, each of which includes a vacuum cup 74. Each end effector section 72 may include pressure sensors as discussed above, and each vacuum cup 74 may include a deformation sensor that is able to detect deformation along any of three dimensions. The end effector sections 72 are mounted to a common base 76 that includes a coupling 78 for attachment to an articulated arm.



FIG. 8B shows an end effector 80 that includes a 6 by 6 array of end effector sections 82, each of which includes a vacuum cup 84. Again, each end effector section 82 may include pressure sensors as discussed above, and each vacuum cup 84 may include a deformation sensor that is able to detect deformation along any of three dimensions. The end effector sections 82 are mounted to a common base 86 that includes a coupling 88 for attachment to an articulated arm.


The 3×3 array may, for example, handle packages of up to 19.8 pounds, and the 6×6 array may handle up to 79.2 pounds. Such scaling of end effector sections may be made arbitrarily large, and of arbitrary shapes (if, for example, the known objects to be handled are of a particular shape as opposed to generally square/rectangular).
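
As a worked check of this scaling, assuming roughly 2.2 lbs. of capacity per vacuum cell (the single-item weight quoted earlier), a short computation reproduces the figures above; the function name is illustrative.

def array_capacity_lbs(rows: int, cols: int, per_cell_lbs: float = 2.2) -> float:
    # Total capacity of an N x M array of instrumented vacuum cells.
    return round(rows * cols * per_cell_lbs, 1)

print(array_capacity_lbs(3, 3))   # 19.8 lbs for the 3x3 array
print(array_capacity_lbs(6, 6))   # 79.2 lbs for the 6x6 array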


It is significant that by extrapolating the standard vacuum cell to arbitrary sizes/shapes, such an instrumented end effector may be designed for any given object or class of objects while retaining all of the benefits of the instrumentation of the above embodiments.


Those skilled in the art will appreciate that numerous variations and modifications may be made to the above disclosed embodiments without departing from the spirit and scope of the present invention.

Claims
  • 1. An articulated arm system, comprising: an articulated arm including an end effector; and an articulated arm control system including: at least one sensor for sensing at least one of the position, movement or acceleration of the articulated arm; a main controller configured to automatically provide at least one main control signal that controls movement of the end effector; and an on-board controller mounted on the articulated arm proximate the end effector and the at least one sensor, wherein the on-board controller is configured to automatically provide, responsive to the at least one sensor, a motion control signal to the end effector that overrides the at least one main control signal from the main controller to change the movement of the end effector.
  • 2. The articulated arm system as claimed in claim 1, wherein said motion control signal overrides the at least one main control signal from the main controller using a control junction through which the at least one main control signal passes.
  • 3. The articulated arm system as claimed in claim 2, wherein said motion control signal overrides the at least one main control signal from the main controller to change any of the acceleration, orientation or position of the end effector.
  • 4. The articulated arm system as claimed in claim 1, wherein said at least one sensor includes a plurality of sensors.
  • 5. The articulated arm system as claimed in claim 4, wherein said plurality of sensors include any of flow sensors, pressure sensors, cameras, torque sensors and deformation sensors.
  • 6. The articulated arm system as claimed in claim 1, wherein said end effector includes a plurality of end effector sections, each of which includes a vacuum cup.
  • 7. The articulated arm system as claimed in claim 6, wherein each end effector section includes at least one pressure sensor.
  • 8. The articulated arm system as claimed in claim 6, wherein said end effector sections are provided in an ordered array.
  • 9. The articulated arm system as claimed in claim 6, wherein said end effector sections are provided in a 3 by 3 array.
  • 10. The articulated arm system as claimed in claim 6, wherein said end effector sections are provided in a 6 by 6 array.
  • 11. An articulated arm system, comprising: an articulated arm including an end effector; and an articulated arm control system including: at least one sensor for sensing at least one of the position, movement or acceleration of the articulated arm; a main controller configured to automatically provide at least one main control signal that controls movement of the end effector; and an on-board controller mounted on the articulated arm proximate the end effector and the at least one sensor, wherein the on-board controller is configured to automatically provide, responsive to the at least one sensor, a modifying control signal to the end effector that modifies the at least one main control signal from the main controller to change the movement of the end effector.
  • 12. The articulated arm control system as claimed in claim 11, wherein said at least one sensor includes a plurality of sensors.
  • 13. The articulated arm control system as claimed in claim 12, wherein said plurality of sensors each includes any of flow sensors, pressure sensors, cameras, torque sensors and deformation sensors.
  • 14. A method of controlling an end effector of an articulated arm, comprising: automatically providing a main control signal from a main controller to the end effector of the articulated arm via an on-board control system mounted on the articulated arm proximate the end effector, said main control signal for controlling movement of the end effector; receiving by the on-board control system a sensor input signal from at least one sensor positioned proximate the end effector, said sensor input signal including information related to the position, movement or acceleration of the articulated arm; and at least partially modifying the main control signal by the on-board control system responsive to the sensor input signal to automatically provide a modified main control signal to the end effector from the on-board control system that changes the movement of the end effector.
  • 15. The method as claimed in claim 14, wherein the sensor input signal is coupled to an on-board controller of the on-board control system.
  • 16. The method as claimed in claim 15, wherein the on-board controller is mounted proximate the end effector.
  • 17. A method of controlling an end effector of an articulated arm, comprising: automatically providing a main control signal from a main controller to the end effector of the articulated arm via an on-board control system mounted on the articulated arm proximate the end effector, said main control signal for controlling movement of the end effector; receiving by the on-board control system a sensor input signal from at least one sensor positioned proximate the end effector, said sensor input signal including information related to the position, movement or acceleration of the articulated arm; and overriding by the on-board control system the main control signal from the main controller to the end effector responsive to the sensor input signal to automatically change the movement of the end effector.
  • 18. The method as claimed in claim 17, wherein said at least one sensor includes a plurality of sensors.
  • 19. The method as claimed in claim 18, wherein said plurality of sensors each include any of flow sensors, pressure sensors, cameras, torque sensors and deformation sensors.
  • 20. The method as claimed in claim 17, wherein the sensor input signal is coupled to an on-board controller of the on-board control system.
  • 21. The method as claimed in claim 20, wherein the on-board controller is mounted proximate the end effector.
  • 22. The articulated arm system as claimed in claim 1, wherein said main controller provides a plurality of main control signals to the end effector, and wherein the on-board controller provides a plurality of motion control signals to the end effector that are configured to override one or more of the plurality of main control signals.
  • 23. The articulated arm system as claimed in claim 22, wherein the on-board controller provides the plurality of motion control signals that are configured to modify the one or more main control signals using control junctions.
  • 24. The articulated arm system as claimed in claim 22, wherein at least one of the plurality of motion control signals are configured to override at least one of the plurality of main control signals that control an orientation of the end effector.
  • 25. The articulated arm system as claimed in claim 22, wherein at least one of the plurality of motion control signals are configured to override at least one of the plurality of main control signals that are configured to cause the end effector to stop moving.
  • 26. The articulated arm system as claimed in claim 11, wherein said modifying control signal from the on-board controller modifies the at least one main control signal from the main controller using a control junction through which the at least one main control signal passes.
  • 27. The articulated arm system as claimed in claim 26, wherein said modifying control signal modifies the at least one main control signal from the main controller to change any of the acceleration, orientation or position of the end effector.
  • 28. The articulated arm system as claimed in claim 11, wherein said main controller provides a plurality of main control signals to the end effector, and wherein the on-board controller provides a plurality of modifying control signals that are configured to modify one or more of the plurality of main control signals.
  • 29. The articulated arm system as claimed in claim 28, wherein the on-board controller provides the plurality of modifying control signals that are configured to modify the one or more main control signals using control junctions.
  • 30. The articulated arm system as claimed in claim 28, wherein at least one of the plurality of modifying control signals are configured to modify at least one of the plurality of main control signals that control an orientation of the end effector.
  • 31. The method as claimed in claim 14, wherein the step of at least partially modifying the main control signal involves providing an on-board control signal to a control junction through which the main control signal passes.
  • 32. The method as claimed in claim 14, wherein said method further includes the steps of providing a plurality of main control signals to the end effector, and providing a plurality of modifying control signals by the on-board control system that are configured to modify one or more of the plurality of main control signals.
  • 33. The method as claimed in claim 32, wherein the on-board control system provides the plurality of modifying control signals that are configured to modify the one or more main control signals using control junctions.
  • 34. The method as claimed in claim 32, wherein at least one of the plurality of modifying control signals are configured to modify at least one of the plurality of main control signals that control an orientation of the end effector.
  • 35. The method as claimed in claim 17, wherein the step of overriding the main control signal involves using a control junction through which the main control signal passes.
  • 36. The method as claimed in claim 17, wherein the method further includes the steps of providing a plurality of main control signals to the end effector, and providing a plurality of motion control signals that are configured to override one or more of the plurality of main control signals.
  • 37. The method as claimed in claim 36, wherein the on-board control system provides the plurality of motion control signals that are configured to modify the one or more main control signals using control junctions.
  • 38. The method as claimed in claim 36, wherein at least one of the plurality of motion control signals are configured to override at least one of the plurality of main control signals that control an orientation of the end effector.
  • 39. The method as claimed in claim 36, wherein at least one of the plurality of motion control signals are configured to override at least one of the plurality of main control signals that are configured to cause the end effector to stop moving.
PRIORITY

The present application claims priority to U.S. Provisional Patent Application Ser. No. 62/212,697 filed Sep. 1, 2015 and U.S. Provisional Patent Application Ser. No. 62/221,976 filed Sep. 22, 2015, the disclosures of which are herein incorporated by reference in their entireties.

US Referenced Citations (127)
Number Name Date Kind
4557659 Scaglia Dec 1985 A
4604787 Silvers, Jr. Aug 1986 A
4677778 Sorimachi et al. Jul 1987 A
4786847 Daggett Nov 1988 A
4896357 Hagano et al. Jan 1990 A
5764013 Yae Jun 1998 A
5777267 Szydel Jul 1998 A
5860900 Dunning et al. Jan 1999 A
5865487 Gore et al. Feb 1999 A
6059092 Jerue et al. May 2000 A
6446175 West Sep 2002 B1
6817639 Schmalz et al. Nov 2004 B2
7263890 Takahashi Sep 2007 B2
7313464 Perreault et al. Dec 2007 B1
7474939 Oda et al. Jan 2009 B2
7677622 Dunkmann et al. Mar 2010 B2
8070203 Schaumberger Dec 2011 B2
8874270 Ando Oct 2014 B2
8936291 Yasuda et al. Jan 2015 B2
9061868 Paulsen et al. Jun 2015 B1
9227323 Konolige et al. Jan 2016 B1
9259844 Xu et al. Feb 2016 B2
9266237 Nomura Feb 2016 B2
9283680 Yasuda et al. Mar 2016 B2
9486926 Kawano Nov 2016 B2
9492923 Wellman et al. Nov 2016 B2
9604363 Ban Mar 2017 B2
9687982 Jules et al. Jun 2017 B1
9981379 Youmans et al. May 2018 B1
9999977 Wagner et al. Jun 2018 B2
10007827 Wagner et al. Jun 2018 B2
10118300 Wagner et al. Nov 2018 B2
10315315 Wagner et al. Jun 2019 B2
10335956 Wagner et al. Jul 2019 B2
10399236 Wagner et al. Sep 2019 B2
20010056313 Osborne, Jr. Dec 2001 A1
20020068994 Hong Jun 2002 A1
20020157919 Sherwin Oct 2002 A1
20030075051 Watanabe et al. Apr 2003 A1
20060242785 Cawley Nov 2006 A1
20100040450 Parnell Feb 2010 A1
20100094461 Roth et al. Apr 2010 A1
20100101346 Johnson Apr 2010 A1
20100109360 Meisho May 2010 A1
20100125361 Mougin et al. May 2010 A1
20100175487 Sato Jul 2010 A1
20100180711 Kilibarda et al. Jul 2010 A1
20100234857 Itkowitz Sep 2010 A1
20100241260 Kilibarda et al. Sep 2010 A1
20110176148 Briggs et al. Jul 2011 A1
20110206494 Lockie Aug 2011 A1
20110243707 Dumas et al. Oct 2011 A1
20130006417 Sanders Jan 2013 A1
20130110280 Folk May 2013 A1
20130166061 Yamamoto Jun 2013 A1
20130218335 Barajas et al. Aug 2013 A1
20130232919 Jaconelli Sep 2013 A1
20130245824 Barajas et al. Sep 2013 A1
20130343640 Buehler et al. Dec 2013 A1
20130345872 Brooks et al. Dec 2013 A1
20140005831 Naderer et al. Jan 2014 A1
20140067121 Brooks et al. Mar 2014 A1
20140067127 Gotou Mar 2014 A1
20140088763 Hazan Mar 2014 A1
20140154036 Matttern et al. Jun 2014 A1
20140200711 Douba Jul 2014 A1
20140244026 Neiser Aug 2014 A1
20140298231 Saito et al. Oct 2014 A1
20140305847 Kudrus Oct 2014 A1
20150032252 Galluzzo et al. Jan 2015 A1
20150057793 Kawano Feb 2015 A1
20150073589 Khodl et al. Mar 2015 A1
20150081090 Dong Mar 2015 A1
20150190925 Hoffman et al. Jul 2015 A1
20150203340 Jacobsen et al. Jul 2015 A1
20150224650 Xu et al. Aug 2015 A1
20150298316 Accou et al. Oct 2015 A1
20150306770 Mittal Oct 2015 A1
20150328779 Bowman Nov 2015 A1
20150346708 Mattern et al. Dec 2015 A1
20150352721 Wicks et al. Dec 2015 A1
20150375398 Penn et al. Dec 2015 A1
20150375401 Dunkmann Dec 2015 A1
20160031077 Inaba Feb 2016 A1
20160101526 Saito et al. Apr 2016 A1
20160136816 Pistorino May 2016 A1
20160167227 Wellman et al. Jun 2016 A1
20160176043 Mishra et al. Jun 2016 A1
20160221187 Bradski et al. Aug 2016 A1
20160243704 Vakanski et al. Aug 2016 A1
20160271805 Kuolt et al. Sep 2016 A1
20160347545 Lindbo et al. Dec 2016 A1
20170021499 Wellman et al. Jan 2017 A1
20170036354 Chavan Dafle et al. Feb 2017 A1
20170043953 Battles et al. Feb 2017 A1
20170050315 Henry et al. Feb 2017 A1
20170057091 Wagner et al. Mar 2017 A1
20170080566 Stubbs et al. Mar 2017 A1
20170080579 Wagner et al. Mar 2017 A1
20170087718 Wagner et al. Mar 2017 A1
20170087731 Wagner et al. Mar 2017 A1
20170106532 Wellman et al. Apr 2017 A1
20170120455 Wagner et al. May 2017 A1
20170121113 Wagner et al. May 2017 A1
20170136632 Wagner et al. May 2017 A1
20170157648 Wagner et al. Jun 2017 A1
20170197316 Wagner et al. Jul 2017 A1
20170225330 Wagner et al. Aug 2017 A1
20170305694 McMurrough et al. Oct 2017 A1
20170322561 Stiernagle Nov 2017 A1
20180043527 Koga Feb 2018 A1
20180127219 Wagner et al. May 2018 A1
20180148272 Wagner et al. May 2018 A1
20180265298 Wagner et al. Sep 2018 A1
20180273295 Wagner et al. Sep 2018 A1
20180273296 Wagner et al. Sep 2018 A1
20180273297 Wagner et al. Sep 2018 A1
20180273298 Wagner et al. Sep 2018 A1
20180281202 Brudniok et al. Oct 2018 A1
20180282065 Wagner et al. Oct 2018 A1
20180282066 Wagner et al. Oct 2018 A1
20180312336 Wagner et al. Nov 2018 A1
20180327198 Wagner et al. Nov 2018 A1
20180330134 Wagner et al. Nov 2018 A1
20180333749 Wagner et al. Nov 2018 A1
20190001505 Wagner et al. Jan 2019 A1
20190329979 Wicks et al. Oct 2019 A1
Foreign Referenced Citations (13)
Number Date Country
2928645 Apr 2015 CA
701886 Mar 2011 CH
0317020 May 1989 EP
0613841 Sep 1994 EP
1256421 Apr 2002 EP
1671906 Jun 2006 EP
2181814 May 2010 EP
2960024 Dec 2015 EP
2010034044 Jan 2010 WO
2015162390 Oct 2015 WO
2016070412 May 2016 WO
2017044632 Mar 2017 WO
2018017616 Jul 2017 WO
Non-Patent Literature Citations (8)
Entry
International Search Report and the Written Opinion issued by the International Searching Authority dated Nov. 18, 2016 in related International Application No. PCT/US2016/049935.
Hebert et al. “A Robotic Gripper System for Limp Material Manipulation: Hardware and Software Development and Integration”, Proceedings of the 1997 IEEE International Conference on Robotics and Automation. Albuquerque, Apr. 20-27, 1997; [Proceedings of the IEEE International Conference on Robotics and Automation], New York, IEEE, US, vol. Conf. 14, Apr. 20, 1997 ( Apr. 20, 1997), pp. 15-21.
Moura et al. “Neural Network Based Perturbation Identification Approach for High Accuracy Tracking Control of Robotic Manipulators”, Proceedings of the International Mechanical Engineering Congress and Exposition—IMECE—ASME, XX, XX, Nov. 1, 2003 (Nov. 1, 2003), pp. 1189-1197.
Vittor et al. “A Flexible Robotic Gripper for Automation of Assembly Tasks: A technology study on a gripper for operation in shared human environments”, Assembly and Manufacturing (ISAM), 2011 IEEE International Symposium on, IEEE, May 25, 2011 (May 25, 2011), pp. 1-6.
Liu et al. “Hand-Arm Coordination for a Tomato Harvesting Robot Based on Commercial Manipulator”, 2013 IEEE International Conference on Robotics and Biomimetics (ROBIO), Shenzhen China, IEEE, Dec. 12, 2013 (Dec. 12, 2013) pp. 2715-2720.
International Preliminary Report on Patentability issued by the International Bureau of WIPO in related International Patent Application No. PCT/US2016/049935 dated Mar. 15, 2018.
Office Action issued by the Canadian Intellectual Property Office in related Canadian Patent Application No. 2,997,280 dated Jun. 6, 2019, 4 pgs.
Communication pursuant to Rules 161(1) and 162 EPC issued by the European Patent Office in related European Patent Application No. 16766742.7 dated Apr. 10, 2018, 3 pages.
Related Publications (1)
Number Date Country
20170080571 A1 Mar 2017 US
Provisional Applications (2)
Number Date Country
62221976 Sep 2015 US
62212697 Sep 2015 US