Robot system and method for controlling a robot system

Information

  • Patent Grant
  • Patent Number
    11,040,455
  • Date Filed
    Monday, October 10, 2016
  • Date Issued
    Tuesday, June 22, 2021
  • Inventors
  • Original Assignees
    • HADDADIN BETEILIGUNGS UG
  • Examiners
    • Kiswanto; Nicholas
  • Agents
    • Dilworth IP, LLC
Abstract
The present invention relates to a robotic system having at least one robotic arm, a control unit for controlling the robotic arm and a robotic arm sensor system, wherein the controller and robotic arm sensor system are designed to respond to predetermined haptic gestures of the user acting on the robotic arm in such a way that the robotic system performs at least one predetermined operation associated with the haptic gesture.
Description
RELATED APPLICATIONS

This application is a national stage filing under 35 U.S.C. § 371 of International Patent Application Serial No. PCT/EP2016/074250, filed Oct. 10, 2016, entitled “Robot System and Method for Controlling a Robot System,” which claims priority to German application serial number 10 2015 012 959.7, filed Oct. 8, 2015. The entire contents of these applications are incorporated herein by reference in their entirety.


The present invention relates to a robotic system and a method for controlling a robotic system.


Usually, robotic systems are programmed in such a way that an operator enters the necessary commands for the robotic system, e.g. as part of the programming or parameterization of the robotic system, by means of a separate device, for example a computer, via a corresponding user interface.


Furthermore, it is known to provide separate input devices directly on the robotic system, for example switches or keys mounted directly on a robotic arm, by means of which different operations can be activated on the robotic arm and then performed. Such operations include, e.g., movement of the robotic arm in space, emergency shutdown in case of danger, confirmation of a position, etc. In such cases, the operator always works in the immediate vicinity of the robotic system.


In addition, in direct human-robot collaboration (HRC), usually only the positioning of the end effector of the robotic arm is performed by haptic interaction between human and robot, in which the human manually moves the robotic arm to the desired position.


Any further and usually more complex interaction is currently carried out only via additional input devices, as is the case with the programming or parameterization of the robotic system.


For this type of collaboration, the user must repeatedly switch back and forth between guiding the robot and actuating the additional input device. This is cumbersome and can also lead to dangerous situations, as the user has to concentrate on the input devices again and again and thereby lose sight of the potentially moving robot.


Based on this, it is an object of the present invention to expand the interaction between humans and a robotic system and thereby to provide an improved HRC environment. In addition, a simpler operability of the robotic system shall be provided to control various operations that are performed either by the robotic system itself or in a functional connection with it.


This object is achieved with a robotic system having the features according to claim 1 and with a method for controlling a robotic system having the features according to claim 10.


The invention thus proposes a robotic system comprising at least one robotic arm, a control unit for controlling the robotic arm, and a robotic arm sensor system, wherein the control unit and the robotic arm sensor system are adapted to respond to predetermined haptic gestures of the operator acting on the robotic arm, so that the robotic system performs at least one predetermined operation associated with the haptic gesture.


For this purpose, the assignments between the possible haptic gestures and the associated operations are stored in a memory that cooperates with the control unit.
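
By way of illustration only — the patent discloses no particular data structure — such stored assignments could take the form of a simple lookup table in which each predetermined gesture identifier is mapped to the operation it triggers; all names below are hypothetical:

```python
# Minimal sketch of an assignment table between predetermined haptic
# gestures and operations, held in a memory cooperating with the control
# unit. All identifiers are illustrative assumptions.

def start_program():
    print("starting predetermined operation")

def emergency_stop():
    print("stopping all motion")

def confirm_position():
    print("position confirmed")

# Assignments: gesture identifier -> operation to perform.
GESTURE_OPERATIONS = {
    "push_z": start_program,        # light push on the distal member
    "strong_blow": emergency_stop,  # hard strike anywhere on the arm
    "push_y": confirm_position,     # lateral press as acknowledgment
}

def dispatch(gesture_id: str) -> None:
    """Look up the detected gesture and perform the associated operation."""
    operation = GESTURE_OPERATIONS.get(gesture_id)
    if operation is not None:
        operation()

dispatch("push_z")  # -> starting predetermined operation
```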


The invention is preferably, but not exclusively, directed to robotic systems designed for manual guidance by an operator. Such systems have integrated force and torque sensors which, according to the invention, can also be adapted to detect the external forces and torques exerted by an operator's haptic gestures on the robotic arm, or only on individual arm members, and to forward them to the control unit.


The forces and torques can be detected by various sensors, such as force-measuring sensors, torque sensors, etc., which are located in the joints, in the base, and/or generally in the structure of the robotic system. It is also conceivable that the housing structure of a robotic arm is at least partially covered with a tactile skin, which allows input via one or more fingers.


In one embodiment, the control unit may accordingly be designed to assign the forces and/or moments generated by the haptic gestures to different operations depending on their respective directions and/or their respective time course.


In a further embodiment, the control unit may be configured to assign the forces and/or moments generated by the haptic gestures to different operations depending on their respective magnitudes.


It is also possible for the control unit to be designed to assign different chronological sequences of haptic gestures to different operations, which are then carried out in a logical order.
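
A minimal sketch of how a control unit might discriminate gestures by direction, magnitude, and chronological sequence follows; the thresholds, labels, and the two-gesture sequence are assumptions for illustration, not taken from the patent:

```python
# Illustrative classifier assigning a measured force vector to a gesture
# label by direction and magnitude, and a short gesture history to a
# sequenced operation. All values are invented for this sketch.
import math

def classify_direction(fx: float, fy: float, fz: float,
                       min_force: float = 5.0) -> str | None:
    """Return a gesture label for the dominant force direction, or None
    if the force magnitude is below the detection threshold."""
    magnitude = math.sqrt(fx * fx + fy * fy + fz * fz)
    if magnitude < min_force:
        return None
    axis, value = max((("x", fx), ("y", fy), ("z", fz)),
                      key=lambda av: abs(av[1]))
    sign = "+" if value > 0 else "-"
    return f"push_{axis}{sign}"

def classify_sequence(history: list[str]) -> str | None:
    """Map a chronological sequence of gestures to a compound operation."""
    if history[-2:] == ["push_z-", "push_z-"]:
        return "double_tap_confirm"
    return None

print(classify_direction(0.5, 1.0, -12.0))        # push_z-
print(classify_sequence(["push_z-", "push_z-"]))  # double_tap_confirm
```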


The invention has the advantage that the sensor system already present in such robotic systems is used not only to support mere position control of the end effector and trajectory control of the robotic arm, both during operation and during programming, but can also detect the forces and torques deliberately generated by an operator by means of haptic gestures, such as pulling, pressing, pushing or turning, and supply them to a further, more abstract interaction between humans and the robotic system.


In a preferred embodiment, the robotic system comprises a display device with a graphical user interface, in its simplest form a computer connected to the robotic system for control or programming purposes. According to the invention, the haptic gestures can then be assigned to a control on the graphical user interface. The control can take place in two or three dimensions. In this embodiment, the operator can move a cursor or navigate a menu on the user interface by targeted, spatially directed pressure on the robotic arm.
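
As a sketch of such a control, assuming made-up force thresholds and menu entries, lateral pressure could scroll through a menu while a push in the Z-direction confirms the highlighted entry:

```python
# Sketch of mapping directed pressure on the arm to menu navigation on a
# graphical user interface. Force readings and menu entries are invented
# for illustration; the patent does not fix any particular mapping.
MENU = ["move to pose", "open gripper", "close gripper", "teach point"]

def navigate(selected: int, fy: float, fz: float,
             threshold: float = 5.0) -> tuple[int, str | None]:
    """Advance the highlighted menu entry on lateral pressure (Y) and
    confirm the current entry on a push in the Z-direction."""
    if fy > threshold:
        selected = (selected + 1) % len(MENU)   # push right: next entry
    elif fy < -threshold:
        selected = (selected - 1) % len(MENU)   # push left: previous entry
    if fz < -threshold:
        return selected, MENU[selected]         # press down: confirm
    return selected, None

idx, chosen = navigate(0, fy=8.0, fz=0.0)     # scroll to "open gripper"
idx, chosen = navigate(idx, fy=0.0, fz=-9.0)  # confirm it
print(chosen)  # open gripper
```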


The operator does not necessarily need the help of a graphical user interface in order to receive corresponding confirmations or feedback on his actions from a programming system.


Thus, it can be provided according to the invention that the control unit is designed to generate feedback in response to the haptic gesture.


The feedback can be auditory and/or visual.


Preferably, however, this feedback can also be made haptically detectable by the operator. In such a case, the control unit is configured to vary the degree of compliance for the mobility of the robotic arm while the haptic gesture is applied to it.


For example, if the operator pushes laterally against an arm member of the robotic arm, the resistance of the robotic arm to the pressure exerted by the haptic gesture can be increased successively to provide the operator with appropriate feedback. The increase in resistance can be effected by corresponding control of drive units arranged in the joints between the arm members of the robotic arm.
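
A minimal sketch of this kind of haptic feedback, assuming a simple spring law and invented gain values, could ramp up the commanded stiffness cycle by cycle while the gesture persists:

```python
# Sketch of haptic feedback by varying compliance: while a lateral gesture
# is applied, the commanded joint stiffness is ramped up so the operator
# feels growing resistance. All constants are illustrative assumptions.
def resisting_torque(stiffness: float, displacement: float) -> float:
    """Simple spring law: the drive units counteract the displacement."""
    return -stiffness * displacement

stiffness = 50.0      # initial stiffness in N*m/rad (assumed)
max_stiffness = 400.0
ramp = 1.5            # multiplicative increase per control cycle

displacement = 0.05   # rad, held constant by the operator's push
for cycle in range(5):
    torque = resisting_torque(stiffness, displacement)
    print(f"cycle {cycle}: stiffness={stiffness:6.1f}, torque={torque:6.2f}")
    stiffness = min(stiffness * ramp, max_stiffness)
```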


The feedback generated thereby can then serve as an orientation in more complex interactions, such as haptic menus, or as a type of acknowledgment when a predetermined haptic gesture has been detected by the control unit.


The invention further relates to a method for controlling a robotic system having at least one robotic arm, a control unit for controlling the robotic arm, and a robotic arm sensor system, comprising the steps of:


manipulating the robotic arm by at least one predetermined haptic gesture;


generating signals by the robotic arm sensor system in response to the predetermined haptic gesture;


transmitting the signals to the control unit; and


assigning the signals to different predetermined operations by the control unit.


Further steps of the method according to the invention comprise:


generating at least one feedback in dependence on the different predetermined operations; and


adjusting the compliance of the robotic arm depending on a predetermined haptic gesture (one possible realization of these steps is sketched below).
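
One possible, purely illustrative realization of these steps — with every interface name and threshold invented for the sketch — is:

```python
# Illustrative pipeline for the claimed steps: the sensor system produces
# signals in response to a haptic gesture, the control unit assigns them
# to predetermined operations and generates feedback.
from dataclasses import dataclass

@dataclass
class SensorSignal:
    force: tuple[float, float, float]   # external force on the arm member
    member: int                         # index of the touched arm member

def sense(raw_force, member=20) -> SensorSignal:
    """Step 2: generate a signal in response to the haptic gesture."""
    return SensorSignal(force=raw_force, member=member)

def assign(signal: SensorSignal) -> str:
    """Step 4: assign the signal to a predetermined operation."""
    fx, fy, fz = signal.force
    if abs(fz) > 20.0:
        return "emergency_stop"
    if fz < -5.0:
        return "start_command"
    return "no_operation"

def feedback(operation: str) -> None:
    """Further step: acknowledge the detected operation to the operator."""
    print(f"feedback: {operation}")

# Steps 1 and 3 (manipulating the arm, transmitting the signal) are
# represented here by constructing the signal and passing it along.
signal = sense(raw_force=(0.0, 0.0, -8.0))
feedback(assign(signal))   # feedback: start_command
```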


It will become clear that the invention uses components already existing in the robotic system in a simple manner in order to expand the area of interactions for such a robotic system.


In principle, there are no limits to the way in which predetermined haptic gestures can be assigned to various predetermined operations.


The principle of haptically controlled guidance of the robotic arm, while the robotic arm itself functions as an input device, can be used for various interactions.


Thus, such a control can be used, for example, in the programming of the robotic system and also in the parameterization of individual operations to be performed, which can reduce setup times and thereby minimize costs for such robotic systems.


In addition, it is conceivable that the robotic system is haptically controlled during operation in an HRC environment. For example, a strong blow by the operator against the robotic arm may signal an emergency in the work area, forcing the robotic system to shut down. This shortens the response time, because the operator does not first need to reach an emergency stop switch in order to actuate it.
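
A sketch of such emergency detection, assuming an invented force-rate threshold, could flag a blow by the rate of change of the measured force:

```python
# Sketch of emergency detection from a sudden force spike: a strong blow
# produces a rate of change of force far above normal contact. The
# threshold value is an assumption for illustration only.
def is_emergency(prev_force: float, force: float, dt: float,
                 spike_rate: float = 500.0) -> bool:
    """Flag an emergency when the measured force rises faster than
    spike_rate newtons per second."""
    return (force - prev_force) / dt > spike_rate

# Gentle guidance (10 N over 0.1 s) vs. a sharp blow (60 N in 0.01 s):
print(is_emergency(0.0, 10.0, 0.1))    # False: normal interaction
print(is_emergency(0.0, 60.0, 0.01))   # True: stop the robot immediately
```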





Further features and advantages will become apparent from the following description of an embodiment with reference to the single FIG. 1.






FIG. 1 shows a perspective view of a multi-axis lightweight robot composed of a plurality of articulated arm members 10 connected to one another.


Using the example of the front (distal) arm member 20, which cooperates with an end effector (not shown), the possible directions in which haptic gestures can act on the robotic system are indicated schematically.


Thus, it is possible for an operator to exert tensile or compressive forces on the member 20 in the X-, Y-, Z-directions.


A light push on the member 20 in the Z-direction, for example, may symbolize the pressing of a push-button; this haptic gesture can then be assigned to a start command that activates a predetermined operation.


By pushing the member 20 sideways to the left (L) or to the right (R), the operator can be guided through a complex menu of a graphical user interface on a display device (not shown), such as the screen of a computer. By pressing in the Z- or Y-direction, the menu entry selected by these haptic gestures can then be confirmed.


It is also possible to map the movement of the member 20 in the X-Y plane onto the movement of a cursor on a screen and to perform a “mouse click” by pressing in the Z-direction.
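
A sketch of this pointing-device behavior, with gain, screen size, and click threshold chosen arbitrarily for illustration:

```python
# Sketch of treating the distal arm member as a pointing device: forces in
# the X-Y plane move a cursor, a push in Z triggers a "mouse click". The
# gain and screen dimensions are invented for this sketch.
def update_cursor(cursor: tuple[int, int], fx: float, fy: float,
                  gain: float = 2.0,
                  screen: tuple[int, int] = (1920, 1080)) -> tuple[int, int]:
    """Translate in-plane forces into a clamped cursor displacement."""
    x = min(max(cursor[0] + int(gain * fx), 0), screen[0] - 1)
    y = min(max(cursor[1] + int(gain * fy), 0), screen[1] - 1)
    return (x, y)

def is_click(fz: float, threshold: float = 8.0) -> bool:
    """A firm press in the Z-direction counts as a mouse click."""
    return fz < -threshold

cursor = (960, 540)
cursor = update_cursor(cursor, fx=12.0, fy=-4.0)
print(cursor, "click!" if is_click(-10.0) else "")
```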


Furthermore, it is possible for the operator to apply a haptic rotation R to the member 20, in which case the amount of torque applied can be used as a signal for the magnitude of a parameter to be set by this rotation. The haptic rotation in the R-direction thus simulates a rotary knob.
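
A sketch of such a virtual rotary knob, assuming an invented full-scale torque and a parameter range of 0 to 100:

```python
# Sketch of the virtual rotary knob: the torque applied about the axis of
# the distal member selects the magnitude of a parameter, e.g. a speed
# override in percent. Scaling constants are illustrative assumptions.
def knob_value(torque: float, full_scale_torque: float = 3.0,
               max_value: float = 100.0) -> float:
    """Map the applied torque (N*m) linearly onto a parameter value,
    clamped to the valid range [0, max_value]."""
    fraction = max(0.0, min(torque / full_scale_torque, 1.0))
    return fraction * max_value

print(knob_value(1.5))   # 50.0: half of full scale
print(knob_value(4.0))   # 100.0: clamped at the maximum
```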


For an emergency, it may be provided that an operator simply strikes the robotic arm, as indicated by arrow P. The robotic system recognizes from the force or the acceleration that this indicates such an emergency and then immediately stops its motion.


The haptic gestures are not limited to the foremost arm member 20. In principle, each individual arm member 10 may be provided with such functionality.


In addition, it is possible for the behavior of physical surfaces, keys, or rotary knobs to be simulated by virtual, freely programmable resistances when moving the robotic arm. In this way, the robotic system itself can be used as an input or control device to activate complex systems and operations that stand in a close functional relationship with the robotic system or its desired functionality.

Claims
  • 1. A robotic system, comprising: at least one robotic arm; a control unit for controlling the robotic arm; a robotic arm sensor system; and a display device with a graphical user interface, wherein the control unit and robotic arm sensor system are designed to respond to predetermined forces and/or moments generated by haptic gestures of the user acting on the robotic arm, in which said control unit is designed to assign the forces and/or moments to at least one predetermined operation so that the robotic system performs said at least one predetermined operation associated with the haptic gesture, wherein the control unit is further designed to assign said forces and/or moments generated by the haptic gesture with a navigation and selection control on the graphical user interface of said display device.
  • 2. The robotic system according to claim 1, in which the forces and/or moments generated by the haptic gesture are assigned to different operations depending on their respective directions.
  • 3. The robotic system according to claim 1, in which the forces and/or moments generated by the haptic gesture are assigned to different operations depending on their respective variables.
  • 4. The robotic system according to claim 1, in which a different chronological sequence of haptic gestures is assigned to different operations.
  • 5. The robotic system according to claim 1, in which the control unit is adapted to generate a feedback in response to the haptic gesture.
  • 6. The robotic system according to claim 5, in which the feedback is auditory and/or visual.
  • 7. The robotic system according to claim 5, in which the feedback is formed to be haptically detectable.
  • 8. The robotic system according to claim 7, in which the compliance of the robotic arm is variable.
  • 9. A method for controlling a graphical user interface of a display device of a robotic system comprising at least one robotic arm, a control unit for controlling the robotic arm and a robotic arm sensor system, the method comprising the steps of: manipulating the robotic arm by at least one predetermined force and/or moment generated by a haptic gesture; generating signals by the robotic arm sensor system in response to the at least one predetermined force and/or moment generated by the haptic gesture; transmitting the signals into the control unit; and assigning the signals to different predetermined operations by the control unit, wherein the control unit assigns the signals with a navigation and selection control on the graphical user interface of the display device.
  • 10. The method according to claim 9, further comprising the step of: generating at least one feedback in dependence on the different predetermined operations.
  • 11. The method according to claim 10, further comprising the step of: adjusting the compliance of the robotic arm depending on a predetermined haptic gesture.
Priority Claims (1)
Number Date Country Kind
10 2015 012 959.7 Oct 2015 DE national
PCT Information
Filing Document Filing Date Country Kind
PCT/EP2016/074250 10/10/2016 WO 00
Publishing Document Publishing Date Country Kind
WO2017/060538 4/13/2017 WO A
US Referenced Citations (26)
Number Name Date Kind
4804304 Tellden et al. Feb 1989 A
4984175 Toyoda Jan 1991 A
5103149 Kimura Apr 1992 A
6285920 McGee Sep 2001 B1
8170719 Tsusaka May 2012 B2
8423189 Nakanishi et al. Apr 2013 B2
8494677 Mizutani Jul 2013 B2
8824777 Choi Sep 2014 B2
9114530 Tsusaka Aug 2015 B2
9393687 Hietmann Jul 2016 B2
9486920 Fudaba Nov 2016 B2
9592608 Bingham Mar 2017 B1
9597797 Ponulak Mar 2017 B2
9804593 Davis Oct 2017 B1
9919416 Bingham Mar 2018 B1
10150214 Jerregard Dec 2018 B2
10635074 Schreiber Apr 2020 B2
20050222714 Nihei Oct 2005 A1
20080016979 Yasumura et al. Jan 2008 A1
20090259412 Brogardh Oct 2009 A1
20090314120 Larsson et al. Dec 2009 A1
20130151010 Kubota et al. Jun 2013 A1
20130255426 Kassow et al. Oct 2013 A1
20130273818 Guan et al. Oct 2013 A1
20150094855 Chemouny Apr 2015 A1
20180200880 Meissner Jul 2018 A1
Foreign Referenced Citations (25)
Number Date Country
509927 Dec 2011 AT
201437046 Apr 2010 CN
102302858 Jan 2012 CN
102410342 Apr 2012 CN
104802156 Jul 2015 CN
199 56 176 Oct 2001 DE
699 21 721 Nov 2005 DE
10 2005 054575 Apr 2007 DE
10 2008 062622 Jun 2010 DE
10 2010 063222 Jun 2012 DE
10 2013 013679 Feb 2014 DE
10 2013 109753 Mar 2014 DE
10 2014 216514 Sep 2015 DE
1435737 Jul 2004 EP
1880809 Jan 2008 EP
2129498 Dec 2009 EP
2131257 Dec 2009 EP
2851162 Mar 2015 EP
2864085 Apr 2015 EP
H08281580 Oct 1996 JP
2008-23642 Feb 2008 JP
WO 2009124904 Oct 2009 WO
WO 2011107143 Sep 2011 WO
WO 2014162161 Oct 2014 WO
WO 2015113757 Aug 2015 WO
Non-Patent Literature Citations (8)
Entry
U.S. Appl. No. 15/752,574, filed Feb. 13, 2018, Haddadin et al.
U.S. Appl. No. 15/766,083, filed Apr. 5, 2018, Haddadin.
U.S. Appl. No. 15/766,094, filed Apr. 5, 2018, Haddadin.
PCT/EP2016/069339, dated Oct. 17, 2016, International Search Report and Written Opinion.
PCT/EP2016/069339, dated Feb. 20, 2018, International Preliminary Report on Patentability.
PCT/EP2016/074250, dated Jan. 30, 2017, International Search Report and Written Opinion.
PCT/EP2016/074251, dated Feb. 2, 2017, International Search Report and Written Opinion.
PCT/EP2016/074252, dated Feb. 2, 2017, International Search Report and Written Opinion.
Related Publications (1)
Number Date Country
20180361594 A1 Dec 2018 US