This application is a national stage filing under 35 U.S.C. § 371 of International Patent Application Serial No. PCT/EP2016/074251, filed Oct. 10, 2016, entitled “Robot System,” which claims priority to German application serial number 10 2015 012 962.7, filed Oct. 8, 2015. The entire contents of these applications are incorporated herein by reference.
The invention relates to a robotic system with at least one robotic arm, at the end of which an effector, e.g. a tool, can be mounted for operating the robotic system.
Robots are now established in many areas of industry and are used, for example, in the automotive industry for mounting, gluing, screwing, cutting, welding, painting or lifting heavy objects.
In order to teach a robotic system the desired operations, the robotic system must be programmed. This can be done with an on-line or an off-line method; in the off-line method, the robot program is created without using the robot.
In on-line programming, the robot itself is needed for programming, as is the case, e.g., with direct teach-in programming. Here, the individual points of the desired trajectories are approached by an operator directly guiding the effector, and the respective position of the effector, e.g. of a gripping tool, is determined via the internal encoders and stored. After the geometry of the trajectory has been programmed in this way, the trajectory program can be supplemented with additional instructions entered via an external handheld programming device.
The previous methods of such on-line programming are time-consuming and inconvenient for the operator.
It is therefore an object of the present invention to provide a robotic system that can be programmed faster and more easily than previous systems.
This object is achieved by a robotic system as indicated in claim 1.
Advantageous embodiments of the invention are specified in the dependent claims.
Embodiments of the invention will be explained with reference to the accompanying drawings.
Effectors used in connection with the robotic system according to the invention may be, for example, tools for workpiece machining (e.g. a drill), gripper systems for handling and manipulating objects, measuring equipment for carrying out test jobs, or cameras with which the robot can perform observations.
The robot 1 is connected to a computer 10, which controls the robot 1 and is connected to a display device 11, e.g. a computer screen or the screen of a laptop, on which a graphical user interface for programming the robot can be displayed. The computer 10 is hereinafter also referred to as the control unit.
The pilot head 9 of the robotic system, which is shown in the accompanying drawings, carries a key panel 12 with four operating keys.
The four operating keys 13, 14, 15 and 16 encircle a D-pad short-stroke key 17, which can be tilted up, down, left and right, e.g. to control a cursor or a selection in a menu displayed on the graphical user interface of the display device in the directions up, down, left and right. Together, the keys of the key panel 12 attached to the robotic arm and the key 20 described below constitute an input device.
Instead of a D-pad short-stroke key, other direction keys can also be used for cursor control, e.g. four mutually isolated keys for each of the directions up, down, left and right.
In the center of the D-pad short-stroke key 17, a transparent luminous surface 18 is arranged, which can be illuminated in one color or in different colors by lighting elements, e.g. one or more LEDs that can be activated by the control unit.
In addition, in the lower part of the pilot head 9, a handle 19 is attached, with which the pilot head can be easily guided by an operator of the robotic system.
Further, another button or key 20 is provided in the lower part of the pilot head 9, mounted such that the operator of the robotic system can actuate it with the same hand that guides the pilot head 9 or its handle 19, and thus the robotic arm.
The control unit 10, which comprises hardware and software, is designed such that it can specify at least one predefined operation which can be carried out by the robotic system, wherein this operation includes the corresponding control commands with which the axes and the drive of the robot are regulated and the sensors (not shown in the drawings) are controlled.
Preferably, a plurality of predefined operations and the associated commands are stored in the control unit. These predefined operations may include, e.g., picking up objects, placing objects, inserting objects, screwing in objects, drilling, surface finishing or actuating buttons/keys.
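As a purely illustrative sketch (not the patent's implementation), such a library of predefined operations and their associated control commands could be organized as follows; the Operation class, the OPERATION_LIBRARY mapping and all command strings are hypothetical names invented for this example:

```python
from dataclasses import dataclass, field


@dataclass
class Operation:
    """A predefined operation together with its low-level control commands."""
    name: str            # operation identifier, e.g. "pick_up"
    icon: str            # icon shown for it on the graphical user interface
    commands: list[str]  # commands that regulate the axes, drive and sensors
    parameters: dict = field(default_factory=dict)  # filled during parameterization


# Library of predefined operations as it might be stored in the control unit.
OPERATION_LIBRARY = {
    "pick_up": Operation("pick_up", "pick.png",
                         ["open_gripper", "move_to_target", "close_gripper"]),
    "place": Operation("place", "place.png",
                       ["move_to_target", "open_gripper"]),
    "drill": Operation("drill", "drill.png",
                       ["move_to_target", "spindle_on", "feed", "spindle_off"]),
}
```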
In the control unit the predefined operations are assigned to icons that can be presented on a graphical user interface, which can be displayed on the display device by means of the control unit.
In a preferred embodiment of the robotic system according to the invention, the operator can use the keys 13, 14, 15, 16 and 17 of the input device to select, from a menu displayed on the graphical user interface, the operations that the robotic system should perform to accomplish a given task. To do so, the operator moves, e.g. by means of the D-pad short-stroke key, to the corresponding operation icon in the menu and then confirms the selected icon by pressing one of the four operating keys 13, 14, 15 and 16, which have previously been assigned a corresponding function.
In an alternative embodiment, key 20 may also be used to confirm an operation previously selected by means of the D-pad short-stroke key.
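A minimal sketch of how D-pad and confirm-key events could drive such a menu is shown below; the Menu class and the key names ("dpad_up", "key_13", etc.) are assumptions made for illustration, not taken from the patent:

```python
class Menu:
    def __init__(self, items: list[str]) -> None:
        self.items = items   # operation icons shown on the graphical user interface
        self.cursor = 0      # currently highlighted icon

    def handle_key(self, key: str) -> str | None:
        """Move the cursor on D-pad tilts; return the selection on a confirm key."""
        if key == "dpad_up":
            self.cursor = max(0, self.cursor - 1)
        elif key == "dpad_down":
            self.cursor = min(len(self.items) - 1, self.cursor + 1)
        elif key in ("key_13", "key_14", "key_15", "key_16", "key_20"):
            return self.items[self.cursor]  # operating keys (or key 20) confirm
        return None


menu = Menu(["pick_up", "place", "drill"])
menu.handle_key("dpad_down")              # cursor moves to "place"
assert menu.handle_key("key_20") == "place"
```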
In a further embodiment, the robotic system according to the invention can also be designed in such a way that, during the parameterization of an operation, the control unit displays on the graphical user interface a predetermined parameterization submenu (context menu) stored in the control unit for that operation. This submenu shows the various predetermined parameterization options, which can then be selected with the input device on the pilot head 9 via the keys 13, 14, 15, 16, 17 and/or 20 by navigating the graphical user interface of the parameterization submenu in order to perform a parameterization.
With such a parameterization, parameters such as the coordinates of points to be approached by the effector, torques, forces, accelerations, time durations, numbers of repetitions or subsequent operations can be entered by means of the input device.
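One possible parameter record matching the options just listed is sketched below; the field names and default values are hypothetical and chosen only to mirror the parameters named above:

```python
from dataclasses import dataclass, field


@dataclass
class OperationParameters:
    """Hypothetical parameter record for one operation."""
    waypoints: list[tuple[float, float, float]] = field(default_factory=list)  # coordinates to approach
    max_torque_nm: float = 5.0          # torque limit in newton-metres
    max_force_n: float = 20.0           # force limit in newtons
    acceleration_ms2: float = 0.5       # acceleration limit in m/s^2
    duration_s: float | None = None     # optional time duration
    repetitions: int = 1                # number of repetitions
    next_operation: str | None = None   # subsequent operation to chain
```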
In a further embodiment, the control unit stores all possible operations of the robotic system and all parameterization submenus provided for these operations, structured such that the operator can carry out the entire programming of the robotic system at the input device with a very limited number of input elements, e.g. keys, so that the programming can be done without the aid of external input devices such as computer keyboards. Ideally, the entire programming can be carried out with the pilot head as shown in the accompanying drawings.
The parameters can also be set via a dialog menu stored in the control unit, in which the individual parameters are queried one after another and each input must be made on the input device via the keys. Corresponding feedback can then be provided on the input device, confirming the respective input of the operator (for example by a green light field 18) or flagging it as faulty (for example by a red light field 18).
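The following sketch illustrates such a dialog loop with green/red feedback, under the assumption of two hypothetical hardware hooks (read_key_input and set_light_field, here replaced by console stand-ins):

```python
from typing import Callable


def read_key_input(prompt: str) -> str:
    """Stand-in for a value entered via the keys of the pilot head."""
    return input(f"{prompt}: ")


def set_light_field(color: str) -> None:
    """Stand-in for driving the LEDs behind the light field 18."""
    print(f"[light field 18] {color}")


def run_parameter_dialog(queries: dict[str, Callable[[str], bool]]) -> dict[str, str]:
    """Query each parameter in turn; re-query on faulty input."""
    values: dict[str, str] = {}
    for name, is_valid in queries.items():
        while True:
            raw = read_key_input(name)
            if is_valid(raw):
                set_light_field("green")   # confirm correct input
                values[name] = raw
                break
            set_light_field("red")         # flag faulty input and ask again
    return values


# Example: query a repetition count that must be a positive integer.
# run_parameter_dialog({"repetitions": lambda s: s.isdigit() and int(s) > 0})
```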
The input device attached to the pilot head does not necessarily have to consist of keys; it may, e.g., also include a touchpad, a trackball, a joystick or a similar device.
In the robotic system according to the invention, the input device is further adapted to provide user-directed feedback to an operator of the robotic system while setting the sequence of operations of the robotic system and/or parameterizing the predefined operations.
Such feedback may be provided, e.g., optically, in that static or varying light signals triggered by the control unit are emitted through the light field 18.
In another embodiment, the feedback may be designed such that it can be detected haptically by the operator of the robotic system. This can be achieved, e.g., by vibrating the input device, i.e. the pilot head 9; the control unit triggers this by activating a drive belonging to the pilot head accordingly.
According to a further embodiment, the key panel can also have a plurality of light fields by means of which the optical feedback is provided.
The feedback signals are preferably designed so that they confirm an input of the operator as being positive or negative. For example, in the event of a faulty input by the operator, the illuminated field 18 lights up red, while it lights up green when the input is correct.
In another embodiment, the feedback may also be arranged to represent a request to select a predefined operation of the robotic system from a group of predefined operations or to input a parameter with respect to an operation.
According to a further embodiment, the control unit may be configured such that a certain selection of operations and/or parameters is performed by actuating certain keys and/or specific key combinations on the input device.
Further, according to another embodiment, the control unit of the robotic system may be configured to display a graphical user interface on a display device on which the predefined operation can be displayed, wherein the control unit is further configured to provide feedback to the operator depending on the operation represented on the graphical user interface.
In a further embodiment of the robotic system according to the invention, the feedback can also be effected by an acoustic signal. For this purpose, e.g., a speaker controlled by the control unit can be mounted directly on the input device.
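Tying the feedback channels together, a dispatcher along the following lines could route a positive or negative result to the optical (light field 18), haptic (vibration drive in the pilot head) and acoustic (speaker) channels; all device outputs below are hypothetical console stand-ins, not the patent's interfaces:

```python
def give_feedback(positive: bool, channels: set[str]) -> None:
    """Dispatch feedback over the optical, haptic and/or acoustic channel."""
    if "optical" in channels:
        color = "green" if positive else "red"
        print(f"[light field 18] {color}")   # static or varying light signal
    if "haptic" in channels:
        print("[pilot head 9] vibrate")      # vibration drive in the pilot head
    if "acoustic" in channels:
        print("[speaker] beep")              # speaker mounted on the input device


give_feedback(positive=False, channels={"optical", "haptic"})
```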
The display device of the robotic system may also be a 3D display device, e.g. electronic 3D glasses.
Foreign Application Priority Data

Number | Date | Country | Kind
---|---|---|---|
10 2015 012 962 | Oct 2015 | DE | national |
PCT Information

Filing Document | Filing Date | Country | Kind
---|---|---|---|
PCT/EP2016/074251 | 10/10/2016 | WO | 00 |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO2017/060539 | 4/13/2017 | WO | A |
References Cited

U.S. Patent Documents

Number | Name | Date | Kind
---|---|---|---|
4025838 | Watanabe | May 1977 | A |
4398110 | Flinchbaugh et al. | Aug 1983 | A |
4591198 | Monforte et al. | May 1986 | A |
4659971 | Suzuki et al. | Apr 1987 | A |
4678952 | Peterson et al. | Jul 1987 | A |
4804304 | Tellden et al. | Feb 1989 | A |
5040338 | Schwaer et al. | Aug 1991 | A |
5125149 | Inaba et al. | Jun 1992 | A |
5360249 | Monforte et al. | Nov 1994 | A |
6246479 | Jung | Jun 2001 | B1 |
6422441 | Settelmayer et al. | Jul 2002 | B1 |
6463360 | Terada et al. | Oct 2002 | B1 |
8059088 | Eid | Nov 2011 | B2 |
8226140 | Dietrich et al. | Jul 2012 | B1 |
8423189 | Nakanishi et al. | Apr 2013 | B2 |
8918215 | Bosscher | Dec 2014 | B2 |
8997599 | Maisonnier et al. | Apr 2015 | B2 |
10279478 | Akan | May 2019 | B2 |
20010038453 | Jung | Nov 2001 | A1 |
20010045808 | Hietmann et al. | Nov 2001 | A1 |
20050093821 | Massie | May 2005 | A1 |
20050285854 | Morita | Dec 2005 | A1 |
20060091842 | Nishiyama et al. | May 2006 | A1 |
20060259195 | Eliuk | Nov 2006 | A1 |
20070057913 | Eid | Mar 2007 | A1 |
20080016979 | Yasumura et al. | Jan 2008 | A1 |
20080252311 | Koh | Oct 2008 | A1 |
20090314120 | Larsson et al. | Dec 2009 | A1 |
20100045808 | Matsusaka et al. | Feb 2010 | A1 |
20100073150 | Olson | Mar 2010 | A1 |
20100198394 | Trygg | Aug 2010 | A1 |
20100212133 | Montesanti | Aug 2010 | A1 |
20100262288 | Svensson | Oct 2010 | A1 |
20100314895 | Rizk et al. | Dec 2010 | A1 |
20110190932 | Tsusaka et al. | Aug 2011 | A1 |
20120185099 | Bosscher | Jul 2012 | A1 |
20120217129 | Tsutsumi et al. | Aug 2012 | A1 |
20130151010 | Kubota et al. | Jun 2013 | A1 |
20130255426 | Kassow | Oct 2013 | A1 |
20130273818 | Guan et al. | Oct 2013 | A1 |
20140047940 | Yamamoto | Feb 2014 | A1 |
20140183979 | Pelrine | Jul 2014 | A1 |
20140252668 | Austin | Sep 2014 | A1 |
20150053040 | Ueda et al. | Feb 2015 | A1 |
20150122070 | Yamaguchi | May 2015 | A1 |
20150364353 | Sugizaki et al. | Dec 2015 | A1 |
20170252920 | Motomura et al. | Sep 2017 | A1 |
20170320211 | Akan | Nov 2017 | A1 |
20180186017 | Xiong et al. | Jul 2018 | A1 |
20180207795 | Haddadin et al. | Jul 2018 | A1 |
20180345505 | Haddadin | Dec 2018 | A1 |
20180354141 | Haddadin | Dec 2018 | A1 |
20180361594 | Haddadin | Dec 2018 | A1 |
20190054634 | Haddadin | Feb 2019 | A1 |
20190099879 | Haddadin | Apr 2019 | A1 |
20190099881 | Niu | Apr 2019 | A1 |
20190099903 | Goto | Apr 2019 | A1 |
20190126465 | Haddadin | May 2019 | A1 |
20190126468 | Haddadin | May 2019 | A1 |
20190134811 | Haddadin | May 2019 | A1 |
20190168383 | Haddadin | Jun 2019 | A1 |
20190315002 | Haddadin | Oct 2019 | A1 |
20190275681 | Bohme et al. | Nov 2019 | A1 |
Foreign Patent Documents

Number | Date | Country
---|---|---|
509927 | Dec 2011 | AT |
2940490 | Sep 2015 | CA |
201437046 | Apr 2010 | CN |
102302858 | Jan 2012 | CN |
102410342 | Apr 2012 | CN |
104802156 | Jul 2015 | CN |
296 09 018 | Aug 1996 | DE |
197 31 656 | Jan 1999 | DE |
199 56 176 | Oct 2001 | DE |
699 21 721 | Nov 2005 | DE |
10 2005 054575 | Apr 2007 | DE |
10 2008 062622 | Jun 2010 | DE |
10 2009 039104 | Mar 2011 | DE |
10 2010 063 222 | Jun 2012 | DE |
10 2013 013679 | Feb 2014 | DE |
10 2013 109753 | Mar 2014 | DE |
10 2014 216514 | Sep 2015 | DE |
10 2016 004788 | Oct 2017 | DE |
441397 | Aug 1991 | EP |
1435737 | Jul 2004 | EP |
1880809 | Jan 2008 | EP |
2129498 | Dec 2009 | EP |
2131257 | Dec 2009 | EP |
2548706 | Jan 2013 | EP |
2784612 | Oct 2014 | EP |
2851162 | Mar 2015 | EP |
2864085 | Apr 2015 | EP |
2868439 | May 2015 | EP |
S60 123288 | Jul 1985 | JP |
S61 252084 | Oct 1986 | JP |
S62 87153 | Apr 1987 | JP |
H08281580 | Oct 1996 | JP |
2000-218584 | Aug 2000 | JP |
2008-23642 | Feb 2008 | JP |
2014-0011973 | Jan 2014 | KR |
WO 2007082954 | Jul 2007 | WO |
WO 2007099511 | Sep 2007 | WO |
WO 2009124904 | Oct 2009 | WO |
WO 2010088959 | Aug 2010 | WO |
WO 2011107143 | Sep 2011 | WO |
WO 2014162161 | Oct 2014 | WO |
WO 2014170355 | Oct 2014 | WO |
WO 2015113757 | Aug 2015 | WO |
Other Publications

Entry |
---|
PCT/EP2016/069339, Oct. 17, 2016, International Search Report and Written Opinion. |
PCT/EP2016/069339, Feb. 20, 2018, International Preliminary Report on Patentability. |
PCT/EP2016/074250, Jan. 30, 2017, International Search Report and Written Opinion. |
PCT/EP2016/074251, Feb. 2, 2017, International Search Report and Written Opinion. |
PCT/EP2016/074252, Feb. 2, 2017, International Search Report and Written Opinion. |
U.S. Appl. No. 15/752,574, filed Feb. 13, 2018, Haddadin et al. |
U.S. Appl. No. 15/766,080, filed Apr. 5, 2018, Haddadin. |
U.S. Appl. No. 15/766,094, filed Apr. 5, 2018, Haddadin. |
U.S. Appl. No. 16/077,705, filed Aug. 13, 2018, Haddadin. |
U.S. Appl. No. 16/083,192, filed Sep. 7, 2018, Haddadin. |
U.S. Appl. No. 16/095,326, filed Oct. 19, 2018, Haddadin. |
U.S. Appl. No. 16/095,332, filed Oct. 19, 2018, Haddadin et al. |
U.S. Appl. No. 16/095,336, filed Oct. 19, 2018, Haddadin. |
U.S. Appl. No. 16/095,622, filed Oct. 22, 2018, Haddadin. |
U.S. Appl. No. 16/095,624, filed Oct. 22, 2018, Haddadin et al. |
U.S. Appl. No. 16/340,916, filed Apr. 10, 2019, Bohme et al. |
PCT/EP2017/059448, Aug. 1, 2017, International Search Report and Written Opinion. |
PCT/EP2017/059448, Oct. 23, 2018, International Preliminary Report on Patentability. |
PCT/EP2017/059446, Jul. 19, 2017, International Search Report and Written Opinion. |
PCT/EP2017/059446, Oct. 23, 2018, International Preliminary Report on Patentability. |
PCT/EP2017/059572, Jul. 27, 2017, International Search Report and Written Opinion. |
PCT/EP2017/059572, Oct. 30, 2018, International Preliminary Report on Patentability. |
[No Author Listed], “Advanced Automation for Space Missions,” NASA Conference Publication 2255, Aug. 29, 1980, pp. 1-335. Retrieved from https://ntrs.nasa.gov/archive/nasa/casi.ntrs.nasa.gov/19830007077.pdf on Jul. 7, 2017. |
[No Author Listed], “FANUC,” YouTube, Dec. 22, 2007. Retrieved from https://www.youtube.com/watch?v=-SREct28IJM on Jul. 11, 2017. Supplemented by five .PNG images taken from video. |
[No Author Listed], CNC Products and Services. Brochure. FANUC America Corporation. 2017. Retrieved Jan. 24, 2019 from https://www.fanucamerica.com/docs/default-source/cnc-files/brochures/cnc-products-and-services.pdf?sfvrsn=865fc162_4. |
Sakakibara, A two-armed intelligent robot assembles mini robots automatically. Industrial Electronics, Control, and Instrumentation. Proceedings of the 1996 IEEE IECON 22nd International Conference, Taipei, Taiwan. 1996;3(5):1879-1883. |
Schafer et al., Light-Weight Mechatronics and Sensorics for Robotic Exploration: a DLR Perspective. Feb. 25, 2008. Retrieved from http://elib.dlr.de/55362/1/i-sairas2008_Schafer.pdf on Mar. 24, 2017. |
CN 102302858 is understood by its English-language machine translation and figures. |
CN 102410342 is understood by its English-language machine translation and figures. |
CN 104802156 is understood by its English-language machine translation and figures. |
DE 199 56 176 is understood by its English-language machine translation and figures. |
DE 699 21 721 is understood by its English-language abstract and figures. |
DE 10 2005 054575 is understood by its English-language abstract and figures. |
DE 10 2008 062622 is understood by its English-language abstract and figures. |
DE 10 2010 063222 is understood by its English-language abstract and figures. |
DE 10 2013 013679 is understood by its English-language abstract and figures. |
DE 10 2013 109753 is understood by its English-language abstract and figures. |
DE 10 2014 216514 is understood by its English-language machine translation and figures. |
EP 2131257 is understood by its English-language abstract and figures. |
EP 2851162 is understood by its English-language abstract and figures. |
JP H08281580 is understood by its English-language abstract and figures. |
JP 2008-23642 is understood by its English-language machine translation and figures. |
WO 2009/124904 is understood by its English-language abstract and figures. |
WO 2015/113757 is understood by its English-language abstract and figures. |
Prior Publication Data

Number | Date | Country |
---|---|---|
20180345505 A1 | Dec 2018 | US |