System for Performing an Input on a Robotic Manipulator

Information

  • Patent Application
  • Publication Number
    20220362943
  • Date Filed
    June 26, 2020
  • Date Published
    November 17, 2022
Abstract
A system for performing an input on a robotic manipulator, wherein the system includes: a robotic manipulator having a plurality of limbs connected to one another by articulations and having actuators; a sensor unit configured to record an input variable, applied by a user by manually guiding the robotic manipulator, on the robotic manipulator, wherein the input variable is a kinematic variable or a force and/or a moment, and wherein the sensor unit is configured to transmit the input variable; and a computing unit connected to the robotic manipulator and to the sensor unit, the computing unit configured to transform the input variable received from the sensor unit via a predefined input variable mapping, wherein the input variable mapping defines a mathematical mapping of the input variable onto a coordinate of a graphical user interface or onto a setting of a virtual control element.
Description
FIELD

The invention relates to a system for performing an input on a robotic manipulator and to a method for performing an input on a robotic manipulator.


SUMMARY

The aim of the invention is to improve the performance of an input on a robotic manipulator by a user.


The invention results from the features of the independent claims. Advantageous developments and embodiments are the subject matter of the dependent claims.


A first aspect of the invention relates to a system to perform an input on a robotic manipulator, the system including:

    • a robotic manipulator having a plurality of limbs connected to one another by articulations and having actuators;
    • a sensor unit configured to record an input variable, applied by a user by manually guiding the robotic manipulator, on the robotic manipulator, wherein the input variable is a kinematic variable or a force and/or a moment, and wherein the sensor unit is configured to transmit the input variable; and
    • a computing unit connected to the robotic manipulator and to the sensor unit, the computing unit configured to transform the input variable received from the sensor unit via a predefined input variable mapping, wherein the predefined input variable mapping defines a mathematical mapping of the input variable onto a coordinate of a graphical user interface or onto a setting of a virtual control element.


The coordinate of a graphical user interface corresponds, in particular in the case of a screen, to a coordinate on the screen, and in the case in which the graphical user interface is represented on virtual reality goggles, to a coordinate on the virtual reality goggles.


According to the first aspect of the invention and also below, an end effector can also be understood to be a “limb” of the robotic manipulator, wherein the end effector is typically arranged on the distal limb of the robotic manipulator.


The input variable is, to that extent, preferably a kinematic variable, that is to say a position and/or an orientation of at least one limb or of a predefined point on the robotic manipulator, or a time derivative thereof, that is to say a speed or an acceleration; or alternatively preferably a force applied onto the robotic manipulator by the user or a moment applied onto the robotic manipulator by the user. In the latter case of a force or a moment, the computing unit is preferably configured to actuate the actuators of the robotic manipulator in such a manner that the robotic manipulator behaves as a rigid body, that is to say, to the extent made possible by the actuators, the actuators generate a counter-moment against an attempted movement of the robotic manipulator by the user. Accordingly, the sensor unit includes at least sensors which are suitable for determining a position and/or an orientation of one limb, of multiple limbs, or of a point on the robotic manipulator, and optionally also force sensors and/or moment sensors.
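

The rigid-body behaviour and the reading of the input variable can be illustrated, for example, by a stiff joint-space hold in which the actuators resist any deviation from a reference configuration while the applied force and moment are read from the sensor unit. The following Python sketch is only a minimal illustration of this idea; the function names, gains and the seven-axis configuration are assumptions and are not taken from the application.

```python
def rigid_body_hold_torques(q, dq, q_ref, kp=400.0, kd=40.0):
    """Joint torques that resist any deviation from q_ref, emulating a rigid body.

    q, dq  : current joint positions and velocities (rad, rad/s)
    q_ref  : reference (held) joint configuration
    kp, kd : illustrative stiffness and damping gains
    """
    return [kp * (qr - qi) - kd * dqi for qi, dqi, qr in zip(q, dq, q_ref)]


def read_input_variable(measured_wrench):
    """Interpret a measured wrench [Fx, Fy, Fz, Mx, My, Mz] as the input variable."""
    return {"force": measured_wrench[:3], "moment": measured_wrench[3:]}


q_ref = [0.0] * 7                                # held configuration of a 7-axis arm
q = [0.01, 0.0, 0.0, -0.005, 0.0, 0.0, 0.0]      # user pushes the arm slightly
dq = [0.0] * 7
print(rigid_body_hold_torques(q, dq, q_ref))     # counter-torques against the push
print(read_input_variable([2.0, -1.5, 0.0, 0.0, 0.0, 0.1]))
```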


If the graphical user interface is represented on a screen, the input variable mapping is, in particular, a projection of the input variable onto the two-dimensional plane of the screen, in particular taking into consideration the geometric limits of the screen. Advantageously, when such an input variable correlates with the reaching of one of the limits of the screen, the computing unit actuates the robotic manipulator, and in particular its actuators, in such a manner that the actuators provide an artificial resistance against an additional movement of the robotic manipulator.
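

One possible form of such an input variable mapping is to scale the in-plane components of the applied force onto the two-dimensional screen coordinate and to clamp the result to the screen limits; the clamping condition can then also signal when the actuators should present an artificial resistance. The sketch below illustrates this under those assumptions; the names, the screen size and the gain are hypothetical.

```python
def map_force_to_screen(force_xy, cursor, screen=(1920, 1080), gain=25.0, dt=0.02):
    """Map the in-plane force components onto a screen coordinate.

    force_xy : (Fx, Fy) in newtons, measured in the screen-aligned plane
    cursor   : current (x, y) cursor coordinate in pixels
    Returns the new cursor coordinate and a flag that a screen limit was reached,
    which can be used to command an artificial resistance at that limit.
    """
    x = cursor[0] + gain * force_xy[0] * dt
    y = cursor[1] + gain * force_xy[1] * dt
    clamped_x = min(max(x, 0.0), screen[0] - 1)
    clamped_y = min(max(y, 0.0), screen[1] - 1)
    at_limit = (clamped_x != x) or (clamped_y != y)
    return (clamped_x, clamped_y), at_limit


cursor = (960.0, 540.0)
cursor, at_limit = map_force_to_screen((4.0, -2.0), cursor)
print(cursor, "resist further motion:", at_limit)
```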


The virtual control element is preferably an emulation of a real physical control element, for example a rotary controller or a sliding controller. The virtual control element has, in particular, a lower limit and an upper limit, and between these two limits a setting is defined which can be used, in particular, as a parameter for the robotic manipulator.
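

A virtual sliding controller of this kind can be illustrated, for example, as a value bounded by a lower and an upper limit that is shifted in proportion to the guided displacement. The following sketch is a minimal, assumption-based illustration; the class name, the gain and the mapping of the displacement are not specified in the application.

```python
class VirtualSlider:
    """Emulation of a sliding controller with a lower and an upper limit.

    The setting between the two limits can be used as a parameter for the
    robotic manipulator (for example, a velocity scaling factor).
    """

    def __init__(self, lower=0.0, upper=1.0, setting=0.0):
        self.lower, self.upper, self.setting = lower, upper, setting

    def apply_input(self, displacement, gain=0.5):
        """Map a guided displacement (e.g. along one axis, in metres) onto the setting."""
        self.setting = min(max(self.setting + gain * displacement, self.lower), self.upper)
        return self.setting


slider = VirtualSlider(lower=0.0, upper=1.0)
print(slider.apply_input(0.4))   # user guides the arm 0.4 m along the mapped axis
print(slider.apply_input(2.0))   # saturates at the upper limit
```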


It is an advantageous effect of the invention that a user of the robotic manipulator can intuitively perform an input on the robotic manipulator by a corresponding movement of the robotic manipulator or by a corresponding application of a force or of a moment onto the robotic manipulator. The input of the user on the robotic manipulator can, to that extent, include a numerical value or else a command to the robotic manipulator, in particular relating to a graphical user interface, so that the input of the user can advantageously be used for menu control either by mouse pointer (controlled by the input variable) or simply by switching through marked menu entries or objects on the graphical user interface. Advantageously, the user can therefore use the robotic manipulator universally as an input device, since, as a result of the predefined input variable mapping, a corresponding reference to the graphical user interface or to the virtual control element is established. In particular, advantageously, the recording of a force or a moment or else a position change, in each case caused by the manual guiding on the part of the user on the robotic manipulator, can be defined as a haptic gesture by which, in particular, menus or functional regions of the graphical user interface can be controlled.


Moreover, advantageously, the system according to the first aspect of the invention can also be used in a system having a first robotic manipulator and a second robotic manipulator, wherein, in particular, the first robotic manipulator is used for performing the input according to the first aspect of the invention and, in particular, the second robotic manipulator is designed for performing the first aspect of the invention in the alternative.


According to an advantageous embodiment, the computing unit is configured to carry out a predefined operation of the robotic manipulator depending on the transformed input variable or the coordinate or the setting.


The predefined operation of the robotic manipulator is here, in particular, the activation of an object on a graphical surface, preferably connected with a subsequent action of the robotic manipulator, for example the parametrization of a function or of a robot program, possibly the start of the execution of such a program or of such a function, possibly the switching off of the robotic manipulator, or the storing of a current position value or of a value of the current orientation of the robotic manipulator.
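

Such predefined operations could, for instance, be organized as a small dispatch table keyed by the activated object of the graphical user interface. The sketch below is purely illustrative; the operation names and the state representation are hypothetical and not taken from the application.

```python
def store_current_pose(state):
    """Store the current position/orientation value of the manipulator."""
    state["stored_poses"].append(state["current_pose"])

def start_program(state):
    """Start the execution of a previously parametrized robot program."""
    state["program_running"] = True

def switch_off(state):
    """Switch off the robotic manipulator."""
    state["powered"] = False

PREDEFINED_OPERATIONS = {
    "store_pose": store_current_pose,
    "start_program": start_program,
    "switch_off": switch_off,
}

def perform_operation(name, state):
    """Execute the predefined operation selected via the graphical user interface."""
    PREDEFINED_OPERATIONS[name](state)
    return state

state = {"stored_poses": [], "current_pose": (0.4, 0.1, 0.6),
         "program_running": False, "powered": True}
print(perform_operation("store_pose", state)["stored_poses"])
```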


According to an additional advantageous embodiment, the system includes a display unit, wherein the display unit is configured to display at least one of the following:

    • the input variable,
    • an amount of the input variable,
    • the transformed input variable,
    • an amount of the transformed input variable.


The display unit is preferably one of: a screen, a projector, virtual reality goggles, or an LED unit. Advantageously, the user thereby receives instant feedback as to the type and/or the level of the input he/she is performing at a given time. In particular, the display unit is designed for displaying a mouse pointer with the current coordinate as mouse pointer coordinate and/or for displaying the current setting of the virtual control element.


According to an additional advantageous embodiment, the sensor unit is configured to record a current position and/or a current orientation of the robotic manipulator and to transmit the current position and/or the current orientation to the computing unit, wherein the computing unit is configured to actuate the actuators in such a manner that, during a manual guiding onto a predefined geometric structure, the robotic manipulator generates a resistance against movement of the robotic manipulator caused by the manual guiding, and wherein the computing unit is configured to activate a predefined function if the resistance exceeds a predefined limit value or if a distance of a predefined point of the robotic manipulator with respect to the predefined geometric structure undershoots a predefined limit value. Advantageously, this embodiment makes it possible to dispense with an input element on the robotic manipulator, so that, purely by the gesture control of the user, for example the function of a mouse click can be generated or other functions on the graphical user interface can be activated.
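

One conceivable reading of this embodiment is a threshold test on the built-up resistance or on the distance of a predefined point (for example the tool center point) to the predefined geometric structure, here assumed to be a fixed plane. The following sketch illustrates such a test; the limit values and names are illustrative, not values from the application.

```python
def distance_point_to_plane(point, plane_normal, plane_offset):
    """Signed distance of a point to a plane n·x = d (n assumed to be unit length)."""
    return sum(n * p for n, p in zip(plane_normal, point)) - plane_offset


def virtual_click(reaction_force, distance_to_plane,
                  force_limit=15.0, distance_limit=0.002):
    """Decide whether the predefined function (e.g. a mouse click) is triggered.

    reaction_force    : magnitude of the resistance built up against the guiding (N)
    distance_to_plane : distance of the predefined point to the predefined plane (m)
    Both limit values are illustrative.
    """
    return reaction_force > force_limit or distance_to_plane < distance_limit


d = distance_point_to_plane((0.50, 0.10, 0.251), (0.0, 0.0, 1.0), 0.25)
print(virtual_click(reaction_force=4.0, distance_to_plane=abs(d)))  # True: distance undershoots
```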


According to an additional advantageous embodiment, the predefined function is activation of an object on the graphical user interface, at least if the coordinate coincides with a predetermined coordinate range of the object. According to this embodiment, at the current position, in particular at the current position of an end effector of the robotic manipulator, the function is activated similarly to a mouse click, wherein the current input variable and/or the past course of the input variables determine(s) the current coordinate of the graphical user interface, so that, in a manner similar to the guiding of a mouse pointer onto an object of the graphical user interface and the subsequent clicking or double-clicking with the mouse, the respective object located under the mouse pointer can be activated.
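

The activation condition can be read as a simple hit test of the current coordinate against the coordinate range of the object. The sketch below shows such a test under the assumption of a rectangular coordinate range; the object and its coordinates are hypothetical.

```python
def object_hit(coordinate, object_rect):
    """Check whether the cursor coordinate lies within the object's coordinate range.

    object_rect : (x_min, y_min, x_max, y_max) in screen pixels
    """
    x, y = coordinate
    x_min, y_min, x_max, y_max = object_rect
    return x_min <= x <= x_max and y_min <= y <= y_max


start_button = (100, 200, 260, 240)          # illustrative coordinate range
print(object_hit((150, 220), start_button))  # True: the object would be activated
print(object_hit((500, 220), start_button))  # False
```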


According to an additional advantageous embodiment, the predefined geometric structure is a plane and the plane is invariant in terms of its orientation and position with respect to a terrestrial coordinate system.


According to an additional advantageous embodiment, the computing unit is configured to actuate the actuators in such a manner that, during manual guiding of the robotic manipulator, the robotic manipulator outputs a haptic feedback and/or a tactile feedback of the input variable as recorded and/or of the input variable as transformed.


According to an additional advantageous embodiment, the haptic feedback and/or the tactile feedback in each case includes at least one of the following (a sketch illustrating the first two items follows the list):

    • a position-dependent grid, dependent in particular on the position of an end effector,
    • a resistance-caused delimitation of a work region in which the robotic manipulator is capable of being manually guided, that is to say an artificial wall generated by counter-force,
    • a feedback in case the input variable and/or a current position and/or a current orientation of the robotic manipulator and/or the transformed input variable coincides with an object on the graphical user interface, wherein the coordinate of the graphical user interface is assigned to the input variable and/or to a current position and/or current orientation and/or to the transformed input variable of the robotic manipulator, and
    • a signal if the coordinate coincides with a predefined coordinate range of the object of the graphical user interface.
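

As an illustration of the first two items, a position-dependent grid can be rendered as a restoring force toward the nearest grid point, and the delimitation of the work region as a counter-force that grows outside the permitted range. The following sketch shows one possible such rendering; the spacings, stiffnesses and limits are assumed values.

```python
def grid_snap_force(position, spacing=0.05, stiffness=200.0):
    """Position-dependent grid: a restoring force toward the nearest grid point.

    position : end-effector coordinate along one axis (m)
    spacing  : distance between grid points (m)
    """
    nearest = round(position / spacing) * spacing
    return stiffness * (nearest - position)


def wall_force(position, lower=0.0, upper=0.6, stiffness=1500.0):
    """Resistance that delimits the work region, i.e. an artificial wall."""
    if position < lower:
        return stiffness * (lower - position)
    if position > upper:
        return stiffness * (upper - position)
    return 0.0


print(round(grid_snap_force(0.262), 2))   # pulled toward the 0.25 m grid point
print(wall_force(0.63))                   # pushed back into the work region
```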


According to an additional advantageous embodiment of the system, the display unit is a screen.


According to an additional advantageous embodiment, the computing unit is configured to actuate the robotic manipulator with gravity compensation. In the case of actuation of the robotic manipulator with gravity compensation, the actuators are actuated so that gravity acting on the robotic manipulator is compensated, so that, in the absence of an external force effect and proceeding from a static rest position without acceleration, the robotic manipulator remains in this rest position. This advantageously facilitates the manual guiding of the robotic manipulator for the user.
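

Gravity compensation can be illustrated, for a planar two-link arm with point masses at the link ends, by commanding exactly the gravity torques so that the arm remains at rest in the absence of external forces. The sketch below uses this simplified model; the masses, lengths and the two-link structure are assumptions for illustration only.

```python
import math

def gravity_torques_two_link(q1, q2, m1=4.0, m2=3.0, l1=0.4, l2=0.3, g=9.81):
    """Gravity torques for a planar two-link arm (point masses at the link ends).

    Commanding exactly these torques compensates gravity, so the arm stays at
    rest when no external force is applied. Parameter values are illustrative.
    """
    tau2 = m2 * g * l2 * math.cos(q1 + q2)
    tau1 = (m1 + m2) * g * l1 * math.cos(q1) + tau2
    return tau1, tau2


print([round(t, 2) for t in gravity_torques_two_link(q1=0.3, q2=-0.5)])
```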


An additional aspect of the invention relates to a method of performing an input on a robotic manipulator having a plurality of limbs connected to one another by articulations and having actuators, wherein the method includes:

    • recording, using a sensor unit, an input variable applied by a user by manually guiding the robotic manipulator, on the robotic manipulator, wherein the input variable is a kinematic variable or a force and/or a moment,
    • transmitting the input variable using the sensor unit, and
    • transforming, using a computing unit, the input variable received from the sensor unit via a predefined input variable mapping, wherein the predefined input variable mapping defines a mathematical mapping of the input variable onto a coordinate of a graphical user interface or onto a setting of a virtual control element of the computing unit.


According to an advantageous embodiment, the method moreover includes:

    • displaying, using a display unit, a mouse pointer with a current mouse pointer coordinate and/or a current setting of the virtual control element.


According to an advantageous embodiment, the method moreover includes:

    • performing a predefined operation of the robotic manipulator depending on the transformed input variable or the coordinate or the setting.


Advantages and preferred developments of the proposed method result from an analogous and appropriate application of the explanations provided above in connection with the proposed system.


Additional advantages, features and details result from the following description, in which, optionally in reference to the drawings, at least one embodiment example is described in detail. Identical, similar and/or functionally equivalent parts are provided with identical reference numerals.





BRIEF DESCRIPTION OF THE DRAWINGS

In the drawings:



FIG. 1 shows a system for performing an input on a robotic manipulator according to an embodiment example of the invention, and



FIG. 2 shows a method for performing an input on a robotic manipulator according to an additional embodiment example of the invention.





The representations in the figures are diagrammatic and not true to scale.


DETAILED DESCRIPTION


FIG. 1 shows a system 100 for performing an input on a robotic manipulator 1, wherein the system includes:

    • a robotic manipulator 1 having a plurality of limbs connected to one another by articulations and having actuators 3;
    • a computing unit 5 arranged on the base of the robotic manipulator 1; and
    • a sensor unit 7 connected to the computing unit 5.


The sensor unit 7 is used for recording an input variable applied by the user by manually guiding the robotic manipulator 1, on the robotic manipulator 1, wherein the input variable is a force applied on the robotic manipulator 1 by the user, and wherein the sensor unit 7 is designed for transmitting the input variable to the computing unit 5. The computing unit 5 is here used for transforming the input variable via a predefined input variable mapping, and the input variable mapping is a mathematical mapping of the input variable onto a coordinate of a graphical user interface or onto a setting of a virtual control element. The system 100 furthermore includes a display unit 9, wherein the display unit 9 is designed to display at least one of the following:

    • the input variable,
    • an amount of the input variable,
    • the transformed input variable, and
    • an amount of the transformed input variable.


In particular, the display unit is designed for displaying a mouse pointer with the current mouse pointer coordinate. The sensor unit 7 is therefore used for recording a current position and/or a current orientation of the robotic manipulator 1 and for transmitting the current position and/or the current orientation to the computing unit 5, wherein the computing unit 5 is designed for actuating the actuators 3 in such a manner that the robotic manipulator 1, during manual guiding onto a predefined, spatially fixed plane, generates a resistance against movement of the robotic manipulator 1 caused by manual guiding. The computing unit 5 is furthermore used for activating a predefined function if a predefined limit value in the resistance is exceeded or if a predefined limit value in the distance of a predefined point of the robotic manipulator 1 with respect to the geometric structure is undershot. The predefined function is an activation of an object on the graphical user interface, at least if the coordinate coincides with a predefined coordinate range of the object. The computing unit 5 is moreover designed for actuating the actuators 3 in such a manner that, during manual guiding of the robotic manipulator 1, the robotic manipulator 1 outputs haptic feedback and/or tactile feedback of the recorded input variable and/or the transformed input variable, wherein the haptic feedback and/or the tactile feedback in each case include(s) at least one of the following:

    • position-dependent grid,
    • resistance-caused limitation of the work region in which the robotic manipulator 1 can be manually guided,
    • feedback in the case of coinciding of an input variable and/or a current position and/or current orientation of the robotic manipulator 1 with an object on the graphical user interface, wherein the coordinate of the graphical user interface is assigned to an input variable and/or a current position and/or current orientation of the robotic manipulator 1, and
    • signal if the coordinate coincides with a predefined coordinate range of the object of the graphical user interface.

The display unit 9 here is a screen.



FIG. 2 shows a method of performing an input on a robotic manipulator 1 having a plurality of limbs connected by articulations to one another and having actuators 3, wherein the robotic manipulator 1 is connected to a computing unit 5. The method includes:

    • recording S1 an input variable applied by the user by manually guiding the robotic manipulator 1, on the robotic manipulator 1, by a sensor unit 7 connected to the computing unit 5, wherein the input variable is a kinematic variable or a force and/or a moment,
    • transmitting S2 the input variable to the computing unit 5 by the sensor unit 7,
    • transforming S3 the input variable via a predefined input variable mapping by the computing unit 5, wherein the predefined input variable mapping defines a mathematical mapping of the input variable onto a coordinate of a graphical user interface or onto a setting of a virtual control element of the computing unit 5, and
    • displaying S4 a mouse pointer with the current mouse pointer coordinate and/or displaying the current setting of the virtual control element by the display unit.


Although the invention was illustrated and explained in greater detail by a preferred embodiment example, the invention is nevertheless not limited by the disclosed examples, and other variations can be derived therefrom by the person skilled in the art, without exceeding the scope of protection of the invention. Therefore, it is clear that multiple variation possibilities exist. It is also clear that embodiments mentioned as examples in fact represent only examples which in no way should be interpreted as limiting, for example, the scope of protection, the application possibilities or the configuration of the invention. Instead, the above description and the description of the figures enable the person skilled in the art to concretely implement the exemplary embodiments, wherein the person skilled in the art cognizant of the disclosed inventive idea can make diverse changes, for example, with regard to the function or the arrangement, in an exemplary embodiment of mentioned elements, without exceeding the scope of protection which is defined by the claims and their legal equivalents, such as, for example, further explanations in the description.


LIST OF REFERENCE NUMERALS




  • 1 Robotic manipulator


  • 3 Actuators


  • 5 Computing unit


  • 7 Sensor unit


  • 9 Display unit


  • 100 System

  • S1 Recording

  • S2 Transmitting

  • S3 Transforming

  • S4 Displaying


Claims
  • 1. A system to perform an input on a robotic manipulator, the system comprising: a robotic manipulator having a plurality of limbs connected to one another by articulations and having actuators; a sensor unit configured to record an input variable, applied by a user by manually guiding the robotic manipulator, on the robotic manipulator, wherein the input variable is a kinematic variable or a force and/or a moment, and wherein the sensor unit is configured to transmit the input variable; and a computing unit connected to the robotic manipulator and to the sensor unit, the computing unit configured to transform the input variable received from the sensor unit via a predefined input variable mapping, wherein the predefined input variable mapping defines a mathematical mapping of the input variable onto a coordinate of a graphical user interface or onto a setting of a virtual control element.
  • 2. The system according to claim 1, wherein the system comprises a display unit, wherein the display unit is configured to display at least one of the following: the input variable; an amount of the input variable; the input variable as transformed; an amount of the input variable as transformed; a graphical user interface and objects on a graphical object surface; and a mouse pointer.
  • 3. The system according to claim 1, wherein: the sensor unit is configured to record a current position and/or a current orientation of the robotic manipulator for transmitting the current position and/or the current orientation to the computing unit; and the computing unit is configured to actuate the actuators in such a manner that the robotic manipulator, during a manual guiding onto a predefined geometric structure, generates a resistance against movement of the robotic manipulator caused by the manual guiding, and wherein the computing unit is configured to activate a predefined function if the resistance exceeds a predetermined limit value or if a distance of a predefined point of the robotic manipulator with respect to the predefined geometric structure undershoots a predetermined limit value.
  • 4. The system according to claim 3, wherein the predefined function is activation of an object on the graphical user interface, at least when the coordinate coincides with a predetermined coordinate range of the object.
  • 5. The system according to claim 3, wherein the predefined geometric structure is a plane and the plane is invariant in terms of its orientation and position with respect to a terrestrial coordinate system.
  • 6. The system according to claim 3, wherein the computing unit is configured to actuate the actuators in such a manner that, during the manual guiding of the robotic manipulator, the robotic manipulator outputs a haptic feedback and/or a tactile feedback of the input variable as recorded and/or the input variable as transformed.
  • 7. The system according to claim 6, wherein the haptic feedback and/or the tactile feedback in each case includes at least one of the following: a position-dependent grid; a resistance-caused limitation of a work region in which the robotic manipulator is capable of being manually guided; a feedback in case the input variable and/or the current position and/or the current orientation of the robotic manipulator coincides with an object on the graphical user interface; and a signal if the coordinate coincides with a predefined coordinate range of the object of the graphical user interface.
  • 8.-10. (canceled)
  • 11. The system according to claim 7, wherein in the case of feedback the coordinate of the graphical user interface is assigned, using the computing unit, to the input variable and/or the current position and/or the current orientation of the robotic manipulator.
  • 12. The system according to claim 2, wherein the display unit is a screen.
  • 13. The system according to claim 1, wherein the computing unit is configured to actuate the robotic manipulator with gravity compensation.
  • 14. A method of performing an input on a robotic manipulator having a plurality of limbs connected to one another by articulations and having actuators, the method comprising: recording, using a sensor unit, an input variable applied by a user by manually guiding the robotic manipulator, on the robotic manipulator, wherein the input variable is a kinematic variable or a force and/or a moment; transmitting the input variable using the sensor unit; and transforming, using a computing unit, the input variable received from the sensor unit via a predefined input variable mapping, wherein the predefined input variable mapping defines a mathematical mapping of the input variable onto a coordinate of a graphical user interface or onto a setting of a virtual control element of the computing unit.
  • 15. The method according to claim 14, wherein the method comprises displaying at least one of the following using a display unit: the input variable; an amount of the input variable; the input variable as transformed; an amount of the input variable as transformed; a graphical user interface and objects on a graphical object surface; and a mouse pointer.
  • 16. The method according to claim 14, wherein the method comprises: recording, using the sensor unit, a current position and/or a current orientation of the robotic manipulator for transmitting the current position and/or the current orientation to the computing unit; and actuating, using the computing unit, the actuators in such a manner that the robotic manipulator, during a manual guiding onto a predefined geometric structure, generates a resistance against movement of the robotic manipulator caused by the manual guiding; and activating, using the computing unit, a predefined function if the resistance exceeds a predetermined limit value or if a distance of a predefined point of the robotic manipulator with respect to the predefined geometric structure undershoots a predetermined limit value.
  • 17. The method according to claim 16, wherein the predefined function is activation of an object on the graphical user interface, at least when the coordinate coincides with a predetermined coordinate range of the object.
  • 18. The method according to claim 16, wherein the predefined geometric structure is a plane and the plane is invariant in terms of its orientation and position with respect to a terrestrial coordinate system.
  • 19. The method according to claim 16, wherein actuation of the actuators is performed by the computing unit in such a manner that, during the manual guiding of the robotic manipulator, the robotic manipulator outputs a haptic feedback and/or a tactile feedback of the input variable as recorded and/or the input variable as transformed.
  • 20. The method according to claim 19, wherein the haptic feedback and/or the tactile feedback in each case includes at least one of the following: a position-dependent grid; a resistance-caused limitation of a work region in which the robotic manipulator is capable of being manually guided; a feedback in case the input variable and/or the current position and/or the current orientation of the robotic manipulator coincides with an object on the graphical user interface; and a signal if the coordinate coincides with a predefined coordinate range of the object of the graphical user interface.
  • 21. The method according to claim 20, wherein in the case of feedback the method comprises assigning, using the computing unit, the coordinate of the graphical user interface to the input variable and/or the current position and/or the current orientation of the robotic manipulator.
  • 22. The method according to claim 15, wherein the display unit is a screen.
  • 23. The method according to claim 14, wherein the method comprises actuating, using the computing unit, the robotic manipulator with gravity compensation.
Priority Claims (1)
Number Date Country Kind
10 2019 004 478.9 Jun 2019 DE national
CROSS-REFERENCE TO RELATED APPLICATIONS

The present application is the U.S. National Phase of PCT/EP2020/067981, filed on 26 Jun. 2020, which claims priority to German Patent Application No. 10 2019 004 478.9, filed on 26 Jun. 2019, the entire contents of which are incorporated herein by reference.

PCT Information
Filing Document Filing Date Country Kind
PCT/EP2020/067981 6/26/2020 WO