The present application claims priority to U.S. Patent Application No. 63/301,756, filed on Jan. 21, 2022, the contents of which are incorporated herein by reference.
The present application relates to robotic arms and to operating systems associated with robotic arms to teach or program maneuvers to a robotic arm.
Robotic arms are increasingly used in a number of different applications, from manufacturing to servicing and assistive robotics, among numerous other possibilities. Such robots can be used to perform given tasks, including repetitive tasks. The tasks may be in the form of given movements, effector operations and maneuvers. These given movements may include moving an end effector to selected waypoints, moving it along desired movement paths, and applying forces of determined magnitude. The given tasks may include operating an end effector in certain ways depending on the type of end effector, etc.
Accordingly, a robotic arm must be taught, i.e., programmed, to execute given movements and perform given tasks. The level of complexity in how the robotic arm is taught may vary, and the teaching may often involve a programming interface and a controller. However, the programming of a robotic arm may suffer from inefficiencies, notably due to the need for an operator to alternate maneuvers between a robotic arm and a programming interface (also known as a teach pendant), especially in the context of a collaborative mode in which the operator may manipulate the robotic arm during programming.
It is an aim of the present disclosure to provide a visual programming interface that addresses issues related to the art.
It is a further aim of the present disclosure to provide a robot teaching system that addresses issues related to the art.
Therefore, in accordance with a first aspect of the present disclosure, there is provided a programming interface of a robotic arm comprising: a programming table including rows and columns of cells, wherein the cells in the rows or in the columns represent an execution sequence; tiles positionable in the cells, each of the tiles representing at least one of an action, a decision, or a condition associated with the robotic arm; wherein, during operation of the robotic arm, a controller operates the robotic arm based on the execution sequence and on the tiles in the programming table.
Further in accordance with the first aspect, for instance, the cells in the rows represent the execution sequence.
Still further in accordance with the first aspect, for instance, the cells in a common one of the columns represent a condition sequence.
Still further in accordance with the first aspect, for instance, tiles in the cells in the common one of the columns forming the condition sequence are each adjacent to another tile in a respective one of the rows, the other one of the tiles indicating an action and/or a decision of the condition sequence.
Still further in accordance with the first aspect, for instance, at least one of the tiles is a waypoint tile identifying at least one waypoint position and/or orientation to which the controller directs the robotic arm.
Still further in accordance with the first aspect, for instance, the waypoint tile further includes at least one parameter setting associated with a movement of the robotic arm to the waypoint position and/or orientation.
Still further in accordance with the first aspect, for instance, the at least one waypoint position and/or orientation of the waypoint tile is set from a signal received from a wrist of the robotic arm.
Still further in accordance with the first aspect, for instance, at least one of the tiles is a script tile, according to which the controller operates the robotic arm as a function of the script.
Still further in accordance with the first aspect, for instance, the script is Python®.
Still further in accordance with the first aspect, for instance, the script is imported into a field of the programming interface.
Still further in accordance with the first aspect, for instance, at least one of the tiles is associated with an actuation of an end effector of the robotic arm.
Still further in accordance with the first aspect, for instance, the end effector is a gripper.
Still further in accordance with the first aspect, for instance, the at least one tile associated with the actuation is to cause an opening of the gripper.
Still further in accordance with the first aspect, for instance, the at least one tile associated with the actuation is to cause a closing of the gripper.
Still further in accordance with the first aspect, for instance, the at least one tile associated with the actuation further includes at least one parameter setting associated with the actuation of the gripper.
Still further in accordance with the first aspect, for instance, the at least one parameter setting associated with the actuation of the gripper is a closing force or a closing speed.
Still further in accordance with the first aspect, for instance, the at least one tile associated with the actuation is to cause the end effector to seek the presence of an object by contact.
Still further in accordance with the first aspect, for instance, the at least one tile associated with the actuation further includes at least one parameter setting associated with the seek of the object, wherein the at least one parameter setting associated with the seek is one or more of a limit of contact force, a limit of displacement speed, a constraint in trajectory of movement.
Still further in accordance with the first aspect, for instance, at least one of the tiles is associated with an operation of a vision system of the robotic arm.
Still further in accordance with the first aspect, for instance, the operation of the vision system includes providing a video feed on the programming interface.
In accordance with a second aspect of the present disclosure, there is provided a system for teaching a robotic arm comprising: a user interface at a working end of a robotic arm; a processing unit; and a non-transitory computer-readable memory communicatively coupled to the processing unit and comprising computer-readable program instructions executable by the processing unit for: recording a position of the working end of the robotic arm in response to signaling from the user interface in a first mode; recording an orientation of the working end of the robotic arm in response to signaling from the user interface in a second mode; and toggling between at least the first mode and the second mode in response to signaling from the user interface.
Further in accordance with the second aspect, for instance, in the first mode, the robotic arm moves in Cartesian admittance as a response to a force applied to the working end, the working end having a constant orientation in the Cartesian admittance.
Still further in accordance with the second aspect, for instance, in the second mode, the robotic arm moves in angular admittance as a response to a force applied to the working end, the robotic arm constraining movement to a rotation of a single joint in the angular admittance.
Still further in accordance with the second aspect, for instance, recording a force applied at the working end of the robotic arm may be in response to signaling from the user interface in a third mode.
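Purely as a non-limiting illustration of the record-and-toggle behavior recited above, the following Python sketch shows one hypothetical arrangement; all names (TeachingSystem, on_toggle_signal, on_capture_signal, the stub working end) are assumptions for illustration and not part of the disclosure.

```python
# Illustrative sketch only; all names are hypothetical.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class TeachingSystem:
    mode: str = "position"  # first mode
    records: List[Tuple[str, object]] = field(default_factory=list)

    def on_toggle_signal(self) -> None:
        # Toggle between at least the first and second modes
        # (a third, force-recording mode is optional).
        order = ["position", "orientation", "force"]
        self.mode = order[(order.index(self.mode) + 1) % len(order)]

    def on_capture_signal(self, working_end) -> None:
        # Record position (first mode), orientation (second mode),
        # or applied force (third mode) of the working end.
        if self.mode == "position":
            self.records.append(("position", working_end.position()))
        elif self.mode == "orientation":
            self.records.append(("orientation", working_end.orientation()))
        else:
            self.records.append(("force", working_end.measured_force()))

if __name__ == "__main__":
    class StubWorkingEnd:  # stands in for the wrist-mounted sensors
        def position(self): return (0.4, 0.0, 0.3)
        def orientation(self): return (0.0, 0.0, 0.0, 1.0)
        def measured_force(self): return (0.0, 0.0, 5.0)

    ts = TeachingSystem()
    ts.on_capture_signal(StubWorkingEnd())  # records a position
    ts.on_toggle_signal()                   # switch to orientation mode
    ts.on_capture_signal(StubWorkingEnd())  # records an orientation
    print(ts.records)
```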
Referring to the drawings and more particularly to
The base end 10B is configured to be connected to any appropriate structure or mechanism. The base end 10B may be rotatably mounted or not to the structure or mechanism. By way of a non-exhaustive example, the base end 10B may be mounted to a wheelchair, to a vehicle, to a frame, to a cart, to a robot docking station. Although a serial robot arm is shown, the joint arrangement of the robotic arm 10 may be found in other types of robots, including parallel manipulators.
Still referring to
Referring concurrently to
Among possible components present at the wrist 12, a connector 12A may be provided. The connector 12A may be a pogo-pin type socket by which a tool may be connected to a control system of the robotic arm 10. Other types of sockets, plugs or connectors may be used, the pogo-pin type socket being merely provided as an example. For instance, the connector 12A provides an Ethernet connection, which may comply with the RS-485 standard or other standards, and/or may have any appropriate power output (e.g., 24V, 4A). A flange 12B or like raised support (shown as an annular support surface) may surround the connector 12A and may be provided with attachment bores 12B′ or other female or male attachments. Threading may be provided in the attachment bores 12B′. An end effector, tool or peripheral with complementary attachments could hence be fixed to the wrist 12, the end effector, tool or peripheral also having a complementary connector for electronic engagement with the connector 12A. Referring to
A light display 12D may be provided. In an embodiment, the light display 12D extends peripherally around the wrist 12, to provide visibility from an enhanced point of view. For example, the light display 12D may cover at least 180 degrees of the circumference of the wrist 12. In the illustrated embodiment, the light display covers between 280 and 320 degrees of the circumference of the wrist 12. The light display 12D may be considered a user interface as it may provide information to the operator, such as a flashing or color indication when certain teaching tasks are performed. The light display 12D may take other forms or configurations. For example, discrete lights may be present in addition to or as alternatives to the light display 12D. In an embodiment, the various lights are light-emitting diodes (LEDs), though other types of light sources may be used. The LEDs, such as those in the light display 12D, may be RGB LEDs.
An interface 14 may also be present on the wrist 12, and may include buttons, an additional port, etc. An exemplary combination of buttons for the interface 14 is provided, but any other arrangement than that shown and described is possible. The interface 14 may include an enabling button(s) 14A to enable a manual control of directional movements of the wrist 12. By pressing on the enabling button 14A, a user may manipulate the robotic arm 10. Accordingly, the pressing of the enabling button 14A may cause a release of all brakes or of selected brakes at the various joints of the robotic arm 10, such that the user may displace the end effector at the end of the wrist 12 and/or move the links at opposite sides of any of the joints. In a variant, the user may be required to maintain a pressure on the enabling button 14A for the manual control to be activated, i.e., to allow freedom of movement. Moreover, an excessive pressure on the enabling button 14A may result in a braking of the robotic arm 10, as an option. Practically speaking, the user may use one hand to press on the enabling button 14A, while the other hand is free to manipulate other parts of the robotic arm 10, to press on other buttons to record a condition of the robotic arm 10, and/or to use an interface associated with a controller of the robotic arm 10. The enabling button 14A may also be used as an on/off switch for the manual control mode, whereby a single press on the enabling button 14A may allow a user to toggle between an activation and a deactivation of the manual control mode. Alternatively, the enabling button 14A could also have a sensitive periphery, with the robotic arm 10 responding to a press of the enabling button 14A to move in the pressed direction.
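The enabling-button behavior described above may be illustrated by the following hypothetical sketch; the pressure threshold, the normalized press level, and the arm methods (engage_brakes, release_brakes) are assumptions for illustration only.

```python
# Hypothetical sketch of the enabling-button logic: holding the button
# releases the brakes for hand guiding, releasing it re-engages them,
# and an excessive pressure triggers braking. Threshold is assumed.
EXCESSIVE_PRESSURE = 0.9  # normalized press level, assumption

def on_enable_button(pressure: float, arm) -> None:
    if pressure >= EXCESSIVE_PRESSURE:
        arm.engage_brakes()    # optional braking on excessive pressure
    elif pressure > 0.0:
        arm.release_brakes()   # freedom of movement for hand guiding
    else:
        arm.engage_brakes()    # button released: joints locked again

class StubArm:  # stand-in for the robotic arm's brake interface
    def engage_brakes(self): print("brakes engaged")
    def release_brakes(self): print("brakes released")

on_enable_button(0.5, StubArm())   # hand guiding enabled
on_enable_button(0.95, StubArm())  # excessive pressure: braking
```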
Other buttons may be present, such as single-function buttons 14B and 14C. For example, button 14B may be a position capture button (a.k.a., waypoint capture button), whereas button 14C may be a mode toggling button, as will be explained below. There may be fewer or more of the single-function buttons 14B and 14C, and such buttons may be associated with any given function. A 2-position level button 14D may be present, to select (e.g., increase or decrease) a level of intensity, as observed from the +/− symbols. The 2-position level button 14D may be known as a volume button. The buttons 14A, 14B and 14C are shown as being mechanical buttons, in that a mechanical force must be applied to trigger electronic components. However, other arrangements are possible, for instance by having only the enabling button 14A be a mechanical button, or by having all buttons be capacitive, resistive or like touch-sensitive buttons. In an embodiment, the button 14A differs in shape from the buttons 14B, 14C and 14D, to provide a user with a visual distinction between buttons and hence make the use of the interface 14 more intuitive. In an embodiment, the interface 14 may comply with the standard ANSI/UL 508, as the interface 14 operating the robotic arm 10 is an industrial control device for starting, stopping, regulating, controlling, or protecting electric motors, described below for example as being part of the motorized joint units 30.
Other types of wrist interfaces are contemplated. For example, a touchscreen may be used instead of the mechanical buttons, to provide similar functions, or additional ones, and provide display capacity as well. The touchscreen could thus be used as an alternative to both the light display 12D and to the buttons 14A-14C.
In addition to the wrist 12 at its working end 10A, the robotic arm 10 has a series of links 20, interconnected by motorized joint units 30 at the junctions between adjacent links 20, forming joints between the links 20, the joints being for example rotational joints (a.k.a., one rotational degree-of-freedom (DOF) joints). The motorized joint units 30 may integrate brakes. For example, the integrated brakes, or at least some of them, may be normally open brakes that block the robotic arm 10 from moving when the robotic arm 10 is not powered. The brakes integrated in the motorized joint units 30 may be for example as described in U.S. Pat. No. 10,576,644, incorporated herein by reference, but other types of brakes may be used. A bottom one of the links 20 is shown and referred to herein as a robot arm base link 20′, or simply base link 20′, and may or may not be releasably connected to a docking cradle. For instance, the base link 20′ may be as described in United States Patent Application Publication No. US2020/0086504, incorporated herein by reference.
The links 20, including the wrist 12, define the majority of the outer surface of the robotic arm 10. The links 20 also have a structural function in that they form the skeleton of the robotic arm 10 (i.e., an outer shell skeleton), by supporting the motorized joint units 30 and tools at the working end 10A, with loads supported by the tools, in addition to supporting the weight of the robotic arm 10 itself. Electronic components may be concealed within the links 20. The arrangement of links 12, 20 provides the various degrees of freedom (DOF) of movement of the working end 10A of the robotic arm 10. In an embodiment, there are sufficient links 12, 20 to enable six DOFs of movement (i.e., three rotations and three translations) of the working end 10A relative to the base link 20′. There may be fewer or more DOFs depending on the use of the robotic arm 10.
The motorized joint units 30 interconnect adjacent links 20, in such a way that a rotational degree of actuation is provided between adjacent links 20. According to an embodiment, the motorized joint units 30 may also connect a link to a tool via the wrist 12 at the working end 10A, although other mechanisms may be used at the working end 10A and at the base end 10B. The wrist 12 may be straight in shape, in contrast to some other ones of the links 20 being elbow shaped. The motorized joint units 30 may also form part of a structure of the robotic arm 10, as they interconnect adjacent links 20. The motorized joint units 30 form a drive system 30′ (
A communications link may extend through the robotic arm 10. In an embodiment, each link 20 includes signal transmission means (e.g., wires, cables, PCBs, plugs and sockets, slip rings, etc.), with the signal transmission means being serially connected from the base end 10B to the working end 10A, such that an end effector and a base controller may be communicatively coupled. The communications link may also be wireless. The communications link may be accessible via the connector 12A and the connectors 12C.
Numerous sensors 40 (
In an embodiment, the peripheral 50 may be associated with third party applications, as described below, and may use the available plug 10A′, for instance to be connected to the working end 10A of the robotic arm 10 if desired. Exemplary peripherals are an end effector (e.g., gripper), camera(s), a sensor(s), a light source(s). The peripheral(s) 50 may be the end effector as a standalone device, or may be used as a complement to the end effector, or may be part of a group of tools/instruments forming the end effector at the working end 10A of the robotic arm 10.
Referring concurrently to
In addition to the interface 10A″ at the working end 10A of the robotic arm 10, an additional interface or interfaces 102D may be part of the robot control system 100, or may be a peripheral of the robot control system 100, for a user to communicate with and receive data from the robot control system 100. The interface(s) 102D may be embedded or integrated in the robotic arm 10, or may be physically separated from the robotic arm 10. The interface(s) 102D may take numerous forms, such as a screen or monitor, a graphic user interface (GUI), a touch screen, visual indicators (e.g., LEDs), a tablet, an application on a smart device (e.g., phone, tablet), a keyboard, a mouse, push buttons, etc. One of the interface(s) 102D may be used for controlling the robotic arm 10, and for teaching the robotic arm 10. Such a user interface 102D may be referred to as a teach pendant, a teach box, or a remote controller, among other possible names. In an embodiment, the user interface 102D used as teach pendant may be wireless and may communicate with a remainder of the robot control system 100 of the robotic arm 10.
The robot control system 100 may further include a telecommunications module 102E by which the robot control system 100 may communicate with external devices and systems. The telecommunications module 102E may have wireless and/or wired capability.
For example, the computer-readable program instructions may include native functions 110 of the robotic arm 10. The native functions 110 may include one or more of controllably moving the working end 10A in the available degrees of freedom of the robotic arm 10; holding a fixed position; performing a safety braking maneuver; permitting hand guiding and teaching; and measuring the interaction forces on the robotic arm 10, among other native functions. The native functions 110 may be connected to the drive system 30′ for the control system 100 to drive the robotic arm 10 in a desired manner. The native functions 110 may also operate based on feedback provided by the sensors 40 within or associated with the robotic arm 10. The sensors 40 contribute to the high precision control of the robotic arm 10 in its working envelope.
Still referring to
Referring to
Referring to
Referring to
Referring concurrently to
Accordingly, tiles T representative of actions may be positioned in the table A, in order of execution. For example, tiles T may be dragged and dropped, or inserted by text, by a selection in a menu, etc. A first tile T on the left-hand side is indicative of an action or decision occurring before an action or decision represented by a second tile T to the right of the first tile T. In the condition axis, the tiles T are representative of actions or decisions occurring according to conditions. For example, based on measured parameters, system status, outputs, etc., the execution module 130 may select any given one of the tiles T in the column to perform.
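By way of a non-limiting sketch, the execution and condition axes described above may be modeled as follows; the Tile and ProgrammingTable names, the column-walking order, and the controller interface are assumptions for illustration only.

```python
# Illustrative sketch: cells in a row form the execution sequence (left
# to right), while cells in a common column hold condition branches.
# All class names and the controller interface are hypothetical.
from dataclasses import dataclass
from typing import Callable, Dict, Optional, Tuple

@dataclass
class Tile:
    kind: str                 # e.g., "waypoint", "gripper", "script"
    params: dict
    condition: Optional[Callable[[], bool]] = None  # for condition tiles

class ProgrammingTable:
    def __init__(self) -> None:
        self.cells: Dict[Tuple[int, int], Tile] = {}  # (row, col) -> Tile

    def place(self, row: int, col: int, tile: Tile) -> None:
        self.cells[(row, col)] = tile

    def execute(self, controller) -> None:
        # Walk columns left to right; in each column, run the first tile
        # (top to bottom) whose condition holds.
        max_col = max((c for _, c in self.cells), default=-1)
        for col in range(max_col + 1):
            for (r, c), tile in sorted(self.cells.items()):
                if c == col and (tile.condition is None or tile.condition()):
                    controller.run(tile)
                    break

class PrintController:  # stand-in for the execution module
    def run(self, tile: Tile) -> None: print("run", tile.kind, tile.params)

table = ProgrammingTable()
table.place(0, 0, Tile("waypoint", {"name": "A"}))
table.place(0, 1, Tile("gripper", {"action": "close"}))
table.execute(PrintController())
```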
For example, referring to
As another example, referring
As yet another example, the programming table A is used with tiles specific to the end effector, shown in the tiles as being a gripping mechanism (a.k.a., gripper) that may open and close. With the gripper at a given waypoint, the gripper may be initiated to perform subsequent movements, as per tile T8. According to the sequence, once reset, loops are to be performed as introduced by loop tile T1. The illustrated loop includes teaching two waypoints, to then close the gripper as per tile T9. The waypoints may represent a path to be taken by the gripper, e.g., move over an object, to then descend toward the object for subsequent grasping. Three waypoints may be reached, after which the gripper is opened as per tile T10. The robotic arm 10 may then be required to move to a given waypoint to complete the loop. The gripper tiles T8, T9 may possibly be hardware-agnostic (e.g., not specific to a certain gripper geometry or gripper manufacturer), thus allowing re-use of the program no matter the nature of the end effector tooling.
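A hardware-agnostic gripper tile, as evoked above, may be sketched as follows; the GripperDriver abstraction and its parameters are hypothetical and merely illustrate how a tile could remain independent of gripper hardware.

```python
# Sketch of a hardware-agnostic gripper tile: the tile only exposes
# open/close with optional force/speed settings, and a driver adapts
# the call to a specific gripper. All names are hypothetical.
from abc import ABC, abstractmethod

class GripperDriver(ABC):
    @abstractmethod
    def actuate(self, opening: float, force: float, speed: float) -> None: ...

class TwoFingerGripper(GripperDriver):  # one possible plug-in driver
    def actuate(self, opening: float, force: float, speed: float) -> None:
        print(f"two-finger gripper: opening={opening} force={force} speed={speed}")

def run_gripper_tile(driver: GripperDriver, action: str,
                     force: float = 10.0, speed: float = 0.5) -> None:
    # "close" (as per tile T9) or "open" (as per tile T10)
    driver.actuate(0.0 if action == "close" else 1.0, force, speed)

run_gripper_tile(TwoFingerGripper(), "close", force=20.0)
```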
Thus,
The teaching module 120 may operate a plurality of teaching modes, according to which the execution module 130 of the robot control system 100 will learn about different maneuvers or tasks to be performed by the robotic arm 10. In an embodiment, a user may alternate between different modes, using the interface 14 (
The modes may for example be used in the context of the recording of waypoints, as per waypoint tiles T4. In a first mode of the teaching module 120, a current robot position and orientation of the working end 10A may be recorded while a user hand guides movement of the working end 10A of the robotic arm 10. This recording of robot position may be for the position of a part of an end effector at the working end 10A. For instance, the tool center point (TCP) of the tool at the working end 10A may be used as reference for waypoint recording. The waypoint button 14B may be used to record the position and orientation. The arrangement of the wrist 12 is such that one hand of the user may hold the wrist 12 with the enabling button 14A being depressed, while the other hand may be used to press the waypoint button 14B. In the first mode, referred to as Cartesian value waypoint recording mode, with reference to
As an example, the end effector may be required to perform a straight line movement between two waypoints. An example may be a welding gun used as an end effector. As another example, a drilling tool may be used as an end effector. The end effector may be required to move from waypoint A to waypoint B to perform a given maneuver. For example, waypoints A and B may be the end points of a weld line, in the case of a tip of a welding gun. As another example, waypoints A and B may be representative of a drill path and depth. The first mode may be used to record the coordinates of the waypoints A and B in the coordinate system, for subsequently performing these tasks.
As another example, the end effector may be required to be in a given orientation when performing a task. In the example of the welding gun used as an end effector, the tip of the welding gun may be at a given angle to perform the weld. In the example of the drilling tool used as an end effector, the recorded orientation may have a drill bit collinear with a drilling path or offset in position or in orientation in relation to such a path.
In order to assist in determining a path of movement between waypoints A and B, the user may hand guide the end effector at the working end 10A to the waypoints A and/or B. The user may also use the interface 14, for instance to control the movement of the end effector at the working end 10A. For example, the enabling button 14A or equivalent may be used to control the movement of the robotic arm 10, in free hand movement. The jog mode may thus be useful for such movements, in that manipulations would not be required. The recording of a plurality of waypoints in the first mode may be part of a trajectory recording. For example, by holding a button, e.g., the capture button 14B, the first mode may enable a continuous or semi-continuous trajectory capture (e.g., spline interpolation or like registration event).
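The continuous or semi-continuous trajectory capture evoked above may be sketched as follows; the sampling rate, the pose format and the button/pose callbacks are assumptions for illustration.

```python
# Sketch of continuous trajectory capture in the first mode: while the
# capture button is held, poses of the tool center point (TCP) are
# sampled at a fixed rate. Callbacks and rate are assumptions.
import time

def capture_trajectory(button_is_held, read_tcp_pose, rate_hz: float = 50.0):
    waypoints = []
    period = 1.0 / rate_hz
    while button_is_held():
        waypoints.append(read_tcp_pose())  # e.g., (x, y, z, rx, ry, rz)
        time.sleep(period)
    return waypoints  # may then be spline-interpolated into a trajectory

# Usage with stubs standing in for the button and the TCP pose source:
samples = iter([True, True, True, False])
poses = capture_trajectory(lambda: next(samples),
                           lambda: (0.4, 0.0, 0.3, 0.0, 0.0, 0.0),
                           rate_hz=1000.0)
print(len(poses), "poses captured")
```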
The recorded waypoints may be ranges of positions that may be deemed acceptable in a workflow. For example, in pick and place tasks with a gripper, the gripper may deposit an item in a zone, as opposed to depositing it in a given position.
Another mode, referred to arbitrarily as a second mode or angular value waypoint recording mode, may also be used in the context of recording of waypoints, as per waypoint tile T4. The second mode has the waypoint recorded as angular values of the various motorized joint units 30 between adjacent links 20, as illustrated in
In
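The difference between the Cartesian value waypoint recording mode (first mode) and the angular value waypoint recording mode (second mode) may be sketched as follows; the data layouts are assumptions for illustration.

```python
# Sketch contrasting the two waypoint representations: a Cartesian
# waypoint stores the pose of the working end in the coordinate system,
# while an angular waypoint stores one angle per motorized joint unit.
from dataclasses import dataclass
from typing import Tuple

@dataclass
class CartesianWaypoint:  # first mode (Cartesian values)
    position: Tuple[float, float, float]     # x, y, z of the TCP
    orientation: Tuple[float, float, float]  # e.g., roll, pitch, yaw

@dataclass
class AngularWaypoint:    # second mode (angular values)
    joint_angles: Tuple[float, ...]          # one value per joint, rad

wp_a = CartesianWaypoint((0.4, 0.0, 0.3), (0.0, 3.14, 0.0))
wp_b = AngularWaypoint((0.0, 0.52, 1.05, 0.0, 0.79, 0.0))
print(wp_a, wp_b)
```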
In a third mode of the teaching module 120, a level or intensity of actions associated with the end effector may be recorded. For example, the end effector may be a gripper, for instance used in a pick and place operation. The third mode may be selected by the user for the levels of grasping force of the gripper to be recorded.
In another embodiment, the end effector can perform vacuuming. The third mode may be used to indicate when the end effector is to perform suction or release suction.
In a fourth mode, a force applied by the end effector at the working end 10A may be recorded. In a variant, the robotic arm 10 adopts a lock mode, and the user may exert a given pressure on the end effector. Force sensor(s) in the end effector and/or wrist 12 may measure the force, and record it. The robotic arm 10 could then reproduce the force vector during a task. As another possibility, the user may use the enabling button 14A to have the robotic arm 10 effect a movement in a direction, and manually oppose a force to the end effector. The force vector or impedance may be measured. The level button 14D could also be used in the process. The force applied and recorded may be indicative of a minimum and a maximum force.
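The recording of a force envelope in the fourth mode may be sketched as follows; the scalar force reading and the sampling count are assumptions for illustration.

```python
# Sketch of the fourth mode: with the arm in lock mode, force readings
# at the working end are sampled and a min/max envelope is recorded.
# The scalar reading and sample count are assumptions for illustration.
def record_force_envelope(read_force, n_samples: int = 100):
    samples = [read_force() for _ in range(n_samples)]  # force in N
    return min(samples), max(samples)  # taught minimum and maximum force

fmin, fmax = record_force_envelope(lambda: 12.0)
print(f"taught force range: {fmin} N to {fmax} N")
```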
In parallel to the capture of data using any one of the modes mentioned above, a graphic user interface of one of the user interfaces 102D may display the information as it is recorded. The user interface 102D may be used as a teach pendant. In order to assist the user in toggling through steps, color and/or light signaling may be used during a teaching sequence. For example, the light display 12D may be green to allow recordation in a selected mode. The light display 12D may flash or change colors when toggling between modes. A color (e.g., red) may be indicative of problems that must be resolved.
In any of the modes, the hand guiding may be facilitated by the locking of any combination of DOFs of the robotic arm 10. For example, the robotic arm 10 may be instructed to move only in a single DOF of the robotic arm 10, such as along a linear path, or to rotate about a single axis. Other examples include the end effector constrained to moving along a work plane, sliding along the side or the edge of a jig, being axially in line for example with a screwdriver or other similar tool, or moving spherically around a remote center of motion, such as moving around an object centered in jaws of a gripper mechanism.
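The locking of selected DOFs during hand guiding may be illustrated by the following sketch, in which a Cartesian velocity command is masked to the unlocked axes; a real admittance controller would apply such a constraint within its control loop.

```python
# Sketch of constraining hand guiding to selected DOFs: a measured
# Cartesian velocity command is masked so that only unlocked axes
# respond. A real admittance controller would apply this in its loop.
def constrain_motion(velocity, unlocked_axes):
    # velocity: (vx, vy, vz, wx, wy, wz); unlocked_axes: set of indices
    return tuple(v if i in unlocked_axes else 0.0
                 for i, v in enumerate(velocity))

# Allow movement along a single linear path (x axis only):
print(constrain_motion((0.1, 0.05, -0.02, 0.0, 0.1, 0.0), {0}))
# -> (0.1, 0.0, 0.0, 0.0, 0.0, 0.0)
```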
Accordingly, the various modes described herein may record positions, orientations, levels, forces, and/or impedance associated with the robotic arm 10, for instance as tied to a specific end effector. The execution module 130 may therefore populate given task workflows with the taught information, to perform tasks and maneuvers, as taught by the teaching module 120.
Referring to
Referring to
The system for teaching a robotic arm may therefore include some or all of the components described herein. For example, as part of the system for teaching a robotic arm 10, there may be included the interface 14, the processing unit such as the CPU 102A, the non-transitory computer-readable memory 102C, any of the interfaces 102D, the teaching module 120, and the execution module 130.
The system may alternatively or additionally be described as including a processing unit like the CPU 102A; and a non-transitory computer-readable memory 102C communicatively coupled to the processing unit and comprising computer-readable program instructions executable by the processing unit for: recording a position of the working end of the robotic arm in response to signaling from the user interface in a first mode; recording an orientation of the working end of the robotic arm in response to signaling from the user interface in a second mode; recording a force applied at the working end of the robotic arm in response to signaling from the user interface in a third mode; and toggling between the first mode, the second mode and the third mode in response to signaling from the user interface.
The programming table A of
An agnostic camera hand-eye vision system may be used, with a vision module being one of the peripherals 50 (
Once the vision task is configured, the vision system can perform workpiece matching operations and determine 3D poses of the located workpieces with respect to the frame of reference of the robotic arm 10. The detection results and workpiece locations are communicated in a similar fashion. The visual programming environment can then treat the detected poses as custom frames and apply a previously taught object manipulation routine. The object manipulation routine may include, for example, pre-approach, approach, grasping and retreat poses in the task of object picking. The detection and localization functionality can also be extended to support specifically crafted fiducials/landmarks and/or to create additional custom reference frames based on these. The fiducials, landmarks and/or reference frames may then be used to locate trays containing multiple workpieces or dynamically adjust affected Cartesian waypoints and adapt the robotic arm 10 to work in a flexible unstructured environment. Also, among other common uses enabled by the vision system are presence/absence detection, OCR, bar-code reading, etc.
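The application of a previously taught manipulation routine to a detected workpiece pose, as described above, amounts to a frame composition; the following planar (x, y, theta) sketch is a simplification for illustration, actual systems operating on full 3D poses.

```python
# Planar (x, y, theta) sketch of treating a detected workpiece pose as
# a custom frame: a grasp pose taught relative to a reference workpiece
# is re-expressed in the frame of each newly detected workpiece.
import math

def compose(frame, local):
    # frame, local: (x, y, theta); returns local expressed in base frame
    x, y, th = frame
    lx, ly, lth = local
    return (x + lx * math.cos(th) - ly * math.sin(th),
            y + lx * math.sin(th) + ly * math.cos(th),
            th + lth)

taught_grasp_in_workpiece = (0.0, 0.02, math.pi / 2)  # taught once
detected_workpiece_in_base = (0.55, -0.10, 0.3)       # from the vision system
print(compose(detected_workpiece_in_base, taught_grasp_in_workpiece))
```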
The programming table A may have a non-blocking feature by which some actions may occur concurrently. This is shown in
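The non-blocking feature may be illustrated by the following sketch, in which a long-running action proceeds in the background while the main execution sequence continues; the use of a thread pool and the camera_capture stand-in are assumptions for illustration.

```python
# Sketch of a non-blocking tile: a long-running action (here a camera
# capture stand-in) runs in the background while the main execution
# sequence of the programming table continues, and is joined later.
from concurrent.futures import ThreadPoolExecutor
import time

def camera_capture():
    time.sleep(0.1)  # stands in for acquisition/processing time
    return "detected_pose"

with ThreadPoolExecutor() as pool:
    pending = pool.submit(camera_capture)           # non-blocking tile starts
    print("subsequent tiles execute concurrently")  # main sequence continues
    print("result when needed:", pending.result())  # joined at a later tile
```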
Referring to
To define the Seek action, contact parameters must be quantified. Examples of parameters include Force Threshold, Speed of Displacement and Maximum Displacement. The frame of reference may be selected (e.g., tool or base), as well as the direction of movement, as a function of the selected frame of reference.
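A Seek action parameterized as above may be sketched as follows; the arm methods (move_increment, measured_force), the step size and the stub contact model are assumptions for illustration.

```python
# Sketch of a Seek action with the parameters named above (force
# threshold, speed of displacement, maximum displacement). The arm
# methods and the stub contact model are assumptions for illustration.
def seek_by_contact(arm, direction, force_threshold=5.0,
                    speed=0.01, max_displacement=0.05, step=0.001):
    travelled = 0.0
    while travelled < max_displacement:
        arm.move_increment(direction, step, speed)  # in selected frame
        travelled += step
        if arm.measured_force() >= force_threshold:
            return True, travelled   # contact found within the limits
    return False, travelled          # no object within max displacement

class StubArm:  # contact model: force builds as the arm presses on
    def __init__(self): self.f = 0.0
    def move_increment(self, direction, step, speed): self.f += 0.2
    def measured_force(self): return self.f

print(seek_by_contact(StubArm(), (0.0, 0.0, -1.0)))
```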
The programming interface of the robotic arm 10, as operated for example by the control system 100 (i.e., controller), may be described as having a programming table including rows and columns of cells, wherein the cells in the rows or in the columns represent an execution sequence; tiles (icons, tabs, etc.) positionable in the cells, each of the tiles representing at least one of an action, a decision, or a condition associated with the robotic arm; wherein, during operation of the robotic arm, a controller operates the robotic arm based on the execution sequence and on the tiles in the programming table.
In a variant, the cells in the rows represent the execution sequence. The cells in a common one of the columns may represent a condition sequence. Tiles in the cells in the common one of the columns forming the condition sequence may each be adjacent to another tile in a respective one of the rows, the other tile indicating an action and/or a decision of the condition sequence, i.e., some of the conditions may include an execution sequence by extending in the execution sequence direction. At least some of the tiles may have another part of a GUI of the programming interface showing one or more of the parameters related to the condition, action or decision. One of the tiles may be a waypoint tile identifying at least one waypoint position and/or orientation to which the controller directs the robotic arm. The waypoint tile may further include one or more parameter settings associated with a movement of the robotic arm to the waypoint position and/or orientation. The waypoint position and/or orientation of the waypoint tile may be set from a signal received from a wrist of the robotic arm. One of the tiles may be a script tile, according to which the controller operates the robotic arm as a function of the script. The script may be Python®. One of the tiles may be associated with an actuation of an end effector of the robotic arm. The end effector may be a gripper. The tile associated with the actuation may be to cause an opening of the gripper and/or a closing of the gripper. The tile associated with the actuation may also have one or more parameter settings associated with the actuation of the gripper, such as a closing force or a closing speed. The tile associated with the actuation may be to cause the end effector to seek the presence of an object by contact. The tile associated with the actuation may further include at least one parameter setting associated with the seek of the object, wherein the at least one parameter setting associated with the seek is one or more of a limit of contact force, a limit of displacement speed, and a constraint in trajectory of movement. One of the tiles may be associated with an operation of a vision system of the robotic arm. The operation of the vision system may include providing a video feed on the programming interface.
The present disclosure pertains to a system for teaching a robotic arm that may have a user interface at a working end of a robotic arm; a processing unit; and a non-transitory computer-readable memory communicatively coupled to the processing unit and comprising computer-readable program instructions executable by the processing unit for: recording a position of the working end of the robotic arm in response to signaling from the user interface in a first mode; recording an orientation of the working end of the robotic arm in response to signaling from the user interface in a second mode; and toggling between at least the first mode and the second mode in response to signaling from the user interface. Optionally, in the first mode, the robotic arm moves in Cartesian admittance as a response to a force applied to the working end, the working end having a constant orientation in the Cartesian admittance.
Optionally, in the second mode, the robotic arm moves in angular admittance as a response to a force applied to the working end, the robotic arm constraining movement to a rotation of a single joint in the angular admittance. The system may record a force applied at the working end of the robotic arm in response to signaling from the user interface in a third mode.
The controller system 100 may be described as a system for programming a robotic arm and robot cell, with a horizontal timeline featuring expandable/collapsible subroutines which can be re-used and which enable either a quick overview of the complete program or the ability to zoom/focus on specific elements that may be tuned. A programming table A may be programmed with tiles associated with native robot functions, or with functions associated with OEM or third-party plug-ins. The plug-ins may form a hardware-agnostic plug-in system for end effectors and vision systems/cameras, with blocks capable of being re-used and permitting the visualization of any type of action in a consistent fashion.
Among tiles T not shown, there may be: a grasping plug-in tile permitting any two- or three-finger gripper or single-acting vacuum gripper to be used in the same fashion (close/open, suction/no suction, force/speed adjustment if accessible); Pick, Place, Stack, Matrix, Screw, Insert, Follow and Find tiles may also be available. A method is associated with the controller system 100 to seamlessly convert the visual program to and from a text-based script format. Another method may be associated with the controller system 100 to define sub-programs that are re-usable and accessible to higher level programs. The teaching module 120 may include a variable manager to program the system with advanced, intelligent data types (points, orientations, camera data, strings, etc.) in private and global scopes. The variable manager may be capable of tracking multiple objects in a real-time database (e.g., objects on a conveyor or in a matrix).
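The conversion of the visual program to and from a text-based script format, mentioned above, may be sketched as follows; the one-line-per-tile format is an assumption for illustration, not the actual script format.

```python
# Sketch of converting the visual program to and from a text-based
# script: each tile is emitted as one line and parsed back the same
# way. The one-line-per-tile format is an assumption for illustration.
def table_to_script(tiles):
    # tiles: list of (kind, params) in execution order
    return "\n".join(f"{kind} {params}" for kind, params in tiles)

def script_to_table(script):
    tiles = []
    for line in script.strip().splitlines():
        kind, _, params = line.partition(" ")
        tiles.append((kind, params))
    return tiles

script = table_to_script([("waypoint", "x=0.4 y=0.0 z=0.3"),
                          ("gripper_close", "force=20")])
print(script)
print(script_to_table(script))
```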
Filing Document: PCT/CA2023/050063; Filing Date: Jan. 20, 2023; Country: WO
Related Application: U.S. Application No. 63/301,756; Date: Jan. 2022; Country: US