This disclosure relates to dynamically controlling a robotic arm.
Known robotic apparatuses having attached end-of-arm tools are used to manipulate workpieces. Known end-of-arm tools can be maneuvered to engage a workpiece at a known location, act on the workpiece, transport the workpiece to a new location, orient and engage the workpiece with other pieces, and then release the workpiece. Applications of robotic apparatuses with end-of-arm devices include, e.g., material handling, manufacturing, packaging, and testing. Known robotic apparatuses have reference coordinate frames, e.g., a spatial three-dimensional coordinate system, and can be controlled to place the end-of-arm tool at a fixed position in the reference coordinate frame.
Known robotic apparatuses use a vision sensor and a vision statement inserted in a computer program to command the robotic apparatus to move to a predetermined position in the reference coordinate frame, including applying a visual offset from the vision sensor in a static manner to command and correct maneuvering of the robotic apparatus. In such systems there is an initial or start position of the end-of-arm tool and a nominal goal position of the end-of-arm tool. Signal input from the vision sensor is used to locate the workpiece and define a vision-based offset to the nominal goal position of the end-of-arm tool. The robotic apparatus is commanded to move the end-of-arm tool to a fixed position defined by the nominal goal position corrected with the vision-based offset.
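By way of a non-limiting illustration of this known static correction, the following Python sketch computes a fixed target once, before motion begins; the function and argument names are hypothetical and are not taken from the disclosure.

```python
import numpy as np

def static_vision_corrected_target(nominal_goal, taught_workpiece_position,
                                   observed_workpiece_position):
    """Compute a fixed, offset-corrected goal once, before motion starts.

    All three arguments are 3D points in the robot's reference coordinate
    frame; the names are hypothetical and only illustrate the prose above.
    """
    nominal_goal = np.asarray(nominal_goal, dtype=float)
    # The vision sensor is read a single time; the offset between the taught
    # and observed workpiece locations is not refreshed during the move.
    vision_offset = (np.asarray(observed_workpiece_position, dtype=float)
                     - np.asarray(taught_workpiece_position, dtype=float))
    return nominal_goal + vision_offset  # fixed commanded position
```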
Known workpieces can be pliable or malleable, and dynamically moveable, resulting in changes in location of elements of the workpiece during the period of time when the robotic apparatus moves the end-of-arm tool. A change in location of an element of the workpiece can cause a mismatch between a final position of the end-of-arm tool and the nominal goal position of the workpiece.
Maneuvering an articulable robotic arm having an end-of-arm tool includes monitoring a position of a workpiece that is dynamically moveable, determining an initial position of the workpiece, determining an initial position of the end-of-arm tool, iteratively determining an updated position of the workpiece corresponding to the monitored position of the workpiece, and iteratively executing individual motion segments to control the articulable robotic arm to position the end-of-arm tool contiguous to the workpiece based upon the initial position of the end-of-arm tool, the initial position of the workpiece and the iteratively determined updated position of the workpiece.
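Read as a control flow, the summarized method is an iterative sense-and-move loop. The Python sketch below is one possible reading offered for illustration only; the callable names, the convergence test, and the tolerance are assumptions.

```python
import numpy as np

def maneuver_to_workpiece(get_workpiece_position, get_eoat_position,
                          command_motion_segment, tolerance=1e-3):
    """Illustrative loop only; the callables and the stopping criterion are
    assumptions, not taken from the disclosure."""
    eoat_pos = np.asarray(get_eoat_position())            # initial EOAT position
    workpiece_pos = np.asarray(get_workpiece_position())  # initial workpiece position
    while np.linalg.norm(workpiece_pos - eoat_pos) > tolerance:
        # Iteratively refresh the workpiece position from the locating sensors.
        workpiece_pos = np.asarray(get_workpiece_position())
        # Execute one individual motion segment toward the updated position.
        eoat_pos = np.asarray(command_motion_segment(eoat_pos, workpiece_pos))
    return eoat_pos
```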
One or more embodiments will now be described, by way of example, with reference to the accompanying drawings.
Referring now to the drawings, wherein the showings are for the purpose of illustrating certain exemplary embodiments only and not for the purpose of limiting the same, an articulable robotic arm 10 having an attached end-of-arm tool (EOAT) 15 is schematically illustrated, together with a base 20, a control module 5, locating sensing devices 30, 40, and a spatial three-dimensional coordinate system 200 having a point of origin A.
The position of the EOAT 15 is defined in terms of linear and rotational positions relative to the x-axis (Xr), the y-axis (Yr), and the z-axis (Zr) of the spatial three-dimensional coordinate system 200. The point of origin A is preferably referenced to and defined at an identifiable physical location on the robotic arm 10.
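Purely as an illustrative assumption, such a position can be represented in software as a six-component pose, three translations along and three rotations about the axes of the reference frame:

```python
from dataclasses import dataclass

@dataclass
class EoatPose:
    """Linear and rotational position of the EOAT in the reference frame.

    Translations are along Xr, Yr, Zr; rotations are about the same axes.
    Units (e.g., millimeters and radians) are an assumption for illustration.
    """
    x: float   # translation along Xr
    y: float   # translation along Yr
    z: float   # translation along Zr
    rx: float  # rotation about Xr
    ry: float  # rotation about Yr
    rz: float  # rotation about Zr
```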
The locating sensing devices 30, 40 can be oriented to identify and detect the position of a workpiece 25 and periodically monitor its position. Each of the locating sensing devices 30, 40 is preferably set up and calibrated to identify the workpiece 25 and locate the position of the workpiece 25 relative to the spatial three-dimensional coordinate system 200 when activated. In one embodiment, one of the locating sensing devices 30, 40 is mechanically fixed to the robotic arm 10 and moves therewith.
Exemplary locating sensing devices 30, 40 can include low-resolution, wide-range devices that are used to dynamically maneuver the robotic arm 10 in space. Low-resolution locating sensing devices include time-of-flight-based 3D laser rangefinders and flash LIDAR-based 3D image devices. Exemplary locating sensing devices 30, 40 can also include high-resolution devices having sufficient resolution to locate a position of the workpiece 25 with a narrow field-of-view and a focused sensing area. High-resolution devices include high-resolution image-based vision systems and triangulation-based short-range, high-precision 3D laser scanners. The locating sensing devices 30, 40 generate signals indicating a present position of the workpiece 25; these signals are transformed to the spatial three-dimensional coordinate system 200 and input to the control module 5.
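The mapping of a sensor reading into the spatial three-dimensional coordinate system 200 can be illustrated with a calibrated homogeneous transform; the sketch below assumes the sensor-to-reference calibration matrix is already known and is not described in the disclosure.

```python
import numpy as np

def to_reference_frame(point_in_sensor_frame, sensor_to_reference):
    """Map a 3D point measured by a locating sensing device into the
    reference coordinate system using a 4x4 homogeneous transform."""
    p = np.append(np.asarray(point_in_sensor_frame, dtype=float), 1.0)
    return (sensor_to_reference @ p)[:3]

# Example: a sensor offset 0.5 m along Xr with no rotation (assumed values).
T = np.eye(4)
T[0, 3] = 0.5
print(to_reference_frame([0.1, 0.2, 0.3], T))  # -> [0.6, 0.2, 0.3]
```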
The control module 5 executes algorithmic code stored therein to control the robotic arm 10. The control module 5 is preferably a general-purpose digital computer comprising a microprocessor or central processing unit, storage mediums comprising non-volatile memory including read only memory and electrically programmable read only memory, random access memory, a high speed clock, analog-to-digital and digital-to-analog circuitry, and input/output circuitry and devices and appropriate signal conditioning and buffer circuitry. The control module 5 has a set of control algorithms, comprising resident program instructions and calibrations stored in the non-volatile memory and executed to provide the respective functions described herein. Algorithms are executed by the central processing unit to monitor the signals from the locating sensing devices 30, 40 indicating the present position of the workpiece 25 and execute control routines to control actuators (not shown) of the robotic arm 10 using preset calibrations. The algorithms are preferably executed at regular intervals, for example every 10 milliseconds during ongoing operation.
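A fixed-period execution pattern of this kind can be sketched as follows; the 10 millisecond interval comes from the text, while the loop structure and callable names are assumptions.

```python
import time

LOOP_PERIOD_S = 0.010  # 10 millisecond control interval noted in the text

def control_cycle(read_sensors, run_control_routines):
    """One illustrative scheduling loop: sample the locating sensors, run
    the control routines, then sleep out the remainder of the fixed period."""
    while True:
        start = time.monotonic()
        workpiece_position = read_sensors()        # signals from devices 30, 40
        run_control_routines(workpiece_position)   # actuator commands for arm 10
        elapsed = time.monotonic() - start
        time.sleep(max(0.0, LOOP_PERIOD_S - elapsed))
```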
The workpiece 25 can comprise a device that includes flexible or malleable elements. Examples of flexible or malleable workpieces 25 illustrative of the concept include a wiring harness and a fascia element for a front end assembly of a vehicle. Alternatively, the workpiece 25 can comprise a device that is dynamically moving, e.g., on a conveying device.
The control module 5 executes code to perform local motion interpolation for the robotic arm 10 based upon the position of the workpiece 25 during each individual motion segment of the robotic arm (140), including filtering the interpolated segments (150). Individual control elements of the robotic arm 10, e.g., electro-mechanical servo devices (not shown), are controlled during each of the individual motion segments to maneuver the EOAT 15 to the preferred position, contiguous to and able to engage the workpiece 25, using control schemes that include PID controls in one embodiment (160). The robotic arm 10 is moved and positioned at the preferred position to engage the workpiece 25 at its updated position using a control scheme such as a PID feedback control loop. In implementing the control scheme for the robotic arm 10, a first trajectory 210 is defined comprising a difference between the initial position of the EOAT 15 and an initial position of the workpiece 25. The first trajectory 210 is segmented into small time increments that are driven by an iteratively executed position control law to move the robotic arm 10 along the first trajectory 210.
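A minimal discrete position control law of the kind named above, a PID loop driving the EOAT along small increments of the first trajectory 210, might look like the following sketch; the gains, step count, and simplified plant model are illustrative assumptions.

```python
import numpy as np

def pid_follow_first_trajectory(eoat_start, workpiece_start, steps=100,
                                kp=0.8, ki=0.05, kd=0.1, dt=0.01):
    """Segment the first trajectory into small time increments and drive the
    EOAT toward each intermediate setpoint with a PID position law.

    Gains, step count, time step, and the velocity-command plant model are
    assumptions, not values taken from the disclosure.
    """
    start = np.asarray(eoat_start, dtype=float)
    goal = np.asarray(workpiece_start, dtype=float)
    eoat = start.copy()
    integral = np.zeros_like(eoat)
    prev_error = goal - eoat
    for i in range(1, steps + 1):
        setpoint = start + (goal - start) * (i / steps)  # endpoint of segment i
        error = setpoint - eoat
        integral += error * dt
        derivative = (error - prev_error) / dt
        command = kp * error + ki * integral + kd * derivative  # PID output
        eoat = eoat + command * dt  # treat the PID output as a velocity command
        prev_error = error
    return eoat
```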
Referring again to the drawings, the workpiece(s) 25 can be moving at a constant speed, as on a conveyor, or at a random speed, as occurs at a loose end of a malleable or flexible part. A number of filtering techniques, such as Kalman filtering, can be applied to the sensor data streams to predict subsequent sensor data streams, permitting more advanced predictive local motion interpolation of the robotic arm for better smoothing between the sensor data streams. The sensor data streams can each comprise a relative position between the EOAT 15 and the position of the workpiece 25.
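As a hedged illustration of the Kalman filtering idea, predicting where the workpiece will be between sensor samples, the following constant-velocity filter tracks a single workpiece coordinate; the motion model and noise parameters are assumptions.

```python
import numpy as np

class ConstantVelocityKalman1D:
    """Track position and velocity of one workpiece coordinate so the next
    sensor sample can be predicted between measurements."""

    def __init__(self, dt, process_var=1e-3, meas_var=1e-2):
        self.x = np.zeros(2)                        # [position, velocity]
        self.P = np.eye(2)                          # state covariance
        self.F = np.array([[1.0, dt], [0.0, 1.0]])  # constant-velocity model
        self.Q = process_var * np.eye(2)            # process noise (assumed)
        self.H = np.array([[1.0, 0.0]])             # position is measured directly
        self.R = np.array([[meas_var]])             # measurement noise (assumed)

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[0]                            # predicted position

    def update(self, measured_position):
        y = measured_position - self.H @ self.x     # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)    # Kalman gain
        self.x = self.x + (K @ y).ravel()
        self.P = (np.eye(2) - K @ self.H) @ self.P
```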
The sensor data streams iteratively describe updated, present positions of the workpiece 25 at the local motion interpolation level of the robotic arm 10, allowing the inputs from the locating sensing devices 30, 40 to be dynamically incorporated in the commanded movement of the robotic arm 10. The iteratively generated sensor data streams guide movement of the robotic arm 10 relative to the malleable workpiece 25 when the sensor data stream vector Δp(t) is time-dependent. Using the sensor data stream dynamically comprises using sensor signals sampled at a time interval ΔTs that is applied at the system level without interaction from an operator. The operator commands operation to activate the locating sensing devices 30, 40. Thereafter, the locating sensing devices 30, 40 can guide the robotic arm 10 at the sensor data stream rate, blended with the motion rate of the robotic arm 10, to achieve interpolated local motion of the robotic arm 10. As a result, movement of the robotic arm 10 along a preferred path can be dynamically adjusted to an updated position vector of the workpiece 25 based upon the monitored position of the workpiece 25, using sensor data streaming from the plurality of locating sensing devices 30, 40.
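One way to read the blending of the sensor stream rate with the motion rate is that a new offset sample arrives every ΔTs while the local motion interpolator runs at a faster period; the sketch below, with assumed rates, linearly interpolates between consecutive samples at the motion rate.

```python
import numpy as np

def blend_sensor_with_motion_rate(prev_sample, new_sample, sensor_period, motion_period):
    """Yield intermediate offset vectors at the motion-interpolation rate
    between two consecutive sensor samples (assumed linear blending)."""
    prev_sample = np.asarray(prev_sample, dtype=float)
    new_sample = np.asarray(new_sample, dtype=float)
    steps = max(1, int(round(sensor_period / motion_period)))
    for k in range(1, steps + 1):
        alpha = k / steps
        yield (1.0 - alpha) * prev_sample + alpha * new_sample

# Example: a 100 ms sensor stream blended into a 10 ms interpolation cycle.
for offset in blend_sensor_with_motion_rate([0, 0, 0], [1.0, 0.5, 0.0], 0.100, 0.010):
    pass  # each offset would adjust the commanded path of the robotic arm
```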
In an embodiment wherein the locating sensing device 30 is fixed relative to the base 20, the workpiece(s) 25 can be malleable or moving relative to the spatial three-dimensional coordinate system 200. The locating sensing devices 30, 40 continuously monitor the position of the workpiece 25 and output the sensor data stream comprising the position of the workpiece 25, incremented over elapsed time. The control module 5 and robotic arm 10 use the sensor data streams as inputs and apply them to the local motion interpolation to dynamically adjust the final position of the EOAT 15 to be contiguous to the workpiece 25, thus permitting the EOAT 15 to engage the workpiece 25.
In an embodiment wherein the locating sensing device 30 is mounted on the robotic arm 10, the workpiece(s) 25 can be static and rigid. The sensors continuously sense the position of the workpiece 25 with respect to time and generate the sensor data stream relative to the spatial three-dimensional coordinate system 200. This increases the accuracy and resolution of the position of the workpiece(s) 25 as the EOAT 15 approaches the workpiece(s) 25 for engagement, e.g., for pickup or assembly.
The disclosure has described certain preferred embodiments and modifications thereto. Further modifications and alterations may occur to others upon reading and understanding the specification. Therefore, it is intended that the disclosure not be limited to the particular embodiment(s) disclosed as the best mode contemplated for carrying out this disclosure, but that the disclosure will include all embodiments falling within the scope of the appended claims.