The above and other objects, features, and advantages of the present invention will become more apparent from the description of the preferred embodiments as set forth below with reference to the accompanying drawings, wherein:
A robot simulation apparatus (hereinafter called the “simulation apparatus”) according to the present invention will be described below with reference to the drawings. Throughout the drawings, the same portions are designated by the same reference numerals, and the description of such portions, once given, will not be repeated hereafter. The simulation apparatus 1 of the embodiment shown in
The apparatus main unit 2 comprises a control section 6 and interfaces not shown. The control section 6 includes a board as a circuit member, a CPU, and various kinds of memories such as a ROM, a RAM, a nonvolatile memory, etc. A system program for controlling the overall operation of the simulation apparatus 1 is stored in the ROM. The RAM is used as temporary storage for data processed by the CPU. The nonvolatile memory stores not only operation program data and various set values for the robot 12, but also the programs and various data necessary for implementing the method to be described later.
The control section 6 is electrically connected to the display 3, keyboard, mouse 5, and other devices, such as a robot control device and CAD device not shown, via respective interfaces, and electrical signals are transferred between them. Each input signal is processed in the control section 6 to implement the corresponding function.
In one mode, the control section 6 implements the functions shown here. That is, the control section 6 comprises: a position data acquiring portion 8, which implements the function of converting data representing a two-dimensional position, specified on the screen of the display 3 using the mouse 5 as the position specifying portion, into data representing a three-dimensional position, and thereby acquiring the three-dimensional position of the destination of the end effector 13 of the robot 12; a shape data acquiring portion 9, which implements the function of acquiring shape data of a workpiece 14 at a position corresponding to the acquired three-dimensional position; and a position/orientation computing portion 10, which implements the function of computing the three-dimensional position and three-dimensional orientation of the robot 12 based on the acquired three-dimensional position and shape data.
In
The display 3 is constructed from a liquid crystal display or a CRT, etc., and a three-dimensional model of the robot 12 equipped with the end effector 13 and three-dimensional models of the workpiece 14 and peripheral devices not shown are graphically displayed on the screen of the display 3. In
As one method for specifying the two-dimensional position on the screen of the display 3, the mouse 5 may be moved so as to point the arrow 5a or a cross cursor to the desired position on the screen, as in the present embodiment.
In another mode of the display, a touch panel, which is a component integrally constructed with an LCD (Liquid Crystal Display) or the like, may be employed. Since the touch panel is constructed to detect the X-Y coordinates of the position touched with a finger or a pen, the need for the mouse 5 used as a pointing device in the present embodiment can be eliminated.
Next, the simulation apparatus of the present embodiment will be described with reference to the flowchart of
In step S1, three-dimensional wire models of the robot (not shown), workpiece 14, and peripheral devices (not shown) are graphically displayed, as shown in Figure 4A, on the screen of the display 3 so as to reflect their relative positions in the actual working environment.
In step S2, the two-dimensional position that coincides with the tip of the end effector is specified on the screen of the display 3 by operating the mouse so as to point the arrow 5a to that position, as shown in
In step S3, from the two-dimensional position specified on the screen, the three-dimensional position is computed by the position data acquiring portion 8, as shown in
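As an illustrative sketch (not taken from the embodiment), the conversion in step S3 can be thought of as casting a pick ray from the specified screen point into the scene. Here an orthographic camera looking along −z is assumed, and a single plane equation stands in for the workpiece's three-dimensional shape database; the function name and signature are hypothetical:

```python
# Hypothetical sketch: map a 2-D screen point to a 3-D point by casting a
# pick ray into the scene (orthographic camera assumed, viewing along -z).
# A single plane a*x + b*y + c*z + d = 0 stands in for the shape database.

def pick_3d_point(screen_xy, plane, view_dir=(0.0, 0.0, -1.0)):
    """Intersect the pick ray through screen_xy with the given plane."""
    a, b, c, d = plane
    ox, oy, oz = screen_xy[0], screen_xy[1], 0.0   # ray origin on screen plane
    dx, dy, dz = view_dir
    denom = a * dx + b * dy + c * dz
    if abs(denom) < 1e-12:
        return None                                # ray parallel to the face
    t = -(a * ox + b * oy + c * oz + d) / denom
    return (ox + dx * t, oy + dy * t, oz + dz * t)

# Pick the point under the cursor on the horizontal face z = -5.
print(pick_3d_point((2.0, 3.0), (0.0, 0.0, 1.0, 5.0)))   # (2.0, 3.0, -5.0)
```

A real implementation would intersect the ray with every face in the shape database and keep the nearest hit; this sketch only shows the geometry of the 2-D to 3-D conversion.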
In step S4, the three-dimensional shape database used in step S3 is searched to retrieve face, edge, vertex, and position data, etc. of the workpiece 14, thereby acquiring the shape data at the position closest to the determined three-dimensional position (shape data acquiring portion 9).
In step S5, the three-dimensional position and orientation of the robot 12 are computed from the shape data acquired in step S4. More specifically, when the position/orientation to be obtained in a three-dimensional space is denoted by P, P is expressed as P=(n, o, a, p), where n is the normal vector (vector in x direction), o is the orient vector (vector in y direction), a is the approach vector (vector in z direction), and p is the position vector.
Since the position vector p is already computed in step S3, the other orientation determining elements n, o, and a should be obtained in order to determine P. For example, when the shape data of the workpiece 14 represents a plane face, a is obtained as a normal vector (a1, b1, c1) from a plane equation a1x+b1y+c1z+d1=0.
The normal vector n is obtained from the approach vector a and the current position/orientation (n1, o1, a1) of the tool end point of the robot 12 as n = (o1 × a)/|o1 × a|. Here, o1 × a is the outer product, and |o1 × a| is the absolute value (magnitude) of the outer product.
The orient vector o is obtained as o = n × a, i.e., the outer product of the normal vector n and the approach vector a, which are perpendicular to each other.
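The orientation computation of step S5 for a plane face can be sketched as follows, directly following the formulas above: a is the unit normal of the face, n = (o1 × a)/|o1 × a|, and o = n × a. The plane coefficients and the current orient vector o1 are assumed to be given; the function names are hypothetical:

```python
import math

# Sketch of step S5 for a plane face, following the formulas in the text:
# a is the face normal, n = (o1 x a)/|o1 x a|, and o = n x a.

def cross(u, v):
    """Outer (cross) product of two 3-D vectors."""
    return (u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0])

def normalize(v):
    """Scale a vector to unit length."""
    m = math.sqrt(sum(x * x for x in v))
    return tuple(x / m for x in v)

def compute_orientation(plane, o1):
    """Return (n, o, a) for a face given by a1*x + b1*y + c1*z + d1 = 0."""
    a = normalize(plane[:3])      # approach vector: unit normal of the face
    n = normalize(cross(o1, a))   # normal vector n = (o1 x a)/|o1 x a|
    o = cross(n, a)               # orient vector o = n x a
    return n, o, a
```

For example, for the horizontal face z = 2 with the current orient vector o1 = (0, 1, 0), this yields a = (0, 0, 1), n = (1, 0, 0), and o = (0, −1, 0).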
Alternatively, the current position/orientation of the tool end point of the robot 12 may not be used, but from the obtained line data (line information), the direction of the line may be taken as n, and the remaining a may be obtained. When the line is an arc or a free-form curve, a can be obtained by taking the tangent from the obtained position as n.
In step S6, the robot 12 is moved to the computed three-dimensional position/orientation in an animated manner, and robot motion corresponding to a jogging motion of the robot 12 is performed by off-line simulation.
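The animated move of step S6 can be sketched, under the assumption (not stated in the text) that intermediate display poses are produced by linear interpolation of the position vector p between the current and target positions:

```python
# Minimal sketch (assumption, not from the embodiment) of the animated,
# jog-like motion of step S6: the displayed position is stepped from the
# current pose toward the computed target by linear interpolation.

def animate_move(p_start, p_goal, steps=10):
    """Yield intermediate positions from p_start (exclusive) to p_goal."""
    for i in range(1, steps + 1):
        t = i / steps
        yield tuple(s + (g - s) * t for s, g in zip(p_start, p_goal))
```

Each yielded position would be rendered in turn to produce the animation; orientation could be interpolated analogously.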
Next, a modified example of the simulation apparatus according to the present embodiment will be described. In this modified example, the control section 6 of the apparatus main unit 2 further comprises a first recalculation portion 20, which recalculates the three-dimensional orientation of the robot 12 when it is determined that the robot 12 moves outside of a predetermined operating range.
As shown in the flowchart of
A specific example of the recalculation method will be described with reference to a six-axis articulated robot equipped with a servo gun (not shown). The servo gun is rotated about the approach axis (z axis) of the TCP (Tool Center Point), and the robot is moved again. The position/orientation P to be obtained in the three-dimensional space is expressed as P=(n, o, a, p), as described earlier. To change the position/orientation P, the rotational angle θ of the servo gun is successively changed from 0 to 360 degrees in increments of an arbitrary number of degrees, for example, 10 degrees, and the position/orientation P at each angle is obtained.
Here, let Pθ be the rotation matrix whose columns are the normal vector n = (cosθ, −sinθ, 0, 0), the orient vector o = (sinθ, cosθ, 0, 0), the approach vector a = (0, 0, 1, 0), and the constant vector (0, 0, 0, 1); then the new position/orientation P in the three-dimensional space is obtained as P = Pθ·P.
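The sweep of candidate poses can be sketched as follows, building Pθ from the column vectors given above and evaluating Pθ·P at each increment of θ (the 10-degree step is the example value from the text; the function names are hypothetical):

```python
import math

def rot_z(theta_deg):
    """4x4 matrix with the columns n, o, a, and the constant vector
    given in the text: n = (cos t, -sin t, 0, 0), o = (sin t, cos t, 0, 0)."""
    c = math.cos(math.radians(theta_deg))
    s = math.sin(math.radians(theta_deg))
    return [[c,    s,   0.0, 0.0],
            [-s,   c,   0.0, 0.0],
            [0.0,  0.0, 1.0, 0.0],
            [0.0,  0.0, 0.0, 1.0]]

def matmul(A, B):
    """Product of two 4x4 matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def sweep_orientations(P, step_deg=10):
    """Candidate poses P' = P_theta . P for theta = 0, 10, ..., 350 degrees."""
    return [matmul(rot_z(theta), P) for theta in range(0, 360, step_deg)]
```

Each candidate pose would then be checked against the robot's operating range, as described next.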
If the position is outside the operating range over the entire range of the rotational angle θ from 0 to 360 degrees, it is determined that no solution exists, and the process is terminated with an error. If a solution is found, the robot is moved accordingly, after which the process is terminated. Steps S1 to S6 in this modified example are the same as those described earlier, and their description will not be repeated here.
Next, another modified example of the simulation apparatus according to the present embodiment will be described. In this modified example, as shown in
As shown in the flowchart of
In a specific example, as in the foregoing modified example, the rotational angle θ of the servo gun is successively changed from 0 to 360 degrees in increments of an arbitrary number of degrees, and the position/orientation P at each angle is obtained. If the position is outside the operating range over the entire range of θ, it is determined that no solution exists, and the process is terminated with an error. If a solution is found, the orientation is obtained using the center value of the feasible angle range, and the robot is moved accordingly, after which the process is terminated. Steps S1 to S6 in this modified example are the same as those described earlier, and the same description will not be repeated here.
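The center-value selection can be sketched as follows. The predicate in_range is an assumed stand-in for the real operating-range check of the robot, and the feasible angles are assumed to form one contiguous range, neither of which is specified in the text:

```python
# Hypothetical sketch of the second recalculation: scan the servo-gun
# rotation theta in fixed increments, collect the angles whose pose stays
# inside the operating range (in_range is an assumed stand-in for the real
# reachability check), and use the center of the feasible range.

def pick_center_angle(in_range, step_deg=10):
    feasible = [t for t in range(0, 360, step_deg) if in_range(t)]
    if not feasible:
        return None                          # no solution: report an error
    # Assume the feasible angles form one contiguous range; take its center.
    return (feasible[0] + feasible[-1]) // 2
```

For instance, if the pose stays in range only for θ between 30 and 90 degrees, the center value 60 degrees is used to compute the final orientation.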
As described above, according to the above embodiment and its other modes, by making use of shape data such as faces, lines, and vertices of the three-dimensional model of the workpiece 14, the robot 12 can be moved quickly, easily, and with high precision to the intended position/orientation in accordance with the application of the robot, such as deburring, arc welding, or spot welding, and the time required to study the application of the robot system can be shortened. Furthermore, even operators who are not skilled can perform robot jog motions appropriately and can study the application.
The present invention is not limited to the above embodiment, but can be modified in various ways without departing from the spirit and scope of the present invention. For example, in the modified example of the present embodiment, the control section 6 of the apparatus main unit 2 can be equipped with both the first and second recalculation portions 20, 21.
| Number | Date | Country | Kind |
|---|---|---|---|
| 2006-114813 | Apr 2006 | JP | national |