The present invention relates to an apparatus, a controller, and a method for generating image data of a movement path of an industrial machine.
Apparatuses configured to generate movement paths of industrial machines that perform work on workpieces are known (e.g., JP 5731463 B). When a work is performed on a workpiece by an industrial machine, defects may occur in a finished surface of the workpiece. There has been a need for techniques that facilitate identifying factors for such defects.
In an aspect of the present disclosure, an apparatus configured to generate image data of a movement path of an industrial machine includes a movement path generation section configured to generate the movement path of the industrial machine when performing a work on a workpiece; a running information acquisition section configured to acquire running information of the industrial machine when performing the work on the workpiece; and an image data generation section configured to generate the image data in which a first point on the movement path corresponding to a change point of first running information, and a second point on the movement path corresponding to a change point of second running information different from the first running information are highlighted on the movement path in display forms visually different from each other.
In another aspect of the present disclosure, a method for generating image data of a movement path of an industrial machine includes: generating the movement path of the industrial machine when performing a work on a workpiece; acquiring running information of the industrial machine when performing the work on the workpiece; and generating the image data in which a first point on the movement path corresponding to a change point of first running information, and a second point on the movement path corresponding to a change point of second running information different from the first running information are highlighted on the movement path in display forms visually different from each other.
According to the present disclosure, if a defect occurs in a finished surface of a workpiece when an industrial machine performs a work on the workpiece, an operator can easily identify a factor for the defect.
Hereinafter, embodiments of the present disclosure will be described in detail with reference to the drawings. In the various embodiments described below, similar elements are denoted by the same reference numeral, and redundant description thereof will be omitted. Referring first to
The industrial machine 12 is e.g. a machine tool (a lathe, milling machine, machining center, etc.), or an industrial robot (a vertical articulated robot, horizontal articulated robot, parallel link robot, etc.), and performs a predetermined work on a workpiece. The industrial machine 12 includes a movement mechanism 16 and a tool 18. The movement mechanism 16 includes at least one electric motor 20, and relatively moves the tool 18 with respect to a workpiece to be worked.
The electric motor 20 is e.g. a servomotor, and the controller 14 sends a command to the electric motor 20 so as to operate the movement mechanism 16 by driving the electric motor 20, thereby moving the tool 18 relative to the workpiece. The tool 18 is e.g. a cutting tool, a laser processing head, a welding torch, or a coating material applicator, and performs a predetermined work (e.g., cutting, laser processing, welding, or coating) on the workpiece.
As an example, when the industrial machine 12 is the machine tool, the movement mechanism 16 moves a processing table, on which the workpiece is set, in a horizontal direction, and moves a spindle head, on which the tool 18 (cutting tool) is set, in a vertical direction. As another example, when the industrial machine 12 is the industrial robot (articulated robot), the movement mechanism 16 includes a rotary barrel provided at a robot base so as to be rotatable about a vertical axis, a robot arm rotatably provided at the rotary barrel, and a wrist provided at a tip of the robot arm, and moves the tool 18 attached to the wrist to any position in a three-dimensional space.
The controller 14 controls the operation of the industrial machine 12. Specifically, the controller 14 is a computer including a processor 22, a memory 24, and an I/O interface 26. The processor 22 includes e.g. a CPU or a GPU, and executes arithmetic processing to perform various functions described later. The processor 22 is communicably connected to the memory 24 and the I/O interface 26 via a bus 28.
The memory 24 includes e.g. a ROM or RAM, and stores various types of data temporarily or permanently. The I/O interface 26 communicates with an external device, receiving data from and transmitting data to the external device under the control of the processor 22. In the present embodiment, a display device 30 and an input device 32 are communicably connected to the I/O interface 26 in a wireless or wired manner.
The display device 30 includes e.g. an LCD or an organic EL display, and the processor 22 transmits image data to the display device 30 via the I/O interface 26 so as to display an image on the display device 30. The input device 32 includes e.g. a keyboard, a mouse, or a touch sensor, and transmits information inputted by the operator to the processor 22 via the I/O interface 26. The display device 30 and the input device 32 may be provided integrally with the controller 14, or may be provided separately from the controller 14.
In the present embodiment, the processor 22 functions as an apparatus 50 configured to generate the image data of a movement path of the industrial machine 12. The function of the apparatus 50 will be described below. The processor 22 generates a movement path MP of the industrial machine 12 when performing a work on a workpiece. For example, the processor 22 generates the movement path MP of the tool 18 (specifically, a tool tip point, TCP, or the like), which is moved by the movement mechanism 16, with respect to the workpiece.
In the present embodiment, the processor 22 generates a movement path MP1 (first movement path) defined by a work program WP for performing the work on the workpiece. The work program WP is a computer program (e.g., a G code program) including a plurality of statements that define a plurality of target positions at which the tool 18 is to be arranged with respect to the workpiece, a minute line segment connecting two adjacent target positions, a target velocity of the tool 18 with respect to the workpiece, etc.
The processor 22 analyzes the work program WP and generates a command to be transmitted to the electric motor 20 in order to perform the work on the workpiece. In this way, the processor 22 operates the movement mechanism 16 in accordance with the work program WP so as to perform the work on the workpiece by the tool 18. The work program WP is stored in the memory 24.
The processor 22 generates the movement path MP1 in the three-dimensional space by analyzing the work program WP. The movement path MP1 is an aggregate of the minute line segments defined in the work program WP, i.e., the movement path of the movement mechanism 16 (tool 18) in terms of control (the commanded path). For example, assume that the industrial machine 12 forms a workpiece A in
In this case, an example of the movement path MP1 of the tool 18 with respect to the workpiece A when forming a connection portion A3 between a base portion A1 and a main body portion A2 of the workpiece A is illustrated in
The processor 22 generates such movement path MP1 from the work program WP. In
The processor 22 acquires running information of the industrial machine 12 when performing the work on the workpiece A. In the present embodiment, the processor 22 acquires the running information from the work program WP. The running information acquired from the work program WP includes e.g. information of a position PT1, a velocity VT1, acceleration aT1, a jerk αT1 and a movement direction dT1 of the tool 18 with respect to the workpiece A; information of a position (rotation angle) PM1, a velocity (rotation speed) VM1, acceleration (angular acceleration) aM1, a jerk (angular jerk) αM1 and a movement direction (rotational direction) dM1 of a rotary shaft of the electric motor 20; and information of a running mode RM of the industrial machine 12.
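By way of a non-limiting illustration (not part of the embodiment itself), the following Python sketch shows one way such running information could be computed as time-series data from tool positions sampled at a fixed period; all names (running_info_from_positions, dt, and so on) are hypothetical, and the finite-difference approach is an assumption rather than the method defined by the work program WP.

```python
import numpy as np

def running_info_from_positions(positions, dt):
    """Derive velocity, acceleration, jerk, and movement-direction time series
    from tool positions sampled at a fixed period dt (an (N, 3) array).
    This is a finite-difference sketch, not the method of the embodiment."""
    p = np.asarray(positions, dtype=float)
    v = np.gradient(p, dt, axis=0)   # velocity (VT1)
    a = np.gradient(v, dt, axis=0)   # acceleration (aT1)
    j = np.gradient(a, dt, axis=0)   # jerk (alpha_T1)
    speed = np.linalg.norm(v, axis=1)
    # Movement direction dT1 as unit vectors (zero where the tool is stationary).
    d = np.divide(v, speed[:, None], out=np.zeros_like(v), where=speed[:, None] > 0)
    return {"position": p, "velocity": v, "acceleration": a, "jerk": j, "direction": d}
```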
The processor 22 analyzes the work program WP, and acquires, as the running information, time-series data indicating, in time series, changes of the positions PT1 and PM1, the velocities VT1 and VM1, the acceleration aT1 and aM1, the jerks αT1 and αM1, the movement directions dT1 and dM1, and the running mode RM with respect to time t, based on the information of each statement included in the work program WP (e.g., the target position, minute line segment, and target velocity). Accordingly, the processor 22 functions as a running information acquisition section 54 (
In the example illustrated in
A line L3 illustrated in the lower side of the graph in
This phenomenon in which the difference ΔMP becomes large may similarly occur at a time point at which a change amount δ1 of the positions PT1 and PM1, the velocities VT1 and VM1, the acceleration aT1 and aM1, the jerks αT1 and αM1, or the movement directions dT1 and dM1 increases. At the time point at which the change amount δ1 increases (i.e., the difference ΔMP increases), the actual position (or path) of the tool 18 with respect to the workpiece during the work may be shifted from the target position (or path) defined by the work program, which may result in a defect in the finished surface of the workpiece A (e.g., an unintended line, pattern, or the like may be formed in the finished surface of the workpiece A when the workpiece is machined).
On the other hand, in the example illustrated in
On the other hand, the second running mode RM2 is a work mode, for example. The work mode is a running mode in which the work on the workpiece is performed by the tool 18 while the movement mechanism 16 moves the tool 18 at a velocity VT1_W (<VT1_P). In the work mode, the control gain GC is set to GC_W (>GC_P), and the time constant τ is set to τ_W (<τ_P). Accordingly, the responsiveness of the control of the movement mechanism 16 in the work mode is higher than that in the positioning mode.
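As a purely illustrative sketch (the numbers and names below are placeholders, not values taken from the embodiment), the relationship between the two running modes could be captured by a small parameter structure that merely respects VT1_W < VT1_P, GC_W > GC_P, and τ_W < τ_P.

```python
from dataclasses import dataclass

@dataclass
class ModeParameters:
    velocity: float       # commanded tool velocity VT1
    control_gain: float   # control gain GC
    time_constant: float  # time constant tau

# Placeholder numbers only; they merely encode VT1_P > VT1_W, GC_P < GC_W, tau_P > tau_W.
POSITIONING_MODE = ModeParameters(velocity=20000.0, control_gain=30.0, time_constant=0.20)
WORK_MODE = ModeParameters(velocity=2000.0, control_gain=100.0, time_constant=0.05)
```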
As discussed above, at the time point t0 at which the running mode RM of the industrial machine 12 is switched between the first running mode RM1 and the second running mode RM2, vibrations are generated in the tool 18, whereby the difference ΔMP may increase as indicated by the arrow C in
In the present embodiment, the processor 22 acquires a change point at which each change amount δ1 of the positions PT1 and PM1, the velocities VT1 and VM1, the acceleration aT1 and aM1, the jerks αT1 and αM1, and the movement directions dT1 and dM1 within the predetermined time δt exceeds a predetermined threshold value δth1 in the time-series data of the positions PT1 and PM1, the velocities VT1 and VM1, the acceleration aT1 and aM1, the jerks αT1 and αM1, and the movement directions dT1 and dM1.
Specifically, regarding positions PT1 and PM1, the processor 22 retrieves from the time-series data of the positions PT1 and PM1 the change point at which the change amount (i.e., the distance) between two positions within the predetermined time δt exceeds a threshold value δth1, and acquires each time point t0 at which such a change point occurs, for example.
Regarding the velocities VT1 and VM1, the acceleration aT1 and aM1, and the jerks αT1 and αM1, for example, the processor 22 retrieves the change point at which an absolute value of each change amount of the velocities VT1 and VM1, acceleration aT1 and aM1, and jerks αT1 and αM1 within the predetermined time δt exceeds the threshold value δth1 from the time-series data of the velocities VT1 and VM1, acceleration aT1 and aM1, and jerks αT1 and αM1, and acquires each time point t0 at which such a change point occurs.
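A minimal Python sketch of this threshold-based change point acquisition is shown below; it assumes a scalar series sampled at a fixed period dt, and the function name and parameters are illustrative only (for the positions PT1 and PM1, the absolute difference would be replaced by the distance between the two position vectors).

```python
import numpy as np

def change_points(values, dt, delta_t, threshold):
    """Return the time points t0 at which the change amount of a scalar
    running-information series over the window delta_t exceeds the threshold."""
    values = np.asarray(values, dtype=float)
    step = max(1, int(round(delta_t / dt)))          # samples spanning delta_t
    change = np.abs(values[step:] - values[:-step])  # change amount within delta_t
    hits = np.flatnonzero(change > threshold)
    return hits * dt                                 # time points t0 (window start)
```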
Further, regarding the movement direction dT1 of the tool 18, for example, the processor 22 calculates, as a change amount of the movement direction dT1 within the time δt from a first time point t1 to a second time point t2 (>t1) (i.e., δt = t2 − t1), an inner product IP of a movement direction dT1_t1 at the first time point t1 and a movement direction dT1_t2 at the second time point t2. For example, if each of the movement directions dT1_t1 and dT1_t2 is considered as a unit vector, and the angle between the movement directions dT1_t1 and dT1_t2 is represented as θ, the inner product IP can be calculated by the equation IP = cos θ (−1 ≤ IP ≤ 1).
As the change amount (angle θ) from the movement direction dT1_t1 to the movement direction dT1_t2 becomes larger from 0 degrees toward 180 degrees, the inner product IP becomes smaller within the range −1 ≤ IP ≤ 1. The processor 22 retrieves a change point at which the inner product IP falls below the predetermined threshold value δth1 from the time-series data of the movement direction dT1, and acquires the time point t0 at which such a change point occurs.
Alternatively, the processor 22 may calculate the angle θ between the movement direction dT1_t1 and the movement direction dT1_t2 as a change amount of the movement direction dT1 within the time δt, and retrieve a change point at which the angle θ exceeds the predetermined threshold value δth1 from the time-series data of the movement direction dT1, and then acquire the time point t0 at which the change point occurs.
Regarding the movement direction (rotation direction) dM1 of the rotary shaft of the electric motor 20, for example, the processor 22 determines that the change amount of the movement direction dM1 exceeds the predetermined threshold value δth1 (=180 degrees) when the movement direction dM1 is reversed, retrieves a change point at which the movement direction dM1 is reversed from the time-series data of the movement direction dM1, and then acquires the time point t0 at which such a change point occurs.
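The inner-product criterion for the movement direction dT1 and the reversal criterion for the movement direction dM1 could be sketched in Python as follows; this is purely illustrative, with hypothetical names and a fixed sampling period assumed.

```python
import numpy as np

def direction_change_points(directions, dt, delta_t, threshold_angle_deg):
    """Flag time points where the tool movement direction dT1 changes by more than
    threshold_angle_deg within delta_t, using the inner product IP = cos(theta)
    of the unit direction vectors at the two time points."""
    d = np.asarray(directions, dtype=float)
    step = max(1, int(round(delta_t / dt)))
    ip = np.einsum("ij,ij->i", d[:-step], d[step:])  # IP = cos(theta), in [-1, 1]
    theta = np.degrees(np.arccos(np.clip(ip, -1.0, 1.0)))
    return np.flatnonzero(theta > threshold_angle_deg) * dt

def reversal_points(motor_directions, dt):
    """Flag time points where the motor rotation direction dM1 is reversed
    (sign change), treated as a 180-degree change of direction."""
    s = np.sign(np.asarray(motor_directions, dtype=float))
    return (np.flatnonzero(s[:-1] * s[1:] < 0) + 1) * dt
```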
On the other hand, regarding the running mode RM, the processor 22 retrieves a change point at which the first running mode RM1 is switched to the second running mode RM2 from the time-series data of the running mode RM (
Next, the processor 22 specifies points on the movement path MP1 corresponding to the respective change points of the running information PT1, PM1, VT1, VM1, aT1, aM1, αT1, αM1, dT1, dM1, and RM. In the present embodiment, the processor 22 acquires a total of eleven kinds of running information, i.e., the positions PT1 and PM1, the velocities VT1 and VM1, the acceleration aT1 and aM1, the jerks αT1 and αM1, the movement directions dT1 and dM1, and the running mode RM.
Thus, regarding these eleven kinds of running information PT1, PM1, VT1, VM1, aT1, aM1, αT1, αM1, dT1, dM1, and RM, the processor 22 retrieves and specifies the points on the movement path MP1 corresponding to the time points t0 at which the change points of these pieces of running information occur. In this regard, since the time-series data of the running information PT1, PM1, VT1, VM1, aT1, aM1, αT1, αM1, dT1, dM1, and RM are associated with the statements included in the work program WP, the processor 22 can identify the statement of the work program WP that corresponds to the time point t0, thereby identifying the point on the movement path MP1 corresponding to the time point t0 via that statement.
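One simple way to realize this association is sketched below, under the assumption that the path positions are available as a time-stamped sample sequence (the embodiment instead goes through the corresponding statement of the work program WP); the names are hypothetical.

```python
import numpy as np

def path_points_at(times_t0, path_times, path_positions):
    """Map each change-point time t0 to a point on the movement path by
    interpolating the (monotonically time-stamped) path samples."""
    pos = np.asarray(path_positions, dtype=float)
    return np.stack(
        [np.interp(times_t0, path_times, pos[:, k]) for k in range(pos.shape[1])],
        axis=1,
    )
```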
In this way, the processor 22 identifies the plurality of points on the movement path MP1 corresponding to the respective change points of the running information PT1, PM1, VT1, VM1, aT1, aM1, αT1, αM1, dT1, dM1, and RM. Then, the processor 22 generates image data ID1 in which the identified plurality of points on the movement path MP1 are highlighted in display forms visually different from one another.
An example of an image of the image data ID1 is illustrated in
The arrow D1, the arrow D2, and the arrow D3 depicted in the image data ID1 are displayed in colors, shapes, or visual effects (flashing or the like) different from one another, for example, and can thus be visually distinguished from one another. Accordingly, the operator is able to visually recognize which of the running information each of the arrows D1, D2, and D3 corresponds to.
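A minimal rendering sketch using matplotlib is given below; the marker shapes, labels, and file name are assumptions, and serve only to show how each kind of running information could be given a visually distinct display form on the movement path MP1.

```python
import matplotlib.pyplot as plt

def render_image_data(path, highlights, outfile="ID1.png"):
    """Draw the movement path (an (N, 3) array) and highlight the change points
    of each kind of running information in a visually distinct display form.
    `highlights` maps a label such as "VT1" or "RM" to an (M, 3) array of points."""
    fig = plt.figure()
    ax = fig.add_subplot(projection="3d")
    ax.plot(path[:, 0], path[:, 1], path[:, 2], color="gray", linewidth=1)
    markers = ["^", "s", "o", "D", "v", "*"]  # one display form per running information
    for i, (label, pts) in enumerate(highlights.items()):
        ax.scatter(pts[:, 0], pts[:, 1], pts[:, 2],
                   marker=markers[i % len(markers)], s=60, label=label)
    ax.legend()
    fig.savefig(outfile)
```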
Note that, in the image data ID1 illustrated in
As stated above, in the present embodiment, the processor 22 functions as the movement path generation section 52, the running information acquisition section 54, the change point acquisition section 56, and the image data generation section 58 of the apparatus 50. The movement path generation section 52, the running information acquisition section 54, the change point acquisition section 56, and the image data generation section 58 constitute the apparatus 50.
The processor 22 transmits the generated image data ID1 to the display device 30 via the I/O interface 26, and the display device 30 displays the image of the image data ID1 as illustrated in
According to the present embodiment, if a defect occurs in the finished surface of the workpiece A when the industrial machine 12 performs the work on the workpiece A, the operator can easily identify a factor for the defect. More specifically, assume that, when the industrial machine 12 performs the work on the workpiece A in accordance with the work program WP, the position of the defect generated in the finished surface of the workpiece A matches (or is close to) the position of a point highlighted in the image data ID1.
In this case, the operator can immediately identify which of the change points of the running information PT1, PM1, VT1, VM1, aT1, aM1, αT1, αM1, dT1, dM1, and RM corresponds to the point that matches (or is close to) the position of the defect generated in the workpiece A. Thus, the operator can estimate that the factor for the defect results from the parameter relating to the identified running information PT1, PM1, VT1, VM1, aT1, aM1, αT1, αM1, dT1, dM1, or RM.
As a result, the operator is able to improve the precision of the work performed on the workpiece A and reduce the time necessary for startup of the mechanical system 10, by changing the work program WP or the parameter (e.g., the statement of the work program WP, the control gain GC, or the time constant τ) relating to the identified running information PT1, PM1, VT1, VM1, aT1, aM1, αT1, αM1, dT1, dM1, or RM.
The processor 22 may select, from the plurality of kinds of running information PT1, PM1, VT1, VM1, aT1, aM1, αT1, αM1, dT1, dM1, and RM, the running information whose corresponding point is to be highlighted in the image of the image data ID1, in response to the input information to the input device 32, and may highlight in the image only the point corresponding to the selected running information.
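A minimal sketch of such selection-based filtering, assuming the highlighted points are held in a dictionary keyed by a running-information label, might look as follows (the names are hypothetical).

```python
def selected_highlights(all_highlights, selected_labels):
    """Keep only the change points whose running information was selected by the
    operator (e.g., via check boxes); the remaining points are hidden from the image."""
    return {label: pts for label, pts in all_highlights.items() if label in selected_labels}

# Example: highlight only the velocity and running-mode change points.
# visible = selected_highlights(highlights, {"VT1", "RM"})
```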
Such an embodiment will be described with reference to
In the example illustrated in
The operator operates the input device 32 (e.g., a mouse) so as to input the input information for selecting the running information PT1, PM1, VT1, VM1, aT1, aM1, αT1, αM1, dT1, dM1 or RM to be highlighted, while viewing the image of the image data ID1 displayed on the display device 30.
In response to the input information to the input device 32, the processor 22 highlights the point corresponding to the selected running information PT1, PM1, VT1, VM1, aT1, aM1, αT1, αM1, dT1, dM1 or RM, and displays the check mark in the check column of the selected running information PT1, PM1, VT1, VM1, aT1, aM1, αT1, αM1, dT1, dM1, or RM. In the example depicted in
The processor 22 may display, in the image data ID1, the relating parameter of the running information PT1, PM1, VT1, VM1, aT1, aM1, αT1, αM1, dT1, dM1, or RM corresponding to the point highlighted in the image data ID1, in response to the input information to the input device 32. Such an embodiment will be described with reference to
The image data ID1 illustrated in
In the example illustrated in
The operator operates the input device 32 (e.g., the mouse) so as to input the input information for selecting the point highlighted in the image data ID1 (arrows D1, D2, D3), while viewing the image of the image data ID1 displayed on the display device 30. In response to the input information to the input device 32, the processor 22 displays, in the image data ID1, the relating parameter of the running information PT1, PM1, VT1, VM1, aT1, aM1, αT1, αM1, dT1, dM1, or RM corresponding to the selected points (arrows D1, D2, D3).
According to this configuration, in a case where the position of the defect generated in the workpiece A matches (is close to) the position of the point highlighted in the image data ID1, the operator is able to refer to the relating parameter of the running information PT1, PM1, VT1, VM1, aT1, aM1, αT1, αM1, dT1, dM1, or RM that may be the factor for the defect, and verify the factor in more detail.
Next, a mechanical system 100 according to another embodiment will be described with reference to
The sensor 104 includes e.g. a rotation detection sensor such as an encoder or a Hall element, or a force sensor such as a torque sensor or a force detection sensor, configured to detect the position PM2 of the rotary shaft of the electric motor 20, the torque applied to the rotary shaft, or the force applied to the movement mechanism 16. The sensor 104 transmits the detected position PM2, torque, or force, as feedback information FB, to the processor 22 via the I/O interface 26.
The processor 22 receives, from the sensor 104, the feedback information FB detected by the sensor 104 while operating the industrial machine 102 in accordance with the work program WP. For example, the processor 22 acquires the feedback information FB detected by the sensor 104 while operating the industrial machine 102 in accordance with the work program WP so as to actually perform the work on the workpiece A by the tool 18.
Alternatively, the processor 22 may acquire the feedback information FB detected by the sensor 104 while operating the industrial machine 102 in accordance with the work program WP without performing the actual work by the tool 18 (e.g., without activating the tool 18, or with the tool 18 being removed from the movement mechanism 16).
The processor 22 determines, from the feedback information FB from the sensor 104 (specifically, the position PM2), a position PT2 of the tool 18 relative to the workpiece A while operating the industrial machine 102 in accordance with the work program WP. Then, the processor 22 functions as the movement path generation section 52, and generates, from the position PT2, the movement path MP2 (second movement path) of the industrial machine 102 in the three-dimensional space.
The movement path MP2 corresponds to the actual movement path of the movement mechanism 16 when the industrial machine 102 is operated in accordance with the work program WP. An example of the movement path MP2 is illustrated in
Then, in a manner similar to the above-described method for acquiring the change points of the running information PT1, PM1, VT1, VM1, aT1, aM1, αT1, αM1, dT1, and dM1, the processor 22 functions as the change point acquisition section 56 to acquire a change point at which a change amount δ2 of each of the positions PT2 and PM2, the velocities VT2 and VM2, the acceleration aT2 and aM2, the jerks αT2 and αM2, and the movement directions dT2 and dM2 within a predetermined time δt exceeds a predetermined threshold value δth2 in the time-series data of the positions PT2 and PM2, the velocities VT2 and VM2, the acceleration aT2 and aM2, the jerks αT2 and αM2, and the movement directions dT2 and dM2.
The processor 22 acquires, in the time-series data, the change point of each of the positions PT2 and PM2, velocities VT2 and VM2, acceleration aT2 and aM2, jerks αT2 and αM2, and movement directions dT2 and dM2; and the time point t0 at which each change point occurs, and then identifies the points on the movement path MP2 corresponding to the respective change points.
The time-series data of the running information PT2, PM2, VT2, VM2, aT2, aM2, αT2, αM2, dT2 and dM2, and the movement path MP2 are data acquired from the feedback information FB, and are associated with each other via time t. Accordingly, the processor 22 is able to retrieve and identify the points on the movement path MP2 corresponding to the time points t0 at which the change points of the running information PT2, PM2, VT2, VM2, aT2, aM2, αT2, αM2, dT2, and dM2 acquired from the feedback information FB occur.
Then, the processor 22 functions as the image data generation section 58 to generate image data ID2 in which the points on the movement path MP2 corresponding to the change points of the running information PT2, PM2, VT2, VM2, aT2, aM2, αT2, αM2, dT2, and dM2 are highlighted on the movement path MP2 in display forms visually different from one another. An example of the image data ID2 is illustrated in
In the example illustrated in
The arrow D1, the arrow D2, and the arrow D3 depicted in the image data ID2 are displayed in different colors, different shapes, or different visual effects (flashing or the like) from one another, and thereby can be visually identified. Accordingly, the operator is able to visually recognize which of the running information each of the arrows D1, D2, and D3 in the image data ID2 corresponds to.
Note that, in the image data ID2 illustrated in
As described above, in the image data ID2, the points on the movement path MP2 corresponding to the change points of the plurality of pieces of running information different from one another acquired from the feedback information FB are highlighted on the movement path MP2 in display forms visually different from one another. The processor 22 may switch the image to be displayed on the display device 30 between the image illustrated in
Note that, as illustrated in
Further, as illustrated in
Furthermore, the processor 22 of the mechanical system 100 may acquire, as the running information, the time-series data of the positions PT1 and PM1, the velocities VT1 and VM1, the acceleration aT1 and aM1, the jerks αT1 and αM1, the movement directions dT1 and dM1, and the running mode RM; and the time-series data of the positions PT2 and PM2, the velocities VT2 and VM2, the acceleration aT2 and aM2, the jerks αT2 and αM2, and the movement directions dT2 and dM2.
In this case, the processor 22 may highlight, in the image data ID2, the points on the movement path MP2 corresponding to the change points of the running information PT2, PM2, VT2, VM2, aT2, aM2, αT2, αM2, dT2, and dM2, and also the points on the movement path MP2 corresponding to the change points of the running information PT1, PM1, VT1, VM1, aT1, aM1, αT1, αM1, dT1, dM1, and RM.
Further, in the mechanical system 100, the processor 22 may generate both the movement paths MP1 and MP2, display the movement paths MP1 and MP2 in the image data ID2, and highlight the points on the movement path MP1 corresponding to the change points of the running information PT1, PM1, VT1, VM1, aT1, aM1, αT1, αM1, dT1, dM1 and RM, as well as the points on the movement path MP2 (or the movement path MP1) corresponding to the change points of the running information PT2, PM2, VT2, VM2, aT2, aM2, αT2, αM2, dT2, and dM2.
The processor 22 may acquire the difference ΔMP between the movement path MP1 and the movement path MP2, and display the difference ΔMP in the image data ID2. Such an embodiment is illustrated in
For example, the processor 22 displays a region E corresponding to the difference ΔMP between the movement path MP1 and the movement path MP2 as a colored region (e.g., a red-colored region). The processor 22 may enlarge and display the difference ΔMP (i.e., the region E) in the image data ID2, in response to the input information to the input device 32.
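A simple nearest-point estimate of the difference ΔMP is sketched below in Python, under the assumption that both movement paths are available as sampled point arrays; the embodiment does not specify how ΔMP is computed, so this is illustrative only.

```python
import numpy as np

def path_difference(mp1, mp2):
    """For every sample on MP2, return the distance to the nearest sample on MP1,
    a simple nearest-point estimate of the difference DeltaMP between the
    commanded path and the actual (feedback) path."""
    mp1 = np.asarray(mp1, dtype=float)
    mp2 = np.asarray(mp2, dtype=float)
    dists = np.linalg.norm(mp2[:, None, :] - mp1[None, :, :], axis=2)
    return dists.min(axis=1)

# The region E could then be the samples where the difference exceeds a tolerance:
# region_E = np.flatnonzero(path_difference(mp1, mp2) > tolerance)
```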
In the above-described embodiments, the processor 22 may acquire, as the running information PT1, PM1, VT1, VM1, aT1, aM1, αT1, αM1, dT1, dM1, PT2, PM2, VT2, VM2, aT2, aM2, αT2, αM2, dT2, or dM2, time-series data of digital signals in which discrete numerical data exist in time series, without being limited to the time-series data of continuous analog signals as illustrated in
The running information RM may not be limited to the time-series data as illustrated in
The change point acquisition section 56 may be omitted from the above-described embodiments. In this case, the operator may manually identify and input the change points of the running information PT1, PM1, VT1, VM1, aT1, aM1, αT1, αM1, dT1, dM1, RM, PT2, PM2, VT2, VM2, aT2, aM2, αT2, αM2, dT2, or dM2 from the time-series data thereof, for example.
In the above-described embodiments, the points on the movement path MP corresponding to the change points of the running information PT1, PM1, VT1, VM1, aT1, aM1, αT1, αM1, dT1, dM1, RM, PT2, PM2, VT2, VM2, aT2, aM2, αT2, αM2, dT2, and dM2 are highlighted by the arrows D1, D2, and D3. However, the point on the movement path MP corresponding to the change point may be highlighted in any display form (e.g., colored dot, triangular point, or square point). Although the present disclosure has been described through the embodiments, the embodiments are not intended to limit the invention according to the claims.