APPARATUS, CONTROLLER, AND METHOD FOR GENERATING IMAGE DATA OF MOVEMENT PATH OF INDUSTRIAL MACHINE

Abstract
An apparatus facilitates identifying a factor for a defect if the defect occurs in a finished surface of a workpiece. The apparatus includes a movement path generation section configured to generate the movement path of the industrial machine when performing a work on a workpiece; a running information acquisition section configured to acquire running information of the industrial machine when performing the work on the workpiece; and an image data generation section configured to generate the image data in which a first point on the movement path corresponding to a change point of first running information, and a second point on the movement path corresponding to a change point of second running information different from the first running information are highlighted on the movement path in display forms visually different from each other.
Description
BACKGROUND OF THE INVENTION
1. Field of the Invention

The present invention relates to an apparatus, a controller, and a method for generating image data of a movement path of an industrial machine.


2. Description of the Related Art

Apparatuses configured to generate movement paths of industrial machines that perform works on workpieces are known (e.g., JP 5731463 B). When a work is performed on a workpiece by an industrial machine, defects may occur in a finished surface of the workpiece. There has been a need for techniques that facilitate identifying factors for such defects.


SUMMARY OF THE INVENTION

In an aspect of the present disclosure, an apparatus configured to generate image data of a movement path of an industrial machine includes a movement path generation section configured to generate the movement path of the industrial machine when performing a work on a workpiece; a running information acquisition section configured to acquire running information of the industrial machine when performing the work on the workpiece; and an image data generation section configured to generate the image data in which a first point on the movement path corresponding to a change point of first running information, and a second point on the movement path corresponding to a change point of second running information different from the first running information are highlighted on the movement path in display forms visually different from each other.


In another aspect of the present disclosure, a method for generating image data of a movement path of an industrial machine includes: generating the movement path of the industrial machine when performing a work on a workpiece; acquiring running information of the industrial machine when performing the work on the workpiece; and generating the image data in which a first point on the movement path corresponding to a change point of first running information, and a second point on the movement path corresponding to a change point of second running information different from the first running information are highlighted on the movement path in display forms visually different from each other.


According to the present disclosure, if a defect occurs in a finished surface of a workpiece when an industrial machine performs a work on the workpiece, an operator can easily identify a factor for the defect.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a block diagram of a mechanical system according to an embodiment.



FIG. 2 is a perspective view of a workpiece according to an embodiment.



FIG. 3 is an image of a movement path of an industrial machine when performing a work on the workpiece illustrated in FIG. 2.



FIG. 4 is an enlarged view of a region B in FIG. 3.



FIG. 5 illustrates an example of time-series data of running information.



FIG. 6 illustrates an example of time-series data of running information.



FIG. 7 illustrates an example of a picture of image data that is generated by an image data generation section illustrated in FIG. 1.



FIG. 8 is an enlarged view of a region B in FIG. 7.



FIG. 9 illustrates another example of a picture of image data that is generated by the image data generation section.



FIG. 10 illustrates yet another example of a picture of image data that is generated by the image data generation section.



FIG. 11 is a block diagram of a mechanical system according to another embodiment.



FIG. 12 is an enlarged view of the region B in FIG. 3.



FIG. 13 illustrates an example of a picture of image data that is generated by an image data generation section illustrated in FIG. 11.



FIG. 14 illustrates an example of a picture of image data in which a difference between movement paths is represented.





DETAILED DESCRIPTION

Hereinafter, embodiments of the present disclosure will be described in detail with reference to the drawings. In the various embodiments described below, similar elements are denoted by the same reference numeral, and redundant description thereof will be omitted. Referring first to FIG. 1, a mechanical system 10 according to an embodiment will be described. The mechanical system 10 includes an industrial machine 12 and a controller 14.


The industrial machine 12 is e.g. a machine tool (a lathe, milling machine, machining center, etc.), or an industrial robot (a vertical articulated robot, horizontal articulated robot, parallel link robot, etc.), and performs a predetermined work on a workpiece. The industrial machine 12 includes a movement mechanism 16 and a tool 18. The movement mechanism 16 includes at least one electric motor 20, and relatively moves the tool 18 with respect to a workpiece to be worked.


The electric motor 20 is e.g. a servomotor, and the controller 14 sends a command to the electric motor 20 so as to operate the movement mechanism 16 by driving the electric motor 20, thereby moving the tool 18 relative to the workpiece. The tool 18 is e.g. a cutting tool, a laser processing head, a welding torch, or a coating material applicator, and performs a predetermined work (e.g., cutting, laser processing, welding, or coating) on the workpiece.


As an example, when the industrial machine 12 is the machine tool, the movement mechanism 16 moves a processing table, on which the workpiece is set, in a horizontal direction, and moves a spindle head, on which the tool 18 (cutting tool) is set, in a vertical direction. As another example, when the industrial machine 12 is the industrial robot (articulated robot), the movement mechanism 16 includes a rotary barrel provided at a robot base so as to be rotatable about a vertical axis, a robot arm rotatably provided at the rotary barrel, and a wrist provided at a tip of the robot arm, and moves the tool 18 attached to the wrist to any position in a three-dimensional space.


The controller 14 controls the operation of the industrial machine 12. Specifically, the controller 14 is a computer including a processor 22, a memory 24, and an I/O interface 26. The processor 22 includes e.g. a CPU or a GPU, and executes arithmetic processing to perform various functions described later. The processor 22 is communicably connected to the memory 24 and the I/O interface 26 via a bus 28.


The memory 24 includes e.g. a ROM or RAM, and stores various types of data temporarily or permanently. The I/O interface 26 communicates with an external device, receives data from the external device, and transmits data to the external device, under the control of the processor 22. In the present embodiment, a display device 30 and an input device 32 are communicably connected to the I/O interface 26 in a wireless or wired manner.


The display device 30 includes e.g. an LCD or an organic EL display, and the processor 22 transmits image data to the display device 30 via the I/O interface 26 to display the image on the display device 30. The input device 32 includes e.g. a keyboard, mouse, or touch sensor, and transmits information inputted by the operator to the processor 22 via the I/O interface 26. The display device 30 and the input device 32 may be provided integrally with the controller 14, or may be provided separately from the controller 14.


In the present embodiment, the processor 22 functions as an apparatus 50 configured to generate the image data of a movement path of the industrial machine 12. The function of the apparatus 50 will be described below. The processor 22 generates a movement path MP of the industrial machine 12 when performing a work on a workpiece. For example, the processor 22 generates the movement path MP of the tool 18 (specifically, a tool tip point (TCP) or the like), which is moved by the movement mechanism 16, with respect to the workpiece.


In the present embodiment, the processor 22 generates a movement path MP1 (first movement path) defined by a work program WP for performing the work on the workpiece. The work program WP is a computer program (e.g., a G code program) including a plurality of statements that define a plurality of target positions at which the tool 18 is to be arranged with respect to the workpiece, a minute line segment connecting two adjacent target positions, a target velocity of the tool 18 with respect to the workpiece, etc.


The processor 22 analyzes the work program WP and generates a command to be transmitted to the electric motor 20 in order to perform the work on the workpiece. In this way, the processor 22 operates the movement mechanism 16 in accordance with the work program WP so as to perform the work on the workpiece by the tool 18. The work program WP is stored in the memory 24.


The processor 22 generates the movement path MP1 in the three-dimensional space by analyzing the work program WP. The movement path MP1 is an aggregate of the minute line segments defined in the work program WP. The movement path MP1 is a movement path in control of the movement mechanism 16 (tool 18). For example, assume that the industrial machine 12 forms a workpiece A in FIG. 2 by the tool 18.


In this case, an example of the movement path MP1 of the tool 18 with respect to the workpiece A when forming a connection portion A3 between a base portion A1 and a main body portion A2 of the workpiece A is illustrated in FIGS. 3 and 4. Note that FIG. 4 is an enlarged view of region B in FIG. 3. Each of lines illustrated in FIG. 4 constitutes the movement path MP1, and an outer surface of the connection portion A3 is defined by the movement path MP1.


The processor 22 generates such a movement path MP1 from the work program WP. In FIGS. 3 and 4, the movement path MP1 when forming the connection portion A3 of the workpiece A is exemplified. However, it will be appreciated that the processor 22 can similarly generate the movement path MP1 for forming the base portion A1 and the main body portion A2 of the workpiece A from the work program WP. Thus, in the present embodiment, the processor 22 functions as a movement path generation section 52 (FIG. 1) configured to generate the movement path MP1.


The processor 22 acquires running information of the industrial machine 12 when performing the work on the workpiece A. In the present embodiment, the processor 22 acquires the running information from the work program WP. The running information acquired from the work program WP includes e.g. information of a position PT1, a velocity VT1, acceleration aT1, a jerk αT1 and a movement direction dT1 of the tool 18 with respect to the workpiece A; information of a position (rotation angle) PM1, a velocity (rotation speed) VM1, acceleration (angular acceleration) aM1, a jerk (angular jerk) αM1 and a movement direction (rotational direction) dM1 of a rotary shaft of the electric motor 20; and information of a running mode RM of the industrial machine 12.


The processor 22 analyzes the work program WP, and acquires, as the running information, time-series data indicating, in time series, changes of the positions PT1 and PM1, the velocities VT1 and VM1, the acceleration aT1 and aM1, the jerks αT1 and αM1, the movement directions dT1 and dM1, and the running mode RM with respect to time t, based on the information of each statement included in the work program WP (e.g., the target position, minute line segment, target velocity). Accordingly, the processor 22 functions as a running information acquisition section 54 (FIG. 1) configured to acquire the running information.
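Although the disclosure derives these time series by analyzing the work program WP, the relationship between the quantities can be sketched in a few lines of illustrative Python (not part of the claimed subject matter): given tool positions sampled at a fixed interval dt (both assumed here), the velocity, acceleration, and jerk series follow by successive finite differences.

```python
import numpy as np

def derive_running_info(positions, dt):
    """Derive velocity (VT1), acceleration (aT1), and jerk (alphaT1)
    time-series data from tool positions sampled every `dt` seconds.
    `positions` has shape (N, 3); finite differences stand in for the
    analysis of the work program described in the text."""
    velocity = np.gradient(positions, dt, axis=0)
    acceleration = np.gradient(velocity, dt, axis=0)
    jerk = np.gradient(acceleration, dt, axis=0)
    return velocity, acceleration, jerk
```

For a tool moving at constant velocity, the acceleration and jerk series are zero, so no change point would be flagged along that segment.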



FIGS. 5 and 6 each illustrate an example of a graph of time-series data of running information. FIG. 5 exemplifies a line L1 indicating the time-series data of the position PT1 of the tool 18, and a line L2 indicating the time-series data of the velocity VT1 of the tool 18, where the horizontal axis represents the time t, and the vertical axis represents a position P and a velocity V. On the other hand, FIG. 6 exemplifies a line L4 indicating the time-series data of the running mode RM, where the horizontal axis represents the time t and the vertical axis represents the running mode RM.


In the example illustrated in FIG. 5, the tool 18 moves in a direction toward a position P0 while gradually decelerating, and reaches the position P0 at a time point t0. Subsequently, the tool 18 moves in a direction away from the position P0 while accelerating. In this case, a change amount of the velocity VT1 within a predetermined time δt including the time point t0 is δVT1 in FIG. 5, and a change amount in the movement direction dT1 of the tool 18 is 180 degrees (i.e., reversed).


A line L3 illustrated on the lower side of the graph in FIG. 5 indicates a difference ΔMP between the movement path MP1 of the tool 18 acquired from the work program WP and an actual movement path MP2 of the tool 18 when the movement mechanism 16 is operated in accordance with the work program WP. As indicated by the line L3, the difference ΔMP becomes large as indicated by an arrow C in the graph at the time point t0 (strictly, a time point immediately before the time point t0) at which the change amount in the movement direction dT1 is large (e.g., the movement direction is reversed).


This phenomenon in which the difference ΔMP becomes large may similarly occur also at a time point at which a change amount δ1 of the positions PT1 and PM1, the velocities VT1 and VM1, the acceleration aT1 and aM1, the jerks αT1 and αM1, or the movement direction dM1 increases. At the time point at which the change amount δ1 increases (i.e., the difference ΔMP increases), the actual position (or path) of the tool 18 with respect to the workpiece during the work may be shifted from the target position (or path) defined by the work program, which may result in the generation of a defect in the finished surface of the workpiece A (e.g., an unintended line, pattern, or the like may be formed in the finished surface of the workpiece A when the workpiece is machined).


On the other hand, in the example illustrated in FIG. 6, the running mode RM of the industrial machine 12 is switched from a first running mode RM1 to a second running mode RM2. For example, the first running mode RM1 is a positioning mode. The positioning mode is a running mode in which the movement mechanism 16 moves the tool 18 to a work start point at a velocity VT1_P in a state where the tool 18 does not perform the work on the workpiece, for example. In the positioning mode, a control gain GC, which determines a response speed of the control of the movement mechanism 16 by the processor 22, is set to GC_P. Further, a time constant τ when the movement mechanism 16 accelerates or decelerates the tool 18 is set to τP.


On the other hand, the second running mode RM2 is a work mode, for example. The work mode is a running mode in which the work on the workpiece is performed by the tool 18 while the movement mechanism 16 moves the tool 18 at a velocity VT1_W (<VT1_P). In the work mode, the control gain GC is set to GC_W (>GC_P), and the time constant τ is set to τW (<τP). Accordingly, the responsiveness of the control of the movement mechanism 16 in the work mode is higher than that in the positioning mode.


As discussed above, at the time point t0 at which the running mode RM of the industrial machine 12 is switched between the first running mode RM1 and the second running mode RM2, vibrations are generated in the tool 18, whereby the difference ΔMP may increase as indicated by the arrow C in FIG. 6. Note that the first running mode RM1 may be the work mode and the second running mode RM2 may be the positioning mode, or the first running mode RM1 and the second running mode RM2 may be any running modes different from the work mode and positioning mode.


In the present embodiment, the processor 22 acquires a change point at which each change amount δ1 of the positions PT1 and PM1, the velocities VT1 and VM1, the acceleration aT1 and aM1, the jerks αT1 and αM1, and the movement directions dT1 and dM1 within the predetermined time δt exceeds a predetermined threshold value δth1 in the time-series data of the positions PT1 and PM1, the velocities VT1 and VM1, the acceleration aT1 and aM1, the jerks αT1 and αM1, and the movement directions dT1 and dM1.


Specifically, regarding the positions PT1 and PM1, the processor 22 retrieves, from the time-series data of the positions PT1 and PM1, the change point at which the change amount (i.e., the distance) between two positions within the predetermined time δt exceeds the threshold value δth1, and acquires each time point t0 at which such a change point occurs, for example.
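As an illustrative sketch (the data layout and function names are assumptions, not part of the disclosure), this position change-point search amounts to flagging every window of length δt over which the travelled distance exceeds δth1:

```python
import numpy as np

def position_change_points(positions, times, delta_t, threshold):
    """Return each time point t0 at which the change amount (distance)
    of the position within the window delta_t exceeds `threshold`.
    `positions` is an (N, 3) array sampled at the instants in `times`
    (uniform sampling is assumed)."""
    # number of samples spanning the window delta_t
    step = max(1, int(round(delta_t / (times[1] - times[0]))))
    change_times = []
    for i in range(len(positions) - step):
        if np.linalg.norm(positions[i + step] - positions[i]) > threshold:
            change_times.append(times[i + step])
    return change_times
```

The same windowed comparison applies to the velocity, acceleration, and jerk series by substituting those arrays for the positions and taking absolute values of the per-window change.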


Regarding the velocities VT1 and VM1, the acceleration aT1 and aM1, and the jerks αT1 and αM1, for example, the processor 22 retrieves the change point at which an absolute value of each change amount of the velocities VT1 and VM1, acceleration aT1 and aM1, and jerks αT1 and αM1 within the predetermined time δt exceeds the threshold value δth1 from the time-series data of the velocities VT1 and VM1, acceleration aT1 and aM1, and jerks αT1 and αM1, and acquires each time point t0 at which such a change point occurs.


Further, regarding the movement direction dT1 of the tool 18, for example, the processor 22 calculates, as a change amount of the movement direction dT1 within the time δt from a first time point t1 to a second time point t2 (>t1) (i.e., δt=t2−t1), an inner product IP of a movement direction dT1_t1 at the first time point t1 and a movement direction dT1_t2 at the second time point t2. For example, if each of the movement directions dT1_t1 and dT1_t2 is considered as a unit vector, and an angle between the movement directions dT1_t1 and dT1_t2 is represented as θ, the inner product IP can be calculated by the equation IP=cos θ (−1≤IP≤1).


As the change amount (angle θ) from the movement direction dT1_t1 to the movement direction dT1_t2 becomes larger from 0 degrees toward 180 degrees, the inner product IP becomes smaller within the range of −1≤IP≤1. The processor 22 retrieves, from the time-series data of the movement direction dT1, a change point at which the inner product IP falls below the predetermined threshold value δth1, and acquires the time point t0 at which such a change point occurs.


Alternatively, the processor 22 may calculate the angle θ between the movement direction dT1_t1 and the movement direction dT1_t2 as a change amount of the movement direction dT1 within the time δt, and retrieve a change point at which the angle θ exceeds the predetermined threshold value δth1 from the time-series data of the movement direction dT1, and then acquire the time point t0 at which the change point occurs.
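Both direction criteria above (the inner product IP = cos θ and the angle θ itself) reduce to the same comparison. A minimal sketch, assuming sampled direction vectors (names and layout are illustrative, not part of the disclosure):

```python
import numpy as np

def direction_change_points(directions, times, ip_threshold):
    """Flag time points where the inner product IP = cos(theta) of
    consecutive movement directions falls below `ip_threshold`.
    A small IP means a large change of direction; IP = -1 means the
    movement direction is reversed (theta = 180 degrees)."""
    change_times = []
    for i in range(len(directions) - 1):
        d1 = directions[i] / np.linalg.norm(directions[i])
        d2 = directions[i + 1] / np.linalg.norm(directions[i + 1])
        if float(np.dot(d1, d2)) < ip_threshold:  # IP = cos(theta)
            change_times.append(times[i + 1])
    return change_times
```

Thresholding on the angle θ instead is equivalent: θ > θth corresponds to IP < cos θth, since cos is monotonically decreasing on [0°, 180°].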


Regarding the movement direction (rotation direction) dM1 of the rotary shaft of the electric motor 20, for example, the processor 22 determines that the change amount of the movement direction dM1 exceeds the predetermined threshold value δth1 (=180 degrees) when the movement direction dM1 is reversed, retrieves a change point at which the movement direction dM1 is reversed from the time-series data of the movement direction dM1, and then acquires the time point t0 at which such a change point occurs.


On the other hand, regarding the running mode RM, the processor 22 retrieves a change point at which the first running mode RM1 is switched to the second running mode RM2 from the time-series data of the running mode RM (FIG. 6), and acquires the time point t0 at which such a change point occurs. In this way, the processor 22 acquires each change point of the positions PT1 and PM1, the velocities VT1 and VM1, the acceleration aT1 and aM1, the jerks αT1 and αM1, the movement directions dT1 and dM1, and the running mode RM in the time-series data, and acquires the time point t0 at which each change point occurs. Accordingly, the processor 22 functions as a change point acquisition section 56 (FIG. 1).
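The running-mode criterion is simpler: a change point is any sample at which RM differs from the previous sample. A one-line sketch (the mode labels are illustrative):

```python
def mode_change_points(modes, times):
    """Return the time points at which the running mode RM switches,
    e.g. from a positioning mode ("RM1") to a work mode ("RM2").
    `modes` and `times` are parallel sequences."""
    return [times[i] for i in range(1, len(modes)) if modes[i] != modes[i - 1]]
```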


Next, the processor 22 specifies points on the movement path MP1 corresponding to the respective change points of the running information PT1, PM1, VT1, VM1, aT1, aM1, αT1, αM1, dT1, dM1, and RM. In the present embodiment, the processor 22 acquires a total of eleven kinds of running information, i.e., the positions PT1 and PM1, the velocities VT1 and VM1, the acceleration aT1 and aM1, the jerks αT1 and αM1, the movement directions dT1 and dM1, and the running mode RM.


Thus, regarding these eleven kinds of running information PT1, PM1, VT1, VM1, aT1, aM1, αT1, αM1, dT1, dM1, and RM, the processor 22 retrieves and specifies the points on the movement path MP1 corresponding to the time points t0 at which the change points of these pieces of running information occur. In this regard, since the time-series data of the running information PT1, PM1, VT1, VM1, aT1, aM1, αT1, αM1, dT1, dM1, and RM are associated with the statements included in the work program WP, the processor 22 can identify the statement of the work program WP that corresponds to the time point t0, and thereby identify the point on the movement path MP1 corresponding to the time point t0 via the statement of the work program WP.
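The indirection described above — from a change-point time t0, to the work-program statement active at t0, to the corresponding point on the movement path MP1 — can be sketched as a lookup over statement execution intervals (the tuple layout is an assumption for illustration, not part of the disclosure):

```python
def locate_path_points(change_times, statement_intervals):
    """Map each change-point time t0 to the work-program statement
    that was executing at t0; the statement in turn identifies the
    minute line segment (path point) on the movement path MP1.
    `statement_intervals` is a list of (start, end, statement_id)
    tuples covering the program execution timeline."""
    points = []
    for t0 in change_times:
        for start, end, statement_id in statement_intervals:
            if start <= t0 < end:
                points.append((t0, statement_id))
                break
    return points
```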


In this way, the processor 22 identifies the plurality of points on the movement path MP1 corresponding to the respective change points of the running information PT1, PM1, VT1, VM1, aT1, aM1, αT1, αM1, dT1, dM1, and RM. Then, the processor 22 generates image data ID1 in which the identified plurality of points on the movement path MP1 are highlighted in display forms visually different from one another.


An example of an image of the image data ID1 is illustrated in FIGS. 7 and 8. In the example illustrated in FIG. 7, a point on the movement path MP1 corresponding to a change point of first running information (e.g., the acceleration aT1 of the tool 18) is highlighted by an arrow D1. A point on the movement path MP1 corresponding to a change point of second running information (e.g., the movement direction dM1 of the rotary shaft of the electric motor 20) is highlighted by an arrow D2. Further, a point on the movement path MP1 corresponding to a change point of third running information (e.g., the velocity VT1 of the tool 18) is highlighted by an arrow D3. FIG. 8 is an enlarged view of region B in FIG. 7.


The arrows D1, D2, and D3 depicted in the image data ID1 are displayed in colors, shapes, or visual effects (flashing or the like) different from one another, for example, so that they can be visually distinguished. Accordingly, the operator is able to visually recognize which piece of running information each of the arrows D1, D2, and D3 corresponds to.


Note that, in the image data ID1 illustrated in FIG. 7, the points corresponding to the change points of the total of three kinds of running information (e.g., aT1, dM1, VT1) are highlighted, for ease of understanding. However, for all of the eleven kinds of running information PT1, PM1, VT1, VM1, aT1, aM1, αT1, αM1, dT1, dM1, and RM, the points corresponding to the change points thereof may be highlighted in the image data ID1. As described above, in the image data ID1, the points on the movement path MP1 corresponding to the change points of the plurality of pieces of running information different from one another are highlighted on the movement path MP1 in display forms visually different from one another. Accordingly, the processor 22 functions as an image data generation section 58 (FIG. 1) configured to generate the image data ID1.
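One way to realize "display forms visually different from one another" is to key each kind of running information to its own color and marker when building the image data; the specific colors and arrow labels below are design choices for illustration, not taken from the disclosure:

```python
# Hypothetical display forms per kind of running information.
DISPLAY_FORMS = {
    "aT1": {"color": "red",   "marker": "D1"},
    "dM1": {"color": "blue",  "marker": "D2"},
    "VT1": {"color": "green", "marker": "D3"},
}

def build_image_data(path_points):
    """Tag each highlighted point on the movement path MP1 with a
    visually distinct display form. `path_points` is a list of
    (xyz, info_kind) pairs (illustrative layout); the result could
    then be rendered by any drawing backend."""
    return [
        {"position": xyz, "info": kind, **DISPLAY_FORMS[kind]}
        for xyz, kind in path_points
    ]
```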


As stated above, in the present embodiment, the processor 22 functions as the movement path generation section 52, the running information acquisition section 54, the change point acquisition section 56, and the image data generation section 58 of the apparatus 50. The movement path generation section 52, the running information acquisition section 54, the change point acquisition section 56, and the image data generation section 58 constitute the apparatus 50.


The processor 22 transmits the generated image data ID1 to the display device 30 via the I/O interface 26, and the display device 30 displays the image of the image data ID1 as illustrated in FIGS. 7 and 8. The processor 22 may switch the image displayed on the display device 30 between the image illustrated in FIG. 7 and the enlarged image illustrated in FIG. 8, in response to the input information to the input device 32.


According to the present embodiment, if a defect occurs in a finished surface of the workpiece A when the industrial machine 12 performs the work on the workpiece A, the operator can easily identify a factor for the defect. More specifically, assume that the position of the defect generated in the finished surface of the workpiece A matches (or is close to) the position of the point highlighted in the image data ID1, when the industrial machine 12 performs the work on the workpiece A in accordance with the work program WP.


In this case, the operator can immediately identify to which change point of the running information PT1, PM1, VT1, VM1, aT1, aM1, αT1, αM1, dT1, dM1, and RM the point that matches (or is close to) the position of the defect generated in the workpiece A corresponds. Thus, the operator may estimate that the factor for the defect results from the parameter relating to the identified running information PT1, PM1, VT1, VM1, aT1, aM1, αT1, αM1, dT1, dM1, or RM.


As a result, the operator is able to achieve an improvement in precision of the work performed on the workpiece A and a reduction in time necessary for startup of the mechanical system 10, by changing the work program WP or the parameter (e.g., the statement of the work program WP, the control gain GC, or the time constant τ) relating to the identified running information PT1, PM1, VT1, VM1, aT1, aM1, αT1, αM1, dT1, dM1, or RM.


The processor 22 may select a point to be highlighted in the image of the image data ID1 from the plurality of kinds of running information PT1, PM1, VT1, VM1, aT1, aM1, αT1, αM1, dT1, dM1 and RM, in response to the input information to the input device 32, and may highlight in the image only the point corresponding to the selected running information.


Such an embodiment will be described with reference to FIG. 9. The image data ID1 illustrated in FIG. 9 includes running-information-selection image data 60. The running-information-selection image data 60 is image data for making it possible to select the running information PT1, PM1, VT1, VM1, aT1, aM1, αT1, αM1, dT1, dM1, or RM, and displays the running information PT1, PM1, VT1, VM1, aT1, aM1, αT1, αM1, dT1, dM1, and RM.


In the example illustrated in FIG. 9, a check column is set for each of the running information PT1, PM1, VT1, VM1, aT1, aM1, αT1, αM1, dT1, dM1 and RM, and configured such that a check mark is displayed in the check column of the selected running information PT1, PM1, VT1, VM1, aT1, aM1, αT1, αM1, dT1, dM1 or RM.


The operator operates the input device 32 (e.g., a mouse) so as to input the input information for selecting the running information PT1, PM1, VT1, VM1, aT1, aM1, αT1, αM1, dT1, dM1 or RM to be highlighted, while viewing the image of the image data ID1 displayed on the display device 30.


In response to the input information to the input device 32, the processor 22 highlights the point corresponding to the selected running information PT1, PM1, VT1, VM1, aT1, aM1, αT1, αM1, dT1, dM1 or RM, and displays the check mark in the check column of the selected running information PT1, PM1, VT1, VM1, aT1, aM1, αT1, αM1, dT1, dM1, or RM. In the example depicted in FIG. 9, a case is illustrated where the running information VT1 is selected. According to this configuration, the operator is able to easily select and highlight the point on the movement path MP1 corresponding to the change point of the desired running information.


The processor 22 may display, in the image data ID1, the relating parameter of the running information PT1, PM1, VT1, VM1, aT1, aM1, αT1, αM1, dT1, dM1, or RM corresponding to the point highlighted in the image data ID1, in response to the input information to the input device 32. Such an embodiment will be described with reference to FIG. 10.


The image data ID1 illustrated in FIG. 10 includes parameter image data 62. The parameter image data 62 is image data for displaying the kind of the running information and the relating parameter which relates to that running information. The relating parameter includes e.g. setting values of the positions PT1 and PM1, the velocities VT1 and VM1, the acceleration aT1 and aM1, the jerks αT1 and αM1, the movement directions dT1 and dM1, and the running mode RM; the control gain GC; or the time constant τ.


In the example illustrated in FIG. 10, the parameter image data 62 displays the kind of the running information (i.e., the tool velocity VT1) corresponding to the point highlighted by the arrow D3, and also displays, as the relating parameter of the running information VT1, a value of the control gain GC and an identification number of the control gain GC, a value of the time constant τ and an identification number of the time constant τ, and the corresponding statement of the work program WP including the information of the target position coordinates and the minute line segment length. As illustrated in FIG. 10, each of the control gain GC and the time constant τ is assigned an identification number, and the setting information of the control gain GC and the time constant τ can be retrieved from the identification numbers.


The operator operates the input device 32 (e.g., the mouse) so as to input the input information for selecting the point highlighted in the image data ID1 (arrows D1, D2, D3), while viewing the image of the image data ID1 displayed on the display device 30. In response to the input information to the input device 32, the processor 22 displays, in the image data ID1, the relating parameter of the running information PT1, PM1, VT1, VM1, aT1, aM1, αT1, αM1, dT1, dM1, or RM corresponding to the selected points (arrows D1, D2, D3).


According to this configuration, in a case where the position of the defect generated in the workpiece A matches (is close to) the position of the point highlighted in the image data ID1, the operator is able to refer to the relating parameter of the running information PT1, PM1, VT1, VM1, aT1, aM1, αT1, αM1, dT1, dM1, or RM that may be the factor for the defect, and verify the factor in more detail.


Next, a mechanical system 100 according to another embodiment will be described with reference to FIG. 11. The mechanical system 100 is different from the above-described mechanical system 10 in an industrial machine 102. Specifically, the industrial machine 102 further includes a sensor 104, in addition to the movement mechanism 16 and the tool 18.


The sensor 104 includes e.g. a rotation detection sensor such as an encoder or a Hall element, or a force sensor such as a torque sensor or a force detection sensor, configured to detect the position PM2 of the rotary shaft of the electric motor 20, the torque applied to the rotary shaft, or the force applied to the movement mechanism 16. The sensor 104 transmits the detected position PM2, torque, or force, as feedback information FB, to the processor 22 via the I/O interface 26.


The processor 22 receives the feedback information FB detected by the sensor 104 therefrom while operating the industrial machine 102 in accordance with the work program WP. For example, the processor 22 acquires the feedback information FB detected by the sensor 104 while operating the industrial machine 102 in accordance with the work program WP so as to actually perform the work on the workpiece A by the tool 18.


Alternatively, the processor 22 may acquire the feedback information FB detected by the sensor 104 while operating the industrial machine 102 in accordance with the work program WP without performing the actual work by the tool 18 (e.g., without activating the tool 18, or with the tool 18 being removed from the movement mechanism 16).


The processor 22 determines, from the feedback information FB from the sensor 104 (specifically, the position PM2), a position PT2 of the tool 18 relative to the workpiece A while operating the industrial machine 102 in accordance with the work program WP. Then, the processor 22 functions as the movement path generation section 52, and generates, from the position PT2, the movement path MP2 (second movement path) of the industrial machine 102 in a three-dimensional space.


The movement path MP2 corresponds to the actual movement path of the movement mechanism 16 when the industrial machine 102 is operated in accordance with the work program WP. An example of the movement path MP2 is illustrated in FIGS. 3 and 12. FIG. 12 is an enlarged view of the region B in FIG. 3. Each of the lines illustrated in FIG. 12 constitutes the movement path MP2. Further, the processor 22 functions as the running information acquisition section 54, and acquires, from the feedback information FB, time-series data of the positions PT2 and PM2, velocities VT2 and VM2, acceleration aT2 and aM2, jerks αT2 and αM2, and movement directions dT2 and dM2, as the running information. The position PT2, velocity VT2, acceleration aT2, jerk αT2, and movement direction dT2 of the tool 18 with respect to the workpiece A, and the velocity VM2, acceleration aM2, jerk αM2, and movement direction dM2 of the rotary shaft of the electric motor 20 can be obtained from the feedback information FB (i.e., the position PM2 of the rotary shaft) from the sensor 104.
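The derivation of the velocity, acceleration, and jerk from the detected position can be sketched as successive numerical differentiation of the position time-series data. The following is a minimal illustration; the function name and the use of NumPy are assumptions for illustration, not part of the embodiment:

```python
import numpy as np

def kinematics_from_positions(t, p):
    """Derive velocity, acceleration, and jerk time-series data from
    sampled positions (e.g., the position PM2 from the sensor) by
    successive numerical differentiation."""
    v = np.gradient(p, t)  # velocity: first derivative of position
    a = np.gradient(v, t)  # acceleration: derivative of velocity
    j = np.gradient(a, t)  # jerk: derivative of acceleration
    return v, a, j
```

For a shaft moving at constant velocity, the derived acceleration and jerk are approximately zero, so no change point would be detected on the corresponding section of the movement path.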


Then, similarly to the above-described method for acquiring the change points of the running information PT1, PM1, VT1, VM1, aT1, aM1, αT1, αM1, dT1, and dM1, the processor 22 functions as the change point acquisition section 56 to acquire, in the time-series data of the positions PT2 and PM2, velocities VT2 and VM2, acceleration aT2 and aM2, jerks αT2 and αM2, and movement directions dT2 and dM2, a change point at which a change amount δ2 of each of these pieces of running information exceeds a predetermined threshold value δth2 within a predetermined time δt.


The processor 22 acquires, in the time-series data, the change point of each of the positions PT2 and PM2, velocities VT2 and VM2, acceleration aT2 and aM2, jerks αT2 and αM2, and movement directions dT2 and dM2; and the time point t0 at which each change point occurs, and then identifies the points on the movement path MP2 corresponding to the respective change points.
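The condition described above, that the change amount δ2 within the predetermined time δt exceeds the threshold value δth2, can be sketched as follows for a single piece of running information; the function and parameter names are illustrative assumptions, not taken from the embodiment:

```python
def find_change_points(t, x, delta_t, threshold):
    """Return indices of samples at which `x` changes by more than
    `threshold` within the time window `delta_t` (i.e., the change
    amount exceeds the threshold value within the predetermined time),
    corresponding to change points of the running information."""
    change_points = []
    for i in range(len(t)):
        for j in range(i + 1, len(t)):
            if t[j] - t[i] > delta_t:
                break  # sample j falls outside the time window
            if abs(x[j] - x[i]) > threshold:
                change_points.append(i)
                break  # record this change point once and move on
    return change_points
```

Applying such a check to each of the kinds of running information yields the time points t0 at which the respective change points occur.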


The time-series data of the running information PT2, PM2, VT2, VM2, aT2, aM2, αT2, αM2, dT2 and dM2, and the movement path MP2 are data acquired from the feedback information FB, and are associated with each other via time t. Accordingly, the processor 22 is able to retrieve and identify the points on the movement path MP2 corresponding to the time points t0 at which the change points of the running information PT2, PM2, VT2, VM2, aT2, aM2, αT2, αM2, dT2, and dM2 acquired from the feedback information FB occur.
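Because the time-series data and the movement path MP2 are associated with each other via the time t, retrieving the point on the path at a change-point time t0 amounts to a lookup of the path sample whose time stamp is nearest to t0. A minimal sketch, with assumed function and variable names:

```python
import bisect

def point_on_path_at(times, path, t0):
    """Return the sampled path point whose time stamp is nearest to
    the change-point time t0; path[k] is the position at times[k],
    and `times` is sorted in ascending order."""
    k = bisect.bisect_left(times, t0)
    if k == len(times):
        return path[-1]           # t0 lies after the last sample
    if k > 0 and t0 - times[k - 1] < times[k] - t0:
        k -= 1                    # the previous sample is closer
    return path[k]
```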


Then, the processor 22 functions as the image data generation section 58 to generate image data ID2 in which the points on the movement path MP2 corresponding to the change points of the running information PT2, PM2, VT2, VM2, aT2, aM2, αT2, αM2, dT2 and dM2 are highlighted on the movement path MP2 in display forms visually different from one another. An example of the image data ID2 is illustrated in FIGS. 7 and 13.


In the example illustrated in FIG. 7, a point on the movement path MP2 corresponding to a change point of first running information (e.g., the acceleration aT2 of the tool 18) is highlighted by the arrow D1. Further, a point on the movement path MP2 corresponding to a change point of second running information (e.g., the movement direction dM2 of the rotary shaft of the electric motor 20) is highlighted by the arrow D2. Furthermore, a point on the movement path MP2 corresponding to a change point of third running information (e.g., the movement velocity VT2 of the tool 18) is highlighted by the arrow D3.


The arrow D1, the arrow D2, and the arrow D3 depicted in the image data ID2 are displayed in different colors, different shapes, or different visual effects (flashing or the like) from one another, and thereby can be visually identified. Accordingly, the operator is able to visually recognize which of the running information each of the arrows D1, D2, and D3 in the image data ID2 corresponds to.


Note that, in the image data ID2 illustrated in FIG. 7, for ease of understanding, an example is illustrated in which the points corresponding to the change points of a total of three kinds of running information (e.g., aT2, dM2, and VT2) are highlighted. However, the points corresponding to the change points of all ten kinds of the running information PT2, PM2, VT2, VM2, aT2, aM2, αT2, αM2, dT2, and dM2 may be highlighted.


As described above, in the image data ID2, the points on the movement path MP2 corresponding to the change points of the plurality of pieces of running information different from one another, acquired from the feedback information FB, are highlighted on the movement path MP2 in display forms visually different from one another. The processor 22 may switch the image to be displayed on the display device 30 between the image illustrated in FIG. 7 and the enlarged image illustrated in FIG. 12, in response to input information to the input device 32. According to the present embodiment, if the defect occurs in the finished surface of the workpiece A when the industrial machine 102 performs the work on the workpiece A, the operator can easily identify the factor for the defect.


Note that, as illustrated in FIG. 9, the processor 22 of the mechanical system 100 may select the point to be highlighted in the image of the image data ID2 from the plurality of kinds of running information PT2, PM2, VT2, VM2, aT2, aM2, αT2, αM2, dT2, and dM2, in response to the input information to the input device 32, and may highlight only the point corresponding to the selected running information in the image.
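Selecting which points to highlight can be sketched as filtering the highlight entries by the kinds of running information chosen through the input device 32; the data layout and names below are assumptions for illustration:

```python
def highlights_to_draw(all_highlights, selected_kinds):
    """Keep only the highlight entries whose kind of running
    information was selected by the operator, so that only the
    corresponding points are highlighted in the image."""
    return [h for h in all_highlights if h["kind"] in selected_kinds]
```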


Further, as illustrated in FIG. 10, the processor 22 of the mechanical system 100 may display, in the image data ID2, the relating parameters of the running information PT2, PM2, VT2, VM2, aT2, aM2, αT2, αM2, dT2, or dM2 corresponding to the point highlighted in the image data ID2, in response to the input information to the input device 32.


Furthermore, the processor 22 of the mechanical system 100 may acquire, as the running information, the time-series data of the positions PT1 and PM1, velocities VT1 and VM1, acceleration aT1 and aM1, jerks αT1 and αM1, movement directions dT1 and dM1, and the running mode RM, as well as the time-series data of the positions PT2 and PM2, velocities VT2 and VM2, acceleration aT2 and aM2, jerks αT2 and αM2, and movement directions dT2 and dM2.


In this case, the processor 22 may highlight, in the image data ID2, the points on the movement path MP2 corresponding to the change points of the running information PT2, PM2, VT2, VM2, aT2, aM2, αT2, αM2, dT2 and dM2, as well as the points on the movement path MP2 corresponding to the change points of the running information PT1, PM1, VT1, VM1, aT1, aM1, αT1, αM1, dT1, dM1, and RM.


Further, in the mechanical system 100, the processor 22 may generate both the movement paths MP1 and MP2, display the movement paths MP1 and MP2 in the image data ID2, and highlight the points on the movement path MP1 corresponding to the change points of the running information PT1, PM1, VT1, VM1, aT1, aM1, αT1, αM1, dT1, dM1 and RM, as well as the points on the movement path MP2 (or the movement path MP1) corresponding to the change points of the running information PT2, PM2, VT2, VM2, aT2, aM2, αT2, αM2, dT2, and dM2.


The processor 22 may acquire the difference ΔMP between the movement path MP1 and the movement path MP2, and display the difference ΔMP in the image data ID2. Such an embodiment is illustrated in FIG. 14. In FIG. 14, for ease of understanding, a single movement path MP1 and a single movement path MP2 corresponding to the single movement path MP1 are illustrated.


For example, the processor 22 displays a region E corresponding to the difference ΔMP between the movement path MP1 and the movement path MP2 as a colored region (e.g., a red-colored region). The processor 22 may enlarge and display the difference ΔMP (i.e., the region E) in the image data ID2, in response to the input information to the input device 32.
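One way to obtain the difference ΔMP is to compute, for each sampled point on the movement path MP2, the distance to the nearest sampled point on the movement path MP1; the region E then consists of the points whose deviation exceeds a tolerance. A minimal sketch under these assumptions (the embodiment does not prescribe this particular computation):

```python
import math

def path_deviation(mp1, mp2):
    """For each sampled point on MP2, return the distance to the
    nearest sampled point on MP1 (a discrete estimate of ΔMP)."""
    def dist(p, q):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))
    return [min(dist(p2, p1) for p1 in mp1) for p2 in mp2]

def region_e(mp1, mp2, tolerance):
    """Indices of MP2 points whose deviation exceeds `tolerance`,
    i.e., candidates for the colored region E in the image data."""
    return [k for k, d in enumerate(path_deviation(mp1, mp2)) if d > tolerance]
```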


In the above-described embodiments, the running information PT1, PM1, VT1, VM1, aT1, aM1, αT1, αM1, dT1, dM1, PT2, PM2, VT2, VM2, aT2, aM2, αT2, αM2, dT2, or dM2 is not limited to the time-series data of continuous analog signals as illustrated in FIG. 5; the processor 22 may acquire time-series data of digital signals in which discrete numerical data exist in time sequence. The processor 22 may also acquire simple numerical data, rather than time-series data, as the running information.


The running information RM is not limited to the time-series data illustrated in FIG. 6; information on the command or the time point t0 for switching the running mode RM of the industrial machine 12 may be acquired as the running information RM.


The change point acquisition section 56 may be omitted from the above-described embodiments. In this case, the operator may, for example, manually identify and input the change points of the running information PT1, PM1, VT1, VM1, aT1, aM1, αT1, αM1, dT1, dM1, RM, PT2, PM2, VT2, VM2, aT2, aM2, αT2, αM2, dT2, or dM2 from the time-series data thereof.


In the above-described embodiments, the points on the movement path MP corresponding to the change points of the running information PT1, PM1, VT1, VM1, aT1, aM1, αT1, αM1, dT1, dM1, RM, PT2, PM2, VT2, VM2, aT2, aM2, αT2, αM2, dT2, and dM2 are highlighted by the arrows D1, D2, and D3. However, the point on the movement path MP corresponding to the change point may be highlighted in any display form (e.g., colored dot, triangular point, or square point). Although the present disclosure has been described through the embodiments, the embodiments are not intended to limit the invention according to the claims.

Claims
  • 1. An apparatus configured to generate image data of a movement path of an industrial machine, the apparatus comprising: a movement path generation section configured to generate the movement path of the industrial machine when performing a work on a workpiece; a running information acquisition section configured to acquire running information of the industrial machine when performing the work on the workpiece; and an image data generation section configured to generate image data in which a first point on the movement path corresponding to a change point of first running information, and a second point on the movement path corresponding to a change point of second running information different from the first running information are highlighted on the movement path in display forms visually different from each other.
  • 2. The apparatus of claim 1, wherein the running information includes: a position, a velocity, acceleration, a jerk, or a movement direction of the industrial machine; or a running mode of the industrial machine.
  • 3. The apparatus of claim 2, wherein the running information includes time-series data that indicates, in time series, a change of the position, the velocity, the acceleration, the jerk, the movement direction, or the running mode with respect to time.
  • 4. The apparatus of claim 3, further comprising a change point acquisition section configured to acquire, in the time-series data: the change point at which a change amount of the position, the velocity, the acceleration, the jerk, or the movement direction within a predetermined time exceeds a predetermined threshold value; or the change point at which a first running mode is switched to a second running mode.
  • 5. The apparatus of claim 1, wherein the running information acquisition section acquires the running information from a work program for performing the work on the workpiece, or from feedback information detected when the industrial machine is operated in accordance with the work program.
  • 6. The apparatus of claim 1, wherein the movement path generation section generates: a first movement path defined by a work program for performing the work on the workpiece; and a second movement path when the industrial machine is operated in accordance with the work program, wherein the image data generation section generates the image data in which the first movement path and the second movement path are displayed.
  • 7. The apparatus of claim 6, wherein the image data generation section generates the image data such that a difference between the first movement path and the second movement path can be enlarged and displayed.
  • 8. A controller of the industrial machine, comprising the apparatus of claim 1.
  • 9. A method for generating image data of a movement path of an industrial machine, the method comprising: generating the movement path of the industrial machine when performing a work on a workpiece; acquiring running information of the industrial machine when performing the work on the workpiece; and generating the image data in which a first point on the movement path corresponding to a change point of first running information, and a second point on the movement path corresponding to a change point of second running information different from the first running information are highlighted on the movement path in display forms visually different from each other.
Priority Claims (1)
Number Date Country Kind
2019-198977 Oct 2019 JP national