The present invention relates to robots for industrial, medical, domestic, or similar use, and in particular to a robot system, a robot control apparatus, and a robot control program that must execute high-accuracy work.
The use of robots is rapidly increasing in industrial fields such as manufacturing, commerce, and agriculture, in medical fields such as surgery, nursing, and care, and even in households for tasks such as cleaning. Among these fields, in production for example, the objects handled by robots change frequently in accordance with diversifying needs such as custom-made production and high-mix low-volume production. Therefore, robots are required to respond quickly and flexibly. Further, high-accuracy work is essential to achieve high quality.
Patent Application Publication No. 5622250 discloses an apparatus for executing high-accuracy machining. As described in claim 1 of Patent Application Publication No. 5622250, the apparatus projects a reference pattern from a projection means onto a work to be machined, calculates displacement data by imaging the work with the projected reference pattern, corrects three-dimensional machining data based on the displacement data, and matches a machining origin of an industrial robot with a machining origin of the work.
Although Patent Application Publication No. 5622250 improves machining accuracy by projecting and imaging a reference pattern and correcting the machining data, the following problems remain. Every time the work to be machined is changed, a reference pattern must be created and a jig that holds the work with high positioning accuracy is required; the work to be machined therefore cannot be changed easily. Moreover, since the image capturing camera is fixed at a place far from the machining origin, highly accurate observation at the machining origin is impossible.
The present invention has been made in view of the above circumstances and provides a robot system, a robot control apparatus, and a robot control program capable of executing high-accuracy work without preparing a jig corresponding to the objects, even when the objects have different shapes.
According to the present invention, provided is a robot system, comprising: a robot including a first sensor configured to measure a displacement quantity of a coordinate position between a work point and a target point defined for each of a plurality of objects with different shapes or a physical quantity that changes due to the displacement quantity at a first operation frequency; and a control apparatus for controlling the robot, including a coarse operation management unit configured to move the target point to the vicinity of the object at a second operation frequency, a calculation control unit configured to generate a control signal to correct the displacement quantity at a third operation frequency in such a manner that the target point approaches the work point, and a correction drive unit configured to execute a correction operation to align the target point with the work point based on the control signal; wherein the second operation frequency is a frequency less than or equal to ½ of the first and third operation frequencies.
In the robot system of the present invention, the displacement of the coordinate position between the work point and the target point, which is defined for each object, can be measured by the first sensor, and the position of the target point can be corrected via the correction drive unit. At this time, the first operation frequency, which is the operation frequency of the first sensor, and the third operation frequency, which is the operation frequency of the calculation control unit, are at least twice the operation frequency of the coarse operation management unit, which enables quick positioning. In other words, even when the objects are individually different in shape, high-accuracy work can be executed smoothly without preparing a jig corresponding to the objects.
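As a rough illustration of this frequency relationship, the coarse stage and the fine correction stage can be sketched as a two-rate control loop. This is a minimal sketch under assumed gains and frequencies; the function name, gain values, and one-dimensional coordinates are illustrative, not part of the invention.

```python
# Two-rate positioning sketch: the coarse stage runs at a low frequency,
# while sensing and fine correction run at least twice as fast.
# All gains and frequencies below are illustrative assumptions.

def two_rate_positioning(work_point, start, f_fine=1000, f_coarse=100, steps=50):
    """Move a target point toward work_point using coarse + fine updates."""
    assert f_coarse <= f_fine / 2, "second frequency must be <= 1/2 of the others"
    ratio = f_fine // f_coarse           # fine ticks per coarse tick
    target = start
    for tick in range(steps):
        if tick % ratio == 0:
            # coarse stage: slow, large move toward the vicinity of the work point
            target += 0.5 * (work_point - target)
        # fine stage: the sensor measures the displacement and the
        # correction drive reduces it at every (fast) tick
        displacement = work_point - target
        target += 0.8 * displacement
    return target

final = two_rate_positioning(work_point=10.0, start=0.0)
```

Because the fine loop runs many times per coarse update, the residual displacement shrinks quickly even though the coarse stage alone is slow and imprecise.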
Hereinafter, embodiments of the present invention will be described with reference to the drawings. The various features described in the embodiments below can be combined with each other. In particular, a “unit” in the present invention may include, for instance, a combination of hardware resources implemented by circuits in a broad sense and information processing of software that can be concretely realized by these hardware resources. Further, various information is handled in the present embodiment; this information can be represented by high and low signal values as a bit set of binary numbers composed of 0 and 1, and communication and calculation can be executed on a circuit in a broad sense.
Further, a circuit in a broad sense is a circuit realized by appropriately combining at least a circuit, circuitry, a processor, a memory, and the like. In other words, it includes an Application Specific Integrated Circuit (ASIC), a Programmable Logic Device (e.g., a Simple Programmable Logic Device (SPLD), a Complex Programmable Logic Device (CPLD), or a Field Programmable Gate Array (FPGA)), and the like.
In section 1, the overall configuration of a robot system 1 will be described with reference to the drawings.
In the robot system 1 of the present embodiment, the overall form of the robot 2 is not particularly limited, but the robot 2 is characterized by comprising a first sensor 21 and an object action unit 22 (target point). The details of these two components will be described later. Further, other functions generally possessed by robots, such as a user interface function allowing an operator to specify the work to be executed, a function for supplying the object OBJ, and a static position adjustment function, are assumed to be executed by a main body 20 in the drawings and will not be described in detail here.
The object action unit 22 is configured to displace its coordinate position and to execute a predetermined work on multiple types of objects OBJ with individually different shapes. The displacement method of the coordinate position is not limited; any method, such as an axial sliding type or an articulated type, can be used.
The first sensor 21 is configured to measure a distance d, which is a displacement quantity of the coordinate position between a work point OP defined for each object OBJ and the object action unit 22 (target point), or a force or a torque, which is a physical quantity that changes due to the displacement quantity of the coordinate positions. The operation frequency of the first sensor 21 is defined as a first operation frequency. The measurement method for the distance d, the force, or the torque is not limited; any method can be used, such as a camera that detects at least one of visible light, infrared light, and ultraviolet light, an ultrasonic sonar, or a torque sensor. Hereinafter, for the sake of simplicity, a method for measuring the distance d, which is the displacement quantity, will be described.
The high frame rate camera 21a, which is the first sensor 21 in
Although the high frame rate camera 21a can be fixed in a position overlooking the entire object OBJ, the high frame rate camera 21a can also always follow the work point OP to acquire magnified image information with high accuracy by mechanically interlocking with the object action unit 22. In this case, it is preferable that a second sensor (unshown) is separately arranged for the coarse operation management unit 332, which will be described later, and that both the object action unit 22 and the high frame rate camera 21a move to the vicinity of the object OBJ based on a measurement result of the second sensor. In particular, it should be noted that the high frame rate camera 21a measures the displacement quantity as two-dimensional coordinate information, and the correction drive unit 333 described later executes a two-dimensional correction operation.
As shown in
The communication unit 31 exchanges information with the robot 2. Although wired communication means such as USB, IEEE 1394, Thunderbolt, and wired LAN network communication are preferable, wireless LAN network communication, mobile communication such as 5G/LTE/3G, Bluetooth (registered trademark) communication, or the like may be included as necessary. The communication means illustrated above are only examples, and a dedicated communication standard may be adopted as well. It is even more preferable to implement a combination of a plurality of the aforementioned communication means.
In
The storage unit 32 is a volatile or non-volatile storage medium that stores various information. For example, the storage unit 32 can be implemented as a storage device such as a solid state drive (SSD), as a memory such as a random access memory (RAM) that stores temporarily necessary information (arguments, arrays, etc.) regarding program operation, or as any combination thereof.
In particular, the storage unit 32 stores various parameters regarding different work types and work contents, information regarding shapes and materials of different objects OBJ, and past work position information during continuous work.
The storage unit 32 stores various programs or the like regarding the control apparatus 3 that are executed by the controller 33. Specifically, for example, the storage unit 32 stores a program that executes coarse operation management of the object action unit 22 defined for each object OBJ, calculates the displacement of the coordinate position between the work point OP defined for each object OBJ and the object action unit 22 based on the information input from the first sensor 21, and calculates and instructs a correction operation to the object action unit 22 to make the object action unit 22 approach the work point OP.
The controller 33 processes and controls the overall operations of the control apparatus 3. The controller 33 is, for example, an unshown central processing unit (CPU). The controller 33 realizes various functions related to the control apparatus 3 by reading out a predetermined program stored in the storage unit 32. Specifically, the controller 33 realizes functions of calculating the coordinate position displacement information between the work point OP defined for each object OBJ and the current object action unit 22 based on the information given in advance for each object OBJ and the information from the first sensor 21 and other sensors, managing the coarse operation of the object action unit 22 and the first sensor 21, and executing the correction operation of the object action unit 22 with high accuracy.
In other words, information processing by software (stored in the storage unit 32) is concretely realized by hardware (the controller 33) and executed as a calculation control unit 331, a coarse operation management unit 332, and a correction drive unit 333. Although the controller 33 is indicated as a single unit in
The calculation control unit 331 is one in which information processing by software (stored in the storage unit 32) is specifically realized by hardware (controller 33). The calculation control unit 331 executes operations to identify spatial coordinates of the work point OP and the object action unit 22 based on the information acquired from the first sensor 21 via the communication unit 31 and the parameters given in advance for each object OBJ. At this time, the frequency of the calculation is the first operation frequency, which is the operation frequency of the first sensor 21. For example, in the case of the configuration shown in
The calculation frequency for generating the control signal is defined as a third operation frequency. The third operation frequency may be the same as the first operation frequency, but it need not be. By keeping the first and third operation frequencies high, the robot system 1 as a whole can execute high-speed work.
Further, when a second sensor exists, the calculation control unit 331 executes a calculation to identify the spatial coordinates of the work point OP and the object action unit 22 based on the information obtained from the second sensor via the communication unit 31 and the parameters given in advance for each object OBJ. The spatial coordinates calculated based on the information from the second sensor are not necessarily more accurate than the spatial coordinates calculated from the first sensor 21, and the update frequency (operation frequency) is also not as high as the first operation frequency, which is the operation frequency of the first sensor. The spatial coordinate position information calculated from the second sensor is utilized by the coarse operation management unit 332.
The coarse operation management unit 332 is one in which information processing by software (stored in the storage unit 32) is specifically realized by hardware (the controller 33). The coarse operation management unit 332 manages a coarse operation of the object action unit 22 alone or coarse operations of both the object action unit 22 and the first sensor 21. Here, the coarse operation means that the object action unit 22 alone, or both the object action unit 22 and the first sensor 21, are brought close to the work point OP defined for each object OBJ. The vicinity of the work point may be determined by utilizing the information defined in the software stored in the storage unit 32, the spatial coordinate position information calculated by the calculation control unit 331 based on the information from the first sensor 21, or the spatial coordinate position information calculated by the calculation control unit 331 based on the information from the second sensor. Further, any combination thereof may be utilized.
The operation frequency at which the coarse operation management unit 332 adjusts the position of the object action unit 22 is defined as a second operation frequency. In the present invention, the second operation frequency is ½ or less of the first operation frequency, which is the operation frequency of the first sensor, and of the third operation frequency, which is the operation frequency of the calculation control unit 331 described above. By setting the operation of the coarse operation management unit 332 to a low frequency in this way, the main body 20 can be utilized for the coarse operation even when the main body 20 is relatively large and reacts slowly. Note that when using the spatial coordinate position information calculated from the first sensor 21 and updated at the first operation frequency, the update frequency of the information is reduced to about the second operation frequency by thinning out the information on the time axis, averaging a plurality of measurements, or the like.
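The thinning and averaging mentioned above can be sketched as follows; the 10:1 ratio (e.g., a 1000 Hz sensor stream reduced to a 100 Hz coarse stage) and the function names are illustrative assumptions.

```python
# Reduce a high-rate stream of first-sensor measurements to roughly the
# second operation frequency, either by thinning on the time axis or by
# block-averaging. The 10:1 ratio is an illustrative assumption.

def thin(samples, ratio):
    """Keep every ratio-th sample (thinning on the time axis)."""
    return samples[::ratio]

def block_average(samples, ratio):
    """Average consecutive blocks of `ratio` samples."""
    return [sum(samples[i:i + ratio]) / ratio
            for i in range(0, len(samples) - ratio + 1, ratio)]

stream = list(range(100))             # stand-in for measured displacements
coarse_rate = thin(stream, 10)        # every tenth measurement
averaged = block_average(stream, 10)  # block means, less noisy than thinning
```

Averaging suppresses measurement noise at the cost of latency, while thinning preserves the most recent sample; either brings the update rate down to what the slower coarse stage can follow.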
The correction drive unit 333 is one in which information processing by software (stored in the storage unit 32) is specifically realized by hardware (the controller 33). Based on the position correction signal provided by the calculation control unit 331, the correction drive unit 333 executes position correction of the object action unit 22 to align the action point of the object action unit 22 with the work point OP defined for each object OBJ. In this case, highly accurate coordinate position alignment within the range of the spatial resolution of the first sensor 21 and the object action unit 22 becomes possible.
In section 2, a control method of the robot 2 in the robot system 1 for the robot to perform highly accurate work on each object OBJ will be described. As a specific example,
The single work control flow is a control flow when the robot system 1 executes single work on the object OBJ. See
The object OBJ is arranged in a robot workable area. The positioning accuracy at this time need only be sufficient for the subsequent processing: the work point OP (designated point for work) on the object OBJ and the action point TP (target point) of the object action unit 22 (tip of the cutting tool CT) must exist in the visual field of the first sensor 21 (high frame rate camera 21a) and within the correction operation allowable range of the object action unit 22 (high-speed two-dimensional actuator 22a). It is not necessary to prepare a high-accuracy jig for holding the object OBJ solely for positioning.
The coarse operation management unit 332 moves the object action unit 22 to the vicinity of the work point OP on the object OBJ. At this time, the coordinate position information of the work point OP for each object OBJ, which has been stored in the storage unit 32 in advance, may be input to the coarse operation management unit 332. Alternatively, a general camera may be used as the second sensor, and the coarse operation management unit 332 may use the coordinate information obtained by inputting the image information acquired from the camera into the calculation control unit 331.
The coordinate position displacement information obtained in step S3 is transmitted to the correction drive unit 333. The correction drive unit 333 executes coordinate position correction movement control for the high-speed two-dimensional actuator 22a (object action unit 22) in such a manner that the tip position TP of the cutting tool CT approaches the work point OP. As a result, the work point OP and the tip position TP of the cutting tool CT can be brought close to each other with high-accuracy within the range of spatial resolution of the high frame rate camera 21a (first sensor 21) and the high-speed two-dimensional actuator 22a (object action unit 22).
Robot 2 executes a work on the object OBJ.
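The single-work flow above (arrange the object, coarse move, measure the displacement, correct, then work) can be sketched as a loop skeleton. All function names, the one-dimensional coordinates, and the tolerance are hypothetical stand-ins for the units described in the text.

```python
# Skeleton of the single-work control flow (steps S2 to S5).
# Coordinates are one-dimensional for brevity; real systems are 2-D/3-D.

def coarse_move(work_point):
    """Step S2: bring the target point only to the vicinity of the work point."""
    return work_point - 1.0            # coarse stage leaves a residual offset

def execute_work(target):
    """Step S5: placeholder for the actual action on the object."""
    pass

def single_work(work_point, tolerance=0.01):
    target = coarse_move(work_point)
    while True:
        d = work_point - target        # step S3: the first sensor measures d
        if abs(d) <= tolerance:
            break
        target += d                    # step S4: correction drive unit
    execute_work(target)
    return target

result = single_work(5.0)
```

The loop structure makes explicit that no jig is needed: whatever residual offset the coarse stage leaves is removed by the measured correction.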
The continuous work control flow is a control flow when the robot system 1 executes continuous work on the object OBJ. See
The object OBJ is arranged in a workable area of the robot 2. The explanation will be given with
The coarse operation management unit 332 moves the object action unit 22 to the vicinity of the work point OP updated for each work, starting from the continuous work start point ST on the continuous work designated position RT1 on the object OBJ and moving in the direction of the continuous work end point EN for each work. In this case, the coordinate position information of the work point OP, which is stored in the storage unit 32 in advance and updated for each work on the continuous work designated position RT1 for each object OBJ, may be input to the coarse operation management unit 332. Alternatively, as with the single work, a general camera may be used as the second sensor, and the coarse operation management unit 332 may use the coordinate information obtained by inputting the acquired image information into the calculation control unit 331. The continuous work designated position RT1 can be explicitly indicated by the operator, for example by applying a mark, or, in the case where the object OBJ contains multiple objects or the like, a boundary line can be image-recognized and utilized if the boundary line can be defined as the continuous work designated position RT1. A control trajectory RT2 on the left side of
The coordinate position displacement measurement and the correction work for each work within the continuous operation are the same as for the single work.
The coordinate position displacement information obtained in step S3 is transmitted to the correction drive unit 333. The correction drive unit 333 executes coordinate position correction movement control for the high-speed two-dimensional actuator 22a (object action unit 22) in such a manner that the tip position TP of the cutting tool CT approaches the work point OP. As a result, the work point OP and the tip position TP of the cutting tool CT can be brought close to each other with high-accuracy within the range of spatial resolution of the high frame rate camera 21a (first sensor 21) and the high-speed two-dimensional actuator 22a (object action unit 22), which is the same as for the single work.
Robot 2 executes a work on the object OBJ, which is the same as for the single work.
This step determines whether the continuous work has been finished, which can be determined by checking whether all the work at the continuous work designated position RT1 for each object OBJ, stored in the storage unit 32 in advance, is finished. Alternatively, if a general camera is used as the second sensor, the stage of the continuous work may be determined, for example, by detecting that the end point of the marked work instruction line has been reached. If the continuous work is not finished, the process returns to step S2 to continue the work.
The loop is exited when the series of continuous work is finished. By implementing the various control methods described above, the robot 2 can be controlled with high accuracy without preparing a jig for holding the object OBJ, even if the shape of each object OBJ differs.
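The continuous-work loop above (coarse move to each updated work point, measure, correct, work, then check for completion) can be sketched as follows; the names and one-dimensional coordinates are illustrative assumptions.

```python
# Skeleton of the continuous-work control flow: the single-work cycle is
# repeated for each work point from the start point ST to the end point EN.

def continuous_work(work_points):
    finished = []
    for op in work_points:             # step S2: coarse move per work
        target = op - 0.5              # coarse stage reaches only the vicinity
        d = op - target                # step S3: measure the displacement
        target += d                    # step S4: correction drive aligns TP
        finished.append(target)        # step S5: execute the work at OP
    return finished                    # loop exits when the end point is done

done = continuous_work([1.0, 2.0, 3.0])
```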
In section 3, modifications according to the present embodiment will be described. In other words, the robot system 1 according to the present embodiment may be implemented in the following manners.
As an example of the first sensor 21, two high frame rate cameras 21a and 21b are arranged. If image information of the object OBJ is acquired from different angles by using two or more optical cameras, the three-dimensional coordinates of the work point OP on the object OBJ can be determined by calculation in the calculation control unit 331. Even in three-dimensional measurement, the requirements for each of the high frame rate cameras 21a and 21b are the same as for the two-dimensional high frame rate camera 21a described in sections 1 and 2: for the purpose of high-speed and high-accuracy positioning, a high frame rate (imaging rate) of 100 fps or higher is preferable, and 500 fps or higher is even more preferable. Specific examples are omitted. Although the high frame rate cameras 21a and 21b can be fixed in a position overlooking the entire object OBJ, they can also always follow the work point OP to acquire magnified image information with high accuracy by mechanically interlocking with the object action unit 22 (target point), as in the case of the two-dimensional correction. It should be noted that the high frame rate cameras 21a and 21b measure the displacement quantity as three-dimensional coordinate information, and the correction drive unit 333 executes a three-dimensional correction operation.
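As a sketch of how two cameras viewing the work point from different angles yield its three-dimensional coordinates, the standard rectified-stereo triangulation relation (depth from disparity) can be used. The pinhole model, focal length, and baseline below are illustrative assumptions; the specification does not fix a particular reconstruction method.

```python
# Minimal rectified-stereo triangulation: two cameras separated by a known
# baseline observe the same work point; depth follows from the disparity.

def triangulate(x_left, x_right, y, focal, baseline):
    """Recover (X, Y, Z) from matched pixel coordinates of two rectified cameras."""
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("point must have positive disparity")
    Z = focal * baseline / disparity   # depth from disparity
    X = x_left * Z / focal             # back-project into 3-D
    Y = y * Z / focal
    return (X, Y, Z)

# e.g. focal length 500 px, 0.1 m baseline, 10 px disparity
point = triangulate(x_left=20.0, x_right=10.0, y=5.0, focal=500.0, baseline=0.1)
```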
As shown in
[Continuous Work with Online Correction]
In the continuous work control flow described in section 2.2, the continuous work designated position RT1 used by the coarse operation management unit 332 is stored in the storage unit 32 in advance, or the method that uses information from the second sensor (such as a general camera) is adopted. Here, a control flow of an embodiment in which the movement information used by the coarse operation management unit 332 is updated based on the work point coordinate position identified by the first sensor 21 will be described. A configuration diagram of the object action unit 22 (target point) and the first sensor 21 can be referred to
The object OBJ is placed in the workable area of the robot 2. The description thereof is omitted since this step is the same as for the continuous work in section 2.2.
The coarse operation management unit 332 moves the object action unit 22 to the vicinity of the work point OP updated for each work, starting from the continuous work start point ST on the continuous work designated position RT1 on the object OBJ and moving in the direction of the continuous work end point EN for each work. The movement information at this time may be updated using the work point OP information identified by the first sensor 21, as described later in step S8. The continuous work designated position RT1 can be explicitly indicated by the operator such as applying a mark, or can be utilized by image-recognizing a boundary line if the boundary line can be defined as the continuous work designated position RT1 in the case where there are multiple objects in the object OBJ or the like, which is the same as in section 2.2.
The coordinate position displacement measurement and the correction work for each work within the continuous operation is the same as for the single work in section 2.1 and for the continuous work in section 2.2, thus the description thereof is omitted. Information on where the work point OP identified by the high frame rate camera 21a (first sensor 21) exists in the image data IM in
The coordinate position displacement information obtained in step S3 is transmitted to the correction drive unit 333, and the coordinate position correction movement control implemented with respect to the object action unit 22 is the same as for the single work in section 2.1 and for the continuous work in section 2.2, thus the description thereof is omitted.
Robot 2 executes a work on the object OBJ, which is the same as for the single work in section 2.1 and for the continuous work in section 2.2.
This step determines whether all the steps of the continuous work stored in the storage unit 32 in advance have been finished. If the continuous work is not finished, the process returns to step S7 to continue the work.
This step determines whether to update the movement information used by the coarse operation management unit 332, by checking whether the work point OP identified by the high frame rate camera 21a (first sensor 21) in step S3 is located within the allowable range of the image data IM. Specifically, for instance, if it can be estimated that the current work point OP is located near the center of the image data IM, and the positional displacement of the distance d between the next work point and the tip position TP of the cutting tool CT (object action unit 22) is small enough to be within the range that the correction drive unit 333 can process, then the process returns to step S2 to continue the work at the current position within the allowable range. If it is determined that the allowable range is exceeded, the process proceeds to step S8. It is also possible to set the allowable range, which is a threshold value, to 0 and always proceed to step S8.
The movement information used in the coarse operation management unit 332 is updated based on the work point OP information identified by the high frame rate camera 21a (first sensor 21). Specifically, for instance, if the work point OP displaces upward from the center in the image data IM, the work point OP can be brought closer to the center direction by moving the robot 2 upward. The calculation control unit 331 executes such calculation, updates the movement information used by the coarse operation management unit 332, and returns to step S2 to continue the continuous work.
In this case, if there is a means to measure the actual movement distance such as an encoder in the high-speed two-dimensional actuator 22a (object action unit 22), the calculation can also be executed taking into account the actual movement distance information measured in the object action unit 22.
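Steps S7 and S8 above can be sketched as a simple range check: if the work point drifts outside an allowable range around the image center, an adjustment for the coarse stage is computed; otherwise no update is needed. The pixel values and the one-dimensional offset are illustrative assumptions.

```python
# Online-correction decision (step S7) and movement-information update (S8):
# re-centre the work point OP in the image data IM when it drifts too far.

def online_update(work_point_px, image_center_px, allowable_range):
    """Return a coarse-move adjustment, or None if no update is needed."""
    offset = work_point_px - image_center_px
    if abs(offset) <= allowable_range:
        return None                    # S7: within range, keep current motion
    return offset                      # S8: move the robot to re-centre OP

# work point 30 px from the centre with a 20 px allowable range -> update needed
adjust = online_update(work_point_px=150, image_center_px=120, allowable_range=20)
```

Setting `allowable_range` to 0 reproduces the variant in which the process always proceeds to step S8.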
By adding machine learning, which is being actively researched in the field of artificial intelligence (AI), to the robot system 1 according to the present embodiment, accurate and efficient product processing can be expected. As described in [Means for Solving Problem], the robot system 1 is particularly suitable when the object OBJ results from custom-made production or high-mix low-volume production. Although objects from custom-made production or high-mix low-volume production naturally vary in specific shape and dimensions, many of their attributes, such as application, material, shape, and dimensions, are shared with conventional goods. Therefore, the attributes of the objects to be machined can be machine-learned by the robot system 1 in such a manner that the objects can be machined more accurately and efficiently during the current machining or in future machining.
For instance, as an example of machine learning, a neural network can be adopted.
Computation nodes N_21 to N_25 add up the input values from the computation nodes N_11 to N_13 and input these values (or values obtained by adding a predetermined bias value to them) to a predetermined activation function. Then, the output value of the activation function is transmitted to the computation node N_31, which is the next node. At this time, the values obtained by multiplying the outputs by the weights w set between the computation nodes N_21 to N_25 and the computation node N_31 are input to the computation node N_31. The computation node N_31 adds up the input values and outputs the total value as an output signal. Alternatively, the computation node N_31 may add the input values together, input the total value plus a bias value to the activation function, and output its output value as the output signal. In this way, machining plan data for the object OBJ to be machined is optimized and output. Such machining plan data is utilized, for example, for the determination of the coarse operation by the coarse operation management unit 332.
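A forward pass through the 3-input, 5-hidden, 1-output topology described above can be sketched as follows. The ReLU activation and the toy weight and bias values are illustrative assumptions; the specification does not fix them.

```python
# Forward pass of a small feedforward network: inputs N_11..N_13,
# hidden nodes N_21..N_25 with an activation function, output node N_31.

def relu(x):
    return max(0.0, x)

def forward(inputs, w_hidden, b_hidden, w_out, b_out):
    """inputs: 3 values; w_hidden: 5 rows of 3 weights; w_out: 5 weights."""
    hidden = [relu(sum(w * x for w, x in zip(row, inputs)) + b)
              for row, b in zip(w_hidden, b_hidden)]          # N_21..N_25
    return sum(w * h for w, h in zip(w_out, hidden)) + b_out  # N_31

w_hidden = [[0.1, 0.2, 0.3]] * 5   # toy weights between N_1x and N_2x
b_hidden = [0.0] * 5
w_out = [0.2] * 5                  # toy weights between N_2x and N_31
out = forward([1.0, 1.0, 1.0], w_hidden, b_hidden, w_out, b_out=0.0)
```

In practice the weights would be learned from past machining results rather than fixed by hand.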
In addition, by utilizing artificial intelligence (AI), the robot system 1 can evolve into middle-level or high-level intelligence and can be used for task management in Industry 4.0.
In section 2, the case in which the robot 2 executes a predetermined work while correcting the position of the object action unit 22 with the object action unit 22 attached is described. On the other hand, in cases such as when the object action unit 22 is heavy, there is a demand to grasp the position information of the target point with high accuracy before the actual work, so that the actual work of the robot system 1 can be executed in a shorter time.
Even in this case, the object action unit 22 can be removed from the robot 2 in
The object OBJ is arranged in the robot workable area with the object action unit 22 removed from the robot 2. The continuous work start point ST on the continuous work designated position RT1 on the left side of
Here, RT1 in
High-accuracy position information of the target point is obtained from the image data IM captured by the high frame rate camera 21a (first sensor 21). Specifically, the image data IM is input to the calculation control unit 331 via the communication unit 31, the calculation control unit 331 calculates the coordinate position displacement quantity from the center of the image data, and this is combined with the movement quantity of the coarse operation management unit 332.
The high-accuracy position information calculated in step S3 is stored in the storage unit 32.
This step determines whether the measurement of all the continuous work designated positions has been finished. If the measurement is finished, the process proceeds to step S6. If not, the process returns to step S2 to continue the measurement.
The cutting tool CT (object action unit 22) is attached to the robot 2 to execute the work. At this time, the continuous work is executed while moving the tip position TP of the cutting tool CT based on the high-accuracy position information stored in the storage unit 32. Since the high-accuracy position information is stored, there is no need to execute feedback control during the work.
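The measure-first, work-later flow above can be sketched in two phases: positions are measured and stored with the tool removed, then replayed without feedback during the actual work. All names, the coarse error, and the one-dimensional coordinates are illustrative assumptions.

```python
# Phase 1 measures and stores high-accuracy positions (steps S2-S5);
# phase 2 replays them with no feedback control during the work (step S6).

def measure_phase(designated_points, coarse_error=0.3):
    """Combine the coarse movement with the image-measured displacement."""
    stored = []
    for op in designated_points:
        coarse = op + coarse_error            # coarse stage stops near OP
        displacement = op - coarse            # offset seen in the image data IM
        stored.append(coarse + displacement)  # high-accuracy position
    return stored                             # kept in the storage unit

def work_phase(stored_positions):
    """Replay the stored positions; no feedback is needed during the work."""
    return list(stored_positions)             # tool tip TP follows each point

positions = measure_phase([1.0, 2.0, 3.0])
trajectory = work_phase(positions)
```

Splitting measurement from execution trades feedback for speed: the heavy tool moves only once per point, along positions already known to high accuracy.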
As described above, according to the present embodiment, a robot system 1 capable of executing high-accuracy work without preparing a jig corresponding to the object OBJ can be implemented, even if the objects OBJ have different shapes.
The robot system 1 comprising: a robot 2 including a first sensor 21 configured to measure a distance d, which is a displacement quantity of a coordinate position between a work point OP and a target point defined for each object OBJ, or a physical quantity that changes due to the displacement quantity at a first operation frequency; and a control apparatus 3 for controlling the robot 2, including a coarse operation management unit 332 configured to move the object action unit 22 to the vicinity of the object OBJ at a second operation frequency, a calculation control unit 331 configured to generate a control signal to correct the displacement quantity at a third operation frequency in such a manner that the object action unit 22 approaches the work point OP, and a correction drive unit 333 configured to execute a correction operation to align the object action unit 22 with the work point OP based on the control signal; wherein the second operation frequency is a frequency less than or equal to ½ of the first and third operation frequencies.
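The relation among the three operation frequencies stated above can be expressed as a simple check. This is an illustrative sketch only; the function name and example values are assumptions, not part of the embodiment.

```python
# Illustrative check of the operation-frequency relation: the second
# operation frequency (coarse operation management unit 332) must be less
# than or equal to 1/2 of both the first (first sensor 21) and the third
# (calculation control unit 331) operation frequencies.

def frequencies_valid(first_hz, second_hz, third_hz):
    return second_hz <= first_hz / 2 and second_hz <= third_hz / 2

# e.g. a 1000 Hz high frame rate camera, 100 Hz coarse operation updates,
# and a 1000 Hz correction calculation satisfy the relation.
ok = frequencies_valid(1000.0, 100.0, 1000.0)
```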
Further, in the robot system 1, even if the objects OBJ have different shapes, a control apparatus 3 of the robot 2 capable of executing high-accuracy work without preparing a jig corresponding to the object OBJ can be implemented.
The control apparatus 3 of the robot 2 including a first sensor 21 configured to measure a distance d, which is a displacement quantity of a coordinate position between a work point OP and a target point defined for each object OBJ, or a physical quantity that changes due to the displacement quantity at a first operation frequency, comprising: a coarse operation management unit 332 configured to move the target point (the object action unit 22) to the vicinity of the object OBJ at a second operation frequency, a calculation control unit 331 configured to generate a control signal to correct the displacement quantity at a third operation frequency in such a manner that the object action unit 22 approaches the work point OP, and a correction drive unit 333 configured to execute a correction operation to align the target point (the object action unit 22) with the work point OP based on the control signal; wherein the second operation frequency is a frequency less than or equal to ½ of the first and third operation frequencies.
In the robot system 1, even if the objects OBJ have different shapes, the control apparatus 3 of the robot 2 or the robot system 1, which can execute high-accuracy work without preparing a jig corresponding to the object OBJ, can also be implemented as software, that is, as a program. Such a program may be provided as a non-transitory computer-readable medium, as a program downloadable from an external server, or by so-called "cloud computing," in which an external computer runs the program and executes each function on a client terminal.
The control program of the robot 2 including a first sensor 21 configured to measure a distance d, which is a displacement quantity of a coordinate position between a work point OP and a target point defined for each object OBJ, or a physical quantity that changes due to the displacement quantity at a first operation frequency, configured to allow a computer to execute: a coarse operation management function that moves the target point (the object action unit 22) to the vicinity of the object OBJ at a second operation frequency, a calculation control function that generates a control signal to correct the displacement quantity at a third operation frequency in such a manner that the target point (the object action unit 22) approaches the work point OP, and a correction drive function that executes a correction operation to align the target point (the object action unit 22) with the work point OP based on the control signal; wherein the second operation frequency is a frequency less than or equal to ½ of the first and third operation frequencies.
Finally, various embodiments of the present invention have been described, but these are presented as examples and are not intended to limit the scope of the invention. These novel embodiments can be implemented in various other forms, and various omissions, replacements, and changes can be made without departing from the gist of the invention. These embodiments and their modifications are included in the scope and gist of the invention and are included in the scope of the invention described in the claims and their equivalents.
Number | Date | Country | Kind |
---|---|---|---|
2019-031790 | Feb 2019 | JP | national |
This application is a U.S. National Phase application under 35 U.S.C. 371 of International Application No. PCT/JP2020/007310, filed on Feb. 25, 2020, which claims priority to Japanese Patent Application No. 2019-031790, filed on Feb. 25, 2019. The entire disclosures of the above applications are expressly incorporated by reference herein.
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/JP2020/007310 | 2/25/2020 | WO | 00 |