This is the U.S. National Phase application of PCT/JP2021/000895, filed Jan. 13, 2021, which claims priority to Japanese Patent Application No. 2020-006855, filed Jan. 20, 2020, the disclosures of these applications being incorporated herein by reference in their entireties for all purposes.
The present invention relates to a robot simulation device for simulating the operation of a robot.
Various simulation devices for simulating the operation of an industrial robot have been provided. For example, Patent Literature 1 describes “a simulation device which performs simulation by means of a virtual robot, which is a virtualized robot, the simulation device comprising a reception part which receives commands for instructing at least one of holding and releasing of a virtual object by a virtual holding part of the virtual robot, and a control part which performs at least one of holding and releasing of the virtual object by the virtual holding part based on a command received by the reception part” (Patent Literature 1, claim 1).
By equipping a robot manipulator with a force sensor, it is possible to detect forces applied to a workpiece and perform advanced operations such as exploration operations, fitting operations, and polishing while performing force control. However, skill in parameter adjustment is required to properly perform force control. In general, an operator acquires such parameter-setting know-how only through repeated failure and success in force control. A robot simulation device which can facilitate the parameter setting of force control is therefore desired.
One aspect of the present disclosure provides a robot simulation device for simulating a force control operation which is performed while bringing a tool part mounted on a robot manipulator into contact with a target workpiece, the device comprising a memory which stores a motion program and a force control parameter, which is a set parameter related to the force control operation, and a force control simulation execution part configured to execute a simulation of the force control operation based on the motion program and the force control parameter, wherein the force control simulation execution part comprises a virtual force generation part configured to generate, based on position information of the tool part obtained from results of the simulation of the force control operation, a virtual force received by the tool part from the target workpiece in a state in which the tool part is in contact with the target workpiece, and execute the simulation of the force control operation based on the virtual force and a target force set as the force control parameter.
According to the above configuration, an operator can intuitively understand the generation state of a pressing force, and can easily carry out force control parameter setting.
The objects, features, and advantages of the invention, as well as other objects, features, and advantages, will be further clarified by the detailed description of typical embodiments of the invention shown in the attached drawings.
Next, the embodiments of the present disclosure will be described with reference to the drawings. In the referenced drawings, identical constituent portions or functional portions have been assigned the same reference sign. In order to facilitate understanding, the scales of the drawings have been appropriately modified. Furthermore, the forms shown in the drawings are merely one example for carrying out the present invention. The present invention is not limited to the illustrated forms.
Further, an external computer 90 and a display device 70 are connected to the controller 50. The external computer 90 is responsible for executing a physics simulation based on an operation model of the manipulator 10 when the controller 50 executes a simulation of the force control operation (hereinafter referred to as the force control simulation), and the display device 70 displays the force control simulation results. Note that as used herein, the term “simulation” encompasses not only operations of calculating the position of the manipulator or the like by numerical simulation, but also the case in which a shape model of the manipulator or the like is moved in accordance with teaching data or the like.
The controller 50 further has functions for visualizing the magnitude of the force received by the tool part 11 from the target workpiece (specifically, the pressing force acting on the target workpiece) and the site at which it occurs, by executing a force control simulation in accordance with the teaching data (motion program) and the force control parameters set by the operator, and for displaying the magnitude and occurrence site of the force on the display device 70 as an AR (augmented reality) image or VR (virtual reality) image. As a result, the operator can understand, for example, how large the pressing force will be and where it will act on the target workpiece, and can adjust the motion program and the force control parameters before actually executing the force control operation. Note that in the present description, the force control parameters include at least one of a target pressing force, a pressing-direction velocity, a force control gain, a search area, a velocity gain, and a teaching position.
The external computer 90 comprises a physics simulation part 91 which executes a physics simulation of the manipulator 10 based on a motion model (equation of motion) of the manipulator 10.
In the present embodiment, the display device 70 is configured as a head-mounted display, but it may also be constituted by another information processing device, such as a tablet terminal on which a camera is mounted. The operator wears the display device 70 configured as a head-mounted display. The display device 70 includes an imaging device 71, an AR/VR image processing part 72 which executes image processing for displaying an augmented reality (AR) image or a virtual reality (VR) image, a display 73, and an audio output part 74. The imaging device 71 is provided on the display device 70 so that the optical axis of its imaging lens faces the front of the wearer, and captures an image of the actual workspace including the manipulator 10. Using the information on the virtual pressing force acting on the target workpiece and the site on which it acts, which are obtained as force control simulation results, the AR/VR image processing part 72 can execute augmented reality image processing, in which an image representing the virtual pressing force is overlaid on the actual image, or virtual reality image processing, in which an image representing the virtual pressing force is overlaid on an image (video animation) of a virtual reality space in which a model of each object, such as the manipulator 10, is arranged. The display 73 is arranged in front of the wearer and displays the images (video) generated by the AR/VR image processing part 72. Note that the display device 70 has a position sensor (an optical, laser, or magnetic sensor) and an acceleration sensor (gyro sensor) for acquiring the position of the display device 70 in the workspace, whereby the relative positional relationship of the coordinate system fixed to the display device (camera coordinate system) with respect to the world coordinate system fixed to the workspace can be determined.
The force control by the controller 50 is performed, for example, in accordance with a control law of the following form:

Δx=Kf(F−Fd)

where Δx is the position correction amount of the tool part, Kf is the force control gain, F is the force detected by the force sensor 3, and Fd is the target force set as a force control parameter.
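As a purely illustrative sketch of this control law (assuming a single pressing axis, a hypothetical gain value, and a sign convention in which a positive correction retracts the tool; none of these come from the patent):

```python
def position_correction(detected_force: float, target_force: float,
                        force_gain: float) -> float:
    """Control law Δx = Kf(F − Fd): return the position correction that
    drives the detected force F toward the target force Fd."""
    return force_gain * (detected_force - target_force)

# Example: pressing 2 N harder than the 10 N target with a gain of
# 0.001 m/N yields a 2 mm correction away from the workpiece.
dx = position_correction(detected_force=12.0, target_force=10.0, force_gain=1e-3)
print(f"Δx = {dx * 1000:.1f} mm")
```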
Next, the force control simulation executed under the control of the controller 50 (force control simulation execution part 52) will be described. In the present embodiment, the force control simulation is realized by detecting or generating the virtual pressing force acting on the target workpiece when the motion program is executed, by any of the methods described below.
(Virtual force generation method 1): The motion model (equation of motion) of the robot manipulator 10 is set, and the operation of the force control block diagram shown in the drawings is reproduced by physics simulation to obtain the virtual force (virtual pressing force).
(Virtual force generation method 2): The virtual force (virtual pressing force) is obtained using log data. This may be log data including the force (moment) detected by the force sensor 3 and the position information of the robot (manipulator 10), recorded when the operation by force control was executed in the past in the same operating environment, or log data obtained by actually moving the robot with respect to the target workpiece using the motion program while the driving of the tool (for example, the rotational driving of the polishing grindstone) is stopped, and detecting and recording the force (moment) acting on the workpiece with the force sensor. In the case of virtual force generation method 2, the distance between the tool and the target workpiece can be determined from the teaching trajectory, and when there is log data in which the distance between the motion trajectory of the robot and the target workpiece is of the same degree, the pressing force recorded in the log data can be used as the virtual force (virtual pressing force).
(Virtual force generation method 3): In actual operation on a specific workpiece, training data representing the correspondence between the relative position or velocity of the robot (tool) and the workpiece and the force (moment) detected by the force sensor is collected, and a learning model for obtaining the virtual force (virtual pressing force) is constructed by machine learning.
Virtual force generation method 1 will be described in detail. In virtual force generation method 1, the equation of motion (motion model) of the robot manipulator 10 is set, and the force control blocks shown in the drawings are operated by physics simulation. The equation of motion of the manipulator 10 can be expressed as follows:
M(θ)θ̈+h(θ,θ̇)+g(θ)=τ+τL
In the above formula, θ represents the angle of each joint, M is a matrix related to the moment of inertia, h is a matrix related to the Coriolis force and centrifugal force, g is a term representing the influence of gravity, τ is torque, and τL is load torque.
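As a hedged, one-joint illustration of how such a motion model can be advanced in a physics simulation (the inertia, friction, and gravity terms below are placeholder assumptions, not values from the embodiment):

```python
import math

M = 0.5                                   # effective inertia [kg·m²] (assumed)

def h(theta: float, theta_dot: float) -> float:
    return 0.1 * theta_dot                # velocity-dependent term (assumed stand-in)

def g(theta: float) -> float:
    return 9.81 * 0.2 * math.cos(theta)   # gravity torque of an assumed link

def step(theta, theta_dot, tau, tau_load, dt=1e-3):
    """One explicit Euler step of M·θ̈ + h(θ, θ̇) + g(θ) = τ + τL."""
    theta_ddot = (tau + tau_load - h(theta, theta_dot) - g(theta)) / M
    return theta + theta_dot * dt, theta_dot + theta_ddot * dt

theta, theta_dot = 0.0, 0.0
for _ in range(1000):                     # simulate 1 s under constant torque
    theta, theta_dot = step(theta, theta_dot, tau=2.0, tau_load=0.0)
print(f"joint angle after 1 s: {theta:.3f} rad")
```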
The motion command based on the teaching trajectory (the command given to the manipulator 10 in the illustrated example) is input to the motion model, and the physics simulation calculates the resulting position of the tool part 11. When the tool part 11 comes into contact with the target workpiece, the virtual force (virtual pressing force) F is generated from this position information, for example, by one of the following calculation examples.
The first calculation example of the virtual pressing force F is an example in which the rigidity of the target workpiece is relatively low with respect to the tool. In the present example, the amount by which the tool tip position moves beyond the contact position with the target workpiece to the target workpiece side is defined as δ, and the virtual force F may be determined from the following formula:
F=Kd·δ (1a)
by multiplying δ by the coefficient Kd related to the rigidity of the workpiece. Note that in this case, it is assumed that the position of the target workpiece is fixed in the workspace. Alternatively, the force F received from the workpiece when the tool tip contacts the target workpiece may be calculated from the following formula:
F=Kd·δ+Kc·Vc (1b)
wherein Vc represents the velocity when the tool tip position moves beyond the contact position with the target workpiece. The coefficients Kd and Kc can be set in accordance with the rigidity and shape of the target workpiece.
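By way of a hedged sketch (the contact position, the coefficient values Kd and Kc, and the one-dimensional geometry are assumptions of this example), formulas (1a) and (1b) can be evaluated from the simulated tool tip position as follows:

```python
def virtual_force_soft_workpiece(tool_tip_pos: float, contact_pos: float,
                                 tool_tip_vel: float,
                                 kd: float = 2000.0, kc: float = 50.0) -> float:
    """Formulas (1a)/(1b): F = Kd·δ (+ Kc·Vc), applied only while the tool
    tip has moved past the contact position (δ > 0)."""
    delta = tool_tip_pos - contact_pos    # penetration δ along the pressing axis
    if delta <= 0.0:
        return 0.0                        # no contact, no pressing force
    return kd * delta + kc * tool_tip_vel

# Example: 1 mm penetration at 10 mm/s pressing velocity → 2.5 N.
print(virtual_force_soft_workpiece(0.101, 0.100, 0.01))
```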
The second calculation example of the virtual pressing force F is an example in which the virtual force F is calculated based on the amount of deflection of the tool when the rigidity of the tool is relatively low with respect to the target workpiece. The amount δ that the tool tip position moves beyond the contact position with the target workpiece to the target workpiece side is considered as the amount of deflection of the tool, and the virtual force F is calculated by the following formula using the rigidity coefficient (virtual spring constant) of the tool.
F=(tool virtual spring constant)×δ (2a)
Note that if the tool is a so-called floating tool which has a mechanism (spring mechanism) that expands and contracts in the pressing direction, the expansion and contraction length of the tool tip can be obtained based on the position of the tool tip and the position of the target workpiece, and the virtual force F can be obtained by the following formula.
F=(tool spring constant)×expansion/contraction length (2b)
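The same pattern applies to formulas (2a) and (2b). The following sketch is a non-authoritative illustration; the spring constants and the free length of the floating tool are assumed values:

```python
def virtual_force_tool_deflection(deflection: float,
                                  tool_spring_k: float = 1500.0) -> float:
    """Formula (2a): the penetration amount δ is read as tool deflection."""
    return tool_spring_k * max(deflection, 0.0)

def virtual_force_floating_tool(tool_tip_pos: float, workpiece_pos: float,
                                free_length: float,
                                spring_k: float = 800.0) -> float:
    """Formula (2b): the spring compression of a floating tool is derived
    from the tool tip position and the workpiece position."""
    compression = free_length - abs(workpiece_pos - tool_tip_pos)
    return spring_k * max(compression, 0.0)

print(virtual_force_tool_deflection(0.002))             # 3.0 N
print(virtual_force_floating_tool(0.05, 0.08, 0.035))   # 4.0 N
```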
The third calculation example of the virtual force (virtual pressing force) F is an example in which the virtual force F is calculated from the distance that the robot (tool tip) moves in the pressing direction in response to the speed command when the rigidity of the tool is relatively high. In the case of this example, the movement position according to the speed command is defined as Tx, the position to which the robot (tool tip) actually moves in response to the speed command is defined as d, and calculation is performed by the following formula.
F=k×(Tx−d) (3)
where k is a coefficient. A value obtained as an experimental value, an empirical value, or the like may be set as the coefficient k.
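A minimal sketch of formula (3) follows; since the text states that k is an experimental or empirical value, the number used here is purely illustrative:

```python
def virtual_force_stiff_tool(commanded_pos: float, actual_pos: float,
                             k: float = 3000.0) -> float:
    """Formula (3): F = k·(Tx − d), the lag between the position Tx commanded
    by the speed command and the position d actually reached."""
    return k * (commanded_pos - actual_pos)

# Example: the tool tip lags 0.5 mm behind the commanded position → 1.5 N.
print(virtual_force_stiff_tool(0.2005, 0.2000))
```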
In the calculation formulas of the virtual force F described above, the virtual force may also be obtained by substituting the teaching data (teaching position, teaching speed) instead of the position and speed of the tool tip obtained by physics simulation.
Next, virtual force generation method 3 will be described in detail. The generation of the virtual pressing force by virtual force generation method 3 is executed by the virtual force learning part 55. The virtual force learning part 55 has functions to extract useful rules, knowledge representations, judgment criteria, and the like from a set of input data by analysis, to output the judgment results, and to perform knowledge learning (machine learning). There are various methods of machine learning, but they can be broadly divided into, for example, “supervised learning”, “unsupervised learning”, and “reinforcement learning”. Furthermore, as a technique for realizing these methods, there is a method called “deep learning” in which the extraction of the features themselves is learned. In the present embodiment, “supervised learning” is adopted as the machine learning performed by the virtual force learning part 55.
As described in the section “Virtual force generation method 2” above, in a state in which the tip of the tool and the target workpiece are in contact, it is considered that the relative distance between the tool tip position and the workpiece, the relative velocity, the coefficient related to the rigidity or dynamic friction of the target workpiece, the coefficient related to the rigidity of the tool, etc., correlate with the magnitude of the pressing force. Thus, the virtual force learning part 55 executes learning using learning data in which these values which correlate with the magnitude of the pressing force are used as input data and the pressing force detected by the force sensor is used as response data.
As specific examples of building a learning model, learning models corresponding to the first to third calculation examples of the virtual force F described above may be constructed. When constructing a learning model corresponding to the first calculation example of the virtual force F, learning data is collected in which the relative distance (δ) between the tool tip position and the target workpiece, the relative velocity (Vc), and the values related to the rigidity of the target workpiece (Kd, Kc) (or, alternatively, at least the relative distance (δ) and the value related to the rigidity of the workpiece (Kd)) are used as the input data, and the pressing force detected by the force sensor in that case is used as the response data. The learning model is constructed by executing learning using this learning data.
When constructing a learning model corresponding to the second calculation example of the virtual force F, learning data is collected in which the amount of movement of the tool tip position (δ) and the “virtual spring constant of the tool” are used as the input data, and the pressing force detected by the force sensor is used as the response data; the learning model is constructed by executing learning using this learning data. Note that, alternatively, learning data (training data) may be collected which is composed of input data including at least one of the coefficient related to the rigidity of the target workpiece and the coefficient related to the rigidity of the tool part, together with the distance (δ) of the tool part with respect to the target workpiece when the tool part is in contact with the target workpiece, and of response data, which is the pressing force detected by the force sensor in that case, and the learning model may be constructed by executing learning using this learning data.
When constructing a learning model corresponding to the third calculation example of the virtual force F, learning data is collected in which the movement position (Tx) according to the speed command and the position (d) to which the tip of the tool actually moved in response to the speed command are used as the input data, and the pressing force detected by the force sensor in that case is used as the response data. The learning model is constructed by executing learning using this learning data. The learning in this case corresponds to learning the coefficient k.
Such learning can be realized using a neural network (for example, a three-layer neural network). The operation modes of the neural network include a learning mode and a prediction mode. In the learning mode, the training data (input data) described above is input to the neural network as input variables, and the weights applied to the inputs of the neurons are learned. Weight learning is executed by determining the error between the output value and the correct answer value (response data) when the input data is input to the neural network, and back-propagating the error to each layer of the neural network so as to adjust the weights of each layer such that the output value approaches the correct answer value. When a learning model has been constructed by such learning, the virtual pressing force can be predicted by inputting the input data described above as input variables to the virtual force learning part 55. Specifically, the force control simulation execution part 52 inputs the input data described above to the virtual force learning part 55, on which learning has been executed, and obtains the virtual force as the output therefrom.
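As a hedged sketch of this supervised-learning setup (a stand-in, not the embodiment's implementation): a small regression network with one hidden layer maps (δ, Vc, Kd) to the pressing force, and the “measured” forces here are synthesized from formula (1b) with noise, purely so that the example is self-contained:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic stand-in for logged sensor data: penetration δ [m],
# relative velocity Vc [m/s], workpiece rigidity Kd [N/m].
n = 2000
delta = rng.uniform(0.0, 2e-3, n)
vc = rng.uniform(0.0, 0.02, n)
kd = rng.uniform(1000.0, 3000.0, n)
X = np.column_stack([delta, vc, kd])
y = kd * delta + 50.0 * vc + rng.normal(0.0, 0.05, n)   # "measured" force

# Learning mode: fit a three-layer (one hidden layer) regression network.
model = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
model.fit(X, y)

# Prediction mode: estimate the virtual pressing force for a new state.
print(model.predict([[1e-3, 0.01, 2000.0]]))   # expected near 2.5 N
```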
The audio output part 74 outputs a sound whose volume expresses the magnitude of the virtual force generated by the virtual force generator 54. For example, by outputting a sound corresponding to the magnitude of the virtual force generated by the virtual force generator 54 in real time during the execution of the force control simulation, the operator can understand the magnitude of the virtual force more intuitively.
Next, the simulation of specific operations based on the force control simulation function of the controller 50 described above will be described using two examples: an exploration operation and a fitting operation. When performing an exploration operation or a fitting operation, the hand 9 is attached to the tip of the wrist of the manipulator 10 to grip the fitting component, as shown in the drawings.
As shown in the drawings, the exploration operation proceeds in the following steps.
Next, the controller 50 moves the tool part 11 toward the target workpiece (fitted component W11) at the teaching speed (step S12).
Next, the controller 50 detects the impact force caused by the fitting component W10 colliding with the fitted component W11, which is the target workpiece, based on the position/velocity information of the tool part 11 and the position information of the fitted component W11. When the controller 50 detects the impact force due to the collision, the controller 50 performs force control along the direction of the target pressing force, and searches for a position where the force received from the target workpiece is released while moving the fitting component W10 translationally and rotationally with respect to the contact plane of the fitted component W11 (step S13). In this case, when a search area on the fitted component W11 is specified as a force control parameter, the search can be executed within this search area. The generated force at the time of the collision can be obtained from the momentum mv of the moving part (mass m, velocity v at the moment of collision), for example, in any of the following ways:
(a1) When it is assumed that a constant force is generated per unit time:
mv=F·Δt
where Δt is the collision time and F is the generated force.
(a2) When the generated force is treated as an impulse force (when it is assumed that the force f changes with time):
mv=∫f(t)dt
(a3) When impulse loss occurs during the collision:
mv=e·F·Δt
where e is the coefficient of restitution.
The controller 50 (virtual force generator 54) obtains the generated force at the time of collision between the tool part 11 and the target workpiece (fitted component W11) by any of (a1) to (a3) above.
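A hedged sketch of (a1) and (a3) follows (the mass, collision velocity, collision time, and coefficient of restitution are assumptions of the example; (a2) would instead integrate f(t) over the collision):

```python
def collision_force_constant(m: float, v: float, dt: float) -> float:
    """(a1) constant force over the collision time: F = m·v / Δt."""
    return m * v / dt

def collision_force_with_restitution(m: float, v: float, dt: float,
                                     e: float = 0.8) -> float:
    """(a3) impulse loss during the collision: m·v = e·F·Δt → F = m·v/(e·Δt)."""
    return m * v / (e * dt)

# Example: a 2 kg moving part colliding at 0.05 m/s over a 5 ms collision.
print(collision_force_constant(2.0, 0.05, 0.005))           # 20 N
print(collision_force_with_restitution(2.0, 0.05, 0.005))   # 25 N
```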
After the collision between the tip of the tool part 11 and the target workpiece (fitted component W11) is detected, the tool part 11 and the target workpiece (fitted component W11) are in contact with each other, as shown in the drawings.
As described above, when an impact force is detected, the controller 50 translates and rotates the tool part 11 (fitting component W10) on the surface of contact with the target workpiece (fitted component W11) while performing force control in the target pressing force direction, to search for a position where the force received by the tool part 11 from the target workpiece is released (specifically, a position where there is a hole) (step S13).
When the position where the force (virtual force) received from the target workpiece is released is found in the search, the controller 50 presses the fitting component W10 into the hole (step S15). When the fitting component W10 is pressed into the hole and receives a moment from the inner wall of the hole, the posture of the fitting component W10 (tool part 11) is corrected so as to cancel the moment (step S16). For example, when the center line of the fitting component W10 is inclined with respect to the center axis of the fitting hole of the fitted component W11, a moment about an axis perpendicular to the fitting direction may occur. Based on the contact position between the fitting component W10 (tool part 11) and the fitted component W11, the virtual force generated at the contact point can be obtained, for example, using virtual force generation method 1 described above. Based on this virtual force, the moment about a predetermined point (axis) on the fitting component W10 can be obtained. In obtaining the moment, for example, when a part of the fitting component W10 hits the inner peripheral surface of the hole of the fitted component W11, the moment acting on the fitting component W10 may be obtained assuming that the force acts in the normal direction of the contact surface on the inner peripheral surface of the hole. When the pressing amount into the hole reaches the target depth, the controller 50 ends the search operation (step S17).
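Purely as an illustration of the search loop of steps S13 to S17 (the scan pattern, release threshold, and hole geometry below are assumptions, and the search is reduced to one lateral axis):

```python
def search_for_hole(hole_center: float, hole_half_width: float,
                    scan_positions, release_threshold: float = 0.5,
                    target_force: float = 10.0):
    """Scan laterally under force control until the virtual pressing force
    is released (i.e., the tool is over the hole)."""
    for x in scan_positions:
        # Virtual force: full target force on the surface, ~0 over the hole.
        virtual_force = 0.0 if abs(x - hole_center) < hole_half_width else target_force
        if virtual_force < release_threshold:
            return x            # hole found: proceed to pressing (step S15)
    return None                 # search area exhausted without release

positions = [i * 0.001 for i in range(50)]          # 1 mm grid over 50 mm
print(search_for_hole(hole_center=0.023, hole_half_width=0.002,
                      scan_positions=positions))    # → 0.022
```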
Next, the simulation of a fitting operation will be described. As shown in the drawings, the fitting operation proceeds in the following steps.
Next, when a force received as the fitting component W10 is inserted into the hole of the fitted component W13 is detected, the controller 50 continues the insertion while moving the fitting component W10 in the insertion direction and, when the fitting component W10 receives a moment from the inner surface of the hole, correcting the posture of the fitting component W10 so as to cancel the moment (step S23). The force received during insertion into the hole (the force received in the direction opposite to the insertion direction) may be calculated by the virtual force generator based on the outer shape of the fitting component W10, the inner diameter of the hole on the fitted component W13 side, the friction coefficient of the inner peripheral surface of the hole, and the like. In such an insertion operation, when the load (virtual force) exceeds an allowable value or the operation time exceeds a predetermined threshold value (Yes in S24), it is considered that the search has failed and the process ends (step S25). When the amount of movement in the insertion direction reaches the target depth, the controller 50 ends the fitting operation (step S26). In step S26, the fitting operation may be ended when the amount of movement in the insertion direction reaches the target depth and the pressing force (virtual pressing force) reaches the target pressing force.
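The following is a minimal, non-authoritative sketch of the insertion loop of steps S23 to S26: advance in the insertion direction, correct the posture whenever a reaction moment appears, and abort on excessive load or time. The moment model and all thresholds are assumptions of the example:

```python
def simulate_insertion(target_depth: float, max_load: float = 30.0,
                       max_steps: int = 1000, step_size: float = 1e-4) -> str:
    """Advance the fitting component into the hole while cancelling the
    virtual moment, until the target depth or an abort condition is reached."""
    depth, tilt = 0.0, 0.02                 # assumed initial 0.02 rad tilt
    for _ in range(max_steps):
        load = 5.0 + 400.0 * abs(tilt)      # illustrative virtual load model
        moment = 2.0 * tilt                 # illustrative virtual moment model
        if load > max_load:
            return f"failed at depth {depth:.4f} m (overload)"    # step S25
        tilt -= 0.1 * moment                # posture correction (step S23)
        depth += step_size                  # continue insertion
        if depth >= target_depth:
            return f"fitted at depth {depth:.4f} m"               # step S26
    return "failed (operation time exceeded)"                     # S24 → S25

print(simulate_insertion(target_depth=0.02))
```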
Next, the virtual force display function of the controller 50 will be described. As described above, the controller 50 can generate the force (virtual force) received by the tool part 11 by executing the force control simulation. By providing the display device 70 with information including the virtual force, the occurrence site of the virtual force, and the position of the tool part as results of the simulation, the controller 50 can display an image representing the magnitude of the virtual force and its occurrence site either as an augmented reality image overlaid on a real image of the workspace, or overlaid on a virtual reality image rendered using the model data of each object. When generating a virtual reality image, for example, the model data and the arrangement position information of each object in the workspace, including the manipulator 10, can be provided from the controller 50 to the display device 70.
Generally, the teaching of operations by force control and the setting of force control parameters are difficult. By visualizing the occurrence site and magnitude of the force acting on the target workpiece and providing them as an augmented reality image or a virtual reality image, the operator who created the teaching data can, before actually operating the robot to perform the force control operation such as polishing, instantly understand the site where the force acts on the target workpiece and the like, and can accurately correct the teaching points, the operation speed, and the force control parameters.
The controller 50 may further comprise an adjustment content generation part 56 which, based on the result of comparing the virtual force with a predetermined reference value, generates adjustment content for the force control parameters so as to suppress the virtual force within the predetermined reference value.
In the image of the force control simulation result, the adjustment content generated by the adjustment content generation part 56 may be displayed together with the virtual force.
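As a hedged sketch of one way such adjustment content could be generated (the simple proportional scaling rule below is an assumption of this example, not the method of the embodiment):

```python
def suggest_adjustment(peak_virtual_force: float, reference_force: float,
                       force_gain: float, pressing_velocity: float) -> str:
    """Compare the simulated peak virtual force with the reference value
    and, if it is exceeded, propose scaled-down force control parameters."""
    if peak_virtual_force <= reference_force:
        return "parameters OK"
    scale = reference_force / peak_virtual_force
    return (f"peak force {peak_virtual_force:.1f} N exceeds "
            f"{reference_force:.1f} N: try force gain {force_gain * scale:.4f} "
            f"and pressing velocity {pressing_velocity * scale:.4f} m/s")

print(suggest_adjustment(14.0, 10.0, force_gain=0.002, pressing_velocity=0.01))
```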
According to the present embodiment as described above, the operator can intuitively understand the generation state of the pressing force, and it is possible to facilitate the parameter setting of the force control.
Though the present invention has been described above using typical embodiments, a person skilled in the art would appreciate that various modifications, omissions, and additions can be made to each of the above embodiments without departing from the scope of the invention.
The division of functions among the controller 50, display device 70, and external computer 90 in the embodiments described above is exemplary, and the arrangement of these functional blocks can be changed. The imaging device may be arranged in a fixed position in the workspace as a separate device from the display device.
The functional blocks of the controller and display device may be realized by the CPUs of the devices executing the various software programs stored in the storage devices, or alternatively, may be realized by a hardware-based configuration such as an ASIC (Application Specific Integrated Circuit).
The program for executing the various simulation processes in the embodiments described above can be recorded on various computer-readable recording media (for example, semiconductor memory such as ROM, EEPROM, or flash memory, a magnetic recording medium, or an optical disc such as a CD-ROM or DVD-ROM).
References Cited — U.S. Patent Documents:

Number | Name | Date | Kind
---|---|---|---
5825308 | Rosenberg | Oct 1998 | A |
6061004 | Rosenberg | May 2000 | A |
7847503 | Ooga | Dec 2010 | B2 |
7881917 | Nagatsuka | Feb 2011 | B2 |
8185265 | Nagano | May 2012 | B2 |
9272417 | Konolige | Mar 2016 | B2 |
9579797 | Apkarian | Feb 2017 | B2 |
9597797 | Ponulak | Mar 2017 | B2 |
9696795 | Marcolina | Jul 2017 | B2 |
10354397 | Davis | Jul 2019 | B2 |
10576635 | Ogawa | Mar 2020 | B2 |
10800033 | Yamada | Oct 2020 | B2 |
11045958 | Bowling | Jun 2021 | B2 |
11097418 | Nagarajan | Aug 2021 | B2 |
11173610 | Gothoskar | Nov 2021 | B2 |
11220002 | Atohira | Jan 2022 | B2 |
11396101 | Sugiyama | Jul 2022 | B2 |
11639001 | Bowling | May 2023 | B2 |
11707837 | Oleynik | Jul 2023 | B2 |
11787037 | Sakaino | Oct 2023 | B2 |
20050024331 | Berkley | Feb 2005 | A1 |
20060152533 | Read | Jul 2006 | A1 |
20070282485 | Nagatsuka | Dec 2007 | A1 |
20100168950 | Nagano | Jul 2010 | A1 |
20150379171 | Kuwahara | Dec 2015 | A1 |
20160239080 | Marcolina | Aug 2016 | A1 |
20160257000 | Guerin | Sep 2016 | A1 |
20160332297 | Sugaya | Nov 2016 | A1 |
20180029232 | Ouchi | Feb 2018 | A1 |
20180231965 | Onoyama | Aug 2018 | A1 |
20180297202 | Nishitani | Oct 2018 | A1 |
20190221037 | Sugaya | Jul 2019 | A1 |
20190329405 | Atohira | Oct 2019 | A1 |
20220168047 | Nagao | Jun 2022 | A1 |
Foreign Patent Documents:

Number | Date | Country
---|---|---
06262563 | Sep 1994 | JP |
2002355782 | Dec 2002 | JP |
2011224696 | Nov 2011 | JP |
2014128857 | Jul 2014 | JP |
2018015857 | Feb 2018 | JP |
2019063879 | Apr 2019 | JP |
2019081242 | May 2019 | JP |
2019188530 | Oct 2019 | JP |
Other Publications:
International Search Report and Written Opinion for International Application No. PCT/JP2021/000895, dated Apr. 6, 2021, 7 pages.