The present invention relates to a robot device, a robot device controlling method, an article manufacturing method, and a recording medium.
In recent years, there has been proposed a robot device having a robot arm in which an image is captured by a camera, and a position and a posture of the robot arm are corrected using the image (see JP 2019-89188 A). That is, in such a robot device, a target image is set as a reference, and the robot arm is controlled by so-called visual servo so that the captured image coincides with the set target image. As a result, relative positions of a target object and an end effector such as a robot hand mounted on a distal end of the robot arm can be aligned by correcting the position and posture of the robot arm. Therefore, it is not necessary to increase the accuracy in positioning the target object in advance or to precisely teach the robot arm in advance, making it possible to reduce the cost of the robot device and shorten the teaching time.
According to a first aspect of the present invention, a robot device includes an imaging unit configured to capture an image, an end effector including a base part and a moving part configured to move an object with respect to the base part, a robot configured to support the end effector in such a manner that the robot can move a position of the end effector, and a control unit configured to control the imaging unit, the moving part, and the robot. In a case where a movement control is executed to move a position of the object based on the image captured by the imaging unit and a target image, the control unit is configured to execute a first mode in which the position of the object is moved by controlling the robot to move the position of the end effector, and a second mode in which the position of the object is moved by controlling the moving part.
According to a second aspect of the present invention, a robot device controlling method includes a first step in which in a case where a movement control is executed to move a position of an object based on an image captured by an imaging unit and a target image, a control unit moves the position of the object by controlling a robot configured to movably support an end effector to move the position of the end effector, the end effector including a base part and a moving part configured to move the object with respect to the base part, and a second step in which in a case where the movement control is executed, the control unit moves the position of the object by controlling the moving part.
Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
As described in JP 2019-89188 A, in a case where a robot arm is controlled by visual servo, the robot arm has multiple control axes, as in a six-axis articulated robot, and the multiple control axes are simultaneously driven to move an end effector toward a target direction. However, since the multiple control axes are simultaneously feedback-controlled, it may take a long time for the alignment to converge to the target position.
Therefore, the present invention provides a robot device, a robot device controlling method, an article manufacturing method, a program, and a recording medium capable of shortening the time required to control a movement of a position of an object based on a captured image and a target image.
Hereinafter, a first embodiment for carrying out the present invention will be described with reference to
First, a schematic configuration of a robot system according to a first embodiment will be described with reference to
As illustrated in
The pin placing table (not illustrated) has a plurality of holes into which pins W are to be inserted, and pins W to be supplied as components before work are installed in these holes. In addition, the assembly workpiece 402 has a hole 402H into which the pin W serving as a workpiece is inserted and fitted to be assembled. That is, the robot device 10, which will be described in detail below, executes an assembly control to perform work of assembling the pin W to the assembly workpiece 402, and the completed assembly workpiece 402 is manufactured as an article by assembling a plurality of pins W to the assembly workpiece 402.
The robot arm 10A is a so-called six-axis articulated manipulator, and includes a pedestal 2 that is a base part fixed to a work table or a surface plate, and a plurality of links 11, 12, 13, 14, 15, and 16 that transmit displacement and force. In addition, the robot arm 10A includes a plurality of joints J1, J2, J3, J4, J5, and J6 that connect the respective links 11 to 16 so as to be able to turn or rotate. The link 16 disposed at the distal end of the robot arm 10A is configured as a flange-shaped flange part to which the robot hand 100 is attached.
The control device 200 is constituted by a computer using a microprocessor element or the like, and is capable of controlling the robot device 10. As illustrated in
The controller 300 may be, for example, an operation device such as a teaching pendant (TP), but may be another computer device (PC or server) capable of editing a robot program. The controller 300 can be connected to the control device 200 via a wired or wireless communication connection unit, and has user interface functions for operating the robot and displaying a status. The CPU 201 receives, for example, teaching point data input from the controller 300 via the communication interface 204. In addition, a trajectory of each axis (each link) of the robot device 10 can be generated based on the teaching point data input from the controller 300, and can be transmitted to the robot device 10 as a control target value via the communication interface 204. As a result, the robot device 10 can operate the pin W, which is an operation target, using the robot hand 100 attached to the distal end of the robot arm 10A.
Next, a detailed configuration of a robot hand 100 will be described with reference to
As illustrated in
In addition, a first finger 141 and a second finger 142 constituting the finger part 140 capable of executing work on the pin W, which is a workpiece, are movably supported by the base part 101. A fingertip portion 141b of the first finger 141 and a fingertip portion 142b of the second finger 142 are configured to be able to abut on the pin W from respective directions to support the pin W by sandwiching and gripping it therebetween. The fingertip portion 141b of the first finger 141 and the fingertip portion 142b of the second finger 142 are located between the base part 101 and the focal position P of the camera 120 within the field of view of the camera 120, as will be described in detail below. As a result, the camera 120 can capture an image farther than the first finger 141 and the second finger 142, and perform various controls based on the image.
Next, the moving part 130 that drives the finger part 140 to move will be described. As described above, as illustrated in
Specifically, the first guide part 170 serving as a sliding part includes a first linear motion guide 171, and a movable part 172 and a movable part 173 slidably supported thereby. Also, the second guide part 180 serving as a sliding part includes a second linear motion guide 181, and a movable part 182 and a movable part 183 slidably supported thereby. The first slider 143 is fixed to the movable part 173 and the movable part 183, and the second slider 144 is fixed to the movable part 172 and the movable part 182.
On the other hand, the first driving part 150 roughly includes a motor 151, a driving pulley 152, a belt 153, a driven pulley 154, a ball screw shaft 155, a ball nut 156, and an angle detection sensor 159. That is, the motor 151 outputs a driving rotation, and the driving pulley 152 fixed to an output shaft of the motor 151 is driven to rotate. The belt 153 is stretched between the driving pulley 152 and the driven pulley 154, and the rotation of the driving pulley 152 is transmitted to the driven pulley 154. The driven pulley 154 is fixed to one end of the ball screw shaft 155 to rotate the ball screw shaft 155, thereby sliding the ball nut 156 fixed to the first slider 143 in the axial direction via a ball (not illustrated). Accordingly, the first slider 143 slides linearly in the axial direction of the ball screw shaft 155 (in the direction along the first linear motion guide 171 and the second linear motion guide 181) while being guided by the first guide part 170 and the second guide part 180. Therefore, the first finger 141 is driven to move in the axial direction of the ball screw shaft 155 according to a rotation direction of the motor 151. For example, when the motor 151 is rotated forward, the first finger 141 moves to one side toward the right in the drawing, and when the motor 151 is rotated backward, the first finger 141 moves to the other side toward the left in the drawing. A rotation angle of the motor 151 is detected by the angle detection sensor 159 constituted by, for example, an encoder, and is output to the control device 200. Based thereon, the CPU 201 determines a position of the first finger 141 by performing a calculation, and controls the position of the first finger 141 by issuing a command for driving the motor 151 based on the position.
Similarly, the second driving part 160 roughly includes a motor 161, a driving pulley 162, a belt 163, a driven pulley 164, a ball screw shaft 165, a ball nut 166, and an angle detection sensor 169. That is, the motor 161 outputs a driving rotation, and the driving pulley 162 fixed to an output shaft of the motor 161 is driven to rotate. The belt 163 is stretched between the driving pulley 162 and the driven pulley 164, and the rotation of the driving pulley 162 is transmitted to the driven pulley 164. The driven pulley 164 is fixed to one end of the ball screw shaft 165 to rotate the ball screw shaft 165, thereby sliding the ball nut 166 fixed to the second slider 144 in the axial direction via a ball (not illustrated). Accordingly, the second slider 144 linearly slides in the axial direction of the ball screw shaft 165 while being guided by the first guide part 170 and the second guide part 180. Therefore, the second finger 142 is driven to move in the axial direction of the ball screw shaft 165 according to a rotation direction of the motor 161. For example, when the motor 161 is rotated forward, the second finger 142 moves to one side toward the left in the drawing, and when the motor 161 is rotated backward, the second finger 142 moves to the other side toward the right in the drawing. A rotation angle of the motor 161 is detected by the angle detection sensor 169 constituted by, for example, an encoder, and is output to the control device 200. Based thereon, the CPU 201 determines a position of the second finger 142 by performing a calculation, and controls the position of the second finger 142 by issuing a command for driving the motor 161 based on the position.
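As a rough illustration of how the CPU 201 can determine a finger position from the detected rotation angle, the following sketch converts an encoder angle into a linear slider position through the ball screw. It is a minimal sketch only; the lead value, pulley ratio, and home offset are hypothetical and are not specified in the description above.

```python
import math

BALL_SCREW_LEAD_MM = 2.0   # hypothetical travel per screw revolution
PULLEY_RATIO = 1.0         # hypothetical driving/driven pulley ratio

def finger_position_mm(motor_angle_rad: float, home_offset_mm: float = 0.0) -> float:
    """Convert a motor angle reported by the angle detection sensor (encoder)
    into a linear position of the slider along the ball screw axis."""
    screw_revolutions = (motor_angle_rad / (2.0 * math.pi)) * PULLEY_RATIO
    return home_offset_mm + screw_revolutions * BALL_SCREW_LEAD_MM

# Example: roughly two motor revolutions correspond to 4 mm of finger travel
position = finger_position_mm(motor_angle_rad=4.0 * math.pi)
```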
With the configuration of the moving part 130 as described above, the first finger 141 and the second finger 142 can independently move on the same first linear motion guide 171 and second linear motion guide 181. As a result, when the first finger 141 and the second finger 142 move while gripping the pin W, a deviation between the relative positions of the first finger 141 and the second finger 142 in a direction other than the movement direction can be reduced, making it easier to control the movement of the finger part while gripping the pin W.
In addition, by installing the second linear motion guide 181 on the opposite side of the gripped position of the pin W with the first linear motion guide 171 interposed therebetween, a moment force generated when installing the pin W can be supported by the two linear motion guides. Accordingly, the durability of the first linear motion guide 171 and the second linear motion guide 181 can be improved.
When the pin W is gripped by the first finger 141 and the second finger 142, first, a rotation angle of the motor 151 is detected by the angle detection sensor 159, and a rotation angle of the motor 161 is detected by the angle detection sensor 169. Then, based on values thereof, the CPU 201 controls the motor 151 and the motor 161 so that the first finger 141 and the second finger 142 are as far away from each other as the pin W, and are positioned with the pin W therebetween. As a result, the pin W is gripped by the first finger 141 and the second finger 142.
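A minimal sketch of the gripping positioning described above, assuming hypothetical units and helper names: each finger is commanded symmetrically about the pin so that the finger separation equals the pin width.

```python
def grip_targets(pin_center_mm: float, pin_width_mm: float) -> tuple[float, float]:
    """Target positions for the first and second fingers so that they are as far
    apart as the pin and positioned with the pin between them."""
    half = pin_width_mm / 2.0
    first_finger_target = pin_center_mm + half    # approaches the pin from one side
    second_finger_target = pin_center_mm - half   # approaches the pin from the other side
    return first_finger_target, second_finger_target

first_target, second_target = grip_targets(pin_center_mm=10.0, pin_width_mm=3.0)
```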
In addition, the first finger 141 and the second finger 142 can move in the same direction along the first linear motion guide 171 and the second linear motion guide 181 while maintaining the distance therebetween, that is, can move while gripping the pin W. That is, the first finger 141 and the second finger 142 can be moved from the position illustrated in
Here, an operation in a case where the finger part 140 gripping the pin W is moved by the moving part 130 will be described. For example, as illustrated in
Next, shapes of the first finger 141 and the second finger 142 will be described with reference to
The second root portion 142a is disposed at a position offset to the left side in
By forming the shapes of the first finger 141 and the second finger 142 in this manner, the first fingertip portion 141b, the second fingertip portion 142b, and the gripped pin W can be imaged by the camera 120. In addition, the first connecting portion 141c, the second connecting portion 142c, the first root portion 141a, and the second root portion 142a other than the first fingertip portion 141b and the second fingertip portion 142b are not captured in the image captured by the camera 120. As a result, it is possible to minimize an area where the first finger 141 and the second finger 142 overlap a work position (the hole 402H of the assembly workpiece 402, a hole 401H of the pin placing table 401, the pin W placed on the pin placing table 401, or the like) when the camera 120 captures an image. Further, with such a bent shape, even when the first finger 141 and the second finger 142 are moved, it is possible to achieve a compact structure capable of imaging the first fingertip portion 141b and the second fingertip portion 142b without interference with the camera 120. Accordingly, the size of the robot hand 100 can be reduced.
Next, a visual servo control will be described with reference to
In the present embodiment, the “visual servo” refers to an operation of moving the robot arm 10A based on a captured image and a target image to align an object to a target position. Furthermore, the “visual servo control” refers to a series of controls for performing a visual servo operation. That is, the visual servo control is one method of controlling a movement of a position/posture of the robot arm 10A, and is a control method in which a change in position of a target object is measured as visual information and the measured change is used as information for feedback control. In addition, in the visual servo control according to the present embodiment, a control is performed to extract an image feature included in an image of an object on a current image and a difference from an image feature on a target image is fed back. Specifically, an alignment or a position correction of the pin W gripped by the robot hand 100 with respect to the hole 402H of the assembly workpiece 402, which is an imaging target, is performed.
Here, first, a robot system 1 for executing a visual servo control and a coordinate system of each unit in the robot system 1 will be described with reference to
As described above, in the present embodiment, the robot arm 10A is a vertically articulated robot arm. A proximal end (fixed end) of the robot arm 10A is installed on a stand 600. The robot hand 100 is attached to the distal end (free end) of the robot arm 10A. The robot arm 10A includes a pedestal 2, a plurality of links 11 to 16, and a plurality of joints J1 to J6. The plurality of links 11 to 16 are connected to one another in series via the plurality of joints J1 to J6 in this order. It is assumed that the first joint J1, the second joint J2, the third joint J3, the fourth joint J4, the fifth joint J5, and the sixth joint J6 are located in this order from the proximal end side (the link 11 side) toward the distal end side (the link 16 side) of the robot arm 10A. The link 11, which is a proximal end portion of the robot arm 10A, is fixed to the pedestal 2. The pedestal 2 is fixed to an upper surface of the stand 600. Each of the links 11 to 16 is driven to move (may be driven to expand and contract) around a control axis of each of the joints J1 to J6. As a result, the robot arm 10A can adjust the robot hand 100 to a certain position in the three-axis direction and a certain posture in the three-axis direction.
The robot hand 100 is provided on the link 16, which is a distal end portion of the robot arm 10A. That is, the link 16 is a supporter configured to support the end effector such as the robot hand 100. In short, the position of the link 16 is the position of the robot hand 100. However, by driving the sixth joint J6, the postures of the link 16 and the robot hand 100 are changed around the control axis thereof. Only the first joint J1 to the fifth joint J5 may be controlled to be driven by the visual servo to be described below to control the position of the robot hand 100, that is, the sixth joint J6 may not be driven by the visual servo.
The posture of the robot arm 10A can be expressed by a coordinate system. That is, the CPU 201 can calculate a coordinate system T0 of the robot device 10, a coordinate system Te of the robot hand 100 (end effector), and a coordinate system Tc of the camera 120. That is, the coordinate system T0 in
The coordinate system Tc is a coordinate system set around the camera 120, is represented by orthogonal coordinates of three axes including X, Y, and Z axes similarly to the coordinate system T0 and the coordinate system Te, and is set such that the optical axis direction of the camera 120 is the Z-axis direction. In the present embodiment, since the camera 120 is fixed to the robot hand 100, the coordinate system Tc and the coordinate system Te coincide with each other. However, the camera may be fixed to a ceiling of a factory where the robot system 1 is installed, a predetermined position of the robot arm 10A, or the like, and the coordinate system Te and the coordinate system Tc may be different coordinate systems.
The servo control unit 210 controls a servo (motor) (not illustrated) for each of the joints J1 to J6 to be driven. The servo control unit 210 is disposed, for example, inside the pedestal 2. The position where the servo control unit 210 is disposed is not limited to the inside of the pedestal 2, and the servo control unit 210 may be disposed anywhere. For example, the servo control unit 210 may be disposed inside a casing of the control device 200. That is, the servo control unit 210 may be a part of the configuration of the control device 200. The servo control unit 210 controls the servos of the joints J1 to J6 to be driven based on command values corresponding to the respective joints J1 to J6 acquired from the control device 200 such that the angles or torques of the respective joints J1 to J6 follow the command values. That is, the servo control unit 210 is configured to be able to control a position or a torque of each joint of the robot arm 10A.
Next, a movement control of a workpiece by a visual servo control will be described with reference to
As illustrated in
Next, the CPU 201 extracts a target image feature amount from the target image (S12). That is, as illustrated in
For example, a line segment detection method using a Hough transform can be used to extract the edges f111 and f121. The predetermined portions f112 and f122 may be extracted based on results of extracting the edges f111 and f121. More preferably, the edges f111 and f121 and the predetermined portions f112 and f122 are extracted using different methods. For example, a line segment detection method using a Hough transform is used to extract the edges f111 and f121, and template matching is used to extract the predetermined portions f112 and f122. Then, even if an erroneous image feature is extracted due to a failure in image processing, an error can be detected by comparing the results of extracting the edges f111 and f121 with the results of extracting the predetermined portions f112 and f122.
When template matching is used to extract the predetermined portions f112 and f122, in step S12, values input by an operator or values recorded in the ROM 202 or the like are registered as positions and sizes of the predetermined portions f112 and f122. Then, the corresponding range of the target image may be registered as a template. The registered template is temporarily recorded in the RAM 203 or the like.
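As a concrete illustration of the two extraction methods mentioned above, the sketch below uses OpenCV: a probabilistic Hough transform for line-segment (edge) detection and normalized cross-correlation template matching for the predetermined portions. The file name, template region, and parameter values are hypothetical and would need tuning for the actual images.

```python
import cv2
import numpy as np

def extract_line_segments(gray_image: np.ndarray):
    """Detect line segments (edge features such as f111 and f121) with a Hough transform."""
    edges = cv2.Canny(gray_image, 50, 150)
    segments = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=80,
                               minLineLength=40, maxLineGap=5)
    return [] if segments is None else [tuple(s[0]) for s in segments]  # (x1, y1, x2, y2)

def locate_template(gray_image: np.ndarray, template: np.ndarray):
    """Locate a predetermined portion (such as f112 or f122) by template matching."""
    result = cv2.matchTemplate(gray_image, template, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    return max_loc, max_val  # top-left corner of the best match and its score

# Hypothetical usage: the template is registered from the target image in step S12
target = cv2.imread("target_image.png", cv2.IMREAD_GRAYSCALE)
template = target[100:140, 200:260]          # hypothetical registered region
lines = extract_line_segments(target)
location, score = locate_template(target, template)
```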
A coordinate system Tc′ is a coordinate system on the target image, with a perpendicular bisector of the edge f111 as an image feature being an X axis and a direction along the edge f111 being a Y axis on the target image. The target image feature amount is calculated by coordinates on the coordinate system Tc′. For example, the target image feature amount includes three values of a distance f131 between points at which the edges f111 and f121 and the X axis intersect, a difference f132 in the Y-axis direction between the positions of the predetermined portions f112 and f122, and an angle f133 formed by the edges f111 and f121. That is, the target image feature amount can be expressed by a three-dimensional vector including the distance f131, the position difference f132, and the angle f133 as target image feature amount Fg=[f131 f132 f133]T. The superscript T represents transposition of the vector or matrix.
Subsequently, the CPU 201 acquires a current captured image (hereinafter, referred to as a “current image”) from the camera 120 (S13). The current image is temporarily recorded, for example, in the RAM 203 or the like.
Next, the CPU 201 extracts a current image feature amount from the current image by a method similar to the method of extracting the target image feature amount from the target image in step S12 (S14). Hereinafter, the components of the target image feature amount are denoted by fg131, fg132, and fg133, respectively, and the components of the current image feature amount are denoted by fc131, fc132, and fc133, respectively. That is, the target image feature amount Fg=[fg131 fg132 fg133]T and the current image feature amount Fc=[fc131 fc132 fc133]T are set.
Next, the CPU 201 calculates a control amount qv of the control axis for each of the joints J1 to J6 of the robot from the target image feature amount Fg and the current image feature amount Fc, and performs a visual servo operation of the robot arm 10A (S15). In calculating the control amount qv, first, a feature amount difference Fe (F is in bold) between the current image feature amount Fc and the target image feature amount Fg is calculated according to the following Formula 1.
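Formula 1 itself is not reproduced in this text; from the definitions above it is presumably the element-wise difference between the current and target image feature amounts (the sign convention is an assumption):

\[
\boldsymbol{F}_e = \boldsymbol{F}_c - \boldsymbol{F}_g
\]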
Subsequently, the CPU 201 calculates an image Jacobian matrix Jimg (J is in bold) and a robot Jacobian matrix Jr (J is in bold). The image Jacobian Jimg is a matrix of 3 rows and 3 columns associating a minute displacement amount of the current image feature amount Fc with a minute displacement amount of the coordinate system Te set in the robot hand 100. The robot Jacobian Jr is a matrix of 3 rows and 6 columns associating minute displacement amounts of the joints J1 to J6 of the robot arm 10A with a minute displacement amount of the coordinate system Te set in the robot hand 100. The image Jacobian Jimg and the robot Jacobian Jr are defined according to the following Formula 2.
Here, xe (x is in bold) denotes a position vector xe=[Xe Ye αe]T with three degrees of freedom of the coordinate system Te in the coordinate system T0, and αe denotes a rotation angle around the Z axis of the coordinate system T0. q (q is in bold) denotes a joint angle vector q=[q1 . . . q6]T of the joints J1 to J6 of the robot arm 10A.
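Formula 2 is likewise not reproduced; given the stated dimensions (3 rows and 3 columns, and 3 rows and 6 columns) and the definitions of xe and q, it presumably defines the Jacobians as the partial derivatives

\[
\boldsymbol{J}_{img} = \frac{\partial \boldsymbol{F}_c}{\partial \boldsymbol{x}_e}, \qquad
\boldsymbol{J}_r = \frac{\partial \boldsymbol{x}_e}{\partial \boldsymbol{q}}.
\]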
Subsequently, the CPU 201 calculates a control amount qv (q is in bold) of each of the joints J1 to J6 of the robot arm 10A. The control amount qv is calculated, for example, according to the following Formula 3.
The feedback gain in Formula 3 (shown in bold) is represented by a three-dimensional vector, the superscript − represents an inverse matrix, and the superscript + represents a pseudo-inverse matrix. Note that the method of calculating the control amount qv from the feature amount difference Fe is not limited to the above-described method, and another known method may be freely used. The CPU 201 calculates a new angle command value by adding the control amount qv to a previous angle command value for each of the joints J1 to J6 of the robot arm 10A. Then, the CPU 201 performs a visual servo operation by operating the robot arm 10A via the servo control unit 210 based on the angle command value.
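Formula 3 is also not reproduced here. One reconstruction consistent with the description (a three-dimensional feedback gain, the inverse of the image Jacobian, and the pseudo-inverse of the robot Jacobian) is the standard image-based control law below; the gain symbol λ, the sign, and the placement of the gain are assumptions:

\[
\boldsymbol{q}_v = -\,\boldsymbol{J}_r^{+}\,\boldsymbol{J}_{img}^{-1}\,\mathrm{diag}(\boldsymbol{\lambda})\,\boldsymbol{F}_e
\]

where the bold λ denotes the feedback gain vector.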
Then, the CPU 201 determines whether the position correction operation by the visual servo has been completed (S16). When the feature amount difference Fe is smaller than or equal to a predetermined value, it is determined that the correction operation has been completed, and the operation is completed (Yes in S16). When the feature amount difference Fe is not smaller than or equal to the predetermined value (No in S16), the process returns to step S13 to repeat the above-described processing. Here, the predetermined value is a value with which a determination can be made with accuracy that is sufficient to reliably perform assembly work in subsequent force control (torque control). In addition, in determining whether the feature amount difference Fe is smaller than or equal to the predetermined value, the same value may be used for all the values included in the feature amount difference Fe, or different values may be used for the respective values, and the correction operation may continue until all the values satisfy the condition. As described above, the robot arm 10A is operated so that the current image and the target image coincide with each other.
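The loop of steps S13 to S16 can be summarized by the following sketch. It is schematic only: the camera, robot, feature-extraction, and Jacobian functions are hypothetical placeholders, and the joint update follows the Formula 3 reconstruction given above.

```python
import numpy as np

def visual_servo_loop(camera, robot, extract_features, target_features,
                      image_jacobian, robot_jacobian,
                      gain=(0.5, 0.5, 0.5), tolerance=1e-3):
    """Repeat S13-S16: capture an image, extract features, compute a joint
    control amount, and move, until the feature difference converges."""
    gain = np.asarray(gain)
    while True:
        current_image = camera.capture()                       # S13
        current_features = extract_features(current_image)     # S14
        feature_error = current_features - target_features     # Formula 1 (assumed sign)
        if np.all(np.abs(feature_error) <= tolerance):         # S16: converged
            break
        J_img = image_jacobian(current_features)               # 3x3 image Jacobian
        J_r = robot_jacobian(robot.joint_angles())             # 3x6 robot Jacobian
        # S15: joint control amount qv (assumed form of Formula 3)
        dq = -np.linalg.pinv(J_r) @ np.linalg.inv(J_img) @ (gain * feature_error)
        robot.move_joints(robot.joint_angles() + dq)           # new angle command values
```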
Next, a movement control of a pin W as a workpiece according to the first embodiment will be described with reference to
As the movement control of the pin W according to the first embodiment, a control for moving the pin W to be aligned to the hole 402H of the assembly workpiece 402 as a work position from the state where the pin W is gripped by the finger part 140 will be described. That is, before this movement control, a control is executed to grip the pin W using the finger part 140 from the pin placing table (not illustrated), and move the robot hand 100 through the robot arm 10A to be roughly positioned above the hole 402H of the assembly workpiece 402. After this movement control, a control is executed to control the torque of the robot arm 10A to move the robot hand 100, so that the pin W is inserted into the hole 402H of the assembly workpiece 402 to be assembled. By causing the robot device 10 to perform work to repeat these controls, the plurality of pins W are assembled to the assembly workpiece 402 to manufacture an article. That is, in the following description about the movement control of the pin W, only work of aligning one pin W immediately above one hole 402H will be described.
As illustrated in
Next, the CPU 201 extracts feature amounts (a target feature amount 912 and a current feature amount 911) from the target image 902 and the current image 901, respectively, similarly to the visual servo control described above (S23). Then, the CPU 201 calculates a control amount as a target movement amount necessary for a movement of the position of the robot arm 10A or a movement of the position of the finger part 140 of the robot hand 100 from a difference between the target feature amount 912 and the current feature amount 911 (S24). That is, as illustrated in
Here, as illustrated in
Therefore, in the movement control of the pin W according to the present embodiment, the first mode (first step) and the second mode (second step) can be executed as follows. That is, the first mode is a mode in which the position of the pin W (object) is moved by controlling the robot arm 10A to move the position of the robot hand 100. The second mode is a mode in which the moving part 130 of the robot hand 100 is controlled to linearly move the position of the pin W (object) using the first guide part 170 and the second guide part 180 described above. In the movement control of the pin W according to the present embodiment, it is determined whether to execute only the first mode, only the second mode, or both the first mode and the second mode based on the control amount 920 obtained as described above (that is, based on the target image and the current image), and the process proceeds to the execution of the first mode and/or the second mode.
Roughly speaking, if the control amount 920 (that is, the difference between the current position of the pin W and the target position) is a minute value that is smaller than or equal to a specified value, the visual servo of the robot arm 10A does not require much time, and thus, only the first mode is executed. If the control amount 920 is a value larger than the specified value and the movement direction of the pin W in the control amount 920 includes a component in the movement direction of the finger part 140 using the moving part 130, both the second mode and the first mode are executed while the second mode is given priority. That is, a necessary control amount (a movement amount of the pin W) is divided into a control amount of the finger part 140 of the robot hand 100 and the other control amount, and the operation of the robot arm 10A is reduced by allocating an operation to the robot hand 100, which has fewer control axes. In this case, if the movement direction of the pin W in the control amount 920 is only the movement direction of the finger part 140 using the moving part 130, only the second mode is inevitably executed.
Specifically, first, the CPU 201 converts the control amount 920 obtained as described above into a control amount based on a certain coordinate reference. In general, a control amount is managed based on the robot coordinate reference, and a start point is often set at a tool center point (TCP). Then, as illustrated in
Next, it is determined whether the first control amount 920x in the operation direction of the finger part 140 divided as described above can be realized by operating (moving) the finger part 140 (the first finger 141 and the second finger 142), that is, whether the second mode can be executed (S26). For example, when the movable amount from the current position of the pin W to the end of the movable range of the finger part 140 is smaller than the first control amount 920x, it is determined that the first control amount 920x cannot be realized by moving the finger part 140 (No in S26). That is, when the first control amount 920x is larger than the movable amount of the finger part 140, it is determined that the first control amount 920x cannot be realized by moving the finger part 140. In this case, the CPU 201 determines that the second mode cannot be executed, and selects execution of the first mode only, that is, performs visual servo by only driving the robot arm 10A (S27). As a result, the pin W is moved by visual servo by the robot arm 10A together with the robot hand 100 and is aligned to the target position (see
On the other hand, when it is determined that the first control amount 920x can be realized by moving the finger part 140 (Yes in S26), the CPU 201 determines whether the first control amount 920x (the control amount of the finger part) in which the finger part 140 is controlled is smaller than or equal to the specified value (S29). When the first control amount 920x of the finger part is smaller than or equal to the specified value, that is, when the movement amount of the pin W is minute, it does not take much time even if the visual servo of the robot arm 10A is performed, and thus, the process proceeds to step S27 described above. In this case as well, the CPU 201 selects execution of the first mode only, that is, performs visual servo by only driving the robot arm 10A (S27). As a result, the pin W is moved by visual servo by the robot arm 10A until the difference between the target image and the current image becomes smaller than or equal to the certain value (No in S28), and the movement control of the pin W is terminated when the difference between the target image and the current image becomes smaller than or equal to a certain value (Yes in S28).
When the CPU 201 determines that the first control amount 920x can be realized by moving the finger part 140 (Yes in S26) and further determines that the first control amount 920x is not smaller than or equal to the specified value (No in S29), the process proceeds to step S30. Then, as illustrated in
The CPU 201 executes the first mode for the remaining second control amount 920y obtained by dividing the control amount 920. That is, as illustrated in
Then, it is determined whether the difference between the target image and the current image is smaller than or equal to the certain value (S28). Here, since the pin W should have been moved by the control amount 920, the difference between the target image and the current image should be smaller than or equal to the certain value (Yes in S28). Note that, if the difference between the target image and the current image is not smaller than or equal to the certain value (No in S28), the pin W is moved by visual servo by the robot arm 10A until the difference between the target image and the current image becomes smaller than or equal to the certain value. Then, the movement control of the pin W described above is terminated.
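Combining steps S25 to S31, the choice between the modes can be sketched as follows; the function, threshold names, and numerical values are hypothetical, and the flow mirrors the description above (first mode only when the second mode is not executable or the movement is minute, otherwise the second mode for the in-axis component followed by the first mode for the remainder).

```python
def select_mode(first_control_amount: float, finger_movable_amount: float,
                specified_value: float) -> str:
    """Decide which mode(s) to execute for the divided control amount (S26, S29)."""
    if abs(first_control_amount) > finger_movable_amount:
        return "first_mode_only"      # S26: No -> visual servo by the robot arm alone (S27)
    if abs(first_control_amount) <= specified_value:
        return "first_mode_only"      # S29: Yes -> the movement is minute (S27)
    return "second_then_first"        # S30 then S31: finger part first, robot arm for the rest

# Example with hypothetical values in millimetres
mode = select_mode(first_control_amount=6.0, finger_movable_amount=10.0, specified_value=0.5)
```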
As described above, in the first embodiment, the movement of the position of the pin W is controlled based on a captured current image and a target image. In this case, a first mode in which the pin W is moved by driving the robot arm 10A and a second mode in which the pin W is moved by driving the finger part 140 using the moving part 130 can be executed. As a result, some or all of the burden generated when the plurality of control axes of the robot arm 10A are accurately driven can be borne by the second mode, shortening the time required for the movement control for moving the position of the pin W.
In addition, a control amount 920 by which the position of the pin W is moved is calculated based on the current image and the target image, and the control amount 920 is divided into a first control amount 920x in a first direction in which the pin W can be moved by the moving part 130 and a second control amount 920y in a second direction. This makes it possible to determine whether the first control amount 920x can be realized by operating the finger part 140.
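The division described above can be illustrated as a projection of the control amount 920 onto the operation direction of the finger part 140; the following is a minimal sketch with hypothetical names, assuming both vectors are expressed in the same coordinate reference.

```python
import numpy as np

def split_control_amount(control_amount: np.ndarray, finger_axis: np.ndarray):
    """Split a translational control amount into the component along the finger
    operation direction (first control amount 920x) and the remainder (920y)."""
    axis = finger_axis / np.linalg.norm(finger_axis)
    first = np.dot(control_amount, axis) * axis   # realizable by the moving part 130
    second = control_amount - first               # handled by the robot arm 10A
    return first, second

first_920x, second_920y = split_control_amount(
    control_amount=np.array([4.0, 1.5, 0.0]),     # example displacement in mm
    finger_axis=np.array([1.0, 0.0, 0.0]))        # finger movement direction (assumed)
```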
For example, an assembly workpiece 402 has a plurality of holes 402H (see
In the movement control of the pin W described above, visual servo is not performed by the robot arm 10A in step S30. However, the present disclosure is not limited thereto, and for example, after only the finger part 140 is operated by the moving part 130 in step S30, the process may proceed to step S27, and visual servo may be performed by the robot arm 10A. In this case, it is possible to generate a new target image excluding the first control amount 920x of the finger part 140, thereby calculating a remaining control amount, and to perform visual servo using the robot arm 10A for the remaining control amount.
In the movement control of the pin W described above, when the first control amount 920x is larger than the movable amount of the finger part 140 in step S26, it is determined that the second mode cannot be executed. However, the present disclosure is not limited thereto, and it may be determined to execute the second mode even though the first control amount 920x is larger than the movable amount of the finger part 140. That is, for the first control amount 920x, the pin W may be moved by executing the second mode by the movable amount of the finger part 140, and the pin W may be moved by executing the first mode by the remaining control amount. In particular, in this case, by determining to execute the second mode when the divided first control amount 920x (first movement amount) is larger than the divided second control amount 920y (second movement amount), it is possible to reduce the control amount (movement amount) in the first mode and shorten the time.
Next, a second embodiment partially modified from the first embodiment will be described with reference to
In the second embodiment, as compared with the first embodiment, the finger part can be moved by the moving part in two movement directions (two coordinate directions), that is, the position of the pin W can be moved on a plane including the X direction and the Y direction in the second mode.
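For this two-direction configuration, the division of the control amount generalizes to two hand axes; the following is a minimal sketch under the same hypothetical naming as in the first embodiment.

```python
import numpy as np

def split_control_amount_2d(control_amount, x_axis, y_axis):
    """Project the control amount onto the two finger movement directions (X and Y
    of the hand); any remainder is left to the robot arm (first mode)."""
    control_amount = np.asarray(control_amount, dtype=float)
    x_axis = np.asarray(x_axis, dtype=float) / np.linalg.norm(x_axis)
    y_axis = np.asarray(y_axis, dtype=float) / np.linalg.norm(y_axis)
    in_plane = np.dot(control_amount, x_axis) * x_axis + np.dot(control_amount, y_axis) * y_axis
    return in_plane, control_amount - in_plane
```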
Specifically, as illustrated in
In the robot hand 100 configured in this manner, when a movement control of a pin W is executed (see
Other configurations, operations, and effects of the second embodiment are the same as those of the first embodiment, and thus, the description thereof will be omitted.
In the first and second embodiments described above, a movement of a position of a pin W as a workpiece is controlled, but the object is not limited to the workpiece. For example, before gripping a pin as a workpiece, a distal end portion of the finger part (the first finger and the second finger) is treated as an object, and its movement is controlled. In this case, the position of the finger part is moved by the moving part of the end effector as a second mode, and the position of the finger part is moved by the robot together with the end effector as a first mode. Furthermore, for example, if the end effector includes a tool such as a driver, tweezers, or the like, a distal end portion of the tool may be treated as an object, and a movement of its position may be controlled. In this case, the position of the tool is moved by the moving part of the end effector as a second mode, and the position of the tool is moved by the robot together with the end effector as a first mode.
In the first and second embodiments, a movement amount (control amount) of a pin is calculated based on a current image obtained by imaging the pin W and a target image as a target position of the pin W. However, the present disclosure is not limited thereto, and for example, a movement amount (control amount) of a pin may be calculated based on a current image obtained by imaging a hole 402H of an assembly workpiece 402, a target image as a target position of the hole 402H, and a position of the pin W (finger part) in the end effector. In addition, it has been described that images in the field of view of the camera 120 fixedly supported by the robot hand 100 serving as an end effector are used as the current image and the target image, but the present disclosure is not limited thereto. That is, the camera may be supported at a predetermined position of the robot arm 10A, or the camera may be supported on a ceiling of a factory or the like where the robot system 1 is installed. In that case, the coordinate system may be converted based on the position of the camera to calculate a movement direction of an object and calculate a movement amount (control amount) thereof.
In the first embodiment, the robot hand 100 slides the first finger 141 and the second finger 142, but the present disclosure is not limited to this configuration. For example, the robot hand 100 may have three or more fingers, or the fingers may move on an arc rather than moving on a straight line. In addition, it has been described that the first guide part 170 and the second guide part 180 are provided so as to slide the first finger 141 and the second finger 142, but the present disclosure is not limited thereto, and one guide part or three or more guide parts may be provided. Of course, when a plurality of guide parts are provided, the first finger 141 and the second finger 142 move stably and the workpiece is also gripped stably. Therefore, it is preferable to provide two or more guide parts. Although it has also been described that the first finger and the second finger are driven by the motor 151 and the motor 161, the present disclosure is not limited thereto, and the first finger 141 and the second finger 142 may be driven by, for example, solenoids or hydraulic pressures. In the second embodiment as well, the number of fingers may be three or more.
In the first and second embodiments, the robot arm 10A is controlled to be driven in the first mode, and the moving part 130 is controlled to be driven in the second mode. However, the present disclosure is not limited thereto, and the first joint J1 to the fifth joint J5 of the robot arm 10A may be controlled to be driven in the first mode, and the moving part 130 and the sixth joint J6 may be controlled to be driven in the second mode. In this case, in the second mode, the X direction, which is a movement direction of the finger part 140, can be rotated around the sixth joint J6, and in particular, even in the structure of the robot hand 100 as in the first embodiment, the finger part 140 (the pin W) can be moved on a plane.
In addition, the present disclosure is not limited to the embodiments described above, and the embodiments can be modified in various ways within the technical idea of the present disclosure. For example, at least two of the plurality of embodiments and the plurality of modifications described above may be combined. In addition, the effects described in the embodiments are merely the most preferable effects enumerated among the results of the embodiments of the present disclosure, and the effects of the embodiments of the present disclosure are not limited to those described in the present embodiment.
Furthermore, in the first and second embodiments described above, the robot main body is a vertically articulated robot, but the present disclosure is not limited thereto. The robot main body may be, for example, a horizontally articulated robot, a parallel link robot, or an orthogonal robot. In addition, the above-described embodiments can be applied to a machine capable of automatically performing an operation for expansion and contraction, bending and stretching, vertical movement, horizontal movement, turning, or a combination thereof based on information in the storage device provided in the control device.
Embodiment(s) of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application claims the benefit of Japanese Patent Application No. 2023-163125, filed Sep. 26, 2023, which is hereby incorporated by reference herein in its entirety.