The present invention relates to a control device, a robot system, and a control method.
In the related art, a robot system is known which has a robot for carrying out work on a workpiece and a camera (imaging unit) capable of imaging the workpiece. In this robot system, based on an image captured by the camera, the robot can carry out various types of work in a real space. In order for the robot to carry out the work based on the image, it is necessary to perform calibration (correlation) between an image coordinate system of the image captured by the camera and a robot coordinate system serving as a control reference of the robot. For example, it is necessary to perform the calibration between the image coordinate system using a two-dimensional image captured by the camera and the robot coordinate system in a two-dimensional space on a surface of a work table where the robot carries out the work.
JP-A-2016-187845 discloses a calibration method using a marker board provided with a plurality of markers. According to the method, position information is acquired using a robot coordinate of one marker, and position information is acquired using an image coordinate of the camera so that these two pieces of position information are combined with each other. In this manner, the calibration is performed between the robot coordinate system and the image coordinate system.
However, according to the method in the related art, a dedicated member such as the marker board needs to be prepared, thereby causing a worker to spend time and labor.
When the calibration is performed between the image coordinate system using the two-dimensional image and the robot coordinate system in the two-dimensional space on the surface of the work table, the calibration is not performed in a height direction on the work table. Therefore, if a height of the marker board and a height of the workpiece do not coincide with each other, the robot is less likely to carry out proper work by using a result of the calibration. Accordingly, the worker needs to prepare the marker board corresponding to the height of the workpiece, thereby causing a problem in that workability is unsatisfactory for the worker.
An advantage of some aspects of the invention is to solve at least a part of the problems described above, and the invention can be implemented as the following configurations.
A control device according to an application example of the invention includes a receiving unit that receives information relating to a first captured image from a first imaging unit capable of capturing an image, and a control unit capable of performing a command relating to drive of a robot having a movable unit capable of holding a workpiece, based on the information. The control unit is capable of performing correlation between a robot coordinate system which is a coordinate system relating to the robot and a first image coordinate system which is a coordinate system relating to the first captured image, and performs the correlation, based on a coordinate in the robot coordinate system of a predetermined site of the movable unit holding the workpiece when the workpiece is located at each of a plurality of positions inside an imaging region of the first imaging unit and a coordinate in the first image coordinate system of the workpiece when the workpiece is located at each of the plurality of positions.
According to this control device, calibration (correlation) can be performed using the workpiece. Accordingly, time and labor can be saved in preparing a dedicated member for the calibration, and workability of a worker can be improved. In addition, the calibration can be more accurately performed. A result of the calibration is used, thereby enabling a robot to more correctly carry out actual work on the workpiece.
In the control device according to the application example, it is preferable that, in the correlation, the control unit uses the first captured image captured when the workpiece is located at a first position inside the imaging region, and the first captured image captured when the workpiece is located at a second position different from the first position inside the imaging region.
In this manner, the calibration can be quickly, easily, and more accurately performed using one first imaging unit.
In the control device according to the application example, it is preferable that the workpiece includes a first workpiece and a second workpiece different from the first workpiece and, in the correlation, the control unit uses the first captured image captured when the first workpiece is located at a first position inside the imaging region, and the first captured image captured when the second workpiece is located at a second position different from the first position inside the imaging region.
In this way, the calibration can be performed using a plurality of the workpieces. Accordingly, time and labor can be saved in using a dedicated member for the calibration.
In the control device according to the application example, it is preferable that the workpiece includes a first workpiece and a second workpiece different from the first workpiece, and, in the correlation, the control unit uses the first captured image captured when the first workpiece is located at a first position inside the imaging region and the second workpiece is located at a second position different from the first position inside the imaging region.
In this manner, the calibration can be more quickly performed compared to a case of using the first captured image captured at each position.
In the control device according to the application example, it is preferable that, in the correlation, the control unit obtains a coordinate in the robot coordinate system of the predetermined site in a state where the workpiece is held by the movable unit, and obtains a coordinate in the first captured image of the workpiece after the workpiece is detached from the movable unit.
In this manner, the calibration can be correctly, quickly, and more accurately performed. In addition, the calibration can be more precisely performed.
In the control device according to the application example, it is preferable that the receiving unit is capable of communicating with the first imaging unit disposed so as to be capable of imaging a work table on which the workpiece is placed.
In this manner, the workpiece placed on the work table can be imaged, and the calibration can be correctly performed using the first captured image obtained by imaging the workpiece. Furthermore, when the robot carries out the work on the workpiece, the robot can properly carry out the work by using the first captured image.
In the control device according to the application example, it is preferable that the receiving unit is capable of receiving information relating to a second captured image from a second imaging unit which is capable of capturing an image and which is different from the first imaging unit. It is preferable that the control unit is capable of coordinate transformation between the robot coordinate system and a second image coordinate system which is a coordinate system relating to the second captured image, and obtains a position of the workpiece with respect to the predetermined site, based on the coordinate transformation.
Even in a state where a position of the workpiece with respect to the predetermined site is unknown, it is possible to properly and easily perform the correlation between the first image coordinate system and the robot coordinate system.
In the control device according to the application example, it is preferable that the receiving unit is capable of communicating with the second imaging unit disposed so as to be capable of imaging the workpiece in a state where the workpiece is held by the movable unit.
In this manner, the position of the workpiece with respect to the predetermined site can be efficiently obtained.
A robot system according to an application example of the invention includes the control device according to the application example and a robot controlled by the control device.
According to this robot system, workability of the worker can be improved. The robot can more correctly, quickly, and accurately carry out the work on the workpiece.
A control method according to an application example of the invention includes correlating a robot coordinate system which is a coordinate system relating to a robot having a movable unit capable of holding a workpiece, and a first image coordinate system which is a coordinate system relating to a first captured image obtained from a first imaging unit capable of capturing an image, and driving the robot, based on a result of the correlating and information relating to the first captured image obtained from the first imaging unit. In the correlating, the correlating is performed, based on a coordinate in the robot coordinate system of a predetermined site of the movable unit holding the workpiece when the workpiece is located at each of a plurality of positions inside an imaging region of the first imaging unit and a coordinate in the first image coordinate system of the workpiece when the workpiece is located at each of the plurality of positions.
According to this control method, workability of the worker can be improved. The robot can more correctly, quickly, and accurately carry out the work on the workpiece.
The invention will be described with reference to the accompanying drawings, wherein like numbers reference like elements.
Hereinafter, a control device, a robot system, and a control method according to the invention will be described in detail with reference to preferred embodiments illustrated in the accompanying drawings.
In the description herein, the term “horizontal” includes a case of inclination within a range of ±10° or smaller with respect to the horizontal. Similarly, the term “vertical” includes a case of inclination within a range of ±10° or smaller with respect to the vertical. The term “parallel” includes not only a case where two lines (including axes) or planes are perfectly parallel to each other but also a case where the two lines or planes are inclined within ±10° with respect to each other. The term “orthogonal” includes not only a case where two lines (including axes) or planes intersect each other at an angle of 90° but also a case where the two lines or planes are inclined within ±10° with respect to 90°.
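The tolerance conventions above can be expressed as small helper predicates. This is a sketch for illustration only; representing each pair of lines or planes by the angle between them, in degrees, is an assumption made here and is not part of the embodiment.

```python
# Illustrative predicates for the terminology conventions.
# The input is the angle between two lines or planes, in degrees
# (an assumption made for illustration).

def is_parallel(angle_between_deg):
    """True if two lines count as 'parallel' under the +/-10 degree convention."""
    return abs(angle_between_deg) <= 10 or abs(angle_between_deg - 180) <= 10

def is_orthogonal(angle_between_deg):
    """True if two lines count as 'orthogonal' under the +/-10 degree convention."""
    return abs(angle_between_deg - 90) <= 10
```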
For example, a robot system 100 illustrated in
Hereinafter, respective units belonging to the robot system 100 will be sequentially described.
The robot 1 is a so-called 6-axis vertically articulated robot, and has a base 110 and a movable unit 20 connected to the base 110. The movable unit 20 has a robot arm 10 and a hand 17.
The base 110 allows the robot 1 to be attached to any desired installation place. In this embodiment, the base 110 is installed in an installation place 70 on a floor, for example. The installation place of the base 110 is not limited to the installation place 70 on the floor. For example, the installation place may be a wall, a ceiling, or a movable carriage.
As illustrated in
Here, as illustrated in
As illustrated in
Each of the drive units 130 is electrically connected to a motor driver (not illustrated) incorporated in the base 110 illustrated in
The robot 1 configured in this way has a base coordinate system (robot coordinate system) which is set with reference to the base 110 of the robot 1. The base coordinate system is a three-dimensional orthogonal coordinate system defined by the xr-axis and the yr-axis which are respectively parallel to a horizontal direction and the zr-axis which is orthogonal to the horizontal direction and whose vertically upward direction is a positive direction. In this embodiment, in the base coordinate system, a center point on an upper end surface of the base 110 is set as an origin. A translational component with respect to the xr-axis is set as a “component xr”, a translational component with respect to the yr-axis is set as a “component yr”, a translational component with respect to the zr-axis is set as a “component zr”, a rotational component around the zr-axis is set as a “component ur”, a rotational component around the yr-axis is set as a “component vr”, and a rotational component around the xr-axis is set as a “component wr”. A unit of a length (size) of the component xr, the component yr, and the component zr is “mm”, and a unit of an angle (size) of the component ur, the component vr, and the component wr is “°”.
The robot 1 has a tip end coordinate system whose origin is a predetermined point P6 of the arm 16. The tip end coordinate system is a three-dimensional orthogonal coordinate system defined by an xa-axis, a ya-axis, and a za-axis which are orthogonal to one another. The xa-axis and the ya-axis are orthogonal to the pivot axis O6. A translational component with respect to the xa-axis is set as a “component xa”, a translational component with respect to the ya-axis is set as a “component ya”, a translational component with respect to the za-axis is set as a “component za”, a rotational component around the za-axis is set as a “component ua”, a rotational component around the ya-axis is set as a “component va”, and a rotational component around the xa-axis is set as a “component wa”. A unit of a length (size) of the component xa, the component ya, and the component za is “mm”, and a unit of an angle (size) of the component ua, the component va, and the component wa is “°”. In this embodiment, calibration between the base coordinate system and the tip end coordinate system is completed. In this embodiment, the base coordinate system is regarded as the “robot coordinate system”. However, the tip end coordinate system may be regarded as the “robot coordinate system”.
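The rotational components described above (rotations around the z-axis, the y-axis, and the x-axis of a coordinate system) can be illustrated by composing three elementary rotations. The sketch below assumes the rotations are applied in z-y-x order and expressed in degrees; this order is an assumption made for illustration and is not stated in the embodiment.

```python
import math

def rotation_matrix(u, v, w):
    """Build a 3x3 rotation matrix from components u (around the z-axis),
    v (around the y-axis), and w (around the x-axis), in degrees.
    The z-y-x application order is an assumption for illustration."""
    cu, su = math.cos(math.radians(u)), math.sin(math.radians(u))
    cv, sv = math.cos(math.radians(v)), math.sin(math.radians(v))
    cw, sw = math.cos(math.radians(w)), math.sin(math.radians(w))
    rz = [[cu, -su, 0.0], [su, cu, 0.0], [0.0, 0.0, 1.0]]
    ry = [[cv, 0.0, sv], [0.0, 1.0, 0.0], [-sv, 0.0, cv]]
    rx = [[1.0, 0.0, 0.0], [0.0, cw, -sw], [0.0, sw, cw]]

    def matmul(a, b):
        return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
                for i in range(3)]

    return matmul(rz, matmul(ry, rx))
```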
Hitherto, the configuration of the robot 1 has been briefly described. In this embodiment, as described above, the holding unit is the hand 17. However, the holding unit may adopt any configuration as long as the workpiece can be held. For example, the holding unit may be a device (not illustrated) including a suction mechanism. Although not illustrated, the robot 1 may include a force detection device configured to include a six-axis force sensor for detecting a force (including a moment) applied to the hand 17, for example.
As illustrated in
Although not illustrated, for example, the first imaging unit 3 has an imaging element configured to include a charge coupled device (CCD) image sensor having a plurality of pixels, and an optical system including a lens. The first imaging unit 3 causes the lens to form an image on a light receiving surface of the imaging element by using light reflected from an imaging object, converts the light into an electric signal, and outputs the electric signal to the control device 5. The first imaging unit 3 is not limited to the above-described configuration, and may adopt other configurations as long as the configuration has an imaging function.
The first imaging unit 3 has a first image coordinate system, that is, a coordinate system of a captured image output from the first imaging unit 3. The first image coordinate system is a two-dimensional orthogonal coordinate system defined by an xb-axis and a yb-axis which are respectively parallel to an in-plane direction of the captured image (refer to
As illustrated in
Although not illustrated, for example, the second imaging unit 4 has an imaging element configured to include a charge coupled device (CCD) image sensor having a plurality of pixels, and an optical system including a lens. The second imaging unit 4 causes the lens to form an image on a light receiving surface of the imaging element by using light reflected from an imaging object, converts the light into an electric signal, and outputs the electric signal to the control device 5. The second imaging unit 4 is not limited to the above-described configuration, and may adopt other configurations as long as the configuration has an imaging function.
A second image coordinate system, that is, a coordinate system of a second captured image 40 output from the second imaging unit 4 is set in the second imaging unit 4. The second image coordinate system is a two-dimensional orthogonal coordinate system defined by an xc-axis and a yc-axis which are respectively parallel to an in-plane direction of the second captured image 40 (refer to
The control device 5 illustrated in
As illustrated in
The control unit 51 (processor) executes various programs stored in the storage unit 52. In this manner, each drive of the robot 1, the first imaging unit 3, and the second imaging unit 4 can be controlled, and various calculation and determination processes can be realized.
For example, the storage unit 52 is configured to include the volatile memory or the non-volatile memory. The storage unit 52 is not limited to a configuration where the control device 5 is internally equipped with the storage unit 52 (the volatile memory or the nonvolatile memory), and may adopt a configuration having a so-called external storage device (not illustrated).
The storage unit 52 stores various programs (commands) which can be executed by a processor. The storage unit 52 can store various data items received by the external input/output unit 53.
The various programs include a robot drive command relating to drive of the robot 1, a first coordinate transformation command relating to correlation between the first image coordinate system and the tip end coordinate system of the robot 1 or the robot coordinate system (base coordinate system), a second coordinate transformation command relating to correlation between the second image coordinate system and the tip end coordinate system of the robot 1 or the robot coordinate system (base coordinate system), and a robot coordinate transformation command relating to correlation between the tip end coordinate system and the base coordinate system.
The first coordinate transformation command is a command to obtain a coordinate transformation equation for transforming a first image coordinate (xb, yb, and ub: position and posture) serving as a coordinate in the first image coordinate system into a coordinate (xa, ya, and ua: position and posture) in the tip end coordinate system of the robot 1 or a robot coordinate (xr, yr, and ur: position and posture) serving as a coordinate in the robot coordinate system. The first coordinate transformation command is executed, thereby enabling the correlation among the first image coordinate system, the tip end coordinate system, and the robot coordinate system. The second coordinate transformation command is a command to obtain a coordinate transformation equation for transforming a second image coordinate (xc, yc, and uc: position and posture) serving as a coordinate in the second image coordinate system into the coordinate (xa, ya, and ua: position and posture) in the tip end coordinate system of the robot 1 or the robot coordinate. The second coordinate transformation command is executed, thereby enabling the correlation among the second image coordinate system, the tip end coordinate system, and the robot coordinate system.
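As an illustrative sketch of what such a coordinate transformation command may compute, an affine transform can map a first image coordinate (xb, yb) in pixels into a robot coordinate (xr, yr) in mm, with the rotational component ub offset by the in-plane rotation of the transform. The affine form and all parameter names below are assumptions made for illustration, not the embodiment's actual implementation.

```python
import math

def image_to_robot(xb, yb, ub, a, b, c, d, tx, ty):
    """Transform an image coordinate (xb, yb, ub) into a robot
    coordinate (xr, yr, ur).

    [a b; c d] is the 2x2 linear part (scale and rotation) and
    (tx, ty) the translation, expressed in robot-coordinate units
    (mm); these parameters are hypothetical, standing in for the
    coordinate transformation equation of the embodiment.
    """
    xr = a * xb + b * yb + tx
    yr = c * xb + d * yb + ty
    # Offset ub by the rotation of the image frame relative to the
    # robot frame, in degrees.
    ur = ub + math.degrees(math.atan2(c, a))
    return xr, yr, ur
```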
For example, various data items include data output from a plurality of position sensors 140 belonging to the robot 1, data of the captured image output from the first imaging unit 3, and data of the captured image output from the second imaging unit 4. Various data items include data of the number of respective pixels of the first imaging unit 3 and the second imaging unit 4, and data relating to speed or acceleration (more specifically, movement speed and movement acceleration of the hand 17, for example) of the robot 1 when calibration is performed (to be described later).
For example, the external input/output unit 53 is configured to include an I/O interface circuit, and is used for connecting the control device 5 to other respective devices (the robot 1, the first imaging unit 3, the second imaging unit 4, the display device 501, and the input device 502). Therefore, the external input/output unit 53 has a function as a receiving unit which receives various data items output from the robot 1, the first imaging unit 3, and the second imaging unit 4. The external input/output unit 53 has a function to output and display information relating to various screens (for example, an operation screen) on a monitor of the display device 501.
In addition to the above-described configuration, the control device 5 may further include other additional configurations. The control unit 51 may be configured to include a single processor or a plurality of processors. The storage unit 52 and the external input/output unit 53 may be similarly configured.
The display device 501 illustrated in
For example, the input device 502 is configured to include a keyboard. Therefore, the worker operates the input device 502, thereby enabling the worker to instruct the control device 5 to perform various processes. Although not illustrated, the input device 502 may be configured to include a teaching pendant, for example.
Instead of the display device 501 and the input device 502, a display input device (not illustrated) provided with both functions of the display device 501 and the input device 502 may be used. For example, as the display input device, a touch panel can be used. The robot system 100 may have one display device 501 and one input device 502, or may have a plurality of the display devices 501 and the input devices 502.
Hitherto, a basic configuration of the robot system 100 has been briefly described. The robot system 100 has the control device 5 and the robot 1 controlled by the control device 5. The control device 5 performs control to be described later.
According to the robot system 100 configured in this way, the control can be performed by the control device 5 (to be described later). Accordingly, workability of the robot system 100 operated by the worker can be improved. The robot 1 can more correctly, quickly, and accurately carry out the work on the workpiece 91.
As illustrated in
Specific work content performed by the robot 1 is not particularly limited. However, in the work (Step S20) carried out by the robot 1, a “workpiece” used in the calibration (Step S10) or “one having a configuration the same as or equivalent to the workpiece” is used. Therefore, in this embodiment, as will be described later, the workpiece 91 illustrated in
The specific work content of the work (Step S20) carried out by the robot 1 is not particularly limited. Therefore, hereinafter, description thereof will be omitted, and the calibration (Step S10) will be described.
In the calibration (Step S10), the calibration (correlation) is performed between the first image coordinate system of the first imaging unit 3 and the robot coordinate system of the robot 1. In order to cause the robot 1 to carry out various types of work based on the data of the captured image output from the first imaging unit 3, the robot system 100 obtains a coordinate transformation equation for transforming the coordinate (first image coordinate: xb, yb, and ub) in the first image coordinate system into the coordinate (robot coordinate: xr, yr, and ur) in the robot coordinate system. The correlation can be performed between the first image coordinate system and the robot coordinate system by obtaining the coordinate transformation equation.
In this embodiment, the calibration is performed using the workpiece 91 illustrated in
Hereinafter, referring to the flowchart illustrated in
First, the control unit 51 drives the robot arm 10 so as to grip one workpiece 91a out of nine workpieces 91a to 91i by using the hand 17 as illustrated in
Here, in this embodiment, as illustrated in
Next, the control unit 51 locates the workpiece 91a inside a field of view of the first imaging unit 3, that is, inside an imaging region S3, and places the workpiece 91a on the work table 71 as illustrated in
Next, the control unit 51 stores the robot coordinate of the predetermined point P6 in the storage unit 52 (Step S13). In this case, as illustrated in
Next, the control unit 51 releases the hand 17, and detaches the hand 17 from the workpiece 91a as illustrated in
Next, the control unit 51 causes the first imaging unit 3 to image the workpiece 91a, and causes the storage unit 52 to store a first image coordinate in the through-hole 911 of the workpiece 91a which is obtained based on the data of the first captured image 30 (Step S15).
Next, the control unit 51 determines whether or not Steps S11 to S15 described above are performed a predetermined number of times (Step S16), and repeats Steps S11 to S15 until the steps are performed the predetermined number of times. In this embodiment, Steps S11 to S15 described above are repeated nine times. In other words, in this embodiment, the control unit 51 repeats Steps S11 to S15 until it is determined that nine pairs of the robot coordinate and the first image coordinate are acquired.
Here, in this embodiment, the control unit 51 moves the workpiece 91a so that the through-hole 911 of the workpiece 91a projected on the first captured image 30 is projected at a different position at each time. In particular, as illustrated in
Next, if the steps are repeated the predetermined number of times (nine in this embodiment), based on the nine robot coordinates of the predetermined point P6 and the nine first image coordinates of the workpiece 91a, the control unit 51 obtains a coordinate transformation equation for transforming the first image coordinate into the robot coordinate (Step S17). In this manner, the calibration (correlation) is completed between the first image coordinate system and the robot coordinate system.
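As an illustrative sketch of Step S17, the coordinate transformation equation can be modeled as a two-dimensional affine transform fitted to the nine stored pairs by least squares. The affine model and the pure-Python normal-equation solver below are assumptions made for illustration; the embodiment does not specify the form of the equation or how it is obtained.

```python
def fit_affine(image_pts, robot_pts):
    """Fit (a, b, tx, c, d, ty) of an assumed affine model
        xr = a*xb + b*yb + tx,  yr = c*xb + d*yb + ty
    to paired image coordinates and robot coordinates by least squares."""

    def solve3(m, v):
        # Gaussian elimination with partial pivoting on a 3x3 system.
        aug = [row[:] + [v[i]] for i, row in enumerate(m)]
        for col in range(3):
            piv = max(range(col, 3), key=lambda r: abs(aug[r][col]))
            aug[col], aug[piv] = aug[piv], aug[col]
            for r in range(col + 1, 3):
                f = aug[r][col] / aug[col][col]
                for k in range(col, 4):
                    aug[r][k] -= f * aug[col][k]
        x = [0.0, 0.0, 0.0]
        for r in range(2, -1, -1):
            x[r] = (aug[r][3] - sum(aug[r][k] * x[k] for k in range(r + 1, 3))) / aug[r][r]
        return x

    # Normal equations A^T A p = A^T r, solved independently for the
    # xr row and the yr row; each design row is (xb, yb, 1).
    rows = [(xb, yb, 1.0) for xb, yb in image_pts]
    ata = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    px = solve3(ata, [sum(r[i] * p[0] for r, p in zip(rows, robot_pts)) for i in range(3)])
    py = solve3(ata, [sum(r[i] * p[1] for r, p in zip(rows, robot_pts)) for i in range(3)])
    return px[0], px[1], px[2], py[0], py[1], py[2]  # a, b, tx, c, d, ty
```

With nine well-spread positions, as in the embodiment, the system is over-determined, so the fit also averages out small detection errors in the individual images.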
Hitherto, the calibration (Step S10) has been briefly described. Here, if the obtained coordinate transformation equation is used, a position and a posture of an imaging target imaged by the first imaging unit 3 can be transformed into a position and a posture in the robot coordinate system. Furthermore, as described above, the correlation is in a completed state between the robot coordinate system (base coordinate system) and the tip end coordinate system. Accordingly, the position and the posture of the imaging target imaged by the first imaging unit 3 can also be transformed into a position and a posture in the tip end coordinate system. Therefore, based on the first captured image 30, the control unit 51 can locate the hand 17 of the robot 1 and the workpiece 91a gripped by the hand 17 at a desired place. Therefore, by using the coordinate transformation equation obtained as a result of the calibration (Step S10), the robot 1 can properly carry out the work (Step S20) based on the first captured image 30 from the first imaging unit 3.
As described above, the control device 5 includes the external input/output unit 53 which functions as a receiving unit to receive information relating to the first captured image 30 from the first imaging unit 3 capable of capturing the image, and the control unit 51 which can execute the command relating to the drive of the robot 1 having the movable unit 20 capable of holding the workpiece 91a, based on the information relating to the first captured image 30. The control unit 51 can perform the correlation between the robot coordinate system serving as the coordinate system relating to the robot 1 and the first image coordinate system serving as the coordinate system relating to the first captured image 30. The control unit 51 performs the calibration (correlation), based on the robot coordinate (coordinate in the robot coordinate system) of the predetermined point P6 as the predetermined site of the movable unit 20 holding the workpiece 91a when the workpiece 91a is located at each of the plurality of positions inside the imaging region S3 of the first imaging unit 3, and the first image coordinate (coordinate in the first image coordinate system) of the workpiece 91a when the workpiece 91a is located at each of the plurality of positions.
According to this control device 5, the calibration (correlation) can be performed using the workpiece 91a serving as an actual workpiece of the robot 1. Accordingly, time and labor can be saved in preparing a dedicated member for the calibration in the related art and a more accurate calibration can be performed. In particular, it is not necessary to prepare the dedicated member in view of the height of the workpiece 91a as in the related art. Since the calibration is performed using the workpiece 91a, it is possible to use a design height of the workpiece 91a. Accordingly, the calibration in a height direction (zr-axis direction) can be omitted. In this way, a calibration procedure is simplified, and workability of a worker can be improved. Based on a result (coordinate transformation equation) of the calibration using the workpiece 91a, the robot 1 can carry out various types of work. Therefore, the robot 1 can correctly carry out the work on the workpiece 91a.
In this embodiment, the predetermined point P6 serving as the predetermined site is set. However, the predetermined site may be located at any place of the movable unit 20. For example, the predetermined site may be a tool center point P, or a tip end center of the arm 15. In this embodiment, a place serving as a reference of the workpiece 91a in the calibration is the through-hole 911. However, without being limited thereto, the place serving as the reference may be a corner portion of the workpiece 91a, for example.
As described above, in the calibration (correlation), the control unit 51 obtains a coordinate in the robot coordinate system of the predetermined point P6 serving as the predetermined site in a state where the workpiece 91a is held by the movable unit 20 (Step S13), and detaches the movable unit 20 from the workpiece 91a (Step S14). Thereafter, the control unit 51 obtains a coordinate in the first captured image 30 of the workpiece 91a (Step S15).
In this manner, the calibration can be correctly, quickly, and more accurately (more precisely) performed. The calibration can be performed using only the first imaging unit 3 (one imaging unit) with which the robot 1 actually carries out the work. Therefore, the workability is more satisfactory to the worker.
Furthermore, as described above, the first imaging unit 3 is installed so as to be capable of imaging the work table 71. The external input/output unit 53 having a function as the receiving unit can communicate with the first imaging unit 3 disposed so as to be capable of imaging the work table 71 on which the workpiece 91a is placed.
In this manner, the workpiece 91a placed on the work table 71 can be imaged, and the calibration can be correctly performed using the first captured image 30 obtained by imaging the workpiece 91a. Furthermore, when the robot 1 carries out the work on the workpiece 91a and the workpieces 91b to 91i having the same configuration, based on a result of the calibration, the control unit 51 enables the robot 1 to properly carry out the work by using the first captured image 30. In this way, the calibration can be performed using only the first imaging unit 3 (one imaging unit), and the actual work can be carried out by the robot 1. Therefore, the workability is very satisfactory to the worker.
As described above, in the calibration (correlation), the control unit 51 uses the first captured image 30 captured when the workpiece 91a is located at a first position P10 inside the imaging region S3, and the first captured image 30 captured when the workpiece 91a is located at a second position P20 which is different from the first position P10 inside the imaging region S3 (refer to
In this manner, the calibration can be performed even when only one workpiece 91a is used. The calibration can be quickly, easily, and more accurately performed using one first imaging unit 3. Therefore, time and labor of the worker can be saved.
The first position P10 and the second position P20 may be any positions as long as they are different from each other, and both positions are not limited to the positions respectively illustrated in
Here, in the above description, the control unit 51 performed the calibration using one workpiece 91a. However, the calibration can also be performed using a plurality of the workpieces 91a to 91i (refer to
In the calibration using a plurality of the workpieces 91a to 91i, for example, the workpiece 91a serving as a first workpiece is located at the first position P10 (refer to
In this way, the workpiece 91 includes the workpiece 91a (first workpiece) and the workpiece 91b (second workpiece) different from the workpiece 91a. In the calibration (correlation), the control unit 51 uses the first captured image 30 captured when the workpiece 91a is located at the first position P10 inside the imaging region S3, and the first captured image 30 captured when the workpiece 91b is located at the second position P20 different from the first position P10 inside the imaging region S3. In this embodiment, the control unit 51 uses each of the first captured images 30 when the different workpieces 91a to 91i are located at nine respective positions.
In this way, the calibration can be performed using a plurality of the workpieces 91a to 91i. According to this method, the time and labor of preparing a dedicated member for the calibration can also be saved.
For example, the control unit 51 locates the different workpieces 91a to 91i at any desired nine positions, one workpiece at each position. Thereafter, the control unit 51 causes the first imaging unit 3 to collectively image the workpieces 91a to 91i. That is, as illustrated in
In this way, the workpiece 91 includes the workpiece 91a (first workpiece) and the workpiece 91b (second workpiece) different from the workpiece 91a. In the calibration (correlation), the control unit 51 uses the first captured image 30 captured when the workpiece 91a is located at the first position P10 inside the imaging region S3 and the workpiece 91b is located at the second position P20 different from the first position P10 inside the imaging region S3. In this embodiment, the different workpieces 91a to 91i are respectively located at the nine positions. Thereafter, the control unit 51 uses the first captured image 30 obtained by collectively imaging the nine workpieces 91a to 91i (refer to
In this manner, the calibration can be more quickly performed, compared to a case of using the first captured image 30 captured for each of the above-described positions.
As described above, a control method using the control device 5 includes Step S10 for performing the calibration (correlation) between the robot coordinate system serving as the coordinate system relating to the robot 1 having the movable unit 20 capable of holding the workpiece 91, and the first image coordinate system serving as the coordinate system relating to the first captured image 30 received from the first imaging unit 3 capable of capturing the image, and Step S20 for driving the robot 1, based on the result of the calibration and the information relating to the first captured image 30 which is received from the first imaging unit 3. In Step S10, the calibration (correlation) is performed based on the coordinate in the robot coordinate system of the predetermined point P6 serving as the predetermined site of the movable unit 20 holding the workpiece 91 when the workpiece 91 is located at each of the plurality of positions inside the imaging region S3 of the first imaging unit 3, and the coordinate in the first image coordinate system of the workpiece 91 when the workpiece 91 is located at each of the plurality of positions.
According to this control method, as described above, the control is performed based on the calibration result obtained using the workpiece 91. Therefore, the robot 1 can correctly, quickly, and accurately carry out the work on the workpiece 91.
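As a minimal illustrative sketch (not part of the claimed embodiments), the correlation of Step S10 can be viewed as fitting a planar affine map from the first image coordinate system to the robot coordinate system, given the corresponding (image, robot) coordinate pairs collected at the plurality of positions. All function names below are illustrative assumptions:

```python
import numpy as np

def fit_image_to_robot_affine(image_pts, robot_pts):
    """Fit an affine map (x_r, y_r) ~ A @ (x_b, y_b) + t from N >= 3
    corresponding points collected at the plurality of positions."""
    image_pts = np.asarray(image_pts, dtype=float)
    robot_pts = np.asarray(robot_pts, dtype=float)
    n = len(image_pts)
    # Design matrix: one row [x_b, y_b, 1] per correspondence.
    G = np.hstack([image_pts, np.ones((n, 1))])
    # Least-squares solve for both robot axes at once.
    coeffs, *_ = np.linalg.lstsq(G, robot_pts, rcond=None)
    A, t = coeffs[:2].T, coeffs[2]
    return A, t

def image_to_robot(A, t, p_image):
    """Transform a first-image coordinate into a robot coordinate."""
    return A @ np.asarray(p_image, dtype=float) + t
```

With nine positions the system is over-determined, so the least-squares fit also averages out small detection errors.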
Hitherto, the control method has been described. In this embodiment, as illustrated in
In this embodiment, the workpiece 91 having the configuration illustrated in
Next, a second embodiment will be described.
This embodiment is mainly the same as the above-described embodiment except that a coordinate transformation equation with low precision is obtained so that Steps S11 to S16 are automatically performed. In the following description, points different from those according to the above-described embodiment will be mainly described, and description of similar matters will be omitted.
Hereinafter, referring to the flowchart illustrated in
First, the control unit 51 obtains the coordinate transformation equation between the base coordinate system and the first image coordinate system (
Specifically, the coordinate transformation equation in Step S21 can be generated through a process of moving the workpiece 91a to any desired two places inside a field of view of the first imaging unit 3.
More specifically, in Step S21, the control unit 51 first causes the hand 17 to grip the workpiece 91a, and moves the workpiece 91a to any two different desired positions so as to acquire two pairs of a robot coordinate (xr and yr) of the predetermined point P6 and a first image coordinate (xb and yb). For example, as illustrated in
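The coarse coordinate transformation equation of Step S21 can be sketched as follows. This is an illustration only: it assumes the two coordinate systems differ by rotation, uniform scale, and translation with no mirroring, in which case the two acquired (image, robot) pairs fully determine the transform (function names are illustrative):

```python
def coarse_transform_from_two_points(img1, rob1, img2, rob2):
    """Estimate a similarity transform (rotation, uniform scale,
    translation) mapping first-image coordinates to robot coordinates
    from two (image, robot) pairs.  Points are treated as complex
    numbers so that z_r = alpha * z_b + beta."""
    zb1, zb2 = complex(*img1), complex(*img2)
    zr1, zr2 = complex(*rob1), complex(*rob2)
    alpha = (zr2 - zr1) / (zb2 - zb1)   # encodes rotation and scale
    beta = zr1 - alpha * zb1            # encodes translation

    def to_robot(p):
        z = alpha * complex(*p) + beta
        return (z.real, z.imag)
    return to_robot
```

Because only two pairs are used, the result is deliberately coarse; it merely needs to be accurate enough to place the workpiece near each target in the subsequent automatic steps.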
Next, nine reference points 301 are set in the first captured image 30 as illustrated in
If the nine reference points 301 are set, based on the coordinate transformation equation obtained in Step S21 described above, the control unit 51 moves the workpiece 91a so that the through-hole 911 is located at the nine reference points 301. Since the coordinate transformation equation obtained in Step S21 described above is used, the robot coordinate at the designated position inside the first captured image 30 is recognized. Accordingly, a jog operation based on a command of the worker can be omitted. Therefore, Steps S11 to S16 can be automatically performed.
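The setting of the nine reference points 301 on a lattice, and their conversion into robot-coordinate move targets using the coarse transform from Step S21, can be sketched as follows. The lattice spacing, the margin value, and all names are illustrative assumptions:

```python
def reference_point_targets(width, height, image_to_robot, n_side=3, margin=0.25):
    """Set n_side x n_side reference points on an equally spaced lattice
    inside the first captured image, then convert each to a
    robot-coordinate target so the workpiece can be moved there without
    jog operations.  The margin keeps the workpiece inside the field of
    view."""
    step_x = (1 - 2 * margin) * width / (n_side - 1)
    step_y = (1 - 2 * margin) * height / (n_side - 1)
    points = [(margin * width + i * step_x, margin * height + j * step_y)
              for j in range(n_side) for i in range(n_side)]
    return [image_to_robot(p) for p in points]
```

Each returned target is then used as the destination of one automatic placement in Steps S11 to S16.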
According to the method described above, Steps S11 to S16 can be automatically performed. Accordingly, time and labor can be further saved in the calibration. The nine reference points 301 can be set at substantially equal intervals. Therefore, the calibration accuracy can be improved compared to a case where the workpiece 91a is located at any desired nine positions by performing the jog operation based on the command of the worker.
In this embodiment, the nine reference points 301 are provided. However, the number of the reference points 301 may be optionally determined, and may be two or more; as the number of the reference points 301 increases, the calibration accuracy improves. In this embodiment, the reference points 301 are arrayed in a lattice shape. However, the array is not limited to the lattice shape.
In this embodiment described above, the same advantageous effect as that of the first embodiment can also be achieved.
Next, a third embodiment will be described.
This embodiment is mainly the same as the above-described embodiments except that a designated position of the workpiece is set using the first imaging unit (tool setting). In the following description, points different from those according to the above-described embodiments will be mainly described, and description of similar matters will be omitted.
As illustrated in
Hereinafter, referring to the flowchart illustrated in
First, before Step S11 described above is performed, the control unit 51 obtains a relative relationship between the robot coordinate system and the first image coordinate system (Step S23). Specifically, the control unit 51 locates the workpiece 91a inside the first captured image 30 as indicated by a solid line in
As described above, the workpiece 91a is moved to three places inside the first captured image 30. However, these locations are optionally set as long as the workpiece 91a is located inside the first captured image 30.
Next, the control unit 51 obtains coefficients a, b, c, and d in Equation (2) below, based on the acquired three robot coordinates and three first image coordinates. In this manner, the coordinate transformation equation can be obtained between the robot coordinate and the first image coordinate. Therefore, the amount of displacement (amount of movement) in the first image coordinate system can be transformed into the amount of displacement in the robot coordinate system (base coordinate system), and furthermore, can be transformed into the amount of displacement in the tip end coordinate system.
In Equation (2), Δxb and Δyb represent the amount of displacement (distance) between two places in the image coordinate system, and Δxa and Δya represent the amount of displacement between two places in the robot coordinate system.
In this way, based on the three robot coordinates and the three image coordinates which are obtained by moving the predetermined point P6 to three different places, the coordinate transformation equation (affine transformation equation) indicated by Equation (2) above is used. Accordingly, a relative relationship between the robot coordinate system and the image coordinate system can be easily and properly obtained.
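The body of Equation (2) is not reproduced above; from the surrounding description it is presumably the linear relation Δxa = a·Δxb + b·Δyb and Δya = c·Δxb + d·Δyb. Under that assumption only, the coefficients a, b, c, and d can be solved from the three acquired coordinate pairs as sketched below (an illustration, with illustrative names):

```python
import numpy as np

def solve_affine_coefficients(robot_pts, image_pts):
    """Solve the coefficients a, b, c, d relating image-coordinate
    displacements to robot-coordinate displacements, assuming
        dx_a = a*dx_b + b*dy_b,   dy_a = c*dx_b + d*dy_b,
    from three (robot, image) coordinate pairs."""
    r = np.asarray(robot_pts, dtype=float)
    b = np.asarray(image_pts, dtype=float)
    # Three points yield two independent displacement pairs, which
    # exactly determine the four coefficients.
    dB = np.array([b[1] - b[0], b[2] - b[0]])   # image displacements (rows)
    dR = np.array([r[1] - r[0], r[2] - r[0]])   # robot displacements (rows)
    # Solve dR_i = M @ dB_i for M = [[a, b], [c, d]].
    M = np.linalg.solve(dB, dR).T
    return M  # M[0] = (a, b), M[1] = (c, d)
```

Because only displacements are used, any constant translation between the two coordinate systems cancels out, which is consistent with Equation (2) relating displacements rather than absolute positions.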
Next, the through-hole 911 (designated position) of the workpiece 91a is set using the first imaging unit 3 (Step S24).
Specifically, the control unit 51 uses the coordinate transformation equation obtained in Step S23, and locates the through-hole 911 of the workpiece 91a at a center O30 of the first captured image 30 as illustrated in
Next, as illustrated in
After Steps S23 and S24 described above are performed, the control unit 51 performs Steps S11 to S17. In this manner, the control unit 51 can properly and easily perform the correlation between the first image coordinate system and the robot coordinate system, even in a state where the position of the workpiece 91a with respect to the predetermined point P6 is not recognized.
In a case where the position of the tool center point P with respect to the predetermined point P6 can be obtained from a design value or a measured value, Step S23 described above may be omitted. In Step S24, without performing the above-described method, the design value or the measured value may be used as the position of the tool center point P (and the position of the through-hole 911) with respect to the predetermined point P6.
In this embodiment described above, the same advantageous effect as that of the first embodiment can also be achieved.
Next, a fourth embodiment will be described.
This embodiment is mainly the same as the above-described embodiments except that a designated position of the workpiece is set using the second imaging unit (tool setting). In the following description, points different from those according to the above-described embodiments will be mainly described, and description of similar matters will be omitted.
The calibration illustrated in
In a case of using the hand 17 in this way, as illustrated in
In setting the through-hole 911 (designated position) of the workpiece 91a (Step S25), in a state where the hand 17 grips the workpiece 91a, the control unit 51 locates the workpiece 91a immediately above the second imaging unit 4. For example, in a case where the workpiece 91a is gripped by the hand 17 as illustrated in
In the calibration according to this embodiment, as described above, the external input/output unit 53 having a function as the receiving unit can receive the information relating to the second captured image 40 from the second imaging unit 4 capable of capturing the image and different from the first imaging unit 3. The control unit 51 can perform the coordinate transformation between the robot coordinate system and the second image coordinate system serving as the coordinate system relating to the second captured image 40. Based on the coordinate transformation, the control unit 51 can obtain the position of the workpiece 91a (particularly, the through-hole 911) with respect to the predetermined point P6 serving as the predetermined site.
In this manner, even in a state where the position of the workpiece 91a with respect to the predetermined point P6 is not recognized, the calibration (correlation) can be properly and easily performed between the first image coordinate system and the robot coordinate system.
Furthermore, as described above, the second imaging unit 4 is installed so as to be capable of imaging the workpiece 91a in a state where the workpiece 91a is held by the movable unit 20. In particular, the imaging direction of the second imaging unit 4 is opposite to the imaging direction of the first imaging unit 3. The second imaging unit 4 is capable of capturing the image vertically upward with respect to the second imaging unit 4. The first imaging unit 3 is capable of capturing the image vertically downward with respect to the first imaging unit 3. The external input/output unit 53 having the function as the receiving unit can communicate with the second imaging unit 4 disposed so as to be capable of imaging the workpiece 91a in a state where the workpiece 91a is held by the movable unit 20.
In this manner, the position of the workpiece 91a with respect to the predetermined point P6 can be efficiently obtained.
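Obtaining the position of the workpiece 91a with respect to the predetermined point P6 from the second captured image 40 can be sketched as follows. This is a hypothetical illustration only: it assumes the robot coordinate of P6 is known from the controller at the moment the second captured image 40 is taken, and that a calibrated second-image-to-robot transform is available (all names are illustrative):

```python
def offset_from_p6(hole_img_xy, second_image_to_robot, p6_robot_xy):
    """Estimate the position of the through-hole 911 relative to the
    predetermined point P6 (illustrative names).  The hole detected in
    the second captured image is transformed into the robot coordinate
    system and compared with the known robot coordinate of P6."""
    hx, hy = second_image_to_robot(hole_img_xy)
    px, py = p6_robot_xy
    return (hx - px, hy - py)
```

The returned offset can then stand in for the tool-setting result used in Steps S11 to S17, even though the grip position of the workpiece 91a is not known in advance.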
Here, as described in the first embodiment, the configuration of the “workpiece” is not limited to the configuration illustrated in
In this embodiment described above, the same advantageous effect as that of the first embodiment can also be achieved.
Hitherto, the control device, the robot system, and the control method according to the invention have been described with reference to the illustrated embodiments. However, the invention is not limited to these embodiments. The configuration of each unit can be substituted with any desired configuration having the same function. Any other configuration element may be added to the invention. The respective embodiments may be appropriately combined with each other.
In the embodiments described above, as an example of the robot belonging to the robot system according to the invention, a so-called six-axis vertically articulated robot has been described. However, the robot may be another robot such as a SCARA robot. Without being limited to a single arm robot, for example, another robot such as a double arm robot may be used. Therefore, the number of the movable units is not limited to one, and may be two or more. The number of the arms belonging to the robot arm included in the movable unit is six in the above-described embodiments. However, the number of the arms may be one to five, or may be seven or more.
The entire disclosure of Japanese Patent Application No. 2017-146229, filed Jul. 28, 2017 is expressly incorporated by reference herein.