The present invention relates to a control device, a robot, and a robot system.
In the related art, there is known a robot system that includes a robot having a robot arm provided with an end effector for carrying out work on an object, a camera attached to a distal portion of the robot arm, and a control device for controlling the driving of the robot.
As an example of this robot system, JP-A-2015-66603 discloses a robot system including a robot device having an articulated arm provided with a hand and a camera disposed on the arm positioned at the forefront of the articulated arm, and a control device for controlling a position and a posture of the robot device. In the robot system, a hand coordinate system of the hand and a camera coordinate system of the camera are set. In order to grip an object by using the hand, a robot calibration device performs calibration processing between the hand coordinate system and an image coordinate system, based on a captured image of the camera.
Here, in the robot system disclosed in JP-A-2015-66603, the camera is disposed on the distal arm, and rotates following the rotation of the distal arm. Therefore, there is a problem in that a wire of the camera is likely to be degraded by being frequently bent due to the rotation of the distal arm. In particular, in a case where the distal arm is frequently moved, the wire is significantly degraded.
An advantage of some aspects of the invention is to solve at least a part of the problems described above, and the invention can be implemented as the following configurations.
A control device according to an aspect of the invention controls a robot having a movable unit including a plurality of arms. The control device includes a processor that performs calibration between a coordinate system of an imaging unit disposed in an arm different from an arm positioned on a most distal side of the movable unit and a coordinate system of the robot.
In the control device according to the aspect of the invention, the calibration can be performed between the coordinate system of the imaging unit and the coordinate system of the robot. Accordingly, the robot is enabled to carry out accurate work, based on the captured image of the imaging unit. With the control device according to the aspect of the invention, the calibration can be performed for the imaging unit disposed in the arm different from the arm positioned on the most distal side of the movable unit. Accordingly, by using the control device, the imaging unit can be disposed in an arm other than the arm on the most distal side of the robot. Therefore, for example, it is possible to minimize a possibility that a wire of the imaging unit pulled from a proximal side of the robot may be degraded by being frequently bent due to the rotation of the distal arm.
In the control device according to the aspect of the invention, it is preferable that the control unit performs the calibration, based on a captured image obtained by causing the imaging unit to image a marker.
With this configuration, it is unnecessary to perform touch-up with a calibration jig on an object such as a calibration member (calibration plate), for example, and the calibration can be performed in a non-contact manner. Therefore, artificial variations in carrying out touch-up work can be minimized. Since the calibration can be performed in the non-contact manner, the calibration can be performed with high accuracy regardless of the material of the object, for example.
In the control device according to the aspect of the invention, it is preferable that the control unit controls the movable unit so that the movable unit does not appear in the captured image.
With this configuration, even if the marker is imaged by the imaging unit disposed in the arm different from the arm on the most distal side, it is possible to avoid a case where the movable unit appears in the captured image. Therefore, the more accurate calibration can be performed using the captured image.
In the control device according to the aspect of the invention, it is preferable that the imaging unit is capable of imaging a distal side of the movable unit. The control unit preferably controls the movable unit so that a distal portion of the movable unit does not appear in the captured image.
With this configuration, even if the marker is imaged by the imaging unit disposed in the arm different from the arm on the most distal side, it is possible to avoid a case where the distal portion (for example, an end effector) of the movable unit appears in the captured image. Therefore, the more accurate calibration can be performed using the captured image.
In the control device according to the aspect of the invention, it is preferable that the control unit performs the calibration, based on a first captured image obtained by positioning the imaging unit at a first position and causing the imaging unit to image the marker, and a second captured image obtained by positioning the imaging unit at a second position different from the first position and causing the imaging unit to image the marker, where a first posture of the imaging unit at the first position is different from a second posture of the imaging unit at the second position.
Even in a case where the first posture and the second posture are different from each other in this way, the more accurate calibration can be performed.
In the control device according to the aspect of the invention, it is preferable that the robot has a base which supports the movable unit, the arm different from the arm positioned on the most distal side of the movable unit is capable of rotating with respect to the base, and the control unit sets a plurality of reference points used in the calibration, based on the captured image, performs calibration on the plurality of reference points in view of the rotating of the imaging unit, and updates the plurality of reference points.
With this configuration, the calibration can be more accurately performed between the coordinate system of the imaging unit disposed in the arm different from the arm on the most distal side and the coordinate system of the robot.
In the control device according to the aspect of the invention, it is preferable that, based on information relating to a first region obtained by dividing a first search window set in the captured image and information of an object having the marker appearing in the captured image, the control unit sets a second search window by calibrating the first search window, and based on the second search window, the control unit sets the plurality of reference points.
With this configuration, an image of the object can be properly recognized during the calibration. Therefore, the calibration can be more accurately performed between the coordinate system of the imaging unit and the coordinate system of the robot.
In the control device according to the aspect of the invention, it is preferable that, based on a second region obtained by dividing the second search window, the control unit sets the plurality of reference points.
With this configuration, the plurality of reference points can be easily set.
In the control device according to the aspect of the invention, it is preferable that the control unit drives the movable unit so as to move the imaging unit to at least two locations without changing a posture of the imaging unit, and that, based on coordinates in a coordinate system of the imaging unit at the at least two locations and coordinates in a coordinate system of the robot at the at least two locations, the control unit calculates a transformation coefficient between the coordinate system of the imaging unit and the coordinate system of the robot, and calculates an offset of the imaging unit with respect to the arm having the imaging unit disposed therein.
With this configuration, a mounting position of the imaging unit to be mounted on the movable unit can be obtained using a relatively simple method.
In the control device according to the aspect of the invention, it is preferable that the control unit drives the movable unit so as to change the posture of the imaging unit without changing an imaging position imaged by the imaging unit, and the control unit updates the offset, based on the coordinates in the coordinate system of the robot before and after the posture of the imaging unit is changed.
With this configuration, the more accurate offset (specifically, misalignment of the position of the marker appearing in the captured image with a predetermined portion of the robot) can be obtained.
A robot according to an aspect of the invention is controlled by the control device according to the aspect of the invention and has a movable unit including a plurality of arms.
According to the robot, under the control of the control device, it is possible to accurately perform a calibration-related operation. For example, it is possible to minimize a possibility that the wire of the imaging unit pulled from the proximal side of the robot may be degraded after being frequently bent due to the rotation of the distal arm.
A robot system according to an aspect of the invention includes the control device according to the aspect of the invention, a robot controlled by the control device and having a movable unit including a plurality of arms, and an imaging unit.
According to the robot system, under the control of the control device, the robot can accurately perform the calibration-related operation. For example, it is possible to minimize a possibility that the wire of the imaging unit pulled from the proximal side of the robot may be degraded after being frequently bent due to the rotation of the distal arm.
The invention will be described with reference to the accompanying drawings, wherein like numbers reference like elements.
Hereinafter, a control device, a robot, and a robot system according to the invention will be described in detail with reference to preferred embodiments illustrated in the accompanying drawings.
For example, a robot system 100 illustrated in the drawings includes the robot 1, the mobile camera 3 serving as an “imaging unit”, and the control device 5 which controls the driving of the robot 1 and the operation of the mobile camera 3.
Hereinafter, each unit belonging to the robot system 100 will be sequentially described.
The robot 1 is a so-called horizontally articulated robot (SCARA robot). For example, the robot 1 can hold or transport a target member such as a precision instrument or a component. As illustrated in the drawings, the robot 1 includes the base 110 and the robot arm 10.
The base 110 is a portion for attaching the robot 1 to any desired installation location. The installation location of the base 110 is not particularly limited. For example, the installation location includes a floor, a wall, a ceiling, and a movable carriage.
A first arm 101 rotatable around a first axis J1 (rotating axis) along a perpendicular direction with respect to the base 110 is connected to an upper end portion of the base 110. A second arm 102 rotatable around a second axis J2 (rotating axis) along the perpendicular direction with respect to the first arm 101 is connected to a distal portion of the first arm 101. The work head 104 is disposed in a distal portion of the second arm 102. The work head 104 has a spline shaft 103 (arm, distal arm) inserted into a spline nut and a ball screw nut (neither of them illustrated) which are coaxially disposed in the distal portion of the second arm 102. The spline shaft 103 is rotatable around a third axis J3 with respect to the second arm 102, and is movable (capable of ascending and descending) in the upward/downward direction. Here, in the embodiment, as illustrated in
As illustrated in
The hand 150 is attached to the spline shaft 103 so that a center axis of the hand 150 coincides with the third axis J3 of the spline shaft 103 in design. Therefore, the hand 150 rotates in response to the rotating of the spline shaft 103. As illustrated in
As illustrated in
Each drive unit 130 is electrically connected to the motor driver 120 incorporated in the base 110 illustrated in
In the robot 1 having this configuration, as illustrated in
In the robot 1, a distal coordinate system based on the distal portion of the second arm 102 of the robot 1 is set. The distal coordinate system is a three-dimensional orthogonal coordinate system defined by an xa-axis, a ya-axis, and a za-axis which are orthogonal to each other. In the embodiment, as the origin, the distal coordinate system has an axis coordinate P2 of the second arm 102. The calibration between the base coordinate system and the distal coordinate system is previously completed. The robot 1 is in a state where the coordinates of the distal coordinate system can be calculated, based on the base coordinate system. A translation component for the xa-axis is set as a “component xa”, a translation component for the ya-axis is set as a “component ya”, a translation component for the za-axis is set as a “component za”, a rotation component around the za-axis is set as a “component ua”, a rotation component around the ya-axis is set as a “component va”, and a rotation component around the xa-axis is set as a “component wa”. The unit of the length (size) of the component xa, the component ya, and the component za is expressed using “mm”, and the unit of the angle (size) of the component ua, the component va, and the component wa is expressed using “degree (°)”.
Hitherto, a configuration of the robot 1 has been briefly described. Although not illustrated, for example, the robot 1 may include a force detection unit configured to include a force sensor (for example, a six-axis force sensor) which detects a force (including a moment) applied to the hand 150.
As illustrated in
The mobile camera 3 has an image sensor element 31 configured to include a charge coupled device (CCD) image sensor having a plurality of pixels, and a lens 32 (optical system). The mobile camera 3 converts light into an electric signal by causing the lens 32 to form an image of the light from an imaging target on the light receiving surface 311 (sensor surface) of the image sensor element 31, and outputs the converted electric signal to the control device 5. Here, the light receiving surface 311 is a surface of the image sensor element 31 on which the light forms the image.
This mobile camera 3 is attached to the second arm 102 so as to image the distal side of the robot arm 10. Therefore, in the embodiment, the mobile camera 3 can capture an image downward in the perpendicular direction. In the embodiment, the mobile camera 3 is attached to the second arm 102 so that an optical axis A3 of the mobile camera 3 (optical axis of the lens 32) is parallel to the third axis J3 of the spline shaft 103. The mobile camera 3 is disposed in the second arm 102. Accordingly, the position of the mobile camera 3 can be changed together with the driving (rotating) of the second arm 102.
In this mobile camera 3, as the image coordinate system of the mobile camera 3 (the coordinate system of a captured image 30 output from the mobile camera 3), a two-dimensional orthogonal coordinate system defined by an xb-axis and a yb-axis which extend in the in-plane directions of the captured image 30 is set (refer to
The control device 5 illustrated in
Hereinafter, each function (function unit) of the control device 5 will be described.
As illustrated in
The display control unit 51 is configured to include a graphic controller, for example, and is connected to the display device 41. The display control unit 51 has a function to display various screens (for example, operation screens) on the monitor of the display device 41. The input control unit 52 is connected to the input device 42, and has a function to receive an input from the input device 42.
The control unit 53 has a function to control the driving of the robot 1 and the operation of the mobile camera 3, and a function to perform processes for various calculations and determinations. For example, the control unit 53 is configured to include a CPU. Each function of the control unit 53 can be realized by causing the CPU to execute various programs stored in the storage unit 55.
Specifically, the control unit 53 controls the driving of each drive unit 130 so as to drive or stop the robot arm 10. For example, based on the information output from the position sensor 131 disposed in each drive unit 130, the control unit 53 derives a target value of a motor (not illustrated) belonging to each drive unit 130 in order to move the hand 150 to a target position. The control unit 53 performs the processes for various calculations and various determinations, based on the information acquired by the input/output unit 54 and transmitted from the position sensor 131 and the mobile camera 3. For example, the control unit 53 calculates coordinates (components xb, yb, ub: position and the posture) of an imaging target in the image coordinate system, based on the captured image 30 captured by the mobile camera 3. For example, the control unit 53 obtains calibration parameters for transforming coordinates (image coordinates) in the image coordinate system of the mobile camera 3 into coordinates (robot coordinates) in the distal coordinate system of the robot 1 or coordinates (base coordinates) in the base coordinate system of the robot 1. In the embodiment, the distal coordinates of the robot 1 are regarded as the “robot coordinates”, but the base coordinates may be regarded as the “robot coordinates”.
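The transformation described here amounts, for the planar components, to an affine map from image coordinates to robot coordinates. The following is a minimal sketch of how such calibration parameters would be applied once obtained; the function name, the matrix M, and the numerical values are hypothetical, since the actual program of the control device 5 is not disclosed.

```python
import numpy as np

def image_to_robot(p_image, M, t):
    """Transform an image coordinate (xb, yb) into a robot coordinate (xa, ya)
    using a 2x2 matrix M and a translation vector t, i.e. the kind of
    calibration parameters obtained by the procedure described below."""
    return M @ np.asarray(p_image, dtype=float) + t

# Example: a camera whose pixels correspond to 0.1 mm steps, rotated 90 degrees.
M = np.array([[0.0, -0.1],
              [0.1,  0.0]])
t = np.array([250.0, 30.0])              # mm, offset of the image origin
print(image_to_robot((320, 240), M, t))  # -> robot coordinates in mm
```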
For example, the input/output unit 54 (information acquisition unit) is configured to include an interface circuit, and has a function to exchange information with the robot 1 and the mobile camera 3. For example, the input/output unit 54 has a function to acquire information such as a rotation angle of a rotary shaft of the motor or speed reducer belonging to each drive unit 130 of the robot 1, and the captured image 30. For example, the input/output unit 54 outputs the target value of the motor derived from the control unit 53 to the robot 1.
For example, the storage unit 55 is configured to include a RAM and a ROM, and stores programs and various data items for the control device 5 to perform various processes. For example, the storage unit 55 stores a program for performing calibration. The storage unit 55 is not limited to those which are incorporated in the control device 5 (RAM and ROM), and may be configured to have a so-called external storage device (not illustrated).
As described above, the display device 41 includes the monitor (not illustrated) such as a display, and has a function to display the captured image 30, for example. Therefore, a worker can confirm the captured image 30 and work of the robot 1 via the display device 41. As described above, the input device 42 is configured to include the mouse or the keyboard, for example. Therefore, the worker operates the input device 42, thereby enabling the worker to issue various process instructions to the control device 5. Instead of the display device 41 and the input device 42, a display input device (not illustrated) which serves as both the display device 41 and the input device 42 may be used. For example, as the display input device, a touch panel can be used.
Hitherto, the basic configuration of the robot system 100 has been briefly described. In this robot system, the robot 1 is caused to carry out the work, based on the captured image 30. For this purpose, it is necessary to obtain a transformation matrix equation (calibration parameter) which transforms the image coordinates (xb, yb, and ub) to the robot coordinates (xa, ya, and ua). That is, calibration is needed between the mobile camera 3 and the robot 1. In accordance with an instruction from the worker, this calibration is automatically performed by the control device 5, based on a program for performing the calibration.
Hereinafter, the calibration (various settings and performances for the calibration) will be described.
Before the calibration is performed, the worker places the object 60 on the work table 91 as illustrated in
Hereinafter, each process (Step) will be described with reference to a flowchart illustrated in
First, the input/output unit 54 acquires image information of the mobile camera 3, and the storage unit 55 stores the acquired image information. The image information means information on the number of pixels of the mobile camera 3.
Next, the control unit 53 performs calibration property setting. Specifically, the calibration property setting means setting the speed and acceleration of the robot 1 when the calibration is performed (more specifically, for example, the movement speed and movement acceleration of the hand 150), and setting a local plane (work plane). The speed and the acceleration of the robot 1 when the calibration is performed are not particularly limited. However, it is preferable that the speed and the acceleration are 30 to 70% of the maximum speed and the maximum acceleration. In this manner, it is possible to further minimize variations in the obtained results of the calibration, and it is possible to further improve the accuracy of the calibration.
Next, the control unit 53 performs Steps S13 and S14 so as to obtain a relative relationship between the distal coordinate system and the image coordinate system.
Next, the control unit 53 moves the predetermined portion (in the embodiment, the axis coordinate P2) of the robot 1 from a point A0 to two points A1 and A2 different from the point A0, and acquires the image coordinates of the marker 61 appearing on the captured image 30 when the predetermined portion moves from the point A0 to each of the points A1 and A2 (refer to
Specifically, first, the control unit 53 drives the robot arm 10 so that the marker 61 is positioned (so as to appear) at a center O30 of the captured image 30, and acquires image data from the mobile camera 3 (refer to
Next, the control unit 53 drives the robot arm 10, moves the axis coordinate P2 from the point A0 in a direction of an arrow a11 so as to be positioned at the point A1, and acquires the image data from the mobile camera 3. For example, the axis coordinate P2 is moved 10 mm from the point A0 in the xa-direction, and is moved 0 mm in the ya-direction so as to be positioned at the point A1 (refer to
The control unit 53 drives the robot arm 10, moves the axis coordinate P2 from the point A0 in a direction of an arrow a12 so as to be positioned at the point A2, and acquires the image data from the mobile camera 3. For example, the axis coordinate P2 is moved 0 mm from the point A0 in the xa-direction, and is moved 10 mm in the ya-direction so as to be positioned at the point A2 (refer to
A movement amount from the point A0 to the points A1 and A2 is not limited to the above-described numerical value. The movement amount may be optionally determined.
Next, based on the three robot coordinates and three image coordinates stored in Step S13, the control unit 53 obtains a coordinate transformation equation between the robot coordinates and the image coordinates by using a coordinate transformation equation (Equation (1)) for transforming between the robot coordinates and the image coordinates.
Δxb and Δyb in Equation (1) represent a distance (displacement) between two points in the image coordinate system, and Δxa and Δya represent a distance (displacement) between two points in the distal coordinate system. In addition, a, b, c, and d are unknown variables.
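Equation (1) itself is not reproduced in this text. Judging from the definitions above and the derivation below, it is presumably the linear relation between displacements

$$
\begin{pmatrix} \Delta x_b \\ \Delta y_b \end{pmatrix}
=
\begin{pmatrix} a & b \\ c & d \end{pmatrix}
\begin{pmatrix} \Delta x_a \\ \Delta y_a \end{pmatrix}.
\tag{1}
$$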
The variables a and c are obtained as follows, from Equation (1), the points A0 and A1, the center O30, and the point B1.
Δxb1 = xb1 − xb0 = 10·a + 0·b
Δyb1 = yb1 − yb0 = 10·c + 0·d
⇒ a = Δxb1/10, c = Δyb1/10
Similarly, the variables b and d are obtained as follows from Equation (1), the points A0 and A2, the center O30, and the point B2.
Δxb2 = xb2 − xb0 = 0·a + 10·b
Δyb2 = yb2 − yb0 = 0·c + 10·d
⇒ b = Δxb2/10, d = Δyb2/10
The variables a, b, c, and d are obtained in this way, and the coordinate transformation equation (linear transformation matrix) can be generated. In this manner, a relative relationship between the distal coordinate system and the image coordinate system can be obtained, and a displacement (movement amount) in the image coordinate system can be transformed into a displacement (movement amount) in the distal coordinate system. In this way, the coordinate transformation equation (affine transformation equation) of Equation (1) is applied to the three robot coordinates and three image coordinates obtained by moving the axis coordinate P2 to three different locations. In this manner, the relative relationship between the distal coordinate system and the image coordinate system can be easily and properly obtained.
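As a concrete illustration of Steps S13 and S14, the following sketch solves for a, b, c, and d from the three stored coordinate pairs. The numerical image coordinates are hypothetical; only the 10 mm moves along the xa- and ya-directions come from the text above.

```python
import numpy as np

def affine_from_three_points(robot_pts, image_pts):
    """Estimate the 2x2 matrix (a, b; c, d) of Equation (1) from three
    corresponding points: robot coordinates (xa, ya) of the axis coordinate P2
    at A0, A1, A2 and image coordinates (xb, yb) of the marker at O30, B1, B2."""
    a0, a1, a2 = (np.asarray(p, dtype=float) for p in robot_pts)
    b0, b1, b2 = (np.asarray(p, dtype=float) for p in image_pts)
    d_robot = np.column_stack([a1 - a0, a2 - a0])   # displacements in mm
    d_image = np.column_stack([b1 - b0, b2 - b0])   # displacements in pixels
    # Solve  d_image = M @ d_robot  for M = ((a, b), (c, d)).
    return d_image @ np.linalg.inv(d_robot)

# The 10 mm moves of Step S13: A0 -> A1 along xa, A0 -> A2 along ya.
M = affine_from_three_points(
    robot_pts=[(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)],
    image_pts=[(320.0, 240.0), (420.0, 236.0), (324.0, 340.0)])
print(M)  # M[0, 0] = a, M[0, 1] = b, M[1, 0] = c, M[1, 1] = d
```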
In the embodiment, the process in Step S14 is performed after the process in Step S13 is performed. However, these processes may be performed at the same time.
Next, the control unit 53 generates the plurality of (nine in the embodiment) reference points used in performing the calibration (Step S20 or Step S24) (to be described later). Hereinafter, the calibration will be described with reference to the flowchart illustrated in
The control unit 53 drives the robot arm 10 so as to position the marker 61 at the center O301 of the first search window 301 set in the captured image 30 illustrated in
Determination of Whether or not the Object Falls within the First Region
The control unit 53 recognizes the image of the object 60, and determines whether or not the object 60 falls within one of the first regions S1 obtained by equally dividing the first search window 301 into nine regions. In the embodiment, the image of the object 60 is recognized (detected) by recognizing an outer shape (for example, four corner portions) of the object 60. A method of recognizing the image is not limited to the method of recognizing the four corner portions of the object 60, and any method may be used. For example, it is also effective to recognize the object 60 by acquiring a value of a rotation angle of the object 60 with respect to a model (template) of the outer shape registered in advance. The object 60 may be not only a physical object but also a model window for detecting one.
A Case where the Object Falls within the First Region
In a case of falling within the first region S1 as in an object 60b (object 60) illustrated in
The control unit 53 sets the center point of each first region S1 of the first search window 301 as the reference point 305 to be used for the calibration (refer to
In the embodiment, the center point of the first region S1 is set as the reference point 305. However, the reference point 305 is not limited to the center point of the first region S1. Any location may be used as long as the location falls within the first region S1.
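As a concrete sketch of this grid construction, the centers of the nine equal regions of a search window can be computed as follows; the window center and size are hypothetical parameters, since the actual values depend on the captured image 30.

```python
def reference_points(center_x, center_y, width, height):
    """Return the centers of the nine regions obtained by equally dividing a
    search window into a 3x3 grid; these centers serve as the reference
    points 305 described above."""
    pts = []
    for row in (-1, 0, 1):          # -1: top row, 0: middle row, 1: bottom row
        for col in (-1, 0, 1):      # each region spans one third of the window
            pts.append((center_x + col * width / 3.0,
                        center_y + row * height / 3.0))
    return pts

print(reference_points(320, 240, 600, 450))  # nine (xb, yb) points
```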
A Case where the Object does not Fall within the First Region
In a case of not falling within the first region S1 as in an object 60a (object 60) illustrated in
In view of a region (protruding portion) in which the object 60a does not fall within the first region S1, the control unit 53 sets a second search window 302 smaller than the first search window 301 (refer to
Here, in performing the calibration (Step S20 or Step S24) (to be described later), work is carried out for moving the mobile camera 3 to nine locations so that the one marker 61 is positioned (appears) at each reference point 305 or each reference point 306 (to be described later) on the captured image 30. In this case, if the object 60a does not fall within the first region S1, a portion of the object 60a fails to appear within the captured image 30, as in the object 60a illustrated by a broken line in
Specifically, first, the control unit 53 calculates A, B, C, and D illustrated below as information relating to the first region S1 (refer to
“A” represents a length from the center point of the first region S1 to an edge on the −xb side. “B” represents a length from the center point of the first region S1 to an edge on the +xb side. “C” represents a length from the center point of the first region S1 to an edge on the +yb side. “D” represents a length from the center point of the first region S1 to an edge on the −yb side.
Next, the control unit 53 uses the following equation (matrix) as the information of the object 60 in the image coordinate system so as to calculate respective image coordinates of E, F, G, H, and FP.
As illustrated in
The control unit 53 calculates A′, B′, C′, and D′ illustrated below as the information of the object 60 on the captured image 30, based on A to D, E to H, and FP.
A′=(O301_xb)−(G_xb)
B′=(F_xb)−(O301_xb)
C′=(H_yb)−(O301_yb)
D′=(O301_yb)−(E_yb)
“A′” represents the length in the xb-direction from the marker 61 to G in the object 60 on the captured image 30. “B′” represents the length in the xb-direction from the marker 61 to F in the object 60 on the captured image 30. “C′” represents the length in the yb-direction from the marker 61 to H in the object 60 on the captured image 30. “D′” represents the length in the yb-direction from the marker 61 to E in the object 60 on the captured image 30.
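The four equations above translate directly into code. The following sketch computes A′, B′, C′, and D′ from the image coordinates of the window center O301 (where the marker 61 is positioned) and the extremal points E, F, G, and H of the object 60; the function name is hypothetical.

```python
def object_extents(o301, e, f, g, h):
    """Compute A', B', C', D' of the object 60 on the captured image from the
    image coordinates (xb, yb) of the window center O301 and of the points
    E, F, G, H, exactly as in the equations above."""
    a_prime = o301[0] - g[0]   # A' = (O301_xb) - (G_xb)
    b_prime = f[0] - o301[0]   # B' = (F_xb)   - (O301_xb)
    c_prime = h[1] - o301[1]   # C' = (H_yb)   - (O301_yb)
    d_prime = o301[1] - e[1]   # D' = (O301_yb) - (E_yb)
    return a_prime, b_prime, c_prime, d_prime

# Hypothetical image coordinates of the window center and the object extremes.
print(object_extents(o301=(320, 240), e=(318, 180),
                     f=(400, 242), g=(250, 238), h=(322, 310)))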
Next, as illustrated in
The above-described “A” represents the length from the center point to the edge on the −xb side of the second region S2, which is obtained by equally dividing the second search window 302 into nine regions. “C” represents the length from the center point to the edge on the +yb side of the second region S2. “D” represents the length from the center point to the edge on the −yb side in the second region S2. In this manner, the second search window 302 can be set.
In this way, the control unit 53 calibrates the first search window 301, based on the information relating to the first region S1 obtained by dividing the first search window 301 set in the captured image 30 and the information of the object 60 having the marker 61 appearing in the captured image 30. In this manner, the control unit 53 sets the second search window 302, and sets the plurality of reference points 305, based on the second search window 302. In this manner, the image of the object 60 can be properly recognized when the calibration is performed. Accordingly, calibration can be more accurately performed between the image coordinate system and the distal coordinate system. According to the method of setting the second search window, the reference point 305 can be set by causing the control device 5 to automatically set the second search window.
Next, the control unit 53 drives the robot arm 10 so as to position the marker 61 at the center O302 of the second search window 302.
Next, similarly to Step S13, in a state where the marker 61 is positioned at the center O302 of the second search window 302, the control unit 53 moves the predetermined portion (in the embodiment, the axis coordinate P2) of the robot 1 to two locations different therefrom. The storage unit 55 stores each of the robot coordinates (xa and ya) and the image coordinates (xb and yb) in a state where the marker 61 is positioned at the center O302 and when the marker 61 is moved to the two locations different therefrom.
Next, the coordinate transformation equation (linear transformation matrix) is generated (reset) using the same method as in Step S14. That is, the coordinate transformation equation obtained in Step S14 is updated.
Here, the movement amount from the point A0 to the points A1 and A2 in Step S13 is very small. The movement amount is kept small because, at the time of Step S13, the relationship between the movement amount (size) in the image coordinate system and the movement amount (size) in the robot coordinate system (that is, how large the angle of view is in the robot coordinate system in units of mm) is unknown, and a small movement prevents the marker 61 from moving out of the captured image 30. As described above, the coordinate transformation equation obtained in Step S14 is calculated by moving the marker 61 over only a short distance. Consequently, in some cases, the movement amount is inaccurately calculated. In contrast, once the coordinate transformation equation has been generated, the relationship between the movement amount (size) in the image coordinate system and the movement amount (size) in the robot coordinate system can be roughly recognized. Therefore, in Step S155, since the coordinate transformation equation has already been generated through Steps S13 and S14, the marker 61 can be moved within the range (within the angle of view) in which the marker 61 appears in the captured image 30 so that the movement amount in Step S155 is larger than the movement amount in Step S13. In this manner, in Step S155, a more accurate coordinate transformation equation can be obtained compared to the coordinate transformation equation obtained in Step S14.
Next, in the same way as the generation of the reference point based on the first search window 301 described above, the control unit 53 sets (generates) the center point of each second region S2 of the second search window 302 as the plurality of reference points 305 arrayed in a lattice pattern to be used for the calibration (refer to
Next, the control unit 53 updates the coordinate transformation equation obtained in Step S14 in view of the rotating (driving) of the mobile camera 3 in response to the rotating (driving) of the second arm 102, and resets (calibrates) the nine reference points 305 (Step S16). That is, the coordinate transformation equation is updated so as to generate nine new reference points 306.
Here, as described above, the robot 1 has the first arm 101 rotating around the first axis J1 along the perpendicular direction, the second arm 102 rotating around the second axis J2 along the perpendicular direction, and the spline shaft 103 rotating around the third axis J3 along the perpendicular direction (refer to
For example, in a case where the mobile camera 3 is attached to the spline shaft 103 rotating around the third axis J3 located third from the base 110, without changing a posture of the mobile camera 3 as illustrated in
In contrast, as in the embodiment, in a case where the mobile camera 3 is attached to the second arm 102 rotating around the second axis J2 located second from the base 110, the mobile camera 3 rotates in response to the rotating of the second arm 102 as illustrated in
Therefore, in Step S16, the coordinate transformation equation is updated in view of the rotating of the mobile camera 3 in response to the rotating of the second arm 102. Specifically, the displacement (ΔxbM1 and ΔybM1) between the center O30 of the captured image 30 and the marker 61 on the captured image 30, and the corresponding displacement between the center O30 and each reference point 305, are calculated using the coordinate transformation equation obtained in Step S14 described above. Based on these displacements and Equation (2) below, the control unit 53 resets (calibrates) a new reference point 306 (refer to
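Equation (2) is not reproduced in this text. Since the mobile camera 3 rotates together with the second arm 102, Equation (2) presumably applies a planar rotation by the camera's rotation angle θ to the displacement (Δxb, Δyb) from the center O30 to each reference point 305, yielding the corrected displacement (Δxb′, Δyb′) that defines the reference point 306. The form below is an assumption; the exact expression depends on the accompanying figures.

$$
\begin{pmatrix} \Delta x_b' \\ \Delta y_b' \end{pmatrix}
=
\begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}
\begin{pmatrix} \Delta x_b \\ \Delta y_b \end{pmatrix}
\tag{2, presumed}
$$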
In this way, the new reference point 306 is reset (calibrated) in view of the rotating of the mobile camera 3 in response to the rotating (movement) of the second arm 102. In this manner, as in the embodiment, even in a case where the mobile camera 3 is attached to the second arm 102, the marker 61 can be properly positioned at the reference point 306. Accordingly, the calibration (Step S20 or Step S24) can be more accurately performed (to be described later).
Next, the control unit 53 calculates the offset of the mobile camera 3 (arm setting at the installation position of the mobile camera 3). In the embodiment, the control unit 53 obtains the misalignment of the position of the marker 61 appearing in the captured image 30 with the axis coordinate P2, that is, the distance between the axis coordinate P2 and the marker 61 appearing in the captured image 30 in the translation components xa and ya with respect to the two axes excluding the axis parallel to the third axis J3. More specifically, in the embodiment, the control unit 53 obtains the distance between the optical axis A3 of the mobile camera 3 and the second axis J2 in the horizontal plane (viewed in the direction extending along the second axis J2), and by how many degrees (°) the optical axis A3 of the mobile camera 3 is misaligned with the second arm 102 (the line segment connecting the second axis J2 and the third axis J3 to each other) around the second axis J2. Hereinafter, this will be described with reference to the flowchart illustrated in
First, in a state of a robot 1a (robot 1) illustrated in
Specifically, the control unit 53 changes the state of the robot 1a illustrated by the two-dot chain line in
Next, the control unit 53 rotates the first axis J1 as illustrated in
Specifically, the control unit 53 changes the state of the robot 1a illustrated by the two-dot chain line in
The line segment L21, which represents the distance between the first axis J1 and the optical axis A3 of the robot 1a, is obtained as follows. A length (mm) of a line segment L11 connecting the first axis J1 and the axis coordinate P2 of the robot 1a to each other, a length (mm) of a line segment L12 connecting the first axis J1 and the axis coordinate P2 of the robot 1c to each other, and a length (mm) of a line segment L13 connecting the position of the axis coordinate P2 of the robot 1a and the position of the axis coordinate P2 of the robot 1c to each other are respectively known. A length (pixel) of a line segment L23 connecting the position of the center O30 of the captured image 30 of the robot 1a and the position of the center O30 of the captured image 30 of the robot 1c to each other is also known. Therefore, the length (mm) of the line segment L23 in the distal coordinate system is obtained using the transformation coefficient (mm/pixel) obtained in Step S171. A triangle T1 configured to include the line segments L11, L12, and L13 and a triangle T2 configured to include the line segments L21, L22, and L23 are similar to each other. Owing to this similarity, it is possible to obtain the length (mm) of the line segment L21 in the distal coordinate system from the line segments L11, L12, and L13 and the line segment L23. Here, the line segment L21 connects the first axis J1 and the optical axis A3 of the robot 1a to each other, and the line segment L22 connects the first axis J1 and the optical axis A3 of the robot 1c to each other.
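In other words, because the triangles T1 and T2 are similar, corresponding sides are proportional. Once L23 has been converted into millimeters via the transformation coefficient, the unknown lengths follow as

$$
L_{21} = L_{11}\cdot\frac{L_{23}}{L_{13}}, \qquad L_{22} = L_{12}\cdot\frac{L_{23}}{L_{13}}.
$$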
The control unit 53 rotates the second axis J2 as illustrated in
Specifically, the control unit 53 changes the state of the robot 1a illustrated by the two-dot chain line in FIG. 21 to a state of a robot 1d (robot 1) illustrated by the solid line in
The line segment L25, which represents the distance between the second axis J2 and the optical axis A3, is obtained as follows. The length (mm) of the line segment L15 connecting the second axis J2 and the axis coordinate P2 of the robot 1a to each other, the length (mm) of the line segment L16 connecting the second axis J2 and the axis coordinate P2 of the robot 1d to each other, and the length (mm) of the line segment L14 connecting the position of the axis coordinate P2 of the robot 1a and the position of the axis coordinate P2 of the robot 1d to each other are respectively known. The length (pixel) of the line segment L24 connecting the position of the center O30 of the captured image 30 of the robot 1a and the position of the center O30 of the captured image 30 of the robot 1d to each other is also known. Therefore, the length (mm) of the line segment L24 in the distal coordinate system is obtained using the transformation coefficient (mm/pixel) obtained in Step S171. A triangle T3 configured to include the line segments L14, L15, and L16 and a triangle T4 configured to include the line segments L24, L25, and L26 are similar to each other. Owing to this similarity, it is possible to obtain the length (mm) of the line segment L25 in the distal coordinate system from the line segments L14, L15, and L16 and the line segment L24. Here, the line segment L25 connects the second axis J2 and the optical axis A3 of the robot 1a to each other, and the line segment L26 connects the second axis J2 and the optical axis A3 of the robot 1d to each other.
Next, the control unit 53 uses Equation (3) below so as to calculate an angle θ14 formed between the line segment L17 and the line segment L25 from the line segment L25 connecting the second axis J2 and the optical axis A3 of the robot 1a to each other, the line segment L17, and the line segment L21 (refer to
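Equation (3) is not reproduced in this text. Assuming that the line segment L17 connects the first axis J1 and the second axis J2 (its definition does not survive here), the segments L17, L25, and L21 form a triangle whose vertices are the first axis J1, the second axis J2, and the optical axis A3, so the angle θ14 between L17 and L25 presumably follows from the law of cosines:

$$
\theta_{14} = \arccos\frac{L_{17}^{2} + L_{25}^{2} - L_{21}^{2}}{2\,L_{17}\,L_{25}}.
\tag{3, presumed}
$$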
An angle θ16 formed between the line segment L15 and the line segment L25 is obtained from the obtained angle θ14 and an angle θ15 formed between the known line segments L15 and L17. In this way, it can be found that the mobile camera 3 is installed at a position misaligned by the angle θ16 (°) around the second axis J2 in the horizontal plane with respect to the second arm 102.
Through the above-described configuration, the offset of the mobile camera 3 can be automatically obtained by the control device 5.
In calculating the offset of the mobile camera 3 described above (Step S17), the control unit 53 drives the robot arm 10 serving as the “movable unit” so as to move the mobile camera 3 serving as the “imaging unit” to at least two locations (two locations in the embodiment) without changing the posture of the mobile camera 3. Based on the coordinates (image coordinates) in the image coordinate system serving as the “coordinate system of the imaging unit” at the at least two locations and the coordinates (robot coordinates) in the distal coordinate system serving as the “coordinate system of the robot” at the at least two locations, the control unit 53 calculates the transformation coefficient (mm/pixel) between the image coordinate system and the distal coordinate system (Step S171). The control unit 53 then calculates the offset of the mobile camera 3 with respect to the second arm 102 serving as the “arm” having the mobile camera 3 disposed therein. In this way, the mobile camera 3 is moved (particularly, translated) without changing the posture of the mobile camera 3. Through this configuration, the misalignment of the position of the mobile camera 3 with the robot arm 10 (the second arm 102 in the embodiment), that is, the offset is calculated. In this manner, the transformation coefficient can be relatively easily obtained, and the transformation coefficient is used so that the offset can be obtained easily in a short time.
Here, the term “translation” means linearly moving within a plane (excluding rotation and arc-shaped movement). “Changing the posture” of the mobile camera 3 (imaging unit) means changing at least one of the components ub, vb, and wb. “Changing the position” of the mobile camera 3 (imaging unit) means changing at least one of the components xb, yb, and zb. “Without changing the posture” of the mobile camera 3 (imaging unit) means without changing any of the components ub, vb, and wb.
In calculating the offset in Step S17, it is possible to use an optimized calculation method according to a third embodiment (to be described later) instead of Step S172 to Step S174.
Next, the control unit 53 drives the robot arm 10 so as to position the marker 61 at the center O30 of the captured image 30.
Next, the control unit 53 determines whether or not to perform the operation for changing the posture of the hand 150.
In a case of performing the operation for changing the posture of the hand 150, the process proceeds to Step S21.
The control unit 53 starts performing the operation for changing the posture of the hand 150. The operation for changing the posture of the hand 150 means an operation for changing the posture of the hand 150 so as to change the posture of the mobile camera 3 without causing the mobile camera 3 to change its imaging position. By performing this operation, the above-described offset can be updated so as to obtain a more accurate offset.
Change of the Posture of the Hand from a First Hand Posture to a Second Hand Posture while Positioning the Marker at the Center of the Captured Image
Here, in Step S18 described above, a state of the distal portion of the robot arm 10 when the marker 61 is positioned at the center O30 of the captured image 30 is indicated by the two-dot chain line in
First, the control unit 53 drives the robot arm 10 while positioning the marker 61 at the center O30 of the captured image 30, and changes the posture (first posture) of the hand 150 illustrated by the two-dot chain line in
Next, as illustrated in
In the embodiment, the offset of the mobile camera 3 is obtained by rotating the hand 150 and the mobile camera 3 around the perpendicular line passing through the marker 61 by 180°. However, without being limited to 180°, the angle may be optionally set. Even in a case where the angle is other than 180°, the distance between the axis coordinate P2 and the marker 61 in the horizontal plane, the rotation angle around the perpendicular line passing through the marker 61 from the right arm posture to the left arm posture, the robot coordinates of the axis coordinate P2 in the first posture, the robot coordinates of the axis coordinate P2 in the second posture, and the image coordinates of the marker 61 are used, and simultaneous equations are solved. In this manner, the robot coordinates of the marker 61 can be obtained. For example, the offset of the mobile camera 3 can be obtained by obtaining the distance between the robot coordinates of the axis coordinate P2 and the robot coordinates of the marker 61 in the first hand posture.
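For the 180° case described above, the computation reduces to a midpoint: the two positions of the axis coordinate P2 are symmetric about the perpendicular line passing through the marker 61, so the marker's robot coordinates lie halfway between them. A sketch with hypothetical coordinates:

```python
def marker_from_180_rotation(p2_first, p2_second):
    """Given the robot coordinates (xa, ya) of the axis coordinate P2 in the
    first hand posture and in the second hand posture (rotated 180 degrees
    about the perpendicular line through the marker 61), the marker lies at
    their midpoint; the offset is the vector from P2 to the marker."""
    marker = ((p2_first[0] + p2_second[0]) / 2.0,
              (p2_first[1] + p2_second[1]) / 2.0)
    offset = (marker[0] - p2_first[0], marker[1] - p2_first[1])
    return marker, offset

marker, offset = marker_from_180_rotation((300.0, 50.0), (340.0, 90.0))
print(marker, offset)  # marker at (320.0, 70.0), offset (20.0, 20.0) mm
```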
In this way, the control unit 53 drives the robot arm 10 serving as the “movable unit” so as to change the posture of the mobile camera 3 serving as the “imaging unit” without causing the mobile camera 3 to change the imaging position. Based on the coordinates (robot coordinates) in the distal coordinate system serving as the “coordinate system of the robot” before and after the posture of the mobile camera 3 is changed, the control unit 53 updates the offset. That is, a new offset is obtained, based on the offset obtained in Step S17 and the offset obtained by changing the posture in Step S22. In this manner, it is possible to obtain the more accurate offset of the mobile camera 3 (more specifically, the misalignment of the position of the marker 61 appearing in the captured image 30 with respect to the axis coordinate P2). The above-described change of the posture of the mobile camera 3 indicates changing the component ub (refer to
Here, for example, as described above, the mobile camera 3 is attached to the second arm 102 so that the optical axis A3 is parallel to the third axis J3 in design. However, in actual practice, the mobile camera 3 may be attached to the second arm 102 in a state where the optical axis A3 is inclined with respect to the third axis J3 due to an artificial attachment error of the mobile camera 3 (refer to
Next, the control unit 53 resets the hand 150 to adopt the first posture, and the process proceeds to Step S24.
The control unit 53 uses each position of the plurality of reference points 306 obtained in Step S16 and the offset obtained in Steps S21 and S22, drives the robot arm 10, and moves the axis coordinate P2 and the mobile camera 3 so as to position the marker 61 at each reference point 306. At this time, every time the mobile camera 3 moves, the marker 61 is imaged by the mobile camera 3 so as to acquire the image data.
For example, the control unit 53 drives the robot arm 10, and positions the mobile camera 3 so that the marker 61 is positioned at the first reference point 306 (refer to
Next, based on the coordinates (components xb, yb, and ub) of the marker 61 obtained from the first to ninth captured images and the corresponding robot coordinates (components xa, ya, and ua), the control unit 53 obtains calibration parameters (coordinate transformation matrix) for transforming the image coordinates into the robot coordinates. In this manner, the calibration between the image coordinate system and the distal coordinate system is completed. Since, as described above, the calibration between the distal coordinate system and the base coordinate system is already completed, the calibration can also be performed between the image coordinate system and the base coordinate system. In this way, the position (components xb and yb) and the posture (component ub) of the imaging target imaged by the mobile camera 3 can be transformed into the position (components xa and ya) and the posture (component ua) in the distal coordinate system. Therefore, the position (components xa and ya) of the marker 61 in the distal coordinate system can be obtained, based on the captured image 30. As a result, the hand 150 of the robot 1 can be positioned at the target location, based on the captured image 30.
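The text does not specify how the nine correspondences are combined into the calibration parameters. A common approach, shown here purely as an assumption, is a linear least-squares fit of an affine map for the position components; the orientation component (ub to ua) would be handled analogously from the marker's detected angle.

```python
import numpy as np

def fit_calibration(image_pts, robot_pts):
    """Fit an affine map (xa, ya) = M @ (xb, yb) + t from nine (or more)
    correspondences: the image coordinates of the marker 61 detected at each
    reference point 306 and the robot coordinates of the axis coordinate P2
    at the corresponding camera position. Solved by linear least squares."""
    img = np.asarray(image_pts, dtype=float)
    rob = np.asarray(robot_pts, dtype=float)
    A = np.hstack([img, np.ones((len(img), 1))])  # rows: [xb, yb, 1]
    params, *_ = np.linalg.lstsq(A, rob, rcond=None)
    M = params[:2].T                              # 2x2 linear part
    t = params[2]                                 # translation (mm)
    return M, t
```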
In this way, the control unit 53 positions the mobile camera 3 serving as the “imaging unit” at the first position. The control unit 53 performs the calibration, based on the first captured image 30a obtained by causing the mobile camera 3 to image the marker 61 and the second captured image 30b obtained by causing the mobile camera 3 to image the marker 61 by positioning the mobile camera 3 at the second position different from the first position (refer to
A Case where the Operation for Changing the Posture of the Hand is not Performed (Step S19: No)
In a case where the operation for changing the posture of the hand 150 is not performed, the process proceeds to Step S20.
The calibration of the image coordinate system and the distal coordinate system is performed using the coordinate transformation equation obtained in Step S16 and the offset obtained in Step S17 and using the same method as that in Step S24.
In this way, the calibration is completed.
As described above, the control device 5 controls the robot 1 having the robot arm 10 serving as the “movable unit” including the first arm 101, the second arm 102, and the spline shaft 103 which serve as a plurality of “arms”. The control device 5 includes the control unit 53 for performing the calibration between the coordinate system (image coordinate system) of the mobile camera 3, which serves as the “imaging unit” having the imaging function and is disposed in the second arm 102 (arm) different from the spline shaft 103 (arm) positioned on the most distal side of the robot arm 10 serving as the “movable unit”, and the coordinate system (distal coordinate system) of the robot 1. According to the control device 5, the calibration between the image coordinate system and the distal coordinate system can be performed as described above. Accordingly, based on the captured image 30 of the mobile camera 3, the robot 1 is enabled to carry out accurate work. According to the control device 5, the calibration can be performed for the mobile camera 3 disposed in the second arm 102 (arm) different from the spline shaft 103 (arm) positioned on the most distal side. Therefore, by using the control device 5, the mobile camera 3 can be disposed in the second arm 102 (arm) of the robot 1. As a result, for example, it is possible to minimize a possibility that a wire (not illustrated) of the mobile camera 3 pulled from the base 110 of the robot 1 may be degraded after being frequently bent due to the rotation of the spline shaft 103.
In particular, as described above, the mobile camera 3 is disposed in the second arm 102 rotating around the second axis J2, which is located second from the base 110 side among the first axis J1, the second axis J2, and the third axis J3 whose rotating directions (yaw in the embodiment) are the same as that around the optical axis A3. According to the control device 5 of the embodiment, the calibration can be more precisely performed on the mobile camera 3 disposed in this location. Without being limited to the mobile camera 3 disposed in the second arm 102, any configuration may be adopted as long as the control device 5 can perform the calibration between the image coordinate system of the mobile camera 3 disposed in an arm other than the spline shaft 103 serving as the arm positioned on the most distal side of the robot arm 10 and the distal coordinate system of the robot 1. That is, the control device 5 can perform the calibration relating to the image coordinate system of the mobile camera 3 disposed in either the first arm 101 or the second arm 102.
The “coordinate system of the robot” is regarded as the distal coordinate system in the embodiment. However, the coordinate system of the robot may be regarded as the base coordinate system of the robot 1, or may be regarded as the coordinate system of the predetermined portion of the robot 1 other than the distal coordinate system. The “coordinate system of the imaging unit” indicates the coordinate system of the captured image output from the imaging unit. The “calibration” in the embodiment is regarded as the calibration of the image coordinate system and the robot coordinate system (distal coordinate system or base coordinate system). However, the calibration may be regarded as obtaining the relative relationship between the image coordinate system and the robot coordinate system as performed in Step S14. For example, the relative relationship means transforming the distance between two points in the image coordinate system into the distance between two points in the distal coordinate system.
As described above, the control unit 53 performs the calibration, based on the captured image 30 (image data) obtained by causing the mobile camera 3 serving as the “imaging unit” to image the marker 61. In this manner, for example, it is unnecessary to perform touch-up with a calibration jig on the object 60, and the calibration can be performed in the non-contact manner. Therefore, artificial variations of touch-up work can be reduced. Since the calibration can be performed in the non-contact manner, the calibration can be more accurately performed regardless of the material of the object, for example.
Here, in the embodiment, the circular marker 61 attached to the object 60 is exemplified as the “marker”. However, any configuration may be adopted as long as the “marker” is disposed at a location which enables the mobile camera 3 to image it. For example, without being limited to the circular marker 61, the “marker” may be a figure other than a circle, or a letter. A characteristic portion disposed in the object 60, or the object 60 itself, may also be used. The object 60 used in the calibration may have any shape.
Here, as described above, the robot 1 has the base 110 which supports the robot arm 10 serving as the “movable unit”. The second arm 102 (arm) different from the spline shaft 103 (arm) positioned on the most distal side of the robot arm 10 is capable of rotating with respect to the base 110. The control unit 53 sets the plurality of reference points 305 to be used for the calibration, based on the captured image 30 (Step S15), calibrates the plurality of reference points 305 in view of the rotating of the mobile camera 3 serving as the “imaging unit”, and updates them as the plurality of reference points 306 (Step S16). Specifically, in Step S16, the coordinate transformation equation is updated in view of the rotating of the mobile camera 3, and the plurality of reference points 306 are reset. Therefore, in performing the calibration, even if the posture of the mobile camera 3 changes at the first to ninth positions due to the rotating of the mobile camera 3 in response to the rotating of the second arm 102, the more accurate calibration can be realized between the image coordinate system and the distal coordinate system. Therefore, the calibration can be more accurately performed between the image coordinate system of the mobile camera 3 disposed in the second arm 102 and the distal coordinate system.
The above-described robot 1 is controlled by the control device 5, and has the robot arm 10 serving as the “movable unit” including the first arm 101, the second arm 102, and the spline shaft 103 which serve as the plurality of “arms”. According to the robot 1, under the control of the control device 5, the operation relating to the calibration can be accurately performed.
Hitherto, the robot system 100 has been described. As described above, the robot system 100 includes the control device 5, the robot 1 controlled by the control device 5 and having the robot arm 10 serving as the “movable unit” including the first arm 101, the second arm 102, and the spline shaft 103 which serve as the plurality of “arms”, and the mobile camera 3 serving as the “imaging unit” having the imaging function. According to the robot system 100, the robot 1 can accurately perform the operation relating to the calibration under the control of the control device 5. The mobile camera 3 can be attached to the second arm 102. Accordingly, for example, it is possible to minimize a possibility that a wire (not illustrated) of the mobile camera 3 pulled from the proximal side of the robot 1 may be degraded after being frequently bent due to the rotation of the spline shaft 103.
Next, a second embodiment will be described.
The robot system according to the embodiment is mainly the same as that according to the above-described first embodiment except for the different configuration of the robot. In the following description, with regard to the second embodiment, points different from those of the above-described embodiment will be mainly described, and description of the same elements will be omitted.
The base 110 and the first arm 11 are connected to each other via the joint 171, and the first arm 11 can rotate around a first rotating axis O1 along the perpendicular direction with respect to the base 110. The first arm 11 and the second arm 12 are connected to each other via the joint 172, and the second arm 12 can rotate around a second rotating axis O2 along the horizontal direction with respect to the first arm 11. The second arm 12 and the third arm 13 are connected to each other via the joint 173, and the third arm 13 can rotate around a third rotating axis O3 along the horizontal direction with respect to the second arm 12. The third arm 13 and the fourth arm 14 are connected to each other via the joint 174, and the fourth arm 14 can rotate around a fourth rotating axis O4 orthogonal to the third rotating axis O3 with respect to the third arm 13. The fourth arm 14 and the fifth arm 15 are connected to each other via the joint 175, and the fifth arm 15 can rotate around a fifth rotating axis O5 orthogonal to the fourth rotating axis O4 with respect to the fourth arm 14. The fifth arm 15 and the sixth arm 16 are connected to each other via the joint 176, and the sixth arm 16 can rotate around a sixth rotating axis O6 orthogonal to the fifth rotating axis O5 with respect to the fifth arm 15.
The robot arm 10A serving as the "movable unit" of the robot 1A having this configuration includes the sixth arm 16 serving as the "distal arm" disposed closer to the distal side of the robot arm 10A than the fifth arm 15 serving as the "arm". That is, in the robot 1A, the mobile camera 3 is disposed in the fifth arm 15. Similarly to the first embodiment, the robot 1A having this configuration can also perform the calibration (various settings and processes for the calibration) under the control of the control device 5. In this manner, the fifth arm 15, rather than the sixth arm 16 positioned on the most distal side of the robot arm 10A, can be used as the "arm" in which the mobile camera 3 is disposed. Therefore, for example, it is possible to minimize a possibility that the wire (not illustrated) of the mobile camera 3 pulled from the proximal side of the robot 1A may be degraded after being frequently bent due to the rotation of the sixth arm 16.
Without being limited to the mobile camera 3 disposed in the fifth arm 15, the control device 5 in the embodiment may be capable of performing the calibration between the distal coordinate system of the robot 1A and the image coordinate system of the mobile camera 3 disposed in any of the arms 11 to 15 other than the sixth arm 16 positioned on the most distal side of the robot arm 10A. That is, the control device 5 can perform the calibration relating to the image coordinate system of the mobile camera 3 disposed in any one of the first arm 11, the second arm 12, the third arm 13, the fourth arm 14, and the fifth arm 15.
In the robot system 100A according to the embodiment, as described above, among the first rotating axis O1, the fourth rotating axis O4, and the sixth rotating axis O6, whose axis directions (yaw in the embodiment) are the same as that of the optical axis A3, the sixth rotating axis O6 is positioned third from the base 110. The mobile camera 3 is disposed in the fifth arm 15 (the immediately preceding arm), which is positioned closer to the base 110 side than the sixth arm 16 rotating around the sixth rotating axis O6. The control device 5 included in the robot system 100A according to the embodiment can perform the calibration particularly accurately on the mobile camera 3 disposed at this location.
In the embodiment, the work surface (upper surface) of the work table 91 on which the robot 1A carries out the work is parallel to the horizontal plane. However, the work surface may not be parallel to the horizontal plane, and may be inclined with respect to the horizontal plane. In this case, it is preferable to set in advance a virtual reference plane parallel to the work surface and defined based on the base coordinate system.
Next, a third embodiment will be described.
The robot system according to the embodiment is mainly the same as that according to the above-described second embodiment except for the different offset calculation of the mobile camera. In the following description, with regard to the third embodiment, points different from those in the above-described embodiments will be mainly described, and description of the same elements will be omitted.
In the embodiment, the offset of the mobile camera 3 with respect to the fifth arm 15 is obtained in the robot 1A according to the second embodiment. For example, in the calibration according to the second embodiment, the offset is obtained after the plurality of reference points are completely reset.
Hereinafter, description will be continued with reference to the flowchart.
First, the control unit 53 performs Steps S31 to S34, drives the robot arm 10A so as to change the position (components xb and yb) of the mobile camera 3 without changing the posture (components ub, vb, and wb) and the position (component zb) of the mobile camera 3, and obtains the transformation coefficient (mm/pixel: resolution), based on the image coordinates and the robot coordinates before and after the position is changed.
In advance, the control unit 53 drives the robot arm 10A so that the axis coordinate P5 is positioned at an optional position H1. That is, for example, the robot 1A is brought into the state indicated by the two-dot chain line.
Detecting a Position 61e of the Marker 61 on the Captured Image 30 at the Position H1 (Step S31)
First, the control unit 53 acquires image data from the mobile camera 3, and detects the position 61e (marker position) of the marker 61 on the captured image 30 at the position H1.
Translating the Mobile Camera 3 to a Position H2 Without Changing the Posture of the Mobile Camera 3 (Step S32)
Next, the control unit 53 drives the robot arm 10A so as to translate the axis coordinate P5 without changing the posture (components ub, vb, and wb) and the position (component zb) of the mobile camera 3. That is, for example, the robot 1A is translated from the state indicated by the two-dot chain line to the position H2.
Detecting the Position 61f of the Marker 61 on the Captured Image 30 at the Position H2 (Step S33)
Next, the control unit 53 acquires the image data from the mobile camera 3, and detects the position 61f (marker position) of the marker 61 on the captured image 30 at the position H2.
Next, the control unit 53 obtains the transformation coefficient (mm/pixel: resolution) of the image coordinates and the robot coordinates from the distance (mm) between the position H1 and the position H2, and from the distance (pixel) between the position 61e and the position 61f of the marker 61 on the captured image 30 (Step S34). In other words, the control unit 53 obtains the transformation coefficient from the movement distance (mm) of the axis coordinate P5 (mobile camera 3) before and after the above-described mobile camera 3 is moved between the two locations and the movement distance (pixel) of the marker 61 appearing on the captured image 30.
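For reference, the computation of Step S34 reduces to dividing the robot-coordinate travel by the image-coordinate travel. A minimal Python sketch follows; the function and variable names are illustrative and are not part of the embodiment:

    import math

    def transformation_coefficient(h1_mm, h2_mm, p1_px, p2_px):
        # Resolution (mm/pixel) from one translation of the camera: the
        # robot-coordinate travel of the axis coordinate P5 divided by the
        # marker's travel on the captured image.
        dist_mm = math.hypot(h2_mm[0] - h1_mm[0], h2_mm[1] - h1_mm[1])
        dist_px = math.hypot(p2_px[0] - p1_px[0], p2_px[1] - p1_px[1])
        return dist_mm / dist_px

    # e.g. a 10 mm translation that moves the marker 200 pixels gives 0.05 mm/pixel
    print(transformation_coefficient((0, 0), (10, 0), (120, 80), (320, 80)))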
Next, based on the information in Steps S31 to S34, the installation orientation (specifically, the component ua) of the mobile camera 3 is calculated (Step S35).
Specifically, the installation orientation θ10 of the mobile camera 3 is calculated using Equation (4) below.
θ10 = Ru − α + β − 180 (4)
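A plausible reading of Equation (4), stated as an assumption because the defining passage does not survive in this text: Ru is the u-component of the posture of the axis coordinate P5, α is the direction (in the base coordinate system) in which the axis coordinate P5 was translated, and β is the direction (in the image coordinate system) in which the marker 61 moved in response; θ10 then corresponds to the installation orientation later used as Tu. A minimal Python sketch, with the result wrapped into (−180, 180]:

    def installation_orientation(Ru, alpha, beta):
        # Equation (4): theta10 = Ru - alpha + beta - 180, wrapped into (-180, 180];
        # the interpretation of Ru, alpha, and beta is assumed, not quoted from the source
        theta10 = Ru - alpha + beta - 180.0
        while theta10 <= -180.0:
            theta10 += 360.0
        while theta10 > 180.0:
            theta10 -= 360.0
        return theta10

    print(installation_orientation(90.0, 30.0, 45.0))  # -> -75.0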
Next, the control unit 53 sets the posture (specifically, the component ua of the optical axis A3 (installation axis of the mobile camera 3)) of the mobile camera 3 at the N optional positions in the virtual plane (Step S36). For example, the virtual plane is orthogonal to the optical axis A3, and a coordinate system (local coordinate system) defined based on the base coordinate system is set on the virtual plane. In the embodiment, the origin of the virtual plane is set to the axis coordinate P5.
Moving the Robot Arm 10A to the n-th Imaging Posture (Step S37)
Next, the control unit 53 drives the robot arm 10A, and moves the robot arm 10A to the first imaging posture (the n-th imaging posture) among the N imaging postures. The numerical value of N may be 3 or more and is optionally set; however, as the number increases, the accuracy of the offset calculation is further improved. In the embodiment, for example, N is set to 10.
Imaging the Marker 61 at the n-th Imaging Posture, and Detecting the Position of the Marker 61 (Step S38)
Next, based on the first image (n-th image) serving as the captured image 30 captured at the first imaging posture (n-th imaging posture), the control unit 53 detects the position (position n) of the marker 61 at the first imaging posture.
Storing the Position of the Marker 61 and the Posture (3) of the Axis Coordinate P5 (Step S39)
Next, the storage unit 55 stores the image coordinates (xb, yb, and ub) at the position of the marker 61 on the first image (n-th image) and the robot coordinates (xa, ya, and ua) at the first imaging posture (n-th imaging posture).
Next, the control unit 53 determines whether or not the positions (positions n) of the marker 61 have been detected and stored for all of the N imaging postures (Step S40). In the embodiment, since N is set to 10, the determination is whether n has reached 10. In a case where the N positions have not yet been detected, the process returns to Step S37. In a case where the N positions have been detected, the process proceeds to Step S41. In the embodiment, Steps S37 to S40 are repeated until n reaches 10. Therefore, next, the position of the marker 61 at the second imaging posture is detected, based on the second image serving as the captured image 30 captured at the second imaging posture. If n reaches 10, that is, if the first to tenth images have been acquired, the process proceeds to Step S41.
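For reference, Steps S37 to S40 amount to a simple acquisition loop. In the following Python sketch the robot and camera interfaces are simulated with stub functions; every name is illustrative and none of them comes from the embodiment:

    import random

    def move_to_posture(posture):
        # stand-in for driving the robot arm 10A to an imaging posture
        pass

    def detect_marker(posture):
        # stand-in for imaging the marker 61 and detecting its image position
        return (random.uniform(0, 640), random.uniform(0, 480))

    N = 10  # number of imaging postures (3 or more)
    imaging_postures = [{"ua": 36.0 * n} for n in range(N)]  # set in Step S36
    samples = []
    for n in range(N):                                   # Step S40 loops until n = N
        move_to_posture(imaging_postures[n])             # Step S37
        marker_px = detect_marker(imaging_postures[n])   # Step S38
        samples.append((marker_px, imaging_postures[n])) # Step S39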
Calculating the Offset of the Mobile Camera 3 from the Transformation Coefficient (1), the Installation Orientation (2), and the Posture (3) of the Optical Axis A3 of the Mobile Camera 3 by Using an Optimized Calculation Method (Step S41)
Hereinafter, a method for obtaining the offset (positional relationship between the axis coordinate P5 and the marker 61 appearing in the mobile camera 3) of the mobile camera 3 by using the optimized calculation method will be described. In the embodiment, although the solution is calculated using the optimized calculation method, the solution may be analytically obtained.
In the optimized calculation method described below, the light receiving surface 311 of the mobile camera 3 is positioned so as to be parallel to the work surface (for example, the upper surface of the work table 91) on which the robot 1A carries out the work.
First, if the position of the marker 61 in the distal coordinate system and the axis coordinate P5 of the robot 1A having the mobile camera 3 installed therein are defined as follows, the offset of the mobile camera 3 in the distal coordinate system can be calculated as follows.
J5 represents the axis coordinate P5. CAM represents the optical axis A3 (position of the marker 61 when the marker 61 is positioned at the center O30 of the captured image 30) of the mobile camera 3. Tx, Ty, and Tu are unknown variables.
Similarly, if the position and the posture of the axis coordinate P5 in the base coordinate system are defined as follows, the coordinates of the fifth rotating axis O5 can be calculated as follows.
BASE represents the center point (origin) of the upper end surface of the base 110. Rx, Ry, and Ru are unknown variables.
A position (Mx and My) of the marker 61 in the base coordinate system is defined as follows.
In addition to the above-described definition, if the transformation coefficient (resolution) is set as Fx and Fy (pixel/mm) and the center O30 of the captured image 30 of the mobile camera 3 is set as Cx and Cy (pixel), the relationship between an image coordinate P′ (ub and vb) of the marker 61 and the position (Mx and My) of the marker 61 can be expressed using the following relational expression.
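The relational expression referenced above does not survive in this text. One self-consistent reconstruction, offered as an assumption rather than a quotation of the original: if the offset (Tx, Ty, Tu) is expressed in the frame of the axis coordinate P5, whose pose in the base coordinate system at a given posture is (Rx, Ry, Ru), then the camera center (cx, cy) and orientation φ in the base coordinate system are

c_x = R_x + T_x\cos R_u - T_y\sin R_u, \qquad c_y = R_y + T_x\sin R_u + T_y\cos R_u, \qquad \phi = R_u + T_u,

and the image coordinates of the marker 61 follow as

u_b = C_x + F_x\left[(M_x - c_x)\cos\phi + (M_y - c_y)\sin\phi\right], \qquad v_b = C_y + F_y\left[-(M_x - c_x)\sin\phi + (M_y - c_y)\cos\phi\right].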
In the above equation, after the transformation coefficients Fx and Fy are measured in advance, Tx, Ty, Tu, Mx, and My are set as the unknown variables, and the optimized calculation method is used. The unknown variables Tx, Ty, Tu, Mx, and My are calculated by setting the position of the marker 61 imaged at each of the N postures Rn of the robot 1A as Pn and by minimizing the error evaluating function E below.
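The error evaluating function E is likewise missing here. Given the description (the marker position observed at the N postures Rn is Pn, and P'n is the position predicted by the relational expression), its natural form is presumably

E(T_x, T_y, T_u, M_x, M_y) = \sum_{n=1}^{N}\left\lVert P'_n - P_n \right\rVert^{2}.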
Minimizing the error evaluating function E is performed using the multivariate Newton method according to Procedures 1 to 5 below.
Procedure 1: An initial value X0 is determined as follows.
X0 = (Tx0, Ty0, Tu0, Mx0, My0)^T
For example, the initial value is set using the following values: Tx = 0 and Ty = 0; as Tu, the installation orientation of the mobile camera 3 obtained in Step S35 is used; and Mx = 0 and My = 0. As Fx and Fy, the transformation coefficient obtained in Step S34 is used.
Procedure 2: A gradient ∇E and a Hessian matrix H at the current value Xn are calculated.
Procedure 3: A solution ΔXn of the following simultaneous equations is calculated.
H(Xn)ΔXn = −∇E(Xn)
Procedure 4: The value Xn is updated.
Xn+1 = Xn + ΔXn
Procedure 5: If ΔXn < δ is satisfied, Xn is returned. In other cases, Procedure 2 to Procedure 5 are repeated.
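For reference, the following is a compact numerical sketch in Python of Procedures 1 to 5. The camera model follows the reconstruction given after the definitions above (itself an assumption, since the original expressions do not survive here), the postures and true offset are synthetic, and the gradient and the Hessian are approximated by finite differences; every name in the sketch is illustrative.

    import numpy as np

    Fx, Fy = 20.0, 20.0            # transformation coefficient (pixel/mm), Step S34
    Cx, Cy = 320.0, 240.0          # center O30 of the captured image 30 (pixel)
    # N = 10 synthetic postures (Rx, Ry, Ru) of the axis coordinate P5
    postures = [(50 + 5 * np.cos(t), 30 + 5 * np.sin(t), np.degrees(t))
                for t in np.linspace(0.0, 2.0 * np.pi, 10, endpoint=False)]

    def predict(X, pose):
        # image coordinates of the marker 61 under the assumed relational expression
        Tx, Ty, Tu, Mx, My = X
        Rx, Ry, Ru = pose
        ru = np.radians(Ru)
        cx = Rx + Tx * np.cos(ru) - Ty * np.sin(ru)   # camera center in base coords
        cy = Ry + Tx * np.sin(ru) + Ty * np.cos(ru)
        ph = ru + np.radians(Tu)                      # camera orientation
        dx, dy = Mx - cx, My - cy
        return np.array([Cx + Fx * (dx * np.cos(ph) + dy * np.sin(ph)),
                         Cy + Fy * (-dx * np.sin(ph) + dy * np.cos(ph))])

    true_X = np.array([3.0, -2.0, 15.0, 60.0, 25.0])  # Tx, Ty, Tu, Mx, My
    observed = [predict(true_X, p) for p in postures] # marker positions Pn

    def E(X):
        # error evaluating function: sum of squared prediction errors
        return sum(np.sum((predict(X, p) - o) ** 2)
                   for p, o in zip(postures, observed))

    def grad_hess(X, h=1e-4):
        # finite-difference gradient and Hessian of E at X
        n = len(X)
        g, H = np.zeros(n), np.zeros((n, n))
        for i in range(n):
            e = np.zeros(n); e[i] = h
            g[i] = (E(X + e) - E(X - e)) / (2 * h)
        for i in range(n):
            for j in range(n):
                ei = np.zeros(n); ei[i] = h
                ej = np.zeros(n); ej[j] = h
                H[i, j] = (E(X + ei + ej) - E(X + ei - ej)
                           - E(X - ei + ej) + E(X - ei - ej)) / (4 * h * h)
        return g, H

    X = np.array([0.0, 0.0, 14.0, 0.0, 0.0])  # Procedure 1 (Tu from Step S35)
    for _ in range(50):
        g, H = grad_hess(X)                   # Procedure 2
        dX = np.linalg.solve(H, -g)           # Procedure 3
        X = X + dX                            # Procedure 4
        if np.linalg.norm(dX) < 1e-6:         # Procedure 5
            break
    print(np.round(X, 3))                     # ~ [3., -2., 15., 60., 25.]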
Through the above-described procedures, the unknown variables Tx, Ty, Tu, Mx, and My are obtained, and the offset of the mobile camera 3 can be obtained. The calculation of the offset of the mobile camera 3 using this optimized calculation method can be used instead of Step S17 (specifically, Steps S172 to S174) in the first embodiment. In this case, the axis coordinate P5 may be replaced with the axis coordinate P2.
In the above-described optimized calculation method, in a case where a value of the angle θ10 described above is used as the unknown variable Tu, only the four variables Tx, Ty, Mx, and My are calculated. Therefore, in this case, a numerical value of N in Step S37 may be 2 or more.
As described above, the control device 5 controls the robot 1A having the robot arm 10A serving as the "movable unit" including the fifth arm 15 serving as the "arm" provided with the mobile camera 3 serving as the "imaging unit". The control device 5 includes the control unit 53 that obtains the posture (component ua) of the mobile camera 3 serving as the "imaging unit" by translating the fifth arm 15 serving as the "arm" (Steps S31 to S35). According to the control device 5, the fifth arm 15 is translated, and in this manner, the posture (component ua) of the mobile camera 3 with respect to the fifth arm 15 can be obtained. Therefore, the fifth arm 15, rather than the sixth arm 16 serving as the "distal arm" positioned on the most distal side of the robot arm 10A, can be used as the "arm" in which the mobile camera 3 is disposed. As a result, for example, it is possible to minimize a possibility that the wire (not illustrated) of the mobile camera 3 pulled from the base 110 may be degraded after being frequently bent due to the rotation of the sixth arm 16. Here, the term "translation" means linear movement within a plane (excluding rotation and arc-shaped movement).
For example, in applications of a relatively small rotating amount of the sixth arm 16, the “arm” provided with the mobile camera 3 serving as the “imaging unit” may be the sixth arm 16. Even in this case, the sixth arm 16 is translated. In this manner, it is possible to minimize a possibility that the wire may be degraded after being frequently bent due to the rotation of the sixth arm 16.
As described above, the control unit 53 obtains the posture (component ua) of the mobile camera 3 serving as the "imaging unit", based on the direction of translating the fifth arm 15 serving as the "arm" (in the embodiment, the direction of translating the axis coordinate P5) and the movement direction in the coordinate system (image coordinate system) of the mobile camera 3 serving as the "imaging unit" in response to the translation of the fifth arm 15 serving as the "arm". Specifically, as described above, the posture of the mobile camera 3 is obtained in Step S35. In this manner, the posture of the mobile camera 3 with respect to the fifth arm 15 can be obtained without excessively rotating the robot arm 10A.
In particular, as described above, the control unit 53 obtains the offset of the mobile camera 3 with respect to the fifth arm 15 serving as the "arm", based on the marker 61 (marker recognition position) imaged by the mobile camera 3 serving as the "imaging unit". In this manner, for example, it is unnecessary to measure the offset by using a measure (measuring instrument). Therefore, artificial variations in such measurement work can be minimized. The offset can be calculated in a non-contact manner. Accordingly, it is possible to calculate the offset more accurately regardless of the material of the object 60, for example.
Specifically, as described above, the control unit 53 obtains the offset, based on the first image (captured image 30) in which the marker 61 is imaged by positioning the mobile camera 3 serving as the “imaging unit” at the first imaging posture and the second image (captured image 30) in which the marker 61 is imaged by positioning the mobile camera 3 at the second imaging posture. In the embodiment, as described above, the offset is obtained, based on the first to tenth images. In this manner, the offset can be calculated in a non-contact manner, and the offset can be more accurately calculated using a relatively simple method.
As described above, the robot 1A is controlled by the control device 5, and has the robot arm 10A serving as the "movable unit" including the fifth arm 15 serving as the "arm" provided with the mobile camera 3 serving as the "imaging unit". According to this robot 1A, as described above, it is possible to accurately perform the operation relating to the calculation of the offset.
The robot system 100A according to the embodiment as described above includes the control device 5, the mobile camera 3 serving as the "imaging unit", and the robot 1A controlled by the control device 5 and having the robot arm 10A serving as the "movable unit" including the fifth arm 15 serving as the "arm" provided with the mobile camera 3. According to this robot system 100A, under the control of the control device 5, the robot 1A can accurately perform the operation relating to the calculation of the offset. For example, it is possible to minimize a possibility that the wire (not illustrated) of the mobile camera 3 pulled from the base 110 of the robot 1A may be degraded after being frequently bent due to the rotation of the sixth arm 16.
In a case where the sixth arm 16 is not excessively rotated, the installation location of the mobile camera 3 may be the distal arm (sixth arm 16). The same applies to the robot 1 in the first embodiment. The "arm" is not limited to the fifth arm 15, and may be any arm provided with the "imaging unit".
Next, a fourth embodiment will be described.
The robot system according to the embodiment is mainly the same as that according to the above-described third embodiment except for the different configuration of the hand and additional processing performed by the control unit. In the following description, with regard to the fourth embodiment, points different from those in the above-described embodiments will be mainly described, and description of the same elements will be omitted.
In the embodiment, the robot 1A includes a hand 150A. For example, when the hand 150A is translated, the rotation component of the fifth arm 15 is applied to the movement of the mobile camera 3. Therefore, for example, if an attempt is made to move the mobile camera 3 to the target location by performing so-called jog feeding with reference to the distal end P150 of the hand 150A, the mobile camera 3 may deviate from the target location.
Therefore, in the embodiment, the control unit 53 controls the robot arm 10A so that the mobile camera 3 can be properly moved to the target location (for example, the spot 616), that is, so that the target location appears at the center O30 of the captured image 30. Hereinafter, description will be continued with reference to the flowchart.
First, the control unit 53 assumes that the mobile camera 3 is installed in the sixth arm 16, and calculates a joint angle θ6 of the sixth rotating axis O6 when the mobile camera 3 adopts a target posture (xr, yr, zr, ur, vr, and wr) (Step S51). Here, in the embodiment, for example, the target posture is the position/posture of the mobile camera 3 at which the captured image 30c (captured image 30) is obtained.
Next, the storage unit 55 records the joint angle θ6 obtained in Step S51 as an initial value (Step S52).
Next, the control unit 53 calculates again the joint angle θ6 by adding a predetermined angle (for example, 1°) to the target posture (ur), and calculates a displacement Δθ6 of the joint angle θ6 with respect to the posture (ur) (Step S53).
Next, the control unit 53 changes the target posture (ur) so that the joint angle θ6 becomes 0 (zero), and calculates the joint angles θ1 to θ6 (Step S54). Specifically, the target posture (ur) in Step S51 is set as "ura", the joint angle θ6 obtained in Step S51 is set as "θ6A", and the target posture (ur) to be newly obtained in Step S54 is set as "urb". The target posture (urb) is obtained using ura, θ6A, Δθ6 obtained in Step S53, and Equation (5) below. The control unit 53 calculates the joint angles θ1 to θ6 of the first rotating axis O1 to the sixth rotating axis O6 at the new target posture (urb).
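Equation (5) itself does not survive in this text. Since Δθ6 obtained in Step S53 is the displacement of θ6 per predetermined angle of the posture (ur), the update is by all appearances a Newton step that drives θ6 toward zero; presumably

u_{rb} = u_{ra} - \dfrac{\theta_{6A}}{\Delta\theta_{6}}.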
Next, the control unit 53 determines whether or not the joint angle θ6 obtained in Step S54 falls within a predetermined threshold range (Step S55). In a case where the joint angle θ6 does not fall within the predetermined threshold range, the process returns to Step S53. On the other hand, in a case where the joint angle θ6 falls within the predetermined threshold range, the process proceeds to Step S56. Here, the predetermined threshold range is a preset value. For example, the joint angle θ6 is preferably 0°. Accordingly, as a value within the threshold range, it is preferable that the value falls within ±10°, and more preferably within ±0.1°. For example, in a case where the joint angle θ6 is 0°, the arms are in the state of the fifth arm 15d (fifth arm 15) and the sixth arm 16d (sixth arm 16).
In a case where convergence cannot be obtained in Step S55 even if Steps S53 to S55 are repeatedly performed multiple times, Steps S53 to S55 may be repeatedly performed while changing the initial value in optional angle increments (for example, 30°) until the convergence is obtained.
Next, if it is determined that the joint angle θ6 falls within the predetermined threshold range, the control unit 53 sets the joint angles θ1 to θ5 calculated in Step S54 and the initial value of the joint angle θ6 stored in Step S52 as the target posture, and outputs the target posture to the robot 1A (Step S56).
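For reference, Steps S51 to S56 amount to a one-dimensional Newton iteration on the target posture (ur). In the following Python sketch the inverse kinematics of the robot 1A is replaced by an assumed stand-in function, and all names are illustrative:

    def ik_theta6(ur):
        # stand-in for the inverse kinematics: the joint angle theta6 (deg) obtained
        # for a target posture whose u-component is ur (deg); assumed smooth
        return 0.8 * (ur - 25.0)

    ur = 10.0                                    # initial target posture (ur)
    theta6 = ik_theta6(ur)                       # Step S51
    theta6_init = theta6                         # Step S52: record the initial value
    for _ in range(100):
        if abs(theta6) <= 0.1:                   # Step S55: threshold range (e.g. 0.1 deg)
            break
        d_theta6 = ik_theta6(ur + 1.0) - theta6  # Step S53: displacement per 1 deg of ur
        ur = ur - theta6 / d_theta6              # Step S54: Equation (5), as assumed above
        theta6 = ik_theta6(ur)
    # Step S56: theta1 to theta5 at the converged ur, together with the initial
    # theta6 recorded in Step S52, are output to the robot 1A as the target posture
    print(round(ur, 3), round(theta6, 6), round(theta6_init, 3))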
According to the above-described configuration, the process for positioning the mobile camera 3 at the target location is completed. Here, as described above, the robot 1A has the base 110 which supports the robot arm 10A serving as the "movable unit". The fifth arm 15 serving as the "arm" is capable of rotating with respect to the base 110, and the sixth arm 16 serving as the "distal arm" is capable of rotating with respect to the fifth arm 15. As described above, in view of the rotating of the mobile camera 3 serving as the "imaging unit" in response to the rotating of the fifth arm 15, the control unit 53 performs the process for moving the mobile camera 3 (in the embodiment, the center O30 of the captured image 30 of the mobile camera 3) to a designated position.
In a case where the hand 150A is positioned within the field of view of the mobile camera 3, the hand 150A appears in the captured image 30 and can hinder the detection of the marker 61.
Therefore, the control unit 53 sets a distal axis angle (joint angle of the hand 150A) of the hand 150A so that the hand 150A is not positioned within the field of view of the mobile camera 3.
Specifically, for example, the storage unit 55 stores information relating to the distal axis angle of the hand 150A in a state where the hand 150A is not positioned within the field of view of the mobile camera 3, and the control unit 53 controls the movement of the mobile camera 3 (fifth arm 15), based on this information.
For example, in a case where the hand 150A is not positioned in the field of view of the mobile camera 3 before and after the mobile camera 3 is moved, the control unit 53 causes the storage unit 55 to store the distal axis angle of the hand 150A at that position. Even after the mobile camera 3 is moved, the control unit 53 maintains the distal axis angle of the hand 150A before the mobile camera 3 is moved.
For example, the control unit 53 sets in advance the range of the distal axis angle of the hand 150A in a state where the hand 150A is not positioned within the field of view of the mobile camera 3. Based on the range of the distal axis angle of the hand 150A and the target posture of the mobile camera 3, the control unit 53 sets the distal axis angle (component ur of the distal axis posture) nearest to the target distal axis angle within a range where both of these do not overlap each other.
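For reference, the nearest-angle selection described above can be sketched as follows in Python; representing the field-of-view interference range as a single (low, high) interval is an assumption, and all names are illustrative:

    def nearest_allowed_angle(target_deg, forbidden):
        # forbidden: (low, high) interval of distal axis angles at which the hand
        # 150A would enter the field of view of the mobile camera 3
        low, high = forbidden
        if not (low < target_deg < high):
            return target_deg            # target already keeps the hand out of view
        # otherwise snap to the nearer boundary of the allowed region
        return low if (target_deg - low) <= (high - target_deg) else high

    print(nearest_allowed_angle(40.0, (30.0, 120.0)))   # -> 30.0
    print(nearest_allowed_angle(-15.0, (30.0, 120.0)))  # -> -15.0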
In this way, the control unit 53 controls the robot arm 10A so that the robot arm 10A serving as the "movable unit" does not appear in the captured image 30 captured by the mobile camera 3 serving as the "imaging unit". Even if the marker 61 is imaged by the mobile camera 3 installed in the fifth arm 15, it is thereby possible to prevent the robot arm 10A from appearing in the captured image 30.
In particular, as described above, the mobile camera 3 serving as the "imaging unit" can image the distal side of the robot arm 10A serving as the "movable unit". The control unit 53 controls the robot arm 10A so that the distal portion (for example, the hand 150A) of the robot arm 10A does not appear in the captured image 30. Even if the marker 61 is imaged by the mobile camera 3 installed in the fifth arm 15, it is thereby possible to prevent the hand 150A from appearing in the captured image 30. Therefore, the offset calculation and the calibration can be performed more accurately using the captured image 30.
In the robot 1 according to the first embodiment, the control unit 53 also controls the robot arm 10 so that the robot arm 10 does not appear in the captured image 30.
Hitherto, the control device, the robot, and the robot system according to the invention have been described with reference to the illustrated embodiments. However, the invention is not limited thereto. The configuration of each unit can be replaced with any optional configuration having the same function. Any other configuration may be added to the invention. The respective embodiments may be appropriately combined with each other.
The number of robot arms is not particularly limited, and may be two or more. The number of rotating axes of the robot arm is not particularly limited, and may be optionally determined.
The entire disclosure of Japanese Patent Application No. 2016-239871, filed Dec. 9, 2016 is expressly incorporated by reference herein.