The present application claims priority under 35 U.S.C. §119 to Japanese Patent Application No. 2011-203258 filed on Sep. 16, 2011. The contents of this application are incorporated herein by reference in their entirety.
1. Field of the Invention
An embodiment disclosed herein relates to a robot system.
2. Description of the Related Art
Japanese Application Publication No. 2008-87074 discloses a robot system in which a robot automatically performs a task of picking up workpieces from a container holding the workpieces in disorder (in bulk). In this robot system, the three-dimensional shape of the bulk workpieces is measured, and the robot picks up from the container a grip target workpiece selected based on the measured three-dimensional shape.
In accordance with an aspect of the embodiments, there is provided a robot system including: a projecting unit for projecting a slit light on a specified placement region loaded with a workpiece and for moving the slit light in a specified moving direction; an imaging unit for imaging, a number of times, the slit light moving on the workpiece loaded on the placement region; an estimated projection region determining unit for determining an estimated projection region such that the estimated projection region extends across an image taken by the imaging unit in an intersection direction intersecting the moving direction and such that the length of the estimated projection region in a direction substantially parallel to the moving direction grows larger toward the center of the image in the intersection direction; a projection position detecting unit for detecting a projection position of the slit light within the estimated projection region; and a robot for gripping the workpiece based on the projection position detected by the projection position detecting unit.
The objects and features of the present invention will become apparent from the following description of embodiments, given in conjunction with the accompanying drawings, in which:
An embodiment of a robot system disclosed herein will now be described in detail with reference to the accompanying drawings which form a part hereof. The present disclosure is not limited to the embodiment to be described below.
As shown in
The projecting unit 4 performs the projection and movement of the slit light 40 under the control of the control device 6. The configuration of the projecting unit 4 will be described later with reference to
The control device 6 is a device for generally controlling the overall operations of the robot system 1. The control device 6 outputs a slit light projecting command to the projecting unit 4 and outputs an imaging command to the imaging unit 5. Based on the images acquired from the imaging unit 5, the control device 6 measures the three-dimensional shape of the workpieces 3 and determines the positions and postures of the workpieces 3.
Then, the control device 6 selects a grip target workpiece 3 among the workpieces 3 whose positions and postures are determined. The control device 6 outputs to the robot 7 a task command for causing the robot 7 to perform a task of picking up the grip target workpiece 3 from the placement region 2. The configuration of the control device 6 will be described later with reference to
Next, the configuration of the control device 6 will be described with reference to
The control unit 8 is an operation processing unit including a CPU (Central Processing Unit), a memory and the like. The control unit 8 includes a projection control unit 81, an imaging control unit 82, an estimated projection region determining unit 83, a projection position detecting unit 84, a three-dimensional shape measuring unit 85, a task commanding unit 86 and the like.
The storage unit 9 is an information storing device such as a hard disk drive or a flash memory. The storage unit 9 stores image information 91 and estimated projection region information 92.
The projection control unit 81 is a processing unit for outputting a specific command to the projecting unit 4 and controlling the operation of the projecting unit 4 when performing a pre-scan and a scan to be described later. The term “pre-scan” used herein refers to a process by which the slit light 40 is projected on the placement region 2 not loaded with the workpieces 3 thereon and is moved, e.g., from the left end of the placement region 2 to the right end thereof. The term “scan” used herein refers to a process by which the slit light 40 is projected on the placement region 2 loaded with the workpieces 3 and is moved in a specific moving direction.
The imaging control unit 82 is a processing unit for outputting a specific command to the imaging unit 5 to control the operation of the imaging unit 5 when the projection of the slit light 40 is performed by the projecting unit 4. Responsive to the command inputted from the imaging control unit 82, the imaging unit 5 sequentially images the slit light 40 moving on the placement region 2 and the workpieces 3 placed on the placement region 2 a number of times. The imaging control unit 82 sequentially acquires image information 91 on the respective images taken by the imaging unit 5 and stores the image information 91 in the storage unit 9.
The estimated projection region determining unit 83 is a processing unit for determining an estimated projection region to be projected with the slit light 40 during the scan. The estimated projection region determining unit 83 acquires image information 91 corresponding to the images taken during the pre-scan from the image information 91 stored in the storage unit 9 and determines an estimated projection region based on the image information 91 thus acquired.
The estimated projection region determining unit 83 determines the estimated projection region such that the estimated projection region extends across the image in the intersection direction intersecting (e.g., intersecting by a right angle) the moving direction of the slit light 40 and such that the length of the estimated projection region in the direction parallel to the moving direction of the slit light 40 grows larger toward the center of the image in the intersection direction. The order of determining the estimated projection region will be described later with reference to
Further, the term “parallel” in the present embodiment is not used in a mathematically strict sense and may include an error in practice, such as a tolerance, an installation error or the like.
The estimated projection region determining unit 83 has the storage unit 9 store the estimated projection region information 92 indicative of the determined estimated projection region. At this time, the estimated projection region determining unit 83 may cause the storage unit 9 to store the estimated projection region information 92 corresponding to a plurality of images sequentially taken by the imaging unit 5 while the scan is performed once.
The projection position detecting unit 84 is a processing unit for detecting the projection position of the slit light 40 in each of the images taken by the imaging unit 5. The projection position detecting unit 84 acquires the image information 91, which corresponds to the image taken during the scan, and the estimated projection region information 92 from the storage unit 9.
Based on the image information 91 and the estimated projection region information 92, the projection position detecting unit 84 sequentially reads the pixels included in the estimated projection region of each image in the direction parallel to the moving direction of the slit light 40. Then, the projection position detecting unit 84 detects the pixels having the brightness equal to or higher than a specific threshold value or the pixels having the highest brightness values in respective lines in a direction parallel to the moving direction of the slit light 40 as the projection position of the slit light 40 in the image and outputs the detection result to the three-dimensional shape measuring unit 85.
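The per-line reading described above can be sketched as follows. This is a minimal illustration, not the actual implementation: the image is assumed to be a 2D array of brightness values, and the estimated projection region is assumed to be given as an X-range per line; all names are hypothetical.

```python
# Illustrative sketch of per-line projection-position detection: within the
# estimated projection region of each line, take either the brightest pixel
# at or above a threshold, or simply the brightest pixel on that line.

def detect_projection_positions(image, region, threshold=None):
    """image: list of rows (lists of brightness values).
    region: {row_index: (x_start, x_end)} giving the estimated projection
    region on each row. Returns {row_index: x} projection positions."""
    positions = {}
    for y, (x0, x1) in region.items():
        line = image[y][x0:x1]
        if not line:
            continue
        if threshold is not None:
            # keep only pixels at or above the threshold, as in the text
            candidates = [i for i, v in enumerate(line) if v >= threshold]
            if not candidates:
                continue
            x = max(candidates, key=lambda i: line[i])
        else:
            # otherwise take the pixel with the highest brightness
            x = max(range(len(line)), key=lambda i: line[i])
        positions[y] = x0 + x
    return positions
```

Reading only the pixels inside the region, rather than the whole line, is what yields the processing-time reduction discussed later.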
Based on the positional relationship between three points, i.e., the projection position of the slit light 40 inputted from the projection position detecting unit 84, the position of the projecting unit 4 and the position of the imaging unit 5, the three-dimensional shape measuring unit 85 measures the three-dimensional shape of the workpieces 3 under the principle of triangulation.
A method of measuring the three-dimensional shape of the workpiece 3 with the three-dimensional shape measuring unit 85 will now be briefly described with reference to
As shown in
In the robot system 1 stated above, the mirror 42 and the imaging unit 5 are arranged so that the reflection position of the slit light 40 in the mirror 42 and the reception position of the slit light 40 in the imaging unit 5 can lie on one and the same plane (hereinafter referred to as “reference plane Z1”) parallel to the placement region 2.
The three-dimensional shape measuring unit 85 calculates an irradiation angle a of the slit light 40 with respect to the workpiece 31 based on the rotation angle of the mirror 42. In addition, the three-dimensional shape measuring unit 85 calculates a reception angle b of the slit light 40 with respect to the imaging unit 5 based on the projection position of the slit light 40 in the image.
In this connection, the distance c from the reflection position P of the slit light 40 in the mirror 42 to the reception position Q of the slit light 40 in the imaging unit 5 is known. Likewise, the distance d from the reference plane Z1 to the placement region 2 is known.
Using the distance c from the reflection position P in the mirror 42 to the reception position Q in the imaging unit 5, the irradiation angle a of the slit light 40 and the reception angle b of the slit light 40, the three-dimensional shape measuring unit 85 can calculate the distance e from the reference plane Z1 to the projection position R of the slit light 40 under the principle of triangulation.
Thus, the three-dimensional shape measuring unit 85 can calculate the height f of the workpiece 31 by subtracting the distance e from the reference plane Z1 to the projection position R of the slit light 40 from the distance d from the reference plane Z1 to the placement region 2. Accordingly, the three-dimensional shape measuring unit 85 can determine the three-dimensional shape of the workpiece 31 by calculating the height f with respect to the respective portions of the workpiece 31.
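The triangulation described above can be sketched numerically as follows. This is an illustrative reconstruction under an assumed geometry (the angles a and b are taken between the reference plane Z1 and the projected/received rays, with P and Q both lying on Z1 at the known separation c); it is not the actual implementation.

```python
import math

# Illustrative triangulation sketch: the rays from the reflection position P
# and the reception position Q meet at the projection position R, a distance
# e below the reference plane Z1. Their horizontal offsets e/tan(a) and
# e/tan(b) must sum to the known separation c.

def height_from_triangulation(a_deg, b_deg, c, d):
    """Return (e, f): e = distance from Z1 down to projection position R,
    f = height of the workpiece surface above the placement region,
    where d is the known distance from Z1 to the placement region."""
    a = math.radians(a_deg)
    b = math.radians(b_deg)
    e = c / (1.0 / math.tan(a) + 1.0 / math.tan(b))
    f = d - e  # height f = d minus the Z1-to-R distance e
    return e, f
```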
In the present embodiment, the reception position Q in the X-Y plane lies at the center of the placement region 2. The position of the reflection surface of the mirror 42 in the X-Y plane lies at the center of the placement region 2 in the Y-direction and, in the X-direction, on the negative side of the X-axis negative end of the placement region 2.
Referring back to
Based on the information inputted from the three-dimensional shape measuring unit 85, the task commanding unit 86 determines the positions and postures of the workpieces 3 stacked on the placement region 2 in bulk and selects a grip target workpiece 3. Then the task commanding unit 86 outputs, to the robot 7, a command for allowing the robot 7 to perform a task of picking up the selected workpiece 3 from the placement region 2.
Next, an order of determining an estimated projection region 11 will be described with reference to
Further,
In the robot system 1, when the task of picking up the workpiece 3 is carried out by the robot 7, the pre-scan on the placement region 2 is performed in advance by the slit light 40. In the robot system 1, the estimated projection region 11 (see
More specifically, as shown in
During the pre-scan, the imaging unit 5 takes images 10 of the slit light 40 projected on the placement region 2 at regular intervals. The imaging unit 5 receives the slit light 40 reflected by the placement region 2 in the position just above the placement region 2 (at the Z-axis negative side) to take the image 10 of the slit light 40.
At this time, the slit light 40 reflected by the placement region 2 spreads as it travels toward the imaging unit 5. In this regard, the distance from the slit light 40 to the imaging unit 5 grows shorter as the reflection point comes closer to the longitudinal center of the slit light 40 (in the direction parallel to the Y-axis). The distance from the slit light 40 to the imaging unit 5 grows longer as the reflection point comes closer to both longitudinal ends of the slit light 40.
Therefore, the closer the reflection point lies to both longitudinal ends of the slit light 40, the more the slit light 40 spreads before being received by the imaging unit 5. This leads to a decrease in the quantity of the slit light 40 received by the imaging unit 5. Accordingly, if the slit light 40 shown in
In other words, the slit light 40c of the image 10 does not have a flat shape but has a bulging shape because the width of the slit light 40c in the X-axis direction grows larger (i.e., the width of the slit light 40c becomes wider) from both longitudinal ends toward the longitudinal center.
Therefore, as shown in
More specifically, the estimated projection region determining unit 83 determines, as the estimated projection region 11, the barrel-shaped region which extends across the image 10 in the intersection direction (Y-axis direction) intersecting the moving direction of the slit light 40c (X-axis direction) and which has an X-axis direction width growing larger toward the Y-axis direction center of the image 10.
At this time, the estimated projection region determining unit 83 determines a length (width) g from the X-axis negative-side edge of the slit light 40c taken during the pre-scan to the X-axis negative-side edge of the estimated projection region 11 by taking into account the Z-axis direction length of the workpieces 3 (the estimated maximum height of the workpieces 3 which is estimated when the workpieces 3 are placed with no overlap).
Moreover, the estimated projection region determining unit 83 determines a length h shorter than the length g as the length (width) h from the X-axis positive-side edge of the slit light 40c to the X-axis positive-side edge of the estimated projection region 11.
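The construction of the barrel-shaped region from the pre-scan result can be sketched as follows. This is a hedged illustration assuming the slit's edge positions per row are already extracted from the pre-scan image; the function name and data layout are hypothetical.

```python
# Illustrative sketch: extend the pre-scanned slit edges by the margin g on
# the X-negative side and the (shorter) margin h on the X-positive side.
# Because the slit itself is wider near the Y-center of the image, the
# resulting region naturally bulges toward the center, i.e. it is
# barrel-shaped.

def barrel_region(slit_edges, g, h):
    """slit_edges: {row: (x_neg_edge, x_pos_edge)} measured in the pre-scan.
    Returns {row: (x_start, x_end)} for the estimated projection region."""
    return {y: (x_neg - g, x_pos + h)
            for y, (x_neg, x_pos) in slit_edges.items()}
```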
As an example, description will now be made on a case where the slit light 40 is projected on a rectangular parallelepiped workpiece 31 as shown in
Accordingly, the estimated projection region determining unit 83 determines the length g from the X-axis negative-side edge of the slit light 40 to the X-axis negative-side edge of the estimated projection region 11 by adding a specified margin to the deviation distance in the X-axis negative-side of the slit light 40 corresponding to the estimated maximum height of the workpiece 31.
Depending on the irregular reflection of the slit light 40 or the imaging conditions, it is sometimes the case that, when scanning the workpiece 31, the slit light 40 is projected on the position deviated toward the X-axis positive side from the projection position of the slit light 40 imaged during the pre-scan. In this case, however, it is less likely that the deviation amount (the degree of deviation toward the X-axis positive side) becomes greater than the length g.
Therefore, the estimated projection region determining unit 83 determines the length h shorter than the length g as the length (width) h from the X-axis positive-side edge of the slit light 40 to the X-axis positive-side edge of the estimated projection region 11.
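The determination of g and h can be sketched as below. The geometric relation is an assumption for illustration (with irradiation angle a measured from the placement region, a surface at height f shifts the slit's image by roughly f / tan(a) toward the X-axis negative side); the ratio used for h is likewise only an illustrative choice.

```python
import math

# Illustrative margin calculation for the estimated projection region.

def margin_g(max_height, irradiation_angle_deg, extra_margin):
    """g = deviation corresponding to the estimated maximum workpiece
    height, plus a specified margin (assumed shift model: f / tan(a))."""
    shift = max_height / math.tan(math.radians(irradiation_angle_deg))
    return shift + extra_margin

def margin_h(g, ratio=0.5):
    """h only has to absorb irregular reflection and imaging noise, so a
    value shorter than g suffices; the ratio is an illustrative choice."""
    return g * ratio
```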
The description made above is directed to a case where the length g is determined on the assumption that the workpieces 31 are irregularly placed on the placement region 2 with no overlap. In the event that the workpieces 3 such as screws or the like are stacked on the placement region 2 in bulk in an overlapping state, the length g is further increased in view of the height of the workpieces 3 overlapping with one another.
Taking into account the fact that the width of the slit light 40 tends to grow larger from both longitudinal ends toward the center of the slit light 40, the estimated projection region determining unit 83 determines, as the estimated projection region 11, the barrel-shaped region in which the X-axis direction length (width) of the estimated projection region 11 grows larger toward the Y-axis direction center. Then, the estimated projection region determining unit 83 allows the storage unit 9 to store the estimated projection region information 92 indicative of the estimated projection region 11 thus determined.
The bulging shape of the contour of the estimated projection region 11 may not coincide with the bulging shape of the contour of the slit light 40. In other words, the estimated projection region 11 may have an arbitrary barrel-like shape in which the X-axis direction length (width) grows larger toward the Y-axis direction center.
Next, a method of detecting the projection position of the slit light 40 by the projection position detecting unit 84 will be described with reference to
Description will be made herein on a case where the scan of the workpieces 3 is performed while moving the slit light 40 on the placement region 2 toward the X-axis direction positive-side at a constant speed. In this case, the projecting unit 4 causes the mirror 42 to reflect the slit light 40 emitted from the light source 41 and then to project the slit light 40 on the placement region 2 loaded with the workpieces 3.
During the projection, the projecting unit 4 performs the scan of the workpieces 3 by rotating the mirror 42 so as to move the slit light 40 on the placement region 2 in a specified direction, e.g., from the left end to the right end of the placement region 2 in the X-axis positive direction. Then, the imaging unit 5 sequentially takes, at regular intervals, a number of images 10 of the slit light 40 moving on the placement region 2 in the X-axis positive direction.
The projection position detecting unit 84 detects the projection position of the slit light 40 from the image 10 taken by the imaging unit 5. In this regard, the projection position detecting unit 84 acquires the image information 91 and the estimated projection region information 92 from the storage unit 9. Based on the estimated projection region information 92, the projection position detecting unit 84 sets an estimated projection region 11 in each of the images 10 and detects the projection position of the slit light 40 within the estimated projection region 11.
More specifically, the projection position detecting unit 84 estimates the projection position of the slit light 40 in each of the images 10 based on the moving speed of the slit light 40 in the placement region 2. Then, as shown in
Then, the projection position detecting unit 84 selectively and sequentially reads the pixels, which are included in the estimated projection region 11 set in each of the images 10, from one Y-axis end (i.e., the top end of the image in
In this manner, the projection position detecting unit 84 does not read all the pixels of the image 10 but selectively reads the pixels existing within the estimated projection region 11 estimated in advance, thereby detecting the projection position of the slit light 40. It is therefore possible to reduce the processing time required in detecting the projection position.
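The sequential placement of the region from the scan speed can be sketched as follows. This is an illustrative sketch only; the frame-rate and speed parameters, and the assumption of a constant-speed scan starting at a known pixel position, are hypothetical.

```python
# Illustrative sketch: predict the slit's X position in each frame from the
# constant scan speed, and shift a fixed-shape estimated projection region
# to that position.

def region_offset_for_frame(frame_index, frame_interval, slit_speed_px, x_start):
    """Predicted X position of the slit in frame `frame_index`, assuming the
    slit moves at slit_speed_px pixels/second from x_start and frames are
    taken every frame_interval seconds."""
    return x_start + slit_speed_px * frame_interval * frame_index

def place_region(base_region, offset):
    """Shift a base region {row: (x0, x1)} along the X axis by `offset`."""
    return {y: (x0 + offset, x1 + offset)
            for y, (x0, x1) in base_region.items()}
```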
In conformity with the shape of the slit light 40 appearing in the image 10, the estimated projection region 11 is optimized such that the length (width) thereof in the direction parallel to the X-axis grows larger from the longitudinal two ends toward the center.
As a result, the projection position detecting unit 84 can read a necessary and sufficient number of pixels when detecting the projection position of the slit light 40 in the longitudinal central portion (indicated by “I” in
When detecting the projection positions of the slit light 40 in the longitudinal two end portions, the projection position detecting unit 84 can detect the projection position of the slit light 40 without having to read an unnecessarily large number of pixels. This makes it possible to effectively reduce the detection time of the slit light 40.
The projection position detecting unit 84 detects the projection position of the slit light 40 based on a brightness distribution diagram of the slit light 40 in the image 10. This makes it possible to further increase the detection accuracy of the projection position of the slit light 40 in the image 10.
Next, a method of detecting the projection position of the slit light 40 based on the brightness distribution diagram (histogram) will be described with reference to
As shown in
As shown in
Accordingly, when detecting the projection position of the slit light 40 in the longitudinal center, the projection position detecting unit 84 detects the projection position by using, e.g., a threshold value preset based on the brightness distribution diagram of the pixels on each line in the direction parallel to the X-axis.
In this case, the projection position detecting unit 84 discerns, e.g., a pixel D positioned at the midpoint between a pair of pixels B and C in the image 10, the brightness of the pair of pixels B and C being equal to the threshold value (or being approximate to the threshold value, when no pair of pixels equal to the threshold value exists). Then, the projection position detecting unit 84 detects, as the projection position of the slit light 40, the midpoint pixel D thus discerned on the corresponding line.
Further, when the brightness of the midpoint pixel D is lower than the threshold value, a pixel, which is the closest pixel to the midpoint pixel D and of which brightness is not lower than the threshold value, is detected as the projection position of the slit light 40.
By using the brightness distribution diagram on the pixels in this manner, the projection position detecting unit 84 can accurately detect the projection position of the slit light 40.
Further, when the projection position is detected in the longitudinal center of the slit light 40, another method may be used as substitute for the method described by referring to
Furthermore, detecting the position of the pixel having the peak brightness value on each line in the direction parallel to the X-axis as the projection position of the slit light, which is described above by referring to
While the description made above is directed to a case where the projection position detecting unit 84 detects the projection position of the slit light 40 by sequentially moving the estimated projection region 11 of a specific shape and size toward the X-axis positive side in the order of the images 10 taken, the estimated projection region 11 may be provided in each of the images 10.
Next, a method of providing the estimated projection region 11 in each of the images 10 will be described with reference to
As shown in
Then, the estimated projection region determining unit 83 allows the storage unit 9 to store the estimated projection region information 92 corresponding to the estimated projection region 11 on an image-by-image basis. As a consequence, using the estimated projection region information 92 stored on an image-by-image basis, the projection position detecting unit 84 can detect the projection position of the slit light 40 by setting the estimated projection region 11 having a size corresponding to the projection position of the slit light 40 in the image 10.
Therefore, if the slit light 40 is positioned near the X-axis negative side or positive side portion in the image 10 as shown in
Since the estimated projection region 11 is minimized, it is possible to reduce generation of a slit light detection error caused by the ambient light other than the slit light 40. As a result, the robot system 1 can reduce generation of a workpiece detection error, thereby restraining the robot 7 from performing a gripping operation in a wrong position.
On the other hand, when the slit light 40 is positioned at the X-axis center in the image 10, the projection position detecting unit 84 can set an estimated projection region 11 having a necessary and sufficient width as shown in
Next, the processing executed by the control unit 8 in the control device 6 will be described with reference to
At this time, the projection control unit 81 outputs to the projecting unit 4 a command instructing the projecting unit 4 to project the slit light 40 on the placement region 2 not loaded with the workpieces 3 and to move (pre-scan) the slit light 40 on the placement region 2 in the direction parallel to the X-axis. Moreover, the imaging control unit 82 outputs to the imaging unit 5 a command instructing the imaging unit 5 to take an image 10 of the slit light 40 moving on the placement region 2. Then, the imaging control unit 82 acquires the image information 91 on the taken image 10 from the imaging unit 5 to store the image information 91 in the storage unit 9.
Subsequently, the estimated projection region determining unit 83 determines an estimated projection region 11 (see
Thereafter, the control unit 8 begins to perform a scan in step S103. At this time, the projection control unit 81 outputs to the projecting unit 4 a command instructing the projecting unit 4 to project the slit light 40 on the placement region 2 loaded with the workpieces 3 and instructing the projecting unit 4 to move (scan) the slit light 40 on the placement region 2 in the direction parallel to the X-axis. Moreover, the imaging control unit 82 outputs to the imaging unit 5 a command instructing the imaging unit 5 to take an image 10 of the slit light 40 moving on the placement region 2. Then, the imaging control unit 82 acquires the image information 91 on the taken image 10 from the imaging unit 5 and stores the image information 91 in the storage unit 9.
Then, using the estimated projection region information 92 stored in the storage unit 9, the projection position detecting unit 84 sets the estimated projection region 11 (see
Using the brightness distribution diagram thus prepared, the projection position detecting unit 84 detects the projection position of the slit light 40 in each of the images 10 in step S105. The projection position detecting unit 84 outputs the information on the detected projection position to the three-dimensional shape measuring unit 85. Based on the projection position of the slit light 40 in each of the images 10, the three-dimensional shape measuring unit 85 measures the three-dimensional shape of the workpieces 3 in step S106. The three-dimensional shape measuring unit 85 determines the positions and postures of the workpieces 3 to output the determination result to the task commanding unit 86.
Based on the positions and postures of the workpieces 3, the task commanding unit 86 determines whether there exists a grip target workpiece 3 in step S107. If it is determined that the grip target workpiece 3 does exist (if yes in step S107), the task commanding unit 86 selects the grip target workpiece 3 in step S108.
Then, the task commanding unit 86 outputs to the robot 7 a workpiece grip command instructing the robot 7 to take out the selected grip target workpiece 3 from the placement region 2 in step S109. Thereafter, the processing is returned to step S103. On the other hand, if it is determined in step S107 that the grip target workpiece 3 does not exist (if no in step S107), the task commanding unit 86 completes the processing.
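The control flow of steps S101 to S109 can be sketched as a loop with the processing units passed in as callables. The interfaces below are placeholders for the units described above, not real APIs.

```python
# Illustrative sketch of the control flow in steps S101-S109.

def control_loop(pre_scan, determine_region, scan, detect, measure,
                 select_target, grip):
    """Run pre-scan once, then scan/detect/measure/grip until no grip
    target workpiece remains. Returns the list of gripped workpieces."""
    region = determine_region(pre_scan())        # S101-S102: pre-scan,
                                                 # determine estimated region
    picked = []
    while True:
        images = scan()                          # S103: scan loaded region
        positions = detect(images, region)       # S104-S105: detect slit
        shapes = measure(positions)              # S106: 3-D shape
        target = select_target(shapes)           # S107-S108: pick target
        if target is None:                       # no grip target -> done
            return picked
        grip(target)                             # S109: grip command,
        picked.append(target)                    # then scan again
```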
As set forth above, the robot system 1 according to the present embodiment shortens the measuring time of the three-dimensional shape of the workpiece 3 and enhances the measuring accuracy thereof by optimizing the estimated projection region 11 of the slit light 40 in conformity with the shape of the slit light 40 in the image 10. With the robot system 1 in accordance with the present embodiment, it is therefore possible to increase the speed and accuracy of picking up the workpiece 3 performed by the robot 7.
When the estimated projection regions 11 differing in the X-axis direction length (width) from one another are set in the respective images 10, the estimated projection region determining unit 83 may not perform the pre-scan and may determine the estimated projection regions 11 using a function in which the width of the estimated projection regions 11 is changed depending on the X-axis direction position of the slit light 40 in the respective images 10 registered in advance. This makes it possible to reduce the amount of the estimated projection region information 92 stored in the storage unit 9.
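One possible form of such a pre-registered function is sketched below. The linear interpolation and its coefficients are illustrative assumptions; the embodiment does not specify the functional form.

```python
# Illustrative width function: the region width shrinks as the slit's X
# position moves away from the X-axis center of the image, reaching w_edge
# at (or beyond) the half-span from the center.

def region_width(x, x_center, w_center, w_edge, x_half_span):
    """Interpolate the estimated projection region width between w_center
    (at the X-axis center) and w_edge (at the image edges)."""
    t = min(abs(x - x_center) / x_half_span, 1.0)
    return w_center + (w_edge - w_center) * t
```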
If the length (width) of the slit light 40 in the direction parallel to the X-axis grows larger as the projection position of the slit light 40 comes closer to the X-axis direction center of the placement region 2, the amount of the slit light 40 may be made smaller as the projection position comes closer to the center of the placement region 2.
This makes it possible to keep the width of the slit light 40 uniform regardless of the projection position of the slit light 40 in the placement region 2. Accordingly, the projection position of the slit light 40 can be detected through the use of one kind of estimated projection region 11 by merely changing the set position of the estimated projection region 11.
Based on the estimated projection region 11 from which the projection position of the slit light 40 is detected by the projection position detecting unit 84, the estimated projection region determining unit 83 may determine the estimated projection region 11 from which the next projection position of the slit light 40 is detected by the projection position detecting unit 84.
For example, based on the position of the estimated projection region 11 in the image and the speed of the slit light 40 moving along the placement region 2, the estimated projection region determining unit 83 may sequentially determine the estimated projection regions 11 in the images 10 from which the next projection positions of the slit light 40 are detected by the projection position detecting unit 84. With this configuration, it is possible to reduce the amount of the estimated projection region information 92 stored in the storage unit 9.
Based on the projection position of the slit light 40 detected by the projection position detecting unit 84, the estimated projection region determining unit 83 may determine the estimated projection region 11 from which the next projection position of the slit light 40 is detected by the projection position detecting unit 84. With this configuration, it is equally possible to reduce the amount of the estimated projection region information 92 stored in the storage unit 9.
Other effects and other modified examples can be readily derived by those skilled in the art. For that reason, the broad aspect of the present disclosure is not limited to the specific disclosure and the representative embodiment shown and described above. Accordingly, the present disclosure can be modified in many different forms without departing from the scope defined by the appended claims and the equivalents thereof.