The present invention relates to a robot system and a control method.
Conventionally, various techniques have been proposed to measure a distance to a workpiece having a three-dimensional shape (for example, see Patent Document 1). A three-dimensional measuring device disclosed in Patent Document 1 includes a shape measuring unit that acquires distance information for each point on a surface of an object existing in an imaging field of view, and an indicator that is arranged to be exposed on a sensor housing and indicates a positional relation between the surface of the object calculated based on the distance information acquired by the shape measuring unit and a reference point in the three-dimensional measuring device.
Patent Document 1: Japanese Unexamined Patent Application, Publication No. 2019-207152
When a robot picks up stacked workpieces such as cardboard boxes and loads them at another location (for example, on a pallet), the robot system needs to know the thickness of each workpiece accurately. It is therefore desired to acquire the thickness of the workpiece with high accuracy.
The present disclosure provides a robot system including: an image capturing unit that captures a two-dimensional image of a workpiece; an image processing unit that acquires distance information of the workpiece based on the two-dimensional image; a distance image generation unit that generates a distance image based on the distance information; and a thickness calculation unit that calculates a thickness of the workpiece based on the distance image, the image capturing unit being configured to capture the two-dimensional image of the workpiece regardless of a positional relation between a side surface of the workpiece and the image capturing unit.
The present disclosure provides a control method of a robot system including: a step of capturing a two-dimensional image of a workpiece; a step of acquiring distance information of the workpiece based on the two-dimensional image; a step of generating a distance image based on the distance information; and a step of calculating a thickness of the workpiece based on the distance image, in which an image capturing unit configured to capture the two-dimensional image of the workpiece captures the two-dimensional image of the workpiece regardless of a positional relation between a side surface of the workpiece and the image capturing unit.
According to the present invention, it is possible to acquire a thickness of a workpiece with high accuracy.
An example of an embodiment of the present invention will be described below.
A hand or a tool is attached to a tip of an arm of the robot 10. The robot 10 performs a task such as handling of the workpiece W under control of the robot controller 20. Further, the image capturing unit 11 is mounted on the tip of the arm of the robot 10. Alternatively, the image capturing unit 11 need not be attached to the robot 10 and may instead be installed at a predetermined fixed position, for example.
The image capturing unit 11 captures a distance image and a two-dimensional image of the workpiece W.
The image controller 30 is connected to the image capturing unit 11 and controls the image capturing unit 11. Further, the image controller 30 performs predetermined processing on the image captured by the image capturing unit 11. Further, the image controller 30 includes an image processing unit 301, a distance image generation unit 302, an image recognition unit 303, and a thickness calculation unit 304.
The image processing unit 301 acquires distance information of the workpiece W based on the two-dimensional image of the workpiece W captured by the image capturing unit 11. The distance image generation unit 302 generates a distance image based on the distance information acquired by the image processing unit 301. The thickness calculation unit 304 calculates a thickness of the workpiece W based on the generated distance image.
The image processing unit 301 searches for an image corresponding to a small zone (image range) in the image 1 from the image 2, and calculates parallax between the image 1 and the image 2.
Such a difference in pixel position between the two images 1 and 2 is called parallax. As a distance from the internal camera 111 to the workpiece W becomes longer, the parallax becomes smaller, whereas as the distance from the internal camera 111 to the workpiece W becomes shorter, the parallax becomes larger.
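The block search described above can be sketched as follows. This is a minimal illustration, not the implementation in the embodiment; the function name, the sum-of-squared-differences criterion, and the parameter values are assumptions for the sketch.

```python
import numpy as np

def find_parallax(image1: np.ndarray, image2: np.ndarray, x: int, y: int,
                  block: int = 8, max_shift: int = 32) -> int:
    """Find the horizontal shift (in pixels) at which the small zone of
    image1 at (x, y) best matches image2, by a sum-of-squared-differences
    search; the returned shift corresponds to the parallax."""
    template = image1[y:y + block, x:x + block].astype(np.float64)
    best_shift, best_ssd = 0, np.inf
    for s in range(max_shift + 1):
        if x - s < 0:
            break
        candidate = image2[y:y + block, x - s:x - s + block].astype(np.float64)
        ssd = np.sum((template - candidate) ** 2)
        if ssd < best_ssd:
            best_ssd, best_shift = ssd, s
    return best_shift
```

For example, if image2 is image1 shifted left by five pixels, the search returns a shift of five at an interior point.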
The image processing unit 301 converts the parallax between the two two-dimensional images into a distance to acquire distance information. The conversion of the parallax into the distance is performed using Formula (1) below.
Z=B*F/S (1)
where Z indicates a distance (mm), B indicates the distance (mm) between the two cameras, F indicates the focal length (mm), and S indicates the parallax (mm).
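The conversion by Formula (1) can be expressed as a small function (the function name and the sample values below are illustrative assumptions, not taken from the specification):

```python
def parallax_to_distance(baseline_mm: float, focal_mm: float,
                         parallax_mm: float) -> float:
    """Convert stereo parallax into distance using Z = B * F / S (Formula (1))."""
    if parallax_mm <= 0:
        raise ValueError("parallax must be positive")
    return baseline_mm * focal_mm / parallax_mm

# Example: cameras 100 mm apart, 8 mm focal length, 0.4 mm parallax
z = parallax_to_distance(100.0, 8.0, 0.4)  # -> 2000.0 mm
```

Consistent with the description above, halving the parallax doubles the computed distance.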
Then, the distance image generation unit 302 generates a distance image using the acquired distance information. In other words, the distance image is obtained by imaging the distance information from the image capturing unit 11 (internal camera 111) to the workpiece W. In the distance image, therefore, a location closer to the image capturing unit 11 appears brighter, and a location farther from the image capturing unit 11 appears darker.
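The nearer-is-brighter mapping described above can be sketched as a simple normalization; this is an assumed 8-bit rendering, not the embodiment's exact scaling.

```python
import numpy as np

def distances_to_image(distance_map: np.ndarray) -> np.ndarray:
    """Map per-pixel distances (mm) to 8-bit brightness: locations nearer
    the camera become brighter, farther locations darker."""
    d_min, d_max = distance_map.min(), distance_map.max()
    if d_max == d_min:
        return np.full(distance_map.shape, 255, dtype=np.uint8)
    scaled = (d_max - distance_map) / (d_max - d_min)  # 1.0 = nearest point
    return (scaled * 255).astype(np.uint8)
```

With this mapping, the nearest pixel takes the value 255 and the farthest 0.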
The image recognition unit 303 connects adjacent three-dimensional points in the distance image, and characterizes a set of the three-dimensional points by an area and an angle. The image recognition unit 303 detects the set (blob) of the characterized three-dimensional points to detect the workpiece W. Then, the thickness calculation unit 304 calculates a thickness of the detected workpiece W.
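The grouping of adjacent three-dimensional points into a blob can be sketched as a connected-component search over the distance image. The depth tolerance, minimum area, and 4-connectivity below are assumptions for the sketch, not parameters from the specification.

```python
import numpy as np
from collections import deque

def detect_blobs(depth: np.ndarray, depth_tol: float = 5.0, min_area: int = 4):
    """Group adjacent pixels of a distance image whose depths differ by less
    than depth_tol into blobs, keeping those with at least min_area pixels."""
    h, w = depth.shape
    visited = np.zeros((h, w), dtype=bool)
    blobs = []
    for sy in range(h):
        for sx in range(w):
            if visited[sy, sx]:
                continue
            # Breadth-first search over 4-connected neighbours
            queue, members = deque([(sy, sx)]), []
            visited[sy, sx] = True
            while queue:
                y, x = queue.popleft()
                members.append((y, x))
                for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                    if (0 <= ny < h and 0 <= nx < w and not visited[ny, nx]
                            and abs(depth[ny, nx] - depth[y, x]) < depth_tol):
                        visited[ny, nx] = True
                        queue.append((ny, nx))
            if len(members) >= min_area:
                blobs.append(members)
    return blobs
```

Two flat regions separated by a large depth step would be detected as two separate blobs.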
The image recognition unit 303 detects blobs existing within the set search range as workpieces. In order to reduce erroneous detection and non-detection, the image recognition unit 303 repeats the detection in response to operations of the operation unit by the instructor, thereby adjusting the search ranges used to search for the feature amounts.
In the image M6, since the minor axis lengths of the blobs of all the workpieces are within the set search range, the image recognition unit 303 can detect all of the workpieces B1 to B5. On the other hand, in the image M7, since the minor axis length of the blob of the workpiece B2 is not within the set search range, the image recognition unit 303 cannot detect the workpiece B2.
When the search range is not appropriate as described above, erroneous detection and non-detection of the workpieces occur in the image recognition unit 303. In the above-described example, the area and the minor axis length of the workpiece are used as the feature amounts whose search ranges are adjusted, but the feature amounts are not limited thereto. The feature amount may be, for example, the major axis length or the angle.
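The search-range check applied to each blob's feature amounts can be sketched as follows; the feature names, blob values, and range values are hypothetical examples, not values from the specification.

```python
def within_search_range(blob_features: dict, search_ranges: dict) -> bool:
    """Return True when every configured feature amount (e.g. area,
    minor-axis length) of a blob falls inside its [low, high] search range."""
    return all(low <= blob_features[name] <= high
               for name, (low, high) in search_ranges.items())

# Hypothetical feature amounts for two blobs
blob_a = {"area": 1200, "minor_axis": 40}
blob_b = {"area": 1150, "minor_axis": 12}   # minor axis outside the range
ranges = {"area": (1000, 1500), "minor_axis": (30, 60)}
```

Under these ranges, blob_a is detected as a workpiece while blob_b is not, illustrating how a mis-set range causes non-detection.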
The thickness calculation unit 304 may calculate the thickness of the workpiece W recognized by the image recognition unit 303 before the workpiece W is picked up by the robot 10. In addition, when a plurality of workpieces W are placed on a pallet for loading luggage, the thickness calculation unit 304 may calculate the thicknesses of the plurality of workpieces W for each pallet and may calculate an average value of those thicknesses, thereby obtaining the thickness of the workpiece W.
Further, the thickness calculation unit 304 can calculate the thickness of the workpiece W recognized by the image recognition unit 303, for example, using methods (1) to (4) below.
(1) Using major/minor axis lengths of a blob of the workpiece W.
(2) Using an average value of major/minor axis lengths of a plurality of blobs.
(3) Using a length between centroids of blobs adjacent to each other in a height direction.
(4) Using an average value of lengths between centroids of a plurality of blobs adjacent to each other in a height direction.
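Methods (3) and (4) above can be sketched as averaging the vertical distances between centroids of blobs adjacent in the height direction. The function name and the sample centroid heights are assumptions for illustration.

```python
def thickness_from_centroids(centroid_heights: list) -> float:
    """Estimate workpiece thickness as the average vertical distance between
    centroids of blobs adjacent in the height direction (methods (3)/(4))."""
    ordered = sorted(centroid_heights)
    gaps = [b - a for a, b in zip(ordered, ordered[1:])]
    if not gaps:
        raise ValueError("need at least two blobs")
    return sum(gaps) / len(gaps)

# Hypothetical centroid heights (mm) of four stacked cardboard boxes
heights = [50.0, 201.0, 349.0, 500.0]
thickness = thickness_from_centroids(heights)  # -> 150.0 mm
```

Averaging over several adjacent pairs, as in method (4), smooths out small per-box variations.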
With such processing, the thickness calculation unit 304 can calculate the thickness of the workpiece W even when a plurality of cardboard boxes are stacked as the workpieces W.
In Step S2, the image processing unit 301 acquires distance information of the workpiece W based on the two-dimensional image. In Step S3, the distance image generation unit 302 generates a distance image based on the distance information. In Step S4, the image recognition unit 303 recognizes the workpiece W based on the distance image. In Step S5, the thickness calculation unit 304 calculates a thickness of the workpiece W based on the distance image in which the workpiece W is recognized.
As described above, the robot system 100 according to the present embodiment includes the image capturing unit 11 that captures the two-dimensional image of the workpiece W, the image processing unit 301 that acquires the distance information of the workpiece W based on the two-dimensional image, the distance image generation unit 302 that generates the distance image based on the distance information, and the thickness calculation unit 304 that calculates the thickness of the workpiece W based on the distance image. The image capturing unit 11 moves according to the motion of the robot 10, and captures the two-dimensional image of the workpiece regardless of the positional relation between the side surface of the workpiece W and the image capturing unit 11.
The conventional technique uses a three-dimensional sensor to measure the thickness of the workpiece from a difference between three-dimensional information (for example, height or shape) of a base or a floor serving as a reference and three-dimensional information of the workpiece (for example, loaded luggage). Therefore, the conventional technique can measure the thicknesses of the workpieces only when the workpieces are not stacked. In addition, the conventional technique requires that the three-dimensional sensor face an upper surface of the workpiece.
On the other hand, since the robot system 100 according to the present embodiment calculates the thickness of the workpiece W based on the distance image, it is possible to acquire the thickness of the workpiece W with high accuracy regardless of whether the side surface of the workpiece W faces the image capturing unit 11. Further, unlike the conventional technique, the robot system 100 can acquire the thickness of the workpieces W even when the workpieces W are stacked. In addition, since the robot system 100 calculates the thickness of the workpiece W based on the distance image, it is not necessary to measure the reference position unlike the conventional technique.
Further, the thickness calculation unit 304 calculates the thickness of the workpiece W by calculating the average value of the thicknesses of the plurality of workpieces W based on the distance image. Thus, the robot system 100 can acquire the thickness of the workpiece W with high accuracy even when the workpieces W (for example, cardboard boxes) having approximately the same shape are loaded.
In addition, the thickness calculation unit 304 connects adjacent three-dimensional points in the distance image, characterizes a set of the three-dimensional points, and detects the set (blob) of three-dimensional points to detect the workpiece W. Thus, the robot system 100 can acquire the thickness of the workpiece W with high accuracy regardless of whether the side surface of the workpiece W faces the image capturing unit 11.
Although the embodiment of the present invention has been described, the robot system 100 can be implemented by hardware, software, or a combination of hardware and software. Further, the control method performed by the robot system 100 described above can also be implemented by hardware, software, or a combination of hardware and software. Here, implementation by software means implementation by a computer reading and executing a program.
The program may be stored and supplied to a computer using various types of non-transitory computer-readable media. The non-transitory computer-readable media include various types of tangible storage media. Examples of non-transitory computer-readable media include magnetic storage media (for example, hard disk drives), magneto-optical storage media (for example, magneto-optical disks), CD-ROMs (Compact Disc Read Only Memories), CD-Rs, CD-R/Ws, and semiconductor memories (such as mask ROMs, Programmable ROMs (PROMs), Erasable PROMs (EPROMs), flash ROMs, and Random Access Memories (RAMs)).
Moreover, although the above-described embodiment is a preferred embodiment of the present invention, the scope of the present invention is not limited only to the embodiment. Various modifications can be made without departing from the gist of the present invention.
1 robot system
10 robot
11 image capturing unit
20 robot controller
30 image controller
301 image processing unit
302 distance image generation unit
303 image recognition unit
304 thickness calculation unit
Number | Date | Country | Kind |
---|---|---|---|
2020-147223 | Sep 2020 | JP | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/JP2021/031375 | 8/26/2021 | WO |