The present invention relates to an image generation device, a robot controller, and a computer program.
Conventionally, in order to accurately perform work such as handling or machining of a workpiece using a robot, it is necessary to accurately recognize a position where the workpiece is placed and a deviation of the workpiece gripped by the robot. For this reason, in recent years, a visual sensor has been used to visually recognize the position of the workpiece and the deviation of the workpiece (for example, see Patent Document 1).
There is a case where the range of brightness cannot be appropriately represented in one image when an object (for example, a workpiece) is imaged by a visual sensor. For example, when the brightness setting conforms with a bright region in the field of view, a dark region is crushed to black and cannot be visually recognized. Conversely, when the brightness setting conforms with a dark region in the field of view, a bright region is blown out to white and cannot be visually recognized.
A technique called HDR (High Dynamic Range) compositing is known to cope with such problems. This technique combines a plurality of captured images to generate a composite image with a wide dynamic range that cannot be obtained from a single image.
A method of generating a composite image from a plurality of captured images is effective because it maps a wide range of brightness values into a specific pixel-value range while preserving small differences in brightness. However, with such a method, differences in brightness may be compressed into small differences in pixel value, and an image with little variation may be generated.
Machine vision used in robots includes processing that recognizes an image by detecting differences in pixel value as features. When the differences in pixel value are reduced, the threshold value for the differences to be detected must be lowered. As a result, a problem arises in that the number of detected features increases, which causes an increase in processing time and erroneous detection. Therefore, there is a demand for a technique capable of widening the range in which pixel values exist in a composite image.
An aspect of the present disclosure is directed to an image generation device that generates a composite image in which images are combined, the image generation device including: a number-of-images setting unit that sets a number of images in which an object is to be captured; an exposure time setting unit that sets an exposure time for the images to be captured; a brightness range setting unit that sets a brightness range of the object; an image capturing control unit that controls and causes an image capturing unit to capture images of the object based on the number of images and the exposure time; and an image combining unit that combines the captured images based on the brightness range to thereby generate the composite image.
An aspect of the present disclosure is directed to a robot controller that generates a composite image in which images are combined, the robot controller including: a number-of-images setting unit that sets a number of images in which an object is to be captured; an exposure time setting unit that sets an exposure time for the images to be captured; a brightness range setting unit that sets a brightness range of the object; an image capturing control unit that controls and causes an image capturing unit to capture images of the object based on the number of images and the exposure time; and an image combining unit that combines the captured images based on the brightness range to thereby generate the composite image.
An aspect of the present disclosure is directed to a computer program for causing a computer to execute steps including: a step of setting the number of images in which an object is to be captured; a step of setting an exposure time for the images to be captured; a step of setting a brightness range of the object; a step of controlling and causing an image capturing unit to capture images of the object based on the number of images and the exposure time; and a step of combining the captured images based on the brightness range to generate the composite image.
According to the present invention, it is possible to widen the range in which the pixel values exist in the composite image.
An example of embodiments of the present invention will be described below.
A hand or a tool is attached to a tip of the arm 3 of the robot 2. The robot 2 performs work such as handling or machining of a workpiece W under control of the robot controller 1. Further, the visual sensor 4 is attached to the tip of the arm 3 of the robot 2. The visual sensor 4 may not be attached to the robot 2, and may be fixed and installed at a predetermined position, for example.
The visual sensor 4 captures an image of the workpiece W under the control of the robot controller 1. The visual sensor 4 may be a two-dimensional camera having an image capturing element configured by a CCD (Charge Coupled Device) image sensor and an optical system including lenses.
The robot controller 1 executes a robot program for the robot 2 to control a motion of the robot 2. At this time, the robot controller 1 corrects the motion of the robot 2, using an image captured by the visual sensor 4, such that the robot 2 performs predetermined work with respect to the position of the workpiece W.
The control unit 11 is a processor such as a CPU (Central Processing Unit), and implements various functions by executing programs stored in the storage unit 12.
The control unit 11 includes a number-of-images setting unit 111, an exposure time setting unit 112, a brightness range setting unit 113, an image capturing control unit 114, and an image combining unit 115.
The storage unit 12 is a storage device such as a ROM (Read Only Memory) for storing an OS (Operating System) or application programs, a RAM (Random Access Memory), or a hard disk drive and SSD (Solid State Drive) that store various other information. The storage unit 12 stores various information, for example, robot programs. In addition, the storage unit 12 includes an image storage unit 121 that stores captured images.
The number-of-images setting unit 111 sets the number of images of an object (for example, a workpiece W) captured by the visual sensor 4. Specifically, the number-of-images setting unit 111 sets the number of images to be captured, based on a brightness range of the object.
The exposure time setting unit 112 sets an exposure time of the image captured by the visual sensor 4. Specifically, the exposure time setting unit 112 sets the exposure time based on the brightness range of the object.
The brightness range setting unit 113 sets a brightness range of the object. Here, the brightness range may be set by a user's input operation, for example. Further, the brightness range setting unit 113 may set the brightness range of the object based on image information captured by the visual sensor 4. In addition, the brightness range setting unit 113 may set the brightness range of the object based on the image information of the object captured by the visual sensor 4. Here, examples of the image information include distribution of brightness values estimated from the captured image captured by the visual sensor 4 and brightness of a target part in the captured image captured by the visual sensor 4.
The image capturing control unit 114 controls and causes the visual sensor 4 to capture images of the object, based on the number of images to be captured that has been set by the number-of-images setting unit 111 and the exposure time set by the exposure time setting unit 112.
The image combining unit 115 combines the captured images based on the brightness range to thereby generate a composite image. In addition, the image combining unit 115 may adjust pixel values of the captured images based on the brightness range. For example, the image combining unit 115 may combine the captured images by mapping the minimum value of the brightness range to a pixel value of 1 and mapping the maximum value of the brightness range to a value obtained by subtracting 1 from the maximum possible pixel value.
Image features used when detecting the object (for example, the workpiece W) from the composite image lie only within the brightness range of an effective region. In this case, the regions above and below the effective region become useless regions in which no pixel values exist.
Therefore, the image generation device 10 according to the present embodiment represents the pixel values as 1 to Imax−1 in conformity with the brightness range of the image capturing environment, for example. In other words, the image generation device 10 assigns the pixel values 0 and Imax to brightness values outside the range. Thus, the image generation device 10 can generate a composite image from which the useless regions are eliminated, and can therefore generate an image with larger differences between pixels.
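The pixel-value adjustment described above can be sketched as follows. This is a minimal illustration only, not a definitive implementation of the disclosure: the function name, the NumPy-based interface, and the linear mapping of the brightness range [L_min, L_max] onto the pixel values 1 to Imax−1 are assumptions for the example.

```python
import numpy as np

def map_brightness_to_pixels(L, L_min, L_max, I_max=255):
    """Map estimated brightness values L into pixel values 1..I_max-1.

    Brightness below L_min is assigned 0 and brightness above L_max is
    assigned I_max, so the effective range [L_min, L_max] fills the
    pixel values 1..I_max-1 without useless regions.
    """
    L = np.asarray(L, dtype=float)
    # Linear mapping of [L_min, L_max] onto [1, I_max - 1].
    scaled = 1 + (L - L_min) * (I_max - 2) / (L_max - L_min)
    out = np.rint(scaled).astype(int)
    out[L < L_min] = 0      # below-range brightness pinned to 0
    out[L > L_max] = I_max  # above-range brightness pinned to I_max
    return out
```

With Imax = 255, brightness values at the ends of the range map to 1 and 254, while out-of-range brightness is pinned to 0 and 255.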
Specifically, as shown in
Here, the relationship between brightness and pixel values is described by Formula (1) as follows.
I=C·L·t Formula (1)
Here, I indicates a pixel value, C indicates an arbitrary constant, L indicates brightness, and t indicates an exposure time. Then, a formula for calculating brightness is obtained as follows from the above formula.
L=I/(Ct) Formula (2)
By varying the exposure time t and applying Formula (2) to the images captured by the visual sensor 4, the brightness range setting unit 113 can obtain the distribution of the brightness L in the captured images.
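As a sketch of how such a distribution might be estimated from Formula (2), the following hypothetical example averages per-pixel brightness estimates over several exposures. The choice C = 1 (relative brightness units) and the exclusion of saturated pixels are assumptions for illustration, not taken from the description.

```python
import numpy as np

def estimate_brightness(I, t, C=1.0):
    # Formula (2): L = I / (C * t)
    return np.asarray(I, dtype=float) / (C * t)

def brightness_distribution(images, exposure_times, C=1.0, I_max=255):
    """Estimate per-pixel brightness L from images taken at different
    exposure times, ignoring saturated (0 or I_max) pixels."""
    estimates = []
    for img, t in zip(images, exposure_times):
        img = np.asarray(img, dtype=float)
        L = estimate_brightness(img, t, C)
        # Blacked-out or blown-out pixels carry no reliable information.
        L[(img <= 0) | (img >= I_max)] = np.nan
        estimates.append(L)
    return np.nanmean(np.stack(estimates), axis=0)
```

The resulting array gives the distribution of L from which a brightness range of the object can be chosen.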
Specifically, the brightness range setting unit 113 may set, as image information of the object, a brightness range of the object based on brightness of a target part (for example, a specific part in the captured image, or a specific range in the captured image) in the captured image captured by the visual sensor 4.
For example, in the example shown in
Instead of the examples shown in
The combination of exposure times for multiple exposure can use the following method, where a is an arbitrary constant.
ti = T/a^i (i = 0, 1, 2, . . . , n−1) Formula (3)
Here, a indicates a constant that determines the amount of change in exposure time, and n indicates the number of images to be captured.
ti indicates the exposure time when the i-th image is captured. The reason for using such a progression is that human senses perceive physical quantities approximately logarithmically, and the geometric series of exposure times approximates this logarithmic perception.
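A minimal sketch of Formula (3) follows; the function name and interface are assumptions for the example.

```python
def exposure_times(T, a, n):
    """Exposure times t_i = T / a**i for i = 0, 1, ..., n - 1 (Formula (3)).

    The geometric progression samples brightness at steps that are
    evenly spaced on a logarithmic scale, matching the roughly
    logarithmic response of human perception noted above.
    """
    return [T / a**i for i in range(n)]
```

For example, with T = 8 and a = 2, four images would be captured at exposure times 8, 4, 2, and 1.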
The exposure time setting unit 112 implements such a logarithmic representation within a specific brightness range. In other words, the exposure time setting unit 112 obtains a logarithmic representation only within a specific range of brightness. The method described below is an example of realizing such a logarithmic representation only within a specific range of brightness.
As described above, the exposure time setting unit 112 can acquire n images between the minimum exposure time tmin and the maximum exposure time tmax, thereby obtaining a logarithmic relationship only within a specific brightness range. Combinations of the exposure times in this case are shown as follows.
Based on tmin, the next image is captured at tmin×x, the image after that at (tmin×x)×x, and so on, each image being captured at an exposure time x times that of the previous image.
In other words, when n images are captured, tmax is tmin×x^(n−1). Here, x can be calculated from the relationship between tmin and tmax.
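The computation of x and of the resulting series can be sketched as follows; this is an illustrative example, with the function name assumed.

```python
def exposure_series(t_min, t_max, n):
    """n exposure times from t_min to t_max, each x times the previous.

    x is obtained by solving t_max = t_min * x**(n - 1), i.e.
    x = (t_max / t_min) ** (1 / (n - 1)).
    """
    x = (t_max / t_min) ** (1.0 / (n - 1))
    return [t_min * x**i for i in range(n)]
```

For instance, with t_min = 1, t_max = 8, and n = 4, the ratio x is 2 and the series is 1, 2, 4, 8.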
The range of the estimated pixel values is from Imin to Imax. In a case where this range is simply stretched so as to be represented by 0 to Imax, the pixel values, which are integers, become discrete. In order for the pixel values to be filled without gaps, at least (Imax−Imin)×n≥Imax should be satisfied; in other words, n≥Imax/(Imax−Imin), and the smallest such n suffices. The number-of-images setting unit 111 can set the n thus obtained as the number of images to be captured.
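The smallest n satisfying the condition above can be computed as follows; rounding up to an integer is an assumption implied by n being a count of images.

```python
import math

def number_of_images(I_max, I_min):
    """Smallest integer n satisfying (I_max - I_min) * n >= I_max, so
    that the stretched pixel values can cover 0..I_max without gaps."""
    return math.ceil(I_max / (I_max - I_min))
```

For example, with Imax = 255 and Imin = 200, the range width is 55 and n = 5 images are needed, since 55 × 4 = 220 < 255 while 55 × 5 = 275 ≥ 255.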
Note that Steps S1 to S3 described above may be performed in any order, and the Steps may also be executed simultaneously.
In Step S4, the image capturing control unit 114 controls and causes the visual sensor 4 to capture images of the object, based on the number of images to be captured set in Step S2 and the exposure time set in Step S3. Then, the image storage unit 121 stores the captured images. In Step S5, the image combining unit 115 combines the captured images captured in Step S4 based on the brightness range set in Step S1 to thereby generate a composite image.
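As an overall sketch tying Steps S4 and S5 together, the following hypothetical example captures images at geometrically spaced exposure times, estimates per-pixel brightness via Formula (2), and maps the set brightness range onto the pixel values 1 to Imax−1. Here `capture(t)` is a stand-in for the visual sensor 4, and the averaging of per-image brightness estimates and the saturation handling are simplifying assumptions for illustration, not the claimed method.

```python
import numpy as np

def generate_composite(capture, t_min, t_max, n, L_min, L_max,
                       C=1.0, I_max=255):
    """Capture n images between t_min and t_max and combine them into a
    composite whose pixel values span 1..I_max-1 over [L_min, L_max]."""
    # Exposure times: t_max = t_min * x**(n - 1).
    x = (t_max / t_min) ** (1.0 / (n - 1))
    times = [t_min * x**i for i in range(n)]
    estimates = []
    for t in times:
        img = np.asarray(capture(t), dtype=float)
        L = img / (C * t)                         # Formula (2)
        L[(img <= 0) | (img >= I_max)] = np.nan   # drop saturated pixels
        estimates.append(L)
    L_mean = np.nanmean(np.stack(estimates), axis=0)
    # Map [L_min, L_max] linearly onto [1, I_max - 1].
    out = np.rint(1 + (L_mean - L_min) * (I_max - 2) / (L_max - L_min))
    out = np.clip(out, 0, I_max)
    out[L_mean < L_min] = 0
    out[L_mean > L_max] = I_max
    return out.astype(int)
```

A caller would supply a real capture routine in place of `capture`; in the test below a synthetic scene stands in for it.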
As described above, according to the present embodiment, the image generation device 10, which generates the composite image in which images are combined, includes: the number-of-images setting unit 111 that sets the number of images in which the object is to be captured; the exposure time setting unit 112 that sets the exposure time for the images to be captured; the brightness range setting unit 113 that sets the brightness range of the object; the image capturing control unit 114 that controls and causes the visual sensor 4 to capture the image of the object based on the number of images to be captured and the exposure time; and the image combining unit 115 that combines the captured images based on the brightness range to generate the composite image.
Thus, the image generation device 10 can eliminate the range where no pixel values exist in the composite image and can widen the range where pixel values exist. Therefore, the image generation device 10 can represent an image having small differences between pixels as an image having large differences between pixels, and can obtain clear features. Thereby, for example, the robot controller 1 can reduce processing time and erroneous detection by using the composite image generated by the image generation device 10.
Further, the brightness range may be set by the user, for example. Thus, the image generation device 10 can suitably obtain the composite image desired by the user.
Further, the brightness range setting unit 113 may set the brightness range based on the image information captured by the visual sensor 4. Thus, the image generation device 10 can appropriately set the brightness range in order to widen the range where the pixel value exists in the composite image.
Further, the brightness range setting unit 113 may set the brightness range based on the image information of the object captured by the visual sensor 4. Thus, the image generation device 10 can appropriately set the brightness range in order to widen the range where the pixel value exists in the composite image.
Further, the exposure time setting unit 112 sets the exposure time based on the brightness range. Thus, the image generation device 10 can appropriately set the exposure time in order to widen the range where the pixel value exists in the composite image.
Further, the number-of-images setting unit 111 sets the number of images to be captured, based on the brightness range. Thus, the image generation device 10 can appropriately set the number of images to be captured in order to widen the range where the pixel value exists in the composite image.
Further, the image combining unit 115 adjusts the pixel value of the captured image based on the brightness range. Thus, the image generation device 10 can generate the composite image in which the range where the pixel value exists is widened, by adjusting the pixel value.
In such an image processing system 201 shown in
In such an image processing system 301 shown in
Although the embodiment of the present invention has been described above, the image generation device 10 described above can be implemented by hardware, software, or a combination thereof. Further, the control method performed by the robot controller 1 described above can also be implemented by hardware, software, or a combination thereof. Here, “implemented by software” means implemented by a computer reading and executing a program.
The program may be stored and supplied to a computer using various types of non-transitory computer readable media. The non-transitory computer readable media include various types of tangible storage media. Examples of the non-transitory computer readable media include a magnetic recording medium (for example, a hard disk drive), a magneto-optic recording medium (for example, a magneto-optic disk), a CD-ROM (Read Only Memory), a CD-R, a CD-R/W, and a semiconductor memory (for example, a mask ROM, a PROM (Programmable ROM), an EPROM (Erasable PROM), a flash ROM, and a RAM (Random Access Memory)).
In addition, although the above-described embodiment is a preferred embodiment of the present invention, the scope of the present invention is not limited to only the above-described embodiment, and various modifications can be made without departing from the gist of the present invention.
Number | Date | Country | Kind |
---|---|---|---|
2021-002903 | Jan 2021 | JP | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/JP2022/000335 | 1/7/2022 | WO |