The present invention relates to an image processing apparatus and an image processing method, and particularly to an image processing apparatus and an image processing method which are usable for three-dimensional (3D) measurement technique used for product shape inspection, robotic picking, reverse engineering, or the like.
In recent years, there has been an increasing demand for techniques for accurate three-dimensional shape measurement in product shape inspection, robotic picking, reverse engineering, and the like. Such accurate three-dimensional shape measurement can be used for various applications, such as scratch checking of industrial products, bin picking, and application of measurement results to a three-dimensional printer.
In the related art, known three-dimensional shape measurement uses a phase shift method which uses a projector and a camera (see JP-A-2009-115612). In this method, an image of a sinusoidal pattern projected by a projector is captured by a camera, the correspondence between a pixel of the captured image and the corresponding pixel of the projection image (i.e., correspondence of pixels between the projector and the camera) is calculated, and the depth of the captured image is calculated by triangulation.
In such a three-dimensional shape measurement method, for example, in the case where a product which has a portion having a large difference in brightness, such as a black and white portion, is inspected, if the light intensity and exposure time are both adjusted for bright areas, the result is insufficient light intensity in dark areas, and if the light intensity and the exposure time are both adjusted for dark areas, the result is halation in the bright areas. Therefore, it is necessary to perform a plurality of measurements while adjusting light intensity and exposure time in the camera in accordance with the black and white areas.
Therefore, there has been proposed a three-dimensional measurement apparatus that measures an object three-dimensionally. When an object to be measured has a plurality of areas of varying brightness (colors), a plurality of test blocks are assigned to the object to be measured. Illuminance for measurement is determined for each of the assigned test blocks. A fringe pattern is projected on each test block by the projection unit using the determined illuminance for measurement. A fringe image on the test block on which the fringe pattern is projected is captured by an imaging unit, and the object to be measured is three-dimensionally measured on the basis of the captured fringe images (see JP-A-2013-036791).
However, also in the above three-dimensional measurement apparatus, the illuminance set for each test block is changed for measurement. Therefore, when the field of view contains areas that vary considerably in brightness, imaging and measurement need to be repeated while the illuminance is changed, and such measurement takes time.
An advantage of some aspects of the invention is to provide an image processing apparatus and an image processing method for three-dimensional measurement technique with a reduced measurement time and improved measurement capability.
An image processing apparatus according to an aspect of the invention includes a projection unit configured to project predetermined fringe pattern light onto an object, an imaging unit configured to image the object onto which the predetermined fringe pattern light is projected by the projection unit, a luminance value acquisition unit configured to acquire a luminance value of a pixel corresponding to the object from captured image data obtained by the imaging unit imaging the object onto which the predetermined fringe pattern light is projected, a three-dimensional point group generation unit configured to calculate a three-dimensional shape on the basis of luminance value data acquired by the luminance value acquisition unit, and a control unit configured to control the projection unit, the imaging unit, the luminance value acquisition unit, and the three-dimensional point group generation unit. The control unit includes an extended control unit. The extended control unit includes and controls a luminance distribution acquisition unit configured to acquire luminance distribution data of the object from captured image data obtained by imaging the object by using the imaging unit while causing the projection unit to project uniform pattern light, and a projection luminance setting unit configured to set a projection luminance distribution pattern based on the luminance distribution data acquired by the luminance distribution acquisition unit.
The extended control unit causes the luminance value acquisition unit to acquire luminance value data from captured image data obtained by imaging the object by using the imaging unit while causing the projection unit to project pattern light based on a projection luminance distribution fringe pattern obtained by combining the projection luminance distribution pattern set by the projection luminance setting unit and the predetermined fringe pattern, and the extended control unit further causes the three-dimensional point group generation unit to calculate a three-dimensional shape based on the luminance value data.
According to this aspect, a projection luminance distribution pattern is generated, the projection unit projects projection luminance distribution fringe pattern light based on the projection luminance distribution pattern, and the imaging unit performs imaging under this condition. Thus, a three-dimensional shape of an object to be measured, ranging from a high luminance area to a low luminance area, can be calculated in one measurement.
The projection luminance distribution fringe pattern is preferably obtained by superimposing the predetermined fringe pattern and a low luminance pattern. The low luminance pattern having a reduced projection luminance value is set, on the basis of the luminance distribution data, to an object area being an area having a luminance value higher than a predetermined reference luminance by at least a predetermined value. According to this configuration, a projection luminance distribution fringe pattern can be relatively readily generated.
The projection luminance distribution fringe pattern may have a plurality of object areas each of which is the object area and in which the low luminance pattern has different projection luminance values for the respective object areas. According to such a configuration, an object to be measured having a wider range of luminance can be measured.
The low luminance pattern set to the object area may have projection luminance values distributed in one object area. According to such a configuration, an object to be measured having a wider range of luminance can be measured.
The projection unit preferably includes a correction table representing a relationship between input luminance and measured luminance corresponding to the input luminance, determines set luminance corresponding to luminance to be projected in accordance with a proportional relationship of a minimum value and a maximum value of the input luminance to a minimum value of measured luminance and a maximum value of measured luminance corresponding to the minimum value and the maximum value of the input luminance, and projects, as input luminance, actually set luminance corresponding to measured luminance in the correction table corresponding to the set luminance. According to this configuration, an output luminance range of the projection unit can be relatively readily expanded.
An image processing method according to another aspect of the invention includes acquiring luminance distribution data of an object from captured image data obtained by causing an imaging unit to image the object while projecting uniform pattern light onto the object, setting a projection luminance distribution pattern based on the luminance distribution data acquired in the acquiring of luminance distribution data, projecting onto the object projection luminance distribution fringe pattern light obtained by combining the projection luminance distribution pattern set in the setting of a projection luminance distribution pattern and a predetermined fringe pattern, imaging the object onto which the projection luminance distribution fringe pattern light is projected in the projecting of projection luminance distribution fringe pattern light, acquiring a luminance value of a pixel corresponding to the object from captured image data obtained by imaging performed in the imaging of the object, and calculating a three-dimensional shape based on luminance value data acquired in the acquiring of a luminance value.
In this aspect, a projection luminance distribution pattern is generated, the projection unit projects projection luminance distribution fringe pattern light based on the projection luminance distribution pattern, and the imaging unit performs imaging under this condition. Thus, a three-dimensional shape of an object to be measured, ranging from a high luminance area to a low luminance area, can be calculated in one measurement.
The invention will be described with reference to the accompanying drawings, wherein like numbers reference like elements.
The invention will be described in detail below on the basis of embodiments.
Embodiment 1 of the invention will be described below with reference to the drawings.
The projection unit 12 is, for example, a projector including a liquid crystal light valve and a projection lens which project a projection image, a liquid crystal drive unit, and a super-high pressure mercury lamp or metal halide lamp as a light source. The projection unit 12 is communicably connected to the control unit 20, for example, by using a cable. The wired communication by using a cable is performed in accordance with a standard such as Ethernet (registered trademark) or Universal Serial Bus (USB). The projection unit 12 and the control unit 20 may be connected via wireless communication performed in accordance with a communication standard such as Wi-Fi (registered trademark). The projection unit 12 acquires various patterns (images) from the control unit 20 through communication and projects the acquired patterns onto the object 1.
The imaging unit 13 is communicably connected to the control unit 20, for example, by using a cable. The wired communication by using a cable is performed in accordance with a standard such as Ethernet (registered trademark) or USB. The imaging unit 13 and the control unit 20 may be connected via wireless communication performed in accordance with a communication standard such as Wi-Fi. The imaging unit 13 images the object 1.
The control unit 20 includes a projection pattern setting unit 21 configured to generate a projection pattern to be projected by the projection unit 12, an image acquisition unit 22 configured to acquire captured image data obtained by imaging the object 1 by using the imaging unit 13, a luminance value acquisition unit 23 configured to acquire a luminance value of a pixel corresponding to the object 1 from captured image data acquired by the image acquisition unit 22, and a three-dimensional point group generation unit 24 configured to calculate a three-dimensional shape on the basis of luminance value data acquired by the luminance value acquisition unit 23.
The image processing apparatus 10 having such a configuration calculates a three-dimensional shape, for example, by using a phase shift method. First, the process of the phase shift method will be described. In the phase shift method, fringe pattern light in which luminance changes sinusoidally is projected from the projection unit 12 onto the object 1, and the imaging unit 13 performs imaging while the fringe pattern is controlled. The phase of the fringe pattern projected onto the object 1 is shifted by a predetermined amount of phase shift. The phase shift is repeated a plurality of times (at least three times, normally four times or more) until the phase of the fringe pattern has been shifted by one cycle. Whenever the phase of the fringe pattern is shifted, the imaging unit 13 images the object 1 onto which the fringe pattern light is projected. For example, when the amount of phase shift is π/2 [rad], the phase of the fringe is shifted by 0, π/2, π, and 3π/2, and an image of the object to be measured is captured at each phase. A total of four sets of captured image data are thus acquired.
Next, the luminance value acquisition unit 23 acquires the luminance of pixels of the object 1 on the basis of the captured image data. With four phase shifts, luminance values of the pixels are obtained from each of the four sets of captured image data. Then, the luminance values are applied to the following formula (1) to obtain a phase φ(x,y) at coordinates (x,y).
φ(x,y) = tan⁻¹[{I3π/2(x,y) − Iπ/2(x,y)} / {I0(x,y) − Iπ(x,y)}]  (1)
In the formula (1), I0(x,y), Iπ/2(x,y), Iπ(x,y) and I3π/2(x,y) denote the luminance value of a pixel positioned at coordinates (x,y) for the phases 0, π/2, π, and 3π/2, respectively.
When the phase φ(x,y) can be determined, height information at respective coordinates can be obtained on the basis of the phase φ(x,y) in accordance with the principle of triangulation, thus enabling a three-dimensional shape of the object 1 to be obtained.
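As a sketch, formula (1) can be evaluated per pixel with NumPy. The array sizes, offset, and amplitude below are illustrative values, not taken from the specification; `arctan2` is used in place of a plain arctangent of the ratio because it resolves the quadrant and tolerates a zero denominator.

```python
import numpy as np

def phase_from_four_shifts(i0, i_pi_2, i_pi, i_3pi_2):
    """Recover the wrapped phase phi(x,y) of formula (1) from four captures
    taken at fringe phase shifts of 0, pi/2, pi, and 3*pi/2."""
    # arctan2 resolves the quadrant and tolerates a zero denominator.
    return np.arctan2(i_3pi_2 - i_pi_2, i0 - i_pi)

# Illustrative 2x2 captures of a fringe I_s = A + B*cos(phi + s); the
# offset A and amplitude B are hypothetical, not from the text.
phi_true = np.array([[0.5, 1.0], [1.5, 2.0]])
A, B = 128.0, 100.0
captures = [A + B * np.cos(phi_true + s)
            for s in (0.0, np.pi / 2, np.pi, 3 * np.pi / 2)]
phi = phase_from_four_shifts(*captures)
```

For phases inside (−π, π], the recovered `phi` matches the fringe phase exactly; in practice, a separate phase-unwrapping step precedes the triangulation.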
An example of the object 1 is illustrated in
When the method described above is applied to such an object 1 in which there is a large variation in luminance, the three-dimensional shapes of all the objects to be measured 101A to 103A cannot be measured in one measurement, as described below.
Therefore, such an object 1 in which there are large variations in luminance usually needs to be subjected to measurement three times at different exposure times, and this procedure further needs to be performed for each phase shift.
The image processing apparatus 10 according to the present embodiment includes an extended control unit 30 so as to measure, in a single imaging step, such an object 1 in which there are large variations in luminance. The extended control unit 30 includes a luminance distribution acquisition unit 31 and a projection luminance setting unit 32. The luminance distribution acquisition unit 31 is configured to acquire luminance distribution data of the object 1 from image data captured by the imaging unit 13. The projection luminance setting unit 32 is configured to set a projection luminance distribution pattern based on the luminance distribution data acquired by the luminance distribution acquisition unit 31. The extended control unit 30 causes the projection unit 12 to project uniform pattern light and, in this state, causes the imaging unit 13 to image the object 1 to acquire image data. The extended control unit 30 then causes the luminance distribution acquisition unit 31 to acquire the luminance distribution data of the object 1 from the image data, and causes the projection luminance setting unit 32 to set a projection luminance distribution pattern based on the luminance distribution data. Then, the extended control unit 30 transmits the projection luminance distribution pattern set by the projection luminance setting unit 32 to the projection pattern setting unit 21, causes the projection pattern setting unit 21 to set a projection luminance distribution fringe pattern obtained by combining the projection luminance distribution pattern and a predetermined fringe pattern, and causes the projection unit 12 to project pattern light based on the projection luminance distribution fringe pattern.
With this configuration, even when there are large variations in luminance in the object 1 as described above, the three-dimensional shape of the object 1 can be calculated in a single measurement. This process will be described in detail below.
The extended control unit 30 first causes the projection unit 12 to project uniform pattern light. The uniform pattern is, for example, a uniformly white pattern, and the luminance of the white pattern may be set appropriately. The extended control unit 30 causes the imaging unit 13 to image the object 1 onto which the uniform pattern is being projected. This imaging is performed with exposure adjusted for the intermediate luminance area 102. An example of captured image data is illustrated in
Next, the extended control unit 30 causes the luminance distribution acquisition unit 31 to acquire luminance value data based on the captured image data, and causes the projection luminance setting unit 32 to set a projection luminance distribution pattern based on the luminance value data.
An example of a projection luminance distribution pattern is illustrated in
When the object areas are set, their shapes and sizes need not be similar to those of the high luminance area 101 and the intermediate luminance area 102 and may be set appropriately, provided that the object areas include at least the objects 101A and 102A to be measured. In setting such an object area, an area having a luminance value larger than a predetermined reference luminance by at least a predetermined value is defined as the object area on the basis of a luminance value acquired by the luminance distribution acquisition unit 31. The object area may also be set at least on the basis of a difference between luminance values in the areas where the objects 101A to 103A to be measured are positioned. Furthermore, a low luminance pattern in a projection luminance distribution pattern set by the projection luminance setting unit 32 may be set to give an appropriate exposure in the high luminance area 101 or the intermediate luminance area 102 while exposure is adjusted for the low luminance area 103.
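The object-area definition above can be sketched as a simple per-pixel threshold on the capture taken under uniform pattern light. The parameter names, the threshold values, and the reduced-luminance gain below are all illustrative assumptions, not values from the specification.

```python
import numpy as np

def projection_luminance_pattern(uniform_capture, reference, margin, low_gain=0.25):
    """Sketch of the projection luminance setting step: pixels whose
    luminance under uniform pattern light exceeds the reference by at
    least `margin` form an object area and receive a reduced projection
    luminance (`low_gain`); all other pixels keep full luminance (1.0)."""
    pattern = np.ones_like(uniform_capture, dtype=float)
    object_area = uniform_capture >= reference + margin
    pattern[object_area] = low_gain
    return pattern

# Illustrative capture: the right column is a bright (high luminance) object.
capture = np.array([[10.0, 200.0], [50.0, 220.0]])
pattern = projection_luminance_pattern(capture, reference=100.0, margin=50.0)
```

A version with several object areas, as described above, would simply assign a different gain per thresholded region instead of the single `low_gain`.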
The low luminance patterns 121 and 122 in the projection luminance distribution pattern 120 each have a constant luminance value; however, the respective luminance values of the low luminance patterns 121 and 122 may instead be set to vary.
Such a projection luminance distribution pattern 120 is transmitted to the projection pattern setting unit 21, combined with the fringe pattern 110 described above, and formed into a projection luminance distribution fringe pattern 130 as illustrated in
The projection unit 12 projects the projection luminance distribution fringe pattern 130, and the imaging unit 13 performs imaging in this state. An example of the captured image data is illustrated in
The image processing apparatus 10 according to this embodiment can measure a three-dimensional shape of a normal object by using a normal method, but for the object 1 in which there are large changes in luminance, as illustrated in
The above-described process will be summarized as follows.
The process includes a luminance distribution acquisition step of causing the imaging unit 13 to image the object 1 onto which uniform pattern light is being projected and acquiring, by the luminance distribution acquisition unit 31, luminance distribution data of the object 1 from captured image data, a projection luminance setting step of setting, by the projection luminance setting unit 32, the projection luminance distribution pattern 120 based on the luminance distribution data acquired in the luminance distribution acquisition step, a projection step of projecting, by the projection pattern setting unit 21, pattern light onto the object 1 based on the projection luminance distribution fringe pattern 130 obtained by combining the projection luminance distribution pattern 120 set in the projection luminance setting step with the predetermined fringe pattern 110, an imaging step of imaging the object 1 onto which the pattern light based on the projection luminance distribution fringe pattern 130 is projected in the projection step, and acquiring captured image data by the image acquisition unit 22, a luminance value acquisition step of acquiring a luminance value of a pixel corresponding to the object 1 from the captured image data (
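The summarized steps can be sketched end-to-end as follows. The fringe period, the pattern sizes, and the per-pixel multiplicative combination of patterns 120 and 110 are assumptions for illustration; the specification does not fix the combination rule.

```python
import numpy as np

FRINGE_PERIOD = 32  # hypothetical fringe period in pixels

def fringe_pattern(shape, phase_shift):
    """Predetermined fringe pattern 110: luminance varies sinusoidally
    along x, normalized here to the range 0..1."""
    x = np.arange(shape[1])
    row = 0.5 + 0.5 * np.cos(2.0 * np.pi * x / FRINGE_PERIOD + phase_shift)
    return np.tile(row, (shape[0], 1))

def combined_pattern(luminance_pattern, phase_shift):
    """Projection luminance distribution fringe pattern 130, assumed here
    to be the per-pixel product of the projection luminance distribution
    pattern 120 and the fringe pattern 110."""
    return luminance_pattern * fringe_pattern(luminance_pattern.shape, phase_shift)

# One combined pattern per phase shift; in the method above, each pattern
# would be projected and the object imaged once (four captures in total).
luminance_pattern = np.ones((4, 64))
luminance_pattern[:, :32] = 0.25  # reduced luminance over a bright object area
patterns = [combined_pattern(luminance_pattern, s)
            for s in (0.0, np.pi / 2, np.pi, 3 * np.pi / 2)]
```

Because the luminance reduction is identical across the four shifted projections, it cancels in the ratio of formula (1), so the phase recovery is unaffected by the per-area attenuation.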
The present embodiment relates to an image processing apparatus having a function of expanding the projectable luminance range of the projection unit 12. The other portions are similar to those in Embodiment 1, and repeated description thereof will be omitted.
There may be a case in Embodiment 1 where the projection luminance distribution fringe pattern 130 cannot be projected within the projectable luminance range of the projection unit 12. According to this embodiment, however, the projection luminance range can be expanded using software, so that projection can be performed even when the luminance range of the projection luminance distribution fringe pattern 130 is large. Note that this method can be used not only for projection of the projection luminance distribution fringe pattern 130 but also for projection of the fringe pattern 110; in this case, a sinusoidal fringe pattern with a large luminance difference is formed, which advantageously improves accuracy in three-dimensional shape measurement.
In general, the output luminance of the projection unit 12 is not linear but curved with respect to an input value. Therefore, normally, a range having a response closer to a linear response, for example, an area R of
In this case, the fringe pattern is represented by the pattern O1 in
In this embodiment, a relationship between input luminance and measured luminance corresponding to the input luminance is measured in advance, and the measured relationship is stored as a correction table. Input luminance corresponding to predetermined output luminance is determined as set luminance S1 on the basis of a straight line MN connecting a point M, corresponding to the minimum input luminance of 0, and a point N, corresponding to the maximum input luminance of 4095. For the input luminance actually applied, however, the input value whose measured luminance in the correction table equals the measured luminance of the set luminance on the straight line MN is obtained, and this value is defined as actually set luminance S2. This will be described with reference to
As described above, set luminance corresponding to luminance to be projected is determined in accordance with a proportional relationship of a minimum value and a maximum value of the input luminance to a minimum value of measured luminance and a maximum value of measured luminance corresponding to the minimum value and the maximum value of the input luminance, and actually set luminance corresponding to measured luminance in the correction table corresponding to the set luminance is projected as input luminance. As a result, a sinusoidal waveform having a wide luminance range as illustrated in
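The table lookup above can be sketched as an inverse interpolation of a monotonic response curve. The 12-bit input range matches the 0 to 4095 range described above, but the gamma-like response modeled below is a hypothetical stand-in for a correction table that would in practice be measured.

```python
import numpy as np

# Hypothetical correction table: measured luminance for each 12-bit input
# value, modeled here as a gamma-like curve (a real table would be measured).
input_luminance = np.arange(0, 4096)
measured_luminance = 255.0 * (input_luminance / 4095.0) ** 2.2

def actually_set_luminance(set_luminance):
    """Given set luminance S1 (a point on the straight line MN between the
    minimum and maximum of the response), return the input value S2 whose
    measured luminance in the correction table matches the target, found
    by inverse interpolation of the monotonic table."""
    # Target measured luminance under the proportional (linear) relationship MN.
    lo, hi = measured_luminance[0], measured_luminance[-1]
    target = lo + (hi - lo) * set_luminance / 4095.0
    return float(np.interp(target, measured_luminance, input_luminance))

s2 = actually_set_luminance(2047.5)  # mid-scale set luminance S1
```

Because the modeled response is sublinear at mid-scale, the actually set luminance S2 lands well above the mid-scale input, compensating the curved response so that the projected output follows the straight line MN.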
The entire disclosure of Japanese Patent Application No. 2016-250626, filed Dec. 26, 2016 is expressly incorporated by reference herein.