The present invention relates to a target location determination device that, by image processing a captured image of a color target consisting of a combination of different colors, determines a location of the color target in the captured image.
In recent years, an increasing number of vehicles have come equipped with a camera so as to enable the driver of the vehicle to visually check the sides and rear of the vehicle and the like via a monitor in the vehicle. Furthermore, devices that perform image processing and the like utilizing images captured by this camera and assist with parking and the like have also been developed. Particularly with regard to cameras that import captured images serving as a basis for creating information utilized for positioning a vehicle or the like, highly accurate calibration such as optical axis alignment is demanded. The above target location determination device is used for calibration processing of such in-vehicle cameras. For example, technology is known that involves using an in-vehicle camera to capture an image of a marker (target) having a black and white checkered pattern that is disposed in two places within the field of view of the camera, detecting a center point (calibration point) of the marker through image processing, and calibrating the in-vehicle camera (e.g., see Patent Document 1). In the case of using a target having a black and white pattern, a black and white camera adjusted so that the luminance difference of the black and white pattern is clear needs to be utilized in order to appropriately bring out that luminance difference. However, color cameras that generate a color image are now often employed, from the point of view of visibility and the like, as the camera used to provide a view of the sides and rear of the vehicle and the like on a monitor in the vehicle. In the case where a color image signal output from a color camera is used for luminance difference evaluation of a black and white pattern, the possibility arises that the luminance difference of the black and white pattern will not be sufficiently obtained.
A target location specification device using a color camera and a color target is also known (e.g., see Patent Document 2). This target location specification device is provided with a color difference conversion portion that processes pixel values of a captured image obtained by capturing an image of a target consisting of a combination of a first color and a second color, and generates a first color component value, a second color component value and a luminance value, a color region determination portion that determines the regions of the first color and the second color based on the first color component value and the second color component value using a determination condition that is based on the luminance value, a boundary detection portion that detects a boundary between the first color and the second color based on the determination result, and a target location computation portion that computes a location of the target based on the boundary detection result. Because color image signals are constituted by three color component signals (e.g., RGB signals), the calculation load of the image processing calculations is greater than with a black and white image (grayscale image), and the cost burden also increases.
[Patent Document 1]: JP 2008-131250A (para. 0023-0040,
[Patent Document 2]: WO 2010/016379 (para. 0006-0013,
In view of the above situation, an object of the present invention is to provide a target location determination device that is able to accurately recognize a color pattern while reducing the calculation load, even when using a color camera and a color target.
A characteristic of the present invention in a target location determination device for determining a location of a color target in a captured image based on color image data obtained by capturing an image of the color target, which consists of a combination of a first target region that is a first color and a second target region that is a second color having a different color component from the first color, is the provision of a grayscaling profile storage portion that stores a grayscaling profile for respectively converting the first color and the second color into a first monochrome color and a second monochrome color so that a luminance difference between the first monochrome color converted from the first color and the second monochrome color converted from the second color is greater than a luminance difference between the first color and the second color, a grayscaling portion that converts the color image data into grayscale image data using the grayscaling profile, and a target location determination module that determines a target location in the captured image by recognizing a boundary between the first target region and the second target region of the color target in the grayscale image data.
This configuration enables the boundary between the region that is the first monochrome color and the region that is the second monochrome color to be accurately detected, because a grayscaling profile that increases the luminance difference between the first monochrome color into which the first color is converted and the second monochrome color into which the second color is converted is used when converting from color image data into grayscale image data. Moreover, the calculation load is reduced compared with color processing, because this boundary detection is performed based on a grayscale image, or in other words, based only on luminance values. Because the color balance (gray balance, to be precise) of a captured image that has been grayscaled in this way is greatly out of adjustment, it is better to use the color image data before grayscaling in the monitor display for the driver to check the area around the vehicle.
As for specifically creating the abovementioned grayscaling profile, preferably the grayscaling profile is created so as to approach white as the color component of the first color increases, and to approach black as the color component of the second color increases. For example, the first color is red and the second color is blue, so as to be easily discernible even with the naked eye. In the case where the RGB values, which are pixel values, contain a high proportion of the R component value compared with the other two color component values, a first relationship in which the profile will be white is produced. Similarly, in the case where the RGB values contain a high proportion of the B component value compared with the other two color component values, a second relationship in which the profile will be black is produced. A grayscaling profile that represents a relationship in which the profile becomes whiter as the red component increases and blacker as the blue component increases is created by merging the first and second relationships.
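As a minimal illustrative sketch of such a merged relationship (the linear weighting below is a hypothetical example and not part of the claimed subject matter), a conversion that approaches white as the R component increases and approaches black as the B component increases could look like:

```python
def grayscale_red_blue(r, g, b):
    """Hypothetical grayscaling profile for a red/blue target: the output
    approaches white (255) as the R component dominates and approaches
    black (0) as the B component dominates. Standard luminance weighting
    (0.299*R + 0.587*G + 0.114*B) would map red and blue to similar
    grays, so here R is weighted positively and B negatively instead."""
    n = 128 + (r - b) * 0.5  # hypothetical weighting
    return max(0, min(255, int(n)))
```

With this sketch, a reddish pixel such as (240, 30, 30) maps to a light gray and a bluish pixel such as (10, 20, 240) to a dark gray, yielding a far larger luminance difference than standard luminance weighting would produce for the same pair.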
Although distinguishing by luminance difference is difficult with typical grayscaling, a color target whose first color is red and whose second color is blue is a favorable application example of the present invention, considering that red and blue targets are used comparatively often since distinguishability with human vision is good.
It is also conceivable for the target colors of the color target to differ depending on intended use and other factors. So as to be able to respond to such cases, in one favorable embodiment of the present invention, a grayscaling profile for every different color combination of the first color and the second color utilized in the color target is stored in the grayscaling profile storage portion. This configuration enables various color patterns to be accurately recognized by creating and storing grayscaling profiles compatible with the combinations of target colors of the color target that could possibly be used beforehand.
Depending on the type of illumination light source that illuminates the color target, the problem of the original first color and second color shifting significantly within the color image data acquired by a color camera may arise. In order to reduce such a problem, the colors after the original first color and second color have shifted due to a specific type of illumination light source may be derived beforehand, and a grayscaling profile that uses those colors as the first color and the second color can be created for that type of illumination light source. For this purpose, in one favorable embodiment of the present invention, a grayscaling profile is created for every type of illumination light source that illuminates the color target, and stored in the grayscaling profile storage portion.
Hereafter, embodiments of the present invention will be described based on the drawings. The principles of a grayscaling process employed by the target location determination device according to the present invention are schematically shown in
Color image data is generated as a result of an image of this color target 2 being captured with a color camera, and, here, the pixel value of the first color and the pixel value of the second color in the color image data are also respectively represented by C1 (R1, G1, B1) and C2 (R2, G2, B2). The conversion of this color image data into grayscale image data is the grayscaling process. At this time, the first color: C1 (R1, G1, B1) and the second color: C2 (R2, G2, B2) in the color image data are respectively converted into a first monochrome color: D1 (N1) and a second monochrome color: D2 (N2) in the grayscale image data. Note that in the case where the color image data is 32-bit color image data, R1, G1 and B1 and R2, G2 and B2 take values from 0 to 255, and in the case where the grayscale image data is 8-bit grayscale image data, N1 and N2 also take values from 0 to 255. C (0, 0, 0) and D (0) are assumed to represent black, and C (255, 255, 255) and D (255) are assumed to represent white. In this invention, the grayscaling process is intended to make the luminance difference between the first monochrome color converted from the first color and the second monochrome color converted from the second color greater than the luminance difference between the first color and the second color.
Conversion from one color space to another color space (including a black and white color space) is performed using a color conversion profile called a color conversion matrix, but given that conversion here is to a grayscale space, this profile will be called a grayscaling profile. This grayscaling profile is simply:
M[C(R, G, B)]=D(N).
Here, R, G and B can be represented as the pixel values of color image data and N can be represented as the pixel value of grayscale image data.
Note that a color conversion profile as referred to here is also used during normal times (during use by a user) for color adjustment of camera images viewed by the user on a display device, and functions as a grayscaling profile during target location determination according to the present invention. Although another color pixel value Cn (Rn, Gn, Bn) is derived from a given color pixel value Cm (Rm, Gm, Bm) when used as a color conversion profile during normal times (during use by a user), one grayscale pixel value D (N) is derived from a given color pixel value Cm (Rm, Gm, Bm) when used as a grayscaling profile.
In
In order to avoid such a problem, in the present invention, a grayscaling profile according to which the prescribed first color and second color, when grayscaled, are converted into a first monochrome color and a second monochrome color that have a large luminance difference is prepared. For example, as schematically shown in (b) of
M(10, 20,240)=20, and
M(240, 30, 30)=185.
This grayscaling profile is preferably created so that the R, G and B values of bluish colors are converted as continuously as possible into regions that will be dark gray, and so that the R, G and B values of reddish colors are converted as continuously as possible into regions that will be light gray.
To create a grayscaling profile that supports such continuity, a configuration can be adopted in which the D value is computed using a weight calculation that sets a weight coefficient depending on the amount of transition, for R, G and B values that transition from pure blue, represented by the R, G and B values (0, 0, 255), to approximate blues.
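One way to realize such a weight calculation can be sketched as follows. This is an illustrative example only: it assumes the weight coefficient falls off with the Euclidean distance in RGB space from each anchor color, and the dark and light output grays (20 and 185) are taken from the example mapping given earlier.

```python
import math

def transition_weight(rgb, anchor):
    """Weight coefficient that decreases with the amount of transition
    (Euclidean distance in RGB space) away from an anchor color such as
    pure blue (0, 0, 255): 1.0 at the anchor, 0.0 at the farthest corner."""
    return 1.0 - math.dist(rgb, anchor) / (math.sqrt(3) * 255)

def profile_value(rgb):
    """Blend a dark gray for blue-like pixels and a light gray for
    red-like pixels, weighted by proximity to pure blue and pure red,
    so that approximate blues and reds are converted continuously."""
    w_blue = transition_weight(rgb, (0, 0, 255))
    w_red = transition_weight(rgb, (255, 0, 0))
    n = (w_blue * 20 + w_red * 185) / (w_blue + w_red)
    return int(round(n))
```

Because the weights vary continuously with the amount of transition, approximate blues and approximate reds are converted smoothly into dark and light grays respectively, rather than jumping at a hard boundary in color space.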
By using a grayscaling profile created in this way, grayscale image data in which a large luminance difference occurs between the first monochrome color and the second monochrome color is output. By using grayscale image data having such a luminance difference, detection of the boundary line between the first target region colored the first color and the second target region colored the second color can be easily and precisely performed.
As shown in
In
The target 2 is disposed in at least two places within the range of the field of view of the camera 11. Also, the target 2 is disposed so that the coordinates thereof are known in the world coordinate system. In this example, the target 2, as shown in
In the example shown in
The dimensions of the target 2 are appropriately set so as to be able to accurately detect the calibration point Q, according to the resolution of the camera 11, the performance of the image processing function for processing images captured by the camera 11, the disposition location of the target, and the like. As an example, in the case where D1 and D2 are 1-2 m, and W1 and W2 are around 0.5 m, a target 2 whose black and white regions are each 10-15 cm square and whose entirety is 20-30 cm square is used, as shown in
In this embodiment, the target location specification device according to the present invention is substantially constituted by an image processing unit whose core member is a computer, and a block diagram schematically showing the image processing function is shown in
The grayscaling module 4 includes a target color setting portion 41, a grayscaling profile storage portion 42, a grayscaling profile selection portion 43, and a grayscaling portion 44. The target color setting portion 41 sets the first color and the second color, which are the characteristic colors of the target 2 to be processed, through an input operation from a keyboard 14. A method of estimating and setting the first color and the second color from the input color image data may, of course, be employed.
The grayscaling profile storage portion 42 stores a grayscaling profile (color space conversion matrix) serving as a grayscale conversion table used in the grayscaling process that was described using
The grayscaling profile selection portion 43 selects a conforming grayscaling profile from the color configuration of the target 2 set by the target color setting portion 41, here, the color configuration of a blue and red checkered pattern, and provides the selected grayscaling profile to the grayscaling portion 44. The grayscaling portion 44 generates grayscale image data from the color image data using the grayscaling profile selected by the grayscaling profile selection portion 43.
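The conversion performed by the grayscaling portion 44 can be sketched as a per-pixel application of the selected profile. This is a minimal illustration only, assuming the color image data is held as rows of (R, G, B) tuples and the selected grayscaling profile is a callable M(r, g, b) → N:

```python
def apply_grayscaling_profile(color_image, profile):
    """Convert color image data into grayscale image data by applying
    the grayscaling profile M to every (R, G, B) pixel value."""
    return [[profile(r, g, b) for (r, g, b) in row] for row in color_image]

# Tiny example using the illustrative mapping given earlier,
# M(10, 20, 240) = 20 and M(240, 30, 30) = 185:
lut = {(10, 20, 240): 20, (240, 30, 30): 185}
gray = apply_grayscaling_profile(
    [[(10, 20, 240), (240, 30, 30)]],
    lambda r, g, b: lut[(r, g, b)],
)
```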
The target location determination module 5, in this embodiment, includes a preprocessing portion 51, a luminance difference computation portion 52, a threshold setting portion 53, a target region determination portion 54, a boundary detection portion 55, and a target location computation portion 56. The preprocessing portion 51 performs processing such as correction of image distortion caused by the lens characteristic of the camera 11 if needed. The luminance difference computation portion 52 computes the luminance difference between the first monochrome color and the second monochrome color obtained as a result of the first color and the second color being grayscaled using the selected grayscaling profile. The threshold setting portion 53 sets a specific color detection threshold that serves as a determination condition for determining whether a target pixel (target region) is the first monochrome color (first color: blue) or the second monochrome color (second color: red), based on the luminance difference calculation value computed by the luminance difference computation portion 52.
The region determination portion 54 sequentially scans the grayscale image data containing the target 2, using the threshold set by the threshold setting portion 53, and determines the division between the first target region (blue region) and the second target region (red region).
The boundary detection portion 55 detects the boundary between the blue region and red region in the target 2, utilizing the result of determining the first target region (blue region) and the second target region (red region) by the region determination portion 54. Because the boundary detected by the boundary detection portion 55, or in other words, the intersection of the two boundary lines, will be the calibration point Q, the target location computation portion 56 is able to compute the location of the target 2 in the captured image, or in other words, the calibration point Q, based on the result of the boundary detection by the boundary detection portion 55.
Note that the color image signal acquired by the camera 11 and output from the image signal output portion 12 is displayed as a color captured image on a monitor 13 through a video signal generation portion 33.
Next, the flow of control in a color target location determination device provided with the grayscaling module 4 and the target location determination module 5 constituted as mentioned above will be described using
First, the vehicle 1 is positioned and stopped precisely at a prescribed location of an inspection area (#01). After checking that the vehicle has stopped, the camera 11 is operated and images of the area around the vehicle are captured (#02). The camera is set so that images of the two targets 2 are included in the color images captured by the camera 11 of the vehicle 1 stopped at the prescribed location, even if the attachment accuracy of the camera 11 is a little off. The color image data output through the image signal output portion 12 is subjected to basic image processing such as white balance correction, for example (#03).
Next, the first color and the second color, which are the characteristic colors of the target 2 to be processed that are set in advance in the target color setting portion 41, are read out as the color configuration of the target 2 to be processed (#04). The conforming grayscaling profile is selected, with the color configuration that was read out as a search keyword (#05). The color image data is converted into grayscale image data, using the selected grayscaling profile (#06).
The converted grayscale image data is used in the target location determining process. In the target location determining process, the luminance difference computation portion 52 computes a luminance difference calculation value of the first monochrome color and the second monochrome color (#11). Next, based on the luminance difference calculation value, the threshold setting portion 53 sets a detection threshold for the first target region (blue region) of the target 2 (#12) and a detection threshold for the second target region (red region) of the target 2 (#13). Because it can be inferred that the image was captured in a favorable light environment in the case where the luminance difference (luminance difference calculation value) is large, for example, it is envisioned that the detection thresholds will be rigorously set (difference between the detection threshold for the first target region (blue region) and the detection threshold for the second target region (red region) will be increased) in order to raise the accuracy of automatic detection. In contrast, because it is inferred that the image was captured in an unfavorable light environment (dark environment) in the case where the luminance difference (luminance difference calculation value) is small, it is envisioned that the detection thresholds will be set moderately (difference between the detection threshold for the first target region (blue region) and the detection threshold for the second target region (red region) will be reduced) in order, first of all, to enable automatic detection. A method that allows the detection thresholds to be varied is favorable in the case where fine adjustment is performed manually by an operator after automatic detection. Determination of the first target region (blue region) (#14) and determination of the second target region (red region) (#15) of the target 2 are performed using the thresholds set here. 
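The threshold-setting behavior of steps #12 and #13 can be sketched as follows. The margins are hypothetical; the description only specifies that the thresholds are set rigorously (further apart) when the luminance difference calculation value is large and moderately (closer together) when it is small.

```python
def set_detection_thresholds(n_first, n_second, luminance_diff):
    """Return (dark-region threshold, light-region threshold) around the
    midpoint of the two monochrome colors. A large luminance difference
    (favorable light environment) widens the gap between the thresholds
    for rigorous detection; a small difference narrows it so that
    automatic detection succeeds at all. Margin factors are hypothetical."""
    mid = (n_first + n_second) / 2
    margin = luminance_diff * (0.25 if luminance_diff >= 100 else 0.10)
    return mid - margin, mid + margin
```

Under this sketch, a pixel darker than the first threshold would be determined to belong to the first target region (blue region), and a pixel lighter than the second threshold to the second target region (red region).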
Next, the boundary line between the first target region and the second target region of the target 2 is detected from the result of determining the first target region and the second target region (#16). Of course, it is also possible to implement determination of the first target region and the second target region of the target 2 and detection of the boundary line between the first target region and the second target region at the same time. In any case, because the detected boundary line between the first target region and the second target region will indicate a configuration in which two lines are substantially orthogonal, the calibration point coordinates are computed with the intersection thereof as the calibration point (#17). Note that a configuration may be adopted in which predetermined thresholds are used, rather than performing processing that allows the detection thresholds to be varied such as the abovementioned steps #12 and #13.
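The computation of the calibration point coordinates at step #17 can be sketched with a generic line-intersection routine, under the assumption that each detected boundary line has been fitted to coefficient form a·x + b·y = c:

```python
def calibration_point(line1, line2):
    """Intersection of two boundary lines, each given as coefficients
    (a, b, c) of a*x + b*y = c, computed by Cramer's rule; returns None
    if the lines are (near-)parallel and define no calibration point."""
    (a1, b1, c1), (a2, b2, c2) = line1, line2
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-9:
        return None
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)
```

Because the two detected boundary lines are substantially orthogonal, the determinant is far from zero in practice and the intersection is well conditioned.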
The location of the target 2, or in other words, the coordinate location of the calibration point can be derived by the above processing steps. Accordingly, next, the amount of shift between the preset target calibration point and the calibration point computed at step #17 is computed (#18). Based on this computed amount of shift, the attachment accuracy determination module 6 determines the attachment accuracy of the camera 11 (#19).
(1) A functional block diagram showing an image processing function of another embodiment is shown in
The grayscaling profiles stored in the grayscaling profile storage portion 42 may be created not only for every combination of specific color configurations (first color and second color) of the target 2 as mentioned above but also for every type of light source estimated by the light source estimation portion 45. Accordingly, the grayscaling profile selection portion 43 selects a grayscaling profile, using the color configuration of the target 2 set by the target color setting portion 41 and the type of light source estimated by the light source estimation portion 45 as search keywords. This configuration enables target location determination in which adverse influence due to the type of light source when capturing an image of the target is also suppressed.
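The selection using both search keywords can be sketched as a keyed lookup; the store contents, color names and light source names below are all hypothetical placeholders:

```python
# Hypothetical contents of the grayscaling profile storage portion 42,
# keyed by (first color, second color, light source type).
PROFILE_STORE = {
    ("blue", "red", "daylight"): "grayscaling_profile_A",
    ("blue", "red", "fluorescent"): "grayscaling_profile_B",
    ("green", "orange", "daylight"): "grayscaling_profile_C",
}

def select_grayscaling_profile(first_color, second_color, light_source):
    """Select the conforming grayscaling profile with the target's color
    configuration and the estimated light source type as search keywords."""
    try:
        return PROFILE_STORE[(first_color, second_color, light_source)]
    except KeyError:
        raise LookupError(
            f"no grayscaling profile stored for "
            f"{(first_color, second_color, light_source)}"
        )
```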
(2) Each functional portion in the abovementioned grayscaling module 4 and target location determination module 5 indicates a function allocation, and does not necessarily need to be provided independently. It should be obvious that each function may be realized by the collaboration of hardware such as a microcomputer and software such as a program executed on the hardware.
(3) In the abovementioned embodiment, the target 2 to be processed by this color target location determination device is a target 2 for determining the in-vehicle camera attachment location, but may be a target for a stop mark in a parking lot or at a battery charging station. Also, the present invention may be applied, with white lines, yellow lines and the like drawn on the road regarded as targets 2.
(4) In the above embodiment, a blue/red common grayscaling profile that merges a grayscaling profile for reducing the color effect on the first color (blue) and a grayscaling profile for reducing the color effect on the second color (red) is used as a grayscaling profile having the characteristic of reducing the color effect that the target 2 receives due to the light source. Alternatively, a configuration may be adopted in which a blue correction grayscaling profile for reducing the color effect on blue and a red correction grayscaling profile for reducing the color effect on red are prepared, and these grayscaling profiles are used separately.
The present invention can be widely utilized in image processing technology for converting a color image of a target characterized by the combination of different colors into a grayscale image, and detecting the boundary between the different colors.
Number | Date | Country | Kind |
---|---|---|---|
2010-184185 | Aug 2010 | JP | national |
Filing Document | Filing Date | Country | Kind | 371c Date |
---|---|---|---|---|
PCT/JP2011/066124 | 7/14/2011 | WO | 00 | 12/19/2012 |