The present invention relates to a camera distance measurement device for measuring the distance to an object in a camera image by using, for example, an in-vehicle camera.
For example, patent reference 1 discloses a device which measures distance on the basis of the state of light arriving at an object, by using still images that are close to each other in time, each still image being captured while light is applied to the object. However, because still images captured at different times are used in such a conventional technology, represented by the technology disclosed in patent reference 1, a temporal displacement can occur in the object when a moving image is used, or when the image is captured by a camera mounted in a moving object, such as an in-vehicle camera which moves as the vehicle travels. Further, a mechanism used exclusively for applying light to the object must be prepared separately.
The present invention is made in order to solve the above-mentioned problems, and it is therefore an object of the present invention to provide a camera distance measurement device which can measure the distance to an object in a camera image.
In accordance with the present invention, there is provided a camera distance measurement device for displaying, on a display unit, an image in which a plurality of graduation lines arranged in the form of a grid with respect to a vehicle are superimposed on a camera image captured by a camera mounted to the vehicle, so as to measure a distance in the direction of the width of the vehicle and a distance in the image capturing direction of the camera from the unit distance defined for each grid side of the graduation lines, the camera distance measurement device including: a parameter storage unit for storing, as parameter information, mounting information showing a mounting position and a mounting angle at and with which the camera is mounted to the vehicle, angle of view information showing an angle of view of the camera, projection method information showing a projection method for use in a lens of the camera, and screen size information showing a screen size of the display unit; a distance measurement arithmetic operation unit for performing a process of correcting distortion of the lens of the camera on position coordinates in real space of each of the grid points defined by the plurality of graduation lines, which are arranged in the form of a grid at intervals of the unit distance, and for transforming the position coordinates of each of the grid points in which the distortion of the lens has been corrected into position coordinates in the camera image, on the basis of the mounting information, the angle of view information, the projection method information, and the screen size information read from the parameter storage unit, to create graduation line information; a line drawing unit for arranging the plurality of graduation lines, on the basis of the graduation line information, in such a way that they intersect at right angles in the form of a grid, to create a graduation line image; an image correcting unit for performing a correcting process of removing, from the camera image, the distortion of the lens of the camera and the distortion caused by the projection method; and an image superimposing unit for superimposing the graduation line image created by the line drawing unit on the camera image corrected by the image correcting unit, and for outputting the camera image on which the graduation line image is superimposed to the display unit.
Further, in accordance with the present invention, there is provided a camera distance measurement device for displaying, on a display unit, a camera image captured by a camera mounted to a vehicle, so as to measure the distance from a position in the camera image to the vehicle, the camera distance measurement device including: a parameter storage unit for storing, as parameter information, mounting information showing a mounting position and a mounting angle at and with which the camera is mounted to the vehicle, angle of view information showing an angle of view of the camera, projection method information showing a projection method for use in a lens of the camera, and screen size information showing a screen size of the display unit; an in-screen position determining unit for determining a position in the camera image displayed on the display unit; a distance measurement arithmetic operation unit for performing a process of correcting distortion of the lens of the camera on the coordinates, in the space of the camera image, of the position determined by the in-screen position determining unit, and for transforming the position coordinates in which the distortion of the lens has been corrected into position coordinates at a predetermined height from a ground surface in real space, on the basis of the mounting information, the angle of view information, the projection method information, and the screen size information read from the parameter storage unit, to create position information; and an output unit for outputting, on the basis of the position information, the distance from the position in the camera image determined by the in-screen position determining unit to the vehicle.
In accordance with the present invention, there is provided an advantage of being able to measure a distance in the direction of the width of the vehicle and a distance in the image capturing direction of the camera from the unit distance defined for each grid side of the graduation lines, and another advantage of being able to measure the distance to an object in the camera image.
Hereafter, in order to explain this invention in greater detail, the preferred embodiments of the present invention will be described with reference to the accompanying drawings.
The camera unit 3 includes a camera for capturing an image of an area surrounding the vehicle (for example, an area behind the vehicle), and transmits the camera image captured by this camera to the image correcting unit 6. The image correcting unit 6 is a component for making a predetermined correction to the camera image received from the camera unit 3, and outputs the corrected image to the image superimposing unit 8. The display unit 5 displays an image in which a graduation line image, which is created by the line drawing unit 7 and which defines distances from the vehicle, is superimposed on the camera image from the image correcting unit 6. The driver of the vehicle can thereby visually recognize the distance between the vehicle and an obstacle on the basis of the graduation lines in the image.
The parameter storage unit 4 is disposed in such a way that the distance measurement arithmetic operation unit 2 can read data from it, and stores mounting information, angle of view information, projection method information, and screen size information. The mounting information shows how the camera is mounted to the vehicle; more specifically, it shows the mounting position and the mounting angle at and with which the camera is mounted in the vehicle. The information showing the mounting position includes the height at which the camera is mounted with respect to the vehicle and the displacement of the camera from the center of the vehicle in the direction of the width of the vehicle. The angle of view information is angle information showing the range of angles within which an object can be captured by the camera of the camera unit 3, and includes either the maximum horizontal angle of view Xa and the maximum vertical angle of view Ya of the camera, or the diagonal angle of view of the camera. The projection method information shows the projection method for use in the lens of the camera of the camera unit 3. Because a fish-eye lens is used as the lens of the camera in Embodiment 1, information showing one of stereographic projection, equidistant projection, equisolid angle projection, and orthogonal projection is provided as the projection method information. The projection method information constitutes camera correction information. The screen size information shows the screen size of an image output, i.e., the display range at the time of display of an image by the display unit 5, and includes the maximum horizontal drawing pixel size Xp and the maximum vertical drawing pixel size Yp of the display unit 5.
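By way of illustration only, the parameter information described above can be gathered into a single record. The following sketch is not part of the disclosed device; the field names and the Python representation are assumptions made for clarity.

```python
from dataclasses import dataclass

@dataclass
class CameraParameters:
    """Parameter information held by the parameter storage unit 4 (illustrative)."""
    mount_height: float          # mounting height of the camera above the ground [m]
    mount_offset: float          # displacement from the center of the vehicle width [m]
    vertical_angle_deg: float    # inclination of the optical axis from a vertical line
    horizontal_angle_deg: float  # inclination from the vehicle's lengthwise center line
    max_h_angle_deg: float       # maximum horizontal angle of view Xa
    max_v_angle_deg: float       # maximum vertical angle of view Ya
    projection: str              # "stereographic", "equidistant", "equisolid", "orthogonal"
    screen_width_px: int         # maximum horizontal drawing pixel size Xp
    screen_height_px: int        # maximum vertical drawing pixel size Yp
```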
Next, the operation of the camera distance measurement device will be explained. The graduation line creating unit 9 of the distance measurement arithmetic operation unit 2 calculates the positions at which the graduation lines to be displayed on the display unit 5 are to be drawn, i.e., graduation line information showing the positions of the graduation lines in the camera image captured by the camera, on the basis of preset graduation line size information. Hereafter, a case in which the camera unit 3 is mounted to a rear portion of the vehicle, and an area behind the vehicle is defined as the image capture range, will be explained.
The graduation line creating unit 9 determines the length in the direction of the width of the vehicle in which the graduation line group consisting of the straight lines L1 is aligned on the basis of graduation line size information, defines the graduation line groups in the form of such a grid as shown in
The lens distortion function arithmetic operation unit 10 performs a lens distortion function i on the coordinates (x, y) showing a point of intersection of graduation lines calculated by the graduation line creating unit 9 to transform the coordinates (x, y) into coordinates (i(x), i(y)) which have undergone lens distortion. The lens distortion function i represents the distortion which the camera image acquired when the camera of the camera unit 3 captures an object undergoes due to the shape of the lens of the camera. For example, the lens distortion function i can be determined by using the Zhang model of lens distortion. In this model, the lens distortion is modeled as radial distortion, and, when the coordinates which have not been affected by the lens distortion are expressed as (x, y), the coordinates which have been affected by the lens distortion are expressed as (i(x), i(y)), the normalized coordinates which have not been affected by the lens distortion are expressed as (u, v), and the normalized coordinates which have been affected by the lens distortion are expressed as (ũ, ṽ), the following equations are given.
ũ = u + u(k₁r² + k₂r⁴)
ṽ = v + v(k₁r² + k₂r⁴)
r² = u² + v²
In these equations, k₁ and k₂ are coefficients of a polynomial expressing the lens distortion, caused by radial distortion, of the normalized coordinates (ũ, ṽ) which have been affected by the lens distortion with respect to the normalized coordinates (u, v) which have not been affected by the lens distortion, and are constants inherent in the lens. When the center of the radial distortion in the coordinates which have not been affected by the lens distortion is expressed as a principal point (x₀, y₀), there is a relationship given by the following equations.
i(x) = x + (x − x₀)(k₁r² + k₂r⁴)
i(y) = y + (y − y₀)(k₁r² + k₂r⁴)
where x₀ and y₀ are constants inherent in the lens. By using these relational expressions, the coordinates (x, y) which have not been affected by the lens distortion can be transformed into the coordinates (i(x), i(y)) which have been affected by the lens distortion.
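For readers who find code clearer than notation, the relational expressions above can be evaluated directly. In the sketch below the coefficients k₁, k₂ and the principal point (x₀, y₀) are placeholder values, since the real ones are inherent in each lens; r is measured from the principal point, which reduces to the normalized form r² = u² + v² when the principal point is the origin.

```python
def apply_radial_distortion(x, y, k1, k2, x0, y0):
    """Transform coordinates unaffected by lens distortion (x, y) into
    distorted coordinates (i(x), i(y)) using the radial model above."""
    r2 = (x - x0) ** 2 + (y - y0) ** 2    # squared radius about the principal point
    factor = k1 * r2 + k2 * r2 ** 2       # k1*r^2 + k2*r^4
    ix = x + (x - x0) * factor
    iy = y + (y - y0) * factor
    return ix, iy

# Illustrative values only; k1, k2, x0, y0 are constants inherent in the lens.
ix, iy = apply_radial_distortion(0.30, 0.10, k1=-0.25, k2=0.08, x0=0.0, y0=0.0)
```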
The projection function arithmetic operation unit 11 performs a function h according to the projection method, determined on the basis of the projection method information inputted thereto from the parameter storage unit 4, on the coordinates (i(x), i(y)) which have undergone the lens distortion and which are outputted from the lens distortion function arithmetic operation unit 10, to transform the coordinates (i(x), i(y)) into coordinates (h(i(x)), h(i(y))) which have undergone projection distortion. The function h expresses how far from the lens center light incident upon the lens at an angle θ is focused. When the focal length of the lens is expressed as f, the angle of incidence of the incident light, i.e., the half angle of view, is expressed as θ, and the image height on the imaging surface of the camera is expressed as Y, the function h gives the following relationships: Y = 2f·tan(θ/2) in the case of stereographic projection, Y = fθ in the case of equidistant projection, Y = 2f·sin(θ/2) in the case of equisolid angle projection, and Y = f·sin θ in the case of orthogonal projection. Therefore, by transforming the value i(x) of the coordinates (i(x), i(y)) which have undergone the lens distortion and which are outputted from the lens distortion function arithmetic operation unit 10 into the angle of incidence θ to the lens, and then substituting this angle into the corresponding projection equation, the value h(i(x)) which has undergone the projection distortion is acquired. Similarly, by transforming the value i(y) into the angle of incidence θ to the lens and substituting it into the projection equation, the value h(i(y)) is acquired. In this way, the projection function arithmetic operation unit acquires the coordinates (h(i(x)), h(i(y))) which have undergone the projection distortion.
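A compact way to summarize the four projection relationships is a single function. This sketch only restates the formulas above; the function and parameter names are illustrative.

```python
import math

def image_height(theta, f, projection):
    """Image height Y on the imaging surface for incident half angle of view
    theta [rad] and focal length f, for each projection method named above."""
    if projection == "stereographic":   # Y = 2 f tan(theta / 2)
        return 2.0 * f * math.tan(theta / 2.0)
    if projection == "equidistant":     # Y = f theta
        return f * theta
    if projection == "equisolid":       # Y = 2 f sin(theta / 2)
        return 2.0 * f * math.sin(theta / 2.0)
    if projection == "orthogonal":      # Y = f sin(theta)
        return f * math.sin(theta)
    raise ValueError("unknown projection method: " + projection)
```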
The projection plane transformation function arithmetic operation unit 12 performs a projection plane transformation function f, determined on the basis of the mounting information inputted thereto from the parameter storage unit 4, on the coordinates (h(i(x)), h(i(y))) which have undergone the projection distortion and which are outputted from the projection function arithmetic operation unit 11, to transform them into coordinates (f(h(i(x))), f(h(i(y)))) which have undergone a projection plane transformation (i.e., an imaging surface transformation). The projection plane transformation adds to the camera image the influence of the mounting state of the camera, including the mounting position and the mounting angle of the camera, because the mounting state of the camera has an influence on the camera image. The projection plane transformation function f is expressed by a geometrical function which has, as its coefficients, the height L of the mounting position of the camera with respect to the ground surface, the mounting vertical angle φ, which is the angle of inclination of the optical axis of the camera with respect to a vertical line, the mounting horizontal angle θ, which is the angle of inclination of the optical axis of the camera with respect to a center line extending in the direction of the length of the vehicle, and the distance H, which is the amount of displacement of the camera from the center of the width of the vehicle. It is assumed that the camera is mounted precisely, without rotation about its optical axis as the axis of rotation.
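The patent does not state the projection plane transformation function f in closed form, so the following is only a plausible sketch: it assembles a rigid transform from the stated coefficients L, φ, θ, and H under the stated assumption of zero rotation about the optical axis. The axis conventions (tilt about x, pan about z, z pointing up) are assumptions made for illustration.

```python
import numpy as np

def camera_pose(L, phi_deg, theta_deg, H):
    """Illustrative rigid transform built from the coefficients of f:
    mounting height L, vertical angle phi, horizontal angle theta, and
    lateral displacement H. Roll about the optical axis is assumed zero."""
    phi, theta = np.radians(phi_deg), np.radians(theta_deg)
    Rx = np.array([[1, 0, 0],                                   # tilt about x
                   [0, np.cos(phi), -np.sin(phi)],
                   [0, np.sin(phi),  np.cos(phi)]])
    Rz = np.array([[np.cos(theta), -np.sin(theta), 0],          # pan about z
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
    R = Rz @ Rx                    # camera-to-world rotation
    t = np.array([H, 0.0, L])      # camera position relative to the vehicle center
    return R, t

def to_camera_frame(p_world, R, t):
    """Express a real-space point in the camera's coordinate frame."""
    return R.T @ (np.asarray(p_world, dtype=float) - t)
```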
The image output function arithmetic operation unit 13 performs an image output function g, determined on the basis of the angle of view information and the screen size information inputted thereto from the parameter storage unit 4, on the coordinates (f(h(i(x))), f(h(i(y)))) which have undergone the projection plane transformation, to transform them into coordinates (g(f(h(i(x)))), g(f(h(i(y))))) for image output. Because the size of the camera image captured by the camera generally differs from the size of an image which can be displayed on the display unit 5, the size of the camera image is changed into one which can be displayed on the display unit 5. To this end, the image output function arithmetic operation unit 13 carries out, on the coordinates (f(h(i(x))), f(h(i(y)))) which have undergone the projection plane transformation, a transformation corresponding to this change of size, so that the graduation lines keep a scale matching that of the displayed camera image. The image output function g is expressed by a mapping function which has, as its coefficients, the maximum horizontal angle of view Xa and the maximum vertical angle of view Ya of the camera, and the maximum horizontal drawing pixel size Xp and the maximum vertical drawing pixel size Yp in the image output.
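As a sketch of the kind of mapping the image output function g performs, the angle-to-pixel mapping below uses exactly the four coefficients named above; the linearity of the mapping is an assumption made for illustration.

```python
def to_pixels(ax_deg, ay_deg, Xa, Ya, Xp, Yp):
    """Map a direction inside the camera's field of view, expressed as
    horizontal/vertical angles [deg], onto drawing pixel coordinates.
    Xa, Ya: maximum horizontal/vertical angles of view; Xp, Yp: maximum
    horizontal/vertical drawing pixel sizes (the coefficients of g)."""
    px = (ax_deg / Xa + 0.5) * Xp   # angle range -Xa/2..+Xa/2 -> pixel 0..Xp
    py = (0.5 - ay_deg / Ya) * Yp   # screen y axis points downward
    return px, py
```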
In the example explained above, the arithmetic operations are carried out on the coordinates showing each point of intersection of graduation lines in the order of the lens distortion function, the projection function, the projection plane transformation function, and the image output function. However, the order in which these functions are performed on the coordinates showing each point of intersection of graduation lines is not limited to this one.
The projection plane transformation function f in the projection plane transformation function arithmetic operation unit 12 further includes, as information showing the size of the captured camera image, the angle of view of the camera (the maximum horizontal angle of view Xa and the maximum vertical angle of view Ya of the camera). Therefore, even when a part of the camera image is extracted and displayed, the camera distance measurement device can display the graduation lines in such a way that they fit the extracted part of the camera image, by changing the angle-of-view coefficients in the projection plane transformation function f.
The image correcting unit 6 determines a function i⁻¹, which is the inverse of the lens distortion function i, on the basis of the lens distortion information on the camera of the camera unit 3, and performs this inverse function on the camera image captured by the camera unit 3. Because the camera image captured by the camera unit 3 is affected by the lens distortion, the image correcting unit can correct the camera image into one which is not affected by the lens distortion by performing the lens distortion inverse function i⁻¹ on it.
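The radial polynomial has no closed-form inverse, so in practice i⁻¹ is usually approximated numerically. A minimal sketch using fixed-point iteration, with the same placeholder coefficients as before:

```python
def undistort(ix, iy, k1, k2, x0, y0, iters=10):
    """Approximate i^-1: recover the undistorted (x, y) from distorted
    (i(x), i(y)) by fixed-point iteration on the radial model above."""
    x, y = ix, iy                      # initial guess: the distorted position
    for _ in range(iters):
        r2 = (x - x0) ** 2 + (y - y0) ** 2
        factor = k1 * r2 + k2 * r2 ** 2
        x = ix - (x - x0) * factor     # invert i(x) = x + (x - x0) * factor
        y = iy - (y - y0) * factor
    return x, y
```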
The coordinate information which defines the graduation lines which have undergone the transformation processes in the above-mentioned way is outputted from the distance measurement arithmetic operation unit 2 to the line drawing unit 7 as graduation line information. The line drawing unit 7 creates a graduation line image in which a plurality of graduation lines are arranged in such a way as to intersect at right angles in the form of a grid on the basis of the graduation line information.
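A minimal sketch of what the line drawing unit 7 might do with the graduation line information, using Pillow as an assumed drawing library; the data layout (lists of polylines in pixel coordinates) is also an assumption.

```python
from PIL import Image, ImageDraw

def draw_graduation_lines(rows, cols, size):
    """Create a transparent graduation line image from the transformed grid
    points. `rows` and `cols` are lists of polylines, each a list of (x, y)
    pixel coordinates produced by the distance measurement arithmetic unit."""
    layer = Image.new("RGBA", size, (0, 0, 0, 0))   # transparent drawing layer
    pen = ImageDraw.Draw(layer)
    for polyline in list(rows) + list(cols):
        pen.line(polyline, fill=(0, 255, 0, 255), width=2)
    return layer
```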
The image correcting unit 6 then determines a function h⁻¹, which is the inverse of the projection function h, on the basis of the projection method information, and performs this inverse function on the camera image on which the lens distortion inverse function has already been performed. Because the camera image captured by the camera unit 3 is distorted by the projection method for use in the lens, the image correcting unit can correct the camera image into one which is not affected by the projection distortion by performing the projection inverse function h⁻¹ on it.
The image superimposing unit 8 superimposes the graduation line image drawn by the line drawing unit 7 on the camera image corrected by the image correcting unit 6, as images in different layers. Of these two layers, the display unit 5 performs the image output function g on the corrected camera image to change its size to one which the display unit 5 can display. The display unit then superimposes the graduation line image on the resized camera image to create a composite image, and displays this composite image. Because any object in the camera image is affected by the lens distortion, the projection method, and the mounting state of the camera, the distance measurement arithmetic operation unit 2 can display graduation lines which fit the camera image by carrying out the coordinate transformations corresponding to the lens distortion, the projection method, and the mounting state of the camera.
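The layered superimposition can be sketched as alpha compositing; here the resize stands in for the size change performed with the image output function g, and the use of Pillow is again an assumption.

```python
from PIL import Image

def superimpose(corrected_camera_image, graduation_layer):
    """Composite the graduation line layer over the corrected camera image.
    The resize stands in for the size change performed via the image output
    function g before the two layers are merged for display."""
    base = corrected_camera_image.convert("RGBA").resize(graduation_layer.size)
    return Image.alpha_composite(base, graduation_layer)
```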
Each of the grid blocks formed by the straight lines L1a to L5a has a side extending in the direction of the width of the vehicle and having a predetermined length (e.g. 0.50 meters) and a side extending in the direction (the depth direction) perpendicular to the direction of the width of the vehicle and having a predetermined length (e.g. 0.50 meters), as shown in
As mentioned above, the camera distance measurement device in accordance with this Embodiment 1 includes: the parameter storage unit 4 for storing, as parameter information, mounting information showing the mounting position and the mounting angle at and with which the camera is mounted to the vehicle, angle of view information showing the angle of view of the camera, projection method information showing the projection method for use in the lens of the camera, and screen size information showing the screen size of the display unit; the distance measurement arithmetic operation unit 2 for performing the process of correcting distortion of the lens of the camera on the position coordinates in real space of each of the grid points defined by the plurality of graduation lines, which are arranged in the form of a grid at intervals of the unit distance, and for transforming the position coordinates of each of the grid points in which the distortion of the lens has been corrected into position coordinates in the camera image, on the basis of the mounting information, the angle of view information, the projection method information, and the screen size information read from the parameter storage unit 4, to create graduation line information; the line drawing unit 7 for arranging the plurality of graduation lines, on the basis of the graduation line information, in such a way that they intersect at right angles in the form of a grid, to create a graduation line image; the image correcting unit 6 for performing the correcting process of removing, from the camera image, the distortion of the lens of the camera and the distortion caused by the projection method; and the image superimposing unit 8 for superimposing the graduation line image created by the line drawing unit 7 on the camera image corrected by the image correcting unit 6, and for outputting the camera image on which the graduation line image is superimposed to the display unit 5. Because the camera distance measurement device is constructed in this way, it can present a graduation line image which enables the user to easily estimate the distance to an object in the camera image.
The output unit 5A is a component for outputting the distance from the vehicle to the position which is calculated by the distance measurement arithmetic operation unit 2A, and is comprised of a display unit which is a display or a sound output unit for notifying the distance by voice. The in-screen position determining unit 14 is a component for specifying an arbitrary position in the camera image displayed on the screen. For example, the in-screen position determining unit can be comprised of an input processing unit for displaying a pointer on the screen to enable the user to specify an arbitrary position by using the pointer, or a touch panel disposed on the screen on which the camera image is displayed.
Next, the operation of the camera distance measurement device will be explained. When the user specifies an arbitrary position on the camera image displayed on the screen by using the in-screen position determining unit 14, the coordinates (u, v) of the specified position in the space of the camera image are inputted to the distance measurement arithmetic operation unit 2A. The lens distortion function arithmetic operation unit 10 of the distance measurement arithmetic operation unit 2A performs a lens distortion function i on the coordinates (u, v) of the position in the camera image specified by the in-screen position determining unit 14 to transform the coordinates into coordinates (i(u), i(v)) which have undergone lens distortion. The lens distortion function i represents the distortion which the camera image acquired when the camera of the camera unit 3 captures an object undergoes due to the shape of the lens of the camera, like the one shown in above-mentioned Embodiment 1. For example, the lens distortion function i can be determined by using the Zhang model of lens distortion. In this model, the lens distortion is modeled as radial distortion, and, when the ideal image coordinates which have not been affected by the lens distortion are expressed as (u, v)ᵀ, the observed image coordinates which have been affected by the lens distortion are expressed as (ũ, ṽ)ᵀ, the ideal normalized coordinates which have not been affected by the lens distortion are expressed as (x, y)ᵀ, and the observed normalized coordinates which have been affected by the lens distortion are expressed as (x̃, ỹ)ᵀ, the following equations are given.
x̃ = x + x(k₁r² + k₂r⁴)
ỹ = y + y(k₁r² + k₂r⁴)
r² = x² + y²
In these equations, k₁ and k₂ are coefficients of a polynomial expressing the lens distortion, caused by radial distortion, of the normalized coordinates (x̃, ỹ)ᵀ which have been affected by the lens distortion with respect to the normalized coordinates (x, y)ᵀ which have not been affected by the lens distortion, and are constants inherent in the lens. When the center of the radial distortion in the coordinates which have not been affected by the lens distortion is expressed as a principal point (u₀, v₀)ᵀ, there is a relationship given by the following equations.
ũ = u + (u − u₀)(k₁r² + k₂r⁴)
ṽ = v + (v − v₀)(k₁r² + k₂r⁴)
where u₀ and v₀ are constants inherent in the lens. By using these relational expressions, the position (u, v) of an object in the space of the camera image can be transformed into the position coordinates (x, y) (= (i(u), i(v))) of the object in real space.
The projection function arithmetic operation unit 11 performs a function h according to the projection method, determined on the basis of the projection method information inputted thereto from the parameter storage unit 4, on the coordinates (i(u), i(v)) which have undergone the lens distortion and which are outputted from the lens distortion function arithmetic operation unit 10, to transform the coordinates (i(u), i(v)) into coordinates (h(i(u)), h(i(v))) which have undergone projection distortion. As in Embodiment 1, the function h expresses how far from the lens center light incident upon the lens at an angle θ is focused, and gives the following relationships when the focal length of the lens is expressed as f, the angle of incidence of the incident light, i.e., the half angle of view, is expressed as θ, and the image height on the imaging surface of the camera is expressed as Y: Y = 2f·tan(θ/2) in the case of stereographic projection, Y = fθ in the case of equidistant projection, Y = 2f·sin(θ/2) in the case of equisolid angle projection, and Y = f·sin θ in the case of orthogonal projection. Therefore, by transforming the value i(u) of the coordinates (i(u), i(v)) which have undergone the lens distortion and which are outputted from the lens distortion function arithmetic operation unit 10 into the angle of incidence θ to the lens, and then substituting this angle into the corresponding projection equation, the value h(i(u)) which has undergone the projection distortion is acquired. Similarly, by transforming the value i(v) into the angle of incidence θ to the lens and substituting it into the projection equation, the value h(i(v)) is acquired. In this way, the projection function arithmetic operation unit acquires the coordinates (h(i(u)), h(i(v))) which have undergone the projection distortion.
The projection plane transformation function arithmetic operation unit 12 performs a projection plane transformation function f, determined on the basis of the mounting information inputted thereto from the parameter storage unit 4, on the coordinates (h(i(u)), h(i(v))) which have undergone the projection distortion and which are outputted from the projection function arithmetic operation unit 11, to transform them into coordinates (f(h(i(u))), f(h(i(v)))) which have undergone a projection plane transformation (i.e., an imaging surface transformation). As in Embodiment 1, the projection plane transformation adds to the camera image the influence of the mounting state of the camera, including the mounting position and the mounting angle of the camera. The projection plane transformation function f is expressed by a geometrical function which has, as its coefficients, the height L of the mounting position of the camera with respect to the ground surface, the mounting vertical angle φ, which is the angle of inclination of the optical axis of the camera with respect to a vertical line, the mounting horizontal angle θ, which is the angle of inclination of the optical axis of the camera with respect to a center line extending in the direction of the length of the vehicle, and the distance H, which is the amount of displacement of the camera from the center of the width of the vehicle. It is assumed that the camera is mounted precisely, without rotation about its optical axis as the axis of rotation.
The image output function arithmetic operation unit 13 performs an image output function g, determined on the basis of the angle of view information and the screen size information inputted thereto from the parameter storage unit 4, on the coordinates (f(h(i(u))), f(h(i(v)))) which have undergone the projection plane transformation, to transform them into coordinates (g(f(h(i(u)))), g(f(h(i(v))))) for image output. The image output function arithmetic operation unit 13 carries out, on the coordinates (f(h(i(u))), f(h(i(v)))) which have undergone the projection plane transformation, a transformation corresponding to the change of the size of the camera image to a size which can be displayed on the display unit, so that the result keeps a scale matching that of the camera image. The image output function g is expressed by a mapping function which has, as its coefficients, the maximum horizontal angle of view Xa and the maximum vertical angle of view Ya of the camera, and the maximum horizontal drawing pixel size Xp and the maximum vertical drawing pixel size Yp in the image output.
In the example explained above, the arithmetic operations are carried out on the coordinates of the specified position in the order of the lens distortion function, the projection function, the projection plane transformation function, and the image output function. However, the order in which these functions are performed on the coordinates is not limited to this one.
The projection plane transformation function f in the projection plane transformation function arithmetic operation unit 12 further includes, as information showing the size of the captured camera image, the angle of view of the camera (the maximum horizontal angle of view Xa and the maximum vertical angle of view Ya of the camera). Therefore, even when a part of the camera image is extracted and displayed, the camera distance measurement device can display the graduation lines in such a way that they fit the extracted part of the camera image, by changing the angle-of-view coefficients in the projection plane transformation function f. As a result, the camera distance measurement device can carry out interconversion between a position in the two-dimensional space of the camera image and a position in the three-dimensional real space in which the height, as one of its dimensions, is fixed. Because this interconversion between the two-dimensional space of the camera image and the three-dimensional real space is possible, the camera distance measurement device enables the user to specify an arbitrary position in the camera image by using the in-screen position determining unit 14, and presents the corresponding position in the real space to the user on the basis of the camera image.
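Why fixing the height makes the interconversion well defined can be seen in one line of geometry: a screen position determines a viewing ray in real space, and intersecting that ray with the horizontal plane at the fixed height yields a unique point. The sketch below shows only that final intersection step; obtaining the ray direction from the inverse of the transformations described above is assumed.

```python
import numpy as np

def ray_plane_intersection(camera_pos, ray_dir, z0):
    """Intersect the viewing ray from the camera with the plane z = z0 in
    real space; returns the unique real-space point at the fixed height."""
    camera_pos = np.asarray(camera_pos, dtype=float)
    ray_dir = np.asarray(ray_dir, dtype=float)
    s = (z0 - camera_pos[2]) / ray_dir[2]   # ray parameter at the plane
    if s <= 0:
        raise ValueError("the plane z = z0 is not in front of the camera")
    return camera_pos + s * ray_dir
```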
After the distance measurement arithmetic operation unit 2A calculates the position in the real space corresponding to the position in the camera screen specified by the in-screen position determining unit 14 (the coordinate position at a height z in the real space), the output unit 5A provides this calculated value for the user. For example, in the case in which the output unit 5A is constructed of a sound output unit, the output unit announces the position calculated by the distance measurement arithmetic operation unit 2A by voice. As an alternative, in the case in which the output unit 5A is constructed of a display unit for displaying the camera image captured by the camera unit 3, the output unit displays the distance from the vehicle to the position on the display screen as characters, or maps the distance to a color and displays it in that color, thereby enabling the user to visually recognize the distance.
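A small sketch of the output step described above; the distance thresholds and the colors are illustrative assumptions, not part of the disclosure.

```python
import math

def report_distance(position, vehicle_xy=(0.0, 0.0)):
    """Distance from the vehicle to the position computed by the distance
    measurement arithmetic operation unit 2A, with a color grading for
    on-screen display (thresholds and colors are illustrative)."""
    dx = position[0] - vehicle_xy[0]
    dy = position[1] - vehicle_xy[1]
    distance = math.hypot(dx, dy)
    color = "red" if distance < 1.0 else "yellow" if distance < 3.0 else "green"
    return f"{distance:.2f} m", color
```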
As mentioned above, the camera distance measurement device in accordance with this Embodiment 2 includes: the parameter storage unit 4 for storing, as parameter information, mounting information showing the mounting position and the mounting angle at and with which the camera is mounted to the vehicle, angle of view information showing the angle of view of the camera, projection method information showing the projection method for use in the lens of the camera, and screen size information showing the screen size of the display unit; the in-screen position determining unit 14 for determining a position in the camera image displayed on the display unit; the distance measurement arithmetic operation unit 2A for performing the process of correcting the distortion of the lens of the camera on the coordinates, in the space of the camera image, of the position determined by the in-screen position determining unit 14, and for transforming the position coordinates in which the distortion of the lens has been corrected into position coordinates at a predetermined height from a ground surface in the real space, on the basis of the mounting information, the angle of view information, the projection method information, and the screen size information read from the parameter storage unit, to create position information; and the output unit 5A for outputting, on the basis of the position information, the distance from the position in the camera image determined by the in-screen position determining unit 14 to the vehicle. Because the camera distance measurement device is constructed in this way, it enables the user to recognize the distance from the vehicle to an object by transforming an arbitrary position in the camera image into a coordinate position at a height z (e.g. the height from the ground surface to the camera) in real space.
In Embodiment 3, a structure having a combination of those according to above-mentioned Embodiments 1 and 2 will be shown.
The camera distance measurement device can display such a graduation line image as shown in
As mentioned above, the camera distance measurement device in accordance with this Embodiment 3 includes, in addition to the structure according to above-mentioned Embodiment 1, the in-screen position determining unit 14 for determining a position in the camera image displayed on the display unit 5; it performs the process of correcting the distortion of the lens of the camera on the coordinates, in the space of the camera image, of the position determined by the in-screen position determining unit 14, and transforms the position coordinates in which the distortion of the lens has been corrected into position coordinates at a predetermined height from a ground surface in the real space, on the basis of the mounting information, the angle of view information, the projection method information, and the screen size information read from the parameter storage unit 4, to create position information, and the display unit 5 outputs, on the basis of the position information, the distance from the position in the camera image determined by the in-screen position determining unit 14 to the vehicle. Because the camera distance measurement device is constructed in this way, it can transform an arbitrary position in the graduation line image into a coordinate position at a height z (e.g. the height from the ground surface to the camera) in the real space, so as to enable the user to recognize the distance from the vehicle to the position, while presenting the graduation line image which facilitates the measurement of the distance to an object in the camera image.
Although in the example shown in
As mentioned above, because the camera distance measurement device in accordance with this Embodiment 4 includes the image recognizing unit 15 for detecting the position of an object in the camera image by carrying out image recognition on the camera image, the camera distance measurement device can determine an arbitrary position (the position of an obstacle) in the camera image according to the image recognition on the camera image, thereby being able to easily estimate the distance to the obstacle in the camera image.
Because the camera distance measurement device in accordance with the present invention can measure the distance to an object in the camera image, the camera distance measurement device can be applied effectively to a parking support device or the like which uses a rear camera whose image capture range is behind the vehicle.
Filing Document | Filing Date | Country | Kind | 371c Date |
---|---|---|---|---
PCT/JP2010/003785 | 6/7/2010 | WO | 00 | 9/12/2012 |