The present application claims priority under 35 U.S.C. § 119(a) to Korean Patent Application Nos. 10-2017-0074556 and 10-2017-0132250, filed on Jun. 14, 2017 and Oct. 12, 2017, respectively, which are incorporated herein by reference in their entirety.
Embodiments of the present disclosure relate to a camera angle estimation method for an around view monitoring system and, more particularly, to a camera angle estimation method for an around view monitoring system in which, in order to correct an image of a camera for generating an around view image in an around view monitoring system of a vehicle, feature points are extracted from a peripheral image captured by the camera even though no correction patterns are installed in the vicinity of the vehicle in which the camera is installed, so that an angle of the camera can be estimated on the basis of corresponding points acquired by tracking the feature points.
Recently sold vehicles are increasingly equipped with an advanced driver assistance system (ADAS) to help a driver drive safely.
As a part of such an advanced driver assistance system (ADAS), an increasing number of vehicles are equipped with an ultrasonic sensor or a rear view camera in order to reduce accidents caused by blind spots. Recently, the number of vehicles equipped with an around view monitoring (AVM) system has also increased.
In particular, the around view monitoring (AVM) system has attracted attention because it can monitor all directions over 360° around a vehicle. However, correcting the cameras after the system is installed requires a wide space in which a specific facility (for example, a lattice or a lane pattern for camera correction) is installed, as well as a person skilled in the correction work, which is disadvantageous in terms of cost and time. Therefore, there is a limitation on the wide use of the around view monitoring (AVM) system.
The background technology of the present invention is disclosed in Korean Unexamined Patent Publication No. 2016-0056658 (published on May 20, 2016, and entitled "An around view monitoring system and a control method thereof").
Various embodiments are directed to a camera angle estimation method for an around view monitoring system in which, in order to correct an image of a camera for generating an around view image in an around view monitoring system of a vehicle, feature points are extracted from a peripheral image captured by the camera even though no correction patterns are installed in the vicinity of the vehicle in which the camera is installed, so that an angle of the camera can be estimated on the basis of corresponding points acquired by tracking the feature points.
In an embodiment, a camera angle estimation method for an around view monitoring system includes: uniformizing and extracting, by a control unit of the around view monitoring system, feature points from an image of each of at least three cameras; acquiring, by the control unit, corresponding points by tracking the extracted feature points; integrating, by the control unit, the corresponding points acquired from the image of each of the cameras with one another; estimating, by the control unit, vanishing points and vanishing lines by using the integrated corresponding points; and estimating, by the control unit, an angle of each of the cameras on the basis of the estimated vanishing points and vanishing lines.
In an embodiment, the feature points are points that are easily distinguished from the surrounding background in the image of each of the cameras, that are easily distinguished even when a shape, a size, or a position of an object changes, and that are easily found in the image even when the point of view of the camera or the illumination changes.
In an embodiment, in the uniformizing and extracting of the feature points, the control unit divides the image of each of the cameras into a plurality of preset areas in order to uniformize the feature points, and allows a predetermined number of feature points to be forcibly extracted from each of the divided areas.
In an embodiment, the image of each of the at least three cameras is an image captured continuously or sequentially, and includes an image captured immediately after a previously captured image or a camera image having a temporal difference of a specific number of frames or more.
In an embodiment, in order to estimate the vanishing points and the vanishing lines, the control unit draws virtual straight lines, along which two corresponding points extend in a longitudinal direction, to estimate one vanishing point at a spot at which the virtual straight lines cross each other, and draws virtual straight lines, which respectively connect both ends of the two corresponding points to each other, to estimate a remaining vanishing point at a spot at which the virtual straight lines extend and cross each other, thereby estimating a vanishing line that connects the two vanishing points to each other.
In an embodiment, in the estimating of the angle of each of the cameras, the control unit estimates, as the angle of each of the cameras, a rotation matrix Re for converting a coordinate system of a real world road surface to an image coordinate system on which a distortion-corrected image is displayed.
In accordance with an embodiment, in order to correct an image of a camera for generating an around view image in an around view monitoring system of a vehicle, feature points are extracted from a peripheral image captured by the camera even though no correction patterns are installed in the vicinity of the vehicle in which the camera is installed, and an angle of the camera is estimated on the basis of corresponding points acquired by tracking the feature points. The image of the camera is then automatically corrected on the basis of the estimated angle of the camera, so that an exact around view image can be generated more simply.
Hereinafter, a camera angle estimation method for an around view monitoring system will be described below with reference to the accompanying drawings through various examples of embodiments.
As illustrated in the accompanying drawings, the control unit 110 estimates installation angles (or camera angles) of the cameras 120, 121, and 122, and automatically corrects camera images on the basis of the estimated camera installation angles (or camera angles).
Then, the control unit 110 outputs the around view image obtained by processing the corrected camera images to the screen of the audio video navigation (AVN) device (not illustrated).
When the camera angles are estimated, the camera images can be easily corrected on the basis of the estimated camera angles. Hereinafter, in the present embodiment, a method for estimating the camera installation angles (or the camera angles) will be described in detail with reference to the accompanying drawings.
As illustrated in the accompanying drawings, a plurality of cameras are installed in the vehicle. For example, six cameras may be installed at the front, rear, right side (for example, a right side view mirror), left side (for example, a left side view mirror), inner front (for example, a rear view mirror), and inner rear of a vehicle.
As illustrated in the accompanying flowchart, the control unit 110 first uniformizes and extracts feature points from an image of each of at least three cameras (S101). The uniformization and extraction of the feature points represent the extraction of feature points uniformly distributed over the entire area of the camera image.
Furthermore, the control unit 110 tracks the uniformized feature points and acquires corresponding points (S102). In this case, since the corresponding points are acquired by tracking the feature points, the corresponding points may be linear in correspondence to the shape of the tracked feature points. The control unit 110 then integrates the acquired corresponding points with corresponding points acquired from a plurality of images (for example, camera images of three frames or more) captured continuously or sequentially (S103).
Furthermore, the control unit 110 estimates vanishing points and vanishing lines by using the acquired corresponding points and estimates camera angles on the basis of the estimated vanishing points and vanishing lines (S104).
Furthermore, the control unit 110 corrects images of the cameras on the basis of the estimated camera angles (S105).
After the camera images are corrected as described above, the control unit 110 generates an around view image by combining the corrected camera images with one another according to a predetermined around view algorithm.
Hereinafter, in the present embodiment, the method according to the steps of the flowchart described above will be described in detail. In extracting (or detecting) the feature points in step S101, a well-known feature point detection method may be used. However, the feature points may not be uniformly distributed depending on the capturing environment, and thus an estimation error of the camera angles may increase. In this regard, in the present embodiment, a process for uniformizing the distribution of the feature points is performed.
As well-known methods for detecting the feature points, there are various methods such as the Harris corner detector, the Shi and Tomasi corner detector, FAST, and DoG.
For example, when the feature points are extracted in this manner without further processing, they may be concentrated in a specific area of the image. In this regard, in the present embodiment, a camera image is divided into a plurality of preset areas in order to uniformly distribute the feature points (see (a) of the corresponding figure), and a predetermined number of feature points are forcibly extracted from each of the divided areas. Accordingly, a feature point extraction result in which the feature points are uniformly distributed over the entire image, as illustrated in (b) of the corresponding figure, can be obtained.
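By way of illustration only, the grid-based uniformization described above might be sketched as follows in Python with OpenCV; the Shi and Tomasi detector (cv2.goodFeaturesToTrack) is used here merely as one of the well-known detectors named above, and the grid size and per-cell count are assumed values, not ones taken from the disclosure.

```python
import cv2
import numpy as np

def extract_uniform_features(gray, grid=(4, 4), pts_per_cell=5):
    """Divide the image into a grid of preset areas and force up to a
    fixed number of feature points to be extracted from each area."""
    h, w = gray.shape
    ch, cw = h // grid[0], w // grid[1]
    points = []
    for r in range(grid[0]):
        for c in range(grid[1]):
            cell = gray[r * ch:(r + 1) * ch, c * cw:(c + 1) * cw]
            corners = cv2.goodFeaturesToTrack(
                cell, maxCorners=pts_per_cell,
                qualityLevel=0.01, minDistance=7)
            if corners is None:
                continue
            # Shift cell-local coordinates back into full-image coordinates.
            points.append(corners.reshape(-1, 2) + [c * cw, r * ch])
    return (np.concatenate(points).astype(np.float32)
            if points else np.empty((0, 2), np.float32))
```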
Referring again to the flowchart, the control unit 110 tracks the uniformized feature points and acquires the corresponding points (S102).
In this case, in tracking the feature points, it may be possible to use various well-known optical flow tracking methods such as census transform (CT, a method for comparing a change in the brightness of the surrounding area of a pixel with the brightness of the center pixel) and Kanade-Lucas-Tomasi (KLT).
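As a hedged example, the KLT tracking named above could be realized with OpenCV's pyramidal implementation; the window size and pyramid depth below are illustrative assumptions.

```python
import cv2

def track_features(prev_gray, curr_gray, prev_pts):
    """Track feature points from one frame to the next with pyramidal KLT
    and return only the successfully matched corresponding point pairs."""
    curr_pts, status, _err = cv2.calcOpticalFlowPyrLK(
        prev_gray, curr_gray, prev_pts.reshape(-1, 1, 2), None,
        winSize=(21, 21), maxLevel=3)
    ok = status.reshape(-1) == 1
    return prev_pts.reshape(-1, 2)[ok], curr_pts.reshape(-1, 2)[ok]
```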
In this regard, the present embodiment uses a method of extracting and tracking a small number of feature points from each camera image by using continuous camera images, acquiring corresponding points from each camera image, and integrating the corresponding points acquired from the respective camera images with one another.
For example, as illustrated in the accompanying drawings, the corresponding points acquired from a plurality of continuously captured images are integrated with one another. It can be understood that the amount of computation (that is, the computation load) used to obtain such an integrated result is relatively small, since only a small number of feature points need to be extracted and tracked in each image.
In the present embodiment, the continuously captured images (or the continuous images) are not limited to an image captured immediately after a previously captured image. That is, in the present embodiment, when a plurality of images are used, the images do not necessarily have to be consecutive, and camera images having a temporal difference of a specific number of frames or more may also be used. This represents that, for example, currently acquired corresponding points and corresponding points acquired 10 minutes earlier may be integrated with each other for use.
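A minimal sketch of this integration, assuming the two helpers sketched above and an iterable `frames` of grayscale images (an assumption for illustration): corresponding point pairs are simply accumulated across frames, whether or not those frames are consecutive.

```python
# Accumulate corresponding point pairs across several frames; per the
# embodiment, the frames need not be consecutive in time.
pairs = []          # list of (previous points, current points) arrays
prev_gray, prev_pts = None, None
for gray in frames:  # `frames` is an assumed iterable of grayscale images
    if prev_gray is not None and len(prev_pts) > 0:
        pairs.append(track_features(prev_gray, gray, prev_pts))
    prev_gray, prev_pts = gray, extract_uniform_features(gray)
```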
In the present embodiment, the control unit 110 estimates vanishing points and vanishing lines by using the corresponding points acquired from camera images, and estimates a rotation matrix between the ground and the camera.
The rotation matrix is a matrix for obtaining the coordinates of a new point when a point in a two-dimensional or three-dimensional space is rotated counterclockwise about the origin by a desired angle.
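For example, in the two-dimensional case, a counterclockwise rotation by an angle θ about the origin is given by:

$$\begin{pmatrix} x' \\ y' \end{pmatrix} = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix}$$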
For example, when a vehicle moves straight, two acquired corresponding points should be parallel to each other and should have substantially the same length on the real world road surface (a bird's-eye view), as illustrated in (a) of the corresponding figure. However, when the corresponding points are captured in a camera image, their directions and lengths are not substantially equal to each other due to perspective distortion, as illustrated in (b) of the corresponding figure. Since the two corresponding points form a parallelogram in the real world, the control unit 110 draws virtual straight lines along which the corresponding points extend in the longitudinal direction to obtain one vanishing point at the spot at which these lines cross each other, and draws virtual straight lines that respectively connect both ends of the corresponding points to each other to obtain the other vanishing point at the spot at which those lines, when extended, cross each other. A vanishing line connecting the two vanishing points to each other, as illustrated in (c) of the corresponding figure, can thereby be obtained.
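In homogeneous coordinates, the line through two points is their cross product, and the intersection of two lines is likewise their cross product, so the geometric construction above reduces to a few cross products. A hedged numpy sketch, where each track is a (start point, end point) pair in pixel coordinates and all names are illustrative:

```python
import numpy as np

def to_h(p):
    """Lift a 2-D point to homogeneous coordinates."""
    return np.array([p[0], p[1], 1.0])

def vanishing_geometry(track_a, track_b):
    """track_a, track_b: ((x, y), (x', y')) corresponding point pairs.
    Returns (v1, v2, vanishing line), all in homogeneous coordinates."""
    a0, a1 = map(to_h, track_a)
    b0, b1 = map(to_h, track_b)
    # Lines along the direction of motion of each track.
    l_a, l_b = np.cross(a0, a1), np.cross(b0, b1)
    v1 = np.cross(l_a, l_b)          # first vanishing point
    # Lines joining corresponding endpoints of the two tracks.
    m0, m1 = np.cross(a0, b0), np.cross(a1, b1)
    v2 = np.cross(m0, m1)            # second vanishing point
    vline = np.cross(v1, v2)         # vanishing line through v1 and v2
    return v1, v2, vline
```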
As illustrated in the accompanying drawings, the control unit 110 estimates, as the angle of each camera, a rotation matrix Re. This indicates the conversion of the coordinate system of the real world road surface, as illustrated in (a) of the corresponding figure, to an image coordinate system on which a distortion-corrected image is displayed. Accordingly, for the convenience of estimation, the angle Re of a camera may be estimated through reverse conversion (that is, conversion of the distortion-corrected image coordinate system to the coordinate system of the real world road surface). For example, this conversion operation may be expressed by Equation 1 below.
In Equation 1 above, K denotes a matrix including the camera internal variables, and v1′ and v2′ denote the results obtained by multiplying the vanishing points v1 and v2 by K−1, wherein K is a value obtainable in advance.
Furthermore, since p1′ denotes the straight-ahead direction of the vehicle in the image, p1′ is [1 0 0]T in the case of a side camera and [0 1 0]T in the case of the front/rear cameras.
Accordingly, in the case of the side camera, the coordinate of the first vanishing point v1′ is r1, which is the first column vector of the angle Re of the camera to be estimated, as expressed by Equation 2 below.
Meanwhile, it is not possible to fix the exact position of p2′, but since this point lies at infinity, it may be expressed as [a b 0]T.
When v2′ is converted by ReT, p2′ is obtained, so that ① of Equation 3 below may be obtained.
Furthermore, when Equation 2 above is converted in substantially the same manner, ② of Equation 3 may be obtained. Finally, when ① and ② of Equation 3 are combined with each other, ③ of Equation 3 may be obtained.
Furthermore, ③ of Equation 3 represents that r3 is a parameter of the vanishing line, which is the straight line geometrically connecting the two vanishing points v1′ and v2′ to each other.
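Since the equation images themselves are not reproduced in this text, the following is a hedged reconstruction of the relations that the surrounding description implies; the exact notation of Equations 1 to 3 in the original may differ.

$$v_1' = K^{-1} v_1 = R_e\, p_1', \qquad v_2' = K^{-1} v_2 = R_e\, p_2' \qquad (\text{cf. Equation 1})$$

$$v_1' = R_e\,[1\ 0\ 0]^T = r_1 \qquad (\text{side camera; cf. Equation 2})$$

$$R_e^T v_2' = p_2' = [a\ b\ 0]^T \;\Rightarrow\; r_3^T v_2' = 0 \;\,(①), \qquad r_3^T v_1' = 0 \;\,(②), \qquad r_3^T \begin{bmatrix} v_1' & v_2' \end{bmatrix} = 0 \;\,(③)$$

That is, r3 is orthogonal to both normalized vanishing points, which is exactly the statement that it parameterizes the vanishing line through them.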
Meanwhile, since r1 is v1′ by Equation 2 above, r1 may be calculated by the following Equation 4 for calculating vanishing points.
In the following Equation 4, (xi, yi) and (xi′, yi′) denote the coordinates of an i-th corresponding point pair.
Furthermore, r3 denotes a parameter of a vanishing line.
Accordingly, a plurality of vanishing points v2 are calculated from a combination of corresponding points and straight line estimation is performed based on the plurality of vanishing points v2, so that it is possible to calculate r3.
In this case, by using the constraint that the straight line (that is, the vanishing line r3) should pass through r1 (v1), r3 may be obtained as expressed by Equation 5 below.
In Equation 5 above, v1x and v1y denote the coordinates of the vanishing point v1 calculated through Equation 4 above, and (v2,ix, v2,iy) denote the coordinates of an i-th vanishing point obtained through the combination of the corresponding points. As described above, after r1 and r3 are calculated, r2 is calculated through a cross product of r1 and r3, so that the angle Re of the camera, which is desired to be estimated, is obtained.
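Putting the pieces together, the following is a hedged numpy sketch of the final assembly, assuming v1 and the candidate second vanishing points have already been normalized by K⁻¹; all names are illustrative, and the least-squares fits of Equations 4 and 5 are replaced here by a generic SVD fit.

```python
import numpy as np

def estimate_rotation(v1_n, v2_candidates_n):
    """v1_n: normalized first vanishing point K^-1 v1, shape (3,).
    v2_candidates_n: normalized candidate second vanishing points, shape (N, 3).
    Returns the estimated camera rotation matrix Re = [r1 r2 r3]."""
    r1 = v1_n / np.linalg.norm(v1_n)
    # Fit the vanishing line r3 through all v2 candidates, forced to pass
    # through v1 as well: r3^T v = 0 for every stacked homogeneous point v.
    A = np.vstack([v2_candidates_n, v1_n[None, :]])
    _, _, vt = np.linalg.svd(A)
    r3 = vt[-1] / np.linalg.norm(vt[-1])
    r2 = np.cross(r3, r1)            # per the text: r2 = r3 x r1
    r2 /= np.linalg.norm(r2)
    r3 = np.cross(r1, r2)            # re-orthogonalize into a proper rotation
    return np.column_stack([r1, r2, r3])
```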
As described above, in the present embodiment, it is possible to automatically estimate a camera angle of an around view monitoring (AVM) system on the basis of an image captured by a camera. Particularly, it is possible to estimate the camera angle even in a situation in which there is no special pattern (a special pattern for camera correction) such as a lattice and a lane on the ground.
Consequently, in the present embodiment, since the angle of a camera is automatically estimated while a general driver, rather than a professional engineer, drives the vehicle on an arbitrary road (anywhere), the convenience of the AVM system is improved and the installation cost is reduced. Furthermore, since the limiting conditions on the system operation environment or on the correction (such as a special facility like a tolerance correction device) are reduced, utilization is improved, so that limitations on overseas export can be resolved.
While various embodiments have been described above, it will be understood to those skilled in the art that the embodiments described are by way of example only. Accordingly, the camera angle estimation method for an around view monitoring system described herein should not be limited based on the described embodiments.
Number | Date | Country | Kind
---|---|---|---
10-2017-0074556 | Jun. 14, 2017 | KR | national
10-2017-0132250 | Oct. 12, 2017 | KR | national