Priority is claimed on Japanese Patent Application No. 2022-053944, filed Mar. 29, 2022, the content of which is incorporated herein by reference.
The present invention relates to an image processing device, an image processing method, and a storage medium.
Conventionally, a technology of estimating a road marking on a road on which a vehicle travels and controlling the traveling of the vehicle on the basis of the estimated road marking is known. For example, Japanese Unexamined Patent Application, First Publication No. 2021-60885, discloses a technology of selecting a plurality of three-dimensional objects from an image captured by a camera mounted in a vehicle, estimating road markings on the basis of positions of the selected three-dimensional objects, and setting a target speed of the vehicle according to the curvature of the estimated road markings.
In the technology described in Japanese Unexamined Patent Application, First Publication No. 2021-60885, road markings are estimated by fitting a curved line to a plurality of positions extracted from an image obtained by photographing a road on which a vehicle travels. However, in such a method, the error in the estimated road markings becomes large because the degree of freedom of the fitted curve is limited, and the estimated road markings may fluctuate because the positions extracted from the image vary at each time point. As a result, there are cases where the estimated road markings cannot be used stably for the traveling control of the vehicle.
The present invention has been made in consideration of such circumstances, and an object thereof is to provide an image processing device, an image processing method, and a storage medium capable of estimating road markings that can be stably used for traveling control of a mobile object.
An image processing device, an image processing method, and a storage medium according to the present invention have adopted the following configurations.
(1): An image processing device according to one aspect of the present invention includes a storage medium configured to store an instruction readable by a computer and a processor that is connected to the storage medium, in which the processor executes an instruction readable by the computer, thereby dividing an image representing an area in front of a mobile object, which is captured by a camera mounted on the mobile object, at predetermined intervals to generate an element function that approximates a track boundary in each of divided areas on the basis of: a probability value indicating an existence probability of the track boundary for each of coordinates in each of the areas; and the coordinates corresponding to the probability value, generating a composite function that approximates a track boundary in the front area by combining the element functions generated for each of the areas, and executing driving control or driving assistance of the mobile object on the basis of the track boundary approximated by the generated composite function.
(2): In the aspect of (1) described above, the hardware processor acquires the probability value using a learned model that has been trained to output the probability value indicating an existence probability of a track boundary for each of the coordinates when the image is input, and generates the element function on the basis of the probability value and the coordinates corresponding to the probability value.
(3): In the aspect of (1) described above, the hardware processor generates the composite function by multiplying the element function generated for each of the areas by a weighting function defined for each of the areas and taking a sum of results.
(4): In the aspect of (1) described above, the element function is a quadratic function, and the hardware processor generates the element function by iteratively updating parameters of the quadratic function that approximates the track boundary, using coordinates in descending order of the probability value.
(5): An image processing method according to another aspect of the present invention includes, by a computer, dividing an image representing an area in front of a mobile object, which is captured by a camera mounted on the mobile object, at predetermined intervals to generate an element function that approximates a track boundary in each of divided areas on the basis of: a probability value indicating an existence probability of the track boundary for each of coordinates in each of the areas; and the coordinates corresponding to the probability value, generating a composite function that approximates a track boundary in the front area by combining the element functions generated for each of the areas, and executing driving control or driving assistance of the mobile object on the basis of the track boundary approximated by the generated composite function.
(6): A computer-readable non-transitory storage medium according to still another aspect of the present invention stores a program causing a computer to execute dividing an image representing an area in front of a mobile object, which is captured by a camera mounted on the mobile object, at predetermined intervals to generate an element function that approximates a track boundary in each of divided areas on the basis of: a probability value indicating an existence probability of the track boundary for each of coordinates in each of the areas; and the coordinates corresponding to the probability value, generating a composite function that approximates a track boundary in the front area by combining the element functions generated for each of the areas, and executing driving control or driving assistance of the mobile object on the basis of the track boundary approximated by the generated composite function.
According to (1) to (6), it is possible to estimate road markings that can be stably used for traveling control of a mobile object.
Hereinafter, embodiments of an image processing device, an image processing method, and a storage medium of the present invention will be described with reference to the drawings. In the present embodiment, the image processing device is, for example, a terminal device such as a smartphone having a camera and a display. However, the present invention is not limited to such a configuration, and the image processing device may be a computer device as long as it receives at least an image captured by a camera, processes it, and outputs a result of the processing to a display. In this case, functions of the present invention are realized by the camera, the display, and the image processing device in cooperation.
[Configuration]
As shown in
[Extraction of Candidate Point]
The candidate point extraction unit 110 extracts candidate points of a boundary of a track (a track boundary) on which the host vehicle M travels on the basis of an image representing the area in front of the host vehicle M captured by the camera 10.
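For illustration only, the following Python sketch shows one hypothetical way such candidate points could be extracted, assuming the learned model outputs a per-coordinate probability map and candidates are selected by a simple threshold; the threshold and the exact extraction rule are assumptions and are not part of the above description.

```python
import numpy as np

def extract_candidate_points(prob_map, threshold=0.5):
    """Hypothetical candidate point extraction: the learned model is assumed to
    output a probability value for each image coordinate, and coordinates whose
    probability exceeds a threshold are taken as track boundary candidates.
    Returns a list of (x, y, probability) tuples."""
    ys, xs = np.nonzero(prob_map >= threshold)
    return [(float(x), float(y), float(prob_map[y, x])) for x, y in zip(xs, ys)]
```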
[Update of Model Parameter]
The model parameter updating unit 120 rearranges the candidate points of the track boundary extracted by the candidate point extraction unit 110 in descending order of the probability value, and uses them in that order to update model parameters of a track boundary model to be described below.
In Equation (1), ai(n,k), bi(n,k), and ci(n,k) represent a second order coefficient, a first order coefficient, and a constant term of a quadratic function (which may be referred to hereinafter as an “element function”) that approximates a track boundary in an image, respectively, and wi represents a weighting function that outputs a weight of 0 or more and 1 or less according to an x-coordinate of the input candidate point. More specifically, the weighting function is defined by the following Equations (2) to (4). In Equations (2) to (4), the values of xwi (i=1 to m) are fixed values set in advance. A sum of wi(x) (i=1 to m) is set to always be one.
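Because Equations (2) to (4) are not reproduced above, the exact shape of the weighting functions is not shown here. Purely as an illustrative, non-limiting sketch, the Python code below assumes triangular (piecewise-linear) weights centered on the fixed division points xwi; this satisfies the stated properties that each weight lies between 0 and 1 and that the weights sum to 1 for every x.

```python
import numpy as np

def make_weighting_functions(x_w):
    """Weighting functions w_i(x) over fixed division points x_w (hypothetical
    triangular shape; Equations (2) to (4) are not reproduced in the text).
    Each w_i peaks at x_w[i], decays linearly to zero at the neighboring
    division points, and the weights sum to one for every x."""
    x_w = np.asarray(x_w, dtype=float)
    m = len(x_w)

    def w(i, x):
        x = np.asarray(x, dtype=float)
        out = np.zeros_like(x)
        left = x_w[i - 1] if i > 0 else None
        right = x_w[i + 1] if i < m - 1 else None
        if left is None:
            out = np.where(x < x_w[i], 1.0, out)          # flat before the first division point
        else:
            rise = (x - left) / (x_w[i] - left)
            out = np.where((x >= left) & (x < x_w[i]), rise, out)
        if right is None:
            out = np.where(x >= x_w[i], 1.0, out)         # flat after the last division point
        else:
            fall = (right - x) / (right - x_w[i])
            out = np.where((x >= x_w[i]) & (x <= right), fall, out)
        return np.clip(out, 0.0, 1.0)

    return [lambda x, i=i: w(i, x) for i in range(m)]
```

With this construction, within each interval between adjacent division points only the two neighboring element functions contribute, so the composite curve transitions smoothly between areas.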
In this manner, in the present embodiment, the track boundary is approximated by a quadratic function for each partial area of the image, and the approximated quadratic functions are combined to obtain a final approximation curve, unlike a conventional method (a batch-calculation least-squares method) in which the candidate points of a track boundary in an image are approximated by a single quadratic function. The track boundary in the image can therefore be expressed with higher accuracy.
Note that, in the present embodiment, the values and the number of the x-coordinates xwi (i=1 to m) for dividing the area are fixed values set in advance. However, the present invention is not limited to such a configuration, and the values and the number of the x-coordinates xwi for dividing the area may be set, for example, on the basis of the number of clusters obtained by clustering the extracted candidate points and the boundary points between the clusters.
Next, model parameter update processing executed by the model parameter updating unit 120 will be described with reference to
First, the model parameter updating unit 120 rearranges the candidate points (x′(n,k) and y′(n,k)) of the track boundary extracted by the candidate point extraction unit 110 in descending order of the probability value, and obtains candidate points (x(n,k) and y(n,k)). The model parameter updating unit 120 generates a vector ξ(n,k)=[x(n,k)², x(n,k), 1] based on the candidate point x(n,k), and substitutes x(n,k) into the weighting function wi to obtain a weighting value wi(x(n,k)).
Next, the model parameter updating unit 120 calculates an inner product of the vector ξ(n,k) and the model parameters θi(n,k)=[ai(n,k), bi(n,k), ci(n,k)] and multiplies it by the weighting value wi(x(n,k)), thereby obtaining an element function fi(x)=wi(x(n,k))(ai(n,k)x(n,k)²+bi(n,k)x(n,k)+ci(n,k)) for each area. The model parameter updating unit 120 obtains an output estimated value y_hat(n,k) represented by Equation (1) by taking a sum of the element functions fi(x) over the areas. Initial values of the model parameters θi(n,k)=[ai(n,k), bi(n,k), ci(n,k)] used at this time will be described below.
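Using the weighting functions sketched above, the output estimated value of Equation (1) as described in this paragraph can be written as the following illustrative Python sketch (the function names and call structure are assumptions, not part of the disclosure):

```python
import numpy as np

def output_estimate(x, thetas, weight_fns):
    """Composite estimate y_hat(n,k) of Equation (1), reconstructed from the text:
    y_hat = sum_i w_i(x) * (a_i * x**2 + b_i * x + c_i).

    x          : x-coordinate of the candidate point x(n,k)
    thetas     : list of model parameter vectors theta_i = [a_i, b_i, c_i]
    weight_fns : list of weighting functions w_i(x) (e.g., from make_weighting_functions)
    """
    xi = np.array([x * x, x, 1.0])                        # regressor vector xi(n,k)
    y_hat = 0.0
    for theta_i, w_i in zip(thetas, weight_fns):
        f_i = float(w_i(x)) * float(np.dot(theta_i, xi))  # element function f_i(x)
        y_hat += f_i
    return y_hat
```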
The model parameter updating unit 120 then calculates an identification error between the output estimated value y_hat(n,k) and the y-coordinate y(n,k) of the candidate point as eid_i(n,k)=wi(x(n,k))(y(n,k)−y_hat(n,k)). By multiplying the error between the output estimated value y_hat(n,k) and the y-coordinate y(n,k) of the candidate point by the weighting value wi(x(n,k)), the identification error can be reflected for each area. The model parameter updating unit 120 defines an adaptive gain Kp for correcting the model parameter θi(n,k) in a direction that decreases the squared identification error eid_i(n,k) by the following Equation (5).
In Equation (5), P′(n,k) represents a covariance matrix of 3 rows and 3 columns and is a matrix defined by Equation (6) below.
In Equation (6), I represents a unit matrix of 3 rows and 3 columns, and λ1 and λ2 represent setting parameters of an iterative identification algorithm. λ1 and λ2 are constant values greater than 0 and equal to or less than 1. They are set as λ1=1 and λ2=1 when a least squares method is applied, as λ1=λ (0<λ≤1) and λ2=1 when a weighted least squares method is applied, and as λ1=1 and λ2=0 when a fixed gain method is applied. When the fixed gain method is applied, the adaptive gain Kp is represented by the following Equations (7) and (8).
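Equations (5) to (8) themselves are not reproduced above. As an illustrative sketch, the code below uses the standard recursive-identification form that is consistent with the described settings of λ1 and λ2; the actual patent equations may differ in detail.

```python
import numpy as np

def adaptive_gain_and_covariance(P_prev, xi, lam1=1.0, lam2=1.0):
    """Assumed recursive-identification update corresponding to Equations (5), (6):

        Kp = P_prev @ xi / (lam1 + lam2 * xi.T @ P_prev @ xi)
        P  = (P_prev - lam2 * P_prev @ xi @ xi.T @ P_prev / (lam1 + lam2 * xi.T @ P_prev @ xi)) / lam1

    lam1 = lam2 = 1                     -> least squares
    lam1 = lam (0 < lam <= 1), lam2 = 1 -> weighted least squares
    lam1 = 1, lam2 = 0                  -> fixed gain (P stays constant)
    """
    xi = xi.reshape(3, 1)
    denom = lam1 + lam2 * (xi.T @ P_prev @ xi).item()
    Kp = (P_prev @ xi) / denom
    P_new = (P_prev - lam2 * (P_prev @ xi @ xi.T @ P_prev) / denom) / lam1
    return Kp.ravel(), P_new
```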
In Equation (8), P represents an identification gain matrix. P1, P2, and P3 represent identification gains and are positive fixed values. The model parameter updating unit 120 multiplies the identification error eid_i(n,k) by the adaptive gain Kp, thereby obtaining a correction amount ddθi(n,k) of the model parameter represented by the following Equation (9).
ddθi(n,k)=Kp(n,k)eid_i(n,k) (i=1 to m)   Equation (9)
Next, the model parameter updating unit 120 multiplies a previous value dθi (n−1,k) of the final correction amount, which will be described below, by a forgetting gain Δfgt, and adds the obtained multiplied value to the correction amount ddθi(n,k) of the model parameter of Equation (9), thereby obtaining a correction amount dθraw_i(n,k) of the model parameter represented by the following Equation (10). In this manner, by defining the correction amount dθraw_i(n,k) using a value obtained by multiplying the final correction amount dθi(n−1,k) previously obtained by the forgetting gain Δfgt, abrupt fluctuations of the track boundary model can be curbed.
dθraw_i(n,k)=Δfgt·dθi(n−1,k)+Kp(n,k)eid_i(n,k)   Equation (10)
In Equation (10), the forgetting gain Δfgt is a diagonal matrix of 3 rows and 3 columns represented by the following Equation (11). In Equation (11), δfgt_1, δfgt_2, and δfgt_3 are constant values that satisfy 0<δfgt_1<1, 0<δfgt_2<1, and δfgt_3=1. That is, the forgetting gain Δfgt is set to apply a forgetting effect to ai(n,k) and bi(n,k).
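A sketch of the correction amount computation of Equations (9) to (11) follows; the forgetting gain values used here are placeholders, since the patent only constrains them as stated above.

```python
import numpy as np

def correction_amount(Kp, e_id, d_theta_prev, delta_fgt=(0.9, 0.9, 1.0)):
    """Correction amount of Equations (9) to (11):

        dd_theta    = Kp * e_id                               (Equation (9))
        d_theta_raw = Delta_fgt @ d_theta_prev + Kp * e_id    (Equation (10))
        Delta_fgt   = diag(delta_1, delta_2, 1)               (Equation (11))

    The delta values are placeholders; the text only states
    0 < delta_1 < 1, 0 < delta_2 < 1, delta_3 = 1."""
    Delta_fgt = np.diag(delta_fgt)            # forgetting gain matrix
    dd_theta = Kp * e_id                      # correction from the current identification error
    d_theta_raw = Delta_fgt @ d_theta_prev + dd_theta
    return d_theta_raw
```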
Next, the model parameter updating unit 120 performs limiter processing (an example of a “constraint condition”) represented by the following Equations (12) to (14) on the correction amount dθraw_i(n,k) of the model parameter to correct the correction amount of the model parameter and obtain a final correction amount dθi(n,k) represented by the following Equation (15). In Equations (12) to (14), daL, daH, dbL, dbH, dcL, and dcH are fixed values set in advance, and are set to prevent the track boundary model from being in an unrealistic shape.
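Equations (12) to (15) are not reproduced above; one plausible reading of the limiter processing is an element-wise clamp of the correction amount between the preset lower and upper bounds, as sketched below with placeholder bound values.

```python
import numpy as np

def apply_limiter(d_theta_raw, da=(-1e-4, 1e-4), db=(-1e-2, 1e-2), dc=(-1.0, 1.0)):
    """Limiter processing assumed for Equations (12) to (15): clamp each component
    of the correction amount between preset bounds (daL/daH, dbL/dbH, dcL/dcH).
    The bound values here are placeholders."""
    lower = np.array([da[0], db[0], dc[0]])
    upper = np.array([da[1], db[1], dc[1]])
    return np.clip(d_theta_raw, lower, upper)   # final correction amount d_theta_i(n,k)
```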
Next, the model parameter updating unit 120 adds a reference value θbase_i(n,k) of the model parameter represented by the following Equation (16) to the correction amount dθi(n,k), thereby obtaining a current model parameter value θi(n,k) represented by the following Equation (17). The calculated current model parameter value θi(n,k) is used as an identification value (an initial value) for calculating the model parameter θi(n+1,k) for a next input value n+1.
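Putting the above sketches together, one update step for a single candidate point (x(n,k), y(n,k)) could look like the following; it reuses the helper functions sketched earlier, and the way the carried-over identification values and the reference values interact is inferred from the description and is an assumption.

```python
import numpy as np

def update_step(thetas_prev, thetas_base, d_thetas_prev, Ps_prev, x, y, weight_fns,
                lam1=1.0, lam2=1.0):
    """One iterative update over all areas i = 1..m for a single candidate point
    (x, y).  thetas_prev are the identification values carried over from the
    previous candidate point; thetas_base are the reference values of Equation (16)."""
    xi = np.array([x * x, x, 1.0])
    # composite output estimate y_hat(n,k) of Equation (1), using the carried-over values
    y_hat = output_estimate(x, thetas_prev, weight_fns)

    thetas, d_thetas, Ps = [], [], []
    for theta_base, d_prev, P_prev, w_i in zip(thetas_base, d_thetas_prev, Ps_prev, weight_fns):
        e_id = float(w_i(x)) * (y - y_hat)               # area-wise identification error
        Kp, P_new = adaptive_gain_and_covariance(P_prev, xi, lam1, lam2)
        d_raw = correction_amount(Kp, e_id, d_prev)      # Equations (9) to (11)
        d_theta = apply_limiter(d_raw)                   # Equations (12) to (15)
        thetas.append(theta_base + d_theta)              # theta_i(n,k) = theta_base_i + d_theta_i
        d_thetas.append(d_theta)
        Ps.append(P_new)
    return thetas, d_thetas, Ps
```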
[Reference Value Calculation of Model Parameter]
Next, a method for calculating the reference value θbase_i(n,k) of the model parameter will be described with reference to
Equations (18) and (19) represent initial values of the model parameters abase_i(n,k) and cbase_i(n,k), respectively. As shown in Equations (18) and (19), since the track boundary model can be curved in either the left or the right direction, the initial value of the model parameter abase_i(n,k) may be zero, and since the model parameter ci corresponding to the y-intercept of the track boundary model can likewise lie on either the left or the right side, the initial value of the model parameter cbase_i(n,k) may be zero.
In Equation (20), ci(n−1,k) represents a previously calculated identification value of the model parameter ci, and a function g represents a scaling function that gives a straight line passing through the identification value (that is, the y-intercept) of the model parameter ci and a vanishing point VP of an image. That is, as shown in a left part of
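Equation (20) is not reproduced above; the sketch below reflects one plausible reading of the scaling function g, namely that the reference first-order coefficient is the slope of the straight line through the previous y-intercept and the vanishing point, with the reference values of Equations (18) and (19) set to zero.

```python
import numpy as np

def reference_parameters(c_prev, vp):
    """Reference values theta_base_i(n,k) assembled from Equations (18) to (20).
    a_base and c_base are zero (Equations (18), (19)); b_base is one plausible
    reading of the scaling function g of Equation (20): the slope of the straight
    line through the previous y-intercept (0, c_prev) and the vanishing point
    vp = (x_vp, y_vp).  The patent equation itself is not reproduced here."""
    x_vp, y_vp = vp
    b_base = (y_vp - c_prev) / x_vp        # slope of the line toward the vanishing point
    return np.array([0.0, b_base, 0.0])
```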
When the model parameter updating unit 120 determines a track boundary model for each recognition cycle, the driving control unit 130 performs automated driving or driving assistance of the host vehicle M on the basis of the determined track boundary model. More specifically, for example, the driving control unit 130 performs bird's-eye view conversion on a track boundary model in a camera coordinate system to obtain a track boundary model in a bird's-eye view coordinate system. The driving control unit 130 generates a target trajectory and an action plan of the host vehicle M using the track boundary model in the bird's-eye view coordinate system, and causes the host vehicle M to travel according to the generated target trajectory and action plan. In addition, for example, the driving control unit 130 uses the track boundary model in the bird's-eye view coordinate system to assist with steering or to issue a warning such that the host vehicle does not deviate from the determined track boundary model when an occupant of the host vehicle M performs manual driving.
When the resampling timing Tds has elapsed while the model parameter value θi(n,k−2) is being updated, the image processing device 100 performs down-sampling on the model parameter value obtained at that time (for example, θi(N(k−2)−1,k−2)) and determines the result as the final model parameter value in the recognition cycle k−2. The image processing device 100 causes the display unit 20 to display the track boundary model set with the determined model parameter value θi(N(k−2)−1,k−2). As described above, unlike the batch-calculation least-squares method, in the present embodiment the model parameter value θi(n,k) is iteratively updated using the output values in descending order of the probability value, so that a track boundary model can be estimated with high accuracy even when the data amount is so large that the calculation cannot be completed by the batch-calculation least-squares method.
The image processing device 100, in a recognition cycle k−1, rearranges the DNN output values x′(1,k−1), y′(1,k−1), x′(2,k−1), y′(2,k−1), . . . , x′(N(k−1),k−1), y′(N(k−1),k−1) in descending order of the probability value, and obtains x(1,k−1), y(1,k−1), x(2,k−1), y(2,k−1), . . . , x(N(k−1),k−1), y(N(k−1),k−1). The image processing device 100 constructs a vector ξ(n,k−1)=[x(n,k−1)², x(n,k−1), 1] in descending order of the probability value, and updates the model parameter value θi(n,k−1) by inputting it to the iterative identification algorithm as shown in
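For illustration, the per-recognition-cycle processing described above could be assembled as follows; the cut-off at the resampling timing Tds is represented by a simple maximum point count, and the reference values are held fixed within the cycle, both of which are simplifying assumptions.

```python
import numpy as np

def identify_track_boundary(candidates, thetas_init, weight_fns,
                            P0=None, max_points=None):
    """Per-recognition-cycle identification: rearrange the DNN output candidate
    points in descending order of the probability value and feed them one by one
    to the iterative update.  `candidates` is a list of (x, y, probability)
    tuples (e.g., from extract_candidate_points above); `max_points` stands in
    for the cut-off at the resampling timing Tds (a simplification)."""
    m = len(thetas_init)
    P0 = np.eye(3) if P0 is None else P0
    ordered = sorted(candidates, key=lambda c: c[2], reverse=True)
    if max_points is not None:
        ordered = ordered[:max_points]

    thetas = [np.asarray(t, dtype=float) for t in thetas_init]
    thetas_base = [t.copy() for t in thetas]          # held fixed here for simplicity
    d_thetas = [np.zeros(3) for _ in range(m)]
    Ps = [P0.copy() for _ in range(m)]

    for x, y, _p in ordered:
        thetas, d_thetas, Ps = update_step(thetas, thetas_base, d_thetas, Ps,
                                           x, y, weight_fns)
    return thetas    # final model parameter values theta_i for this recognition cycle
```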
According to the present embodiment described above, the image processing device divides an area in front of a host vehicle on a coordinate system based on the host vehicle at predetermined intervals, generates a function that approximates a road marking in each area obtained by the division on the basis of a probability value indicating an existence probability of the road marking for each of the coordinates in the area and the coordinates corresponding to the probability value, and generates a function that approximates a road marking in the front area by combining the functions generated for each area. As a result, it is possible to estimate road markings that can be stably used for traveling control of mobile objects.
The embodiment described above can be expressed as follows.
An image processing device includes a storage medium that stores an instruction readable by a computer and a processor that is connected to the storage medium, in which the processor executes an instruction readable by the computer, thereby dividing an image representing an area in front of a mobile object, which is captured by a camera mounted on the mobile object, at predetermined intervals to generate an element function that approximates a track boundary in each of divided areas on the basis of: a probability value indicating an existence probability of the track boundary for each of coordinates in each of the areas; and the coordinates corresponding to the probability value, generating a composite function that approximates a track boundary in the front area by combining the element functions generated for each of the areas, and executing driving control or driving assistance of the mobile object on the basis of the track boundary approximated by the generated composite function.
As described above, a mode for implementing the present invention has been described using the embodiments, but the present invention is not limited to such embodiments at all, and various modifications and replacements can be added within a range not departing from the gist of the present invention.
Number | Date | Country | Kind
---|---|---|---
2022-053944 | Mar 2022 | JP | national

Number | Name | Date | Kind
---|---|---|---
20160098606 | Nakamura | Apr 2016 | A1

Number | Date | Country
---|---|---
2021-060885 | Apr 2021 | JP

Number | Date | Country
---|---|---
20230316780 A1 | Oct 2023 | US