This application claims priority to German Patent Application No. 10 2020 214 619.5, filed on Nov. 20, 2020 with the German Patent and Trademark Office and to German Patent Application No. 10 2021 203 497.7, filed on Apr. 8, 2021 with the German Patent and Trademark Office. The contents of the aforesaid patent applications are incorporated herein for all purposes.
The invention relates to a method for detecting and tracking an object in the environment of a motor vehicle by means of ultrasound, and to a corresponding device.
This background section is provided for the purpose of generally describing the context of the disclosure. Work of the presently named inventor(s), to the extent the work is described in this background section, as well as aspects of the description that may not otherwise qualify as prior art at the time of filing, are neither expressly nor impliedly admitted as prior art against the present disclosure.
In order to control a vehicle, referred to in the following as the ego vehicle, in an automated or semi-automated manner, objects in the environment of the ego vehicle need to be detected. First and foremost, physical variables such as the position, orientation, speed, and dimensions of these objects, which may be either static or dynamic, are relevant here. Detecting objects in the environment of the ego vehicle using ultrasonic sensors constitutes an alternative to radar- or camera-based detection. Furthermore, it can be used as a redundant source of information, for example in order to achieve a particular ASIL level.
However, the prior art does not contain precise information regarding the technical implementation of detecting objects using ultrasonic sensors. Neither are there any explanations as to how a distance value estimated using a Kalman filter, together with its first and second derivatives, can be used to determine the relevant object variables actually of interest for objects in the environment of the ego vehicle, namely the position, orientation, speed, and dimensions of the object.
A need exists to improve the detection of an object and of its relevant object variables in the environment of a motor vehicle, referred to in the following as ego vehicle, by means of ultrasound.
The need is addressed by a method for detecting an object in the environment of an ego vehicle and by a corresponding device according to the independent claims. Embodiments of the invention are described in the dependent claims, the following description, and the drawings.
The details of one or more embodiments are set forth in the accompanying drawings and the description below. Other features will be apparent from the description, drawings, and from the claims.
In the following description of embodiments of the invention, specific details are described in order to provide a thorough understanding of the invention. However, it will be apparent to one of ordinary skill in the art that the invention may be practiced without these specific details. In other instances, well-known features have not been described in detail to avoid unnecessarily complicating the instant description.
In some embodiments, a method for detecting and determining relevant variables of an object in the environment of a motor vehicle by means of at least one ultrasonic sensor disposed laterally on the motor vehicle is provided. The ultrasonic sensor may have a sensing range defined by a sensor-specific opening angle, and the method may comprise the steps of:
In this way, the relevant object variables, such as the position, orientation, speed, and dimensions of the object, can be acquired in a model-based manner and thus in a closed, consistent form. Suitable optimization methods, which make the deviation between the modeled and the actually recorded sensor values as small as possible, include the prediction error method, a sliding mode observer, the Kalman filter, and the Levenberg-Marquardt method.
For example, the sum of the differences between the modeled sensor values and the real sensor values is minimized in order to determine the state vector and parameter vector.
For example, a modeled sensor value vector is formed from the modeled values of the at least one ultrasonic sensor, and this modeled sensor value vector is linked, for the minimization, to the sensor value vector of the real sensor values of the individual ultrasonic sensors.
For example, the modeled sensor values are a function of the position of the object relative to the ego vehicle and a function of the appearance of corners of the object within the sensing range of the at least one ultrasonic sensor, wherein the coordinates of the corners of the object are determined in the coordinate system of the motor vehicle.
For example, the relevant variables of the object are given at least by the position, the orientation, the speed, and the dimensions of the object.
For example, the modeled sensor value vector is formed from the modeled distance values of the at least one ultrasonic sensor, wherein the modeled distance value of the at least one ultrasonic sensor results from the minimum of the maximum range of the at least one ultrasonic sensor and calculated distance values between the object and the motor vehicle.
For example, the calculated distance values are ascertained as a function of the position of the object relative to the sensor normal of the at least one ultrasonic sensor of the motor vehicle, wherein the sensor normal is defined as the straight line that extends through the origin of the respective at least one ultrasonic sensor and that is perpendicular to the longitudinal axis of the object.
For example, in order to determine the calculated distance values, it is checked whether the object is located to the left of, to the right of, or on the sensor normal.
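As a purely illustrative sketch (Python, with hypothetical names; the signed distances μ1 and μ2 from the sensor normal are introduced in the detailed description below), this check may be written as a sequence of IF-THEN queries:

```python
def classify_position(mu1: float, mu2: float) -> str:
    """Hypothetical IF-THEN variant of the position check: mu1 and mu2 are
    the signed distances of the front and back object corners from the
    sensor normal, as introduced in the detailed description."""
    if mu1 < 0:
        return "A"  # case A: object to the left of the sensor normal
    elif mu2 > 0:
        return "B"  # case B: object to the right of the sensor normal
    else:
        return "C"  # case C: object on the sensor normal
```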
Furthermore, the check of the position of the object can be carried out by means of activation functions.
A device for detecting, tracking, and determining relevant variables of an object in the environment of a motor vehicle is provided in some embodiments, wherein the device is configured and designed to carry out the previously described method.
Further embodiments are discussed in greater detail with reference to the drawings.
Specific references to components, process steps, and other elements are not intended to be limiting. Further, it is understood that like parts bear the same or similar reference numerals when referring to alternate FIGS.
The object O is further characterized by its length l and width b as well as by its position {right arrow over (r)} relative to the ego vehicle E, wherein the following applies for the position {right arrow over (r)} in the xe-ye coordinate system of the ego vehicle:
{right arrow over (r)}=[x,y]T.
Furthermore, the object O is moving at a speed {right arrow over (v)} relative to the ego vehicle, wherein the following applies for the speed {right arrow over (v)} relative to the ego vehicle E:
{right arrow over (v)}=[vx,vy]T.
In order to be able to model the movement of the object O, state variables x1 to x6 are defined, which are combined into one state vector
{right arrow over (x)}=[x1,x2,x3,x4,x5,x6]T
The meaning of the individual state variables is as follows:
The replication of the movement of the object O can be expressed as follows:
{right arrow over (x)}(k+1)=A·{right arrow over (x)}(k)+K(k)·{right arrow over (e)}(k), wherein k∈1, . . . ,N, with N=number of measurements.
Here, K(k) is the matrix yet to be determined for correcting the estimation via the feedback of the error vector {right arrow over (e)}(k). The error vector {right arrow over (e)}(k) contains the deviations between the modeled values ŷi(k) and the measured values yi(k) of the various sensors i=1 to m, wherein the components of the vector {right arrow over (e)}(k) are defined as follows:
ei(k)=yi(k)−ŷi(k), with
ŷi(k)=hi({right arrow over (x)}(k),{right arrow over (θ)}(k)).
Here, hi is the so-called output function and describes the distance sensing with the sensor i as a function of {right arrow over (x)} and the parameter vector {right arrow over (θ)}, wherein the parameter vector {right arrow over (θ)} contains the unknown width and length of the object as well as the coefficients of the matrix K and is estimated using a suitable optimization method.
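A minimal sketch of one step of this estimation loop follows (Python/NumPy). The system matrix A is not specified above; a constant-velocity model with an assumed sampling time T and an assumed ordering of the velocity states is used here purely for illustration, consistent with xobj=x1, yobj=x3, and α=x5 as fixed further below:

```python
import numpy as np

T = 0.1  # assumed sampling time in seconds

# Assumed constant-velocity system matrix for the state
# x = [x_obj, vx, y_obj, vy, alpha, alpha_rate]:
A = np.array([
    [1, T, 0, 0, 0, 0],  # x position integrates x velocity
    [0, 1, 0, 0, 0, 0],
    [0, 0, 1, T, 0, 0],  # y position integrates y velocity
    [0, 0, 0, 1, 0, 0],
    [0, 0, 0, 0, 1, T],  # orientation integrates its rate
    [0, 0, 0, 0, 0, 1],
])

def observer_step(x, theta, K, y_meas, h):
    """One step x(k+1) = A·x(k) + K(k)·e(k), with e_i(k) = y_i(k) - h_i(x, theta).

    h(x, theta) is the output function returning the modeled sensor values
    [h_1, ..., h_m]; K is the 6 x m correction matrix for this step.
    """
    e = y_meas - h(x, theta)  # error vector e(k)
    return A @ x + K @ e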
It is obvious from the sequence of the approach of the object vehicle O to the ego vehicle E that, over the successive measurements,
s1>s2>s3.
The coordinate system xe,ye of the ego vehicle E is located at its center, and the zero point of the coordinate system xS,yS of the sensor SR1 considered here is disposed centrally in the transmission plane of the sensor SR1. The coordinate system of the sensor SR1 is transformed into the coordinate system of the ego vehicle E by adding the coordinates of the installation position of the sensor SR1.
The reference signs E1 to E4 denote the corners of the object O as drawn, wherein E1 forms the front left corner, E2 forms the front right corner, E3 forms the back left corner, and E4 forms the back right corner of the object. Consequently, the connection E1E2 forms the front edge and the connection E3E4 forms the back edge of the object, wherein the width of the object O is denoted by b and the length is denoted by l.
Furthermore, the sensing range W of the sensor SR1 is shown and is defined by an opening angle φ. The perpendicular a to the longitudinal axis LO of the object is drawn through the coordinate origin of the sensor SR1, serves to define and determine the distances μ1 and μ2, and is referred to as the sensor normal. Here, the distance μ1 is defined as the distance between the front edge E1E2 and the perpendicular a, and the distance μ2 is defined as the distance between the back edge E3E4 of the object O and the perpendicular a. Finally, the center point M of the object O in the xS,yS coordinate system of the sensor SR1 has the coordinates xobj and yobj.
Furthermore, the following applies for the above-defined state vector {right arrow over (x)}=[x1,x2,x3,x4,x5,x6]T:
xobj=x1(k)
yobj=x3(k)
α=x5(k),
θ1=0.5·l, and
θ2=0.5·b.
The calculation steps for determining the distance value of a sensor, in this case the sensor SR1, are thus the following:
Firstly, the distances μ1 and μ2 are determined, wherein the straight line a, referred to as the sensor normal, is expressed in the Hesse normal form, and the distances μ1 and μ2 are calculated as the orthogonal distances of the two corner points E1 and E3 from this straight line. The following equations then result:
μ1(k)=cos x5(k)·xE1(k)+sin x5(k)·yE1(k), and
μ2(k)=cos x5(k)·xE3(k)+sin x5(k)·yE3(k).
The coordinates of the corners E1 to E4 of the object O in the coordinate system xe,ye of the ego vehicle E are as follows:
E1: xE1(k)=x1(k)+θ1(k)·cos x5(k)−θ2(k)·sin x5(k), yE1(k)=x3(k)+θ1(k)·sin x5(k)+θ2(k)·cos x5(k)
E2: xE2(k)=x1(k)+θ1(k)·cos x5(k)+θ2(k)·sin x5(k), yE2(k)=x3(k)+θ1(k)·sin x5(k)−θ2(k)·cos x5(k)
E3: xE3(k)=x1(k)−θ1(k)·cos x5(k)−θ2(k)·sin x5(k), yE3(k)=x3(k)−θ1(k)·sin x5(k)+θ2(k)·cos x5(k)
E4: xE4(k)=x1(k)−θ1(k)·cos x5(k)+θ2(k)·sin x5(k), yE4(k)=x3(k)−θ1(k)·sin x5(k)−θ2(k)·cos x5(k)
Here, the coordinates are transformed from the object coordinate system x0,y0 into the ego coordinate system xe,ye using the following transformation matrix:
T(k)=[[cos x5(k), −sin x5(k)], [sin x5(k), cos x5(k)]]
The coordinates of the corner points E1, E2, E3, E4 in the object coordinate system x0,y0 are:
E1=[θ1,θ2]T, E2=[θ1,−θ2]T, E3=[−θ1,θ2]T, E4=[−θ1,−θ2]T.
Consequently, the object corners are calculated in the ego coordinate system xe,ye as follows:
[xEi(k),yEi(k)]T=[x1(k),x3(k)]T+T(k)·Ei, for i=1 to 4.
Performing the transformations leads to the above results for the corners E1, E2, E3 and E4.
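These steps can be sketched as follows (Python/NumPy); as in the formulas above, the corner coordinates are assumed to be expressed in the frame in which the sensor normal a passes through the origin:

```python
import numpy as np

def object_corners(x1, x3, x5, theta1, theta2):
    """Corners E1..E4 (front left, front right, back left, back right)
    in the ego frame, from the center [x1, x3], the orientation x5 and
    the half-dimensions theta1 = l/2, theta2 = b/2."""
    c, s = np.cos(x5), np.sin(x5)
    R = np.array([[c, -s], [s, c]])       # rotation object frame -> ego frame
    corners_obj = np.array([
        [ theta1,  theta2],               # E1
        [ theta1, -theta2],               # E2
        [-theta1,  theta2],               # E3
        [-theta1, -theta2],               # E4
    ])
    return np.array([x1, x3]) + corners_obj @ R.T

def mu_distances(corners, x5):
    """Signed distances mu1, mu2 of E1 and E3 from the sensor normal a
    (Hesse normal form, as in the equations above)."""
    c, s = np.cos(x5), np.sin(x5)
    mu1 = c * corners[0, 0] + s * corners[0, 1]
    mu2 = c * corners[2, 0] + s * corners[2, 1]
    return mu1, mu2
```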
Subsequently, the modeled or estimated distance value ŷ(k) of a sensor, in this case the sensor SR1, is determined by a case distinction A(k), B(k), C(k) according to the position of the object O relative to the sensor normal.
The distinctions described above under A(k), B(k) and C(k) are repeated for each ultrasonic sensor i=1 to m considered in the model, m being the number of sensors involved.
The total output of the complete model is then composed of the individual estimated sensor values as follows:
ŷ=[ŷ1(k),ŷ2(k), . . . ,ŷm(k)]T, wherein m denotes the number of sensors.
In the example considered here with the two sensors SR1 and SR2, this yields:
ŷ=[ŷSR1(k),ŷSR2(k)]T.
By applying a suitable optimization method, for example the prediction error method, a sliding mode observer, the Kalman filter, or the Levenberg-Marquardt method, the state vector {right arrow over (x)} and the parameter vector {right arrow over (θ)} are adapted such that the measured system output y(k) and the modeled system output ŷ(k) match as well as possible within the meaning of a defined optimization criterion.
A typical optimization criterion is the minimization of the sum of the deviations F(N) between the measured and the modeled system output, i.e.
F(N)=Σ (y(k)−ŷ(k))², summed over k=1, . . . ,N, with N=number of measurements.
This is not the only option for minimizing the deviations. Instead of squaring the difference, the absolute value of the difference between the measured and the modeled system output can be used, for example.
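As an illustration, such a minimization can be sketched with SciPy's Levenberg-Marquardt implementation, one of the methods named above; the `model` callable is a placeholder for the complete output function that maps the stacked state and parameter values to the modeled outputs for all N measurements:

```python
import numpy as np
from scipy.optimize import least_squares

def residuals(p, y_meas, model):
    """Residual vector y(k) - y_hat(k); least_squares minimizes the sum
    of its squares, i.e., the criterion F(N) above."""
    return y_meas - model(p)

def fit(p0, y_meas, model):
    """Estimate the stacked state/parameter vector p from the measurements.

    method='lm' selects Levenberg-Marquardt; it requires at least as many
    residuals as unknowns and works without bounds.
    """
    result = least_squares(residuals, p0, args=(y_meas, model), method="lm")
    return result.x
```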
The measured system output y(k) is plotted over the measurement index k, wherein the measurement index k can also be understood as the point in time t; there is therefore a unique assignment of the number k of a measurement to the point in time t. The model M, i.e., the modeled system output ŷ(k), is shown as a solid line.
Therefore, the state vector {right arrow over (x)} described at the outset and the parameter vector {right arrow over (θ)} are determined by means of the above-mentioned minimization of the sum of the deviations.
Again, the measured system output y(k) is plotted over the measurement index k, which can also be understood as the point in time t. The model M, i.e., the modeled system output ŷ(k), is shown as a solid line that, unlike in the previous case, extends obliquely in the lower region.
The state vector {right arrow over (x)} described at the outset and the parameter vector {right arrow over (θ)} can therefore be determined from the model M.
In the case of the described modeling of the distance sensing, the present situation A(k), B(k), or C(k) is checked using IF-THEN queries. Instead of this sequence of queries, in another embodiment, specially parameterized activation functions can also be used. This will be explained in the following based on the example of the check as to whether the object is located to the left of, to the right of, or on the straight line a.
The three cases A(k) to C(k) already explained above are distinguished.
A(k) μ1(k)<0
Here, the case with μ1(k)<0 is therefore relevant.
For this first case with μ1(k)<0, in which the object O is located to the left of the straight line a, the following transition function ƒ1(k) is defined:
Here, the factor β is a constant parameter that defines the steepness of the transition from the value “One” to the value “Zero” or, alternatively, from the value “Zero” to the value “One”.
Using the above transition function ƒ1(k), an activation function δ1(k) for the case μ1(k)<0 is defined as follows:
δ1(k)=(ƒ1(k)−0.5).
The course of this first activation function δ1(k) is shown in the FIGS.
B(k) μ2(k)>0
The case μ2(k)>0 will now be considered, which means that the object O is located to the right of the straight line a.
For the second case with μ2(k)>0, in which the object O is located to the right of the straight line a, the following transition function ƒ2(k) is defined:
Here, the factor β is defined as above and defines the steepness of the transition from “Zero” to “One” or vice versa.
Using the above transition function ƒ2(k), a further activation function δ2(k) for the case μ2(k)>0 is defined as follows:
δ2(k)=(ƒ2(k)−0.5).
The course of this second activation function δ2(k) is likewise shown in the FIGS.
C(k) Else
Thirdly, the case is considered in which the object O completely covers the sensing range W of the sensor SR1, i.e., is located on the straight line a.
In this case, a third activation function δ3(k) is defined as follows:
δ3(k)=(ƒ1(k)−1.5)·(ƒ2(k)−1.5)
This third activation function δ3(k) then only assumes the value of “One” if μ1(k)<0 and μ2(k)>0 apply at the same time.
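Since the transition functions ƒ1(k) and ƒ2(k) are not reproduced above, the following sketch assumes logistic transitions between "Zero" and "One" with steepness β and builds activations with the described on/off behavior (δ1 active for μ1(k)<0, δ2 active for μ2(k)>0, δ3 active only when both conditions hold at once); the exact offsets (ƒ−0.5, ƒ−1.5) in the formulas above depend on the omitted form of ƒ1 and ƒ2 and are therefore not replicated:

```python
import numpy as np

def f1(mu1, beta=10.0):
    """Assumed logistic transition: close to 1 for mu1 < 0, close to 0 otherwise."""
    return 1.0 / (1.0 + np.exp(beta * mu1))

def f2(mu2, beta=10.0):
    """Assumed logistic transition: close to 1 for mu2 > 0, close to 0 otherwise."""
    return 1.0 / (1.0 + np.exp(-beta * mu2))

def delta1(mu1, beta=10.0):
    return f1(mu1, beta)                    # object to the left of the line a

def delta2(mu2, beta=10.0):
    return f2(mu2, beta)                    # object to the right of the line a

def delta3(mu1, mu2, beta=10.0):
    return f1(mu1, beta) * f2(mu2, beta)    # both conditions at the same time
```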
By means of the activation functions δ1(k), δ2(k) and δ3(k), the query of the cases as to whether the object O is located to the left of the straight line a, to the right of the straight line a, or on the straight line a, can be represented compactly as follows:
s=δ1(k)·A(k)+δ2(k)·B(k)+δ3(k)·C(k)
In order to be able to calculate the above equation, the cases or functions A(k) to C(k) must be defined. This will be shown below based on the example of the function A(k). The functions B(k) and C(k) then result in a similar manner:
Firstly, the expressions within the “IF-THEN” queries in the case A(k) are summarized as follows:
Further activation functions, which are shown in the FIGS., are defined for the subcases of the case A(k).
Furthermore, the calculation of the distances s for the subcases A1, A2 and A3 of the case A(k) is defined as follows:
If the distance s in the case A(k) is denoted by sA, the following results for its calculation:
sA(k)=ƒρ1(k)·sA1(k)+ƒρ2(k)·sA2(k)+ƒρ3(k)·sA3(k),
wherein sA1, sA2 and sA3 denote the distances of the subcases A1, A2 and A3, and ƒρ1, ƒρ2 and ƒρ3 denote the associated further activation functions.
Consequently, the equation for the general determination of the distance s:
s=δ1(k)·A(k)+δ2(k)·B(k)+δ3(k)·C(k)
can be rewritten as:
s=δ1(k)·(ƒρ1(k)·sA1(k)+ƒρ2(k)·sA2(k)+ƒρ3(k)·sA3(k))+δ2(k)·B(k)+δ3(k)·C(k)
The further cases B(k) and C(k) can be represented in a similar manner. The final value of the modeled distance value of an ultrasonic sensor then results as:
ŷ(k)=min(s,smax), wherein smax denotes the maximum range of the at least one ultrasonic sensor.
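Putting the pieces together, a compact sketch of the modeled distance value of one ultrasonic sensor follows, again with the assumed logistic activations; sA, sB and sC stand in for the sensor-specific distance calculations of the cases A(k), B(k) and C(k):

```python
import numpy as np

def modeled_distance(mu1, mu2, s_A, s_B, s_C, s_max, beta=10.0):
    """y_hat(k) = min(s, s_max), with s blended from the case distances by
    the (assumed logistic) activation functions delta1..delta3."""
    d1 = 1.0 / (1.0 + np.exp(beta * mu1))    # active left of the sensor normal
    d2 = 1.0 / (1.0 + np.exp(-beta * mu2))   # active right of the sensor normal
    d3 = d1 * d2                             # active when the object covers it
    s = d1 * s_A + d2 * s_B + d3 * s_C
    return min(s, s_max)
```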
The invention has been described in the preceding using various exemplary embodiments. Other variations to the disclosed embodiments may be understood and effected by those skilled in the art in practicing the claimed invention, from a study of the drawings, the disclosure, and the appended claims. In the claims, the word “comprising” does not exclude other elements or steps, and the indefinite article “a” or “an” does not exclude a plurality. A single processor, module or other unit or device may fulfil the functions of several items recited in the claims.
The term “exemplary” used throughout the specification means “serving as an example, instance, or exemplification” and does not mean “preferred” or “having advantages” over other embodiments. The terms “in particular” and “particularly” used throughout the specification mean “for example” or “for instance”.
The mere fact that certain measures are recited in mutually different dependent claims or embodiments does not indicate that a combination of these measures cannot be used to advantage. Any reference signs in the claims should not be construed as limiting the scope.
Number | Date | Country | Kind |
---|---|---|---|
10 2020 214 619.5 | Nov 2020 | DE | national |
10 2021 203 497.7 | Apr 2021 | DE | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/EP2021/078237 | 10/13/2021 | WO |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO2022/106124 | 5/27/2022 | WO | A |
Number | Name | Date | Kind |
---|---|---|---|
7843767 | Hayasaka | Nov 2010 | B2 |
20080232198 | Hayasaka | Sep 2008 | A1 |
20220342061 | Pampus | Oct 2022 | A1 |
20230305112 | Podolski | Sep 2023 | A1 |
Number | Date | Country |
---|---|---|
102006036423 | Feb 2008 | DE |
102015117379 | Apr 2017 | DE |
102018105014 | Sep 2018 | DE |
102021203497 | May 2022 | DE |
2013180787 | Dec 2013 | WO |
2016189112 | Dec 2016 | WO |
2022106124 | May 2022 | WO |
Entry |
---|
Yu, Jiaying et al., “Dynamical Tracking of Surrounding Objects for Road Vehicles Using Linearly-Arrayed Ultrasonic Sensors,” IEEE Intelligent Vehicles Symposium (IV), pp. 72-77, Jun. 19, 2016. |
International Search Report and Written Opinion, Application No. PCT/EP2021/078237, 28 pages, Jan. 21, 2022. |
Number | Date | Country | |
---|---|---|---|
20240036193 A1 | Feb 2024 | US |