Method and device for locating a vehicle

Information

  • Patent Grant
  • Patent Number
    11,851,069
  • Date Filed
    Wednesday, April 22, 2020
  • Date Issued
    Tuesday, December 26, 2023
Abstract
A method for locating a vehicle involves detecting an elevation profile of a roadway of the vehicle, detecting image features on the roadway as landmarks, and comparing them with landmarks stored on a digital map. A transformation of the detected and/or stored landmarks into a common perspective is performed to compare the landmarks. The transformation is carried out based on model parameters of a parametric model of the elevation profile of the roadway and a parametric model of a vehicle inclination. The model parameters are determined by determining an expected elevation profile of the roadway from the parametric models of the elevation profile and the vehicle inclination and minimizing a difference between the expected elevation profile and the detected elevation profile by varying the model parameters.
Description
BACKGROUND AND SUMMARY OF THE INVENTION

Exemplary embodiments of the invention relate to a method for locating a vehicle, a device for locating a vehicle, and a vehicle having such a device.


From the prior art, it is known to compare and correlate sensor data about objects in the environment of a vehicle with the information stored on a digital map, which contains features of the objects and position information associated with each object, in order to determine a position of the vehicle with the aid of the digital map. By way of example, WO 2018/059735 A1 relates to a method for the self-localization of a vehicle, in which images of the vehicle environment are captured by means of at least one image capturing unit and image features are extracted from the ambient images and superimposed on environment features stored on a digital environment map.


Exemplary embodiments of the invention are directed to improving localization of a vehicle, i.e., the determination of a position of a vehicle.


In the method for locating a vehicle according to the invention, an elevation profile of a roadway of the vehicle is detected. For locating, image features on the roadway are detected as landmarks and compared with landmarks stored on a digital map. To make the landmarks comparable, the detected and/or stored landmarks are transformed into a common perspective.


Advantageously, the landmarks stored on the digital map are transformed into the perspective of the detected landmarks, i.e., into the perspective of the vehicle. Through this transformation, the landmarks stored on the map are virtually projected onto the roadway. Virtually means that the projected landmarks are only calculated. Through the projection, the stored landmarks are thus calculated back into a form as they would be seen from the vehicle and are thus comparable with the detected landmarks.


Alternatively, the detected landmarks can also be transformed into the perspective of the stored landmarks, i.e., into the perspective of the digital map. The perspective of the digital map is the perspective of a top view, i.e., a view from above. Through this transformation, the detected landmarks are thus virtually projected into the plane of the digital map. The detected landmarks are thus calculated back into a form as they would have to be stored on the digital map and are thus comparable with the landmarks stored on the map.


In accordance with the invention, the transformation of the detected and/or stored landmarks is performed based on model parameters of a parametric model of the elevation profile of the roadway and a parametric model of a vehicle inclination of the vehicle. The model parameters are determined by determining an expected elevation profile of the roadway from the parametric models of the elevation profile and the vehicle inclination and minimizing a difference between the expected elevation profile and the detected elevation profile by varying the model parameters.


Preferably, the method according to the invention is carried out with the following steps:

    • detecting the roadway in the environment of the vehicle with a first sensor unit of the vehicle,
    • determining the detected elevation profile of the roadway based on first data of the first sensor unit,
    • determining the model parameters of the parametric models of the elevation profile of the roadway and of the vehicle inclination,
    • recording second data representing the detected landmarks by the first sensor unit or by a second sensor unit,
    • comparing the detected landmarks with the landmarks stored on the digital map after transforming the detected and/or stored landmarks into the common perspective, and
    • determining a position of the vehicle from the comparison of landmarks.


The vehicle may be a car, lorry, bus, rail vehicle or aircraft.


Preferably, the parametric model of the vehicle inclination is defined such that it describes a deflection of a longitudinal axis and transverse axis of the vehicle with respect to a reference plane, in particular a horizontal plane.


Preferably, the parametric model of the vehicle inclination is generated based on a tangent function of a pitch angle of the vehicle around a transverse direction and a tangent function of a roll angle of the vehicle around a longitudinal direction. The pitch angle and the roll angle are thereby the model parameters of the parametric model of the vehicle inclination.


The longitudinal direction and the transverse direction are defined in particular in a coordinate system that moves with a center of gravity of the vehicle, wherein the longitudinal direction and the transverse direction are located in a horizontal plane, and the longitudinal direction moves with a yaw angle of the vehicle, i.e., azimuth with respect to the earth, while the transverse direction remains perpendicular to the longitudinal direction. The roll angle as well as the pitch angle of the vehicle in each case are therefore defined in particular with respect to this horizontal plane.


When determining the model parameters, a plurality of measurement points from the first sensor data is preferably used to determine a respective elevation of the elevation profile at the measurement points of the roadway. These elevations are compared with the corresponding points of the expected elevation profile, so that a system of equations results, and the difference between the detected elevation profile and the expected elevation profile is reduced by minimization, ideally becoming zero, by adjusting the model parameters accordingly. This corresponds to a search for those model parameters for which the difference between the detected elevation profile and the expected elevation profile becomes minimal over the plurality of measurement points considered.
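Written out, this search corresponds to a least-squares problem over the measurement points. The following is a sketch in standard notation only; the parameter vector θ is notation introduced here for illustration, while Hthe and Hsen denote the expected and the detected elevation as in the detailed description below:

    \min_{\theta}\;\sum_{i=1}^{N}\bigl(H_{\mathrm{the}}(x_i, y_i; \theta) - H_{\mathrm{sen}}(x_i, y_i)\bigr)^2,
    \qquad \theta = (a, b, c, d, e, f, \text{pitch angle}, \text{roll angle}).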


The expected elevation profile is generated from both the parametric model of the elevation profile of the roadway and the parametric model of the vehicle inclination because the parametric model of the elevation profile is defined relative to the roadway, whereas the vehicle inclination, i.e., the pitch angle and the roll angle, influences which elevation the first sensor unit actually detects for a respective point on the roadway. The theoretically expected elevation profile, which represents an expectation of the measurement by the first sensor unit, therefore takes this influence of the pitch angle and the roll angle into account. Thus, not only the model parameters of the elevation profile but also the roll angle and the pitch angle are determined such that the expected elevation profile and the actually detected elevation profile match as closely as possible.


The image features on the roadway are, in particular, markings that are applied to the roadway in the environment of the vehicle, for example lane markings, stop lines, direction arrows, characters, etc. However, it is also conceivable that other features present on the roadway are detected and used as detected landmarks.


It is an advantageous effect of the invention that, when determining the position of the vehicle by comparing the landmarks, it is taken into account that roads are not always flat. In particular, roads may have a curved, distorted, or bent surface relative to a reference ellipsoid of the earth. In addition, due to the spring mounting of the vehicle, the vehicle may have a pitch angle or a roll angle with respect to this reference ellipsoid, and these orientation angles are also taken into account in the transformation. The transformation of the detected and/or stored landmarks into a common perspective ensures that the detected and stored landmarks are comparable with each other (because the landmarks stored on the map are typically defined by looking at the environment from above, while the detected landmarks are detected from the perspective of the vehicle, in particular from the perspective of the first or second sensor unit). In this respect, the comparison of the landmarks is corrected for errors caused by the curved surface of the road and by the vehicle inclination (pitch angle and roll angle). The detected landmarks can thus be assigned with better accuracy to the landmarks stored on the digital map, which advantageously enables more precise locating of the vehicle with the aid of the landmarks stored on the digital map.
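The patent does not give the projection equations themselves. As a minimal illustrative sketch, assuming a calibrated pinhole camera, a landmark stored on the map can be transformed into the vehicle perspective by lifting its ground coordinates onto the road surface described by the fitted models and then projecting it into the image. In the sketch below, expected_elevation stands for the expected elevation Hthe as a function of x and y (sketched further below), and K, R_cam and t_cam are camera intrinsics and extrinsics assumed to be known from a calibration; all names are illustrative assumptions, not an API defined by the patent:

    import numpy as np

    def project_map_landmark(x, y, expected_elevation, K, R_cam, t_cam):
        """Project a landmark stored on the digital map, given by its ground
        coordinates (x, y) in the vehicle frame, onto the modeled road surface
        and into the camera image (illustrative sketch, not the patent's API)."""
        z = expected_elevation(x, y)            # elevation from the fitted models
        p_vehicle = np.array([x, y, z])
        p_cam = R_cam @ p_vehicle + t_cam       # vehicle frame -> camera frame
        u, v, w = K @ p_cam
        return np.array([u / w, v / w])         # pixel coordinates for the comparison

The inverse direction (projecting detected landmarks into the map plane) would use the same road-surface model to undo the perspective of the sensor.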


According to an advantageous embodiment, the first sensor unit is the same as the second sensor unit, i.e., the roadway and the landmarks are detected with the same sensor unit. Preferably, a stereo camera unit is used as both the first sensor unit and the second sensor unit. Further advantageously, the determination of the detected elevation profile of the roadway based on the first data as well as the comparison of the second data with the landmarks stored on the digital map can then be performed on one and the same data set of first and second data in a single pass. If the first data are equal to the second data, one image data set of the stereo camera unit can be used for both purposes. This has the advantage that the method can be executed with relatively little sensor data and with relatively little computational effort.
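The patent does not describe how the elevation values are extracted from the stereo images. One possible sketch, assuming a rectified stereo pair with focal length f_px (in pixels), baseline and principal point (cx, cy), uses the standard stereo relation Z = f_px · baseline / d to turn a disparity map into 3D points, from which the measured road elevations can then be sampled; the function and its arguments are illustrative assumptions:

    import numpy as np

    def disparity_to_points(disparity, f_px, baseline, cx, cy):
        """Convert a rectified disparity map (in pixels) into 3D points in the
        camera frame via Z = f_px * baseline / d.
        Returns an (H, W, 3) array; pixels with invalid disparity become NaN."""
        h, w = disparity.shape
        u, v = np.meshgrid(np.arange(w), np.arange(h))
        d = np.where(disparity > 0, disparity.astype(float), np.nan)  # mask invalid disparities
        Z = f_px * baseline / d
        X = (u - cx) * Z / f_px
        Y = (v - cy) * Z / f_px
        return np.stack([X, Y, Z], axis=-1)

The resulting camera-frame points would still have to be expressed in the vehicle's ground coordinates (longitudinal x, transverse y, elevation) before they can serve as the measurement points of the detected elevation profile.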


According to a further advantageous embodiment, the parametric model of the elevation profile of the roadway comprises a parametrizable curve for the longitudinal direction and a parametrizable curve for the transverse direction.


According to a further advantageous embodiment, the parametrizable curve for the longitudinal direction is a second-order polynomial having two model parameters, and the parametrizable curve for the transverse direction is a fourth-order polynomial having four model parameters.


The second-order polynomial is preferably represented as H(x)=ex^2+fx, wherein e and f are the model parameters to be determined for the longitudinal direction and x is the coordinate in the longitudinal direction.


The fourth-order polynomial is preferably represented as H(y)=ay^4+by^3+cy^2+dy, wherein a, b, c, and d are the model parameters to be determined for the transverse direction and y is the coordinate in the transverse direction.


The parametric model of the vehicle inclination is preferably expressed, based on the tangent function of the pitch angle and the tangent function of the roll angle as the term He, by:


He=x*tan(pitch angle)+y*tan(roll angle), wherein the pitch angle and the roll angle are the model parameters to be determined, x is the coordinate in the longitudinal direction, and y is the coordinate in the transverse direction.
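Combining the two polynomials and the inclination term, the expected elevation at a point (x, y) can be evaluated by a single function. The following is a minimal sketch; the function and argument names are illustrative, and the angles are taken in radians:

    import math

    def h_the(x, y, a, b, c, d, e, f, pitch, roll):
        """Expected elevation Hthe = H(y) + H(x) + He at ground coordinates (x, y)."""
        h_y = a * y**4 + b * y**3 + c * y**2 + d * y      # transverse polynomial H(y)
        h_x = e * x**2 + f * x                            # longitudinal polynomial H(x)
        h_e = x * math.tan(pitch) + y * math.tan(roll)    # vehicle inclination term He
        return h_y + h_x + h_e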


According to a further advantageous embodiment, minimizing the difference between the expected elevation profile and the detected elevation profile to determine the model parameters is performed by means of one of the following processes and methods (a sketch for one of these options follows the list):

    • Gauss-Newton method,
    • Levenberg-Marquardt method,
    • gradient-based search method,
    • quadratic optimization,
    • training of a neural network,
    • genetic or evolutionary algorithm.
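As an illustration of the second option listed above, the eight model parameters could be determined with the Levenberg-Marquardt implementation available in SciPy. The following is a minimal sketch, assuming the measurement coordinates xs, ys and the detected elevations h_sen are available as NumPy arrays; all names are illustrative:

    import numpy as np
    from scipy.optimize import least_squares

    def residuals(theta, xs, ys, h_sen):
        """Residuals r_i = Hthe(x_i, y_i) - Hsen(x_i, y_i) for all measurement points."""
        a, b, c, d, e, f, pitch, roll = theta
        h_model = (a * ys**4 + b * ys**3 + c * ys**2 + d * ys
                   + e * xs**2 + f * xs
                   + xs * np.tan(pitch) + ys * np.tan(roll))
        return h_model - h_sen

    def fit_model(xs, ys, h_sen):
        """Determine (a, b, c, d, e, f, pitch, roll) by minimizing the residuals."""
        theta0 = np.zeros(8)  # start from a flat road with no inclination
        result = least_squares(residuals, theta0, args=(xs, ys, h_sen), method="lm")
        return result.x

Because the expected elevation is linear in a to f and in tan(pitch angle) and tan(roll angle), a plain linear least-squares solve would also yield the parameters; the nonlinear solver is shown here simply because the Levenberg-Marquardt method is one of the options named in the list.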


According to a further advantageous embodiment, a time series of the pitch angle and/or the roll angle is used to calibrate the first sensor unit and/or the second sensor unit respectively.
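The patent does not spell out the calibration procedure. One conceivable interpretation, stated here purely as an assumption, is that over a sufficiently long drive the dynamic part of the estimated pitch and roll averages out, so that a persistent non-zero mean of the time series can be attributed to a static mounting offset of the sensor unit and used as a calibration correction:

    from collections import deque

    class InclinationCalibrator:
        """Illustrative sketch (assumption, not taken from the patent): average
        the estimated pitch/roll over a long window and interpret a persistent
        non-zero mean as a static mounting offset of the sensor unit."""

        def __init__(self, window=10000):
            self.pitch_hist = deque(maxlen=window)
            self.roll_hist = deque(maxlen=window)

        def update(self, pitch, roll):
            self.pitch_hist.append(pitch)
            self.roll_hist.append(roll)

        def offsets(self):
            n = max(len(self.pitch_hist), 1)
            return (sum(self.pitch_hist) / n, sum(self.roll_hist) / n)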


According to a further advantageous embodiment, at least one of the following is used for the first sensor unit and/or the second sensor unit:

    • lidar unit,
    • radar unit,
    • stereo camera unit,
    • ultrasonic sensor unit.


Another aspect of the invention relates to a device for determining a position of a vehicle, having:

    • a first sensor unit of the vehicle, wherein the first sensor unit is designed to detect a roadway in an environment of the vehicle,
    • a computing unit adapted to determine a detected elevation profile of the roadway based on first data from the first sensor unit, and designed to determine model parameters of a parametric model of the elevation profile of the roadway and a parametric model of the vehicle inclination by minimizing a difference between an expected elevation profile and the detected elevation profile, wherein the expected elevation profile is generated from the parametric models of the elevation profile and the vehicle inclination,
    • a second sensor unit designed to record second data representing the detected landmarks,
    • wherein the computing unit is further adapted to compare the detected landmarks with landmarks stored on a digital map, wherein the detected and/or stored landmarks are converted to a common perspective by a transformation to ensure comparability of the landmarks, and wherein the transformation is performed based on the determined model parameters of the parametric models of the elevation profile and the vehicle inclination, and wherein the computing unit is designed to determine a position of the vehicle from the comparison of the landmarks.


Another aspect of the invention relates to a vehicle having a device as described above and below.


Advantages and preferred developments of the proposed device emerge from an analogous and corresponding transfer of the statements made above in connection with the proposed method.


Further advantages, features and details emerge from the following description, in which, optionally with reference to the drawing, at least one exemplary embodiment is described in detail. Identical, similar and/or functionally identical parts are provided with the same reference numerals.





BRIEF DESCRIPTION OF THE DRAWING FIGURES

The drawings show:



FIG. 1 schematically a method for determining a position of a vehicle according to an exemplary embodiment of the invention,



FIG. 2 a roadway having a vehicle located thereon in relation to the execution of the method according to the exemplary embodiment of FIG. 1,



FIG. 3 a grid of measurement points on a roadway in relation to the execution of the method according to the exemplary embodiment of FIG. 1, and



FIG. 4 a vehicle having a device according to a further exemplary embodiment of the invention.





The representations in the figures are schematic and not to scale.


DETAILED DESCRIPTION


FIG. 1 shows a method for determining a position of a vehicle 10, i.e., for locating the vehicle. Certain details of this method or of the situation to which the method is applied can be seen from FIG. 2 and FIG. 3, which also refer to the embodiments of FIG. 1.


In the first step S1 of the method, a roadway in an environment of the vehicle 10 is detected with a first sensor unit 3 of the vehicle 10, the first sensor unit 3 being designed as a stereo camera unit. Next, a detected elevation profile of the roadway is determined based on the first data supplied by the stereo camera unit 3 (step S2). This is done for a plurality of measurement points from the image of the stereo camera unit 3, as depicted in FIG. 3. Here, the roadway is curved around a central line of the roadway, such that markings on the right side of the roadway appear distorted downwards. For the plurality of measurement points, the model parameters of the parametric model Hxy of the elevation profile of the roadway and of the parametric model He of the vehicle inclination of the vehicle 10 are then determined (step S3) by minimizing a difference between an expected elevation profile Hthe and the detected elevation profile. The expected elevation profile Hthe is generated by summing the parametric model Hxy of the elevation profile of the roadway and the parametric model He of the vehicle inclination:

Hthe=Hxy+He.


The parametric model of the elevation profile of the roadway is composed of a second-order polynomial H(x) having two model parameters e, f for the longitudinal direction x, and of a fourth-order polynomial H(y) having four model parameters a, b, c, d for the transverse direction y:

H(y)=ay^4+by^3+cy^2+dy;
H(x)=ex^2+fx.


The parametric model of vehicle inclination describes the inclination of the vehicle relative to a reference plane, for example relative to a horizontal plane. It is defined as a quantity He based on a tangent function of a pitch angle of the vehicle around a transverse direction and a tangent function of a roll angle of the vehicle around a longitudinal direction as follows:

He=x*tan(pitch angle)+y*tan(roll angle).


Here, the pitch angle and the roll angle are the model parameters of the parametric model of the vehicle inclination, x is the coordinate in the longitudinal direction, and y is the coordinate in the transverse direction.


The expected elevation profile Hthe is formed based on the parametric models Hxy, He by summing the models:

Hthe=Hxy+He=H(x)+H(y)+He.


For a plurality of points assigned to the roadway, with Hsen denoting the detected elevation of the detected elevation profile at the respective measurement point, the following residuals are thus obtained:

r1 = Hthe(x1, y1) - Hsen(x1, y1)
r2 = Hthe(x2, y2) - Hsen(x2, y2)
r3 = Hthe(x3, y3) - Hsen(x3, y3)

Minimizing the difference between the expected elevation profile and the detected elevation profile, expressed by the residuals r1, r2, r3, is done to determine the model parameters by means of the Levenberg-Marquardt method, such that the model parameters a, b, c, d, e, f, pitch angle, and roll angle are obtained. In this way, a mathematical description of the current elevation profile and the current vehicle inclination is obtained.
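For illustration only, with assumed values that are not taken from the patent (c = 0.01, e = 0.002, all other polynomial parameters zero, a pitch angle of 1°, a roll angle of 0.5°) and a measurement point at x1 = 10 m, y1 = 2 m with a detected elevation Hsen = 0.40 m, a single residual would evaluate to:

    H_{xy} = 0.002 \cdot 10^2 + 0.01 \cdot 2^2 = 0.24\,\mathrm{m},\qquad
    H_{e} = 10 \cdot \tan(1^\circ) + 2 \cdot \tan(0.5^\circ) \approx 0.192\,\mathrm{m},
    r_1 = H_{\mathrm{the}} - H_{\mathrm{sen}} \approx 0.432\,\mathrm{m} - 0.40\,\mathrm{m} = 0.032\,\mathrm{m}.

The minimization adjusts the model parameters until such residuals become jointly as small as possible over all measurement points.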


This is followed by the detection S4 of second data on objects from the environment of the vehicle 10 by the second sensor unit 5, that is, the stereo camera unit, which corresponds to the first sensor unit 3. The second data represent the detected landmarks. Furthermore, the comparison S5 of the second data, i.e., the detected landmarks, with landmarks stored on a digital map follows, wherein the stored landmarks are transformed into the perspective of the detected landmarks before the comparison. The transformation is based on the determined model parameters of the parametric models of the elevation profile and the vehicle inclination. Its purpose is to transform the stored landmarks into a form in which they can be compared with the detected landmarks. Finally, determining S6 a position of the vehicle 10 from the comparison of detected landmarks with the stored landmarks transformed into the same perspective occurs. It is also conceivable that, in step S5, the detected landmarks are transformed into the perspective of the stored landmarks before being compared. Accordingly, in step S6, the position is then determined from the comparison of the transformed detected landmarks with the stored landmarks.
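The sequence S1 to S6 can be summarized schematically as follows. In this sketch, detect_road_points, detect_landmarks, load_map_landmarks, match_landmarks and estimate_pose are hypothetical placeholders passed in as arguments (the patent defines no such API), while fit_model, h_the and project_map_landmark refer to the earlier sketches and are assumed to be in scope:

    def locate_vehicle(stereo_frame, digital_map, camera_model,
                       detect_road_points, detect_landmarks,
                       load_map_landmarks, match_landmarks, estimate_pose):
        """Schematic sketch of steps S1-S6; all injected helpers are hypothetical."""
        # S1/S2: detect the roadway and derive measured elevations (x, y, Hsen)
        xs, ys, h_sen = detect_road_points(stereo_frame, camera_model)
        # S3: fit the parametric models of elevation profile and vehicle inclination
        a, b, c, d, e, f, pitch, roll = fit_model(xs, ys, h_sen)
        # S4: extract image features on the roadway as detected landmarks
        detected = detect_landmarks(stereo_frame)
        # S5: transform stored landmarks into the vehicle perspective and compare
        stored = load_map_landmarks(digital_map)
        expected_elevation = lambda x, y: h_the(x, y, a, b, c, d, e, f, pitch, roll)
        projected = [project_map_landmark(lm.x, lm.y, expected_elevation,
                                          camera_model.K, camera_model.R, camera_model.t)
                     for lm in stored]
        matches = match_landmarks(detected, projected)
        # S6: determine the vehicle position from the matched landmarks
        return estimate_pose(matches, digital_map)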



FIG. 2 shows, in addition to FIG. 1, the kinked course of the roadway, on which the detected landmarks appear bent downwards from the point of view of the stereo camera unit 3.



FIG. 3 shows a possible grid of measurement points for comparing the expected with the measured elevation profile, as well as the orientation of the longitudinal direction x and the transverse direction y.



FIG. 4 shows a vehicle 10 having a device 1, wherein the device 1 serves to determine a position of a vehicle 10, having:

    • a first sensor unit 3 of the vehicle 10, wherein the first sensor unit 3 is designed to detect a roadway in an environment of the vehicle 10,
    • a computing unit 7, configured to determine a detected elevation profile of the roadway based on first data from the first sensor unit 3, and designed to determine model parameters of a parametric model of the elevation profile of the roadway and a parametric model of the vehicle inclination by minimizing a difference between an expected elevation profile and the detected elevation profile, wherein the expected elevation profile is generated from the parametric models of the elevation profile and the vehicle inclination,
    • a second sensor unit 5, designed to detect second data on objects from the environment of the vehicle 10 representing the detected landmarks,


      wherein the computing unit 7 is further designed to compare the detected landmarks with landmarks stored on a digital map, wherein the detected or stored landmarks are converted to the same perspective by a transformation to enable the comparison of the landmarks, and wherein the transformation occurs based on the determined model parameters of the elevation profile and the vehicle inclination, and wherein the computing unit 7 is designed to determine a position of the vehicle 10 from the comparison of the landmarks.


Although the invention has been further illustrated and explained in detail by preferred exemplary embodiments, the invention is not limited by the disclosed examples and other variations can be derived therefrom by those skilled in the art without departing from the scope of protection of the invention. It is therefore clear that a plurality of possible variations exist. It is also clear that the exemplary embodiments mentioned are really only examples, which are not to be understood in any way as limiting, for example, the scope of protection, the possible applications or the configuration of the invention. Rather, the preceding description and the description of the figures enable the person skilled in the art to implement the exemplary embodiments in a concrete manner, wherein the person skilled in the art, being aware of the disclosed idea of the invention, can make a variety of changes, for example with respect to the function or the arrangement of individual elements mentioned in an exemplary embodiment, without leaving the scope of protection defined by the claims and their legal equivalents, such as further explanations in the description.

Claims
  • 1. A method for locating a vehicle on a roadway, the method comprising: detecting, by a first sensor unit of the vehicle, the roadway in an environment of the vehicle to generate a plurality of measurement points; detecting an elevation profile of the roadway in the environment of the vehicle based on the plurality of measurement points; determining a set of parametric models, which includes a first and second parametric model, wherein the first parametric model is of the detected elevation profile of the roadway and comprises (1) a parameterizable curve for a longitudinal direction of the detected elevation profile, and (2) a parametrizable curve for a transverse direction of the detected elevation profile, and the second parametric model is of an inclination of the vehicle and has pitch angle and roll angle as parameters; determining an expected elevation profile of the roadway based on the set of parametric models; determining parameters of the first and second parametric models, including the pitch angle and roll angle parameters, by minimizing a difference between the expected elevation profile and the detected elevation profile; detecting, by a second sensor unit, landmarks in the environment of the vehicle; transforming, using the determined parameters of the first and second parametric models, one of the detected landmarks or the landmarks stored on the digital map into a common perspective with the other one of the detected landmarks or the landmarks stored on the digital map; and comparing, using the common perspective, the detected landmarks with landmarks stored on the digital map to determine a position of the vehicle; and determining a position of the vehicle based on the comparison of the detected landmarks with the information from the digital map.
  • 2. The method of claim 1, wherein the parametrizable curve for the longitudinal direction is a second-order polynomial having two model parameters, and the parametrizable curve for the transverse direction is a fourth-order polynomial having four model parameters.
  • 3. The method of claim 1, wherein the parametric model of the vehicle inclination describes a deflection of a longitudinal axis and transverse axis of the vehicle with respect to a contact area of the vehicle with the roadway.
  • 4. The method of claim 3, wherein the parametric model of the vehicle inclination is generated based on a tangent function of a pitch angle of the vehicle and a tangent function of a roll angle of the vehicle.
  • 5. The method of claim 4, wherein a time series of the pitch angle and/or the roll angle is used to calibrate the first sensor and/or the second sensor, respectively.
  • 6. The method of claim 1, wherein the first sensor unit or the second sensor unit comprises at least one of: lidar unit, radar unit, stereo camera unit, and ultrasonic sensor unit.
  • 7. The method of claim 1, wherein the first sensor unit and the second sensor unit are a same sensor unit.
  • 8. The method of claim 7, wherein the same sensor unit is a stereo camera.
Priority Claims (1)
Number Date Country Kind
10 2019 003 238.1 May 2019 DE national
PCT Information
Filing Document Filing Date Country Kind
PCT/EP2020/061241 4/22/2020 WO
Publishing Document Publishing Date Country Kind
WO2020/224970 11/12/2020 WO A
US Referenced Citations (66)
Number Name Date Kind
3580978 Ebeling May 1971 A
5790403 Nakayama Aug 1998 A
5947861 Nobumoto Sep 1999 A
6128569 Fukushima Oct 2000 A
6543278 Kogure Apr 2003 B1
8387439 Jiang et al. Mar 2013 B2
9008876 Pinto Apr 2015 B2
9109907 Park Aug 2015 B2
9520064 Tsuda Dec 2016 B2
9753144 Jafari Sep 2017 B1
9902228 Göhrle et al. Feb 2018 B2
10922969 Song Feb 2021 B2
11054827 Gdalyahu Jul 2021 B2
11391575 Li Jul 2022 B2
20020013651 Weiberle Jan 2002 A1
20030051560 Ono Mar 2003 A1
20030182025 Tseng Sep 2003 A1
20050270224 Silberman Dec 2005 A1
20060074541 Ono Apr 2006 A1
20070067085 Lu Mar 2007 A1
20070129873 Bernzen Jun 2007 A1
20070154068 Stein Jul 2007 A1
20090012709 Miyazaki Jan 2009 A1
20100209881 Lin Aug 2010 A1
20100209882 Lin Aug 2010 A1
20100209883 Chin Aug 2010 A1
20100209884 Lin Aug 2010 A1
20100209885 Chin Aug 2010 A1
20100209886 Lin Aug 2010 A1
20100209887 Chin Aug 2010 A1
20100209888 Huang Aug 2010 A1
20100209889 Huang Aug 2010 A1
20100209890 Huang Aug 2010 A1
20100209891 Lin Aug 2010 A1
20100209892 Lin Aug 2010 A1
20100211270 Chin Aug 2010 A1
20110004359 Kretschmann et al. Jan 2011 A1
20110006903 Niem Jan 2011 A1
20110044507 Strauss Feb 2011 A1
20110087398 Lu Apr 2011 A1
20110226036 Jiang Sep 2011 A1
20120016646 Takenaka Jan 2012 A1
20130144476 Pinto Jun 2013 A1
20130218396 Moshchuk Aug 2013 A1
20130238164 Matsuda Sep 2013 A1
20140032100 Park Jan 2014 A1
20140043473 Gupta Feb 2014 A1
20140085469 Sakano Mar 2014 A1
20140195112 Lu Jul 2014 A1
20160031287 Guest Feb 2016 A1
20190263414 Åsbogård Aug 2019 A1
20190371170 Song Dec 2019 A1
20200180692 Hara Jun 2020 A1
20200257291 Zhang Aug 2020 A1
20200318971 Mori Oct 2020 A1
20210046935 Mizoguchi Feb 2021 A1
20210323606 Namba Oct 2021 A1
20210370958 Moshchuk Dec 2021 A1
20220153070 Singuru May 2022 A1
20220194377 Otanez Jun 2022 A1
20220221291 Knoeppel Jul 2022 A1
20220274602 Zarringhalam Sep 2022 A1
20220281456 Giovanardi Sep 2022 A1
20220324421 Giovanardi Oct 2022 A1
20230041499 Uestuenel Feb 2023 A1
20230104727 Oda Apr 2023 A1
Foreign Referenced Citations (26)
Number Date Country
2787856 Nov 2013 CA
102292249 Dec 2011 CN
103162694 Jun 2013 CN
103661352 Mar 2014 CN
103661393 Mar 2014 CN
105393084 Mar 2016 CN
111688715 Sep 2020 CN
113442932 Sep 2021 CN
113752996 Dec 2021 CN
109353276 Feb 2022 CN
102006049118 Apr 2008 DE
102007004606 Jul 2008 DE
102013002889 Aug 2014 DE
102013018924 May 2015 DE
112014001807 Jan 2016 DE
102016011849 Jun 2017 DE
102017107921 Oct 2017 DE
102017216238 Mar 2019 DE
112018000107 May 2019 DE
102019216722 May 2021 DE
0678731 Oct 1995 EP
1209062 Dec 2012 KR
20220039038 Mar 2022 KR
2018059735 Apr 2018 WO
WO-2018059735 Apr 2018 WO
WO-2021091914 May 2021 WO
Non-Patent Literature Citations (9)
Entry
“Development of a multisensor road surveying measurement system;” Streiter et al.; International Multi-Conference on Systems, Signals & Devices (pp. 1-5); Jun. 4, 2012. (Year: 2012).
“Dynamic position calibration by road structure detection;” Speth et al.; 2015 IEEE International Conference on Vehicular Electronics and Safety (ICVES) (pp. 104-109); Mar. 4, 2016. (Year: 2016).
“Vehicle attitude estimation in adverse weather conditions using a camera, a GPS and a 3D road map;” Belaroussi et al. 2011 IEEE Intelligent Vehicles Symposium (IV) (pp. 782-787); Jul. 25, 2011 (Year: 2011).
Hwangbo et al.; “Integration of Orbital and Ground Image Networks for the Automation of Rover Localization;” ASPRS 2009 Annual Conference; Mar. 9-13, 2009; Baltimore, MD, USA; https://www.asprs.org/a/publications/proceedings/baltimore09/0040.pdf.
International Search Report dated Jun. 30, 2020 in related/corresponding International Application No. PCT/EP2020/061241.
English translation of Office Action dated Feb. 19, 2020 in related/corresponding DE Application No. 10 2019 003 238.1.
English translation of Written Opinion dated Jun. 30, 2020 in related/corresponding International Application No. PCT/EP2020/061241.
Response dated Mar. 13, 2020, filed responsive to Office Action dated Feb. 19, 2020 in related/corresponding DE Application No. 10 2019 003 238.1, including English translation.
Office Action dated Sep. 29, 2023 in related/corresponding CN Application No. 202080034165.4.
Related Publications (1)
Number Date Country
20220221291 A1 Jul 2022 US