This application claims under 35 U.S.C. § 119(a) the benefit of Korean Patent Application No. 10-2016-0033855, filed on Mar. 22, 2016 in the Korean Intellectual Property Office, the entire contents of which are incorporated by reference herein.
(a) Technical Field
The present invention relates to a side collision avoidance system and method for a vehicle, and more particularly, to a system and method for increasing the accuracy of obstacle detection by combining ultrasonic sensor data with radar sensor data, and for predicting the probability of collision with obstacles with high accuracy when a parallel-parked vehicle exits a parking space.
(b) Description of the Related Art
In recent years, traffic congestion and parking problems have worsened in certain geographical areas. As the number of vehicles increases in a particular region, city, or country, available parking spaces are reduced. In order to address the lack of parking spaces, the size of the parking space allotted to a single vehicle may be reduced.
In addition, when many vehicles are parked in a parking lot, even one without visible parking markings, the distances between vehicles are reduced. In this case, drivers may have difficulty in recognizing surrounding obstacles with the naked eye and in maneuvering their vehicles to park in, or exit from, narrow parking spaces.
Accordingly, a smart parking assist system (SPAS) has been mounted in vehicles, and automatic parking and automatic exit technologies utilizing images of the surrounding environment captured by the SPAS have been developed. Typically, an SPAS automatically steers the vehicle when parking in a parking space to assist parking or exit procedures.
However, when a vehicle, especially a vehicle that has been parallel parked, exits a parking space by using the SPAS, objects present at the left and right sides of the vehicle may not be detected, thereby resulting in collisions. Accordingly, it is necessary to detect the objects present at the left and right sides of the vehicle in order to allow the vehicle to exit the parking space safely and conveniently.
Conventionally, side collision avoidance technology uses radar sensors mounted on the rear of a vehicle so as to detect vehicles approaching from behind and output rear vehicle detection signals. In addition, such technology may utilize a plurality of ultrasonic sensors mounted on the sides of the vehicle so as to detect vehicles approaching from the sides and output side vehicle detection signals, where this technology may detect obstacles on the basis of the output signals of the sensors.
Since the conventional side collision avoidance technology uses radar sensor data and ultrasonic sensor data independently to detect obstacles, complementary functions of respective sensors may not be utilized, and thus accuracy in the detection of obstacles may be reduced. In addition, it may be difficult to predict the probability of collisions with obstacles with high accuracy.
An aspect of the present invention provides a side collision avoidance system and method for a vehicle configured to combine data from radar sensors mounted on front and rear surfaces of the vehicle with data from ultrasonic sensors mounted on side surfaces of the vehicle so as to detect obstacles on the basis of the combined data, thereby increasing accuracy in the detection of obstacles and predicting the probability of collisions with obstacles with high accuracy.
The objects of the present invention are not limited to the foregoing objects, and any other objects and advantages not mentioned herein will be clearly understood from the following description. The present inventive concept will be more clearly understood from exemplary embodiments of the present invention. In addition, it will be apparent that the objects and advantages of the present invention can be achieved by elements and features claimed in the claims and a combination thereof.
According to an aspect of the present invention, a side collision avoidance system for a vehicle includes: a plurality of radar sensors configured to detect an obstacle around the vehicle; a plurality of ultrasonic sensors configured to detect the obstacle around the vehicle; a controller configured to combine sensor data from the radar sensors with sensor data from the ultrasonic sensors to generate combined data, to detect the obstacle on the basis of the combined data, and to predict collision or non-collision with the detected obstacle on the basis of the combined data; and a brake driver configured to brake the vehicle when the collision is predicted.
According to another aspect of the present invention, a side collision avoidance method for a vehicle includes steps of: generating, by a controller, combined data by combining sensor data from radar sensors with sensor data from ultrasonic sensors; detecting, by the controller, an obstacle on the basis of the combined data; predicting, by the controller, collision or non-collision with the detected obstacle on the basis of the combined data; and braking, by a brake driver, the vehicle when the collision is predicted.
The above and other objects, features and advantages of the present invention will be more apparent from the following detailed description taken in conjunction with the accompanying drawings:
It is understood that the term “vehicle” or “vehicular” or other similar term as used herein is inclusive of motor vehicles in general such as passenger automobiles including sports utility vehicles (SUV), buses, trucks, various commercial vehicles, watercraft including a variety of boats and ships, aircraft, and the like, and includes hybrid vehicles, electric vehicles, plug-in hybrid electric vehicles, hydrogen-powered vehicles and other alternative fuel vehicles (e.g. fuels derived from resources other than petroleum). As referred to herein, a hybrid vehicle is a vehicle that has two or more sources of power, for example both gasoline-powered and electric-powered vehicles.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms “a,” “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items. Throughout the specification, unless explicitly described to the contrary, the word “comprise” and variations such as “comprises” or “comprising” will be understood to imply the inclusion of stated elements but not the exclusion of any other elements. In addition, the terms “unit”, “-er”, “-or”, and “module” described in the specification mean units for processing at least one function and operation, and can be implemented by hardware components or software components and combinations thereof.
Further, the control logic of the present invention may be embodied as non-transitory computer readable media on a computer readable medium containing executable program instructions executed by a processor, controller or the like. Examples of computer readable media include, but are not limited to, ROM, RAM, compact disc (CD)-ROMs, magnetic tapes, floppy disks, flash drives, smart cards and optical data storage devices. The computer readable medium can also be distributed in network coupled computer systems so that the computer readable media is stored and executed in a distributed fashion, e.g., by a telematics server or a Controller Area Network (CAN).
The above and other objects, features and advantages of the present invention will be more clearly understood from the following detailed description taken in conjunction with the accompanying drawings so that those skilled in the art to which the present invention pertains can easily carry out technical ideas described herein. In addition, a detailed description of well-known techniques associated with the present invention will be ruled out in order not to unnecessarily obscure the gist of the present invention. Hereinafter, exemplary embodiments of the present invention will be described in detail with reference to the accompanying drawings.
As illustrated in
With respect to each of the aforementioned elements, first, the radar sensor unit 10 may detect obstacles positioned in front of and behind a vehicle. As illustrated in
The radar sensor 210 mounted on a front surface of the vehicle may detect obstacles positioned in front of the vehicle. The radar sensor 220 mounted on the left side of a rear surface of the vehicle may detect obstacles positioned at the left side of and behind the vehicle. The radar sensor 230 mounted on the right side of the rear surface of the vehicle may detect obstacles positioned at the right side of and behind the vehicle.
Next, the ultrasonic sensor unit 20 may detect obstacles positioned in front of and behind the vehicle and positioned at the left and right sides of the vehicle. As illustrated in
The plurality of ultrasonic sensors 241 mounted on the front surface of the vehicle may detect obstacles positioned in front of the vehicle, and the plurality of ultrasonic sensors 242 mounted on the rear surface of the vehicle may detect obstacles positioned behind the vehicle.
In addition, the plurality of ultrasonic sensors 243 mounted on the left side surface of the vehicle may detect obstacles positioned at the left side of the vehicle, and the plurality of ultrasonic sensors 244 mounted on the right side surface of the vehicle may detect obstacles positioned at the right side of the vehicle.
As illustrated in
Therefore, when other vehicles are positioned at the left side of the vehicle, they may be detected by both the radar sensor 220 and the ultrasonic sensors 243, and when other vehicles are positioned at the right side of the vehicle, they may be detected by both the radar sensor 230 and the ultrasonic sensors 244. That is, some of the other vehicles may be present in an overlapping detection area of the radar sensor and the ultrasonic sensors.
Meanwhile, since the exemplary embodiments of the present invention primarily describe technology for avoiding a side collision when a parallel-parked vehicle exits a parking space, a first sensor group including the left side ultrasonic sensor 243 and the left rear radar sensor 220 and a second sensor group including the right side ultrasonic sensor 244 and the right rear radar sensor 230 may mainly be used. In practice, when a parallel-parked vehicle exits a parking space, collisions with other vehicles parked in front of and behind the vehicle must also be avoided; thus, the plurality of ultrasonic sensors 241 mounted on the front surface of the vehicle and the plurality of ultrasonic sensors 242 mounted on the rear surface of the vehicle, as well as the radar sensor 210 mounted on the front surface of the vehicle, may also be used.
The side collision avoidance technology in exemplary embodiments of the present invention is used to avoid side collisions when a vehicle that is parallel parked exits a parking space, but is not limited thereto. For example, the technology may also be used to avoid side collisions when the vehicle is travelling. Further, the technology may be applied in other parking situations, e.g., when the vehicle is parked in a parking lot with other vehicles in a plurality of parallel or diagonal rows.
The brake driver 30 may drive a brake under control of the controller 50.
The warning unit 40 may output an alarm under control of the controller 50.
The controller 50 generally controls the aforementioned respective elements to perform the functions thereof normally.
In particular, the controller 50 may combine sensor data from the radar sensor mounted on the rear surface of the vehicle with sensor data from the ultrasonic sensor mounted on the side surface of the vehicle to generate the combined data, detect an obstacle on the basis of the combined data, and predict whether or not the vehicle collides with the detected obstacle on the basis of the combined data.
In addition, when the collision with the obstacle is predicted, the controller 50 may control the brake driver 30 to stop the vehicle.
In addition, when the collision with the obstacle is predicted, the controller 50 may control the warning unit 40 to warn a driver of the predicted collision.
In addition, when the collision with the obstacle is predicted, the controller 50 may control the brake driver 30 to stop the vehicle, and then control the warning unit 40 to warn the driver of the predicted collision.
Hereinafter, the configuration of the controller 50 will be detailed with reference to
As illustrated in
With respect to each of the aforementioned elements, first, the sensor driver 51 may determine whether a vehicle that is parallel parked is in a left-side exit mode or in a right-side exit mode when exiting a parking space, and activate a sensor group corresponding to the determined exit mode. Here, the sensor driver 51 may receive an input from the driver to determine the left-side exit mode or the right-side exit mode, or may be interlocked with a smart parking assist system (SPAS) mounted in the vehicle to determine the exit mode. Here, the sensor driver 51 may obtain information from an in-vehicle network system. The in-vehicle network system may include a controller area network (CAN), a local interconnect network (LIN), a FlexRay network, and a media oriented systems transport (MOST) network.
For example, the sensor driver 51 may drive the first sensor group including the left side ultrasonic sensor 243 and the left rear radar sensor 220 in the left-side exit mode, and may drive the second sensor group including the right side ultrasonic sensor 244 and the right rear radar sensor 230 in the right-side exit mode.
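As a minimal sketch, the mapping from exit mode to sensor group described above might look as follows in Python; the mode and sensor identifiers are hypothetical names chosen only to mirror the reference numerals in the text:

```python
from enum import Enum

class ExitMode(Enum):
    LEFT = "left"
    RIGHT = "right"

# Hypothetical sensor identifiers mirroring the reference numerals in the
# text: ultrasonic 243 / radar 220 form the first (left) sensor group, and
# ultrasonic 244 / radar 230 form the second (right) sensor group.
SENSOR_GROUPS = {
    ExitMode.LEFT: ("ultrasonic_243", "radar_220"),
    ExitMode.RIGHT: ("ultrasonic_244", "radar_230"),
}

def activate_sensor_group(mode):
    """Return the sensor group to activate for the determined exit mode."""
    return SENSOR_GROUPS[mode]
```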
Obstacles detected by the first sensor group activated by the sensor driver 51 are illustrated by way of example in
In
Next, the sensor data combiner 52 may combine sensor data from the radar sensor unit 10 with sensor data from the ultrasonic sensor unit 20.
In other words, the sensor data combiner 52 may combine the radar sensor data with the ultrasonic sensor data on the basis of the following Equation 1 to generate the combined data:
x̂3 = x̂1 + λ(x̂2 − x̂1)
P3 = P1 − (P1 − P12)U12⁻¹(P1 − P12)ᵀ  Equation 1
Here, x̂1 is ultrasonic sensor data indicating a position of an obstacle, x̂2 is radar sensor data indicating a position of an obstacle, and x̂3 is combined data indicating a position of an obstacle.
In addition, P1 indicates the covariance of the ultrasonic sensor data, P2 indicates the covariance of the radar sensor data, P3 indicates the covariance of the combined data, and P12 indicates the cross covariance of the ultrasonic sensor data and the radar sensor data.
In addition, λ = (P1 − P12)U12⁻¹, U12 = P1 + P2 − P12, and P12 = ρ√(P1·P2) may be satisfied. Here, ρ indicates an effective correlation coefficient (for example, 0.4).
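For illustration, a scalar (one-dimensional) sketch of the combination in Equation 1 follows; the function name `fuse` and the estimate names are assumptions based on the definitions above, and the expression for U12 is taken as given in the text:

```python
import math

def fuse(x1, P1, x2, P2, rho=0.4):
    """Combine an ultrasonic position estimate (x1, P1) with a radar
    position estimate (x2, P2) into a combined estimate (x3, P3).

    Scalar sketch of Equation 1; rho is the effective correlation
    coefficient (the text gives 0.4 as an example value).
    """
    P12 = rho * math.sqrt(P1 * P2)       # cross covariance
    U12 = P1 + P2 - P12                  # as given in the text
    lam = (P1 - P12) / U12               # combination gain (lambda)
    x3 = x1 + lam * (x2 - x1)            # combined position
    P3 = P1 - (P1 - P12) ** 2 / U12      # combined covariance
    return x3, P3
```

With uncorrelated sensors (rho = 0) this reduces to a covariance-weighted average, pulling the combined estimate toward the more certain sensor and yielding a combined covariance smaller than either input.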
Meanwhile, the sensor data with respect to the vehicles 410, 420, and 430 detected by the left side ultrasonic sensor 243 and/or the left rear radar sensor 220 in
In other words, the image of the vehicle 410 detected by the left side ultrasonic sensor 243 may be illustrated as an ellipse (hereinafter, a "first ellipse") elongated vertically, and the image of the vehicle 420 detected by the left rear radar sensor 220 may be illustrated as an ellipse (hereinafter, a "second ellipse") elongated horizontally. Here, the area of the second ellipse is larger than that of the first ellipse.
The sensor data combined by the sensor data combiner 52 may be illustrated as a right-side image of
This is further detailed with reference to
As illustrated in
Next, the effective obstacle detector 53 may detect obstacles on the basis of the data (hereinafter, “the combined data”) combined by the sensor data combiner 52, and may detect obstacles positioned in an interested area 710 among the detected obstacles as effective obstacles. Here, the interested area 710 may be a predetermined area which is illustrated in
In addition, the effective obstacle detector 53 may continuously predict the positions of the obstacles positioned in the interested area 710.
In particular, even when the sensor data from the radar sensors and the sensor data from the ultrasonic sensors are not continuously transmitted, the effective obstacle detector 53 may predict the positions of the obstacles on the basis of Equation 2 and Equation 3.
In other words, the effective obstacle detector 53 may estimate the positions of the obstacles according to time on the basis of Equation 2. The estimated positions of the obstacles are illustrated in
X̂k = Fk−1X̂k−1 + Gkwk  Equation 2
Here, X̂k indicates an estimated position of an obstacle at a point in time k, X̂k−1 indicates an estimated position of the obstacle at a point in time k−1, F indicates a mathematical model of the obstacle state change, G indicates an input model related to the state change, and w indicates a system input (noise and the like).
In addition, the effective obstacle detector 53 may update the covariance on the basis of the following Equation 3:
Pk = Fk−1Pk−1Fk−1ᵀ + Rw  Equation 3
Here, Pk indicates the covariance at the point in time k, Pk−1 indicates the covariance at the point in time k−1, and Rw indicates the model error covariance.
In addition, the effective obstacle detector 53 may determine whether or not the obstacle is positioned in the interested area, on the basis of Equation 4. In other words, it may be determined that the obstacle is out of the interested area when satisfying the following Equation 4:
√tr(Pk) > Plimit  Equation 4
Here, Plimit indicates an upper limit (threshold value) of valid covariance, and tr indicates the trace function, which sums the diagonal components of a matrix.
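A minimal sketch of one prediction step under Equations 2 through 4 follows, assuming a simple constant-velocity state model; the concrete matrices and the function name `predict_track` are illustrative assumptions, not the patented implementation:

```python
import numpy as np

def predict_track(x, P, F, G, w, Rw, P_limit):
    """One prediction step for an obstacle track.

    x: state estimate, P: its covariance, F: state change model,
    G: input model, w: system input (noise), Rw: model error covariance.
    Returns the predicted state, the predicted covariance, and whether
    the track is still considered valid (the negation of Equation 4).
    """
    x_next = F @ x + G @ w                        # Equation 2
    P_next = F @ P @ F.T + Rw                     # Equation 3
    valid = np.sqrt(np.trace(P_next)) <= P_limit  # Equation 4 check
    return x_next, P_next, valid
```

With a 100 ms cycle, a constant-velocity model would use F = [[1, 0.1], [0, 1]] for a (position, velocity) state; as predictions accumulate without fresh sensor data, the trace of P grows until the track falls out of the interested area.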
In such a technique for predicting the positions of the obstacles over time, the transmission cycle of data from the sensors may be 100 ms. However, the technique may also be used to predict the positions of the obstacles with a transmission cycle of 50 ms.
Next, the collision predictor 54 may detect, as a final obstacle, an obstacle positioned in a potential collision area 910 among the effective obstacles detected by the effective obstacle detector 53. In other words, the collision predictor 54 may detect the vehicle 410 positioned in the potential collision area 910 as illustrated in
The collision predictor 54 may predict collision or non-collision with the final obstacle, according to a collision probability with the final obstacle on the basis of a standard deviation of the covariance of the combined data. In other words, the collision predictor 54 may calculate the standard deviation using the covariance of the combined data generated by the sensor data combiner 52, calculate the collision probability in consideration of the calculated standard deviation, and predict the collision with the final obstacle when the calculated collision probability exceeds a reference value.
With reference to
Here, tinv is the inverse of the TTC (Time To Collision), and satisfies
Here, ve indicates a velocity of a vehicle (the vehicle 400), vt indicates a velocity of an obstacle (the vehicle 410), and Δx indicates a distance to collision.
In addition,
By applying σt to Equation 5, the collision probability with the obstacle may be predicted with high accuracy.
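Equation 5 itself is not reproduced in the text, so the following is only a generic illustration of the idea described above: it assumes the collision probability is evaluated as the chance that the true inverse time-to-collision exceeds a threshold, given Gaussian uncertainty with standard deviation σt derived from the combined covariance; the function names and threshold form are assumptions:

```python
import math

def collision_probability(t_inv, sigma_t, t_threshold):
    """Probability that the true inverse TTC exceeds t_threshold,
    assuming the estimate t_inv has Gaussian uncertainty sigma_t
    (derived from the combined covariance)."""
    z = (t_threshold - t_inv) / sigma_t
    # Standard normal survival function via the complementary error function.
    return 0.5 * math.erfc(z / math.sqrt(2.0))

def predict_collision(t_inv, sigma_t, t_threshold, p_ref):
    """Predict a collision when the probability exceeds the reference value."""
    return collision_probability(t_inv, sigma_t, t_threshold) > p_ref
```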
In the exemplary embodiments of the present invention, the side collision avoidance system that is configured as a single independent system is described by way of example. However, the functions of the side collision avoidance system may be added to the well-known SPAS.
First of all, the controller 50 may combine sensor data (hereinafter, “radar sensor data”) from the radar sensor with sensor data from the ultrasonic sensor (hereinafter, “ultrasonic sensor data”) to generate combined data in operation 1101.
Next, the controller 50 may detect an obstacle on the basis of the combined data in operation 1102.
Thereafter, the controller 50 may predict collision or non-collision with the detected obstacle on the basis of the combined data in operation 1103.
Then, when the collision is predicted, the brake driver may brake the vehicle in operation 1104.
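The four operations above can be sketched as a single control pass; the function name and the injected callables are hypothetical stand-ins for the sensor data combiner, effective obstacle detector, collision predictor, and brake driver described in the text:

```python
def side_collision_avoidance_step(radar_data, ultrasonic_data,
                                  combine, detect, predict, brake):
    """One pass of the four-step method (operations 1101-1104).

    combine, detect, predict, and brake are injected callables; their
    names and signatures are assumptions made for illustration.
    """
    combined = combine(radar_data, ultrasonic_data)           # operation 1101
    obstacle = detect(combined)                               # operation 1102
    if obstacle is not None and predict(combined, obstacle):  # operation 1103
        brake()                                               # operation 1104
        return True
    return False
```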
Meanwhile, the above-stated method according to the exemplary embodiment of the present invention may be written as a computer program. Codes and code segments constituting the program may easily be inferred by a computer programmer skilled in the art. In addition, the written program may be stored in a computer-readable recording medium (an information storage medium) and be read and executed by a computer, thereby implementing the method according to the exemplary embodiment of the present invention. The recording medium includes all types of computer-readable recording media.
As set forth above, the side collision avoidance system and method for a vehicle may combine data from radar sensors mounted on the front and rear surfaces of the vehicle with data from ultrasonic sensors mounted on the side surfaces of the vehicle and detect obstacles on the basis of the combined data, thereby increasing accuracy in the detection of obstacles and predicting the probability of collisions with obstacles with high accuracy.
In addition, the side collision avoidance system may provide optimal performance by allowing the radar sensors and the ultrasonic sensors to compensate for each other's disadvantages.
Furthermore, the present inventive concept may be applied to an SPAS, a blind spot detection (BSD) system, a rear cross traffic alert (RCTA) system, a smart cruise control (SCC) system, and the like, to thereby improve performance.
Hereinabove, although the present invention has been described with reference to exemplary embodiments and the accompanying drawings, the present invention is not limited thereto, but may be variously modified and altered by those skilled in the art to which the present invention pertains without departing from the spirit and scope of the present invention claimed in the following claims.
Publication: US 20170274876 A1, Sep. 2017, United States.