The present application relates to a system and method for determining whether an object vehicle is changing direction or turning using a sensor system.
Vehicle safety systems are becoming increasingly prevalent in today's vehicles. Some such vehicle safety systems are incorporated in order to reduce the likelihood of, or prepare a host vehicle for, an imminent crash situation.
One conventional vehicle safety system is a Supplementary Restraint System (SRS). An SRS is an airbag system that works together with conventional three-point seat belts to prevent a driver or passenger from impacting a hard surface (e.g., steering wheel or dashboard) in the event of a collision.
Another conventional vehicle safety system is a Collision-Mitigation-By-Braking (CMBB) system. CMBB systems operate by braking the host vehicle in order to reduce the kinetic energy of an imminent impact, thereby greatly reducing the severity of a crash.
Yet another conventional vehicle safety system is Adaptive Cruise Control (ACC). ACC operates by automatically adjusting the host vehicle's speed to maintain a desired distance from a target vehicle. An ACC system can operate to decelerate or accelerate the vehicle according to the desired speed and distance settings established by a host vehicle driver.
A method, according to one or more embodiments of the present application, may include transmitting, from a sensor unit, a number of signal pulses over a detection area external to a host vehicle. The method may further include receiving, at the sensor unit, one or more of the signal pulses reflected from an object vehicle located in the detection area and determining whether the object vehicle is turning based upon the one or more reflected signal pulses.
The sensor unit may include a single transmitter for transmitting the number of signal pulses over the detection area. Moreover, the number of signal pulses may comprise a number of infra-red (IR) light pulses distributed evenly over the detection area through a transmission lens. The sensor unit may include a single receiver for receiving the one or more signal pulses reflected from the object vehicle. The receiver may include a left channel corresponding to a left region of the detection area and a right channel corresponding to a right region of the detection area. The left channel may receive the one or more signal pulses reflected from a left rear portion of the object vehicle at least partially located in the left region of the detection area. Further, the right channel may receive the one or more signal pulses reflected from a right rear portion of the object vehicle at least partially located in the right region of the detection area.
The step of determining whether the object vehicle is turning based upon the one or more reflected signal pulses may include determining a first relative traveling distance between the left rear portion of the object vehicle and the host vehicle based upon the reflected signal pulses received at the left channel of the receiver. The step may further include determining a second relative traveling distance between the right rear portion of the object vehicle and the host vehicle based upon the reflected signal pulses received at the right channel of the receiver and determining whether the object vehicle is turning based upon a difference between the first and second relative traveling distances.
The step of determining whether the object vehicle is turning based upon the difference between the first and second relative traveling distances may include comparing the difference to a threshold and detecting that the object vehicle is turning left upon a determination that the difference exceeds the threshold and the first relative traveling distance is less than the second relative traveling distance.
Alternatively, the step of determining whether the object vehicle is turning based upon the one or more reflected signal pulses may include determining a first relative traveling velocity between the left rear portion of the object vehicle and the host vehicle based upon the reflected signal pulses received at the left channel of the receiver, determining a second relative traveling velocity between the right rear portion of the object vehicle and the host vehicle based upon the reflected signal pulses received at the right channel of the receiver, and determining whether the object vehicle is turning based upon a difference between the first and second relative traveling velocities.
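The distance-based determination described in the preceding paragraphs can be sketched as a small function. This is an illustrative sketch, not the claimed implementation; the function name, argument names, and any threshold value are assumptions for the sake of the example.

```python
def detect_turn(d_left, d_right, threshold):
    """Classify the object vehicle's turn from per-channel distances.

    d_left    -- relative traveling distance to the left rear portion,
                 derived from pulses received at the left channel
    d_right   -- relative traveling distance to the right rear portion,
                 derived from pulses received at the right channel
    threshold -- turning threshold (an illustrative calibration value)

    Returns "left", "right", or None when the vehicle is not turning.
    """
    difference = abs(d_left - d_right)
    if difference <= threshold:
        return None       # difference within threshold: not turning
    if d_left < d_right:
        return "left"     # left rear portion closer: left-hand turn
    return "right"        # right rear portion closer: right-hand turn
```

For instance, `detect_turn(9.5, 12.0, 1.0)` reports a left turn, since the left rear portion is more than the threshold closer than the right rear portion. The velocity-based alternative follows the same pattern with relative velocities in place of distances.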
A system, according to one or more embodiments of the present application, may include a sensor unit located on a host vehicle having a transmitter that can emit a signal distributed about a detection area external to the host vehicle. The sensor unit may further include a receiver that can receive one or more left reflected signals corresponding to the transmitted signal reflected from a left rear portion of an object vehicle located in a left region of the detection area. The receiver can also receive one or more right reflected signals corresponding to the transmitted signal reflected from a right rear portion of the object vehicle located in a right region of the detection area. The system may further include a controller configured to determine whether the object vehicle is turning based upon a difference between the left and right reflected signals.
The sensor unit may be mounted behind a central portion of a windshield of the host vehicle. Moreover, the sensor unit may further include a housing for at least partially enclosing the transmitter and the receiver with the windshield.
The transmitter may include a transmission lens and the signal may include a plurality of infrared (IR) light pulses emitted through the transmission lens. The receiver may include a left channel configured to receive the one or more left reflected signals and a right channel configured to receive the one or more right reflected signals. The receiver may further include at least a left receiver lens that directs the one or more left reflected signals to the left channel and a right receiver lens that directs the one or more right reflected signals to the right channel.
The controller may be configured to determine a first relative traveling distance between the left rear portion of the object vehicle and the host vehicle based upon the one or more left reflected signals, determine a second relative traveling distance between the right rear portion of the object vehicle and the host vehicle based upon the one or more right reflected signals, and determine whether the object vehicle is turning based upon a difference between the first and second relative traveling distances. The controller may be further configured to compare the difference between the first and second relative traveling distances to a threshold and detect that the object vehicle is turning right upon a determination that the difference exceeds the threshold and the first relative traveling distance is greater than the second relative traveling distance.
Alternatively, the controller may be configured to determine a first relative traveling velocity between the left rear portion of the object vehicle and the host vehicle based upon the one or more left reflected signals, determine a second relative traveling velocity between the right rear portion of the object vehicle and the host vehicle based upon the one or more right reflected signals, and determine whether the object vehicle is turning based upon a difference between the first and second relative traveling velocities.
A detailed description and accompanying drawings are set forth below.
a depicts an exemplary environmental diagram of the object vehicle turning left according to one or more embodiments of the present application;
b depicts an exemplary environmental diagram of the object vehicle turning right according to one or more embodiments of the present application;
c depicts an exemplary environmental diagram of the object vehicle traveling straight according to one or more embodiments of the present application; and
As required, detailed embodiments of the present application are disclosed herein; however, it is to be understood that the disclosed embodiments are merely exemplary of an apparatus, system or method that may be embodied in various and alternative forms. The figures are not necessarily to scale; some features may be exaggerated or minimized to show details of particular components. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting, but merely as a representative basis for teaching one skilled in the art to variously employ one or more embodiments of the present application.
With reference to the drawings,
The sensor unit 16 may be located within the host vehicle 10 in a suitable location that can protect it from external elements. For example, the sensor unit 16 may be positioned behind a front windshield 20 of the host vehicle 10. As such, the sensor unit 16 may be protected from ambient conditions that may include rain, snow, sleet, wind, or the like. According to one or more embodiments, the sensor unit 16 may be positioned adjacent a rear-view mirror (not shown). Alternatively, the sensor unit 16 may be positioned on top of the host vehicle's dashboard (not shown) near a base 22 of the windshield 20. Moreover, the mounting location of the sensor unit 16 may be selected to provide the sensor unit 16 with a detection area 24 that projects beyond a front end 26 of the host vehicle 10 to detect objects, such as the object vehicle 12, that the host vehicle 10 may be approaching. In this regard, other mounting locations for the sensor unit 16 may be employed without departing from the scope of the present application, such as behind a vehicle grill, so long as the detection area 24 is not easily obscured.
The sensor unit 16 may include a sensor 28 for the detection of objects within the detection area 24. The sensor 28 may be a laser sensor, sonar sensor, vision sensor, or the like, suitable for detecting objects such as another vehicle in the detection area 24. The sensor system 14 may be employed to detect the relative traveling distance of an object from the host vehicle 10. The sensor system 14 may then use the detected distance in order to determine a relative velocity of the object that may be approaching the host vehicle 10. According to one or more embodiments, the sensor system 14 may be a closing velocity (CV) sensor system and the sensor 28 may be an infrared (IR) light sensor or other closing velocity sensor that may obtain distance data based upon changes in velocity.
The controller 18 may receive the sensed distance and/or velocity data corresponding to an object in the detection area 24 from the sensor unit 16. Further, the controller 18 may process the detected distance and/or relative velocity data and communicate the information to other vehicle performance and safety systems 30 to assist a driver. The controller 18 may communicate distance and velocity data to the other vehicle performance and safety systems 30 via a controller area network (CAN) 32. For instance, the controller 18 may provide distance and velocity data about objects in the detection area 24 to the CAN 32 for use by safety systems such as a supplementary restraint system (SRS), adaptive cruise control (ACC), forward collision warning (FCW), collision mitigation by braking (CMBB), or the like.
According to one or more embodiments, an object in the detection area 24 may be the object vehicle 12. More specifically, the object in the detection area 24 may be a rear end 34 of the object vehicle 12. Thus, the sensor system 14 may obtain distance and/or relative velocity data associated with the rear end 34.
According to one or more embodiments, the signals 42 emitted by the transmitter 38 may be light signals, such as IR laser light signals or the like. For instance, the transmitter 38 may emit a series of laser light pulses. The transmitter 38 may be accompanied by an optical transmission lens 48 that can distribute the emitted laser radiation relatively evenly over the detection area 24. An object in the detection area 24 may reflect one or more of the laser light pulses back to the sensor unit 16. The reflected light pulses may be received at the receiver 40.
According to one or more embodiments, the receiver 40 may include a plurality of optical receiving lenses 50, each associated with a different receiver channel 52. Accordingly, the detection area 24 may be generally subdivided into several detection regions, one for each channel 52. For instance, the receiver 40 may include a left channel 52a, a center channel 52b, and a right channel 52c. The intensity of the reflected light 44 may be measured through each receiving lens 50, for example, by a light-sensitive diode associated with each channel 52.
The processor 46 may collect data from the receiver 40 and may calculate a distance and a velocity for each channel 52 associated with a region in which an object is present. The processor 46 may use time-of-flight measurements of the light pulses between transmission and reception to calculate relative distances between the host vehicle 10 and an object in the detection area 24 of the sensor unit 16, such as the object vehicle 12. Relative velocity data may be generated from changes in the measured distances between the host vehicle 10 and the object vehicle 12 within a defined time period.
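The two calculations attributed to the processor 46 above can be sketched as follows. This is a minimal illustration of time-of-flight ranging and velocity-from-distance-change, assuming hypothetical function names; the actual processing performed by the sensor unit is not specified at this level of detail in the source.

```python
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def distance_from_time_of_flight(round_trip_seconds):
    """Relative distance from a light pulse's round-trip time.

    The pulse travels out to the object and back, so the one-way
    distance is half the total path length.
    """
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

def relative_velocity(d_previous, d_current, dt):
    """Relative velocity from two successive distance measurements
    taken dt seconds apart. A negative value indicates the object
    vehicle is closing on the host vehicle.
    """
    return (d_current - d_previous) / dt
```

For example, a round-trip time of 100 nanoseconds corresponds to a relative distance of roughly 15 meters, and a distance that shrinks from 15.0 m to 14.0 m over 0.1 s corresponds to a closing velocity of 10 m/s.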
The sensor unit 16 may transmit the series of light pulses periodically. Correspondingly, the sensor unit 16 may communicate periodic updates of distance and velocity data for each channel 52 to the controller 18 or other systems 30 via the CAN 32.
With reference now to
As shown in
Referring generally to
The rear end 34 of the object vehicle 12 may include a left rear portion 60 and a right rear portion 62. The left rear portion 60 may generally correspond to an area of the rear end 34 proximate the left taillight. The right rear portion 62 may generally correspond to an area of the rear end 34 proximate the right taillight. As shown in
When the object vehicle 12 is making a left-hand turn, the distance between the right rear portion 62 of the object vehicle's rear end 34 and the host vehicle 10 may become greater than the distance between the left rear portion 60 and the host vehicle 10. Moreover, the difference in the relative velocity for each of the left rear portion 60 and the right rear portion 62 with respect to the host vehicle 10 may increase or decrease depending on whether the host vehicle 10 is gaining on the object vehicle 12. For instance, if the host vehicle 10 is gaining on the object vehicle 12 while the object vehicle is turning left, the relative closing velocity of the left rear portion 60 with respect to the host vehicle 10 may be greater than the relative closing velocity of the right rear portion 62. Of course, the opposite may occur if the host vehicle 10 is traveling at the same speed as, or a lower speed than, the object vehicle 12.
The changes in distance and/or relative velocity between the left rear portion 60 and right rear portion 62 of the object vehicle's rear end 34 may be used to detect whether the object vehicle 12 is turning and in which direction. To this end, distance and/or relative velocity data associated with the left rear portion 60 of the object vehicle's rear end 34 may be calculated from reflected light pulses 44 received at the left channel 52a, which may correspond to the left detection region 54a. Moreover, distance and/or relative velocity data associated with the right rear portion 62 of the object vehicle's rear end 34 may be calculated from reflected light pulses 44 received at the right channel 52c, which may correspond to the right detection region 54c. Further, the difference (Δd) in distance and/or relative velocity between the left rear portion values and the right rear portion values may be determined. Based on this difference, the sensor system 14 may determine which direction, if any, that the object vehicle 12 is turning.
As shown in
At step 530, the sensor system 14 may determine whether an object is present in the detection area 24 based on the reflected signals 44 received by the sensor unit 16. Further, the sensor system 14 may determine whether an object detected in the detection area 24 is a vehicle, such as the object vehicle 12. If no object vehicle 12 is detected, the method may return to step 510 and the sensor system 14 may continue to monitor for objects in the detection area 24. If, on the other hand, the sensor system 14 determines that another vehicle is in the detection area 24, then the method may proceed to step 540. The sensor system 14 may calculate distance and/or relative velocity data for both the left rear portion 60 and the right rear portion 62 of the object vehicle 12 with respect to the host vehicle 10. Distance and/or velocity data associated with the left rear portion 60 may be obtained from light pulses reflected off the left rear portion and received at the left channel 52a of the receiver 40. Likewise, distance and/or velocity data associated with the right rear portion 62 may be obtained from light pulses reflected off the right rear portion and received at the right channel 52c of the receiver 40. At step 540, the sensor system 14 may determine the difference Δd in the distances and/or relative velocities between the left rear portion 60 and the host vehicle 10 and the right rear portion 62 and the host vehicle 10.
At step 550, the sensor system 14 may determine whether the difference Δd exceeds the turning threshold. For example, if the difference Δd is equal to or less than the turning threshold, then the method may proceed to step 560. At step 560, the sensor system 14 may determine that the object vehicle 12 is not turning. However, if at step 550 the difference Δd is greater than the turning threshold, then the sensor system 14 may conclude that the object vehicle 12 is turning and the method may proceed to step 570.
At step 570, the sensor system 14 may compare distance and/or relative velocity data received at the left and right channels 52a, 52c of the receiver 40. For instance, if the reflected light pulses received at the left and right channels indicate that the left rear portion 60 of the object vehicle 12 is farther away from the host vehicle 10 than the right rear portion 62, then the method may proceed to step 580. At step 580, the sensor system 14 may conclude that the object vehicle 12 is turning to the right of the host vehicle 10. If, on the other hand, the reflected light pulses received at the left and right channels indicate that the left rear portion 60 of the object vehicle 12 is closer to the host vehicle 10 than the right rear portion 62, then the method may proceed to step 590. At step 590, the sensor system 14 may conclude that the object vehicle 12 is turning to the left of the host vehicle 10.
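The overall method (steps 510 through 590) can be sketched as a single monitoring pass. The sensor-reading callable below is a hypothetical stand-in for the transmit/receive cycle of the sensor unit 16; its name and return convention are assumptions made for this illustration.

```python
def monitor(read_channels, threshold):
    """One pass of the turn-detection method (steps 510-590).

    read_channels -- hypothetical stand-in for the sensor unit:
                     returns (d_left, d_right) distances in meters,
                     or None when no object vehicle is detected
    threshold     -- the turning threshold of step 550
    """
    reading = read_channels()           # steps 510-520: transmit and receive
    if reading is None:                 # step 530: no object vehicle detected
        return "no vehicle"
    d_left, d_right = reading           # step 540: per-channel distances
    difference = abs(d_left - d_right)
    if difference <= threshold:         # steps 550-560: within threshold
        return "not turning"
    # steps 570-590: compare channels to find the turn direction
    if d_left > d_right:
        return "turning right"          # step 580: left portion farther
    return "turning left"               # step 590: left portion closer
```

The same structure applies when relative velocities are compared instead of, or in addition to, distances.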
It should be noted that the method of
While exemplary embodiments are described above, it is not intended that these embodiments describe all possible embodiments of the application. Rather, the words used in the specification are words of description rather than limitation, and it is understood that various changes may be made without departing from the spirit and scope of the application. Additionally, the features of various implementing embodiments may be combined to form further embodiments of the application.
Number | Name | Date | Kind |
---|---|---|---|
4757450 | Etoh | Jul 1988 | A |
4802096 | Hainsworth et al. | Jan 1989 | A |
5026153 | Suzuki et al. | Jun 1991 | A |
5227784 | Masamori et al. | Jul 1993 | A |
5266955 | Izumi et al. | Nov 1993 | A |
5471214 | Faibish et al. | Nov 1995 | A |
5529138 | Shaw et al. | Jun 1996 | A |
5594645 | Nishimura et al. | Jan 1997 | A |
5710565 | Shirai et al. | Jan 1998 | A |
5714928 | Sudo et al. | Feb 1998 | A |
5973618 | Ellis | Oct 1999 | A |
6084508 | Mai et al. | Jul 2000 | A |
6087975 | Sugimoto et al. | Jul 2000 | A |
6298298 | Tange et al. | Oct 2001 | B1 |
6304321 | Wangler et al. | Oct 2001 | B1 |
6343810 | Breed | Feb 2002 | B1 |
6466863 | Shirai et al. | Oct 2002 | B2 |
6484087 | Shirai et al. | Nov 2002 | B2 |
6517172 | Bond, III et al. | Feb 2003 | B1 |
6523912 | Bond, III et al. | Feb 2003 | B1 |
6532408 | Breed | Mar 2003 | B1 |
6639543 | Puglia | Oct 2003 | B2 |
6640182 | Matsui | Oct 2003 | B2 |
6650983 | Rao et al. | Nov 2003 | B1 |
6885968 | Breed et al. | Apr 2005 | B2 |
6950014 | Rao et al. | Sep 2005 | B2 |
7049945 | Breed et al. | May 2006 | B2 |
7486803 | Camus | Feb 2009 | B2 |
7660438 | Camus | Feb 2010 | B2 |
20050267657 | Devdhar | Dec 2005 | A1 |
20050278098 | Breed | Dec 2005 | A1 |
20070228705 | Rao et al. | Oct 2007 | A1 |
20090018711 | Ueda et al. | Jan 2009 | A1 |
Number | Date | Country |
---|---|---|
7200990 | Aug 1995 | JP |
Entry |
---|
Dabbour E. et al., Perceptual Framework for a Modern Left-Turn Collision Warning System, International Journal of Applied Science, Engineering and Technology, vol. 5, no. 1, 2009, pp. 8-14. |
Press Release, Continental, New Type of Precrash Sensor is Able to Prevent Many Accidents in Urban Traffic, Apr. 18, 2007, pp. 1-4. |
Press Release, Continental, World Premiere of Continental Sensor System in the New Volvo XC60, pp. 1-4. |
Number | Date | Country | |
---|---|---|---|
20120109504 A1 | May 2012 | US |