The technical field generally relates to alert systems of a vehicle, and more particularly relates to alert systems of a vehicle that utilize real-time lane information for alerting a driver of the vehicle.
Vehicles include alert systems that detect objects in proximity to the vehicle and alert the driver to the object. The alerts are typically generated based on a location of the object and based on a particular driving maneuver that is or will be occurring. Such alert systems can include, but are not limited to, side blind zone alert systems, lane change alert systems, and other systems using front, side, and rear view cameras.
Sensory devices coupled to the rear, side, and/or front of the vehicle detect objects within particular areas. Typically the sensory devices are placed and/or calibrated to detect objects within a defined area around the vehicle. For example, the defined area may be intended to encompass an adjacent lane. However, the width of the lane can vary from road to road, and thus the predefined area may encompass more or less than the adjacent lane. When the predefined area encompasses an area that includes more than the adjacent lane, moving objects that fall within that area but that fall outside of the adjacent lane may be detected (e.g., a moving vehicle may be detected two lanes over). Such objects would be interpreted by the sensory device as being within the adjacent lane and, consequently, may cause false alerts.
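The false-alert scenario above can be made concrete with a short sketch. The zone width, offsets, and function names below are illustrative assumptions, not values from the source: a detection zone calibrated for one lane width can capture an object that actually sits outside a narrower adjacent lane.

```python
# Hypothetical illustration of the false-alert problem: a detection zone
# calibrated for a 3.7 m lane captures objects that are actually outside
# a narrower 3.0 m adjacent lane. All values are assumed for illustration.

FIXED_ZONE_WIDTH_M = 3.7  # assumed fixed calibration width

def in_fixed_zone(lateral_offset_m: float) -> bool:
    """True if an object's lateral offset falls inside the fixed zone."""
    return 0.0 < lateral_offset_m <= FIXED_ZONE_WIDTH_M

def in_adjacent_lane(lateral_offset_m: float, lane_width_m: float) -> bool:
    """True if the object actually lies within the adjacent lane."""
    return 0.0 < lateral_offset_m <= lane_width_m

# An object 3.5 m to the side on a road with narrow 3.0 m lanes:
offset = 3.5
print(in_fixed_zone(offset))           # True: the fixed zone detects it
print(in_adjacent_lane(offset, 3.0))   # False: it is in the next lane over
```

With a fixed zone, the object at 3.5 m triggers an alert even though it is not in the adjacent lane; with the real-time lane width, it does not.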
Accordingly, it is desirable to provide methods and systems that take into account real-time lane information when generating the alerts. Furthermore, other desirable features and characteristics of the present invention will become apparent from the subsequent detailed description and the appended claims, taken in conjunction with the accompanying drawings and the foregoing technical field and background.
Methods and systems are provided for alerting a driver of a vehicle. In one embodiment, a method includes: receiving sensor data that is generated by an image sensor that senses conditions in proximity of the vehicle; determining real-time lane information from the sensor data, wherein the real-time lane information includes at least one of a lane width, a lane type, and a lane curvature; selectively performing an alert method that evaluates the presence of objects in proximity to the vehicle based on the real-time lane information; and selectively generating an alert signal to alert the driver based on the alert method.
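The claimed method steps can be sketched as a small pipeline. This is a hedged sketch only: the helper names (`determine_lane_info`, `run_alert_method`, `alert_pipeline`) and the simple width-based check are assumptions for illustration, not the claimed implementation.

```python
# Minimal sketch of the method flow: receive sensor data, determine
# real-time lane information, run an alert method, and return whether an
# alert signal should be generated. Names and logic are illustrative.
from dataclasses import dataclass

@dataclass
class LaneInfo:
    width_m: float
    lane_type: str
    curvature: float  # 1/m; 0.0 for a straight lane

def determine_lane_info(sensor_data: dict) -> LaneInfo:
    # Placeholder extraction; a real system derives these from images.
    return LaneInfo(sensor_data["width_m"], sensor_data["type"],
                    sensor_data["curvature"])

def run_alert_method(lane: LaneInfo, object_offset_m: float) -> bool:
    # Alert only if the detected object lies within the adjacent lane,
    # using the real-time lane width rather than a fixed zone.
    return 0.0 < object_offset_m <= lane.width_m

def alert_pipeline(sensor_data: dict, object_offset_m: float) -> bool:
    lane = determine_lane_info(sensor_data)
    return run_alert_method(lane, object_offset_m)

data = {"width_m": 3.0, "type": "middle", "curvature": 0.0}
print(alert_pipeline(data, 2.5))  # True: object is within the adjacent lane
```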
In another embodiment, a system includes a first module that receives sensor data that is generated by an image sensor of the vehicle, and that determines real-time lane information from the sensor data, wherein the real-time lane information includes at least one of a lane width, a lane type, and a lane curvature. A second module selectively performs an alert method that evaluates the presence of objects in proximity to the vehicle based on the real-time lane information and selectively generates an alert signal to alert the driver based on the alert method.
In still another embodiment, a vehicle includes at least one image sensor that generates a sensor signal. A control module receives the sensor signal, determines real-time lane information from the sensor signal, performs an alert method that evaluates the presence of objects in proximity to the vehicle based on the real-time lane information, and selectively generates an alert signal to alert the driver based on the alert method. The real-time lane information includes at least one of a lane width, a lane type, and a lane curvature.
The exemplary embodiments will hereinafter be described in conjunction with the following drawing figures, wherein like numerals denote like elements, and wherein:
The following detailed description is merely exemplary in nature and is not intended to limit the application and uses. Furthermore, there is no intention to be bound by any expressed or implied theory presented in the preceding technical field, background, brief summary or the following detailed description. It should be understood that throughout the drawings, corresponding reference numerals indicate like or corresponding parts and features. As used herein, the term module refers to any hardware, software, firmware, electronic control component, processing logic, and/or processor device, individually or in any combination, including without limitation: an application specific integrated circuit (ASIC), an electronic circuit, a processor (shared, dedicated, or group) and memory that executes one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that provide the described functionality.
Referring now to
The vehicle alert system 12 includes one or more sensors 14a-14n that sense observable conditions in proximity to the vehicle 10. The sensors 14a-14n can be image sensors, radar sensors, ultrasound sensors, or other sensors that sense observable conditions in proximity to the vehicle 10. For exemplary purposes, the disclosure is discussed in the context of the sensors 14a-14n being image sensors or cameras that track visual images of the surroundings of the vehicle 10. The image sensors can include, but are not limited to, a front image sensor 14a, a right side image sensor 14b, a left side image sensor 14c, and rear image sensors 14d, 14n.
The sensors 14a-14n sense the surroundings of the vehicle 10 and generate sensor signals based thereon. A control module 16 receives the signals, processes the signals, and selectively generates an alert signal. A warning system 18 receives the alert signal and generates an audible or visual warning to warn a driver or other occupant of the vehicle of an object in proximity to the vehicle 10. In various embodiments, the control module 16 determines real-time lane information based on the sensor signals and uses the real-time lane information in one or more alert methods to selectively generate the alert signals.
Referring now to
The lane width determination module 20 receives as input sensor data 28 from the front image sensor 14a and/or rear image sensors 14d, 14n of the vehicle 10 (
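One way the lane width 30 could be derived is sketched below, under the common assumption that the width is the lateral distance between the detected left and right lane markers at the vehicle's position. The function name and offsets are hypothetical, not from the source.

```python
# Hedged sketch: lane width as the distance between detected lane markers.
# Offsets are signed lateral positions (meters) relative to the vehicle
# centerline: negative to the left, positive to the right. Assumed model.

def lane_width_m(left_marker_offset_m: float,
                 right_marker_offset_m: float) -> float:
    """Lane width from signed lateral marker positions."""
    return right_marker_offset_m - left_marker_offset_m

# Markers detected 1.6 m to the left and 1.4 m to the right:
print(round(lane_width_m(-1.6, 1.4), 2))  # 3.0
```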
The lane type determination module 22 receives as input sensor data 32 from the front image sensor 14a, the side image sensors 14b, 14c, and/or the rear image sensors 14d, 14n of the vehicle 10. Based on the sensor data 32, the lane type determination module 22 determines a lane type 34 of the current lane. The lane type 34 may be, for example, but is not limited to, a right side single lane (e.g., a lane that is a single lane in the current direction and is the only lane to the right), a middle lane (e.g., a lane that has lanes on both sides in the same direction), a left side single lane (e.g., a lane that is a single lane in the current direction and is the only lane to the left), a right side multiple lane (e.g., a lane that is a rightmost lane of multiple lanes in the same direction), and a left side multiple lane (e.g., a lane that is a leftmost lane of multiple lanes in the same direction). In various embodiments, the lane type determination module 22 determines the lane type based on the detected lane markers (e.g., whether they are solid lines or dashed lines, whether they are white or yellow, etc.) to the right of the vehicle 10 and to the left of the vehicle 10.
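A marker-based classification of this kind can be sketched as follows. The mapping assumed here (a dashed line separates same-direction lanes, a solid line marks a road edge or opposing-traffic boundary) is a common convention offered for illustration; the source does not specify the exact rules, and the function name is hypothetical.

```python
# Hedged sketch: classify the lane type from the styles of the lane
# markers detected to the left and right of the vehicle. Assumes a dashed
# line means another same-direction lane lies on that side.

def classify_lane(left_marker: str, right_marker: str) -> str:
    """left_marker/right_marker are 'dashed' or 'solid'."""
    left_open = left_marker == "dashed"    # same-direction lane to the left
    right_open = right_marker == "dashed"  # same-direction lane to the right
    if left_open and right_open:
        return "middle lane"
    if left_open:
        return "rightmost lane of multiple lanes"
    if right_open:
        return "leftmost lane of multiple lanes"
    return "single lane"

print(classify_lane("dashed", "dashed"))  # middle lane
print(classify_lane("solid", "dashed"))   # leftmost lane of multiple lanes
```

An alert method could, for instance, suppress right-side blind zone alerts entirely when the classification reports no same-direction lane to the right.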
The lane curvature determination module 24 receives as input sensor data 36 from any one of the front image sensor 14a, the side image sensors 14b, 14c, and the rear image sensors 14d, 14n of the vehicle 10. Based on the sensor data 36, the lane curvature determination module 24 determines a curvature 38 of the lane. For example, the lane curvature determination module 24 evaluates the sensor data 36 of the front image sensor 14a and, depending on the patterns in which the lane markings appear in the image, computes projected lane paths and the lane curvature.
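The curvature calculation is described above only at a high level. A common approach, offered here as a hedged sketch rather than the claimed method, is to fit a quadratic to lane-marker points detected ahead of the vehicle and evaluate the standard curvature formula; the marker points below are synthetic and the helper name is hypothetical.

```python
# Hedged sketch: estimate lane curvature by fitting y = a*x^2 + b*x + c
# to lane-marker points (x = distance ahead, y = lateral offset), then
# evaluating kappa = |2a| / (1 + (2a*x + b)^2)^1.5 at the vehicle (x = 0).
import numpy as np

def lane_curvature(xs, ys, at_x=0.0):
    """Curvature (1/m) of a quadratic fit to marker points, at x = at_x."""
    a, b, _c = np.polyfit(xs, ys, 2)
    slope = 2 * a * at_x + b
    return abs(2 * a) / (1 + slope ** 2) ** 1.5

# Synthetic marker points on a gentle curve (lateral offset grows with
# distance ahead of the vehicle):
xs = np.array([0.0, 10.0, 20.0, 30.0])
ys = 0.002 * xs ** 2
print(round(lane_curvature(xs, ys), 4))  # 0.004 (radius of about 250 m)
```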
The alert module 26 receives as input the lane width 30, the lane type 34, the lane curvature 38, object data 40, and vehicle maneuver data 42. The object data 40 represents the presence of an object that has been detected (e.g., by radar or other sensing device) in proximity to the vehicle 10. Based on the inputs, the alert module 26 performs one or more alert methods. The alert methods selectively generate alert signals 44 to alert the driver of the detected object based on the lane width 30, the lane type 34, and the lane curvature 38 that are determined in real time.
In various embodiments, the alert methods can include, but are not limited to, side blind zone alert methods and lane change alert methods. The side blind zone alert methods, for example, evaluate the threat to a safe lane change maneuver based on detecting vehicles in a blind zone in the adjacent lane, next to the vehicle. The lane change alert methods, for example, evaluate the threat to a safe lane change maneuver based on computing a delta speed of approaching objects (e.g., vehicles) in adjacent lanes.
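The delta-speed evaluation in the lane change alert can be sketched as a time-to-close check. The 3.5-second threshold and function name are illustrative assumptions, not values from the source.

```python
# Hedged sketch of a lane change alert check: alert when an approaching
# vehicle in the adjacent lane would close the gap within an assumed
# time threshold (3.5 s here, purely illustrative).

def lane_change_alert(gap_m: float, delta_speed_mps: float,
                      threshold_s: float = 3.5) -> bool:
    """delta_speed_mps > 0 means the object in the adjacent lane is closing."""
    if delta_speed_mps <= 0:
        return False  # object not closing; lane change threat is low
    return gap_m / delta_speed_mps < threshold_s

print(lane_change_alert(20.0, 10.0))  # True: gap closes in 2 s
print(lane_change_alert(50.0, 5.0))   # False: gap closes in 10 s
```

Real-time lane information would gate such a check: for example, no alert is needed for a closing object on a side where the lane type indicates no same-direction lane exists.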
For example, as shown in
In another example, as shown in
As can be appreciated, in accordance with various embodiments, other alert methods known in the art can take into account this real-time information to improve the integrity of the alert method and to reduce the number of false alerts.
Referring now to
As can further be appreciated, the method of
In one example, the method may begin at 100. The sensor data is received at 110. The real-time lane information is determined at 200. In particular, the current lane width is determined, for example, as discussed above, at 120. The lane type is determined, for example, as discussed above at 130. The lane curvature is determined, for example, as discussed above at 140.
Once the real-time lane information is determined at 200, the alert methods are performed at 210 based on the real-time lane information. In particular, one or more of the alert methods selectively evaluate one or more of the lane width, the lane type, and the lane curvature to determine whether an alert should be generated at 150. If it is determined that a condition exists in which an alert should be generated at 160, the alert signal is generated at 170. Thereafter, the method may end at 180. If it is determined that a condition does not exist in which the alert should be generated at 160, the method may end at 180.
As can be appreciated, although the steps 200 and 210 are shown to be performed in sequential order, in various embodiments, the real-time information determination steps of 200 can be performed at different time intervals than those of the alert method steps of 210. In further various embodiments, the various alert methods can be performed at different time intervals from each other.
While at least one exemplary embodiment has been presented in the foregoing detailed description, it should be appreciated that a vast number of variations exist. It should also be appreciated that the exemplary embodiment or exemplary embodiments are only examples, and are not intended to limit the scope, applicability, or configuration of the disclosure in any way. Rather, the foregoing detailed description will provide those skilled in the art with a convenient road map for implementing the exemplary embodiment or exemplary embodiments. It should be understood that various changes can be made in the function and arrangement of elements without departing from the scope of the disclosure as set forth in the appended claims and the legal equivalents thereof.