The present invention relates generally to a vehicle sensing system for a vehicle and, more particularly, to a vehicle sensing system that utilizes one or more cameras at a vehicle.
Use of imaging sensors in vehicle imaging systems is common and known. Examples of such known systems are described in U.S. Pat. Nos. 5,949,331; 5,670,935 and/or 5,550,677, which are hereby incorporated herein by reference in their entireties.
A vehicular sensing system includes a camera disposed at a vehicle equipped with the vehicular sensing system that views exterior of the equipped vehicle. The camera is operable to capture image data. The camera includes a CMOS imaging array that includes at least one million photosensors arranged in rows and columns. The system includes a radar sensor disposed at the equipped vehicle and sensing at least rearward and sideward of the equipped vehicle. The radar sensor is operable to capture radar data. The system includes an electronic control unit (ECU) with electronic circuitry and associated software. Image data captured by the camera is transferred to the ECU, and radar data captured by the radar sensor is transferred to the ECU. The electronic circuitry of the ECU includes at least one data processor. The ECU is operable to process (i) image data captured by the camera and transferred to the ECU and (ii) radar data captured by the radar sensor and transferred to the ECU. The vehicular sensing system, via processing at the ECU of radar data captured by the radar sensor, detects an object within a blind spot of a driver of the equipped vehicle. The blind spot is at least sideward of the equipped vehicle. The vehicular sensing system, via processing at the ECU of image data captured by the camera, determines that the equipped vehicle is within a traffic lane that borders an edge of a road along which the equipped vehicle is traveling. The vehicular sensing system, responsive to determining that the detected object is not sideward of a side of the equipped vehicle that is closest to the edge of the road, generates a blind spot warning. The vehicular sensing system, responsive to determining that the detected object is sideward from the side of the equipped vehicle that is closest to the edge of the road along which the equipped vehicle is traveling, suppresses the blind spot warning.
These and other objects, advantages, purposes and features of the present invention will become apparent upon review of the following specification in conjunction with the drawings.
Blind spot monitoring (BSM), also known as a blind spot detection system or as blind spot assist, is an advanced driver assistance system (ADAS) feature commonly found in modern vehicles. Blind spot monitoring is designed to assist drivers in detecting and avoiding vehicles or other objects that may be in their blind spots, which are areas around the vehicle that are not directly visible to the driver through the side and rearview mirrors. The system uses sensors, typically located at the sides and/or rear of the vehicle, to continuously monitor the traffic lanes adjacent to the equipped vehicle. These sensors can detect other vehicles or other objects that enter the blind spot zones. When a vehicle is detected in the blind spot, the system alerts the driver through visual, audible, and/or tactile signals. The most common form of alert is a visual indicator, usually at the side mirrors or on the A-pillars of the equipped vehicle, which typically light up or blink to warn the driver when there is a vehicle present in the blind spot. Some systems also provide audible warnings, such as a beep or chime.
Blind spot monitor systems are especially useful during lane changes or when merging into traffic. The system acts as an additional safety measure to assist drivers in making better informed decisions and help reduce the risk of collisions caused by unnoticed vehicles in the blind spots. In some systems, one or more rear corner radar sensors at the vehicle detect relevant objects (vehicles, motorcycles, bicyclists, etc.) in the vehicle's vicinity. This sensor information is used by the system to generate alerts for the driver. In some driving scenarios, radar sensors may interpret guardrails, metal barriers, a row of parked vehicles, etc., as a moving vehicle or object. This results in a false positive alert to the driver. Implementations herein use one or more additional sensors to suppress such false positive alerts.
A vehicle sensing system and/or driver or driving assist system and/or object detection system and/or alert system operates to capture sensor data exterior of the vehicle and may process the captured sensor data to detect objects at or near the vehicle and in the predicted path of the vehicle, such as to assist a driver of the vehicle during a lane change maneuver. The sensing system includes a data processor or data processing system that is operable to receive sensor data from one or more sensors (e.g., cameras and/or radar sensors) and may provide an output to a display device for displaying images representative of the captured sensor data. Optionally, the vision system may provide a display, such as a rearview display or a top down or bird's eye or surround view display or the like.
Referring now to the drawings and the illustrative embodiments depicted therein, a vehicle 10 includes a sensing system 12 that includes at least one exterior viewing imaging sensor or camera, such as a front camera module (FCM) 14a (and the system may optionally include multiple exterior viewing imaging sensors or cameras, such as a forward viewing camera 14b at the front of the vehicle, a sideward/rearward viewing camera 14c, 14d at respective sides of the vehicle, and/or a rearward viewing imaging sensor or camera 14e, such as the rear backup camera of the vehicle), which captures images exterior of the vehicle, with the camera having a lens for focusing images at or onto an imaging array or imaging plane or imager of the camera.
The system optionally includes a sensor fusion module 26. The sensor fusion module 26 receives the object information from sensors (e.g., the FCM 14a and the radar sensors 15) and combines or fuses the information to calculate an accurate position of objects detected near the vehicle along with various parameters of each object, such as pose, velocity, acceleration, etc. Alternatively, the object information from one or more radar sensors 15 may be used directly (i.e., without first fusing it with object information derived from image data captured by the FCM 14a). An optional vehicle state estimator module 28 estimates various states of the equipped vehicle (e.g., vehicle speed, yaw rate, acceleration, vehicle position, etc.) using the current state and external disturbances (e.g., road gradient and other environmental conditions). The vehicle state estimator 28 may also provide parameters for the equipped vehicle. Alternatively, information reported by vehicle sensors may be used directly.
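As a minimal illustrative sketch of such a fusion step (the fusion module 26 is not limited to any particular algorithm), radar tracks may be associated with camera detections by nearest-neighbor gating and their positions averaged; the Track type, gating distance, and averaging scheme used here are hypothetical:

```python
# Illustrative sketch only: hypothetical Track type and gating; the
# actual fusion module 26 may use Kalman filtering or other methods.
from dataclasses import dataclass
import math

@dataclass
class Track:
    x: float   # longitudinal position (m), vehicle frame
    y: float   # lateral position (m), vehicle frame
    vx: float  # longitudinal velocity (m/s)
    vy: float  # lateral velocity (m/s)

def fuse(radar: list[Track], camera: list[Track],
         gate_m: float = 2.0) -> list[Track]:
    """Associate each radar track with the nearest camera detection
    within a gating distance and average their positions; otherwise
    pass the radar track through unchanged (radar-only object)."""
    fused = []
    for r in radar:
        best, best_d = None, gate_m
        for c in camera:
            d = math.hypot(r.x - c.x, r.y - c.y)
            if d < best_d:
                best, best_d = c, d
        if best is None:
            fused.append(r)  # no camera confirmation within the gate
        else:
            # average positions; keep radar velocity (radar measures it directly)
            fused.append(Track((r.x + best.x) / 2, (r.y + best.y) / 2,
                               r.vx, r.vy))
    return fused
```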
The BSM 24 identifies any threat (e.g., other objects such as vehicles, bicycles, etc.) present in one or more blind spots of a driver of the equipped vehicle. The BSM 24 uses information from the vehicle state estimator 28, the FCM 14a (or other camera, such as a surround view camera disposed at the front or at a side mirror of the vehicle), and the sensor fusion module 26 to determine whether any threats exist within the field of sensing of the sensors. The radar sensor(s) 15 may generate outputs defining or identifying guardrails, barriers, rows of parked vehicles, etc., as valid and moving objects. This can lead to the blind spot monitoring system 24 incorrectly considering these objects as a threat. For example, when the equipped vehicle travels along the leftmost or rightmost traffic lane with a guardrail present adjacent to and near the equipped vehicle, the radar sensor 15 may identify the guardrail as an object in a blind spot of the driver. A camera module (e.g., the FCM 14a and/or a surround view camera) may determine or identify a distance between the equipped vehicle and the edge of the road the vehicle is traveling along. The camera determines (or provides the information to the BSM 24 to determine) road or lane information that includes polynomial coefficients, range, and/or confidences for the left and/or right road edges.
Using this data, the BSM 24 localizes the equipped vehicle with respect to the road edges (i.e., determines the position of the vehicle relative to the left and/or right road edges) and suppresses threats when the position of the vehicle relative to the road edges indicates that the detected object is a false positive (i.e., not a moving object in a blind spot of the driver). For example, when the equipped vehicle is close to guardrails and traveling in the leftmost or rightmost traffic lane, the radar sensor 15 may indicate the guardrails are a potential blind spot threat, but the BSM 24, using the road information derived from image data captured by the camera(s), determines that the vehicle's position relative to the road edge indicates that the guardrails are a false positive (i.e., are not a moving object in a blind spot of the driver). A false positive suppression algorithm and/or the BSM 24 may, using this information, suppress a blind spot indicator warning (e.g., a visual, audible, and/or haptic warning) that otherwise may be generated based on the radar data alone. When the image data does not indicate that a detected object is a false positive, the warning is not suppressed and is instead generated by a human-machine interface (HMI) module to warn or alert the driver to the object in the blind spot.
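A minimal sketch of this warning gate follows; the function name, side encoding, and inputs are hypothetical, with the camera-based lane localization reduced to a single boolean for clarity:

```python
# Illustrative sketch only: hypothetical names and inputs; an actual
# BSM 24 / HMI implementation involves considerably more logic.
def blind_spot_alert(object_side: str,
                     road_edge_side: str,
                     vehicle_in_edge_lane: bool) -> bool:
    """Return True when a blind spot warning should be issued.

    object_side / road_edge_side are 'left' or 'right'. A detection on
    the side of the vehicle that borders the road edge, while the
    vehicle travels in the edge-most traffic lane, is treated as a
    likely false positive (e.g., a guardrail) and suppressed."""
    if vehicle_in_edge_lane and object_side == road_edge_side:
        return False  # suppress: likely guardrail/barrier detection
    return True       # warn via visual/audible/haptic HMI
```

For example, with the vehicle traveling in the rightmost lane (road edge on the right), a detection on the right side is suppressed, while a detection on the left side still generates the warning.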
The road or lane information reported by the camera may represent a lane marker or road edge as a third-order polynomial:

y = C₃x³ + C₂x² + C₁x + C₀  (1)
Here, x represents the longitudinal distance of a point on a lane mark of the lane the vehicle is currently traveling along from the reference point of the equipped vehicle, while y represents the lateral distance of the same point on the lane mark from the reference point of the equipped vehicle. Apart from the lateral distance, the range and the confidence are also parameters that may be used in determining the position of the equipped vehicle with respect to the road edges. The range may be determined as a minimum and a maximum value (e.g., based on the confidence values and one or more confidence thresholds) that define the distance interval over which the lane coefficients are valid. The confidence values represent how confident the system is in the location of the lane markers or the edge of the road/lane the vehicle is traveling along. The confidence values may be based on factors such as environmental conditions (e.g., rain, fog, snow, etc.), the visibility of the lane markings, the terrain, ambient light levels, etc.
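A minimal sketch of evaluating Equation (1) subject to the validity range and confidence follows; the parameter names and the confidence threshold are hypothetical:

```python
# Illustrative sketch only: evaluate the road/lane-edge polynomial of
# Equation (1); coefficients, validity range, and confidence come from
# the camera's reported road information. Threshold is hypothetical.
def lateral_offset(c0: float, c1: float, c2: float, c3: float,
                   x: float,
                   range_min: float, range_max: float,
                   confidence: float,
                   min_confidence: float = 0.7) -> float | None:
    """Return y = C3*x^3 + C2*x^2 + C1*x + C0 at longitudinal distance
    x, or None when x lies outside the validity range or the camera's
    confidence in the boundary location is too low."""
    if confidence < min_confidence or not (range_min <= x <= range_max):
        return None
    return ((c3 * x + c2) * x + c1) * x + c0  # Horner evaluation of (1)
```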
The system may precisely select the points on the vehicle to check for the lateral distance to the guardrails (or other boundary or object along the road). One implementation uses a single point of the vehicle (e.g., the center of the equipped vehicle) to determine the lateral distance to the guardrails or other object. However, this approach is not robust, as the selected point might not lie within the lane validity range of the lane information, which may yield an incorrect lateral distance to the guardrail (or other object). This method may also provide incorrect lateral distances during a lane change. Instead of using one single point, the system uses dynamically changing points according to the lane validity range. Multiple different points of interest may be evaluated by the system. For example, the system may evaluate the front bumper of the equipped vehicle, the front axle of the equipped vehicle, and/or the rear axle of the equipped vehicle.
In some scenarios, the minimum lane validity range may not fall below these points. Examples of these scenarios are illustrated in the drawings.
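A minimal sketch of this dynamic point selection follows; the longitudinal offsets for the front bumper, front axle, and rear axle, and the fallback behavior when no point lies within the validity range, are assumptions for illustration:

```python
# Illustrative sketch only: hypothetical longitudinal offsets (vehicle
# frame, meters) and a hypothetical fallback when the validity range
# covers none of the candidate points.
def select_points(range_min: float, range_max: float,
                  front_bumper: float = 3.8,
                  front_axle: float = 2.7,
                  rear_axle: float = 0.0) -> list[float]:
    """Keep only the points of interest that lie within the lane
    validity range; if none do, fall back to the nearest range limit
    so that a lateral distance can still be evaluated."""
    candidates = [front_bumper, front_axle, rear_axle]
    valid = [p for p in candidates if range_min <= p <= range_max]
    if not valid:
        # all candidate points outside the reported validity range:
        # clamp the rear-axle point into the range as a fallback
        valid = [min(max(rear_axle, range_min), range_max)]
    return valid
```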
Once the points of interest are finalized (e.g., the points 50-58), the system determines the lateral distance between each point and the road edge using the lane information (e.g., Equation (1)). When the lateral distances indicate that the equipped vehicle is traveling in the traffic lane closest to the road edge, objects detected sideward of the vehicle toward the road edge may be treated as false positives and the corresponding blind spot warning suppressed.
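Combining the pieces above, a minimal sketch of the resulting false-positive check follows; the lateral-distance threshold is a hypothetical calibration value, and an actual implementation would also consider object velocity, track persistence, etc.:

```python
# Illustrative sketch only: hypothetical threshold; classifies a
# same-side detection as a false positive when the vehicle hugs the
# road edge at every point of interest.
def is_false_positive(edge_coeffs: tuple[float, float, float, float],
                      points_x: list[float],
                      max_edge_distance: float = 2.5) -> bool:
    """Treat a same-side detection as a false positive when every point
    of interest lies within max_edge_distance of the road edge, i.e.,
    there is no room for another vehicle between the equipped vehicle
    and the edge of the road."""
    c0, c1, c2, c3 = edge_coeffs
    for x in points_x:
        y = ((c3 * x + c2) * x + c1) * x + c0  # Equation (1)
        if abs(y) > max_edge_distance:
            return False  # enough room for a real vehicle: keep warning
    return True
```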
Thus, the sensing system or blind spot monitoring system warns the driver of threats present in one or more blind spots of the driver of the vehicle using one or more radar sensors (e.g., corner radar sensors) to help avoid collisions during lane changes. The system may determine when the vehicle is in a traffic lane bordering the edge of the road (e.g., the far left or far right lane of the road). In these scenarios, the system may suppress blind spot alerts for objects to the side of the vehicle that borders the road edge. The system may determine that the vehicle is in a traffic lane bordering the edge of the road using image data captured by a camera (e.g., a front camera module) and associated reported road information.
The camera or sensor may comprise any suitable camera or sensor. Optionally, the camera may comprise a “smart camera” that includes the imaging sensor array and associated circuitry and image processing circuitry and electrical connectors and the like as part of a camera module, such as by utilizing aspects of the vision systems described in U.S. Pat. Nos. 10,099,614 and/or 10,071,687, which are hereby incorporated herein by reference in their entireties.
The system includes an image processor operable to process image data captured by the camera or cameras, such as for detecting objects or other vehicles or pedestrians or the like in the field of view of one or more of the cameras. For example, the image processor may comprise an image processing chip selected from the EYEQ family of image processing chips available from Mobileye Vision Technologies Ltd. of Jerusalem, Israel, and may include object detection software (such as the types described in U.S. Pat. Nos. 7,855,755; 7,720,580 and/or 7,038,577, which are hereby incorporated herein by reference in their entireties), and may analyze image data to detect vehicles and/or other objects. Responsive to such image processing, and when an object or other vehicle is detected, the system may generate an alert to the driver of the vehicle and/or may generate an overlay at the displayed image to highlight or enhance display of the detected object or vehicle, in order to enhance the driver's awareness of the detected object or vehicle or hazardous condition during a driving maneuver of the equipped vehicle.
The vehicle may include any type of sensor or sensors, such as imaging sensors or radar sensors or lidar sensors or ultrasonic sensors or the like. The imaging sensor of the camera may capture image data for image processing and may comprise, for example, a two dimensional array of a plurality of photosensor elements arranged in at least 640 columns and 480 rows (at least a 640×480 imaging array, such as a megapixel imaging array or the like), with a respective lens focusing images onto respective portions of the array. The photosensor array may comprise a plurality of photosensor elements arranged in a photosensor array having rows and columns. The imaging array may comprise a CMOS imaging array having at least 300,000 photosensor elements or pixels, preferably at least 500,000 photosensor elements or pixels and more preferably at least one million photosensor elements or pixels or at least two million photosensor elements or at least three million photosensor elements or pixels or at least five million photosensor elements or pixels arranged in rows and columns. The imaging array may capture color image data, such as via spectral filtering at the array, such as via an RGB (red, green and blue) filter or via a red/red complement filter or such as via an RCC (red, clear, clear) filter or the like. The logic and control circuit of the imaging sensor may function in any known manner, and the image processing and algorithmic processing may comprise any suitable means for processing the images and/or image data.
For example, the vision system and/or processing and/or camera and/or circuitry may utilize aspects described in U.S. Pat. Nos. 9,233,641; 9,146,898; 9,174,574; 9,090,234; 9,077,098; 8,818,042; 8,886,401; 9,077,962; 9,068,390; 9,140,789; 9,092,986; 9,205,776; 8,917,169; 8,694,224; 7,005,974; 5,760,962; 5,877,897; 5,796,094; 5,949,331; 6,222,447; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978; 7,859,565; 5,550,677; 5,670,935; 6,636,258; 7,145,519; 7,161,616; 7,230,640; 7,248,283; 7,295,229; 7,301,466; 7,592,928; 7,881,496; 7,720,580; 7,038,577; 6,882,287; 5,929,786 and/or 5,786,772, and/or U.S. Publication Nos. US-2014-0340510; US-2014-0313339; US-2014-0347486; US-2014-0320658; US-2014-0336876; US-2014-0307095; US-2014-0327774; US-2014-0327772; US-2014-0320636; US-2014-0293057; US-2014-0309884; US-2014-0226012; US-2014-0293042; US-2014-0218535; US-2014-0218535; US-2014-0247354; US-2014-0247355; US-2014-0247352; US-2014-0232869; US-2014-0211009; US-2014-0160276; US-2014-0168437; US-2014-0168415; US-2014-0160291; US-2014-0152825; US-2014-0139676; US-2014-0138140; US-2014-0104426; US-2014-0098229; US-2014-0085472; US-2014-0067206; US-2014-0049646; US-2014-0052340; US-2014-0025240; US-2014-0028852; US-2014-005907; US-2013-0314503; US-2013-0298866; US-2013-0222593; US-2013-0300869; US-2013-0278769; US-2013-0258077; US-2013-0258077; US-2013-0242099; US-2013-0215271; US-2013-0141578 and/or US-2013-0002873, which are all hereby incorporated herein by reference in their entireties. The system may communicate with other communication systems via any suitable means, such as by utilizing aspects of the systems described in U.S. Pat. Nos. 10,071,687; 9,900,490; 9,126,525 and/or 9,036,026, which are hereby incorporated herein by reference in their entireties.
The system may utilize sensors, such as radar sensors or imaging radar sensors or lidar sensors or the like, to detect presence of and/or range to objects and/or other vehicles and/or pedestrians. The sensing system may utilize aspects of the systems described in U.S. Pat. Nos. 10,866,306; 9,954,955; 9,869,762; 9,753,121; 9,689,967; 9,599,702; 9,575,160; 9,146,898; 9,036,026; 8,027,029; 8,013,780; 7,408,627; 7,405,812; 7,379,163; 7,379,100; 7,375,803; 7,352,454; 7,340,077; 7,321,111; 7,310,431; 7,283,213; 7,212,663; 7,203,356; 7,176,438; 7,157,685; 7,053,357; 6,919,549; 6,906,793; 6,876,775; 6,710,770; 6,690,354; 6,678,039; 6,674,895 and/or 6,587,186, and/or U.S. Publication Nos. US-2019-0339382; US-2018-0231635; US-2018-0045812; US-2018-0015875; US-2017-0356994; US-2017-0315231; US-2017-0276788; US-2017-0254873; US-2017-0222311 and/or US-2010-0245066, which are hereby incorporated herein by reference in their entireties.
The radar sensors of the sensing system each comprise a plurality of transmitters that transmit radio signals via a plurality of antennas and a plurality of receivers that receive radio signals via the plurality of antennas, with the received radio signals being transmitted radio signals that are reflected from an object present in the field of sensing of the respective radar sensor. The system includes an ECU or control that includes a data processor for processing sensor data captured by the radar sensors. The ECU or sensing system may be part of a driving assist system of the vehicle, with the driving assist system controlling at least one function or feature of the vehicle (such as to provide autonomous driving control of the vehicle) responsive to processing of the data captured by the radar sensors.
The radar sensor or radar sensors are disposed at the vehicle so as to sense exterior of the vehicle. For example, the radar sensors may include a front sensing radar sensor mounted at a grille or front bumper of the vehicle, such as for use with an automatic emergency braking system of the vehicle, an adaptive cruise control system of the vehicle, a collision avoidance system of the vehicle, etc., or the radar sensor may comprise a corner radar sensor disposed at a front corner or rear corner of the vehicle, such as for use with the blind spot monitoring and alert system of the vehicle and/or a surround vision system of the vehicle, or the radar sensor may comprise a blind spot monitoring radar sensor disposed at a rear fender of the vehicle for monitoring sideward/rearward of the vehicle for the blind spot monitoring and alert system of the vehicle. Optionally, a radar sensor or radar sensors may be disposed within the vehicle so as to sense interior of the vehicle, such as for use with a cabin monitoring system of the vehicle or a driver monitoring system of the vehicle or an occupant detection or monitoring system of the vehicle. The radar sensing system may comprise multiple input multiple output (MIMO) radar sensors having multiple transmitting antennas and multiple receiving antennas.
The camera module and circuit chip or board and imaging sensor may be implemented and operated in connection with various vehicular vision-based systems, and/or may be operable utilizing the principles of such other vehicular systems, such as a vehicle headlamp control system, such as the type disclosed in U.S. Pat. Nos. 5,796,094; 6,097,023; 6,320,176; 6,559,435; 6,831,261; 7,004,606; 7,339,149 and/or 7,526,103, which are all hereby incorporated herein by reference in their entireties, a rain sensor, such as the types disclosed in commonly assigned U.S. Pat. Nos. 6,353,392; 6,313,454; 6,320,176 and/or 7,480,149, which are hereby incorporated herein by reference in their entireties, a vehicle vision system, such as a forward, sideward or rearward directed vehicle vision system utilizing principles disclosed in U.S. Pat. Nos. 5,550,677; 5,670,935; 5,760,962; 5,877,897; 5,949,331; 6,222,447; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978 and/or 7,859,565, which are all hereby incorporated herein by reference in their entireties, a trailer hitching aid or tow check system, such as the type disclosed in U.S. Pat. No. 7,005,974, which is hereby incorporated herein by reference in its entirety, a reverse or sideward imaging system, such as for a lane change assistance system or lane departure warning system or for a blind spot or object detection system, such as imaging or detection systems of the types disclosed in U.S. Pat. Nos. 7,881,496; 7,720,580; 7,038,577; 5,929,786 and/or 5,786,772, which are hereby incorporated herein by reference in their entireties, a video device for internal cabin surveillance and/or video telephone function, such as disclosed in U.S. Pat. Nos. 5,760,962; 5,877,897; 6,690,268; 7,370,983; 7,937,667 and/or 9,800,983, and/or U.S. Publication No. US-2006-0050018, which are hereby incorporated herein by reference in their entireties, a traffic sign recognition system, a system for determining a distance to a leading or trailing vehicle or object, such as a system utilizing the principles disclosed in U.S. Pat. Nos. 6,396,397 and/or 7,123,168, which are hereby incorporated herein by reference in their entireties, and/or the like.
The ECU may be operable to process data for at least one driving assist system of the vehicle. For example, the ECU may be operable to process data (such as image data captured by a forward viewing camera of the vehicle that views forward of the vehicle through the windshield of the vehicle) for at least one selected from the group consisting of (i) a headlamp control system of the vehicle, (ii) a pedestrian detection system of the vehicle, (iii) a traffic sign recognition system of the vehicle, (iv) a collision avoidance system of the vehicle, (v) an emergency braking system of the vehicle, (vi) a lane departure warning system of the vehicle, (vii) a lane keep assist system of the vehicle, (viii) a blind spot monitoring system of the vehicle and (ix) an adaptive cruise control system of the vehicle. Optionally, the ECU may also or otherwise process radar data captured by a radar sensor of the vehicle or other data captured by other sensors of the vehicle (such as other cameras or radar sensors or such as one or more lidar sensors of the vehicle). Optionally, the ECU may process captured data for an autonomous control system of the vehicle that controls steering and/or braking and/or accelerating of the vehicle as the vehicle travels along the road.
The system may also communicate with other systems, such as via a vehicle-to-vehicle communication system or a vehicle-to-infrastructure communication system or the like. Such car2car or vehicle to vehicle (V2V) and vehicle-to-infrastructure (car2X or V2X or V2I or a 4G or 5G broadband cellular network) technology provides for communication between vehicles and/or infrastructure based on information provided by one or more vehicles and/or information provided by a remote server or the like. Such vehicle communication systems may utilize aspects of the systems described in U.S. Pat. Nos. 10,819,943; 9,555,736; 6,690,268; 6,693,517 and/or 7,580,795, and/or U.S. Publication Nos. US-2014-0375476; US-2014-0218529; US-2013-0222592; US-2012-0218412; US-2012-0062743; US-2015-0251599; US-2015-0158499; US-2015-0124096; US-2015-0352953; US-2016-0036917 and/or US-2016-0210853, which are hereby incorporated herein by reference in their entireties.
Changes and modifications in the specifically described embodiments can be carried out without departing from the principles of the invention, which is intended to be limited only by the scope of the appended claims, as interpreted according to the principles of patent law including the doctrine of equivalents.
The present application claims the filing benefits of U.S. provisional application Ser. No. 63/505,709, filed Jun. 2, 2023, which is hereby incorporated herein by reference in its entirety.