The present invention relates generally to a vehicular alert system for a vehicle and, more particularly, to a vehicular alert system that utilizes one or more sensors at a vehicle.
It is known to use sensors to determine if it is safe to open a vehicle door. Examples of such known systems are described in U.S. Pat. Nos. 11,124,113; 9,688,199 and/or 9,068,390, which are hereby incorporated herein by reference in their entireties.
A vehicular alert system includes at least one sensor disposed at a vehicle equipped with the vehicular alert system and sensing exterior of the vehicle. The at least one sensor captures sensor data. The system includes an electronic control unit (ECU) with electronic circuitry and associated software. The electronic circuitry of the ECU includes a processor for processing sensor data captured by the at least one sensor. The vehicular alert system, via processing at the ECU of sensor data captured by the at least one sensor, detects an object sensed by the at least one sensor. The vehicular alert system, as the vehicle is moving, determines a likelihood that the vehicle is parking. The vehicular alert system, responsive to determining that the likelihood that the vehicle is parking exceeds a threshold level, and while the vehicle is moving, tracks a position of the detected object relative to the equipped vehicle until the detected object leaves a field of sensing of the at least one sensor. The vehicular alert system, responsive to the detected object leaving the field of sensing of the at least one sensor as the vehicle moves, predicts a position of the detected object relative to the vehicle after the vehicle has further moved and when the vehicle stops. The vehicular alert system, responsive to determining the equipped vehicle has stopped moving, determines that the detected object is a hazard based on the predicted position of the detected object. The vehicular alert system, responsive to determining that the detected object is a hazard, alerts an occupant of the equipped vehicle of the detected object.
These and other objects, advantages, purposes and features of the present invention will become apparent upon review of the following specification in conjunction with the drawings.
A vehicular alert system and/or object detection system operates to capture images or sensor data exterior of the vehicle and may process the captured image data or sensor data to detect objects or hazardous situations at or near the vehicle, such as to assist an occupant in determining whether a detected object is a threat. The alert system includes a processor or image processing system that is operable to receive image data or sensor data from one or more cameras or sensors. Optionally, the vision system may provide a display, such as a rearview display or a top down or bird's eye or surround view display or the like.
Referring now to the drawings and the illustrative embodiments depicted therein, a vehicle 10 includes an imaging system or vision system or sensing system or alert system 12 that includes at least one exterior viewing imaging sensor or camera or radar sensor, such as a rearward viewing imaging sensor or camera 14a (and the system may optionally include multiple exterior viewing imaging sensors or cameras, such as a forward viewing camera 14b at the front (or at the windshield) of the vehicle, and a sideward/rearward viewing camera 14c, 14d or other surround cameras at respective sides of the vehicle), which captures images exterior of the vehicle, with the camera having a lens for focusing images at or onto an imaging array or imaging plane or imager of the camera.
Implementations herein include a system that incorporates a smart door open warning (SDOW) feature that alerts a driver/passenger of an equipped vehicle about hazardous conditions outside the vehicle before or when the user exits the vehicle. The SDOW feature, for example, prevents damage (e.g., scratches and/or dents) to the vehicle and door and/or injury to the occupant by detecting objects in close proximity to the vehicle (i.e., near and within a swing path of the doors and/or tailgate/rear hatch).
Referring now to
Referring now to
Optionally, the system may continue to track the location of the objects after initial detection of the object (e.g., after detecting the object as the vehicle is slowing to park). In some cases, the system predicts the location of a tracked object when the object enters the blind zone of the sensors (e.g., immediately next to the vehicle) and thus disappears from a field of view of the sensors. The system may predict the path of the object using, for example, vehicle kinematics and trajectory. For example, a corner radar sensor may approach an obstacle as the vehicle slows to park. As the vehicle continues moving forward, the obstacle moves out of the field of sensing of the radar sensor. Despite the obstacle no longer being in the field of sensing of the radar sensor (or any other sensor of the vehicle), the system may predict or estimate (e.g., via vehicle kinematics) the position of the obstacle and determine, based on the predicted position, whether the obstacle is a hazard prior to an occupant opening a door of the vehicle. Thus, even if the vehicle lacks a sensor with a field of sensing proximate each door of the vehicle, the system may predict/estimate the locations of obstacles to provide the door open warning(s).
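One way such dead reckoning could be realized is sketched below in Python. This is an illustrative assumption, not the patent's implementation: the planar kinematic model, the frame conventions (x forward, y left), and all function names are hypothetical. The vehicle pose is advanced from speed and yaw-rate odometry, and a stationary obstacle's last known world position is re-expressed in the vehicle frame after the vehicle has moved on.

```python
import math
from dataclasses import dataclass

@dataclass
class Pose:
    x: float    # meters, world frame
    y: float    # meters, world frame
    yaw: float  # radians, heading of the vehicle

def update_pose(pose: Pose, speed: float, yaw_rate: float, dt: float) -> Pose:
    """Advance the vehicle pose one time step with a simple kinematic model."""
    yaw = pose.yaw + yaw_rate * dt
    x = pose.x + speed * dt * math.cos(yaw)
    y = pose.y + speed * dt * math.sin(yaw)
    return Pose(x, y, yaw)

def object_in_vehicle_frame(pose: Pose, obj_x: float, obj_y: float):
    """Transform a stationary object's world position into the vehicle frame,
    so its position relative to each door is known even after it has left
    every sensor's field of sensing."""
    dx, dy = obj_x - pose.x, obj_y - pose.y
    cos_y, sin_y = math.cos(-pose.yaw), math.sin(-pose.yaw)
    return (dx * cos_y - dy * sin_y, dx * sin_y + dy * cos_y)
```

For example, an obstacle last sensed at world position (3, 1) while the vehicle was at the origin remains locatable after the vehicle drives 2 m straight ahead: in the new vehicle frame it sits 1 m ahead and 1 m to the left, i.e., alongside a front door.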
The system may provide a variety of different alerts to occupants of the vehicle based on detected objects and conditions. For example, the system provides an audible alert (e.g., an automated voice alert or audible tone or alarm) announcing presence of the detected objects and/or conditions. The audible alert may indicate a direction or position of the detected object/condition (e.g., which occupant and/or door is at risk). The alert may include image data (e.g., captured by one or more cameras of the vehicle) displayed (as a still image or as video images) on a display within the vehicle. The system may display text describing or indicating the detected objects/conditions. The alert may include one or more haptic alerts using, for example, seats, seat mats, door handles, and/or the steering wheel.
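The mapping from a tracked object's position to a door-specific, multi-modality alert could be sketched as follows. This is a minimal illustration, not the patent's logic: the zone boundaries, the 1 m haptic threshold, and the function names are all assumptions.

```python
def classify_hazard_zone(x: float, y: float) -> str:
    """Map an object position in the vehicle frame (x forward, y left, in
    meters) to the nearest door zone. Zone boundary at x = 1.0 m is an
    illustrative assumption."""
    side = "left" if y > 0 else "right"
    row = "front" if x > 1.0 else "rear"
    return f"{row}-{side}"

def build_alert(zone: str, distance: float) -> dict:
    """Choose alert modalities based on proximity; thresholds are assumed."""
    alert = {"zone": zone, "audible": True}  # always announce the direction
    if distance < 1.0:
        alert["haptic"] = True               # e.g., seat or door-handle buzz
    alert["display_text"] = f"Obstacle near {zone} door"
    return alert
```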
In some examples, the system determines if an imminent collision with a detected object is likely (i.e., greater than a threshold likelihood). When an imminent collision is detected (i.e., with a door of the vehicle or with an occupant of the vehicle after exiting the vehicle), the system may apply a threshold for maximum door opening angle (e.g., based on the detected object's size, trajectory, speed, etc.) in addition to an alert. For example, the system may limit how far the door can be opened in order to avoid or mitigate any potential collision. Optionally, the system may prevent the door from opening entirely. The system may restrict the door from opening until an occupant has acknowledged one or more warnings of the system. The system may include door opening warning aspects described in U.S. Pat. Nos. 11,124,113; 9,688,199 and/or 9,068,390, and/or U.S. patent application Ser. No. 17/815,675, filed Jul. 28, 2022, which are hereby incorporated herein by reference in their entireties.
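One way a maximum door opening angle could be derived from a lateral obstacle distance is sketched below. This is a simplified geometric illustration, not the patent's method: the door length, full-open angle, and the approximation of the door edge's lateral sweep as door_length * sin(angle) are all assumptions.

```python
import math

def max_door_angle(obstacle_dist: float, door_length: float = 1.1,
                   full_open_deg: float = 70.0) -> float:
    """Largest door opening angle (degrees) whose swing stays short of a
    lateral obstacle. The door edge sweeps laterally by roughly
    door_length * sin(angle), so the door may open until that sweep
    reaches the obstacle distance. Default dimensions are illustrative."""
    if obstacle_dist <= 0.0:
        return 0.0  # obstacle at the door: restrict opening entirely
    if obstacle_dist >= door_length * math.sin(math.radians(full_open_deg)):
        return full_open_deg  # obstacle beyond the full swing path
    return math.degrees(math.asin(obstacle_dist / door_length))
```

With these assumed dimensions, an obstacle 0.55 m from the door limits the opening to about 30 degrees, while an obstacle beyond roughly 1.03 m permits the full 70 degree swing.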
The camera or sensor may comprise any suitable camera or sensor. Optionally, the camera may comprise a “smart camera” that includes the imaging sensor array and associated circuitry and image processing circuitry and electrical connectors and the like as part of a camera module, such as by utilizing aspects of the vision systems described in U.S. Pat. Nos. 10,099,614 and/or 10,071,687, which are hereby incorporated herein by reference in their entireties.
The system includes an image processor operable to process image data captured by the camera or cameras, such as for detecting objects or other vehicles or pedestrians or the like in the field of view of one or more of the cameras. For example, the image processor may comprise an image processing chip selected from the EYEQ family of image processing chips available from Mobileye Vision Technologies Ltd. of Jerusalem, Israel, and may include object detection software (such as the types described in U.S. Pat. Nos. 7,855,755; 7,720,580 and/or 7,038,577, which are hereby incorporated herein by reference in their entireties), and may analyze image data to detect vehicles and/or other objects. Responsive to such image processing, and when an object or other vehicle is detected, the system may generate an alert to the driver of the vehicle and/or may generate an overlay at the displayed image to highlight or enhance display of the detected object or vehicle, in order to enhance the driver's awareness of the detected object or vehicle or hazardous condition during a driving maneuver of the equipped vehicle.
The vehicle may include any type of sensor or sensors, such as imaging sensors or radar sensors or lidar sensors or ultrasonic sensors or the like. The imaging sensor or camera may capture image data for image processing and may comprise any suitable camera or sensing device, such as, for example, a two dimensional array of a plurality of photosensor elements arranged in at least 640 columns and 480 rows (at least a 640×480 imaging array, such as a megapixel imaging array or the like), with a respective lens focusing images onto respective portions of the array. The photosensor array may comprise a plurality of photosensor elements arranged in a photosensor array having rows and columns. The imaging array may comprise a CMOS imaging array having at least 300,000 photosensor elements or pixels, preferably at least 500,000 photosensor elements or pixels and more preferably at least 1 million photosensor elements or pixels. The imaging array may capture color image data, such as via spectral filtering at the array, such as via an RGB (red, green and blue) filter or via a red/red complement filter or such as via an RCC (red, clear, clear) filter or the like. The logic and control circuit of the imaging sensor may function in any known manner, and the image processing and algorithmic processing may comprise any suitable means for processing the images and/or image data.
For example, the vision system and/or processing and/or camera and/or circuitry may utilize aspects described in U.S. Pat. Nos. 9,233,641; 9,146,898; 9,174,574; 9,090,234; 9,077,098; 8,818,042; 8,886,401; 9,077,962; 9,068,390; 9,140,789; 9,092,986; 9,205,776; 8,917,169; 8,694,224; 7,005,974; 5,760,962; 5,877,897; 5,796,094; 5,949,331; 6,222,447; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978; 7,859,565; 5,550,677; 5,670,935; 6,636,258; 7,145,519; 7,161,616; 7,230,640; 7,248,283; 7,295,229; 7,301,466; 7,592,928; 7,881,496; 7,720,580; 7,038,577; 6,882,287; 5,929,786 and/or 5,786,772, and/or U.S. Publication Nos. US-2014-0340510; US-2014-0313339; US-2014-0347486; US-2014-0320658; US-2014-0336876; US-2014-0307095; US-2014-0327774; US-2014-0327772; US-2014-0320636; US-2014-0293057; US-2014-0309884; US-2014-0226012; US-2014-0293042; US-2014-0218535; US-2014-0247354; US-2014-0247355; US-2014-0247352; US-2014-0232869; US-2014-0211009; US-2014-0160276; US-2014-0168437; US-2014-0168415; US-2014-0160291; US-2014-0152825; US-2014-0139676; US-2014-0138140; US-2014-0104426; US-2014-0098229; US-2014-0085472; US-2014-0067206; US-2014-0049646; US-2014-0052340; US-2014-0025240; US-2014-0028852; US-2014-005907; US-2013-0314503; US-2013-0298866; US-2013-0222593; US-2013-0300869; US-2013-0278769; US-2013-0258077; US-2013-0242099; US-2013-0215271; US-2013-0141578 and/or US-2013-0002873, which are all hereby incorporated herein by reference in their entireties. The system may communicate with other communication systems via any suitable means, such as by utilizing aspects of the systems described in U.S. Pat. Nos. 10,071,687; 9,900,490; 9,126,525 and/or 9,036,026, which are hereby incorporated herein by reference in their entireties.
Changes and modifications in the specifically described embodiments can be carried out without departing from the principles of the invention, which is intended to be limited only by the scope of the appended claims, as interpreted according to the principles of patent law including the doctrine of equivalents.
The present application claims the filing benefits of U.S. provisional application Ser. No. 63/262,045, filed Oct. 4, 2021, which is hereby incorporated herein by reference in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
7038577 | Pawlicki et al. | May 2006 | B2 |
7720580 | Higgins-Luthman | May 2010 | B2 |
7855755 | Weller et al. | Dec 2010 | B2 |
8874317 | Marczok et al. | Oct 2014 | B2 |
9068390 | Ihlenburg et al. | Jun 2015 | B2 |
9637965 | Kothari | May 2017 | B1 |
9688199 | Koravadi | Jun 2017 | B2 |
9947227 | Dilger | Apr 2018 | B1 |
10071687 | Ihlenburg et al. | Sep 2018 | B2 |
10099614 | Diessner | Oct 2018 | B2 |
11124113 | Singh | Sep 2021 | B2 |
11597383 | Sonalker | Mar 2023 | B1 |
20090243822 | Hinninger et al. | Oct 2009 | A1 |
20130116859 | Ihlenburg et al. | May 2013 | A1 |
20140098230 | Baur | Apr 2014 | A1 |
20150298611 | Komoguchi et al. | Oct 2015 | A1 |
20150344028 | Gieseke et al. | Dec 2015 | A1 |
20160023598 | Kohler et al. | Jan 2016 | A1 |
20170008455 | Goudy et al. | Jan 2017 | A1 |
20170015312 | Latotzki | Jan 2017 | A1 |
20170017848 | Gupta et al. | Jan 2017 | A1 |
20170032677 | Seo | Feb 2017 | A1 |
20170050672 | Gieseke et al. | Feb 2017 | A1 |
20170197549 | Vladimerou et al. | Jul 2017 | A1 |
20170218678 | Kothari | Aug 2017 | A1 |
20170253237 | Diessner | Sep 2017 | A1 |
20170274821 | Goudy et al. | Sep 2017 | A1 |
20170317748 | Krapf | Nov 2017 | A1 |
20170329346 | Latotzki | Nov 2017 | A1 |
20180297520 | Morimura | Oct 2018 | A1 |
20190102602 | Uchida et al. | Apr 2019 | A1 |
20190102633 | Uchida et al. | Apr 2019 | A1 |
20190135278 | Hillman | May 2019 | A1 |
20190232863 | Rowell | Aug 2019 | A1 |
20200062248 | Hasegawa et al. | Feb 2020 | A1 |
20200192383 | Nath | Jun 2020 | A1 |
20200269837 | Nath et al. | Aug 2020 | A1 |
20210046927 | Miller et al. | Feb 2021 | A1 |
20210061241 | Liu et al. | Mar 2021 | A1 |
20210109543 | Hiromitsu et al. | Apr 2021 | A1 |
20210146833 | Kang | May 2021 | A1 |
20210323548 | Fukatsu et al. | Oct 2021 | A1 |
20220118970 | Takaki | Apr 2022 | A1 |
20220306090 | Noguchi | Sep 2022 | A1 |
20230032998 | Kushwaha et al. | Feb 2023 | A1 |
Number | Date | Country | |
---|---|---|---|
20230106562 A1 | Apr 2023 | US |
Number | Date | Country | |
---|---|---|---|
63262045 | Oct 2021 | US |