The present invention relates generally to a vehicle vision system for a vehicle and, more particularly, to a vehicle vision system that utilizes one or more cameras at a vehicle.
Use of imaging sensors in vehicle imaging systems is common and known. Examples of such known systems are described in U.S. Pat. Nos. 5,949,331; 5,670,935 and/or 5,550,677, which are hereby incorporated herein by reference in their entireties.
The present invention provides a driver assistance system or vision system or imaging system for a vehicle that utilizes one or more cameras (preferably one or more CMOS cameras) to capture image data representative of images exterior of the vehicle, and wherein an image processor is operable to process image data captured by the camera to detect the presence of a blinking light source in the field of view of the camera. Responsive to detection of a blinking light source, the vision system determines if the detected blinking light source is a turn signal indicator of another vehicle on the road on which the equipped vehicle is traveling based on a characteristic of the detected blinking light source being within a threshold level corresponding to a characteristic of a turn signal indicator of a vehicle.
The vision system may determine if the detected blinking light source is a turn signal indicator of another vehicle on the road on which the equipped vehicle is traveling based on at least one of (i) a color of the detected blinking light source being within a threshold color range, (ii) the rate of flashing of the detected blinking light source being within a threshold rate, and (iii) the location of the detected blinking light source being within a threshold range of locations for another vehicle. The threshold level may be selected or adjusted responsive to a current geographical location of the equipped vehicle (which may be determined via a communication to the vehicle or a GPS system of the vehicle or the like).
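For illustration, a minimal sketch of how such threshold checks might be combined is given below; the hue window, blink-rate windows and function names are assumptions for the example, not values taken from the invention.

```python
# Minimal sketch (assumed values, not the patented implementation) of
# combining the three threshold checks: color, blink rate and location.

AMBER_HUE_RANGE = (20.0, 50.0)  # assumed HSV hue window for amber, in degrees

# Assumed per-region blink-rate windows in Hz (illustrative only).
BLINK_RATE_RANGES = {"EU": (1.0, 2.0), "US": (1.0, 2.0)}

def is_turn_signal(hue_deg, blink_rate_hz, location_plausible, region="EU"):
    """True if a detected blinking light source passes all three checks."""
    lo, hi = BLINK_RATE_RANGES.get(region, (1.0, 2.0))
    color_ok = AMBER_HUE_RANGE[0] <= hue_deg <= AMBER_HUE_RANGE[1]
    rate_ok = lo <= blink_rate_hz <= hi
    return color_ok and rate_ok and location_plausible

# Example: an amber source blinking at 1.4 Hz at a plausible vehicle location.
print(is_turn_signal(35.0, 1.4, True))  # -> True
```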
These and other objects, advantages, purposes and features of the present invention will become apparent upon review of the following specification in conjunction with the drawings.
A vehicle vision system and/or driving assist system and/or object detection system and/or alert system and/or control system for a driver assistance and warning system, such as for an autonomous or semi-autonomous vehicle, operates to capture images exterior of the vehicle and may process the captured image data to display images and to detect objects at or near the vehicle and in the predicted path of the vehicle, such as to assist in maneuvering the vehicle. The vision system includes an image processor or image processing system that is operable to receive image data from one or more cameras and provide an output to a display device for displaying images representative of the captured image data. Optionally, the vision system may provide a display, such as a rearview display or a top down or bird's eye or surround view display or the like.
Referring now to the drawings and the illustrative embodiments depicted therein, a vehicle 10 includes an imaging system or vision system 12 that includes at least one exterior facing imaging sensor or camera, such as a rearward facing imaging sensor or camera 14a (and the system may optionally include multiple exterior facing imaging sensors or cameras, such as a forward viewing camera 14b at the front and a forward viewing camera 14h at the windshield of the vehicle, sideward viewing cameras 14c, 14d at respective sides of the vehicle, rearward viewing cameras 14e, 14f integrated in the side mirrors or at a wing, and another rearward facing camera 14g at a rear window of the vehicle), which capture image data representative of the scene occurring exterior of the vehicle, with each of the cameras having a lens for focusing images at or onto an imaging array or imaging plane or imager of the camera.
Driver assistance systems that aid the driver in keeping the vehicle's lane are known. Nowadays these often also possess a lane change function, which is typically initiated by the driver setting his or her turn signal indicator (hereinafter referred to as TSI) toward the intended target lane. When the TSI is set, the system is permitted to cross the borderline of the adjacent target lane (typically the lane marking) and then executes the lane change, which is completed when the vehicle is within the target lane's borderlines.
Typically, the driver has to ensure sufficient clearance in the target lane before initiating the lane change. Some systems provide aids and warnings to the driver and/or to automated systems by sensing the adjacent lane's clearance. This is typically done by using RADAR sensors, LIDAR sensors or ultrasound sensors, or optionally cameras. Unlike RADAR sensors, LIDAR sensors and ultrasound sensors, mono-cameras cannot directly measure the distance to objects within the visual scene, which makes them a second choice. Algorithms exist that provide distance estimations from a mono-camera, such as structure from motion, back projection and plausible size comparison. Stereo cameras are often too expensive and too bulky. Often it is desirable to incorporate the adjacent lane sensors within the vehicle side mirror housing, which further limits the space the sensor can occupy. The possible stereo base of a stereo camera and the typical resolution available in automotive applications, such as about two megapixels, are typically too small for delivering a sufficient distance measurement for approaching traffic.
Some more advanced systems are capable of object and scene detection via their sensor equipment, data processors, and fusion and data processing algorithms, and can sense whether a moving object, such as a vehicle approaching from the blind spot or from behind the equipped vehicle in the lane 31 of the equipped vehicle or in the lane 33 next to the adjacent lane 32, is already moving into the adjacent lane that the driver of the equipped vehicle was intending to change to. This is shown in the drawings.
RADAR sensing systems, LIDAR sensing systems and ultrasonic sensing systems cannot detect whether an approaching vehicle has its TSI set. Additionally, the vehicle 22 is in the blind spot of vehicle 10. Known blind spot vehicle detection systems typically do not sense and warn about vehicles in adjacent lanes beyond the next lane.
The system of the present invention is capable of sensing the blinking of a TSI of one or more approaching vehicles (whether ahead of or behind the ego vehicle). The regulations of most countries specify the color space or spectral band of actuated TSIs. Additionally, the regulations of most countries specify the blinking on and off times for actuated TSIs. For example, Germany's StVO § 54 requires a blinking frequency of 1.5+/−0.5 Hz, with an on/off ratio of 55%+/−25%. Many countries also specify the positions at which the TSIs have to be mounted at the vehicle.
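As a rough illustration of how a measured blink pattern could be checked against such regulation figures, consider the following sketch; the per-frame sampling approach and the tolerance handling are assumptions.

```python
# Sketch: estimating blink frequency and on-ratio from a per-frame on/off
# sequence and checking them against the quoted figures (1.5 +/- 0.5 Hz,
# on/off ratio 55% +/- 25%). The sampling approach is an assumption.

def blink_stats(on_samples, fps):
    """Estimate blink frequency (Hz) and on-ratio from per-frame booleans."""
    rising = [i for i, (a, b) in enumerate(zip(on_samples, on_samples[1:]))
              if not a and b]
    if len(rising) < 2:
        return 0.0, 0.0
    period_s = (rising[-1] - rising[0]) / (len(rising) - 1) / fps
    return 1.0 / period_s, sum(on_samples) / len(on_samples)

def is_regulation_blink(freq_hz, on_ratio):
    return 1.0 <= freq_hz <= 2.0 and 0.30 <= on_ratio <= 0.80

# Example: a 1.5 Hz blinker, about 55% on-time, sampled at 30 fps.
pattern = ([True] * 11 + [False] * 9) * 3  # three 20-frame cycles
f, r = blink_stats(pattern, fps=30)
print(round(f, 2), round(r, 2), is_regulation_blink(f, r))  # 1.5 0.55 True
```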
The system of the present invention may receive information indicating the country in which it is currently operating. Such information may be provided by a vision processing system or a navigation system, or may be set by a driver entry.
The system of the present invention may detect blinking at an approaching object and may determine the blinking to be a TSI by taking into account its peak light intensity, its blinking time pattern, its color tone and its position on the object.
The object may have been previously detected and classified by known scene classification methods ('2D Scene classification' in the flow charts of the drawings).
The equipped vehicle's position (ego position) may be known or detected by known positioning systems and algorithms, such as GPS or visual systems. The foreign road participant object's position may be known by methods such as receiving the positions via any kind of V2V communication, or by relative object distance and angle detection, such as by using RADAR sensing systems, LIDAR sensing systems, stereo camera vision systems or structure from motion processing on mono cameras for scene detection and object mapping (see step 'Shape rotation by relative angle' in the flow charts of the drawings).
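A simple sketch of the relative distance and angle case follows: given the ego position and heading, a measured range and bearing can be mapped to a world position. The flat 2D model and all names here are illustrative assumptions.

```python
# Illustrative sketch: mapping a relative range/bearing measurement (e.g.,
# from RADAR or structure from motion) into world coordinates using the
# known ego position and heading. A flat 2D world is assumed.
import math

def target_world_position(ego_x, ego_y, ego_heading_rad, range_m, bearing_rad):
    """Bearing is measured from the ego heading, counter-clockwise positive."""
    angle = ego_heading_rad + bearing_rad
    return (ego_x + range_m * math.cos(angle),
            ego_y + range_m * math.sin(angle))

# Example: a target 40 m away, 10 degrees left of an east-facing ego vehicle.
print(target_world_position(0.0, 0.0, 0.0, 40.0, math.radians(10.0)))
```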
The object detection and classification may output the road participant's borderlines or shapes (possibly used as the ROI later, such as shown in the drawings).
The patch that was classified as a road participant may generally be used as the region of interest (ROI) for detecting blinkers, as shown in the charts of the drawings.
Seen from the rear or front, the detection of the TSI is comparatively simple. As visualized in the drawings, the ROI may be divided in the middle to discriminate the left TSI from the right TSI.
For vehicles seen at an angle, the line dividing right from left is not in the middle of the ROI, but rather in the middle of the front portion. The shape of the ROI taken from the classification alone may not sufficiently tell where the front portion ends and the side portion begins (see the drawings).
To solve this, it may be assumed that the actual ego lane and the adjacent, past and upcoming lanes (i.e., all lanes) are well known, detected by known lane detection methods (possibly together with the rest of the scene, i.e., scene detection in accordance with '2D Scene classification' in the flow charts of the drawings).
From this, the ratio between the front portion and the side portion of a road participant can be estimated, and from that it can be determined whether a blinker is mostly in the left third or the right third of the road participant's front, and whether it is plausible that side blinkers are also visible at the road participant's side portion.
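A minimal geometric sketch of this front/side split is given below, assuming the road participant is approximated as a box of typical width and length seen at a known relative angle; all dimensions and names are illustrative.

```python
# Sketch (assumed box model): split a road participant's ROI width into a
# front portion and a side portion from the relative viewing angle, which
# also locates the left/right dividing line discussed above.
import math

def front_side_split(roi_width_px, rel_angle_rad, width_m=1.8, length_m=4.5):
    """Return (front_px, side_px): projected pixel widths of front and side."""
    front_proj = width_m * abs(math.cos(rel_angle_rad))   # projected front face
    side_proj = length_m * abs(math.sin(rel_angle_rad))   # projected side face
    front_px = roi_width_px * front_proj / (front_proj + side_proj)
    return front_px, roi_width_px - front_px

# Example: a 200 px wide ROI of a vehicle seen 25 degrees off its front axis.
front_px, side_px = front_side_split(200, math.radians(25.0))
print(round(front_px), round(side_px))  # -> 92 108
```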
Optionally, instead of determining the relative angles of foreign vehicles relative to the ego vehicle, the system of the invention may have an object classification algorithm which outputs the vehicles' surface portions, such as 'left side', 'right side', 'front', 'rear' (and optionally also the 'top' when visible).
As an alternative option for improving the distinction between the front portion and the side portion of a road participant seen at an angle, an object orientation classifier may be used (see step 'Shape rotation by image classification' in the flow charts of the drawings).
Optionally, the classifier may output a confidence level indicating which viewing angle most likely applies, such as indicated in the drawings.
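Assuming the classifier reports one confidence per discrete viewing-angle bin, selecting the most likely angle is a simple argmax, as in this illustrative snippet.

```python
# Tiny sketch: pick the viewing-angle bin with the highest classifier
# confidence. Bins and scores are made-up example values.
angle_confidence = {0: 0.05, 30: 0.70, 60: 0.20, 90: 0.05}  # degrees -> score
best_angle = max(angle_confidence, key=angle_confidence.get)
print(best_angle)  # -> 30
```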
Optionally, the classifier may directly output the typical areas where the TSIs are to be found on the given type of vehicle.
Optionally, the peak lights may be enhanced by an image histogram manipulation so that the TSIs can be better separated from the image clutter (optionally just within the object's bounding box or ROI), as shown in the filtered image of the drawings.
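One possible histogram manipulation (an assumption, not the specific filter of the invention) is a power-law stretch that suppresses mid-tones so that bright TSI peaks stand out from the clutter:

```python
# Sketch of a power-law (gamma) stretch applied within the ROI: mid-tone
# clutter is suppressed while bright peak lights survive.
import numpy as np

def enhance_peaks(roi_gray, gamma=4.0):
    """roi_gray: uint8 image patch; returns uint8 with bright peaks kept."""
    norm = roi_gray.astype(np.float32) / 255.0
    return (255.0 * norm ** gamma).astype(np.uint8)

# Example: a dim background pixel (80) is crushed; a bright blinker (240) survives.
roi = np.array([[80, 240]], dtype=np.uint8)
print(enhance_peaks(roi))  # -> [[  2 200]]
```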
Optionally, the system of the present invention may employ an anti-flicker filtering algorithm for improving the true positive blinker detection ratio. Optionally, the anti-flicker and TSI determination algorithms, systems and devices may be combined into one common system. Optionally, the TSI determination system may control or influence the camera color, camera brightness and/or HDR control parameters for optimizing the TSI determination SNR.
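As a sketch of one common anti-flicker approach (an assumption here), a short moving average over the per-frame ROI intensity removes fast pulse-width-modulation flicker of LED lamps while preserving the much slower turn-signal blinking:

```python
# Sketch: moving-average low-pass over per-frame intensity samples. The
# window should span the flicker period but stay well below the ~0.3 s
# on-time of a 1.5 Hz turn signal.

def smooth_intensity(samples, window=4):
    """Causal moving average over the last `window` samples."""
    out = []
    for i in range(len(samples)):
        lo = max(0, i - window + 1)
        out.append(sum(samples[lo:i + 1]) / (i + 1 - lo))
    return out

# Example: frame-to-frame flicker around a steady "on" level flattens out.
print(smooth_intensity([200, 120, 200, 120, 200, 120]))
```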
Optionally, a system according to the invention may have (ego-) vehicle control means to assist or intervene in case the ego vehicle is about to enter a lane that another relatively close (or still relatively distant but fast approaching) vehicle is also about to enter, such as by discontinuing the lane change maneuver and instead staying in the occupied lane and braking if necessary (as in the example of the drawings).
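The following decision sketch illustrates such an intervention; the lane encoding, gap and closing-speed inputs and the time-to-contact threshold are all illustrative assumptions.

```python
# Hedged sketch of the abort-or-proceed decision for a lane change when
# another vehicle signals into the same target lane.

def lane_change_decision(ego_target_lane, other_target_lane,
                         gap_m, closing_speed_mps, min_ttc_s=3.0):
    """Abort if another vehicle claims the same lane and would close the
    gap too soon; otherwise proceed."""
    if ego_target_lane != other_target_lane:
        return "proceed"
    ttc_s = gap_m / closing_speed_mps if closing_speed_mps > 0 else float("inf")
    if ttc_s < min_ttc_s:
        return "abort_and_brake_if_needed"  # stay in the occupied lane
    return "proceed_with_caution"

# Example: a fast-approaching vehicle signals into the same target lane.
print(lane_change_decision(2, 2, gap_m=25.0, closing_speed_mps=12.0))
```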
Therefore, the system of the present invention uses a vision system to detect and identify activated turn signal indicators of other vehicles present in the field of view of the camera at the equipped vehicle. The system processes image data captured by the camera to detect blinking or flashing, and determines whether or not the detected blinking is indicative of an activated turn signal indicator of a vehicle ahead or behind the equipped vehicle. Such determination is made responsive to a color of the detected blinking being within a threshold color range, the rate of flashing of the detected blinking being within a threshold rate, and/or the location of the detected blinking being within a threshold range of locations for another vehicle. The threshold(s) may be selected or adjusted responsive to the current geographical location of the equipped vehicle.
The camera or sensor may comprise any suitable camera or sensor. Optionally, the camera may comprise a “smart camera” that includes the imaging sensor array and associated circuitry and image processing circuitry and electrical connectors and the like as part of a camera module, such as by utilizing aspects of the vision systems described in International Publication Nos. WO 2013/081984 and/or WO 2013/081985, which are hereby incorporated herein by reference in their entireties.
The system includes an image processor operable to process image data captured by the camera or cameras, such as for detecting objects or other vehicles or pedestrians or the like in the field of view of one or more of the cameras. For example, the image processor may comprise an image processing chip selected from the EyeQ family of image processing chips available from Mobileye Vision Technologies Ltd. of Jerusalem, Israel, and may include object detection software (such as the types described in U.S. Pat. Nos. 7,855,755; 7,720,580 and/or 7,038,577, which are hereby incorporated herein by reference in their entireties), and may analyze image data to detect vehicles and/or other objects. Responsive to such image processing, and when an object or other vehicle is detected, the system may generate an alert to the driver of the vehicle and/or may generate an overlay at the displayed image to highlight or enhance display of the detected object or vehicle, in order to enhance the driver's awareness of the detected object or vehicle or hazardous condition during a driving maneuver of the equipped vehicle.
The vehicle may include any type of sensor or sensors, such as imaging sensors or radar sensors or lidar sensors or ladar sensors or ultrasonic sensors or the like. The imaging sensor or camera may capture image data for image processing and may comprise any suitable camera or sensing device, such as, for example, a two dimensional array of a plurality of photosensor elements arranged in at least 640 columns and 480 rows (at least a 640×480 imaging array, such as a megapixel imaging array or the like), with a respective lens focusing images onto respective portions of the array. The photosensor array may comprise a plurality of photosensor elements arranged in a photosensor array having rows and columns. Preferably, the imaging array has at least 300,000 photosensor elements or pixels, more preferably at least 500,000 photosensor elements or pixels and more preferably at least 1 million photosensor elements or pixels. The imaging array may capture color image data, such as via spectral filtering at the array, such as via an RGB (red, green and blue) filter or via a red/red complement filter or such as via an RCC (red, clear, clear) filter or the like. The logic and control circuit of the imaging sensor may function in any known manner, and the image processing and algorithmic processing may comprise any suitable means for processing the images and/or image data.
For example, the vision system and/or processing and/or camera and/or circuitry may utilize aspects described in U.S. Pat. Nos. 9,233,641; 9,146,898; 9,174,574; 9,090,234; 9,077,098; 8,818,042; 8,886,401; 9,077,962; 9,068,390; 9,140,789; 9,092,986; 9,205,776; 8,917,169; 8,694,224; 7,005,974; 5,760,962; 5,877,897; 5,796,094; 5,949,331; 6,222,447; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978; 7,859,565; 5,550,677; 5,670,935; 6,636,258; 7,145,519; 7,161,616; 7,230,640; 7,248,283; 7,295,229; 7,301,466; 7,592,928; 7,881,496; 7,720,580; 7,038,577; 6,882,287; 5,929,786 and/or 5,786,772, and/or U.S. Publication Nos. US-2014-0340510; US-2014-0313339; US-2014-0347486; US-2014-0320658; US-2014-0336876; US-2014-0307095; US-2014-0327774; US-2014-0327772; US-2014-0320636; US-2014-0293057; US-2014-0309884; US-2014-0226012; US-2014-0293042; US-2014-0218535; US-2014-0218535; US-2014-0247354; US-2014-0247355; US-2014-0247352; US-2014-0232869; US-2014-0211009; US-2014-0160276; US-2014-0168437; US-2014-0168415; US-2014-0160291; US-2014-0152825; US-2014-0139676; US-2014-0138140; US-2014-0104426; US-2014-0098229; US-2014-0085472; US-2014-0067206; US-2014-0049646; US-2014-0052340; US-2014-0025240; US-2014-0028852; US-2014-005907; US-2013-0314503; US-2013-0298866; US-2013-0222593; US-2013-0300869; US-2013-0278769; US-2013-0258077; US-2013-0258077; US-2013-0242099; US-2013-0215271; US-2013-0141578 and/or US-2013-0002873, which are all hereby incorporated herein by reference in their entireties. The system may communicate with other communication systems via any suitable means, such as by utilizing aspects of the systems described in International Publication Nos. WO 2010/144900; WO 2013/043661 and/or WO 2013/081985, and/or U.S. Pat. No. 9,126,525, which are hereby incorporated herein by reference in their entireties.
Optionally, the camera may comprise a forward facing camera, such as disposed at a windshield electronics module (WEM) or the like. The forward facing camera may utilize aspects of the systems described in U.S. Pat. Nos. 8,256,821; 7,480,149; 6,824,281 and/or 6,690,268, and/or U.S. Publication Nos. US-2015-0327398; US-2015-0015713; US-2014-0160284; US-2014-0226012 and/or US-2009-0295181, which are all hereby incorporated herein by reference in their entireties.
The system may also communicate with other systems, such as via a vehicle-to-vehicle communication system or a vehicle-to-infrastructure communication system or the like. Such car2car or vehicle to vehicle (V2V) and vehicle-to-infrastructure (car2X or V2X or V2I or 4G or 5G) technology provides for communication between vehicles and/or infrastructure based on information provided by one or more vehicles and/or information provided by a remote server or the like. Such vehicle communication systems may utilize aspects of the systems described in U.S. Pat. Nos. 6,690,268; 6,693,517 and/or 7,580,795, and/or U.S. Publication Nos. US-2014-0375476; US-2014-0218529; US-2013-0222592; US-2012-0218412; US-2012-0062743; US-2015-0251599; US-2015-0158499; US-2015-0124096; US-2015-0352953; US-2016-0036917 and/or US-2016-0210853, which are hereby incorporated herein by reference in their entireties.
The system may utilize sensors, such as radar sensors or lidar sensors or the like. The sensing system may utilize aspects of the systems described in U.S. Pat. Nos. 9,753,121; 9,689,967; 9,599,702; 9,575,160; 9,146,898; 9,036,026; 8,027,029; 8,013,780; 6,825,455; 7,053,357; 7,408,627; 7,405,812; 7,379,163; 7,379,100; 7,375,803; 7,352,454; 7,340,077; 7,321,111; 7,310,431; 7,283,213; 7,212,663; 7,203,356; 7,176,438; 7,157,685; 6,919,549; 6,906,793; 6,876,775; 6,710,770; 6,690,354; 6,678,039; 6,674,895 and/or 6,587,186, and/or International Publication No. WO 2011/090484 and/or U.S. Publication Nos. US-2017-0222311 and/or US-2010-0245066 and/or U.S. patent applications, Ser. No. 15/685,123, filed Aug. 24, 2017, Ser. No. 15/675,919, filed Aug. 14, 2017, Ser. No. 15/647,339, filed Jul. 12, 2017, Ser. No. 15/619,627, filed Jun. 12, 2017, Ser. No. 15/584,265, filed May 2, 2017, Ser. No. 15/467,247, filed Mar. 23, 2017, and/or Ser. No. 15/446,220, filed Mar. 1, 2017, and/or International PCT Application No. PCT/IB2017/054120, filed Jul. 7, 2017, which are hereby incorporated herein by reference in their entireties.
Optionally, the vision system may include a display for displaying images captured by one or more of the imaging sensors for viewing by the driver of the vehicle while the driver is normally operating the vehicle. Optionally, for example, the vision system may include a video display device, such as by utilizing aspects of the video display systems described in U.S. Pat. Nos. 5,530,240; 6,329,925; 7,855,755; 7,626,749; 7,581,859; 7,446,650; 7,338,177; 7,274,501; 7,255,451; 7,195,381; 7,184,190; 5,668,663; 5,724,187; 6,690,268; 7,370,983; 7,329,013; 7,308,341; 7,289,037; 7,249,860; 7,004,593; 4,546,551; 5,699,044; 4,953,305; 5,576,687; 5,632,092; 5,677,851; 5,708,410; 5,737,226; 5,802,727; 5,878,370; 6,087,953; 6,173,508; 6,222,460; 6,513,252 and/or 6,642,851, and/or U.S. Publication Nos. US-2012-0162427; US-2006-0050018 and/or US-2006-0061008, which are all hereby incorporated herein by reference in their entireties.
Optionally, the vision system (utilizing the forward facing camera and a rearward facing camera and other cameras disposed at the vehicle with exterior fields of view) may be part of or may provide a display of a top-down view or birds-eye view system of the vehicle or a surround view at the vehicle, such as by utilizing aspects of the vision systems described in International Publication Nos. WO 2010/099416; WO 2011/028686; WO 2012/075250; WO 2013/019795; WO 2012/075250; WO 2012/145822; WO 2013/081985; WO 2013/086249 and/or WO 2013/109869, and/or U.S. Publication No. US-2012-0162427, which are hereby incorporated herein by reference in their entireties.
Changes and modifications in the specifically described embodiments can be carried out without departing from the principles of the invention, which is intended to be limited only by the scope of the appended claims, as interpreted according to the principles of patent law including the doctrine of equivalents.
The present application claims the filing benefits of U.S. provisional application Ser. No. 62/383,792, filed Sep. 6, 2016, which is hereby incorporated herein by reference in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
5550677 | Schofield et al. | Aug 1996 | A |
5670935 | Schofield et al. | Sep 1997 | A |
5760962 | Schofield et al. | Jun 1998 | A |
5786772 | Schofield et al. | Jul 1998 | A |
5796094 | Schofield et al. | Aug 1998 | A |
5877897 | Schofield et al. | Mar 1999 | A |
5929786 | Schofield et al. | Jul 1999 | A |
5949331 | Schofield et al. | Sep 1999 | A |
6201642 | Bos | Mar 2001 | B1 |
6222447 | Schofield et al. | Apr 2001 | B1 |
6302545 | Schofield et al. | Oct 2001 | B1 |
6396397 | Bos et al. | May 2002 | B1 |
6498620 | Schofield et al. | Dec 2002 | B2 |
6523964 | Schofield et al. | Feb 2003 | B2 |
6611202 | Schofield et al. | Aug 2003 | B2 |
6636258 | Strumolo | Oct 2003 | B2 |
6690268 | Schofield et al. | Feb 2004 | B2 |
6717610 | Bos et al. | Apr 2004 | B1 |
6757109 | Bos | Jun 2004 | B2 |
6802617 | Schofield et al. | Oct 2004 | B2 |
6806452 | Bos et al. | Oct 2004 | B2 |
6822563 | Bos et al. | Nov 2004 | B2 |
6882287 | Schofield | Apr 2005 | B2 |
6891563 | Schofield et al. | May 2005 | B2 |
6946978 | Schofield | Sep 2005 | B2 |
7005974 | McMahon et al. | Feb 2006 | B2 |
7038577 | Pawlicki et al. | May 2006 | B2 |
7145519 | Takahashi et al. | Dec 2006 | B2 |
7161616 | Okamoto et al. | Jan 2007 | B1 |
7230640 | Regensburger et al. | Jun 2007 | B2 |
7248283 | Takagi et al. | Jul 2007 | B2 |
7295229 | Kumata et al. | Nov 2007 | B2 |
7301466 | Asai | Nov 2007 | B2 |
7592928 | Chinomi et al. | Sep 2009 | B2 |
7720580 | Higgins-Luthman | May 2010 | B2 |
7855755 | Weller et al. | Dec 2010 | B2 |
7859565 | Schofield et al. | Dec 2010 | B2 |
7881496 | Camilleri et al. | Feb 2011 | B2 |
8694224 | Chundrlik, Jr. et al. | Apr 2014 | B2 |
8818042 | Schofield et al. | Aug 2014 | B2 |
8886401 | Schofield et al. | Nov 2014 | B2 |
8917169 | Schofield et al. | Dec 2014 | B2 |
9068390 | Ihlenburg et al. | Jun 2015 | B2 |
9077098 | Latunski | Jul 2015 | B2 |
9077962 | Shi et al. | Jul 2015 | B2 |
9090234 | Johnson et al. | Jul 2015 | B2 |
9092986 | Salomonsson et al. | Jul 2015 | B2 |
9140789 | Lynam | Sep 2015 | B2 |
9146898 | Ihlenburg et al. | Sep 2015 | B2 |
9174574 | Salomonsson | Nov 2015 | B2 |
9205776 | Turk | Dec 2015 | B2 |
9233641 | Sesti et al. | Jan 2016 | B2 |
9305223 | Ogale | Apr 2016 | B1 |
20100260377 | Takahashi | Oct 2010 | A1 |
20120299476 | Roberts | Nov 2012 | A1 |
20130002873 | Hess | Jan 2013 | A1 |
20130141578 | Chundrlik, Jr. et al. | Jun 2013 | A1 |
20130215271 | Lu | Aug 2013 | A1 |
20130222593 | Byrne et al. | Aug 2013 | A1 |
20130242099 | Sauer et al. | Sep 2013 | A1 |
20130258077 | Bally et al. | Oct 2013 | A1 |
20130278769 | Nix et al. | Oct 2013 | A1 |
20130298866 | Vogelbacher | Nov 2013 | A1 |
20130300869 | Lu et al. | Nov 2013 | A1 |
20130314503 | Nix et al. | Nov 2013 | A1 |
20140005907 | Bajpai | Jan 2014 | A1 |
20140025240 | Steigerwald et al. | Jan 2014 | A1 |
20140028852 | Rathi | Jan 2014 | A1 |
20140049646 | Nix | Feb 2014 | A1 |
20140052340 | Bajpai | Feb 2014 | A1 |
20140067206 | Pflug | Mar 2014 | A1 |
20140085472 | Lu et al. | Mar 2014 | A1 |
20140098229 | Lu et al. | Apr 2014 | A1 |
20140104426 | Boegel et al. | Apr 2014 | A1 |
20140138140 | Sigle | May 2014 | A1 |
20140139676 | Wierich | May 2014 | A1 |
20140152825 | Schaffner | Jun 2014 | A1 |
20140160276 | Pliefke et al. | Jun 2014 | A1 |
20140160291 | Schaffner | Jun 2014 | A1 |
20140168415 | Ihlenburg et al. | Jun 2014 | A1 |
20140168437 | Rother et al. | Jun 2014 | A1 |
20140211009 | Fursich | Jul 2014 | A1 |
20140218535 | Ihlenburg et al. | Aug 2014 | A1 |
20140226012 | Achenbach | Aug 2014 | A1 |
20140232869 | May et al. | Aug 2014 | A1 |
20140247352 | Rathi et al. | Sep 2014 | A1 |
20140247354 | Knudsen | Sep 2014 | A1 |
20140247355 | Ihlenburg | Sep 2014 | A1 |
20140293042 | Lynam | Oct 2014 | A1 |
20140293057 | Wierich | Oct 2014 | A1 |
20140307095 | Wierich | Oct 2014 | A1 |
20140309884 | Wolf | Oct 2014 | A1 |
20140313339 | Diessner | Oct 2014 | A1 |
20140320636 | Bally et al. | Oct 2014 | A1 |
20140320658 | Pliefke | Oct 2014 | A1 |
20140327772 | Sahba | Nov 2014 | A1 |
20140327774 | Lu et al. | Nov 2014 | A1 |
20140336876 | Gieseke et al. | Nov 2014 | A1 |
20140340510 | Ihlenburg et al. | Nov 2014 | A1 |
20140347486 | Okouneva | Nov 2014 | A1 |
Number | Date | Country | |
---|---|---|---|
20180068191 A1 | Mar 2018 | US |
Number | Date | Country | |
---|---|---|---|
62383792 | Sep 2016 | US |