This invention relates to LIDAR (Light Detection and Ranging) systems and, in particular, to High Resolution Flash LIDAR (HFL) sensors that detect adverse conditions, such as weather conditions affecting a vehicle, as well as solid objects in the field of view.
LIDAR sensors used in advanced driver assist systems undergo significant performance degradation in bad weather conditions. These conditions include rainfall, snowfall, hail, drizzle, haze, smog, fog, spray formed by droplets of water kicked up by the tires of a vehicle driving on a wet road (e.g., freeways), etc. The performance of the sensor degrades for three main reasons. First, the power of the laser is scattered, significantly reducing the maximum detectable distance. Second, returns from snowflakes, rain drops and fog are confused with returns from solid objects. Third, the quality of the LIDAR image or point cloud decreases due to interference from weather objects. This degradation makes it necessary to detect the current weather condition in which the vehicle is driving so that the system can enter a weather mode in which some functionality is disabled after the driver has been notified to take over.
A conventional driver assist system for detecting weather such as rain is disclosed in EP 3091342 A1. That system uses an additional channel for bad weather detection, as opposed to the technique disclosed herein, in which the normal object detection channel is used both for detection of weather and for detection of objects. This state of the art is also limited in that it probes a very limited space in front of the vehicle, making its reliability questionable. In addition, this conventional approach is not able to distinguish between types of weather condition such as rain, snow, fog, spray, etc., since the additional channel has a very limited resolution. By contrast, weather detection using the HFL sensor disclosed herein can detect and classify weather conditions reliably owing to its high resolution and the fast sampling rate of the LIDAR signal.
U.S. Pat. No. 8,879,049 discloses an optical sensing system that uses a dedicated photodiode or receiver channel which overlaps with the illumination field only for a short distance in front of the sensor. The photodiode cannot be used for any other purpose. This method again suffers from the same problem in that it probes a very small region (a few cm3) and is unable to classify weather conditions due to its very low resolution.
Thus, there is a need for a robust and cost-effective weather detection and classification system for driver assist or autonomous vehicles to make them safe and reliable. Such an additional feature helps these vehicles easily monitor their environments and reliably predict and report performance degradation.
An objective of the invention is to fulfill the need referred to above. In accordance with the principles of an embodiment, this objective is achieved by a method of detecting adverse weather conditions in a driver assist or autonomous vehicle system for a vehicle. The method provides a system including a LIDAR sensor (in particular an HFL sensor) having a transmitting portion, including a light source and illumination optics, and a receiving portion, including receiving optics and a photodetector or array of photodetectors, as used in an HFL sensor, for receiving reflected light. The receiving optics is spaced from the illumination optics. The illumination optics and the receiving optics each define a field of view, with the fields of view overlapping at a certain distance from the sensor to define a solid object sensing region. A region located outside of the solid object sensing region defines a non-overlapping region. The photodetector determines if a signal exists in the solid object sensing region indicative of a solid object therein. The same photodetector also determines if a signal exists in the non-overlapping region indicative of an adverse weather condition affecting the vehicle.
In accordance with another aspect of an embodiment, a system for detecting adverse conditions in an environment includes a LIDAR sensor having a transmitting portion, including a light source and illumination optics, and a receiving portion, including receiving optics and a photodetector or array of photodetectors, as used in an HFL sensor, for receiving reflected light. The receiving optics is spaced from the illumination optics. The illumination optics and the receiving optics each define a field of view, with the fields of view overlapping at a certain distance from the sensor to define a solid object sensing region. A region located outside of the solid object sensing region defines a non-overlapping region. The photodetectors are constructed and arranged to detect at least one signal when a solid object is in the solid object sensing region and to detect at least one signal when a non-solid object is in the non-overlapping region. A processor circuit is electrically coupled with the sensor and is constructed and arranged to process signals obtained from the sensor.
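By way of illustration only, the following minimal sketch (in Python, with hypothetical function names and values that are not part of the disclosed system) shows how a single pixel's detected return ranges could be split between the solid object sensing region and the non-overlapping region, assuming each pixel reports a list of return ranges and knows its own overlap distance:

    # Illustrative sketch only: split a pixel's LIDAR returns using the pixel's
    # overlap distance. Function names and values are hypothetical.

    def classify_pixel_returns(return_ranges_m, overlap_distance_m):
        """Returns closer than the overlap distance lie in the non-overlapping
        region and are treated as weather candidates; returns at or beyond the
        overlap distance are treated as solid object candidates."""
        object_returns = [r for r in return_ranges_m if r >= overlap_distance_m]
        weather_returns = [r for r in return_ranges_m if r < overlap_distance_m]
        return object_returns, weather_returns

    # Hypothetical example: a pixel with a 2.5 m overlap distance that sees a
    # return at 0.8 m (possible fog/spray backscatter) and one at 14.2 m
    # (possible solid object).
    objects, weather = classify_pixel_returns([0.8, 14.2], overlap_distance_m=2.5)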
Other objectives, features and characteristics of the present invention, as well as the methods of operation and the functions of the related elements of the structure, the combination of parts and economics of manufacture will become more apparent upon consideration of the following detailed description and appended claims with reference to the accompanying drawings, all of which form a part of this specification.
The invention will be better understood from the following detailed description of the preferred embodiments thereof, taken in conjunction with the accompanying drawings, wherein like reference numerals refer to like parts, in which:
With reference to
With reference to
Unlike normal cameras, the HFL sensor 13 is an active sensor having its own illumination (laser diode 20) with a defined divergence or field of view. Due to mechanical reasons and design requirements, the illumination optics Tx and receiving optics Rx are not located at the same position. As a result, the illumination field of view (FOV) and the receiving field of view do not overlap until some distance in front of the sensor, called the “overlapping distance”. The overlapping distance is the distance required for a pixel's FOV to overlap with the illumination field of the radiation (laser). This overlap distance depends on the separation distance between the illumination and receiving optics. The larger the distance between the receiving optics Rx and illumination optics Tx, the larger the overlap distance. Moreover, since the detector array 25 of the HFL sensor 13 has multiple pixels (thousands), each of these pixels has its own overlap distance given by its position on the focal plane array (FPA).
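Purely as an illustration of this geometry, the simplified parallel-axis model below estimates a pixel's overlap distance from the baseline separation between the Tx and Rx apertures, the angle of the illumination edge toward the Rx side, and the angle of the pixel FOV edge toward the Tx side; this is a sketch under these assumptions, and all parameter values are hypothetical rather than taken from the sensor described herein:

    import math

    # Illustrative sketch only: simplified bistatic geometry with parallel
    # optical axes. Names and values are hypothetical.

    def overlap_distance_m(baseline_m, tx_edge_deg, pixel_edge_deg):
        """Range at which the pixel FOV edge facing the Tx optics first meets
        the illumination edge facing the Rx optics.

        baseline_m     : separation between illumination (Tx) and receiving (Rx) optics
        tx_edge_deg    : angle of the illumination edge toward the Rx side
        pixel_edge_deg : angle of the pixel FOV edge toward the Tx side
                         (negative for pixels looking away from the Tx side,
                         which is why edge pixels can have longer overlap distances)
        """
        spread = math.tan(math.radians(tx_edge_deg)) + math.tan(math.radians(pixel_edge_deg))
        return float("inf") if spread <= 0.0 else baseline_m / spread

    # Hypothetical example: 5 cm baseline, 1.5 degree illumination edge angle.
    print(overlap_distance_m(0.05, 1.5, 0.5))    # central pixel, roughly 1.4 m
    print(overlap_distance_m(0.05, 1.5, -0.5))   # edge pixel, roughly 2.9 m

Consistent with the discussion above, increasing the baseline in this simplified model increases the overlap distance for every pixel.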
With reference to
However, due to special optical phenomena, e.g., multiple scattering from fog, spray, rain, snow, or other non-solid objects, the photodiode or detector array 25 can detect a signal in the blind window. Hence, the presence of this signal in the non-overlapping region R serves as a fingerprint for the presence of an adverse weather condition (snow, spray, fog, etc.). This non-overlapping region R extends from a few centimeters to several meters, depending on the distance between the illumination optics and the receiving optics and on the location of the pixel on the FPA. Edge pixels normally have a longer overlapping distance.
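As a hedged illustration of how this fingerprint might be aggregated across the detector array (the frame representation and thresholds below are hypothetical assumptions, not part of the disclosure), a simple check could count how many pixels see returns inside their own blind windows over several frames:

    # Illustrative sketch only: flag a possible adverse weather condition when a
    # sufficient fraction of pixels repeatedly detect returns inside their own
    # non-overlapping ("blind window") region. Thresholds are hypothetical.

    def blind_window_hit_fraction(frame_returns, overlap_distances):
        """frame_returns[i] holds the return ranges (m) of pixel i in one frame;
        overlap_distances[i] is that pixel's overlap distance (m)."""
        hits = sum(
            1
            for returns, d_ovl in zip(frame_returns, overlap_distances)
            if any(r < d_ovl for r in returns)
        )
        return hits / len(overlap_distances)

    def adverse_weather_suspected(frames, overlap_distances,
                                  pixel_fraction=0.2, frame_fraction=0.8):
        """Raise the weather flag only if the blind-window hit fraction exceeds
        pixel_fraction in most of the processed frames."""
        flagged = [
            blind_window_hit_fraction(f, overlap_distances) >= pixel_fraction
            for f in frames
        ]
        return sum(flagged) / len(flagged) >= frame_fraction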
In addition, with reference to
As used herein, “fog particles” are little droplets of water suspended in the air, usually a few micrometers in size. “Spray” is a fog-like material produced when a car drives over a wet road. It is formed when water on the ground is kicked up by the tires of a vehicle, forming a cloud of little droplets of water in the air. Spray droplets are usually bigger than fog droplets and exhibit highly dynamic behavior because of air turbulence from the vehicle. Spray is usually formed at high speeds on highways. “Scattering”, in simple terms, is a phenomenon in which light incident on a particle is scattered in all directions (usually in varying degrees). Depending on the size of the particle relative to the wavelength of the incident light, the scattering behavior changes. At the emission wavelength of the laser of the HFL sensor 13, the fog particles interact with light in what is referred to as “Mie Scattering”. This scattering is more omnidirectional for smaller particles and more forward-directed for larger particles.
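For context, the scattering regime is commonly characterized by the dimensionless size parameter x = 2*pi*r/lambda, where r is the particle radius and lambda is the wavelength. The short snippet below simply evaluates this standard quantity for hypothetical droplet sizes at a typical near-infrared LIDAR wavelength; it is provided for illustration and is not part of the disclosed method:

    import math

    def size_parameter(radius_m, wavelength_m):
        """Dimensionless Mie size parameter x = 2*pi*r / wavelength."""
        return 2.0 * math.pi * radius_m / wavelength_m

    # Hypothetical values: a 2 um fog droplet and a 50 um spray droplet at 905 nm.
    print(size_parameter(2e-6, 905e-9))    # ~14, Mie regime, fairly omnidirectional
    print(size_parameter(50e-6, 905e-9))   # ~350, strongly forward-scattering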
In accordance with the embodiment, after detection of the weather condition by detecting a signal in the non-overlapping region noted above, an algorithm is executed by a processor circuit 34 of the control unit 16 (
Returning to
Generally, even a slight separation of the paths of the Tx optics 22 and the Rx optics 26 in the direction of low beam divergence (which could be horizontal or vertical) produces a larger overlapping distance. Thus, a larger separation of the Rx and Tx optics in the direction of low divergence of the illumination is preferred.
The operations and algorithms described herein can be implemented as executable code within a micro-controller or control unit 16 having processor circuit 34 as described, or stored on a standalone computer or machine readable non-transitory tangible storage medium that are completed based on execution of the code by a processor circuit implemented using one or more integrated circuits. Example implementations of the disclosed circuits include hardware logic that is implemented in a logic array such as a programmable logic array (PLA), a field programmable gate array (FPGA), or by mask programming of integrated circuits such as an application-specific integrated circuit (ASIC). Any of these circuits also can be implemented using a software-based executable resource that is executed by a corresponding internal processor circuit such as a micro-processor circuit (not shown) and implemented using one or more integrated circuits, where execution of executable code stored in an internal memory circuit causes the integrated circuit(s) implementing the processor circuit to store application state variables in processor memory, creating an executable application resource (e.g., an application instance) that performs the operations of the circuit as described herein. Hence, use of the term “circuit” in this specification refers to both a hardware-based circuit implemented using one or more integrated circuits and that includes logic for performing the described operations, or a software-based circuit that includes a processor circuit (implemented using one or more integrated circuits), the processor circuit including a reserved portion of processor memory for storage of application state data and application variables that are modified by execution of the executable code by a processor circuit. The memory circuit 36 can be implemented, for example, using a non-volatile memory such as a programmable read only memory (PROM) or an EPROM, and/or a volatile memory such as a DRAM, etc.
Advantages of the system 10 of the embodiment include:
Although the above-described system and method have been disclosed for detecting an adverse weather condition, other methods using the HFL sensor 13 can be employed. For example, another method includes processing of clusters at close distance. Rain and snow produce small, round clusters that are not persistent. Intensity and reflectivity can also be considered. Fog and spray produce big clusters that take the shape of the FOV and are persistent and transparent. Other methods can include processing of the point cloud, monitoring overlap of clusters, post ground, etc., or monitoring multiple pulse detections.
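As a minimal, hypothetical sketch of such cluster-based processing (the feature names and thresholds are illustrative assumptions, not the disclosed algorithm), the distinguishing properties listed above could be encoded roughly as follows:

    # Illustrative sketch only: rough cluster-based classification built on the
    # qualitative properties described above. All thresholds are hypothetical.

    def classify_cluster(size_points, roundness, persistence_frames, fills_fov):
        """size_points       : number of LIDAR points in the cluster
        roundness          : shape compactness measure between 0 and 1
        persistence_frames : consecutive frames in which the cluster was tracked
        fills_fov          : True if the cluster spans the sensor field of view
        """
        if fills_fov and persistence_frames >= 10:
            return "fog_or_spray"          # big, FOV-shaped, persistent cluster
        if size_points < 20 and roundness > 0.7 and persistence_frames <= 2:
            return "rain_or_snow"          # small, round, short-lived cluster
        return "solid_object_or_unknown"

    # Hypothetical example: a small round cluster seen in only one frame.
    print(classify_cluster(size_points=8, roundness=0.9,
                           persistence_frames=1, fills_fov=False))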
Although the embodiment has been disclosed for use in a driver assist system or autonomous vehicle system, the system 10 can be used in other adverse environments, such as for detection in dusty or smoke-filled environments. In addition, the system 10 can be used as a weather sensor for meteorological applications.
The foregoing preferred embodiments have been shown and described for the purposes of illustrating the structural and functional principles of the present invention, as well as illustrating the methods of employing the preferred embodiments and are subject to change without departing from such principles. Therefore, this invention includes all modifications encompassed within the scope of the following claims.