The disclosed technique relates to imaging systems, in general, and to methods and systems for object detection and classification, in particular.
Traditional imaging sensors use a spectral pattern with various configurations such as the Bayer pattern, “RGBW” (red, green, blue, white), “RCCC” (red, clear, clear, clear) etc. These color or clear spectral filters pass a wide spectral region, which masks the pure signal. Prior art has also described narrow spectral patterns on the imaging sensor pixels, such as Fabry-Perot filters. This approach may lack spectral information due to the narrow spectral band and may also lack immunity to backscattering in an active imaging approach.
Another prior art reference, U.S. Pat. No. 8,446,470, titled “Combined RGB and IR imaging sensor”, describes an imaging system with a plurality of sub-arrays sensing different colors and infrared radiation. This proposed imaging system has an inherent drawback when imaging wide dynamic range scenery of a single spectral radiation, such as that originating from an LED or a laser, where a saturated pixel may mask (due to signal leakage) a nearby non-saturated pixel. Another drawback may occur when imaging scenery containing pulsed or modulated spectral radiation, such as that originating from an LED or a laser, where the pixel exposure is not synchronized to this type of operation.
Before describing the method of the invention, the following definitions are put forward.
The term “Visible” as used herein is a part of the electro-magnetic optical spectrum with a wavelength between 400 and 700 nanometers.
The term “Infra-Red” (IR) as used herein is a part of the electro-magnetic optical spectrum with a wavelength between 700 nanometers and 1 mm.
The term “Near Infra-Red” (NIR) as used herein is a part of the Infra-Red spectrum with a wavelength between 700 and 1400 nanometers.
The term “Short Wave Infra-Red” (SWIR) as used herein is a part of the Infra-Red spectrum with a wavelength between 1400 and 3000 nanometers.
The term “Field Of View” (FOV) as used herein is the angular extent of a given scene, delineated by the angle of a three dimensional cone that is imaged onto an image sensor of a camera, the camera being the vertex of the three dimensional cone. The FOV of a camera at particular distances is determined by the focal length of the lens and the active image sensor dimensions.
The term “Field of Illumination” (FOI) as used herein is the angular extent of a given scene, delineated by the angle of a three dimensional cone that is illuminated from an illuminator (e.g. LED, LASER, flash lamp, ultrasound transducer, etc.), the illuminator being the vertex of the three dimensional cone. The FOI of an illuminator at particular distances is determined by the focal length of the lens and the illuminator illuminating surface dimensions.
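By way of a non-limiting illustration, the FOV along one sensor axis follows from the lens focal length and the active sensor dimension, as stated in the definition above; the following is a minimal Python sketch, in which the 6 mm focal length and 5.76 mm sensor width are hypothetical values, not taken from the disclosure:

```python
import math

def fov_deg(focal_length_mm: float, sensor_dim_mm: float) -> float:
    # Full angular extent along one sensor axis: 2 * atan(d / (2 * f)).
    return math.degrees(2.0 * math.atan(sensor_dim_mm / (2.0 * focal_length_mm)))

# Hypothetical automotive camera: 6 mm lens, 5.76 mm-wide active sensor area.
horizontal_fov = fov_deg(6.0, 5.76)  # roughly 51 degrees
```

The same relation applies to the FOI of an illuminator, with the illuminating surface dimension in place of the sensor dimension.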
The term “pixel” or “photo-sensing pixel” as used herein is defined as a photo sensitive element used as part of an array of pixels in an image detector device.
The term “sub pixel” or “photo-sensing sub pixel” as used herein is defined as a photo sensitive element used as part of an array of sub pixels in a photo-sensing pixel. Thus, an image detector has an array of photo-sensing pixels, and each photo-sensing pixel includes an array of photo-sensing sub pixels. Specifically, each photo-sensing sub pixel may be sensitive to a different range of wavelengths. Each one of the photo-sensing sub pixels is controlled in accordance with a second type exposure and/or readout scheme.
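The pixel/sub pixel hierarchy described above can be modeled as a small repeating mosaic. The 2×2 band layout below, with an NIR sub pixel serving as the first type, is a hypothetical example for illustration only, not the disclosed pattern:

```python
# Hypothetical 2x2 sub pixel mosaic repeated across the detector: each
# photo-sensing pixel holds four sub pixels with different spectral bands.
MOSAIC = [["R", "NIR"],
          ["G", "B"]]
FIRST_TYPE_BANDS = {"NIR"}  # sub pixels controllable beyond the second type scheme

def subpixel_band(row: int, col: int) -> str:
    # Spectral band of the sub pixel at absolute (row, col) in the sensor array.
    return MOSAIC[row % 2][col % 2]

def is_first_type(row: int, col: int) -> bool:
    return subpixel_band(row, col) in FIRST_TYPE_BANDS
```

Because the mosaic repeats with period two along both axes, the band of any sub pixel is recovered from its absolute coordinates modulo the mosaic size.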
The term “second type exposure and/or readout scheme” of a photo-sensing sub pixel as used herein, is defined as a single exposure (i.e. light accumulation) of the photo sensitive element per a single signal read.
The term “first type sub pixel” or “first type photo-sensing sub pixel” as used herein, relates to a photo-sensing sub pixel which is controllable beyond the second type exposure scheme.
In accordance with the disclosed technique, there is thus provided an imaging sensor (detector) or camera having an array of photo-sensitive pixels in a configuration that combines:
In one embodiment of the present invention, the photo-sensitive pixel configuration described hereinabove includes at least one “first type sub pixel” or “first type photo-sensing sub pixel”, which relates to a photo-sensing sub pixel controllable beyond the second type exposure scheme.
In another embodiment of the present invention, the exposure control mechanism (i.e. exposure scheme) for at least one first type sub pixel may provide a single exposure per sub pixel signal readout or multiple exposures per single sub pixel readout.
In another embodiment of the present invention, the pixel signal readout may use a single readout channel or multiple readout channels.
In another embodiment of the present invention, at least one first type sub pixel may have a signal readout channel separate from the readout channel of the other sub pixels.
The imaging sensor (detector) or camera of the present invention is suitable for use in automotive camera products, such as for mono-vision based systems, providing driver assistance functionalities such as: adaptive headlamp control systems, lane departure warning (and/or lane keeping), traffic sign recognition, front collision warning, object detection (e.g. pedestrian, animal etc.), night vision and/or the like.
The imaging sensor (detector) or camera of the present invention is suitable for use in automotive camera products, such as for stereo-vision based systems, providing driver assistance functionalities such as those described hereinabove for mono-vision based systems, as well as 3D mapping information.
Therefore, the imaging sensor of the present invention can provide multi spectral imaging (for example both visible and IR imaging) capability with an adequate Signal to Noise (S/N) and/or adequate Signal to Background (S/B) for each photo-sensing sub pixel array in a single sensor frame, without halo (blooming) effect between adjacent sub pixels, and without external filters (such as spectral, polarization, intensity etc.). Such a sub pixel configuration of visible and IR pixels is applicable to various pixelated imaging array type sensing devices. The imaging sensor of the present invention is suitable for applications in maritime cameras, automotive cameras, security cameras, consumer digital cameras, mobile phone cameras, and industrial machine vision cameras, as well as other markets and/or applications.
These, additional, and/or other aspects and/or advantages of the present invention are: set forth in the detailed description which follows; possibly inferable from the detailed description; and/or learnable by practice of the present invention.
The present invention will be more readily understood from the detailed description of embodiments thereof made in conjunction with the accompanying drawings of which:
Before explaining at least one embodiment of the invention in detail, it is to be understood that the invention is not limited in its application to the details of construction and the arrangement of the components set forth in the following description or illustrated in the drawings. The invention is applicable to other embodiments and may be practiced or carried out in various ways. Also, it is to be understood that the phraseology and terminology employed herein is for the purpose of description and should not be regarded as limiting.
In accordance with the present invention, the disclosed technique provides methods and systems for accumulating a signal by a controllable spectral sensing element.
System 10 may include at least a single illuminator 14 in the non-visible spectrum (i.e. NIR, SWIR or NIR/SWIR spectrum) providing a Field Of Illumination (FOI) covering a certain part of the FOV of the mosaic spectral imaging camera 15. Illuminator 14 may be a Continuous Wave (CW) light source or a pulsed light source. Illuminator 14 may provide a polarized spectrum of light and/or a diffusive light.
System 10 further includes a system control 11, which may provide synchronization of the mono vision control 12 with the illuminator control 13. System control 11 may further provide real-time image processing (computer vision), such as driver assistance features (e.g. pedestrian detection, lane departure warning, traffic sign recognition, etc.) in the case of an automotive usage. Mono vision control 12 manages the mosaic spectral imaging camera 15, including: image acquisition (i.e. readout), de-mosaicking and imaging sensor exposure control/mechanism. Illuminator control 13 manages the illuminator 14, including: ON/OFF switching, light source optical intensity level and pulse triggering for a pulsed light source configuration.
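The control split described above, in which the system control coordinates the vision control and the illuminator control, can be sketched schematically as follows; this is a Python model with hypothetical class and method names, not the disclosed implementation:

```python
class IlluminatorControl:
    # Manages illuminator 14; here only pulse triggering is modeled.
    def trigger_pulse(self, duration_s: float):
        return ("pulse", duration_s)

class MonoVisionControl:
    # Manages camera 15; here only a delayed sensor exposure is modeled.
    def expose(self, duration_s: float, delay_s: float):
        return ("exposure", delay_s, duration_s)

class SystemControl:
    # Synchronizes one illuminator pulse with one sensor exposure.
    def __init__(self):
        self.illuminator = IlluminatorControl()
        self.vision = MonoVisionControl()

    def gate(self, t_pulse_s: float, t_delay_s: float, t_exposure_s: float):
        # Fire the pulse, then expose the sensor after the given delay.
        return [self.illuminator.trigger_pulse(t_pulse_s),
                self.vision.expose(t_exposure_s, delay_s=t_delay_s)]
```

In a real system the pulse and exposure would be hardware-timed; the event list here only captures the ordering and parameters that the system control coordinates.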
In accordance with some embodiments, system 10 can be configured with gated imaging capabilities in camera 15 for at least the first type sub pixels, by synchronizing their gating with pulsed light present in the scene. The other types of sub pixels can remain unsynchronized with the pulsed light of illuminator 14. The gated imaging feature presents advantages in daytime conditions, in nighttime conditions, in imaging light-modulated objects (e.g. high repetition light flickering such as traffic signs etc.) and in poor visibility conditions. In addition, it enables target detection (i.e. of any type of object such as a car, motorcycle, pedestrian etc.) based on a selective Depth-Of-Field (referred to hereinafter sometimes as a “Slice”) in real time, with an automatic alert mechanism regarding accumulated targets. The gated imaging system may be handheld, or mounted on a static and/or moving platform. The gated imaging system may even be used on underwater platforms, ground platforms or air platforms. The preferred platform for the gated imaging system herein is vehicular.
A gated imaging system is described in certain prior art, such as U.S. Pat. No. 7,733,464 B2, titled “Vehicle mounted night vision imaging system and method”. The light source pulse (in free space) is defined as:

TLaser = 2·(R0 − Rmin)/c

where the parameters are defined in the index below. The Gated Camera ON time (in free space) is defined as:

TII = 2·(Rmax − R0)/c

The Gated Camera OFF time (in free space) is defined as:

TOff = 2·Rmin/c

where c is the speed of light and R0, Rmin and Rmax are specific ranges. The gated imaging is utilized to create a sensitivity as a function of range through time synchronization of TLaser, TII and TOff.
Hereinafter a single “Gate” (i.e. at least a single light source pulse illumination followed by at least a single sensor exposure per sensor readout) utilizes a specific TLaser, TII and TOff timing as defined above. Hereinafter “Gating” (i.e. at least a single sequence of a single light source pulse illumination followed by a single sensor exposure, repeated, the sequence ending with a sensor readout) utilizes for each sequence a specific TLaser, TII and TOff timing as defined above. Hereinafter a Depth-Of-Field (“Slice”) utilizes at least a single Gate or Gating, providing a specific accumulated imagery of the viewed scene. Each DOF may have certain DOF parameters that include at least one of the following: R0, Rmin and Rmax.
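Assuming the conventional free-space gating relations TLaser = 2·(R0 − Rmin)/c, TII = 2·(Rmax − R0)/c and TOff = 2·Rmin/c (an assumption consistent with the range parameters named above, not a quotation of the disclosure), the timing of a single Gate for a given Slice can be computed as:

```python
C = 299_792_458.0  # speed of light in free space, m/s

def gate_timing(r0_m: float, rmin_m: float, rmax_m: float):
    # Returns (t_laser, t_ii, t_off) in seconds for one Gate, under the
    # assumed conventional gated-imaging relations stated in the lead-in.
    t_laser = 2.0 * (r0_m - rmin_m) / C  # light source pulse duration
    t_ii = 2.0 * (rmax_m - r0_m) / C     # gated camera ON (exposure) time
    t_off = 2.0 * rmin_m / C             # gated camera OFF (delay) time
    return t_laser, t_ii, t_off

# Hypothetical Slice from 25 m to 150 m, with full overlap beginning at 50 m:
t_laser, t_ii, t_off = gate_timing(50.0, 25.0, 150.0)
```

The factor of two reflects the round trip of the light from the illuminator to the target and back to the sensor.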
Prior to describing embodiments of the invention,
Reference is now made to
Reference is now made to
Reference is now made to
For example, this representation can define pixelated filters as indicated in the following table.
A signal output, Signal(e), expressed in electrons, of a prior art imaging 2D sensing element (i.e. sub pixel) without an internal gain and neglecting noise can be expressed by:

Signal(e) = Sλ·Pλ·dwidth·dlength·texposure

where Sλ is the sensing element response (responsivity) to a specific wavelength (i.e. Sλ = QE(λ)·FF(λ), where QE(λ) is the quantum efficiency and FF(λ) is the sub pixel fill factor), Pλ is the optical power density at a specific wavelength, dwidth·dlength is the photo-sensing active area of the sub pixel (e.g. pin diode, buried pin diode etc.) and texposure is the sub pixel exposure duration to the optical power density. Thus, taking into account that a Color Filter Array (CFA) and/or any type of spectral pattern (as illustrated in
In another embodiment, fusion frames of mosaic spectral imaging sensor pixels 40 (a two by two sub pixel array 35 that is repeated over the pixelated array of imaging sensor 15) provide yet another layer of information. A fused frame may provide data such as: types of moving objects in the imaging sensor FOV, trajectories of moving objects in the imaging sensor FOV, scenery conditions (for example, ambient light level) or any other spectral or time-variance data of the viewed scenery.
In another embodiment, in the case of a moving platform (i.e. when imaging sensor pixels 40 are movable), fused frames may provide yet another layer of information. A fused frame may provide a full resolution image of the viewed FOV with at least a single spectral photo-sensing sub pixel.
Advanced Driver Assistance Systems (ADAS) imaging based applications may require spectral information as presented in
In addition, each type (Type A and/or Type B) may have a different exposure control mechanism (i.e. exposure scheme) and anti-blooming ratio as defined hereinabove.
For another example, sub-array 35 Type C may be at least one of the following options.
In addition, each type option (Type C option 1 and/or option 2) may have a different exposure control mechanism (i.e. exposure scheme) and anti-blooming ratio as defined hereinabove.
In another embodiment, system 10 may provide at least the above ADAS applications in addition to prediction of areas of interest, where mosaic spectral imaging sensor pixels 40 are incorporated in mosaic spectral imaging camera 15. Predicted areas of interest may include: objects in the viewed scenery (e.g. road signs, vehicles, traffic lights, curvature of the road etc.) and similar systems approaching system 10.
While the invention has been described with respect to a limited number of embodiments, these should not be construed as limitations on the scope of the invention, but rather as exemplifications of some of the preferred embodiments. Other possible variations, modifications, and applications are also within the scope of the invention.
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/IL2014/051106 | 12/17/2014 | WO | 00 |
Number | Date | Country | |
---|---|---|---|
20160316153 A1 | Oct 2016 | US |