The present disclosure relates to a camera system for a vehicle. More particularly, the present disclosure relates to a surround view camera system with improved low light performance.
Present automotive vehicles may utilize camera systems to assist vehicle drivers during use of the vehicle. These camera systems may be used to passively assist the driver during a driver-actuated and controlled maneuver, and they may also be used as part of a driver-assist system, in which the system may provide control inputs to the vehicle.
For example, a camera system may be used to display the rear area of the vehicle when the vehicle is engaged in a reversing maneuver, when visibility behind the vehicle may be limited. Camera systems may also be used to detect obstacles in the roadway, such as other vehicles, pedestrians, animals, or the like, and may be used to assist in braking the vehicle or maneuvering the vehicle to avoid or limit an imminent collision.
Similarly, the camera systems may be used to detect vehicles present in a blind spot of the vehicle, and may be used with a vehicle control system to provide a warning to the driver.
Cameras are also in use with autonomous driving systems, in which the systems are configured to view and monitor the vehicle surroundings to determine what type of vehicle maneuver to perform. Autonomous driving systems and advanced driver assist systems (ADAS) may also utilize radar to detect the presence of certain objects relative to the vehicle, but the radar systems are limited in what they can reliably detect. For example, radar may provide accurate velocity and distance information of an object outside of the vehicle, but is not as accurate as a camera in determining and classifying what the object is.
Camera systems are therefore used to reproduce the environment that is typically viewable by the human eye. Cameras are useful when there is ample light present in the environment, but the performance capabilities of cameras decrease dramatically at night and can be almost unusable in extreme low light conditions.
In some cases, the forward facing headlights typical in vehicles can provide sufficient illumination at night or in extreme low light conditions. Similarly, the tail lights and reverse-lighting that occurs during reverse vehicle maneuvers can also provide light at the rear of the vehicle sufficient for minor reverse maneuvers. However, side cameras do not benefit from headlights or tail lights to illuminate the environment.
In view of the foregoing, there remains a need for improvements to camera systems in vehicles.
A system for processing images from a camera for a vehicle is provided. The system may include a camera module including a lens configured to receive light from a surrounding environment, the camera module including a camera sensor disposed adjacent the lens and configured for receiving light that passes through the lens, an image signal processor operatively coupled to the camera module and configured to receive data from the camera sensor, and a control unit operatively coupled to the image signal processor and configured to receive an image from the image signal processor.
The system may further include an infrared light elimination mechanism associated with the camera sensor and operable in a normal light condition for eliminating an infrared portion of the light that passes through the lens, and further operable in a low light condition for allowing the infrared portion of the light passing through the lens to be processed by the image signal processor.
In another aspect, a method for capturing and processing an image in a camera system for a vehicle is provided. The method includes the step of receiving light into a camera module through a lens, wherein at least a portion of the light passes to and is received at a camera sensor coupled to an image signal processor and an autonomous driving control unit.
The method further includes detecting a normal light condition in an environment surrounding the vehicle and eliminating an infrared portion of the light received through the lens in response to detecting the normal light condition to define a non-infrared light portion. The method further includes using the non-infrared light portion to define and process a normal-light image for use in the control unit.
The method further includes detecting a low light condition in an environment surrounding the vehicle and using all of the light that passes through the lens in response to detecting the low light condition to define and process a low-light image for use in the control unit.
Other advantages of the present invention will be readily appreciated, as the same becomes better understood by reference to the following detailed description when considered in connection with the accompanying drawings wherein:
Referring to
The sensor 12 may be in the form of a CMOS or CCD camera sensor. The sensor 12 may typically pick up more light than the human eye. This extra light that the sensor 12 picks up is in the near-infrared band that human eyes cannot see. For camera systems to produce “normal” looking images during daytime operation or the like, relative to the human eye, the system 10 may eliminate or reduce the “extra light” that may be collected by the sensor 12. In the low-light condition, representing color correctly for the human eye is less relevant, and therefore it is advantageous for the sensor 12 to pick up the extra light to aid the sensor 12 and ISP 14 in producing a usable image for the ADCU 16 to process.
With reference to
In one approach, and with reference to
Light is received at the sensor 12a after passing through the lens 18, which focuses the incoming light at the camera module 22 and in particular focuses the light passing through the lens 18 onto the sensor 12a. The lens 18 may be selected from a group of lenses that are specifically designed to work with the RGB-IR type sensor 12a.
In the system 10 that uses the RGB-IR sensor 12a, all of the light passing through the lens 18 is received by the sensor 12a, and all or a portion of the received light, in the form of pixel data at the various pixels, can be processed further by the ISP 14 and the ADCU 16. Whether or not all of the data is used by the ADCU 16 depends on whether the camera unit 22 is operating in the normal light condition or the low-light condition.
In the normal light condition, if all of the received data from the pixels of the sensor 12a were used by the ADCU 16, the resulting image would include too much extra light, because of the data received by the IR pixels. The resulting image would not represent what a human eye perceives, thereby creating an inaccurate image for the ADCU 16 to use in its machine learning processes, and further creating an image that appears distorted when viewed by the human eye. If the IR portion is not subtracted during a normal light condition, the resulting image may include a magenta-type tint, and the image would therefore not be desirable for backup camera views, side camera views, or the like. Typically, ADCUs and machine learning processes operate based on “normal” looking images that resemble what the human eye can perceive, but the images may also be used for real-time visual monitoring by the vehicle driver or occupants.
Thus, during a normal light condition, the IR data is preferably removed. In the normal light condition, the ADCU 16 turns off the illuminator 20, and the raw data received in the camera unit 22 includes both the RGB and IR data collected by the pixels of the sensor 12a. Software present in the ISP 14 or the ADCU 16 may then subtract, remove, or delete the IR data, leaving only the data from the RGB pixels. The IR pixels on the sensor 12a are in a predetermined layout, and therefore the system 10 is aware of exactly which pixels and data to remove from the raw data to leave only the RGB data in order to construct an image that conforms to what the human eye typically perceives at normal light conditions.
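For illustration only, the subtraction of the IR pixel data described above may be sketched as follows. The 4x4 mosaic pattern, the frame size, and the function names here are hypothetical assumptions; actual RGB-IR pixel layouts vary by sensor vendor and are not specified in this disclosure.

```python
import numpy as np

# Hypothetical 4x4 RGB-IR mosaic cell; "IR" marks the infrared pixels.
# Actual layouts vary by sensor vendor; this pattern is illustrative only.
MOSAIC = np.array([
    ["B",  "G",  "R",  "G"],
    ["G",  "IR", "G",  "IR"],
    ["R",  "G",  "B",  "G"],
    ["G",  "IR", "G",  "IR"],
])

def subtract_ir(raw):
    """Zero out the IR pixel data from a raw RGB-IR frame, leaving
    only the RGB pixel values for constructing a normal-light image."""
    h, w = raw.shape
    # Tile the 4x4 cell across the sensor to locate every IR pixel,
    # since the IR pixels are in a predetermined, known layout.
    ir_mask = np.tile(MOSAIC == "IR", (h // 4, w // 4))
    rgb_only = raw.copy()
    rgb_only[ir_mask] = 0  # remove the IR contribution
    return rgb_only

frame = np.random.randint(0, 4096, size=(8, 8))  # e.g., 12-bit raw data
clean = subtract_ir(frame)
```

Because the layout is predetermined, the software knows exactly which pixel positions to remove, consistent with the description above.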
As described above, in the normal light operation, the illuminator 20 is turned off, because the IR data that is received by the sensor 12a is intended to be deleted from the image. However, the ADCU 16 could alternatively turn on the illuminator 20, which may increase the amount of IR data that is received by the IR pixels of the sensor 12a. The system may operate in the same manner described above, in which the increased IR data is removed, leaving only the RGB data. Thus, even if more IR data is collected, the increased IR data as a result of the illuminator 20 being on is deleted.
In a low light condition, the ADCU 16 turns on the illuminator 20, which illuminates the environment around the camera unit 22 with near-IR light. The reflected light, including the near-IR light, passes through the lens 18 and is received by the RGB-IR sensor 12a. The raw data received at the sensor 12a includes the raw RGB data from the RGB pixels, as well as the raw IR data from the IR pixels. The raw data does not need to have the data from the IR pixels subtracted, because the data from the IR pixels is desirable in low light conditions in order to produce a usable image for the ADCU 16. This is the case because even with the illuminator 20 on, the data from the RGB pixels is reduced relative to a normal light condition. The RGB-IR sensor 12a therefore provides an improved image for the ADCU 16 to use in machine learning applications.
The system 10 using the RGB-IR sensor 12a may include additional light sensors 21 (shown in
The ADCU 16 may operate in response to detecting the low light condition or the normal light condition by sending various signals within the system 10 that may control connected components. For example, the ADCU 16 may send a signal to the illuminator 20 to turn the illuminator 20 off in response to detecting a normal light condition when the light is above the threshold level. The ADCU 16 may also send a signal to the illuminator 20 to turn the illuminator 20 on in response to detecting a low light condition when the light is below the threshold level.
The ADCU 16 may similarly send a signal to the ISP 14 to delete the IR pixel data from the raw data of the sensor 12a in response to determining a normal light condition when the light is above the threshold level. The ADCU 16 may send a signal to the ISP 14 to use all of the raw data in response to determining a low light condition when the light is below the threshold level.
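The threshold-based signaling described above may be sketched as a simple decision function. The lux threshold, signal names, and dictionary keys are illustrative assumptions; the disclosure specifies only that a threshold level distinguishes the normal and low light conditions.

```python
# Hypothetical ADCU decision logic for the RGB-IR sensor approach.
# The lux threshold and signal names are illustrative assumptions.
LOW_LIGHT_THRESHOLD = 10.0  # ambient light level, in lux

def adcu_update(ambient_lux):
    """Return the control signals the ADCU sends to the illuminator
    and the ISP based on the measured ambient light level."""
    if ambient_lux >= LOW_LIGHT_THRESHOLD:
        # Normal light: illuminator off, ISP subtracts the IR pixel data.
        return {"illuminator": "off", "isp_mode": "subtract_ir"}
    # Low light: illuminator on, ISP uses all of the raw data.
    return {"illuminator": "on", "isp_mode": "use_all_raw"}
```

For example, a daytime reading such as 250 lux would yield the subtract-IR mode, while a nighttime reading such as 0.5 lux would yield the use-all-raw mode.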
The above system 10 with the RGB-IR sensor 12a can therefore collect data in both the low light and normal light conditions. The data used by the ADCU 16 is either the entire raw data during low light conditions, or just the RGB data in normal conditions, with the IR portion having been subtracted from the raw data.
In another approach, and with reference to
The system 10 may further include an IR-cut filter 30. In this approach, the infrared elimination mechanism 19 comprises the IR-cut filter 30. The IR-cut filter 30 may be installed within the camera module 22, along with the lens 18 and the RGB sensor 12b. The IR-cut filter 30 may be connected to a moving mechanism 32 configured to move the IR-cut filter 30 between at least two different positions. The moving mechanism 32 may be a mechanically actuatable mechanism to which the IR-cut filter 30 is attached, with the mechanism 32 being actuated to move the filter 30 along a path between filtering and non-filtering positions. For example, a motor and a rotation-translation mechanism may be used or a solenoid actuator may be used. Other types of controllable mechanisms that may actuate in response to a control signal may be used.
In a first position of the IR-cut filter 30, shown in
The IR-cut filter 30 is configured to block IR light, such that the light passing through the IR-cut filter 30 that reaches the sensor is effectively limited to the band of light visible to the human eye. With the IR-cut filter 30 being moveable between the first and second position, the system 10 can control whether or not IR light is detected by the sensor based on the position of the IR-cut filter relative to the lens 18 and the sensor 12b.
In a normal light condition, in which sufficient light is present to enable machine learning applications based on images resembling those visible to the human eye, the IR-cut filter 30 may be moved to the first position, shown in
Thus, the raw data collected by the sensor 12b may be passed on to the ADCU 16 without any special processing, aside from traditional image processing that converts RGB pixels to images. The ADCU 16 may then process the image using machine learning applications and models as necessary.
In the normal light condition, the IR illuminator 20 may be turned off by the ADCU 16, because the IR light passing through the lens 18 will nevertheless be filtered out by the IR-cut filter 30, so additional illumination for IR light is generally unnecessary. Thus, any additional IR light that is illuminated by the IR illuminator 20 would not pass to the RGB sensor 12b. However, it will be appreciated that the IR illuminator 20 may be turned on by the ADCU 16, even in normal light conditions, and the system 10 may operate in the same manner, with the raw data collected at the sensor 12b being used without any special processing to remove an IR portion, because the IR light is blocked by the filter 30.
In the low light condition, it is desirable to collect the extra light from the IR band. Accordingly, in the low light condition, the IR-cut filter 30 may be moved out of the path between the lens 18 and the sensor 12b, as shown in
As illustrated in
The ADCU 16 may operate in response to detecting the low light condition or the normal light condition by sending various signals within the system 10 that may control connected components. For example, the ADCU 16 may send a signal to the illuminator 20 to turn the illuminator 20 off in response to detecting a normal light condition when the light is above the threshold level. The ADCU 16 may also send a signal to the illuminator 20 to turn the illuminator 20 on in response to detecting a low light condition when the light is below the threshold level.
Similarly, the ADCU 16 may send a signal to the moving mechanism 32 coupled to the filter 30 in response to determining a normal light condition when the light is above the threshold level, with the signal controlling the moving mechanism 32 to move the filter 30 into the first position where the filter 30 is disposed between the lens 18 and the RGB sensor 12b. The ADCU 16 may send a signal to the moving mechanism 32 in response to determining a low light condition when the light is below the threshold level to move the filter 30 out of the light path between the lens 18 and the RGB sensor 12b and into the second position, so that the IR light may be collected by the RGB sensor 12b.
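The signaling for the movable IR-cut filter variant may likewise be sketched as a decision function. As before, the threshold value and signal names are illustrative assumptions rather than details taken from the disclosure.

```python
# Hypothetical ADCU logic for the movable IR-cut filter variant.
# The lux threshold and signal names are illustrative assumptions.
LOW_LIGHT_THRESHOLD = 10.0  # ambient light level, in lux

def adcu_filter_control(ambient_lux):
    """Return the signals sent to the moving mechanism and the
    illuminator: filter in the optical path (first position) with the
    illuminator off in normal light; filter out of the path (second
    position) with the illuminator on in low light."""
    if ambient_lux >= LOW_LIGHT_THRESHOLD:
        return {"filter": "first_position", "illuminator": "off"}
    return {"filter": "second_position", "illuminator": "on"}
```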
In the low light condition, and in order to increase the amount of extra light collected, the IR illuminator 20 may be turned on, thereby providing IR light nearby, which can be collected by the RGB sensor 12b. With the IR-cut filter 30 moved out of the path between the lens 18 and the sensor 12b, the extra illuminated IR light is not blocked.
In the low light condition with the IR illuminator 20 on and the IR-cut filter 30 moved out of the path between the lens 18 and the sensor 12b, the raw data collected by the sensor 12b is used by the ADCU 16 without special processing to remove an IR portion of the image.
The camera module 22 with the movable IR-cut filter 30 therefore allows a single camera module to be used during both low light and normal light conditions. This solution provides efficiencies relative to systems that use separate camera modules, where one camera module has an IR-cut filter in a fixed position and is used to collect the light during normal light conditions, and another camera module is without an IR-cut filter and is used to collect the light during low light conditions.
It will be appreciated that some of the features described above may be used in combination with each other.
In one approach, the RGB-IR sensor 12a may be used, along with the IR-cut filter 30. In this approach, the IR-cut filter 30 may be moved to the first position to block IR light from reaching the RGB-IR sensor 12a. Thus, the system 10 may operate without subtracting the data from the IR pixels, because the IR-cut filter 30 blocks the IR light. The raw data may still include an IR component, but the IR component would effectively be empty, and therefore could remain as part of the raw data. Alternatively, the system 10 may still operate in the same manner as the RGB-IR sensor system previously described above, with the IR component deleted or subtracted from the raw data, if desired.
Similarly, the IR-cut filter 30 may be moved to the second, unfiltered position even in the normal light operation. In this case, the system 10 would operate as described above with respect to the RGB-IR sensor 12a, in which the IR component is subtracted, because the IR-cut filter 30 was in its second position and did not block IR light.
In other words, the system 10 can include the IR-cut filter 30 together with the RGB-IR sensor 12a. When the IR-cut filter 30 is in the second, unfiltered position, the system 10 operates similarly to the RGB-IR system previously described.
The primary difference between the system 10 with the RGB sensor 12b plus the IR-cut filter 30 and the system 10 with the RGB-IR sensor 12a without an IR-cut filter is the manner in which the normal light condition is handled and processed. With the RGB sensor 12b and IR-cut filter 30, the filter 30 optically prevents the IR light from reaching the RGB sensor, such that the light that is received is effectively limited to the band that the human eye can detect.
With the RGB-IR sensor 12a and no IR-cut filter, the IR light will reach the sensor 12a and be collected by the sensor 12a. However, the dedicated IR pixels will effectively compartmentalize the IR portion of the image, which can then be removed via software, because the software knows which pixel data is from the IR band.
In both cases, the IR portion of an image is removed from the resulting image. The difference is whether the IR portion is blocked at the front end optically or at the back end via software.
Both of these systems operate in a similar manner in the low light condition. In each case, there is no optical blocking of IR light, because the IR-cut filter 30 is either not present in the system or is moved to its second position out of the path between the lens 18 and the sensor 12a/12b. Thus, in each case, the full spectrum of light entering the camera is collected by the sensor, and the raw data is used by the system, with the extra IR light providing a usable image for the machine learning applications and models of the ADCU 16.
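The optical-versus-software distinction summarized above may be sketched as a small lookup function. The configuration names here are hypothetical labels for the two approaches described in this disclosure.

```python
def ir_handling(configuration, light_condition):
    """Summarize where the IR portion of the light is handled for each
    configuration described above. Names are illustrative only."""
    if light_condition == "low":
        # Both systems keep the full spectrum in low light.
        return "kept"
    if configuration == "rgb_sensor_with_ir_cut_filter":
        return "blocked_optically"    # filter stops IR before the sensor
    if configuration == "rgb_ir_sensor":
        return "removed_in_software"  # ISP/ADCU subtracts IR pixel data
    raise ValueError("unknown configuration")
```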
Obviously, many modifications and variations of the present invention are possible in light of the above teachings and may be practiced otherwise than as specifically described while within the scope of the appended claims. These antecedent recitations should be interpreted to cover any combination in which the inventive novelty exercises its utility.