The present application claims the filing benefits of U.S. provisional application Ser. No. 63/587,471, filed Oct. 3, 2023, which is hereby incorporated herein by reference in its entirety.
The present invention relates generally to a vehicle cabin monitoring system for a vehicle and, more particularly, to a vehicle cabin monitoring system that utilizes one or more cameras at a vehicle.
Use of imaging sensors in vehicle imaging systems is common and known. Examples of such known systems are described in U.S. Pat. Nos. 5,949,331; 5,670,935 and/or 5,550,677, which are hereby incorporated herein by reference in their entireties.
A vehicular cabin monitoring system includes a camera disposed within a cabin of a vehicle equipped with the vehicular cabin monitoring system and viewing within an interior cabin of the vehicle. The camera is operable to capture image data. The camera includes an imager, and the imager includes a CMOS imaging array having at least one million photosensors arranged in rows and columns. The system includes an electronic control unit (ECU) with electronic circuitry and associated software. The electronic circuitry includes an image processor operable to process image data captured by the camera. Image data captured by the camera is transferred to and is processed at the ECU. The system includes a light emitter operable to emit nonvisible light. Nonvisible light emitted by the light emitter, when electrically operated to emit nonvisible light, illuminates at least a portion of the interior cabin that is viewed by the camera. The vehicular cabin monitoring system, with the light emitter electrically operated to emit nonvisible light, and via processing at the ECU of image data captured by the camera and transferred to the ECU, determines illuminance of nonvisible light at a region of interest within the illuminated portion of the interior cabin that is viewed by the camera. The vehicular cabin monitoring system, responsive to determining that the illuminance of nonvisible light at the region of interest is greater than a threshold illuminance, reduces intensity of nonvisible light emitted by the light emitter.
These and other objects, advantages, purposes and features of the present invention will become apparent upon review of the following specification in conjunction with the drawings.
A vehicular cabin monitoring system operates to capture images interior of the vehicle and may process the captured image data to detect objects within the vehicle, such as to monitor an attentiveness of the driver of the vehicle. The cabin monitoring system includes an image processor or image processing system that is operable to receive image data from one or more cameras. The system includes one or more illumination sources, such as light emitting diodes (LEDs) or vertical-cavity surface-emitting lasers (VCSELs), that emit light to illuminate the field of view of the camera(s) in low light conditions.
Referring now to the drawings and the illustrative embodiments depicted therein, a vehicle 10 includes a cabin monitoring system 12 that includes at least one interior viewing imaging sensor or camera, such as a camera 14 disposed at a rearview mirror assembly 16 of the vehicle (and the system may optionally include one or more cameras at other locations within the vehicle, such as at a windshield of the vehicle, at a headliner of the vehicle, at an instrument panel of the vehicle, etc.), which captures images interior of the vehicle, with the camera having a lens for focusing images at or onto an imaging array or imaging plane or imager of the camera.
The cabin monitoring system 12 includes one or more light emitters, such as two or more LEDs 20 (or other light sources) that are operable to emit light to illuminate the cabin of the vehicle for the camera 14 during low light conditions. For example, a first LED 20 illuminates the cabin for a driver monitoring function of the cabin monitoring system and a second LED 20 illuminates the cabin for an occupant monitoring function of the cabin monitoring system. The LEDs 20 may emit visible light or nonvisible light, such as infrared (IR) light or near-IR light. For example, the cabin monitoring system may include a light emitter (that includes one or more LEDs or other suitable illumination source) that, when electrically operated, emits near infrared light, and the camera may be sensitive to near infrared light, with the frame capture rate of the camera at least in part corresponding to the pulse rate of the near infrared light emitter. The light emitter may be located at any location within the cabin of the vehicle, such as at an interior mirror assembly, at a headliner/roof, at a central console, etc.
Cabin monitoring systems such as driver monitoring systems (DMS) and occupant monitoring systems (OMS) are vehicle safety systems that assess drivers or other occupants of a vehicle for any number of purposes. For example, a DMS may estimate or predict an alertness or drowsiness of a driver and take action (e.g., generate a warning) when the driver appears too drowsy to drive safely. As another example, an OMS may monitor for the presence and health of any occupant of the vehicle, such as when a child is left behind in a vehicle. These systems often rely on infrared (IR) or near-IR LEDs to illuminate the cabin with nonvisible light during low light conditions. However, for safety reasons, these LEDs should not shine directly into human eyes (i.e., when the eye is near the LED) at high levels of power, even though the emitted light is not visible to the human eye. Prolonged exposure to high-intensity IR light can potentially cause damage to the eyes.
For example, as shown in
Implementations herein are directed toward systems and methods for preventing eye damage from IR and near-IR LED illumination sources (such as for camera-based DMS and OMS). The implementations include a vehicular cabin monitoring system that determines when an occupant (e.g., a driver or passenger) of the vehicle approaches the illumination source and reduces the radiation power (i.e., the amount of IR light emitted) to levels safe for human eyes. For example, the system may process image data captured by a camera to determine the brightness of photosensors or pixels in a specific area of interest (which represents the illuminance of the nonvisible light). If the brightness or illuminance exceeds a certain threshold, the system reduces the amount of nonvisible light emitted by the LEDs.
Referring now to
The state machine 40 in this example has three states: an “ALL LEDs Off” state, an “OMS LED Only” state, and an “ALL LEDs On” state. Each of these states dictates the operational behavior of the LEDs within the vehicular cabin monitoring system. In the ALL LEDs On state, both the DMS LED(s) and the OMS LED(s) are enabled/pulsed at respective frequencies (e.g., 30 Hz). The frequencies may be the same for both or different. Optionally, the frequencies are the same but there is a phase shift between the OMS LED pulsing and the DMS LED pulsing (e.g., a 180-degree phase shift). This phase shift can help in reducing interference or achieving specific lighting effects. The ALL LEDs On state may be the default operational state (i.e., the state the system is in during nominal operation).
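The phase-shifted pulsing described above can be illustrated with a short sketch. The function name and interface here are hypothetical (not taken from this disclosure); the sketch simply shows how a 180-degree phase offset at a shared frequency interleaves the two LED channels:

```python
# Hypothetical sketch: start times for two LED channels pulsed at the same
# frequency, with the second channel phase-shifted relative to the first.

def pulse_start_times(freq_hz: float, phase_deg: float, n_pulses: int) -> list[float]:
    """Return the first n_pulses start times (in seconds) for an LED pulsed
    at freq_hz with the given phase offset in degrees."""
    period = 1.0 / freq_hz
    offset = (phase_deg / 360.0) * period
    return [offset + i * period for i in range(n_pulses)]

dms = pulse_start_times(30.0, 0.0, 3)    # DMS LED, no phase offset
oms = pulse_start_times(30.0, 180.0, 3)  # OMS LED, 180-degree phase shift
```

With a 180-degree shift at 30 Hz, each OMS pulse begins half a period (about 16.7 ms) after the corresponding DMS pulse, i.e., midway between consecutive DMS pulses.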
In the OMS LED Only state, the OMS LED(s) may be operational (i.e., pulsing at a defined frequency, such as at 30 Hz) and the DMS LED is off or inactive (i.e., does not pulse or emit light at all). Conversely, in the ALL LEDs Off state, neither the OMS LED nor the DMS LED pulse or emit light.
The state machine 40 may transition from the ALL LEDs On state (i.e., the normal or default state) to the OMS LED Only state when, as described in more detail below, the system determines that a human head/face is approaching the LEDs. For example, the system determines that a head is approaching one or both of the DMS and OMS LEDs. Upon making this determination, the system transitions to the OMS LED Only state and subsequently turns off the DMS LED. In this OMS LED Only state, the system continuously monitors whether the head, particularly the eyes, is at risk from exposure to the OMS LED. If such a risk is detected, the system transitions to the ALL LEDs Off state, wherein both the DMS LED and the OMS LED are turned off to mitigate any potential harm.
In the ALL LEDs Off state, the system remains inactive for a configurable timeout period. This period can be set to various durations, such as at least 10 seconds, at least 30 seconds, or even up to 50 seconds, depending on the specific requirements of the application. After this timeout period elapses, the system automatically transitions back to the OMS LED Only state. During this state, if the system continues to detect a risk to the head or eyes from the OMS LED, it may transition back to the ALL LEDs Off state to ensure safety.
Alternatively, if during the OMS LED Only state the system determines that the risk is no longer present (indicating that the head has moved away from the LEDs), the state machine 40 may transition back to the ALL LEDs On state. This transition signifies a return to normal operation where all LEDs are active. The system's ability to dynamically transition between these states ensures that it can effectively manage and mitigate risks associated with human interaction with the LEDs while maintaining optimal functionality.
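The three-state behavior described above may be sketched as a small state machine. This is a simplified illustration: the class name and the single `risk_detected` input are assumptions, and the threshold dwell periods and debouncing described elsewhere are omitted for brevity.

```python
# Simplified sketch of the three-state LED state machine described above.
# Threshold dwell times and debouncing are intentionally omitted.

ALL_ON, OMS_ONLY, ALL_OFF = "ALL LEDs On", "OMS LED Only", "ALL LEDs Off"

class LedStateMachine:
    def __init__(self, timeout_s: float = 30.0):
        self.state = ALL_ON          # default/nominal operating state
        self.timeout_s = timeout_s   # configurable dwell in ALL LEDs Off
        self.off_elapsed = 0.0

    def step(self, risk_detected: bool, dt: float) -> str:
        """Advance the machine by dt seconds given the current risk flag."""
        if self.state == ALL_ON and risk_detected:
            self.state = OMS_ONLY            # head approaching: drop DMS LED
        elif self.state == OMS_ONLY:
            if risk_detected:
                self.state = ALL_OFF         # eyes still at risk: all LEDs off
                self.off_elapsed = 0.0
            else:
                self.state = ALL_ON          # risk gone: resume normal operation
        elif self.state == ALL_OFF:
            self.off_elapsed += dt
            if self.off_elapsed >= self.timeout_s:
                self.state = OMS_ONLY        # re-probe after the timeout
        return self.state
```

A usage pass might call `step()` once per camera frame, feeding it the ROI-based risk determination described below.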
Optionally, the system determines whether a head is approaching the LEDs by determining the illuminance of the nonvisible light within a region of interest. For example, the system determines whether a head is approaching the LEDs based on a distribution of brightness values for photosensors or pixels in frames of image data captured by the camera while the LEDs emit or radiate IR or near IR light. This determination process may involve analyzing the brightness values in predefined regions of interest (ROIs) within each captured image. The system determines whether the distribution of brightness values satisfies (e.g., exceeds) a threshold value for at least a threshold period of time. The threshold period of time may be defined as at least a certain duration and/or a threshold number of sequential frames of image data. If the system finds that the brightness values satisfy this threshold condition, the system may adjust the amount of IR light emitted by the LEDs. For example, the system determines that the brightness in a given ROI exceeds a threshold value for at least 1 second (or any other threshold period of time), and, in response, reduces the amount of IR light emitted by the LEDs (e.g., by transitioning from the ALL LEDs On state to the OMS LED Only state). As another example, the system determines that the brightness in a given ROI is below a threshold value for at least 1 second, and, in response, increases the amount of IR light emitted by the LEDs (e.g., by transitioning from the OMS LED Only state to the ALL LEDs On state).
Optionally, the system determines the distribution of brightness within the ROI based on evaluating the percentage of photosensors or pixels within the ROI in one or more frames of image data that satisfy the threshold value (i.e., the quantity of photosensors or pixels “overexposed” or too bright within the ROI). For example, the system may decrease the amount of IR light emitted when over 70% of the photosensors or pixels within the ROI exceed the brightness threshold (i.e., are overexposed). As shown in
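The percentage-based overexposure test described above may be sketched as follows. The flat pixel list, the threshold values, and the function name are illustrative assumptions, not specifics from this disclosure:

```python
# Illustrative sketch: fraction of ROI pixels whose brightness exceeds a
# per-pixel threshold, compared against a configurable percentage (e.g., 70%).

def roi_overexposed(pixels: list[int], pixel_threshold: int,
                    percent_threshold: float) -> bool:
    """True when the share of pixels brighter than pixel_threshold
    meets or exceeds percent_threshold (expressed as 0..100)."""
    if not pixels:
        return False
    overexposed = sum(1 for p in pixels if p > pixel_threshold)
    return 100.0 * overexposed / len(pixels) >= percent_threshold

# Hypothetical 10-pixel ROI: 8 of 10 pixels exceed a threshold of 200,
# so 80% of the ROI is overexposed.
sample_roi = [255, 250, 240, 230, 220, 215, 210, 205, 100, 50]
```

In practice this check would run per frame, with the result fed into the dwell-time and debouncing logic before any state transition is taken.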
When the system, while in the OMS LED Only state, determines that a risk continues to be present for an occupant of the vehicle (for example, the brightness of the ROI remains above the threshold level for another threshold period of time), the system may transition to the ALL LEDs Off state to disable all LEDs and eliminate the risk to the occupants of the vehicle. For example, if the system is in the OMS LED Only state for at least 10 seconds with the brightness within the ROI continuing to exceed a threshold (which may be the same threshold as the ALL LEDs On state or a different threshold), the system may automatically transition to the ALL LEDs Off state. The system may wait for a threshold period of time (e.g., a configurable timeout such as at least 20 seconds or at least 30 seconds, such as 50 seconds or 60 seconds) in the ALL LEDs Off state before transitioning back to the OMS LED Only state.
In this state, the system may again determine whether the risk to the occupants remains. If the risk remains, the system may return to the ALL LEDs Off state. If this risk is no longer present, such as when an occupant has moved away from the camera and/or LEDs, thereby reducing or eliminating the excessive brightness, the system may transition back to the ALL LEDs On state. In this state, all LEDs are reactivated, and normal operation resumes, provided that no further risks are detected. This dynamic adjustment between states ensures that the system continuously monitors and responds to potential hazards, maintaining a safe environment for all vehicle occupants.
In some examples, the “ALL LEDs On” and/or the “OMS LED Only” states have one or more sub-states that reduce power to the enabled LEDs such that the enabled LEDs emit less light. For example, the system achieves this reduction by reducing a current and/or a voltage supplied to the LEDs. The system may transition between these sub-states prior to transitioning between the primary states. For example, when the system is in the “ALL LEDs On” state, the system may transition from a full power sub-state to a reduced power sub-state to reduce an amount of power provided to the DMS LED(s) or all LEDs to reduce the amount of light emitted by the LEDs prior to transitioning to the OMS LED Only state. That is, the sub-states may act as “intermediate states” to reduce the amount of power provided to some or all of the LEDs prior to turning off the LEDs.
The transitions between these sub-states may be governed by the same parameters that control the transitions between the primary states (e.g., the distribution of brightness within the ROI, the threshold periods of time, etc.). The system may incorporate any amount of debouncing between the states and sub-states of the state machine to manage these transitions effectively. Debouncing helps to prevent rapid, unintended switching between states and sub-states, which could otherwise lead to instability or flickering of the LEDs. The amount of debouncing applied can be configured according to the specific requirements of the application or the hardware used, allowing for customization based on different operational scenarios, vehicles, or user needs.
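Frame-count debouncing is one plausible reading of the debouncing described above; a condition must hold for a configurable number of consecutive frames before a transition is registered. The class and its interface are illustrative assumptions:

```python
# Minimal debounce sketch: a condition must persist for a configured number
# of consecutive frames before it is reported, filtering transient spikes.

class Debouncer:
    def __init__(self, frames_required: int):
        self.frames_required = frames_required
        self.count = 0

    def update(self, condition: bool) -> bool:
        """Feed one frame's condition; return True only once the condition
        has held for frames_required consecutive frames."""
        self.count = self.count + 1 if condition else 0
        return self.count >= self.frames_required
```

At a 30 Hz frame rate, `frames_required=30` would correspond to a roughly one-second persistence requirement.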
Referring now to
Referring now to
For example, as illustrated in
Other parameters may configure other aspects, such as a parameter to configure the threshold for brightness or overexposure of a single pixel or photosensor. This ensures that only pixels or photosensors exceeding a certain brightness level are considered overexposed. Another parameter may configure the percentage of pixels within the ROI that must be overexposed to trigger a state or sub-state change. In another example, a parameter may establish an amount of debouncing, which refers to the threshold period during which pixels must remain overexposed or underexposed before a change is registered. This helps in filtering out transient changes in brightness and ensures that only consistent and significant changes are considered.
The size and positioning of the ROI may be configurable along with both the brightness value and the percentage of pixels within the ROI that must be overexposed based on use case and other environmental parameters of the system and vehicle. The performance of the system may be fine-tuned via careful calibration of these parameters.
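The tunable parameters discussed above (ROI geometry, per-pixel brightness threshold, overexposure percentage, and debounce duration) might be grouped as a single configuration record. All field names and values here are hypothetical:

```python
# Hypothetical grouping of the configurable parameters described above.
from dataclasses import dataclass

@dataclass
class RoiConfig:
    x: int                      # ROI top-left column (pixels)
    y: int                      # ROI top-left row (pixels)
    width: int                  # ROI width (pixels)
    height: int                 # ROI height (pixels)
    pixel_threshold: int        # per-pixel overexposure brightness (0-255)
    percent_threshold: float    # % of ROI pixels that must be overexposed
    debounce_frames: int        # consecutive frames before a transition

cfg = RoiConfig(x=120, y=80, width=64, height=64,
                pixel_threshold=200, percent_threshold=70.0,
                debounce_frames=30)  # e.g., roughly 1 s at 30 fps
```

Keeping these values in one record makes per-vehicle or per-use-case calibration a matter of swapping configurations rather than changing logic.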
Thus, implementations herein include a vehicular cabin monitoring system configured to enhance vehicle safety by monitoring the attentiveness of drivers and the presence and health of occupants. The system employs one or more cameras placed within the vehicle, such as at the rearview mirror assembly, windshield, headliner, or instrument panel, to capture images of the vehicle's interior. These images are processed to detect objects and monitor driver and occupant behavior. The system includes multiple illumination sources or light emitters, such as LEDs that emit visible or non-visible light (e.g., infrared or near-infrared light), to ensure proper illumination in low-light conditions. The light emitters may be placed at any appropriate location within the cabin of the vehicle, such as at the interior mirror assembly of the vehicle.
The system may dynamically adjust the intensity of the emitted light to prevent potential eye damage from prolonged exposure to high-intensity infrared light. For example, the system transitions between different operational states (e.g., ALL LEDs On, OMS LED Only, and ALL LEDs Off) based on the proximity of occupants to the LEDs. The system may use a state machine to manage these transitions, ensuring that the LEDs are turned off or their power is reduced when a human head is detected within a threshold distance. This approach mitigates risks associated with direct exposure to intense IR light while maintaining optimal functionality for monitoring purposes. Thus, the system detects when a human head is too close to the LEDs, which overcomes the problem in conventional solutions using blockage signals where, for example, a partial blockage results in a false positive (i.e., the system believes there is a blockage when none exists and the capabilities of the system are reduced).
Optionally, the system may perform facial recognition detection and/or eye recognition detection via processing of the captured image data to determine whether a face or eyes are approaching the light emitter or illumination source. For example, the system may reduce the intensity of the near infrared light emitted by the illumination source responsive to determining the threshold brightness and responsive to determining, via processing of image data captured by the driver monitoring camera, that a person's head or face or eyes are at or near or approaching the light emitter, and the system may not reduce the intensity of the near infrared light emitted by the illumination source when the system determines that a person's hand or other object is approaching the light emitter, regardless of the determined brightness. In other words, the system may reduce the intensity of the light emitter when the system (i) determines approach of a person's face toward the light emitter and (ii) determines that the brightness of light reflecting off the person's face is greater than a threshold level, which is indicative of the person's face/eyes being within a threshold distance to the light emitter.
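The two-condition gate described above can be reduced to a simple conjunction: intensity is lowered only when the approaching object is a face/head/eyes and the reflected brightness exceeds the threshold, while a hand or other object never triggers a reduction. The function and its string-labeled input are illustrative stand-ins for the face-detection and brightness checks:

```python
# Illustrative sketch of the face-gated intensity reduction described above.

def should_reduce_intensity(approaching_object: str,
                            brightness_above_threshold: bool) -> bool:
    """Reduce emitter intensity only for a detected face/head/eyes approach
    combined with over-threshold reflected brightness; hands and other
    objects never trigger a reduction."""
    return (approaching_object in ("face", "head", "eyes")
            and brightness_above_threshold)
```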
The interior-viewing camera may be disposed at the mirror head of the interior rearview mirror assembly and moves together and in tandem with the mirror head when the driver of the vehicle adjusts the mirror head to adjust his or her rearward view. The interior-viewing camera may be disposed at a lower or chin region of the mirror head below the mirror reflective element of the mirror head, or the interior-viewing camera may be disposed behind the mirror reflective element and viewing through the mirror reflective element. Similarly, the light emitter may be disposed at the lower or chin region of the mirror head below the mirror reflective element of the mirror head (such as to one side or the other of the interior-viewing camera), or the light emitter may be disposed behind the mirror reflective element and emitting light that passes through the mirror reflective element. The ECU may be disposed at the mirror assembly (such as accommodated by the mirror head), or the ECU may be disposed elsewhere in the vehicle remote from the mirror assembly, whereby image data captured by the interior-viewing camera may be transferred to the ECU via a coaxial cable or other suitable communication line. Cabin monitoring or occupant detection may be achieved via processing at the ECU of image data captured by the interior-viewing camera. Optionally, cabin monitoring or occupant detection may be achieved in part via processing at the ECU of radar data captured by one or more interior-sensing radar sensors disposed within the vehicle and sensing the interior cabin of the vehicle.
The system may utilize aspects of driver monitoring systems and/or head and face direction and position tracking systems and/or eye tracking systems and/or gesture recognition systems. Such head and face direction and/or position tracking systems and/or eye tracking systems and/or gesture recognition systems may utilize aspects of the systems described in U.S. Pat. Nos. 11,827,153; 11,780,372; 11,639,134; 11,582,425; 11,518,401; 10,958,830; 10,065,574; 10,017,114; 9,405,120 and/or 7,914,187, and/or U.S. Publication Nos. US-2024-0190456; US-2024-0168355; US-2022-0377219; US-2022-0254132; US-2022-0242438; US-2021-0323473; US-2021-0291739; US-2020-0320320; US-2020-0202151; US-2020-0143560; US-2019-0210615; US-2018-0231976; US-2018-0222414; US-2017-0274906; US-2017-0217367; US-2016-0209647; US-2016-0137126; US-2015-0352953; US-2015-0296135; US-2015-0294169; US-2015-0232030; US-2015-0092042; US-2015-0022664; US-2015-0015710; US-2015-0009010 and/or US-2014-0336876, and/or U.S. patent application Ser. No. 18/666,959, filed May 17, 2024 (Attorney Docket DON01 P5121), and/or U.S. provisional application Ser. No. 63/641,574, filed May 2, 2024, and/or International Publication No. WO 2023/220222, which are all hereby incorporated herein by reference in their entireties.
The camera or sensor may comprise any suitable camera or sensor. Optionally, the camera may comprise a “smart camera” that includes the imaging sensor array and associated circuitry and image processing circuitry and electrical connectors and the like as part of a camera module, such as by utilizing aspects of the vision systems described in U.S. Pat. Nos. 10,099,614 and/or 10,071,687, which are hereby incorporated herein by reference in their entireties.
The system includes an image processor operable to process image data captured by the camera or cameras, such as for detecting objects or other vehicles or pedestrians or the like in the field of view of one or more of the cameras. For example, the image processor may comprise an image processing chip selected from the EYEQ family of image processing chips available from Mobileye Vision Technologies Ltd. of Jerusalem, Israel, and may include object detection software (such as the types described in U.S. Pat. Nos. 7,855,755; 7,720,580 and/or 7,038,577, which are hereby incorporated herein by reference in their entireties), and may analyze image data to detect vehicles and/or other objects. Responsive to such image processing, and when an object or other vehicle is detected, the system may generate an alert to the driver of the vehicle and/or may generate an overlay at the displayed image to highlight or enhance display of the detected object or vehicle, in order to enhance the driver's awareness of the detected object or vehicle or hazardous condition during a driving maneuver of the equipped vehicle.
The imaging device and control and image processor and any associated illumination source, if applicable, may comprise any suitable components, and may utilize aspects of the cameras (such as various imaging sensors or imaging array sensors or cameras or the like, such as a CMOS imaging array sensor, a CCD sensor or other sensors or the like) and vision systems described in U.S. Pat. Nos. 5,760,962; 5,715,093; 6,922,292; 6,757,109; 6,717,610; 6,590,719; 6,201,642; 5,796,094; 6,559,435; 6,831,261; 6,822,563; 6,946,978; 7,720,580; 8,542,451; 7,965,336; 7,480,149; 5,877,897; 6,498,620; 5,670,935; 5,796,094; 6,396,397; 6,806,452; 6,690,268; 7,005,974; 7,937,667; 7,123,168; 7,004,606; 6,946,978; 7,038,577; 6,353,392; 6,320,176; 6,313,454 and/or 6,824,281, and/or International Publication Nos. WO 2009/036176; WO 2009/046268; WO 2010/099416; WO 2011/028686 and/or WO 2013/016409, and/or U.S. Publication Nos. US-2010-0020170 and/or US-2009-0244361, which are all hereby incorporated herein by reference in their entireties.
Changes and modifications in the specifically described embodiments can be carried out without departing from the principles of the invention, which is intended to be limited only by the scope of the appended claims, as interpreted according to the principles of patent law including the doctrine of equivalents.
Number | Date | Country
---|---|---
63587471 | Oct 2023 | US