The present invention concerns rearview assemblies and imaging systems for vehicles and, more particularly, relates to an imaging system for an interior rearview assembly.
It is one aspect of the present disclosure to provide a rearview assembly for a vehicle. The rearview assembly includes a reflective element; an infrared illumination source disposed behind the reflective element and configured to emit infrared light through the reflective element to illuminate a field of view with infrared light, the field of view including at least part of a vehicle cabin; and an optical element disposed behind the reflective element and configured to direct infrared light from the infrared illumination source towards the field of view. The optical element includes: a molded portion comprised of a visibly opaque material that is substantially transparent to the infrared light, the molded portion having a front surface and a rear surface; and a reflector disposed on the rear surface of the molded portion and configured to substantially reflect infrared light emitted from the illumination source that passes through the molded portion towards the field of view.
It is another aspect of the present disclosure to provide an imaging system for a vehicle including a reflective element configured to provide a driver of the vehicle with a view rearward relative to the vehicle, the reflective element substantially transparent in an infrared region of the electromagnetic spectrum; at least one imager in connection with the rearview assembly and configured to acquire an image within a field of view of the imager, the field of view including at least part of a vehicle cabin; an infrared illumination source disposed behind the reflective element and configured to emit infrared light through the reflective element to illuminate at least a portion of the field of view of the imager with infrared light; and an optical element disposed behind the reflective element and configured to direct infrared light from the infrared illumination source towards the field of view. The optical element includes: a molded portion comprised of a visibly opaque material that is substantially transparent to the infrared light, the molded portion having a front surface and a rear surface; and a reflector disposed on the rear surface of the molded portion and configured to substantially reflect infrared light emitted from the illumination source that passes through the molded portion towards the field of view.
It is another aspect of the present disclosure to provide a rearview assembly for a vehicle. The rearview assembly includes: a reflective element configured to provide a driver of the vehicle with a view rearward relative to the vehicle, the reflective element substantially transparent in an infrared region of the electromagnetic spectrum; at least one imager in connection with the rearview assembly and configured to acquire an image within a field of view of the imager, the field of view including at least part of a vehicle cabin; a plurality of infrared LEDs disposed behind the reflective element and configured to emit infrared light through the reflective element to illuminate at least a portion of the field of view of the imager with infrared light; and an optical element disposed behind the reflective element and configured to direct infrared light from the infrared LEDs towards the field of view. The optical element includes: a molded portion comprised of a visibly opaque material that is substantially transparent to the infrared light, the molded portion having a front surface and a rear surface, and a plurality of recessed cups formed in the front surface, each with an opening through to the rear surface where a corresponding one of the plurality of LEDs is positioned; and a reflector disposed on the rear surface of the molded portion and configured to substantially reflect infrared light emitted from the illumination source that passes through the molded portion, reflecting the infrared light towards the field of view of the imager.
These and other features, advantages, and objects of the present device will be further understood and appreciated by those skilled in the art upon studying the following specification, claims, and appended drawings.
The embodiment(s) will now be described with reference to the following drawings, in which:
The components in the figures are not necessarily to scale, emphasis instead being placed upon illustrating the principles described herein.
For purposes of description herein, the terms “upper,” “lower,” “right,” “left,” “rear,” “front,” “vertical,” “horizontal,” and derivatives thereof shall relate to the rearview assembly as oriented in
The terms “including,” “comprises,” “comprising,” or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. An element preceded by “comprises a . . . ” does not, without more constraints, preclude the existence of additional identical elements in the process, method, article, or apparatus that comprises the element.
As defined herein, “approximately” and “about,” when used in reference to angles, proportions, and the like, may, in some embodiments, mean within plus or minus ten percent of the stated value. In other embodiments, “approximately” and “about,” when used in reference to angles, proportions, and the like, may mean within plus or minus five percent of the stated value. In further embodiments, “approximately” and “about,” when used in reference to angles, proportions, and the like, may mean within plus or minus three percent of the stated value. In yet other embodiments, “approximately” and “about,” when used with reference to angles, proportions, and the like, may mean within plus or minus one percent of the stated value.
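The tiered tolerance bands defined above reduce to simple arithmetic. The following minimal sketch (the function name and structure are illustrative, not part of the disclosure) shows how a measured value can be tested against a stated value under a chosen band:

```python
# Illustrative helper for the tiered meanings of "approximately"/"about":
# a measured value matches a stated value when it falls within the chosen
# tolerance band (plus or minus 10%, 5%, 3%, or 1% of the stated value).

def within_tolerance(stated: float, measured: float, percent: float) -> bool:
    """Return True if `measured` is within plus or minus `percent` of `stated`."""
    band = abs(stated) * percent / 100.0
    return abs(measured - stated) <= band

# A 90-degree angle measured at 94 degrees is "about 90 degrees" under the
# plus-or-minus-5% reading (90 +/- 4.5), but not under the 3% reading (90 +/- 2.7).
```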
Referring to
As shown in
As shown in
The first and second illuminations 22, 23 may have the same wavelength, for example, a wavelength in the range of 800 nm to 1000 nm.
As depicted, the first and second light sources 18, 20 may each include one or more infrared emitter banks 27 that emit the first and second illuminations 22, 23, respectively. Each illumination source (emitter bank) 27 may include a plurality of near IR light-emitting diodes (LEDs) 102 (
A problem with providing sufficient illumination from the infrared emitter banks 27 to the field of view of the imaging system 10 is that conventional reflectors, made of aluminum or the like, may be visible through the reflective element 14 during daylight conditions. This is because any visible light that transmits through the reflective element 14 may be reflected back through the reflective element 14 to the eyes of the viewer by such a reflector for the infrared emitter banks 27. The embodiments described below solve this problem.
A first example of the infrared emitter bank 27 is shown in
The molded portion 106 may be made of any material that is sufficiently opaque to visible light and transmissive to infrared light. As but one example, the molded portion 106 may be made of polycarbonate incorporating an opaque dye such as Epolight™ 7778, 7276A, 7276B, and 7276F visible opaque dyes available from Epolin of Newark, New Jersey. The transmittance of the molded portion 106 when made of polycarbonate incorporating Epolight™ 7778 visible opaque dye is shown in
The reflector 110 may be formed of a layer of aluminum applied to the rear surface 108 of the molded portion 106. The layer of aluminum may be applied via sputter deposition, as a painted coating, or as a machined aluminum component adhered to the rear surface 108.
The front surface 107 and the rear surface 108 may have different shapes. This may be to account for refraction of the infrared light when passing through the molded portion 106. The shapes of the front surface 107 and the rear surface 108 may be selected to provide uniform illumination throughout the field of view while taking into account angles of refraction and reflection caused by the optical element 104.
The following description of the remaining components of imaging system 10 are taken from U.S. patent application Ser. No. 18/109,395, entitled “IMAGING SYSTEM FOR A VEHICLE,” filed on Feb. 14, 2023, and provided to complete the description of the imaging system 10. The entire disclosure of U.S. patent application Ser. No. 18/109,395 is incorporated herein by reference.
An imager 28 may be disposed on an exterior surface of the housing 16 and include a lens 28a for receiving light. As shown in
As shown in
The imager 28 may be configured to capture images of lower seating portions (e.g., proximate to a seat deck). In this way, the field of view 35 of the imager 28 may capture the lower body portion 30 of the occupants 25, such as laps 30a, hands 30b (e.g., driver's hands 30b on the steering wheel 17), torsos 30c, etc., as shown in
The imager 28 may be configured to capture images of occupants over a wide range of heights, for example, from 1300 mm to 2200 mm, from 1500 mm to 2000 mm, from 1600 mm to 1900 mm, etc. The position of the imager 28 may also limit the effect of distortion along edges of the lens 28a of the imager 28. According to some aspects of the present disclosure, the imager 28 is operable to capture a width of the cabin 24. Stated differently, a width of the field of view 35 of the imager 28 may exceed a distance between the front passenger 25b and the driver 25a along the width of the cabin 24. By capturing a wide field of view encompassing a front portion of the cabin 24 (e.g., back portions of driver and passenger seats), the inherent distortion at the edges of the lens 28a may occur beyond the front passenger 25b and the driver 25a (e.g., proximate to front side doors of the vehicle). This may limit poor image quality at adjacent portions of the occupants 25, such as the face; eyes 26a, 26b; torsos 30c; and laps 30a of the occupants 25.
Referring now to
Additionally, the controller 32 may be configured to execute a second algorithm operable to determine a pose of the occupant 25. The controller 32 may be configured to associate the pose with a distracted state of the occupant 25. For example, the controller 32 may execute the second algorithm to classify the occupant's neck arching downwardly and/or the occupant's head tilted forwardly as a “looking down” pose. The controller 32 may associate the looking down pose with a distracted state of the occupant 25 based on historical data, pre-programming, and/or iterative training. Additionally, or alternatively, the controller 32 may execute the second algorithm to classify one or both hands 30b of the occupant 25 on the lap 30a of the occupant 25 as a “hands-off-of-wheel” pose. The controller 32 may associate the hands-off-of-wheel pose with the distracted state of the occupant 25.
The controller 32 may also be operable to execute a third algorithm configured to track motion of one or more body parts of the occupant 25. For example, by employing one of the first and second algorithms in parallel with the third algorithm, the controller 32 may determine the distracted state based on finger movement of one or both hands 30b of the occupant 25 by associating finger movement with mobile device interaction (e.g., typing/texting). The controller 32 may also be operable to detect a mobile device based on light projected from or reflected off of the mobile device via a fourth algorithm. For example, the controller 32 may execute the fourth algorithm to analyze image data generated based on infrared light or visible light to identify pixel data of the image data that corresponds to higher luminosity. It is generally contemplated that the controller 32 may be operable to combine one or more of the algorithms to detect the distracted state.
The controller 32 may be operable to detect and/or categorize any number of distracted states based on a projected level of distraction of the occupant 25. For example, the controller 32 may determine a first distracted state in response to a combination of factors identified by the algorithms or detection routines of the controller 32. For instance, the controller 32 may detect the first distracted state in response to the occupant 25 looking down or assuming a hands-off-of-wheel pose in combination with a moving or typing motion on the mobile device. In such cases, the field of view 35 may provide for image data depicting the mobile device in the lap 30a or lower seating portion as previously described. In some cases, the controller 32 may determine a second distracted state corresponding to the hands-off-of-wheel pose. The controller 32 may then be configured to categorize the first distracted state with a first level of distraction and categorize the second distracted state with a second level of distraction. In the above example, the controller 32 may determine that the first level of distraction is greater than the second level of distraction. The controller 32 may then generate a response signal based on the level of distraction of the occupant 25. The response signal may be operable to control various functions (e.g., the function depending on the level of distraction) in the cabin 24 to alert the occupant 25 of a distracted state.
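The tiered categorization described above can be sketched as a simple decision rule. This is only an illustration of the described logic, not the patent's implementation; the function name, inputs, and level numbering are assumptions:

```python
# Illustrative sketch (not the disclosed implementation) of combining detected
# cues into tiered distracted states: a pose cue plus device motion yields a
# higher level of distraction than the hands-off-of-wheel pose alone.

def distraction_level(looking_down: bool, hands_off_wheel: bool,
                      device_motion: bool) -> int:
    """Return 0 (no distraction detected), 1 (second distracted state),
    or 2 (first distracted state, the higher level of distraction)."""
    if (looking_down or hands_off_wheel) and device_motion:
        return 2  # pose cue combined with moving/typing on a mobile device
    if hands_off_wheel:
        return 1  # hands-off-of-wheel pose alone
    return 0
```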
In some implementations, the controller 32 may be configured to detect an eye glint of the occupant 25 and/or a pupil of the occupant 25 to determine focal direction data corresponding to a focal direction of the eye 26. The controller 32 may then incorporate the focal direction data into any one of the first, second, third, or fourth algorithms to determine one or more distracted states. The eye glint may be identified as the brightest specular reflection of visible or infrared light from an outer surface of a cornea of the eye 26 of the occupant 25. The eye glint may occur at a position of the outer surface of the cornea that overlaps the iris 26c and may be identified by its contrast to surrounding portions of the image. By tracking the focal direction of the occupant 25, the controller 32 may be configured to detect a direction in which the occupant 25 is looking and may identify or infer a level of distraction based on this information, alone or in combination with other factors. The imager 28 may be operated by the controller 32 to adjust an exposure time (e.g., lessen the exposure time) when one or both light sources 18, 20 are operating for the detection of eye glint.
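The glint-identification criterion described above, the brightest specular reflection standing out by contrast from surrounding portions of the image, can be sketched over a grayscale pixel grid. The function and contrast threshold below are illustrative assumptions, not the disclosed implementation:

```python
# Illustrative eye-glint locator: find the brightest pixel in an eye region
# and accept it only if it contrasts strongly with the region's mean
# brightness. The threshold of 50 gray levels is an assumed, tunable value.

def find_eye_glint(eye_region, min_contrast=50):
    """Return (row, col) of the glint candidate, or None if no pixel
    stands out from its surroundings by at least `min_contrast`."""
    values = [v for row in eye_region for v in row]
    brightest = max(values)
    mean = sum(values) / len(values)
    if brightest - mean < min_contrast:
        return None
    for r, row in enumerate(eye_region):
        for c, v in enumerate(row):
            if v == brightest:
                return (r, c)

# A dim 3x3 region with one bright specular spot at its center:
region = [[10, 10, 10],
          [10, 255, 10],
          [10, 10, 10]]
glint = find_eye_glint(region)
```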
By way of example, the controller 32 may incorporate eye-tracking functions to determine whether the driver 25a is looking toward the mobile device and/or away from the front of the vehicle. For example, the controller 32 may be operable to receive image data corresponding to an image of the lower body portion 30 and determine a first condition corresponding to the presence of the mobile device on or near the lower body portion 30. Further, the controller 32 may be operable to determine a second condition corresponding to a focal direction of the eye 26 and determine, based on the first condition and the second condition, the distraction level corresponding to the occupant 25 viewing the mobile device. The second condition may also, or may alternatively, be detected in response to the controller 32 determining the presence of the mobile device based on the image data. For example, the controller 32 may determine that the occupant 25 is looking at and/or interacting with the mobile device.
As shown in
In some implementations, the controller 32 may also be operable to access biometric data and/or driving metric data associated with the driver 25a. The controller 32 may be configured to communicate an instruction to output the biometric data and/or driving data to the display 33 depending on an operating state or gear status (e.g., park, drive, reverse, etc.) of the vehicle. The controller 32 may communicate with a powertrain system controller to determine the operating state of the vehicle. For example, the controller 32 may determine the vehicle is in park and communicate the instruction to the display 33 to indicate the biometric and/or driving metric data. The biometric data may include a duration of the blink of the eyes 26a, 26b, a dilation of the pupils, a body pose of the driver 25a, and/or any other biometric feature that may be used to determine a health status of the driver 25a.
The controller 32 may also be configured to receive vehicle operation data corresponding to speed, acceleration, deceleration, brake force, proximity to other vehicles, etc., and generate the driving metric data based on the vehicle operation data. The vehicle operation data may be communicated to the controller 32 from the powertrain system controller or another on-board vehicle controller that tracks driving operations. The driving metric data may be communicated to the driver 25a via the display 33 in the form of a numerical score (e.g., 8/10), descriptive verbiage (e.g., “Very Good”), or the like.
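The conversion from vehicle operation data to a numerical score or descriptive verbiage can be sketched as follows. The event-rate input, thresholds, and labels are hypothetical, chosen only to mirror the "8/10" and "Very Good" examples above:

```python
# Hypothetical mapping from vehicle operation data to the driving metric data
# described above. The input (harsh events per 100 km) and all thresholds are
# illustrative assumptions, not values from the disclosure.

def driving_metric(events_per_100km: float):
    """Return (score_out_of_10, descriptive_label)."""
    score = max(0, min(10, round(10 - events_per_100km)))
    if score >= 8:
        label = "Very Good"
    elif score >= 5:
        label = "Good"
    else:
        label = "Needs Improvement"
    return score, label
```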
With continued reference to
In some cases, the controller 32 may provide visual notifications indicating an operating state of the imaging system 10. The notifications may be communicated via the display 33 or an indicator 34 configured to output a visual notification indicating an operating state of the imaging system 10. The indicator 34 may be configured as an LED or other light source and is operable by the controller 32 to flash and/or change colors to indicate the operating state of the imaging system 10. In one specific embodiment, the indicator 34 may be configured as an RGB LED operable to indicate the operating state by emitting light expressed in a red color, a green color, a blue color, or any color combination thereof. The controller 32 may be operable to communicate an instruction to control the indicator 34 to indicate a warning condition. In this way, the imaging system 10 may be configured to alert the driver 25a when the controller 32 detects the one or more distracted states.
In order to accurately identify the features of the occupant 25 (e.g., occupant pose, gaze direction, body movements, etc.), the imaging system 10 may adjust an operation based on ambient lighting conditions of the vehicle. For example, the controller 32 may adjust various imaging parameters (e.g., an auto gain threshold) to determine characteristics (e.g., wavelength, intensity, etc.) of the first and/or the second illumination 22, 23. In embodiments where the interior rearview mirror assembly 12 is configured as an electro-optic rearview mirror assembly 12, the controller 32 may additionally, or alternatively, use available feedback mechanisms from a dimming controller to determine the characteristics of the first and/or the second illumination 22, 23. In these embodiments, the transmissive properties of an electro-optic element in the mirror assembly 12 may be configured to filter the first or the second wavelengths corresponding to the first and second illuminations 22, 23. For example, the electro-optic element may be activated to filter out visible light projected from the first light source 18 while allowing infrared light generated by the second light source 20 to pass through the electro-optic element, or vice versa. Additionally, or alternatively, a first electro-optic element overlaying the first light source 18 may be electrically isolated from a second electro-optic element overlaying the second light source 20. In this way, the first electro-optic element may be activated to filter light projected from the first light source 18 while light projected from the second light source 20 remains unfiltered by the second electro-optic element.
In addition to adapting to the lighting conditions of the vehicle, the imaging system 10 may be configured to accept manually entered criteria that may improve the occupant detection algorithms in the controller 32. For example, eye color and/or skin color may aid the controller 32 in determining the characteristics (e.g., intensity, frequency, etc.) of the first and/or the second illumination 22, 23. The criteria may be entered using any available user-input device of the vehicle or a mobile device in communication with the controller 32. By employing one or more of the foregoing features, the imaging system 10 may benefit from improved speed and accuracy with respect to biometric capture and/or user authentication.
To operate the first light source 18 independently from the second light source 20, power signals supplied to the first light source 18 may have different properties than power signals supplied to the second light source 20. For example, with continued reference to
Referring now to
The field of view 35 may have a horizontal field component 35a in the range of approximately 120° to 160° and a similar or different vertical field component 35b. The horizontal field component 35a may be in the range of about 135° to 150°. The horizontal and vertical field components 35a, 35b may approximate a width and a height of the field of view 35, respectively, for a given depth (e.g., a distance from the imager 28 in a direction normal to the center of the lens 28a). For example, at a depth of about 500 mm, the field of view 35 may encompass the width of the cabin 24 and the height of the cabin 24. The depth at which the field of view 35 encompasses the width and/or height of the cabin 24 may be less than 500 mm (e.g., 300 mm) or greater than 500 mm (e.g., 600 mm). Stated differently, the angle of the horizontal field component 35a may allow the imager 28 to be operable to capture the full width of the cabin 24 at the first distance D1 from the imager 28 (e.g., a depth of the field of view 35 of approximately 500 mm). For use strictly as a driver monitoring system (DMS), the field of view 35 may span a narrower range, such as about 60°.
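The relationship between the horizontal field component and the width covered at a given depth follows from basic trigonometry: under a simple pinhole (rectilinear) approximation, width ≈ 2 · depth · tan(HFOV/2). The sketch below applies that approximation to the figures above; real wide-angle lenses distort, so this is an estimate only:

```python
import math

# Approximate width spanned by the field of view at a given depth, assuming
# a simple pinhole (rectilinear) projection; real wide-angle lenses deviate
# from this model, so treat the result as a rough estimate.

def fov_width(depth_mm: float, horizontal_fov_deg: float) -> float:
    return 2.0 * depth_mm * math.tan(math.radians(horizontal_fov_deg) / 2.0)

# At a depth of 500 mm, a 140-degree horizontal field spans roughly 2.7 m,
# comfortably wider than a typical cabin; a 60-degree DMS-style field spans
# under 0.6 m at the same depth.
wide = fov_width(500.0, 140.0)
narrow = fov_width(500.0, 60.0)
```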
The field of view 35 may have a generally arcuate shape (e.g., elliptical) or a squared-off shape (e.g., rectangular). The range of the field of view 35 may be dependent on optical properties of the lens 28a of the imager 28. Stated differently, the size and quality of the lens 28a may determine how expansive the field of view 35 is. By way of example, a field of view 35 having a horizontal field component 35a of 120° and a vertical field component 35b of 120° may have a regular polygonal or circular shape. If the field of view 35 has a horizontal field component 35a of 120° and a vertical field component 35b of 90°, then the field of view 35 may have an irregular polygonal or elliptical shape.
Referring more particularly to
The first portion 36 includes a first horizontal field component 36a and a first vertical field component (not numbered). The second portion 37 includes a second horizontal field component 37a and a second vertical field component (not numbered). The shape of the first portion 36, as defined by its field components (e.g., the first horizontal field component 36a and the first vertical field component), may or may not be proportional to the second portion 37, as defined by its field components (e.g., the second horizontal field component 37a and the second vertical field component).
Still referring to
The second portion 37 may be directed to the same region as the first portion 36 or may be optionally directed to a focal region common to an illuminated region encompassed by a second illumination range Θ2 of the second illumination 23. The second light source 20 may be operable to direct the second illumination 23 toward an area corresponding to the cabin 24 of the vehicle to illuminate driver and passenger compartments 24a, 24b, 24c of the vehicle. The second light source 20 may be configured to produce high-intensity light within the second illumination range Θ2 of, for example, about 120° (e.g., horizontally and/or vertically) in order to illuminate the cabin 24 of the vehicle. The remaining range of the second illumination 23 may be distributed to peripheral areas (e.g., walls, side windows, etc.) of the vehicle. The potential distribution of the second illumination 23 is described later in particular reference to
Referring to
The first portion 36 of the field of view 35 may be employed for identification and/or authentication functions. For example, the controller 32 may operate the imager 28 with the first portion 36 to enable image acquisition of an iris 26c of one or both eyes 26 of the driver 25a. The controller 32 may process image data generated by the imager 28 while operating with the field of view 35 to identify the driver 25a. According to some aspects of the disclosure, the horizontal field component 36a may be approximately 20° and the first vertical field component (not numbered) may be similar or different. As previously illustrated in
For example, changing gears or igniting an engine of the vehicle may be prevented based on a determination that the occupant 25 is not an authorized driver 25a of the vehicle. The controller 32 may determine whether the driver 25a is authorized by processing the image data to identify the driver 25a according to biometric features. To capture the biometric features of the occupant 25, the controller 32 may employ a second identification function, or a driver monitoring function that includes facial recognition, which may activate the second light source 20, without additional illumination, to project the second illumination 23 onto the driver 25a. As discussed herein, the second illumination 23 may be infrared illumination having a wavelength of approximately 940 nm. In some embodiments, when activating only the second light source 20, the controller 32 also may operate the imager 28 with the second portion 37 to enable image acquisition of a face and/or body of the driver 25a.
The controller 32 may process image data generated by the imager 28 while operating with the second portion 37 to monitor the driver 25a. The second portion 37 may be a wide field of view. The second horizontal field component 37a may be approximately 60° and the second vertical field component (not numbered) may be similar or different. As described herein, image data generated by the imager 28 may be shown on the display 33 and the driver 25a may adjust the position of the interior rearview mirror assembly 12 such that the image appearing on the display 33 is properly trained on the necessary biometric feature (e.g., face and/or body) required to monitor the driver 25a. Driver monitoring may include monitoring for sleepiness, inattentiveness, and other driver states. Additionally, the imager 28 may be configured to capture image data in the second portion 37 to provide for an occupancy detection (e.g., passenger occupancy) or detection of various objects in the cabin 24 of the vehicle.
As shown in
According to aspects of the present disclosure, the first illumination range Θ1 and the second illumination range Θ2 each may be formed via a collimating lens associated with each light source 18, 20 configured to generate a focused pattern of light. For example, the intensity of the light emitted from the light sources 18, 20 may vary over the distribution of a light beam pattern (e.g., higher central intensity, gradient distribution, etc.). In this way, a certain percentage of the light energy generated by the light sources 18, 20 may be relegated or controlled to project over a specific range of illumination and corresponding portions of the cabin 24 of the vehicle. By way of example, the distribution of light from the light sources 18, 20 may follow the pattern of a full width at half maximum (FWHM) distribution, where half of the light produced by the light source 18, 20 is within a specific range, and the other half may be distributed in a remaining range, typically with the total range approximating 120°.
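As a worked illustration of the FWHM language above: for a Gaussian intensity profile with angular standard deviation σ, the FWHM is 2√(2 ln 2)·σ ≈ 2.355σ, so a 120° FWHM implies σ of roughly 51°. The sketch below verifies that intensity falls to half of peak at ±60° for such a beam. The Gaussian model itself is an assumption; the disclosure does not specify a beam profile:

```python
import math

# Gaussian beam model (assumed, not specified by the disclosure): relative
# intensity at a given off-axis angle, and the FWHM-to-sigma relation
# FWHM = 2 * sqrt(2 * ln 2) * sigma.

def gaussian_intensity(angle_deg: float, sigma_deg: float) -> float:
    """Intensity relative to peak at `angle_deg` off-axis."""
    return math.exp(-(angle_deg ** 2) / (2.0 * sigma_deg ** 2))

def fwhm(sigma_deg: float) -> float:
    return 2.0 * math.sqrt(2.0 * math.log(2.0)) * sigma_deg

# Sigma for a beam whose full width at half maximum spans 120 degrees:
sigma = 120.0 / (2.0 * math.sqrt(2.0 * math.log(2.0)))  # about 51 degrees
```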
With continued reference to
Referring to
The imaging system 10 may employ particular lighting techniques that allow the controller 32 to identify the eye glint in the event that the occupant 25 dons wearable glasses. When a lens, such as a focal lens provided with wearable glasses, is disposed between one or both light sources 18, 20 and the eyes 26, a light glare spot may form on the wearable glasses. The light glare spot may obstruct the imager 28 from capturing a view of the pupil. However, some light may pass through the lens and reflect off of the eyes 26 to form the eye glint. Due to the presence of the light glare spot and the eye glint simultaneously, detection of the pupil and/or determination of which reflection (e.g., either the light glare spot or the eye glint) is the eye glint may be challenging. By alternating pulses of light from the light sources 18, 20, light reflected off of lenses of wearable glasses may produce different light glare spot locations. A comparison of the light glare spot locations may allow detection of the pupil and/or identification of the pupil, thereby indicating the focal direction of the eye 26.
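The alternating-pulse disambiguation described above can be sketched as a frame-to-frame comparison: glare spots reflected off eyeglass lenses shift between pulses from differently positioned sources, while the corneal glint stays nearly fixed. The coordinates, threshold, and function name below are hypothetical:

```python
import math

# Illustrative sketch of glint/glare disambiguation: compare bright-spot
# locations captured under alternating light sources. A spot that barely
# moves between frames is taken to be the corneal eye glint; spots that
# jump are treated as lens glare. The 3-pixel threshold is an assumption.

def stable_glint(spots_frame_a, spots_frame_b, max_shift=3.0):
    """Return the spot in frame A with a near-stationary counterpart in
    frame B, or None if every spot moved more than `max_shift` pixels."""
    for (ax, ay) in spots_frame_a:
        for (bx, by) in spots_frame_b:
            if math.hypot(ax - bx, ay - by) < max_shift:
                return (ax, ay)
    return None

# Glare at (100, 50) jumps to (120, 55); the glint at (40, 40) barely moves.
glint = stable_glint([(100, 50), (40, 40)], [(120, 55), (41, 40)])
```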
Referring to
As a result of the spacing between the first light source 18 and the imager 28, the first ray of light 40 may form a first angle of incidence δ1 between the first projection component and the first reflection component. The second ray of light 41 may form a second angle of incidence δ2 between the second projection component and the second reflection component. The first and second projection/reflection components may be referred to herein as legs of each corresponding angle of incidence δ1, δ2. Each angle of incidence δ1, δ2 may be in the range of about 5° to 15°, and more particularly in the range of about 8° to 15° at a working distance of approximately 500 mm. The working distance may be in the range of about 300 mm to 700 mm. For example, an occupant 25 having a height of 1.9 meters may be reclined and/or situated at a distance (e.g., 650 mm) greater than a working distance (e.g., 400 mm) of an occupant 25 having a height of 1.5 meters. The second distance D2 may be in the range of about 90 mm to 150 mm. The first distance D1 and the second distance D2 may be operable to define the angle of incidence in the range of about 5° to 15°. It is generally contemplated that the first and second projection components may be within the first illumination range Θ1 and formed by the particular lens properties associated with the first light source 18.
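The angle-of-incidence figures above are consistent with simple triangle geometry: with the light source offset from the imager by roughly the second distance D2 and the eye at the working distance, the angle between the projection and reflection legs is approximately arctan(offset / working distance). A sketch under that approximation (our simplification, not the disclosed optics):

```python
import math

# Simplified geometry: the angle between the illumination leg and the
# reflection leg is approximately atan(source-to-imager offset / working
# distance). This flat-triangle model is our assumption for illustration.

def incidence_angle_deg(offset_mm: float, working_distance_mm: float) -> float:
    return math.degrees(math.atan(offset_mm / working_distance_mm))

# A 120 mm source-to-imager offset at a 500 mm working distance yields
# roughly 13.5 degrees, inside the stated range of about 5 to 15 degrees.
angle = incidence_angle_deg(120.0, 500.0)
```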
As illustrated in
As previously discussed, the third light source 19 may be activated to provide additional light to the cabin 24 and allow for the first and/or second illuminations 22, 23 to dynamically adjust the illumination ranges Θ1, Θ2 based on the additional light. Additional auxiliary light sources 19 may be identified in the cabin 24 by processing an image of the cabin 24, and the controller 32 may operate the light sources 18, 20 to narrow the illumination ranges Θ1, Θ2 due to additional light projected from these auxiliary light sources 19. Stated differently, by providing auxiliary light sources 19 in the cabin 24, the first and second light sources 18, 20 may operate with narrower, more precise high-intensity ranges.
As a result of the spacing between the second light source 20 and the imager 28, the third ray of light 54 may form a third angle of incidence δ3 between the third projection component and the third reflection component. The fourth ray of light 56 may form a fourth angle of incidence δ4 between the fourth projection component and the fourth reflection component. The third and fourth projection/reflection components may be referred to herein as legs of each corresponding angle of incidence δ3, δ4. Each angle of incidence δ3, δ4 may be in the range of about 5° to 15°, and more particularly in the range of about 8° to 15° at a working distance of approximately 500 mm. In other words, the first distance D1 and the second distance D2 may be operable to define the angle of incidence δ3, δ4 in the range of about 5° to 15°. It is generally contemplated that the third and fourth projection components may be within the second illumination range Θ2 and formed by the particular lens properties associated with the second light source 20.
Referring to
Because the first light source 18 and the second light source 20 may each be spaced from the imager 28, the imaging system 10 may be operable to distribute individual thermal energy loads within the housing 16. Further, independent control of each light source 18, 20 may allow the imaging system 10 to prioritize the operation of the first and second light sources 18, 20. For example, if the controller 32 detects an overheat condition of the housing 16, or any other component of the imaging system 10, the controller 32 may be operable to deactivate the second light source 20 and only operate the first light source 18, or vice versa. The controller 32 may be pre-programmed to determine the priority of the first and second light sources 18, 20 in various scenarios, or may be programmed to update a prioritization memory based on repeated use (e.g., iterative training). According to some aspects of the present disclosure, the controller 32 may be operable to receive thermal data corresponding to thermal energy of the mirror assembly 12 and control the first and second light sources 18, 20 based on the thermal data. For example, the controller 32 may be operable to deactivate the second light source 20 based on the thermal data of the mirror assembly 12.
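The prioritization described above may be sketched as a simple control policy. This is an illustrative assumption only: the threshold value, class names, and priority encoding below are hypothetical and are not part of the disclosure:

```python
from dataclasses import dataclass

OVERHEAT_THRESHOLD_C = 70.0  # hypothetical threshold; no value is given in the disclosure

@dataclass
class LightSource:
    name: str
    priority: int        # lower value = higher priority (kept active longer)
    active: bool = True

def apply_thermal_policy(sources: list, housing_temp_c: float) -> list:
    """On an overheat condition, deactivate all but the single
    highest-priority light source; otherwise keep all sources active."""
    if housing_temp_c >= OVERHEAT_THRESHOLD_C:
        keep = min(sources, key=lambda s: s.priority)
        for s in sources:
            s.active = (s is keep)
    else:
        for s in sources:
            s.active = True
    return sources

sources = [LightSource("first_18", priority=0), LightSource("second_20", priority=1)]
apply_thermal_policy(sources, housing_temp_c=75.0)
print([s.active for s in sources])  # prints [True, False]
```

The priority values here stand in for the pre-programmed or iteratively trained prioritization memory described above.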
According to one aspect of the present disclosure, the imaging system 10 may be configured to selectively operate between a left-hand drive mode and a right-hand drive mode. In other words, the imaging system 10 may be considered to be “universal” in operation. For example, the first light source 18 and the second light source 20 may share identical lens properties (e.g., having the same orientation, same light output energy, same pattern, etc.), with the first light source 18 disposed on the first side 21a of the mirror assembly 12 and the second light source 20 disposed on the second side 21b of the mirror assembly 12, opposite the first side 21a. In this way, a mirror assembly 12 manufactured according to this aspect may be incorporated into vehicles with left-hand driving positions and vehicles having right-hand driving positions without change in function.
By way of example, because the imager 28 may be configured to capture images of the cabin 24, including the steering wheel 17, the controller 32 may be configured to determine the position of the steering wheel 17 and determine the configuration of the vehicle (e.g., left-hand or right-hand drive) based on the position of the steering wheel 17. This example is not intended to be limiting, as the controller 32 may additionally or alternatively be configured to determine the configuration of the vehicle based on identification of occupants 25 in the cabin 24, or other vehicle features. Once the configuration of the vehicle is determined, control of the first and second light sources 18, 20 may be optimized. For example, the controller 32 may be configured to prioritize employment of the light source associated with recognition of the driver 25a over employment of the light source associated with the front or rear passenger compartments 24b, 24c of the cabin 24 in the event the mirror assembly 12 meets a thermal threshold. Continuing with this example, the controller 32 may be configured to deactivate either the first light source 18 or the second light source 20 depending on the configuration of the vehicle.
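The configuration-dependent selection described above may be sketched as follows. The side-to-source mapping is a hypothetical assumption chosen for illustration (the disclosure does not specify which side the first light source 18 occupies relative to the driver in either configuration):

```python
def driver_side_source(drive_mode: str) -> str:
    """Map the determined vehicle configuration to the light source
    assumed to cover the driver (illustrative mapping only)."""
    # Assumption: the first light source 18 on the first side 21a faces the
    # driver in a left-hand-drive vehicle, the second light source 20 otherwise.
    mapping = {"left_hand_drive": "first_18", "right_hand_drive": "second_20"}
    return mapping[drive_mode]

def source_to_deactivate_on_overheat(drive_mode: str) -> str:
    """When a thermal threshold is met, shed the light source NOT
    associated with driver recognition."""
    keep = driver_side_source(drive_mode)
    return "second_20" if keep == "first_18" else "first_18"

print(source_to_deactivate_on_overheat("right_hand_drive"))  # prints first_18
```

Because the two sources may share identical lens properties, the same assembly serves both configurations with only this selection logic differing.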
According to one aspect of the invention, a rearview assembly for a vehicle is provided including a reflective element; an infrared illumination source disposed behind the reflective element and configured to emit infrared light through the reflective element to illuminate a field of view with infrared light, the field of view including at least part of a vehicle cabin; and an optical element disposed behind the reflective element and configured to direct infrared light from the infrared illumination source towards the field of view. The optical element including: a molded portion comprised of a visibly opaque material that is substantially transparent to the infrared light, the molded portion having a front surface and a rear surface; and a reflector disposed on the rear surface of the molded portion and configured to substantially reflect infrared light emitted from the illumination source that passes through the molded portion towards the field of view.
According to various aspects, the disclosure may implement one or more of the following features or configurations in various combinations:
According to another aspect of the invention, an imaging system for a vehicle is provided including a reflective element configured to provide a driver of the vehicle with a view rearward relative to the vehicle, the reflective element substantially transparent in an infrared region of the electromagnetic spectrum; at least one imager in connection with the rearview assembly and configured to acquire an image within a field of view of the imager, the field of view including at least part of a vehicle cabin; an infrared illumination source disposed behind the reflective element and configured to emit infrared light through the reflective element to illuminate at least a portion of the field of view of the imager with infrared light; and an optical element disposed behind the reflective element and configured to direct infrared light from the infrared illumination source towards the field of view. The optical element including: a molded portion comprised of a visibly opaque material that is substantially transparent to the infrared light, the molded portion having a front surface and a rear surface; and a reflector disposed on the rear surface of the molded portion and configured to substantially reflect infrared light emitted from the illumination source that passes through the molded portion towards the field of view.
According to various aspects, the disclosure may implement one or more of the following features or configurations in various combinations:
According to another aspect of the invention, a rearview assembly is provided for a vehicle, the rearview assembly including: a reflective element configured to provide a driver of the vehicle with a view rearward relative to the vehicle, the reflective element substantially transparent in an infrared region of the electromagnetic spectrum; at least one imager in connection with the rearview assembly and configured to acquire an image within a field of view of the imager, the field of view including at least part of a vehicle cabin; a plurality of infrared LEDs disposed behind the reflective element and configured to emit infrared light through the reflective element to illuminate at least a portion of the field of view of the imager with infrared light; and an optical element disposed behind the reflective element and configured to direct infrared light from the infrared LEDs towards the field of view. The optical element including: a molded portion comprised of a visibly opaque material that is substantially transparent to the infrared light, the molded portion having a front surface and a rear surface, and a plurality of recessed cups formed in the front surface each with an opening through to the rear surface where a corresponding one of the plurality of LEDs is positioned; and a reflector disposed on the rear surface of the molded portion and configured to substantially reflect infrared light emitted from the infrared LEDs that passes through the molded portion, the reflector directing the infrared light towards the field of view of the imager.
According to various aspects, the disclosure may implement one or more of the following features or configurations in various combinations:
It will be understood that any described processes or steps within described processes may be combined with other disclosed processes or steps to form structures within the scope of the present device. The exemplary structures and processes disclosed herein are for illustrative purposes and are not to be construed as limiting.
It is also to be understood that variations and modifications can be made on the aforementioned structures and methods without departing from the concepts of the present device, and further it is to be understood that such concepts are intended to be covered by the following claims unless these claims by their language expressly state otherwise.
The above description is considered that of the illustrated embodiments only. Modifications of the device will occur to those skilled in the art and to those who make or use the device. Therefore, it is understood that the embodiments shown in the drawings and described above are merely for illustrative purposes and not intended to limit the scope of the device, which is defined by the following claims as interpreted according to the principles of patent law, including the Doctrine of Equivalents.
This application claims priority under 35 U.S.C. § 119(e) to U.S. Provisional Patent Application No. 63/500,406, entitled “REARVIEW ASSEMBLY FOR A VEHICLE HAVING VISIBLY OPAQUE OPTICAL ELEMENT FOR REFLECTING INFRARED LIGHT,” filed on May 5, 2023, by Kasen K. Anderson et al., the entire disclosure of which is incorporated herein by reference.