The present invention relates to vehicular interior mirror assemblies and vision systems that display at a video mirror display video images derived from image data captured by one or more cameras of the vehicle.
It is known to provide a video display at the exterior rearview mirror assembly, such as described in U.S. Pat. No. 7,777,611, which is hereby incorporated herein by reference in its entirety, or to provide a video display at an interior rearview mirror assembly to display sideward and/or rearward images, such as described in U.S. Pat. No. 5,670,935, which is hereby incorporated herein by reference in its entirety.
The present invention provides a vehicular vision system that displays video images at a video display screen of an interior rearview mirror assembly for a camera monitoring system, a rear backup camera system, and/or a surround view vision system. The system includes an electronic control unit (ECU) of the vehicle that receives image data captured by one or more cameras and outputs video image data (such as via a coaxial cable) to the video display of the interior rearview mirror assembly for displaying video images at the display screen based on the driving situation and/or user input by the driver. The video display screen in the interior rearview mirror assembly displays video images that are viewable through the mirror reflective element. The display device includes a stack of layers that reduces or minimizes reflections from the surfaces and interfaces through the stack. This allows the mirror head to display the video images in the same orientation as when the driver views rearward via reflections at the mirror reflective element, and thus may eliminate the need for a toggle mechanism or actuator to tilt the mirror head during the video display mode.
These and other objects, advantages, purposes and features of the present invention will become apparent upon review of the following specification in conjunction with the drawings.
Referring now to the drawings and the illustrative embodiments depicted therein, a vehicle 10 includes an imaging system or vision system 12 that includes multiple exterior viewing cameras, including, for example, surround view cameras 14a-d (including a rearward viewing or rear backup camera 14a, a forward viewing camera 14b at the front of the vehicle and side surround view cameras 14c, 14d at respective sides of the vehicle), and/or camera monitoring system (CMS) cameras 15a-c (including side rearward viewing CMS cameras 15a, 15b at the respective sides of the vehicle, and a rearward viewing camera 15c that has a different field of view than the rear backup camera 14a), which capture image data of the respective scenes exterior of the vehicle and in the field of view of the respective camera, with each camera having a lens for focusing images at or onto an imaging array or imaging plane or imager of the camera.
The vision system 12 includes a control or electronic control unit (ECU) 18 having electronic circuitry and associated software, with the electronic circuitry including a data processor or image processor that is operable to process image data captured by the cameras, whereby the ECU may detect or determine presence of objects or the like and/or the system may provide video images or video image data or outputs or signals to a display device of the interior rearview mirror assembly 20 of the vehicle for displaying video images for viewing by the driver of the vehicle and/or to a display device 22 at the center console or stack of the vehicle (and optionally to CMS displays at or near the driver and passenger side A-pillars of the vehicle, such as described in U.S. Publication Nos. US-2018-0134217 and/or US-2014-0285666 and/or International PCT Application No. PCT/US2022/070062, filed Jan. 6, 2022, which published on Jul. 14, 2022 as International Publication No. WO 2022/150826, which are hereby incorporated herein by reference in their entireties). The data transfer or signal communication from the cameras to the ECU may comprise any suitable data or communication link, such as a vehicle network bus or CAN (Controller Area Network) bus or LIN (Local Interconnect Network) bus or I2C bus or the like of the equipped vehicle.
The ECU receives image data captured by each of the cameras and the image data is processed by the data processor or image processor of the ECU. The ECU is connected to the video display of the mirror assembly 20 via a single coaxial wire or cable for communicating with the display (such as to provide control signals or the like) and for providing video image signals to the display. The ECU is also connected to the video display 22 of the center console via a single coaxial wire or cable for communicating with the display and for providing video image signals to the display. Thus, the ECU can provide video image signals or outputs to the center stack display or head unit 22 and/or to the video mirror display 20.
The connections between the cameras and the ECU and/or between the displays and the ECUs may be made via respective coaxial cables, which may provide power and control of the cameras (by the ECU) and which may provide image data from the cameras to the ECU, and which may provide video images or video image data from the ECU to the display devices. Each device (e.g., camera and display device) is thus connected to and communicates with the ECU via a single respective coaxial cable, thus reducing cable inputs to the video mirror display and the center stack display. The connections and communications may utilize aspects of the systems described in U.S. Pat. Nos. 10,264,219; 9,900,490 and/or 9,609,757, which are hereby incorporated herein by reference in their entireties.
The ECU may selectively or episodically provide video image data or signals to the center stack display or head unit 22 and/or the video mirror display 20 based on vehicle speed. For example, at slower speeds (e.g., during a parking or unparking maneuver), video images (such as surround view images or rearview images or the like) are displayed at the center stack display 22 (where it is safe for the driver to look down toward the center stack display when slowly maneuvering the vehicle), and at higher speeds (such as when the vehicle is driven forward along a road), video images (such as rearview images or CMS images or the like) are displayed at the video mirror display 20 (where it is safe for the driver to view without taking his or her eyes off the road when driving the vehicle at higher speeds).
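By way of illustration only, the speed-based routing described above can be expressed as a simple selection rule. The following minimal sketch in C assumes a hypothetical speed threshold and hypothetical display/view identifiers and function names that are not part of this disclosure:

```c
#include <stdio.h>
#include <stdbool.h>

typedef enum { DISPLAY_CENTER_STACK, DISPLAY_VIDEO_MIRROR } display_t;
typedef enum { VIEW_SURROUND, VIEW_REAR_BACKUP, VIEW_CMS_REAR } view_t;

#define LOW_SPEED_THRESHOLD_KPH 15.0f  /* hypothetical parking-maneuver limit */

/* Choose the target display: reverse gear or low speed routes video to the
 * center stack display; higher forward speed routes it to the video mirror. */
static display_t select_display(float speed_kph, bool reverse_gear)
{
    if (reverse_gear || speed_kph < LOW_SPEED_THRESHOLD_KPH)
        return DISPLAY_CENTER_STACK;
    return DISPLAY_VIDEO_MIRROR;
}

/* Choose which composed view accompanies the selected display. */
static view_t select_view(display_t d, bool reverse_gear)
{
    if (d == DISPLAY_CENTER_STACK)
        return reverse_gear ? VIEW_REAR_BACKUP : VIEW_SURROUND;
    return VIEW_CMS_REAR;  /* rearview / CMS images at the video mirror display */
}

int main(void)
{
    display_t d = select_display(60.0f, false);  /* e.g., forward driving along a road */
    printf("display=%d view=%d\n", (int)d, (int)select_view(d, false));
    return 0;
}
```

In practice, the threshold (and any hysteresis around it) would be calibrated for the particular vehicle program, and the gear and speed inputs would arrive over the vehicle network bus.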
The interior rearview mirror assembly accommodates a video display device disposed behind the mirror reflective element so as to be viewable through a partially reflective and partially visible light transmitting or transflective mirror reflector of the mirror reflective element when the display screen is powered. For example, and such as shown in
The rearward viewing camera 15c of the CMS cameras may also or otherwise function to provide rearward video images for a dual-mode interior rearview video mirror that can switch from a traditional reflection mode to a panoramic live-video display mode, such as by utilizing aspects of the mirror assemblies and systems described in U.S. Pat. Nos. 10,442,360; 10,421,404; 10,166,924 and/or 10,046,706, and/or U.S. Publication Nos. US-2021-0245662; US-2021-0162926; US-2021-0155167; US-2020-0377022; US-2019-0258131; US-2019-0146297; US-2019-0118717; US-2018-0134217; US-2017-0355312 and/or US-2014-0285666, which are all hereby incorporated herein by reference in their entireties. Thus, when the mirror assembly is set to the video display mode (such as via actuation by the driver of a user-actuatable input), the ECU automatically switches to communicate video images derived from image data captured by the rearward viewing camera 15c to the video display screen at the interior rearview mirror.
Optionally, for a pickup truck application (such as shown in
As shown in
Optionally, the mirror head may include a mirror reflective element and display assembly 130.
The VRLC glass assembly disclosed herein provides an integrated cover glass that is part of the stack of layers and thus eliminates the need for an additional glass layer or substrate. For example, and with reference to
Optionally, and such as shown in
As shown in
Optionally, the VRLC mirror assembly and the LCD video display screen may be combined into a single unit or VRLC assembly with integrated LC display and LC mirror and cover glass. For example, and such as shown in
Therefore, the vehicular interior rearview mirror assembly includes an auto-dimming mirror and a display screen disposed behind the auto-dimming mirror and viewable through the auto-dimming mirror when displaying video images. The auto-dimming mirror comprises a variable reflectance stack of layers that reduces or minimizes double images so that the video images can be viewed with the same mirror head orientation as the reflected images are viewed (i.e., the mirror head does not need to mechanically toggle or pivot between a mirror mode orientation and a display mode orientation). By so arranging the stack of layers, duplicate layers and components may be eliminated. Optionally, the cover glass may comprise a prismatic glass with its first surface angled relative to its rear surface to further reduce unwanted reflections. The mirror assembly provides variable reflectivity with a video display behind a VRLC mirror with an integrated cover glass, and optionally with an integrated prismatic cover glass, with the polarizer disposed or applied between the LC layer and the cover glass.
The mirror assembly may operate as part of a vehicle vision system and/or driver or driving assist system and/or object detection system and/or alert system that operates to capture images exterior of the vehicle and that may process the captured image data to display images and to detect objects at or near the vehicle and in the predicted path of the vehicle, such as to assist a driver of the vehicle in maneuvering the vehicle in a forward or rearward direction. The vision system includes an image processor or image processing system that is operable to receive image data from one or more cameras and to provide an output to one or more display devices for displaying video images representative of the captured image data. For example, the vision system may provide a rearview display or a top down or bird's eye or surround view display or the like.
The vision system 12 includes a control or electronic control unit (ECU) 18 having electronic circuitry and associated software, with the electronic circuitry including a data processor or image processor that is operable to process image data captured by the cameras, whereby the ECU may detect or determine presence of objects or the like and/or the system may provide video images to a display device of the interior rearview mirror assembly 20 of the vehicle for viewing by the driver of the vehicle and/or to a display device 22 at the center console or stack of the vehicle (and optionally to CMS displays at or near the driver and passenger side A-pillars of the vehicle, such as described in U.S. Publication Nos. US-2018-0134217 and/or US-2014-0285666, which are hereby incorporated herein by reference in their entireties). The data transfer or signal communication from the cameras to the ECU may comprise any suitable data or communication link, such as a vehicle network bus or CAN (Controller Area Network) bus or LIN (Local Interconnect Network) bus or I2C bus or the like of the equipped vehicle.
The ECU receives image data captured by each of the cameras and the image data is processed by the data processor or image processor of the ECU. The ECU is connected to the video display of the mirror assembly 20 via a single coaxial wire or cable for communicating with the display (such as to provide control signals or the like) and for providing video image signals to the display. The ECU may also be connected to the video display 22 of the center console via a single coaxial wire or cable for communicating with the display and for providing video image signals to the display. Thus, the ECU may provide video images to the center stack display or head unit 22 and/or to the video mirror display 20.
The connections between the cameras and the ECU and/or between the displays and the ECUs may be made via respective coaxial cables, which may provide power and control of the cameras (by the ECU) and which may provide image data from the cameras to the ECU, and which may provide video images from the ECU to the display devices. Each device (e.g., camera and display device) is thus connected to and communicates with the ECU via a single respective coaxial cable, thus reducing cable inputs to the video mirror display and the center stack display. The connections and communications may utilize aspects of the systems described in U.S. Pat. Nos. 10,264,219; 9,900,490 and/or 9,609,757, which are hereby incorporated herein by reference in their entireties.
The surround view display system may utilize aspects of the systems described in U.S. Pat. Nos. 9,446,713; 9,085,261 and/or 6,690,268, and/or U.S. Publication Nos. US-2021-0094473; US-2020-0017143; US-2019-0297233; US-2019-0347825; US-2019-0118860; US-2019-0064831; US-2019-0042864; US-2019-0039649; US-2019-0143895; US-2019-0016264; US-2018-0276839; US-2018-0276838; US-2018-0253608; US-2018-0215382; US-2017-0254873; US-2017-0217372; US-2017-0050672; US-2015-0217693; US-2014-0160276; US-2014-0085472 and/or US-2015-0002670, which are all hereby incorporated herein by reference in their entireties.
Optionally, the system may provide for dimming control of the mirror reflective element of the interior mirror and of the mirror reflective elements of the exterior mirrors via processing of image data captured by one or more of the cameras (such as by utilizing aspects of the vision systems described in U.S. Publication Nos. US-2019-0258131 and/or US-2019-0047475, which are hereby incorporated herein by reference in their entireties). For example, the system may utilize a rearward sensing sensor at the mirror or the rearward viewing camera (at the CHMSL region) to determine glare light rearward of the vehicle when the vehicle is not towing a trailer. However, when the vehicle is towing a trailer that obstructs the rear window view, the trailer blocks the glare or rear sensing sensor at the interior mirror, blocks the rear backup camera view, and blocks the CMS CHMSL camera view. Thus, the presence of the trailer inhibits glare control for the exterior auto-dimming mirrors using the rearward sensing sensor and/or rearward viewing cameras of the vehicle. In such trailering situations, the system may utilize the CMS exterior side cameras for independently determining glare conditions for each respective exterior auto-dimming mirror. Thus, the exterior CMS cameras and the rearward viewing trailer camera can be used to detect glare and to independently determine the level of dimming for the driver-side and passenger-side auto-dimming mirrors.
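As one illustrative possibility, the trailer-dependent selection of the glare-sensing source and the independent dimming of each exterior mirror can be sketched as follows. This minimal C example uses hypothetical sensor names, a hypothetical lux-to-dimming mapping, and an assumed [0, 1] dimming scale, none of which are specified herein:

```c
#include <stdio.h>
#include <stdbool.h>

typedef struct {
    float rear_glare_lux;      /* from interior-mirror glare sensor / CHMSL camera */
    float left_cms_glare_lux;  /* derived from driver-side CMS camera image data */
    float right_cms_glare_lux; /* derived from passenger-side CMS camera image data */
    bool  trailer_present;     /* trailer obstructs the rearward sensors and cameras */
} glare_inputs_t;

/* Map a glare level to a dimming command in [0, 1] (hypothetical scale). */
static float dim_level(float glare_lux)
{
    const float full_dim_lux = 40.0f;  /* hypothetical saturation point */
    float level = glare_lux / full_dim_lux;
    return level > 1.0f ? 1.0f : level;
}

/* Determine dimming for each exterior mirror independently. When towing,
 * fall back to the respective side CMS camera for glare detection. */
static void exterior_mirror_dimming(const glare_inputs_t *in,
                                    float *left_dim, float *right_dim)
{
    if (in->trailer_present) {
        *left_dim  = dim_level(in->left_cms_glare_lux);
        *right_dim = dim_level(in->right_cms_glare_lux);
    } else {
        *left_dim  = dim_level(in->rear_glare_lux);
        *right_dim = dim_level(in->rear_glare_lux);
    }
}

int main(void)
{
    glare_inputs_t in = { 5.0f, 30.0f, 8.0f, true };  /* trailering example */
    float l, r;
    exterior_mirror_dimming(&in, &l, &r);
    printf("driver-side dim %.2f, passenger-side dim %.2f\n", l, r);
    return 0;
}
```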
The microcontroller receives an input from a light sensor, and may control the auto-dimming mirror driver and the LED backlighting responsive to a detected or determined ambient light level at the mirror assembly. The system may also control dimming of the exterior mirrors and control the display intensity of the respective display screens (e.g., at the respective A-pillars) responsive to processing of image data captured by the respective side CMS cameras. The system may also control the intensity of the interior mirror display responsive to processing of image data captured by one or both side CMS cameras or responsive to processing of image data captured by the rearward viewing CHMSL camera or responsive to processing of image data captured by the rear backup camera or responsive to processing of image data captured by the trailer camera. The data used for dimming control may be provided by one or more of the cameras (such as one or more of the CMS cameras) and/or may be provided via the CAN or LIN bus of the vehicle. Optionally, the system may operate to control the intensity of dash lights or other interior vehicle lighting and/or exterior vehicle lighting responsive to processing of image data captured by one or more of the CMS cameras (or surround vision cameras) of the vehicle.
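For illustration only, ambient-light-responsive control of the display backlight intensity and of the auto-dimming function might look like the following minimal C sketch; the lux breakpoints, the PWM range, and the glare criterion are hypothetical assumptions rather than values taken from this disclosure:

```c
#include <stdio.h>
#include <stdint.h>

/* Map ambient lux (from the mirror light sensor or a camera-measured scene
 * brightness) to a backlight PWM duty in [0, 255]. */
static uint8_t backlight_duty(float ambient_lux)
{
    const float night_lux = 10.0f;    /* hypothetical night-time level */
    const float day_lux   = 1000.0f;  /* hypothetical full-daylight level */
    if (ambient_lux <= night_lux) return 25;   /* dim the display at night */
    if (ambient_lux >= day_lux)   return 255;  /* full brightness in daylight */
    float t = (ambient_lux - night_lux) / (day_lux - night_lux);
    return (uint8_t)(25 + t * (255 - 25));     /* linear interpolation between breakpoints */
}

/* Enable mirror auto-dimming only when ambient light is low and rearward
 * glare clearly exceeds the ambient level (hypothetical criterion). */
static int auto_dim_enabled(float ambient_lux, float glare_lux)
{
    return ambient_lux < 10.0f && glare_lux > 2.0f * ambient_lux;
}

int main(void)
{
    printf("duty at 200 lux: %u\n", backlight_duty(200.0f));
    printf("auto-dim at night with glare: %d\n", auto_dim_enabled(5.0f, 50.0f));
    return 0;
}
```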
Thus, the display system of the present invention may provide intelligent dimming to control the display intensity using camera lux information, and may utilize floating pogo pins between the backlighting FPC and the mirror cell (such as by utilizing aspects of the mirror assemblies and electrical connectors described in U.S. Pat. Nos. 10,484,587; 10,466,563; 9,878,669 and/or 9,565,342, which are hereby incorporated herein by reference in their entireties). The video display screen may comprise a 9.6″ custom TFT display with a failsafe/on-screen-display image (displaying a symbol or icon). The mirror assembly includes a human machine interface (HMI) that provides selection among three different view adjustments.
The cameras may comprise any suitable imaging sensor or camera, such as a pixelated imaging array or the like, such as a CMOS imaging array sensor, a CCD sensor or other sensors or the like, such as a camera or sensor of the types disclosed in commonly assigned U.S. Pat. Nos. 7,965,336; 5,550,677; 5,760,962; 6,097,023 and 5,796,094, which are hereby incorporated herein by reference in their entireties. Optionally, the cameras may comprise a stereo imaging camera or the like, such as by utilizing aspects of the imaging systems described in U.S. Pat. Nos. 6,396,397 and/or 5,796,094, which are hereby incorporated herein by reference in their entireties. Optionally, the cameras may comprise an infrared or near infrared light sensitive camera and may be suitable for capturing images in low lighting conditions, and/or the camera may include or be associated with an illumination source (such as an infrared or near-infrared light emitting illumination source that, when actuated to emit infrared or near-infrared light at the side of the vehicle, enhances the camera's performance but is not visible or discernible to the driver of the vehicle), such as by utilizing aspects of the cameras described in U.S. Pat. Nos. 7,965,336; 5,550,677; 5,760,962; 6,097,023 and 5,796,094, which are hereby incorporated herein by reference in their entireties.
The sideward and rearward viewing cameras may be incorporated at the exterior rearview mirror assembly or elsewhere at the vehicle, such as at a side portion of the vehicle, with a sideward and rearward field of view. Optionally, the camera may have a wide angle field of view at the side of the vehicle and/or may have an adjustable field of view and/or may capture images for use in other vision systems, such as for use in a top-down view or bird's-eye view vision system of the vehicle or a surround view vision system at the vehicle, such as by utilizing aspects of the vision systems described in U.S. Pat. Nos. 9,126,525; 9,041,806; 9,900,522; 10,071,687 and/or 9,762,880, and/or U.S. Publication Nos. US-2015-0022664 and/or US-2012-0162427, which are hereby incorporated herein by reference in their entireties.
The mirror assembly may include a camera or sensor or light of a driver monitoring system and/or head and face direction and position tracking system and/or eye tracking system and/or gesture recognition system. Such head and face direction and/or position tracking systems and/or eye tracking systems and/or gesture recognition systems may utilize aspects of the systems described in U.S. Pat. Nos. 10,065,574; 10,017,114; 9,405,120 and/or 7,914,187, and/or U.S. Publication Nos. US-2021-0323473; US-2021-0291739; US-2020-0202151; US-2020-0143560; US-2020-0320320; US-2018-0231976; US-2018-0222414; US-2017-0274906; US-2017-0217367; US-2016-0209647; US-2016-0137126; US-2015-0352953; US-2015-0296135; US-2015-0294169; US-2015-0232030; US-2015-0092042; US-2015-0022664; US-2015-0015710; US-2015-0009010 and/or US-2014-0336876, and/or U.S. patent application Ser. No. 17/650,255, filed Feb. 8, 2022, which published on Aug. 11, 2022 as U.S. Patent Publication No. US-2022-0254132, Ser. No. 17/649,723, filed Feb. 2, 2022, which published on Aug. 4, 2022 as U.S. Patent Publication No. US-2022-0242438, Ser. No. 17/450,721, filed Oct. 13, 2021, now U.S. Pat. No. 11,518,401, and/or U.S. provisional application Ser. No. 63/201,894, filed May 18, 2021, and/or International Application No. PCT/US2022/072238, filed May 11, 2022, which published on Nov. 17, 2022 as International Publication No. WO 2022/241423, and/or International Application No. PCT/US2022/070882, filed Mar. 1, 2022, which published on Sep. 8, 2022 as International Publication No. WO 2022/187805, which are hereby incorporated herein by reference in their entireties.
Optionally, the driver monitoring system may be integrated with a camera monitoring system (CMS) of the vehicle. The integrated vehicle system incorporates multiple inputs, such as from the inward viewing or driver monitoring camera and from the forward or outward viewing camera, as well as from a rearward viewing camera and sideward viewing cameras of the CMS, to provide the driver with unique collision mitigation capabilities based on full vehicle environment and driver awareness state. The image processing and detections and determinations are performed locally within the interior rearview mirror assembly and/or the overhead console region, depending on available space and electrical connections for the particular vehicle application. The CMS cameras and system may utilize aspects of the systems described in U.S. Pat. No. 11,242,008 and/or U.S. Publication Nos. US-2021-0162926; US-2021-0155167; US-2018-0134217 and/or US-2014-0285666, and/or International Application No. PCT/US2022/070062, filed Jan. 6, 2022, which are all hereby incorporated herein by reference in their entireties.
The ECU may receive image data captured by a plurality of cameras of the vehicle, such as by a plurality of surround view system (SVS) cameras and a plurality of camera monitoring system (CMS) cameras and optionally one or more driver monitoring system (DMS) cameras. The ECU may comprise a central or single ECU that processes image data captured by the cameras for a plurality of driving assist functions and may provide display of different video images to a video display screen in the vehicle (such as at an interior rearview mirror assembly or at a central console or the like) for viewing by a driver of the vehicle. The system may utilize aspects of the systems described in U.S. Pat. Nos. 11,242,008; 10,442,360 and/or 10,046,706, and/or U.S. Publication Nos. US-2021-0155167 and/or US-2019-0118717, and/or International Application No. PCT/US2022/070062, filed Jan. 6, 2022, which are all hereby incorporated herein by reference in their entireties.
The DMS camera may be disposed behind and view through the reflective polarizer used for the VRLC mirror, taking advantage of the IR transmittance of the materials. The near-IR light emitter may also be disposed behind and transmit through the reflective polarizer used for the VRLC mirror. The VRLC mirror provides enhanced near-IR transmissive properties that are advantageous with the in-mirror DMS camera and near-IR light emitter. The VRLC mirror may, for example, provide at least 70 percent and preferably at least 80 percent transmittance in the 940 nm band that is used for DMS, both in reflective and transparent states. For example, spectrometer testing of VRLC samples indicates 85% to 95% IR transmittance in the 940 nm band that is used for DMS, both in reflective and transparent states. This is advantageous for both behind-the-glass cameras and near-IR illumination sources.
The mirror assembly may comprise any suitable construction, such as, for example, a mirror assembly with the reflective element being nested in the mirror casing and with a bezel portion that circumscribes a perimeter region of the front surface of the reflective element, or with the mirror casing having a curved or beveled perimeter edge around the reflective element and with no overlap onto the front surface of the reflective element (such as by utilizing aspects of the mirror assemblies described in U.S. Pat. Nos. 7,184,190; 7,274,501; 7,255,451; 7,289,037; 7,360,932; 7,626,749; 8,049,640; 8,277,059 and/or 8,529,108, which are hereby incorporated herein by reference in their entireties) or such as a mirror assembly having a rear substrate of an electro-optic reflective element nested in the mirror casing, and with the front substrate having curved or beveled perimeter edges, or such as a mirror assembly having a prismatic reflective element that is disposed at an outer perimeter edge of the mirror casing and with the prismatic substrate having curved or beveled perimeter edges, such as described in U.S. Pat. Nos. 8,508,831; 8,730,553; 9,598,016 and/or 9,346,403, and/or U.S. Publication Nos. US-2014-0313563 and/or US-2015-0097955, which are hereby incorporated herein by reference in their entireties (and electro-optic and prismatic mirrors of such construction are commercially available from the assignee of this application under the trade name INFINITY™ mirror).
The mirror assembly may include user inputs or actuatable switches or touch sensors or the like for user/driver control of one or more features of the mirror assembly and/or display system. The user inputs or touch sensors may comprise any suitable sensors or inputs, and may utilize aspects of the inputs and sensors described in U.S. Pat. Nos. 9,827,913; 9,598,016; 9,346,403; 8,508,831; 8,730,553; 7,224,324; 7,253,723; 7,255,451 and/or 8,154,418, which are hereby incorporated herein by reference in their entireties.
Optionally, the display may utilize aspects of the displays of the types disclosed in U.S. Pat. Nos. 9,264,672; 9,041,806; 7,855,755; 7,777,611; 7,626,749; 7,581,859; 7,446,924; 7,446,650; 7,370,983; 7,338,177; 7,274,501; 7,255,451; 7,195,381; 7,184,190; 6,690,268; 6,329,925; 5,668,663; 5,530,240 and/or 5,724,187, and/or in U.S. Publication No. US-2006-0050018, which are all hereby incorporated herein by reference in their entireties. The display may be viewable through the reflective element when the display is activated to display information. The display element may be any type of display element, such as a liquid crystal display (LCD) element, a vacuum fluorescent (VF) display element, a light emitting diode (LED) display element, such as an organic light emitting diode (OLED) or an inorganic light emitting diode, an electroluminescent (EL) display element, a video screen display element or backlit thin film transistor (TFT) display element or the like, and may be operable to display various information (as discrete characters, icons or the like, or in a multi-pixel manner) to the driver of the vehicle, such as passenger side inflatable restraint (PSIR) information, tire pressure status, and/or the like.
Changes and modifications to the specifically described embodiments may be carried out without departing from the principles of the present invention, which is intended to be limited only by the scope of the appended claims as interpreted according to the principles of patent law.
The present application claims the filing benefits of U.S. provisional application Ser. No. 63/201,891, filed May 18, 2021, which is hereby incorporated herein by reference in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
5530240 | Larson et al. | Jun 1996 | A |
5550677 | Schofield et al. | Aug 1996 | A |
5567360 | Varaprasad et al. | Oct 1996 | A |
5570127 | Schmidt | Oct 1996 | A |
5668663 | Varaprasad et al. | Sep 1997 | A |
5670935 | Schofield et al. | Sep 1997 | A |
5724187 | Varaprasad et al. | Mar 1998 | A |
5760962 | Schofield et al. | Jun 1998 | A |
5796094 | Schofield et al. | Aug 1998 | A |
5877897 | Schofield et al. | Mar 1999 | A |
6097023 | Schofield et al. | Aug 2000 | A |
6158655 | DeVries, Jr. et al. | Dec 2000 | A |
6329925 | Skiver et al. | Dec 2001 | B1 |
6483438 | DeLine et al. | Nov 2002 | B2 |
6593565 | Heslin et al. | Jul 2003 | B2 |
6627918 | Getz et al. | Sep 2003 | B2 |
6690268 | Schofield et al. | Feb 2004 | B2 |
6703925 | Steffel | Mar 2004 | B2 |
6824281 | Schofield et al. | Nov 2004 | B2 |
7038577 | Pawlicki et al. | May 2006 | B2 |
7184190 | McCabe et al. | Feb 2007 | B2 |
7195381 | Lynam et al. | Mar 2007 | B2 |
7224324 | Quist et al. | May 2007 | B2 |
7249860 | Kulas et al. | Jul 2007 | B2 |
7253723 | Lindahl et al. | Aug 2007 | B2 |
7255451 | McCabe et al. | Aug 2007 | B2 |
7274501 | McCabe et al. | Sep 2007 | B2 |
7289037 | Uken et al. | Oct 2007 | B2 |
7338177 | Lynam | Mar 2008 | B2 |
7360932 | Uken et al. | Apr 2008 | B2 |
7370983 | DeWind et al. | May 2008 | B2 |
7420756 | Lynam | Sep 2008 | B2 |
7446650 | Schofield et al. | Nov 2008 | B2 |
7446924 | Schofield et al. | Nov 2008 | B2 |
7477758 | Piirainen et al. | Jan 2009 | B2 |
7480149 | DeWard et al. | Jan 2009 | B2 |
7581859 | Lynam | Sep 2009 | B2 |
7626749 | Baur et al. | Dec 2009 | B2 |
7720580 | Higgins-Luthman | May 2010 | B2 |
7777611 | Desai | Aug 2010 | B2 |
7855755 | Weller et al. | Dec 2010 | B2 |
7914187 | Higgins-Luthman et al. | Mar 2011 | B2 |
8049640 | Uken et al. | Nov 2011 | B2 |
8258932 | Wahlstrom | Sep 2012 | B2 |
8277059 | McCabe et al. | Oct 2012 | B2 |
8446470 | Lu et al. | May 2013 | B2 |
8451107 | Lu et al. | May 2013 | B2 |
8508831 | De Wind et al. | Aug 2013 | B2 |
8529108 | Uken et al. | Sep 2013 | B2 |
8730553 | De Wind et al. | May 2014 | B2 |
8743203 | Karner et al. | Jun 2014 | B2 |
8876342 | Wimbert et al. | Nov 2014 | B2 |
8922422 | Klar et al. | Dec 2014 | B2 |
9041806 | Baur et al. | May 2015 | B2 |
9085261 | Lu et al. | Jul 2015 | B2 |
9090213 | Lawlor et al. | Jul 2015 | B2 |
9126525 | Lynam et al. | Sep 2015 | B2 |
9174578 | Uken et al. | Nov 2015 | B2 |
9264672 | Lynam | Feb 2016 | B2 |
9346403 | Uken et al. | May 2016 | B2 |
9405120 | Graf et al. | Aug 2016 | B2 |
9446713 | Lu et al. | Sep 2016 | B2 |
9487159 | Achenbach | Nov 2016 | B2 |
9493122 | Krebs | Nov 2016 | B2 |
9565342 | Sauer et al. | Feb 2017 | B2 |
9598016 | Blank et al. | Mar 2017 | B2 |
9609757 | Steigerwald | Mar 2017 | B2 |
9827913 | De Wind et al. | Nov 2017 | B2 |
9878669 | Kendall | Jan 2018 | B2 |
9900490 | Ihlenburg et al. | Feb 2018 | B2 |
10017114 | Bongwald | Jul 2018 | B2 |
10029614 | Larson | Jul 2018 | B2 |
10046706 | Larson et al. | Aug 2018 | B2 |
10065574 | Tiryaki | Sep 2018 | B2 |
10166924 | Baur | Jan 2019 | B2 |
10166926 | Krebs et al. | Jan 2019 | B2 |
10261648 | Uken et al. | Apr 2019 | B2 |
10264219 | Mleczko et al. | Apr 2019 | B2 |
10315573 | Bongwald | Jun 2019 | B2 |
10421404 | Larson et al. | Sep 2019 | B2 |
10442360 | LaCross et al. | Oct 2019 | B2 |
10466563 | Kendall et al. | Nov 2019 | B2 |
10484587 | Conger | Nov 2019 | B2 |
10567633 | Ihlenburg et al. | Feb 2020 | B2 |
10567705 | Ziegenspeck et al. | Feb 2020 | B2 |
10703204 | Hassan et al. | Jul 2020 | B2 |
10922563 | Nix et al. | Feb 2021 | B2 |
10958830 | Koravadi | Mar 2021 | B2 |
11167771 | Caron et al. | Nov 2021 | B2 |
11205083 | Lynam | Dec 2021 | B2 |
11214199 | LaCross et al. | Jan 2022 | B2 |
11240427 | Koravadi | Feb 2022 | B2 |
11242008 | Blank et al. | Feb 2022 | B2 |
11252376 | Ihlenburg | Feb 2022 | B2 |
11341671 | Lu et al. | May 2022 | B2 |
11348374 | Kramer et al. | May 2022 | B2 |
11433906 | Lu | Sep 2022 | B2 |
11465561 | Peterson et al. | Oct 2022 | B2 |
11488399 | Wacquant | Nov 2022 | B2 |
11493918 | Singh | Nov 2022 | B2 |
11518401 | Kulkarni | Dec 2022 | B2 |
11582425 | Liu | Feb 2023 | B2 |
11639134 | Huizen et al. | May 2023 | B1 |
20020005999 | Hutzel et al. | Jan 2002 | A1 |
20060050018 | Hutzel et al. | Mar 2006 | A1 |
20070182528 | Breed et al. | Aug 2007 | A1 |
20080094715 | Schofield et al. | Apr 2008 | A1 |
20080231703 | Nagata et al. | Sep 2008 | A1 |
20090040778 | Takayanagi et al. | Feb 2009 | A1 |
20100085653 | Uken et al. | Apr 2010 | A1 |
20100194890 | Weller | Aug 2010 | A1 |
20110080481 | Bellingham | Apr 2011 | A1 |
20110115615 | Luo et al. | May 2011 | A1 |
20110273659 | Sobecki | Nov 2011 | A1 |
20140022390 | Blank et al. | Jan 2014 | A1 |
20140085472 | Lu et al. | Mar 2014 | A1 |
20140160276 | Pliefke et al. | Jun 2014 | A1 |
20140285666 | O'Connell et al. | Sep 2014 | A1 |
20140293169 | Uken et al. | Oct 2014 | A1 |
20140313563 | Uken et al. | Oct 2014 | A1 |
20140336876 | Gieseke et al. | Nov 2014 | A1 |
20140340516 | Vojtisek et al. | Nov 2014 | A1 |
20150002670 | Bajpai | Jan 2015 | A1 |
20150009010 | Biemer | Jan 2015 | A1 |
20150015710 | Tiryaki | Jan 2015 | A1 |
20150022664 | Pflug et al. | Jan 2015 | A1 |
20150042808 | Pflug | Feb 2015 | A1 |
20150092042 | Fursich | Apr 2015 | A1 |
20150097955 | De Wind et al. | Apr 2015 | A1 |
20150217693 | Pliefke et al. | Aug 2015 | A1 |
20150232030 | Bongwald | Aug 2015 | A1 |
20150294169 | Zhou et al. | Oct 2015 | A1 |
20150296135 | Wacquant et al. | Oct 2015 | A1 |
20150352953 | Koravadi | Dec 2015 | A1 |
20160009226 | Krebs | Jan 2016 | A1 |
20160137126 | Fursich et al. | May 2016 | A1 |
20160209647 | Fursich | Jul 2016 | A1 |
20160375833 | Larson | Dec 2016 | A1 |
20170050672 | Gieseke et al. | Feb 2017 | A1 |
20170217367 | Pflug et al. | Aug 2017 | A1 |
20170217372 | Lu et al. | Aug 2017 | A1 |
20170237946 | Schofield et al. | Aug 2017 | A1 |
20170254873 | Koravadi | Sep 2017 | A1 |
20170274906 | Hassan et al. | Sep 2017 | A1 |
20170355312 | Habibi et al. | Dec 2017 | A1 |
20180134217 | Peterson et al. | May 2018 | A1 |
20180215382 | Gupta et al. | Aug 2018 | A1 |
20180222414 | Ihlenburg et al. | Aug 2018 | A1 |
20180231976 | Singh | Aug 2018 | A1 |
20180253608 | Diessner et al. | Sep 2018 | A1 |
20180276838 | Gupta et al. | Sep 2018 | A1 |
20180276839 | Diessner et al. | Sep 2018 | A1 |
20190016264 | Potnis et al. | Jan 2019 | A1 |
20190039649 | Gieseke et al. | Feb 2019 | A1 |
20190042864 | Pliefke et al. | Feb 2019 | A1 |
20190047475 | Uken et al. | Feb 2019 | A1 |
20190054899 | Hoyos et al. | Feb 2019 | A1 |
20190064831 | Gali et al. | Feb 2019 | A1 |
20190118717 | Blank et al. | Apr 2019 | A1 |
20190118860 | Gali et al. | Apr 2019 | A1 |
20190143895 | Pliefke et al. | May 2019 | A1 |
20190146297 | Lynam et al. | May 2019 | A1 |
20190168669 | Lintz et al. | Jun 2019 | A1 |
20190210615 | Caron et al. | Jul 2019 | A1 |
20190258131 | Lynam et al. | Aug 2019 | A9 |
20190297233 | Gali et al. | Sep 2019 | A1 |
20190347825 | Gupta et al. | Nov 2019 | A1 |
20190364199 | Koravadi | Nov 2019 | A1 |
20190381938 | Hopkins | Dec 2019 | A1 |
20200017143 | Gali | Jan 2020 | A1 |
20200143560 | Lu et al. | May 2020 | A1 |
20200148120 | Englander et al. | May 2020 | A1 |
20200202151 | Wacquant | Jun 2020 | A1 |
20200320320 | Lynam | Oct 2020 | A1 |
20200327323 | Noble | Oct 2020 | A1 |
20200377022 | LaCross et al. | Dec 2020 | A1 |
20210056306 | Hu et al. | Feb 2021 | A1 |
20210094473 | Gali et al. | Apr 2021 | A1 |
20210122404 | Lisseman et al. | Apr 2021 | A1 |
20210155167 | Lynam et al. | May 2021 | A1 |
20210162926 | Lu | Jun 2021 | A1 |
20210245662 | Blank et al. | Aug 2021 | A1 |
20210291739 | Kasarla et al. | Sep 2021 | A1 |
20210306538 | Solar | Sep 2021 | A1 |
20210323473 | Peterson et al. | Oct 2021 | A1 |
20210368082 | Solar | Nov 2021 | A1 |
20220111857 | Kulkarni | Apr 2022 | A1 |
20220242438 | Sobecki et al. | Aug 2022 | A1 |
20220254132 | Rother | Aug 2022 | A1 |
20220377219 | Conger et al. | Nov 2022 | A1 |
20230131471 | Sobecki et al. | Apr 2023 | A1 |
20230137004 | Huizen et al. | May 2023 | A1 |
Number | Date | Country |
---|---|---|
2022150826 | Jul 2022 | WO |
2023034956 | Mar 2023 | WO |
Number | Date | Country | |
---|---|---|---|
20220371513 A1 | Nov 2022 | US |
Number | Date | Country | |
---|---|---|---|
63201891 | May 2021 | US |