The present invention relates generally to a vehicle vision system for a vehicle and, more particularly, to a vehicle vision system that utilizes one or more cameras at a vehicle.
Use of imaging sensors in vehicular trailer assist systems is common and known. Examples of such known systems are described in U.S. Pat. Nos. 9,446,713 and 9,085,261, which are hereby incorporated herein by reference in their entireties.
A vehicular trailer assist system includes a plurality of vehicle cameras disposed at a vehicle equipped with the vehicular trailer assist system. The plurality of vehicle cameras includes (i) a rearward-viewing camera viewing at least rearward of the vehicle, (ii) a left-side vehicle camera disposed at a left side of the equipped vehicle and viewing at least rearward and sideward of the equipped vehicle along the left side of the equipped vehicle, and (iii) a right-side vehicle camera disposed at a right side of the equipped vehicle and viewing at least rearward and sideward of the equipped vehicle along the right side of the equipped vehicle. The vehicle cameras may include respective CMOS imaging arrays having at least one million photosensors arranged in rows and columns. The system includes an electronic control unit (ECU) with electronic circuitry and associated software. The electronic circuitry includes an image processor operable to process image data provided to the ECU. The plurality of vehicle cameras captures image data and image data captured by the plurality of vehicle cameras is provided to the ECU. A trailer camera is disposed at a rear portion of a trailer and views at least rearward of the trailer. With the trailer hitched to the equipped vehicle, the trailer camera captures image data and image data captured by the trailer camera is provided to the ECU. A video display screen is disposed at the equipped vehicle and viewable by a driver of the equipped vehicle. With the trailer hitched to the vehicle, one or more detachably attached cameras, such as an attachable or portable or auxiliary camera that may be selectively attached at the trailer by a user or operator of the vehicle and trailer, may be disposed at the trailer, such as at one or more respective sides of the trailer to view rearward and along the respective sides of the trailer, and the one or more auxiliary cameras may capture image data and communicate the captured image data for display at the vehicle. The auxiliary camera views at least rearward of the trailer and has a field of view that is different from a field of view of the trailer camera. During a reversing maneuver of the equipped vehicle with the trailer hitched to the equipped vehicle, video images are displayed at the video display screen, and the displayed video images are derived from image data provided to the ECU from at least one selected from the group consisting of (i) the trailer camera and (ii) at least one vehicle camera of the plurality of vehicle cameras. With the auxiliary camera detachably attached at an exterior side portion of the trailer during the reversing maneuver of the equipped vehicle with the trailer hitched to the equipped vehicle, and responsive to a user input, video images derived at least in part from image data captured by the auxiliary camera detachably attached at the exterior side portion of the trailer are displayed for viewing by the driver of the equipped vehicle. Thus, the auxiliary cameras provide views along the sides of the trailer and rearward of the trailer to aid in displaying the region rearward and sideward of the trailer to the driver of the vehicle.
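By way of non-limiting illustration, the display-source selection described above may be sketched in simplified form as follows. This is an assumption-laden sketch, not the ECU software of the system, and all class, function and field names are hypothetical:

```python
# Sketch of the display-source selection during a reversing maneuver:
# the displayed video is derived from the trailer camera and/or vehicle
# cameras, and, responsive to a user input, from an auxiliary camera
# detachably attached at an exterior side portion of the trailer.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class CameraFeed:
    name: str     # e.g., "trailer_rear", "vehicle_left_rear", "aux_left_side"
    frame: bytes  # most recently captured frame (placeholder type)


def select_display_feeds(
    reversing: bool,
    trailer_hitched: bool,
    vehicle_feeds: List[CameraFeed],
    trailer_feed: Optional[CameraFeed],
    aux_feeds: List[CameraFeed],
    user_requested_aux: bool,
) -> List[CameraFeed]:
    """Return the feeds whose video images are shown at the display screen."""
    if not reversing:
        return []  # this sketch only covers the reversing maneuver

    selected: List[CameraFeed] = []
    # Default reversing view: trailer rear camera and/or rearward-viewing vehicle cameras.
    if trailer_hitched and trailer_feed is not None:
        selected.append(trailer_feed)
    else:
        selected.extend(f for f in vehicle_feeds if "rear" in f.name)

    # Responsive to a user input, add video derived from the auxiliary camera(s)
    # attached at the side(s) of the trailer.
    if trailer_hitched and user_requested_aux:
        selected.extend(aux_feeds)
    return selected
```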
These and other objects, advantages, purposes and features of the present invention will become apparent upon review of the following specification in conjunction with the drawings.
A vehicle vision system and/or driver or driving assist system and/or object detection system and/or alert system operates to capture images exterior of the vehicle and may process the captured image data to display images and to detect objects at or near the vehicle and in the predicted path of the vehicle, such as to assist a driver of the vehicle in maneuvering the vehicle in a rearward direction. The vision system includes an image processor or image processing system that is operable to receive image data from one or more cameras and provide an output to a display device for displaying images representative of the captured image data. Optionally, the vision system may provide a display, such as a rearview display or a top down or bird's eye or surround view display or the like. As discussed below, the system may display images for viewing by a driver of the vehicle, the images derived from image data captured by one or more vehicle cameras and/or one or more trailer cameras disposed at a trailer hitched to the vehicle equipped with the vision system. The system further displays images captured by one or more selectively and detachably attached cameras, the one or more portable or attachable cameras disposed, for example, at respective sides of the trailer to increase the field of view of the driver when maneuvering the vehicle and trailer in a rearward direction.
Referring now to the drawings and the illustrative embodiments depicted therein, a vehicle 10 includes an imaging system or vision system 12 that includes at least one exterior viewing imaging sensor or camera, such as a rearward viewing imaging sensor or camera 14a (and the system may optionally include multiple exterior viewing imaging sensors or cameras, such as a forward viewing camera 14b at the front (or at the windshield) of the vehicle, and a sideward/rearward viewing camera 14c, 14d at respective sides of the vehicle), which captures images exterior of the vehicle, with the camera having a lens for focusing images at or onto an imaging array or imaging plane or imager of the camera.
As shown in
The vision system 12 provides sideward fields of view 24 that are rearward of the vehicle 10 and along the respective sides of the vehicle 10. The sideward fields of view 24 may be provided by one or more of the exterior rearview mirrors of the vehicle, the sideward/rearward cameras 14c, 14d, and/or the side rearward CMS cameras 15a, 15d.
As shown in
Thus,
As shown in
The system 12 may communicate with and receive images from one or more attachable cameras 30. The driver may position one attachable camera 30 as needed (such as at the side of the trailer that will align with the dock) or the driver may position multiple cameras 30 to provide multiple distinct views. For example, attachable cameras 30 may be positioned at each side of the trailer 22 to view obstacles 13 (e.g., docks at a boat launch ramp) along both sides of the trailer 22.
Optionally, attachable cameras 30 may be disposed along the side of the trailer 22 so that the respective fields of view 32 of the attachable cameras 30 and the sideward fields of view 24 of the side-viewing cameras and/or exterior rearview mirrors at least partially overlap. For example, and such as shown in
Optionally, the one or more attachable cameras 30 may be disposed at positions of the vehicle 10 with fields of view exterior of the vehicle, such as along a roofline of the vehicle and viewing forward or rearward of the vehicle or along a bed rail of a truck bed of the vehicle and viewing rearward (and optionally along a respective side) of the vehicle.
The system may selectively display images from one or more of the attachable cameras 30. For example, the system may display images from attachable cameras having a wider field of view 32 (e.g., the attachable cameras at the central portion of the trailer) when no obstacle is detected in close proximity to the trailer, and the system may display images from attachable cameras 30 having a specific field of view (e.g., one or more of the attachable cameras at the rear portion of the trailer) when an obstacle is detected in the field of view of the camera or in close proximity to a location corresponding to the attachable camera 30.
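By way of non-limiting illustration, the selection between a wider-view attachable camera and a more specifically aimed attachable camera may proceed along the following lines; the proximity threshold, the dictionary fields and the notion of a camera "location" are assumptions made only for this sketch:

```python
# Sketch of choosing which attachable camera's images to display based on
# detected obstacles. Values and field names are hypothetical.
CLOSE_PROXIMITY_M = 2.0  # assumed threshold for "close proximity"


def choose_attachable_camera(cameras, obstacles):
    """cameras: list of {"id", "location", "wide_view": bool}
    obstacles: list of {"location", "distance_m"}"""
    # If an obstacle is in close proximity to a location corresponding to an
    # attachable camera, display that camera's more specific view.
    for obs in obstacles:
        if obs["distance_m"] <= CLOSE_PROXIMITY_M:
            for cam in cameras:
                if cam["location"] == obs["location"] and not cam["wide_view"]:
                    return cam["id"]
    # Otherwise display a camera having a wider field of view (e.g., at the
    # central portion of the trailer).
    for cam in cameras:
        if cam["wide_view"]:
            return cam["id"]
    return None
```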
As shown in
In other words, the remote camera may be more useful when mounted closer to the side of the trailer, to better show alignment of the trailer with the nearby dock. It may also be more useful when placed farther forward, to reference the trailer frame (i.e., include at least a portion of the trailer in the field of view). Two cameras may be included to allow for docks on each side of the trailer. Thus, one or more cameras may be placed on the sides of the trailer to provide a trailer-referenced view for alignment of the trailer while reversing.
Thus, the attachable cameras 30 provide additional images for viewing by a driver of the vehicle to assist in maneuvering the vehicle (and optionally trailer) in difficult-to-maneuver situations, such as during blind-side reversing of the vehicle and trailer, or to ensure the trailer does not come into contact with obstacles such as docks, garages, landscaping, etc.
The attachable cameras 30 may be selectively attached at any position of the vehicle 10 or trailer 22 hitched to the vehicle. The attachable cameras 30 may be self-mountable, meaning that the attachable camera 30 comprises a mounting portion for attaching to a surface of the vehicle 10 or trailer 22. For example, the mounting portion may include a suction cup, vise clamp, magnetic element, or any suitable fastener. Optionally, the attachable cameras 30 may be received at mounting elements attached at the vehicle or trailer such that the attachable camera 30 mounts to the vehicle or trailer when received at the mounting element.
Images provided by the one or more attachable cameras 30 may be viewed by the driver of the vehicle at any suitable display device within the vehicle, such as at the interior rearview mirror display, center stack display, at display devices disposed at the mirror reflective elements of the exterior rearview mirror assembly, or at a mobile device. Video images derived from the images or image data captured by the attachable cameras 30 may be provided for display to the driver of the vehicle at the display device 16 for viewing with the video images from the one or more vehicle cameras 14, 15 and/or the trailer camera 26. Optionally, the images from the attachable cameras may be viewable at a separate and distinct display device from the display screen that displays the images derived from outputs of the vehicle cameras and trailer camera. For example, the images captured by the vehicle cameras and/or trailer camera may be provided at the center console display device, while the images captured by the attachable cameras may be provided at the interior rearview mirror display device. The images may be transmitted directly to the display, such as via BLUETOOTH® or other wireless or wired communication means, or the images may be received at the ECU 18 and transmitted to the display via the ECU 18 at the vehicle 10.
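By way of non-limiting illustration, the two routing options described above (direct transmission to the display, or relay through the ECU) may be sketched as follows; the transport objects and their send/show/receive methods are hypothetical placeholders rather than defined interfaces of the system:

```python
# Sketch of routing attachable-camera frames either directly to a display
# device or via the vehicle ECU. Object interfaces are assumed.
class DirectRoute:
    """Frames go straight to the display, e.g., over a wireless or wired link."""

    def __init__(self, display):
        self.display = display

    def send(self, frame):
        self.display.show(frame)


class ECURoute:
    """Frames are received at the ECU and then forwarded to the display."""

    def __init__(self, ecu, display):
        self.ecu = ecu
        self.display = display

    def send(self, frame):
        processed = self.ecu.receive(frame)  # ECU may pass through or process
        self.display.show(processed)
```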
Optionally, the images provided by the one or more attachable cameras 30 may be viewed by the driver at a mobile device, such as a smart phone, tablet computer, or the like. The system may provide the images from the attachable camera 30 at the mobile device while providing other images (such as from the trailer camera 26) at the display device 16 of the vehicle 10. Thus, the driver of the vehicle may view a first set of images from one or more cameras of the vehicle and/or trailer at the display device 16 of the vehicle and the driver may view a second set of images from the attachable cameras 30 at a second display device such as the mobile device. This may allow the driver to more easily view and manipulate (e.g., zoom in/out, pan, switch between images from multiple attachable cameras) the images from the attachable cameras. The mobile device may provide an interface for interacting with the attachable cameras, such as via a mobile application. For example, via use of the mobile application at the mobile device, the driver may view the images and provide commands to the attachable cameras 30, such as instructing the attachable cameras to power on/off or to begin capturing image data. Optionally, the attachable cameras 30 may be configured to adjust their respective fields of view responsive to user inputs, such as to pan or zoom or pivot, such as responsive to a user input at the user's smart phone or the like. The image data captured by the attachable cameras 30 may be transmitted (e.g., wirelessly, such as via a short range communication protocol, such as BLUETOOTH®) directly to the mobile device or the image data may be transmitted to a remote server in communication with the attachable cameras 30. The image data may be processed at the remote server to provide video images derived from the captured image data to the mobile device. Thus, the image data captured by the attachable cameras 30 may not be processed at the ECU 18 of the vehicle.
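By way of non-limiting illustration, a mobile-application command interface of the kind described above might resemble the following sketch; the command set, JSON encoding and link object are assumptions and do not represent an actual protocol of the attachable cameras:

```python
# Sketch of a mobile-app side client that sends simple commands to an
# attachable camera (power on/off, begin capture, pan/zoom). Hypothetical.
import json


class AttachableCameraClient:
    def __init__(self, link):
        self.link = link  # any object providing send(bytes), e.g., a wireless socket

    def _send(self, command, **params):
        self.link.send(json.dumps({"cmd": command, **params}).encode())

    def power(self, on: bool):
        self._send("power", on=on)

    def start_capture(self):
        self._send("start_capture")

    def adjust_view(self, pan_deg: float = 0.0, zoom: float = 1.0):
        # e.g., responsive to a user input at the user's smart phone
        self._send("adjust_view", pan_deg=pan_deg, zoom=zoom)
```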
The images from the attachable cameras 30 may be provided automatically (such as responsive to the cameras 30 being disposed at the vehicle or trailer) or the attachable cameras 30 may provide images responsive to a user actuatable input, such as a button or actuatable input disposed in the interior cabin of the vehicle. The images from the two side attachable cameras may be displayed as a split screen display (optionally with a demarcation between the two images). Optionally, images derived from an output of one of the attachable trailer cameras may be displayed or enlarged or highlighted responsive to a determination that the trailer is approaching or within a threshold distance of the obstacle or dock to alert the driver of a potential misalignment of the trailer at the ramp and dock.
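By way of non-limiting illustration, the split-screen display with a demarcation and the threshold-distance highlighting may be sketched as follows; the frame sizes, border thickness, highlight color and threshold value are illustrative assumptions only:

```python
# Sketch: compose the two side attachable-camera images side by side with a
# demarcation, and highlight an image when the trailer is within a threshold
# distance of the obstacle or dock.
import numpy as np

THRESHOLD_M = 1.5  # assumed threshold distance


def split_screen(left_img, right_img, left_dist_m=None, right_dist_m=None):
    """left_img/right_img: HxWx3 uint8 arrays of equal size."""

    def highlight(img, dist):
        if dist is not None and dist <= THRESHOLD_M:
            img = img.copy()
            # draw an 8-pixel colored border to highlight the image
            img[:8, :] = img[-8:, :] = img[:, :8] = img[:, -8:] = (255, 0, 0)
        return img

    left = highlight(left_img, left_dist_m)
    right = highlight(right_img, right_dist_m)
    divider = np.zeros((left.shape[0], 4, 3), dtype=np.uint8)  # demarcation strip
    return np.hstack([left, divider, right])
```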
The attachable cameras 30 may receive power from the vehicle, such as by connection to a power source of the vehicle via the wiring harness of the vehicle or via a power outlet at the trailer connected to the wiring harness of the vehicle, or the attachable cameras 30 may include a power supply, such as a rechargeable battery. Optionally, the attachable camera may include means for recharging the power supply, such as a solar panel or mechanical energy harvesting system. Thus, the attachable cameras 30 may be self-contained units that may be disposed at the trailer or vehicle or at any position within communicable range of the system.
Optionally, the system 12 may include a trailering assist function that is operable to assist in maneuvering (such as backing up or reversing) the vehicle 10 with the trailer 22 hitched to the vehicle 10. The ECU 18 may receive image data captured by the rear trailer camera 26 (and optionally other trailer cameras if the trailer has sideward-viewing or other rearward-viewing or interior-viewing cameras). The ECU 18, via processing of the received image data captured by the rear trailer camera 26, may detect objects or the like and/or may generate a video image output to display video images of the scene rearward of the trailer 22 at the display 16 for viewing by the driver of the vehicle 10. Optionally, the system 12 may include or communicate with multiple trailer cameras (such as sideward-viewing cameras and a forward-viewing camera) to provide a surround view display of areas around the trailer 22 as well as the vehicle 10, such as by utilizing aspects of the systems described in U.S. Publication No. US-2021-0094473, which is hereby incorporated herein by reference in its entirety. The system thus may display a 360 degree bird's eye view or surround view of the surroundings of the towing vehicle 10 and the trailer 22 being towed by the vehicle.
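By way of non-limiting illustration, and assuming for simplicity that each camera's image has already been projected onto a common top-down ground plane, assembling a bird's eye or surround view composite from the vehicle and trailer cameras may be sketched as follows; the canvas size and tile offsets are hypothetical:

```python
# Sketch: paste per-camera top-down projections around the towing vehicle and
# the towed trailer to form a bird's eye composite. Real systems warp and
# blend the images; here tiles are assumed pre-warped and are simply placed.
import numpy as np


def compose_birds_eye(tiles, canvas_shape=(800, 400, 3)):
    """tiles: list of (top_down_tile HxWx3 uint8, (row_offset, col_offset))."""
    canvas = np.zeros(canvas_shape, dtype=np.uint8)
    for tile, (r, c) in tiles:
        h = min(tile.shape[0], canvas_shape[0] - r)
        w = min(tile.shape[1], canvas_shape[1] - c)
        # later tiles overwrite earlier ones where they overlap
        canvas[r:r + h, c:c + w] = tile[:h, :w]
    return canvas
```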
The system 12 may provide the trailering assist function without use of the images captured by the attachable cameras 30. In other words, the trailering assist function may provide images to the driver of the vehicle and the images captured by the attachable cameras 30 may be provided to the driver separately from the trailering assist images. For example, the images captured by the attachable cameras 30 may be provided to the driver at separate displays or at the same display and distinctly separated from the images provided by the trailering assist function, such as in a picture-in-picture arrangement.
Optionally, the system 12 may provide the trailering assist function using the images captured by the attachable cameras 30. Thus, image data captured by the attachable cameras 30 may be processed at the ECU 18 for detecting objects near the vehicle and/or for providing enhanced or surround view display.
The trailer assist system or trailer surround view display system may utilize aspects of the systems described in U.S. Pat. Nos. 9,446,713; 9,085,261 and/or 6,690,268, and/or U.S. Publication Nos. US-2020-0017143; US-2019-0297233; US-2019-0347825; US-2019-0118860; US-2019-0064831; US-2019-0042864; US-2019-0039649; US-2019-0143895; US-2019-0016264; US-2018-0276839; US-2018-0276838; US-2018-0253608; US-2018-0215382; US-2017-0254873; US-2017-0217372; US-2017-0050672; US-2015-0217693; US-2014-0160276; US-2014-0085472 and/or US-2015-0002670, which are all hereby incorporated herein by reference in their entireties.
The one or more cameras or sensors may comprise any suitable camera or sensor. Optionally, the camera may comprise a “smart camera” that includes the imaging sensor array and associated circuitry and image processing circuitry and electrical connectors and the like as part of a camera module, such as by utilizing aspects of the vision systems described in U.S. Pat. Nos. 10,099,614 and/or 10,071,687, which are hereby incorporated herein by reference in their entireties.
The system includes an image processor operable to process image data captured by the camera or cameras, such as for detecting objects or other vehicles or pedestrians or the like in the field of view of one or more of the cameras. For example, the image processor may comprise an image processing chip selected from the EYEQ family of image processing chips available from Mobileye Vision Technologies Ltd. of Jerusalem, Israel, and may include object detection software (such as the types described in U.S. Pat. Nos. 7,855,755; 7,720,580 and/or 7,038,577, which are hereby incorporated herein by reference in their entireties), and may analyze image data to detect vehicles and/or other objects. Responsive to such image processing, and when an object or other vehicle is detected, the system may generate an alert to the driver of the vehicle and/or may generate an overlay at the displayed image to highlight or enhance display of the detected object or vehicle, in order to enhance the driver's awareness of the detected object or vehicle or hazardous condition during a driving maneuver of the equipped vehicle.
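By way of non-limiting illustration, the alert-and-overlay response to a detection may be sketched as follows; the detector itself is outside the sketch, and the box format, border thickness, overlay color and alert callable are assumptions:

```python
# Sketch: when objects are detected in the captured image data, generate an
# alert and draw a simple rectangular overlay to highlight each detection.
import numpy as np


def highlight_detections(frame, detections, alert):
    """frame: HxWx3 uint8 image; detections: list of (x, y, w, h) boxes lying
    within the frame; alert: callable invoked when anything is detected."""
    out = frame.copy()
    if detections:
        alert("object detected in the camera field of view")  # e.g., chime or icon
    for (x, y, w, h) in detections:
        # 2-pixel border around the detected object
        out[y:y + 2, x:x + w] = out[y + h - 2:y + h, x:x + w] = (255, 255, 0)
        out[y:y + h, x:x + 2] = out[y:y + h, x + w - 2:x + w] = (255, 255, 0)
    return out
```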
The vehicle may include any type of sensor or sensors, such as imaging sensors or radar sensors or lidar sensors or ultrasonic sensors or the like. The imaging sensor or camera may capture image data for image processing and may comprise any suitable camera or sensing device, such as, for example, a two dimensional array of a plurality of photosensor elements arranged in at least 640 columns and 480 rows (at least a 640×480 imaging array, such as a megapixel imaging array or the like), with a respective lens focusing images onto respective portions of the array. The photosensor array of any of the cameras (including the vehicle cameras, the trailer camera or cameras, and/or the auxiliary camera or cameras) may comprise a plurality of photosensor elements arranged in a photosensor array having rows and columns. The imaging array may comprise a CMOS imaging array having at least 300,000 photosensor elements or pixels, preferably at least 500,000 photosensor elements or pixels and more preferably at least one million photosensor elements or pixels arranged in rows and columns. The imaging array may capture color image data, such as via spectral filtering at the array, such as via an RGB (red, green and blue) filter or via a red/red complement filter or such as via an RCC (red, clear, clear) filter or the like. The logic and control circuit of the imaging sensor may function in any known manner, and the image processing and algorithmic processing may comprise any suitable means for processing the images and/or image data.
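As a quick worked check of the photosensor counts mentioned above (illustrative only; the 1280x800 resolution is merely an assumed example of a megapixel array):

```python
# 640x480 exceeds 300,000 photosensor elements; an example megapixel array
# (1280x800) exceeds one million.
for cols, rows in [(640, 480), (1280, 800)]:
    print(f"{cols}x{rows} -> {cols * rows:,} photosensor elements")
# 640x480 -> 307,200 photosensor elements
# 1280x800 -> 1,024,000 photosensor elements
```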
For example, the vision system and/or processing and/or camera and/or circuitry may utilize aspects described in U.S. Pat. Nos. 9,233,641; 9,146,898; 9,174,574; 9,090,234; 9,077,098; 8,818,042; 8,886,401; 9,077,962; 9,068,390; 9,140,789; 9,092,986; 9,205,776; 8,917,169; 8,694,224; 7,005,974; 5,760,962; 5,877,897; 5,796,094; 5,949,331; 6,222,447; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978; 7,859,565; 5,550,677; 5,670,935; 6,636,258; 7,145,519; 7,161,616; 7,230,640; 7,248,283; 7,295,229; 7,301,466; 7,592,928; 7,881,496; 7,720,580; 7,038,577; 6,882,287; 5,929,786 and/or 5,786,772, and/or U.S. Publication Nos. US-2014-0340510; US-2014-0313339; US-2014-0347486; US-2014-0320658; US-2014-0336876; US-2014-0307095; US-2014-0327774; US-2014-0327772; US-2014-0320636; US-2014-0293057; US-2014-0309884; US-2014-0226012; US-2014-0293042; US-2014-0218535; US-2014-0247354; US-2014-0247355; US-2014-0247352; US-2014-0232869; US-2014-0211009; US-2014-0160276; US-2014-0168437; US-2014-0168415; US-2014-0160291; US-2014-0152825; US-2014-0139676; US-2014-0138140; US-2014-0104426; US-2014-0098229; US-2014-0085472; US-2014-0067206; US-2014-0049646; US-2014-0052340; US-2014-0025240; US-2014-0028852; US-2014-0005907; US-2013-0314503; US-2013-0298866; US-2013-0222593; US-2013-0300869; US-2013-0278769; US-2013-0258077; US-2013-0242099; US-2013-0215271; US-2013-0141578 and/or US-2013-0002873, which are all hereby incorporated herein by reference in their entireties. The system may communicate with other communication systems via any suitable means, such as by utilizing aspects of the systems described in U.S. Pat. Nos. 10,071,687; 9,900,490; 9,126,525 and/or 9,036,026, which are hereby incorporated herein by reference in their entireties.
The system may utilize aspects of the trailering assist systems or trailer angle detection systems or trailer hitch assist systems described in U.S. Pat. Nos. 10,755,110; 10,733,757; 10,706,291; 10,638,025; 10,586,119; 10,552,976; 10,532,698; 10,160,382; 10,086,870; 9,558,409; 9,446,713; 9,085,261 and/or 6,690,268, and/or U.S. Publication Nos. US-2022-0189052; US-2022-0028111; US-2022-0027644; US-2022-0024391; US-2021-0170947; US-2021-0170820; US-2021-0078634; US-2020-0406967; US-2020-0361397; US-2020-0356788; US-2020-0334475; US-2020-0017143; US-2019-0347825; US-2019-0118860; US-2019-0064831; US-2018-0276838; US-2018-0215382; US-2017-0254873; US-2017-0217372 and/or US-2015-0002670, and/or International Publication No. WO 2021/0127693, which are all hereby incorporated herein by reference in their entireties.
The ECU may receive image data captured by a plurality of cameras of the vehicle, such as by a plurality of surround view system (SVS) cameras and a plurality of camera monitoring system (CMS) cameras and optionally one or more driver monitoring system (DMS) cameras. The ECU may comprise a central or single ECU that processes image data captured by the cameras for a plurality of driving assist functions and may provide display of different video images to a video display screen in the vehicle (such as at an interior rearview mirror assembly or at a central console or the like) for viewing by a driver of the vehicle. The system may utilize aspects of the systems described in U.S. Pat. Nos. 10,442,360 and/or 10,046,706, and/or U.S. Publication Nos. US-2021-0245662; US-2021-0162926; US-2021-0155167 and/or US-2019-0118717, and/or International Publication No. WO 2022/150826, which are all hereby incorporated herein by reference in their entireties.
The vision system includes a display for displaying images captured by one or more of the imaging sensors for viewing by the driver of the vehicle while the driver is normally operating the vehicle. Optionally, for example, the vision system may include a video display device, such as by utilizing aspects of the video display systems described in U.S. Pat. Nos. 5,530,240; 6,329,925; 7,855,755; 7,626,749; 7,581,859; 7,446,650; 7,338,177; 7,274,501; 7,255,451; 7,195,381; 7,184,190; 5,668,663; 5,724,187; 6,690,268; 7,370,983; 7,329,013; 7,308,341; 7,289,037; 7,249,860; 7,004,593; 4,546,551; 5,699,044; 4,953,305; 5,576,687; 5,632,092; 5,708,410; 5,737,226; 5,802,727; 5,878,370; 6,087,953; 6,173,501; 6,222,460; 6,513,252 and/or 6,642,851, and/or U.S. Publication Nos. US-2014-0022390; US-2012-0162427; US-2006-0050018 and/or US-2006-0061008, which are all hereby incorporated herein by reference in their entireties.
Optionally, the display may be viewable through a reflective element of a mirror assembly when the display is activated to display information. Optionally, the display element may comprise any type of display element, such as a vacuum fluorescent (VF) display element, a light emitting diode (LED) display element, such as an organic light emitting diode (OLED) or an inorganic light emitting diode, an electroluminescent (EL) display element, a liquid crystal display (LCD) element, a video screen display element or backlit thin film transistor (TFT) display element or the like, and may be operable to display various information (as discrete characters, icons or the like, or in a multi-pixel manner) to the driver of the vehicle, such as passenger side inflatable restraint (PSIR) information, tire pressure status, and/or the like.
The display device may be disposed at or incorporated in an interior mirror assembly, and the interior mirror assembly may comprise a dual-mode interior rearview video mirror that can switch from a traditional reflection mode to a live-video display mode, such as by utilizing aspects of the mirror assemblies and systems described in U.S. Pat. Nos. 11,465,561; 10,442,360; 10,421,404; 10,166,924 and/or 10,046,706, and/or U.S. Publication Nos. US-2021-0245662; US-2021-0162926; US-2021-0155167; US-2021-0094473; US-2020-0377022; US-2019-0258131; US-2019-0146297; US-2019-0118717; US-2019-0047475 and/or US-2017-0355312, which are all hereby incorporated herein by reference in their entireties.
Changes and modifications in the specifically described embodiments can be carried out without departing from the principles of the invention, which is intended to be limited only by the scope of the appended claims, as interpreted according to the principles of patent law including the doctrine of equivalents.
The present application claims the filing benefits of U.S. provisional application Ser. No. 63/267,329, filed Jan. 31, 2022, which is hereby incorporated herein by reference in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
4546551 | Franks | Oct 1985 | A |
4953305 | Van Lente et al. | Sep 1990 | A |
5530240 | Larson et al. | Jun 1996 | A |
5550677 | Schofield et al. | Aug 1996 | A |
5576687 | Blank et al. | Nov 1996 | A |
5632092 | Blank et al. | May 1997 | A |
5668663 | Varaprasad et al. | Sep 1997 | A |
5670935 | Schofield et al. | Sep 1997 | A |
5699044 | Van Lente et al. | Dec 1997 | A |
5708410 | Blank et al. | Jan 1998 | A |
5724187 | Varaprasad et al. | Mar 1998 | A |
5737226 | Olson et al. | Apr 1998 | A |
5760962 | Schofield et al. | Jun 1998 | A |
5786772 | Schofield et al. | Jul 1998 | A |
5796094 | Schofield et al. | Aug 1998 | A |
5802727 | Blank et al. | Sep 1998 | A |
5877897 | Schofield et al. | Mar 1999 | A |
5878370 | Olson | Mar 1999 | A |
5929786 | Schofield et al. | Jul 1999 | A |
5949331 | Schofield et al. | Sep 1999 | A |
6087953 | DeLine et al. | Jul 2000 | A |
6173501 | Blank et al. | Jan 2001 | B1 |
6201642 | Bos | Mar 2001 | B1 |
6222447 | Schofield et al. | Apr 2001 | B1 |
6222460 | DeLine et al. | Apr 2001 | B1 |
6302545 | Schofield et al. | Oct 2001 | B1 |
6329925 | Skiver et al. | Dec 2001 | B1 |
6396397 | Bos et al. | May 2002 | B1 |
6498620 | Schofield et al. | Dec 2002 | B2 |
6513252 | Schierbeek et al. | Feb 2003 | B1 |
6523964 | Schofield et al. | Feb 2003 | B2 |
6611202 | Schofield et al. | Aug 2003 | B2 |
6636258 | Strumolo | Oct 2003 | B2 |
6642851 | Deline et al. | Nov 2003 | B2 |
6690268 | Schofield et al. | Feb 2004 | B2 |
6717610 | Bos et al. | Apr 2004 | B1 |
6757109 | Bos | Jun 2004 | B2 |
6802617 | Schofield et al. | Oct 2004 | B2 |
6806452 | Bos et al. | Oct 2004 | B2 |
6822563 | Bos et al. | Nov 2004 | B2 |
6882287 | Schofield | Apr 2005 | B2 |
6891563 | Schofield et al. | May 2005 | B2 |
6946978 | Schofield | Sep 2005 | B2 |
7004593 | Weller et al. | Feb 2006 | B2 |
7005974 | McMahon et al. | Feb 2006 | B2 |
7038577 | Pawlicki et al. | May 2006 | B2 |
7145519 | Takahashi et al. | Dec 2006 | B2 |
7161616 | Okamoto et al. | Jan 2007 | B1 |
7184190 | McCabe et al. | Feb 2007 | B2 |
7195381 | Lynam et al. | Mar 2007 | B2 |
7230640 | Regensburger et al. | Jun 2007 | B2 |
7248283 | Takagi et al. | Jul 2007 | B2 |
7249860 | Kulas et al. | Jul 2007 | B2 |
7255451 | McCabe et al. | Aug 2007 | B2 |
7274501 | McCabe et al. | Sep 2007 | B2 |
7289037 | Uken et al. | Oct 2007 | B2 |
7295229 | Kumata et al. | Nov 2007 | B2 |
7301466 | Asai | Nov 2007 | B2 |
7308341 | Schofield et al. | Dec 2007 | B2 |
7329013 | Blank et al. | Feb 2008 | B2 |
7338177 | Lynam | Mar 2008 | B2 |
7370983 | DeWind et al. | May 2008 | B2 |
7446650 | Scholfield et al. | Nov 2008 | B2 |
7581859 | Lynam | Sep 2009 | B2 |
7592928 | Chinomi et al. | Sep 2009 | B2 |
7626749 | Baur et al. | Dec 2009 | B2 |
7720580 | Higgins-Luthman | May 2010 | B2 |
7855755 | Weller et al. | Dec 2010 | B2 |
7859565 | Schofield et al. | Dec 2010 | B2 |
7881496 | Camilleri et al. | Feb 2011 | B2 |
8694224 | Chundrlik, Jr. et al. | Apr 2014 | B2 |
8818042 | Schofield et al. | Aug 2014 | B2 |
8886401 | Schofield et al. | Nov 2014 | B2 |
8917169 | Schofield et al. | Dec 2014 | B2 |
9036026 | Dellantoni et al. | May 2015 | B2 |
9068390 | Ihlenburg et al. | Jun 2015 | B2 |
9077098 | Latunski | Jul 2015 | B2 |
9077962 | Shi et al. | Jul 2015 | B2 |
9085261 | Lu et al. | Jul 2015 | B2 |
9090234 | Johnson et al. | Jul 2015 | B2 |
9092986 | Salomonsson et al. | Jul 2015 | B2 |
9126525 | Lynam et al. | Sep 2015 | B2 |
9140789 | Lynam | Sep 2015 | B2 |
9146898 | Ihlenburg et al. | Sep 2015 | B2 |
9174574 | Salomonsson | Nov 2015 | B2 |
9205776 | Turk | Dec 2015 | B2 |
9233641 | Sesti et al. | Jan 2016 | B2 |
9446713 | Lu et al. | Sep 2016 | B2 |
9558409 | Pliefke et al. | Jan 2017 | B2 |
9609757 | Steigerwald | Mar 2017 | B2 |
9900490 | Ihlenburg et al. | Feb 2018 | B2 |
9914333 | Shank et al. | Mar 2018 | B2 |
10046706 | Larson et al. | Aug 2018 | B2 |
10071687 | Ihlenburg et al. | Sep 2018 | B2 |
10086870 | Gieseke et al. | Oct 2018 | B2 |
10089537 | Nix et al. | Oct 2018 | B2 |
10099614 | Diessner | Oct 2018 | B2 |
10154185 | Sigle et al. | Dec 2018 | B2 |
10160382 | Pliefke et al. | Dec 2018 | B2 |
10166924 | Baur | Jan 2019 | B2 |
10179543 | Rathi et al. | Jan 2019 | B2 |
10264219 | Mleczko et al. | Apr 2019 | B2 |
10313572 | Wohlte | Jun 2019 | B2 |
10332002 | Bliss et al. | Jun 2019 | B2 |
10346705 | Naserian et al. | Jul 2019 | B2 |
10407047 | Chundrlik, Jr. et al. | Sep 2019 | B2 |
10421404 | Larson et al. | Sep 2019 | B2 |
10442360 | LaCross et al. | Oct 2019 | B2 |
10452931 | Gupta | Oct 2019 | B2 |
10532698 | Potnis et al. | Jan 2020 | B2 |
10552976 | Diessner et al. | Feb 2020 | B2 |
10567705 | Ziegenspeck et al. | Feb 2020 | B2 |
10586119 | Pliefke et al. | Mar 2020 | B2 |
10638025 | Gali et al. | Apr 2020 | B2 |
10706291 | Diessner et al. | Jul 2020 | B2 |
10733757 | Gupta et al. | Aug 2020 | B2 |
10755110 | Bajpai | Aug 2020 | B2 |
10933810 | Lu et al. | Mar 2021 | B2 |
10948798 | Lynam et al. | Mar 2021 | B2 |
10967796 | Uken et al. | Apr 2021 | B2 |
11064165 | Kiliman | Jul 2021 | B2 |
11072284 | Windeler et al. | Jul 2021 | B2 |
11180083 | Lu et al. | Nov 2021 | B2 |
11410431 | Pliefke et al. | Aug 2022 | B2 |
11465560 | Lu et al. | Oct 2022 | B2 |
11465561 | Peterson et al. | Oct 2022 | B2 |
11634073 | Ihlenburg et al. | Apr 2023 | B2 |
11673605 | Gieseke et al. | Jun 2023 | B2 |
11702017 | Gali et al. | Jul 2023 | B2 |
11861878 | Gali et al. | Jan 2024 | B2 |
11875575 | Gali et al. | Jan 2024 | B2 |
20060050018 | Hutzel et al. | Mar 2006 | A1 |
20060061008 | Karner et al. | Mar 2006 | A1 |
20110050903 | Vorobiev | Mar 2011 | A1 |
20120162427 | Lynam | Jun 2012 | A1 |
20130002873 | Hess | Jan 2013 | A1 |
20130141578 | Chundrlik, Jr. et al. | Jun 2013 | A1 |
20130215271 | Lu | Aug 2013 | A1 |
20130222593 | Byrne et al. | Aug 2013 | A1 |
20130242099 | Sauer et al. | Sep 2013 | A1 |
20130258077 | Bally et al. | Oct 2013 | A1 |
20130278769 | Nix et al. | Oct 2013 | A1 |
20130298866 | Vogelbacher | Nov 2013 | A1 |
20130300869 | Lu et al. | Nov 2013 | A1 |
20130314503 | Nix et al. | Nov 2013 | A1 |
20140005907 | Bajpai | Jan 2014 | A1 |
20140022390 | Blank et al. | Jan 2014 | A1 |
20140025240 | Steigerwald et al. | Jan 2014 | A1 |
20140028852 | Rathi | Jan 2014 | A1 |
20140049646 | Nix | Feb 2014 | A1 |
20140052340 | Bajpai | Feb 2014 | A1 |
20140067206 | Pflug | Mar 2014 | A1 |
20140085472 | Lu et al. | Mar 2014 | A1 |
20140098229 | Lu et al. | Apr 2014 | A1 |
20140104426 | Boegel et al. | Apr 2014 | A1 |
20140138140 | Sigle | May 2014 | A1 |
20140139676 | Wierich | May 2014 | A1 |
20140152825 | Schaffner | Jun 2014 | A1 |
20140160276 | Pliefke et al. | Jun 2014 | A1 |
20140160291 | Schaffner | Jun 2014 | A1 |
20140168415 | Ihlenburg et al. | Jun 2014 | A1 |
20140168437 | Rother et al. | Jun 2014 | A1 |
20140211009 | Fursich | Jul 2014 | A1 |
20140218535 | Ihlenburg et al. | Aug 2014 | A1 |
20140226012 | Achenbach | Aug 2014 | A1 |
20140232869 | May et al. | Aug 2014 | A1 |
20140247352 | Rathi et al. | Sep 2014 | A1 |
20140247354 | Knudsen | Sep 2014 | A1 |
20140247355 | Ihlenburg | Sep 2014 | A1 |
20140285666 | O'Connell et al. | Sep 2014 | A1 |
20140293042 | Lynam | Oct 2014 | A1 |
20140293057 | Wierich | Oct 2014 | A1 |
20140307095 | Wierich | Oct 2014 | A1 |
20140309884 | Wolf | Oct 2014 | A1 |
20140313339 | Diessner | Oct 2014 | A1 |
20140320636 | Bally et al. | Oct 2014 | A1 |
20140320658 | Pliefke | Oct 2014 | A1 |
20140327772 | Sahba | Nov 2014 | A1 |
20140327774 | Lu et al. | Nov 2014 | A1 |
20140336876 | Gieseke et al. | Nov 2014 | A1 |
20140340510 | Ihlenburg et al. | Nov 2014 | A1 |
20140347486 | Okouneva | Nov 2014 | A1 |
20150002670 | Bajpai | Jan 2015 | A1 |
20150217693 | Pliefke et al. | Aug 2015 | A1 |
20170050672 | Gieseke et al. | Feb 2017 | A1 |
20170217372 | Lu | Aug 2017 | A1 |
20170254873 | Koravadi | Sep 2017 | A1 |
20170355312 | Habibi et al. | Dec 2017 | A1 |
20180134217 | Peterson et al. | May 2018 | A1 |
20180211528 | Seifert | Jul 2018 | A1 |
20180215382 | Gupta et al. | Aug 2018 | A1 |
20180253608 | Diessner et al. | Sep 2018 | A1 |
20180276838 | Gupta et al. | Sep 2018 | A1 |
20180276839 | Diessner et al. | Sep 2018 | A1 |
20180341823 | Gupta | Nov 2018 | A1 |
20190016264 | Potnis et al. | Jan 2019 | A1 |
20190039649 | Gieseke et al. | Feb 2019 | A1 |
20190042864 | Pliefke et al. | Feb 2019 | A1 |
20190047475 | Uken et al. | Feb 2019 | A1 |
20190064831 | Gali et al. | Feb 2019 | A1 |
20190118717 | Blank et al. | Apr 2019 | A1 |
20190118860 | Gali et al. | Apr 2019 | A1 |
20190143895 | Pliefke et al. | May 2019 | A1 |
20190146297 | Lynam et al. | May 2019 | A1 |
20190225152 | Koravadi | Jul 2019 | A1 |
20190258131 | Lynam et al. | Aug 2019 | A9 |
20190297233 | Gali et al. | Sep 2019 | A1 |
20190347825 | Gupta et al. | Nov 2019 | A1 |
20200017143 | Gali | Jan 2020 | A1 |
20200334475 | Joseph et al. | Oct 2020 | A1 |
20200356788 | Joseph et al. | Nov 2020 | A1 |
20200361397 | Joseph et al. | Nov 2020 | A1 |
20200377022 | LaCross et al. | Dec 2020 | A1 |
20200406967 | Yunus et al. | Dec 2020 | A1 |
20210078634 | Jalalmaab et al. | Mar 2021 | A1 |
20210094473 | Gali et al. | Apr 2021 | A1 |
20210127693 | Tomita et al. | May 2021 | A1 |
20210155167 | Lynam et al. | May 2021 | A1 |
20210162926 | Lu | Jun 2021 | A1 |
20210170820 | Zhang | Jun 2021 | A1 |
20210170947 | Yunus et al. | Jun 2021 | A1 |
20210213878 | Schondorf | Jul 2021 | A1 |
20210245662 | Blank et al. | Aug 2021 | A1 |
20210300246 | Peterson et al. | Sep 2021 | A1 |
20220024391 | Gali et al. | Jan 2022 | A1 |
20220027644 | Gali et al. | Jan 2022 | A1 |
20220028111 | Gali et al. | Jan 2022 | A1 |
20220189052 | Jalalmaab et al. | Jun 2022 | A1 |
20220212599 | Gali et al. | Jul 2022 | A1 |
20220212668 | Joseph et al. | Jul 2022 | A1 |
20220215670 | Gali et al. | Jul 2022 | A1 |
20230001984 | Lu et al. | Jan 2023 | A1 |
20240064274 | Blank et al. | Feb 2024 | A1 |
Number | Date | Country | |
---|---|---|---|
20230242038 A1 | Aug 2023 | US |
Number | Date | Country | |
---|---|---|---|
63267329 | Jan 2022 | US |