Driver assist system for vehicle

Information

  • Patent Grant
  • Patent Number
    9,315,151
  • Date Filed
    Friday, April 3, 2015
  • Date Issued
    Tuesday, April 19, 2016
Abstract
A driver assist system for a vehicle includes at least one non-visual sensor and a color video rear backup camera having a field of view rearward of the vehicle, with the field of view of the camera encompassing a ground area to the rear of the vehicle. An image processor processes image data captured by the camera. A video display screen is viewable by a driver of the vehicle. During a reversing or parking maneuver of the vehicle, images derived, at least in part, from image data captured by the camera are displayed by the video display screen to assist the driver in operating the vehicle. At least one indication is provided to a driver of the vehicle, at least in part, responsive to detection by the non-visual sensor of at least one object external of the vehicle.
Description
FIELD OF THE INVENTION

The present invention relates generally to telematics systems for vehicles and, more particularly, to telematics systems which may provide driving instructions or directions to a driver of a vehicle or which may provide other controls to an accessory or system of the vehicle. The present invention also relates generally to vehicle seating adjustment systems and, more particularly, to vehicle seating adjustment systems with memory adjustment.


BACKGROUND OF THE INVENTION

In-vehicle telematics systems or vehicle-based telematics systems, such as General Motors' ONSTAR®, Daimler's TELEAID™, Ford's RESCU® or the like, are common in vehicles today. Such telematics systems involve a telecommunication link from the vehicle to an operator or a voice input system at a service center or the like external to the vehicle. The driver of the vehicle may connect or communicate with an operator at the service center to request directions to a targeted location. The service center may provide directions to the targeted location based on the known position of the vehicle, which may be given to the service center operator by the driver, or which may be known by the operator via a link to a global positioning system (GPS) of the vehicle.


However, in such concierge-type systems, the service center operator typically gives all of the road names, exits to take, and directional headings to the driver at once while the driver is driving the vehicle. The driver is then expected to remember several directional driving instructions at a time and often has difficulty remembering the full directions. The driver may optionally remain on the line with the service center operator until the driver reaches the intended destination, which may take many minutes, such as ten, fifteen, twenty minutes or more, and/or the driver may call back to the service center for updated directions, but these actions increase the cost of the service, since the service center typically charges for such calls.


Therefore, there is a need in the art for a navigation system that overcomes the shortcomings of the prior art.


SUMMARY OF THE INVENTION

The present invention is intended to provide instructions or directions to a driver of a vehicle which are keyed or coded or linked to respective geographic locations, such that the particular instructions are provided in response to the geographic position of the vehicle at least generally corresponding to the particular geographic location associated with the particular instruction. The particular instructions are thus provided to the driver of the vehicle only when the geographic position of the vehicle is at or near the predetermined or preset waypoints or geographic locations corresponding to the respective particular instructions.


According to an aspect of the present invention, a navigation system for a vehicle includes a vehicle-based telematics system, a vehicle-based global positioning system and a control. The telematics system is operable to receive a user input and to download directional information from a remote source to the control of the vehicle in response to the user input (often, for instance, in ONSTAR®, the user input may be a request from the driver to the remote source or service center operator for directions to a particular destination) and an initial geographic position of the vehicle, such as typically determined by the vehicle-based global positioning system. The directional information comprises at least two instructions, with each instruction being coded to or associated with or linked to a respective geographic location or waypoint. The control is operable to provide an output corresponding to each of the at least two instructions in response to a then current geographic position of the vehicle. The control is operable to provide each instruction only when the then current geographic position of the vehicle at least generally matches or corresponds to the particular respective geographic location associated with the particular instruction.


For instance, a first instruction is typically downloaded that comprises information as to the initial geographic position and heading of the vehicle (e.g., “You are now heading East on Maple Street. Continue until you reach Oak Road.”). A second instruction may then provide information as the vehicle approaches the appropriate turn or intersection or the like to take (e.g., “You are now within two blocks of Oak Road. Prepare to turn Right at Oak Road.”). A subsequent instruction may then provide information as to the geographic position of the vehicle after the previous step has been completed (e.g., “You are now heading South on Oak Road. Continue until you reach Elm Street.”). The output thus provides separate instructions or steps of the directional information, with each instruction coded to a particular geographic location and provided in response to the then current geographic position of the vehicle.
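

By way of illustration only, the keying of instructions to waypoints might be sketched as follows. The data layout, the street coordinates, and the 50-meter proximity radius below are illustrative assumptions, not details prescribed by the invention:

    import math

    # Each downloaded instruction is tagged with the waypoint (latitude,
    # longitude) at which it should be presented to the driver.
    # Coordinates here are placeholders for illustration only.
    ROUTE = [
        (42.9634, -85.6681, "You are now heading East on Maple Street. "
                            "Continue until you reach Oak Road."),
        (42.9634, -85.6550, "You are now within two blocks of Oak Road. "
                            "Prepare to turn Right at Oak Road."),
        (42.9600, -85.6520, "You are now heading South on Oak Road. "
                            "Continue until you reach Elm Street."),
    ]

    def _distance_m(lat1, lon1, lat2, lon2):
        """Equirectangular approximation; adequate at street scale."""
        dx = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
        dy = math.radians(lat2 - lat1)
        return math.hypot(dx, dy) * 6_371_000   # mean Earth radius in meters

    def next_instruction(route, gps_lat, gps_lon, step, threshold_m=50.0):
        """Release the pending instruction only when the GPS fix is near
        the waypoint it is keyed to; otherwise stay silent."""
        if step >= len(route):
            return step, None                   # route complete
        wp_lat, wp_lon, text = route[step]
        if _distance_m(gps_lat, gps_lon, wp_lat, wp_lon) <= threshold_m:
            return step + 1, text               # advance to the next step
        return step, None                       # not there yet

Each new GPS fix would call next_instruction; an instruction is spoken or displayed only when the fix falls near its tagged waypoint, which is the keying behavior described above.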


Also, if the driver of the vehicle does not correctly turn or passes an appropriate turn or the like, the control of the present invention knows this via an input from the in-vehicle or vehicle-based global positioning system. As a consequence, a warning instruction may be communicated to the driver indicating that the directions are not being appropriately followed (e.g., “You have passed Oak Road. Please execute a U-Turn and proceed West on Maple Street to Oak Road and turn Left at Oak Road.”). Also, if the driver turns off a given road onto an incorrect road or otherwise strays from the given route, the control may communicate a similar warning or instruction to alert the driver that the vehicle is no longer traveling along the given route (e.g., “You have left Maple Street, but are not on Oak Road. Return to Maple Street and continue East on Maple Street to Oak Road, then turn Right on Oak Road.”).
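

An off-route check in the spirit of this paragraph might measure how far the current GPS fix has strayed from the route leg being driven. The sketch below is one possible outline; the planar-coordinate representation and the 75-meter tolerance are assumptions:

    # Treat the current leg of the route as a straight segment between two
    # waypoints and flag a departure when the GPS fix strays too far from it.

    def off_route(fix, seg_start, seg_end, tolerance_m=75.0):
        """fix, seg_start, seg_end are (x, y) positions in meters
        (e.g. a local planar projection of latitude/longitude)."""
        (px, py), (ax, ay), (bx, by) = fix, seg_start, seg_end
        abx, aby = bx - ax, by - ay
        seg_len_sq = abx * abx + aby * aby or 1e-9
        # Parameter of the closest point on the segment, clamped to [0, 1].
        t = max(0.0, min(1.0, ((px - ax) * abx + (py - ay) * aby) / seg_len_sq))
        cx, cy = ax + t * abx, ay + t * aby
        dist = ((px - cx) ** 2 + (py - cy) ** 2) ** 0.5
        return dist > tolerance_m

    # A fix 120 m off the Maple Street leg would trigger the warning message.
    assert off_route((60.0, 120.0), (0.0, 0.0), (500.0, 0.0))

When the check fires, the control would issue the corrective message of the kind quoted above (for example, directing a U-turn back to the missed intersection).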


The control is operable to tag or code each of the instructions with a respective geographic location or waypoint (alternately, each of the instructions may be tagged or coded or associated with a respective geographic location or waypoint at the remote source before downloading to the control of the vehicle, without affecting the scope of the present invention). The control is then operable to only display a particular instruction when the geographic location tagged or coded to the particular instruction matches or generally matches the actual, then current geographic position of the vehicle.


The control also receives, preferably continuously, an input from the vehicle-based global positioning system that is indicative of the actual, current geographic position of the vehicle as the vehicle travels along the road, highway or the like. The control is then operable to compare the tagged or coded geographic location (as associated with the respective instructions) with the GPS-derived actual geographic position information. Thus, the control may determine when a particular instruction is appropriate to be displayed and/or communicated to the driver by determining that the GPS-derived actual geographic position of the vehicle is now at or at least close to the geographic location associated with a particular instruction.


The user input may comprise a vocal input from the driver of the vehicle to the remote source or service center, or may comprise a keypad input or the like, without affecting the scope of the present invention. Preferably, the geographic position of the vehicle is provided to the remote source (such as a service center or the like) via the global positioning system of the vehicle and the telematics system of the vehicle.


In one form, the output of the control is provided to the driver as an audible message. In another form, the output of the control is provided to the driver as a visible display. The visible display may comprise a video display element, an alphanumeric or iconistic display element or the like, and may comprise a display on demand type display element, a thin film transistor liquid crystal display element, a multi-pixel display element, and/or a multi-icon display element and/or the like. In another form, a combination of a visible and audible output may be used.


Optionally, the system may include a seat adjustment system that is operable to adjust a seat of the vehicle in response to data received via at least one of the vehicle-based telematics system and the vehicle-based global positioning system. The seat adjustment system may be operable in response to biometric data pertaining to the occupant of the seat of the vehicle.


According to another aspect of the present invention, a method for providing navigational directions to a driver of a vehicle comprises accessing a remote source or service center via a vehicle-based wireless communication system and downloading local information from the remote source to a control of the vehicle via the wireless communication system in response to a user input. The local information comprises at least two driving instructions. Each of the at least two driving instructions is associated with or linked to a respective, particular geographic location. A current geographic position of the vehicle is provided to the control via a vehicle-based global positioning system. Each of the at least two driving instructions is provided by the control to the driver in response to the then current geographic position of the vehicle and only when the current geographic position of the vehicle at least generally matches or corresponds to the particular geographic location electronically associated with or linked to the respective one of the at least two driving instructions.


Preferably, the method includes associating or tagging or coding or linking (such as electronically, digitally or the like) each of the instructions with a respective particular geographic location. The control may tag or code the instructions to be associated with the respective geographic locations after the instructions have been downloaded, or the remote service center may tag or code the instructions to be associated with the respective geographic locations before downloading the instructions to the control, without affecting the scope of the present invention.


In one form, the at least two driving instructions are visibly displayed to the driver at a display of the vehicle. In another form, the at least two driving instructions are audibly communicated to the driver via at least one speaker of the vehicle. In a third form, a combination of a visible display and audible communication may be used.


According to yet another aspect of the present invention, a navigation system for a vehicle comprises a vehicle-based telematics system, a vehicle-based global positioning system, and a control. The telematics system is operable to receive a user input from a driver of the vehicle and to download directional information to the control of the vehicle in response to the user input and an initial geographic position of the vehicle. The directional information comprises at least two instructions. The control is operable to tag or code or link each of the instructions with a respective geographic location. The control is operable to provide an output corresponding to a particular instruction only when the geographic location tagged or coded or linked to the particular instruction at least generally corresponds to the actual current geographic position of the vehicle.


The present invention thus provides for step-by-step instructions or driving directions to the driver of a vehicle as the driver is driving the vehicle according to the instructions. Each step or instruction is provided either after the previous step or instruction has been completed or as the vehicle approaches a turn or intersection or location where the next step is to be performed, so that the driver is not overwhelmed with multiple instructions to remember as the driver drives the vehicle toward the targeted destination. The control or the remote source or service center is operable to electronically or digitally or otherwise tag, key, code or otherwise associate each instruction or step with a geographic location or waypoint, and the control is operable to only display that instruction when the geographic location tagged to the instruction generally matches the actual, current geographic position of the vehicle. All of the instructions are provided or downloaded to the vehicle during a single, short communication with the remote source or service center via the telematics system, so as to avoid multiple communications to the remote service center or a lengthy communication with the remote service center, thereby reducing the cost of the instruction service to the driver of the vehicle.


These and other objects, advantages, purposes, and features of the present invention will become more apparent from the study of the following description taken in conjunction with the drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a top plan view of a vehicle incorporating a navigation system in accordance with the present invention;



FIG. 2 is a block diagram of a navigation system in accordance with the present invention;



FIG. 3 is a top plan view of a vehicle incorporating a seat adjustment system in accordance with the present invention; and



FIG. 4 is a block diagram of a seat adjustment system in accordance with the present invention.





DESCRIPTION OF THE PREFERRED EMBODIMENTS

Referring now to the drawings and the illustrative embodiments depicted therein, a navigation system 10 of a vehicle 12 includes a control 14 which is operable to communicate an output 16, such as step-by-step directions or driving instructions, to a driver of the vehicle based on an initial, current or present geographic position of the vehicle and the desired or targeted final destination of the vehicle (FIGS. 1 and 2). The initial geographic position of the vehicle and the targeted destination are communicated to a remote source or service center 20 via a telematics system 18 of the vehicle and a global positioning system 22 of the vehicle. In response to a user input 24 from the driver or other occupant of the vehicle and the initial geographic position of the vehicle, the service center 20 provides or downloads a set of instructions or driving directions 26, which is received by the control 14 from the service center via the telematics system or wireless communication system 18 of the vehicle. Each of the particular instructions is electronically or digitally or otherwise coded, tagged, keyed, or otherwise associated with a respective particular geographic location or waypoint. The control 14 then provides the instructions or output 16 to the driver in a step-by-step manner based on the GPS-derived, actual, then current geographic position of the vehicle, and with the stepping from one step to the subsequent step of the instructions being linked to the then current geographic position of the vehicle in relation to the particular geographic locations or waypoints associated with the instructions, as discussed below.


The driver or the other occupant of the vehicle provides the user input 24 to the telematics system or wireless communication system 18 of the vehicle. The user input 24 may include a vocal communication or request for driving instructions or directional information to the final destination to an operator or voice input/recognition system of the service center or the like 20 associated with the telematics system 18 of the vehicle, or may be a keyed-in request or instructions via a keypad or the like to a remote computer system or computerized service center or the like, without affecting the scope of the present invention. The driver or other occupant of the vehicle may provide (such as via a vocal communication or via a keypad input or the like) the initial position of the vehicle to the service center, or the geographic position of the vehicle may be communicated to the service center via a global positioning system 22 of the vehicle.


The remote service center 20 is then operable to download the local map and/or the driving instructions or directions to a memory storage or control 14 of the vehicle while the communication link is open between the service center and the vehicle. Because only the local information necessary to direct the driver to the targeted destination is downloaded to the control or memory of the vehicle, the download may be completed in a relatively short period of time (thus minimizing the time and cost of the communication) and does not require a large amount of memory or storage space for the information. After the instructions or directions are downloaded to the vehicle, the driver may disconnect from the service center to avoid additional charges for the communication and service.


Each of the output instructions provided by the control is electronically or digitally or otherwise keyed or coded or tagged or otherwise associated with or linked to a respective or corresponding geographic location or waypoint. The instructions may be tagged or coded by the remote source or service center before the instructions are downloaded to the vehicle, or the instructions may be tagged or coded by the control at the vehicle after the instructions have been downloaded to the control, without affecting the scope of the present invention.


The control 14 also receives, preferably continuously, an input from the in-vehicle or vehicle-based global positioning system 22 which is indicative of the actual, current geographic position of the vehicle as it travels along the road, highway or the like. The control is then operable to compare the tagged or coded geographic locations as associated with the respective instructions with the GPS-derived actual geographic position information. Thus, the control is operable to determine when a particular instruction is appropriate to be displayed or communicated to the driver of the vehicle by determining that the actual GPS-derived geographic position of the vehicle is now at or at least close to the geographic location associated with a particular instruction. The control is then operable to provide the separate or particular output instructions to the driver of the vehicle in response to the actual, then current geographic position of the vehicle matching or corresponding to or approaching a particular geographic location or waypoint keyed to or coded to or tagged to or associated with a respective, particular instruction.


Preferably, the output or instructions are provided to the driver of the vehicle in a step-by-step manner, where each individual instruction or step is provided based on the then current geographic position of the vehicle with respect to the keyed or coded geographic location. More particularly, each particular instruction is provided to the driver by the control only when the actual geographic position of the vehicle at least generally corresponds to or matches the particular geographic location associated with or linked to the respective, particular instruction. The particular instruction is thus provided to the driver of the vehicle at the particular time at which the vehicle is positioned at or near a geographic location where the particular instruction is most useful to the driver of the vehicle.


For example, an initial instruction may be electronically or digitally coded to the initial geographic position of the vehicle when the directions/instructions are first requested (e.g., “You are heading East on First Street”). Each subsequent individual step may be provided in response to the control detecting or determining (in response to an output of the global positioning system) that the vehicle is approaching, at or near the next geographic location or waypoint, such as a turn, location, intersection or the like, at which the next step is to be performed (e.g., the car is approaching and within a predetermined or threshold distance from Main Street and the next instruction is “Turn Left on Main Street”), or in response to the control detecting or determining (again in response to the global positioning system of the vehicle) that a previous instruction or step has been completed (e.g., the car has turned left and is now traveling along Main Street and the next instruction is “Proceed North on Main Street”). The control is thus operable to provide the next step or instruction only when the driver can readily understand the instruction and focus on performing that particular step. The driver thus does not have to remember all of the multiple steps or turns or street names or exits or the like while also driving the vehicle. The driver also thus does not have to remain on the line with the remote service center operator and/or does not have to repeatedly contact the service center to obtain the instructions again if any of the instructions are forgotten, since the local instructions and/or map have been downloaded to the vehicle.
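

The two triggers described here, namely approach to within a threshold distance of the next waypoint and completion of the prior maneuver, might be combined as in the following sketch. The 200-meter threshold and the heading test are illustrative assumptions; the text specifies only "a predetermined or threshold distance":

    # Two ways a step can fire, per the embodiment described above:
    #  1. the vehicle comes within a threshold distance of the waypoint
    #     for the next maneuver ("Turn Left on Main Street"), or
    #  2. the prior maneuver is detected as completed, e.g. the GPS track
    #     now shows the vehicle heading North along Main Street.

    APPROACH_THRESHOLD_M = 200.0   # assumed value; the text says only
                                   # "a predetermined or threshold distance"

    def should_announce(dist_to_waypoint_m, expected_heading_deg,
                        current_heading_deg, heading_tol_deg=20.0):
        approaching = dist_to_waypoint_m <= APPROACH_THRESHOLD_M
        completed_prior = abs((current_heading_deg - expected_heading_deg
                               + 180) % 360 - 180) <= heading_tol_deg
        return approaching or completed_prior

    # e.g. 150 m from Main Street -> announce "Turn Left on Main Street"
    assert should_announce(150.0, 0.0, 90.0)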


The telematics system or wireless communication system 18 of the vehicle may be operable to connect to a corresponding service center or operator or voice input/recognition system or the like 20 which may provide a variety of information or assistance to the driver of the vehicle in response to a vocal message from the driver or other occupant of the vehicle (although the user input may be a keypad input or the like to a computerized service center or the like, without affecting the scope of the present invention). Such a communication system and service center may be substantially similar to known systems and centers, such as General Motors' ONSTAR®, Daimler's TELEAID™, Ford's RESCU® or the like, which are common in vehicles today. The communication link may be accomplished utilizing various linking principles, such as the principles disclosed in commonly assigned U.S. Pat. Nos. 6,420,975; 6,278,377; 6,243,003; 6,329,925; 6,428,172; 6,326,613, the disclosures of which are hereby incorporated herein by reference.


The driver or occupant of the vehicle may actuate a communication link (such as via a push button or the like at the interior rearview mirror or at a console of the vehicle), and request from the operator, such as via a voice input, the driving instructions or directions as to how to get to a desired or targeted location or destination. The service center may receive the initial geographic position of the vehicle (such as in response to the global positioning system of the vehicle or from the driver), and may access a database to obtain the appropriate local map and/or local directions to the targeted destination. The operator may even access the vast data banks available at the service center for destinations or locations and may provide human interaction to help find the destination of choice if the driver does not know the exact address. The operator or service center then downloads the local information or step-by-step or turn-by-turn directions 26 to the control or memory or storage system 14 of the vehicle 12 in a single download. Optionally, it is envisioned that the service center may download or provide the information to the vehicle in real time (which may result in a longer opened communication link between the vehicle and the service center), without affecting the scope of the present invention.


The control 14 is operable to provide the downloaded instructions to the driver of the vehicle while the vehicle is driven by the driver toward the targeted destination. The control 14 provides the information or directions or output 16, such as when/where to turn, how far until the turn, and the direction to travel, to the driver as needed. The control may be operable to update the output display or message in real time based on the current geographic position of the vehicle as the vehicle travels along the given route.


The output or instructions may be provided to the driver by the control via an audible message or signal, such as via one or more speakers of the vehicle, such as by utilizing principles of audio systems of the types disclosed in commonly assigned U.S. Pat. No. 6,243,003; 6,278,377; and 6,420,975, which are hereby incorporated herein by reference, or may be provided via a display, such as in a display of an interior rearview mirror 28, such as a scrolling display of the type disclosed in U.S. patent application, Ser. No. 09/799,414, filed Mar. 5, 2001, now U.S. Pat. No. 6,477,464, which is hereby incorporated herein by reference, or a display on demand type display, such as the types disclosed in commonly assigned U.S. Pat. Nos. 5,668,663 and 5,724,187, and U.S. patent applications, Ser. No. 10/054,633, filed Jan. 22, 2002, now U.S. Pat. No. 7,195,381; and Ser. No. 09/793,002, filed Feb. 26, 2001, now U.S. Pat. No. 6,690,268, the entire disclosures of which are hereby incorporated herein by reference, or in a display screen or the like at the interior rearview mirror assembly or elsewhere within the vehicle, without affecting the scope of the present invention. Other types of visible displays or locations for such visible displays may be utilized, such as at an accessory module or pod or windshield electronic module, an instrument panel of the vehicle, a console of the vehicle and/or the like, without affecting the scope of the present invention. The visible display may comprise written instructions, icons (such as left and right arrows or the like), or any other characters or symbols or indicia which convey to the driver of the vehicle when/where to turn and/or which direction to travel in order to arrive at the targeted destination. Optionally, the output may comprise a combination of a visible display and an audible message or signal, without affecting the scope of the present invention.


As indicated above, a variety of means may be utilized to visually convey the direction instructions to the driver of the vehicle. For example, and such as described in U.S. patent application, Ser. No. 09/799,414, filed Mar. 5, 2001, now U.S. Pat. No. 6,477,464, which is hereby incorporated herein by reference, a text display may be provided and/or an iconistic display may be provided, such as a display readable through the interior rearview mirror reflective element itself. In this regard, use of a display on demand (DOD) type display (such as disclosed in commonly assigned, U.S. patent applications, Ser. No. 10/054,633, filed Jan. 22, 2002, now U.S. Pat. No. 7,195,381, and Ser. No. 09/793,002, filed Feb. 26, 2001, now U.S. Pat. No. 6,690,268, and in U.S. Pat. Nos. 5,668,663 and 5,724,187, the entire disclosures of which are hereby incorporated by reference herein), may be preferred. For example, a video display element or a video display screen or an information display element can be used (such as an elongated alphanumeric/multi-pixel/multi-icon display element and/or such as an LCD display or an emitting display element, such as a multi-pixel electroluminescent display or field emission display or light emitting diode display (organic or inorganic) or the like) which is disposed within the mirror housing of the interior mirror assembly of the vehicle, and located behind the mirror reflective element in the mirror housing, and configured so that the information displayed by the display element (that is positioned to the rear of the reflector of the mirror reflective element) is viewable by the driver through the mirror reflective element. Such a display can be accomplished by partially or wholly removing the reflector in the area of the display or, more preferably, by providing a display on demand type display, whereby the reflective element comprises a transflective element, as discussed below.


Preferably, and such as is disclosed in U.S. patent application, Ser. No. 09/793,002, filed Feb. 26, 2001, now U.S. Pat. No. 6,690,268, the video display screen or other visible display element or elements may be disposed behind the mirror reflective element so that the information displayed is visible by viewing through the mirror reflective element of the interior rearview mirror assembly, with the reflective element preferably comprising a transflective mirror reflector such that the mirror reflective element is significantly transmitting to visible light incident from its rear (i.e. the portion furthest from the driver in the vehicle), with at least about 15% transmission preferred, at least about 20% transmission more preferred, and at least about 25% transmission most preferred, while, simultaneously, the mirror reflective element is substantially reflective to visible light incident from its front (i.e. the position closest to the driver when the interior mirror assembly is mounted in the vehicle), with at least about 60% reflectance preferred, at least about 70% reflectance more preferred, and at least about 75% reflectance most preferred.
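

The transmission and reflectance tiers recited above can be encoded directly; the checker below merely restates those stated thresholds (the function and tier names are illustrative, not taken from the text):

    # Preferred tiers for a transflective mirror element, from the text:
    # transmission of visible light incident from the rear, and reflectance
    # of visible light incident from the front, both as percentages.
    TIERS = [
        ("most preferred", 25.0, 75.0),
        ("more preferred", 20.0, 70.0),
        ("preferred",      15.0, 60.0),
    ]

    def classify_element(transmission_pct, reflectance_pct):
        for name, t_min, r_min in TIERS:
            if transmission_pct >= t_min and reflectance_pct >= r_min:
                return name
        return "outside the stated ranges"

    # e.g. an element with 22% transmission and 72% reflectance
    print(classify_element(22.0, 72.0))   # -> "more preferred"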


Preferably, a transflective electrochromic reflective mirror element is used (such as is disclosed in U.S. patent application, Ser. No. 09/793,002, filed Feb. 26, 2001, now U.S. Pat. No. 6,690,268; and/or in U.S. Pat. Nos. 5,668,663 and 5,724,187, the entire disclosures of which are hereby incorporated by reference herein) that comprises an electrochromic medium sandwiched between two substrates. With the likes of a TFT LCD video display or a light emitting information display disposed behind the rear substrate of a third-surface transflective electrochromic mirror reflective element in a “display-on-demand” configuration (such as disclosed in U.S. patent applications, Ser. No. 10/054,633, filed Jan. 22, 2002, now U.S. Pat. No. 7,195,381, and Ser. No. 09/793,002, filed Feb. 26, 2001, now U.S. Pat. No. 6,690,268, which are hereby incorporated herein by reference), the presence of (and the image or information displayed by) the video display screen or information display is only principally visible to the driver (who views the display through the transflective mirror reflective element) when the information display element is powered so as to transmit light from the rear of the mirror reflective element through the transflective mirror reflector to reach the eyes of the driver. Preferably, a single high-intensity power LED, such as a white light emitting LED comprising a Luxeon™ Star Power LXHL-MW1A white light emitting LED having (at a 25 degree Celsius junction temperature) a minimum forward voltage of 2.55 volts, a typical forward voltage of 3.42 volts, a maximum forward voltage of 3.99 volts, a dynamic resistance of 1 ohm and a forward current of 350 milliamps, and as available from Lumileds Lighting LLC of San Jose, Calif., is used as a backlight for the TFT LCD video screen. Alternately, a plurality of such single high-intensity power LEDs (such as an array of two or of four such power LEDs) may be placed behind the TFT LCD video screen so that the intense white light projected from the individual single high-intensity power LEDs passes through the TFT LCD element and through the transflective electrochromic element, preferably producing a display intensity as viewed by the driver of at least about 200 candelas/sq. meter; more preferably at least about 300 candelas/sq. meter; and most preferably at least about 400 candelas/sq. meter. Alternately, cold cathode vacuum fluorescent sources/tubes can be used for backlighting and optionally can be used in conjunction with LED backlighting.
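

As rough, back-of-envelope arithmetic on the figures quoted above (not a datasheet calculation), the drive power of such a backlight can be estimated as forward voltage times forward current:

    # Typical forward voltage and forward current quoted above for the
    # Luxeon Star Power LXHL-MW1A backlight LED (25 degree C junction).
    V_FORWARD_TYP = 3.42    # volts
    I_FORWARD = 0.350       # amps (350 milliamps)

    power_per_led = V_FORWARD_TYP * I_FORWARD
    print(f"per LED:   {power_per_led:.2f} W")       # ~1.20 W

    # The text also contemplates arrays of two or four such LEDs
    # behind the TFT LCD screen.
    for n in (2, 4):
        print(f"array of {n}: {n * power_per_led:.2f} W")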


Optionally, and in accordance with incorporated U.S. patent application Ser. No. 09/793,002, now U.S. Pat. No. 6,690,268, a reverse-aid rearward viewing camera can be mounted to the rear of the vehicle in order to display to the driver, upon selecting a reverse gear, a field of view immediately rearward of the vehicle so as to assist the driver in reversing the vehicle. Such vehicle reverse-aid camera systems are disclosed in U.S. patent application Ser. No. 09/361,814, filed Jul. 27, 1999, now U.S. Pat. No. 6,201,642, and in U.S. patent application Ser. No. 09/199,907, filed Nov. 25, 1998, now U.S. Pat. No. 6,717,610, and in U.S. patent application Ser. No. 09/313,139, filed May 17, 1999, now U.S. Pat. No. 6,222,447; Ser. No. 09/776,625, filed Feb. 5, 2001, now U.S. Pat. No. 6,611,202.


Note that other display locations are possible for display of the video image or information display, such as a map and/or a text message comprising driving instructions, to the driver or occupant of the vehicle. For example, a video image may be displayed on an LCD video screen of a flip-down display (such as is disclosed in U.S. patent application, Ser. No. 09/793,002, filed Feb. 26, 2001, now U.S. Pat. No. 6,690,268, incorporated above), or on a video screen incorporated into the rearview mirror assembly, such as the type disclosed in U.S. provisional applications, Ser. No. 60/439,626, filed Jan. 13, 2003; Ser. No. 60/489,812, filed Jul. 24, 2003; and Ser. No. 60/492,225, filed Aug. 1, 2003, which are hereby incorporated herein by reference. Optionally, for example, a video display located in the front instrument panel can be used, or a video display located in an overhead console (such as an overhead accessory module or system as described in U.S. provisional applications, Ser. No. 60/489,812, filed Jul. 24, 2003; and Ser. No. 60/492,225, filed Aug. 1, 2003, which are hereby incorporated herein by reference) can be used, without affecting the scope of the present invention.


Alternately, as outlined above, a local area map may be downloaded to the control from the external service provider or service center and the control may be operable (such as by using the principles disclosed in U.S. patent applications, Ser. No. 10/054,633, filed Jan. 22, 2002, now U.S. Pat. No. 7,195,381, and Ser. No. 09/793,002, filed Feb. 26, 2001, now U.S. Pat. No. 6,690,268, which are hereby incorporated herein by reference) to feed such a map to the likes of a thin film transistor (TFT) liquid crystal (LC) video screen or other type of video screen or display element or display system, and with the instructions being conveyed by alphanumeric characters and/or indicia or the like and/or by highlighting portions of the map display. Such highlighting may be controlled by the in-vehicle control or control unit based on actual, current vehicle position information as determined by the in-vehicle or vehicle-based global positioning system. Thus, the vehicle owner need not buy into or have in the vehicle a full map of all areas to which the vehicle may be driven (such as regional maps or national maps or the like).


Alternately, a low cost, multi-pixel display (such as the type disclosed in U.S. provisional application, Ser. No. 60/373,932, filed Apr. 19, 2002, and in U.S. patent application, Ser. No. 10/418,486, filed Apr. 18, 2003, now U.S. Pat. No. 7,005,974, which are hereby incorporated herein by reference), such as a low cost multi-pixel vacuum fluorescent display, a low cost multi-pixel organic light emitting diode (OLED), a low cost multi-pixel field emission display, or any other or similar multi-pixel light emitting display or the like may be utilized, without affecting the scope of the present invention. The local area map, with the instructions iconistically displayed thereon, may be displayed on such a multi-pixel display or the like in response to the control receiving an input or download from the telematics system and/or the in-vehicle or vehicle-based global positioning system.


As disclosed in U.S. patent application Ser. No. 10/054,633, filed Jan. 22, 2002, now U.S. Pat. No. 7,195,381, incorporated above, suitable LEDs for a light source unit include a white light emitting light emitting diode, such as described in U.S. provisional applications, Ser. No. 60/263,680, filed Jan. 23, 2001; Ser. No. 60/243,986, filed Oct. 27, 2000; Ser. No. 60/238,483, filed Oct. 6, 2000; Ser. No. 60/237,077, filed Sep. 30, 2000; Ser. No. 60/234,412, filed Jul. 21, 2000; Ser. No. 60/218,336, filed Jul. 14, 2000; and Ser. No. 60/186,520, filed Mar. 2, 2000, and U.S. utility applications entitled VIDEO MIRROR SYSTEMS INCORPORATING AN ACCESSORY MODULE, filed Feb. 26, 2001, now U.S. Pat. No. 6,690,268, and REARVIEW MIRROR ASSEMBLY WITH UTILITY FUNCTIONS, Ser. No. 09/585,379, filed Jun. 1, 2000, including a thermostable LED, which emits the same color light even when the temperature varies. Thus, regardless of the interior or exterior temperature of the vehicle and/or of the accessory equipped with the thermostable non-incandescent light emitting diode source, the same color light is radiated. Such a thermostable white light emitting non-incandescent light emitting diode source can incorporate a trio of red, green, and blue fluorescent materials that together create white light when struck by 380 nm wavelength light from a gallium-nitride LED, and is available from Toyoda Gosei Co. and Toshiba Corp of Nagoya, Japan.


One suitable white light emitting diode (LED) that is thermostable is available from Toshiba America Electronic Components, Inc. of Irvine, Calif., Part No.: TLWA1100. The thermostable white-light LED integrates multiple colored phosphors and a short peak wavelength (preferably, approximately 380 nanometers (nm) in peak spectral output intensity) light-emitting diode junction in a phosphor-mixed transparent resin package to achieve a high luminosity, low power consumption light source. Such thermostable LEDs adopt a technological approach differing from that used in conventional LEDs. Light emission in the visible wavelength band is controlled by excited phosphors, not by using temperature changes in the LED to achieve a change in color output. The fact that the LED emission does not directly determine the color brings advantages in overall controllability and wavelength stability. Incorporated in vehicular accessories, such as those disclosed above, the thermostable diode achieves improved tone reproduction and enhanced color durability during temperature shifts. Such thermostable LEDs utilize a short wavelength light source by reducing the indium in an indium-doped GaN emission layer. This excites red, green, and blue (RGB) phosphors in the transparent resin of the device package to output white light. The RGB balance of the phosphor layer determines the output color, and different colored output can be achieved through modified phosphor balance. The emission light from the LED itself does not directly contribute to the white color. The phosphors used in the new LED offer excellent performance in terms of operating temperature range and color yield. Specifications of such thermostable white LEDs include a compact package (3.2×2.8 millimeters), provided in a Surface Mount Device (SMD). Luminosity is typically about 100 millicandela (mcd) at 20 mA and luminous flux/electrical watt is about 4.5-5.0 lumens per watt at 20 mA. Correlated color temperature is about 6,500-9,000K. Operating temperature is about −40° to 100° Celsius and storage temperature is about −40° to 100° Celsius.


Also, high brightness LEDs are available from Uniroyal Technology Corporation of Sarasota, Fla. under the tradename POWER-Ga(I)™ High Brightness InGaN LEDs, which comprise high brightness, high luminous efficiency short wavelength LEDs utilizing a power ring n-Contact and a centralized p-Pad design feature. 450 nm and 470 nm high brightness blue LED die products are available that have a minimum power output of 2 milliwatts in die form which, when conventionally packaged, can result in packaged lamp power levels between 4 and 5 milliwatts. Such LEDs combine indium gallium nitride (InGaN) materials on sapphire substrates in order to produce higher efficiencies. GaN LEDs can be produced by MOCVD epitaxy on sapphire (aluminum oxide) or can be produced on silicon carbide substrates. Ultraviolet light emitting LEDs can be produced.


Depending on the application, LEDs emitting a colored light can be used, such as high intensity amber and reddish orange light emitting diode sources, such as solid state light emitting diode LED sources utilizing double heterojunction AlGaAs/GaAs Material Technology, such as very high intensity red LED lamps (5 mm) HLMP-4100/4101 available from Hewlett Packard Corporation of Palo Alto, Calif., or transparent substrate aluminum indium gallium phosphide (AlInGaP) Material Technology, commercially available from Hewlett Packard Corporation of Palo Alto, Calif. Also, blue can be used, or a combination of individual different colored diodes, such as red, blue, white, green, amber, orange, etc., can be used with color mixing thereof to form a desired color or to deliver a desired local intensity of illumination as noted above. Other suitable white emitting light-emitting diodes are available from Nichia Chemical Industries of Tokyo, Japan and from Cree Research Inc., of Durham, N.C. For example, a white light emitting diode is available from Nichia Chemical Industries of Tokyo, Japan under Model Nos. NSPW 300AS, NSPW 500S, NSPW 310AS, NSPW 315AS, NSPW 510S, NSPW 515S and NSPW WF50S, such as is disclosed in U.S. patent application Ser. No. 09/448,700, filed Nov. 24, 1999, now U.S. Pat. No. 6,329,925, and in U.S. patent application Ser. No. 09/244,726, filed Feb. 5, 1999, now U.S. Pat. No. 6,172,613. A variety of constructions are used including GaAsP on GaP substrate, gallium aluminum phosphide, indium gallium nitride, and GaN on a SiC substrate. Optionally, a plurality of LEDs, such as a cluster of two, three, four, six, eight or the like LEDs (each of the same color or the cluster comprising different colored LEDs), can be used to target and illuminate a local area for higher illumination at that area, such as may be useful in a map light or as a reading light or as an interior light or as an illumination source for an interior vehicle cabin-mounted and monitoring camera (most preferably illuminating the target area with white light). Such a cluster of high efficiency LEDs can be mounted at the mirror mount so as to project an intense pattern of light generally downwardly into the vehicle cabin for purposes of map reading, general illumination, courtesy illumination and the like. Also, a cluster of LEDs, preferably including at least one white emitting LED and/or at least one blue emitting LED, can be mounted in a roof portion, side portion or any other portion of the vehicle cabin to furnish dome lighting, rail lighting, compartment lighting and the like. Use of white emitting LEDs is disclosed in U.S. Pat. No. 6,152,590, entitled LIGHTING DEVICE FOR MOTOR VEHICLES, filed Feb. 12, 1999, by Peter Fuerst and Harald Buchalla of Donnelly Hohe GmbH & Co. KG.


Other suitable LEDs may include high-intensity, high current capability light emitting diodes such as the high-flux power LEDs available from LumiLeds Lighting, U.S., LLC of San Jose, Calif. under the SunPower Series High-Flux LED tradename. Such high-intensity power LEDs comprise a power package allowing high current operation of at least about 100 milliamps forward current, more preferably at least about 250 milliamps forward current, and most preferably at least about 350 milliamps forward current through a single LED. Such high current/high-intensity power LEDs (as high as 500 mA or more current possible, and especially with use of heat sinks) are capable of delivering a luminous efficiency of at least about 1 lumen per watt, more preferably at least about 3 lumens per watt, and most preferably at least about 5 lumens per watt. Such high intensity power LEDs are available in blue, green, blue-green, red, amber, yellow and white light emitting forms, as well as other colors. Such high-intensity LEDs can provide a wide-angle radiation pattern, such as an about 30 degree to an about 160 degree cone. Such high-intensity power LEDs, when normally operating, emit a luminous flux of at least about 1 lumen, more preferably at least about 5 lumens and most preferably at least about 10 lumens. For certain applications such as ground illumination from lighted exterior mirror assemblies and interior mirror map lights, such high-intensity power LEDs preferably conduct at least about 250 milliamps forward current when operated at a voltage in the about 2 volts to about 5 volts range, and emit a luminous flux of at least about 10 lumens, more preferably at least about 15 lumens and most preferably at least about 25 lumens, preferably emitting white light.


For example, the mirror assembly may include circuitry for mirror mounted video cameras, which are used to visually detect the presence of moisture on the windshield and actuate windshield wipers accordingly, such as described in U.S. patent application Ser. No. 08/621,863, filed Mar. 25, 1996, now U.S. Pat. No. 5,796,094, or mirror mounted cameras for vehicle internal cabin monitoring disclosed in U.S. Pat. Nos. 5,877,897 and 5,760,962, both commonly assigned, or mirror mounted cameras for rear vision systems as disclosed in U.S. Pat. Nos. 5,959,367; 5,929,786; 5,949,331; 5,914,815; 5,786,772; 5,798,575; 5,670,935; and U.S. patent applications, Ser. No. 09/304,201, filed May 3, 1999, now U.S. Pat. No. 6,198,409; Ser. No. 09/375,315, filed Aug. 16, 1999, now U.S. Pat. No. 6,175,164; Ser. No. 09/199,907 filed Nov. 25, 1998, now U.S. Pat. No. 6,717,610; Ser. No. 09/361,814, filed Jul. 27, 1999, now U.S. Pat. No. 6,201,642; Ser. No. 09/372,915, filed Aug. 12, 1999, now U.S. Pat. No. 6,396,397; Ser. No. 09/300,201, filed May 3, 1999; and Ser. No. 09/313,139, filed May 17, 1999, now U.S. Pat. No. 6,222,447, which are all commonly assigned. Additional features and accessories that may be incorporated into the mirror assembly include: a trip computer, an intrusion detector, displays indicating, for example passenger air bag status, including information displays such as a PSIR (Passenger Side Inflatable Restraint) display, an SIR (Side-Airbag Inflatable Restraint), compass/temperature display, a tire pressure status display or other desirable displays and the like, such as those described in U.S. patent application Ser. No. 09/244,726, filed Feb. 5, 1999, now U.S. Pat. No. 6,172,613. For example, a rearview mirror assembly (or an accessory module assembly such as a windshield electronics module assembly), may include: antennas, including GPS or cellular phone antennas, such as disclosed in U.S. Pat. No. 5,971,552; a communication module, such as disclosed in U.S. Pat. No. 5,798,688; displays such as shown in U.S. Pat. No. 5,530,240 or in U.S. Application Ser. No. 09/244,726, filed Feb. 5, 1999, now U.S. Pat. No. 6,172,613; blind spot detection systems, such as disclosed in U.S. Pat. Nos. 5,929,786 or 5,786,772; transmitters and/or receivers, such as garage door openers, a digital network, such as described in U.S. Pat. No. 5,798,575; a high/low head lamp controller, such as disclosed in U.S. Pat. No. 5,715,093; a memory mirror system, such as disclosed in U.S. Pat. No. 5,796,176; a hands-free phone attachment, a video device for internal cabin surveillance and/or video telephone function, such as disclosed in U.S. Pat. Nos. 5,760,962 and 5,877,897 and application Ser. No. 09/433,467, now U.S. Pat. No. 6,326,613; a remote keyless entry receiver; microphones and/or speakers, such as disclosed in U.S. patent applications Ser. No. 09/361,814, filed Jul. 27, 1999, now U.S. Pat. No. 6,201,642, and 09/199,907, filed Nov. 25, 1998, now U.S. Pat. No. 6,717,610; a compass, such as disclosed in U.S. Pat. No. 5,924,212; seat occupancy detector; a trip computer; an ONSTAR System or the like, with all of these referenced patents and applications being commonly assigned.


An interior rearview mirror assembly may also include a compass/temperature display, a clock display, a fuel level display, and other vehicle status and information displays. Furthermore, information displays may be incorporated which provide information to the driver or occupants of the vehicle, such as warnings relating to the status of the passenger airbag. In application Ser. No. 09/244,726, filed Feb. 5, 1999, now U.S. Pat. No. 6,172,613, information displays are provided which include information relating to vehicle or engine status, warning information, and the like, such as information relating to oil pressure, fuel remaining, time, temperature, compass headings for vehicle direction, and the like. The passenger side air bag on/off signal may be derived from various types of seat occupancy detectors, such as by video surveillance of the passenger seat as disclosed in PCT Application No. PCT/US94/01954, filed Feb. 25, 1994, published Sep. 1, 1994 as PCT Publication No. WO/1994/019212, or by ultrasonic or sonar detection, infrared sensing, pyrodetection, weight detection, or the like. Alternately, enablement/disablement of the passenger side air bag operation can be controlled manually, such as through a user-operated switch operated with the ignition key of the vehicle in which the mirror assembly is mounted as described in U.S. patent application Ser. No. 08/799,734, filed Feb. 12, 1997, now U.S. Pat. No. 5,786,772. In addition, the interior rearview mirror assemblies may include electronic and electric devices, including a blind spot detection system, such as the type disclosed in U.S. patent application Ser. No. 08/799,734, filed Feb. 12, 1997, now U.S. Pat. No. 5,786,772, or rain sensor systems, for example rain sensor systems which include windshield contacting rain sensors, such as described in U.S. Pat. No. 4,973,844, or non-windshield contacting rain sensors, such as described in PCT International Application PCT/US94/05093, published as WO 94/27262 on Nov. 24, 1994.


In addition, the mirror assembly (or an accessory module assembly such as a windshield electronics module assembly) may incorporate one or more video screens or video display assemblies, such as disclosed in U.S. provisional applications, Ser. No. 60/263,680, filed Jan. 23, 2001; Ser. No. 60/243,986, filed Oct. 27, 2000; Ser. No. 60/238,483, filed Oct. 6, 2000; Ser. No. 60/237,077, filed Sep. 29, 2000; Ser. No. 60/234,412, filed Sep. 21, 2000; Ser. No. 60/218,336, filed Jul. 14, 2000; and Ser. No. 60/186,520, filed Mar. 2, 2000, all commonly assigned.


The video screen may be used for a baby minder system, such as the vehicle interior monitoring system described in U.S. Pat. Nos. 5,877,897 and 5,760,962 or the rear vision system described in U.S. patent applications, Ser. No. 09/361,814, filed Jul. 27, 1999, now U.S. Pat. No. 6,201,642, and Ser. No. 09/199,907, filed Nov. 25, 1998, now U.S. Pat. No. 6,717,610, and Ser. No. 09/433,467, filed Nov. 4, 1999, now U.S. Pat. No. 6,326,613. An interior surveillance system permits the driver of the vehicle to observe the behavior or activities of babies or children or other passengers seated in the rear seat. This is especially advantageous when the child or baby is in a rearward facing car seat, where the child or baby would ordinarily not be visible to the driver while driving.


For example, a camera, such as a CMOS or CCD camera, can be mounted to view the rear seat area of the vehicle so that the driver can view what is occurring, such as in a rear seat mounted baby seat or with a rear seat passenger, such as children. Preferably, to enable viewing of the rear seat occupant or occupants even by night, the target field of view of the camera may be illuminated in a manner that provides adequate visibility for the camera to discern what is occurring in the rear seat in a darkened vehicle cabin, but not illuminating in a manner that causes glare, distraction, and/or discomfort to any vehicle occupants, including the driver and/or rear seat passengers. For example, such rear seat monitoring camera illumination is preferably achieved using directed low level non-incandescent light sources, such as light emitting diodes (LEDs), organic light emitting material, electroluminescent sources (both organic and inorganic), and the like, and most preferably such non-incandescent sources are low power and are directed low intensity sources, such as described in U.S. Pat. No. 5,938,321 and application Ser. No. 09/287,926, filed Apr. 7, 1999, now U.S. Pat. No. 6,139,172.


The baby minder camera may be mounted as a part of the rearview mirror assembly and, most preferably, may be mounted as a part of a roof area of the interior vehicle cabin, such as a header, including a front header of a roof or a rear header or a header console of a roof. It may be desirable to mount a baby minder camera to the rear header of a roof when it is desirable to view rear facing child support seats.


Most preferably, a plurality of at least two, more preferably at least four, and most preferably at least six LEDs (or similar low level, directed, low-current light sources such as electroluminescent sources and organic light emitting sources) are mounted with a camera (preferably, such as to form a ring around the camera), with the light projected from the individual LEDs directed to be coincident with the camera field of view and to illuminate the target area desired to be viewed. The LEDs, being directed low level sources, will not glare or cause discomfort to occupants when illuminated. Further, camera illumination sources can be illuminated whenever the ignition switch is on to operate the vehicle or at least when the ignition switch is placed in an “accessory on” position so that both the camera and illumination lights are operating on vehicle battery power even when parked. Alternately, the illumination lights can be operational only when the baby minder camera is selected to be operational.
While it is preferred to use non-incandescent lights, incandescent light sources can be used, most preferably high intensity, low current incandescent light sources. For example, when the camera is activated to view the rear seat or to view a baby seat or the like, the dome light in the vehicle, which typically comprises an incandescent light source, can illuminate so that the rear seat area is illuminated to assist visibility for the camera. A circuit or other device can be provided that illuminates the dome light (or a similar rear seat-illuminating interior light source, such as a rail lamp or the like) whenever the camera is selected to view the rear seat. Optionally, the dome light or similar interior light within the interior cabin, once caused to illuminate when the camera is activated, can cease to illuminate after a determined time interval (such as five seconds or ten seconds or longer) under the control of a timeout circuit or device. By providing a timeout, the driver can selectively view the status of passengers in the rear seat of the vehicle by selecting a baby-minder camera or similar rear seat viewing function (such as by voice command, user-operated switch or the like). Upon selection of the camera function, whatever is being viewed on the video screen in the vehicle may be interrupted (or superimposed over or the like), the interior light in the cabin (such as the dome light) will illuminate, a timeout will initiate, and the driver (or other front-seat occupant) can view the rear seat status for the duration of the timeout. Once the timeout elapses, the interior light ceases to illuminate, and preferably, the camera ceases to be activated and the video screen reverts to its pre-event status.
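

The dome-light timeout behavior described above might be modeled as a small state machine; the sketch below is a behavioral illustration only, and the 10-second default and method names are assumptions:

    import time

    class BabyMinderView:
        """Models the camera/dome-light/timeout interplay described above."""

        def __init__(self, timeout_s=10.0):   # assumed default interval
            self.timeout_s = timeout_s
            self.camera_on = False
            self.dome_light_on = False
            self._started = None

        def select_camera(self):
            # Driver selects the baby-minder view (voice command or switch):
            # interrupt the screen, light the cabin, start the timeout.
            self.camera_on = True
            self.dome_light_on = True
            self._started = time.monotonic()

        def tick(self):
            # Called periodically; on timeout the light and camera shut off
            # and the video screen reverts to its pre-event status.
            started = self._started
            if started is not None and time.monotonic() - started >= self.timeout_s:
                self.camera_on = False
                self.dome_light_on = False
                self._started = None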


Light emitting sources, such as light emitting diodes, can be used to provide lighting for any camera that feeds an image to the mirror-mounted video screen (or feeds an image to an accessory module assembly such as a windshield electronics module assembly). Light emitting diodes providing illumination in various colors, such as white, amber, yellow, green, orange, red, blue, combinations thereof, or the like, may be used. Alternately, other light emitting elements can be used to provide illumination for any camera that feeds an image to the mirror-mounted video screen, such as incandescent sources, fluorescent sources, including cold-cathode fluorescent sources, and electroluminescent sources (both organic and inorganic), such as described in U.S. Pat. No. 5,938,321, and application Ser. No. 09/287,926, filed Apr. 7, 1999, now U.S. Pat. No. 6,139,172, and such as is disclosed in U.S. patent application Ser. No. 09/466,010, filed Dec. 17, 1999, now U.S. Pat. No. 6,420,975, and in U.S. patent application Ser. No. 09/449,121, filed Nov. 24, 1999, now U.S. Pat. No. 6,428,172, and U.S. patent application Ser. No. 09/585,379, filed Jun. 1, 2000, entitled REARVIEW MIRROR ASSEMBLY WITH UTILITY FUNCTIONS.


The mirror-mounted video screen can display the output from a rear vision back-up camera, such as disclosed in applications Ser. No. 09/199,907, filed Nov. 25, 1998, now U.S. Pat. No. 6,717,610, and Ser. No. 09/361,814, filed Jul. 27, 1999, now U.S. Pat. No. 6,201,642, commonly assigned, along with vehicle instrument status information, such as information relating to fuel gauge levels, speed, climate control setting, GPS directional instructions, tire pressure status, instrument and vehicle function status, and the like.


Also, and especially for a mirror assembly incorporating a video screen that is incorporated as part of an interior electro-optic (such as electrochromic) mirror assembly, a common circuit board and/or common electronic components and sub-circuitry can be utilized to control the electro-optic activity of the reflective element and to control the image displayed by the video screen, thus achieving economy of design and function, and for operating other electrical or electronic functions supported in the interior rearview assembly. For example, a circuit board of the interior mirror assembly may support, for example, light emitting diodes (LEDs) for illuminating indicia on display elements provided on a chin or eyebrow portion of the bezel region of the interior mirror casing. Reference is made to U.S. Pat. Nos. 5,671,996 and 5,820,245. It should be understood that one or more of these buttons or displays may be located elsewhere on the mirror assembly or separately in a module, for example of the type disclosed in U.S. patent application Ser. No. 09/244,726, now U.S. Pat. No. 6,172,613, and may comprise the touch-sensitive displays as disclosed in U.S. patent application Ser. No. 60/192,721, filed Mar. 27, 2000. Note that button inputs can be provided along the lower bezel region of the interior mirror assembly such that, when actuated, a display appears within the mirror reflector region of the mirror reflective element. Preferably, the display appears local to the physical location of the particular button accessed by the driver or vehicle occupant (typically, immediately above it) so that the person accessing the mirror associates the appearance and information of the display called up by that individual button with the user's actuation of the button. Multiple actuations of that button can cause the display to scroll through various menu items/data displays, allowing the user to access a wide range of information. The button and associated circuitry can be adapted to recognize when selection of a particular menu item is desired (such as by holding down a particular input button for longer than a prescribed period, for example longer than about 1 second or longer than about 2 seconds or the like; if the button is held down for less than the prescribed period, the display scrolls to the next menu item). Preferably, whatever information is being displayed is displayed by a substantially reflecting and substantially transmitting reflective/transmissive reflector of the mirror reflective element, such as the display on demand constructions disclosed in U.S. Pat. No. 5,724,187. Also, these features can be provided for a non-mirror video display.
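The press/hold discrimination just described reduces to a simple timing rule; the following is a minimal sketch, assuming a 1 second hold threshold and an invented list of menu items (the MenuButton class is hypothetical, not a disclosed circuit).

HOLD_SELECT_S = 1.0  # the text contemplates about 1 second or about 2 seconds

MENU_ITEMS = ["compass", "outside temperature", "tire pressure", "fuel range"]  # illustrative

class MenuButton:
    """A short press scrolls to the next menu item; holding the button longer
    than HOLD_SELECT_S selects the item currently shown."""

    def __init__(self):
        self.index = 0
        self.pressed_at = None

    def press(self, now):
        self.pressed_at = now

    def release(self, now):
        if self.pressed_at is None:
            return ("show", MENU_ITEMS[self.index])
        held = now - self.pressed_at
        self.pressed_at = None
        if held >= HOLD_SELECT_S:
            return ("select", MENU_ITEMS[self.index])    # long hold: select
        self.index = (self.index + 1) % len(MENU_ITEMS)  # short press: scroll
        return ("show", MENU_ITEMS[self.index])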


Optionally, one or more of the cameras of the vehicle may be equipped with infrared LED light emitting sources, such as are disclosed in U.S. patent application Ser. No. 09/025,712, filed Feb. 18, 1998, now U.S. Pat. No. 6,087,953, and U.S. patent application Ser. No. 09/244,726, filed Feb. 5, 1999, now U.S. Pat. No. 6,172,613, and in U.S. patent application Ser. No. 09/561,023, filed Apr. 28, 2000, now U.S. Pat. No. 6,553,308, and in U.S. patent application Ser. No. 09/466,010, filed Dec. 17, 1999, now U.S. Pat. No. 6,420,975, in order to light up an area in or around the vehicle when it is dark. When an intrusion detector such as a motion detector (preferably a pyrodetector-based intrusion detection system such as is disclosed in U.S. patent application Ser. No. 08/901,929, filed Jul. 29, 1997, now U.S. Pat. No. 6,166,625, and U.S. patent application Ser. No. 09/516,831, filed Mar. 1, 2000, now U.S. Pat. No. 6,390,529, and U.S. patent application Ser. No. 09/275,565, filed Mar. 24, 1999, now U.S. Pat. No. 6,086,131) is triggered by, for example, someone attempting to break into the vehicle or steal the vehicle, the vehicle-based security system triggers images captured by the vehicular camera(s) to be downloaded to the telemetry system which then forwards by wireless telecommunication (such as by radio frequency or by microwave transmission) the images (or a security alert signal derived from an in-vehicle image analysis of the captured images) to a security service, a mobile device in the possession of the driver of the vehicle when he/she is remote from the parked vehicle (such as a key-fob or a Palm Pilot™ PDA), the cell phone of the vehicle owner, the home computer of the vehicle owner or the police or the like that is remote and distant from the vehicle where the security condition is being detected. Preferably, the in-vehicle camera-based security system silently and secretly records the events occurring in and/or around the vehicle while it is operating (such as when idling in traffic or moving on a highway or stopped at a traffic light) and provides a “black box” recording of activities in the interior of the vehicle or exterior of the vehicle. For example, the security system may be used to record or document vehicle status including speed, brake activation, vehicle control status signals (for example, whether the turn signal has been actuated, vehicle traction, tire pressures, yaw and roll, geographic location, time and date) and other vehicle information as well as record visual images detected by the cameras. In an accident, such vehicle performance/function data in combination with a visual recording of the interior and/or exterior vehicular scene (and optionally, a microphone recording of sounds/voices interior and/or exterior to the vehicle) can help insurance and police investigators establish the causes and conditions of an accident. The camera-based vehicle performance/function recording system of the vehicle preferably records data onto a recording medium (such as onto electronic memory or onto digital recording tape) that is rugged and protected from the consequences of an accident so as to survive the impact forces, shocks, fires and other events possible in an automobile accident. Preferably, any electronic memory utilized is non-volatile memory that is non-erasing in the event of electrical power loss in the vehicle. 
For example, the camera-based in-vehicle security system may include an electronic memory recording medium and/or a video tape (preferably digital) recording medium so that a pre-determined period of operation of the vehicle, such as up to the last about 1 minute of vehicle operation, more preferably up to the last about 5 minutes of vehicle operation, most preferably up to the last about 15 minutes of vehicle operation, or even greater, is continuously recorded (such as on a closed-loop tape or electronic recording that continually records the most recent events inside and/or outside the road transportation vehicle). The camera-based in-vehicle security system can maintain the stored images and/or vehicle data in the vehicle for downloading when desired, such as after an accident. Alternately, the camera-based in-vehicle security system can transmit the images and/or vehicle data by wireless communication to a remote receiver such as a receiver distant and remote from the vehicle (such as at a security system or a telematic service such as ONSTAR™ or RESCU™ or at the vehicle owner's home or at a car rental center). This can occur continuously while the vehicle is being operated, so that in the event an accident occurs, retrieval and analysis of the recorded information is not impeded such as by damage or even loss of the vehicle in the accident. Also, the remote receiver of the information can alert authorities (such as a police, fire and/or ambulance service) of an accident immediately when such accident occurs (and thus potentially speed aid to any accident victims and/or dispatch the correct medical aid for the type of accident/injuries recorded by the camera(s)). The recorded information can include the gear in which the driver is operating the vehicle, the activation of the brakes, the speed at which the driver is traveling, the rate of acceleration/deceleration, the time, date and geographic location, and the atmospheric conditions including lighting conditions; basically, the system can record what happened during a collision, whereby the system provides an information recordation function. For example, when the system is used to record an accident when the vehicle is operating, the cameras may record scenes, vehicle instrument/function status, or the like, which are kept on a tape or non-volatile electronic, solid-state memory, for example a continuous loop tape or electronic memory. Alternately, this information can be continuously transmitted or downloaded. For example, the information can be downloaded in response to a selected stimulus or trigger, such as when the brakes are activated, the air bag or bags are activated, when the horn is operated, or when the car decelerates, or the like. For example, the system may use accelerometers such as disclosed in U.S. patent application Ser. No. 09/440,497, filed Nov. 15, 1999, now U.S. Pat. No. 6,411,204, and, furthermore, may be combined with the deceleration based anti-collision safety light control system described in the aforementioned application. This information recordation function can be used, as noted above, to record both interior activities and exterior activities and, therefore, can be used as noted above as a security system as well. When the system is used as a security system, the telemetry system may contact the security base, which in turn can scroll through the camera images to determine whether the alarm is a true or false alarm.
In this manner, various existing systems that are provided in the vehicle may be optionally used individually to provide one or more functions or collectively to provide even further or enhanced functions.
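As a minimal sketch of the closed-loop recording described above (the LoopRecorder name, the frame rate and the retention window are assumptions for illustration):

from collections import deque

FRAME_RATE_HZ = 10   # assumed capture rate
RETAIN_S = 15 * 60   # keep about the last 15 minutes, one of the figures given above

class LoopRecorder:
    """Continuously retains only the most recent RETAIN_S seconds of frames and
    vehicle data, discarding older material, like a closed-loop tape."""

    def __init__(self):
        self.buffer = deque(maxlen=FRAME_RATE_HZ * RETAIN_S)

    def record(self, frame, vehicle_data):
        # vehicle_data might carry speed, brake status, gear, GPS fix, etc.
        self.buffer.append((frame, vehicle_data))

    def dump_on_trigger(self):
        # Invoked on a trigger such as airbag deployment, brake or horn
        # activation, or hard deceleration; returns a snapshot for download
        # or wireless transmission to a remote receiver.
        return list(self.buffer)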


Examples of camera locations where vehicular cameras included in a vehicular camera-based accident recording system can be located include interior and exterior mirror assembly locations, roof areas such as a headliner or header console, front, side and rear exterior body areas such as front grilles, rear doors/trunk areas, side doors, side panels, door handles, CHMSL units, interior body pillars (such as an A-, B- or C-interior pillar) and seat backs, and such as are disclosed in U.S. provisional applications, Ser. No. 60/187,961, filed Mar. 9, 2000; Ser. No. 60/192,721, filed Mar. 27, 2000; and Ser. No. 60/186,520, filed Mar. 1, 2000; and in U.S. Pat. Nos. 5,877,897; 5,760,962; 5,959,367; 5,929,786; 5,949,331; 5,914,815; 5,786,772; 5,798,575; and 5,670,935; and U.S. patent applications, Ser. No. 09/304,201, filed May 3, 1999, now U.S. Pat. No. 6,124,886; Ser. No. 09/375,315, filed Aug. 16, 1999, now U.S. Pat. No. 6,175,164; Ser. No. 09/199,907, filed Nov. 25, 1998, now U.S. Pat. No. 6,717,610; Ser. No. 09/361,814, filed Jul. 27, 1999, now U.S. Pat. No. 6,201,642; Ser. No. 09/372,915, filed Aug. 12, 1999, now U.S. Pat. No. 6,396,397; Ser. No. 09/304,201, filed May 3, 1999, now U.S. Pat. No. 6,198,409; and Ser. No. 09/313,139, filed May 17, 1999, now U.S. Pat. No. 6,222,447, which are all commonly assigned. For example, a camera, preferably a solid-state CMOS video camera, can be located within the interior cabin of the vehicle (and preferably located at, on or within the interior rearview mirror assembly or at or in an A-pillar), and adapted to capture a surveillance image of the front and rear occupants of the vehicle. In this regard, locating the interior cabin surveillance camera at, on or within the interior rearview mirror assembly is preferred as this location provides the camera with a good rearward field of view that captures an image of all front and rear seat occupants. Preferably, the vehicle is also equipped with the in-vehicle portion of a wireless communication telematic system such as an ONSTAR™ or RESCU™ system, and the geographic location of the vehicle can also be established by a navigational system, such as an in-vehicle GPS system. Images of the interior vehicle cabin (including images of the various vehicle occupants) can be captured by the in-vehicle image capture device, preferably an interior mirror-mounted video camera, and this information, in conjunction with the geographic location of the vehicle provided by a position locator such as a GPS system, along with various vehicle information/function data such as the state of activation of any air bag in the vehicle, can be communicated by wireless telecommunication to an external service remote from the vehicle such as an ONSTAR™ or RESCU™ service. Such communication can be periodic (such as when the ignition is first turned on during a particular trip, or initially when the ignition is first turned on and intermittently thereafter, such as every about 1 minute or so) or continuous during operation of the vehicle with its engine turned on. 
Should the receiver at the remote service be alerted that an accident has occurred (such as by receiving from the vehicle via wireless telematic communication an accident alert signal indicative that an air bag has activated), the remote receiver (which can be an ONSTAR™ operator or an automatic computer-based image analyzer or an emergency service such as a “911” service provider) can count, via the video image relayed from the vehicle, the number of occupants in the vehicle and can accordingly alert emergency services as to the location of the accident and the number of victims involved (thus ensuring that the appropriate number of, for example, ambulances are dispatched to deal with the actual number of potential victims in the vehicle at the time of the crash). Optionally, the owner/driver of the vehicle can register/notify the remote telematic service of any special medical needs, blood types and the like of the likely driver(s) and/or likely occupants (such as family members), along with any next-of-kin information, insurance coverage and the like so that, in the event an ONSTAR™ or RESCU™ telematic service or telematically-linked “911” emergency response service determines an accident has occurred, medical and emergency relief specific to the likely/actual occupants of the vehicle can be dispatched. Likewise, should an in-vehicle fire be detected, such as by visual determination via image analysis of video images telematically transmitted and/or by an in-vehicle temperature probe transmitting data telematically, then the fire brigade can be automatically sent to the crash site and/or an in-vehicle fire extinguisher can be activated to put out any fire (either by remote, wireless activation by the telematic service of the in-vehicle fire extinguisher or by automatic in-vehicle image analysis of the image recorded by an interior or exterior camera of the vehicle that, upon in-vehicle image analysis determining that a fire has occurred in the vehicle, causes a vehicular on-board fire extinguisher to actuate to put out the fire). Also, either remotely or via in-vehicle image analysis, the engine of the vehicle can be turned off after an accident has been detected via the vehicular camera system.
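The accident alert contemplated above might, purely as a hypothetical sketch, be assembled as follows (the build_accident_alert helper and its field names are invented for illustration and are not a disclosed message format):

def build_accident_alert(occupant_count, gps_fix, airbag_deployed, fire_detected):
    # Assembles a payload for wireless transmission to a remote telematic
    # service; occupant_count would come from image analysis of the cabin
    # camera, and gps_fix from the in-vehicle GPS/navigation system.
    return {
        "type": "accident_alert",
        "occupants": occupant_count,
        "location": gps_fix,              # (latitude, longitude)
        "airbag_deployed": airbag_deployed,
        "fire_detected": fire_detected,   # image analysis or temperature probe
    }

# Example: two occupants, airbag deployed, no fire detected.
alert = build_accident_alert(2, (42.963, -85.668), True, False)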


A variety of other electrical and electronic features can be incorporated into the assemblies, such as those disclosed in U.S. patent application Ser. No. 09/433,467, filed Nov. 4, 1999, now U.S. Pat. No. 6,326,613. For example, a microphone or a plurality of microphones may be incorporated, preferably to provide hands-free input to a wireless telecommunication system such as the ONSTAR™ system in use in General Motors vehicles. Most preferably, such microphones provide input to an audio system that transmits and communicates wirelessly with a remote transceiver, preferably in voice recognition mode. Such systems are described in U.S. patent application Ser. No. 09/382,720, filed Aug. 25, 1999, now U.S. Pat. No. 6,243,003.


In this regard it may be desirable to use audio processing techniques, such as digital sound processing, to ensure that vocal inputs to the vehicular audio system are clearly distinguished from cabin ambient noise such as from wind noise, HVAC, and the like. Digital sound processing techniques, as known in the acoustics arts and such as are disclosed in U.S. Pat. No. 4,959,865, entitled A METHOD FOR INDICATING THE PRESENCE OF SPEECH IN AN AUDIO SIGNAL, issued Sep. 25, 1990, to Stettiner et al., are particularly useful to enhance clarity of vocal signal detection when a single microphone is used, located in the interior mirror assembly such as in the mirror casing that houses the interior mirror reflective element, as part of a vehicular wireless communication system such as General Motors' ONSTAR™ system. Use of digital signal processing and a single mirror-mounted microphone (such as is described in U.S. patent application Ser. No. 09/396,179, filed Sep. 14, 1999, now U.S. Pat. No. 6,278,377) is particularly advantageous for economical achievement of clear and error-free transmission from the vehicle, while operating along a highway, to a remote receiver, particularly in speech-recognition mode. Although advantageous with a single mirror-mounted microphone (or for a microphone mounted elsewhere in the vehicle cabin such as in the header region or in an accessory module assembly such as a windshield electronics module assembly), digital sound processing is also beneficial when multiple microphones are used, and preferably when at least two and more preferably at least four microphones are used.
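The referenced digital sound processing techniques are considerably more sophisticated, but as a toy illustration of separating voice from steady cabin noise, a simple energy threshold can be sketched (the speech_present function and its 10 dB margin are assumptions for illustration, not the method of the cited patent):

import math

def speech_present(samples, noise_floor_db, margin_db=10.0):
    # Flags a frame of audio samples as likely containing speech when its mean
    # energy exceeds the estimated cabin noise floor by margin_db decibels.
    if not samples:
        return False
    energy = sum(s * s for s in samples) / len(samples)
    level_db = 10.0 * math.log10(energy + 1e-12)  # avoid log(0) on silence
    return level_db > noise_floor_db + margin_db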


As previously described, connection and communication between the video displays and/or the cameras and/or other electronic accessories can be by wired connection (including multi-element cables, wired multiplex links and fiber-optic cables) and/or by wireless connection/communication (such as by infrared communication and/or by radio frequency communication such as via BLUETOOTH, described below).


For example, the video displays may include a display of the speed limit applicable to the location where the vehicle is travelling. Conventionally, speed limits are posted as a fixed limit (for example, 45 MPH) that is read by the vehicle driver upon passing a sign. As an improvement to this, an information display (preferably an alphanumerical display and, more preferably, a reconfigurable display) can be provided within the vehicle cabin, and preferably displayed by a video display, and readable by the driver, that displays the speed limit at whatever location on the road/highway the vehicle actually is at any moment. For example, existing speed limit signs could be enhanced to include a transmitter that broadcasts a local speed limit signal, such signal being received by an in-vehicle receiver and displayed to the driver. The speed limit signal can be transmitted by a variety of wireless transmission methods, such as radio transmission, and such systems can benefit from wireless transmission protocols and standards, such as BLUETOOTH, a low-cost, low-power cable replacement or wireless link based on short-range radio technology. BLUETOOTH enables creation of a short-range (typically 30 feet or so, although longer and shorter ranges are possible), wireless personal area network via small radio transmitters built into various devices. For example, transmission can be on a 2.45 gigahertz band, moving data at about 721 kilobits per second, or faster. BLUETOOTH, and similar systems, allow creation of an in-vehicle area network. Conventionally, features and accessories in the vehicle are wired together. Thus, for example, an interior electrochromic mirror and an exterior electrochromic mirror are connected by at least one wire in order to transmit control signals and the like. With BLUETOOTH and similar systems, such as the IEEE 802.11a protocol, which is a wireless local area network standard that preferably uses a 5 GigaHertz frequency band and with a data transfer rate of at least about 10 Mb/sec and more preferably at least about 30 Mb/sec, control commands can be broadcast between the interior mirror and the exterior mirror (and vice versa), or between a camera capturing an image in a horse box (or any other towed trailer) being towed by a vehicle and a video display located at the windshield or at the interior rearview mirror or at or adjacent to an A-pillar of that vehicle that is viewable by the vehicle driver, without the need for physical wiring interconnecting the two. Likewise, for example, the two exterior mirror assemblies on the vehicle can exchange, transmit and/or receive control commands/signals (such as of memory position or the like, such as is described in U.S. Pat. No. 5,798,575) via an in-vehicle short-range radio local network such as BLUETOOTH. Similarly, tire pressure sensors in the wheels can transmit via BLUETOOTH to a receiver in the interior mirror assembly, and tire pressure status (such as described in U.S. patent application Ser. No. 09/513,941, filed Feb. 28, 2000, now U.S. Pat. No. 6,294,989) can be displayed, preferably at the interior rearview mirror.
In the case of the dynamic speed limit system described above, preferably, the in-vehicle receiver is located at and/or the display of local speed limit is displayed at the interior mirror assembly (for example, a speed limit display can be located in a chin or eyebrow portion of the mirror case, such as in the mirror reflector itself, or such as in a pod attached to the interior mirror assembly), or can be displayed on any video display. More preferably, the actual speed of the vehicle can be displayed simultaneously with and beside the local speed limit in-vehicle display and/or the difference or excess thereto can be displayed. Optionally, the wireless-based speed limit transmission system can actually control the speed at which a subject vehicle travels in a certain location (such as by controlling an engine governor or the like) and thereby provide a vehicle speed control function. Thus, for example, a school zone speed limit can be enforced by transmission of a speed-limiting signal into the vehicle. Likewise, different classes of vehicles can be set for different speed limits for the same stretch of highway. The system may also require driver identification and then set individual speed limits for individual drivers reflecting their skill level, age, driving record and the like. Moreover, a global positioning system (GPS) can be used to locate a specific vehicle, calculate its velocity on the highway, verify what the allowed speed limit is at that specific moment on that specific stretch of highway, transmit that specific speed limit to the vehicle for display (preferably at the interior rearview mirror that the driver constantly looks at as part of the driving task) and optionally alert the driver or retard the driver's ability to exceed the speed limit as deemed appropriate. A short-range, local communication system such as envisaged in the BLUETOOTH protocol finds broad utility in vehicular applications, and particularly where information is to be displayed at the interior mirror assembly or on a video display, or where a microphone or user-interface (such as buttons to connect/interact with a remote wireless receiver) is to be located at the interior (or exterior) rearview mirror assembly. For example, a train approaching a railway crossing may transmit a wireless signal such as a radio signal (using the BLUETOOTH protocol or another protocol) and that signal may be received by and/or displayed at the interior rearview mirror assembly (or the exterior side view mirror assembly) or a video display. Also, the interior rearview mirror and/or the exterior side view mirrors and/or any video display can function as transceivers/display locations/interface locations for intelligent vehicle highway systems, using protocols such as the BLUETOOTH protocol. Protocols such as BLUETOOTH and the IEEE 802.11a wireless local area network standard that preferably uses a 5 GigaHertz frequency band and with a data transfer rate of at least about 10 Mb/sec and more preferably at least about 30 Mb/sec, as known in the telecommunications art, can facilitate voice/data, voice over data, digital and analog communication and vehicle/external wireless connectivity, preferably using the interior and/or exterior mirror assemblies as transceiver/display/user-interaction sites. Electronic accessories to achieve the above can be accommodated in any of the video displays/video mirrors/camera assemblies, and/or in the interior mirror assembly (such as in the housing disclosed in U.S. patent application Ser. No. 
09/433,467, filed Nov. 4, 1999, now U.S. Pat. No. 6,326,613).
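As a sketch of the dynamic speed limit readout described above (the speed_limit_readout function and its formatting are illustrative assumptions):

def speed_limit_readout(vehicle_speed_mph, broadcast_limit_mph):
    # Builds the text a mirror-mounted or video display might show: actual
    # speed beside the locally broadcast limit, plus any excess over the limit.
    line = f"SPEED {vehicle_speed_mph:.0f}  LIMIT {broadcast_limit_mph:.0f}"
    excess = vehicle_speed_mph - broadcast_limit_mph
    if excess > 0:
        line += f"  OVER BY {excess:.0f}"
    return line

# Example: travelling 52 MPH where a 45 MPH limit is broadcast.
print(speed_limit_readout(52, 45))   # SPEED 52  LIMIT 45  OVER BY 7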


Furthermore, information displays may be incorporated which provide information to the driver or occupants of the vehicle, such as warnings relating to the status of the passenger airbag or a train approaching warning. Such a train approaching warning system alerts the driver of the vehicle of the imminent arrival of a train at a railroad crossing. Such a warning system can activate audible and/or visual alarms in the vehicle if a train is approaching. Such train warning displays may override any existing displays so that the driver is fully alert to any potential hazard. One suitable train control system is described in U.S. patent application Ser. No. 09/561,023, filed Apr. 28, 2000, now U.S. Pat. No. 6,553,308. Vehicle to road-side communication antennas can be attached to railroad signs, crossing barriers, and the like, and can transmit to antennas mounted in the vehicle, located such as within the interior rearview mirror of the vehicle or within an interior cabin trim item or side exterior rearview mirror assembly. One such track side communication system is available from Dynamic Vehicle Safety Systems of Amarillo, Tex., which detects signals from trains approaching a crossing and transmits these signals along the road to forewarn of a railroad crossing ahead.


In application Ser. No. 09/244,726, filed Feb. 5, 1999, now U.S. Pat. No. 6,172,613, information displays are provided which include information relating to vehicle or engine status, warning information, and the like, such as information relating to oil pressure, fuel remaining, time, temperature, compass headings for vehicle direction, and the like. The passenger side air bag on/off signal may be derived from various types of seat occupancy detectors, such as by video surveillance of the passenger seat as disclosed in PCT Application No. PCT/US94/01954, filed Feb. 25, 1994, published Sep. 1, 1994 as PCT Publication No. WO/1994/019212, or by ultrasonic or sonar detection, infrared sensing, pyrodetection, weight detection, or the like. Alternately, enablement/disablement of the passenger side air bag operation can be controlled manually, such as through a user operated switch operated with the ignition key of the vehicle in which the assembly is mounted, as described in U.S. patent application Ser. No. 08/799,734, filed Feb. 12, 1997, now U.S. Pat. No. 5,786,772.


In addition, the interior rearview mirror assembly may include a blind spot detection system, such as the type disclosed in U.S. patent application Ser. No. 08/799,734, filed Feb. 12, 1997, now U.S. Pat. No. 5,786,772, or rain sensor systems, for example rain sensor systems which include windshield contacting rain sensors, such as described in U.S. Pat. No. 4,973,844 or non-windshield contacting rain sensors, such as described in PCT International Application PCT/US94/05093, published as WO 94/27262 on Nov. 24, 1994.


The interior rearview mirror assembly may also incorporate one or more user actuatable buttons or the like for activating the various accessories housed in the assembly, for example, an ONSTAR™ system, a HOMELINK® system, a remote transaction system, or the like. For example, one or more user actuatable buttons may be mounted at the chin area or eyebrow area for actuating, for example, a video screen, or for selecting or scrolling between displays, or for activating, for example, a light, including a map light which may be incorporated into the mirror casing. Furthermore, a dimming switch may be incorporated into the casing to provide adjustment to the brightness of the video screen.


Also, a single high-intensity power LED may comprise a single LED light source in a compact package or as an individual chip or circuit element (and with a diagonal cross-sectional dimension of less than about 14 mm when viewed from the light emitting side; more preferably less than about 8 mm; and most preferably, less than about 5 mm) that illuminates to emit a light beam when, powered at about 25 degrees Celsius or thereabouts, at least about 100 milliamps passes (i.e., conducts) through the LED element (more preferably when at least about 225 milliamps passes through the LED element and most preferably when at least 300 milliamps passes through the LED element), and with a luminous efficiency of at least about 1 lumen/watt, more preferably at least about 3 lumens/watt, and most preferably at least about 7 lumens/watt. Such high-intensity power LEDs, when normally operating, emit a luminous flux of at least about 1 lumen, more preferably at least about 5 lumens and most preferably at least about 10 lumens. For certain applications such as ground illumination from lighted exterior mirror assemblies and interior mirror map lights, such high-intensity LEDs preferably conduct at least about 250 milliamps forward current when operated at a voltage in the about 2 volts to about 5 volts range, and emit a luminous flux of at least about 10 lumens, more preferably at least about 15 lumens, even more preferably at least about 20 lumens, and most preferably at least about 25 lumens, preferably emitting white light.
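To make the relationship among these figures concrete, a brief worked example follows; the 3.6 volt forward voltage is an assumed typical value within the stated about 2 volt to about 5 volt range, and the 20 lumens/watt efficiency is one of the figures mentioned below for ground illumination applications.

current_a = 0.350                # 350 mA forward current
forward_v = 3.6                  # assumed forward voltage
power_w = current_a * forward_v  # 1.26 W dissipated
flux_lm = power_w * 20.0         # at about 20 lumens/watt: about 25 lumens
print(f"{power_w:.2f} W, about {flux_lm:.0f} lumens")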


Single high-intensity power LEDs suitable for use include high-intensity, high-current capability light emitting diodes such as the high-flux LEDs available from LumiLeds Lighting, U.S., LLC of San Jose, Calif. under the SunPower Series High-Flux LED tradename. Such high-intensity power LEDs comprise a power package allowing high-current operation of at least about 100 milliamps forward current, more preferably at least about 250 milliamps forward current, and most preferably at least about 350 milliamps forward current, through a single LED. Such high-current/high-intensity power LEDs (currents as high as 500 mA or more are possible, especially with use of heat sinks) are capable of delivering a luminous efficiency of at least about 1 lumen per watt, more preferably at least about 3 lumens per watt, and most preferably at least about 5 lumens per watt. Such high-intensity LEDs are available in blue, green, blue-green, red, amber, yellow and white light emitting forms, as well as other colors. Such high-intensity LEDs can provide a wide-angle radiation pattern, such as an about 30 degree to an about 160 degree cone. Typically, such high-intensity LEDs are fabricated using Indium Gallium Nitride technology. To assist heat dissipation and maintain the LED junction below about 130° Celsius (and more preferably below about 100° Celsius and most preferably below about 70° Celsius), a heat sink can be used. Preferably, such a heat sink comprises a metal heat dissipater (such as an aluminum metal heat sink) with a heat-dissipating surface area of at least about 1 square inch, more preferably of at least about 2.5 square inches, and most preferably of at least about 3.5 square inches. When used as, for example, a map light assembly mounted in an interior rearview mirror assembly (such as in the mirror housing or in a pod attached to the mirror mount to the vehicle), a single high-intensity power LED (for example, a single white light emitting LED passing about 350 mA and emitting light, preferably white light or any other color, with a luminous efficiency of at least about 3 lumens per watt, and with a light pattern of about 120° or so) can be combined with a reflector element and a lens to form a high-intensity power LED interior light module capable of directing an intense beam of light from an interior mirror assembly mounted to a windshield or header region of the vehicle to the lap area of a driver or a front-seat passenger in order to allow a reading function such as a map reading function and/or to provide courtesy or theatre lighting within the vehicle cabin. Also, a single high-intensity power LED (for example, a single white light emitting LED or a red light emitting or any other colored light emitting diode passing about 350 mA and emitting light, preferably white light or any other color, with a luminous efficiency of at least about 3 lumens per watt, and with a light pattern of about 120° or so) can be combined with a reflector element and a lens to form a high-intensity LED security light module capable of directing an intense beam of light from an exterior mirror assembly to illuminate the ground adjacent an entry door of the vehicle in order to provide a security lighting function.
Also, a single high-intensity power LED (for example, a single white light emitting LED or a red light emitting or any other colored light emitting diode passing about 350 mA and emitting white light with a luminous efficiency of at least about 3 lumens per watt, and with a light pattern of about 120° or so) can be combined with a reflector element and a lens (and optionally with high-intensity and/or conventional near-IR light emitting diodes), and be used in conjunction with a reversing or forward parking camera mounted on the exterior of a vehicle (such as at a license plate holder) in order to provide illumination for, for example, the reverse-aid camera when reversing at night.


For applications such as ground illumination from exterior mirror assemblies and map/reading lighting from interior mirror assemblies or from windshield-mounted accessory modules such as windshield electronic modules, or for ground illumination/camera-field-of-view illumination in association with video-based reverse-aid systems or park-aid systems or tow hitch-aid systems, it is preferable to use a single high-intensity power LED source having a luminous efficiency of at least about 7 lumens/watt; more preferably at least about 15 lumens/watt; and most preferably at least about 20 lumens/watt, with such single high efficiency power LED light source preferably being provided in a module that includes a heat sink/heat dissipater and, most preferably, that further includes a power regulator such as a series power resistor and, most preferably, a DC to DC voltage converter. Such high efficiency power LEDs are available from LumiLeds Lighting, U.S., LLC of San Jose, Calif. under the SunPower Series High-Flux LED tradename, for example.


Also, a video display element or screen can be used (such as an LCD display or an emitting display element such as a multi-pixel electroluminescent display or field emission display or light emitting diode display (organic or inorganic)) disposed within the mirror housing of the interior mirror assembly of the vehicle, and located behind the mirror reflective element in the mirror housing, and configured so that the image displayed by the video display element is visible to the driver by viewing through the mirror reflective element. Preferably, and such as is disclosed in U.S. utility patent application Ser. No. 09/793,002, filed Feb. 26, 2001, now U.S. Pat. No. 6,690,268, the mirror reflective element (behind which the video display screen is disposed so that the image displayed is visible by viewing through the mirror reflective element) of the interior mirror assembly preferably comprises a transflective mirror reflector such that the mirror reflective element is significantly transmitting to visible light incident from its rear (i.e., the portion furthest from the driver in the vehicle), with at least about 15% transmission preferred, at least about 20% transmission more preferred and at least about 25% transmission most preferred, while simultaneously, the mirror reflective element is substantially reflective to visible light incident from its front (i.e., the portion closest to the driver when the interior mirror assembly is mounted in the vehicle), with at least about 60% reflectance preferred, at least about 70% reflectance more preferred and at least about 75% reflectance most preferred. Preferably, a transflective electrochromic reflective mirror element is used (such as is disclosed in U.S. utility patent application Ser. No. 09/793,002, filed Feb. 26, 2001, now U.S. Pat. No. 6,690,268, and in U.S. Pat. Nos. 5,668,663 and 5,724,187) that comprises an electrochromic medium sandwiched between two substrates. The front (i.e., closest to the driver when the interior mirror assembly is mounted in the vehicle) substrate preferably comprises a glass substrate having a transparent electronic conductive coating (such as indium tin oxide or doped tin oxide) on its inner surface (and contacting the electrochromic medium). More preferably, the front substrate of the twin-substrate electrochromic cell that sandwiches the electrochromic medium comprises a glass substrate having a thickness of about 1.6 millimeters or less; most preferably, about 1.1 millimeters or less. The rear (i.e., furthest from the driver when the interior mirror assembly is mounted in the vehicle) substrate preferably comprises a glass substrate having a transflective mirror reflector on the surface thereof that the electrochromic medium contacts (such a configuration being referred to as a “third-surface” reflector in the electrochromic mirror art).
For example, the mirror reflector can comprise a transparent semiconductor/metal conductor/transparent semiconductor multilayer stack such as an indium tin oxide/silver/indium tin oxide stack (for example, a third-surface electrochromic mirror reflective element can be used comprising a front substrate comprising an about 1.1 mm thick glass substrate having a half-wave ITO coating of about 12 ohms/square sheet resistance on its inner surface; a rear substrate comprising an about 1.6 mm thick glass substrate having a transflective mirror reflector thereon comprising an about 350 angstrom thick silver metal layer sandwiched between an about 800 angstrom thick indium tin oxide transparent semiconductor layer and another about 800 angstrom thick indium tin oxide transparent semiconductor layer; and with an electrochromic solid polymer matrix medium such as is disclosed in U.S. Pat. No. 6,245,262 disposed between the transflective mirror reflector of the rear substrate and the half-wave indium tin oxide layer of the front substrate). Visible light reflectivity of the transflective electrochromic mirror element is about 60-65%; transmission is about 20-25%. With a TFT LCD video display disposed behind the rear substrate of such a third-surface transflective electrochromic mirror reflective element in a “display-on-demand” configuration, the presence of (and image displayed by) the video display screen is only principally visible to the driver (who views through the transflective mirror reflective element) when the video display element is powered so as to project light from the rear of the mirror reflective element.
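A brief illustrative calculation shows what such a transflective element implies for the display-on-demand configuration; the 500 cd/m² display luminance is an assumed figure, not one from the text.

display_nits = 500.0  # assumed display luminance behind the mirror element
for transmission in (0.20, 0.25):
    seen = display_nits * transmission
    print(f"transmission {transmission:.0%}: driver sees about {seen:.0f} cd/m^2")
# The roughly 60-65% front reflectance meanwhile preserves the mirror function.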


Also, and as disclosed in U.S. utility patent application Ser. No. 09/793,002, filed Feb. 26, 2001, now U.S. Pat. No. 6,690,268, incorporated above, an image on the screen includes a video view rearward of the vehicle, and also preferably includes electronically generated indicia overlaying the video image on the video screen and indicating the distance of detected objects (such as via a graphic display or via an alphanumeric display in feet/inches) and/or highlighting of obstacles/objects that a reversing vehicle is in jeopardy of colliding with (such as a child or a barrier). For example, red highlighting can be used, or a screen portion can strobe/flash to draw the driver's attention to an object in the screen. Also, the control can provide an audible output signal to a speaker that audibly alerts the driver that the vehicle is reversing closer and closer to a rear-situated object. The combination of a video reverse-aid system with an audible reverse-aid system based on an object detection system, such as an ultrasonic obstacle detection system, is a significant advance over reversing systems known to date, particularly with distance or similar graphics overlaying the video image of the rearward scene.
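One way such overlay indicia and audible urgency could be keyed to the detected distance is sketched below; the reverse_alert helper and its thresholds are illustrative assumptions only.

def reverse_alert(distance_ft):
    # Maps an object-detection distance to overlay indicia and a beep interval:
    # the closer the object, the faster the beeping and the stronger the highlight.
    if distance_ft > 10.0:
        return {"overlay": f"{distance_ft:.0f} ft", "beep_interval_s": None}
    if distance_ft > 3.0:
        return {"overlay": f"{distance_ft:.0f} ft", "beep_interval_s": distance_ft / 10.0}
    return {"overlay": "OBSTACLE (red, flashing)", "beep_interval_s": 0.1}

# Example: at 6 ft the display shows "6 ft" and a beep sounds every 0.6 seconds.
print(reverse_alert(6.0))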


Also, any of the video screens of the above embodiments, and such as disclosed in U.S. utility patent application Ser. No. 09/793,002, filed Feb. 26, 2001, now U.S. Pat. No. 6,690,268, can display the image output by a forward facing image capture device (preferably positioned to capture a video image of the ground surface/objects/persons immediately in front of the vehicle, most preferably encompassing an area that encompasses substantially the entire front fender width of the vehicle) and/or can display the image output by a rearward facing image capture device positioned to capture a video image of the ground surface/objects/persons immediately to the rear of the vehicle, most preferably encompassing an area that encompasses substantially the entire rear fender width of the vehicle. Preferably, a graphic overlay with indicia of forward or backup travel is provided, such as disclosed in U.S. patent application Ser. No. 09/313,139, filed May 17, 1999, now U.S. Pat. No. 6,222,447, in U.S. patent application Ser. No. 09/776,625, filed Feb. 5, 2001, now U.S. Pat. No. 6,611,202, and in U.S. Pat. No. 5,949,331. For example, the intended path of travel and/or a distance grid can be electronically superimposed upon the video image from a reverse-aid camera as displayed on any screen of the above video mirrors, video display assemblies and accessory modules.


As disclosed in U.S. patent application Ser. No. 09/793,002, filed Feb. 26, 2001, now U.S. Pat. No. 6,690,268 (incorporated by reference above), a towing vehicle is equipped with a reversing-aid camera (that is mounted, for example, at the rear license plate region of the vehicle). The reversing-aid camera has its field of view directed to include the tow bar/hitch connection, as well as the leading portion of the towed container. The driver is provided with a control that allows him/her to toggle between the reversing-aid camera and a tow container reverse-aid camera. When the driver selects the reversing-aid camera, a view of the tow-bar/tow container (that may be a boat or a U-Haul trailer or an animal container or a trailer tent or any other trailer type) is displayed on a video display screen (that can be any of the video display screens of the present invention). When the driver selects the tow container reverse-aid camera, a view to the rear of the tow container is displayed on the video display screen.


Also, a variety of lenses (both refractive and diffractive) can be used to define the field of view of the cameras of the present invention. For reverse-aid type cameras, a wide angle lens is useful so that a view of the entire road immediately to the rear of the vehicle, and of the width of the vehicle, is captured. To reduce any image distortion, optical image distortion reduction means and/or software-based image distortion reducing means, as known in the art, can be used. Optionally, two cameras can be used to assist reversing: one equipped with a wide-angle lens and mounted such as at the rear license plate of the vehicle in order to view a near field of zero to 5 feet or so immediately exteriorly, and a second camera, aimed more far field to capture an image of traffic and obstacles further down the path along which the vehicle is reversing. Selection of one or the other of the first and second camera can be at the driver's discretion, or alternately, engagement of the reverse gear handle by the driver initially selects the first, near-field view so that the driver can check that it is safe to initiate a reverse move, and then, once the reverse gear is fully engaged and the vehicle is actually moving in reverse, the second, far-field view is selected.
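The selection sequence just described can be modeled, purely as a sketch (the function and camera names are hypothetical):

def select_reversing_view(reverse_engaged, moving_in_reverse, driver_choice=None):
    # Driver discretion overrides; otherwise initial engagement of reverse gear
    # shows the near-field (wide-angle) view, and actual rearward motion
    # switches to the far-field view, per the sequence described above.
    if not reverse_engaged:
        return None   # reversing display inactive
    if driver_choice in ("near_field", "far_field"):
        return driver_choice
    return "far_field" if moving_in_reverse else "near_field"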


The video screen can be used with at least one of a rear back-up camera, a baby-minder camera, and a sidelane-viewing camera. An image capturing device is in communication with the video screen, which displays images captured by the image capturing device and, further, displays indicia overlaying the video image. For example, the indicia may comprise a graphic display or an alphanumeric display.


The video screen may display an information display selected from the group consisting of a rain sensor operation display, a telephone information display, a highway status information display, a blind spot indicator display, a hazard warning display, a vehicle status display, a page message display, a speedometer display, a tachometer display, an audio system display, a fuel gage display, a heater control display, an air conditioning system display, a status of inflation of tires display, an email message display, a compass display, an engine coolant temperature display, an oil pressure display, a cellular phone operation display, a global positioning display, a weather information display, a temperature display, a traffic information display, a telephone number display, a fuel status display, a battery condition display, a time display, a train approach warning display, and a toll booth transaction display.


The video screen may comprise one of a vacuum fluorescent display element, a light emitting diode display, an electroluminescent display element, a multi-pixel display element, a reconfigurable display element, and a scrolling display element.


In another aspect, a video mirror system includes a variable intensity control in communication with the video screen that varies the display intensity of the video screen, for example, in response to ambient lighting conditions.
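A minimal sketch of such a variable intensity control, assuming an ambient light sensor reporting in lux (the breakpoints are invented for illustration):

def display_intensity(ambient_lux):
    # Returns a display brightness fraction: dim at night to avoid dazzling
    # the driver, full intensity in daylight to resist washout.
    if ambient_lux < 10.0:      # night
        return 0.2
    if ambient_lux < 1000.0:    # dusk/overcast
        return 0.5
    return 1.0                  # daylight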


Note that since a rear-facing camera will have a portion exposed to the outdoor elements, such as rain, ice, snow, road splash, dirt, etc., it is desirable that a cleaning means be included to maintain any lens/lens cover clean/contaminant-free. For example, vectored airflow can be used to blow away any accumulated rain drops and the like on the lens/lens cover. Thus, the camera assembly, and its installation into the rear of the vehicle, can be constructed so as to include air channeling elements such as baffles to channel airflow over the rear-facing lens portion, and so remove water drops therefrom. A lens heater (such as a glass plate coated with a transparent resistive coating such as indium tin oxide or doped tin oxide) can be included to defog/de-ice the camera lens. Alternately, or in addition, a mechanical wiper can be provided. Also, a lens cover can be provided that is mechanically removed only when the rear-facing camera is accessed to view rearward, so as to minimize the time that the lens is exposed to the outdoor elements. Thus, for example, a metal or plastic shield or shutter can be disposed over the camera lens when the camera is not in use (such as when the vehicle is parked or when the vehicle is driving forward). However, when the engine is operating, and reverse gear of the vehicle transmission system is engaged by the driver, the cover over (and protecting) the camera lens is mechanically opened/removed (such as by rotating out of the way) to expose the lens of the camera and to allow the camera to view rearwardly. Once reverse gear is disengaged, the mechanical shutter closes over the camera lens, protecting it once again from the outdoor elements. An output of the camera is provided as an input to a control. The system also includes an object-monitoring sensor (preferably, an ultrasonic sensor, or a radar sensor or an infrared sensor, or the like, such as is known in the art). The object monitoring sensor generates an output indicative of detection of an object rearward of the vehicle, and preferably includes a measure of the distance of that detected object from the rear of the vehicle. This output is also provided to the control. The control generates a video output signal which is provided as an input to the video screen, which preferably is mounted at, on, or within the interior rearview mirror assembly in the vehicle. The image on the screen generated thereby includes a video view rearward of the vehicle, and also preferably includes electronically generated indicia overlaying the video image on the video screen and indicating the distance of detected objects (such as via a graphic display or via an alphanumeric display in feet/inches) and/or highlighting of obstacles/objects that a reversing vehicle is in jeopardy of colliding with (such as a child or a barrier). For example, red highlighting can be used, or a screen portion can strobe/flash to draw the driver's attention to an object in the screen. Also, the control can provide an audible output signal to a speaker that audibly alerts the driver that the vehicle is reversing closer and closer to a rear-situated object. The combination of a video reverse-aid system with an audible reverse-aid system based on an object detection system, such as an ultrasonic obstacle detection system, is a significant advance over reversing systems known to date, particularly with distance or similar graphics overlaying the video image of the rearward scene.
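The shutter sequencing described above reduces to a small state machine; in the following sketch, the LensShutter class and its actuator interface are hypothetical names for illustration.

class LensShutter:
    """Keeps the protective cover closed when parked or driving forward, and
    opens it only while the engine runs with reverse gear engaged."""

    def __init__(self, actuator):
        self.actuator = actuator   # assumed device with rotate_open()/rotate_closed()
        self.is_open = False

    def update(self, engine_running, reverse_engaged):
        want_open = engine_running and reverse_engaged
        if want_open and not self.is_open:
            self.actuator.rotate_open()     # expose the lens for reversing
            self.is_open = True
        elif not want_open and self.is_open:
            self.actuator.rotate_closed()   # protect the lens from the elements again
            self.is_open = False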


Preferably, the video screens used in the present invention are pixelated liquid crystal displays or, most preferably, are pixelated emitting displays such as field emission displays or plasma displays or electroluminescent displays such as organic electroluminescent displays. Alternately, a cathode ray tube video screen can be used. Also note, as described above, that the display on the video screen can be a reconfigurable display capable of displaying a plurality of vehicle functions. Also, the field of view of any camera inside or outside the vehicle can be fixed, or it can be variable, such as by manipulating a joystick or the like. For example, a manual control to move the field of view of the camera and/or its focus and/or its zoom can be included in the joystick or similar controls conventionally provided to adjust outside sideview mirror reflectors.


Also, dynamic color selection can be used in operating the video display. Also, optionally, a Kopin display, as known in the art, can be used. Optionally, the electronics to generate an image can be located at, on or in the interior mirror assembly, and an image can be projected from the interior mirror assembly toward the vehicle windshield for viewing by the driver of the vehicle by looking at or through the vehicle windshield.


Note that communication between any camera and the display screen can be by wire (such as a direct wire connection or via an optical fiber link) or via a bus system (such as a CAN or LIN system, as known in the art) or wirelessly, such as by IR or RF communication (such as using a local area RF broadcast network such as the BLUETOOTH protocol).


Also, to minimize cost in the system, the video screen can connect to a control that provides a direct row/column drive to the pixelated array of the video screen, with the control itself receiving a direct row/column feed from the pixelated array of the camera (such as a CMOS camera) used to capture the image desired to be displayed on the video screen. Row and column drivers for the video screen can be included in the video screen package itself, such as via flexible circuitry attached to the back of the video screen element (typically a glass element) itself.


Also, because ambient light washout can be a problem for a mirror-mounted video display (or any other video display of the present invention), a contrast enhancement filter can be disposed over the video screen (or incorporated into its construction) to increase the contrast ratio between the video image and ambient light. Optionally, anti-reflective coatings/films/layers can be used to reduce surface reflections off the video screen. Suitable anti-reflective coatings/films/layers are known in the art and can include diffuser surface layers and interference layers. Optionally, an electrically variable contrast enhancement filter can be used overlying the video screen, such as an electro-optic (preferably electrochromic) contrast enhancement filter.


Also, optionally, a video screen displaying an image of the rearward scene of the vehicle, and preferably displaying a panoramic image such as described in U.S. Pat. Nos. 5,670,935 and 5,550,677 and U.S. patent applications, Ser. No. 09/199,907, filed Nov. 25, 1998, now U.S. Pat. No. 6,717,610, and Ser. No. 09/361,814, filed Jul. 27, 1999, now U.S. Pat. No. 6,201,642, can be used instead of a conventional mirror reflector.


Also, and especially for video mirror assemblies where the video screen is incorporated as part of an interior electro-optic (such as electrochromic) mirror assembly, a common circuit board and/or common electronic components and sub-circuitry can be utilized to control the electro-optic activity of the reflective element and to control the image displayed by the video screen, thus achieving economy of design and function and for operating other electrical or electronic functions supported in the interior rearview assembly. For example, a circuit board of the interior mirror assembly may support, for example, light emitting diodes (LEDs) for illuminating indicia on display elements provided on a chin or eyebrow portion of the bezel region of the interior mirror casing. Reference is made to U.S. Pat. Nos. 5,671,996 and 5,820,245. It should be understood that one or more of these buttons or displays may be located elsewhere on the mirror assembly or separately in a module, for example of the type disclosed in pending U.S. patent application Ser. No. 09/244,726, entitled “REARVIEW MIRROR ASSEMBLY INCORPORATING VEHICLE INFORMATION DISPLAY”, filed by DeLine et al., and may comprise the touch-sensitive displays as disclosed in U.S. provisional application entitled “INTERACTIVE AUTOMOTIVE REARVIEW SYSTEM”, Ser. No. 60/192,721, filed Mar. 27, 2000.


Also, the video display in any of the video mirror applications of the present invention can function as the display screen for a portable computer device, a portable cellular phone, and/or a portable personal digital assistant device (PDA) such as a PalmPilot™ or other personal digital assistant. When serving as the display screen of a PDA, the PDA/in-vehicle display screen can optionally operate in combination with a cellular phone or as a stand-alone device. Also, any of the video screens of the present invention can serve multiple purposes such as a video screen for an on-board vehicular camera and/or as the video monitor screen for a portable computing/PDA/cellular phone/telecommunication device. The video display system of the present invention can itself function as an in-vehicle PDA and/or cellular phone, in addition to other functions as described above. Portable devices such as PDAs, cellular phones, and palm/notebook/laptop portable computers can connect to/communicate with the video mirror systems of the present invention by direct wired connection/docking or by wireless communication such as described in U.S. patent application Ser. No. 09/561,023, filed Apr. 28, 2000, entitled “VEHICLE-BASED NAVIGATION SYSTEM WITH SMART MAP FILTERING, PORTABLE UNIT HOME-BASE REGISTRATION AND MULTIPLE NAVIGATION SYSTEM PREFERENTIAL USE”, to Uhlmann et al.; U.S. provisional application Ser. No. 60/131,593, filed Apr. 29, 1999, entitled “VEHICLE-BASED NAVIGATION SYSTEM WITH SMART MAP FILTERING, PORTABLE UNIT HOME-BASE REGISTRATION AND MULTIPLE NAVIGATION SYSTEM PREFERENTIAL USE”, to Uhlmann et al.; and U.S. provisional application Ser. No. 60/199,676, filed Apr. 21, 2000, entitled “VEHICLE MIRROR ASSEMBLY COMMUNICATING WIRELESSLY WITH VEHICLE ACCESSORIES AND OCCUPANTS”. Preferably, the video mirror systems of the present invention are equipped with a mobile device communication port (such as an IrDA-port) that transmits/receives data via wireless infrared communication. For example, any of the video display housings and/or any of the video attachment members/mounts and/or any of the interior mirror assemblies can be equipped with a mobile device communication port (such as an IrDA-port) that transmits/receives data via wireless infrared communication. Also, any of the video display assemblies, including any of the video screens or video display housings can be adapted to receive data input by touch such as by a human finger or a stylus such as via a touch screen, and such as is disclosed in U.S. provisional application Ser. No. 60/192,721, filed Mar. 27, 2000, entitled “INTERACTIVE AUTOMOTIVE REARVISION SYSTEM”, to Lynam et al.


As disclosed in U.S. utility patent application Ser. No. 09/793,002, filed Feb. 26, 2001 (now U.S. Pat. No. 6,690,268), incorporated above, a forward facing camera can be mounted such as at the front grille or front fender/bumper or front bonnet/hood area of the vehicle, and with its field of view directed to capture an image of the road surface immediately in front of the vehicle. With such a forward parking-aid camera device (which can utilize wide-angle optics and other techniques as previously described and referenced with regard to reverse back-up aid cameras), the driver of a vehicle, such as a large SUV (for example, the MY2000 Ford Excursion vehicle from Ford Motor Company), can see how close the front fender/bumper is to another vehicle or a barrier or the like when the driver is parking the vehicle. Optionally, a driver actuatable switch can be provided to allow the driver to select display of the view from the forward-facing park-aid camera device. Alternately, and preferably, the image from the forward-facing park-aid camera device is displayed on the in-vehicle video screen (such as any of the screens disclosed herein) whenever the vehicle is moving in a forward direction at a velocity less than a predetermined slow velocity (for example, when moving forward at less than about 7 miles per hour or less than about 5 miles per hour or less than about 3 miles per hour), such a slow forward speed of travel being potentially indicative of a parking event. Optionally, the forward facing park-aid view can be displayed whenever the vehicle is stationary with its engine operating, as a safety measure to prevent inadvertent collision with obstacles or persons upon moving forward.
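

By way of illustration and not limitation, the gear- and speed-based display selection described above may be sketched in Python as follows; the signal names, the threshold value, and the display-source labels are assumptions for the example only.

    FORWARD_PARK_AID_MAX_MPH = 5.0  # e.g., about 3, 5 or 7 mph, as noted above

    def select_display_source(gear, speed_mph, engine_running, driver_switch):
        # Decide which camera feed the in-vehicle video screen should show.
        if driver_switch:                   # driver-actuated selection switch
            return "front_camera"
        if gear == "reverse":
            return "rear_camera"
        slow_forward = gear == "drive" and 0.0 < speed_mph < FORWARD_PARK_AID_MAX_MPH
        stationary = engine_running and speed_mph == 0.0
        if slow_forward or stationary:      # potentially a parking event
            return "front_camera"
        return "default_display"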


As disclosed in U.S. utility patent application Ser. No. 09/793,002, filed Feb. 26, 2001 (now U.S. Pat. No. 6,690,268), incorporated above, interconnection to and communication with the components of the video mirror systems of the present invention can be via a variety of means and protocols, including J2284, J1850, UART, optical fiber, hard wired, wireless RF, wireless IR and wireless microwave. Also, the image displayed on the display screen can adapt/change depending on the driving task condition. For example, if reverse gear is selected, and as previously described, the display can automatically change to display the image immediately behind the vehicle. If this is the current view at the time the reverse gear is initially engaged, no change is necessary. Also, reverse gear selection can turn on any additional illumination sources (IR, visible, etc.). For vehicles equipped with blind spot cameras as well as a reversing camera, a split screen display can be used to show the area immediately behind the vehicle and to the vehicle sides. When reverse gear is disengaged, the additional illumination can automatically turn off, and the video screen can revert to its pre-reversing display image. When reversing, additional warning inputs can be displayed or audibly communicated to the driver, such as a warning of detected objects, peripheral motion, the direction in which a detected object lies, and the distance to an object (determined by radar, ultrasonic, infrared, and/or vision analysis). Such detected objects/data can be displayed as icons and/or an alphanumerical display on a video image of the rearward scene and/or on a graphic representation of the rearward view.
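

The gear-responsive display behavior described above amounts to a small state change on gear selection. The following Python sketch assumes a vehicle object exposing a video screen, a rear illumination source and a blind-spot-camera flag; all interface names are illustrative, not those of any actual vehicle bus or protocol.

    def on_gear_change(vehicle, new_gear):
        # Adapt the displayed image and illumination to the driving task.
        screen = vehicle.video_screen
        if new_gear == "reverse":
            if screen.source != "rear_camera":       # no change needed otherwise
                screen.saved_source = screen.source  # remember pre-reversing image
                screen.source = "rear_camera"
            vehicle.rear_illumination.on()           # IR and/or visible sources
            if vehicle.has_blind_spot_cameras:
                screen.layout = "split"              # rear view plus side views
        else:
            vehicle.rear_illumination.off()
            screen.layout = "single"
            screen.source = getattr(screen, "saved_source", screen.source)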


As disclosed in U.S. Pat. No. 5,670,935, incorporated above, a rearview vision system for a vehicle includes at least one image capture device directed rearwardly with respect to the direction of travel of the vehicle. A display system displays an image synthesized from the output of the image capture device. The display system is preferably contiguous with the forward field of view of the vehicle driver at a focal length that is forward of the vehicle passenger compartment. A plurality of image capture devices may be provided, and the display system displays a unitary image synthesized from the outputs of the image capture devices which approximates a rearward-facing view from a single location, such as forward of the vehicle.


For enhancing the interpretation of visual information in the rearview vision system by presenting information in a manner which does not require significant concentration of the driver or present distractions to the driver, the rearview vision system has at least two image capture devices positioned on the vehicle and directed rearwardly with respect to the direction of travel of the vehicle. A display is provided for images captured by the image capture devices. The display combines the captured images into an image that would be achieved by a single rearward-looking camera having a view unobstructed by the vehicle. In order to obtain all of the necessary information of activity, not only behind but also alongside of the vehicle, the virtual camera should be positioned forward of the driver. The image synthesized from the multiple image capture devices may have a dead space which corresponds with the area occupied by the vehicle. This dead space aids the driver's sense of perspective in judging the location of vehicles behind and alongside of the equipped vehicle.


Techniques are provided for synthesizing images captured by individual, spatially separated, image capture devices into such an ideal image, displayed on the display device. This may be accomplished by providing at least three image capture devices. At least two of the image capture devices are side image capture devices mounted on opposite sides of the vehicle. At least one of the image capture devices is a center image capture device mounted laterally between the side image capture devices. A display system displays an image synthesized from outputs of the image capture devices. The displayed image includes an image portion from each of the image capture devices.
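

As a minimal illustration of juxtaposing the three image portions, the Python sketch below simply crops the captured frames to a common height and concatenates them side by side; a real system would first warp each frame into the perspective of the single virtual camera described herein.

    import numpy as np

    def synthesize_rear_view(left_img, center_img, right_img):
        # Crop the three captured frames (H x W x 3 arrays) to a common
        # height and place the image portions side by side.
        h = min(img.shape[0] for img in (left_img, center_img, right_img))
        return np.hstack([left_img[:h], center_img[:h], right_img[:h]])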


Preferably, perspective lines are included at the lateral edges of the dead space; these lines are aligned with the direction of travel of the vehicle and, therefore, appear parallel with lane markings. This provides visual clues to the driver's sense of perspective in order to assist in judging distances of objects around the vehicle.


Image enhancement means are provided for enhancing the displayed image. Such means may be in the form of graphic overlays superimposed on the displayed image. Such graphic overlay may include indicia of the anticipated path of travel of the vehicle, which is useful in assisting the driver in guiding the vehicle in the reverse direction. Such graphic overlay may include a distance grid indicating distances behind the vehicle of objects juxtaposed with the grid.
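

A sketch of such graphic overlays follows. The mapping from ground distance to image rows and the steering-based skew of the anticipated-path lines are stand-ins for a real camera calibration, and draw_line/draw_text are assumed rendering callbacks rather than functions of any particular graphics library.

    def draw_reverse_overlays(frame, steering_angle_deg, draw_line, draw_text):
        # Overlay a distance grid and an anticipated-path indication on the
        # displayed rear view.
        h, w = frame.shape[:2]
        for meters, row_frac in ((1, 0.85), (2, 0.70), (3, 0.55)):
            y = int(h * row_frac)
            draw_line(frame, (int(w * 0.2), y), (int(w * 0.8), y))
            draw_text(frame, "%d m" % meters, (int(w * 0.82), y))
        # Anticipated path: skew the guide lines with the steering angle.
        shift = int(w * steering_angle_deg / 180.0)
        draw_line(frame, (int(w * 0.3), h - 1), (int(w * 0.3) + shift, int(h * 0.5)))
        draw_line(frame, (int(w * 0.7), h - 1), (int(w * 0.7) + shift, int(h * 0.5)))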


The vehicle, which may be an automobile, a light truck, a sport utility vehicle, a van, a bus, a large truck, or the like, includes a rearview vision system for providing a driver of the vehicle with a view rearwardly of the vehicle with respect to the direction of travel of the vehicle. The vision system includes at least two side image capture devices positioned, respectively, on opposite sides of the vehicle and a center image capture device positioned on the lateral centerline of the vehicle. All of the image capture devices are directed generally rearwardly of the vehicle. The rearview vision system additionally includes an image processor for receiving data signals from the image capture devices and synthesizing, from the data signals, a composite image which is displayed on a display.


Images captured by the image capture devices are juxtaposed on the display by the image processor in a manner which approximates the view from a single virtual image capture device positioned forwardly of the vehicle and facing rearwardly of the vehicle, with the vehicle being transparent to the view of the virtual image capture device. The vision system provides a substantially seamless panoramic view rearwardly of the vehicle without duplicate or redundant images of objects. Furthermore, elongated, laterally-extending objects, such as the earth's horizon, appear uniform and straight across the entire displayed image. The displayed image provides a sense of perspective, which enhances the ability of the driver to judge location and speed of adjacent trailing vehicles.


Each of the side image capture devices has a field of view and is aimed rearwardly with respect to the vehicle about an axis which is at an angle, with respect to the vehicle, that is half of the horizontal field of view of the image capture device. In this manner, each of the image capture devices covers an area bounded by the side of the vehicle and extending outwardly at an angle defined by the horizontal field of view of the respective side image capture device. The center image capture device has a horizontal field of view which is symmetrical about the longitudinal axis of the vehicle. The field of view of each side image capture device intersects the field of view of the center image capture device at a point which is located a distance behind the vehicle.
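

The aiming geometry described above can be expressed compactly. In the Python sketch below, the cameras are idealized as mounted at the rear plane of the vehicle; a real mounting layout would shift the computed intersection distance.

    import math

    def side_camera_aim_deg(side_fov_deg):
        # Each side camera is aimed rearward about an axis angled outward
        # from straight-back by half of its own horizontal field of view,
        # so that one edge of its view runs along the side of the vehicle.
        return side_fov_deg / 2.0

    def fov_intersection_distance_m(half_vehicle_width_m, center_half_fov_deg):
        # Approximate distance behind the vehicle at which the center
        # camera's field-of-view edge, spreading outward from the
        # centerline at its half-angle, reaches the vehicle side, where
        # the side camera's inner field-of-view edge lies.
        return half_vehicle_width_m / math.tan(math.radians(center_half_fov_deg))

For example, with a 2-meter-wide vehicle and a center camera having a 60-degree horizontal field of view, fov_intersection_distance_m(1.0, 30.0) yields roughly 1.7 meters.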


Rear blind zones are located symmetrically behind the vehicle, extending from the rear of the vehicle to the point at which the fields of view of the side and center image capture devices intersect. Side blind zones, located laterally on respective sides of the vehicle, extend rearwardly from the forward field of view of the driver to the field of view of the respective side image capture device.


A left overlap zone and a right overlap zone extend rearward from the respective points where the horizontal fields of view of the side image capture devices intersect the field of view of the center image capture device. The overlap zones define areas within which an object will be captured both by the center image capture device and by one of the side image capture devices. An object in an overlap zone would otherwise appear on the display in multiple image portions in a redundant or duplicative fashion. In order to avoid the presentation of redundant information to the driver, and thereby avoid confusion and simplify the task of extracting information from the multiple images or combined images on the display, the synthesized image should avoid displaying an object in an overlap zone more than once.


When operating the vehicle in the reverse direction, it may be desirable to provide additional data concerning the area surrounding the immediate rear of the vehicle. This may be accomplished by utilizing non-symmetrical optics for the center image capture device in order to provide a wide angle view at a lower portion of the field of view. Alternatively, a wide angle optical system could be utilized with the electronic system selectively correcting distortion of the captured image. Such a system would provide a distortion-free image while obtaining more data, particularly in the area surrounding the back of the vehicle.
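

As one example of electronically correcting wide-angle distortion, the following Python sketch applies a single-coefficient radial (barrel) model by inverse mapping; actual lenses require a fuller calibration, and the nearest-neighbor resampling is used only for brevity.

    import numpy as np

    def undistort_radial(img, k1, fx, fy, cx, cy):
        # Map each output pixel through the forward distortion model and
        # resample the captured image at the distorted source location.
        h, w = img.shape[:2]
        ys, xs = np.mgrid[0:h, 0:w]
        xn = (xs - cx) / fx                 # normalized undistorted coordinates
        yn = (ys - cy) / fy
        r2 = xn * xn + yn * yn
        xd = xn * (1.0 + k1 * r2)           # where that ray lands on the sensor
        yd = yn * (1.0 + k1 * r2)
        src_x = np.clip(np.round(xd * fx + cx).astype(int), 0, w - 1)
        src_y = np.clip(np.round(yd * fy + cy).astype(int), 0, h - 1)
        return img[src_y, src_x]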


Use of more than three image capture devices is comprehended. In addition to side image capture devices positioned at the front sides of the vehicle and a center image capture device positioned at the center rear of the vehicle, additional image capture devices may be useful at the rear corners of the vehicle in order to further eliminate blind spots. It may additionally be desirable to provide an additional center image capture device at a higher elevation in order to obtain data immediately behind the vehicle and thereby fill in the road surface detail immediately behind the vehicle. Such additional detail is particularly useful when operating the vehicle in the reverse direction. Of course, each of the image capture devices could be a combination of two or more image capture devices.


An image display device displays a composite image made up of a left image portion, a right image portion, and a center image portion. The display may additionally include indicia such as the readout of a compass, vehicle speed, turn signals, and the like as well as other graphical or video displays, such as a navigation display, a map display, and a forward-facing vision system. In this manner, the rearview vision system may be a compass vision system or an information vision system.


The display is sized to appear as natural as possible to the driver, which is a function of the size of the display and the distance between the display and the driver. The display is preferably positioned within the driver's physiological field of view without obstructing the view through the windshield. It is known that the driver's field of view, with the head and eyes fixed forward, extends further in a downward direction than in an upward direction. The display could be located above the vertical view through the windshield, where it would be observed in the upward portion of the driver's field of view; however, a position within the lower portion of the driver's field of view is preferred.


The display is a flat panel display, such as a back-lit liquid crystal display, a plasma display, a field emission display, or a cathode ray tube. However, the synthesized image could be displayed using other display techniques such as to provide a projected or virtual image. One such virtual display is a heads-up display. The display may be mounted/attached to the dashboard, facia or header, or to the windshield at a position conventionally occupied by an interior rearview mirror.


Although various camera devices may be utilized for the image capture devices, an electro-optic, pixelated imaging array, located in the focal plane of an optical system, is preferred. Such an imaging array allows the number of pixels to be selected to meet the requirements of the rearview vision system. A commercially available display may be used, however, by leaving a horizontal band of the display for displaying alpha-numeric data, such as portions of an instrument cluster, a compass display, or the like.


The image capture devices are CMOS imaging arrays of the type manufactured by VLSI Vision Ltd. of Edinburgh, Scotland, which are described in more detail in U.S. patent application Ser. No. 08/023,918, filed Feb. 26, 1993, by Kenneth Schofield and Mark Larson for an AUTOMATIC REARVIEW MIRROR SYSTEM USING A PHOTOSENSOR ARRAY, now U.S. Pat. No. 5,550,677. However, other pixelated focal plane image-array devices, which are sensitive to visible or invisible electromagnetic radiation, could be used. The devices could be sensitive to either color or monochromatic visible radiation or to near or far infrared radiation of the type used in night-vision systems. Each image capture device could be a combination of different types of devices, such as one sensitive to visible radiation combined with one sensitive to infrared radiation. Examples of other devices known in the art include charge-coupled devices and the like.


The relationship between the driver's primary view and the image presented by the rearview vision system is enhanced. This is accomplished in a manner which provides ease of interpretation while avoiding confusion, so that the driver does not have to concentrate on or look closely at the image. In this manner, information presented on the display is naturally assimilated. This is accomplished while reducing blind spots, so that other vehicles or objects of interest to the driver will likely be displayed to the driver. Additionally, the use of perspective allows distances to be more accurately determined.


It is further envisioned that the control may provide a warning or alert to the driver of the vehicle when the actual geographic position of the vehicle (as provided by the global positioning system of the vehicle) is not where it should be based on the instructions received from the remote service center. For example, the control may instruct the driver to turn around or otherwise get back onto the given route, or the control may instruct the driver to contact the service center to obtain updated directions based on the new position of the vehicle. This may be done if, for example, the geographic position of the vehicle is outside of a predetermined or threshold range or distance of the next location or waypoint, or if the geographic position of the vehicle is past the location or waypoint. Optionally, the control may provide audible chirps or other audible signals or the like delivered by a speaker to alert the driver when approaching a turn or to indicate to the driver that the driver has missed a turn.
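

A minimal sketch of such off-route monitoring follows; the threshold values, the planar distance approximation, and the waypoint-overshoot heuristic are assumptions for illustration only.

    import math

    OFF_ROUTE_THRESHOLD_M = 200.0   # illustrative value
    ARRIVAL_RADIUS_M = 50.0         # illustrative value

    def check_route_adherence(vehicle_pos, prev_waypoint, next_waypoint):
        # vehicle_pos and waypoints are planar (x_m, y_m) coordinates.
        to_next = math.dist(vehicle_pos, next_waypoint)
        if to_next > OFF_ROUTE_THRESHOLD_M:
            return "alert: off route; turn around or request updated directions"
        leg_length = math.dist(prev_waypoint, next_waypoint)
        past_waypoint = (math.dist(vehicle_pos, prev_waypoint) > leg_length
                         and to_next > ARRIVAL_RADIUS_M)
        if past_waypoint:
            return "alert: turn missed"     # e.g., audible chirp to the driver
        return "ok"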


The control may also be operable to continuously monitor the actual geographic position of the vehicle and compare it to the locations or waypoints associated with the instructions, even after the vehicle has strayed from the given route. As discussed above, the control may provide instructions to turn around to get back on the given route. However, if the vehicle continues along a different path (such as in situations where the driver gets lost and attempts to find a way back to the given route, or where the driver may take an alternate route, such as an alternate route known to the driver or a detour or the like), but eventually arrives at one of the geographic locations or waypoints associated with the downloaded instructions, the control may be operable to recognize that the vehicle is back on the given route and resume communicating/displaying the appropriate instructions to the driver to direct the driver to the targeted destination.


During operation, as the driver is driving the vehicle, the driver may access or contact a service center via the telematics system 18 of the vehicle, such as ONSTAR®, TELEAID™, RESCU® or the like, depending on the type of vehicle, and request driving directions to a particular desired destination or targeted location. The operator or service center may provide the directions to the desired destination from the known position of the vehicle (which may be provided by the driver to the service center or may be known by the service center in response to the global positioning system of the vehicle). Preferably, the service center communicates the directions and downloads the directions to a storage location or control of the vehicle. The directions or instructions are electronically or digitally or otherwise coded or tagged or otherwise associated with or linked to a particular geographic location or waypoint either by the remote service center or by the control. The control is then operable to provide the directions in sections or parts or steps, with each separate, particular step or instruction being provided to the driver in response to the current geographic position of the vehicle (based on a signal from the global positioning system of the vehicle) generally corresponding to a particular geographic location or waypoint associated with the particular step or instruction. For example, a step may be provided in response to the vehicle completing a previous step of the directions, and/or may be provided in response to the vehicle approaching (such as the vehicle being within a threshold distance of) the street, intersection, location or the like at which the next step or turn is to be performed, without affecting the scope of the present invention.
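

The waypoint-keyed delivery of instruction steps might be sketched as follows; the trigger radius and the data layout are illustrative assumptions. Because each update scans all remaining waypoints rather than only the next one, the same logic also resumes guidance when the vehicle rejoins the route after a detour, as described above.

    import math

    class KeyedInstructions:
        # Downloaded instruction steps, each linked to the geographic
        # waypoint at which it should be communicated to the driver.
        TRIGGER_RADIUS_M = 50.0   # illustrative trigger threshold

        def __init__(self, steps):
            # steps: list of ((x_m, y_m), "instruction text") pairs
            self.steps = list(steps)

        def update(self, vehicle_pos):
            for i, (waypoint, text) in enumerate(self.steps):
                if math.dist(vehicle_pos, waypoint) < self.TRIGGER_RADIUS_M:
                    self.steps = self.steps[i + 1:]
                    return text   # step to display/announce now
            return None           # no waypoint reached; say nothing yet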


Therefore, the present invention provides a navigation system which is operable to provide step-by-step instructions to a targeted destination to a driver of a vehicle while the driver is driving the vehicle toward the targeted destination. The instructions are downloaded from a remote database at a remote service center or the like via a telematics system or wireless communication system of the vehicle. The instructions may then be provided to the driver only as needed by the driver, since they are coded or associated with or linked to particular geographic locations or waypoints, thereby simplifying the instructions so that the driver will be able to understand each step and execute the step accordingly. The instructions may be downloaded to a storage or memory location or system of the vehicle during a brief communication or connection with the remote service center, so that the driver does not have to remain connected with the remote service center or repeatedly contact the service center to receive updated instructions as the driver drives the vehicle toward the targeted destination. The downloaded instructions are only the local instructions and thus do not require an excessive amount of time to download nor do they require an excessive amount of storage space or memory on the control. Thus, the remote service center, operator, computerized system or the like maintains the detailed maps and directories, and feeds back or downloads wirelessly to the vehicle the local information or map for communication or display to the driver of the vehicle for directional guidance information.


Optionally, the telematics system or communication link or other system may be operable to download data, such as via ONSTAR® or other communication system, or via a global positioning system or the like, to the vehicle or to a control or system or accessory of the vehicle. The data may be used to adjust an accessory or system of the vehicle or to set the accessory or system of the vehicle to a desired or appropriate setting in response to the data and/or in response to other vehicle or driver characteristics or status.


For example, data pertaining to the location of the vehicle, the time of day, the date, weather conditions and/or driving conditions may be provided to the vehicle for use in adjustment of an accessory or system of the vehicle. For example, such data may be used by a seat adjustment system, such that adjustment of the driver or passenger seat of the vehicle may be made in response to changes in such data. This may be beneficial because, during long journeys, the seat adjustment or position at the start of the journey may no longer be comfortable or appropriate later in the journey. The seat adjustment system of the present invention thus may be operable to adjust the seat position or lumbar support or the like (and the mirror position or positions may also be adjusted accordingly) in response to various conditions, such as the length of the journey, altitude of the vehicle, driving conditions and/or the like. The seat adjustment system thus may make dynamic adjustments of the seat or seats to keep the driver or occupants of the vehicle comfortable or alert.


Optionally, it is envisioned that the seats of the vehicle may have a massage capability. In such applications, the seat adjustment system or seat control system may detect that the vehicle is on a long journey, and may activate the massage function to enhance the comfort to the driver of the vehicle. Such an adjustment or control may also be enabled if rural highway conditions are detected or other driving conditions where such a feature may be desired. It is further envisioned that the seat adjustment or control system may be programmable, such that a particular driver or occupant may indicate what changes he or she may desire in certain conditions. The seat adjustment system may then automatically activate such features or changes when the specified conditions are detected.
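

A condition-driven adjustment of this kind might be sketched as follows; the trigger values and the seat interface names (vary_lumbar_support, enable_massage and the like) are assumptions for illustration, not features of any particular seat system.

    def adjust_for_conditions(seat, trip_minutes, conditions, preferences):
        # Vary the seat periodically on long journeys so the position
        # chosen at the start does not grow uncomfortable, and activate
        # the massage function when the programmed conditions are met.
        if trip_minutes >= 90 and trip_minutes % 90 == 0:
            seat.vary_lumbar_support()
        wants_massage = preferences.get("massage_on_long_trips", False)
        long_trip = wants_massage and trip_minutes >= 60
        if seat.has_massage and (conditions.get("rural_highway") or long_trip):
            seat.enable_massage()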


Optionally, the adjustment may also or otherwise be made in response to biometric data about the driver or occupant that is presently occupying the seat. It is known to use body measurements to order clothing tailored to the individual. Many catalogue clothing companies are now taking body scan measurements to order clothing online. These measurements ensure a substantially perfect fit of the ordered clothing. Such body scan measurements or data or other such biometric data may be entered into the vehicle seat adjustment system, or may be communicated to the vehicle seat adjustment system, such as via the telematics system or other communication system or data system or the like. The seat adjustment system may then adjust the seat (and the mirrors may be adjusted as well) in response to detection of a particular person and/or their biometric characteristics or data.


Referring now to FIGS. 3 and 4, a biometric seat adjustment system 110 is operable to adjust the seats 112 of a vehicle 114. The biometric seat adjustment system 110 may adjust a driver seat 112a, a front passenger seat 112b, and/or one or more rear passenger seats 112c via a powered seat adjustment mechanism 116 (FIG. 4) at the respective seats in response to biometric data or information pertaining to a person that may be sitting in or may be about to sit in one of the vehicle seats. As shown in FIG. 4, biometric seat adjustment system 110 includes a control 118, which may store biometric data 120 in a memory and/or may receive biometric data 120 from a remote source or an input device or communication (not shown). Control 118 is operable to control or adjust the seat adjustment mechanism 116 to adjust the seats 112 of the vehicle (such as lumbar support, seat travel, seat height, etc.) in response to the stored biometric data and/or input. For example, a person may have their biometric data or characteristics stored in a memory of control 118, and may select a particular code or setting corresponding to their data (such as “position 1” of the seat adjustment system), whereby control 118 adjusts the adjustment mechanism of the particular selected seat in response to the data. Alternately, a person may have their biometric data or characteristics stored in a portable device (such as a key fob, PDA, or the like) or at a remote location or device, and may have the biometric data or characteristic communicated to the control 118, whereby control 118 may adjust the adjustment mechanism of the particular selected seat in response to the communication. The control 118 may also be operable to control or adjust a setting of an interior rearview mirror 122, an exterior rearview mirror or mirrors 124, a steering wheel 126 and/or the like in response to the input or communication.
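

The behavior of control 118 might be sketched as follows; the profile fields, the memory-position keying, and the scale factors mapping body dimensions to actuator settings are placeholders for illustration, not calibrated values.

    class BiometricSeatControl:
        # Biometric profiles stored in memory (or communicated from a key
        # fob, PDA or remote source) drive the powered seat adjuster and
        # the mirror/steering settings.

        def __init__(self, seat_adjuster, mirrors, steering_column):
            self.profiles = {}   # memory position ("position 1", ...) -> data
            self.seat_adjuster = seat_adjuster
            self.mirrors = mirrors
            self.steering_column = steering_column

        def store_profile(self, position, biometrics):
            self.profiles[position] = biometrics

        def recall(self, position):
            data = self.profiles[position]
            self.seat_adjuster.set(
                travel=data["leg_length_cm"] * 0.6,      # placeholder mapping
                height=data["height_cm"] * 0.25,
                lumbar=data["torso_length_cm"] * 0.4,
            )
            self.mirrors.aim(eye_height_cm=data["height_cm"] * 0.94)
            self.steering_column.tilt(torso_cm=data["torso_length_cm"])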


The present invention thus provides a vehicle seat adjustment in response to biometric data, such as various body dimensions, weight, sex, age and the like. Such body dimension measurements, such as those taken for ordering clothing, may be made on a person and may be contained in the person's computer or the like, along with other biometric data or characteristics of the person (and optionally may include preferences of the person). These data may be loaded into the vehicle computer and/or seat adjustment system. The seat adjustment system receives the data and may be operable to pre-adjust the driver seat (or passenger seat or other seat) of the vehicle in response to the data so that the seat that the person will sit in is set to the person's precise body measurements and other data. Additionally, the adjustment system may pre-adjust an interior rearview mirror, exterior rearview mirror or mirrors, steering wheel and/or the like in response to the measurements or inputs.


The body dimensions may be saved in a person's computer or PDA, such as is done for ordering clothing. Such measurement and saving technology now exists and is used by some catalogue companies, such as Lands' End and/or Levi (which provides for measurements in their stores, with these measurements stored in the person's file for ordering perfect-fit jeans). Alternately, a vehicle dealer may perform simple measurements on a person (like a tailor with a new suit). This information may then be used to adjust the seat in the person's vehicle to the person's body size, weight, age, sex, etc. For example, the vehicle dealer may download the information or data for a person or persons (such as a driver and their spouse) into memory positions 1 and 2 of a vehicle seat adjustment memory of the person's vehicle. Optionally, the data may be downloaded into a Bluetooth (or other communication protocol) enabled phone, PDA or key fob, which may then be used to communicate the data to the targeted vehicle. Such an approach would be particularly suitable for and advantageous to use with rental cars.


The biometric seat adjustment system preferably utilizes the normal memory seat adjustment system or mechanisms currently in some vehicles, such as high end vehicles. While the seats today can be adjusted to a person's particular preferences, it is likely that most people take a while to get themselves comfortable. By using a few body dimensions and the person's weight (and perhaps other information or characteristics as well), the present invention may set the seat or seats substantially perfectly before or when the person or persons first get into the vehicle.


It is envisioned that the biometric data measurement event may occur in the vehicle (such as by an in-vehicle laser or similar scanners and/or cameras that measure the driver's and/or passengers' biometric dimensions). Alternately, the biometric data may be measured external to the vehicle (such as at a dealership “booth” when the driver is ordering and/or receiving delivery of the vehicle or at a biometric measurement booth at a Mall or other store or facility or the like) and may be provided to the vehicle in a manner such as described above and/or via, for example, an ONSTAR® telematics service or via a similar telecommunication system or event or the like.


It is further envisioned that more than the seat or seats (lumbar support/seat travel/seat height, etc.) may be adjusted in response to the individual biometric data stored in or communicated to the vehicle memory system. For example, exterior and/or interior mirror reflective elements may be moved or adjusted in response to such stored or input biometric data, which may be called up or loaded when that particular individual sits in one of the seats of the vehicle. Additionally, other accessories or systems of the vehicle may be adjusted or customized, such as suspension characteristics, steering column tilt, and size of display characters (for example, older drivers may desire larger alphanumerical display digits), and/or the like, in response to the biometric data of a particular individual.


Therefore, the present invention provides a navigation system which is operable to provide step-by-step instructions to a targeted destination to a driver of a vehicle while the driver is driving the vehicle toward the targeted destination. The instructions are downloaded from a remote database at a remote service center or the like via a telematics system or wireless communication system of the vehicle. The instructions may then be provided to the driver only as needed by the driver, since they are coded or associated with or linked to particular geographic locations or waypoints, thereby simplifying the instructions so that the driver will be able to understand each step and execute the step accordingly. The present invention may also provide a seat adjustment function that automatically adjusts the seat of the vehicle in response to data communicated to the vehicle via a telematics system or a global positioning system or the like. The seat adjustment system or function may be operable to adjust the seat of the vehicle in response to biometric data of the person occupying the seat. The interior and/or exterior rearview mirrors may also be adjusted in response to the data or seat adjustments.


Changes and modifications in the specifically described embodiments may be carried out without departing from the principles of the present invention, which is intended to be limited only by the scope of the appended claims as interpreted according to the principles of patent law.

Claims
  • 1. A driver assist system for a vehicle, said driver assist system comprising: a color video camera disposed at a vehicle equipped with said driver assist system;said color video camera having a field of view exterior of the equipped vehicle;wherein said color video camera comprises a rear backup camera disposed at a rear portion of the equipped vehicle and having a field of view rearward of the equipped vehicle;wherein the field of view of said color video camera encompasses a ground area to the rear of the equipped vehicle having a width at least substantially an entire rear fender width of the equipped vehicle;said color video camera comprising a lens and a two-dimensional array of photosensing elements arranged in rows of photosensing elements and columns of photosensing elements;wherein an image processor processes image data captured by said color video camera;at least one non-visual sensor;a video display screen viewable by a driver of the equipped vehicle when the driver is normally operating the equipped vehicle;wherein, during at least one of (i) a reversing maneuver of the equipped vehicle and (ii) a parking maneuver of the equipped vehicle, images derived, at least in part, from image data captured by said color video camera are displayed by said video display screen to assist the driver in operating the equipped vehicle;wherein said video display screen is operable to display images derived, at least in part, from image data captured by said color video camera with a display intensity greater than 200 candelas/sq. meter for viewing by the driver of the equipped vehicle;wherein at least one indication is provided to a driver of the equipped vehicle who is at least one of (i) executing a reversing maneuver of the equipped vehicle and (ii) parking the equipped vehicle;wherein said at least one indication comprises an indication provided, at least in part, responsive to detection by said non-visual sensor of at least one object external of the equipped vehicle; andwherein distortion of images derived, at least in part, from image data captured by said color video camera, and displayed by said video display screen, is electronically corrected.
  • 2. The driver assist system of claim 1, wherein said at least one indication comprises an indication provided, at least in part, responsive to said image processor processing image data captured by said color video camera.
  • 3. The driver assist system of claim 1, wherein said at least one indication comprises a visible cue displayed by said video display screen, and wherein said visible cue at least one of (i) indicates distance to an object exterior of the equipped vehicle, (ii) indicates proximity to an object exterior of the equipped vehicle, (iii) highlights an object exterior of the equipped vehicle and indicates distance to an object exterior of the equipped vehicle, (iv) highlights an object exterior of the equipped vehicle and (v) highlights an object that the equipped vehicle is in jeopardy of colliding with.
  • 4. The driver assist system of claim 1, wherein said at least one indication comprises an audible alert, and wherein said video display screen comprises one of (a) a liquid crystal display screen and (b) an organic light emitting diode display screen.
  • 5. The driver assist system of claim 4, wherein said audible alert alerts the driver that the equipped vehicle is approaching an object exterior to the equipped vehicle.
  • 6. The driver assist system of claim 1, wherein said video display screen is operable to display images derived, at least in part, from image data captured by said color video camera with a display intensity greater than 300 candelas/sq. meter for viewing by the driver of the equipped vehicle, and wherein said video display screen comprises one of (a) a liquid crystal display screen and (b) an organic light emitting diode display screen.
  • 7. The driver assist system of claim 1, wherein said video display screen is operable to display images derived, at least in part, from image data captured by said color video camera with a display intensity greater than 400 candelas/sq. meter for viewing by the driver of the equipped vehicle, and wherein said video display screen comprises one of (a) a liquid crystal display screen and (b) an organic light emitting diode display screen.
  • 8. The driver assist system of claim 1, wherein said non-visual sensor comprises an infrared sensor.
  • 9. The driver assist system of claim 1, wherein said non-visual sensor comprises an ultrasonic sensor.
  • 10. The driver assist system of claim 1, wherein said non-visual sensor comprises a radar sensor.
  • 11. The driver assist system of claim 1, wherein said color video camera is part of a multi-camera vision system of the equipped vehicle, said multi-camera vision system comprising at least two other color video cameras, wherein each of said two other color video cameras has a field of view exterior of the equipped vehicle, wherein one of said two other color video cameras is mounted at a driver-side of the equipped vehicle and the other of said two other color video cameras is mounted at a passenger-side of the equipped vehicle, and wherein the one of said two side-mounted cameras mounted at the driver-side has a field of view at the driver-side of the equipped vehicle and the other of said two side-mounted cameras mounted at the passenger-side has a field of view at the passenger-side of the equipped vehicle, and wherein the field of view of said one of said two other color video cameras that is mounted at the driver-side overlaps the field of view of said rear backup camera, and wherein the field of view of said other of said two other color video cameras that is mounted at the passenger-side overlaps the field of view of said rear backup camera.
  • 12. The driver assist system of claim 11, wherein a unitary image is synthesized from image data captured by cameras of said multi-camera vision system of the equipped vehicle, and wherein said synthesized image approximates a view as would be seen by a virtual camera at a single location exterior of the equipped vehicle, and wherein said synthesized image is displayed by said video display screen that is viewable by the driver of the equipped vehicle when normally operating the equipped vehicle.
  • 13. The driver assist system of claim 1, wherein said driver assist system comprises a user input and wherein said user input comprises touch input.
  • 14. The driver assist system of claim 13, wherein at least one of (i) said user input comprises an input for a telematics system of the equipped vehicle, (ii) said user input comprises an input for a navigational system of the equipped vehicle and (iii) said user input comprises an input for a biometric system of the equipped vehicle.
  • 15. The driver assist system of claim 1, wherein said video display screen is disposed at one of (i) a windshield electronics module of the equipped vehicle, (ii) an instrument panel of the equipped vehicle and (iii) a console of the equipped vehicle.
  • 16. The driver assist system of claim 1, comprising at least one of (a) a control operable to control at least one accessory of the equipped vehicle in accordance with a characteristic of an occupant of the equipped vehicle, and wherein said at least one accessory comprises a seat adjustment system operable to adjust a seat of the equipped vehicle in accordance with a biometric characteristic of an occupant of the equipped vehicle, and wherein said seat adjustment system is operable to adjust a seat of the equipped vehicle in response to biometric data, said biometric data pertaining to the occupant of the seat of the equipped vehicle, and (b) a control operable to communicate with an external service provider via a wireless communication link between the equipped vehicle and the external service provider, wherein said control receives an input from the driver of the equipped vehicle and responsive thereto establishes said wireless communication link between the equipped vehicle and the external service provider, and wherein said control controls at least one accessory of the equipped vehicle responsive to a geographic location of the equipped vehicle as determined by a global positioning system of the equipped vehicle, and wherein data from the external service provider is downloaded to said control via said wireless communication link, and wherein said control comprises memory for storing downloaded data at least after said wireless communication link established between said control and the external service provider is disconnected, and wherein said downloaded data comprises downloaded driving instruction data useful for instructing the driver of the equipped vehicle how to drive from an initial location to a destination location, and wherein driving instructions derived, at least in part, from said downloaded driving instruction data are displayed, when the equipped vehicle is not executing a reversing maneuver, by said video display screen for viewing by the driver of the equipped vehicle, and wherein said driving instructions are displayed by said video display screen in a step-by-step manner, with at least some driving instruction steps being displayed by said video display screen after said wireless communication link between said control and the external service provider is disconnected, and wherein said video display screen is part of an interior rearview mirror assembly of the equipped vehicle.
  • 17. The driver assist system of claim 1, wherein said at least one indication comprises a visual indication displayed by said video display screen, and wherein said visual indication provides an indication of distance from the equipped vehicle, and wherein said visual indication is provided to the driver of the equipped vehicle when the driver is parking the equipped vehicle during a parking maneuver of the equipped vehicle, and wherein the equipped vehicle comprises a forward-facing camera disposed at a front portion of the equipped vehicle that is operable to capture video images of the ground surface generally in front of the equipped vehicle, and wherein the field of view of said forward-facing camera encompasses a ground area to the front of the equipped vehicle having a width at least substantially an entire front fender width of the equipped vehicle.
  • 18. The driver assist system of claim 1, wherein said video display screen comprises a multi-pixel TFT liquid crystal display screen, and wherein said rear backup camera disposed at the rear portion of the equipped vehicle has a horizontal field of view that is generally symmetrical about the longitudinal axis of the equipped vehicle.
  • 19. The driver assist system of claim 1, wherein said rear backup camera disposed at the rear portion of the equipped vehicle is positioned generally at the longitudinal centerline of the equipped vehicle and has a horizontal field of view that is generally symmetrical about the longitudinal axis of the equipped vehicle.
  • 20. A driver assist system for a vehicle, said driver assist system comprising: a camera disposed at a vehicle equipped with said driver assist system;said camera having a field of view exterior of the equipped vehicle;wherein said camera comprises a rear backup camera disposed at a rear portion of the equipped vehicle and having a field of view rearward of the equipped vehicle;said camera comprising a lens and a two-dimensional array of photosensing elements arranged in rows of photosensing elements and columns of photosensing elements;wherein an image processor processes image data captured by said camera;at least one non-visual sensor;a video display screen viewable by a driver of the equipped vehicle when the driver is normally operating the equipped vehicle;wherein said video display screen comprises a multi-pixel display screen;wherein, during at least one of (i) a reversing maneuver of the equipped vehicle and (ii) a parking maneuver of the equipped vehicle, images derived, at least in part, from image data captured by said camera are displayed by said video display screen to assist the driver in operating the equipped vehicle;wherein said video display screen is operable to display images derived, at least in part, from image data captured by said camera with a display intensity greater than 200 candelas/sq. meter for viewing by the driver of the equipped vehicle;wherein at least one indication is provided to a driver of the equipped vehicle who is at least one of (i) executing a reversing maneuver of the equipped vehicle and (ii) parking the equipped vehicle;wherein at least one of (a) said at least one indication comprises an indication provided, at least in part, responsive to detection by said non-visual sensor of at least one object external of the equipped vehicle and (b) said at least one indication comprises an indication provided, at least in part, responsive to said image processor processing image data captured by said rear backup camera; andwherein said camera is part of a multi-camera vision system of the equipped vehicle, said multi-camera vision system comprising at least two other cameras, wherein each of said two other cameras has a field of view exterior of the equipped vehicle, wherein one of said two other cameras is mounted at a driver-side of the equipped vehicle and the other of said two other cameras is mounted at a passenger-side of the equipped vehicle, and wherein the one of said two side-mounted cameras mounted at the driver-side has a field of view at the driver-side of the equipped vehicle and the other of said two side-mounted cameras mounted at the passenger-side has a field of view at the passenger-side of the equipped vehicle, and wherein the field of view of said one of said two other cameras that is mounted at the driver-side overlaps the field of view of said rear backup camera, and wherein the field of view of said other of said two other cameras that is mounted at the passenger-side overlaps the field of view of said rear backup camera.
  • 21. The driver assist system of claim 20, wherein a unitary image is synthesized from image data captured by cameras of said multi-camera vision system of the equipped vehicle, and wherein said synthesized image approximates a view as would be seen by a virtual camera at a single location exterior of the equipped vehicle, and wherein said synthesized image is displayed by said video display screen that is viewable by the driver of the equipped vehicle when normally operating the equipped vehicle.
  • 22. The driver assist system of claim 21, wherein said rear backup camera disposed at the rear portion of the equipped vehicle is positioned generally at the longitudinal centerline of the equipped vehicle and has a horizontal field of view that is generally symmetrical about the longitudinal axis of the equipped vehicle.
  • 23. The driver assist system of claim 21, wherein said at least one indication comprises an indication provided, at least in part, responsive to said image processor processing image data captured by said rear backup camera.
  • 24. The driver assist system of claim 21, wherein said at least one indication comprises a visual indication displayed by said video display screen and wherein said visual indication comprises a visual cue displayed by said video display screen, and wherein said visual cue at least one of (i) indicates distance to an object exterior of the equipped vehicle, (ii) indicates proximity to an object exterior of the equipped vehicle, (iii) highlights an object exterior of the equipped vehicle and indicates distance to an object exterior of the equipped vehicle, (iv) highlights an object exterior of the equipped vehicle and (v) highlights an object that the equipped vehicle is in jeopardy of colliding with.
  • 25. The driver assist system of claim 21, wherein said at least one indication comprises an audible alert and wherein said audible alert alerts the driver that the equipped vehicle is approaching an object exterior to the equipped vehicle.
  • 26. The driver assist system of claim 21, wherein the field of view of said rear backup camera encompasses a ground area to the rear of the equipped vehicle having a width at least substantially an entire rear fender width of the equipped vehicle.
  • 27. The driver assist system of claim 21, wherein said rear backup camera comprises a CMOS rear backup camera and wherein said CMOS rear backup camera comprises wide-angle optics, and wherein distortion of images derived, at least in part, from image data captured by said color video camera, and displayed by said video display screen, is electronically corrected.
  • 28. The driver assist system of claim 21, wherein said video display screen comprises a liquid crystal display screen and wherein said liquid crystal display screen comprises a multi-pixel TFT liquid crystal display screen backlit by a plurality of white light emitting light emitting diodes.
  • 29. The driver assist system of claim 28, wherein said non-visual sensor comprises an ultrasonic sensor.
  • 30. A driver assist system for a vehicle, said driver assist system comprising: a color video camera disposed at a vehicle equipped with said driver assist system;said color video camera having a field of view exterior of the equipped vehicle;wherein said color video camera comprises a rear backup camera disposed at a rear portion of the equipped vehicle and having a field of view rearward of the equipped vehicle;said color video camera comprising a lens and a two-dimensional array of photosensing elements arranged in rows of photosensing elements and columns of photosensing elements;wherein an image processor processes image data captured by said color video camera;at least one non-visual sensor;a video display screen viewable by a driver of the equipped vehicle when the driver is normally operating the equipped vehicle;wherein said video display screen comprises a multi-pixel display screen;wherein, during at least one of (i) a reversing maneuver of the equipped vehicle and (ii) a parking maneuver of the equipped vehicle, images derived, at least in part, from image data captured by said color video camera are displayed by said video display screen to assist the driver in operating the equipped vehicle;wherein said video display screen is operable to display images derived, at least in part, from image data captured by said color video camera with a display intensity greater than 200 candelas/sq. meter for viewing by the driver of the equipped vehicle;wherein at least one indication is provided to a driver of the equipped vehicle who is at least one of (i) executing a reversing maneuver of the equipped vehicle and (ii) parking the equipped vehicle;wherein said at least one indication comprises an indication provided, at least in part, responsive to detection by said non-visual sensor of at least one object external of the equipped vehicle;wherein said at least one indication comprises a visual indication displayed by said video display screen; andwherein the equipped vehicle comprises a forward-facing camera disposed at a front portion of the equipped vehicle that is operable to capture video images of the ground surface generally in front of the equipped vehicle, and wherein the field of view of said forward-facing camera encompasses a ground area to the front of the equipped vehicle having a width at least substantially an entire front fender width of the equipped vehicle.
  • 31. The driver assist system of claim 30, wherein the field of view of said rear backup camera encompasses a ground area to the rear of the equipped vehicle having a width at least substantially an entire rear fender width of the equipped vehicle.
  • 32. The driver assist system of claim 30, wherein said rear backup camera is part of a multi-camera vision system of the equipped vehicle, said multi-camera vision system comprising at least two other color video cameras, wherein each of said two other color video cameras has a field of view exterior of the equipped vehicle, wherein one of said two other color video cameras is mounted at a driver-side of the equipped vehicle and the other of said two other color video cameras is mounted at a passenger-side of the equipped vehicle, and wherein the one of said two side-mounted cameras mounted at the driver-side has a field of view at the driver-side of the equipped vehicle and the other of said two side-mounted cameras mounted at the passenger-side has a field of view at the passenger-side of the equipped vehicle, and wherein the field of view of said one of said two other color video cameras that is mounted at the driver-side overlaps the field of view of said rear backup camera, and wherein the field of view of said other of said two other color video cameras that is mounted at the passenger-side overlaps the field of view of said rear backup camera, and wherein a unitary image is synthesized from image data captured by cameras of said multi-camera vision system of the equipped vehicle, and wherein said synthesized image approximates a view as would be seen by a virtual camera at a single location exterior of the equipped vehicle, and wherein said synthesized image is displayed by said video display screen that is viewable by the driver of the equipped vehicle when normally operating the equipped vehicle.
  • 33. The driver assist system of claim 30, wherein said rear backup camera comprises wide-angle optics and wherein distortion of images derived, at least in part, from image data captured by said color video camera, and displayed by said video display screen, is electronically corrected, and wherein said video display screen comprises a liquid crystal display screen, and wherein said liquid crystal display screen comprises a multi-pixel TFT liquid crystal display screen backlit by a plurality of white light emitting light emitting diodes.
  • 34. The driver assist system of claim 33, wherein said at least one indication comprises an indication provided, at least in part, responsive to said image processor processing image data captured by said rear backup camera.
  • 35. The driver assist system of claim 30, wherein said at least one indication comprises an audible alert and wherein said audible alert alerts the driver that the equipped vehicle is approaching an object exterior to the equipped vehicle, and wherein another of said at least one indication comprises a visual indication displayed by said video display, and wherein said visual indication comprises a visual cue and wherein said visual cue at least one of (i) indicates distance to an object exterior of the equipped vehicle, (ii) indicates proximity to an object exterior of the equipped vehicle, (iii) highlights an object exterior of the equipped vehicle and indicates distance to an object exterior of the equipped vehicle, (iv) highlights an object exterior of the equipped vehicle and (v) highlights an object that the equipped vehicle is in jeopardy of colliding with.
  • 36. A driver assist system for a vehicle, said driver assist system comprising: a color video camera disposed at a vehicle equipped with said driver assist system;said color video camera having a field of view exterior of the equipped vehicle;wherein said color video camera comprises a rear backup camera disposed at a rear portion of the equipped vehicle and having a field of view rearward of the equipped vehicle;said color video camera comprising a lens and a two-dimensional array of photosensing elements arranged in rows of photosensing elements and columns of photosensing elements;wherein an image processor processes image data captured by said color video camera;at least one non-visual sensor;a video display screen viewable by a driver of the equipped vehicle when the driver is normally operating the equipped vehicle;wherein said video display screen comprises a multi-pixel display screen;wherein, during at least one of (i) a reversing maneuver of the equipped vehicle and (ii) a parking maneuver of the equipped vehicle, images derived, at least in part, from image data captured by said color video camera are displayed by said video display screen to assist the driver in operating the equipped vehicle;wherein said video display screen is operable to display images derived, at least in part, from image data captured by said color video camera with a display intensity greater than 200 candelas/sq. meter for viewing by the driver of the equipped vehicle;wherein at least one indication is provided to a driver of the equipped vehicle who is at least one of (i) executing a reversing maneuver of the equipped vehicle and (ii) parking the equipped vehicle;wherein at least one of (a) said at least one indication comprises an indication provided, at least in part, responsive to detection by said non-visual sensor of at least one object external of the equipped vehicle and (b) said at least one indication comprises an indication provided, at least in part, responsive to said image processor processing image data captured by said rear backup camera;wherein said color video camera is part of a multi-camera vision system of the equipped vehicle, said multi-camera vision system comprising at least two other color video cameras, wherein each of said two other color video cameras has a field of view exterior of the equipped vehicle, wherein one of said two other color video cameras is mounted at a driver-side of the equipped vehicle and the other of said two other color video cameras is mounted at a passenger-side of the equipped vehicle, and wherein the one of said two side-mounted cameras mounted at the driver-side has a field of view at the driver-side of the equipped vehicle and the other of said two side-mounted cameras mounted at the passenger-side has a field of view at the passenger-side of the equipped vehicle, and wherein the field of view of said one of said two other color video cameras that is mounted at the driver-side overlaps the field of view of said rear backup camera, and wherein the field of view of said other of said two other color video cameras that is mounted at the passenger-side overlaps the field of view of said rear backup camera, and wherein a unitary image is synthesized from image data captured by cameras of said multi-camera vision system of the equipped vehicle, and wherein said synthesized image approximates a view as would be seen by a virtual camera at a single location exterior of the equipped vehicle, and wherein said synthesized image is displayed by said video display screen that is viewable by the driver of the equipped vehicle when normally operating the equipped vehicle; andwherein distortion of images derived, at least in part, from image data captured by said color video camera, and displayed by said video display screen, is electronically corrected.
  • 37. The driver assist system of claim 36, wherein the field of view of said rear backup camera encompasses a ground area to the rear of the equipped vehicle having a width at least substantially an entire rear fender width of the equipped vehicle.
  • 38. The driver assist system of claim 36, wherein the equipped vehicle comprises a forward-facing camera disposed at a front portion of the equipped vehicle that is operable to capture video images of the ground surface generally in front of the equipped vehicle, and wherein the field of view of said forward-facing camera encompasses a ground area to the front of the equipped vehicle having a width at least substantially an entire front fender width of the equipped vehicle.
  • 39. The driver assist system of claim 38, wherein said rear backup camera disposed at the rear portion of the equipped vehicle is positioned generally at the longitudinal centerline of the equipped vehicle and has a horizontal field of view that is generally symmetrical about the longitudinal axis of the equipped vehicle.
  • 40. The driver assist system of claim 39, wherein said at least one indication comprises a visual indication displayed by said video display screen and wherein said visual indication comprises a visual cue, and wherein said visual cue at least one of (i) indicates distance to an object exterior of the equipped vehicle, (ii) indicates proximity to an object exterior of the equipped vehicle and (iii) highlights an object exterior of the equipped vehicle and indicates distance to an object exterior of the equipped vehicle.
  • 41. The driver assist system of claim 37, wherein said at least one indication comprises an indication provided, at least in part, responsive to said image processor processing image data captured by said rear backup camera.
  • 42. The driver assist system of claim 37, wherein said at least one indication comprises an indication provided, at least in part, responsive to detection by said non-visual sensor of at least one object external of the equipped vehicle, and wherein said non-visual sensor comprises a sensor selected from the group consisting of an ultrasonic sensor, a radar sensor and an infrared sensor.
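For illustration only, the following is a minimal sketch of how a unitary image may be synthesized from the rear backup camera and the two side-mounted cameras recited in claim 36, approximating a view from a single virtual camera. It assumes OpenCV-style ground-plane homographies; the frame sizes, canvas size, and homography matrices below are placeholder values, not parameters taken from this disclosure, and the simple overwrite compositing is a stand-in for the claimed synthesis.

```python
import numpy as np
import cv2  # OpenCV is assumed available for warping and compositing

H, W = 240, 320  # per-camera frame size (placeholder)

# Stand-in frames for the rear backup, driver-side and passenger-side cameras.
rear = np.full((H, W, 3), 60, dtype=np.uint8)
left = np.full((H, W, 3), 120, dtype=np.uint8)
right = np.full((H, W, 3), 180, dtype=np.uint8)

# Hypothetical ground-plane homographies; in a real system these would come
# from extrinsic calibration of each camera relative to the virtual viewpoint.
canvas_size = (640, 480)  # (width, height) of the composite virtual-camera view
H_left = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])
H_right = np.array([[1.0, 0.0, 320.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])
H_rear = np.array([[1.0, 0.0, 160.0], [0.0, 1.0, 240.0], [0.0, 0.0, 1.0]])

canvas = np.zeros((480, 640, 3), dtype=np.uint8)
for img, Hm in ((left, H_left), (right, H_right), (rear, H_rear)):
    # Warp each camera frame into the shared virtual-camera coordinate frame.
    warped = cv2.warpPerspective(img, Hm, canvas_size)
    mask = warped.any(axis=2)
    canvas[mask] = warped[mask]  # later cameras overwrite in overlap regions

print(canvas.shape)  # the single synthesized frame sent to the video display
```

Because the side cameras' fields of view overlap that of the rear backup camera, the overlap regions allow the warped frames to be stitched without gaps; a production system would blend rather than overwrite in those regions.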
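Claim 36 further recites that distortion of the displayed images is electronically corrected. The sketch below shows one conventional way such correction may be performed, using OpenCV's undistort with a pinhole camera matrix and radial distortion coefficients; the intrinsics and coefficients are hypothetical placeholders, not a calibration described in this disclosure.

```python
import numpy as np
import cv2

# Hypothetical intrinsics for a 640x480 rear camera with mild barrel
# distortion (placeholder values only).
K = np.array([[320.0, 0.0, 320.0],
              [0.0, 320.0, 240.0],
              [0.0, 0.0, 1.0]])
dist = np.array([-0.30, 0.08, 0.0, 0.0, 0.0])  # k1, k2, p1, p2, k3

frame = np.full((480, 640, 3), 128, dtype=np.uint8)  # stand-in captured frame

# Electronically correct the lens distortion before the frame is displayed.
corrected = cv2.undistort(frame, K, dist)
print(corrected.shape)
```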
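Claims 40 through 42 recite visual cues that indicate distance or proximity to, or highlight, an object exterior of the vehicle, provided responsive to image processing and/or a non-visual sensor such as an ultrasonic, radar or infrared sensor. The following is a minimal, hypothetical sketch of such cue-selection logic during a reversing maneuver; the Detection type, the 2.0 meter warning threshold and the cue strings are illustrative assumptions, not features recited in the claims.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str        # e.g. object class from the image processor
    distance_m: float # range from an ultrasonic/radar sensor or image analysis

def reversing_cue(detections: list[Detection], warn_at_m: float = 2.0) -> str:
    """Select the visual cue to overlay on the displayed rear image."""
    if not detections:
        return "no cue"
    nearest = min(detections, key=lambda d: d.distance_m)
    if nearest.distance_m <= warn_at_m:
        # Close object: highlight it and show its distance.
        return f"HIGHLIGHT {nearest.label}: {nearest.distance_m:.1f} m"
    # Farther object: show distance only.
    return f"{nearest.label}: {nearest.distance_m:.1f} m"

print(reversing_cue([Detection("pedestrian", 1.4), Detection("post", 3.0)]))
```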
CROSS REFERENCE TO RELATED APPLICATIONS

The present application is a continuation of U.S. patent application Ser. No. 14/211,256, filed Mar. 14, 2014, now U.S. Pat. No. 9,014,966, which is a continuation of U.S. patent application Ser. No. 14/033,963, filed Sep. 23, 2013, now U.S. Pat. No. 8,676,491, which is a continuation of U.S. patent application Ser. No. 13/621,382, filed Sep. 17, 2012, now U.S. Pat. No. 8,543,330, which is a continuation of U.S. patent application Ser. No. 13/399,347, filed Feb. 17, 2012, now U.S. Pat. No. 8,271,187, which is a continuation of U.S. patent application Ser. No. 13/209,645, filed Aug. 15, 2011, now U.S. Pat. No. 8,121,787, which is a continuation of U.S. patent application Ser. No. 12/908,481, filed Oct. 20, 2010, now U.S. Pat. No. 8,000,894, which is a continuation of U.S. patent application Ser. No. 12/724,895, filed Mar. 16, 2010, now U.S. Pat. No. 7,822,543, which is a continuation of U.S. patent application Ser. No. 12/405,614, filed Mar. 17, 2009, now U.S. Pat. No. 7,711,479, which is a continuation of U.S. patent application Ser. No. 11/935,800, filed Nov. 6, 2007, now U.S. Pat. No. 7,571,042, which is a continuation of U.S. patent application Ser. No. 11/624,381, filed Jan. 18, 2007, now U.S. Pat. No. 7,490,007, which is a continuation of U.S. patent application Ser. No. 10/645,762, filed Aug. 20, 2003, now U.S. Pat. No. 7,167,796, which claims priority of U.S. provisional applications, Ser. No. 60/406,166, filed Aug. 27, 2002; Ser. No. 60/405,392, filed Aug. 23, 2002; and Ser. No. 60/404,906, filed Aug. 21, 2002, and U.S. patent application Ser. No. 10/645,762 is a continuation-in-part of U.S. patent application Ser. No. 10/456,599, filed Jun. 6, 2003, now U.S. Pat. No. 7,004,593, and U.S. patent application Ser. No. 10/645,762 is a continuation-in-part of U.S. patent application Ser. No. 10/287,178, filed Nov. 4, 2002, now U.S. Pat. No. 6,678,614, which is a continuation of U.S. patent application Ser. No. 09/799,414, filed Mar. 5, 2001, now U.S. Pat. No. 6,477,464, which claims priority of U.S. provisional application Ser. No. 60/187,960, filed Mar. 9, 2000, all of which are hereby incorporated herein by reference in their entireties, and U.S. patent application Ser. No. 11/624,381 is a continuation-in-part of U.S. patent application Ser. No. 10/755,915, filed Jan. 13, 2004, now U.S. Pat. No. 7,446,650, which is a continuation of U.S. patent application Ser. No. 09/793,002, filed Feb. 26, 2001, now U.S. Pat. No. 6,690,268, which claims benefit of U.S. provisional applications, Ser. No. 60/263,680, filed Jan. 23, 2001; Ser. No. 60/243,986, filed Oct. 27, 2000; Ser. No. 60/238,483, filed Oct. 6, 2000; Ser. No. 60/237,077, filed Sep. 30, 2000; Ser. No. 60/234,412, filed Sep. 21, 2000; Ser. No. 60/218,336, filed Jul. 14, 2000; and Ser. No. 60/186,520, filed Mar. 2, 2000, and U.S. patent application Ser. No. 11/624,381 is a continuation-in-part of U.S. patent application Ser. No. 10/054,633, filed Jan. 22, 2002, now U.S. Pat. No. 7,195,381, which claims priority from and incorporates by reference in their entireties U.S. provisional applications, Ser. No. 60/346,733, filed Jan. 7, 2002; Ser. No. 60/263,680, filed Jan. 23, 2001; Ser. No. 60/271,466, filed Feb. 26, 2001; and Ser. No. 60/315,384, filed Aug. 28, 2001, and which is a continuation-in-part of U.S. patent application Ser. No. 09/793,002, filed Feb. 26, 2001, now U.S. Pat. No. 6,690,268.

US Referenced Citations (1622)
Number Name Date Kind
1096452 Perrin May 1914 A
1563258 Cunningham Nov 1925 A
2069368 Horinstein Feb 1937 A
2166303 Hodny et al. Jul 1939 A
2263382 Gotzinger Nov 1941 A
2414223 DeVirgilis Jan 1947 A
2457348 Chambers Dec 1948 A
2561582 Marbel Jul 1951 A
2580014 Gazda Dec 1951 A
3004473 Arthur et al. Oct 1961 A
3075430 Woodward et al. Jan 1963 A
3141393 Platt Jul 1964 A
3152216 Woodward Oct 1964 A
3162008 Berger et al. Dec 1964 A
3185020 Thelen May 1965 A
3266016 Maruyama et al. Aug 1966 A
3280701 Donnelly et al. Oct 1966 A
3432225 Rock Mar 1969 A
3451741 Manos Jun 1969 A
3453038 Kissa et al. Jul 1969 A
3467465 Van Noord Sep 1969 A
3473867 Byrnes Oct 1969 A
3480781 Mandalakas Nov 1969 A
3499112 Heilmeier et al. Mar 1970 A
3499702 Goldmacher et al. Mar 1970 A
3521941 Deb et al. Jul 1970 A
3543018 Barcus et al. Nov 1970 A
3557265 Chisholm et al. Jan 1971 A
3565985 Schrenk et al. Feb 1971 A
3612654 Klein Oct 1971 A
3614210 Caplan Oct 1971 A
3628851 Robertson Dec 1971 A
3676668 Collins et al. Jul 1972 A
3680951 Jordan et al. Aug 1972 A
3689695 Rosenfield et al. Sep 1972 A
3711176 Alfrey, Jr. et al. Jan 1973 A
3712710 Castellion et al. Jan 1973 A
3748017 Yamamura et al. Jul 1973 A
3781090 Sumita Dec 1973 A
3806229 Schoot et al. Apr 1974 A
3807832 Castellion Apr 1974 A
3807833 Graham et al. Apr 1974 A
3821590 Kosman et al. Jun 1974 A
3837129 Losell Sep 1974 A
3860847 Carley Jan 1975 A
3862798 Hopkins Jan 1975 A
3870404 Wilson et al. Mar 1975 A
3876287 Sprokel Apr 1975 A
3932024 Yaguchi et al. Jan 1976 A
3940822 Emerick et al. Mar 1976 A
3956017 Shigemasa May 1976 A
3978190 Kurz, Jr. et al. Aug 1976 A
3985424 Steinacher Oct 1976 A
4006546 Anderson et al. Feb 1977 A
4035681 Savage Jul 1977 A
4040727 Ketchpel Aug 1977 A
4052712 Ohama et al. Oct 1977 A
4075468 Marcus Feb 1978 A
4088400 Assouline et al. May 1978 A
4093364 Miller Jun 1978 A
4097131 Nishiyama Jun 1978 A
4109235 Bouthors Aug 1978 A
4139234 Morgan Feb 1979 A
4159866 Wunsch et al. Jul 1979 A
4161653 Bedini et al. Jul 1979 A
4171875 Taylor et al. Oct 1979 A
4174152 Gilia et al. Nov 1979 A
4200361 Malvano et al. Apr 1980 A
4202607 Washizuka et al. May 1980 A
4211955 Ray Jul 1980 A
4214266 Myers Jul 1980 A
4219760 Ferro Aug 1980 A
4221955 Joslyn Sep 1980 A
4228490 Thillays Oct 1980 A
4247870 Gabel et al. Jan 1981 A
4257703 Goodrich Mar 1981 A
4274078 Isobe et al. Jun 1981 A
4277804 Robison Jul 1981 A
4281899 Oskam Aug 1981 A
4288814 Talley et al. Sep 1981 A
4297401 Chern et al. Oct 1981 A
RE30835 Giglia Dec 1981 E
4306768 Egging Dec 1981 A
4310851 Pierrat Jan 1982 A
4331382 Graff May 1982 A
4338000 Kamimori et al. Jul 1982 A
4377613 Gordon Mar 1983 A
4390895 Sato et al. Jun 1983 A
4398805 Cole Aug 1983 A
4419386 Gordon Dec 1983 A
4420238 Felix Dec 1983 A
4425717 Marcus Jan 1984 A
4435042 Wood et al. Mar 1984 A
4435048 Kamimori et al. Mar 1984 A
4436371 Wood et al. Mar 1984 A
4438348 Casper et al. Mar 1984 A
4443057 Bauer et al. Apr 1984 A
4446171 Thomas May 1984 A
4465339 Baucke et al. Aug 1984 A
4473695 Wrighton et al. Sep 1984 A
4490227 Bitter Dec 1984 A
4491390 Tong-Shen Jan 1985 A
4499451 Suzuki et al. Feb 1985 A
4521079 Leenhouts et al. Jun 1985 A
4524941 Wood et al. Jun 1985 A
4538063 Bulat Aug 1985 A
4546551 Franks Oct 1985 A
4555694 Yanagishima et al. Nov 1985 A
4561625 Weaver Dec 1985 A
4572619 Reininger et al. Feb 1986 A
4580196 Task Apr 1986 A
4580875 Bechtel et al. Apr 1986 A
4581827 Higashi Apr 1986 A
4588267 Pastore May 1986 A
4603946 Kato et al. Aug 1986 A
4623222 Itoh et al. Nov 1986 A
4625210 Sagl Nov 1986 A
4626850 Chey Dec 1986 A
4630040 Haertling Dec 1986 A
4630109 Barton Dec 1986 A
4630904 Pastore Dec 1986 A
4634835 Suzuki Jan 1987 A
4635033 Inukai et al. Jan 1987 A
4636782 Nakamura et al. Jan 1987 A
4638287 Umebayashi et al. Jan 1987 A
4646210 Skogler et al. Feb 1987 A
4652090 Uchikawa et al. Mar 1987 A
4655549 Suzuki et al. Apr 1987 A
4664479 Hiroshi May 1987 A
4665311 Cole May 1987 A
4665430 Hiroyasu May 1987 A
4669827 Fukada et al. Jun 1987 A
4671615 Fukada et al. Jun 1987 A
4671619 Kamimori et al. Jun 1987 A
4678281 Bauer Jul 1987 A
4679906 Brandenburg Jul 1987 A
4682083 Alley Jul 1987 A
4692798 Seko et al. Sep 1987 A
4693788 Berg et al. Sep 1987 A
4694295 Miller et al. Sep 1987 A
4697883 Suzuki et al. Oct 1987 A
4701022 Jacob Oct 1987 A
4702566 Tukude et al. Oct 1987 A
4704740 McKee et al. Nov 1987 A
4711544 Iino et al. Dec 1987 A
4712879 Lynam et al. Dec 1987 A
4713685 Nishimura et al. Dec 1987 A
RE32576 Pastore Jan 1988 E
4718756 Lancaster Jan 1988 A
4721364 Itoh et al. Jan 1988 A
4729068 Ohe Mar 1988 A
4729076 Masami et al. Mar 1988 A
4731669 Hayashi et al. Mar 1988 A
4731769 Schaefer et al. Mar 1988 A
4733335 Serizawa et al. Mar 1988 A
4733336 Skogler et al. Mar 1988 A
4740838 Mase et al. Apr 1988 A
4758040 Kingsley et al. Jul 1988 A
4761061 Nishiyama et al. Aug 1988 A
4773740 Kawakami et al. Sep 1988 A
4780752 Angerstein et al. Oct 1988 A
4781436 Armbruster Nov 1988 A
4789774 Koch et al. Dec 1988 A
4789904 Peterson Dec 1988 A
4793690 Gahan et al. Dec 1988 A
4793695 Wada et al. Dec 1988 A
4794261 Rosen Dec 1988 A
D299491 Masuda Jan 1989 S
4799768 Gahan Jan 1989 A
4803599 Trine et al. Feb 1989 A
4807096 Skogler et al. Feb 1989 A
4820933 Hong et al. Apr 1989 A
4825232 Howdle Apr 1989 A
4826289 Vandenbrink et al. May 1989 A
4827086 Rockwell May 1989 A
4833534 Paff et al. May 1989 A
4837551 Iino Jun 1989 A
4842378 Flasck et al. Jun 1989 A
4845402 Smith Jul 1989 A
4847772 Michalopoulos et al. Jul 1989 A
4855161 Moser et al. Aug 1989 A
4855550 Schultz, Jr. Aug 1989 A
4859813 Rockwell Aug 1989 A
4859867 Larson et al. Aug 1989 A
4860171 Kojima Aug 1989 A
4862594 Schierbeek et al. Sep 1989 A
4871917 O'Farrell et al. Oct 1989 A
4872051 Dye Oct 1989 A
4882466 Friel Nov 1989 A
4882565 Gallmeyer Nov 1989 A
4883349 Mittelhäuser Nov 1989 A
4884135 Schiffman Nov 1989 A
4886960 Molyneux et al. Dec 1989 A
4889412 Clerc et al. Dec 1989 A
4891828 Kawazoe Jan 1990 A
4892345 Rachael, III Jan 1990 A
4902103 Miyake et al. Feb 1990 A
4902108 Byker Feb 1990 A
4906085 Sugihara et al. Mar 1990 A
4909606 Wada et al. Mar 1990 A
4910591 Petrossian et al. Mar 1990 A
4916374 Schierbeek et al. Apr 1990 A
4917477 Bechtel et al. Apr 1990 A
4926170 Beggs et al. May 1990 A
4930742 Schofield et al. Jun 1990 A
4933814 Sanai Jun 1990 A
4935665 Murata Jun 1990 A
4936533 Adams et al. Jun 1990 A
4937796 Tendler Jun 1990 A
4937945 Schofield et al. Jul 1990 A
4943796 Lee Jul 1990 A
4948242 Desmond et al. Aug 1990 A
4953305 Van Lente et al. Sep 1990 A
4956591 Schierbeek et al. Sep 1990 A
4957349 Clerc et al. Sep 1990 A
4959247 Moser et al. Sep 1990 A
4959865 Stettiner et al. Sep 1990 A
4966441 Conner Oct 1990 A
4970653 Kenue Nov 1990 A
4973844 O'Farrell et al. Nov 1990 A
4974122 Shaw Nov 1990 A
4978196 Suzuki et al. Dec 1990 A
4983951 Igarashi et al. Jan 1991 A
4985809 Matsui et al. Jan 1991 A
4987357 Masaki Jan 1991 A
4989956 Wu et al. Feb 1991 A
4996083 Moser et al. Feb 1991 A
5001386 Sullivan et al. Mar 1991 A
5001558 Burley et al. Mar 1991 A
5005213 Hanson et al. Apr 1991 A
5006971 Jenkins Apr 1991 A
5014167 Roberts May 1991 A
5016988 Iimura May 1991 A
5016996 Ueno May 1991 A
5017903 Krippelz, Sr. May 1991 A
5018839 Yamamoto et al. May 1991 A
5027200 Petrossian et al. Jun 1991 A
5037182 Groves et al. Aug 1991 A
5038255 Nishihashi et al. Aug 1991 A
5052163 Czekala Oct 1991 A
5056899 Warszawski Oct 1991 A
5057974 Mizobe Oct 1991 A
5058851 Lawlor et al. Oct 1991 A
5059015 Tran Oct 1991 A
5066108 McDonald Nov 1991 A
5066112 Lynam et al. Nov 1991 A
5069535 Baucke et al. Dec 1991 A
5070323 Iino et al. Dec 1991 A
5073012 Lynam Dec 1991 A
5076673 Lynam et al. Dec 1991 A
5076674 Lynam Dec 1991 A
5078480 Warszawski Jan 1992 A
5096287 Kakinami et al. Mar 1992 A
5100095 Haan et al. Mar 1992 A
5101139 Lechter Mar 1992 A
5105127 Lavaud et al. Apr 1992 A
5115346 Lynam May 1992 A
5119220 Narita et al. Jun 1992 A
5121200 Choi Jun 1992 A
5122619 Dlubak Jun 1992 A
5123077 Endo et al. Jun 1992 A
5124845 Shimojo Jun 1992 A
5124890 Choi et al. Jun 1992 A
5128799 Byker Jul 1992 A
5130898 Akahane Jul 1992 A
5131154 Schierbeek et al. Jul 1992 A
5134507 Ishii Jul 1992 A
5134549 Yokoyama Jul 1992 A
5135298 Feltman Aug 1992 A
5136483 Schöniger et al. Aug 1992 A
5140455 Varaprasad et al. Aug 1992 A
5140465 Yasui et al. Aug 1992 A
5142407 Varaprasad et al. Aug 1992 A
5145609 Varaprasad et al. Sep 1992 A
5148306 Yamada et al. Sep 1992 A
5150232 Gunkima et al. Sep 1992 A
5151816 Varaprasad et al. Sep 1992 A
5151824 O'Farrell Sep 1992 A
5154617 Suman et al. Oct 1992 A
5158638 Osanami et al. Oct 1992 A
5160200 Cheselske Nov 1992 A
5160201 Wrobel Nov 1992 A
5166815 Elderfield Nov 1992 A
5168378 Black et al. Dec 1992 A
5173881 Sindle Dec 1992 A
5177031 Buchmann et al. Jan 1993 A
5178448 Adams et al. Jan 1993 A
5179471 Caskey et al. Jan 1993 A
5183099 Bechu Feb 1993 A
5184956 Langlais et al. Feb 1993 A
5189537 O'Farrell Feb 1993 A
5193029 Schofield et al. Mar 1993 A
5197562 Kakinami et al. Mar 1993 A
5202950 Arego et al. Apr 1993 A
5207492 Roberts May 1993 A
5210967 Brown May 1993 A
5212819 Wada May 1993 A
5214408 Asayama May 1993 A
5217794 Schrenk Jun 1993 A
5223814 Suman Jun 1993 A
5223844 Mansell et al. Jun 1993 A
5229975 Truesdell et al. Jul 1993 A
5230400 Kakinami et al. Jul 1993 A
5233461 Dornan et al. Aug 1993 A
5235316 Qualizza Aug 1993 A
5239405 Varaprasad et al. Aug 1993 A
5239406 Lynam Aug 1993 A
5243417 Pollard Sep 1993 A
5245422 Borcherts et al. Sep 1993 A
5252354 Cronin et al. Oct 1993 A
5253109 O'Farrell et al. Oct 1993 A
5255442 Schierbeek et al. Oct 1993 A
5260626 Takase et al. Nov 1993 A
5277986 Cronin et al. Jan 1994 A
5280555 Ainsburg Jan 1994 A
5285060 Larson et al. Feb 1994 A
5289321 Secor Feb 1994 A
5296924 de Saint Blancard et al. Mar 1994 A
5303075 Wada et al. Apr 1994 A
5303205 Gauthier et al. Apr 1994 A
5304980 Maekawa Apr 1994 A
5305012 Faris Apr 1994 A
5307136 Saneyoshi Apr 1994 A
5313335 Gray et al. May 1994 A
5325096 Pakett Jun 1994 A
5325386 Jewell et al. Jun 1994 A
5327288 Wellington et al. Jul 1994 A
5330149 Haan et al. Jul 1994 A
5331312 Kudoh Jul 1994 A
5331358 Schurle et al. Jul 1994 A
5339075 Abst et al. Aug 1994 A
5339529 Lindberg Aug 1994 A
5341437 Nakayama Aug 1994 A
D351370 Lawlor et al. Oct 1994 S
5354965 Lee Oct 1994 A
5355118 Fukuhara Oct 1994 A
5355245 Lynam Oct 1994 A
5355284 Roberts Oct 1994 A
5361190 Roberts et al. Nov 1994 A
5363294 Yamamoto et al. Nov 1994 A
5371659 Pastrick et al. Dec 1994 A
5373482 Gauthier Dec 1994 A
5379146 Defendini Jan 1995 A
5386285 Asayama Jan 1995 A
5386306 Gunjima et al. Jan 1995 A
5400158 Ohnishi et al. Mar 1995 A
5402103 Tashiro Mar 1995 A
5406395 Wilson et al. Apr 1995 A
5406414 O'Farrell et al. Apr 1995 A
5408353 Nichols et al. Apr 1995 A
5408357 Beukema Apr 1995 A
5410346 Saneyoshi et al. Apr 1995 A
5414439 Groves et al. May 1995 A
5414461 Kishi et al. May 1995 A
5416313 Larson et al. May 1995 A
5416478 Morinaga May 1995 A
5418610 Fischer May 1995 A
5422756 Weber Jun 1995 A
5424726 Beymer Jun 1995 A
5424865 Lynam Jun 1995 A
5424952 Asayama Jun 1995 A
5426524 Wada et al. Jun 1995 A
5426723 Horsley Jun 1995 A
5430431 Nelson Jul 1995 A
5432496 Lin Jul 1995 A
5432626 Sasuga et al. Jul 1995 A
5436741 Crandall Jul 1995 A
5437931 Tsai et al. Aug 1995 A
5439305 Santo Aug 1995 A
5444478 Lelong et al. Aug 1995 A
5446576 Lynam et al. Aug 1995 A
5455716 Suman et al. Oct 1995 A
5461361 Moore Oct 1995 A
D363920 Roberts et al. Nov 1995 S
5469187 Yaniv Nov 1995 A
5469298 Suman et al. Nov 1995 A
5475366 Van Lente et al. Dec 1995 A
5475494 Nishida et al. Dec 1995 A
5481409 Roberts Jan 1996 A
5483453 Uemura et al. Jan 1996 A
5485161 Vaughn Jan 1996 A
5485378 Franke et al. Jan 1996 A
5487522 Hook Jan 1996 A
5488496 Pine Jan 1996 A
5497305 Pastrick et al. Mar 1996 A
5497306 Pastrick Mar 1996 A
5500760 Varaprasad et al. Mar 1996 A
5506701 Ichikawa Apr 1996 A
5509606 Breithaupt et al. Apr 1996 A
5510983 Iino Apr 1996 A
5515448 Nishitani May 1996 A
5517853 Chamussy May 1996 A
5519621 Wortham May 1996 A
5521744 Mazurek May 1996 A
5521760 DeYoung et al. May 1996 A
5523811 Wada et al. Jun 1996 A
5523877 Lynam Jun 1996 A
5525264 Cronin et al. Jun 1996 A
5525977 Suman Jun 1996 A
5528422 Roberts Jun 1996 A
5528474 Roney et al. Jun 1996 A
5529138 Shaw et al. Jun 1996 A
5530240 Larson et al. Jun 1996 A
5530420 Tsuchiya et al. Jun 1996 A
5530421 Marshall et al. Jun 1996 A
5535056 Caskey et al. Jul 1996 A
5535144 Kise Jul 1996 A
5539397 Asanuma et al. Jul 1996 A
5541590 Nishio Jul 1996 A
5550677 Schofield et al. Aug 1996 A
5555172 Potter Sep 1996 A
5561333 Darius Oct 1996 A
5566224 ul Azam et al. Oct 1996 A
5567360 Varaprasad et al. Oct 1996 A
5568316 Schrenk et al. Oct 1996 A
5570127 Schmidt Oct 1996 A
5572354 Desmond et al. Nov 1996 A
5574426 Shisgal et al. Nov 1996 A
5574443 Hsieh Nov 1996 A
5575552 Faloon et al. Nov 1996 A
5576687 Blank et al. Nov 1996 A
5576854 Schmidt et al. Nov 1996 A
5576975 Sasaki et al. Nov 1996 A
5578404 Kliem Nov 1996 A
5587236 Agrawal et al. Dec 1996 A
5587699 Faloon et al. Dec 1996 A
5593221 Evanicky et al. Jan 1997 A
5594222 Caldwell Jan 1997 A
5594560 Jelley et al. Jan 1997 A
5594615 Spijkerman et al. Jan 1997 A
5602542 Widmann et al. Feb 1997 A
5602670 Keegan Feb 1997 A
5603104 Phelps, III et al. Feb 1997 A
5608550 Epstein et al. Mar 1997 A
5609652 Yamada et al. Mar 1997 A
5610380 Nicolaisen Mar 1997 A
5610756 Lynam et al. Mar 1997 A
5611966 Varaprasad et al. Mar 1997 A
5614885 Van Lente et al. Mar 1997 A
5615023 Yang Mar 1997 A
5615857 Hook Apr 1997 A
5617085 Tsutsumi et al. Apr 1997 A
5619374 Roberts Apr 1997 A
5619375 Roberts Apr 1997 A
5621571 Bantli et al. Apr 1997 A
5626800 Williams et al. May 1997 A
5631089 Center, Jr. et al. May 1997 A
5631638 Kaspar et al. May 1997 A
5631639 Hibino et al. May 1997 A
5632092 Blank et al. May 1997 A
5632551 Roney et al. May 1997 A
5634709 Iwama Jun 1997 A
5640216 Hasegawa et al. Jun 1997 A
5642238 Sala Jun 1997 A
5644851 Blank et al. Jul 1997 A
5646614 Abersfelder et al. Jul 1997 A
5649756 Adams et al. Jul 1997 A
5649758 Dion Jul 1997 A
5650765 Park Jul 1997 A
5650929 Potter et al. Jul 1997 A
5661455 Van Lente et al. Aug 1997 A
5661651 Geschke et al. Aug 1997 A
5661804 Dykema et al. Aug 1997 A
5662375 Adams et al. Sep 1997 A
5666157 Aviv Sep 1997 A
5667289 Akahane et al. Sep 1997 A
5668663 Varaprasad et al. Sep 1997 A
5668675 Fredricks Sep 1997 A
5669698 Veldman et al. Sep 1997 A
5669699 Pastrick et al. Sep 1997 A
5669704 Pastrick Sep 1997 A
5669705 Pastrick et al. Sep 1997 A
5670935 Schofield et al. Sep 1997 A
5671996 Bos et al. Sep 1997 A
5673994 Fant, Jr. et al. Oct 1997 A
5673999 Koenck Oct 1997 A
5677598 De Hair et al. Oct 1997 A
5679283 Tonar et al. Oct 1997 A
5680123 Lee Oct 1997 A
5680245 Lynam Oct 1997 A
5680263 Zimmermann et al. Oct 1997 A
5686975 Lipton Nov 1997 A
5686979 Weber et al. Nov 1997 A
5689241 Clarke, Sr. et al. Nov 1997 A
5689370 Tonar et al. Nov 1997 A
5691848 Van Lente et al. Nov 1997 A
5692819 Mitsutake et al. Dec 1997 A
5696529 Evanicky et al. Dec 1997 A
5696567 Wada et al. Dec 1997 A
5699044 Van Lente et al. Dec 1997 A
5699188 Gilbert et al. Dec 1997 A
5703568 Hegyi Dec 1997 A
5708410 Blank et al. Jan 1998 A
5708415 Van Lente et al. Jan 1998 A
5708857 Ishibashi Jan 1998 A
5715093 Schierbeek et al. Feb 1998 A
5724187 Varaprasad et al. Mar 1998 A
5724316 Brunts Mar 1998 A
5729194 Spears et al. Mar 1998 A
5737226 Olson et al. Apr 1998 A
5741966 Handfield et al. Apr 1998 A
5744227 Bright et al. Apr 1998 A
5745050 Nakagawa Apr 1998 A
5745266 Smith Apr 1998 A
5748172 Song et al. May 1998 A
5748287 Takahashi et al. May 1998 A
5751211 Shirai et al. May 1998 A
5751246 Hertel May 1998 A
5751390 Crawford et al. May 1998 A
5751489 Caskey et al. May 1998 A
5754099 Nishimura et al. May 1998 A
D394833 Muth Jun 1998 S
5760242 Igarashi et al. Jun 1998 A
5760828 Cortes Jun 1998 A
5760931 Saburi et al. Jun 1998 A
5760962 Schofield et al. Jun 1998 A
5761094 Olson et al. Jun 1998 A
5762823 Hikmet Jun 1998 A
5764139 Nojima et al. Jun 1998 A
5765940 Levy et al. Jun 1998 A
5767793 Agravante et al. Jun 1998 A
5768020 Nagao Jun 1998 A
5775762 Vitito Jul 1998 A
5777779 Hashimoto et al. Jul 1998 A
5780160 Allemand et al. Jul 1998 A
5786772 Schofield et al. Jul 1998 A
5788357 Muth et al. Aug 1998 A
5790298 Tonar Aug 1998 A
5790502 Horinouchi et al. Aug 1998 A
5790973 Blaker et al. Aug 1998 A
5793308 Rosinski et al. Aug 1998 A
5793420 Schmidt Aug 1998 A
5796094 Schofield et al. Aug 1998 A
5796176 Kramer et al. Aug 1998 A
5798057 Hikmet Aug 1998 A
5798575 O'Farrell et al. Aug 1998 A
5798688 Schofield Aug 1998 A
5800918 Chartier et al. Sep 1998 A
5802727 Blank et al. Sep 1998 A
5803579 Turnbull et al. Sep 1998 A
5805330 Byker et al. Sep 1998 A
5805367 Kanazawa Sep 1998 A
5806879 Hamada et al. Sep 1998 A
5806965 Deese Sep 1998 A
5808197 Dao Sep 1998 A
5808566 Behr et al. Sep 1998 A
5808589 Fergason Sep 1998 A
5808713 Broer et al. Sep 1998 A
5808777 Lynam et al. Sep 1998 A
5808778 Bauer et al. Sep 1998 A
5812321 Schierbeek et al. Sep 1998 A
5813745 Fant, Jr. et al. Sep 1998 A
5818625 Forgette et al. Oct 1998 A
5820097 Spooner Oct 1998 A
5820245 Desmond et al. Oct 1998 A
5822023 Suman et al. Oct 1998 A
5823654 Pastrick et al. Oct 1998 A
5825527 Forgette et al. Oct 1998 A
5835166 Hall et al. Nov 1998 A
5837994 Stam et al. Nov 1998 A
5844505 Van Ryzin Dec 1998 A
5848373 DeLorme et al. Dec 1998 A
5850176 Kinoshita et al. Dec 1998 A
5850205 Blouin Dec 1998 A
5863116 Pastrick et al. Jan 1999 A
5864419 Lynam Jan 1999 A
5867801 Denny Feb 1999 A
5871275 O'Farrell et al. Feb 1999 A
5871843 Yoneda et al. Feb 1999 A
5877707 Kowalick Mar 1999 A
5877897 Schofield et al. Mar 1999 A
5878353 ul Azam et al. Mar 1999 A
5878370 Olson Mar 1999 A
5879074 Pastrick Mar 1999 A
5883605 Knapp Mar 1999 A
5883684 Millikan et al. Mar 1999 A
5883739 Ashihara et al. Mar 1999 A
5888431 Tonar et al. Mar 1999 A
5894196 McDermott Apr 1999 A
D409540 Muth May 1999 S
5899551 Neijzen et al. May 1999 A
5899956 Chan May 1999 A
5904729 Ruzicka May 1999 A
5910854 Varaprasad et al. Jun 1999 A
5914815 Bos Jun 1999 A
5917664 O'Neill et al. Jun 1999 A
5918180 Dimino Jun 1999 A
5920367 Kajimoto et al. Jul 1999 A
5922176 Caskey Jul 1999 A
5923027 Stam et al. Jul 1999 A
5923457 Byker et al. Jul 1999 A
5924212 Domanski Jul 1999 A
5926087 Busch et al. Jul 1999 A
5927792 Welling et al. Jul 1999 A
5928572 Tonar et al. Jul 1999 A
5929786 Schofield et al. Jul 1999 A
5935702 Macquart et al. Aug 1999 A
5936774 Street Aug 1999 A
5938320 Crandall Aug 1999 A
5938321 Bos et al. Aug 1999 A
5938721 Dussell et al. Aug 1999 A
5940011 Agravante et al. Aug 1999 A
5940120 Frankhouse et al. Aug 1999 A
5940201 Ash et al. Aug 1999 A
5942895 Popovic et al. Aug 1999 A
5947586 Weber Sep 1999 A
5949331 Schofield et al. Sep 1999 A
5949345 Beckert et al. Sep 1999 A
5949506 Jones et al. Sep 1999 A
5956079 Ridgley Sep 1999 A
5956181 Lin Sep 1999 A
5959367 O'Farrell et al. Sep 1999 A
5959555 Furuta Sep 1999 A
5959577 Fan et al. Sep 1999 A
5963247 Banitt Oct 1999 A
5963284 Jones et al. Oct 1999 A
5965247 Jonza et al. Oct 1999 A
5968538 Snyder, Jr. Oct 1999 A
5971552 O'Farrell et al. Oct 1999 A
5973760 Dehmlow Oct 1999 A
5975715 Bauder Nov 1999 A
5984482 Rumsey et al. Nov 1999 A
5986730 Hansen et al. Nov 1999 A
5990469 Bechtel et al. Nov 1999 A
5990625 Meissner et al. Nov 1999 A
5995180 Moriwaki et al. Nov 1999 A
5998617 Srinivasa et al. Dec 1999 A
5998929 Bechtel et al. Dec 1999 A
6000823 Desmond et al. Dec 1999 A
6001486 Varaprasad et al. Dec 1999 A
6002511 Varaprasad et al. Dec 1999 A
6002983 Alland et al. Dec 1999 A
6005724 Todd Dec 1999 A
6007222 Thau Dec 1999 A
6008486 Stam et al. Dec 1999 A
6008871 Okumura Dec 1999 A
6009359 El-Hakim et al. Dec 1999 A
6016035 Eberspächer et al. Jan 2000 A
6016215 Byker Jan 2000 A
6019411 Carter et al. Feb 2000 A
6019475 Lynam et al. Feb 2000 A
6020987 Baumann et al. Feb 2000 A
6021371 Fultz Feb 2000 A
6023229 Bugno et al. Feb 2000 A
6025872 Ozaki et al. Feb 2000 A
6028537 Suman et al. Feb 2000 A
6037689 Bingle et al. Mar 2000 A
6040939 Demiryont et al. Mar 2000 A
6042253 Fant, Jr. et al. Mar 2000 A
6042934 Guiselin et al. Mar 2000 A
6045243 Muth et al. Apr 2000 A
6045643 Byker et al. Apr 2000 A
6046766 Sakata Apr 2000 A
6046837 Yamamoto Apr 2000 A
6049171 Stam et al. Apr 2000 A
D425466 Todd et al. May 2000 S
6060989 Gehlot May 2000 A
6061002 Weber et al. May 2000 A
6062920 Jordan et al. May 2000 A
6064508 Forgette et al. May 2000 A
6065840 Caskey et al. May 2000 A
6066920 Torihara et al. May 2000 A
6067111 Hahn et al. May 2000 A
6067500 Morimoto et al. May 2000 A
6068380 Lynn et al. May 2000 A
D426506 Todd et al. Jun 2000 S
D426507 Todd et al. Jun 2000 S
D427128 Mathieu Jun 2000 S
6072391 Suzukie et al. Jun 2000 A
6074077 Pastrick et al. Jun 2000 A
6074777 Reimers et al. Jun 2000 A
6076948 Bukosky et al. Jun 2000 A
6078355 Zengel Jun 2000 A
6078865 Koyanagi Jun 2000 A
D428372 Todd et al. Jul 2000 S
D428373 Todd et al. Jul 2000 S
6082881 Hicks Jul 2000 A
6084700 Knapp et al. Jul 2000 A
6086131 Bingle et al. Jul 2000 A
6086229 Pastrick Jul 2000 A
6087012 Varaprasad et al. Jul 2000 A
6087953 DeLine et al. Jul 2000 A
6091343 Dykema et al. Jul 2000 A
6093976 Kramer et al. Jul 2000 A
6094618 Harada Jul 2000 A
D428842 Todd et al. Aug 2000 S
D429202 Todd et al. Aug 2000 S
D430088 Todd et al. Aug 2000 S
6097023 Schofield et al. Aug 2000 A
6097316 Liaw et al. Aug 2000 A
6099131 Fletcher et al. Aug 2000 A
6099155 Pastrick et al. Aug 2000 A
6100811 Hsu et al. Aug 2000 A
6102546 Carter Aug 2000 A
6102559 Nold et al. Aug 2000 A
6104552 Thau et al. Aug 2000 A
6106121 Buckley et al. Aug 2000 A
6111498 Jobes et al. Aug 2000 A
6111683 Cammenga et al. Aug 2000 A
6111684 Forgette et al. Aug 2000 A
6111685 Tench et al. Aug 2000 A
6111696 Allen et al. Aug 2000 A
6115086 Rosen Sep 2000 A
6115651 Cruz Sep 2000 A
6116743 Hoek Sep 2000 A
6118219 Okigami et al. Sep 2000 A
6122597 Saneyoshi et al. Sep 2000 A
6122921 Brezoczky et al. Sep 2000 A
6124647 Marcus et al. Sep 2000 A
6124886 DeLine et al. Sep 2000 A
6127919 Wylin Oct 2000 A
6127945 Mura-Smith Oct 2000 A
6128576 Nishimoto et al. Oct 2000 A
6130421 Bechtel et al. Oct 2000 A
6130448 Bauer et al. Oct 2000 A
6132072 Turnbull et al. Oct 2000 A
6137620 Guarr et al. Oct 2000 A
6139171 Waldmann Oct 2000 A
6139172 Bos et al. Oct 2000 A
6140933 Bugno et al. Oct 2000 A
6142656 Kurth Nov 2000 A
6146003 Thau Nov 2000 A
6147934 Arikawa et al. Nov 2000 A
6148261 Obradovich et al. Nov 2000 A
6149287 Pastrick et al. Nov 2000 A
6150014 Chu et al. Nov 2000 A
6151065 Steed et al. Nov 2000 A
6151539 Bergholz et al. Nov 2000 A
6152551 Annas Nov 2000 A
6152590 Fürst et al. Nov 2000 A
6154149 Tyckowski et al. Nov 2000 A
6154306 Varaprasad et al. Nov 2000 A
6157294 Urai et al. Dec 2000 A
6157418 Rosen Dec 2000 A
6157424 Eichenlaub Dec 2000 A
6157480 Anderson et al. Dec 2000 A
6158655 DeVries, Jr. et al. Dec 2000 A
6161071 Shuman et al. Dec 2000 A
6161865 Rose et al. Dec 2000 A
6164564 Franco et al. Dec 2000 A
6166625 Teowee et al. Dec 2000 A
6166629 Hamma et al. Dec 2000 A
6166834 Taketomi et al. Dec 2000 A
6166847 Tench et al. Dec 2000 A
6166848 Cammenga et al. Dec 2000 A
6167255 Kennedy, III et al. Dec 2000 A
6167755 Damson et al. Jan 2001 B1
6169955 Fultz Jan 2001 B1
6170956 Rumsey et al. Jan 2001 B1
6172600 Kakinami et al. Jan 2001 B1
6172601 Wada et al. Jan 2001 B1
6172613 DeLine et al. Jan 2001 B1
6173501 Blank et al. Jan 2001 B1
6175164 O'Farrell et al. Jan 2001 B1
6175300 Kendrick Jan 2001 B1
6176602 Pastrick et al. Jan 2001 B1
6178034 Allemand et al. Jan 2001 B1
6178377 Ishihara et al. Jan 2001 B1
6181387 Rosen Jan 2001 B1
6182006 Meek Jan 2001 B1
6183119 Desmond et al. Feb 2001 B1
6184679 Popovic et al. Feb 2001 B1
6184781 Ramakesavan Feb 2001 B1
6185492 Kagawa et al. Feb 2001 B1
6185501 Smith et al. Feb 2001 B1
6188505 Lomprey et al. Feb 2001 B1
6191704 Takenaga et al. Feb 2001 B1
6193379 Tonar et al. Feb 2001 B1
6193912 Thieste et al. Feb 2001 B1
6195194 Roberts et al. Feb 2001 B1
6196688 Caskey et al. Mar 2001 B1
6198409 Schofield et al. Mar 2001 B1
6199014 Walker et al. Mar 2001 B1
6199810 Wu et al. Mar 2001 B1
6200010 Anders Mar 2001 B1
6201642 Bos Mar 2001 B1
6206553 Boddy et al. Mar 2001 B1
6207083 Varaprasad et al. Mar 2001 B1
6210008 Hoekstra et al. Apr 2001 B1
6210012 Broer Apr 2001 B1
6212470 Seymour et al. Apr 2001 B1
6217181 Lynam et al. Apr 2001 B1
6218934 Regan Apr 2001 B1
6222447 Schofield et al. Apr 2001 B1
6222460 DeLine et al. Apr 2001 B1
6222689 Higuchi et al. Apr 2001 B1
6226061 Tagusa May 2001 B1
6227689 Miller May 2001 B1
6232937 Jacobsen et al. May 2001 B1
6236514 Sato May 2001 B1
6239851 Hatazawa et al. May 2001 B1
6239898 Byker et al. May 2001 B1
6239899 DeVries et al. May 2001 B1
6243003 DeLine et al. Jun 2001 B1
6244716 Steenwyk et al. Jun 2001 B1
6245262 Varaprasad et al. Jun 2001 B1
6246933 Bague Jun 2001 B1
6247820 Van Order Jun 2001 B1
6249214 Kashiwazaki Jun 2001 B1
6249310 Lefkowitz Jun 2001 B1
6249369 Theiste et al. Jun 2001 B1
6250148 Lynam Jun 2001 B1
6250766 Strumolo et al. Jun 2001 B1
6250783 Stidham et al. Jun 2001 B1
6255639 Stam et al. Jul 2001 B1
6257746 Todd et al. Jul 2001 B1
6259412 Duroux Jul 2001 B1
6259423 Tokito et al. Jul 2001 B1
6259475 Ramachandran et al. Jul 2001 B1
6260608 Kim Jul 2001 B1
6262842 Ouderkirk et al. Jul 2001 B1
6264353 Caraher et al. Jul 2001 B1
6265968 Betzitza et al. Jul 2001 B1
6268803 Gunderson et al. Jul 2001 B1
6268837 Kobayashi et al. Jul 2001 B1
6269308 Kodaka et al. Jul 2001 B1
6271901 Ide et al. Aug 2001 B1
6274221 Smith et al. Aug 2001 B2
6275231 Obradovich Aug 2001 B1
6276821 Pastrick et al. Aug 2001 B1
6276822 Bedrosian et al. Aug 2001 B1
6277471 Tang Aug 2001 B1
6278271 Schott Aug 2001 B1
6278377 DeLine et al. Aug 2001 B1
6278941 Yokoyama Aug 2001 B1
6280068 Mertens et al. Aug 2001 B1
6280069 Pastrick et al. Aug 2001 B1
6281804 Haller et al. Aug 2001 B1
6286965 Caskey et al. Sep 2001 B1
6286984 Berg Sep 2001 B1
6289332 Menig et al. Sep 2001 B2
6290378 Buchalla et al. Sep 2001 B1
6291905 Drummond et al. Sep 2001 B1
6291906 Marcus et al. Sep 2001 B1
6294989 Schofield et al. Sep 2001 B1
6296379 Pastrick Oct 2001 B1
6297781 Turnbull et al. Oct 2001 B1
6299333 Pastrick et al. Oct 2001 B1
6300879 Regan et al. Oct 2001 B1
6301039 Tench Oct 2001 B1
6304173 Pala et al. Oct 2001 B2
6305807 Schierbeek Oct 2001 B1
6310611 Caldwell Oct 2001 B1
6310714 Lomprey et al. Oct 2001 B1
6310738 Chu Oct 2001 B1
6313454 Bos et al. Nov 2001 B1
6314295 Kawamoto Nov 2001 B1
6315440 Satoh Nov 2001 B1
6317057 Lee Nov 2001 B1
6317180 Kuroiwa et al. Nov 2001 B1
6317248 Agrawal et al. Nov 2001 B1
6318870 Spooner et al. Nov 2001 B1
6320176 Schofield et al. Nov 2001 B1
6320282 Caldwell Nov 2001 B1
6320612 Young Nov 2001 B1
6324295 Valery et al. Nov 2001 B1
6326613 Heslin et al. Dec 2001 B1
6326900 DeLine et al. Dec 2001 B2
6327925 Gombert et al. Dec 2001 B1
6329925 Skiver et al. Dec 2001 B1
6330511 Ogura et al. Dec 2001 B2
6331066 Desmond et al. Dec 2001 B1
6333759 Mazzilli Dec 2001 B1
6335680 Matsuoka Jan 2002 B1
6336737 Thau Jan 2002 B1
6340850 O'Farrell et al. Jan 2002 B2
6341523 Lynam Jan 2002 B2
6344805 Yasui et al. Feb 2002 B1
6346698 Turnbull Feb 2002 B1
6347880 Fürst et al. Feb 2002 B1
6348858 Weis et al. Feb 2002 B2
6351708 Takagi et al. Feb 2002 B1
6353392 Schofield et al. Mar 2002 B1
6356206 Takenaga et al. Mar 2002 B1
6356376 Tonar et al. Mar 2002 B1
6356389 Nilsen et al. Mar 2002 B1
6357883 Strumolo et al. Mar 2002 B1
6359392 He Mar 2002 B1
6362121 Chopin et al. Mar 2002 B1
6362548 Bingle et al. Mar 2002 B1
6363326 Scully Mar 2002 B1
6366013 Leenders et al. Apr 2002 B1
6366213 DeLine et al. Apr 2002 B2
6369701 Yoshida et al. Apr 2002 B1
6370329 Teuchert Apr 2002 B1
6371636 Wesson Apr 2002 B1
6379013 Bechtel et al. Apr 2002 B1
6379788 Choi et al. Apr 2002 B2
6382805 Miyabukuro May 2002 B1
6385139 Arikawa et al. May 2002 B1
6386742 DeLine et al. May 2002 B1
6390529 Bingle et al. May 2002 B1
6390626 Knox May 2002 B2
6390635 Whitehead et al. May 2002 B2
6396397 Bos et al. May 2002 B1
6396408 Drummond et al. May 2002 B2
6396637 Roest et al. May 2002 B2
6407468 LeVesque et al. Jun 2002 B1
6407847 Poll et al. Jun 2002 B1
6408247 Ichikawa et al. Jun 2002 B1
6411204 Bloomfield et al. Jun 2002 B1
6412959 Tseng Jul 2002 B1
6412973 Bos et al. Jul 2002 B1
6414910 Kaneko et al. Jul 2002 B1
6415230 Maruko et al. Jul 2002 B1
6416208 Pastrick et al. Jul 2002 B2
6417786 Learman et al. Jul 2002 B2
6418376 Olson Jul 2002 B1
6419300 Pavao et al. Jul 2002 B1
6420036 Varaprasad et al. Jul 2002 B1
6420800 LeVesque et al. Jul 2002 B1
6420975 DeLine et al. Jul 2002 B1
6421081 Markus Jul 2002 B1
6424272 Gutta et al. Jul 2002 B1
6424273 Gutta et al. Jul 2002 B1
6424786 Beeson et al. Jul 2002 B1
6424892 Matsuoka Jul 2002 B1
6426492 Bos et al. Jul 2002 B1
6426568 Turnbull et al. Jul 2002 B2
6427349 Blank et al. Aug 2002 B1
6428172 Hutzel et al. Aug 2002 B1
6433676 DeLine et al. Aug 2002 B2
6433680 Ho Aug 2002 B1
6433914 Lomprey et al. Aug 2002 B1
6437688 Kobayashi Aug 2002 B1
6438491 Farmer Aug 2002 B1
6439755 Fant, Jr. et al. Aug 2002 B1
6441872 Ho Aug 2002 B1
6441943 Roberts et al. Aug 2002 B1
6441963 Murakami et al. Aug 2002 B2
6441964 Chu et al. Aug 2002 B1
6445287 Schofield et al. Sep 2002 B1
6447128 Lang et al. Sep 2002 B1
6449082 Agrawal et al. Sep 2002 B1
6452533 Yamabuchi et al. Sep 2002 B1
6452572 Fan et al. Sep 2002 B1
6456438 Lee et al. Sep 2002 B1
6462795 Clarke Oct 2002 B1
6463369 Sadano et al. Oct 2002 B2
6466701 Ejiri et al. Oct 2002 B1
6471362 Carter et al. Oct 2002 B1
6472977 Pöchmüller Oct 2002 B1
6472979 Schofield et al. Oct 2002 B2
6473001 Blum Oct 2002 B1
6474853 Pastrick et al. Nov 2002 B2
6476731 Miki et al. Nov 2002 B1
6476855 Yamamoto Nov 2002 B1
6477460 Kepler Nov 2002 B2
6477464 McCarthy et al. Nov 2002 B2
6483429 Yasui et al. Nov 2002 B1
6483438 DeLine et al. Nov 2002 B2
6483613 Woodgate et al. Nov 2002 B1
6487500 Lemelson et al. Nov 2002 B2
6494602 Pastrick et al. Dec 2002 B2
6498620 Schofield et al. Dec 2002 B2
6501387 Skiver et al. Dec 2002 B2
6512203 Jones et al. Jan 2003 B2
6512624 Tonar et al. Jan 2003 B2
6513252 Schierbeek et al. Feb 2003 B1
6515378 Drummond et al. Feb 2003 B2
6515581 Ho Feb 2003 B1
6515582 Teowee Feb 2003 B1
6515597 Wada et al. Feb 2003 B1
6516664 Lynam Feb 2003 B2
6518691 Baba Feb 2003 B1
6519209 Arikawa et al. Feb 2003 B1
6520667 Mousseau Feb 2003 B1
6522451 Lynam Feb 2003 B1
6522969 Kannonji Feb 2003 B2
6525707 Kaneko et al. Feb 2003 B1
6534884 Marcus et al. Mar 2003 B2
6538709 Kurihara et al. Mar 2003 B1
6539306 Turnbull Mar 2003 B2
6542085 Yang Apr 2003 B1
6542182 Chutorash Apr 2003 B1
6543163 Ginsberg Apr 2003 B1
6545598 de Villeroche Apr 2003 B1
6549253 Robbie et al. Apr 2003 B1
6549335 Trapani et al. Apr 2003 B1
6550949 Bauer et al. Apr 2003 B1
6552326 Turnbull Apr 2003 B2
6552653 Nakaho et al. Apr 2003 B2
6553130 Lemelson et al. Apr 2003 B1
6553308 Uhlmann et al. Apr 2003 B1
6559761 Miller et al. May 2003 B1
6559902 Kusuda et al. May 2003 B1
6560004 Theiste et al. May 2003 B2
6560027 Meine May 2003 B2
6566821 Nakatsuka et al. May 2003 B2
6567060 Sekiguchi May 2003 B1
6567708 Bechtel et al. May 2003 B1
6568839 Pastrick et al. May 2003 B1
6572233 Northman et al. Jun 2003 B1
6573957 Suzuki Jun 2003 B1
6573963 Ouderkirk et al. Jun 2003 B2
6575582 Tenmyo Jun 2003 B2
6575643 Takahashi Jun 2003 B2
6578989 Osumi et al. Jun 2003 B2
6580373 Ohashi Jun 2003 B1
6580479 Sekiguchi et al. Jun 2003 B1
6580562 Aoki et al. Jun 2003 B2
6581007 Hasegawa et al. Jun 2003 B2
6583730 Lang et al. Jun 2003 B2
6591192 Okamura et al. Jul 2003 B2
6592230 Dupay Jul 2003 B2
6593011 Liu et al. Jul 2003 B2
6593565 Heslin et al. Jul 2003 B2
6593984 Arakawa et al. Jul 2003 B2
6594065 Byker et al. Jul 2003 B2
6594067 Poll et al. Jul 2003 B2
6594090 Kruschwitz et al. Jul 2003 B2
6594583 Ogura et al. Jul 2003 B2
6594614 Studt et al. Jul 2003 B2
6595649 Hoekstra et al. Jul 2003 B2
6597489 Guarr et al. Jul 2003 B1
6606183 Ikai et al. Aug 2003 B2
6611202 Schofield et al. Aug 2003 B2
6611227 Nebiyeloul-Kifle et al. Aug 2003 B1
6611759 Brosche Aug 2003 B2
6612723 Futhey et al. Sep 2003 B2
6614387 Deadman Sep 2003 B1
6614419 May Sep 2003 B1
6614579 Roberts et al. Sep 2003 B2
6615438 Franco et al. Sep 2003 B1
6616313 Fürst et al. Sep 2003 B2
6616764 Krämer et al. Sep 2003 B2
6618672 Sasaki et al. Sep 2003 B2
6621616 Bauer et al. Sep 2003 B1
6624936 Kotchick et al. Sep 2003 B2
6627918 Getz et al. Sep 2003 B2
6630888 Lang et al. Oct 2003 B2
6636190 Hirakata et al. Oct 2003 B2
6636258 Strumolo Oct 2003 B2
6638582 Uchiyama et al. Oct 2003 B1
6639360 Roberts et al. Oct 2003 B2
6642840 Lang et al. Nov 2003 B2
6642851 DeLine et al. Nov 2003 B2
6646697 Sekiguchi et al. Nov 2003 B1
6648477 Hutzel et al. Nov 2003 B2
6650457 Busscher et al. Nov 2003 B2
6657607 Evanicky et al. Dec 2003 B1
6661482 Hara Dec 2003 B2
6661830 Reed et al. Dec 2003 B1
6663262 Boyd et al. Dec 2003 B2
6665592 Kodama Dec 2003 B2
6667726 Damiani et al. Dec 2003 B1
6669109 Ivanov et al. Dec 2003 B2
6669285 Park et al. Dec 2003 B1
6670207 Roberts Dec 2003 B1
6670910 Delcheccolo et al. Dec 2003 B2
6670935 Yeon et al. Dec 2003 B2
6670941 Albu et al. Dec 2003 B2
6671080 Poll et al. Dec 2003 B2
6672731 Schnell et al. Jan 2004 B2
6672734 Lammers Jan 2004 B2
6672744 DeLine et al. Jan 2004 B2
6672745 Bauer et al. Jan 2004 B1
6674370 Rodewald et al. Jan 2004 B2
6675075 Engelsberg et al. Jan 2004 B1
6678083 Anstee Jan 2004 B1
6678614 McCarthy et al. Jan 2004 B2
6679608 Bechtel et al. Jan 2004 B2
6683539 Trajkovic et al. Jan 2004 B2
6683969 Nishigaki et al. Jan 2004 B1
6685348 Pastrick et al. Feb 2004 B2
6690262 Winnett Feb 2004 B1
6690268 Schofield et al. Feb 2004 B2
6690413 Moore Feb 2004 B1
6690438 Sekiguchi Feb 2004 B2
6691464 Nestell et al. Feb 2004 B2
6693517 McCarthy et al. Feb 2004 B2
6693518 Kumata et al. Feb 2004 B2
6693519 Keirstead Feb 2004 B2
6693524 Payne Feb 2004 B1
6700692 Tonar et al. Mar 2004 B2
6704434 Sakoh et al. Mar 2004 B1
6709136 Pastrick et al. Mar 2004 B2
6713783 Mase et al. Mar 2004 B1
6717109 Macher et al. Apr 2004 B1
6717610 Bos et al. Apr 2004 B1
6717712 Lynam et al. Apr 2004 B2
6719215 Drouillard Apr 2004 B2
6724446 Motomura et al. Apr 2004 B2
6726337 Whitehead et al. Apr 2004 B2
6727807 Trajkovic et al. Apr 2004 B2
6727808 Uselmann et al. Apr 2004 B1
6727844 Zimmermann et al. Apr 2004 B1
6731332 Yasui et al. May 2004 B1
6734807 King May 2004 B2
6736526 Matsuba et al. May 2004 B2
6737629 Nixon et al. May 2004 B2
6737630 Turnbull May 2004 B2
6737964 Samman et al. May 2004 B2
6738088 Uskolovsky et al. May 2004 B1
6742904 Bechtel et al. Jun 2004 B2
6744353 Sjönell Jun 2004 B2
6746775 Boire et al. Jun 2004 B1
6747716 Kuroiwa et al. Jun 2004 B2
6748211 Isaac et al. Jun 2004 B1
6749308 Niendorf et al. Jun 2004 B1
6755542 Bechtel et al. Jun 2004 B2
6756912 Skiver et al. Jun 2004 B2
6757039 Ma Jun 2004 B2
6757109 Bos Jun 2004 B2
D493131 Lawlor et al. Jul 2004 S
D493394 Lawlor et al. Jul 2004 S
6759113 Tang Jul 2004 B1
6759945 Richard Jul 2004 B2
6760157 Allen et al. Jul 2004 B1
6765480 Tseng Jul 2004 B2
6773116 De Vaan et al. Aug 2004 B2
6774356 Heslin et al. Aug 2004 B2
6774810 DeLine et al. Aug 2004 B2
6778904 Iwami et al. Aug 2004 B2
6779900 Nolan-Brown Aug 2004 B1
6781738 Kikuchi et al. Aug 2004 B2
6782718 Lingle et al. Aug 2004 B2
6784129 Seto et al. Aug 2004 B2
6797396 Liu et al. Sep 2004 B1
6800871 Matsuda et al. Oct 2004 B2
6801127 Mizusawa et al. Oct 2004 B2
6801244 Takeda et al. Oct 2004 B2
6801283 Koyama et al. Oct 2004 B2
6805474 Walser et al. Oct 2004 B2
6806452 Bos et al. Oct 2004 B2
6806922 Ishitaka Oct 2004 B2
6810323 Bullock et al. Oct 2004 B1
6812463 Okada Nov 2004 B2
6812907 Gennetten et al. Nov 2004 B1
6819231 Berberich et al. Nov 2004 B2
6823261 Sekiguchi Nov 2004 B2
6824281 Schofield et al. Nov 2004 B2
6831268 Bechtel et al. Dec 2004 B2
6832848 Pastrick Dec 2004 B2
6834969 Bade et al. Dec 2004 B2
6836725 Millington et al. Dec 2004 B2
6838980 Gloger et al. Jan 2005 B2
6842189 Park Jan 2005 B2
6842276 Poll et al. Jan 2005 B2
6845805 Köster Jan 2005 B1
6846098 Bourdelais et al. Jan 2005 B2
6847424 Gotoh et al. Jan 2005 B2
6847487 Burgner Jan 2005 B2
6848817 Bos et al. Feb 2005 B2
6849165 Klöppel et al. Feb 2005 B2
6853491 Ruhle et al. Feb 2005 B1
6859148 Miller et al. Feb 2005 B2
6861789 Wei Mar 2005 B2
6864930 Matsushita et al. Mar 2005 B2
6870655 Northman et al. Mar 2005 B1
6870656 Tonar et al. Mar 2005 B2
6871982 Holman et al. Mar 2005 B2
6877888 DeLine et al. Apr 2005 B2
6882287 Schofield Apr 2005 B2
6889064 Baratono et al. May 2005 B2
6891563 Schofield et al. May 2005 B2
6891677 Nilsen et al. May 2005 B2
6898518 Padmanabhan May 2005 B2
6902284 Hutzel et al. Jun 2005 B2
6904348 Drummond et al. Jun 2005 B2
6906620 Nakai et al. Jun 2005 B2
6906632 DeLine et al. Jun 2005 B2
6909486 Wang et al. Jun 2005 B2
6910779 Abel et al. Jun 2005 B2
6912001 Okamoto et al. Jun 2005 B2
6912396 Sziraki et al. Jun 2005 B2
6914521 Rothkop Jul 2005 B2
6916099 Su et al. Jul 2005 B2
6917404 Baek Jul 2005 B2
6918674 Drummond et al. Jul 2005 B2
6922902 Schierbeek et al. Aug 2005 B2
6923080 Dobler et al. Aug 2005 B1
6928180 Stam et al. Aug 2005 B2
6928366 Ockerse et al. Aug 2005 B2
6930737 Weindorf et al. Aug 2005 B2
6933837 Gunderson et al. Aug 2005 B2
6934067 Ash et al. Aug 2005 B2
6940423 Takagi et al. Sep 2005 B2
6946978 Schofield Sep 2005 B2
6947576 Stam et al. Sep 2005 B2
6947577 Stam et al. Sep 2005 B2
6949772 Shimizu et al. Sep 2005 B2
6950035 Tanaka et al. Sep 2005 B2
6951410 Parsons Oct 2005 B2
6951681 Hartley et al. Oct 2005 B2
6952312 Weber et al. Oct 2005 B2
6958495 Nishijima et al. Oct 2005 B2
6958683 Mills et al. Oct 2005 B2
6959994 Fujikawa et al. Nov 2005 B2
6961178 Sugino et al. Nov 2005 B2
6961661 Sekiguchi Nov 2005 B2
6963438 Busscher et al. Nov 2005 B2
6968273 Ockerse et al. Nov 2005 B2
6971181 Ohm et al. Dec 2005 B2
6972888 Poll et al. Dec 2005 B2
6974236 Tenmyo Dec 2005 B2
6975215 Schofield et al. Dec 2005 B2
6977702 Wu Dec 2005 B2
6980092 Turnbull et al. Dec 2005 B2
6985291 Watson et al. Jan 2006 B2
6989736 Berberich et al. Jan 2006 B2
6992573 Blank et al. Jan 2006 B2
6992718 Takahara Jan 2006 B1
6992826 Wang Jan 2006 B2
6995687 Lang et al. Feb 2006 B2
6997571 Tenmyo Feb 2006 B2
7001058 Inditsky Feb 2006 B2
7004592 Varaprasad et al. Feb 2006 B2
7004593 Weller et al. Feb 2006 B2
7005974 McMahon et al. Feb 2006 B2
7006173 Hiyama et al. Feb 2006 B1
7008090 Blank Mar 2006 B2
7009751 Tonar et al. Mar 2006 B2
7012543 DeLine et al. Mar 2006 B2
7012727 Hutzel et al. Mar 2006 B2
7023331 Kodama Apr 2006 B2
7029156 Suehiro et al. Apr 2006 B2
7030738 Ishii Apr 2006 B2
7030775 Sekiguchi Apr 2006 B2
7038577 Pawlicki et al. May 2006 B2
7041965 Heslin et al. May 2006 B2
7042616 Tonar et al. May 2006 B2
7046418 Lin et al. May 2006 B2
7046448 Burgner May 2006 B2
7050908 Schwartz et al. May 2006 B1
7057505 Iwamoto Jun 2006 B2
7057681 Hinata et al. Jun 2006 B2
7063893 Hoffman Jun 2006 B2
7064882 Tonar et al. Jun 2006 B2
7068289 Satoh et al. Jun 2006 B2
7074486 Boire et al. Jul 2006 B2
7081810 Henderson et al. Jul 2006 B2
7085633 Nishira et al. Aug 2006 B2
7092052 Okamoto et al. Aug 2006 B2
7095432 Nakayama et al. Aug 2006 B2
7095567 Troxell et al. Aug 2006 B2
7106213 White Sep 2006 B2
7106392 You Sep 2006 B2
7108409 DeLine et al. Sep 2006 B2
7110021 Nobori et al. Sep 2006 B2
7114554 Bergman et al. Oct 2006 B2
7121028 Shoen et al. Oct 2006 B2
7125131 Olczak Oct 2006 B2
7130727 Liu et al. Oct 2006 B2
7132064 Li et al. Nov 2006 B2
7136091 Ichikawa et al. Nov 2006 B2
7138974 Hirakata et al. Nov 2006 B2
7149613 Stam et al. Dec 2006 B2
7150552 Weidel Dec 2006 B2
7151515 Kim et al. Dec 2006 B2
7151997 Uhlmann et al. Dec 2006 B2
7153588 McMan et al. Dec 2006 B2
7154657 Poll et al. Dec 2006 B2
7158881 McCarthy et al. Jan 2007 B2
7160017 Lee et al. Jan 2007 B2
7161567 Homma et al. Jan 2007 B2
7167796 Taylor et al. Jan 2007 B2
7168830 Pastrick et al. Jan 2007 B2
7175291 Li Feb 2007 B1
7176790 Yamazaki Feb 2007 B2
7184190 McCabe et al. Feb 2007 B2
7185995 Hatanaka et al. Mar 2007 B2
7187498 Bengoechea et al. Mar 2007 B2
7188963 Schofield et al. Mar 2007 B2
7193764 Lin et al. Mar 2007 B2
7195381 Lynam et al. Mar 2007 B2
7199767 Spero Apr 2007 B2
7202987 Varaprasad et al. Apr 2007 B2
7206697 Olney et al. Apr 2007 B2
7209277 Tonar et al. Apr 2007 B2
7215238 Buck et al. May 2007 B2
7215473 Fleming May 2007 B2
7221363 Roberts et al. May 2007 B2
7221365 Lévesque et al. May 2007 B1
7224324 Quist et al. May 2007 B2
7227472 Roe Jun 2007 B1
7230523 Harter, Jr. et al. Jun 2007 B2
7232231 Shih Jun 2007 B2
7232594 Miroshin et al. Jun 2007 B2
7233304 Aratani et al. Jun 2007 B1
7235918 McCullough et al. Jun 2007 B2
7241030 Mok et al. Jul 2007 B2
7241037 Mathieu et al. Jul 2007 B2
7245207 Dayan et al. Jul 2007 B1
7245231 Kiefer et al. Jul 2007 B2
7245336 Hiyama et al. Jul 2007 B2
7248283 Takagi et al. Jul 2007 B2
7248305 Ootsuta et al. Jul 2007 B2
7249860 Kulas et al. Jul 2007 B2
7251079 Capaldo et al. Jul 2007 B2
7253723 Lindahl et al. Aug 2007 B2
7255451 McCabe et al. Aug 2007 B2
7255465 DeLine et al. Aug 2007 B2
7259036 Borland et al. Aug 2007 B2
7262406 Heslin et al. Aug 2007 B2
7262916 Kao et al. Aug 2007 B2
7265342 Heslin et al. Sep 2007 B2
7268841 Kasajima et al. Sep 2007 B2
7269327 Tang Sep 2007 B2
7269328 Tang Sep 2007 B2
7271951 Weber et al. Sep 2007 B2
7274501 McCabe et al. Sep 2007 B2
7281491 Iwamaru Oct 2007 B2
7286280 Whitehead et al. Oct 2007 B2
7287868 Carter et al. Oct 2007 B2
7289037 Uken et al. Oct 2007 B2
7290919 Pan et al. Nov 2007 B2
7292208 Park et al. Nov 2007 B1
7292918 Silvester Nov 2007 B2
7300183 Kiyomoto et al. Nov 2007 B2
7302344 Olney et al. Nov 2007 B2
7304661 Ishikura Dec 2007 B2
7308341 Schofield et al. Dec 2007 B2
7310177 McCabe et al. Dec 2007 B2
7311428 DeLine et al. Dec 2007 B2
7316485 Roose Jan 2008 B2
7317386 Lengning et al. Jan 2008 B2
7318664 Hatanaka et al. Jan 2008 B2
7323819 Hong et al. Jan 2008 B2
7324043 Purden et al. Jan 2008 B2
7324172 Yamazaki et al. Jan 2008 B2
7324174 Hafuka et al. Jan 2008 B2
7324261 Tonar et al. Jan 2008 B2
7327225 Nicholas et al. Feb 2008 B2
7327226 Turnbull et al. Feb 2008 B2
7327855 Chen Feb 2008 B1
7328103 McCarthy et al. Feb 2008 B2
7329013 Blank et al. Feb 2008 B2
7329850 Drummond et al. Feb 2008 B2
7331415 Hawes et al. Feb 2008 B2
7338177 Lynam Mar 2008 B2
7342707 Roberts et al. Mar 2008 B2
7344284 Lynam et al. Mar 2008 B2
7349143 Tonar et al. Mar 2008 B2
7349144 Varaprasad et al. Mar 2008 B2
7349582 Takeda et al. Mar 2008 B2
7355524 Schofield Apr 2008 B2
7360932 Uken et al. Apr 2008 B2
7362505 Hikmet et al. Apr 2008 B2
7368714 Remillard et al. May 2008 B2
7370983 DeWind et al. May 2008 B2
7372611 Tonar et al. May 2008 B2
7375895 Brynielsson May 2008 B2
7379224 Tonar et al. May 2008 B2
7379225 Tonar et al. May 2008 B2
7379243 Horsten et al. May 2008 B2
7379814 Ockerse et al. May 2008 B2
7379817 Tyson et al. May 2008 B1
7380633 Shen et al. Jun 2008 B2
7389171 Rupp Jun 2008 B2
7391563 McCabe et al. Jun 2008 B2
7396147 Munro Jul 2008 B2
7411637 Weiss Aug 2008 B2
7411732 Kao et al. Aug 2008 B2
7412328 Uhlmann et al. Aug 2008 B2
7417781 Tonar et al. Aug 2008 B2
7420159 Heslin et al. Sep 2008 B2
7420756 Lynam Sep 2008 B2
7429998 Kawauchi et al. Sep 2008 B2
7446462 Lim et al. Nov 2008 B2
7446650 Schofield et al. Nov 2008 B2
7446924 Schofield et al. Nov 2008 B2
7448776 Tang Nov 2008 B2
7452090 Weller et al. Nov 2008 B2
7453057 Drummond et al. Nov 2008 B2
7455412 Rottcher Nov 2008 B2
7460007 Schofield et al. Dec 2008 B2
7467883 DeLine et al. Dec 2008 B2
7468651 DeLine et al. Dec 2008 B2
7471438 McCabe et al. Dec 2008 B2
7474963 Taylor et al. Jan 2009 B2
7477439 Tonar et al. Jan 2009 B2
7480149 DeWard et al. Jan 2009 B2
7488080 Skiver et al. Feb 2009 B2
7488099 Fogg et al. Feb 2009 B2
7489374 Utsumi et al. Feb 2009 B2
7490007 Taylor et al. Feb 2009 B2
7490943 Kikuchi et al. Feb 2009 B2
7490944 Blank et al. Feb 2009 B2
7494231 Varaprasad et al. Feb 2009 B2
7495719 Adachi et al. Feb 2009 B2
7496439 McCormick Feb 2009 B2
7502156 Tonar et al. Mar 2009 B2
7505047 Yoshimura Mar 2009 B2
7505188 Niiyama et al. Mar 2009 B2
7511607 Hubbard et al. Mar 2009 B2
7511872 Tonar et al. Mar 2009 B2
7525604 Xue Apr 2009 B2
7525715 McCabe et al. Apr 2009 B2
7526103 Schofield et al. Apr 2009 B2
7533998 Schofield et al. May 2009 B2
7538316 Heslin et al. May 2009 B2
7540620 Weller et al. Jun 2009 B2
7541570 Drummond et al. Jun 2009 B2
7542193 McCabe et al. Jun 2009 B2
7543946 Ockerse et al. Jun 2009 B2
7543947 Varaprasad et al. Jun 2009 B2
7545429 Travis Jun 2009 B2
7547467 Olson et al. Jun 2009 B2
7548291 Lee et al. Jun 2009 B2
7551354 Horsten et al. Jun 2009 B2
7561181 Schofield et al. Jul 2009 B2
7562985 Cortenraad et al. Jul 2009 B2
7567291 Bechtel et al. Jul 2009 B2
7571038 Butler et al. Aug 2009 B2
7571042 Taylor et al. Aug 2009 B2
7572017 Varaprasad et al. Aug 2009 B2
7572490 Park et al. Aug 2009 B2
7579939 Schofield et al. Aug 2009 B2
7579940 Schofield et al. Aug 2009 B2
7580795 McCarthy et al. Aug 2009 B2
7581859 Lynam Sep 2009 B2
7581867 Lee et al. Sep 2009 B2
7583184 Schofield et al. Sep 2009 B2
7586566 Nelson et al. Sep 2009 B2
7586666 McCabe et al. Sep 2009 B2
7589883 Varaprasad et al. Sep 2009 B2
7589893 Rottcher Sep 2009 B2
7600878 Blank et al. Oct 2009 B2
7605883 Yamaki et al. Oct 2009 B2
7619508 Lynam et al. Nov 2009 B2
7623202 Araki et al. Nov 2009 B2
7626749 Baur et al. Dec 2009 B2
7629996 Rademacher et al. Dec 2009 B2
7633567 Yamada et al. Dec 2009 B2
7636188 Baur et al. Dec 2009 B2
7636195 Nieuwkerk et al. Dec 2009 B2
7636930 Chang Dec 2009 B2
7643200 Varaprasad et al. Jan 2010 B2
7643927 Hils Jan 2010 B2
7651228 Skiver et al. Jan 2010 B2
7658521 DeLine et al. Feb 2010 B2
7663798 Tonar et al. Feb 2010 B2
7667579 DeLine et al. Feb 2010 B2
7670016 Weller et al. Mar 2010 B2
7688495 Tonar et al. Mar 2010 B2
7695174 Takayanagi et al. Apr 2010 B2
7696964 Lankhorst et al. Apr 2010 B2
7706046 Bauer et al. Apr 2010 B2
7710631 McCabe et al. May 2010 B2
7711479 Taylor et al. May 2010 B2
7724434 Cross et al. May 2010 B2
7726822 Blank et al. Jun 2010 B2
7728276 Drummond et al. Jun 2010 B2
7728721 Schofield et al. Jun 2010 B2
7728927 Nieuwkerk et al. Jun 2010 B2
7731403 Lynam et al. Jun 2010 B2
7734392 Schofield et al. Jun 2010 B2
7742864 Sekiguchi Jun 2010 B2
7746534 Tonar et al. Jun 2010 B2
7771061 Varaprasad et al. Aug 2010 B2
7787077 Kondoh et al. Aug 2010 B2
7791694 Molsen et al. Sep 2010 B2
7795675 Darwish et al. Sep 2010 B2
7815326 Blank et al. Oct 2010 B2
7821697 Varaprasad et al. Oct 2010 B2
7822543 Taylor et al. Oct 2010 B2
7826123 McCabe et al. Nov 2010 B2
7830583 Neuman et al. Nov 2010 B2
7832882 Weller et al. Nov 2010 B2
7842154 Lynam Nov 2010 B2
7854514 Conner et al. Dec 2010 B2
7855755 Weller et al. Dec 2010 B2
7859565 Schofield et al. Dec 2010 B2
7859737 McCabe et al. Dec 2010 B2
7864398 Dozeman et al. Jan 2011 B2
7864399 McCabe et al. Jan 2011 B2
7871169 Varaprasad et al. Jan 2011 B2
7873593 Schofield et al. Jan 2011 B2
7888629 Heslin et al. Feb 2011 B2
7898398 DeLine et al. Mar 2011 B2
7898719 Schofield et al. Mar 2011 B2
7903324 Kobayashi et al. Mar 2011 B2
7903335 Nieuwkerk et al. Mar 2011 B2
7906756 Drummond et al. Mar 2011 B2
7911547 Brott et al. Mar 2011 B2
7914188 DeLine et al. Mar 2011 B2
7916009 Schofield et al. Mar 2011 B2
7916380 Tonar et al. Mar 2011 B2
7918570 Weller et al. Apr 2011 B2
7926960 Skiver et al. Apr 2011 B2
7937667 Kramer et al. May 2011 B2
7965336 Bingle et al. Jun 2011 B2
7965357 Van De Witte et al. Jun 2011 B2
7980711 Takayanagi et al. Jul 2011 B2
7994471 Heslin et al. Aug 2011 B2
8000894 Taylor et al. Aug 2011 B2
8004768 Takayanagi et al. Aug 2011 B2
8019505 Schofield et al. Sep 2011 B2
8027691 Bernas et al. Sep 2011 B2
8031225 Watanabe et al. Oct 2011 B2
8040376 Yamada et al. Oct 2011 B2
8044776 Schofield et al. Oct 2011 B2
8047667 Weller et al. Nov 2011 B2
8049640 Uken et al. Nov 2011 B2
8063753 DeLine et al. Nov 2011 B2
8072318 Lynam et al. Dec 2011 B2
8083386 Lynam Dec 2011 B2
8094002 Schofield et al. Jan 2012 B2
8095260 Schofield et al. Jan 2012 B1
8095310 Taylor et al. Jan 2012 B2
8100568 DeLine et al. Jan 2012 B2
8106347 Drummond et al. Jan 2012 B2
8121787 Taylor et al. Feb 2012 B2
8134117 Heslin et al. Mar 2012 B2
8144033 Chinomi et al. Mar 2012 B2
8154418 Peterson et al. Apr 2012 B2
8162493 Skiver et al. Apr 2012 B2
8164817 Varaprasad et al. Apr 2012 B2
8169307 Nakamura et al. May 2012 B2
8170748 Schofield et al. May 2012 B1
8177376 Weller et al. May 2012 B2
8179236 Weller et al. May 2012 B2
8179437 Schofield et al. May 2012 B2
8179586 Schofield et al. May 2012 B2
8194132 Dayan et al. Jun 2012 B2
8194133 De Wind et al. Jun 2012 B2
8217887 Sangam et al. Jul 2012 B2
8228588 McCabe et al. Jul 2012 B2
8237909 Ostreko et al. Aug 2012 B2
8267559 DeLine et al. Sep 2012 B2
8271187 Taylor et al. Sep 2012 B2
8277059 McCabe et al. Oct 2012 B2
8282224 Anderson et al. Oct 2012 B2
8282226 Blank et al. Oct 2012 B2
8282253 Lynam Oct 2012 B2
8288711 Heslin et al. Oct 2012 B2
8294975 Varaprasad et al. Oct 2012 B2
8304711 Drummond et al. Nov 2012 B2
8308325 Takayanagi et al. Nov 2012 B2
8309907 Heslin et al. Nov 2012 B2
8325028 Schofield et al. Dec 2012 B2
8335032 McCabe et al. Dec 2012 B2
8339526 Minikey, Jr. et al. Dec 2012 B2
8355853 Schofield et al. Jan 2013 B2
8358262 Degwekar et al. Jan 2013 B2
8379289 Schofield et al. Feb 2013 B2
8400704 McCabe et al. Mar 2013 B2
8427288 Schofield et al. Apr 2013 B2
8543330 Taylor et al. Sep 2013 B2
8643724 Schofield et al. Feb 2014 B2
8676491 Taylor et al. Mar 2014 B2
9014966 Taylor Apr 2015 B2
20010002451 Breed May 2001 A1
20010020202 Obradovich Sep 2001 A1
20010026316 Senatore Oct 2001 A1
20010035853 Hoelen et al. Nov 2001 A1
20020011611 Huang et al. Jan 2002 A1
20020049535 Rigo et al. Apr 2002 A1
20020085155 Arikawa Jul 2002 A1
20020092958 Lusk Jul 2002 A1
20020118321 Ge Aug 2002 A1
20020133144 Chan et al. Sep 2002 A1
20020149727 Wang Oct 2002 A1
20020154007 Yang Oct 2002 A1
20030002165 Mathias et al. Jan 2003 A1
20030007261 Hutzel et al. Jan 2003 A1
20030030724 Okamoto Feb 2003 A1
20030069690 Correia et al. Apr 2003 A1
20030090568 Pico May 2003 A1
20030090569 Poechmueller May 2003 A1
20030098908 Misaiji et al. May 2003 A1
20030103142 Hitomi et al. Jun 2003 A1
20030122929 Minaudo et al. Jul 2003 A1
20030133014 Mendoza Jul 2003 A1
20030137586 Lewellen Jul 2003 A1
20030156193 Nakamura Aug 2003 A1
20030169158 Paul, Jr. Sep 2003 A1
20030179293 Oizumi Sep 2003 A1
20030202096 Kim Oct 2003 A1
20030206256 Drain et al. Nov 2003 A1
20030214576 Koga Nov 2003 A1
20030214584 Ross, Jr. Nov 2003 A1
20030227546 Hilborn et al. Dec 2003 A1
20040004541 Hong Jan 2004 A1
20040027695 Lin Feb 2004 A1
20040036768 Green Feb 2004 A1
20040080404 White Apr 2004 A1
20040239243 Roberts et al. Dec 2004 A1
20040239849 Wang Dec 2004 A1
20050018738 Duan et al. Jan 2005 A1
20050024591 Lian et al. Feb 2005 A1
20050117095 Ma Jun 2005 A1
20050168995 Kittelmann et al. Aug 2005 A1
20050237440 Sugimura et al. Oct 2005 A1
20050270766 Kung et al. Dec 2005 A1
20060050018 Hutzel et al. Mar 2006 A1
20060061008 Karner et al. Mar 2006 A1
20060076860 Hoss Apr 2006 A1
20060139953 Chou et al. Jun 2006 A1
20060187378 Bong et al. Aug 2006 A1
20060279522 Kurihara Dec 2006 A1
20070064108 Haler Mar 2007 A1
20070080585 Lyu Apr 2007 A1
20070086097 Motomiya et al. Apr 2007 A1
20070183037 De Boer et al. Aug 2007 A1
20070262732 Shen Nov 2007 A1
20080042938 Cok Feb 2008 A1
20090002491 Haler Jan 2009 A1
20090052003 Schofield et al. Feb 2009 A1
20090096937 Bauer et al. Apr 2009 A1
20090201137 Weller et al. Aug 2009 A1
20090258221 Diehl et al. Oct 2009 A1
20090262192 Schofield et al. Oct 2009 A1
20090296190 Anderson et al. Dec 2009 A1
20100045899 Ockerse Feb 2010 A1
20100245701 Sato et al. Sep 2010 A1
20100246017 Tonar et al. Sep 2010 A1
20100277786 Anderson et al. Nov 2010 A1
20100289995 Hwang et al. Nov 2010 A1
20110128137 Varaprasad et al. Jun 2011 A1
20110166779 McCarthy et al. Jul 2011 A1
20110166785 McCarthy et al. Jul 2011 A1
20120050068 DeLine et al. Mar 2012 A1
20120062744 Schofield et al. Mar 2012 A1
20120086808 Lynam et al. Apr 2012 A1
20120182141 Peterson et al. Jul 2012 A1
20120203550 Skiver et al. Aug 2012 A1
20120206790 Varaprasad et al. Aug 2012 A1
20120224066 Weller et al. Sep 2012 A1
20120224248 Schofield et al. Sep 2012 A1
20120236152 De Wind et al. Sep 2012 A1
Foreign Referenced Citations (187)
Number Date Country
A-4031795 Feb 1995 AU
1189224 Jul 1998 CN
941408 Apr 1956 DE
944531 Jul 1956 DE
7323996 Nov 1973 DE
2808260 Aug 1979 DE
3248511 Jul 1984 DE
3301945 Jul 1984 DE
3614882 Nov 1987 DE
3720848 Jan 1989 DE
9306989.8 Jul 1993 DE
4329983 Aug 1995 DE
4444443 Jun 1996 DE
29703084 Jun 1997 DE
29805142 May 1998 DE
19741896 Apr 1999 DE
19755008 Jul 1999 DE
29902344 Jul 1999 DE
19934999 Feb 2001 DE
19943355 Mar 2001 DE
20118868 Mar 2002 DE
10131459 Jan 2003 DE
102005000650 Jul 2006 DE
0299509 Jan 1989 EP
0513476 Nov 1992 EP
0524766 Jan 1993 EP
0729864 Dec 1995 EP
0728618 Aug 1996 EP
0825477 Feb 1998 EP
0830985 Mar 1998 EP
0928723 Jul 1999 EP
0937601 Aug 1999 EP
1022903 Jul 2000 EP
1065642 Jan 2001 EP
1075986 Feb 2001 EP
1097848 May 2001 EP
1152285 Nov 2001 EP
1170173 Jan 2002 EP
1193773 Mar 2002 EP
1256833 Nov 2002 EP
0899157 Oct 2004 EP
1315639 Feb 2006 EP
1021987 Feb 1953 FR
1461419 Dec 1966 FR
2585991 Feb 1987 FR
2672857 Aug 1992 FR
2673499 Sep 1992 FR
2759045 Aug 1998 FR
810010 Mar 1959 GB
934037 Aug 1963 GB
1008411 Oct 1965 GB
1136134 Dec 1968 GB
1553376 Sep 1979 GB
2137573 Oct 1984 GB
2161440 Jan 1986 GB
2192370 Jan 1988 GB
2222991 Mar 1990 GB
2233530 Sep 1991 GB
2255539 Nov 1992 GB
2351055 Dec 2000 GB
2362494 Nov 2001 GB
50-000638 Jan 1975 JP
52-146988 Nov 1977 JP
55-039843 Mar 1980 JP
57-30639 Feb 1982 JP
57-102602 Jun 1982 JP
57-208530 Dec 1982 JP
58-020954 Feb 1983 JP
58-030729 Feb 1983 JP
58-110334 Jun 1983 JP
58-180347 Oct 1983 JP
58-209635 Dec 1983 JP
59-114139 Jul 1984 JP
60-212730 Oct 1985 JP
60-261275 Dec 1985 JP
61-127186 Jun 1986 JP
61-260217 Nov 1986 JP
62-043543 Feb 1987 JP
62-075619 Apr 1987 JP
6216073 Apr 1987 JP
62-122487 Jun 1987 JP
62-131232 Jun 1987 JP
63-02753 Jan 1988 JP
63-085525 Apr 1988 JP
63-106730 May 1988 JP
63-106731 May 1988 JP
63-274286 Nov 1988 JP
64-14700 Jan 1989 JP
01-123587 May 1989 JP
01-130578 May 1989 JP
H2-36417 Aug 1990 JP
H2-117935 Sep 1990 JP
02-122844 Oct 1990 JP
03-28947 Mar 1991 JP
03-028947 Mar 1991 JP
03-052097 Mar 1991 JP
03-061192 Mar 1991 JP
03-099952 Apr 1991 JP
03-110855 May 1991 JP
03-198026 Aug 1991 JP
03-243914 Oct 1991 JP
04-114587 Apr 1992 JP
04-245886 Sep 1992 JP
05-080716 Apr 1993 JP
05-183194 Jul 1993 JP
05-213113 Aug 1993 JP
05-257142 Oct 1993 JP
06-080953 Mar 1994 JP
06-107035 Apr 1994 JP
06-227318 Aug 1994 JP
06-318734 Nov 1994 JP
07-146467 Jun 1995 JP
H7-30149 Jun 1995 JP
07-175035 Jul 1995 JP
07-191311 Jul 1995 JP
07-266928 Oct 1995 JP
07-267002 Oct 1995 JP
07-277072 Oct 1995 JP
07-281150 Oct 1995 JP
07-281185 Oct 1995 JP
08-008083 Jan 1996 JP
08-083581 Mar 1996 JP
08-216789 Aug 1996 JP
08-227769 Sep 1996 JP
09-033886 Feb 1997 JP
09-260074 Mar 1997 JP
05-077657 Jul 1997 JP
09-220976 Aug 1997 JP
09-230827 Sep 1997 JP
09-266078 Oct 1997 JP
09-288262 Nov 1997 JP
10-076880 Mar 1998 JP
10-190960 Jul 1998 JP
10-199480 Jul 1998 JP
10-206643 Aug 1998 JP
10-221692 Aug 1998 JP
10-239659 Sep 1998 JP
10-276298 Oct 1998 JP
11-038381 Feb 1999 JP
11-067485 Mar 1999 JP
11-078693 Mar 1999 JP
11-109337 Apr 1999 JP
11-160539 Jun 1999 JP
11-212073 Aug 1999 JP
11-283759 Oct 1999 JP
11-298058 Oct 1999 JP
11-305197 Nov 1999 JP
2000-131681 May 2000 JP
2000-153736 Jun 2000 JP
2000-159014 Jun 2000 JP
2000-255321 Sep 2000 JP
2000-330107 Nov 2000 JP
2001-083509 Mar 2001 JP
2001-097116 Apr 2001 JP
2001-222005 Aug 2001 JP
2002-072901 Mar 2002 JP
2002-120649 Apr 2002 JP
2002-122860 Apr 2002 JP
2002-162626 Jun 2002 JP
2002-352611 Dec 2002 JP
2003-182454 Mar 2003 JP
2003-267129 Sep 2003 JP
2004-182156 Jul 2004 JP
2005-148119 Jun 2005 JP
2005-280526 Oct 2005 JP
2005-327600 Nov 2005 JP
3846073 Nov 2006 JP
2008-083657 Apr 2008 JP
20060038856 May 2006 KR
100663930 Jan 2007 KR
20090031998 Mar 2009 KR
WO 8202448 Jul 1982 WO
WO 8606179 Oct 1986 WO
WO 9419212 Sep 1994 WO
WO 9621581 Jul 1996 WO
WO 9814974 Apr 1998 WO
WO 9838547 Sep 1998 WO
WO 9915360 Apr 1999 WO
WO 0023826 Apr 2000 WO
WO 0052661 Sep 2000 WO
WO 0055685 Sep 2000 WO
WO 0101192 Jan 2001 WO
WO 0180353 Oct 2001 WO
WO 0218174 Mar 2002 WO
WO 0249881 Jun 2002 WO
WO 03021343 Mar 2003 WO
WO 03078941 Sep 2003 WO
Non-Patent Literature Citations (31)
Tremblay, M., et al. High resolution smart image sensor with integrated parallel analog processing for multiresolution edge extraction, Robotics and Autonomous Systems 11 (1993), pp. 231-242, with abstract.
Lu, M., et al. On-chip Automatic Exposure Control Technique, Solid-State Circuits Conference, 1991. ESSCIRC '91. Proceedings—Seventeenth European (vol. 1) with abstract.
CMOS sensor page of University of Edinburgh.
Inter Partes Review Proceeding IPR2015-01415 regarding U.S. Pat. No. 8,543,330, issued to Taylor et al.
Inter Partes Review Proceeding IPR2015-01413 regarding U.S. Pat. No. 8,676,491, issued to Taylor et al.
Brown, Lisa Gottesfeld, A Survey of Image Registration Techniques, vol. 24, ACM Computing Surveys, pp. 325-376, 1992.
Burt et al., A Multiresolution Spline with Application to Image Mosaics, ACM Transactions on Graphics, vol. 2. No. 4, pp. 217-236, Oct. 1983.
Dana H. Ballard and Christopher M. Brown, Computer Vision, Prentice-Hall, Englewood Cliffs, New Jersey, 5 pages, 1982.
Edgar, Julian; Goodbye 12 Volts . . . Hello 42 Volts!; Oct. 5, 1999; Autospeed, Issue 50; www.autospeed.co.nz/cms/A_0319/article.html.
G. Wang, D. Renshaw, P.B. Denyer and M. Lu, CMOS Video Cameras, article, 1991, 4 pages, University of Edinburgh, UK.
Greene et al., Creating Raster Omnimax Images from Multiple Perspective Views Using the Elliptical Weighted Average Filter, IEEE Computer Graphics and Applications, vol. 6, No. 6, pp. 21-27, Jun. 1986.
Inter Partes Review Proceeding IPR2015-00250 regarding U.S. Pat. No. 8,543,330, issued to Taylor et al.
Inter Partes Review Proceeding IPR2015-00251 regarding U.S. Pat. No. 8,676,491, issued to Taylor et al.
Japanese Article “Television Image Engineering Handbook, The Institute of Television Engineers of Japan” (“JP Handbook”).
Jewett, Dale; Aug. 2000; Automotive Industries; Cahners Publishing Company; www.findarticles.com/p/articles/mi_m3012/is_8_180/ai_64341779.
Kobe, Gerry; 42 Volts Goes Underhood; Mar. 2000; Automotive Industries; Cahners Publishing Company; www.findarticles.com/p/articles/mi_m3012/is_3_180/ai_61361677.
Nathan, Robert, Digital Video Data Handling, NASA JPL Tech Report 32-877, Pasadena, CA, Jan. 5, 1966.
National Semiconductor, LM78S40, Universal Switching Regulator Subsystem, National Semiconductor Corporation, Apr. 1996, p. 6.
Porter et al., “Compositing Digital Images,” Computer Graphics (Proc. Siggraph), vol. 18, No. 3, pp. 253-259, Jul. 1984.
SAE Paper No. 750364 to Nolan, published Feb. 1, 1975.
SAE Paper No. 770274 to Smith, published Feb. 1, 1977.
SAE Paper No. 860173 to Ortega, published Mar. 1, 1986.
SAE Paper No. 871288 to Otsuka, published Nov. 8, 1987.
SAE Paper No. 890282 to Corsi, published Feb. 1, 1989.
SAE Paper No. 890283 to Brandt, published Feb. 1, 1989.
SAE Paper No. 890288 to Weihrauch, published Feb. 1, 1989.
SAE Paper No. 930456 to Gumkowski, published Mar. 1, 1993.
Stewart, James W.; HP SnapLED: LED Assemblies for Automotive Signal Applications; Nov. 1, 1998; Hewlett-Packard Journal; vol. 50, No. 1, www.hpl.hp.com/hpjournal/98nov/nov98al.pdf.
Szeliski, Richard, Image Mosaicing for Tele-Reality Applications, DEC Cambridge Research Laboratory, CRL 94/2, May 1994.
Wolberg, “A Two-Pass Mesh Warping Implementation of Morphing,” Dr. Dobb's Journal, No. 202, Jul. 1993.
Wolberg, George, Digital Image Warping, IEEE Computer Society Press, 1990.
Related Publications (1)
Number Date Country
20150210217 A1 Jul 2015 US
Provisional Applications (15)
Number Date Country
60406166 Aug 2002 US
60405392 Aug 2002 US
60404906 Aug 2002 US
60187960 Mar 2000 US
60263680 Jan 2001 US
60243986 Oct 2000 US
60238483 Oct 2000 US
60237077 Sep 2000 US
60234412 Sep 2000 US
60218336 Jul 2000 US
60186520 Mar 2000 US
60346733 Jan 2002 US
60263680 Jan 2001 US
60271466 Feb 2001 US
60315384 Aug 2001 US
Continuations (13)
Number Date Country
Parent 14211256 Mar 2014 US
Child 14678145 US
Parent 14033963 Sep 2013 US
Child 14211256 US
Parent 13621382 Sep 2012 US
Child 14033963 US
Parent 13399347 Feb 2012 US
Child 13621382 US
Parent 13209645 Aug 2011 US
Child 13399347 US
Parent 12908481 Oct 2010 US
Child 13209645 US
Parent 12724895 Mar 2010 US
Child 12908481 US
Parent 12405614 Mar 2009 US
Child 12724895 US
Parent 11935800 Nov 2007 US
Child 12405614 US
Parent 11624381 Jan 2007 US
Child 11935800 US
Parent 10645762 Aug 2003 US
Child 11624381 US
Parent 09799414 Mar 2001 US
Child 10287178 US
Parent 09793002 Feb 2001 US
Child 10755915 US
Continuation in Parts (5)
Number Date Country
Parent 10456599 Jun 2003 US
Child 10645762 US
Parent 10287178 Nov 2002 US
Child 10456599 US
Parent 10755915 Jan 2004 US
Child 11624381 US
Parent 10054633 Jan 2002 US
Child 11624381 US
Parent 09793002 Feb 2001 US
Child 10054633 US