Vehicular forward-sensing system

Information

  • Patent Grant
  • 11506782
  • Patent Number
    11,506,782
  • Date Filed
    Monday, December 21, 2020
  • Date Issued
    Tuesday, November 22, 2022
Abstract
A vehicular forward-sensing system includes a radar sensor and a forward viewing image sensor disposed within a windshield electronics module that is removably installed within the vehicle cabin at the vehicle windshield. A control is responsive to an output of the radar sensor and responsive to an output of the image sensor. Responsive to the image sensor viewing an object present in the path of forward travel of the vehicle and responsive to the radar sensor sensing the object present in the path of forward travel of the vehicle, the control determines that the object is an object of interest by processing by an image processing chip of image data of the object captured by the image sensor at a portion of an image plane of the image sensor that is spatially related to a location of the object present in the path of forward travel of the vehicle.
Description
FIELD OF THE INVENTION

The present invention generally relates to forward facing sensing systems and, more particularly, to forward facing sensing systems utilizing a radar sensor device.


BACKGROUND OF THE INVENTION

It is known to provide a radar (radio detection and ranging) system (such as a 77 GHz radar or other suitable frequency radar) on a vehicle for sensing the area forward of a vehicle, such as for an adaptive cruise control (ACC) system or an ACC stop and go system or the like. It is also known to provide a lidar (laser imaging detection and ranging) system for sensing the area forward of a vehicle for similar applications. Typically, the radar system is preferred for such vehicle applications because it detects objects in fog or other inclement weather conditions better than the lidar system does.


Such radar sensor devices are typically located at the front grille of the vehicle and thus may be intrusive to the underhood packaging of the vehicle and the exterior styling of the vehicle. Although it is known to provide a lidar sensing device or system at the windshield for scanning/detecting through the windshield, radar systems are generally not suitable for such applications because they do not sense well through glass, such as through the vehicle windshield (the glass windshield may substantially attenuate the radar performance or ability to detect objects forward of the vehicle). It is also known to augment such a radar or lidar system with a forward facing camera or image sensor.


SUMMARY OF THE INVENTION

The present invention provides a forward facing sensing system for detecting objects forward of the vehicle (such as for use with or in conjunction with an adaptive cruise control system or other object detection system or the like), with a radar sensor device located behind, and transmitting through, a radar transmitting portion established at the upper windshield area of the vehicle [the radar sensor device typically transmitting at a frequency of at least about 20 GHz (such as 24 GHz), and more preferably at least about 60 GHz (such as 60 GHz, 77 GHz or 79 GHz or thereabouts)]. The radar sensor device is positioned at a recess or pocket or opening formed at and along the upper edge of the windshield so as to have a forward transmitting and receiving direction for radar electromagnetic waves that is not through the glass panels of the windshield. The vehicle or sensing system preferably includes a sealing or cover element, such as a plastic cover element at the sensing device, that seals/environmentally protects the radar sensor device within the cabin of the vehicle while allowing transmission and receipt of radar frequency electromagnetic radiation waves to and from the exterior of the vehicle.
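
By way of non-limiting illustration, the frequencies noted above are in the millimeter-wave regime; the short sketch below (illustrative Python, not part of the disclosed system) simply evaluates the free-space wavelength λ = c/f for the cited frequencies.

    # Illustrative only: free-space wavelengths for the radar frequencies
    # mentioned above (lambda = c / f).
    C = 299_792_458.0  # speed of light, m/s

    for f_ghz in (20, 24, 60, 77, 79):
        wavelength_mm = C / (f_ghz * 1e9) * 1e3
        print(f"{f_ghz:>3} GHz -> free-space wavelength ~{wavelength_mm:.1f} mm")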


According to an aspect of the present invention, a forward facing sensing system or radar sensing system for a vehicle includes a radar sensor device disposed at a pocket or recess or opening established at an upper edge of the vehicle windshield and having a forward transmitting and receiving direction that is not through the windshield. A cover panel is disposed at the radar sensor device and is substantially sealed at the vehicle windshield at or near the pocket at the upper edge of the vehicle windshield. The cover panel comprises a material that is substantially transmissive to radar frequency electromagnetic radiation waves. The radar sensor device transmits and receives radar frequency electromagnetic radiation waves that transmit through the cover panel. The system includes a control that is responsive to an output of the radar sensor device.


According to another aspect of the present invention, a forward facing sensing system for a vehicle includes a radar sensor device operable to detect an object ahead of the vehicle, a forward facing image sensor having a forward field of view, and a control responsive to an output of the radar sensor device and responsive to an output of the forward facing image sensor. The control is operable to control sensing by the radar sensor device and the control is operable to control a focused or enhanced interrogation of a detected object (or area at which a detected object is detected) in response to a detection of an object forward of the vehicle by the radar sensor device. The control may be operable to at least one of (a) control enhanced interrogation of a detected object by the radar sensor device in response to the forward facing image sensor detecting an object (such as by enhancing the interrogation via a beam aiming or beam selection technique, such as by digital beam forming in a phased array antenna system or such as by digital beam steering or the like), and (b) control enhanced interrogation of a detected object by the forward facing image sensor in response to the radar sensor device detecting an object (such as by enhancing the interrogation via enhanced or intensified algorithmic processing of a portion of the image plane of the image sensor that is spatially related to the location of the detected object in the forward field of view of the image sensor). The control thus may be responsive to the forward facing image sensor to guide or control the focused interrogation of the detected object by the radar sensor device, or the control may be responsive to the radar sensor device to guide or control the focused or enhanced interrogation of the detected object by the forward facing image sensor (such as via directing or controlling the image sensor and/or its field of view or zoom function or via image processing of the captured image data, such as by providing enhanced processing of the area at which the object is detected).
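
By way of non-limiting illustration, the cross-cueing described in this aspect may be pictured with the following sketch (illustrative Python; all class, function and attribute names are hypothetical and are not taken from the disclosure), in which a radar detection triggers focused image processing of the spatially related image region, and a camera detection triggers steering of the radar beam toward the detected bearing.

    # Illustrative sketch of the cross-cueing control described above.
    # All names are hypothetical; this is not the patented implementation,
    # only one way such a control loop could be organized.

    from dataclasses import dataclass
    from typing import Optional


    @dataclass
    class RadarDetection:
        range_m: float      # distance to the detected object
        azimuth_deg: float  # bearing relative to the vehicle centerline


    @dataclass
    class CameraDetection:
        azimuth_deg: float  # bearing of the object in the camera's field of view


    class ForwardSensingControl:
        def __init__(self, radar, camera, image_processor):
            self.radar = radar
            self.camera = camera
            self.image_processor = image_processor

        def step(self):
            radar_det: Optional[RadarDetection] = self.radar.sweep()
            cam_det: Optional[CameraDetection] = self.camera.detect()

            # (b) Radar "blip" -> focused algorithmic processing of the image
            # region spatially related to the detected object location.
            if radar_det is not None:
                roi = self.image_processor.region_for(radar_det.range_m,
                                                      radar_det.azimuth_deg)
                return self.image_processor.classify(roi)

            # (a) Camera detection -> aim/steer the radar beam (e.g., by digital
            # beam forming) toward the detected bearing for enhanced interrogation.
            if cam_det is not None:
                return self.radar.interrogate(cam_det.azimuth_deg)

            return None  # nothing detected; remain in the low-cost sweep mode

In either cueing direction, the baseline processing load stays low until one of the sensors reports a candidate object, consistent with the staged operation described later in the specification.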


Optionally, and desirably, the forward facing image sensor and the radar sensor device may be commonly established on a semiconductor substrate. Optionally, the semiconductor substrate may comprise one of (i) a germanium substrate, (ii) a gallium arsenide substrate, and (iii) a silicon germanium substrate.


These and other objects, advantages, purposes and features of the present invention will become apparent upon review of the following specification in conjunction with the drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a perspective view of a vehicle incorporating a forward facing radar sensing system in accordance with the present invention; and



FIG. 2 is a perspective view of a windshield and radar sensing system of the present invention.





DESCRIPTION OF THE PREFERRED EMBODIMENTS

Referring now to the drawings and the illustrative embodiments depicted therein, a sensing system or forward facing sensing system or radar sensing system 10 for a vehicle 12 includes a radar sensor device 14 at an upper region of the vehicle windshield 12a and with a forward transmitting and sensing direction forward of the vehicle and in the forward direction of travel of the vehicle (FIG. 1). The windshield glass 12a may be formed with a cutout or pocket 12b at the upper edge. The pocket may be cut from the glass (so as to provide a cut opening at the upper edge of the glass windshield) or the glass may be formed with an inward bulge or pocket that provides an opening for the sensing device. The radar sensor device 14 thus may be disposed at the pocket 12b and may have a clear or unobstructed view or sensing direction forward of the vehicle that does not pass through glass (and whereby the glass windshield will not attenuate the performance of the radar sensor device). Because the upper region of the vehicle windshield is typically not used, the radar sensor device 14 may be disposed thereat without being intrusive of other systems or elements and without adversely affecting the vehicle design and/or layout. The sensing system 10 is operable to detect objects or vehicles or the like in front of the vehicle as the vehicle is traveling along a road, such as in conjunction with an adaptive cruise control system or the like. Although shown and described as being a forward facing sensing system, aspects of the present invention may be suitable for other sensing systems, such as a rearward facing sensing system or the like.


Radar sensor device 14 thus may be disposed within a windshield electronics module 16 or accessory module or overhead console of the vehicle, and within the vehicle cabin, without experiencing the adverse performance caused by the attenuation of radio or radar frequency electromagnetic radiation wave transmission through the windshield glass. Optionally, the vehicle sheet metal may be adapted to receive and/or support the radar sensor device at the upper edge of the windshield, or to accommodate the radar sensor device as disposed in and/or supported by the windshield electronics module or the like.


In order to seal the upper edge of the windshield at the pocket 12b, a cover element or plate 18 may be provided that substantially or entirely spans the opening at the pocket and that is sealed at the glass windshield and vehicle around the perimeter of the pocket, so as to limit or substantially preclude water intrusion or the like into the vehicle at the radar sensor device. The cover element 18 preferably comprises a plastic or polymeric or polycarbonate material that is transmissive to radar waves so as to limit or substantially preclude an adverse effect on the performance of the radar sensor device and system. Optionally, and desirably, the cover element may be colored to match or substantially match the shade band along the upper region of the windshield or to match or substantially match the windshield electronics module or other interior or exterior component of the vehicle. Because the radar sensor device does not require a transparent cover, the cover element may be opaque or substantially opaque and/or may function to substantially camouflage or render covert the sensor device and/or the windshield electronics module or the like.


The radar sensor device may utilize known transmitting and receiving technology and may utilize a sweeping beam or a phased array or the like for scanning or sensing or interrogating the area in front of the vehicle. Optionally, the forward facing radar sensing system may include or may be associated with a forward facing camera or imaging sensor 20 (which may be disposed at or in the windshield electronics module or accessory module or overhead console, or at another accessory module or windshield electronics module, or at the interior rearview mirror assembly 22 or the like), which has a forward field of view in the forward direction of travel of the vehicle. The sensing system may function to perform a “sweep” of the area in front of the vehicle and, if an object or the like is detected (e.g., the radar sensing system detects a “blip”), the radar sensor device and system may home in on or focus on or further interrogate the region where the object is detected and may perform a more focused or enhanced interrogation of the area at which the object was detected to determine if the object is an object of interest. Optionally, for example, the system may control enhanced interrogation of a detected object by the radar sensor device (such as via a beam aiming or beam selection technique, such as by digital beam forming in a phased array antenna system or such as by digital beam steering). Such enhanced interrogation by the radar sensor device may be in response to the forward facing image sensor detecting an object in its forward field of view.
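
A focused interrogation by digital beam forming can be illustrated, in simplified form, by computing the per-element phase weights that steer a uniform linear array toward a chosen azimuth; in the sketch below (illustrative Python), the element count and half-wavelength spacing are assumptions for illustration and are not details of the disclosed system.

    # Minimal digital beam-forming sketch for a uniform linear array: compute
    # the complex steering weights that point the beam toward a chosen azimuth.
    # The 8-element, half-wavelength-spaced array is assumed for illustration.

    import cmath
    import math

    SPEED_OF_LIGHT = 299_792_458.0  # m/s


    def steering_weights(freq_hz, num_elements, steer_deg, spacing=None):
        """Phase weights w_n = exp(-j * 2*pi * n * d * sin(theta) / lambda)."""
        wavelength = SPEED_OF_LIGHT / freq_hz
        d = spacing if spacing is not None else wavelength / 2.0
        theta = math.radians(steer_deg)
        return [cmath.exp(-1j * 2 * math.pi * n * d * math.sin(theta) / wavelength)
                for n in range(num_elements)]


    # Steer a 77 GHz, 8-element array toward a detection 10 degrees to the right.
    weights = steering_weights(77e9, num_elements=8, steer_deg=10.0)
    for n, w in enumerate(weights):
        print(f"element {n}: phase {math.degrees(cmath.phase(w)):7.2f} deg")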


Optionally, and desirably, the forward facing camera may guide or initiate or control the more focused interrogation of the suspected object of interest (such as further or enhanced interrogation by the camera and imaging system) in response to the initial detection by the radar sensing system. For example, the radar sensing system may initially detect an object and the forward facing camera may be directed toward the detected object or otherwise controlled or processed to further interrogate the detected object (or area at which the object is detected) via the camera and image processing, or, alternately, the forward facing camera may initially detect an object and the system may select or aim a radar beam in a direction of a detected object. The enhanced interrogation of the object area by the forward facing camera may be accomplished via control of the camera's field of view or degree of zoom [for example, the camera may zoom into the area (via adjustment of a lens of the camera to enlarge an area of the field of view for enhanced processing) at which the object is detected] or via control of the image processing techniques. For example, the image processor may provide enhanced processing of the captured image data at the area or zone at which the object is detected, such as by enhanced or intensified algorithmic processing of a portion of the image plane of the image sensor that is spatially related to the location of the detected object in the forward field of view of the image sensor, such as by enhanced processing of pixel outputs of pixels within a zone or sub-array of a pixelated imaging array sensor, such as by utilizing aspects of the imaging systems described in U.S. Pat. Nos. 7,123,168; 7,038,577; 7,004,606; 6,690,268; 6,396,397; 5,550,677; 5,670,935; 5,796,094; 5,877,897 and 6,498,620, and/or U.S. patent application Ser. No. 11/239,980, filed Sep. 30, 2005, now U.S. Pat. No. 7,881,496; and/or Ser. No. 11/315,675, filed Dec. 22, 2005, now U.S. Pat. No. 7,720,580, which are all hereby incorporated herein by reference in their entireties.
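
One simplified way to obtain the spatially related portion of the image plane is to project the radar-reported object position into the image using a pinhole camera model and then limit processing to a pixel window around that projection. The sketch below (illustrative Python) assumes an ideal camera aligned with the radar boresight, with assumed focal length, image size and window size; an actual system would require calibrated intrinsic and extrinsic parameters.

    # Illustrative sketch: map a radar detection (range, azimuth, elevation) to
    # a pixel region of interest in the camera image plane, assuming an ideal
    # pinhole camera whose optical axis is aligned with the radar boresight.
    # Focal length, principal point and ROI size are assumed values.

    import math


    def radar_to_pixel(range_m, azimuth_deg, elevation_deg,
                       focal_px=1000.0, cx=640.0, cy=360.0):
        """Project a radar detection into (u, v) pixel coordinates."""
        az = math.radians(azimuth_deg)
        el = math.radians(elevation_deg)
        # Camera frame: x right, y down, z forward (along the boresight).
        x = range_m * math.cos(el) * math.sin(az)
        y = -range_m * math.sin(el)
        z = range_m * math.cos(el) * math.cos(az)
        u = cx + focal_px * x / z
        v = cy + focal_px * y / z
        return u, v


    def roi_around(u, v, half_width=64, half_height=48):
        """Pixel window (sub-array) centered on the projected detection."""
        return (int(u - half_width), int(v - half_height),
                int(u + half_width), int(v + half_height))


    # Example: object detected 40 m ahead, 5 degrees to the right, level with the sensor.
    u, v = radar_to_pixel(40.0, azimuth_deg=5.0, elevation_deg=0.0)
    print("projected pixel:", (round(u), round(v)))
    print("ROI for enhanced processing:", roi_around(u, v))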


Thus, the sensing system of the present invention provides for cooperation or collaboration between the radar sensor device and the forward facing camera or image sensor in a way that benefits or enhances the sensing capabilities of the forward facing sensing system. The sensing system may thus operate with reduced processing until an object is initially detected, and then may provide further processing to determine if the object is an object of interest to the forward facing sensing system.


Optionally, and desirably, the radar sensor device and forward facing camera may be commonly established on a semiconductor substrate, such as a substrate comprising a germanium substrate, a gallium arsenide substrate or a silicon germanium substrate or the like. The substrate may include or may incorporate at least some of the control circuitry for the radar sensor device and camera and/or may include or incorporate common circuitry for the radar sensor device and camera.


Because the radar sensor device and camera may be disposed on a common substrate and/or may be disposed within a windshield electronics module or the like, the forward facing sensing system may be removably installed at the vehicle and may be removed therefrom, such as for service or replacement. Thus, the sensing system (including the radar sensor device and camera) may comprise a self-contained unit or system that is disposed at the upper region of the windshield. Optionally, the radar sensor device and/or camera may be disposed within a windshield electronics module or the like, such as by utilizing aspects of the modules described in U.S. patent application Ser. No. 10/958,087, filed Oct. 4, 2004, now U.S. Pat. No. 7,188,963; and/or Ser. No. 11/201,661, filed Aug. 11, 2005, now U.S. Pat. Nos. 7,480,149, and/or 7,004,593; 6,824,281; 6,690,268; 6,250,148; 6,341,523; 6,593,565; 6,428,172; 6,501,387; 6,329,925 and 6,326,613, and/or in PCT Application No. PCT/US03/40611, filed Dec. 19, 2003 and published Jul. 15, 2004 as International Publication No. WO 2004/058540, and/or Ireland pat. applications, Ser. No. S2004/0614, filed Sep. 15, 2004; Ser. No. S2004/0838, filed Dec. 14, 2004; and Ser. No. S2004/0840, filed Dec. 15, 2004, which are all hereby incorporated herein by reference in their entireties.


Optionally, the mirror assembly and/or windshield electronics module may include or incorporate a display, such as a static display, such as a static video display screen (such as a display utilizing aspects of the displays described in U.S. Pat. Nos. 5,530,240 and/or 6,329,925, which are hereby incorporated herein by reference in their entireties, or a display-on-demand or transflective type display or other display utilizing aspects of the displays described in U.S. Pat. Nos. 6,690,268; 5,668,663 and/or 5,724,187, and/or U.S. patent application Ser. No. 10/054,633, filed Jan. 22, 2002, now U.S. Pat. No. 7,195,381; Ser. No. 11/021,065, filed Dec. 23, 2004, now U.S. Pat. No. 7,255,451; Ser. No. 10/528,269, filed Mar. 17, 2005, now U.S. Pat. No. 7,274,501; Ser. No. 10/533,762, filed May 4, 2005, now U.S. Pat. No. 7,184,190; Ser. No. 10/538,724, filed Jun. 13, 2005 and published on Mar. 9, 2006 as U.S. Publication No. US 2006/0050018; Ser. No. 11/226,628, filed Sep. 14, 2005 and published Mar. 23, 2006 as U.S. Publication No. US 2006/0061008; Ser. No. 10/993,302, filed Nov. 19, 2004, now U.S. Pat. No. 7,338,177; and/or Ser. No. 11/284,543, filed Nov. 22, 2005, now U.S. Pat. No. 7,370,983, and/or PCT Patent Application No. PCT/US2006/018567, filed May 15, 2006 and published Nov. 23, 2006 as International Publication No. WO 2006/124682; and/or PCT Application No. PCT/US2006/042718, filed Oct. 31, 2006, published May 10, 2007 as International Publication No. WO 07/053710; and U.S. provisional applications, Ser. No. 60/836,219, filed Aug. 8, 2006; Ser. No. 60/759,992, filed Jan. 18, 2006; and Ser. No. 60/732,245, filed Nov. 1, 2005, and/or PCT Application No. PCT/US03/40611, filed Dec. 19, 2003, and published Jul. 15, 2004 as International Publication No. WO 2004/058540, which are all hereby incorporated herein by reference in their entireties). Alternately, the display screen may comprise a display (such as a backlit LCD video display) that is movable to extend from the mirror casing when activated, such as a slide-out display of the types described in U.S. patent application Ser. No. 10/538,724, filed Jun. 13, 2005 and published on Mar. 9, 2006 as U.S. Publication No. US 2006/0050018; and/or Ser. No. 11/284,543, filed Nov. 22, 2005, now U.S. Pat. No. 7,370,983, and/or PCT Patent Application No. PCT/US2006/018567, filed May 15, 2006 and published Nov. 23, 2006 as International Publication No. WO 2006/124682; and/or PCT Application No. PCT/US2006/042718, filed Oct. 31, 2006, and published May 10, 2007 as International Publication No. WO 07/053710; and U.S. provisional applications, Ser. No. 60/836,219, filed Aug. 8, 2006; Ser. No. 60/759,992, filed Jan. 18, 2006; and Ser. No. 60/732,245, filed Nov. 1, 2005, which are all hereby incorporated herein by reference in their entireties. Optionally, and preferably, the display is episodically extended and/or actuated, such as to display driving instructions to the driver as the vehicle approaches a waypoint or turn along the selected route, and then retracted after the vehicle has passed the waypoint and continues along the selected route.


Optionally, the display on the video screen may be operable to display an alert to the driver of a potentially hazardous condition detected ahead of or in the forward path of the vehicle. For example, an output of a forward-viewing active night vision system incorporating an imaging sensor or camera device and near-IR floodlighting (such as those described in U.S. Pat. No. 5,877,897 and U.S. patent application Ser. No. 11/651,726, filed Jan. 10, 2007, now U.S. Pat. No. 7,311,406, which are hereby incorporated herein by reference in their entireties), or an output of another suitable forward facing sensor or system, such as a passive far-IR thermal imaging night vision sensor/camera, may be processed by an image processor, such as, for example, an EYEQ™ image processing chip available from Mobileye Vision Technologies Ltd. of Jerusalem, Israel. Such image processors include object detection software (such as the types described in U.S. Pat. No. 7,038,577; and/or Ser. No. 11/315,675, filed Dec. 22, 2005, now U.S. Pat. No. 7,720,580, which are hereby incorporated herein by reference in their entireties), and they analyze image data to detect objects. The image processor or control may determine if a potentially hazardous condition (such as an object or vehicle or person or deer or the like) may exist in the vehicle path and may provide an alert signal (such as by actuation of a visual indicator or an audible indicator or by an enhancement/overlay on a video display screen that is showing a video image to the driver of what the night vision sensor/camera is seeing) to prompt/alert the driver of a potential hazard (such as a deer or a pedestrian or a fallen rock or the like) as needed or appropriate. The display thus may provide an episodic alert so that the driver's attention is drawn to the display alert only when there is a potential hazard detected. Such a system avoids requiring the driver to watch a monitor running video of the camera's output while also looking forward out the windshield, which is not particularly consumer-friendly and simply loads the driver with yet another task.
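
The episodic alert described above amounts to gating the alert output on the detector result rather than continuously streaming video to the driver. A minimal sketch of such gating logic is shown below (illustrative Python, with hypothetical display and chime interfaces and an assumed set of hazard classes).

    # Sketch of the episodic alert gating described above: the display/indicator
    # is only activated while the image processor reports a potential hazard in
    # the vehicle's forward path, so the driver is not asked to monitor a
    # continuous video feed. Names and the detector interface are hypothetical.

    HAZARD_CLASSES = {"pedestrian", "deer", "vehicle", "fallen_rock"}


    def update_alert(detections, display, chime):
        """detections: iterable of (object_class, in_forward_path: bool)."""
        hazard = any(cls in HAZARD_CLASSES and in_path
                     for cls, in_path in detections)
        if hazard:
            display.show_alert()   # e.g., overlay/highlight on the video display
            chime.sound_once()     # audible prompt to draw the driver's attention
        else:
            display.hide_alert()   # display stays quiet when nothing is detected
        return hazard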


Optionally, the mirror reflective element of the mirror assembly may comprise a prismatic mirror reflector or an electrically variable reflectance mirror reflector, such as an electro-optic reflective element assembly or cell, such as an electrochromic reflective element assembly or cell. For example, the rearview mirror assembly may comprise an electro-optic or electrochromic reflective element or cell, such as an electrochromic mirror assembly and electrochromic reflective element utilizing principles disclosed in commonly assigned U.S. Pat. Nos. 6,690,268; 5,140,455; 5,151,816; 6,178,034; 6,154,306; 6,002,544; 5,567,360; 5,525,264; 5,610,756; 5,406,414; 5,253,109; 5,076,673; 5,073,012; 5,117,346; 5,724,187; 5,668,663; 5,910,854; 5,142,407; 4,824,221; 5,818,636; 6,166,847; 6,111,685; 6,392,783; 6,710,906; 6,798,556; 6,554,843; 6,420,036; 5,142,406; 5,442,478 and/or 4,712,879, and/or U.S. patent application Ser. No. 10/054,633, filed Jan. 22, 2002, now U.S. Pat. No. 7,195,381; Ser. No. 11/021,065, filed Dec. 23, 2004, now U.S. Pat. No. 7,255,451; Ser. No. 10/528,269, filed Mar. 17, 2005, now U.S. Pat. No. 7,274,501; Ser. No. 10/533,762, filed May 4, 2005, now U.S. Pat. No. 7,184,190; Ser. No. 10/538,724, filed Jun. 13, 2005, and published on Mar. 9, 2006 as U.S. Publication No. US 2006/0050018; Ser. No. 11/226,628, filed Sep. 14, 2005 and published Mar. 23, 2006 as U.S. Publication No. US 2006/0061008; Ser. No. 10/993,302, filed Nov. 19, 2004, now U.S. Pat. No. 7,338,177; and/or Ser. No. 11/284,543, filed Nov. 22, 2005, now U.S. Pat. No. 7,370,983, and/or International Pat. Publication Nos. WO 2004/098953, published Nov. 18, 2004; WO 2004/042457, published May 21, 2004; WO 2003/084780, published Oct. 16, 2003; and/or WO 2004/026633, published Apr. 1, 2004, which are all hereby incorporated herein by reference in their entireties, and/or such as disclosed in the following publications: N. R. Lynam, “Electrochromic Automotive Day/Night Mirrors”, SAE Technical Paper Series 870636 (1987); N. R. Lynam, “Smart Windows for Automobiles”, SAE Technical Paper Series 900419 (1990); N. R. Lynam and A. Agrawal, “Automotive Applications of Chromogenic Materials”, Large Area Chromogenics: Materials and Devices for Transmittance Control, C. M. Lampert and C. G. Granquist, EDS., Optical Engineering Press, Wash. (1990), which are hereby incorporated herein by reference in their entireties.


Optionally, and preferably, the mirror reflective element may comprise a frameless reflective element, such as by utilizing aspects of the reflective elements described in PCT Application No. PCT/US2006/018567, filed May 15, 2006 and published Nov. 23, 2006 as International Publication No. WO 2006/124682; PCT Application No. PCT/US2004/015424, filed May 18, 2004 and published on Dec. 2, 2004, as International Publication No. WO 2004/103772; and/or U.S. patent application Ser. No. 11/140,396, filed May 27, 2005, now U.S. Pat. No. 7,360,932; Ser. No. 11/226,628, filed Sep. 14, 2005, and published Mar. 23, 2006 as U.S. Publication No. US 2006/0061008; Ser. No. 11/021,065, filed Dec. 23, 2004, now U.S. Pat. No. 7,255,451; Ser. No. 10/528,269, filed Mar. 17, 2005, now U.S. Pat. No. 7,274,501; Ser. No. 10/533,762, filed May 4, 2005, now U.S. Pat. No. 7,184,190; and/or Ser. No. 10/538,724, filed Jun. 13, 2005, and published on Mar. 9, 2006 as U.S. Publication No. US 2006/0050018, which are hereby incorporated herein by reference in their entireties. Optionally, the reflective element may include a metallic perimeter band around the perimeter of the reflective element, such as by utilizing aspects of the reflective elements described in PCT Application No. PCT/US2006/018567, filed May 15, 2006 and published Nov. 23, 2006 as International Publication No. WO 2006/124682; PCT Application No. PCT/US03/29776, filed Sep. 19, 2003 and published Apr. 1, 2004 as International Publication No. WO 2004/026633; and/or PCT Application No. PCT/US03/35381, filed Nov. 5, 2003 and published May 21, 2004 as International Publication No. WO 2004/042457; and/or U.S. patent application Ser. No. 11/021,065, filed Dec. 23, 2004, now U.S. Pat. No. 7,255,451; and/or Ser. No. 11/226,628, filed Sep. 14, 2005 and published Mar. 23, 2006 as U.S. Publication No. US 2006/0061008, which is hereby incorporated herein by reference in their entireties. The frameless reflective element thus is aesthetically pleasing to a person viewing the mirror assembly, since the reflective element (as recessed or partially recessed in the opening of the bezel portion of the mirror casing) does not include a separate frame or bezel portion around its perimeter edge. The metallic perimeter band may be selected to have a desired color or tint to match or contrast a color scheme or the like of the vehicle, such as described in PCT Application No. PCT/US2006/018567, filed May 15, 2006 and published Nov. 23, 2006 as International Publication No. WO 2006/124682; and/or PCT Application No. PCT/US2004/015424, filed May 18, 2004 and published Dec. 2, 2004 as International Publication No. WO 2004/103772, which are hereby incorporated herein by reference in their entireties.


Optionally, use of an elemental semiconductor mirror, such as a silicon metal mirror, such as disclosed in U.S. Pat. Nos. 6,286,965; 6,196,688; 5,535,056; 5,751,489 and 6,065,840, and/or in U.S. patent application Ser. No. 10/993,302, filed Nov. 19, 2004, now U.S. Pat. No. 7,338,177, which are all hereby incorporated herein by reference in their entireties, can be advantageous because such elemental semiconductor mirrors (such as can be formed by depositing a thin film of silicon) can be greater than 50 percent reflecting in the photopic region (measured per SAE J964a), while also being substantially transmitting of light (up to 20 percent or even more). Such silicon mirrors also have the advantage that they can be deposited onto a flat glass substrate and then bent into a curved (such as a convex or aspheric) shape, which is advantageous since many passenger-side exterior rearview mirrors are bent or curved.


Optionally, the mirror assembly may comprise a prismatic mirror assembly, such as a prismatic mirror assembly utilizing aspects described in U.S. Pat. Nos. 6,318,870; 6,598,980; 5,327,288; 4,948,242; 4,826,289; 4,436,371 and 4,435,042, and PCT Application No. PCT/US04/015424, filed May 18, 2004 and published Dec. 2, 2004 as International Publication No. WO 2004/103772; and U.S. patent application Ser. No. 10/933,842, filed Sep. 3, 2004, now U.S. Pat. No. 7,249,860, which are hereby incorporated herein by reference in their entireties. Optionally, the prismatic reflective element may comprise a conventional prismatic reflective element or prism, or may comprise a prismatic reflective element of the types described in PCT Application No. PCT/US03/29776, filed Sep. 19, 2003 and published Apr. 1, 2004 as International Publication No. WO 2004/026633; and/or U.S. patent application Ser. No. 10/709,434, filed May 5, 2004, now U.S. Pat. No. 7,420,756; Ser. No. 10/933,842, filed Sep. 3, 2004, now U.S. Pat. No. 7,249,860; Ser. No. 11/021,065, filed Dec. 23, 2004, now U.S. Pat. No. 7,255,451; Ser. No. 10/528,269, filed Mar. 17, 2005, now U.S. Pat. No. 7,274,501; and/or Ser. No. 10/993,302, filed Nov. 19, 2004, now U.S. Pat. No. 7,338,177, and/or PCT Application No. PCT/US2004/015424, filed May 18, 2004 and published Dec. 2, 2004 as International Publication No. WO 2004/103772, which are all hereby incorporated herein by reference in their entireties, without affecting the scope of the present invention.


Optionally, the reflective element may comprise a bent, wide-angle mirror reflector rather than a flat mirror reflector. If a bent, wide-angle mirror reflector is used, it is preferable that the mirror reflector comprise a glass substrate coated with a bendable reflector coating (such as of silicon as described in U.S. Pat. Nos. 6,065,840; 5,959,792; 5,535,056 and 5,751,489, which are hereby incorporated by reference herein in their entireties).


Optionally, the mirror casing and/or windshield electronics module may be suitable for supporting larger or heavier components or circuitry that otherwise may not have been suitable for mounting or locating at or in a mirror casing. For example, the mirror casing or module may house or support a battery or power pack for various electronic features or components, and/or may support a docking station for docking and/or holding a cellular telephone or hand-held personal data device or the like, such as by utilizing aspects of the systems described in U.S. Pat. No. 6,824,281, and/or PCT Application No. PCT/US03/40611, filed Dec. 19, 2003 and published Jul. 15, 2004 as International Publication No. WO 2004/058540, and/or U.S. patent application Ser. No. 10/510,813, filed Aug. 23, 2002, now U.S. Pat. No. 7,306,276, and/or U.S. patent application Ser. No. 11/842,328, filed Aug. 21, 2007, now U.S. Pat. No. 7,722,199, and Ser. No. 11/861,904, filed Sep. 26, 2007, now U.S. Pat. No. 7,937,667, and/or U.S. provisional applications, Ser. No. 60/839,446, filed Aug. 23, 2006; Ser. No. 60/879,619, filed Jan. 10, 2007; Ser. No. 60/850,700, filed Oct. 10, 2006; and/or Ser. No. 60/847,502, filed Sep. 27, 2006, which are hereby incorporated herein by reference in their entireties.


Optionally, the mirror assembly and/or windshield electronics module may include or incorporate a navigation device that may include navigational circuitry and a GPS antenna to determine the geographical location of the vehicle and to provide routes to targeted or selected destinations, such as by utilizing aspects of known navigational devices and/or the devices of the types described in U.S. Pat. Nos. 4,862,594; 4,937,945; 5,131,154; 5,255,442; 5,632,092; 5,798,688; 5,971,552; 5,924,212; 6,243,003; 6,278,377; 6,420,975; 6,946,978; 6,477,464; 6,678,614 and/or 7,004,593, and/or U.S. patent application Ser. No. 10/645,762, filed Aug. 20, 2003, now U.S. Pat. No. 7,167,796; Ser. No. 10/529,715, filed Mar. 30, 2005, now U.S. Pat. No. 7,657,052; Ser. No. 10/538,724, filed Jun. 13, 2005 and published Mar. 9, 2006 as U.S. Publication No. 2006/0050018; Ser. No. 11/861,904, filed Sep. 26, 2007, now U.S. Pat. No. 7,937,667; and/or Ser. No. 10/964,512, filed Oct. 13, 2004, now U.S. Pat. No. 7,308,341, and/or U.S. provisional applications, Ser. No. 60/879,619, filed Jan. 10, 2007; Ser. No. 60/850,700, filed Oct. 10, 2006; and/or Ser. No. 60/847,502, filed Sep. 27, 2006, which are all hereby incorporated herein by reference in their entireties. Optionally, the mirror or navigation device may include a microphone, whereby the mirror or navigation device may provide voice activated control of the navigation device.


Optionally, for example, the mounting structure and/or mirror casing and/or windshield electronics module may support compass sensors, such as compass sensors that may utilize aspects of the compass systems described in U.S. patent application Ser. No. 11/305,637, filed Dec. 16, 2005, now U.S. Pat. No. 7,329,013; Ser. No. 10/352,691, filed Jan. 28, 2003, now U.S. Pat. No. 6,922,902; Ser. No. 11/284,543, filed Nov. 22, 2005, now U.S. Pat. No. 7,370,983; Ser. No. 11/226,628, filed Sep. 14, 2005, and published on Mar. 23, 2006 as U.S. Publication No. 2006/0061008; and/or Ser. No. 10/933,842, filed Sep. 3, 2004, now U.S. Pat. No. 7,249,860; and/or U.S. Pat. Nos. 7,004,593; 4,546,551; 5,699,044; 4,953,305; 5,576,687; 5,632,092; 5,677,851; 5,708,410; 5,737,226; 5,802,727; 5,878,370; 6,087,953; 6,173,508; 6,222,460; 6,513,252 and 6,642,851, and/or PCT Application No. PCT/US2004/015424, filed May 18, 2004 and published Dec. 2, 2004 as International Publication No. WO 2004/103772, and/or European patent application, published Oct. 11, 2000 under Publication No. EP 0 1043566, which are all hereby incorporated herein by reference in their entireties. The compass circuitry may include the compass sensor, such as a magneto-responsive sensor, such as a magneto-resistive sensor, such as the types disclosed in U.S. Pat. Nos. 5,255,442; 5,632,092; 5,802,727; 6,173,501; 6,427,349 and 6,513,252 (which are hereby incorporated herein by reference in their entireties), a magneto-capacitive sensor, a Hall-effect sensor, such as the types described in U.S. Pat. Nos. 6,278,271; 5,942,895 and 6,184,679 (which are hereby incorporated herein by reference in their entireties), a magneto-inductive sensor, such as described in U.S. Pat. No. 5,878,370 (which is hereby incorporated herein by reference in its entirety), a magneto-impedance sensor, such as the types described in PCT Publication No. WO 2004/076971, published Sep. 10, 2004 (which is hereby incorporated herein by reference in its entirety), or a flux-gate sensor or the like, and/or may comprise a compass chip, such as described in U.S. patent application Ser. No. 11/226,628, filed Sep. 14, 2005, and published on Mar. 23, 2006 as U.S. Publication No. 2006/0061008; and/or Ser. No. 11/284,543, filed Nov. 22, 2005, now U.S. Pat. No. 7,370,983, which are hereby incorporated herein by reference in their entireties. By positioning the compass sensors at a fixed location, further processing and calibration of the sensors to accommodate adjustment or movement of the sensors are not necessary.


Optionally, the mounting structure and/or mirror casing and/or windshield electronics module may support one or more imaging sensors or cameras, and may fixedly support them with the cameras set with a desired or appropriate forward and/or rearward field of view. For example, the camera may be operable in conjunction with a forward facing imaging system, such as a rain sensing system, such as described in U.S. Pat. Nos. 6,968,736; 6,806,452; 6,516,664; 6,353,392; 6,313,454; 6,250,148; 6,341,523 and 6,824,281, and in U.S. patent application Ser. No. 10/958,087, filed Oct. 4, 2004, now U.S. Pat. No. 7,188,963; and/or Ser. No. 11/201,661, filed Aug. 11, 2005, now U.S. Pat. No. 7,480,149, which are all hereby incorporated herein by reference in their entireties. The mounting structure and/or mirror casing may be pressed or loaded against the interior surface of the windshield to position or locate the image sensor in close proximity to the windshield and/or to optically couple the image sensor at the windshield. The mounting structure and/or mirror casing may include an aperture or apertures at its forward facing or mounting surface and the windshield may include apertures through the opaque frit layer (typically disposed at a mirror mounting location of a windshield) or the windshield may not include such a frit layer, depending on the particular application.


Optionally, the image sensor may be operable in conjunction with a forward or rearward vision system, such as an automatic headlamp control system and/or a lane departure warning system or object detection system and/or other forward vision or imaging systems, such as imaging or vision systems of the types described in U.S. Pat. Nos. 7,038,577; 7,005,974; 7,004,606; 6,690,268; 6,946,978; 6,757,109; 6,717,610; 6,396,397; 6,201,642; 6,353,392; 6,313,454; 5,550,677; 5,670,935; 5,796,094; 5,715,093; 5,877,897; 6,097,023 and 6,498,620, and/or U.S. patent application Ser. No. 09/441,341, filed Nov. 16, 1999, now U.S. Pat. No. 7,339,149; Ser. No. 10/422,512, filed Apr. 24, 2003, now U.S. Pat. No. 7,123,168; Ser. No. 11/239,980, filed Sep. 30, 2005, now U.S. Pat. No. 7,881,496; Ser. No. 11/672,070, filed Feb. 7, 2007, now U.S. Pat. No. 8,698,894; and/or Ser. No. 11/315,675, filed Dec. 22, 2005, now U.S. Pat. No. 7,720,580, and/or U.S. provisional applications, Ser. No. 60/628,709, filed Nov. 17, 2004; Ser. No. 60/614,644, filed Sep. 30, 2004; Ser. No. 60/618,686, filed Oct. 14, 2004; Ser. No. 60/731,183, filed Oct. 28, 2005; and/or Ser. No. 60/765,797, filed Feb. 7, 2006, and/or International PCT Application No. PCT/US2006/041709, filed Oct. 27, 2006, and published May 10, 2007 as International Publication No. WO 07/053404, which are hereby incorporated herein by reference in their entireties. The mirror casing thus may support one or more rearward facing imaging sensors or cameras, such as for rearward vision or imaging systems, such as for a rear vision system or back up aid of the types described in U.S. Pat. Nos. 6,717,610 and/or 6,201,642 (which are hereby incorporated herein by reference in their entireties), and/or a cabin monitoring system or baby view system of the types described in U.S. Pat. No. 6,690,268 (which is hereby incorporated herein by reference in its entirety), and/or the like.


Optionally, the fixed mounting structure and/or mirror casing and/or windshield electronics module may house or support a display device, such as a heads up display device (such as the types described in U.S. patent application Ser. No. 11/105,757, filed Apr. 14, 2005, now U.S. Pat. No. 7,526,103; and Ser. No. 11/029,695, filed Jan. 5, 2005, now U.S. Pat. No. 7,253,723, which are hereby incorporated herein by reference in their entireties) that is operable to project a display at the area in front of the driver to enhance viewing of the display information without adversely affecting the driver's forward field of view. For example, the mirror casing may support a heads up display (HUD), such as a MicroHUD™ head-up display system available from MicroVision Inc. of Bothell, Wash., and/or such as a HUD that utilizes aspects described in U.S. patent application Ser. No. 11/105,757, filed Apr. 14, 2005, now U.S. Pat. No. 7,526,103; and Ser. No. 11/029,695, filed Jan. 5, 2005, now U.S. Pat. No. 7,253,723, which are hereby incorporated herein by reference in their entireties. For example, MicroVision's MicroHUD™ combines a MEMS-based micro display with an optical package of lenses and mirrors to achieve a compact high-performance HUD module that reflects a virtual image off the windscreen that appears to the driver to be close to the front of the car. This laser-scanning display can outperform many miniature flat panel LCD display screens because it can be clearly viewed in the brightest conditions and also dimmed to the very low brightness levels required for safe night-time driving. For example, such a display device may be located at or in the mirror casing/mounting structure/windshield electronics module and may be non-movably mounted at the mirror casing or mounting structure or windshield electronics module, and may be operable to project the display information at the windshield of the vehicle so as to be readily viewed by the driver of the vehicle in the driver's forward field of view.


The mounting structure and/or mirror casing and/or windshield electronics module may be fixedly attached to or supported at the vehicle windshield and may extend upward toward the headliner of the vehicle. Thus, the mirror assembly of the present invention may have enhanced wire management and may substantially conceal the wiring of the electronic components/accessories between the circuitry within the mirror casing and the headliner at the upper portion of the vehicle windshield. Optionally, the mirror assembly may include wire management elements, such as the types described in U.S. patent application Ser. No. 11/226,628, filed Sep. 14, 2005, and published Mar. 23, 2006 as U.S. Publication No. 2006/0061008; and/or Ser. No. 11/584,697, filed Oct. 20, 2006, now U.S. Pat. No. 7,510,287; and/or U.S. provisional application Ser. No. 60/729,430, filed Oct. 21, 2005, which are hereby incorporated herein by reference in their entireties, to conceal the wires extending between an upper portion of the mirror casing and the vehicle headliner (or overhead console). Optionally, the mirror casing and/or mounting structure and/or windshield electronics module may abut the headliner and/or may be an extension of an overhead console of the vehicle (such as by utilizing aspects described in U.S. patent application Ser. No. 10/538,724, filed Jun. 13, 2005 and published Mar. 9, 2006 as U.S. Publication No. 2006/0050018, and/or U.S. patent application Ser. No. 10/510,813, filed Aug. 23, 2002, now U.S. Pat. No. 7,306,276, which are hereby incorporated herein by reference in their entireties). The mirror assembly of the present invention thus may allow for utilization of the area above the mirror reflective element for additional mirror content, such as additional electronic accessories or circuitry, and thus may provide for or accommodate additional mirror content/circuitry and/or vehicle content/circuitry.


Optionally, the mirror assembly and/or reflective element assembly may include one or more displays, such as for the accessories or circuitry described herein. The displays may comprise any suitable display, such as displays of the types described in U.S. Pat. Nos. 5,530,240 and/or 6,329,925, which are hereby incorporated herein by reference in their entireties, or may be display-on-demand or transflective type displays or other displays, such as the types described in U.S. Pat. Nos. 6,690,268; 5,668,663 and/or 5,724,187, and/or U.S. patent application Ser. No. 10/054,633, filed Jan. 22, 2002, now U.S. Pat. No. 7,195,381; Ser. No. 11/021,065, filed Dec. 23, 2004, now U.S. Pat. No. 7,255,451; Ser. No. 10/528,269, filed Mar. 17, 2005, now U.S. Pat. No. 7,274,501; Ser. No. 10/533,762, filed May 4, 2005, now U.S. Pat. No. 7,184,190; Ser. No. 10/538,724, filed Jun. 13, 2005 and published Mar. 9, 2006 as U.S. Publication No. 2006/0050018; Ser. No. 11/226,628, filed Sep. 14, 2005 and published Mar. 23, 2006 as U.S. Publication No. 2006/0061008; Ser. No. 10/993,302, filed Nov. 19, 2004, now U.S. Pat. No. 7,338,177; and/or Ser. No. 11/284,543, filed Nov. 22, 2005, now U.S. Pat. No. 7,370,983, and/or PCT Patent Application No. PCT/US2006/018567, filed May 15, 2006 and published Nov. 23, 2006 as International Publication No. WO 2006/124682; and/or PCT Application No. PCT/US2006/042718, filed Oct. 31, 2006, and published May 10, 2007 as International Publication No. WO 07/053710; and/or U.S. provisional applications, Ser. No. 60/836,219, filed Aug. 8, 2006; Ser. No. 60/759,992, filed Jan. 18, 2006; and Ser. No. 60/732,245, filed Nov. 1, 2005, and/or PCT Application No. PCT/US03/40611, filed Dec. 19, 2003 and published Jul. 15, 2004 as International Publication No. WO 2004/058540, which are all hereby incorporated herein by reference in their entireties, or may include or incorporate video displays or the like, such as the types described in U.S. Pat. No. 6,690,268 and/or PCT Application No. PCT/US03/40611, filed Dec. 19, 2003 and published Jul. 15, 2004 as International Publication No. WO 2004/058540, U.S. patent application Ser. No. 10/538,724, filed Jun. 13, 2005, and published Mar. 9, 2006 as U.S. Publication No. 2006/0050018; and/or Ser. No. 11/284,543, filed Nov. 22, 2005, now U.S. Pat. No. 7,370,983, which are hereby incorporated herein by reference in their entireties. Optionally, the mirror assembly may include a video display that is selectively positionable, such as extendable/retractable or pivotable or foldable so as to be selectively positioned at a side or below the mirror casing when in use and storable within or at least partially within the mirror casing when not in use. The display may automatically extend/pivot to the in-use position in response to an actuating event, such as when the vehicle is shifted into its reverse gear for a rear vision system or back up aid.


Such a video mirror display (or other display) may be associated with a rearward facing camera at a rear of the vehicle and having a rearward field of view, such as at the license plate holder of the vehicle or at a rear trim portion (such as described in U.S. patent application Ser. No. 11/672,070, filed Feb. 7, 2007, now U.S. Pat. No. 8,698,894, and U.S. provisional application Ser. No. 60/765,797, filed Feb. 7, 2006, which is hereby incorporated herein by reference in its entirety). The image data captured by the rearward facing camera may be communicated to the control or video display at the rearview mirror assembly (or elsewhere in the vehicle, such as at an overhead console or accessory module or the like) via any suitable communication means or protocol. For example, the image data may be communicated via a fiber optic cable or a twisted pair of wires, or may be communicated wirelessly, such as via a BLUETOOTH® communication link or protocol or the like, or may be superimposed on a power line, such as a 12 volt power line of the vehicle, such as by utilizing aspects of the systems described in U.S. patent application Ser. No. 11/239,980, filed Sep. 30, 2005, now U.S. Pat. No. 7,881,496, which is hereby incorporated herein by reference in its entirety.


Optionally, the mirror assembly may include one or more user inputs for controlling or activating/deactivating one or more electrical accessories or devices of or associated with the mirror assembly. For example, the mirror assembly may comprise any type of switches or buttons, such as touch or proximity sensing switches, such as touch or proximity switches of the types described in PCT Application No. PCT/US03/40611, filed Dec. 19, 2003 and published Jul. 15, 2004 as International Publication No. WO 2004/058540; and/or PCT Application No. PCT/US2004/015424, filed May 18, 2004 and published Dec. 2, 2004 as International Publication No. WO 2004/103772, and/or U.S. Pat. Nos. 6,001,486; 6,310,611; 6,320,282 and 6,627,918, and/or U.S. patent application Ser. No. 11/021,065, filed Dec. 23, 2004, now U.S. Pat. No. 7,255,451; and/or U.S. patent application Ser. No. 09/817,874, filed Mar. 26, 2001, now U.S. Pat. No. 7,224,324; Ser. No. 10/956,749, filed Oct. 1, 2004, now U.S. Pat. No. 7,446,924; Ser. No. 10/933,842, filed Sep. 3, 2004, now U.S. Pat. No. 7,249,860; Ser. No. 11/021,065, filed Dec. 23, 2004, now U.S. Pat. No. 7,255,451; and/or Ser. No. 11/140,396, filed May 27, 2005, now U.S. Pat. No. 7,360,932, which are hereby incorporated herein by reference in their entireties, or the inputs may comprise other types of buttons or switches, such as those described in U.S. Pat. No. 6,501,387, and/or U.S. patent application Ser. No. 11/029,695, filed Jan. 5, 2005, now U.S. Pat. No. 7,253,723; and/or Ser. No. 11/451,639, filed Jun. 13, 2006, now U.S. Pat. No. 7,527,403, which are hereby incorporated herein by reference in their entireties, or such as fabric-made position detectors, such as those described in U.S. Pat. Nos. 6,504,531; 6,501,465; 6,492,980; 6,452,479; 6,437,258 and 6,369,804, which are hereby incorporated herein by reference in their entireties. Other types of switches or buttons or inputs or sensors may be incorporated to provide the desired function, without affecting the scope of the present invention. The manual inputs or user actuatable inputs or actuators may control or adjust or activate/deactivate one or more accessories or elements or features. For touch sensitive inputs or applications or switches, the mirror assembly or accessory module or input may, when activated, provide a positive feedback (such as activation of an illumination source or the like, or such as via an audible signal, such as a chime or the like, or a tactile or haptic signal, or a rumble device or signal or the like) to the user so that the user is made aware that the input was successfully activated.


Optionally, the user inputs or buttons may comprise user inputs for a garage door opening system, such as a vehicle based garage door opening system of the types described in U.S. Pat. Nos. 7,023,322; 6,396,408; 6,362,771 and 5,798,688, which are hereby incorporated herein by reference in their entireties. The user inputs may also or otherwise function to activate and deactivate a display or function or accessory, and/or may activate/deactivate and/or commence a calibration of a compass system of the mirror assembly and/or vehicle. Optionally, the user inputs may also or otherwise comprise user inputs for a telematics system of the vehicle, such as, for example, an ONSTAR® system as found in General Motors vehicles and/or such as described in U.S. Pat. Nos. 4,862,594; 4,937,945; 5,131,154; 5,255,442; 5,632,092; 5,798,688; 5,971,552; 5,924,212; 6,243,003; 6,278,377; 6,420,975; 6,946,978; 6,477,464; 6,678,614 and/or 7,004,593, and/or U.S. patent application Ser. No. 10/645,762, filed Aug. 20, 2003, now U.S. Pat. No. 7,167,796; Ser. No. 10/529,715, filed Mar. 30, 2005, now U.S. Pat. No. 7,657,052; Ser. No. 10/538,724, filed Jun. 13, 2005 and published Mar. 9, 2006 as U.S. Publication No. 2006/0050018; and/or Ser. No. 10/964,512, filed Oct. 13, 2004, now U.S. Pat. No. 7,308,341, which are all hereby incorporated herein by reference in their entireties.


Optionally, the display and inputs may be associated with various accessories or systems, such as, for example, a tire pressure monitoring system or a passenger air bag status or a garage door opening system or a telematics system or any other accessory or system of the mirror assembly or of the vehicle or of an accessory module or console of the vehicle, such as an accessory module or console of the types described in U.S. Pat. Nos. 6,877,888; 6,690,268; 6,824,281; 6,672,744; 6,386,742 and 6,124,886, and/or PCT Application No. PCT/US03/40611, filed Dec. 19, 2003 and published Jul. 15, 2004 as International Publication No. WO 2004/058540, and/or PCT Application No. PCT/US04/15424, filed May 18, 2004 and published Dec. 2, 2004 as International Publication No. WO 2004/103772, and/or U.S. patent application Ser. No. 10/510,813, filed Aug. 23, 2002, now U.S. Pat. No. 7,306,276, which are hereby incorporated herein by reference in their entireties.


Optionally, the mirror assembly or accessory module may fixedly or non-movably support one or more other accessories or features, such as one or more electrical or electronic devices or accessories. For example, illumination sources or lights, such as map reading lights or one or more other lights or illumination sources, such as illumination sources of the types disclosed in U.S. Pat. Nos. 6,690,268; 5,938,321; 5,813,745; 5,820,245; 5,673,994; 5,649,756; 5,178,448; 5,671,996; 4,646,210; 4,733,336; 4,807,096; 6,042,253; 6,971,775 and/or 5,669,698, and/or U.S. patent application Ser. No. 10/054,633, filed Jan. 22, 2002, now U.S. Pat. No. 7,195,381; and/or Ser. No. 10/933,842, filed Sep. 3, 2004, now U.S. Pat. No. 7,249,860, which are hereby incorporated herein by reference in their entireties, may be included in the mirror assembly. The illumination sources and/or the circuit board may be connected to one or more buttons or inputs for activating and deactivating the illumination sources.


Optionally, the mirror assembly may also or otherwise include other accessories, such as microphones, such as analog microphones or digital microphones or the like, such as microphones of the types disclosed in U.S. Pat. Nos. 6,243,003; 6,278,377 and/or 6,420,975, and/or in U.S. patent application Ser. No. 10/529,715, filed Mar. 30, 2005, now U.S. Pat. No. 7,657,052. Optionally, the mirror assembly may also or otherwise include other accessories, such as a telematics system, speakers, antennas, including global positioning system (GPS) or cellular phone antennas, such as disclosed in U.S. Pat. No. 5,971,552, a communication module, such as disclosed in U.S. Pat. No. 5,798,688, a voice recorder, a blind spot detection and/or indication system, such as disclosed in U.S. Pat. Nos. 7,038,577; 6,882,287; 5,929,786 and/or 5,786,772, and/or U.S. patent application Ser. No. 11/315,675, filed Dec. 22, 2005, now U.S. Pat. No. 7,720,580; and/or PCT Application No. PCT/US2006/026148, filed Jul. 5, 2006 and published Jan. 11, 2007 as International Publication No. WO 2007/005942, transmitters and/or receivers, such as for a garage door opener or a vehicle door unlocking system or the like (such as a remote keyless entry system), a digital network, such as described in U.S. Pat. No. 5,798,575, a hands-free phone attachment, an imaging system or components or circuitry or display thereof, such as an imaging and/or display system of the types described in U.S. Pat. Nos. 6,690,268 and 6,847,487; and/or U.S. provisional applications, Ser. No. 60/614,644, filed Sep. 30, 2004; Ser. No. 60/618,686, filed Oct. 14, 2004; and/or Ser. No. 60/628,709, filed Nov. 17, 2004; and/or U.S. patent application Ser. No. 11/105,757, filed Apr. 14, 2005, now U.S. Pat. No. 7,526,103; Ser. No. 11/334,139, filed Jan. 18, 2006, now U.S. Pat. No. 7,400,435; and/or Ser. No. 11/239,980, filed Sep. 30, 2005, now U.S. Pat. No. 7,881,496, a video device for internal cabin surveillance (such as for sleep detection or driver drowsiness detection or the like) and/or video telephone function, such as disclosed in U.S. Pat. Nos. 5,760,962 and/or 5,877,897, an occupant detection system and/or interior cabin monitoring system (such as the types described in U.S. Pat. Nos. 6,019,411 and/or 6,690,268, and/or PCT Application No. PCT/US2005/042504, filed Nov. 22, 2005 and published Jun. 1, 2006 as International Publication No. WO 2006/058098; and/or PCT Application No. PCT/US94/01954, filed Feb. 25, 1994), a heating element, particularly for an exterior mirror application, such as the types described in U.S. patent application Ser. No. 11/334,139, filed Jan. 18, 2006, now U.S. Pat. No. 7,400,435, a remote keyless entry receiver, a seat occupancy detector, a remote starter control, a yaw sensor, a clock, a carbon monoxide detector, status displays, such as displays that display a status of a door of the vehicle, a transmission selection (4 wd/2 wd or traction control (TCS) or the like), an antilock braking system, a road condition (that may warn the driver of icy road conditions) and/or the like, a trip computer, a tire pressure monitoring system (TPMS) receiver (such as described in U.S. Pat. Nos. 6,124,647; 6,294,989; 6,445,287; 6,472,979; 6,731,205, and/or U.S. patent application Ser. No. 11/232,324, filed Sep. 21, 2005, now U.S. Pat. No. 7,423,522), and/or an ONSTAR® system and/or any other accessory or circuitry or the like (with all of the above-referenced U.S. patents and PCT applications and U.S. patent applications and U.S. provisional applications being commonly assigned, and with the disclosures of the referenced U.S. patents and PCT applications and U.S. patent applications and U.S. provisional applications being hereby incorporated herein by reference in their entireties).


Changes and modifications to the specifically described embodiments may be carried out without departing from the principles of the present invention, which is intended to be limited only by the scope of the appended claims as interpreted according to the principles of patent law.

Claims
  • 1. A vehicular forward-sensing system, said vehicular forward-sensing system comprising: a radar sensor;wherein, with said radar sensor disposed at a vehicle, said radar sensor has a sensing direction forward of the vehicle;wherein said radar sensor transmits at a frequency of at least 60 GHz;an image sensor;wherein, with said image sensor disposed at the vehicle, said image sensor has a viewing direction forward of the vehicle;wherein both said radar sensor and said image sensor are disposed within a windshield electronics module that is removably installed within an interior cabin of the vehicle at an upper region of a windshield of the vehicle and that is removable from the upper region of the windshield of the vehicle as a self-contained unit for service or replacement;said radar sensor and said image sensor cooperating in a way that enhances the capabilities of said vehicular forward-sensing system to detect objects present in a path of forward travel of the vehicle;a control responsive to an output of said radar sensor and responsive to an output of said image sensor;wherein said control comprises an image processing chip;wherein responsive to said image sensor viewing an object present in the path of forward travel of the vehicle and to said radar sensor sensing the object present in the path of forward travel of the vehicle, said control determines that the object is an object of interest;wherein determination by said control that the object is an object of interest comprises processing by the image processing chip of image data of the object captured by said image sensor at a portion of an image plane of said image sensor that is spatially related to a location of the object present in the path of forward travel of the vehicle; andwherein the object present in the path of forward travel of the vehicle comprises another vehicle present in the path of forward travel of the vehicle.
  • 2. The vehicular forward-sensing system of claim 1, wherein said radar sensor comprises a phased array antenna.
  • 3. The vehicular forward-sensing system of claim 2, wherein said control determines the location of the other vehicle present in the path of forward travel of the vehicle at least in part via steering a radar beam formed by the phased array antenna of said radar sensor.
  • 4. The vehicular forward-sensing system of claim 3, wherein said radar sensor utilizes digital beam steering.
  • 5. The vehicular forward-sensing system of claim 1, wherein said control at least in part controls an adaptive cruise control system of the vehicle.
  • 6. The vehicular forward-sensing system of claim 1, wherein said image sensor captures image data for an automatic headlamp control system of the vehicle and for a lane departure warning system of the vehicle.
  • 7. The vehicular forward-sensing system of claim 1, wherein responsive to a determination at said control that the other vehicle present in the path of forward travel of the vehicle causes a potentially hazardous condition, a driver of the vehicle is alerted to the determined potentially hazardous condition.
  • 8. The vehicular forward-sensing system of claim 7, wherein the driver of the vehicle is visually alerted to the determined potentially hazardous condition via a visual indication displayed on a display screen disposed in the interior cabin of the vehicle.
  • 9. The vehicular forward-sensing system of claim 7, wherein the driver of the vehicle is visually alerted to the determined potentially hazardous condition via an overlay on an image displayed on a display screen disposed in the interior cabin of the vehicle.
  • 10. The vehicular forward-sensing system of claim 7, wherein the driver of the vehicle is visually alerted to the determined potentially hazardous condition via an enhancement of an image displayed on a display screen disposed in the interior cabin of the vehicle.
  • 11. The vehicular forward-sensing system of claim 7, wherein the driver of the vehicle is audibly alerted to the determined potentially hazardous condition.
  • 12. The vehicular forward-sensing system of claim 1, wherein said radar sensor and said image sensor share, at least in part, common circuitry.
  • 13. The vehicular forward-sensing system of claim 12, wherein said image processing chip and said common circuitry are housed in said windshield electronics module installed within the interior cabin of the vehicle at the upper region of the windshield of the vehicle.
  • 14. The vehicular forward-sensing system of claim 12, wherein said common circuitry includes the image processing chip.
  • 15. The vehicular forward-sensing system of claim 1, wherein said radar sensor utilizes digital beam forming.
  • 16. The vehicular forward-sensing system of claim 1, wherein the portion of the image plane of said image sensor that is spatially related to the location of the other vehicle present in the path of forward travel of the vehicle determines, at least in part, that the other vehicle in the path of forward travel of the vehicle causes a potentially hazardous condition.
  • 17. The vehicular forward-sensing system of claim 1, wherein said radar sensor utilizes beam aiming.
  • 18. The vehicular forward-sensing system of claim 1, wherein said radar sensor utilizes beam selection.
  • 19. The vehicular forward-sensing system of claim 1, wherein said image sensor views through the windshield of the vehicle at a location that is near where an interior rearview mirror assembly of the vehicle is located.
  • 20. The vehicular forward-sensing system of claim 19, wherein said radar sensor utilizes digital beam steering.
  • 21. A vehicular forward-sensing system, said vehicular forward-sensing system comprising: a radar sensor;wherein, with said radar sensor disposed at a vehicle, said radar sensor has a sensing direction forward of the vehicle;wherein said radar sensor transmits at a frequency of at least 60 GHz;an image sensor;wherein, with said image sensor disposed at the vehicle, said image sensor has a viewing direction forward of the vehicle;wherein both said radar sensor and said image sensor are disposed within a windshield electronics module that is removably installed within an interior cabin of the vehicle at an upper region of a windshield of the vehicle and that is removable from the upper region of the windshield of the vehicle as a self-contained unit for service or replacement;said radar sensor and said image sensor cooperating in a way that enhances the capabilities of said vehicular forward-sensing system to detect objects present in a path of forward travel of the vehicle;a control responsive to an output of said radar sensor and responsive to an output of said image sensor;wherein said control comprises an image processing chip;wherein responsive to said image sensor viewing an object present in the path of forward travel of the vehicle and to said radar sensor sensing the object present in the path of forward travel of the vehicle, said control determines that the object is an object of interest;wherein determination by said control that the object is an object of interest comprises processing by the image processing chip of image data of the object captured by said image sensor at a portion of an image plane of said image sensor that is spatially related to a location of the object present in the path of forward travel of the vehicle; andwherein said control determines that the object present in the path of forward travel of the vehicle causes a potentially hazardous condition.
  • 22. The vehicular forward-sensing system of claim 21, wherein the portion of the image plane of said image sensor that is spatially related to the location of the object present in the path of forward travel of the vehicle determines, at least in part, that the object in the path of forward travel of the vehicle causes a potentially hazardous condition.
  • 23. The vehicular forward-sensing system of claim 22, wherein the object present in the path of forward travel of the vehicle comprises a person present in the path of forward travel of the vehicle.
  • 24. The vehicular forward-sensing system of claim 22, wherein the object present in the path of forward travel of the vehicle comprises a deer present in the path of forward travel of the vehicle.
  • 25. The vehicular forward-sensing system of claim 21, wherein responsive to the determination at said control that the object present in the path of forward travel of the vehicle causes a potentially hazardous condition, a driver of the vehicle is alerted to the determined potentially hazardous condition.
  • 26. The vehicular forward-sensing system of claim 25, wherein the driver of the vehicle is visually alerted to the determined potentially hazardous condition via a visual indication displayed on a display screen disposed in the interior cabin of the vehicle.
  • 27. The vehicular forward-sensing system of claim 25, wherein the driver of the vehicle is visually alerted to the determined potentially hazardous condition via an overlay on an image displayed on a display screen disposed in the interior cabin of the vehicle.
  • 28. The vehicular forward-sensing system of claim 25, wherein the driver of the vehicle is visually alerted to the determined potentially hazardous condition via an enhancement of an image displayed on a display screen disposed in the interior cabin of the vehicle.
  • 29. The vehicular forward-sensing system of claim 25, wherein the driver of the vehicle is audibly alerted to the determined potentially hazardous condition.
  • 30. The vehicular forward-sensing system of claim 21, wherein said radar sensor and said image sensor share, at least in part, common circuitry.
  • 31. The vehicular forward-sensing system of claim 30, wherein said image processing chip and said common circuitry are housed in said windshield electronics module installed within the interior cabin of the vehicle at the upper region of the windshield of the vehicle.
  • 32. The vehicular forward-sensing system of claim 31, wherein said common circuitry includes the image processing chip.
  • 33. The vehicular forward-sensing system of claim 31, wherein said control at least in part controls an adaptive cruise control system of the vehicle.
  • 34. The vehicular forward-sensing system of claim 31, wherein said image sensor captures image data for an automatic headlamp control system of the vehicle and for a lane departure warning system of the vehicle.
  • 35. The vehicular forward-sensing system of claim 31, wherein said image sensor views through the windshield of the vehicle at a location that is near where an interior rearview mirror assembly of the vehicle is located.
  • 36. The vehicular forward-sensing system of claim 31, wherein said radar sensor utilizes beam aiming.
  • 37. The vehicular forward-sensing system of claim 31, wherein said radar sensor utilizes beam selection.
  • 38. The vehicular forward-sensing system of claim 31, wherein said radar sensor comprises a phased array antenna.
  • 39. The vehicular forward-sensing system of claim 38, wherein said control determines the location of the object present in the path of forward travel of the vehicle at least in part via steering a radar beam formed by the phased array antenna of said radar sensor.
  • 40. The vehicular forward-sensing system of claim 31, wherein said radar sensor utilizes digital beam steering.
  • 41. The vehicular forward-sensing system of claim 31, wherein the object present in the path of forward travel of the vehicle comprises a person present in the path of forward travel of the vehicle.
  • 42. The vehicular forward-sensing system of claim 31, wherein the object present in the path of forward travel of the vehicle comprises another vehicle present in the path of forward travel of the vehicle.
  • 43. A vehicular forward-sensing system, said vehicular forward-sensing system comprising: a radar sensor;wherein, with said radar sensor disposed at a vehicle, said radar sensor has a sensing direction forward of the vehicle;wherein said radar sensor transmits at a frequency of at least 60 GHz;an image sensor;wherein, with said image sensor disposed at the vehicle, said image sensor has a viewing direction forward of the vehicle;wherein both said radar sensor and said image sensor are disposed within a windshield electronics module that is removably installed within an interior cabin of the vehicle at an upper region of a windshield of the vehicle and that is removable from the upper region of the windshield of the vehicle as a self-contained unit for service or replacement;said radar sensor and said image sensor cooperating in a way that enhances the capabilities of said vehicular forward-sensing system to detect objects present in a path of forward travel of the vehicle;a control responsive to an output of said radar sensor and responsive to an output of said image sensor;wherein said control comprises an image processing chip;wherein responsive to said image sensor viewing an object present in the path of forward travel of the vehicle and to said radar sensor sensing the object present in the path of forward travel of the vehicle, said control determines that the object is an object of interest;wherein determination by said control that the object is an object of interest comprises processing by the image processing chip of image data of the object captured by said image sensor at a portion of an image plane of said image sensor that is spatially related to a location of the object present in the path of forward travel of the vehicle;wherein said radar sensor and said image sensor share, at least in part, common circuitry housed in said windshield electronics module installed within the interior cabin of the vehicle at the upper region of the windshield of the vehicle; andwherein the object present in the path of forward travel of the vehicle comprises one selected from the group consisting of (i) another vehicle present in the path of forward travel of the vehicle, (ii) a person present in the path of forward travel of the vehicle and (iii) a deer present in the path of forward travel of the vehicle.
  • 44. The vehicular forward-sensing system of claim 43, wherein said common circuitry includes the image processing chip.
  • 45. The vehicular forward-sensing system of claim 43, wherein said control at least in part controls an adaptive cruise control system of the vehicle.
  • 46. The vehicular forward-sensing system of claim 43, wherein said image sensor captures image data for an automatic headlamp control system of the vehicle and for a lane departure warning system of the vehicle.
  • 47. The vehicular forward-sensing system of claim 43, wherein said image sensor views through the windshield of the vehicle at a location that is near where an interior rearview mirror assembly of the vehicle is located.
  • 48. The vehicular forward-sensing system of claim 43, wherein said radar sensor utilizes beam aiming.
  • 49. The vehicular forward-sensing system of claim 43, wherein said radar sensor utilizes beam selection.
  • 50. The vehicular forward-sensing system of claim 43, wherein said radar sensor comprises a phased array antenna.
  • 51. The vehicular forward-sensing system of claim 50, wherein said control determines the location of the object present in the path of forward travel of the vehicle at least in part via steering a radar beam formed by the phased array antenna of said radar sensor.
  • 52. The vehicular forward-sensing system of claim 43, wherein said radar sensor utilizes digital beam steering.
  • 53. The vehicular forward-sensing system of claim 43, wherein the object present in the path of forward travel of the vehicle comprises a deer present in the path of forward travel of the vehicle.
  • 54. The vehicular forward-sensing system of claim 43, wherein the object present in the path of forward travel of the vehicle comprises another vehicle present in the path of forward travel of the vehicle.
  • 55. The vehicular forward-sensing system of claim 43, wherein the object present in the path of forward travel of the vehicle comprises a person present in the path of forward travel of the vehicle.
  • 56. The vehicular forward-sensing system of claim 55, wherein responsive to a determination at said control that the person present in the path of forward travel of the vehicle causes a potentially hazardous condition, a driver of the vehicle is alerted to the determined potentially hazardous condition.
  • 57. The vehicular forward-sensing system of claim 56, wherein the driver of the vehicle is visually alerted to the determined potentially hazardous condition via a visual indication displayed on a display screen disposed in the interior cabin of the vehicle.
  • 58. The vehicular forward-sensing system of claim 56, wherein the driver of the vehicle is visually alerted to the determined potentially hazardous condition via an overlay on an image displayed on a display screen disposed in the interior cabin of the vehicle.
  • 59. The vehicular forward-sensing system of claim 56, wherein the driver of the vehicle is visually alerted to the determined potentially hazardous condition via an enhancement of an image displayed on a display screen disposed in the interior cabin of the vehicle.
  • 60. The vehicular forward-sensing system of claim 56, wherein the driver of the vehicle is audibly alerted to the determined potentially hazardous condition.
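By way of a non-limiting illustration of the sensor-fusion logic recited in claims 1, 21 and 43 above, the following Python sketch shows one possible way a control could map a radar detection forward of the vehicle onto the image plane of the forward-viewing image sensor and then process only the spatially related portion of the captured image data to determine whether the detected object is an object of interest. The pinhole camera model, the mounting-height assumption, the placeholder classifier and all names used here are illustrative assumptions only and do not limit the claims.

from dataclasses import dataclass
import math

@dataclass
class RadarDetection:
    range_m: float       # distance to the detected object, meters
    azimuth_rad: float   # bearing relative to the vehicle's forward axis, radians

@dataclass
class CameraModel:
    fx: float            # focal length in pixels, horizontal
    fy: float            # focal length in pixels, vertical
    cx: float            # principal point, horizontal
    cy: float            # principal point, vertical
    height_m: float      # assumed mounting height of the windshield module, meters

def project_to_image(det: RadarDetection, cam: CameraModel):
    """Map a radar detection to pixel coordinates, assuming a pinhole camera
    co-located with the radar sensor in the windshield electronics module."""
    x_lat = det.range_m * math.sin(det.azimuth_rad)   # lateral offset, meters
    z_fwd = det.range_m * math.cos(det.azimuth_rad)   # forward distance, meters
    u = cam.cx + cam.fx * (x_lat / z_fwd)
    v = cam.cy + cam.fy * (cam.height_m / z_fwd)      # approximate ground-contact row
    return int(u), int(v), z_fwd

def region_of_interest(u, v, z_fwd, cam: CameraModel, obj_width_m=2.0):
    """Portion of the image plane spatially related to the radar detection;
    the window scales inversely with forward distance."""
    half = int(0.5 * obj_width_m * cam.fx / max(z_fwd, 1.0))
    return (u - half, v - half, u + half, v + half)

def is_object_of_interest(image, roi, classify) -> bool:
    """Run the supplied (hypothetical) classifier only on image data inside the
    region of interest, e.g. on an image processing chip."""
    x0, y0, x1, y1 = roi
    crop = [row[max(x0, 0):x1] for row in image[max(y0, 0):y1]]
    return bool(classify(crop))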
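Claims 2-4, 15, 38-40 and 50-52 recite a phased array antenna with digital beam forming or digital beam steering used to determine the location of a detected object. The sketch below shows one conventional way such an azimuth estimate can be obtained by digitally scanning steering weights across the receive channels; the uniform linear array geometry, the 77 GHz carrier and all names are illustrative assumptions, not limitations of the claims.

import cmath
import math

C_LIGHT = 3.0e8                  # speed of light, m/s
F_CARRIER = 77e9                 # assumed carrier (claims recite at least 60 GHz)
WAVELENGTH = C_LIGHT / F_CARRIER
D_ELEMENT = WAVELENGTH / 2.0     # assumed half-wavelength element spacing

def steering_weights(n_elements, angle_rad):
    """Phase weights that digitally steer a uniform linear array to angle_rad."""
    k = 2.0 * math.pi / WAVELENGTH
    return [cmath.exp(-1j * k * D_ELEMENT * i * math.sin(angle_rad))
            for i in range(n_elements)]

def beamformed_power(channel_samples, angle_rad):
    """Combine one complex sample per receive channel with the steering weights."""
    w = steering_weights(len(channel_samples), angle_rad)
    out = sum(wi.conjugate() * s for wi, s in zip(w, channel_samples))
    return abs(out) ** 2

def estimate_azimuth(channel_samples, scan_deg=range(-45, 46)):
    """Digitally scan the beam over candidate angles and return the azimuth
    (radians) at which the combined receive power peaks."""
    return max((math.radians(a) for a in scan_deg),
               key=lambda ang: beamformed_power(channel_samples, ang))

# Synthetic usage example: channel samples from a target at +10 degrees
# produce a peak near 0.17 rad.
# k = 2.0 * math.pi / WAVELENGTH
# samples = [cmath.exp(-1j * k * D_ELEMENT * i * math.sin(math.radians(10)))
#            for i in range(8)]
# estimate_azimuth(samples)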
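Claims 7-11, 25-29 and 56-60 recite alerting the driver to a determined potentially hazardous condition visually (via an indication, overlay or enhancement on an in-cabin display screen) and/or audibly. The minimal sketch below illustrates one way such an alert could be rendered; the display and chime callbacks are hypothetical placeholders for vehicle-specific interfaces and are not taken from the claims.

from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class HazardAlert:
    roi: Tuple[int, int, int, int]   # image-plane region of the hazardous object
    message: str                     # e.g. "Vehicle ahead" or "Pedestrian ahead"

def overlay_hazard(frame: List[List[int]], alert: HazardAlert, value: int = 255):
    """Draw a simple rectangular highlight around the hazardous object on a
    grayscale frame to be shown on the in-cabin display screen."""
    x0, y0, x1, y1 = alert.roi
    width, height = len(frame[0]), len(frame)
    for x in range(max(x0, 0), min(x1, width)):
        frame[max(y0, 0)][x] = value
        frame[min(y1, height - 1)][x] = value
    for y in range(max(y0, 0), min(y1, height)):
        frame[y][max(x0, 0)] = value
        frame[y][min(x1, width - 1)] = value
    return frame

def alert_driver(frame, alert: HazardAlert,
                 show_on_display: Callable, play_chime: Callable):
    """Visually alert via an overlay/enhancement of the displayed image and
    audibly alert via a chime (both callbacks are hypothetical vehicle APIs)."""
    show_on_display(overlay_hazard(frame, alert), alert.message)
    play_chime()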
CROSS REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 15/929,969, filed Jun. 1, 2020, now U.S. Pat. No. 10,877,147, which is a continuation of U.S. patent application Ser. No. 16/166,333, filed Oct. 22, 2018, now U.S. Pat. No. 10,670,713, which is a continuation of U.S. patent application Ser. No. 15/361,746, filed Nov. 28, 2016, now U.S. Pat. No. 10,107,905, which is a continuation of U.S. patent application Ser. No. 15/149,338, filed May 9, 2016, now U.S. Pat. No. 9,507,021, which is a continuation of U.S. patent application Ser. No. 15/005,092, filed Jan. 25, 2016, now U.S. Pat. No. 9,335,411, which is a continuation of U.S. patent application Ser. No. 14/859,683, filed Sep. 21, 2015, now U.S. Pat. No. 9,244,165, which is a continuation of U.S. patent application Ser. No. 14/107,624, filed Dec. 16, 2013, now U.S. Pat. No. 9,140,789, which is a divisional of U.S. patent application Ser. No. 13/656,975, filed Oct. 22, 2012, now U.S. Pat. No. 8,614,640, which is a continuation of U.S. patent application Ser. No. 13/540,856, filed Jul. 3, 2012, now U.S. Pat. No. 8,294,608, which is a continuation of U.S. patent application Ser. No. 13/192,525, filed Jul. 28, 2011, now U.S. Pat. No. 8,217,830, which is a continuation of U.S. patent application Ser. No. 12/524,446, filed Jul. 24, 2009, now U.S. Pat. No. 8,013,780, which is a 371 application of PCT Application No. PCT/US2008/051833, filed Jan. 24, 2008, which claims the benefit of U.S. provisional application Ser. No. 60/886,568, filed Jan. 25, 2007, which are incorporated herein by reference for all purposes.

US Referenced Citations (504)
Number Name Date Kind
4571082 Downs Feb 1986 A
4572619 Reininger et al. Feb 1986 A
4580875 Bechtel et al. Apr 1986 A
4600913 Caine Jul 1986 A
4603946 Kato et al. Aug 1986 A
4614415 Hyatt Sep 1986 A
4620141 McCumber et al. Oct 1986 A
4623222 Itoh et al. Nov 1986 A
4626850 Chey Dec 1986 A
4629941 Ellis et al. Dec 1986 A
4630109 Barton Dec 1986 A
4632509 Ohmi et al. Dec 1986 A
4638287 Umebayashi et al. Jan 1987 A
4647161 Muller Mar 1987 A
4669825 Itoh et al. Jun 1987 A
4669826 Itoh et al. Jun 1987 A
4671615 Fukada et al. Jun 1987 A
4672457 Hyatt Jun 1987 A
4676601 Itoh et al. Jun 1987 A
4684164 Durham Aug 1987 A
4690508 Jacob Sep 1987 A
4692798 Seko et al. Sep 1987 A
4697883 Suzuki et al. Oct 1987 A
4701022 Jacob Oct 1987 A
4713685 Nishimura et al. Dec 1987 A
4717830 Botts Jan 1988 A
4727290 Smith et al. Feb 1988 A
4731669 Hayashi et al. Mar 1988 A
4741603 Miyagi et al. May 1988 A
4768135 Kretschmer et al. Aug 1988 A
4789904 Peterson Dec 1988 A
4793690 Gahan et al. Dec 1988 A
4817948 Simonelli Apr 1989 A
4820933 Hong et al. Apr 1989 A
4825232 Howdle Apr 1989 A
4838650 Stewart et al. Jun 1989 A
4847772 Michalopoulos et al. Jul 1989 A
4862037 Farber et al. Aug 1989 A
4867561 Fujii et al. Sep 1989 A
4871917 O'Farrell et al. Oct 1989 A
4872051 Dye Oct 1989 A
4881019 Shiraishi et al. Nov 1989 A
4882565 Gallmeyer Nov 1989 A
4886960 Molyneux et al. Dec 1989 A
4891559 Matsumoto et al. Jan 1990 A
4892345 Rachael, III Jan 1990 A
4895790 Swanson et al. Jan 1990 A
4896030 Miyaji Jan 1990 A
4907870 Brucker Mar 1990 A
4910591 Petrossian et al. Mar 1990 A
4916374 Schierbeek et al. Apr 1990 A
4917477 Bechtel et al. Apr 1990 A
4937796 Tendler Jun 1990 A
4943796 Lee Jul 1990 A
4953305 Van Lente et al. Sep 1990 A
4956591 Schierbeek et al. Sep 1990 A
4961625 Wood et al. Oct 1990 A
4967319 Seko Oct 1990 A
4970653 Kenue Nov 1990 A
4971430 Lynas Nov 1990 A
4974078 Tsai Nov 1990 A
4987357 Masaki Jan 1991 A
4991054 Walters Feb 1991 A
5001558 Burley et al. Mar 1991 A
5003288 Wilhelm Mar 1991 A
5012082 Watanabe Apr 1991 A
5016977 Baude et al. May 1991 A
5027001 Torbert Jun 1991 A
5027200 Petrossian et al. Jun 1991 A
5044706 Chen Sep 1991 A
5055668 French Oct 1991 A
5059877 Teder Oct 1991 A
5064274 Alten Nov 1991 A
5072154 Chen Dec 1991 A
5086253 Lawler Feb 1992 A
5096287 Kakinami et al. Mar 1992 A
5097362 Lynas Mar 1992 A
5121200 Choi Jun 1992 A
5124549 Michaels et al. Jun 1992 A
5130709 Toyama et al. Jul 1992 A
5148014 Lynam et al. Sep 1992 A
5168378 Black Dec 1992 A
5170374 Shimohigashi et al. Dec 1992 A
5172235 Wilm et al. Dec 1992 A
5177685 Davis et al. Jan 1993 A
5182502 Slotkowski et al. Jan 1993 A
5184956 Langlais et al. Feb 1993 A
5189561 Hong Feb 1993 A
5193000 Lipton et al. Mar 1993 A
5193029 Schofield et al. Mar 1993 A
5204778 Bechtel Apr 1993 A
5208701 Maeda May 1993 A
5245422 Borcherts et al. Sep 1993 A
5253109 O'Farrell et al. Oct 1993 A
5276389 Levers Jan 1994 A
5285060 Larson et al. Feb 1994 A
5289182 Brillard et al. Feb 1994 A
5289321 Secor Feb 1994 A
5305012 Faris Apr 1994 A
5307136 Saneyoshi Apr 1994 A
5309137 Kajiwara May 1994 A
5309163 Ngan et al. May 1994 A
5313072 Vachss May 1994 A
5325096 Pakett Jun 1994 A
5325386 Jewell et al. Jun 1994 A
5329206 Slotkowski et al. Jul 1994 A
5331312 Kudoh Jul 1994 A
5336980 Levers Aug 1994 A
5341437 Nakayama Aug 1994 A
5351044 Mathur et al. Sep 1994 A
5355118 Fukuhara Oct 1994 A
5374852 Parkes Dec 1994 A
5381155 Gerber Jan 1995 A
5386285 Asayama Jan 1995 A
5394333 Kao Feb 1995 A
5406395 Wilson et al. Apr 1995 A
5410346 Saneyoshi et al. Apr 1995 A
5414257 Stanton May 1995 A
5414461 Kishi et al. May 1995 A
5416313 Larson et al. May 1995 A
5416318 Hegyi May 1995 A
5416478 Morinaga May 1995 A
5424952 Asayama Jun 1995 A
5426294 Kobayashi et al. Jun 1995 A
5430431 Nelson Jul 1995 A
5434407 Bauer et al. Jul 1995 A
5440428 Hegg et al. Aug 1995 A
5444478 Lelong et al. Aug 1995 A
5451822 Bechtel et al. Sep 1995 A
5457493 Leddy et al. Oct 1995 A
5461357 Yoshioka et al. Oct 1995 A
5461361 Moore Oct 1995 A
5469298 Suman et al. Nov 1995 A
5471515 Fossum et al. Nov 1995 A
5475494 Nishida et al. Dec 1995 A
5498866 Bendicks et al. Mar 1996 A
5500766 Stonecypher Mar 1996 A
5510983 Iino Apr 1996 A
5515042 Nelson May 1996 A
5515448 Nishitani May 1996 A
5521633 Nakajima et al. May 1996 A
5528698 Kamei et al. Jun 1996 A
5529138 Shaw et al. Jun 1996 A
5530240 Larson et al. Jun 1996 A
5530420 Tsuchiya et al. Jun 1996 A
5535314 Alves et al. Jul 1996 A
5537003 Bechtel et al. Jul 1996 A
5539397 Asanuma et al. Jul 1996 A
5541590 Nishio Jul 1996 A
5550677 Schofield Aug 1996 A
5568027 Teder Oct 1996 A
5574443 Hsieh Nov 1996 A
5581464 Woll et al. Dec 1996 A
5585798 Yoshioka Dec 1996 A
5594222 Caldwell Jan 1997 A
5614788 Mullins Mar 1997 A
5619370 Guinosso Apr 1997 A
5634709 Iwama Jun 1997 A
5642299 Hardin et al. Jun 1997 A
5648835 Uzawa Jul 1997 A
5650944 Kise Jul 1997 A
5657021 Ehsani-Nategh Aug 1997 A
5660454 Mori et al. Aug 1997 A
5661303 Teder Aug 1997 A
5666028 Bechtel et al. Sep 1997 A
5668663 Varaprasad et al. Sep 1997 A
5670935 Schofield Sep 1997 A
5677851 Kingdon et al. Oct 1997 A
5699044 Van Lente et al. Dec 1997 A
5715093 Schierbeek et al. Feb 1998 A
5724187 Varaprasad et al. Mar 1998 A
5724316 Brunts Mar 1998 A
5737226 Olson et al. Apr 1998 A
5760826 Nayar Jun 1998 A
5760828 Cortes Jun 1998 A
5760931 Saburi et al. Jun 1998 A
5761094 Olson et al. Jun 1998 A
5765116 Wilson-Jones et al. Jun 1998 A
5781437 Wiemer et al. Jul 1998 A
5786772 Schofield et al. Jul 1998 A
5790403 Nakayama Aug 1998 A
5790973 Blaker et al. Aug 1998 A
5793308 Rosinski et al. Aug 1998 A
5793420 Schmidt Aug 1998 A
5796094 Schofield et al. Aug 1998 A
5798575 O'Farrell et al. Aug 1998 A
5835255 Miles Nov 1998 A
5837994 Stam et al. Nov 1998 A
5844505 Van Ryzin Dec 1998 A
5844682 Kiyomoto et al. Dec 1998 A
5845000 Breed et al. Dec 1998 A
5847676 Cole Dec 1998 A
5848802 Breed et al. Dec 1998 A
5850176 Kinoshita et al. Dec 1998 A
5850254 Takano et al. Dec 1998 A
5867591 Onda Feb 1999 A
5872536 Lyons Feb 1999 A
5877707 Kowalick Mar 1999 A
5877897 Schofield Mar 1999 A
5878370 Olson Mar 1999 A
5883739 Ashihara et al. Mar 1999 A
5884212 Lion Mar 1999 A
5890021 Onoda Mar 1999 A
5896085 Mori et al. Apr 1999 A
5899956 Chan May 1999 A
5914815 Bos Jun 1999 A
5923027 Stam Jul 1999 A
5929786 Schofield et al. Jul 1999 A
5933109 Tohya Aug 1999 A
5938717 Dunne Aug 1999 A
5940120 Frankhouse et al. Aug 1999 A
5956181 Lin Sep 1999 A
5959367 O'Farrell et al. Sep 1999 A
5959555 Furuta Sep 1999 A
5959571 Aoyagi Sep 1999 A
5963247 Banitt Oct 1999 A
5971552 O'Farrell et al. Oct 1999 A
5986796 Miles Nov 1999 A
5990469 Bechtel et al. Nov 1999 A
5990649 Nagao et al. Nov 1999 A
6001486 Varaprasad et al. Dec 1999 A
6020704 Buschur Feb 2000 A
6049171 Stam et al. Apr 2000 A
6057754 Kinoshita May 2000 A
6066933 Ponziana May 2000 A
6067110 Nonaka May 2000 A
6075492 Schmidt et al. Jun 2000 A
6084519 Coulling et al. Jul 2000 A
6085151 Farmer Jul 2000 A
6087953 DeLine et al. Jul 2000 A
6097024 Stam et al. Aug 2000 A
6116743 Hoek Sep 2000 A
6118401 Tognazzini Sep 2000 A
6118410 Nagy Sep 2000 A
6124647 Marcus et al. Sep 2000 A
6124886 DeLine et al. Sep 2000 A
6139172 Bos et al. Oct 2000 A
6144022 Tenenbaum et al. Nov 2000 A
6172613 DeLine et al. Jan 2001 B1
6175164 O'Farrell et al. Jan 2001 B1
6175300 Kendrick Jan 2001 B1
6198409 Schofield et al. Mar 2001 B1
6201642 Bos Mar 2001 B1
6216540 Nelson et al. Apr 2001 B1
6222460 DeLine et al. Apr 2001 B1
6243003 DeLine et al. Jun 2001 B1
6250148 Lynam Jun 2001 B1
6259412 Duroux Jul 2001 B1
6266082 Yonezawa et al. Jul 2001 B1
6266442 Laumeyer et al. Jul 2001 B1
6278399 Ashihara Aug 2001 B1
6285393 Shimoura et al. Sep 2001 B1
6291906 Marcus et al. Sep 2001 B1
6294989 Schofield et al. Sep 2001 B1
6297781 Turnbull et al. Oct 2001 B1
6302545 Schofield et al. Oct 2001 B1
6310611 Caldwell Oct 2001 B1
6313454 Bos Nov 2001 B1
6317057 Lee Nov 2001 B1
6320282 Caldwell Nov 2001 B1
6323477 Blasing et al. Nov 2001 B1
6326613 Heslin et al. Dec 2001 B1
6329925 Skiver et al. Dec 2001 B1
6333759 Mazzilli Dec 2001 B1
6341523 Lynam Jan 2002 B2
6353392 Schofield Mar 2002 B1
6366213 DeLine et al. Apr 2002 B2
6370329 Teuchert Apr 2002 B1
6396397 Bos May 2002 B1
6411204 Bloomfield et al. Jun 2002 B1
6411328 Franke et al. Jun 2002 B1
6420975 DeLine et al. Jul 2002 B1
6424273 Gulla et al. Jul 2002 B1
6428172 Hutzel et al. Aug 2002 B1
6430303 Naoi et al. Aug 2002 B1
6433676 DeLine et al. Aug 2002 B2
6442465 Breed et al. Aug 2002 B2
6452148 Bendicks et al. Sep 2002 B1
6462700 Schmidt et al. Oct 2002 B1
6477464 McCarthy et al. Nov 2002 B2
6485155 Duroux et al. Nov 2002 B1
6492935 Higuchi Dec 2002 B1
6497503 Dassanayake et al. Dec 2002 B1
6498620 Schofield Dec 2002 B2
6513252 Schierbeek Feb 2003 B1
6516664 Lynam Feb 2003 B2
6523964 Schofield et al. Feb 2003 B2
6534884 Marcus et al. Mar 2003 B2
6539306 Turnbull Mar 2003 B2
6547133 Devries, Jr. et al. Apr 2003 B1
6553130 Lemelson et al. Apr 2003 B1
6555804 Blasing Apr 2003 B1
6574033 Chui et al. Jun 2003 B1
6580385 Winner Jun 2003 B1
6589625 Kothari et al. Jul 2003 B1
6593565 Heslin et al. Jul 2003 B2
6594583 Ogura et al. Jul 2003 B2
6611202 Schofield et al. Aug 2003 B2
6611610 Stam et al. Aug 2003 B1
6627918 Getz et al. Sep 2003 B2
6636148 Higuchi Oct 2003 B2
6636258 Strumolo Oct 2003 B2
6648477 Hutzel et al. Nov 2003 B2
6650233 DeLine et al. Nov 2003 B2
6650455 Miles Nov 2003 B2
6672731 Schnell et al. Jan 2004 B2
6674562 Miles Jan 2004 B1
6678614 McCarthy et al. Jan 2004 B2
6680792 Miles Jan 2004 B2
6690268 Schofield Feb 2004 B2
6696978 Trajkovic Feb 2004 B2
6700605 Toyoda et al. Mar 2004 B1
6703925 Steffel Mar 2004 B2
6704621 Stein et al. Mar 2004 B1
6710908 Miles et al. Mar 2004 B2
6711474 Treyz et al. Mar 2004 B1
6714331 Lewis et al. Mar 2004 B2
6717610 Bos et al. Apr 2004 B1
6727807 Trajkovic et al. Apr 2004 B2
6735506 Breed et al. May 2004 B2
6741377 Miles May 2004 B2
6744353 Sjonell Jun 2004 B2
6757109 Bos Jun 2004 B2
6762867 Lippert et al. Jul 2004 B2
6771208 Lutter Aug 2004 B2
6772057 Breed et al. Aug 2004 B2
6794119 Miles Sep 2004 B2
6795014 Cheong Sep 2004 B2
6795221 Urey Sep 2004 B1
6802617 Schofield et al. Oct 2004 B2
6806452 Bos Oct 2004 B2
6812882 Ono Nov 2004 B2
6816084 Stein Nov 2004 B2
6818884 Koch et al. Nov 2004 B2
6822563 Bos et al. Nov 2004 B2
6823241 Shirato et al. Nov 2004 B2
6823244 Breed Nov 2004 B2
6824281 Schofield Nov 2004 B2
6828903 Watanabe Dec 2004 B2
6831591 Horibe Dec 2004 B2
6838980 Gloger Jan 2005 B2
6841767 Mindl et al. Jan 2005 B2
6847487 Burgner Jan 2005 B2
6853327 Miceli Feb 2005 B2
6856873 Breed et al. Feb 2005 B2
6859705 Rao Feb 2005 B2
6864784 Loeb Mar 2005 B1
6873912 Shimomura Mar 2005 B2
6879281 Gresham et al. Apr 2005 B2
6882287 Schofield Apr 2005 B2
6889161 Winner et al. May 2005 B2
6903677 Takashima Jun 2005 B2
6909753 Meehan et al. Jun 2005 B2
6941211 Kuroda Sep 2005 B1
6944544 Prakah-Asante et al. Sep 2005 B1
6946978 Schofield Sep 2005 B2
6947577 Stam et al. Sep 2005 B2
6953253 Schofield et al. Oct 2005 B2
6958729 Metz Oct 2005 B1
6968736 Lynam Nov 2005 B2
6975390 Mindl et al. Dec 2005 B2
6975775 Rykowski et al. Dec 2005 B2
6987419 Gresham Jan 2006 B2
6999024 Kumon et al. Feb 2006 B2
7004593 Weller et al. Feb 2006 B2
7004606 Schofield Feb 2006 B2
7005974 McMahon et al. Feb 2006 B2
7012560 Braeuchle Mar 2006 B2
7038577 Pawlicki et al. May 2006 B2
7042389 Shirai May 2006 B2
7062300 Kim Jun 2006 B1
7065432 Moisel et al. Jun 2006 B2
7085637 Breed et al. Aug 2006 B2
7088286 Natsume et al. Aug 2006 B2
7092548 Laumeyer et al. Aug 2006 B2
7116246 Winter et al. Oct 2006 B2
7123168 Schofield Oct 2006 B2
7126460 Yamada Oct 2006 B2
7126525 Suzuki Oct 2006 B2
7149613 Stam et al. Dec 2006 B2
7167796 Taylor et al. Jan 2007 B2
7176830 Horibe Feb 2007 B2
7188963 Schofield Mar 2007 B2
7196305 Shaffer et al. Mar 2007 B2
7199747 Jenkins et al. Apr 2007 B2
7202776 Breed Apr 2007 B2
7227459 Bos et al. Jun 2007 B2
7227611 Hull et al. Jun 2007 B2
7250853 Flynn Jul 2007 B2
7311406 Schofield et al. Dec 2007 B2
7322755 Neumann et al. Jan 2008 B2
7325934 Schofield et al. Feb 2008 B2
7325935 Schofield et al. Feb 2008 B2
7344261 Schofield et al. Mar 2008 B2
7380948 Schofield et al. Jun 2008 B2
7388182 Schofield et al. Jun 2008 B2
7400266 Haug Jul 2008 B2
7423248 Schofield et al. Sep 2008 B2
7425076 Schofield et al. Sep 2008 B2
7432848 Munakata Oct 2008 B2
7436038 Engelmann et al. Oct 2008 B2
7439507 Deasy et al. Oct 2008 B2
7453374 Koike Nov 2008 B2
7460951 Altan Dec 2008 B2
7480149 DeWard Jan 2009 B2
7526103 Schofield Apr 2009 B2
7542835 Takahama Jun 2009 B2
7558007 Katoh et al. Jul 2009 B2
7570198 Tokoro Aug 2009 B2
7587072 Russo et al. Sep 2009 B2
7613568 Kawasaki Nov 2009 B2
7619508 Lynam et al. Nov 2009 B2
7619562 Stumbo et al. Nov 2009 B2
7633383 Dunsmoir Dec 2009 B2
7639149 Katoh Dec 2009 B2
7671806 Voigtlaender Mar 2010 B2
7706978 Schiffmann Apr 2010 B2
7720580 Higgins-Luthman May 2010 B2
7728272 Blaesing Jun 2010 B2
7765065 Stiller Jul 2010 B2
7777669 Tokoro et al. Aug 2010 B2
7811011 Blaesing et al. Oct 2010 B2
7828478 Rege et al. Nov 2010 B2
7855353 Blaesing et al. Dec 2010 B2
7881496 Camilleri Feb 2011 B2
7914187 Higgins-Luthman et al. Mar 2011 B2
7920251 Chung Apr 2011 B2
7978122 Schmidlin Jul 2011 B2
8013780 Lynam Sep 2011 B2
8192095 Kortan et al. Jun 2012 B2
8217830 Lynam Jul 2012 B2
8294608 Lynam Oct 2012 B1
8614640 Lynam Dec 2013 B2
9140789 Lynam Sep 2015 B2
9244165 Lynam Jan 2016 B1
9335411 Lynam May 2016 B1
9507021 Lynam Nov 2016 B2
10107905 Lynam Oct 2018 B2
10670713 Lynam Jun 2020 B2
10877147 Lynam Dec 2020 B2
20020015153 Downs Feb 2002 A1
20020021229 Stein Feb 2002 A1
20020044065 Quist et al. Apr 2002 A1
20020113873 Williams Aug 2002 A1
20020159270 Lynam et al. Oct 2002 A1
20030080878 Kirmuss May 2003 A1
20030112132 Trajkovic Jun 2003 A1
20030137586 Lewellen Jul 2003 A1
20030138132 Stam Jul 2003 A1
20030201929 Lutter Oct 2003 A1
20030222982 Hamdan et al. Dec 2003 A1
20030227777 Schofield Dec 2003 A1
20040012488 Schofield Jan 2004 A1
20040016870 Pawlicki et al. Jan 2004 A1
20040032321 McMahon et al. Feb 2004 A1
20040051634 Schofield et al. Mar 2004 A1
20040080450 Cheong Apr 2004 A1
20040114381 Salmeen et al. Jun 2004 A1
20040128065 Taylor et al. Jul 2004 A1
20040200948 Bos et al. Oct 2004 A1
20040227663 Suzuki Nov 2004 A1
20040246167 Kumon Dec 2004 A1
20050046978 Schofield Mar 2005 A1
20050078389 Kulas et al. Apr 2005 A1
20050102070 Takahama May 2005 A1
20050104089 Engelmann May 2005 A1
20050134966 Burgner Jun 2005 A1
20050134983 Lynam Jun 2005 A1
20050146792 Schofield et al. Jul 2005 A1
20050169003 Lindahl et al. Aug 2005 A1
20050195488 McCabe et al. Sep 2005 A1
20050200700 Schofield et al. Sep 2005 A1
20050225492 Metz Oct 2005 A1
20050232469 Schofield et al. Oct 2005 A1
20050264891 Uken et al. Dec 2005 A1
20050270225 Tokoro Dec 2005 A1
20060018511 Stam et al. Jan 2006 A1
20060018512 Stam et al. Jan 2006 A1
20060028731 Schofield et al. Feb 2006 A1
20060050018 Hutzel et al. Mar 2006 A1
20060067378 Rege Mar 2006 A1
20060091654 De Mersseman May 2006 A1
20060091813 Stam et al. May 2006 A1
20060103727 Tseng May 2006 A1
20060157639 Shaffer Jul 2006 A1
20060164230 DeWind et al. Jul 2006 A1
20060250501 Wildmann et al. Nov 2006 A1
20070023613 Schofield et al. Feb 2007 A1
20070088488 Reeves Apr 2007 A1
20070104476 Yasutomi et al. May 2007 A1
20070109406 Schofield et al. May 2007 A1
20070109651 Schofield et al. May 2007 A1
20070109652 Schofield et al. May 2007 A1
20070109653 Schofield et al. May 2007 A1
20070109654 Schofield et al. May 2007 A1
20070120657 Schofield et al. May 2007 A1
20070152152 Deasy Jul 2007 A1
20070176080 Schofield et al. Aug 2007 A1
20080117097 Walter et al. May 2008 A1
20080180529 Taylor et al. Jul 2008 A1
20090113509 Tseng et al. Apr 2009 A1
20100001897 Lyman Jan 2010 A1
20100045797 Schofield et al. Feb 2010 A1
20110037640 Schmidlin Feb 2011 A1
Foreign Referenced Citations (14)
Number Date Country
10354872 Jun 2004 DE
1506893 Feb 2005 EP
H05301541 Nov 1993 JP
H08276787 Oct 1996 JP
H10142331 May 1998 JP
H10147178 Jun 1998 JP
H1178737 Mar 1999 JP
2001158284 Jun 2001 JP
2001233139 Aug 2001 JP
2003044995 Feb 2003 JP
2003169233 Jun 2003 JP
2004082829 Mar 2004 JP
2003053743 Jul 2003 WO
2006035510 Apr 2006 WO
Non-Patent Literature Citations (4)
Entry
European Search Report for European Patent Application No. 08780377.1 dated Jun. 7, 2010.
Bombini et al., "Radar-vision fusion for vehicle detection," Dipartimento di Ingegneria dell'Informazione, Università di Parma, Parma I-43100, Italy, Mar. 14, 2006.
PCT International Search Report of PCT/US2008/051833 dated Oct. 7, 2008.
European Examination Report for EP Application No. 08780377.1 dated Aug. 8, 2012.
Related Publications (1)
Number Date Country
20210109212 A1 Apr 2021 US
Provisional Applications (1)
Number Date Country
60886568 Jan 2007 US
Divisions (1)
Number Date Country
Parent 13656975 Oct 2012 US
Child 14107624 US
Continuations (10)
Number Date Country
Parent 15929969 Jun 2020 US
Child 17247711 US
Parent 16166333 Oct 2018 US
Child 15929969 US
Parent 15361746 Nov 2016 US
Child 16166333 US
Parent 15149338 May 2016 US
Child 15361746 US
Parent 15005092 Jan 2016 US
Child 15149338 US
Parent 14859683 Sep 2015 US
Child 15005092 US
Parent 14107624 Dec 2013 US
Child 14859683 US
Parent 13540856 Jul 2012 US
Child 13656975 US
Parent 13192525 Jul 2011 US
Child 13540856 US
Parent 12524446 US
Child 13192525 US