The present invention relates to imaging systems or vision systems for vehicles.
Use of imaging sensors in vehicle imaging systems is common and known. Examples of such known systems are described in U.S. Pat. Nos. 5,949,331; 5,670,935; and/or 5,550,677, which are hereby incorporated herein by reference in their entireties.
The present invention provides a calibration system for a vision system or imaging system for a vehicle that utilizes multiple cameras to capture images exterior of the vehicle, such as rearwardly and sidewardly and forwardly of the vehicle, such as for a surround view or bird's-eye view system of a vehicle. The cameras provide communication/data signals, including camera data or image data that may be displayed for viewing by the driver of the vehicle, and/or that is processed to merge the captured images from the cameras to provide or display a continuous surround view image for viewing by the driver of the vehicle. The cameras and/or the image processing are calibrated to provide the continuous image or merged images.
According to one aspect of the present invention, the calibration system or method includes at least one reconfigurable calibration target that is reconfigurable to adapt the calibration system for different sizes or lengths or types of vehicles, such as at an end-of-line calibration procedure or as an aftermarket or after end-of-line calibration procedure. In one form, the calibration system or method may include an adjustable cover element for selectively covering a portion of a calibration target so as to selectively expose an appropriate portion of the calibration target to adapt the calibration system for different sizes or lengths or types of vehicles being calibrated. In another form, the calibration system or method may include at least one reconfigurable display, and the calibration system or method may be operable to adjust at least one of a size of the at least one reconfigurable display and a location of the at least one reconfigurable display by electronically generating an appropriate target display responsive to an input indicative of the type or size of the vehicle being calibrated.
According to another aspect of the present invention, the calibration system or method may utilize a laser emitting device to project or emit a laser line cross or grid onto a ground surface by the vehicle and in an overlapping region of the fields of view of two vehicle cameras, whereby image data captured by the two cameras is processed to determine the location of the laser lines and to calibrate or adjust a camera or the image processing accordingly. Responsive to such detection or determination, the captured images can be stitched or merged together to provide a substantially seamless top-down view or bird's-eye view at the vehicle via processing of the images captured by the vehicle cameras.
These and other objects, advantages, purposes and features of the present invention will become apparent upon review of the following specification in conjunction with the drawings.
Referring now to the drawings and the illustrative embodiments depicted therein, a vehicle 10 includes an imaging system or vision system 12 that includes a plurality of imaging sensors or cameras (such as a rearward facing imaging sensor or camera 14a, a forwardly facing camera 14b at the front (or at the windshield) of the vehicle, and sidewardly and/or rearwardly facing cameras 14c, 14d at the sides of the vehicle), which capture images exterior of the vehicle, with each camera having a lens for focusing images at or onto an imaging array or imaging plane of the camera.
It is not uncommon at a vehicle assembly plant that the vehicle assembly line produces a mixture of vehicle types and sizes. Thus, for example, a sedan type vehicle may be immediately followed on the moving assembly line by an SUV type vehicle or a station wagon type vehicle or crossover type vehicle or the like. Some or all of this mixture of vehicle types/sizes being conveyed along the moving vehicle assembly line may be equipped with a multi-camera system that desires/requires calibration before the subject or equipped vehicle exits the assembly line. Typically, such calibration is achieved at the likes of a wheel balance station. In accordance with the present invention, when a subject multi-camera equipped vehicle of a given vehicle type/size enters the camera calibration station, a signal indicative of that vehicle type is automatically transmitted to a control that controls the various calibration targets disposed at the calibration station (typically for a surround view/bird's eye multi-camera system and typically disposed on the ground adjacent the calibration station). Responsive to receipt of that signal, the individual calibration targets are adjusted or customized to the calibration needs or specifications of that particular vehicle type. Such customization can be achieved in several ways, such as by occlusion of portions of an individual calibration target, such as via the likes of a movable cover or curtain or shroud or mask or the like. Optionally, the calibration target itself may be a reconfigurable electronic calibration target display that can reconfigurably customize the visible calibration target (that is visible or detectable or sensed by the camera or cameras being calibrated) to the expectations and/or needs of the particular vehicle type being calibrated. For example, such reconfigurable targets may comprise a large area plasma display or a large area LED or OLED display or a backlit LCD display or an e-ink display or the like, such as are commonly known in the display arts. This aspect of the present invention is also applicable to forward facing imagers (and/or optionally rearward facing imagers) utilized for the likes of automatic headlamp control, lane departure warning, forward collision warning, traffic sign recognition, vehicle detection and pedestrian detection and/or the like (such as described in U.S. Pat. Nos. 5,550,677; 5,670,935; 5,760,962; 5,796,094; 5,877,897; 6,302,545; 6,396,397; 6,690,268; 7,859,565; 8,027,029 and/or 8,070,332, which are all hereby incorporated herein by reference in their entireties), where a reconfigurable electronic target may be disposed forward of the vehicle so as to be viewed by the forward facing imager as the vehicle equipped with the imager passes along the assembly line and/or while the vehicle is at a calibration station for other exterior viewing cameras of that vehicle (such as, for example, a surround view/bird's eye system). The calibration system thus may provide a reconfigurable calibration target (or multiple reconfigurable calibration targets) that is/are reconfigurable to adapt the calibration system for the particular size/type of vehicle that is at the calibration station.
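For illustration, the station control logic described above can be sketched as a simple lookup that maps the received vehicle-type signal to a stored target configuration. This is only a minimal sketch under assumed names and values (TargetLayout, TARGET_LAYOUTS, configure_targets and all offsets are hypothetical and not from this disclosure):

```python
# Hypothetical sketch of a calibration-station control selecting a target
# layout from the vehicle-type signal; all names and values are invented.
from dataclasses import dataclass

@dataclass
class TargetLayout:
    front_offset_m: float   # distance of front target from the station datum
    rear_offset_m: float    # distance of rear target from the station datum
    side_offset_m: float    # lateral offset of the side targets
    pattern: str            # pattern shown on an electronic target display

# Per-vehicle-type calibration specifications (example values only).
TARGET_LAYOUTS = {
    "sedan":     TargetLayout(1.0, 1.0, 0.80, "checker_6x4"),
    "suv":       TargetLayout(1.2, 1.4, 0.90, "checker_6x4"),
    "crossover": TargetLayout(1.1, 1.2, 0.85, "checker_6x4"),
}

def configure_targets(vehicle_type: str) -> TargetLayout:
    """Select the layout for the signalled vehicle type; a real station
    would then drive the cover actuators or redraw the target displays."""
    return TARGET_LAYOUTS[vehicle_type]

print(configure_targets("suv"))
```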
It is desirable to have the targets at fixed or constant or non-varying distances from the cameras for all vehicles so that the processing algorithms may be configured or optimized for enhanced precision and thus enhanced calibrating of the system. For example, in order to optimize the calibration of systems on different length vehicles, the exposed portion or position of a target (such as target 18a) may be adapted to the particular vehicle being calibrated, as discussed below.
The present invention provides multiple targets at the calibration area or zone (and optionally may provide multiple targets at one or more of the target locations, such as the front, sides and/or rear of the vehicle, or larger targets at one or more of the target locations), and the different targets or portions of the targets are selectively exposed (or different sized and located targets are selectively generated, such as via a reconfigurable display or the like that is reconfigured for the vehicle being calibrated) for different vehicle sizes or types while the non-used targets or target portions (the targets or target portions that are not needed for calibrating the particular vehicle being calibrated) are selectively covered, such as by a mechanical device or the like.
In the illustrated embodiment, a cover element or panel is movable over a larger calibration target 18a′ to selectively expose the portion of the target that is appropriate for the particular vehicle type or size being calibrated.
Optionally, the cover element or panel or frame may comprise a panel that may be flipped between two positions, with the cover, when in a first position, covering about a half of the target 18a′ and, when in a second position, covering about the other half of the target 18a′. For example, the cover may pivot or flip via a hinged mounting or attachment at or near the target, with the cover panel hingedly mounted at the target and being pivotable about its hinge axis between its first and second covering positions.
Thus, the end-of-line calibration process or system of the present invention provides enhanced or optimized calibration target positions for multiple vehicle types or sizes, such that the processing algorithm that calibrates the cameras and/or vision system may be common for all vehicles and may calibrate all sizes or types of vehicles in a similar manner and to a similar degree of precision. Although shown and described as having a cover that is movable over a larger target between two positions, it is envisioned that multiple covers and/or multiple covering positions may be provided to selectively expose more than two portions of a larger target to adapt the calibration system for three or more types or lengths or sizes of vehicles. Also, although shown and described as providing a movable cover at one target that is at the front or rear of the vehicle being calibrated, it is envisioned that a target or targets at the side or sides of the vehicle may also be selectively exposed/covered to adapt the calibration system for different width vehicles and/or for vehicles with the side cameras disposed at different locations along the vehicles (relative to the front and/or rear of the vehicle). It is further envisioned that the calibration system may comprise one or more calibration targets that are movable along the floor of the target zone to position the calibration target at an appropriate location relative to the camera or cameras of a vehicle at the calibration zone (for example, the target may mechanically move along the floor of the calibration station or target zone or may be otherwise mechanically reconfigured, or a target may be projected or displayed or reconfigured to provide the desired size and location of the targets at the floor of the calibration station or target zone). The calibration system or method of the present invention thus provides selective exposure and blocking/hiding of portions of at least one calibration target (or selective generation or reconfiguration of different sized and located targets) at a calibration zone to adapt the calibration system for different types or sizes of vehicles, such that the same or common processing or calibrating algorithm provides enhanced or optimal processing and calibration of the vehicle systems for two or more vehicle types or sizes.
Even though the vehicle vision system may be optimally calibrated when the vehicle leaves the production facility, one or more cameras of the vehicle vision system may become uncalibrated or decalibrated over time as the vehicle is driven and used, or when one or more cameras are replaced. When this happens, the decalibrated camera or cameras must be recalibrated to again function properly.
The present invention also provides an aftermarket or after end-of-line calibration process that calibrates one or more decalibrated cameras of a vehicle (such as at a service station or the like). The recalibration process or system of the present invention comprises use of a handheld laser calibration target tool that emits a beam pattern at an overlapping field of view area of two cameras, whereby the system may detect the location of the beam pattern in each of the two cameras' fields of view and adjust or recalibrate the decalibrated camera accordingly.
As an addition or alternative to the inventive system discussed above, the system may utilize not just the dedicated points of a projected pattern within the overlapping region, but also a pattern, preferably straight lines, that extends not just over the overlapping region but beyond its borders. It is suggested to place (by drawing, by sticking a target to the ground or, preferably, by projecting) straight lines 27 that start in the center region 28 of the field of view of the respective cameras at a distance from the vehicle and that end or meet or cross or intersect at a common point within the overlapping view region.
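A minimal sketch of how such projected straight lines could be used numerically: fit a line to the pixel samples extracted for each laser line, then intersect the two fits to recover the common point. The helper names and sample coordinates below are assumptions for illustration, not part of this disclosure:

```python
# Sketch: recover the common point of two extracted laser lines.
import numpy as np

def fit_line(points):
    """Least-squares fit y = m*x + b to an Nx2 array of (x, y) pixels."""
    m, b = np.polyfit(points[:, 0], points[:, 1], 1)
    return m, b

def intersection(line1, line2):
    """Intersection of y = m1*x + b1 and y = m2*x + b2."""
    (m1, b1), (m2, b2) = line1, line2
    x = (b2 - b1) / (m1 - m2)
    return np.array([x, m1 * x + b1])

# Example pixel samples along two laser lines (invented values).
line_a = fit_line(np.array([[10, 20], [60, 45], [110, 70]], float))
line_b = fit_line(np.array([[10, 90], [60, 60], [110, 30]], float))
print(intersection(line_a, line_b))  # common point used for calibration
```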
Typically, the limiting factor for discriminating differences within the center of a fisheye image is the pixel resolution. This means the smallest area that can be discriminated equates to the covering area of one pixel, plus or minus the range until a neighboring pixel becomes triggered. Assuming a round covering area, this is typically half a pixel's solid angle.
The covering area of one pixel, however, is typically more like a square. Since the covering area subtends a fixed solid angle (or two orthogonal angles describing the width and height of a square), the ground area covered by a pixel grows rapidly, approximately with the square of the distance.
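The growth of the covered area can be checked with a quick calculation; the pixel angular extent used here is an assumed example value, not a figure from this disclosure:

```python
# Back-of-the-envelope check: a pixel of fixed angular extent covers a patch
# whose side grows linearly, and whose area grows with the square of the
# viewing distance (small-angle approximation, normal incidence assumed).
import math

pixel_angle_rad = math.radians(0.1)  # assumed angular extent of one pixel

for distance_m in (1.0, 2.0, 4.0, 8.0):
    side = distance_m * pixel_angle_rad   # side length of the covered patch
    area = side ** 2                      # square covering area
    print(f"{distance_m:4.1f} m -> patch area {area * 1e6:8.2f} mm^2")
```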
According to another aspect of the present invention, the projection device may comprise a hand held device or a device mounted on a tripod or the like, which possesses a wired or wireless data connection attached to or linked to or in communication with the calibration system for instructing the person operating the hand held device if and how to change the projecting position, or for controlling the projection direction automatically or semi-automatically. The tripod or device may have actuators for this purpose. The operator may have to carry the projector around the vehicle at least once to calibrate all of the overlapping regions.
When calibrating cameras using the overlapping camera views, the process of localizing corresponding points in the overlapping camera views can be challenging. The probability of localizing a point in both views is higher for points at the ground plane and lower for points on an object or objects having a height or distance above the ground plane. Also, the resolution of images in the overlapping regions may be quite low, such that several points or characteristics may be needed to achieve sufficient precision by statistical analysis. Thus, successfully extracting points from a real world scene may be highly dependent on the surrounding textures or contours or objects at or near the vehicle and in the overlapping fields of view, and on the lighting conditions at or near the vehicle. Because of this, dedicated calibration targets, such as checkerboard targets, are typically relied upon to provide reliably detectable points.
The present invention provides a calibration process or system or method that utilizes a laser pointer with a characteristic well suited to point approximation (such as, for example, a cross or grid of lines or the like, which form points via the crossing or intersection of the laser lines). In order to calibrate the camera, the laser pointer is moved or established in the overlap region and images are captured nearly synchronously with both cameras (such as the front and side cameras).
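One plausible way (an illustrative assumption, not necessarily the implementation of this disclosure) to extract the cross point in each camera's frame is to threshold the bright laser pixels and intersect the two dominant Hough lines, for example with OpenCV:

```python
# Sketch: locate the laser cross in one grayscale camera frame.
import cv2
import numpy as np

def find_cross_point(gray):
    """Return the (x, y) pixel location of the laser cross, or None."""
    # The projected laser lines are much brighter than the scene.
    _, mask = cv2.threshold(gray, 200, 255, cv2.THRESH_BINARY)
    lines = cv2.HoughLines(mask, 1, np.pi / 180, 80)
    if lines is None or len(lines) < 2:
        return None
    # A robust implementation would cluster candidates; here we simply take
    # the two strongest lines (rho-theta form: x*cos(t) + y*sin(t) = r).
    (r1, t1), (r2, t2) = lines[0][0], lines[1][0]
    A = np.array([[np.cos(t1), np.sin(t1)],
                  [np.cos(t2), np.sin(t2)]])
    if abs(np.linalg.det(A)) < 1e-6:  # nearly parallel -> no stable cross
        return None
    x, y = np.linalg.solve(A, np.array([r1, r2]))
    return float(x), float(y)
```

Running this on near-synchronous frames from the two cameras yields a corresponding point pair in the overlap region.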
Optionally, point extraction by the cameras and system may be improved by pulsing the laser, whereby consecutive images may be captured with the laser enabled and then disabled. Subtracting one image from the other eliminates static content and highlights the laser lines in the overlapping region. Optionally, the laser may be synchronized with the image capturing frame rate, or may run at a certain frequency that is harmonized with the speed or frame rate of the cameras.
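The pulsed-laser differencing described above might be sketched as follows (assuming OpenCV, BGR frames, and that the caller pairs each laser-on frame with its matching laser-off frame):

```python
# Sketch: isolate the laser lines by differencing laser-on/laser-off frames.
import cv2

def laser_line_mask(frame_on, frame_off):
    """Difference two frames captured close together in time; static scene
    content cancels, leaving mainly the pulsed laser lines."""
    diff = cv2.absdiff(frame_on, frame_off)
    gray = cv2.cvtColor(diff, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 30, 255, cv2.THRESH_BINARY)
    return mask
```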
An aftermarket or service center (such as a Ford or General Motors or Toyota dealership or the like) may have to recalibrate a multi-camera system due to, for example, replacement of a broken-away camera-equipped exterior mirror that forms part of the subject vehicle's multi-camera system. The particular vehicle type (for example, a Mercedes C-Class or E-Class or M-Class or the like) will require its own particular calibration target or targets at the service center. In accordance with the present invention, a light projection system is provided at the dealership (or other location suitable for use as a camera calibration station) that projects the required lines or targets onto the ground adjacent the to-be-calibrated subject vehicle in the form of a projected light pattern or target, and the particular pattern projected is electronically selectable by the service center or technician in the appropriate form for the particular vehicle type being serviced. Preferably, the projected pattern on the ground is produced by use of a laser/movable mirror or a movable laser projection system that, by rapidly moving a laser light beam, effectively “writes” the desired pattern onto the floor adjacent the subject vehicle. In accordance with this aspect of the present invention, the service technician need only key in or input the particular vehicle model or type to be calibrated, and the projection system of the present invention then automatically and appropriately generates or “writes” the desired or appropriate or selected calibration pattern on the ground in a manner appropriate for multi-camera calibration for that particular vehicle model. Alternatively, any light projector, such as one with an appropriate mask, may be used (such as a mask that is electronically reconfigurable, such as by using a transmissive liquid crystal panel or the like).
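The technician-facing selection step could be as simple as a lookup from the keyed-in vehicle model to a stored pattern definition that the projector then writes on the floor; the model names and file paths below are invented placeholders:

```python
# Hypothetical sketch: map a keyed-in vehicle model to a stored calibration
# pattern definition for the laser/mirror projector. Values are invented.
PATTERNS = {
    "C-Class": "patterns/c_class_surround.json",
    "E-Class": "patterns/e_class_surround.json",
    "M-Class": "patterns/m_class_surround.json",
}

def select_pattern(model: str) -> str:
    try:
        return PATTERNS[model]
    except KeyError:
        raise ValueError(f"No calibration pattern stored for model {model!r}")

print(select_pattern("E-Class"))
```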
Thus, the present invention provides an after end-of-line calibration process or system or method (such as an aftermarket calibration system or the like at a vehicle service station or the like) that enhances detection of common points in the overlapping regions of two adjacent cameras of the vehicle (such as an overlapping field of view portion of the front and side cameras or the rear and side cameras). The system or procedure may comprise, for example, parking the vehicle on a generally flat surface with about a two meter open or un-blocked space around the perimeter of the vehicle, at least at the fields of view of the decalibrated camera and the adjacent reference camera. The calibration function is activated for a particular camera (such as a decalibrated camera or a camera that is out of calibration by a threshold degree) or for the entire vision system. The system may prompt a service technician or user to generate or present the calibration target at a particular overlapping region by the vehicle, whereby the service technician or user may enable and use the handheld laser calibration target tool and may direct or place the focus point of the emitted grid or cross or the like in the overlap region of the two cameras. The characteristic of the created or generated target will show pairs of lines in both of the camera views that encompass the overlapping region.
If the system detects the focus point of the calibration target with good or sufficient precision, the system may acknowledge such detection, such as with an alert or visual or audible signal, so that the service technician knows that the focus point was detected. The system may then prompt the user or service technician to modify the location or position of the focus point (while maintaining the focus point in the overlapping region). This procedure may be repeated multiple times until enough points are detected and extracted for both or all of the cameras that are necessary to calibrate the camera or cameras. When the system has detected and captured images of a sufficient number of focus points, the service calibration system is operable to calculate and calibrate the cameras from the focus point image data. Because the focus points are at the ground level by the vehicle and are actual and consistent lines and points (at the intersection of the laser lines), the present invention provides enhanced detection and recognition and processing of the focus points in the captured images and thus enhanced processing of the image data and calibration of the cameras.
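Once enough corresponding focus points have been collected in both views, a standard way (an illustrative assumption, not necessarily the method of this disclosure) to relate the decalibrated camera to the reference camera is to estimate a ground-plane homography from the point pairs, for example with OpenCV:

```python
# Sketch: relate decalibrated and reference camera views of the ground plane.
import cv2
import numpy as np

# Example focus-point correspondences (placeholder values only; a real run
# would use the cross points detected in the two camera views).
pts_decalibrated = np.array([[100, 200], [340, 210], [330, 420], [120, 400]],
                            dtype=np.float32)
pts_reference = np.array([[90, 190], [350, 200], [345, 430], [110, 410]],
                         dtype=np.float32)

# At least four corresponding ground-plane points are needed.
H, _ = cv2.findHomography(pts_decalibrated, pts_reference)
print("estimated ground-plane homography:\n", H)
```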
Thus, using the laser focus point system of the present invention, the point extraction is independent of the ground texture or contours or objects surrounding the vehicle and is independent of the lighting conditions at or surrounding the vehicle. Also, during the calibration procedure, the number of extracted points can be readily increased, and the precision of the extracted extrinsic parameters may be adjusted. With the use of the laser focus points, the focus points can be generated or projected or emitted at any locations in the overlapping portion of the fields of view, and no alignment of targets is necessary. The images from the different camera views should be captured nearly synchronously, and, when using differential images, at least two images are stored in internal memory during the calibration process. At least one reference camera with known extrinsic parameters is necessary, and then the adjacent camera or cameras can be calibrated accordingly.
The imaging sensor or camera that captures the image data for image processing may comprise any suitable camera or sensing device, such as, for example, an array of a plurality of photosensor elements arranged in at least 640 columns and at least 480 rows (at least a 640×480 imaging array), with a respective lens focusing images onto respective portions of the array. The photosensor array may comprise a plurality of photosensor elements arranged in a photosensor array having rows and columns. The logic and control circuit of the imaging sensor may function in any known manner, and the image processing and algorithmic processing may comprise any suitable means for processing the images and/or image data. For example, the vision system and/or processing and/or camera and/or circuitry may utilize aspects described in U.S. Pat. Nos. 7,005,974; 5,760,962; 5,877,897; 5,796,094; 5,949,331; 6,222,447; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978; 7,859,565; 5,550,677; 5,670,935; 6,636,258; 7,145,519; 7,161,616; 7,230,640; 7,248,283; 7,295,229; 7,301,466; 7,592,928; 7,881,496; 7,720,580; 7,038,577; 6,882,287; 5,929,786 and/or 5,786,772, and/or PCT Application No. PCT/US2010/047256, filed Aug. 31, 2010 and published Mar. 10, 2011 as International Publication No. WO 2011/028686 and/or International Publication No. WO 2010/099416, published Sep. 2, 2010, and/or PCT Application No. PCT/US10/25545, filed Feb. 26, 2010 and published Sep. 2, 2010 as International Publication No. WO 2010/099416, and/or PCT Application No. PCT/US2012/048800, filed Jul. 30, 2012, and/or PCT Application No. PCT/US2012/048110, filed Jul. 25, 2012, and/or PCT Application No. PCT/CA2012/000378, filed Apr. 25, 2012, and/or PCT Application No. PCT/US2012/056014, filed Sep. 19, 2012, and/or PCT Application No. PCT/US12/57007, filed Sep. 25, 2012, and/or PCT Application No. PCT/US2012/061548, filed Oct. 24, 2012, and/or PCT Application No. PCT/US2012/062906, filed Nov. 1, 2012, and/or PCT Application No. PCT/US2012/063520, filed Nov. 5, 2012, and/or U.S. patent application Ser. No. 13/660,306, filed Oct. 25, 2012; Ser. No. 13/653,577, filed Oct. 17, 2012; and/or Ser. No. 13/534,657, filed Jun. 27, 2012, and/or U.S. provisional application Ser. No. 61/710,924, filed Oct. 8, 2012; Ser. No. 61/696,416, filed Sep. 4, 2012; Ser. No. 61/682,995, filed Aug. 14, 2012; Ser. No. 61/682,486, filed Aug. 13, 2012; Ser. No. 61/680,883, filed Aug. 8, 2012; Ser. No. 61/678,375, filed Aug. 1, 2012; Ser. No. 61/676,405, filed Jul. 27, 2012; Ser. No. 61/666,146, filed Jun. 29, 2012; Ser. No. 61/653,665, filed May 31, 2012; Ser. No. 61/653,664, filed May 31, 2012; Ser. No. 61/648,744, filed May 18, 2012; Ser. No. 61/624,507, filed Apr. 16, 2012; Ser. No. 61/616,126, filed Mar. 27, 2012; Ser. No. 61/615,410, filed Mar. 26, 2012; Ser. No. 61/613,651, filed Mar. 21, 2012; Ser. No. 61/607,229, filed Mar. 6, 2012; Ser. No. 61/605,409, filed Mar. 1, 2012; Ser. No. 61/602,878, filed Feb. 24, 2012; Ser. No. 61/602,876, filed Feb. 24, 2012; Ser. No. 61/600,205, filed Feb. 17, 2012; Ser. No. 61/588,833, filed Jan. 20, 2012; Ser. No. 61/583,381, filed Jan. 5, 2012; Ser. No. 61/579,682, filed Dec. 23, 2011; Ser. No. 61/570,017, filed Dec. 13, 2011; Ser. No. 61/568,791, filed Dec. 9, 2011; Ser. No. 61/567,446, filed Dec. 6, 2011; Ser. No. 61/567,150, filed Dec. 6, 2011; Ser. No. 61/565,713, filed Dec. 1, 2011; Ser. No. 61/563,965, filed Nov. 28, 2011, which are all hereby incorporated herein by reference in their entireties. The system may communicate with other communication systems via any suitable means, such as by utilizing aspects of the systems described in PCT Application No. PCT/US10/038477, filed Jun. 14, 2010, and/or U.S. patent application Ser. No. 13/202,005, filed Aug. 17, 2011, and/or U.S. provisional application Ser. No. 61/650,667, filed May 23, 2012; Ser. No. 61/579,682, filed Dec. 23, 2011; Ser. No. 61/565,713, filed Dec. 1, 2011, which are hereby incorporated herein by reference in their entireties.
The imaging device and control and image processor and any associated illumination source, if applicable, may comprise any suitable components, and may utilize aspects of the cameras and vision systems described in U.S. Pat. Nos. 5,550,677; 5,877,897; 6,498,620; 5,670,935; 5,796,094; 6,396,397; 6,806,452; 6,690,268; 7,005,974; 7,123,168; 7,004,606; 6,946,978; 7,038,577; 6,353,392; 6,320,176; 6,313,454; and 6,824,281, and/or International Publication No. WO 2010/099416, published Sep. 2, 2010, and/or PCT Application No. PCT/US10/47256, filed Aug. 31, 2010 and published Mar. 10, 2011 as International Publication No. WO 2011/028686, and/or U.S. patent application Ser. No. 12/508,840, filed Jul. 24, 2009, and published Jan. 28, 2010 as U.S. Pat. Publication No. US 2010-0020170, and/or PCT Application No. PCT/US2012/048110, filed Jul. 25, 2012, and/or U.S. patent application Ser. No. 13/534,657, filed Jun. 27, 2012, which are all hereby incorporated herein by reference in their entireties. The camera or cameras may comprise any suitable cameras or imaging sensors or camera modules, and may utilize aspects of the cameras or sensors described in U.S. patent application Ser. No. 12/091,359, filed Apr. 24, 2008 and published Oct. 1, 2009 as U.S. Publication No. US-2009-0244361, and/or Ser. No. 13/260,400, filed Sep. 26, 2011, and/or U.S. Pat. Nos. 7,965,336 and/or 7,480,149, which are hereby incorporated herein by reference in their entireties. The imaging array sensor may comprise any suitable sensor, and may utilize various imaging sensors or imaging array sensors or cameras or the like, such as a CMOS imaging array sensor, a CCD sensor or other sensors or the like, such as the types described in U.S. Pat. Nos. 5,550,677; 5,670,935; 5,760,962; 5,715,093; 5,877,897; 6,922,292; 6,757,109; 6,717,610; 6,590,719; 6,201,642; 6,498,620; 5,796,094; 6,097,023; 6,320,176; 6,559,435; 6,831,261; 6,806,452; 6,396,397; 6,822,563; 6,946,978; 7,339,149; 7,038,577; 7,004,606; and/or 7,720,580, and/or U.S. patent application Ser. No. 10/534,632, filed May 11, 2005, now U.S. Pat. No. 7,965,336; and/or PCT Application No. PCT/US2008/076022, filed Sep. 11, 2008 and published Mar. 19, 2009 as International Publication No. WO/2009/036176, and/or PCT Application No. PCT/US2008/078700, filed Oct. 3, 2008 and published Apr. 9, 2009 as International Publication No. WO/2009/046268, which are all hereby incorporated herein by reference in their entireties.
The camera module and circuit chip or board and imaging sensor may be implemented and operated in connection with various vehicular vision-based systems, and/or may be operable utilizing the principles of such other vehicular systems, such as a vehicle headlamp control system, such as the type disclosed in U.S. Pat. Nos. 5,796,094; 6,097,023; 6,320,176; 6,559,435; 6,831,261; 7,004,606; 7,339,149; and/or 7,526,103, which are all hereby incorporated herein by reference in their entireties, a rain sensor, such as the types disclosed in commonly assigned U.S. Pat. Nos. 6,353,392; 6,313,454; 6,320,176; and/or 7,480,149, which are hereby incorporated herein by reference in their entireties, a vehicle vision system, such as a forwardly, sidewardly or rearwardly directed vehicle vision system utilizing principles disclosed in U.S. Pat. Nos. 5,550,677; 5,670,935; 5,760,962; 5,877,897; 5,949,331; 6,222,447; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978; and/or 7,859,565, which are all hereby incorporated herein by reference in their entireties, a trailer hitching aid or tow check system, such as the type disclosed in U.S. Pat. No. 7,005,974, which is hereby incorporated herein by reference in its entirety, a reverse or sideward imaging system, such as for a lane change assistance system or lane departure warning system or for a blind spot or object detection system, such as imaging or detection systems of the types disclosed in U.S. Pat. Nos. 7,720,580; 7,038,577; 5,929,786 and/or 5,786,772, and/or U.S. patent application Ser. No. 11/239,980, filed Sep. 30, 2005, now U.S. Pat. No. 7,881,496, and/or U.S. provisional application Ser. No. 60/628,709, filed Nov. 17, 2004; Ser. No. 60/614,644, filed Sep. 30, 2004; Ser. No. 60/618,686, filed Oct. 14, 2004; Ser. No. 60/638,687, filed Dec. 23, 2004, which are hereby incorporated herein by reference in their entireties, a video device for internal cabin surveillance and/or video telephone function, such as disclosed in U.S. Pat. Nos. 5,760,962; 5,877,897; 6,690,268; and/or 7,370,983, and/or U.S. patent application Ser. No. 10/538,724, filed Jun. 13, 2005 and published Mar. 9, 2006 as U.S. Publication No. US-2006-0050018, which are hereby incorporated herein by reference in their entireties, a traffic sign recognition system, a system for determining a distance to a leading or trailing vehicle or object, such as a system utilizing the principles disclosed in U.S. Pat. Nos. 6,396,397 and/or 7,123,168, which are hereby incorporated herein by reference in their entireties, and/or the like.
Optionally, the circuit board or chip may include circuitry for the imaging array sensor and or other electronic accessories or features, such as by utilizing compass-on-a-chip or EC driver-on-a-chip technology and aspects such as described in U.S. Pat. No. 7,255,451 and/or U.S. Pat. No. 7,480,149; and/or U.S. patent application Ser. No. 11/226,628, filed Sep. 14, 2005 and published Mar. 23, 2006 as U.S. Publication No. US-2006-0061008, and/or Ser. No. 12/578,732, filed Oct. 14, 2009, which are hereby incorporated herein by reference in their entireties.
Optionally, the vision system may include a display for displaying images captured by one or more of the imaging sensors for viewing by the driver of the vehicle while the driver is normally operating the vehicle. Optionally, for example, the vision system may include a video display device disposed at or in the interior rearview mirror assembly of the vehicle, such as by utilizing aspects of the video mirror display systems described in U.S. Pat. No. 6,690,268 and/or U.S. patent application Ser. No. 13/333,337, filed Dec. 21, 2011, which are hereby incorporated herein by reference in their entireties. The video mirror display may comprise any suitable devices and systems and optionally may utilize aspects of the compass display systems described in U.S. Pat. Nos. 7,370,983; 7,329,013; 7,308,341; 7,289,037; 7,249,860; 7,004,593; 4,546,551; 5,699,044; 4,953,305; 5,576,687; 5,632,092; 5,677,851; 5,708,410; 5,737,226; 5,802,727; 5,878,370; 6,087,953; 6,173,508; 6,222,460; 6,513,252; and/or 6,642,851, and/or European patent application, published Oct. 11, 2000 under Publication No. EP 0 1043566, and/or U.S. patent application Ser. No. 11/226,628, filed Sep. 14, 2005 and published Mar. 23, 2006 as U.S. Publication No. US-2006-0061008, which are all hereby incorporated herein by reference in their entireties. Optionally, the video mirror display screen or device may be operable to display images captured by a rearward viewing camera of the vehicle during a reversing maneuver of the vehicle (such as responsive to the vehicle gear actuator being placed in a reverse gear position or the like) to assist the driver in backing up the vehicle, and optionally may be operable to display the compass heading or directional heading character or icon when the vehicle is not undertaking a reversing maneuver, such as when the vehicle is being driven in a forward direction along a road (such as by utilizing aspects of the display system described in PCT Application No. PCT/US2011/056295, filed Oct. 14, 2011 and published Apr. 19, 2012 as International Publication No. WO 2012/051500, which is hereby incorporated herein by reference in its entirety).
Optionally, the vision system (utilizing the forward facing camera and a rearward facing camera and other cameras disposed at the vehicle with exterior fields of view) may be part of or may provide a display of a top-down view or birds-eye view system of the vehicle or a surround view at the vehicle, such as by utilizing aspects of the vision systems described in PCT Application No. PCT/US10/25545, filed Feb. 26, 2010 and published on Sep. 2, 2010 as International Publication No. WO 2010/099416, and/or PCT Application No. PCT/US10/47256, filed Aug. 31, 2010 and published Mar. 10, 2011 as International Publication No. WO 2011/028686, and/or PCT Application No. PCT/US2011/062834, filed Dec. 1, 2011 and published Jun. 7, 2012 as International Publication No. WO2012/075250, and/or PCT Application No. PCT/US2012/048993, filed Jul. 31, 2012, and/or PCT Application No. PCT/US11/62755, filed Dec. 1, 2011 and published Jun. 7, 2012 as International Publication No. WO 2012-075250, and/or PCT Application No. PCT/CA2012/000378, filed Apr. 25, 2012, and/or U.S. patent application Ser. No. 13/333,337, filed Dec. 21, 2011, and/or U.S. provisional application Ser. No. 61/615,410, filed Mar. 26, 2012; Ser. No. 61/588,833, filed Jan. 20, 2012; Ser. No. 61/570,017, filed Dec. 13, 2011; and/or Ser. No. 61/568,791, filed Dec. 9, 2011, which are hereby incorporated herein by reference in their entireties.
Optionally, the video mirror display may be disposed rearward of and behind the reflective element assembly and may comprise a display such as the types disclosed in U.S. Pat. Nos. 5,530,240; 6,329,925; 7,855,755; 7,626,749; 7,581,859; 7,446,650; 7,370,983; 7,338,177; 7,274,501; 7,255,451; 7,195,381; 7,184,190; 5,668,663; 5,724,187 and/or 6,690,268, and/or in U.S. patent application Ser. No. 12/091,525, filed Apr. 25, 2008, now U.S. Pat. No. 7,855,755; Ser. No. 11/226,628, filed Sep. 14, 2005 and published Mar. 23, 2006 as U.S. Publication No. US-2006-0061008; and/or Ser. No. 10/538,724, filed Jun. 13, 2005 and published Mar. 9, 2006 as U.S. Publication No. US-2006-0050018, which are all hereby incorporated herein by reference in their entireties. The display is viewable through the reflective element when the display is activated to display information. The display element may be any type of display element, such as a vacuum fluorescent (VF) display element, a light emitting diode (LED) display element, such as an organic light emitting diode (OLED) or an inorganic light emitting diode, an electroluminescent (EL) display element, a liquid crystal display (LCD) element, a video screen display element or backlit thin film transistor (TFT) display element or the like, and may be operable to display various information (as discrete characters, icons or the like, or in a multi-pixel manner) to the driver of the vehicle, such as passenger side inflatable restraint (PSIR) information, tire pressure status, and/or the like. The mirror assembly and/or display may utilize aspects described in U.S. Pat. Nos. 7,184,190; 7,255,451; 7,446,924 and/or 7,338,177, which are all hereby incorporated herein by reference in their entireties. The thicknesses and materials of the coatings on the substrates of the reflective element may be selected to provide a desired color or tint to the mirror reflective element, such as a blue colored reflector, such as is known in the art and such as described in U.S. Pat. Nos. 5,910,854; 6,420,036; and/or 7,274,501, which are hereby incorporated herein by reference in their entireties.
Optionally, the display or displays and any associated user inputs may be associated with various accessories or systems, such as, for example, a tire pressure monitoring system or a passenger air bag status or a garage door opening system or a telematics system or any other accessory or system of the mirror assembly or of the vehicle or of an accessory module or console of the vehicle, such as an accessory module or console of the types described in U.S. Pat. Nos. 7,289,037; 6,877,888; 6,824,281; 6,690,268; 6,672,744; 6,386,742; and 6,124,886, and/or U.S. patent application Ser. No. 10/538,724, filed Jun. 13, 2005 and published Mar. 9, 2006 as U.S. Publication No. US-2006-0050018, which are hereby incorporated herein by reference in their entireties.
Changes and modifications to the specifically described embodiments may be carried out without departing from the principles of the present invention, which is intended to be limited only by the scope of the appended claims as interpreted according to the principles of patent law.
The present application is a 371 national phase filing of PCT Application No. PCT/US2012/064980, filed Nov. 14, 2012, which claims the filing benefit of U.S. provisional application Ser. No. 61/559,970, filed Nov. 15, 2011, which is hereby incorporated herein by reference in its entirety.
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/US2012/064980 | 11/14/2012 | WO | 00 |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO2013/074604 | 5/23/2013 | WO | A |
Number | Name | Date | Kind |
---|---|---|---|
4961625 | Wood et al. | Oct 1990 | A |
4966441 | Conner | Oct 1990 | A |
4967319 | Seko | Oct 1990 | A |
4970653 | Kenue | Nov 1990 | A |
5003288 | Wilhelm | Mar 1991 | A |
5059877 | Teder | Oct 1991 | A |
5064274 | Alten | Nov 1991 | A |
5072154 | Chen | Dec 1991 | A |
5096287 | Kakinami et al. | Mar 1992 | A |
5148014 | Lynam | Sep 1992 | A |
5166681 | Bottesch et al. | Nov 1992 | A |
5177606 | Koshizawa | Jan 1993 | A |
5182502 | Slotkowski et al. | Jan 1993 | A |
5193029 | Schofield | Mar 1993 | A |
5204778 | Bechtel | Apr 1993 | A |
5208701 | Maeda | May 1993 | A |
5208750 | Kurami et al. | May 1993 | A |
5214408 | Asayama | May 1993 | A |
5243524 | Ishida et al. | Sep 1993 | A |
5245422 | Borcherts et al. | Sep 1993 | A |
5276389 | Levers | Jan 1994 | A |
5289321 | Secor | Feb 1994 | A |
5305012 | Faris | Apr 1994 | A |
5307136 | Saneyoshi | Apr 1994 | A |
5351044 | Mathur et al. | Sep 1994 | A |
5355118 | Fukuhara | Oct 1994 | A |
5386285 | Asayama | Jan 1995 | A |
5406395 | Wilson et al. | Apr 1995 | A |
5408346 | Trissel et al. | Apr 1995 | A |
5414461 | Kishi et al. | May 1995 | A |
5426294 | Kobayashi et al. | Jun 1995 | A |
5430431 | Nelson | Jul 1995 | A |
5434407 | Bauer et al. | Jul 1995 | A |
5440428 | Hegg et al. | Aug 1995 | A |
5444478 | Lelong et al. | Aug 1995 | A |
5451822 | Bechtel et al. | Sep 1995 | A |
5469298 | Suman et al. | Nov 1995 | A |
5530420 | Tsuchiya et al. | Jun 1996 | A |
5535144 | Kise | Jul 1996 | A |
5535314 | Alves et al. | Jul 1996 | A |
5537003 | Bechtel et al. | Jul 1996 | A |
5539397 | Asanuma et al. | Jul 1996 | A |
5550677 | Schofield et al. | Aug 1996 | A |
5555555 | Sato et al. | Sep 1996 | A |
5568027 | Teder | Oct 1996 | A |
5574443 | Hsieh | Nov 1996 | A |
5648835 | Uzawa | Jul 1997 | A |
5661303 | Teder | Aug 1997 | A |
5670935 | Schofield et al. | Sep 1997 | A |
5699044 | Van Lente et al. | Dec 1997 | A |
5724316 | Brunts | Mar 1998 | A |
5737226 | Olson et al. | Apr 1998 | A |
5757949 | Kinoshita et al. | May 1998 | A |
5760826 | Nayar | Jun 1998 | A |
5760962 | Schofield et al. | Jun 1998 | A |
5761094 | Olson et al. | Jun 1998 | A |
5765116 | Wilson-Jones et al. | Jun 1998 | A |
5781437 | Wiemer et al. | Jul 1998 | A |
5786772 | Schofield et al. | Jul 1998 | A |
5790403 | Nakayama | Aug 1998 | A |
5790973 | Blaker et al. | Aug 1998 | A |
5796094 | Schofield et al. | Aug 1998 | A |
5837994 | Stam et al. | Nov 1998 | A |
5845000 | Breed et al. | Dec 1998 | A |
5848802 | Breed et al. | Dec 1998 | A |
5850176 | Kinoshita et al. | Dec 1998 | A |
5850254 | Takano et al. | Dec 1998 | A |
5867591 | Onda | Feb 1999 | A |
5877707 | Kowalick | Mar 1999 | A |
5877897 | Schofield et al. | Mar 1999 | A |
5878370 | Olson | Mar 1999 | A |
5896085 | Mori et al. | Apr 1999 | A |
5920367 | Kajimoto et al. | Jul 1999 | A |
5923027 | Stam et al. | Jul 1999 | A |
5929786 | Schofield et al. | Jul 1999 | A |
5956181 | Lin | Sep 1999 | A |
6049171 | Stam et al. | Apr 2000 | A |
6052124 | Stein et al. | Apr 2000 | A |
6066933 | Ponziana | May 2000 | A |
6084519 | Coulling et al. | Jul 2000 | A |
6091833 | Yasui et al. | Jul 2000 | A |
6097024 | Stam et al. | Aug 2000 | A |
6100811 | Hsu et al. | Aug 2000 | A |
6175300 | Kendrick | Jan 2001 | B1 |
6198409 | Schofield et al. | Mar 2001 | B1 |
6201642 | Bos | Mar 2001 | B1 |
6226061 | Tagusa | May 2001 | B1 |
6259423 | Tokito et al. | Jul 2001 | B1 |
6266082 | Yonezawa et al. | Jul 2001 | B1 |
6266442 | Laumeyer et al. | Jul 2001 | B1 |
6285393 | Shimoura et al. | Sep 2001 | B1 |
6285778 | Nakajima et al. | Sep 2001 | B1 |
6294989 | Schofield et al. | Sep 2001 | B1 |
6297781 | Turnbull et al. | Oct 2001 | B1 |
6310611 | Caldwell | Oct 2001 | B1 |
6313454 | Bos et al. | Nov 2001 | B1 |
6317057 | Lee | Nov 2001 | B1 |
6320282 | Caldwell | Nov 2001 | B1 |
6353392 | Schofield et al. | Mar 2002 | B1 |
6370329 | Teuchert | Apr 2002 | B1 |
6396397 | Bos et al. | May 2002 | B1 |
6411204 | Bloomfield et al. | Jun 2002 | B1 |
6424273 | Gutta et al. | Jul 2002 | B1 |
6445287 | Schofield et al. | Sep 2002 | B1 |
6477464 | McCarthy et al. | Nov 2002 | B2 |
6498620 | Schofield et al. | Dec 2002 | B2 |
6515378 | Drummond et al. | Feb 2003 | B2 |
6516664 | Lynam | Feb 2003 | B2 |
6553130 | Lemelson et al. | Apr 2003 | B1 |
6570998 | Ohtsuka et al. | May 2003 | B1 |
6574033 | Chui et al. | Jun 2003 | B1 |
6578017 | Ebersole et al. | Jun 2003 | B1 |
6587573 | Stam et al. | Jul 2003 | B1 |
6589625 | Kothari et al. | Jul 2003 | B1 |
6593011 | Liu et al. | Jul 2003 | B2 |
6593565 | Heslin et al. | Jul 2003 | B2 |
6593698 | Stam et al. | Jul 2003 | B2 |
6594583 | Ogura et al. | Jul 2003 | B2 |
6611610 | Stam et al. | Aug 2003 | B1 |
6627918 | Getz et al. | Sep 2003 | B2 |
6631316 | Stam et al. | Oct 2003 | B2 |
6631994 | Suzuki et al. | Oct 2003 | B2 |
6636258 | Strumolo | Oct 2003 | B2 |
6648477 | Hutzel et al. | Nov 2003 | B2 |
6650233 | DeLine et al. | Nov 2003 | B2 |
6650455 | Miles | Nov 2003 | B2 |
6672731 | Schnell et al. | Jan 2004 | B2 |
6674562 | Miles | Jan 2004 | B1 |
6678056 | Downs | Jan 2004 | B2 |
6678614 | McCarthy et al. | Jan 2004 | B2 |
6680792 | Miles | Jan 2004 | B2 |
6690268 | Schofield et al. | Feb 2004 | B2 |
6700605 | Toyoda et al. | Mar 2004 | B1 |
6703925 | Steffel | Mar 2004 | B2 |
6704621 | Stein et al. | Mar 2004 | B1 |
6710908 | Miles et al. | Mar 2004 | B2 |
6711474 | Treyz et al. | Mar 2004 | B1 |
6714331 | Lewis et al. | Mar 2004 | B2 |
6717610 | Bos et al. | Apr 2004 | B1 |
6735506 | Breed et al. | May 2004 | B2 |
6741377 | Miles | May 2004 | B2 |
6744353 | Sjönell | Jun 2004 | B2 |
6757109 | Bos | Jun 2004 | B2 |
6762867 | Lippert et al. | Jul 2004 | B2 |
6794119 | Miles | Sep 2004 | B2 |
6795221 | Urey | Sep 2004 | B1 |
6806452 | Bos et al. | Oct 2004 | B2 |
6807287 | Hermans | Oct 2004 | B1 |
6822563 | Bos et al. | Nov 2004 | B2 |
6823241 | Shirato et al. | Nov 2004 | B2 |
6824281 | Schofield et al. | Nov 2004 | B2 |
6864930 | Matsushita et al. | Mar 2005 | B2 |
6882287 | Schofield | Apr 2005 | B2 |
6889161 | Winner et al. | May 2005 | B2 |
6909753 | Meehan et al. | Jun 2005 | B2 |
6946978 | Schofield | Sep 2005 | B2 |
6968736 | Lynam | Nov 2005 | B2 |
6975775 | Rykowski et al. | Dec 2005 | B2 |
7004606 | Schofield | Feb 2006 | B2 |
7038577 | Pawlicki et al. | May 2006 | B2 |
7062300 | Kim | Jun 2006 | B1 |
7065432 | Moisel et al. | Jun 2006 | B2 |
7085637 | Breed et al. | Aug 2006 | B2 |
7092548 | Laumeyer et al. | Aug 2006 | B2 |
7113867 | Stein | Sep 2006 | B1 |
7116246 | Winter et al. | Oct 2006 | B2 |
7123168 | Schofield | Oct 2006 | B2 |
7133661 | Hatae et al. | Nov 2006 | B2 |
7149613 | Stam et al. | Dec 2006 | B2 |
7151996 | Stein | Dec 2006 | B2 |
7167796 | Taylor et al. | Jan 2007 | B2 |
7195381 | Lynam et al. | Mar 2007 | B2 |
7196721 | Uchiyama | Mar 2007 | B2 |
7202776 | Breed | Apr 2007 | B2 |
7227459 | Bos et al. | Jun 2007 | B2 |
7227611 | Hull et al. | Jun 2007 | B2 |
7325934 | Schofield et al. | Feb 2008 | B2 |
7325935 | Schofield et al. | Feb 2008 | B2 |
7338177 | Lynam | Mar 2008 | B2 |
7375803 | Bamji | May 2008 | B1 |
7380948 | Schofield et al. | Jun 2008 | B2 |
7388182 | Schofield et al. | Jun 2008 | B2 |
7423821 | Bechtel et al. | Sep 2008 | B2 |
7425076 | Schofield et al. | Sep 2008 | B2 |
7526103 | Schofield et al. | Apr 2009 | B2 |
7541743 | Salmeen et al. | Jun 2009 | B2 |
7565006 | Stam et al. | Jul 2009 | B2 |
7566851 | Stein et al. | Jul 2009 | B2 |
7605856 | Imoto | Oct 2009 | B2 |
7619508 | Lynam et al. | Nov 2009 | B2 |
7720580 | Higgins-Luthman | May 2010 | B2 |
7786898 | Stein et al. | Aug 2010 | B2 |
7792329 | Schofield et al. | Sep 2010 | B2 |
7843451 | Lafon | Nov 2010 | B2 |
7855778 | Yung et al. | Dec 2010 | B2 |
7881496 | Camilleri | Feb 2011 | B2 |
7914187 | Higgins-Luthman et al. | Mar 2011 | B2 |
7914188 | DeLine et al. | Mar 2011 | B2 |
7930160 | Hosagrahara et al. | Apr 2011 | B1 |
7949486 | Denny et al. | May 2011 | B2 |
8017898 | Lu et al. | Sep 2011 | B2 |
8064643 | Stein et al. | Nov 2011 | B2 |
8082101 | Stein et al. | Dec 2011 | B2 |
8100568 | DeLine et al. | Jan 2012 | B2 |
8164628 | Stein et al. | Apr 2012 | B2 |
8224031 | Saito | Jul 2012 | B2 |
8233045 | Luo et al. | Jul 2012 | B2 |
8254635 | Stein et al. | Aug 2012 | B2 |
8300886 | Hoffmann | Oct 2012 | B2 |
8378851 | Stein et al. | Feb 2013 | B2 |
8421865 | Euler et al. | Apr 2013 | B2 |
8452055 | Stein et al. | May 2013 | B2 |
8534887 | DeLine et al. | Sep 2013 | B2 |
8553088 | Stein et al. | Oct 2013 | B2 |
8947533 | Bandou | Feb 2015 | B2 |
9025819 | Sung | May 2015 | B2 |
9193303 | Higgins-Luthman | Nov 2015 | B2 |
9275458 | Oh | Mar 2016 | B2 |
20020005778 | Breed | Jan 2002 | A1 |
20020011611 | Huang et al. | Jan 2002 | A1 |
20020113873 | Williams | Aug 2002 | A1 |
20030103142 | Hitomi et al. | Jun 2003 | A1 |
20030137586 | Lewellen | Jul 2003 | A1 |
20030222982 | Hamdan et al. | Dec 2003 | A1 |
20040130702 | Jupp | Jul 2004 | A1 |
20040135992 | Munro | Jul 2004 | A1 |
20040164228 | Fogg et al. | Aug 2004 | A1 |
20050219852 | Stam et al. | Oct 2005 | A1 |
20050237385 | Kosaka et al. | Oct 2005 | A1 |
20060050018 | Hutzel et al. | Mar 2006 | A1 |
20060091813 | Stam et al. | May 2006 | A1 |
20060103727 | Tseng | May 2006 | A1 |
20060250501 | Widmann et al. | Nov 2006 | A1 |
20070024724 | Stein et al. | Feb 2007 | A1 |
20070104476 | Yasutomi et al. | May 2007 | A1 |
20070154067 | Laumeyer | Jul 2007 | A1 |
20070242339 | Bradley | Oct 2007 | A1 |
20080043099 | Stein et al. | Feb 2008 | A1 |
20080147321 | Howard et al. | Jun 2008 | A1 |
20080192132 | Bechtel et al. | Aug 2008 | A1 |
20080266396 | Stein | Oct 2008 | A1 |
20090113509 | Tseng et al. | Apr 2009 | A1 |
20090160987 | Bechtel et al. | Jun 2009 | A1 |
20090190015 | Bechtel et al. | Jul 2009 | A1 |
20090256938 | Bechtel et al. | Oct 2009 | A1 |
20090290032 | Zhang et al. | Nov 2009 | A1 |
20100194886 | Asari et al. | Aug 2010 | A1 |
20100238291 | Pavlov et al. | Sep 2010 | A1 |
20100253784 | Oleg | Oct 2010 | A1 |
20110216201 | McAndrew et al. | Sep 2011 | A1 |
20120045112 | Lundblad et al. | Feb 2012 | A1 |
20120069185 | Stein | Mar 2012 | A1 |
20120200707 | Stein et al. | Aug 2012 | A1 |
20120314071 | Rosenbaum et al. | Dec 2012 | A1 |
20120320209 | Vico | Dec 2012 | A1 |
20130141580 | Stein et al. | Jun 2013 | A1 |
20130147957 | Stein | Jun 2013 | A1 |
20130169812 | Lu et al. | Jul 2013 | A1 |
20130286193 | Pflug | Oct 2013 | A1 |
20140043473 | Rathi et al. | Feb 2014 | A1 |
20140063254 | Shi et al. | Mar 2014 | A1 |
20140098229 | Lu et al. | Apr 2014 | A1 |
20140160276 | Pliefke | Jun 2014 | A1 |
20140247352 | Rathi et al. | Sep 2014 | A1 |
20140247354 | Knudsen | Sep 2014 | A1 |
20140320658 | Pliefke | Oct 2014 | A1 |
20140333729 | Pflug | Nov 2014 | A1 |
20140347486 | Okouneva | Nov 2014 | A1 |
20140350834 | Turk | Nov 2014 | A1 |
20150217693 | Pliefke | Aug 2015 | A1 |
Number | Date | Country |
---|---|---|
0353200 | Jan 1990 | EP |
0361914 | Feb 1993 | EP |
0640903 | Mar 1995 | EP |
0697641 | Feb 1996 | EP |
1115250 | Jul 2001 | EP |
2377094 | Oct 2011 | EP |
2667325 | Nov 2013 | EP |
2233530 | Sep 1991 | GB |
S5539843 | Mar 1980 | JP |
S58110334 | Jun 1983 | JP |
6216073 | Apr 1987 | JP |
6272245 | May 1987 | JP |
S62-131837 | Jun 1987 | JP |
01123587 | May 1989 | JP |
H1168538 | Jul 1989 | JP |
H236417 | Aug 1990 | JP |
03099952 | Apr 1991 | JP |
3099952 | Apr 1991 | JP |
6227318 | Aug 1994 | JP |
07105496 | Apr 1995 | JP |
2630604 | Jul 1997 | JP |
200274339 | Mar 2002 | JP |
20041658 | Jan 2004 | JP |
WO9419212 | Feb 1994 | WO |
WO9638319 | Dec 1996 | WO |
WO2012139636 | Oct 2012 | WO |
WO2012139660 | Oct 2012 | WO |
WO2012143036 | Oct 2012 | WO |
Entry |
---|
Achler et al., “Vehicle Wheel Detector using 2D Filter Banks,” IEEE Intelligent Vehicles Symposium of Jun. 2004. |
Behringer et al., “Simultaneous Estimation of Pitch Angle and Lane Width from the Video Image of a Marked Road,” pp. 966-973, Sep. 12-16, 1994. |
Borenstein et al., “Where am I? Sensors and Methods for Mobile Robot Positioning”, University of Michigan, Apr. 1996, pp. 2, 125-128. |
Bow, Sing T., “Pattern Recognition and Image Preprocessing (Signal Processing and Communications)”, CRC Press, Jan. 15, 2002, pp. 557-559. |
Broggi et al., “Automatic Vehicle Guidance: The Experience of the ARGO Vehicle”, World Scientific Publishing Co., 1999. |
Broggi et al., “Multi-Resolution Vehicle Detection using Artificial Vision,” IEEE Intelligent Vehicles Symposium of Jun. 2004. |
Franke et al., “Autonomous driving approaches downtown”, Intelligent Systems and Their Applications, IEEE 13 (6), 40-48, Nov./Dec. 1999. |
IEEE 100—The Authoritative Dictionary of IEEE Standards Terms, 7th Ed. (2000). |
Kastrinaki et al., “A survey of video processing techniques for traffic applications”. |
Philomin et al., “Pedestrian Tracking from a Moving Vehicle”. |
Sahli et al., “A Kalman Filter-Based Update Scheme for Road Following,” IAPR Workshop on Machine Vision Applications, pp. 5-9, Nov. 12-14, 1996. |
Sun et al., “On-road vehicle detection using optical sensors: a review”, IEEE Conference on Intelligent Transportation Systems, 2004. |
Van Leeuwen et al., “Motion Estimation with a Mobile Camera for Traffic Applications”, IEEE, US, vol. 1, Oct. 3, 2000, pp. 58-63. |
Van Leeuwen et al., “Motion Interpretation for In-Car Vision Systems”, IEEE, US, vol. 1, Sep. 30, 2002, p. 135-140. |
Van Leeuwen et al., “Real-Time Vehicle Tracking in Image Sequences”, IEEE, US, vol. 3, May 21, 2001, pp. 2049-2054, XP010547308. |
Van Leeuwen et al., “Requirements for Motion Estimation in Image Sequences for Traffic Applications”, IEEE, US, vol. 1, May 24, 1999, pp. 145-150, XP010340272. |
Vlacic et al. (Eds), “Intelligent Vehicle Technologies, Theory and Applications”, Society of Automotive Engineers Inc., edited by SAE International, 2001. |
Zheng et al., “An Adaptive System for Traffic Sign Recognition,” IEEE Proceedings of the Intelligent Vehicles '94 Symposium, pp. 165-170 (Oct. 1994). |
International Search Report and Written Opinion dated Mar. 26, 2013 for corresponding PCT Application No. PCT/US2012/064980. |
Number | Date | Country | |
---|---|---|---|
20140320658 A1 | Oct 2014 | US |
Number | Date | Country | |
---|---|---|---|
61559970 | Nov 2011 | US |