Vehicular display system with multi-paned image display

Information

  • Patent Grant
  • Patent Number
    11,607,995
  • Date Filed
    Monday, May 17, 2021
  • Date Issued
    Tuesday, March 21, 2023
Abstract
A vehicular display system includes a rearward viewing camera disposed at a vehicle and a controller that includes a processor for processing image data captured by the rearward viewing camera. A display device is disposed in the vehicle for viewing by a driver of the vehicle. Image data captured by the rearward viewing camera is processed at the controller, which generates an output representative of an image having three image panes including a central image pane derived from a central subset of captured image data and two side image panes derived from respective side subsets of captured image data. Each of the side image panes is shaped and arranged with respect to the central image pane to have non-parallel upper edges and non-parallel lower edges. The display device, responsive to the output generated by the processing system, displays the image for viewing by the driver of the vehicle.
Description
FIELD OF THE INVENTION

The present invention relates generally to cameras and displays and, more particularly, to a vehicle vision system.


BACKGROUND OF THE INVENTION

Vehicle vision systems can provide vehicle operators with valuable information about driving conditions. For example, a typical vehicle vision system can aid a driver in parking his or her automobile by alerting the driver to hazards around the automobile that should be avoided. Other uses for vehicle vision systems are also known.


However, a typical vehicle camera or vision system may not be able to provide video that is quickly and reliably comprehensible to the driver.


SUMMARY OF THE INVENTION

A vehicle vision or camera system performs dewarping on captured images and outputs dewarped images separated into three image panes. Two side image panes are shaped and arranged with respect to a central image pane to provide the vehicle operator with a view of the environment outside the vehicle that is readily comprehensible. For example, the side panes may each comprise a parallelogram shape, with their respective upper and lower edges being non-parallel to the upper and lower edges of the central image pane (which may be rectangular shaped with its upper and lower edges generally parallel and horizontal when the image is displayed at the display). The upper and lower edges of the side image panes may be parallel and may slope downwardly or upwardly away from the central image pane, or the upper and lower edges of the side image panes may taper towards one another or diverge away from one another away from the central image pane. When the image is displayed at the display, each of the side image panes may be arranged with respect to the central image pane to appear folded with respect to the central image pane.


These and other objects, advantages, purposes and features of the present invention will become apparent upon review of the following specification in conjunction with the drawings.





BRIEF DESCRIPTION OF THE DRAWINGS

The drawings illustrate, by way of example only, embodiments of the present disclosure.



FIG. 1 is a perspective view of a vehicle having a vehicle vision system;



FIG. 2 is a functional block diagram of the vehicle vision system;



FIG. 3 is a diagram of an original image captured by the vehicle camera;



FIG. 4 is a diagram of a dewarped image having three panes as generated by the vehicle vision system;



FIG. 5 is a diagram of a remapping table;



FIGS. 6-10 are diagrams of other three-pane dewarped images in accordance with the present invention;



FIG. 11 is a diagram of another dewarped image having three panes as generated by the vehicle vision system, shown with a road outline overlay and with a vehicle reference icon to enhance the driver's ability to judge distances and speeds of objects in the image;



FIG. 12 is a diagram of a dewarped image similar to FIG. 11, shown with the road outline overlay filled in;



FIG. 13 is a diagram of a dewarped image similar to FIG. 11, shown with color gradient line overlays to enhance the driver's ability to judge distances and speeds of objects in the image;



FIG. 14 is a diagram of a dewarped image similar to FIG. 13, shown with the color gradient line overlays having markers therealong;



FIG. 15 is a diagram of a dewarped image similar to FIG. 13, shown with distance flag overlays to enhance the driver's ability to judge distances and speeds of objects in the image;



FIG. 16 is an image of another dewarped image having three panes as generated by the vehicle vision system in accordance with the present invention;



FIG. 17 is another image of the dewarped image similar to that of FIG. 16, but with the center pane reduced and the side images enlarged;



FIG. 18 is another image similar to that of FIG. 17, but with an additional zoom to increase the size of objects shown at the center pane; and



FIGS. 19-22 are images of another dewarped image having a curved displayed image as generated by the vehicle vision system in accordance with the present invention, with FIGS. 20-22 configured to display the objects at the side regions as a larger size.





DESCRIPTION OF THE PREFERRED EMBODIMENTS

With reference to FIG. 1, a vehicle 10, such as a car, truck, van, bus, or other type of vehicle, includes a camera 12. The camera 12 is configured to be positioned on the vehicle 10 to face away from the bulk of the body 14 of the vehicle 10 so as to have an exterior field of view, whereby the camera is operable to capture video images of the environment outside of the vehicle 10 to, for example, aid the operator of the vehicle 10.


In this example, the camera 12 is positioned at a rear portion of the body 14 of the vehicle 10 and is rearward-facing to capture video images of the environment behind the vehicle 10. The camera 12 may also be angled downward towards the road by a selected angle. In another example, the camera 12 may be positioned at a rear bumper 15 of the vehicle 10. In still other examples, the camera 12 may be forward-facing and may be positioned, for example, at the grille of the vehicle 10 or elsewhere at a forward portion of the vehicle.


The camera 12 may include a wide-angle lens (such as shown at 32 of FIG. 2), such as a lens with about a 180-degree or more horizontal field of view or other suitable wide-angle lens. Such a lens may comprise one or more spherical type lenses or lens optics or elements and/or aspheric lenses or lens optics or elements or the like. In this way, the camera 12 is operable to capture images of the environment behind or ahead of the vehicle 10, including portions of the roadway immediately behind or in front of the vehicle, as well as areas to the right and left of the vehicle 10. When the camera 12 is rearward-facing and has a wide-angle lens, such as a 180-degree lens or the like, the horizontal extents of the field of view of the camera 12 are shown at 13 in FIG. 1. Such a field of view allows the camera to capture images of a wide range of potential hazards including objects directly in the vehicle's rear path of travel, objects in rear blind spots, as well as objects at a distance to the far left and far right of the vehicle 10, such as an approaching vehicle on a perpendicular path of travel to the vehicle 10 (such as at a cross road that the vehicle is at or approaching or such as at an aisle of a parking lot when the vehicle is pulling out of a parking space). A similar field of view may be established in embodiments wherein the camera 12 is forward-facing.


The camera 12 may comprise a charge-coupled device (CCD) image sensor, a complementary metal-oxide-semiconductor (CMOS) image sensor, or any other suitable type of image sensor. For example, the camera and/or imaging device and/or control and/or image processor may comprise any suitable components, and may utilize aspects of the cameras and vision systems described in U.S. Pat. Nos. 7,965,336; 7,937,667; 7,720,580; 7,480,149; 7,339,149; 7,123,168; 7,005,974; 7,004,606; 7,038,577; 6,946,978; 6,922,292; 6,831,261; 6,822,563; 6,806,452; 6,757,109; 6,717,610; 6,824,281; 6,806,452; 6,690,268; 6,590,719; 6,559,435; 6,498,620; 6,396,397; 6,353,392; 6,320,176; 6,313,454; 6,201,642; 6,097,023; 5,877,897; 5,796,094; 5,760,962; 5,715,093; 5,670,935 and/or 5,550,677, and/or International Publication No. WO 2010/099416, published Sep. 2, 2010, and/or PCT Application No. PCT/US10/47256, filed Aug. 31, 2010 and published Mar. 10, 2011 as International Publication No. WO 2011/028686, and/or PCT Application No. PCT/US2008/076022, filed Sep. 11, 2008 and published Mar. 19, 2009 as International Publication No. WO 2009/036176, and/or PCT Application No. PCT/US2008/078700, filed Oct. 3, 2008 and published Apr. 9, 2009 as International Publication No. WO 2009/046268, and/or PCT Application No. PCT/US2012/048110, filed Jul. 25, 2012, and published on Jan. 31, 2013 as International Publication No. WO 2013/016409, and/or U.S. patent application Ser. No. 13/534,657, filed Jun. 27, 2012, and published on Jan. 3, 2013 as U.S. Publication No. US-2013-0002873; Ser. No. 12/508,840, filed Jul. 24, 2009, and published on Jan. 28, 2010 as U.S. Publication No. US-2010-0020170; Ser. No. 12/091,359, filed Apr. 24, 2008 and published Oct. 1, 2009 as U.S. Publication No. US-2009-0244361; Ser. No. 12/508,840, filed Jul. 24, 2009, and published Jan. 28, 2010 as U.S. Pat. Publication No. US 2010-0020170; and/or Ser. No. 13/260,400, filed Sep. 26, 2011, now U.S. Pat. No. 8,542,451, which are hereby incorporated herein by reference in their entireties.


The camera 12 is coupled via a line or link 16 (such as, for example, conductive wires or the like, or a communication bus, such as an LVDS or an Ethernet bus or the like) to a processing system 18 which may be located at a forward portion of the vehicle 10, such as under the hood or below the dashboard. In other examples, the camera 12 can be coupled to the processing system 18 via a wireless communications technique instead of via the line 16. Moreover, the processing system 18 can be positioned elsewhere in the vehicle 10. The camera 12 and processing system 18 may also comprise components or parts of a single camera module or housing, with the image capturing device and image processing unit integrated together. Such an integrated unit may provide a simpler and lower cost camera product.


As shown in FIG. 2, the camera 12 and processing system 18 can form at least part of a vehicle vision system or camera system 20.


The processing system 18 includes a processor 22 and connected memory 24. The processing system 18 is operatively coupled to both the camera 12, as mentioned above, and to a display 30.


The display 30 is configured to be positioned inside the cabin of the vehicle 10. The display 30 is coupled to the processing system 18 by way of, for example, conductive lines. The display 30 can include an in-vehicle display panel situated in the dashboard of the vehicle 10. The display 30 may comprise a backlit liquid-crystal display (LCD) panel, a light-emitting diode (LED) display panel, an organic LED (OLED) display panel, an active-matrix organic LED (AMOLED) display panel, or the like, as well as a circuit to drive the display panel with a video signal received from the processing system 18. The display 30 may include a touch-screen interface to control how the video is displayed by, for example, outputting a mode signal to the processing system 18.


The display may, for example, utilize aspects of the video displays (that may be disposed at a vehicle console or instrument panel or at an interior rearview mirror assembly of the vehicle) described in U.S. Pat. Nos. 6,690,268; 7,370,983; 7,329,013; 7,308,341; 7,289,037; 7,249,860; 7,004,593; 4,546,551; 5,699,044; 4,953,305; 5,576,687; 5,632,092; 5,677,851; 5,708,410; 5,737,226; 5,802,727; 5,878,370; 6,087,953; 6,173,508; 6,222,460; 6,513,252; 6,642,851; 5,530,240; 6,329,925; 7,855,755; 7,626,749; 7,581,859; 7,446,924; 7,446,650; 7,370,983; 7,338,177; 7,274,501; 7,255,451; 7,195,381; 7,184,190; 5,668,663 and/or 5,724,187, and/or U.S. patent application Ser. No. 13/333,337, filed Dec. 21, 2011, now U.S. Pat. No. 9,264,672, and/or U.S. patent application Ser. No. 11/226,628, filed Sep. 14, 2005 and published Mar. 23, 2006 as U.S. Publication No. US-2006-0061008, and/or Ser. No. 10/538,724, filed Jun. 13, 2005 and published Mar. 9, 2006 as U.S. Publication No. US-2006-0050018, which are all hereby incorporated herein by reference in their entireties.


The processing system 18 is configured to receive from the camera 12 image data representative of an image captured by the camera 12, manipulate the received image data, and then output a processed image to the display 30. The processing system 18 may be configured to perform these steps on a continuous basis so as to continuously update the image shown on the display 30, to aid the driver in operating the vehicle 10. In embodiments wherein the camera 12 is rear-facing, such a system can assist the driver to safely back the vehicle 10 up, perhaps out of a parking spot in a parking lot in which there may be vehicular cross-traffic. In embodiments wherein the camera 12 is front-facing, such a system can assist the driver to safely enter an intersection where a view of potential cross-traffic is obscured by buildings or parked vehicles. The processing system may utilize aspects of the systems described in PCT Application No. PCT/US2013/027342, filed Feb. 22, 2013, and published on Aug. 9, 2013 as International Publication No. WO 2013/126715, which is hereby incorporated herein by reference in its entirety.


Image data as discussed herein may be a series of pixel color values of an image, a compressed stream of pixel color values, pixel color values of an image differentially encoded with respect to a previous image (such as, for example, an MPEG video P-frame or B-frame that refers back to a previous frame, such as an I-frame), or the like. Irrespective of the form of the image data, the processing system 18 can be considered to have received an image and to have access to all the pixels of the image for the purposes of image processing.


The processing system 18 can include one or more image processors which may be located together or in separate locations. One or more image processors may, for example, be located at a controller (such as, for example, an engine control unit (ECU) or a vehicle control unit (VCU) or the like) of the vehicle 10 or elsewhere, such as at the camera 12. One processor 22 is depicted at the processing system 18 for sake of convenience. In one example, the processing system 18 includes a processor at the camera 12 and another processor at the controller, with each of the processors performing different kinds of processing. For example, the processor at the camera 12 may perform noise compensation, while the processor at the controller may perform dewarping or other image manipulation. In another example, a single processor is provided at the controller or at the camera 12. In any of these examples, a processor can be a single-core processor, a multi-core processor, a microprocessor, a graphics processing unit (GPU), a central processing unit (CPU), or the like.


The memory 24 can be located and distributed in a similar manner as the processor or processors described above. The memory 24 can store program code, such as an image manipulation routine 26. The processor 22 can execute program code stored in the memory 24. As will be discussed in detail below, the processor 22 can be configured by the image manipulation routine 26 to process an image received from the camera 12 to generate a dewarped image having three panes.



FIG. 3 shows an example of an image 60 captured by the camera 12 using the wide-angle lens 32. Warping in the image 60 resulting from the wide-angle lens 32 can be seen. In the example scene, a parking lot lane 62 is perpendicular to the vehicle 10. An approaching vehicle 78 traveling in that lane 62 presents a potential collision hazard to the vehicle 10. Portions of the vehicle body 14 and bumper 15 can be seen to be distorted.


As will now be discussed in detail with reference to FIG. 4, the vehicle vision system 20 can process images (such as, for example, image 60 of FIG. 3) captured by the camera 12 to generate more informative, processed images and output such to the display 30. This can be performed by the image manipulation routine 26 acting on image data received from the camera 12. Illustrated in FIG. 4 is an example processed image 80 as it would be displayed on the display 30. It will be noted that the environment captured in the image of FIG. 4 is different from the environment captured in the image of FIG. 3; however, these images are merely exemplary and are intended only to illustrate the particular point being made in relation to each figure.


The image manipulation routine 26 performs dewarping to obtain the image 80. Such dewarping flattens images received from the camera 12 to reduce the apparent curvature resulting from optical distortion caused by the wide-angle lens 32. The image manipulation routine 26 can also be configured to perform a perspective adjustment to the image (in other words, to show the environment as it would appear if the camera 12 were positioned or oriented differently than it is). In the example embodiment shown, the perspective adjustment carried out by the image manipulation routine 26 shows the environment as it would appear if the camera 12 were oriented horizontally (or generally parallel to the ground). The image manipulation routine 26 can perform other types of image manipulation, such as reshaping one or more portions of the image by one or more of enlarging, moving, cropping, stretching, compressing, skewing, rotating, and tilting, for example.
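By way of illustration only (this is a sketch under assumptions, not the actual image manipulation routine 26: the OpenCV fisheye model and the intrinsic matrix K and distortion coefficients D below are placeholders standing in for a real calibration of the camera 12), a dewarping step of the general kind described above might look like:

```python
# Illustrative sketch: dewarp a wide-angle frame using OpenCV's fisheye
# camera model. K and D are assumed placeholder calibration values.
import cv2
import numpy as np

def dewarp_frame(frame, K, D, out_size=(1280, 720)):
    """Flatten fisheye curvature so straight scene edges render straight."""
    # An identity rotation keeps the camera's original viewing direction;
    # substituting a small pitch rotation here would emulate the
    # perspective adjustment (horizontal reorientation) described above.
    R = np.eye(3)
    map1, map2 = cv2.fisheye.initUndistortRectifyMap(
        K, D, R, K, out_size, cv2.CV_16SC2)
    return cv2.remap(frame, map1, map2, interpolation=cv2.INTER_LINEAR)

# Placeholder intrinsics for a wide-angle lens (assumed values):
K = np.array([[320.0, 0.0, 640.0],
              [0.0, 320.0, 360.0],
              [0.0, 0.0, 1.0]])
D = np.array([0.1, -0.05, 0.01, 0.0])  # k1..k4 fisheye coefficients
```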


The image manipulation routine 26 separates the dewarped image 80 into three image panes 82, 84, 86. The three image panes include a rectangular central image pane 82 derived from a first subset of captured image data and two side image panes 84, 86 derived from second and third subsets of captured image data. The three image panes 82, 84, 86 may be of approximately the same width, such that the displayed image 80 is approximately divided into thirds. The portions of the displayed image 80 shown in the side image panes 84, 86 are more warped than the portion shown in the central image pane 82.


In the illustrated embodiment of FIG. 4, the left-side image pane 84 has an upper edge 84U and a lower edge 84L that are generally parallel to each other and slope generally downwardly away from the central image pane 82. Similarly, the right-side image pane 86 has an upper edge 86U and a lower edge 86L that are generally parallel to each other and slope generally downwardly away from the central image pane 82. The terms left and right are relative to the direction that the camera 12 is pointing. The side image panes 84, 86 can be shaped generally as parallelograms. As can be seen, the two side image panes 84, 86 are shaped and arranged with respect to the central image pane 82 to appear folded at some angle with respect to the central image pane 82 (so as to give the appearance of three panels of a bay window), as it would appear from a raised perspective or viewpoint.
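As a rough illustration of this pane geometry (again a sketch under assumptions, not the patented routine: the slope value and the use of simple affine shears are choices made here for demonstration), the folded three-pane arrangement could be approximated as follows:

```python
# Illustrative sketch: split a dewarped frame into thirds and shear each
# side third so its upper and lower edges slope downward away from the
# center pane, approximating the "folded" bay-window appearance.
# Assumes a 3-channel color frame.
import cv2
import numpy as np

def fold_three_panes(img, slope=0.15):
    h, w = img.shape[:2]
    third = w // 3
    pad = int(slope * third) + 1          # room for the sheared corners
    canvas = np.zeros((h + pad, w, 3), dtype=img.dtype)

    # Left pane 84: y' = y + slope * (third - x), lowest at the far left.
    M_left = np.float32([[1, 0, 0], [-slope, 1, slope * third]])
    canvas[:, :third] = cv2.warpAffine(img[:, :third], M_left,
                                       (third, h + pad))

    # Central pane 82: unchanged rectangle with horizontal edges.
    canvas[:h, third:2 * third] = img[:, third:2 * third]

    # Right pane 86: y' = y + slope * x, lowest at the far right.
    M_right = np.float32([[1, 0, 0], [slope, 1, 0]])
    canvas[:, 2 * third:] = cv2.warpAffine(img[:, 2 * third:], M_right,
                                           (w - 2 * third, h + pad))
    return canvas
```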


This folded effect of the displayed image 80 can give the operator of the vehicle 10 a better understanding of the content of the three image panes 82, 84, 86, namely, that the central image pane 82 displays what is immediately in the path of the vehicle 10, while the side image panes 84, 86 display what is left and right to the path of the vehicle 10. The image data captured by the wide-angled lens 32 is thus presented in a way that improves driver comprehension of the scene outside the vehicle 10. For example, it may be more readily apparent to the operator that the pedestrian 120 and van 122 are left of the vehicle 10 rather than in the path of the vehicle 10. Similarly, it may be more readily apparent to the operator that the posts 124 are to the right of the vehicle 10 rather than in the path of the vehicle 10. At the same time, it will still be readily apparent to the operator that the posts 126 are directly in the path of the vehicle 10.


The side image panes 84, 86 can be shaped with respect to the horizon to increase the folded effect. The upper edge 84U and the lower edge 84L of the left-side image pane 84 can be angled to be generally parallel to a horizon line 84H of the left-side image pane 84. Similarly, the upper edge 86U and the lower edge 86L of the right-side image pane 86 can be angled to be generally parallel to a horizon line 86H of the right-side image pane 86. The central image pane 82 has horizontal upper and lower edges, which are generally parallel to a horizon line 82H of the central image pane 82 (when the image is displayed for viewing by a driver of the vehicle when normally operating the vehicle). The horizon lines 82H, 84H, 86H represent the horizon resulting from the specific dewarping algorithm used. In the embodiment shown, it can be seen that the horizon line 82H is relatively straight and horizontal (in other words, it has an average angle of about 0 degrees), while the horizon lines 84H and 86H have some small amount of curvature, and are generally angled at some non-zero average angle relative to the average angle of the horizon line 82H. The average angle of the horizon lines 82H, 84H, 86H can be selected during development of the image manipulation routine 26. In other words, the dewarping algorithm can be configured so that it generates image portions in panes 84 and 86 that have horizon lines 84H and 86H that have selected average angles relative to the horizontal. Testing of the dewarping algorithm may be carried out in any suitable environment, such as outside in an open area where the horizon is relatively unobscured by visual obstructions such as buildings.
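As one concrete check of this tuning step (hypothetical development tooling, not part of the disclosed system: the sample points below are invented), the average angle of a pane's horizon line could be estimated by a least-squares line fit over sampled horizon points:

```python
# Illustrative sketch: estimate the average angle of a horizon polyline
# in a pane by least-squares fitting a line to sampled (x, y) points.
# In image coordinates, y grows downward, so a positive angle means the
# horizon slopes downward to the right.
import numpy as np

def average_horizon_angle_deg(xs, ys):
    """Least-squares slope of the horizon samples, in degrees."""
    slope, _intercept = np.polyfit(xs, ys, 1)
    return np.degrees(np.arctan(slope))

# Invented horizon samples from a side pane sloping gently downward:
xs = np.array([0.0, 50.0, 100.0, 150.0, 200.0])
ys = np.array([120.0, 131.0, 140.0, 152.0, 161.0])
angle = average_horizon_angle_deg(xs, ys)  # roughly 11.6 degrees
```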


Selecting the downward slope angles of the edges 84U, 84L, 86U, 86L so that they generally match the average angles of the horizon lines 84H and 86H can increase the folded visual effect (such that it provides an appearance of viewing the environment through a bay window).


The image manipulation routine 26 can apply a static overlay to the displayed dewarped image 80. The overlay is static in that it remains fixed in appearance and fixed positionally when overlaid on the displayed images derived from the image data captured by the camera 12. The static overlay may include generally vertical bands 90 (which may be gaps between the adjacent image panes or demarcation lines or dark lines overlaid at the joint between the adjacent image panes or the like). The bands 90 may simply be referred to as vertical bands 90; however, this is simply for readability, and it will be understood that these bands 90 need not be strictly vertical but may be generally vertical. One vertical band 90 separates the left-side pane 84 from the central pane 82 and another vertical band 90 separates the right-side pane 86 from the central pane 82. The vertical bands 90 can be a single color, such as white or, more preferably, black, so as to contrast with the image portions shown in the panes 82, 84 and 86 during vehicle use. The vertical bands 90 help visually delineate the side panes 84, 86 from the central pane 82, and may be shaped and sized to appear to the operator of the vehicle 10 like vehicular A- or C-pillars between the rear or front windshield and the side windows. The vertical bands 90 further reinforce the visual division of the horizon line into three relatively straight segments (horizon lines 82H, 84H and 86H) where the left and right segments (horizon lines 84H and 86H) are angled relative to the center segment (horizon line 82H), thereby reinforcing the aforementioned folded visual effect.


The overlay may further include triangular regions 94, 96 above the two side image panes 84, 86. Above the left-side image pane 84 is the left-side triangular region 94 and above the right-side image pane 86 is the right-side triangular region 96. The triangular regions 94, 96 may have the same color as the vertical bands 90 and may be generally contiguous therewith or they may be separated from the vertical bands 90. The color of the triangular regions 94, 96 preferably contrasts with the side image panes 84, 86 to help visually define the shapes of the side image panes 84, 86, and can thus reinforce the folded visual effect of the displayed image 80.


The overlay may further include a trapezoidal region 98 below the three image panes 82, 84, 86. The trapezoidal region 98 occupies space left by the shapes of the side image panes 84, 86 and the arrangement of the side image panes 84, 86 with the central image pane 82 so as to further reinforce the folded visual effect. The trapezoidal region 98 has triangular regions below the side image panes 84, 86 and a rectangular region below the central image pane 82. The trapezoidal region 98 may be the same color as the vertical bands 90 and may be contiguous therewith. The color of the trapezoidal region 98 preferably contrasts with the three image panes 82, 84, 86 to help visually define the shapes of the three image panes 82, 84, 86, so as to reinforce the folded visual effect of the displayed image 80.


In one example, the static regions 94, 96, 98 are shaped and sized to omit or obliterate some image data, but such image data is predetermined to not provide information relevant to the operation of the vehicle 10. Omitted or obliterated image data may represent portions of the captured image expected to contain sky or vehicle body 14. This can be advantageous when the field of view of the camera 12 does not match the aspect ratio of the display 30 or when it is desired to improve the appearance of the image 80. In another example, the triangular and trapezoidal static regions 94, 96, 98 are shaped and sized to not omit or obliterate any captured image data, and the image manipulation routine 26 performs stretching and/or interpolation on image data near the static regions 94, 96, 98 to extend the image to the edges of the static regions 94, 96, 98. This can be advantageous when it is desired to display all of the captured image.


In some embodiments, the vertical bands 90 may block or obliterate some image data, in which case the static vertical bands 90 are preferably relatively thin so as to reduce the types of obstacle or hazard that would be obscured by them. In some embodiments, however, the image 80 may be configured so that the vertical bands 90 do not obliterate any image data. Instead, the image manipulation routine 26 may manipulate the image data so that the image is split and the portions shown in the side image panes 84 and 86 are offset from their initial position immediately adjacent the image portion shown in the central image pane 82 to a final position where they are offset from the central image portion by a distance corresponding to the thickness of the respective band 90.


Thus, and in accordance with the present invention, it is preferred to have the central image pane be visually differentiated or demarcated from each of the side image panes via a demarcation or static overlay. Such a demarcation may be provided in various forms, depending on the particular application and desired display appearance. For example, and such as shown in FIG. 4, the demarcation or static overlay 90 may comprise a small gap that is established between the central image pane and each of the side image panes. Alternatively, such a gap need not be provided, and the likes of a dark or black demarcating line or border or the like may be electronically superimposed at the respective joint between the left image pane and the center image pane and the right image pane and the center image pane (and optionally established so as to block a portion or portions of the displayed image). Optionally, and as discussed above, static overlays or static regions may be provided or established at the perimeter regions of the displayed image (such as above and/or below the center image pane and/or the side image panes or the like). The displayed dewarped image (that is displayed as three image panes at the display) is derived from captured image data (captured by a camera of the vehicle that has an exterior field of view), while the static overlay or overlays or demarcations or static regions provided or overlaid at the displayed image at the display screen are formed from or derived from other than captured image data.


Although shown and described as having three image panes such as shown in FIG. 4, image panes of other shapes may clearly be utilized to provide the desired visual effect to the driver viewing the displayed images while operating the vehicle. For example, and with reference to FIGS. 6-10, the three image panes may have non-parallel upper and lower slope angles for the side image panes (such as shown in FIGS. 6 and 7). Optionally, the displayed images may have the center image pane extend vertically from the top of the image to the bottom of the image (with no static region above or below the center pane), such as shown in FIG. 6, or may have static regions both above and below a smaller center image pane, such as shown in FIG. 7. Optionally, and as shown in FIG. 8, the static bands or demarcating bands between the image panes may comprise non-vertical bands. FIGS. 9 and 10 show different approaches, with FIG. 10 being similar to FIG. 4 and FIG. 9 having the static region above the center image pane instead of below the center image pane, with the side image panes having slope angles that are angled upward instead of downward. Other shapes or configurations or appearances of the three-paned displayed image may be implemented depending on the particular application of the vision system and desired display for viewing by the driver.


Because customers may find the split or tri-view scenes difficult to comprehend, various pane shapes and/or border overlays help provide a visual reference. Optionally, the display may further include ground position reference overlays that may serve as additional cues to enhance the driver's ability to judge distances to objects present in the field of view of the camera and to judge the speeds of objects that are moving in the field of view of the camera. Various exemplary overlays are shown in FIGS. 11-15, with FIGS. 11-15 including similar reference numbers as used in FIG. 4.


For example, and with reference to FIG. 11, a dewarped image having three image panes as generated by the vehicle vision system of the present invention is shown with a road outline overlay and with a vehicle reference icon (shown in the lower left corner region of the displayed image, but it could be elsewhere in the displayed image) to enhance the driver's ability to judge distances and speeds of objects in the image. Optionally, and such as shown in FIG. 12, the road outline overlay may be colored or filled in. The road outline overlay part may be defined as semi-transparent such that the objects in the area are not obscured by the overlay while the impression of the road is still conveyed to the driver. Optionally, and with reference to FIG. 13, color gradient line overlays may be incorporated in the displayed images to enhance the driver's ability to judge distances and speeds of objects in the image. Optionally, and as shown in FIG. 14, the color gradient line overlays may have markers therealong (such as short horizontal lines spaced along the line overlays). Optionally, distance flag or marker or indicator overlays may be generated (such as shown in FIG. 15) to enhance the driver's ability to judge distances and speeds of objects in the image. Other overlays may be incorporated to provide the desired information or enhancement, while remaining within the spirit and scope of the present invention. The overlays may utilize aspects of the systems described in U.S. Pat. Nos. 5,670,935; 5,949,331; 6,222,447 and 6,611,202, and/or U.S. patent application Ser. No. 12/677,539, filed Mar. 31, 2010, now U.S. Pat. No. 8,451,107, which are hereby incorporated herein by reference in their entireties.


The camera 12 can be positioned and the image manipulation routine 26 correspondingly configured so that the dewarped image 80 contains at least a portion of the bumper 15 of the vehicle 10. In some embodiments, the bumper appears in all three image panes 82, 84 and 86. This can advantageously assure the vehicle operator that all hazards adjacent the vehicle 10 are visible on the display 30. In other words, it assures the driver that there is no portion of the environment behind the vehicle (or in front of the vehicle in forward-facing embodiments) that is omitted from the images. Additionally, it gives the driver of the vehicle a point of reference for where the vehicle 10 is relative to the images. This facilitates precise positioning of the vehicle 10 relative to objects shown in the images. For example, it facilitates parking the vehicle very close to another vehicle that is within the field of view of the camera 12 without fear of colliding with the other vehicle.


The image manipulation routine 26 can be embodied by one or more of a remapping table, function, algorithm, or process that acts on a set of pixels to generate a respective set of processed pixels.


As shown in FIG. 5, a remapping table 134 correlates X and Y coordinates of source pixels of a source image 130 captured by the camera 12 to X and Y coordinates of destination pixels of a destination image 132 for output at the display 30. The remapping table 134 allows color values A-L of each source pixel to be set at the X and Y coordinates of a corresponding destination pixel. In this example, corner pixels of the source image 130 lack image data, so the remapping table 134 references color values of neighboring pixels to populate corner pixels of the destination image 132. Although simplified to 16 pixels for explanatory purposes, the remapping table 134 can correspond to a dewarping operation that increases the size of the destination image 132 as well as makes the destination image 132 rectangular when compared to the source image 130, which is nominally round. Scaled up to a full pixel count, this technique can be used to achieve any of the image manipulations discussed herein.
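A toy rendition of this table-driven gather (the 4×4 layout mirrors the simplified FIG. 5 example; the array encoding chosen here is an assumption made for illustration):

```python
# Illustrative sketch: for every destination pixel, the table stores the
# source coordinates whose color it should take, so the whole dewarp
# reduces to a single gather operation.
import numpy as np

def apply_remap_table(src, table):
    """table[y, x] = (src_y, src_x) for each destination pixel."""
    return src[table[..., 0], table[..., 1]]

# 4x4 example: identity mapping except the corners, which borrow a
# neighboring pixel's coordinates (mirroring how FIG. 5 populates corner
# pixels that lack source image data).
table = np.indices((4, 4)).transpose(1, 2, 0)  # (y, x) per pixel
table[0, 0] = (0, 1)   # top-left corner borrows its right neighbor
table[0, 3] = (0, 2)
table[3, 0] = (3, 1)
table[3, 3] = (3, 2)
src = np.arange(16, dtype=np.uint8).reshape(4, 4)
dst = apply_remap_table(src, table)
```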


Each X-Y coordinate pair in the remapping table 134 may also represent a multi-pixel group. For example, a group of 4×4 pixels (16 pixels in total) may be represented by four pairs of X-Y coordinates at the four corners. Only the four corner pixel coordinates of the source image are stored in the remapping table 134; the other pixels inside the 4×4 square can be interpolated from the four corner pixel coordinates. The method of interpolation may comprise, but is not limited to, a two dimensional (2D) bilinear interpolation technique or the like. Using multi-pixel group mapping can save memory space and thus reduce system cost. The number of pixels in the mapping group and the shape of the mapping group may vary; for example, a 16×16 or a 16×12 group may be considered depending on the application requirements.
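For instance (a hypothetical encoding assuming a block of n × n destination pixels whose four corner source coordinates are the only stored entries), the interior coordinates could be recovered as:

```python
# Illustrative sketch of 2D bilinear expansion: recover an n x n grid of
# source coordinates from the four stored corner coordinate pairs.
import numpy as np

def expand_block(corners, n=4):
    """corners: 2x2x2 array of (y, x) source coords at the block corners."""
    t = np.linspace(0.0, 1.0, n)
    # Interpolate along x for the top and bottom corner rows...
    top = np.outer(1 - t, corners[0, 0]) + np.outer(t, corners[0, 1])
    bot = np.outer(1 - t, corners[1, 0]) + np.outer(t, corners[1, 1])
    # ...then along y between those rows, giving an n x n coordinate grid.
    return ((1 - t)[:, None, None] * top[None, :, :]
            + t[:, None, None] * bot[None, :, :])

corners = np.array([[[0.0, 0.0], [0.0, 12.0]],
                    [[9.0, 1.0], [9.0, 13.0]]])
coords = expand_block(corners)  # 16 (y, x) pairs from 4 stored pairs
```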


The remapping table 134 can have constant values for destination pixels of the static bands and regions 90, 94, 96, 98, and such constant values are used regardless of whether or not source pixel data is available. The constant values can be assigned to represent certain static color values that define the color of the bands and regions.


The image manipulation routine 26 can include instructions for carrying out the remapping of pixels, and can further include any remapping tables as well. Alternatively, any remapping tables can be stored in the memory 24 separately from the image manipulation routine 26.


In another example, a remapping function takes as input source pixel coordinates and color values and outputs destination pixel coordinates and color values. In this case, the image manipulation routine 26 includes instructions that define the remapping function. The image manipulation routine 26 can use interpolation or extrapolation to output color values for pixels that do not directly correlate to pixels in the captured image. Although interpolation or extrapolation may result in blur or an apparent loss of image fidelity, it can also result in a larger or more easily comprehensible image. When the side image panes 84, 86 are reflectionally symmetric, the remapping function can be passed a parameter that identifies the specific image pane 84, 86, so that the remapping function can operate on the pane's pixels accordingly.


The remapping function can call an overlay function to generate the static bands and regions 90, 94, 96, 98. The static band may comprise a static picture, which is overlaid onto the dewarped live image. Alpha blending of the overlay picture is defined such that the live image area(s) are transparent in the overlay picture so that they can be seen by the user, while the static areas, such as at 90, 94, 96, 98, are defined as opaque. Certain areas of the overlay graphic are defined as semi-transparent by the alpha value of the overlay picture, so that those areas are not totally obscured by the overlay graphic.
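A minimal compositing sketch of that alpha-blending scheme (band positions, sizes, and alpha values here are invented for demonstration, not the production overlay picture):

```python
# Illustrative sketch: opaque overlay where the static bands/regions sit,
# fully transparent over the live panes, and semi-transparent where the
# live scene should remain partially visible through the overlay.
import numpy as np

def composite(live, overlay_rgb, alpha):
    """alpha in [0, 1] per pixel: 1 = opaque overlay, 0 = live image."""
    a = alpha[..., None]                         # broadcast over channels
    out = a * overlay_rgb + (1.0 - a) * live.astype(np.float32)
    return out.astype(np.uint8)

h, w = 480, 800
live = np.zeros((h, w, 3), dtype=np.uint8)       # stand-in camera frame
overlay = np.zeros((h, w, 3), dtype=np.float32)  # black bands and regions
alpha = np.zeros((h, w), dtype=np.float32)       # transparent by default
alpha[:, 260:270] = 1.0                          # opaque left band (90)
alpha[:, 530:540] = 1.0                          # opaque right band (90)
alpha[440:, :] = 0.5                             # semi-transparent strip
shown = composite(live, overlay, alpha)
```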


In other examples, other techniques can alternatively or additionally be used for the image manipulation routine 26.


In any of the examples described herein, the processor 22 can be configured to generate the dewarped image 80 based on image data received from a single camera (such as camera 12). That is, the processor 22 need not use image data provided by multiple cameras, if any other cameras are provided to the vehicle 10, to carry out the image processing described herein.


The techniques described above can emphasize to the vehicle operator that the side image panes 84, 86 are more lateral to the vehicle 10 than may be apparent from the originally captured image. For example, the approaching vehicle 78 in the original image 60 in FIG. 3 may have a position or may be moving in a way that is misperceived by the driver due to the distortion caused by the wide-angle lens 32. While the vehicle 78 is indeed a hazard to the driver wishing to enter the lane 62, the distortion of the image 60 may be confusing to the driver and may cause the driver to not fully understand the approaching hazard. However, the processed image 80 in FIG. 4 is dewarped and includes three panes that are shaped and arranged as well as separated and contrasted by static bands and regions 90, 94, 96, 98 to emphasize the lateral nature of the image data in the side image panes 84, 86. This folded visual effect can provide for quick and accurate assessment of the content of the image 80. Accordingly, it can be more readily apparent to the driver that the hazards 120, 122, 124 are lateral of the vehicle 10.


While side image panes 84, 86 have the advantage of alerting drivers to oncoming cross-traffic or other hazards that may be obstructed by blind spots or obstacles, showing the central image pane 82 as well provides a further advantage even if the scene of the central image pane 82 is clearly directly visible via the rear-view mirror or front windshield. This advantage is that the driver does not have to switch his/her attention between the display 30 and the rear view mirror or front windshield, and can thus observe the entire scene on the display 30.


Thus, the present invention provides dewarped images in a multi-pane configuration to provide a desired display of objects rearward and sideward of the equipped vehicle in a manner that is readily seen and discerned and understood by the driver of the vehicle, when viewing the displayed images during a reversing maneuver of the vehicle. Optionally, and as can be seen with reference to FIGS. 16-18, the size of the image panes and the degree of zoom of the dewarped images shown in each image pane may be adjusted to provide the desired effect. For example, the image shown in FIG. 17 has enlarged side image panes and a reduced center image pane as compared to the image shown in FIG. 16, while the image shown in FIG. 18 has an increased zoom or enlargement factor at the center image pane as compared to the image shown in FIG. 17. The degree of zoom and the relative sizes of the center and side image panes may be adjusted or selected depending on the particular application of the vision system and the desired display features.


Optionally, and with reference to FIGS. 19-22, the processor of the present invention may provide a curved wide angle dewarped image for display without the multi-panes of the displays discussed above. Visual cues such as the crescent shaped overlay regions at the top and bottom of the image can be used to improve understanding of the manipulated scene. For example, such cues help users perceive the image as having a three dimensional effect, such as with the three pane view. The images may be compressed at the center region and/or expanded at the side regions to provide, when displayed at a video display for viewing by the driver of the vehicle, enhanced viewing of objects sideward and rearward of the vehicle, which may be particularly useful when backing out of a parking space or the like (so as to enhance viewing of vehicles approaching the parking space from the left or right of the parked vehicle). For example, and as can be seen with reference to FIGS. 19-22, the side regions of the image of FIG. 20 are larger than the side regions of the image of FIG. 19, while the side regions of the image of FIG. 21 are larger than the side regions of the image of FIG. 20, and the side regions of the image of FIG. 22 are larger than the side regions of the image of FIG. 21. The degree of zoom and the relative sizes of the center and side regions of the curved images may be adjusted or selected depending on the particular application of the vision system and the desired display features.


Thus, in accordance with the present invention, a substantially dewarped image is produced, where the displayed image does not exhibit a fish eye or otherwise warped image to the driver viewing the in-cabin display. The displayed image thus may substantially represent what the driver would see if directly viewing at least a portion of the same scene.


Although the terms “image data” and “data” may be used in this disclosure interchangeably with entities represented by image data, such as images, portions of images, image regions, panes, and the like, one of ordinary skill in the art, given this disclosure, will understand how image data and entities represented by image data interrelate.


Optionally, the video display screen or device may be operable to display images captured by a rearward viewing camera of the vehicle during a reversing maneuver of the vehicle (such as responsive to the vehicle gear actuator being placed in a reverse gear position or the like) to assist the driver in backing up the vehicle, and optionally may be operable to display a compass heading or directional heading character or icon (or other icon or indicia or displayed images) when the vehicle is not undertaking a reversing maneuver, such as when the vehicle is being driven in a forward direction along a road (such as by utilizing aspects of the display system described in PCT Application No. PCT/US2011/056295, filed Oct. 14, 2011 and published Apr. 19, 2012 as International Publication No. WO 2012/051500, which is hereby incorporated herein by reference in its entirety).


The system includes an image processor operable to process image data captured by the camera or cameras, such as for detecting objects or other vehicles or pedestrians or the like in the field of view of one or more of the cameras. For example, the image processor may comprise an EYEQ2 or EYEQ3 image processing chip available from Mobileye Vision Technologies Ltd. of Jerusalem, Israel, and may include object detection software (such as the types described in U.S. Pat. Nos. 7,855,755; 7,720,580 and/or 7,038,577, which are hereby incorporated herein by reference in their entireties), and may analyze image data to detect vehicles and/or other objects. Responsive to such image processing, and when an object or other vehicle is detected, the system may generate an alert to the driver of the vehicle and/or may generate an overlay at the displayed image to highlight or enhance display of the detected object or vehicle, in order to enhance the driver's awareness of the detected object or vehicle or hazardous condition during a driving maneuver of the equipped vehicle. Optionally, the image processor may utilize aspects of the systems described in U.S. Pat. No. 7,697,027, which is hereby incorporated herein by reference in its entirety.


The vehicle may include any type of sensor or sensors, such as imaging sensors or radar sensors or lidar sensors or ladar sensors or ultrasonic sensors or the like. The imaging sensor or camera may capture image data for image processing and may comprise any suitable camera or sensing device, such as, for example, an array of a plurality of photosensor elements arranged in at least 640 columns and 480 rows (at least a 640×480 imaging array), with a respective lens focusing images onto respective portions of the array. The photosensor array may comprise a plurality of photosensor elements arranged in a photosensor array having rows and columns. The logic and control circuit of the imaging sensor may function in any known manner, and the image processing and algorithmic processing may comprise any suitable means for processing the images and/or image data. For example, the vision system and/or processing and/or camera and/or circuitry may utilize aspects described in U.S. Pat. Nos. 7,005,974; 5,760,962; 5,877,897; 5,796,094; 5,949,331; 6,222,447; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978; 7,859,565; 5,550,677; 5,670,935; 6,636,258; 7,145,519; 7,161,616; 7,230,640; 7,248,283; 7,295,229; 7,301,466; 7,592,928; 7,881,496; 7,720,580; 7,038,577; 6,882,287; 5,929,786 and/or 5,786,772, and/or PCT Application No. PCT/US2010/047256, filed Aug. 31, 2010 and published Mar. 10, 2011 as International Publication No. WO 2011/028686 and/or International Publication No. WO 2010/099416, published Sep. 2, 2010, and/or PCT Application No. PCT/US10/25545, filed Feb. 26, 2010 and published Sep. 2, 2010 as International Publication No. WO 2010/099416, and/or PCT Application No. PCT/US2012/048800, filed Jul. 30, 2012, and published on Feb. 7, 2013 as International Publication No. WO 2013/019707, and/or PCT Application No. PCT/US2012/048110, filed Jul. 25, 2012, and published on Jan. 31, 2013 as International Publication No. WO 2013/016409, and/or PCT Application No. PCT/CA2012/000378, filed Apr. 25, 2012, and published on Nov. 1, 2012 as International Publication No. WO 2012/145822, and/or PCT Application No. PCT/US2012/056014, filed Sep. 19, 2012, and published on Mar. 28, 2013 as International Publication No. WO 2013/043661, and/or PCT Application No. PCT/US12/57007, filed Sep. 25, 2012, and published on Apr. 4, 2013 as International Publication No. WO 2013/048994, and/or PCT Application No. PCT/US2012/061548, filed Oct. 24, 2012, and published on May 2, 2013 as International Publication No. WO 2013/063014, and/or PCT Application No. PCT/US2012/062906, filed Nov. 1, 2012, and published on May 1, 2013 and International Publication No. WO 2013/067083, and/or PCT Application No. PCT/US2012/063520, filed Nov. 5, 2012, and published on May 16, 2013 as International Publication No. WO 2013/070539, and/or PCT Application No. PCT/US2012/064980, filed Nov. 14, 2012, and published on May 23, 2013 as International Publication No. WO 2013/074604, and/or PCT Application No. PCT/US2012/066570, filed Nov. 27, 2012, and published on Jun. 6, 2013 as International Publication No. WO 2013/081984, and/or PCT Application No. PCT/US2012/066571, filed Nov. 27, 2012, and published on Jun. 6, 2013 as International Publication No. WO 2013/081985, and/or PCT Application No. PCT/US2012/068331, filed Dec. 7, 2012, and published on Jun. 13, 2013 as International Publication No. WO 2013/086249, and/or PCT Application No. 
PCT/US2013/022119, filed Jan. 18, 2013, and published on Jul. 25, 2013 as International Publication No. WO 2013/109869, and/or PCT Application No. PCT/US2013/027342, filed Feb. 22, 2013, and published on Aug. 9, 2013 as International Publication No. WO 2013/126715, and/or U.S. patent application Ser. No. 13/847,815, filed Mar. 20, 2013, and published on Oct. 31, 2013 as U.S. Publication No. US-2013-0286193; Ser. No. 13/779,881, filed Feb. 28, 2013, now U.S. Pat. No. 8,694,224; Ser. No. 13/785,099, filed Mar. 5, 2013, now U.S. Pat. No. 9,565,342; Ser. No. 13/774,317, filed Feb. 22, 2013, now U.S. Pat. No. 9,269,263; Ser. No. 13/774,315, filed Feb. 22, 2013, and published on Aug. 22, 2013 as U.S. Publication No. US-2013-0215271; Ser. No. 13/681,963, filed Nov. 20, 2012, now U.S. Pat. No. 9,264,673; Ser. No. 13/660,306, filed Oct. 25, 2012, now U.S. Pat. No. 9,146,898; Ser. No. 13/653,577, filed Oct. 17, 2012, now U.S. Pat. No. 9,174,574; and/or Ser. No. 13/534,657, filed Jun. 27, 2012, and published on Jan. 3, 2013 as U.S. Publication No. US-2013-0002873, and/or U.S. provisional applications, Ser. No. 61/766,883, filed Feb. 20, 2013; Ser. No. 61/760,368, filed Feb. 4, 2013; Ser. No. 61/760,364, filed Feb. 4, 2013; Ser. No. 61/758,537, filed Jan. 30, 2013; Ser. No. 61/754,8004, filed Jan. 21, 2013; Ser. No. 61/745,925, filed Dec. 26, 2012; Ser. No. 61/745,864, filed Dec. 26, 2012; Ser. No. 61/736,104, filed Dec. 12, 2012; Ser. No. 61/736,103, filed Dec. 12, 2012; Ser. No. 61/735,314, filed Dec. 10, 2012; Ser. No. 61/734,457, filed Dec. 7, 2012; Ser. No. 61/733,598, filed Dec. 5, 2012; Ser. No. 61/733,093, filed Dec. 4, 2012; Ser. No. 61/710,924, filed Oct. 8, 2012; Ser. No. 61/696,416, filed Sep. 4, 2012; Ser. No. 61/682,995, filed Aug. 14, 2012; Ser. No. 61/682,486, filed Aug. 13, 2012; Ser. No. 61/680,883, filed Aug. 8, 2012; Ser. No. 61/678,375, filed Aug. 1, 2012; Ser. No. 61/676,405, filed Jul. 27, 2012; Ser. No. 61/666,146, filed Jun. 29, 2012; Ser. No. 61/653,665, filed May 31, 2012; Ser. No. 61/653,664, filed May 31, 2012; Ser. No. 61/648,744, filed May 18, 2012; Ser. No. 61/624,507, filed Apr. 16, 2012; Ser. No. 61/616,126, filed Mar. 27, 2012, which are all hereby incorporated herein by reference in their entireties. The system may communicate with other communication systems via any suitable means, such as by utilizing aspects of the systems described in PCT Application No. PCT/US10/038477, filed Jun. 14, 2010, and/or U.S. patent application Ser. No. 13/202,005, filed Aug. 17, 2011, now U.S. Pat. No. 9,126,525, which are hereby incorporated herein by reference in their entireties.


The imaging device and control and image processor and any associated illumination source, if applicable, may comprise any suitable components, and may utilize aspects of the cameras and vision systems described in U.S. Pat. Nos. 5,550,677; 5,877,897; 6,498,620; 5,670,935; 5,796,094; 6,396,397; 6,806,452; 6,690,268; 7,005,974; 7,123,168; 7,004,606; 6,946,978; 7,038,577; 6,353,392; 6,320,176; 6,313,454 and 6,824,281, and/or International Publication No. WO 2010/099416, published Sep. 2, 2010, and/or PCT Application No. PCT/US10/47256, filed Aug. 31, 2010 and published Mar. 10, 2011 as International Publication No. WO 2011/028686, and/or U.S. patent application Ser. No. 12/508,840, filed Jul. 24, 2009, and published Jan. 28, 2010 as U.S. Publication No. US-2010-0020170, and/or PCT Application No. PCT/US2012/048110, filed Jul. 25, 2012, and published on Jan. 31, 2013 as International Publication No. WO 2013/016409, and/or U.S. patent application Ser. No. 13/534,657, filed Jun. 27, 2012, and published on Jan. 3, 2013 as U.S. Publication No. US-2013-0002873, which are all hereby incorporated herein by reference in their entireties. The camera or cameras may comprise any suitable cameras or imaging sensors or camera modules, and may utilize aspects of the cameras or sensors described in U.S. patent application Ser. No. 12/091,359, filed Apr. 24, 2008 and published Oct. 1, 2009 as U.S. Publication No. US-2009-0244361, and/or Ser. No. 13/260,400, filed Sep. 26, 2011, now U.S. Pat. Nos. 8,542,451, and/or 7,965,336 and/or 7,480,149, which are hereby incorporated herein by reference in their entireties. The imaging array sensor may comprise any suitable sensor, and may utilize various imaging sensors or imaging array sensors or cameras or the like, such as a CMOS imaging array sensor, a CCD sensor or other sensors or the like, such as the types described in U.S. Pat. Nos. 5,550,677; 5,670,935; 5,760,962; 5,715,093; 5,877,897; 6,922,292; 6,757,109; 6,717,610; 6,590,719; 6,201,642; 6,498,620; 5,796,094; 6,097,023; 6,320,176; 6,559,435; 6,831,261; 6,806,452; 6,396,397; 6,822,563; 6,946,978; 7,339,149; 7,038,577; 7,004,606 and/or 7,720,580, and/or U.S. patent application Ser. No. 10/534,632, filed May 11, 2005, now U.S. Pat. No. 7,965,336; and/or PCT Application No. PCT/US2008/076022, filed Sep. 11, 2008 and published Mar. 19, 2009 as International Publication No. WO 2009/036176, and/or PCT Application No. PCT/US2008/078700, filed Oct. 3, 2008 and published Apr. 9, 2009 as International Publication No. WO 2009/046268, which are all hereby incorporated herein by reference in their entireties.


The camera module and circuit chip or board and imaging sensor may be implemented and operated in connection with various vehicular vision-based systems, and/or may be operable utilizing the principles of such other vehicular systems, such as a vehicle headlamp control system, such as the type disclosed in U.S. Pat. Nos. 5,796,094; 6,097,023; 6,320,176; 6,559,435; 6,831,261; 7,004,606; 7,339,149 and/or 7,526,103, which are all hereby incorporated herein by reference in their entireties, a rain sensor, such as the types disclosed in commonly assigned U.S. Pat. Nos. 6,353,392; 6,313,454; 6,320,176 and/or 7,480,149, which are hereby incorporated herein by reference in their entireties, a vehicle vision system, such as a forwardly, sidewardly or rearwardly directed vehicle vision system utilizing principles disclosed in U.S. Pat. Nos. 5,550,677; 5,670,935; 5,760,962; 5,877,897; 5,949,331; 6,222,447; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978 and/or 7,859,565, which are all hereby incorporated herein by reference in their entireties, a trailer hitching aid or tow check system, such as the type disclosed in U.S. Pat. No. 7,005,974, which is hereby incorporated herein by reference in its entirety, a reverse or sideward imaging system, such as for a lane change assistance system or lane departure warning system or for a blind spot or object detection system, such as imaging or detection systems of the types disclosed in U.S. Pat. Nos. 7,720,580; 7,038,577; 5,929,786 and/or 5,786,772, and/or U.S. patent application Ser. No. 11/239,980, filed Sep. 30, 2005, now U.S. Pat. No. 7,881,496, and/or U.S. provisional applications, Ser. No. 60/628,709, filed Nov. 17, 2004; Ser. No. 60/614,644, filed Sep. 30, 2004; Ser. No. 60/618,686, filed Oct. 14, 2004; Ser. No. 60/638,687, filed Dec. 23, 2004, which are hereby incorporated herein by reference in their entireties, a video device for internal cabin surveillance and/or video telephone function, such as disclosed in U.S. Pat. Nos. 5,760,962; 5,877,897; 6,690,268 and/or 7,370,983, and/or U.S. patent application Ser. No. 10/538,724, filed Jun. 13, 2005 and published Mar. 9, 2006 as U.S. Publication No. US-2006-0050018, which are hereby incorporated herein by reference in their entireties, a traffic sign recognition system, a system for determining a distance to a leading or trailing vehicle or object, such as a system utilizing the principles disclosed in U.S. Pat. Nos. 6,396,397 and/or 7,123,168, which are hereby incorporated herein by reference in their entireties, and/or the like.


Optionally, the circuit board or chip may include circuitry for the imaging array sensor and/or other electronic accessories or features, such as by utilizing compass-on-a-chip or EC driver-on-a-chip technology and aspects such as described in U.S. Pat. Nos. 7,255,451 and/or 7,480,149; and/or U.S. patent application Ser. No. 11/226,628, filed Sep. 14, 2005 and published Mar. 23, 2006 as U.S. Publication No. US-2006-0061008, and/or Ser. No. 12/578,732, filed Oct. 14, 2009, now U.S. Pat. No. 9,487,144, which are hereby incorporated herein by reference in their entireties.


Optionally, the vision system may include a display for displaying images captured by one or more of the imaging sensors for viewing by the driver of the vehicle while the driver is normally operating the vehicle. Optionally, for example, the vision system may include a video display device disposed at or in the interior rearview mirror assembly of the vehicle, such as by utilizing aspects of the video mirror display systems described in U.S. Pat. No. 6,690,268 and/or U.S. patent application Ser. No. 13/333,337, filed Dec. 21, 2011, now U.S. Pat. No. 9,264,672, which are hereby incorporated herein by reference in their entireties. The video mirror display may comprise any suitable devices and systems and optionally may utilize aspects of the compass display systems described in U.S. Pat. Nos. 7,370,983; 7,329,013; 7,308,341; 7,289,037; 7,249,860; 7,004,593; 4,546,551; 5,699,044; 4,953,305; 5,576,687; 5,632,092; 5,677,851; 5,708,410; 5,737,226; 5,802,727; 5,878,370; 6,087,953; 6,173,508; 6,222,460; 6,513,252 and/or 6,642,851, and/or European patent application, published Oct. 11, 2000 under Publication No. EP 1 043 566, and/or U.S. patent application Ser. No. 11/226,628, filed Sep. 14, 2005 and published Mar. 23, 2006 as U.S. Publication No. US-2006-0061008, which are all hereby incorporated herein by reference in their entireties. Optionally, the video mirror display screen or device may be operable to display images captured by a rearward viewing camera of the vehicle during a reversing maneuver of the vehicle (such as responsive to the vehicle gear actuator being placed in a reverse gear position or the like) to assist the driver in backing up the vehicle, and optionally may be operable to display the compass heading or directional heading character or icon when the vehicle is not undertaking a reversing maneuver, such as when the vehicle is being driven in a forward direction along a road (such as by utilizing aspects of the display system described in PCT Application No. PCT/US2011/056295, filed Oct. 14, 2011 and published Apr. 19, 2012 as International Publication No. WO 2012/051500, which is hereby incorporated herein by reference in its entirety).
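
By way of illustration only, and not as part of the patented disclosure, the reverse-gear-responsive behavior described above reduces to a simple display-mode selection: rear-camera video while the gear actuator is in the reverse gear position, and a compass heading character otherwise. The following minimal Python sketch uses hypothetical names (GearPosition, heading_to_icon, select_mirror_content) that do not correspond to any actual vehicle API.

    from enum import Enum

    class GearPosition(Enum):
        PARK = "P"
        REVERSE = "R"
        NEUTRAL = "N"
        DRIVE = "D"

    # Eight-point compass characters for the directional heading display.
    COMPASS_ICONS = ["N", "NE", "E", "SE", "S", "SW", "W", "NW"]

    def heading_to_icon(heading_degrees: float) -> str:
        # Map a heading in degrees (0 = north) to the nearest compass point.
        return COMPASS_ICONS[int(((heading_degrees % 360) + 22.5) // 45) % 8]

    def select_mirror_content(gear: GearPosition, heading_degrees: float):
        # Reversing maneuver: show live video from the rearward viewing camera.
        if gear is GearPosition.REVERSE:
            return ("rear_camera_video", None)
        # Forward driving: show the compass heading character or icon instead.
        return ("compass_heading", heading_to_icon(heading_degrees))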


Optionally, the vision system (utilizing the forward facing camera and a rearward facing camera and other cameras disposed at the vehicle with exterior fields of view) may be part of or may provide a display of a top-down view or birds-eye view system of the vehicle or a surround view at the vehicle, such as by utilizing aspects of the vision systems described in PCT Application No. PCT/US10/25545, filed Feb. 26, 2010 and published on Sep. 2, 2010 as International Publication No. WO 2010/099416, and/or PCT Application No. PCT/US10/47256, filed Aug. 31, 2010 and published Mar. 10, 2011 as International Publication No. WO 2011/028686, and/or PCT Application No. PCT/US2011/062834, filed Dec. 1, 2011 and published Jun. 7, 2012 as International Publication No. WO 2012/075250, and/or PCT Application No. PCT/US2012/048993, filed Jul. 31, 2012, and published on Feb. 7, 2013 as International Publication No. WO 2013/019795, and/or PCT Application No. PCT/CA2012/000378, filed Apr. 25, 2012, and published on Nov. 1, 2012 as International Publication No. WO 2012/145822, and/or PCT Application No. PCT/US2012/066571, filed Nov. 27, 2012, and published on Jun. 6, 2013 as International Publication No. WO 2013/081985, and/or PCT Application No. PCT/US2012/068331, filed Dec. 7, 2012, and published on Jun. 13, 2013 as International Publication No. WO 2013/086249, and/or PCT Application No. PCT/US2013/022119, filed Jan. 18, 2013, and published on Jul. 25, 2013 as International Publication No. WO 2013/109869, and/or U.S. patent application Ser. No. 13/333,337, filed Dec. 21, 2011, now U.S. Pat. No. 9,264,672, which are hereby incorporated herein by reference in their entireties.
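
As a purely illustrative sketch (assuming OpenCV and NumPy, and a hypothetical offline calibration that supplies one ground-plane homography per camera; none of this is taken from the publications cited above), a top-down or surround view of this kind can be composed by warping each camera's frame onto a common ground plane and overlaying the covered pixels:

    import cv2
    import numpy as np

    CANVAS_SIZE = (400, 600)  # (width, height) of the top-down view in pixels

    def compose_top_down(frames: dict, homographies: dict) -> np.ndarray:
        # frames: camera name -> BGR image; homographies: camera name -> 3x3
        # ground-plane homography from the assumed offline calibration.
        canvas = np.zeros((CANVAS_SIZE[1], CANVAS_SIZE[0], 3), dtype=np.uint8)
        for name, frame in frames.items():
            warped = cv2.warpPerspective(frame, homographies[name], CANVAS_SIZE)
            mask = warped.any(axis=2)  # pixels this camera actually covers
            canvas[mask] = warped[mask]
        return canvas

In practice the per-camera homographies would come from lens-distortion correction and extrinsic calibration of the forward, rearward and sideward cameras, and a vehicle graphic is typically drawn at the center of the canvas.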


Optionally, a video mirror display may be disposed rearward of and behind the reflective element assembly and may comprise a display such as the types disclosed in U.S. Pat. Nos. 5,530,240; 6,329,925; 7,855,755; 7,626,749; 7,581,859; 7,446,650; 7,370,983; 7,338,177; 7,274,501; 7,255,451; 7,195,381; 7,184,190; 5,668,663; 5,724,187 and/or 6,690,268, and/or in U.S. patent application Ser. No. 12/091,525, filed Apr. 25, 2008, now U.S. Pat. No. 7,855,755; Ser. No. 11/226,628, filed Sep. 14, 2005 and published Mar. 23, 2006 as U.S. Publication No. US-2006-0061008; and/or Ser. No. 10/538,724, filed Jun. 13, 2005 and published Mar. 9, 2006 as U.S. Publication No. US-2006-0050018, which are all hereby incorporated herein by reference in their entireties. The display is viewable through the reflective element when the display is activated to display information. The display element may be any type of display element, such as a vacuum fluorescent (VF) display element, a light emitting diode (LED) display element, such as an organic light emitting diode (OLED) or an inorganic light emitting diode, an electroluminescent (EL) display element, a liquid crystal display (LCD) element, a video screen display element or backlit thin film transistor (TFT) display element or the like, and may be operable to display various information (as discrete characters, icons or the like, or in a multi-pixel manner) to the driver of the vehicle, such as passenger side inflatable restraint (PSIR) information, tire pressure status, and/or the like. The mirror assembly and/or display may utilize aspects described in U.S. Pat. Nos. 7,184,190; 7,255,451; 7,446,924 and/or 7,338,177, which are all hereby incorporated herein by reference in their entireties. The thicknesses and materials of the coatings on the substrates of the reflective element may be selected to provide a desired color or tint to the mirror reflective element, such as a blue colored reflector, such as is known in the art and such as described in U.S. Pat. Nos. 5,910,854; 6,420,036 and/or 7,274,501, which are hereby incorporated herein by reference in their entireties.


Optionally, the display or displays and any associated user inputs may be associated with various accessories or systems, such as, for example, a tire pressure monitoring system or a passenger air bag status or a garage door opening system or a telematics system or any other accessory or system of the mirror assembly or of the vehicle or of an accessory module or console of the vehicle, such as an accessory module or console of the types described in U.S. Pat. Nos. 7,289,037; 6,877,888; 6,824,281; 6,690,268; 6,672,744; 6,386,742 and 6,124,886, and/or U.S. patent application Ser. No. 10/538,724, filed Jun. 13, 2005 and published Mar. 9, 2006 as U.S. Publication No. US-2006-0050018, which are hereby incorporated herein by reference in their entireties.


While the foregoing provides certain non-limiting example embodiments, it should be understood that combinations, subsets, and variations of the foregoing are contemplated. The scope of protection sought is defined by the claims.
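
Purely as an illustrative aid to the pane geometry recited in the claims that follow (all pixel dimensions, slopes and band widths below are assumed values, not taken from this disclosure), the following sketch computes one possible three-pane layout: a rectangular central pane with horizontal upper and lower edges, static vertical bands on either side of it, and side panes whose upper edges slope downwardly and whose lower edges slope upwardly away from the central pane, so that no side-pane edge is parallel to the corresponding central-pane edge:

    def three_pane_layout(screen_w=800, screen_h=300, band_w=10, slope=40):
        # Horizontal extents: left pane | band | central pane | band | right pane.
        pane_w = (screen_w - 2 * band_w) // 3
        x0 = pane_w            # right edge of the left-side pane
        x1 = x0 + band_w       # left edge of the central pane
        x2 = x1 + pane_w       # right edge of the central pane
        x3 = x2 + band_w       # left edge of the right-side pane
        # Each pane is a quadrilateral listed clockwise from its top-left corner.
        center = [(x1, 0), (x2, 0), (x2, screen_h), (x1, screen_h)]
        left = [(0, slope), (x0, 0), (x0, screen_h), (0, screen_h - slope)]
        right = [(x3, 0), (screen_w, slope),
                 (screen_w, screen_h - slope), (x3, screen_h)]
        return {"left": left, "center": center, "right": right}

Because the side panes mirror one another, their upper (and lower) edges are also non-parallel to each other, which is what makes the side panes appear folded with respect to the central pane when the image is displayed.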

Claims
  • 1. A vehicular display system comprising:
    a rearward viewing camera disposed at a rear portion of a body of a vehicle, wherein the rearward viewing camera views rearward of the vehicle, and wherein the rearward viewing camera is operable to capture image data, the captured image data representative of a scene viewed by the rearward viewing camera;
    a controller comprising a processor for processing image data captured by the rearward viewing camera;
    a display device disposed in the vehicle for viewing by a driver of the vehicle when the driver is operating the vehicle, wherein the display device comprises a single video display screen for displaying video images for viewing by the driver of the vehicle;
    wherein, with the rearward viewing camera disposed at the vehicle and capturing image data, captured image data is processed at the controller to generate an output representative of an image for display at the single video display screen of the display device;
    wherein the display device, responsive to the output generated by the controller, displays the image at the single video display screen;
    wherein the image, when displayed at the single video display screen of the display device, comprises three individual image panes including (i) a central image pane derived from a central subset of image data, (ii) a left-side image pane derived from a left-side subset of image data and (iii) a right-side image pane derived from a right-side subset of image data;
    wherein, when the image is displayed at the single video display screen of the display device, each of the central image pane, the left-side image pane and the right-side image pane displayed at the single video display screen has a respective upper edge and a respective lower edge;
    wherein the upper edge of the left-side image pane is not parallel to the upper edge of the right-side image pane, and wherein the lower edge of the left-side image pane is not parallel to the lower edge of the right-side image pane;
    wherein the upper edge of the left-side image pane is not parallel to the upper edge of the central image pane, and wherein the upper edge of the right-side image pane is not parallel to the upper edge of the central image pane; and
    wherein the lower edge of the left-side image pane is not parallel to the lower edge of the central image pane, and wherein the lower edge of the right-side image pane is not parallel to the lower edge of the central image pane.
  • 2. The vehicular display system of claim 1, wherein, when the image is displayed at the single video display screen of the display device, the upper edge of the left-side image pane slopes downwardly away from the central image pane at the left side of the central image pane and the upper edge of the right-side image pane slopes downwardly away from the central image pane at the right side of the central image pane.
  • 3. The vehicular display system of claim 2, wherein, when the image is displayed at the single video display screen of the display device, the lower edge of the left-side image pane slopes upwardly away from the central image pane at the left side of the central image pane and the lower edge of the right-side image pane slopes upwardly away from the central image pane at the right side of the central image pane.
  • 4. The vehicular display system of claim 2, wherein, when the image is displayed at the single video display screen of the display device, the lower edge of the left-side image pane slopes downwardly away from the central image pane at the left side of the central image pane and the lower edge of the right-side image pane slopes downwardly away from the central image pane at the right side of the central image pane.
  • 5. The vehicular display system of claim 4, wherein the upper edge of the left-side image pane is parallel to the lower edge of the left-side image pane, and wherein the upper edge of the right-side image pane is parallel to the lower edge of the right-side image pane.
  • 6. The vehicular display system of claim 4, wherein the upper edge of the left-side image pane is not parallel to the lower edge of the left-side image pane, and wherein the upper edge of the right-side image pane is not parallel to the lower edge of the right-side image pane.
  • 7. The vehicular display system of claim 1, wherein, when the image is displayed at the single video display screen of the display device, the upper edge of the left-side image pane slopes upwardly away from the central image pane at the left side of the central image pane and the upper edge of the right-side image pane slopes upwardly away from the central image pane at the right side of the central image pane.
  • 8. The vehicular display system of claim 7, wherein, when the image is displayed at the single video display screen of the display device, the lower edge of the left-side image pane slopes downwardly away from the central image pane at the left side of the central image pane and the lower edge of the right-side image pane slopes downwardly away from the central image pane at the right side of the central image pane.
  • 9. The vehicular display system of claim 1, wherein the scene viewed by the rearward viewing camera encompasses a bumper of the vehicle, and wherein the displayed image contains at least a portion of the bumper of the vehicle.
  • 10. The vehicular display system of claim 1, wherein the left-side image pane and the right-side image pane are shaped and arranged with respect to the central image pane to appear folded with respect to the central image pane when the image is displayed at the single video display screen.
  • 11. The vehicular display system of claim 1, wherein the central image pane has horizontal upper and lower edges when the image is displayed at the single video display screen.
  • 12. The vehicular display system of claim 1, wherein the displayed image further includes (i) a left-side static vertical band displayed at the single video display screen and separating the central image pane from the left-side image pane and (ii) a right-side static vertical band displayed at the single video display screen and separating the central image pane from the right-side image pane.
  • 13. The vehicular display system of claim 1, wherein the displayed image further includes static triangular regions displayed at the single video display screen, the static triangular regions positioned above or below the left-side image pane and the right-side image pane.
  • 14. The vehicular display system of claim 1, wherein the displayed image further includes a static trapezoidal region displayed at the single video display screen below the central image pane, the left-side image pane and the right-side image pane.
  • 15. The vehicular display system of claim 1, wherein the single video display screen comprises a rectangular display screen having an upper border and a lower border, and wherein the upper edge of the central image pane is parallel to the upper border of the single video display screen.
  • 16. A vehicular display system comprising:
    a rearward viewing camera disposed at a rear portion of a body of a vehicle, wherein the rearward viewing camera views rearward of the vehicle, and wherein the rearward viewing camera is operable to capture image data, the captured image data representative of a scene viewed by the rearward viewing camera;
    a controller comprising a processor for processing image data captured by the rearward viewing camera;
    a display device disposed in the vehicle for viewing by a driver of the vehicle when the driver is operating the vehicle, wherein the display device comprises a single video display screen for displaying video images for viewing by the driver of the vehicle;
    wherein, with the rearward viewing camera disposed at the vehicle and capturing image data, captured image data is processed at the controller to generate an output representative of an image for display at the single video display screen of the display device;
    wherein the display device, responsive to the output generated by the controller, displays the image at the single video display screen;
    wherein the image, when displayed at the single video display screen of the display device, comprises three individual image panes including (i) a central image pane derived from a central subset of image data, (ii) a left-side image pane derived from a left-side subset of image data and (iii) a right-side image pane derived from a right-side subset of image data;
    wherein, when the image is displayed at the single video display screen of the display device, each of the central image pane, the left-side image pane and the right-side image pane displayed at the single video display screen has a respective upper edge and a respective lower edge;
    wherein the single video display screen comprises a rectangular display screen having an upper border and a lower border, and wherein the upper edge of the central image pane is parallel to the upper border of the single video display screen;
    wherein the upper edge of the left-side image pane is not parallel to the upper edge of the right-side image pane, and wherein the lower edge of the left-side image pane is not parallel to the lower edge of the right-side image pane;
    wherein the upper edge of the left-side image pane is not parallel to the upper edge of the central image pane, and wherein the upper edge of the right-side image pane is not parallel to the upper edge of the central image pane;
    wherein the lower edge of the left-side image pane is not parallel to the lower edge of the central image pane, and wherein the lower edge of the right-side image pane is not parallel to the lower edge of the central image pane; and
    wherein the displayed image further includes (i) a left-side static vertical band displayed at the single video display screen and separating the central image pane from the left-side image pane and (ii) a right-side static vertical band displayed at the single video display screen and separating the central image pane from the right-side image pane.
  • 17. The vehicular display system of claim 16, wherein, when the image is displayed at the single video display screen of the display device, the upper edge of the left-side image pane slopes downwardly away from the central image pane at the left side of the central image pane and the upper edge of the right-side image pane slopes downwardly away from the central image pane at the right side of the central image pane.
  • 18. The vehicular display system of claim 17, wherein, when the image is displayed at the single video display screen of the display device, the lower edge of the left-side image pane slopes upwardly away from the central image pane at the left side of the central image pane and the lower edge of the right-side image pane slopes upwardly away from the central image pane at the right side of the central image pane.
  • 19. The vehicular display system of claim 17, wherein, when the image is displayed at the single video display screen of the display device, the lower edge of the left-side image pane slopes downwardly away from the central image pane at the left side of the central image pane and the lower edge of the right-side image pane slopes downwardly away from the central image pane at the right side of the central image pane.
  • 20. The vehicular display system of claim 19, wherein the upper edge of the left-side image pane is parallel to the lower edge of the left-side image pane, and wherein the upper edge of the right-side image pane is parallel to the lower edge of the right-side image pane.
  • 21. The vehicular display system of claim 19, wherein the upper edge of the left-side image pane is not parallel to the lower edge of the left-side image pane, and wherein the upper edge of the right-side image pane is not parallel to the lower edge of the right-side image pane.
  • 22. The vehicular display system of claim 16, wherein, when the image is displayed at the single video display screen of the display device, the upper edge of the left-side image pane slopes upwardly away from the central image pane at the left side of the central image pane and the upper edge of the right-side image pane slopes upwardly away from the central image pane at the right side of the central image pane.
  • 23. The vehicular display system of claim 22, wherein, when the image is displayed at the single video display screen of the display device, the lower edge of the left-side image pane slopes downwardly away from the central image pane at the left side of the central image pane and the lower edge of the right-side image pane slopes downwardly away from the central image pane at the right side of the central image pane.
  • 24. A vehicular display system comprising:
    a rearward viewing camera disposed at a rear portion of a body of a vehicle, wherein the rearward viewing camera views rearward of the vehicle, and wherein the rearward viewing camera is operable to capture image data, the captured image data representative of a scene viewed by the rearward viewing camera;
    a controller comprising a processor for processing image data captured by the rearward viewing camera;
    a display device disposed in the vehicle for viewing by a driver of the vehicle when the driver is operating the vehicle, wherein the display device comprises a single video display screen for displaying video images for viewing by the driver of the vehicle;
    wherein, with the rearward viewing camera disposed at the vehicle and capturing image data, captured image data is processed at the controller to generate an output representative of an image for display at the single video display screen of the display device;
    wherein the display device, responsive to the output generated by the controller, displays the image at the single video display screen;
    wherein the image, when displayed at the single video display screen of the display device, comprises three individual image panes including (i) a central image pane derived from a central subset of image data, (ii) a left-side image pane derived from a left-side subset of image data and (iii) a right-side image pane derived from a right-side subset of image data;
    wherein, when the image is displayed at the single video display screen of the display device, each of the central image pane, the left-side image pane and the right-side image pane displayed at the single video display screen has a respective upper edge and a respective lower edge;
    wherein the upper edge of the left-side image pane is not parallel to the upper edge of the right-side image pane, and wherein the lower edge of the left-side image pane is not parallel to the lower edge of the right-side image pane;
    wherein the upper edge of the left-side image pane is not parallel to the upper edge of the central image pane, and wherein the upper edge of the right-side image pane is not parallel to the upper edge of the central image pane;
    wherein the lower edge of the left-side image pane is not parallel to the lower edge of the central image pane, and wherein the lower edge of the right-side image pane is not parallel to the lower edge of the central image pane;
    wherein, when the image is displayed at the single video display screen of the display device, the lower edge of the left-side image pane slopes downwardly away from the central image pane at the left side of the central image pane and the lower edge of the right-side image pane slopes downwardly away from the central image pane at the right side of the central image pane;
    wherein the displayed image further includes a static trapezoidal region displayed at the single video display screen below the central image pane, the left-side image pane and the right-side image pane; and
    wherein the displayed image further includes (i) a left-side static vertical band displayed at the single video display screen and separating the central image pane from the left-side image pane and (ii) a right-side static vertical band displayed at the single video display screen and separating the central image pane from the right-side image pane.
  • 25. The vehicular display system of claim 24, wherein, when the image is displayed at the single video display screen of the display device, the upper edge of the left-side image pane slopes downwardly away from the central image pane at the left side of the central image pane and the upper edge of the right-side image pane slopes downwardly away from the central image pane at the right side of the central image pane.
  • 26. The vehicular display system of claim 25, wherein the displayed image further includes static triangular regions displayed at the single video display screen, the static triangular regions positioned above the left-side image pane and the right-side image pane.
  • 27. The vehicular display system of claim 24, wherein the upper edge of the left-side image pane is parallel to the lower edge of the left-side image pane, and wherein the upper edge of the right-side image pane is parallel to the lower edge of the right-side image pane.
  • 28. The vehicular display system of claim 24, wherein the upper edge of the left-side image pane is not parallel to the lower edge of the left-side image pane, and wherein the upper edge of the right-side image pane is not parallel to the lower edge of the right-side image pane.
  • 29. The vehicular display system of claim 24, wherein, when the image is displayed at the single video display screen of the display device, the upper edge of the left-side image pane slopes upwardly away from the central image pane at the left side of the central image pane and the upper edge of the right-side image pane slopes upwardly away from the central image pane at the right side of the central image pane.
CROSS REFERENCE TO RELATED APPLICATIONS

The present application is a continuation of U.S. patent application Ser. No. 16/665,071, filed Oct. 28, 2019, now U.S. Pat. No. 11,007,937, which is a continuation of U.S. patent application Ser. No. 13/852,190, filed Mar. 28, 2013, now U.S. Pat. No. 10,457,209, which claims the filing benefit of U.S. provisional applications, Ser. No. 61/745,864, filed Dec. 26, 2012, Ser. No. 61/700,617, filed Sep. 13, 2012, and Ser. No. 61/616,855, filed Mar. 28, 2012, which are hereby incorporated herein by reference in their entireties. U.S. patent application Ser. No. 13/852,190 is also a continuation-in-part of PCT Application No. PCT/US2013/027342, filed Feb. 22, 2013, which claims the filing benefit of U.S. provisional application Ser. No. 61/601,669, filed Feb. 22, 2012, which are hereby incorporated herein by reference in their entireties.

US Referenced Citations (502)
Number Name Date Kind
4645975 Meitzler et al. Feb 1987 A
4692798 Seko et al. Sep 1987 A
4713685 Nishimura et al. Dec 1987 A
4931937 Kakinami et al. Jun 1990 A
4967319 Seko Oct 1990 A
4970653 Kenue Nov 1990 A
5059877 Teder Oct 1991 A
5096287 Kakinami et al. Mar 1992 A
5160971 Koshizawa Nov 1992 A
5161632 Asayama Nov 1992 A
5165108 Asayama Nov 1992 A
5166681 Bottesch et al. Nov 1992 A
5177606 Koshizawa Jan 1993 A
5214408 Asayama May 1993 A
5223907 Asayama Jun 1993 A
5230400 Kakinami et al. Jul 1993 A
5245422 Borcherts et al. Sep 1993 A
5289321 Secor Feb 1994 A
5291424 Asayama et al. Mar 1994 A
5298732 Chen Mar 1994 A
5313072 Vachss May 1994 A
5329206 Slotkowski et al. Jul 1994 A
5336980 Levers Aug 1994 A
5379196 Kobayashi et al. Jan 1995 A
5386285 Asayama Jan 1995 A
5414461 Kishi et al. May 1995 A
5424952 Asayama Jun 1995 A
5426294 Kobayashi et al. Jun 1995 A
5444478 Lelong et al. Aug 1995 A
5483060 Sugiura et al. Jan 1996 A
5483168 Reid Jan 1996 A
5487116 Nakano et al. Jan 1996 A
5488496 Pine Jan 1996 A
5493392 Blackmon et al. Feb 1996 A
5498866 Bendicks et al. Mar 1996 A
5500766 Stonecypher Mar 1996 A
5508592 Lapatovich et al. Apr 1996 A
5510983 Iino Apr 1996 A
5515448 Nishitani May 1996 A
5521633 Nakajima et al. May 1996 A
5528698 Kamei et al. Jun 1996 A
5529138 Shaw et al. Jun 1996 A
5530240 Larson et al. Jun 1996 A
5530420 Tsuchiya et al. Jun 1996 A
5530771 Maekawa Jun 1996 A
5535144 Kise Jul 1996 A
5535314 Alves et al. Jul 1996 A
5537003 Bechtel et al. Jul 1996 A
5539397 Asanuma et al. Jul 1996 A
5541590 Nishio Jul 1996 A
5550677 Schofield et al. Aug 1996 A
5555136 Waldmann et al. Sep 1996 A
5555312 Shima et al. Sep 1996 A
5555555 Sato et al. Sep 1996 A
5559695 Daily Sep 1996 A
5568027 Teder Oct 1996 A
5568316 Schrenk et al. Oct 1996 A
5574443 Hsieh Nov 1996 A
5581464 Woll et al. Dec 1996 A
5582383 Mertens et al. Dec 1996 A
5594222 Caldwell Jan 1997 A
5612686 Takano et al. Mar 1997 A
5612883 Shaffer et al. Mar 1997 A
5614788 Mullins Mar 1997 A
5619370 Guinosso Apr 1997 A
5627586 Yamasaki May 1997 A
5633944 Guibert et al. May 1997 A
5634709 Iwama Jun 1997 A
5638116 Shimoura et al. Jun 1997 A
5642299 Hardin et al. Jun 1997 A
5646612 Byon Jul 1997 A
5648835 Uzawa Jul 1997 A
5650944 Kise Jul 1997 A
5660454 Mori et al. Aug 1997 A
5661303 Teder Aug 1997 A
5670935 Schofield et al. Sep 1997 A
5673019 Dantoni Sep 1997 A
5675489 Pomerleau Oct 1997 A
5676484 Chamberlin et al. Oct 1997 A
5677851 Kingdon et al. Oct 1997 A
5680263 Zimmermann et al. Oct 1997 A
5699044 Van Lente et al. Dec 1997 A
5699057 Ikeda et al. Dec 1997 A
5699149 Kuroda et al. Dec 1997 A
5707129 Kobayashi Jan 1998 A
5724316 Brunts Mar 1998 A
5737226 Olson et al. Apr 1998 A
5757949 Kinoshita et al. May 1998 A
5760826 Nayar Jun 1998 A
5760828 Cortes Jun 1998 A
5760931 Saburi et al. Jun 1998 A
5760962 Schofield et al. Jun 1998 A
5761094 Olson et al. Jun 1998 A
5764139 Nojima et al. Jun 1998 A
5765116 Wilson-Jones et al. Jun 1998 A
5765940 Levy et al. Jun 1998 A
5781437 Wiemer et al. Jul 1998 A
5786772 Schofield et al. Jul 1998 A
5790403 Nakayama Aug 1998 A
5790973 Blaker et al. Aug 1998 A
5793308 Rosinski et al. Aug 1998 A
5793420 Schmidt Aug 1998 A
5808589 Fergason Sep 1998 A
5811888 Hsieh Sep 1998 A
5835255 Miles Nov 1998 A
5835613 Breed et al. Nov 1998 A
5837994 Stam et al. Nov 1998 A
5841126 Fossum et al. Nov 1998 A
5844505 Van Ryzin Dec 1998 A
5844682 Kiyomoto et al. Dec 1998 A
5845000 Breed et al. Dec 1998 A
5848802 Breed et al. Dec 1998 A
5850176 Kinoshita et al. Dec 1998 A
5850254 Takano et al. Dec 1998 A
5867591 Onda Feb 1999 A
5877707 Kowalick Mar 1999 A
5877897 Schofield et al. Mar 1999 A
5878370 Olson Mar 1999 A
5883739 Ashihara et al. Mar 1999 A
5884212 Lion Mar 1999 A
5890021 Onoda Mar 1999 A
5890083 Franke et al. Mar 1999 A
5896085 Mori et al. Apr 1999 A
5899956 Chan May 1999 A
5904725 Iisaka et al. May 1999 A
5912534 Benedict Jun 1999 A
5914815 Bos Jun 1999 A
5922036 Yasui et al. Jul 1999 A
5923027 Stam et al. Jul 1999 A
5929784 Kawaziri et al. Jul 1999 A
5929786 Schofield et al. Jul 1999 A
5938320 Crandall Aug 1999 A
5940120 Frankhouse et al. Aug 1999 A
5942853 Piscart Aug 1999 A
5949331 Schofield et al. Sep 1999 A
5956181 Lin Sep 1999 A
5959367 O'Farrell et al. Sep 1999 A
5959555 Furuta Sep 1999 A
5963247 Banitt Oct 1999 A
5964822 Alland et al. Oct 1999 A
5971552 O'Farrell et al. Oct 1999 A
5986796 Miles Nov 1999 A
5990469 Bechtel et al. Nov 1999 A
5990649 Nagao et al. Nov 1999 A
5991427 Kakinami et al. Nov 1999 A
6005611 Gullichsen Dec 1999 A
6009336 Harris et al. Dec 1999 A
6020704 Buschur Feb 2000 A
6031484 Bullinger et al. Feb 2000 A
6037860 Zander et al. Mar 2000 A
6037975 Aoyama Mar 2000 A
6049171 Stam et al. Apr 2000 A
6052124 Stein et al. Apr 2000 A
6057754 Kinoshita et al. May 2000 A
6066933 Ponziana May 2000 A
6084519 Coulling et al. Jul 2000 A
6091833 Yasui et al. Jul 2000 A
6097023 Schofield et al. Aug 2000 A
6097024 Stam et al. Aug 2000 A
6107939 Sorden Aug 2000 A
6144022 Tenenbaum et al. Nov 2000 A
6144158 Beam Nov 2000 A
6150014 Chu et al. Nov 2000 A
6150930 Cooper Nov 2000 A
6166628 Andreas Dec 2000 A
6175300 Kendrick Jan 2001 B1
6266082 Yonezawa et al. Jul 2001 B1
6266442 Laumeyer et al. Jul 2001 B1
6285393 Shimoura et al. Sep 2001 B1
6285778 Nakajima et al. Sep 2001 B1
6291906 Marcus et al. Sep 2001 B1
6292752 Franke et al. Sep 2001 B1
6294989 Schofield et al. Sep 2001 B1
6297781 Turnbull et al. Oct 2001 B1
6302545 Schofield et al. Oct 2001 B1
6310611 Caldwell Oct 2001 B1
6311119 Sawamoto et al. Oct 2001 B2
6313454 Bos et al. Nov 2001 B1
6315421 Apfelbeck et al. Nov 2001 B1
6317057 Lee Nov 2001 B1
6318870 Spooner et al. Nov 2001 B1
6320176 Schofield et al. Nov 2001 B1
6320282 Caldwell Nov 2001 B1
6324450 Iwama Nov 2001 B1
6326613 Heslin et al. Dec 2001 B1
6329925 Skiver et al. Dec 2001 B1
6333759 Mazzilli Dec 2001 B1
6341523 Lynam Jan 2002 B2
6353392 Schofield et al. Mar 2002 B1
6362729 Hellmann et al. Mar 2002 B1
6366236 Farmer et al. Apr 2002 B1
6370329 Teuchert Apr 2002 B1
6388565 Bernhard et al. May 2002 B1
6388580 Graham May 2002 B1
6411204 Bloomfield et al. Jun 2002 B1
6411328 Franke et al. Jun 2002 B1
6429594 Stam et al. Aug 2002 B1
6430303 Naoi et al. Aug 2002 B1
6433817 Guerra Aug 2002 B1
6441748 Takagi et al. Aug 2002 B1
6442465 Breed et al. Aug 2002 B2
6469739 Bechtel et al. Oct 2002 B1
6472979 Schofield et al. Oct 2002 B2
6477464 McCarthy et al. Nov 2002 B2
6498620 Schofield Dec 2002 B2
6513252 Schierbeek Feb 2003 B1
6516272 Lin Feb 2003 B2
6516664 Lynam Feb 2003 B2
6523964 Schofield et al. Feb 2003 B2
6534884 Marcus et al. Mar 2003 B2
6539306 Turnbull Mar 2003 B2
6540193 DeLine Apr 2003 B1
6547133 Devries, Jr. et al. Apr 2003 B1
6553130 Lemelson et al. Apr 2003 B1
6559435 Schofield et al. May 2003 B2
6570998 Ohtsuka et al. May 2003 B1
6574033 Chui et al. Jun 2003 B1
6578017 Ebersole et al. Jun 2003 B1
6587573 Stam et al. Jul 2003 B1
6589625 Kothari et al. Jul 2003 B1
6593565 Heslin et al. Jul 2003 B2
6593698 Stam et al. Jul 2003 B2
6594583 Ogura et al. Jul 2003 B2
6611202 Schofield et al. Aug 2003 B2
6611610 Stam et al. Aug 2003 B1
6627918 Getz et al. Sep 2003 B2
6631316 Stam et al. Oct 2003 B2
6631994 Suzuki et al. Oct 2003 B2
6636258 Strumolo Oct 2003 B2
6648477 Hutzel et al. Nov 2003 B2
6650233 DeLine et al. Nov 2003 B2
6650455 Miles Nov 2003 B2
6672731 Schnell et al. Jan 2004 B2
6674562 Miles Jan 2004 B1
6678056 Downs Jan 2004 B2
6678614 McCarthy et al. Jan 2004 B2
6680792 Miles Jan 2004 B2
6690268 Schofield et al. Feb 2004 B2
6690337 Mayer, III Feb 2004 B1
6700605 Toyoda et al. Mar 2004 B1
6703925 Steffel Mar 2004 B2
6704621 Stein et al. Mar 2004 B1
6710908 Miles et al. Mar 2004 B2
6711474 Treyz et al. Mar 2004 B1
6714331 Lewis et al. Mar 2004 B2
6717610 Bos et al. Apr 2004 B1
6728393 Stam et al. Apr 2004 B2
6728623 Takenaga et al. Apr 2004 B2
6735506 Breed et al. May 2004 B2
6741377 Miles May 2004 B2
6744353 Sjonell Jun 2004 B2
6757109 Bos Jun 2004 B2
6762867 Lippert et al. Jul 2004 B2
6764210 Akiyama Jul 2004 B2
6765480 Tseng Jul 2004 B2
6784828 Delcheccolo et al. Aug 2004 B2
6794119 Miles Sep 2004 B2
6795221 Urey Sep 2004 B1
6801127 Mizusawa et al. Oct 2004 B2
6801244 Takeda et al. Oct 2004 B2
6802617 Schofield et al. Oct 2004 B2
6806452 Bos et al. Oct 2004 B2
6807287 Hermans Oct 2004 B1
6812463 Okada Nov 2004 B2
6819231 Berberich et al. Nov 2004 B2
6822563 Bos et al. Nov 2004 B2
6823241 Shirato et al. Nov 2004 B2
6823261 Sekiguchi Nov 2004 B2
6824281 Schofield et al. Nov 2004 B2
6831261 Schofield et al. Dec 2004 B2
6838980 Gloger et al. Jan 2005 B2
6842189 Park Jan 2005 B2
6847487 Burgner Jan 2005 B2
6859148 Miller et al. Feb 2005 B2
6861809 Stam Mar 2005 B2
6873253 Veziris Mar 2005 B2
6882287 Schofield Apr 2005 B2
6888447 Hori et al. May 2005 B2
6891563 Schofield et al. May 2005 B2
6898518 Padmanabhan May 2005 B2
6906620 Nakai et al. Jun 2005 B2
6906639 Lemelson et al. Jun 2005 B2
6909753 Meehan et al. Jun 2005 B2
6914521 Rothkop Jul 2005 B2
6932669 Lee et al. Aug 2005 B2
6933837 Gunderson et al. Aug 2005 B2
6940423 Takagi et al. Sep 2005 B2
6946978 Schofield Sep 2005 B2
6950035 Tanaka et al. Sep 2005 B2
6953253 Schofield et al. Oct 2005 B2
6959994 Fujikawa et al. Nov 2005 B2
6961178 Sugino et al. Nov 2005 B2
6961661 Sekiguchi Nov 2005 B2
6967569 Weber et al. Nov 2005 B2
6968736 Lynam Nov 2005 B2
6975775 Rykowski et al. Dec 2005 B2
6989736 Berberich et al. Jan 2006 B2
6995687 Lang et al. Feb 2006 B2
7004593 Weller et al. Feb 2006 B2
7004606 Schofield Feb 2006 B2
7005974 McMahon et al. Feb 2006 B2
7012727 Hutzel et al. Mar 2006 B2
7023331 Kodama Apr 2006 B2
7030738 Ishii Apr 2006 B2
7030775 Sekiguchi Apr 2006 B2
7038577 Pawlicki et al. May 2006 B2
7046448 Burgner May 2006 B2
7057505 Iwamoto Jun 2006 B2
7057681 Hinata et al. Jun 2006 B2
7062300 Kim Jun 2006 B1
7065432 Moisel et al. Jun 2006 B2
7068289 Satoh et al. Jun 2006 B2
7080326 Molander Jul 2006 B2
7085633 Nishira et al. Aug 2006 B2
7085637 Breed et al. Aug 2006 B2
7092548 Laumeyer et al. Aug 2006 B2
7095432 Nakayama et al. Aug 2006 B2
7106213 White Sep 2006 B2
7110021 Nobori et al. Sep 2006 B2
7110156 Lawlor et al. Sep 2006 B2
7113867 Stein Sep 2006 B1
7116246 Winter et al. Oct 2006 B2
7121028 Shoen et al. Oct 2006 B2
7123168 Schofield Oct 2006 B2
7133661 Hatae et al. Nov 2006 B2
7149613 Stam et al. Dec 2006 B2
7151996 Stein Dec 2006 B2
7167796 Taylor et al. Jan 2007 B2
7187498 Bengoechea et al. Mar 2007 B2
7195381 Lynam et al. Mar 2007 B2
7202776 Breed Apr 2007 B2
7202987 Varaprasad et al. Apr 2007 B2
7205904 Schofield Apr 2007 B2
7221363 Roberts et al. May 2007 B2
7224324 Quist et al. May 2007 B2
7227459 Bos et al. Jun 2007 B2
7227611 Hull et al. Jun 2007 B2
7235918 McCullough et al. Jun 2007 B2
7248283 Takagi et al. Jul 2007 B2
7249860 Kulas et al. Jul 2007 B2
7253723 Lindahl et al. Aug 2007 B2
7255451 McCabe et al. Aug 2007 B2
7271951 Weber et al. Sep 2007 B2
7304661 Ishikura Dec 2007 B2
7311406 Schofield et al. Dec 2007 B2
7325934 Schofield et al. Feb 2008 B2
7325935 Schofield et al. Feb 2008 B2
7337055 Matsumoto et al. Feb 2008 B2
7338177 Lynam Mar 2008 B2
7339149 Schofield et al. Mar 2008 B1
7344261 Schofield et al. Mar 2008 B2
7355524 Schofield Apr 2008 B2
7360932 Uken et al. Apr 2008 B2
7370983 DeWind et al. May 2008 B2
7375803 Bamji May 2008 B1
7380948 Schofield et al. Jun 2008 B2
7388182 Schofield et al. Jun 2008 B2
7402786 Schofield et al. Jul 2008 B2
7420756 Lynam Sep 2008 B2
7423248 Schofield et al. Sep 2008 B2
7423821 Bechtel et al. Sep 2008 B2
7425076 Schofield et al. Sep 2008 B2
7429998 Kawauchi et al. Sep 2008 B2
7432967 Bechtel et al. Oct 2008 B2
7446924 Schofield et al. Nov 2008 B2
7459664 Schofield et al. Dec 2008 B2
7460007 Schofield et al. Dec 2008 B2
7474963 Taylor et al. Jan 2009 B2
7489374 Utsumi et al. Feb 2009 B2
7495719 Adachi et al. Feb 2009 B2
7525604 Xue Apr 2009 B2
7526103 Schofield et al. Apr 2009 B2
7541743 Salmeen et al. Jun 2009 B2
7543946 Ockerse et al. Jun 2009 B2
7545429 Travis Jun 2009 B2
7548291 Lee et al. Jun 2009 B2
7551103 Schofield Jun 2009 B2
7561181 Schofield et al. Jul 2009 B2
7565006 Stam et al. Jul 2009 B2
7566851 Stein et al. Jul 2009 B2
7567291 Bechtel et al. Jul 2009 B2
7605856 Imoto Oct 2009 B2
7613327 Stam et al. Nov 2009 B2
7616781 Schofield et al. Nov 2009 B2
7619508 Lynam et al. Nov 2009 B2
7629996 Rademacher et al. Dec 2009 B2
7633383 Dunsmoir et al. Dec 2009 B2
7639149 Katoh Dec 2009 B2
7653215 Stam Jan 2010 B2
7655894 Schofield et al. Feb 2010 B2
7663798 Tonar et al. Feb 2010 B2
7676087 Dhua et al. Mar 2010 B2
7720580 Higgins-Luthman May 2010 B2
7724434 Cross et al. May 2010 B2
7731403 Lynam et al. Jun 2010 B2
7742864 Sekiguchi Jun 2010 B2
7786898 Stein et al. Aug 2010 B2
7791694 Molsen et al. Sep 2010 B2
7792329 Schofield et al. Sep 2010 B2
7842154 Lynam Nov 2010 B2
7843451 Lafon Nov 2010 B2
7854514 Conner et al. Dec 2010 B2
7855755 Weller et al. Dec 2010 B2
7855778 Yung et al. Dec 2010 B2
7859565 Schofield et al. Dec 2010 B2
7877175 Higgins-Luthman Jan 2011 B2
7881496 Camilleri et al. Feb 2011 B2
7903324 Kobayashi et al. Mar 2011 B2
7903335 Nieuwkerk et al. Mar 2011 B2
7914187 Higgins-Luthman et al. Mar 2011 B2
7930160 Hosagrahara et al. Apr 2011 B1
7949152 Schofield et al. May 2011 B2
7965357 Van De Witte et al. Jun 2011 B2
7991522 Higgins-Luthman Aug 2011 B2
7994462 Schofield et al. Aug 2011 B2
8017898 Lu et al. Sep 2011 B2
8027691 Bernas et al. Sep 2011 B2
8044776 Schofield Oct 2011 B2
8064643 Stein et al. Nov 2011 B2
8082101 Stein et al. Dec 2011 B2
8090153 Schofield et al. Jan 2012 B2
8095310 Taylor et al. Jan 2012 B2
8098142 Schofield et al. Jan 2012 B2
8120652 Bechtel et al. Feb 2012 B2
8164628 Stein et al. Apr 2012 B2
8184159 Luo May 2012 B2
8203440 Schofield et al. Jun 2012 B2
8222588 Schofield et al. Jul 2012 B2
8224031 Saito Jul 2012 B2
8233045 Luo et al. Jul 2012 B2
8254635 Stein et al. Aug 2012 B2
8289430 Bechtel et al. Oct 2012 B2
8305471 Bechtel et al. Nov 2012 B2
8308325 Takayanagi et al. Nov 2012 B2
8314689 Schofield et al. Nov 2012 B2
8324552 Schofield et al. Dec 2012 B2
8339526 Minikey, Jr. et al. Dec 2012 B2
8378851 Stein et al. Feb 2013 B2
8386114 Higgins-Luthman Feb 2013 B2
8405726 Schofield et al. Mar 2013 B2
8452055 Stein et al. May 2013 B2
8553088 Stein et al. Oct 2013 B2
10457209 Byrne et al. Oct 2019 B2
11007937 Byrne et al. May 2021 B2
20020005778 Breed et al. Jan 2002 A1
20020029103 Breed et al. Mar 2002 A1
20020113873 Williams Aug 2002 A1
20020116106 Breed et al. Aug 2002 A1
20030035050 Mizusawa Feb 2003 A1
20030103142 Hitomi et al. Jun 2003 A1
20030125855 Breed et al. Jul 2003 A1
20030137586 Lewellen Jul 2003 A1
20030209893 Breed et al. Nov 2003 A1
20030222982 Hamdan et al. Dec 2003 A1
20040164228 Fogg et al. Aug 2004 A1
20040179099 Bos Sep 2004 A1
20040200948 Bos et al. Oct 2004 A1
20050073853 Stam Apr 2005 A1
20050131607 Breed Jun 2005 A1
20050219852 Stam et al. Oct 2005 A1
20050237385 Kosaka et al. Oct 2005 A1
20060017807 Lee Jan 2006 A1
20060018511 Stam et al. Jan 2006 A1
20060018512 Stam et al. Jan 2006 A1
20060050018 Hutzel et al. Mar 2006 A1
20060091813 Stam et al. May 2006 A1
20060103727 Tseng May 2006 A1
20060244829 Kato Nov 2006 A1
20060250501 Wildmann et al. Nov 2006 A1
20070024724 Stein et al. Feb 2007 A1
20070104476 Yasutomi et al. May 2007 A1
20070109406 Schofield et al. May 2007 A1
20070120657 Schofield et al. May 2007 A1
20070154063 Breed Jul 2007 A1
20070193811 Breed et al. Aug 2007 A1
20070242339 Bradley Oct 2007 A1
20080043099 Stein et al. Feb 2008 A1
20080147321 Howard et al. Jun 2008 A1
20080192132 Bechtel et al. Aug 2008 A1
20080231710 Asari Sep 2008 A1
20080234899 Breed et al. Sep 2008 A1
20080266396 Stein Oct 2008 A1
20090052003 Schofield et al. Feb 2009 A1
20090066065 Breed et al. Mar 2009 A1
20090079585 Chinomi Mar 2009 A1
20090113509 Tseng et al. Apr 2009 A1
20090160987 Bechtel et al. Jun 2009 A1
20090190015 Bechtel et al. Jul 2009 A1
20090201137 Weller et al. Aug 2009 A1
20090243824 Peterson et al. Oct 2009 A1
20090256938 Bechtel et al. Oct 2009 A1
20110032422 Yamamoto Feb 2011 A1
20120045112 Lundblad et al. Feb 2012 A1
20120069185 Stein Mar 2012 A1
20120154591 Baur et al. Jun 2012 A1
20120200707 Stein et al. Aug 2012 A1
20120314071 Rosenbaum et al. Dec 2012 A1
20130027558 Ramanath et al. Jan 2013 A1
20130141580 Stein et al. Jun 2013 A1
20130147957 Stein Jun 2013 A1
20130286193 Pflug Oct 2013 A1
20140055616 Corcoran Feb 2014 A1
Foreign Referenced Citations (24)
Number Date Country
0353200 Jan 1990 EP
0640903 Mar 1995 EP
1115250 Jul 2001 EP
59114139 Jul 1984 JP
6080953 May 1985 JP
6079889 Oct 1986 JP
S6216073 Apr 1987 JP
6272245 May 1987 JP
S62131837 Jun 1987 JP
6414700 Jan 1989 JP
H1168538 Jul 1989 JP
3099952 Apr 1991 JP
4114587 Apr 1992 JP
H04127280 Apr 1992 JP
0577657 Mar 1993 JP
5213113 Aug 1993 JP
6227318 Aug 1994 JP
07105496 Apr 1995 JP
2630604 Jul 1997 JP
200274339 Mar 2002 JP
200383742 Mar 2003 JP
20041658 Jan 2004 JP
2011014497 Feb 2011 WO
2013126715 Aug 2013 WO
Non-Patent Literature Citations (7)
Entry
Achler et al., “Vehicle Wheel Detector using 2D Filter Banks,” IEEE Intelligent Vehicles Symposium of Jun. 2004.
Broggi et al., “Multi-Resolution Vehicle Detection using Artificial Vision,” IEEE Intelligent Vehicles Symposium of Jun. 2004.
Sun et al., “On-road vehicle detection using optical sensors: a review”.
Tokimaru et al., “CMOS Rear-View TV System with CCD Camera”, National Technical Report vol. 34, No. 3, pp. 329-336, Jun. 1988 (Japan).
Vellacott, Oliver, “CMOS in Camera,” IEE Review, pp. 111-114 (May 1994).
Wang et al., “CMOS Video Cameras,” article, 1991, 4 pages, University of Edinburgh, UK.
Zheng et al., “An Adaptive System for Traffic Sign Recognition,” IEEE Proceedings of the Intelligent Vehicles '94 Symposium, pp. 165-170 (Oct. 1994).
Related Publications (1)
Number Date Country
20210268963 A1 Sep 2021 US
Provisional Applications (4)
Number Date Country
61745864 Dec 2012 US
61700617 Sep 2012 US
61616855 Mar 2012 US
61601669 Feb 2012 US
Continuations (2)
Number Date Country
Parent 16665071 Oct 2019 US
Child 17302935 US
Parent 13852190 Mar 2013 US
Child 16665071 US
Continuation in Parts (1)
Number Date Country
Parent PCT/US2013/027342 Feb 2013 US
Child 13852190 US