Method of synchronizing multiple vehicular cameras with an ECU

Abstract
A method of synchronizing cameras with an electronic control unit (ECU) of a vehicular vision system includes providing camera control signals to the cameras from the ECU via respective links from the ECU to the cameras, with the camera control signals regulating timing of the respective camera to be synchronous with reference timing of the ECU. The timing regulation of the cameras includes starting the camera synchronous to the ECU reference timing and holding the camera synchronous to the ECU reference timing. Image data is captured with each camera and provided to the ECU via the respective link. Image data captured by at least one of the cameras may be processed to detect an object present exterior of the vehicle.
Description
FIELD OF THE INVENTION

The present invention relates to vehicles with cameras mounted thereon and in particular to vehicles with one or more exterior-facing cameras, such as forward facing cameras and/or sideward facing cameras and/or rearward facing cameras.


BACKGROUND OF THE INVENTION

Use of imaging sensors in vehicle imaging systems is common and known. Examples of such known systems are described in U.S. Pat. Nos. 5,949,331; 5,670,935 and/or 5,550,677, which are hereby incorporated herein by reference in their entireties.


SUMMARY OF THE INVENTION

The present invention provides a camera for a vision system that utilizes one or more cameras or image sensors to capture image data of a scene exterior (such as forwardly) of a vehicle and provides a display of images indicative of or representative of the captured image data. The vehicle vision system automatically synchronizes a number of cameras of the vision system without changing the system architecture. The vehicle vision system powers on or initializes a camera, and starts the camera synchronous to an ECU reference timing, and then regulates the camera timing synchronous to the ECU reference timing. The system may adjust or regulate the camera or sensor between a fast mode and a slow mode depending on whether a maximum buffer level achieved during processing exceeds a selected maximum buffer threshold and whether a minimum buffer level achieved during processing is below a selected minimum buffer threshold. By adjusting the mode of the camera or sensor, the system can regulate the camera and synchronize the camera to the ECU timing.


These and other objects, advantages, purposes and features of the present invention will become apparent upon review of the following specification in conjunction with the drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a plan view of a vehicle with a vision system that incorporates cameras in accordance with the present invention;



FIG. 2A is a schematic of a multi-camera system in accordance with the present invention;



FIG. 2B is a schematic of a multi-camera system with a hub in the path of video data transmission in accordance with the present invention, with some cameras (Cam1 and Cam2) running over the hub and some cameras (CamN) connected to the ECU directly;



FIG. 3A is a schematic of an ECU structure of the vision system of the present invention, showing on-the-fly image processing without an image buffer, with the image pixels of all of the cameras being clock aligned behind the line buffers;



FIG. 3B is another schematic of an ECU structure of the vision system of the present invention, showing image processing with an image buffer within the pipeline;



FIG. 4 is a flow chart showing the synchronization main states of the present invention;



FIG. 5A is a flow chart of a smart camera operation in accordance with the present invention, showing a sequence when starting from reset;



FIG. 5B is a flow chart of a smart camera operation in accordance with the present invention, showing a sequence when starting from standby; and



FIG. 6 is a flow chart of a regulation of the smart camera of the vision system of the present invention.





DETAILED DESCRIPTION OF THE INVENTION

Referring now to the drawings and the illustrative embodiments depicted therein, a vehicle 10 includes an imaging system or vision system 12 that includes one or more imaging sensors or cameras (such as a rearward facing imaging sensor or camera 14a and/or a forwardly facing camera 14b at the front (or at the windshield) of the vehicle, and/or a sidewardly/rearwardly facing camera 14c, 14d at the sides of the vehicle), which capture images exterior of the vehicle, with the cameras having a lens for focusing images at or onto an imaging array or imaging plane of the camera (FIG. 1). The vision system 12 is operable to process image data captured by the cameras and may provide displayed images at a display device 16 for viewing by the driver of the vehicle. Optionally, the vision system may process image data to detect objects, such as objects to the rear of the subject or equipped vehicle during a reversing maneuver, or such as approaching or following vehicles or vehicles at a side lane adjacent to the subject or equipped vehicle or the like.


General Description:


A typical multi-camera video system as shown in FIG. 2A comprises several (2 to N) satellite cameras, an electronic control unit (ECU) and a display. Typically, the satellite cameras (such as exterior facing cameras such as cameras 14a, 14b, 14c, 14d of FIG. 1) are connected to the ECU via a common video interface (such as NTSC (National Television System Committee) or PAL (Phase Alternating Line) or CameraLink or LVDS or the like) and an additional control channel (CAN, LIN, UART). Ethernet and LVDS (Low Voltage Differential Signaling) may combine the video and the control channel over one interface. None of the mentioned interfaces is able to distribute a common clock, generated inside the ECU, to the cameras. Therefore, all of the cameras and the ECU have their own clock sources or timing, although they normally have the same nominal clock frequencies.


In such a system, it is often desired or required that the image signal processing uses images that are sensed at the same time, so that they can be combined without artifacts in the display output (particularly for generating a surround view display image based on image data captured from multiple exterior viewing cameras of the vehicle). Therefore, the sensors have to run synchronized all the time: their frame rates have to be substantially or exactly the same, and the sensor readout has to be at substantially or nearly the same video line at any given time.


The present invention provides a method that allows the system to synchronize any number of cameras without changing the system architecture. Although only one camera is discussed in the description below, the description applies to all of the cameras of a multi-camera vision system because every camera of a plurality of cameras of the vehicle may be synchronized to an ECU reference timing individually.


When running the video data over common high performance busses (such as Flexray or the like), there may be hubs instead of an ECU that the cameras are interfacing with and synchronized to. For example, see FIG. 2B, which shows a system that has some cameras (Cam1 and Cam2) connected to and interfacing with a hub, with another camera or other cameras (CamN) connected to and interfacing with the ECU directly. Although only the interfacing to an ECU is discussed in the description below, the present invention and description below apply to both a hub and an ECU as the reference-timing-providing node for the camera or cameras.


ECU Entities for Synchronization:


The components for realizing the synchronization tasks in the ECU are shown in FIG. 3A. The camera video data Cam_Video is received by the Video Line Buffer via the Video Interface, controlled by the transfer clock Cam_CLK. The side signals Cam_HS and Cam_VS signal the line period with blanking (horizontal timing) and the frame period with blanking (vertical timing). Data is only written to the Video Line Buffer when Cam_HS and Cam_VS both do not signal a blank phase. The Video Line Buffer is organized in a FIFO (first in first out) manner, containing only the pixel data of about 1 to 2 lines or more. Its depth depends on the clock tolerances and on how finely the sensor timing can be adjusted.
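The following is a minimal sketch, in C, of the write side of the Video Line Buffer just described, assuming a small FIFO of about one to two video lines; the fifo_push() helper, the FIFO depth and the polarity of the blanking flags are illustrative assumptions rather than the actual ECU implementation.

    /* Write side of the Video Line Buffer (FIG. 3A): a pixel is stored on each
     * Cam_CLK cycle only when neither Cam_HS nor Cam_VS signals a blank phase. */
    #include <stdbool.h>
    #include <stdint.h>

    #define FIFO_DEPTH 4096u                 /* roughly 1 to 2 video lines of pixels */

    typedef struct {
        uint16_t data[FIFO_DEPTH];
        unsigned wr, rd, level;              /* level corresponds to Buffer_level */
    } line_fifo_t;

    static void fifo_push(line_fifo_t *f, uint16_t pixel)
    {
        f->data[f->wr] = pixel;
        f->wr = (f->wr + 1u) % FIFO_DEPTH;
        f->level++;                          /* overflow is prevented by the regulation task */
    }

    /* Called once per Cam_CLK cycle on the camera side of the buffer. */
    void video_line_buffer_write(line_fifo_t *f, uint16_t cam_video,
                                 bool cam_hs_blank, bool cam_vs_blank)
    {
        if (!cam_hs_blank && !cam_vs_blank)  /* both signals must indicate active video */
            fifo_push(f, cam_video);
    }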


The Reference Timing Generator produces the internal pixel reference clock Ref_CLK and the timing signals Ref_HS and Ref_VS, which have nearly the same timing properties as the camera signals. When the reference timing signals indicate valid data, the Video Line Buffer is read. Finally, Cam_CLK, Cam_Video, Cam_HS and Cam_VS are replaced by Ref_CLK, Sync_Video, Ref_HS and Ref_VS. This first has the effect that the camera data path is clock-synchronized to the ECU reference timing.
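Complementing the write-side sketch above, the following hedged sketch shows the read side in the Ref_CLK domain, where a pixel is popped whenever Ref_HS and Ref_VS indicate valid data, so that Sync_Video is produced against the ECU reference timing; it reuses the line_fifo_t type from the previous sketch, and the function and flag names are assumptions.

    /* Read side of the Video Line Buffer: called once per Ref_CLK cycle.
     * Returns a pixel of Sync_Video when the reference timing indicates
     * active video and the FIFO is not empty. */
    uint16_t video_line_buffer_read(line_fifo_t *f, bool ref_hs_blank,
                                    bool ref_vs_blank, bool *sync_video_valid)
    {
        if (!ref_hs_blank && !ref_vs_blank && f->level > 0u) {
            uint16_t pixel = f->data[f->rd];
            f->rd = (f->rd + 1u) % FIFO_DEPTH;
            f->level--;
            *sync_video_valid = true;
            return pixel;                    /* pixel now travels with Ref_CLK/Ref_HS/Ref_VS */
        }
        *sync_video_valid = false;
        return 0u;
    }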


Then the Camera Sync Control instance has to control and ensure that the lines and frames are synchronized. This is achieved, first, by ensuring that the camera is started at a well-defined time or timing point, so that the start of the first frame is written to the Video Line Buffer when the reading also starts. Secondly, the camera timing has to be programmed so that it is definitely a bit faster than the reference timing. This will lead to an increasing Buffer_level of the Video Line Buffer during the frame processing. After some frames, the Buffer_level arrives at a certain upper level threshold c_th_upper, where the camera sensor has to be adjusted to a timing that is a bit slower than the reference timing. After this, the Buffer_level will decrease frame by frame. When the value arrives at a certain lower level threshold c_th_lower, the sensor has to be re-adjusted to the faster timing, and so on. The threshold values c_th_upper and c_th_lower are constants, which have to be well determined or calculated to avoid a buffer overflow or underflow under all conditions.


Ref_VS is required to find the right start point for the camera in Camera Sync Control. A physical Power Switch on the ECU to control the camera power is not required but may be an optional element of the system. The method or system of the present invention also works if camera power is switched somewhere else, such as, for example, by the vehicle ignition or a system activation switch.


Synchronization for Image Pipeline with Image Buffer:


In the case where an image buffer is available in the image pipeline, the FIFO or buffer only has to equalize the drift between the sensor clock and the reference clock during one line. Behind the FIFO, the image alignment still has a drift of about 1-2 lines, which is resolved at the image buffer during the vertical blank period. The main benefit of this solution is the much smaller FIFO. Optionally, the solution with the larger line buffer FIFO (FIG. 3A) may also be used in systems with an image buffer in the pipeline.


The input structure for an ECU with an image buffer in the image pipeline is shown in FIG. 3B. In this case, the Cam_HS and Cam_VS signals are just sampled to the reference clock Ref_CLK and delayed by some clock period or time period. The delay has to be realized in a manner that conserves the active periods of the lines. During the blank phases, the FIFO is always empty. When a line starts, it is written from the camera side. Optionally, and desirably, when it is filled about half way, the reading also starts. The difference from the solution of FIG. 3A is that Sync_HS and Sync_VS in the system of FIG. 3B are still dependent on the sensor timing and therefore exist in parallel for each camera, like Sync_Video. The Camera Sync Control is the same in both systems. Instead of the buffer level, the system of FIG. 3B calculates the distance in clock cycles between the start of frame (SOF) of the reference timing at Ref_VS and the SOF of the sensor timing at Sync_VS. This is the task of the Calc_Distance block.
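As one way to picture the Calc_Distance block, the hedged sketch below latches a free-running Ref_CLK counter at the start-of-frame edges of Ref_VS and of the resampled Sync_VS and returns their difference in clock cycles; the structure and function names are illustrative assumptions, not the actual block.

    /* Illustrative Calc_Distance (FIG. 3B): the returned Distance replaces the
     * Buffer_level of FIG. 3A as the input to the Camera Sync Control. */
    #include <stdbool.h>
    #include <stdint.h>

    typedef struct {
        uint32_t ref_clk_counter;   /* increments every Ref_CLK cycle                    */
        uint32_t ref_sof_stamp;     /* counter value at the last Ref_VS start of frame   */
        uint32_t cam_sof_stamp;     /* counter value at the last Sync_VS start of frame  */
        bool     ref_vs_prev, sync_vs_prev;
    } calc_distance_t;

    /* Called once per Ref_CLK cycle with the current (resampled) VS levels. */
    uint32_t calc_distance_step(calc_distance_t *c, bool ref_vs, bool sync_vs)
    {
        c->ref_clk_counter++;

        if (ref_vs && !c->ref_vs_prev)       /* rising edge of Ref_VS: reference SOF */
            c->ref_sof_stamp = c->ref_clk_counter;
        if (sync_vs && !c->sync_vs_prev)     /* rising edge of Sync_VS: sensor SOF   */
            c->cam_sof_stamp = c->ref_clk_counter;

        c->ref_vs_prev  = ref_vs;
        c->sync_vs_prev = sync_vs;

        return c->cam_sof_stamp - c->ref_sof_stamp;   /* distance in Ref_CLK cycles */
    }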


Camera Sync Control—Main Tasks:


To achieve a multi-camera system running synchronously in accordance with the present invention, two tasks are performed:

    • 1) Start the camera synchronous to the ECU reference timing; and
    • 2) Hold (regulate) the camera timing synchronous to the ECU timing.

FIG. 4 shows the main tasks in a state machine depending on the camera power (POWER_on).
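A minimal sketch of this state machine is given below, assuming a periodic sync_control_step() call per camera; the state names and the helper functions are illustrative assumptions corresponding to the tasks of FIG. 5A/5B and FIG. 6, not the actual firmware interface.

    /* Main synchronization states (FIG. 4): off while unpowered, then start the
     * camera synchronous to the reference timing, then regulate it continuously. */
    #include <stdbool.h>

    typedef enum { CAM_OFF, CAM_STARTING, CAM_REGULATING } sync_state_t;

    typedef struct { sync_state_t state; } camera_sync_t;

    extern bool camera_power_on(void);           /* POWER_on                       */
    extern bool start_camera_synchronous(void);  /* task of FIG. 5A or FIG. 5B     */
    extern void regulate_camera_timing(void);    /* one pass of the task of FIG. 6 */

    void sync_control_step(camera_sync_t *c)
    {
        if (!camera_power_on()) {                /* loss of power returns to OFF   */
            c->state = CAM_OFF;
            return;
        }
        switch (c->state) {
        case CAM_OFF:
            c->state = CAM_STARTING;
            break;
        case CAM_STARTING:                       /* start synchronous to Ref_VS    */
            if (start_camera_synchronous())
                c->state = CAM_REGULATING;
            break;
        case CAM_REGULATING:                     /* hold synchronous to reference  */
            regulate_camera_timing();
            break;
        }
    }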


Start Camera from Reset Task:


A flowchart of the start camera task is shown in FIG. 5A. After starting or powering-on a camera, the link from ECU to camera is initialized, regardless of which interface technology is used. After the link is initialized, a communication to the camera sensor and optional parts such as, for example, an EEPROM or the like, is established.


The EEPROM may contain production and calibration data, which is often named intrinsic data. This camera intrinsic data should be read next, because reading it later on may be more time consuming or complicated.


If the startup time of the sensor varies from part to part, it may be helpful to measure the individual behavior and adjust the start point individually. A good period to measure is from reset to the start of frame (SOF) or from reset to the end of frame (EOF). If this period is essentially constant, the measurement step (with its preceding reset) can be left out.


At the end, the sensor has to be reconfigured for application specific needs. The startup time of the sensor in that case is different from the startup time with default values after the reset. However, the difference of these two times will be constant for all parts. With knowledge of that time difference or delta time, and under consideration of known or selected tolerances, the starting point related to the ECU reference Ref_VS can be calculated so that the sensor will start its first frame slightly before the ECU wants to read it from the FIFO. The system then waits for that start point after the triggering edge of Ref_VS.


The startup of the sensor is initiated with a reset. This can be done either via hardware (HW), such as by pin toggling or the like, or via software (SW), such as by sending a command or the like. Then the application specific configuration is sent to the sensor, at the end of which the sensor has the correct frame timing. At the end of this task, the sensor's start of its first frame will be synchronous with or synchronized to the ECU reference timing and the first read.
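The start point calculation described above can be pictured with the following hedged sketch; the field names, time unit and sign conventions are assumptions chosen for illustration, and an actual calculation would use the measured or data-sheet startup times of the particular sensor.

    /* Illustrative start-point calculation for the start-from-reset task (FIG. 5A):
     * the reset is issued at such a delay after the triggering Ref_VS edge that the
     * sensor's first frame begins slightly before the ECU starts reading the FIFO. */
    #include <stdint.h>

    typedef struct {
        uint32_t reset_to_sof_us;    /* measured reset-to-SOF time with default values       */
        int32_t  config_delta_us;    /* constant delta caused by application specific config */
        uint32_t tolerance_us;       /* allowance for known or selected tolerances           */
        uint32_t lead_us;            /* how much earlier than the ECU read the SOF should be */
        uint32_t frame_period_us;    /* ECU reference frame period                           */
    } start_calc_t;

    /* Returns the delay from the triggering edge of Ref_VS to the reset command. */
    uint32_t start_point_after_ref_vs(const start_calc_t *p)
    {
        uint32_t startup = p->reset_to_sof_us + (uint32_t)p->config_delta_us
                         + p->tolerance_us + p->lead_us;

        /* Wrap into one reference frame period so the wait stays below one frame. */
        return (p->frame_period_us - (startup % p->frame_period_us)) % p->frame_period_us;
    }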


Start Camera from Standby Task:


A flowchart of the start camera task is shown in FIG. 5B. After starting or powering-on a camera, the link from ECU to camera is initialized, regardless of which interface technology is used. After that, a communication to the camera sensor and optional parts, such as, for example, an EEPROM or the like, is established.


The EEPROM may contain production and calibration data, which is often named intrinsic data. This data should be read next, because reading it later on may be more time consuming or complicated to accomplish.


Optionally, the sensor may be configured with application specific settings or application specific settings may be loaded into the sensor.


After that, the sensor has to be put into the standby mode, in which the application specific settings are not lost. The sensor start timing from standby to run mode may then be measured. If the startup time of the sensor varies from part to part, it may be helpful to measure the individual behavior and adjust the start point individually. A good period to measure is from run to the start of frame (SOF) or from run to the end of frame (EOF). If this period is essentially constant, the measurement step can be left out.


Then the starting point relative to the ECU reference Ref_VS is calculated, so that the sensor will start its first frame slightly before the ECU wants to read it from the FIFO.


The system then waits for the start point relative to the triggering edge of Ref_VS, and the startup of the sensor is initiated by setting the sensor into run mode. At the end of this task, the sensor's start of its first frame will be synchronous to the ECU reference timing and the first read.


Regulation Task:


A flowchart of the regulation task is shown in FIG. 6. Buffer_Level from FIG. 3A and Distance from FIG. 3B have the same meaning for the regulation task. The regulation task is a repeating process that is started once. As described before, the sensor normally operates a bit faster than the ECU reference timing. This is the fast mode.


While the end of frame (EOF) is awaited, the system saves the maximum Buffer_Level that occurred in fast mode to max_buffer_level. After the end of frame (EOF), max_buffer_level is compared to the c_th_upper threshold. If the max buffer level is not greater than the c_th_upper threshold, then there is no risk that the buffer will overflow in the next frame and the sensor can continue in fast mode.


Otherwise, if the max buffer level is greater than the c_th_upper threshold, the sensor timing is switched to the slow mode. This is done during the vertical blanking period before the start of frame (SOF).


While the end of frame is awaited, the system saves the minimum Buffer_Level that occurred in slow mode to min_buffer_level. After EOF, min_buffer_level is compared to the c_th_lower threshold. If the minimum buffer level is not smaller than the c_th_lower threshold, then there is no risk that the buffer will underflow in the next frame and the sensor can continue in slow mode. Otherwise, if the minimum buffer level is smaller than the threshold, the sensor timing is switched to the fast mode. This is done during the vertical blanking period before SOF. The regulation task then restarts.
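A hedged sketch of this regulation loop is given below; read_buffer_level() stands for Buffer_Level (FIG. 3A) or Distance (FIG. 3B), and the other helper functions (frame_active, wait_for_vertical_blank, set_sensor_fast/slow) are assumed interfaces used only to illustrate the fast/slow switching, not the actual implementation.

    /* Regulation task (FIG. 6): per frame, record the extreme buffer level and,
     * during the vertical blanking before SOF, switch the sensor between the
     * fast and slow timing to keep the level between c_th_lower and c_th_upper. */
    #include <stdbool.h>
    #include <stdint.h>

    extern uint32_t read_buffer_level(void);     /* Buffer_Level or Distance        */
    extern bool     frame_active(void);          /* true until end of frame (EOF)   */
    extern void     wait_for_vertical_blank(void);
    extern void     set_sensor_fast(void);       /* slightly faster than reference  */
    extern void     set_sensor_slow(void);       /* slightly slower than reference  */

    void regulation_task(uint32_t c_th_upper, uint32_t c_th_lower)
    {
        bool fast_mode = true;                   /* the sensor starts in fast mode  */
        set_sensor_fast();

        for (;;) {
            uint32_t max_level = 0u, min_level = UINT32_MAX;

            while (frame_active()) {             /* track the extreme level until EOF */
                uint32_t level = read_buffer_level();
                if (level > max_level) max_level = level;
                if (level < min_level) min_level = level;
            }

            wait_for_vertical_blank();           /* mode is only switched before SOF */
            if (fast_mode && max_level > c_th_upper) {
                set_sensor_slow();               /* avoid an overflow in the next frame */
                fast_mode = false;
            } else if (!fast_mode && min_level < c_th_lower) {
                set_sensor_fast();               /* avoid an underflow in the next frame */
                fast_mode = true;
            }
        }
    }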


Optionally, the regulation task may also operate well if the min_buffer_level is not calculated and checked. In such an application, the fast mode follows one frame period of slow mode automatically. This depends on the adjustment granularity of the sensor and the system design.


Switching Sensor Timing:


In principle, there are two possibilities to slow down the sensor timing. A first option is to add an additional blank line. This enlarges the frame period by one line period. However, if the sensor has a rolling shutter, this method won't work, because it may conflict with the exposure control.


A second and preferred option is to add blank pixels to the lines. In this case, the frame period is enlarged by the number of lines multiplied by the number of added pixels per row. The exposure control is also influenced in this case, but in a much smoother way that is not noticeable to a viewer of the displayed images captured by the imager.
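The effect on the frame period can be checked with a small calculation; the clock rate and line/frame dimensions below are assumed example values, not figures from the description.

    /* Example arithmetic: adding blank pixels to every line lengthens the frame
     * period by total_lines * added_blank_pixels pixel clocks, spreading the
     * adjustment smoothly over the whole frame. */
    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        const uint32_t pixel_clk_hz    = 27000000u;  /* assumed pixel clock            */
        const uint32_t total_pixels    = 1000u;      /* pixels per line incl. blanking */
        const uint32_t total_lines     = 750u;       /* lines per frame incl. blanking */
        const uint32_t added_blank_pix = 2u;         /* extra blank pixels per line    */

        uint64_t fast_frame_clks = (uint64_t)total_pixels * total_lines;
        uint64_t slow_frame_clks = (uint64_t)(total_pixels + added_blank_pix) * total_lines;

        printf("fast frame period: %.3f ms\n", 1e3 * fast_frame_clks / pixel_clk_hz);
        printf("slow frame period: %.3f ms\n", 1e3 * slow_frame_clks / pixel_clk_hz);
        printf("difference: %llu pixel clocks (%.1f us)\n",
               (unsigned long long)(slow_frame_clks - fast_frame_clks),
               1e6 * (double)(slow_frame_clks - fast_frame_clks) / pixel_clk_hz);
        return 0;
    }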


Therefore, the present invention provides a system that automatically synchronizes a number of cameras of a vehicle vision system without changing the system architecture. The system of the present invention powers on or initializes a camera, and starts the camera synchronous to an ECU reference timing, and then regulates the camera timing synchronous to the ECU reference timing. The system may adjust or regulate the camera or sensor between a fast mode and a slow mode depending on whether a maximum buffer level achieved during processing exceeds a selected maximum buffer threshold and whether a minimum buffer level achieved during processing is below a selected minimum buffer threshold. By adjusting the mode of the camera or sensor, the system can regulate the camera and synchronize the camera to the ECU timing.


The vehicle may include any type of sensor or sensors, such as imaging sensors or radar sensors or lidar sensors or ladar sensors or ultrasonic sensors or the like. The imaging sensor or camera may capture image data for image processing and may comprise any suitable camera or sensing device, such as, for example, an array of a plurality of photosensor elements arranged in at least 640 columns and 480 rows (preferably a megapixel imaging array or the like), with a respective lens focusing images onto respective portions of the array. The photosensor array may comprise a plurality of photosensor elements arranged in a photosensor array having rows and columns. The logic and control circuit of the imaging sensor may function in any known manner, and the image processing and algorithmic processing may comprise any suitable means for processing the images and/or image data.


For example, the vision system and/or processing and/or camera and/or circuitry may utilize aspects described in U.S. Pat. Nos. 7,005,974; 5,760,962; 5,877,897; 5,796,094; 5,949,331; 6,222,447; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978; 7,859,565; 5,550,677; 5,670,935; 6,636,258; 7,145,519; 7,161,616; 7,230,640; 7,248,283; 7,295,229; 7,301,466; 7,592,928; 7,881,496; 7,720,580; 7,038,577; 6,882,287; 5,929,786 and/or 5,786,772, and/or International Publication Nos. WO 2011/028686; WO 2010/099416; WO 2012/061567; WO 2012/068331; WO 2012/075250; WO 2012/103193; WO 2012/0116043; WO 2012/0145313; WO 2012/0145501; WO 2012/145818; WO 2012/145822; WO 2012/158167; WO 2012/075250; WO 2012/103193; WO 2012/0116043; WO 2012/0145501; WO 2012/0145343; WO 2012/154919; WO 2013/019707; WO 2013/016409; WO 2012/145822; WO 2013/067083; WO 2013/070539; WO 2013/043661; WO 2013/048994; WO 2013/063014, WO 2013/081984; WO 2013/081985; WO 2013/074604; WO 2013/086249; WO 2013/103548; WO 2013/109869; WO 2013/123161; WO 2013/126715; WO 2013/043661 and/or WO 2013/158592 and/or U.S. patent applications, Ser. No. 14/082,573, filed Nov. 18, 2013 and published May 22, 2014 as U.S. Publication No. US-2014-0139676; Ser. No. 14/082,574, filed Nov. 18, 2013 and published May 22, 2014 as U.S. Publication No. US-2014-0138140; Ser. No. 14/082,575, filed Nov. 18, 2013 and published Jun. 5, 2014 as U.S. Publication No. US-2014-0156157; Ser. No. 14/082,577, filed Nov. 18, 2013, now U.S. Pat. No. 8,818,042; Ser. No. 14/071,086, filed Nov. 4, 2013, now U.S. Pat. No. 8,886,401; Ser. No. 14/076,524, filed Nov. 11, 2013, now U.S. Pat. No. 9,077,962; Ser. No. 14/052,945, filed Oct. 14, 2013 and published Apr. 17, 2014 as U.S. Publication No. US-2014-0104426; Ser. No. 14/046,174, filed Oct. 4, 2013 and published Apr. 10, 2014 as U.S. Publication No. US-2014-0098229; Ser. No. 14/016,790, filed Oct. 3, 2013 and published Mar. 6, 2014 as U.S. Publication No. US-2014-0067206; Ser. No. 14/036,723, filed Sep. 25, 2013, now U.S. Pat. No. 9,446,713; Ser. No. 14/016,790, filed Sep. 3, 2013 and published Mar. 6, 2014 as U.S. Publication No. US-2014-0067206; Ser. No. 14/001,272, filed Aug. 23, 2013, now U.S. Pat. No. 9,233,641; Ser. No. 13/970,868, filed Aug. 20, 2013 and published Feb. 20, 2014 as U.S. Publication No. US-2014-0049646; Ser. No. 13/964,134, filed Aug. 12, 2013 and published Feb. 20, 2014 as U.S. Publication No. US-2014-0052340; Ser. No. 13/942,758, filed Jul. 16, 2013 and published Jan. 23, 2014 as U.S. Publication No. US-2014-0025240; Ser. No. 13/942,753, filed Jul. 16, 2013 and published Jan. 30, 2014 as U.S. Publication No. US-2014-0028852; Ser. No. 13/927,680, filed Jun. 26, 2013 and published Jan. 2, 2014 as U.S. Publication No. US-2014-0005907; Ser. No. 13/916,051, filed Jun. 12, 2013, now U.S. Pat. No. 9,077,098; Ser. No. 13/894,870, filed May 15, 2013 and published Nov. 28, 2013 as U.S. Publication No. US-2013-0314503; Ser. No. 13/887,724, filed May 6, 2013 and published Nov. 14, 2013 as U.S. Publication No. US-2013-0298866; Ser. No. 13/852,190, filed Mar. 28, 2013 and published Aug. 29, 2013 as U.S. Publication No. US-2013-0222593; Ser. No. 13/851,378, filed Mar. 27, 2013 and published Nov. 14, 2013 as U.S. Publication No. US-2013-0300869; Ser. No. 13/848,796, filed Mar. 22, 2012 and published Oct. 24, 2013 as U.S. Publication No. US-2013-0278769; Ser. No. 13/847,815, filed Mar. 20, 2013 and published Oct. 
31, 2013 as U.S. Publication No. US-2013-0286193; Ser. No. 13/800,697, filed Mar. 13, 2013 and published Oct. 3, 2013 as U.S. Publication No. US-2013-0258077; Ser. No. 13/785,099, filed Mar. 5, 2013 and published Sep. 19, 2013 as U.S. Publication No. US-2013-0242099; Ser. No. 13/779,881, filed Feb. 28, 2013, now U.S. Pat. No. 8,694,224; Ser. No. 13/774,317, filed Feb. 22, 2013, now U.S. Pat. No. 9,269,263; Ser. No. 13/774,315, filed Feb. 22, 2013 and published Aug. 22, 2013 as U.S. Publication No. US-2013-0215271; Ser. No. 13/681,963, filed Nov. 20, 2012, now U.S. Pat. No. 9,264,673; Ser. No. 13/660,306, filed Oct. 25, 2012, now U.S. Pat. No. 9,146,898; Ser. No. 13/653,577, filed Oct. 17, 2012, now U.S. Pat. No. 9,174,574, and/or Ser. No. 13/534,657, filed Jun. 27, 2012 and published Jan. 3, 2013 as U.S. Publication No. US-2013-0002873, and/or U.S. provisional applications, Ser. No. 61/901,127, filed Nov. 7, 2013; Ser. No. 61/905,461, filed Nov. 18, 2013; Ser. No. 61/905,462, filed Nov. 18, 2013; Ser. No. 61/895,610, filed Oct. 25, 2013; Ser. No. 61/895,609, filed Oct. 25, 2013; Ser. No. 61/893,489, filed Oct. 21, 2013; Ser. No. 61/886,883, filed Oct. 4, 2013; Ser. No. 61/879,837, filed Sep. 19, 2013; Ser. No. 61/879,835, filed Sep. 19, 2013; Ser. No. 61/878,877, filed Sep. 17, 2013; Ser. No. 61/875,351, filed Sep. 9, 2013; Ser. No. 61/869,195, filed Aug. 23, 2013; Ser. No. 61/864,835, filed Aug. 12, 2013; Ser. No. 61/864,836, filed Aug. 12, 2013; Ser. No. 61/864,837, filed Aug. 12, 2013; Ser. No. 61/864,838, filed Aug. 12, 2013; Ser. No. 61/856,843, filed Jul. 22, 2013, Ser. No. 61/845,061, filed Jul. 11, 2013; Ser. No. 61/844,630, filed Jul. 10, 2013; Ser. No. 61/844,173, filed Jul. 9, 2013; Ser. No. 61/844,171, filed Jul. 9, 2013; Ser. No. 61/842,644, filed Jul. 3, 2013; Ser. No. 61/840,542, filed Jun. 28, 2013; Ser. No. 61/838,619, filed Jun. 24, 2013; Ser. No. 61/838,621, filed Jun. 24, 2013; Ser. No. 61/837,955, filed Jun. 21, 2013; Ser. No. 61/836,900, filed Jun. 19, 2013; Ser. No. 61/836,380, filed Jun. 18, 2013; Ser. No. 61/834,129, filed Jun. 12, 2013; Ser. No. 61/833,080, filed Jun. 10, 2013; Ser. No. 61/830,375, filed Jun. 3, 2013; Ser. No. 61/830,377, filed Jun. 3, 2013; Ser. No. 61/825,752, filed May 21, 2013; Ser. No. 61/825,753, filed May 21, 2013; Ser. No. 61/823,648, filed May 15, 2013; Ser. No. 61/823,644, filed May 15, 2013; Ser. No. 61/821,922, filed May 10, 2013; Ser. No. 61/819,835, filed May 6, 2013; Ser. No. 61/819,033, filed May 3, 2013; Ser. No. 61/816,956, filed Apr. 29, 2013; Ser. No. 61/815,044, filed Apr. 23, 2013; Ser. No. 61/814,533, filed Apr. 22, 2013; Ser. No. 61/813,361, filed Apr. 18, 2013; Ser. No. 61/810,407, filed Apr. 10, 2013; Ser. No. 61/808,930, filed Apr. 5, 2013; Ser. No. 61/807,050, filed Apr. 1, 2013; Ser. No. 61/806,674, filed Mar. 29, 2013; Ser. No. 61/793,592, filed Mar. 15, 2013; Ser. No. 61/772,015, filed Mar. 4, 2013; Ser. No. 61/772,014, filed Mar. 4, 2013; Ser. No. 61/770,051, filed Feb. 27, 2013; Ser. No. 61/770,048, filed Feb. 27, 2013; Ser. No. 61/766,883, filed Feb. 20, 2013; Ser. No. 61/760,366, filed Feb. 4, 2013; Ser. No. 61/760,364, filed Feb. 4, 2013; Ser. No. 61/756,832, filed Jan. 25, 2013; Ser. No. 61/754,804, filed Jan. 21, 2013; Ser. No. 61/736,104, filed Dec. 12, 2012; Ser. No. 61/736,103, filed Dec. 12, 2012; Ser. No. 61/734,457, filed Dec. 7, 2012, and/or Ser. No. 61/733,093, filed Dec. 4, 2012, which are all hereby incorporated herein by reference in their entireties. 
The system may communicate with other communication systems via any suitable means, such as by utilizing aspects of the systems described in International Publication Nos. WO/2010/144900; WO 2013/043661 and/or WO 2013/081985, and/or U.S. patent application Ser. No. 13/202,005, filed Aug. 17, 2011, now U.S. Pat. No. 9,126,525, which are hereby incorporated herein by reference in their entireties.


The imaging device and control and image processor and any associated illumination source, if applicable, may comprise any suitable components, and may utilize aspects of the cameras and vision systems described in U.S. Pat. Nos. 5,550,677; 5,877,897; 6,498,620; 5,670,935; 5,796,094; 6,396,397; 6,806,452; 6,690,268; 7,005,974; 7,123,168; 7,004,606; 6,946,978; 7,038,577; 6,353,392; 6,320,176; 6,313,454 and 6,824,281, and/or International Publication No. WO 2010/099416, published Sep. 2, 2010, and/or PCT Application No. PCT/US10/47256, filed Aug. 31, 2010 and published Mar. 10, 2011 as International Publication No. WO 2011/028686, and/or U.S. patent application Ser. No. 12/508,840, filed Jul. 24, 2009, and published Jan. 28, 2010 as U.S. Pat. Publication No. US 2010-0020170, and/or PCT Application No. PCT/US2012/048110, filed Jul. 25, 2012 and published Jan. 31, 2013 as International Publication No. WO 2013/016409, and/or U.S. patent application Ser. No. 13/534,657, filed Jun. 27, 2012 and published Jan. 3, 2013 as U.S. Publication No. US-2013-0002873, which are all hereby incorporated herein by reference in their entireties. The camera or cameras may comprise any suitable cameras or imaging sensors or camera modules, and may utilize aspects of the cameras or sensors described in U.S. patent applications, Ser. No. 12/091,359, filed Apr. 24, 2008 and published Oct. 1, 2009 as U.S. Publication No. US-2009-0244361, and/or Ser. No. 13/260,400, filed Sep. 26, 2011, now U.S. Pat. No. 8,542,451, and/or U.S. Pat. Nos. 7,965,336 and/or 7,480,149, which are hereby incorporated herein by reference in their entireties. The imaging array sensor may comprise any suitable sensor, and may utilize various imaging sensors or imaging array sensors or cameras or the like, such as a CMOS imaging array sensor, a CCD sensor or other sensors or the like, such as the types described in U.S. Pat. Nos. 5,550,677; 5,670,935; 5,760,962; 5,715,093; 5,877,897; 6,922,292; 6,757,109; 6,717,610; 6,590,719; 6,201,642; 6,498,620; 5,796,094; 6,097,023; 6,320,176; 6,559,435; 6,831,261; 6,806,452; 6,396,397; 6,822,563; 6,946,978; 7,339,149; 7,038,577; 7,004,606 and/or 7,720,580, and/or U.S. patent application Ser. No. 10/534,632, filed May 11, 2005, now U.S. Pat. No. 7,965,336, and/or PCT Application No. PCT/US2008/076022, filed Sep. 11, 2008 and published Mar. 19, 2009 as International Publication No. WO/2009/036176, and/or PCT Application No. PCT/US2008/078700, filed Oct. 3, 2008 and published Apr. 9, 2009 as International Publication No. WO/2009/046268, which are all hereby incorporated herein by reference in their entireties.


The camera module and circuit chip or board and imaging sensor may be implemented and operated in connection with various vehicular vision-based systems, and/or may be operable utilizing the principles of such other vehicular systems, such as a vehicle headlamp control system, such as the type disclosed in U.S. Pat. Nos. 5,796,094; 6,097,023; 6,320,176; 6,559,435; 6,831,261; 7,004,606; 7,339,149 and/or 7,526,103, which are all hereby incorporated herein by reference in their entireties, a rain sensor, such as the types disclosed in commonly assigned U.S. Pat. Nos. 6,353,392; 6,313,454; 6,320,176 and/or 7,480,149, which are hereby incorporated herein by reference in their entireties, a vehicle vision system, such as a forwardly, sidewardly or rearwardly directed vehicle vision system utilizing principles disclosed in U.S. Pat. Nos. 5,550,677; 5,670,935; 5,760,962; 5,877,897; 5,949,331; 6,222,447; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978 and/or 7,859,565, which are all hereby incorporated herein by reference in their entireties, a trailer hitching aid or tow check system, such as the type disclosed in U.S. Pat. No. 7,005,974, which is hereby incorporated herein by reference in its entirety, a reverse or sideward imaging system, such as for a lane change assistance system or lane departure warning system or for a blind spot or object detection system, such as imaging or detection systems of the types disclosed in U.S. Pat. Nos. 7,720,580; 7,038,577; 5,929,786 and/or 5,786,772, and/or U.S. patent applications, Ser. No. 11/239,980, filed Sep. 30, 2005, now U.S. Pat. No. 7,881,496, and/or U.S. provisional applications, Ser. No. 60/628,709, filed Nov. 17, 2004; Ser. No. 60/614,644, filed Sep. 30, 2004; Ser. No. 60/618,686, filed Oct. 14, 2004; Ser. No. 60/638,687, filed Dec. 23, 2004, which are hereby incorporated herein by reference in their entireties, a video device for internal cabin surveillance and/or video telephone function, such as disclosed in U.S. Pat. Nos. 5,760,962; 5,877,897; 6,690,268 and/or 7,370,983, and/or U.S. patent application Ser. No. 10/538,724, filed Jun. 13, 2005 and published Mar. 9, 2006 as U.S. Publication No. US-2006-0050018, which are hereby incorporated herein by reference in their entireties, a traffic sign recognition system, a system for determining a distance to a leading or trailing vehicle or object, such as a system utilizing the principles disclosed in U.S. Pat. Nos. 6,396,397 and/or 7,123,168, which are hereby incorporated herein by reference in their entireties, and/or the like.


Optionally, the circuit board or chip may include circuitry for the imaging array sensor and/or other electronic accessories or features, such as by utilizing compass-on-a-chip or EC driver-on-a-chip technology and aspects such as described in U.S. Pat. No. 7,255,451 and/or U.S. Pat. No. 7,480,149, and/or U.S. patent applications, Ser. No. 11/226,628, filed Sep. 14, 2005 and published Mar. 23, 2006 as U.S. Publication No. US-2006-0061008, and/or Ser. No. 12/578,732, filed Oct. 14, 2009 and published Apr. 22, 2010 as U.S. Publication No. US-2010-0097469, which are hereby incorporated herein by reference in their entireties.


Optionally, the vision system may include a display for displaying images captured by one or more of the imaging sensors for viewing by the driver of the vehicle while the driver is normally operating the vehicle. Optionally, for example, the vision system may include a video display device disposed at or in the interior rearview mirror assembly of the vehicle, such as by utilizing aspects of the video mirror display systems described in U.S. Pat. No. 6,690,268 and/or U.S. patent application Ser. No. 13/333,337, filed Dec. 21, 2011, now U.S. Pat. No. 9,264,672, which are hereby incorporated herein by reference in their entireties. The video mirror display may comprise any suitable devices and systems and optionally may utilize aspects of the compass display systems described in U.S. Pat. Nos. 7,370,983; 7,329,013; 7,308,341; 7,289,037; 7,249,860; 7,004,593; 4,546,551; 5,699,044; 4,953,305; 5,576,687; 5,632,092; 5,677,851; 5,708,410; 5,737,226; 5,802,727; 5,878,370; 6,087,953; 6,173,508; 6,222,460; 6,513,252 and/or 6,642,851, and/or European patent application, published Oct. 11, 2000 under Publication No. EP 0 1043566, and/or U.S. patent application Ser. No. 11/226,628, filed Sep. 14, 2005 and published Mar. 23, 2006 as U.S. Publication No. US-2006-0061008, which are all hereby incorporated herein by reference in their entireties. Optionally, the video mirror display screen or device may be operable to display images captured by a rearward viewing camera of the vehicle during a reversing maneuver of the vehicle (such as responsive to the vehicle gear actuator being placed in a reverse gear position or the like) to assist the driver in backing up the vehicle, and optionally may be operable to display the compass heading or directional heading character or icon when the vehicle is not undertaking a reversing maneuver, such as when the vehicle is being driven in a forward direction along a road (such as by utilizing aspects of the display system described in PCT Application No. PCT/US2011/056295, filed Oct. 14, 2011 and published Apr. 19, 2012 as International Publication No. WO 2012/051500, which is hereby incorporated herein by reference in its entirety).


Optionally, the vision system (utilizing the forward facing camera and a rearward facing camera and other cameras disposed at the vehicle with exterior fields of view) may be part of or may provide a display of a top-down view or birds-eye view system of the vehicle or a surround view at the vehicle, such as by utilizing aspects of the vision systems described in International Publication Nos. WO 2010/099416; WO 2011/028686; WO 2012/075250; WO 2013/019795; WO 2012/075250; WO 2012/145822; WO 2013/081985; WO 2013/086249 and/or WO 2013/109869, and/or U.S. patent application Ser. No. 13/333,337, filed Dec. 21, 2011, now U.S. Pat. No. 9,264,672, which are hereby incorporated herein by reference in their entireties.


Optionally, a video mirror display may be disposed rearward of and behind the reflective element assembly and may comprise a display such as the types disclosed in U.S. Pat. Nos. 5,530,240; 6,329,925; 7,855,755; 7,626,749; 7,581,859; 7,446,650; 7,370,983; 7,338,177; 7,274,501; 7,255,451; 7,195,381; 7,184,190; 5,668,663; 5,724,187 and/or 6,690,268, and/or in U.S. patent applications, Ser. No. 12/091,525, filed Apr. 25, 2008, now U.S. Pat. No. 7,855,755; Ser. No. 11/226,628, filed Sep. 14, 2005 and published Mar. 23, 2006 as U.S. Publication No. US-2006-0061008, and/or Ser. No. 10/538,724, filed Jun. 13, 2005 and published Mar. 9, 2006 as U.S. Publication No. US-2006-0050018, which are all hereby incorporated herein by reference in their entireties. The display is viewable through the reflective element when the display is activated to display information. The display element may be any type of display element, such as a vacuum fluorescent (VF) display element, a light emitting diode (LED) display element, such as an organic light emitting diode (OLED) or an inorganic light emitting diode, an electroluminescent (EL) display element, a liquid crystal display (LCD) element, a video screen display element or backlit thin film transistor (TFT) display element or the like, and may be operable to display various information (as discrete characters, icons or the like, or in a multi-pixel manner) to the driver of the vehicle, such as passenger side inflatable restraint (PSIR) information, tire pressure status, and/or the like. The mirror assembly and/or display may utilize aspects described in U.S. Pat. Nos. 7,184,190; 7,255,451; 7,446,924 and/or 7,338,177, which are all hereby incorporated herein by reference in their entireties. The thicknesses and materials of the coatings on the substrates of the reflective element may be selected to provide a desired color or tint to the mirror reflective element, such as a blue colored reflector, such as is known in the art and such as described in U.S. Pat. Nos. 5,910,854; 6,420,036 and/or 7,274,501, which are hereby incorporated herein by reference in their entireties.


Optionally, the display or displays and any associated user inputs may be associated with various accessories or systems, such as, for example, a tire pressure monitoring system or a passenger air bag status or a garage door opening system or a telematics system or any other accessory or system of the mirror assembly or of the vehicle or of an accessory module or console of the vehicle, such as an accessory module or console of the types described in U.S. Pat. Nos. 7,289,037; 6,877,888; 6,824,281; 6,690,268; 6,672,744; 6,386,742 and 6,124,886, and/or U.S. patent application Ser. No. 10/538,724, filed Jun. 13, 2005 and published Mar. 9, 2006 as U.S. Publication No. US-2006-0050018, which are hereby incorporated herein by reference in their entireties.


While the above description constitutes a plurality of embodiments of the present invention, it will be appreciated that the present invention is susceptible to further modification and change without departing from the fair meaning of the accompanying claims.

Claims
  • 1. A method of synchronizing cameras with an electronic control unit of a vehicular vision system, said method comprising: providing a plurality of cameras at a vehicle equipped with the vehicular vision system, wherein the plurality of cameras comprises at least a first camera, a second camera, a third camera and a fourth camera;wherein providing the plurality of cameras comprises providing the first camera at a front portion of the equipped vehicle, providing the second camera at a left-side side portion of the equipped vehicle, providing the third camera at a right-side side portion of the equipped vehicle, and providing the fourth camera at a rear portion of the equipped vehicle;wherein each of the first camera, the second camera, the third camera and the fourth camera, when provided at the equipped vehicle, has a respective field of view exterior of the equipped vehicle;providing an electronic control unit (ECU) at the equipped vehicle;providing a first camera control signal to the first camera from the ECU via a first link from the ECU to the first camera, wherein the first camera control signal regulates timing of the first camera to be synchronous with reference timing of the ECU;providing a second camera control signal to the second camera from the ECU via a second link from the ECU to the second camera, wherein the second camera control signal regulates timing of the second camera to be synchronous with reference timing of the ECU;providing a third camera control signal to the third camera from the ECU via a third link from the ECU to the third camera, wherein the third camera control signal regulates timing of the third camera to be synchronous with reference timing of the ECU;providing a fourth camera control signal to the fourth camera from the ECU via a fourth link from the ECU to the fourth camera, wherein the fourth camera control signal regulates timing of the fourth camera to be synchronous with reference timing of the ECU;regulating timing of the first camera via starting the first camera synchronous to the ECU reference timing and holding the first camera synchronous to the ECU reference timing;regulating timing of the second camera via starting the second camera synchronous to the ECU reference timing and holding the second camera synchronous to the ECU reference timing;regulating timing of the third camera via starting the third camera synchronous to the ECU reference timing and holding the third camera synchronous to the ECU reference timing;regulating timing of the fourth camera via starting the fourth camera synchronous to the ECU reference timing and holding the fourth camera synchronous to the ECU reference timing;capturing frames of image data with each of the first camera, the second camera, the third camera and the fourth camera;providing image data captured by the first camera to the ECU via the first link from the first camera to the ECU;providing image data captured by the second camera to the ECU via the second link from the second camera to the ECU;providing image data captured by the third camera to the ECU via the third link from the third camera to the ECU; andproviding image data captured by the fourth camera to the ECU via the fourth link from the fourth camera to the ECU.
  • 2. The method of claim 1, comprising (i) providing at least one other camera control signal from the ECU to the first camera via the first link, (ii) providing at least one other camera control signal from the ECU to the second camera via the second link, (iii) providing at least one other camera control signal from the ECU to the third camera via the third link and (iv) providing at least one other camera control signal from the ECU to the fourth camera via the fourth link.
  • 3. The method of claim 1, comprising processing, at the ECU, image data captured by at least one of the first camera, the second camera, the third camera and the fourth camera to detect an object present exterior of the equipped vehicle.
  • 4. The method of claim 3, wherein the object present exterior of the equipped vehicle is exterior a side of the equipped vehicle.
  • 5. The method of claim 4, wherein the object present exterior the side of the equipped vehicle comprises a vehicle that is approaching the equipped vehicle.
  • 6. The method of claim 5, wherein the vehicle that is approaching the equipped vehicle is traveling in a traffic lane adjacent to a traffic lane in which the equipped vehicle is traveling.
  • 7. The method of claim 3, wherein the object present exterior of the equipped vehicle is to the rear of the equipped vehicle and is detected during a reversing maneuver of the equipped vehicle.
  • 8. The method of claim 1, wherein the first camera, the second camera, the third camera and the fourth camera are part of a surround vision system of the equipped vehicle.
  • 9. The method of claim 8, comprising (i) processing, at the ECU, image data captured by at least one of the first camera, the second camera, the third camera and the fourth camera, and (ii) outputting, via the ECU, and responsive at least in part to the processing, images for display at a display device of the equipped vehicle for viewing by an occupant of the equipped vehicle.
  • 10. The method of claim 9, wherein the displayed images comprise birds-eye view images generated by said surround vision system.
  • 11. The method of claim 1, wherein the first camera comprises a first megapixel imaging array having at least one million photosensing elements arranged in columns and rows of photosensing elements, and wherein the second camera comprises a second megapixel imaging array having at least one million photosensing elements arranged in columns and rows of photosensing elements, and wherein the third camera comprises a third megapixel imaging array having at least one million photosensing elements arranged in columns and rows of photosensing elements, and wherein the fourth camera comprises a fourth megapixel imaging array having at least one million photosensing elements arranged in columns and rows of photosensing elements.
  • 12. The method of claim 1, comprising, upon powering the first, second, third and fourth cameras, initializing the first, second, third and fourth links to establish communication with the respective first, second, third and fourth cameras.
  • 13. The method of claim 12, wherein the established communication at least comprises first, second, third, and fourth camera calibration data for the respective first, second, third, and fourth cameras.
  • 14. A method of synchronizing cameras with an electronic control unit of a vehicular vision system, said method comprising: providing a plurality of cameras at a vehicle equipped with the vehicular vision system, wherein the plurality of cameras comprises at least a first camera, a second camera, a third camera and a fourth camera;wherein the first camera comprises a first megapixel imaging array having at least one million photosensing elements arranged in columns and rows of photosensing elements, and wherein the second camera comprises a second megapixel imaging array having at least one million photosensing elements arranged in columns and rows of photosensing elements, and wherein the third camera comprises a third megapixel imaging array having at least one million photosensing elements arranged in columns and rows of photosensing elements, and wherein the fourth camera comprises a fourth megapixel imaging array having at least one million photosensing elements arranged in columns and rows of photosensing elements;wherein providing the plurality of cameras comprises providing the first camera at a front portion of the equipped vehicle, providing the second camera at a left-side side portion of the equipped vehicle, providing the third camera at a right-side side portion of the equipped vehicle, and providing the fourth camera at a rear portion of the equipped vehicle;wherein each of the first camera, the second camera, the third camera and the fourth camera, when provided at the equipped vehicle, has a respective field of view exterior of the equipped vehicle;providing an electronic control unit (ECU) at the equipped vehicle;providing a first camera control signal to the first camera from the ECU via a first link from the ECU to the first camera, wherein the first camera control signal regulates timing of the first camera to be synchronous with reference timing of the ECU;providing a second camera control signal to the second camera from the ECU via a second link from the ECU to the second camera, wherein the second camera control signal regulates timing of the second camera to be synchronous with reference timing of the ECU;providing a third camera control signal to the third camera from the ECU via a third link from the ECU to the third camera, wherein the third camera control signal regulates timing of the third camera to be synchronous with reference timing of the ECU;providing a fourth camera control signal to the fourth camera from the ECU via a fourth link from the ECU to the fourth camera, wherein the fourth camera control signal regulates timing of the fourth camera to be synchronous with reference timing of the ECU;regulating timing of the first camera via starting the first camera synchronous to the ECU reference timing and holding the first camera synchronous to the ECU reference timing;regulating timing of the second camera via starting the second camera synchronous to the ECU reference timing and holding the second camera synchronous to the ECU reference timing;regulating timing of the third camera via starting the third camera synchronous to the ECU reference timing and holding the third camera synchronous to the ECU reference timing;regulating timing of the fourth camera via starting the fourth camera synchronous to the ECU reference timing and holding the fourth camera synchronous to the ECU reference timing;providing at least one other camera control signal from the ECU to the first camera via the first link;providing at least one other camera control signal 
from the ECU to the second camera via the second link;providing at least one other camera control signal from the ECU to the third camera via the third link;providing at least one other camera control signal from the ECU to the fourth camera via the fourth link;capturing frames of image data with each of the first camera, the second camera, the third camera and the fourth camera;providing image data captured by the first camera to the ECU via the first link from the first camera to the ECU;providing image data captured by the second camera to the ECU via the second link from the second camera to the ECU;providing image data captured by the third camera to the ECU via the third link from the third camera to the ECU;providing image data captured by the fourth camera to the ECU via the fourth link from the fourth camera to the ECU; andprocessing, at the ECU, image data captured by at least one of the first camera, the second camera, the third camera and the fourth camera to detect an object present exterior of the equipped vehicle.
  • 15. The method of claim 14, wherein the object present exterior of the equipped vehicle is exterior a side of the equipped vehicle.
  • 16. The method of claim 15, wherein the object present exterior the side of the equipped vehicle comprises a vehicle that is approaching the equipped vehicle.
  • 17. The method of claim 16, wherein the vehicle that is approaching the equipped vehicle is traveling in a traffic lane adjacent to a traffic lane in which the equipped vehicle is traveling.
  • 18. The method of claim 14, wherein the object present exterior of the equipped vehicle is to the rear of the equipped vehicle and is detected during a reversing maneuver of the equipped vehicle.
  • 19. The method of claim 14, wherein the first camera, the second camera, the third camera and the fourth camera are part of a surround vision system of the equipped vehicle.
  • 20. The method of claim 19, comprising (i) processing, at the ECU, image data captured by at least one of the first camera, the second camera, the third camera and the fourth camera, and (ii) outputting, via the ECU, and responsive at least in part to the processing, images for display at a display device of the equipped vehicle for viewing by an occupant of the equipped vehicle.
  • 21. The method of claim 20, wherein the displayed images comprise birds-eye view images generated by said surround vision system.
  • 22. The method of claim 14, comprising, upon powering the first, second, third and fourth cameras, initializing the first, second, third and fourth links to establish communication with the respective first, second, third and fourth cameras.
  • 23. The method of claim 22, wherein the established communication at least comprises first, second, third, and fourth camera calibration data for the respective first, second, third, and fourth cameras.
  • 24. A method of synchronizing cameras with an electronic control unit of a vehicular vision system, said method comprising: providing a plurality of cameras at a vehicle equipped with the vehicular vision system, wherein the plurality of cameras comprises at least a first camera, a second camera, a third camera and a fourth camera;wherein the first camera comprises a first megapixel imaging array having at least one million photosensing elements arranged in columns and rows of photosensing elements, and wherein the second camera comprises a second megapixel imaging array having at least one million photosensing elements arranged in columns and rows of photosensing elements, and wherein the third camera comprises a third megapixel imaging array having at least one million photosensing elements arranged in columns and rows of photosensing elements, and wherein the fourth camera comprises a fourth megapixel imaging array having at least one million photosensing elements arranged in columns and rows of photosensing elements;wherein providing the plurality of cameras comprises providing the first camera at a front portion of the equipped vehicle, providing the second camera at a left-side side portion of the equipped vehicle, providing the third camera at a right-side side portion of the equipped vehicle, and providing the fourth camera at a rear portion of the equipped vehicle;wherein each of the first camera, the second camera, the third camera and the fourth camera, when provided at the equipped vehicle, has a respective field of view exterior of the equipped vehicle;providing an electronic control unit (ECU) at the equipped vehicle;providing a first link between the ECU and the first camera;providing a second link between the ECU and the second camera;providing a third link between the ECU and the third camera;providing a fourth link between the ECU and the fourth camera;upon powering the first, second, third and fourth cameras, initializing the first, second, third and fourth links to establish communication with the respective first, second, third and fourth cameras;providing a first camera control signal to the first camera from the ECU via the first link from the ECU to the first camera, wherein the first camera control signal regulates timing of the first camera to be synchronous with reference timing of the ECU;providing a second camera control signal to the second camera from the ECU via the second link from the ECU to the second camera, wherein the second camera control signal regulates timing of the second camera to be synchronous with reference timing of the ECU;providing a third camera control signal to the third camera from the ECU via the third link from the ECU to the third camera, wherein the third camera control signal regulates timing of the third camera to be synchronous with reference timing of the ECU;providing a fourth camera control signal to the fourth camera from the ECU via the fourth link from the ECU to the fourth camera, wherein the fourth camera control signal regulates timing of the fourth camera to be synchronous with reference timing of the ECU;regulating timing of the first camera via starting the first camera synchronous to the ECU reference timing and holding the first camera synchronous to the ECU reference timing;regulating timing of the second camera via starting the second camera synchronous to the ECU reference timing and holding the second camera synchronous to the ECU reference timing;regulating timing of the third camera via starting the third 
camera synchronous to the ECU reference timing and holding the third camera synchronous to the ECU reference timing;regulating timing of the fourth camera via starting the fourth camera synchronous to the ECU reference timing and holding the fourth camera synchronous to the ECU reference timing;capturing frames of image data with each of the first camera, the second camera, the third camera and the fourth camera;providing image data captured by the first camera to the ECU via the first link from the first camera to the ECU;providing image data captured by the second camera to the ECU via the second link from the second camera to the ECU;providing image data captured by the third camera to the ECU via the third link from the third camera to the ECU; andproviding image data captured by the fourth camera to the ECU via the fourth link from the fourth camera to the ECU.
  • 25. The method of claim 24, wherein the established communication at least comprises first, second, third, and fourth camera calibration data for the respective first, second, third, and fourth cameras.
  • 26. The method of claim 24, comprising (i) providing at least one other camera control signal from the ECU to the first camera via the first link, (ii) providing at least one other camera control signal from the ECU to the second camera via the second link, (iii) providing at least one other camera control signal from the ECU to the third camera via the third link and (iv) providing at least one other camera control signal from the ECU to the fourth camera via the fourth link.
  • 27. The method of claim 24, comprising processing, at the ECU, image data captured by at least one of the first camera, the second camera, the third camera and the fourth camera to detect an object present exterior of the equipped vehicle.
  • 28. The method of claim 27, wherein the object present exterior of the equipped vehicle is exterior a side of the equipped vehicle.
  • 29. The method of claim 28, wherein the object present exterior the side of the equipped vehicle comprises a vehicle that is approaching the equipped vehicle.
  • 30. The method of claim 29, wherein the vehicle that is approaching the equipped vehicle is traveling in a traffic lane adjacent to a traffic lane in which the equipped vehicle is traveling.
  • 31. The method of claim 27, wherein the object present exterior of the equipped vehicle is to the rear of the equipped vehicle and is detected during a reversing maneuver of the equipped vehicle.
  • 32. The method of claim 24, wherein the first camera, the second camera, the third camera and the fourth camera are part of a surround vision system of the equipped vehicle.
  • 33. The method of claim 32, comprising (i) processing, at the ECU, image data captured by at least one of the first camera, the second camera, the third camera and the fourth camera, and (ii) outputting, via the ECU, and responsive at least in part to the processing, images for display at a display device of the equipped vehicle for viewing by an occupant of the equipped vehicle.
  • 34. The method of claim 33, wherein the displayed images comprise bird's-eye view images generated by said surround vision system.
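
The synchronization method recited in claims 24-34 above may be illustrated, in simplified form, by the following sketch. The sketch is illustrative only and assumes hypothetical interfaces (link_init, send_control_signal, start_camera_synchronous, measure_offset, nudge_camera_timing, capture_and_send_frame); it is not the firmware of the described ECU and does not reflect any particular serializer/deserializer protocol.

/*
 * Minimal sketch (hypothetical interfaces) of the claimed method:
 * initialize a link per camera, send a camera control signal carrying
 * the ECU reference timing, start each camera synchronous to that
 * timing, then hold it synchronous while frames of image data are
 * captured and returned to the ECU over the same link.
 */
#include <stdio.h>
#include <stdbool.h>

#define NUM_CAMERAS 4   /* front, left side, right side, rear */

typedef struct {
    int    id;
    long   frame_counter;
    double timing_offset_us;    /* camera timing relative to ECU reference */
} Camera;

typedef struct {
    double reference_period_us; /* ECU reference frame timing */
} Ecu;

/* Stubbed link/camera operations (assumptions, not the patent's API). */
static bool link_init(int cam_id)
{
    printf("link %d up\n", cam_id);
    return true;
}
static void send_control_signal(int cam_id, double ref_period_us)
{
    printf("cam %d: control signal, reference period %.1f us\n", cam_id, ref_period_us);
}
static void start_camera_synchronous(Camera *c, const Ecu *ecu)
{
    (void)ecu;
    c->timing_offset_us = 0.0;  /* camera starts aligned to the ECU reference */
}
static double measure_offset(const Camera *c)       { return c->timing_offset_us; }
static void nudge_camera_timing(Camera *c, double correction_us)
                                                     { c->timing_offset_us -= correction_us; }
static void capture_and_send_frame(Camera *c)        { c->frame_counter++; }

int main(void)
{
    Ecu ecu = { .reference_period_us = 33333.3 };    /* ~30 fps, illustrative only */
    Camera cams[NUM_CAMERAS] = { {0}, {1}, {2}, {3} };

    /* Upon powering the cameras, initialize each link to establish communication. */
    for (int i = 0; i < NUM_CAMERAS; i++) {
        if (!link_init(cams[i].id))
            return 1;
        /* The camera control signal regulates the camera timing to the ECU reference. */
        send_control_signal(cams[i].id, ecu.reference_period_us);
        /* Start the camera synchronous to the ECU reference timing. */
        start_camera_synchronous(&cams[i], &ecu);
    }

    /* Hold each camera synchronous while frames are captured and provided to the ECU. */
    for (int frame = 0; frame < 10; frame++) {
        for (int i = 0; i < NUM_CAMERAS; i++) {
            double offset = measure_offset(&cams[i]);
            if (offset != 0.0)
                nudge_camera_timing(&cams[i], offset);  /* small correction keeps sync */
            capture_and_send_frame(&cams[i]);           /* image data back over the same link */
        }
    }
    printf("captured %ld frames per camera, held to the ECU reference timing\n",
           cams[0].frame_counter);
    return 0;
}

The per-frame correction loop in the sketch stands in for any suitable regulation mechanism; the claims recite only that each camera is started synchronous to the ECU reference timing and then held synchronous to it, with captured image data provided to the ECU via the respective link.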
CROSS REFERENCE TO RELATED APPLICATIONS

The present application is a continuation of U.S. patent application Ser. No. 15/911,442, filed Mar. 5, 2018, now U.S. Pat. No. 10,171,709, which is a continuation of U.S. patent application Ser. No. 15/338,781, filed Oct. 31, 2016, now U.S. Pat. No. 9,912,841, which is a continuation of U.S. patent application Ser. No. 14/097,581, filed Dec. 5, 2013, now U.S. Pat. No. 9,481,301, which claims the filing benefits of U.S. provisional application Ser. No. 61/733,598, filed Dec. 5, 2012, which is hereby incorporated herein by reference in its entirety.

US Referenced Citations (301)
Number Name Date Kind
4987357 Masaki Jan 1991 A
4991054 Walters Feb 1991 A
5001558 Burley et al. Mar 1991 A
5003288 Wilhelm Mar 1991 A
5012082 Watanabe Apr 1991 A
5016977 Baude et al. May 1991 A
5027001 Torbert Jun 1991 A
5027200 Petrossian et al. Jun 1991 A
5044706 Chen Sep 1991 A
5055668 French Oct 1991 A
5059877 Teder Oct 1991 A
5064274 Alten Nov 1991 A
5072154 Chen Dec 1991 A
5086253 Lawler Feb 1992 A
5096287 Kakinami et al. Mar 1992 A
5097362 Lynas Mar 1992 A
5121200 Choi Jun 1992 A
5124549 Michaels et al. Jun 1992 A
5130709 Toyama et al. Jul 1992 A
5168378 Black Dec 1992 A
5170374 Shimohigashi et al. Dec 1992 A
5172235 Wilm et al. Dec 1992 A
5177685 Davis et al. Jan 1993 A
5182502 Slotkowski et al. Jan 1993 A
5184956 Langlais et al. Feb 1993 A
5189561 Hong Feb 1993 A
5193000 Lipton et al. Mar 1993 A
5204778 Bechtel Apr 1993 A
5208701 Maeda May 1993 A
5245422 Borcherts et al. Sep 1993 A
5276389 Levers Jan 1994 A
5285060 Larson et al. Feb 1994 A
5289182 Brillard et al. Feb 1994 A
5289321 Secor Feb 1994 A
5305012 Faris Apr 1994 A
5307136 Saneyoshi Apr 1994 A
5309137 Kajiwara May 1994 A
5313072 Vachss May 1994 A
5325096 Pakett Jun 1994 A
5325386 Jewell et al. Jun 1994 A
5329206 Slotkowski et al. Jul 1994 A
5331312 Kudoh Jul 1994 A
5336980 Levers Aug 1994 A
5341437 Nakayama Aug 1994 A
5351044 Mathur et al. Sep 1994 A
5355118 Fukuhara Oct 1994 A
5374852 Parkes Dec 1994 A
5386285 Asayama Jan 1995 A
5394333 Kao Feb 1995 A
5406395 Wilson et al. Apr 1995 A
5410346 Saneyoshi et al. Apr 1995 A
5414257 Stanton May 1995 A
5414461 Kishi et al. May 1995 A
5416313 Larson et al. May 1995 A
5416318 Hegyi May 1995 A
5416478 Morinaga May 1995 A
5424952 Asayama Jun 1995 A
5426294 Kobayashi et al. Jun 1995 A
5430431 Nelson Jul 1995 A
5434407 Bauer et al. Jul 1995 A
5440428 Hegg et al. Aug 1995 A
5444478 Lelong et al. Aug 1995 A
5451822 Bechtel et al. Sep 1995 A
5457493 Leddy et al. Oct 1995 A
5461357 Yoshioka et al. Oct 1995 A
5461361 Moore Oct 1995 A
5469298 Suman et al. Nov 1995 A
5471515 Fossum et al. Nov 1995 A
5475494 Nishida et al. Dec 1995 A
5498866 Bendicks et al. Mar 1996 A
5500766 Stonecypher Mar 1996 A
5510983 Iino Apr 1996 A
5515448 Nishitani May 1996 A
5521633 Nakajima et al. May 1996 A
5528698 Kamei et al. Jun 1996 A
5529138 Shaw et al. Jun 1996 A
5530240 Larson et al. Jun 1996 A
5530420 Tsuchiya et al. Jun 1996 A
5535314 Alves et al. Jul 1996 A
5537003 Bechtel et al. Jul 1996 A
5539397 Asanuma et al. Jul 1996 A
5541590 Nishio Jul 1996 A
5550677 Schofield et al. Aug 1996 A
5555555 Sato et al. Sep 1996 A
5568027 Teder Oct 1996 A
5574443 Hsieh Nov 1996 A
5581464 Woll et al. Dec 1996 A
5594222 Caldwell Jan 1997 A
5614788 Mullins Mar 1997 A
5619370 Guinosso Apr 1997 A
5632092 Blank et al. May 1997 A
5634709 Iwama Jun 1997 A
5642299 Hardin et al. Jun 1997 A
5648835 Uzawa Jul 1997 A
5650944 Kise Jul 1997 A
5660454 Mori et al. Aug 1997 A
5661303 Teder Aug 1997 A
5666028 Bechtel et al. Sep 1997 A
5670935 Schofield et al. Sep 1997 A
5677851 Kingdon et al. Oct 1997 A
5699044 Van Lente et al. Dec 1997 A
5724316 Brunts Mar 1998 A
5732379 Eckert et al. Mar 1998 A
5737226 Olson et al. Apr 1998 A
5760828 Cortes Jun 1998 A
5760931 Saburi et al. Jun 1998 A
5761094 Olson et al. Jun 1998 A
5765116 Wilson-Jones et al. Jun 1998 A
5765118 Fukatani Jun 1998 A
5781437 Wiemer et al. Jul 1998 A
5786772 Schofield et al. Jul 1998 A
5790403 Nakayama Aug 1998 A
5790973 Blaker et al. Aug 1998 A
5793308 Rosinski et al. Aug 1998 A
5793420 Schmidt Aug 1998 A
5796094 Schofield et al. Aug 1998 A
5835255 Miles Nov 1998 A
5837994 Stam et al. Nov 1998 A
5844505 Van Ryzin Dec 1998 A
5844682 Kiyomoto et al. Dec 1998 A
5845000 Breed et al. Dec 1998 A
5848802 Breed et al. Dec 1998 A
5850176 Kinoshita et al. Dec 1998 A
5850254 Takano et al. Dec 1998 A
5867591 Onda Feb 1999 A
5877707 Kowalick Mar 1999 A
5877897 Schofield et al. Mar 1999 A
5878357 Sivashankar et al. Mar 1999 A
5878370 Olson Mar 1999 A
5883739 Ashihara et al. Mar 1999 A
5884212 Lion Mar 1999 A
5890021 Onoda Mar 1999 A
5896085 Mori et al. Apr 1999 A
5899956 Chan May 1999 A
5915800 Hiwatashi et al. Jun 1999 A
5923027 Stam et al. Jul 1999 A
5924212 Domanski Jul 1999 A
5959555 Furuta Sep 1999 A
5963247 Banitt Oct 1999 A
5986796 Miles Nov 1999 A
5990469 Bechtel et al. Nov 1999 A
5990649 Nagao et al. Nov 1999 A
6020704 Buschur Feb 2000 A
6049171 Stam et al. Apr 2000 A
6066933 Ponziana May 2000 A
6084519 Coulling et al. Jul 2000 A
6097024 Stam et al. Aug 2000 A
6100799 Fenk Aug 2000 A
6144022 Tenenbaum et al. Nov 2000 A
6175300 Kendrick Jan 2001 B1
6178034 Allemand et al. Jan 2001 B1
6201642 Bos et al. Mar 2001 B1
6202164 Gulick Mar 2001 B1
6223114 Boros et al. Apr 2001 B1
6227689 Miller May 2001 B1
6266082 Yonezawa et al. Jul 2001 B1
6266442 Laumeyer et al. Jul 2001 B1
6279058 Gulick Aug 2001 B1
6285393 Shimoura et al. Sep 2001 B1
6294989 Schofield et al. Sep 2001 B1
6297781 Turnbull et al. Oct 2001 B1
6310611 Caldwell Oct 2001 B1
6317057 Lee Nov 2001 B1
6320282 Caldwell Nov 2001 B1
6333759 Mazzilli Dec 2001 B1
6353392 Schofield et al. Mar 2002 B1
6370329 Teuchert Apr 2002 B1
6392315 Jones et al. May 2002 B1
6396397 Bos et al. May 2002 B1
6411204 Bloomfield et al. Jun 2002 B1
6424273 Gutta et al. Jul 2002 B1
6430303 Naoi et al. Aug 2002 B1
6442465 Breed et al. Aug 2002 B2
6477464 McCarthy et al. Nov 2002 B2
6497503 Dassanayake et al. Dec 2002 B1
6498620 Schofield et al. Dec 2002 B2
6534884 Marcus et al. Mar 2003 B2
6539306 Turnbull Mar 2003 B2
6547133 DeVries, Jr. et al. Apr 2003 B1
6553130 Lemelson et al. Apr 2003 B1
6574033 Chui et al. Jun 2003 B1
6589625 Kothari et al. Jul 2003 B1
6594583 Ogura et al. Jul 2003 B2
6611610 Stam et al. Aug 2003 B1
6636258 Strumolo Oct 2003 B2
6650455 Miles Nov 2003 B2
6672731 Schnell et al. Jan 2004 B2
6674562 Miles Jan 2004 B1
6680792 Miles Jan 2004 B2
6690268 Schofield et al. Feb 2004 B2
6700605 Toyoda et al. Mar 2004 B1
6704621 Stein et al. Mar 2004 B1
6710908 Miles et al. Mar 2004 B2
6711474 Treyz et al. Mar 2004 B1
6714331 Lewis et al. Mar 2004 B2
6717610 Bos et al. Apr 2004 B1
6735506 Breed et al. May 2004 B2
6741377 Miles May 2004 B2
6744353 Sjönell Jun 2004 B2
6762867 Lippert et al. Jul 2004 B2
6794119 Miles Sep 2004 B2
6795221 Urey Sep 2004 B1
6806452 Bos et al. Oct 2004 B2
6819231 Berberich et al. Nov 2004 B2
6823241 Shirato et al. Nov 2004 B2
6824281 Schofield et al. Nov 2004 B2
6850156 Bloomfield et al. Feb 2005 B2
6889161 Winner et al. May 2005 B2
6909753 Meehan et al. Jun 2005 B2
6946978 Schofield Sep 2005 B2
6975775 Rykowski et al. Dec 2005 B2
6989736 Berberich et al. Jan 2006 B2
7004606 Schofield Feb 2006 B2
7038577 Pawlicki et al. May 2006 B2
7062300 Kim Jun 2006 B1
7065432 Moisel et al. Jun 2006 B2
7079017 Lang et al. Jul 2006 B2
7085637 Breed et al. Aug 2006 B2
7092548 Laumeyer et al. Aug 2006 B2
7111968 Bauer et al. Sep 2006 B2
7116246 Winter et al. Oct 2006 B2
7123168 Schofield Oct 2006 B2
7136753 Samukawa et al. Nov 2006 B2
7145519 Takahashi et al. Dec 2006 B2
7149613 Stam et al. Dec 2006 B2
7161616 Okamoto et al. Jan 2007 B1
7195381 Lynam et al. Mar 2007 B2
7202776 Breed Apr 2007 B2
7227611 Hull et al. Jun 2007 B2
7365769 Mager Apr 2008 B1
7460951 Altan Dec 2008 B2
7490007 Taylor et al. Feb 2009 B2
7526103 Schofield et al. Apr 2009 B2
7592928 Chinomi et al. Sep 2009 B2
7639149 Katoh Dec 2009 B2
7681960 Wanke et al. Mar 2010 B2
7720580 Higgins-Luthman May 2010 B2
7724962 Zhu et al. May 2010 B2
7855755 Weller et al. Dec 2010 B2
7881496 Camilleri et al. Feb 2011 B2
7952490 Fechner et al. May 2011 B2
8013780 Lynam et al. Sep 2011 B2
8027029 Lu et al. Sep 2011 B2
8849495 Chundrlik, Jr. et al. Sep 2014 B2
9227568 Hubbell Jan 2016 B1
9387813 Moeller Jul 2016 B1
9481301 Schaffner Nov 2016 B2
9912841 Schaffner Mar 2018 B2
10171709 Schaffner Jan 2019 B2
20020015153 Downs Feb 2002 A1
20020113873 Williams Aug 2002 A1
20030081935 Kirmuss May 2003 A1
20030125854 Kawasaki Jul 2003 A1
20030137586 Lewellen Jul 2003 A1
20030222982 Hamdan et al. Dec 2003 A1
20040114381 Salmeen et al. Jun 2004 A1
20050225636 Maemura Oct 2005 A1
20050285938 Suzuki Dec 2005 A1
20060018511 Stam et al. Jan 2006 A1
20060018512 Stam et al. Jan 2006 A1
20060091813 Stam et al. May 2006 A1
20060103727 Tseng May 2006 A1
20060164221 Jensen Jul 2006 A1
20060250501 Wildmann et al. Nov 2006 A1
20060257140 Seger Nov 2006 A1
20060290479 Akatsuka et al. Dec 2006 A1
20070104476 Yasutomi et al. May 2007 A1
20070206945 DeLorme Sep 2007 A1
20080189036 Elgersma Aug 2008 A1
20090002491 Haler Jan 2009 A1
20090093938 Isaji et al. Apr 2009 A1
20090113509 Tseng et al. Apr 2009 A1
20090177347 Breuer et al. Jul 2009 A1
20090243824 Peterson et al. Oct 2009 A1
20090244361 Gebauer et al. Oct 2009 A1
20090245223 Godfrey Oct 2009 A1
20090265069 Desbrunes Oct 2009 A1
20090278933 Maeda Nov 2009 A1
20100020170 Higgins-Luthman et al. Jan 2010 A1
20100228437 Hanzawa et al. Sep 2010 A1
20100231771 Yaghmai Sep 2010 A1
20110069170 Emoto Mar 2011 A1
20110193961 Peterson Aug 2011 A1
20120044066 Mauderer et al. Feb 2012 A1
20120062743 Lynam et al. Mar 2012 A1
20120075465 Wengrovitz Mar 2012 A1
20120162436 Cordell Jun 2012 A1
20120188355 Omi Jul 2012 A1
20120218412 Dellantoni et al. Aug 2012 A1
20120262340 Hassan et al. Oct 2012 A1
20120320207 Toyofuku Dec 2012 A1
20130038681 Osipov Feb 2013 A1
20130124052 Hahne May 2013 A1
20130129150 Saito May 2013 A1
20130131918 Hahne May 2013 A1
20140067206 Pflug Mar 2014 A1
20140071234 Millett Mar 2014 A1
20140156157 Johnson et al. Jun 2014 A1
20140222280 Salomonsson Aug 2014 A1
20140313339 Diessner et al. Oct 2014 A1
20140379233 Chundrlik, Jr. et al. Dec 2014 A1
Related Publications (1)
Number Date Country
20190158706 A1 May 2019 US
Provisional Applications (1)
Number Date Country
61733598 Dec 2012 US
Continuations (3)
Number Date Country
Parent 15911442 Mar 2018 US
Child 16234766 US
Parent 15338781 Oct 2016 US
Child 15911442 US
Parent 14097581 Dec 2013 US
Child 15338781 US