The present invention relates to imaging systems or vision systems for vehicles.
Use of imaging sensors in vehicle imaging systems is common and known. Examples of such known systems are described in U.S. Pat. Nos. 5,949,331; 5,670,935 and/or 5,550,677, which are hereby incorporated herein by reference in their entireties.
The present invention provides a vision system or imaging system for a vehicle that utilizes one or more cameras to capture images exterior of the vehicle and provides communication/data signals, including camera data or image data, that are processed. Responsive to such image processing, and responsive to other information pertaining to the driving condition or the surrounding environment or traffic at or near the vehicle, a vehicle safety or alert system is operable to provide an appropriate vehicle safety feature or alert that may vary depending on the environment or traffic surrounding the vehicle.
According to an aspect of the present invention, a driver assist system for a vehicle includes an object detection sensor disposed at the subject vehicle and having an exterior field of view, and a receiver disposed at the subject vehicle and operable to receive a wireless communication from a communication device remote from the subject vehicle. The wireless communication is associated with at least one of (i) a driving condition of another vehicle and (ii) a road condition of interest to the driver of the subject vehicle (and the condition may be near or at the subject vehicle or remote from the subject vehicle, such as forward of the subject vehicle along the road being traveled by the subject vehicle and potentially outside of the field of view of the sensor of the driver assist system). The driver assist system includes a control operable to process data captured by the object detection sensor to detect an object approaching the subject vehicle. The driver assist system is operable to adjust the processing of the data responsive at least in part to the wireless communications received by the receiver. The driver assist system is operable to generate an alert to alert the driver of the subject vehicle of a potential hazard responsive to the processing of the data. Optionally, a system of the vehicle (such as a braking system or a steering system or a collision avoidance system or the like) may be operable to intervene with or control a vehicle function to mitigate or avoid a potential hazard responsive to the processing of the data.
Optionally, the control may adjust the processing of the data to a distant mode of processing responsive to the wireless communication being indicative of an object approaching the subject vehicle from a distance near the effective range of the object detection sensor, wherein the distant mode of processing focuses the processing of the data on an area at which it is determined that the object approaching the subject vehicle is located. Optionally, the control may adjust the processing of the data to a nearby mode of processing responsive to the object detection sensor detecting an object of interest near to and approaching the subject vehicle, wherein the nearby mode of processing focuses the processing of the data on an area at which the detected object of interest is located.
Optionally, the driver assist system may be operable to alert the driver of the subject vehicle to not open a vehicle door when the subject vehicle is parked and when the driver assist system detects a vehicle approaching the subject vehicle in a side lane adjacent to the subject vehicle. The driver assist system may alert the driver of the subject vehicle to not open the vehicle door responsive to the detected approaching vehicle being within a threshold distance of the subject vehicle, and the control may adjust or alter the threshold distance parameter responsive to at least one of (i) a distance from the subject vehicle to the detected approaching vehicle, (ii) a speed of the detected approaching vehicle and (iii) an environment in which the subject vehicle is parked.
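By way of a non-limiting illustration only, the following Python sketch shows one way such a speed- and environment-dependent door-open alert threshold might be structured. The function names, numeric values and environment categories are hypothetical assumptions chosen for illustration and are not taken from the specification.

```python
# Hypothetical sketch of the door-open alert with an adjustable threshold.
# All numeric values and environment categories are illustrative assumptions.

def door_open_alert_threshold(approach_speed_mps: float, environment: str) -> float:
    """Return the alert threshold distance (meters) for an approaching vehicle."""
    base_threshold = 15.0                    # meters; illustrative default
    speed_margin = approach_speed_mps * 2.0  # ~2 s of travel: faster -> earlier alert
    environment_factor = {
        "highway_shoulder": 1.5,  # high-speed traffic: be more cautious
        "city_street": 1.0,
        "parking_lot": 0.5,       # slow traffic: fewer nuisance alerts
    }.get(environment, 1.0)
    return (base_threshold + speed_margin) * environment_factor

def should_alert_door_open(distance_m: float, approach_speed_mps: float,
                           environment: str) -> bool:
    """Alert the occupant not to open the door if the vehicle is within threshold."""
    return distance_m <= door_open_alert_threshold(approach_speed_mps, environment)

# Vehicle approaching at 20 m/s, 40 m away, while the car is parked on a highway shoulder.
print(should_alert_door_open(40.0, 20.0, "highway_shoulder"))  # True
```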
These and other objects, advantages, purposes and features of the present invention will become apparent upon review of the following specification in conjunction with the drawings.
A driver assist system and/or vision system and/or alert system may operate to supervise a passenger car's environment, such as the environment surrounding a just-stopped car with occupants still inside, to determine potential hazards due to surrounding traffic. The system detects traffic or other objects at or near the stopped vehicle, evaluates their shape, and tracks their paths of travel to determine their travel paths or trajectories, in order to decide whether a detected vehicle or object may potentially collide with an occupant of the subject vehicle if the occupant attempts to open a vehicle door. Upon making a determination that a potential collision may occur, the system may take safety measures, such as generating an alert, such as an audible warning sound or alert, a door handle vibration or locking/braking the door or the like (such as by utilizing aspects of the system described in U.S. Pat. No. 7,586,402, which is hereby incorporated herein by reference in its entirety). The supervision may be accomplished by various vehicle-based sensors, such as ultrasonic sensors, RADAR, LIDAR, time of flight (TOF) sensors and/or cameras and/or the like disposed at the subject vehicle and having exterior fields of view. The system may include seat sensors or the like for determining seat occupancy, such as by utilizing aspects of the system described in U.S. Pat. Publication No. US 2009/0033477, which is hereby incorporated herein by reference in its entirety. It is known for such systems to also detect non-moving obstacles, such as a sign or lantern pole or the like, and to provide an alert or warning to the occupant or to intervene to avoid colliding with the detected object.
With such alert systems, warnings are often provided that are not necessary or desired for the particular driving condition or environment in which the subject vehicle is being driven or parked. Thus, an alert system should provide warnings only when appropriate, so as to avoid bothering the driver and/or occupant(s) of the vehicle while still providing a safety feature that is beneficial to the driver and/or occupants of the vehicle.
The system of the present invention thus detects potential hazards or conditions or the like, analyzes detected objects (such as traffic or the like) and tracks the detected objects to provide proper hazard anticipation and to avoid misleading or inappropriate interventions and warnings.
The present invention provides an object or obstacle and traffic supervision and vision system working in an end-user-productive manner. The driver assist system of the present invention includes a control or processor that receives inputs from one or more sensors of a vehicle, such as multiple object detection sensors, including the likes of ultrasonic sensors, radar sensors, laser sensors, lidar sensors, imaging sensors, occupancy sensors (such as seat or cabin occupant detection sensors or the like), rain sensors, traction sensors (such as for an electronic traction control system (ETS) or an acceleration slip reduction (ASR) system or the like) and/or the like. The system may also receive information or inputs from (and be responsive to) other systems, such as infotainment systems, navigation systems, telematics systems and the like, and/or systems of other vehicles and/or other remote systems, such as may be received from a satellite communication, and such information may pertain to other vehicles on the road at or near the subject vehicle (such as wireless communications from and/or pertaining to other vehicles on the road, such as wireless communications as part of smart vehicle communication systems of the types proposed for use in Europe), such as car-to-car systems or car-to-x systems or car-to-device systems (where the device may comprise a communication device or system of another vehicle or a communication device or system disposed at or along a road or other area travelled by vehicles or a satellite communication device or system or the like). The communication may be based on any suitable communication protocol or system, such as WLAN, BLUETOOTH® or the like, or by infrared (IR) data transmission, or by a transmission sub-carried on vehicle RADAR (such as between two equipped vehicles), such as by utilizing aspects of the system described in U.S. Pat. No. 7,315,239, which is hereby incorporated herein by reference in its entirety.
Heretofore, it has not been known to realize car-to-car communication via a transmission sub-carried on a time-of-flight (TOF) flash light system. The TOF system comprises a light source or flash light (that is intermittently operable to flash at a selected pulse or rate) and a light sensor. The light source mainly serves to emit light flashes, which are reflected by the environment. As the distance to a reflecting object increases, the time for the emitted light to return to the light sensor increases, and by measuring these times a 3D reconstruction of the space is possible. To utilize the light flashes as a data transmission system, the flashes may be emitted in a time pattern (pulse width or wavelength modulation may be difficult), whereby conventional pulse distance or pulse phase modulation codes may come into use. When two vehicles equipped with such communication-capable TOF systems come into each other's field of reach or communication, the communication may initialize automatically. During the initialization, there may be a handshake mode in which transmission and measuring time patterns are set up to minimize data collisions and disturbing influences when both systems are flashing at the same time.
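As a non-limiting illustration of sub-carrying data on the TOF flash timing, the following Python sketch encodes bits via pulse-distance (inter-flash interval) modulation, per the time-pattern approach described above. The interval values, bit delay and tolerance are illustrative assumptions; a real system would additionally interleave the two vehicles' flash patterns per the handshake described above.

```python
# Illustrative pulse-distance modulation of data onto TOF flash timing.
# The nominal interval, bit delay and tolerance are arbitrary assumptions.

BASE_INTERVAL_US = 100.0   # nominal flash spacing used for ranging (assumed)
DELTA_US = 20.0            # extra delay encoding a '1' bit (assumed)

def encode_bits_as_flash_times(bits, start_us: float = 0.0):
    """Return flash emission timestamps; each inter-flash gap carries one bit."""
    times = [start_us]
    for bit in bits:
        times.append(times[-1] + BASE_INTERVAL_US + (DELTA_US if bit else 0.0))
    return times

def decode_flash_times(times, tolerance_us: float = 5.0):
    """Recover the bit stream from the observed flash timestamps."""
    bits = []
    for earlier, later in zip(times, times[1:]):
        gap = later - earlier
        bits.append(1 if abs(gap - (BASE_INTERVAL_US + DELTA_US)) < tolerance_us else 0)
    return bits

payload = [1, 0, 1, 1, 0]
assert decode_flash_times(encode_bits_as_flash_times(payload)) == payload  # round trip
```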
An example of the described workflow is shown in
Optionally, a data communication may alternatively be sub-carried via a structured light flash system instead of a TOF flash system.
The data exchange may comprise several nodes, which may be either vehicles or infrastructure.
The system may utilize GPS information and/or data and/or satellite images and/or road maps or road data or the like, and/or may utilize cell phone localization methods, such as for detecting high traffic volume, average car speed and the like to determine when there are traffic jams or excessive or slow moving traffic at or near the subject vehicle.
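A minimal sketch, assuming aggregated (vehicle id, speed) samples are available for a road segment, of how cell phone localization data might be turned into a traffic jam decision; both thresholds are illustrative assumptions.

```python
# Hypothetical jam detection from aggregated localization samples (e.g., anonymized
# cell phone positions) on one road segment; both thresholds are assumptions.

def detect_traffic_jam(samples, min_vehicles: int = 20, jam_speed_mps: float = 4.0) -> bool:
    """samples: list of (vehicle_id, speed_mps) observed on the segment."""
    if len(samples) < min_vehicles:
        return False  # too few samples for a confident decision
    avg_speed = sum(speed for _, speed in samples) / len(samples)
    return avg_speed < jam_speed_mps  # high volume and low average speed -> jam

samples = [("v%d" % i, 2.5) for i in range(30)]  # 30 vehicles crawling at 2.5 m/s
print(detect_traffic_jam(samples))  # True: likely a traffic jam on this segment
```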
The system may utilize other information, such as information wirelessly communicated from other vehicles and/or from communication systems associated with the road system or infrastructure (such as information communicated in response to sensors at intersections or along roadways or the like) and/or map data and the like, to provide instructions to the driver of the subject vehicle to assist the driver in bypassing detected traffic jams. For example, the system may receive information or data or inputs from host-based systems or accessories (such as a WLAN or Ethernet via a mobile phone or telematics system or the like) or from inter-car communications (such as, for example, Daimler's Dedicated Short Range Communication or other smart vehicle communication systems and the like) and may use such information for setting up network grids (such as via Zigbee or the like) with vehicles ahead of and/or behind the subject vehicle or vehicles in opposing traffic or the like. For example, economic drive systems/algorithms may anticipate upcoming traffic situations to provide early warning to the driver (possibly before the traffic condition is detectable by the vehicle-based or on-board sensors of the subject vehicle) so that the driver (or the system) can reduce the speed of the subject vehicle before it approaches the traffic situation. The system may also or otherwise receive information wirelessly from the likes of remote stationary supervision systems, such as monitoring or supervision systems at or in tunnels, traffic lights at intersections, and/or parking lots and/or the like.
For example, and with reference to
Any and/or all of the above inputs or systems may be fully or partially combined for the purpose of composing a current status of the vehicle's immediate (direct) and further environment, so that the system knows the “context” or environment in which the vehicle is travelling. This context can be used to switch digitally, or alter in an analog manner, the detection mode, view filters, display or warning conditions or parameters or the like of a vision system and of interacting or dependent car systems (such as a power mode or other safety device condition, such as the door locks (first, second detent and opening mode or first and second stroke opening and the like), window lifter, sun roof, folding top, brakes, parking brake, lighting, alarms, and/or the like).
Due to the high data rates and short image sampling rates of a vehicle vision or imaging system, the processing capacity of the vision system is limited. Higher processing capacity requires more expensive hardware, so it is desirable to instead utilize the processing capacity in an efficient and effective manner. For example, it is desirable to only process what is needed at any given moment, and thus such processing should be context dependent so that information that is needed for the specific environment or conditions surrounding the vehicle at that time is processed, but other information or data is not processed or processed at a lesser level. Typically, such processing of image data by a vision control may be adjusted in terms of resolution, frames per second, detection and tracking rates, and/or the like.
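By way of illustration only, the following sketch shows a table-driven way such context-dependent processing parameters (resolution, frame rate, number of tracked objects) might be selected; the mode names and values are assumptions, not values from the specification.

```python
# Illustrative context-dependent processing profiles; all values are assumptions.
PROCESSING_PROFILES = {
    # mode: (horizontal resolution, frames per second, max tracked objects)
    "city": (640, 30, 12),     # many nearby objects: full frame rate
    "distant": (1280, 15, 4),  # few far objects: more pixels, fewer frames
    "parked": (320, 10, 6),    # low load: watch the adjacent lane only
}

def configure_processing(context: str) -> dict:
    """Pick the processing parameters for the current driving context."""
    resolution, fps, max_tracks = PROCESSING_PROFILES.get(
        context, PROCESSING_PROFILES["city"])  # default to the busiest profile
    # A real vision control would reprogram the imager/ISP here.
    return {"resolution": resolution, "fps": fps, "max_tracks": max_tracks}

print(configure_processing("distant"))  # {'resolution': 1280, 'fps': 15, 'max_tracks': 4}
```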
The above mode/filter switching/altering methodology or system or process may be used to control or adjust the image processing or alert generation or the like. If multiple tasks or criteria are used, the various inputs or criteria may be weighted or processed in a weighted manner according to their context-related priority. For example, if the vehicle is parked, information or data pertaining to traffic in the lane adjacent to the subject vehicle may be processed more than information or data pertaining to vehicles in front of or to the rear of the subject vehicle, in order to provide an alert to the driver or passenger of the vehicle if it is not safe to open a door of the vehicle to exit the vehicle.
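A minimal sketch of such context-weighted processing, assuming hypothetical region names and priority weights: when the vehicle is parked, most of the per-frame processing budget is allocated to the adjacent lane.

```python
# Hypothetical weighting of image regions by context-related priority;
# region names and weights are illustrative assumptions.
REGION_WEIGHTS_BY_CONTEXT = {
    "parked": {"adjacent_lane": 0.7, "front": 0.15, "rear": 0.15},
    "highway": {"front": 0.6, "adjacent_lane": 0.3, "rear": 0.1},
}

def allocate_processing_budget(context: str, total_ops: int) -> dict:
    """Split a per-frame processing budget across image regions by priority."""
    weights = REGION_WEIGHTS_BY_CONTEXT[context]
    return {region: int(total_ops * w) for region, w in weights.items()}

# When parked, most of the budget goes to the lane adjacent to the doors.
print(allocate_processing_budget("parked", 1_000_000))
```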
The driver assist system or alert system of the present invention thus may, for example, provide an indication or warning or alert or the like to occupants in the vehicle, such that the occupant or occupants are notified and/or shown a visualization and/or warned to not open a door into traffic, such as on a highway with cars approaching at high speed in a lane adjacent to the subject vehicle. However, the system can determine when detected objects are not a potential hazard, and will not bother or alert the occupants when the occupant or occupants are exiting the vehicle at times when the traffic is moving very slowly, such as in a traffic jam or traffic standstill or the like, even at the same or a similar location on a highway or other roadway. It is also envisioned that the system may be responsive to other context dependent situations and that the driver's attention can be directed to different aspects of different situations. For example, when driving or parked along a road or highway, the system may operate in an “intermediate mode” or “distant mode” (
For example, and with reference to
Optionally, the system may provide a view of one object, or if multiple objects (such as a front tire and a rear tire at a curb) are of interest to the driver of the vehicle, the system may provide a dual display (such as half of a display screen used to show one tire at the curb and the other half of the display screen used to show another tire at the curb). The system may thus split the view to show or process two or more significant objects, or the field of view of the camera may be de-zoomed to capture two or more significant objects together in one view or captured image. The camera or cameras may be part of a vehicle vision system and may comprise a plurality of cameras, and the vision system (utilizing a rearward facing camera and sidewardly facing cameras and a forwardly facing camera disposed at the vehicle) may provide a display of a top-down view or birds-eye view of the vehicle or a surround view at the vehicle, such as by utilizing aspects of the vision systems described in PCT Application No. PCT/US10/25545, filed Feb. 26, 2010 and published Sep. 2, 2010 as International Publication No. WO 2010/099416, and/or PCT Application No. PCT/US10/47256, filed Aug. 31, 2010 and published Mar. 10, 2011 as International Publication No. WO 2011/028686, and/or U.S. patent application Ser. No. 13/333,337, filed Dec. 21, 2011 and published Jun. 28, 2012 as U.S. Publication No. US-2012-0162427, and/or U.S. provisional application Ser. No. 61/678,375, filed Aug. 1, 2012, and/or PCT Application No. PCT/CA2012/000378, filed Apr. 25, 2012 and published Nov. 1, 2012 as International Publication No. WO 2012/145822, which are hereby incorporated herein by reference in their entireties.
When the vehicle is driven outside of the city limits (which the system may determine responsive to image processing or GPS data or the like), the system may switch to the distant mode to detect more distant objects approaching the subject vehicle, so as to early detect, track, zoom in on and anticipate the speed and trajectory and eventually the intentions of a detected approaching object or vehicle, so that the system may alert or warn the driver of a potential hazard or intervene to avoid a potential hazard (such as via automatically reducing the speed of the subject vehicle or the like). As discussed above, early detection of an upcoming object may be supported by knowledge of its presence or approach as received from a remote communication source (that may provide information pertaining to or associated with vehicles on the road or the like).
Optionally, the system may be operable to classify and ‘label’ or identify one or multiple object(s) and to set the speed and trajectory parameters and mathematical properties to rank their hazardous potential or influence, such as suggested in U.S. provisional application Ser. No. 61/696,416, filed Sep. 4, 2012, which is hereby incorporated herein by reference in its entirety, even when the detected object is far from the subject vehicle and still a “spot” on the horizon, and when detection systems such as radar, laser and cameras are still unable to determine such parameters of the distant object. This hazardous influence ranking may be done by taking the speed, the distance, the size, the mass and the deformability and vulnerability of the subject vehicles or objects into account. A look-up table of each object property's influence value may be used. In order to avoid overwhelming the driver with information and data for too many objects, only objects above a certain level of influence, or a limited number of objects with the highest ranking, may be brought to the driver's attention. In the example of such a ranking scheme shown in Table 1 (with Tables 2-4 showing sub-tables of the metrics used), the gray-shaded values are those of the three objects with the highest ranking values, which would be the data of choice. When the vehicles' desired destinations are known due to data transmission, the intended paths can be predetermined, and imminent collision hazards become foreseeable by projecting the vehicles' path trajectories into the future (see Table 5). As meta information, the local traffic rules may be regarded by the rating algorithms, as well as when choosing the ranking of the information that will be presented to the driver.
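The following sketch illustrates the look-up-table-based influence ranking described above, with a limited number (here three) of highest-ranked objects selected for the driver's attention. The bin edges and scores are illustrative assumptions standing in for Tables 1-4, which are not reproduced here.

```python
# Illustrative hazard-influence ranking using per-property look-up values.
# The bin edges, scores and object data are assumptions for demonstration.

def lookup(value, bins, scores):
    """Map a measured value to an influence score via a simple look-up table."""
    for edge, score in zip(bins, scores):
        if value <= edge:
            return score
    return scores[-1]

def influence(obj):
    return (
        lookup(obj["speed_mps"], bins=[10, 25, 40], scores=[1, 2, 3, 4])
        + lookup(obj["distance_m"], bins=[20, 60, 150], scores=[4, 3, 2, 1])  # closer = higher
        + lookup(obj["mass_kg"], bins=[100, 1500, 10000], scores=[1, 2, 3, 4])
    )

def top_hazards(objects, limit=3):
    """Rank detected objects; only the few with the highest influence are shown."""
    return sorted(objects, key=influence, reverse=True)[:limit]

objects = [
    {"id": "truck", "speed_mps": 30, "distance_m": 120, "mass_kg": 12000},
    {"id": "bicycle", "speed_mps": 6, "distance_m": 15, "mass_kg": 90},
    {"id": "car", "speed_mps": 35, "distance_m": 40, "mass_kg": 1400},
    {"id": "pedestrian", "speed_mps": 1.5, "distance_m": 200, "mass_kg": 75},
]
for obj in top_hazards(objects):
    print(obj["id"], influence(obj))  # truck 9, car 8, bicycle 6
```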
There may be driving contexts which do not need to be brought to the driver's attention, such as, for example, cross traffic information when the vehicle is stopped at a red light. However, a passing ban or no-passing symbol may appear or pop up at the display for viewing by the driver of the vehicle when the driver is about to pass a school bus that is just pulling out (such as shown in
Optionally, the system may also possess or include a data transfer channel to a data cloud in the World Wide Web, or to one or more (special) servers suitable to receive the vehicle's data from several sensors as a stream of data tied to the vehicle's current position while travelling along its path. In this way, a road may be sampled by a multitude of different sensors. The server may collect and merge the incoming data. By merging several data sets from different vehicles consecutively passing the same location or area or spot, the characteristic conditions of that location will emerge out of the signal noise. Fixed items will appear in every data set, while mobile or moving items will fade away as the number of passing vehicles rises. The oldest data sets will be overwritten by the newest ones to keep the data current. The result will be the average of what one vehicle's sensors should detect at a certain area or geographic location. Due to that, participating vehicles with the data transfer system will receive a data stream from the server matching the position where they are, representing the expected scene (such as illustrated in
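A minimal sketch of the server-side merge described above, under assumed data structures: features re-confirmed by consecutive passing vehicles persist, one-off (mobile) detections fade, and the oldest data sets are overwritten by the newest.

```python
# Illustrative server-side merge: fixed items persist across passing vehicles,
# mobile items fade. The scoring model and all parameters are assumptions.
from collections import defaultdict

class LocationModel:
    def __init__(self, decay=0.7, keep_threshold=0.3, max_sets=50):
        self.scores = defaultdict(float)   # feature -> persistence score in [0, 1]
        self.decay = decay                 # how quickly unconfirmed items fade
        self.keep_threshold = keep_threshold
        self.data_sets = []                # newest data sets; oldest are dropped
        self.max_sets = max_sets

    def merge(self, observed):
        """Fold one passing vehicle's data set for this location into the model."""
        for feature in list(self.scores):
            if feature not in observed:
                self.scores[feature] *= self.decay      # mobile items fade away
        for feature in observed:
            self.scores[feature] += (1.0 - self.scores[feature]) * 0.5  # re-confirmed
        self.data_sets.append(observed)
        if len(self.data_sets) > self.max_sets:
            self.data_sets.pop(0)           # oldest data set overwritten by newest

    def expected_scene(self):
        """What an average vehicle's sensors should detect at this location."""
        return {f for f, s in self.scores.items() if s >= self.keep_threshold}

model = LocationModel()
model.merge({"guardrail", "sign", "parked_van"})  # the van happens to be there once
model.merge({"guardrail", "sign"})
model.merge({"guardrail", "sign"})
print(model.expected_scene())  # fixed items remain; the van has faded below threshold
```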
Due to the proposed system of the present invention, very small local disturbances may be perceived. For example, when there is a location or spot where debris or an object (such as, for example, a piece of cargo that fell off a truck or the like) is located, classical map systems cannot give any advice. Highly sophisticated collision avoidance and invasive braking assistant systems may be able to avoid a crash, but only if the obstacle is detected by the system of the subject vehicle and is not hidden by the traffic driving ahead. Those traffic participants may change lanes to avoid a collision themselves very late, such that the subject vehicle's own systems would have to engage full braking. The system of the present invention would be ready to aid as soon as just a few other (equipped) traffic participants have passed the obstacle with their sensors (via detection of the object by the other vehicles' sensors and communication of the detection to the subject vehicle). For example, when vehicles ahead of the subject vehicle detect the debris or object, the communication and/or database is updated and the following subject vehicle's system receives information pertaining to the debris or object ahead, and may generate an alert or control one or more systems (such as a braking or steering or collision avoidance system or the like) responsive at least in part to receipt of a communication of such information. Thus, the hazard warning and avoidance path finding may be started even when the obstacle is not in the line of sight of the subject vehicle's sensor or sensors.
To reduce the amount of data transferred and handled, the data may be pre-filtered and/or preselected before transfer. This may be done by determining the essential context information via the vehicle algorithm and transferring just that information. Hereby the vehicle algorithm may decide to send a data set for a certain location or spot based on the measure of the difference which was found there. Alternatively, the server may control the choice of the ‘needed’ information. The server may request a data set update for a certain location or spot whose data tend to be outdated, but does not request those spots or locations for which data were recently transmitted. An example is shown in
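By way of illustration, the following sketch shows both selection mechanisms described above, under assumed thresholds: a vehicle-side pre-filter that uploads a spot's data only when the observed difference is large enough, and a server-side policy that requests updates only for locations whose data have gone stale.

```python
# Illustrative pre-filtering and server-driven selection; thresholds are assumptions.
import time

def should_upload(difference_score: float, threshold: float = 0.5) -> bool:
    """Vehicle-side pre-filter: send a spot's data set only if the difference
    found there (vs. the expected scene) is large enough to matter."""
    return difference_score > threshold

MAX_AGE_S = 15 * 60  # data older than 15 minutes is considered outdated (assumed)

def spots_to_request(last_update_s_by_spot: dict, now_s: float = None) -> list:
    """Server-side policy: request updates only for spots whose data went stale."""
    now_s = time.time() if now_s is None else now_s
    return [spot for spot, t in last_update_s_by_spot.items()
            if now_s - t > MAX_AGE_S]

# Example: spot "A" was refreshed 20 minutes ago, spot "B" one minute ago.
now = 10_000.0
print(spots_to_request({"A": now - 1200.0, "B": now - 60.0}, now_s=now))  # ['A']
```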
There may be gaps in the data connection. As long as the driver enters his or her desired destination into the navigation system (or driving aid system), the data stream expected along the path may be downloaded in advance (such as about 30 seconds early, as an example). The vehicle's sensor data may be stored in a record medium until a connection to the server is re-established.
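A minimal sketch, with assumed structures, of the store-and-forward buffering and route look-ahead prefetch described above; the 30-second lead time follows the example in the text.

```python
# Illustrative store-and-forward buffering across connection gaps, plus a simple
# look-ahead prefetch along the planned route. Structures are assumptions.
from collections import deque

class DataLink:
    def __init__(self, prefetch_lead_s: float = 30.0):
        self.outbox = deque()                  # sensor data awaiting a connection
        self.prefetch_lead_s = prefetch_lead_s
        self.connected = False

    def record_sensor_data(self, data_set):
        """Store captured data; it is uploaded once the server is reachable."""
        self.outbox.append(data_set)
        self.flush()

    def flush(self):
        while self.connected and self.outbox:
            print("uploaded", self.outbox.popleft())

    def segments_to_prefetch(self, route, eta_s_by_segment, elapsed_s):
        """Download the expected data stream for segments reached within the lead time."""
        return [seg for seg in route
                if 0.0 <= eta_s_by_segment[seg] - elapsed_s <= self.prefetch_lead_s]

link = DataLink()
link.record_sensor_data({"spot": "A", "obstacle": "debris"})  # buffered: offline
link.connected = True                                         # connection restored
link.flush()                                                  # buffered data sent
print(link.segments_to_prefetch(["s1", "s2"], {"s1": 20.0, "s2": 300.0}, 0.0))  # ['s1']
```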
The server may also be able to extract statistical evidence out of the sensor data. For example, there may be a location or spot which tends to become slippery when it is wet. This could become an experience value, and the driving aid system may be warned regarding that location or spot already at the time rain or snow sets in there.
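As a non-limiting illustration of such an experience value, the following sketch flags a location as slippery-when-wet from assumed slip-event statistics and raises a warning once precipitation sets in there; the thresholds are illustrative assumptions.

```python
# Hypothetical "experience value": a spot whose slip events correlate with wet
# weather is flagged, and a warning issues once precipitation starts there.
# The minimum sample count and rate threshold are assumptions.

def slippery_when_wet(slip_events_wet: int, passes_wet: int,
                      min_rate: float = 0.2) -> bool:
    """Statistical evidence that a location becomes slippery in wet conditions."""
    return passes_wet >= 10 and (slip_events_wet / passes_wet) >= min_rate

def warn_driver(spot_is_flagged: bool, precipitation_now: bool) -> bool:
    """Warn as soon as rain or snow sets in at a flagged location."""
    return spot_is_flagged and precipitation_now

flagged = slippery_when_wet(slip_events_wet=7, passes_wet=25)  # 28% of wet passes slipped
print(warn_driver(flagged, precipitation_now=True))  # True: rain has set in there
```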
For example, a vehicle camera picture or captured image data may be processed and analyzed and further processed depending on the context determined or provided by on-board sensors and other vehicle or environment information. The system is operable to label the speed and trajectory parameters of an object or traffic participant even when the object or other vehicle is very far or distant from the subject vehicle. Additionally, the intended path of the detected vehicle may be provided or received via a signal or output of the navigation system of the tracked vehicle, and the intended path may be labelled and/or reflected to determine the context for switching or adjusting the vision system's view modes. As shown in
With reference to
Also, because the system is operable to switch the view filters or processing modes early (such as responsive to traffic/road information received from sources remote from the subject vehicle, where the information pertains to traffic or roadway problems or characteristics well ahead of the subject vehicle and generally at or out of the effective range of the vehicle-based sensors of the subject vehicle) to detect objects or road parameters or characteristics well ahead of the travelling subject vehicle, the system of the present invention may provide a reduction in false alerts or warnings as compared to systems relying on vehicle-based or on-board sensors and systems alone. The system thus may provide enhanced hazard anticipation, and may limit or substantially preclude misleading interventions and/or warnings.
Therefore, the present invention provides a system that is operable to detect objects or vehicles at or near the subject vehicle and is operable to switch the processing parameters or algorithms in response to such detections or in response to the driving conditions or environment in which the subject vehicle is being driven. The system may receive data or information via wireless communications from other vehicles or from other communication systems remote from the subject vehicle, so that the system may receive data or information pertaining to traffic conditions or road conditions or the like far ahead of the subject vehicle and not yet detected by the vehicle-based sensors of the subject vehicle. The system may switch the processing parameters or algorithms in response to such remote communications to further enhance detection of objects and/or conditions and to further enhance generation of an appropriate alert or warning or display or the like responsive to the detected object/condition.
The vehicle may include any type of sensor or sensors, such as imaging sensors or radar sensors or lidar sensors or ultrasonic sensors, time of flight sensors, structured light sensors or the like (such as by utilizing aspects of the systems described in U.S. Pat. No. 8,013,780, which is hereby incorporated herein by reference in its entirety). For example, the object detection sensor may comprise an imaging sensor or camera that may capture image data for image processing, and may comprise any suitable camera or sensing device, such as, for example, an array of a plurality of photosensor elements arranged in 640 columns and 480 rows (a 640×480 imaging array), with a respective lens focusing images onto respective portions of the array. The photosensor array may comprise a plurality of photosensor elements arranged in a photosensor array having rows and columns. The logic and control circuit of the imaging sensor may function in any known manner, such as in the manner described in U.S. Pat. Nos. 5,550,677; 5,877,897; 6,498,620; 5,670,935; 5,796,094 and/or 6,396,397, which are all hereby incorporated herein by reference in their entireties. The system may communicate with other communication systems via any suitable means, such as by utilizing aspects of the systems described in PCT Application No. PCT/US10/038477, filed Jun. 14, 2010 and published Dec. 16, 2010 as International Publication No. WO 2010/144900, and/or U.S. patent application Ser. No. 13/202,005, filed Aug. 17, 2011 and published Mar. 15, 2012 as U.S. Publication No. US-2012-0062743, which are hereby incorporated herein by reference in their entireties.
The image processing and algorithmic processing may comprise any suitable means for processing the images and/or image data. For example, the vision system and/or processing may utilize aspects described in U.S. Pat. Nos. 7,005,974; 5,760,962; 5,877,897; 5,949,331; 6,222,447; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978; 7,859,565; 5,550,677; 5,670,935; 6,636,258; 7,145,519; 7,161,616; 7,230,640; 7,248,283; 7,295,229; 7,301,466; 7,592,928; 7,881,496; 7,720,580; 7,038,577; 6,882,287; 5,929,786 and/or 5,786,772, and/or PCT Application No. PCT/US2010/047256, filed Aug. 31, 2010 and published Mar. 10, 2011 as International Publication No. WO 2011/028686, and/or International Publication No. WO 2010/099416, published Sep. 2, 2010, and/or U.S. provisional application Ser. No. 61/696,416, filed Sep. 4, 2012; Ser. No. 61/682,995, filed Aug. 14, 2012; Ser. No. 61/682,486, filed Aug. 13, 2012; Ser. No. 61/680,883, filed Aug. 8, 2012; Ser. No. 61/678,375, filed Aug. 1, 2012; Ser. No. 61/676,405, filed Jul. 27, 2012; Ser. No. 61/666,146, filed Jun. 29, 2012; Ser. No. 61/653,665, filed May 31, 2012; Ser. No. 61/653,664, filed May 31, 2012; Ser. No. 61/648,744, filed May 18, 2012; Ser. No. 61/624,507, filed Apr. 16, 2012; Ser. No. 61/616,126, filed Mar. 27, 2012; Ser. No. 61/615,410, filed Mar. 26, 2012; Ser. No. 61/613,651, filed Mar. 21, 2012; Ser. No. 61/607,229, filed Mar. 6, 2012; Ser. No. 61/605,409, filed Mar. 1, 2012; Ser. No. 61/602,878, filed Feb. 24, 2012; Ser. No. 61/602,876, filed Feb. 24, 2012; Ser. No. 61/600,205, filed Feb. 17, 2012; Ser. No. 61/588,833, filed Jan. 20, 2012; Ser. No. 61/583,381, filed Jan. 5, 2012; Ser. No. 61/579,682, filed Dec. 23, 2011; Ser. No. 61/570,017, filed Dec. 13, 2011; Ser. No. 61/568,791, filed Dec. 9, 2011; Ser. No. 61/567,446, filed Dec. 6, 2011; Ser. No. 61/567,150, filed Dec. 6, 2011; Ser. No. 61/565,713, filed Dec. 1, 2011; Ser. No. 61/563,965, filed Nov. 28, 2011; Ser. No. 61/559,970, filed Nov. 15, 2011; Ser. No. 61/556,556, filed Nov. 7, 2011; Ser. No. 61/554,663, filed Nov. 2, 2011; Ser. No. 61/550,664, filed Oct. 24, 2011; Ser. No. 61/552,167, filed Oct. 27, 2011; and/or Ser. No. 61/548,902, filed Oct. 19, 2011, and/or PCT Application No. PCT/CA2012/000378, filed Apr. 25, 2012 and published Nov. 1, 2012 as International Publication No. WO 2012/145822, and/or PCT Application No. PCT/US2012/056014, filed Sep. 19, 2012 and published Mar. 28, 2013 as International Publication No. WO 2013/043661, and/or PCT Application No. PCT/US12/57007, filed Sep. 25, 2012 and published Apr. 4, 2013 as International Publication No. WO 2013/048994, and/or PCT Application No. PCT/US2012/048800, filed Jul. 30, 2012 and published Feb. 7, 2013 as International Publication No. WO 2013/019707, and/or PCT Application No. PCT/US2012/048110, filed Jul. 25, 2012 and published Jan. 31, 2013 as International Publication No. WO 2013/016409, and/or U.S. patent application Ser. No. 13/534,657, filed Jun. 27, 2012 and published Jan. 3, 2013 as U.S. Publication No. US-2013-0002873, which are all hereby incorporated herein by reference in their entireties.
The imaging device and control and image processor and any associated illumination source, if applicable, may comprise any suitable components, and may utilize aspects of the cameras and vision systems described in U.S. Pat. Nos. 5,550,677; 5,877,897; 6,498,620; 5,670,935; 5,796,094; 6,396,397; 6,806,452; 6,690,268; 7,005,974; 7,123,168; 7,004,606; 6,946,978; 7,038,577; 6,353,392; 6,320,176; 6,313,454 and 6,824,281, and/or International Publication No. WO 2010/099416, published Sep. 2, 2010, and/or PCT Application No. PCT/US10/47256, filed Aug. 31, 2010 and published Mar. 10, 2011 as International Publication No. WO 2011/028686, and/or U.S. patent application Ser. No. 12/508,840, filed Jul. 24, 2009, and published Jan. 28, 2010 as U.S. Pat. Publication No. US 2010-0020170, which are all hereby incorporated herein by reference in their entireties. The camera or cameras may comprise any suitable cameras or imaging sensors or camera modules, and may utilize aspects of the cameras or sensors described in U.S. patent application Ser. No. 12/091,359, filed Apr. 24, 2008 and published Oct. 1, 2009 as U.S. Publication No. US-2009-0244361, and/or U.S. Pat. Nos. 7,965,336 and/or 7,480,149, which are hereby incorporated herein by reference in their entireties. The imaging array sensor may comprise any suitable sensor, and may utilize various imaging sensors or imaging array sensors or cameras or the like, such as a CMOS imaging array sensor, a CCD sensor or other sensors or the like, such as the types described in U.S. Pat. Nos. 5,550,677; 5,670,935; 5,760,962; 5,715,093; 5,877,897; 6,922,292; 6,757,109; 6,717,610; 6,590,719; 6,201,642; 6,498,620; 5,796,094; 6,097,023; 6,320,176; 6,559,435; 6,831,261; 6,806,452; 6,396,397; 6,822,563; 6,946,978; 7,339,149; 7,038,577; 7,004,606; 7,720,580 and/or 7,965,336, and/or PCT Application No. PCT/US2008/076022, filed Sep. 11, 2008 and published Mar. 19, 2009 as International Publication No. WO 2009/036176, and/or PCT Application No. PCT/US2008/078700, filed Oct. 3, 2008 and published Apr. 9, 2009 as International Publication No. WO 2009/046268, which are all hereby incorporated herein by reference in their entireties.
The camera module and circuit chip or board and imaging sensor may be implemented and operated in connection with various vehicular vision-based systems, and/or may be operable utilizing the principles of such other vehicular systems, such as a vehicle headlamp control system, such as the type disclosed in U.S. Pat. Nos. 5,796,094; 6,097,023; 6,320,176; 6,559,435; 6,831,261; 7,004,606; 7,339,149 and/or 7,526,103, which are all hereby incorporated herein by reference in their entireties, a rain sensor, such as the types disclosed in commonly assigned U.S. Pat. Nos. 6,353,392; 6,313,454; 6,320,176 and/or 7,480,149, which are hereby incorporated herein by reference in their entireties, a vehicle vision system, such as a forwardly, sidewardly or rearwardly directed vehicle vision system utilizing principles disclosed in U.S. Pat. Nos. 5,550,677; 5,670,935; 5,760,962; 5,877,897; 5,949,331; 6,222,447; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978 and/or 7,859,565, which are all hereby incorporated herein by reference in their entireties, a trailer hitching aid or tow check system, such as the type disclosed in U.S. Pat. No. 7,005,974, which is hereby incorporated herein by reference in its entirety, a reverse or sideward imaging system, such as for a lane change assistance system or lane departure warning system or for a blind spot or object detection system, such as imaging or detection systems of the types disclosed in U.S. Pat. Nos. 7,881,496; 7,720,580; 7,038,577; 5,929,786 and/or 5,786,772, and/or U.S. provisional application Ser. No. 60/628,709, filed Nov. 17, 2004; Ser. No. 60/614,644, filed Sep. 30, 2004; Ser. No. 60/618,686, filed Oct. 14, 2004; Ser. No. 60/638,687, filed Dec. 23, 2004, which are hereby incorporated herein by reference in their entireties, a video device for internal cabin surveillance and/or video telephone function, such as disclosed in U.S. Pat. Nos. 5,760,962; 5,877,897; 6,690,268 and/or 7,370,983, and/or U.S. patent application Ser. No. 10/538,724, filed Jun. 13, 2005 and published Mar. 9, 2006 as U.S. Publication No. US-2006-0050018, which are hereby incorporated herein by reference in their entireties, a traffic sign recognition system, a system for determining a distance to a leading or trailing vehicle or object, such as a system utilizing the principles disclosed in U.S. Pat. Nos. 6,396,397 and/or 7,123,168, which are hereby incorporated herein by reference in their entireties, and/or the like.
Optionally, the circuit board or chip may include circuitry for the imaging array sensor and/or other electronic accessories or features, such as by utilizing compass-on-a-chip or EC driver-on-a-chip technology and aspects such as described in U.S. Pat. Nos. 7,255,451 and/or 7,480,149; and/or U.S. patent application Ser. No. 11/226,628, filed Sep. 14, 2005 and published Mar. 23, 2006 as U.S. Publication No. US-2006-0061008, and/or Ser. No. 12/578,732, filed Oct. 14, 2009 and published Apr. 22, 2010 as U.S. Publication No. US-2010-0097469, which are hereby incorporated herein by reference in their entireties.
Optionally, the driver assist system and/or vision system may include a display for displaying images captured by one or more of the imaging sensors for viewing by the driver of the vehicle while the driver is normally operating the vehicle. Optionally, for example, the vision system may include a video display device disposed at or in the interior rearview mirror assembly of the vehicle, such as by utilizing aspects of the video mirror display systems described in U.S. Pat. No. 6,690,268 and/or U.S. patent application Ser. No. 13/333,337, filed Dec. 21, 2011 and published Jun. 28, 2012 as U.S. Publication No. US-2012-0162427, which are hereby incorporated herein by reference in their entireties. The video mirror display may comprise any suitable devices and systems and optionally may utilize aspects of the compass display systems described in U.S. Pat. Nos. 7,855,755; 7,626,749; 7,581,859; 7,446,924; 7,446,650; 7,370,983; 7,338,177; 7,329,013; 7,308,341; 7,289,037; 7,274,501; 7,255,451; 7,249,860; 7,195,381; 7,184,190; 7,004,593; 6,877,888; 6,824,281; 6,690,268; 6,672,744; 6,642,851; 6,513,252; 6,386,742; 6,329,925; 6,222,460; 6,173,508; 6,124,886; 6,087,953; 5,878,370; 5,802,727; 5,737,226; 5,724,187; 5,708,410; 5,699,044; 5,677,851; 5,668,663; 5,632,092; 5,576,687; 5,530,240; 4,953,305 and/or 4,546,551, and/or European patent application, published Oct. 11, 2000 under Publication No. EP 0 1043566, and/or U.S. patent application Ser. No. 11/226,628, filed Sep. 14, 2005 and published Mar. 23, 2006 as U.S. Publication No. US-2006-0061008; and/or Ser. No. 10/538,724, filed Jun. 13, 2005 and published Mar. 9, 2006 as U.S. Publication No. US-2006-0050018, which are all hereby incorporated herein by reference in their entireties. Optionally, the video mirror display screen or device may be operable to display images captured by a rearward viewing camera of the vehicle during a reversing maneuver of the vehicle (such as responsive to the vehicle gear actuator being placed in a reverse gear position or the like) to assist the driver in backing up the vehicle, and optionally may be operable to display the compass heading or directional heading character or icon when the vehicle is not undertaking a reversing maneuver, such as when the vehicle is being driven in a forward direction along a road (such as by utilizing aspects of the display system described in PCT Application No. PCT/US2011/056295, filed Oct. 14, 2011 and published Apr. 19, 2012 as International Publication No. WO 2012/051500, which is hereby incorporated herein by reference in its entirety).
Optionally, the driver assist system and/or vision system (utilizing the forward facing camera and a rearward facing camera and other cameras disposed at the vehicle with exterior fields of view) may be part of or may provide a display of a top-down view or birds-eye view system of the vehicle or a surround view at the vehicle, such as by utilizing aspects of the vision systems described in PCT Application No. PCT/US10/25545, filed Feb. 26, 2010 and published on Sep. 2, 2010 as International Publication No. WO 2010/099416, and/or PCT Application No. PCT/US10/47256, filed Aug. 31, 2010 and published Mar. 10, 2011 as International Publication No. WO 2011/028686, and/or PCT Application No. PCT/US2011/062834, filed Dec. 1, 2011 and published Jun. 7, 2012 as International Publication No. WO 2012/075250, and/or PCT Application No. PCT/US2012/048993, filed Jul. 31, 2012 and published Feb. 7, 2013 as International Publication No. WO 2013/019795, and/or PCT Application No. PCT/CA2012/000378, filed Apr. 25, 2012 and published Nov. 1, 2012 as International Publication No. WO 2012/145822, and/or U.S. patent application Ser. No. 13/333,337, filed Dec. 21, 2011 and published Jun. 28, 2012 as U.S. Publication No. US-2012-0162427, and/or U.S. provisional application Ser. No. 61/615,410, filed Mar. 26, 2012; Ser. No. 61/588,833, filed Jan. 20, 2012; Ser. No. 61/570,017, filed Dec. 13, 2011; Ser. No. 61/568,791, filed Dec. 9, 2011; and/or Ser. No. 61/559,970, filed Nov. 15, 2011, which are hereby incorporated herein by reference in their entireties.
Changes and modifications to the specifically described embodiments may be carried out without departing from the principles of the present invention, which is intended to be limited only by the scope of the appended claims as interpreted according to the principles of patent law.
The present application is a continuation of U.S. patent application Ser. No. 17/655,381, filed Mar. 18, 2022, now U.S. Pat. No. 11,673,546, which is a continuation of U.S. patent application Ser. No. 15/924,892, filed Mar. 19, 2018, now U.S. Pat. No. 11,279,343, which is a continuation of U.S. patent application Ser. No. 14/867,069, filed Sep. 28, 2015, now U.S. Pat. No. 9,919,705, which is a continuation of U.S. patent application Ser. No. 13/660,306, filed Oct. 25, 2012, now U.S. Pat. No. 9,146,898, which claims the filing benefits of U.S. provisional application Ser. No. 61/552,167, filed Oct. 27, 2011, which is hereby incorporated herein by reference in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
5001558 | Burley et al. | Mar 1991 | A |
5003288 | Wilhelm | Mar 1991 | A |
5012082 | Watanabe | Apr 1991 | A |
5016977 | Baude et al. | May 1991 | A |
5027001 | Torbert | Jun 1991 | A |
5027200 | Petrossian et al. | Jun 1991 | A |
5044706 | Chen | Sep 1991 | A |
5055668 | French | Oct 1991 | A |
5059877 | Teder | Oct 1991 | A |
5064274 | Alten | Nov 1991 | A |
5072154 | Chen | Dec 1991 | A |
5086253 | Lawler | Feb 1992 | A |
5096287 | Kakinami et al. | Mar 1992 | A |
5097362 | Lynas | Mar 1992 | A |
5121200 | Choi | Jun 1992 | A |
5124549 | Michaels et al. | Jun 1992 | A |
5130709 | Toyama et al. | Jul 1992 | A |
5148014 | Lynam et al. | Sep 1992 | A |
5168378 | Black | Dec 1992 | A |
5170374 | Shimohigashi et al. | Dec 1992 | A |
5172235 | Wilm et al. | Dec 1992 | A |
5177685 | Davis et al. | Jan 1993 | A |
5182502 | Slotkowski et al. | Jan 1993 | A |
5184956 | Langlais et al. | Feb 1993 | A |
5189561 | Hong | Feb 1993 | A |
5193000 | Lipton et al. | Mar 1993 | A |
5193029 | Schofield et al. | Mar 1993 | A |
5204778 | Bechtel | Apr 1993 | A |
5208701 | Maeda | May 1993 | A |
5245422 | Borcherts et al. | Sep 1993 | A |
5253109 | O'Farrell et al. | Oct 1993 | A |
5276389 | Levers | Jan 1994 | A |
5285060 | Larson et al. | Feb 1994 | A |
5289182 | Brillard et al. | Feb 1994 | A |
5289321 | Secor | Feb 1994 | A |
5305012 | Faris | Apr 1994 | A |
5307136 | Saneyoshi | Apr 1994 | A |
5309137 | Kajiwara | May 1994 | A |
5313072 | Vachss | May 1994 | A |
5325096 | Pakett | Jun 1994 | A |
5325386 | Jewell et al. | Jun 1994 | A |
5329206 | Slotkowski et al. | Jul 1994 | A |
5331312 | Kudoh | Jul 1994 | A |
5336980 | Levers | Aug 1994 | A |
5341437 | Nakayama | Aug 1994 | A |
5351044 | Mathur et al. | Sep 1994 | A |
5355118 | Fukuhara | Oct 1994 | A |
5374852 | Parkes | Dec 1994 | A |
5386285 | Asayama | Jan 1995 | A |
5394333 | Kao | Feb 1995 | A |
5406395 | Wilson et al. | Apr 1995 | A |
5410346 | Saneyoshi et al. | Apr 1995 | A |
5414257 | Stanton | May 1995 | A |
5414461 | Kishi et al. | May 1995 | A |
5416313 | Larson et al. | May 1995 | A |
5416318 | Hegyi | May 1995 | A |
5416478 | Morinaga | May 1995 | A |
5424952 | Asayama | Jun 1995 | A |
5426294 | Kobayashi et al. | Jun 1995 | A |
5430431 | Nelson | Jul 1995 | A |
5434407 | Bauer et al. | Jul 1995 | A |
5440428 | Hegg et al. | Aug 1995 | A |
5444478 | Lelong et al. | Aug 1995 | A |
5451822 | Bechtel et al. | Sep 1995 | A |
5457493 | Leddy et al. | Oct 1995 | A |
5461357 | Yoshioka et al. | Oct 1995 | A |
5461361 | Moore | Oct 1995 | A |
5469298 | Suman et al. | Nov 1995 | A |
5471515 | Fossum et al. | Nov 1995 | A |
5475494 | Nishida et al. | Dec 1995 | A |
5487116 | Nakano et al. | Jan 1996 | A |
5498866 | Bendicks et al. | Mar 1996 | A |
5500766 | Stonecypher | Mar 1996 | A |
5510983 | Iino | Apr 1996 | A |
5515448 | Nishitani | May 1996 | A |
5521633 | Nakajima et al. | May 1996 | A |
5528698 | Kamei et al. | Jun 1996 | A |
5529138 | Shaw et al. | Jun 1996 | A |
5530240 | Larson et al. | Jun 1996 | A |
5530420 | Tsuchiya et al. | Jun 1996 | A |
5535314 | Alves et al. | Jul 1996 | A |
5537003 | Bechtel et al. | Jul 1996 | A |
5539397 | Asanuma et al. | Jul 1996 | A |
5541590 | Nishio | Jul 1996 | A |
5550677 | Schofield et al. | Aug 1996 | A |
5555312 | Shima et al. | Sep 1996 | A |
5555555 | Sato et al. | Sep 1996 | A |
5568027 | Teder | Oct 1996 | A |
5574443 | Hsieh | Nov 1996 | A |
5581464 | Woll et al. | Dec 1996 | A |
5594222 | Caldwell | Jan 1997 | A |
5614788 | Mullins | Mar 1997 | A |
5619370 | Guinosso | Apr 1997 | A |
5634709 | Iwama | Jun 1997 | A |
5642299 | Hardin et al. | Jun 1997 | A |
5648835 | Uzawa | Jul 1997 | A |
5650944 | Kise | Jul 1997 | A |
5660454 | Mori et al. | Aug 1997 | A |
5661303 | Teder | Aug 1997 | A |
5666028 | Bechtel et al. | Sep 1997 | A |
5668663 | Varaprasad et al. | Sep 1997 | A |
5670935 | Schofield et al. | Sep 1997 | A |
5675489 | Pomerleau | Oct 1997 | A |
5677851 | Kingdon et al. | Oct 1997 | A |
5699044 | Van Lente et al. | Dec 1997 | A |
5724187 | Varaprasad et al. | Mar 1998 | A |
5724316 | Brunts | Mar 1998 | A |
5737226 | Olson et al. | Apr 1998 | A |
5757949 | Kinoshita et al. | May 1998 | A |
5760826 | Nayar | Jun 1998 | A |
5760828 | Cortes | Jun 1998 | A |
5760931 | Saburi et al. | Jun 1998 | A |
5760962 | Schofield et al. | Jun 1998 | A |
5761094 | Olson et al. | Jun 1998 | A |
5765116 | Wilson-Jones et al. | Jun 1998 | A |
5781437 | Wiemer et al. | Jul 1998 | A |
5786772 | Schofield et al. | Jul 1998 | A |
5790403 | Nakayama | Aug 1998 | A |
5790973 | Blaker et al. | Aug 1998 | A |
5793308 | Rosinski et al. | Aug 1998 | A |
5793420 | Schmidt | Aug 1998 | A |
5796094 | Schofield et al. | Aug 1998 | A |
5798575 | O'Farrell et al. | Aug 1998 | A |
5835255 | Miles | Nov 1998 | A |
5837994 | Stam et al. | Nov 1998 | A |
5844505 | Van Ryzin | Dec 1998 | A |
5844682 | Kiyomoto et al. | Dec 1998 | A |
5845000 | Breed et al. | Dec 1998 | A |
5848802 | Breed et al. | Dec 1998 | A |
5850176 | Kinoshita et al. | Dec 1998 | A |
5850254 | Takano et al. | Dec 1998 | A |
5867591 | Onda | Feb 1999 | A |
5877707 | Kowalick | Mar 1999 | A |
5877897 | Schofield et al. | Mar 1999 | A |
5878370 | Olson | Mar 1999 | A |
5883739 | Ashihara et al. | Mar 1999 | A |
5884212 | Lion | Mar 1999 | A |
5890021 | Onoda | Mar 1999 | A |
5896085 | Mori et al. | Apr 1999 | A |
5899956 | Chan | May 1999 | A |
5914815 | Bos | Jun 1999 | A |
5923027 | Stam et al. | Jul 1999 | A |
5929786 | Schofield et al. | Jul 1999 | A |
5940120 | Frankhouse et al. | Aug 1999 | A |
5949331 | Schofield et al. | Sep 1999 | A |
5956181 | Lin | Sep 1999 | A |
5959367 | O'Farrell et al. | Sep 1999 | A |
5959555 | Furuta | Sep 1999 | A |
5963247 | Banitt | Oct 1999 | A |
5964822 | Alland et al. | Oct 1999 | A |
5971552 | O'Farrell et al. | Oct 1999 | A |
5986796 | Miles | Nov 1999 | A |
5990469 | Bechtel et al. | Nov 1999 | A |
5990649 | Nagao et al. | Nov 1999 | A |
6001486 | Varaprasad et al. | Dec 1999 | A |
6009336 | Harris et al. | Dec 1999 | A |
6020704 | Buschur | Feb 2000 | A |
6049171 | Stam et al. | Apr 2000 | A |
6066933 | Ponziana | May 2000 | A |
6084519 | Coulling et al. | Jul 2000 | A |
6087953 | DeLine et al. | Jul 2000 | A |
6097023 | Schofield et al. | Aug 2000 | A |
6097024 | Stam et al. | Aug 2000 | A |
6116743 | Hoek | Sep 2000 | A |
6124647 | Marcus et al. | Sep 2000 | A |
6124886 | DeLine et al. | Sep 2000 | A |
6139172 | Bos et al. | Oct 2000 | A |
6144022 | Tenenbaum et al. | Nov 2000 | A |
6172613 | DeLine et al. | Jan 2001 | B1 |
6175164 | O'Farrell et al. | Jan 2001 | B1 |
6175300 | Kendrick | Jan 2001 | B1 |
6198409 | Schofield et al. | Mar 2001 | B1 |
6201642 | Bos | Mar 2001 | B1 |
6222447 | Schofield et al. | Apr 2001 | B1 |
6222460 | DeLine et al. | Apr 2001 | B1 |
6243003 | DeLine et al. | Jun 2001 | B1 |
6250148 | Lynam | Jun 2001 | B1 |
6259412 | Duroux | Jul 2001 | B1 |
6266082 | Yonezawa et al. | Jul 2001 | B1 |
6266442 | Laumeyer et al. | Jul 2001 | B1 |
6285393 | Shimoura et al. | Sep 2001 | B1 |
6291906 | Marcus et al. | Sep 2001 | B1 |
6294989 | Schofield et al. | Sep 2001 | B1 |
6297781 | Turnbull et al. | Oct 2001 | B1 |
6302545 | Schofield et al. | Oct 2001 | B1 |
6310611 | Caldwell | Oct 2001 | B1 |
6313454 | Bos et al. | Nov 2001 | B1 |
6317057 | Lee | Nov 2001 | B1 |
6320176 | Schofield et al. | Nov 2001 | B1 |
6320282 | Caldwell | Nov 2001 | B1 |
6326613 | Heslin et al. | Dec 2001 | B1 |
6329925 | Skiver et al. | Dec 2001 | B1 |
6333759 | Mazzilli | Dec 2001 | B1 |
6341523 | Lynam | Jan 2002 | B2 |
6353392 | Schofield et al. | Mar 2002 | B1 |
6366213 | DeLine et al. | Apr 2002 | B2 |
6370329 | Teuchert | Apr 2002 | B1 |
6396397 | Bos et al. | May 2002 | B1 |
6405132 | Breed | Jun 2002 | B1 |
6411204 | Bloomfield et al. | Jun 2002 | B1 |
6411328 | Franke et al. | Jun 2002 | B1 |
6420975 | DeLine et al. | Jul 2002 | B1 |
6424273 | Gutta et al. | Jul 2002 | B1 |
6428172 | Hutzel et al. | Aug 2002 | B1 |
6430303 | Naoi et al. | Aug 2002 | B1 |
6433676 | DeLine et al. | Aug 2002 | B2 |
6433817 | Guerra | Aug 2002 | B1 |
6442465 | Breed et al. | Aug 2002 | B2 |
6477464 | McCarthy et al. | Nov 2002 | B2 |
6485155 | Duroux et al. | Nov 2002 | B1 |
6497503 | Dassanayake et al. | Dec 2002 | B1 |
6498620 | Schofield et al. | Dec 2002 | B2 |
6513252 | Schierbeek et al. | Feb 2003 | B1 |
6516664 | Lynam | Feb 2003 | B2 |
6523964 | Schofield et al. | Feb 2003 | B2 |
6534884 | Marcus et al. | Mar 2003 | B2 |
6539306 | Turnbull | Mar 2003 | B2 |
6547133 | Devries, Jr. et al. | Apr 2003 | B1 |
6553130 | Lemelson et al. | Apr 2003 | B1 |
6559435 | Schofield et al. | May 2003 | B2 |
6574033 | Chui et al. | Jun 2003 | B1 |
6578017 | Ebersole et al. | Jun 2003 | B1 |
6587573 | Stam et al. | Jul 2003 | B1 |
6589625 | Kothari et al. | Jul 2003 | B1 |
6593565 | Heslin et al. | Jul 2003 | B2 |
6594583 | Ogura et al. | Jul 2003 | B2 |
6611202 | Schofield et al. | Aug 2003 | B2 |
6611610 | Stam et al. | Aug 2003 | B1 |
6627918 | Getz et al. | Sep 2003 | B2 |
6631994 | Suzuki et al. | Oct 2003 | B2 |
6636258 | Strumolo | Oct 2003 | B2 |
6648477 | Hutzel et al. | Nov 2003 | B2 |
6650233 | DeLine et al. | Nov 2003 | B2 |
6650455 | Miles | Nov 2003 | B2 |
6672731 | Schnell et al. | Jan 2004 | B2 |
6674562 | Miles | Jan 2004 | B1 |
6678056 | Downs | Jan 2004 | B2 |
6678614 | McCarthy et al. | Jan 2004 | B2 |
6680792 | Miles | Jan 2004 | B2 |
6690268 | Schofield et al. | Feb 2004 | B2 |
6700605 | Toyoda et al. | Mar 2004 | B1 |
6703925 | Steffel | Mar 2004 | B2 |
6704621 | Stein et al. | Mar 2004 | B1 |
6710908 | Miles et al. | Mar 2004 | B2 |
6711474 | Treyz et al. | Mar 2004 | B1 |
6714331 | Lewis et al. | Mar 2004 | B2 |
6717610 | Bos et al. | Apr 2004 | B1 |
6735506 | Breed et al. | May 2004 | B2 |
6741377 | Miles | May 2004 | B2 |
6744353 | Sjonell | Jun 2004 | B2 |
6757109 | Bos | Jun 2004 | B2 |
6762867 | Lippert et al. | Jul 2004 | B2 |
6794119 | Miles | Sep 2004 | B2 |
6795221 | Urey | Sep 2004 | B1 |
6802617 | Schofield et al. | Oct 2004 | B2 |
6806452 | Bos et al. | Oct 2004 | B2 |
6822563 | Bos et al. | Nov 2004 | B2 |
6823241 | Shirato et al. | Nov 2004 | B2 |
6824281 | Schofield et al. | Nov 2004 | B2 |
6831261 | Schofield et al. | Dec 2004 | B2 |
6847487 | Burgner | Jan 2005 | B2 |
6882287 | Schofield | Apr 2005 | B2 |
6889161 | Winner et al. | May 2005 | B2 |
6891563 | Schofield et al. | May 2005 | B2 |
6909753 | Meehan et al. | Jun 2005 | B2 |
6946978 | Schofield | Sep 2005 | B2 |
6953253 | Schofield et al. | Oct 2005 | B2 |
6968736 | Lynam | Nov 2005 | B2 |
6975246 | Trudeau | Dec 2005 | B1 |
6975775 | Rykowski et al. | Dec 2005 | B2 |
7004593 | Weller et al. | Feb 2006 | B2 |
7004606 | Schofield | Feb 2006 | B2 |
7005974 | McMahon et al. | Feb 2006 | B2 |
7038577 | Pawlicki et al. | May 2006 | B2 |
7046448 | Burgner | May 2006 | B2 |
7062300 | Kim | Jun 2006 | B1 |
7065432 | Moisel et al. | Jun 2006 | B2 |
7085637 | Breed et al. | Aug 2006 | B2 |
7092548 | Laumeyer et al. | Aug 2006 | B2 |
7116246 | Winter et al. | Oct 2006 | B2 |
7123168 | Schofield | Oct 2006 | B2 |
7133661 | Hatae et al. | Nov 2006 | B2 |
7149613 | Stam et al. | Dec 2006 | B2 |
7167796 | Taylor et al. | Jan 2007 | B2 |
7195381 | Lynam et al. | Mar 2007 | B2 |
7202776 | Breed | Apr 2007 | B2 |
7224324 | Quist et al. | May 2007 | B2 |
7227459 | Bos et al. | Jun 2007 | B2 |
7227611 | Hull et al. | Jun 2007 | B2 |
7249860 | Kulas et al. | Jul 2007 | B2 |
7253723 | Lindahl et al. | Aug 2007 | B2 |
7255451 | McCabe et al. | Aug 2007 | B2 |
7311406 | Schofield et al. | Dec 2007 | B2 |
7325934 | Schofield et al. | Feb 2008 | B2 |
7325935 | Schofield et al. | Feb 2008 | B2 |
7338177 | Lynam | Mar 2008 | B2 |
7339149 | Schofield et al. | Mar 2008 | B1 |
7344261 | Schofield et al. | Mar 2008 | B2 |
7360932 | Uken et al. | Apr 2008 | B2 |
7370983 | DeWind et al. | May 2008 | B2 |
7375803 | Bamji | May 2008 | B1 |
7380948 | Schofield et al. | Jun 2008 | B2 |
7388182 | Schofield et al. | Jun 2008 | B2 |
7402786 | Schofield et al. | Jul 2008 | B2 |
7423248 | Schofield et al. | Sep 2008 | B2 |
7423821 | Bechtel et al. | Sep 2008 | B2 |
7425076 | Schofield et al. | Sep 2008 | B2 |
7459664 | Schofield et al. | Dec 2008 | B2 |
7526103 | Schofield et al. | Apr 2009 | B2 |
7541743 | Salmeen et al. | Jun 2009 | B2 |
7561181 | Schofield et al. | Jul 2009 | B2 |
7565006 | Stam et al. | Jul 2009 | B2 |
7616781 | Schofield et al. | Nov 2009 | B2 |
7619508 | Lynam et al. | Nov 2009 | B2 |
7633383 | Dunsmoir et al. | Dec 2009 | B2 |
7639149 | Katoh | Dec 2009 | B2 |
7676087 | Dhua et al. | Mar 2010 | B2 |
7720580 | Higgins-Luthman | May 2010 | B2 |
7792329 | Schofield et al. | Sep 2010 | B2 |
7843451 | Lafon | Nov 2010 | B2 |
7855778 | Yung et al. | Dec 2010 | B2 |
7859565 | Schofield et al. | Dec 2010 | B2 |
7881496 | Camilleri et al. | Feb 2011 | B2 |
7914187 | Higgins-Luthman et al. | Mar 2011 | B2 |
7930160 | Hosagrahara et al. | Apr 2011 | B1 |
8017898 | Lu et al. | Sep 2011 | B2 |
8095310 | Taylor et al. | Jan 2012 | B2 |
8098142 | Schofield et al. | Jan 2012 | B2 |
8179281 | Strauss | May 2012 | B2 |
8224031 | Saito | Jul 2012 | B2 |
9146898 | Ihlenburg et al. | Sep 2015 | B2 |
9919705 | Ihlenburg | Mar 2018 | B2 |
11279343 | Ihlenburg | Mar 2022 | B2 |
11673546 | Ihlenburg | Jun 2023 | B2 |
20020113873 | Williams | Aug 2002 | A1 |
20030137586 | Lewellen | Jul 2003 | A1 |
20030222982 | Hamdan et al. | Dec 2003 | A1 |
20050219852 | Stam et al. | Oct 2005 | A1 |
20050237385 | Kosaka et al. | Oct 2005 | A1 |
20060018511 | Stam et al. | Jan 2006 | A1 |
20060018512 | Stam et al. | Jan 2006 | A1 |
20060050018 | Hutzel et al. | Mar 2006 | A1 |
20060091813 | Stam et al. | May 2006 | A1 |
20060103727 | Tseng | May 2006 | A1 |
20060250501 | Wildmann et al. | Nov 2006 | A1 |
20060254142 | Das et al. | Nov 2006 | A1 |
20070104476 | Yasutomi et al. | May 2007 | A1 |
20070109406 | Schofield et al. | May 2007 | A1 |
20070120657 | Schofield et al. | May 2007 | A1 |
20070242339 | Bradley | Oct 2007 | A1 |
20080147321 | Howard et al. | Jun 2008 | A1 |
20080192132 | Bechtel et al. | Aug 2008 | A1 |
20090113509 | Tseng et al. | Apr 2009 | A1 |
20090160987 | Bechtel et al. | Jun 2009 | A1 |
20090190015 | Bechtel et al. | Jul 2009 | A1 |
20090198412 | Shiraki | Aug 2009 | A1 |
20090256938 | Bechtel et al. | Oct 2009 | A1 |
20100250106 | Bai et al. | Sep 2010 | A1 |
20110032119 | Pfeiffer et al. | Feb 2011 | A1 |
20120045112 | Lundblad et al. | Feb 2012 | A1 |
Number | Date | Country |
---|---|---|
0426503 | May 1991 | EP |
0492591 | Jul 1992 | EP |
0788947 | Aug 1997 | EP |
59114139 | Jul 1984 | JP |
6080953 | May 1985 | JP |
6079889 | Oct 1986 | JP |
6272245 | May 1987 | JP |
6414700 | Jan 1989 | JP |
4114587 | Apr 1992 | JP |
0577657 | Mar 1993 | JP |
05050883 | Mar 1993 | JP |
5213113 | Aug 1993 | JP |
6227318 | Aug 1994 | JP |
06267304 | Sep 1994 | JP |
06276524 | Sep 1994 | JP |
06295601 | Oct 1994 | JP |
07004170 | Jan 1995 | JP |
0732936 | Feb 1995 | JP |
0747878 | Feb 1995 | JP |
07052706 | Feb 1995 | JP |
0769125 | Mar 1995 | JP |
07105496 | Apr 1995 | JP |
2630604 | Jul 1997 | JP |
200383742 | Mar 2003 | JP |
Entry |
---|
G. Wang, D. Renshaw, P.B. Denyer and M. Lu, CMOS Video Cameras, article, 1991, 4 pages, University of Edinburgh, UK. |
Tokimaru et al., “CMOS Rear-View TV System with CCD Camera”, National Technical Report vol. 34, No. 3, pp. 329-336, Jun. 1988 (Japan). |
J. Borenstein et al., “Where am I? Sensors and Method for Mobile Robot Positioning”, University of Michigan, Apr. 1996, pp. 2, 125-128. |
Bow, Sing T., “Pattern Recognition and Image Preprocessing (Signal Processing and Communications)”, CRC Press, Jan. 15, 2002, pp. 557-559. |
Vlacic et al. (Eds), “Intelligent Vehicle Technologies, Theory and Applications”, Society of Automotive Engineers Inc., edited by SAE International, 2001. |
Van Leeuwen et al., “Real-Time Vehicle Tracking in Image Sequences”, IEEE, US, vol. 3, May 21, 2001, pp. 2049-2054, XP010547308. |
Van Leeuwen et al., “Requirements for Motion Estimation in Image Sequences for Traffic Applications”, IEEE, US, vol. 1, May 24, 1999, pp. 145-150, XP010340272. |
Van Leeuwen et al., “Motion Estimation with a Mobile Camera for Traffic Applications”, IEEE, US, vol. 1, Oct. 3, 2000, pp. 58-63. |
Van Leeuwen et al., “Motion Interpretation for In-Car Vision Systems”, IEEE, US, vol. 1, Sep. 30, 2002, pp. 135-140. |
Pratt, “Digital Image Processing, Passage—ED.3”, John Wiley & Sons, US, Jan. 1, 2001, pp. 657-659, XP002529771. |
Number | Date | Country | |
---|---|---|---|
20230331220 A1 | Oct 2023 | US |
Number | Date | Country | |
---|---|---|---|
61552167 | Oct 2011 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 17655381 | Mar 2022 | US |
Child | 18332812 | US | |
Parent | 15924892 | Mar 2018 | US |
Child | 17655381 | US | |
Parent | 14867069 | Sep 2015 | US |
Child | 15924892 | US | |
Parent | 13660306 | Oct 2012 | US |
Child | 14867069 | US |