Vehicular lane change system

Information

  • Patent Grant
  • Patent Number
    10,406,980
  • Date Filed
    Thursday, October 11, 2018
  • Date Issued
    Tuesday, September 10, 2019
Abstract
A vehicular lane change system includes a forward-viewing camera that has a field of view through the vehicle windshield that encompasses a road being traveled along by the vehicle. Responsive at least in part to processing of image data captured by the camera, a control determines the lane traveled by the vehicle. The control is operable to detect a lane change maneuver of the vehicle to a lane immediately adjacent to the lane traveled by the vehicle. Detection of the lane change maneuver is based at least in part on vehicle steering data and/or image processing of captured image data. Responsive to the control detecting the lane change maneuver of the vehicle when the driver of the vehicle has neglected to turn on an appropriate turn signal indicator of the vehicle, the control automatically turns on the turn signal indicator at the appropriate side of the vehicle.
Description
TECHNICAL FIELD OF INVENTION

This invention relates to object detection adjacent a motor vehicle as it travels along a highway, and more particularly relates to imaging systems that view the blind spot adjacent a vehicle and/or that view the lane adjacent the side of a vehicle and/or that view the lane behind or forward of the vehicle as it travels down a highway.


BACKGROUND OF INVENTION

Camera-based systems have been proposed, such as in commonly assigned patent application Ser. No. 09/372,915, filed Aug. 12, 1999, now U.S. Pat. No. 6,396,397, the disclosure of which is hereby incorporated herein by reference, that detect and display the presence of, position of, distance to and rate of approach of vehicles, motorcycles, bicyclists, and the like, approaching a vehicle, such as from behind to overtake in a lane to the side of the vehicle. The image captured by such vehicular image capture systems can be displayed as a real-time image or by icons on a video screen, with distances, rates of approach and object identifiers being displayed by indicia and/or overlays, such as is disclosed in U.S. Pat. Nos. 5,670,935; 5,949,331 and 6,222,447, the disclosures of which are hereby incorporated herein by reference. Such prior art systems work well. However, it is desirable for a vehicle driver to have visual access to the full 360 degrees surrounding the vehicle, and it is not uncommon for a vehicle driver to experience blind spots due to the design of the vehicle bodywork, windows and the rearview mirror system. A blind spot commonly exists between the field of view available to the driver through the exterior rearview mirror and the driver's peripheral limit of sight. Blind Spot Detection Systems (BSDS), in which a specified zone, or set of zones, in the proximity of the vehicle is monitored for the presence of other road users or hazardous objects, have been developed. A typical BSDS may monitor at least one zone approximately one traffic lane wide on the left- or right-hand side of the vehicle, generally extending from the driver's position to approximately 10 m rearward. The objective of these systems is to provide the driver an indication of the presence of other road users located in the targeted blind spot.


Imaging systems have been developed in the prior art, such as discussed above, to perform this function, providing a visual, audio or tactile warning to the driver should a lane change or merge maneuver be attempted when another road user or hazard is detected within the monitored zone or zones. These systems are typically used in combination with a system of rearview mirrors in order to determine if a traffic condition suitable for a safe lane change maneuver exists. They are particularly effective when the detected object is moving at a low relative velocity with reference to the detecting vehicle, since the detected object may spend long periods of time in the blind spot and the driver may lose track of surrounding objects. However, prior art systems are inadequate in many driving conditions.


Known lane departure warning systems typically rely on visually detecting markers on the road on both sides of the vehicle for lane center determination. These markers must be fairly continuous or frequently occurring and generally must exist on both sides of the vehicle for the lane center position to be determined. Failure to detect a marker usually means failure of the departure-warning algorithm to adequately recognize a lane change event.


SUMMARY OF THE INVENTION

The present invention provides a Lane Change Aid (LCA) system wherein the driver of a motor vehicle traveling along a highway is warned if any unsafe lane change or merge maneuver is attempted, regardless of information available through the vehicle's rearview mirror system. The Lane Change Aid (LCA) system of the present invention extends the detection capability of the blind spot detection systems of the prior art.


A vehicle lane change aid system, according to an aspect of the invention, includes a detector that is operative to detect the presence of another vehicle adjacent the equipped vehicle, an indicator for providing an indication that a lane change maneuver of the equipped vehicle may affect the other vehicle, and a control receiving movement information of the equipped vehicle. The control develops a position history of the equipped vehicle at least as a function of the movement information. The control compares the detected presence of the other vehicle with the position history and provides the indication when a lane change maneuver may affect the other vehicle.


A vehicle lane change aid system, according to an aspect of the invention, includes an imaging device for capturing lane edge images and a control that is responsive to an output of the imaging device to recognize lane edge positions. The control is operable to distinguish between certain types of lane markers. The control may distinguish between dashed-line markers and non-dashed-line markers.


A vehicle lane change aid system, according to an aspect of the invention, includes an imaging device for capturing lane edge images and a control that is responsive to an output of the imaging device to recognize lane edge positions. The control is operative to determine that the vehicle has departed a lane. The control may notify the driver that a lane has been departed. The control may further include oncoming vehicle monitoring and side object detection.


A vehicle lane change aid system, according to an aspect of the invention, includes a forward-facing imaging device for capturing images of other vehicles and a control that is responsive to an output of the imaging device to determine an imminent collision with another vehicle. The control may include a wireless transmission channel to transmit a safety warning to the other vehicle. The control may also activate a horn or headlights of the equipped vehicle to warn of an imminent collision.


These and other objects, advantages and features of this invention will become apparent upon review of the following specification in conjunction with the drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIGS. 1A-1C are top plan views illustrating a vehicle equipped with a lane change aid system, according to the invention, traveling a straight section of road;



FIG. 2 is a block diagram of a lane change aid system, according to the invention; and



FIG. 3 is a top plan view illustrating a vehicle equipped with a lane change aid system traveling a curved section of road.





DESCRIPTION OF THE PREFERRED EMBODIMENT

Referring to the drawings and the illustrative embodiments depicted therein, a Lane Change Aid (LCA) system 12 of the present invention, as illustrated with a vehicle 10, includes a control 18 and an indicator and/or display system 16 that warns a vehicle operator if an intended, or attempted, lane change maneuver could cause an approaching rearward vehicle to brake and decelerate at an unsafe rate, or would otherwise constitute a highway hazard. In Lane Change Aid (LCA) system 12, the dimension, in the direction of travel, of a zone 20 to be monitored may be calculated based on an assumed maximum relative velocity between a detecting vehicle and an approaching rearward vehicle, and a safe braking and deceleration assumption. Depending on the assumptions made, the required detection zone may vary in length, such as extending rearward from 50 to 100 m, or more. At 100 m, the road curvature behind the vehicle may have a significant impact on the position of the lane of the detected vehicle, relative to the detecting vehicle. Since it is important to know which lane an approaching rearward vehicle is in, relative to the detecting vehicle, in order to provide the driver an appropriate warning and to avoid false warnings, the Lane Change Aid (LCA) system 12 includes developing and maintaining a lane position history 20 for the space rearward of the detecting vehicle.
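The zone-length calculation described above can be made concrete. The following is a minimal sketch, assuming a constant-deceleration model with an added reaction time; the function name and the particular speed, deceleration and reaction-time values are illustrative assumptions, not values taken from this disclosure:

```python
def detection_zone_length(v_rel_max: float, decel_safe: float,
                          t_react: float = 1.0) -> float:
    """Rearward distance (m) to monitor so that a vehicle closing at
    v_rel_max (m/s) can, after a reaction time t_react (s), brake at
    decel_safe (m/s^2) and shed the relative speed without conflict."""
    return v_rel_max * t_react + v_rel_max ** 2 / (2.0 * decel_safe)

# Example: a 25 m/s (90 km/h) assumed maximum closing speed and a safe
# braking assumption of 4 m/s^2 yield about 103 m, consistent with a
# required zone extending rearward from 50 to 100 m, or more.
print(round(detection_zone_length(25.0, 4.0), 1))
```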


By combining distance traveled with steering angle, the detecting vehicle path may be plotted. Details of the last approximately 100 m traveled are of value for lane change aids and may be stored by the Lane Change Aid (LCA) system. Data may be stored by several methods including the method described below.


Vehicle speed information in the Lane Change Aid (LCA) system 12 is typically derived from a wheel rotation sensor signal 24, which consists of a number of pulses, n, per revolution of the road wheel and is available on a vehicle data bus 26, such as a CAN or LIN bus, or the like. Sensing and signal detail may vary depending on vehicle design, but, for any particular design, a distance, d, traveled between pulses can be established. Also, as each pulse is detected, the current value of the steering angle, +/−α, determined by a steering angle encoder 22, may be read from vehicle data bus 26. Again, the sensing and signal detail may vary depending on vehicle design, but, for any particular vehicle design, an effective turning radius, r, for the vehicle can be established.
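As a rough sketch of how d and r might be established for a particular design: the wheel circumference, pulse count and bicycle-model wheelbase below are assumed, illustrative values, not taken from this disclosure:

```python
import math

WHEEL_CIRCUMFERENCE_M = 1.94  # assumed rolling circumference of the road wheel
PULSES_PER_REV = 48           # assumed number of pulses, n, per wheel revolution

# distance, d, traveled between successive wheel rotation pulses
D_PER_PULSE_M = WHEEL_CIRCUMFERENCE_M / PULSES_PER_REV

def effective_turn_radius(steer_angle_deg: float,
                          wheelbase_m: float = 2.8) -> float:
    """Effective turning radius, r, for steering angle alpha, using a
    simple bicycle-model approximation with an assumed wheelbase."""
    a = math.radians(steer_angle_deg)
    return float("inf") if abs(math.tan(a)) < 1e-9 else wheelbase_m / math.tan(a)

print(round(D_PER_PULSE_M, 3), round(effective_turn_radius(5.0), 1))
```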


Image-based blind spot detection devices and lane change aids, generally shown at 14, are but two of a variety of sensing technologies and devices suitable for the purpose of monitoring the local environment in which a vehicle operates. Radar, infrared, sonar, and laser devices are all capable of interrogating the local environment for the presence of other road users or obstacles to be avoided. GPS systems can accurately determine the vehicle position on the earth's surface, and map data can provide detailed information about the local environment as the vehicle moves. Other wireless communication systems 28, such as short-range wireless communication protocols, such as BLUETOOTH, can provide information such as the position of road works, lane restrictions, or other hazards, which can be translated by on-board vehicle electronics into position data relative to the vehicle position. Lane Change Aid (LCA) system 12 may integrate all the available information from a multiplicity of sensors, including non-image-based detectors 14b, such as a radar sensor, such as a Doppler radar sensor, and at least one image-based detector 14a, such as a CMOS video camera imaging sensor, and convert the various sensor outputs into a single database with a common format, so that data from various sources, such as a Doppler radar source and a video camera source, may be easily compared, combined and maintained.
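One way to picture such a single database with a common format is a shared detection record into which each sensor driver translates its native output. The record and field names below are a minimal sketch under assumed conventions, not a format defined by this disclosure:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """Common-format record shared by all sensor sources."""
    source: str       # e.g. "doppler_radar_rear", "cmos_camera_left"
    r: float          # range from the vehicle reference point (m)
    theta: float      # horizontal bearing (deg); 0 forward, 90 to the right
    rel_speed: float  # rate of approach (m/s); positive = closing
    stamp: float      # timestamp on a common time base (s)

# Each driver maps its own output into the shared record so radar- and
# camera-derived data can be compared, combined and maintained together.
radar_hit = Detection("doppler_radar_rear", 62.0, 178.0, 8.5, 12.034)
camera_hit = Detection("cmos_camera_left", 60.5, 176.0, 8.0, 12.031)
print(radar_hit, camera_hit, sep="\n")
```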


Consider a spherical space of radius R and center (x, y, z) = (0, 0, 0) in Cartesian coordinates or (r, θ, β) = (0, 0, 0) in polar coordinates. It is convenient to describe the space in both coordinate systems, since several operations will be used to fill and maintain the data space, and a choice of systems allows for efficient computation methods. Let the center of this space (0, 0, 0) be at the center of the vehicle's rear axle, or nominal rear axle, described by the line which passes through the centers of the two rear non-steering wheels. Let the horizontal centerline of the vehicle, in the primary direction of travel, lie on (x, 0, 0), such that positive x values describe the space forward of the center of the vehicle's rear axle. Let the rear axle coincide with (0, y, 0), such that positive values of y describe the space to the right of the vehicle centerline when looking forward; (R, 90, 0) describes the positive y axis. Let positive z values describe the space above the centerline of the rear axle; (R, 0, 90) describes the positive z axis. This "sphere of awareness" 20 moves with the vehicle as it moves through space and provides a common frame of reference for all sensed or otherwise derived data concerning the vehicle's local environment.
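For the horizontal plane of this frame, conversion between the two coordinate systems is straightforward. A small sketch consistent with the conventions above (+x forward, +y to the right, θ = 0 on the positive x axis and θ = 90 on the positive y axis):

```python
import math

def to_polar(x: float, y: float):
    """(x, y) in the rear-axle frame -> (r, theta in degrees)."""
    return math.hypot(x, y), math.degrees(math.atan2(y, x))

def to_cartesian(r: float, theta_deg: float):
    a = math.radians(theta_deg)
    return r * math.cos(a), r * math.sin(a)

# The positive y axis is (R, 90): a point 5 m to the right of center.
print(to_polar(0.0, 5.0))        # -> (5.0, 90.0)
print(to_cartesian(5.0, 90.0))   # -> (~0.0, 5.0)
```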


For the purpose of storing vehicle path data, which may be used to improve the performance of lane change aid 12, the discussion may be simplified by considering only the horizontal plane. The use of polar coordinates simplifies the operations used in this application. The first data point, as the vehicle starts with no history, is at (0, 0). The steering angle is read from the data bus and stored as α0. When wheel rotation pulse p1 is detected, steering angle α1 is recorded. Since the distance traveled between wheel pulses is known to be d, a new position for the previous data point can be calculated as (√(2(1 − cos α0)), 180 + α1). This point is stored and recorded as historical vehicle path data. When pulse p2 is detected, the above calculation is repeated to yield (√(2(1 − cos α1)), 180 + α1) as the new position for the previous data point. This requires repositioning the original data point to (√(2(1 − cos α0)) + √(2(1 − cos α1)), (180 + α0) + α1). This process is continued until the distance from the vehicle, R, reaches the maximum required value, such as 100 m in the case of a lane change aid. Data beyond this point is discarded. Thus, a continuous record of the vehicle path for the last 100 m, or whatever distance is used, may be maintained. By maintaining a running record of the path traveled, rearward approaching vehicles detected by a lane change aid image analysis system may be positioned relative to that path, as can be seen by comparing the other vehicle 40 in FIGS. 1B and 1C. In FIG. 1B, other vehicle 40 is overlapping zone 20, so an indication of potential conflict may be delayed or discarded. In FIG. 1C, the other vehicle 40 is moving outside of zone 20 and into a blind spot of vehicle 10, so an indication of potential conflict would be given to the driver with indicator 16. Thus, a determination may be made whether the approaching vehicle is in the same, adjacent or next-but-one lane, etc. By this means, the number of inappropriate or unnecessary warnings may be reduced.
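A minimal sketch of maintaining such a running path history follows. It uses Cartesian bookkeeping and a bicycle-model heading change per pulse rather than the exact symbolic expressions above; the per-pulse distance, wheelbase and trimming radius are assumptions for illustration:

```python
import math

D = 0.04          # assumed distance traveled per wheel pulse (m)
R_MAX = 100.0     # keep path history out to 100 m; discard beyond
WHEELBASE = 2.8   # assumed wheelbase (m) for steering angle -> yaw change

history = []      # past positions expressed in the current vehicle frame

def on_wheel_pulse(steer_angle_deg: float) -> None:
    """Re-express all stored points in the frame the vehicle occupies
    after traveling d at the given steering angle, and record the
    just-vacated origin as a new historical path point."""
    dtheta = D * math.tan(math.radians(steer_angle_deg)) / WHEELBASE
    c, s = math.cos(dtheta), math.sin(dtheta)
    moved = []
    for x, y in history + [(0.0, 0.0)]:
        x -= D                                 # vehicle moved forward by d
        x, y = c * x + s * y, -s * x + c * y   # rotate by the heading change
        if math.hypot(x, y) <= R_MAX:          # trim data beyond 100 m
            moved.append((x, y))
    history[:] = moved

# Straight travel, then a gentle right turn: old points drift rearward
# and to the right of the new heading, tracing the path actually driven.
for _ in range(250):
    on_wheel_pulse(0.0)
for _ in range(250):
    on_wheel_pulse(3.0)
print(len(history), history[0])
```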


Lane change aid system 12 may include a controller, such as a microprocessor including a digital signal processor microcomputer with a CPU speed of at least about 5 MIPS, more preferably at least about 12 MIPS and most preferably at least about 30 MIPS, that processes inputs from multiple cameras 14a and other sensors 14b and that includes a vehicle path history function whereby, for example, an object, such as a rear-approaching car, motorcycle or truck, or the like, is selected and its presence highlighted to the driver's attention, such as by icons on a dashboard or interior mirror-mounted display, based on the recent history of the side and rear lanes that the host vehicle equipped with the controller of this invention has recently traveled in, for example, over a previous interval of about 60 seconds or less, or over a longer period, such as about 3 minutes or more. The vehicle path history function works to determine the lane positioning of an approaching other vehicle, and whether the host vehicle is traveling on, or has recently traveled on, a straight road, as illustrated in FIGS. 1A, 1B and 1C, or a curved road portion, as illustrated in FIG. 3.


Control 18 may comprise a central video processor module such as is disclosed in commonly assigned provisional patent application Ser. No. 60/309,023, filed Jul. 31, 2001, and utility patent application filed concurrently herewith, now U.S. patent application Ser. No. 10/209,181, filed Jul. 31, 2002, and published Feb. 6, 2003 as U.S. Publication No. US 2003/0025793, the disclosures of which are hereby incorporated herein by reference. Such a video processor module operates to receive multiple image outputs from vehicle-mounted cameras, such as disclosed in commonly assigned patent application Ser. No. 09/793,002, filed Feb. 26, 2001, now U.S. Pat. No. 6,690,268, the disclosure of which is hereby incorporated herein by reference, and integrates these in a central processing module to allow reaction to the local vehicle environment. Optionally, when bandwidth limitations exist that limit the ability to send raw image data, particularly high-resolution images, from a remote camera to a central processing unit across robust transmission means, such as a fiber-optic cable or a high-density wireless link, distributed processing can occur, at least local to some of the image capture sensors. In such an at least partially distributed processing environment, the local processors are adapted to preprocess images captured by the local camera or cameras and any other device, such as a Doppler radar sensor viewing a blind spot in an adjacent side lane, and to format this preprocessed data into a standard format for transmission. The data can be transmitted via a wired network or a wireless network or over a vehicle bus system, such as a CAN bus and/or a LIN bus, or the like, to the central processor for effective, centralized mapping and combination of the total local environment around the vehicle. This provides the driver with a display of what is happening in both the right and the left side lanes, and in the lane that the host vehicle is itself traveling in.


In this regard, the vehicle can be provided with a dedicated bus and central processor, as described above, for providing a vehicle environment awareness. This awareness can be internal, such as might be provided by interior cabin or trunk monitors/sensors that determine occupant presence, head position and/or movement, eye movement, air bag deployment, microphone aiming, seat positioning, air conditioning and/or heating targeting, audio controls, and the like, or can be external to the vehicle, such as in blind spot detecting or lane change detecting. The present invention includes provision of an automatic environment awareness function that comprises automatic gathering and transmission of sensor-derived data in a standard format via a vehicle bus network, the data including data relating to the vehicle environment, such as the exterior environment, for example, the presence of rear-approaching traffic in side and rear lanes to the host vehicle as captured by rear-facing CMOS or CCD cameras on the side of the host vehicle, such as included in a side view mirror assembly on either or both sides of the host vehicle, and/or as detected by a rear lane/side lane-viewing Doppler radar sensor, and preferably includes processing in a central video processing unit.


The information relating to the external environment can be relayed/displayed to the driver in a variety of ways. For example, a blind-spot vehicle-presence indication can be displayed adjacent the exterior mirror assembly, such as inside the vehicle cabin local to where the exterior mirror assembly is attached to the vehicle door, so that the indicator display used, typically an LED flashing light source, or the like, is visible to the driver but not visible to any traffic/drivers exterior to the vehicle, yet is cognitively associated with the side of the vehicle to which that particular nearby exterior mirror is attached, such as disclosed in commonly assigned U.S. Pat. Nos. 5,786,772; 5,929,786 and 6,198,409, the disclosures of which are hereby incorporated herein by reference. Optionally, a vibration transducer can be included in the steering wheel that trembles or otherwise vibrates to tactilely warn the driver of the presence of a vehicle in a side lane into which the driver is turning the steering wheel, where an overtaking or following vehicle may constitute a collision hazard. Hazard warnings can be communicated to the driver by voice commands and/or audible warnings, and/or by heads-up displays. The coordinate scheme for data collection of the present invention enables an improved blind spot and/or lane change detection system for vehicles, particularly in busy traffic on a winding, curved road.


The present invention includes the fusion of outputs from video and non-video sensors, such as, for example, a CMOS video camera sensor and a Doppler radar sensor, to allow all-weather and visibility side object detection. The present invention can be utilized in a variety of applications such as disclosed in commonly assigned U.S. Pat. Nos. 5,670,935; 5,949,331; 6,222,447; 6,201,642; 6,097,023; 5,715,093; 5,796,094 and 5,877,897 and commonly assigned U.S. patent application Ser. No. 09/793,002, filed Feb. 26, 2001, now U.S. Pat. No. 6,690,268, Ser. No. 09/372,915, filed Aug. 12, 1999, now U.S. Pat. No. 6,396,397, Ser. No. 09/767,939, filed Jan. 23, 2001, now U.S. Pat. No. 6,590,719, Ser. No. 09/776,625, filed Feb. 5, 2001, now U.S. Pat. No. 6,611,202, Ser. No. 09/799,993, filed Mar. 6, 2001, now U.S. Pat. No. 6,538,827, Ser. No. 09/493,522, filed Jan. 28, 2000, now U.S. Pat. No. 6,426,492, Ser. No. 09/199,907, filed Nov. 25, 1998, now U.S. Pat. No. 6,717,610, Ser. No. 08/952,026, filed Nov. 19, 1997, now U.S. Pat. No. 6,498,620, Ser. No. 09/227,344, filed Jan. 8, 1999, now U.S. Pat. No. 6,302,545, International Publication No. WO 96/38319, published Dec. 5, 1996, and International Publication No. WO 99/23828, published May 14, 1999, the disclosures of which are collectively incorporated herein by reference.


Lane change aid system 12 may include a lane marker type recognition algorithm, or capability 32. Lane marker type recognition capability 32 classifies lane markers as one of many specific types for the purpose of interpreting the original purpose of the lane marker and issuing reliable and meaningful warnings based on this interpretation. As an example, a double line on the left side of a left-hand drive vehicle typically indicates a no-encroachment zone or no-passing zone. A solid line with an adjacent dashed line indicates either an ability to pass safely, if the dashed line is on the near side of the solid line, or a do-not-encroach zone, if the dashed line is on the far side of the solid line. Road edges can be distinctly recognized and classified as no-encroachment zones. Conversely, dashed lines may have no significance to lane departure warning algorithms, since they merely indicate lane edge positions. Recognizing dashed lines as such provides the ability to suppress nuisance warnings. The recognition algorithm can further be enhanced by recognizing road features when lane markers are too weak or missing. Features such as curbs, road seams, grease or rubber slicks, road signs, and vehicles in the same, neighboring and/or opposing lanes, when recognized, could be used to interpret lane-vehicle positioning and issue intelligent warning alerts to the driver. Fewer false or nuisance-type warnings, with improved real warning functionality and speed, can be realized with this improvement, and operation under difficult lighting and environmental conditions can be extended.
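A sketch of how such a classification might gate warnings, with hypothetical type names chosen to mirror the cases discussed above:

```python
from enum import Enum, auto

class Marker(Enum):
    DASHED = auto()             # ordinary lane edge; crossing is routine
    DOUBLE = auto()             # no-encroachment / no-passing zone
    SOLID_DASHED_NEAR = auto()  # dashed on near side: passing permitted
    SOLID_DASHED_FAR = auto()   # dashed on far side: do not encroach
    ROAD_EDGE = auto()          # classified as a no-encroachment zone

def warn_on_crossing(marker: Marker) -> bool:
    """Issue a departure warning only where the marker's original purpose
    makes crossing hazardous; dashed lines alone raise no nuisance alarm."""
    return marker in (Marker.DOUBLE, Marker.SOLID_DASHED_FAR, Marker.ROAD_EDGE)

print(warn_on_crossing(Marker.DASHED), warn_on_crossing(Marker.ROAD_EDGE))
```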


Note that collision avoidance functionality 34 can optionally be achieved using a forward-facing camera 14a in the present invention. For example, should the forward-looking camera detect an oncoming car likely to collide with the vehicle equipped with the present invention, or should another vehicle try to pull in front of it, the system of the present invention can issue a warning (visual and/or audible) to one or both drivers involved. Such a warning can include flashing the headlights and/or sounding the car horn. Similarly, the system can detect that the driver of the vehicle equipped with the present invention is failing to recognize a stop sign, a signal light or some other warning sign, and the driver can be warned visually, such as with a warning light at the interior mirror in the vehicle cabin, audibly, such as via a warning beeper, or tactilely, such as via a rumble/vibration transducer that vibrates the steering wheel to alert the driver of a potential hazard.
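One plausible "likely to collide" test is a time-to-collision threshold on a tracked forward object. The following is a minimal sketch under that assumption; the threshold value and interface are illustrative, not taken from this disclosure:

```python
def collision_likely(range_m: float, closing_speed_mps: float,
                     ttc_threshold_s: float = 2.0) -> bool:
    """Flag an oncoming or cutting-in vehicle when time-to-collision
    (range / closing speed) drops below an assumed threshold."""
    if closing_speed_mps <= 0.0:   # opening or holding distance: no hazard
        return False
    return range_m / closing_speed_mps < ttc_threshold_s

# 30 m ahead, closing at 20 m/s -> TTC of 1.5 s: warn one or both drivers,
# e.g. flash the headlights and/or sound the horn.
print(collision_likely(30.0, 20.0))
```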


System 12 may also include a lane departure warning algorithm, or system 36. For example, when a left-hand drive vehicle equipped with system 12 is making a left-hand turn generally across a line on the road, system 36 can monitor for the lane crossing and combine it with detection of an oncoming vehicle. The system 12 may also calculate the closing speed of approaching vehicles to warn of a potential impact.


Also, the vehicle can be provided on its front fender or elsewhere at the front of the vehicle with a side-looking camera as an image-based detector 14a operable to warn the driver when he/she is making a left turn across lanes of traffic coming from his/her left (left-side warning) and then again when he/she is about to enter traffic lanes with traffic coming from his/her right (right-side warning). While executing this turn, the system of the present invention may utilize the detection of the lane markers when the driver's car is about to enter the specific lane, combined with oncoming vehicle detection, as a means of predictive warning before the vehicle actually enters the danger zone.


System 12 is also capable of performing one or more vehicle functions 30. For example, should the lane departure warning system 36 detect that the driver of the vehicle equipped with the system intends to make or is making a lane change and has neglected to turn on the appropriate turn signal indicator, the system performs a vehicle function 30 of automatically turning on the turn signal on the appropriate side of the vehicle.
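The turn-signal function reduces to a small rule. A sketch with a hypothetical actuator interface (the function and state names are assumptions):

```python
def on_lane_change(direction: str, signals: dict) -> None:
    """direction is 'left' or 'right', as detected by the lane departure
    warning system; signals is a hypothetical turn-signal actuator state."""
    if not signals[direction]:        # driver neglected to signal
        signals[direction] = True     # automatically turn the signal on
        print(f"auto-activating {direction} turn signal")

signals = {"left": False, "right": False}
on_lane_change("left", signals)   # lane change detected, no signal set
```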


The lane departure warning system 36 of the present invention is operable to differentiate between solid lines, dashed lines and double lines on the road being traveled. Also, should the vehicle be equipped with a side object detection (SOD) system, such as a Doppler radar unit or a camera vision side object detection system, that detects the presence of overtaking vehicles in the adjacent side lane, then the SOD system can work in conjunction with the lane departure warning system such that, when the lane departure system detects that the driver is making a lane change into a side lane in which the SOD system detects an overtaking vehicle, the driver is alerted and warned of the possible hazard, such as by a visual, audible and/or tactile alert.
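A sketch of the combined check, with assumed inputs from the two subsystems:

```python
from typing import Optional

def lane_change_hazard(departing_to: Optional[str],
                       overtaker_in: Optional[str]) -> bool:
    """True when the lane departure system reports a change into a side
    lane ('left'/'right') in which the SOD system, such as a Doppler
    radar or camera vision unit, detects an overtaking vehicle."""
    return departing_to is not None and departing_to == overtaker_in

if lane_change_hazard("right", "right"):
    print("alert driver: visual, audible and/or tactile warning")
```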


As indicated above, the forward-facing camera can include stoplight or sign detection, and the system can further include a broadcast, with wireless communication system 28, on a safety warning band when the forward-facing camera detects the stoplight or sign and determines, based on current speed and deceleration, that the vehicle is not going to stop. This would warn crossing drivers of an unsafe condition. Such alerts can dynamically vary depending on road surface conditions (wet, snow, ice, etc.) as visually detected and determined by the forward-facing, road-monitoring camera. For example, wet or snowy roads would change the distance and/or speed at which the system warns, based on camera vision recognition of stoplights and/or stop signs. When the vehicle approaches a stoplight as it changes, or does not slow down for the light after the driver has been warned, the system can sound the horn and/or flash the lights to warn vehicles at the stoplight of the oncoming vehicle. The car may also broadcast one of the safety alerts that radar detectors pick up.
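A sketch of how the warning distance might stretch with the camera-detected surface condition; the deceleration table and reaction time are illustrative assumptions, not values from this disclosure:

```python
# assumed achievable decelerations (m/s^2) by detected surface condition
DECEL = {"dry": 7.0, "wet": 4.5, "snow": 2.5, "ice": 1.5}

def warn_distance(speed_mps: float, surface: str,
                  t_react: float = 1.2) -> float:
    """Distance before a recognized stoplight/stop sign at which to warn,
    given current speed and the camera-determined road surface."""
    a = DECEL[surface]
    return speed_mps * t_react + speed_mps ** 2 / (2.0 * a)

for s in ("dry", "wet", "snow"):
    print(s, round(warn_distance(20.0, s), 1), "m")   # 20 m/s = 72 km/h
```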


Changes and modifications in the specifically described embodiments can be carried out without departing from the principles of the invention, which is intended to be limited only by the scope of the appended claims, as interpreted according to the principles of patent law including the doctrine of equivalents.

Claims
  • 1. A vehicular lane change system, said vehicular lane change system comprising: a forward-viewing camera disposed at an in-cabin side of a windshield of a vehicle equipped with said vehicular lane change system, said forward-viewing camera viewing through the windshield to capture image data at least forward of the equipped vehicle;a radar sensor disposed at the equipped vehicle, wherein said radar sensor has a field of sensing exterior of the equipped vehicle and wherein said radar sensor captures radar data;said forward-viewing camera having a field of view that encompasses a road being traveled along by the equipped vehicle;a control disposed in the equipped vehicle and comprising a data processor;said data processor comprising an image processor;wherein image data captured by said forward-viewing camera is provided to said control and is processed by said image processor;wherein radar data captured by said radar sensor is provided to said control;wherein said image processor processes provided image data at a processing speed of at least 30 MIPS;wherein, responsive to image processing by said image processor of provided image data captured by said forward-viewing camera, lane markers on the road being traveled along by the equipped vehicle are detected;wherein, responsive at least in part to image processing by said image processor of provided image data captured by said forward-viewing camera, said control determines lane edges of a lane being traveled by the equipped vehicle;wherein said control is operable to detect a lane change maneuver by a driver of the equipped vehicle to a lane immediately adjacent to the lane being traveled by the equipped vehicle;wherein detection by said control of the lane change maneuver by the driver of the equipped vehicle is based at least in part on determining, via image processing by said image processor of provided image data captured by said forward-viewing camera, that the equipped vehicle is approaching a determined lane edge of the lane being traveled by the equipped vehicle to cross that determined lane edge of the lane being traveled by the equipped vehicle;wherein detection by said control of the lane change maneuver by the driver of the equipped vehicle is irrespective of activation by the driver of the equipped vehicle of a turn signal indicator of the equipped vehicle;wherein, responsive to said control detecting the lane change maneuver by the driver of the equipped vehicle to the lane immediately adjacent to the lane being traveled by the equipped vehicle when the driver of the equipped vehicle has neglected to turn on an appropriate turn signal indicator of the equipped vehicle, said control automatically turns on the turn signal indicator at the appropriate side of the equipped vehicle;wherein, responsive at least in part to processing at said control of provided radar data captured by said radar sensor, said control detects another vehicle present in the lane immediately adjacent to the lane being traveled by the equipped vehicle; andwherein, responsive to detection by said control of the lane change maneuver by the driver of the equipped vehicle to the lane immediately adjacent to the lane being traveled by the equipped vehicle, and responsive to detection by said control of the other vehicle present in that lane immediately adjacent to the lane being traveled by the equipped vehicle, said vehicular lane change system generates an alert to the driver of the equipped vehicle.
  • 2. The vehicular lane change system of claim 1, wherein said control receives vehicle data relating to the equipped vehicle via a vehicle bus of the equipped vehicle, and wherein said vehicle data comprises vehicle steering data, and wherein said vehicle bus comprises a CAN vehicle bus of the equipped vehicle.
  • 3. The vehicular lane change system of claim 2, wherein a steering angle of the equipped vehicle is determined based at least in part on vehicle steering data received at said control via said CAN vehicle bus, and wherein, based at least in part on the determined steering angle of the equipped vehicle, said vehicular lane change system determines that the driver of the equipped vehicle is making or is intending to make the lane change maneuver.
  • 4. The vehicular lane change system of claim 1, wherein the other vehicle is an oncoming vehicle that is approaching the equipped vehicle.
  • 5. The vehicular lane change system of claim 1, wherein provided radar data captured by said radar sensor and provided image data captured by said forward-viewing camera are fused at said control.
  • 6. The vehicular lane change system of claim 1, wherein, responsive to said control detecting the lane change maneuver by the driver of the equipped vehicle to the lane immediately adjacent to the lane being traveled by the equipped vehicle, and when said control, via processing at said control of provided radar data captured by said radar sensor, determines that the vehicle present in the lane immediately adjacent to the lane being traveled by the equipped vehicle is not at a location where the equipped vehicle may collide with the other vehicle if the lane change maneuver is made, said control determines that the lane change maneuver can proceed without hazard of collision with the other vehicle detected present in the lane immediately adjacent to the lane being traveled by the equipped vehicle.
  • 7. The vehicular lane change system of claim 1, wherein, responsive at least in part to processing at said control of provided radar data captured by said radar sensor and of provided image data captured by said forward-viewing camera, said control determines a hazard of collision of the equipped vehicle with the other vehicle.
  • 8. The vehicular lane change system of claim 7, wherein, responsive at least in part to said control determining the hazard of collision with the other vehicle, said control controls a system of the equipped vehicle to mitigate potential collision with the other vehicle.
  • 9. The vehicular lane change system of claim 8, wherein, responsive at least in part to said control determining the hazard of collision, said control controls activation of at least one selected from the group consisting of (i) a horn system of the equipped vehicle to mitigate potential collision with the other vehicle and (ii) a lighting system of the equipped vehicle to mitigate potential collision with the other vehicle.
  • 10. The vehicular lane change system of claim 7, wherein, responsive at least in part to said control determining the hazard of collision, said control controls a wireless transmission system of the equipped vehicle to transmit a safety warning to the other vehicle.
  • 11. The vehicular lane change system of claim 7, wherein said radar sensor comprises a Doppler radar sensor.
  • 12. A vehicular lane change system, said vehicular lane change system comprising: a forward-viewing camera disposed at an in-cabin side of a windshield of a vehicle equipped with said vehicular lane change system, said forward-viewing camera viewing through the windshield to capture image data at least forward of the equipped vehicle;said forward-viewing camera having a field of view that encompasses a road being traveled along by the equipped vehicle;a control disposed in the equipped vehicle and comprising a data processor;said data processor comprising an image processor;wherein image data captured by said forward-viewing camera is provided to said control and is processed by said image processor;wherein said image processor processes provided image data at a processing speed of at least 30 MIPS;wherein, responsive to image processing by said image processor of provided image data captured by said forward-viewing camera, lane markers on the road being traveled along by the equipped vehicle are detected;wherein, responsive at least in part to image processing by said image processor of provided image data captured by said forward-viewing camera, said control determines lane edges of a lane being traveled by the equipped vehicle;wherein said control is operable to detect a lane change maneuver by a driver of the equipped vehicle to a lane immediately adjacent to the lane being traveled by the equipped vehicle;wherein detection by said control of the lane change maneuver by the driver of the equipped vehicle is irrespective of activation by the driver of the equipped vehicle of a turn signal indicator of the equipped vehicle;wherein a radar sensor is disposed at the equipped vehicle;wherein said radar sensor has a field of sensing exterior of the equipped vehicle as the equipped vehicle travels along the road, and wherein said radar sensor captures radar data;wherein radar data captured by said radar sensor is provided to said control;wherein, responsive at least in part to processing at said control of provided radar data captured by said radar sensor and of provided image data captured by said forward-viewing camera, said control detects another vehicle present in the lane immediately adjacent to the lane being traveled by the equipped vehicle;wherein the other vehicle is an oncoming vehicle that is approaching the equipped vehicle; andwherein, responsive to detection by said control of the lane change maneuver by the driver of the equipped vehicle to the lane immediately adjacent to the lane being traveled by the equipped vehicle, and responsive to detection by said control of the other vehicle present in that lane immediately adjacent to the lane being traveled by the equipped vehicle, said vehicular lane change system generates an alert to the driver of the equipped vehicle.
  • 13. The vehicular lane change system of claim 12, wherein provided radar data captured by said radar sensor and provided image data captured by said forward-viewing camera are fused at said control.
  • 14. The vehicular lane change system of claim 13, wherein, responsive to said control detecting the lane change maneuver by the driver of the equipped vehicle to the lane immediately adjacent to the lane being traveled by the equipped vehicle, and when said control, via processing at said control of provided radar data captured by said radar sensor, determines that the vehicle present in the lane immediately adjacent to the lane being traveled by the equipped vehicle is not at a location where the equipped vehicle may collide with the other vehicle if the lane change maneuver is made, said control determines that the lane change maneuver can proceed without hazard of collision with the other vehicle detected present in the lane immediately adjacent to the lane being traveled by the equipped vehicle.
  • 15. The vehicular lane change system of claim 13, wherein, responsive at least in part to processing at said control of provided radar data captured by said radar sensor and of provided image data captured by said forward-viewing camera, said control determines a hazard of collision of the equipped vehicle with the other vehicle.
  • 16. The vehicular lane change system of claim 15, wherein, responsive at least in part to said control determining the hazard of collision with the other vehicle, said control controls a system of the equipped vehicle to mitigate potential collision with the other vehicle.
  • 17. The vehicular lane change system of claim 15, wherein, responsive at least in part to said control determining the hazard of collision, said control controls activation of at least one selected from the group consisting of (i) a horn system of the equipped vehicle to mitigate potential collision with the other vehicle and (ii) a lighting system of the equipped vehicle to mitigate potential collision with the other vehicle.
  • 18. The vehicular lane change system of claim 15, wherein, responsive at least in part to said control determining the hazard of collision, said control controls a wireless transmission system of the equipped vehicle to transmit a safety warning to the other vehicle.
  • 19. The vehicular lane change system of claim 13, wherein, responsive to said control detecting the lane change maneuver by the driver of the equipped vehicle to the lane immediately adjacent to the lane being traveled by the equipped vehicle when the driver of the equipped vehicle has neglected to turn on an appropriate turn signal indicator of the equipped vehicle, said control automatically turns on the turn signal indicator at the appropriate side of the equipped vehicle.
  • 20. The vehicular lane change system of claim 13, wherein said control receives vehicle data relating to the equipped vehicle via a CAN vehicle bus of the equipped vehicle, and wherein said vehicle data comprises vehicle steering data, and wherein detection by said control of a lane change maneuver by the driver of the equipped vehicle is based at least in part on determining, via image processing by said image processor of provided image data captured by said forward-viewing camera, that the equipped vehicle is approaching a determined lane edge of the lane being traveled by the equipped vehicle to cross that determined lane edge of the lane being traveled by the equipped vehicle.
  • 21. A vehicular lane change system, said vehicular lane change system comprising: a forward-viewing camera disposed at an in-cabin side of a windshield of a vehicle equipped with said vehicular lane change system, said forward-viewing camera viewing through the windshield to capture image data at least forward of the equipped vehicle;said forward-viewing camera having a field of view that encompasses a road being traveled along by the equipped vehicle;a control disposed in the equipped vehicle and comprising a data processor;said data processor comprising an image processor;wherein image data captured by said forward-viewing camera is provided to said control and is processed by said image processor;wherein said image processor processes provided image data at a processing speed of at least 30 MIPS;wherein, responsive to image processing by said image processor of provided image data captured by said forward-viewing camera, lane markers on the road being traveled along by the equipped vehicle are detected;wherein, responsive at least in part to image processing by said image processor of provided image data captured by said forward-viewing camera, said control determines lane edges of a lane being traveled by the equipped vehicle;wherein said control is operable to detect a lane change maneuver by a driver of the equipped vehicle to a lane immediately adjacent to the lane being traveled by the equipped vehicle;wherein said control receives vehicle data relating to the equipped vehicle via a CAN vehicle bus of the equipped vehicle;wherein said vehicle data comprises vehicle steering data, and wherein detection by said control of the lane change maneuver by the driver of the equipped vehicle is based at least in part on determining, responsive to vehicle steering data received at said control via said CAN vehicle bus and via image processing by said image processor of provided image data captured by said forward-viewing camera, that the equipped vehicle is approaching a determined lane edge of the lane being traveled by the equipped vehicle to cross that determined lane edge of the lane being traveled by the equipped vehicle;wherein detection by said control of the lane change maneuver by the driver of the equipped vehicle is irrespective of activation by the driver of the equipped vehicle of a turn signal indicator of the equipped vehicle;wherein a radar sensor is disposed at the equipped vehicle;wherein said radar sensor has a field of sensing exterior of the equipped vehicle as the equipped vehicle travels along the road, and wherein said radar sensor captures radar data;wherein radar data captured by said radar sensor is provided to said control;wherein provided radar data captured by said radar sensor and provided image data captured by said forward-viewing camera are fused at said control;wherein, responsive at least in part to processing at said control of provided radar data captured by said radar sensor and of provided image data captured by said forward-viewing camera, said control detects another vehicle present in the lane immediately adjacent to the lane being traveled by the equipped vehicle;wherein, responsive to detection by said control of the lane change maneuver by the driver of the equipped vehicle to the lane immediately adjacent to the lane being traveled by the equipped vehicle, and responsive to detection by said control of the other vehicle present in that lane immediately adjacent to the lane being traveled by the equipped vehicle, said vehicular lane change system generates an alert to the driver of the equipped vehicle;wherein, responsive at least in part to processing of provided radar data captured by said radar sensor and of provided image data captured by said forward-viewing camera, said control determines a hazard of collision of the equipped vehicle with the other vehicle; andwherein, responsive at least in part to said control determining the hazard of collision with the other vehicle, said control controls a system of the equipped vehicle to mitigate potential collision with the other vehicle.
  • 22. The vehicular lane change system of claim 21, wherein the other vehicle is an oncoming vehicle that is approaching the equipped vehicle.
  • 23. The vehicular lane change system of claim 22, wherein, responsive to said control detecting the lane change maneuver by the driver of the equipped vehicle to the lane immediately adjacent to the lane being traveled by the equipped vehicle when the driver of the equipped vehicle has neglected to turn on an appropriate turn signal indicator of the equipped vehicle, said control automatically turns on the turn signal indicator at the appropriate side of the equipped vehicle.
  • 24. The vehicular lane change system of claim 23, wherein, responsive at least in part to said control determining the hazard of collision, said control controls activation of at least one selected from the group consisting of (i) a horn system of the equipped vehicle to mitigate potential collision with the other vehicle and (ii) a lighting system of the equipped vehicle to mitigate potential collision with the other vehicle.
  • 25. The vehicular lane change system of claim 21, wherein, responsive to said control detecting the lane change maneuver by the driver of the equipped vehicle to the lane immediately adjacent to the lane being traveled by the equipped vehicle, and when said control, via processing at said control of provided radar data captured by said radar sensor, determines that the other vehicle present in the lane immediately adjacent to the lane being traveled by the equipped vehicle is not at a location where the equipped vehicle may collide with the other vehicle if the lane change maneuver is made, said control determines that the lane change maneuver can proceed without hazard of collision with the other vehicle detected present in the lane immediately adjacent to the lane being traveled by the equipped vehicle.
  • 26. The vehicular lane change system of claim 21, wherein, responsive at least in part to said control determining the hazard of collision, said control controls a wireless transmission system of the equipped vehicle to transmit a safety warning to the other vehicle.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 15/289,341, filed Oct. 10, 2016, now U.S. Pat. No. 10,099,610, which is a continuation of U.S. patent application Ser. No. 14/997,831, filed Jan. 18, 2016, now U.S. Pat. No. 9,463,744, which is a continuation of U.S. patent application Ser. No. 13/919,483, filed Jun. 17, 2013, now U.S. Pat. No. 9,245,448, which is a continuation of U.S. patent application Ser. No. 12/483,996, filed Jun. 12, 2009, now U.S. Pat. No. 8,466,806, which is a continuation of U.S. patent application Ser. No. 12/058,155, filed Mar. 28, 2008, now U.S. Pat. No. 7,551,103, which is a continuation of U.S. patent application Ser. No. 11/735,782, filed Apr. 16, 2007, now U.S. Pat. No. 7,355,524, which is a continuation of U.S. patent application Ser. No. 11/108,474, filed Apr. 18, 2005, now U.S. Pat. No. 7,205,904, which is a continuation of U.S. patent application Ser. No. 10/209,173, filed on Jul. 31, 2002, now U.S. Pat. No. 6,882,287, which claims priority from U.S. provisional application Ser. No. 60/309,022, filed on Jul. 31, 2001, the disclosures of which are hereby incorporated herein by reference in their entireties.

US Referenced Citations (513)
Number Name Date Kind
4200361 Malvano Apr 1980 A
4214266 Myers Jul 1980 A
4218698 Bart et al. Aug 1980 A
4236099 Rosenblum Nov 1980 A
4247870 Gabel et al. Jan 1981 A
4249160 Chilvers Feb 1981 A
4254931 Aikens Mar 1981 A
4266856 Wainwright May 1981 A
4277804 Robison Jul 1981 A
4281898 Ochiai Aug 1981 A
4288814 Talley et al. Sep 1981 A
4348652 Barnes et al. Sep 1982 A
4355271 Noack Oct 1982 A
4357558 Massoni et al. Nov 1982 A
4381888 Momiyama May 1983 A
4420238 Felix Dec 1983 A
4431896 Lodetti Feb 1984 A
4443057 Bauer Apr 1984 A
4460831 Oettinger et al. Jul 1984 A
4481450 Watanabe et al. Nov 1984 A
4491390 Tong-Shen Jan 1985 A
4512637 Ballmer Apr 1985 A
4521804 Bendell Jun 1985 A
4529275 Ballmer Jul 1985 A
4529873 Ballmer Jul 1985 A
4532550 Bendell et al. Jul 1985 A
4546551 Franks Oct 1985 A
4549208 Kamejima et al. Oct 1985 A
4571082 Downs Feb 1986 A
4572619 Reininger Feb 1986 A
4580875 Bechtel Apr 1986 A
4600913 Caine Jul 1986 A
4603946 Kato Aug 1986 A
4614415 Hyatt Sep 1986 A
4620141 McCumber et al. Oct 1986 A
4623222 Itoh Nov 1986 A
4626850 Chey Dec 1986 A
4629941 Ellis Dec 1986 A
4630109 Barton Dec 1986 A
4632509 Ohmi Dec 1986 A
4638287 Umebayashi et al. Jan 1987 A
4645975 Meitzler et al. Feb 1987 A
4647161 Müller Mar 1987 A
4653316 Fukuhara Mar 1987 A
4669825 Itoh Jun 1987 A
4669826 Itoh Jun 1987 A
4671615 Fukada Jun 1987 A
4672457 Hyatt Jun 1987 A
4676601 Itoh Jun 1987 A
4690508 Jacob Sep 1987 A
4692798 Seko et al. Sep 1987 A
4697883 Suzuki Oct 1987 A
4701022 Jacob Oct 1987 A
4713685 Nishimura et al. Dec 1987 A
4717830 Botts Jan 1988 A
4727290 Smith Feb 1988 A
4731669 Hayashi et al. Mar 1988 A
4741603 Miyagi May 1988 A
4758883 Kawahara et al. Jul 1988 A
4768135 Kretschmer et al. Aug 1988 A
4772942 Tuck Sep 1988 A
4789904 Peterson Dec 1988 A
4793690 Gahan Dec 1988 A
4817948 Simonelli Apr 1989 A
4820933 Hong Apr 1989 A
4825232 Howdle Apr 1989 A
4838650 Stewart Jun 1989 A
4847772 Michalopoulos et al. Jul 1989 A
4855822 Narendra et al. Aug 1989 A
4862037 Farber et al. Aug 1989 A
4867561 Fujii et al. Sep 1989 A
4871917 O'Farrell et al. Oct 1989 A
4872051 Dye Oct 1989 A
4881019 Shiraishi et al. Nov 1989 A
4882565 Gallmeyer Nov 1989 A
4886960 Molyneux Dec 1989 A
4891559 Matsumoto et al. Jan 1990 A
4892345 Rachael, III Jan 1990 A
4895790 Swanson et al. Jan 1990 A
4896030 Miyaji Jan 1990 A
4907870 Brucker Mar 1990 A
4910591 Petrossian et al. Mar 1990 A
4916374 Schierbeek Apr 1990 A
4917477 Bechtel et al. Apr 1990 A
4937796 Tendler Jun 1990 A
4953305 Van Lente et al. Sep 1990 A
4956591 Schierbeek Sep 1990 A
4961625 Wood et al. Oct 1990 A
4967319 Seko Oct 1990 A
4970653 Kenue Nov 1990 A
4971430 Lynas Nov 1990 A
4974078 Tsai Nov 1990 A
4975703 Delisle et al. Dec 1990 A
4987357 Masaki Jan 1991 A
4991054 Walters Feb 1991 A
5001558 Burley et al. Mar 1991 A
5003288 Wilhelm Mar 1991 A
5012082 Watanabe Apr 1991 A
5016977 Baude et al. May 1991 A
5027001 Torbert Jun 1991 A
5027200 Petrossian et al. Jun 1991 A
5044706 Chen Sep 1991 A
5055668 French Oct 1991 A
5059877 Teder Oct 1991 A
5064274 Alten Nov 1991 A
5072154 Chen Dec 1991 A
5075768 Wirtz et al. Dec 1991 A
5086253 Lawler Feb 1992 A
5096287 Kakinami et al. Mar 1992 A
5097362 Lynas Mar 1992 A
5121200 Choi Jun 1992 A
5124549 Michaels et al. Jun 1992 A
5130709 Toyama et al. Jul 1992 A
5148014 Lynam Sep 1992 A
5166681 Bottesch et al. Nov 1992 A
5168378 Black Dec 1992 A
5170374 Shimohigashi et al. Dec 1992 A
5172235 Wilm et al. Dec 1992 A
5177606 Koshizawa Jan 1993 A
5177685 Davis et al. Jan 1993 A
5182502 Slotkowski et al. Jan 1993 A
5184956 Langlais et al. Feb 1993 A
5189561 Hong Feb 1993 A
5193000 Lipton et al. Mar 1993 A
5193029 Schofield Mar 1993 A
5204778 Bechtel Apr 1993 A
5208701 Maeda May 1993 A
5225827 Persson Jul 1993 A
5245422 Borcherts et al. Sep 1993 A
5253109 O'Farrell Oct 1993 A
5276389 Levers Jan 1994 A
5285060 Larson et al. Feb 1994 A
5289182 Brillard et al. Feb 1994 A
5289321 Secor Feb 1994 A
5305012 Faris Apr 1994 A
5307136 Saneyoshi Apr 1994 A
5309137 Kajiwara May 1994 A
5313072 Vachss May 1994 A
5325096 Pakett Jun 1994 A
5325386 Jewell et al. Jun 1994 A
5329206 Slotkowski et al. Jul 1994 A
5331312 Kudoh Jul 1994 A
5336980 Levers Aug 1994 A
5339075 Abst et al. Aug 1994 A
5341437 Nakayama Aug 1994 A
5351044 Mathur et al. Sep 1994 A
5355118 Fukuhara Oct 1994 A
5374852 Parkes Dec 1994 A
5386285 Asayama Jan 1995 A
5394333 Kao Feb 1995 A
5406395 Wilson et al. Apr 1995 A
5410346 Saneyoshi et al. Apr 1995 A
5414257 Stanton May 1995 A
5414461 Kishi et al. May 1995 A
5416313 Larson et al. May 1995 A
5416318 Hegyi May 1995 A
5416478 Morinaga May 1995 A
5424952 Asayama Jun 1995 A
5426294 Kobayashi et al. Jun 1995 A
5430431 Nelson Jul 1995 A
5434407 Bauer et al. Jul 1995 A
5434927 Brady et al. Jul 1995 A
5440428 Hegg et al. Aug 1995 A
5444478 Lelong et al. Aug 1995 A
5451822 Bechtel et al. Sep 1995 A
5457493 Leddy et al. Oct 1995 A
5461357 Yoshioka et al. Oct 1995 A
5461361 Moore Oct 1995 A
5467284 Yoshioka et al. Nov 1995 A
5469298 Suman et al. Nov 1995 A
5471515 Fossum et al. Nov 1995 A
5475494 Nishida et al. Dec 1995 A
5487116 Nakano et al. Jan 1996 A
5498866 Bendicks et al. Mar 1996 A
5500766 Stonecypher Mar 1996 A
5510983 Iino Apr 1996 A
5515448 Nishitani May 1996 A
5521633 Nakajima et al. May 1996 A
5528698 Kamei et al. Jun 1996 A
5529138 Shaw et al. Jun 1996 A
5530240 Larson et al. Jun 1996 A
5530420 Tsuchiya et al. Jun 1996 A
5535314 Alves et al. Jul 1996 A
5537003 Bechtel et al. Jul 1996 A
5539397 Asanuma et al. Jul 1996 A
5541590 Nishio Jul 1996 A
5550677 Schofield et al. Aug 1996 A
5555312 Shima et al. Sep 1996 A
5555555 Sato et al. Sep 1996 A
5568027 Teder Oct 1996 A
5574443 Hsieh Nov 1996 A
5581464 Woll et al. Dec 1996 A
5594222 Caldwell Jan 1997 A
5612883 Shaffer et al. Mar 1997 A
5614788 Mullins Mar 1997 A
5619370 Guinosso Apr 1997 A
5634709 Iwama Jun 1997 A
5642299 Hardin et al. Jun 1997 A
5646612 Byon Jul 1997 A
5648835 Uzawa Jul 1997 A
5650944 Kise Jul 1997 A
5660454 Mori et al. Aug 1997 A
5661303 Teder Aug 1997 A
5666028 Bechtel et al. Sep 1997 A
5668663 Varaprasad et al. Sep 1997 A
5670935 Schofield et al. Sep 1997 A
5673019 Dantoni Sep 1997 A
5675489 Pomerleau Oct 1997 A
5677851 Kingdon et al. Oct 1997 A
5680123 Lee Oct 1997 A
5699044 Van Lente et al. Dec 1997 A
5699057 Ikeda et al. Dec 1997 A
5715093 Schierbeek et al. Feb 1998 A
5724187 Varaprasad et al. Mar 1998 A
5724316 Brunts Mar 1998 A
5737226 Olson et al. Apr 1998 A
5757949 Kinoshita et al. May 1998 A
5760826 Nayer Jun 1998 A
5760828 Cortes Jun 1998 A
5760931 Saburi et al. Jun 1998 A
5760962 Schofield et al. Jun 1998 A
5761094 Olson et al. Jun 1998 A
5765116 Wilson-Jones et al. Jun 1998 A
5781437 Wiemer et al. Jul 1998 A
5786772 Schofield et al. Jul 1998 A
5790403 Nakayama Aug 1998 A
5790973 Blaker et al. Aug 1998 A
5793308 Rosinski et al. Aug 1998 A
5793420 Schmidt Aug 1998 A
5796094 Schofield et al. Aug 1998 A
5798575 O'Farrell et al. Aug 1998 A
5835255 Miles Nov 1998 A
5837994 Stam et al. Nov 1998 A
5844505 Van Ryzin Dec 1998 A
5844682 Kiyomoto et al. Dec 1998 A
5845000 Breed et al. Dec 1998 A
5848802 Breed et al. Dec 1998 A
5850176 Kinoshita et al. Dec 1998 A
5850254 Takano et al. Dec 1998 A
5867591 Onda Feb 1999 A
5877707 Kowalick Mar 1999 A
5877897 Schofield et al. Mar 1999 A
5878370 Olson Mar 1999 A
5883739 Ashihara et al. Mar 1999 A
5884212 Lion Mar 1999 A
5890021 Onoda Mar 1999 A
5896085 Mori et al. Apr 1999 A
5899956 Chan May 1999 A
5914815 Bos Jun 1999 A
5923027 Stam et al. Jul 1999 A
5929786 Schofield et al. Jul 1999 A
5940120 Frankhouse et al. Aug 1999 A
5949331 Schofield et al. Sep 1999 A
5956181 Lin Sep 1999 A
5959367 O'Farrell et al. Sep 1999 A
5959555 Furuta Sep 1999 A
5963247 Banitt Oct 1999 A
5964822 Alland et al. Oct 1999 A
5971552 O'Farrell et al. Oct 1999 A
5986796 Miles Nov 1999 A
5990469 Bechtel et al. Nov 1999 A
5990649 Nagao et al. Nov 1999 A
6001486 Varaprasad et al. Dec 1999 A
6009336 Harris et al. Dec 1999 A
6020704 Buschur Feb 2000 A
6031484 Bullinger et al. Feb 2000 A
6037860 Zander et al. Mar 2000 A
6037975 Aoyama Mar 2000 A
6049171 Stam et al. Apr 2000 A
6057754 Kinoshita et al. May 2000 A
6066933 Ponziana May 2000 A
6084519 Coulling et al. Jul 2000 A
6087953 DeLine et al. Jul 2000 A
6097023 Schofield et al. Aug 2000 A
6097024 Stam et al. Aug 2000 A
6107939 Sorden Aug 2000 A
6116743 Hoek Sep 2000 A
6124647 Marcus et al. Sep 2000 A
6124886 DeLine et al. Sep 2000 A
6139172 Bos et al. Oct 2000 A
6144022 Tenenbaum et al. Nov 2000 A
6151539 Bergholz et al. Nov 2000 A
6172613 DeLine et al. Jan 2001 B1
6175164 O'Farrell et al. Jan 2001 B1
6175300 Kendrick Jan 2001 B1
6198409 Schofield et al. Mar 2001 B1
6201642 Bos Mar 2001 B1
6222447 Schofield et al. Apr 2001 B1
6222460 DeLine et al. Apr 2001 B1
6226592 Luckscheiter May 2001 B1
6243003 DeLine et al. Jun 2001 B1
6250148 Lynam Jun 2001 B1
6259412 Duroux Jul 2001 B1
6266082 Yonezawa et al. Jul 2001 B1
6266442 Laumeyer et al. Jul 2001 B1
6278377 DeLine et al. Aug 2001 B1
6281806 Smith et al. Aug 2001 B1
6285393 Shimoura et al. Sep 2001 B1
6291906 Marcus et al. Sep 2001 B1
6292752 Franke et al. Sep 2001 B1
6294989 Schofield et al. Sep 2001 B1
6297781 Turnbull et al. Oct 2001 B1
6302545 Schofield et al. Oct 2001 B1
6310611 Caldwell Oct 2001 B1
6311119 Sawamoto et al. Oct 2001 B2
6313454 Bos et al. Nov 2001 B1
6317057 Lee Nov 2001 B1
6320176 Schofield et al. Nov 2001 B1
6320282 Caldwell Nov 2001 B1
6324450 Iwama Nov 2001 B1
6326613 Heslin et al. Dec 2001 B1
6329925 Skiver et al. Dec 2001 B1
6333759 Mazzilli Dec 2001 B1
6341523 Lynam Jan 2002 B2
6353392 Schofield et al. Mar 2002 B1
6360170 Ishikawa Mar 2002 B1
6362729 Hellmann et al. Mar 2002 B1
6363326 Scully Mar 2002 B1
6366213 DeLine et al. Apr 2002 B2
6366236 Farmer et al. Apr 2002 B1
6370329 Teuchert Apr 2002 B1
6388565 Bernhard et al. May 2002 B1
6388580 Graham et al. May 2002 B1
6396397 Bos et al. May 2002 B1
6411204 Bloomfield et al. Jun 2002 B1
6411328 Franke et al. Jun 2002 B1
6420975 DeLine et al. Jul 2002 B1
6424273 Gutta et al. Jul 2002 B1
6428172 Hutzel et al. Aug 2002 B1
6430303 Naoi et al. Aug 2002 B1
6433676 DeLine et al. Aug 2002 B2
6433817 Guerra Aug 2002 B1
6441748 Takagi et al. Aug 2002 B1
6442465 Breed et al. Aug 2002 B2
6477464 McCarthy et al. Nov 2002 B2
6485155 Duroux et al. Nov 2002 B1
6497503 Dassanayake et al. Dec 2002 B1
6498620 Schofield et al. Dec 2002 B2
6502035 Levine Dec 2002 B2
6513252 Schierbeek et al. Feb 2003 B1
6516664 Lynam Feb 2003 B2
6523964 Schofield et al. Feb 2003 B2
6534884 Marcus et al. Mar 2003 B2
6539306 Turnbull Mar 2003 B2
6547133 DeVries, Jr. et al. Apr 2003 B1
6553130 Lemelson et al. Apr 2003 B1
6559435 Schofield et al. May 2003 B2
6574033 Chui et al. Jun 2003 B1
6578017 Ebersole et al. Jun 2003 B1
6587573 Stam et al. Jul 2003 B1
6589625 Kothari et al. Jul 2003 B1
6593565 Heslin et al. Jul 2003 B2
6594583 Ogura et al. Jul 2003 B2
6611202 Schofield et al. Aug 2003 B2
6611610 Stam et al. Aug 2003 B1
6627918 Getz et al. Sep 2003 B2
6631316 Stam et al. Oct 2003 B2
6631994 Suzuki et al. Oct 2003 B2
6636258 Strumolo Oct 2003 B2
6648477 Hutzel et al. Nov 2003 B2
6650233 DeLine et al. Nov 2003 B2
6650455 Miles Nov 2003 B2
6672731 Schnell et al. Jan 2004 B2
6674562 Miles Jan 2004 B1
6678056 Downs Jan 2004 B2
6678614 McCarthy et al. Jan 2004 B2
6680792 Miles Jan 2004 B2
6683969 Nishigaki et al. Jan 2004 B1
6690268 Schofield et al. Feb 2004 B2
6700605 Toyoda et al. Mar 2004 B1
6703925 Steffel Mar 2004 B2
6704621 Stein et al. Mar 2004 B1
6710908 Miles et al. Mar 2004 B2
6711474 Treyz et al. Mar 2004 B1
6714331 Lewis et al. Mar 2004 B2
6717610 Bos et al. Apr 2004 B1
6728623 Takenaga et al. Apr 2004 B2
6735506 Breed et al. May 2004 B2
6741377 Miles May 2004 B2
6744353 Sjönell Jun 2004 B2
6757109 Bos Jun 2004 B2
6762867 Lippert et al. Jul 2004 B2
6784828 Delcheccolo et al. Aug 2004 B2
6794119 Miles Sep 2004 B2
6795221 Urey Sep 2004 B1
6802617 Schofield et al. Oct 2004 B2
6806452 Bos et al. Oct 2004 B2
6813370 Arai Nov 2004 B1
6822563 Bos et al. Nov 2004 B2
6823241 Shirato et al. Nov 2004 B2
6824281 Schofield Nov 2004 B2
6831261 Schofield et al. Dec 2004 B2
6847487 Burgner Jan 2005 B2
6873253 Veziris Mar 2005 B2
6882287 Schofield Apr 2005 B2
6888447 Hori et al. May 2005 B2
6889161 Winner et al. May 2005 B2
6891563 Schofield et al. May 2005 B2
6906639 Lemelson et al. Jun 2005 B2
6909753 Meehan et al. Jun 2005 B2
6946978 Schofield Sep 2005 B2
6953253 Schofield et al. Oct 2005 B2
6968736 Lynam Nov 2005 B2
6975775 Rykowski et al. Dec 2005 B2
7004593 Weller et al. Feb 2006 B2
7004606 Schofield Feb 2006 B2
7005974 McMahon et al. Feb 2006 B2
7038577 Pawlicki et al. May 2006 B2
7046448 Burgner May 2006 B2
7062300 Kim Jun 2006 B1
7065432 Moisel et al. Jun 2006 B2
7085637 Breed et al. Aug 2006 B2
7092548 Laumeyer et al. Aug 2006 B2
7116246 Winter et al. Oct 2006 B2
7123168 Schofield Oct 2006 B2
7133661 Hatae et al. Nov 2006 B2
7149613 Stam et al. Dec 2006 B2
7167796 Taylor et al. Jan 2007 B2
7195381 Lynam et al. Mar 2007 B2
7202776 Breed Apr 2007 B2
7205904 Schofield Apr 2007 B2
7224324 Quist et al. May 2007 B2
7227459 Bos et al. Jun 2007 B2
7227611 Hull et al. Jun 2007 B2
7249860 Kulas et al. Jul 2007 B2
7253723 Lindahl et al. Aug 2007 B2
7255451 McCabe et al. Aug 2007 B2
7311406 Schofield et al. Dec 2007 B2
7325934 Schofield et al. Feb 2008 B2
7325935 Schofield et al. Feb 2008 B2
7338177 Lynam Mar 2008 B2
7339149 Schofield et al. Mar 2008 B1
7344261 Schofield et al. Mar 2008 B2
7355524 Schofield Apr 2008 B2
7360932 Uken et al. Apr 2008 B2
7370983 DeWind et al. May 2008 B2
7375803 Bamji May 2008 B1
7380948 Schofield et al. Jun 2008 B2
7388182 Schofield et al. Jun 2008 B2
7402786 Schofield et al. Jul 2008 B2
7423248 Schofield et al. Sep 2008 B2
7423821 Bechtel et al. Sep 2008 B2
7425076 Schofield et al. Sep 2008 B2
7459664 Schofield et al. Dec 2008 B2
7526103 Schofield et al. Apr 2009 B2
7541743 Salmeen et al. Jun 2009 B2
7551103 Schofield Jun 2009 B2
7561181 Schofield et al. Jul 2009 B2
7565006 Stam et al. Jul 2009 B2
7616781 Schofield et al. Nov 2009 B2
7619508 Lynam et al. Nov 2009 B2
7633383 Dunsmoir et al. Dec 2009 B2
7639149 Katoh Dec 2009 B2
7655894 Schofield et al. Feb 2010 B2
7676087 Dhua et al. Mar 2010 B2
7720580 Higgins-Luthman May 2010 B2
7792329 Schofield et al. Sep 2010 B2
7843451 Lafon Nov 2010 B2
7855778 Yung et al. Dec 2010 B2
7859565 Schofield et al. Dec 2010 B2
7877175 Higgins-Luthman Jan 2011 B2
7881496 Camilleri Feb 2011 B2
7914187 Higgins-Luthman et al. Mar 2011 B2
7930160 Hosagrahara et al. Apr 2011 B1
7991522 Higgins-Luthman Aug 2011 B2
7994462 Schofield et al. Aug 2011 B2
8017898 Lu et al. Sep 2011 B2
8095310 Taylor et al. Jan 2012 B2
8098142 Schofield et al. Jan 2012 B2
8203440 Schofield et al. Jun 2012 B2
8222588 Schofield et al. Jul 2012 B2
8224031 Saito Jul 2012 B2
8314689 Schofield et al. Nov 2012 B2
8324552 Schofield et al. Dec 2012 B2
8386114 Higgins-Luthman et al. Feb 2013 B2
8466806 Schofield Jun 2013 B2
9245448 Schofield et al. Jan 2016 B2
9463744 Schofield Oct 2016 B2
10099610 Schofield Oct 2018 B2
20010031068 Ohta Oct 2001 A1
20010034575 Takenaga et al. Oct 2001 A1
20010056326 Kimura Dec 2001 A1
20020005778 Breed Jan 2002 A1
20020113873 Williams Aug 2002 A1
20020116126 Lin Aug 2002 A1
20020159270 Lynam et al. Oct 2002 A1
20030016143 Ghazarian Jan 2003 A1
20030025597 Schofield Feb 2003 A1
20030137586 Lewellen Jul 2003 A1
20030222982 Hamdan et al. Dec 2003 A1
20040016870 Pawlicki et al. Jan 2004 A1
20040164228 Fogg et al. Aug 2004 A1
20050046978 Schofield et al. Mar 2005 A1
20050219852 Stam et al. Oct 2005 A1
20050237385 Kosaka et al. Oct 2005 A1
20060018511 Stam et al. Jan 2006 A1
20060018512 Stam et al. Jan 2006 A1
20060050018 Hutzel et al. Mar 2006 A1
20060091813 Stam et al. May 2006 A1
20060103727 Tseng May 2006 A1
20060250501 Wildmann et al. Nov 2006 A1
20070104476 Yasutomi et al. May 2007 A1
20070109406 Schofield et al. May 2007 A1
20070120657 Schofield et al. May 2007 A1
20070242339 Bradley Oct 2007 A1
20080147321 Howard et al. Jun 2008 A1
20080192132 Bechtel et al. Aug 2008 A1
20090113509 Tseng et al. Apr 2009 A1
20090160987 Bechtel et al. Jun 2009 A1
20090190015 Bechtel et al. Jul 2009 A1
20090256938 Bechtel et al. Oct 2009 A1
20120045112 Lundblad et al. Feb 2012 A1
Foreign Referenced Citations (34)
Number Date Country
0353200 Jan 1990 EP
0426503 May 1991 EP
0492591 Jul 1992 EP
0640903 Mar 1995 EP
0788947 Aug 1997 EP
1074430 Feb 2001 EP
59114139 Jul 1984 JP
6079889 May 1985 JP
6080953 May 1985 JP
6272245 May 1987 JP
S62131837 Jun 1987 JP
6414700 Jan 1989 JP
03099952 Apr 1991 JP
4114587 Apr 1992 JP
H04127280 Apr 1992 JP
0577657 Mar 1993 JP
05050883 Mar 1993 JP
5213113 Aug 1993 JP
6227318 Aug 1994 JP
06267304 Sep 1994 JP
06276524 Sep 1994 JP
06295601 Oct 1994 JP
07004170 Jan 1995 JP
0732936 Feb 1995 JP
0747878 Feb 1995 JP
07052706 Feb 1995 JP
0769125 Mar 1995 JP
07105496 Apr 1995 JP
2630604 Jul 1997 JP
200274339 Mar 2002 JP
2003083742 Mar 2003 JP
20041658 Jan 2004 JP
WO1994019212 Feb 1994 WO
WO1996038319 Dec 1996 WO
Non-Patent Literature Citations (20)
Entry
Achler et al., “Vehicle Wheel Detector using 2D Filter Banks,” IEEE Intelligent Vehicles Symposium of Jun. 2004.
Borenstein et al., “Where am I? Sensors and Methods for Mobile Robot Positioning”, University of Michigan, Apr. 1996, pp. 2, 125-128.
Bow, Sing T., “Pattern Recognition and Image Preprocessing (Signal Processing and Communications)”, CRC Press, Jan. 15, 2002, pp. 557-559.
Broggi et al., “Automatic Vehicle Guidance: The Experience of the ARGO Vehicle”, World Scientific Publishing Co., 1999.
Broggi et al., “Multi-Resolution Vehicle Detection using Artificial Vision,” IEEE Intelligent Vehicles Symposium of Jun. 2004.
Kastrinaki et al., “A survey of video processing techniques for traffic applications”.
Mei Chen et al., “AURORA: A Vision-Based Roadway Departure Warning System”, The Robotics Institute, Carnegie Mellon University, published Aug. 9, 1995.
Parker (ed.), McGraw-Hill Dictionary of Scientific and Technical Terms Fifth Edition (1993).
Philomin et al., “Pedestrian Tracking from a Moving Vehicle”.
Pratt, “Digital Image Processing, Passage—ED.3”, John Wiley & Sons, US, Jan. 1, 2001, pp. 657-659, XP002529771.
Sun et al., “On-road vehicle detection using optical sensors: a review”.
Tokimaru et al., “CMOS Rear-View TV System with CCD Camera”, National Technical Report vol. 34, No. 3, pp. 329-336, Jun. 1988 (Japan).
Van Leeuwen et al., “Motion Estimation with a Mobile Camera for Traffic Applications”, IEEE, US, vol. 1, Oct. 3, 2000, pp. 58-63.
Van Leeuwen et al., “Motion Interpretation for In-Car Vision Systems”, IEEE, US, vol. 1, Sep. 30, 2002, pp. 135-140.
Van Leeuwen et al., “Real-Time Vehicle Tracking in Image Sequences”, IEEE, US, vol. 3, May 21, 2001, pp. 2049-2054, XP010547308.
Van Leeuwen et al., “Requirements for Motion Estimation in Image Sequences for Traffic Applications”, IEEE, US, vol. 1, May 24, 1999, pp. 145-150, XP010340272.
Vellacott, Oliver, “CMOS in Camera,” IEE Review, pp. 111-114 (May 1994).
Vlacic et al. (Eds.), “Intelligent Vehicle Technologies, Theory and Applications”, Society of Automotive Engineers Inc., edited by SAE International, 2001.
Wang et al., “CMOS Video Cameras”, article, 1991, 4 pages, University of Edinburgh, UK.
Zheng et al., “An Adaptive System for Traffic Sign Recognition,” IEEE Proceedings of the Intelligent Vehicles '94 Symposium, pp. 165-170 (Oct. 1994).
Related Publications (1)
Number Date Country
20190039516 A1 Feb 2019 US
Provisional Applications (1)
Number Date Country
60309022 Jul 2001 US
Continuations (8)
Number Date Country
Parent 15289341 Oct 2016 US
Child 16157226 US
Parent 14997831 Jan 2016 US
Child 15289341 US
Parent 13919483 Jun 2013 US
Child 14997831 US
Parent 12483996 Jun 2009 US
Child 13919483 US
Parent 12058155 Mar 2008 US
Child 12483996 US
Parent 11735782 Apr 2007 US
Child 12058155 US
Parent 11108474 Apr 2005 US
Child 11735782 US
Parent 10209173 Jul 2002 US
Child 11108474 US