This disclosure generally relates to a lane-changing system for an automated vehicle, and more particularly relates to using historical vehicle motion information to operate an automated vehicle when a lane-marking is not detected by a camera.
It is known to operate, e.g. steer, an automated vehicle using a camera to detect features of a roadway such as lane-markings and curbs. However, in some instances those features may be inconsistent, degraded, or otherwise undetectable. In the absence of lane-markings, many systems simply disengage and give control back to the vehicle operator, even though lane-markings may be only momentarily undetected by the camera.
In accordance with one embodiment, an automatic lane-changing system suitable for use on an automated vehicle is provided. The system includes a camera, an inertial-measurement-unit, and a controller. The camera detects a lane-marking of a roadway traveled by a host-vehicle. The inertial-measurement-unit determines relative-motion of the host-vehicle. The controller is in communication with the camera and the inertial-measurement-unit. While the lane-marking is detected, the controller determines a last-position of the host-vehicle relative to the lane-marking of the roadway. The controller also determines a current-vector used to steer the host-vehicle towards a centerline of an adjacent-lane of the roadway based on the last-position. The controller also determines an offset-vector indicative of motion of the host-vehicle relative to the current-vector. While the lane-marking is not detected, the controller determines an offset-position relative to the last-position based on information from the inertial-measurement-unit. The controller also determines a correction-vector used to steer the host-vehicle from the offset-position towards the centerline of the adjacent-lane based on the last-position and the offset-vector, and steers the host-vehicle according to the correction-vector towards the centerline of the adjacent-lane.
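By way of non-limiting example, the sequence above may be sketched in illustrative Python. The coordinate frame, the names Vec2 and correction_vector, and the numeric values are hypothetical and are not part of the disclosure; the sketch only shows how a correction-vector could follow from the last-position, the offset-vector, and the centerline target.

```python
from dataclasses import dataclass

@dataclass
class Vec2:
    x: float  # lateral offset from lane center, meters (hypothetical frame)
    y: float  # longitudinal distance traveled, meters

    def __add__(self, other):
        return Vec2(self.x + other.x, self.y + other.y)

    def __sub__(self, other):
        return Vec2(self.x - other.x, self.y - other.y)

def correction_vector(last_position: Vec2, offset_vector: Vec2,
                      centerline: Vec2) -> Vec2:
    """While the lane-marking is not detected: dead-reckon an
    offset-position from the last camera-confirmed position plus the
    IMU-derived offset-vector, then steer from there back toward the
    adjacent-lane centerline."""
    offset_position = last_position + offset_vector
    return centerline - offset_position

# Example: host was 0.5 m left of the centerline target at the last
# camera fix, and the IMU indicates a further 0.2 m leftward drift
# over 5 m of travel since the marking was lost.
last = Vec2(-0.5, 0.0)
drift = Vec2(-0.2, 5.0)
center = Vec2(0.0, 10.0)  # centerline target 10 m ahead
cv = correction_vector(last, drift, center)
```

The correction-vector thus points from the dead-reckoned offset-position back toward the centerline, so steering can continue rather than disengage while the marking is momentarily undetected.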
Further features and advantages will appear more clearly on a reading of the following detailed description of the preferred embodiment, which is given by way of non-limiting example only and with reference to the accompanying drawings.
The present invention will now be described, by way of example with reference to the accompanying drawings, in which:
The controller 28 may include a processor (not specifically shown) such as a microprocessor or other control circuitry such as analog and/or digital control circuitry including an application specific integrated circuit (ASIC) for processing data as should be evident to those in the art. The controller 28 may include a memory 30, including non-volatile memory, such as electrically erasable programmable read-only-memory (EEPROM) for storing one or more routines, thresholds, and captured data. The one or more routines may be executed by the processor to perform steps for operating the host-vehicle 12 based on signals received by the controller 28 as described herein.
The system 10 includes a camera 32 used to capture an image 34 of a roadway 36 traveled by the host-vehicle 12. Examples of the camera 32 suitable for use on the host-vehicle 12 are commercially available as will be recognized by those in the art, one such being the APTINA MT9V023 from Micron Technology, Inc. of Boise, Id., USA. The camera 32 may be mounted on the front of the host-vehicle 12, or mounted in the interior of the host-vehicle 12 at a location suitable for the camera 32 to view the area around the host-vehicle 12 through the windshield of the host-vehicle 12. The camera 32 is preferably a video-type camera 32, or a camera 32 that can capture images of the roadway 36 and surrounding area at a sufficient frame-rate, ten frames per second, for example.
The image 34 may include, but is not limited to, a lane-marking 38 on a left-side and right-side of a travel-lane 40 of the roadway 36 traveled by the host-vehicle 12. The image 34 may also include the lane-marking 38 on the left-side and the right-side of an adjacent-lane 42 to the travel-lane 40. The lane-marking 38 may include a solid-line, as is typically used to indicate the boundary of a travel-lane 40 of the roadway 36. The lane-marking 38 may also include a dashed-line, as is also typically used to indicate the boundary of a travel-lane 40 of the roadway 36. The lane-marking 38 may become non-existent or otherwise undetectable by the camera 32 for a number of reasons such as, but not limited to, fading of the lane-marking-paint, erosion of the road surface, snow or dirt on the roadway 36, precipitation or dirt on the lens of the camera 32, operational failure of the camera 32, etc.
The system 10 also includes an inertial-measurement-unit 44, hereafter referred to as the IMU 44, used to determine a relative-motion 46 of the host-vehicle 12. The relative-motion 46 measured by the IMU 44 may include the host-vehicle's 12 current yaw rate, longitudinal acceleration, lateral acceleration, pitch rate, and roll rate. Several instances of the IMU 44 suitable for use on the host-vehicle 12 are commercially available, as will be recognized by those in the art; one example is the 6DF-1N6-C2-HWL from Honeywell Sensing and Control, Golden Valley, Minn., USA.
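By way of example and not limitation, the relative-motion 46 from the IMU 44, combined with the speed of the host-vehicle 12, could be integrated step by step to dead-reckon position while the lane-marking 38 is undetected. The following illustrative Python sketch assumes a flat-plane model and a hypothetical function name (dead_reckon); it is not the disclosed implementation.

```python
import math

def dead_reckon(x: float, y: float, heading: float,
                speed: float, yaw_rate: float, dt: float):
    """One integration step: advance planar position and heading
    from wheel speed and IMU yaw rate (flat-plane assumption;
    pitch and roll ignored for illustration)."""
    heading += yaw_rate * dt
    x += speed * math.cos(heading) * dt
    y += speed * math.sin(heading) * dt
    return x, y, heading

# Ten 100 ms steps at 25 m/s with a small constant yaw rate,
# as might occur during a gentle lane-change arc.
x = y = heading = 0.0
for _ in range(10):
    x, y, heading = dead_reckon(x, y, heading,
                                speed=25.0, yaw_rate=0.02, dt=0.1)
```

Each step accumulates the lateral and longitudinal motion that, in the terms used herein, contributes to the offset-position relative to the last-position.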
The system 10 may also include a speed-sensor 48 used to determine a speed of the host-vehicle 12. The speed-sensor 48 may include a wheel-speed-sensor typically found in automotive applications. Other sensors capable of determining the speed of the host-vehicle 12 may include, but are not limited to, a global-positioning-system (GPS) receiver, a RADAR transceiver, and other devices as will be recognized by those skilled in the art.
The controller 28 is in electrical communication with the camera 32 and the IMU 44 so that the controller 28 can receive the image 34, via a video-signal 50, and the relative-motion 46 of the host-vehicle 12, via a position-signal 52. The position-signal 52 originates in the IMU 44 and may include the host-vehicle's 12 current yaw rate, longitudinal acceleration, lateral acceleration, pitch rate, and roll rate, which defines the relative-motion 46 of the host-vehicle 12, e.g. lateral-motion, longitudinal-motion, change in yaw-angle, etc. of the host-vehicle 12. The controller 28 is also in electrical communication with the speed-sensor 48 so that the controller 28 can receive a speed of the host-vehicle 12 via a speed-signal 54. The controller 28 is also in electrical communication with the vehicle-controls 26.
The controller 28 is generally configured (e.g. programmed or hardwired) to determine a centerline 56 of the adjacent-lane 42 based on the lane-marking 38 of the roadway 36 detected by the camera 32. That is, the image 34 detected or captured by the camera 32 is processed by the controller 28 using known techniques for image-analysis 58 to determine where along the roadway 36 the host-vehicle 12 should be operated or be steered when executing a lane-changing maneuver. Vision processing technologies, such as the EYE Q® platform from Mobileye Vision Technologies, Ltd. of Jerusalem, Israel, or other suitable devices may be used. By way of example and not limitation, the centerline 56 is preferably in the middle of the adjacent-lane 42 to the travel-lane 40 traveled by the host-vehicle 12.
The controller 28 may also determine a current-vector 62, represented by the arrow labeled AC, which illustrates the speed and direction of the host-vehicle 12 being steered by the controller 28 from point A to the desired point C, based on the last-position 60. The controller 28 may also determine an offset-vector 64 that indicates the actual motion of the host-vehicle 12 relative to the current-vector 62. The offset-vector 64 is represented by the arrow labeled AB, which illustrates the actual speed and actual direction of the host-vehicle 12 traveling from point A to point B. The offset-vector 64 may differ from the current-vector 62 due to crowning of the roadway 36, wind gusts, standing water, and other phenomena. Input from the IMU 44, the camera 32, and the speed-sensor 48 is used by the controller 28 to determine the offset-vector 64, as will be recognized by one skilled in the art.
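By way of non-limiting example, the relationship between the current-vector 62 (arrow AC, intended motion) and the offset-vector 64 (arrow AB, actual motion) may be illustrated as follows. The Python sketch and the function name drift are purely illustrative assumptions; it shows how far point B (actual) ends up from point C (intended) per axis, which is the disturbance a subsequent correction would need to cancel.

```python
def drift(current_vector, offset_vector):
    """Per-interval drift of the host-vehicle: component-wise
    difference B - C, since both AB (actual) and AC (intended)
    originate at the same point A."""
    return tuple(b - c for b, c in zip(offset_vector, current_vector))

# Intended A->C: 0.4 m lateral, 12.0 m longitudinal.
# Measured A->B: 0.1 m lateral, 11.8 m longitudinal, e.g. because
# road crowning or a wind gust reduced lateral progress.
d = drift((0.4, 12.0), (0.1, 11.8))
```

A negative lateral component here indicates the host-vehicle 12 fell short of the intended lateral progress toward the centerline 56, consistent with the crowning, wind-gust, and standing-water phenomena noted above.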
Accordingly, a lane-changing system 10 (the system 10) and a controller 28 for the system 10 are provided. In contrast to prior systems, the system 10 described herein delays the disengagement of automated driving controls when lane-markings 38 are non-existent, or otherwise undetectable by the camera 32. The disengagement of automated driving controls when changing lanes, even though the lane-markings 38 are only momentarily undetectable, can lead to significant customer dissatisfaction and annoyance.
While this invention has been described in terms of the preferred embodiments thereof, it is not intended to be so limited, but rather only to the extent set forth in the claims that follow. Moreover, the use of the terms first, second, upper, lower, etc. does not denote any order of importance, location, or orientation, but rather the terms first, second, etc. are used to distinguish one element from another. Furthermore, the use of the terms a, an, etc. does not denote a limitation of quantity, but rather denotes the presence of at least one of the referenced items.