Wheel rotation has been used to approximate the speed of, and distance traveled by, an automobile. This information is communicated to a driver, for example, via a speedometer and odometer. When more precise information is needed, for example for vehicle performance testing and evaluation, a “fifth” wheel can be attached to a vehicle to more precisely record speed and distance. When using systems based on measurement of wheel rotation, tracking errors can be introduced, for example, by a wheel slipping or skidding.
Systems based on measurement of wheel rotation can also be used for navigation purposes such as determining an absolute position of a vehicle, or a relative position of the vehicle with respect to one or more locations. Navigation systems for automobiles are used to allow drivers to track current location and plot out routes. However, again, skidding, slipping, braking, etc. can introduce inaccuracies into such systems based on measurement of wheel rotation.
Many of the disadvantages of systems based on measurement of wheel rotation are overcome by using global positioning systems (GPS). Global positioning systems operate by receiving signals from global positioning system satellites to obtain information such as position and velocity.
GPS systems have been combined with detailed electronic maps to aid in the navigation of automobiles. For example, GPS-based navigation tools typically contain a reference base map showing Interstate, U.S., and State highways, plus rivers and lakes, in large regions such as the U.S., Canada, and Mexico. Additional detail may be shown, such as main arterial streets in metropolitan areas, detailed street-level map information, and even access to business listings and points of interest in a particular area. For example, upon entry of a street address or a point of interest (such as a restaurant, hotel, gas station, bank, or shopping area), some navigation tools will display the location on a map along with the current vehicle location. Nevertheless, most GPS systems have accuracy limited to within a few feet and are susceptible to obstacles, multi-path reflections and hostile jamming. This significantly limits the use of most GPS systems for determination of speed and measurement of exact distances.
In accordance with an embodiment of the present invention, a vehicle includes a motion sensor and a control processor. The motion sensor optically detects motion of the vehicle with respect to an underlying surface. The motion sensor includes a variable focus imager. The control processor receives information from the motion sensor and calculates location of the vehicle, speed of the vehicle and/or acceleration of the vehicle.
A short depth of field increases the blur between objects at different distances. Auto-focus system 29 and zoom system 28 allow the optical motion sensor circuitry to measure range within the FOV of image array 21. With ranging capability added to highly accurate x and y positioning, the optical motion sensor circuitry can correlate absolute position accurately over short distances. Zoom system 28 makes the optical motion sensor circuitry more extensible and adaptable to various heights above a surface, so that ranging can be optimized for the height of a given vehicle or aerial flyer. This is desirable because the circuitry works with a controlled amount of blur, which prevents aliasing and aids in the interpolation of motion detection in the navigation sensor. Other methods to determine range in the FOV for the purpose of determining absolute displacements can be implemented alternatively or in addition to the use of zoom system 28. For more information on auto-focusing to determine distance, see, for example, Subbarao, “Depth from Defocus: A Spatial Domain Approach,” Intl. J. of Computer Vision, 13:271-294, 1994, and Gordon et al., “Silicon Optical Navigation,” which may be accessed on the internet at http://www.labs.agilent.com/news/2003features/news_fast50_gordon.html.
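By way of illustration, a minimal sketch of how range could be derived from focus settings via the thin-lens equation follows. The function name and the numeric values are illustrative assumptions, not parameters of the system described above.

```python
# Sketch: estimate range from auto-focus settings via the thin-lens
# equation 1/f = 1/u + 1/v, where f is focal length, u is object
# distance (lens to surface), and v is image distance (lens to image
# array).  All names and numbers are illustrative assumptions.

def object_distance(focal_length_mm: float, image_distance_mm: float) -> float:
    """Solve the thin-lens equation for the object distance u."""
    if image_distance_mm <= focal_length_mm:
        raise ValueError("surface is beyond the focusable range")
    return (focal_length_mm * image_distance_mm) / (image_distance_mm - focal_length_mm)

# A 25 mm lens brought into focus with the image plane at 25.2 mm
# implies the surface is roughly 3.15 m away.
print(object_distance(25.0, 25.2))  # ~3150.0 (mm)
```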
For example, illuminator 22 is implemented using a light emitting diode (LED), an infrared (IR) LED, a high-powered laser diode or other lighting device. For example, illuminator 22 is a high-speed strobe or flash. In situations where ambient light is sufficient for image array 21 to detect navigable features of an underlying surface without additional illumination, illuminator 22 can be temporarily shut down, or omitted entirely.
An analog-to-digital converter (ADC) 31 receives analog signals from image array 21 and converts the signals to digital data. The digital data represents “raw” or unprocessed sensor information. The analog pixel voltages can be converted into 6-, 8-, 10-, or other-bit digital values, as needed for resolution or for downstream processing efficiency.
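As an illustrative sketch only (the reference voltage and bit depths are assumed, not specified above), the conversion of a pixel voltage into an n-bit code can be expressed as:

```python
def quantize(pixel_voltage: float, v_ref: float = 3.3, bits: int = 8) -> int:
    """Map an analog pixel voltage in [0, v_ref] onto an n-bit code."""
    code = int(round(pixel_voltage / v_ref * (2 ** bits - 1)))
    return max(0, min(code, 2 ** bits - 1))  # clamp to the valid range

print(quantize(1.65))            # mid-scale voltage -> 128 at 8 bits
print(quantize(1.65, bits=10))   # -> 512 at 10 bits
```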
An image processor control (IPC) 32 processes digital data received from ADC 31 and performs, for example, auto-exposure (AE) by determining optimal exposure time and pixel gain adjustment within image array 21. This is done, for example, to prevent saturation or underexposure of images captured by image array 21. Additional functionality, such as anti-vignetting or other lens correction, pixel correction, sizing, windowing, sharpening, and formatting and outputting of processed image data, can be performed within IPC 32.
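One simple form such an auto-exposure step could take is sketched below; the target level, exposure limits, and proportional update rule are assumptions for illustration, not the algorithm used by IPC 32.

```python
import numpy as np

def auto_expose(frame: np.ndarray, exposure_us: float,
                target: float = 128.0, max_exposure_us: float = 10000.0) -> float:
    """One step of a proportional auto-exposure loop: scale the
    exposure time so the mean pixel value moves toward the target,
    avoiding saturation (too bright) and underexposure (too dark)."""
    mean = float(frame.mean())
    if mean <= 0.0:
        return max_exposure_us  # frame is completely dark: open up fully
    new_exposure = exposure_us * (target / mean)
    return float(np.clip(new_exposure, 10.0, max_exposure_us))

frame = np.random.randint(0, 60, (64, 64))     # an underexposed test frame
print(auto_expose(frame, exposure_us=500.0))   # exposure time increases
```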
Exposure time can be controlled using, for example, an electronic (e.g., “rolling” reset) shutter or a mechanical shutter, used with or without strobe flash illumination. The optimal device for exposure time control can vary depending on the required accuracy for motion detection and the desired overall system cost for a particular application. The illumination system can assist in shortening pixel exposure time to enable or maintain the high frame rates necessary to capture features moving in the FOV.
Other image processing algorithms can be invoked to, for example, identify and optimize for the texture of the roadway surface (e.g., asphalt, gravel, wet, dry, icy, etc.), and to apply sharpening or other feature enhancement techniques to optimize image detection and hence motion measurement. For example, detection of ice on the surface can result in a signal and/or warning to a vehicle driver. In applications for a motion detector, such as a pedometer, an algorithm is used to mask out the pedestrian's feet in the field of view of image array 21 before correlation is performed.
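As a hedged illustration of one such feature enhancement technique, a basic Laplacian-style sharpening pass might look like the following; the kernel and handling are generic image-processing practice, not the specific algorithm of IPC 32.

```python
import numpy as np

# A 3x3 sharpening kernel: boosts local contrast so that faint surface
# texture (e.g., asphalt grain) stands out better for correlation.
KERNEL = np.array([[ 0, -1,  0],
                   [-1,  5, -1],
                   [ 0, -1,  0]], dtype=float)

def sharpen(image: np.ndarray) -> np.ndarray:
    """Convolve the interior of the image with the sharpening kernel."""
    h, w = image.shape
    out = image.astype(float).copy()
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y, x] = np.sum(KERNEL * image[y - 1:y + 2, x - 1:x + 2])
    return np.clip(out, 0, 255)
```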
A navigation engine 34 evaluates the digital data from IPC 32 and performs a series of correlations to estimate the direction and magnitude of motion most likely to account for the difference between images taken at different times. Navigation engine 34 then determines a delta x (ΔX) value to be placed on an output 38 and determines a delta y (ΔY) value to be placed on an output 39. For example, ΔY represents movement in the forward or reverse direction of the vehicle and ΔX represents sideways motion of the vehicle. ΔX and ΔY can be either positive or negative. A positive ΔY indicates forward motion, a negative ΔY indicates motion in a reverse direction, a positive ΔX indicates motion toward one side, and a negative ΔX indicates motion towards another side.
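A minimal sketch of such a correlation search follows; an exhaustive sum-of-squared-differences search over small shifts is only one way to realize it, and the window sizes are illustrative.

```python
import numpy as np

def estimate_motion(ref: np.ndarray, cur: np.ndarray, max_shift: int = 4):
    """Try every (dx, dy) shift up to max_shift pixels and return the
    shift whose overlapping region matches best (minimum mean squared
    difference): the most likely frame-to-frame displacement."""
    h, w = ref.shape
    best, best_err = (0, 0), np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            a = cur[max(0, dy):h + min(0, dy), max(0, dx):w + min(0, dx)]
            b = ref[max(0, -dy):h + min(0, -dy), max(0, -dx):w + min(0, -dx)]
            err = np.mean((a.astype(float) - b.astype(float)) ** 2)
            if err < best_err:
                best, best_err = (dx, dy), err
    return best  # (delta_x, delta_y) in pixels

ref = np.random.randint(0, 255, (32, 32))
cur = np.roll(ref, shift=(1, 2), axis=(0, 1))  # surface moved down 1, right 2
print(estimate_motion(ref, cur))               # -> (2, 1)
```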
ΔX and ΔY are correlated to actual displacement or distance. For example, optical zoom and auto-focus algorithms are used to focus features in the FOV, and from those settings determine the precise distance to (and hence between) the tracked features, resulting in correlations to actual displacements. Alternatively, or in addition, other means of distance detection can be used, including sonar, radar, or light detection and ranging (LIDAR), to measure the position of the imager above the surface (see, for example, U.S. Pat. No. 5,644,139, by Allen et al., for Navigation Technique for Detecting Movement of Navigation Sensors Relative to an Object). The frame rate at which image array 21 captures images is known. Therefore, it is possible from the available information to calculate time-dependent characteristics such as speed (velocity) and acceleration. For applications that require detection of motion in the vertical (Z) direction, it is also possible to determine Z displacement. See, for example, U.S. Pat. No. 6,433,780 by Gordon et al., for Seeing Eye Mouse for a Computer System.
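For illustration, converting a pixel-domain displacement into physical displacement and speed might be sketched as below; the scale factor and frame rate are assumed values, with the scale factor standing in for whatever the ranging subsystem reports.

```python
import math

def to_velocity(dx_pixels: float, dy_pixels: float,
                meters_per_pixel: float, frame_rate_hz: float):
    """Convert a per-frame pixel displacement into physical
    displacement and speed; meters_per_pixel would come from ranging
    (e.g., auto-focus distance and lens magnification)."""
    dx_m = dx_pixels * meters_per_pixel
    dy_m = dy_pixels * meters_per_pixel
    speed = math.hypot(dx_m, dy_m) * frame_rate_hz   # meters per second
    return dx_m, dy_m, speed

# Illustrative numbers only: 0.5 mm per pixel at 1500 frames per second
# with 20 pixels of forward motion per frame gives 15 m/s (54 km/h).
print(to_velocity(0.0, 20.0, 0.0005, 1500.0))
```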
Tracking angular rotation allows the navigation system to autonomously determine vehicle heading. This can be accomplished using multiple optical sensors. Placing two or more optical sensors on a vehicle allows accurate reporting of skidding, slipping and other vehicle behavior, while maintaining the accurate heading and odometry necessary for autonomous navigation over short distances.
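A small-angle sketch of how two such sensors could yield heading change is given below; the mounting geometry and values are assumptions for illustration.

```python
import math

def heading_change(dx_front_m: float, dx_rear_m: float,
                   baseline_m: float) -> float:
    """Estimate heading change (radians) from the lateral displacements
    of two sensors mounted a known distance apart along the vehicle,
    using the small-angle approximation."""
    return (dx_front_m - dx_rear_m) / baseline_m

# Front sensor drifts 2 cm right, rear sensor 1 cm left, 2.5 m apart:
# the vehicle has yawed about 0.012 rad (~0.7 degrees).
print(math.degrees(heading_change(0.02, -0.01, 2.5)))
```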
Navigation engine 34 also generates a quality signal 37 that indicates the quality of the image detected by image array 21. For example, quality signal 37 represents an estimation of the likelihood that the ΔX and ΔY values represent the true motion of the vehicle with respect to an underlying surface. For example, this likelihood is based on the number of navigable features detected by image array 21. Alternatively, other methodology may be used to determine the quality of the image detected by image array 21; see, for example, the ways quality is determined in U.S. Pat. No. 6,433,780. Quality signal 37 is fed back to IPC 32 to, for example, assist in the convergence of algorithms that optimize illumination or frame rate. Quality signal 37 is also fed forward to control processor 35 and used, for example, as part of an error detection and correction scheme to improve system robustness or redundancy.
Typically, when quality signal 37 indicates that the ΔX and ΔY values likely do not represent the true motion of the vehicle with respect to an underlying surface, this indicates, for example, that dirt or grime is obstructing image array 21 or illuminator 22.
Quality signal 37 is, for example, a binary signal indicating whether quality is acceptable or not acceptable. Alternatively, quality signal 37 is a numeric value indicating level of quality. For example, the numeric value is related to how well reference and sample frames are correlated in the motion detection process. The numeric value is compared to, for example, a minimum accept value or threshold, for acceptance or rejection. A Kalman filter or other type of filter can be used to blend previous measurements and reduce the error variance in the presence of missing (or poorer quality) measurements.
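A scalar Kalman update illustrating this blending is sketched below; mapping the quality signal onto a measurement variance is an assumption for illustration rather than a prescribed design.

```python
def kalman_update(estimate: float, est_var: float,
                  measurement: float, meas_var: float):
    """One scalar Kalman step: blend the running estimate with a new
    measurement, weighted by their variances.  A poor quality signal
    maps to a large meas_var, so low-quality frames barely move the
    estimate; a missing frame simply skips the update."""
    gain = est_var / (est_var + meas_var)
    new_estimate = estimate + gain * (measurement - estimate)
    new_var = (1.0 - gain) * est_var
    return new_estimate, new_var

print(kalman_update(10.0, 4.0, 12.0, 1.0))    # good frame: -> (11.6, 0.8)
print(kalman_update(10.0, 4.0, 12.0, 100.0))  # poor frame: -> (~10.08, ~3.85)
```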
A control processor 35 receives ΔX on output 38 and ΔY on output 39. Based on the current and past values of ΔX and ΔY, control processor 35 is able to determine location, speed and acceleration of vehicle 10. Control processor 35 also updates an odometer reading indicating total distance traveled by the vehicle. In an alternative embodiment of the present invention, output 38 and output 39 are replaced with a first-in-first-out (FIFO) memory. Navigation engine 34 buffers values for ΔX and ΔY, along with frame rate information, if necessary, in the FIFO memory. Control processor 35 accesses the FIFO memory for the buffered values.
For example, control processor 35 is implemented as a stand-alone processor. Alternatively, control processor 35 is integrated with IPC 32, ADC 31 and navigation engine 34 as a system-on-chip (SOC) device. Image array 21 and associated optics can also be integrated as a single component module that can optionally include illuminator 22. A flexible printed circuit board (PCB) or other substrate can be used that provides for, for example, low-cost and highly reliable assembly, installation, and operation. Interfaces between blocks can be accomplished using serial, parallel, wireless, optical and/or other communication means.
Control processor 35 controls lens and magnification system 20.
Using a variable focus imager has significant advantages over use of a “fixed” focus system. A variable focus imager allows resolution of sufficient detail for surface correlation to work even when the distance between the optical motion sensor and an underlying surface is not constant.
When image array 21, illuminator 22, control processor 35, IPC 32, ADC 31 and navigation engine 34 are all integrated together as a single module, the resulting sensor module can be used for alternate applications. For example, such a motion sensor can be used as a pedometer placed on a person's belt or integrated in the sole of a shoe or sandal. This allows tracking not only of distance traveled, but also of speed and acceleration.
For example, control processor 35 processes the motion data as described in the blocks below.
In a block 41, control processor 35 obtains a ΔX value and a ΔY value. For example, navigation engine 34 is able to generate hundreds of ΔX values and ΔY values per second. In some applications, dependent upon desired accuracy and available resources, it may be sufficient to sum the ΔX values and the ΔY values received by control processor 35 over a predetermined amount of time. The predetermined amount of time is, for example, optimized for the particular application and desired accuracy.
In a block 42, a check is made to see if quality signal 37 is at an acceptable level. If, in block 42, quality signal 37 is not acceptable, this indicates, for example, some malfunction such as dirt or grime on image array 21 or illuminator 22. In a block 44, a note is made of the error and an appropriate message is indicated to the vehicle driver. The message could be, for example, in the form of a light signaling a diagnostic error, a warning sound, and so on. Also, readings from a back-up optical sensor, if available, may be used instead of readings from the optical sensor providing inadequate quality. Alternatively, or in addition, a corrective action can be initiated. For example, an automated system can be used to clean image array 21 and/or illuminator 22. For example, a sheet of transparent film is advanced across the lens of image array 21, removing any obstruction such as road spray.
If, in block 42, quality signal 37 is at an acceptable level, in a block 45 the current location of vehicle 10 is calculated. This is done, for example, by adding the current ΔX and ΔY values to a previous location to obtain a current location.
In a block 46, the current speed of vehicle 10 is calculated. This is done, for example, by dividing the distance traveled by the vehicle by the elapsed time, where the distance traveled is the square root of the sum of the current ΔX value squared and the current ΔY value squared.
In a block 47, the current acceleration of vehicle 10 is calculated. This is done, for example, by subtracting the previous speed of the vehicle from the current speed and dividing the result by the elapsed time.
In a block 48, a new odometer reading for vehicle 10 is calculated. This is done, for example, by calculating the distance traveled by the vehicle and adding the calculated distance to the previous odometer reading. Distance can be calculated, for example, as the square root of the sum of the current ΔX value squared and the current ΔY value squared. Other, less accurate ways to calculate distance traveled can also be used. For example, distance can be estimated using the current ΔY value alone.
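The bookkeeping of blocks 45 through 48 can be sketched as follows; this is a minimal illustration assuming displacements already converted to meters and a fixed sampling interval, not the implementation of control processor 35.

```python
import math

class MotionTracker:
    """Per-sample bookkeeping for blocks 45-48: update location,
    speed, acceleration, and odometer from each (delta_x, delta_y)."""

    def __init__(self):
        self.x = self.y = 0.0     # current location (m)
        self.speed = 0.0          # m/s
        self.acceleration = 0.0   # m/s^2
        self.odometer = 0.0       # total distance traveled (m)

    def update(self, dx: float, dy: float, dt: float):
        self.x += dx                                        # block 45
        self.y += dy
        distance = math.hypot(dx, dy)                       # sqrt(dx^2 + dy^2)
        new_speed = distance / dt                           # block 46
        self.acceleration = (new_speed - self.speed) / dt   # block 47
        self.speed = new_speed
        self.odometer += distance                           # block 48

tracker = MotionTracker()
tracker.update(dx=0.0, dy=0.5, dt=0.05)   # 0.5 m forward in 50 ms
print(tracker.speed, tracker.odometer)    # 10.0 m/s, 0.5 m
```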
In addition to position-based navigation, it is possible to obtain position by integrating velocity over time. Thus, the change in motion or position can be determined by various numeric methods such as the rectangular rule, the trapezoidal rule, or Simpson's rule.
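For instance, the trapezoidal rule applied to sampled speed readings gives distance traveled; this short sketch assumes uniformly spaced samples.

```python
def integrate_velocity(speeds, dt: float) -> float:
    """Trapezoidal-rule integration of sampled speed over time,
    yielding distance traveled."""
    distance = 0.0
    for v0, v1 in zip(speeds, speeds[1:]):
        distance += 0.5 * (v0 + v1) * dt   # area of one trapezoid
    return distance

# A vehicle accelerating smoothly from 0 to 4 m/s, sampled once per
# second, covers 8 m over the four intervals.
print(integrate_velocity([0.0, 1.0, 2.0, 3.0, 4.0], 1.0))  # 8.0
```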
The motion sensor described herein can be used to implement, for example, odometers, speedometers and navigation systems. When used as a navigation system, calibration is essential to establish an original location from which other locations are determined. The motion sensor described herein can also be used instead of a tracking wheel for vehicle performance testing and evaluation. When integrated with other sensors and/or input sources, performance measures such as braking efficiency, yaw, tire slip, fuel efficiency and so on can be calculated. GPS systems can be combined with the motion sensor described herein to provide an optimized navigation system.
Image array 21 can also be used for other applications in addition to motion sensing. For example, image array 21 could be implemented as a color image array and used to monitor driving conditions such as vehicle positioning with respect to the roadway. For example, detection of two yellow stripes in the field of vision of image array 21 could indicate the vehicle has crossed a “no-passing zone”. Similarly, a white “fog” line can be detected. Appropriate alerts can be passed to the driver in such cases. Additionally, vehicle performance and driver behavior can be monitored, and appropriate alerts generated. Alternatively, or in addition, the motion sensor can be used within a security system. For example, if the vehicle is detected outside a predefined geographic region, a security action can be implemented. For example, the security action includes disconnecting a fuel line, notifying police and so on. The security action, for example, is overridden by entry of a predefined password.
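A deliberately crude sketch of detecting two yellow stripes in a color frame is shown below; the RGB thresholds and stripe-counting heuristic are assumptions for illustration, and a production lane detector would be far more robust.

```python
import numpy as np

def count_yellow_stripes(rgb: np.ndarray) -> int:
    """Threshold for yellow pixels (strong red and green, weak blue),
    then count contiguous runs of columns containing yellow."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    yellow = (r > 150) & (g > 150) & (b < 100)
    column_hits = yellow.any(axis=0)   # which image columns contain yellow
    # Count rising edges: transitions from no-yellow to yellow columns.
    edges = np.diff(column_hits.astype(int)) == 1
    return int(edges.sum() + (1 if column_hits[0] else 0))

frame = np.zeros((10, 40, 3), dtype=np.uint8)
frame[:, 5:8] = (200, 180, 40)      # two painted stripes in view
frame[:, 20:23] = (200, 180, 40)
print(count_yellow_stripes(frame))  # -> 2
```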
When a tracking vehicle is, for example, an aircraft flying (autonomously) in formation with other aircraft, optical sensors can be placed strategically on the tracking vehicle to allow sensing in all directions around the aircraft. When the tracking vehicle is an underwater sea craft, optical sensors can be located, for example, to allow the tracking vehicle to track the seafloor and, for example, maintain a stationary position in the presence of current.
The foregoing discussion discloses and describes merely exemplary methods and embodiments of the present invention. As will be understood by those familiar with the art, the invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof. Accordingly, the disclosure of the present invention is intended to be illustrative, but not limiting, of the scope of the invention, which is set forth in the following claims.