The present invention relates generally to systems, arrangements and methods for using maps and images to locate a vehicle as a Global Navigation Satellite System (GNSS) replacement, and then using the vehicle location to control one or more vehicular components, such as a display of a navigation system, a vehicle steering or guidance system, a vehicle throttle system and a vehicle braking system. Route guidance and autonomous vehicle operation using highly accurate vehicle position determination is provided.
A detailed discussion of background information is set forth in U.S. Pat. Nos. 6,405,132, 7,085,637, 7,110,880, 7,202,776, 9,103,671, and 9,528,834. Additional prior art of relevance includes U.S. Pat. Nos. 7,456,847, 8,334,879, 8,521,352, 8,676,430 and 8,903,591.
A method and system for adjusting a vehicular component based on highly accurate vehicle position includes obtaining kinematic data from an inertial measurement unit (IMU) on the vehicle, deriving, using a processor, information about the current vehicle position from the data obtained from the inertial measurement unit and an earlier known vehicle position, and adjusting, using the processor, the derived current vehicle position to obtain a corrected current vehicle position. This latter step entails obtaining at least one image of an area external to the vehicle using at least one camera assembly on the vehicle, each being in a fixed relationship to the IMU, identifying multiple landmarks in each obtained image, analyzing, using the processor, each image to derive positional information about each landmark, obtaining, from a map database, positional information about each identified landmark, and identifying, using the processor, discrepancies between the positional information about each landmark derived from each image and the positional information about the same landmark obtained from the map database. Finally, the derived current vehicle position is adjusted, using the processor, based on the identified discrepancies to obtain the corrected current vehicle position, which is used to change operation of the vehicular component.
Various hardware and software elements used to carry out the invention described herein are illustrated in the form of system diagrams, block diagrams, flow charts, and depictions of neural network algorithms and structures.
Preferred embodiments are illustrated in the following figures:
The illustrated embodiments may be considered together as part of a common vehicle.
Map database 48 works in conjunction with a navigation system 46 to provide information to the driver of the vehicle 18 (see
Map database 48 contains a map of the roadway to an accuracy of a few centimeters (1 σ), i.e., data on the edges of the lanes of the roadway and the edges of the roadway, and the location of all stop signs and stoplights and other traffic control devices such as other types of road signs. Motion or operation of the vehicle can be analyzed relative to the data in the map database 48, e.g., data about edges of travel lanes, instructions or limitations provided or imposed by traffic control devices, etc., and a deviation from normal motion or operation of the vehicle detected.
Navigation system 46 is coupled to the GNSS and DGNSS processing system 42. The driver is warned if a warning situation is detected by a vehicle control or driver information system 45 coupled to the navigation system 46. Driver information system 45 comprises an alarm, light, buzzer or other audible or visual indicator, and/or a simulated rumble strip for yellow-line and “running off of road” situations, and a combined light and alarm for stop sign and stoplight infractions. Driver information system 45 may provide sound only, or sound and vibration as in a simulated rumble strip.
A local-area differential correction system known as Real Time Kinematic (RTK) differential GNSS is available and is the system of choice for creating accurate maps. In this system, local stations are established which, over time, determine their exact location to within millimeters. Using this information, the local stations can provide corrections to nearby moving vehicles, allowing them to determine their locations to within a few centimeters. RTK base stations determine their locations by averaging their estimated locations over time, thereby averaging out errors in the GNSS signals. By this method, they converge to an accurate position determination. When an RTK base station or vehicle is said to determine location, it is meant that hardware and/or software at the RTK base station or at or on the vehicle is configured to receive signals or data and derive location therefrom. Where implemented, RTK stations are typically placed 30 to 100 kilometers apart. However, in urban locations where multipath problems are relevant, such stations may be placed as close as tens to hundreds of meters.
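The differential principle can be sketched in a few lines of Python; this is a conceptual illustration only (real RTK operates on carrier-phase measurements rather than position differences, and the function and variable names here are hypothetical):

```python
import numpy as np

def differential_correction(base_true_pos, base_measured_pos, rover_measured_pos):
    """Apply a local differential correction to a rover position estimate.

    The base station knows its surveyed position; the difference between
    that and its instantaneous GNSS solution approximates the locally
    correlated GNSS error, which is then removed from the rover fix.
    """
    correction = np.asarray(base_true_pos) - np.asarray(base_measured_pos)
    return np.asarray(rover_measured_pos) + correction

# Example: the base and a nearby rover see roughly the same 1.8 m east / -0.9 m north error.
base_true = np.array([0.0, 0.0, 0.0])
base_meas = np.array([1.8, -0.9, 0.4])
rover_meas = np.array([101.7, 49.2, 0.3])
print(differential_correction(base_true, base_meas, rover_meas))  # ~[99.9, 50.1, -0.1]
```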
Maps created from satellite photographs are available for most of the world. Such maps show the nature of the topography, including roads and nearby road structures. Accuracy of such maps is limited to many meters, and such satellite-created maps are often insufficiently accurate for vehicle route guidance purposes, for example, and other purposes described herein. Various mapping companies provide significant corrections to maps through deployment of special mapping vehicles which, typically through use of lidar or laser-radar technology, have created maps now in widespread use for route guidance, for example, by vehicles in many parts of the world. Such maps, however, are only accurate to a few meters.
Although this is sufficient for route guidance, additional accuracy is needed for autonomous vehicle guidance, where centimeter-level accuracy is required to prevent vehicles from crossing lane markers, running off the road, and/or impacting fixed objects such as poles, trees or curbs. This is especially a problem in low-visibility conditions, where laser radar systems can be of marginal value. Techniques described herein solve this problem and provide maps accurate to the centimeter level.
An inventive approach is to accomplish the mapping function utilizing multiple probe vehicles, which are otherwise ordinary vehicles, each equipped with one or more cameras, an IMU and an accurate RTK DGNSS system as described below. Such a system can be called crowdsourcing. A receiver for obtaining WADGNSS corrections, such as those provided by OmniSTAR, is also preferably available on the vehicle for use in areas where RTK DGNSS is not available.
As each probe vehicle traverses a roadway, each camera thereon obtains images of the space around the vehicle and transmits these images, or information derived therefrom, to a remote station off of the vehicle, using a transmitter, which may be part of a vehicle-mounted communication unit. This communication can occur in any of a variety of ways, including a cellphone, the Internet using broadband such as WiMAX, LEO or GEO satellites, Wi-Fi where it is available, or any other telematics communication system. The information can also be stored in memory on the vehicle for transmission at a later time.
The remote station can create and maintain a map database from information transmitted by probe vehicles. When a section of roadway is first traversed by such a probe vehicle, the remote station can request that a full set of images be sent from the probe vehicle depending on available bandwidth. When sufficient bandwidth is unavailable, images can be stored on the vehicle, along with position information, for later uploading. Additional images can also be requested from other probe vehicles until the remote station determines that a sufficient image set has been obtained, i.e., a processor configured to process images at the remote station determines that a sufficient image set has been obtained. Thereafter, the probe vehicles can monitor terrain and compare it to the on-vehicle map (from map database 48) and notify the remote site if discrepancies are discovered.
If a GNSS receiver is placed at a fixed location, with appropriate software, it can eventually accurately determine its location without the need for a survey. It accomplishes this by taking a multitude of GNSS data and making a multitude of position estimates, as GNSS satellites move across the sky, and applying appropriate algorithms that are known in the art. By averaging these position estimates, the estimated position gradually approaches the exact position. This is a method by which local RTK stations are created. This process can get more complicated when known and invariant errors are present. Software exists for removing these anomalies and, in some cases, they can be used to improve position accuracy estimates.
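The convergence-by-averaging idea can be illustrated with a short simulation; the noise model below is a deliberate simplification (real GNSS errors are correlated in time, so convergence is slower in practice), and all values are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)
true_position = np.array([1000.0, 2000.0, 30.0])   # hypothetical fixed antenna location (m)

# Simulate noisy single-epoch GNSS fixes with zero-mean, meter-level noise.
fixes = true_position + rng.normal(scale=3.0, size=(10000, 3))

# Running mean of the fixes: the averaged estimate approaches the true position.
running_mean = np.cumsum(fixes, axis=0) / np.arange(1, len(fixes) + 1)[:, None]
for n in (10, 100, 1000, 10000):
    err = np.linalg.norm(running_mean[n - 1] - true_position)
    print(f"after {n:5d} fixes: error = {err:.3f} m")
```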
In a probe vehicle, corrected or uncorrected GNSS signals are used to correct drift errors in the IMU 44 and it is the IMU 44 which is used by the vehicle to provide an estimate of its position at any time. If the GNSS signals are the only available information, then the vehicle location, as represented by IMU 44, will contain significant errors on the order of many meters. If WADGNSS is available, these errors are reduced to on the order of a decimeter and if RTK DGNSS is available, these errors are reduced to a few centimeters or less.
When a probe vehicle acquires an image, it records the position and pointing angle of the camera as determined by the IMU 44. The position and pointing angle are used to determine a vector to a point on an object, the landmark, in the image, such as a pole. After two images are obtained, the location of the pole can be determined mathematically as the intersection of the two vectors to the same point on the pole. This location will be in error due to inaccuracies in the IMU 44 and in the imaging apparatus.
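Since two measured rays rarely intersect exactly, one reasonable implementation, shown here as a sketch with hypothetical names, takes the midpoint of the segment of closest approach between the two rays:

```python
import numpy as np

def triangulate_landmark(p1, d1, p2, d2):
    """Estimate a landmark position from two camera rays.

    p1, p2: camera positions (from the IMU/GNSS solution) when each image was taken.
    d1, d2: unit direction vectors toward the same landmark reference point,
            derived from the pixel location and the camera pointing angles.
    Returns the midpoint of the segment of closest approach between the rays.
    """
    p1, d1, p2, d2 = map(np.asarray, (p1, d1, p2, d2))
    w0 = p1 - p2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b
    if abs(denom) < 1e-12:            # rays are (nearly) parallel: no usable intersection
        return None
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    return 0.5 * ((p1 + s * d1) + (p2 + t * d2))

# Two views of a pole at (10, 5, 0), taken 20 m apart along the road.
pole = np.array([10.0, 5.0, 0.0])
c1, c2 = np.array([0.0, 0.0, 1.5]), np.array([20.0, 0.0, 1.5])
r1 = (pole - c1) / np.linalg.norm(pole - c1)
r2 = (pole - c2) / np.linalg.norm(pole - c2)
print(triangulate_landmark(c1, r1, c2, r2))   # approximately [10. 5. 0.]
```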
Since imaging apparatus errors are invariant, such as imperfections in the lenses, they can be mostly removed through calibration of the apparatus. Distortion due to lens aberrations can be mapped and corrected in software. Other errors, due to barrel distortions or due to the shutter timing in a rolling shutter camera, can similarly be removed mathematically. The remaining errors are thus due to the IMU 44. These errors are magnified with increasing distance between the vehicle and the landmark, e.g., the pole.
In the same manner as the fixed GNSS RTK receiver gradually determines its exact location through averaging multiple estimates, location of the reference point on a pole can similarly be exactly determined by averaging position estimates. When the IMU location is determined only using GNSS readings, a large number of position estimates are required since the IMU errors will be large. Similarly, if WADGNSS is available, fewer position estimates are necessary and with RTK DGNSS, only a few position estimates are required. This process favors use of nearby poles due to the error magnification effect but even further away poles will be accurately located if sufficient position estimates are available.
It takes two images to obtain one position estimate, provided the same landmark is in both images. Three images provide three position estimates by combining image 1 with image 2, image 1 with image 3 and image 2 with image 3. The number of position estimates grows rapidly with the number of images n according to the formula n*(n−1)/2. Thus, forty-five position estimates are obtained from ten images and 4950 position estimates from one hundred images.
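This combinatorial growth is easy to check; the short sketch below (illustrative only) counts and enumerates the pairs that would each be fed to the triangulation described above:

```python
from itertools import combinations

def pair_count(n: int) -> int:
    """Number of image pairs, and hence position estimates, from n images of a landmark."""
    return n * (n - 1) // 2

print(pair_count(10), pair_count(100))   # 45 4950

images = [1, 2, 3]                        # three images of the same landmark
print(list(combinations(images, 2)))      # [(1, 2), (1, 3), (2, 3)]
```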
Initially, multiple images can be obtained by a single probe vehicle but, as the system becomes widely adopted, images from multiple probe vehicles can be used, further randomizing any systematic equipment errors which have not been successfully removed.
A pole is one example of a landmark to be used in the creation of accurate maps as taught herein. Other landmarks include any invariant (fixed in position) structure with a characteristic which can be easily located, such as the right edge or center of a pole at its midpoint, top or at a point where the pole intersects the ground, or any other agreed upon reference point. Examples of other landmarks are edges of buildings, windows, curbs, guardrails, road edges, lane markers or other painted road markings, bridges, gantries, fences, road signs, traffic lights, billboards and walls.
Landmarks may be limited to man-made objects; however, in some cases, natural objects such as rocks and trees can be used. In many landmarks, a particular point, such as the midpoint or top of a pole, needs to be selected as a representative or position-representing point. Some landmarks, such as a curb, road edge or painted lane marker, do not have a distinctive beginning or end that appears in a single image. Even in such cases, the line does begin and end or is crossed by another line. Distance traveled from such a starting or crossing point can be defined as the representative point.
Some objects, such as trees and rocks, do not lend themselves to being chosen as landmarks, and yet their placement on a map for safety reasons can be important. Such objects can be placed on the map so that vehicles can avoid impacting them. For such objects, a more general location can be determined, but the object will not be used for map accuracy purposes.
Satellite-created maps are generally available which show the character of terrain. However, since satellite-created maps are generally not sufficiently accurate for route guidance purposes, they can be made more accurate using the teachings of this invention: the locations of the landmarks discussed above, which can be observed on the satellite-created maps, can be accurately established and the satellite-created maps appropriately adjusted so that all aspects of the terrain are accurately represented.
Initially in the mapping process, complete images are transmitted to the cloud. As the map is established, only information relative to landmarks needs to be transmitted, greatly reducing the required bandwidth. Furthermore, once a desired accuracy level is obtained, only data relevant to map changes need to be transmitted. This is the automatic updating process.
Computer programs in the cloud, i.e., resident at a hosting facility (remote station) and executed by a processor and associated software and hardware thereat, will adjust satellite images and incorporate landmarks to create a map for various uses described herein. Probe vehicles can continuously acquire images and compare location of landmarks in those images with their location on the map database and when a discrepancy is discovered, new image data, or data extracted therefrom, is transmitted to the cloud for map updating. By this method, an accurate map database can be created and continuously verified using probe vehicles and a remote station in the cloud that creates and updates the map database. To facilitate this comparison, each landmark can be tagged with a unique identifier.
When processing multiple images at the remote station, using for example stereographic techniques with dual images, images or data derived from the images are converted to a map including objects from the images by identifying common objects in the images, for example by neural networks or deep learning, and using position and pointing information from when the images were obtained to place the objects on the map. Images may be obtained from the same probe vehicle, taken at different times and including the same, common object, or from two or more probe vehicles and again, including the same, common object.
By using a processor at the remote station that is not located on the probe vehicles yet is in communication with them, images from multiple vehicles, or from the same vehicle taken at different times, may be used to form the map. In addition, by putting the processor separate from the probe vehicles, it is possible to use WADGNSS without having equipment to enable such corrections on the probe vehicles.
By using the method above, an accurate map database can automatically be constructed and continuously verified without the need for special mapping vehicles. Other map information can be incorporated in the map database at the remote station such as locations, names and descriptions of natural and man-made structures, landmarks, points of interest, commercial enterprises (e.g., gas stations, libraries, restaurants, etc.) along the roadway since their locations can have been recorded by probe vehicles.
Once a map database has been constructed using more limited data from probe vehicles, additional data can be added using data from probe vehicles that have been designed to obtain different data than the initial probe vehicles have obtained, thereby providing a continuous enrichment and improvement of the map database. Additionally, the names of streets or roadways, towns, counties, or any other such location based names and other information can be made part of the map.
Changes in the roadway location due to construction, landslides, accidents etc. can be automatically determined by the probe vehicles. These changes can be rapidly incorporated into the map database and transmitted to vehicles on the roadway as map updates. These updates can be transmitted by means of a ubiquitous Internet such as WiMAX, or equivalent, or any other appropriate telematics method. All vehicles should eventually have permanent Internet access which permits efficient and continuous map updating.
WADGNSS differential corrections can be applied at the remote station and need not be considered in the probe vehicles, thus removing the calculation and telematics load from the probe vehicle. See, for example, U.S. Pat. No. 6,243,648. The remote station, for example, could know the DGNSS corrections for the approximate location of the vehicle at the time that images or GNSS readings were acquired. Over time, the remote station would know the exact locations of infrastructure-resident features, such as the pole discussed above, in a manner similar to the fixed GNSS receiver discussed above.
In this implementation, the remote station would know mounting locations of the vehicle-mounted camera(s), GNSS receivers and IMU on the vehicle and relative to one another and view angles of the vehicle-mounted camera(s) and its DGNSS corrected position which should be accurate within 10 cm or less, one sigma, for WADGNSS. By monitoring the movement of the vehicle and relative positions of objects in successive pictures from a given probe vehicle and from different probe vehicles, an accurate three-dimensional representation of the scene can be developed over time.
Once road edge and lane locations, and other roadway information, are transmitted to the vehicle, or otherwise included in the database (for example upon initial installation of the system into a vehicle), it requires very little additional bandwidth to include other information such as location of all businesses that a traveler would be interested in, such as gas stations, restaurants etc., which could be done on a subscription basis or based on advertising.
Considering now
In
The two antennas 74, 75 provide information to a processor in electronics package 60 to give an accurate measurement of the vehicle heading direction or yaw. This can also be determined from the IMU when the vehicle is moving. If the vehicle is at rest for an extended time period, the IMU can give a poor heading measurement due to drift errors.
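A minimal sketch of how the dual-antenna baseline yields heading is given below; the east-north-up frame, the assumption that the antennas lie along the vehicle's longitudinal axis, and the function name are illustrative assumptions rather than details taken from the figures:

```python
import math

def heading_from_antennas(front_enu, rear_enu):
    """Vehicle heading (yaw) from two GNSS antenna positions in a local
    east-north-up frame, with the antennas mounted along the vehicle's
    longitudinal axis (rear antenna behind the front one)."""
    de = front_enu[0] - rear_enu[0]      # east component of the baseline
    dn = front_enu[1] - rear_enu[1]      # north component of the baseline
    return math.degrees(math.atan2(de, dn)) % 360.0   # 0 deg = north, clockwise positive

print(heading_from_antennas((10.4, 20.0, 1.6), (10.0, 19.0, 1.6)))  # ~21.8 deg east of north
```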
The components which make up electronics assembly 60 are shown in
Additional systems in accordance with the invention are illustrated in
Assembly 110 is mounted onto the exterior surface of a roof 126 of a vehicle 128 along with a second GNSS antenna 145 coupled thereto by a coupling connector 118. The mounting means to provide for this mounting may be any known to those skilled in the art for attaching external vehicular components to vehicle body panels and roofs.
In
If 60 degree lenses are used in each camera assembly 132, 134, then the angle of rotation can be slightly less than about 30 degrees so that all areas within a 120 degree FOV except a small triangle in the center and in front of the vehicle are imaged. Navigation and antenna assembly 112 is shown mounted in the center of the exterior surface of roof 126.
An alternate configuration providing potentially greater accuracy is to move camera assemblies 132, 134 to positions that are as close as possible to the navigation and antenna assembly 112, moving navigation and antenna assembly 112 slightly rearward so that camera assemblies 132, 134 would be touching each other.
For some systems, a portable computing device, such as a laptop 80 as shown in
In some implementations, the only processing by laptop 80 is to tag received images with displacement and angular coordinates of the camera(s) providing each image and to update the IMU with corrections calculated from the navigation unit. The IMU may be part of the navigation unit. The images will then be retained on the laptop 80 and transferred either immediately or at some later time to a remote station via the telecommunications capability of the laptop 80.
At the remote station, there will likely be another processing unit that will further process the data to create a map. In other implementations, the images are processed by a computer program executed by the processing unit to search for landmarks using pattern recognition technology, such as neural networks, configured or trained to recognize poles and other landmarks in images. In this case, only landmark data needs to be transferred to the processing unit at the remote station for processing by the computer program. Initially the first process will be used but after the map is fully developed and operational, only landmark data that indicates a map change or error will need to be transmitted to the processing unit at the remote station.
Camera assemblies 151, 151A, 152, 152A and 153 do not need to be mounted at the same location and if they were placed at edges of the roof 155 at A-Pillar 156, as in
When the system of this invention is used for determining vehicle location in poor visibility situations and displaying the vehicle location on the display of laptop 80, IR floodlights 180 can be provided at the front on each side of vehicle 150 to augment the illumination of the headlights 178 of vehicle 150. The camera assemblies in this case need to be sensitive to near-IR illumination.
In some embodiments, additional cameras or wide angle lenses can be provided which extend the FOV to 180 degrees or more. This allows the system to monitor street view scenes and report changes.
The embodiments illustrated in
Electronics used in box 60 of
When used with RTK differential GPS, horizontal position accuracy is about 0.008 m, vertical position accuracy is about 0.015 m, dynamic roll and pitch accuracy is about 0.15 degrees, and heading accuracy is about 0.1 degree. When the system of this invention is in serial production, a special navigation device is provided having similar properties to the AN, potentially at a lower cost. Until such time, the commercially available AN may be used in the invention.
AN 301 contains the IMU and two spaced-apart GNSS antennas. The antennas provide the ability to attain accurate heading (yaw) information. In addition, AN 301 contains a receiver for receiving differential corrections from OmniSTAR and RTK differential correction systems. Accurate mapping can be obtained with either system, and even without any differential corrections; however, the lower the position and angular accuracy available, the greater the number of images required. When RTK is available, 10 cm pole position accuracy can be obtained on a single pass by an image-acquiring vehicle, whereas 10 passes may be required when only OmniSTAR is available and perhaps 50 to 100 passes when no differential corrections are available.
In
5. Determining Vehicle Location without Satellite Navigation Systems
In the chart, “FID” means landmark. The flowchart is shown generally at 400. Each of the steps is listed below. Step 401 is to begin. Step 402 is setting initial data, including the Kalman filter's parameters. Step 403 is IMU-data reading (detecting) with a frequency of 100 Hz: the acceleration vector A and the angular speed vector ω (considered kinematic properties of the vehicle). Step 404 is error compensation for the IMU. Step 405 is calculation of the current longitude λ, latitude ϕ, altitude h, Roll, Pitch, Yaw, and linear speed vector V. Step 405 is generally a step of deriving, using a processor, information about current vehicle position from the data obtained from the IMU and an earlier known vehicle position by analyzing movement therefrom. Step 406 is reading GPS-data with GNSS or RTK correction (if available), detected with a frequency of 1, . . . , 10 Hz: longitude λgps, latitude ϕgps, altitude hgps, and linear speed vector Vgps. Step 407 is a query as to whether there is new reliable GPS-data available. If so, step 408 is bringing the GPS and IMU measurements to a common time (synchronization), and step 409 is calculation of the first observation vector Y1 = [(λ − λgps)·Re·cos(ϕgps); (ϕ − ϕgps)·Re; h − hgps; V − Vgps], where Re = 6371116 m is the average Earth radius. Thereafter, or when there is no new reliable GPS-data available in step 407, step 410 is taking an image (if available) with a frequency of 1, . . . , 30 Hz. Landmark processing for correcting the vehicle position may thus occur only when GPS-data is not available.
Step 411 is a query as to whether a new image is available. If so, step 412 is to preload information about landmarks previously recognized in the current area from the map, step 413 is identification of known landmarks Nj, j = 1, . . . , M, and step 414 is a query as to whether one or more landmark(s) is/are recognized in the image. If so, step 415 is retrieving the coordinates λj, ϕj, hj of the j-th landmark from the map (database), step 416 is calculation of the local angles of the landmark, including γj, step 417 is bringing the IMU measurements to the time of the still image (synchronization), and step 418 is calculation of the second observation vector Y2 = [Y′1; . . . ; Y′j; . . . ; Y′M′], j = 1, . . . , M′, where M′ is the number of recognized landmarks (M′ ≤ M), Y′j = [(λ − λj)·Re·cos(ϕj); (ϕ − ϕj)·Re; h − hj] − rj·R·Fj, rj = [{(λ − λj)·Re·cos(ϕj)}² + {(ϕ − ϕj)·Re}² + {h − hj}²]^(1/2), and R and Fj are calculated as in algorithm 1B.
In step 419, a query is made as to whether there is new data for error compensation. If so, step 420 is recursive estimation with the Kalman filter: X̂ = K·[Y1, Y2], X̂ = [Δλ, Δϕ, Δh, ΔV, ΔΨ, ΔB], ΔΨ = [ΔRoll, ΔPitch, ΔYaw] is a vector of orientation angle errors, ΔB is a vector of IMU errors, K is a matrix of gain coefficients, and step 421 is error compensation for longitude λ, latitude ϕ, altitude h, Roll, Pitch, Yaw, and the linear speed vector V. Step 421 constitutes a determination of adjusted IMU output. Thereafter, or when there is no new data for error compensation in step 419, step 422 is output parameters: longitude λ, latitude ϕ, altitude h, Roll, Pitch, Yaw, and linear speed vector V. In step 423, a query is made as to whether to terminate operation, and if so, step 424 is the end. If not, the process returns to step 403. Some or all of steps 406-421 may be considered to constitute an overall step of adjusting, using a processor, derived current vehicle position (step 405 determined using an earlier known vehicle position and movement therefrom) to obtain a corrected current vehicle position (by compensating for errors in output from the IMU).
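As a rough illustration of the recursive estimation of step 420, the sketch below performs one error-state Kalman measurement update; the state dimensions, noise values and function name are hypothetical, and the predicted error state is assumed to be zero because the previous corrections have already been fed back in step 421:

```python
import numpy as np

def error_state_update(P, H, R, y):
    """One Kalman measurement update for an error-state filter (cf. step 420).

    P : predicted covariance of the error state (position, velocity,
        attitude and IMU-bias errors).
    H : measurement matrix mapping the error state to the observation
        vector y (the stacked Y1/Y2 GNSS and landmark residuals).
    Returns the estimated error state (= K @ y, since the predicted error
    is zero after the previous compensation) and the updated covariance.
    """
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # gain matrix (the K of step 420)
    x_hat = K @ y                        # estimated errors: position, velocity, attitude, biases
    P_new = (np.eye(P.shape[0]) - K @ H) @ P
    return x_hat, P_new

# Toy dimensions: 15 error states, a 6-element observation (3 position + 3 velocity residuals).
n, m = 15, 6
P = np.eye(n) * 0.5
H = np.hstack([np.eye(m), np.zeros((m, n - m))])
R = np.eye(m) * 0.1
y = np.array([1.2, -0.4, 0.1, 0.05, -0.02, 0.0])
x_hat, P = error_state_update(P, H, R, y)
print(np.round(x_hat[:6], 3))            # corrections of the kind applied in step 421
```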
An important aspect of this technique is based on the fact that much in the infrastructure is invariant and, once it is accurately mapped, a vehicle with one or more mounted cameras can accurately determine its position without the aid of satellite navigation systems. This accurate position is used for any known purpose, e.g., to display the vehicle location on a display of a navigation system.
Initially, a map will be created basically by identifying objects in the environment near a road and, through a picture taking technique, determining the location of each of these objects using photogrammetry as described in International Pat. Appl. No. PCT/US14/70377 and U.S. Pat. No. 9,528,834. The map can then be used by an at least partly vehicle-resident route guidance system to permit the vehicle to navigate from one point to another.
Using this photogrammetry technique, a vehicle can be autonomously driven such that it does not come close to, and ideally does not impact, any fixed objects on or near the roadway. For autonomous operation, the vehicle component being controlled based on the position determination includes one or more of the vehicle guidance or steering system 96, the vehicle throttle system including the engine 98, the vehicle braking system 94 (see
For route guidance, instead of using the corrected current vehicle position to display on a display of a navigation system, such as on laptop 80, the corrected current vehicle position is input to one or more of the vehicle component control systems to cause them to change their operation, e.g., turn the wheels or slow down. When displayed on a navigation system, e.g., laptop 80 or another system in the vehicle, the content of the display is controlled based on the corrected current vehicle position to show roads, landmarks, terrain, etc. around the corrected current vehicle location.
Since this technique will generate maps accurate to within a few centimeters, it should be more accurate than existing maps and thus appropriate for autonomous vehicle guidance even when visibility is poor. Location of the vehicle during the map creation phase will be determined by GNSS satellites and a differential correction system. If RTK differential GNSS is available, then the vehicle location accuracy can be expected to be within a few centimeters. If WADGNSS is used, then accuracy is on the order of decimeters.
Once the map is created, a processing unit in the vehicle has the option of determining its location, which is considered the location of the vehicle, based on landmarks represented in the map database. The method by which this can be done is described below. Exemplifying, but non-limiting and non-exclusive, steps for such a process can be:
This process can be further explained from the following considerations.
Using the above process, a processing unit on a vehicle, in the presence of or with knowledge about mapped landmarks, can rapidly determine its position and correct the errors in its IMU without the use of GNSS satellites. Once a map is in place, the vehicle is immune to satellite spoofing, jamming, or even the destruction of satellites as might occur in wartime. In fact, only a single mapped landmark is required, provided at least three images are made of the landmark from three different locations. If three landmarks are available in an image, then only a single image is required for the vehicle to correct its IMU. The more landmarks in a picture and the more pictures of particular landmarks, the better the estimation of the IMU errors.
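The single-image case with three mapped landmarks can be illustrated with a linear least-squares sketch: given world-frame bearing (unit direction) vectors toward landmarks whose coordinates come from the map, each ray constraint (I − d·dᵀ)(L − p) = 0 is linear in the unknown position p. The formulation and names below are illustrative assumptions, not the algorithm referenced in the flowcharts:

```python
import numpy as np

def position_from_bearings(landmarks, directions):
    """Estimate the camera/vehicle position from world-frame unit bearing
    vectors toward landmarks with known (mapped) positions.

    Each constraint (I - d d^T)(L - p) = 0 states that landmark L lies on
    the ray from the unknown position p along direction d; stacking the
    constraints gives a linear least-squares problem in p.
    """
    A_rows, b_rows = [], []
    for L, d in zip(landmarks, directions):
        d = np.asarray(d, float)
        d = d / np.linalg.norm(d)
        Pd = np.eye(3) - np.outer(d, d)    # projector onto the plane normal to d
        A_rows.append(Pd)
        b_rows.append(Pd @ np.asarray(L, float))
    A = np.vstack(A_rows)
    b = np.concatenate(b_rows)
    p, *_ = np.linalg.lstsq(A, b, rcond=None)
    return p

# Three mapped landmarks seen from a vehicle at (2, 3, 1.5) (unknown to the solver).
truth = np.array([2.0, 3.0, 1.5])
lm = [np.array([20.0, 10.0, 5.0]), np.array([-5.0, 25.0, 8.0]), np.array([15.0, -12.0, 6.0])]
dirs = [(L - truth) / np.linalg.norm(L - truth) for L in lm]
print(np.round(position_from_bearings(lm, dirs), 3))   # approximately [2. 3. 1.5]
```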
To utilize this method of vehicle location and IMU error correction, landmarks must be visible to the vehicle camera assemblies. Normally, the headlights will provide sufficient illumination for nighttime driving. As an additional aid, near IR floodlights such as 180 in
On the vehicle 450, the following steps occur: Step 451, acquire Image; Step 452, acquire IMU Angles and Positions; Step 453, compress the acquired data for transmission to the cloud; and Step 454, send the compressed data to the cloud.
In the cloud 460, the following steps occur: Step 461, receive an image from a mapping vehicle; Step 462, identify a landmark using a pattern recognition algorithm such as neural networks; Step 463, assign an ID when a landmark is identified; Step 464, store the landmark and the assigned ID in a database; and when there are no identified landmarks, Step 465, search the database for multiple same-ID entries. If there are none, the process reverts to step 461. If there are multiple ID entries in the database as determined in step 465, step 466 is to combine a pair to form a position estimate by calculating the intersection of vectors passing through a landmark reference point.
An important aspect of the invention is the use of two pictures, each including the same landmark, and calculation of the position of a point on the landmark from the intersection of two vectors drawn based on the images and the known vehicle location when each image was acquired. It is, in effect, stereo vision where the distance between the stereo cameras is large and thus the accuracy of the intersection calculation is high. Coupled with the method of combining images (n*(n−1)/2), highly accurate positional determination is obtained with only one pass and perhaps 10 images of a landmark.
Step 467 is a query as to whether there are more pairs, and if so, the process returns to step 466. If not, the process proceeds to step 468, combining position estimates to find most probable location of the vehicle, step 469, placing the vehicle location on a map, and Step 470, making updated map data available to vehicles. From step 470, the process returns to step 465.
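Step 468's combination of the pairwise estimates produced by step 466 might look like the following sketch; a component-wise median is used here as a simple, outlier-resistant stand-in for the "most probable location", since the combination rule is not specified:

```python
import numpy as np

def combine_estimates(estimates):
    """Combine pairwise triangulation estimates into a single location.

    A component-wise median is used as an assumption; it discounts the
    occasional bad pair (e.g., a mismatched landmark) without modeling it.
    """
    return np.median(np.asarray(estimates, float), axis=0)

pairwise = [[10.02, 5.01,  0.00],
            [ 9.97, 4.98,  0.03],
            [10.05, 5.04, -0.02],
            [11.90, 5.60,  0.40]]    # one bad pair
print(combine_estimates(pairwise))   # approximately [10.04, 5.03, 0.02]
```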
The system processing depicted in
The cloud 460 represents a location remote from the vehicle 450, most generally, an off-vehicle location which communicates wirelessly with the vehicle 450. The cloud 460 is not limited to entities commonly considered to constitute the cloud and may be any location separate and apart from the vehicle at which a processing unit is resident.
On the vehicle 500, the following steps occur:
Step 501, acquire Image;
Step 502, acquire IMU Angles and Positions from which the image was acquired;
Step 503, identify a Landmark using a pattern recognition algorithm such as neural networks;
Step 504, assign an ID to the identified Landmark;
Step 505, compress the acquired data for transmission to the cloud; and
Step 506, send the compressed acquired data to the cloud.
In the cloud, the following steps occur:
Step 511, receive an image;
Step 512, store the received image in a database;
Step 513, search the database for multiple identical ID entries, and when one is found,
Step 514, combine a pair to form a position estimate. If no multiple identical ID entries are found, additional images are received in step 511.
A query is made in step 515 as to whether there are more pairs of multiple identical ID entries and, if so, each is processed in step 514. If not, in step 516, the position estimates are combined to find the most probable location (of the vehicle) and, in step 517, the vehicle location is placed on a map. In step 518, the updated map is made available to vehicles.
Once the map has been created and stored in a map database on the vehicle 500, essentially the only transmissions to the cloud 510 will relate to changes or accuracy improvements to the map. This will greatly reduce the bandwidth requirements at a time when the number of vehicles with the system is increasing.
Several distortions can arise in an image taken of the scene by a camera assembly. Some are due to aberrations in the lens of the camera assembly, which are local distortions caused when the lens has an imperfect geometry. These can be located and corrected for by taking a picture of a known pattern and seeing where deviations from that known pattern occur. A map can be made of these errors and the image corrected using that map. Such image correction would likely be performed during processing of the image, e.g., as a sort of pre-processing step by a processing unit receiving the image from a camera assembly.
Barrel distortions arise from the use of a curved lens to create a pattern on a flat surface. They are characterized by a bending of an otherwise straight line, as illustrated in
Cameras generally have either a global or a rolling shutter. In the global shutter case, all of the pixels are exposed simultaneously, whereas in the rolling shutter case, first the top row of pixels is exposed and the data transferred off of the imaging chip, then the second row of pixels is exposed, and so on. If the camera is moving while the picture is being taken in the rolling shutter case, vertical straight lines appear to be bent to the left, as shown by nearby fence pole 361 compared with distant pole 362 in
By the above methods, known distortions can be computationally removed from the images.
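One hedged example of such a correction step is sketched below using OpenCV's undistort with radial (barrel) distortion coefficients of the kind obtained from a calibration against a known pattern; the file names and coefficient values are placeholders:

```python
import cv2
import numpy as np

# Intrinsics and distortion coefficients from a prior checkerboard calibration
# (placeholder values; in practice they come from cv2.calibrateCamera on pattern images).
camera_matrix = np.array([[1200.0,    0.0, 960.0],
                          [   0.0, 1200.0, 540.0],
                          [   0.0,    0.0,   1.0]])
dist_coeffs = np.array([-0.28, 0.08, 0.0, 0.0, 0.0])   # k1, k2, p1, p2, k3

img = cv2.imread("frame_0001.png")                      # hypothetical mapping-camera frame
undistorted = cv2.undistort(img, camera_matrix, dist_coeffs)
cv2.imwrite("frame_0001_undistorted.png", undistorted)
```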
An important part of some embodiments of the invention is the digital map that contains relevant information relating to the road on which the vehicle is traveling. The digital map of this invention usually includes the location of the edge of the road, the edge of the shoulder, the elevation and surface shape of the road, the character of the land beyond the road, trees, poles, guard rails, signs, lane markers, speed limits, etc., as discussed elsewhere herein. This data or information is acquired in a unique manner for use in the invention, and the method for acquiring the information, either by special or probe vehicles, and its conversion to, or incorporation into, a map database that can be accessed by the vehicle system is part of this invention.
The maps in the map database may also include road condition information, emergency notifications, hazard warnings and any other information which is useful to improve the safety of the vehicle road system. Map improvements can include the presence and locations of points of interest and commercial establishments providing location-based services. Such commercial locations can pay to have an enhanced representation of their presence along with advertisements and additional information which may be of interest to a driver or other occupant of the vehicle. This additional information could include the hours of operation, gas price, special promotions etc. Again, the location of the commercial establishment can be obtained from the probe vehicles and the commercial establishment can pay to add additional information to the map database to be presented to the vehicle occupant when the location of the establishment is present on the map being displayed on the display of the navigation system.
All information regarding the road, both temporary and permanent, should be part of the map database, including speed limits, presence of guard rails, width of each lane, width of the highway, width of the shoulder, character of the land beyond the roadway, existence of poles or trees and other roadside objects, location and content of traffic control signs, location of variable traffic control devices, etc. The speed limit associated with particular locations on the maps may be coded in such a way that the speed limit can depend upon the time of day and/or the weather conditions. In other words, speed limit may be a variable that will change from time to time depending on conditions.
It is contemplated that there will be a display for various map information which will always be in view for the passenger and/or driver at least when the vehicle is operating under automatic control. Additional user information can thus also be displayed on this display, such as traffic conditions, weather conditions, advertisements, locations of restaurants and gas stations, etc.
Very large map databases can now reside on a vehicle as the price of memory continues to drop. Soon it may be possible to store the map database of an entire country on the vehicle and to update it as changes are made. The area that is within, for example, 1000 miles of the vehicle can certainly be stored, and as the vehicle travels from place to place, the remainder of the database can be updated as needed through a connection to the Internet, for example.
When mention is made of the vehicle being operative to perform communications functions, it is understood that the vehicle includes a processor, processing unit or other processing functionality, which may be in the form of a computer, which is coupled to a communications unit including at least a receiver capable of receiving wireless or cellphone communications, and thus this communications unit is performing the communications function and the processor is performing the processing or analytical functions.
If the output of the IMU pitch and roll sensors is additionally recorded, road topography information can be added to the map to indicate the side-to-side and forward-to-rear slopes in the road. This information can then be used to warn vehicles of unexpected changes in road slope which may affect driving safety. It can also be used along with pothole information to guide road management as to where repairs are needed.
Many additional map enhancements can be provided to improve highway safety. Mapping cameras described herein can include stoplights in their field of view; since the existence of a stoplight will be known by the system, as it will have been recorded on the map, the vehicle will know when to look for the stoplight and determine the color of the light as the vehicle is determined to be approaching the stoplight, i.e., is within a predetermined distance which allows the camera to determine the status of the stoplight. More generally, a method for obtaining information about traffic-related devices providing variable information includes providing a vehicle with a map database including the locations of the devices, determining the location of the vehicle, and, as the location of the vehicle is determined to be approaching the location of each device as known in the database, obtaining an image of the device using, for example, a vehicle-mounted camera. This step may be performed by the processor disclosed herein, which interfaces with the map database and the vehicle-position determining system. The images are analyzed to determine the status of the device, which entails optical recognition technology.
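The proximity-triggered check described above might be organized as in the following sketch, in which the distance threshold, the map-record format and the capture/classification helpers are all hypothetical placeholders:

```python
import math

APPROACH_RADIUS_M = 120.0    # hypothetical range at which the camera can resolve the light

def enu_distance(a, b):
    """Planar distance between two (east, north) coordinates in a local frame, in meters."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def check_traffic_devices(vehicle_enu, mapped_devices, capture_image, classify_state):
    """When the vehicle comes within range of a mapped variable device (e.g., a stoplight),
    grab a camera frame and run the recognition step to report its current state."""
    states = {}
    for device in mapped_devices:                    # each entry drawn from the map database
        if enu_distance(vehicle_enu, device["enu"]) <= APPROACH_RADIUS_M:
            frame = capture_image()                  # vehicle-mounted camera
            states[device["id"]] = classify_state(frame, device)   # e.g., "red"/"yellow"/"green"
    return states
```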
When RTK GNSS is available, a probe vehicle can know its location within a few centimeters and in some cases within one centimeter. If such a vehicle is traveling at less than 100 KPH, for example, at least three to four images can be obtained of each landmark near the road. From these three to four images, the location of each landmark can be obtained to within 10 centimeters which is sufficient to form an accurate map of the roadway and nearby structures. A single pass of a probe vehicle is sufficient to provide an accurate map of the road without use of special mapping vehicles.
While the invention has been illustrated and described in detail in the drawings and the foregoing description, the same is to be considered as illustrative and not restrictive in character, it being understood that only preferred embodiments have been shown and described and that all changes and modifications that come within the spirit of the invention are desired to be protected.
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/US2017/012745 | 1/9/2017 | WO | 00

Number | Date | Country
---|---|---
62276481 | Jan 2016 | US