This invention generally relates to autonomous navigation and, more particularly, to a system and method for using visual navigation buoys with predetermined locations to aid in the task of autonomous navigation.
In navigation, odometry is the use of data from the movement of actuators to estimate change in position over time, for example through devices such as rotary encoders that measure wheel rotations. Visual odometry is the process of determining equivalent odometry information from sequential camera images to estimate the distance traveled. Many existing approaches to visual odometry are based on the steps of image acquisition and correction, feature detection and tracking, estimation of camera motion, and calculation of feature geometric relationships.
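For illustration only, the feature detection, tracking, and camera motion estimation steps described above can be sketched in a few lines of Python using the OpenCV library; the calibrated camera matrix, the function name, and the parameter values below are assumptions and not part of the disclosure.

```python
# Illustrative monocular visual odometry step: detect features in one frame,
# track them into the next frame, and recover the relative camera motion.
import cv2
import numpy as np

def estimate_motion(prev_frame, curr_frame, K):
    """Return rotation R and unit-scale translation t between two frames.
    K is the 3x3 camera intrinsic matrix obtained from calibration."""
    prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    curr_gray = cv2.cvtColor(curr_frame, cv2.COLOR_BGR2GRAY)

    # Feature detection in the first frame.
    pts_prev = cv2.goodFeaturesToTrack(prev_gray, maxCorners=500,
                                       qualityLevel=0.01, minDistance=7)
    # Feature tracking into the second frame (sparse optical flow).
    pts_curr, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray,
                                                      pts_prev, None)
    good_prev = pts_prev[status.ravel() == 1]
    good_curr = pts_curr[status.ravel() == 1]

    # Camera motion estimation from the epipolar constraint.
    E, mask = cv2.findEssentialMat(good_curr, good_prev, K,
                                   method=cv2.RANSAC, prob=0.999, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, good_curr, good_prev, K, mask=mask)
    return R, t  # translation is recovered only up to scale with a single camera
```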
Egomotion is defined as the 3D motion of a camera within an environment. In the field of computer vision, egomotion refers to estimating a camera's motion relative to a rigid scene, for example estimating a car's changing position relative to lines on the road or street signs observed from the car itself. The goal of egomotion estimation is to determine the 3D motion of the camera within the environment using a sequence of images taken by that camera, typically by applying visual odometry techniques to the image sequence. As noted above, this may be done using feature detection to construct an optical flow from two image frames in a sequence generated from either single or stereo cameras. Using stereo image pairs for each frame, for example, helps reduce error and provides additional depth and scale information. Stereo vision uses triangulation based on epipolar geometry to determine the distance to an object.
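For a rectified stereo pair, the triangulation mentioned above reduces to a simple relationship between focal length, camera baseline, and disparity. A minimal sketch, with illustrative units and names:

```python
def stereo_depth(focal_length_px, baseline_m, disparity_px):
    """Depth from a rectified stereo pair: Z = f * B / d, where f is the focal
    length in pixels, B the distance between the two cameras in meters, and d
    the disparity (pixel offset of the same feature between the two images)."""
    return focal_length_px * baseline_m / disparity_px

# Example: f = 700 px, baseline B = 0.12 m, disparity d = 14 px -> Z = 6.0 m.
```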
Currently, most autonomous robot navigation systems are implemented based on high-end sensors, such as a laser scanner, a high accuracy GPS receiver, or an orientation sensor (inertial measurement unit (IMU)). These sensors add to the cost of the robots, making them unaffordable for many applications. Visual odometry offers the potential of added redundancy to increase accuracy and/or lower equipment costs, but at the expense of high computational complexity.
It would be advantageous if visual odometry could be simplified with the use of landmarks having predefined locations to aid in the task of autonomous navigation.
Disclosed herein are a system and method permitting an autonomous vehicle to apply high accuracy corrections derived from visually identifiable navigation buoys, independent of any other navigational methods, such as global positioning satellites (GPS), which may also be in use. As the term is used herein, “navigation buoy” or “buoy” means a visual navigation guidepost or landmark or other marking device positioned on land and not in water. Alternative terms for “navigation buoy” or “buoy” to describe that element of the present invention include, for example, landmark, mark, guidepost, post, beacon, marker and identifier.
With the advent of economical high resolution optical cameras, with economical but high quality optical and digital zoom, an autonomous vehicle can read highly detailed information from a marker (buoy). Therefore, information can be encoded on a marker that permits an autonomous vehicle to extract detailed navigational information. The buoys have a known location (latitude/longitude/height), and the locations of the buoys are known to the autonomous vehicle. The angular relationship between the buoys is known to the autonomous vehicle, as well as the exact distance between two or more markers. The buoys are encoded with identifiers such as labels like “21A” and “21B”, or barcodes that can be read optically. The buoys also have a form of angular degree coding on them.
The autonomous vehicle “sees” one or more buoys and identifies the buoys via the labels or the barcodes. The vehicle determines the distances to the buoys via laser or GPS measurements, or through optical estimation. By reading the angular degree coding from the markers, the vehicle calculates its exact position using basic trigonometry functions (e.g., angle/side/angle). The buoys can be illuminated either internally or externally for nighttime use.
Accordingly, a method is provided for navigation correction assistance. The method provides a vehicle with a camera and an autonomous navigation system comprising a processor, a non-transitory memory, and a navigation buoy database stored in the memory. A navigation application is also stored in the memory. The navigation application visually acquires a first navigation buoy with an identity marker and accesses the navigation buoy database, which cross-references the first navigation buoy identity marker to a first spatial position. A first direction marker on the first navigation buoy is also visually acquired. In response to visually acquiring the first direction marker, a first angle is determined between the camera and the first spatial position. A first distance may also be determined between the vehicle and the first navigation buoy. Then, in response to the first spatial position, the first angle, and the first distance, the spatial position of the vehicle can be known. The first distance may be determined using one of the following: laser measurement (LiDAR), radar measurement, visual measurement (visual odometry), GPS information, or inertial measurement dead reckoning.
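The single-buoy position fix just described can be sketched as follows. The local east/north coordinate frame, the convention that the degree marker facing the camera encodes the bearing from the buoy toward the vehicle, and the function name are assumptions made for illustration only.

```python
import math

def fix_from_single_buoy(buoy_east, buoy_north, bearing_deg, distance_m):
    """Vehicle position from one buoy: the buoy's known position, the compass
    bearing read from its degree markers (assumed here to be the bearing from
    the buoy toward the vehicle, measured clockwise from North), and the
    measured range. Coordinates are in a local east/north frame, in meters."""
    bearing = math.radians(bearing_deg)
    vehicle_east = buoy_east + distance_m * math.sin(bearing)
    vehicle_north = buoy_north + distance_m * math.cos(bearing)
    return vehicle_east, vehicle_north

# Example: buoy at (100 m, 250 m), marker facing the camera reads 90 degrees
# (East), measured range 40 m -> vehicle at (140 m, 250 m), 40 m due East.
```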
If a second navigation buoy with an identity marker is visually acquired, the navigation buoy database can be accessed to cross-reference the second navigation buoy identity marker to a second spatial position. By visually acquiring a second direction marker on the second navigation buoy, a second angle between the camera and the second spatial position can be determined. Then, in response to the first spatial position, the second spatial position, the first angle, and the second angle, the spatial position of the vehicle can be determined.
More explicitly, determining the first angle includes initially identifying a first quadrant with a first quadrant boundary marker, where the first quadrant represents a first predetermined general geographic direction (e.g., North-East) divided into a plurality of degrees with corresponding degree markers. The first quadrant boundary marker represents a predetermined number of degrees associated with an explicit geographic direction (e.g., 0 degrees or North). Then, the number of degree markers between the first quadrant boundary and the first direction marker is counted, and the count is added to the predetermined number of degrees to calculate the first angle. Typically, the first direction marker is a degree marker in the centerline of a vehicle camera image.
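A minimal sketch of this angle determination, assuming one degree marker per degree and illustrative names:

```python
def angle_from_quadrant(boundary_deg, markers_counted):
    """First-angle determination as described above: take the known degree
    value of the identified quadrant boundary (e.g., 0 for North), count the
    degree markers between that boundary and the marker at the camera image
    centerline, and add the count to the boundary value."""
    return (boundary_deg + markers_counted) % 360

# Example: boundary at 0 degrees (North) and 37 markers counted -> 37 degrees.
```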
Additional details of the above-described method, a navigation buoy system, and a system for navigation correction assistance are provided below.
As shown, the direction markers may be organized into quadrants. Using buoy 102a as an example, a first quadrant 204 and a second quadrant 206 are visible. Each quadrant is associated with a general geographic direction (e.g., North-East, South-West) and divided into a plurality of degrees. Each quadrant boundary is associated with a predetermined number of degrees cross-referenced to an explicit geographic direction. Again using buoy 102a as an example, quadrant boundary 208 is associated with 0 degrees, which may be North. The quadrants are sub-divided by degree markers 210. In a 360 degree system, the circumference of a buoy may be circumscribed by 360 degree markers, although this number may vary depending on accuracy requirements.
Each quadrant is uniquely visually marked. Using buoy 102a as an example, quadrant 206 is identified by no horizontal marks (stripes interrupting the degree markers 210) and quadrant 204 is identified by three horizontal marks. By visually identifying these two quadrants, the quadrant boundary 208 can be determined, or more explicitly, the explicit geographic direction (0 degrees, North) of the quadrant boundary is known. To continue the example using buoy 102c, third quadrant 212 and fourth quadrant 214 can be visually identified, which permits the identification of quadrant boundary 216, and therefore, the explicit geographic direction associated with 180 degrees (South).
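The quadrant identification can be thought of as a lookup from the pair of stripe counts to the direction of the boundary between them. In the sketch below, only the 0-stripe/3-stripe pairing follows the example above; the remaining stripe codings are hypothetical placeholders.

```python
# Hypothetical lookup: each adjacent pair of uniquely striped quadrants
# identifies one quadrant boundary and its compass direction. Only the
# {0, 3} entry follows the example above; the other entries are assumed.
QUADRANT_BOUNDARIES = {
    frozenset({0, 3}): 0,    # boundary 208: 0 degrees (North)
    frozenset({3, 1}): 90,   # East (assumed stripe coding)
    frozenset({1, 2}): 180,  # boundary 216: 180 degrees (South, assumed coding)
    frozenset({2, 0}): 270,  # West (assumed stripe coding)
}

def boundary_from_stripes(stripes_a, stripes_b):
    """Return the compass direction (in degrees) of the boundary between two
    visually identified quadrants, given their horizontal stripe counts."""
    return QUADRANT_BOUNDARIES[frozenset({stripes_a, stripes_b})]
```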
The navigation application 412 visually acquires the first navigation buoy 102a and accesses the navigation buoy database 410 to determine a first spatial position associated with the first navigation buoy identity marker 200a. The navigation application 412 also visually acquires a first direction marker on the first navigation buoy 102a and, in response, determines a first angle 418 between the camera 404 and the first spatial position.
The navigation application 412 also has an interface to accept a measurement of a first distance 422 between the vehicle 402 and the first navigation buoy 102a. In response to knowing the first spatial position of the first buoy 102a, the first angle 418, and the first distance 422, the navigation application can determine the spatial position of the vehicle 402. A measurement device has an output interface to supply position-related data for calculating the first distance measurement. The measurement device may be a hardware device 424 such as a laser range finder (laser detection and ranging (LiDAR)), radar, a global positioning satellite (GPS) receiver, or an inertial measurement unit (IMU). The laser and radar devices actually measure the first distance 422, while GPS and IMU data may be used to calculate the first distance by comparing the known location of a buoy to the estimated location of the vehicle 402. Alternatively, the measurement device may be a distance visual calculation (visual odometry (VO)) module 426 stored in memory 408 and enabled as a sequence of processor executable instructions. In another aspect, the first navigation buoy 102a has a visible feature with a predetermined size, such as a predetermined height or diameter. The navigation application 412, perhaps in cooperation with the VO module 426, determines the first distance 422 between the vehicle 402 and the first navigation buoy 102a by calculating the relationship between the predetermined size of the buoy feature and the amount of image space occupied by the buoy feature. If the camera is equipped with a zoom lens, the first distance can be calculated in response to determining the degree of zoom required to occupy a predetermined portion of the camera image. In this case, the navigation application may exercise control over the zoom function of the camera.
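As an illustration of the size-based estimate, a simple pinhole-camera relationship can be used; the focal length value, the feature height, and the function name below are assumptions.

```python
def distance_from_apparent_size(focal_length_px, feature_height_m, pixel_height):
    """Range estimate from the known physical size of a buoy feature and the
    amount of image space it occupies (pinhole model): d = f * H / h_pixels.
    With a zoom lens, the effective focal length must reflect the zoom setting."""
    return focal_length_px * feature_height_m / pixel_height

# Example: f = 1200 px, buoy feature 2.0 m tall, occupying 150 px -> 16.0 m.
```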
As described above, the system is able to determine the position of the vehicle by visually acquiring a single buoy. However, the system typically comprises a plurality of buoys, and the acquisition of more than one buoy may improve the accuracy of the vehicle position calculation. Shown is a second navigation buoy 102n having a predetermined spatial position, an identity marker 200n, and a circumference with predetermined direction markers (i.e., second direction marker 428). The navigation application 412 visually acquires the second navigation buoy 102n, and accesses the navigation buoy database 410 to cross-reference the second navigation buoy identity marker 200n to a second spatial position. The navigation application 412 visually acquires a second direction marker 428 on the second navigation buoy 102n, determines a second angle 430 between the camera 404 and the second spatial position, and in response to the first spatial position, the second spatial position, the first angle 418, and the second angle 430, determines the spatial position of the vehicle 402. That is, the spatial position of the vehicle can be calculated with greater accuracy knowing both the first angle 418 and the second angle 430. Even greater accuracy is obtained if the second distance 432, between the second buoy 102n and the camera 404, is known. Although not explicitly shown in this figure, the vehicle may visually acquire three or more buoys. Each additionally acquired buoy improves the accuracy of the vehicle position calculation. However, the degree of improvement diminishes with each added buoy.
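The two-buoy (angle/side/angle) calculation can be sketched as the intersection of two rays, one from each buoy along its read bearing; the east/north frame, the bearing convention, and the names below are illustrative assumptions.

```python
import numpy as np

def fix_from_two_buoys(p1, b1_deg, p2, b2_deg):
    """Angle/side/angle fix: each buoy's known position plus the bearing read
    from its degree markers (assumed to be the bearing from the buoy toward
    the vehicle, clockwise from North) defines a ray; the vehicle lies at the
    intersection of the two rays. p1 and p2 are (east, north) in meters."""
    d1 = np.array([np.sin(np.radians(b1_deg)), np.cos(np.radians(b1_deg))])
    d2 = np.array([np.sin(np.radians(b2_deg)), np.cos(np.radians(b2_deg))])
    # Solve p1 + t1*d1 = p2 + t2*d2 for the scalar ray parameters t1 and t2.
    A = np.column_stack((d1, -d2))
    t = np.linalg.solve(A, np.asarray(p2, float) - np.asarray(p1, float))
    return np.asarray(p1, float) + t[0] * d1  # vehicle (east, north)

# Example: buoys at (0, 0) and (100, 0) with bearings of 45 and 315 degrees
# place the vehicle at (50, 50), midway between the buoys and 50 m north.
```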
The controller 420 may be enabled as a personal computer (PC), Mac computer, tablet, workstation, server, PDA, or handheld device. The processor 407 may be connected to memory 408 and IO 414 via an interconnect bus 436. The memory 408 may include a main memory, a read only memory, and mass storage devices such as various disk drives, tape drives, etc. The main memory typically includes dynamic random access memory (DRAM) and high-speed cache memory. In operation, the main memory stores at least portions of instructions and data for execution by the processor 407. The IO 414 may be a modem, an Ethernet card, wireless (Bluetooth or WiFi), or any other appropriate data communications device such as USB. The physical communication links may be optical, wired, or wireless.
As noted above, the buoys or markers have a form of angular degree coding on them, with angular information presented based on quadrants, which permits an autonomous vehicle to instantly determine its positional relationship with a buoy by measuring the distance to the buoy. The distance measurement can be made using a laser-based measuring method or an optical camera to estimate the distance. In one aspect, the exact size of a buoy is known and the distance to the buoy can be calculated based on how much zoom is needed to accomplish a specific task, such as reading a barcode or having the buoy fill a known area of the image. Position-related measurements made by devices other than the camera can be adjusted based upon the offset between the auxiliary device and the camera.
An offset can be included in the calculations to move the calculated position to either the center of the vehicle, or to the position where a GPS receiver or other position or distance measurement device is located on the vehicle, to provide an exact vehicle fix.
Step 702 provides a vehicle with a camera and an autonomous navigation system comprising a processor, a non-transitory memory, a visual navigation buoy database stored in the memory, and a navigation application enabled as a sequence of processor executable instructions stored in the memory for autonomously navigating. In Step 704 the navigation application visually acquires a first visual navigation buoy with an identity marker. Step 706 accesses the navigation buoy database to cross-reference the first navigation buoy identity marker to a first spatial position. Step 708 visually acquires a first direction marker on the first navigation buoy. In response to visually acquiring the first direction marker, Step 710 determines a first angle between the camera and the first spatial position. Step 712 determines a first distance between the vehicle and the first navigation buoy. In response to the first spatial position, the first angle, and the first distance, Step 714 determines a spatial position of the vehicle.
In one aspect, Step 712 determines the first distance using laser measurement, radar measurement, visual (VO) measurement, GPS information, or inertial measurement dead reckoning. Alternatively, in Step 704 or 708 the navigation application visually acquires a navigation buoy having a visible feature with a predetermined size. Then, determining the first distance between the vehicle and the first navigation buoy in Step 712 includes calculating a relationship between the predetermined size of the buoy feature and the amount of camera image space occupied by the buoy feature. In another variation where the camera has a zoom lens, Step 704 or 708 adjusts the camera zoom to visually acquire the navigation buoy visible feature, and Step 712 calculates the relationship between the predetermined size of the buoy visible feature and the degree of zoom required to occupy a predetermined portion of camera image space.
In one aspect, Step 702 provides a vehicle with a reference measurement system and an auxiliary measurement system offset from the reference measurement system by a predetermined amount. Then, determining the first distance between the vehicle and the first navigation buoy in Step 712 includes the following substeps. Step 712a accepts position-related data from the reference and auxiliary measurement systems. Step 712b applies corrections to account for the offset, and Step 712c merges the reference and auxiliary measurement system position-related data. One example of an auxiliary measurement system may be a GPS receiver, and one example of a reference measurement system may be the camera.
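A minimal sketch of Steps 712a through 712c, under the assumptions of a camera-derived fix as the reference, a GPS fix as the auxiliary, a fixed body-frame lever arm between them, and a simple weighted average as the merge (a Kalman filter would be a typical refinement); all names and weights are illustrative.

```python
import numpy as np

def merge_position_estimates(camera_fix, gps_fix, gps_offset_body, heading_deg,
                             w_camera=0.5, w_gps=0.5):
    """Step 712a: accept the camera-derived fix and the GPS fix (east, north, m).
    Step 712b: rotate the known (forward, left) lever arm from the camera to the
    GPS antenna into the east/north frame and correct the GPS fix back to the
    camera's location. Step 712c: merge the two estimates (weighted average)."""
    h = np.radians(heading_deg)  # vehicle heading, clockwise from North
    rot = np.array([[np.sin(h), -np.cos(h)],    # maps (forward, left) -> east
                    [np.cos(h),  np.sin(h)]])   # maps (forward, left) -> north
    gps_at_camera = np.asarray(gps_fix, float) - rot @ np.asarray(gps_offset_body, float)
    return w_camera * np.asarray(camera_fix, float) + w_gps * gps_at_camera
```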
In another aspect, Step 716 visually acquires a second navigation buoy with an identity marker. Step 718 accesses the navigation buoy database to cross-reference the second navigation buoy identity marker to a second spatial position. Step 720 visually acquires a second direction marker on the second navigation buoy. In response to visually acquiring the second direction marker, Step 722 determines a second angle between the camera and the second spatial position. Then, in response to the first spatial position, the second spatial position, the first angle, and the second angle, Step 714 determines the spatial position of the vehicle (with additional data points). The method may be extended to visually acquire additional buoys.
Determining the first angle in Step 710 may include substeps. Step 710a initially identifies a first quadrant with a first quadrant boundary, where the first quadrant represents a first predetermined general geographic direction divided into a plurality of degrees with corresponding degree markers. The first quadrant boundary represents a predetermined number of degrees associated with an explicit geographic direction. Step 710b counts the number of degree markers between the first quadrant boundary and the first direction marker, creating a count. Step 710c adds the count to the predetermined number of degrees to calculate the first angle. Typically, the first direction marker is visually acquired by selecting the degree marker in a centerline of a vehicle camera image.
A system and method have been provided for aiding autonomous navigations using buoys with predetermined locations and angular markings. Examples of particular hardware units and measurement techniques have been presented to illustrate the invention. However, the invention is not limited to merely these examples. Other variations and embodiments of the invention will occur to those skilled in the art.