The invention relates to a technique for generating a map of an environment using a plurality of sub-maps. In particular, the invention relates to a system and method for combining sensor data into a plurality of sub-maps based upon the location of the sensor when the data was acquired and the certainty with which its location was known.
In the past few years, a substantial research effort has been devoted to the problem of Simultaneous Localization and Mapping (SLAM). The term “map” in the field of SLAM generally refers to a spatial arrangement of observed landmarks or features. If these landmarks correspond to obstacle locations (such as the measurements collected with a Laser Range Finder), then the “map” yields an occupancy map denoting the floor plan of the space in which the robot is operating. In other cases, in which the landmark information does not correspond to obstacle locations (such as the measurements taken with a camera), the “map” estimated with SLAM techniques is dissociated from the locations of obstacles (occupancy map). However, an occupancy map is required for the robot to properly make decisions and navigate the environment.
A number of SLAM techniques have been proposed for simultaneously estimating the poses (i.e. localization) and building the map. Some methods re-estimate past poses instead of only the latest pose as new information is collected, achieving an improvement in the estimate of the robot trajectory as the localization system is updated. Laser scans, for example, are collected as a mobile robot moves through an indoor environment. These scans are combined with odometry information to estimate the robot's trajectory to yield a map showing the floor plan of the building. As more information is collected, the accuracy of the map improves because the estimates of the past poses of the robot are improved. A disadvantage of this system is that all sensor readings and their associated poses must be stored to allow the sensor data to be re-processed when new information arrives. This results in storage requirements that grow linearly with time. There is therefore a need for a localization and mapping technique that efficiently creates an occupancy map using new information to improve accuracy of the map without the storage requirement growing linearly with time.
The invention in the preferred embodiment features a system and method for mapping parameter data acquired by a robot or other mapping system that travels through an environment. The method generally comprises: measuring parameters that characterize the environment while driving the robot through the environment; generating estimates of the current robot pose; and mapping parameter data to a current grid associated with an anchor node until the estimated pose uncertainty between the current pose and the prior anchor node exceeds a threshold. When the threshold is exceeded, the robot generates a new grid associated with a new anchor node to record parameter data. The robot repeatedly generates new grids associated with different anchor nodes for purposes of recording parameter data. The estimated positions of the anchor nodes are updated over time as the robot refines its estimates of the locations of landmarks from which it estimates its position in the environment. When an occupancy map or other global parameter map is required, the robot merges the local grids into a comprehensive map indicating the parameter data in a global reference frame.
In accordance with some embodiments of the invention, the robot may map new parameter data to a new local parameter grid or to a pre-existing parameter grid. Data is recorded to a pre-existing parameter grid if the uncertainty between the current robot pose estimate and the pose estimate associated with the pre-existing grid is below a predetermined threshold. By using pre-existing grids, the robot can limit the memory requirements necessary to map the environment without the memory requirements growing linearly in time.
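For illustration, a minimal sketch of this grid-selection rule follows; the function names, grid attributes, and threshold value are assumptions for the sketch, not part of the disclosure.

```python
# Sketch only: names and the threshold value are illustrative assumptions.
UNCERTAINTY_THRESHOLD = 0.5  # assumed tuning constant; units depend on the metric used

def select_grid(current_pose, grids, relative_pose_uncertainty):
    """Return a pre-existing grid whose anchor pose is sufficiently certain
    relative to the current pose, or None if a new grid/anchor is needed."""
    best = min(grids,
               key=lambda g: relative_pose_uncertainty(current_pose, g.anchor),
               default=None)
    if best is not None and relative_pose_uncertainty(current_pose, best.anchor) < UNCERTAINTY_THRESHOLD:
        return best   # record to the pre-existing grid, bounding memory growth
    return None       # caller creates a new anchor node and grid
```

Under this rule, new grids are created only when the robot's pose is too uncertain relative to every existing anchor, which is what keeps memory from growing linearly with time.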
The present invention is illustrated by way of example and not limitation in the figures of the accompanying drawings, and in which:
Illustrated in
Data from the sensors 112, 114 may undergo preprocessing at processing unit 116. For example, the processing unit 116 may extract visual features from the image data for purposes of recognizing known landmarks, and process odometry data to convert wheel encoder signals or other odometry data to distance and rotation estimates. In some embodiments, odometry data may be used to detect and compensate for situations in which the drive wheels slip due to wet, slick, or carpeted surfaces. Data from the bump sensor 118 may undergo preprocessing at the processing unit 120 to determine when the robot encounters an obstacle as well as the position of the obstacle with respect to the robot path.
In other embodiments, the set of sensors 110 includes range finders, including laser, infrared (IR), and acoustic range finders; proximity sensors including lateral proximity sensors for determining lateral distance to objects in the environment; drop sensors for detecting staircases and other locations that are unsuitable for travel by the robot; and floor surface sensors, including sensors for measuring dirt concentration, slippage, and soil characteristics.
The mobile robot system 100 further includes at least one processor 130 configured to perform localization, generate maps of properties characterizing the environment in which the robot is operating, and navigate through the environment. In the preferred embodiment, the localization module 132 determines the locations of landmarks, as well as the location of the mobile robot, from visual and odometry data using a technique called Simultaneous Localization and Mapping (SLAM) 134 taught in U.S. Pat. No. 7,135,992, hereby incorporated by reference herein. Using this technique, the robotic system explores its environment, takes numerous images of its environment, makes a map depicting landmarks in the environment, and estimates the location of the robot relative to those landmarks. In the preferred embodiment, landmarks are visually identified using visual features that are extracted from the image data and matched using a Scale-Invariant Feature Transform (SIFT), Speeded Up Robust Features (SURF), Gradient Location and Orientation Histogram (GLOH), Binary Robust Independent Elementary Features (BRIEF), or other type of visual feature known to those skilled in the art. The visual landmarks, along with estimates of the position and orientation (pose) of the robot when the image was taken, are stored in the landmark database 142.
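As a sketch of one possible visual front end for this step (the disclosure names SIFT, SURF, GLOH, and BRIEF but does not prescribe a library; OpenCV and the ratio-test threshold below are assumptions):

```python
import cv2

def match_landmark_features(img1, img2):
    """Extract and match SIFT features between two grayscale images, one
    plausible front end for recognizing previously seen visual landmarks."""
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(img1, None)
    kp2, des2 = sift.detectAndCompute(img2, None)
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    matches = matcher.knnMatch(des1, des2, k=2)
    # Lowe's ratio test keeps only distinctive correspondences
    good = [m for m, n in matches if m.distance < 0.75 * n.distance]
    return kp1, kp2, good
```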
The parameter mapping module 136 is configured to generate a plurality of sub-maps or grids comprising local parameters and to build global parameter maps based on those grids. In particular, the module 136 builds grids that depict the properties of the environment in proximity to associated anchor nodes, i.e., reference points fixed in their respective local reference frames. Estimates of the locations of the anchor nodes within the global reference frame are continually updated as the SLAM module 134 refines the localization map characterizing the environment. In the preferred embodiment, the parameters being mapped by the mapping module 136 include obstacles and clear spaces through which the robot system is free to navigate, as is explained in more detail below. Each of the anchor nodes is stored in the node database 144, and the associated grid is stored in the grid database 146. In the preferred embodiment, the mapping module includes an uncertainty tracking module 138 for measuring the uncertainty associated with each anchor node's localization estimate, which is stored together with the anchor node's coordinates and heading in the global reference frame.
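One way the records in the node and grid databases might be structured is sketched below; the field names, types, and default values are assumptions, not the disclosed schema.

```python
from dataclasses import dataclass, field
import numpy as np

@dataclass
class AnchorNode:
    """Reference point fixed in its local frame; its global pose estimate
    (x, y, heading) and uncertainty are refined by the SLAM back end."""
    node_id: int
    pose: np.ndarray        # (x, y, theta) in the global reference frame
    covariance: np.ndarray  # 3x3 pose covariance (localization uncertainty)

@dataclass
class Grid:
    """Parameter cells recorded relative to an anchor node's local frame."""
    anchor: AnchorNode
    resolution: float = 0.05                    # meters per cell (assumed value)
    cells: dict = field(default_factory=dict)   # (i, j) -> parameter value, e.g. occupancy in [0, 1]
```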
The processor 130 in the preferred embodiment further includes a navigation module 140 configured to generate signals that control the movement of the robot. For example, the navigation module can provide control signals to instruct the robot to move forward, to stop, to move backward, to turn, or to rotate about a vertical axis. If the mobile robot system is an autonomous or semi-autonomous robot, the navigation module 140 can also perform path planning to efficiently guide the robot system to a desired destination and/or to achieve a desired goal. In accordance with the preferred embodiment, path planning is based on a parameter map that is generated from a plurality of parameter grids using current estimates of the poses of the anchor nodes corresponding to those grids.
The robot system 100 further includes a drive mechanism 150 for moving the robot around its environment, which may be indoors, outdoors, or a combination thereof. In the preferred embodiment, the drive mechanism includes two or more drive wheels 152 powered by a motor 154 and battery pack 156, for example. In addition to, or instead of, drive wheels, the robot system may incorporate other forms of locomotion, including tracks, rollers, propellers, legs, and the like. The drive system may further include one or more optical wheel encoders 158, for example, for measuring the wheel rotation and estimating the distance traveled by the robot system. In addition, the difference in the rotation of opposing wheels can indicate changes in heading.
With wheel encoders 158 or another form of dead reckoning, the robot system can compute course and distance traveled from a previous position and orientation (pose) and use this information to estimate a current pose. While relatively accurate over relatively short distances, dead reckoning sensing is prone to drift over time. Other forms of dead reckoning can include a pedometer (for walking robots), measurements from an inertial measurement unit, optical sensors such as those used in optical mouse devices, and the like.
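For a differential-drive robot, the dead-reckoning pose update from wheel travel can be sketched as follows; this is the textbook formulation, not code from the disclosure.

```python
import math

def dead_reckon(pose, d_left, d_right, wheel_base):
    """Advance a (x, y, theta) pose given left/right wheel travel distances,
    the standard differential-drive odometry update."""
    x, y, theta = pose
    d = (d_left + d_right) / 2.0              # distance traveled by the robot center
    dtheta = (d_right - d_left) / wheel_base  # heading change from the wheel difference
    x += d * math.cos(theta + dtheta / 2.0)   # midpoint heading reduces integration error
    y += d * math.sin(theta + dtheta / 2.0)
    return (x, y, theta + dtheta)
```

Each such update compounds small encoder errors, which is why the pose uncertainty grows with distance traveled and eventually triggers the creation of a new anchor node.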
In the preferred embodiment, the robot system 210 tracks its current location, path, or combination thereof with respect to a global reference frame represented by Cartesian (x-y) coordinates 250, as shown in
By contrast, a grid in the preferred embodiment includes a map of local parameter data located relative to an anchor node in a local reference frame. As shown in
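The relationship between a cell recorded in a local frame and the global frame is a standard 2D rigid-body transform, sketched below; the function and variable names are assumptions.

```python
import math

def cell_to_global(anchor_pose, i, j, resolution):
    """Map a cell index (i, j), defined relative to an anchor node's local
    frame, to global (x, y) coordinates using the anchor's current pose
    estimate (x, y, theta)."""
    ax, ay, atheta = anchor_pose
    lx, ly = i * resolution, j * resolution   # local Cartesian offset in meters
    gx = ax + lx * math.cos(atheta) - ly * math.sin(atheta)
    gy = ay + lx * math.sin(atheta) + ly * math.cos(atheta)
    return gx, gy
```

Because each cell is stored relative to its anchor node, a later refinement of the anchor's global pose estimate repositions the entire grid at once without rewriting any of the recorded parameter data.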
Although the grids in the preferred embodiment are shown as two-dimensional (2D) Cartesian sub-maps, the grids may effectively record local parameter data using other reference systems, including spherical and cylindrical coordinate systems, for example. The parameter data is represented with pixels in a Cartesian coordinate system in the preferred embodiment. In alternative embodiments, grids may represent local parameter data as (1) pixels in a cylindrical coordinate system, (2) polygons with an arbitrary number of sides, or (3) other arbitrary shapes, for example.
Referring to
The robotic system generates a map of one or more parameters of interest in parallel with the location determination. In particular, the parameter mapping module senses properties of the environment and generates a parameter map depicting those properties. Referring to
Successive poses, like Pose 2 and Pose 3, generally have a relatively low relative uncertainty (due to the accuracy of the dead reckoning sensors) and may, therefore, be combined into a single summary in many cases. As the localization information generated by the location module improves over time, the uncertainty of the relative pose between the anchor nodes of two summaries will decrease. When the relative pose between two anchor nodes becomes sufficiently certain, i.e., the relative uncertainty drops below a threshold, the summaries associated with multiple nodes may be combined into a single summary that is then associated with a single anchor node. As shown in
As described above, the parameter data from a plurality of grids may be merged into a single summary associated with a single anchor node based on the relative pose uncertainty. Other criteria may also be used when determining whether to combine grids. These criteria may include, but are not limited to: (a) whether the summary reduces the memory requirements, i.e., whether the number of anchor nodes and grids is reduced; (b) whether the summary improves performance, i.e., whether the summary reduces the time needed to compute a complete parameter map; (c) whether the map quality improves, i.e., whether merging or eliminating relatively “old” and outdated maps while retaining relatively “newer” maps improves the accuracy of the parameter map; or (d) any combination thereof.
Illustrated in
In accordance with the preferred embodiment, the parameter mapping module 136 identifies nodes having a relative pose uncertainty below a threshold, combines the sensor data for these poses into a single grid, and associates the grid with a single anchor node. The parameter data from grids 520-523, for example, can be combined by overlaying the respective grids 520-523 as shown by the superposition 530 of grids. As one skilled in the art will appreciate, the plurality of grids may overlap in physical extent, possess different orientations in their respective local reference frames, and be of different sizes. Thereafter, data from the superposition 530 of grids may be combined into a single spatial summary associated with a new anchor node, for example. In the alternative, the superposition of spatial summaries may be used to build a global parameter map used to, for example, plan a new path for the robot through the environment. Exemplary parameter maps are shown and discussed in reference to
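A sketch of this superposition step appears below, reusing the `Grid`/`AnchorNode` structures and `cell_to_global` helper from the earlier sketches; the overwrite policy for overlapping cells is an assumption.

```python
import math

def merge_into_summary(grids, new_anchor, resolution):
    """Overlay several grids in the global frame and re-express the result as
    a single grid anchored at new_anchor. Grids are assumed ordered
    oldest-to-newest, so newer readings win at overlapping cells."""
    ax, ay, atheta = new_anchor.pose
    cos_a, sin_a = math.cos(-atheta), math.sin(-atheta)
    merged = {}
    for g in grids:
        for (i, j), value in g.cells.items():
            gx, gy = cell_to_global(g.anchor.pose, i, j, g.resolution)
            dx, dy = gx - ax, gy - ay        # offset from the new anchor
            lx = dx * cos_a - dy * sin_a     # rotate into the new local frame
            ly = dx * sin_a + dy * cos_a
            merged[(round(lx / resolution), round(ly / resolution))] = value
    return merged
```

Note that the source grids may overlap, differ in orientation, and differ in size; expressing every cell in the new anchor's frame handles all three cases uniformly.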
Like
For example, cells 520, 521 in the grids associated with anchor nodes A1 and A2 show occupied areas (or unsearched areas) in
At any point in time, the grids may be combined to generate a complete parameter map of the entire environment or a portion of the environment for purposes of path planning, for example. A representative parameter map is shown in
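A corresponding sketch for rendering a complete parameter map, again reusing the `cell_to_global` helper, is shown below; the max-occupancy conflict policy is an assumption.

```python
def build_global_map(grids, resolution):
    """Rasterize every local grid into one global occupancy map using the
    current anchor pose estimates (illustrative sketch only)."""
    global_map = {}
    for g in grids:
        for (i, j), value in g.cells.items():
            gx, gy = cell_to_global(g.anchor.pose, i, j, g.resolution)
            key = (round(gx / resolution), round(gy / resolution))
            # conservative policy: any occupied report marks the cell occupied
            global_map[key] = max(global_map.get(key, 0.0), value)
    return global_map
```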
Illustrated in
While the robotic system navigates 802 through the environment, it measures 816 local parameters using on-board sensors, including the bump sensor. Using the estimate of the current pose, the parameter mapping module searches for and identifies 818 an existing anchor node having the lowest relative pose uncertainty with respect to the current node. The identified node may be the preceding node in the robot path, or a prior node that is closest in distance to the current node. If the relative pose uncertainty between the current node and a prior node is below a predetermined threshold, the decision block 820 is answered in the affirmative. In this case, the grid associated with the prior anchor node is selected 822 to be the current grid, and incoming sensor data is mapped 826 to this current grid. The uncertainty is determined from the covariance matrix describing the positional uncertainties associated with the localization using the visual SLAM module and odometry sensors, for example. If, however, the uncertainty exceeds the predetermined threshold, the decision block 820 is answered in the negative. In this case, a new anchor node is generated 824 and the incoming sensor data is mapped 826 to a new grid associated with the new anchor node. The process of mapping 826 incoming parameter data continues while the uncertainty remains sufficiently low. Over relatively short distances, dead reckoning measurements, such as those obtained from odometry readings, can be quite accurate. As such, the uncertainty remains low and incoming sensor data is generally used to populate the current parameter grid. New nodes tend to be generated after the robot has traveled some distance in a previously unexplored area. New anchor nodes 830 are recorded in the node database 144, and new and updated grids 828 are recorded in the grid database 146.
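Pulling these steps together, one pass of this flow might look like the following sketch; the `robot` object, its SLAM accessors, and the trace-of-covariance uncertainty scalar are all assumptions, not the disclosed implementation.

```python
import numpy as np

def process_measurement(robot, sensor_data):
    """One pass of the mapping flow sketched above: pick or create an anchor
    node based on relative pose uncertainty, then record the measurement."""
    pose, cov = robot.slam.current_pose_estimate()

    def rel_uncertainty(node):
        # One plausible scalar measure: trace of the relative pose covariance.
        return np.trace(robot.slam.relative_covariance(pose, node))

    nearest = min(robot.nodes, key=rel_uncertainty, default=None)
    if nearest is not None and rel_uncertainty(nearest) < robot.threshold:
        grid = robot.grids[nearest.node_id]         # decision answered yes: reuse grid
    else:
        nearest = robot.new_anchor_node(pose, cov)  # recorded in the node database
        grid = robot.new_grid(nearest)              # recorded in the grid database
    grid.add(sensor_data, relative_to=nearest)      # map incoming data in the local frame
```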
On occasion, the parameter data from a plurality of local grids is merged 832 into one or more spatial summaries. As discussed in detail in
The robotic system of the present invention can be implemented in systems that include hardware, software, firmware, or a combination thereof. Hardware can include one or more general-purpose computers, microprocessors, application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), and the like, as well as combinations thereof linked by networking systems, for example. Software may include computer-readable instructions for execution on various processors, computers, servers, or like circuit boards or chips. The computer-readable instructions may be stored in volatile or non-volatile memory, including memory chips, hard drives, and compact discs, for example.
The present invention may also be implemented on a plurality of platforms, including a distributed platform in which two or more network-enabled robots cooperate with a remote central processing unit (CPU), for example, to collect landmark information from a relatively large environment. The CPU may include a personal computer, mobile phone, tablet computer, server, or like device that performs the computations of the processor 130. In some embodiments, the present invention is implemented with a fleet of robots that periodically exchange positioning information and parameter maps (either rendered as a single map or as a collection of individual sub-maps) while traversing the environment so that each robot has information on all the parameters explored by other robots.
Although the description above contains many specific details, these should not be construed as limiting the scope of the invention but as merely providing illustrations of some of the presently preferred embodiments of this invention.
Therefore, the invention has been disclosed by way of example and not limitation, and reference should be made to the following claims to determine the scope of the present invention.
This application is a divisional of U.S. patent application Ser. No. 15/225,158, filed Aug. 1, 2016, which is a continuation of U.S. patent application Ser. No. 14/944,152, filed Nov. 17, 2015, now U.S. Pat. No. 9,404,756, which is a continuation of U.S. patent application Ser. No. 14/307,402, filed on Jun. 17, 2014, now U.S. Pat. No. 9,218,003, which is a continuation of U.S. patent application Ser. No. 13/632,997, filed on Oct. 1, 2012, now U.S. Pat. No. 8,798,840, which claims the benefit of U.S. Provisional Application No. 61/541,749, filed on Sep. 30, 2011, the contents of which are hereby incorporated by reference in their entirety.
Number | Name | Date | Kind |
---|---|---|---|
4628453 | Kamejima et al. | Dec 1986 | A |
4815840 | Benayad-Cherif et al. | Mar 1989 | A |
4846297 | Field et al. | Jul 1989 | A |
4942539 | McGee et al. | Jul 1990 | A |
4954962 | Evans, Jr. et al. | Sep 1990 | A |
5040116 | Evans, Jr. et al. | Aug 1991 | A |
5083257 | Kennedy | Jan 1992 | A |
5109425 | Lawton | Apr 1992 | A |
5111401 | Everett, Jr. et al. | May 1992 | A |
5144685 | Nasar et al. | Sep 1992 | A |
5155775 | Brown | Oct 1992 | A |
5170352 | McTamaney et al. | Dec 1992 | A |
5321614 | Ashworth | Jun 1994 | A |
5517419 | Lanckton et al. | May 1996 | A |
5525882 | Asaka et al. | Jun 1996 | A |
5525883 | Avitzour | Jun 1996 | A |
5581629 | Hanna et al. | Dec 1996 | A |
5677836 | Bauer | Oct 1997 | A |
5793934 | Bauer | Aug 1998 | A |
5911767 | Garibotto et al. | Jun 1999 | A |
5957984 | Rencken | Sep 1999 | A |
5961571 | Gorr et al. | Oct 1999 | A |
6005610 | Pingali | Dec 1999 | A |
6009359 | El-Hakim et al. | Dec 1999 | A |
6108597 | Kirchner et al. | Aug 2000 | A |
6243657 | Tuck et al. | Jun 2001 | B1 |
6256581 | Fujii et al. | Jul 2001 | B1 |
6266068 | Kang et al. | Jul 2001 | B1 |
6269763 | Woodland | Aug 2001 | B1 |
6285393 | Shimoura et al. | Sep 2001 | B1 |
6288704 | Flack et al. | Sep 2001 | B1 |
6299699 | Porat et al. | Oct 2001 | B1 |
6301370 | Steffens et al. | Oct 2001 | B1 |
6330858 | McDonough et al. | Dec 2001 | B1 |
6427118 | Suzuki | Jul 2002 | B1 |
6453223 | Kelly et al. | Sep 2002 | B1 |
6459955 | Bartsch et al. | Oct 2002 | B1 |
6463368 | Feiten et al. | Oct 2002 | B1 |
6496754 | Song et al. | Dec 2002 | B2 |
6516267 | Cherveny et al. | Feb 2003 | B1 |
6552729 | Di Bernardo et al. | Apr 2003 | B1 |
6711293 | Lowe | Mar 2004 | B1 |
6728635 | Hamada et al. | Apr 2004 | B2 |
6742613 | Erlich et al. | Jun 2004 | B2 |
6766245 | Padmanabhan | Jul 2004 | B2 |
6771932 | Caminiti et al. | Aug 2004 | B2 |
6836701 | McKee | Dec 2004 | B2 |
6856901 | Han | Feb 2005 | B2 |
6898518 | Padmanabhan | May 2005 | B2 |
6904360 | Pechatnikov et al. | Jun 2005 | B2 |
6915008 | Barman et al. | Jul 2005 | B2 |
6917855 | Gonzalez-Banos | Jul 2005 | B2 |
6922632 | Foxlin | Jul 2005 | B2 |
7015831 | Karlsson et al. | Mar 2006 | B2 |
7031496 | Shimano et al. | Apr 2006 | B2 |
7082350 | Skoog | Jul 2006 | B2 |
7135992 | Karlsson et al. | Nov 2006 | B2 |
7145478 | Goncalves et al. | Dec 2006 | B2 |
7162056 | Burl et al. | Jan 2007 | B2 |
7162338 | Goncalves et al. | Jan 2007 | B2 |
7177737 | Karlsson et al. | Feb 2007 | B2 |
7211980 | Bruemmer et al. | May 2007 | B1 |
7225552 | Kwon et al. | Jun 2007 | B2 |
7272467 | Goncalves et al. | Sep 2007 | B2 |
7573403 | Goncalves et al. | Aug 2009 | B2 |
7679532 | Karlsson et al. | Mar 2010 | B2 |
7689321 | Karlsson | Mar 2010 | B2 |
7720554 | Dibernardo et al. | May 2010 | B2 |
7774158 | Domingues Goncalves et al. | Aug 2010 | B2 |
7827643 | Erlich et al. | Nov 2010 | B2 |
7912633 | Dietsch et al. | Mar 2011 | B1 |
7996097 | Dibernardo et al. | Aug 2011 | B2 |
8086419 | Goncalves et al. | Dec 2011 | B2 |
8095336 | Goncalves et al. | Jan 2012 | B2 |
8150650 | Goncalves et al. | Apr 2012 | B2 |
8271132 | Nielsen | Sep 2012 | B2 |
8274406 | Karlsson et al. | Sep 2012 | B2 |
8295955 | Dibernardo et al. | Oct 2012 | B2 |
8380350 | Ozick et al. | Feb 2013 | B2 |
8396592 | Jones et al. | Mar 2013 | B2 |
8412377 | Casey et al. | Apr 2013 | B2 |
8428778 | Landry et al. | Apr 2013 | B2 |
8452450 | Dooley et al. | May 2013 | B2 |
8705893 | Zhang | Apr 2014 | B1 |
8798840 | Fong et al. | Aug 2014 | B2 |
8965578 | Versteeg | Feb 2015 | B2 |
9080874 | Haverinen | Jul 2015 | B2 |
9218003 | Fong et al. | Dec 2015 | B2 |
9395190 | Young | Jul 2016 | B1 |
9404756 | Fong et al. | Aug 2016 | B2 |
9952053 | Fong et al. | Apr 2018 | B2 |
10037028 | Loianno | Jul 2018 | B2 |
10203209 | Roumeliotis | Feb 2019 | B2 |
10203210 | Tagawa | Feb 2019 | B1 |
10222211 | Chen | Mar 2019 | B2 |
10584971 | Askeland | Mar 2020 | B1 |
20020072848 | Hamada et al. | Jun 2002 | A1 |
20020095239 | Wallach et al. | Jul 2002 | A1 |
20030007682 | Koshizen et al. | Jan 2003 | A1 |
20030025472 | Jones et al. | Feb 2003 | A1 |
20030030398 | Jacobs et al. | Feb 2003 | A1 |
20030044048 | Zhang et al. | Mar 2003 | A1 |
20030204382 | Julier | Oct 2003 | A1 |
20040073360 | Foxlin | Apr 2004 | A1 |
20040122587 | Kanemitsu | Jun 2004 | A1 |
20040167667 | Goncalves et al. | Aug 2004 | A1 |
20040167669 | Karlsson et al. | Aug 2004 | A1 |
20040167670 | Goncalves et al. | Aug 2004 | A1 |
20040167688 | Karlsson et al. | Aug 2004 | A1 |
20040167716 | Goncalves et al. | Aug 2004 | A1 |
20040168148 | Goncalves et al. | Aug 2004 | A1 |
20040249504 | Gutmann et al. | Dec 2004 | A1 |
20050007057 | Peless et al. | Jan 2005 | A1 |
20050010330 | Abramson et al. | Jan 2005 | A1 |
20050182518 | Karlsson | Aug 2005 | A1 |
20050213082 | Dibernardo et al. | Sep 2005 | A1 |
20050234679 | Karlsson | Oct 2005 | A1 |
20050238200 | Gupta et al. | Oct 2005 | A1 |
20060012493 | Karlsson et al. | Jan 2006 | A1 |
20060027404 | Foxlin | Feb 2006 | A1 |
20060293809 | Harwig et al. | Dec 2006 | A1 |
20070045018 | Carter et al. | Mar 2007 | A1 |
20070090973 | Karlsson et al. | Apr 2007 | A1 |
20070156286 | Yamauchi | Jul 2007 | A1 |
20070179670 | Chiappetta et al. | Aug 2007 | A1 |
20070244610 | Ozick et al. | Oct 2007 | A1 |
20070262884 | Goncalves et al. | Nov 2007 | A1 |
20070271011 | Lee et al. | Nov 2007 | A1 |
20070293985 | Myeong et al. | Dec 2007 | A1 |
20080009964 | Bruemmer | Jan 2008 | A1 |
20080009970 | Bruemmer | Jan 2008 | A1 |
20080012518 | Yamamoto | Jan 2008 | A1 |
20080155768 | Ziegler et al. | Jul 2008 | A1 |
20080273791 | Lee et al. | Nov 2008 | A1 |
20080294338 | Doh et al. | Nov 2008 | A1 |
20090024251 | Myeong et al. | Jan 2009 | A1 |
20090055020 | Jeong et al. | Feb 2009 | A1 |
20090081923 | Dooley et al. | Mar 2009 | A1 |
20090093907 | Masaki et al. | Apr 2009 | A1 |
20090234499 | Nielsen | Sep 2009 | A1 |
20090281661 | Dooley et al. | Nov 2009 | A1 |
20100020093 | Stroila et al. | Jan 2010 | A1 |
20100040279 | Yoon et al. | Feb 2010 | A1 |
20100049391 | Nakano | Feb 2010 | A1 |
20100070078 | Kong et al. | Mar 2010 | A1 |
20100198443 | Yabushita et al. | Aug 2010 | A1 |
20100235033 | Yamamoto et al. | Sep 2010 | A1 |
20100241289 | Sandberg | Sep 2010 | A1 |
20100274387 | Pitzer | Oct 2010 | A1 |
20100280754 | Goncalves et al. | Nov 2010 | A1 |
20100284621 | Goncalves et al. | Nov 2010 | A1 |
20100286905 | Goncalves et al. | Nov 2010 | A1 |
20110010033 | Asahara et al. | Jan 2011 | A1 |
20110082585 | Sofman et al. | Apr 2011 | A1 |
20110125323 | Gutmann et al. | May 2011 | A1 |
20110167574 | Stout et al. | Jul 2011 | A1 |
20110178668 | Tanaka et al. | Jul 2011 | A1 |
20110208745 | Dietsch et al. | Aug 2011 | A1 |
20120022785 | Dibernardo et al. | Jan 2012 | A1 |
20120029698 | Myeong et al. | Feb 2012 | A1 |
20120089295 | Ahn et al. | Apr 2012 | A1 |
20120106828 | Yoon et al. | May 2012 | A1 |
20120121161 | Eade et al. | May 2012 | A1 |
20120155775 | Ahn | Jun 2012 | A1 |
20120213443 | Shin et al. | Aug 2012 | A1 |
20120219207 | Shin et al. | Aug 2012 | A1 |
20120230550 | Kraut | Sep 2012 | A1 |
20120232795 | Robertson et al. | Sep 2012 | A1 |
20120239191 | Versteeg | Sep 2012 | A1 |
20130060382 | Pitzer | Mar 2013 | A1 |
20130096885 | Gupta | Apr 2013 | A1 |
20130216098 | Hasegawa et al. | Aug 2013 | A1 |
20130245937 | Dibernardo et al. | Sep 2013 | A1 |
20130332064 | Funk | Dec 2013 | A1 |
20140005933 | Fong et al. | Jan 2014 | A1 |
20140195148 | Erignac | Jul 2014 | A1 |
20150163993 | Pettersson | Jun 2015 | A1 |
20150261223 | Fong et al. | Sep 2015 | A1 |
20150304634 | Karvounis | Oct 2015 | A1 |
20150350378 | Hertel et al. | Dec 2015 | A1 |
20160005229 | Lee et al. | Jan 2016 | A1 |
20160025502 | Lacaze | Jan 2016 | A1 |
20160069691 | Fong et al. | Mar 2016 | A1 |
20160179830 | Schmalstieg | Jun 2016 | A1 |
20170021497 | Tseng | Jan 2017 | A1 |
20170023937 | Loianno | Jan 2017 | A1 |
20170046868 | Chernov | Feb 2017 | A1 |
20170052033 | Fong et al. | Feb 2017 | A1 |
20180149753 | Shin | May 2018 | A1 |
20180189565 | Lukierski | Jul 2018 | A1 |
20180216942 | Wang | Aug 2018 | A1 |
20190080463 | Davison | Mar 2019 | A1 |
20190220011 | Della Penna | Jul 2019 | A1 |
20190355173 | Gao | Nov 2019 | A1 |
20200003901 | Shroff | Jan 2020 | A1 |
20200069134 | Ebrahimi Afrouzi | Mar 2020 | A1 |
20200109954 | Li | Apr 2020 | A1 |
20200135014 | De La Guardia Gonzalez | Apr 2020 | A1 |
20200142410 | Liu | May 2020 | A1 |
Number | Date | Country |
---|---|---|
2013327774 | Oct 2014 | AU |
2013327774 | May 2016 | AU |
2016202515 | May 2016 | AU |
2016213835 | Sep 2016 | AU |
2016202515 | Nov 2017 | AU |
2016213835 | Nov 2017 | AU |
2834932 | Dec 2012 | CA |
2870381 | Apr 2014 | CA |
2935223 | Apr 2014 | CA |
2952355 | Apr 2014 | CA |
2968561 | Apr 2014 | CA |
2870381 | Feb 2017 | CA |
2935223 | Nov 2017 | CA |
2952355 | Nov 2017 | CA |
2968561 | Jul 2018 | CA |
0390052 | Oct 1990 | EP |
2450762 | May 2012 | EP |
2839399 | Feb 2015 | EP |
3018603 | May 2016 | EP |
3018603 | Aug 2017 | EP |
H07271434 | Oct 1995 | JP |
2003-166824 | Jun 2003 | JP |
2004-199389 | Jul 2004 | JP |
2007-525765 | Sep 2007 | JP |
2008-032478 | Feb 2008 | JP |
2010-108483 | May 2010 | JP |
2011-039969 | Feb 2011 | JP |
2011-108084 | Jun 2011 | JP |
2011138502 | Jul 2011 | JP |
2011052827 | May 2011 | WO |
2014055278 | Apr 2014 | WO |
Entry |
---|
“Australian Application Serial No. 2013327774, First Examination Report dated Feb. 3, 2016”, 2 pgs. |
“Australian Application Serial No. 2013327774, Response filed Apr. 19, 2016 to First Examination Report dated Feb. 3, 2016”, 46 pgs. |
“Australian Application Serial No. 2016202515, First Examination Report dated Mar. 10, 2017”, 2 pgs. |
“Australian Application Serial No. 2016202515, Response filed Sep. 28, 2017 to First Examination Report dated Mar. 10, 2017”, 48 pgs. |
“Australian Application Serial No. 2016213835, First Examination Report dated Dec. 9, 2016”, 4 pgs. |
“Australian Application Serial No. 2016213835, Response filed Sep. 27, 2017 to First Examination Report dated Dec. 9, 2016”, 48 pgs. |
“Canadian Application Serial No. 2,870,381, Response filed Jun. 4, 2016 to Office Action dated Jan. 11, 2016”, 12 pgs. |
“Canadian Application Serial No. 2,935,223, Response filed Apr. 3, 2017 to Office Action dated Jan. 11, 2017”, 57 pgs. |
“Canadian Application Serial No. 2,935,223, Response filed Dec. 19, 2016 to Office Action dated Oct. 12, 2016”, 57 pgs. |
“Canadian Application Serial No. 2,952,355, Office Action dated Jan. 23, 2017”, 4 pgs. |
“Canadian Application Serial No. 2,952,355, Response filed Apr. 3, 2017 to Office Action dated Jan. 23, 2017”, 54 pgs. |
“Canadian Application Serial No. 2,968,561, Office Action dated Jun. 28, 2017”, 4 pgs. |
“Canadian Application Serial No. 2,968,561, Office Action dated Sep. 29, 2017”, 3 pgs. |
“Canadian Application Serial No. 2,968,561, Response filed Sep. 18, 2017 to Office Action dated Jun. 28, 2017”, 57 pgs. |
“Canadian Application Serial No. 2,968,561, Response filed Dec. 4, 2017 to Office Action dated Sep. 29, 2017”, 37 pgs. |
“Japanese Application Serial No. 2018-224469, Notification of Reasons for Refusal dated Nov. 5, 2019”, w/ English translation, 11 pgs. |
Agrawal et al. “CenSurE: Center Surround Extremas for Realtime Feature Detection and Matching” Lecture Notes in Computer Science, Computer Vision—ECCV 5305:102-115 (2008). |
Bay et al. “Speeded-up Robust Features (SURF)” Computer Vision and Image Understanding 110(3):346-359 (2008). |
Castellanos et al. “Multisensor Fusion for Simultaneous Localization and Map Building” IEEE Transactions on Robotics and Automation 17(6):908-914 (Dec. 2001). |
Castle et al. “Video-rate localization in multiple maps for wearable augmented reality” 2008 12th IEEE International Symposium on Wearable Computers (pp. 15-22) (2008). |
Cicirelli et al. “Position Estimation for a Mobile Robot using Data Fusion” Proceedings of Tenth International Symposium on Intelligent Control (pp. 565-570)(May 1995). |
Davison et al. “Monoslam: Real-time single camera slam” IEEE Transactions on Pattern Analysis and Machine Intelligence 29(6):1052-1067 (2007). |
Davison, A. J. “Real-time simultaneous localisation and mapping with a single camera” Proc. 9th IEEE International Conference on Computer Vision (ICCV'03) 2:1403-1410 (Oct. 2003). |
“Digiclops versatile digital camera” Point Grey Research, Retrieved from the Internet at URL: http://www.ptgrey.com/products/digiclops/digiclops.pdf (2 pages) (Retrieved on Sep. 23, 2003). |
Dissanayake et al. “A Computationally Efficient Solution to the Simultaneous Localisation and Map Building (SLAM) Problem” Proceedings of the 2000 IEEE International Conference on Robotics and Automation (ICRA) (pp. 1009-1014) (Apr. 2000). |
Eade et al. “Monocular SLAM as a graph of coalesced observations” 2007 IEEE 11th International Conference on Computer Vision (pp. 1-8) (Oct. 2007). |
Eade et al. “Unified Loop Closing and Recovery for Real Time Monocular Slam” Proc. British Machine Vision Conference (BM VC '08) (pp. 53-62) (Sep. 2008). |
Faugeras et al. “Three Views: The trifocal geometry” Chapter 8 in the Geometry of Multiple Images (pp. 409-500) (2001). |
Fischler et al. “Random sample consensus: A paradigm for model fitting with applications to image analysis and automated cartography” Communications of the ACM 24(6):381-395 (Jun. 1981). |
Fox et al. “Particle Filters for Mobile Robot Localization” Sequential Monte Carlo Methods in Practice (pp. 401-428) (2001). |
Fox et al. “Markov Localization for mobile robots in dynamic environments” Journal of Artificial Intelligence Research 11:391-427 (1999). |
Gaspar et al. “Vision-Based Navigation and Environmental Representation with an Omnidirectional Camera” IEEE Transactions on Robotics and Automation 16(6):890-898 (2000). |
Goncalves et al. “A Visual Front-End for Simultaneous Localization and Mapping” Proceedings of the 2005 IEEE International Conference on Robotics and Automation pp. 44-49 (Apr. 2005). |
Grisetti et al. “Online Constraint Network Optimization for Efficient Maximum Likelihood Map Learning” 2008 IEEE International Conference on Robotics and Automation (pp. 1880-1885) (May 2008). |
Ila et al. “Vision-based Loop Closing for Delayed State Robot Mapping” Proceedings of the 2007 IEEE/RSJ International (pp. 3892-3897) (Oct. 2007). |
International Search Report and Written Opinion for PCT Application No. PCT/US2011/053122 (dated Jan. 19, 2012). |
International Search Report for PCT Application No. PCT/US03/39996 (dated Jun. 30, 2004). |
International Search Report and Written Opinion for PCT Application No. PCT/US2013/061208 (dated Jan. 27, 2014). |
Strom et al. “Occupancy Grid Rasterization in Large Environments for Teams of Robots” 2011 IEEE/RSJ International Conference on Intelligent Robots and Systems (pp. 4271-4276) (Sep. 2011). |
Kalman, R.E. “A New Approach to Linear Filtering and Prediction Problems” Journal of Basic Engineering 82 (1):35-45 (Mar. 1960). |
Karlsson et al. “The vSLAM Algorithm for Robust Localization and Mapping” Proceedings of the 2005 IEEE International Conference on Robotics and Automation (pp. 24-29) (Apr. 2005). |
Klein et al. “Parallel tracking and mapping for small AR workspaces” 2007 6th IEEE and ACM International Symposium on Mixed and Augmented Reality (pp. 225-234) (2007). |
Konolige et al. “Frame-frame matching for realtime consistent visual mapping” Proceedings 2007 IEEE International Conference on Robotics and Automation (pp. 2803-2810)(Apr. 2007). |
Konolige, K. “SLAM via Variable Reduction from Constraint Maps” Proceedings of the 2005 IEEE International Conference on Robotics and Automation (pp. 667-672)(Apr. 2005). |
Konolige et al. “View-Based Maps” Robotics: Science and Systems (pp. 1-14) (Jun. 2009). |
Kretzschmar et al. “Lifelong Map Learning for Graph-based SLAM in Static Environments” Künstl Intell (3):199-206 (May 2010). |
Lowe, D. G. “Distinctive image features from scale-invariant keypoints” International Journal of Computer Vision 60 (2):91-100 (2004). |
Lowe, D. “Local Feature View Clustering for 3D Object Recognition” Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (pp. 682-688) (Dec. 2001). |
Lowe, D. “Object Recognition from Local Scale-Invariant Features” Proceedings of the International Conference on Computer Vision (pp. 1150-1157) (Sep. 1999). |
Matas et al. “Robust wide baseline stereo from maximally stable extremal regions” Proc. British Machine Vision Conference (BMVC '02) (pp. 384-393) (Sep. 2002). |
Mikolajczyk et al. “A performance evaluation of local descriptors” IEEE Transactions on Pattern Analysis and Machine Intelligence 27(10):1615-1630 (2005). |
Montemerlo et al. “FastSLAM 2.0: an Improved Particle Filtering Algorithm for Simultaneous Localization and Mapping that Provably Converges” Proc. 18th International Joint Conference on Artificial Intelligence (IJCAI'03) (pp. 1-6) (Aug. 2003). |
Montemerlo et al. “FastSLAM: A Factored Solution to the Simultaneous Localization and Mapping Problem” Proceedings of the American Association for Artificial Intelligence (AAAI) National conference on Artificial Intelligence (pp. 1-6) (2002). |
Nister et al. “Scalable recognition with a vocabulary tree” 2006 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'06) 2:2161-2168 (Jun. 2006). |
Nister, D. “An Efficient Solution to the Five-Point Relative Pose Problem” IEEE Transactions on Pattern Analysis and Machine Intelligence 26(6):756-770 (Jun. 2004). |
Roumeliotis et al. “Bayesian estimation and Kalman filtering: A unified framework for mobile robot localization” Proceedings of the IEEE International Conference on Robotics and Automation (ICRA) (pp. 2985-2992) (2000). |
Se et al. “Local and Global Localization for Mobile Robots using Visual Landmarks” Proceedings of the 2001 IEEE/RSJ International Conference on Intelligent Robots and Systems (pp. 414-420) (2001). |
Se et al. “Mobile Robot Localization and Mapping with Uncertainty using Scale-Invariant Visual Landmarks” The International Journal of Robotics Research 21(8):735-758 (Aug. 2002). |
Se et al. “Vision-based Mobile Robot Localization and Mapping using Scale-Invariant Features” Proceedings of IEEE International Conference on Robotics and Automation (ICRA 2001) (pp. 2051-2058) (May 2001). |
Shewchuk, J. R. “An Introduction to the Conjugate Gradient Method without the agonizing pain” Technical report (pp. 1-64) (1994). |
Sivic et al. “Video Google: A text retrieval approach to object matching in videos” In Proc. 9th IEEE International Conference on Computer Vision (ICCV'03) (pp. 1470-1477) (Oct. 2003). |
Stella et al. “Position Estimation for a Mobile Robot using Data Fusion” Proceedings of Tenth International Symposium on Intelligent Control (pp. 565-570) (May 1995). |
Thrun et al. “Multi-Robot SLAM With Sparse Extended Information Filters” Robotics Research, Springer Tracts in Advanced Robotics 15: 1-12 (2005). |
Thrun et al. “The GraphSLAM Algorithm with Applications to Large-scale Mapping of Urban Structures” International Journal on Robotics Research 25(5/6):403-430 (2005). |
Thrun et al. “A Probabilistic Approach to Concurrent Mapping and Localization for Mobile Robots” Machine Learning 31(1-3):29-53 (1998). |
Thrun, S. “Probabilistic Algorithms in Robotics” Technical Report, CMU-CS-00-126, Carnegie Mellon University (pp. 1-18) (Apr. 2000). |
“International Application Serial No. PCT US2013 061208, International Preliminary Report on Patentability dated Apr. 16, 2015”, 8 pgs. |
“U.S. Appl. No. 13/632,997, Notice of Allowance dated Mar. 20, 2014”, 7 pgs. |
“U.S. Appl. No. 13/632,997, 312 Amendment filed Apr. 17, 2014”, 5 pgs. |
“U.S. Appl. No. 13/632,997, PTO Response to Rule 312 Communication dated Jun. 24, 2014”, 2 pgs. |
“European Application Serial No. 13843954.2, Partial Supplementary European Search Report dated Oct. 29, 2015”, 6 pgs. |
“European Application Serial No. 13843954.2, Extended European Search Report dated May 19, 2016”, 11 pgs. |
“European Application Serial No. 13843954.2, Response filed Dec. 5, 2016 to Extended European Search Report dated May 19, 2016”, 10 pgs. |
“European Application Serial No. 13843954.2, Response filed Oct. 19, 2018 to Communication Pursuant to Article 94(3) EPC dated Jun. 20, 2018”, 49 pgs. |
“European Application Serial No. 15200594.8, Response filed Apr. 22, 2016 to Extended European Search Report dated Apr. 12, 2016”, 17 pgs. |
“U.S. Appl. No. 14/307,402, Non Final Office Action dated Apr. 10, 2015”, 9 pgs. |
“U.S. Appl. No. 14/307,402, Response filed Jun. 30, 2015 to Non Final Office Action dated Apr. 10, 2015”, 3 pgs. |
“U.S. Appl. No. 14/307,402, Notice of Allowance dated Aug. 18, 2015”, 5 pgs. |
“U.S. Appl. No. 14/944,152, Notice of Allowance dated Apr. 6, 2016”, 12 pgs. |
“U.S. Appl. No. 14/944,152, Notice of Allowance dated Jun. 6, 2016”, 2 pgs. |
“U.S. Appl. No. 15/225,158, Preliminary Amendment filed Dec. 6, 2016”, 6 pgs. |
“U.S. Appl. No. 15/225,158, Non Final Office Action dated Jun. 27, 2017”, 16 pgs. |
“U.S. Appl. No. 15/225,158, Response filed Sep. 26, 2017 to Non Final Office Action dated Jun. 27, 2017”, 7 pgs. |
“U.S. Appl. No. 15/225,158, Notice of Allowance dated Dec. 15, 2017”, 6 pgs. |
“Canadian Application Serial No. 2,870,381, Office Action dated Jan. 11, 2016”, 5 pgs. |
Lowe, D. G. “Distinctive Image Features from Scale-Invariant Keypoints” International Journal of Computer Vision 60(2):91-110 (2004). |
Thrun, S. “Robotic Mapping: A Survey” Technical Report, CMU-CS-02-111, Carnegie Mellon University (pp. 1-29) (Feb. 2002). |
Triggs et al. “Bundle adjustment—A Modern Synthesis” International Workshop on Vision Algorithms 1883:298-372 (Sep. 2000). |
Williams et al. “An image-to-map loop closing method for monocular slam” 2008 IEEE/RSJ International Conference on Intelligent Robots and Systems (pp. 2053-2059) (Sep. 2008). |
Wolf et al. “Robust Vision-based Localization for Mobile Robots Using an Image Retrieval System Based on Invariant Features” Proceedings of the 2002 IEEE International Conference on Robotics and Automation (pp. 359-363) (May 2002). |
Adam et al. “Fusion of Fixation and Odometry for Vehicle Navigation” IEEE 29(6):593-603 (1999). |
European Office Action Corresponding to European Patent Application No. 13843954.2 (8 Pages) (dated Jun. 20, 2018). |
Examiner's Report Corresponding to Canadian Patent Application No. 2,935,223 (4 pages) (dated Oct. 12, 2016). |
Australian Office Action Corresponding to Patent Application No. 2016213835 (4 Pages) (dated Dec. 9, 2016). |
Canadian Office Action Corresponding to Canadian Patent Application No. 2,935,223 (5 Pages) (dated Jan. 11, 2017). |
Milford et al. “Hybrid robot control and SLAM for persistent navigation and mapping” Robotics and Autonomous Systems 58(9):1096-1104 (Sep. 30, 2010). |
Urdiales et al. “Hierarchical planning in a mobile robot for map learning and navigation” Autonomous robotic systems, Retrieved from the Internet at URL http://webpersonal.uma.es/˜EPEREZ/files/ARS03.pdf (Jan. 2003). |
Yamauchi B. “Frontier-Based Exploration Using Multiple Robots” Proceedings of the Second International Conference on Autonomous Agents, Retrieved from the internet at URL https://pdfs.semanticscholar.org/ace4/428762424373c0986c80c92be6321ff523e4.pdf (May 1998). |
Youngblood et al. “A Framework for Autonomous Mobile Robot Exploration and Map Learning through the use of Place-Centric Occupancy Grids” ICML Workshop on Machine Learning of Spatial Knowledge Retrieved from the internet at URL http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.27.2188&rep=rep1&type=pdf (2000). |
Canadian Office Action corresponding to Canadian Patent Application No. 2,935,223 (4 pages) (dated Jun. 28, 2017). |
European Search Report corresponding to European Application No. 15200594.8 (6 pages) (dated Mar. 23, 2016). |
Japanese Office Action corresponding to Japanese Patent Application No. 2015-517500 (Foreign text 3 pages; English translation thereof 3 pages) (dated Oct. 27, 2015). |
Nagao et al. “Social Annotation to Indoor 3D Maps Generated Automatically” 2012 Information Processing Society of Japan 2012-DCC-1(12):1-8 (2012). |
Number | Date | Country | |
---|---|---|---|
20180299275 A1 | Oct 2018 | US |
Number | Date | Country | |
---|---|---|---|
61541749 | Sep 2011 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 15225158 | Aug 2016 | US |
Child | 15921085 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 14944152 | Nov 2015 | US |
Child | 15225158 | US | |
Parent | 14307402 | Jun 2014 | US |
Child | 14944152 | US | |
Parent | 13632997 | Oct 2012 | US |
Child | 14307402 | US |