The present disclosure relates to a robot for the processing of surfaces such as, e.g. for cleaning floors (vacuum cleaner robots), for mowing grass, for painting surfaces, etc. The present disclosure further concerns a method carried out by the robot for the efficient performance of the given task by the robot.
In recent years, autonomous mobile robots have been increasingly employed, for example, to clean floor surfaces or to mow lawns. In both cases it is crucial that the given surface be entirely covered by a surface processing device arranged on the robot such as, for example, a brush. Simple devices manage without generating and using a map of the area of robot deployment by, for example, moving randomly over the surface to be cleaned. More complex robots use a map of the area of robot deployment that they generate themselves or that is provided to them in electronic form. Such a map makes it possible to keep a record of the surfaces that have already been processed.
These modern autonomous mobile robots with mapping function try to use, when processing (e.g. cleaning) a surface, a process pattern that is as systematic as possible. The pattern must be adapted to the complex circumstances of the area of operation, such as an apartment with furniture. Further, the robot must be capable of reacting to unexpected objects such as, for example, people moving about in its area of operation or obstacles that are difficult to detect with the robot's sensors. Still, it may occur that some areas of the floor will be left out in the process. A robot may possess a detection module to detect such an omitted surface at the end of or during its run, in order to clean it afterwards. For this purpose, it is once again necessary to move across the already processed surface, which reduces the processing efficiency and increases the processing time needed for the entire area.
This application discloses a robot and a method for improving existing robots and methods for an autonomous mobile robot to process a surface (e.g. to clean a floor surface) and to thus increase the efficiency of the robot.
In the following, a method for the processing of a surface of an area to be processed by an autonomous mobile robot will be described. In accordance with one embodiment, the surface is processed by the robot along path segments of a robot path in accordance with a first process pattern. The area to be processed is broken down, depending on a current position of the robot, into a first area which, according to the process pattern, should already have been processed, into a second area that has yet to be processed and into a third area that is currently being processed. The robot can then test whether the first area has, indeed, been completely cleaned.
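The breakdown described above can be sketched as follows. This is a minimal grid-based illustration with hypothetical names; the disclosure does not prescribe a concrete data structure:

```python
def split_area(cells_in_order, current_index):
    """Split the area, given the robot's progress along the process
    pattern, into: the first area (which should already have been
    processed), the second area (still to be processed) and the third
    area (currently being processed). cells_in_order lists all grid
    cells in the order in which the pattern visits them."""
    first = cells_in_order[:current_index]        # should be done by now
    third = [cells_in_order[current_index]]       # currently in progress
    second = cells_in_order[current_index + 1:]   # still open
    return first, second, third

def unprocessed_in_first(first, cleaned):
    """Test whether the first area has indeed been completely cleaned;
    returns the cells that were omitted."""
    return [c for c in first if c not in cleaned]
```

For example, if the pattern visits four cells and the robot is on the third, the first two cells form the first area, and comparing them against the set of actually cleaned cells reveals any omission.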
In accordance with a further embodiment, the method comprises the processing of the surface along path segments of a robot path in accordance with a first process pattern, wherein the path segments exhibit a certain sequence. The method further comprises detecting—during the processing of a first path segment of the path segments—whether, in at least one path segment that lies before the first path segment in the sequence, an as of yet unprocessed area exists that would already have been cleaned had the processing proceeded unimpeded according to the sequence.
In accordance with a further embodiment, the method comprises controlling the robot in order to process the area in accordance with a first process pattern, the monitoring of an area in the environment of the robot, wherein the area has a permanent position relative to the robot, and controlling the robot in order to process the area in accordance with a second process pattern if an accessible and unprocessed area is detected in the monitored area.
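The monitoring of an area with a permanent position relative to the robot might be sketched like this; the rectangular window, its size and all names are illustrative assumptions on a grid model:

```python
def window_cells(robot_x, robot_y, dx, dy):
    """Grid cells of a rectangular window that keeps a fixed position
    relative to the robot (here: centred on the robot)."""
    return {(x, y)
            for x in range(robot_x - dx, robot_x + dx + 1)
            for y in range(robot_y - dy, robot_y + dy + 1)}

def switch_to_second_pattern(robot_pos, processed, accessible, dx=1, dy=1):
    """True if an accessible but unprocessed cell lies inside the
    monitored window, i.e. the robot should switch from the first to
    the second process pattern."""
    return any(c in accessible and c not in processed
               for c in window_cells(*robot_pos, dx, dy))
```

The window moves with the robot, so the check is repeated at each position update rather than planned globally in advance.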
Further, a method for controlling an autonomous mobile robot using a processing module for performing an activity within an area will be described. In accordance with one of the examples described here, the method comprises processing the surface of the area along a robot path, identifying features of the robot's environment during the processing in order to detect obstacles, and automatically planning a bypass route around a detected obstacle, wherein the planning is limited to a planning subarea that moves together with the robot and encompasses only a part of the area.
In accordance with a further embodiment, the method comprises processing a surface of the area along a robot path, identifying features of the robot's environment during the processing in order to detect obstacles, and automatically planning a bypass route around a detected obstacle, wherein, during the planning, a processing gain is considered that arises from the processing of the surface carried out while and/or after bypassing the obstacle.
In accordance with a further embodiment, the method comprises the processing of a surface of the area along a robot path in accordance with a first process pattern, the automatic determination of the robot's position in the area by means of a sensor arrangement on the robot, the determination of a measurement value for the inaccuracy of the determined position, as well as the adaptation of the first process pattern depending on the measurement value for the inaccuracy (a measurement value for the inaccuracy is, naturally, also a measurement value for the accuracy).
Further embodiments concern a robot that is connected with an internal and/or external data processing device that is configured to execute a software program which, when it is executed by the data processing device, causes the robot to carry out the methods described here.
Various embodiments will be explained in the following in greater detail by means of the examples illustrated in the figures. The illustrations are not necessarily true to scale and the embodiments are not limited to only the aspects shown. Instead, importance is placed on illustrating the underlying principles of the embodiments described herein. The figures show:
In the following, the various embodiments will be explained using an autonomous robot for the cleaning of floor surfaces (e.g. a vacuum cleaner robot) as an illustrative example. The embodiments described here, however, may also be applied to other areas in which a surface is to be processed as completely as possible by an autonomous mobile robot. Further examples of such robots include, among others, robots for removing a floor layer (for example, by means of sanding), robots for applying, for example, paint to a surface, robots for mowing a lawn or cultivating farmland, as well as autonomous harvesting robots.
A mobile autonomous robot 100 generally includes a drive module which, for example, may comprise electric motors, gears and wheels, by means of which the robot can theoretically arrive at any point of a floor surface. The robot further includes a processing module such as, for example, a cleaning module for cleaning the floor surface such as, for example, a brush or an apparatus for sucking in dirt (or both). Such robots are well known and predominantly differ in the manner in which they navigate their environment, as well as in the strategy applied when processing the floor surface, e.g. during a cleaning process.
In order to increase the efficiency of the floor processing, systematic cleaning patterns have been developed that process an area G in one run and with as few overlaps of the processed surface as possible. These process patterns are created, for example, by connecting path segments into a robot path that the robot follows. By connecting path segments, for example, a spiral-shaped process pattern can be formed (see
In order to ensure that the process pattern is reliably followed and that the robot paths are adapted to the area of robot deployment, the robot 100 can compile a map of the environment and, at the same time, determine its position on this map with the aid of suitable sensors, e.g. arranged on the robot, and by employing a SLAM method (SLAM: simultaneous localization and mapping; see, e.g. H. Durrant-Whyte and T. Bailey: "Simultaneous Localization and Mapping (SLAM): Part I The Essential Algorithms", in: IEEE Robotics and Automation Magazine, Vol. 13, No. 2, pgs. 99-110, June 2006). This makes it possible to specifically access and clean every (accessible) point in a given area G. In the ideal case, every point is covered only once in order to avoid cleaning areas that have already been cleaned, significantly reducing the time needed to clean the area G. One such suitable sensor is, for example, a sensor for measuring the distance to objects in the environment such as, e.g. an optical and/or acoustic sensor that operates by means of optical triangulation or time-of-flight measurement of an emitted signal (triangulation sensor, time-of-flight camera, laser scanner, ultrasonic sensors). Further typical examples of suitable sensors are cameras, tactile sensors, acceleration sensors, gyroscope sensors and odometers.
In order to ensure that every point of the area G has been cleaned, the robot can save the already processed surfaces on a cleaning map (a map on which the already cleaned sections have been marked). For this purpose, e.g. a raster map may be used, on which the area G is sectored into numerous square cells, and for each cell it is recorded whether it has been cleaned. The length of such a cell's edge may correspond, for example, to the width of the cleaning module, or it may be smaller. As an alternative, the robot, e.g. can save its trajectory (the course of the robot path) and calculate from this those parts of the area G that have already been processed and those that have yet to be processed.
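A raster cleaning map of this kind might look as follows; the cell size and all names are illustrative assumptions:

```python
class CleaningMap:
    """Raster map: the area is divided into square cells; for each cell
    it is recorded whether it has been cleaned."""

    def __init__(self, cell_size=0.1):
        # cell edge length in metres (assumed to match the cleaning module)
        self.cell_size = cell_size
        self.cleaned = set()

    def _cell(self, x, y):
        """Map a world coordinate to its grid cell."""
        return (int(x // self.cell_size), int(y // self.cell_size))

    def mark_cleaned(self, x, y):
        self.cleaned.add(self._cell(x, y))

    def is_cleaned(self, x, y):
        return self._cell(x, y) in self.cleaned
```

Marking cells along the trajectory while driving gives exactly the alternative mentioned above: the processed surface is derived from the path actually travelled.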
The specific area to be processed G may be confined within existing obstacles such as walls (e.g. an apartment or a separate room of an apartment). More advanced devices are capable of further defining the area G using virtual boundaries. For example, the virtual boundary on a map of robot deployment may be defined by a user or by the robot (e.g. in the form of boundary lines). The robot, for example, can sector the entire area into numerous zones (e.g. the numerous rooms of an apartment or numerous parts of a room) and these can be processed one after the other.
In general, the robot 100 is able to, by means of sensors, identify features of its environment and, based on the identified features, detect obstacles. The features identified by the sensors can also be used to localize the robot on an electronic map. Despite the use of map-based navigation, reliable advance planning of a cleaning of the area G is almost impossible due to error-prone sensor and map data. This is caused by obstacles that are difficult for the sensors arranged on the robot to detect, as well as by dynamic obstacles such as people or pets that might temporarily block the robot's path. Long-term (spatial and/or temporal) advance planning is thus possible only to a very limited extent. It is therefore necessary that the process pattern employed by the robot be dynamically adapted to unanticipated (e.g. unforeseeable) events. In the following, various approaches will be described that enable the robot to dynamically adapt its previously specified process pattern when needed (e.g. when an unanticipated obstacle is detected). These enable the robot to react efficiently, i.e. "intelligently", to a dynamically changing environment (e.g. one changed by displaced obstacles).
It should be noted here that the descriptions "from bottom to top" or "from top to bottom", as well as "above the robot's position" or "below the robot's position", are not intended to be restricting and are only used in reference to the examples shown of the maps of the robot's environment. In fact, a coordinate system for any systematic cleaning along a meandering path can be chosen such that the robot begins its processing of the area G within this coordinate system "at the bottom" and ends it "at the top".
In order to detect an omitted subarea M (see
The monitored environment may be confined, as illustrated in
Adapting the cleaning pattern in order to clean an omitted subarea M can be carried out using various means. In the example of
For the selection of a suitable adaptation of the process pattern, the robot can employ its existing map data. This map data, however, may contain errors. Alternatively, the robot may, at least partially, reexamine the borders of the omitted subarea M. For this purpose the robot can, for example, carry out at least a partial run following along the bordering edges of the omitted area. The borders of the omitted subarea M, for example, are given by obstacles and by the already cleaned surfaces. When doing so, the already cleaned surfaces may be, at least partially, overlapped in order to improve the cleaning results (cf.
The preceding explanations concern, for the most part, the detection (recognition as such and the determination of position and size) of an omitted subarea M (or of numerous omitted areas M1, M2) that were omitted from processing for any given reason (e.g. because of an obstacle H that was not considered when the robot path was originally planned). In the following, the dynamic planning of a robot path for the processing of surfaces will be discussed. In general, a surface is processed according to a defined process pattern, for example a meandering pattern, according to which the processing is carried out along a meandering path consisting of numerous straight-line path segments running parallel to each other and their corresponding connecting segments. A process pattern must be adapted to the actual specifics of the area G, such as its geometry and the obstacles found within it. This is carried out, in accordance with the method described here, dynamically during processing in order to make it possible to react to certain events such as, e.g. moving obstacles and/or obstacles not yet detected (by the sensors) that were not, or could not have been, considered during the initial path planning.
A process pattern comprises at least one path which is to be processed by the robot. The width w of the path corresponds to the width of the robot's processing module, and the path may be composed of numerous path segments (e.g. numerous parallel straight-line segments in the case of a meandering pattern). Here it should be noted that, for the sake of illustrative simplicity, in the figures it is assumed that the processing module has the same width as the robot. In practice, the processing module need not extend over the entire width of the robot. The robot should primarily process as wide a path as possible. Obstacles (or other objects) that block the planned processing path that the robot is moving along should be bypassed, if this is expedient. Robots are known that move completely around an obstacle before deciding how to continue the cleaning, which can lower the efficiency of the cleaning. Other robots end the processing of the current path segment upon encountering an obstacle and begin processing a different path segment. Still other robots follow the obstacle until an abort condition is fulfilled. This abort condition may be, for example, that the robot once again finds itself on the path segment it just left (upon detecting the obstacle), or that the robot finds itself on a different, not yet processed path segment of the meandering path.
Combining an initial planning of a path segment of a robot path in accordance with a process pattern with reactive algorithms enables the robot to make “more intelligent” decisions, thus improving the process efficiency and achieving the most complete coverage possible of the area G. For this, the autonomous mobile robot 100 must decide, based on environmental data, whether it should continue the cleaning of the path currently being processed or whether it would be more efficient to begin the cleaning of a new path segment. When doing so, it may weigh the process gain (i.e. the total increase of processed surface) against the needed costs in the form of detours and/or the double cleaning of a surface and it can confine the planning to a corridor (in the following designated as planning area) around the path segment currently being processed in order to reduce the planning effort. This will be explained in the following using examples illustrated in
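The weighing of process gain against cost described above could be sketched as follows; the cost weights are illustrative assumptions, not values from the disclosure:

```python
def bypass_worthwhile(gain_cells, detour_cells, recleaned_cells,
                      detour_cost=1.0, reclean_cost=0.5):
    """Weigh the process gain (number of newly processed cells) against
    the cost of the detour and of cleaning already cleaned cells a
    second time. Returns True if the bypass pays off."""
    gain = len(gain_cells)
    cost = (detour_cost * len(detour_cells)
            + reclean_cost * len(recleaned_cells))
    return gain > cost
```

In a real robot the three cell sets would be derived from the cleaning map, restricted to the planning corridor around the current path segment.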
Avoidance maneuver: When the autonomous mobile robot cleans a floor in an area G on which one or more obstacles H are standing, for example with a meandering process pattern, it must be able to decide whether it would be more effective to bypass the obstacle (see
If the robot discovers one or more obstacles H that block the processing of the current path, it tests whether a bypass within the corridor (planning subarea A) is possible. If, as in the example of
It should be noted that the planning subarea A may be arranged symmetrically or offset to the currently processed path segment. For example, in the case of symmetric robots, or in order to simplify the calculations, a symmetric planning corridor may be used. When a meandering process pattern is used the corridor (planning subarea A) can be chosen to extend farther into the unprocessed area than into the area that has already been processed. An example of such a situation is shown in
There are various known algorithms for avoiding or bypassing one or more obstacles (see, e.g. R. Siegwart, I. R. Nourbakhsh and D. Scaramuzza: "Introduction to Autonomous Mobile Robots", The MIT Press). For example, an obstacle can be bypassed using a "bug algorithm", in which the robot follows along the edge of the obstacle until it can once again enter the path segment belonging to the original process pattern. While the robot is attempting to bypass the obstacle, it tests whether it is still moving within the planning subarea A used for this planning. If the robot leaves this subarea, it interrupts the attempted bypass of the obstacle and looks for a starting point for a new path that corresponds to the cleaning pattern. Such a case only arises if the robot has incorrectly assessed the obstacle, for example when, due to the obstacle moving (or having moved) or being difficult for the sensors to detect, the previously described planning determined that a bypass within the corridor was possible although this is actually not the case.
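A much-simplified sketch of this behaviour is given below. The obstacle contour is assumed to be a precomputed list of grid cells; a real bug algorithm would derive it step by step from sensor data:

```python
def follow_contour(contour, path_segment, planning_area):
    """Follow the obstacle contour until the robot re-enters the
    original path segment; abort as soon as the robot would leave the
    planning subarea A. Returns the cells travelled and whether the
    bypass succeeded."""
    travelled = []
    for cell in contour:
        if cell not in planning_area:    # left the corridor: abort bypass
            return travelled, False
        travelled.append(cell)
        if cell in path_segment:         # back on the planned segment
            return travelled, True
    return travelled, False
```

On an abort the caller would, as described above, search for a starting point of a new path that fits the cleaning pattern.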
When testing whether and how the robot can bypass an obstacle, the robot may additionally test whether the avoidance maneuver will result in a process gain (i.e. whether a previously unprocessed surface will be processed). This situation is illustrated in
In a second example, the robot 100 cleans using meandering path segments that are connected together and can bypass the obstacle in both directions with a process gain. In order to avoid leaving an isolated uncleaned area, in this case the obstacle should be bypassed in the direction of the last meandering path. There is therefore a preferred direction for bypassing an obstacle that is aligned with the cleaning pattern and faces in the direction of the already cleaned area. In accordance with this aspect, in the example of
In some cases the robot 100 is not symmetrically designed, but may instead possess, for example, a side brush on its right edge that enables it to clean the surface up to the border of an obstacle to the right of the robot 100, whereas an unprocessed area would remain at an obstacle to the left of the robot 100. This may be considered, for example, with the previously described method of assessing the cleaning gain. Alternatively, the robot may have a set preferred direction that ensures that the robot bypasses an obstacle while keeping the obstacle to its right.
The expected cleaning gain behind an obstacle H, which can be determined using the cleaning map, can be taken into consideration by the robot 100 as an additional decision criterion for bypassing an obstacle. Thus an obstacle will be bypassed when the effort involved in a possible double cleaning during the avoidance maneuver is clearly lower than the expected cleaning gain. Such a situation is illustrated in
An analogous approach is possible when the currently cleaned path segment leads over a previously cleaned area. This previously cleaned area may belong, for example, to the surroundings of an obstacle. Here the robot decides, e.g. based on the expected cleaning gain and/or on the costs entailed in cleaning an already cleaned area, whether the planned path through the already cleaned area will be continued or whether to look for a starting point for a new meandering path.
A meandering process pattern consists of nearly parallel straight-line path segments, arranged at a distance from each other that corresponds to the path width w (width of the processing module) minus a desired overlap. With a small overlap of, for example, one to two centimeters, complete coverage of the surface can be ensured despite small inaccuracies that arise during the localization and/or control of the robot. If the overlap comprises more than 50%, the floor surface is processed at least twice. Other process patterns can be generated with, for example, any desired starting segment and with further path segments having an appropriately chosen overlap with (i.e. a suitable distance to) the respective preceding path segment. In the following example, shown in
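The spacing of the parallel segments (path width w minus the desired overlap) can be computed as in the following sketch; the function name and the metric units are assumptions:

```python
def segment_offsets(area_depth, path_width, overlap):
    """Perpendicular offsets of the parallel meander segments. The
    spacing equals the path width minus the desired overlap (e.g. an
    overlap of 0.01-0.02 m to tolerate small localization errors)."""
    spacing = path_width - overlap
    assert spacing > 0, "overlap must be smaller than the path width"
    offsets, y = [], 0.0
    while y < area_depth:
        offsets.append(round(y, 6))  # round away float accumulation noise
        y += spacing
    return offsets
```

With an overlap above 50% of w the spacing drops below w/2, so each point of the floor is covered by at least two segments, matching the statement above.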
First, for the area to be processed G, e.g. straight-line path segments, either adjacent to each other or overlapping, are determined in accordance with the specified process pattern (overlapping segments are not shown in
Beginning at an accessible starting point, the possible cleaning gain in both directions along the new path is determined. In this case, the planning may be confined to the previously described corridor and may take into account the principles described above regarding planning the subarea surrounding obstacles and the repeated cleaning of a previously cleaned area.
Based on the thus determined possible cleaning gain, the robot 100 can decide in which direction it will begin cleaning the path or whether a new starting point on the path or a new path should be chosen. The following possibilities, for example, exist: (a) In one direction there is no cleaning gain to be achieved because cleaning here has already been carried out or because there is an obstacle in the way. In the second direction there is a cleaning gain to be achieved. The robot begins cleaning in this second direction. (b) A cleaning in both directions is possible. In one of these, however, the cleaning gain only corresponds to the base area of the robot (and thus lies below a specified threshold). The robot shifts the starting point in this direction in order to be able to clean the entire possible path length at once. (c) In both directions the cleaning gain is smaller than a minimum value or even zero, for example due to larger obstacles standing both to the right and the left. The robot then tries to identify a new path segment. (d) In both directions the cleaning gain lies above a threshold, in this (in practice rather unlikely) case, cleaning is carried out in the direction providing the greatest cleaning gain.
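Cases (a) through (d) above can be condensed into a simple decision function. Gains are counted in cells here, and the threshold (roughly the robot's base area) is an assumption:

```python
def choose_direction(gain_fwd, gain_back, threshold=2):
    """Decide, from the possible cleaning gain in both directions along
    the new path, how to proceed (cases (a)-(d) in the text)."""
    if gain_fwd <= 0 and gain_back <= 0:
        return "new_segment"                              # case (c)
    if gain_fwd >= threshold and gain_back >= threshold:
        # case (d): both clearly worthwhile, take the larger gain
        return "forward" if gain_fwd >= gain_back else "backward"
    if min(gain_fwd, gain_back) <= 0:
        # case (a): only one direction yields any gain
        return "forward" if gain_fwd > 0 else "backward"
    # case (b): both possible, but one gain lies below the threshold:
    # shift the starting point in that direction first, so the whole
    # possible path length can be cleaned at once
    return ("shift_start_forward" if gain_fwd < threshold
            else "shift_start_backward")
```

The gains themselves would be determined from the cleaning map within the planning corridor, as described above.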
In this example (see
In order for the robot 100 to be able to clean a surface, for example, using a meandering path 200, it must be capable of determining and correcting its position (for example, by means of the SLAM method). However, due to, for example, measurement or movement errors, inaccuracies in the positioning and/or localization can always occur that may result in strips between two cleaned meandering paths being left uncleaned. In order to avoid this, the processing is performed leaving an overlap between the path segment currently being cleaned (see
Statistical SLAM methods, for example, are able to estimate the inaccuracy present in the localization in addition to the position of the robot 100. Based on this estimate, for example, the overlap o1, o2, o3, etc. between the path segment currently being processed (surface T) and the previously cleaned surface (subarea C) can be determined and adapted, thus ensuring a complete cleaning of the surface despite the inaccurate localization. At the same time, the overlap in areas in which the localization is more reliable can be reduced to a minimum, thus accelerating the cleaning of the surface.
For example, the SLAM method used may be based on an extended Kalman filter or on a so-called particle filter, by means of which the probability distribution of the expected robot position and surrounding obstacles (e.g. a wall) is determined. Based on this probability distribution, the standard deviation of the robot's position relative to the surrounding obstacles, for example, can be taken as a value representing the inaccuracy in the localization. If the planned overlap then equals, for example, a twofold standard deviation, then there is a theoretical probability of approximately 2% that an uncleaned area will remain between the path segment currently being processed (surface T) and the previously cleaned surface (C) (negative overlap). Repeating the localization further reduces the probability, thus rendering it negligible in practice.
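The relation between the planned overlap and the probability of leaving an uncleaned strip can be checked numerically, assuming a normally distributed, one-sided position error (function names are illustrative):

```python
import math

def required_overlap(sigma, k=2.0):
    """Plan the overlap as k standard deviations of the position
    estimate (k = 2 corresponds to the twofold standard deviation
    mentioned in the text)."""
    return k * sigma

def gap_probability(overlap, sigma):
    """Probability that the positioning error exceeds the planned
    overlap, i.e. that an uncleaned strip (negative overlap) remains.
    Uses the one-sided tail of the normal distribution."""
    return 0.5 * math.erfc(overlap / (sigma * math.sqrt(2.0)))
```

With k = 2 this yields roughly the 2% figure stated above; each repeated localization lowers the residual probability further.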
An inaccuracy in the positioning can also be evaluated using a SLAM method. Here, for example, in a first step, a prediction is made as to where the robot, having followed the movement orders of the robot controller, is located (positioning). In a second step, for example, this first prediction is corrected based on measurements of the distance to obstacles (e.g. a wall) in the environment (localization). This means that the robot's (target) position is known but that, due to odometry errors (e.g. caused by the floor covering), it is not at the originally planned position. The adjustment needed here may also be suitably used as a value for determining the needed overlap o1, o2, o3, etc.
The overlap can be set, for example, when selecting a starting point for a path segment of the meandering path. In addition, it may be determined during the cleaning of a path that the inaccuracy of the localization has suddenly increased or lessened (or, what is virtually the same thing, the accuracy of the localization). In such a case, for example, the processing of the current path can be interrupted in order to select a path with an overlap that corresponds to the new measure of inaccuracy, or the planning of the path currently being processed can be dynamically adapted. Thus, an overlap (e.g. o1 between the first and the second path segment in
In some cases, the specific overlap between the path segment currently being processed (surface T) and the previously cleaned surface C may become so large that it is no longer possible to carry out a systematic cleaning using the chosen cleaning pattern. The cleaning strategy can be adapted to such cases. For example, the process pattern may be employed twice, wherein the second time the pattern is applied at an angle (e.g. of 90°) and/or offset (e.g. by about one half of the path's width) relative to the first use of the pattern. If the inaccuracy is so large that goal-directed navigation is hardly possible or not possible at all, then, for example, a simple random and/or chaotic process pattern may be employed. In some cases, the inaccuracy in the positioning and localization of the robot is not isotropic. For example, in a long corridor the robot can easily measure its distance to the walls to its right and left, but it has no orientation points from which it could determine its position along the walls (i.e. in a direction parallel to the walls). In such a case of direction-dependent uncertainty, it is advantageous to arrange the process pattern such that the direction in which the overlapping is to occur is the direction with the greater accuracy. In a corridor, for example, the robot should therefore plan path segments that run parallel to the walls.
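Both adaptations, switching the strategy when the needed overlap grows too large and orienting the pattern by the axis of greater uncertainty, could look roughly like this; the thresholds are assumptions for illustration:

```python
def select_strategy(sigma, path_width):
    """If the needed overlap (here taken as 2*sigma) consumes too much
    of the path width, systematic meandering stops paying off."""
    overlap = 2.0 * sigma
    if overlap < 0.5 * path_width:
        return "meander"
    if overlap < path_width:
        return "meander_twice_offset"   # second pass rotated and/or offset
    return "random"   # goal-directed navigation hardly possible

def pattern_orientation(sigma_x, sigma_y):
    """Let the overlap act along the more accurately known axis, i.e.
    run the segments along the axis with the larger uncertainty (in a
    corridor: parallel to the walls)."""
    return "segments_along_x" if sigma_x > sigma_y else "segments_along_y"
```

In a corridor running along x, the distance to the walls constrains y well (small sigma_y), so the segments are laid along x, exactly as argued above.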
The robot can measure the inaccuracy of the positioning and/or localization in the entire area G to be cleaned. The measured inaccuracy values can also be saved on the robot's map and taken into consideration at the start of a future deployment. In this way the robot can identify at least one part of the area to be cleaned in which the inaccuracy value predominantly exceeds a given threshold. The robot is then able to adapt the process pattern solely for this part. In this manner, for example, in a very large room the robot 100 can clean systematically in the proximity of obstacles that it can use to localize itself, and it can clean chaotically in the middle of the room where there are no detectable obstacles. This breakdown of the area G based on inaccuracy can, for example, be permanently saved by the robot (e.g. on a map). The methods disclosed herein may allow for improved efficiency and accuracy of such robots.
In the present example, the robot includes a drive module 101 that, for example, includes electric motors, gears and wheels. It may further include a power source and power electronics used to drive the robot's drive train. The drive module 101 enables the robot to—theoretically—reach any point within the robot deployment area. As mentioned, robot 100 may have access to a communication link 104 via communication module 103, which may include an interface to a local computer network (WiFi, ZigBee, etc.), the internet, a point-to-point wireless link (e.g. Bluetooth) or any other suitable communication means. Further, the processors and memories may be utilized to optimize the usage of computing and/or power resources while executing the operations conducted by the robot. As a result, such features provide substantial operational efficiencies and improvements over existing technologies.
The robot may further include a processing module 102 that may be, for example, a cleaning module configured to clean the floor. The cleaning module may include brushes, a vacuum unit or the like. In order to be able to perform a task autonomously, the robot may have a navigation module 110, with which the robot may orient itself and navigate across the robot deployment area using so-called navigation features, i.e. features with which the robot may orient itself in its environment, such as landmarks (furniture, doorways, corners of a room, walls, etc.) that may be detected by the robot using its sensor module 130. The navigation module 110 may, for example, employ an obstacle avoidance strategy and/or a SLAM (simultaneous localization and mapping) algorithm in connection with one or more electronic maps of the robot deployment area. The map(s) of the robot deployment area may be newly generated by the robot during a deployment (e.g. while performing a task). Alternatively, a previously stored map may be used during a deployment. The stored map may have been generated by the robot itself during a preceding deployment or provided by the user or by another robot. The memory module 120 may include a non-volatile memory (e.g. a solid state disc, SSD) and may contain the maps of the robot deployment area. Alternatively, the maps may be stored externally, e.g. in or by the external data processing device 300 (e.g. in a computer located in the apartment or by a cloud server).
The sensor module 130 may include one or more sensors for measuring distances to objects in the robot's environment, such as optical or acoustic sensors that operate using known triangulation or time-of-flight measurement techniques (e.g. triangulation sensor, time-of-flight camera, laser range finder, ultrasonic sensors, etc.). Other suitable sensors may be cameras (in connection with image processing techniques), tactile sensors, gyroscopic sensors, inertial measurement units (IMUs), odometers and/or floor clearance sensors.
While the machine-readable medium is shown in an example embodiment to be a single medium of memory 152, the term “machine-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions. The term “machine-readable medium” shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that causes the machine to perform any one or more of the methodologies of the present disclosure.
The term “machine-readable medium” shall accordingly be taken to include, but not be limited to: memory devices and solid-state memories, such as a memory card or other package that houses one or more read-only (non-volatile) memories, random access memories, or other re-writable (volatile) memories; magneto-optical or optical media such as a disk or tape; or any other self-contained information archive or set of archives, each of which is considered a distribution medium equivalent to a tangible storage medium. The “machine-readable medium” may be non-transitory, and, in certain embodiments, may not include a wave or signal per se. Accordingly, the disclosure is considered to include any one or more of a machine-readable medium or a distribution medium, as listed herein and including art-recognized equivalents and successor media, in which the software implementations herein are stored.
Although various embodiments have been illustrated and described with respect to one or more specific implementations, alterations and/or modifications may be made to the illustrated examples without departing from the spirit and scope of the features and structures recited herein. With particular regard to the various functions performed by the above described components or structures (units, assemblies, devices, circuits, systems, etc.), the terms (including a reference to a “means”) used to describe such components are intended to correspond—unless otherwise indicated—to any component or structure that performs the specified function of the described component (e.g., that is functionally equivalent), even if it is not structurally equivalent to the disclosed structure that performs the function in the herein illustrated exemplary implementations of the present disclosure.
Number | Date | Country | Kind |
---|---|---|---|
10 2015 119 865.7 | Nov 2015 | DE | national |
This application is a Continuation of U.S. patent application Ser. No. 15/982,657, filed May 17, 2018, which is a Continuation-In-Part Application and claims the benefit of PCT/AT2016/060110 designating the United States, filed Nov. 15, 2016, the entirety of which is herein incorporated by reference and which claims priority to German Patent Application No. DE 10 2015 119 865.7, filed Nov. 17, 2015.
Number | Date | Country | |
---|---|---|---|
20220050468 A1 | Feb 2022 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 15982657 | May 2018 | US |
Child | 17513609 | US |
Number | Date | Country | |
---|---|---|---|
Parent | PCT/AT2016/060110 | Nov 2016 | WO |
Child | 15982657 | US |