This disclosure relates to a mobile autonomous service robot such as, for example, a robot for processing surfaces (cleaning floors, mowing grass, painting a surface, etc.). In particular, a method for controlling an autonomous mobile service robot in order to locally process a given part of the area of robot employment is described.
In recent years, autonomous mobile robots have been increasingly employed in the household, for example to clean or to monitor an apartment. In addition, service robots such as, for example, the PR2 (Personal Robot 2) from Willow Garage Inc. are being developed that are equipped with gripper arms for independently gripping and transporting objects (for example, for taking a drink from the refrigerator and bringing it to the user). A further example of an autonomous mobile service robot is the so-called telepresence robot (Mobile Virtual Presence Device), which provides information or allows people to communicate over great distances.
One common employment scenario for a service robot is the execution of a task in a relatively small, locally contained area within the area of robot employment (e.g. an apartment), the location of which is indicated by the user. For example, a small cleaning robot may be employed by a user to clean up small dirtied areas such as, for example, crumbs that have fallen on the floor in a given area of the apartment. For this purpose, for example, the user can carry the robot to the dirtied area or direct it there using a remote control. In both cases the user must actively move the robot to the dirtied area.
The various embodiments described herein aim to provide new methods, and to improve existing ones, for directing an autonomous mobile robot to a given location within a larger area of robot employment in order to execute a task there.
In the following, a method for controlling an autonomous mobile robot for the purpose of executing a task in a localized area of an area of robot employment is described. In accordance with one embodiment, the method comprises: positioning of the robot at a start position within the area of robot employment; detection of information concerning the environment of the robot by means of a sensor; selection of a region of a certain geometric shape; and automatic determination, based on the detected information concerning the environment, of at least one of the following parameters of the selected region: size and location (the latter also including the orientation/alignment).
Further, a method is described for controlling an autonomous mobile robot with a navigation module that comprises at least one sensor for navigation and orientation in the environment of the robot by means of at least one electronic map. In accordance with one embodiment, the method comprises: positioning the robot at a starting position within the area of robot employment; carrying out a self-localization, in the course of which the robot determines, by means of the at least one sensor and at least one of the electronic maps, whether it is located on at least one of the electronic maps and, if so, at what position; verifying whether a task is linked to the determined robot position or to an area in which the determined robot position lies; and, if such a task exists, executing it.
In accordance with a further embodiment, the method comprises the following: positioning of the robot at a starting position within the area of robot employment; initiating the execution of a standard task by the robot; carrying out a self-localization, in the course of which the robot determines, by means of the at least one sensor and the at least one electronic map, whether it is located on at least one of the electronic maps and, if so, at what position; and verifying whether a task is linked to the determined robot position or to an area in which the determined robot position lies. If, after the robot has ended the self-localization, the determined robot position or an area in which the determined robot position lies is found to be linked to a first task, the standard task will be discontinued and the execution of the first task will be initiated.
Further, a method for controlling an autonomous mobile robot by means of a human-machine interface (HMI) is described, wherein the robot comprises a navigation module with an electronic map and with at least one sensor for navigation and orientation in the environment and wherein the robot and the HMI can exchange data via a communication connection. In accordance with one embodiment, the method comprises: displaying a map on the HMI; the user marking a point on the map; transferring the map coordinates of the marked point to the robot. The robot is then automatically controlled to begin processing a given area that is dependent on the marked point.
Further, a system with an autonomous mobile robot and an external device for wirelessly controlling the autonomous mobile robot is described. In accordance with one embodiment, the device is configured to harvest the energy needed for its operation from its environment (energy harvesting).
In accordance with a further embodiment, the external device comprises a switch and a transmission unit and is configured to transmit a coded signal when the switch is actuated. Here the robot is configured to receive a code included in a coded signal, wherein the code determines a given task that is to be carried out in a given localized region of the area of robot employment and wherein the robot, immediately or after a defined delay time following the reception of the coded signal, initiates the execution of the given task in the given localized region.
Finally, a method for controlling an autonomous mobile robot configured as a floor processing machine is described, wherein the autonomous mobile floor processing machine comprises a navigation module that is configured to detect map information regarding the environment of the floor processing machine and, with this information, to localize itself in and to navigate within the environment. The map information may at least partially be displayed on at least one human-machine interface, and the floor processing machine is configured to receive at least two different user commands from the at least one human-machine interface. One of the user commands signifies the complete floor processing of an area of employment of the floor processing machine. In accordance with one embodiment, the method comprises the following: deletion of all map information before the start of a complete floor processing; detection of new map information during the complete floor processing; display of the newly detected map information on a human-machine interface; reception of a user command based on the displayed map information; and control of the autonomous mobile floor processing machine in correspondence with the user command and the newly detected map information.
Various embodiments are described in the following in detail with the aid of the examples illustrated in the figures. The illustrations are not necessarily to scale and the embodiments are not limited to the aspects shown. Instead, emphasis is placed on illustrating the underlying principles. The figures show:
An autonomous mobile service robot can usually independently carry out one or more tasks such as, for example, the cleaning or monitoring of a household or the transport of objects within a household. The following examples are intended to describe various employment possibilities of a cleaning robot. The embodiments described herein, however, are not limited to the examples explained here, but rather may be generally applicable to all applications, in which a user wants to assign a task, at a user-definable position or within a user-definable region of a larger area of robot employment, to an autonomous mobile robot.
A mobile autonomous robot generally comprises a drive module which, for example, may comprise electric motors, gears and wheels, with which the robot—theoretically—can access every point of its area of employment. The robot may further comprise a work module such as, for example, a cleaning module for cleaning a floor surface (i.e. a brush, a vacuum cleaning device, etc.) or a gripper arm for gripping and transporting objects. A mobile autonomous robot configured as a telepresence robot dispenses with the work module and instead has at least one multimedia unit with, for example, a microphone, a loudspeaker, a camera and a screen.
In order to be able to autonomously carry out a task, a mobile autonomous robot generally has a navigation module and appropriate sensors, with the aid of which the robot can orient itself in and navigate throughout its environment. The navigation module can implement, e.g. an obstacle avoidance strategy and/or employ a SLAM algorithm (Simultaneous Localization and Mapping; see, e.g., H. Durrant-Whyte and T. Bailey: "Simultaneous Localization and Mapping (SLAM): Part I The Essential Algorithms", in: IEEE Robotics and Automation Magazine, Vol. 13, No. 2, pgs. 99-110, June 2006). For this purpose, one or more maps of the area of robot employment can be stored in the robot. The robot may compile the map of the area of robot employment anew or may use an existing map that is already present at the start of its employment. An existing map may have been compiled during a previous employment, for example during an exploratory run, by the robot itself, or it may have been made available by another robot and/or the user. A sensor that is suitable for orienting and navigating the robot is, for example, a sensor that is configured to measure distances to objects in the environment such as, for example, an optical and/or acoustic sensor that operates, e.g. by means of triangulation or time-of-flight measurement of an emitted signal (e.g. triangulation sensor, time-of-flight camera, laser scanner, ultrasonic sensors, etc.). Other typical examples of suitable sensors are cameras (together with digital image processing), tactile sensors, acceleration sensors, gyroscopes or odometers.
The software responsible for the behavior of the robot can be run entirely on the robot (on a corresponding processor and storage element) or may be at least partially outsourced to an external computer that is accessible, for example, in a home network or via the internet (cloud).
As is shown in
The processing of the standardized region L can be carried out, for example, along a meandering path P (see
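By way of illustration, the following Python sketch generates such a meandering (boustrophedon) path for a rectangular region L; the function name and the track spacing value are illustrative assumptions, not taken from the disclosure.

```python
def meander_path(width, height, track_spacing=0.25):
    """Waypoints of a meandering path P covering a width x height
    rectangle (origin at one corner): parallel passes spaced at roughly
    the cleaning width of the robot, traversed in alternating direction."""
    waypoints = []
    y, left_to_right = 0.0, True
    while y <= height:
        x_start, x_end = (0.0, width) if left_to_right else (width, 0.0)
        waypoints.append((x_start, y))  # start of the straight pass
        waypoints.append((x_end, y))    # end of the pass; then step sideways
        y += track_spacing
        left_to_right = not left_to_right
    return waypoints
```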
In order to improve the practical utility for the user, it is desirable for the modes of operation for cleaning a locally delimited dirtied area by an autonomous mobile robot described with reference to
“Intelligent” automatic adaptation to the environment:
In the example shown in
As an alternative, a mathematical distribution model can be determined that represents or approximates the (actual, as determined by the sensors) distribution of dirt within the standardized region L. For example, it can be assumed that the dirt is (two-dimensionally) normally distributed. Such a normal distribution is defined by its maximum (average value) and its width (standard deviation). By determining the most heavily dirtied spot in the region L, the average value of the distribution can be estimated. Using the spatial change of the distribution beginning at the most heavily dirtied spot, the standard deviation (within the region L) can also be estimated. Based on the thus determined mathematical distribution model, the amount and distribution of the dirt outside of the region L can be estimated, and based on this the borders of the adapted region L′ can be determined, for example, such that the probability of a dirtied area lying outside of the thus defined region L′ is smaller than a threshold value (for example, smaller than 1%). In addition to the normal distribution, any other stochastic distribution model may be used to formulate a mathematical model of the dirt distribution. In an alternative embodiment, the distribution and amount of the dirt outside of the region L is estimated by extrapolating the actual distribution of the dirt within the region L as determined with the aid of (at least) one dirt sensor.
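One possible reading of this estimation step is sketched below in Python, assuming dirt-sensor readings are available as weighted 2D sample points, treating the two axes independently, and adapting a rectangular region L′; the function names and the use of scipy are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm

def fit_dirt_model(points, weights):
    """Estimate a 2D normal distribution of the dirt from sensor
    readings: `points` are (x, y) sample positions inside region L,
    `weights` the dirt intensity measured there."""
    points = np.asarray(points, dtype=float)
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    mean = w @ points                    # estimated most heavily dirtied spot
    diff = points - mean
    cov = (w[:, None] * diff).T @ diff   # spatial spread of the dirt
    return mean, cov

def adapted_region(mean, cov, outside_prob=0.01):
    """Center and half-extents of an axis-aligned rectangle L' such
    that, per axis, the probability of dirt lying outside L' stays
    below `outside_prob` (e.g. 1%)."""
    z = norm.ppf(1.0 - outside_prob / 2.0)  # two-sided quantile, ~2.58 for 1%
    return mean, z * np.sqrt(np.diag(cov))
```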
In the example shown in
Alternatively, the standardized region L may also first be completely cleaned. Based on the sensor measurements, the cleaning process is then repeated, wherein during the repetition the shifted region L′ is cleaned. The shifted region L′ is determined, for example, such that the maximum of the dirt (that was cleaned during the first cleaning run) lies at its center point. This procedure can be iteratively repeated as needed, wherein the sensor data of the directly preceding cleaning or that of all preceding cleanings can be taken into consideration.
The decision as to whether the standardized region L should be aligned with a wall and, if so, then how, can be made, for example, based on the distance of the starting position of the robot to the closest wall and/or the distances of the border points of the standardized region L to the surrounding walls. In the example of
In addition to adapting the standardized region L based on the data gathered by the sensors, the robot can also adapt the basic geometric shape of the standardized region L to the spatial conditions of the environment surrounding the starting point specified by the user. Thus, for example, (with the aid of sensors) the robot can recognize that it has been placed near a wall and/or a corner of the room. In this case, a standardized region L with a basic quadratic or rectangular shape is chosen, and this basic shape of the region to be cleaned L is modified by adapting it to the wall and/or the corner, as described above. If, in the opposite case, the robot is placed in a large open area (e.g. in the middle of a room) and there is no immediately apparent preferred direction along which to align, e.g., a square, then the geometric shape chosen for the region L may be a circle. This region L can then be modified, for example, as described with reference to
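A minimal sketch of this shape selection follows, assuming the navigation sensors can report the distance and heading of the nearest wall (if any); the Region structure, the 0.5 m threshold and the default size are illustrative, not values from the disclosure.

```python
from dataclasses import dataclass

@dataclass
class Region:
    shape: str      # "rectangle" or "circle"
    center: tuple   # (x, y) starting position of the robot
    size: float     # side length or diameter in meters
    heading: float  # orientation in radians (meaningful for rectangles)

def choose_base_region(start, nearest_wall, wall_threshold=0.5, default_size=1.0):
    """Pick the basic geometric shape of region L from the environment:
    near a wall or corner -> rectangle aligned with the wall; in open
    space with no preferred direction -> circle. `nearest_wall` is a
    (distance, wall_heading) pair, or None if no wall was detected."""
    if nearest_wall is not None and nearest_wall[0] < wall_threshold:
        _, wall_heading = nearest_wall
        return Region("rectangle", start, default_size, wall_heading)
    return Region("circle", start, default_size, 0.0)
```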
Numerous further examples in which the user's work is made easier by automatically adapting a region L for the localized cleaning of a floor surface are possible. The robot, for example, can be equipped with a sensor for recognizing floor coverings and can adapt the standardized region L based on these measurements. Thus the robot can be placed by a user onto a carpet in order to specifically clean the same. The region L can then be aligned, e.g. along the borders of the carpet, analogously to the alignment along a wall described above.
The robot may have, e.g. a sensor directed at the ceiling, such as, e.g. a camera. With such a sensor the robot can detect, e.g. whether it has been placed near a piece of furniture that it can move under. In this case, the robot can adapt the standardized region L based on these measurements. Thus the robot can be placed by the user, for example, directly in front of a bed (i.e. no further than a maximum distance from the bed) in order to specifically clean underneath this bed. The region L will then be adapted to the dimensions of the bed.
Further, the robot can be configured to recognize a table with chairs (dinette). If a table with chairs (a group of furniture) is detected, the standardized region L can be adapted to the group of furniture. The robot can choose the region to be cleaned, e.g. based on the recognized obstacles (in particular the legs of tables and chairs), so as to be large enough that, for example, an area around the group of furniture (i.e. the table and all the chairs) equivalent to at least one width of the robot is cleaned. For this purpose, for example, a rectangle that completely encompasses the group of furniture is chosen as the region to be cleaned L′. The orientation of the rectangle can be determined based on how the group of furniture and/or nearby walls is(are) aligned. The borders of the rectangle are determined, for example, such that the minimum distance to a table or chair leg corresponds to at least one robot diameter. Thus a user, for example, after breakfast, may place the robot next to the breakfast table to clean up crumbs that have fallen to the floor. Recognition is carried out, for example, with a camera that uses image processing, or based on the numerous small but regularly occurring obstacles formed by the legs of the table and chairs.
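The following sketch illustrates one way to compute such a rectangle from detected leg positions, using an axis-aligned rectangle for simplicity (the disclosure also allows orienting it by the furniture group or nearby walls); the robot diameter value is illustrative.

```python
import numpy as np

ROBOT_DIAMETER = 0.35  # meters; illustrative value, not from the disclosure

def dinette_region(leg_positions, margin=ROBOT_DIAMETER):
    """Axis-aligned rectangle L' that encloses all detected table and
    chair legs with a margin of at least one robot diameter, so that a
    strip around the furniture group is cleaned as well."""
    legs = np.asarray(leg_positions, dtype=float)   # shape (N, 2)
    lo = legs.min(axis=0) - margin
    hi = legs.max(axis=0) + margin
    return lo, hi   # opposite corners of the region to be cleaned
```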
Further, the robot may be placed by the user, for example, near the door of a room, as a result of which the standardized region L would span parts of two rooms. The robot then adapts the region to be cleaned such that it lies completely within one room. At the end of the cleaning, the robot (with or without consulting the user) can adapt the region L again such that it lies completely within the other room.
The robot can also be configured to choose, based on the newly formed geometric shape of the region to be cleaned L′, a suitable cleaning pattern. For example, for a small round or square region L′, a spiral path P (cf.
The robot may have a map of the area of robot employment available by means of which it can localize itself. In this case, in addition to the information received from the sensors, information from the map can also be used to modify the region L to be processed and to adapt it to the conditions in the proximity of the starting point. In particular, the robot can take into consideration and avoid danger zones, either those it had previously identified or those designated as such by the user. Alternatively, it is possible to carry out adaptations, for example, by automatically selecting basic, previously specified shapes and sizes for the region to be cleaned based on the sensor measurements (of the navigation module sensors). When the region to be cleaned L′ is determined, requirements can be imposed regarding a specifiable extent of the processing (e.g. the surface area to be cleaned, the duration of the cleaning). Thus, for example, it can be required that at least one square meter be cleaned in order to achieve a significant cleaning result. A further requirement may be that the expected cleaning duration should not exceed a specifiable time, for example, 10 minutes. The requirements regarding the extent of the processing (in particular minimum and maximum) can be specified, for example, by the user.
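These minimum and maximum extents could be checked, for example, as in the following sketch; the constants and the assumed cleaning throughput are illustrative values, not taken from the disclosure.

```python
MIN_AREA_M2 = 1.0              # clean at least one square meter
MAX_DURATION_S = 600.0         # expected cleaning time capped at 10 minutes
CLEANING_RATE_M2_PER_S = 0.02  # illustrative robot-specific throughput

def satisfies_extent_requirements(area_m2):
    """Check the adapted region L' against user-specifiable minimum and
    maximum processing extents before the cleaning is started."""
    expected_duration = area_m2 / CLEANING_RATE_M2_PER_S
    return area_m2 >= MIN_AREA_M2 and expected_duration <= MAX_DURATION_S
```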
Localization and returning home: One desirable function of service robots in a household is the ability to carry out regularly occurring tasks such as, for example, the cleaning of a quickly dirtied area (e.g. the floor of a dinette) without, however, having to clean the entire room or the complete apartment. The transport of used dishes, e.g. from the table in a dinette to a dishwashing machine, may be another of these regularly occurring tasks. The relevant literature describes, as solutions for such problems, robot systems that have a permanent map of the area of robot employment. This map can be compiled by the robot itself or may otherwise be made available. The user can have the electronic map of the area of robot employment displayed to him on a human-machine interface (e.g. a tablet PC 200, cf.
Some users may find it inconvenient to first, e.g., look for the tablet PC and then to turn it on in order to instruct the robot to clean up a small dirtied area, when they could simply carry the robot to the dirtied area and start it in less time. Furthermore, situations may arise in which the robot is unable to move directly to the area to be processed, such as when, for example, it is employed in a house having two or more floors and it cannot independently move from one floor to the next. In this case as well, the user must relocate the robot manually and may just as well carry it directly to the area to be processed and start it there. In such situations it may be advantageous for the robot to independently recognize, after having been positioned and started by the user, that a previously defined standard operating region exists for its position. For example, the user places the robot next to a table leg TL (see
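The overall decision logic might look like the following sketch; robot.self_localize, map_db.standard_regions and the other names are hypothetical API placeholders, not an actual robot interface.

```python
def on_user_start(robot, map_db):
    """After being put down and started, the robot first tries to
    localize itself on one of its stored maps; if the determined
    position lies in a previously defined standard operating region,
    the task linked to that region is executed."""
    pose = robot.self_localize(map_db)   # hypothetical; None if it fails
    if pose is None:
        robot.run_standard_task()        # fall back to default behavior
        return
    for region in map_db.standard_regions:
        if region.contains(pose.position) and region.task is not None:
            robot.execute(region.task, region)
            return
    robot.run_standard_task()            # no linked task at this position
```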
In order to accelerate the cleaning process, the robot can also begin cleaning a standardized region L (or a modified standardized region L′, cf.
A localization of the robot 100 on an existing map of the area of robot employment also enables the robot 100 to verify with the map (cf.
A localization of the robot 100 on an existing map of the area of robot employment also enables the robot 100 to recognize, using the map, previously identified danger zones in its environment and/or those that have been marked on a map by the user. These can be taken into account when determining the area to be cleaned and, in particular, can be avoided. Such a danger zone might be, for example, an area in which the robot regularly becomes stuck and can only be liberated by means of intervention by the user. It may also be an area, however, that the user does not want the robot to enter because it contains, for example, items of furniture that are very fragile, or because small parts of toys might frequently be lying around in the area.
A localization of the robot 100 on an existing map of the area of robot employment also enables the robot 100 to display, during and/or after the cleaning, both the region to be cleaned (e.g. region L′ as according to
Finally, a localization of the robot on an existing map of the area of robot employment enables the robot to automatically save the position of the regions for localized cleaning (region L′ in accordance with
Quick start by pressing a button: In order to process a previously defined and saved standard operating region L″ (see
In a further embodiment a robot-external device is used that is configured, for example, to wirelessly transmit a signal to the robot when a user presses a button or when another event takes place. One example is shown in
Such an external device 210 can be realized simply and at minimal expense. It needs an energy supply (in the electronic system), a switch 213 (e.g. a button) that is actuated upon a given event, and a transmission unit (in the electronic system 211) for generating a radio signal in order to send the signal S to the robot 100 after the switch has detected the event. The energy supply can be a battery. Alternatively, the needed energy can be derived from the environment and/or from the event to be detected (energy harvesting). In particular, the needed energy can be derived from the mechanical labor exerted when the switch (button) is actuated. Accordingly, for example, the switch may be linked with a piezoelectric element that derives the energy needed to send the signal to the robot from the pressing of the button by the user. The switch 213 can be embodied as a simple button switch or contact switch that is triggered by being pressed by the user or by the opening and/or closing of a door. The transmission unit can emit an electromagnetic signal corresponding to a standard used in wireless networks, such as ZigBee, WiFi or Bluetooth. The signal may be directly received by the robot, as shown in
The signal S may transport information that can be permanently linked to a certain standard operating region L″ (cf.
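A sketch of how the received code could be dispatched to a task follows; the code table, its entries and the delay handling are illustrative assumptions, not taken from the disclosure.

```python
import time

# Hypothetical mapping from the code carried in signal S to a task in a
# standard operating region L''; the entries are purely illustrative.
CODE_TABLE = {
    0x01: {"region": "dinette", "task": "clean", "delay_s": 0.0},
    0x02: {"region": "entrance", "task": "clean", "delay_s": 300.0},
}

def on_coded_signal(robot, code):
    """React to a coded radio signal from the external device: look up
    the linked region and task, wait the defined delay time if one is
    configured, then start processing the region."""
    entry = CODE_TABLE.get(code)
    if entry is None:
        return                        # unknown code, ignore
    if entry["delay_s"] > 0:
        time.sleep(entry["delay_s"])  # e.g. let people leave the dinette first
    robot.process_region(entry["region"], entry["task"])  # hypothetical API
```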
The example of an external device for controlling an autonomous mobile robot that derives the energy for its own operation by means of energy harvesting techniques, as described above, can be expanded upon. In the most general terms, energy harvesting is understood to mean the “collecting” of energy from the environment and/or from the energy that is exerted when a device is used (e.g. by mechanical labor, as when a switch is actuated). Various physical effects can be utilized to generate electrical energy, such as, e.g., the piezoelectric effect, the thermoelectric effect, the photoelectric effect, osmosis, mechanical labor involving movement, induction, etc. Instead of the switch 213 mentioned above, the device may also have a sensor for detecting one or more parameters such as, e.g., the humidity in a flower pot or the pH value in a swimming pool. The measured parameter value (or values dependent thereon) is wirelessly transmitted to the robot, which then, based on this value (or based on numerous values received over a certain period of time), reacts in a specified manner (e.g. by carrying out a previously defined task at a previously defined place). The robot, for example, can water the flowers or correct the pH value of the pool by adding a chemical. The decision as to which action should be carried out by the robot can be made in the robot or in a server connected to the same (e.g. a cloud service).
If the user touches the map (e.g. displayed on the tablet PC 200) at a different point outside the standardized region L, a further point P can be added. This point can also be shifted to an appropriate position. In this manner, the user can assign numerous smaller tasks to the robot. The robot can independently plan how to carry these out. It can thus plan the sequence, for example, such that as little time as possible will be needed to complete the processing. If two (or more) regions overlap or lie very close to each other (at a distance smaller than a specifiable maximum value), the robot can independently combine the two (or more) standardized regions L into one new region. In the simplest case, the standardized region L is always a square or a circle of the same size, for example, surrounding the point P designated by the touch of a finger. Alternatively, the user may select one of various predefined sizes (lateral length, radius, surface area) and shapes (circle, square, rectangle) from a menu displayed on the tablet PC 200.
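One way to combine overlapping or nearby regions is sketched below for circular regions; the single-pass merge and the 0.5 m maximum gap are illustrative simplifications (chains of merges may need repeated calls).

```python
import math

def merge_close_regions(regions, max_gap=0.5):
    """Combine standardized regions L (given as circles (cx, cy, r))
    that overlap or lie closer together than `max_gap` meters into
    single regions, so the robot can treat them as one task."""
    merged = []
    for cx, cy, r in regions:
        for i, (mx, my, mr) in enumerate(merged):
            d = math.hypot(cx - mx, cy - my)
            if d <= r + mr + max_gap:
                # replace with the smallest circle enclosing both regions
                nr = max((d + r + mr) / 2.0, r, mr)
                t = (nr - mr) / d if d > 0 else 0.0
                merged[i] = (mx + (cx - mx) * t, my + (cy - my) * t, nr)
                break
        else:
            merged.append((cx, cy, r))
    return merged
```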
In the example shown in
As described above, the cleaning begins as soon as the user has confirmed the assignment, for example by pressing an OK button, and the robot has reached a previously specified region L or L′. In order to start its work more quickly, the robot can set out from its current position (e.g. the base station) and start moving in the direction of the first inputted position while the user is still making the input (see, e.g. the robot position in
Since the map that a robot uses for navigation is generally complex and difficult for a human user to read, the user can be shown a greatly simplified and easy-to-interpret map. The user may then indicate the position P of the region L to be cleaned on this simplified map. After this, the position P is transformed by means of a coordinate transformation into coordinates on the map that the robot 100 uses for navigation. Both the compilation of the simplified map based on the robot map and the coordinate transformation between the two maps can be carried out on the robot 100, on the human-machine interface 200, or on an external computer (in particular a cloud service accessible via the internet).
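Such a coordinate transformation can be expressed, for example, as a 2D similarity transform (scaling, rotation, translation) between the simplified map and the robot map; the parameter values in the sketch are illustrative.

```python
import numpy as np

def make_transform(scale, theta, translation):
    """2D similarity transform from simplified-map coordinates to the
    coordinates of the map the robot 100 uses for navigation."""
    c, s = np.cos(theta), np.sin(theta)
    A = scale * np.array([[c, -s], [s, c]])   # rotation and scaling
    t = np.asarray(translation, dtype=float)  # translation offset
    return lambda p: A @ np.asarray(p, dtype=float) + t

# e.g. mapping the point P marked by the user on the simplified map
# (pixel coordinates) into robot map coordinates; values illustrative:
to_robot_map = make_transform(scale=0.05, theta=np.pi / 2, translation=(1.2, -0.4))
robot_xy = to_robot_map((310, 220))
```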
In the example described above with reference to
In order to achieve this purpose, the entire map data is compiled anew, for example, using a SLAM method, during every (complete) cleaning run. When doing so, the old map data is deleted at the beginning of the (complete) cleaning run (see
After a complete cleaning, the robot returns, for example, to its initial position or to the base station (cf.
A further user intervention might be the relocation of the robot. For this purpose, the robot is picked up by the user and put down in a new area of employment (e.g. on a different floor) or at a new position in the present area of employment, which the robot can detect, for example, by means of a wheel contact switch. In this case the robot can attempt to determine its position on the old map (global localization,
A further user intervention is the complete shutdown of the robot. In this case the robot can no longer detect a relocation. After a restart, the robot can attempt, for example, to localize itself in the old map data. In an alternative embodiment, the previously gathered map information is saved in a volatile memory, causing it to be automatically deleted when the robot is shut down (
Adopting the approach outlined above, an autonomous mobile floor processing machine can be controlled in the following manner. First, all map information is deleted before the start of a complete floor processing in the area of employment of the floor processing machine. During the complete floor processing, new map information is gathered. The newly gathered map information is displayed, as described above, on a human-machine interface in order to allow a user interaction. At least one user command, which is based on the displayed map information, is received by this human-machine interface and the autonomous mobile floor processing machine is controlled in accordance with the user command and the newly gathered map information. For this the floor processing machine needs a navigation module (see also
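Put together, the control sequence described here can be sketched as follows; machine and hmi with the methods shown are hypothetical placeholders for the floor processing machine's and the human-machine interface's actual interfaces.

```python
def complete_cleaning_run(machine, hmi):
    """Control flow sketched from the description: discard old map
    data, gather new map information during the complete floor
    processing, display it on the human-machine interface, then act on
    a user command that refers to the newly gathered map."""
    machine.map.clear()                      # delete all old map information
    machine.run_complete_floor_processing()  # SLAM builds the map anew
    hmi.display(machine.map)                 # show newly detected map info
    command = hmi.receive_user_command()     # e.g. "clean region around point P"
    machine.execute(command, machine.map)
```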
Although various embodiments have been illustrated and described with respect to one or more specific implementations, alterations and/or modifications may be made to the illustrated examples without departing from the spirit and scope of the appended claims. With particular regard to the various functions performed by the above described components or structures (units, assemblies, devices, circuits, systems, etc.), the terms (including a reference to a “means”) used to describe such components are intended to correspond—unless otherwise indicated—to any component or structure that performs the specified function of the described component (e.g., that is functionally equivalent), even if it is not structurally equivalent to the disclosed structure that performs the function in the herein illustrated exemplary implementations of the embodiments described herein.
Further, the purpose of the Abstract of the Disclosure is to enable the U.S. Patent and Trademark Office and the public generally, and especially the scientists, engineers and practitioners in the art who are not familiar with patent or legal terms or phraseology, to determine quickly from a cursory inspection the nature and essence of the technical disclosure of the application. The Abstract of the Disclosure is not intended to be limiting as to the scope in any way.
Finally, it is the applicant's intent that only claims that include the express language “means for” or “step for” be interpreted under 35 U.S.C. 112. Claims that do not expressly include the phrase “means for” or “step for” are not to be interpreted under 35 U.S.C. 112.
Number | Date | Country | Kind |
---|---|---|---|
102016102644.1 | Feb 2016 | DE | national |
This application is a continuation of and claims the benefit of U.S. patent application Ser. No. 16/077,929, filed Aug. 14, 2018, the entirety of which is incorporated by reference, and which is a § 371 National Phase of PCT/EP2017/053398, filed Feb. 15, 2017, the entirety of which is incorporated by reference and which claims priority to German Patent Application No. 10 2016 102 644.1, filed Feb. 15, 2016.
Other Publications:

Alriksson et al., "A component-based approach to localization and collision avoidance for mobile multi-agent systems," IEEE, 2007, pp. 4285-4292.

Eren et al., "Accuracy in position estimation of mobile robots based on coded infrared signal transmission," IEEE, 1995, pp. 548-551.

Eren et al., "Position estimation of mobile robots based on coded infrared signal transmission," IEEE, 1997, pp. 1280-1280.

Chen et al., "Complete coverage motion control of a cleaning robot using infrared sensors," IEEE, 2005, pp. 543-548.

Choset et al., "Principles of Robot Motion: Theory, Algorithms, and Implementations," Chapter 6—Cell Decompositions, 2004, document of 41 pages.

Durrant-Whyte et al., "Simultaneous Localization and Mapping (SLAM): Part I The Essential Algorithms," IEEE Robotics and Automation Magazine, vol. 13, no. 2, pp. 99-110, Jun. 2006.

Kim et al., "User-Centered Approach to Path Planning of Cleaning Robots: Analyzing User's Cleaning Behavior," Proceedings of the 2007 ACM/IEEE Conference on Human-Robot Interaction, Mar. 8-11, 2007, pp. 373-380.

Konolige et al., "A Low-Cost Laser Distance Sensor," 2008 IEEE International Conference on Robotics and Automation, Pasadena, CA, USA, May 19-23, 2008, document of 7 pages.

Oh et al., "Autonomous Battery Recharging for Indoor Mobile Robots," Massachusetts Institute of Technology Press, Aug. 30, 2000, document of 6 pages, XP055321836.

Siegwart et al., "Introduction to Autonomous Mobile Robots," Massachusetts, ISBN 978-0-26-219502-7, 2004, pp. 104-115, 151-163, 250-251, document of 37 pages. http://www.robotee.com/EBooks/Introduction_to_Autonomous_Mobile_Robots.pdf, XP055054850.

Lymberopoulos et al., "A Realistic Evaluation and Comparison of Indoor Location Technologies: Experiences and Lessons Learned," IPSN '15, Apr. 14-16, 2015, Seattle, WA, USA, document of 12 pages. http://dx.doi.org/10.1145/2737095.27.

Neto et al., "Human-Machine Interface Based on Electro-Biological Signals for Mobile Vehicles," IEEE, 2006, pp. 2954-2959.

Forlizzi, "How robotic products become social products: An ethnographic study of cleaning in the home," IEEE, 2007, pp. 129-136.

Sick Sensor Intelligence, "LMS200/211/221/291 Laser Measurement Systems," Jan. 2007, pp. 1-48, XP055581229. http://sicktoolbox.sourceforge.net/docs/sick-lms-technical-description.pdf.

World Intellectual Property Office, "International Search Report" and English translation thereof, issued in PCT/EP2017/053398, dated Jun. 12, 2017, document of 8 pages.

German Patent Office, "Office Action" issued in German Patent Application No. 10 2016 102 644.1, dated Jan. 16, 2017, document of 11 pages.

Office Action issued in Japanese Patent Application No. 2021-186362, dated Feb. 7, 2023, with translation.