Methods for real-time and near real-time interactions with robots that service a facility

Information

  • Patent Grant
  • Patent Number
    8,892,256
  • Date Filed
    Wednesday, January 28, 2009
  • Date Issued
    Tuesday, November 18, 2014
Abstract
In accordance with aspects of the present invention, a service robot and methods for controlling such a robot are provided. In particular, the robot is configured to sense the presence of a person and to take a next action in response to sensing the presence of the person. As examples, the robot could leave the area, await commands from the person, or enter an idle or sleep state or mode until the person leaves.
Description
FIELD OF INTEREST

The present inventive concepts relate to methods for optimal interactions between people and service robots, including robotic cleaners, while such robots are servicing a facility.


BACKGROUND

One of the advantages of service robots is that they can do the dull and dirty jobs in human facilities, such as homes, commercial and industrial buildings, and institutions. However, that very action of robotic service itself may be unpleasant, inconvenient, disruptive, or even dangerous to a human that comes into proximity of the working robot.


Previous service robots have ignored this problem. As an example, current robot cleaners blithely treat humans the same way they treat the leg of a stool: usually by bumping into it, going around it, and continuing their work.


Although some robots built for human interaction have included the ability to recognize humans in their proximity, to date this has been used to further the interaction itself, but not to further a distinct service agenda.


SUMMARY OF INVENTION

The present invention has been conceived to solve the above-mentioned problems occurring in the prior art. In accordance with aspects of the present invention, provided are a system and method that allow a robot to service a facility while interacting more effectively with people.


In order to achieve the above aspects, there are provided various methods of enabling the robot to respond optimally in the presence of a person.


The robot can be configured to have a work pause, whereby when the robot senses the proximity of a person it stops working and remains still and silent.


The robot can be configured to move out of the way, whereby when the robot senses the proximity of a person it stops working and actively moves to a position in the room that is least disturbing to the person. This may be away from the person or nearer the person, depending on other factors.


The robot can be configured to move out of the room, whereby when the robot senses the proximity of a person it moves to another area of the facility where it is no longer in proximity to people.


The robot can be configured to look to a person to point things out when the person enters the room. This can provide a seamless way to improve the robot's decision-making process.


The robot can be configured to keep track of a work stoppage, so that it can restart work in the area once the person has left.





BRIEF DESCRIPTION OF THE DRAWINGS

The present invention will become more apparent in view of the attached drawings and accompanying detailed description. The embodiments depicted therein are provided by way of example, not by way of limitation, wherein like reference numerals refer to the same or similar elements. The drawings are not necessarily to scale, emphasis instead being placed upon illustrating aspects of the invention. In the drawings:



FIGS. 1, 2, 3, 4, and 5 are flowcharts illustrating embodiments of control operation procedures of a robot configured to service a facility and to interact with humans during such service.



FIG. 6 shows an embodiment of a service robot including an upward facing sensor.



FIG. 7 shows an embodiment of a service robot including multiple sensors.



FIG. 8 shows an embodiment of a service robot interactively communicating with multiple sensors installed in an environment.





DETAILED DESCRIPTION OF PREFERRED EMBODIMENT

Hereinafter, aspects of the present invention will be described by explaining illustrative embodiments in accordance therewith, with reference to the attached drawings. While describing these embodiments, detailed descriptions of well-known items, functions, or configurations are typically omitted for conciseness.


It will be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are used to distinguish one element from another, but not to imply a required sequence of elements. For example, a first element can be termed a second element, and, similarly, a second element can be termed a first element, without departing from the scope of the present invention. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items.


It will be understood that when an element is referred to as being “on” or “connected” or “coupled” to another element, it can be directly on or connected or coupled to the other element or intervening elements can be present. In contrast, when an element is referred to as being “directly on” or “directly connected” or “directly coupled” to another element, there are no intervening elements present. Other words used to describe the relationship between elements should be interpreted in a like fashion (e.g., “between” versus “directly between,” “adjacent” versus “directly adjacent,” etc.).


The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms “a,” “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises,” “comprising,” “includes” and/or “including,” when used herein, specify the presence of stated features, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, elements, components, and/or groups thereof.


Spatially relative terms, such as “beneath,” “below,” “lower,” “above,” “upper” and the like may be used to describe an element and/or feature's relationship to another element(s) and/or feature(s) as, for example, illustrated in the figures. It will be understood that the spatially relative terms are intended to encompass different orientations of the device in use and/or operation in addition to the orientation depicted in the figures. For example, if the device in the figures is turned over, elements described as “below” and/or “beneath” other elements or features would then be oriented “above” the other elements or features. The device may be otherwise oriented (e.g., rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly.


In accordance with aspects of the present invention, a robotic cleaner (or other type of service platform) can be configured to implement a method of more effectively interacting with people while servicing a space. The platform and method can determine that the platform is in the presence of people and implement a different servicing pattern or behavior as a function thereof. The determination that the robotic platform is in the presence of people can be accomplished using any one or more of a plurality of types of sensors mounted on, integral with, or coupled to the robotic platform, or mounted elsewhere in the environment and in communication, directly or indirectly, with the robotic platform.
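
The following is a minimal illustrative sketch, not the patented implementation, of how presence signals from onboard or environment-mounted sensors might be combined into a single "person present" decision. All names here (PresenceSource, PresenceDetector, read) are hypothetical.

```python
# Illustrative sketch only: fusing several presence signals
# (onboard and environment-mounted) into one decision.
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class PresenceSource:
    name: str                    # e.g. "onboard_camera", "room_motion_sensor"
    read: Callable[[], bool]     # returns True if this source detects a person


class PresenceDetector:
    """Reports that a person is present if any configured source says so."""

    def __init__(self, sources: List[PresenceSource]):
        self.sources = sources

    def person_present(self) -> bool:
        return any(src.read() for src in self.sources)
```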



FIG. 1 is a flowchart 100 of an embodiment of a method for controlling a service robot when a person enters an area or room. In this embodiment the service robot is a cleaning robot, such as a robotic vacuum. This is a work pause method, whereby when the robot senses the proximity of a person it stops working and remains still and silent.


In step 102 the robot is in the process of servicing (e.g., cleaning) a room (or area). In step 104 the robot senses whether or not a person has entered the room. A variety of different types of sensors could be used, such as, for example, motion detection sensors, video camera sensors, acoustic sensors and so on.


If in step 104 a person was not sensed, the process continues in step 102, where the robot continues to clean, in this example. Thus, the sensing can serve as an interrupt condition to the servicing condition of the robot. If in step 104 the answer was “yes,” then the process continues to step 106, where the robot stops and sits still. The robot can then transition to a quiet, sleep, or inactive mode in which it makes little or no noise and uses minimal power, e.g., at least enough to power the sensors.


In step 108 the robot senses whether the person has left the room or area. If not, the process remains at (or returns to) step 106, where the robot remains idle. If the person did leave the room or area, then the process continues to step 102, where the robot wakes up and resumes cleaning.
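
A minimal sketch of the FIG. 1 work-pause loop follows, assuming a hypothetical robot object exposing methods such as has_work_remaining, stop_and_sit_still, enter_sleep_mode, wake_up, and clean_step, and the PresenceDetector sketched above.

```python
import time

IDLE_POLL_SECONDS = 1.0   # hypothetical polling interval while idle


def work_pause_loop(robot, detector):
    """Sketch of the FIG. 1 behavior: clean until a person is sensed,
    then sit still in a low-power mode until the person leaves."""
    while robot.has_work_remaining():
        if detector.person_present():          # step 104
            robot.stop_and_sit_still()         # step 106
            robot.enter_sleep_mode()           # quiet, minimal power
            while detector.person_present():   # step 108
                time.sleep(IDLE_POLL_SECONDS)
            robot.wake_up()                    # person left, resume
        robot.clean_step()                     # step 102: continue cleaning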



FIG. 2 is a flowchart 200 of another embodiment of a method for controlling a service robot when a person enters an area or a room. In this method, when the robot senses the proximity of a person, it moves to another area of the facility where it is no longer in proximity to people.


In step 202 the robot is cleaning in an area. In step 204 a determination is made of whether a person has entered the area. This is preferably an on-going sensing activity, rather than a discrete standalone step. If a person was not sensed, the process continues in step 202, where the robot continues cleaning. However, if in step 204 it was determined that a person entered the room or area, the process continues to step 206. In this step the robot leaves the area it was cleaning. In step 208 the robot enters a new area and can begin cleaning that area, if needed. Preferably the new area is one without people.
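
One possible rendering of the FIG. 2 flow is sketched below. The area_map object and its nearest_unoccupied_area method are hypothetical placeholders for whatever facility map the robot maintains.

```python
def leave_area_on_person(robot, detector, area_map):
    """Sketch of the FIG. 2 behavior: when a person enters, move to a
    different, preferably unoccupied, area and continue servicing there."""
    while robot.has_work_remaining():
        if detector.person_present():                              # step 204
            current = robot.current_area()
            robot.stop_cleaning()                                  # step 206
            next_area = area_map.nearest_unoccupied_area(exclude=current)
            robot.navigate_to(next_area)                           # step 208
            if next_area.needs_cleaning():
                robot.start_cleaning(next_area)
        else:
            robot.clean_step()                                     # step 202
```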



FIG. 3 is a flowchart 300 of another embodiment of a method for controlling a service robot when a person enters an area or a room. In this method, when the robot senses the proximity of a person, it stops working and actively moves to a position in the room or area that is least disturbing to the person. This may be away from the person or nearer the person, depending on other factors.


In step 302 the robot is cleaning in an area. In step 304 a determination is made of whether a person has entered the area. This is preferably an on-going sensing activity, rather than a discrete standalone step. If a person was not sensed, the process continues in step 302, where the robot continues cleaning. However, if in step 304 it was determined that a person entered the room or area, the process continues to step 306. In this step the robot stops cleaning and moves to another location within the area it was cleaning, e.g., a least disturbing location. A set of rules can be defined for choosing the least disturbing location, such as either within or outside a certain distance from the user, proximate to the entrance of the area through which the user entered, and so on. In step 308 the robot senses when the person leaves and then returns to step 302 and its cleaning operation.
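
The sketch below shows one possible "least disturbing location" rule set of the kind described for FIG. 3. The standoff distance, the candidate points, and the function names are all hypothetical assumptions used only for illustration.

```python
MIN_STANDOFF_METERS = 2.0   # hypothetical rule: keep at least this far from the person


def choose_standby_point(candidates, person_position, entrance_position, distance):
    """Prefer candidate points beyond a standoff distance from the person,
    breaking ties by proximity to the entrance the person came through."""
    far_enough = [p for p in candidates
                  if distance(p, person_position) >= MIN_STANDOFF_METERS]
    pool = far_enough or candidates   # fall back if no point satisfies the rule
    return min(pool, key=lambda p: distance(p, entrance_position))
```

Other rule sets, e.g., preferring a point nearer the person so the robot remains available for commands, could be substituted without changing the overall flow.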



FIG. 4 is a flowchart 400 of another embodiment of a method for controlling a service robot when a person enters an area or a room. When a person enters an area or room, the service robot transitions to a mode in which it looks to the person to provide commands, e.g., by pointing things out, giving verbal instructions, etc. This provides a seamless way to improve the decision-making process of the robot.


In step 402 the robot is cleaning in an area. In step 404 a determination is made of whether a person has entered the area. This is preferably an on-going sensing activity, rather than a discrete standalone step. If a person was not sensed, the process continues in step 402, where the robot is cleaning. However, if in step 404 it was determined that a person entered the room or area, the process continues to step 406, where the robot stops cleaning. In step 408 the robot awaits commands from the person for its next action, e.g., continue cleaning, leave area, enter sleep mode, etc. In step 410 the robot executes the commands.
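
A minimal sketch of the FIG. 4 flow is given below, assuming a hypothetical command_interface that resolves gestures or spoken instructions into simple command strings.

```python
def await_person_commands(robot, detector, command_interface):
    """Sketch of the FIG. 4 behavior: stop on detecting a person and
    execute whatever command the person issues (gesture, speech, etc.)."""
    if detector.person_present():                        # step 404
        robot.stop_cleaning()                            # step 406
        command = command_interface.wait_for_command()   # step 408
        if command == "continue":                        # step 410: execute it
            robot.resume_cleaning()
        elif command == "leave_area":
            robot.navigate_to_another_area()
        elif command == "sleep":
            robot.enter_sleep_mode()
```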



FIG. 5 is a flowchart 500 of another embodiment of a method for controlling a service robot when a person enters an area or a room. When a person enters an area or room, the service robot leaves that area to clean another area and returns later to the area it left to finish servicing it.


In step 502 the robot is cleaning in an area. In step 504 a determination is made of whether a person has entered the area. This is preferably an on-going sensing activity, rather than a discrete standalone step. If a person was not sensed, the process continues in step 502, where the robot is cleaning. However, if in step 504 it was determined that a person entered the room or area, the process continues to step 506, where the robot records its current location. The robot could record its location on an on-going basis so that it maintains a record of all areas cleaned, whether fully or partially. In step 508 the robot moves to another area to service (e.g., clean) that area. In step 510 the robot returns to the original area to finish cleaning.
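
The following is a hedged sketch of the FIG. 5 record-and-return flow; the method names (current_location, log_partial_coverage, next_area_needing_service, resume_service_from) are assumptions, not part of the disclosed system.

```python
def defer_and_return(robot, detector, area_map):
    """Sketch of the FIG. 5 behavior: record where work stopped, service
    another area, then come back and finish the original area."""
    if detector.person_present():                                 # step 504
        resume_point = robot.current_location()                   # step 506
        robot.log_partial_coverage(resume_point)                  # on-going coverage record
        other = area_map.next_area_needing_service(exclude=robot.current_area())
        robot.navigate_to(other)                                  # step 508
        robot.service(other)
        robot.navigate_to(resume_point)                           # step 510
        robot.resume_service_from(resume_point)
```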


In the various service robot methods, therefore, the robot can be configured to keep track of the work stoppage, so that it can restart work in that area once the person has left.


As will be appreciated by those skilled in the art, a service robot (e.g., vacuum) includes a memory for storing instructions and data, and a processor for executing the instructions. Thus, the methods discussed above can be programmed into the service robot for execution to accomplish the functions disclosed herein.


In another implementation of the method, the sensing of the person in the area of the robot may be performed by sensors installed in the environment and communicating directly or indirectly with the service robot. In this embodiment, the sensors may be serving other functions, such as temperature control or security sensing of the environment, and communicating with other control components of the environment. This implementation requires that the spatial relationships between the robot and the sensors be known. There are many possible implementations of this, including perception of the sensors by the robot, a priori knowledge of the sensors' locations in a “map,” or information provided by an environment-wide control system.
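
As an illustration only, the sketch below assumes the a priori "map" variant: a hypothetical lookup table relating environment-mounted sensor identifiers to the areas they observe, so that a detection event can be related spatially to the robot's current area.

```python
# Hypothetical registry mapping environment-mounted sensors (e.g., security
# or HVAC occupancy sensors) to the facility areas they observe.
SENSOR_TO_AREA = {
    "hall_motion_1": "hallway",
    "kitchen_cam": "kitchen",
}


def person_in_robot_area(robot, sensor_events):
    """A person is considered 'in proximity' if any triggered environment
    sensor observes the area the robot is currently servicing."""
    robot_area = robot.current_area()
    return any(SENSOR_TO_AREA.get(event.sensor_id) == robot_area
               for event in sensor_events if event.person_detected)
```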


Portions of the above description refer to the robot being configured to execute different behaviors in response to the presence of people. This configuration can be implemented as a “pre-programmed” behavior, or as a “set-up” menu providing a series of optional behaviors to be executed under certain triggering conditions, as is commonly available on consumer and industrial products.
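
One way such a set-up menu might be represented is sketched below: a user-selected entry in a configuration table dispatches to one of the behaviors outlined above. The option names and handler signatures are hypothetical.

```python
# Hypothetical "set-up menu": which behavior to run when a person is detected.
BEHAVIOR_OPTIONS = ("work_pause", "move_within_area", "leave_area",
                    "await_commands", "defer_and_return")

user_config = {
    "on_person_detected": "defer_and_return",   # chosen from BEHAVIOR_OPTIONS
}


def handle_person_detected(robot, detector, area_map, handlers):
    """Dispatch the user-configured behavior when the trigger fires."""
    handler = handlers[user_config["on_person_detected"]]
    handler(robot, detector, area_map)
```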


Also, while the service robot was indicated as a robotic cleaner in this embodiment, those skilled in the art will appreciate that methods in accordance with the present invention could be applied to any number of service robots, and could implement any number and type of sensors.



FIG. 6 shows an embodiment of a service robot 600 including an upward facing sensor 610. The sensor 610 can be used to determine whether a person has entered the area. The determination may also be based on other perceptual techniques, such as acoustics, motion detection, etc. In still other embodiments, the determination may be based on a combination thereof.



FIG. 7 shows an embodiment of a service robot 700 with multiple sensors 710 that allow the robot 700 to determine the presence of a person in the area.



FIG. 8 shows an embodiment of a service robot 800 interactively communicating with multiple sensors 810 installed in the environment 850 allowing the robot 800 to determine the presence of a person (not shown) in the environment 850.


While the foregoing has described what are considered to be the best mode and/or other preferred embodiments, it is understood that various modifications can be made therein and that the invention or inventions may be implemented in various forms and embodiments, and that they may be applied in numerous applications, only some of which have been described herein.

Claims
  • 1. A method of servicing a facility with a robot that detects a living being, the method comprising: navigating the robot to a first area within an environment using automatic self-control by the robot; automatically commencing a first service of a first room by the robot; sensing a living being in the first room by the robot while providing the first service; and performing at least one form of service pause of the first service in response to sensor data received from the sensing, the at least one form of service pause including the robot self-navigating to a second area, outside of the first room, to perform a second service of the second area and then returning to the first room to restart the first service of the first room.
  • 2. The method of claim 1, wherein the at least one form of service pause further includes stopping performance of the first service when the robot senses a proximity of a person.
  • 3. The method of claim 2, wherein the stopping of the performance of the first service further includes sensing a proximity of the person, determining a location that is least disturbing to the person, and actively moving to the location.
  • 4. The method of claim 3, wherein the location is away from the person.
  • 5. The method of claim 3, wherein the location is nearer the person.
  • 6. The method of claim 3, wherein the person is in the first room and the location is out of the first room.
  • 7. The method of claim 2, further comprising the robot recording in memory the service pause, including recording the robot's current location.
  • 8. The method of claim 7, further comprising restarting the first service in the first room once the sensor data indicates the person has left the first room.
  • 9. The method of claim 1, further comprising the robot responding to a gesture or verbal command from a person in a room in which the robot is present.
  • 10. The method of claim 1, wherein the sensing further includes using one or more of light sensing, acoustic sensing, visual sensing, motion detection, heat sensing, and electromagnetic sensing.
  • 11. The method of claim 1, wherein the sensing is performed by one or more sensors attached to the robot.
  • 12. The method of claim 1, wherein the sensing is performed by one or more sensors in the environment.
  • 13. The method of claim 1, wherein the robot is a robotic vacuum cleaner.
  • 14. A service robot configured to detect the presence of a living being, the service robot comprising: a platform supporting a servicing subsystem; the servicing subsystem coupled to a drive mechanism and configured to navigate the platform to a first room within an environment and to commence a first service of the first room; and one or more sensors configured to sense a person in the first room during the first service; wherein the servicing subsystem controls the service robot to perform at least one form of service pause in response to sensor data received from the one or more sensors, the at least one form of service pause including the service robot self-navigating to a second area, outside of the first room, to perform a second service of the second area and then returning to the first room to restart the first service of the first room.
  • 15. The system of claim 14, wherein the servicing subsystem is further configured to cause a stoppage in the first service, as a form of service pause, when the service robot senses a proximity of a person.
  • 16. The system of claim 15, wherein the servicing subsystem is further configured, as part of the stoppage in the first service, to determine a location that is least disturbing to the person, and automatically move the service robot to the location.
  • 17. The system of claim 16, wherein the location is away from the person.
  • 18. The system of claim 16, wherein the location is nearer the person.
  • 19. The system of claim 15, wherein the person is in the first room and the location is out of the first room.
  • 20. The system of claim 14, wherein the service robot is configured to record in a memory the service pause, including recordation of the service robot's current location.
  • 21. The system of claim 20, wherein the servicing subsystem is further configured to restart the first service in the first room once the sensor data indicates the person has left the first room.
  • 22. The system of claim 14, wherein the service robot is configured to respond to a gesture or verbal command from a person in a room in which the service robot is present.
  • 23. The system of claim 14, wherein the one or more sensors includes light sensors, and/or acoustic sensors, and/or visual sensing, and/or motion detection, and/or heat sensing, and/or electromagnetic sensors.
  • 24. The system of claim 14, wherein the service robot is a robotic vacuum cleaner.
  • 25. The system of claim 14, wherein one or more of the sensors are attached to the service robot.
  • 26. The system of claim 14, wherein one or more of the sensors are attached to the environment.
  • 27. A service robot configured to detect the presence of a living being, the service robot comprising: a platform supporting a servicing subsystem; the servicing subsystem coupled to a drive mechanism and configured to navigate the platform to a first room within an environment; and a communication system configured to communicate with one or more sensors installed in the environment that are configured to sense a person in the first room during performance of a first service of the first room; wherein the servicing subsystem controls the service robot to perform at least one form of service pause in response to sensor data received from the one or more sensors, the at least one form of service pause including the service robot self-navigating to a second area, outside of the first room, to perform a second service of the second area and then returning to the first room to restart the first service of the first room.
  • 28. The system of claim 27, wherein the servicing subsystem is further configured to cause a stoppage in the first service, as a form of service pause, when the service robot senses a proximity of a person.
  • 29. The system of claim 27, wherein the servicing subsystem is further configured, as part of the work stoppage in the first service, to determine a location that is least disturbing to the person, and automatically move the service robot to the location.
  • 30. The system of claim 29, wherein the location is away from the person.
  • 31. The system of claim 29, wherein the location is nearer the person.
  • 32. The system of claim 29, wherein the person is in the first room and the location is out of the first room.
  • 33. The system of claim 28, wherein the service robot is configured to record in a memory the service pause, including recordation of the service robot's current location.
  • 34. The system of claim 33, wherein the servicing subsystem is further configured to restart the first service in the first room once the sensor data indicates the person has left the first room.
  • 35. The system of claim 27, wherein the service robot is configured to respond to a gesture or verbal command from a person in a room in which the service robot is present.
  • 36. The system of claim 27, wherein the one or more sensors includes one or more light sensors, acoustic sensors, visual sensors, motion detection sensors, heat sensors, and electromagnetic sensors.
  • 37. The system of claim 27, wherein the service robot is a robotic vacuum cleaner.
CROSS-REFERENCE TO RELATED APPLICATION

This application claims the benefit of priority under 35 U.S.C. §119(e) from provisional application Ser. No. 61/024,019, filed on Jan. 28, 2008, which is incorporated herein by reference in its entirety.

US Referenced Citations (104)
Number Name Date Kind
4674048 Okumura Jun 1987 A
5032775 Mizuno et al. Jul 1991 A
5086535 Grossmeyer et al. Feb 1992 A
5369347 Yoo Nov 1994 A
5440216 Kim Aug 1995 A
D364840 Oshizawa et al. Dec 1995 S
5534762 Kim Jul 1996 A
5682313 Edlund et al. Oct 1997 A
5684695 Bauer Nov 1997 A
D395285 Allon Jun 1998 S
5867800 Leif Feb 1999 A
6076025 Ueno et al. Jun 2000 A
6076223 Dair et al. Jun 2000 A
6076230 Harsh Jun 2000 A
6119057 Kawagoe Sep 2000 A
6124694 Bancroft et al. Sep 2000 A
6278904 Ishii Aug 2001 B1
6339735 Peless et al. Jan 2002 B1
6374155 Wallach et al. Apr 2002 B1
6389329 Colens May 2002 B1
6459955 Bartsch et al. Oct 2002 B1
6496754 Song et al. Dec 2002 B2
6496755 Wallach et al. Dec 2002 B2
6532404 Colens Mar 2003 B2
6539284 Nourbakhsh et al. Mar 2003 B2
6604022 Parker et al. Aug 2003 B2
6611120 Song et al. Aug 2003 B2
6667592 Jacobs et al. Dec 2003 B2
6668157 Takeda et al. Dec 2003 B1
6728608 Ollis et al. Apr 2004 B2
6732826 Song et al. May 2004 B2
6760647 Nourbakhsh et al. Jul 2004 B2
6774596 Bisset Aug 2004 B1
6841963 Song et al. Jan 2005 B2
6868307 Song et al. Mar 2005 B2
6870792 Chiappetta Mar 2005 B2
6879878 Glenn et al. Apr 2005 B2
6883201 Jones et al. Apr 2005 B2
6925679 Wallach et al. Aug 2005 B2
6957712 Song et al. Oct 2005 B2
6968592 Takeuchi et al. Nov 2005 B2
6984952 Peless et al. Jan 2006 B2
7024278 Chiappetta et al. Apr 2006 B2
7155308 Jones Dec 2006 B2
7162056 Burl et al. Jan 2007 B2
7167775 Abramson et al. Jan 2007 B2
7188000 Chiappetta et al. Mar 2007 B2
7206677 Hulden Apr 2007 B2
D541798 Ichida et al. May 2007 S
7251548 Herz et al. Jul 2007 B2
7446766 Moravec Nov 2008 B2
7447593 Estkowski et al. Nov 2008 B2
7507948 Park et al. Mar 2009 B2
7526362 Kim et al. Apr 2009 B2
D602931 Kaner et al. Oct 2009 S
D613341 Mar et al. Apr 2010 S
7720572 Ziegler et al. May 2010 B2
7805220 Taylor et al. Sep 2010 B2
7835821 Roh et al. Nov 2010 B2
D697198 Amirouche et al. Jan 2014 S
20010047231 Peless et al. Nov 2001 A1
20020016649 Jones Feb 2002 A1
20020095239 Wallach et al. Jul 2002 A1
20020120364 Colens Aug 2002 A1
20020138936 Takeuchi et al. Oct 2002 A1
20020153184 Song et al. Oct 2002 A1
20030025472 Jones et al. Feb 2003 A1
20030028286 Glenn et al. Feb 2003 A1
20030212472 McKee Nov 2003 A1
20040030571 Solomon Feb 2004 A1
20040073337 McKee et al. Apr 2004 A1
20040076324 Burl et al. Apr 2004 A1
20040083570 Song et al. May 2004 A1
20040167716 Goncalves et al. Aug 2004 A1
20040168148 Goncalves et al. Aug 2004 A1
20040204792 Taylor et al. Oct 2004 A1
20040207355 Jones et al. Oct 2004 A1
20050000543 Taylor et al. Jan 2005 A1
20050067994 Jones et al. Mar 2005 A1
20050080514 Omote et al. Apr 2005 A1
20050134209 Kim Jun 2005 A1
20050216126 Koselka et al. Sep 2005 A1
20050273226 Tani Dec 2005 A1
20050273967 Taylor et al. Dec 2005 A1
20050287038 Dubrovsky et al. Dec 2005 A1
20050288079 Tani Dec 2005 A1
20060020369 Taylor et al. Jan 2006 A1
20060038521 Jones et al. Feb 2006 A1
20060060216 Woo Mar 2006 A1
20060061476 Patil et al. Mar 2006 A1
20060095158 Lee et al. May 2006 A1
20060178777 Park et al. Aug 2006 A1
20060293788 Pogodin Dec 2006 A1
20070042716 Goodall et al. Feb 2007 A1
20070135962 Kawabe et al. Jun 2007 A1
20070192910 Vu et al. Aug 2007 A1
20070199108 Angle et al. Aug 2007 A1
20070244610 Ozick et al. Oct 2007 A1
20070267570 Park et al. Nov 2007 A1
20080004904 Tran Jan 2008 A1
20080056933 Moore et al. Mar 2008 A1
20080086236 Saito et al. Apr 2008 A1
20080109114 Orita et al. May 2008 A1
20080184518 Taylor et al. Aug 2008 A1
Foreign Referenced Citations (49)
Number Date Country
63-222726 Sep 1988 JP
5-143158 Jun 1993 JP
6-314124 Nov 1994 JP
07101500 Apr 1995 JP
9-90026 Apr 1997 JP
11-104984 Apr 1999 JP
2000-339028 Dec 2000 JP
2000-342498 Dec 2000 JP
2001-67124 Mar 2001 JP
2001-246580 Sep 2001 JP
2001-300874 Oct 2001 JP
2002-85305 Mar 2002 JP
2002-254374 Sep 2002 JP
2002-325708 Nov 2002 JP
2002-351305 Dec 2002 JP
2003006532 Jan 2003 JP
2003-515210 Apr 2003 JP
2003-515801 May 2003 JP
2003-180587 Jul 2003 JP
2003-225184 Aug 2003 JP
2003241833 Aug 2003 JP
2003-256043 Sep 2003 JP
2004-33340 Feb 2004 JP
2004-97439 Apr 2004 JP
2004-097439 Apr 2004 JP
2004-148090 May 2004 JP
2004148089 May 2004 JP
2005-111603 Apr 2005 JP
2005-124753 May 2005 JP
2005-205028 Aug 2005 JP
2005-219161 Aug 2005 JP
2006-007368 Jan 2006 JP
2006-087918 Apr 2006 JP
2006-102861 Apr 2006 JP
2006-218005 Aug 2006 JP
2006-252273 Sep 2006 JP
2006-331054 Dec 2006 JP
2007-4527 Jan 2007 JP
2007-309921 Nov 2007 JP
2008-3979 Jan 2008 JP
10-2002-0076153 Oct 2002 KR
10-2002-0081035 Oct 2002 KR
10-2002-0088880 Nov 2002 KR
10-0645818 Nov 2006 KR
0137060 May 2001 WO
0138945 May 2001 WO
2007051972 May 2007 WO
Non-Patent Literature Citations (17)
Entry
Bennewitz et al., Adapting Navigation Strategies Using Motions Patterns of People, 2003, Proceedings of the 2003 IEEE International Conference on Robotics & Automation, pp. 2000-2005.
Alami et al., Diligent: Towards a Human-Friendly Navigation System, 2000, Proceedings of the 2000 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 21-26.
Lee et al., An Agent for Intelligent Spaces: Functions and Roles of Mobile Robots in Sensored, Networked and Thinking Spaces, 1997, IEEE Conference on Intelligent Transportation System (ITSC '97), pp. 983-988.
Extended European Search Report dated Feb. 22, 2011 issued in corresponding European Application No. EP09706350.
Extended European Search Report dated Mar. 7, 2011 issued in corresponding European Application No. EP09705670.
Extended European Search Report dated Mar. 8, 2011 issued in corresponding European Application No. EP09706723.
International Search Report dated Aug. 31, 2009 issued in corresponding International Application No. PCT/US2009/032274.
International Search Report dated Sep. 14, 2009 issued in corresponding International Application No. PCT/US2009/032243.
International Search Report dated Sep. 14, 2009 issued in corresponding International Application No. PCT/US2009/032245.
International Search Report dated Sep. 30, 2009 issued in corresponding International Application No. PCT/US2009/034081.
Extended European Search Report dated Sep. 12, 2013 issued in corresponding European Application No. 09710577.9.
Office Action dated Jan. 22, 2013 issued in corresponding Japanese Application No. 2010-545106.
Office Action dated Apr. 2, 2013 issued in related Japanese Application No. 2010-546923.
Office Action dated Feb. 19, 2013 issued in related Japanese Application No. 2010-545107.
Office Action in U.S. Appl. No. 29/471,328, dated Jun. 4, 2014.
Office Action in Chinese Patent Application No. 200980108309.X, dated Jun. 4, 2014.
Flexible & integrated unmanned command & control [online]. Howard, Courtney, 2013 [retrieved on May 24, 2014]. Retrieved from the Internet: <URL: http://www.militaryaerospace.com/articles/print/volume-24/issue-11/special-report/ flexible-integrated-unmanned-command-control.html>.
Related Publications (1)
Number Date Country
20090198380 A1 Aug 2009 US
Provisional Applications (1)
Number Date Country
61024019 Jan 2008 US