Methods for repurposing temporal-spatial information collected by service robots

Information

  • Patent Grant
  • Patent Number
    8,433,442
  • Date Filed
    Wednesday, January 28, 2009
  • Date Issued
    Tuesday, April 30, 2013
Abstract
Robots, and methods implemented therein, provide for the active repurposing of temporal-spatial information. A robot can be configured to analyze the information to improve the effectiveness and efficiency of the primary service function that originally generated the information. A robot can be configured to use the information to create a three dimensional (3D) model of the facility, which can be used for a number of functions, such as creating virtual tours of the environment or porting the environment into video games. A robot can be configured to use the information to recognize and classify objects in the facility, so that the resulting catalog can be used to locate selected objects later, or to provide a global catalog of all items, such as is needed for insurance documentation of facility effects.
Description
FIELD OF INTEREST

The present inventive concepts relate to methods for repurposing temporal-spatial information collected by service robots.


BACKGROUND

Service robots have traditionally been tasked with doing the dull and dirty jobs in human facilities, such as homes, commercial and industrial buildings. However, that very action of performing the service creates a large fund of temporal-spatial information and knowledge about the facility in which the work is performed.


Previous service robots have ignored this large fund of information and knowledge as an asset, resulting in its waste.


SUMMARY OF INVENTION

According to one aspect of the invention, provided is a method of servicing a facility with at least one service robot that obtains temporal-spatial information. The method includes: navigating a robot through an environment using automatic self-control by the robot; sensing the temporal-spatial information while performing a primary service function; and storing the temporal-spatial information.


The method can further include communicating the temporal-spatial information via a wireless network to a control system.


The method can further include at least one other service robot accessing the temporal-spatial information from the control system via the wireless network.


The method can further include at least one other autonomous control system accessing the temporal-spatial information from the control system.


The method can further include the service robot directly communicating the temporal-spatial information via a wireless network to at least one other service robot.


The method can further include the service robot performing a secondary service using the temporal-spatial information.


The method can further include the service robot analyzing the temporal-spatial information to improve the effectiveness and efficiency of the primary service that originally generated the temporal-spatial information.


The method can further include creating a three dimensional (3D) model of the environment using the temporal-spatial information.


The method can further include creating virtual tours from the 3D model.


The method can further include porting the 3D model of the environment to another system.


The method can further include recognizing and classifying objects in the environment using the temporal-spatial information.


The method can further include generating a catalog of objects and using the catalog to subsequently locate selected objects in the environment.


The at least one service robot can be a robotic vacuum cleaner.


In accordance with another aspect of the invention, provided is a service robot system. The service robot includes: a platform supporting a servicing subsystem; a navigation controller coupled to a drive mechanism and configured to navigate the platform through an environment; one or more sensors configured to collect temporal-spatial information while performing a primary service; and a storage media on which the temporal-spatial information is stored.


The system can further include a communication module configured to communicate the temporal-spatial information via a wireless network to a control system.


At least one other service robot can be configured to access the temporal-spatial information from the control system via the wireless network.


The system can further include a communication module configured to directly communicate the temporal-spatial information via a wireless network to at least one other service robot.


The service robot can be configured to perform a secondary service using the temporal-spatial information.


The service robot can be further configured to analyze the temporal-spatial information to improve the effectiveness and efficiency of the primary service that originally generated the temporal-spatial information.


The service robot can be further configured to create a three dimensional (3D) model of the environment using the temporal-spatial information.


The service robot can be further configured to recognize and classify objects in the environment using the temporal-spatial information.


The service robot can be further configured to generate a catalog of objects and to use the catalog to subsequently locate selected objects in the environment.


The service robot can be a robotic vacuum cleaner.


The system can further include at least one other autonomous control system configured to access the temporal-spatial information from the control system.





BRIEF DESCRIPTION OF THE DRAWINGS

The present invention will become more apparent in view of the attached drawings and accompanying detailed description. The embodiments depicted therein are provided by way of example, not by way of limitation, wherein like reference numerals refer to the same or similar elements. The drawings are not necessarily to scale, emphasis instead being placed upon illustrating aspects of the invention. In the drawings:



FIG. 1 is a flowchart illustrating an embodiment of a method for controlling a service robot using temporal-spatial information.



FIG. 2 shows an embodiment of a service robot including an upward facing sensor.



FIG. 3 shows an embodiment of a service robot including multiple sensors.



FIG. 4 shows an embodiment of a system for repurposing temporal-spatial information collected by an automatically navigating robot.





DETAILED DESCRIPTION OF PREFERRED EMBODIMENT

Hereinafter, aspects of the present invention will be described by explaining illustrative embodiments in accordance therewith, with reference to the attached drawings. While describing these embodiments, detailed descriptions of well-known items, functions, or configurations are typically omitted for conciseness.


It will be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are used to distinguish one element from another, but not to imply a required sequence of elements. For example, a first element can be termed a second element, and, similarly, a second element can be termed a first element, without departing from the scope of the present invention. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items.


It will be understood that when an element is referred to as being “on” or “connected” or “coupled” to another element, it can be directly on or connected or coupled to the other element or intervening elements can be present. In contrast, when an element is referred to as being “directly on” or “directly connected” or “directly coupled” to another element, there are no intervening elements present. Other words used to describe the relationship between elements should be interpreted in a like fashion (e.g., “between” versus “directly between,” “adjacent” versus “directly adjacent,” etc.).


The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms “a,” “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises,” “comprising,” “includes” and/or “including,” when used herein, specify the presence of stated features, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, elements, components, and/or groups thereof.


Spatially relative terms, such as “beneath,” “below,” “lower,” “above,” “upper” and the like may be used to describe an element and/or feature's relationship to another element(s) and/or feature(s) as, for example, illustrated in the figures. It will be understood that the spatially relative terms are intended to encompass different orientations of the device in use and/or operation in addition to the orientation depicted in the figures. For example, if the device in the figures is turned over, elements described as “below” and/or “beneath” other elements or features would then be oriented “above” the other elements or features. The device may be otherwise oriented (e.g., rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly.



FIG. 1 is a flowchart 100 of an embodiment of a method for controlling a service robot using temporal-spatial information. In this embodiment the service robot is a cleaning robot, such as a robotic vacuum.


In step 102, the robot performs the service functions, such as vacuuming. In step 104, the robot collects temporal-spatial information while performing the service of step 102. In step 106, the temporal-spatial information is made available for other uses by other applications and/or robots, and/or by the robot that collected the information. For example, in some implementations robots can work in teams and exchange such information via wireless networks: one robot could scout locations requiring service, note whether they are occupied, and communicate the need and availability to one or more other robots.
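
As a minimal illustrative sketch of the flow in FIG. 1 (not part of the patent disclosure; the robot interface, record structure, and field names are assumed for illustration), a control loop over steps 102-106 might look like the following:

```python
import time
from dataclasses import dataclass

@dataclass
class TemporalSpatialRecord:
    timestamp: float   # when the observation was made
    x: float           # robot position in facility coordinates
    y: float
    payload: dict      # sensor readings taken at this pose

records = []  # local store of collected records

def service_step(robot):
    """One pass through steps 102-106 of FIG. 1 (robot API is hypothetical)."""
    robot.vacuum()                       # step 102: perform the primary service
    x, y = robot.localize()              # step 104: sense position while working
    rec = TemporalSpatialRecord(time.time(), x, y,
                                {"occupied": robot.motion_detected()})
    records.append(rec)                  # store the temporal-spatial information
    robot.publish(rec)                   # step 106: make it available to others
```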


The robots can include sensors to collect such temporal-spatial information, such as those generally known in the art. For example, such sensors could include acoustic sensors, motion detectors, cameras, or other sensor types.


As will be appreciated by those skilled in the art, a service robot (e.g., vacuum) includes a memory for storing instructions and data, and a processor for executing the instructions. Thus, the methods discussed above can be programmed into the service robot for execution to accomplish the functions disclosed herein.


Also, while the service robot was described as a robotic cleaner in this embodiment, those skilled in the art will appreciate that methods in accordance with the present invention could be applied to any number of service robots and could employ any number and type of sensors.


In another embodiment, the service robot would communicate the collected temporal-spatial data to a server or other computer, which would include a processor for manipulating the data and a storage system for storing the data. This server could be used to communicate the repurposed information back to the original robot, or to other robots or servers that can make use of it.
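
One plausible shape for this server-side exchange is sketched below, assuming a simple in-memory store; the patent does not prescribe an interface or wire format, and all names here are illustrative:

```python
class TemporalSpatialStore:
    """Hypothetical server-side store for records uploaded by service robots."""

    def __init__(self):
        self.raw = []        # records as uploaded by robots
        self.products = {}   # repurposed outputs, keyed by product name

    def upload(self, record):
        self.raw.append(record)

    def repurpose(self, name, transform):
        # Run an application-supplied transform over the raw data and cache
        # the result for the original robot, other robots, or other servers.
        self.products[name] = transform(self.raw)
        return self.products[name]

    def fetch(self, name):
        return self.products.get(name)
```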


As an example, the following describes a potential application of this method and illustrates its operation. A service robot is used to clean the floors in an industrial facility. It is programmed to follow a path that carries it throughout the facility. A temperature sensor is mounted to the robot, and as the robot performs its primary function, it also records the temperature of the environment at intervals along the programmed path. The temperature-location data is transmitted to the facility heating system, which maps the position data from the robot to its control zones. The heating system can thus use the recorded temperatures to adjust the heating output of the different zones.
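
A sketch of the heating-system side of this example might look as follows; the zone boundaries, setpoint, and function names are invented for illustration and are not taken from the patent:

```python
from collections import defaultdict

# Hypothetical zone map: (x_min, x_max, y_min, y_max) -> heating control zone.
ZONES = {(0, 10, 0, 10): "zone-A", (10, 20, 0, 10): "zone-B"}
SETPOINT_C = 20.0

def zone_of(x, y):
    for (x0, x1, y0, y1), zone in ZONES.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return zone
    return None  # reading taken outside any mapped zone

def heating_adjustments(readings):
    """readings: iterable of (x, y, temperature_celsius) tuples from the robot.

    Returns, per zone, how far the average reading falls below the setpoint;
    positive values call for more heating output, negative for less.
    """
    by_zone = defaultdict(list)
    for x, y, temp in readings:
        zone = zone_of(x, y)
        if zone is not None:
            by_zone[zone].append(temp)
    return {zone: SETPOINT_C - sum(ts) / len(ts) for zone, ts in by_zone.items()}
```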



FIG. 2 shows an embodiment of a service robot 200 including an upward facing sensor 210. The sensor 210 can be used to determine characteristics of the environment in the area. The determination may be based on imagery from the sensor 210, or on other perceptual techniques, such as acoustics or motion detection. In still other embodiments, the determination may be based on a combination of such techniques.
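
One simple way such cues could be combined is a weighted vote across modalities; the weights and threshold below are assumptions for illustration, not values from the patent:

```python
def area_occupied(camera_score, acoustic_score, motion_score,
                  weights=(0.5, 0.2, 0.3), threshold=0.5):
    """Each score is a per-sensor confidence in [0, 1] that the area is occupied."""
    fused = sum(w * s for w, s in zip(weights,
                (camera_score, acoustic_score, motion_score)))
    return fused >= threshold
```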



FIG. 3 shows an embodiment of a service robot 300 with multiple sensors 310 that allow the robot 300 to determine the characteristics of the environment in the area.



FIG. 4 shows an embodiment of a system for repurposing temporal-spatial information from a robot. Component 400 is the service robot, and component 410 is the sensor mounted to the robot. Component 420 is the localization system of the robot, which provides the location, or spatial, information. Component 430 is the robot executive controller, which manages the autonomous operation of the robot. Component 440 is a communication means that transmits the temporal-spatial information to the data storage system 450. The data storage system 450 stores the temporal-spatial information, along with any meta-data required for its interpretation. Component 460 is a transformation system that adapts the information to a form usable by a requesting application.
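
Component 460 could, for instance, be realized as a registry of named transforms, so that each requesting application receives the stored records in the form it expects. The sketch below assumes the hypothetical store and heating example from earlier; all names are illustrative:

```python
class TransformationSystem:
    """Sketch of component 460: adapts stored records for requesting applications."""

    def __init__(self, storage):
        self.storage = storage   # e.g., the data storage system 450
        self.transforms = {}     # application name -> transform function

    def register(self, app_name, fn):
        self.transforms[app_name] = fn

    def serve(self, app_name):
        # Hand back the application-specific view of the raw records.
        return self.transforms[app_name](self.storage.raw)

# Example usage with the hypothetical store sketched earlier:
# ts = TransformationSystem(store)
# ts.register("heating", lambda recs: [(r.x, r.y, r.payload["temp_c"])
#                                      for r in recs if "temp_c" in r.payload])
# zone_deltas = heating_adjustments(ts.serve("heating"))
```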


While reference has been made to using wireless networks, in various embodiments any of a variety of data communication schemes can be used, such as wired, (batch) wireless, networked, point-to-point, and so on.


While the foregoing has described what are considered to be the best mode and/or other preferred embodiments, it is understood that various modifications can be made therein and that the invention or inventions may be implemented in various forms and embodiments, and that they may be applied in numerous applications, only some of which have been described herein.

Claims
  • 1. A method of servicing a facility, the method comprising: an autonomous first robot self-navigating through an environment while performing a first service function, wherein the first service function is a cleaning or an environmental control function; the first robot sensing and storing temporal-spatial information while performing the first service function; the first robot communicating the temporal-spatial information to an autonomous second robot or to a control system; and repurposing the temporal-spatial information by the second robot or the control system using at least some of the temporal-spatial information for a second service function, which is different from the first service function.
  • 2. The method of claim 1, further comprising communicating the temporal-spatial information via a wireless network to the control system.
  • 3. The method of claim 2, further comprising the second robot accessing the temporal-spatial information from the control system via the wireless network.
  • 4. The method of claim 2, further comprising at least one other autonomous control system accessing the temporal-spatial information from the control system.
  • 5. The method of claim 1, further comprising the first robot directly communicating the temporal-spatial information via a wireless network to the second robot.
  • 6. The method of claim 1, wherein the second service function comprises at least one of cleaning or climate control.
  • 7. The method of claim 1, further comprising the first robot analyzing the temporal-spatial information to improve the effectiveness and efficiency of the first service function.
  • 8. The method of claim 1, further comprising creating a three dimensional (3D) model of the environment using the temporal-spatial information.
  • 9. The method of claim 8, further comprising creating virtual tours from the 3D model.
  • 10. The method of claim 8, further comprising porting the 3D model of the environment to a second system.
  • 11. The method of claim 1, further comprising recognizing and classifying objects in the environment using the temporal-spatial information.
  • 12. The method of claim 11, further comprising generating a catalog of objects and using the catalog to subsequently locate selected objects in the environment.
  • 13. The method of claim 1, wherein the first robot is a robotic vacuum cleaner.
  • 14. A service robot system, comprising: a first robot comprising: a platform supporting a first servicing subsystem; a navigation controller coupled to a drive mechanism and configured to autonomously self-navigate the platform through an environment; one or more sensors configured to collect temporal-spatial information while performing a first service function, wherein the first service function is a cleaning function; a storage media on which the temporal-spatial information is stored; and a communication module configured to communicate the temporal-spatial information to an autonomous second robot or to a control system; wherein the second robot or control system is configured to repurpose at least some of the stored temporal-spatial information for a second service function, which is different from the first service function.
  • 15. The system of claim 14, wherein the communication module is further configured to communicate the temporal-spatial information via a wireless network to the control system.
  • 16. The system of claim 15, wherein the second robot is configured to access the temporal-spatial information from the control system via the wireless network.
  • 17. The system of claim 14, wherein the communication module is configured to directly communicate the temporal-spatial information via a wireless network to the second robot.
  • 18. The system of claim 14, wherein the second service function comprises at least one of cleaning or climate control.
  • 19. The system of claim 14, wherein the first robot is further configured to analyze the temporal-spatial information to improve the effectiveness and efficiency of the first service function.
  • 20. The system of claim 14, wherein the first robot is further configured to create a three dimensional (3D) model of the environment using the temporal-spatial information.
  • 21. The system of claim 14, wherein the first robot is further configured to recognize and classify objects in the environment using the temporal-spatial information.
  • 22. The system of claim 21, wherein the first robot is further configured to generate a catalog of objects and to use the catalog to subsequently locate selected objects in the environment.
  • 23. The system of claim 14, wherein the first robot is a robotic vacuum cleaner.
  • 24. The system of claim 14, further comprising at least one other autonomous control system configured to access the temporal-spatial information from the control system.
CROSS-REFERENCE TO RELATED APPLICATION

This application claims the benefit of priority under 35 U.S.C. § 119(e) from provisional application Ser. No. 61/024,028, entitled “METHODS FOR REPURPOSING TEMPORAL-SPATIAL INFORMATION COLLECTED BY SERVICE ROBOTS,” filed on Jan. 28, 2008, which is incorporated herein by reference in its entirety.

US Referenced Citations (82)
Number Name Date Kind
4674048 Okumura Jun 1987 A
5032775 Mizuno et al. Jul 1991 A
5086535 Grossmeyer et al. Feb 1992 A
5369347 Yoo Nov 1994 A
5440216 Kim Aug 1995 A
5534762 Kim Jul 1996 A
5682313 Edlund et al. Oct 1997 A
5684695 Bauer Nov 1997 A
5867800 Leif Feb 1999 A
6076025 Ueno et al. Jun 2000 A
6124694 Bancroft et al. Sep 2000 A
6278904 Ishii Aug 2001 B1
6374155 Wallach et al. Apr 2002 B1
6389329 Colens May 2002 B1
6459955 Bartsch et al. Oct 2002 B1
6496754 Song et al. Dec 2002 B2
6496755 Wallach et al. Dec 2002 B2
6532404 Colens Mar 2003 B2
6539284 Nourbakhsh et al. Mar 2003 B2
6604022 Parker et al. Aug 2003 B2
6611120 Song et al. Aug 2003 B2
6667592 Jacobs et al. Dec 2003 B2
6728608 Ollis et al. Apr 2004 B2
6732826 Song et al. May 2004 B2
6760647 Nourbakhsh et al. Jul 2004 B2
6774596 Bisset Aug 2004 B1
6841963 Song et al. Jan 2005 B2
6868307 Song et al. Mar 2005 B2
6870792 Chiappetta Mar 2005 B2
6879878 Glenn et al. Apr 2005 B2
6883201 Jones et al. Apr 2005 B2
6925679 Wallach et al. Aug 2005 B2
6957712 Song et al. Oct 2005 B2
6968592 Takeuchi et al. Nov 2005 B2
6984952 Peless et al. Jan 2006 B2
7024278 Chiappetta et al. Apr 2006 B2
7155308 Jones Dec 2006 B2
7162056 Burl et al. Jan 2007 B2
7167775 Abramson et al. Jan 2007 B2
7188000 Chiappetta et al. Mar 2007 B2
7206677 Hulden Apr 2007 B2
7251548 Herz et al. Jul 2007 B2
7446766 Moravec Nov 2008 B2
7447593 Estkowski et al. Nov 2008 B2
7526362 Kim et al. Apr 2009 B2
7720572 Ziegler et al. May 2010 B2
7805220 Taylor et al. Sep 2010 B2
7835821 Roh et al. Nov 2010 B2
20020138936 Takeuchi et al. Oct 2002 A1
20030025472 Jones et al. Feb 2003 A1
20040030571 Solomon Feb 2004 A1
20040073337 McKee et al. Apr 2004 A1
20040076324 Burl et al. Apr 2004 A1
20040167716 Goncalves et al. Aug 2004 A1
20040168148 Goncalves et al. Aug 2004 A1
20040204792 Taylor et al. Oct 2004 A1
20040207355 Jones et al. Oct 2004 A1
20050000543 Taylor et al. Jan 2005 A1
20050067994 Jones et al. Mar 2005 A1
20050080514 Omote et al. Apr 2005 A1
20050134209 Kim Jun 2005 A1
20050216126 Koselka et al. Sep 2005 A1
20050273226 Tani Dec 2005 A1
20050273967 Taylor et al. Dec 2005 A1
20050287038 Dubrovsky et al. Dec 2005 A1
20050288079 Tani Dec 2005 A1
20060020369 Taylor et al. Jan 2006 A1
20060038521 Jones et al. Feb 2006 A1
20060060216 Woo Mar 2006 A1
20060061476 Patil et al. Mar 2006 A1
20060095158 Lee et al. May 2006 A1
20060178777 Park et al. Aug 2006 A1
20060293788 Pogodin Dec 2006 A1
20070042716 Goodall et al. Feb 2007 A1
20070135962 Kawabe et al. Jun 2007 A1
20070192910 Vu et al. Aug 2007 A1
20070199108 Angle et al. Aug 2007 A1
20070244610 Ozick et al. Oct 2007 A1
20080004904 Tran Jan 2008 A1
20080086236 Saito et al. Apr 2008 A1
20080109114 Orita et al. May 2008 A1
20080184518 Taylor et al. Aug 2008 A1
Foreign Referenced Citations (16)
Number Date Country
11-104984 Apr 1999 JP
2002-254374 Sep 2002 JP
2003-515210 Apr 2003 JP
2004-097439 Apr 2004 JP
2005-111603 Apr 2005 JP
2006-007368 Jan 2006 JP
2006-087918 Apr 2006 JP
2006-102861 Apr 2006 JP
10-2002-0076153 Oct 2002 KR
10-2002-0081035 Oct 2002 KR
10-2002-0088880 Nov 2002 KR
10-0645818 Nov 2006 KR
0137060 May 2001 WO
2007051972 May 2007 WO
Non-Patent Literature Citations (10)
Entry
International Search Report dated Aug. 31, 2009 issued in corresponding International Application No. PCT/US2009/032274.
International Search Report dated Sep. 14, 2009 issued in corresponding International Application No. PCT/US2009/032243.
International Search Report dated Sep. 14, 2009 issued in corresponding International Application No. PCT/US2009/032245.
International Search Report dated Sep. 30, 2009 issued in corresponding International Application No. PCT/US2009/034081.
Bennewitz, et al., “Adapting Navigation Strategies Using Motions Patterns of People”, 2003, Proceedings of the 2003 IEEE International Conference on Robotics & Automation, pp. 2000-2005.
Alami, et al., “Diligent: Towards a Human-Friendly Navigation System”, 2000, Proceedings of the 2000 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 21-26.
Lee, et al., “An Agent for Intelligent Spaces: Functions and Roles of Mobile Robots in Sensored, Networked and Thinking Spaces”, 1997, IEEE Conference on Intelligent Transportation System (ITSC '97), pp. 983-988.
Extended European Search Report dated Feb. 22, 2011 issued in corresponding European Application No. EP09706350.
Extended European Search Report dated Mar. 7, 2011 issued in corresponding European Application No. EP09705670.
Extended European Search Report dated Mar. 8, 2011 issued in corresponding European Application No. EP09706723.
Related Publications (1)
Number Date Country
20090198381 A1 Aug 2009 US
Provisional Applications (1)
Number Date Country
61024028 Jan 2008 US