Remote control scheduler and method for autonomous robotic device

Abstract
A method of scheduling a robotic device enables the device to run autonomously based on previously loaded scheduling information. The method employs a communication device, such as a hand-held remote, that can directly control the robotic device or load scheduling information into it, such that the robotic device will carry out a defined task at the desired time without the need for further external control. The communication device can also be configured to load a scheduling application program into an existing robotic device, such that the robotic device can receive and implement scheduling information from a user.
Description
FIELD OF THE INVENTION

The present invention relates generally to the field of robotics including the control of an autonomous robotic device and, more particularly, to a remote control device and associated method for inputting schedule information via IR signals to an autonomous robotic device, such as a cleaning robot.


BACKGROUND OF THE INVENTION

Robotic cleaning devices can be used to clean a defined area based on a program stored in the robot's processor. The purpose of these devices is to clean a room efficiently without the need for a user to physically control the cleaning device, or even be in the room while the floor is being cleaned. This can reduce the time spent on household chores, reduce noise disturbance by enabling a room to be cleaned without the need for a user to be present, or enable an elderly or disabled person to clean a room, a task which would otherwise be difficult or impossible without aid.


A number of methods for achieving this aim are currently in use. For example, robotic cleaning devices are available that allow the movement of the robot to be controlled directly by a remote communication device, either following a path defined by commands from the remote device or following a cleaning path based on a program stored in the robot. These devices, however, require a user to be present in order to control the motion of the robot or to directly initiate a stored cleaning mission.


Devices are also available that allow a robotic cleaner to be controlled remotely from a separate electronic device, such as a PC with a wireless communication attachment. Such a cleaner can be driven by a scheduling and control application program running on the computer, allowing it to operate without the need for a user to be present. However, these devices require a separate PC to be operational and in communication with the robotic device before it can carry out a task and complete a scheduled assignment.


Robotic cleaners are also available that allow a user to input scheduling and control information directly into the robotic device using buttons located on the device itself. Such a device can work autonomously once a schedule has been physically entered. However, it does not allow scheduling information to be communicated to it from a remote controller. As a result, the device does not alleviate the need to physically approach it, bend down, and input the scheduling information manually. This could limit the ability of a person of limited mobility to use the device easily.


None of the current robotic cleaners allow all the desired functions of a robotic cleaning robot to be enabled from a single remote device, without the need for further external control. The devices currently available require either an external source to control the scheduling function, or direct physical input of the scheduling information through user inputs on the robotic device itself. Allowing a robotic cleaner to run a scheduling application autonomously without external input, to receive updated scheduling and other user applications or information from a portable remote device without direct physical contact, and also to be directly controlled remotely from the same portable remote device, if and when required, would greatly increase the utility of the robotic cleaner and broaden its range of applications for a user.


From the foregoing, there is a need for a method and apparatus to allow a robotic cleaning device to operate autonomously to a remotely communicated user defined schedule, without the need for a user to be present or for a further control input from an external source. It is also desirable to provide a single portable apparatus that can load the configuration applications into the robotic device, select and communicate scheduling information to the robotic device, and control a function of a robotic device based on a direct user command, to improve the utility, efficiency and usability of a robotic cleaning device.


SUMMARY OF THE INVENTION

The invention provides a method and apparatus for configuring a robotic device to operate according to a user-defined schedule. Once the robotic device is configured, the method and apparatus allow a user to input scheduling information into the robotic device using a remote communication device, after which the robotic device is capable of operating without any further input from the user or the remote device. The communication device can also be used to directly control a function of the robotic device, or to receive information from the robotic device. One or more implementations of the invention may provide one or more of the following features.


In one embodiment of the invention, a configuration tool can be used to configure a robotic device. This method includes the steps of linking the configuration tool to the robotic device, authenticating the configuration tool, and loading, via the configuration tool, information into the robotic device after successful authentication of the configuration tool. The information loaded into the robotic device can include a scheduling application program to enable a scheduling capability of the device. The loading step also allows the configuration tool to retrofit, reprogram, and upgrade the scheduling capability of the robotic device at any time.
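The link/authenticate/load sequence described above can be sketched in a few lines. This is an illustrative model only; the class and method names (`ConfigurationTool`, `RobotDevice`, `authenticate`, `load_program`) are assumptions, as the text specifies no particular API.

```python
# Hypothetical sketch of the configure-then-load sequence: link is implicit,
# authentication gates the load step. Names are illustrative, not from the text.

class AuthenticationError(Exception):
    pass

class RobotDevice:
    """Stand-in for the robotic device being configured."""
    def __init__(self, shared_key):
        self._key = shared_key
        self._programs = {}  # loaded application programs, keyed by name

    def authenticate(self, key):
        # A real device might use a challenge-response exchange; a simple
        # shared-key comparison stands in for that here.
        return key == self._key

    def load_program(self, name, program):
        self._programs[name] = program

class ConfigurationTool:
    """Stand-in for the configuration tool."""
    def __init__(self, key):
        self._key = key

    def configure(self, robot, name, program):
        # Authenticate first; load only on success.
        if not robot.authenticate(self._key):
            raise AuthenticationError("configuration tool rejected")
        robot.load_program(name, program)

tool = ConfigurationTool(key="secret")
robot = RobotDevice(shared_key="secret")
tool.configure(robot, "scheduler", program=b"scheduling-app")
```

A tool presenting the wrong credentials would raise `AuthenticationError` before any information is loaded, mirroring the requirement that loading occur only after successful authentication.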


In one configuration of the invention, the link connecting the configuration tool to the robotic device can include a communication port in each device, such as but not limited to a serial port, USB port, or other appropriate communication port. The robotic device and the configuration tool can then communicate through a cable plugged into the communication port of each device. In an alternative configuration, the link between the configuration tool and the robotic device can be a direct physical connection, wherein one device includes a male serial port adapter, or other communication port adapter such as a USB connector, which plugs directly into a female port on the other device. In a further alternative configuration, the robotic device and configuration tool may link through a wireless connection, wherein a remote signal, such as an infrared, radio frequency, or other appropriate frequency signal, is used to load information from the configuration tool to the robotic device.


The scheduling application program loaded into the robotic device can enable the robotic device to implement further scheduling information from a remote device. As a result, the robotic device can be enabled to run autonomously based on scheduling information loaded into, and stored in, the robotic device without further user input. One embodiment of the invention allows the remote device to be a hand-held input device that can communicate with the robotic device through a wireless connection.


One embodiment of the invention includes a method for configuring a robotic device for autonomous use, including the steps of selecting scheduling information in a remote device, linking the remote device to the robotic device, authenticating the link, communicating the scheduling information from the remote device to the robotic device, and storing the scheduling information in the robotic device. The link between the robotic device and the remote device can be a wireless connection, or any other linking method, such as those described above.


Once the scheduling information has been stored in the robotic device, the robotic device can operate in accordance with this stored scheduling information. As a result, the stored scheduling information enables the robotic device to run autonomously without further user input. In one embodiment, the stored scheduling information can include the date, day, and/or time at which the robotic device should operate, and also the number and type of missions it should run at each scheduled time.
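The stored schedule described above can be pictured as a small record per scheduled event. Here is a minimal sketch; the field names and mission labels are assumptions for illustration, not taken from the text.

```python
# Illustrative schedule record holding the day, time, and mission type
# described above, plus a check the robot could run each minute.
from dataclasses import dataclass
import datetime

@dataclass
class ScheduleEntry:
    weekday: int   # 0 = Monday ... 6 = Sunday
    hour: int
    minute: int
    mission: str   # e.g. "full_clean", "spot_clean" (hypothetical labels)

    def is_due(self, now: datetime.datetime) -> bool:
        # The robot compares its clock against the stored entry; no further
        # user input is needed once the entry is stored.
        return (now.weekday() == self.weekday
                and now.hour == self.hour
                and now.minute == self.minute)

entry = ScheduleEntry(weekday=2, hour=9, minute=30, mission="full_clean")
wednesday_morning = datetime.datetime(2024, 1, 3, 9, 30)  # a Wednesday
```

With such records stored on board, the device needs only its own clock to decide when to start a mission.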


Another embodiment of the invention includes a method of communicating with a robotic device. This method includes the steps of linking a single communication device to the robotic device, authenticating the link, and transmitting information from the communication device to the robotic device, wherein the transmitted information includes controlling a function of the robotic device, and at least one of configuring the robotic device and providing scheduling information for the robotic device. As a result, a single communication device can provide multiple functions for the control, scheduling, and configuration of a robotic device.


In various embodiments of the invention the transmitted information can include control and scheduling information, control and configuration information, or control, configuration and scheduling information. This control, configuration, and scheduling information need not be transmitted at the same time, but can be communicated independently, and at different times to enable one specific aspect of the invention. The communication device used to transmit this information can include, but is not limited to, a hand-held remote device, a PC, a laptop, and a wireless communication device.


In one embodiment of the invention, the method can further include the step of transmitting information from the robotic device to the communication device. This information transmitted from the robotic device can include, but is not limited to, an error report, a power level report, currently stored scheduling information, a status report, authentication information, and a user maintenance report.


One embodiment of the invention provides an apparatus for communicating with a robotic device. This apparatus includes a memory for storing information, at least one port for communicating with the robotic device, at least one authenticator for authenticating the robotic device, and at least one transmitter for transmitting information to the robotic device, via a communication port. The information communicated to the robotic device includes information for controlling a function of the robotic device, and at least one of configuration information and scheduling information for the robotic device. In an alternative embodiment, the apparatus can also receive information transmitted from the robotic device.


The apparatus can be a hand-held remote device, or other communication device, and can further include a wireless communication device. In one embodiment, the apparatus can include a display, allowing a user to view information associated with the running of the apparatus. The apparatus can also include at least one user input, allowing the user, for example, to input information to be communicated to the robotic device, directly control a function of the robotic device through a wireless connection, upload information from the robotic device, or search for or control a function of the remote device itself. In various embodiments of the apparatus, the user input can include, but is not limited to, a switch, a joystick, a button, a touch sensitive pad, a roller-ball, and an acoustic input, such as a voice command.


In another embodiment, the invention can include a robotic device that includes a memory for storing information, at least one port for receiving information from a communication device, and at least one authenticator for authenticating the communication device. The communicated information includes information for controlling a function of the robotic device, and at least one of configuration information and scheduling information for the robotic device. This control, configuration, and scheduling information need not be received at the same time, but can be received independently, and at different times, to enable one specific aspect of the invention.


The robotic device is adapted to operate autonomously without further user input based upon scheduling information received from the communication device. Thus, upon loading of a scheduling application program, either pre-installed or by a communication device, a user can enable the robotic device to run autonomously according to the received scheduling information. The robotic device can be either a mobile robotic device, such as a cleaning robot, or a stationary robotic device. In one embodiment it can also include at least one transmitter for transmitting information to the communication device.


In yet another aspect, the invention can comprise a robotic system including both a robotic device and a separate communication device for communicating information with the robotic device. The communicated information includes information for controlling a function of the robotic device, and at least one of configuration information and scheduling information for the robotic device. In one embodiment of the invention, the robotic device can also transmit information to the communication device. In one embodiment, the communication device can be a hand-held remote device, while the robotic device can be either a mobile robotic device or a stationary robotic device.





BRIEF DESCRIPTION OF THE DRAWINGS

The objects and features of the invention can be better understood with reference to the drawings described below, and the claims. The drawings are not necessarily to scale, emphasis instead generally being placed upon illustrating the principles of the invention. In the drawings, like numerals are used to indicate like parts throughout the various views.



FIG. 1 is a block diagram showing one configuration of the communication device and robotic device system, in accordance with one embodiment of the invention.



FIG. 2A is a schematic front-end view of the communication device and/or configuration tool, in accordance with one embodiment of the invention.



FIG. 2B is a schematic top view of the communication device of FIG. 2A.



FIG. 2C is a schematic left-side view of the communication device of FIG. 2A.



FIG. 2D is a schematic right-side view of the communication device of FIG. 2A.



FIG. 2E is a schematic rear-end view of the communication device of FIG. 2A.



FIG. 3 is a schematic perspective view of the communication device and/or configuration tool with an open front cover, in accordance with one embodiment of the invention.



FIG. 4 is a schematic display for a communication device and/or configuration tool, in accordance with one embodiment of the invention.



FIG. 5 is a schematic top view of a communication device and/or configuration tool with the display of FIG. 4, in accordance with one embodiment of the invention.



FIG. 6A is a schematic diagram illustrating a communication device in wireless communication with mobile and stationary robotic devices, in accordance with one embodiment of the invention.



FIG. 6B is a schematic diagram illustrating a communication device in communication, through a communication port and cable, with a mobile robotic device, in accordance with one embodiment of the invention.



FIG. 6C is a schematic diagram illustrating a communication device in direct physical communication with a mobile robotic device, in accordance with one embodiment of the invention.



FIG. 7 is a block diagram featuring a method for implementing and using a system including a robotic device and a communication device, in accordance with one embodiment of the invention.





DETAILED DESCRIPTION

The invention provides a method and apparatus for configuring a robotic device to run autonomously according to stored scheduling information. The apparatus includes a communication device that can be used to directly control a function of the robotic device. For example, the communication device can be used to provide directional control to a mobile robotic device such as a cleaning robot. The communication device can also be used to load configuration information, such as a scheduling application program, into the robotic device, such that the robotic device can run autonomously without further user input once user-defined scheduling information has been stored. This scheduling information can also be communicated to the robotic device via the communication device.



FIG. 1 is a block diagram showing one possible configuration of a combined scheduling tool and communication device 10. In this configuration, a single communication device 12 is adapted and configured to carry out multiple tasks related to the scheduling and control of a robotic device. Firstly, the communication device 12 can be linked with a robotic device in order to download configuration information 14 into the robotic device. This configuration information 14 may include a new application program to enable the robotic device to carry out new tasks, or be adapted to enhance the tasks it can already perform. For example, the configuration information 14 can include a scheduling application program 20, to enable the robotic device to carry out a set task at a set time. The task to be performed and time at which it is to be performed can be downloaded with the application program or communicated at a later date through the communication device 12, based on the requirements of the user. The application can also configure the robotic device to receive signals remotely from the communication device 12 in order to enable user defined scheduling.


In one embodiment of the device, this configuration information 14 can be sent through a wireless connection with the robotic device, with the information sent by infrared (IR), radio frequency (RF), or other appropriate signal. In alternative embodiments, the scheduling information could be sent through communication ports linked by a cable (for example a USB or serial port link), or even by a direct physical connection between the communication device 12 and the robotic device. For a direct communication, a male connector (e.g. USB, serial port or other appropriate connection element) on one device mates directly with a female connector on the other device. In further alternative embodiments, the direct communication can include a docking station on the robotic device, such that the communication device can be removably attached to the robotic device, thus allowing the communication device to act as a direct user interface between a user and the robotic device.


The configuration information 14 can also include information 22 for upgrading the existing capabilities of the robotic device or reprogramming the device to carry out new tasks. This upgrading information 22 can include, but is not limited to, new versions of the software installed in the robotic device, diagnostic information to check the status of the robotic device, and programs to allow the robotic device to send information to the communication device (either prompted by the user or upon the occurrence of a certain event). Further upgrading or reprogramming information 22 can include programs and applications allowing the robotic device to carry out completely new tasks (such as, but not limited to, working as a toy, security device, or searching device for lost objects) or “learning” programs and applications allowing the robotic device to adapt its own programming based on information gained through carrying out specified tasks. These learning programs can, for example, allow a mobile robotic device 26 to map out a room and remember where the objects in the room are placed, or adapt its scheduling based on prior patterns of user behavior.


The communication device 12 can also be configured to communicate scheduling information 16 to a robotic device. In one embodiment, this scheduling information 16 is sent through a wireless connection between the communication device 12 and the robotic device, although again in alternative embodiments, communication ports providing a wired link (such as a USB or serial port link), or a direct physical connection can be used. The scheduling information can be communicated to either a stationary robotic device 24 or a mobile robotic device 26. The mobile robotic device 26 can, for example, be a cleaning robot such as the Roomba® brand floor vacuum sweeper available from iRobot Corporation, Burlington, Mass. The stationary robotic device 24 can, for example, be a portable barrier signal transmitter designed to send an IR beam along a designated path. The mobile robotic device 26 can be configured to change direction upon encountering this signal, thus the IR beam from the portable barrier signal transmitter acts as a “virtual wall” for the mobile robotic device (see U.S. Pat. No. 6,690,134, incorporated herein by reference in its entirety). The stationary robotic device 24 can also be a docking station, home-base, or charging device for the robotic device.


In one embodiment of the invention, scheduling information 16 can be input into the communication device 12 through a user interface of the device 12. This information can then be communicated to a stationary 24 or mobile 26 robotic device through a wireless connection between the communication device 12 and the robotic device. The robotic device stores this information and runs according to the stored scheduling information 16 without the need for any other input from a user, controller or communication device 12. Changes in the scheduling information 16 stored in the robotic device can be made by simply inputting new scheduling information 16 into the communication device 12 and communicating it to the robotic device. In an alternative embodiment, a further step, such as but not limited to clearing the stored scheduling information 16 from the robotic device's memory or inputting a code (either into the communication device 12 or directly into the robotic device), may be required before new scheduling information 16 can be loaded into the robotic device.
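The store-and-replace behavior described above, including the alternative embodiment that requires clearing the stored schedule before loading a new one, can be sketched as follows. All names are illustrative assumptions.

```python
# Hypothetical on-robot schedule store: the robot keeps one stored schedule
# and, in the alternative embodiment, requires an explicit clear step before
# accepting a replacement.

class ScheduleLockedError(Exception):
    pass

class RobotScheduleStore:
    def __init__(self, require_clear=False):
        # require_clear=True models the alternative embodiment in which a
        # further step is needed before new scheduling information loads.
        self._require_clear = require_clear
        self._schedule = None

    def load(self, schedule):
        if self._require_clear and self._schedule is not None:
            raise ScheduleLockedError("clear the stored schedule first")
        self._schedule = schedule

    def clear(self):
        self._schedule = None

    @property
    def schedule(self):
        return self._schedule

store = RobotScheduleStore(require_clear=True)
store.load(["Mon 09:00 full_clean"])
```

In the default embodiment (`require_clear=False`), simply communicating new scheduling information overwrites the old, matching the simpler update path described in the text.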


In one embodiment of the invention, the robotic device can be configured to provide a visual or audio signal upon the completion of a transfer of configuration or scheduling information. In an alternative embodiment, a return signal can be sent from the robotic device to the communication device 12 upon the successful completion of an information transfer. The robotic device can also be configured to illuminate a status light on either device if and when a scheduling program is stored in the memory.


The scheduling information 16 can include, but not be limited to, the date, day and time at which the robotic device operates, and may also include other information such as the length of time the robotic device should operate during a scheduled event, the mission or task it should carry out for each scheduled operation, and the number of missions or tasks it should carry out during a scheduled operation. The scheduling information can also include more complex calendar based information, such that the robotic device may be able to adjust its start time based on the time of year (for example due to time differences for daylight savings time or for the available hours of daylight), or adjust its schedule for holidays.
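The calendar-based adjustment described above (for example, for daylight saving time) can be illustrated with the standard library's time zone support. This is a sketch under assumptions: the zone name is an example, and the text does not prescribe any particular mechanism.

```python
# Illustrative calendar-aware start time: scheduling in local time with a
# time zone database makes the same 9:00 entry shift correctly across a
# daylight saving change. The zone name is an example only.
from datetime import datetime
from zoneinfo import ZoneInfo

tz = ZoneInfo("America/New_York")
winter_start = datetime(2024, 1, 15, 9, 0, tzinfo=tz)  # EST, UTC-5
summer_start = datetime(2024, 7, 15, 9, 0, tzinfo=tz)  # EDT, UTC-4

# Both runs begin at 9:00 local time, but at different UTC offsets, so a
# robot keeping its internal clock in UTC would adjust its start time.
winter_offset_hours = winter_start.utcoffset().total_seconds() / 3600
summer_offset_hours = summer_start.utcoffset().total_seconds() / 3600
```

Holiday adjustments could be handled the same way, by consulting a stored calendar before deciding whether a scheduled entry is active on a given date.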


A robotic device can be configured or programmed to run a number of separate programs. For example, a mobile cleaning robot can be configured to clean different areas of a room or building, clean a particular spot on a floor, clean at varying power levels between a minimum and a maximum setting, return to a docking station when power drops to a specific level or the dirt compartment is full, or carry out other specific tasks. Using the scheduling information 16, the missions or tasks that the robotic device carries out can then be tailored to a user's requirements, for example by only carrying out a high power cleaning mission at times when nobody is in the house. In one embodiment of the invention, a stationary robotic device 24, such as a portable barrier signal transmitter, can be scheduled to operate at the same time as a mobile cleaning robot, thus saving power. Alternatively, the stationary robotic device 24 may only turn on during some scheduled operations, depending on whether a user wants to clean the area potentially blocked by the portable barrier signal transmitter or not.


In one embodiment, the communication device 12 can also be used to provide direct control information 18 to a robotic device, based on a user input. This can involve directly driving a function of a robotic device 28, or initiating the robotic device to carry out a preprogrammed mission or task 30. In one embodiment of the invention, the communication device 12 includes a user input, or a number of inputs, such as, but not limited to, switches, a joystick, buttons, a touch sensitive pad, and a roller-ball. Using one of, or a combination of, these user inputs, a user can command the robot to carry out a specific movement or action immediately. For example, the driving information 28 may include, but not be limited to, commands to make a mobile robotic device turn left, turn right, move forward, and move backward. In the specific embodiment of a mobile cleaning robot, the driving information 28 may also include such commands as start and stop cleaning, or clean at a specific power level.
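The direct-drive commands listed above can be pictured as a small command table mapped to wheel-speed pairs. The command names come from the text; the speed values and the differential-drive interpretation are illustrative assumptions.

```python
# Hypothetical mapping from drive commands to (left wheel, right wheel)
# speed pairs for a differential-drive cleaning robot. Speeds are
# illustrative, normalized to [-1, 1].
DRIVE_COMMANDS = {
    "forward":  (+1.0, +1.0),
    "backward": (-1.0, -1.0),
    "left":     (-0.5, +0.5),  # spin in place, counter-clockwise
    "right":    (+0.5, -0.5),  # spin in place, clockwise
}

def drive(command):
    """Return the wheel speeds for a direct-drive command."""
    try:
        return DRIVE_COMMANDS[command]
    except KeyError:
        raise ValueError(f"unknown drive command: {command!r}")
```

A remote's direction buttons would simply emit one of these command strings; the robot resolves it to motor outputs on receipt.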


The control information 18 may also include commands to carry out pre-programmed missions, tasks or actions. For example, the communication device 12 can include buttons or other user inputs that command a robotic device to perform a specific task when the user input is enabled. For a mobile cleaning robot, these task commands 30 could include cleaning a specific spot, carrying out a specified cleaning mission, cleaning at a specific power level, stopping and powering down, powering up, or returning to a docking station.



FIGS. 2A-2E show five views of an example of a particular communication device 40. FIG. 2A shows a front-end view of the communication device 40, showing a wireless communication port 42, allowing the communication device 40 to communicate remotely, using for example IR signals, with a robotic device or other electronic device. The wireless communication port 42 can be used to provide configuration, scheduling, and control information to a robotic device, and optionally also receive information from a robotic or other device.


In one embodiment of the invention, the communication device can be configured to receive a range of information from a robotic device. In the case of a robotic cleaning device, this information can include, but is not limited to, power level or dirt compartment status reports, error reports, notifications that filters, sensors or brushes need to be cleaned, “dirt alerts” when a dirty area of floor is detected, and mission status reports (e.g., mission completed, abandoned, or battery depleted).
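The report categories listed above could be carried as tagged messages; a minimal sketch follows. The tag names and descriptions are assumptions for illustration only.

```python
# Hypothetical report tags a cleaning robot might send back to the
# communication device, covering the categories listed above.
REPORT_TYPES = {
    "power_level":    "battery charge remaining",
    "bin_level":      "dirt compartment level",
    "error":          "error report",
    "maintenance":    "filters, sensors, or brushes need cleaning",
    "dirt_alert":     "dirty area of floor detected",
    "mission_status": "mission completed, abandoned, or battery depleted",
}

def describe_report(kind):
    """Return a human-readable description for a received report tag."""
    return REPORT_TYPES.get(kind, "unknown report type")
```

On receipt, the communication device would look up the tag and render the description on its display.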



FIG. 2B shows a top or plan view of the communication device 40. The communication device 40 includes a number of user input devices, including a button 44, a set of buttons 48, and a second set of buttons 50. Each of these buttons (i.e., switches) can be configured to input different information into the communication device 40, or provide different information to a robotic device. In one embodiment of the invention, the function of these buttons can differ when a front cover or flip lid 54 is in either an open or closed position. In this embodiment, with the front cover 54 open, the buttons can be used to input and store scheduling or other information into the communication device 40, while with the lid 54 closed the buttons can be used to communicate with a robotic device and provide configuration, scheduling, and control information to the robotic device.


In one embodiment, button 44 could be used to initiate the communication of configuration or scheduling information to a robotic device, control a specific task of the robotic device (such as initiating docking), or turn the robotic device, or the communication device 40, on and off. Buttons 48 can be used to provide input information into the communication device 40 when setting up scheduling information, enable the loading of specific configuration information into a robotic device, or control a specific mission, task or action of the robotic device. Buttons 50 may be used to input scheduling information into the communication device 40, enable the loading of configuration or scheduling information into a robotic device, and control a specific action of the robotic device. In one embodiment of the invention the buttons 50 could be used to directly control the movement of a cleaning robot, with the three buttons assigned to turning left, turning right, and moving forward. In an alternative embodiment, one or more of the buttons can also be used to lock the robotic device in a certain mode of operation, or in an “off” setting.


The communication device shown in FIG. 2B also includes a display 46. This display 46 can, for example, be a liquid crystal display (LCD), allowing the user to see the information being input into the communication device 40, which configuration, scheduling, or control information is being sent to a robotic device, or information sent from the robotic device to the communication device 40.



FIG. 2C shows a left-side view of the communication device 40. This view shows the side of the flip lid 54 when the lid is closed. Among other things, this front cover 54 can be used to change the functions of the user inputs, show or hide portions of the display 46, uncover other hidden user inputs, or uncover instructions for the use of the device. Indentations 58 below the edge of the front cover 54 are included to allow for easy opening of the cover 54 by a user. The casing 56 of the communication device 40 can be made of either metal or plastic, and can include a removable cover to allow access to a battery compartment (not shown).



FIG. 2D shows a right-side view of the communication device 40. This view includes a communication port 52 to allow the communication device 40 to connect to another device for uploading and downloading information, such as, but not limited to, authenticating information, configuration information, scheduling information, control information, and technical information. The communication port 52 can be, but is not limited to, a serial port, a parallel port, a USB port, an IEEE 1394 “Firewire” port, a PS/2 port, a modem port, or an Ethernet network port.



FIG. 2E shows a rear-end view of the communication device 40. In one embodiment of the invention, the communication device 40 of FIGS. 2A-2E is sized and shaped to fit in a user's hand, and can be used in the same manner as a television or video remote control. In an alternative embodiment of the invention, other user input devices can be used as a communication device 40 to communicate with a robotic device. Such alternative devices include, but are not limited to, another hand-held remote device, a stationary remote communication device with user interface, a PC, a laptop computer, and a wireless communication device, such as a mobile phone or personal digital assistant (PDA).



FIG. 3 is a schematic view of the communication device 40 of FIGS. 2A-2E with an open front cover 54. As previously discussed, the opening of the front cover 54 can be used to, amongst other things, uncover hidden user inputs, change the function of certain user inputs, uncover a portion of the display 46, enable certain “locked” functions of the communication device 40, or uncover instructions printed on the inside of the cover 54.



FIG. 4 is an image of an example display 60 for a communication device. This display 60 includes a day of the week indicator 62, a schedule frequency indicator 64, indicating whether a specific scheduling task is to be performed once or repeatedly on a weekly basis, and time indicators showing whether a scheduled task is to be performed in the “am” or “pm” 66, and at what specific time 68. The display 60 also includes a power indicator 70 and a signal indicator 72 that can indicate when a signal is being communicated from or to the communication device. A further indicator 74 can be used to display additional information, such as, but not limited to, the number of the scheduled task, the type of task to be performed, and the status of a device.
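The display fields above together describe a single schedule entry: a day of the week, a once-or-weekly frequency, an am/pm flag, a specific time, and a task number. As a minimal sketch, using a hypothetical field layout that the patent does not itself specify, such an entry could be modeled as:

```python
from dataclasses import dataclass

@dataclass
class ScheduleEntry:
    """One scheduled task, mirroring the display fields of FIG. 4."""
    task_number: int       # further indicator 74
    day: str               # day-of-week indicator 62, e.g. "MON"
    hour: int              # specific time indicator 68 (12-hour clock)
    minute: int
    meridiem: str          # "am"/"pm" indicator 66
    repeat_weekly: bool    # frequency indicator 64: once vs. weekly

    def hour_24(self) -> int:
        """Convert the displayed 12-hour time to a 24-hour value."""
        h = self.hour % 12
        return h + 12 if self.meridiem == "pm" else h
```

With this sketch, a task shown as "MON 7:30 pm, weekly" becomes `ScheduleEntry(1, "MON", 7, 30, "pm", True)`, whose `hour_24()` is 19.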


In alternative embodiments of the invention, the display 60 could also be configured to show such things as options for the type of configuration information that can be communicated, the range and type of scheduling information available for a given robotic device, and previously transmitted scheduling information. A display 60 can also be configured to show information received from a robotic device, as discussed above.



FIG. 5 shows an image of a communication device 80 with the display 60, in accordance with one embodiment of the invention. This configuration conforms generally with the communication device 40 shown in FIGS. 2A-2E and FIG. 3, with the user inputs assigned specific tasks. In this configuration, button 82 is an on/off switch, button 84 enables a new scheduling program or saves an inputted scheduling program, button 86 deletes inputted information, and button 88 enables the communication device 80 to send scheduling information to a robotic device. Further user inputs are configured to provide direct control commands to a mobile robotic device, with button 90 providing a “move forward” command, button 92 providing a “turn left” command, and button 94 providing a “turn right” command to the robotic device. The display 60 embedded in the communication device 80 conforms generally with that of FIG. 4.
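The button assignments described for FIG. 5 amount to a fixed mapping from user inputs to commands. A minimal sketch of such a lookup follows; the command names are hypothetical, since the patent does not specify the codes actually transmitted over the link:

```python
# Hypothetical mapping from the buttons of FIG. 5 to command identifiers.
BUTTON_COMMANDS = {
    82: "POWER_TOGGLE",          # on/off switch
    84: "SCHEDULE_NEW_OR_SAVE",  # enable or save a scheduling program
    86: "DELETE_INPUT",          # delete inputted information
    88: "SEND_SCHEDULE",         # transmit schedule to the robotic device
    90: "MOVE_FORWARD",          # direct control commands
    92: "TURN_LEFT",
    94: "TURN_RIGHT",
}

def command_for_button(button: int) -> str:
    """Look up the command a given button press should transmit."""
    if button not in BUTTON_COMMANDS:
        raise ValueError(f"unassigned button: {button}")
    return BUTTON_COMMANDS[button]
```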



FIGS. 6A-6C show various means of linking a communication device with a stationary or mobile robotic device. In the system configuration 100 shown in FIG. 6A, a communication device 102 is configured to communicate through a wireless connection with either or both of a mobile robotic device 104 and a stationary robotic device 106. In one embodiment, the wireless link consists of an IR signal 108, which can be sent from the communication device 102 to the mobile 104 and stationary 106 robotic devices. In this configuration, further IR signals 108 can be sent from the robotic devices to the communication device 102, as indicated by the arrows 110. In this system, configuration, scheduling, and control information can be communicated from the communication device 102 to the mobile 104 and stationary 106 robotic devices, while information such as, but not limited to, status reports and error reports can be communicated back to the communication device 102 from the robotic devices. In alternative embodiments, the communication device 102 can communicate information to a single mobile 104 or stationary 106 robotic device, that can then send the communicated information to further devices directly, without the need for additional linking to the communication device 102. In this configuration, multiple mobile 104 or stationary 106 robotic devices can be configured, scheduled, and/or controlled through a link with only a single robotic device.
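The relaying arrangement described above, in which a single linked robotic device forwards communicated information to further devices without additional links back to the communication device 102, can be sketched as follows (class and method names are illustrative assumptions):

```python
class Device:
    """Sketch of a robotic device that can forward received information to
    peer devices that have no direct link to the communication device."""
    def __init__(self, name):
        self.name = name
        self.inbox = []    # information received so far
        self.peers = []    # further devices reachable from this one

    def receive(self, message, relay=True):
        self.inbox.append(message)
        if relay:
            for peer in self.peers:
                peer.receive(message, relay=False)  # forward one hop only

# A single transmission to one linked robot reaches every peer device.
hub = Device("mobile-104")
hub.peers = [Device("stationary-106a"), Device("stationary-106b")]
hub.receive("SCHEDULE: MON 7:30 pm")
```

After the single `receive` call on the linked device, every peer holds the same scheduling message, without ever having communicated with the remote directly.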


In the system configuration 120 shown in FIG. 6B, a mobile robotic device 104 is linked to a communication device 102 through a cable 122. The cable connects both devices through communication ports located on each device. These ports can be serial ports, parallel ports, USB ports, IEEE 1394 “Firewire” ports, PS/2 ports, modem ports, Ethernet network ports, or other appropriate communication ports. In one embodiment of this system configuration 120, the cable connection 122 can be used to quickly load configuration information into a mobile robotic device 104. This system configuration 120 can also be used to load configuration information into a stationary robotic device 106. This configuration information can be used to either enable a new scheduling function on the robotic device, or upgrade or reprogram existing functions of the robotic device, as discussed above. After the configuration information has been loaded through the cable 122, the cable 122 can be removed. Further scheduling information and control information can then be communicated to the robotic device using a wireless connection, as shown in system configuration 100 and FIG. 6A.


In the system configuration 130 shown in FIG. 6C, a mobile robotic device 104 is linked to a communication device 102 through a direct connection 132. This direct connection can consist of a male connection port on the communication device 102 that mates directly to a female connection port on the mobile robotic device 104. Again, this system configuration 130 can also be used to link the communication device 102 to a stationary robotic device 106. As in system configuration 120, shown in FIG. 6B, this system configuration can be used to load configuration information into a robotic device, after which scheduling or control information can be communicated to the robotic device through the wireless communication configuration 100 shown in FIG. 6A. In one embodiment of the invention, the male connection port on the communication device 102 can retract or fold into the communication device 102 when not in use.



FIG. 7 shows a block diagram featuring one method 140 for implementing and using a system including a robotic device and a communication device. In this method 140, a single communication device can be used to provide configuring, scheduling, and control information to a robotic device.


For a robotic device without a pre-installed scheduling application program, or a robotic device needing reprogramming or upgrading, the communication device can be used to load the required configuration information into the robotic device. This requires first linking 142 the communication device to the robotic device, through a wireless connection, a communication port, or a direct connection. Upon optional authentication of the link using an authenticator (e.g., by hardware or software based systems), the desired configuration information can be loaded 144 into the robotic device, at which time it is stored 146 in memory of the robotic device. After this has been completed, the robotic device is ready for use 148. For robotic devices that have already been configured, steps 142, 144, and 146 are not necessary.
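The configuration sequence of steps 142-148 can be sketched as follows; the class and method names are illustrative assumptions, not part of the patent:

```python
# Sketch of the configuration sequence of FIG. 7 (steps 142-148).
class Robot:
    def __init__(self):
        self.memory = {}        # step 146: robot-side storage
        self.linked = False
        self.configured = False

    def link(self, authenticated=True):
        """Step 142: form the link, with optional authentication."""
        if not authenticated:
            raise PermissionError("link authentication failed")
        self.linked = True

    def load_configuration(self, config):
        """Step 144: load configuration information over the link."""
        if not self.linked:
            raise RuntimeError("no link to communication device")
        self.memory["configuration"] = config   # step 146: store in memory
        self.configured = True                  # step 148: ready for use

robot = Robot()
robot.link()
robot.load_configuration({"scheduler_version": "1.0"})
```

An already-configured robot would simply start with `configured` set, skipping steps 142-146 as the text describes.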


Once the robotic device and communication device are ready for use 148, the communication device can be used to provide scheduling information or direct control information to the robotic device. In one embodiment, this information is communicated through a wireless link, although a communication port link or direct link is also possible.


For enabling the robotic device to run according to a user defined schedule, the scheduling information is first entered into the communication device 150. The communication device can then be linked 152 to the robotic device and, upon optional authentication of this link, the scheduling information can be loaded 154 into the robotic device and stored 156 in the device's memory. The robotic device is then free to run autonomously 158, based on this stored scheduling information. Depending on the schedule, the robotic device can start immediately or at a future time.
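Steps 150-158 can be sketched as follows, assuming a hypothetical in-memory representation of the stored schedule; depending on each entry's start time, a task either runs immediately or waits for a future time:

```python
import datetime

# Sketch of steps 154-158: schedule entries are stored in the robot's
# memory, and the robot autonomously decides which tasks are due.
def load_schedule(memory, entries):
    """Steps 154-156: store the loaded schedule in robot memory."""
    memory["schedule"] = list(entries)

def due_tasks(memory, now):
    """Step 158: return tasks whose scheduled start time has arrived."""
    return [e for e in memory.get("schedule", []) if e["start"] <= now]

memory = {}
load_schedule(memory, [
    {"task": "clean-kitchen", "start": datetime.datetime(2005, 6, 24, 19, 30)},
    {"task": "clean-hall", "start": datetime.datetime(2005, 6, 25, 7, 0)},
])
now = datetime.datetime(2005, 6, 24, 20, 0)
```

At the illustrated `now`, only the first task has come due; the second remains stored and starts at its future time without further external control.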


As well as providing scheduling information to a robotic device, the communication device can also directly control one or more functions of the robotic device. Again, with the communication device and robotic device ready for use 148, a link can be formed 160 between the communication device and robotic device. Once the link has been authenticated, control information entered 162 into the communication device is received 164 immediately by the robotic device, which then operates 166 according to the user-inputted control information.


The scheduling and control functions can run independently, such that the robotic device can be controlled by the user even when the robotic device is not scheduled to run, and alternatively run a scheduled mission without any need for control information. In one embodiment of the invention, the control function can be configured to overrule the scheduling function, so that during a scheduled event a user can take direct control of the robotic device without waiting until the scheduled task is completed. In an alternative embodiment, the scheduling function can be set as the dominant function, and thus upon the start of a scheduled task overrule any direct user control information being communicated at that time.
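The two priority policies described here, control-dominant and schedule-dominant, can be sketched as a single arbitration function (the names are hypothetical):

```python
# Sketch of arbitrating between direct control and a scheduled mission.
# control_dominant=True lets a user command overrule a running schedule;
# False makes the scheduled task overrule incoming control information.
def select_action(control_cmd, scheduled_task, control_dominant=True):
    """Pick what the robot executes when either or both inputs are present."""
    if control_cmd and scheduled_task:
        return control_cmd if control_dominant else scheduled_task
    return control_cmd or scheduled_task or "idle"
```

Because the branches are independent, the robot can still be driven when nothing is scheduled, and can still run a scheduled mission with no control input, matching the text above.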


It should be noted that these functions can at any time be modified or updated by downloading new configuration information into the robotic device. In one embodiment of the invention the communication device can also be configured, updated or reprogrammed by linking the communication device to another device, such as but not limited to a PC, laptop, or other programming or diagnostic tool. As a result, both the communication device and the robotic device can be constantly updated to meet the requirements of the user and advancements developed by the system manufacturers or suppliers.


The invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof. The foregoing embodiments, therefore, are to be considered in all respects illustrative rather than limiting of the invention described herein. The scope of the invention is thus indicated by the appended claims, rather than by the foregoing description, and all changes that come within the meaning and range of equivalency of the claims are intended to be embraced therein.

Claims
  • 1. A method of cleaning a room, the method comprising: transmitting from a cleaning robot to a mobile phone a status of the cleaning robot; and receiving at the cleaning robot from the mobile phone, in response to an operator command input at the mobile phone and at least in part indicative of a schedule, information including instructions configured to cause a processor of the cleaning robot to execute a cleaning operation in the room according to the schedule, wherein executing the cleaning operation in the room according to the schedule comprises: leaving a stationary charging device at which the cleaning robot is docked according to the schedule, and navigating about a floor surface of the room.
  • 2. The method of claim 1, wherein transmitting the status to the mobile phone comprises transmitting a mission status report.
  • 3. The method of claim 2, wherein the mission status report is transmitted after the cleaning operation is complete, the mission status report being indicative of completeness of the mission.
  • 4. The method of claim 2, wherein transmitting the mission status report comprises transmitting the mission status report during the cleaning operation.
  • 5. The method of claim 1, further comprising transmitting information to cause the cleaning robot to generate a map of the room.
  • 6. The method of claim 5, wherein the map of the room comprises a position of an object in the room.
  • 7. The method of claim 1, further comprising storing the schedule on a memory of the cleaning robot.
  • 8. The method of claim 1, wherein transmitting the status to the mobile phone comprises transmitting an error report indicative of an error of at least one of a filter, a sensor, a brush, a drive, or a dust compartment of the cleaning robot.
  • 9. The method of claim 1, wherein transmitting the status to the mobile phone comprises transmitting a power report indicative of at least one of a stop and power status, a low power status, a recharging status, and a specific power level.
  • 10. The method of claim 1, further comprising receiving an infrared signal from a portable signal transmitter, wherein navigating about the floor surface comprises navigating about the floor surface in response to the infrared signal from the portable signal transmitter.
  • 11. The method of claim 1, wherein transmitting from the cleaning robot to the mobile phone the status of the cleaning robot causes the mobile phone to illuminate a visual indicator.
  • 12. A method of controlling a cleaning robot to clean a room, the method comprising: initiating formation of a wireless communication link between the cleaning robot and a mobile phone; and entering an operator command input into the mobile phone to cause the mobile phone to transmit, using the wireless communication link, information to the cleaning robot, the operator command input being at least in part indicative of a schedule, and the information comprising instructions configured to cause a processor of the cleaning robot to perform operations including executing a cleaning operation in the room according to the schedule, wherein executing the cleaning operation in the room according to the schedule comprises: leaving, according to the schedule, a stationary charging device at which the cleaning robot is docked, and navigating about a floor surface of the room.
  • 13. The method of claim 12, wherein entering the operator command input comprises entering an adjustment to scheduling information to generate the schedule, the scheduling information being transmitted from the cleaning robot to the mobile phone using the wireless communication link.
  • 14. The method of claim 12, wherein entering the operator command input comprises entering a start time comprising at least one of a selected time, a selected date, and a selected day, and wherein the operations further comprise leaving, according to the schedule, the stationary charging device at the start time.
  • 15. The method of claim 12, wherein: entering the operator command input comprises entering a return command, and the operations further comprise returning to the stationary charging device in response to the return command.
  • 16. The method of claim 12, wherein: entering the operator command input comprises designating a cleaning mission selected from among a plurality of cleaning missions, and the operations further comprise navigating about the floor surface of the room according to the designated cleaning mission.
  • 17. The method of claim 16, wherein: designating the cleaning mission from among the plurality of cleaning missions comprises entering a movement command, and the operations further comprise moving, based on the movement command, in at least one of a forward direction, leftward direction, and rightward direction.
  • 18. The method of claim 12, wherein entering the operator command input to cause the mobile phone to transmit the information to the cleaning robot comprises: entering the operator command input into the mobile phone such that the mobile phone stores the operator command input on a memory of the mobile phone and transmits the information to the cleaning robot.
  • 19. The method of claim 12, wherein entering the operator command input into the mobile phone comprises providing a voice command to the mobile phone.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of and claims priority to U.S. application Ser. No. 11/166,891, filed Jun. 24, 2005, which claims priority to and the benefit of U.S. provisional patent application Ser. No. 60/582,531, filed Jun. 24, 2004, the disclosures of which are being incorporated herein by reference in their entirety. This application is related to U.S. patent application Ser. No. 11/166,518, filed Jun. 24, 2008, entitled “Programming and Diagnostic Tool for a Mobile Robot,” the disclosure of which is being incorporated herein by reference in its entirety.

6668951 Won Dec 2003 B2
6670817 Fournier et al. Dec 2003 B2
6671592 Bisset et al. Dec 2003 B1
6671925 Field et al. Jan 2004 B2
6677938 Maynard Jan 2004 B1
6687571 Byrne et al. Feb 2004 B1
6690134 Jones et al. Feb 2004 B1
6690993 Foulke et al. Feb 2004 B2
6697147 Ko et al. Feb 2004 B2
6705332 Field et al. Mar 2004 B2
6711280 Stafsudd et al. Mar 2004 B2
6732826 Song et al. May 2004 B2
6735811 Field et al. May 2004 B2
6735812 Hekman et al. May 2004 B2
6737591 Lapstun et al. May 2004 B1
6741054 Koselka et al. May 2004 B2
6741364 Lange et al. May 2004 B2
6748297 Song et al. Jun 2004 B2
6756703 Chang Jun 2004 B2
6760647 Nourbakhsh et al. Jul 2004 B2
6764373 Osawa et al. Jul 2004 B1
6769004 Barrett Jul 2004 B2
6774596 Bisset Aug 2004 B1
6779380 Nieuwkamp Aug 2004 B1
6781338 Jones et al. Aug 2004 B2
6809490 Jones et al. Oct 2004 B2
6810305 Kirkpatrick Oct 2004 B2
6810350 Blakley Oct 2004 B2
6830120 Yashima et al. Dec 2004 B1
6832407 Salem et al. Dec 2004 B2
6836701 McKee Dec 2004 B2
6841963 Song et al. Jan 2005 B2
6845297 Allard Jan 2005 B2
6848146 Wright et al. Feb 2005 B2
6854148 Rief et al. Feb 2005 B1
6856811 Burdue et al. Feb 2005 B2
6859010 Jeon et al. Feb 2005 B2
6859682 Naka et al. Feb 2005 B2
6860206 Rudakevych et al. Mar 2005 B1
6865447 Lau et al. Mar 2005 B2
6870792 Chiappetta Mar 2005 B2
6871115 Huang et al. Mar 2005 B2
6883201 Jones et al. Apr 2005 B2
6886651 Slocum et al. May 2005 B1
6888333 Laby May 2005 B2
6901624 Mori et al. Jun 2005 B2
6906702 Tanaka et al. Jun 2005 B1
6914403 Tsurumi Jul 2005 B2
6917854 Bayer Jul 2005 B2
6925357 Wang et al. Aug 2005 B2
6925679 Wallach et al. Aug 2005 B2
6929548 Wang Aug 2005 B2
D510066 Hickey et al. Sep 2005 S
6938298 Aasen Sep 2005 B2
6940291 Ozick Sep 2005 B1
6941199 Bottomley et al. Sep 2005 B1
6956348 Landry et al. Oct 2005 B2
6957712 Song et al. Oct 2005 B2
6960986 Asama et al. Nov 2005 B2
6965209 Jones et al. Nov 2005 B2
6965211 Tsurumi Nov 2005 B2
6968592 Takeuchi et al. Nov 2005 B2
6971140 Kim Dec 2005 B2
6975246 Trudeau Dec 2005 B1
6980229 Ebersole Dec 2005 B1
6985556 Shanmugavel et al. Jan 2006 B2
6993954 George et al. Feb 2006 B1
6999850 McDonald Feb 2006 B2
7013527 Thomas et al. Mar 2006 B2
7024278 Chiappetta et al. Apr 2006 B2
7024280 Parker et al. Apr 2006 B2
7027893 Perry et al. Apr 2006 B2
7030768 Wanie Apr 2006 B2
7031805 Lee et al. Apr 2006 B2
7032469 Bailey Apr 2006 B2
7040869 Beenker May 2006 B2
7041029 Fulghum et al. May 2006 B2
7042342 Luo et al. May 2006 B2
7051399 Field et al. May 2006 B2
7053578 Diehl et al. May 2006 B2
7054716 McKee et al. May 2006 B2
7055210 Keppler et al. Jun 2006 B2
7057120 Ma et al. Jun 2006 B2
7057643 Iida et al. Jun 2006 B2
7059012 Song et al. Jun 2006 B2
7065430 Naka et al. Jun 2006 B2
7066291 Martins et al. Jun 2006 B2
7069124 Whittaker et al. Jun 2006 B1
7079923 Abramson et al. Jul 2006 B2
7085623 Siegers Aug 2006 B2
7085624 Aldred et al. Aug 2006 B2
7113847 Chmura et al. Sep 2006 B2
7133746 Abramson et al. Nov 2006 B2
7142198 Lee Nov 2006 B2
7148458 Schell et al. Dec 2006 B2
7155308 Jones Dec 2006 B2
7167775 Abramson et al. Jan 2007 B2
7171285 Kim et al. Jan 2007 B2
7173391 Jones et al. Feb 2007 B2
7174238 Zweig Feb 2007 B1
7188000 Chiappetta et al. Mar 2007 B2
7193384 Norman et al. Mar 2007 B1
7196487 Jones et al. Mar 2007 B2
7201786 Wegelin et al. Apr 2007 B2
7206677 Hulden Apr 2007 B2
7211980 Bruemmer et al. May 2007 B1
7225500 Diehl et al. Jun 2007 B2
7246405 Yan Jul 2007 B2
7248951 Hulden Jul 2007 B2
7275280 Haegermarck et al. Oct 2007 B2
7283892 Boillot et al. Oct 2007 B1
7288912 Landry et al. Oct 2007 B2
7318248 Yan Jan 2008 B1
7320149 Huffman et al. Jan 2008 B1
7321807 Laski Jan 2008 B2
7324870 Lee Jan 2008 B2
7328196 Peters Feb 2008 B2
7332890 Cohen et al. Feb 2008 B2
7346428 Huffman et al. Mar 2008 B1
7352153 Yan Apr 2008 B2
7359766 Jeon et al. Apr 2008 B2
7360277 Moshenrose et al. Apr 2008 B2
7363108 Noda et al. Apr 2008 B2
7388879 Sabe et al. Jun 2008 B2
7389156 Ziegler et al. Jun 2008 B2
7389166 Harwig et al. Jun 2008 B2
7408157 Yan Aug 2008 B2
7418762 Arai et al. Sep 2008 B2
7424611 Hino et al. Sep 2008 B2
7430455 Casey et al. Sep 2008 B2
7430462 Chiu et al. Sep 2008 B2
7441298 Svendsen et al. Oct 2008 B2
7444206 Abramson et al. Oct 2008 B2
7448113 Jones et al. Nov 2008 B2
7459871 Landry et al. Dec 2008 B2
7467026 Sakagami et al. Dec 2008 B2
7474941 Kim et al. Jan 2009 B2
7503096 Lin Mar 2009 B2
7515991 Egawa et al. Apr 2009 B2
7539557 Yamauchi May 2009 B2
7555363 Augenbraun et al. Jun 2009 B2
7557703 Yamada et al. Jul 2009 B2
7568259 Yan Aug 2009 B2
7571511 Jones et al. Aug 2009 B2
7578020 Jaworski et al. Aug 2009 B2
7600521 Woo Oct 2009 B2
7603744 Reindle Oct 2009 B2
7611583 Buckley et al. Nov 2009 B2
7617557 Reindle Nov 2009 B2
7620476 Morse et al. Nov 2009 B2
7636928 Uno Dec 2009 B2
7636982 Jones et al. Dec 2009 B2
7647144 Haegermarck Jan 2010 B2
7650666 Jang Jan 2010 B2
7660650 Kawagoe et al. Feb 2010 B2
7663333 Jones et al. Feb 2010 B2
7693605 Park Apr 2010 B2
7706917 Chiappetta et al. Apr 2010 B1
7761954 Ziegler et al. Jul 2010 B2
7765635 Park Aug 2010 B2
7784147 Burkholder et al. Aug 2010 B2
7801645 Taylor et al. Sep 2010 B2
7805220 Taylor et al. Sep 2010 B2
7809944 Kawamoto Oct 2010 B2
7832048 Harwig et al. Nov 2010 B2
7849555 Hahm et al. Dec 2010 B2
7853645 Brown et al. Dec 2010 B2
7860680 Arms et al. Dec 2010 B2
7920941 Park et al. Apr 2011 B2
7937800 Yan May 2011 B2
7957836 Myeong et al. Jun 2011 B2
8087117 Kapoor et al. Jan 2012 B2
20010004719 Sommer Jun 2001 A1
20010013929 Torsten Aug 2001 A1
20010020200 Das et al. Sep 2001 A1
20010025183 Shahidi Sep 2001 A1
20010037163 Allard Nov 2001 A1
20010043509 Green et al. Nov 2001 A1
20010045883 Holdaway et al. Nov 2001 A1
20010047231 Peless et al. Nov 2001 A1
20010047895 De Fazio et al. Dec 2001 A1
20020011367 Kolesnik Jan 2002 A1
20020011813 Koselka et al. Jan 2002 A1
20020016649 Jones Feb 2002 A1
20020021219 Edwards Feb 2002 A1
20020027652 Paromtchik et al. Mar 2002 A1
20020030142 James Mar 2002 A1
20020036779 Kiyoi et al. Mar 2002 A1
20020081937 Yamada et al. Jun 2002 A1
20020095239 Wallach et al. Jul 2002 A1
20020097400 Jung et al. Jul 2002 A1
20020104963 Mancevski Aug 2002 A1
20020108209 Peterson Aug 2002 A1
20020112742 Bredo et al. Aug 2002 A1
20020113973 Ge Aug 2002 A1
20020116089 Kirkpatrick Aug 2002 A1
20020120364 Colens Aug 2002 A1
20020124343 Reed Sep 2002 A1
20020153185 Song et al. Oct 2002 A1
20020156556 Ruffner Oct 2002 A1
20020159051 Guo Oct 2002 A1
20020166193 Kasper Nov 2002 A1
20020169521 Goodman et al. Nov 2002 A1
20020173877 Zweig Nov 2002 A1
20020180585 Kim et al. Dec 2002 A1
20020189871 Won Dec 2002 A1
20030009259 Hattori et al. Jan 2003 A1
20030015232 Nguyen Jan 2003 A1
20030019071 Field et al. Jan 2003 A1
20030023356 Keable Jan 2003 A1
20030024986 Mazz et al. Feb 2003 A1
20030025472 Jones et al. Feb 2003 A1
20030028286 Glenn et al. Feb 2003 A1
20030030399 Jacobs Feb 2003 A1
20030058262 Sato et al. Mar 2003 A1
20030060928 Abramson et al. Mar 2003 A1
20030067451 Tagg et al. Apr 2003 A1
20030097875 Lentz et al. May 2003 A1
20030120389 Abramson et al. Jun 2003 A1
20030124312 Autumn Jul 2003 A1
20030126352 Barrett Jul 2003 A1
20030137268 Papanikolopoulos et al. Jul 2003 A1
20030146384 Logsdon et al. Aug 2003 A1
20030159232 Hekman et al. Aug 2003 A1
20030165373 Felder et al. Sep 2003 A1
20030168081 Lee et al. Sep 2003 A1
20030175138 Beenker Sep 2003 A1
20030192144 Song et al. Oct 2003 A1
20030193657 Uomori et al. Oct 2003 A1
20030216834 Allard Nov 2003 A1
20030221114 Hino et al. Nov 2003 A1
20030229421 Chmura et al. Dec 2003 A1
20030229474 Suzuki et al. Dec 2003 A1
20030233171 Heiligensetzer Dec 2003 A1
20030233177 Johnson et al. Dec 2003 A1
20030233870 Mancevski Dec 2003 A1
20030233930 Ozick Dec 2003 A1
20040016077 Song et al. Jan 2004 A1
20040020000 Jones Feb 2004 A1
20040030448 Solomon Feb 2004 A1
20040030449 Solomon Feb 2004 A1
20040030450 Solomon Feb 2004 A1
20040030451 Solomon Feb 2004 A1
20040030570 Solomon Feb 2004 A1
20040030571 Solomon Feb 2004 A1
20040031113 Wosewick et al. Feb 2004 A1
20040049877 Jones et al. Mar 2004 A1
20040055163 McCambridge et al. Mar 2004 A1
20040056651 Marietta Bersana Mar 2004 A1
20040068351 Solomon Apr 2004 A1
20040068415 Solomon Apr 2004 A1
20040068416 Solomon Apr 2004 A1
20040074038 Im et al. Apr 2004 A1
20040074044 Diehl et al. Apr 2004 A1
20040076324 Burl et al. Apr 2004 A1
20040083570 Song et al. May 2004 A1
20040085037 Jones et al. May 2004 A1
20040088079 Lavarec et al. May 2004 A1
20040093122 Galibraith May 2004 A1
20040098167 Yi et al. May 2004 A1
20040111184 Chiappetta et al. Jun 2004 A1
20040111821 Lenkiewicz et al. Jun 2004 A1
20040113777 Matsuhira et al. Jun 2004 A1
20040117064 McDonald Jun 2004 A1
20040117846 Karaoguz et al. Jun 2004 A1
20040118998 Wingett et al. Jun 2004 A1
20040128028 Miyamoto et al. Jul 2004 A1
20040133316 Dean Jul 2004 A1
20040134336 Solomon Jul 2004 A1
20040134337 Solomon Jul 2004 A1
20040143919 Wilder Jul 2004 A1
20040148419 Chen et al. Jul 2004 A1
20040148731 Damman et al. Aug 2004 A1
20040153212 Profio et al. Aug 2004 A1
20040156541 Jeon et al. Aug 2004 A1
20040158357 Lee et al. Aug 2004 A1
20040181706 Chen et al. Sep 2004 A1
20040187249 Jones et al. Sep 2004 A1
20040187457 Colens Sep 2004 A1
20040196451 Aoyama Oct 2004 A1
20040200505 Taylor et al. Oct 2004 A1
20040201361 Koh et al. Oct 2004 A1
20040204792 Taylor et al. Oct 2004 A1
20040204804 Lee et al. Oct 2004 A1
20040210345 Noda et al. Oct 2004 A1
20040210347 Sawada et al. Oct 2004 A1
20040211444 Taylor et al. Oct 2004 A1
20040221790 Sinclair et al. Nov 2004 A1
20040236468 Taylor et al. Nov 2004 A1
20040244138 Taylor et al. Dec 2004 A1
20040255425 Arai et al. Dec 2004 A1
20050000543 Taylor et al. Jan 2005 A1
20050010330 Abramson et al. Jan 2005 A1
20050010331 Taylor et al. Jan 2005 A1
20050015920 Kim et al. Jan 2005 A1
20050021181 Kim et al. Jan 2005 A1
20050028316 Thomas et al. Feb 2005 A1
20050053912 Roth et al. Mar 2005 A1
20050055796 Wright et al. Mar 2005 A1
20050067994 Jones et al. Mar 2005 A1
20050081782 Buckley et al. Apr 2005 A1
20050085947 Aldred et al. Apr 2005 A1
20050091782 Gordon et al. May 2005 A1
20050091786 Wright et al. May 2005 A1
20050137749 Jeon et al. Jun 2005 A1
20050144751 Kegg et al. Jul 2005 A1
20050150074 Diehl et al. Jul 2005 A1
20050150519 Keppler et al. Jul 2005 A1
20050154795 Kuz et al. Jul 2005 A1
20050156562 Cohen et al. Jul 2005 A1
20050162119 Landry et al. Jul 2005 A1
20050163119 Ito et al. Jul 2005 A1
20050165508 Kanda et al. Jul 2005 A1
20050166354 Uehigashi Aug 2005 A1
20050166355 Tani Aug 2005 A1
20050172445 Diehl et al. Aug 2005 A1
20050183229 Uehigashi Aug 2005 A1
20050183230 Uehigashi Aug 2005 A1
20050187678 Myeong et al. Aug 2005 A1
20050192707 Park et al. Sep 2005 A1
20050204717 Colens Sep 2005 A1
20050209736 Kawagoe Sep 2005 A1
20050211880 Schell et al. Sep 2005 A1
20050212929 Schell et al. Sep 2005 A1
20050213082 DiBernardo et al. Sep 2005 A1
20050213109 Schell et al. Sep 2005 A1
20050217042 Reindle Oct 2005 A1
20050218852 Landry et al. Oct 2005 A1
20050222933 Wesby Oct 2005 A1
20050229340 Sawalski et al. Oct 2005 A1
20050229355 Crouch et al. Oct 2005 A1
20050235451 Yan Oct 2005 A1
20050251292 Casey et al. Nov 2005 A1
20050255425 Pierson Nov 2005 A1
20050258154 Blankenship et al. Nov 2005 A1
20050273967 Taylor et al. Dec 2005 A1
20050288819 De Guzman Dec 2005 A1
20060000050 Cipolla et al. Jan 2006 A1
20060009879 Lynch et al. Jan 2006 A1
20060010638 Shimizu et al. Jan 2006 A1
20060020369 Taylor et al. Jan 2006 A1
20060020370 Abramson Jan 2006 A1
20060021168 Nishikawa Feb 2006 A1
20060025134 Cho et al. Feb 2006 A1
20060037170 Shimizu Feb 2006 A1
20060042042 Mertes et al. Mar 2006 A1
20060044546 Lewin et al. Mar 2006 A1
20060060216 Woo Mar 2006 A1
20060061657 Rew et al. Mar 2006 A1
20060064828 Stein et al. Mar 2006 A1
20060087273 Ko et al. Apr 2006 A1
20060089765 Pack et al. Apr 2006 A1
20060100741 Jung May 2006 A1
20060107894 Buckley et al. May 2006 A1
20060119839 Bertin et al. Jun 2006 A1
20060143295 Costa-Requena et al. Jun 2006 A1
20060146776 Kim Jul 2006 A1
20060150361 Aldred et al. Jul 2006 A1
20060184293 Konandreas et al. Aug 2006 A1
20060185690 Song et al. Aug 2006 A1
20060190133 Konandreas et al. Aug 2006 A1
20060190134 Ziegler et al. Aug 2006 A1
20060190146 Morse et al. Aug 2006 A1
20060196003 Song et al. Sep 2006 A1
20060200281 Ziegler et al. Sep 2006 A1
20060220900 Ceskutti et al. Oct 2006 A1
20060229774 Park et al. Oct 2006 A1
20060259194 Chiu Nov 2006 A1
20060259494 Watson et al. Nov 2006 A1
20060278161 Burkholder et al. Dec 2006 A1
20060288519 Jaworski et al. Dec 2006 A1
20060293787 Kanda et al. Dec 2006 A1
20060293808 Qian Dec 2006 A1
20070006404 Cheng et al. Jan 2007 A1
20070016328 Ziegler et al. Jan 2007 A1
20070017061 Yan Jan 2007 A1
20070028574 Yan Feb 2007 A1
20070032904 Kawagoe et al. Feb 2007 A1
20070042716 Goodall et al. Feb 2007 A1
20070043459 Abbott et al. Feb 2007 A1
20070061041 Zweig Mar 2007 A1
20070061043 Ermakov et al. Mar 2007 A1
20070114975 Cohen et al. May 2007 A1
20070142964 Abramson Jun 2007 A1
20070150096 Yeh et al. Jun 2007 A1
20070156286 Yamauchi Jul 2007 A1
20070157415 Lee et al. Jul 2007 A1
20070157420 Lee et al. Jul 2007 A1
20070179670 Chiappetta et al. Aug 2007 A1
20070226949 Hahm et al. Oct 2007 A1
20070234492 Svendsen et al. Oct 2007 A1
20070244610 Ozick et al. Oct 2007 A1
20070245511 Hahm et al. Oct 2007 A1
20070250212 Halloran et al. Oct 2007 A1
20070261193 Gordon et al. Nov 2007 A1
20070266508 Jones et al. Nov 2007 A1
20070271011 Lee Nov 2007 A1
20080007203 Cohen et al. Jan 2008 A1
20080039974 Sandin et al. Feb 2008 A1
20080052846 Kapoor et al. Mar 2008 A1
20080091304 Ozick et al. Apr 2008 A1
20080109126 Sandin et al. May 2008 A1
20080134458 Ziegler et al. Jun 2008 A1
20080140255 Ziegler et al. Jun 2008 A1
20080155768 Ziegler et al. Jul 2008 A1
20080184518 Taylor et al. Aug 2008 A1
20080266748 Lee Oct 2008 A1
20080276407 Schnittman et al. Nov 2008 A1
20080281470 Gilbert et al. Nov 2008 A1
20080282494 Won et al. Nov 2008 A1
20080294288 Yamauchi Nov 2008 A1
20080302586 Yan Dec 2008 A1
20080307590 Jones et al. Dec 2008 A1
20090007366 Svendsen et al. Jan 2009 A1
20090038089 Landry et al. Feb 2009 A1
20090048727 Hong et al. Feb 2009 A1
20090049640 Lee et al. Feb 2009 A1
20090055022 Casey et al. Feb 2009 A1
20090102296 Greene et al. Apr 2009 A1
20090292393 Casey et al. Nov 2009 A1
20100006028 Buckley et al. Jan 2010 A1
20100011529 Won et al. Jan 2010 A1
20100049365 Jones et al. Feb 2010 A1
20100063628 Landry et al. Mar 2010 A1
20100082193 Chiappetta Apr 2010 A1
20100107355 Won et al. May 2010 A1
20100257690 Jones et al. Oct 2010 A1
20100257691 Jones et al. Oct 2010 A1
20100263158 Jones et al. Oct 2010 A1
20100268384 Jones et al. Oct 2010 A1
20100293742 Chung et al. Nov 2010 A1
20100312429 Jones et al. Dec 2010 A1
20120168240 Wilson Jul 2012 A1
20120259481 Kim Oct 2012 A1
20140316636 Hong Oct 2014 A1
Foreign Referenced Citations (320)
Number Date Country
2128842 Dec 1980 DE
3317376 Dec 1987 DE
3536907 Feb 1989 DE
3404202 Dec 1992 DE
199311014 Oct 1993 DE
4338841 May 1995 DE
4414683 Oct 1995 DE
19849978 Feb 2001 DE
102004038074 Jun 2005 DE
10357636 Jul 2005 DE
102004041021 Aug 2005 DE
102005046813 Apr 2007 DE
338988 Dec 1988 DK
0265542 May 1988 EP
0281085 Sep 1988 EP
0286328 Oct 1988 EP
0294101 Dec 1988 EP
0352045 Jan 1990 EP
0433697 Jun 1991 EP
0437024 Jul 1991 EP
0554978 Aug 1993 EP
0615719 Sep 1994 EP
0792726 Sep 1997 EP
0930040 Jul 1999 EP
0845237 Apr 2000 EP
0861629 Sep 2001 EP
1228734 Aug 2002 EP
1380245 Jan 2004 EP
1380246 Jan 2004 EP
1018315 Nov 2004 EP
1553472 Jul 2005 EP
1557730 Jul 2005 EP
1642522 Apr 2006 EP
1836941 Sep 2007 EP
2238196 Aug 2005 ES
722755 Mar 1932 FR
2601443 Jan 1988 FR
2828589 Feb 2003 FR
702426 Jan 1954 GB
2128842 May 1984 GB
2225221 May 1990 GB
2267360 Dec 1993 GB
2283838 May 1995 GB
2284957 Jun 1995 GB
2300082 Oct 1996 GB
2344747 Jun 2000 GB
2404330 Feb 2005 GB
2417354 Feb 2006 GB
53021869 Feb 1978 JP
53110257 Sep 1978 JP
57064217 Apr 1982 JP
59005315 Jan 1984 JP
59033511 Mar 1984 JP
59094005 May 1984 JP
59099308 Jun 1984 JP
59112311 Jun 1984 JP
59120124 Jul 1984 JP
59131668 Sep 1984 JP
59164973 Sep 1984 JP
59184917 Oct 1984 JP
2283343 Nov 1984 JP
59212924 Dec 1984 JP
59226909 Dec 1984 JP
60089213 May 1985 JP
60211510 Oct 1985 JP
60259895 Dec 1985 JP
61023221 Jan 1986 JP
61097712 May 1986 JP
61160366 Jul 1986 JP
62070709 Apr 1987 JP
62074018 Apr 1987 JP
62120510 Jun 1987 JP
62154008 Jul 1987 JP
62164431 Jul 1987 JP
62263507 Nov 1987 JP
62263508 Nov 1987 JP
62189057 Dec 1987 JP
63079623 Apr 1988 JP
63158032 Jul 1988 JP
63203483 Aug 1988 JP
63241610 Oct 1988 JP
1118752 Aug 1989 JP
2-6312 Jan 1990 JP
3051023 Mar 1991 JP
4019586 Jan 1992 JP
4074285 Mar 1992 JP
4084921 Mar 1992 JP
5023269 Feb 1993 JP
5042076 Feb 1993 JP
5046246 Feb 1993 JP
5091604 Apr 1993 JP
5095879 Apr 1993 JP
5150827 Jun 1993 JP
5150829 Jun 1993 JP
5054620 Jul 1993 JP
5-9160 Sep 1993 JP
5040519 Oct 1993 JP
05257527 Oct 1993 JP
05285861 Nov 1993 JP
5302836 Nov 1993 JP
5312514 Nov 1993 JP
05046239 Dec 1993 JP
5341904 Dec 1993 JP
6003251 Jan 1994 JP
6038912 Feb 1994 JP
6105781 Apr 1994 JP
6137828 May 1994 JP
6154143 Jun 1994 JP
6293095 Oct 1994 JP
06327598 Nov 1994 JP
7047046 Feb 1995 JP
07129239 May 1995 JP
7059702 Jun 1995 JP
07222705 Aug 1995 JP
7270518 Oct 1995 JP
7313417 Dec 1995 JP
8000393 Jan 1996 JP
8016776 Jan 1996 JP
8084696 Apr 1996 JP
8089449 Apr 1996 JP
08089451 Apr 1996 JP
8123548 May 1996 JP
8152916 Jun 1996 JP
8263137 Oct 1996 JP
8335112 Dec 1996 JP
8339297 Dec 1996 JP
943901 Feb 1997 JP
9044240 Feb 1997 JP
9066855 Mar 1997 JP
9145309 Jun 1997 JP
09160644 Jun 1997 JP
09179625 Jul 1997 JP
09185410 Jul 1997 JP
9192069 Jul 1997 JP
2555263 Aug 1997 JP
9204223 Aug 1997 JP
09206258 Aug 1997 JP
09233712 Sep 1997 JP
9265319 Oct 1997 JP
9269807 Oct 1997 JP
9269810 Oct 1997 JP
9319431 Dec 1997 JP
9319432 Dec 1997 JP
9319434 Dec 1997 JP
9325812 Dec 1997 JP
10055215 Feb 1998 JP
10117973 May 1998 JP
10118963 May 1998 JP
10165738 Jun 1998 JP
10177414 Jun 1998 JP
10295595 Nov 1998 JP
10314088 Dec 1998 JP
11015941 Jan 1999 JP
11102220 Apr 1999 JP
11162454 Jun 1999 JP
11174145 Jul 1999 JP
11175149 Jul 1999 JP
11178765 Jul 1999 JP
114008764 Jul 1999 JP
11212642 Aug 1999 JP
11213157 Aug 1999 JP
11282532 Oct 1999 JP
11282533 Oct 1999 JP
11295412 Oct 1999 JP
2000047728 Feb 2000 JP
2000056006 Feb 2000 JP
2000056831 Feb 2000 JP
2000060782 Feb 2000 JP
2000066722 Mar 2000 JP
2000075925 Mar 2000 JP
2000102499 Apr 2000 JP
2000275321 Oct 2000 JP
2000279353 Oct 2000 JP
2000353014 Dec 2000 JP
2001022443 Jan 2001 JP
2001067588 Mar 2001 JP
2001087182 Apr 2001 JP
2001121455 May 2001 JP
2001125641 May 2001 JP
2001508572 Jun 2001 JP
2001197008 Jul 2001 JP
3197758 Aug 2001 JP
3201903 Aug 2001 JP
2001216482 Aug 2001 JP
2001258807 Sep 2001 JP
2001265437 Sep 2001 JP
2001275908 Oct 2001 JP
2001289939 Oct 2001 JP
2001306170 Nov 2001 JP
2002073170 Mar 2002 JP
2002078650 Mar 2002 JP
2002204768 Jul 2002 JP
2002204769 Jul 2002 JP
2002247510 Aug 2002 JP
2002532180 Oct 2002 JP
2002323925 Nov 2002 JP
2002333920 Nov 2002 JP
2002355206 Dec 2002 JP
2002360471 Dec 2002 JP
2002360482 Dec 2002 JP
2002366227 Dec 2002 JP
2002369778 Dec 2002 JP
200330157 Jan 2003 JP
2003005296 Jan 2003 JP
2003010076 Jan 2003 JP
2003010088 Jan 2003 JP
2003028528 Jan 2003 JP
2003036116 Feb 2003 JP
2003038401 Feb 2003 JP
2003038402 Feb 2003 JP
2003047579 Feb 2003 JP
2003061882 Mar 2003 JP
2003084994 Mar 2003 JP
2003167628 Jun 2003 JP
2003180586 Jul 2003 JP
2003180587 Jul 2003 JP
2003186539 Jul 2003 JP
2003190064 Jul 2003 JP
2003241836 Aug 2003 JP
2003262520 Sep 2003 JP
2003304992 Oct 2003 JP
2003310509 Nov 2003 JP
2003330543 Nov 2003 JP
2004123040 Apr 2004 JP
2004148021 May 2004 JP
2004160102 Jun 2004 JP
2004166968 Jun 2004 JP
2004198330 Jul 2004 JP
2004219185 Aug 2004 JP
2004351234 Dec 2004 JP
2005118354 May 2005 JP
2005211360 Aug 2005 JP
2005224265 Aug 2005 JP
2005230032 Sep 2005 JP
2005245916 Sep 2005 JP
2005352707 Dec 2005 JP
2006043071 Feb 2006 JP
2006155274 Jun 2006 JP
2006164223 Jun 2006 JP
2006227673 Aug 2006 JP
2006247467 Sep 2006 JP
2006260161 Sep 2006 JP
2006293662 Oct 2006 JP
2006296697 Nov 2006 JP
2007034866 Feb 2007 JP
2007213180 Aug 2007 JP
2009015611 Jan 2009 JP
2010198552 Sep 2010 JP
9526512 Oct 1995 WO
9530887 Nov 1995 WO
9617258 Jun 1996 WO
9715224 May 1997 WO
9740734 Nov 1997 WO
9741451 Nov 1997 WO
9853456 Nov 1998 WO
9905580 Feb 1999 WO
9916078 Apr 1999 WO
9938056 Jul 1999 WO
9938237 Jul 1999 WO
9943250 Sep 1999 WO
0038026 Jun 2000 WO
0038028 Jun 2000 WO
0038029 Jun 2000 WO
0004430 Oct 2000 WO
0078410 Dec 2000 WO
0106904 Feb 2001 WO
0106905 Feb 2001 WO
0180703 Nov 2001 WO
0191623 Dec 2001 WO
0224292 Mar 2002 WO
0239864 May 2002 WO
0239868 May 2002 WO
02058527 Aug 2002 WO
02062194 Aug 2002 WO
02067744 Sep 2002 WO
02067745 Sep 2002 WO
02067752 Sep 2002 WO
02069774 Sep 2002 WO
02069775 Sep 2002 WO
02071175 Sep 2002 WO
02074150 Sep 2002 WO
02075350 Sep 2002 WO
02075356 Sep 2002 WO
02075469 Sep 2002 WO
02075470 Sep 2002 WO
02081074 Oct 2002 WO
02101477 Dec 2002 WO
03015220 Feb 2003 WO
03024292 Mar 2003 WO
03040546 May 2003 WO
03040845 May 2003 WO
03040846 May 2003 WO
03062850 Jul 2003 WO
03062852 Jul 2003 WO
2004004533 Jan 2004 WO
2004004534 Jan 2004 WO
2004006034 Jan 2004 WO
2004025947 Mar 2004 WO
2004058028 Jul 2004 WO
2004059409 Jul 2004 WO
2005006935 Jan 2005 WO
2005037496 Apr 2005 WO
2005055795 Jun 2005 WO
2005055796 Jun 2005 WO
2005076545 Aug 2005 WO
2005077243 Aug 2005 WO
2005077244 Aug 2005 WO
2005081074 Sep 2005 WO
2005083541 Sep 2005 WO
2005098475 Oct 2005 WO
2005098476 Oct 2005 WO
2006046400 May 2006 WO
2006061133 Jun 2006 WO
2006068403 Jun 2006 WO
2006073248 Jul 2006 WO
2006089307 Aug 2006 WO
2007028049 Mar 2007 WO
2007036490 Apr 2007 WO
2007065033 Jun 2007 WO
2007137234 Nov 2007 WO
Non-Patent Literature Citations (205)
Entry
http://robotbg.com/news/2012/01/14/lg_hom_bot_20_demo_at_ces_2012.
Andersen et al., “Landmark based navigation strategies,” SPIE Conference on Mobile Robots XIII, SPIE vol. 3525, pp. 170-181, Jan. 8, 1999.
Ascii, Mar. 25, 2002, http://ascii.jp/elem/000/000/330/330024/, accessed Nov. 2011, 15 pages (with English translation).
Barker, “Navigation by the Stars—Ben Barker 4th Year Project,” Nov. 2004, 20 pages.
Becker et al., “Reliable Navigation Using Landmarks,” IEEE International Conference on Robotics and Automation, 0-7803-1965-6, pp. 401-406, 1995.
Benayad-Cherif et al., “Mobile Robot Navigation Sensors,” SPIE vol. 1831 Mobile Robots, VII, pp. 378-387, 1992.
Betke et al., “Mobile robot localization using landmarks,” Proceedings of the IEEE/RSJ/GI International Conference on Intelligent Robots and Systems '94 ‘Advanced Robotic Systems and the Real World’ (IROS '94), Accessed via IEEE Xplore, 1994, 8 pages.
Bison et al., “Using a structured beacon for cooperative position estimation,” Robotics and Autonomous Systems, 29(1):33-40, Oct. 1999.
Blaasvaer et al., “AMOR—An Autonomous Mobile Robot Navigation System,” Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics, pp. 2266-2271, 1994.
Borges et al., “Optimal Mobile Robot Pose Estimation Using Geometrical Maps,” IEEE Transactions on Robotics and Automation, 18(1): 87-94, Feb. 2002.
Braunstingl et al., “Fuzzy Logic Wall Following of a Mobile Robot Based on the Concept of General Perception,” ICAR '95, 7th International Conference on Advanced Robotics, Sant Feliu De Guixols, Spain, pp. 367-376, Sep. 1995.
Bulusu et al., “Self Configuring Localization Systems: Design and Experimental Evaluation,” ACM Transactions on Embedded Computing Systems, 3(1):24-60, 2003.
Caccia et al., “Bottom-Following for Remotely Operated Vehicles,” 5th IFAC Conference, Aalborg, Denmark, pp. 245-250, Aug. 2000.
U.S. Appl. No. 60/605,066 as provided to WIPO in PCT/US2005/030422, corresponding to U.S. Appl. No. 11/574,290, U.S. publication 2008/0184518, filing date Aug. 27, 2004.
U.S. Appl. No. 60/605,181 as provided to WIPO in PCT/US2005/030422, corresponding to U.S. Appl. No. 11/574,290, U.S. publication 2008/0184518, filing date Aug. 27, 2004.
Chae et al., “StarLITE: A new artificial landmark for the navigation of mobile robots,” http://www.irc.atr.jp/jk-nrs2005/pdf/Starlite.pdf, 4 pages, 2005.
Chamberlin et al., “Team 1: Robot Locator Beacon System,” NASA Goddard SFC, Design Proposal, 15 pages, Feb. 2006.
Champy, “Physical management of IT assets in Data Centers using RFID technologies,” RFID 2005 University, Oct. 12-14, 2005, 19 pages.
Chiri, “Joystick Control for Tiny OS Robot,” http://www.eecs.berkeley.edu/Programs/ugrad/superb/papers2002/chiri.pdf, 12 pages, Aug. 2002.
Christensen et al., “Theoretical Methods for Planning and Control in Mobile Robotics,” 1997 First International Conference on Knowledge-Based Intelligent Electronic Systems, Adelaide, Australia, pp. 81-86, May 1997.
CleanMate 365, Intelligent Automatic Vacuum Cleaner, Model No. QQ-1, User Manual, www.metapo.com/support/user_manual.pdf, Dec. 2005, 11 pages.
Clerentin et al., “A localization method based on two omnidirectional perception systems cooperation,” Proc of IEEE International Conference on Robotics & Automation, San Francisco, CA vol. 2, pp. 1219-1224, Apr. 2000.
Corke, “High Performance Visual Servoing for robot end-point control,” SPIE vol. 2056, Intelligent Robots and Computer Vision, 1993, 10 pages.
Cozman et al., “Robot Localization using a Computer Vision Sextant,” IEEE International Midwest Conference on Robotics and Automation, pp. 106-111, 1995.
D'Orazio et al., “Model based Vision System for mobile robot position estimation,” SPIE, vol. 2058 Mobile Robots VIII, pp. 38-49, 1992.
De Bakker et al., “Smart PSD-array for sheet of light range imaging,” Proc. of SPIE, vol. 3965, pp. 1-12, May 2000.
Denning Roboscrub image (1989), 1 page.
Desaulniers et al., “An Efficient Algorithm to find a shortest path for a car-like Robot,” IEEE Transactions on Robotics and Automation, 11(6):819-828, Dec. 1995.
Dorfmüller-Ulhaas, “Optical Tracking From User Motion to 3D Interaction,” http://www.cg.tuwien.ac.at/research/publications/2002/Dorfmueller-Ulhaas-thesis, 182 pages, 2002.
Dorsch et al., “Laser Triangulation: Fundamental uncertainty in distance measurement,” Applied Optics, 33(7):1306-1314, Mar. 1994.
Doty et al., “Sweep Strategies for a Sensory-Driven, Behavior-Based Vacuum Cleaning Agent,” AAAI 1993 Fall Symposium Series, Instantiating Real-World Agents, pp. 1-6, Oct. 22-24, 1993.
Dudek et al., “Localizing a Robot with Minimum Travel,” Proceedings of the sixth annual ACM-SIAM symposium on Discrete Algorithms, 27(2):583-604, Apr. 1998.
Dulimarta et al., “Mobile Robot Localization in Indoor Environment,” Pattern Recognition, 30(1):99-111, 1997.
Dyson's Robot Vacuum Cleaner—the DC06, May 2004, Retrieved from the Internet: URL<http://www.gizmag.com/go/1282/>. Accessed Nov. 2011, 3 pages.
EBay, “Roomba Timer—> Timed Cleaning—Floorvac Robotic Vacuum,” Retrieved from the Internet: URL<cgi.ebay.com/ws/eBayISAPI.dll?viewitem&category=43526&item=4375198387&rd=1>, 5 pages, Apr. 2005.
Electrolux Trilobite, “Time to enjoy life,” Retrieved from the Internet: URL<http://www.robocon.co.kr/trilobite/Presentation_Trilobite_Kor_030104.ppt>, 10 pages, accessed Dec. 2011.
Electrolux Trilobite, Jan. 12, 2001, http://www.electroluxui.com:8080/2002%5C822%5C833102EN.pdf, accessed Jul. 2, 2012, 10 pages.
Electrolux, “Designed for the well-lived home,” Retrieved from the Internet: URL<http://www.electroluxusa.com/node57.asp?currentURL=node142.asp%3F>. Accessed Mar. 2005, 2 pages.
Eren et al., “Accuracy in position estimation of mobile robots based on coded infrared signal transmission,” Proceedings: Integrating Intelligent Instrumentation and Control, Instrumentation and Measurement Technology Conference, 1995, IMTC/95, pp. 548-551, 1995.
Eren et al., “Operation of Mobile Robots in a Structured Infrared Environment,” Proceedings ‘Sensing, Processing, Networking’, IEEE Instrumentation and Measurement Technology Conference, 1997 (IMTC/97), Ottawa, Canada vol. 1, pp. 20-25, May 1997.
Euroflex Intelligente Monstre, (English excerpt only), 2006, 15 pages.
Euroflex, Jan. 2006, Retrieved from the Internet: URL<http://www.euroflex.tv/novita_dett.php?id=15>, accessed Nov. 2011, 1 page.
eVac Robotic Vacuum S1727 Instruction Manual, Sharper Image Corp, Copyright 2004, 16 pages.
Everyday Robots, “Everyday Robots: Reviews, Discussion and News for Consumers,” Aug. 2004, Retrieved from the Internet: URL<www.everydayrobots.com/index.php?option=content&task=view&id=9> (Sep. 2012), 4 pages.
Evolution Robotics, “NorthStar—Low-cost Indoor Localization—How it Works,” Evolution Robotics, 2 pages, 2005.
Facchinetti Claudio et al., “Self-Positioning Robot Navigation Using Ceiling Images Sequences,” ACCV '95, 5 pages, Dec. 1995.
Facchinetti Claudio et al., “Using and Learning Vision-Based Self-Positioning for Autonomous Robot Navigation,” ICARCV '94, vol. 3, pp. 1694-1698, 1994.
Facts on Trilobite, webpage, Retrieved from the Internet: URL<http://trilobite.electrolux.se/presskit_en/model11335.asp?print=yes&pressID=>. 2 pages, accessed Dec. 2003.
Fairfield et al., “Mobile Robot Localization with Sparse Landmarks,” SPIE vol. 4573, pp. 148-155, 2002.
Favre-Bulle, “Efficient tracking of 3D-Robot Position by Dynamic Triangulation,” IEEE Instrumentation and Measurement Technology Conference IMTC 98, Session on Instrumentation and Measurement in Robotics, vol. 1, pp. 446-449, May 1998.
Fayman, “Exploiting Process Integration and Composition in the context of Active Vision,” IEEE Transactions on Systems, Man, and Cybernetics—Part C: Application and reviews, vol. 29, No. 1, pp. 73-86, Feb. 1999.
Floorbot GE Plastics—IMAGE, available at http://www.fuseid.com/, 1989-1990, Accessed Sep. 2012, 1 page.
Floorbotics, VR8 Floor Cleaning Robot, Product Description for Manufacturing, URL: <http://www.consensus.com.au/SoftwareAwards/CSAarchive/CSA2004/CSAart04/FloorBot/F>. Mar. 2004, 11 pages.
Franz et al., “Biomimetic robot navigation,” Robotics and Autonomous Systems, vol. 30, pp. 133-153, 2000.
Friendly Robotics, “Friendly Robotics—Friendly Vac, Robotic Vacuum Cleaner,” Retrieved from the Internet: URL< www.friendlyrobotics.com/vac.htm> 4 pages, Apr. 2005.
Friendly Robotics, Retrieved from the Internet: URL<http://www.robotsandrelax.com/PDFs/RV400Manual.pdf>. 18 pages, accessed Dec. 2011.
Fuentes et al., “Mobile Robotics 1994,” University of Rochester. Computer Science Department, TR 588, 44 pages, Dec. 1994.
Fukuda et al., "Navigation System based on Ceiling Landmark Recognition for Autonomous mobile robot," 1995 IEEE/RSJ International Conference on Intelligent Robots and Systems 95, 'Human Robot Interaction and Cooperative Robots', Pittsburgh, PA, pp. 1466-1471, Aug. 1995.
Gat, "Robust Low-Computation Sensor-driven Control for Task-Directed Navigation," Proc. of IEEE International Conference on Robotics and Automation, Sacramento, CA, pp. 2484-2489, Apr. 1991.
Gionis, “A hand-held optical surface scanner for environmental Modeling and Virtual Reality,” Virtual Reality World, 16 pages, 1996.
Goncalves et al., “A Visual Front-End for Simultaneous Localization and Mapping”, Proceedings of the 2005 IEEE International Conference on Robotics and Automation, Barcelona, Spain, pp. 44-49, Apr. 2005.
Gregg et al., “Autonomous Lawn Care Applications,” 2006 Florida Conference on Recent Advances in Robotics, Miami, Florida, May 25-26, 2006, Florida International University, 5 pages.
Grumet, “Robots Clean House,” Popular Mechanics, Nov. 2003, 3 pages.
Hamamatsu, "SI PIN Diode S5980, S5981, S5870 - Multi-element photodiodes for surface mounting," Hamamatsu Photonics, 2 pages, Apr. 2004.
Haralick et al. “Pose Estimation from Corresponding Point Data”, IEEE Transactions on Systems, Man, and Cybernetics, 19(6):1426-1446, Nov. 1989.
Hausler, “About the Scaling Behaviour of Optical Range Sensors,” Fringe '97, Proceedings of the 3rd International Workshop on Automatic Processing of Fringe Patterns, Bremen, Germany, pp. 147-155, Sep. 1997.
Hitachi, accessed at http://www.hitachi.co.jp/New/cnews/hi_030529_hi_030529.pdf, May 29, 2003, 15 pages (with English translation).
Hitachi: News release: "The home cleaning robot of the autonomous movement type (experimental machine)," Retrieved from the Internet: URL<www.i4u.com/japanreleases/hitachirobot.htm>. 5 pages, Mar. 2005.
Hoag et al., "Navigation and Guidance in interstellar space," ACTA Astronautica, vol. 2, pp. 513-533, Feb. 1975.
Home Robot - UBOT; Microrobotusa.com, retrieved from the WWW at www.microrobotusa.com, accessed Dec. 2, 2008, 2 pages.
Huntsberger et al., “CAMPOUT: A Control Architecture for Tightly Coupled Coordination of Multirobot Systems for Planetary Surface Exploration,” IEEE Transactions on Systems, Man, and Cybernetics—Part A: Systems and Humans, 33(5):550-559, Sep. 2003.
Iirobotics.com, "Samsung Unveils Its Multifunction Robot Vacuum," Retrieved from the Internet: URL<www.iirobotics.com/webpages/hotstuff.php?ubre=111>. 3 pages, Mar. 2005.
InMach, "Intelligent Machines," Retrieved from the Internet: URL<www.inmach.de/inside.html>. 1 page, Nov. 2008.
Innovation First, “2004 EDU Robot Controller Reference Guide,” Retrieved from the Internet: URL<http://www.ifirobotics.com>. 13 pages, Mar. 2004.
IT media, Retrieved from the Internet: URL<http://www.itmedia.co.jp/news/0111/16/robofesta_m.html>. Accessed Nov. 1, 2011, 8 pages (with English translation).
It's eye, Retrieved from the Internet: URL<www.hitachi.co.jp/rd/pdf/topics/hitac2003_10.pdf>. 2 pages, 2003.
Jarosiewicz et al., "Final Report - Lucid," University of Florida, Department of Electrical and Computer Engineering, EEL 5666 - Intelligent Machine Design Laboratory, 50 pages, Aug. 1999.
Jensfelt et al., "Active Global Localization for a mobile robot using multiple hypothesis tracking," IEEE Transactions on Robotics and Automation, 17(5): 748-760, Oct. 2001.
Jeong et al., "An intelligent map-building system for indoor mobile robot using low cost photo sensors," SPIE, vol. 6042, 6 pages, 2005.
Kahney, "Robot Vacs are in the House," Retrieved from the Internet: URL<www.wired.com/news/technology/0,1282,59237,00.html>. 5 pages, Jun. 2003.
Karcher “Karcher RoboCleaner RC 3000,” Retrieved from the Internet: URL<www.robocleaner.de/english/screen3.html>. 4 pages, Dec. 2003.
Karcher RC 3000 Cleaning Robot - User Manual, Alfred-Karcher GmbH & Co, Cleaning Systems, Alfred Karcher-Str 28-40, PO Box 160, D-71349 Winnenden, Germany, Dec. 2002, 8 pages.
Karcher RC3000 RoboCleaner - IMAGE, Accessed at <http://www.karcher.de/versions/int/assets/video/2_4_robo_en.swf>, Sep. 2009, 1 page.
Karcher USA, RC3000 Robotic Cleaner, website: http://www.karcher-usa.com/showproducts.php?op=view prod&param1=143&param2=&param3=, 3 pages, accessed Mar. 2005.
Karcher, “Product Manual Download ‘Karch’,” available at www.karcher.com, 16 pages, 2004.
Karlsson et al., "Core Technologies for service Robotics," IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2004), vol. 3, pp. 2979-2984, Sep. 2004.
Karlsson et al., The vSLAM Algorithm for Robust Localization and Mapping, Proceedings of the 2005 IEEE International Conference on Robotics and Automation, Barcelona, Spain, pp. 24-29, Apr. 2005.
King and Weiman, "Helpmate™ Autonomous Mobile Robots Navigation Systems," SPIE vol. 1388 Mobile Robots, pp. 190-198, 1990.
Kleinberg, The Localization Problem for Mobile Robots, Laboratory for Computer Science, Massachusetts Institute of Technology, 1994 IEEE, pp. 521-531, 1994.
Knights, et al., “Localization and Identification of Visual Landmarks,” Journal of Computing Sciences in Colleges, 16(4):312-313, May 2001.
Kolodko et al., “Experimental System for Real-Time Motion Estimation,” Proceedings of the 2003 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM 2003), pp. 981-986, 2003.
Komoriya et al., "Planning of Landmark Measurement for the Navigation of a Mobile Robot," Proceedings of the 1992 IEEE/RSJ International Conference on Intelligent Robots and Systems, Raleigh, NC, pp. 1476-1481, Jul. 1992.
KOOLVAC Robotic Vacuum Cleaner Owner's Manual, Koolatron, 2004, 13 pages.
Krotkov et al., "Digital Sextant," Downloaded from the internet at: http://www.cs.cmu.edu/~epk/, 1 page, 1995.
Krupa et al., “Autonomous 3-D Positioning of Surgical Instruments in Robotized Laparoscopic Surgery Using Visual Servoin,” IEEE Transactions on Robotics and Automation, 19(5):842-853, Oct. 2003.
Kuhl et al., “Self Localization in Environments using Visual Angles,” VRCAI '04 Proceedings of the 2004 ACM SIGGRAPH international conference on Virtual Reality continuum and its applications in industry, pp. 472-475, 2004.
Kurs et al., "Wireless Power transfer via Strongly Coupled Magnetic Resonances," Downloaded from www.sciencemag.org, Aug. 2007, 5 pages.
Kurth, "Range-Only Robot Localization and SLAM with Radio", http://www.ri.cmu.edu/pub_files/pub4/kurth_derek_2004_1/kurth_derek_2004_1.pdf, 60 pages, May 2004, accessed Jul. 27, 2012.
Kwon et al., “Table Recognition through Range-based Candidate Generation and Vision based Candidate Evaluation,” ICAR 2007, The 13th International Conference on Advanced Robotics Aug. 21-24, 2007, Jeju, Korea, pp. 918-923, 2007.
Lambrinos et al., "A mobile robot employing insect strategies for navigation," Retrieved from the Internet: URL<http://www8.cs.umu.se/kurser/TDBD17/VT04/dl/Assignment%20Papers/lambrinos-RAS-2000.pdf>. 38 pages, Feb. 1999.
Lang et al., “Visual Measurement of Orientation Using Ceiling Features”, 1994 IEEE, pp. 552-555, 1994.
Lapin, “Adaptive position estimation for an automated guided vehicle,” SPIE, vol. 1831 Mobile Robots VII, pp. 82-94, 1992.
LaValle et al., “Robot Motion Planning in a Changing, Partially Predictable Environment,” 1994 IEEE International Symposium on Intelligent Control, Columbus, OH, pp. 261-266, Aug. 1994.
Lee et al., "Development of Indoor Navigation system for Humanoid Robot Using Multi-sensors Integration", ION NTM, San Diego, CA, pp. 798-805, Jan. 2007.
Lee et al., “Localization of a Mobile Robot Using the Image of a Moving Object,” IEEE Transaction on Industrial Electronics, 50(3):612-619, Jun. 2003.
Leonard et al., “Mobile Robot Localization by tracking Geometric Beacons,” IEEE Transaction on Robotics and Automation, 7(3):376-382, Jun. 1991.
Li et al. “Robust Statistical Methods for Securing Wireless Localization in Sensor Networks,” Information Processing in Sensor Networks, 2005, Fourth International Symposium on, pp. 91-98, Apr. 2005.
Li et al., “Making a Local Map of Indoor Environments by Swiveling a Camera and a Sonar,” Proceedings of the 1999 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 954-959, 1999.
Lin et al., "Mobile Robot Navigation Using Artificial Landmarks," Journal of Robotic Systems, 14(2): 93-106, 1997.
Linde, Dissertation—“On Aspects of Indoor Localization,” Available at: https://eldorado.tu-dortmund.de/handle/2003/22854, University of Dortmund, 138 pages, Aug. 2006.
Lumelsky et al., “An Algorithm for Maze Searching with Azimuth Input”, 1994 IEEE International Conference on Robotics and Automation, San Diego, CA vol. 1, pp. 111-116, 1994.
Luo et al., “Real-time Area-Covering Operations with Obstacle Avoidance for Cleaning Robots,” IEEE, pp. 2359-2364, 2002.
Ma, Thesis—“Documentation on Northstar,” California Institute of Technology, 14 pages, May 2006.
Madsen et al., “Optimal landmark selection for triangulation of robot position,” Journal of Robotics and Autonomous Systems, vol. 13 pp. 277-292, 1998.
Malik et al., “Virtual Prototyping for Conceptual Design of a Tracked Mobile Robot,” Electrical and Computer Engineering, Canadian Conference on, IEEE, PI. pp. 2349-2352, May 2006.
Martishevcky, “The Accuracy of point light target coordinate determination by dissectoral tracking system”, SPIE vol. 2591, pp. 25-30, Oct. 23, 2005.
Maschinemarkt Würzburg 105, No. 27, pp. 3, 30, Jul. 5, 1999 (with English translation).
Matsumura Camera Online Shop: Retrieved from the Internet: URL<http://www.rakuten.co.jp/matsucame/587179/711512/>. Accessed Nov. 2011, 15 pages (with English translation).
Matsutek Enterprises Co. Ltd, “Automatic Rechargeable Vacuum Cleaner,” http://matsutek.manufacturer.globalsources.com/si/6008801427181/pdtl/Home-vacuum/10 . . . , Apr. 2007, 3 pages.
McGillem et al., "Infra-red Location System for Navigation and Autonomous Vehicles," 1988 IEEE International Conference on Robotics and Automation, vol. 2, pp. 1236-1238, Apr. 1988.
McGillem et al., "A Beacon Navigation Method for Autonomous Vehicles," IEEE Transactions on Vehicular Technology, 38(3):132-139, Aug. 1989.
McLurkin “Stupid Robot Tricks: A Behavior-based Distributed Algorithm Library for Programming Swarms of Robots,” Paper submitted for requirements of BSEE at MIT, May 2004, 127 pages.
McLurkin, “The Ants: A community of Microrobots,” Paper submitted for requirements of BSEE at MIT, May 1995, 60 pages.
Michelson, "Autonomous navigation," McGraw-Hill Access Science, Encyclopedia of Science and Technology Online, 2007, 4 pages.
Miro et al., “Towards Vision Based Navigation in Large Indoor Environments,” Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Beijing, China, pp. 2096-2102, Oct. 2006.
MobileMag, “Samsung Unveils High-tech Robot Vacuum Cleaner,” Retrieved from the Internet: URL<http://www.mobilemag.com/content/100/102/C2261/>. 4 pages, Mar. 2005.
Monteiro et al., “Visual Servoing for Fast Mobile Robot: Adaptive Estimation of Kinematic Parameters,” Proceedings of the IECON '93., International Conference on Industrial Electronics, Maui, HI, pp. 1588-1593, Nov. 1993.
Moore et al., "A simple Map-based Localization strategy using range measurements," SPIE, vol. 5804, pp. 612-620, 2005.
Morland, "Autonomous Lawnmower Control", Downloaded from the internet at: http://cns.bu.edu/~cjmorlan/robotics/lawnmower/report.pdf, 10 pages, Jul. 2002.
Munich et al., “ERSP: A Software Platform and Architecture for the Service Robotics Industry,” Intelligent Robots and Systems, 2005. (IROS 2005), pp. 460-467, Aug. 2005.
Munich et al., “SIFT-ing Through Features with ViPR”, IEEE Robotics & Automation Magazine, pp. 72-77, Sep. 2006.
Nam et al., “Real-Time Dynamic Visual Tracking Using PSD Sensors and extended Trapezoidal Motion Planning”, Applied Intelligence 10, pp. 53-70, 1999.
Nitu et al., "Optomechatronic System for Position Detection of a Mobile Mini-Robot," IEEE Transactions on Industrial Electronics, 52(4):969-973, Aug. 2005.
On Robo, "Robot Reviews Samsung Robot Vacuum (VC-RP30W)," Retrieved from the Internet: URL<www.onrobo.com/reviews/AT_Home/vacuum_cleaners/on00vcrb30rosam/index.htm>. 2 pages, 2005.
OnRobo, "Samsung Unveils Its Multifunction Robot Vacuum," Retrieved from the Internet: URL<www.onrobo.com/enews/0210/samsung_vacuum.shtml>. 3 pages, Mar. 2005.
Pages et al., "A camera-projector system for robot positioning by visual servoing," Proceedings of the 2006 Conference on Computer Vision and Pattern Recognition Workshop (CVPRW'06), 8 pages, Jun. 2006.
Pages et al., “Optimizing Plane-to-Plane Positioning Tasks by Image-Based Visual Servoing and Structured Light,” IEEE Transactions on Robotics, 22(5):1000-1010, Oct. 2006.
Pages et al., “Robust decoupled visual servoing based on structured light,” 2005 IEEE/RSJ, Int. Conf. on Intelligent Robots and Systems, pp. 2676-2681, 2005.
Park et al., "A Neural Network Based Real-Time Robot Tracking Controller Using Position Sensitive Detectors," IEEE World Congress on Computational Intelligence, 1994 IEEE International Conference on Neural Networks, Orlando, Florida, pp. 2754-2758, Jun./Jul. 1994.
Park et al., "Dynamic Visual Servo Control of Robot Manipulators using Neural Networks," The Korean Institute of Telematics and Electronics, 29-B(10):771-779, Oct. 1992.
Paromtchik, "Toward Optical Guidance of Mobile Robots," Proceedings of the Fourth World Multiconference on Systemics, Cybernetics and Informatics, Orlando, FL, USA, Jul. 23, 2000, vol. IX, pp. 44-49, available at http://emotion.inrialpes.fr/~paromt/infos/papers/paromtchik:asama:sci:2000.ps.gz, accessed Jul. 3, 2012, 6 pages.
Paromtchik et al., “Optical Guidance System for Multiple mobile Robots,” Proceedings 2001 ICRA. IEEE International Conference on Robotics and Automation, vol. 3, pp. 2935-2940, May 2001.
Penna et al., "Models for Map Building and Navigation", IEEE Transactions on Systems, Man, and Cybernetics, 23(5):1276-1301, Sep./Oct. 1993.
Pirjanian et al. “Representation and Execution of Plan Sequences for Multi-Agent Systems,” Proceedings of the 2001 IEEE/RSJ International Conference on Intelligent Robots and Systems, Maui, Hawaii, pp. 2117-2123, Oct. 2001.
Pirjanian et al., “A decision-theoretic approach to fuzzy behavior coordination”, 1999 IEEE International Symposium on Computational Intelligence in Robotics and Automation, 1999. CIRA '99., Monterey, CA, pp. 101-106, Nov. 1999.
Pirjanian et al., “Distributed Control for a Modular, Reconfigurable Cliff Robot,” Proceedings of the 2002 IEEE International Conference on Robotics & Automation, Washington, D.C. pp. 4083-4088, May 2002.
Pirjanian et al., “Improving Task Reliability by Fusion of Redundant Homogeneous Modules Using Voting Schemes,” Proceedings of the 1997 IEEE International Conference on Robotics and Automation, Albuquerque, NM, pp. 425-430, Apr. 1997.
Pirjanian et al., “Multi-Robot Target Acquisition using Multiple Objective Behavior Coordination,” Proceedings of the 2000 IEEE International Conference on Robotics & Automation, San Francisco, CA, pp. 2696-2702, Apr. 2000.
Pirjanian, “Challenges for Standards for consumer Robotics,” IEEE Workshop on Advanced Robotics and its Social impacts, pp. 260-264, Jun. 2005.
Pirjanian, “Reliable Reaction,” Proceedings of the 1996 IEEE/SICE/RSJ International Conference on Multisensor Fusion and Integration for Intelligent Systems, pp. 158-165, 1996.
Prassler et al., “A Short History of Cleaning Robots,” Autonomous Robots 9, 211-226, 2000, 16 pages.
Put Your Roomba . . . On Automatic, webpages: http://www.acomputeredge.com/roomba, 5 pages, accessed Apr. 2005.
Remazeilles et al., “Image based robot navigation in 3D environments,” Proc. of SPIE, vol. 6052, pp. 1-14, Dec. 2005.
Rives et al., “Visual servoing based on ellipse features,” SPIE, vol. 2056 Intelligent Robots and Computer Vision pp. 356-367, 1993.
Roboking - not just a vacuum cleaner, a robot!, Jan. 21, 2004, infocom.uz/2004/01/21/robokingne-prosto-pyilesos-a-robot/, accessed Oct. 10, 2011, 5 pages.
RoboMaid Sweeps Your Floors So You Won't Have To, the Official Site, website: Retrieved from the Internet: URL<http://therobomaid.com>. 2 pages, accessed Mar. 2005.
Robot Buying Guide, "LG announces the first robotic vacuum cleaner for Korea," Retrieved from the Internet: URL<http://robotbg.com/news/2003/04/22/lg_announces_the_first_robotic_vacu>. 1 page, Apr. 2003.
Robotics World, “A Clean Sweep,” 5 pages, Jan. 2001.
Ronnback, “On Methods for Assistive Mobile Robots,” Retrieved from the Internet: URL<http://www.openthesis.org/documents/methods-assistive-mobile-robots-595019.html>. 218 pages, Jan. 2006.
Roth-Tabak et al., “Environment Model for mobile Robots Indoor Navigation,” SPIE, vol. 1388 Mobile Robots, pp. 453-463, 1990.
Sahin et al., “Development of a Visual Object Localization Module for Mobile Robots,” 1999 Third European Workshop on Advanced Mobile Robots, (Eurobot '99), pp. 65-72, 1999.
Salomon et al., “Low-Cost Optical Indoor Localization system for Mobile Objects without Image Processing,” IEEE Conference on Emerging Technologies and Factory Automation, 2006. (ETFA '06), pp. 629-632, Sep. 2006.
Sato, “Range Imaging Based on Moving Pattern Light and Spatio-Temporal Matched Filter,” Proceedings International Conference on Image Processing, vol. 1., Lausanne, Switzerland, pp. 33-36, Sep. 1996.
Schenker et al., “Lightweight rovers for Mars science exploration and sample return,” Intelligent Robots and Computer Vision XVI, SPIE Proc. 3208, pp. 24-36, 1997.
Schlemmer, “Electrolux Trilobite Robotic Vacuum,” Retrieved from the Internet: URL<www.hammacher.com/publish/71579.asp?promo=xsells>. 3 pages, Mar. 2005.
Schofield, "Neither Master nor slave - A Practical Study in the Development and Employment of Cleaning Robots," Emerging Technologies and Factory Automation, 1999 Proceedings ETFA '99, 7th IEEE International Conference, Barcelona, Spain, pp. 1427-1434, Oct. 1999.
Shimoga et al., “Touch and Force Reflection for Telepresence Surgery,” Engineering in Medicine and Biology Society, 1994. Engineering Advances: New Opportunities for Biomedical Engineers. Proceedings of the 16th Annual International Conference of the IEEE, Baltimore, MD, pp. 1049-1050, 1994.
Sim et al, “Learning Visual Landmarks for Pose Estimation,” IEEE International Conference on Robotics and Automation, vol. 3, Detroit, MI, pp. 1972-1978, May 1999.
Sobh et al., “Case Studies in Web-Controlled Devices and Remote Manipulation,” Automation Congress, 2002 Proceedings of the 5th Biannual World, pp. 435-440, Dec. 2002.
Special Reports, "Vacuum Cleaner Robot Operated in Conjunction with 3G Cellular Phone," 59(9): 3 pages, Retrieved from the Internet: URL<http://www.toshiba.co.jp/tech/review/2004/09/59_0>. 2004.
Stella et al., “Self-Location for Indoor Navigation of Autonomous Vehicles,” Part of the SPIE conference on Enhanced and Synthetic Vision SPIE vol. 3364, pp. 298-302, 1998.
Summet, “Tracking Locations of Moving Hand-held Displays Using Projected Light,” Pervasive 2005, LNCS 3468, pp. 37-46, 2005.
Svedman et al., “Structure from Stereo Vision using Unsynchronized Cameras for Simultaneous Localization and Mapping,” 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 2993-2998, 2005.
Svet Computers—New Technologies—Robot Vacuum Cleaner, Oct. 1999, available at http://www.sk.rs/1999/10/sknt01.html, 1 page, accessed Nov. 1, 2011.
Taipei Times, "Robotic vacuum by Matsushita about to undergo testing," Retrieved from the Internet: URL<http://www.taipeitimes.com/News/worldbiz/archives/2002/03/26/0000129338>. accessed Mar. 2002, 2 pages.
Takio et al., “Real-Time Position and Pose Tracking Method of Moving Object Using Visual Servo System,” 47th IEEE International Symposium on Circuits and Systems, pp. 167-170, 2004.
"Tech-on!," Retrieved from the Internet: URL<http://techon.nikkeibp.co.jp/members/01db/200203/1006501/>, accessed Nov. 2011, 7 pages (with English translation).
Teller, “Pervasive pose awareness for people, Objects and Robots,” http://www.ai.mit.edu/lab/dangerous-ideas/Spring2003/teller-pose.pdf, 22 pages, Apr. 2003.
Terada et al., “An Acquisition of the Relation between Vision and Action using Self-Organizing Map and Reinforcement Learning,” 1998 Second International Conference on Knowledge-Based Intelligent Electronic Systems, Adelaide, Australia, pp. 429-434, Apr. 1998.
The Sharper Image, eVac Robotic Vacuum - Product Details, www.sharperimage.com/us/en/templates/products/pipmorework1printable.jhtml, 1 page, Accessed Mar. 2005.
TheRobotStore.com, “Friendly Robotics Robotic Vacuum RV400—The Robot Store,” www.therobotstore.com/s.nl/sc.9/category.-109/it.A/id.43/.f, 1 page, Apr. 2005.
Thrun, Sebastian, “Learning Occupancy Grid Maps With Forward Sensor Models,” Autonomous Robots 15, 28 pages, Sep. 1, 2003.
TotalVac.com, RC3000 RoboCleaner website, 2004, Accessed at http://www.totalvac.com/robot_vacuum.htm (Mar. 2005), 3 pages.
Trebi-Ollennu et al., “Mars Rover Pair Cooperatively Transporting a Long Payload,” Proceedings of the 2002 IEEE International Conference on Robotics & Automation, Washington, D.C. pp. 3136-3141, May 2002.
Tribelhorn et al., “Evaluating the Roomba: A low-cost, ubiquitous platform for robotics research and education,” IEEE, pp. 1393-1399, 2007.
Tse et al., “Design of a Navigation System for a Household Mobile Robot Using Neural Networks,” Department of Manufacturing Engg. & Engg. Management, City University of Hong Kong, pp. 2151-2156, 1998.
UAMA (Asia) Industrial Co., Ltd., “RobotFamily,” 2005, 1 page.
UBOT, cleaning robot capable of wiping with a wet duster, Retrieved from the Internet: URL<http://us.aving.net/news/view.php?articleId=23031>. 4 pages, accessed Nov. 2011.
Watanabe et al., “Position Estimation of Mobile Robots With Internal and External Sensors Using Uncertainty Evolution Technique,” 1990 IEEE International Conference on Robotics and Automation, Cincinnati, OH, pp. 2011-2016, May 1990.
Watts, "Robot, boldly goes where no man can," The Times, p. 20, Jan. 1985.
Wijk et al., “Triangulation-Based Fusion of Sonar Data with Application in Robot Pose Tracking,” IEEE Transactions on Robotics and Automation, 16(6):740-752, Dec. 2000.
Wolf et al., "Robust Vision-Based Localization by Combining an Image-Retrieval System with Monte Carlo Localization," IEEE Transactions on Robotics, 21(2):208-216, Apr. 2005.
Wolf et al., “Robust Vision-based Localization for Mobile Robots Using an Image Retrieval System Based on Invariant Features,” Proceedings of the 2002 IEEE International Conference on Robotics & Automation, Washington, D.C., pp. 359-365, May 2002.
Wong, “EIED Online>> Robot Business”, ED Online ID# 13114, 17 pages, Jul. 2006.
Yamamoto et al., “Optical Sensing for Robot Perception and Localization,” 2005 IEEE Workshop on Advanced Robotics and its Social Impacts, pp. 14-17, 2005.
Yata et al., “Wall Following Using Angle Information Measured by a Single Ultrasonic Transducer,” Proceedings of the 1998 IEEE, International Conference on Robotics & Automation, Leuven, Belgium, pp. 1590-1596, May 1998.
Yujin Robotics, "An intelligent cleaning robot," Retrieved from the Internet: URL<http://us.aving.net/news/view.php?articleId=7257>. 8 pages, accessed Nov. 2011.
Yun et al., “Image-Based Absolute Positioning System for Mobile Robot Navigation,” IAPR International Workshops SSPR, Hong Kong, pp. 261-269, Aug. 2006.
Yun et al., “Robust Positioning a Mobile Robot with Active Beacon Sensors,” Lecture Notes in Computer Science, 2006, vol. 4251, pp. 890-897, 2006.
Yuta et al., "Implementation of an Active Optical Range sensor Using Laser Slit for In-Door Intelligent Mobile Robot," IEEE/RSJ International Workshop on Intelligent Robots and Systems (IROS 91), vol. 1, Osaka, Japan, pp. 415-420, Nov. 3-5, 1991.
Zha et al., "Mobile Robot Localization Using Incomplete Maps for Change Detection in a Dynamic Environment," Advanced Intelligent Mechatronics '97, Final Program and Abstracts, IEEE/ASME International Conference, p. 110, Jun. 1997.
Zhang et al., “A Novel Mobile Robot Localization Based on Vision,” SPIE vol. 6279, 6 pages, Jan. 2007.
Zoombot Remote Controlled Vacuum RV-500 NEW Roomba 2, website: http://cgi.ebay.com/ws/eBayISAPI.dll?ViewItem&category=43526&item=4373497618&rd=1, accessed Apr. 20, 2005, 7 pages.
International Search Report issued in PCT/US2005/022481, dated Oct. 24, 2005, 5 pages.
Written Opinion issued in PCT/US2005/022481, dated Oct. 24, 2005, 6 pages.
Related Publications (1)
Number Date Country
20160023357 A1 Jan 2016 US
Provisional Applications (1)
Number Date Country
60582531 Jun 2004 US
Continuations (1)
Number Date Country
Parent 11166891 Jun 2005 US
Child 14670572 US