Robot system

Information

  • Patent Grant
  • Patent Number
    9,599,990
  • Date Filed
    Wednesday, June 15, 2016
  • Date Issued
    Tuesday, March 21, 2017
Abstract
A power-saving robot system includes at least one peripheral device and a mobile robot. The peripheral device includes a controller having an active mode and a hibernation mode, and a wireless communication component capable of activation in the hibernation mode. A controller of the robot has an activating routine that communicates with and temporarily activates the peripheral device, via wireless communication, from the hibernation mode. In another aspect, a robot system includes a network data bridge and a mobile robot. The network data bridge includes a broadband network interface, a wireless command interface, and a data bridge component. The data bridge component extracts serial commands received via the broadband network interface from an internet protocol, applies a command protocol thereto, and broadcasts the serial commands via the wireless interface. The mobile robot includes a wireless command communication component that receives the serial commands transmitted from the network data bridge.
Description
TECHNICAL FIELD

This invention relates to robot systems, and more particularly to power saving robot systems and robot system networks.


BACKGROUND

Autonomous robots are robots which can perform desired tasks in unstructured environments without continuous human guidance. Many kinds of robots are autonomous to some degree. Different robots can be autonomous in different ways. An autonomous coverage robot traverses a work surface without continuous human guidance to perform one or more tasks. In the field of home, office and/or consumer-oriented robotics, mobile robots that perform household functions such as vacuum cleaning, floor washing, patrolling, lawn cutting and other such tasks have been widely adopted. Autonomous coverage robot systems that include a coverage robot and peripheral devices are generally battery powered. As a result, the battery life of each component of the system affects the operability of the overall system.


SUMMARY

The present disclosure provides a robot which can communicate with a base station and/or internet site via wireless communications media, in one configuration using a wireless bridge adapted for connection directly to a home wired network to provide a direct presence and proxies for the robot, allowing the robot to be a fully functional network node.


In one aspect, a power-saving robot system includes at least one peripheral device to be placed in an environment and a mobile robot. The peripheral device includes a power supply, a wireless communication component, and a controller having an active mode in which the peripheral device is fully operative and a hibernation mode in which the peripheral device is at least partly inactive. The wireless communication component is capable of activation in the hibernation mode. The mobile robot includes a drive system that moves the robot about the environment, a wireless communication component, and a controller. The controller has an activating routine that communicates with the peripheral device via the wireless communication components and temporarily activates the peripheral device from the hibernation mode when the wireless communication components of the peripheral device and the robot come within range of one another.
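By way of a non-limiting illustration, the following Python sketch shows one way the active/hibernation mode switching described above could be modeled; the class and method names (e.g., PeripheralController, on_wake_request) are hypothetical and are not drawn from the disclosure itself.

import time

class PeripheralController:
    """Illustrative peripheral controller with an active mode and a hibernation mode."""

    ACTIVE, HIBERNATE = "active", "hibernate"

    def __init__(self, active_timeout_s=30.0):
        self.mode = self.HIBERNATE
        self.active_timeout_s = active_timeout_s
        self._active_until = 0.0

    def on_wake_request(self, robot_id):
        # The robot's activating routine reaches the peripheral over the
        # always-listening wireless component and temporarily wakes it.
        self.mode = self.ACTIVE
        self._active_until = time.time() + self.active_timeout_s
        print(f"activated by robot {robot_id}")

    def tick(self):
        # Drop back into hibernation once the activation window expires,
        # i.e. once the robot has presumably moved out of range.
        if self.mode == self.ACTIVE and time.time() >= self._active_until:
            self.mode = self.HIBERNATE


controller = PeripheralController(active_timeout_s=0.1)
controller.on_wake_request(robot_id=1)
time.sleep(0.2)
controller.tick()
assert controller.mode == PeripheralController.HIBERNATE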


In one implementation, the wireless communication components communicate with transmission wavelengths that permit the robot and the peripheral device to be outside a line of sight. Conversely, in another implementation, the wireless communication components communicate with transmission wavelengths that require the robot and the peripheral device to be within a line of sight. In one example, the peripheral device is precluded from altering modes from the hibernation mode to the active mode until a line of sight is present between the robot and the peripheral device.


The wireless communication components may communicate with a point-to-point protocol while excluding routing and may communicate commands interpretable by the peripheral device to initiate a function. In one example, the wireless transmission communicated by the robot wireless communication circuit includes identification information. Similarly, the wireless transmission communicated by the peripheral device wireless communication circuit may include identification information as well.


In some implementations, the peripheral device, while in the hibernation mode, occasionally listens for a robot ping. Also, while in the hibernation mode, the peripheral device occasionally polls for a quiet robot. In one example, the peripheral device is a base station. In another example, the peripheral device is a mobile device.


In another implementation, the robot measures a signal strength of a wireless transmission communicated by the wireless communication component of the peripheral device to determine a distance from the peripheral device. In one example, the wireless communication components communicate over a radiofrequency.
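Signal-strength ranging of this kind is often approximated with a log-distance path-loss model; the sketch below assumes such a model with hypothetical calibration constants and is only one possible way the robot could estimate its distance from the peripheral device.

def estimate_distance_m(rssi_dbm, rssi_at_1m_dbm=-40.0, path_loss_exponent=2.0):
    """Estimate range from received signal strength using a log-distance model.

    rssi_at_1m_dbm and path_loss_exponent are hypothetical calibration values;
    a real deployment would measure them for the radio and environment in use.
    """
    return 10 ** ((rssi_at_1m_dbm - rssi_dbm) / (10.0 * path_loss_exponent))


# A weaker received signal maps to a larger estimated distance.
print(round(estimate_distance_m(-40.0), 2))  # ~1.0 m
print(round(estimate_distance_m(-60.0), 2))  # ~10.0 m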


In some implementations, the controller of the robot determines a locality of the robot based on information received via the wireless communication component of the robot from a peripheral device, such as a beacon, located in an area in which the robot is operating. The peripheral device is configured to transmit radio-frequency identification information. For example, each locality within a robot environment may respectively include a beacon configured to wirelessly emit respective location information (e.g. in an environment corresponding to a house, each “locality” may represent a room, and each room may be installed with a beacon broadcasting a unique identification over radio-frequency or other such medium). A base station may be provided in at least one locality, and the beacon may be configured to communicate with the base station and to relay data therefrom and/or thereto. It is advantageous for such beacons and base stations to save electrical power. The beacon may emerge from a low-power “hibernation” intermittently by listening for RF or IR of the robot, by operating a wake-up timer, by being actively RF or IR interrogated by the robot, or any permutation of combinations thereof based on time elapsed since a last event or number or frequency or character of interactions. Furthermore, the robot may activate a peripheral device according to a schedule or transmit schedule information to the peripheral device, which activates itself based on the schedule.
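As a non-limiting illustration of the per-room beacon scheme, a robot controller could resolve a received beacon identifier to a locality with a simple lookup; the identifiers and room names below are hypothetical.

# Hypothetical mapping from beacon identifiers (broadcast over RF or IR)
# to the locality each beacon is installed in.
LOCALITIES = {
    0x01: "kitchen",
    0x02: "living room",
    0x03: "hallway",
}

def locate(beacon_id):
    """Return the locality associated with a received beacon ID, if known."""
    return LOCALITIES.get(beacon_id, "unknown locality")

print(locate(0x02))  # living room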


In another aspect, a robot system includes a network data bridge and a mobile robot. The network data bridge includes a broadband network interface, a wireless command interface, and a data bridge component. The broadband network interface is connectible to an internet protocol network and carries communications transferred in compliance with an internet protocol. The wireless command interface is connectible to a wireless command protocol network and carries communications transferred under a command protocol. The data bridge component extracts serial commands received via the broadband network interface from the internet protocol and applies the command protocol thereto. The data bridge component listens to the narrowband wireless interface and sends a robot, peripheral, network and system state to the Internet via the broadband network interface. This information is automatically sent to a monitoring service via internet protocols, where long-term monitoring and analysis takes place. Actions/commands resulting from this analysis are injected into the narrowband wireless network by the RF-bridge. These actions can include serial commands, new software images for robot and/or peripheral, or queries for more in-depth (debugging) information that can be interpreted and responded to by the robot. The data bridge component also broadcasts the serial commands via the narrowband wireless interface. The mobile robot includes a drive system that moves the robot about an environment and a wireless command communication component that receives the serial commands transmitted from the network data bridge.
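A minimal sketch of the bridging step follows, assuming (hypothetically) that serial commands arrive as UDP payloads on the broadband side and that the narrowband command protocol uses a simple sync/length/checksum frame; the actual protocols are not specified here.

import socket
import struct

SYNC = 0xA5  # hypothetical frame sync byte for the narrowband command protocol

def frame_command(payload: bytes) -> bytes:
    """Wrap a serial command in a (hypothetical) narrowband command-protocol frame."""
    checksum = sum(payload) & 0xFF
    return struct.pack("BB", SYNC, len(payload)) + payload + bytes([checksum])

def bridge_once(broadband_sock: socket.socket, send_rf) -> None:
    """Extract one serial command from the IP side and broadcast it over RF."""
    payload, _addr = broadband_sock.recvfrom(1024)  # strip the IP/UDP encapsulation
    send_rf(frame_command(payload))                 # re-encapsulate and broadcast

# Example with a loopback socket and a stand-in RF transmitter:
rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind(("127.0.0.1", 0))
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
tx.sendto(b"CLEAN", rx.getsockname())
bridge_once(rx, send_rf=lambda frame: print(frame.hex()))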


In some implementations, the system also includes at least one peripheral device to be placed in the environment. The peripheral device includes a wireless command communication component that receives serial commands transmitted from the robot and the network data bridge. The peripheral device also includes a controller having an active mode wherein the peripheral device is fully operative and a hibernation mode wherein the peripheral device is at least partly inactive. The wireless communication circuit is capable of activation in the hibernation mode.


In another aspect, the robot can receive and utilize customizable sounds or other audio content for interacting with a user. The robot receives audible content and plays back the audible content in coordination with specific robot functions. In one example, the robot controls externally visible indicia of the robot in coordination with playback of audible content such as, inter alia, synthesized voice content during a user interaction and/or training mode.


In yet another aspect, the robot includes a facade or external housing changeable with customized cladding. In one example, the home robot includes interchangeable molded plastic or metal body panels affixed to an exterior of the robot by snap-on fasteners, insertable fitting tabs and receivers, screws, magnetic fixing pieces, etc. The interchangeable body panels correlate to audio content uploadable to the robot. In one instance, the customized body panel includes an identification system for automatically causing the robot to download and/or use the corresponding audible content. The identification system may include an integrated circuit, characteristic resistance, bar code, optical identifier, RFID, passive magnetic resonance, or mechanical identification system (e.g. a punch-card-like series of holes or protrusions).
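As a non-limiting illustration, the panel identification scheme can be reduced to a lookup from a panel identifier (read from an RFID tag, characteristic resistance, bar code, etc.) to the themed content to be downloaded or used; the identifiers and file names below are hypothetical.

# Hypothetical table pairing body-panel identifiers with themed audio content.
PANEL_THEMES = {
    "panel-space": "space_theme_sounds.bin",
    "panel-jungle": "jungle_theme_sounds.bin",
}

def content_for_panel(panel_id, default="factory_sounds.bin"):
    """Return the audio content associated with an attached body panel."""
    return PANEL_THEMES.get(panel_id, default)

print(content_for_panel("panel-space"))    # space_theme_sounds.bin
print(content_for_panel("panel-unknown"))  # factory_sounds.bin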


The ability to customize the robot by transferring selected sound or multimedia schemes, themes, tones, music, audio or visual content, choreographic or movement routines, or other data to the robot improves overall enjoyment from the robot by a user. In one aspect, the robot includes a radio receiver configured to receive audible content via wireless transmission, a memory configured to store the audible content, a speaker configured to emit the audible content, and indicia controllable by a controller and configured to indicate operative information in a first mode and to indicate illustrative information in coordination with the audible content in a second mode. The indicia may include a light-emitting diode and also a power indicator configured to indicate an actual power state of the robot in the first mode and a training pattern in the second mode. The robot may further include a voice synthesizer configured to synthesize spoken didactic information based on the audible content. Also, the wireless transmission may be structured according to a packet-encoded transmission protocol. The robot may further include a customizable body panel detachably affixed to a main body of the home robot, where the customizable body panel corresponds with themed audio data included in the audible content.


The robot may be configured to operate in at least first and second modes. The first mode corresponds to a normal robot operating state of the robot performing a primary function. The second mode corresponds to a training mode, where the speaker emits an audible training/instructional program. In the second mode, the indicia of the robot displays according to a training pattern in timed coordination with the audible training/instructional program, where the training pattern displayed on the indicia is different from an operative pattern corresponding to an actual state of the home robot.


In another aspect, a robot system includes a mobile robot and a wireless communication system for communicating with the robot. The wireless communication system includes a network interface unit configured to communicably interface with a first network and to wirelessly transmit data to the robot. The wireless communication system also includes a server configured to communicate with the network interface unit via the first network. The robot is configured to wirelessly transmit data to the network interface unit, which is configured to convert the data from a wireless protocol to a network protocol used by the first network. The network interface unit transmits the data to the server. The server may be configured to produce robot usage, robot behavior, and/or customer information based on the data transmitted to the server. Also, a user terminal configured to communicably interface with the first network and to control at least one function of the robot may be provided. The user terminal transmits a command corresponding to at least one robot function to the network interface unit via the first network. The network interface unit wirelessly transmits the command to the robot. User interaction is performed through this user interface, allowing further usage data to be collected. This offloaded interface also allows the robots to coordinate actions without needing to communicate directly with one another. The robot systems are functionally independent of one another, but are tied together via the server through a single user interface/data logging/usage information collecting server.
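The protocol conversion performed by the network interface unit might be sketched as follows; the JSON envelope and function names are hypothetical stand-ins for whatever network protocol the first network actually uses.

import json

def wireless_to_network(robot_frame: bytes, robot_id: str) -> bytes:
    """Convert a raw robot frame into an IP-friendly report for the server."""
    return json.dumps({"robot": robot_id, "data": robot_frame.hex()}).encode()

def network_to_wireless(command_message: bytes) -> bytes:
    """Convert a user-terminal command back into a frame the robot can receive."""
    command = json.loads(command_message)
    return bytes.fromhex(command["payload"])

report = wireless_to_network(b"\x01\x02", robot_id="robot-2")
frame = network_to_wireless(b'{"payload": "0a0b"}')
print(report, frame)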


In one example, the wireless communication system includes audible content stored at the server. The server is configured to transmit the audible content to the network interface unit, which is configured to wirelessly transmit the audible content to the robot. Alternatively, audible content may be stored at the user terminal, which is configured to transmit the audible content to the network interface unit via the first network. The network interface unit is configured to wirelessly transmit the audible content to the robot. Furthermore, content may be stored at a base station to which the robot docks for recharging and/or servicing.


Robot-generated data may include theme data corresponding to audible content, in which the server transmits the audible content to the robot via the network interface unit in response to the theme data. The data may alternatively include a behavior theme configured to cause the robot to behave according to a theme. The audible content may include voice data. Other robot behavioral changes may be implemented based on long-term monitoring and analysis of the robot-generated data. (For example, if the robot does not make it back to the dock before the battery gives out three times in a row, the server modifies the behavior of the robot to start looking for the dock/base station earlier. This modifies the robot behavior to be “more conservative.”)
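For illustration of the dock-seeking example above, a server-side adjustment rule might look like the following sketch; the threshold values and function name are hypothetical.

def adjust_dock_seek_threshold(battery_failures_in_a_row, current_threshold_pct,
                               step_pct=5, max_threshold_pct=50):
    """Return a more conservative dock-seeking battery threshold after repeated failures.

    Hypothetical policy matching the example above: if the robot has failed to
    reach the dock three times in a row, start looking for the dock earlier,
    i.e. at a higher remaining-battery percentage.
    """
    if battery_failures_in_a_row >= 3:
        return min(current_threshold_pct + step_pct, max_threshold_pct)
    return current_threshold_pct

print(adjust_dock_seek_threshold(3, current_threshold_pct=15))  # 20
print(adjust_dock_seek_threshold(1, current_threshold_pct=15))  # 15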


Behaviors are parameterized and can be modified, or even disabled/activated, based on analysis of usage/sensor information. Robot performance can be autonomously modified by the learned effects of the actual customer home. This can take place after the robot has been purchased and the server is updated to provide a better model of robot/home performance characteristics. The wireless reporting infrastructure allows a modification of behavior based on telemetry to provide the best performance for a particular customer. The learning process is dynamic and can change as an understanding of the data increases.


In one implementation, the wireless communications system includes a second user terminal which is configured to communicably interface with the first network. A theme may be stored at the first user terminal, which transmits the theme to the second user terminal via the first network. The first network may include UDP, TCP/IP, and/or Ethernet, as examples.


In another aspect, a content distribution system for distributing data to a robot includes a first server configured to communicably interface with a first network and a user-side node configured to transmit data to the robot. The robot receives customizable content via the user-side node. In one example, the content distribution system further includes a network hub configured to use a protocol compatible with the first network and a network adapter configured to communicably connect the network hub with the first network. The user-side node is configured to detachably interface with the network hub. In another example, a data slot is installed on the robot and configured to receive the user-side node. In yet another example, the content distribution system further includes a content server configured to communicably interface with the first network and to transmit the audible content to the robot via the user-side node using the first network. The content server transmits content to the user-side node based on information received from the first server (e.g., the content served by the content server may include licensed content such as music or sound, images, “dance” moves or patterns performable by an appropriate type of mobile robot such as a wheeled robot—also referred to herein as “robo-motions”, and the like, for which the copyright is held by a third party; alternatively, the copyright holder of the content may be the manufacturer or any other entity). Also, the user-side node of the content distribution system may be configured to receive content from the server via the first network, and the user-side node may be configured to transmit the content to the robot via a wireless communication protocol.


In some instances, the content distribution system further includes a user terminal configured to communicate via the first network and a content selection display presented on the user terminal. The customizable content is transmitted to the robot when selected from the content selection display on the user terminal. The user-side node includes an Ethernet dongle or a USB dongle (or “network bridge”) configured to detachably connect to an Ethernet hub. The user-side node is configured to receive content via the Ethernet hub and is configured to transmit the content to the robot via a second protocol different from the first network. (As used herein, the term “data bridge” is understood to refer to all such dongles and/or pocketable and/or portable devices capable of appropriately communicating with a robot, either via wireless, wired, or direct physical connection, or any other suitable modality for such a portable device to communicate and/or transmit data to the robot.) Also, the user-side node may receive content from the data bridge or from a remote site, with no client application located on the user terminal, just a web browser. Alternatively, a specific client application may be provided. The user-side node may be configured to operate using power supplied via the first network. The customizable content may include audible content, which may be organized into a theme of related discrete sounds.


In other instances, the robot transmits information corresponding to a theme to the server via the user-side node and receives customizable content including thematically related audio data corresponding to the theme. In one implementation, a main body of the robot includes a detachable body panel having an audio/theme identification unit. The robot is configured to identify audio content and/or a theme corresponding to the detachable body panel via the theme identification unit. The second protocol may include a wireless transmission protocol (e.g. ZigBee, 802.11a/b, wireless USB, serial-over-RF, AMPS, CDMA, GSM, Bluetooth, a simplistic or proprietary scheme, etc.).


The content distribution system may further include a voice synthesizer installed to the robot, in which the audio data includes voice synthesis parameters (for altering the perceived “personality” of a robot, or to better accommodate someone who has hearing loss in certain frequency ranges, for example).


Also, the robot may further comprise a robot firmware which is customized based on user feedback or robot sensor data processed by a server, in which the robot firmware is downloaded to the robot from the server.


The details of one or more implementations are set forth in the accompanying drawings and the description below. Other features, objects, and advantages will be apparent from the description and drawings, and from the claims.





DESCRIPTION OF DRAWINGS


FIG. 1A is a schematic diagram showing an example of a power-saving robot system.



FIG. 1B is a sectional view showing an example of a mobile robot.



FIG. 1C is a schematic diagram showing an example of a peripheral device.



FIG. 2 is a schematic diagram showing an example of a robot system.



FIG. 3 is a block diagram showing an example of a network data bridge.



FIG. 4 is a schematic diagram showing an example of a robot system including mobile robots, in which a computer transmits themes to the mobile robots.



FIG. 5 is a schematic diagram showing an example of body panel themes for a mobile robot.



FIG. 6A is a schematic diagram showing an example of a mobile robot that includes a network data bridge.



FIG. 6B is a schematic diagram showing an example of a mobile robot and an example of a network data bridge which connects to other networks via a network that runs over power lines in a building.



FIG. 7 is a block diagram showing an example of a robot system including a manufacturer server and a licensed content provider server.



FIG. 8 is a schematic diagram showing an example of a robot system including a vendor and a manufacturer server.



FIG. 9A is a state diagram showing an example of a state machine for a mobile robot.



FIG. 9B is a state diagram showing an example of a state machine for a mobile robot.



FIG. 9C is a state diagram showing an example of a state machine for a mobile robot.





Like reference symbols in the various drawings indicate like elements.


DETAILED DESCRIPTION


FIG. 1A is a schematic diagram showing an example of a power-saving robot system 100. The system 100 includes a peripheral device 102 and a mobile robot 104. In this example, the mobile robot 104 is a cleaning robot, such as a vacuum, brushing, or mopping robot. The peripheral device 102 transmits wireless commands to control the movement of the mobile robot 104. When the mobile robot 104 is out of range of the peripheral device 102, the peripheral device 102 enters a hibernation mode or low power consumption state. When the mobile robot 104 is in range of the peripheral device 102, a wireless command transmitted by the mobile robot 104 activates the peripheral device 102 from the hibernation mode. In certain implementations, the mobile robot 104 and the peripheral device 102 use a point-to-point protocol while communicating with one another. In certain implementations, the peripheral device 102 is a base station, such as a device to recharge the mobile robot 104 or a receptacle to empty debris from the mobile robot 104.


Referring to FIG. 1B, the mobile robot 104 includes a drive system 1042, a wireless communication component 1044, and a controller 1046. The drive system 1042 moves the mobile robot 104 about an environment, such as a floor to be cleaned. The wireless communication component 1044 communicates with the peripheral device 102. For example, the wireless communication component 1044 may receive signal beams from the peripheral device 102, such as infrared (IR), radio frequency (RF), and/or audio signals. In certain implementations, RF signals may be used to provide communication when the peripheral device 102 and the mobile robot 104 are outside of one another's line of sight. In certain implementations, IR signals may be used to provide communication when the peripheral device 102 and the mobile robot 104 are inside of one another's line of sight. In addition, the mobile robot 104 may use the signal strength to determine a distance to the peripheral device 102. The signals may prohibit movement of the mobile robot 104 through a particular area or guide the movement of the mobile robot 104 to a particular area. In addition, the controller 1046 uses the wireless communication component 1044 to temporarily activate the peripheral device 102 from a hibernation state, such as a state consuming a low amount of power. In certain implementations, the mobile robot 104 uses an IR signal, or line of sight form of communication, to activate the peripheral device 102 from the hibernation mode. In certain implementations, the mobile robot 104 sends the activation command in response to a query from the peripheral device 102. In certain implementations, the mobile robot 104 occasionally sends the activation command, such as continuously or periodically.


Referring to FIG. 1C, the peripheral device 102 includes a power supply 1022, a wireless communication component 1024, and a controller 1026. The power supply 1022 may be, for example, an electric battery. The power supply 1022 provides power to the various functions of the peripheral device 102, such as generating navigational signal beams 106a-c. The wireless communication component 1024 generates a fence beam 106a, a left guide (or directed) beam 106b, and a right guide (or directed) beam 106c. The wireless communication component 1024 also receives wireless signals from the mobile robot 104. The controller 1026 activates one or more of the beams 106a-c during the active mode and disables beams 106a-c in the hibernation mode. In certain implementations, the peripheral device 102 occasionally listens for an activation command from the mobile robot 104. In certain implementations, the peripheral device 102 sends an activation poll to the mobile robot 104 to determine if it should become active. In this example, the fence or barrier beam 106a prevents the mobile robot 104 from passing through the area where the mobile robot 104 detects the fence beam 106a. Beams 106b-c aid the navigation of the mobile robot 104.


In certain implementations, the robot 104 includes a display panel 105 in electrical communication with the controller board 1046. The display panel 105 includes indicia 1052 and an audio output device 1054. In one example, the indicia 1052 include a segmented illuminable maintenance display substantially mimicking the appearance of the robot. In other examples, the indicia 1052 include themed displays, which will be described later in further detail. The controller board 1046 controls the illumination of indicia 1052 and the audio responses from the audio output device 1054.


The peripheral device 102 may perform in several capacities. For example, the peripheral device 102 may act as a fence. The peripheral device 102 may use the fence beam 106a to prevent the mobile robot 104 from passing through an area, such as a doorway. The peripheral device 102 may also act as a gate. The fence beam 106a may provide a gate that prevents passage during a particular time interval, such as when the mobile robot 104 is in the process of cleaning a room. The peripheral device 102 may deactivate the fence beam 106a when the mobile robot 104 has finished cleaning the room and grant passage to the mobile robot 104. The mobile robot 104 uses the beams 106b-c to guide its way through the area covered by the gate. In addition, the peripheral device 102 may act as a trail marker or navigation beacon. For example, as described above the mobile robot 104 may use the beams 106b-c to navigate through areas, such as doorways. The beams 106a-c may contain information, such as an identifier (ID) of the peripheral device 102, an identifier of the type of beam, and an indication of whether the peripheral device 102 is a gate or a fence. If it is a gate, the beam identification allows the robot 104 to determine whether it is detecting the left or right guide beams 106b and 106c, respectively. The peripheral device identifier allows the mobile robot 104 to distinguish the beams 106a-c of the peripheral device 102 from beams transmitted by another peripheral device. The mobile robot 104 may be taught (or may itself learn) a path to an area, such as a back room of a house, by following a pattern of peripheral device identifiers. The beam type identifier indicates whether the beam is a fence beam 106a, a left side navigation beam 106b, or a right side navigation beam 106c. If the beam is a fence beam 106a, the beam information may also indicate whether the beam is acting as a gate that may be opened, given the proper command, or a barrier that remains closed. In any case, while the mobile robot 104 is out of range, the peripheral device 102 hibernates and the beams 106a-c remain inactive.
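As a non-limiting illustration, the beam information described above (peripheral identifier, beam type, and gate/fence indication) could be packed into a single byte as in the following Python sketch; the bit layout shown is hypothetical.

from enum import IntEnum

class BeamType(IntEnum):
    FENCE = 0
    LEFT_GUIDE = 1
    RIGHT_GUIDE = 2

def encode_beam_info(peripheral_id: int, beam_type: BeamType, is_gate: bool) -> int:
    """Pack beam information into one byte (hypothetical layout:
    bits 7..3 peripheral ID, bits 2..1 beam type, bit 0 gate flag)."""
    return (peripheral_id & 0x1F) << 3 | (beam_type & 0x03) << 1 | int(is_gate)

def decode_beam_info(word: int):
    """Recover (peripheral ID, beam type, gate flag) from an encoded byte."""
    return word >> 3 & 0x1F, BeamType(word >> 1 & 0x03), bool(word & 0x01)

word = encode_beam_info(peripheral_id=7, beam_type=BeamType.FENCE, is_gate=True)
print(decode_beam_info(word))  # (7, <BeamType.FENCE: 0>, True)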


The wireless communication component 1024 of the peripheral device 102 receives a signal from the wireless communication component 1044 of the mobile robot 104 to activate the peripheral device 102 from a hibernation state. In certain implementations, the mobile robot 104 transmits a first activation signal 108a to activate a first set of peripheral device beams, such as the fence beam 106a while cleaning is in progress. In certain implementations, the mobile robot 104 transmits a second activation signal 108b to activate a second set of peripheral device beams, such as the navigation beams 106b-c when the mobile robot 104 moves to another room. In certain implementations, the signals 108a-b include a mobile robot identifier. The peripheral device 102 may use the mobile robot identifier to activate, for example, a first set of beams, such as the fence beam 106a, in response to an activation request from the mobile robot 104 and a second set of beams, such as the beams 106b-c in response to an activation request from a second mobile robot. In this example, the mobile robot identifiers allow the peripheral device 102 to activate beams based on the mobile robot requesting the activation, such as by providing a fence to the mobile robot 104 and a gate to a second mobile robot.



FIG. 2 is a schematic diagram showing an example of a robot system 200. The robot system 200 includes the mobile robot 104 and a network data bridge 202. In this example, the wireless communication component 1044 of the mobile robot 104 receives serial commands from the network data bridge 202, such as radio-frequency (RF) signals. Typically, these signals may be transmitted by the network data bridge 202 or other such user-side node, which is in turn connected to an Ethernet router/switch/hub 204 along with several other Ethernet-connected devices such as a home computer 206, a laptop computer 208, a cable/DSL/satellite/broadband-adapter 210 or modem, and e.g. one or more other computing devices such as a personal digital assistant 212.


In one example, the network data bridge 202 which attaches to an Ethernet port on the Internet-connected router 204 or switch may automatically download a script from a predetermined Internet or local server (e.g., via BOOTP, DHCP, HTTP, FTP, and/or TFTP) thereby providing automatic commands, such as device configuration or diagnostic testing, to be performed. Alternatively or additionally, a user may manage the mobile robot 104 using a device, such as the computer 206. The Ethernet-attached network data bridge 202 may provide for configuration and operational functionality via a small, embedded HTTP server built into the firmware of the network data bridge 202. Devices other than the computer 206 may also be used to interface with the network data bridge 202, such as a set-top box, a game console, the PDA 212, a cell phone 214, or a home server that is programmed to communicate using web or other networked interfaces.


As an alternative, access to broadband is provided via a USB port, as may be provided by the computer 206. For example, a user may insert a driver CD-ROM into the computer 206 upon plugging in a USB-based wireless transceiver, in order to install a driver therefor. Another connection, such as IEEE 1394/Firewire, RS-232, parallel port connections, and/or X10, may be used. These, however, may not necessarily be network data bridges.


Once a network data bridge 202 is attached to the network-accessible device 204, it can contact a server.



FIG. 7 is a block diagram showing an example of a robot system 700 including a manufacturer server 702 and a licensed content provider server 704. The manufacturer server 702 and the content provider server 704 may be connected to the broadband modem 210 via the Internet 706 or another appropriate network. The mobile robot 104 may report information to the server 702, such as the status of the mobile robot 104 or usage data regarding the mobile robot 104. The server 702 may store the reported data in a repository 708. The reported data may be associated with information regarding the user of the mobile robot 104. The user information may be stored in a repository 710.


Further, the network data bridge 202 may connect wirelessly to the mobile robot 104 and initiate communications therewith. While the Ethernet hub 204 includes four wired Ethernet ports as well as 802.11 wireless Ethernet connectivity, and although 802.11 or other such wireless networking protocol may be used to communicate with a mobile robot 104 from the base station other than via a network data bridge, in certain implementations, the mobile robot 104 and the network data bridge 202 use a simple, serialized RF protocol in order to exchange information between the mobile robot 104 and the base station, rather than the full-weight networking protocols.


In certain implementations, the mobile robot 104 may be further simplified by providing receive-only functionality on the mobile robot 104, instead of bi-directional wireless communication support. However, as an alternative, the mobile robot 104 may include full bi-directional wireless communications support in order to transmit information from the mobile robot 104 to the base station (and e.g., to the user, the manufacturer, etc.).


The manufacturer may receive real-world mobile robot data for product refinement and R & D. For example, the mobile robot 104 may collect data regarding behavioral patterns (e.g., a number of errors encountered, a number of times the mobile robot 104 has become stuck, or how frequently the mobile robot 104 is used) and forward such information to the mobile robot's manufacturer for refining market research and producing future models of the mobile robot 104, for example, by correcting design flaws or device problems. Moreover, customer information such as frequency of robot use, name, customer ID, etc., may also be correlated using information forwarded to the manufacturer's website from the mobile robot 104 via wireless and wired networking.


In addition, instead of the user having to locate and e.g., tote the mobile robot 104 to the base station in order to physically connect the mobile robot 104 thereto for software upgrades, software updates, and the like, a wireless update function may be provided by the network data bridge 202 in order to update the robot's firmware or other on-board software, personality, sounds, and/or displayed pictures. Also, a user may design themes or other content and have this content transmitted to the mobile robot 104 via the wireless communication channel provided by the network data bridge 202.



FIG. 3 is a block diagram showing an example of a network data bridge. The network data bridge 202 includes a network connector 302, such as an RJ-45-style male Ethernet connector. Also, the network data bridge 202 includes an antenna 304, such as an enclosed, internal antenna, operatively driven by a wireless command interface 306, which is in turn connected to a data bridge component 308 (the mobile robot 104 may likewise include an enclosed, internal antenna; alternatively, either or both of the network data bridge 202 and the mobile robot 104 may include one or more external antennas, either in addition to or in lieu of an internal antenna, for example). The data bridge component 308 is connected to a broadband network interface 310 for managing and converting inbound and outbound broadband-side data (such as Ethernet, 802.11b, and/or TCP/IP packets) to and from a wireless-side simplified networking protocol. The data bridge component 308 extracts serial commands received by the broadband network interface 310 and broadcasts the commands via the wireless command interface 306 and the antenna 304, using the RPAN protocol.


The network data bridge 202 is plugged directly into the owner's broadband router 204. The network data bridge 202 acquires network information from a DHCP server or is optionally configured by an advanced user. The network data bridge 202 calls home (i.e., a home robot manufacturer's or reseller's Internet server) with local configuration information (serial number, local network properties, etc.). The network data bridge 202 begins polling a pre-configured URL with periodic HTTP POSTs. Each post contains status information on the mobile robot(s) 104 in the customer's home. This data can be robot/firmware specific—the network data bridge 202 need not understand the data itself (although it may well do so in certain implementations).
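A minimal sketch of the bridge's reporting loop follows, assuming a hypothetical pre-configured endpoint and treating the robot status as an opaque blob that the bridge forwards without interpreting.

import json
import time
import urllib.request

REPORT_URL = "http://robots.example.com/report"  # hypothetical pre-configured URL

def post_status(serial_number: str, status_blob: str) -> bytes:
    """POST one status report and return the server's reply body."""
    body = json.dumps({"bridge": serial_number, "status": status_blob}).encode()
    request = urllib.request.Request(
        REPORT_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(request, timeout=10) as response:
        # The HTTP response may itself carry a queued command for a robot.
        return response.read()

def polling_loop(read_robot_status, handle_command, period_s=60):
    """Periodically report status and act on any command returned in the reply."""
    while True:
        reply = post_status("BRIDGE-0001", read_robot_status())
        if reply:
            handle_command(reply)
        time.sleep(period_s)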


A CGI script receiving the POSTs processes this sensor report and updates an internal database, creating a historical view of the robot system. Software-based virtual sensors examine this database (on a per-robot basis) and trigger events such as virtually pressing a button on the robot or triggering an email to its owner.
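The receiving side can be sketched as follows, using an in-memory SQLite table as a stand-in for the internal database and a single "virtual sensor" that triggers an owner alert after three consecutive "stuck" reports; the schema and threshold are hypothetical.

import sqlite3
import time

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE reports (robot_id TEXT, ts REAL, stuck INTEGER)")

def handle_report(robot_id: str, stuck: bool) -> None:
    """Append one incoming sensor report to the history, then run virtual sensors."""
    db.execute("INSERT INTO reports VALUES (?, ?, ?)", (robot_id, time.time(), int(stuck)))
    run_virtual_sensors(robot_id)

def run_virtual_sensors(robot_id: str) -> None:
    # Virtual sensor: three consecutive "stuck" reports trigger an owner alert.
    rows = db.execute(
        "SELECT stuck FROM reports WHERE robot_id = ? ORDER BY ts DESC LIMIT 3",
        (robot_id,),
    ).fetchall()
    if len(rows) == 3 and all(stuck for (stuck,) in rows):
        print(f"alert owner: {robot_id} has been stuck three reports in a row")

for report in (True, True, True):
    handle_report("robot-2", stuck=report)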


The owner may visit the home robot manufacturer's web presence using a modern, i.e., JavaScript-enabled (or any other suitable scripting language such as Visual Basic, Python, Perl, PHP, etc.) web browser, and create a user account. As part of the registration process, the customer enters the unique key as shipped with the wireless data bridge—this unique key pairs the incoming sensor stream with the user's account.


After registration, the user may be forwarded to their portal page. This page is dynamically generated using the information already provided by the robot gateway and product information and tie-ins provided by the back end infrastructure of the manufacturer's server(s).


The owner browses to a theme or content store and purchases an audio theme with immediate online delivery. The theme or content store contacts the robot sensor database and adds to the command queue: "download this content to robot #2." When the gateway device next posts sensor data, the HTTP response is the command to download the attached content data to the specified robot. The wireless data bridge begins forwarding this binary stream to the robot via RF. When completed, the gateway may send a download acknowledgement with the next sensor report.


During this transaction, the javascript (or other suitable script) embedded into the owner's web interface has been polling the back-end servers for status updates. A progress bar has been drawn and animated using javascript and DHTML (or Ruby on Rails, a JAVA applet, or any other suitable display technology). The user may feel that s/he is interacting directly with the robot via the web page, despite the levels of software and communication indirection lying therebetween.


In one implementation, the wireless data bridge 202 may include a female port into which an Ethernet patch cable (or other such networking cord) plugs from a suitable network connection point, and/or into which an interface portion of a home robot attaches, for example. As examples of such a system as described hereinabove, these communication channels provide a mechanism for retrieving sensor data and sending commands to robots in the field by piggy-backing on their broadband connection.


Such a bi-directional communication system allows deployment of online services and retrieval of sensor data from a manufacturer's installed base for improved customer service and system characterizations. It may further increase the manufacturer's comprehension of how robots and individual subsystems perform in the field.


Interaction with the network-enabled mobile robot(s) 104 in a customer's home may take place through a web browser, in accordance with certain embodiments. Web browser access provides support for robot interaction via non-PC devices (e.g., cell phones and PDAs) with compliant browsers.



FIG. 6A is a schematic diagram showing an example of the mobile robot 104 that includes the network data bridge 202. In this example, the network data bridge 202 is a card that is inserted into an interface slot 602 in the mobile robot 104. This type of network data bridge may be self-contained and transport data on constituent RAM, ROM, Flash, or EEPROM-type storage devices (which might be loaded with software, video, or audio content either at a user's computer equipped with a special writing unit or at the manufacturer in order to provide content such as themed content, for example); or can be loaded with code number(s) that authorize a wireless download to the network data bridge 202; or, alternatively, may be connected to a network via a wire or by wireless Ethernet, for example.


A "memory stick"-type (serial port interface) network data bridge 202 may provide content to mobile robot users who lack an Ethernet hub or Internet connection, or for users who are unable to purchase content online via credit card, or who simply come across a set of content while at a store and wish to make an impulse purchase or gift purchase for another. Furthermore, similar to the network data bridge implementation discussed above, personal computer use is not necessarily required because the user may plug the "memory stick"-type network data bridge 202 directly into a receptacle 602 defined by the mobile robot 104, and content on the network data bridge 202 may be automatically uploaded to the mobile robot 104. See, e.g., U.S. patent application Ser. No. 11/166,518, incorporated herein by reference in its entirety.



FIG. 6B is a schematic diagram showing an example of the mobile robot 104 and an example of the network data bridge 202 which connects to other networks via a network that runs over power lines in a building. The network data bridge 202 may be configured to plug into a standard power outlet 604 and to participate with a home power-line network, for example, in homes or markets where Ethernet networking components are not available. Alternatively, the network data bridge 202 may plug into a standard telephone wall jack in order to communicate via a home telephone wiring network, for example. In certain implementations, the network data bridge 202 might be plugged into any of an Ethernet port, the power socket 604 or a telephone wall jack, and auto-negotiate a connection to the Internet (if available) and/or to the mobile robot(s) 104. To this end, many “Ethernet-over-home power lines” and similar schemes or products are widely produced and well known in the art; for example, as an early commercial endeavor in this technology area, the X10 communication standard permits communication over power lines by encoding a single bit of information at each zero-point in the 120 V(RMS)@60 Hz power cycle common in North America, for example, and many more modern Ethernet-like power line networking systems are commercially available, in which each networked device connects to the network typically via an electrical socket on a wall. A common feature is that the network data bridge extracts the serial commands and data from encapsulating broadband protocols (Ethernet, TCP/IP, 802.11x) for transmission on the local wireless robot network (RPAN), and similarly encapsulates such commands and data from the RPAN for transmission on the broadband network.


The wireless data bridge 202 may provide web server functionality and serve static or dynamic web content corresponding to enabled mobile robots 104 belonging to the mobile robot user. Such web server functionality may be provided on the mobile robot user's local broadband network and e.g. be broadcast discoverable using TCP/IP, UDP, Ethernet, SNMP, NetBEUI, IPX, SMB or uPnP broadcast network announcing, for example, in order to be found by mobile robot users when browsing the local area network; alternatively, a static network address (such as a standard, pre-set IP address) may be assigned to the data bridge 202 such that users may simply type the static network address into a web browser to reach the web server on the network data bridge 202. The web content may be active or static, and may be tailored to the functionality to be provided and/or may be updated via the Internet or local network.



FIG. 8 is a schematic diagram showing an example of the robot system 200 with a content serving system for transmitting content to a mobile robot. The system 200 for accessing home robot-related content and controlling a home robot via the Internet may include an embedded web-server; an online presence accessible as a service/content provider; and a web-based user interface specific to the robots 104 in the customer's home. These components may be used to tunnel events across the Internet to an online “robot presence” generated by, for example, a home robot manufacturer. This “robot presence” may provide interactivity with the user's home robot (audible content or other types of theme and/or content downloads, remote button presses, etc.) via a hosted web service. The events thus tunneled may include changes in sensor values, user interaction, commands to the robot, and state changes, inter alia. Use of a bi-directional communication channel (such as a wireless robot network communication channel) allows melding between a robot in someone's home, procedural and informational repositories on a remote server, and a web-based robot user interface, to produce powerful capabilities and forge new functionalities such as adding offloaded “intelligence” to otherwise simple robots (by, for example, performing the bulk of number-crunching or compute-intensive tasks on a separate computer or server, and simply uploading and/or downloading results and/or sensor input to and from the home robot itself). In effect, by tunneling the local robot system's communication fabric across the Internet or other suitable network, it is possible to allow back-end servers to interact with robots in users' homes.


The data bridge 202 may send local network information to the Internet server. As a non-limiting example, the user may connect to the Internet 706 and may be redirected to the local bridge as appropriate. By publishing a well known Internet address that both bridge and/or the user may access, the need to know the user's local information may be eliminated.


In addition to the network data bridge 202, a wireless remote control may offer several similar wireless functions for controlling or managing the mobile robot 104. The wireless remote control may communicate directly with the mobile robot 104 via an infrared (IR) or RF protocol, or may relay commands through the network data bridge 202, for example, when the mobile robot 104 is not within sight but the remote control is within IR signaling range of the network data bridge 202. The network data bridge 202 may thus also be provided with an IR sensor for receiving mobile robot control commands from the wireless remote control, and then relay the commands wirelessly to the mobile robot 104—for example, the embedded web-server may bridge a proprietary or ad-hoc communication method used internally by the mobile robot 104 (and also used by accessory items added to the mobile robot 104) with a persistent online presence by translating the internal communication protocol(s) into HTTP POST and GET transactions.


The online presence may generate a web-based user interface that incorporates javascript components to asynchronously poll the mobile robot 104 for state changes (e.g., battery voltage). This javascript may asynchronously fetch changes in the robot properties and rewrite the content in the page. Sensor values, etc., can be refreshed by the web browser without the customer needing to click refresh on the browser, for example.


The web-based interface may use customer tracking and persistent robot sensor data to pair the mobile robot 104 with the customer account and present user interfaces for the equipment the customer owns.


Further, if a series of the peripheral devices 102 are arranged throughout a home or other location and set up as a relay network stemming from the base station, for example, then the commands relayed from the remote control may also be relayed throughout the beacon network to reach a home robot that may be quite distant from the remote control.


Wireless bandwidth (especially in unlicensed bands such as 900 MHz, 2.4 GHz, or any other such suitable public RF band) is by its nature limited, and because the presence of multiple RF devices (such as, for example, multiple mobile robots and/or network data bridges, WiFi, BlueTooth, X10, mobile or portable telephones, or other common wireless devices; and/or interference from sources such as solar flares, RF discharge from electrical lines, fluorescent lights, or any other RF-interfering entity) may further restrict the effective amount of bandwidth or the degree of reliability of bandwidth available for wireless mobile robot communications, reliability and postponement measures may be taken to enhance the functionality of the network data bridge 202 and/or the mobile robot 104; conversely, the network data bridge 202 and/or the mobile robot 104 may be configured to reduce their consumption of available bandwidth in order to give priority to other wireless devices. For example, regarding the reliability of the wireless robot network communications, techniques such as cyclic redundancy checking (CRC) and/or hash routines (such as MD5 sums or CRAM) or other appropriate reliability techniques (such as parity or error correcting codes (ECC)) may be employed on either the data bridge-to-robot channel and/or the Internet-connected channel (e.g., on the Ethernet-to-data bridge channel). Furthermore, to limit the use of valuable bandwidth during business or other peak usage times, the network data bridge 202 and/or the mobile robot 104 may be scheduled to transmit theme content, usage/behavior data, or any other such communication during night-time or off-peak times; alternatively, for example, the network data bridge 202 and/or the mobile robot 104 (and/or the manufacturer's server) may be scheduled to perform their communication (or the bulk of their communication) at an automatically detected off-peak usage time, by detecting when bandwidth usage is lowest (either in real-time or by collecting data of bandwidth usage-per-time-of-day over a series of days or weeks and then determining the generally least used times of day, as non-limiting examples). Reliability measures may be taken at either the network or application layer or both, for example, or at any other suitable layer in a communication stack (such as the data bridge using UDP on the Internet for simplicity and non-critical communications, but the web server using full error-checking, reliability and/or error correction measures, windowing, etc.).
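As a non-limiting illustration of the automatic off-peak detection mentioned above, the sketch below accumulates observed bandwidth per hour of day and selects the least-used hour for bulk transfers; the sample data and function name are hypothetical.

from collections import defaultdict

def least_used_hour(usage_samples):
    """Pick an off-peak hour from (hour_of_day, bytes_observed) samples.

    Hypothetical helper matching the scheme above: accumulate observed
    bandwidth usage per hour of day over several days, then schedule bulk
    transfers (theme content, usage reports) for the least-used hour.
    """
    totals = defaultdict(int)
    for hour, observed_bytes in usage_samples:
        totals[hour] += observed_bytes
    return min(totals, key=totals.get)

samples = [(9, 500_000), (9, 700_000), (20, 900_000), (3, 20_000), (3, 15_000)]
print(least_used_hour(samples))  # 3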


Also, the web server functionality in such a data bridge can communicate with a known network address or location (such as a fixed IP address or uniform resource locator (URL)) in view of issues that arise with DHCP and dynamic IP address assignment, for example; the web server on the network data bridge 202 may thus behave in a manner similar to a client for a significant portion of time in order to actively access and/or poll “server”-like resources available on the mobile robot 104 (via wireless connection, for example), as in many examples the mobile robot 104 maintains relatively little network functionality in the mobile robot 104 itself (such capabilities being offloaded largely onto the network data bridge 202).


The web server functionality may establish network communications with the mobile robot 104 and/or Internet server(s) via a suitable protocol or standard, such as FTP, FTPS, TFTP, HTTP, HTTPS, GOPHER, TELNET, DICT, FILE and LDAP, HTTP POST, HTTP PUT, FTP uploading, HTTP form based upload, proxies, cookies, user+password authentication (Basic, Digest, NTLM, Negotiate, Kerberos4), file transfer resume, http proxy tunneling, and/or other suitable network methods (such as a method supported by libcurl, for example). The network data bridge 202 may employ network announcement techniques such as uPnP, dynamic DNS, reverse ARP, Ethernet or UDP or TCP/IP broadcasting, or another suitable method of broadcasting the existence of the network data bridge 202 to other devices on the same network.


By offloading much of the server functionality from the mobile robot 104 to the network data bridge 202, and using the network data bridge 202 as a communications proxy, mirror and gateway, the mobile robot 104 is also shielded from excessive client requests that might otherwise tax its processing and/or bandwidth resources. For example, the mobile robot 104 may in one time period (e.g., ten minutes) produce thirty visual snapshots from an on-board camera. Then, if several entities were to attempt to download the snapshots from the mobile robot 104 simultaneously, the mobile robot 104 might quickly be overwhelmed as the wireless network became bogged with service request traffic. As an alternative, however, the mobile robot 104 may be accessed by just one entity, the network data bridge 202, at a known regimen of polling requests, thus preserving the mobile robot's bandwidth and processing capability while still permitting replication of any such collected data by copying it to Internet servers for broader access, for example.
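The proxying behavior described above can be sketched as a simple cache in front of the wireless link: only the bridge polls the robot, at a fixed interval, and any number of clients read the cached result. The class and parameter names are hypothetical.

import time

class SnapshotProxy:
    """Sketch of the data bridge shielding the robot from client traffic.

    Only the proxy talks to the robot, at a fixed polling interval; any number
    of clients can then read the cached snapshot without generating wireless
    traffic. fetch_from_robot is a stand-in for the bridge-to-robot link.
    """

    def __init__(self, fetch_from_robot, poll_interval_s=20.0):
        self._fetch = fetch_from_robot
        self._interval = poll_interval_s
        self._cached = None
        self._last_poll = 0.0

    def get_snapshot(self):
        now = time.time()
        if self._cached is None or now - self._last_poll >= self._interval:
            self._cached = self._fetch()   # single poll over the wireless link
            self._last_poll = now
        return self._cached                # all clients share the cached copy


proxy = SnapshotProxy(fetch_from_robot=lambda: b"jpeg-bytes", poll_interval_s=600)
print(proxy.get_snapshot() is proxy.get_snapshot())  # True: only one wireless fetch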


In addition to RF-band wireless communication, the network data bridge 202 (and/or the mobile robot 104 or the peripheral device 102) may transmit via other suitable frequencies and/or bands in the electromagnetic spectrum, such as the 900 MHz, 2.4 GHz, or microwave frequencies, or other suitable bands. To alleviate interference that may occur in these or the RF or another band, the mobile robot 104 and/or the network data bridge 202 may employ frequency shifting, spread spectrum, sub-channel technologies, and/or other such interference-avoidance schemes or techniques for avoiding interference with other unlicensed RF applications (phones, baby monitors, etc.).


In addition, robot commands might be sent by the network data bridge 202. Additional functionality may be provided to the user in the form of issuing remote commands while away from home. Accordingly, if a home robot owner were to forget to schedule or activate the mobile robot 104 prior to leaving on a business trip, the mobile robot user might still program or activate the mobile robot 104 remotely via a command generated by (for example) a mobile robot interaction website provided by the mobile robot manufacturer, which would be relayed through the Internet or other suitable network to the network data bridge 202, which would in turn convert the information received from the Internet into a corresponding wireless robot network command and transmit the command wirelessly to the mobile robot 104 for execution.


In certain implementations, a series of robot commands corresponding to timing and execution of movements, etc., may be compiled into a “dance” routine and transmitted to the mobile robot 104 after a user selects the “dance” from a website maintained on the mobile robot manufacturer's server; alternatively, the “dance” might be combined with audible content such as music or sounds, and/or commands to control the indicia (such as light-emitting diodes (LEDs), liquid-crystal displays, and/or backlights, inter alia) to provide a multimedia “dance,” music and lightshow. A further non-limiting example includes live trouble-shooting or technical support provided to mobile robot users, e.g., initiated through a telephone call or the Internet, or e.g., through a microphone and speaker installed as part of the mobile robot 104 along with the appropriate computing, recording, mixing and transceiving hardware and software (and bandwidth, both wireless and over the Internet, as well as proper latency and synchronization). Additionally, for example, a camera might be included for enhancing such virtual interaction, and/or the proximity sensor normally put to use in obstacle detection may be employed during alternative modes as a general-purpose observation camera in those implementations in which the proximity sensor is configured to function as a visual-spectrum camera, with encoding and transmission of streaming video from the robot via the wireless link to the network data bridge 202 and onto a networked destination, inter alia.


Similarly, for transmitting robot usage and behavior information to the mobile robot manufacturer's server, the mobile robot 104 may collect certain data regarding battery use, recharging frequency, the amount of time spent performing its primary task, the amount of time spent idle, the frequency with which the robot becomes stuck, etc., and periodically transmit this data to the mobile robot manufacturer's servers via the network data bridge 202.
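
A minimal sketch of how such usage counters might be accumulated and periodically uploaded is given below; the structure fields, the rf_send_to_bridge() primitive, and the clear-after-acknowledgement policy are illustrative assumptions rather than features of any particular implementation.

    #include <stdint.h>
    #include <stdbool.h>

    /* Hypothetical usage counters accumulated by the robot between reports. */
    typedef struct {
        uint32_t battery_mah_used;   /* energy drawn since the last report     */
        uint16_t recharge_cycles;    /* number of docking/charging events      */
        uint32_t cleaning_seconds;   /* time spent performing the primary task */
        uint32_t idle_seconds;       /* time spent idle                        */
        uint16_t stuck_events;       /* times the robot reported being stuck   */
    } usage_report_t;

    /* Assumed radio primitive: sends a buffer to the network data bridge. */
    extern bool rf_send_to_bridge(const void *buf, uint16_t len);

    /* Called once per second from the robot's main loop (assumption). */
    void usage_tick(usage_report_t *r, bool cleaning)
    {
        if (cleaning) r->cleaning_seconds++;
        else          r->idle_seconds++;
    }

    /* Called when the reporting interval (for example, hourly) elapses. */
    void usage_flush(usage_report_t *r)
    {
        if (rf_send_to_bridge(r, sizeof *r)) {
            /* Clear the counters only after the bridge accepted the upload. */
            *r = (usage_report_t){0};
        }
    }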


Moreover, the ability to transmit audible content to the mobile robot 104, either via the network data bridge 202 or via a “memory stick”-type data bridge, permits the mobile robot 104 to “speak” instructions directly to the user of the mobile robot 104 at the time and place of use. For example, the mobile robot 104 may speak directions when in a demo mode that is initially run to demonstrate various features the mobile robot 104 can perform. Voice instruction to the mobile robot user can be accomplished by transmitting encoded audible content to the mobile robot 104 (either by installing such audible content on read-only memory (ROM) in the home robot at the time of manufacture, or by transmitting it wirelessly or otherwise and storing the audible content in flash RAM, for example) and playing it back over an on-board decoder and/or synthesizer and speaker installed in the mobile robot 104. Synthesized speech encoded for decoding on speech synthesis hardware may require less on-board storage than non-synthesized speech; however, as an alternative, natural or computer speech may be recorded as wave-form encoded (and/or psycho-acoustically encoded) sound data and transmitted to the mobile robot 104 for storage and later playback. Alternatively, however, speech for playback on the mobile robot 104 may also be encoded as WAV files or compressed sound files (e.g., employing compression such as Lempel-Ziv-Welch (LZW) or other techniques that are less computer-resource-intensive than, for example, MP3 or Windows Media Audio (WMA) decoding).


As another example, by using a synthesizer, a phoneme string file to be downloaded to the mobile robot 104 may include and/or be thematically associated with, for example, an animation storyboard file including tags that trigger synchronized or asynchronous parallel movement events, lights (or other indicia) and/or non-synthesized sound. Using such a phoneme pattern and storyboard, a string such as “he-llo ho-w [which may here present a tag to start a “bowing” trajectory-in, trajectory-out movement as an asynchronous ballistic behavior] a-re y-ou” may thus trigger the associated “bowing” robo-motion (e.g., a thematic “dance” or emotive behavior performable by the mobile robot 104). Further, if sound recording hardware such as a microphone and sound processing hardware are installed in the mobile robot 104, as well as sufficient processing or transmitting capability, then the mobile robot 104 may include speech-recognition functionality in order to recognize spoken commands or other interaction from mobile robot users; also, the storyboarding capability as discussed above may encompass and encode responses and references to any of the functionalities or capabilities performable by the mobile robot 104.
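
A sketch of one way such a tagged phoneme string could be played back is given below; the bracketed-tag syntax and the speak_phonemes() and start_robo_motion() helpers are assumptions made for illustration only.

    #include <stdio.h>
    #include <string.h>

    /* Assumed hooks into the synthesizer and motion subsystems. */
    void speak_phonemes(const char *syllables, size_t len) {
        printf("speak: %.*s\n", (int)len, syllables);
    }
    void start_robo_motion(const char *name) {
        printf("motion: %s (asynchronous, ballistic)\n", name);
    }

    /* Walk a phoneme string, speaking text and firing any [tag] encountered. */
    void play_storyboard(const char *s)
    {
        while (*s) {
            const char *tag = strchr(s, '[');
            if (!tag) {                       /* no more tags: speak the rest  */
                speak_phonemes(s, strlen(s));
                break;
            }
            if (tag > s)                      /* speak text preceding the tag  */
                speak_phonemes(s, (size_t)(tag - s));
            const char *end = strchr(tag, ']');
            if (!end) break;                  /* malformed tag: stop parsing   */
            char name[32] = {0};
            size_t n = (size_t)(end - tag - 1);
            if (n >= sizeof name) n = sizeof name - 1;
            memcpy(name, tag + 1, n);
            start_robo_motion(name);          /* e.g., "bow" triggers bowing   */
            s = end + 1;
        }
    }

    int main(void) {
        play_storyboard("he-llo ho-w [bow] a-re y-ou");
        return 0;
    }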


An RF system used by the mobile robot 104, the network data bridge 202, the remote control, and/or the peripheral device 102 may include four radio transceiver modules that are located in the mobile robot 104, the remote control, the peripheral device 102, and the network data bridge 202. The remote control may use RF to transmit control signals to the mobile robot 104 using a bidirectional protocol or unidirectional protocol; the remote control unit may also allow the user to “drive” the mobile robot 104 around as well as send scheduling data created on the remote control unit. The mobile robot 104 may use RF to wake up and power-manage the peripheral device 102 using a bidirectional protocol. The network data bridge 202 may use RF to transmit data and code updates to the mobile robot 104 as well as to upload diagnostic data from the mobile robot 104 using a bidirectional protocol. Furthermore, when there are multiple peripheral devices as well as the network data bridge 202 in operation, in which the peripheral devices and the network data bridge 202 can maintain an RF or other communication channel in a relayed fashion, the wireless robot network communication between the network data bridge 202 and the mobile robot 104 may be propagated along the chain of peripheral devices even when the mobile robot 104 is beyond the direct RF range of the network data bridge 202. The effective range of the wireless robot network can thus be extended by the linking of peripheral devices.


The 2.4 GHz ISM band may be used with either direct-sequence or frequency-hopping spread-spectrum transmission techniques. In addition, either a custom proprietary protocol based on some implementation of a standard 7-layer OSI model may be used, or the ZigBee 802.15.4 standard protocol may alternatively be used, inter alia. The custom protocol may allow for proper regulatory compliance in all countries of interest, for example, or alternatively, may be suited for each anticipated national market.


The following single chip, integrated RF transceiver radio modules are examples of chipsets that may be used to implement the RF system: Chipcon CC2500; Chipcon CC2420; Freescale MC13191; Freescale MC13192. A printed circuit “F” style antenna design may be used with either no external RF power amplification or a suitable RF power amplification depending on the range and power requirements.


Regarding a proprietary robot-net RF protocol, such a protocol may be simpler than ZigBee, for example, in that ZigBee has two parts: signaling (IEEE 802.15.4) and application (routing, etc.). The proprietary robot-net protocol may nevertheless use 802.15.4, because the standard has driven down the cost of goods for antennas, microcontrollers, etc. The contemplated robot-net may however deviate from ZigBee (a meshed network with routing) at least in that it may be a point-to-point network. Under ZigBee, nodes would be required (in some cases) to route traffic for other nodes; this behavior may place excessive responsibility on lighthouses, remote controls, RF data bridges, etc. Robot-net RF may include a sparse protocol that simply has robot or beacon control and reporting messages like WAKEUP, GO_CLEAN(robot-n), ERROR(robot-n, i-am-stuck), etc.
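
Such a sparse message set could be represented by a small enumeration carried in a short fixed packet, as in the following sketch; the opcode values, field widths, and structure names are illustrative assumptions.

    #include <stdint.h>

    /* Hypothetical robot-net opcodes; the names follow the examples above. */
    typedef enum {
        MSG_WAKEUP   = 0x01,   /* wake a dormant robot or beacon        */
        MSG_GO_CLEAN = 0x02,   /* command robot-n to start its task     */
        MSG_ERROR    = 0x03,   /* robot-n reports a fault, e.g., stuck  */
    } robotnet_opcode_t;

    typedef enum {
        ERR_NONE = 0,
        ERR_I_AM_STUCK,
        ERR_LOW_BATTERY,
    } robotnet_error_t;

    /* A point-to-point robot-net payload: no routing fields are required. */
    typedef struct {
        uint8_t opcode;     /* one of robotnet_opcode_t                   */
        uint8_t robot_id;   /* which robot the message addresses/reports  */
        uint8_t arg;        /* e.g., robotnet_error_t for MSG_ERROR       */
    } robotnet_msg_t;

    /* Example: build an ERROR(robot-n, i-am-stuck) report. */
    robotnet_msg_t make_stuck_report(uint8_t robot_id)
    {
        robotnet_msg_t m = { MSG_ERROR, robot_id, ERR_I_AM_STUCK };
        return m;
    }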


The savings in complexity may enable the use of small amounts of memory (e.g., 8K) on some elements of the network. As an example, a lighthouse may have 8 KBytes of program memory. The point-to-point protocol may be simpler than ZigBee because it does not support routing of traffic from endpoints other than the home robot products, encryption of data packets, or many other features needed for meshing. Above this packet transport layer, robot-net may define messages that are specific to robot control and monitoring and which are unique (lighthouses may also be configured to use the protocol, although they may be turned on and off by the robot extra-protocol). This control is unique even if implemented such that ZigBee forms a portion of the application layer, as a non-limiting example.


At least one of the endpoints may be mobile. Instantaneous signal strength, or signal strength over time can be used to help the robot navigate or correct beacon-based navigation, e.g., by providing additional data informing the robot that it is proceeding in the correct direction or in a failure mode, as non-limiting examples.


With regard to a vocal and multimedia demonstration mode executed, e.g., just once when the mobile robot 104 is first used (with the accompanying data then being, e.g., discarded to liberate system resources), or at any time when an appropriate “demo” button is pushed, the demo mode may include several speech files scripted in sequence; the script is not necessarily an interpreted script, but may simply represent a linear routine coded in any suitable manner. The script may flash the lights and buttons visible on the mobile robot 104, make sounds, and cause the mobile robot 104 to do the things it is supposed to demonstrate (such as spot cleaning). The demo script may flash the lights or other indicia directly and run the behaviors directly, or could generate phantom presses/sensor events within the robot's computer control system to make the appropriate indicia and/or behaviors occur. For example, to start the spot cleaning demo, the voice routine could tell the user to press spot clean now (or could send a phantom button press to the UI control, which would flash the button and start the behavior as usual; other demos may be triggered by fake signals forwarded to a virtual sensor, for example, a demo of the stasis/stuck response by telling the mobile robot 104 it is stuck, etc.), then wait a certain period before reasserting control for the remainder of the demo. The demo could detect that the user pressed the wrong button and redirect him, as well, for example.


Examples of “robo-motions” (interchangeably, “animotions”) which may be transmitted either alone or as part of a theme or bundle may include new functional robot motions or behaviors such as spot-coverage, wall-following, and bounce operational modes, which may pertain to at least certain implementations of mobile robots in accordance with the present invention and are specifically described in U.S. Pat. No. 6,809,490, by Jones et al., entitled Method and System for Multi-Mode Coverage for an Autonomous Robot, the entire disclosure of which is herein incorporated by reference in its entirety.


In addition, in one example, the mobile robot 104 may be provided with a speaker for playback of audible content, a wireless or direct link to the network data bridge 202 for receiving the audible content, and a processor (for example, a speech synthesizer, MIDI chipset and/or frequency modulation (FM) unit, etc.) for replaying the audible content. The audible content may have significant functionality; in a non-limiting example, a warning siren sound may be downloaded to provide a clear signal when the mobile robot 104 detects a potentially hazardous condition such as overheating of a vacuum motor, or a series of slowly-spoken instructions may provide hard-of-hearing or disabled mobile robot users with a more understandable tutorial on the use of the mobile robot 104. Furthermore, the audible content and/or theme may provide instructions or other speech in any of a variety of human languages or dialects; in addition, any behavioral or movement-based content such as may be associated with or included in a regional, national, linguistic, cultural, occupational, character or other such theme may also be appropriately correlated. For example, a “ballerina” theme might include spoken instructions affecting an accent reminiscent of French, and a motion profile that causes the mobile robot 104 to perform pirouettes, spirals, figure eights, and other movements reminiscent of a dancer performing ballet, inter alia; this might also be associated with a body cladding set that would suggest a leotard, for example, to enhance the thematic effect.


The specific tying of certain content to certain behaviors is also made possible. For example, whenever the mobile robot 104 performs a certain “robo-motion” or trick (or, less sanguinely, gets stuck, for example) such as a game of chase, it might play the “William Tell Overture”; alternatively, it may issue plaintive wails for help when the mobile robot 104 detects that it is lost, cut off from communication with any beacon, the network data bridge 202 or a home base, or is otherwise stuck or stymied.



FIG. 5 is a schematic diagram showing an example of body panel themes for the mobile robot 104. The mobile robot 104 may include customizable, snap-on or otherwise interchangeable exterior panels 502a-b which may be “themed” to permit mobile robot users to individualize their mobile robots. As an example, body panels may be molded from plastic and thereafter painted, dyed, or covered with an adhesive material; any suitable modality for coloring, designing, or drawing patterns thereon may be used, as appropriate. As an example, a design may be applied to the interior of a transparent sheet-like polymer material, and then the polymer sheet may be applied to a molded plastic piece as a body panel; as a result, the design is protected by the transparent polymer sheet, while rapid theming of body panels in a “just-in-time” (JIT) distribution strategy may be achieved. The panels may also be customized with user content, e.g., printed by ink jet onto an appropriate material. For example, the user may upload a photo, which can be adapted to a template, and the JIT-made shell may be made and delivered at the appropriate time (e.g., a Christmas panel with sounds of a user's own family singing Christmas carols, uploaded to the server by the user earlier for inclusion into a modified or customized theme).


Furthermore, the interchangeable body panels 502a-b may include an identification system corresponding to audible or other thematic content that may be transmitted to the home robot to complete the theming effect. For example, the body panels may include an electrical contact 504 which is positioned along the interior edge of the panels 502a-b so as to contact a corresponding electrical contact on the mobile robot 104 when installed thereon. The electrical contact 504 on the body panels 502a-b is operatively connected to an appropriate electronic identification unit, such as an integrated circuit (IC) 506 that outputs a voltage pattern corresponding to a unique theme ID number, and/or a specific resistance 508 which likewise corresponds to particular theme IDs 510a-b. Alternatively, the body panels 502a-b may include an RFID or passive magnetic device; a mechanical ID mechanism such as a punch-card-like series of holes or peaks; an optical ID system such as a barcode or characteristic color; or any other modality suitable for identifying a body panel's corresponding theme. The ID can be transmitted by the mobile robot 104 back to the network data bridge 202 as authorization or identification to download corresponding themed firmware, multimedia, etc. to the mobile robot 104 as discussed herein.
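
As a sketch of how the panel identification might be read in the resistance-based variant, the following assumes a hypothetical ADC helper and an illustrative table mapping voltage bands to theme IDs; the specific values are not taken from any actual implementation.

    #include <stdint.h>

    /* Assumed hardware primitive: reads the panel contact voltage in millivolts. */
    extern uint16_t adc_read_panel_contact_mv(void);

    /* Hypothetical mapping from resistor-divider voltage bands to theme IDs. */
    typedef struct { uint16_t min_mv, max_mv; uint16_t theme_id; } panel_band_t;

    static const panel_band_t k_bands[] = {
        {  400,  800, 0x0001 },   /* hypothetical theme ID for one panel design  */
        { 1200, 1600, 0x0002 },   /* hypothetical theme ID for another design    */
        { 2000, 2400, 0x0003 },
    };

    /* Returns the detected theme ID, or 0 if no known panel is recognized. */
    uint16_t detect_panel_theme(void)
    {
        uint16_t mv = adc_read_panel_contact_mv();
        for (unsigned i = 0; i < sizeof k_bands / sizeof k_bands[0]; i++) {
            if (mv >= k_bands[i].min_mv && mv <= k_bands[i].max_mv)
                return k_bands[i].theme_id;
        }
        return 0;   /* unidentified: the caller may reject or accept per policy */
    }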


As a default behavior, if no identification of the body panel can be made by the sensors of the mobile robot 104, the mobile robot 104 may e.g., reject theme content as potentially unauthorized; or, conversely, accept any theme material if there is little concern regarding unlicensed theme content.



FIG. 8 is a schematic diagram showing an example of a robot system 800 including a vendor 802 (a vendor of audible or other content) and a manufacturer server 804. The system 800 acts as a content distribution system, in which the vendor 802 distributes licensed content to mobile robots (“consumer robots”) under the supervision of licensing and security-checking systems (“big brother” 806, “CRM” 808) as a back-end to a consumer-oriented website administered by a robot manufacturer (“E-Commerce Site” 810). In this example, “Rtoon” may signify distributable content, whether audio or other thematic material, for example.


Also, once an identification of the theme corresponding to an installed body panel is determined, the mobile robot 104 may wirelessly transmit information regarding the detected theme ID to the mobile robot manufacturer's server via the wireless robot network data bridge 202 and the Internet, for example. The server may then determine whether audible or other content corresponding to the transmitted theme ID is available, and if so, whether the corresponding content is properly paid for, licensed, etc. If all determinations are affirmative, the server may transmit the corresponding content (such as a set of audio data arranged as a sound theme and/or a set of robot “dance” commands, indicia patterns, etc.) to the mobile robot 104 via the Internet and the network data bridge 202, for example; alternatively, the server may transmit an “unlock code” or cryptographic key for decoding encrypted and/or otherwise restricted content already present at the mobile robot 104; or, for example, the mobile robot manufacturer's server may cause a separate content server (e.g., belonging to a third party and under a licensing and electronic distribution agreement with the mobile robot manufacturer, for example) to transmit such data to the appropriate mobile robot which sent the theme ID.



FIG. 4 is a schematic diagram showing an example of a robot system 400 including mobile robots 104a-c, in which the computer 206 transmits themes to the mobile robots 104a-c. Mobile robot users may receive new or updated musical, sound, visual, choreographic or other such thematic content which corresponds to the themed body panels installed on their mobile robots. A website 402 displayed on the personal computer (PC) 206 offers a choice of three sets of music contents, ‘A’ 404a, ‘B’ 404b, or ‘C’ 404c, for transmittal to or activation on the mobile robots 104a-c.


Also, the theming may even extend to the content within themes. For example, if a robot has several audio files or “earcons” loaded (e.g., in a format such as MIDI), and the user then selects a “steel drum” theme, the theme might include musical instrument elements (e.g., also encoded as MIDI instruments or another suitable format) which replace the standard instruments that would normally be played in the earcons or sound files; as such, a rock ballad may be “themed” into a Caribbean anthem, as a non-limiting example.


With regard to the themed content, combinations of various interdependent (or independent) content types (e.g., including, but not necessarily limited to, sounds, body cladding, choreographed “dance” moves, behavioral response profiles, or the like, as non-limiting examples) are possible to permit a high degree of thoroughness in the impact themes may have on users. Examples include: sounds such as background music to be played while the home robot either sits idle or performs tasks; “earcons,” that is, sounds that convey meaning and which are played back when triggered by certain events or behaviors, such as the robot playing a car-crash “earcon” when the robot bump sensor detects a collision with an obstacle, or the phrase “feed me!” as an earcon when the robot's detergent or other consumable bin is empty, inter alia; daily or monthly activity or activation schedules, such that a “family dog” themed robot may cease hibernation or recharging mode when a sound corresponding to the delivery of the morning newspaper is detected, and then proceed to the user's bedroom to “bark” and cavort about in figure-eight or semi-random movement patterns (or to “wag its tail” by rotating the hind portion of the robot repeatedly back and forth) adjacent the user's bed to wake the user or to get the user's attention in a manner reminiscent of an excited dog; a “listening” behavioral routine which may cause the robot to, for example, behave as a parrot (e.g., in combination with other parrot-themed content such as plumage-like body cladding, “whistling” sounds and a parrot-like gait for movement) and repeat back recently spoken words that are detected by an on-board microphone (e.g., in a distorted, squawking manner reminiscent of a true parrot, for example); and/or network-access patterns for robots equipped to communicate via a wireless network (for example, a “spy” theme might, in combination with various other thematic content, include a network access profile that might cease all network access for various periods of time when the “spy” robot is expected to maintain “complete radio silence”; or, conversely, the robot may be conditioned to frequently retrieve updated information from a network source in order to fulfill a role as a trivia quizmaster by downloading questions and/or answers to be posed to guests of the user while playing a party game, for example; indeed, such quiz questions and answers or the rules of a game might themselves be bundled into a theme in static format, as a further example of the broad extent of theme-able content types). Also, content such as the “robo-motions” (“dance” moves or distinctive motions performed by wheeled or otherwise mobile home robots) may be triggered on command by way of voice command recognition, etc., on the robot, such that the user may clap her hands or verbally instruct the robot to “beg” or “stand still” or “dance the Lindy Hop,” and the robot would comply by performing the associated robo-motion, for example.


Functional themes may enhance the primary purpose (or any other purpose) of the mobile robot, as well—for a non-limiting example, a “super robot cleaner” theme might include behavioral patterns that cause the home robot to detect spots on the floor and to spend a particular proportion of cleaning time vacuuming or washing the spots discovered as a result of the theme.


User commands to initiate actions such as power on/off, start, stop or change a cleaning mode, set a cleaning duration, program cleaning parameters such as start time and duration, and/or many other user-initiated commands, functions, and/or components contemplated for use with the present invention are specifically described in U.S. patent application Ser. No. 11/166,891, by Dubrovsky et al., filed on Jun. 24, 2005, entitled Remote Control Scheduler and Method for Autonomous Robotic Device, the entire disclosure of which is herein incorporated by reference in its entirety.


For another example of a type of theme which encompasses a variety of behavioral, audio, visual and other types of eclectic content, a mobile robot may be themed as a chess piece, such theme to include not only distinctive body cladding (with different types possible such as “knight,” “rook,” “pawn,” etc.) and, e.g., chess-themed music and sounds, but also, e.g., a network behavior so as to coordinate with a central server (or possibly to “swarm” with several other home robots also acting as chess pieces) so as to adopt the role of a particular piece on a chessboard, in which a user or users bring a plurality of home robots and arrange them in an environment simulating a chess board and command the home robots to move as chess pieces during a chess game; this high-level “chess” theme thus may also include the rules of chess and behaviors and movement patterns (as well as network routines and functions) of various chess pieces, as well as visual and/or audio content and the like, for example.


As illustrated by the above-discussed non-limiting examples, the content types that may be used and combined into theme packages may encompass a broad range of material, virtually as broad as the range of potential capabilities performable by the mobile robot 104, for example. The present inventors intend that the examples of associated themed content given herein can be generalized for the purposes of the invention according to their readily recognized type.


The chess and trivia game examples are examples of providing associated themed content linking at least two of a predetermined game rule set, game piece or paraphernalia set, game appearance, and game sounds.


The parrot and dog examples are examples of providing associated themed content linking at least two of a predetermined entity (i.e., animal or person) motion set, appearance, and sounds. This would extend to celebrities and so-called “licensed properties” linked to well-known entertainment programs or books, characters and the like.


The ballet example is an example of providing associated themed content linking at least two of a predetermined dance move set, paraphernalia, appearance, music, and sounds.


The country-and-western example below is an example of providing associated themed content linking at least two of a musical genre move set, paraphernalia, appearance, music, and sounds.


Mobile robot users with Internet-capable personal computers, cell phones, PDAs, and other devices may also browse the home robot manufacturer's server via a web site and select themes, sounds, tones, “dances,” software, or other suitable mobile robot content for download and/or purchase. (For example, a user interested in quelling potential sources of RF interference in her home, or in conserving bandwidth for other tasks, might purchase low-RF-noise network profile content from the manufacturer's website.) The user interface presented to the user may also be customized to match the robot theme; that is, a theme may include multimedia content that can be manifested on the robot but also on an online interface that is associated with the robot having the theme, and with which the user interacts when using the online interface and/or the robot.


Also, users may select themed body panels, base stations, batteries, accessories, new home robots, data bridges, etc., from the web site, and have the items shipped to their home. Items such as body panels may be ordered in bulk in blank form by the manufacturer or reseller who operates the web site, which may then apply themed designs rapidly and “just-in-time” after (or before, when sales prediction analysis is properly applied) receiving an order from a home robot user.


Furthermore, themed items may be accompanied by a CD-ROM, floppy disk, “memory stick”-type data bridge, or other data medium in order to transmit the appropriate corresponding themed content to the home robot upon installation of the ordered themed item. Alternatively, the mobile robot manufacturer or reseller may omit shipping data media along with physical items, and instead offer Internet-based transmission of the corresponding themed content (via the wireless robot network data bridge, for example), or may do so when orders are received from customers whom the manufacturer or reseller has on record as having previously bought the network data bridge 202, for example (or when records show that the customer already has the most up-to-date version of the appropriate themed content). The mobile robot manufacturer or reseller may reduce shipping charges when it is known that the ordering customer has the ability to update the mobile robot 104 via the network data bridge 202, for example.


Moreover, by virtue of the online ordering system and the manufacturer's or reseller's website, customers may be offered a variety of functional and thematic combinations for body paneling, sounds and music, “dance” routines, and indicia flash patterns (and/or one-off or single-item offerings in a “mix-‘n’-match” mode, such as a Country/Western paneling-themed robot with a rock‘n’roll “dance” routine and a classical piano sound theme, in a non-limiting example). For example, the Country/Western theme body panel 502a is linked to music content ‘A’ 510a; and the piano theme body panel 502b is linked to music content ‘B’ 510b (which would usually be piano-related). Additionally, accessory replacement and service reminders tied to usage can be contemplated, e.g., reminders to replace the batteries after a certain number of recharge cycles or hours expended. The online service may be configured to enter a recommended replacement part (to replace a part recorded as having accumulated sufficient cycles or time to be worn according to the uploaded data) or a consumable material such as detergent, cat food, lubricant or any other such material (to augment the stock of the consumable material recorded as needing replenishment) into the user's online shopping cart or one-click purchase queue, as non-limiting examples.


Other aspects of such a website may be handled in a conventional manner, by permitting online credit card or check payment, etc. As an advantage, customers may place orders for a customized home robot with their choice of complete theme or a “mix-‘n’-match” robot (e.g., a male lion vs. a female lion) personalized to the users' own variegated tastes and styles. Moreover, by using laser printing or other modality for applying digital images and/or writing to the polymer sheet panel coverings discussed above, users may be offered the option to apply slogans, names, or any arbitrary writing and/or images to be printed on their home robots, for example.


A further example permits the user to create or record his or her own sounds or music content and to transmit this custom content to his or her home robot. In order to address licensing and unauthorized duplication concerns, the home robot manufacturer may employ a media and/or authoring software protection scheme, for example, such that only licensed copies of the authoring software will function properly and such that only content produced on a properly licensed copy of the manufacturer-provided software will correctly play back on that manufacturer's home robots, for example. As a non-limiting example, public-key encryption techniques may be applied in which each robot receives a public key known to the user (such as a serial number, for example), and a private key known only by the manufacturer. Accordingly, when a home robot user purchases a copy of the content authoring software from the manufacturer, the copy that the home robot user receives may “watermark” its output content with the encryption key such that only the particular user's home robot can play back the output content, as a non-limiting example. Other encryption or protection schemes may be employed to allow wider or narrower distribution, as appropriate to business and license/copyright protection concerns.


As a further example, users may be offered a content subscription service in which a number of themes and/or audible or other content are made available per month or other time period to the user who purchases the subscription. As an advantage, users can be assured of the integrity of the content they download and copyright concerns can be addressed.



FIGS. 9A-C are state diagrams showing examples of state machines 900, 930, and 960 for the mobile robot 104, the lighthouse peripheral device 102, and a remote control peripheral device, respectively. The Robot Personal Area Network (RPAN) protocol used by the network data bridge 202, the mobile robot 104, and the peripheral device 102 can be used in many ways as defined by applications.



FIG. 9A shows a high level state diagram that serves as a reference for the following discussion. The mobile robot 104 is an RPAN master responsible for several tasks, such as providing a unique address to isolate communications with its peripherals from other RPAN systems, deciding on a radio channel to use, operating on the common channel as necessary to report this operational channel, and transmitting a beacon which defines time windows that peripherals should use to communicate.


When the mobile robot 104 is conserving power in its dormant state 902 or charging, the RF network is inactive, meaning that no beacon is transmitted. While in this state 902, the mobile robot 104 can be woken up over RF by executing the following steps in a constant loop.


1. Turn on radio on the Common Signaling Channel (CSC).


2. Send “Activate Invite” broadcast message.


3. Listen for “Activate Request” message for up to 30 milliseconds.


4. Receive “Activate Request” message and leave Dormant state. Or, turn off radio and sleep for 970 milliseconds.
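
A sketch of this once-per-second wake-to-RF loop, assuming simple radio and power primitives (rf_on_csc, rf_broadcast, rf_wait_for, rf_off, sleep_ms) that are not defined in the text, is as follows:

    #include <stdbool.h>
    #include <stdint.h>

    /* Assumed radio and power primitives. */
    extern void rf_on_csc(void);                            /* radio on, common channel  */
    extern void rf_broadcast(const char *msg);
    extern bool rf_wait_for(const char *msg, uint16_t ms);  /* true if message received  */
    extern void rf_off(void);
    extern void sleep_ms(uint16_t ms);

    /* Dormant-state loop: invite peripherals to wake the robot once per second. */
    void dormant_loop(void)
    {
        for (;;) {
            rf_on_csc();                                /* 1. turn on radio on the CSC   */
            rf_broadcast("Activate Invite");            /* 2. send the broadcast invite  */
            if (rf_wait_for("Activate Request", 30)) {  /* 3. listen for up to 30 ms     */
                rf_off();
                return;                                 /* 4. request received: leave    */
            }                                           /*    the Dormant state          */
            rf_off();
            sleep_ms(970);                              /* 4. otherwise sleep for 970 ms */
        }
    }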


Therefore, every second the mobile robot 104 invites peripheral devices, such as the peripheral device 102, to wake it up. A peripheral wanting to wake up the mobile robot 104 will listen for up to a second for the “Activate Invite” message and respond promptly with the “Activate Request” message, which wakes up the mobile robot 104.


When the mobile robot 104 has been woken up to active scan state 904, it checks whether its radio channel is still valid. If the robot sleeps for more than 10 minutes, the radio channel will be reselected. This time is chosen to safely exceed any session-oriented timers. The first step in reselecting a channel is to actively scan for other RPAN masters and exclude their channels from the set of acceptable channels.


An active scan is performed by sending two “Ping” messages on the CSC to the broadcast RPAN address. The mobile robot 104 listens for 30 ms after each message for “Ping Response”. Each “Ping” message is separated by 360 ms.


After ruling out radio channels through active scanning, the mobile robot 104 moves to energy scan state 906, in which candidate channels are scanned for energy levels in order of preference. On a given channel, 100 energy level samples are obtained in about 100 ms. The first channel to have an average energy level below an acceptance threshold is chosen as the new operational channel. In the unlikely event that no channel meets these criteria, one is randomly chosen.
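
Combining the active scan and the energy scan, channel reselection might be sketched as follows; the scan primitives and the acceptance threshold value are assumptions for illustration.

    #include <stdbool.h>
    #include <stdint.h>
    #include <stdlib.h>

    #define NUM_CHANNELS            16
    #define ENERGY_ACCEPT_THRESHOLD 40   /* illustrative units */

    /* Assumed primitives: the active scan sends two Pings on the CSC and returns
       the operational channels advertised by any other RPAN masters that answer;
       the energy scan averages ~100 samples taken over ~100 ms on one channel.   */
    extern uint8_t  csc_active_scan(uint8_t *busy_channels, uint8_t max);
    extern uint16_t energy_scan_avg(uint8_t channel);

    /* Returns the new operational channel. */
    uint8_t reselect_channel(const uint8_t *preferred, uint8_t n_preferred)
    {
        bool excluded[NUM_CHANNELS] = { false };
        uint8_t busy[NUM_CHANNELS];
        uint8_t n_busy = csc_active_scan(busy, NUM_CHANNELS);

        /* Exclude channels already used by other RPAN masters. */
        for (uint8_t i = 0; i < n_busy; i++)
            if (busy[i] < NUM_CHANNELS)
                excluded[busy[i]] = true;

        /* Energy-scan the remaining candidates in order of preference. */
        for (uint8_t i = 0; i < n_preferred; i++) {
            uint8_t ch = preferred[i];
            if (!excluded[ch] && energy_scan_avg(ch) < ENERGY_ACCEPT_THRESHOLD)
                return ch;   /* first sufficiently quiet channel wins */
        }

        /* Unlikely case: nothing met the criteria, so pick a channel at random. */
        return (uint8_t)(rand() % NUM_CHANNELS);
    }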


When the mobile robot 104 is operating its RF network normally, it is in normal state 908 and transmits a beacon every 720 ms, an interval called the “beacon period”. The beacon message advertises a window of time following the beacon during which it is valid for devices to communicate. This “contention access period” is set to 220 ms in the normal mode. While not in the contention access period, the robot operates on the common channel to answer “Ping” messages with “Ping Responses” advertising the radio channel on which the robot is operating.


The motivations behind the specific beacon and contention access periods chosen are as follows: to keep beacon tracking overhead low, to keep radio power consumption low, and to allow peripherals with very inaccurate clocks to reliably find robots. This final goal is satisfied by defining one more time constant, the “ping separation period”. If a peripheral sends two pings separated by a nominal 360 ms, the real separation, assuming the clock is accurate to plus or minus 30%, is anywhere between 252 ms and 468 ms. On the low side, 252 ms is sufficiently long that both pings will not occur while the mobile robot 104 is on the operational channel. On the high side, 468 ms is smaller than the 500 ms during which the mobile robot 104 is listening on the common channel, guaranteeing that one of the pings will be received during that time. There are other combinations of values that work. Also, with higher clock accuracy the contention access period duty cycle can be higher. These values can be recalculated for other systems based on their needs.


The 500 ms when the mobile robot 104 is operating on the common channel represents a dead time that can be unacceptable at times. One such time is when the mobile robot 104 is being driven remotely. Another is when the mobile robot 104 sensors are being monitored for diagnostic purposes. When low latency state 910 is needed, a peripheral may send a “Low Latency Request” message which contains a byte representing the number of seconds that low latency mode should be used. The low latency time can be refreshed with subsequent messages. Also, the mobile robot 104 itself may switch to low latency mode.



FIG. 9B shows the state diagram 930 that serves as a reference for the following discussion. In this section, message flows between the mobile robot 104 and the lighthouse peripheral device 102 are illustrated. A peripheral device, such as the lighthouse peripheral device 102, may be a simple slave device.


The slave 102 begins in a low-power-consumption state 932 designated as “Free” in the state diagram 930. In this state 932, it wakes up periodically and attempts to join a robot network. It does this by setting its channel to the common signaling channel (the CSC is the 5th channel). It then sends a broadcast message to ask any robots listening on this channel to respond. The response from a robot hearing this message advertises a network with an ID on an appropriate channel (zero-based numbering). This is the same active scanning process described above. The lighthouse 102 will get zero or more responses in the two 30 ms windows of time during which it listens after sending the requests. If none are received, it will go back to sleep and perform another active scan in 4 seconds. If one or more are received, it will choose to join the network of the mobile robot whose response message was received with the greatest signal strength value.


If the lighthouse 102 has ever received a Join Accept message from a robot, that robot's RPAN ID is used instead of the broadcast address in the ping message. In this way, the lighthouse 102 will not waste power waking up for a robot that is in RF range but not its owner, e.g. the neighbor's mobile robot.


If the lighthouse 102 wants to join a robot's network and does not have an assigned address, the lighthouse 102 will randomly select a MAC address (marked as “Soft” in the MAC header) to use temporarily until the robot assigns one to it.


In “Seeking” state 934, the lighthouse 102 changes channels and listens for the beacon emitted periodically by the mobile robot 104. It should pick this up within a few seconds at most. If this does not happen, a timeout (30 seconds) will send it back to the “Free” state 932.


If all goes well and the beacon is found, the lighthouse 102 will advance to “Binding” state 936. In the “Binding” state 936 and subsequent states, the lighthouse 102 will filter packets from other robots and monitor its link to the RPAN network from the MAC layer beacon tracking. These are shown in the state diagram as “Link Up” and “Link Down” events.


Upon entering this state 936, the lighthouse 102 will send a “join request” message. This starts a timer within which the lighthouse 102 must be accepted into the network (5 minutes); if that timer expires, the lighthouse 102 will return to “Free” 932. This 5-minute time period is known to both the robot 104 and the lighthouse 102 so that each can expire its pending request. Whenever the robot 104 receives a join request that causes a collision of soft MAC addresses in its table, it will send a join rejection message that does not need to be acknowledged, and the entry will not go into the table. The lighthouse 102 (and perhaps the other lighthouse with the colliding MAC address) will follow the Join Fail event on the state diagram, which results in regenerating a MAC address and trying the bind again.


When the robot 104 receives a join request message and wants to delay the binding until another handshake is performed as is the case with lighthouses, it sends a join pending message. A join pending message is needed if an acceptance or rejection will not be sent within 3500 ms.


While acceptance is pending, the lighthouse 102 will transmit a 4-bit code in the confinement beam (11) which indicates that it has not bound to the robot. When the robot 104 runs into a code 11 beam, it stops and looks at its list of lighthouses requesting bindings. For each entry, it issues a command to wink the code to 12. If that command is not acknowledged or the beam change is not seen, the lighthouse 102 is not in range, and the robot 104 moves on to the next entry in the list. If the robot 104 succeeds in seeing the beam, it sends a join accept message which moves the lighthouse 102 into Active state 938 where it obeys beam commands requested by the master. The beam command message contains the state of beams as well as the 4-bit code that should be in the beam.


While a lighthouse is in the binding state 936, it will probably lose contact with the robot 104 as it moves around a room and throughout a house. Losing the beacon for over 2 minutes returns the lighthouse 102 to the “Free” state 932 where the beams are off and power is being saved. When the robot 104 is back in range, the binding procedure is skipped since the assigned address is still valid.


After the lighthouse 102 has been bound to the robot 104, it will probably lose contact with the robot 104 as it moves around a room and throughout a house. A “Roam Recover” state is designed so that the binding process does not have to be repeated following each of these expected losses of communication. A gross timeout of 90 minutes is shown in the state diagram which puts the lighthouse back in a state where re-binding is necessary. The MAC address assigned is now considered expired.
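
The lighthouse states and timeouts described above might be captured in a small transition table such as the following sketch; the event names and the mapping of the 2-minute, 5-minute, 30-second, and 90-minute timeouts onto discrete events are assumptions made for illustration.

    /* Lighthouse states as named in FIG. 9B. */
    typedef enum { ST_FREE, ST_SEEKING, ST_BINDING, ST_ACTIVE, ST_ROAM_RECOVER } lh_state_t;

    /* Hypothetical event names covering the transitions described above. */
    typedef enum {
        EV_SCAN_RESPONSE,   /* a robot answered the active scan                      */
        EV_BEACON_FOUND,    /* link up: the robot's beacon has been acquired         */
        EV_JOIN_ACCEPT,     /* the robot granted the join request                    */
        EV_JOIN_FAIL,       /* soft MAC collision: regenerate the address and rebind */
        EV_LINK_DOWN,       /* beacon lost (in Binding, after the ~2 minute loss)    */
        EV_SEEK_TIMEOUT,    /* ~30 seconds without finding the beacon                */
        EV_BIND_TIMEOUT,    /* ~5 minutes without being accepted                     */
        EV_ROAM_TIMEOUT     /* ~90 minute gross timeout; the address expires         */
    } lh_event_t;

    /* Transition function for the lighthouse, following the description above. */
    lh_state_t lh_next_state(lh_state_t s, lh_event_t ev)
    {
        switch (s) {
        case ST_FREE:
            return (ev == EV_SCAN_RESPONSE) ? ST_SEEKING : ST_FREE;
        case ST_SEEKING:
            if (ev == EV_BEACON_FOUND) return ST_BINDING;
            if (ev == EV_SEEK_TIMEOUT) return ST_FREE;
            return ST_SEEKING;
        case ST_BINDING:
            if (ev == EV_JOIN_ACCEPT)  return ST_ACTIVE;
            if (ev == EV_JOIN_FAIL)    return ST_BINDING;   /* new soft MAC, rebind     */
            if (ev == EV_BIND_TIMEOUT || ev == EV_LINK_DOWN) return ST_FREE;
            return ST_BINDING;
        case ST_ACTIVE:
            return (ev == EV_LINK_DOWN) ? ST_ROAM_RECOVER : ST_ACTIVE;
        case ST_ROAM_RECOVER:
            if (ev == EV_BEACON_FOUND) return ST_ACTIVE;    /* binding is skipped       */
            if (ev == EV_ROAM_TIMEOUT) return ST_FREE;      /* MAC address now expired  */
            return ST_ROAM_RECOVER;
        }
        return s;
    }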


The binding process is designed to obviate the need to assign static MAC addresses to simple devices, of which there can be multiple talking to a robot at once. The assignment of addresses by the robot 104 can simply amount to cycling through a list of valid addresses. Having the assigned MAC addresses expire some time after the bind greatly reduces the chance that the user could cause a configuration error.


For example, if there were a procedure the user needed to follow to assign MAC addresses to the lighthouse 102 (e.g., install the batteries, place it in front of the robot, and hit a button sequence on the robot), he might do this successfully for the two included in the initial package. If the robot 104 ever forgot the last one assigned due to a code update or software problem, a conflicting address might be assigned in the future if the user purchased an additional lighthouse later. Or, if the user replaces the robot 104 and then uses it to configure a new lighthouse, a conflict is very possible. Having the lighthouse MAC addresses expire tends to heal all such problems. One drawback to expiring addresses is that the memories of what lighthouses the robot 104 encountered while cleaning are forgotten. These memories are potentially useful in having the robot 104 clean different rooms on different days. In either case, the age of a MAC address is specified in the “join accept” message, giving the robot 104 (hence future software revisions of the robot 104) the freedom to make such decisions.



FIG. 9C shows the state diagram 960 for the remote control. The remote is used to drive the robot 104 around and to program its schedule. The remote control has a group address and does not require a numeric address.


From power saving state 962, the pressing of a button triggers seeking state 934 and the search for robots on the common channel. The search is performed at a very low power setting if the RPAN ID stored in non-volatile memory is blank. Otherwise, full power is used. In this way, a robot in very close proximity will respond to an unpaired remote. The search can be described by the following loop which is executed continually until a robot is found or until the remote puts itself back to sleep due to inactivity.


1. Operate radio on CSC responding to Activate Invite messages (1 second).


2. Perform 1 active scan (360 milliseconds).
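
A sketch of this search loop, assuming hypothetical radio primitives for serving “Activate Invite” messages and performing the active scan, is as follows:

    #include <stdbool.h>
    #include <stdint.h>

    #define RPAN_ID_UNPAIRED 0x0000   /* assumed sentinel for a blank stored RPAN ID */

    /* Assumed radio primitives for the remote control. */
    extern void rf_set_tx_power_low(bool low);
    extern void rf_serve_activate_invites_ms(uint16_t ms);   /* answer invites on the CSC */
    extern bool rf_active_scan(uint16_t rpan_id, uint8_t *robot_channel);  /* ~360 ms     */
    extern bool buttons_idle_timeout(void);                  /* true when the remote quits */

    /* Search loop run after a button press wakes the remote from its power-saving state. */
    bool remote_find_robot(uint16_t stored_rpan_id, uint8_t *robot_channel)
    {
        /* An unpaired remote searches at very low power so only a very close robot responds. */
        rf_set_tx_power_low(stored_rpan_id == RPAN_ID_UNPAIRED);

        while (!buttons_idle_timeout()) {
            rf_serve_activate_invites_ms(1000);                 /* 1. wake any dormant robot */
            if (rf_active_scan(stored_rpan_id, robot_channel))  /* 2. one active scan        */
                return true;                                    /* proceed to binding        */
        }
        return false;   /* no robot found; the remote goes back to sleep */
    }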


If the active scan collects responses, the remote moves to binding state 936 and the robot with the highest signal strength is selected. The remote switches to the robot's channel and acquires link by tracking the beacon. It then sends a ping message to itself. If it gets a response, that means another remote control is using the group address. If no response is received, the remote enters active state 938 and is allowed to control the robot 104.


If the remote successfully communicates with the robot 104 on the operational channel, that robot's RPAN ID is programmed into the remote control's non-volatile memory. The remote control communicates with the robot 104 as long as it is awake and buttons have been pressed recently (within 60 seconds). If the beacon is lost for over 10 seconds, which is how Link Down is configured on the remote, it tries to find a robot again.


A paired remote control can be unpaired by installing the batteries with the left drive button depressed and keeping it held down for three seconds. It is then paired as part of the robot discovery algorithm described above.


Driving the robot 104 and operating its user interface remotely is accomplished by sending button states to the robot 104 and receiving LED states from the robot 104. These are updated when they change as well as at a refresh interval. In this way, the remote can be thought of as a dumb terminal.


The following describes the design of the RF communications system for a robot system, such as the robot systems 100 and 200. The communications system performs the following: wake-to-RF for lighthouses and robots; remote control and lighthouse beam control commands; low power consumption; a small RAM/ROM footprint; code and sound download; coexistence with common interferers found in such environments; coexistence with other robot systems in proximity, as will be common in the mobile robot 104 development labs and some home environments; and a simple growth path at each layer of the networking hierarchy.


The RF communications stack to be used on the robot systems 100 and 200 is discussed in a layer-oriented approach, starting from the lowest layer and ending with the highest. The approach is based on the seven-layer Open Systems Interconnection (OSI) Reference Model.


The physical layer uses the 2.4 GHz direct sequence spread spectrum (DSSS) modem as is specified in IEEE 802.15.4. The physical layer supports the following features: 16 available channels; energy detection (ED) provided on demand; clear channel assessment (CCA) using energy, carrier sense or both; and link quality indication (LQI) provided with packet reception.


The MAC layer provides the ability for a device to transmit broadcast and uni-cast messages to other devices within radio range. This does not preclude any topology from being supported in the future; however, layers above this MAC layer will impose restrictions. The MAC layer supports the following features: a single master and multiple slaves; a master that sends a beacon containing the beacon period and active period, which allows a slave device to track the beacon and know when to listen and when to save power; slave tracking of the beacon to establish link status; the ability of the master to be told by higher layers to listen on the network establishment channel during idle periods of the beacon; a 16-bit Robot Personal Area Network Identifier (RPAN ID) to allow devices to filter packets not on the robot network of interest when a channel is shared; group identifiers in addresses that allow broadcast to specific device types and obviate the need for unique MAC addresses for many types of peripherals; a collision avoidance algorithm using CCA and random back-off; and reliability through acknowledge requests and automatic retry.


Including acknowledgement in the MAC layer follows IEEE 802.15.4. This may bring the MAC layer up to par with wired media such as half-duplex Ethernet, where collision detection can be used to give the sender a high level of confidence that the packet arrived at the destination. Network layer acknowledgement schemes may be needed when bridges and routers between the sender and receiver have the potential to drop packets due to resource constraints. Either MAC layer or network layer acknowledgement can be made to suit the needs of this network.


The MAC layer acknowledge is time sensitive since there is no addressing information contained in the packet. If the acknowledgement is sent very quickly, it is unlikely to collide with a new data packet or be confused as an acknowledgement to the new data packet. The sequence number reduces the chances of processing the wrong ACK.


An acknowledgement at the network layer is not as time sensitive since the packet contains addressing information. However, more time is wasted sending this extra information and the latency is worse as information is passed between layers. More state information is potentially needed to remember which packets have not been acknowledged unless head of line blocking is used.


Time-critical processing of packets is not desirable, but there may be situations in which it is needed. If another robot or an IEEE 802.15.4 device is operating on the same channel, the receiver may need to promptly parse and discard a valid packet not intended for it. To the extent that it delays, it risks not listening when a packet intended for it arrives. After factoring this in, it may be appropriate to include the ACK and retry feature at the MAC layer and take steps to mitigate the imposed real time constraints.


Again, an acknowledgement scheme implemented at the MAC or network layers can be made to work. If the MAC layer proves problematic, due to any of the concerns expressed above, the acknowledgment scheme can be implemented at the network layer.


The network layer is responsible for establishing membership in a network. The roles of a master device and a slave device are different at this layer. The network layer supports the following slave features: network discovery using low power active scanning on the common channel; the ability to issue requests to join a network using a temporary random MAC address; and the ability to participate in a network without any joining transaction if the MAC address is known. The network layer supports the following master features: channel selection, when the network is started, based on the best available channel; and management of join requests sent on the common channel, including assignment of MAC addresses to slaves that are using temporary ones.


The 16 available channels are discussed in a zero based manner (0-15). Channel 4 does not get 802.11b interference in the USA or Europe. As such, it is chosen as the common signaling channel used for network joining procedures.


The defined MAC layer draws on IEEE 802.15.4. Some concepts borrowed include the CSMA-CA algorithm, the reliability feature, and, to some extent, the beacon concept. The PAN coordination features are replaced with a method more targeted to the more limited needs of the higher layers.


The MAC layer is based on the presence of a master which generates beacons defining a period of active communications during which any device can talk to another device. Slave devices track this beacon to determine whether the robot is present and when they can save power. The mobile robot 104 is the master device and is responsible for transmitting the beacon and managing slave devices. Slave devices track the beacon of the master so they know when they should listen for the next beacon, when they can communicate with other devices, and when they should turn off their RF modems to save power.


The MAC layer header includes a field designed to conflict with the IEEE 802.15.4 frame type field so that such a MAC device should reject it as an invalid frame type, and is otherwise designed to allow multiple RPANs to share a single channel. The RPAN ID field is in a fixed location in the packet, so a receiver can filter on a particular RPAN much like a Virtual LAN (VLAN) in Ethernet.
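
One possible fixed header layout consistent with this description is sketched below; apart from the deliberately invalid frame type and the fixed-position 16-bit RPAN ID, the field order, widths, and names are assumptions.

    #include <stdint.h>

    /* Illustrative RPAN MAC header; field order and widths are assumptions, except
       that the frame type occupies the position IEEE 802.15.4 uses for its frame
       type (with a value 802.15.4 treats as invalid) and the RPAN ID sits at a
       fixed offset so that receivers can filter on it, much like a VLAN tag.      */
    typedef struct __attribute__((packed)) {
        uint8_t  frame_type;   /* chosen to be rejected by IEEE 802.15.4 devices   */
        uint8_t  sequence;     /* matches ACKs and detects missing beacons         */
        uint16_t rpan_id;      /* 16-bit Robot Personal Area Network Identifier    */
        uint8_t  dst_addr;     /* destination; may carry a group identifier        */
        uint8_t  src_addr;     /* source; possibly a temporary "soft" MAC address  */
    } rpan_mac_hdr_t;

    /* Receive-side filter: drop frames that belong to a different robot network. */
    static inline int rpan_frame_is_for_us(const rpan_mac_hdr_t *h, uint16_t my_rpan)
    {
        return h->rpan_id == my_rpan;
    }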


Beacons are transmitted by a master at periodic intervals. One reason is to embed information about when the slave devices should expect to exchange messages with the master. This duty cycle control allows some level of power saving even during active operational modes. The second reason for transmitting beacons is to provide constant status on the proximity of the robot. The goal is to unburden application layer software from doing this task.


A beacon is sent periodically at an interval given by the Beacon Period, which is specified in the beacon itself. So, a slave receiving a beacon knows when to expect the next one. An Access Period is specified in the beacon as well. This dictates the period of time during which the master will be active on the channel. Slaves should pay attention during this time and may shut down their receivers at other times. The sequence number in the beacon allows the slave to detect one or more missing beacons.


When a master specifies a small active period relative to the beacon period, the master gains the opportunity to spend the idle time listening on the CSC to admit new peripherals into the network. As such, the beacon periods may be set in a manner related to the period that peripherals use to wake up and try to join a network.


A typical beacon period may be on the order of a second. The jitter of the beacon message is relatively high considering the random nature of the back-off algorithm. Also, the slaves should not be burdened with having to manage events with a high level of temporal precision. Subject to the clocking requirements discussed later, the slave should define a “beacon window” which is a time period centered on the next expected time a beacon will be received. The slave should listen for the beacon during this window. The window ends when the expected beacon is received, ideally. If no beacon is received, the window ends, but the slave operates during the access period as if it received one. When a beacon is missed in this way, the window is lengthened for the next beacon since clock inaccuracies add. Once too many beacons have been missed, a loss of beacon is declared and the slave just listens constantly until it reacquires it. The loss of beacon condition is analogous in the Ethernet world to losing link. The master transmits beacons with a time accuracy of less than 0.1%.
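
The beacon-window behavior described above might be sketched as follows; the window half-width, the per-miss widening, and the miss limit are illustrative values, not values taken from the text.

    #include <stdbool.h>
    #include <stdint.h>

    #define BASE_WINDOW_MS      20   /* illustrative half-width of the beacon window    */
    #define WINDOW_GROWTH_MS    10   /* widening per consecutive miss (clock drift)     */
    #define MAX_MISSED_BEACONS   4   /* after this many misses, declare loss of beacon  */

    typedef struct {
        uint32_t next_beacon_ms;     /* expected arrival time of the next beacon */
        uint16_t beacon_period_ms;   /* learned from the beacon itself           */
        uint8_t  missed;             /* consecutive beacons not received         */
        bool     link_up;
    } beacon_tracker_t;

    /* Called when a beacon is received at time now_ms. */
    void beacon_received(beacon_tracker_t *t, uint32_t now_ms, uint16_t period_ms)
    {
        t->beacon_period_ms = period_ms;
        t->next_beacon_ms   = now_ms + period_ms;
        t->missed           = 0;
        t->link_up          = true;
    }

    /* Called when the listen window closed without a beacon being received. */
    void beacon_missed(beacon_tracker_t *t)
    {
        t->next_beacon_ms += t->beacon_period_ms;    /* act as if one had arrived */
        if (++t->missed >= MAX_MISSED_BEACONS)
            t->link_up = false;                      /* analogous to losing link  */
    }

    /* Half-width of the next listen window; widens as clock inaccuracies add up. */
    uint16_t beacon_window_ms(const beacon_tracker_t *t)
    {
        return (uint16_t)(BASE_WINDOW_MS + (uint16_t)t->missed * WINDOW_GROWTH_MS);
    }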


The MAC engine is based on a 250 microsecond process tick in order to simplify the management of state timers and avoid busy waiting. It should be a design goal of the implementation to ensure that the processing in a single tick never exceeds 125 microseconds in order to leave plenty of processor time available for other more critical tasks. In 250 microseconds, 7.8 characters can be transmitted at the fixed baud rate of 250 kbps. Including the preamble and PHY header, the smallest possible packet is 8 characters long. This means that two CCA functions performed in consecutive ticks will almost certainly detect an ACK in flight.


The collision avoidance algorithm is invoked whenever there is a packet ready to transmit. The transmitter will delay a random number of back-off periods before running the CCA function. On the tick where the CCA function has completed, the transmitter will start sending if the CCA returned saying that the channel is clear.


The dead time between a packet reception ending and an ACK starting is between one and two ticks. So, a CCA function that does its best to prevent stepping on the ACK is one which performs two CCA measurements spaced a tick apart and declares the channel clear only if both pass. The back-off period is designed to be longer than the transmission time of a small packet such as an ACK, so two ticks is chosen.


If a data frame is received with acknowledgement requested and with a matching destination address, the receiver prepares to send an acknowledge packet provided that it will be able to pass the buffer along to the application. The receiver waits one tick to give the sender time to switch its transceiver into receive mode, then transmits an acknowledgement which is the first 2 bytes of the MAC header echoed with the frame type changed to the ACK value. The sender expecting an ACK will wait for up to five ticks (1.25 ms) to receive the reply before retrying the transmission. Up to three retries are performed. If an acknowledgement is requested, the sender should postpone sending the packet if there is not enough time remaining in the current active period for the receiver to send the acknowledgement.
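
The sender side of this acknowledge-and-retry scheme might be sketched as follows, assuming hypothetical radio primitives; only the five-tick ACK wait and the three-retry limit come from the description above.

    #include <stdbool.h>
    #include <stdint.h>

    #define ACK_WAIT_TICKS 5   /* 5 ticks of 250 us = 1.25 ms */
    #define MAX_RETRIES    3

    /* Assumed MAC/radio primitives built on the 250 microsecond tick. */
    extern uint8_t random_backoff_ticks(void);
    extern bool    cca_channel_clear(void);          /* clear channel assessment */
    extern void    wait_one_tick(void);
    extern void    radio_send(const void *pkt, uint8_t len);
    extern bool    ack_received(uint8_t sequence);   /* polled once per tick     */

    /* Send a packet with acknowledgement requested; retry up to three times. */
    bool send_with_ack(const void *pkt, uint8_t len, uint8_t sequence)
    {
        for (uint8_t attempt = 0; attempt <= MAX_RETRIES; attempt++) {
            /* Collision avoidance: random back-off, repeated until the channel is clear. */
            do {
                for (uint8_t b = random_backoff_ticks(); b > 0; b--)
                    wait_one_tick();
            } while (!cca_channel_clear());

            radio_send(pkt, len);

            /* Wait up to five ticks (1.25 ms) for the echoed-header ACK. */
            for (uint8_t t = 0; t < ACK_WAIT_TICKS; t++) {
                wait_one_tick();
                if (ack_received(sequence))
                    return true;
            }
        }
        return false;   /* the original attempt plus three retries all failed */
    }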


Data payloads in the network begin with the Transport Header which consists of a byte specifying the Service Access Point (SAP). This multiplexes different types of services onto the same device address. Previously, this has been accomplished using the “opcodes”.


Device Control, Device Status Request, and Device Status SAPs are related in that the payload messages use the same code points on a per device basis. That is to say that devices will have a published set of control and status information elements consisting of an element code followed by a known number of element payload bytes. If controllable over RF, the Device Control SAP is used to set values. Controllable and observable items can be queried with a Device Status Request. The actual status is delivered using the Device Status SAP whether solicited, i.e. requested over the Device Status Request SAP, or unsolicited, i.e. sent spontaneously. Alarms and other indications may be delivered in this way.


The reason to use multiple SAP codes for this related functionality is that it may be a major portion of the overall RF traffic. As such, the smaller the packets can be made, the more reliable the transmission. So, for critical control and status messages, having a two-byte header <Device_SAP><Control_Cmd> or <Device_SAP><Status_Cmd> keeps the PHY and MAC headers as small as possible.
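
A sketch of building such a payload is given below; the SAP code values are assumptions, while the two-byte <Device_SAP><Control_Cmd> header follows the description above.

    #include <stdint.h>
    #include <string.h>

    /* Illustrative Service Access Point codes; the actual values are not specified. */
    enum {
        SAP_DEVICE_CONTROL        = 0x01,
        SAP_DEVICE_STATUS_REQUEST = 0x02,
        SAP_DEVICE_STATUS         = 0x03,
    };

    /* Build a <Device_SAP><Control_Cmd> payload followed by the element bytes. */
    uint8_t build_device_control(uint8_t *out, uint8_t out_max,
                                 uint8_t control_cmd,
                                 const uint8_t *element, uint8_t element_len)
    {
        if ((uint16_t)element_len + 2 > out_max)
            return 0;                       /* payload would not fit in the buffer */
        out[0] = SAP_DEVICE_CONTROL;        /* transport header: the SAP byte      */
        out[1] = control_cmd;               /* element code for this device        */
        memcpy(&out[2], element, element_len);
        return (uint8_t)(element_len + 2);  /* total payload length                */
    }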


“ROBOT OBSTACLE DETECTION SYSTEM”, U.S. Pat. No. 6,594,844, disclosing proximity sensors such as cliff sensors and wall following sensors; “AUTONOMOUS FLOOR-CLEANING ROBOT”, U.S. Pat. No. 6,883,201, disclosing a general structure of an iRobot Roomba coverage/cleaning robot and main and edge cleaning heads in detail; “METHOD AND SYSTEM FOR MULTI-MODE COVERAGE FOR AN AUTONOMOUS ROBOT”, U.S. Pat. No. 6,809,490, disclosing motion control and coverage behaviors, including escape behaviors, selected by an arbiter according to the principles of behavior based robotics; and “METHOD AND SYSTEM FOR ROBOT LOCALIZATION AND CONFINEMENT”, U.S. Pat. No. 6,781,338, disclosing virtual walls, i.e. robot confinement using wall-simulating directed beams, are each incorporated by reference herein in their entireties.


Other robot details and features combinable with those described herein may be found in the following U.S. patent applications filed concurrently herewith, entitled “AUTONOMOUS COVERAGE ROBOT NAVIGATION SYSTEM” having assigned Ser. No. 11/633,869; “MODULAR ROBOT” having assigned Ser. No. 11/633,886; and “COVERAGE ROBOT MOBILITY” having assigned Ser. No. 11/633,885; the entire contents of the aforementioned applications are hereby incorporated by reference.


A number of implementations have been described. Nevertheless, it will be understood that various modifications may be made without departing from the spirit and scope of the following claims. Accordingly, other implementations are within the scope of the following claims.

Claims
  • 1. A mobile cleaning robot comprising: a drive system to move the robot on a floor surface; a cleaning unit to clean the floor surface; an audio output device to emit audible content; a microphone to detect sound; a camera; a wireless communicator; and a controller operable with the wireless communicator to communicate with a remote device, the controller being configured to access audio data and cause the audio output device to emit audible content corresponding to the audio data, and to control an operation of the robot corresponding to the sound detected by the microphone.
  • 2. The mobile cleaning robot of claim 1, wherein the audio output device is configured to emit the audible content when the microphone detects the sound, the audible content corresponding to the detected sound.
  • 3. The mobile cleaning robot of claim 1, further comprising a bump sensor configured to detect a collision of the robot with an obstacle, wherein the audio output device is configured to emit the audible content when the bump sensor detects the collision.
  • 4. The mobile cleaning robot of claim 1, further comprising a bin in communication with the controller, wherein the audible content further includes an audible content corresponding to an amount of content in the bin.
  • 5. The mobile cleaning robot of claim 4, wherein the bin is a consumable fluid bin, and the audible content further includes an audible content corresponding to an amount of fluid in the bin.
  • 6. The mobile cleaning robot of claim 1, wherein the audible content includes an audible content corresponding to a user-selected setting.
  • 7. The mobile cleaning robot of claim 6, wherein the audible content corresponding to the user-selected setting corresponds to a theme of audible contents specified by the user-selected setting.
  • 8. The mobile cleaning robot of claim 7, wherein the audible content corresponding to the user-selected setting includes an instruction in a human language or dialect selected from a plurality of human languages or dialects based on the user-selected setting.
  • 9. The mobile cleaning robot of claim 1, wherein the controller is configured to access the audio data by using the wireless communicator to connect to the Internet to retrieve the audible content from a content provider.
  • 10. The mobile cleaning robot of claim 1, wherein the camera is configured to generate video data, and the wireless communicator is configured to transmit the video data from the robot to a networked destination.
  • 11. The mobile cleaning robot of claim 1, wherein the controller is configured to navigate the robot across the floor surface based on a signal from a sensor, the sensor comprising the camera.
  • 12. The mobile cleaning robot of claim 1, wherein the wireless communicator is configured to transmit usage and behavior information to a server periodically.
  • 13. The mobile cleaning robot of claim 1, wherein the wireless communicator is configured to measure a signal strength of a signal emitted by a peripheral device, and the controller is configured to determine a distance to the peripheral device based on the measured signal strength.
  • 14. The mobile cleaning robot of claim 13, wherein the controller is configured to prohibit movement of the mobile robot through a particular area based on the measured signal strength or to guide movement of the mobile robot to the particular area based on the measured signal strength.
  • 15. The mobile cleaning robot of claim 1, further comprising a display panel in electrical communication with the controller, the display panel including an illuminable maintenance display substantially mimicking an appearance of the robot.
  • 16. The mobile cleaning robot of claim 15, wherein the controller is configured to control illumination of indicia on the display panel and emission of audio responses from the audio output device.
  • 17. The mobile cleaning robot of claim 1, wherein the controller is configured to cause the audio output device to emit audible content corresponding to an instruction to a user to interact with the mobile cleaning robot.
  • 18. The mobile cleaning robot of claim 1, wherein the microphone is configured to receive a spoken command corresponding to a functionality performable by the robot, and the controller is configured to identify the functionality from the spoken command using a speech recognition routine and to cause the robot to perform the functionality.
  • 19. The mobile cleaning robot of claim 18, wherein the controller is configured to identify a movement pattern corresponding to the functionality and to cause the robot to move in accordance with the movement pattern.
  • 20. The mobile cleaning robot of claim 1, wherein the controller is configured to provide, using the audio output device, technical support to a user.
Parent Case Info

This application is a continuation of U.S. application Ser. No. 14/275,355, filed May 12, 2014, which claims priority to U.S. application Ser. No. 13/893,905, filed May 14, 2013, which claims priority to U.S. application Ser. No. 12/959,879, filed Dec. 3, 2010, which claims priority to U.S. application Ser. No. 11/633,883, filed on Dec. 4, 2006, which claims priority under 35 U.S.C. 119(e) to a U.S. provisional patent application filed on Dec. 2, 2005, entitled “ROBOT NETWORKING, THEMING AND COMMUNICATION SYSTEM” and having assigned Ser. No. 60/741,442, the entire contents of which are hereby incorporated by reference.

US Referenced Citations (1041)
Number Name Date Kind
1755054 Darst Apr 1930 A
1780221 Buchmann Nov 1930 A
1970302 Gerhardt Aug 1934 A
2136324 John Nov 1938 A
2302111 Dow et al. Nov 1942 A
2353621 Sav et al. Jul 1944 A
2770825 Pullen Nov 1956 A
2930055 Fallen et al. Mar 1960 A
3119369 Harland et al. Jan 1964 A
3166138 Dunn Jan 1965 A
3333564 Waters Aug 1967 A
3375375 Robert et al. Mar 1968 A
3381652 Schaefer et al. May 1968 A
3457575 Bienek Jul 1969 A
3550714 Bellinger Dec 1970 A
3569727 Aggarwal et al. Mar 1971 A
3649981 Woodworth Mar 1972 A
3674316 De Bray Jul 1972 A
3678882 Kinsella Jul 1972 A
3690559 Rudloff Sep 1972 A
3744586 Leinauer Jul 1973 A
3756667 Bombardier et al. Sep 1973 A
3809004 Leonheart May 1974 A
3816004 Bignardi Jun 1974 A
3845831 James Nov 1974 A
RE28268 Autrand Dec 1974 E
3851349 Lowder Dec 1974 A
3853086 Asplund Dec 1974 A
3863285 Hukuba Feb 1975 A
3888181 Kups Jun 1975 A
3937174 Haaga Feb 1976 A
3952361 Wilkins Apr 1976 A
3989311 Debrey Nov 1976 A
3989931 Phillips Nov 1976 A
4004313 Capra Jan 1977 A
4012681 Finger et al. Mar 1977 A
4070170 Leinfelt Jan 1978 A
4099284 Shinozaki et al. Jul 1978 A
4119900 Kremnitz Oct 1978 A
4175589 Nakamura et al. Nov 1979 A
4175892 De bray Nov 1979 A
4196727 Verkaart et al. Apr 1980 A
4198727 Farmer Apr 1980 A
4199838 Simonsson Apr 1980 A
4209254 Reymond et al. Jun 1980 A
D258901 Keyworth Apr 1981 S
4297578 Carter Oct 1981 A
4305234 Pichelman Dec 1981 A
4306329 Yokoi Dec 1981 A
4309758 Halsall et al. Jan 1982 A
4328545 Halsall et al. May 1982 A
4367403 Miller Jan 1983 A
4369543 Chen et al. Jan 1983 A
4401909 Gorsek Aug 1983 A
4416033 Specht Nov 1983 A
4445245 Lu May 1984 A
4465370 Yuasa et al. Aug 1984 A
4477998 You Oct 1984 A
4481692 Kurz Nov 1984 A
4482960 Pryor Nov 1984 A
4492058 Goldfarb et al. Jan 1985 A
4513469 Godfrey et al. Apr 1985 A
D278732 Ohkado May 1985 S
4518437 Sommer May 1985 A
4534637 Suzuki et al. Aug 1985 A
4556313 Miller et al. Dec 1985 A
4575211 Matsumura et al. Mar 1986 A
4580311 Kurz Apr 1986 A
4601082 Kurz Jul 1986 A
4618213 Chen Oct 1986 A
4620285 Perdue Oct 1986 A
4624026 Olson et al. Nov 1986 A
4626995 Lofgren et al. Dec 1986 A
4628454 Ito Dec 1986 A
4638445 Mattaboni Jan 1987 A
4644156 Takahashi et al. Feb 1987 A
4649504 Krouglicof et al. Mar 1987 A
4652917 Miller Mar 1987 A
4654492 Koerner et al. Mar 1987 A
4654924 Getz et al. Apr 1987 A
4660969 Sorimachi et al. Apr 1987 A
4662854 Fang May 1987 A
4674048 Okumura Jun 1987 A
4679152 Perdue Jul 1987 A
4680827 Hummel Jul 1987 A
4696074 Cavalli Sep 1987 A
D292223 Trumbull Oct 1987 S
4700301 Dyke Oct 1987 A
4700427 Knepper Oct 1987 A
4703820 Reinaud Nov 1987 A
4709773 Clement et al. Dec 1987 A
4710020 Maddox et al. Dec 1987 A
4712740 Duncan et al. Dec 1987 A
4716621 Zoni Jan 1988 A
4728801 O'Connor Mar 1988 A
4733343 Yoneda et al. Mar 1988 A
4733430 Westergren Mar 1988 A
4733431 Martin Mar 1988 A
4735136 Lee et al. Apr 1988 A
4735138 Gawler et al. Apr 1988 A
4748336 Fujie et al. May 1988 A
4748833 Nagasawa Jun 1988 A
4756049 Uehara Jul 1988 A
4767213 Hummel Aug 1988 A
4769700 Pryor Sep 1988 A
4777416 George et al. Oct 1988 A
D298766 Tanno et al. Nov 1988 S
4782550 Jacobs Nov 1988 A
4796198 Boultinghouse et al. Jan 1989 A
4806751 Abe et al. Feb 1989 A
4811228 Hyyppa Mar 1989 A
4813906 Matsuyama et al. Mar 1989 A
4815157 Tsuchiya Mar 1989 A
4817000 Eberhardt Mar 1989 A
4818875 Weiner Apr 1989 A
4829442 Kadonoff et al. May 1989 A
4829626 Harkonen et al. May 1989 A
4832098 Palinkas et al. May 1989 A
4851661 Everett Jul 1989 A
4854000 Takimoto Aug 1989 A
4854006 Nishimura et al. Aug 1989 A
4855915 Dallaire Aug 1989 A
4857912 Everett et al. Aug 1989 A
4858132 Holmquist Aug 1989 A
4867570 Sorimachi et al. Sep 1989 A
4880474 Koharagi et al. Nov 1989 A
4887415 Martin Dec 1989 A
4891762 Chotiros Jan 1990 A
4893025 Lee Jan 1990 A
4901394 Nakamura et al. Feb 1990 A
4905151 Weiman et al. Feb 1990 A
4909972 Britz Mar 1990 A
4912643 Beirne Mar 1990 A
4918441 Bohman Apr 1990 A
4919224 Shyu et al. Apr 1990 A
4919489 Kopsco Apr 1990 A
4920060 Parrent et al. Apr 1990 A
4920605 Takashima May 1990 A
4933864 Evans et al. Jun 1990 A
4937912 Kurz Jul 1990 A
4953253 Fukuda et al. Sep 1990 A
4954962 Evans et al. Sep 1990 A
4955714 Stotler et al. Sep 1990 A
4956891 Wulff Sep 1990 A
4961303 McCarty et al. Oct 1990 A
4961304 Ovsborn et al. Oct 1990 A
4962453 Pong et al. Oct 1990 A
4967862 Pong et al. Nov 1990 A
4971591 Raviv et al. Nov 1990 A
4973912 Kaminski et al. Nov 1990 A
4974283 Holsten et al. Dec 1990 A
4977618 Allen Dec 1990 A
4977639 Takahashi et al. Dec 1990 A
4986663 Cecchi et al. Jan 1991 A
5001635 Yasutomi et al. Mar 1991 A
5002145 Wakaumi et al. Mar 1991 A
5012886 Jonas et al. May 1991 A
5018240 Holman May 1991 A
5020186 Lessig et al. Jun 1991 A
5022812 Coughlan et al. Jun 1991 A
5023788 Kitazume et al. Jun 1991 A
5024529 Svetkoff et al. Jun 1991 A
D318500 Malewicki et al. Jul 1991 S
5032775 Mizuno et al. Jul 1991 A
5033151 Kraft et al. Jul 1991 A
5033291 Podoloff et al. Jul 1991 A
5040116 Evans et al. Aug 1991 A
5045769 Everett Sep 1991 A
5049802 Mintus et al. Sep 1991 A
5051906 Evans et al. Sep 1991 A
5062819 Mallory Nov 1991 A
5070567 Holland Dec 1991 A
5084934 Lessig et al. Feb 1992 A
5086535 Grossmeyer et al. Feb 1992 A
5090321 Abouav Feb 1992 A
5093955 Blehert et al. Mar 1992 A
5094311 Akeel Mar 1992 A
5098262 Wecker et al. Mar 1992 A
5105502 Takashima Apr 1992 A
5105550 Shenoha Apr 1992 A
5109566 Kobayashi et al. May 1992 A
5111401 Everett, Jr. et al. May 1992 A
5115538 Cochran et al. May 1992 A
5127128 Lee Jul 1992 A
5136675 Hodson Aug 1992 A
5136750 Takashima et al. Aug 1992 A
5142985 Stearns et al. Sep 1992 A
5144471 Takanashi et al. Sep 1992 A
5144714 Mori et al. Sep 1992 A
5144715 Matsuyo et al. Sep 1992 A
5152028 Hirano Oct 1992 A
5152202 Strauss Oct 1992 A
5154617 Suman et al. Oct 1992 A
5155684 Burke et al. Oct 1992 A
5163202 Kawakami et al. Nov 1992 A
5163320 Goshima et al. Nov 1992 A
5164579 Pryor et al. Nov 1992 A
5165064 Mattaboni Nov 1992 A
5170352 McTamaney et al. Dec 1992 A
5173881 Sindle Dec 1992 A
5182833 Yamaguchi et al. Feb 1993 A
5187662 Kamimura et al. Feb 1993 A
5202742 Frank et al. Apr 1993 A
5204814 Noonan et al. Apr 1993 A
5206500 Decker et al. Apr 1993 A
5208521 Aoyama May 1993 A
5216777 Moro et al. Jun 1993 A
5222786 Sovis et al. Jun 1993 A
5227985 DeMenthon Jul 1993 A
5233682 Abe et al. Aug 1993 A
5239720 Wood et al. Aug 1993 A
5251358 Moro et al. Oct 1993 A
5261139 Lewis Nov 1993 A
5276618 Everett Jan 1994 A
5276939 Uenishi Jan 1994 A
5277064 Knigga et al. Jan 1994 A
5279672 Betker et al. Jan 1994 A
5284452 Corona Feb 1994 A
5284522 Kobayashi et al. Feb 1994 A
5293955 Lee Mar 1994 A
D345707 Alister Apr 1994 S
5303448 Hennessey et al. Apr 1994 A
5307273 Oh et al. Apr 1994 A
5309592 Hiratsuka May 1994 A
5310379 Hippely et al. May 1994 A
5315227 Pierson et al. May 1994 A
5319827 Yang Jun 1994 A
5319828 Waldhauser et al. Jun 1994 A
5321614 Ashworth Jun 1994 A
5323483 Baeg Jun 1994 A
5324948 Dudar et al. Jun 1994 A
5331713 Tipton Jul 1994 A
5341186 Kato Aug 1994 A
5341540 Soupert et al. Aug 1994 A
5341549 Wirtz et al. Aug 1994 A
5345649 Whitlow Sep 1994 A
5352901 Poorman Oct 1994 A
5353224 Lee et al. Oct 1994 A
5363305 Cox et al. Nov 1994 A
5363935 Schempf et al. Nov 1994 A
5369347 Yoo Nov 1994 A
5369838 Wood et al. Dec 1994 A
5386862 Glover et al. Feb 1995 A
5399951 Lavallee et al. Mar 1995 A
5400244 Watanabe et al. Mar 1995 A
5404612 Ishikawa Apr 1995 A
5410479 Coker Apr 1995 A
5432907 Picazo et al. Jul 1995 A
5435405 Schempf et al. Jul 1995 A
5440216 Kim Aug 1995 A
5442358 Keeler et al. Aug 1995 A
5444965 Colens Aug 1995 A
5446356 Kim Aug 1995 A
5446445 Bloomfield et al. Aug 1995 A
5451135 Schempf et al. Sep 1995 A
5454129 Kell Oct 1995 A
5455982 Armstrong et al. Oct 1995 A
5465525 Mifune et al. Nov 1995 A
5465619 Sotack et al. Nov 1995 A
5467273 Faibish et al. Nov 1995 A
5471560 Allard et al. Nov 1995 A
5491670 Weber Feb 1996 A
5497529 Boesi Mar 1996 A
5498948 Bruni et al. Mar 1996 A
5502638 Takenaka Mar 1996 A
5505072 Oreper Apr 1996 A
5507067 Hoekstra et al. Apr 1996 A
5510893 Suzuki Apr 1996 A
5511147 Abdel Apr 1996 A
5515572 Hoekstra et al. May 1996 A
5534762 Kim Jul 1996 A
5535476 Kresse et al. Jul 1996 A
5537017 Feiten et al. Jul 1996 A
5537711 Tseng Jul 1996 A
5539953 Kurz Jul 1996 A
5542146 Hoekstra et al. Aug 1996 A
5542148 Young Aug 1996 A
5546631 Chambon Aug 1996 A
5548511 Bancroft Aug 1996 A
5548649 Jacobson Aug 1996 A
5551119 Wörwag Sep 1996 A
5551525 Pack et al. Sep 1996 A
5553349 Kilstrom et al. Sep 1996 A
5555587 Guha Sep 1996 A
5560077 Crotchett Oct 1996 A
5568589 Hwang Oct 1996 A
D375592 Ljunggren Nov 1996 S
5608306 Rybeck et al. Mar 1997 A
5608894 Kawakami et al. Mar 1997 A
5608944 Gordon Mar 1997 A
5610488 Miyazawa Mar 1997 A
5611106 Wulff Mar 1997 A
5611108 Knowlton et al. Mar 1997 A
5613261 Kawakami et al. Mar 1997 A
5613269 Miwa Mar 1997 A
5621291 Lee Apr 1997 A
5622236 Azumi et al. Apr 1997 A
5634237 Paranjpe Jun 1997 A
5634239 Tuvin et al. Jun 1997 A
5636402 Kubo et al. Jun 1997 A
5642299 Hardin et al. Jun 1997 A
5646494 Han Jul 1997 A
5647554 Ikegami et al. Jul 1997 A
5650702 Azumi Jul 1997 A
5652489 Kawakami Jul 1997 A
5682313 Edlund et al. Oct 1997 A
5682839 Grimsley et al. Nov 1997 A
5696675 Nakamura et al. Dec 1997 A
5698861 Oh Dec 1997 A
5709007 Chiang Jan 1998 A
5710506 Broell et al. Jan 1998 A
5714119 Kawagoe et al. Feb 1998 A
5717169 Liang et al. Feb 1998 A
5717484 Hamaguchi et al. Feb 1998 A
5720077 Nakamura et al. Feb 1998 A
5732401 Conway Mar 1998 A
5735017 Barnes et al. Apr 1998 A
5735959 Kubo et al. Apr 1998 A
5742975 Knowlton et al. Apr 1998 A
5745235 Vercammen et al. Apr 1998 A
5752871 Tsuzuki May 1998 A
5756904 Oreper et al. May 1998 A
5761762 Kubo Jun 1998 A
5764888 Bolan et al. Jun 1998 A
5767437 Rogers Jun 1998 A
5767960 Orman Jun 1998 A
5770936 Hirai et al. Jun 1998 A
5777596 Herbert Jul 1998 A
5778486 Kim Jul 1998 A
5781697 Jeong Jul 1998 A
5781960 Kilstrom et al. Jul 1998 A
5784755 Karr et al. Jul 1998 A
5786602 Pryor et al. Jul 1998 A
5787545 Colens Aug 1998 A
5793900 Nourbakhsh et al. Aug 1998 A
5794297 Muta Aug 1998 A
5796742 Klotzbach et al. Aug 1998 A
5802665 Knowlton et al. Sep 1998 A
5812267 Everett et al. Sep 1998 A
5814808 Takada et al. Sep 1998 A
5815880 Nakanishi Oct 1998 A
5815884 Imamura et al. Oct 1998 A
5819008 Asama et al. Oct 1998 A
5819360 Fujii Oct 1998 A
5819936 Saveliev et al. Oct 1998 A
5820821 Kawagoe et al. Oct 1998 A
5821730 Drapkin Oct 1998 A
5825981 Matsuda Oct 1998 A
5828770 Leis et al. Oct 1998 A
5831597 West et al. Nov 1998 A
5836045 Anthony et al. Nov 1998 A
5839156 Park et al. Nov 1998 A
5839532 Yoshiji et al. Nov 1998 A
5841259 Kim et al. Nov 1998 A
5867800 Leif Feb 1999 A
5867861 Kasen et al. Feb 1999 A
5869910 Colens Feb 1999 A
5894621 Kubo Apr 1999 A
5896611 Haaga Apr 1999 A
5903124 Kawakami May 1999 A
5905209 Oreper May 1999 A
5907886 Buscher Jun 1999 A
5910700 Crotzer Jun 1999 A
5911260 Suzuki Jun 1999 A
5916008 Wong Jun 1999 A
5924167 Wright et al. Jul 1999 A
5926909 McGee Jul 1999 A
5933102 Miller et al. Aug 1999 A
5933913 Wright et al. Aug 1999 A
5935179 Kleiner et al. Aug 1999 A
5935333 Davis Aug 1999 A
5940346 Sadowsky et al. Aug 1999 A
5940927 Haegermarck et al. Aug 1999 A
5940930 Oh et al. Aug 1999 A
5942869 Katou et al. Aug 1999 A
5943730 Boomgaarden Aug 1999 A
5943733 Tagliaferri Aug 1999 A
5943933 Evans et al. Aug 1999 A
5947225 Kawakami et al. Sep 1999 A
5950408 Schaedler Sep 1999 A
5959423 Nakanishi et al. Sep 1999 A
5968281 Wright et al. Oct 1999 A
5974348 Rocks Oct 1999 A
5974365 Mitchell Oct 1999 A
5983448 Wright et al. Nov 1999 A
5984880 Lander et al. Nov 1999 A
5987383 Keller et al. Nov 1999 A
5989700 Krivopal Nov 1999 A
5991951 Kubo et al. Nov 1999 A
5995883 Nishikado Nov 1999 A
5995884 Allen et al. Nov 1999 A
5996167 Close Dec 1999 A
5998953 Nakamura et al. Dec 1999 A
5998971 Corbridge Dec 1999 A
6000088 Wright et al. Dec 1999 A
6006275 Picazo et al. Dec 1999 A
6009358 Angott et al. Dec 1999 A
6012618 Matsuo et al. Jan 2000 A
6021545 Delgado et al. Feb 2000 A
6023813 Thatcher et al. Feb 2000 A
6023814 Imamura Feb 2000 A
6025687 Himeda et al. Feb 2000 A
6026539 Mouw et al. Feb 2000 A
6030464 Azevedo Feb 2000 A
6030465 Marcussen et al. Feb 2000 A
6032327 Oka et al. Mar 2000 A
6032542 Warnick et al. Mar 2000 A
6036572 Sze Mar 2000 A
6038501 Kawakami Mar 2000 A
6040669 Hog Mar 2000 A
6041471 Charky et al. Mar 2000 A
6041472 Kasen et al. Mar 2000 A
6046800 Ohtomo et al. Apr 2000 A
6049620 Dickinson et al. Apr 2000 A
6050648 Keleny Apr 2000 A
6052821 Chouly et al. Apr 2000 A
6055042 Sarangapani Apr 2000 A
6055702 Imamura et al. May 2000 A
6061868 Moritsch et al. May 2000 A
6065182 Wright et al. May 2000 A
6070290 Schwarze et al. Jun 2000 A
6073432 Schaedler Jun 2000 A
6076025 Ueno et al. Jun 2000 A
6076026 Jambhekar et al. Jun 2000 A
6076226 Reed Jun 2000 A
6076227 Schallig et al. Jun 2000 A
6081257 Zeller Jun 2000 A
6088020 Mor Jul 2000 A
6094775 Behmer Aug 2000 A
6099091 Campbell Aug 2000 A
6101670 Song Aug 2000 A
6101671 Wright et al. Aug 2000 A
6108031 King et al. Aug 2000 A
6108067 Okamoto Aug 2000 A
6108076 Hanseder Aug 2000 A
6108269 Kabel Aug 2000 A
6108597 Kirchner et al. Aug 2000 A
6108859 Burgoon Aug 2000 A
6112143 Allen et al. Aug 2000 A
6112996 Matsuo Sep 2000 A
6119057 Kawagoe Sep 2000 A
6122798 Kobayashi et al. Sep 2000 A
6124694 Bancroft et al. Sep 2000 A
6125498 Roberts et al. Oct 2000 A
6131237 Kasper et al. Oct 2000 A
6138063 Himeda Oct 2000 A
6142252 Kinto et al. Nov 2000 A
6146041 Chen et al. Nov 2000 A
6146278 Kobayashi Nov 2000 A
6154279 Thayer Nov 2000 A
6154694 Aoki et al. Nov 2000 A
6160479 Ahlen et al. Dec 2000 A
6167332 Kurtzberg et al. Dec 2000 A
6167587 Kasper et al. Jan 2001 B1
6192548 Huffman Feb 2001 B1
6192549 Kasen et al. Feb 2001 B1
6202243 Beaufoy et al. Mar 2001 B1
6216307 Kaleta et al. Apr 2001 B1
6220865 Macri et al. Apr 2001 B1
6226830 Hendriks et al. May 2001 B1
6230362 Kasper et al. May 2001 B1
6237741 Guidetti May 2001 B1
6240342 Fiegert et al. May 2001 B1
6243913 Frank et al. Jun 2001 B1
6255793 Peless et al. Jul 2001 B1
6259979 Holmquist Jul 2001 B1
6261379 Conrad et al. Jul 2001 B1
6263539 Baig Jul 2001 B1
6263989 Won Jul 2001 B1
6272936 Oreper et al. Aug 2001 B1
6276478 Hopkins et al. Aug 2001 B1
6278918 Dickson et al. Aug 2001 B1
6279001 DeBettencourt et al. Aug 2001 B1
6279196 Kasen et al. Aug 2001 B2
6282526 Ganesh Aug 2001 B1
6283034 Miles Sep 2001 B1
6285778 Nakajima et al. Sep 2001 B1
6285930 Dickson et al. Sep 2001 B1
6286181 Kasper et al. Sep 2001 B1
6300737 Bergvall et al. Oct 2001 B1
6321337 Reshef et al. Nov 2001 B1
6321515 Colens Nov 2001 B1
6323570 Nishimura et al. Nov 2001 B1
6324714 Walz et al. Dec 2001 B1
6327741 Reed Dec 2001 B1
6332400 Meyer Dec 2001 B1
6339735 Peless et al. Jan 2002 B1
6362875 Burkley Mar 2002 B1
6370453 Sommer Apr 2002 B2
6374155 Wallach et al. Apr 2002 B1
6374157 Takamura Apr 2002 B1
6381802 Park May 2002 B2
6385515 Dickson et al. May 2002 B1
6388013 Saraf et al. May 2002 B1
6389329 Colens May 2002 B1
6397429 Legatt et al. Jun 2002 B1
6400048 Nishimura et al. Jun 2002 B1
6401294 Kasper Jun 2002 B2
6408226 Byrne et al. Jun 2002 B1
6412141 Kasper et al. Jul 2002 B2
6415203 Inoue et al. Jul 2002 B1
6418586 Fulghum Jul 2002 B2
6421870 Basham et al. Jul 2002 B1
6427285 Legatt et al. Aug 2002 B1
6430471 Kintou et al. Aug 2002 B1
6431296 Won Aug 2002 B1
6437227 Theimer Aug 2002 B1
6437465 Nishimura et al. Aug 2002 B1
6438456 Feddema et al. Aug 2002 B1
6438793 Miner et al. Aug 2002 B1
6442476 Poropat Aug 2002 B1
6442789 Legatt et al. Sep 2002 B1
6443509 Levin et al. Sep 2002 B1
6444003 Sutcliffe Sep 2002 B1
6446192 Narasimhan et al. Sep 2002 B1
6446302 Kasper et al. Sep 2002 B1
6454036 Airey et al. Sep 2002 B1
D464091 Christianson Oct 2002 S
6457206 Judson Oct 2002 B1
6459955 Bartsch Oct 2002 B1
6463368 Feiten et al. Oct 2002 B1
6465982 Bergvall et al. Oct 2002 B1
6473167 Odell Oct 2002 B1
6480762 Uchikubo et al. Nov 2002 B1
6481515 Kirkpatrick et al. Nov 2002 B1
6482252 Conrad et al. Nov 2002 B1
6490539 Dickson et al. Dec 2002 B1
6491127 Holmberg et al. Dec 2002 B1
6493612 Bisset et al. Dec 2002 B1
6493613 Peless et al. Dec 2002 B2
6496754 Song et al. Dec 2002 B2
6496755 Wallach et al. Dec 2002 B2
6502657 Kerrebrock et al. Jan 2003 B2
6504610 Bauer et al. Jan 2003 B1
6507773 Parker et al. Jan 2003 B2
6519808 Legatt et al. Feb 2003 B2
6525509 Petersson et al. Feb 2003 B1
D471243 Cioffi et al. Mar 2003 S
6530102 Pierce et al. Mar 2003 B1
6530117 Peterson Mar 2003 B2
6532404 Colens Mar 2003 B2
6535793 Allard Mar 2003 B2
6540424 Hall et al. Apr 2003 B1
6540607 Mokris et al. Apr 2003 B2
6548982 Papanikolopoulos et al. Apr 2003 B1
6553612 Dyson et al. Apr 2003 B1
6556722 Russell et al. Apr 2003 B1
6556892 Kuroki et al. Apr 2003 B2
6557104 Vu et al. Apr 2003 B2
D474312 Stephens et al. May 2003 S
6563130 Dworkowski et al. May 2003 B2
6571415 Gerber et al. Jun 2003 B2
6571422 Gordon et al. Jun 2003 B1
6572711 Sclafani et al. Jun 2003 B2
6574536 Kawagoe et al. Jun 2003 B1
6580246 Jacobs Jun 2003 B2
6584376 Van Kommer Jun 2003 B1
6586908 Petersson et al. Jul 2003 B2
6587573 Stam et al. Jul 2003 B1
6590222 Bisset et al. Jul 2003 B1
6594551 McKinney et al. Jul 2003 B2
6594844 Jones Jul 2003 B2
6597076 Scheible et al. Jul 2003 B2
D478884 Slipy et al. Aug 2003 S
6601265 Burlington Aug 2003 B1
6604021 Imai et al. Aug 2003 B2
6604022 Parker et al. Aug 2003 B2
6605156 Clark et al. Aug 2003 B1
6609269 Kasper Aug 2003 B2
6611120 Song et al. Aug 2003 B2
6611734 Parker et al. Aug 2003 B2
6611738 Ruffner Aug 2003 B2
6615108 Peless et al. Sep 2003 B1
6615434 Davis et al. Sep 2003 B1
6615885 Ohm Sep 2003 B1
6622465 Jerome et al. Sep 2003 B2
6624744 Wilson et al. Sep 2003 B1
6625843 Kim et al. Sep 2003 B2
6629028 Paromtchik et al. Sep 2003 B2
6633150 Wallach et al. Oct 2003 B1
6637546 Wang Oct 2003 B1
6639659 Granger Oct 2003 B2
6658325 Zweig Dec 2003 B2
6658354 Lin Dec 2003 B2
6658692 Lenkiewicz et al. Dec 2003 B2
6658693 Reed Dec 2003 B1
6661239 Ozick Dec 2003 B1
6662889 De Fazio et al. Dec 2003 B2
6668951 Won Dec 2003 B2
6670817 Fournier et al. Dec 2003 B2
6671592 Bisset et al. Dec 2003 B1
6671925 Field et al. Jan 2004 B2
6677938 Maynard Jan 2004 B1
6687571 Byrne et al. Feb 2004 B1
6690134 Jones et al. Feb 2004 B1
6690993 Foulke et al. Feb 2004 B2
6697147 Ko et al. Feb 2004 B2
6705332 Field et al. Mar 2004 B2
6711280 Stafsudd et al. Mar 2004 B2
6732826 Song et al. May 2004 B2
6735811 Field et al. May 2004 B2
6735812 Hekman et al. May 2004 B2
6737591 Lapstun et al. May 2004 B1
6741054 Koselka et al. May 2004 B2
6741364 Lange et al. May 2004 B2
6748297 Song et al. Jun 2004 B2
6756703 Chang Jun 2004 B2
6760647 Nourbakhsh et al. Jul 2004 B2
6764373 Osawa et al. Jul 2004 B1
6769004 Barrett Jul 2004 B2
6774596 Bisset Aug 2004 B1
6779380 Nieuwkamp Aug 2004 B1
6781338 Jones et al. Aug 2004 B2
6809490 Jones et al. Oct 2004 B2
6810305 Kirkpatrick Oct 2004 B2
6810350 Blakley Oct 2004 B2
6830120 Yashima et al. Dec 2004 B1
6832407 Salem et al. Dec 2004 B2
6836701 McKee Dec 2004 B2
6841963 Song et al. Jan 2005 B2
6845297 Allard Jan 2005 B2
6848146 Wright et al. Feb 2005 B2
6854148 Rief et al. Feb 2005 B1
6856811 Burdue et al. Feb 2005 B2
6859010 Jeon et al. Feb 2005 B2
6859682 Naka et al. Feb 2005 B2
6860206 Rudakevych et al. Mar 2005 B1
6865447 Lau et al. Mar 2005 B2
6870792 Chiappetta Mar 2005 B2
6871115 Huang et al. Mar 2005 B2
6883201 Jones et al. Apr 2005 B2
6886651 Slocum et al. May 2005 B1
6888333 Laby May 2005 B2
6901624 Mori et al. Jun 2005 B2
6906702 Tanaka et al. Jun 2005 B1
6914403 Tsurumi Jul 2005 B2
6917854 Bayer Jul 2005 B2
6925357 Wang et al. Aug 2005 B2
6925679 Wallach et al. Aug 2005 B2
6929548 Wang Aug 2005 B2
D510066 Hickey et al. Sep 2005 S
6938298 Aasen Sep 2005 B2
6940291 Ozick Sep 2005 B1
6941199 Bottomley et al. Sep 2005 B1
6956348 Landry et al. Oct 2005 B2
6957712 Song et al. Oct 2005 B2
6960986 Asama et al. Nov 2005 B2
6965209 Jones et al. Nov 2005 B2
6965211 Tsurumi Nov 2005 B2
6968592 Takeuchi et al. Nov 2005 B2
6971140 Kim Dec 2005 B2
6975246 Trudeau Dec 2005 B1
6980229 Ebersole Dec 2005 B1
6985556 Shanmugavel et al. Jan 2006 B2
6993954 George et al. Feb 2006 B1
6999850 McDonald Feb 2006 B2
7013527 Thomas et al. Mar 2006 B2
7024278 Chiappetta et al. Apr 2006 B2
7024280 Parker et al. Apr 2006 B2
7027893 Perry et al. Apr 2006 B2
7030768 Wanie Apr 2006 B2
7031805 Lee et al. Apr 2006 B2
7032469 Bailey Apr 2006 B2
7040869 Beenker May 2006 B2
7041029 Fulghum et al. May 2006 B2
7051399 Field et al. May 2006 B2
7053578 Diehl et al. May 2006 B2
7054716 McKee et al. May 2006 B2
7055210 Keppler et al. Jun 2006 B2
7057120 Ma et al. Jun 2006 B2
7057643 Iida et al. Jun 2006 B2
7059012 Song et al. Jun 2006 B2
7065430 Naka et al. Jun 2006 B2
7066291 Martins et al. Jun 2006 B2
7069124 Whittaker et al. Jun 2006 B1
7079923 Abramson et al. Jul 2006 B2
7085623 Siegers Aug 2006 B2
7085624 Aldred et al. Aug 2006 B2
7113847 Chmura et al. Sep 2006 B2
7133746 Abramson et al. Nov 2006 B2
7142198 Lee Nov 2006 B2
7148458 Schell et al. Dec 2006 B2
7155308 Jones Dec 2006 B2
7167775 Abramson et al. Jan 2007 B2
7171285 Kim et al. Jan 2007 B2
7173391 Jones et al. Feb 2007 B2
7174238 Zweig Feb 2007 B1
7188000 Chiappetta et al. Mar 2007 B2
7188003 Ransom et al. Mar 2007 B2
7193384 Norman et al. Mar 2007 B1
7196487 Jones et al. Mar 2007 B2
7201786 Wegelin et al. Apr 2007 B2
7206677 Hulden Apr 2007 B2
7211980 Bruemmer et al. May 2007 B1
7225500 Diehl et al. Jun 2007 B2
7246405 Yan Jul 2007 B2
7248951 Hulden Jul 2007 B2
7250907 Krumm et al. Jul 2007 B2
7275280 Haegermarck et al. Oct 2007 B2
7283892 Boillot et al. Oct 2007 B1
7288912 Landry et al. Oct 2007 B2
7302469 Motoyama et al. Nov 2007 B2
7312752 Smith et al. Dec 2007 B2
7318248 Yan Jan 2008 B1
7320149 Huffman et al. Jan 2008 B1
7321807 Laski Jan 2008 B2
7324870 Lee Jan 2008 B2
7328196 Peters Feb 2008 B2
7332890 Cohen et al. Feb 2008 B2
7346428 Huffman et al. Mar 2008 B1
7352153 Yan Apr 2008 B2
7359766 Jeon et al. Apr 2008 B2
7360277 Moshenrose et al. Apr 2008 B2
7363108 Noda et al. Apr 2008 B2
7388879 Sabe et al. Jun 2008 B2
7389156 Ziegler et al. Jun 2008 B2
7389166 Harwig et al. Jun 2008 B2
7408157 Yan Aug 2008 B2
7418762 Arai et al. Sep 2008 B2
7430455 Casey et al. Sep 2008 B2
7430462 Chiu et al. Sep 2008 B2
7441298 Svendsen et al. Oct 2008 B2
7444206 Abramson et al. Oct 2008 B2
7448113 Jones et al. Nov 2008 B2
7456596 Goodall et al. Nov 2008 B2
7459871 Landry et al. Dec 2008 B2
7467026 Sakagami et al. Dec 2008 B2
7474941 Kim et al. Jan 2009 B2
7503096 Lin Mar 2009 B2
7512079 Labrador et al. Mar 2009 B2
7515991 Egawa et al. Apr 2009 B2
7539557 Yamauchi May 2009 B2
7555363 Augenbraun et al. Jun 2009 B2
7557703 Yamada et al. Jul 2009 B2
7568259 Yan Aug 2009 B2
7571511 Jones et al. Aug 2009 B2
7578020 Jaworski et al. Aug 2009 B2
7600521 Woo Oct 2009 B2
7603744 Reindle Oct 2009 B2
7611583 Buckley et al. Nov 2009 B2
7617557 Reindle Nov 2009 B2
7620476 Morse et al. Nov 2009 B2
7636928 Uno Dec 2009 B2
7636982 Jones et al. Dec 2009 B2
7647144 Haegermarck Jan 2010 B2
7650666 Jang Jan 2010 B2
7660650 Kawagoe et al. Feb 2010 B2
7663333 Jones et al. Feb 2010 B2
7693605 Park Apr 2010 B2
7706917 Chiappetta et al. Apr 2010 B1
7751936 Kim Jul 2010 B2
7761910 Ransom et al. Jul 2010 B2
7761954 Ziegler et al. Jul 2010 B2
7765635 Park Aug 2010 B2
7784147 Burkholder et al. Aug 2010 B2
7801645 Taylor et al. Sep 2010 B2
7805220 Taylor et al. Sep 2010 B2
7809944 Kawamoto Oct 2010 B2
7832048 Harwig et al. Nov 2010 B2
7849555 Hahm et al. Dec 2010 B2
7853645 Brown et al. Dec 2010 B2
7860680 Arms et al. Dec 2010 B2
7920941 Park et al. Apr 2011 B2
7937800 Yan May 2011 B2
7957836 Myeong et al. Jun 2011 B2
8087117 Kapoor et al. Jan 2012 B2
20010004719 Sommer Jun 2001 A1
20010013929 Torsten Aug 2001 A1
20010020200 Das et al. Sep 2001 A1
20010025183 Shahidi Sep 2001 A1
20010037163 Allard Nov 2001 A1
20010043509 Green et al. Nov 2001 A1
20010045883 Holdaway et al. Nov 2001 A1
20010047231 Peless et al. Nov 2001 A1
20010047895 De Fazio et al. Dec 2001 A1
20020011367 Kolesnik Jan 2002 A1
20020011813 Koselka et al. Jan 2002 A1
20020016649 Jones Feb 2002 A1
20020021219 Edwards Feb 2002 A1
20020027652 Paromtchik et al. Mar 2002 A1
20020036779 Kiyoi et al. Mar 2002 A1
20020046405 Lahr Apr 2002 A1
20020081937 Yamada et al. Jun 2002 A1
20020095239 Wallach et al. Jul 2002 A1
20020097400 Jung et al. Jul 2002 A1
20020104963 Mancevski Aug 2002 A1
20020108209 Peterson Aug 2002 A1
20020112742 Bredo et al. Aug 2002 A1
20020113973 Ge Aug 2002 A1
20020116089 Kirkpatrick Aug 2002 A1
20020120364 Colens Aug 2002 A1
20020124343 Reed Sep 2002 A1
20020153185 Song et al. Oct 2002 A1
20020156556 Ruffner Oct 2002 A1
20020159051 Guo Oct 2002 A1
20020166193 Kasper Nov 2002 A1
20020169521 Goodman et al. Nov 2002 A1
20020173877 Zweig Nov 2002 A1
20020189871 Won Dec 2002 A1
20030009259 Hattori et al. Jan 2003 A1
20030015232 Nguyen Jan 2003 A1
20030019071 Field et al. Jan 2003 A1
20030023356 Keable Jan 2003 A1
20030024986 Mazz et al. Feb 2003 A1
20030025472 Jones et al. Feb 2003 A1
20030028286 Glenn et al. Feb 2003 A1
20030030399 Jacobs Feb 2003 A1
20030058262 Sato et al. Mar 2003 A1
20030060928 Abramson et al. Mar 2003 A1
20030067451 Tagg et al. Apr 2003 A1
20030097875 Lentz et al. May 2003 A1
20030120389 Abramson et al. Jun 2003 A1
20030124312 Autumn Jul 2003 A1
20030126352 Barrett Jul 2003 A1
20030137268 Papanikolopoulos et al. Jul 2003 A1
20030146384 Logsdon et al. Aug 2003 A1
20030158628 Matsuoka et al. Aug 2003 A1
20030159232 Hekman et al. Aug 2003 A1
20030168081 Lee et al. Sep 2003 A1
20030175138 Beenker Sep 2003 A1
20030182117 Monchi et al. Sep 2003 A1
20030192144 Song et al. Oct 2003 A1
20030193657 Uomori et al. Oct 2003 A1
20030216834 Allard Nov 2003 A1
20030221114 Hino et al. Nov 2003 A1
20030229421 Chmura et al. Dec 2003 A1
20030229474 Suzuki et al. Dec 2003 A1
20030233171 Heiligensetzer Dec 2003 A1
20030233177 Johnson et al. Dec 2003 A1
20030233870 Mancevski Dec 2003 A1
20030233930 Ozick Dec 2003 A1
20040009777 Koskimies et al. Jan 2004 A1
20040016077 Song et al. Jan 2004 A1
20040019406 Wang Jan 2004 A1
20040020000 Jones Feb 2004 A1
20040030448 Solomon Feb 2004 A1
20040030449 Solomon Feb 2004 A1
20040030450 Solomon Feb 2004 A1
20040030451 Solomon Feb 2004 A1
20040030570 Solomon Feb 2004 A1
20040030571 Solomon Feb 2004 A1
20040031113 Wosewick et al. Feb 2004 A1
20040049317 Matsuoka et al. Mar 2004 A1
20040049877 Jones et al. Mar 2004 A1
20040055163 McCambridge et al. Mar 2004 A1
20040068351 Solomon Apr 2004 A1
20040068415 Solomon Apr 2004 A1
20040068416 Solomon Apr 2004 A1
20040074038 Im et al. Apr 2004 A1
20040074044 Diehl et al. Apr 2004 A1
20040076324 Burl et al. Apr 2004 A1
20040083570 Song et al. May 2004 A1
20040085037 Jones et al. May 2004 A1
20040088079 Lavarec et al. May 2004 A1
20040093122 Galibraith May 2004 A1
20040098167 Yi et al. May 2004 A1
20040111184 Chiappetta et al. Jun 2004 A1
20040111821 Lenkiewicz et al. Jun 2004 A1
20040113777 Matsuhira et al. Jun 2004 A1
20040117064 McDonald Jun 2004 A1
20040117846 Karaoguz et al. Jun 2004 A1
20040118998 Wingett et al. Jun 2004 A1
20040128028 Miyamoto et al. Jul 2004 A1
20040133316 Dean Jul 2004 A1
20040134336 Solomon Jul 2004 A1
20040134337 Solomon Jul 2004 A1
20040143919 Wilder Jul 2004 A1
20040148419 Chen et al. Jul 2004 A1
20040148731 Damman et al. Aug 2004 A1
20040153212 Profio et al. Aug 2004 A1
20040156541 Jeon et al. Aug 2004 A1
20040158357 Lee et al. Aug 2004 A1
20040160312 Fisher et al. Aug 2004 A1
20040170154 Carter et al. Sep 2004 A1
20040181706 Chen et al. Sep 2004 A1
20040187249 Jones et al. Sep 2004 A1
20040187457 Colens Sep 2004 A1
20040196451 Aoyama Oct 2004 A1
20040200505 Taylor et al. Oct 2004 A1
20040201361 Koh et al. Oct 2004 A1
20040204792 Taylor et al. Oct 2004 A1
20040204804 Lee et al. Oct 2004 A1
20040210345 Noda et al. Oct 2004 A1
20040210347 Sawada et al. Oct 2004 A1
20040211444 Taylor et al. Oct 2004 A1
20040221790 Sinclair et al. Nov 2004 A1
20040235468 Luebke et al. Nov 2004 A1
20040236468 Taylor et al. Nov 2004 A1
20040244138 Taylor et al. Dec 2004 A1
20040255425 Arai et al. Dec 2004 A1
20050000543 Taylor et al. Jan 2005 A1
20050010330 Abramson et al. Jan 2005 A1
20050010331 Taylor et al. Jan 2005 A1
20050015920 Kim et al. Jan 2005 A1
20050021181 Kim et al. Jan 2005 A1
20050028316 Thomas et al. Feb 2005 A1
20050053912 Roth et al. Mar 2005 A1
20050055796 Wright et al. Mar 2005 A1
20050067994 Jones et al. Mar 2005 A1
20050080514 Omote Apr 2005 A1
20050081782 Buckley et al. Apr 2005 A1
20050085947 Aldred et al. Apr 2005 A1
20050091782 Gordon et al. May 2005 A1
20050091786 Wright et al. May 2005 A1
20050096790 Tamura May 2005 A1
20050137749 Jeon et al. Jun 2005 A1
20050144437 Ransom et al. Jun 2005 A1
20050144751 Kegg et al. Jul 2005 A1
20050150074 Diehl et al. Jul 2005 A1
20050150519 Keppler et al. Jul 2005 A1
20050150697 Altman Jul 2005 A1
20050154795 Kuz et al. Jul 2005 A1
20050156562 Cohen et al. Jul 2005 A1
20050162119 Landry et al. Jul 2005 A1
20050163119 Ito et al. Jul 2005 A1
20050165508 Kanda et al. Jul 2005 A1
20050166354 Uehigashi Aug 2005 A1
20050166355 Tani Aug 2005 A1
20050172445 Diehl et al. Aug 2005 A1
20050183229 Uehigashi Aug 2005 A1
20050183230 Uehigashi Aug 2005 A1
20050187678 Myeong et al. Aug 2005 A1
20050192707 Park et al. Sep 2005 A1
20050197739 Noda et al. Sep 2005 A1
20050204505 Kashiwagi Sep 2005 A1
20050204717 Colens Sep 2005 A1
20050209736 Kawagoe Sep 2005 A1
20050211880 Schell et al. Sep 2005 A1
20050212929 Schell et al. Sep 2005 A1
20050213082 DiBernardo et al. Sep 2005 A1
20050213109 Schell et al. Sep 2005 A1
20050216126 Koselka et al. Sep 2005 A1
20050217042 Reindle Oct 2005 A1
20050218852 Landry et al. Oct 2005 A1
20050222933 Wesby Oct 2005 A1
20050229340 Sawalski et al. Oct 2005 A1
20050229355 Crouch et al. Oct 2005 A1
20050234611 Uehigashi Oct 2005 A1
20050235451 Yan Oct 2005 A1
20050251292 Casey et al. Nov 2005 A1
20050255425 Pierson Nov 2005 A1
20050258154 Blankenship et al. Nov 2005 A1
20050267826 Levy Dec 2005 A1
20050273967 Taylor et al. Dec 2005 A1
20050288819 De Guzman Dec 2005 A1
20060000050 Cipolla et al. Jan 2006 A1
20060009879 Lynch et al. Jan 2006 A1
20060010638 Shimizu et al. Jan 2006 A1
20060020369 Taylor et al. Jan 2006 A1
20060020370 Abramson Jan 2006 A1
20060020881 Szabo et al. Jan 2006 A1
20060021168 Nishikawa Feb 2006 A1
20060025134 Cho et al. Feb 2006 A1
20060037170 Shimizu Feb 2006 A1
20060042042 Mertes et al. Mar 2006 A1
20060044546 Lewin et al. Mar 2006 A1
20060060216 Woo Mar 2006 A1
20060061657 Rew et al. Mar 2006 A1
20060064828 Stein et al. Mar 2006 A1
20060087273 Ko et al. Apr 2006 A1
20060089765 Pack et al. Apr 2006 A1
20060100741 Jung May 2006 A1
20060107894 Buckley et al. May 2006 A1
20060119839 Bertin et al. Jun 2006 A1
20060143295 Costa-Requena et al. Jun 2006 A1
20060146776 Kim Jul 2006 A1
20060150361 Aldred et al. Jul 2006 A1
20060161301 Kim Jul 2006 A1
20060184293 Konandreas et al. Aug 2006 A1
20060185690 Song et al. Aug 2006 A1
20060190133 Konandreas et al. Aug 2006 A1
20060190134 Ziegler et al. Aug 2006 A1
20060190146 Morse et al. Aug 2006 A1
20060196003 Song et al. Sep 2006 A1
20060200281 Ziegler et al. Sep 2006 A1
20060220900 Ceskutti et al. Oct 2006 A1
20060229774 Park et al. Oct 2006 A1
20060259194 Chiu Nov 2006 A1
20060259494 Watson et al. Nov 2006 A1
20060278161 Burkholder et al. Dec 2006 A1
20060288519 Jaworski et al. Dec 2006 A1
20060293787 Kanda et al. Dec 2006 A1
20060293808 Qian Dec 2006 A1
20070006404 Cheng et al. Jan 2007 A1
20070016328 Ziegler et al. Jan 2007 A1
20070017061 Yan Jan 2007 A1
20070028574 Yan Feb 2007 A1
20070032904 Kawagoe et al. Feb 2007 A1
20070042716 Goodall et al. Feb 2007 A1
20070043459 Abbott et al. Feb 2007 A1
20070061041 Zweig Mar 2007 A1
20070061043 Ermakov et al. Mar 2007 A1
20070114975 Cohen et al. May 2007 A1
20070142964 Abramson Jun 2007 A1
20070150096 Yeh et al. Jun 2007 A1
20070156286 Yamauchi Jul 2007 A1
20070157415 Lee et al. Jul 2007 A1
20070157420 Lee et al. Jul 2007 A1
20070179670 Chiappetta et al. Aug 2007 A1
20070226949 Hahm et al. Oct 2007 A1
20070234492 Svendsen et al. Oct 2007 A1
20070244610 Ozick et al. Oct 2007 A1
20070245511 Hahm et al. Oct 2007 A1
20070250212 Halloran et al. Oct 2007 A1
20070261193 Gordon et al. Nov 2007 A1
20070266508 Jones et al. Nov 2007 A1
20080007203 Cohen et al. Jan 2008 A1
20080039974 Sandin et al. Feb 2008 A1
20080052846 Kapoor et al. Mar 2008 A1
20080091304 Ozick et al. Apr 2008 A1
20080109126 Sandin et al. May 2008 A1
20080134458 Ziegler et al. Jun 2008 A1
20080140255 Ziegler et al. Jun 2008 A1
20080155768 Ziegler et al. Jul 2008 A1
20080184518 Taylor et al. Aug 2008 A1
20080266748 Lee Oct 2008 A1
20080276407 Schnittman et al. Nov 2008 A1
20080281470 Gilbert et al. Nov 2008 A1
20080282494 Won et al. Nov 2008 A1
20080294288 Yamauchi Nov 2008 A1
20080302586 Yan Dec 2008 A1
20080307590 Jones et al. Dec 2008 A1
20090007366 Svendsen et al. Jan 2009 A1
20090038089 Landry et al. Feb 2009 A1
20090048727 Hong et al. Feb 2009 A1
20090049640 Lee et al. Feb 2009 A1
20090055022 Casey et al. Feb 2009 A1
20090102296 Greene et al. Apr 2009 A1
20090292393 Casey et al. Nov 2009 A1
20100006028 Buckley et al. Jan 2010 A1
20100011529 Won et al. Jan 2010 A1
20100049365 Jones et al. Feb 2010 A1
20100063628 Landry et al. Mar 2010 A1
20100082193 Chiappetta Apr 2010 A1
20100107355 Won et al. May 2010 A1
20100257690 Jones et al. Oct 2010 A1
20100257691 Jones et al. Oct 2010 A1
20100263158 Jones et al. Oct 2010 A1
20100268384 Jones et al. Oct 2010 A1
20100293742 Chung et al. Nov 2010 A1
20100312429 Jones et al. Dec 2010 A1
Foreign Referenced Citations (357)
Number Date Country
2128842 Dec 1980 DE
3317376 Dec 1987 DE
3536907 Feb 1989 DE
3404202 Dec 1992 DE
199311014 Oct 1993 DE
4338841 May 1995 DE
4414683 Oct 1995 DE
19849978 Feb 2001 DE
102004038074 Jun 2005 DE
10357636 Jul 2005 DE
102004041021 Aug 2005 DE
102005046813 Apr 2007 DE
338988 Dec 1988 DK
0265542 May 1988 EP
0281085 Sep 1988 EP
0286328 Oct 1988 EP
0294101 Dec 1988 EP
0352045 Jan 1990 EP
0433697 Jun 1991 EP
0437024 Jul 1991 EP
0554978 Aug 1993 EP
0615719 Sep 1994 EP
0792726 Sep 1997 EP
0930040 Jul 1999 EP
0845237 Apr 2000 EP
0861629 Sep 2001 EP
1228734 Aug 2002 EP
1380245 Jan 2004 EP
1380246 Jan 2004 EP
1018315 Nov 2004 EP
1553472 Jul 2005 EP
1557730 Jul 2005 EP
1642522 Apr 2006 EP
1836941 Sep 2007 EP
2238196 Aug 2005 ES
722755 Mar 1932 FR
2601443 Jan 1988 FR
2828589 Feb 2003 FR
702426 Jan 1954 GB
2128842 May 1984 GB
2225221 May 1990 GB
2267360 Dec 1993 GB
2283838 May 1995 GB
2284957 Jun 1995 GB
2300082 Oct 1996 GB
2344747 Jun 2000 GB
2404330 Feb 2005 GB
2417354 Feb 2006 GB
53021869 Feb 1978 JP
53110257 Sep 1978 JP
57064217 Apr 1982 JP
59005315 Jan 1984 JP
59033511 Mar 1984 JP
59094005 May 1984 JP
59099308 Jun 1984 JP
59112311 Jun 1984 JP
59120124 Jul 1984 JP
59131668 Sep 1984 JP
59164973 Sep 1984 JP
59184917 Oct 1984 JP
2283343 Nov 1984 JP
59212924 Dec 1984 JP
59226909 Dec 1984 JP
60089213 May 1985 JP
60211510 Oct 1985 JP
60259895 Dec 1985 JP
61023221 Jan 1986 JP
61097712 May 1986 JP
61160366 Jul 1986 JP
62070709 Apr 1987 JP
62074018 Apr 1987 JP
62120510 Jun 1987 JP
62154008 Jul 1987 JP
62164431 Jul 1987 JP
62263507 Nov 1987 JP
62263508 Nov 1987 JP
62189057 Dec 1987 JP
63079623 Apr 1988 JP
63158032 Jul 1988 JP
63192414 Aug 1988 JP
63203483 Aug 1988 JP
63241610 Oct 1988 JP
1118752 Aug 1989 JP
2-6312 Jan 1990 JP
3051023 Mar 1991 JP
3162814 Jul 1991 JP
4019586 Jan 1992 JP
4074285 Mar 1992 JP
4084921 Mar 1992 JP
04-083393 Jul 1992 JP
5023269 Feb 1993 JP
5035330 Feb 1993 JP
5042076 Feb 1993 JP
5046246 Feb 1993 JP
5091604 Apr 1993 JP
5095879 Apr 1993 JP
5150827 Jun 1993 JP
5150829 Jun 1993 JP
5054620 Jul 1993 JP
5040519 Oct 1993 JP
5257527 Oct 1993 JP
5257533 Oct 1993 JP
5285861 Nov 1993 JP
5302836 Nov 1993 JP
5312514 Nov 1993 JP
5046239 Dec 1993 JP
5341904 Dec 1993 JP
6003251 Jan 1994 JP
6038912 Feb 1994 JP
6105781 Apr 1994 JP
6137828 May 1994 JP
6154143 Jun 1994 JP
6293095 Oct 1994 JP
6327598 Nov 1994 JP
7047046 Feb 1995 JP
7129239 May 1995 JP
7059702 Jun 1995 JP
7202792 Aug 1995 JP
7222705 Aug 1995 JP
7270518 Oct 1995 JP
7313417 Dec 1995 JP
8000393 Jan 1996 JP
8016776 Jan 1996 JP
8084696 Apr 1996 JP
8089449 Apr 1996 JP
8089451 Apr 1996 JP
8123548 May 1996 JP
8125767 May 1996 JP
8152916 Jun 1996 JP
8263137 Oct 1996 JP
8335112 Dec 1996 JP
8339297 Dec 1996 JP
943901 Feb 1997 JP
9044240 Feb 1997 JP
9066855 Mar 1997 JP
9145309 Jun 1997 JP
9160644 Jun 1997 JP
9179625 Jul 1997 JP
9185410 Jul 1997 JP
9192069 Jul 1997 JP
2555263 Aug 1997 JP
9204223 Aug 1997 JP
9206258 Aug 1997 JP
9233712 Sep 1997 JP
9265319 Oct 1997 JP
9269807 Oct 1997 JP
9269810 Oct 1997 JP
9319431 Dec 1997 JP
9319432 Dec 1997 JP
9319434 Dec 1997 JP
9325812 Dec 1997 JP
10055215 Feb 1998 JP
10113318 May 1998 JP
10117973 May 1998 JP
10118963 May 1998 JP
10165738 Jun 1998 JP
10177414 Jun 1998 JP
10295595 Nov 1998 JP
10314088 Dec 1998 JP
11015941 Jan 1999 JP
11102220 Apr 1999 JP
11162454 Jun 1999 JP
11174145 Jul 1999 JP
11175149 Jul 1999 JP
11178764 Jul 1999 JP
11178765 Jul 1999 JP
11212642 Aug 1999 JP
11213157 Aug 1999 JP
11282532 Oct 1999 JP
11282533 Oct 1999 JP
11295412 Oct 1999 JP
11355558 Dec 1999 JP
2000047728 Feb 2000 JP
2000056006 Feb 2000 JP
2000056831 Feb 2000 JP
2000060782 Feb 2000 JP
2000066722 Mar 2000 JP
2000075925 Mar 2000 JP
2000102499 Apr 2000 JP
2000135186 May 2000 JP
2000235416 Aug 2000 JP
2000275321 Oct 2000 JP
2000279353 Oct 2000 JP
2000342496 Dec 2000 JP
2000353013 Dec 2000 JP
2000353014 Dec 2000 JP
2001022443 Jan 2001 JP
2001067588 Mar 2001 JP
2001087182 Apr 2001 JP
2001121455 May 2001 JP
2001125641 May 2001 JP
2001508572 Jun 2001 JP
2001197008 Jul 2001 JP
2001203744 Jul 2001 JP
3197758 Aug 2001 JP
3201903 Aug 2001 JP
2001216482 Aug 2001 JP
2001258807 Sep 2001 JP
2001265437 Sep 2001 JP
2001275908 Oct 2001 JP
2001289939 Oct 2001 JP
2001306170 Nov 2001 JP
2001344161 Dec 2001 JP
2002073170 Mar 2002 JP
2002078650 Mar 2002 JP
2002204768 Jul 2002 JP
2002204769 Jul 2002 JP
2002247510 Aug 2002 JP
2002312275 Oct 2002 JP
2002532180 Oct 2002 JP
2002321180 Nov 2002 JP
2002323925 Nov 2002 JP
2002333920 Nov 2002 JP
2002341937 Nov 2002 JP
2002355206 Dec 2002 JP
2002360471 Dec 2002 JP
2002360482 Dec 2002 JP
2002366227 Dec 2002 JP
2002369778 Dec 2002 JP
2003005296 Jan 2003 JP
2003006532 Jan 2003 JP
2003010076 Jan 2003 JP
2003010088 Jan 2003 JP
2003016203 Jan 2003 JP
2003018670 Jan 2003 JP
2003028528 Jan 2003 JP
2003036116 Feb 2003 JP
2003038401 Feb 2003 JP
2003038402 Feb 2003 JP
2003047579 Feb 2003 JP
2003061882 Mar 2003 JP
2003084994 Mar 2003 JP
2003167628 Jun 2003 JP
2003180586 Jul 2003 JP
2003180587 Jul 2003 JP
2003186539 Jul 2003 JP
2003190064 Jul 2003 JP
2003521204 Jul 2003 JP
2003241836 Aug 2003 JP
2003262520 Sep 2003 JP
2003304992 Oct 2003 JP
2003310509 Nov 2003 JP
2003330543 Nov 2003 JP
2004123040 Apr 2004 JP
2004135993 May 2004 JP
2004148021 May 2004 JP
2004515090 May 2004 JP
2004160102 Jun 2004 JP
2004166968 Jun 2004 JP
2004-195215 Jul 2004 JP
2004185586 Jul 2004 JP
2004198330 Jul 2004 JP
2004219185 Aug 2004 JP
2004336401 Nov 2004 JP
2004351234 Dec 2004 JP
2004357187 Dec 2004 JP
2005025516 Jan 2005 JP
2005101887 Apr 2005 JP
2005118354 May 2005 JP
2005192988 Jul 2005 JP
2005211360 Aug 2005 JP
2005218559 Aug 2005 JP
2005224265 Aug 2005 JP
2005230032 Sep 2005 JP
2005230948 Sep 2005 JP
2005245916 Sep 2005 JP
2005296512 Oct 2005 JP
2005324278 Nov 2005 JP
2005352707 Dec 2005 JP
2006043071 Feb 2006 JP
2006155274 Jun 2006 JP
2006164223 Jun 2006 JP
2006227673 Aug 2006 JP
2006247467 Sep 2006 JP
2006260161 Sep 2006 JP
2006293662 Oct 2006 JP
2006296697 Nov 2006 JP
2007034866 Feb 2007 JP
2007213180 Aug 2007 JP
2009015611 Jan 2009 JP
2010198552 Sep 2010 JP
20020037618 May 2002 KR
20020061341 Jul 2002 KR
9526512 Oct 1995 WO
9530887 Nov 1995 WO
9617258 Jun 1996 WO
9715224 May 1997 WO
9740734 Nov 1997 WO
9741451 Nov 1997 WO
9853456 Nov 1998 WO
9905580 Feb 1999 WO
9916078 Apr 1999 WO
9938056 Jul 1999 WO
9938237 Jul 1999 WO
9943250 Sep 1999 WO
0038026 Jun 2000 WO
0038028 Jun 2000 WO
0038029 Jun 2000 WO
0004430 Oct 2000 WO
0078410 Dec 2000 WO
0106904 Feb 2001 WO
0106905 Feb 2001 WO
0155879 Aug 2001 WO
0180703 Nov 2001 WO
0191623 Dec 2001 WO
0195557 Dec 2001 WO
0224292 Mar 2002 WO
0239864 May 2002 WO
0239868 May 2002 WO
02058527 Aug 2002 WO
02062194 Aug 2002 WO
02067744 Sep 2002 WO
02067745 Sep 2002 WO
02067752 Sep 2002 WO
02069774 Sep 2002 WO
02069775 Sep 2002 WO
02071175 Sep 2002 WO
02074150 Sep 2002 WO
02075350 Sep 2002 WO
02075356 Sep 2002 WO
02075469 Sep 2002 WO
02075470 Sep 2002 WO
02081074 Oct 2002 WO
02101477 Dec 2002 WO
03015220 Feb 2003 WO
03024292 Mar 2003 WO
03040546 May 2003 WO
03040845 May 2003 WO
03040846 May 2003 WO
03062850 Jul 2003 WO
03062852 Jul 2003 WO
2004004533 Jan 2004 WO
2004004534 Jan 2004 WO
2004006034 Jan 2004 WO
2004025947 Mar 2004 WO
2004058028 Jul 2004 WO
2004059409 Jul 2004 WO
2005006935 Jan 2005 WO
2005037496 Apr 2005 WO
2005055795 Jun 2005 WO
2005055796 Jun 2005 WO
2005076545 Aug 2005 WO
2005077243 Aug 2005 WO
2005077244 Aug 2005 WO
2005081074 Sep 2005 WO
2005083541 Sep 2005 WO
2005098475 Oct 2005 WO
2005098476 Oct 2005 WO
2006046400 May 2006 WO
2006061133 Jun 2006 WO
2006068403 Jun 2006 WO
2006073248 Jul 2006 WO
2006089307 Aug 2006 WO
2007028049 Mar 2007 WO
2007036490 Apr 2007 WO
2007065033 Jun 2007 WO
2007137234 Nov 2007 WO
Non-Patent Literature Citations (214)
Entry
Andersen et al., “Landmark based navigation strategies,” SPIE Conference on Mobile Robots XIII, SPIE vol. 3525, pp. 170-181, Jan. 8, 1999.
Ascii, Mar. 25, 2002, http://ascii.jp/elem/000/000/330/330024/, accessed Nov. 2011, 7 pages.
Barker, “Navigation by the Stars—Ben Barker 4th Year Project,” Nov. 2004, 20 pages.
Becker et al., “Reliable Navigation Using Landmarks,” IEEE International Conference on Robotics and Automation, 0-7803-1965-6, pp. 401-406, 1995.
Benayad-Cherif et al., “Mobile Robot Navigation Sensors,” SPIE vol. 1831 Mobile Robots, VII, pp. 378-387, 1992.
Bison et al., “Using a structured beacon for cooperative position estimation,” Robotics and Autonomous Systems, 29(1):33-40, Oct. 1999.
Blaasvaer et al., “AMOR—An Autonomous Mobile Robot Navigation System,” Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics, pp. 2266-2271, 1994.
Borges et al., “Optimal Mobile Robot Pose Estimation Using Geometrical Maps,” IEEE Transactions on Robotics and Automation, 18(1): 87-94, Feb. 2002.
Braunstingl et al., “Fuzzy Logic Wall Following of a Mobile Robot Based on the Concept of General Perception,” ICAR '95, 7th International Conference on Advanced Robotics, Sant Feliu De Guixols, Spain, pp. 367-376, Sep. 1995.
Bulusu et al., “Self Configuring Localization systems: Design and Experimental Evaluation,” ACM Transactions on Embedded Computing Systems, 3(1):24-60, 2003.
Caccia et al., “Bottom-Following for Remotely Operated Vehicles,” 5th IFAC Conference, Alaborg, Denmark, pp. 245-250, Aug. 2000.
U.S. Appl. No. 60/605,066 as provided to WIPO in PCT/US2005/030422, corresponding to U.S. Appl. No. 11/574,290, U.S. publication 2008/0184518, filed Aug. 27, 2004.
U.S. Appl. No. 60/605,181 as provided to WIPO in PCT/US2005/030422, corresponding to U.S. Appl. No. 11/574,290, U.S. publication 2008/0184518, filed Aug. 27, 2004.
Chae et al., “StarLITE: A new artificial landmark for the navigation of mobile robots,” http://www.irc.atr.jp/jk-nrs2005/pdf/Starlite.pdf, 4 pages, 2005.
Chamberlin et al., “Team 1: Robot Locator Beacon System, ” NASA Goddard SFC, Design Proposal, 15 pages, Feb. 2006.
Champy, “Physical management of IT assets in Data Centers using RFID technologies,” RFID 2005 University, Oct. 12-14, 2005 , 19 pages.
Chili, “Joystick Control for Tiny OS Robot,” http://www.eecs.berkeley.edu/Programs/ugrad/supeth/papers2002/chiri.pdf. 12 pages, Aug. 2002.
Christensen et al. “Theoretical Methods for Planning and Control in Mobile Robotics,” 1997 First International Conference on Knowledge-Based Intelligent Electronic Systems, Adelaide, Australia, pp. 81-86, May 1997.
CleanMate 365, Intelligent Automatic Vacuum Cleaner, Model No. QQ-1, User Manual www.metapo.com/support/user—manual.pdf 11 pages.
Clerentin et al., “A localization method based on two omnidirectional perception systems cooperation,” Proc of IEEE International Conference on Robotics & Automation, San Francisco, CA vol. 2, pp. 1219-1224, Apr. 2000.
Corke, “High Performance Visual serving for robots end-point control,” SPIE vol. 2056, Intelligent Robots and Computer Vision, 1993, 10 pages.
Cozman et al., “Robot Localization using a Computer Vision Sextant,” IEEE International Midwest Conference on Robotics and Automation, pp. 106-111, 1995.
D'Orazio et al., “Model based Vision System for mobile robot position estimation”, SPIE, vol. 2058 Mobile Robots VIII, pp. 38-49, 1992.
De Bakker et al., “Smart PSD-array for sheet of light range imaging”, Proc. of SPIE, vol. 3965, pp. 1-12, May 2000.
Denning Roboscrub image (1989), 1 page.
Desaulniers et al., “An Efficient Algorithm to find a shortest path for a car-like Robot,” IEEE Transactions on robotics and Automation , 11(6):819-828, Dec. 1995.
Dorfmüller-Ulhaas, “Optical Tracking From User Motion to 3D Interaction,” http://www.cg.tuwien.ac.at/research/publications/2002/Dorfmueller-Ulhaas-thesis, 182 pages, 2002.
Dorsch et al., “Laser Triangulation. Fundamental uncertainty in distance measurement,” Applied Optics, 33(7):1306-1314, Mar. 1994.
Doty et al., “Sweep Strategies for a Sensory-Driven, Behavior-Based Vacuum Cleaning Agent,” AAAI 1993 Fall Symposium Series, Instantiating Real-World Agents, pp. 1-6, Oct. 22-24, 1993.
Dudek et al., “Localizing a Robot with Minimum Travel” Proceedings of the sixth annual ACM-SIAM symposium on Discrete Algorithms, 27(2):583-604, Apr. 1998.
Dulimarta et al., “Mobile Robot Localization in Indoor Environment”, Pattern Recognition, 30(1):99-111, 1997.
Dyson's Robot Vacuum Cleaner—the DC06, May 2004, Retrieved from the Internet: URL< http://www.gizmag.com/go/1282/>. Accessed Nov. 2011, 3 pages.
EBay, “Roomba Timer -> Timed Cleaning—Floorvac Robotic Vacuum,” Retrieved from the Internet: URL Cgi.ebay.com/ws/eBay|SAP|.dll?viewitem&category=43526&item=4375198387&rd=1, 5 pages, Apr. 2005.
Electrolux Trilobite, “Time to enjoy life,” Retrieved from the Internet: URL<http://www.robocon.co.kr/trilobite/Presentation—Trilobite—Kor—030104.ppt, 26 pages, accessed Dec. 2011.
Electrolux Trilobite, Jan. 12, 2001, http://www.electroluxui.com:8080/2002%5C822%5C833102EN.pff, accessed Jul. 2, 2012, 10 pages.
Electrolux, “Designed for the well-lived home,” Retrieved from the Internet: URL<http://www.electroluxusa.com/node57.as[?currentURL=node142.asp%3F >. Accessed Mar. 2005, 5 pages.
Electrolux, “Welcome to the Electrolux trilobite,” Retrieved from the Internet: URL<www.electroluxusa.com/node57.asp?currentURL=node142.asp%3F>. 2 pages, Mar. 2005.
Eren et al., “Accuracy in position estimation of mobile robots based on coded infrared signal transmission,” Proceedings: Integrating Intelligent Instrumentation and Control, Instrumentation and Measurement Technology Conference, 1995, IMTC/95. pp. 548-551, 1995.
Eren et al., “Operation of Mobile Robots in a Structured Infrared Environment,” Proceedings ‘Sensing, Processing, Networking’, IEEE Instrumentation and Measurement Technology Conference, 1997 (IMTC/97), Ottawa, Canada vol. 1, pp. 20-25, May 1997.
Euroflex Intelligente Monstre, (English excerpt only), 2006, 15 pages.
Euroflex, Jan. 2006, Retrieved from the Internet: URL< http://www.euroflex.tv/novita—dett.php?id=15, accessed Nov. 2011, 1 page.
eVac Robotic Vacuum S1727 Instruction Manual, Sharper Image Corp, Copyright 2004, 16 pages.
Everyday Robots, “Everyday Robots: Reviews, Discussion and News for Consumers,” Aug. 2004, Retrieved from the Internet: URL< www.everydayrobots.com/index.php?option=content&task=view&id=9> (Sep. 2012), 4 pages.
Evolution Robotics, “NorthStar—Low-cost Indoor Localiztion—How it Works,” E Evolution Robotics , 2 pages, 2005.
Facchinetti Claudio et al., “Self-Positioning Robot Navigation Using Ceiling Images Sequences,” ACCV '95, 5 pages, Dec. 1995.
Facchinetti Claudio et al., “Using and Learning Vision-Based Self-Positioning for Autonomous Robot Navigation,” ICARCV '94, vol. 3, pp. 1694-1698, 1994.
Facts on Trilobite, webpage, Retrieved from the Internet: URL< http://trilobiteelectroluxse/presskit—en/model11335asp?print=yes&pressID=>. 2 pages, accessed Dec. 2003.
Fairfield et al., “Mobile Robot Localization with Sparse Landmarks,” SPIE vol. 4573, pp. 148-155, 2002.
Favre-Bulle, “Efficient tracking of 3D—Robot Position by Dynamic Triangulation,” IEEE Instrumentation and Measurement Technology Conference IMTC 98 Session on Instrumentation and Measurement in Robotics, vol. 1, pp. 446-449, May 1998.
Fayman, “Exploiting Process Integration and Composition in the context of Active Vision,” IEEE Transactions on Systems, Man, and Cybernetics—Part C: Application and reviews, vol. 29, No. 1, pp. 73-86, Feb. 1999.
Flombot GE Plastics—IMAGE, available at http://www.fuseid.com/, 1989-1990, Accessed Sep. 2012, 1 page.
Franz et al., “Biomimetric robot navigation”, Robotics and Autonomous Systems, vol. 30 pp. 133-153, 2000.
Friendly Robotics, “Friendly Robotics—Friendly Vac, Robotic Vacuum Cleaner,” Retrieved from the Internet: URL< www.friendlyrobotics.com/vac.htm > 5 pages, Apr. 2005.
Friendly Robotics, Retrieved from the Internet: URL<http://www.robotsandrelax.com/PDFs/RV400Manual.pdf>. 18 pages, accessed Dec. 2011.
Fuentes et al., “Mobile Robotics 1994,” University of Rochester. Computer Science Department, TR 588, 44 pages, Dec. 1994.
Fukuda et al., “Navigation System based on Ceiling Landmark Recognition for Autonomous mobile robot,” 1995 IEEE/RSJ International Conference on Intelligent Robots and Systems 95. ‘Human Robot Interaction and Cooperative Robots’, Pittsburgh, PA, pp. 1466/1471, Aug. 1995.
Gat, “Robust Low-Computation Sensor-driven Control for Task-Directed Navigation,” Proc of IEEE International Conference on Robotics and Automation , Sacramento, CA pp. 2484-2489, Apr. 1991.
Gionis, “A hand-held optical surface scanner for environmental Modeling and Virtual Reality,” Virtual Reality World, 16 pages, 1996.
Goncalves et al., “A Visual Front-End for Simultaneous Localization and Mapping”, Proceedings of the 2005 IEEE International Conference on Robotics and Automation, Barcelona, Spain, pp. 44-49, Apr. 2005.
Gregg et al., “Autonomous Lawn Care Applications,” 2006 Florida Conference on Recent Advances in Robotics, Miami, Florida, May 25-26, 2006, Florida International University, 5 pages.
Grumet, “Robots Clean House,” Popular Mechanics, Nov. 2003, 3 pages.
Hamamatsu “SI PIN Diode S5980, S5981 S5870—Multi-element photodiodes for surface mounting,” Hamatsu Photonics, 2 pages, Apr. 2004.
Hammacher Schlemmer, "Electrolux Trilobite Robotic Vacuum," Retrieved from the Internet: URL<www.hammacher.com/publish/71579.asp?promo=xsells>. 3 pages, Mar. 2005.
Haralick et al. “Pose Estimation from Corresponding Point Data”, IEEE Transactions on Systems, Man, and Cybernetics, 19(6):1426-1446, Nov. 1989.
Hausler, “About the Scaling Behaviour of Optical Range Sensors,” Fringe '97, Proceedings of the 3rd International Workshop on Automatic Processing of Fringe Patterns, Bremen, Germany, pp. 147-155, Sep. 1997.
Hitachi ‘Feature’, http://kadenfan.hitachi.co.jp/robot/feature/feature.html, Accessed Nov. 19, 2008, 1 page.
Hitachi, http://www.hitachi.co.jp/New/cnews/hi_030529_hi_030529.pdf, 8 pages, May 29, 2003.
Hitachi: News release: "The home cleaning robot of the autonomous movement type (experimental machine)," Retrieved from the Internet: URL<www.i4u.com/japanreleases/hitachirobot.htm>. 5 pages, Mar. 2005.
Hoag et al., “Navigation and Guidance in interstellar space,” ACTA Astronautica, vol. 2, pp. 513-533 , Feb. 1975.
Home Robot—UBOT; Microrobotusa.com, retrieved from the WWW at www.microrobotusa.com, accessed Dec. 2, 2008, 2 pages.
Huntsberger et al., “CAMPOUT: A Control Architecture for Tightly Coupled Coordination of Multirobot Systems for Planetary Surface Exploration,” IEEE Transactions on Systems, Man, and Cybernetics—Part A: Systems and Humans, 33(5):550-559, Sep. 2003.
Iirobotics.com, "Samsung Unveils Its Multifunction Robot Vacuum," Retrieved from the Internet: URL<www.iirobotics.com/webpages/hotstuff.php?ubre=111>. 3 pages, Mar. 2005.
InMach “Intelligent Machines,” Retrieved from the Internet: URL<www.inmach.de/inside.html> 1 page , Nov. 2008.
Innovation First, “2004 EDU Robot Controller Reference Guide,” Retrieved from the Internet: URL<http://www.ifirobotics.com>. 13 pages, Mar. 2004.
IT media, Retrieved from the Internet: URL<http://www.itmedia.co.jp/news/0111/16/robofesta_m.html>. Accessed Nov. 1, 2011, 4 pages.
It's eye, Retrieved from the Internet: URL<www.hitachi.co.jp/rd/pdf/topics/hitac2003_10.pdf>. 2 pages, 2003.
Jarosiewicz et al., "Final Report—Lucid," University of Florida, Department of Electrical and Computer Engineering, EEL 5666—Intelligent Machine Design Laboratory, 50 pages, Aug. 1999.
Jensfelt et al., "Active Global Localization for a mobile robot using multiple hypothesis tracking," IEEE Transactions on Robotics and Automation, 17(5):748-760, Oct. 2001.
Jeong et al., “An intelligent map-building system for indoor mobile robot using low cost photo sensors,” SPIE, vol. 6042, 6 pages, 2005.
Kahney, "Robot Vacs are in the House," Retrieved from the Internet: URL<www.wired.com/news/technology/0,1282,59237,00.html>. 6 pages, Jun. 2003.
Karcher “Karcher RoboCleaner RC 3000,” Retrieved from the Internet: URL<www.robocleaner.de/english/screen3.html> 4 pages, Dec. 2003.
Karcher Product Manual Download webpage: Retrieved from the Internet: URL<http://www.karcher.com/bta/download.en.shtml?ACTION=SELECTTEILENR&ID=rc3000&submitButtonName=Select+Product+Manual and associated .pdf file “5959-915en.pdf (4.7 MB) English/English,” 16 pages, accessed Jan. 2004.
Karcher RC 3000 Cleaning Robot-user manual Manufacturer: Alfred-Karcher GmbH & Co, Cleaning Systems, Alfred Karcher-Str 28-40, PO Box 160, D-71349 Winnenden, Germany, Dec. 2002, 8 pages.
Karcher RC3000 RoboCleaner—IMAGE, Accessed at <http://www.karcher.de/versions/int/assets/video/2_4_robo_en.swf>. Accessed Sep. 2009, 1 page.
Karcher USA, RC3000 Robotic Cleaner, website: http://www.karcher-usa.com/showproducts.php?op=view_prod&param1=143&param2=&param3=, 3 pages, accessed Mar. 2005.
Karcher, “Product Manual Download Karch”, available at www.karcher.com, 16 pages, 2004.
Karlsson et al, “Core Technologies for service Robotics,” IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2004), vol. 3, pp. 2979-2984, Sep. 2004.
Karlsson et al., The vSLAM Algorithm for Robust Localization and Mapping, Proceedings of the 2005 IEEE International Conference on Robotics and Automation, Barcelona, Spain, pp. 24-29, Apr. 2005.
King and Weiman, "Helpmate™ Autonomous Mobile Robot Navigation System," SPIE vol. 1388 Mobile Robots, pp. 190-198, 1990.
Kleinberg, The Localization Problem for Mobile Robots, Laboratory for Computer Science, Massachusetts Institute of Technology, 1994 IEEE, pp. 521-531, 1994.
Knights, et al., “Localization and Identification of Visual Landmarks,” Journal of Computing Sciences in Colleges, 16(4):312-313, May 2001.
Kolodko et al., “Experimental System for Real-Time Motion Estimation,” Proceedings of the 2003 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM 2003), pp. 981-986, 2003.
Komoriya et al., "Planning of Landmark Measurement for the Navigation of a Mobile Robot," Proceedings of the 1992 IEEE/RSJ International Conference on Intelligent Robots and Systems, Raleigh, NC, pp. 1476-1481, Jul. 1992.
Koolvac Robotic Vacuum Cleaner Owner's Manual, Koolatron, 2004, 13 pages.
Krotkov et al., “Digital Sextant,” Downloaded from the internet at: http://www.cs.cmu.edu/˜epk/ , 1 page, 1995.
Krupa et al., "Autonomous 3-D Positioning of Surgical Instruments in Robotized Laparoscopic Surgery Using Visual Servoing," IEEE Transactions on Robotics and Automation, 19(5):842-853, Oct. 2003.
Kuhl et al., “Self Localization in Environments using Visual Angles,” VRCAI '04 Proceedings of the 2004 ACM SIGGRAPH international conference on Virtual Reality continuum and its applications in industry, pp. 472-475, 2004.
Kurs et al, Wireless Power transfer via Strongly Coupled Magnetic Resonances, Downloaded from www.sciencemag.org, Aug. 2007, 5 pages.
Kurth, "Range-Only Robot Localization and SLAM with Radio," http://www.ri.cmu.edu/pub_files/pub4/kurth_derek_2004_1/kurth_derek_2004_1.pdf, 60 pages, May 2004, accessed Jul. 27, 2012.
Kwon et al., “Table Recognition through Range-based Candidate Generation and Vision based Candidate Evaluation,” ICAR 2007, The 13th International Conference on Advanced Robotics Aug. 21-24, 2007, Jeju, Korea, pp. 918-923, 2007.
Lambrinos et al., "A mobile robot employing insect strategies for navigation," Retrieved from the Internet: URL<http://www8.cs.umu.se/kurser/TDBD17/VT04/dl/Assignment%20Papers/lambrinos-RAS-2000.pdf>. 38 pages, Feb. 1999.
Lang et al., “Visual Measurement of Orientation Using Ceiling Features”, 1994 IEEE, pp. 552-555, 1994.
Lapin, “Adaptive position estimation for an automated guided vehicle,” SPIE, vol. 1831 Mobile Robots VII, pp. 82-94, 1992.
LaValle et al., “Robot Motion Planning in a Changing, Partially Predictable Environment,” 1994 IEEE International Symposium on Intelligent Control, Columbus, OH, pp. 261-266, Aug. 1994.
Lee et al., “Development of Indoor Navigation system for Humanoid Robot Using Multi-sensors Integration”, ION NTM, San Diego, CA pp. 798-805, Jan. 2007.
Lee et al., “Localization of a Mobile Robot Using the Image of a Moving Object,” IEEE Transaction on Industrial Electronics, 50(3):612-619, Jun. 2003.
Leonard et al., “Mobile Robot Localization by tracking Geometric Beacons,” IEEE Transaction on Robotics and Automation, 7(3):376-382, Jun. 1991.
Li et al. “Robust Statistical Methods for Securing Wireless Localization in Sensor Networks,” Information Processing in Sensor Networks, 2005, Fourth International Symposium on, pp. 91-98, Apr. 2005.
Li et al., “Making a Local Map of Indoor Environments by Swiveling a Camera and a Sonar,” Proceedings of the 1999 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 954-959, 1999.
Lin et al., "Mobile Robot Navigation Using Artificial Landmarks," Journal of Robotic Systems, 14(2):93-106, 1997.
Linde, Dissertation—“On Aspects of Indoor Localization,” Available at: https://eldorado.tu-dortmund.de/handle/2003/22854, University of Dortmund, 138 pages, Aug. 2006.
Lumelsky et al., “An Algorithm for Maze Searching with Azimuth Input”, 1994 IEEE International Conference on Robotics and Automation, San Diego, CA vol. 1, pp. 111-116, 1994.
Luo et al., “Real-time Area-Covering Operations with Obstacle Avoidance for Cleaning Robots,” IEEE, pp. 2359-2364, 2002.
Luo et al., “Networked intelligent robots through the Internet: Issues and Opportunities,” Proceedings of the IEEE, Mar. 2003, 91(3):371-382.
Luo et al., “Development of a multibehavior-based mobile robot for remote supervisory control through the Internet,” IEEE/ASME Transactions on Mechatronics, 2000, 5(4):376-385.
Ma, Thesis—“Documentation on Northstar,” California Institute of Technology, 14 pages, May 2006.
Madsen et al., “Optimal landmark selection for triangulation of robot position,” Journal of Robotics and Autonomous Systems, vol. 13 pp. 277-292, 1998.
Malik et al., “Virtual Prototyping for Conceptual Design of a Tracked Mobile Robot,” Electrical and Computer Engineering, Canadian Conference on, IEEE, PI. pp. 2349-2352, May 2006.
Martishevcky, “The Accuracy of point light target coordinate determination by dissectoral tracking system”, SPIE vol. 2591, pp. 25-30, Oct. 23, 2005.
Maschinenmarkt, Würzburg 105, No. 27, pp. 3, 30, Jul. 5, 1999.
Matsumura Camera Online Shop: Retrieved from the Internet: URL< http://www.rakuten.co.jp/matsucame/587179/711512/. Accessed Nov. 2011, 7 pages.
Matsutek Enterprises Co. Ltd, “Automatic Rechargeable Vacuum Cleaner,” http://matsutek.manufacturer.globalsources.com/si/6008801427181/pdtl/Home-vacuum/10 . . . , Apr. 2007, 3 pages.
McGillem et al., "Infra-red Location System for Navigation and Autonomous Vehicles," 1988 IEEE International Conference on Robotics and Automation, vol. 2, pp. 1236-1238, Apr. 1988.
McGillem et al., "A Beacon Navigation Method for Autonomous Vehicles," IEEE Transactions on Vehicular Technology, 38(3):132-139, Aug. 1989.
McLurkin “Stupid Robot Tricks: A Behavior-based Distributed Algorithm Library for Programming Swarms of Robots,” Paper submitted for requirements of BSEE at MIT, May 2004, 127 pages.
McLurkin, “The Ants: A community of Microrobots,” Paper submitted for requirements of BSEE at MIT, May 1995, 60 pages.
Miro et al., “Towards Vision Based Navigation in Large Indoor Environments,” Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Beijing, China, pp. 2096-2102, Oct. 2006.
Miwako Doi, "Using the symbiosis of human and robots from approaching Research and Development Center," Toshiba Corporation, 16 pages, available at http://warp.ndl.go.jp/info:ndljp/pid/258151/www.soumu.go.jp/joho_tsusin/policyreports/chousa/netrobot/pdf/030214_1_33_a.pdf, 2003.
MobileMag, "Samsung Unveils High-tech Robot Vacuum Cleaner," Retrieved from the Internet: URL<http://www.mobilemag.com/content/100/102/C2261/>. 4 pages, Mar. 2005.
Monteiro et al., “Visual Servoing for Fast Mobile Robot: Adaptive Estimation of Kinematic Parameters,” Proceedings of the IECON '93., International Conference on Industrial Electronics, Maui, HI, pp. 1588-1593, Nov. 1993.
Moore et al., "A simple Map-based Localization strategy using range measurements," SPIE, vol. 5804, pp. 612-620, 2005.
Morland,“Autonomous Lawnmower Control”, Downloaded from the internet at: http://cns.bu.edu/˜cjmorlan/robotics/lawnmower/report.pdf, 10 pages, Jul. 2002.
Munich et al., “ERSP: A Software Platform and Architecture for the Service Robotics Industry,” Intelligent Robots and Systems, 2005. (IROS 2005), pp. 460-467, Aug. 2005.
Munich et al., “SIFT-ing Through Features with ViPR”, IEEE Robotics & Automation Magazine, pp. 72-77, Sep. 2006.
Nam et al., “Real-Time Dynamic Visual Tracking Using PSD Sensors and extended Trapezoidal Motion Planning”, Applied Intelligence 10, pp. 53-70, 1999.
Nitu et al., "Optomechatronic System for Position Detection of a Mobile Mini-Robot," IEEE Transactions on Industrial Electronics, 52(4):969-973, Aug. 2005.
On Robo, "Robot Reviews Samsung Robot Vacuum (VC-RP30W)," Retrieved from the Internet: URL<www.onrobo.com/reviews/AT_Home/vacuum_cleaners/on00vcrb30rosam/index.htm>. 2 pages, 2005.
OnRobo, "Samsung Unveils Its Multifunction Robot Vacuum," Retrieved from the Internet: URL<www.onrobo.com/enews/0210/samsung_vacuum.shtml>. 3 pages, Mar. 2005.
Pages et al., "A camera-projector system for robot positioning by visual servoing," Proceedings of the 2006 Conference on Computer Vision and Pattern Recognition Workshop (CVPRW06), 8 pages, Jun. 2006.
Pages et al., “Optimizing Plane-to-Plane Positioning Tasks by Image-Based Visual Servoing and Structured Light,” IEEE Transactions on Robotics, 22(5):1000-1010, Oct. 2006.
Pages et al., “Robust decoupled visual servoing based on structured light,” 2005 IEEE/RSJ, Int. Conf. on Intelligent Robots and Systems, pp. 2676-2681, 2005.
Park et al., "A Neural Network Based Real-Time Robot Tracking Controller Using Position Sensitive Detectors," IEEE World Congress on Computational Intelligence, 1994 IEEE International Conference on Neural Networks, Orlando, Florida, pp. 2754-2758, Jun./Jul. 1994.
Park et al., "Dynamic Visual Servo Control of Robot Manipulators using Neural Networks," The Korean Institute of Telematics and Electronics, 29-B(10):771-779, Oct. 1992.
Paromtchik, "Toward Optical Guidance of Mobile Robots," Proceedings of the Fourth World Multiconference on Systemics, Cybernetics and Informatics, Orlando, FL, USA, Jul. 23, 2000, vol. IX, pp. 44-49, available at http://emotion.inrialpes.fr/˜paromt/infos/papers/paromtchik:asama:sci:2000.ps.gz, accessed Jul. 3, 2012, 6 pages.
Paromtchik et al., “Optical Guidance System for Multiple mobile Robots,” Proceedings 2001 ICRA. IEEE International Conference on Robotics and Automation, vol. 3, pp. 2935-2940, May 2001.
Penna et al., "Models for Map Building and Navigation," IEEE Transactions on Systems, Man, and Cybernetics, 23(5):1276-1301, Sep./Oct. 1993.
Pirjanian et al. “Representation and Execution of Plan Sequences for Multi-Agent Systems,” Proceedings of the 2001 IEEE/RSJ International Conference on Intelligent Robots and Systems, Maui, Hawaii, pp. 2117-2123, Oct. 2001.
Pirjanian et al., “A decision-theoretic approach to fuzzy behavior coordination”, 1999 IEEE International Symposium on Computational Intelligence in Robotics and Automation, 1999. CIRA '99., Monterey, CA, pp. 101-106, Nov. 1999.
Pirjanian et al., “Distributed Control for a Modular, Reconfigurable Cliff Robot,” Proceedings of the 2002 IEEE International Conference on Robotics & Automation, Washington, D.C. pp. 4083-4088, May 2002.
Pirjanian et al., “Improving Task Reliability by Fusion of Redundant Homogeneous Modules Using Voting Schemes,” Proceedings of the 1997 IEEE International Conference on Robotics and Automation, Albuquerque, NM, pp. 425-430, Apr. 1997.
Pirjanian et al., “Multi-Robot Target Acquisition using Multiple Objective Behavior Coordination,” Proceedings of the 2000 IEEE International Conference on Robotics & Automation, San Francisco, CA, pp. 2696-2702, Apr. 2000.
Pirjanian, “Challenges for Standards for consumer Robotics,” IEEE Workshop on Advanced Robotics and its Social impacts, pp. 260-264, Jun. 2005.
Pirjanian, “Reliable Reaction,” Proceedings of the 1996 IEEE/SICE/RSJ International Conference on Multisensor Fusion and Integration for Intelligent Systems, pp. 158-165, 1996.
Popco.net, "Make your digital life," Retrieved from the Internet: URL<http://www.popco.net/zboard/view.php?id=tr_review&no=40>. 14 pages, Accessed Nov. 2011.
Prassler et al., “A Short History of Cleaning Robots,” Autonomous Robots 9, 211-226, 2000, 16 pages.
Put Your Roomba . . . On, Automatic webpages: http://www.acomputeredge.com/roomba, 5 pages, accessed Apr. 2005.
Remazeilles et al., “Image based robot navigation in 3D environments,” Proc. of SPIE, vol. 6052, pp. 1-14, Dec. 2005.
Rives et al., “Visual servoing based on ellipse features,” SPIE, vol. 2056 Intelligent Robots and Computer Vision pp. 356-367, 1993.
Roboking—not just a vacuum cleaner, a robot!, Jan. 21, 2004, infocom.uz/2004/01/21/robokingne-prosto-pyilesos-a-robot/, accessed Oct. 10, 2011, 5 pages.
RoboMaid Sweeps Your Floors So You Won't Have to, the Official Site, Retrieved from the Internet: URL<http://therobomaid.com/>. 2 pages, accessed Mar. 2005.
Robot Buying Guide, "LG announces the first robotic vacuum cleaner for Korea," Retrieved from the Internet: URL<http://robotbg.com/news/2003/04/22/lg_announces_the_first_robotic_vacu>. 1 page, Apr. 2003.
Robotics World, “A Clean Sweep,” 5 pages, Jan. 2001.
Ronnback, “On Methods for Assistive Mobile Robots,” Retrieved from the Internet: URL<http://www.openthesis.org/documents/methods-assistive-mobile-robots-595019.html>. 218 pages, Jan. 2006.
Roth-Tabak et al., “Environment Model for mobile Robots Indoor Navigation,” SPIE, vol. 1388 Mobile Robots, pp. 453-463, 1990.
Sahin et al., “Development of a Visual Object Localization Module for Mobile Robots,” 1999 Third European Workshop on Advanced Mobile Robots, (Eurobot '99), pp. 65-72, 1999.
Salomon et al., “Low-Cost Optical Indoor Localization system for Mobile Objects without Image Processing,” IEEE Conference on Emerging Technologies and Factory Automation, 2006. (ETFA '06), pp. 629-632, Sep. 2006.
Sato, “Range Imaging Based on Moving Pattern Light and Spatio-Temporal Matched Filter,” Proceedings International Conference on Image Processing, vol. 1., Lausanne, Switzerland, pp. 33-36, Sep. 1996.
Schenker et al., “Lightweight rovers for Mars science exploration and sample return,” Intelligent Robots and Computer Vision XVI, SPIE Proc. 3208, pp. 24-36, 1997.
Schofield, “Neither Master nor slave—A Practical Study in the Development and Employment of Cleaning Robots, Emerging Technologies and Factory Automation,” 1999 Proceedings ETFA '99 1999 7th IEEE International Conference on Barcelona, Spain, pp. 1427-1434, Oct. 1999.
Shimoga et al., “Touch and Force Reflection for Telepresence Surgery,” Engineering in Medicine and Biology Society, 1994. Engineering Advances: New Opportunities for Biomedical Engineers. Proceedings of the 16th Annual International Conference of the IEEE, Baltimore, MD, pp. 1049-1050, 1994.
Sim et al, “Learning Visual Landmarks for Pose Estimation,” IEEE International Conference on Robotics and Automation, vol. 3, Detroit, MI, pp. 1972-1978, May 1999.
Sobh et al., “Case Studies in Web-Controlled Devices and Remote Manipulation,” Automation Congress, 2002 Proceedings of the 5th Biannual World, pp. 435-440, Dec. 2002.
Special Reports, "Vacuum Cleaner Robot Operated in Conjunction with 3G Cellular Phone," 59(9): 3 pages, Retrieved from the Internet: URL<http://www.toshiba.co.jp/tech/review/2004/09/59_0>. 2004.
Stella et al., “Self-Location for Indoor Navigation of Autonomous Vehicles,” Part of the SPIE conference on Enhanced and Synthetic Vision SPIE vol. 3364, pp. 298-302, 1998.
Summet, “Tracking Locations of Moving Hand-held Displays Using Projected Light,” Pervasive 2005, LNCS 3468, pp. 37-46, 2005.
Svedman et al., “Structure from Stereo Vision using Unsynchronized Cameras for Simultaneous Localization and Mapping,” 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 2993-2998, 2005.
SVET Computers—New Technologies—Robot Vacuum Cleaner, Oct. 1999, available at http://www.sk.rs/1999/10/sknt01.html, 1 page, accessed Nov. 1, 2011.
Taipei Times, "Robotic vacuum by Matsushita about to undergo testing," Retrieved from the Internet: URL<http://www.taipeitimes.com/News/worldbiz/archives/2002/03/26/0000129338>. accessed Mar. 2002, 2 pages.
Takio et al., “Real-Time Position and Pose Tracking Method of Moving Object Using Visual Servo System,” 47th IEEE International Symposium on Circuits and Systems, pp. 167-170, 2004.
Tech-on!, Retrieved from the Internet: URL<http://techon.nikkeibp.co.jp/members/01db/200203/1006501/>. 4 pages, accessed Nov. 2011.
Teller, “Pervasive pose awareness for people, Objects and Robots,” http://www.ai.mit.edu/lab/dangerous-ideas/Spring2003/teller-pose.pdf, 6 pages, Apr. 2003.
Terada et al., “An Acquisition of the Relation between Vision and Action using Self-Organizing Map and Reinforcement Learning,” 1998 Second International Conference on Knowledge-Based Intelligent Electronic Systems, Adelaide, Australia, pp. 429-434, Apr. 1998.
The Sharper Image, eVac Robotic Vacuum—Product Details, www.sharperiamge.com/us/en/templates/products/pipmorework1printable.jhtml, 1 page, Accessed Mar. 2005.
TheRobotStore.com, “Friendly Robotics Robotic Vacuum RV400—The Robot Store,” www.therobotstore.com/s.nl/sc.9/category.-109/it.A/id.43/.f, 1 page, Apr. 2005.
Thrun, Sebastian, “Learning Occupancy Grid Maps With Forward Sensor Models,” Autonomous Robots 15, 28 pages, Sep. 1, 2003.
TotalVac.com, RC3000 RoboCleaner website, 2004, Accessed at http://ww.totalvac.com/robot_vacuum.htm (Mar. 2005), 3 pages.
Trebi-Ollennu et al., “Mars Rover Pair Cooperatively Transporting a Long Payload,” Proceedings of the 2002 IEEE International Conference on Robotics & Automation, Washington, D.C. pp. 3136-3141, May 2002.
Tribelhorn et al., “Evaluating the Roomba: A low-cost, ubiquitous platform for robotics research and education,” IEEE, pp. 1393-1399, 2007.
Tse et al., “Design of a Navigation System for a Household Mobile Robot Using Neural Networks,” Department of Manufacturing Engg. & Engg. Management, City University of Hong Kong, pp. 2151-2156, 1998.
UAMA (Asia) Industrial Co., Ltd., “RobotFamily,” 2005, 1 page.
UBOT, cleaning robot capable of wiping with a wet duster, Retrieved from the Internet: URL<http://us.aving.net/news/view.php?articleId=23031>. 4 pages, accessed Nov. 2011.
Watanabe et al., “Position Estimation of Mobile Robots With Internal and External Sensors Using Uncertainty Evolution Technique,” 1990 IEEE International Conference on Robotics and Automation, Cincinnati, OH, pp. 2011-2016, May 1990.
Watts, "Robot, boldly goes where no man can," The Times, p. 20, Jan. 1985.
Wijk et al., “Triangulation-Based Fusion of Sonar Data with Application in Robot Pose Tracking,” IEEE Transactions on Robotics and Automation, 16(6):740-752, Dec. 2000.
Winfield et al., “The application of wireless local area network technology to the control of mobile robots,” Microprocessors and Microsystems, 2000, vol. 23, pp. 597-607.
Wolf et al., "Robust Vision-Based Localization by Combining an Image-Retrieval System with Monte Carlo Localization," IEEE Transactions on Robotics, 21(2):208-216, Apr. 2005.
Wolf et al., “Robust Vision-based Localization for Mobile Robots Using an Image Retrieval System Based on Invariant Features,” Proceedings of the 2002 IEEE International Conference on Robotics & Automation, Washington, D.C., pp. 359-365, May 2002.
Wong, “EIED Online>> Robot Business”, ED Online ID# 13114, 17 pages, Jul. 2006.
Yamamoto et al., “Optical Sensing for Robot Perception and Localization,” 2005 IEEE Workshop on Advanced Robotics and its Social Impacts, pp. 14-17, 2005.
Yata et al., “Wall Following Using Angle Information Measured by a Single Ultrasonic Transducer,” Proceedings of the 1998 IEEE, International Conference on Robotics & Automation, Leuven, Belgium, pp. 1590-1596, May 1998.
Yujin Robotics, "An intelligent cleaning robot," Retrieved from the Internet: URL<http://us.aving.net/news/view.php?articleId=7257>. 8 pages, accessed Nov. 2011.
Yun et al., “Image-Based Absolute Positioning System for Mobile Robot Navigation,” IAPR International Workshops SSPR, Hong Kong, pp. 261-269, Aug. 2006.
Yun et al., “Robust Positioning a Mobile Robot with Active Beacon Sensors,” Lecture Notes in Computer Science, 2006, vol. 4251, pp. 890-897, 2006.
Yuta et al., "Implementation of an Active Optical Range sensor Using Laser Slit for In-Door Intelligent Mobile Robot," IEEE/RSJ International Workshop on Intelligent Robots and Systems (IROS 91), vol. 1, Osaka, Japan, pp. 415-420, Nov. 3-5, 1991.
Zha et al., “Mobile Robot Localization Using Incomplete Maps for Change Detection in a Dynamic Environment,” Advanced Intelligent Mechatronics '97. Final Program and Abstracts., IEEE/ASME International Conference, pp. 110, Jun. 1997.
Zhang et al., “A Novel Mobile Robot Localization Based on Vision,” SPIE vol. 6279, 6 pages, Jan. 2007.
Zoombot Remote Controlled Vacuum—RV-500 NEW Roomba 2, website: http://cgi.ebay.com/ws/eBayISAPI.dll?ViewItem&category=43526&item=4373497618&rd=1, accessed Apr. 20, 2005, 7 pages.
Maxwell et al, “Alfred: The Robot Waiter who Remembers You,” Proceedings of AAAI Workshop on Robotics, Jul. 1, 1999, 12 pages.
European Search Report issued in EP Application No. 12179853.2, dated Jan. 7, 2013, 4 pages.
European Search Report issued in EP Application No. 12179855.7, dated Jan. 7, 2013, 3 pages.
European Search Report issued in EP Application No. 12179857.3, dated Jan. 7, 2013, 6 pages.
European Search Report issued in EP Application No. 12155298.8, dated Aug. 8, 2012, 6 pages.
International Preliminary Report on Patentability issued in International Application No. PCT/US2006/046398, dated Oct. 13, 2008, 6 pages.
International Search Report and Written Opinion issued in International Application No. PCT/US2006/046398, mailed Oct. 31, 2007, 12 pages.
Related Publications (1)
Number Date Country
20160291595 A1 Oct 2016 US
Provisional Applications (1)
Number Date Country
60741442 Dec 2005 US
Continuations (4)
Number Date Country
Parent 14275355 May 2014 US
Child 15182849 US
Parent 13893905 May 2013 US
Child 14275355 US
Parent 12959879 Dec 2010 US
Child 13893905 US
Parent 11633883 Dec 2006 US
Child 12959879 US