Mobile robot for telecommunication

Information

  • Patent Grant
  • 9296109
  • Patent Number
    9,296,109
  • Date Filed
    Monday, October 13, 2014
  • Date Issued
    Tuesday, March 29, 2016
Abstract
A system including a mobile telepresence robot, a telepresence computing device in wireless communication with the robot, and a host computing device in wireless communication with the robot and the telepresence computing device. The host computing device relays User Datagram Protocol traffic between the robot and the telepresence computing device through a firewall.
Description
TECHNICAL FIELD

This disclosure relates to mobile robots for telecommunications.


BACKGROUND

Robots have been used for facilitating videoconferencing and remote communication. For example, U.S. Pat. No. 7,123,285 to SMITH (the entire contents of which are incorporated herein by reference) relates to a system including a robot having a swiveling video monitor, a speaker and a microphone, and a remote terminal unit having a microphone and a camera. In accordance with SMITH, a user at the remote terminal unit can operate the robot while voice and video signals are sent to the robot to be output on the robot's speaker and video monitor. The swiveling video monitor of the robot in SMITH can also be operated via the remote terminal unit.


In US Patent Application Publication 2006/0082642 to WANG, published Apr. 20, 2006 (the entire contents of which are incorporated herein by reference), a robot is used for two-way mobile teleconferencing between the robot and a remote control station. The remote control station of WANG communicates with the robot through a network, receiving video input from a camera on the robot, and the user at the remote control station can move the robot using the remote control station.


As another example, robots have also been used for companionship and remote care giving. A mobile robot capable of facilitating telecommunication between a remote caregiver and a person in a home or other location, inter alia, is disclosed in US Patent Application Publication 2007/0192910 to VU, published Aug. 16, 2007 (which is incorporated herein by reference).


In order for the remote user to operate a mobile robot located in a home, building, or other location (such locations herein referred to as “premises”) distant from the remote user's location, a communicative channel is provided therebetween. Typically, at the premises where the mobile robot is located, there may be public telecommunication service connections that may be used as the communicative channel, such as, for example, the public switched telephone network (“PSTN”), cable television (“cable”), satellite television (“satellite”), and/or campus wireless Ethernet services (“Wi-Fi”). At the remote user's location, which may be a home, an office, or a hotel, inter alia, there may be similar connectivity. Alternatively, the remote user may have access to mobile telephone-type service (such as GSM, CDMA, 3G, or the like) over which Internet communication is provided. Thus, one approach for connecting a remote terminal to a mobile robot is via an Internet protocol using the User Datagram Protocol (UDP), Transmission Control Protocol (TCP), and/or Internet Protocol (IP).


However, because many homes having a broadband connection to the Internet utilize a firewall or a network address translation system (“NAT”), both collectively referred to as a “firewall” hereinafter, difficulties can occur when the remote terminal attempts to connect to the mobile robot. One such difficulty arises because many firewalls prevent direct connections initiated by Internet hosts not protected by the firewall (hereinafter, “outside hosts”) from reaching hosts located behind (i.e., protected by) the firewall (hereinafter, a “firewalled host”). Therefore, when the mobile robot is a firewalled host that is sequestered from incoming Internet connections originating beyond the firewall, it may not be possible for the remote terminal to initiate a direct connection with the mobile robot.


STUN and TURN are technologies that enable some incoming Internet connection initiation requests to “traverse” the firewall or NAT and successfully connect to firewalled hosts (see, for example, US Patent Application Publication 2007/0076729 A1 to TAKEDA, published Apr. 5, 2007; US Patent Application Publication 2006/0209794 to BAE, published Sep. 21, 2006; and US Patent Application Publication 2007/0189311 A1 to KIM, published Aug. 16, 2007, each of which is incorporated herein by reference). Nonetheless, even employing STUN and/or TURN, some kinds of incoming connection attempts may fail to reach firewalled hosts in certain kinds of network arrangements using a firewall or NAT.
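As a non-limiting illustration of the STUN mechanism referenced above, the following is a minimal sketch of a STUN binding request that discovers a host's apparent (public) address and port as seen from outside the NAT, following the message layout of RFC 5389. The public server address shown is an assumption for illustration only and is not part of this disclosure.

```python
# Minimal STUN binding request: discover this host's apparent (public)
# IP address and port as seen from outside the NAT. Sketch per RFC 5389.
import os
import socket
import struct

MAGIC_COOKIE = 0x2112A442

def stun_apparent_address(server=("stun.l.google.com", 19302)):  # assumed server
    txn_id = os.urandom(12)
    # Header: type=0x0001 (Binding Request), length=0, magic cookie, txn id.
    request = struct.pack("!HHI", 0x0001, 0, MAGIC_COOKIE) + txn_id
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(3.0)
    sock.sendto(request, server)
    data, _ = sock.recvfrom(2048)
    # Walk the attributes after the 20-byte header.
    pos = 20
    while pos + 4 <= len(data):
        attr_type, attr_len = struct.unpack_from("!HH", data, pos)
        if attr_type == 0x0020:  # XOR-MAPPED-ADDRESS
            port = struct.unpack_from("!H", data, pos + 6)[0] ^ (MAGIC_COOKIE >> 16)
            raw_ip = struct.unpack_from("!I", data, pos + 8)[0] ^ MAGIC_COOKIE
            return socket.inet_ntoa(struct.pack("!I", raw_ip)), port
        pos += 4 + attr_len + (-attr_len % 4)  # attributes are 4-byte aligned
    return None

if __name__ == "__main__":
    print(stun_apparent_address())
```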


For these reasons, among others, there has remained significant unmet demand for connecting a remote terminal to a mobile robot at a home or other such premises.


SUMMARY

In view of the above, as well as other considerations, presently provided is a mobile robot for performing telecommunication and remote observation, inter alia. The mobile robot may include at least one sensor and a privacy controller for preventing operation of the sensor when the local user causes the robot to operate in a privacy mode. The sensor may include a camera and/or a microphone, for example. The mobile robot may further include a wireless network interface for communicating with a base station, and a communication controller for transmitting and receiving audio and/or video data, in which the communication controller controls the mobile robot so as to prevent transmission of data from the sensor when the mobile robot operates in the privacy mode.


The mobile robot may also be usable with an RC unit for operating the mobile robot locally via a wireless signal transmitted to the mobile robot. In accordance with another aspect, a robot system may include a base station and a mobile robot, the base station including a base station wireless transceiver for communicating with the mobile robot via a local wireless connection and a premises Internet connection for interfacing with the Internet, the mobile robot including a robot wireless transceiver capable of communicating with the base station via the local wireless connection.


The present teachings provide a remote control unit configured to wirelessly control a mobile robot moving through an environment and having a robot camera. The remote control unit comprises a privacy button operable by a local user and configured to engage a privacy mode of the mobile robot, and a wireless transmitter configured to emit a wireless control signal to the mobile robot based on input from a keypad of the RC unit. The wireless control signal is configured to cause the robot camera to block the field of view of the robot camera such that the environment of the mobile robot is obscured when the privacy mode of the mobile robot is engaged.


The present teachings also provide a remote control unit configured to wirelessly control a mobile robot moving through an environment and having a robot camera. The remote control unit comprises an input device operable by a local user and configured to engage a privacy mode of the mobile robot, and a transmitter configured to emit a control signal to the mobile robot based on input from the input device of the remote control unit. The control signal is configured to cause the robot camera to change position to block the field of view of the robot camera such that the environment of the mobile robot is obscured when the privacy mode of the mobile robot is engaged.


The present teachings further provide a method of controlling a mobile robot having a robot camera, the method comprising receiving an input at an input device of a remote control unit to engage a privacy mode of the mobile robot, emitting a control signal from the input device to engage the privacy mode, and, in response to receiving the emitted control signal to engage the privacy mode, moving the robot camera into a conspicuously disabled orientation.


The details of one or more implementations of the disclosure are set forth in the accompanying drawings and the description below. Other aspects, features, and advantages will be apparent from the description and drawings, and from the claims.





DESCRIPTION OF DRAWINGS


FIG. 1 is a plan view of a mobile robot.



FIG. 2 is a profile view of the mobile robot shown in FIG. 1.



FIG. 3 is a plan view of an RC unit.



FIG. 4 is a perspective view of a remote user terminal.



FIG. 5 is a perspective view of a mobile robot facilitating telecommunication with a local user.



FIG. 6 is an illustrative diagram of a robot telecommunication system using peer-to-peer VoIP connection between a remote terminal and a mobile robot.



FIG. 7 is an illustrative diagram of a robot telecommunication system using peer-to-peer VoIP connection between a remote terminal and a mobile robot facilitated by an Internet server.



FIG. 8 is a schematic diagram of a robot telecommunication system using direct peer-to-peer VoIP connections.



FIG. 9 is a schematic diagram of a robot telecommunication system using peer-to-peer VoIP connections, where the remote terminal and the mobile robot are both behind separate NATs.



FIG. 10 is a schematic diagram of a robot telecommunication system using peer-to-peer VoIP connections, where the base station is opening a pinhole in the base station's NAT.



FIG. 11 is a schematic diagram of the robot telecommunication system using peer-to-peer VoIP connections shown in FIG. 10, where the remote terminal connects to the base station through the pinhole opened in the base station's NAT.



FIG. 12 is a schematic diagram of a robot telecommunication system using an Internet server operating as a relay server for a VoIP connection between the remote terminal and the mobile robot.



FIG. 13 is a schematic diagram of an example component organization of a mobile robot.



FIG. 14 is a schematic diagram of a software organization of a mobile robot.



FIG. 15 is a perspective view of an RC unit having a sliding cover in the closed position.



FIG. 16 is a perspective view of the RC unit of FIG. 15, having a sliding cover in the open position.



FIG. 17 is a perspective view of a robot camera in an active position.



FIG. 18 is a perspective view of the robot camera of FIG. 17, in a disabled position.



FIG. 19 is a partially exploded detail view of a mobile robot having a lock-and-key mechanism for toggling privacy mode, and a slidable privacy mechanism for preventing transmission of telecommunication data.



FIG. 20 is a diagram illustrating a first example user interface for controlling a mobile robot.



FIG. 21 is a diagram illustrating a second example user interface for controlling a mobile robot.



FIG. 22 is a perspective view of an example on-board user interface of a mobile robot.



FIG. 23 is a partial cutaway exploded view of a robot camera having a motorized tilt and zoom mechanism operable by a remote user.



FIG. 24 is a series of isometric diagrams of the robot camera of FIG. 23.



FIG. 25 is a screen shot of a remote user interface for using a mobile telecommunication robot.



FIG. 26 is a perspective view of a mobile robot having a rotatable, reversible robot camera.



FIG. 27 is a profile view of a mobile robot having an example alternative drive system including bipedal legs.



FIG. 28 is a schematic diagram illustrating a network component organization of a mobile telecommunication robot service.



FIG. 29 is another schematic diagram illustrating a network component organization of a mobile telecommunication robot service.



FIG. 30 is a schematic diagram illustrating a software component organization of a mobile telecommunication robot system.





Like reference symbols in the various drawings indicate like elements.


DETAILED DESCRIPTION

In accordance with a first example implementation, as illustrated in FIGS. 4, 5 and 6, a robot telecommunication system is provided including a mobile robot 100 at local premises 600 and a remote terminal 430 at a remote location, in which a remote user 400 can operate the mobile robot 100 using the remote terminal 430. The remote location may be located beyond at least one minute's walk from the mobile robot 100, in accordance with one example implementation; or, the remote location may be located at least 100 meters away from the mobile robot 100, in accordance with another implementation. Alternatively, the remote location may in fact be located quite near the mobile robot 100 in terms of physical distance, even within the same home or building, or the same room, in accordance with another example implementation.


Referring to FIGS. 1 and 2 to illustrate an example implementation, the mobile robot 100 includes one or more telecommunication sensors, such as a robot microphone 193 for inputting voice or sound data, or a camera 191 for inputting image data. The sound data or image data originating from the robot's telecommunication sensors is referred to herein as “local telecommunication data.” Preferably, the mobile robot 100 also includes one or more telecommunication output devices, such as a speaker 194 for outputting voice or sound data received from the remote terminal, or a liquid crystal display screen (“LCD”) 181 for displaying image data received from the remote terminal 430. The sound or image data received from the remote terminal 430 is referred to herein as “remote telecommunication data.”


The mobile robot may have a chassis 105 (also referred to herein as a main body) including a docking connector for interfacing with a base station or recharging station (also referred to herein as a “docking station”), and a user interface disposed on the chassis for interacting with a local user. Furthermore, the mobile robot 100 includes a drive system 130 for propelling and steering the mobile robot 100 in accordance with user input. In accordance with at least one example implementation, the mobile robot 100 may further include features similar to any of those discussed in ZIEGLER, VU, or SMITH, as well as any of the robots discussed below in reference to the other incorporated documents, such as with respect to the drive system, chassis, form factor, electronics, and/or other aspects.


In at least one implementation, as illustrated in FIG. 6, the robot system includes a base station 200 located at the local premises 600 within communicative range of the mobile robot 100, such as in a room 611A of a home. The base station 200 communicates with the Internet 901 via a premises Internet connection 961, such as a digital subscriber line (“DSL”) or cable modem, and communicates with the mobile robot 100 using a local wireless connection 967 via a base station wireless transceiver 267. The local wireless connection 967 may include a radio-frequency (“RF”) data network protocol such as wireless Ethernet (“Wi-Fi”) conforming to IEEE 802.11a, 802.11b, 802.11g, 802.11n, or another suitable standard, for example (see, for example, “Standard for Information Technology-Telecommunications and information exchange between systems-Local and metropolitan area networks-Specific Requirements-Part 11: Wireless LAN Medium Access Control (MAC) and Physical Layer (PHY),” published by IEEE, 1999, which is incorporated herein by reference). The mobile robot 100 in this example correspondingly includes a robot wireless transceiver 167 for communicating with the base station 200.


The mobile robot 100 may also include a tether port 165 for connecting to the base station 200 using a wire or cable. The mobile robot 100 may additionally include software for enabling the local user 500 to set configurable features of the mobile robot 100, such as during an initial set-up phase when the mobile robot 100 is first used. In accordance with one example implementation, the tether port 165 includes an RJ45 port for connecting to the base station 200 using an Ethernet cable (e.g., unshielded twisted pair, “UTP”). In accordance with an alternative example, the tether port 165 includes a universal serial bus (“USB”) port; and in another example implementation, an IEEE 1394-compatible port. The mobile robot 100 receives an IP address using DHCP and may be accessed using the HTTP protocol on a local network via the base station 200 and the tether port 165, in accordance with at least one example implementation.
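As a non-limiting illustration of such tethered setup access, the following is a minimal sketch of retrieving the robot's embedded HTTP configuration page over the local network. The DHCP-assigned address and the /config path are assumptions for illustration only.

```python
# Sketch: fetch the robot's embedded HTTP configuration interface over the
# tethered local network. Address and path are hypothetical.
from urllib.request import urlopen

ROBOT_ADDR = "192.168.0.42"  # hypothetical DHCP-assigned address

def fetch_config_page():
    with urlopen(f"http://{ROBOT_ADDR}/config", timeout=5) as resp:
        return resp.read().decode("utf-8", errors="replace")

if __name__ == "__main__":
    print(fetch_config_page()[:200])
```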


As an advantage, during initial setup or subsequent re-setup, security settings (WEP or RADIUS security codes, passwords, or the like) or other parameters for enabling the mobile robot 100 to communicate with the base station 200 via the local wireless connection 967 can be established using the wired tether port connection to the base station 200, and potential frustrations of wireless network setup may be alleviated.


In accordance with another example implementation, as illustrated in FIG. 22, the mobile robot 100 may include an on-board user interface 180 including an LED or LCD display 181 (which may include a seven-segment display, or a four-line character-based LCD display, as non-limiting examples), and an input device such as a keypad, as illustrated in FIG. 1. Initial setup of the mobile robot 100, such as establishing behavioral settings or wireless network parameters, inter alia, may be carried out by the local user 500 using the on-board user interface 180, without needing to use a personal computer, for example.



FIG. 4 illustrates a remote terminal 430 including a PC having a remote microphone 491 for inputting remote telecommunication data to be sent to the mobile robot 100 and a remote speaker 497 for outputting local telecommunication data received from the mobile robot 100. The remote terminal 430 may also include additional input hardware such as a remote camera and/or additional output hardware such as a remote display 435 (e.g., a computer monitor) for use with telecommunication. The remote terminal communicates with the Internet 901 through a remote terminal Internet connection (such as, for example, cable modem, DSL transceiver, PPP over PSTN, Wi-Fi, T1, CDMA, or GSM).


In addition to telecommunication data, the remote terminal 430 includes robot control input devices such as a keyboard 436, joystick 437, and/or mouse 438. The remote display 435 can display robot control data, such as robot telemetry, battery status, and the like. By manipulating the robot control input devices 436, 437, the remote user 400 causes the remote terminal 430 to transmit a robot control signal to the base station 200 via the Internet 901. The robot control signal may include commands instructing the mobile robot 100 to turn, to move forward, to cause a manipulator or effector to operate (e.g., such as commanding a robotic arm to grasp an object, or commanding a robot camera-aiming mechanism to change the viewing angle of the robot camera 196), or to perform other actions in accordance with remote user input.


The base station 200 may include a base station controller, such as a microprocessor or microcontroller, for intermediating and processing data exchanged between the remote terminal 430 and the mobile robot 100. The base station 200 receives the local telecommunication data sent from the mobile robot 100 via the local wireless connection 967. If the local telecommunication data is already encoded in accordance with a suitable protocol, the base station 200 may forward the local telecommunication data without additional processing. Otherwise, if the local telecommunication data is not yet encoded, or is encoded in a format incompatible with the remote terminal 430, the base station 200 may then encode it using an appropriate media protocol (such as MPEG, WAV, AVI, Ogg Vorbis/Ogg Theora, etc.) and forward the encoded local telecommunication data to the remote terminal 430 over the Internet 901. Similarly, the base station 200 may receive remote telecommunication data from the remote terminal 430 via the premises Internet connection 961, perform decoding if the received data is not in a format or protocol compatible with the mobile robot 100, and then forward the remote telecommunication data to the mobile robot 100 using the local wireless connection 967.


Voice-over-IP (“VoIP”) refers to technologies and standards for transmitting media (such as sound and/or video, inter alia) over data networks using Internet protocols. Examples of VoIP protocols include SKYPE, VONAGE, SIP, and ITU H.323 (see, for example, U.S. Patent Application Publication 2007/0153801 A1 to SUNG, published Jul. 5, 2007, which is incorporated herein by reference). In some VoIP implementations (such as SKYPE and VONAGE, among others), Internet-based VoIP terminals (such as personal computers (“PCs”) running VoIP software and having appropriate hardware such as a microphone and speaker, and/or a camera and video display, as well as VoIP-specific equipment such as SIP telephones) are assigned ENUM- or DUNDi-compatible telephone numbers, and can receive incoming telephone calls even from non-VoIP telephones operating on the PSTN. See, for example, the discussion of Skype set forth in US Patent Application Publication 2007/0159979 to BUTLER, published Jul. 12, 2007, the entire contents of which are incorporated herein by reference.


The local and remote telecommunication data may be encoded and exchanged using a VoIP protocol. For example, the mobile robot 100 and the remote terminal 430 may encode the local telecommunication data using a CODEC according to the Session Initiation Protocol (“SIP”; see, for example, RFC3261 published by the Internet Engineering Task Force). Alternatively, any other VoIP standard suitable for establishing connections and encoding data for exchange between Internet-connected telecommunication devices may be used, such as H.323 and/or related standards.


In addition, the robot control signal generated as a result of robot control input from the remote user 400 may also be encoded (or “piggybacked”) using the same VoIP standard that is used to exchange the local and remote telecommunication data. In accordance with one example implementation, the remote terminal 430 may include a PC executing software for telecommunication in accordance with the SIP standard. While the remote telecommunication data generated from the remote microphone 491 and/or the remote camera are encoded using a CODEC (e.g., MPEG or WAV) and transported over the Internet 901 to the base station 200 and relayed to the mobile robot 100 as a first Real-Time Transport Protocol (RTP) data stream, the robot control signals input from the keyboard 436 and/or joystick 437 may also be transmitted using the same SIP session as a second RTP data stream, simultaneously.
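As a non-limiting illustration of carrying media and robot control as two simultaneous RTP streams, the following is a minimal sketch that builds RTP packets and sends them on two UDP ports. The payload types, ports, SSRC values, and peer address are assumptions for illustration only, and SIP session setup is omitted.

```python
# Sketch: two RTP streams in one session, one for media and one for robot
# control, per the "piggybacking" idea above. Values are illustrative.
import socket
import struct
import time

def rtp_packet(payload_type, seq, timestamp, ssrc, payload):
    # 12-byte RTP header: V=2, no padding/extension/CSRC, marker clear.
    header = struct.pack("!BBHII", 0x80, payload_type & 0x7F, seq & 0xFFFF,
                         timestamp & 0xFFFFFFFF, ssrc)
    return header + payload

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
peer_host = "203.0.113.7"  # placeholder peer address
media_seq = control_seq = 0
start = time.time()

def send_media(frame: bytes, peer_port=5004):
    global media_seq
    ts = int((time.time() - start) * 90000)   # 90 kHz video clock
    sock.sendto(rtp_packet(96, media_seq, ts, 0x11111111, frame),
                (peer_host, peer_port))
    media_seq += 1

def send_control(command: bytes, peer_port=5006):
    global control_seq
    ts = int((time.time() - start) * 1000)    # arbitrary control clock
    sock.sendto(rtp_packet(97, control_seq, ts, 0x22222222, command),
                (peer_host, peer_port))
    control_seq += 1

# e.g. send_control(b"DRIVE forward 0.3")
```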


As discussed, firewall or NAT devices can obstruct incoming Internet connection initiation requests and prevent them from reaching a firewalled host. However, NAT or firewall traversal may be accomplished for some VoIP protocols by opening a “pinhole” (also referred to as “hole-punch”) port in the NAT or firewall. Typically, a VoIP session—such as an SIP stream—can then be “tunneled” through the NAT pinhole to reach the firewalled host. However, in some circumstances, a separate pinhole must be opened for each protocol or session that will be sent through the NAT or firewall; and in some cases, the number or type of pinholes that may be opened in the NAT or firewall may be limited. Accordingly, it can be advantageous to encode both the telecommunication data and the robot control signal using a common SIP session that is tunneled through a firewall pinhole. Or, as one alternative example implementation using the H.323 protocol, the remote telecommunication data may be encoded using a CODEC in accordance with a suitable standard (such as H.261, H.263, or H.264, inter alia) and transported to the mobile robot 100 over a first “channel” within the H.323 session; while a second H.323 channel is opened within the same H.323 session to transport the robot control signal.


As a further advantage, the remote user 400 may connect to the mobile robot 100 at the local premises 600 by simply calling a telephone number associated therewith, using a VoIP device such as a SIP phone or a PC, without having to necessarily use a specialized software application. Once the call connection is established using the VoIP protocol, both the robot control signal and the telecommunication data can be exchanged with the mobile robot 100 via one common VoIP session, as an example.



FIGS. 20 and 21 illustrate example user interfaces for display on a remote terminal 430. In FIG. 20, the user interface includes a video window for displaying the local video data received from the robot camera of the mobile robot 100, and a navigation input area for enabling the remote user 400 to control steering and mobility of the mobile robot 100 by clicking on arrows corresponding to directions. As illustrated in FIG. 21, the video window may include a navigation reference, such as a portion of the chassis 105 of the mobile robot 100, within the viewing area displayed in the video window. As a benefit, the remote user can ascertain size and perspective for judging speed of motion and/or potential obstacles when navigating the mobile robot 100.



FIG. 3 illustrates an infrared local control wand, herein referred to as an “RC unit” 560, for enabling a local user 500 (see FIG. 5) to control the mobile robot 100 locally. The RC unit 560 may include a form factor and features for enabling the local user 500 to hold the RC unit 560 in his or her hand and operate one or more buttons 561, 562, 563 or other input mechanisms (such as pressure-sensing pads, a directional keypad 566, a joystick, or a touch-sensitive LCD screen, as non-limiting examples). A local control signal 968 is generated by the RC unit 560 based on the input from the local user 500 and transmitted to the mobile robot 100. In one example implementation, the RC unit 560 includes an RC transmitter 568 such as a light-emitting diode that emits light in the infrared spectrum (“infrared LED”), and which transmits the local control signal 968 encoded as a pattern of pulses of infrared light. The mobile robot 100 may include a corresponding local control receiver 168 (such as an infrared light sensor) for receiving the local control signal 968. As example alternatives, the RC transmitter 568 may include a radio-frequency transmitter and/or antenna, or an LED that emits light in the visible spectrum, inter alia. In accordance with one example implementation, the RC transmitter 568 includes a wireless transmitter that functions within the local user's line-of-sight.
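As a non-limiting illustration of encoding a local control signal as pulses of infrared light, the following sketch produces the mark/space durations for one frame in the style of the widely used NEC infrared protocol. The address and command codes here are assumptions for illustration only, not codes taken from this disclosure.

```python
# Sketch: encode an RC-unit button press as a train of infrared mark/space
# durations (microseconds), NEC-style. Codes are hypothetical.
HDR_MARK, HDR_SPACE = 9000, 4500
BIT_MARK = 560
ONE_SPACE, ZERO_SPACE = 1690, 560

BUTTON_CODES = {"privacy": 0x01, "audio_mute": 0x02, "video_mute": 0x03}

def nec_pulse_train(address: int, command: int):
    """Return alternating mark/space durations for one NEC frame."""
    frame = [HDR_MARK, HDR_SPACE]
    # NEC sends address, ~address, command, ~command, LSB first.
    for byte in (address, address ^ 0xFF, command, command ^ 0xFF):
        for bit in range(8):
            frame.append(BIT_MARK)
            frame.append(ONE_SPACE if (byte >> bit) & 1 else ZERO_SPACE)
    frame.append(BIT_MARK)  # final stop bit
    return frame

# Durations to drive onto the infrared LED for the privacy button:
pulses = nec_pulse_train(address=0x5A, command=BUTTON_CODES["privacy"])
```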


The RC unit 560 may include a privacy button 561 for initiating a privacy mode of the mobile robot 100, and also may include an audio mute button 562 and a video mute button 563 for disabling audio or video telecommunication, respectively. When the mobile robot 100 receives a local control signal 968 indicating that the privacy button 561 has been operated, the mobile robot initiates the privacy mode by causing the robot camera 196 to move into a conspicuously disabled orientation, for example, and also by disabling the robot microphone 193 (see, for example, FIGS. 17 and 18). In one example implementation, the mobile robot 100 may also disable the speaker 194 when the privacy button 561 is operated. In another example implementation, the mobile robot 100 may not disable the robot microphone 193, but instead prevent any data generated by the robot microphone 193 from being transmitted to the remote terminal 430. The mobile robot may include a second robot camera, for example. In accordance with at least one example implementation, as illustrated in FIGS. 17 and 18, the mobile robot may include a wide angle camera 196 for viewing a wide field around the mobile robot, and a narrow angle (or infrared) camera 198 for viewing a focused area around the mobile robot. The user may toggle the view between the wide angle camera 196 and the narrow angle camera 198, for example; or, the user interface may display data from both the wide and narrow angle cameras, as one alternative example.


The RC unit 560 may also enable the local user 500 to navigate the mobile robot 100 by operating a navigation control mechanism 566. Furthermore, the RC unit 560 may include a “call out” button to initiate outgoing telecommunication connections from the mobile robot 100. In one implementation, when a telecommunication session (e.g., a phone call) has ended, but the local user does not hit the “privacy” button within a particular span of time, then the mobile robot 100 automatically enters the privacy mode.


In one embodiment, as illustrated in FIGS. 3, 15 and 16, the RC unit includes a sliding cover 564 that can slide between an open position and a closed position. As shown in FIGS. 15 and 16, the privacy button 561 may be disposed in a location such that, and/or the sliding cover 564 may have a shape such that, the privacy button 561 remains exposed and/or operable by a user when the sliding cover 564 is open and also when the sliding cover 564 is closed. As an advantage, the user can quickly operate the privacy button 561 to engage the mobile robot's privacy mode even when the sliding cover 564 of the RC unit 560 is closed.


The mobile robot 100 may include a telecommunication processor having a microprocessor and a data store for storing robot control software (for example, a Pentium III processor and system board connected with SDRAM, a flash EEPROM, a hard disk drive, or other storage technology). The telecommunication processor may execute telecommunication software for encoding and decoding the VoIP protocol, and may also communicate with a robot system controller (such as, for a non-limiting example, a Roomba R2 controller) via an on-board link (such as an RS232-compatible serial line or USB or an IEEE-1394 connection), in which the system controller performs control of the drive system 130 and other actuators, for example. Also, the telecommunication processor may receive the remote telecommunication data and the robot control signal via the robot wireless transceiver 167, and additionally may receive the local control signal 968 via the local control receiver 168. FIG. 13 illustrates a component organization of a mobile robot in accordance with one example implementation.


When there is conflict between the commands issued by the remote user 400 and the local user 500, the telecommunication processor may arbitrate between the two users by selecting which commands to suppress based on a rule set or priority allocation, for example.


In accordance with at least one example implementation, the telecommunication processor preferentially selects robot control commands from the local user 500 (and discards or ignores robot control commands from the remote user 400) when there is conflict between the local user's commands and the remote user's commands. For example, when the remote user 400 commands the mobile robot 100 to turn right but the local user 500 commands the mobile robot 100 to turn left, the mobile robot 100 would turn left, i.e., obeying the local user's command while ignoring the remote user's command. Alternatively, the telecommunication processor may instead always select the robot control commands from the remote user 400. As another alternative implementation, the mobile robot 100 may permit operation of the privacy button 561 on the RC unit 560 to always override any conflicting command received from the remote terminal 430. As a result, apprehension or privacy concerns on the part of potential users of the robot system may be relieved.
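As a non-limiting illustration of such arbitration, the following is a minimal sketch of a rule set that prefers local commands on conflict and lets the privacy command override unconditionally. The command representation and field names are assumptions for illustration only.

```python
# Sketch of the command-arbitration rule described above: local commands
# win conflicts, and the privacy command always overrides.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Command:
    source: str   # "local" or "remote"
    action: str   # e.g. "turn_left", "turn_right", "privacy_on"

def arbitrate(local: Optional[Command], remote: Optional[Command]) -> Optional[Command]:
    if local and local.action == "privacy_on":
        return local              # privacy always overrides
    if local and remote:
        return local              # prefer the local user on conflict
    return local or remote

# e.g. arbitrate(Command("local", "turn_left"), Command("remote", "turn_right"))
# returns the local user's turn_left command.
```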


In addition, when the audio mute button 562 is operated on the RC unit 560, the mobile robot 100 may then disable local audio data from being transmitted, while permitting local video data to be communicated to the remote terminal 430. Also, when the video mute button 563 is operated on the RC unit 560, the mobile robot 100 may prevent local video data from being transmitted, while permitting local audio data to be sent to the remote terminal 430.



FIG. 6 illustrates a robot telecommunication system using a peer-to-peer (“P2P”) VoIP protocol, in which the base station 200 and the remote terminal 430 are connected to the Internet 901 and both have unique, globally-accessible IP addresses, without a firewall or NAT interposed in front of either the base station 200 or the remote terminal 430. In this example implementation, when the remote user 400 operates the remote terminal 430 to connect to the mobile robot 100, the remote terminal initiates a VoIP session with the base station 200 on whichever port is typically used by the VoIP protocol. The telecommunication data may be sent via the directly established VoIP session. Furthermore, the robot control signal from the remote terminal 430 may “piggyback” on the VoIP session; or alternatively, it may instead be transmitted on a second port or separate, unrelated data stream.


Although networks in which each Internet-connected host is associated with a unique, globally accessible IP address have grown less common, particularly in home networks, this network configuration nonetheless may function in accordance with the present example. As shown schematically in FIG. 8, direct IP connections are possible between the remote terminal 430 and the base station 200 over link C.



FIG. 7 illustrates another example implementation of a robot telecommunication system, in which a P2P-type VoIP system is used that includes at least one Internet server 380 having a globally accessible IP address. In this example, it is not necessary for the remote terminal 430 or the base station 200 to have a unique, globally-accessible IP address; rather, the remote terminal 430 may be directly connected to the Internet, or may be behind a “cone” NAT; and the base station 200 is connected through a “cone”-type NAT router 761. The NAT router 761 has two IP addresses: the first is a unique, globally accessible IP address used to communicate with Internet hosts via the local premises Internet connection 961, and the second is a non-globally accessible “private” IP address (such as, for example, 192.168.0.1). The NAT router 761 may in turn assign a non-globally accessible private IP address to the base station 200 using DHCP (alternatively, the base station may have a pre-established private IP address). When the base station 200 transmits IP packets addressed to an Internet host while using the NAT router 761 as an IP gateway, the NAT router 761 re-transmits the base station's outgoing IP packets to the Internet 901 using the NAT router's own unique, globally accessible IP address. For this reason, from the perspective of Internet hosts that are unaware of the base station's NAT status, the base station 200 has an apparent IP address that is the same as the IP address of the NAT router 761.


As illustrated in FIG. 9, both the remote terminal 430 and also the base station 200 maintain separate connections B, A with the Internet server 380, through respective firewalls FW1 and FW2, that were originally initiated from the respective hosts (since it is not possible for the Internet server 380 to initiate a connection to firewalled hosts such as the base station 200 or remote terminal 430 in this example). When the remote user 400 wants to connect to the mobile robot 100, the remote terminal 430 notifies the Internet server 380 via connection B. When the remote terminal 430 and the base station 200 connect to the Internet server 380, the Internet server 380 may determine the nature of any NAT or firewall status for the hosts (using, e.g., STUN, TURN, or another suitable method for network examination) and determine each connecting host's apparent IP address. The Internet server 380 then notifies the base station 200 via connection A that the remote terminal 430 requests to connect to the base station 200, also informing the base station 200 of the remote terminal's apparent global IP address.



FIG. 14 illustrates a possible software organization of a mobile robot system in accordance with at least one example implementation. The VoIP protocol and integration of the remote telecom data signal and the robot control signal into a single VoIP data stream may be performed using a software product such as, for example, VeriCall Edge (see, for example, U.S. Pat. No. 6,985,480 to BROWN, issued Jan. 10, 2006, which is incorporated herein by reference). The robot system may include a media processor, for example.


The base station 200 then opens pinhole PH3 on a particular UDP port of the base station's NAT by sending an outgoing connection packet to the remote terminal's apparent IP address, as shown in FIG. 10. As illustrated in FIG. 11, the remote terminal 430 then connects to the base station 200 by establishing connection C through the pinhole PH3.
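As a non-limiting illustration of the pinhole technique, the following is a minimal sketch of the outbound "punch" packet that creates the NAT mapping through which the remote terminal can then connect. The addresses and port are assumptions for illustration only.

```python
# Sketch of the pinhole ("hole punch") step: the base station sends an
# outgoing UDP packet to the remote terminal's apparent address, creating
# a NAT mapping the remote terminal can then connect back through.
import socket

LOCAL_PORT = 5060
REMOTE_APPARENT = ("198.51.100.9", 5060)  # learned from the Internet server

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", LOCAL_PORT))

# The outbound packet opens pinhole PH3 in this side's cone NAT.
sock.sendto(b"punch", REMOTE_APPARENT)

# The remote terminal's packets to our apparent address:port now traverse
# the NAT and arrive here.
sock.settimeout(10.0)
data, peer = sock.recvfrom(2048)
print("connection C established with", peer)
```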


In some network arrangements, such as when a symmetric NAT is interposed between the remote terminal 430 and the base station 200, it may not be possible to determine which apparent IP address or port the remote terminal 430 will have, and therefore it may not be possible for the base station 200 to open an appropriate pinhole through which the remote terminal 430 can connect. Nonetheless, it may be possible for the remote terminal 430 and the base station 200 to communicate over the P2P VoIP protocol by employing a relay or TURN server.


In accordance with the exemplary implementation shown in FIG. 12, the Internet server 380 functions as a relay server to enable VoIP data to be exchanged between the remote terminal 430 and the base station 200. In this example implementation, the Internet server 380 first determines the NAT status of the remote terminal 430 and the base station 200 when they first connect to the Internet server 380. The Internet server 380 then determines an optimal connection strategy and coordinates the hosts to effect the VoIP connection.


As one example, when the Internet server 380 determines that the remote terminal 430 and the base station 200 are both behind “cone” NATs, the Internet server 380 may determine that the optimal connection strategy is for the base station 200 to open a pinhole (e.g., PH3 shown in FIG. 11) and for the remote terminal 430 to connect through the pinhole. Alternatively, if the Internet server 380 detects a symmetric NAT between the remote terminal 430 and the base station 200, it may determine that use of a relay or TURN server is necessary.
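As a non-limiting illustration of this decision, the following sketch chooses between the pinhole and relay strategies based on NAT classifications of the kind STUN-style probing might report. The classification labels are assumptions for illustration only.

```python
# Sketch of the connection-strategy decision described above.
def choose_strategy(terminal_nat: str, base_nat: str) -> str:
    cone_types = {"none", "full_cone", "restricted_cone", "port_restricted_cone"}
    if terminal_nat in cone_types and base_nat in cone_types:
        return "pinhole"  # base station opens a pinhole; terminal connects through it
    return "relay"        # symmetric NAT on either side: fall back to a TURN-style relay

# e.g. choose_strategy("full_cone", "symmetric") -> "relay"
```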


Then, for example, in an implementation using a SKYPE network, the Internet server 380 may designate a “super node” (e.g., a node that is not behind a NAT or firewall) for the hosts to use as a relay server. As illustrated in FIG. 12, the Internet server 380 itself may also function as a relay server, for example (alternatively, the relay server may be another Internet host separate from the Internet server 380). When communicating via a relay server, there need not be any direct connection between the remote terminal 430 and the base station 200.


Poor quality of service (“QoS”) may affect the ability of the remote user 400 to effectively control the mobile robot 100 and/or to conduct effective telecommunication with the local user 500. Accordingly, the Internet server 380 may measure and/or monitor QoS of the VoIP connection between the remote terminal 430 and the mobile robot 100, and may select relay servers (or connection strategy recommendations) based on the QoS metric. For example, when the VoIP protocol being used is SIP, the Internet server 380 may measure QoS using the RTCP protocol. If the QoS is determined to be poor, then the Internet server 380 may notify the remote terminal 430 and the base station 200 to attempt to connect via a different relay server or to attempt a different connection strategy.
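As a non-limiting illustration of one QoS metric such monitoring might track, the following sketch computes the RFC 3550 interarrival-jitter estimate from RTP packet arrivals. The 90 kHz clock and the "poor" threshold are assumptions for illustration only.

```python
# Sketch: RFC 3550 interarrival jitter, updated per RTP packet from the
# arrival time versus the packet's RTP timestamp.
CLOCK_RATE = 90000  # assumed 90 kHz video clock

class JitterEstimator:
    def __init__(self):
        self.prev_transit = None
        self.jitter = 0.0

    def on_packet(self, arrival_seconds: float, rtp_timestamp: int):
        transit = arrival_seconds * CLOCK_RATE - rtp_timestamp
        if self.prev_transit is not None:
            d = abs(transit - self.prev_transit)
            self.jitter += (d - self.jitter) / 16.0  # RFC 3550 running estimate
        self.prev_transit = transit

    def is_poor(self, threshold_ticks=2700):  # ~30 ms at 90 kHz, assumed
        return self.jitter > threshold_ticks
```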



FIG. 26 shows an example implementation of a mobile robot 100 having a rotatable robot camera 196 mounted on a turret, in which the robot camera 196 is rotatable from a forward-facing position (the forward direction of the chassis 105 being indicated by the arrow “A”) to a rear-facing position (the rear direction of the chassis 105 being indicated by the arrow “B”). In accordance with at least one implementation, the rotatable robot camera 196 is operable even when rotated to the rear-facing position (see FIG. 26).


When the mobile robot 100 having this type of robot camera 196 is recharging or connected to a base station 200, the mobile robot 100 may typically be oriented such that the front end “A” of the chassis 105 faces a wall, or the chassis 105 is otherwise obstructed from rotating. Yet the mobile robot 100 can still enable a remote user 400 to receive local image data from the rotatable robot camera 196, as illustrated in FIG. 26, by rotating the robot camera 196 so as to face the rearward direction “B”.


In accordance with one example implementation, when the robot camera 196 flips by 180 degrees so as to switch between the forward-facing and rearward-facing directions, the mobile robot 100 automatically adjusts the output signal to compensate for the flipped view of the robot camera 196.
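As a non-limiting illustration of this compensation, the following sketch rotates each video frame 180 degrees when the camera is in the rear-facing position. The frame representation and the orientation flag are assumptions for illustration only.

```python
# Sketch: when the camera faces rearward, rotate each frame 180 degrees
# before sending it. Uses NumPy on a (height, width, channels) frame.
import numpy as np

def compensate(frame: np.ndarray, camera_reversed: bool) -> np.ndarray:
    if camera_reversed:
        # Flipping both image axes is equivalent to a 180-degree rotation.
        return frame[::-1, ::-1].copy()
    return frame
```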


As shown in FIGS. 23 and 24, the rotatable robot camera 196 may include a tilt mechanism, including a tilt gear motor and gears for adjusting the angular position of the robot camera 196.



FIG. 27 illustrates an example implementation of a mobile robot 100, in which the mobile robot 100 has a walking-type mobility platform 103. The walking-type mobility platform 103 may include a bipedal leg-type drive system (see, for example, US Patent Application Publication 2006/0249314 to TAKENAKA, published Nov. 9, 2006, US Patent Application Publication 2005/0151496 to FURUTA, published Jul. 14, 2005, and US Patent Application Publication 2004/0027086 to OGAWA, published Feb. 12, 2004, each of which is incorporated herein by reference). Alternatively, the drive system may include a tread track-type mobility platform (such as used with the iRobot® PackBot®, as a non-limiting example), an insect-leg-type mobility platform, or any other mobility platform suitable for propelling the mobile robot 100. Also, the mobile robot 100 may include a component housing 7003 to contain the telecommunication processor, the wireless network transceiver, or other components.



FIG. 19 illustrates a robot camera 196 that can be placed into the privacy mode using a key 133 inserted into a lock 134, in which turning the key 133 toggles the position of the robot camera 196 from the enabled mode to the privacy mode. When the robot camera 196 is in the privacy mode position, the backstop 199 (see FIGS. 17 and 18) obstructs the view of the robot camera 196. Further, the privacy position of the robot camera 196 provides visible assurance to users that the robot camera 196 cannot send out visual observation of the mobile robot's surroundings.



FIG. 25 illustrates an example implementation of a remote user interface 402 provided to the remote user 400 at the remote terminal 430. In the remote user interface 402 is a camera window 4021 which shows the real-time video data sent from the robot camera 196. Clickable navigation icons 4022 permit the remote user 400 to navigate the mobile robot 100 by placing the screen cursor onto an icon representing a desired direction of turning (or straight ahead) and then clicking the mouse 438, for example. A robot control panel 4025 includes buttons for sending a robot control signal to control various robot functions, while the telecommunication window 4024 includes indicators of current call status and also icons for controlling the initiation, termination, etc., of the telecommunication functionality of the mobile robot 100. A robot status window 4023 indicates the mode of operation (viewing angle, currently selected camera, etc.) of the robot camera 196.


In accordance with one example implementation, the mobile robot 100 may include a rotatable robot camera 196.


Software Component Organization


As one example of a telecommunication robot system, a mobile robot 100 and a remote terminal 430 (e.g., soft-phone) communicate over the Internet 901. To facilitate a reasonable user experience for the remote user 400, a back-end infrastructure is provided, such as the Internet-connected server 380, which performs matchmaking, facilitates NAT traversal and ties customer and account management to the existing customer relationship management services. In addition, the back-end infrastructure introduces product subscriptions, remote robot user interfaces, persistent robot history, and online software releases.


In accordance with at least one example implementation, the back-end infrastructure: provides a SIP registry for all robots and users, providing a location-free global address space for easier calling; facilitates firewall traversal such that users are not required to endure home networking configuration; allows owners to interact with online customer support by tying the SIP domain and product registration to iRobot's existing customer support infrastructure; acts as a gateway for extended product status, such as “robot stuck” messages and call history that would be inappropriate or impossible for the robot to communicate directly; hosts software images (and release notes) for the product; and provides an environment to stage and deploy these and other new services enabled by a remote presence in the home. The following discussion relates to the example component organizations illustrated in FIGS. 28 and 29.


In FIGS. 28 and 29, a conceptual view is provided of the system based on the three software platforms: “Snoopy;” “Control Application;” and “Back End Services.” The “Control Application,” or soft-phone, is a software application that a customer (such as the remote user 400) installs on a PC (such as the remote terminal 430) to interact with his or her mobile robot 100 directly. This may include, for example, a binary application for Windows XP, bundled with the product. The “Snoopy” block refers to the software components of the mobile robot 100, executed on the telecommunication processor. This may include robot control software, a VoIP application, device drivers, and/or an embedded Linux software environment. The mobile robot 100 may also include an embedded web server and web interface used to configure the mobile robot 100 (e.g., via the tether port 165).


The “Back End Services” run on one or more servers in a hosted environment, in accordance with one example implementation. These servers (e.g., the Internet server 380) introduce additional interfaces for maintaining the software services bundled with the product. The back-end infrastructure simplifies address lookups, facilitates firewall traversal, and adds remote robot monitoring capabilities. Internally, the infrastructure also provides customer and account management and service performance information. The following are examples of back end services in accordance with at least one implementation:


Private Interfaces


Private interfaces are used internally or by our vendors to diagnose and manage mobile robots 100 in the field and the back-end infrastructure itself.


System Monitoring UI


It is assumed that all software components will provide mechanisms for service health monitoring. An associated user interface will be used by our hosting provider to maintain the infrastructure. The software may include monitoring functionality such as SNMP agents. The data may be aggregated into a single user interface or dashboard that can be used by the hosting vendor to maintain the servers. In addition, remote access to this information may be provided.


Engineering UI


Product development teams may interact with an Engineering UI service to determine appropriate over-subscription ratios and understand customer usage patterns.


Customer Support UI


This user interface may be used by customer support when diagnosing robot and/or networking issues. This user interface may include service level status, robot status, VoIP status, and subscription account information.


System Monitoring


This information is used by the hosting provider to gauge system health. This information is rendered into the “System Monitoring UI.” For example, a network accessible syslog server, SNMP monitor, or similar software monitoring/logging system may be provided.


Media Relay


As discussed above with regard to the Internet server 380, the server 380 may function to perform relaying of UDP traffic (e.g., VoIP datagrams) between the remote terminal 430 and the mobile robot 100, allowing communication through firewalls. As one non-limiting example, a software product such as MediaProxy may be executed on the server 380 to provide this functionality, and may report its status via SOAP/XML or other suitable system.
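As a non-limiting illustration of this relay function, the following is a minimal single-session sketch that forwards UDP datagrams between two endpoints once both have contacted the relay. The port and the learn-on-first-packet handshake are assumptions for illustration only; a product such as MediaProxy would additionally handle many concurrent sessions, authentication, and timeouts.

```python
# Sketch of the media-relay role: shuttle UDP datagrams between the two
# firewalled endpoints once both have sent to the relay.
import socket

RELAY_PORT = 3478  # illustrative port choice

def relay_forever():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", RELAY_PORT))
    peers = []  # the two endpoints, learned from their first datagrams
    while True:
        data, addr = sock.recvfrom(4096)
        if addr not in peers and len(peers) < 2:
            peers.append(addr)
        if len(peers) == 2:
            other = peers[1] if addr == peers[0] else peers[0]
            sock.sendto(data, other)

if __name__ == "__main__":
    relay_forever()
```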


Provisioning and Call Logs


In accordance with one non-limiting example implementation, a mobile robot 100 may be marketed with two bundled messaging accounts. In addition, additional users may be associated with an account such that multiple family members can each have their own soft-phone installation (for example, the functionality may be provided by executing a software product such as NGN-Pro for provisioning and/or CDRTool for call records, inter alia). This information may also be correlated to an existing customer database schema.


SIP Registrar


The directory service for robot and soft-phone phone numbers. As a benefit, the user may be shielded from technical complexities such as IP addresses and/or hostnames. As a non-limiting example, such functionality may be provided by executing a software product such as OpenSER to maintain an SIP domain.


Customer Support


Customer support may enable restricted access to subscription account information in order to troubleshoot messaging and account issues.


Subscription Management


Users may be enabled to self-manage (pay, cancel) their own messaging accounts.


Email Gateway


May be used to notify users of scheduled downtime. For example, when triggered by the status aggregator, this gateway may automatically notify the corresponding e-mail address with troubleshooting and/or intervention requests. This gateway may also be used to notify users of service-related issues.


Status Aggregator


This application may receive periodic status updates from the mobile robot 100 and use this information to populate a database for use by customer support and the fleet manager UI. It can provide interoperability with existing infrastructure such that robot status (e.g., battery voltage) can be seen via, for example, a web-based fleet management interface.
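As a non-limiting illustration of such an aggregator, the following sketch accepts HTTP POSTs of robot status and persists them for the support and fleet-manager interfaces. The /status path and the report field names are assumptions for illustration only.

```python
# Sketch: accept HTTP POSTs of robot status and persist them to a local
# database for the support and fleet-manager UIs.
import json
import sqlite3
from http.server import BaseHTTPRequestHandler, HTTPServer

db = sqlite3.connect("robot_status.db")
db.execute("CREATE TABLE IF NOT EXISTS status (robot_id TEXT, ts TEXT, body TEXT)")

class StatusHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/status":  # hypothetical endpoint
            self.send_error(404)
            return
        body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
        report = json.loads(body)   # assumed JSON status report
        db.execute("INSERT INTO status VALUES (?, ?, ?)",
                   (report.get("robot_id"), report.get("timestamp"), body.decode()))
        db.commit()
        self.send_response(204)
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), StatusHandler).serve_forever()
```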


Software Image Repository


Software updates will be hosted by our infrastructure. This will be used by customers to update robot, media board, and soft-phone software. Robot Interfaces may be provided as HTTP-based interfaces for robot-to-back-end communication:


Robot Status


The Robot Status module may use this secure link to receive status updates from robots in the field. The application decodes and publishes status to persistent storage (e.g., a relational database) for use by the various user interfaces.


Product Registrar


The product registrar receives configuration information from the mobile robot 100 when a robot owner configures his or her mobile robot 100 and messaging account. This CGI application decodes and pushes this data to the other services to register a subscription, and provision a SIP address, for example. The CGI application may update a global customer record with product and subscription information, for example.


Robot Administrator UI


This is a web-based user interface used for initial configuration and occasional administration. This interface may be limited to the local user, but does display basic status. For example, the interface may be provided by an embedded webserver (running a suitable HTTP server such as, e.g., thttpd) on the telecommunication processor of the mobile robot 100.


Example Use Scenarios


The following are brief example use scenarios:


A. Out of Box Experience


Initial experience for a new customer:


1. User configures mobile robot 100 using Account Admin UI served by the mobile robot 100, using his/her global iRobot account credentials.


2. User submits form on the mobile robot 100, which then relays portions of the data to the Product Registrar.


3. Product Registrar validates customer account, registers the mobile robot 100 and binds a subscription to it.


4. Product Registrar provisions accounts for the mobile robot 100 and soft phone (e.g., the remote terminal 430) with SIP Registrar.


5. Product Registrar returns success to Robot Admin on the mobile robot 100.


6. Robot Admin indicates success to user, and has Configuration Management notify the Robot Application of the configuration changes.


7. User installs soft-phone and dials first call to the robot.


8. The soft-phone VoIP stack contacts the SIP registrar, which validates the soft-phone and returns the address information of the robot.


9. The soft-phone VoIP stack contacts the robot application and establishes a call.


10. Audio and video begin streaming between robot and soft-phone. These streams pass through the Media Relay.


11. The Provisioning and Call Logs service logs the call (but not the data).


B. Account Provisioning


User wants to register another account for her husband:


1. User logs into Account Admin UI and adds another SIP number and password to the robot subscription.


2. Information is forwarded to SIP Registrar.


3. Later, husband downloads soft-phone installer from Software Image Repository and installs on his desktop.


4. Husband uses the new SIP credentials to log in to Snoopy. He has to enter the PIN number given to him by the user (this information is not stored by iRobot).


C. Robot Stuck


The mobile robot 100 gets stuck and cannot find its way back to the base station/dock 200:


1. Robot Platform decides it has tried too long to find the dock. It notifies Robot App, which spawns a message using the robot's Status Reporting module.


2. Status Reporting sends HTTP POST to Robot Status module on back-end.


3. Robot Status updates database.


4. The database update triggers an Email Gateway to notify the main user.


5. Email Gateway looks up the paired email account from the Customer Support module and sends a message to that user: “Robot could not find dock, please login and drive to dock.”


D. Robot Diagnostics and Customer Support


Mobile robot 100 has failed: it could not reach the dock or base station 200 and has since lost power, and the user missed the email notifications from the mobile robot 100:


1. Remote user 400 tries calling the mobile robot 100, but fails.


2. The remote user 400 contacts customer support (via web, soft-phone, or telephone, as non-limiting examples).


3. Customer support contact opens Customer Support UI and enters global account login for remote user 400.


4. Customer support queries Subscription Management and Provisioning for subscription status. Subscription is valid.


5. Customer support queries Provisioning and Call Logs to examine call records. No unusual loss of networking or error codes detected.


6. Customer support queries Status Aggregator and finds the stuck and power-down messages from the mobile robot 100.


7. Customer support instructs the remote user 400 about the failure and logs the call.


8. Customer calls home and has mobile robot 100 put on dock.


Although various exemplary implementations have been discussed hereinabove, the claimed subject matter is not necessarily limited thereto. Rather, the claimed subject matter relates to the features of the accompanying claims and their equivalents.

Claims
  • 1. A method comprising: receiving, in non-transitory storage, configuration information from a mobile telepresence robot; updating, using a computer processor in communication with the non-transitory storage, a user account stored in the non-transitory storage using the configuration information; provisioning, using the computer processor, a session initiation protocol address using the configuration information; receiving, at the computer processor, a Voice-over-Internet Protocol datagram from a remote computing device, the Voice-over-Internet Protocol datagram including a request for establishing a communication connection between the remote computing device and the mobile telepresence robot; and instantiating, at the computer processor, a communication connection between the remote computing device and the telepresence robot.
  • 2. The method of claim 1, further comprising decoding the configuration information using the computer processor.
  • 3. The method of claim 1, further comprising sending a confirmation message from the computer processor to the robot after successfully provisioning the session initiation protocol address.
  • 4. The method of claim 1, further comprising validating the remote computing device before instantiating the communication connection.
  • 5. The method of claim 1, further comprising receiving periodic status updates at the computer processor from the robot and storing the status updates in the non-transitory storage.
  • 6. The method of claim 1, further comprising sending a customer support message from the computer processor to the remote computing device in response to a received status update.
  • 7. The method of claim 1, further comprising diagnosing a robot operating or networking issue based on a received status update.
  • 8. The method of claim 1, further comprising displaying a fleet management interface providing service, health, or operating data related to the robot.
  • 9. The method of claim 1, further comprising sending, from the computer processor, a data packet to an apparent Internet Protocol address associated with the telepresence robot.
  • 10. A hosting device comprising: a hosting computing device located at a first location, different from a second location of a mobile telepresence robot and a third location of a telepresence computing device in wireless communication with the mobile telepresence robot through a firewall interposed between the mobile telepresence robot and the telepresence computing device, the hosting computing device in wireless communication with the mobile telepresence robot and the telepresence computing device, the host computing device relaying User Datagram Protocol traffic between the mobile telepresence robot and the telepresence computing device through the firewall.
  • 11. The hosting device of claim 10, wherein the host computing device: receives periodic status updates from the mobile telepresence robot; and stores the status updates in non-transitory storage in communication with the hosting computing device.
  • 12. The hosting device of claim 11, wherein the host computing device decodes the status updates before storing the status updates in a non-transitory storage.
  • 13. The hosting device of claim 11, wherein the host computing device sends a customer support message to the telepresence computing device in response to a received status update.
  • 14. The hosting device of claim 11, wherein the host computing device diagnoses a robot operating or networking issue based on a received status update.
  • 15. The hosting device of claim 11, wherein the host computing device displays a fleet management interface providing service, health, or operating data related to the robot.
  • 16. The hosting device of claim 10, wherein the host computing device receives configuration information from the robot in response to user configuration of the robot or a messaging account associated with the robot.
  • 17. The hosting device of claim 16, wherein the host computing device provisions a session initiation protocol address using the configuration information.
  • 18. The hosting device of claim 16, wherein the host computing device decodes the configuration information and pushes the decoded configuration information to one or more services in communication with the host computing device.
  • 19. The hosting device of claim 16, wherein the host computing device decodes the configuration information and updates a customer record stored in non-transitory storage.
CROSS REFERENCE TO RELATED APPLICATIONS

This U.S. patent application is a continuation of, and claims priority under 35 U.S.C. §120 from, U.S. patent application Ser. No. 14/041,325, filed on Sep. 30, 2013, which is a continuation of U.S. patent application Ser. No. 13/562,315, filed on Jul. 31, 2012 (now U.S. Pat. No. 8,577,501), which is a continuation of patent application Ser. No. 11/862,197, filed on Sep. 27, 2007 (now U.S. Pat. No. 8,265,793), which claims priority under 35 U.S.C. §119(e) to U.S. Provisional Patent Application 60/974,404, filed on Sep. 21, 2007, and U.S. Provisional Patent Application No. 60/895,740, filed Mar. 20, 2007. The disclosures of these prior applications are considered part of the disclosure of this application and are hereby incorporated by reference in their entireties. The entire contents of U.S. Patent Application Publication 2007/0198128 to ZIEGLER, published Aug. 23, 2007, and of International Patent Application Publication WO 2007/041295 A2 to CROSS, published Apr. 12, 2007, are hereby incorporated by reference in their entireties.

US Referenced Citations (427)
Number Name Date Kind
4413693 Derby Nov 1983 A
4638445 Mattaboni Jan 1987 A
4669168 Tamura et al. Jun 1987 A
4697472 Hiyane Oct 1987 A
4709265 Silverman et al. Nov 1987 A
4751658 Kadonoff et al. Jun 1988 A
4777416 George, II et al. Oct 1988 A
4797557 Ohman Jan 1989 A
4803625 Fu et al. Feb 1989 A
4847764 Halvorson Jul 1989 A
4875172 Kanayama Oct 1989 A
4942538 Yuan et al. Jul 1990 A
4953159 Hayden et al. Aug 1990 A
4974607 Miwa Dec 1990 A
4977971 Crane, III et al. Dec 1990 A
5006988 Borenstein et al. Apr 1991 A
5040116 Evans, Jr. et al. Aug 1991 A
5051906 Evans, Jr. et al. Sep 1991 A
5073749 Kanayama Dec 1991 A
5084828 Kaufman et al. Jan 1992 A
5130794 Ritchey Jul 1992 A
5148591 Pryor Sep 1992 A
5153833 Gordon et al. Oct 1992 A
5155684 Burke et al. Oct 1992 A
5157491 Kassatly Oct 1992 A
5193143 Kaemmerer et al. Mar 1993 A
5217453 Wilk Jun 1993 A
5224157 Yamada et al. Jun 1993 A
5231693 Backes et al. Jul 1993 A
5236432 Matsen, III et al. Aug 1993 A
5252951 Tannenbaum et al. Oct 1993 A
5315287 Sol May 1994 A
5319611 Korba Jun 1994 A
5341242 Gilboa et al. Aug 1994 A
5341459 Backes Aug 1994 A
5350033 Kraft Sep 1994 A
5366896 Margrey et al. Nov 1994 A
5372211 Wilcox et al. Dec 1994 A
5374879 Pin et al. Dec 1994 A
5417210 Funda et al. May 1995 A
5436542 Petelin et al. Jul 1995 A
5441047 David et al. Aug 1995 A
5442728 Kaufman et al. Aug 1995 A
5462051 Oka et al. Oct 1995 A
5539741 Barraclough et al. Jul 1996 A
5544649 David et al. Aug 1996 A
5553609 Chen et al. Sep 1996 A
5572229 Fisher Nov 1996 A
5572999 Funda et al. Nov 1996 A
5594859 Palmer et al. Jan 1997 A
5636218 Ishikawa et al. Jun 1997 A
5652849 Conway et al. Jul 1997 A
5682199 Lankford Oct 1997 A
5684695 Bauer Nov 1997 A
5701904 Simmons et al. Dec 1997 A
5739657 Takayama et al. Apr 1998 A
5749058 Hashimoto May 1998 A
5749362 Funda et al. May 1998 A
5762458 Wang et al. Jun 1998 A
5767897 Howell Jun 1998 A
5786846 Hiroaki Jul 1998 A
5802494 Kuno Sep 1998 A
5836872 Kenet et al. Nov 1998 A
5867653 Aras et al. Feb 1999 A
5876325 Mizuno et al. Mar 1999 A
5911036 Wright et al. Jun 1999 A
5917958 Nunally et al. Jun 1999 A
5927423 Wada et al. Jul 1999 A
5949758 Kober Sep 1999 A
5954692 Smith et al. Sep 1999 A
5959423 Nakanishi et al. Sep 1999 A
5966130 Benman, Jr. Oct 1999 A
5974446 Sonnenreich et al. Oct 1999 A
6133944 Braun et al. Oct 2000 A
6135228 Asada et al. Oct 2000 A
6148100 Anderson et al. Nov 2000 A
6170929 Wilson et al. Jan 2001 B1
6175779 Barrett Jan 2001 B1
6201984 Funda et al. Mar 2001 B1
6211903 Bullister Apr 2001 B1
6219587 Ahlin et al. Apr 2001 B1
6232735 Baba et al. May 2001 B1
6233504 Das et al. May 2001 B1
6256556 Zenke Jul 2001 B1
6259806 Green Jul 2001 B1
6259956 Myers et al. Jul 2001 B1
6266162 Okamura et al. Jul 2001 B1
6266577 Popp et al. Jul 2001 B1
6289263 Mukherjee Sep 2001 B1
6292713 Jouppi et al. Sep 2001 B1
6304050 Skaar et al. Oct 2001 B1
6321137 De Smet Nov 2001 B1
6323942 Bamji Nov 2001 B1
6325756 Webb et al. Dec 2001 B1
6327516 Zenke Dec 2001 B1
6330486 Padula Dec 2001 B1
6330493 Takahashi et al. Dec 2001 B1
6346950 Jouppi Feb 2002 B1
6346962 Goodridge Feb 2002 B1
6369847 James et al. Apr 2002 B1
6408230 Wada Jun 2002 B2
6430471 Kintou et al. Aug 2002 B1
6430475 Okamoto et al. Aug 2002 B2
6438457 Yokoo et al. Aug 2002 B1
6452915 Jorgensen Sep 2002 B1
6463352 Tadokoro et al. Oct 2002 B1
6463361 Wang et al. Oct 2002 B1
6466844 Ikeda et al. Oct 2002 B1
6468265 Evans et al. Oct 2002 B1
6491701 Tierney et al. Dec 2002 B2
6496099 Wang et al. Dec 2002 B2
6496755 Wallach et al. Dec 2002 B2
6501740 Sun et al. Dec 2002 B1
6507773 Parker et al. Jan 2003 B2
6515740 Bamji et al. Feb 2003 B2
6522906 Salisbury, Jr. et al. Feb 2003 B1
6523629 Buttz et al. Feb 2003 B1
6526332 Sakamoto et al. Feb 2003 B2
6529765 Franck et al. Mar 2003 B1
6529802 Kawakita et al. Mar 2003 B1
6532404 Colens Mar 2003 B2
6535182 Stanton Mar 2003 B2
6535793 Allard Mar 2003 B2
6540039 Yu et al. Apr 2003 B1
6543899 Covannon et al. Apr 2003 B2
6549215 Jouppi Apr 2003 B2
6563533 Colby May 2003 B1
6580246 Jacobs Jun 2003 B2
6584376 Van Kommer Jun 2003 B1
6604019 Ahlin et al. Aug 2003 B2
6604021 Imai et al. Aug 2003 B2
6611120 Song et al. Aug 2003 B2
6646677 Noro et al. Nov 2003 B2
6650748 Edwards et al. Nov 2003 B1
6659215 Lie Dec 2003 B1
6684129 Salisbury, Jr. et al. Jan 2004 B2
6691000 Nagai et al. Feb 2004 B2
6710797 McNelley et al. Mar 2004 B1
6728599 Wang et al. Apr 2004 B2
6764373 Osawa et al. Jul 2004 B1
6769771 Trumbull Aug 2004 B2
6781606 Jouppi Aug 2004 B2
6784916 Smith Aug 2004 B2
6785589 Eggenberger et al. Aug 2004 B2
6791550 Goldhor et al. Sep 2004 B2
6799065 Niemeyer Sep 2004 B1
6799088 Wang et al. Sep 2004 B2
6804580 Stoddard et al. Oct 2004 B1
6804656 Rosenfeld et al. Oct 2004 B1
6810411 Coughlin et al. Oct 2004 B1
6836703 Wang et al. Dec 2004 B2
6839612 Sanchez et al. Jan 2005 B2
6840904 Goldberg Jan 2005 B2
6845297 Allard Jan 2005 B2
6852107 Wang et al. Feb 2005 B2
6853878 Hirayama et al. Feb 2005 B2
6853880 Sakagami et al. Feb 2005 B2
6871117 Wang et al. Mar 2005 B2
6879879 Jouppi et al. Apr 2005 B2
6892112 Wang et al. May 2005 B2
6895305 Lathan et al. May 2005 B2
6909708 Krishnaswamy et al. Jun 2005 B1
6914622 Smith et al. Jul 2005 B1
6925357 Wang et al. Aug 2005 B2
6951535 Ghodoussi et al. Oct 2005 B2
6958706 Chaco et al. Oct 2005 B2
6965394 Gutta et al. Nov 2005 B2
6985480 Brown Jan 2006 B2
6995664 Darling Feb 2006 B1
7030757 Matsuhira et al. Apr 2006 B2
7092001 Schulz Aug 2006 B2
7096090 Zweig Aug 2006 B1
7115102 Abbruscato Oct 2006 B2
7115849 Dowski, Jr. et al. Oct 2006 B2
7117067 McLurkin et al. Oct 2006 B2
7123285 Smith et al. Oct 2006 B2
7123974 Hamilton Oct 2006 B1
7123991 Graf et al. Oct 2006 B2
7127325 Nagata et al. Oct 2006 B2
7129970 James et al. Oct 2006 B2
7133062 Castles et al. Nov 2006 B2
7142945 Wang et al. Nov 2006 B2
7142947 Wang et al. Nov 2006 B2
7151982 Liff et al. Dec 2006 B2
7154526 Foote et al. Dec 2006 B2
7155306 Haitin et al. Dec 2006 B2
7156809 Quy Jan 2007 B2
7158317 Ben-Eliezer et al. Jan 2007 B2
7158859 Wang et al. Jan 2007 B2
7158860 Wang et al. Jan 2007 B2
7161322 Wang et al. Jan 2007 B2
7162338 Goncalves et al. Jan 2007 B2
7164969 Wang et al. Jan 2007 B2
7171286 Wang et al. Jan 2007 B2
7174238 Zweig Feb 2007 B1
7184559 Jouppi Feb 2007 B2
7188000 Chiappetta et al. Mar 2007 B2
7206627 Abovitz et al. Apr 2007 B2
7215786 Nakadai et al. May 2007 B2
7256708 Rosenfeld et al. Aug 2007 B2
7262573 Wang et al. Aug 2007 B2
7283893 Hara et al. Oct 2007 B2
7289883 Wang et al. Oct 2007 B2
7321807 Laski Jan 2008 B2
7340077 Gokturk et al. Mar 2008 B2
7346429 Goldenberg et al. Mar 2008 B2
7382399 McCall et al. Jun 2008 B1
7432949 Remy et al. Oct 2008 B2
7433024 Garcia et al. Oct 2008 B2
7441953 Banks Oct 2008 B2
7624166 Foote et al. Nov 2009 B2
7633586 Winlow et al. Dec 2009 B2
7706917 Chiappetta et al. Apr 2010 B1
7720572 Ziegler et al. May 2010 B2
7844364 McLurkin et al. Nov 2010 B2
7924323 Walker et al. Apr 2011 B2
7957837 Ziegler et al. Jun 2011 B2
8195333 Ziegler et al. Jun 2012 B2
8265793 Cross et al. Sep 2012 B2
8892260 Cross et al. Nov 2014 B2
20010002448 Wilson et al. May 2001 A1
20010010053 Ben-Shachar et al. Jul 2001 A1
20010034475 Flach et al. Oct 2001 A1
20010037163 Allard Nov 2001 A1
20010051881 Filler Dec 2001 A1
20010054071 Loeb Dec 2001 A1
20020015296 Howell et al. Feb 2002 A1
20020027597 Sachau Mar 2002 A1
20020049517 Ruffner Apr 2002 A1
20020055917 Muraca May 2002 A1
20020057279 Jouppi May 2002 A1
20020058929 Green May 2002 A1
20020059587 Cofano et al. May 2002 A1
20020063726 Jouppi May 2002 A1
20020073429 Beane et al. Jun 2002 A1
20020082498 Wendt et al. Jun 2002 A1
20020095238 Ahlin et al. Jul 2002 A1
20020098879 Rheey Jul 2002 A1
20020104094 Alexander et al. Aug 2002 A1
20020111988 Sato Aug 2002 A1
20020120362 Lathan et al. Aug 2002 A1
20020130950 James et al. Sep 2002 A1
20020141595 Jouppi Oct 2002 A1
20020143923 Alexander Oct 2002 A1
20020177925 Onishi et al. Nov 2002 A1
20020183894 Wang et al. Dec 2002 A1
20020184674 Xi et al. Dec 2002 A1
20020186243 Ellis et al. Dec 2002 A1
20030003962 Vooi-Kia et al. Jan 2003 A1
20030030397 Simmons Feb 2003 A1
20030048481 Kobayashi et al. Mar 2003 A1
20030050733 Wang et al. Mar 2003 A1
20030060808 Wilk Mar 2003 A1
20030069752 LeDain et al. Apr 2003 A1
20030100892 Morley et al. May 2003 A1
20030104806 Ruef et al. Jun 2003 A1
20030114962 Niemeyer Jun 2003 A1
20030135203 Wang et al. Jul 2003 A1
20030144579 Buss Jul 2003 A1
20030144649 Ghodoussi et al. Jul 2003 A1
20030151658 Smith Aug 2003 A1
20030171710 Bassuk et al. Sep 2003 A1
20030174285 Trumbull Sep 2003 A1
20030180697 Kim et al. Sep 2003 A1
20030199000 Valkirs et al. Oct 2003 A1
20030216834 Allard Nov 2003 A1
20030220541 Salisbury et al. Nov 2003 A1
20030231244 Bonilla et al. Dec 2003 A1
20030232649 Gizis et al. Dec 2003 A1
20040012362 Tsurumi Jan 2004 A1
20040013295 Sabe et al. Jan 2004 A1
20040019406 Wang et al. Jan 2004 A1
20040024490 McLurkin et al. Feb 2004 A1
20040027086 Ogawa et al. Feb 2004 A1
20040041904 Lapalme et al. Mar 2004 A1
20040065073 Nash Apr 2004 A1
20040068657 Alexander et al. Apr 2004 A1
20040078219 Kaylor et al. Apr 2004 A1
20040080610 James et al. Apr 2004 A1
20040088077 Jouppi et al. May 2004 A1
20040093409 Thompson et al. May 2004 A1
20040098167 Yi et al. May 2004 A1
20040102167 Shim et al. May 2004 A1
20040117065 Wang et al. Jun 2004 A1
20040138547 Wang et al. Jul 2004 A1
20040143421 Wang et al. Jul 2004 A1
20040148638 Weisman et al. Jul 2004 A1
20040153211 Kamoto et al. Aug 2004 A1
20040157612 Kim Aug 2004 A1
20040162637 Wang et al. Aug 2004 A1
20040167666 Wang et al. Aug 2004 A1
20040167668 Wang et al. Aug 2004 A1
20040172301 Mihai et al. Sep 2004 A1
20040174129 Wang et al. Sep 2004 A1
20040175684 Kaasa et al. Sep 2004 A1
20040179714 Jouppi Sep 2004 A1
20040201602 Mody et al. Oct 2004 A1
20040204074 Desai Oct 2004 A1
20040215490 Duchon et al. Oct 2004 A1
20040230340 Fukuchi et al. Nov 2004 A1
20050003330 Asgarinejad et al. Jan 2005 A1
20050021182 Wang et al. Jan 2005 A1
20050021183 Wang et al. Jan 2005 A1
20050021187 Wang et al. Jan 2005 A1
20050021309 Alexander et al. Jan 2005 A1
20050024485 Castles et al. Feb 2005 A1
20050027567 Taha Feb 2005 A1
20050027794 Decker Feb 2005 A1
20050028221 Liu et al. Feb 2005 A1
20050035862 Wildman et al. Feb 2005 A1
20050038416 Wang et al. Feb 2005 A1
20050038564 Burick Feb 2005 A1
20050052527 Remy et al. Mar 2005 A1
20050057699 Bowser Mar 2005 A1
20050065435 Rauch et al. Mar 2005 A1
20050065659 Tanaka et al. Mar 2005 A1
20050065813 Mishelevich et al. Mar 2005 A1
20050071046 Miyazaki et al. Mar 2005 A1
20050099493 Chew May 2005 A1
20050110867 Schulz May 2005 A1
20050151496 Furuta et al. Jul 2005 A1
20050154265 Miro et al. Jul 2005 A1
20050182322 Grispo Aug 2005 A1
20050192721 Jouppi Sep 2005 A1
20050204438 Wang et al. Sep 2005 A1
20050216126 Koselka et al. Sep 2005 A1
20050219356 Smith et al. Oct 2005 A1
20050267826 Levy et al. Dec 2005 A1
20050283414 Fernandes et al. Dec 2005 A1
20050286494 Hollatz et al. Dec 2005 A1
20060007943 Fellman Jan 2006 A1
20060013263 Fellman Jan 2006 A1
20060013469 Wang et al. Jan 2006 A1
20060013488 Inoue Jan 2006 A1
20060029065 Fellman Feb 2006 A1
20060047365 Ghodoussi et al. Mar 2006 A1
20060052676 Wang et al. Mar 2006 A1
20060052684 Takahashi et al. Mar 2006 A1
20060064212 Thorne Mar 2006 A1
20060082642 Wang et al. Apr 2006 A1
20060087746 Lipow Apr 2006 A1
20060095170 Yang et al. May 2006 A1
20060098573 Beer et al. May 2006 A1
20060103659 Karandikar et al. May 2006 A1
20060104279 Fellman et al. May 2006 A1
20060106493 Niemeyer et al. May 2006 A1
20060122482 Mariotti et al. Jun 2006 A1
20060142983 Sorensen et al. Jun 2006 A1
20060161303 Wang et al. Jul 2006 A1
20060173712 Joubert Aug 2006 A1
20060178776 Feingold et al. Aug 2006 A1
20060189393 Edery Aug 2006 A1
20060195569 Barker Aug 2006 A1
20060209794 Bae et al. Sep 2006 A1
20060249314 Takenaka et al. Nov 2006 A1
20060259193 Wang et al. Nov 2006 A1
20060293788 Pogodin Dec 2006 A1
20070021871 Wang et al. Jan 2007 A1
20070046237 Lakshmanan et al. Mar 2007 A1
20070064092 Sandbeg et al. Mar 2007 A1
20070076729 Takeda Apr 2007 A1
20070078566 Wang et al. Apr 2007 A1
20070100498 Matsumoto et al. May 2007 A1
20070114075 Buehler et al. May 2007 A1
20070117516 Saidi et al. May 2007 A1
20070120965 Sandberg et al. May 2007 A1
20070135967 Jung et al. Jun 2007 A1
20070136405 Weinstein et al. Jun 2007 A1
20070142964 Abramson Jun 2007 A1
20070152427 Olsen Jul 2007 A1
20070153801 Sung et al. Jul 2007 A1
20070159979 Butler et al. Jul 2007 A1
20070189311 Kim et al. Aug 2007 A1
20070192910 Vu et al. Aug 2007 A1
20070197896 Moll et al. Aug 2007 A1
20070198128 Ziegler et al. Aug 2007 A1
20070199108 Angle et al. Aug 2007 A1
20070250212 Halloran et al. Oct 2007 A1
20070262884 Goncalves et al. Nov 2007 A1
20070273751 Sachau Nov 2007 A1
20070291109 Wang et al. Dec 2007 A1
20070291128 Wang et al. Dec 2007 A1
20070293985 Myeong et al. Dec 2007 A1
20080011904 Cepollina et al. Jan 2008 A1
20080065268 Wang et al. Mar 2008 A1
20080082211 Wang et al. Apr 2008 A1
20080105481 Hutcheson et al. May 2008 A1
20080106746 Shpunt et al. May 2008 A1
20080201014 Sonoura Aug 2008 A1
20080201017 Wang et al. Aug 2008 A1
20080215987 Alexander et al. Sep 2008 A1
20080229531 Takida Sep 2008 A1
20080240502 Freedman et al. Oct 2008 A1
20080255703 Wang et al. Oct 2008 A1
20080281467 Pinter Nov 2008 A1
20090055023 Walters et al. Feb 2009 A1
20090096783 Shpunt et al. Apr 2009 A1
20090105882 Wang et al. Apr 2009 A1
20090125147 Wang et al. May 2009 A1
20090164045 Deguire et al. Jun 2009 A1
20090177323 Ziegler et al. Jul 2009 A1
20090185274 Shpunt Jul 2009 A1
20090226113 Matsumoto et al. Sep 2009 A1
20090240371 Wang et al. Sep 2009 A1
20090259339 Wright et al. Oct 2009 A1
20100010672 Wang et al. Jan 2010 A1
20100010673 Wang et al. Jan 2010 A1
20100019715 Roe et al. Jan 2010 A1
20100020078 Shpunt Jan 2010 A1
20100034457 Berliner et al. Feb 2010 A1
20100066587 Yamauchi et al. Mar 2010 A1
20100070079 Mangaser et al. Mar 2010 A1
20100073490 Wang et al. Mar 2010 A1
20100115418 Wang et al. May 2010 A1
20100118123 Freedman et al. May 2010 A1
20100131103 Herzog et al. May 2010 A1
20100191375 Wright et al. Jul 2010 A1
20100268383 Wang et al. Oct 2010 A1
20110050841 Wang et al. Mar 2011 A1
20110071702 Wang et al. Mar 2011 A1
20110187875 Sanchez et al. Aug 2011 A1
20110190930 Hanrahan et al. Aug 2011 A1
20110218674 Stuart et al. Sep 2011 A1
20110288684 Farlow et al. Nov 2011 A1
20110292193 Wang et al. Dec 2011 A1
20110301759 Wang et al. Dec 2011 A1
20130139193 Fan et al. May 2013 A1
Foreign Referenced Citations (73)
Number Date Country
2289697 Nov 1998 CA
101866396 Oct 2010 CN
101978365 Feb 2011 CN
101106939 Nov 2011 CN
0981905 Jan 2002 EP
1262142 Dec 2002 EP
1536660 Jun 2005 EP
1536660 Apr 2008 EP
2263158 Dec 2010 EP
2300930 Mar 2011 EP
2431261 Apr 2007 GB
07213753 Aug 1995 JP
07248823 Sep 1995 JP
08084328 Mar 1996 JP
07257422 Dec 1996 JP
08320727 Dec 1996 JP
09267276 Oct 1997 JP
10079097 Mar 1998 JP
10288689 Oct 1998 JP
00032319 Jan 2000 JP
00049800 Feb 2000 JP
00079587 Mar 2000 JP
00196876 Jul 2000 JP
00188124 Apr 2001 JP
01125641 May 2001 JP
01147718 May 2001 JP
01179663 Jul 2001 JP
01198865 Jul 2001 JP
01198868 Jul 2001 JP
01199356 Jul 2001 JP
02000574 Jan 2002 JP
02035423 Feb 2002 JP
02046088 Feb 2002 JP
02305743 Oct 2002 JP
02355779 Dec 2002 JP
04261941 Sep 2004 JP
04524824 Sep 2004 JP
05028066 Feb 2005 JP
10064154 Mar 2010 JP
10532109 Sep 2010 JP
10246954 Nov 2010 JP
060037979 May 2006 KR
100019479 Feb 2010 KR
100139037 Dec 2010 KR
WO-9306690 Apr 1993 WO
WO-9851078 Nov 1998 WO
WO-9967067 Dec 1999 WO
WO-0033726 Jun 2000 WO
WO-03077745 Sep 2003 WO
WO-2004075456 Sep 2004 WO
WO-2006012797 Feb 2006 WO
WO-2006078611 Jul 2006 WO
WO-2007041038 Apr 2007 WO
WO-2007041295 Apr 2007 WO
WO-2008100272 Aug 2008 WO
WO-2008100272 Oct 2008 WO
WO-2009117274 Sep 2009 WO
WO-2009128997 Oct 2009 WO
WO-2009145958 Dec 2009 WO
WO-2010006205 Jan 2010 WO
WO-2010006211 Jan 2010 WO
WO-2010033666 Mar 2010 WO
WO-2010047881 Apr 2010 WO
WO-2010062798 Jun 2010 WO
WO-2010065257 Jun 2010 WO
WO-2010120407 Oct 2010 WO
WO-2011028589 Mar 2011 WO
WO-2011028589 Apr 2011 WO
WO-2011097130 Aug 2011 WO
WO-2011097132 Aug 2011 WO
WO-2011109336 Sep 2011 WO
WO-2011097132 Dec 2011 WO
WO-2011149902 Dec 2011 WO
Non-Patent Literature Citations (121)
Entry
Rodoplu et al., Empirical Modeling and Estimation of End-to-End VoIP Delay over Mobile Multi-hop Wireless Networks, 2010, IEEE, p. 1-6.
Pleva et al., Speech and Mobile Technologies for Cognitive Communication and Information Systems, 2005, IEEE, p. 1-5.
Ren et al., ASAP: an AS-Aware Peer-Relay Protocol for High Quality VoIP, 2006, IEEE, p. 1-10.
Xiaohui et al., The design and Implementation of Real-Time Internet-Based Telerobotics, 2003, IEEE, p. 815-819.
Office Action for U.S. Appl. No. 11/862,197, dated Nov. 1, 2010.
Office Action for U.S. Appl. No. 11/862,197, dated Mar. 28, 2011.
Office Action for U.S. Appl. No. 13/562,315 dated Jan. 4, 2013.
Australian examination report for related Application No. 2011256720 dated Mar. 27, 2014.
Adams, Chris, “Mobile Robotics Research Group”, Mobile Robotics Research Group, Edinburgh University, http://www.dai.ed.ac.uk/groups/mrg/MRG.html, Internet, Edinburgh, 2000, pp. 1-2.
Ando, et al., “A Multimedia Self-service Terminal with Conferencing Functions”, IEEE, Jul. 5-7, 1995, pp. 357-362.
Android Amusement Corp., “What Marketing Secret . . . Renting Robots from Android Amusement Corp!”, (Advertisement), 1982.
Applebome, “Planning Domesticated Robots for Tomorrow's Household”, New York Times, http://www.theoldrobots.com/images17/dc17.JPG, Mar. 4,1982, pp. 21, 23.
Baltus, et al., “Towards Personal Service Robots for the Elderly, Proceedings for the Elderly Workshop on Interactive Robots and Entertainment”, Computer Science and Robotics, 2000.
Bar-Cohen, et al., “Virtual reality robotic telesurgery simulations using MEMICA haptic system”, Internet, Mar. 5, 2001, pp. 1-7.
Bauer, Jeffrey C. et al., “Service Robots in Health Care: The Evolution of Mechanical Solutions to Human Resource Problems”, Jun. 2003.
Bauer, John et al., “Remote telesurgical mentoring: feasibility and efficacy”, IEEE, 2000, pp. 1-9.
Bischoff, “Design Concept and Realization of the Humanoid Service Robot HERMES”, Field and Service Robotics, Springer, London, 1998, pp. 485-492.
Blackwell, Gerry, “Video: A Wireless LAN Killer App?”, Internet, Apr. 16, 2002, pp. 1-3.
Breslow, Michael J. et al., “Effect of a multiple-site intensive care unit telemedicine program on clinical and economic outcome an alternative paradigm for intensivist staffing”, Critical Care Med; vol. 32 No. 1, Jan. 2004, pp. 31-38.
Brooks, Rodney, “Remote Presence”, Abstracts from Flesh & Machines, How Robots Will Change Us, Feb. 2002, pp. 131-147.
Candelas, Herias et al., “Flexible virtual and remote laboratory for teaching Robotics”, FORMATEX 2006; Proc. Advance in Control Education Madrid, Spain, Jun. 2006, pp. 21-23.
Celi, et al., “The EICU: It's not just telemedicine”, Critical Care Medicine vol. 29, No. 8 (Supplement), Aug. 2001.
Cheetham, Anastasia et al., “Interface Development for a Child's Video Conferencing Robot”, 2000, pp. 1-4.
Cleary, et al., “State of the art in surgical robotics: Clinical applications and technology challenges”, Internet, Feb. 24, 2002, pp. 1-26.
CNN, “Floating ‘droids’ to roam space corridors of the future”, Internet, Jan. 12, 2000, pp. 1-4.
cnn.com/technology, “Paging R.Robot: Machine helps doctors with patients”, Internet, Sep. 30, 2003, 1-3.
Crowley, Susan L., “Hello to Our Future”, AARP Bulletin, http://www.cs.cmu.ed/-nursebot/web/press/aarp 99—14/millennium.html, Jan. 2000.
Dalton, “Techniques for Web Telerobotics”, PhD Thesis, University of Western Australia, http://telerobot.mech.uwa.edu.au/information.html, http://catalogue.library.uwa.edu.au/search, 2001, 27-62 pp. 149-191.
Davies, “Robotics in Minimally Invasive Surgery”, Internet, 1995, pp. 5/1-5/2.
DiGiorgio, James, “Is Your Emergency Department of the Leading Edge?”, Internet, 2005, pp. 1-4.
Discovery Channel Canada, “Inventing the Future: 2000 Years of Discovery”, (Video Transcript), Jan. 2, 2000.
Elhajj, et al., “Supermedia in Internet-based telerobotic operations”, Internet, 2001, pp. 1-14.
Elhajj, et al., “Synchronization and Control of Supermedia Transmission Via the Internet”, Proceedings of 2001 International Symposium on Intelligent Multimedia Video and Speech Processing., Hong Kong, May 2-4, 2001.
Ellison, et al., “Telerounding and Patient Satisfaction Following Surgery”, pp. 523-530.
Fels, “Developing a Video-Mediated Communication System for Hospitalized Children”, Telemedicine Journal, vol. 5,vol. 5, No. 2, 1999.
Fetterman, “Videoconferencing over the Internet”, Internet, 2001, pp. 1-8.
Fiorini, P., et al, “Health Care Robotics: A Progress Report”, IEEE International Conference on Robotics and Automation, Apr. 1997, pp. 1271-1276.
Ghiasi, et al., “A Generic Web-based Teleoperations Architecture: Details and Experience”, SPIE Conference on Telemanipulator and Telepresence Technologies VI, Sep. 1999.
Goldberg, et al., “Collaborative Teleoperation via the Internet”, IEEE International Conference on Robotics and Automation, San Francisco, California, Apr. 2000.
Goldberg, “Desktop Teleoperation via the World Wide Web, Proceedings of the IEEE International Conference on Robotics and Automation”, htto://citeseer.ist.osu.edu/cache/oaoers/cs/5/fto:zSzzSzusc.eduzSzoubzSziriszS zraiders.odf/aol, 1995, pp. 654-659.
Goldberg, “More Online Robots, Robots that Manipulate”, Internet, Updated Aug. 2001, http://ford.ieor.berkeley.edu/ir/robots—a2.html, Aug. 2001.
Goldenberg, et al., “Telemedicine in Otolaryngology”, American Journal of Otolaryngology vol. 23, No. 1, 2002, pp. 35-43.
Goldman, Lea, “Machine Dreams”, Entrepreneurs, Forbes, May 27, 2002.
Gump, Michael D., “Robot Technology Improves VA Pharmacies”, Internet, 2001, pp. 1-3.
Hameed, Mohammed et al., “A Review of Telemedicine”, Journal of Telemedicine and Telecare., vol. 5, Supplement 1, 1999, pp. S1:103-S1:106.
Han, et al., “Construction of an Omnidirectional Mobile Robot Platform Based on Active Dual-Wheel Caster Mechanisms and Development of a Control Simulator”, Kluwer Acedemic Publishers, vol. 29, Nov. 2000, pp. 257-275.
Handley, et al., “RFC 2327—SDP:Session Description Protocol”, http://www.faqs.org/rfcs/rfc2327.html, Apr. 1998.
Hanebeck, et al., “Roman: A mobile Robotic Assistant for Indoor Service Applications”, Proceedings of the 1997 IEEE/RSJ International Conference on Intelligent Robots and Systems, 1997.
Harmo, et al., “Moving Eye-Interactive Telepresence Over Internet With a Ball Shaped Mobile Robot”, 2000.
Haule, et al., “Control Scheme for Delayed Teleoperation Tasks”, Proceedings of the Pacific Rim Conference on Communications, Computer and Signal Processing, May 17, 1995.
Hees, William P., “Communications Design for a Remote Presence Robot”, Jan. 14, 2002.
Holmberg, “Development of a Holonomic Mobile Robot for Mobile Manipulation Tasks”, International Conference on Field and Service Robotics, Pittsburgh, PA, Aug. 1999.
Int'l Communication Union, “ITU-T H.323 Packet-based multimedia communications”, http://www.itu.int/rec/T-REC-H.323-199802-S/en, Feb. 1998.
Ishiguro, “Integrating a Perceptual Information Infrastructure with Robotic Avatars: A Framework for Tele-Existence”, Proceeding of IEEE Conference on Intelligent Robots and Systems, 1999, pp. 1032-1038.
Ishihara, et al., “Intelligent Microrobot DDS (Drug Delivery System) Measured and Controlled by Ultrasonics”, IEEE/RSJ, vol. 2, Nov. 3-5, 1991, pp. 1145-115.
Ivanova, Natali, “Master's thesis: Internet Based Interface for Control of a Mobile Robot”, Department of Numerical Analysis and Computer Science, 2003, 59 pages.
Jenkins, et al., “Telehealth Advancing Nursing Practice”, Nursing Outlook, vol. 49, No. 2, Mar./Apr. 2001.
Johanson, “Supporting video-mediated communication over the Internet”, Chalmers University of Technology,Dept of Computer Engineering, Gothenburg, Sweden, 2003.
Jouppi, et al., “Mutually-Immersive Audio Telepresence”, Audio Engineering Society Convention Paper presented at 113th Convention, Oct. 2002.
Jouppi, Norman et al., “First Steps Towards Mutually-Immersive Mobile Telepresence”, CSCW, 02, New Orleans LA, Nov. 16-20, 2002.
Kanehiro, Fumio et al., “Virtual Humanoid Robot Platform to Develop Controllers of Real Humanoid Robots without Porting”, IEEE, 2001, pp. 3217-3276.
Keller, et al., “Raven Interface Project”, http://upclose.lrdc.pitt.edu/people/louw—assets/Raven—Slides.pps, Fall 2001.
Khatib, “Robots in Human Environments”, Proc. International Conference on Control, Automation, Robotics, and Vision ICRACV2000, Singapore, Dec. 2000, pp. 454-457.
Kuzuoka, et al., “Can the GestureCam Be a Surrogate?”, Proceedings of the Fourth European Conference on Computer-Supported Cooperative Work, Sep. 10-14, pp. 181-196.
Lane, “Automated Aides”, Newsday, http://www.cs.cum.edu/nursebot/web/press/nd4380.htm, Oct. 17, 2000.
Lee, et al., “A novel method of surgical instruction: International telementoring”, Internet, 1998, pp. 1-4.
Lim, Hun-Ok et al., “Control to Realize Human-like Walking of a Biped Humanoid Robot”, IEEE, 2000, pp. 3271-3276.
Linebarger, John M. et al., “Concurrency Control Mechanisms for Closely Coupled Collaboration in Multithreaded Virtual Environments”, Presence, Special Issue on Advances in Collaborative VEs, 2004.
Loeb, et al., “Virtual Visit: Improving Communication for Those Who Need It Most”, Stud Health Technol Inform.; 94: 2003 pp. 302-308.
Long, “HelpMate Robotics, Inc. (Formerly Transitions Research Corporation) Robot Navigation Technology”, NIST Special Publication, http://www.atp.nist.gov/eao/sp950-1/helpmate.htm, Mar. 1999, pp. 950-951.
Luna, Nancy, “Robot a new face on geriatric care”, OC Register, Aug. 6, 2003.
Mack, “Minimally invasive and robotic surgery”, Internet IEEE, 2001, pp. 568-572.
Mair, “Telepresence—The Technology. And Its Economic and Social Implications”, IEEE Technology and Society, 1997.
Martin, Anya, “Days Ahead”, Assisted Living Today, vol. 9, Nov./Dec. 2002, pp. 19-22.
McCardle, et al., “The challenge of utilizing new technology in design education”, Internet, 2000, pp. 122-127.
Meng, et al., “E-Service Robot in Home Healthcare”, Proceedings of the 2000 IEEE/RSJ, International Conference on Intelligent Robots and Systems, 2000, pp. 832-837.
Michaud, “Introducing Nursebot”, The Boston Globe, http://www.cs.cmu.edu/nursebot/web/press/globe 3 01/index.html, Sep. 11, 2001, pp. 1-5.
Montemerlo, “Telepresence: Experiments in Next Generation Internet”, CMU Robotics Institute, http://www.ri.cmu.edu/creative/archives.htm (Video/Transcript), Oct. 20, 1998.
Murphy, “Introduction to AI Robotics”, A Bradford Book, 2000, p. 487.
Nakajima, et al., “A Multimedia Teleteaching System using an Electronic Whiteboard for Two Way Communication of Motion Videos and Chalkboards”, IEEE, 1993, pp. 436-441.
Nomadic Technologies Inc., “Nomad XR4000 Hardware Manual”, Release 1.0, Mar. 1999.
Nt'l Energy Res Sci Comp Ctr, “Berkeley Lab's RAGE Telepresence Robot Captures R&D100 Award”, http://www.nersc.gov/news/newsroom/RAGE070202.php, Jul. 2, 2002.
Ogata, et al., “Development of Emotional Communication Robot: WAMOEBA-2r—Experimental evaluation.”, IEEE, 2000, pp. 175-180.
Ogata, et al., “Emotional Communication Robot: WAMOEBA-2R—Emotion Model and Evaluation Experiments”, Internet, 1999, pp. 1-16.
Oh, et al., “Autonomous Battery Recharging for Indoor Mobile Robots”, Proceedings of Australian Conference on Robotics and Automation, http://users.rsise.anu.edu.au/rsl/rsl—papers/ACRA2000/Auto—Recharge—Paper. pdf, 2000.
Ojha, A. K., “An application of Virtual Reality in Rehabilitation”, IEEE, Apr. 10-13, 1994, pp. 4-6.
Paulos, et al., “A World Wide Web Telerobotic Remote Environment Browser”, http://vive.cs.berkeley.edu/capek, 1995.
Paulos, “Designing Personal Tele-embodiment”, IEEE International Conference on Robotics and Automation http://www.prop.org/papers/icra98.pdf, 1998.
Paulos, “PRoP: Personal Roving Presence”, ACM:CHI Proceedings of CHI '98, http://www.prop.org/papers/chi98.pdf, 1998, p. 6.
Paulos, et al., “Ubiquitous Tele-embodiment: Applications and Implications”, International Journal of Human Computer Studies, vol. 46, No. 6, Jun. 1997, pp. 861-877.
Paulos, “Video of PRoP 2 at Richmond Field Station”, www.prop.org Printout of Home Page of Website and two-page Transcript of the audio portion of said PRoP Video, May 2001.
Paulos, Eric J., “Personal Tele-Embodiment”, UC Berkeley, Fall 2001.
Pin, et al., “A New Family of Omnidirectional and Holonomic Wheeled Platforms for Mobile Robots”, IEEE, vol. 10, No. 4, Aug. 1994.
Rovetta, et al., “A New Telerobotic Application: Remote Laparoscopic Surgery Using Satellites and Optical Fiber Networks for Data Exchange”, International Journal of Robotics Research, Jun. 1, 1996, pp. 267-279.
Roy, et al., “Towards Personal Service Robots for the Elderly”, Internet, Mar. 7, 2002, 7 pgs.
Salemi, et al., “MILO: Personal robot platform”, Internet, 2005, pp. 1-6.
Sandt, Frederic et al., “Perceptions for a Transport Robot in Public Environments”, IROS, 1997.
Schaeffer, “Care-O-bot: A System for Assisting Elderly or Disabled Persons in Home Environments”, Proceedings of AAATE-99, http://morpha.de/download/publications/IPA, 1999.
Schulz, “Web Interfaces for Mobile Robots in Public Places”, Robotics & Automation Magazine, IEEE, vol. 7, Issue 1, Mar. 2000.
Shimoga, et al., “Touch and force reflection for telepresence surgery”, IEEE, 1994, pp. 1049-1050.
Siegwart, “Interacting Mobile Robots on the Web”, Proceedings of the 1999 IEEE International Conference on Robotics and Automation, May 1999.
Simmons, “Xavier: An Autonomous Mobile Robot on the Web”, IEEE Robotics and Automation Magazine, 1999, pp. 43-48.
Spawar Systems Center, “Robart”, San Diego, CA, http://www.nosc.mil/robots/land/robart/robart.html, 1998, pp. 1-8.
Stephenson, Gary, “Dr. Robot Tested at Hopkins”, Internet, Aug. 5, 2003, pp. 1-2.
Stoianovici, et al., “Robotic Tools for Minimally Invasive Urologic Surgery”, Internet, Dec. 2002, pp. 1-17.
Suplee, “Mastering the Robot”, The Washington Post, http://www.cs.cmu.edu-nursebotlweb/press/wash/index.html, Sep. 17, 2000, p. A01.
Tahboub, Karim A. et al., “Dynamics Analysis and Control of a Holonomic Vehicle With Continously Variable Transmission”, Journal of Dynamic Systems, Measurement and Control ASME vol. 124, Mar. 2002, pp. 118-126.
Tendick, et al., “Human-Machine Interfaces for Minimally Invasive Surgery”, IEEE, 1997, pp. 2771-2776.
Thrun, et al., “Probabilistic Algorithms and the Interactive Museum Tour-Guide Robot Minerva”, Internet, 2000, pp. 1-35.
Tzafestas, et al., “VR-based Teleoperation of a Mobile Robotic Assistant: Progress Report”, Internet, Nov. 2000, pp. 1-23.
Urquhart, Kim , “InTouch's robotic Companion ‘beams up’ healthcare experts”, Medical Device Daily, vol. 7, No. 39, Feb. 27, 2003, p. 1,4.
Weiss, et al., “Telework and video-mediated communication: Importance of real-time, interactive communication for workers with disabilities”, California State University Northridge http://www.csun.edu/cod/conf/1999/proceedings/session0238.html, pp. 1-4.
West, et al., “Design of Ball Wheel Mechanisms for Omnidirectional Vehicles with Full Mobility and Invariant Kinematics”, Journal of Mechanical Design, vol. 119, Jun. 1997, pp. 153-161.
Yamasaki, et al., “Applying Personal Robots and Active Interface to Video Conference Systems”, Internet, 1995, pp. 243-248.
Yamauchi, “PackBot: A Versatile Platform for Military Robotics”, Internet, 2004, pp. 1-10.
Yong, et al., “Robot task execution with telepresence using virtual reality technology”, Internet, 1998, pp. 1-8.
Zamrazil, Kristie, “Telemedicine in Texas: Public Policy Concerns”, House Research Organization Focus Report, Texas House of Representatives, http://www.hro.house.state.tx.us/focus/telemed.pdf, May 5, 2000, pp. 76-22.
Zipperer, Lorri, “Robotic dispensing system”, 1999, pp. 1-2.
Zorn, Benjamin G., “Ubiquitous Telepresence”, http://www.cs.colorado.edu/-zorn/utlvision/vision.html, May 5, 1996.
Kaplan, A. E. et al., “An Internet Accessible Telepresence”, {aek keshav nls jhv}@research.att.com, AT&T Bell Laboratories, Murray Hill, N.J., pp. 1-7 (1997).
Kuzuoka, et al., “Can the GestureCam Be a Surrogate?”, Proceedings of the Fourth European Conference on Computer-Supported Cooperative Work, Sep. 1995, pp. 181-196.
Related Publications (1)
Number Date Country
20150094854 A1 Apr 2015 US
Provisional Applications (2)
Number Date Country
60974404 Sep 2007 US
60895740 Mar 2007 US
Continuations (3)
Number Date Country
Parent 14041325 Sep 2013 US
Child 14512842 US
Parent 13562315 Jul 2012 US
Child 14041325 US
Parent 11862197 Sep 2007 US
Child 13562315 US