Documentation through a remote presence robot

Information

  • Patent Number
    11,850,757
  • Date Filed
    Thursday, August 28, 2014
  • Date Issued
    Tuesday, December 26, 2023
Abstract
A robotic system that is used in a tele-presence session. For example, the system can be used by medical personnel to examine, diagnose and prescribe medical treatment in the session. The system includes a robot that has a camera and is controlled by a remote station. The system further includes a storage device that stores session content data regarding the session. The data may include a video/audio taping of the session by the robot. The session content data may also include time stamps that allow a user to determine the times that events occurred during the session. The session content data may be stored on a server that is accessible by multiple users. Billing information may be automatically generated using the session content data.
Description
1. FIELD OF THE INVENTION

The subject matter disclosed generally relates to a robotic tele-presence system.


2. BACKGROUND INFORMATION

Robots have been used in a variety of applications ranging from remote control of hazardous material to assisting in the performance of surgery. For example, U.S. Pat. No. 5,762,458 issued to Wang et al. discloses a system that allows a surgeon to perform minimally invasive medical procedures through the use of robotically controlled instruments. One of the robotic arms in the Wang system moves an endoscope that has a camera. The camera allows a surgeon to view a surgical area of a patient.


There has been marketed a mobile robot introduced by InTouch-Health, Inc., the assignee of this application, under the trademark RP-7. The InTouch robot is controlled by a user at a remote station. The remote station includes a personal computer with a joystick that allows the user to remotely control the movement of the robot. Both the robot and remote station have cameras, monitors, speakers and microphones to allow for two-way video/audio communication.


The InTouch RP-7 system is used by medical personnel to remotely “visit” a patient. The system is particularly useful for medical specialists. For example, medical personnel specializing in patient stroke care can remotely examine, diagnose and prescribe a patient management plan. With the proliferation of such robots, it would be desirable to track and store data related to tele-presence sessions.


BRIEF SUMMARY OF THE INVENTION

A robotic system includes a robot that has a camera and a remote station coupled to the robot. The remote station controls the robot in a session that results in session content data. The system further includes a storage device that stores the session content data.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is an illustration of a robotic system;



FIG. 2 is an illustration showing a user interface;



FIG. 3 is an illustration of a user interface displaying events and associated time stamps;



FIG. 4 is an illustration of a user interface with selectable fields;



FIG. 5 is an illustration showing the display of a pull-down menu;



FIG. 6 is an illustration showing a session field displayed in response to the selection of a field;



FIG. 7 is a schematic of an electrical system of a robot;



FIG. 8 is a graphical user interface provided at a user interface;



FIG. 9 is a graphical user interface at a remote station;



FIG. 10 is a graphical user interface at the remote station;



FIG. 11 is a graphical user interface when a NIHSS tab is selected;



FIG. 12 is a graphical user interface displayed when a t-PA tab is selected;



FIG. 13 is a graphical user interface displayed when a view images button is selected.





DETAILED DESCRIPTION

Disclosed is a robotic system that is used in a tele-presence session. For example, the system can be used by medical personnel to examine, diagnose and prescribe medical treatment in the session. The system includes a robot that has a camera and is controlled by a remote station. The system further includes a storage device that stores session content data regarding the session. The data may include a video/audio taping of the session by the robot. The session content data may also include time stamps that allow a user to determine the times that events occurred during the session. The session data may be stored on a server that is accessible to multiple users. Billing information may be automatically generated using the session data.


Referring to the drawings more particularly by reference numbers, FIG. 1 shows a robotic system 10. The robotic system 10 includes one or more robots 12. Each robot 12 may have a base station 14. The robot 12 is coupled to a remote control station 16. The remote control station 16 may be coupled to the base station 14 through a network 18. By way of example, the network 18 may be either a packet switched network such as the Internet, or a circuit switched network such as a Public Switched Telephone Network (PSTN) or other broadband system. The base station 14 may be coupled to the network 18 by a modem 20 or other broadband network interface device. By way of example, the base station 14 may be a wireless router. Alternatively, the robot 12 may have a direct connection to the network 18 through, for example, a satellite.


The remote control station 16 may include a computer 22 that has a monitor 24, a camera 26, a microphone 28 and a speaker 30. The computer 22 may also contain an input device 32 such as a joystick or a mouse. The control station 16 is typically located in a place that is remote from the robot 12. Although only one remote control station 16 is shown, the system 10 may include a plurality of remote stations. In general, any number of robots 12 may be controlled by any number of remote stations 16 or other robots 12. For example, one remote station 16 may be coupled to a plurality of robots 12, or one robot 12 may be coupled to a plurality of remote stations 16 or to a plurality of other robots 12.


Each robot 12 includes a movement platform 34 that is attached to a robot housing 36. The robot 12 may also have a camera 38, a monitor 40, a microphone(s) 42 and a speaker(s) 44. The microphone 42 and speaker 30 may create a stereophonic sound. The robot 12 may also have an antenna 46 that is wirelessly coupled to an antenna 48 of the base station 14. The system 10 allows a user at the remote control station 16 to move the robot 12 through operation of the input device 32. The robot camera 38 is coupled to the remote monitor 24 so that a user at the remote station 16 can view someone at the robot site such as a patient. Likewise, the robot monitor 40 is coupled to the remote camera 26 so that someone at the robot site can view the user. The microphones 28 and 42, and speakers 30 and 44, allow for audible communication between the robot site and the user of the system.


The remote station computer 22 may operate Microsoft OS software and WINDOWS XP or other operating systems such as LINUX. The remote computer 22 may also operate a video driver, a camera driver, an audio driver and a joystick driver. The video images may be transmitted and received with compression software such as MPEG CODEC.


The system 10 can be used to engage in a session that results in data. For example, the system 10 can be used by medical personnel to remotely examine, diagnose and prescribe a patient management plan for a patient 50 in a medical session. Either the patient, or a bed supporting the patient, may have a radio frequency information device (“RFID”) 52. The RFID 52 may wirelessly transmit information that is received by the robot 12 through the antenna 46. The RFID information can be used to correlate a particular session with a specific patient. The receipt of RFID information may initiate the storage of session data. Although a medical session is described, it is to be understood that other types of sessions may be conducted with the system 10. For example, the system 10 may be used to move the robot(s) about a factory floor wherein the user provides remote consultation. Consultation session data may be stored by the system 10.
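
The RFID-triggered start of recording can be pictured in code. The following is a minimal sketch, not the patented implementation; the class, method and payload field names (`SessionRecorder`, `on_rfid_read`, `patient_id`) are assumptions for illustration.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
import uuid

@dataclass
class Session:
    """One tele-presence session, correlated with a specific patient."""
    patient_id: str   # taken from the RFID payload received through the antenna
    robot_id: str
    session_id: str = field(default_factory=lambda: uuid.uuid4().hex)
    started_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

class SessionRecorder:
    """Begins storing session data when the robot receives an RFID read."""

    def __init__(self, robot_id: str):
        self.robot_id = robot_id
        self.active_session = None   # nothing is stored until an RFID read arrives

    def on_rfid_read(self, payload: dict) -> Session:
        # Receipt of RFID information initiates the storage of session data
        # and correlates this particular session with a specific patient.
        if self.active_session is None:
            self.active_session = Session(patient_id=payload["patient_id"],
                                          robot_id=self.robot_id)
        return self.active_session
```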


The system can store and display session content data. Session content data is information regarding the substance of a session. For example, in a medical application, session content data would include physician notes, diagnosis and prescription information. In a factory-equipment repair application, session content data would include repair methodology and replaced parts. Session content data would not be mere time entries associated with the logging on and termination of a robot session.
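
Expressed as a data structure, the distinction drawn above might look as follows; the field names are assumptions, but the split mirrors the text: substantive fields on one side, mere logon/termination entries on the other.

```python
from dataclasses import dataclass, field

@dataclass
class SessionContent:
    """Session content data: the substance of a session."""
    physician_notes: list[str] = field(default_factory=list)
    diagnosis: str = ""
    prescriptions: list[str] = field(default_factory=list)

@dataclass
class ConnectionLog:
    """Logon/termination time entries; per the text, not session content data."""
    logged_on_at: str = ""
    terminated_at: str = ""
```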


The system 10 may include a records server 54 and/or a billing server 56 that can be accessed through the network 18. The servers 54 and 56 may include memory, processors, I/O interfaces and storage devices such as hard disk drives, as is known in the art. Records server 54 may have a storage device(s) 57 that stores session data. The server 54 may receive and store session data during a session. For example, the server 54 may receive and store video and audio captured by the robot camera 38 and microphone 42, respectively. To reduce bandwidth requirements during a session, the session data, such as video/audio segments, can be transmitted from the robot 12 to the server 54 after the session has terminated, for example, when the user logs off the system; timestamped progress notes are uploaded at the same time. The server 54 may contain other medical records of a patient such as written records of treatment, patient history, medication information, laboratory results, physician notes, etc. Video/audio segments can be timestamped and associated with the identification of the control station and the robot, and a unique identifier which can be cross-referenced with progress notes and other session data. These video/audio segments can later be used to substantiate and reference the various progress notes and other events in a visual fashion. The system can track all head and base movements made during the course of the associated portion of the session, allowing those movements to be correlated with the actions taken.
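
A sketch of the deferred upload described above, under assumed helper names (`make_segment`, `RobotBuffer`, `on_logoff`): segments are timestamped, tagged with the station and robot identification plus a unique identifier for cross-referencing, and transmitted only after logoff to conserve session bandwidth.

```python
from datetime import datetime, timezone
import uuid

def make_segment(station_id: str, robot_id: str, media: bytes) -> dict:
    # Each video/audio segment carries a time stamp, the identification of
    # the control station and the robot, and a unique identifier that can be
    # cross-referenced with progress notes and other session data.
    return {
        "segment_id": uuid.uuid4().hex,
        "station_id": station_id,
        "robot_id": robot_id,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "media": media,
    }

class RobotBuffer:
    """Holds segments on the robot during a session; uploads after logoff."""

    def __init__(self):
        self.segments: list[dict] = []

    def record(self, segment: dict) -> None:
        self.segments.append(segment)   # stored locally while the session runs

    def on_logoff(self, upload) -> None:
        # Transmitting after the session has terminated reduces bandwidth
        # requirements during the session itself.
        for segment in self.segments:
            upload(segment)
        self.segments.clear()
```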


The system 10 may include a user interface 58 that allows a user at the remote location to enter data into the system. For example, the interface 58 may be a computer or a computer terminal that allows a user to enter information about the patient. The robot 12 can be moved into view of the patient through the remote station 16 so that patient information can be entered into the system while a physician is viewing the patient through the robot camera. The physician can remotely move the robot 12 to obtain different viewing angles of the patient. The user interface 58 may be a separate computer and/or be integral with the robot 12. The billing server 56 may automatically generate a bill on a periodic basis from the information provided by the session data. The billed elements may be based on actions performed, outcomes achieved, or both. Alternatively, a user can manually generate bills through a user interface to the billing server.


The billing server 56 may receive session data during a session or upon termination of a session. Additionally, the billing server may poll a robot to retrieve data from its hard drive. The session data may be organized so as to automatically populate certain fields of a billing statement or report. The billing information can be automatically sent to an insurance carrier.
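
How session data might automatically populate fields of a billing statement can be sketched as a mapping from logged events to billed elements. The billing codes and rates below are invented for illustration; the patent does not specify them.

```python
def populate_billing_statement(session: dict, rates: dict) -> dict:
    """Map session data onto billing fields; bill per action performed."""
    statement = {
        "patient_id": session["patient_id"],
        "session_id": session["session_id"],
        "line_items": [],
        "total": 0.0,
    }
    for event in session["events"]:
        # Billed elements may be based on actions performed (e.g. an NIHSS
        # assessment) and/or outcomes achieved.
        if event["type"] in rates:
            fee = rates[event["type"]]
            statement["line_items"].append({"code": event["type"], "fee": fee})
            statement["total"] += fee
    return statement

# Example (all codes and rates hypothetical):
rates = {"nihss_assessment": 150.0, "tpa_dose_calculation": 75.0}
session = {"patient_id": "P-001", "session_id": "abc123",
           "events": [{"type": "nihss_assessment"},
                      {"type": "tpa_dose_calculation"}]}
print(populate_billing_statement(session, rates)["total"])   # 225.0
```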


The server 54 can be accessible through a web page or other means for accessing information through the network 18. FIG. 2 shows a user interface 62 displayed at a remote station 16, or any other terminal that can access the server 54. The interface 62 can, for example, provide a date and time that various physicians had sessions with different patients. FIG. 3 shows another user interface 64 that displays time stamps 66 that are associated with certain events 68. Records can be retrieved by various filters including physician name, patient name, time of session and services performed during the session. The event data can be initially stored in either the robot 12 or the remote station 16 and then loaded into the server 54, either during or after a session. Alternatively, event data can be directly loaded into the server without storing it locally on the robot or remote station.


The session data can be organized into a plurality of data types. FIG. 4 shows a plurality of different data types. For example, the session data can be organized into ENCOUNTER PROFILE data 70, PATIENT PROFILE data 72 and CLINICAL MANAGEMENT PROFILE data 74, with each having subfields such as EVENT and HISTORY. FIG. 5 shows a pull-down screen 78 that is displayed when a DEMOGRAPHICS field 76 is selected. FIG. 6 shows a field 80 that displays a number of sessions that match a selected HISPANIC field 82. The session data can be searched with Boolean operators such as AND and OR to search for multiple terms, data types, etc. The user can display all hits for the search, or have a statistical analysis performed based on the matching sessions.
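
A Boolean AND/OR search across typed session records might look like the sketch below; the record layout and field names are assumptions.

```python
def matches(record: dict, term: tuple[str, str]) -> bool:
    """True if the record's field equals the sought value."""
    field_name, value = term
    return record.get(field_name) == value

def search(records: list[dict], all_of=(), any_of=()) -> list[dict]:
    # AND: every term in all_of must match; OR: at least one term in any_of.
    hits = []
    for record in records:
        if all(matches(record, t) for t in all_of) and \
           (not any_of or any(matches(record, t) for t in any_of)):
            hits.append(record)
    return hits

sessions = [
    {"physician": "Dr. Smith", "demographics": "hispanic", "type": "stroke"},
    {"physician": "Dr. Jones", "demographics": "other", "type": "stroke"},
]
# All stroke sessions by Dr. Smith OR matching the HISPANIC demographic field:
print(len(search(sessions, all_of=[("type", "stroke")],
                 any_of=[("physician", "Dr. Smith"),
                         ("demographics", "hispanic")])))   # 1
```

The matching sessions can then either be displayed as hits or fed to a statistical analysis, as the paragraph above describes.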


In a factory equipment-repair application, the equipment being repaired during the session would replace the patient name in FIG. 2; and steps for repair would replace the event list in FIG. 3. Repair methodologies and affected part numbers would replace the search criteria in FIGS. 4, 5 and 6. Captured video and audio would show the steps in the repair process, and would be timestamped and cross-referenced to the data in FIG. 3.


Referring to FIG. 1, the system 10 may also include an image server 84 and a registry server 86. The image server 84 may include medical images. For example, the medical images may include CT scans of a patient's brain. The images can be downloaded to one of the remote stations 16 through the network 18. The registry server 86 may store historical data on patients. The historical data can be downloaded to a remote computer 16 through the network 18.



FIG. 7 shows an embodiment of a robot 12. Each robot 12 may include a high level control system 160 and a low level control system 162. The high level control system 160 may include a processor 164 that is connected to a bus 166. The bus is coupled to the camera 38 by an input/output (I/O) port 168, and to the monitor 40 by a serial output port 170 and a VGA driver 172. The monitor 40 may include a touchscreen function that allows a user to enter input by touching the monitor screen.


The speaker 44 is coupled to the bus 166 by a digital to analog converter 174. The microphone 42 is coupled to the bus 166 by an analog to digital converter 176. The high level controller 160 may also contain a random access memory (RAM) device 178, a non-volatile RAM device 180 and a mass storage device 182 that are all coupled to the bus 166. The RAM 178, NVRAM 180 and/or mass storage device 182 may contain session data that is transmitted to the remote station and/or server. The robot antenna 46 may be coupled to a wireless transceiver 184. By way of example, the transceiver 184 may transmit and receive information in accordance with IEEE 802.11b.


The controller 164 may operate with a LINUX operating system. The controller 164 may also operate MS WINDOWS along with video, camera and audio drivers for communication with the remote control station 16. Video information may be transceived using MPEG CODEC compression techniques. The software may allow the user to send e-mail to the patient and vice versa, or allow the patient to access the Internet. In general, the high level controller 160 operates to control communication between the robot 12 and the remote control station 16.


The high level controller 160 may be linked to the low level control system 162 by a serial port 186. The low level control system 162 may include components and software that mechanically actuate the robot 12. For example, the low level control system 162 provides instructions to actuate the movement platform to move the robot 12. The low level control system 162 may receive movement instructions from the high level controller 160. The movement instructions may be received as movement commands from the remote control station or another robot. Although two controllers are shown, it is to be understood that each robot 12 may have one controller, or more than two controllers, controlling the high and low level functions.
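
The division of labor between the two controllers can be illustrated with a small command relay. The JSON framing and field names are hypothetical; the text says only that movement instructions pass from the high level controller to the low level control system over a serial port.

```python
import json

class LowLevelController:
    """Mechanically actuates the robot (e.g. the movement platform)."""

    def handle(self, frame: bytes) -> None:
        command = json.loads(frame.decode())
        # e.g. drive the movement platform at the requested velocities
        print(f"actuate: linear={command['linear']} angular={command['angular']}")

class HighLevelController:
    """Handles communication; forwards movement instructions downstream."""

    def __init__(self, low_level: LowLevelController):
        self.low_level = low_level

    def on_remote_command(self, linear: float, angular: float) -> None:
        # Movement commands arrive from the remote control station (or
        # another robot) and are relayed over the serial link as framed bytes.
        frame = json.dumps({"linear": linear, "angular": angular}).encode()
        self.low_level.handle(frame)

HighLevelController(LowLevelController()).on_remote_command(0.5, 0.1)
```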


The system may be the same or similar to a robotic system provided by the assignee InTouch Technology, Inc. of Santa Barbara, California under the name RP-7, which is hereby incorporated by reference. The system may also be the same or similar to the system disclosed in U.S. Pat. No. 7,292,912, which is hereby incorporated by reference.



FIG. 8 shows a graphical user interface 250 that can be provided at the user interface 58. The graphical user interface 250 includes a plurality of data fields 252 that can be filled by the user. The data fields 252 can request patient information such as name, age, etc. The data fields may also include requests for medical data such as heart rate, glucose level and blood pressure (“SBP” and “DBP”). The data entered into the fields 252 can be included in the session data that is transmitted and stored by the system 10. Filling the data fields may be designated an “event” that is given an associated time stamp and displayed by a user interface.



FIG. 9 shows a display user interface (“DUI”) 260 that can be displayed at the remote station 16. The DUI 260 may include a robot view field 262 that displays a video image captured by the camera of the robot. The DUI 260 may also include a station view field 264 that displays a video image provided by the camera of the remote station 16. The DUI 260 may be part of an application program stored and operated by the computer 22 of the remote station 16. The video and any accompanying audio displayed by the robot and station view fields may be transmitted and stored by the system 10 as session data.


The DUI 260 may contain a “progress notes” text editing field, which enables a “document as you treat” methodology. As the physician conducts treatment, he can document both the treatment steps and outcomes in the progress notes field. Each note may be manually timestamped by the physician, or automatically timestamped by the software based on when the physician began typing each note. In the application of factory floor equipment repair, the progress notes would detail the various examinations and repair steps taken.
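
Automatic timestamping based on when the physician began typing each note could be handled as in this sketch; `on_keystroke` and `commit_note` are invented names for illustration.

```python
from datetime import datetime, timezone

class ProgressNotes:
    """'Document as you treat': each note is stamped when typing begins."""

    def __init__(self):
        self.notes: list[dict] = []
        self._started_at = None
        self._buffer: list[str] = []

    def on_keystroke(self, char: str) -> None:
        if self._started_at is None:
            # Stamp the note with the moment the physician began typing it.
            self._started_at = datetime.now(timezone.utc)
        self._buffer.append(char)

    def commit_note(self) -> None:
        if self._buffer:
            self.notes.append({"timestamp": self._started_at.isoformat(),
                               "text": "".join(self._buffer)})
        self._buffer, self._started_at = [], None
```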



FIG. 10 shows a graphical user interface 270 that can be displayed by the monitor of the remote station 16. The interface 270 includes a “PATIENT INFO” tab 272, a “NIHSS” tab 274 and a “t-PA” tab 276. Selection of the PATIENT INFO tab 272 displays various data fields 278 including patient name, age, weight, heart rate, etc. This may be the same information entered through the user interface 250. This information may be included in the session data that is transmitted and stored by the system 10. The usage of this interface may be tagged as an event with an associated time stamp.



FIG. 11 shows an interface 280 when the “NIHSS” tab 274 is selected. The interface 280 has a data field 282 that provides a questionnaire for rating stroke severity on the NIHSS (National Institutes of Health Stroke Scale). This gives the physician a readily available medical tool. The results of the questionnaire can be included in the session data and be tagged as an event that has an associated time stamp.



FIG. 12 shows an interface 290 when the “t-PA” tab 276 is selected. The interface 290 may include a data field 292 that provides the patient's weight, a “TOTAL DOSE” data field 294, a “BOLUS DOSE” data field 296 and an “INFUSION DOSE” data field 298. The interface 290 may also include a “CALCULATE” button 300. When the CALCULATE button 300 is selected the data fields 294, 296 and 298 are automatically populated with a calculated dosage. This provides a patient management plan for the physician to review. The interfaces 270, 280 and 290 also have a “VIEW IMAGES” button 302 that when selected displays an interface 310 shown in FIG. 13. The interface 310 includes a data field 312 and an image field 314. The image field 314 can provide a plurality of medical images such as a CT scan of the patient's head.
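
The CALCULATE button's arithmetic is not spelled out in the patent. A commonly published t-PA (alteplase) dosing rule for ischemic stroke is 0.9 mg per kg of body weight capped at 90 mg, with 10% given as a bolus and the remainder infused; the sketch below assumes that rule. It is illustrative only, neither medical guidance nor the patented method.

```python
def tpa_doses(weight_kg: float) -> dict:
    # Assumed rule: 0.9 mg/kg up to a 90 mg cap; 10% bolus, 90% infusion.
    total = min(0.9 * weight_kg, 90.0)
    bolus = 0.1 * total
    return {"TOTAL DOSE": round(total, 1),
            "BOLUS DOSE": round(bolus, 1),
            "INFUSION DOSE": round(total - bolus, 1)}

# Selecting CALCULATE would populate fields 294, 296 and 298, e.g.:
print(tpa_doses(80.0))
# {'TOTAL DOSE': 72.0, 'BOLUS DOSE': 7.2, 'INFUSION DOSE': 64.8}
```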


The calculated dosage and images can be included in the session data that is transmitted and stored by the system. The automatic population of the data fields may be tagged as an event with an associated time stamp. Likewise, the selection of the data and/or image fields may be tagged as events with time stamps.


The system is useful for allowing a physician to remotely view and treat a stroke patient. The system provides patient information, NIHSS stroke severity assessment, calculated t-PA dosage and CT head images that allow the physician to provide real time remote patient treatment. The system also allows sessions to be audited by medical personnel, healthcare institutions, insurance carriers, etc. Such audits may include viewing video/audio captured by the robot during a session.


While certain exemplary embodiments have been described and shown in the accompanying drawings, it is to be understood that such embodiments are merely illustrative of and not restrictive on the broad invention, and that this invention not be limited to the specific constructions and arrangements shown and described, since various other modifications may occur to those ordinarily skilled in the art.

Claims
  • 1. A medical telepresence system, comprising: a telepresence device in the vicinity of a patient, said telepresence device has a patient camera, a patient monitor, a patient microphone, and a patient speaker; a remote station that has a station camera, a station monitor, a station microphone, and a station speaker, said remote station and said telepresence device are configured to establish a telepresence session during which: said station monitor is coupled to said patient camera and configured to display a patient video captured by said patient camera, said patient monitor is coupled to said station camera and configured to display a station video captured by said station camera, said station speaker is coupled to said patient microphone and said patient speaker is coupled to said station microphone to enable two-way audio communication between said telepresence device and said remote station, and said remote station is configured to control said telepresence device, wherein said telepresence session results in session content data that includes at least a portion of both the audio captured by the patient microphone and the video captured by the patient camera and at least one of a physician note, a diagnosis, and a prescription information; and, a records server configured to store said session content data in association with a unique identifier and provide said stored session content data to a terminal via a network after said telepresence session is concluded, wherein said terminal is configured to reproduce said session content data, including the session audio and video.
  • 2. The system of claim 1, wherein said session content data is entered by an operator at the remote station.
  • 3. The system of claim 1, wherein said session content data is correlated with a movement of said telepresence device.
  • 4. The system of claim 1, wherein said session content data is searchable.
  • 5. The system of claim 1, wherein said session content data includes at least one time stamp.
  • 6. The system of claim 5, wherein said remote station provides a graphical user interface that displays said time stamp and said session content data.
  • 7. The system of claim 6, wherein said session content data is entered by an operator at said remote station.
  • 8. The system of claim 7, wherein said time stamp is automatically generated when said session content data is entered by the operator.
  • 9. The system of claim 1, further comprising a billing server that generates a bill with said session content data.
  • 10. The system of claim 1, further comprising a bill that is based on an action of said session content data.
  • 11. The system of claim 1, wherein said session content data is structured into a plurality of data types and is searchable across said data types.
  • 12. A method for conducting a medical tele-presence session, comprising: controlling a telepresence device in the vicinity of a patient through control of a remote station, the telepresence device has a patient camera, a patient monitor, a patient speaker, and a patient microphone, the remote station includes a station camera, a station monitor, a station speaker, and a station microphone; establishing a telepresence session during which the station monitor is coupled to the patient camera and displays a patient video captured by the patient camera, said patient monitor is coupled to the station camera and displays station video captured by the station camera, the station speaker is coupled to the patient microphone, and the patient speaker is coupled to the station microphone to enable two-way audio communication between said telepresence device and the remote station; generating session content data that includes session audio captured by the patient microphone, session video captured by the patient camera, and at least one of a physician note, a diagnosis, and a prescription information; storing the session content data generated during the telepresence session in a records server in association with a unique identifier; and accessing and reproducing said session content data, including the session audio and video, stored in the server at a terminal via a network after said telepresence session is concluded.
  • 13. The method of claim 12, wherein the session content data is searchable.
  • 14. The method of claim 12, further comprising generating at least one time stamp for the session content data.
  • 15. The method of claim 14, further comprising displaying the time stamp and the session content data.
  • 16. The method of claim 15, wherein the session content data is entered by an operator at the remote station.
  • 17. The method of claim 16, wherein the time stamp is automatically generated when the session content data is entered by the operator.
  • 18. The method of claim 12, further comprising transmitting a video image of a user at the control station to a monitor of the robot.
  • 19. The method of claim 12, further comprising automatically generating a bill with the session content data.
  • 20. The method of claim 12, further comprising structuring the session content data into a plurality of data types and searching the session content data across the data types.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. application Ser. No. 12/362,454, filed Jan. 29, 2009, now U.S. Pat. No. 8,849,680, the contents of which are hereby incorporated by reference in their entirety.

US Referenced Citations (173)
Number Name Date Kind
4107689 Jellinek Aug 1978 A
4213182 Eichelberger et al. Jul 1980 A
4553309 Hess et al. Nov 1985 A
4697278 Fleischer Sep 1987 A
5220263 Onishi et al. Jun 1993 A
5262944 Weisner et al. Nov 1993 A
5400068 Ishida et al. Mar 1995 A
5617539 Ludwig et al. Apr 1997 A
5619341 Auyeung et al. Apr 1997 A
5623679 Rivette et al. Apr 1997 A
5734805 Isensee et al. Mar 1998 A
5793365 Tang et al. Aug 1998 A
5801755 Echerer Sep 1998 A
5844599 Hildin Dec 1998 A
5867494 Krishnaswamy et al. Feb 1999 A
5872922 Hogan et al. Feb 1999 A
6091219 Maruo et al. Jul 2000 A
6189034 Riddle Feb 2001 B1
6195683 Palmer Feb 2001 B1
6292713 Jouppi et al. Sep 2001 B1
6292714 Okabayashi Sep 2001 B1
6304050 Skaar et al. Oct 2001 B1
6314631 Pryor Nov 2001 B1
6317953 Pryor Nov 2001 B1
6373855 Downing et al. Apr 2002 B1
6389329 Colens May 2002 B1
6411055 Fujita et al. Jun 2002 B1
6430471 Kintou et al. Aug 2002 B1
6507773 Parker et al. Jan 2003 B2
6529620 Thompson Mar 2003 B2
6535793 Allard Mar 2003 B2
6567038 Granot et al. May 2003 B1
6590604 Tucker et al. Jul 2003 B1
6597392 Jenkins et al. Jul 2003 B1
6667592 Jacobs et al. Dec 2003 B2
6674259 Norman et al. Jan 2004 B1
6693585 Macleod Feb 2004 B1
6724823 Rovati et al. Apr 2004 B2
6816192 Nishikawa Nov 2004 B1
6816754 Mukai et al. Nov 2004 B2
6893267 Yueh May 2005 B1
6951535 Ghodoussi Oct 2005 B2
6990112 Brent et al. Jan 2006 B1
7011538 Chang Mar 2006 B2
7053578 Diehl et al. May 2006 B2
7055210 Keppler et al. Jun 2006 B2
7219364 Bolle et al. May 2007 B2
7222000 Wang et al. May 2007 B2
7283153 Provost et al. Oct 2007 B2
7292257 Kang et al. Nov 2007 B2
7305114 Wolff et al. Dec 2007 B2
7332890 Cohen et al. Feb 2008 B2
7333642 Green Feb 2008 B2
7352153 Yan Apr 2008 B2
7363121 Chen et al. Apr 2008 B1
7467211 Herman et al. Dec 2008 B1
7483867 Ansari et al. Jan 2009 B2
7510428 Obata et al. Mar 2009 B2
7557758 Rofougaran Jul 2009 B2
7587260 Bruemmer et al. Sep 2009 B2
7631833 Ghaleb et al. Dec 2009 B1
7657560 DiRienzo Feb 2010 B1
7703113 Dawson Apr 2010 B2
7737993 Kaasila et al. Jun 2010 B2
7774158 Domingues Goncalves et al. Aug 2010 B2
7861366 Hahm et al. Jan 2011 B2
7885822 Akers et al. Feb 2011 B2
7956894 Akers et al. Jun 2011 B2
7957837 Ziegler et al. Jun 2011 B2
7982769 Jenkins et al. Jul 2011 B2
8126960 Obradovich et al. Feb 2012 B2
8212533 Ota Jul 2012 B2
8287522 Moses et al. Oct 2012 B2
8320534 Kim et al. Nov 2012 B2
8348675 Dohrmann Jan 2013 B2
8374171 Cho et al. Feb 2013 B2
8384753 Bedingfield, Sr. Feb 2013 B1
8400491 Panpaliya et al. Mar 2013 B1
8401275 Wang et al. Mar 2013 B2
8423284 O'Shea Apr 2013 B2
8451731 Lee et al. May 2013 B1
8515577 Wang et al. Aug 2013 B2
8610786 Ortiz Dec 2013 B2
8612051 Norman et al. Dec 2013 B2
8639797 Pan et al. Jan 2014 B1
8670017 Stuart et al. Mar 2014 B2
8726454 Gilbert et al. May 2014 B2
8836751 Ballantyne et al. Sep 2014 B2
8849679 Wang et al. Sep 2014 B2
8849680 Wright et al. Sep 2014 B2
8861750 Roe et al. Oct 2014 B2
8897920 Wang et al. Nov 2014 B2
8902278 Pinter et al. Dec 2014 B2
20010051881 Filler Dec 2001 A1
20020044201 Alexander et al. Apr 2002 A1
20020106998 Presley et al. Aug 2002 A1
20020109775 White et al. Aug 2002 A1
20020128985 Greenwald Sep 2002 A1
20020193908 Parker Dec 2002 A1
20030080901 Piotrowski May 2003 A1
20030112823 Collins et al. Jun 2003 A1
20030120714 Wolff et al. Jun 2003 A1
20030135097 Wiederhold et al. Jul 2003 A1
20030195662 Wang et al. Oct 2003 A1
20030216833 Mukai et al. Nov 2003 A1
20040008138 Hockley, Jr. et al. Jan 2004 A1
20040017475 Akers Jan 2004 A1
20040019406 Wang Jan 2004 A1
20040088078 Jouppi et al. May 2004 A1
20040117067 Jouppi Jun 2004 A1
20040150725 Taguchi Aug 2004 A1
20040168148 Goncalves et al. Aug 2004 A1
20040218099 Washington Nov 2004 A1
20040260790 Balloni et al. Dec 2004 A1
20050052527 Remy Mar 2005 A1
20050073575 Thacher et al. Apr 2005 A1
20050125083 Kiko Jun 2005 A1
20050149364 Ombrellaro Jul 2005 A1
20050152447 Jouppi et al. Jul 2005 A1
20050152565 Jouppi et al. Jul 2005 A1
20050168568 Jouppi Aug 2005 A1
20050264649 Chang et al. Dec 2005 A1
20050286759 Zitnick et al. Dec 2005 A1
20060010028 Sorensen Jan 2006 A1
20060056655 Wen et al. Mar 2006 A1
20060056837 Vapaakoski Mar 2006 A1
20060066609 Todice et al. Mar 2006 A1
20060071797 Rosenfeld et al. Apr 2006 A1
20060178559 Kumar et al. Aug 2006 A1
20070093279 Janik Apr 2007 A1
20070116152 Thesling May 2007 A1
20070170886 Plishner Jul 2007 A1
20070199108 Angle Aug 2007 A1
20070226949 Hahm et al. Oct 2007 A1
20070290040 Wurman et al. Dec 2007 A1
20080027591 Lenser et al. Jan 2008 A1
20080033641 Medalia Feb 2008 A1
20080051985 D'Andrea et al. Feb 2008 A1
20080086241 Phillips et al. Apr 2008 A1
20080091340 Milstein et al. Apr 2008 A1
20080161969 Lee et al. Jul 2008 A1
20080232763 Brady Sep 2008 A1
20080263628 Norman et al. Oct 2008 A1
20080267069 Thielman et al. Oct 2008 A1
20090049640 Lee et al. Feb 2009 A1
20090055023 Walters Feb 2009 A1
20090102919 Zamierowski et al. Apr 2009 A1
20100017046 Cheung et al. Jan 2010 A1
20100026239 Li et al. Feb 2010 A1
20100030578 Siddique et al. Feb 2010 A1
20100066804 Shoemake et al. Mar 2010 A1
20100171826 Hamilton et al. Jul 2010 A1
20100278086 Pochiraju et al. Nov 2010 A1
20100286905 Goncalves et al. Nov 2010 A1
20100301679 Murray et al. Dec 2010 A1
20110022705 Yellamraju et al. Jan 2011 A1
20110071675 Wells et al. Mar 2011 A1
20110072114 Hoffert et al. Mar 2011 A1
20110153198 Kokkas et al. Jun 2011 A1
20110193949 Nambakam et al. Aug 2011 A1
20110195701 Cook et al. Aug 2011 A1
20110280551 Sammon Nov 2011 A1
20110306400 Nguyen Dec 2011 A1
20120059946 Wang Mar 2012 A1
20120113856 Krishnaswamy May 2012 A1
20120203731 Nelson et al. Aug 2012 A1
20120291809 Kuhe et al. Nov 2012 A1
20130250938 Anandakumar et al. Sep 2013 A1
20140047022 Chan et al. Feb 2014 A1
20140085543 Hartley et al. Mar 2014 A1
20140135990 Stuart et al. May 2014 A1
20140139616 Pinter et al. May 2014 A1
20140155755 Pinter et al. Jun 2014 A1
Foreign Referenced Citations (14)
Number Date Country
1404695 Mar 2003 CN
1561923 Jan 2005 CN
1743144 Mar 2006 CN
101049017 Oct 2007 CN
101151614 Mar 2008 CN
100407729 Jul 2008 CN
11220706 Aug 1999 JP
2002321180 Nov 2002 JP
2004181229 Jul 2004 JP
2005111083 Apr 2005 JP
2009125133 Jun 2009 JP
9742761 Nov 1997 WO
WO-2007041295 Apr 2007 WO
2009128997 Oct 2009 WO
Non-Patent Literature Citations (22)
Entry
Diggs, Alice C.; Design of a socially intelligent task selection software mechanism for a mobile robot; Tennessee State University. ProQuest Dissertations Publishing, 2008. 1456761. (Year: 2008).
Fulbright et al., “SWAMI: An Autonomous Mobile Robot for Inspection of Nuclear Waste of Storage Facilities”, Autonomous Robots, vol. 2, 1995, pp. 225-235.
Screenshot Showing Google Date for Lemaire Telehealth Manual, Screenshot Retrieved on Dec. 18, 2014, 1 page.
Nomadic Technologies, Inc., “Nomad Scout Language Reference Manual”, Software Version: 2.7, Part No. DOC00002, Jul. 12, 1999, 47 pages.
“Appeal from the U.S. District Court for the Central District of California in No. 11-CV-9185, Judge Percy Anderson”, May 9, 2014, pp. 1-48.
“Google translation of: Innovations Report”, From research project to television star: Care-O-bot in ZDF series, http://www.innovations-report.de/specials/printa.php?id=5157, Sep. 28, 2001, 2 pages.
“MPEG File Format Summary”, downloaded from: http://www.fileformat.info/format/mpeg/egff.htm, Feb. 1, 2001, 8 pages.
“Nomad Scout User's Manual”, Nomadic Technologies, Software Version 2. 7, Part No. DOC00004, Jul. 12, 1999, pp. 1-59.
ACM Digital Library Record, “Autonomous Robots vol. 11 Issue 1”, downloaded from <http://dl.acm.org/citation.cfm?id=591550&picked=prox&cfid=360891374&cftoken=35225929>, Jul. 2001, 2 pages.
Brenner, “A technical tutorial on the IEEE 802.11 protocol”, BreezeCOM Wireless Communications, 1997, pp. 1-24.
CMU Course 16X62, “Robot user's manual”, (describing the Nomad Scout), Carnegie Mellon University, Feb. 1, 2001, 11 pages.
Gostai, “Gostai Jazz: Robotic Telepresence”, Available online at <http://www.gostai.com>, 4 pages.
Koenen, “MPEG-4: a Powerful Standard for Use in Web and Television Environments”, (KPN Research), downloaded from http://www.w3.org/Architecture/1998/06/Workshop/paper26, Jul. 1, 1998, 4 pages.
Library of Congress, “008-Fixed-Length Data Elements (NR)”, MARC 21 Format for Classification Data, downloaded from http://www.loc.gov/marc/classification/cd008.html, Jan. 2000, pp. 1-14.
Osborn, “Quality of Life Technology Center”, QoLT Research Overview:A National Science Foundation Engineering Research Center, Carnegie Mellon University of Pittsburgh, 2 pages.
Panusopone, et al., “Performance comparison of MPEG-4 and H.263+ for streaming video applications”, Circuits Systems Signal Processing, vol. 20, No. 3, 2001, pp. 293-309.
Paulos, et al., “Personal Tele-Embodiment”, Chapter 9 in Goldberg, et al., ed. “Beyond webcams”, MIT Press, Jan. 4, 2002, pp. 155-167.
Paulos, “Personal tele-embodiment”, OskiCat Catalog Record, UCB Library Catalog, 2001, 3 pages.
Paulos, “Personal Tele-Embodiment”, Introductory and cover pages from 2001 Dissertation including Contents table, together with e-mails relating thereto from UC Berkeley Libraries, as shelved at UC Berkeley Engineering Library (Northern Regional library Facility), May 8, 2002, 25 pages, including 4 pages of e-mails.
Paulos, et al., “Social Tele-Embodiment: Understanding Presence”, Autonomous Robots, vol. 11, Issue 1, Kluwer Academic Publishers, Jul. 2001, pp. 87-95.
Schraft, et al., “Care-O-bot™: the concept of a system for assisting elderly or disabled persons in home environments”, IEEE Proceedings of the 24th Annual Conference of the Industrial Electronics Society, IECON '98, Aug. 31-Sep. 4, 1998, pp. 2476-2481.
Video Middleware Cookbook, “H.350 Directory Services for Multimedia”, 2 pages.
Related Publications (1)
Number Date Country
20150339452 A1 Nov 2015 US
Continuations (1)
Number Date Country
Parent 12362454 Jan 2009 US
Child 14472277 US