The subject matter disclosed generally relates to a robotic tele-presence system.
Robots have been used in a variety of applications ranging from remote control of hazardous material to assisting in the performance of surgery. For example, U.S. Pat. No. 5,762,458 issued to Wang et al. discloses a system that allows a surgeon to perform minimally invasive medical procedures through the use of robotically controlled instruments. One of the robotic arms in the Wang system moves an endoscope that has a camera. The camera allows a surgeon to view a surgical area of a patient.
There has been marketed a mobile robot introduced by InTouch-Health, Inc., the assignee of this application, under the trademark RP-7. The InTouch robot is controlled by a user at a remote station. The remote station includes a personal computer with a joystick that allows the user to remotely control the movement of the robot. Both the robot and the remote station have cameras, monitors, speakers and microphones to allow for two-way video/audio communication.
The InTouch RP-7 system is used by medical personnel to remotely “visit” a patient. The system is particularly useful for medical specialists. For example, medical personnel specializing in patient stroke care can remotely examine, diagnose and prescribe a patient management plan. With the proliferation of such robots, it would be desirable to track and store data related to tele-presence sessions.
A robotic system includes a robot that has a camera and a remote station coupled to the robot. The remote station controls the robot in a session that results in session content data. The system further includes a storage device that stores the session content data.
Disclosed is a robotic system that is used in a tele-presence session. For example, the system can be used by medical personnel to examine, diagnose and prescribe medical treatment in the session. The system includes a robot that has a camera and is controlled by a remote station. The system further includes a storage device that stores session content data regarding the session. The data may include a video/audio recording of the session captured by the robot. The session content data may also include time stamps that allow a user to determine the times that events occurred during the session. The session data may be stored on a server that is accessible to multiple users. Billing information may be automatically generated using the session data.
Referring to the drawings more particularly by reference numbers,
The remote control station 16 may include a computer 22 that has a monitor 24, a camera 26, a microphone 28 and a speaker 30. The computer 22 may also contain an input device 32 such as a joystick or a mouse. The control station 16 is typically located in a place that is remote from the robot 12. Although only one remote control station 16 is shown, the system 10 may include a plurality of remote stations. In general, any number of robots 12 may be controlled by any number of remote stations 16 or other robots 12. For example, one remote station 16 may be coupled to a plurality of robots 12, or one robot 12 may be coupled to a plurality of remote stations 16 or a plurality of other robots 12.
Each robot 12 includes a movement platform 34 that is attached to a robot housing 36. The robot 12 may also have a camera 38, a monitor 40, a microphone(s) 42 and a speaker(s) 44. The microphone 42 and speaker 30 may create stereophonic sound. The robot 12 may also have an antenna 46 that is wirelessly coupled to an antenna 48 of the base station 14. The system 10 allows a user at the remote control station 16 to move the robot 12 through operation of the input device 32. The robot camera 38 is coupled to the remote monitor 24 so that a user at the remote station 16 can view someone at the robot site such as a patient. Likewise, the robot monitor 40 is coupled to the remote camera 26 so that someone at the robot site can view the user. The microphones 28 and 42, and speakers 30 and 44, allow for audible communication between the robot site and the user of the system.
The remote station computer 22 may operate Microsoft WINDOWS XP or other operating systems such as LINUX. The remote computer 22 may also operate a video driver, a camera driver, an audio driver and a joystick driver. The video images may be transmitted and received with compression software such as an MPEG CODEC.
The system 10 can be used to engage in a session that results in data. For example, the system 10 can be used by medical personnel to remotely examine, diagnose and prescribe a patient management plan for a patient 50 in a medical session. Either the patient, or a bed supporting the patient, may have a radio frequency information device (“RFID”) 52. The RFID 52 may wirelessly transmit information that is received by the robot 12 through antenna 46. The RFID information can be used to correlate a particular session with a specific patient. The receipt of RFID information may initiate the storage of session data. Although a medical session is described, it is to be understood that other types of sessions may be conducted with the system 10. For example, the system 10 may be used to move the robot(s) about a factory floor wherein the user provides remote consultation. Consultation session data may be stored by the system 10.
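The RFID-initiated storage described above can be sketched in software. This is a minimal illustration only; the `Session` and `SessionRecorder` names, the payload shape, and the single-active-session policy are assumptions for the sketch, not details of the disclosed system:

```python
import time
from dataclasses import dataclass, field


@dataclass
class Session:
    patient_id: str        # correlated from the RFID payload
    started_at: float      # time the first RFID broadcast was received
    events: list = field(default_factory=list)


class SessionRecorder:
    """Begins storing session data when an RFID broadcast is received."""

    def __init__(self):
        self.active = None

    def on_rfid(self, payload, now=None):
        # Receipt of RFID information both initiates storage of session
        # data and correlates the session with a specific patient.
        if self.active is None:
            self.active = Session(
                patient_id=payload["patient_id"],
                started_at=now if now is not None else time.time(),
            )
        return self.active
```

Later RFID broadcasts for the same visit simply return the already-active session rather than starting a new one.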
The system can store and display session content data. Session content data is information regarding the substance of a session. For example, in a medical application, session content data would include physician notes, diagnosis and prescription information. In a factory-equipment repair application, session content data would include repair methodology and replaced parts. Session content data would not be mere time entries associated with the logging on and termination of a robot session.
The system 10 may include a records server 54 and/or a billing server 56 that can be accessed through the network 18. The servers 54 and 56 may include memory, processors, I/O interfaces and storage devices such as hard disk drives, as is known in the art. Records server 54 may have a storage device(s) 57 that stores session data. The server 54 may receive and store session data during a session. For example, the server 54 may receive and store video and audio captured by the robot camera 38 and microphone 42, respectively. To reduce bandwidth requirements during a session, the session data, such as video/audio segments, can be transmitted from the robot 12 to the server 54 after the session has terminated, for example, when the user logs off the system. Timestamped progress notes may also be uploaded at that time. The server 54 may contain other medical records of a patient such as written records of treatment, patient history, medication information, laboratory results, physician notes, etc. Video/audio segments can be timestamped and associated with the identification of the control station and the robot, and a unique identifier which can be cross-referenced with progress notes and other session data. These video/audio segments can then later be used to substantiate and reference the various progress notes and other events in a visual fashion. The system can track all head and base movements made during the course of the associated portion of the session, to allow correlation of those movements with the actions taken.
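The cross-referencing of timestamped video/audio segments with progress notes via a shared unique identifier can be sketched as follows. The `AVSegment` fields and the in-memory `RecordsStore` are illustrative assumptions standing in for the records server's storage device:

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class AVSegment:
    session_id: str    # unique identifier shared with progress notes
    station_id: str    # identification of the control station
    robot_id: str      # identification of the robot
    start_ts: float    # timestamp at which the segment begins
    end_ts: float      # timestamp at which the segment ends
    uri: str           # location of the stored video/audio


class RecordsStore:
    """Minimal in-memory stand-in for the records server's storage."""

    def __init__(self):
        self.segments = []
        self.notes = []

    def add_segment(self, segment):
        self.segments.append(segment)

    def add_note(self, session_id, ts, text):
        self.notes.append({"session_id": session_id, "ts": ts, "text": text})

    def segments_for_note(self, note):
        # Cross-reference a progress note with the video/audio that was
        # being captured at the time the note was entered.
        return [
            s for s in self.segments
            if s.session_id == note["session_id"]
            and s.start_ts <= note["ts"] <= s.end_ts
        ]
```

A reviewer auditing a session would look up a note and retrieve the segment(s) covering its timestamp to view the events in a visual fashion.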
The system 10 may include a user interface 58 that allows a user at the remote location to enter data into the system. For example, the interface 58 may be a computer or a computer terminal that allows a user to enter information about the patient. The robot 12 can be moved into view of the patient through the remote station 16 so that patient information can be entered into the system while a physician is viewing the patient through the robot camera. The physician can remotely move the robot 12 to obtain different viewing angles of the patient. The user interface 58 may be a separate computer and/or be integral with the robot 12. The billing server 56 may automatically generate a bill, on a periodic basis, from the information provided by the session data. The billed elements may be based on actions performed, outcomes achieved, or both. Alternatively, a user can manually generate bills through a user interface to the billing server.
The billing server 56 may receive session data during a session or upon termination of a session. Additionally, the billing server may poll a robot to retrieve data from its hard drive. The session data may be organized so as to automatically populate certain fields of a billing statement or report. The billing information can be automatically sent to an insurance carrier.
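The automatic population of billing fields from session data can be sketched as a simple mapping from billable event codes to charges. The event and rate-table shapes here are illustrative assumptions, not the disclosed billing format:

```python
def populate_billing_fields(session_events, rates):
    """Populate billing-statement fields from session event records.

    session_events: dicts with a billable "code" (an action performed or
    an outcome achieved) and a timestamp "ts". rates maps each billable
    code to a unit charge. Both shapes are assumptions for this sketch.
    """
    line_items = [
        {"code": ev["code"], "charge": rates[ev["code"]], "ts": ev["ts"]}
        for ev in session_events
        if ev["code"] in rates  # non-billable events are skipped
    ]
    return {
        "line_items": line_items,
        "total": sum(item["charge"] for item in line_items),
    }
```

The resulting statement could then be forwarded automatically, for example to an insurance carrier.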
The server 54 can be accessible through a web page or other means for accessing information through a network 18.
The session data can be organized into a plurality of data types.
In a factory equipment-repair application, the equipment being repaired during the session would replace the patient name in
Referring to
The speaker 44 is coupled to the bus 166 by a digital to analog converter 174. The microphone 42 is coupled to the bus 166 by an analog to digital converter 176. The high level controller 160 may also contain random access memory (RAM) device 178, a non-volatile RAM device 180 and a mass storage device 182 that are all coupled to the bus 172. The RAM 178, NVRAM 180 and/or mass storage device 182 may contain session data that is transmitted to the remote station and/or server. The robot antennae 46 may be coupled to a wireless transceiver 184. By way of example, the transceiver 184 may transmit and receive information in accordance with IEEE 802.11b.
The controller 164 may operate with a LINUX operating system. The controller 164 may also operate MS WINDOWS along with video, camera and audio drivers for communication with the remote control station 16. Video information may be transceived using MPEG CODEC compression techniques. The software may allow the user to send e-mail to the patient and vice versa, or allow the patient to access the Internet. In general, the high level controller 160 operates to control communication between the robot 12 and the remote control station 16.
The high level controller 160 may be linked to the low level control system 162 by a serial port 186. The low level control system 162 may include components and software that mechanically actuate the robot 12. For example, the low level control system 162 provides instructions to actuate the movement platform to move the robot 12. The low level control system 162 may receive movement instructions from the high level controller 160. The movement instructions may be received as movement commands from the remote control station or another robot. Although two controllers are shown, it is to be understood that each robot 12 may have one controller, or more than two controllers, controlling the high and low level functions.
The system may be the same or similar to a robotic system provided by the assignee InTouch Technology, Inc. of Santa Barbara, California under the name RP-7, which is hereby incorporated by reference. The system may also be the same or similar to the system disclosed in U.S. Pat. No. 7,292,912, which is hereby incorporated by reference.
The DUI 260 may contain a “progress notes” text editing field, which enables a “document as you treat” methodology. As the physician conducts treatment, he can document both the treatment steps and outcomes in the progress notes field. Each note may be manually timestamped by the physician, or automatically timestamped by the software based on when the physician began typing each note. In the application of factory floor equipment repair, the progress notes would detail the various examinations and repair steps taken.
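The automatic timestamping described above, where a note's timestamp is fixed when the physician begins typing it, can be sketched as follows. The class and method names are illustrative assumptions, not the disclosed software's interface:

```python
import time


class ProgressNotes:
    """'Document as you treat' notes, timestamped at the first keystroke."""

    def __init__(self, clock=time.time):
        self._clock = clock
        self._draft_started = None   # set when the physician begins typing
        self.notes = []

    def keystroke(self):
        # The first keystroke of a new note fixes its automatic timestamp;
        # later keystrokes in the same note do not change it.
        if self._draft_started is None:
            self._draft_started = self._clock()

    def commit(self, text):
        self.notes.append({"ts": self._draft_started, "text": text})
        self._draft_started = None
```

Committing a note clears the draft state, so the next note receives its own timestamp from its own first keystroke.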
The calculated dosage and images can be included in the session data that is transmitted and stored by the system. The automatic population of the data fields may be tagged as an event with an associated time stamp. Likewise, the selection of the data and/or image fields may be tagged as events with time stamps.
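A weight-based dosage calculation of the kind mentioned above can be sketched as below. The 0.9 mg/kg rate and 90 mg cap defaults follow the commonly published IV alteplase (t-PA) protocol for ischemic stroke; they are assumptions for this sketch, not values taken from this disclosure:

```python
def calculate_tpa_dose(weight_kg, dose_mg_per_kg=0.9, max_total_mg=90.0):
    """Weight-based t-PA dose: dose_mg_per_kg per kilogram of patient
    weight, capped at max_total_mg total.

    Defaults reflect the commonly published IV alteplase protocol
    (0.9 mg/kg, maximum 90 mg); they are illustrative assumptions.
    """
    return min(weight_kg * dose_mg_per_kg, max_total_mg)
```

The computed value would populate the dosage field, and that population could be tagged as a timestamped event in the session data.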
The system is useful for allowing a physician to remotely view and treat a stroke patient. The system provides patient information, NIHSS stroke severity assessment, calculated t-PA dosage and CT head images that allow the physician to provide real time remote patient treatment. The system also allows such sessions to be audited so that medical personnel, healthcare institutions, insurance carriers, etc. can audit sessions. Such audits may include viewing video/audio captured by the robot during a session.
While certain exemplary embodiments have been described and shown in the accompanying drawings, it is to be understood that such embodiments are merely illustrative of and not restrictive on the broad invention, and that this invention not be limited to the specific constructions and arrangements shown and described, since various other modifications may occur to those ordinarily skilled in the art.
This application is a continuation of U.S. application Ser. No. 12/362,454, filed Jan. 29, 2009, now U.S. Pat. No. 8,849,680, the contents of which are hereby incorporated by reference in their entirety.
Number | Name | Date | Kind |
---|---|---|---|
4107689 | Jellinek | Aug 1978 | A |
4213182 | Eichelberger et al. | Jul 1980 | A |
4553309 | Hess et al. | Nov 1985 | A |
4697278 | Fleischer | Sep 1987 | A |
5220263 | Onishi et al. | Jun 1993 | A |
5262944 | Weisner et al. | Nov 1993 | A |
5400068 | Ishida et al. | Mar 1995 | A |
5617539 | Ludwig et al. | Apr 1997 | A |
5619341 | Auyeung et al. | Apr 1997 | A |
5623679 | Rivette et al. | Apr 1997 | A |
5734805 | Isensee et al. | Mar 1998 | A |
5793365 | Tang et al. | Aug 1998 | A |
5801755 | Echerer | Sep 1998 | A |
5844599 | Hildin | Dec 1998 | A |
5867494 | Krishnaswamy et al. | Feb 1999 | A |
5872922 | Hogan et al. | Feb 1999 | A |
6091219 | Maruo et al. | Jul 2000 | A |
6189034 | Riddle | Feb 2001 | B1 |
6195683 | Palmer | Feb 2001 | B1 |
6292713 | Jouppi et al. | Sep 2001 | B1 |
6292714 | Okabayashi | Sep 2001 | B1 |
6304050 | Skaar et al. | Oct 2001 | B1 |
6314631 | Pryor | Nov 2001 | B1 |
6317953 | Pryor | Nov 2001 | B1 |
6373855 | Downing et al. | Apr 2002 | B1 |
6389329 | Colens | May 2002 | B1 |
6411055 | Fujita et al. | Jun 2002 | B1 |
6430471 | Kintou et al. | Aug 2002 | B1 |
6507773 | Parker et al. | Jan 2003 | B2 |
6529620 | Thompson | Mar 2003 | B2 |
6535793 | Allard | Mar 2003 | B2 |
6567038 | Granot et al. | May 2003 | B1 |
6590604 | Tucker et al. | Jul 2003 | B1 |
6597392 | Jenkins et al. | Jul 2003 | B1 |
6667592 | Jacobs et al. | Dec 2003 | B2 |
6674259 | Norman et al. | Jan 2004 | B1 |
6693585 | Macleod | Feb 2004 | B1 |
6724823 | Rovati et al. | Apr 2004 | B2 |
6816192 | Nishikawa | Nov 2004 | B1 |
6816754 | Mukai et al. | Nov 2004 | B2 |
6893267 | Yueh | May 2005 | B1 |
6951535 | Ghodoussi | Oct 2005 | B2 |
6990112 | Brent et al. | Jan 2006 | B1 |
7011538 | Chang | Mar 2006 | B2 |
7053578 | Diehl et al. | May 2006 | B2 |
7055210 | Keppler et al. | Jun 2006 | B2 |
7219364 | Bolle et al. | May 2007 | B2 |
7222000 | Wang et al. | May 2007 | B2 |
7283153 | Provost et al. | Oct 2007 | B2 |
7292257 | Kang et al. | Nov 2007 | B2 |
7305114 | Wolff et al. | Dec 2007 | B2 |
7332890 | Cohen et al. | Feb 2008 | B2 |
7333642 | Green | Feb 2008 | B2 |
7352153 | Yan | Apr 2008 | B2 |
7363121 | Chen et al. | Apr 2008 | B1 |
7467211 | Herman et al. | Dec 2008 | B1 |
7483867 | Ansari et al. | Jan 2009 | B2 |
7510428 | Obata et al. | Mar 2009 | B2 |
7557758 | Rofougaran | Jul 2009 | B2 |
7587260 | Bruemmer et al. | Sep 2009 | B2 |
7631833 | Ghaleb et al. | Dec 2009 | B1 |
7657560 | DiRienzo | Feb 2010 | B1 |
7703113 | Dawson | Apr 2010 | B2 |
7737993 | Kaasila et al. | Jun 2010 | B2 |
7774158 | Domingues Goncalves et al. | Aug 2010 | B2 |
7861366 | Hahm et al. | Jan 2011 | B2 |
7885822 | Akers et al. | Feb 2011 | B2 |
7956894 | Akers et al. | Jun 2011 | B2 |
7957837 | Ziegler et al. | Jun 2011 | B2 |
7982769 | Jenkins et al. | Jul 2011 | B2 |
8126960 | Obradovich et al. | Feb 2012 | B2 |
8212533 | Ota | Jul 2012 | B2 |
8287522 | Moses et al. | Oct 2012 | B2 |
8320534 | Kim et al. | Nov 2012 | B2 |
8348675 | Dohrmann | Jan 2013 | B2 |
8374171 | Cho et al. | Feb 2013 | B2 |
8384753 | Bedingfield, Sr. | Feb 2013 | B1 |
8400491 | Panpaliya et al. | Mar 2013 | B1 |
8401275 | Wang et al. | Mar 2013 | B2 |
8423284 | O'Shea | Apr 2013 | B2 |
8451731 | Lee et al. | May 2013 | B1 |
8515577 | Wang et al. | Aug 2013 | B2 |
8610786 | Ortiz | Dec 2013 | B2 |
8612051 | Norman et al. | Dec 2013 | B2 |
8639797 | Pan et al. | Jan 2014 | B1 |
8670017 | Stuart et al. | Mar 2014 | B2 |
8726454 | Gilbert et al. | May 2014 | B2 |
8836751 | Ballantyne et al. | Sep 2014 | B2 |
8849679 | Wang et al. | Sep 2014 | B2 |
8849680 | Wright et al. | Sep 2014 | B2 |
8861750 | Roe et al. | Oct 2014 | B2 |
8897920 | Wang et al. | Nov 2014 | B2 |
8902278 | Pinter et al. | Dec 2014 | B2 |
20010051881 | Filler | Dec 2001 | A1 |
20020044201 | Alexander et al. | Apr 2002 | A1 |
20020106998 | Presley et al. | Aug 2002 | A1 |
20020109775 | White et al. | Aug 2002 | A1 |
20020128985 | Greenwald | Sep 2002 | A1 |
20020193908 | Parker | Dec 2002 | A1 |
20030080901 | Piotrowski | May 2003 | A1 |
20030112823 | Collins et al. | Jun 2003 | A1 |
20030120714 | Wolff et al. | Jun 2003 | A1 |
20030135097 | Wiederhold et al. | Jul 2003 | A1 |
20030195662 | Wang et al. | Oct 2003 | A1 |
20030216833 | Mukai et al. | Nov 2003 | A1 |
20040008138 | Hockley, Jr. et al. | Jan 2004 | A1 |
20040017475 | Akers | Jan 2004 | A1 |
20040019406 | Wang | Jan 2004 | A1 |
20040088078 | Jouppi et al. | May 2004 | A1 |
20040117067 | Jouppi | Jun 2004 | A1 |
20040150725 | Taguchi | Aug 2004 | A1 |
20040168148 | Goncalves et al. | Aug 2004 | A1 |
20040218099 | Washington | Nov 2004 | A1 |
20040260790 | Balloni et al. | Dec 2004 | A1 |
20050052527 | Remy | Mar 2005 | A1 |
20050073575 | Thacher et al. | Apr 2005 | A1 |
20050125083 | Kiko | Jun 2005 | A1 |
20050149364 | Ombrellaro | Jul 2005 | A1 |
20050152447 | Jouppi et al. | Jul 2005 | A1 |
20050152565 | Jouppi et al. | Jul 2005 | A1 |
20050168568 | Jouppi | Aug 2005 | A1 |
20050264649 | Chang et al. | Dec 2005 | A1 |
20050286759 | Zitnick et al. | Dec 2005 | A1 |
20060010028 | Sorensen | Jan 2006 | A1 |
20060056655 | Wen et al. | Mar 2006 | A1 |
20060056837 | Vapaakoski | Mar 2006 | A1 |
20060066609 | Todice et al. | Mar 2006 | A1 |
20060071797 | Rosenfeld et al. | Apr 2006 | A1 |
20060178559 | Kumar et al. | Aug 2006 | A1 |
20070093279 | Janik | Apr 2007 | A1 |
20070116152 | Thesling | May 2007 | A1 |
20070170886 | Plishner | Jul 2007 | A1 |
20070199108 | Angle | Aug 2007 | A1 |
20070226949 | Hahm et al. | Oct 2007 | A1 |
20070290040 | Wurman et al. | Dec 2007 | A1 |
20080027591 | Lenser et al. | Jan 2008 | A1 |
20080033641 | Medalia | Feb 2008 | A1 |
20080051985 | D'Andrea et al. | Feb 2008 | A1 |
20080086241 | Phillips et al. | Apr 2008 | A1 |
20080091340 | Milstein et al. | Apr 2008 | A1 |
20080161969 | Lee et al. | Jul 2008 | A1 |
20080232763 | Brady | Sep 2008 | A1 |
20080263628 | Norman et al. | Oct 2008 | A1 |
20080267069 | Thielman et al. | Oct 2008 | A1 |
20090049640 | Lee et al. | Feb 2009 | A1 |
20090055023 | Walters | Feb 2009 | A1 |
20090102919 | Zamierowski et al. | Apr 2009 | A1 |
20100017046 | Cheung et al. | Jan 2010 | A1 |
20100026239 | Li et al. | Feb 2010 | A1 |
20100030578 | Siddique et al. | Feb 2010 | A1 |
20100066804 | Shoemake et al. | Mar 2010 | A1 |
20100171826 | Hamilton et al. | Jul 2010 | A1 |
20100278086 | Pochiraju et al. | Nov 2010 | A1 |
20100286905 | Goncalves et al. | Nov 2010 | A1 |
20100301679 | Murray et al. | Dec 2010 | A1 |
20110022705 | Yellamraju et al. | Jan 2011 | A1 |
20110071675 | Wells et al. | Mar 2011 | A1 |
20110072114 | Hoffert et al. | Mar 2011 | A1 |
20110153198 | Kokkas et al. | Jun 2011 | A1 |
20110193949 | Nambakam et al. | Aug 2011 | A1 |
20110195701 | Cook et al. | Aug 2011 | A1 |
20110280551 | Sammon | Nov 2011 | A1 |
20110306400 | Nguyen | Dec 2011 | A1 |
20120059946 | Wang | Mar 2012 | A1 |
20120113856 | Krishnaswamy | May 2012 | A1 |
20120203731 | Nelson et al. | Aug 2012 | A1 |
20120291809 | Kuhe et al. | Nov 2012 | A1 |
20130250938 | Anandakumar et al. | Sep 2013 | A1 |
20140047022 | Chan et al. | Feb 2014 | A1 |
20140085543 | Hartley et al. | Mar 2014 | A1 |
20140135990 | Stuart et al. | May 2014 | A1 |
20140139616 | Pinter et al. | May 2014 | A1 |
20140155755 | Pinter et al. | Jun 2014 | A1 |
Number | Date | Country |
---|---|---|
1404695 | Mar 2003 | CN |
1561923 | Jan 2005 | CN |
1743144 | Mar 2006 | CN |
101049017 | Oct 2007 | CN |
101151614 | Mar 2008 | CN |
100407729 | Jul 2008 | CN |
11220706 | Aug 1999 | JP |
2002321180 | Nov 2002 | JP |
2004181229 | Jul 2004 | JP |
2005111083 | Apr 2005 | JP |
2009125133 | Jun 2009 | JP |
9742761 | Nov 1997 | WO |
WO-2007041295 | Apr 2007 | WO |
2009128997 | Oct 2009 | WO |
Entry |
---|
Diggs, Alice C.; Design of a socially intelligent task selection software mechanism for a mobile robot; Tennessee State University. ProQuest Dissertations Publishing, 2008. 1456761. (Year: 2008). |
Fulbright et al., “SWAMI: An Autonomous Mobile Robot for Inspection of Nuclear Waste Storage Facilities”, Autonomous Robots, vol. 2, 1995, pp. 225-235. |
Screenshot Showing Google Date for Lemaire Telehealth Manual, Screenshot Retrieved on Dec. 18, 2014, 1 page. |
Nomadic Technologies, Inc., “Nomad Scout Language Reference Manual”, Software Version: 2.7, Part No. DOC00002, Jul. 12, 1999, 47 pages. |
“Appeal from the U.S. District Court for the Central District of California in No. 11-CV-9185, Judge Percy Anderson”, May 9, 2014, pp. 1-48. |
“Google translation of: Innovations Report”, From research project to television star: Care-O-bot in ZDF series, http://www.innovations-report.de/specials/printa.php?id=5157, Sep. 28, 2001, 2 pages. |
“MPEG File Format Summary”, downloaded from: http://www.fileformat.info/format/mpeg/egff.htm, Feb. 1, 2001, 8 pages. |
“Nomad Scout User's Manual”, Nomadic Technologies, Software Version 2. 7, Part No. DOC00004, Jul. 12, 1999, pp. 1-59. |
ACM Digital Library Record, “Autonomous Robots vol. 11 Issue 1”, downloaded from <http://dl.acm.org/citation.cfm?id=591550&picked=prox&cfid=360891374&cftoken=35225929>, Jul. 2001, 2 pages. |
Brenner, “A technical tutorial on the IEEE 802.11 protocol”, BreezeCOM Wireless Communications, 1997, pp. 1-24. |
CMU Course 16X62, “Robot user's manual”, (describing the Nomad Scout), Carnegie Mellon University, Feb. 1, 2001, 11 pages. |
Gostai, “Gostai Jazz: Robotic Telepresence”, Available online at <http://www.gostai.com>, 4 pages. |
Koenen, “MPEG-4: a Powerful Standard for Use in Web and Television Environments”, (KPN Research), downloaded from http://www.w3.org/Architecture/1998/06/Workshop/paper26, Jul. 1, 1998, 4 pages. |
Library of Congress, “008-Fixed-Length Data Elements (NR)”, MARC 21 Format for Classification Data, downloaded from http://www.loc.gov/marc/classification/cd008.html, Jan. 2000, pp. 1-14. |
Osborn, “Quality of Life Technology Center”, QoLT Research Overview:A National Science Foundation Engineering Research Center, Carnegie Mellon University of Pittsburgh, 2 pages. |
Panusopone, et al., “Performance comparison of MPEG-4 and H.263+ for streaming video applications”, Circuits Systems Signal Processing, vol. 20, No. 3, 2001, pp. 293-309. |
Paulos, et al., “Personal Tele-Embodiment”, Chapter 9 in Goldberg, et al., ed. “Beyond webcams”, MIT Press, Jan. 4, 2002, pp. 155-167. |
Paulos, “Personal tele-embodiment”, OskiCat Catalog Record, UCB Library Catalog, 2001, 3 pages. |
Paulos, “Personal Tele-Embodiment”, Introductory and cover pages from 2001 Dissertation including Contents table, together with e-mails relating thereto from UC Berkeley Libraries, as shelved at UC Berkeley Engineering Library (Northern Regional library Facility), May 8, 2002, 25 pages, including 4 pages of e-mails. |
Paulos, et al., “Social Tele-Embodiment: Understanding Presence”, Autonomous Robots, vol. 11, Issue 1, Kluwer Academic Publishers, Jul. 2001, pp. 87-95. |
Schraft, et al., “Care-O-bot™: the concept of a system for assisting elderly or disabled persons in home environments”, IEEE Proceedings of the 24th Annual Conference of the Industrial Electronics Society, IECON '98, Aug. 31-Sep. 4, 1998, pp. 2476-2481. |
Video Middleware Cookbook, “H.350 Directory Services for Multimedia”, 2 pages. |
Number | Date | Country | |
---|---|---|---|
20150339452 A1 | Nov 2015 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 12362454 | Jan 2009 | US |
Child | 14472277 | US |