The present disclosure relates generally to patient monitoring systems and, more particularly, to a system and method for monitoring patients in a manner that prevents or reduces patient falls.
According to recent studies, falls are a leading cause of death among people over the age of 65, and 10% of fatal falls for patients over 65 occur in a hospital setting. For the general population, studies estimate that patient falls occur in 1.9% to 3% of all acute care hospitalizations. Of these hospital-based falls, approximately 30% result in a serious injury, and the cost to care for these injuries is estimated to reach $54.9 billion per year by 2020. Current technologies that exist to assist in the prevention of falls are limited in their capabilities. They include pressure pads on the bed that trigger an alert when no pressure is detected on the pad, pressure pads on the floor, and light beams that create a perimeter and sound an alarm when the beam is interrupted. The pressure pads are ineffective because they do not prevent the fall; rather, they alert after the fact, when it is too late. Additionally, they are prone to false positive alerts. The light beams likewise produce false alerts when a patient or visitor simply reaches through the beam, or when a caregiver breaks the beam while delivering medication, food, or drink or conducting a procedure on the patient. The present disclosure is directed to addressing these above-described shortcomings of current technology.
Generally disclosed is a novel method and system that allows healthcare providers, hospitals, skilled nursing facilities, and other persons to monitor disabled, elderly, or other high-risk individuals and to prevent or reduce falls and/or mitigate the impact of a fall. The described technology delivers automated notification of “at risk” behavior and/or falls by the individual being monitored, especially falls and/or behavior where assistance is required, using a skeletal tracking system and a virtual blob detection system.
With skeletal tracking alone, factors affecting camera and image/video quality can impair the ability of the detection/monitoring system to detect a skeleton. Such factors, especially in a hospital, include, but are not limited to, sheets or blankets covering a patient, trays positioned over the bed hiding the patient, and the patient blending into the bed so that no skeleton is recognized.
With blob detection alone, there can be an increase in false positives in detecting falls and “at risk” behavior. These false positives occur because blob detection does not differentiate between types of 3D objects: blankets, trays, caretakers, or other 3D objects can all trigger an automated notification. Blob detection also does not differentiate between parts of the body.
The present disclosure's use of a skeletal tracking system together with a virtual blob detection system addresses, or at least reduces, the shortcomings of both systems. Skeletal tracking can be used to reduce or eliminate false positives generated by a virtual blob detection system. Virtual blob detection relies on 3D object detection, which works regardless of how much of the person is viewable by the camera or whether other objects block the camera's view. Even in poor lighting conditions, both the virtual blob detection system and the skeletal tracking system can still capture and/or recognize movement, because the system can use an IR depth map, which does not depend on lighting conditions, to perform the blob and/or skeletal detection.
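As a non-limiting illustration, the following Python sketch shows one way depth-based blob segmentation could operate independently of lighting. The NumPy/SciPy representation, the millimeter units, and the `extract_blobs` thresholds are assumptions made for illustration, not the disclosed implementation.

```python
import numpy as np
from scipy import ndimage

def extract_blobs(depth_frame, background, min_pixels=500, tolerance_mm=40):
    """Segment foreground blobs from an IR depth frame.

    Depth-based segmentation works in any lighting because it uses
    distance data only. Both inputs are HxW arrays of distances in mm;
    `background` is a reference capture of the empty room.
    """
    depth = depth_frame.astype(np.int32)
    ref = background.astype(np.int32)
    # Foreground = pixels measurably closer to the sensor than the
    # reference; a depth of 0 is a sensor dropout, not an object.
    foreground = (depth > 0) & (ref - depth > tolerance_mm)
    # Group connected foreground pixels into distinct blobs.
    labeled, count = ndimage.label(foreground)
    # Keep only blobs large enough to plausibly be (part of) a person.
    return [labeled == i for i in range(1, count + 1)
            if (labeled == i).sum() >= min_pixels]
```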
The present disclosure uses both a skeletal tracking system and a virtual blob detection system to track whether an individual has fallen or engaged in “at risk” behavior. When the skeletal tracking system is unable to track a skeleton, the virtual blob detection system is used to capture and/or recognize movement. In the alternative, both the skeletal tracking system and the blob detection system can monitor an individual simultaneously, and a notification is delivered when either system detects a fall or “at risk” behavior.
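A minimal sketch of the simultaneous mode follows, assuming each system exposes a hypothetical `check(frame)` interface; the interface names are illustrative only.

```python
def evaluate_frame(skeletal_system, blob_system, frame):
    """Run both monitoring systems on the same sensor frame and alert
    when either one detects a fall or "at risk" behavior.

    Both systems are assumed to expose check(frame), returning an event
    string (e.g. "barrier_crossed") or None.
    """
    for name, system in (("skeletal", skeletal_system),
                         ("blob", blob_system)):
        event = system.check(frame)
        if event is not None:
            return name, event   # one detection is enough to notify
    return None
```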
The following non-limiting definitions are provided as an aid in understanding the disclosed novel method and system:
At step F1a, one or more 3D camera, motion and/or sound sensors can be installed in the patient's or individual's room. At step F1b, the one or more 3D camera, motion and/or sound sensors can be configured to recognize the patient or monitored individual using biometric identifiers such as facial recognition, height, distance between points on the body, etc. The patient's body can be recognized and tracked as one or more skeletal components. Alternatively, the patient can be identified by means of a user creating a three-dimensional zone around the patient through a software application stored on a computerized skeletal tracking system or through the use of an electronic transmitter on the patient's or other individual's person. Once a patient is identified, the software can automatically generate or allow the user to generate a configurable three-dimensional zone or perimeter around the patient and/or the patient's bed that acts as a virtual barrier. At step F1c, data from the 3D camera, motion and/or sound sensors can be continuously sent to a computerized skeletal tracking system, preferably at all times while the system is being used for monitoring. At step F1d, a continuous video feed can be sent to the central monitoring primary display, preferably at all times while the system is being used for monitoring.
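As a non-limiting illustration, the configurable three-dimensional zone can be modeled as an axis-aligned box in the sensor's coordinate frame. The sketch below, including its assumed joint-name interface, merely shows how a barrier-crossing test might look.

```python
from dataclasses import dataclass

@dataclass
class VirtualZone:
    """Axis-aligned 3D box, in meters, in the sensor's coordinate frame."""
    x_min: float
    x_max: float
    y_min: float
    y_max: float
    z_min: float
    z_max: float

    def contains(self, x, y, z):
        return (self.x_min <= x <= self.x_max
                and self.y_min <= y <= self.y_max
                and self.z_min <= z <= self.z_max)

def joints_outside(zone, joints):
    """Return the tracked skeletal joints that have crossed the barrier.

    `joints` maps joint names (e.g. "head", "left_hand") to (x, y, z)
    positions as reported by the skeletal tracker.
    """
    return {name: p for name, p in joints.items() if not zone.contains(*p)}
```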
At step F1e, if the computerized skeletal tracking system does not detect a patient's skeleton (because, as non-limiting examples, the patient is covered with a sheet, blanket, or tray, is obscured by a caretaker or other individual, or for another reason), then the computerized skeletal tracking system continues to try to identify the patient until the obstruction is removed or the issue is otherwise resolved. In some embodiments, if the computerized skeletal tracking system does not detect a patient's skeleton within a preprogrammed, configurable length of time, the computerized virtual blob detection monitoring system is activated and begins to monitor the patient (see e.g.,
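One way such a timeout-based fallback could be implemented is sketched below; the `skeleton_visible`/`check` interfaces and the 5-second timeout are assumptions for illustration, not part of the disclosure.

```python
import time

SKELETON_TIMEOUT_S = 5.0   # preprogrammed, configurable length of time

def monitor(skeletal_system, blob_system, frames):
    """Yield detections, activating blob detection when the skeleton
    has been lost for longer than the configured timeout.

    `frames` is any iterable of sensor frames.
    """
    last_seen = time.monotonic()
    for frame in frames:
        if skeletal_system.skeleton_visible(frame):
            last_seen = time.monotonic()
            event = skeletal_system.check(frame)
        elif time.monotonic() - last_seen > SKELETON_TIMEOUT_S:
            # Skeleton obscured (sheet, tray, caretaker, ...): fall back
            # to the computerized virtual blob detection system.
            event = blob_system.check(frame)
        else:
            event = None   # keep trying to reacquire the skeleton
        if event is not None:
            yield event
```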
At step F1f, if the computerized skeletal tracking system detects that the patient or any part of the patient crosses the virtual barrier around the patient and/or the patient's bed, the skeletal tracking system will alert the computerized communication system, and a record of the incident can also be entered in a database. If other individuals, such as a caregiver, are also detected within the monitored room at the time the virtual barrier is crossed, the system can be designed or programmed such that no alert is generated, and it will continue to monitor the data being sent from the 3D camera, motion, and sound sensor. In this situation, generating an alarm/alert could result in a false alarm because there are other individual(s) with the patient, and such person(s) may be responsible for monitoring the patient and/or (even if not responsible) can assist the patient who is falling. Additionally, the person in the room should be in a better position to assist the patient than an individual located at the central monitoring station. It is also within the scope of the disclosure to send alarms/alerts even if other individual(s) are in the room with the patient, as those individuals may not be the person responsible for the patient, may be elderly, may have a physical handicap preventing them from stopping a fall, etc.
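A minimal sketch of this configurable suppression decision, assuming the room-occupancy count comes from the same sensor data, might look as follows.

```python
def should_alert(barrier_crossed, others_in_room, suppress_when_attended=True):
    """Decide whether a virtual-barrier crossing raises an alert.

    `others_in_room` is the number of individuals detected in the room
    besides the patient. Suppression is a configuration choice: a
    visitor may be unable, or not responsible, to stop a fall, so some
    deployments alert regardless.
    """
    if not barrier_crossed:
        return False
    if suppress_when_attended and others_in_room > 0:
        return False   # log the incident, but assume the attendant acts
    return True
```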
At step F1g, the computerized communication system preferably can first issue a verbal warning to the patient that they have crossed the virtual barrier. This verbal warning can be a pre-recorded message, including, but not limited to, a pre-recorded message from any caregiver, and will advise the patient to exit the virtual barrier and return to their previous position. At step F1h, should the patient fail to exit the virtual barrier and return to their previous position in a timely manner, an alert can be generated on the central monitoring alert display system (see e.g.,
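A sketch of this warn-then-escalate flow is given below; the callables, the polling interval, the grace period, and the message wording are all assumed for illustration.

```python
import time

WARNING_GRACE_S = 30.0   # time allowed to comply (assumed value)

def warn_then_escalate(play_message, barrier_clear, raise_central_alert):
    """Steps F1g/F1h: play the pre-recorded warning, then escalate to
    the central monitoring alert display if the patient does not return
    behind the virtual barrier in time."""
    play_message("Please return to your bed.")   # illustrative wording
    deadline = time.monotonic() + WARNING_GRACE_S
    while time.monotonic() < deadline:
        if barrier_clear():
            return False          # patient complied; no alert needed
        time.sleep(0.5)
    raise_central_alert()
    return True
```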
At step F1i, the computerized communication system can notify caregivers or other designated persons that the individual being monitored requires assistance. Notification of caregivers can be made through phone call, text messaging, speakerphone systems, pagers, email, or other electronic means of communication if so desired and configured. At step F1j, if the patient exits the virtual zone (i.e., crosses the virtual barrier), the system database can be updated to reflect such. Additionally, the system will continue to monitor the patient and store all data in the system database.
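As a non-limiting illustration of step F1i, an alert can be fanned out over each contact's configured channels; the channel names and the `send(address, message)` gateway interface are hypothetical.

```python
def notify_caregivers(alert, contacts, senders):
    """Fan an alert out over each contact's configured channels.

    `senders` maps a channel name ("sms", "email", "pager", ...) to a
    send(address, message) callable backed by a real gateway.
    """
    message = (f"Room {alert['room']}: monitored individual requires "
               f"assistance ({alert['event']}).")
    for contact in contacts:
        for channel in contact.get("channels", []):
            senders[channel](contact["address"], message)
```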
At step F2a, one or more 3D camera, motion and/or sound sensors can be installed in the patient's or individual's room. At step F2b, the one or more 3D camera, motion and/or sound sensors can be configured to recognize the area being monitored using three-dimensional areas defined by x, y, and z coordinates in relation to the 3D camera, motion and/or sound sensor. Based on the data sent/captured by the 3D camera, motion and/or sound sensor(s), the computerized virtual blob detection monitoring system is programmed to recognize any 3D object within the configured area. The patient's body is recognized and tracked as one or more blobs. Virtual blob detection zones can also be calibrated at this time. At step F2c, data from the 3D camera, motion and/or sound sensors can be continuously sent to the computerized virtual blob detection monitoring system, preferably at all times while the system is being used for monitoring. At step F2d, a continuous video feed can be sent to the central monitoring primary display, preferably at all times while the system is being used for monitoring.
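As a non-limiting illustration of expressing monitored areas in x, y, and z coordinates relative to the sensor, a depth pixel can be back-projected with a standard pinhole camera model; the intrinsic values below are placeholders, not sensor specifications.

```python
# Assumed pinhole intrinsics for the depth sensor:
FX, FY = 525.0, 525.0     # focal lengths in pixels
CX, CY = 319.5, 239.5     # principal point

def depth_pixel_to_xyz(u, v, depth_mm):
    """Convert a depth pixel (u, v) into x, y, z coordinates, in
    meters, in relation to the 3D camera (pinhole camera model)."""
    z = depth_mm / 1000.0
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    return x, y, z
```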
At step F2e, if the computerized virtual blob detection monitoring system does not detect that the patient or any part of the patient (i.e., presented as one or more blob objects) has crossed into the designated virtual blob detection zone, it continues monitoring. As a non-limiting example, if both hands of the patient enter the blob detection zone, the system may display and/or track them as two different blobs or possibly as a single blob, depending on how close the hands are to each other. If the computerized virtual blob detection monitoring system detects that the patient or any part of the patient has crossed into the designated virtual blob detection zone, it proceeds to step F2f to determine the size of the portion of the patient's body that entered the zone. If that size is less than the configured minimum, the system continues to monitor. Configuration is preferably performed through the detection system's programmed software and may be similar to how the zones, trip wires, etc. are configured. If, however, the size of the patient's body within the blob detection zone is above the minimum predetermined or preprogrammed object-size threshold, the system proceeds to step F2g. At step F2g, the system determines how long the patient's body, or part of it, has remained within the blob detection zone. If it has not remained in the detection zone for greater than a configured amount of time, preferably no alert is generated and the system continues to monitor the individual; however, the system can also be programmed to generate an alert based solely on detecting a large enough blob within the detection zone for any period of time, and such is also considered within the scope of the disclosure. If, at step F2g, the patient's body has remained within the blob detection zone for greater than the minimum configured time period, the monitoring system alerts the computerized communication system and can also enter a record of the incident in a database. If other individuals, such as a caregiver, are also detected within the monitored room at the time the virtual blob detection zone threshold is crossed, the system can be designed or programmed such that no alert is generated, and it will continue to monitor the data being sent from the 3D camera, motion, and sound sensor. In this situation, generating an alarm/alert could result in a false alarm because there are other individual(s) with the patient, and such person(s) may be responsible for monitoring the patient and/or (even if not responsible) can assist the patient who is falling. The person in the room may also be in a better position to assist the patient than an individual located at the central monitoring station. It is also within the scope of the disclosure to send alarms/alerts even if other individual(s) are in the room with the patient, as those individuals may not be the person responsible for the patient, may be elderly, may have a physical handicap preventing them from stopping a fall, etc.
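The size test (F2f) and dwell-time test (F2g) can be sketched as follows, assuming the blob and zone masks are boolean NumPy arrays such as those produced by the earlier `extract_blobs` sketch; the threshold values are assumed, not disclosed.

```python
import time

MIN_BLOB_PIXELS = 800   # minimum object-size threshold (assumed value)
MIN_DWELL_S = 2.0       # minimum configured time inside the zone

def zone_alarm(blob_masks, zone_mask, state, now=None):
    """Steps F2f/F2g: alert only when a large-enough blob remains in
    the virtual blob detection zone past the configured dwell time.

    `state` is a dict carrying the time the zone first became occupied.
    """
    now = time.monotonic() if now is None else now
    occupied = any((m & zone_mask).sum() >= MIN_BLOB_PIXELS
                   for m in blob_masks)
    if not occupied:
        state["entered_at"] = None          # zone clear: reset the timer
        return False
    if state.get("entered_at") is None:
        state["entered_at"] = now           # size test passed (F2f)
        return False
    return now - state["entered_at"] >= MIN_DWELL_S   # dwell test (F2g)
```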
At step F2h, the computerized communication system preferably can first issue a verbal warning to the patient that he or she has entered the virtual blob detection zone. This verbal warning can be a pre-recorded message, including, but not limited to, a pre-recorded message from any caregiver, and will advise the patient to exit the virtual blob detection zone and return to his or her previous position. At step F2i, should the patient fail to exit the virtual blob detection zone and return to his or her previous position in a timely manner, an alert can be generated on the central monitoring alert display system (see e.g.,
At step F2j, the computerized communication system can notify caregivers or other designated persons that the individual requires assistance. Notification of caregivers can be made through phone call, text messaging, speakerphone systems, pagers, email, or other electronic means of communication if so desired and configured. At step F2k, if the patient exits the virtual blob detection zone, the system database can be updated to reflect such. Additionally, the system will continue to monitor the patient and store all data in the system database.
At step F3b, all video, audio, and alert feeds received by the centralized monitoring station can be displayed on the centralized monitoring primary display. Alternatively, multiple centralized monitoring primary displays can be utilized based on the quantity of rooms to be monitored at a given time. At step F3c, when the centralized monitoring station receives an alert from any of the computerized monitoring and communication systems indicating that an individual in any of the monitored rooms or other locations has fallen or otherwise entered a virtual detection zone, the video, audio, and alert information for the specific room and/or individual is displayed on the centralized monitoring alert display. Should the centralized monitoring station receive alerts from more than one of the computerized monitoring and communication systems, indicating that individuals in multiple monitored rooms or locations have entered a virtual barrier or virtual blob detection zone, the centralized monitoring alert display preferably displays the video, audio, and alert information from all such instances at the same time. If no alert is received by the centralized monitoring station, preferably nothing is displayed on the centralized monitoring alert display. At step F3d, an electronic record of any alerts received by the centralized monitoring station can be stored in an electronic database, which is in communication with the centralized monitoring station.
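A stand-in for this routing behavior is sketched below; the class and method names are assumptions, and a real display would bind these feeds to actual video/audio streams.

```python
class CentralizedAlertDisplay:
    """Feeds are shown only for rooms with an active alert, all at the
    same time; with no active alerts nothing is displayed (step F3c)."""

    def __init__(self):
        self._active = {}                  # room -> (video, audio, info)

    def on_alert(self, room, video, audio, info):
        self._active[room] = (video, audio, info)

    def on_cleared(self, room):
        self._active.pop(room, None)

    def visible_feeds(self):
        return list(self._active.values())
```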
Additionally, the functions of the computerized virtual blob detection monitoring system and the computerized skeletal tracking system can be performed in practice by a single system. In these embodiments, the disclosure performs the same processes described in
At step F4c, in some embodiments, if the computerized skeletal tracking system detects a skeleton, then the method proceeds to step F1e of
As non-limiting examples, the icons that appear on the screen for selection can include the following symbols shown in
As seen in
As seen in
If there are any other types of zones or rails to draw for the particular sensor, the above steps are repeated to place the next zone or rail and the depth setting can be adjusted for each if necessary. Additionally, all zones and rails can be cleared by clicking on or otherwise selecting the Clear All icon in the toolbar. Once all of the zones/rails are configured, the user can close the window to finish, or the user may have the option to save the zone/rail configuration for later use.
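As a non-limiting illustration, saving a zone/rail configuration for later use could be as simple as serializing the configured geometry; the JSON schema shown is assumed, not specified by the disclosure.

```python
import json

def save_zones(path, zones):
    """Persist the configured zones/rails for later use.

    `zones` is a list of dicts such as
    {"kind": "bed_zone", "x_min": 0.0, ..., "depth": 2.1}.
    """
    with open(path, "w") as f:
        json.dump({"version": 1, "zones": zones}, f, indent=2)

def load_zones(path):
    with open(path) as f:
        return json.load(f)["zones"]
```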
As seen in
As seen in
As seen in
For certain Actions, an additional field may need to be completed to finish the Action. If the field is required, it can appear below the Action dropdown (see e.g.,
To add more Alerts, the user clicks or selects the Add button and repeats the above described steps. Once finished, the user clicks on or otherwise selects the bottom corner OK button to save and close the window.
In one non-limiting embodiment, the disclosed system and method can use the following components:
The various components can be in electrical and/or wireless communication with each other.
Located remote is defined to mean that the centralized monitoring station, centralized monitoring primary display, and/or centralized monitoring alert display is not physically located within the monitored rooms. However, it can be located on the same premises at a different location (e.g., a nurse station for the premises, hospital, etc.) or at a different location altogether (e.g., a monitoring station, etc.).
The automatic detection of an individual entering a prescribed virtual blob detection zone will provide significant administrative and clinical benefits to caregivers and individuals alike, including the following non-limiting public benefits.
Any computer/server/electronic database system (collectively “computer system”) capable of being programmed with the specific steps of the present invention can be used and is considered within the scope of the disclosure. Once specifically programmed, such a computer system can preferably be considered a special purpose computer limited to the use of two or more of the above particularly described combinations of steps (programmed instructions) performing two or more of the above particularly described combinations of functions.
All components of the presently disclosed system and their locations, electronic communication methods between the system components, electronic storage mechanisms, etc. discussed above or shown in the drawings, if any, are merely by way of example and are not considered limiting; other component(s) and their locations, electronic communication methods, electronic storage mechanisms, etc. currently known and/or later developed can also be chosen and used, and all are considered within the scope of the disclosure.
Unless feature(s), part(s), component(s), characteristic(s), or function(s) described in the specification or shown in the drawings for a claim element, claim step, or claim term specifically appear in the claim with the claim element, claim step, or claim term, then the inventor does not consider such feature(s), part(s), component(s), characteristic(s), or function(s) to be included for the claim element, claim step, or claim term in the claim when and if the claim element, claim step, or claim term is interpreted or construed. Similarly, with respect to any “means for” elements in the claims, the inventor considers such language to require only the minimal amount of features, components, steps, or parts from the specification to achieve the function of the “means for” language and not all of the features, components, steps, or parts described in the specification that are related to the function of the “means for” language.
While the disclosure has been described and disclosed in certain terms and has disclosed certain embodiments or modifications, persons skilled in the art who have acquainted themselves with the disclosure will appreciate that it is not necessarily limited by such terms nor to the specific embodiments and modification disclosed herein. Thus, a wide variety of alternatives suggested by the teachings herein can be practiced without departing from the spirit of the disclosure, and rights to such alternatives are particularly reserved and considered within the scope of the disclosure.
This application, having Ser. No. 16/166,857 and entitled “Systems and Methods for Determining Whether an Individual Enters a Prescribed Virtual Zone Using Skeletal Tracking and 3D Blob Detection,” is a continuation of pending U.S. application Ser. No. 15/728,110, filed Oct. 9, 2017, and entitled “Method for Determining Whether an Individual Enters a Prescribed Virtual Zone Using Skeletal Tracking and 3D Blob Detection,” which is a continuation of U.S. application Ser. No. 14/727,434, filed Jun. 1, 2015, now U.S. Pat. No. 9,892,611, issued Feb. 13, 2018, the entirety of each of which is incorporated by reference herein.
Number | Name | Date | Kind |
---|---|---|---
4669263 | Sugiyama | Jun 1987 | A |
4857716 | Gombrich et al. | Aug 1989 | A |
5031228 | Lu | Jul 1991 | A |
5276432 | Travis | Jan 1994 | A |
5448221 | Weller | Sep 1995 | A |
5482050 | Smokoff et al. | Jan 1996 | A |
5592153 | Welling et al. | Jan 1997 | A |
5798798 | Rector et al. | Aug 1998 | A |
5838223 | Gallant et al. | Nov 1998 | A |
5915379 | Wallace et al. | Jun 1999 | A |
5942986 | Shabot et al. | Aug 1999 | A |
6050940 | Braun | Apr 2000 | A |
6095984 | Amano et al. | Aug 2000 | A |
6160478 | Jacobsen et al. | Dec 2000 | A |
6174283 | Nevo et al. | Jan 2001 | B1 |
6188407 | Smith et al. | Feb 2001 | B1 |
6269812 | Wallace et al. | Aug 2001 | B1 |
6287452 | Allen | Sep 2001 | B1 |
6322502 | Schoenberg et al. | Nov 2001 | B1 |
6369838 | Wallace et al. | Apr 2002 | B1 |
6429869 | Kamakura et al. | Aug 2002 | B1 |
6614349 | Proctor et al. | Sep 2003 | B1 |
6727818 | Wildman et al. | Apr 2004 | B1 |
6804656 | Rosenfeld et al. | Oct 2004 | B1 |
7015816 | Wildman et al. | Mar 2006 | B2 |
7122005 | Shusterman | Oct 2006 | B2 |
7154397 | Zerhusen et al. | Dec 2006 | B2 |
7237287 | Weismiller et al. | Jul 2007 | B2 |
7323991 | Eckert et al. | Jan 2008 | B1 |
7408470 | Wildman et al. | Aug 2008 | B2 |
7420472 | Tran | Sep 2008 | B2 |
7430608 | Noonan et al. | Sep 2008 | B2 |
7502498 | Wen et al. | Mar 2009 | B2 |
7612679 | Fackler et al. | Nov 2009 | B1 |
7669263 | Menkedick et al. | Mar 2010 | B2 |
7715387 | Schuman | May 2010 | B2 |
7724147 | Brown | May 2010 | B2 |
7756723 | Rosow et al. | Jul 2010 | B2 |
7890349 | Cole et al. | Feb 2011 | B2 |
7895055 | Schneider et al. | Feb 2011 | B2 |
7908153 | Scherpbier et al. | Mar 2011 | B2 |
7945457 | Zaleski | May 2011 | B2 |
7962544 | Torok et al. | Jun 2011 | B2 |
7972140 | Renaud | Jul 2011 | B2 |
8108036 | Tran | Jan 2012 | B2 |
8123685 | Brauers et al. | Feb 2012 | B2 |
8224108 | Steinberg et al. | Jul 2012 | B2 |
8237558 | Seyed Momen et al. | Aug 2012 | B2 |
8273018 | Fackler et al. | Sep 2012 | B1 |
8432263 | Kunz | Apr 2013 | B2 |
8451314 | Cline et al. | May 2013 | B1 |
8529448 | McNair | Sep 2013 | B2 |
8565500 | Neff | Oct 2013 | B2 |
8620682 | Bechtel et al. | Dec 2013 | B2 |
8655680 | Bechtel et al. | Feb 2014 | B2 |
8700423 | Eaton, Jr. et al. | Apr 2014 | B2 |
8727981 | Bechtel et al. | May 2014 | B2 |
8769153 | Dziubinski | Jul 2014 | B2 |
8890937 | Skubic et al. | Nov 2014 | B2 |
8902068 | Bechtel et al. | Dec 2014 | B2 |
8917186 | Grant | Dec 2014 | B1 |
8953886 | King et al. | Feb 2015 | B2 |
9072929 | Rush et al. | Jul 2015 | B1 |
9129506 | Kusens | Sep 2015 | B1 |
9147334 | Long et al. | Sep 2015 | B2 |
9159215 | Kusens | Oct 2015 | B1 |
9269012 | Fotland | Feb 2016 | B2 |
9292089 | Sadek | Mar 2016 | B1 |
9305191 | Long et al. | Apr 2016 | B2 |
9408561 | Stone et al. | Aug 2016 | B2 |
9489820 | Kusens | Nov 2016 | B1 |
9519969 | Kusens | Dec 2016 | B1 |
9524443 | Kusens | Dec 2016 | B1 |
9536310 | Kusens | Jan 2017 | B1 |
9538158 | Rush et al. | Jan 2017 | B1 |
9563955 | Kamarshi et al. | Feb 2017 | B1 |
9597016 | Stone et al. | Mar 2017 | B2 |
9729833 | Kusens | Aug 2017 | B1 |
9741227 | Kusens | Aug 2017 | B1 |
9892310 | Kusens et al. | Feb 2018 | B2 |
9892311 | Kusens et al. | Feb 2018 | B2 |
9892611 | Kusens | Feb 2018 | B1 |
9905113 | Kusens | Feb 2018 | B2 |
10055961 | Johnson et al. | Aug 2018 | B1 |
10096223 | Kusens | Oct 2018 | B1 |
10210378 | Kusens et al. | Feb 2019 | B2 |
10225522 | Kusens | Mar 2019 | B1 |
10276019 | Johnson et al. | Apr 2019 | B2 |
20020015034 | Malmborg | Feb 2002 | A1 |
20020077863 | Rutledge et al. | Jun 2002 | A1 |
20020101349 | Rojas, Jr. | Aug 2002 | A1 |
20020115905 | August | Aug 2002 | A1 |
20020183976 | Pearce | Dec 2002 | A1 |
20030037786 | Biondi et al. | Feb 2003 | A1 |
20030070177 | Kondo et al. | Apr 2003 | A1 |
20030092974 | Santos et al. | May 2003 | A1 |
20030095147 | Daw | May 2003 | A1 |
20030135390 | O'Brien et al. | Jul 2003 | A1 |
20030140928 | Bui et al. | Jul 2003 | A1 |
20030227386 | Pulkkinen et al. | Dec 2003 | A1 |
20040019900 | Knightbridge et al. | Jan 2004 | A1 |
20040052418 | Delean | Mar 2004 | A1 |
20040054760 | Ewing et al. | Mar 2004 | A1 |
20040097227 | Siegel | May 2004 | A1 |
20040116804 | Mostafavi | Jun 2004 | A1 |
20040193449 | Wildman et al. | Sep 2004 | A1 |
20050038326 | Mathur | Feb 2005 | A1 |
20050182305 | Hendrich | Aug 2005 | A1 |
20050231341 | Shimizu | Oct 2005 | A1 |
20050249139 | Nesbit | Nov 2005 | A1 |
20060004606 | Wendl et al. | Jan 2006 | A1 |
20060047538 | Condurso et al. | Mar 2006 | A1 |
20060049936 | Collins et al. | Mar 2006 | A1 |
20060058587 | Heimbrock et al. | Mar 2006 | A1 |
20060089541 | Braun et al. | Apr 2006 | A1 |
20060092043 | Lagassey | May 2006 | A1 |
20060107295 | Margis et al. | May 2006 | A1 |
20060145874 | Fredriksson et al. | Jul 2006 | A1 |
20060261974 | Albert et al. | Nov 2006 | A1 |
20070085690 | Tran | Apr 2007 | A1 |
20070118054 | Pinhas et al. | May 2007 | A1 |
20070120689 | Zerhusen et al. | May 2007 | A1 |
20070129983 | Scherpbier et al. | Jun 2007 | A1 |
20070136218 | Bauer et al. | Jun 2007 | A1 |
20070159332 | Koblasz | Jul 2007 | A1 |
20070279219 | Warriner | Dec 2007 | A1 |
20070296600 | Dixon et al. | Dec 2007 | A1 |
20080001735 | Tran | Jan 2008 | A1 |
20080001763 | Raja et al. | Jan 2008 | A1 |
20080002860 | Super et al. | Jan 2008 | A1 |
20080004904 | Tran | Jan 2008 | A1 |
20080009686 | Hendrich | Jan 2008 | A1 |
20080015903 | Rodgers | Jan 2008 | A1 |
20080021731 | Rodgers | Jan 2008 | A1 |
20080071210 | Moubayed et al. | Mar 2008 | A1 |
20080087719 | Sahud | Apr 2008 | A1 |
20080106374 | Sharbaugh | May 2008 | A1 |
20080126132 | Warner et al. | May 2008 | A1 |
20080228045 | Gao et al. | Sep 2008 | A1 |
20080249376 | Zaleski | Oct 2008 | A1 |
20080267447 | Kelusky et al. | Oct 2008 | A1 |
20080277486 | Seem et al. | Nov 2008 | A1 |
20080281638 | Weatherly et al. | Nov 2008 | A1 |
20090082829 | Panken et al. | Mar 2009 | A1 |
20090091458 | Deutsch | Apr 2009 | A1 |
20090099480 | Salgo et al. | Apr 2009 | A1 |
20090112630 | Collins et al. | Apr 2009 | A1 |
20090119843 | Rodgers et al. | May 2009 | A1 |
20090177327 | Turner et al. | Jul 2009 | A1 |
20090224924 | Thorp | Sep 2009 | A1 |
20090278934 | Ecker et al. | Nov 2009 | A1 |
20090322513 | Hwang et al. | Dec 2009 | A1 |
20100117836 | Seyed Momen et al. | May 2010 | A1 |
20100169114 | Henderson et al. | Jul 2010 | A1 |
20100169120 | Herbst et al. | Jul 2010 | A1 |
20100172567 | Prokoski | Jul 2010 | A1 |
20100176952 | Bajcsy et al. | Jul 2010 | A1 |
20100188228 | Hyland | Jul 2010 | A1 |
20100205771 | Pietryga et al. | Aug 2010 | A1 |
20100245577 | Yamamoto et al. | Sep 2010 | A1 |
20100285771 | Peabody | Nov 2010 | A1 |
20100305466 | Corn | Dec 2010 | A1 |
20110018709 | Kornbluh | Jan 2011 | A1 |
20110022981 | Mahajan et al. | Jan 2011 | A1 |
20110025493 | Papadopoulos et al. | Feb 2011 | A1 |
20110025499 | Hoy et al. | Feb 2011 | A1 |
20110035057 | Receveur et al. | Feb 2011 | A1 |
20110035466 | Panigrahi | Feb 2011 | A1 |
20110054936 | Cowan et al. | Mar 2011 | A1 |
20110068930 | Wildman et al. | Mar 2011 | A1 |
20110077965 | Nolte et al. | Mar 2011 | A1 |
20110087079 | Aarts | Apr 2011 | A1 |
20110102133 | Shaffer | May 2011 | A1 |
20110102181 | Metz et al. | May 2011 | A1 |
20110106560 | Eaton et al. | May 2011 | A1 |
20110106561 | Eaton et al. | May 2011 | A1 |
20110175809 | Markovic et al. | Jul 2011 | A1 |
20110190593 | McNair | Aug 2011 | A1 |
20110227740 | Wohltjen | Sep 2011 | A1 |
20110245707 | Castle et al. | Oct 2011 | A1 |
20110254682 | Sigrist Christensen | Oct 2011 | A1 |
20110288811 | Greene | Nov 2011 | A1 |
20110295621 | Farooq et al. | Dec 2011 | A1 |
20110301440 | Riley et al. | Dec 2011 | A1 |
20110313325 | Cuddihy | Dec 2011 | A1 |
20120025991 | O'Keefe et al. | Feb 2012 | A1 |
20120026308 | Johnson et al. | Feb 2012 | A1 |
20120075464 | Derenne et al. | Mar 2012 | A1 |
20120092162 | Rosenberg | Apr 2012 | A1 |
20120098918 | Murphy | Apr 2012 | A1 |
20120140068 | Monroe et al. | Jun 2012 | A1 |
20120154582 | Johnson et al. | Jun 2012 | A1 |
20120212582 | Deutsch | Aug 2012 | A1 |
20120259650 | Mallon et al. | Oct 2012 | A1 |
20120314901 | Hanson et al. | Dec 2012 | A1 |
20130027199 | Bonner | Jan 2013 | A1 |
20130028570 | Suematsu et al. | Jan 2013 | A1 |
20130120120 | Long et al. | May 2013 | A1 |
20130122807 | Tenarvitz et al. | May 2013 | A1 |
20130184592 | Venetianer et al. | Jul 2013 | A1 |
20130265482 | Funamoto | Oct 2013 | A1 |
20130309128 | Voegeli et al. | Nov 2013 | A1 |
20130332184 | Burnham et al. | Dec 2013 | A1 |
20140039351 | Mix et al. | Feb 2014 | A1 |
20140070950 | Snodgrass | Mar 2014 | A1 |
20140085501 | Tran | Mar 2014 | A1 |
20140086450 | Huang et al. | Mar 2014 | A1 |
20140155755 | Pinter et al. | Jun 2014 | A1 |
20140191861 | Scherrer | Jul 2014 | A1 |
20140267625 | Clark et al. | Sep 2014 | A1 |
20140267736 | Delean | Sep 2014 | A1 |
20140327545 | Bolling et al. | Nov 2014 | A1 |
20140328512 | Gurwicz et al. | Nov 2014 | A1 |
20140333744 | Baym et al. | Nov 2014 | A1 |
20140333776 | Dedeoglu et al. | Nov 2014 | A1 |
20140354436 | Nix et al. | Dec 2014 | A1 |
20140365242 | Neff | Dec 2014 | A1 |
20150109442 | Derenne et al. | Apr 2015 | A1 |
20150206415 | Wegelin et al. | Jul 2015 | A1 |
20150269318 | Neff | Sep 2015 | A1 |
20150278456 | Bermudez Rodriguez et al. | Oct 2015 | A1 |
20150294143 | Wells et al. | Oct 2015 | A1 |
20160022218 | Hayes et al. | Jan 2016 | A1 |
20160070869 | Portnoy | Mar 2016 | A1 |
20160093195 | Ophardt | Mar 2016 | A1 |
20160127641 | Gove | May 2016 | A1 |
20160217347 | Mineo | Jul 2016 | A1 |
20160253802 | Venetianer et al. | Sep 2016 | A1 |
20160267327 | Franz et al. | Sep 2016 | A1 |
20160360970 | Tzvieli et al. | Dec 2016 | A1 |
20170055917 | Stone et al. | Mar 2017 | A1 |
20170143240 | Stone et al. | May 2017 | A1 |
20170337682 | Liao et al. | Nov 2017 | A1 |
20180018864 | Baker | Jan 2018 | A1 |
20180068545 | Kusens | Mar 2018 | A1 |
20180357875 | Kusens | Dec 2018 | A1 |
20190006046 | Kusens et al. | Jan 2019 | A1 |
20190029528 | Tzvieli et al. | Jan 2019 | A1 |
20190043192 | Kusens et al. | Feb 2019 | A1 |
20190122028 | Kusens et al. | Apr 2019 | A1 |
20190205630 | Kusens | Jul 2019 | A1 |
20190206218 | Kusens et al. | Jul 2019 | A1 |
Number | Date | Country |
---|---|---|
19844918 | Apr 2000 | DE |
2009018422 | Feb 2009 | WO |
2012122002 | Sep 2012 | WO |
Entry |
---|
Notice of Allowance received for U.S. Appl. No. 15/857,696, dated Jul. 16, 2019, 9 pages. |
Notice of Allowance received for U.S. Appl. No. 16/380,013, dated Jul. 10, 2019, 10 pages. |
Final Office Action received for U.S. Appl. No. 13/543,816, dated Jun. 17, 2014, 15 pages. |
Final Office Action received for U.S. Appl. No. 14/084,588, dated Dec. 19, 2014, 24 pages. |
Final Office Action received for U.S. Appl. No. 14/575,850, dated Dec. 12, 2017, 10 pages. |
Final Office Action received for U.S. Appl. No. 14/599,498, dated Oct. 12, 2017, 28 pages. |
Final Office Action received for U.S. Appl. No. 14/611,363, dated Apr. 28, 2017, 20 pages. |
Final Office Action received for U.S. Appl. No. 14/623,349, dated Oct. 4, 2017, 29 pages. |
Final Office Action received for U.S. Appl. No. 14/724,969, dated Jul. 28, 2016, 26 pages. |
Final Office Action received for U.S. Appl. No. 14/757,877, dated Sep. 29, 2017, 22 pages. |
Final Office Action received for U.S. Appl. No. 15/134,189, dated Jul. 12, 2018, 23 pages. |
Final Office Action received for U.S. Appl. No. 15/285,416, dated Aug. 23, 2017, 16 pages. |
Final Office Action received for U.S. Appl. No. 15/285,416, dated Jul. 5, 2018, 8 pages. |
Final Office Action received for U.S. Appl. No. 15/396,263, dated Oct. 18, 2017, 20 pages. |
First Action Interview Office Action received for U.S. Appl. No. 14/244,160, dated Nov. 28, 2017, 5 pages. |
Kusens, Neil, Unpublished U.S. Appl. No. 14/613,866, filed Feb. 4, 2015, titled “Method and System for Determining Whether an Individual Takes Appropriate Measures to Prevent the Spread of Healthcare Associated Infections Along With Centralized Monitoring”. |
Kusens, Neil, Unpublished U.S. Appl. No. 14/084,588, filed Nov. 19, 2013, titled “Method for Determining Whether an Individual Leaves a Prescribed Virtual Perimeter”. |
Kusens, Neil, Unpublished U.S. Appl. No. 14/575,850, filed Dec. 18, 2014, titled “Method and Process for Determining Whether an Individual Suffers a Fall Requiring Assistance”. |
Kusens, Neil, Unpublished U.S. Appl. No. 14/599,498, filed Jan. 17, 2015, titled “Method and System for Determining Whether an Individual Takes Appropriate Measures to Prevent the Spread of Healthcare Associated Infections”. |
Kusens, Neil, Unpublished U.S. Appl. No. 14/611,363, filed Feb. 2, 2015, titled “Method and System for Determining Whether an Individual Takes Appropriate Measures to Prevent the Spread of Healthcare Associated Infections”. |
Kusens, Neil, Unpublished U.S. Appl. No. 14/623,349, filed Feb. 16, 2015, titled “Method for Determining Whether an Individual Enters a Prescribed Virtual Zone Using 3D Blob Detection”. |
Kusens, Neil, Unpublished U.S. Appl. No. 13/543,816, filed Jul. 7, 2012, titled “Method and Process for Determining Whether an Individual Suffers a Fall Requiring Assistance”. |
Kusens, Neil, Unpublished U.S. Appl. No. 14/724,969, filed May 29, 2015, titled “Method and Process for Determining Whether an Individual Suffers a Fall Requiring Assistance”. |
Kusens, Neil, Unpublished U.S. Appl. No. 14/727,434, filed Jun. 1, 2015, titled “Method for Determining Whether an Individual Enters a Prescribed Virtual Zone Using Skeletal Tracking and 3D Blob Detection”. |
Kusens, Neil, Unpublished U.S. Appl. No. 14/728,762, filed Jun. 2, 2015, titled “Method for Determining Whether an Individual Leaves a Prescribed Virtual Perimeter”. |
Kusens, Neil, Unpublished U.S. Appl. No. 14/743,264, filed Jun. 18, 2015, titled “System for Determining Whether an Individual Enters a Prescribed Virtual Zone Using 3D Blob Detection”. |
Kusens, Neil, Unpublished U.S. Appl. No. 14/743,447, filed Jun. 18, 2015, titled “System for Determining Whether an Individual Suffers a Fall Requiring Assistance”. |
Kusens, Neil, Unpublished U.S. Appl. No. 14/743,499, filed Jun. 18, 2015, titled “System for Determining Whether an Individual Suffers a Fall Requiring Assistance”. |
Mooney, Tom, Rhode Island ER First to Test Google Glass on Medical Conditions, retrieved from <https://www.ems1.com/ems-products/technology/articles/1860487-Rhode-Island-ER-first-to-test-Google-Glass-on-medical-conditions/>, Mar. 11, 2014, 3 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/148,151, dated May 8, 2018, 5 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/285,416, dated Apr. 11, 2017, 13 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/285,416, dated Mar. 12, 2018, 20 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/395,250, dated May 8, 2017, 19 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/395,526, dated Apr. 27, 2017, 16 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/395,762, dated May 31, 2018, 24 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/396,263, dated Apr. 14, 2017, 18 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/628,318, dated Jun. 8, 2018, 9 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/728,110, dated May 2, 2018, 8 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/848,621, dated May 31, 2018, 23 pages. |
Non-Final Office Action received for U.S. Appl. No. 13/543,816, dated Dec. 30, 2013, 9 pages. |
Non-Final Office Action received for U.S. Appl. No. 13/543,816, dated Dec. 1, 2014, 18 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/084,588, dated Jul. 16, 2014, 12 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/339,397, dated Oct. 7, 2015, 16 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/575,850, dated Mar. 11, 2016, 10 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/599,498, dated May 31, 2017, 24 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/611,363, dated Jan. 11, 2017, 19 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/611,363, dated May 7, 2018, 6 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/623,349, dated Apr. 5, 2017, 15 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/724,969, dated Feb. 11, 2016, 14 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/727,434, dated Sep. 23, 2016, 9 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/743,499, dated May 23, 2016, 6 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/757,593, dated Apr. 21, 2017, 9 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/395,243, dated Feb. 14, 2019, 14 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/216,210, dated Feb. 13, 2019, 29 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/107,567, dated Mar. 29, 2019, 8 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/395,762, dated May 1, 2019, 27 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/856,419, dated May 2, 2019, 8 pages. |
Conaire, et al., “Fusion of Infrared and Visible Spectrum Video for Indoor Surveillance”, WIAMIS, Apr. 2005, 4 pages. |
Final Office Action received for U.S. Appl. No. 15/395,243, dated Jun. 11, 2019, 18 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/134,189, dated May 9, 2019, 30 pages. |
Pre-interview First Office Action received for U.S. Appl. No. 15/857,696, dated May 23, 2019, 14 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/757,593, dated Aug. 16, 2017, 8 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/757,877, dated Feb. 23, 2017, 24 pages. |
Notice of Allowance received for U.S. Appl. No. 13/543,816, dated Jun. 5, 2015, 5 pages. |
Notice of Allowance received for U.S. Appl. No. 14/575,850, dated Jun. 13, 2018, 5 pages. |
Notice of Allowance received for U.S. Appl. No. 14/599,498, dated Jul. 18, 2018, 6 pages. |
Notice of Allowance received for U.S. Appl. No. 14/611,363, dated Dec. 29, 2017, 11 pages. |
Notice of Allowance received for U.S. Appl. No. 14/613,866, dated Mar. 20, 2017, 11 pages. |
Notice of Allowance received for U.S. Appl. No. 14/623,349, dated Jun. 18, 2018, 11 pages. |
Notice of Allowance received for U.S. Appl. No. 14/724,969, dated Apr. 21, 2017, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 14/724,969, dated Dec. 23, 2016, 5 pages. |
Notice of Allowance received for U.S. Appl. No. 14/727,434, dated Apr. 25, 2017, 9 pages. |
Notice of Allowance received for U.S. Appl. No. 14/727,434, dated Jan. 4, 2018, 2 pages. |
Notice of Allowance received for U.S. Appl. No. 14/727,434, dated Jul. 5, 2017, 9 pages. |
Notice of Allowance received for U.S. Appl. No. 14/727,434, dated Oct. 10, 2017, 9 pages. |
Notice of Allowance received for U.S. Appl. No. 14/728,762, dated Jun. 27, 2016, 14 pages. |
Notice of Allowance received for U.S. Appl. No. 14/743,264, dated Jul. 18, 2016, 16 pages. |
Notice of Allowance received for U.S. Appl. No. 14/743,264, dated Nov. 9, 2016, 14 pages. |
Notice of Allowance received for U.S. Appl. No. 14/743,264, dated Oct. 14, 2016, 14 pages. |
Notice of Allowance received for U.S. Appl. No. 14/743,447, dated Aug. 26, 2016, 5 pages. |
Notice of Allowance received for U.S. Appl. No. 14/743,447, dated Jun. 22, 2016, 4 pages. |
Notice of Allowance received for U.S. Appl. No. 14/743,447, dated May 31, 2016, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 14/743,447, dated Nov. 14, 2016, 5 pages. |
Notice of Allowance received for U.S. Appl. No. 14/743,499, dated Sep. 19, 2016, 5 pages. |
Notice of Allowance received for U.S. Appl. No. 14/757,593, dated Jun. 4, 2018, 5 pages. |
Notice of Allowance received for U.S. Appl. No. 15/279,054, dated Nov. 27, 2017, 2 pages. |
Notice of Allowance received for U.S. Appl. No. 15/279,054, dated Oct. 20, 2017, 13 pages. |
Notice of Allowance received for U.S. Appl. No. 15/395,250, dated Sep. 26, 2017, 13 pages. |
Notice of Allowance received for U.S. Appl. No. 15/395,526, dated Sep. 21, 2017, 13 pages. |
Notice of Allowance received for U.S. Appl. No. 15/395,716, dated Apr. 19, 2017, 5 pages. |
Notice of Allowance received for U.S. Appl. No. 15/395,716, dated Dec. 6, 2017, 5 pages. |
Notice of Allowance received for U.S. Appl. No. 15/395,716, dated May 9, 2018, 5 pages. |
Notice of Allowance received for U.S. Appl. No. 15/396,263, dated Jul. 13, 2018, 9 pages. |
Notice of Allowance received for U.S. Appl. No. 15/728,110, dated Jul. 23, 2018, 15 pages. |
Notice of Allowance received for U.S. Appl. No. 15/728,110, dated Sep. 21, 2018, 2 pages. |
Notice of Allowance received for U.S. Appl. No. 15/395,716, dated Jun. 19, 2018, 2 pages. |
Notice of Allowance received for U.S. Appl. No. 15/395,716, dated Jul. 24, 2017, 5 pages. |
Pre-interview First Office Action received for U.S. Appl. No. 15/910,645, dated May 21, 2018, 14 pages. |
Pre-interview First Office Action received for U.S. Appl. No. 15/395,716, dated Feb. 24, 2017, 5 pages. |
Pre-interview First Office Action received for U.S. Appl. No. 15/134,189, dated Nov. 22, 2017, 5 pages. |
Raheja, et al., Human Facial Expression Detection From Detected in Captured Image Using Back Propagation Neural Network, International Journal of Computer Science and Information Technology (IJCSIT), vol. 2, No. 1, Feb. 2010, 7 pages. |
Virtual Patient Observation: Centralize Monitoring of High-Risk Patients with Video—Cisco Video Surveillance Manager, Retrieved from <https://www.cisco.com/c/en/us/products/collateral/physical-security/video-surveillance-manager/whitepaper_11-715263.pdf>. |
Number | Date | Country | |
---|---|---|---|
20190057592 A1 | Feb 2019 | US |
Relation | Number | Date | Country
---|---|---|---
Parent | 15728110 | Oct 2017 | US
Child | 16166857 | | US
Parent | 14727434 | Jun 2015 | US
Child | 15728110 | | US