Systems and methods for determining whether an individual enters a prescribed virtual zone using skeletal tracking and 3D blob detection

Abstract
A method and system that allows healthcare providers to monitor disabled, elderly, or other high-risk individuals to prevent or reduce falls and/or mitigate the impact of a fall by delivering automated notification of “at risk” behavior and falls by such an individual. Two systems are used to identify patients: a skeletal tracking system that identifies patients by biometric indicators, and a virtual blob detection system. In the virtual blob detection system, the monitored individual is virtually represented as a blob object of at least a specific size by a computerized monitoring system, and that system detects and alerts when the blob object enters or crosses into a virtually defined or designated blob detection zone and remains in the zone for at least a predetermined period of time. These systems may be used concurrently, and conflicts between the systems may be resolved.
Description
FIELD OF THE DISCLOSURE

The present disclosure is generally directed to patient monitoring systems and particularly to a system and method for monitoring patients in a manner which prevents or reduces patient falls.


BACKGROUND

According to recent studies, falls are a leading cause of death among people over the age of 65 years, and 10% of the fatal falls for patients over 65 years of age occur in a hospital setting. For the general population, studies estimate that patient falls occur in 1.9 to 3% of all acute care hospitalizations. Of these hospital-based falls, approximately 30% will result in a serious injury, with the cost to care for these injuries estimated to reach $54.9 billion per year by 2020. Current technologies that exist to assist in the prevention of falls are limited in their capabilities. These include pressure pads on the bed that trigger an alert when no pressure is detected on the pad, pressure pads on the floor, and light beams that create a perimeter, with alarms going off upon interruption of the beam. The pressure pads are ineffective as they do not prevent the fall but, rather, alert after the fact, when it is too late. Additionally, they are prone to false positive alerts. The light beams are also prone to false alerts when the patient or visitor simply reaches through the beam or the caregiver breaks the beam while delivering medication, food, or drink or conducting a procedure on the patient. The present disclosure is directed to addressing these above-described shortcomings of current technology.


SUMMARY OF THE DISCLOSURE

Generally disclosed is a novel method and system that allows healthcare providers, hospitals, skilled nursing facilities, and other persons to monitor disabled, elderly, or other high-risk individuals in order to prevent or reduce falls and/or mitigate the impact of a fall by delivering automated notification of “at risk” behavior and/or falls by the individual being monitored, especially falls and/or behavior where assistance is required, using a skeletal tracking system and a virtual blob detection system.


With skeletal tracking alone, there can be factors affecting camera/image-video quality that affect the ability of the detection/monitoring system to detect a skeleton. Such factors, especially in a hospital, include, but are not limited to, sheets/blankets covering a patient, trays positioned over the bed hiding the patient, and the patient blending into the bed such that no skeleton is recognized.


With blob detection alone, there can be an increase in false positives in detecting falls and “at risk” behavior. These false positives can occur because blob detection does not differentiate between types of 3D objects. Blankets, trays, caretakers, or other 3D objects can trigger an automated notification. Blob recognition also does not differentiate between parts of the body.


The present disclosure's use of a skeletal tracking system together with a virtual blob detection system addresses, or at least reduces, the shortcomings of both systems. Skeletal tracking can be used to reduce or eliminate false positives generated by a virtual blob detection system. Virtual blob detection relies on 3D object detection, which works regardless of how much of the person is viewable by the camera or whether other objects are blocking the camera's view. Even in poor lighting conditions, both the virtual blob detection system and the skeletal tracking system can still capture and/or recognize movement, as the system can use an IR Depth Map to perform the blob and/or skeletal detection, which does not rely on lighting conditions.


The present disclosure uses both a skeletal tracking system and a virtual blob detection system to track whether an individual has fallen or engaged in “at risk” behavior. When the skeletal tracking system is unable to track a skeleton, then a virtual blob detection system is used to capture and/or recognize movement. In the alternative, both a skeletal tracking system and a blob detection system can monitor an individual simultaneously, and a notification is delivered when either system detects a fall or “at risk” behavior.


The following non-limiting definitions are provided as an aid in understanding the disclosed novel method and system:

3D Camera, Motion, and Sound Sensor: An electronic device that contains one or more cameras capable of identifying individual objects, people, and motion regardless of lighting conditions, as well as one or more microphones to detect audio. The cameras can utilize technologies including, but not limited to, color RGB, CMOS sensors, infrared projectors, and RF-modulated light. They may also contain microprocessors and image sensors to detect and process information both sent out and received by the various cameras. The electronic device calculates if there has been a change in location of the person or object of interest over a period of time. As a non-limiting example, an object can be, at time T1, located at coordinates (x1, y1, z1) in a picture frame taken by the camera. At time T2, the object is captured by the picture frame taken by the camera at coordinates (x2, y2, z2). Based on this information, motion, speed, and direction can be derived by comparing the two 3D coordinates over the elapsed time. As opposed to conventional motion sensors, which use captured motion to control a camera, the 3D camera, motion, and sound sensor used in accordance with the present disclosure uses the camera to compute the motion as well as the size of the object. The camera/sensors are preferably on at all times while the monitoring is occurring, regardless of whether the person or object of interest is moving or not. The object size (minimum and/or maximum) can be configured through the software running, operating, and/or controlling the computerized virtual blob detection monitoring system. A 3D camera, motion, and sound sensor can additionally be programmed to lock on a person and can send back to the computerized monitoring system the 3D coordinates of the joints in the person's body and a skeletal outline of the person. As a non-limiting example, a person's right arm can be, at time T1, located at coordinates (x1, y1, z1) in a picture frame taken by the camera. At time T2, the right arm is captured by the picture frame taken by the camera at coordinates (x2, y2, z2). Based on this information, motion, speed, and direction can again be derived by comparing the two 3D coordinates over the elapsed time. The camera preferably views the entire bed of a patient, or a large portion of the bed or other area where the patient is resting (i.e., chair, couch, etc.), simply by being placed in a manner sufficient for the monitored area to be visible to the camera. Thus, the camera does not require any triggering event to cause it to begin recording video and/or 3D depth data or transmitting video and/or 3D depth data to the other components of the system for analysis. Because the video camera may be recording or otherwise transmitting video and/or 3D depth data to the other system components at all times during monitoring, the electronic device is able to immediately track, capture, and/or record the monitored individual's movements at all times within the room or monitored area and will be able to provide information as to whether and when the individual begins to move or begins to get up to move. Preferably, the 3D camera, motion, and sound sensor records, captures, and/or streams video and/or 3D depth data. As video is technically made up of individual picture frames (i.e., 30 frames per second of video), the above references to picture frames refer to frames of video. Whether used with skeletal tracking or virtual blob detection, depth sensitivity comes into play with both methods in order to minimize false alarms, as objects behind and in front of the patient can be effectively ignored. The preferred use of depth as a factor also differentiates the current monitoring system from motion/object detection systems that rely on 2D images. The 3D camera, motion, and sound sensor can be located within the room of the patient being monitored and/or potentially just outside of the patient's room. It is connected to the computerized communication and computerized monitoring systems via a data connection (TCP/IP or comparable technology).

Computerized Virtual Blob Detection Monitoring System: A computer system specifically designed and programmed to create virtual blob detection zones around a specific object, including, but not limited to, a hospital bed, and that monitors activity based on information received from the 3D camera, motion, and sound sensor. The computerized monitoring system will preferably be located within the patient's room and can be connected to the centralized monitoring station at the facility, but can also be located at any physical location so long as a data connection (TCP/IP or comparable technology) exists between the computerized virtual blob detection monitoring system, the computerized communication system, the centralized monitoring station, and the 3D camera, motion, and sound sensor.

Computerized Communication System: A computer system specifically designed and programmed to facilitate communication between the monitored patient and the computerized monitoring system in the event that either an object meeting the preprogrammed or preconfigured size for a triggering object enters the virtual blob detection zone or the computerized skeletal tracking system determines the patient has fallen or performed an “at risk” behavior. This system may include, but is not limited to, amplified speakers, microphones, lights, monitors, computer terminals, mobile phones, and/or other technologies to allow the electronic communication to take place. The computerized communication system will preferably be located within the room of the patient being monitored, but certain components of the system are mobile by their nature (i.e., mobile phones, pagers, computers) and can also be located at any location so long as a data connection (TCP/IP or comparable technology) exists between the computerized monitoring system, the computerized communication system, the centralized monitoring station, and the 3D camera, motion, and sound sensor.

Computerized Skeletal Tracking System: A computer system specifically designed and programmed to lock onto an individual and send the 3D coordinates of the joints in the individual's body and a skeletal outline of the individual based on information received from the 3D camera, motion, and sound sensor. The computerized skeletal tracking system will preferably be located within the patient's room and can be connected to the centralized monitoring station at the facility, but can also be located at any physical location so long as a data connection (TCP/IP or comparable technology) exists between the computerized skeletal tracking system, the computerized communication system, the centralized monitoring station, and the 3D camera, motion, and sound sensor.

System Database: A computer database that stores records, documents, or other files of all alerts generated, notifications, confirmation requests, responses, and reconfirmation requests, or any other desired information concerning a triggering event or lack of a triggering event.

Centralized Monitoring Primary Display: A computer display connected to the centralized monitoring station, providing video and audio of all patient rooms connected to the centralized monitoring station.

Centralized Monitoring Alert Display: A computer display connected to the centralized monitoring station, providing video and audio of any patient room where an object (such as an individual) is deemed to have entered a virtual blob detection zone, fallen, or performed an “at risk” behavior, preferably at the moment such determination is made.

Caregiver: A relative, friend, individual, company, or facility whose purpose is to provide assistance in daily living activities for individuals who are disabled, elderly, or otherwise in need of assistance.
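As a purely illustrative sketch (not part of the disclosed system), the derivation of motion, speed, and direction from two 3D coordinate samples described above could be expressed as follows; the function name, variable names, and units are assumptions made for illustration only.

```python
import math

def derive_motion(p1, p2, t1, t2):
    """Derive displacement, speed, and direction from two 3D samples.

    p1 and p2 are (x, y, z) coordinates (e.g., millimeters) captured at
    times t1 and t2 (seconds), mirroring the T1/T2 example above.
    Returns total displacement, speed, and a unit direction vector.
    """
    dx, dy, dz = (p2[0] - p1[0], p2[1] - p1[1], p2[2] - p1[2])
    displacement = math.sqrt(dx * dx + dy * dy + dz * dz)
    elapsed = t2 - t1
    speed = displacement / elapsed if elapsed > 0 else 0.0
    direction = (
        (dx / displacement, dy / displacement, dz / displacement)
        if displacement > 0 else (0.0, 0.0, 0.0)
    )
    return displacement, speed, direction

# Example: an object moves from (100, 200, 1500) at T1 to (160, 200, 1480) at T2 (0.5 s later).
print(derive_motion((100, 200, 1500), (160, 200, 1480), 0.0, 0.5))
```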












BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a block diagram of a computerized skeletal tracking, monitoring and alerting system and method in accordance with an embodiment of the present invention;



FIG. 2 is a block diagram of a virtual blob detection zone configuration, monitoring and alerting system and method in accordance with an embodiment of the present invention;



FIG. 3 is a block diagram of the centralizing monitoring and alerting system in accordance with an embodiment of the disclosure;



FIG. 4 is a block diagram of one embodiment illustrating how the computerized skeletal tracking system and virtual blob detection systems can be used separately and/or concurrently in accordance with an embodiment of the present invention;



FIGS. 5 through 17 illustrate various screen shots for configuring the system for operation, including defining a bed zone, virtual blob detection zone(s), and alert types in accordance with an embodiment of the present disclosure;



FIG. 18 is a non-limiting example of a centralized video monitoring system that can be used with the systems and method shown in FIGS. 1 through 3 in accordance with an embodiment of the present disclosure; and



FIG. 19 is a non-limiting example illustrating how to configure the operational mode of the system in accordance with an embodiment of the present disclosure.





DETAILED DESCRIPTION


FIG. 1 illustrates a block diagram for the computerized skeletal tracking configuration, monitoring, and alerting system and a method of the disclosed system. Specifically, FIG. 1 shows the workflow for monitoring an individual's status through the use of one or more 3D camera, motion, and sound sensors.


At step F1a, one or more 3D camera, motion and/or sound sensors can be installed in the patient's or individual's room. At step F1b, the one or more 3D camera, motion and/or sound sensors can be configured to recognize the patient or monitored individual using biometric identifiers such as facial recognition, height, distance between points on the body, etc. The patient's body can be recognized and tracked as one or more skeletal components. Alternatively, the patient can be identified by means of a user creating a three-dimensional zone around the patient through a software application stored on a computerized skeletal tracking system or through the use of an electronic transmitter on the patient's or other individual's person. Once a patient is identified, the software can automatically generate or allow the user to generate a configurable three-dimensional zone or perimeter around the patient and/or the patient's bed that acts as a virtual barrier. At step F1c, data from the 3D camera, motion and/or sound sensors can be continuously sent to a computerized skeletal tracking system, preferably at all times while the system is being used for monitoring. At step F1d, a continuous video feed can be sent to the central monitoring primary display, preferably at all times while the system is being used for monitoring.


At step F1e, if the computerized skeletal tracking system does not detect a patient's skeleton because, for non-limiting example, the patient is covered with a sheet, blanket, or tray, is obscured by a caretaker or other individual, or for another reason, then the computerized skeletal tracking system continues to try to identify the patient until the obstruction is removed or the issue is resolved. In some embodiments, if the computerized skeletal tracking system does not detect a patient's skeleton within a preprogrammed, configurable length of time, then the computerized virtual blob detection monitoring system is activated and begins to monitor the patient (see e.g., FIGS. 2 and 4). In some embodiments, the computerized virtual blob detection monitoring system runs concurrently with the computerized skeletal tracking system. This preprogrammed, configurable length of time can be created or selected by the user and/or can be programmed through a software application stored on the computerized skeletal tracking system.
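The timeout-based fallback described in this step might be sketched as follows; this is an assumed, simplified illustration only, and the timeout value, function names, and system objects are not defined by the disclosure.

```python
import time

SKELETON_TIMEOUT_SECONDS = 10.0  # preprogrammed, configurable length of time (assumed value)

def monitor_with_fallback(skeletal_system, blob_system, sensor):
    """Try skeletal tracking; activate blob detection if no skeleton is
    found within the configured timeout (hypothetical interfaces)."""
    started = time.monotonic()
    while True:
        frame = sensor.next_frame()                 # assumed sensor API
        if skeletal_system.detect_skeleton(frame):  # assumed API
            return "skeletal"                       # proceed as in FIG. 1, step F1f onward
        if time.monotonic() - started >= SKELETON_TIMEOUT_SECONDS:
            blob_system.activate()                  # assumed API; see the FIG. 2 workflow
            return "blob"
```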


At step F1f, if the computerized skeletal tracking system detects that the patient or any part of the patient crosses the virtual barrier around the patient and/or the patient's bed, the skeletal tracking system will alert the computerized communication system. A record can also be entered in a database to record the incident if other individuals, such as a caregiver, are also detected within the monitored room at the time the virtual barrier is crossed. The system can be designed or programmed such that no alert is generated when another individual is detected, and it will continue to monitor the data being sent from the 3D camera, motion, and sound sensor. In this situation, generating an alarm/alert could result in a false alarm because there are other individual(s) with the patient, and such person(s) may be responsible for monitoring the patient and/or (even if not responsible) can assist the patient who is falling. Additionally, the person in the room should be in a better position to assist the patient as compared to the individual located at the central monitoring station. It is also within the scope of the disclosure to send alarms/alerts even if other individual(s) are in the room with the patient, as those individuals may not be the person responsible, may be elderly, may have a physical handicap preventing them from stopping a patient from falling, etc.
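One way to read the crossing check and caregiver suppression in this step is sketched below; the zone representation, joint format, and suppression flag are illustrative assumptions and are not specified by the disclosure.

```python
def joint_in_zone(joint, zone):
    """Axis-aligned 3D zone check (assumed representation).
    joint: (x, y, z); zone: dict with min/max bounds per axis."""
    x, y, z = joint
    return (zone["x_min"] <= x <= zone["x_max"]
            and zone["y_min"] <= y <= zone["y_max"]
            and zone["z_min"] <= z <= zone["z_max"])

def check_barrier_crossing(skeleton_joints, barrier_zone, others_in_room,
                           suppress_when_assisted=True):
    """Return True if an alert should be raised for a barrier crossing.
    Suppresses the alert when another individual (e.g., a caregiver)
    is detected in the room and suppression is configured."""
    crossed = any(joint_in_zone(j, barrier_zone) for j in skeleton_joints)
    if crossed and others_in_room and suppress_when_assisted:
        return False  # record the incident, but do not alert
    return crossed
```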


At step F1g, the computerized communication system preferably can first issue a verbal warning to the patient that they have crossed the virtual barrier. This verbal warning can be a pre-recorded message, including, but not limited to, a pre-recorded message from any caregiver, and will advise the patient to exit the virtual barrier and return to their previous position. At step F1h, should the patient fail to exit the virtual barrier and return to their previous position in a timely manner, an alert can be generated on the central monitoring alert display system (see e.g., FIG. 3). The system database can also be updated to reflect actions taken. The system can be designed to provide visual and/or audio alerts.
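A sketch of the warning-then-escalate sequence in steps F1g and F1h follows; the grace period, interface names, and status check are assumed for illustration and are not part of the disclosed implementation.

```python
import time

WARNING_GRACE_SECONDS = 15.0  # time allowed to return before escalating (assumed value)

def warn_then_escalate(communication_system, alert_display, database, patient_state):
    """Issue a verbal warning; if the patient has not returned inside the
    virtual barrier within the grace period, raise a central alert and
    record the actions taken (hypothetical interfaces)."""
    communication_system.play_recorded_warning()          # step F1g: pre-recorded verbal warning
    deadline = time.monotonic() + WARNING_GRACE_SECONDS
    while time.monotonic() < deadline:
        if patient_state.back_inside_barrier():           # assumed status check
            database.record("warning_resolved")
            return False
        time.sleep(0.5)
    alert_display.raise_alert("patient outside virtual barrier")  # step F1h
    database.record("alert_generated")
    return True
```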


At step F1i, the computerized communication system can notify caregivers or other designated persons that the individual being monitored requires assistance. Notification of caregivers can be made through phone call, text messaging, speakerphone systems, pagers, email, or other electronic means of communication if so desired and configured. At step F1j, if the patient exits the virtual zone (i.e., crosses the virtual barrier), the system database can be updated to reflect such. Additionally, the system will continue to monitor the patient and store all data in the system database.



FIG. 2 illustrates a block diagram for the virtual blob detection zone configuration, monitoring, and alerting system and method of the disclosed system and method. Specifically, FIG. 2 shows the workflow for monitoring an individual's status through the use of one or more 3D camera, motion, and sound sensors.


At step F2a, one or more 3D camera, motion and/or sound sensors can be installed in the patient's or individual's room. At step F2b, the one or more 3D camera, motion, and sound sensors can be configured to recognize the area being monitored using three-dimensional areas as defined by x, y, and z coordinates in relation to the 3D camera, motion and/or sound sensor. Based on the data sent/captured by the 3D camera, motion and/or sound sensor(s), the computerized virtual blob detection monitoring system is programmed to recognize any 3D object within the configured area. The patient's body is recognized and tracked as one or more blobs. Virtual blob detection zones can also be calibrated at this time. At step F2c, data from the 3D camera, motion, and sound sensors can be continuously sent to the computerized virtual blob detection monitoring system, preferably at all times while the system is being used for monitoring. At step F2d, a continuous video feed can be sent to the central monitoring primary display, preferably at all times while the system is being used for monitoring.


At step F2e, if the computerized virtual blob detection monitoring system does not detect that the patient or any part of the patient (i.e. presented as a blob object(s)) has crossed into the designated virtual blob detection zone, it will continue monitoring. As a non-limiting example, if both hands of the patient enter the blob detection zone, the system may display and/or track as two different blobs or possibly as a single blob depending on how close the hands are to each other. If the computerized virtual blob detection monitoring system detects that the patient or any part of the patient has crossed into the designated virtual blob detection zone, it will then proceed to step F2f to determine how large the portion of the patient's body that entered the blob detection zone is. If the size of the patient's body that entered the blob detection zone is less than the configured minimum size, it will continue to monitor. Configuration is preferably through the detection system's programmed software and may be similar to how the zones, trip wires, etc. are configured. However, if the size of the patient's body that is within the blob detection zone is above the minimum predetermined or preprogrammed threshold for the object size, it can then proceed to step F2g. At step F2g, the system determines how long the patient's body or part of the patient's body has remained within the blob detection zone. If the patient's body or part of the body has not remained in the detection zone for greater than a configured amount of time, preferably no alert is generated and the system continues to monitor the individual. However, the system can also be programmed to issue/generate an alert based solely on the system detecting a large enough blob within the detection zone for any period of time and such is also considered within the scope of the disclosure. However, if at step F2g, the patient's body has remained within the blob detection zone for greater than the minimum configured time period, the monitoring system will alert the computerized communication system and can also enter a record of the incident in a database. If other individuals, such as a caregiver, are also detected within the monitored room at the time the virtual blob detection zone threshold is crossed, the system can be designed or programmed such that no alert is generated, and it will continue to monitor the data being sent from the 3D camera, motion, and sound sensor. In this situation, generating an alarm/alert could result in a false alarm because there are other individual(s) with the patient, and such person(s) may be responsible for monitoring the patient and/or (even if not responsible) can assist the patient who is falling. The person in the room may be in a better position to assist the patient as compared to the individual located at the central monitoring station. It is also within the scope of the disclosure to send alarm/alerts even if other individual(s) are in the room with the patient, as those individuals may not be the person responsible, may be elderly, may have a physical handicap preventing them from stopping a patient from falling, etc.
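The size-and-dwell-time logic of steps F2e through F2g could look roughly like the following sketch; the blob attributes, threshold values, and clock handling are illustrative assumptions rather than the disclosed implementation.

```python
import time

MIN_BLOB_SIZE = 2500      # configured minimum object size (assumed units)
MIN_DWELL_SECONDS = 3.0   # configured minimum time within the zone (assumed value)

def evaluate_blob_zone(blobs_in_zone, entry_times, now=None):
    """Return True when any blob in the detection zone is at least the
    configured minimum size and has remained there for the configured
    minimum time. Hypothetical data shapes: each blob carries an 'id' and
    'size'; entry_times maps blob id -> time the blob entered the zone."""
    now = time.monotonic() if now is None else now
    for blob in blobs_in_zone:
        if blob["size"] < MIN_BLOB_SIZE:
            continue                              # step F2f: too small, keep monitoring
        dwell = now - entry_times.get(blob["id"], now)
        if dwell >= MIN_DWELL_SECONDS:            # step F2g: dwell long enough
            return True                           # alert the computerized communication system
    return False
```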


At step F2h, the computerized communication system preferably can first issue a verbal warning to the patient that he or she has entered the virtual blob detection zone. This verbal warning can be a pre-recorded message, including, but not limited to, a pre-recorded message from any caregiver, and will advise the patient to exit the virtual blob detection zone and return to his or her previous position. At step F2i, should the patient fail to exit the virtual blob detection zone and return to his or her previous position in a timely manner, an alert can be generated on the central monitoring alert display system (see e.g., FIG. 3). The system database can also be updated to reflect actions taken. The system can be designed to provide visual and/or audio alerts.


At step F2j, the computerized communication system can notify caregivers or other designated persons that the individual requires assistance. Notification of caregivers can be made through phone call, text messaging, speakerphone systems, pagers, email, or other electronic means of communication if so desired and configured. At step F2k, if the patient exits the virtual blob detection zone, the system database can be updated to reflect such. Additionally, the system will continue to monitor the patient and store all data in the system database.



FIG. 3 illustrates a block diagram for centralized monitoring and alerting and shows the workflow for centralized monitoring and alerting of the central monitoring system regarding whether an individual has entered a virtual blob detection zone through the use of 3D camera, motion, and sound sensors. At step F3a, one or more 3D camera, motion, and sound sensors are installed in and/or just outside an individual's room, home, hospital room, or other place of temporary or permanent residence and are connected to the computerized monitoring and communication systems as described in FIGS. 1 and 2. The video, audio, and alert data can be sent to a centralized monitoring station where the data is aggregated. Preferably, the centralized monitoring station receives data at all times from the sensors to allow the various individuals to be constantly monitored at the centralized station regardless of whether or not an individual has entered a virtual blob detection zone.


At step F3b, all video, audio and alert feeds received by the centralized monitoring station can be displayed on the centralized monitoring primary display. Alternatively, multiple centralized monitoring primary displays can be utilized based on the quantity of rooms to be monitored at a given time. At step F3c, when the centralized monitoring system receives an alert from any of the computerized monitoring and communication systems indicating that an individual in any of the monitored rooms or other locations has fallen or otherwise entered into a virtual detection zone, the video, audio, and alert information for the specific room and/or individual is displayed on the centralized monitoring alert display. Should the centralized monitoring station receive alerts from more than one of the computerized monitoring and communication systems indicating that an individual in a monitored room or location has entered a virtual barrier or virtual blob detection zone, the centralized monitoring alert display may display the video, audio, and alerting information from all such instances preferably at the same time. If no alert is received by the centralized monitoring station, preferably nothing is displayed on the centralized monitoring alert display. At step F3d, an electronic record of any alerts received by the centralized monitoring station can be stored in an electronic database, which is in communication with the centralized monitoring station.
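A minimal sketch of the routing behavior described for the centralized station follows; the room, feed, display, and log objects are hypothetical interfaces assumed for illustration.

```python
def route_feeds(rooms, primary_display, alert_display, alert_log):
    """Show all monitored rooms on the primary display; show only rooms
    with an active alert on the alert display, and log each alert."""
    for room in rooms:
        primary_display.show(room.feed)           # step F3b: everything on the primary display
        if room.has_alert():                      # step F3c: alerting rooms go to the alert display
            alert_display.show(room.feed)
            alert_log.append({"room": room.id, "alert": room.alert_info()})  # step F3d
```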



FIG. 4 illustrates a block diagram for how a computerized skeletal tracking system and a virtual blob detection monitoring system can be used concurrently or independently in the disclosed system and method. At step F4a, one or more 3D camera, motion, and sound sensors are installed in and/or just outside an individual's room, home, hospital room, or other place of temporary or permanent residence and are connected to the computerized monitoring and communication systems as described in FIGS. 1, 2 and 3. The video, audio and/or alert data can be sent to a centralized monitoring station where the data is aggregated. Preferably, the centralized monitoring station receives data at all times from the sensors to allow the various individuals to be constantly monitored at the centralized monitoring station regardless of whether or not an individual has entered a virtual barrier or virtual blob detection zone. At step F4b, in some embodiments, both the computerized skeletal tracking system and the computerized virtual blob detection monitoring system operate independently. In these embodiments, the computerized skeletal tracking system performs as described in FIG. 1, and the computerized virtual blob detection monitoring system performs concurrently as described in FIG. 2. In these embodiments, it is possible that information between the two systems may conflict. As a non-limiting example, if a hospital food tray is passed to a patient and remains within the blob detection zone for longer than the configured minimum time, the computerized virtual blob detection system may register an alert, while the computerized skeletal tracking system may not. The user can select which system (if any) would take priority through a software application stored on the computerized skeletal tracking system and/or the computerized virtual blob detection monitoring system. Also, an alert can be generated only when both systems independently generate an alert, or in the alternative, an alert may be generated in the event either system generates an alert. In other embodiments, the computerized virtual blob detection monitoring system is only used when the computerized skeletal tracking system fails to identify a patient or skeleton.
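The conflict-resolution options described here (priority to one system, require both, or accept either) could be expressed as a simple policy function; the mode names below are assumptions made for illustration only.

```python
def resolve_alert(skeletal_alert, blob_alert, mode="prefer_skeletal"):
    """Combine the two systems' determinations according to a configured mode.

    skeletal_alert / blob_alert: booleans from the two systems.
    mode: 'prefer_skeletal', 'prefer_blob', 'both', or 'either' (assumed names).
    """
    if mode == "prefer_skeletal":
        return skeletal_alert
    if mode == "prefer_blob":
        return blob_alert
    if mode == "both":
        return skeletal_alert and blob_alert   # alert only when both systems agree
    if mode == "either":
        return skeletal_alert or blob_alert    # alert when either system fires
    raise ValueError(f"unknown conflict-resolution mode: {mode}")

# Non-limiting example mirroring the food-tray scenario: blob alerts, skeletal does not.
assert resolve_alert(False, True, mode="prefer_skeletal") is False
```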


Additionally, the functions of the computerized virtual blob detection monitoring system and the computerized skeletal tracking system can be performed in practice by a single system. In these embodiments, the disclosure performs the same processes described in FIGS. 1-4, but a single combined system replaces the computerized skeletal tracking system and computerized virtual blob detection monitoring system.


At step F4c, in some embodiments, if the computerized skeletal tracking system detects a skeleton, then the method proceeds to step F1e of FIG. 1. If the computerized skeletal tracking system does not detect a skeleton within a preprogrammed, configurable length of time, then the computerized virtual blob detection monitoring system can be activated. This preprogrammed, configurable length of time can be created by the user through a software application stored on the computerized skeletal tracking system. In some embodiments, while the computerized virtual blob detection monitoring system is in use, the computerized skeletal tracking system may continue to try to detect a patient. If a patient is detected, the process proceeds to step F1e of FIG. 1. At step F4d, the process then proceeds to step F2e of FIG. 2.



FIGS. 5 through 17 illustrate several setup screen shots for configuring the bed zone, virtual rails (trip wires), virtual blob detection zones, and alert types. In FIG. 5, the bed zone, virtual trip wires, and virtual blob detection zones can be configured for a given or specific 3D camera, motion, and sound sensor. To begin configuration, the user can hover over the 3D camera, motion, and sound sensor video window with the cursor, right-click, select the plug-in, and then select configure plug-in. A window will pop up showing the 3D camera, motion, and sound sensor's feed (see e.g., FIG. 6). The user selects the icon for the type of zone or rail the user wishes to draw, which, as a non-limiting example and for illustrative purposes, can be a bed zone, virtual rail (trip wires), or virtual blob detection zone(s) (see e.g., FIG. 7).


As non-limiting examples, the icons that appear on the screen for selection can include the following symbols shown in FIG. 7. In this non-limiting example, in no particular order, some of the icons include Bed Zone, Auto Bed Zone (Select Patient), Auto Bed Zone (Auto-select), Saved Zones, Virtual Rail (Trip Wires), Virtual Blob Detection Zone and Clear All.


As seen in FIG. 6, to place a zone, the user clicks on the screen where he or she would like to start the zone. Then, the cursor is moved to the corner point for the zone and clicked again. The user continues to select additional points until the zone is drawn to the user's satisfaction. Preferably, the user clicks on the round end point of the beginning of the zone to complete the zone (see e.g., FIG. 6). When the zone has been completed, the zone can appear, and a depth range box (i.e., a square, rectangle, etc. disposed over the patient on the screen) can be provided on the screen, such as, but not limited to, in the middle of the screen or zone (see e.g., FIG. 8), though any location on the screen is considered within the scope of the invention. Placing a virtual rail is done with a similar process, wherein the user clicks on the screen where he or she would like to start the rail. Then, the cursor is moved to the end point for the rail, and the user clicks on the screen again to place the rail. As seen in FIG. 10, upon completion of this process, the zone and/or rail(s) appear and have a depth range box, preferably in the middle.
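Because a zone is drawn as a sequence of clicked corner points, testing whether a tracked point falls inside it amounts to a point-in-polygon check in the image plane (with depth handled separately, as described below). The following is an illustrative sketch of such a check, not the disclosed implementation.

```python
def point_in_polygon(x, y, polygon):
    """Ray-casting test: returns True if (x, y) lies inside the polygon,
    given as a list of (x, y) corner points in drawing order."""
    inside = False
    n = len(polygon)
    j = n - 1
    for i in range(n):
        xi, yi = polygon[i]
        xj, yj = polygon[j]
        if (yi > y) != (yj > y):
            x_cross = (xj - xi) * (y - yi) / (yj - yi) + xi
            if x < x_cross:
                inside = not inside
        j = i
    return inside

# Example: a rectangular zone drawn from four clicked corner points.
zone = [(100, 100), (400, 100), (400, 300), (100, 300)]
print(point_in_polygon(250, 200, zone))  # True
```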


As seen in FIG. 9, the user can adjust the depth range for any given zone or rail. Preferably by double clicking on the depth range box, or by other conventional selection methods, the user can cause an Edit Depth window to appear. The user can enter the depth ranges (preferably in millimeters (mm), though this is not considered limiting). Additionally, the user can enter minimum and maximum 3D object sizes (preferably in square root pixels, though this is not considered limiting) to detect entering the virtual blob detection zones, as well as a tolerance for the size change of an object. The user can click the Save button or icon when done to store the entered values.
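A configuration record matching the fields described for the Edit Depth window (depth range, minimum/maximum object size, size-change tolerance) might be sketched as below; the field names, units, and default values are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass
class ZoneDepthConfig:
    """Per-zone detection settings as described for the Edit Depth window."""
    depth_min_mm: int = 800             # nearest depth, in millimeters (assumed default)
    depth_max_mm: int = 2200            # farthest depth, in millimeters (assumed default)
    min_object_size: int = 1500         # minimum 3D object size to detect (assumed units)
    max_object_size: int = 60000        # maximum 3D object size to detect (assumed units)
    size_change_tolerance: float = 0.2  # tolerated fractional change in object size (assumed)

# Example: saving custom depth bounds for a particular zone.
config = ZoneDepthConfig(depth_min_mm=900, depth_max_mm=2000)
print(config)
```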


If there are any other types of zones or rails to draw for the particular sensor, the above steps are repeated to place the next zone or rail and the depth setting can be adjusted for each if necessary. Additionally, all zones and rails can be cleared by clicking on or otherwise selecting the Clear All icon in the toolbar. Once all of the zones/rails are configured, the user can close the window to finish, or the user may have the option to save the zone/rail configuration for later use.


As seen in FIG. 11, to access the main settings window, the user can click or otherwise select the Settings menu and then select Main Settings from the drop-down list. As one non-limiting alternative, the user can click on the Gear icon or other designated icon in the toolbar to access the main settings window.


As seen in FIG. 12, for one non-limiting way to configure a new Alert, the user can select the Alerts tab and then click on or otherwise select the Add button, which can result in the Configure Alert box appearing on the screen (see e.g., FIG. 13). As seen in FIG. 13, under the Event field, the user can then select, from the drop-down list, the event on which the user wishes to send an alert.


As seen in FIG. 13, once the Event type is selected, under the Action field, the user can select the Action he or she wishes to have the system perform when the selected Event is detected. Once the Event and Action have been selected, the OK button (see e.g., FIG. 15) can be selected to save the selected entries.


For certain Actions, an additional field may need to be completed to finish the Action. If the field is required, it can appear below the Action dropdown (see e.g., FIG. 16). If no further fields are required, the Configure Alert box can display N/A (see e.g., FIG. 15) or just be blank. As mentioned above, once all settings are selected, the user clicks or otherwise selects the OK button, which causes the new Alert to be listed in the Alerts tab window. To edit an existing Alert, the user first clicks on or otherwise selects the Alert and then selects the Edit button (see e.g., FIG. 17). To delete an Alert, the user first highlights it and then clicks on the Delete button (see e.g., FIG. 17).


To add more Alerts, the user clicks or selects the Add button and repeats the above described steps. Once finished, the user clicks on or otherwise selects the bottom corner OK button to save and close the window.



FIG. 18 shows a non-limiting example of a centralized video monitoring system that can be used with the system and method. The window highlighted in red is a non-limiting example of an alert that can be generated when the patient fails to return to within the perimeter of the virtual safety rails.



FIG. 19 shows a non-limiting example of how to configure the operational mode of the system. The user can select Auto-Switch in which the system will automatically switch between skeletal tracking and 3D blob detection depending on whether a skeleton is able to be tracked. Alternatively, the user may select both and cause the system to use both 3D Blob detection and skeletal tracking for alerting. Alternatively, the user can select to use either skeletal tracking or 3D blob detection solely or neither at all.


In one non-limiting embodiment, the disclosed system and method can use the following components:

    • 1. One or more 3D camera, motion and/or sound sensors;
    • 2. A computerized virtual blob detection monitoring system;
    • 3. A computerized skeletal tracking system;
    • 4. A computerized communication system;
    • 5. A centralized monitoring primary display;
    • 6. A centralized monitoring alert display; and
    • 7. A system database.


The various components can be in electrical and/or wireless communication with each other.


“Located remote” is defined to mean that the centralized monitoring station, centralized monitoring primary display, and/or centralized monitoring alert display is not physically located within the monitored rooms. However, the location can be on the same premises at a different location (i.e., a nurse station for the premises, hospital, etc.) or at a different location altogether (i.e., a remote monitoring station, etc.).


The automatic detection of an individual entering a prescribed virtual blob detection zone will provide significant administrative and clinical benefits to caregivers and individuals alike, including the following non-limiting public benefits.

    • 1. Automation of determination of perimeter violation and automated notification of caregivers and/or other designated entities and/or individuals.
    • 2. Ability to alert patients, caregivers and other individuals in time to prevent a monitored patient from getting out of bed.
    • 3. Reduction in response time for monitored individuals who have fallen and require assistance.
    • 4. Increased survival rate for monitored individuals who have experienced a fall.
    • 5. Reduction in costs for hospitalization and medical care related to complications from a fall.
    • 6. Ability to distinguish multiple individuals and prevent false positives.
    • 7. Ability to distinguish direction of motion to prevent false positives.
    • 8. Ability to provide video feed of a monitored patient under all lighting conditions to the central video monitoring system.
    • 9. Audio- and gesture-based recognition to allow multiple forms of communication with the patient.


Any computer/server/electronic database system (collectively “computer system”) capable of being programmed with the specific steps of the present invention can be used and is considered within the scope of the disclosure. Once specifically programmed, such a computer system can preferably be considered a special purpose computer limited to the use of two or more of the above particularly described combinations of steps (programmed instructions) performing two or more of the above particularly described combinations of functions.


All components of the present disclosure system and their locations, electronic communication methods between the system components, electronic storage mechanisms, etc. discussed above or shown in the drawings, if any, are merely by way of example and are not considered limiting and other component(s) and their locations, electronic communication methods, electronic storage mechanisms, etc. currently known and/or later developed can also be chosen and used and all are considered within the scope of the disclosure.


Unless feature(s), part(s), component(s), characteristic(s) or function(s) described in the specification or shown in the drawings for a claim element, claim step or claim term specifically appear in the claim with the claim element, claim step or claim term, then the inventor does not consider such feature(s), part(s), component(s), characteristic(s) or function(s) to be included for the claim element, claim step or claim term in the claim when and if the claim element, claim step or claim term is interpreted or construed. Similarly, with respect to any “means for” elements in the claims, the inventor considers such language to require only the minimal amount of features, components, steps, or parts from the specification to achieve the function of the “means for” language and not all of the features, components, steps, or parts described in the specification that are related to the function of the “means for” language.


While the disclosure has been described and disclosed in certain terms and has disclosed certain embodiments or modifications, persons skilled in the art who have acquainted themselves with the disclosure will appreciate that it is not necessarily limited by such terms nor to the specific embodiments and modification disclosed herein. Thus, a wide variety of alternatives suggested by the teachings herein can be practiced without departing from the spirit of the disclosure, and rights to such alternatives are particularly reserved and considered within the scope of the disclosure.

Claims
  • 1. A system for detecting when a monitored individual or any part of the monitored individual has crossed over a designated electronic perimeter, the system comprising: one or more 3D camera, motion, and sound sensors located in a room with an individual to be monitored and configured to capture video data of the individual within the room; a computerized monitoring system configured to: electronically receive video data from the one or more 3D camera, motion, and sound sensors; use skeletal tracking to electronically monitor the room for a crossing of a designated electronic perimeter by the individual based on the video data electronically received from the one or more 3D camera, motion, and sound sensors; use virtual blob detection to electronically monitor the room for the crossing of the designated electronic perimeter by the individual based on the video data electronically received from the one or more 3D camera, motion, and sound sensors; detect a conflict between determinations of whether the individual or part of the individual crossed the designated electronic perimeter made using skeletal tracking and virtual blob detection; resolve the conflict between the determinations made using skeletal tracking and virtual blob detection according to a predetermined setting; and based on resolution of the conflict, electronically transmit a determination that the individual or part of the individual has crossed the designated electronic perimeter.
  • 2. The system of claim 1, wherein the predetermined setting prioritizes using skeletal tracking over using virtual blob detection, and wherein the determination that the individual or part of the individual has crossed the designated electronic perimeter that is electronically transmitted is made using skeletal tracking.
  • 3. The system of claim 2, wherein the predetermined setting prioritizes using skeletal tracking over using virtual blob detection when the individual has previously been identified using skeletal tracking.
  • 4. The system of claim 1, wherein the predetermined setting prioritizes using virtual blob detection over using skeletal tracking, and wherein the determination that the individual or part of the individual has crossed the designated electronic perimeter that is electronically transmitted is made using virtual blob detection.
  • 5. The system of claim 1 further comprising a computerized communication system that is configured to receive an alert from the computerized monitoring system when the computerized monitoring system electronically detects that the individual or a part of the individual has crossed the designated electronic perimeter.
  • 6. The system of claim 5, wherein the computerized monitoring system is configured to electronically alert the computerized communication system when the computerized monitoring system electronically detects that the individual or a part of the individual has crossed the designated electronic perimeter and remains in the designated electronic perimeter for a predetermined period of time.
  • 7. The system of claim 6, wherein the computerized communication system is further configured to electronically issue an audible message to the individual to inform the individual that the individual has crossed the designated electronic perimeter.
  • 8. The system of claim 1, further comprising a central monitoring alert display, wherein an alert is generated on the central monitoring alert display upon the computerized monitoring system electronically determining that the individual or part of the individual crossed the designated electronic perimeter.
  • 9. The system of claim 1, wherein electronically transmitting the determination that the individual has crossed the designated electronic perimeter comprises notifying a designated person associated with the individual.
  • 10. The system of claim 9, wherein notifying the designated person associated with the individual is further based on determining that the individual remains in the designated electronic perimeter for at least a predetermined period of time.
  • 11. A computerized method for detecting when a monitored individual or any part of the monitored individual has crossed over a designated electronic perimeter, the method comprising: electronically receiving video data from one or more 3D camera, motion, and sound sensors that are configured to capture video data of an individual being monitored; using skeletal tracking to electronically monitor the room for a crossing of a designated electronic perimeter by the individual based on the video data electronically received from the one or more 3D camera, motion, and sound sensors; using virtual blob detection to electronically monitor the room for the crossing of the designated electronic perimeter by the individual based on the video data electronically received from the one or more 3D camera, motion, and sound sensors; detecting a conflict between determinations of whether the individual or part of the individual crossed the designated electronic perimeter made using skeletal tracking and virtual blob detection; resolving the conflict between the determinations made using skeletal tracking and virtual blob detection according to a predetermined setting; and based on resolution of the conflict, electronically transmitting a determination that the individual or part of the individual has crossed the designated electronic perimeter.
  • 12. The computerized method of claim 11, wherein the predetermined setting prioritizes using skeletal tracking over using virtual blob detection, and wherein the determination that the individual or part of the individual has crossed the designated electronic perimeter that is electronically transmitted is made using skeletal tracking.
  • 13. The computerized method of claim 12, wherein the predetermined setting prioritizes using skeletal tracking over using virtual blob detection when the individual has previously been identified using skeletal tracking.
  • 14. The computerized method of claim 11, wherein the predetermined setting prioritizes using virtual blob detection over using skeletal tracking, and wherein the determination that the individual or part of the individual has crossed the designated electronic perimeter that is electronically transmitted is made using virtual blob detection.
  • 15. The computerized method of claim 11, wherein electronically transmitting the determination that the individual has crossed the designated electronic perimeter comprises transmitting an alert to a computerized communication system.
  • 16. The computerized method of claim 15, wherein the alert to the computerized communication system is transmitted upon determining that the individual or part of the individual has remained in the designated electronic perimeter for at least a predetermined period of time.
  • 17. The computerized method of claim 15, wherein the computerized communication system is further configured to transmit the determination to a central monitoring alert display, wherein an alert is generated on the central monitoring alert display.
  • 18. The computerized method of claim 11, wherein electronically transmitting the determination that the individual or part of the individual has crossed the designated electronic perimeter comprises electronically issuing an audible message to the individual to inform the individual that the individual has crossed the designated electronic perimeter.
  • 19. The computerized method of claim 11, wherein electronically transmitting the determination that the individual has crossed the designated electronic perimeter comprises notifying a designated person associated with the individual.
  • 20. The computerized method of claim 19, wherein notifying the designated person associated with the individual is further based on determining that the individual remains in the designated electronic perimeter after a predetermined period of time.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application, having Ser. No. 16/166,857 and entitled “Systems and Methods for Determining Whether an Individual Enters a Prescribed Virtual Zone Using Skeletal Tracking and 3D Blob Detection,” is a continuation of pending U.S. application Ser. No. 15/728,110, filed Oct. 9, 2017, and entitled “Method for Determining Whether an Individual Enters a Prescribed Virtual Zone Using Skeletal Tracking and 3D Blob Detection,” which is a continuation of U.S. application Ser. No. 14/727,434, filed Jun. 1, 2015, now U.S. Pat. No. 9,892,611, issued Feb. 13, 2018, the entirety of each of which is incorporated by reference herein.

US Referenced Citations (242)
Number Name Date Kind
4669263 Sugiyama Jun 1987 A
4857716 Gombrich et al. Aug 1989 A
5031228 Lu Jul 1991 A
5276432 Travis Jan 1994 A
5448221 Weller Sep 1995 A
5482050 Smokoff et al. Jan 1996 A
5592153 Welling et al. Jan 1997 A
5798798 Rector et al. Aug 1998 A
5838223 Gallant et al. Nov 1998 A
5915379 Wallace et al. Jun 1999 A
5942986 Shabot et al. Aug 1999 A
6050940 Braun Apr 2000 A
6095984 Amano et al. Aug 2000 A
6160478 Jacobsen et al. Dec 2000 A
6174283 Nevo et al. Jan 2001 B1
6188407 Smith et al. Feb 2001 B1
6269812 Wallace et al. Aug 2001 B1
6287452 Allen Sep 2001 B1
6322502 Schoenberg et al. Nov 2001 B1
6369838 Wallace et al. Apr 2002 B1
6429869 Kamakura et al. Aug 2002 B1
6614349 Proctor et al. Sep 2003 B1
6727818 Wildman et al. Apr 2004 B1
6804656 Rosenfeld et al. Oct 2004 B1
7015816 Wildman et al. Mar 2006 B2
7122005 Shusterman Oct 2006 B2
7154397 Zerhusen et al. Dec 2006 B2
7237287 Weismiller et al. Jul 2007 B2
7323991 Eckert et al. Jan 2008 B1
7408470 Wildman et al. Aug 2008 B2
7420472 Tran Sep 2008 B2
7430608 Noonan et al. Sep 2008 B2
7502498 Wen et al. Mar 2009 B2
7612679 Fackler et al. Nov 2009 B1
7669263 Menkedick et al. Mar 2010 B2
7715387 Schuman May 2010 B2
7724147 Brown May 2010 B2
7756723 Rosow et al. Jul 2010 B2
7890349 Cole et al. Feb 2011 B2
7895055 Schneider et al. Feb 2011 B2
7908153 Scherpbier et al. Mar 2011 B2
7945457 Zaleski May 2011 B2
7962544 Torok et al. Jun 2011 B2
7972140 Renaud Jul 2011 B2
8108036 Tran Jan 2012 B2
8123685 Brauers et al. Feb 2012 B2
8224108 Steinberg et al. Jul 2012 B2
8237558 Seyed Momen et al. Aug 2012 B2
8273018 Fackler et al. Sep 2012 B1
8432263 Kunz Apr 2013 B2
8451314 Cline et al. May 2013 B1
8529448 McNair Sep 2013 B2
8565500 Neff Oct 2013 B2
8620682 Bechtel et al. Dec 2013 B2
8655680 Bechtel et al. Feb 2014 B2
8700423 Eaton, Jr. et al. Apr 2014 B2
8727981 Bechtel et al. May 2014 B2
8769153 Dziubinski Jul 2014 B2
8890937 Skubic et al. Nov 2014 B2
8902068 Bechtel et al. Dec 2014 B2
8917186 Grant Dec 2014 B1
8953886 King et al. Feb 2015 B2
9072929 Rush et al. Jul 2015 B1
9129506 Kusens Sep 2015 B1
9147334 Long et al. Sep 2015 B2
9159215 Kusens Oct 2015 B1
9269012 Fotland Feb 2016 B2
9292089 Sadek Mar 2016 B1
9305191 Long et al. Apr 2016 B2
9408561 Stone et al. Aug 2016 B2
9489820 Kusens Nov 2016 B1
9519969 Kusens Dec 2016 B1
9524443 Kusens Dec 2016 B1
9536310 Kusens Jan 2017 B1
9538158 Rush et al. Jan 2017 B1
9563955 Kamarshi et al. Feb 2017 B1
9597016 Stone et al. Mar 2017 B2
9729833 Kusens Aug 2017 B1
9741227 Kusens Aug 2017 B1
9892310 Kusens et al. Feb 2018 B2
9892311 Kusens et al. Feb 2018 B2
9892611 Kusens Feb 2018 B1
9905113 Kusens Feb 2018 B2
10055961 Johnson et al. Aug 2018 B1
10096223 Kusens Oct 2018 B1
10210378 Kusens et al. Feb 2019 B2
10225522 Kusens Mar 2019 B1
10276019 Johnson et al. Apr 2019 B2
20020015034 Malmborg Feb 2002 A1
20020077863 Rutledge et al. Jun 2002 A1
20020101349 Rojas, Jr. Aug 2002 A1
20020115905 August Aug 2002 A1
20020183976 Pearce Dec 2002 A1
20030037786 Biondi et al. Feb 2003 A1
20030070177 Kondo et al. Apr 2003 A1
20030092974 Santos et al. May 2003 A1
20030095147 Daw May 2003 A1
20030135390 O'brien et al. Jul 2003 A1
20030140928 Bui et al. Jul 2003 A1
20030227386 Pulkkinen et al. Dec 2003 A1
20040019900 Knightbridge et al. Jan 2004 A1
20040052418 Delean Mar 2004 A1
20040054760 Ewing et al. Mar 2004 A1
20040097227 Siegel May 2004 A1
20040116804 Mostafavi Jun 2004 A1
20040193449 Wildman et al. Sep 2004 A1
20050038326 Mathur Feb 2005 A1
20050182305 Hendrich Aug 2005 A1
20050231341 Shimizu Oct 2005 A1
20050249139 Nesbit Nov 2005 A1
20060004606 Wendl et al. Jan 2006 A1
20060047538 Condurso et al. Mar 2006 A1
20060049936 Collins et al. Mar 2006 A1
20060058587 Heimbrock et al. Mar 2006 A1
20060089541 Braun et al. Apr 2006 A1
20060092043 Lagassey May 2006 A1
20060107295 Margis et al. May 2006 A1
20060145874 Fredriksson et al. Jul 2006 A1
20060261974 Albert et al. Nov 2006 A1
20070085690 Tran Apr 2007 A1
20070118054 Pinhas et al. May 2007 A1
20070120689 Zerhusen et al. May 2007 A1
20070129983 Scherpbier et al. Jun 2007 A1
20070136218 Bauer et al. Jun 2007 A1
20070159332 Koblasz Jul 2007 A1
20070279219 Warriner Dec 2007 A1
20070296600 Dixon et al. Dec 2007 A1
20080001735 Tran Jan 2008 A1
20080001763 Raja et al. Jan 2008 A1
20080002860 Super et al. Jan 2008 A1
20080004904 Tran Jan 2008 A1
20080009686 Hendrich Jan 2008 A1
20080015903 Rodgers Jan 2008 A1
20080021731 Rodgers Jan 2008 A1
20080071210 Moubayed et al. Mar 2008 A1
20080087719 Sahud Apr 2008 A1
20080106374 Sharbaugh May 2008 A1
20080126132 Warner et al. May 2008 A1
20080228045 Gao et al. Sep 2008 A1
20080249376 Zaleski Oct 2008 A1
20080267447 Kelusky et al. Oct 2008 A1
20080277486 Seem et al. Nov 2008 A1
20080281638 Weatherly et al. Nov 2008 A1
20090082829 Panken et al. Mar 2009 A1
20090091458 Deutsch Apr 2009 A1
20090099480 Salgo et al. Apr 2009 A1
20090112630 Collins et al. Apr 2009 A1
20090119843 Rodgers et al. May 2009 A1
20090177327 Turner et al. Jul 2009 A1
20090224924 Thorp Sep 2009 A1
20090278934 Ecker et al. Nov 2009 A1
20090322513 Hwang et al. Dec 2009 A1
20100117836 Seyed Momen et al. May 2010 A1
20100169114 Henderson et al. Jul 2010 A1
20100169120 Herbst et al. Jul 2010 A1
20100172567 Prokoski Jul 2010 A1
20100176952 Bajcsy et al. Jul 2010 A1
20100188228 Hyland Jul 2010 A1
20100205771 Pietryga et al. Aug 2010 A1
20100245577 Yamamoto et al. Sep 2010 A1
20100285771 Peabody Nov 2010 A1
20100305466 Corn Dec 2010 A1
20110018709 Kornbluh Jan 2011 A1
20110022981 Mahajan et al. Jan 2011 A1
20110025493 Papadopoulos et al. Feb 2011 A1
20110025499 Hoy et al. Feb 2011 A1
20110035057 Receveur et al. Feb 2011 A1
20110035466 Panigrahi Feb 2011 A1
20110054936 Cowan et al. Mar 2011 A1
20110068930 Wildman et al. Mar 2011 A1
20110077965 Nolte et al. Mar 2011 A1
20110087079 Aarts Apr 2011 A1
20110102133 Shaffer May 2011 A1
20110102181 Metz et al. May 2011 A1
20110106560 Eaton et al. May 2011 A1
20110106561 Eaton et al. May 2011 A1
20110175809 Markovic et al. Jul 2011 A1
20110190593 McNair Aug 2011 A1
20110227740 Wohltjen Sep 2011 A1
20110245707 Castle et al. Oct 2011 A1
20110254682 Sigrist Christensen Oct 2011 A1
20110288811 Greene Nov 2011 A1
20110295621 Farooq et al. Dec 2011 A1
20110301440 Riley et al. Dec 2011 A1
20110313325 Cuddihy Dec 2011 A1
20120025991 O'Keefe et al. Feb 2012 A1
20120026308 Johnson et al. Feb 2012 A1
20120075464 Derenne et al. Mar 2012 A1
20120092162 Rosenberg Apr 2012 A1
20120098918 Murphy Apr 2012 A1
20120140068 Monroe et al. Jun 2012 A1
20120154582 Johnson et al. Jun 2012 A1
20120212582 Deutsch Aug 2012 A1
20120259650 Mallon et al. Oct 2012 A1
20120314901 Hanson et al. Dec 2012 A1
20130027199 Bonner Jan 2013 A1
20130028570 Suematsu et al. Jan 2013 A1
20130120120 Long et al. May 2013 A1
20130122807 Tenarvitz et al. May 2013 A1
20130184592 Venetianer et al. Jul 2013 A1
20130265482 Funamoto Oct 2013 A1
20130309128 Voegeli et al. Nov 2013 A1
20130332184 Burnham et al. Dec 2013 A1
20140039351 Mix et al. Feb 2014 A1
20140070950 Snodgrass Mar 2014 A1
20140085501 Tran Mar 2014 A1
20140086450 Huang et al. Mar 2014 A1
20140155755 Pinter et al. Jun 2014 A1
20140191861 Scherrer Jul 2014 A1
20140267625 Clark et al. Sep 2014 A1
20140267736 Delean Sep 2014 A1
20140327545 Bolling et al. Nov 2014 A1
20140328512 Gurwicz et al. Nov 2014 A1
20140333744 Baym et al. Nov 2014 A1
20140333776 Dedeoglu et al. Nov 2014 A1
20140354436 Nix et al. Dec 2014 A1
20140365242 Neff Dec 2014 A1
20150109442 Derenne et al. Apr 2015 A1
20150206415 Wegelin et al. Jul 2015 A1
20150269318 Neff Sep 2015 A1
20150278456 Bermudez Rodriguez et al. Oct 2015 A1
20150294143 Wells et al. Oct 2015 A1
20160022218 Hayes et al. Jan 2016 A1
20160070869 Portnoy Mar 2016 A1
20160093195 Ophardt Mar 2016 A1
20160127641 Gove May 2016 A1
20160217347 Mineo Jul 2016 A1
20160253802 Venetianer et al. Sep 2016 A1
20160267327 Franz et al. Sep 2016 A1
20160360970 Tzvieli et al. Dec 2016 A1
20170055917 Stone et al. Mar 2017 A1
20170143240 Stone et al. May 2017 A1
20170337682 Liao et al. Nov 2017 A1
20180018864 Baker Jan 2018 A1
20180068545 Kusens Mar 2018 A1
20180357875 Kusens Dec 2018 A1
20190006046 Kusens et al. Jan 2019 A1
20190029528 Tzvieli et al. Jan 2019 A1
20190043192 Kusens et al. Feb 2019 A1
20190122028 Kusens et al. Apr 2019 A1
20190205630 Kusens Jul 2019 A1
20190206218 Kusens et al. Jul 2019 A1
Foreign Referenced Citations (3)
Number Date Country
19844918 Apr 2000 DE
2009018422 Feb 2009 WO
2012122002 Sep 2012 WO
Non-Patent Literature Citations (102)
Entry
Notice of Allowance received for U.S. Appl. No. 15/857,696, dated Jul. 16, 2019, 9 pages.
Notice of Allowance received for U.S. Appl. No. 16/380,013, dated Jul. 10, 2019, 10 pages.
Final Office Action received for U.S. Appl. No. 13/543,816, dated Jun. 17, 2014, 15 pages.
Final Office Action received for U.S. Appl. No. 14/084,588, dated Dec. 19, 2014, 24 pages.
Final Office Action received for U.S. Appl. No. 14/575,850, dated Dec. 12, 2017, 10 pages.
Final Office Action received for U.S. Appl. No. 14/599,498, dated Oct. 12, 2017, 28 pages.
Final Office Action received for U.S. Appl. No. 14/611,363, dated Apr. 28, 2017, 20 pages.
Final Office Action received for U.S. Appl. No. 14/623,349, dated Oct. 4, 2017, 29 pages.
Final Office Action received for U.S. Appl. No. 14/724,969, dated Jul. 28, 2016, 26 pages.
Final Office Action received for U.S. Appl. No. 14/757,877, dated Sep. 29, 2017, 22 pages.
Final Office Action received for U.S. Appl. No. 15/134,189, dated Jul. 12, 2018, 23 pages.
Final Office Action received for U.S. Appl. No. 15/285,416, dated Aug. 23, 2017, 16 pages.
Final Office Action received for U.S. Appl. No. 15/285,416, dated Jul. 5, 2018, 8 pages.
Final Office Action received for U.S. Appl. No. 15/396,263, dated Oct. 18, 2017, 20 pages.
First Action Interview Office Action received for U.S. Appl. No. 14/244,160, dated Nov. 28, 2017, 5 pages.
Kusens, Neil, Unpublished U.S. Appl. No. 14/613,866, filed Feb. 4, 2015, titled “Method and System for Determining Whether an Individual Takes Appropriate Measures to Prevent the Spread of Healthcare Associated Infections Along With Centralized Monitoring”.
Kusens, Neil, Unpublished U.S. Appl. No. 14/084,588, filed Nov. 19, 2013, titled “Method for Determining Whether an Individual Leaves a Prescribed Virtual Perimeter”.
Kusens, Neil, Unpublished U.S. Appl. No. 14/575,850, filed Dec. 18, 2014, titled “Method and Process for Determining Whether an Individual Suffers a Fall Requiring Assistance”.
Kusens, Neil, Unpublished U.S. Appl. No. 14/599,498, filed Jan. 17, 2015, titled “Method and System for Determining Whether an Individual Takes Appropriate Measures to Prevent the Spread of Healthcare Associated Infections”.
Kusens, Neil, Unpublished U.S. Appl. No. 14/611,363, filed Feb. 2, 2015, titled “Method and System for Determining Whether an Individual Takes Appropriate Measures to Prevent the Spread of Healthcare Associated Infections”.
Kusens, Neil, Unpublished U.S. Appl. No. 14/623,349, filed Feb. 16, 2015, titled “Method for Determining Whether an Individual Enters a Prescribed Virtual Zone Using 3D Blob Detection”.
Kusens, Neil, Unpublished U.S. Appl. No. 13/543,816, filed Jul. 7, 2012, titled “Method and Process for Determining Whether an Individual Suffers a Fall Requiring Assistance”.
Kusens, Neil, Unpublished U.S. Appl. No. 14/724,969, filed May 29, 2015, titled “Method and Process for Determining Whether an Individual Suffers a Fall Requiring Assistance”.
Kusens, Neil, Unpublished U.S. Appl. No. 14/727,434, filed Jun. 1, 2015, titled “Method for Determining Whether an Individual Enters a Prescribed Virtual Zone Using Skeletal Tracking and 3D Blob Detection”.
Kusens, Neil, Unpublished U.S. Appl. No. 14/728,762, filed Jun. 2, 2015, titled “Method for Determining Whether an Individual Leaves a Prescribed Virtual Perimeter”.
Kusens, Neil, Unpublished U.S. Appl. No. 14/743,264, filed Jun. 18, 2015, titled “System for Determining Whether an Individual Enters a Prescribed Virtual Zone Using 3D Blob Detection”.
Kusens, Neil, Unpublished U.S. Appl. No. 14/743,447, filed Jun. 18, 2015, titled “System for Determining Whether an Individual Suffers a Fall Requiring Assistance”.
Kusens, Neil, Unpublished U.S. Appl. No. 14/743,499, filed Jun. 18, 2015, titled “System for Determining Whether an Individual Suffers a Fall Requiring Assistance”.
Mooney, Tom, Rhode Island ER First to Test Google Glass on Medical Conditions, retrieved from <https://www.ems1.com/ems-products/technology/articles/1860487-Rhode-Island-ER-first-to-test-Google-Glass-on-medical-conditions/>, Mar. 11, 2014, 3 pages.
Non-Final Office Action received for U.S. Appl. No. 15/148,151, dated May 8, 2018, 5 pages.
Non-Final Office Action received for U.S. Appl. No. 15/285,416, dated Apr. 11, 2017, 13 pages.
Non-Final Office Action received for U.S. Appl. No. 15/285,416, dated Mar. 12, 2018, 20 pages.
Non-Final Office Action received for U.S. Appl. No. 15/395,250, dated May 8, 2017, 19 pages.
Non-Final Office Action received for U.S. Appl. No. 15/395,526, dated Apr. 27, 2017, 16 pages.
Non-Final Office Action received for U.S. Appl. No. 15/395,762, dated May 31, 2018, 24 pages.
Non-Final Office Action received for U.S. Appl. No. 15/396,263, dated Apr. 14, 2017, 18 pages.
Non-Final Office Action received for U.S. Appl. No. 15/628,318, dated Jun. 8, 2018, 9 pages.
Non-Final Office Action received for U.S. Appl. No. 15/728,110, dated May 2, 2018, 8 pages.
Non-Final Office Action received for U.S. Appl. No. 15/848,621, dated May 31, 2018, 23 pages.
Non-Final Office Action received for U.S. Appl. No. 13/543,816, dated Dec. 30, 2013, 9 pages.
Non-Final Office Action received for U.S. Appl. No. 13/543,816, dated Dec. 1, 2014, 18 pages.
Non-Final Office Action received for U.S. Appl. No. 14/084,588, dated Jul. 16, 2014, 12 pages.
Non-Final Office Action received for U.S. Appl. No. 14/339,397, dated Oct. 7, 2015, 16 pages.
Non-Final Office Action received for U.S. Appl. No. 14/575,850, dated Mar. 11, 2016, 10 pages.
Non-Final Office Action received for U.S. Appl. No. 14/599,498, dated May 31, 2017, 24 pages.
Non-Final Office Action received for U.S. Appl. No. 14/611,363, dated Jan. 11, 2017, 19 pages.
Non-Final Office Action received for U.S. Appl. No. 14/611,363, dated May 7, 2018, 6 pages.
Non-Final Office Action received for U.S. Appl. No. 14/623,349, dated Apr. 5, 2017, 15 pages.
Non-Final Office Action received for U.S. Appl. No. 14/724,969, dated Feb. 11, 2016, 14 pages.
Non-Final Office Action received for U.S. Appl. No. 14/727,434, dated Sep. 23, 2016, 9 pages.
Non-Final Office Action received for U.S. Appl. No. 14/743,499, dated May 23, 2016, 6 pages.
Non-Final Office Action received for U.S. Appl. No. 14/757,593, dated Apr. 21, 2017, 9 pages.
Non-Final Office Action received for U.S. Appl. No. 15/395,243, dated Feb. 14, 2019, 14 pages.
Non-Final Office Action received for U.S. Appl. No. 16/216,210, dated Feb. 13, 2019, 29 pages.
Non-Final Office Action received for U.S. Appl. No. 16/107,567, dated Mar. 29, 2019, 8 pages.
Non-Final Office Action received for U.S. Appl. No. 15/395,762, dated May 1, 2019, 27 pages.
Non-Final Office Action received for U.S. Appl. No. 15/856,419, dated May 2, 2019, 8 pages.
Conaire, et al., “Fusion of Infrared and Visible Spectrum Video for Indoor Surveillance”, WIAMIS, Apr. 2005, 4 pages.
Final Office Action received for U.S. Appl. No. 15/395,243, dated Jun. 11, 2019, 18 pages.
Non-Final Office Action received for U.S. Appl. No. 15/134,189, dated May 9, 2019, 30 pages.
Preinterview First Office Action received for U.S. Appl. No. 15/857,696, dated May 23, 2019, 14 pages.
Non-Final Office Action received for U.S. Appl. No. 14/757,593, dated Aug. 16, 2017, 8 pages.
Non-Final Office Action received for U.S. Appl. No. 14/757,877, dated Feb. 23, 2017, 24 pages.
Notice of Allowance received for U.S. Appl. No. 13/543,816, dated Jun. 5, 2015, 5 pages.
Notice of Allowance received for U.S. Appl. No. 14/575,850, dated Jun. 13, 2018, 5 pages.
Notice of Allowance received for U.S. Appl. No. 14/599,498, dated Jul. 18, 2018, 6 pages.
Notice of Allowance received for U.S. Appl. No. 14/611,363, dated Dec. 29, 2017, 11 pages.
Notice of Allowance received for U.S. Appl. No. 14/613,866, dated Mar. 20, 2017, 11 pages.
Notice of Allowance received for U.S. Appl. No. 14/623,349, dated Jun. 18, 2018, 11 pages.
Notice of Allowance received for U.S. Appl. No. 14/724,969, dated Apr. 21, 2017, 8 pages.
Notice of Allowance received for U.S. Appl. No. 14/724,969, dated Dec. 23, 2016, 5 pages.
Notice of Allowance received for U.S. Appl. No. 14/727,434, dated Apr. 25, 2017, 9 pages.
Notice of Allowance received for U.S. Appl. No. 14/727,434, dated Jan. 4, 2018, 2 pages.
Notice of Allowance received for U.S. Appl. No. 14/727,434, dated Jul. 5, 2017, 9 pages.
Notice of Allowance received for U.S. Appl. No. 14/727,434, dated Oct. 10, 2017, 9 pages.
Notice of Allowance received for U.S. Appl. No. 14/728,762, dated Jun. 27, 2016, 14 pages.
Notice of Allowance received for U.S. Appl. No. 14/743,264, dated Jul. 18, 2016, 16 pages.
Notice of Allowance received for U.S. Appl. No. 14/743,264, dated Nov. 9, 2016, 14 pages.
Notice of Allowance received for U.S. Appl. No. 14/743,264, dated Oct. 14, 2016, 14 pages.
Notice of Allowance received for U.S. Appl. No. 14/743,447, dated Aug. 26, 2016, 5 pages.
Notice of Allowance received for U.S. Appl. No. 14/743,447, dated Jun. 22, 2016, 4 pages.
Notice of Allowance received for U.S. Appl. No. 14/743,447, dated May 31, 2016, 8 pages.
Notice of Allowance received for U.S. Appl. No. 14/743,447, dated Nov. 14, 2016, 5 pages.
Notice of Allowance received for U.S. Appl. No. 14/743,499, dated Sep. 19, 2016, 5 pages.
Notice of Allowance received for U.S. Appl. No. 14/757,593, dated Jun. 4, 2018, 5 pages.
Notice of Allowance received for U.S. Appl. No. 15/279,054, dated Nov. 27, 2017, 2 pages.
Notice of Allowance received for U.S. Appl. No. 15/279,054, dated Oct. 20, 2017, 13 pages.
Notice of Allowance received for U.S. Appl. No. 15/395,250, dated Sep. 26, 2017, 13 pages.
Notice of Allowance received for U.S. Appl. No. 15/395,526, dated Sep. 21, 2017, 13 pages.
Notice of Allowance received for U.S. Appl. No. 15/395,716, dated Apr. 19, 2017, 5 pages.
Notice of Allowance received for U.S. Appl. No. 15/395,716, dated Dec. 6, 2017, 5 pages.
Notice of Allowance received for U.S. Appl. No. 15/395,716, dated May 9, 2018, 5 pages.
Notice of Allowance received for U.S. Appl. No. 15/396,263, dated Jul. 13, 2018, 9 pages.
Notice of Allowance received for U.S. Appl. No. 15/728,110, dated Jul. 23, 2018, 15 pages.
Notice of Allowance received for U.S. Appl. No. 15/728,110, dated Sep. 21, 2018, 2 pages.
Notice of Allowance received for U.S. Appl. No. 15/395,716, dated Jun. 19, 2018, 2 pages.
Notice of Allowance received for U.S. Appl. No. 15/395,716, dated Jul. 24, 2017, 5 pages.
Pre-interview First Office Action received for U.S. Appl. No. 15/910,645, dated May 21, 2018, 14 pages.
Pre-interview First Office Action received for U.S. Appl. No. 15/395,716, dated Feb. 24, 2017, 5 pages.
Pre-interview First Office Action received for U.S. Appl. No. 15/134,189, dated Nov. 22, 2017, 5 pages.
Raheja, et al., Human Facial Expression Detection From Detected in Captured Image Using Back Propagation Neural Network, International Journal of Computer Science and Information Technology (IJCSIT), vol. 2, No. 1, Feb. 2010, 7 pages.
Virtual Patient Observation: Centralize Monitoring of High-Risk Patients with Video—Cisco Video Surveillance Manager, retrieved from <https://www.cisco.com/c/en/us/products/collateral/physical-security/video-surveillance-manager/whitepaper_11-715263.pdf>.
Related Publications (1)
Number Date Country
20190057592 A1 Feb 2019 US
Continuations (2)
Number Date Country
Parent 15728110 Oct 2017 US
Child 16166857 US
Parent 14727434 Jun 2015 US
Child 15728110 US