Methods and systems for assigning locations to devices

Information

  • Patent Grant
  • Patent Number
    10,878,220
  • Date Filed
    Friday, December 30, 2016
  • Date Issued
    Tuesday, December 29, 2020
Abstract
A location identification system analyzes information received corresponding to a device detected in a room of a patient. On detecting a location identification of the device, the system assigns the device to the location corresponding to the location identification. In embodiments, the system retrieves patient and care team information for the location. The location and patient and care team information may be communicated to a central video monitoring system.
Description
BACKGROUND

Medical facilities, such as hospitals, face many challenges in addition to simply caring for patients. For example, securing patients and equipment (e.g., medical devices) consumes many resources, and current methods lack effectiveness. In addition to requiring personnel to physically monitor locations within the facility, visitor logs, visitor badges, and radio-frequency identification (RFID) technology are often utilized to control access to certain locations within the facility. However, each of these requires subjective decision-making and is prone to error by the personnel monitoring the locations, assisting visitors with signing a visitor log, and issuing visitor badges. Further, none of these methods necessarily prevents an authorized visitor from breaching areas of the facility where that visitor is not authorized. For example, a visitor may be authorized to visit a particular patient but, based on some condition of the patient, may not have close contact with the patient. In contrast, a caregiver of the same patient may need to have close contact with the patient. Additionally, in some situations, an authorized visitor may unwittingly provide contraband (e.g., an item or object a particular patient is not allowed to possess or be near) to a patient, which current methods are unable to detect. Finally, medical devices are constantly being shuffled between patients and locations within a facility, and tracking the locations of these devices can be extremely difficult. Accordingly, overall security for patients and equipment suffers, and the many resources currently being utilized are wasted.


BRIEF SUMMARY

This brief summary is provided as a general overview of the more detailed disclosure which follows. It is not intended to identify key or essential elements of the disclosure, or to define the claim terms in isolation from the remainder of the disclosure, including the drawings.


This disclosure generally relates to systems and methods for assigning locations to devices. Generally, and without limitation, the method involves identifying a signal from a location beacon corresponding to a location of the device, a location identification visible on a visual location identifier tag at the location, or a location identification selected by a user. Accordingly, the system assigns the location corresponding to the signal or the location identification to the device. The system may also retrieve patient and care team information for the location from an electronic medical records (EMR) system. The images, the location, and/or the patient and care team information may be communicated to a central video monitoring system.


In some aspects, this disclosure relates to a method for assigning locations to devices. The method comprises: receiving, by a motion sensor in a room of a patient, a signal from a location beacon in the room of the patient corresponding to a location; determining the signal includes a known location identification; and assigning the motion sensor to the location corresponding to the known location identification.


In some aspects, this disclosure relates to a system for assigning locations to devices. The system may comprise one or more 3D motion sensors. The one or more 3D motion sensors may be located to provide the one or more 3D motion sensors with a view of a location of a patient. The 3D motion sensors may be configured to collect a series of images of the location and the patient. The system may comprise a computerized monitoring system. The computerized monitoring system may be communicatively coupled to the one or more 3D motion sensors. The computerized monitoring system may be configured to determine if a location identification is visible on a visual location identifier tag in the location. The computerized monitoring system may be communicatively coupled to an EMR system. The EMR system may be configured to communicate patient and care team information available for the location to the computerized monitoring system.


The location assignment system may further comprise a central video monitoring station. The central video monitoring station may be communicatively coupled to the computerized monitoring system. The central video monitoring station may be configured to display at least a portion of the series of images of the location and the patient and/or information received by the computerized monitoring system. The central video monitoring station may comprise a primary display. The central video monitoring station may comprise an alert display. The alert display may be a dedicated portion of the primary display, or a separate display or series of displays from the primary display. If the computerized monitoring system detects an unknown device or a device in an unknown location, the computerized monitoring system may be configured to send an alert to the central video monitoring station. The central video monitoring station may be configured to move the display of at least a portion of the series of images of the location and the patient from the primary display to the alert display upon receipt of an alert.


In some aspects, this disclosure relates to computer-readable storage media having embodied thereon computer-executable instructions. When executed by one or more computer processors, the instructions may cause the processors to: receive, from a motion sensor, an image of a location corresponding to a patient; determine if a location identification corresponding to the location has been input by a user; upon determining the location identification corresponding to the location has been input by the user, assign the 3D motion sensor to the location corresponding to the location identification; retrieve, from an EMR system, patient and care team information for the location; and display the image of the location and the patient, and the patient and care team information for the location.


Additional objects, advantages, and novel features of the disclosure will be set forth in part in the description which follows, and in part will become apparent to those skilled in the art upon examination of the following, or may be learned by practice of the disclosure.





BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWING

The description references the attached drawing figures, wherein:



FIGS. 1-3 are exemplary flowcharts for location assignment systems, in accordance with embodiments of the present disclosure; and



FIGS. 4-8 are exemplary displays for location assignment systems, in accordance with embodiments of the present disclosure.





DETAILED DESCRIPTION

As noted in the Background, medical facilities, such as hospitals, face many challenges in addition to simply caring for patients. For example, securing patients and equipment (e.g., medical devices) consumes many resources, and current methods lack effectiveness. In addition to requiring personnel to physically monitor locations within the facility, visitor logs, visitor badges, and radio-frequency identification (RFID) technology are often utilized to control access to certain locations within the facility. However, each of these requires subjective decision-making and is prone to error by the personnel monitoring the locations, assisting visitors with signing a visitor log, and issuing visitor badges. Further, none of these methods necessarily prevents an authorized visitor from breaching areas of the facility where that visitor is not authorized. For example, a visitor may be authorized to visit a particular patient but not authorized to visit another patient or particular areas of the facility. Additionally, in some situations, an authorized visitor may unwittingly provide contraband (e.g., an item or object a particular patient is not allowed to possess or be near) to a patient, which current methods are unable to detect. Finally, medical devices are constantly being shuffled between patients and locations within a facility, and tracking the locations of these devices can be extremely difficult. Accordingly, overall security for patients and equipment suffers, and the many resources currently being utilized are wasted.


The subject matter of the present invention is described with specificity herein to meet statutory requirements. However, the description itself is not intended to limit the scope of this patent. Rather, the inventor has contemplated that the claimed subject matter might also be embodied in other ways, to include different steps or combinations of steps similar to the ones described in this document, in conjunction with other present or future technologies. Moreover, although the terms “step” and/or “block” may be used herein to connote different elements of methods employed, the terms should not be interpreted as implying any particular order among or between various steps herein disclosed unless and except when the order of individual steps is explicitly described.


FIG. 1 shows a system for assigning locations to 3D motion sensors 112 having location beacons 110. A 3D motion sensor is an electronic device that contains one or more cameras capable of identifying individual objects, people, and motion. The 3D motion sensor may further contain one or more microphones to detect audio. The cameras can utilize technologies including, but not limited to, color RGB, CMOS sensors, lasers, infrared projectors, and RF-modulated light. The 3D motion sensor may have one or more integrated microprocessors and/or image sensors to detect and process information both transmitted from and received by the various cameras. Exemplary 3D motion sensors include the Microsoft® Kinect® Camera, the Sony® PlayStation® Camera, and the Intel® RealSense™ Camera, each of which happens to include microphones, although sound capture is not essential to the practice of the disclosure. A user may be able to configure alerts based on data that is received from the 3D motion sensor 112 and interpreted by the computerized monitoring system 114. For example, a user can configure the computerized monitoring system 114 to provide alerts based on data the computerized monitoring system 114 has interpreted, such as setting zones in a patient's room, comparing data from multiple systems (e.g., real-time locating systems (RTLS) or facial recognition) to determine authorized visitors, or detecting a patient crossing a trip wire, falling on the ground, or entering or exiting a safety zone.
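By way of illustration only, the following minimal Python sketch shows one way such user-configurable alert rules could be represented and evaluated against interpreted sensor data; all names (InterpretedEvent, AlertRule, evaluate) are hypothetical assumptions of the sketch and are not drawn from the disclosure.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class InterpretedEvent:
    """Hypothetical event produced when the computerized monitoring system
    interprets 3D motion sensor data (e.g., a trip-wire crossing)."""
    kind: str          # e.g., "trip_wire", "fall", "zone_exit"
    room_id: str
    confidence: float

@dataclass
class AlertRule:
    """A user-configured rule: call `notify` when a matching event occurs."""
    event_kind: str
    min_confidence: float
    notify: Callable[[InterpretedEvent], None]

def evaluate(rules: List[AlertRule], event: InterpretedEvent) -> None:
    # Fire every configured rule that matches the interpreted event.
    for rule in rules:
        if rule.event_kind == event.kind and event.confidence >= rule.min_confidence:
            rule.notify(event)

# Example: alert when a patient crosses a configured trip wire.
rules = [AlertRule("trip_wire", 0.8,
                   lambda e: print(f"ALERT {e.room_id}: {e.kind}"))]
evaluate(rules, InterpretedEvent("trip_wire", "Room 412", 0.93))
```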


As used herein, “a sensor” and “sensors” are used interchangeably in the singular and plural unless expressly described as a singular sensor or an array of sensors. A singular sensor may be used, or a sensor may comprise two or more cameras integrated into a single physical unit. Alternately, two or more physically distinct sensors may be used, or two or more physically distinct arrays of sensors may be used.


A “device” may be any device in the room of a patient being monitored that is movable (e.g., 3D motion sensor) and capable of communicating data to an EMR. As can be appreciated, knowing the location of such a device may be vital to ensuring that the correct information for the correct patient is communicated to the EMR.


As shown in FIG. 1, location assignment system 100 may be utilized to assign the location of a 3D motion sensor 112 in the room of a patient. A 3D motion sensor 112 may be co-located with a patient room to be monitored. The patient room to be monitored may be monitored in a variety of environments, including, without limitation, a hospital, a home, a hospice care facility, a nursing home, an assisted living facility, an outpatient medical care facility, and the like. The 3D motion sensor 112 may be positioned where it is likely to capture images of the patient room to be monitored. For example, a 3D motion sensor 112 may be oriented to take images of a bed, chair, or other location where a patient in the patient room to be monitored may spend a significant amount of time. In some embodiments, the 3D motion sensor 112 may be oriented to take images of persons and/or objects entering and exiting the patient room to be monitored. In some embodiments, the 3D motion sensor 112 may be oriented to take images of equipment (e.g., medical devices) that may be located in the patient room to be monitored. The 3D motion sensor 112 may be permanently installed, or may be temporarily set up in a room as needed. The patient in the patient room to be monitored may be under immediate medical care, e.g., in a medical facility under the supervision of a medical professional, or may not be under immediate care, e.g., in a home or other environment, possibly with a caregiver. A caregiver may be a medical professional or paraprofessional, such as an orderly, nurse's aide, nurse, or the like. A caregiver may also be a friend, relative, individual, company, or facility that provides assistance with daily living activities and/or medical care for individuals, such as individuals who are disabled, ill, injured, elderly, or otherwise in need of temporary or long-term assistance. In some instances, the person to be monitored may be self-sufficient and not under the immediate care of any other person or service provider.


The 3D motion sensor 112 may communicate data, such as images of the patient room being monitored, to a computerized monitoring system 114. The computerized monitoring system 114 is a computer programmed to monitor transmissions of data from the 3D motion sensor 112. The computerized monitoring system 114 may be integral to the 3D motion sensor 112 or a distinctly separate apparatus from the 3D motion sensor 112, possibly in a location remote from the 3D motion sensor 112, provided that the computerized monitoring system 114 can receive data from the 3D motion sensor 112. The computerized monitoring system 114 may be located in the monitored patient room, such as a hospital room, bedroom, or living room. The computerized monitoring system 114 may be connected to a central video monitoring system 126. The computerized monitoring system 114 and central video monitoring system 126 may be remotely located at any physical locations so long as a data connection exists (USB, TCP/IP, or comparable) between the computerized monitoring system 114, the central video monitoring system 126, and the 3D motion sensor(s) 112.


The computerized monitoring system 114 may receive data from 3D motion sensor 112 for a monitoring zone (i.e., the patient's room or area to be monitored). Computerized monitoring system 114 may assign reference points to identify the boundaries of the monitoring zone. For example, reference points may be assigned to a perimeter around the patient. It should be understood that the selection of the reference points may vary with the individual and/or the configuration of the location assignment system 100. Reference points may be configured automatically by the location assignment system 100, may be configured automatically by the location assignment system 100 subject to confirmation and/or modification by a system user, or may be configured manually by a system user.
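As a non-limiting sketch of how such reference points might define and enforce a boundary, the following example stores the perimeter as a polygon and applies a standard ray-casting point-in-polygon test; the disclosure does not specify this particular test, so it is an assumption of the example.

```python
from typing import List, Tuple

Point = Tuple[float, float]

def point_in_zone(point: Point, boundary: List[Point]) -> bool:
    """Ray-casting test: is `point` inside the polygon formed by the
    zone's reference points? (Illustrative only; the disclosure does not
    specify how boundaries are evaluated.)"""
    x, y = point
    inside = False
    j = len(boundary) - 1
    for i in range(len(boundary)):
        xi, yi = boundary[i]
        xj, yj = boundary[j]
        # Count edge crossings of a horizontal ray extending from the point.
        if (yi > y) != (yj > y) and x < (xj - xi) * (y - yi) / (yj - yi) + xi:
            inside = not inside
        j = i
    return inside

# Reference points around a patient's bed, e.g., configured by a user.
bed_zone = [(0.0, 0.0), (2.0, 0.0), (2.0, 3.0), (0.0, 3.0)]
print(point_in_zone((1.0, 1.5), bed_zone))  # True: inside the perimeter
```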


If 3D motion sensor 112 begins to transmit data, computerized monitoring system 114 may, at step 118, assess whether a location beacon 110 on the 3D motion sensor 112 is broadcasting a known location identification. For example, a signal may be received by computerized monitoring system 114 from the location beacon 110 on the 3D motion sensor 112. If no signal is received, the computerized monitoring system 114 may continue to wait for a signal from a location beacon on the 3D motion sensor 112 (or utilize other methods to determine the location identification as described in more detail below with respect to FIGS. 2 and 3) and analyze images in the monitoring zone as long as 3D motion sensor 112 continues to transmit data.


If, on the other hand, the location beacon 110 on the 3D motion sensor 112 is broadcasting a known location identification, computerized monitoring system 114 assigns the 3D motion sensor 112 to the location corresponding to the known location identification. The location assignment may be stored in location database 116. Regardless of whether a signal is received or includes a known location identification, images received from the 3D motion sensor 112 may additionally be communicated to central video monitoring system 126.
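A minimal sketch of the step-118 check appears below, assuming the beacon broadcast is a string identifier that is looked up against known location identifications; the dictionary stands in for location database 116, and every name is illustrative rather than disclosed.

```python
# Known location identifications and their locations, standing in for
# location database 116 (contents are illustrative only).
KNOWN_LOCATIONS = {
    "LOC-4W-412-A": {"room": "412", "bed": "A"},
    "LOC-4W-412-B": {"room": "412", "bed": "B"},
}

assignments = {}  # sensor_id -> assigned location record

def on_beacon_signal(sensor_id: str, broadcast_id: str) -> bool:
    """Step 118: assign the sensor to a location only if the beacon
    broadcasts a known location identification; otherwise keep waiting
    (and fall back to the methods of FIGS. 2 and 3)."""
    location = KNOWN_LOCATIONS.get(broadcast_id)
    if location is None:
        return False  # unknown or missing identification: not assigned yet
    assignments[sensor_id] = location
    return True

print(on_beacon_signal("sensor-112", "LOC-4W-412-A"))  # True: assigned
print(assignments)  # {'sensor-112': {'room': '412', 'bed': 'A'}}
```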


When the 3D motion sensor 112 has been assigned to the location corresponding to the known location identification, the assignment may be recorded in an EMR 122 by an EMR system 120. If patient and care team information has been saved for the location, at step 124, the location assignment and the patient and care team information are communicated to central video monitoring system 126.


If patient and care team information has not been saved for the location, at step 124, patient and care team information corresponding to the location may be retrieved, at step 128. This information may be communicated to computerized monitoring system 114 and then to central video monitoring system 126 along with the location assignment.
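The following sketch illustrates this saved-information check (step 124) and the fallback retrieval (step 128); the cache and EMR lookups are hypothetical placeholders rather than a disclosed interface.

```python
def location_context(location_key, saved_info, emr_records):
    """Return patient and care team information for a location, consulting
    saved information first (step 124) and the EMR system otherwise
    (step 128). Both mappings are illustrative stand-ins, not a real API."""
    info = saved_info.get(location_key)
    if info is None:
        info = emr_records.get(location_key)  # hypothetical EMR retrieval
        if info is not None:
            saved_info[location_key] = info   # save for subsequent lookups
    return info

emr_records = {("412", "A"): {"patient": "Doe, J.", "care_team": ["RN Smith"]}}
saved_info = {}
print(location_context(("412", "A"), saved_info, emr_records))
```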


On receiving an image and/or patient and care team information, the central video monitoring system 126, or an attendant there, may receive an alert that provides a live image, video, and/or audio feed from the 3D motion sensor 112. One or more caregivers local to the patient can be alerted with, or even before, central video monitoring system 126, so that the caregivers can assess what is happening in person. The priority and timing of alerts to different individuals or stations can be configured in accordance with the needs and desires of a particular facility, experience with a particular monitored individual or type of patient, or any other criterion of the system owner or user. This is true for initial alerts as well as continuing alerts (e.g., if 3D motion sensor 112 is detected in, and has not been assigned to, a location) or repeated alerts (two or more distinct events where 3D motion sensor 112 is detected and not assigned to a location). The priority and timing of alerts to different individuals may be different for initial, continuing, and/or repeated alerts.


Data associated with alerts may be logged by computerized monitoring system 114 and/or central video monitoring system 126 in a database 116. Data associated with an alert may include, without limitation, the telemetry data from 3D motion sensor 112 that triggered the alert; buffered data preceding the telemetry data that triggered the alert; telemetry data subsequent to the alert; the number and substantive content of an alert; the individual(s) and/or groups to whom an alert was addressed; the response, if any, received or observed following an alert; and combinations thereof.
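One plausible shape for such a log record, mirroring the categories of data enumerated above, is sketched below; the field names are assumptions of the example, not terms from the disclosure.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List, Optional

@dataclass
class AlertLogEntry:
    """One logged alert; fields mirror the enumerated categories above
    (names are illustrative, not taken from the disclosure)."""
    sensor_id: str
    triggering_telemetry: dict        # telemetry that triggered the alert
    buffered_telemetry: List[dict]    # buffered data preceding the trigger
    subsequent_telemetry: List[dict]  # telemetry following the alert
    content: str                      # substantive content of the alert
    recipients: List[str]             # individuals/groups addressed
    response: Optional[str] = None    # response received or observed, if any
    logged_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

entry = AlertLogEntry(
    sensor_id="112",
    triggering_telemetry={"event": "unassigned_sensor_detected"},
    buffered_telemetry=[], subsequent_telemetry=[],
    content="3D motion sensor detected but not assigned to a location",
    recipients=["central-video-monitoring"],
)
print(entry.content)
```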


Referring now to FIG. 2, location assignment system 200 may be utilized to assign the location of 3D motion sensor 212 in the room of a patient. A 3D motion sensor 212 may be co-located with a patient room to be monitored. The patient room to be monitored may be monitored in a variety of environments, including, without limitation, a hospital, a home, a hospice care facility, a nursing home, an assisted living facility, an outpatient medical care facility, and the like. The 3D motion sensor 212 may be positioned where it is likely to capture images of the patient room to be monitored. For example, a 3D motion sensor 212 may be oriented to take images of a bed, chair, or other location where a patient in the patient room to be monitored may spend a significant amount of time. In some embodiments, the 3D motion sensor 212 may be oriented to take images of persons and/or objects entering and exiting the patient room to be monitored. In some embodiments, the 3D motion sensor 212 may be oriented to take images of equipment (e.g., medical devices) that may be located in the patient room to be monitored. The 3D motion sensor 212 may be permanently installed, or may be temporarily set up in a room as needed. The patient in the patient room to be monitored may be under immediate medical care, e.g., in a medical facility under the supervision of a medical professional, or may not be under immediate care, e.g., in a home or other environment, possibly with a caregiver. A caregiver may be a medical professional or paraprofessional, such as an orderly, nurse's aide, nurse, or the like. A caregiver may also be a friend, relative, individual, company, or facility that provides assistance with daily living activities and/or medical care for individuals, such as individuals who are disabled, ill, injured, elderly, or otherwise in need of temporary or long-term assistance. In some instances, the person to be monitored may be self-sufficient and not under the immediate care of any other person or service provider.


The 3D motion sensor 212 may communicate data, such as images of the patient room being monitored, to a computerized monitoring system 214. The computerized monitoring system 214 is a computer programmed to monitor transmissions of data from the 3D motion sensor 212. The computerized monitoring system 214 may be integral to the 3D motion sensor 212 or a distinctly separate apparatus from the 3D motion sensor 212, possibly in a location remote from the 3D motion sensor 212, provided that the computerized monitoring system 214 can receive data from the 3D motion sensor 212. The computerized monitoring system 214 may be located in the monitored patient room, such as a hospital room, bedroom, or living room. The computerized monitoring system 214 may be connected to a central video monitoring system 226. The computerized monitoring system 214 and central video monitoring system 226 may be remotely located at any physical locations so long as a data connection exists (USB, TCP/IP, or comparable) between the computerized monitoring system 214, the central video monitoring system 226, and the 3D motion sensor(s) 212.


The computerized monitoring system 214 may receive data from 3D motion sensor 212 for a monitoring zone (i.e., the patient's room or area to be monitored). Computerized monitoring system 214 may assign reference points to identify the boundaries of the monitoring zone. For example, reference points may be assigned to a perimeter around the patient. It should be understood that the selection of the reference points may vary with the individual and/or the configuration of the location assignment system 200. Reference points may be configured automatically by the location assignment system 200, may be configured automatically by the location assignment system 200 subject to confirmation and/or modification by a system user, or may be configured manually by a system user.


If 3D motion sensor 212 begins to transmit data in a location, computerized monitoring system 214 may, at step 218, assess whether a known location identification is visible on a visual location identifier tag 210 in the location. For example, computerized monitoring system 214 may receive data from 3D motion sensor 212 that includes a visual location identifier tag. If no such data is received, the computerized monitoring system 214 may continue to wait for a known location identification to be visible on a visual location identifier tag 210 in the location (or utilize other methods to determine the location identification as described herein with respect to FIGS. 1 and 3) and analyze images in the monitoring zone as long as 3D motion sensor 212 continues to transmit data.


If, on the other hand, a known location identification is visible on a visual location identifier tag 210 in the location, computerized monitoring system 214 assigns the 3D motion sensor 212 to the location corresponding to the known location identification. The location assignment may be stored in location database 216. Regardless of whether a visual location identifier tag 210 is detected or bears a known location identification, images received from the 3D motion sensor 212 may additionally be communicated to central video monitoring system 226.
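As one hedged example of step 218, the sketch below assumes the visual location identifier tag is a QR code and uses OpenCV's QRCodeDetector to decode it from a sensor frame; the disclosure also contemplates bar codes and distinct objects or symbols, so this is only one possible implementation, and the file name and location table are hypothetical.

```python
import cv2  # OpenCV: one plausible way to read a visual tag from an image

KNOWN_LOCATIONS = {"LOC-4W-412-A": {"room": "412", "bed": "A"}}

def location_from_frame(frame):
    """Try to decode a QR-style visual location identifier tag from a
    sensor frame. Returns the location record, or None if no known tag
    is visible (the system then keeps waiting or tries other methods)."""
    data, points, _ = cv2.QRCodeDetector().detectAndDecode(frame)
    if not data:
        return None  # no decodable tag visible in this frame
    return KNOWN_LOCATIONS.get(data)

frame = cv2.imread("room_frame.png")  # hypothetical captured image
if frame is not None:
    print(location_from_frame(frame))
```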


When the 3D motion sensor 212 has been assigned to the location corresponding to the known location identification, the assignment may be recorded in an EMR 222 by an EMR system 220. If patient and care team information has been saved for the location, at step 224, the location assignment and the patient and care team information are communicated to central video monitoring system 226.


If patient and care team information has not been saved for the location, at step 224, patient and care team information corresponding to the location may be retrieved, at step 228. This information may be communicated to computerized monitoring system 214 and then to central video monitoring system 226 along with the location assignment.


On receiving the location assignment and/or patient and care team information, the central video monitoring system 226, or an attendant there, may receive an alert that provides a live image, video, and/or audio feed from the 3D motion sensor 212. One or more caregivers local to the patient can be alerted with, or even before, central video monitoring system 226, so that the caregivers can assess what is happening in person. The priority and timing of alerts to different individuals or stations can be configured in accordance with the needs and desires of a particular facility, experience with a particular monitored individual or type of patient, or any other criterion of the system owner or user. This is true for initial alerts as well as continuing alerts (e.g., if a 3D motion sensor 212 is detected in, and has not been assigned to, a location) or repeated alerts (two or more distinct events where a 3D motion sensor 212 is detected and not assigned to a location). The priority and timing of alerts to different individuals may be different for initial, continuing, and/or repeated alerts.


Data associated with alerts may be logged by computerized monitoring system 214 and/or central video monitoring system 226 in a database 216. Data associated with an alert may include, without limitation, the telemetry data from 3D motion sensor 212 that triggered the alert; buffered data preceding the telemetry data that triggered the alert; telemetry data subsequent to the alert; the number and substantive content of an alert; the individual(s) and/or groups to whom an alert was addressed; the response, if any, received or observed following an alert; and combinations thereof.


As shown in FIG. 3, location assignment system 300 may be utilized to assign the location of a 3D motion sensor 312 in the room of a patient. A 3D motion sensor 312 may be co-located with a patient room to be monitored. The patient room to be monitored may be monitored in a variety of environments, including, without limitation, a hospital, a home, a hospice care facility, a nursing home, an assisted living facility, an outpatient medical care facility, and the like. The 3D motion sensor 312 may be positioned where it is likely to capture images of the patient room to be monitored. For example, a 3D motion sensor 312 may be oriented to take images of a bed, chair, or other location where a patient in the patient room to be monitored may spend a significant amount of time. In some embodiments, the 3D motion sensor 312 may be oriented to take images of persons and/or objects entering and exiting the patient room to be monitored. In some embodiments, the 3D motion sensor 312 may be oriented to take images of equipment (e.g., medical devices) that may be located in the patient room to be monitored. The 3D motion sensor 312 may be permanently installed, or may be temporarily set up in a room as needed. The patient in the patient room to be monitored may be under immediate medical care, e.g., in a medical facility under the supervision of a medical professional, or may not be under immediate care, e.g., in a home or other environment, possibly with a caregiver. A caregiver may be a medical professional or paraprofessional, such as an orderly, nurse's aide, nurse, or the like. A caregiver may also be a friend, relative, individual, company, or facility that provides assistance with daily living activities and/or medical care for individuals, such as individuals who are disabled, ill, injured, elderly, or otherwise in need of temporary or long-term assistance. In some instances, the person to be monitored may be self-sufficient and not under the immediate care of any other person or service provider.


The 3D motion sensor 312 may communicate data, such as images of the patient room being monitored, to a computerized monitoring system 314. The computerized monitoring system 314 is a computer programmed to monitor transmissions of data from the 3D motion sensor 312. The computerized monitoring system 314 may be integral to the 3D motion sensor 312 or a distinctly separate apparatus from the 3D motion sensor 312, possibly in a location remote from the 3D motion sensor 312, provided that the computerized monitoring system 314 can receive data from the 3D motion sensor 312. The computerized monitoring system 314 may be located in the monitored patient room, such as a hospital room, bedroom, or living room. The computerized monitoring system 314 may be connected to a central video monitoring system 326. The computerized monitoring system 314 and central video monitoring system 326 may be remotely located at any physical locations so long as a data connection exists (USB, TCP/IP, or comparable) between the computerized monitoring system 314, the central video monitoring system 326, and the 3D motion sensor(s) 312.


The computerized monitoring system 314 may receive data from 3D motion sensor 312 for a monitoring zone (i.e., the patient's room or area to be monitored). Computerized monitoring system 314 may assign reference points to identify the boundaries of the monitoring zone. For example, reference points may be assigned to a perimeter around the patient. It should be understood that the selection of the reference points may vary with the individual and/or the configuration of the location assignment system 300. Reference points may be configured automatically by the location assignment system 300, may be configured automatically by the location assignment system 300 subject to confirmation and/or modification by a system user, or may be configured manually by a system user.


If a 3D motion sensor 312 begins transmitting, computerized monitoring system 314 may, at step 318, assess whether a location identification has been selected for the 3D motion sensor. For example, the location identification may be selected by a user and received by computerized monitoring system 314, indicating the location of the 3D motion sensor 312. If the location identification has not been selected by a user and received by computerized monitoring system 314, the computerized monitoring system 314 may continue to wait for the location identification of the 3D motion sensor 312 (or utilize other methods to determine the location identification as described in more detail above with respect to FIGS. 1 and 2) and analyze images in the monitoring zone as long as 3D motion sensor 312 continues to transmit data.


If, on the other hand, a location identification has been selected by a user and is received by computerized monitoring system 314, computerized monitoring system 314 assigns the 3D motion sensor 312 to the location corresponding to the location identification. The location assignment may be stored in location database 316. Regardless of whether a location identification is received, images received from the 3D motion sensor 312 may additionally be communicated to central video monitoring system 326.
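Pulling the three identification methods together, the following sketch resolves a sensor's location by trying a user selection (FIG. 3), then a beacon broadcast (FIG. 1), then a visual tag decoded from an image (FIG. 2); this particular priority ordering is an assumption of the example, not a requirement of the disclosure.

```python
def resolve_location(user_selection, beacon_id, frame,
                     known_locations, decode_tag):
    """Try each disclosed identification method in turn; return the matched
    location record, or None to keep waiting while images are analyzed."""
    for candidate in (user_selection, beacon_id):  # FIG. 3, then FIG. 1
        if candidate in known_locations:
            return known_locations[candidate]
    tag = decode_tag(frame)  # FIG. 2, e.g., QR decoding as sketched above
    return known_locations.get(tag)

known = {"LOC-4W-412-A": {"room": "412", "bed": "A"}}
print(resolve_location("LOC-4W-412-A", None, None, known, lambda f: None))
```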


When the 3D motion sensor 312 has been assigned to the location corresponding to the location identification, the assignment may be recorded in an EMR 322 by an EMR system 320. If patient and care team information has been saved for the location, at step 324, the location assignment and the patient and care team information are communicated to central video monitoring system 326.


If patient and care team information has not been saved for the location, at step 324, patient and care team information corresponding to the location may be retrieved, at step 328. This information may be communicated to computerized monitoring system 314 and then to central video monitoring system 326 along with the location assignment.


On receiving the location assignment and/or patient and care team information, the central video monitoring system 326, or an attendant there, may receive an alert that provides a live image, video, and/or audio feed from the 3D motion sensor 312. One or more caregivers local to the patient can be alerted with, or even before, central video monitoring system 326, so that the caregivers can assess what is happening in person. The priority and timing of alerts to different individuals or stations can be configured in accordance with the needs and desires of a particular facility, experience with a particular monitored individual or type of patient, or any other criterion of the system owner or user. This is true for initial alerts as well as continuing alerts (e.g., if a 3D motion sensor 312 is detected in, and has not been assigned to, a location) or repeated alerts (two or more distinct events where a 3D motion sensor 312 is detected and not assigned to a location). The priority and timing of alerts to different individuals may be different for initial, continuing, and/or repeated alerts.


Data associated with alerts may be logged by computerized monitoring system 314 and/or central video monitoring system 326 in a database 316. Data associated with an alert may include, without limitation, the telemetry data from 3D motion sensor 312 that triggered the alert; buffered data preceding the telemetry data that triggered the alert; telemetry data subsequent to the alert; the number and substantive content of an alert; the individual(s) and/or groups to whom an alert was addressed; the response, if any, received or observed following an alert; and combinations thereof.



FIG. 4 shows an exemplary view for location assignment 400. As illustrated, a clinician may select a patient onboarding icon 410 to begin the process of assigning a 3D motion sensor to a location. In FIG. 5, a patient onboarding view 500 illustrates a bed icon 510 that can be selected to assign a device to a location manually or to employ one of the automatic location assignment options (such as the options discussed with reference to FIGS. 1 and 2). FIG. 6 illustrates a location view that enables a clinician to select a room 610 and/or bed 612 to assign the 3D motion sensor to the corresponding location (i.e., the room and/or bed). Once selected, patient and care team demographics 710 may be populated based on the selection 720, as illustrated in the populated onboarding view 700 in FIG. 7. Accordingly, as shown in FIG. 8, the patient demographics and care team information, as well as any images (such as those corresponding to the patient, the patient room, and/or the location assignment), are provided in video feed 800.


The various computerized systems and processors as described herein may include, individually or collectively, and without limitation, a processing unit, internal system memory, and a suitable system bus for coupling various system components, including database 116, with a control server. Computerized monitoring system 114 and/or central video monitoring system 126 may provide control server structure and/or function. The system bus may be any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, and a local bus, using any of a variety of bus architectures. By way of example, and not limitation, such architectures include Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus.


The computerized systems typically include therein, or have access to, a variety of computer-readable media, for instance, database 116. Computer-readable media can be any available media that may be accessed by the computerized system, and includes volatile and nonvolatile media, as well as removable and non-removable media. By way of example, and not limitation, computer-readable media may include computer-storage media and communication media. Computer-readable storage media may include, without limitation, volatile and nonvolatile media, as well as removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, program modules, or other data. In this regard, computer-storage media may include, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVDs) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage, or other magnetic storage device, or any other medium which can be used to store the desired information and which may be accessed by the control server. Computer-readable storage media excludes signals per se.


Communication media typically embodies computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism, and may include any information delivery media. As used herein, the term “modulated data signal” refers to a signal that has one or more of its attributes set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared, and other wireless media. Combinations of any of the above also may be included within the scope of computer-readable media. The computer-readable storage media discussed above, including database 116, provide storage of computer readable instructions, data structures, program modules, and other data for the computerized systems. Computer readable instructions embodied on computer-readable storage media may be accessible by location assignment system 100 and/or component(s) thereof, and, when executed by a computer processor and/or server, may cause the system to function and/or perform the methods described herein.


The computerized systems may operate in a computer network using logical connections to one or more remote computers. Remote computers may be located at a variety of locations, for example, but not limited to, hospitals and other inpatient settings, veterinary environments, ambulatory settings, medical billing and financial offices, hospital administration settings, home health care environments, payer offices (e.g., insurance companies), home health care agencies, clinicians' offices and the clinician's home or the patient's own home or over the Internet. Clinicians may include, but are not limited to, a treating physician or physicians, specialists such as surgeons, radiologists, cardiologists, and oncologists, emergency medical technicians, physicians' assistants, nurse practitioners, nurses, nurses' aides, pharmacists, dieticians, microbiologists, laboratory experts, laboratory technologists, genetic counselors, researchers, veterinarians, students, and the like. The remote computers may also be physically located in non-traditional medical care environments so that the entire health care community may be capable of integration on the network. The remote computers may be personal computers, servers, routers, network PCs, peer devices, other common network nodes, or the like, and may include some or all of the elements described above in relation to the control server. The devices can be personal digital assistants or other like devices.


Exemplary computer networks may include, without limitation, local area networks (LANs) and/or wide area networks (WANs). Such networking environments are commonplace in offices, enterprise-wide computer networks, intranets, and the Internet. When utilized in a WAN networking environment, the control server may include a modem or other means for establishing communications over the WAN, such as the Internet. In a networked environment, program modules or portions thereof may be stored in the control server, in the database 116, or on any of the remote computers. For example, and not by way of limitation, various application programs may reside on the memory associated with any one or more of the remote computers. It will be appreciated by those of ordinary skill in the art that the network connections shown are exemplary and other means of establishing a communications link between the computers may be utilized.


In operation, a user may enter commands and information into the computerized system(s) using input devices, such as a keyboard, a pointing device (commonly referred to as a mouse), a trackball, a touch pad, a 3D gesture-recognition camera, or a motion sensor. Other input devices may include, without limitation, microphones, satellite dishes, scanners, or the like. In addition to or in lieu of a monitor, the computerized systems may include other peripheral output devices, such as speakers and a printer.


Many other internal components of the computerized system hardware are not shown because such components and their interconnection are well known. Accordingly, additional details concerning the internal construction of the computers that make up the computerized systems are not further disclosed herein.


Methods and systems of embodiments of the present disclosure may be implemented in a WINDOWS or LINUX operating system, operating in conjunction with an Internet-based delivery system, however, one of ordinary skill in the art will recognize that the described methods and systems can be implemented in any operating system suitable for supporting the disclosed processing and communications. As contemplated by the language above, the methods and systems of embodiments of the present invention may also be implemented on a stand-alone desktop, personal computer, cellular phone, smart phone, tablet computer, PDA, or any other computing device used in a healthcare environment or any of a number of other locations.


From the foregoing, it will be seen that this disclosure is well adapted to attain all the ends and objects hereinabove set forth together with other advantages which are obvious and which are inherent to the structure.


It will be understood that certain features and subcombinations are of utility and may be employed without reference to other features and subcombinations. This is contemplated by and is within the scope of the claims.


Since many possible embodiments may be made of the invention without departing from the scope thereof, it is to be understood that all matter herein set forth or shown in the accompanying drawings is to be interpreted as illustrative and not in a limiting sense.

Claims
  • 1. A system for assigning locations to 3D motion sensors, the system comprising: one or more motion sensors configured to collect one or more images; a computerized monitoring system communicatively coupled to the one or more motion sensors, the computerized monitoring system configured to: determine that a location identification is visible on a location identifier tag in the one or more images received from the one or more motion sensors, based on determining that the location identification is visible in the one or more images from the one or more motion sensors, automatically assign the one or more motion sensors to a location corresponding to the location identification to indicate that the one or more motion sensors are located at the location, and send an assignment of the one or more motion sensors to the location to an electronic medical records (EMR) system for recording; and the computerized monitoring system communicatively coupled to the EMR system, the EMR system configured to communicate patient information and care team information available for the location to the computerized monitoring system.
  • 2. The system of claim 1, further comprising a central video monitoring system communicatively coupled to the computerized monitoring system, the central video monitoring system configured to display at least a portion of the one or more images, and the patient information and care team information for the location.
  • 3. The system of claim 2, wherein the central video monitoring station comprises a primary display and an alert display.
  • 4. The system of claim 2, wherein the alert display is a dedicated portion of the primary display or a separate display or series of displays from the primary display.
  • 5. A method for assigning locations to 3D motion sensors, the method comprising: receiving, by a motion sensor, a signal from a location beacon corresponding to a location; determining the signal received by the motion sensor includes a known location identification identifying one or more of a room number or a bed number corresponding to the location; in response to determining the signal received by the motion sensor includes the known location identification, automatically assigning the motion sensor to the location corresponding to the known location identification to indicate that the motion sensor is located at the location; and sending an assignment of the motion sensor to the location to an electronic medical records (EMR) system for recording.
  • 6. The method of claim 5, further comprising retrieving, from the EMR system, patient information and care team information for the location.
  • 7. The method of claim 6, further comprising upon saving the patient information and care team information, communicating the patient information and care team information to a central video monitoring system.
  • 8. The method of claim 5, wherein the assignment of the motion sensor to the location is communicated to a location database.
  • 9. The method of claim 5, further comprising, if the signal is not detected, using the motion sensor to identify a visual indicator.
  • 10. The method of claim 9, wherein the visual indicator comprises a bar code, QR code, or a distinct object or symbol corresponding to the location.
  • 11. The method of claim 10, further comprising retrieving, from the EMR system, patient information and care team information for the location.
  • 12. Computer-readable storage media having embodied thereon instructions which, when executed by one or more computer processors, cause the processors to: receive, from a motion sensor, an image of a location corresponding to a patient; determine if a location identification corresponding to at least one of the location or the patient has been input by a user; upon determining the location identification corresponding to at least one of the location or the patient has not been input by the user, receive, by the motion sensor, a location identification that is visible on a location identifier tag in the image of the location; based on the location identification visible in the image from the motion sensor, determine that the motion sensor is located at the location of the patient; automatically assign the motion sensor to the location corresponding to the location identification based on determining that the motion sensor is located at the location based on the location identification visible in the image from the motion sensor; send an assignment of the motion sensor to the location to an electronic medical records (EMR) system for recording; retrieve, from the EMR system, patient information and care team information for the location; and display the image of the location and the patient and the patient information and care team information for the location.
  • 13. The system of claim 2, wherein the computerized monitoring system is further configured to send an alert to a clinician present at the location with one or more of a live image, video and/or audio feed from the one or more motion sensors assigned to the location.
  • 14. The system of claim 13, wherein the computerized monitoring system is configured to send the alert to the clinician present at the location prior to an alert being sent to the central video monitoring system.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims the benefit of U.S. Provisional Application Ser. No. 62/273,735, entitled “Methods and Systems for Detecting Stroke Symptoms”, filed Dec. 31, 2015, herein incorporated by reference in its entirety, and is related to commonly assigned U.S. patent application Ser. No. 15/395,250, entitled “Methods and Systems for Detecting Prohibited Objects in a Patient Room”, and U.S. patent application Ser. No. 15/395,526, entitled “Detecting Unauthorized Visitors”, filed concurrently herewith.

Pending U.S. Application by same inventor Neil Kusens, U.S. Appl. No. 14/599,498, filed Jan. 17, 2015, entitled “Method and System for Determining Whether an Individual Takes Appropriate Measures to Prevent the Spread of Healthcare Associated Infections”.
Pending U.S. Application by same inventor Neil Kusens, U.S. Appl. No. 14/611,363, filed Feb. 2, 2015, entitled “Method and System for Determining Whether an Individual Takes Appropriate Measures to Prevent the Spread of Healthcare Associated Infections”.
Pending U.S. Application by same inventor Neil Kusens, U.S. Appl. No. 14/613,866, filed Feb. 4, 2015, entitled “Method and System for Determining Whether an Individual Takes Appropriate Measures to Prevent the Spread of Healthcare Associated Infections Along With Centralized Monitoring”.
Pending U.S. Application by same inventor Neil Kusens, U.S. Appl. No. 14/623,349, filed Feb. 16, 2015, entitled “Method for Determining Whether an Individual Enters a Prescribed Virtual Zone Using 3D Blob Detection”.
Pending U.S. Application by same inventor Neil Kusens, U.S. Appl. No. 14/724,969, filed May 29, 2015, entitled “Method and Process for Determining Whether an Individual Suffers a Fall Requiring Assistance”.
Pending U.S. Application by same inventor Neil Kusens, U.S. Appl. No. 14/743,264, filed Jun. 18, 2015, entitled “System for Determining Whether an Individual Enters a Prescribed Virtual Zone Using 3D Blob Detection”.
Tom Mooney, “Rhode Island ER first to test Google Glass on medical conditions”, http://www.ems1.com/ems-products/cameras-video/articles/1860487-Rhode-Island-ER-first . . . printed on Mar. 11, 2014.
Non-Final Office Action dated Jan. 11, 2017 in U.S. Appl. No. 14/611,363, 19 pages.
Non-Final Office Action dated Feb. 23, 2017 in U.S. Appl. No. 14/757,877, 24 pages.
First Action Interview Pre-Interview Communication dated Feb. 24, 2017 in U.S. Appl. No. 15/395,716, 5 pages.
Notice of Allowance dated Mar. 20, 2017 in U.S. Appl. No. 14/613,866, 11 pages.
Non-Final Office Action dated Apr. 5, 2017 in U.S. Appl. No. 14/623,349, 15 pages.
Non-Final Office Action dated Apr. 11, 2017 in U.S. Appl. No. 15/285,416, 13 pages.
Notice of Allowance dated Apr. 19, 2017 in U.S. Appl. No. 15/395,716, 5 pages.
Non-Final Office Action dated Apr. 21, 2017 in U.S. Appl. No. 14/757,593, 9 pages.
Notice of Allowance dated Apr. 21, 2017 in U.S. Appl. No. 14/724,969, 9 pages.
Notice of Allowance dated Apr. 25, 2017 in U.S. Appl. No. 14/727,434, 9 pages.
Non-Final Office Action dated Apr. 27, 2017 in U.S. Appl. No. 15/395,526, 16 pages.
Final Office Action dated Apr. 28, 2017 in U.S. Appl. No. 14/611,363, 20 pages.
Non-Final Office Action dated May 8, 2017 in U.S. Appl. No. 15/395,250, 19 pages.
Non-Final Office Action dated May 31, 2017 in U.S. Appl. No. 14/599,498, 24 pages.
Notice of Allowance dated Jul. 5, 2017 in U.S. Appl. No. 14/727,434, 9 pages.
Notice of Allowance dated Jul. 24, 2017 in U.S. Appl. No. 15/395,716, 5 pages.
Non-Final Office Action dated Aug. 16, 2017 in U.S. Appl. No. 14/757,593, 8 pages.
Final Office Action dated Aug. 23, 2017 in U.S. Appl. No. 15/285,416, 16 pages.
Notice of Allowance dated Sep. 21, 2017 in U.S. Appl. No. 15/395,526, 13 pages.
Notice of Allowance dated Sep. 26, 2017 in U.S. Appl. No. 15/395,250, 13 pages.
Final Office Action dated Sep. 29, 2017 in U.S. Appl. No. 14/757,877, 22 pages.
Final Office Action dated Oct. 4, 2017 in U.S. Appl. No. 14/623,349, 30 pages.
Notice of Allowance dated Oct. 10, 2017 in U.S. Appl. No. 14/727,434, 9 pages.
Final Office Action dated Oct. 12, 2017 in U.S. Appl. No. 14/599,498, 28 pages.
Notice of Allowance dated Oct. 20, 2017 in U.S. Appl. No. 15/279,054, 14 pages.
Final Office Action dated Jul. 5, 2018 in U.S. Appl. No. 15/285,416, 8 pages.
Final Office Action dated Jul. 12, 2018 in U.S. Appl. No. 15/134,189, 23 pages.
Notice of Allowance dated Jul. 13, 2018 in U.S. Appl. No. 15/396,263, 9 pages.
Notice of Allowance dated Jul. 18, 2018 in U.S. Appl. No. 14/599,498, 6 pages.
Notice of Allowance dated Jul. 23, 2018 in U.S. Appl. No. 15/728,110, 15 pages.
Non-Final Office Action dated Aug. 15, 2018 in U.S. Appl. No. 15/910,632, 7 pages.
Non-Final Office Action dated Sep. 10, 2018 in U.S. Appl. No. 15/910,645, 11 pages.
Notice of Allowance dated Sep. 21, 2018 in U.S. Appl. No. 15/285,416, 8 pages.
Non-Final Office Action dated Feb. 11, 2016 in U.S. Appl. No. 14/724,969, 14 pages.
Final Office Action dated Jul. 28, 2016 in U.S. Appl. No. 14/724,969, 26 pages.
Notice of Allowance dated Dec. 23, 2016 in U.S. Appl. No. 14/724,969, 5 pages.
Raheja, et al., “Human Facial Expression Detection From Detected in Captured Image Using Back Propagation Neural Network”, International Journal of Computer Science and Information Technology (IJCSIT), vol. 2, No. 1, Feb. 2010, 8 pages.
Final Office Action dated Oct. 18, 2017 in U.S. Appl. No. 15/396,263, 20 pages.
First Action Interview Office Action dated Nov. 28, 2017 in U.S. Appl. No. 14/244,160, 5 pages.
Non-Final Office Action dated Apr. 14, 2017 in U.S. Appl. No. 15/396,263, 18 pages.
Notice of Allowance dated Nov. 27, 2017 in U.S. Appl. No. 15/279,054, 2 pages.
First Action Interview Office Action dated Feb. 22, 2018 in U.S. Appl. No. 15/134,189, 4 pages.
First Action Interview Pre-Interview Communication dated May 21, 2018 in U.S. Appl. No. 15/910,645, 14 pages.
Non-Final Office Action dated Feb. 22, 2018 in U.S. Appl. No. 14/599,498, 24 pages.
Non-Final Office Action dated Feb. 7, 2018 in U.S. Appl. No. 15/396,263, 19 pages.
Non-Final Office Action dated Jun. 8, 2018 in U.S. Appl. No. 15/628,318, 9 pages.
Non-Final Office Action dated Mar. 12, 2018 in U.S. Appl. No. 15/285,416, 20 pages.
Non-Final Office Action dated May 2, 2018 in U.S. Appl. No. 15/728,110, 8 pages.
Non-Final Office Action dated May 31, 2018 in U.S. Appl. No. 15/395,762, 24 pages.
Non-Final Office Action dated May 31, 2018 in U.S. Appl. No. 15/848,621, 23 pages.
Non-Final Office Action dated May 7, 2018 in U.S. Appl. No. 14/611,363, 6 pages.
Non-Final Office Action dated May 8, 2018 in U.S. Appl. No. 15/148,151, 5 pages.
Notice of Allowance dated Jan. 18, 2018 in U.S. Appl. No. 15/279,054, 2 pages.
Notice of Allowance dated Jun. 18, 2018 in U.S. Appl. No. 14/623,349, 11 pages.
Notice of Allowance dated May 9, 2018 in U.S. Appl. No. 15/395,716, 5 pages.
Notice of Allowance dated Jun. 13, 2018 in U.S. Appl. No. 14/575,850, 5 pages.
First Action Interview Pre-Interview Communication dated Nov. 22, 2017 in U.S. Appl. No. 15/134,189, 4 pages.
Virtual Patient Observation: Centralize Monitoring of High-Risk Patients with Video—Cisco Video Surveillance Manager, https://www.cisco.com/c/en/us/products/collateral/physical-security/video-surveillance-manager/white_paper_C11-715263.pdf.
Notice of Allowance dated Dec. 6, 2017 in U.S. Appl. No. 15/395,716, 5 pages.
Final Office Action dated Dec. 12, 2017 in U.S. Appl. No. 14/575,850, 10 pages.
Notice of Allowance dated Dec. 29, 2017 in U.S. Appl. No. 14/611,363, 11 pages.
Notice of Allowance dated Feb. 12, 2018 in U.S. Appl. No. 14/623,349, 12 pages.
Final Office Action dated Feb. 16, 2018 in U.S. Appl. No. 14/757,593, 8 pages.
Non-Final Office Action received for U.S. Appl. No. 16/107,567, dated Mar. 29, 2019, 8 pages.
Non-Final Office Action received for U.S. Appl. No. 15/395,762, dated May 1, 2019, 27 pages.
Non-Final Office Action received for U.S. Appl. No. 15/856,419, dated May 2, 2019, 8 pages.
Non-Final Office Action received for U.S. Appl. No. 16/216,210, dated Feb. 13, 2019, 29 pages.
Notice of Allowance received for U.S. Appl. No. 16/380,013, dated Jul. 10, 2019, 10 pages.
Non-Final Office Action received for U.S. Appl. No. 15/134,189, dated May 9, 2019, 30 pages.
Preinterview First Office Action received for U.S. Appl. No. 15/857,696, dated May 23, 2019, 14 pages.
Conaire, et al., “Fusion of Infrared and Visible Spectrum Video for Indoor Surveillance”, WIAMIS, Apr. 2005, 4 pages.
Final Office Action received for U.S. Appl. No. 15/134,189, dated May 6, 2020, 31 pages.
Preinterview First Office Action received for U.S. Appl. No. 16/181,897, dated May 11, 2020, 5 pages.
Preinterview First Office Action received for U.S. Appl. No. 16/832,790, dated Aug. 25, 2020, 5 pages.
Notice of Allowance received for U.S. Appl. No. 16/181,897, dated Oct. 14, 2020, 9 pages.
Related Publications (1)
Number Date Country
20170193177 A1 Jul 2017 US
Provisional Applications (1)
Number Date Country
62273735 Dec 2015 US