Medical facilities, such as hospitals, face many challenges in addition to simply caring for patients. For example, securing patients and equipment (e.g., medical devices) consumes many resources, and current methods lack effectiveness. In addition to personnel physically monitoring locations within the facility, visitor logs, visitor badges, and radio-frequency identification (RFID) technology are often utilized to control access to certain locations within the facility. However, each of these requires subjective decision-making and is prone to error by the personnel monitoring the locations or assisting visitors with signing a visitor log and issuing visitor badges. Further, none of these methods necessarily prevents an authorized visitor from breaching areas of the facility where the visitor is not authorized. For example, a visitor may be authorized to visit a particular patient but, based on some condition of the patient, may not be permitted close contact with the patient. In contrast, a caregiver of the same patient may need to have close contact with the patient. Additionally, in some situations, an authorized visitor may unwittingly provide contraband (e.g., an object a particular patient is not allowed to possess or be near) to a patient, and current methods are unable to detect this. Finally, medical devices are constantly being shuffled between patients and locations within a facility, and tracking the locations of these devices can be extremely difficult. Accordingly, overall security for patients and equipment suffers, and the many resources currently being utilized are wasted.
This brief summary is provided as a general overview of the more detailed disclosure which follows. It is not intended to identify key or essential elements of the disclosure, or to define the claim terms in isolation from the remainder of the disclosure, including the drawings.
This disclosure generally relates to systems and methods for assigning locations to devices. Generally, and without limitation, the method involves identifying a signal from a location beacon corresponding to a location of the device, a location identification visible on a visual location identifier tag of the location, or a location identification selected by a user. Accordingly, the system assigns the location corresponding to the signal or the location identification to the device. The system may also retrieve patient and care team information for the location from an electronic medical records (EMR) system. Images of the location, the location assignment, and/or the patient and care team information may be communicated to a central video monitoring system.
In some aspects, this disclosure relates to a method for assigning locations to devices. The method comprises: receiving, by a motion sensor in a room of a patient, a signal from a location beacon in the room of the patient corresponding to a location; determining the signal includes a known location identification; and assigning the motion sensor to the location corresponding to the known location identification.
In some aspects, this disclosure relates to a system for assigning locations to devices. The system may comprise one or more 3D motion sensors. The one or more 3D motion sensors may be located to provide the one or more 3D motion sensors with a view of a location of a patient. The 3D motion sensors may be configured to collect a series of images of the location and the patient. The system may comprise a computerized monitoring system. The computerized monitoring system may be communicatively coupled to the one or more 3D motion sensors. The computerized monitoring system may be configured to determine if a location identification is visible on a visual location identifier tag in the location. The computerized monitoring system may be communicatively coupled to an EMR system. The EMR system may be configured to communicate patient and care team information available for the location to the computerized monitoring system.
The location assignment system may further comprise a central video monitoring station. The central video monitoring station may be communicatively coupled to the computerized monitoring system. The central video monitoring station may be configured to display at least a portion of the series of images of the location and the patient and/or information received by the computerized monitoring system. The central video monitoring station may comprise a primary display. The central video monitoring station may comprise an alert display. The alert display may be a dedicated portion of the primary display, or a separate display or series of displays from the primary display. If the computerized monitoring system detects an unknown device or a device in an unknown location, it may be configured to send an alert to the central video monitoring station. The central video monitoring station may be configured to move the display of at least a portion of the series of images of the location and the patient from the primary display to the alert display upon receipt of an alert.
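By way of example, and not limitation, the display-routing behavior described above might be sketched as follows; the class and method names are hypothetical and do not appear in this disclosure, and a deployed station would drive physical displays rather than lists:

```python
class CentralVideoMonitoringStation:
    """Minimal sketch: feeds normally appear on the primary display and
    are moved to the alert display upon receipt of an alert."""

    def __init__(self):
        self.primary_display = []  # feed identifiers on the primary display
        self.alert_display = []    # feed identifiers on the alert display

    def add_feed(self, feed_id):
        # New locations/patients are shown on the primary display by default.
        self.primary_display.append(feed_id)

    def on_alert(self, feed_id):
        # Upon an alert (e.g., unknown device or device in an unknown
        # location), move the feed from the primary to the alert display.
        if feed_id in self.primary_display:
            self.primary_display.remove(feed_id)
        if feed_id not in self.alert_display:
            self.alert_display.append(feed_id)
```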
In some aspects, this disclosure relates to computer-readable storage media having embodied thereon computer-executable instructions. When executed by one or more computer processors, the instructions may cause the processors to: receive, from a 3D motion sensor, an image of a location corresponding to a patient; determine whether a location identification corresponding to the location has been input by a user; upon determining the location identification corresponding to the location has been input by the user, assign the 3D motion sensor to the location corresponding to the location identification; retrieve, from an EMR system, patient and care team information for the location; and display the image of the location and the patient and the patient and care team information for the location.
Additional objects, advantages, and novel features of the disclosure will be set forth in part in the description which follows, and in part will become apparent to those skilled in the art upon examination of the following, or may be learned by practice of the disclosure.
The description references the attached drawing figures, wherein:
As noted in the Background, medical facilities, such as hospitals, face many challenges in addition to simply caring for patients. For example, securing patients and equipment (e.g., medical devices) consumes many resources, and current methods lack effectiveness. In addition to personnel physically monitoring locations within the facility, visitor logs, visitor badges, and radio-frequency identification (RFID) technology are often utilized to control access to certain locations within the facility. However, each of these requires subjective decision-making and is prone to error by the personnel monitoring the locations or assisting visitors with signing a visitor log and issuing visitor badges. Further, none of these methods necessarily prevents an authorized visitor from breaching areas of the facility where the visitor is not authorized. For example, a visitor may be authorized to visit a particular patient but is not authorized to visit another patient or particular areas of the facility. Additionally, in some situations, an authorized visitor may unwittingly provide contraband (e.g., an object a particular patient is not allowed to possess or be near) to a patient, and current methods are unable to detect this. Finally, medical devices are constantly being shuffled between patients and locations within a facility, and tracking the locations of these devices can be extremely difficult. Accordingly, overall security for patients and equipment suffers, and the many resources currently being utilized are wasted.
The subject matter of the present invention is described with specificity herein to meet statutory requirements. However, the description itself is not intended to limit the scope of this patent. Rather, the inventor has contemplated that the claimed subject matter might also be embodied in other ways, to include different steps or combinations of steps similar to the ones described in this document, in conjunction with other present or future technologies. Moreover, although the terms “step” and/or “block” may be used herein to connote different elements of methods employed, the terms should not be interpreted as implying any particular order among or between various steps herein disclosed unless and except when the order of individual steps is explicitly described.
As shown in the accompanying figures, this disclosure relates to systems and methods for assigning locations to devices, such as 3D motion sensors, within a medical facility.
As used herein, “a sensor” and “sensors” are used interchangeably in the singular and plural unless expressly described as a singular sensor or an array of sensors. A singular sensor may be used, or a sensor may comprise two or more cameras integrated into a single physical unit. Alternately, two or more physically distinct sensors may be used, or two or more physically distinct arrays of sensors may be used.
A “device” may be any device in the room of a patient being monitored that is movable (e.g., 3D motion sensor) and capable of communicating data to an EMR. As can be appreciated, knowing the location of such a device may be vital to ensuring that the correct information for the correct patient is communicated to the EMR.
As shown in FIG. 1, a location assignment system 100 may comprise a location beacon 110 and one or more 3D motion sensors 112.
The 3D motion sensor 112 may communicate data, such as images of the patient room being monitored, to a computerized monitoring system 114. The computerized monitoring system 114 is a computer programmed to monitor transmissions of data from the 3D motion sensor 112. The computerized monitoring system 114 may be integral to the 3D motion sensor 112 or a distinctly separate apparatus from the 3D motion sensor 112, possibly in a location remote from the 3D motion sensor 112, provided that the computerized monitoring system 114 can receive data from the 3D motion sensor 112. The computerized monitoring system 114 may be located in the monitored patient room, such as a hospital room, bedroom, or living room. The computerized monitoring system 114 may be connected to a central video monitoring system 126. The computerized monitoring system 114 and central video monitoring system 126 may be remotely located at any physical locations so long as a data connection exists (USB, TCP/IP, or comparable) between the computerized monitoring system 114, the central video monitoring system 126, and the 3D motion sensor(s) 112.
The computerized monitoring system 114 may receive data from 3D motion sensor 112 for a monitoring zone (i.e., the patient's room or area to be monitored). Computerized monitoring system 114 may assign reference points to identify the boundaries of the monitoring zone. For example, reference points may be assigned to a perimeter around the patient. It should be understood that the selection of the reference points may vary with the individual and/or the configuration of the location assignment system 100. Reference points may be configured automatically by the location assignment system 100, may be configured automatically by the location assignment system 100 subject to confirmation and/or modification by a system user, or may be configured manually by a system user.
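By way of illustration only, one way a system might derive a rectangular monitoring zone from assigned reference points is sketched below; the function name, the 2D geometry, and the margin parameter are assumptions, not requirements of this disclosure:

```python
def zone_from_reference_points(points, margin=0.5):
    """Return a padded bounding rectangle (x_min, y_min, x_max, y_max)
    enclosing reference points assigned around the patient.

    `points` is an iterable of (x, y) coordinates; `margin` pads the
    perimeter so the zone extends slightly beyond the outermost points.
    """
    xs = [x for x, _ in points]
    ys = [y for _, y in points]
    return (min(xs) - margin, min(ys) - margin,
            max(xs) + margin, max(ys) + margin)

# Example: reference points around a bed footprint.
print(zone_from_reference_points([(0, 0), (2, 0), (2, 1), (0, 1)]))
# -> (-0.5, -0.5, 2.5, 1.5)
```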
If 3D motion sensor 112 begins to transmit data, computerized monitoring system 114 may, at step 118, assess whether a location beacon 110 on the 3D motion sensor 112 is broadcasting a known location identification. For example, a signal may be received by computerized monitoring system 114 from the location beacon 110 on the 3D motion sensor 112. If no signal is received, the computerized monitoring system 114 may continue to wait for a signal from a location beacon on the 3D motion sensor 112 (or utilize other methods to determine the location identification, as described in more detail below with respect to FIGS. 2 and 3).
If, on the other hand, the location beacon 110 on the 3D motion sensor 112 is broadcasting a known location identification, computerized monitoring system 114 assigns the 3D motion sensor 112 to the location corresponding to the known location identification. The location assignment may be stored in location database 116. Regardless of whether a signal is received or includes a known location identification, images received from the 3D motion sensor 112 may additionally be communicated to central video monitoring system 126.
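A minimal sketch of this step-118 logic follows, assuming the beacon payload is a small mapping and that known identifications are kept in a lookup table; all names here are hypothetical:

```python
KNOWN_LOCATIONS = {"A1B2": "Room 302, Bed 1"}  # hypothetical beacon IDs

def assign_from_beacon(sensor_id, beacon_signal, location_db):
    """Assign a sensor to a location when its beacon broadcasts a known
    location identification; otherwise signal the caller to keep waiting."""
    location_id = beacon_signal.get("location_id") if beacon_signal else None
    if location_id not in KNOWN_LOCATIONS:
        return None  # no signal, or unknown identification: keep waiting
    location_db[sensor_id] = KNOWN_LOCATIONS[location_id]  # e.g., database 116
    return location_db[sensor_id]
```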
When the 3D motion sensor 112 has been assigned to the location corresponding to the known location identification, the assignment may be recorded in an EMR 122 by an EMR system 120. If patient and care team information has been saved for the location, at step 124, the location assignment and the patient and care team information are communicated to central video monitoring system 126.
If patient and care team information has not been saved for the location, at step 124, patient and care team information that corresponds to the location may be retrieved at step 128. This information may be communicated to computerized monitoring system 114 and communicated to central video monitoring system 126 along with the location assignment.
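The saved-versus-retrieved branching at steps 124 and 128 might look like the following sketch, where `emr_client.lookup` stands in for whatever interface a facility's EMR system exposes (an assumption, not a disclosed API):

```python
def patient_info_for_location(location, saved_info, emr_client):
    """Return patient and care team information for a location, using
    saved information when present (step 124) and retrieving it from
    the EMR system otherwise (step 128)."""
    if location in saved_info:
        return saved_info[location]
    info = emr_client.lookup(location)  # hypothetical EMR interface
    saved_info[location] = info         # save for subsequent requests
    return info
```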
On receiving an image and/or patient and care team information, the central video monitoring system 126, or an attendant there, may receive an alert that provides a live image, video, and/or audio feed from the 3D motion sensor 112. One or more caregiver(s) local to the patient can be alerted with or even before central video monitoring system 126, so that the caregiver(s) can assess what is happening in person. The priority and timing of alerts to different individuals or stations can be configured in accordance with the needs and desires of a particular facility, experience with a particular monitored individual or type of patient, or any other criterion of the system owner or user. This is true for initial alerts as well as continuing alerts (e.g., if 3D motion sensor 112 is detected in, but has not been assigned to, a location) or repeated alerts (two or more distinct events where 3D motion sensor 112 is detected and not assigned to a location). The priority and timing of alerts to different individuals may be different for initial, continuing, and/or repeated alerts.
Data associated with alerts may be logged by computerized monitoring system 114 and/or central video monitoring system 126 in a database 116. Data associated with an alert may include, without limitation, the telemetry data from 3D motion sensor 112 that triggered the alert; buffered data preceding the telemetry data that triggered the alert; telemetry data subsequent to the alert; the number and substantive content of an alert; the individual(s) and/or groups to whom an alert was addressed; the response, if any, received or observed following an alert; and combinations thereof.
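One possible shape for such a logged alert record is sketched below; the field names are illustrative only and track the enumeration above:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class AlertRecord:
    """Illustrative record of data associated with a single alert."""
    triggering_telemetry: bytes          # telemetry that triggered the alert
    buffered_telemetry: bytes            # buffered data preceding the trigger
    subsequent_telemetry: bytes          # telemetry following the alert
    content: str                         # substantive content of the alert
    recipients: List[str] = field(default_factory=list)  # addressees
    response: Optional[str] = None       # response, if any, observed
```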
Referring now to FIG. 2, a location assignment system 200 may comprise a visual location identifier tag 210 and one or more 3D motion sensors 212.
The 3D motion sensor 212 may communicate data, such as images of the patient room being monitored, to a computerized monitoring system 214. The computerized monitoring system 214 is a computer programmed to monitor transmissions of data from the 3D motion sensor 212. The computerized monitoring system 214 may be integral to the 3D motion sensor 212 or a distinctly separate apparatus from the 3D motion sensor 212, possibly in a location remote from the 3D motion sensor 212, provided that the computerized monitoring system 214 can receive data from the 3D motion sensor 212. The computerized monitoring system 214 may be located in the monitored patient room, such as a hospital room, bedroom, or living room. The computerized monitoring system 214 may be connected to a central video monitoring system 226. The computerized monitoring system 214 and central video monitoring system 226 may be remotely located at any physical locations so long as a data connection exists (USB, TCP/IP, or comparable) between the computerized monitoring system 214, the central video monitoring system 226, and the 3D motion sensor(s) 212.
The computerized monitoring system 214 may receive data from 3D motion sensor 212 for a monitoring zone (i.e., the patient's room or area to be monitored). Computerized monitoring system 214 may assign reference points to identify the boundaries of the monitoring zone. For example, reference points may be assigned to a perimeter around the patient. It should be understood that the selection of the reference points may vary with the individual and/or the configuration of the location assignment system 200. Reference points may be configured automatically by the location assignment system 200, may be configured automatically by the location assignment system 200 subject to confirmation and/or modification by a system user, or may be configured manually by a system user.
If 3D motion sensor 212 begins to transmit data in a location, computerized monitoring system 214 may, at step 218, assess whether a known location identification is visible on a visual location identifier tag 210 in the location. For example, computerized monitoring system 214 may receive data from 3D motion sensor 212 that includes a visual location identifier tag. If no such data is received, the computerized monitoring system 214 may continue to wait for a known location identification to be visible on a visual location identifier tag 210 in the location (or utilize other methods to determine the location identification, as described herein with respect to FIGS. 1 and 3).
If, on the other hand, a known location identification is visible on a visual location identifier tag 210 in the location, computerized monitoring system 214 assigns the 3D motion sensor 212 to the location corresponding to the known location identification. The location assignment may be stored in location database 216. Regardless of whether a visual location identifier tag 210 is detected by the 3D motion sensor 212 or includes a known location identification, images received from the 3D motion sensor 212 may additionally be communicated to central video monitoring system 226.
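The disclosure does not prescribe an encoding for the visual location identifier tag; purely as an illustration, if the tag were a QR code, step 218 could be sketched with OpenCV's QR detector as follows:

```python
import cv2  # OpenCV; assumes a QR-encoded tag, which is not required here

def location_id_from_image(image):
    """Attempt to read a location identification from a visual location
    identifier tag captured in a sensor image. Returns None when no tag
    is decodable, mirroring the 'continue to wait' branch above."""
    detector = cv2.QRCodeDetector()
    data, points, _ = detector.detectAndDecode(image)
    return data or None  # detectAndDecode returns "" when nothing is found
```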
When the 3D motion sensor 212 has been assigned to the location corresponding to the known location identification, the assignment may be recorded in an EMR 222 by an EMR system 220. If patient and care team information has been saved for the location, at step 224, the location assignment and the patient and care team information are communicated to central video monitoring system 226.
If patient and care team information has not been saved for the location, at step 224, patient and care team information that corresponds to the location may be retrieved at step 228. This information may be communicated to computerized monitoring system 214 and communicated to central video monitoring system 226 along with the location assignment.
On receiving the location assignment and/or patient and care team information, the central video monitoring system 226, or an attendant there, may receive an alert that provides a live image, video, and/or audio feed from the 3D motion sensor 212. One or more caregiver(s) local to the patient can be alerted with or even before central video monitoring system 226, so that the caregiver(s) can assess what is happening in person. The priority and timing of alerts to different individuals or stations can be configured in accordance with the needs and desires of a particular facility, experience with a particular monitored individual or type of patient, or any other criterion of the system owner or user. This is true for initial alerts as well as continuing alerts (e.g., if a 3D motion sensor 212 is detected in, but has not been assigned to, a location) or repeated alerts (two or more distinct events where a 3D motion sensor 212 is detected and not assigned to a location). The priority and timing of alerts to different individuals may be different for initial, continuing, and/or repeated alerts.
Data associated with alerts may be logged by computerized monitoring system 214 and/or central video monitoring system 226 in a database 216. Data associated with an alert may include, without limitation, the telemetry data from 3D motion sensor 212 that triggered the alert; buffered data preceding the telemetry data that triggered the alert; telemetry data subsequent to the alert; the number and substantive content of an alert; the individual(s) and/or groups to whom an alert was addressed; the response, if any, received or observed following an alert; and combinations thereof.
As shown in FIG. 3, a location assignment system 300 may comprise one or more 3D motion sensors 312.
The 3D motion sensor 312 may communicate data, such as images of the patient room being monitored, to a computerized monitoring system 314. The computerized monitoring system 314 is a computer programmed to monitor transmissions of data from the 3D motion sensor 312. The computerized monitoring system 314 may be integral to the 3D motion sensor 312 or a distinctly separate apparatus from the 3D motion sensor 312, possibly in a location remote from the 3D motion sensor 312, provided that the computerized monitoring system 314 can receive data from the 3D motion sensor 312. The computerized monitoring system 314 may be located in the monitored patient room, such as a hospital room, bedroom, or living room. The computerized monitoring system 314 may be connected to a central video monitoring system 326. The computerized monitoring system 314 and central video monitoring system 326 may be remotely located at any physical locations so long as a data connection exists (USB, TCP/IP, or comparable) between the computerized monitoring system 314, the central video monitoring system 326, and the 3D motion sensor(s) 312.
The computerized monitoring system 314 may receive data from 3D motion sensor 312 for a monitoring zone (i.e., the patient's room or area to be monitored). Computerized monitoring system 314 may assign reference points to identify the boundaries of the monitoring zone. For example, reference points may be assigned to a perimeter around the patient. It should be understood that the selection of the reference points may vary with the individual and/or the configuration of the location assignment system 300. Reference points may be configured automatically by the location assignment system 300, may be configured automatically by the location assignment system 300 subject to confirmation and/or modification by a system user, or may be configured manually by a system user.
If a 3D motion sensor 312 begins transmitting, computerized monitoring system 314 may, at step 318, assess whether a location identification has been selected for the 3D motion sensor. For example, the location identification may be selected by a user and received by computerized monitoring system 314, indicating the location of the 3D motion sensor 312. If the location identification has not been selected by a user and received by computerized monitoring system 314, the computerized monitoring system 314 may continue to wait for the location identification of the 3D motion sensor 312 (or utilize other methods to determine the location identification, as described in more detail above with respect to FIGS. 1 and 2).
If, on the other hand, a location identification has been selected by a user and is received by computerized monitoring system 314, computerized monitoring system 314 assigns the 3D motion sensor 312 to the location corresponding to the location identification. The location assignment may be stored in location database 316. Regardless of whether a location identification is received, images received from the 3D motion sensor 312 may additionally be communicated to central video monitoring system 326.
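Taken together, the flows of FIGS. 1-3 provide three independent sources for a location identification. Purely as a sketch, and without mandating any particular ordering, they might be tried in sequence as follows (function names hypothetical; `location_id_from_image` refers to the tag-decoding sketch above):

```python
def resolve_location_id(beacon_signal=None, tag_image=None, user_input=None):
    """Resolve a location identification from a beacon broadcast (FIG. 1),
    a visual location identifier tag (FIG. 2), or a user selection (FIG. 3).
    Returns None when no source yields an identification (keep waiting)."""
    if beacon_signal and beacon_signal.get("location_id"):
        return beacon_signal["location_id"]
    if tag_image is not None:
        tag_id = location_id_from_image(tag_image)  # see sketch above
        if tag_id:
            return tag_id
    return user_input  # may be None: continue waiting for a selection
```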
When the 3D motion sensor 312 has been assigned to the location corresponding to the location identification, the assignment may be recorded in an EMR 322 by an EMR system 320. If patient and care team information has been saved for the location, at step 324, the location assignment and the patient and care team information are communicated to central video monitoring system 326.
If patient and care team information has not been saved for the location, at step 324, patient and care team information that corresponds to the location may be retrieved at step 328. This information may be communicated to computerized monitoring system 314 and communicated to central video monitoring system 326 along with the location assignment.
On receiving the location assignment and/or patient and care team information, the central video monitoring system 326, or an attendant there, may receive an alert that provides a live image, video, and/or audio feed from the 3D motion sensor 312. One or more caregiver(s) local to the patient can be alerted with or even before central video monitoring system 326, so that the caregiver(s) can assess what is happening in person. The priority and timing of alerts to different individuals or stations can be configured in accordance with the needs and desires of a particular facility, experience with a particular monitored individual or type of patient, or any other criterion of the system owner or user. This is true for initial alerts as well as continuing alerts (e.g., if a 3D motion sensor 312 is detected in, but has not been assigned to, a location) or repeated alerts (two or more distinct events where a 3D motion sensor 312 is detected and not assigned to a location). The priority and timing of alerts to different individuals may be different for initial, continuing, and/or repeated alerts.
Data associated with alerts may be logged by computerized monitoring system 314 and/or central video monitoring system 326 in a database 316. Data associated with an alert may include, without limitation, the telemetry data from 3D motion sensor 312 that triggered the alert; buffered data preceding the telemetry data that triggered the alert; telemetry data subsequent to the alert; the number and substantive content of an alert; the individual(s) and/or groups to whom an alert was addressed; the response, if any, received or observed following an alert; and combinations thereof.
The various computerized systems and processors as described herein may include, individually or collectively, and without limitation, a processing unit, internal system memory, and a suitable system bus for coupling various system components, including database 116, with a control server. Computerized monitoring system 114 and/or central video monitoring system 126 may provide control server structure and/or function. The system bus may be any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, and a local bus, using any of a variety of bus architectures. By way of example, and not limitation, such architectures include Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus.
The computerized systems typically include therein, or have access to, a variety of computer-readable media, for instance, database 116. Computer-readable media can be any available media that may be accessed by the computerized system, and includes volatile and nonvolatile media, as well as removable and non-removable media. By way of example, and not limitation, computer-readable media may include computer-storage media and communication media. Computer-readable storage media may include, without limitation, volatile and nonvolatile media, as well as removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, program modules, or other data. In this regard, computer-storage media may include, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVDs) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage, or other magnetic storage device, or any other medium which can be used to store the desired information and which may be accessed by the control server. Computer-readable storage media excludes signals per se.
Communication media typically embodies computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism, and may include any information delivery media. As used herein, the term “modulated data signal” refers to a signal that has one or more of its attributes set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared, and other wireless media. Combinations of any of the above also may be included within the scope of computer-readable media. The computer-readable storage media discussed above, including database 116, provide storage of computer readable instructions, data structures, program modules, and other data for the computerized systems. Computer readable instructions embodied on computer-readable storage media may be accessible by location assignment system 100 and/or component(s) thereof, and, when executed by a computer processor and/or server, may cause the system to function and/or perform the methods described herein.
The computerized systems may operate in a computer network using logical connections to one or more remote computers. Remote computers may be located at a variety of locations, for example, but not limited to, hospitals and other inpatient settings, veterinary environments, ambulatory settings, medical billing and financial offices, hospital administration settings, home health care environments, payer offices (e.g., insurance companies), home health care agencies, clinicians' offices and the clinician's home or the patient's own home or over the Internet. Clinicians may include, but are not limited to, a treating physician or physicians, specialists such as surgeons, radiologists, cardiologists, and oncologists, emergency medical technicians, physicians' assistants, nurse practitioners, nurses, nurses' aides, pharmacists, dieticians, microbiologists, laboratory experts, laboratory technologists, genetic counselors, researchers, veterinarians, students, and the like. The remote computers may also be physically located in non-traditional medical care environments so that the entire health care community may be capable of integration on the network. The remote computers may be personal computers, servers, routers, network PCs, peer devices, other common network nodes, or the like, and may include some or all of the elements described above in relation to the control server. The devices can be personal digital assistants or other like devices.
Exemplary computer networks may include, without limitation, local area networks (LANs) and/or wide area networks (WANs). Such networking environments are commonplace in offices, enterprise-wide computer networks, intranets, and the Internet. When utilized in a WAN networking environment, the control server may include a modem or other means for establishing communications over the WAN, such as the Internet. In a networked environment, program modules or portions thereof may be stored in the control server, in the database 116, or on any of the remote computers. For example, and not by way of limitation, various application programs may reside on the memory associated with any one or more of the remote computers. It will be appreciated by those of ordinary skill in the art that the network connections shown are exemplary and other means of establishing a communications link between the computers may be utilized.
In operation, a user may enter commands and information into the computerized system(s) using input devices, such as a keyboard, a pointing device (commonly referred to as a mouse), a trackball, a touch pad, a 3D gesture recognition camera, or a motion sensor. Other input devices may include, without limitation, microphones, satellite dishes, scanners, or the like. In addition to or in lieu of a monitor, the computerized systems may include other peripheral output devices, such as speakers and a printer.
Many other internal components of the computerized system hardware are not shown because such components and their interconnection are well known. Accordingly, additional details concerning the internal construction of the computers that make up the computerized systems are not further disclosed herein.
Methods and systems of embodiments of the present disclosure may be implemented in a WINDOWS or LINUX operating system, operating in conjunction with an Internet-based delivery system; however, one of ordinary skill in the art will recognize that the described methods and systems can be implemented in any operating system suitable for supporting the disclosed processing and communications. As contemplated by the language above, the methods and systems of embodiments of the present disclosure may also be implemented on a stand-alone desktop, personal computer, cellular phone, smart phone, tablet computer, PDA, or any other computing device used in a healthcare environment or any of a number of other locations.
From the foregoing, it will be seen that this disclosure is well adapted to attain all the ends and objects hereinabove set forth together with other advantages which are obvious and which are inherent to the structure.
It will be understood that certain features and subcombinations are of utility and may be employed without reference to other features and subcombinations. This is contemplated by and is within the scope of the claims.
Since many possible embodiments may be made of the invention without departing from the scope thereof, it is to be understood that all matter herein set forth or shown in the accompanying drawings is to be interpreted as illustrative and not in a limiting sense.
This application is a continuation of U.S. patent application Ser. No. 15/395,243, entitled “Methods and Systems for Assigning Locations to Devices,” filed Dec. 30, 2016, which claims the benefit of U.S. Provisional Application Ser. No. 62/273,735, entitled “Methods and Systems for Detecting Stroke Symptoms,” filed Dec. 31, 2015, herein incorporated by reference in its entirety, and is related to commonly assigned U.S. patent application Ser. No. 15/395,250, entitled “Methods and Systems for Detecting Prohibited Objects in a Patient Room,” filed Dec. 30, 2016, and U.S. patent application Ser. No. 15/395,526, entitled “Detecting Unauthorized Visitors,” filed Dec. 30, 2016.
Number | Name | Date | Kind |
---|---|---|---|
4669263 | Sugiyama | Jun 1987 | A |
4857716 | Gombrich et al. | Aug 1989 | A |
5031228 | Lu | Jul 1991 | A |
5276432 | Travis | Jan 1994 | A |
5448221 | Weller | Sep 1995 | A |
5482050 | Smokoff et al. | Jan 1996 | A |
5592153 | Welling et al. | Jan 1997 | A |
5798798 | Rector et al. | Aug 1998 | A |
5838223 | Gallant et al. | Nov 1998 | A |
5915379 | Wallace et al. | Jun 1999 | A |
5942986 | Shabot et al. | Aug 1999 | A |
6050940 | Braun et al. | Apr 2000 | A |
6095984 | Amano et al. | Aug 2000 | A |
6160478 | Jacobsen et al. | Dec 2000 | A |
6174283 | Nevo et al. | Jan 2001 | B1 |
6188407 | Smith et al. | Feb 2001 | B1 |
6269812 | Wallace et al. | Aug 2001 | B1 |
6287452 | Allen et al. | Sep 2001 | B1 |
6322502 | Schoenberg et al. | Nov 2001 | B1 |
6369838 | Wallace et al. | Apr 2002 | B1 |
6429869 | Kamakura et al. | Aug 2002 | B1 |
6614349 | Proctor et al. | Sep 2003 | B1 |
6727818 | Wildman et al. | Apr 2004 | B1 |
6804656 | Rosenfeld et al. | Oct 2004 | B1 |
7015816 | Wildman et al. | Mar 2006 | B2 |
7122005 | Shusterman | Oct 2006 | B2 |
7154397 | Zerhusen et al. | Dec 2006 | B2 |
7237287 | Weismiller et al. | Jul 2007 | B2 |
7323991 | Eckert et al. | Jan 2008 | B1 |
7408470 | Wildman et al. | Aug 2008 | B2 |
7420472 | Tran | Sep 2008 | B2 |
7430608 | Noonan et al. | Sep 2008 | B2 |
7502498 | Wen et al. | Mar 2009 | B2 |
7612679 | Fackler et al. | Nov 2009 | B1 |
7669263 | Menkedick et al. | Mar 2010 | B2 |
7715387 | Schuman | May 2010 | B2 |
7724147 | Brown | May 2010 | B2 |
7756723 | Rosow et al. | Jul 2010 | B2 |
7890349 | Cole et al. | Feb 2011 | B2 |
7893842 | Deutsch | Feb 2011 | B2 |
7895055 | Schneider et al. | Feb 2011 | B2 |
7908153 | Scherpbier et al. | Mar 2011 | B2 |
7945457 | Zaleski | May 2011 | B2 |
7962544 | Torok et al. | Jun 2011 | B2 |
7972140 | Renaud | Jul 2011 | B2 |
8108036 | Tran | Jan 2012 | B2 |
8123685 | Brauers et al. | Feb 2012 | B2 |
8128596 | Carter | Mar 2012 | B2 |
8190447 | Hungerford et al. | May 2012 | B2 |
8224108 | Steinberg et al. | Jul 2012 | B2 |
8237558 | Seyed et al. | Aug 2012 | B2 |
8273018 | Fackler et al. | Sep 2012 | B1 |
8432263 | Kunz | Apr 2013 | B2 |
8451314 | Cline et al. | May 2013 | B1 |
8529448 | Mcnair | Sep 2013 | B2 |
8565500 | Neff | Oct 2013 | B2 |
8620682 | Bechtel et al. | Dec 2013 | B2 |
8655680 | Bechtel et al. | Feb 2014 | B2 |
8700423 | Eaton et al. | Apr 2014 | B2 |
8727981 | Bechtel et al. | May 2014 | B2 |
8769153 | Dziubinski | Jul 2014 | B2 |
8890937 | Skubic et al. | Nov 2014 | B2 |
8902068 | Bechtel et al. | Dec 2014 | B2 |
8917186 | Grant | Dec 2014 | B1 |
8953886 | King et al. | Feb 2015 | B2 |
9072929 | Rush et al. | Jul 2015 | B1 |
9129506 | Kusens | Sep 2015 | B1 |
9147334 | Long et al. | Sep 2015 | B2 |
9159215 | Kusens | Oct 2015 | B1 |
9269012 | Fotland | Feb 2016 | B2 |
9292089 | Sadek | Mar 2016 | B1 |
9305191 | Long et al. | Apr 2016 | B2 |
9408561 | Stone et al. | Aug 2016 | B2 |
9424699 | Kusens et al. | Aug 2016 | B2 |
9466163 | Kusens et al. | Oct 2016 | B2 |
9489820 | Kusens | Nov 2016 | B1 |
9519969 | Kusens | Dec 2016 | B1 |
9524443 | Kusens | Dec 2016 | B1 |
9536310 | Kusens | Jan 2017 | B1 |
9538158 | Rush et al. | Jan 2017 | B1 |
9563955 | Kamarshi et al. | Feb 2017 | B1 |
9597016 | Stone et al. | Mar 2017 | B2 |
9691206 | Kusens et al. | Jun 2017 | B2 |
9729833 | Kusens | Aug 2017 | B1 |
9741227 | Kusens | Aug 2017 | B1 |
9774991 | Kusens | Sep 2017 | B2 |
9838849 | Kusens | Dec 2017 | B2 |
9858741 | Kusens et al. | Jan 2018 | B2 |
9892310 | Kusens et al. | Feb 2018 | B2 |
9892311 | Kusens et al. | Feb 2018 | B2 |
9892611 | Kusens | Feb 2018 | B1 |
9905113 | Kusens | Feb 2018 | B2 |
9934427 | Derenne et al. | Apr 2018 | B2 |
9984521 | Kusens et al. | May 2018 | B1 |
9997001 | Kusens et al. | Jun 2018 | B2 |
9998857 | Kusens | Jun 2018 | B2 |
10013831 | Kusens et al. | Jul 2018 | B1 |
10068116 | Good | Sep 2018 | B2 |
10078956 | Kusens | Sep 2018 | B1 |
10090068 | Kusens et al. | Oct 2018 | B2 |
10091463 | Kusens | Oct 2018 | B1 |
10096223 | Kusens | Oct 2018 | B1 |
10109179 | Kusens | Oct 2018 | B2 |
10115253 | Kusens et al. | Oct 2018 | B2 |
10115254 | Kusens et al. | Oct 2018 | B1 |
10121299 | Kusens et al. | Nov 2018 | B2 |
10210378 | Kusens et al. | Feb 2019 | B2 |
10225522 | Kusens | Mar 2019 | B1 |
10342478 | Kusens | Jul 2019 | B2 |
10524722 | Kusens et al. | Jan 2020 | B2 |
10643061 | Kusens et al. | May 2020 | B2 |
10922936 | Kusens et al. | Feb 2021 | B2 |
10922946 | Kusens et al. | Feb 2021 | B2 |
20020015034 | Malmborg | Feb 2002 | A1 |
20020038073 | August | Mar 2002 | A1 |
20020077863 | Rutledge et al. | Jun 2002 | A1 |
20020101349 | Rojas | Aug 2002 | A1 |
20020115905 | August | Aug 2002 | A1 |
20020183976 | Pearce | Dec 2002 | A1 |
20030037786 | Biondi et al. | Feb 2003 | A1 |
20030070177 | Kondo et al. | Apr 2003 | A1 |
20030092974 | Santoso et al. | May 2003 | A1 |
20030095147 | Daw | May 2003 | A1 |
20030135390 | Obrien et al. | Jul 2003 | A1 |
20030140928 | Bui et al. | Jul 2003 | A1 |
20030227386 | Pulkkinen et al. | Dec 2003 | A1 |
20040019900 | Knightbridge et al. | Jan 2004 | A1 |
20040052418 | Delean | Mar 2004 | A1 |
20040054760 | Ewing et al. | Mar 2004 | A1 |
20040097227 | Siegel | May 2004 | A1 |
20040116804 | Mostafavi | Jun 2004 | A1 |
20040193449 | Wildman et al. | Sep 2004 | A1 |
20050038326 | Mathur | Feb 2005 | A1 |
20050182305 | Hendrich | Aug 2005 | A1 |
20050231341 | Shimizu | Oct 2005 | A1 |
20050249139 | Nesbit | Nov 2005 | A1 |
20060004606 | Wendl et al. | Jan 2006 | A1 |
20060047538 | Condurso et al. | Mar 2006 | A1 |
20060049936 | Collins et al. | Mar 2006 | A1 |
20060058587 | Heimbrock et al. | Mar 2006 | A1 |
20060089541 | Braun et al. | Apr 2006 | A1 |
20060092043 | Lagassey | May 2006 | A1 |
20060107295 | Margis et al. | May 2006 | A1 |
20060145874 | Fredriksson et al. | Jul 2006 | A1 |
20060261974 | Albert et al. | Nov 2006 | A1 |
20070033072 | Bildirici | Feb 2007 | A1 |
20070083445 | Garcia et al. | Apr 2007 | A1 |
20070085690 | Tran | Apr 2007 | A1 |
20070118054 | Pinhas et al. | May 2007 | A1 |
20070120689 | Zerhusen et al. | May 2007 | A1 |
20070129983 | Scherpbier et al. | Jun 2007 | A1 |
20070136102 | Rodgers | Jun 2007 | A1 |
20070136218 | Bauer et al. | Jun 2007 | A1 |
20070159332 | Koblasz | Jul 2007 | A1 |
20070279219 | Warriner | Dec 2007 | A1 |
20070296600 | Dixon et al. | Dec 2007 | A1 |
20080001735 | Tran | Jan 2008 | A1 |
20080001763 | Raja et al. | Jan 2008 | A1 |
20080002860 | Super et al. | Jan 2008 | A1 |
20080004904 | Tran | Jan 2008 | A1 |
20080009686 | Hendrich | Jan 2008 | A1 |
20080015903 | Rodgers | Jan 2008 | A1 |
20080021731 | Rodgers | Jan 2008 | A1 |
20080071210 | Moubayed et al. | Mar 2008 | A1 |
20080087719 | Sahud | Apr 2008 | A1 |
20080106374 | Sharbaugh | May 2008 | A1 |
20080126132 | Warner et al. | May 2008 | A1 |
20080228045 | Gao et al. | Sep 2008 | A1 |
20080249376 | Zaleski | Oct 2008 | A1 |
20080267447 | Kelusky et al. | Oct 2008 | A1 |
20080277486 | Seem et al. | Nov 2008 | A1 |
20080281638 | Weatherly et al. | Nov 2008 | A1 |
20090082829 | Panken et al. | Mar 2009 | A1 |
20090091458 | Deutsch | Apr 2009 | A1 |
20090099480 | Salgo et al. | Apr 2009 | A1 |
20090112630 | Collins et al. | Apr 2009 | A1 |
20090119843 | Rodgers et al. | May 2009 | A1 |
20090177327 | Turner et al. | Jul 2009 | A1 |
20090224924 | Thorp | Sep 2009 | A1 |
20090278934 | Ecker et al. | Nov 2009 | A1 |
20090322513 | Hwang et al. | Dec 2009 | A1 |
20090326340 | Wang et al. | Dec 2009 | A1 |
20100117836 | Seyed et al. | May 2010 | A1 |
20100169114 | Henderson et al. | Jul 2010 | A1 |
20100169120 | Herbst et al. | Jul 2010 | A1 |
20100172567 | Prokoski | Jul 2010 | A1 |
20100176952 | Bajcsy et al. | Jul 2010 | A1 |
20100188228 | Hyland | Jul 2010 | A1 |
20100205771 | Pietryga et al. | Aug 2010 | A1 |
20100245577 | Yamamoto et al. | Sep 2010 | A1 |
20100285771 | Peabody | Nov 2010 | A1 |
20100305466 | Corn | Dec 2010 | A1 |
20110018709 | Kombluh | Jan 2011 | A1 |
20110022981 | Mahajan et al. | Jan 2011 | A1 |
20110025493 | Papadopoulos et al. | Feb 2011 | A1 |
20110025499 | Hoy et al. | Feb 2011 | A1 |
20110035057 | Receveur et al. | Feb 2011 | A1 |
20110035466 | Panigrahi | Feb 2011 | A1 |
20110050411 | Schuman | Mar 2011 | A1 |
20110054936 | Cowan et al. | Mar 2011 | A1 |
20110068930 | Wildman et al. | Mar 2011 | A1 |
20110077965 | Nolte et al. | Mar 2011 | A1 |
20110087079 | Aarts | Apr 2011 | A1 |
20110087125 | Causevic | Apr 2011 | A1 |
20110102133 | Shaffer | May 2011 | A1 |
20110102181 | Metz et al. | May 2011 | A1 |
20110106560 | Eaton et al. | May 2011 | A1 |
20110106561 | Eaton et al. | May 2011 | A1 |
20110175809 | Markovic et al. | Jul 2011 | A1 |
20110190593 | Mcnair | Aug 2011 | A1 |
20110227740 | Wohltjen | Sep 2011 | A1 |
20110245707 | Castle et al. | Oct 2011 | A1 |
20110254682 | Sigrist | Oct 2011 | A1 |
20110288811 | Greene | Nov 2011 | A1 |
20110295621 | Farooq et al. | Dec 2011 | A1 |
20110301440 | Riley et al. | Dec 2011 | A1 |
20110313325 | Cuddihy | Dec 2011 | A1 |
20120016295 | Tsoukalis | Jan 2012 | A1 |
20120025991 | Okeefe et al. | Feb 2012 | A1 |
20120026308 | Johnson et al. | Feb 2012 | A1 |
20120075464 | Derenne et al. | Mar 2012 | A1 |
20120092162 | Rosenberg | Apr 2012 | A1 |
20120098918 | Murphy | Apr 2012 | A1 |
20120140068 | Monroe | Jun 2012 | A1 |
20120154582 | Johnson | Jun 2012 | A1 |
20120212582 | Deutsch | Aug 2012 | A1 |
20120259650 | Mallon et al. | Oct 2012 | A1 |
20120314901 | Hanson et al. | Dec 2012 | A1 |
20120323090 | Bechtel et al. | Dec 2012 | A1 |
20120323591 | Bechtel et al. | Dec 2012 | A1 |
20120323592 | Bechtel et al. | Dec 2012 | A1 |
20130027199 | Bonner | Jan 2013 | A1 |
20130028570 | Suematsu et al. | Jan 2013 | A1 |
20130120120 | Long et al. | May 2013 | A1 |
20130122807 | Fenarvitz et al. | May 2013 | A1 |
20130127620 | Siebers et al. | May 2013 | A1 |
20130184592 | Venetianer et al. | Jul 2013 | A1 |
20130265482 | Funamoto | Oct 2013 | A1 |
20130309128 | Voegeli et al. | Nov 2013 | A1 |
20130332184 | Burnham et al. | Dec 2013 | A1 |
20140039351 | Mix et al. | Feb 2014 | A1 |
20140070950 | Snodgrass | Mar 2014 | A1 |
20140081654 | Bechtel et al. | Mar 2014 | A1 |
20140085501 | Tran | Mar 2014 | A1 |
20140086450 | Huang et al. | Mar 2014 | A1 |
20140108041 | Bechtel et al. | Apr 2014 | A1 |
20140155755 | Pinter et al. | Jun 2014 | A1 |
20140168397 | Greco | Jun 2014 | A1 |
20140191861 | Scherrer | Jul 2014 | A1 |
20140191946 | Cho et al. | Jul 2014 | A1 |
20140213845 | Bechtel et al. | Jul 2014 | A1 |
20140267625 | Clark et al. | Sep 2014 | A1 |
20140267736 | Delean | Sep 2014 | A1 |
20140309789 | Ricci | Oct 2014 | A1 |
20140327545 | Bolling et al. | Nov 2014 | A1 |
20140328512 | Gurwicz et al. | Nov 2014 | A1 |
20140333744 | Baym et al. | Nov 2014 | A1 |
20140333776 | Dedeoglu et al. | Nov 2014 | A1 |
20140354436 | Nix et al. | Dec 2014 | A1 |
20140365242 | Neff | Dec 2014 | A1 |
20150057635 | Bechtel et al. | Feb 2015 | A1 |
20150061891 | Oleson et al. | Mar 2015 | A1 |
20150109442 | Derenne | Apr 2015 | A1 |
20150206415 | Wegelin et al. | Jul 2015 | A1 |
20150269318 | Neff | Sep 2015 | A1 |
20150278456 | Bermudez Rodriguez et al. | Oct 2015 | A1 |
20150294143 | Wells et al. | Oct 2015 | A1 |
20160022218 | Hayes et al. | Jan 2016 | A1 |
20160029160 | Theurer | Jan 2016 | A1 |
20160070869 | Portnoy | Mar 2016 | A1 |
20160093195 | Ophardt | Mar 2016 | A1 |
20160098676 | Kusens et al. | Apr 2016 | A1 |
20160127641 | Gove | May 2016 | A1 |
20160180668 | Kusens et al. | Jun 2016 | A1 |
20160183864 | Kusens et al. | Jun 2016 | A1 |
20160217347 | Mineo | Jul 2016 | A1 |
20160253802 | Venetianer et al. | Sep 2016 | A1 |
20160267327 | Franz et al. | Sep 2016 | A1 |
20160285416 | Tiwari | Sep 2016 | A1 |
20160314258 | Kusens | Oct 2016 | A1 |
20160324460 | Kusens | Nov 2016 | A1 |
20160360970 | Tzvieli et al. | Dec 2016 | A1 |
20170055917 | Stone et al. | Mar 2017 | A1 |
20170084158 | Kusens | Mar 2017 | A1 |
20170091562 | Kusens | Mar 2017 | A1 |
20170109991 | Kusens | Apr 2017 | A1 |
20170116473 | Sashida et al. | Apr 2017 | A1 |
20170143240 | Stone et al. | May 2017 | A1 |
20170163949 | Suzuki | Jun 2017 | A1 |
20170193177 | Kusens | Jul 2017 | A1 |
20170214902 | Braune | Jul 2017 | A1 |
20170289503 | Kusens | Oct 2017 | A1 |
20170337682 | Liao et al. | Nov 2017 | A1 |
20180018864 | Baker | Jan 2018 | A1 |
20180068545 | Kusens | Mar 2018 | A1 |
20180104409 | Bechtel et al. | Apr 2018 | A1 |
20180144605 | Kusens | May 2018 | A1 |
20180189946 | Kusens et al. | Jul 2018 | A1 |
20180190098 | Kusens | Jul 2018 | A1 |
20180357875 | Kusens | Dec 2018 | A1 |
20190006046 | Kusens et al. | Jan 2019 | A1 |
20190029528 | Tzvieli et al. | Jan 2019 | A1 |
20190043192 | Kusens et al. | Feb 2019 | A1 |
20190057592 | Kusens | Feb 2019 | A1 |
20190205630 | Kusens | Jul 2019 | A1 |
20190206218 | Kusens et al. | Jul 2019 | A1 |
20190209022 | Sobol et al. | Jul 2019 | A1 |
20190228866 | Weffers-Albu et al. | Jul 2019 | A1 |
20190261915 | Kusens | Aug 2019 | A1 |
20190307405 | Terry et al. | Oct 2019 | A1 |
20190318478 | Kusens et al. | Oct 2019 | A1 |
20200050844 | Kusens | Feb 2020 | A1 |
20210202052 | Bechtel et al. | Jul 2021 | A1 |
Number | Date | Country |
---|---|---|
19844918 | Apr 2000 | DE |
2007081629 | Jul 2007 | WO |
2009018422 | Feb 2009 | WO |
2012122002 | Sep 2012 | WO |
2016126845 | Aug 2016 | WO |
2017058991 | Apr 2017 | WO |
2017124056 | Jul 2017 | WO |
2018218286 | Dec 2018 | WO |
Entry |
---|
Notice of Allowance received for U.S. Appl. No. 16/654,502, dated Feb. 17, 2021, 9 pages. |
Non-Final Office action received for U.S. Appl. No. 16/410,745, dated May 21, 2021, 21 pages. |
Conaire et al., “Fusion of Infrared and Visible Spectrum Video for Indoor Surveillance”, WIAMIS, Apr. 2005, 4 pages. |
Mooney, Tom, “Rhode Island ER First to Test Google Glass on Medical Conditions”, EMS1, Available online at: <https://www.ems1.com/ems-products/technology/articles/1860487-Rhode-Island-ER-first-to-test-Google-Glass-on-medical-conditions/>, Mar. 10, 2014, 3 pages. |
Raheja et al., “Human Facial Expression Detection From Detected in Captured Image Using Back Propagation Neural Network”, International Journal of Computer Science and Information Technology (IJCSIT), vol. 2, No. 1, Feb. 2010, 9 pages. |
“Virtual Patient Observation: Centralize Monitoring of High-Risk Patients with Video”, CISCO, Cisco Video Surveillance Manager, 2013, pp. 1-6. |
Notice of Allowance received for U.S. Appl. No. 16/410,745, dated Jan. 4, 2022, 10 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/830,498, dated Sep. 22, 2021, 29 pages. |
Notice of Allowance received for U.S. Appl. No. 16/816,626, dated Sep. 30, 2021, 9 pages. |
Quan et al., “Facial Asymmetry Analysis Based on 3-D Dynamic Scans”, 2012 IEEE International Conference on Systems, Man, and Cybernetics; COEX, Seoul, Korea; DOI: 10.1109/ICSMC.2012.6378151, Oct. 14-17, 2012, pp. 2676-2681. |
Non-Final Office action received for U.S. Appl. No. 17/117,414, dated Jul. 27, 2021, 12 pages. |
Pre-interview First Office Action received for U.S. Appl. No. 16/731,274, dated Sep. 1, 2021, 12 pages. |
Non-Final Office Action received for U.S. Appl. No. 17/152,403, dated Mar. 15, 2022, 10 pages. |
Notice of Allowance received for U.S. Appl. No. 17/318,521, dated Aug. 31, 2022, 9 pages. |
Otanasap et al., “Pre-Impact Fall Detection System Using Dynamic Threshold and 3D Bounding Box”, SPIE.Digital Library, Proceedings vol. 10225, Eighth International Conference on Graphic and Image Processing (ICGIP 2016), Available online at: <https://doi.org/10.1117/12.2266822>, Feb. 8, 2017, pp. 1-6. |
Zarka et al., “Real-Time Human Motion Detection and Tracking”, IEEE, Available online at: <https://ieeexplore.IEEE.org/document/4530098>, 2008, pp. 1-6. |
Number | Date | Country | |
---|---|---|---|
20210068709 A1 | Mar 2021 | US |
Number | Date | Country | |
---|---|---|---|
62273735 | Dec 2015 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 15395243 | Dec 2016 | US |
Child | 17101639 | US |