The present disclosure relates to computerized methods and systems for detecting stroke symptoms.
A stroke occurs when a portion of the brain has insufficient blood flow. There are generally two kinds of strokes: hemorrhagic and ischemic. Hemorrhagic stroke occurs when a blood vessel in the brain ruptures or leaks. This can prevent other areas of the brain from receiving adequate blood flow, as well as create pressure and cause other injuries in the area of the rupture or leak. Ischemic stroke occurs when a blood vessel in the brain becomes at least partially blocked, preventing a full supply of blood from passing the blockage to reach other areas of the brain. Stroke may also refer to a transient ischemic attack (TIA), where blood flow to part of the brain is temporarily interrupted, but is restored without intervention. A TIA may be, but is not always, a precursor to a stroke. A person who has experienced a TIA is generally considered at higher risk for stroke than someone who has not experienced a TIA.
Although different people may experience stroke differently, symptoms for hemorrhagic and ischemic stroke, as well as TIA, are generally similar. Symptoms may include slurred, garbled, or nonsense speech and physical asymmetry. For example, a person experiencing a stroke or TIA may have difficulty holding an arm up in an extended position for even a short period of time—the stroke victim may need assistance to lift the arm, or may be unable to hold the arm in a raised position, even if the other arm can easily be raised and held in a raised position. As another example, a person experiencing a stroke or TIA may have a visually perceptible droop or sagging in the face, shoulders, or other body carriage. The droop or sagging may be pronounced, or the asymmetry may be most noticeable with movement, such as when the person smiles or speaks. Typically, weakness and/or drooping are one-sided, occurring predominantly or entirely on the right or left side of the body, depending on what part of the brain is affected by the stroke.
Stroke can be treated; however, currently available treatments require prompt action. Medication for ischemic stroke is best delivered within 4-6 hours of the onset of symptoms. Hemorrhagic stroke may require even faster intervention, depending on the severity and location of the rupture or leak. A TIA by definition does not require intervention; however, recognizing the occurrence of a TIA is important to allow for diagnostic and preventative care.
This brief summary is provided as a general overview of the more detailed disclosure which follows. It is not intended to identify key or essential elements of the disclosure, or to define the claim terms in isolation from the remainder of the disclosure, including the drawings.
This disclosure generally relates to systems and methods for detecting stroke symptoms. Generally, and without limitation, the method involves collecting a series of images of a person's face. The system identifies reference points on the person's face, for example, points along the cheeks, jowls, and/or brow. The system may superimpose an x-y plane over the reference points in a digital image. The system then monitors images of the face of the person over time. Using the x-y plane (and, optionally, a z-axis), the system compares the positions of the reference points over time. If an asymmetric change in the positions of the reference points is detected, the system generates an alert that the person may be experiencing a stroke.
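The following is a minimal, illustrative sketch of that comparison step, assuming a separate facial-tracking step has already extracted paired reference points from each image; the reference-point names, coordinates, and threshold below are hypothetical and are not part of the disclosure.

```python
# Illustrative sketch only: landmark extraction is assumed to be done elsewhere;
# reference-point names and the threshold value are hypothetical.
from typing import Dict, Iterable, Tuple

Point = Tuple[float, float]  # (x, y) in the superimposed plane

# Paired reference points, mirrored about the y-axis running down the face.
PAIRS = [("left_mouth_corner", "right_mouth_corner"),
         ("left_brow", "right_brow"),
         ("left_cheek", "right_cheek")]

def asymmetry_score(points: Dict[str, Point]) -> float:
    """Sum of vertical (y) offsets between mirrored left/right reference points."""
    return sum(abs(points[l][1] - points[r][1]) for l, r in PAIRS)

def detect_asymmetric_change(frames: Iterable[Dict[str, Point]],
                             baseline: Dict[str, Point],
                             threshold: float):
    """Yield a flag for each frame whose asymmetry grows past the baseline."""
    base = asymmetry_score(baseline)
    for points in frames:
        yield (asymmetry_score(points) - base) > threshold

# Example with synthetic coordinates: the right mouth corner sags over time.
baseline = {"left_mouth_corner": (-3.0, 0.0), "right_mouth_corner": (3.0, 0.0),
            "left_brow": (-2.0, 4.0), "right_brow": (2.0, 4.0),
            "left_cheek": (-2.5, 2.0), "right_cheek": (2.5, 2.0)}
sagging = dict(baseline, right_mouth_corner=(3.0, -1.2))
print(list(detect_asymmetric_change([baseline, sagging], baseline, threshold=1.0)))
# -> [False, True]
```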
In some aspects, this disclosure relates to a method for detecting stroke symptoms. The method may comprise receiving from a 3D motion sensor a series of two or more images of the face of a person. The method may comprise superimposing an x-y plane over a plurality of reference points related to anatomical features of the person. The method may comprise comparing, over time, positions of the plurality of reference points relative to the x-y plane. The method may comprise assessing a maintenance of symmetry of the plurality of reference points.
The method may comprise assessing symmetry about an x-axis, a y-axis, and a z-axis. The method may comprise identifying a minimum asymmetric change in the positions of the plurality of reference points. The method may comprise evaluating whether the minimum asymmetric change is maintained for a minimum period of time. The method may comprise alerting a designated recipient of an asymmetric change. The method may further comprise superimposing a z-axis over the plurality of reference points. The method may further comprise using the z-axis at least in part to assess the maintenance of symmetry of the plurality of reference points over time. The method may comprise communicating the series of images of the face of the person to a central monitoring station. The method may comprise displaying a series of images for each of a plurality of people being monitored on a primary display at the central monitoring station. The method may comprise alerting the central monitoring station if an asymmetric change in the positions of the plurality of reference points relative to the x-y plane is identified. The method may comprise displaying images of the person for whom the asymmetric change was identified on a central monitoring station alert display upon receiving an alert.
In some aspects, this disclosure relates to a system for detecting stroke symptoms. The system may comprise one or more 3D motion sensors. The one or more 3D motion sensors may be located to provide the one or more 3D motion sensors with a view of the face of a person to be monitored. The 3D motion sensors may be configured to collect a series of images of the face of the person. The system may comprise a computerized monitoring system. The computerized monitoring system may be communicatively coupled to the one or more 3D motion sensors. The computerized monitoring system may be configured to identify a plurality of reference points on the face of the person to be monitored. The computerized monitoring system may be configured to superimpose an x-y-z axis system over the plurality of reference points on the face of the person. The computerized monitoring system may be configured to monitor positions of the plurality of reference points on the face of the person on the x-y-z axis system. The stroke detection system may comprise a computerized communication system. The computerized communication system may be communicatively coupled to the computerized monitoring system. The computerized communication system may be configured to send an alert to one or more designated recipients if a minimum asymmetric change in the positions of the reference points relative to the x-y-z axis system is identified. The computerized communication system may be configured to send an alert if the minimum asymmetric change is maintained for a minimum period of time.
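As a hedged illustration of the "maintained for a minimum period of time" condition, the sketch below raises an alert only when the per-frame asymmetry flag persists for a configurable number of consecutive frames; the frame rate and window length are assumptions, not values taken from the disclosure.

```python
# Illustrative debounce: alert only if the asymmetric-change flag persists.
def persistent_alert(flags, min_frames: int = 30):
    """Return True once the per-frame asymmetry flag has been set for
    min_frames consecutive frames (e.g., roughly one second at 30 fps)."""
    run = 0
    for flag in flags:
        run = run + 1 if flag else 0
        if run >= min_frames:
            return True
    return False

# A transient flicker (2 frames) does not alert; a sustained change does.
print(persistent_alert([True, True] + [False] * 10, min_frames=5))  # False
print(persistent_alert([True] * 6, min_frames=5))                   # True
```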
The stroke detection system may further comprise a central monitoring station. The central monitoring station may be communicatively coupled to the computerized communication system. The central monitoring station may be configured to display at least a portion of the series of images of the person. The central monitoring station may comprise a primary display. The central monitoring station may comprise an alert display. The alert display may be a dedicated portion of the primary display. The alert display may be a separate display or series of displays from the primary display. If the computerized monitoring system detects an asymmetric change in the positions of the plurality of reference points on the face of the person on the x-y-z axis system, the computerized communication system may be configured to send an alert to the central monitoring station. The central monitoring station may be configured to move the display of at least a portion of the series of images of the person from the primary display to the alert display upon receipt of an alert.
In some aspects, this disclosure relates to computer-readable storage media having embodied thereon computer-executable instructions. When executed by one or more computer processors, the instructions may cause the processors to receive from a 3D motion sensor a series of two or more images of the face of a person. The instructions may cause the processor(s) to superimpose an x-y plane over a plurality of reference points related to anatomical features of the person. The instructions may cause the processor(s) to compare, over time, the positions of the plurality of reference points relative to the x-y plane. The instructions may cause the processor(s) to assess the maintenance of symmetry of the plurality of reference points over time. The instructions may cause the processor(s) to identify an asymmetric change in the positions of the plurality of reference points. The instructions may cause the processor(s) to alert a designated recipient of the asymmetric change. The instructions may cause the processor(s) to display a series of images for each of a plurality of people being monitored on a primary display at a central monitoring station. The instructions may cause the processor(s) to alert the central monitoring station if an asymmetric change in the positions of the plurality of reference points is identified. Upon receiving an alert, the instructions may cause the central monitoring station to duplicate the display of the series of images associated with the alert on a central monitoring station alert display.
Additional objects, advantages, and novel features of the disclosure will be set forth in part in the description which follows, and in part will become apparent to those skilled in the art upon examination of the following, or may be learned by practice of the disclosure.
The description references the attached drawing figures, wherein:
Some individuals may be at higher risk of stroke than others. There seem to be hereditary components to stroke, so those with a family history of stroke may have a higher risk than others with no family history of stroke. Certain medications or medical procedures may increase the risk of stroke, sometimes for a relatively short period. For example, some surgical interventions can cause blood clots that can break away from the surgical site and be swept through the blood vessels to the brain, where smaller blood vessels may trap the clot, causing an ischemic stroke. That risk may be present for days or weeks after the surgery, and then decrease significantly as the surgical site heals and blood clots are less likely to form. People who have had a stroke before may be more likely to have a stroke again. For these and other reasons, some people may be at higher risk of stroke than others, and may merit monitoring for stroke symptoms.
A common stroke symptom is an atypical asymmetry in facial features, often described as a one-sided droop. Most people have some asymmetry between the left and right sides of their faces (e.g., about an imaginary y-axis running down the center of the person's face), but a person having a stroke may lose involuntary as well as voluntary muscle control, sometimes resulting in a distinct sagging of one or more features on one side of the face. The sagging may be observed in the mouth, for example, where one corner of the mouth might be appreciably lower than the other. The sagging may be most noticeable when the person tries to smile or speak, with one side of the mouth responding, and the other side of the mouth unresponsive and/or sagging. Sagging may also be observed in the eyebrows, eyelids, cheeks, or jowls.
The subject matter of the present invention is described with specificity herein to meet statutory requirements. However, the description itself is not intended to limit the scope of this patent. Rather, the inventor has contemplated that the claimed subject matter might also be embodied in other ways, to include different steps or combinations of steps similar to the ones described in this document, in conjunction with other present or future technologies. Moreover, although the terms “step” and/or “block” may be used herein to connote different elements of methods employed, the terms should not be interpreted as implying any particular order among or between various steps herein disclosed unless and except when the order of individual steps is explicitly described.
As shown in
As used herein, “a sensor” and “sensors” are used interchangeably in the singular and plural unless expressly described as a singular sensor or an array of sensors. A singular sensor may be used, or a sensor may comprise two or more cameras integrated into a single physical unit. Alternately, two or more physically distinct sensors may be used, or two or more physically distinct arrays of sensors may be used.
A 3D motion sensor 110 may be co-located with a person 120 to be monitored. The person 120 to be monitored may be monitored in a variety of environments, including, without limitation, a hospital, a home, a hospice care facility, a nursing home, an assisted living facility, an outpatient medical care facility, and the like. The 3D motion sensor 110 may be positioned where it is likely to capture images of the face of the person 120 to be monitored. For example, a 3D motion sensor 110 may be oriented to take images of a bed, chair, or other location where the person 120 to be monitored may spend a significant amount of time. The 3D motion sensor 110 may be permanently installed, or may be temporarily set up in a room as needed. The person 120 to be monitored may be under immediate medical care, e.g., in a medical facility under the supervision of a medical professional, or may not be under immediate care, e.g., in a home or other environment, possibly with a caregiver. A caregiver may be a medical professional or paraprofessional, such as an orderly, nurse's aide, nurse, or the like. A caregiver may also be a friend, relative, individual, company, or facility that provides assistance with daily living activities and/or medical care for individuals, such as individuals who are disabled, ill, injured, elderly, or otherwise in need of temporary or long-term assistance. In some instances, the person to be monitored may be self-sufficient and not under the immediate care of any other person or service provider.
The 3D motion sensor 110 may communicate data, such as images of the person 120 being monitored, to a computerized monitoring system 130. The computerized monitoring system 130 is a computer programmed to monitor transmissions of data from the 3D motion sensor 110. The computerized monitoring system 130 may be integral to the 3D motion sensor 110 or a distinctly separate apparatus from the 3D motion sensor 110, possibly in a remote location from 3D motion sensor 110 provided that the computerized monitoring system 130 can receive data from the 3D motion sensor 110. The computerized monitoring system 130 may be located in the monitored person's room, such as a hospital room, bedroom, or living room. The computerized monitoring system 130 may be connected to a central monitoring station 150. The computerized monitoring system 130 and central monitoring station 150 may be remotely located at any physical locations so long as a data connection exists (USB, TCP/IP or comparable) between the computerized monitoring system 130, the computerized communication system 140 (if separate from computerized monitoring system 130), the central monitoring station 150, and the 3D motion sensor(s) 110.
The computerized monitoring system 130 may receive data from 3D motion sensor 110. The data may include images of a monitoring zone 300, as shown in
If a face is detected within the monitoring zone 300 at step 180, computerized monitoring system 130 may, at step 185, configure a face monitoring zone 310. Configuring the face monitoring zone 310 may include digitally superimposing an x-y plane over the face detected in the image data. The face monitoring zone 310 may include a frame 360 around the face detected in the image data. Frame 360 is shown in
In addition to or in lieu of superimposing a frame 360 around the face detected in the image data, computerized monitoring system 130 may digitally superimpose over the face an x-y plane, such as y-axis 340 and x-axis 350. The y-axis may be placed generally along the bridge of the nose, roughly dividing the face into two halves, with one eye on each side of the y-axis 340. The x-axis 350 may be placed roughly halfway between the top of the head and/or hairline and the bottom of the chin. Reference points may be assigned to distinctive features of the face. For example, in
Returning to
In some embodiments, a certain degree of change in the symmetry or relative symmetry about the x-, y-, and/or z-axes, such as a change in distance of at least 10%, or at least 3 mm, may be required before issuing an alert. These examples are intended to be non-limiting, and any desired percentage or distance may be used. For example, an alarm limit may be set to require a change in distance of at least 5%, or at least 10%, or at least 15%, or at least 20% or more. As another example, an alarm limit may be set to require a change of at least 1-5 mm. Additionally or alternatively, other measures of variance could be used, such as standard deviations, and measures may be based on absolute or relative values. Summed variances, such as the sum of the variances relative to the x-, y-, and/or z-axes, or any subcombination thereof, could be used as a minimum variance threshold for issuing an alert.
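A brief sketch of how such alarm limits might be evaluated is shown below; the 10% and 3 mm defaults echo the examples above, while the summed-variance limit and the function names are illustrative assumptions.

```python
# Illustrative alarm-limit checks; the 10% / 3 mm defaults are example values
# from the passage above, not required settings.
from statistics import pvariance

def exceeds_percent_limit(baseline_mm: float, current_mm: float, pct: float = 10.0) -> bool:
    """Relative change in a left/right distance, as a percentage of baseline."""
    return baseline_mm > 0 and abs(current_mm - baseline_mm) / baseline_mm * 100.0 >= pct

def exceeds_absolute_limit(baseline_mm: float, current_mm: float, mm: float = 3.0) -> bool:
    """Absolute change in a left/right distance, in millimetres."""
    return abs(current_mm - baseline_mm) >= mm

def exceeds_summed_variance(dx, dy, dz, limit: float) -> bool:
    """Sum of positional variances along the x-, y-, and z-axes."""
    return pvariance(dx) + pvariance(dy) + pvariance(dz) >= limit

print(exceeds_percent_limit(30.0, 34.0))   # True: ~13% change exceeds 10%
print(exceeds_absolute_limit(30.0, 31.0))  # False: 1 mm is below the 3 mm limit
print(exceeds_summed_variance([0, 1, 2], [0, 0, 1], [0, 0, 0], limit=0.5))  # True (~0.89 >= 0.5)
```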
On detecting facial features and/or a change in facial features consistent with a stroke symptom, computerized monitoring system 130 may communicate the detected stroke symptom to computerized communication system 140. Computerized communication system 140 may be configured to send an alert of the stroke symptom to one or more designated recipients. Computerized communication system 140 may be an integral part of computerized monitoring system 130 and/or may be implemented using separate software, firmware, and/or hardware, possibly physically remote from computerized monitoring system 130.
When an alert is triggered, the alert may be sent, at least initially, to the person 120 being monitored, to give the person 120 being monitored an opportunity to respond before alerting the central monitoring station 150 and/or caregiver(s) 170. For example, an audible message may be played in the room where person 120 is being monitored, possibly asking something like, "Are you ok?" or "Do you need help?". Shown as step 195 in FIG. 1, computerized monitoring system 130 can analyze subsequent image data from 3D motion sensor 110 for gestures, such as a head nod, consistent with a yes or no answer. If 3D motion sensor 110 is equipped with microphones, computerized monitoring system 130 can analyze sound data for recognizable words, such as yes, no, help, or even certain extended sounds, such as "oooooohhhhhhhhh," which might be consistent with moaning or other vocalization associated with pain, discomfort, or disorientation.
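One possible, simplified way to classify the response to such a prompt is sketched below; the keyword sets, the treatment of extended sounds, and the returned labels are assumptions used only for illustration.

```python
# Illustrative classification of a recognized response to the audible prompt.
# The keyword sets and returned labels are assumptions, not part of the disclosure.
def interpret_response(words):
    """Classify words recognized after the prompt (e.g., 'Are you ok?')."""
    words = {w.lower() for w in words}
    if not words:
        return "no_response"                      # escalate to central monitoring
    if "help" in words or any(w.startswith("oo") for w in words):
        return "needs_assistance"                 # explicit request or moan-like sound
    if words & {"yes", "ok", "no"}:
        return "recognized_answer"                # route yes/no back to the asking logic
    return "unintelligible"                       # leave to a human attendant

print(interpret_response([]))                     # no_response
print(interpret_response(["oooooohhhhhhhhh"]))    # needs_assistance
print(interpret_response(["Yes"]))                # recognized_answer
```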
Central monitoring station 150 may be alerted if no response is received at step 195, or if the response is unintelligible or indicates that the person 120 being monitored wants or needs assistance. Alternately, or additionally, central monitoring station 150 may be alerted with or even before person 120, so that central monitoring station 150 can determine whether the apparent stroke symptom detected is, in fact, problematic. On receiving an alert, the central monitoring station 150, or an attendant there, may view live image, video and/or audio feed from the 3D motion sensor 110, and evaluate whether the automated observations are persistent and/or troubling. If person 120 has been alerted by the computerized communication system 140, central monitoring station 150 or an attendant there can use the data from 3D motion sensor 110 to evaluate whether a response from person 120 is reassuring or indicates that person 120 requires assistance. Central monitoring station 150 and/or computerized monitoring system 130 may analyze the response from person 120; however, if the response does not include words or gestures recognizable by the computerized system, an attendant at central monitoring station 150 may be able to interpret the person's response. If needed, the central monitoring station 150 and/or the attendant could then approve alert(s) to appropriate caregiver(s) 170 and/or call for emergency assistance (e.g., send a request for emergency medical services to 9-1-1 or a similar service local to the person 120).
One or more caregiver(s) 170 local to person 120 can be alerted with or even before person 120 and/or central monitoring station 150, so that the caregiver(s) 170 can assess what is happening in person. Or, monitored person 120, caregiver(s) 170 and the central monitoring station 150 could all be alerted at the same time. The priority and timing of alerts to different individuals or stations can be configured in accordance with the needs and desires of a particular facility, experience with a particular monitored individual or type of patient, or any other criterion of the system owner or user. This is true for initial alerts as well as continuing alerts (e.g., if stroke symptoms are detected, and no response from person 120 or a caregiver 170 is received or observed) or repeated alerts (two or more distinct events where possible stroke symptoms are observed). The priority and timing of alerts to different individuals may be different for initial, continuing, and/or repeated alerts.
Data associated with alerts may be logged by computerized monitoring system 130 and/or central monitoring station 150 in a database 160. Data associated with an alert may include, without limitation, the telemetry data from 3D motion sensor 110 that triggered the alert; buffered data preceding the telemetry data that triggered the alert; telemetry data subsequent to the alert; the number and substantive content of an alert; the individual(s) and/or groups to whom an alert was addressed; the response, if any, received or observed following an alert; and combinations thereof.
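A sketch of one possible record structure for such logging is shown below; the field names track the data categories listed above, but the structure itself (and representing it as a Python dataclass) is an illustrative assumption.

```python
# Illustrative structure for a logged alert record; field names mirror the data
# categories listed above but are otherwise assumptions.
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class AlertRecord:
    person_id: str
    triggered_at: datetime
    triggering_telemetry: List[dict]                                 # frames that triggered the alert
    buffered_telemetry: List[dict] = field(default_factory=list)     # frames preceding the trigger
    subsequent_telemetry: List[dict] = field(default_factory=list)   # frames after the alert
    alert_content: str = ""                                          # substantive content of the alert
    recipients: List[str] = field(default_factory=list)              # individuals/groups alerted
    response: str = ""                                               # response received or observed, if any

record = AlertRecord(person_id="120", triggered_at=datetime.now(),
                     triggering_telemetry=[{"asymmetry_mm": 4.2}],
                     alert_content="possible stroke symptom: mouth droop",
                     recipients=["central_monitoring_station", "caregiver_170"])
```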
As shown in
When the central monitoring station 150 receives an alert from any of the computerized monitoring and communication systems 130A, 130B, 130C, indicating that a monitored person 120A, 120B, or 120C is presenting one or more stroke symptoms, audio and/or alert information for that particular person may be displayed on the central monitoring alert display 210. An alert can be presented in a variety of formats. An alert may be a visual cue on screen at the central monitoring station 150, such as the specific camera view flashing or being highlighted in a color to draw attention to that display among others. An alert may be an audible sound (e.g., a voice or alarm type sound) at the central monitoring station 150, an audible sound at the computerized monitoring system 130 attached to the 3D motion sensor 110, a text message, an email, turning on a light or even running a program on a computer. Should the central monitoring station 150 receive alerts from more than one of the computerized monitoring and communication systems 130A, 130B, 130C, indicating that a person 120A, 120B, and/or 120C is presenting a stroke symptom, the central monitoring alert display 210 may display the video, audio and/or alerting information from all such instances at the same time. If no alert is received by the central monitoring station 150, it may be that nothing is displayed on the central monitoring alert display 210. Preferably, all monitored individual rooms can be displayed and visible on the central monitoring primary display 200 whether alerting or not. When an alert is generated, attention can be drawn to the particular camera on central monitoring primary display 200 and/or a duplicative display of the alerting camera can be displayed on a second separate computer monitor, e.g., the central monitoring alert display 210.
An electronic record of any alerts received, any responses to the alert observed or received, and/or any actions taken by the central monitoring station 150 can be stored in a database 160.
As mentioned above,
Using facial recognition algorithms, the computerized monitoring system 130 may identify key features of the face of person 120C being monitored. Key features may include, without limitation, the orbit of the eye socket(s), eyebrow(s), eyebrow ridge(s), the nose, the bridge of the nose, the mouth, top of the head, hairline, chin, ears, cheekbones, etc. The features used may vary with the kind of technology (e.g., visible vs. infrared light) and/or prominent or accessible features on person 120C. Using the key features, the computerized monitoring system 130 may center an x-y plane defined by y-axis 340 and x-axis 350 at roughly the center of the person's face. The placement of the y-axis 340, for example, may be conditioned on having one eye and at least a portion of the mouth on each side of y-axis 340. The x-axis 350 may be placed on the location of the upper cheekbone, orbital bone about the eye socket, the apex of the nose, or the like.
Alternately, the position of frame 360 may be determined, e.g., to circumscribe the head, with the y-axis 340 dividing the frame 360 in half vertically and the x-axis 350 dividing it in half horizontally. The absolute placement of the x-y plane is not essential, as long as the x-y plane can be consistently positioned over new images of person 120, so that the x-y plane provides a constant frame of reference. Since the x-y plane will be used for comparing the position of soft tissue features, if the x-y plane is defined by anatomical features, those features may preferably be bony, cartilaginous, or otherwise unlikely to move, particularly during a stroke or TIA. For example, the ears do not typically sag or droop during a stroke or TIA. As another example, the x-y plane might be situated with the intersection between the y-axis 340 and the x-axis 350 on the tip or apex of the nose of the person 120 being monitored. The computerized monitoring system 130 may use facial tracking rather than facial recognition: facial recognition implies that the software attempts to identify a particular person (e.g., Jane Doe) based on facial features, whereas facial tracking recognizes a particular facial feature (e.g., an eye) without identifying the person. If desired, facial recognition algorithms could also be used, e.g., to confirm that the system has "locked on" to the intended person 120 to be monitored, or to confirm the identity of the monitored person 120 for cross-checking records before recording monitoring data to an electronic medical record, billing system, or the like.
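The sketch below illustrates one way to keep that frame of reference constant between images: each set of reference points is re-expressed relative to a stable anchor such as the tip of the nose before any comparison is made. The landmark names and coordinates are hypothetical.

```python
# Illustrative re-registration of the x-y plane between images: coordinates are
# expressed relative to a stable anchor (here, the nose tip) so the plane
# provides a constant frame of reference. Landmark names are assumptions.
from typing import Dict, Tuple

Point = Tuple[float, float]

def register_to_anchor(points: Dict[str, Point], anchor: str = "nose_tip") -> Dict[str, Point]:
    """Translate all reference points so the anchor sits at the x-y origin."""
    ax, ay = points[anchor]
    return {name: (x - ax, y - ay) for name, (x, y) in points.items()}

# The whole face shifts between the two images, but relative positions are
# unchanged, so registered coordinates match and no asymmetric change is reported.
frame1 = {"nose_tip": (100.0, 80.0), "left_mouth_corner": (90.0, 60.0),
          "right_mouth_corner": (110.0, 60.0)}
frame2 = {name: (x + 15.0, y - 5.0) for name, (x, y) in frame1.items()}
print(register_to_anchor(frame1) == register_to_anchor(frame2))  # True
```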
The computerized monitoring system 130 may identify soft-tissue reference points on the face of monitored person 120A. As shown in
If monitoring zone 300 and/or face monitoring zone 310 are configured by a user, the user may operate an input device to select a point on an image or video from the computerized monitoring system 130. The user may draw a perimeter defining a zone freehand, or may drag the input device (such as an electronic stylus or mouse pointer) from one point to another to define a diagonal axis for the perimeter of the zone. Other configuration options, including drag-and-drop templates and coordinate identification, could be used. A 2D monitoring zone 300 and/or face monitoring zone 310 can be operated as a perimeter, or a third dimension of depth can be specified. As with the perimeter, the computerized monitoring system can define or recommend a depth measurement, or the user can provide the depth measurement.
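A minimal sketch of a zone configured from two diagonal corner points, with an optional depth, is shown below; the coordinate units and default values are assumptions for illustration only.

```python
# Illustrative configuration object for a monitoring zone defined by dragging
# between two corner points, with an optional depth; values are examples only.
from dataclasses import dataclass
from typing import Optional

@dataclass
class MonitoringZone:
    x1: float
    y1: float
    x2: float
    y2: float
    depth_mm: Optional[float] = None  # None -> operate as a 2D perimeter

    def contains(self, x: float, y: float, z: Optional[float] = None) -> bool:
        """True if a point falls inside the configured perimeter (and depth, if set)."""
        inside = (min(self.x1, self.x2) <= x <= max(self.x1, self.x2)
                  and min(self.y1, self.y2) <= y <= max(self.y1, self.y2))
        if inside and self.depth_mm is not None and z is not None:
            inside = z <= self.depth_mm
        return inside

zone = MonitoringZone(100, 50, 400, 300, depth_mm=2500)
print(zone.contains(250, 150, z=1800))  # True
print(zone.contains(250, 150, z=3000))  # False: beyond the configured depth
```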
On setting a depth parameter, and while still in a configuration view, the depth of the monitoring zone may be visible as a label 900, as shown in
Although monitoring zone 300 and face monitoring zone 310 may be configured and operational, they may not be shown outside of the configuration screens for those zones, as in
If the Device menu 1010 in
On selection of an event 1210 in
As shown in
The various computerized systems and processors as described herein may include, individually or collectively, and without limitation, a processing unit, internal system memory, and a suitable system bus for coupling various system components, including database 160, with a control server. Computerized monitoring system 130 and/or central monitoring station 150 may provide control server structure and/or function. The system bus may be any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, and a local bus, using any of a variety of bus architectures. By way of example, and not limitation, such architectures include Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus.
The computerized systems typically include therein, or have access to, a variety of computer-readable media, for instance, database 160. Computer-readable media can be any available media that may be accessed by the computerized system, and includes volatile and nonvolatile media, as well as removable and non-removable media. By way of example, and not limitation, computer-readable media may include computer-storage media and communication media. Computer-readable storage media may include, without limitation, volatile and nonvolatile media, as well as removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, program modules, or other data. In this regard, computer-storage media may include, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVDs) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage, or other magnetic storage device, or any other medium which can be used to store the desired information and which may be accessed by the control server. Computer-readable storage media excludes signals per se.
Communication media typically embodies computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism, and may include any information delivery media. As used herein, the term “modulated data signal” refers to a signal that has one or more of its attributes set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared, and other wireless media. Combinations of any of the above also may be included within the scope of computer-readable media. The computer-readable storage media discussed above, including database 160, provide storage of computer readable instructions, data structures, program modules, and other data for the computerized systems. Computer readable instructions embodied on computer-readable storage media may be accessible by stroke detection system 100 and/or component(s) thereof, and, when executed by a computer processor and/or server, may cause the system to function and/or perform the methods described herein.
The computerized systems may operate in a computer network using logical connections to one or more remote computers. Remote computers may be located at a variety of locations, for example, but not limited to, hospitals and other inpatient settings, veterinary environments, ambulatory settings, medical billing and financial offices, hospital administration settings, home health care environments, payer offices (e.g., insurance companies), home health care agencies, clinicians' offices and the clinician's home or the patient's own home or over the Internet. Clinicians may include, but are not limited to, a treating physician or physicians, specialists such as surgeons, radiologists, cardiologists, and oncologists, emergency medical technicians, physicians' assistants, nurse practitioners, nurses, nurses' aides, pharmacists, dieticians, microbiologists, laboratory experts, laboratory technologists, genetic counselors, researchers, veterinarians, students, and the like. The remote computers may also be physically located in non-traditional medical care environments so that the entire health care community may be capable of integration on the network. The remote computers may be personal computers, servers, routers, network PCs, peer devices, other common network nodes, or the like, and may include some or all of the elements described above in relation to the control server. The devices can be personal digital assistants or other like devices.
Exemplary computer networks may include, without limitation, local area networks (LANs) and/or wide area networks (WANs). Such networking environments are commonplace in offices, enterprise-wide computer networks, intranets, and the Internet. When utilized in a WAN networking environment, the control server may include a modem or other means for establishing communications over the WAN, such as the Internet. In a networked environment, program modules or portions thereof may be stored in the control server, in the database 160, or on any of the remote computers. For example, and not by way of limitation, various application programs may reside on the memory associated with any one or more of the remote computers. It will be appreciated by those of ordinary skill in the art that the network connections shown are exemplary and other means of establishing a communications link between the computers may be utilized.
In operation, a user may enter commands and information into the computerized system(s) using input devices, such as a keyboard, a pointing device (commonly referred to as a mouse), a trackball, a touch pad, a 3D gesture recognition camera, or a motion sensor. Other input devices may include, without limitation, microphones, satellite dishes, scanners, or the like. In addition to or in lieu of a monitor, the computerized systems may include other peripheral output devices, such as speakers and a printer.
Many other internal components of the computerized system hardware are not shown because such components and their interconnection are well known. Accordingly, additional details concerning the internal construction of the computers that make up the computerized systems are not further disclosed herein.
Methods and systems of embodiments of the present disclosure may be implemented in a WINDOWS or LINUX operating system, operating in conjunction with an Internet-based delivery system; however, one of ordinary skill in the art will recognize that the described methods and systems can be implemented in any operating system suitable for supporting the disclosed processing and communications. As contemplated by the language above, the methods and systems of embodiments of the present invention may also be implemented on a stand-alone desktop, personal computer, cellular phone, smart phone, tablet computer, PDA, or any other computing device used in a healthcare environment or any of a number of other locations.
From the foregoing, it will be seen that this disclosure is well adapted to attain all the ends and objects hereinabove set forth together with other advantages which are obvious and which are inherent to the structure.
It will be understood that certain features and subcombinations are of utility and may be employed without reference to other features and subcombinations. This is contemplated by and is within the scope of the claims.
Since many possible embodiments may be made of the invention without departing from the scope thereof, it is to be understood that all matter herein set forth or shown in the accompanying drawings is to be interpreted as illustrative and not in a limiting sense.
This application claims the benefit of U.S. Provisional Patent Application No. 62/273,735, filed Dec. 31, 2015, which is hereby incorporated by reference in its entirety.