The present invention is directed to systems and methods for processing video of a subject such that a determination can be made whether that subject is in atrial fibrillation or in normal sinus rhythm.
Monitoring cardiac events is of clinical importance in the early detection of potentially fatal conditions. Current technologies involve contact sensors (e.g., ECG) that the individual must wear constantly. Such a requirement can lead to patient discomfort, dependency, and loss of dignity, and monitoring may further fail for a variety of reasons, including refusal to wear the monitoring device. Elderly cardiac patients are even more likely to suffer from the adverse effects of continued contact monitoring. The ability to monitor cardiac function by non-contact means is therefore highly desirable in the healthcare industry. Measurements can be made without disturbing a resting patient, are suitable for long observation and monitoring periods, and can provide a record of visual imagery of the patient. Video-based methods can detect pulsation for long-term cardiac function monitoring in a non-contact, unobtrusive manner.
Among cardiac arrhythmias, atrial fibrillation (AF) accounts for approximately ⅓ of hospital admissions for cardiac-related issues. AF is one of the most common arrhythmias and can cause palpitations, fainting, chest pain, heart failure, and stroke. The incidence of AF increases with age, and AF often presents with a wide spectrum of symptoms. Presently, over 2 million Americans have been diagnosed with AF. Unobtrusive, non-contact, imaging-based methods are needed for monitoring patients for AF so that diagnosis and treatment can be improved. Much work has been done in this regard, and the present invention is directed towards this issue.
Accordingly, what is needed in this art are increasingly sophisticated systems and methods for processing video of a subject such that a determination can be made whether that subject is in atrial fibrillation.
The following U.S. Patents, U.S. Patent Applications, and Publications are incorporated herein in their entirety by reference.
“Determining Cardiac Arrhythmia From A Video Of A Subject Being Monitored For Cardiac Function”, U.S. Pat. No. 8,768,438, by Mestha et al.
“Continuous Cardiac Signal Generation From A Video Of A Subject Being Monitored For Cardiac Function”, U.S. patent application Ser. No. 13/871,766, by Kyal et al.
“Continuous Cardiac Pulse Rate Estimation From Multi-Channel Source Video Data With Mid-Point Stitching”, U.S. Pat. No. 9,036,877, by Kyal et al.
“Estimating Cardiac Pulse Recovery From Multi-Channel Source Data Via Constrained Source Separation”, U.S. Pat. No. 8,617,081, by Mestha et al.
“Filtering Source Video Data Via Independent Component Selection”, U.S. Pat. No. 8,600,213, by Mestha et al.
“Determining A Total Number Of People In An IR Image Obtained Via An IR Imaging System”, U.S. Pat. No. 8,520,074, by Wang et al.
“Determining A Number Of Objects In An IR Image”, U.S. Pat. No. 8,587,657, by Wang et al.
“Determining A Pixel Classification Threshold For Vehicle Occupancy Detection”, U.S. Pat. No. 9,202,118, by Wang et al.
“Deriving Arterial Pulse Transit Time From A Source Video Image”, U.S. Pat. No. 8,838,209, by Mestha.
“Video-Based Estimation Of Heart Rate Variability”, U.S. Pat. No. 8,977,347, by Mestha et al.
“Systems And Methods For Non-Contact Heart Rate Sensing”, U.S. Pat. No. 9,020,185, by Mestha et al.
“Processing A Video For Vascular Pattern Detection And Cardiac Function Analysis”, U.S. Pat. No. 8,897,522, by Mestha et al.
“Subcutaneous Vein Pattern Detection Via Multi-Spectral IR Imaging In An Identity Verification System”, U.S. Pat. No. 8,509,495, by Xu et al.
“Method And Apparatus For Monitoring A Subject For Atrial Fibrillation”, U.S. patent application Ser. No. 13/937,740, by Mestha et al.
“System And Method For Determining Video-Based Pulse Transit Time With Time-Series Signals”, U.S. patent application Ser. No. 14/026,739, by Mestha et al.
“Processing Source Video For Real-Time Enhancement Of A Signal Of Interest”, U.S. Pat. No. 8,879,867, by Tanaka et al.
“Removing Environment Factors From Signals Generated From Video Images Captured For Biomedical Measurements”, U.S. Pat. No. 9,185,353, by Mestha et al.
What is disclosed is a system and method for processing video of a subject such that a determination can be made whether that subject is in atrial fibrillation. The teachings hereof are directed to detecting AF episodes by analyzing videoplethysmographic (VPG) signals extracted from time-series signals generated from video. With an implementation of the teachings hereof, cardiac arrhythmias can be discovered in real-time or processed offline from a video of the resting cardiac patient. The system and methods disclosed herein provide an effective tool for AF detection and cardiac function assessment.
One embodiment of the present method for determining whether a subject is having an atrial fibrillation event involves performing the following. First, a video of a region of exposed skin of a subject is received. The video is acquired of a region where a videoplethysmographic (VPG) signal can be registered by at least one imaging channel of a video imaging device used to capture that video. A size N of a batch of image frames is defined. For each batch of image frames of size N, the following are performed. The batch of image frames is processed to isolate pixels associated with the region of exposed skin, and the isolated pixels are processed to obtain a time-series signal for this batch. A VPG signal is extracted from this time-series signal. Thereafter, a power spectral density is computed across all frequencies within the VPG signal to facilitate an identification of a fundamental frequency and at least a first harmonic of the fundamental frequency. A pulse harmonic strength (PHS) is calculated, comprising a ratio of signal strength at the identified fundamental frequency and harmonics to a strength of a base signal without the fundamental frequency and harmonics. A pre-defined discrimination threshold can be selected using a Receiver Operating Characteristic (ROC) curve constructed for various values of the pulse harmonic strength. The pulse harmonic strength calculated from the VPG signal is then compared to the selected discrimination threshold. As a result of the comparison, a determination is made whether the subject is in atrial fibrillation or in normal sinus rhythm. A pre-defined discrimination threshold can also be selected on an individual patient basis based on VPG signals obtained during atrial fibrillation and during sinus rhythm.
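By way of a hedged illustration only, the per-batch decision described above reduces to a comparison of the calculated pulse harmonic strength against the selected discrimination threshold. The following Python sketch shows that final comparison; the function name and the example threshold value are hypothetical and are not taken from this disclosure.

```python
# Minimal sketch of the per-batch AF/SR decision. The pulse harmonic
# strength (PHS) for the current batch is assumed to have been computed
# already; the threshold value used below is a placeholder, not a value
# prescribed by this disclosure.

def classify_batch(phs: float, discrimination_threshold: float) -> str:
    """Return 'AF' when the PHS falls below the threshold, otherwise 'SR'.

    A low PHS indicates that signal power is spread away from the cardiac
    fundamental frequency and its harmonics, which is the spectral
    signature of atrial fibrillation.
    """
    return "AF" if phs < discrimination_threshold else "SR"

# Example usage with hypothetical numbers.
print(classify_batch(phs=0.35, discrimination_threshold=0.80))  # -> AF
print(classify_batch(phs=2.10, discrimination_threshold=0.80))  # -> SR
```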
Many features and advantages of the above-described system and method will become readily apparent from the following detailed description and accompanying drawings.
The foregoing and other features and advantages of the subject matter disclosed herein will be made apparent from the following detailed description taken in conjunction with the accompanying drawings, in which:
What is disclosed is a system and method for processing video of a subject such that a determination can be made whether that subject is in atrial fibrillation.
“Atrial fibrillation” (AF or A-fib), is one of the most common cardiac arrhythmias. In AF, the normal regular electrical impulses generated by the sinoatrial node are overwhelmed by disorganized electrical impulses usually originating in the roots of the pulmonary veins, leading to irregular conduction of impulses to the ventricles which generate the heartbeat. AF increases the risk of stroke. The degree of stroke risk can be up to seven times that of the average population depending on the presence of additional risk factors such as high blood pressure. Around 15% of ischemic strokes are directly attributed to emboli in the setting of AF. Nearly 25% of additional ischemic strokes in which no cause can be identified may also be due to asymptomatic AF. A substantial percentage of patients with AF have no symptoms during brief periods or even sustained episodes of AF, making detection of AF in patients at high risk for stroke challenging with current technology over long term follow-up. AF may occur in episodes lasting from minutes to days, or be permanent in nature. A number of medical conditions increase the risk of AF. Atrial fibrillation may be treated with medications which either slow the heart rate to a normal range or revert the heart rhythm back to normal. The teachings hereof help identify asymptomatic AF patients at an early stage. The identification of subclinical AF is extremely important because early detection directly affects management as the majority of these patients can be effectively treated with systemic anticoagulation once AF has been determined.
A “subject” refers to a living being monitored for atrial fibrillation in accordance with the methods disclosed herein. Although the term “person” or “patient” may be used throughout this disclosure, it should be appreciated that the subject may be something other than a person such as, for example, a primate. Therefore, the use of such terms is not to be viewed as limiting the scope of the appended claims strictly to humans.
A “video” refers to a plurality of time-sequential image frames captured by a video imaging device, as is generally understood. The video may also contain other components such as audio, time references, frame rate data, and the like.
A “video imaging device” is a single-channel or a multi-channel video capture device or system capable of registering a videoplethysmographic (VPG) signal on at least one imaging channel.
“Receiving image frames” of a video is intended to be widely construed and includes: retrieving, capturing, acquiring, or otherwise obtaining image frames for processing. The image frames can be retrieved from a memory or storage device of the video imaging device, obtained from a remote device over a network, or received from media such as a CD-ROM or DVD. Image frames may be downloaded from a web-based system or application which makes video available for processing. Image frames can also be received from an application such as those available for handheld cellular devices and processed on the cellphone or another handheld computing device such as an iPad or tablet. The image frames of the video are processed in batches.
A “region of exposed skin” refers to an unobstructed view of the subject's skin as seen through the lens of the video imaging device. Regions of exposed skin are isolated in the image frames of the batch where a physiological signal corresponding to the subject's cardiac function was registered by one or more imaging channels of the video imaging device used to capture that video. The region(s) of exposed skin in the image frames are isolated using image processing techniques which include, for instance, object identification, pattern recognition, face detection and facial recognition methods, and the pixel classification methods disclosed in the above references by Wang et al. Other methods include color and texture identification, analysis of spatial features, and spectral information. Moreover, a user or technician may use a mouse or a touchscreen display to select or otherwise identify one or more regions of exposed skin in the image frames of the video for processing. Regions of exposed skin in the image frames do not have to be the same size, but should have at least a minimum size as defined by the application. The exact size of a given region of exposed skin will vary depending on the application and thus a discussion as to a specific size is omitted. The video imaging device should be zoomed in to capture a large region of exposed skin so that a greater number of pixels of the skin surface is available for processing.
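As a hedged illustration of one way such isolation might be performed in software, and not the pixel classification method of the Wang et al. references cited above, the sketch below applies a simple YCrCb chrominance threshold with OpenCV; the threshold bounds are common heuristics and are assumptions that would require tuning for a given camera and lighting.

```python
import cv2
import numpy as np

def isolate_skin_pixels(frame_bgr: np.ndarray) -> np.ndarray:
    """Return a boolean mask marking likely skin pixels in one image frame.

    A YCrCb chrominance threshold is used here purely as an illustrative
    stand-in for the skin-detection methods referenced above; the bounds
    are heuristic and lighting dependent.
    """
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    lower = np.array([0, 133, 77], dtype=np.uint8)    # assumed Cr/Cb bounds
    upper = np.array([255, 173, 127], dtype=np.uint8)
    mask = cv2.inRange(ycrcb, lower, upper)
    return mask.astype(bool)
```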
A “time-series signal” is a signal extracted from a batch of image frames that contains frequency components of interest that relate to the cardiac function for which the subject is being monitored. Image frames of the video are processed in batches to isolate one or more regions of exposed skin where the subject's videoplethysmographic (VPG) signal can be registered by one or more imaging channels of the video imaging device. Methods for processing video image frames to identify a time-series signal and for enhancing that signal are disclosed in several of the above references.
In one embodiment, an average of all pixel values in the isolated regions of exposed skin is computed to obtain a channel average on a per-frame basis. A global channel average is computed, for each channel, by adding the channel averages across multiple image frames and dividing by the total number of frames comprising the batch. The channel average is subtracted from the global channel average and the result is divided by a global channel standard deviation to obtain a time-series signal. The time-series signal can be normalized and filtered to remove undesirable frequencies. The time-series signal obtained from processing a given batch of image frames contains the sum total of the relative blood volume changes in the blood vessels close to the skin surface within the isolated region. These arterial pulsations comprise a dominant component of the time-series signals. A videoplethysmographic signal corresponding to the subject's cardiac function is extracted from the time-series signal.
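A minimal sketch of the computation just described is given below, assuming the batch of frames and the per-frame skin masks are already available as NumPy arrays; the array layout and function name are illustrative assumptions. The sign convention follows the description above (global average minus per-frame average); it does not affect the subsequent spectral analysis.

```python
import numpy as np

def batch_time_series(frames: np.ndarray, masks: np.ndarray) -> np.ndarray:
    """Compute a normalized per-channel time-series signal for one batch.

    frames: array of shape (N, H, W, C) holding the N image frames.
    masks:  boolean array of shape (N, H, W) marking exposed-skin pixels.
    Returns an (N, C) array: one sample per frame and imaging channel,
    formed from the per-frame channel averages normalized by the global
    channel average and global channel standard deviation.
    """
    n_frames, _, _, n_channels = frames.shape
    channel_avg = np.empty((n_frames, n_channels))
    for i in range(n_frames):
        skin_pixels = frames[i][masks[i]]          # shape: (num_skin_pixels, C)
        channel_avg[i] = skin_pixels.mean(axis=0)  # channel average for frame i
    global_avg = channel_avg.mean(axis=0)          # global channel average
    global_std = channel_avg.std(axis=0)           # global channel standard deviation
    return (global_avg - channel_avg) / global_std
```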
A “videoplethysmographic (VPG) signal” is a physiological signal obtained by performing signal separation on the time-series signal. Methods for extracting a VPG signal from a time-series signal obtained from video images are disclosed in several of the above references.
“Processing” includes the application of any mathematical operation applied to data, according to any specific context, or for any specific purpose as described herein.
A “threshold for movement” is a level of movement, detected during video acquisition, used to determine whether motion artifacts may have been introduced into the video. If the movement is above the threshold for movement, then the current batch of image frames is discarded. Alternatively, an indication is provided that the VPG signal extracted from the time-series signal for this batch may be unreliable and may require further processing. The threshold for movement may be based on a type of motion, a source of motion (i.e., by the subject or by the environment), or the time the movement occurred. The threshold level may be set by a user or technician. The threshold level may be automatically adjusted in real-time or manually adjusted by a user/technician as the video of the subject is being captured by the video imaging device. The threshold for movement will likely depend on the application where the teachings hereof find their intended uses. Therefore, a discussion with respect to a particular threshold level is omitted. Various other responses to movement exceeding the threshold include, for example, initiating an alert signal that movement is excessive; signaling a medical professional that movement has occurred; changing a frame rate of the video imaging device; swapping the video imaging device for another video camera; moving a position of the video imaging device; and stopping video acquisition altogether.
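Since the disclosure leaves the movement metric and threshold to the application, the following sketch is only one hedged possibility: it scores inter-frame movement as a mean absolute frame difference and rejects the batch when the score exceeds a user-set threshold.

```python
import numpy as np

def movement_score(frames: np.ndarray) -> float:
    """Mean absolute inter-frame difference, used as a crude motion metric."""
    gray = frames.mean(axis=-1)               # (N, H, W) luminance proxy
    diffs = np.abs(np.diff(gray, axis=0))     # frame-to-frame differences
    return float(diffs.mean())

def batch_is_usable(frames: np.ndarray, threshold_for_movement: float) -> bool:
    """Return False when movement exceeds the threshold, indicating the batch
    should be discarded or flagged as potentially unreliable."""
    return movement_score(frames) <= threshold_for_movement
```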
A “fundamental frequency”, or simply the “fundamental”, is the frequency of a periodic waveform with highest power. The fundamental is usually abbreviated f0, indicating the frequency as given by:
f0=1/T
where T is the fundamental period. The first harmonic is often abbreviated as f1. In some contexts, the fundamental f0 is the first harmonic. In this context, fundamental frequency is the frequency at which the ventricles contract. For example, during sinus rhythm, fundamental frequency is the largest frequency component present in the signal. The first harmonic is at twice the frequency of the fundamental as described below.
A “harmonic” is a component frequency of a signal that is an integer multiple of the fundamental frequency. If the fundamental frequency is f0, the harmonics have frequencies 2f0, 3f0, 4f0, . . . , etc. The harmonics have the property that they are all periodic at the fundamental frequency; therefore, the sum of harmonics is also periodic at that frequency. Harmonics are equally spaced in frequency.
A “power spectral density” (PSD) describes how the power of a signal or time series is distributed over the different frequencies contained within that signal. In general, the power P of a signal x(t) is an average over the time interval [−T, T], given by:
P = (1/(2T)) ∫[−T, T] |x(t)|² dt
It is advantageous to work with a truncated Fourier transform where the signal is integrated only over a finite interval. Methods for computing power spectral density are well understood in the signal processing arts. The reader is directed to the textbooks: “Principles of Random Signal Analysis and Low Noise Design: The Power Spectral Density and its Applications”, R. M. Howard (Author), Wiley 1st Ed. (2002), ISBN-13: 978-0471226178, and “Random Signal Analysis in Engineering Systems”, John J. Komo (Author), Academic Press (1987), ISBN-13: 978-0124186606.
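Any standard spectral estimator can produce the PSD used in the remainder of this description. As one hedged example, the sketch below uses Welch's method as implemented in SciPy; the segment length is an arbitrary choice, not a requirement of the method.

```python
import numpy as np
from scipy.signal import welch

def vpg_power_spectrum(vpg: np.ndarray, fps: float):
    """Estimate the power spectral density of a VPG signal.

    vpg: 1-D VPG signal sampled at the video frame rate.
    fps: frame rate of the video imaging device, in frames per second.
    Returns (freqs, psd), where psd[i] is the power density at freqs[i] Hz.
    """
    nperseg = min(len(vpg), int(8 * fps))   # use ~8-second segments when possible
    freqs, psd = welch(vpg, fs=fps, nperseg=nperseg)
    return freqs, psd
```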
Due to the irregularity in heart beats during AF, the power of the VPG signal is spread across the frequency spectrum rather than being concentrated at a fundamental frequency and its harmonics, which reduces the pulse harmonic strength defined below.
“Pulse harmonic strength (PHS)” is a ratio of signal strength at the fundamental frequency and harmonics to a strength of a base signal without the fundamental frequency and harmonics. From the PSD, the fundamental frequency and its harmonics are identified. Frequencies in a neighborhood of the harmonics can also be considered by defining a band (e.g., 0.2 Hz, or 12 beats per minute (bpm)). All the power within this band is integrated, denoted Psig. Power in all remaining bands is integrated separately, denoted Pnoi. The pulse harmonic strength is therefore given by the ratio:
PHS=Psig/Pnoi
where Pnoi=PTotal−Psig, and PTotal is the total power of the signal.
The PHS therefore represents the total strength of the pulse power, because that power is centered at the heart beat frequency and the harmonics of that frequency.
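A direct implementation of the PHS definition above might look as follows: the fundamental is located as the PSD peak within an assumed plausible heart-rate range, harmonics are taken at integer multiples, and a 0.2 Hz-wide band (as in the example above) is integrated around each. The heart-rate search range and the number of harmonics considered are assumptions for illustration.

```python
import numpy as np

def pulse_harmonic_strength(freqs: np.ndarray, psd: np.ndarray,
                            band_hz: float = 0.2, n_harmonics: int = 3,
                            hr_range: tuple = (0.75, 3.0)) -> float:
    """Compute PHS = Psig / Pnoi from a power spectral density.

    The fundamental f0 is the PSD peak inside hr_range (Hz, roughly
    45-180 bpm); power in band_hz-wide bands centered on f0 and its
    harmonics is summed into Psig, and Pnoi = PTotal - Psig. The search
    range and harmonic count are illustrative assumptions.
    """
    df = freqs[1] - freqs[0]
    in_range = (freqs >= hr_range[0]) & (freqs <= hr_range[1])
    f0 = freqs[in_range][np.argmax(psd[in_range])]       # fundamental frequency

    signal_band = np.zeros_like(freqs, dtype=bool)
    for k in range(1, n_harmonics + 1):                  # f0, 2*f0, 3*f0, ...
        signal_band |= np.abs(freqs - k * f0) <= band_hz / 2.0

    p_total = psd.sum() * df
    p_sig = psd[signal_band].sum() * df
    p_noi = p_total - p_sig
    return p_sig / p_noi
```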
“Normalized pulse harmonic strength (NPHS)” is a ratio of signal strength at the fundamental frequency and harmonics to a strength of a base signal. The normalized pulse harmonic strength is therefore given by the ratio:
NPHS=Psig/PTotal
The normalized PHS has a value between 0 and 1.
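Because PTotal = Psig + Pnoi, the normalized variant can be obtained directly from the PHS value of the previous sketch, as shown below.

```python
def normalized_pulse_harmonic_strength(phs: float) -> float:
    """NPHS = Psig / PTotal expressed in terms of PHS.

    Since PTotal = Psig + Pnoi, NPHS = PHS / (1 + PHS), which always
    lies between 0 and 1.
    """
    return phs / (1.0 + phs)
```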
FIG. 8 is a histogram of the pulse power of an AF segment.
A “Receiver Operating Characteristic (ROC) curve” is a graphical plot which illustrates the performance of a binary classifier system as its discrimination threshold is varied. The ROC is created by plotting the fraction of true positives out of the total actual positives (TPR=true positive rate) vs. the fraction of false positives out of the total actual negatives (FPR=false positive rate), at various discrimination threshold levels. TPR is also known as sensitivity. The FPR is also known as the fall-out and can be calculated as one minus the well-known specificity. The ROC curve is then the sensitivity as a function of fall-out. In general, if both of the probability distributions for detection and false alarm are known, the ROC curve plots the Cumulative Distribution Function (area under the probability distribution from −∞ to +∞) of the detection probability along the y-axis versus the Cumulative Distribution Function of the false alarm probability along the x-axis. ROC analysis tools select possibly optimal models and discard suboptimal ones independently from the cost context or the class distribution. ROC analysis has been used in medicine, radiology, biometrics, and other areas for many decades and is increasingly used in machine learning and data mining. For a more in-depth recitation, the reader is directed to the textbook: “Analyzing Receiver Operating Characteristic Curves With SAS”, Mithat Gonen, SAS Institute; 1st Edition (2007), ISBN-13: 978-1599942988, and to the paper: “ROC Graphs: Notes and Practical Considerations for Researchers”, Tom Fawcett, Kluwer Academic Publishers, Netherlands, (2004).
The “discrimination threshold” is used herein to determine whether the subject in the video is having an AF episode or is in sinus rhythm (SR). A Receiver Operating Characteristic (ROC) curve facilitates a determination of the discrimination threshold which separates low PHS values from high PHS values. It should be appreciated that, as the threshold is varied, the sensitivity and specificity vary.
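As a hedged sketch of selecting a discrimination threshold from an ROC curve, the code below uses scikit-learn's roc_curve and, purely as one example operating point, chooses the threshold maximizing Youden's J (sensitivity + specificity − 1); the disclosure does not mandate this criterion. Because low PHS values indicate AF, the negated PHS is used as the classifier score so that larger scores correspond to the positive (AF) class.

```python
import numpy as np
from sklearn.metrics import roc_curve

def select_phs_threshold(phs_values: np.ndarray, is_af: np.ndarray) -> float:
    """Select a PHS discrimination threshold from labeled training epochs.

    phs_values: PHS computed for each training epoch.
    is_af:      1 for epochs annotated as AF, 0 for sinus rhythm.
    Returns a PHS value below which a new epoch would be classified as AF.
    Maximizing Youden's J is only one possible operating-point choice.
    """
    fpr, tpr, thresholds = roc_curve(is_af, -phs_values)
    youden_j = tpr - fpr
    best = int(np.argmax(youden_j))
    return -float(thresholds[best])      # convert score threshold back to PHS units
```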
A “remote sensing environment” refers to the non-contact, unobtrusive, non-invasive acquisition of video images of a subject. The video imaging device can be any distance away from the subject, for example, as close as less than an inch to as far as miles in the case of telemedicine. The teachings hereof advantageously find their uses in a remote sensing environment such that the resting patient is undisturbed.
Flow Diagram of One Example Embodiment
Reference is now being made to the flow diagram of one example embodiment of the present method for determining whether a subject is in atrial fibrillation.
At step 1102, receive a video of a subject being monitored for atrial fibrillation. The video is of a region of exposed skin of a subject where a videoplethysmographic (VPG) signal can be registered by at least one imaging channel of a video imaging device used to capture that video.
At step 1104, define a size N of a batch of image frames for processing. The size is such that Nmin≦N≦Nmax, where Nmin is a minimum size of a batch of image frames and Nmax is a maximum size of a batch of image frames.
At step 1106, select a first batch of image frames of size N.
At step 1108, process the batch of image frames (of step 1106) to isolate pixels associated with the region of exposed skin.
At step 1110, process the isolated pixels to obtain a time-series signal for this batch of image frames.
At step 1112, extract a VPG signal from the time-series signal for this batch.
At step 1114, compute a power spectral density across all frequencies within the VPG signal.
Reference is now being made to the continuation of the flow diagram of the present method.
At step 1116, compare the pulse harmonic strength to a pre-determined discrimination threshold, which may be obtained using a Receiver Operating Characteristic (ROC) curve or from the patient's VPG signals obtained during AF and during sinus rhythm.
At step 1118, a determination is made whether, as a result of the comparison, the subject is in atrial fibrillation. If the subject is in atrial fibrillation then, at step 1120, an alert signal is provided. The alert may take the form of a message displayed on a display device or a sound activated at, for example, a nurse's station or on a display of a device. The alert may take the form of a colored or blinking light which provides a visible indication that an alert condition exists. The alert can be a text, audio, and/or video message. The alert signal may be communicated to one or more remote devices over a wired or wireless network. The alert may be sent directly to a handheld wireless cellular device of a medical professional. Thereafter, additional actions would be taken in response to the alert. In this embodiment, after the alert signal is initiated, further processing stops. In other embodiments, flow processing continues in a similar manner. If, at step 1118, it is determined that the subject is not in atrial fibrillation, i.e., is in normal sinus rhythm, then, in this embodiment, processing continues with respect to node B wherein, at step 1106, a next batch is selected or is otherwise identified for processing. It is to be noted that the next batch may be selected immediately following the current batch or with overlap, depending on the duration over which monitoring is required. Processing continues in a similar manner for the next batch. The method hereof is preferably used for patient monitoring where the image frames of the video are captured by the video imaging device in real-time and processed as they are received on a continuous basis or until video acquisition is terminated.
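For illustration only, the per-batch flow of steps 1102 through 1120 can be strung together as in the sketch below. The helper functions correspond to the hedged sketches given earlier in this description; frame acquisition, VPG extraction (extract_vpg), and alerting (send_alert) are hypothetical placeholders standing in for whichever acquisition, signal-separation, and notification mechanisms are actually used.

```python
import numpy as np

def monitor_for_af(frame_source, fps, discrimination_threshold,
                   threshold_for_movement, extract_vpg, send_alert):
    """Continuously process batches of image frames and alert on AF.

    frame_source: iterable yielding successive batches of N frames (step 1106).
    extract_vpg:  callable performing signal separation on the time-series
                  signal to obtain the VPG signal (step 1112).
    send_alert:   callable invoked when an AF determination is made (step 1120).
    """
    for frames in frame_source:
        frames = np.asarray(frames)
        if not batch_is_usable(frames, threshold_for_movement):
            continue                                    # discard motion-corrupted batch
        masks = np.asarray([isolate_skin_pixels(f) for f in frames])  # step 1108
        ts = batch_time_series(frames, masks)           # step 1110
        vpg = extract_vpg(ts)                           # step 1112
        freqs, psd = vpg_power_spectrum(vpg, fps)       # step 1114
        phs = pulse_harmonic_strength(freqs, psd)       # pulse harmonic strength
        if phs < discrimination_threshold:              # steps 1116 and 1118
            send_alert("Possible atrial fibrillation detected")       # step 1120
```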
It should also be appreciated that the flow diagrams depicted herein are illustrative. One or more of the operations illustrated in the flow diagrams may be performed in a differing order. Other operations may be added, modified, enhanced, or consolidated. Variations thereof are intended to fall within the scope of the appended claims.
Block Diagram of Video Processing System
Reference is now being made to the block diagram of one embodiment of a video processing system for determining whether a subject being monitored for cardiac function is in atrial fibrillation.
Batch processor 1305 receives the defined size N of a batch of image frames from the workstation 1311 and continuously processes batches of image frames of size N by isolating pixels associated with the exposed body region in the image frames and then processing the isolated pixels to obtain a time-series signal for each batch. The batch processor further extracts a VPG signal from the time-series signal. PSD Analyzer 1306 receives the VPG signal and computes a power spectral density across all frequencies within the VPG signal. PHS Calculator 1307 calculates a pulse harmonic strength for the VPG signal. Threshold Comparator 1308 compares the pulse harmonic strength to a discrimination threshold which is retrieved from the workstation 1311. As a result of this comparison, a determination is made whether the subject in the video is in atrial fibrillation or in sinus rhythm.
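In software, the module arrangement just described might be mirrored roughly as follows; the class and method names track modules 1305 through 1308 but are organizational assumptions, and the methods delegate to the illustrative helpers sketched earlier (again with extract_vpg as a stand-in for the chosen signal-separation step).

```python
class VideoProcessingSystem:
    """Organizational sketch mirroring modules 1305-1308 described above."""

    def __init__(self, fps: float, discrimination_threshold: float, extract_vpg):
        self.fps = fps
        self.discrimination_threshold = discrimination_threshold
        self.extract_vpg = extract_vpg

    def batch_processor(self, frames, masks):            # module 1305
        time_series = batch_time_series(frames, masks)
        return self.extract_vpg(time_series)

    def psd_analyzer(self, vpg):                          # module 1306
        return vpg_power_spectrum(vpg, self.fps)

    def phs_calculator(self, freqs, psd):                 # module 1307
        return pulse_harmonic_strength(freqs, psd)

    def threshold_comparator(self, phs) -> bool:          # module 1308
        return phs < self.discrimination_threshold        # True indicates AF

    def process_batch(self, frames, masks) -> bool:
        """Run one batch through modules 1305-1308 and return the AF decision."""
        vpg = self.batch_processor(frames, masks)
        freqs, psd = self.psd_analyzer(vpg)
        phs = self.phs_calculator(freqs, psd)
        return self.threshold_comparator(phs)
```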
Central Processor (CPU) 1309 retrieves machine readable program instructions from Memory 1310 and facilitates the functionality of any of the modules of the video processing system 1304. The processor 1309, operating alone or in conjunction with other processors and memory, may be configured to assist or otherwise perform the functionality of any of the processors and modules of system 1304. Processor 1309 generates a physiological signal from the various time-series signals and communicates the subject's physiological signal to the display device of workstation 1311.
A computer case of the workstation 1311 houses various components such as a motherboard with a processor and memory, a network card, a video card, a drive capable of reading/writing to machine readable media 1312 such as a floppy disk, optical disk, CD-ROM, DVD, magnetic tape, and the like, and other software and hardware needed to perform the functionality of a computer workstation. The workstation further includes a display device 1313, such as a CRT, LCD, or touchscreen device, for displaying information, video, measurement data, computed values, medical information, results, locations, and the like. A user can view any of that information and make a selection from menu options displayed thereon. Keyboard 1314 and mouse 1315 effectuate a user input or selection.
The workstation implements a database in storage device 1316 wherein patient records are stored, manipulated, and retrieved in response to a query. Such records, in various embodiments, take the form of patient medical history stored in association with information identifying the patient along with medical information. Although the database is shown as an external device, the database may be internal to the workstation mounted, for example, on a hard disk therein. It should be appreciated that the workstation has an operating system and other specialized software configured to display alphanumeric values, menus, scroll bars, dials, slideable bars, pull-down options, selectable buttons, and the like, for entering, selecting, modifying, and accepting information needed for processing image frames. The workstation is further enabled to display the image frames comprising the video.
In other embodiments, a user or technician may use the user interface of the workstation to identify areas of interest, set parameters, and select image frames and/or regions of images for processing. These selections may be stored to and retrieved from storage devices 1312 and 1316. Default settings and initial parameters can be retrieved from any of the storage devices shown, as desired. Further, a user may adjust the various parameters being employed or dynamically adjust settings in real-time as successive batches of image frames are received for processing.
Although shown as a desktop computer, it should be appreciated that the workstation can be a laptop, mainframe, or a special purpose computer such as an ASIC, circuit, or the like. The embodiment of the workstation shown herein is illustrative and should not be viewed as limiting the scope of the appended claims strictly to that configuration.
Each of the modules of the video processing system may be placed in communication with one or more remote devices over network 1317. It should be appreciated that some or all of the functionality performed by any of the modules or processing units of system 1304 can be performed, in whole or in part, by the workstation placed in communication with the video imaging device 1300 over network 1317. The embodiment shown is illustrative and should not be viewed as limiting the scope of the appended claims strictly to that configuration. Various modules may designate one or more components which may, in turn, comprise software and/or hardware designed to perform the intended function.
Performance Results
Initially, a face region of interest (ROI) was manually selected from a first image frame of a batch of image frames of a video of the subject. Motion tracking of the ROI was utilized to automatically select multiple video segments of 15 seconds during which no movement occurred or the movement was determined to be below a pre-defined threshold for movement. The image frames from the batches were processed and a VPG signal was extracted from each batch. Thereafter, the PSD was computed and the PHS was determined from the PSD. A total of 407 video segments were processed. ECG RR intervals and manual annotations for the corresponding video segments were obtained concurrently. For AF, PHS values were found to lie in a much lower range when compared to the range of PHS values for SR. The data structure was that of both clustered and correlated data, given that we obtained a random number of repeated measurements (one per 15-second epoch) on multiple candidate predictors of AF from each of 11 patients in both AF and SR periods. The ability of each candidate predictor to classify each of the 407 epochs (i.e., segments or batches) as either AF (n=143) or SR (n=264) was assessed.
Leave-one-subject-out cross-validation was used to estimate the classification error rate (CER) of each method. This was done by selecting the threshold that optimized the epoch-level CER for the 10 training subjects; computing the empirical CER for the epochs from the 1 held-out test subject using the training threshold; and computing the weighted average of the 11 subject-specific cross-validated error rates, weighted by the number of epochs per subject. Sensitivity of VPG-based PHS was compared with that of each ECG-based parameter, for each level of specificity, using nonparametric analysis of clustered ROC curve data. Separate linear mixed effect models for AF and for SR, each with a random effect for subject, were used to model each continuous predictor (VPG-based PHS and the ECG-based measures), thus providing period-specific means and standard errors (SE). Linear mixed effect models with a fixed effect for AF (versus SR) and a random effect for subject facilitated an estimation of the difference in period-specific means, along with the SE and p-value for the difference. All hypothesis tests were 2-sided 0.05 level tests.
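A hedged sketch of the leave-one-subject-out procedure described above is given below, using scikit-learn's LeaveOneGroupOut to hold out one subject at a time; the data layout (one PHS value, label, and subject identifier per epoch) is an assumption, and the threshold is chosen on each training fold by directly minimizing the epoch-level classification error, as described.

```python
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut

def threshold_minimizing_cer(phs: np.ndarray, is_af: np.ndarray) -> float:
    """Choose the PHS cutoff minimizing the epoch-level classification error
    on the training epochs (epochs with PHS <= cutoff are classified as AF)."""
    candidates = np.unique(phs)
    errors = [np.mean((phs <= c).astype(int) != is_af) for c in candidates]
    return float(candidates[int(np.argmin(errors))])

def loso_classification_error(phs_values, is_af, subject_ids) -> float:
    """Leave-one-subject-out estimate of the classification error rate (CER),
    averaged over held-out subjects and weighted by epochs per subject."""
    phs_values = np.asarray(phs_values, dtype=float)
    is_af = np.asarray(is_af, dtype=int)
    subject_ids = np.asarray(subject_ids)

    errors, weights = [], []
    for train_idx, test_idx in LeaveOneGroupOut().split(phs_values, is_af, subject_ids):
        cutoff = threshold_minimizing_cer(phs_values[train_idx], is_af[train_idx])
        predicted = (phs_values[test_idx] <= cutoff).astype(int)
        errors.append(float(np.mean(predicted != is_af[test_idx])))
        weights.append(len(test_idx))
    return float(np.average(errors, weights=weights))
```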
The two panels of the accompanying figure and the accompanying table summarize these performance results.
Based on these results, the methods disclosed herein have a potential to shift the paradigm of AF detection.
The teachings hereof can be implemented in hardware or software using any known or later developed systems, structures, devices, and/or software by those skilled in the applicable art without undue experimentation from the functional description provided herein with a general knowledge of the relevant arts. One or more aspects of the methods described herein are intended to be incorporated in an article of manufacture which may be shipped, sold, leased, or otherwise provided separately either alone or as part of a product suite or a service.
It will be appreciated that the above-disclosed and other features and functions, or alternatives thereof, may be desirably combined into other different systems or applications. Presently unforeseen or unanticipated alternatives, modifications, variations, or improvements may become apparent and/or subsequently made by those skilled in this art which are also intended to be encompassed by the following claims. The teachings of any textbooks, papers, or other publications referenced herein are each hereby incorporated herein in their entirety by reference.
Number | Name | Date | Kind |
---|---|---|---|
8520074 | Wang et al. | Aug 2013 | B2 |
8587657 | Wang et al. | Nov 2013 | B2 |
8600213 | Mestha et al. | Dec 2013 | B2 |
8617081 | Mestha et al. | Dec 2013 | B2 |
20090265671 | Sachs | Oct 2009 | A1 |
20120263357 | Xu et al. | Oct 2012 | A1 |
20120314759 | Huang | Dec 2012 | A1 |
20130077823 | Mestha et al. | Mar 2013 | A1 |
20130147959 | Wang et al. | Jun 2013 | A1 |
20130197380 | Oral | Aug 2013 | A1 |
20130215244 | Mestha et al. | Aug 2013 | A1 |
20130218028 | Mestha et al. | Aug 2013 | A1 |
20130322729 | Mestha et al. | Dec 2013 | A1 |
20130345568 | Mestha et al. | Dec 2013 | A1 |
20130345569 | Mestha et al. | Dec 2013 | A1 |
Entry |
---|
Mestha et al., “Method and Apparatus for Monitoring a Subject for Atrial Fibrillation”, U.S. Appl. No. 13/937,740, filed Jul. 9, 2013. |
Mestha et al., “System and Method for Determining Video-Based Pulse Transit Time With Time-Series Signals”, U.S. Appl. No. 14/026,739, filed Jan. 9, 2014. |
Kyal et al., “Continuous Cardiac Signal Generation From a Video of a Subject Being Monitored for Cardiac Function”, U.S. Appl. No. 13/871,766, filed Apr. 26, 2013. |
Kyal et al., “Continuous Cardiac Pulse Rate Estimation From Multi-Channel Source Video Data With Mid-Point Stitching”, U.S. Appl. No. 13/871,728, filed Apr. 26, 2013. |
Tanaka et al., “Processing Source Video for Real-Time Enhancement of a Signal of Interest”, U.S. Appl. No. 13/745,283, filed Jan. 18, 2013. |
Kyal et al., “A Method to Detect Cardiac Arrhythmias with a Webcam”, Signal Processing in Medicine and Biology Symposium (SPMB), 2013 IEEE. |
Number | Date | Country | |
---|---|---|---|
20150272456 A1 | Oct 2015 | US |