Respiration processor

Information

  • Patent Grant
  • Patent Number
    11,974,841
  • Date Filed
    Friday, February 14, 2020
  • Date Issued
    Tuesday, May 7, 2024
Abstract
Respiratory rate can be calculated from an acoustic input signal using time domain and frequency domain techniques. Confidence in the calculated respiratory rate can also be calculated using time domain and frequency domain techniques. Overall respiratory rate and confidence values can be obtained from the time and frequency domain calculations. The overall respiratory rate and confidence values can be output for presentation to a clinician.
Description
BACKGROUND

The “piezoelectric effect” is the appearance of an electric potential and current across certain faces of a crystal when it is subjected to mechanical stresses. Due to their capacity to convert mechanical deformation into an electric voltage, piezoelectric crystals have been broadly used in devices such as transducers, strain gauges and microphones. However, before the crystals can be used in many of these applications they must be rendered into a form which suits the requirements of the application. In many applications, especially those involving the conversion of acoustic waves into a corresponding electric signal, piezoelectric membranes have been used.


Piezoelectric membranes are typically manufactured from polyvinylidene fluoride plastic film. The film is endowed with piezoelectric properties by stretching the plastic while it is placed under a high poling voltage. Stretching the film polarizes it and aligns the molecular structure of the plastic. A thin layer of conductive metal (typically nickel-copper) is deposited on each side of the film to form electrode coatings to which connectors can be attached.


Piezoelectric membranes have a number of attributes that make them interesting for use in sound detection, including: a wide frequency range of about 0.001 Hz to 1 GHz; a low acoustical impedance close to that of water and human tissue; a high dielectric strength; good mechanical strength; and moisture resistance and inertness to many chemicals.


SUMMARY

Respiratory rate can be calculated from an acoustic input signal using time domain and frequency domain techniques. Confidence in the calculated respiratory rate can also be calculated using time domain and frequency domain techniques. Overall respiratory rate and confidence values can be obtained from the time and frequency domain calculations. The overall respiratory rate and confidence values can be output for presentation to a clinician.


For purposes of summarizing the disclosure, certain aspects, advantages and novel features of the inventions have been described herein. It is to be understood that not necessarily all such advantages can be achieved in accordance with any particular embodiment of the inventions disclosed herein. Thus, the inventions disclosed herein can be embodied or carried out in a manner that achieves or optimizes one advantage or group of advantages as taught herein without necessarily achieving other advantages as can be taught or suggested herein.





BRIEF DESCRIPTION OF THE DRAWINGS

Throughout the drawings, reference numbers can be re-used to indicate correspondence between referenced elements. The drawings are provided to illustrate embodiments of the inventions described herein and not to limit the scope thereof.



FIGS. 1A-B are block diagrams illustrating physiological monitoring systems in accordance with embodiments of the disclosure.



FIG. 1C illustrates an embodiment of a sensor system including a sensor assembly and a monitor cable suitable for use with any of the physiological monitors shown in FIGS. 1A-B.



FIG. 2 is a top perspective view illustrating portions of a sensor system in accordance with an embodiment of the disclosure.



FIG. 3 illustrates one embodiment of an acoustic respiratory monitoring system.



FIG. 4 illustrates one embodiment of the respiration processor of the acoustic respiratory monitoring system of FIG. 3.



FIG. 5 illustrates one method of generating a working output signal, performed by the mixer of the respiration processor of FIG. 4.



FIG. 6 illustrates one embodiment of the front end processor of the respiration processor of FIG. 4.



FIG. 7 illustrates one embodiment of a time domain processor of the front end processor of FIG. 6.



FIG. 8 illustrates one embodiment of a frequency domain processor of the front end processor of FIG. 6.



FIG. 9 illustrates an embodiment of a back end processor.



FIG. 10 illustrates an embodiment of a time domain respiratory rate processor.



FIG. 11A illustrates an embodiment of an automatic tagging module of the time domain respiratory rate processor of FIG. 10.



FIGS. 11B through 11D illustrate example time domain waveforms that can be analyzed by the time domain respiratory rate processor.



FIG. 12 illustrates an example time domain waveform corresponding to an acoustic signal derived from a patient.



FIG. 13A illustrates an example spectrogram.



FIG. 13B illustrates an embodiment of a tags confidence estimation module.



FIG. 13C illustrates an embodiment of a spectrogram and a time domain plot together on the same time scale.



FIG. 13D illustrates an embodiment of a process for computing spectral density.



FIGS. 13E through 13G illustrate example spectrums for computing centroids.



FIG. 14 illustrates an embodiment of a process for calculating respiratory rate in the frequency domain.



FIGS. 15A and 15B illustrate example frequency spectrums corresponding to an acoustic signal derived from a patient.



FIG. 16A illustrates an embodiment of an arbitrator of the back end processor of FIG. 9.



FIG. 16B illustrates another embodiment of an arbitrator.



FIG. 16C illustrates an embodiment of a processing frame for processing a long frequency transform and short frequency transforms.



FIG. 17A illustrates an embodiment of a plot depicting a respiratory rate point cloud.



FIGS. 17B and 17C illustrate embodiments of plots depicting example curve fitting for the point cloud of FIG. 17A.



FIG. 18 illustrates an embodiment of a decision logic process.





DETAILED DESCRIPTION

Various embodiments will be described hereinafter with reference to the accompanying drawings. These embodiments are illustrated and described by example only, and are not intended to be limiting.


Acoustic sensors, including piezoelectric acoustic sensors, can be used to measure breath sounds and other biological sounds of a patient. Breath sounds obtained from an acoustic sensor can be processed by a patient monitor to derive one or more physiological parameters of a patient, including respiratory rate. For purposes of illustration, this disclosure is described primarily in the context of respiratory rate. However, the features described herein can be applied to other respiratory parameters, including, for example, inspiratory time, expiratory time, inspiratory to expiratory ratio, inspiratory flow, expiratory flow, tidal volume, minute volume, apnea duration, breath sounds (including, e.g., rales, rhonchi, or stridor), changes in breath sounds, and the like. Moreover, the features described herein can also be applied to other physiological parameters and/or vital signs derived from physiological sensors other than acoustic sensors.


Referring to the drawings, FIGS. 1A through 1C illustrate an overview of example patient monitoring systems, sensors, and cables that can be used to derive a respiratory rate measurement from a patient. FIGS. 2 through 16 illustrate more detailed embodiments for deriving respiratory rate measurements. The embodiments of FIGS. 2 through 16 can be implemented at least in part using the systems and sensors described in FIGS. 1A through 1C.


System Overview

Turning to FIG. 1A, an embodiment of a physiological monitoring system 10 is shown. In the physiological monitoring system 10, a medical patient 12 is monitored using one or more sensor assemblies 13, each of which transmits a signal over a cable 15 or other communication link or medium to a physiological monitor 17. The physiological monitor 17 includes a processor 19 and, optionally, a display 11. The one or more sensors 13 include sensing elements such as, for example, acoustic piezoelectric devices, electrical ECG leads, pulse oximetry sensors, or the like. The sensors 13 can generate respective signals by measuring a physiological parameter of the patient 12. The signals are then processed by one or more processors 19.


The one or more processors 19 can communicate the processed signal to the display 11. In an embodiment, the display 11 is incorporated in the physiological monitor 17. In another embodiment, the display 11 is separate from the physiological monitor 17. In one embodiment, the monitoring system 10 is a portable monitoring system. In another embodiment, the monitoring system 10 is a pod, without a display, that is adapted to provide physiological parameter data to a display.


For clarity, a single block is used to illustrate the one or more sensors 13 shown in FIG. 1A. It should be understood that the sensor 13 shown is intended to represent one or more sensors. In an embodiment, the one or more sensors 13 include a single sensor of one of the types described below. In another embodiment, the one or more sensors 13 include at least two acoustic sensors. In still another embodiment, the one or more sensors 13 include at least two acoustic sensors and one or more ECG sensors, pulse oximetry sensors, bioimpedance sensors, capnography sensors, and the like. In each of the foregoing embodiments, additional sensors of different types are also optionally included. Other combinations of numbers and types of sensors are also suitable for use with the physiological monitoring system 10.


In some embodiments of the system shown in FIG. 1A, all of the hardware used to receive and process signals from the sensors is housed within the same housing. In other embodiments, some of the hardware used to receive and process signals is housed within a separate housing. In addition, the physiological monitor 17 of certain embodiments includes hardware, software, or both hardware and software, whether in one housing or multiple housings, used to receive and process the signals transmitted by the sensors 13.


As shown in FIG. 1B, the acoustic sensor assembly 13 can include a cable 25. The cable 25 can include three conductors within an electrical shielding. One conductor 26 can provide power to a physiological monitor 17, one conductor 28 can provide a ground signal to the physiological monitor 17, and one conductor 28 can transmit signals from the sensor 13 to the physiological monitor 17. For multiple sensors 13, one or more cables 25 can be provided.


In some embodiments, the ground signal is an earth ground, but in other embodiments, the ground signal is a patient ground, sometimes referred to as a patient reference, a patient reference signal, a return, or a patient return. In some embodiments, the cable 25 carries two conductors within an electrical shielding layer, and the shielding layer acts as the ground conductor. Electrical interfaces 23 in the cable 25 can enable the cable to electrically connect to electrical interfaces 21 in a connector 20 of the physiological monitor 17. In another embodiment, the sensor assembly 13 and the physiological monitor 17 communicate wirelessly.



FIG. 1C illustrates an embodiment of a sensor system 100 including a sensor assembly 101 and a monitor cable 111 suitable for use with any of the physiological monitors shown in FIGS. 1A and 1B. The sensor assembly 101 includes a sensor 115, a cable assembly 117, and a connector 105. The sensor 115, in one embodiment, includes a sensor subassembly 102 and an attachment subassembly 104. The cable assembly 117 of one embodiment includes a sensor cable 107 and a patient anchor 103. A sensor connector subassembly 105 is connected to the sensor cable 107.


The sensor connector subassembly 105 can be removably attached to an instrument cable 111 via an instrument cable connector 109. The instrument cable 111 can be attached to a cable hub 120, which includes a port 121 for receiving a connector 112 of the instrument cable 111 and a second port 123 for receiving another cable. In certain embodiments, the second port 123 can receive a cable connected to a pulse oximetry or other sensor. In addition, the cable hub 120 could include additional ports in other embodiments for receiving additional cables. The hub includes a cable 122 which terminates in a connector 124 adapted to connect to a physiological monitor (not shown).


The sensor connector subassembly 105 and connector 109 can be configured to allow the sensor connector 105 to be straightforwardly and efficiently joined with and detached from the connector 109. Embodiments of connectors having connection mechanisms that can be used for the connectors 105, 109 are described in U.S. patent application Ser. No. 12/248,856 (hereinafter referred to as “the '856 Application”), filed on Oct. 9, 2008, which is incorporated in its entirety by reference herein. For example, the sensor connector 105 could include a mating feature (not shown) which mates with a corresponding feature (not shown) on the connector 109. The mating feature can include a protrusion which engages in a snap fit with a recess on the connector 109. In certain embodiments, the sensor connector 105 can be detached with a one-handed operation, for example. Examples of connection mechanisms can be found specifically in paragraphs [0042], [0050], [0051], [0061]-[0068] and [0079], and with respect to FIGS. 8A-F, 13A-E, 19A-F, 23A-D and 24A-C of the '856 Application, for example.


The sensor connector subassembly 105 and connector 109 can reduce the amount of unshielded area in and generally provide enhanced shielding of the electrical connection between the sensor and monitor in certain embodiments. Examples of such shielding mechanisms are disclosed in the '856 Application in paragraphs [0043]-[0053], [0060] and with respect to FIGS. 9A-C, 11A-E, 13A-E, 14A-B, 15A-C, and 16A-E, for example.


In an embodiment, the acoustic sensor assembly 101 includes a sensing element, such as, for example, a piezoelectric device or other acoustic sensing device. The sensing element can generate a voltage that is responsive to vibrations generated by the patient, and the sensor can include circuitry to transmit the voltage generated by the sensing element to a processor for processing. In an embodiment, the acoustic sensor assembly 101 includes circuitry for detecting and transmitting information related to biological sounds to a physiological monitor. These biological sounds can include heart, breathing, and/or digestive system sounds, in addition to many other physiological phenomena.


The acoustic sensor 115 in certain embodiments is a biological sound sensor, such as the sensors described herein. In some embodiments, the biological sound sensor is one of the sensors such as those described in the '883 Application. In other embodiments, the acoustic sensor 115 is a biological sound sensor such as those described in U.S. Pat. No. 6,661,161, which is incorporated by reference herein in its entirety. Other embodiments include other suitable acoustic sensors.


The attachment sub-assembly 104 includes first and second elongate portions 106, 108. The first and second elongate portions 106, 108 can include patient adhesive (e.g., in some embodiments, tape, glue, a suction device, etc.). The adhesive on the elongate portions 106, 108 can be used to secure the sensor subassembly 102 to a patient's skin. One or more elongate members 110 included in the first and/or second elongate portions 106, 108 can beneficially bias the sensor subassembly 102 in tension against the patient's skin and reduce stress on the connection between the patient adhesive and the skin. A removable backing can be provided with the patient adhesive to protect the adhesive surface prior to affixing to a patient's skin.


The sensor cable 107 can be electrically coupled to the sensor subassembly 102 via a printed circuit board (“PCB”) (not shown) in the sensor subassembly 102. Through this contact, electrical signals are communicated from the multi-parameter sensor subassembly to the physiological monitor through the sensor cable 107 and the cable 111.


In various embodiments, not all of the components illustrated in FIG. 1C are included in the sensor system 100. For example, in various embodiments, one or more of the patient anchor 103 and the attachment subassembly 104 are not included. In one embodiment, for example, a bandage or tape is used instead of the attachment subassembly 104 to attach the sensor subassembly 102 to the measurement site. Moreover, such bandages or tapes can be a variety of different shapes including generally elongate, circular and oval, for example. In addition, the cable hub 120 need not be included in certain embodiments. For example, multiple cables from different sensors could connect to a monitor directly without using the cable hub 120.


Additional information relating to acoustic sensors compatible with embodiments described herein, including other embodiments of interfaces with the physiological monitor, are included in U.S. patent application Ser. No. 12/044,883, filed Mar. 7, 2008, entitled “Systems and Methods for Determining a Physiological Condition Using an Acoustic Monitor,” (hereinafter referred to as “the '883 Application”), the disclosure of which is hereby incorporated by reference in its entirety. An example of an acoustic sensor that can be used with the embodiments described herein is disclosed in U.S. patent application Ser. No. 12/643,939, filed Dec. 21, 2009, titled “Acoustic Sensor Assembly,” the disclosure of which is hereby incorporated by reference in its entirety.



FIG. 2 depicts an embodiment of an acoustic signal processing system 200. The acoustic signal processing system 200 can be implemented by either of the physiological monitors described above. In the acoustic signal processing system 200, a sensor 201 transmits a physiological signal to a filter 202. In one embodiment, the filter 202 is a high pass filter. The high pass filter 202 can allow high frequency components of the voltage signal above a certain predetermined cutoff frequency to be transmitted while attenuating low frequency components below the cutoff frequency. It is desirable to attenuate certain low frequency signals in certain embodiments because such signals can saturate amplifiers in the gain bank 220.


Other types of filters may be included in the filter 202. For example, the filter 202 may include a low pass filter that attenuates high frequency signals. It may be desirable to reject high frequency signals because such signals often include noise. In certain embodiments, the filter 202 includes both a low pass filter and a high pass filter. Alternatively, the filter 202 may include a band-pass filter that simultaneously attenuates both low and high frequencies.


The output from the filter 202 is split in certain embodiments into two channels, for example, first and second channels 203, 205. In other embodiments, more than two channels are used. For example, in some embodiments three or more channels are used. The filter 202 provides an output on both first and second channels 203, 205 to a gain bank 220. The gain bank 220 can include one or more gain stages. In the depicted embodiment, there are two gain stages 204, 206. A high gain stage 204 amplifies the output signal from the filter 202 to a relatively higher level than the low gain stage 206. For example, the low gain stage 206 may not amplify the signal or can attenuate the signal.


The amplified signal at both first and second channels 203, 205 can be provided to an analog-to-digital converter (ADC) 230. The ADC 230 can have two input channels to receive the separate output of both the high gain stage 204 and the low gain stage 206. The ADC 230 can sample and convert analog voltage signals into digital signals. The digital signals can then be provided to the DSP 208 and thereafter to the display 210. In some embodiments, a separate sampling module samples the analog voltage signal and sends the sampled signal to the ADC 230 for conversion to digital form. In some embodiments, two (or more) ADCs 230 may be used in place of the single ADC 230 shown.


Acoustic Respiratory Monitoring System


FIG. 3 illustrates one embodiment of an acoustic respiratory monitoring system 300 compatible with the sensors, systems, and methods described herein. The acoustic respiratory monitoring system 300 includes an acoustic sensor 302 that communicates with a physiological monitor 304. The sensor 302 provides a sensor signal to the physiological monitor 304. In one embodiment, the sensor signal is provided over three signal lines, which include a high gain channel 306, a low gain channel 308, and a ground line 310. The physiological monitor 304 includes a data collection processor 312, respiration processor 314, and an optional display 316. The high and low gain channels 306, 308 can include any of the functionality of the high and low gain channels described above.


The acoustic sensor 302 is configured to detect sounds emanating from within a medical patient. For example, the acoustic sensor 302 is often configured to detect the sound of a medical patient's respiration (including inhalation and exhalation), as well as other biological sounds (e.g., swallowing, speaking, coughing, choking, sighing, throat clearing, snoring, wheezing, chewing, humming, etc.). The acoustic sensor 302 can include any of the sensors described herein.


The acoustic sensor 302 may include pre-amplification circuitry (not shown) to generate an output signal, or signals, in response to detected sounds. In some cases, high amplitude (e.g., loud) detected sounds can cause the sensor to saturate. For example, the high amplitude input signal could cause a pre-amplifier having a predetermined, fixed gain setting to saturate. In such cases, meaningful acoustic information detected by the sensor 302 simultaneously with the high amplitude sounds could be lost. To avoid losing such information, the acoustic sensor 302 can include a second pre-amplifier that has a lower, predetermined, fixed gain setting. The sensor 302 provides both high and low gain outputs over separate channels 306, 308 to the physiological monitor. As will be discussed later, the physiological monitor 304 can be configured to switch its input stream between the high and low gain channels 306, 308 depending upon whether a sensor saturation event has occurred, or appears likely to occur.


The data collection processor 312 pre-processes the signals received from the sensor 302 in preparation for further analysis. In one embodiment, the data collection processor 312 includes one or more of a gain amplifier 318, an analog-to-digital converter 320, a decimation module 322, a filtering module 324, and a storage module 326.


A gain amplifier 318 is configured to increase or decrease the gain on the incoming data signal. An analog-to-digital converter 320 is configured to convert the incoming analog data signal into a digital signal. A decimation module 322 is configured to reduce the quantity of data associated with the digitized analog input signal.


For example, the sensor 302 can provide a continuous analog signal to the analog-to-digital converter 320. The analog-to-digital converter 320 outputs a digital data stream having a relatively high data rate. For example, in one embodiment, the analog-to-digital converter 320 provides a digital signal at 48 kHz. The decimation module 322 receives the high data rate data and can reduce the data rate to a lower level. For example, in one embodiment the decimation module 322 reduces the data rate to 4 kHz or 2 kHz. Reducing the data rate advantageously reduces the processing power required by the physiological monitor, in certain embodiments, as the high bandwidth data from the analog-to-digital converter 320 may not be necessary or useful for further respiratory processing. The decimation module 322 can also reduce and/or eliminate unwanted noise from the digitized respiratory signal. A filtering module 324 provides further signal filtering and/or cleaning, and a storage module 326 stores the filtered, digitized input respiration signal for further processing.
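For illustration only, the following Python sketch shows this kind of rate reduction, assuming a 48 kHz input and a 4 kHz target rate; the function name and parameters are hypothetical placeholders and not the disclosed decimation module 322.

```python
import numpy as np
from scipy import signal

def decimate_acoustic(samples_48k, in_rate=48000, out_rate=4000):
    """Reduce the data rate of a digitized acoustic signal.

    Illustrative sketch: an anti-aliasing FIR low-pass filter is applied
    before the sample rate is reduced, which also suppresses out-of-band
    noise in the digitized respiratory signal.
    """
    factor = in_rate // out_rate              # e.g., 48 kHz -> 4 kHz gives 12
    return signal.decimate(samples_48k, factor, ftype="fir", zero_phase=True)

# Example: one second of a synthetic 48 kHz signal reduced to 4 kHz.
t = np.arange(48000) / 48000.0
x = np.sin(2 * np.pi * 440 * t)               # placeholder acoustic content
y = decimate_acoustic(x)
print(len(x), "->", len(y))                    # 48000 -> 4000
```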


A respiration processor 314 analyzes the digitized input respiration signal to determine various physiological parameters and/or conditions. For example, the respiration processor 314 is configured to determine one or more of the respiration rate of the patient, inspiration time, expiration time, apnea events, breathing obstructions, choking, talking, coughing, etc. The respiration processor 314 is further configured to determine a confidence value, or quality value, associated with each physiological parameter or condition calculation. For example, for each sample of time for which the respiration processor 314 calculates a patient's respiration rate, the respiration processor 314 will also determine a confidence value (or quality value) associated with such calculation. Both the calculated physiological parameter value and the confidence associated with such calculation are displayed on a display 316.


Displaying both calculated parameters and confidence values advantageously helps a clinician to exercise her own clinical judgment in evaluating the patient's physiological condition. For example, instead of merely showing “error” or “no available signal” during physiological events that affect respiration detection, the physiological monitor 304 can provide the clinician with its best estimation of the physiological parameter while indicating that some other event has occurred that impacts the monitor's 304 confidence in its own calculations.


In addition, as data passes through the monitor 304, the monitor's various processing modules can appropriately process the data based upon the confidence level associated with each data point. For example, if data is identified as having a low confidence value, the monitor 304 can cause that data to have less or no influence over a physiological parameter calculation.


Identifying data quality can also advantageously allow the monitor 304 to process a continuous stream of data. For example, instead of stopping and starting data flow, or skipping “bad data” samples, the monitor 304 can continuously maintain and process the input data stream. This feature can allow implementation of much more robust, efficient, and less complex processing systems. For example, a finite impulse response filter (FIR filter) (or other filter) can break down, and large data portions may be lost, if it is used to process discontinuous or otherwise erroneous data. Alternatively, providing a data confidence or quality flag allows a processor employing an FIR filter to take further action to avoid such problems. For example, in one embodiment, a low confidence or quality flag can cause a processor to substitute a measured data point value with an interpolated value falling between the two closest high confidence data points surrounding the measured data point.
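A minimal sketch of such a substitution, assuming a boolean confidence flag per sample; the names and the choice of linear interpolation are illustrative assumptions, not the disclosed implementation.

```python
import numpy as np

def substitute_low_confidence(samples, high_confidence):
    """Replace low-confidence samples with values interpolated between the
    nearest surrounding high-confidence samples, so downstream filters
    (e.g., an FIR filter) see a continuous data stream.

    samples:          1-D array of measured values
    high_confidence:  boolean array, True where the sample is trusted
    """
    samples = np.asarray(samples, dtype=float)
    good = np.asarray(high_confidence, dtype=bool)
    idx = np.arange(len(samples))
    out = samples.copy()
    # np.interp linearly interpolates between the nearest good samples.
    out[~good] = np.interp(idx[~good], idx[good], samples[good])
    return out

# Example: the third sample is flagged low confidence and gets interpolated.
x = np.array([1.0, 2.0, 9.0, 4.0, 5.0])
flags = np.array([True, True, False, True, True])
print(substitute_low_confidence(x, flags))   # [1. 2. 3. 4. 5.]
```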


Respiration Processor


FIG. 4 illustrates one embodiment of a respiration processor 314 compatible with any of the acoustic respiratory monitoring systems described herein. The respiration processor 314 includes a mixer 400, a front end processor 402, and a back end processor 404.


The mixer 400 obtains high and low gain data from the data collection processor 312. Because the high and low gain channels of the acoustic sensor 302 can have known, fixed values, the mixer 400 is able to determine when the high gain channel 306 will clip. If data obtained from the high gain channel 306 exceeds a clipping threshold, the mixer can change the data input stream from the high gain channel to the low gain channel 308. The mixer 400 effectively optimizes, in certain embodiments, the two incoming data signals to generate a working output signal for further processing. In certain embodiments, the mixer 400 generates an optimized working output signal having a desired dynamic range by avoiding use of clipped (e.g., pre-amplifier saturated) data points.


Although not shown, a compression module can also be provided as an output from the mixer. The compression module can compress, down sample, and/or decimate the working output signal to produce a reduced bit-rate acoustic signal. The compression module can provide this reduced bit-rate acoustic signal to a database for data storage or for later analysis. In addition, the compression module can transmit the reduced bit-rate acoustic signal over a network to a remote device, such as a computer, cell phone, PDA, or the like. Supplying a reduced bit-rate acoustic signal can enable real time streaming of the acoustic signal over the network.


Generating a Working Signal

One embodiment of a method 420 of generating a working output signal is illustrated in FIG. 5. The mixer 400 of FIG. 4 is configured to execute the illustrated method 420. At block 422, the method 420 obtains data samples from high and low gain data channels. At block 424, the method 420 determines whether the high gain channel sample's amplitude exceeds a high gain clipping threshold. If not, the method 420 proceeds to block 426.


At block 426, the method 420 selects the high gain channel sample. At block 428, the method 420 sets a “valid” flag to “on.” The “valid” flag is one form of confidence or signal quality indicator, discussed above. Setting the “valid” flag to “on” indicates that the associated data sample may be a valid clinical data sample (for example, the measured value did not exceed a predetermined signal clipping threshold). The method 420 proceeds to block 430, where the method 420 stores both the selected sample and the flag values. The method 420 returns to block 422, where the next pair of samples is obtained from the high and low gain channels.


If at block 424 the method 420 determines that the high gain channel sample amplitude exceeds a high gain clipping threshold, the method 420 proceeds to block 432. At block 432, the method 420 selects the low gain channel sample. At block 434, the method 420 determines whether the low gain channel sample exceeds a low gain clipping threshold. If not, the method 420 proceeds to block 428, as described above.


If the method 420 determines that the low gain channel sample amplitude does exceed a low gain clipping threshold, the method proceeds to block 436. At block 436, the method 420 sets the “valid” flag to “off.” Setting the “valid” flag to “off” indicates that the data sample may not be a valid clinical data sample. The method 420 then proceeds to block 430, as described above. The stored values (referred to as the “working” and “valid” signals in FIGS. 6-8) are further processed by a front end processor of a respiration processor, for example, as described below.
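The following sketch illustrates the per-sample selection that the method 420 describes. The clipping threshold values are hypothetical placeholders, and any scaling of low gain samples to match the high gain channel is omitted for brevity.

```python
def mix_channels(high_gain, low_gain, high_clip=0.95, low_clip=0.95):
    """Build a working signal and per-sample "valid" flags from paired
    high- and low-gain samples (illustrative sketch; thresholds are
    hypothetical placeholders, not values from the disclosure).

    For each pair: use the high-gain sample unless it exceeds the high-gain
    clipping threshold; otherwise fall back to the low-gain sample. Mark the
    sample invalid only if the low-gain sample is also clipped.
    """
    working, valid = [], []
    for hi, lo in zip(high_gain, low_gain):
        if abs(hi) <= high_clip:
            working.append(hi)
            valid.append(True)
        else:
            working.append(lo)
            valid.append(abs(lo) <= low_clip)
    return working, valid

# Example: the second high-gain sample clips, so the low-gain sample is used.
w, v = mix_channels([0.2, 0.99, 0.3], [0.02, 0.10, 0.03])
print(w)   # [0.2, 0.1, 0.3]
print(v)   # [True, True, True]
```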


Front End Processor


FIG. 6 illustrates one embodiment of the front end processor 402 of FIG. 4. The front end processor 402 includes an envelope processor 450 and an exceptions processor 452. The envelope processor 450 includes a time domain processor 454 and a frequency domain processor 456. The exceptions processor includes a spectral envelope processor 458, a probe-off processor 460, and an interference processor 462.


The envelope processor 450 receives the working and valid signals from the mixer 400, discussed above. The envelope processor 450 processes these signals to generate a time domain envelope signal, a time domain valid signal, a noise floor, a frequency domain envelope signal, and a frequency domain valid signal. The exceptions processor 452 receives the working signal from the mixer 400 and the noise floor signal from the envelope processor 450 to generate a spectral envelope signal, a probe-off signal, and an interference signal. The time domain processor 454 and frequency domain processor 456 are described in further detail with respect to FIGS. 7 and 8, below.


By analyzing a working signal in both the time and frequency domains, in certain embodiments, the acoustic respiratory monitoring system 300 provides substantial, clinically significant advantages. To improve accuracy, the acoustic respiratory monitoring system 300 can seek to remove noise content from measured or sensed acoustic signals. However, noise can arise from multiple sources, and simple, single stage, or single engine processing techniques often provide inadequate results. For example, some noise occurs as a spike in signal amplitude. A slamming door, cough, or alarm could result in a noise spike in an acoustic sensor's signal amplitude.


Other noise occurs in a more continuous nature. Such noise may not cause the sensor signal to spike in amplitude. For example, the low-frequency humming of a motor, light ballast, speech, or other vibration, could result in such continuous, relatively constant frequency noise signals. A time domain processor can be configured to effectively and efficiently remove amplitude-related noise content, while a frequency domain processor can be configured to effectively and efficiently remove frequency-related noise content. Employing both time and frequency processors can yield even further improved results.


In one embodiment, the spectral envelope processor 458 generates a spectrogram of the working signal. For example, each data point in the spectrogram signal can correspond to the highest frequency present in the working signal at that data point. Each data point in the working signal can correspond to the amplitude of the signal measured by the sensor 302 at a particular point in time. The spectral envelope provides the highest frequency present in the working signal at the same particular point in time.


The probe-off processor 460 determines whether the acoustic sensor 302 is properly attached to a measurement site on a patient. For example, if the working signal amplitude is below a certain threshold, the probe-off processor 460 determines that the acoustic sensor 302 is not properly attached to the patient's skin.


The interference processor 462 indicates whether an interference event has occurred. For example, the interference processor 462 indicates whether a noise event or disruption has been detected. Such events can be identified, for example, when a loud instrument or device (e.g., a pump, generator, fan, ringing telephone, overhead page, television set, radio) has been turned on or activated near the medical patient and/or when the patient herself has generated an interfering, non-clinically relevant acoustical signal (e.g., sneezing, coughing, talking, humming, etc.).


Time Domain Processor


FIG. 7 illustrates one embodiment of a time domain processor 454 compatible with the front end processor 402, discussed above. The time domain processor 454 includes a filter 482, a time domain envelope processor 484, and a noise floor processor 486. The filter 482 receives working and valid signals from the mixer 400, as discussed above. The time domain envelope processor 484 receives filtered working and valid signals from the filter 482. The time domain envelope processor 484 generates a time domain envelope signal 494 and a time domain valid signal 496 based upon the filtered working and valid signals. The noise floor processor 486 provides a noise floor signal 498 in response to the filtered working signal received from the filter 482.


The time domain processor 454 includes one or more envelope generators. For example, the time domain processor 454 can include one or more of a demodulation and decimation processor 488, a smoothing processor 490, and/or any other envelope generator (e.g., a Hilbert transformation processor, a power method envelope processor, etc.). The envelope processor 484 can also include a median filter 492. In one embodiment, the demodulation and decimation processor 488 includes the median filter 492.


In one embodiment, the demodulation and decimation processor 488 squares the input data, multiplies it by a factor of two, sends it through a filter (such as a low pass filter, a finite impulse response (FIR) filter, or an infinite impulse response (IIR) filter), buffers the output, provides the buffered output to a median filter 492, and then takes the square root, to derive an envelope signal. The median filter 492 picks out only the middle value of sorted, buffered data. For example, the median filter 492 first sorts the data and then selects the middle value. In one embodiment, the demodulation and decimation processor 488 buffers seven data points (or some other number of points) that are output from an FIR filter (as discussed above). The median filter 492 sorts the data (either high to low, or low to high), and then selects the value of the fourth data point (e.g., the middle of the seven data points) as its output value.


As subsequent data is analyzed by the demodulation and decimation processor 488, the buffered data is shifted by one sample, and the median filter 492 again selects the middle value of the sorted, buffered data. The median filter 492 is sometimes referred to as a running median filter.
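As an illustration of the steps named above (square, scale by two, low-pass filter, running median, square root), a simplified Python sketch follows. The sample rate, filter design, and window width are assumptions, and the decimation step is omitted for brevity; this is not the disclosed implementation.

```python
import numpy as np
from scipy import signal

def running_median(x, width=7):
    """Running (sliding) median: sort each window of `width` buffered points
    and keep the middle value, as described for the median filter."""
    return signal.medfilt(x, kernel_size=width)

def td_envelope(working, fs=2000, cutoff_hz=10.0, median_width=7):
    """Illustrative time-domain envelope following the steps named in the
    text: square, scale by two, low-pass filter, running median, square root.
    The sample rate, cutoff, and window width are hypothetical placeholders.
    """
    demod = 2.0 * np.square(working)                      # square and scale
    b = signal.firwin(numtaps=101, cutoff=cutoff_hz, fs=fs)
    smoothed = signal.lfilter(b, [1.0], demod)            # FIR low-pass
    med = running_median(smoothed, median_width)          # suppress glitches
    return np.sqrt(np.clip(med, 0.0, None))               # back to amplitude

# Example: envelope of a 1 Hz "breath" amplitude-modulating a 100 Hz tone.
fs = 2000
t = np.arange(10 * fs) / fs
x = (1 + np.sin(2 * np.pi * 1.0 * t)) * np.sin(2 * np.pi * 100 * t)
env = td_envelope(x, fs=fs)
```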


The smoothing processor 490 employs another technique to derive a time domain envelope signal. For example, in one embodiment, the smoothing processor 490 processes buffered input data by determining a standard deviation, multiplying the output by three, squaring the output, and then sending the squared output to a filter, such as a low pass filter, FIR filter, or IIR filter.


The noise floor processor 486 determines the noise floor of its input signal. For example, the noise floor processor 486 can be configured to select a value below which or above which the input signal will be considered to be merely noise, or to have a substantial noise content.


Frequency Domain Processor


FIG. 8 illustrates one embodiment of a frequency domain processor 456 compatible with the front end processor 402, discussed above. The frequency domain processor 456 includes a frequency domain envelope processor 500, which includes a median filter 502. The frequency domain processor 456 receives working and valid signals from the mixer 400, as described above, and generates a frequency domain envelope signal and a frequency domain valid signal based on the working and valid signals.


In one embodiment, the frequency domain envelope processor 500 transforms the working signal into the frequency domain by utilizing a Fourier transform (e.g., fast Fourier transform (FFT), discrete Fourier transform (DFT), etc.). The frequency domain processor 456 can then identify an envelope of the output of the Fourier transform. For example, the frequency domain processor 456 can square an absolute value of the Fourier transform output. In some embodiments, the frequency domain processor 456 further processes the output with sub matrix, mean, log, and gain operations. Additionally, in some embodiments, the output is buffered and provided to a median filter 502, which can operate under the same principles as the median filter 492 described above with respect to FIG. 7. The output of the median filter 502 can be provided as a frequency domain envelope signal 504.


Although not shown, in certain embodiments, the frequency domain processor 456 can also calculate spectral information, such as a power spectrum, spectral density, power spectral density, or the like. In one embodiment, the spectral information X can be calculated as follows:

X=20 log(FFT(x))  (1)

where x in equation (1) represents the working signal input to the envelope processor 450 (or a processed form of the working signal). The spectral information can be calculated for each block or frame of samples received by the envelope processor 450. The spectral information can be calculated using other techniques than those described herein. For example, the log(FFT(x)) may be multiplied by 10 instead of 20, or the power spectral density may be calculated from the Fourier transform of the autocorrelation of the input signal, or the like.


The spectral information from multiple blocks or frames of samples can be combined to form a representation of changes in the spectral information over time. If this combined spectral information were output on a display, the combined spectral information could be considered a spectrogram, a waterfall diagram, or the like. An example of such a spectrogram is described below with respect to FIG. 13B.
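A sketch of how equation (1) could be evaluated per block and stacked over time into a spectrogram follows. The frame length, hop size, and the use of the FFT magnitude are assumptions for illustration, not values from the disclosure.

```python
import numpy as np

def spectral_information(frame):
    """Equation (1): X = 20*log10(|FFT(x)|) for one block of working-signal
    samples (the magnitude is taken before the log; a small constant avoids
    taking the log of zero). Illustrative sketch only."""
    spectrum = np.abs(np.fft.rfft(frame))
    return 20.0 * np.log10(spectrum + 1e-12)

def spectrogram(working, frame_len=512, hop=256):
    """Stack the per-frame spectral information over time. Frame length and
    hop size are hypothetical placeholders, not values from the disclosure."""
    frames = [working[i:i + frame_len]
              for i in range(0, len(working) - frame_len + 1, hop)]
    return np.array([spectral_information(f) for f in frames])

# Example: rows are time frames, columns are frequency bins.
x = np.random.randn(4096)
S = spectrogram(x)
print(S.shape)   # (15, 257)
```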


Back End Processor


FIG. 9 illustrates a more detailed embodiment of the back end processor 404 described above with respect to FIG. 4. The back end processor 404 can be implemented in hardware and/or software. Advantageously, in certain embodiments, the back end processor 404 calculates respiratory rate in the time domain and in the frequency domain. The back end processor 404 can use various arbitration techniques to select and/or to combine the respiratory rate values obtained from the time and frequency domains.


In the depicted embodiment, the back end processor 404 receives a time domain (TD) envelope, a time domain (TD) valid signal, a frequency domain (FD) envelope, a frequency domain (FD) valid signal, a spectral envelope, noise floor information, and interference information. These inputs can be generated using any of the systems and processes described above. Not all of these inputs need be used or received by the back end processor 404 in certain implementations.


The back end processor 404 includes a time domain respiratory rate (TD RR) processor 910, a frequency domain respiratory rate (FD RR) processor 920, and an arbitrator 930. The TD RR processor 910 can use time domain techniques to derive respiratory rate values from various ones of the inputs received by the back end processor 404. The TD RR processor 910 can also calculate confidence values reflecting confidence in the calculated respiratory rate values. The FD RR processor 920 can use frequency domain techniques to derive respiratory rate values from various ones of the inputs received by the back end processor 404. The FD RR processor 920 can also calculate confidence values reflecting confidence in the calculated respiratory rate values. In some embodiments, only time domain techniques or only frequency domain techniques are used by the back end processor 404 to calculate respiratory rate.


The TD RR processor 910 and the FD RR processor 920 can provide respiratory rate outputs and confidence values to the arbitrator 930. In certain embodiments, the arbitrator 930 can evaluate the time domain and frequency domain respiratory rate calculations as well as the confidence values to select a respiratory rate value to output. Based on the confidence calculations, the arbitrator 930 can output the respiratory rate values derived from the time domain or from the frequency domain. The arbitrator 930 can also combine the respiratory rate values derived from the different domains, for example, by averaging the values together.
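One simple way such selection or combination could look is sketched below with a hypothetical confidence margin; this is not the disclosed arbitration logic, only an illustration of selecting or blending by confidence.

```python
def arbitrate(rr_td, conf_td, rr_fd, conf_fd, combine_margin=0.1):
    """Illustrative arbitration between time-domain and frequency-domain
    respiratory rates. If one engine is clearly more confident, its rate is
    selected; otherwise the two rates are combined as a confidence-weighted
    average. The margin value is a hypothetical placeholder.
    """
    if conf_td >= conf_fd + combine_margin:
        return rr_td, conf_td
    if conf_fd >= conf_td + combine_margin:
        return rr_fd, conf_fd
    total = conf_td + conf_fd
    if total == 0:
        return 0.0, 0.0
    rr = (rr_td * conf_td + rr_fd * conf_fd) / total
    return rr, max(conf_td, conf_fd)

# Example: similar confidences, so the two rates are blended.
print(arbitrate(14.0, 0.80, 16.0, 0.75))   # (~14.97, 0.8)
```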


For ease of illustration, this specification describes the respiration processor 314 as having time domain and frequency domain processors or engines. However, in other embodiments, the respiration processor 314 can include only a time domain engine, only a frequency domain engine, or additional engines in combination with either of the time domain and frequency domain engines or with both. The arbitrator 930 can calculate an overall respiratory rate based on the outputs of any of the engines, including a combination of the engines. The arbitrator 930 can weight, select, or otherwise choose outputs from the different engines. The weightings for each engine can be learned and adapted (e.g., using an adaptive algorithm) over time. Other configurations are possible.


One example of an additional engine that can calculate respiratory rate is a wavelet-based engine, which can be used to perform both time and frequency domain analysis. In one embodiment, the respiration processor 314 includes the wavelet analysis features described in U.S. application Ser. No. 11/547,570, filed Jun. 19, 2007, titled “Non-invasive Monitoring of Respiratory Rate, Heart Rate and Apnea,” the disclosure of which is hereby incorporated by reference in its entirety.


Another example of an engine that can be used to calculate respiratory rate is an engine based on a Kalman filter. The Kalman filter can be used to predict a state of a system, such as the human body, based on state variables. The state of the human body system can depend on certain state parameters, such as hemodynamic parameters (e.g., heart rate, oxygen saturation, respiratory rate, blood pressure, and the like). A Kalman filter model can be developed that interrelates these parameters. As part of this model, one or more measurement equations can be established for measuring the various state parameters, including respiratory rate.


The arbitrator 930 can also calculate a freshness or relevance of the respiratory rate measurement. In some situations, the confidence output by the arbitrator 930 may be low, and the corresponding respiratory rate output value may be invalid or corrupted. One option when this happens is to discard the invalid or low quality respiratory rate value and not display a new respiratory rate measurement until a valid or higher quality value is obtained. However, not displaying a respiratory rate value (or showing zero) can confuse a clinician if the patient is still breathing. Likewise, cycling between zero and a valid respiratory rate value can be annoying or confusing for a clinician.


Thus, instead of displaying no (or zero) value in low quality signal conditions, the arbitrator 930 can continue to output the previous respiratory rate measurement for a certain amount of time. The arbitrator 930 can include a counter, timer, or the like that increments as soon as a low or invalid signal condition is detected. When the arbitrator 930 determines that a certain amount of time has elapsed, the arbitrator 930 can output a zero or no respiratory rate value. Conversely, if while the arbitrator 930 is incrementing the counter a valid or higher quality respiratory rate measurement is produced, the arbitrator 930 can reset the counter.


The amount of time elapsed before the arbitrator 930 outputs a zero or no respiratory rate value can be user-configurable. The arbitrator 930 can output the elapsed time that the respiratory rate measurement has had low or no confidence. The arbitrator 930 could also (or instead) output an indicator, such as a light, changing bar, number, an audible alarm, and/or the like, while the arbitrator 930 is incrementing the counter. The indicator can be deactivated once valid or higher confidence respiratory rate values are calculated.
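A sketch of this hold-then-zero behavior follows, with a hypothetical hold time and confidence threshold; the class and parameter names are illustrative assumptions.

```python
import time

class RateFreshness:
    """Illustrative hold-last-value logic: keep reporting the previous
    respiratory rate during low-confidence periods, and report zero only
    after a (user-configurable) hold time has elapsed."""

    def __init__(self, hold_seconds=30.0, confidence_threshold=0.5):
        self.hold_seconds = hold_seconds          # hypothetical default
        self.confidence_threshold = confidence_threshold
        self.last_rate = 0.0
        self.low_conf_since = None                # None means signal is good

    def update(self, rate, confidence, now=None):
        now = time.monotonic() if now is None else now
        if confidence >= self.confidence_threshold:
            self.last_rate = rate                 # fresh, valid measurement
            self.low_conf_since = None            # reset the timer
            return rate
        if self.low_conf_since is None:
            self.low_conf_since = now             # start counting
        if now - self.low_conf_since < self.hold_seconds:
            return self.last_rate                 # hold the previous value
        return 0.0                                # held too long: report zero

# Example: a low-confidence reading shortly after a good one holds at 15 BPM.
f = RateFreshness()
print(f.update(15.0, 0.9, now=0.0))   # 15.0
print(f.update(22.0, 0.1, now=5.0))   # 15.0 (held)
print(f.update(22.0, 0.1, now=40.0))  # 0.0 (hold time exceeded)
```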


Time Domain RR Calculation


FIG. 10 illustrates an embodiment of a time domain respiratory rate (TD RR) processor 1000. The TD RR processor 1000 is an example implementation of the TD RR processor 910 of FIG. 9.


By way of overview, the TD RR processor 1000 can determine respiratory rate by identifying energy in the time domain that has possible meaning. Meaningful energy can include peaks in the TD envelope that correspond to inspiration and/or expiration. The TD RR processor 1000 can disqualify energy that does not have meaning, including energy due to noise (e.g., coughing, sneezing, talking, chewing, ambient noise from the environment, etc.). From the energy that has meaning, the TD RR processor 1000 can calculate respiratory rate. Moreover, the TD RR processor 1000 can calculate one or more confidence values reflecting confidence in the calculated respiratory rate.


In the depicted embodiment, the TD RR processor 1000 includes a variable filtering module 1010, an automatic tagging module 1020, and an RR estimation module 1030. Each of these modules can be implemented in hardware and/or software. Each of these modules can operate on blocks or frames of samples in certain implementations (see, e.g., FIG. 16C).


The variable filtering module 1010 receives a TD envelope, such as the TD envelope described above. The variable filtering module 1010 can apply one or more filters to the TD envelope to smooth the TD envelope. The one or more filters can include a low pass filter. Smoothing the TD envelope can reduce noise in the TD envelope. In certain embodiments, the variable filtering module 1010 can change the cutoff frequency or frequencies of the one or more filters based at least in part on previously calculated RR values. Using a previously calculated respiratory rate value to determine a cutoff frequency of a smoothing filter can reduce errors that can occur from over-smoothing or under-smoothing the TD envelope. Over-smoothing can occur if the new respiratory rate value is relatively high for a given lower cutoff frequency, and under-smoothing can occur if the new respiratory rate value is relatively low for a given higher cutoff frequency.


In the depicted embodiment, the variable filtering module 1010 receives a previous respiratory rate value (RR_old) from the RR estimation module 1030 via a delay block 1040. In one embodiment, if the previous respiratory rate value is above a certain threshold, the cutoff frequency for the smoothing filter is increased to a certain value. If the previous respiratory rate value is below a certain rate, then the cutoff frequency for the smoothing filter can be decreased to a certain value. The cutoff frequency can also be adjusted on a sliding scale corresponding to the previous respiratory rate value. In one embodiment, the cutoff frequency can range from about 1 Hz to about 3 Hz. This range can be varied considerably in certain embodiments.
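For illustration, a sliding-scale mapping from the previous respiratory rate to a cutoff in the roughly 1 Hz to 3 Hz range might look like the following sketch; the breakpoint rates are assumptions, not values from the disclosure.

```python
import numpy as np

def smoothing_cutoff_hz(rr_old_bpm, rr_low=6.0, rr_high=60.0,
                        fc_low=1.0, fc_high=3.0):
    """Map the previous respiratory rate onto a smoothing-filter cutoff on a
    sliding scale between about 1 Hz and about 3 Hz, so that slow breathing
    is smoothed more heavily than fast breathing. The breakpoint rates are
    hypothetical placeholders."""
    fraction = np.clip((rr_old_bpm - rr_low) / (rr_high - rr_low), 0.0, 1.0)
    return fc_low + fraction * (fc_high - fc_low)

# Example: a previous rate of 12 BPM selects a relatively low cutoff.
print(smoothing_cutoff_hz(12.0))   # ~1.22 Hz
```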


The variable filtering module 1010 can also create a difference envelope by subtracting the TD envelope from the filtered envelope (or vice versa). The difference envelope can be used in glitch detection and reduction, as will be described in greater detail below. The variable filtering module 1010 can provide the filtered and difference envelopes to the automatic tagging module 1020.


In certain embodiments, the automatic tagging module 1020 identifies peaks in the filtered TD envelope that correspond to inspiration and/or expiration. Each peak can be referred to as a tag. The automatic tagging module 1020 can determine confidence in the identified tags based at least in part on a received TD and/or FD valid signal, a spectral envelope signal, and/or a noise floor signal. These signals and values are described in part above with respect to FIGS. 3 through 8.


The automatic tagging module 1020 can output the tags and confidence values corresponding to the tags to the RR estimation module 1030. The RR estimation module 1030 can estimate respiratory rate from the identified tags. For example, the RR estimation module 1030 can calculate time periods between pairs of tags within a given processing frame to derive a respiratory rate value for each pair of tags. In one embodiment, the RR estimation module 1030 calculates respiratory rate for every first and third peak so as to calculate respiration from inspiration to inspiration peak and/or from expiration to expiration peak. The respiratory rate value can be derived by multiplying the reciprocal of the time period between two peaks by 60 (seconds per minute) to achieve breaths per minute. The RR estimation module 1030 can average the respiratory rate values from the various tag pairs within the processing frame. This average can be a weighted average (see FIGS. 11B through 11D). The RR estimation module 1030 can also calculate a confidence value for the processing frame based at least in part on the received tags confidence value for each tag or pair of tags.
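A simplified sketch of this estimation follows, computing a rate for every first and third tag and forming a confidence-weighted average; the per-tag confidences and the weighting choice are illustrative assumptions.

```python
def estimate_rr_from_tags(tag_times_s, tag_confidences=None):
    """Illustrative respiratory-rate estimate from tag (peak) times within a
    processing frame. A rate is computed for every first and third tag
    (like peak to like peak), converted to breaths per minute, and averaged
    with per-pair confidences as weights. Names are hypothetical.
    """
    if len(tag_times_s) < 3:
        return None, 0.0
    if tag_confidences is None:
        tag_confidences = [1.0] * len(tag_times_s)
    rates, weights = [], []
    for i in range(len(tag_times_s) - 2):
        period = tag_times_s[i + 2] - tag_times_s[i]    # e.g., insp. to insp.
        rates.append(60.0 / period)                     # breaths per minute
        weights.append(min(tag_confidences[i], tag_confidences[i + 2]))
    total = sum(weights)
    rr = sum(r * w for r, w in zip(rates, weights)) / total
    return rr, total / len(weights)

# Example: alternating inspiration/expiration peaks 2 s apart give 15 BPM.
print(estimate_rr_from_tags([0.0, 2.0, 4.0, 6.0, 8.0]))   # (15.0, 1.0)
```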



FIG. 11A illustrates a more detailed embodiment of an automatic tagging module 1100. The automatic tagging module 1100 is an example implementation of the automatic tagging module 1020 of the TD RR processor 1000. As described above, the automatic tagging module 1100 can be used to identify peaks of energy in the filtered TD envelope.


The automatic tagging module 1100 includes a tag detection module 1110, a glitch removal module 1120, a tags rejection module 1130, and a tags confidence estimation module 1140. Each of these modules can be implemented in hardware and/or software. Each of these modules operates on blocks or frames of samples in certain implementations.


The tag detection module 1110 receives the filtered (TD) envelope, a valid signal, and the noise floor signal. The tag detection module 1110 can make a preliminary detection of peaks and/or valleys in the filtered envelope. The tag detection module 1110 can use the noise floor signal to determine which peaks are above the noise floor. The tag detection module 1110 can include an adaptive threshold, for instance, for estimating which peaks should be considered tags based at least in part on the noise floor level. Thus, peaks occurring below the noise floor may not be considered to be tags that represent inspiration or expiration. In addition, the tag detection module 1110 can have an upper threshold based at least in part on a root mean square (RMS) value of the filtered envelope. Peaks above the upper threshold may not be considered to be tags because they may be due to noise such as talking or the like.
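A minimal sketch of such preliminary peak picking between the noise floor and an RMS-based upper threshold follows; the RMS multiplier and function names are hypothetical placeholders.

```python
import numpy as np

def detect_tags(envelope, noise_floor, rms_factor=3.0):
    """Illustrative preliminary tag detection: a sample is a candidate tag if
    it is a local maximum of the filtered envelope, sits above the noise
    floor, and does not exceed an upper threshold derived from the envelope's
    RMS value (peaks above it are treated as talking or other noise). The
    RMS multiplier is a hypothetical placeholder."""
    env = np.asarray(envelope, dtype=float)
    upper = rms_factor * np.sqrt(np.mean(env ** 2))
    tags = []
    for i in range(1, len(env) - 1):
        is_peak = env[i] > env[i - 1] and env[i] >= env[i + 1]
        if is_peak and noise_floor < env[i] <= upper:
            tags.append(i)
    return tags

# Example: the large hump is detected; the small bump stays below the floor.
env = [0.0, 0.1, 0.05, 0.2, 0.9, 0.2, 0.1, 0.0]
print(detect_tags(env, noise_floor=0.15))   # [4]
```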


The tag detection module 1110 can output preliminary tags (referred to as tags_1 in the FIGURE) to the glitch removal module 1120. The glitch removal module 1120 can compare the identified tags with the difference envelope described above with respect to FIG. 10. This comparison can yield an identification of spikes or other peaks in the identified tags that can correspond to noise or other glitches. The glitch removal module 1120 can filter out, remove, or reduce the amplitude of tags that correspond to glitches or noise. The glitch removal module 1120 therefore outputs new tags (referred to as tags_2 in the FIGURE) with the glitches removed or reduced.


The glitch removal module 1120 can provide the tags to the tags rejection module 1130. The tags rejection module 1130 can use any of a variety of techniques to remove tags that possibly do not correspond to inspiration or expiration sounds of a patient. For instance, the tags rejection module 1130 could reject a tag if it is too close to another tag. Tags that are too close together could result in a respiratory rate that is too high and therefore not physiologically possible. In certain embodiments, the smaller of the two peaks is rejected, although this may not always be the case. The closeness constraint could be relaxed or not used entirely for certain patients, such as neonates, who may have a higher respiratory rate.
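One way the closeness check could be sketched is shown below, using a hypothetical maximum plausible respiratory rate and keeping the larger of two peaks that fall too close together.

```python
def reject_close_tags(tag_indices, envelope, fs, max_rr_bpm=70.0):
    """Illustrative closeness check: if two tags are closer than the period
    implied by a maximum plausible respiratory rate, drop the smaller peak.
    The maximum rate is a hypothetical placeholder and would be relaxed for
    neonates, who can breathe faster."""
    min_separation = fs * 60.0 / max_rr_bpm      # samples per breath at max rate
    kept = []
    for idx in tag_indices:
        if kept and (idx - kept[-1]) < min_separation:
            # Too close to the previous tag: keep whichever peak is larger.
            if envelope[idx] > envelope[kept[-1]]:
                kept[-1] = idx
        else:
            kept.append(idx)
    return kept

# Example at 10 samples/s: tags 3 samples apart are merged into one.
env = [0.0] * 60
env[10], env[13], env[40] = 0.5, 0.8, 0.6
print(reject_close_tags([10, 13, 40], env, fs=10))   # [13, 40]
```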


Another technique that could be used by the tags rejection module 1130 is to find peaks that are related. Peaks that are related can include peaks that do not have a clear separation from one another. For example, a valley separating two peaks could be relatively shallow compared to valleys surrounding the peaks. In certain embodiments, if the valley between the peaks falls below half the amplitude of one or both of the peaks, the peaks could be considered to be separate peaks or tags. Otherwise, the peaks could be considered to be one tag (e.g., corresponding to two intakes or exhales of air in rapid succession).


The tags rejection module 1130 could also analyze tags that appear at the beginning and/or end of a block or frame of samples. The tags rejection module 1130 could elect to reject one of these first or last peaks based at least in part on how much of the peak is included in the frame. For instance, a peak that is sloping downward toward the beginning or end of the frame could be considered a tag for that frame; otherwise, the tag might be rejected.


Another technique that can be used by the tags rejection module 1130 is to identify and remove isolated peaks. Since breathing sounds tend to include an inspiration peak followed by an expiration peak, an isolated peak might not refer to a breathing sound. Therefore, the tags rejection module 1130 could remove any such peaks. The tags rejection module 1130 could also remove narrow peaks, which may not realistically correspond to a breathing sound.


Moreover, in certain embodiments, the tags rejection module 1130 can perform parametric rejection of peaks. The tags rejection module 1130 can, for example, determine statistical parameters related to the peaks within a block or frame, such as standard deviation, mean, variance, peak duration, peak amplitude, and so forth. If the standard deviation or mean value of the peak or peaks is much different from that of the previous block or frame, one or more of the peaks may be rejected from that block. Similarly, if the duration and/or amplitude of the peaks differ substantially from the duration and/or amplitude of peaks in a previous block or frame, those peaks might also be rejected.


The tags rejection module 1130 can provide the refined tags (referred to as tags_3 in the FIGURE) to the tags confidence estimation module 1140. The confidence estimation module 1140 can also receive the spectral envelope calculated above and the noise floor signal. The confidence estimation module 1140 can use a combination of the tags, the spectral envelope, and/or the noise floor to determine a confidence value or values in the tags. The confidence estimation module 1140 can output the tags and the confidence value.


In certain embodiments, the confidence estimation module 1140 can determine confidence in the tags received from the tags rejection module 1130. The confidence estimation module 1140 can use one or more different techniques to determine confidence in the tags. For instance, a confidence estimation module 1140 can analyze power or energy of the tags in the time domain. The confidence estimation module 1140 can also analyze the distribution of frequency components corresponding to the tags in the frequency domain. The confidence estimation module 1140 can use either of these two techniques or both together.



FIGS. 11B through 11D illustrate example time domain waveforms 1151, 1152, 1153 that can be analyzed by the time domain respiratory rate processor described above. As described above, the RR estimation module 1030 can calculate an average or weighted average of individual respiratory rate values for pairs of tags within a processing frame. The time domain waveforms 1151, 1152, 1153 illustrate an embodiment of a technique for computing a weighted average. Further, the time domain waveforms 1151, 1152, 1153 illustrate how an occurrence value or occurrence weighting can be computed for each pair of tags. The occurrence value can be used in an arbitration or decision logic process, described below with respect to FIGS. 16 through 18.


As described above, respiratory rate can be calculated for each first and third tag (e.g., inspiration peak to inspiration peak or expiration peak to expiration peak). However, for ease of illustration, time periods in FIGS. 11B and 11C are calculated between successive peaks. FIG. 11D illustrates how the features described with respect to FIGS. 11B and 11C can be extended to first and third peaks.


In the plots 1151 and 1152, peaks or tags 1154a, 1154b, 1154c, and 1154d are shown along a timeline. In the plot 1151, a time period 1170 between the first two peaks 1154a and 1154b is 20 seconds. A time period 1172 between the peaks 1154b and 1154c is 5 seconds, and a time period 1174 between peaks 1154c and 1154d is also 5 seconds. Two processing windows (or frames) 1162, 1164 are shown. Processing windows can be slid over time, causing older peaks to fall out of the window and newer peaks to enter. Thus, the second window 1164 occurs after the first window 1162.


If the RR estimation module 1030 were to compute a respiratory rate value for each window 1162, 1164 using simple averaging, the respiratory rate values for each window 1162, 1164 would differ substantially in this example. The average period for the first window 1162 would be equal to (20 seconds+5 seconds+5 seconds)/3, resulting in an average 10 second period. The average respiratory rate value for this window 1162 would then be 1/10*60 seconds, or 6 breaths per minute (BPM). The average time period between peaks for the second window 1164 is (5 seconds+5 seconds)/2=5 seconds, resulting in an average respiratory rate of 12 BPM. This respiratory rate value is twice the respiratory rate value calculated for the first time window 1162, resulting in a significant jump or discontinuity between respiratory rate values. This jump results from the respiratory rate value from the oldest tag 1154a being dropped from the second window 1164.


To reduce or prevent these jumps in average respiratory rate values between windows, a weighted average can be computed instead of a simple average. This weighted average is illustrated in the plot 1152 of FIG. 11C. In the plot 1152, an approach for counting the contribution of the most recently dropped peak 1154a to the second window 1164 is shown. The period between the most recently dropped peak 1154a and the next peak 1154b in the second window 1164 is denoted as v. A partial period v′ is also shown. This partial period v′ denotes the period between the start of the current window 1164 and the center (or some other feature) of the first peak 1154b in the current window 1164. A weight can be assigned to the contribution of the period v to the window 1164 by the ratio v′/v. Thus, if the peak 1154a were within the window 1164, the ratio v′/v would be 1. If the period v extended halfway out of the window 1164, the ratio v′/v would be 0.5. The farther out of the window the most recently dropped peak 1154a is, the less weight or contribution this peak may have to the average respiration calculated for the window 1164.


The weights for pairs of peaks remaining in the window 1164 can simply be 1. Thus, the average respiratory rate in BPM for a window can be computed as follows:

    RR_BPM = (60 / n) · Σ_{i=1}^{n} (v_i / T_i)

where n represents the number of tag pairs in the window, T_i represents the time period of the ith tag pair, and v_i represents the ith weight, which may be 1 for all weights except v_1, which is equal to v′/v.


In the plot 1153 of FIG. 11D, the features of FIGS. 11B and 11C are extended to detecting respiratory rates between first and third peaks. Peaks shown include peaks 1184a, 1184b, 1184c, and 1184d. The peaks 1184a and 1184b can correspond to inspiration and expiration, respectively. Similarly, the peaks 1184c and 1184d can correspond to inspiration and expiration as well. Instead of calculating respiratory rate for successive peaks, respiratory rate can be calculated for alternating peaks (e.g., first and third peaks). Thus, respiratory rate can be calculated between the inspiration peaks 1184a and 1184c and again between the expiration peaks 1184b and 1184d. Accordingly, a first time period 1192 between the inspiration peaks 1184a and 1184c, represented as v1, and a second time period 1194 between the expiration peaks 1184b and 1184d, represented as v2, can be calculated.


A processing window 1166 is shown. This window 1166 does not include the two initial peaks 1184a and 1184b. Partial periods v1′ and v2′ can be calculated as shown. The partial period v1′ can be calculated as the time difference between the start of the window 1166 and a feature of the next alternate peak (1184c). Similarly, the partial period v2′ can be calculated as the time difference between the start of the window 1166 and a feature of the next alternate peak (1184d). The weights of contributions of the periods from the earlier peaks 1184a and 1184b can therefore be v1′/v1 and v2′/v2, respectively.


Generally, the weight v′/v can also be referred to as an occurrence value, reflecting the degree to which a particular peak's period contributes to a processing window. In one embodiment, the time domain respiratory processor 1000 can output an occurrence value for each tag in a processing window. The use of the occurrence value is described below with respect to FIGS. 16B through 18.
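As an illustrative sketch only, the following Python function applies the weighted-average formula above, assuming tag times are given in seconds, that periods are formed between successive tags (as in FIGS. 11B and 11C), and that the window start time is known; these assumptions and the function name are not part of the disclosed embodiments.

```python
def weighted_rr_bpm(tag_times, window_start):
    """Weighted-average respiratory rate (BPM) for one processing window.

    tag_times: tag times in seconds, including the most recently dropped
               tag just before the window, sorted ascending.
    window_start: start time of the current window, in seconds.

    Each period T_i between consecutive tags gets weight v_i = 1, except a
    period that starts before the window, whose weight is the occurrence
    value v'/v (the fraction of the period lying inside the window).
    """
    periods, weights = [], []
    for t0, t1 in zip(tag_times, tag_times[1:]):
        period = t1 - t0
        if t1 <= window_start:
            continue  # period ends before the window; contributes nothing
        inside = min(t1 - window_start, period)
        periods.append(period)
        weights.append(inside / period)  # occurrence value (1.0 if fully inside)
    n = len(periods)
    if n == 0:
        return 0.0
    return (60.0 / n) * sum(w / T for w, T in zip(weights, periods))

# Example loosely mirroring FIG. 11C: tags at 0, 20, 25, 30 s; the window
# starts at 15 s, so the 20 s period straddling the window start is weighted
# by 5/20 = 0.25 rather than being dropped outright.
print(round(weighted_rr_bpm([0, 20, 25, 30], window_start=15), 2))  # 8.25
```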


The time domain power analysis technique can be described in the context of FIG. 12, which illustrates an example time domain plot 1200. The plot 1200 illustrates a waveform 1202, which represents a tag. The tag waveform 1202 is plotted as energy or magnitude with respect to time. Confidence values ranging from 0% to 100% are also indicated on the y-axis.


The noise floor signal received by the confidence estimation module 1140 is plotted as the line 1210. The line 1210 is a simplified representation of the noise floor. Other lines 1212, 1214, and 1216 are drawn on the plot 1200 to represent different regions 1220, 1222, and 1224 of the plot 1200. A peak 1204 of the tag 1202 falls within the region 1222. In the depicted embodiment, tags having peaks falling within this region 1222 are assigned 100% confidence.


If the peak 1204 were in the region 1220, the confidence assigned would be less than 100%. The confidence can drop off linearly, for example, from the line 1212 to the line 1210. Confidence can decrease as the peak becomes smaller because the peak approaches the noise floor 1210. Peaks falling below the noise floor 1210 can be assigned 0% confidence.


Likewise, peaks that exceed a threshold represented by the line 1216 can also have zero confidence. These large peaks are more likely to represent talking or other noise, rather than respiration, and can therefore be assigned low confidence. From the line 1214 representing the end of the 100% confidence region, confidence can linearly decrease as peak amplitude increases to the line 1216.


The thresholds represented by the lines 1210, 1212, 1214, and 1216 can be changed dynamically. For instance, the thresholds represented by the lines 1210 and 1212 can be changed based on a changing noise floor. In certain embodiments, however, the upper thresholds represented by the lines 1214 and 1216 are not changed.


It should be understood, in certain embodiments, that the plot 1200 need not be generated to determine confidence in the respiratory rate values. Rather, the plot 1200 is illustrated herein to depict how one or more dynamic thresholds can be used in the time domain to determine respiratory rate confidence.
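For illustration only, the piecewise-linear mapping suggested by the regions of FIG. 12 could be written as sketched below. The four threshold values, the linear ramps, and the function name are assumptions of this sketch rather than the disclosed implementation.

```python
def amplitude_confidence(peak, noise_floor, full_low, full_high, upper_cutoff):
    """Map a tag's peak amplitude to a 0-100% confidence value.

    noise_floor .. full_low   : confidence ramps linearly from 0% to 100%
    full_low .. full_high     : 100% confidence region
    full_high .. upper_cutoff : confidence ramps back down to 0%
    Outside [noise_floor, upper_cutoff] the confidence is 0%.
    """
    if peak <= noise_floor or peak >= upper_cutoff:
        return 0.0
    if peak < full_low:
        return 100.0 * (peak - noise_floor) / (full_low - noise_floor)
    if peak <= full_high:
        return 100.0
    return 100.0 * (upper_cutoff - peak) / (upper_cutoff - full_high)

# Example: a peak halfway between the noise floor and the 100% region
# receives 50% confidence.
print(amplitude_confidence(peak=0.15, noise_floor=0.1, full_low=0.2,
                           full_high=0.6, upper_cutoff=0.9))  # 50.0
```

The noise_floor and full_low thresholds could be updated as the measured noise floor changes, consistent with the dynamic thresholds described above.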


As mentioned above, the confidence estimation module 1140 can also analyze the distribution of frequency components corresponding to the tags in the frequency domain. The confidence estimation module 1140 can analyze the frequency components using the spectral envelope described above. A spectrogram can be plotted to illustrate how confidence can be calculated from the spectral envelope, as shown in FIG. 13A.



FIG. 13A illustrates an example spectrogram 1300 that can be used by the confidence estimation module 1140 to determine respiratory rate confidence. The spectrogram 1300 can include a spectral envelope 1302 corresponding to the same frame or block of samples used to analyze confidence of the tags described above. The spectral envelope 1302 is plotted as peak frequency with respect to time. Thus, for each moment in time, the maximum frequency of the acoustic signal is plotted. Confidence values are also indicated on the y-axis.


In certain embodiments, respiration sounds can approximately correspond to white noise. However, respiration sounds are generally distributed in the frequency range of about 0 Hz to about 2 kHz. More particularly, in certain embodiments, respiration sounds can be distributed in the frequency range of about 200 Hz to about 1.2 kHz. Below about 200 Hz, sounds are usually heart sounds. Above about 1.2 kHz, sounds include moans, sighs, speech (speech also has lower frequency components), and the like.


Thus, in a first region 1322 (denoted by lines 1312 and 1314), 100% confidence is assigned if a peak falls within this region 1322. Peaks falling outside of this region 1322 can have a confidence that drops off linearly to 0% confidence, as shown in the FIGURE.
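A short illustrative sketch of this frequency-domain mapping follows, assuming the 100% region 1322 corresponds approximately to the respiration band of about 200 Hz to about 1.2 kHz described above and that confidence rolls off linearly to zero outside roughly 0 Hz to 2 kHz; the band edges and the linear roll-off are assumptions of the sketch.

```python
def peak_frequency_confidence(peak_hz, lo=200.0, hi=1200.0,
                              lo_cutoff=0.0, hi_cutoff=2000.0):
    """Confidence (0-100%) that a spectral-envelope peak frequency
    corresponds to respiration, based on where the peak falls."""
    if peak_hz <= lo_cutoff or peak_hz >= hi_cutoff:
        return 0.0
    if peak_hz < lo:
        return 100.0 * (peak_hz - lo_cutoff) / (lo - lo_cutoff)
    if peak_hz <= hi:
        return 100.0
    return 100.0 * (hi_cutoff - peak_hz) / (hi_cutoff - hi)

print(peak_frequency_confidence(600.0))   # 100.0 (inside the respiration band)
print(peak_frequency_confidence(1600.0))  # 50.0 (halfway down the upper roll-off)
```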


In certain embodiments, this confidence value determined from the spectrogram 1300 can be further refined by the density of the spectrogram at each peak. If a peak at a given point in time represents narrowband or periodic noise (e.g., a pure tone), it may have been derived from a spectrum having fewer frequency components than a respiration spectrum. The frequency spectrum corresponding to the time where the peak occurs can therefore be considered to be less dense. In contrast, if a peak at a given point in time comes from a broader spectrum, that peak can be considered to be denser. Spectral density is schematically represented by line 1330, with more tick marks on the line representing greater density.


If the density is low, the peak can likely represent narrowband noise or periodic noise. An example of narrowband noise can include an alarm from a medical instrument. Examples of periodic noise can come from medical instrumentation that emits periodic noise, such as CPAP machine cycles, noise from motors, and the like. Periodic noise can also result from speech, closing doors, motor vehicles (e.g., in the hospital parking lot), and the like. If the density is higher, the peak more likely represents a respiration sound because respiration can be relatively more broadband than narrowband or periodic noise. As a relatively broadband signal, respiration can include more spectral components than a narrowband signal. Thus, respiration signals can be more spectrally dense than narrowband noise signals.


In fact, in some instances, respiration can resemble white noise over a certain bandwidth. Somewhat paradoxically, the broadband sound (respiration) is the signal that is to be detected, rather than the narrowband/periodic sound, which can be the noise. This detection problem contrasts with typical detection problems where a narrowband signal of interest is to be detected from broadband noise.


Thus, in certain embodiments, the calculated density of the spectral information corresponding to the time when a peak occurs can be used at least in part to determine confidence values. In brief, the denser a signal is, the more confident the respiration processor can be that the signal contains respiration. Conversely, the less dense the signal, the less confident the respiration processor can be that the signal includes respiration. More detailed features for computing and analyzing spectral density are described below with respect to FIGS. 13B through 13D.


Referring again to FIG. 11A, the confidence estimation module 1140 can combine the confidence values derived from the TD and FD techniques described above. Advantageously, in certain embodiments, using these different techniques increases the accuracy of the confidence calculation. For example, the confidence estimation module 1140 can average the confidence values derived from both techniques to produce confidence values for each tag in a frame or block. Moreover, the confidence estimation module 1140 can look at the statistical distribution of shapes of peaks, for example, by determining if tags are similar, to further increase or decrease the calculated confidences in a frame or block.



FIG. 13B illustrates a more detailed embodiment of the confidence estimation module 1140, namely the confidence estimation module 1340. In the depicted embodiment, the confidence estimation module 1340 includes a density calculator 1382, a matched filter 1384, and a decision/confidence block 1386. In one embodiment, these components 1382, 1384, 1386 include hardware and/or software. Another view of these components is that the blocks 1382, 1384, 1386 represent algorithm or process flow states that can be implemented by hardware and/or software. Advantageously, in certain embodiments, the confidence estimation module 1340 can use spectral density, spectral envelope, and other features described above to compute confidence in a respiratory measurement. Moreover, as will be described in greater detail below, the confidence estimation module 1340 can use these features to detect respiratory pauses and other respiratory events, such as apnea, hypopnea, and the like.


In the depicted embodiment, spectral information is provided to the density calculator 1382. This spectral information was described above with respect to FIG. 8 as being calculated by the FD envelope processor 500. In certain embodiments, the spectral information can include the power spectrum of the working time domain signal (e.g., the signal received by the FD envelope processor 500). Advantageously, in certain embodiments, the density calculator 1382 computes spectral density from the spectral information. As described above with respect to FIG. 13A, higher spectral density can be more representative of respiration, while lower spectral density can be more representative of narrowband and/or periodic noise.


Spectral density as calculated by the density calculator 1382 can be further understood in the context of the example waveforms 1319 depicted in FIG. 13C. In FIG. 13C, these waveforms 1319 include a spectrogram 1317 of the spectral information and a graph 1321 showing spectral peaks plotted over time (a one-dimensional spectrogram 1360) and spectral density 1370 plotted over time, among other features. The waveforms 1319 shown are used herein to facilitate a description of spectral density calculations, among other features. The waveforms 1319 may or may not be output on a patient monitor in various embodiments.


The spectrogram 1317 is plotted with respect to frequency over time. At each slice in time (represented on the “x” axis), a magnitude frequency response is plotted (represented on the “y” axis). This magnitude frequency response can be obtained, as described above, from an FFT performed on a block or frame of input samples. In the depicted example spectrogram 1317, the magnitude of a particular frequency is illustrated by the brightness or intensity of the points (e.g., pixels) in the spectrogram. Thus, brighter points tend to represent higher frequency magnitudes while dimmer pixels tend to represent lower frequency magnitudes.


The spectrogram 1317 differs from the spectrogram 1300 shown in FIG. 13A in that the spectrogram 1317 shows all (or substantially all) frequencies of the spectral information over time, whereas the spectrogram 1300 shows the peak frequency of the spectral information over time. For comparison, the spectrogram 1360 (referred to herein as a one-dimensional spectrogram, in contrast with the multi-dimensional spectrogram 1317) shown in the graph 1321 is an example implementation of the spectrogram 1300 of FIG. 13A.


Some features of the spectrogram 1317 include respiration portions 1330, speech portions 1340, narrowband portions 1342, and ambient noise portions 1350. The respiration portions 1330 in the depicted embodiment extend through most or all of the frequency spectrum (e.g., at a given point in time), illustrating how respiration can be broadband sound. In the depicted embodiment, the energy of the respiration portions 1330 is greatest in the lower frequencies (e.g., below about 1 kHz), although this may not always be the case. Because respiration energy can sometimes be greatest in the lower frequencies, the confidence estimation module 1340 of FIG. 13B can consider signals having greater energy in the lower frequencies to more likely be respiration. The confidence estimation module 1340 can use this information about the frequency locations of signal energy as an input to a confidence calculation.


The speech portions 1340 of the example spectrogram 1317 reflect an example pattern of speech. As speech can include periodic or narrowband signals, narrowband portions 1342 are illustrated in the speech portions 1340. These narrowband portions 1342 appear as striations on the spectrogram 1317. At any one slice in time, the frequency response at the speech portions 1340 of the spectrogram 1317 is less dense than at the example respiration portions 1330. The confidence estimation module 1340 of FIG. 13B can use this density information to distinguish between respiration and speech, as well as other noise signals.


The ambient noise portions 1350 reflect low-level background noise from the environment (e.g., the environment around a patient). For example, the ambient noise can include white noise. Because this ambient noise has signal energy, it can adversely affect a density calculation by inflating the calculated density. Thus, in certain embodiments, the density calculator 1382 of FIG. 13B can advantageously reduce the effect of the ambient noise 1350 on a density calculation. For example, the density calculator 1382 can exclude the ambient noise from the density calculation.


An example process 1390 for calculating spectral density that takes into account the ambient noise 1350 is illustrated in FIG. 13D. The process 1390 can be implemented by the density calculator 1382. Referring to FIG. 13D, the process 1390 receives the spectral information described above at block 1391. At block 1392, a spectral bias is determined based at least partly on the level of the ambient noise energy. The spectral bias can be the level of the ambient noise energy. The level of the ambient noise energy can be calculated in a variety of ways. One approach for determining the ambient noise energy within a given time period is to calculate the spectral centroid from the spectral information. The spectral centroid can be calculated as a weighted mean of the frequencies present in the spectral information, with the magnitudes of the frequencies as the weights (see FIGS. 13E through 13G).
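As an illustrative sketch only, a spectral centroid of the kind described above could be computed as follows; the array representation and the function name are assumptions of this sketch.

```python
import numpy as np

def spectral_centroid(magnitudes, bin_hz):
    """Weighted mean of the frequencies present in one magnitude spectrum,
    using the bin magnitudes as weights.

    magnitudes: 1-D array of FFT magnitudes for one block or frame.
    bin_hz: width of each FFT frequency bin, in Hz.
    """
    freqs = np.arange(len(magnitudes)) * bin_hz
    total = magnitudes.sum()
    if total == 0:
        return 0.0
    return float((freqs * magnitudes).sum() / total)

# Example: energy concentrated in low-frequency bins yields a low centroid,
# as in the respiration-like spectrum of FIG. 13E.
mags = np.array([0.0, 5.0, 4.0, 1.0, 0.5, 0.1])
print(spectral_centroid(mags, bin_hz=15.625))
```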


Over time, the spectral bias can be smoothed or averaged, for example, by applying a weighted average or the like. Any smoothing filter can be used to average the spectral bias. For example, a first-order IIR filter can be used to smooth the spectral bias. In one embodiment, the ambient noise energy is weighted such that the spectral bias decreases slowly but increases quickly. By decreasing slowly, the spectral bias can be used to exclude relatively louder ambient noises. By increasing quickly, the spectral bias can be used to assist with detection of shallow breathing signals.
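A minimal sketch of such asymmetric first-order IIR smoothing is shown below; the specific smoothing coefficients are assumptions chosen only to illustrate the fast-rise, slow-fall behavior described above.

```python
def update_spectral_bias(prev_bias, noise_estimate,
                         rise_alpha=0.5, fall_alpha=0.02):
    """One step of a first-order IIR smoother with asymmetric response.

    prev_bias: smoothed spectral bias from the previous block.
    noise_estimate: new ambient-noise energy estimate for this block.
    rise_alpha / fall_alpha: smoothing coefficients; the larger rise_alpha
    lets the bias increase quickly, while the smaller fall_alpha lets it
    decrease slowly, as described above.
    """
    alpha = rise_alpha if noise_estimate > prev_bias else fall_alpha
    return (1.0 - alpha) * prev_bias + alpha * noise_estimate

# Example: a loud transient raises the bias quickly; the bias then decays
# slowly after the transient passes.
bias = 1.0
for estimate in [1.0, 4.0, 1.0, 1.0]:
    bias = update_spectral_bias(bias, estimate)
    print(round(bias, 3))  # 1.0, 2.5, 2.47, 2.441
```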


At block 1393, the energy of each frequency bin in the spectral information is determined. As mentioned above with respect to FIG. 8, the spectral information can be obtained from the FFT of the working acoustic signal. The FFT can partition spectral information into a plurality of frequency bins. The number and size of the bins can be determined based on the sampling frequency and the number of points used to compute the FFT. Thus, for example, for a 4 kHz sampling frequency and a 256 point FFT, the frequency bin size can be about 15.625 Hz. The energy of each frequency bin can be determined by determining the magnitude or squared magnitude, for example, of the spectral information in each bin. In some embodiments, the energy of a subset of the bins, rather than each of the bins, is calculated.


At decision block 1394, it is determined, for each bin, whether the energy of the bin is greater than the spectral bias. If so, then a counter is incremented at block 1395. Otherwise, the counter is not incremented, and the process 1390 proceeds to decision block 1396. Incrementing the counter for a bin can reflect that the bin is spectrally dense, or in other words, that the bin should be counted toward the spectral density calculation. Said another way, in certain embodiments, when the energy of a bin exceeds the ambient noise level of that bin, the bin is counted toward the spectral density.


At decision block 1396, it is further determined whether any additional bins remain to be processed. If so, the process 1390 loops back to block 1394. Otherwise, spectral density is calculated at block 1397, for example, by dividing the counter value by the total number of bins considered. In effect, the process 1390 can therefore determine the ratio (or percentage) of bins that have a spectral density above the spectral bias.
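The counting loop of process 1390 can be sketched compactly as follows. This is illustrative only; it assumes per-bin energies and a single scalar spectral bias per block (a per-bin bias would be an equally valid variation).

```python
import numpy as np

def spectral_density(bin_energies, spectral_bias):
    """Fraction of frequency bins whose energy exceeds the spectral bias.

    bin_energies: 1-D array of per-bin energies (e.g., squared FFT magnitudes).
    spectral_bias: ambient-noise level used to exclude background energy.
    Returns a density in [0, 1]; higher values are more respiration-like.
    """
    bin_energies = np.asarray(bin_energies, dtype=float)
    if bin_energies.size == 0:
        return 0.0
    count = int((bin_energies > spectral_bias).sum())  # blocks 1394-1395
    return count / bin_energies.size                   # block 1397

# Example: 6 of 8 bins exceed the bias, giving a density of 0.75.
energies = [0.1, 2.0, 3.0, 1.5, 2.5, 0.05, 4.0, 1.2]
print(spectral_density(energies, spectral_bias=0.5))  # 0.75
```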


The spectral density can be determined using other techniques or algorithms. For example, spectral density can be computed by summing the energy levels (obtained at block 1393) for the bins. Further, in some embodiments, this sum can be divided by the total number of bins considered to arrive at a mean energy value. This mean energy value can be the spectral density. Many other techniques can be used to calculate the spectral density, as will be apparent from this disclosure.


Referring again to FIG. 13B, the density calculator 1382 can aggregate the spectral densities calculated for each block or frame of samples to output density data over time. An example of such density data is shown in the graph 1321 of FIG. 13C. In the graph 1321, a density waveform 1370 is shown, representing the calculated density over time. The time scale in the graph 1321 corresponds to the time scale used in the spectrogram 1317, for the same input acoustic signal. Inspection of the graph 1321 indicates that for periods of respiration 1330 shown in the spectrogram 1317, the density waveform 1370 is higher in magnitude, while for periods of speech 1340 and other noise, the density waveform 1370 is lower in magnitude.


Also shown in the graph 1321 is a one-dimensional spectrogram 1360, which includes the features of the spectrogram 1300 described above with respect to FIG. 13A. This spectrogram 1360 includes the peak spectral value over time (e.g., the peak in the spectrogram 1317). Superimposed on the graph 1321 are markers 1362 representing tags detected by the time domain algorithms described above. The tags 1362 can reflect the tags_3, for example, described above with respect to FIG. 11A. A density threshold 1372 is also shown, which will be described in greater detail below.


Referring again to FIG. 13B, the density calculator 1382 provides the density data (corresponding to the density waveform 1370) to the matched filter 1384. The matched filter 1384 is an example of a detector that can be used to analyze the density data. Advantageously, in certain embodiments, the matched filter 1384 attempts to match the density data to a template or pattern to determine whether the density data corresponds to respiration. In one embodiment, a breath results in a sinusoidal-shaped pattern. The matched filter 1384 can convolve a portion of the density data with a sinusoidal-shaped pattern to determine whether the density data matches the sinusoidal-shaped pattern. Pattern-matching filters other than a matched filter can be used in other implementations.
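For illustration only, the matching of a density segment against a breath-shaped template could be sketched with a normalized correlation, as below. The half-sine template, the segment length, and the normalization are assumptions of this sketch; other pattern-matching filters could be substituted, as noted above.

```python
import numpy as np

def breath_template(length):
    """One-breath template: a single half-sine bump of the given length."""
    return np.sin(np.linspace(0.0, np.pi, length))

def matched_filter_score(density_segment, template):
    """Normalized correlation between a density segment and the template.

    Returns a value near 1.0 when the segment closely matches the
    breath-shaped template and a much lower value when it does not.
    """
    seg = np.asarray(density_segment, dtype=float)
    seg = seg - seg.mean()
    tmp = template - template.mean()
    denom = np.linalg.norm(seg) * np.linalg.norm(tmp)
    if denom == 0:
        return 0.0
    return float(np.dot(seg, tmp) / denom)

# Example: a breath-like bump scores high; a flat, noisy segment scores low.
tmpl = breath_template(32)
bump = breath_template(32) + 0.05 * np.random.default_rng(0).standard_normal(32)
flat = 0.05 * np.random.default_rng(1).standard_normal(32)
print(round(matched_filter_score(bump, tmpl), 2))  # high (breath-like)
print(round(matched_filter_score(flat, tmpl), 2))  # low (not breath-like)
```

The sinusoidal template here could be replaced by a patient-specific baseline template, as described below.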


An example illustrating the pattern matching techniques of the matched filter 1384 is shown in FIG. 13C. Referring again to FIG. 13C, the graph 1321 depicts sinusoidal patterns 1373 superimposed on portions 1371 of the density waveform 1370. The portions 1371 of the density waveform 1370 are shown as copies below the actual density waveform 1370 for illustration purposes. The density waveform portions 1371 can be centered on the locations of the tag indicators 1362 but need not be.


In the example embodiment shown, the density portions 1371 do not appear similar to the sinusoidal patterns 1373. Thus, the output of the matched filter 1384 would be relatively small, indicating that these density portions 1371 likely do not reflect the presence of respiration. This conclusion makes sense, given that the density portions 1371 analyzed fall within the speech portion 1340 of the spectrogram 1317. Conversely, if the density portions 1371 more closely matched the sinusoidal patterns 1373, the output of the matched filter 1384 would be relatively larger, reflecting the possible presence of respiration.


Referring again to FIG. 13B, the matched filter 1384 can use other templates besides a purely sinusoidal template to compare with the density data. For instance, the matched filter 1384 can compare density data with a patient's baseline respiration density shape or template. An actual patient's baseline density shape (e.g., where the patient has breathed normally previously) may not be exactly sinusoidal. Thus, instead of using a sinusoid to detect breathing shapes in the density data, the matched filter 1384 can attempt to match the patient's baseline template with the density data.


The matched filter 1384 can initially detect the patient's baseline template using the sinusoidal template in one embodiment. The matched filter 1384 can also adjust the patient's baseline template over time as the patient's breathing changes. For instance, the matched filter 1384 can average a patient's density shapes for each detected respiration to derive an average baseline template.


The matched filter 1384 provides a filtered output. In one embodiment, the matched filter 1384 provides a higher output if the density data matches the template than if the density data does not match the template. The filtered output is provided to the confidence module 1386. The confidence module 1386 can receive the filtered output, the spectral envelope (see FIGS. 13A, 13C), the tags (e.g., the tags_3 of FIG. 11A), the noise floor (see FIG. 6), and a user-adjustable sensitivity (discussed below).


In certain embodiments, the confidence module 1386 calculates a confidence reflecting whether the identified tags represent breaths. The confidence module 1386 can calculate this confidence using the features described above with respect to FIG. 13A. For example, the confidence module 1386 can determine where the peak of the spectral envelope at a given point in time lies, determine whether the spectral density (e.g., as computed by the density calculator 1382) at that peak exceeds a threshold, and so forth.


An example of such a density threshold 1372 is shown in FIG. 13C. At the points where tags occur (the indicators 1362), the confidence module 1386 can compare the value of the density waveform 1370 with the threshold 1372. If the density value exceeds the threshold 1372, the confidence module 1386 can assign higher confidence to the tag occurring at that density value. The confidence module 1386 can calculate a degree of confidence based at least in part on how far the density value exceeds the threshold 1372. The confidence module 1386 can calculate lower confidence values when the density value is below the threshold 1372.


The confidence module 1386 can further base confidence at least in part on the degree to which the filtered output from the matched filter 1384 exceeds a threshold. Thus, the confidence module 1386 can use both the density data and the matched filter 1384 output together to determine confidence. In some implementations, the confidence module 1386 can use one or both of density and matched filter 1384 output to determine confidence. The confidence module 1386 can also use other signal features to calculate confidence, some of which are discussed elsewhere herein.


In certain embodiments, the user-adjustable sensitivity input into the confidence module 1386 can advantageously adjust one or more of the thresholds used by the confidence module 1386. The user-adjustable sensitivity can be used to select between focusing on detecting non-respiration events (such as respiratory pauses, apnea, and the like) and focusing on detecting respiration events. In some care scenarios, it can be more important to a clinician to determine whether a patient is breathing than what the patient's precise respiratory rate is. Thus, the user-adjustable sensitivity can be used to select this preference. For example, in one embodiment, the user-adjustable sensitivity can be controlled by a clinician via a user interface on a patient monitor. A display button or other user interface control (such as a hardware button or switch) can allow the clinician to select between detecting respiratory pauses, apnea, or the like and more precise detection of respiratory rate.


If the clinician selects detection of respiratory pauses or the like, the user-adjustable sensitivity control can be adjusted to cause the density threshold 1372 to be set higher than if the clinician selects more accurate respiratory rate detection. Thus, the confidence module 1386 can intentionally under-read respiratory rate to more aggressively detect lack of breathing. By under-reading respiratory rate, the confidence module 1386 can ultimately cause more alarms to be triggered, enabling clinicians to respond to patients who are not breathing. Thus, scenarios can be avoided or reduced where clinicians are unaware of a patient's cessation of breathing, allowing clinicians to save more lives.


The user-adjustable sensitivity can be provided to the confidence module 1386 from the patient monitor. The user-adjustable sensitivity can be used by the confidence module 1386 to adjust the density threshold 1372 (see FIG. 13C), the matched filter threshold, or a combination of the same.


Another advantageous feature that can be provided in certain embodiments is to use manual tags generated by a clinician to train the confidence estimation module 1340. Example indicators 1364 of manual tags are depicted in FIG. 13C. The indicators 1364 represent instances where clinicians have marked that a patient actually breathed, e.g., by visually observing the patient breathing. These indicators can be used by the confidence estimation module 1340 during manufacturing, testing, and/or servicing to automatically improve confidence calculations. In one embodiment, an engineer can use the manual tag indicators 1364 to improve signal processing of the respiratory data.



FIGS. 13E through 13G illustrate example spectrums 1398 for computing centroids. The centroids can be used, for example, to calculate ambient noise energy as described above. More generally, centroids can be calculated to assist with detection of background noise, detecting respiratory pauses, calculating confidence values, and so forth.


Referring to FIG. 13E, an example spectrum 1398a is shown at a point in time. The spectrum 1398a plots energy against frequency. More generally, any of the spectrums 1398 described herein can plot a frequency response magnitude, energy, power, or the like. The spectrum 1398a corresponds to an example respiration, having energy distributed primarily in lower frequencies rather than higher frequencies. A representation of a centroid 1399a is also shown. The centroid 1399a can be an equilibrium point about which the energy in the spectrum 1398a is equally divided. Thus, the energy in the spectrum 1398a to the left of the centroid 1399a can be equal or substantially equal to the energy to the right of the centroid 1399a. With more energy distributed toward the lower frequencies for a respiration spectrum 1398a, the centroid 1399a can be toward the lower end of the spectrum 1398a as well.


Other centroids can have higher values, such as the centroid 1399b in the spectrum 1398b (where energy is concentrated in high frequencies) and the centroid 1399c of the spectrum 1398c (where energy is distributed evenly across the spectrum 1398c). Thus, the higher the centroid 1399, the less likely the spectrum 1398 represents respiration, and vice versa.


In one embodiment, the tags confidence estimation module 1140 can analyze the centroid of a spectrum for a given point in time, in addition to or instead of analyzing spectral density and spectral envelope peaks. The lower the centroid, the more confident the tags confidence estimation module 1140 can be that tags correspond to respiration, and vice versa. Similarly, the tags confidence estimation module 1140 can indicate that a respiration pause is more likely if the centroid is relatively higher than during respiration periods.


Another concept that can be useful in distinguishing respiration from respiratory pauses and background noise is entropy. Entropy, or Shannon entropy, can generally represent the order (or disorder) of a signal. A signal that is more ordered, such as speech (which tends to have well-ordered harmonics), can have a higher entropy value. A signal that is less ordered, such as respiration, can have a lower entropy value. Thus, the tags confidence estimation module 1140 can calculate the entropy of an acoustic signal (e.g., according to processing windows) and can compare the entropy of the acoustic signal to some threshold to estimate whether the signal represents background noise/respiratory pauses or respiration.


In other embodiments, trends of centroids and entropy values are analyzed instead of thresholds. Further, entropy and centroids can be analyzed together to estimate confidence/respiratory pauses, and the like. In addition, entropy, centroids, spectral density, and spectral envelope peaks can be used together to estimate confidence/respiratory pauses, and the like. For instance, each of these quantities can be normalized on a scale (such as from 0 to 1) and multiplied together to produce a probability of respiration. If the multiplied value is close to 1 on a scale of 0 to 1, the probability that a respiration is detected can be close to 1, and vice versa. Some subset of the parameters entropy, centroids, spectral density, and spectral envelope peaks can be used to analyze confidence and/or respiratory pauses in different implementations. Other parameters may also be used, including other statistical parameters.
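An illustrative sketch of the multiplicative combination follows. It assumes each parameter has already been normalized to a 0-to-1 respiration-likeness score (for example, with entropy and centroid inverted so that higher means more respiration-like); that normalization and the function name are assumptions of the sketch.

```python
def respiration_probability(normalized_scores):
    """Combine normalized 0-1 scores (e.g., spectral density, inverted
    centroid, inverted entropy, envelope-peak score) by multiplication.

    A product near 1 indicates all features agree that respiration is
    present; any single low score pulls the product toward 0.
    """
    prob = 1.0
    for score in normalized_scores:
        prob *= min(max(score, 0.0), 1.0)  # clamp each score into [0, 1]
    return prob

# Example: strong agreement across features vs. one dissenting feature.
print(respiration_probability([0.9, 0.8, 0.95]))  # about 0.68
print(respiration_probability([0.9, 0.1, 0.95]))  # about 0.086
```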


Referring again to FIG. 10, the RR estimation module 1030 can receive the tags and the calculated confidence values from the automatic tagging module 1020. As described above, the RR estimation module 1030 can calculate respiratory rate from the tags and an overall TD RR confidence from the individual tag confidence values in a frame or block. In one embodiment, the RR estimation module 1030 counts the number of tags occurring in a frame or block and uses this count to derive respiratory rate based on the amount of time represented by the frame or block. In another embodiment, the RR estimation module 1030 determines respiratory rate based on the difference in time between different tags in a frame or block. The two possible TD RR values are represented as TD RR 1 and TD RR 2 in the FIGURE. In some embodiments, only one method is used to calculate the TD RR.


The RR estimation module 1030 can determine the overall TD RR confidence value for a given block or frame by averaging confidence values for each peak or tag within the frame. The RR estimation module 1030 can therefore output an overall TD RR confidence value.


Frequency Domain RR Calculation


FIG. 14 illustrates an embodiment of a frequency domain respiratory rate (FD RR) calculation process 1400. The FD RR calculation process 1400 can be implemented by the FD RR processor 920 described above with respect to FIG. 9. Advantageously, in certain embodiments, the FD RR calculation process 1400 can calculate a respiratory rate value based on frequency domain analysis.


The FD RR calculation process 1400 can analyze an FD envelope obtained above. The FD RR calculation process 1400 will be described in the context of the example frequency envelopes or spectrums 1510, 1520 shown in FIGS. 15A, 15B respectively. Each spectrum 1510, 1520 is plotted as magnitude versus frequency.


At block 1402, peaks that do not have harmonics are removed from the frequency envelope. Referring to FIG. 15A, a peak 1512 has harmonics 1514, which can be integer multiples of the frequency of the peak 1512. In contrast, peak 1516 does not have a corresponding harmonic. This can also be seen in FIG. 15B, where a peak 1526 does not have a corresponding harmonic. These peaks could therefore be removed from the spectrum because they likely do not correspond to respiration.


Referring again to FIG. 14, a peak with maximum energy is found at block 1404. In FIG. 15A, the maximum energy peak is peak 1512, and in FIG. 15B the maximum energy peak is peak 1522. It is determined at decision block 1406 whether a peak exists at half the frequency of the maximum peak. If not, at block 1408, the frequency of the maximum peak is determined to correspond to the respiratory rate, and the process 1400 ends.


Thus, in FIG. 15A, there is no peak at half the frequency of the maximum peak 1512, and therefore the frequency of this peak 1512 corresponds to the respiratory rate. In contrast, there is a peak 1524 at half the frequency of the maximum peak 1522 in FIG. 15B. Thus, the analysis continues.


Referring again to FIG. 14, if there is a peak at half the maximum peak's frequency, at decision block 1410, it is further determined whether this peak has harmonics. If not, the frequency of the maximum peak corresponds to respiratory rate in block 1408, and the process ends. Otherwise, the frequency of the peak at half the maximum peak's frequency corresponds to respiratory rate at block 1412, and the process ends. Thus, for example, in FIG. 15B, the peak 1524 has a harmonic at peak 1522 (the maximum peak), peak 1528, and so on. Therefore, this peak is determined to have a frequency corresponding to respiratory rate.


Although not shown, in some cases, even though the first peak has harmonics, the maximum peak may still be selected as corresponding to respiratory rate based on historical respiratory rate values. For instance, if the respiratory rate value calculated from the first peak is much different from the previously calculated value, the respiratory rate value for the second peak may be selected if it is closer to the historical value. In addition, although example spectrums have been shown in FIGS. 15A and 15B, many other spectrum shapes and types can exist. Thus, the FD RR processor 920 and the process 1400 can be adjusted or adapted accordingly to account for different types of spectrums.
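For illustration only, the decision flow of process 1400 could be sketched as follows, assuming spectral peaks have already been extracted as (frequency, magnitude) pairs and that a simple tolerance is used when testing for harmonics and half-frequency peaks; the peak extraction, tolerance, and function names are assumptions of this sketch.

```python
def _has_peak_near(peaks, freq, tol):
    return any(abs(f - freq) <= tol for f, _ in peaks)

def fd_respiratory_rate(peaks, tol=0.02):
    """Select the respiration frequency (Hz) from spectral peaks.

    peaks: list of (frequency_hz, magnitude) tuples for the FD envelope.
    Follows blocks 1402-1412: drop peaks without harmonics, take the
    maximum-energy peak, then prefer a half-frequency peak that itself
    has harmonics.
    """
    # Block 1402: keep only peaks that have at least one harmonic.
    with_harmonics = [(f, m) for f, m in peaks
                      if _has_peak_near(peaks, 2 * f, tol)]
    if not with_harmonics:
        return 0.0
    # Block 1404: peak with maximum energy.
    f_max, _ = max(with_harmonics, key=lambda p: p[1])
    # Blocks 1406-1412: prefer a half-frequency peak that has harmonics.
    half = f_max / 2.0
    half_peaks = [f for f, _ in with_harmonics if abs(f - half) <= tol]
    if half_peaks:
        return half_peaks[0]
    return f_max

# Example resembling FIG. 15B: peaks at 0.2, 0.4, 0.6, and 0.8 Hz, with the
# 0.4 Hz peak strongest; the 0.2 Hz peak (12 BPM) is selected because it has
# harmonics, even though the 0.4 Hz peak carries more energy.
peaks = [(0.2, 0.5), (0.4, 1.0), (0.6, 0.3), (0.8, 0.4)]
print(round(fd_respiratory_rate(peaks) * 60.0, 1))  # 12.0 BPM
```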


Referring again to FIG. 9, the FD RR processor 920 can calculate FD RR confidence as follows. In one embodiment, the FD RR processor 920 can sum the energy of all or substantially all the harmonics in the FD envelope or spectrum. The FD RR processor 920 can further determine the energy of the entire envelope or spectrum. The confidence can be equal to, or derived from, the total harmonic energy divided by the total energy in the spectrum. Thus, if there is substantially more energy in the spectrum other than harmonic energy, the confidence can be low, and vice versa.


In some embodiments, if the amplitude of the acoustic signal for a given frame or processing window is small (e.g., below a threshold), the frequency domain processor can zero out or otherwise perform no analysis. Similarly, if the respiratory rate calculated by the frequency domain processor is very high (e.g., above a threshold where patients typically cannot breathe so fast), the frequency domain processor can zero out the calculated respiratory rate value. Alternatively, in either the case of low signal level or high calculated respiratory rate, the frequency domain processor can calculate low or zero confidence instead of zeroing out the frequency domain respiratory rate value.


Arbitrator

Advantageously, in certain embodiments, FD RR analysis can be simpler than TD RR analysis. However, using both in conjunction can reduce errors and provide more accurate confidence analysis. For instance, if the FD spectrum includes harmonic peaks that are substantially equal in value, it can be difficult to determine which corresponds to inspiration and which corresponds to expiration. Thus, the FD RR processor 920 might calculate a respiratory rate that is double the actual respiratory rate. The RR calculated in the time domain, if half that rate, is likely to be more accurate. In certain embodiments, the arbitrator 930 described above arbitrates between the TD and FD engines to output a more accurate respiratory rate value.



FIG. 16A illustrates a more detailed embodiment of an arbitrator 1600 corresponding to the arbitrator 930 of FIG. 9. The arbitrator 1600 can be implemented in software and/or hardware. Advantageously, in certain embodiments, the arbitrator 1600 can combine and/or select from the TD and FD RR values and from the TD and FD confidence values.


The arbitrator 1600 includes a decision logic module 1610 and a post-processing module 1620. The decision logic module 1610 can receive the TD and FD RR and confidence values. The decision logic module 1610 can select RR and confidence values within a block or frame, whereas the post-processing module 1620 can determine an overall RR and an overall confidence value for a period of time. As frames can overlap, the post-processing module can determine the overall RR and confidence values for several frames.


The post-processing module 1620 can output the RR value, the confidence value, and the elapsed time value discussed above for presentation to a user. The confidence value output by the post-processing module 1620 can be referred to as a respiratory signal quality (RSQ) value or indicator. However, any confidence value described herein could also be referred to as a respiratory signal quality (RSQ) value or indicator.


Several techniques of the decision logic module 1610 will now be described; some may be implemented in certain embodiments, others in other embodiments, and all of the techniques may also be implemented together. It should also be noted that two TD RR values, and possibly two confidence values, can be output by the TD RR processor 910, as described above. Either of these TD RR values, or an average of the two, can be compared with the FD RR value. Alternatively, the two TD RR values can be compared with each other using the techniques described below, and then one can be selected and compared with the FD RR value.


In certain embodiments, the decision logic module 1610 can compare the TD and FD RR values within a frame to determine if they are close to each other. If they are close, the decision logic module 1610 could output either value. In one embodiment, the decision logic module 1610 outputs the value with the higher associated confidence value. If the values are not close, the decision logic module 1610 can output the RR value from the engine that is closest to a previous RR value.


If both TD and FD RR values are of high confidence but not close (within a delta), and neither is close to a previous value, then the decision logic module 1610 can output one of the values if it passes other checks. For instance, if the FD value is double the TD value (e.g., for reasons discussed above), the decision logic module 1610 could output the TD value instead.


In one embodiment, the decision logic module 1610 outputs a RR value only if either the TD or FD confidence is above a threshold. If both are below a threshold (which may differ for both), the decision logic module 1610 can output a zero RR value or “not a number” (NaN) type RR value.


The post-processing module 1620 can average or otherwise combine the FD and TD RR values from possibly overlapping frames based on a selected averaging time. The averaging time can be user selectable in some embodiments. For instance, if the averaging time is 30 seconds, the post-processing module 1620 can average the TD and/or FD values selected for overlapping frames falling within the 30 second period. The output of this averaging can be an overall RR value. In one embodiment, the post-processing module 1620 performs a weighted average of the respective respiratory rate values. The post-processing module 1620 can, for instance, use the calculated confidence values as weights or to derive weights to be applied to the respiratory rate values.


The post-processing module 1620 can similarly average the TD and FD confidence values to obtain an overall confidence value. Various combinations can be used to obtain the overall confidence value. For example, several confidence values can be averaged, possibly with weights. Also, the number of nonzero points in a buffer of confidence values can be divided by the total number of points in the buffer. For example, if a buffer has five total confidence measurements, and two of those are zero, the overall confidence value could be ⅗. Also, the overall confidence value could be a combination of the weighted average and the nonzero-based confidence calculation. Many other embodiments are possible.
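For illustration only, one such post-processing combination is sketched below, assuming a buffer of per-frame respiratory rate values with matching confidence values over the selected averaging time; the specific weighting scheme and function name are assumptions of the sketch.

```python
def postprocess(rr_values, confidences):
    """Combine per-frame RR and confidence values into overall outputs.

    rr_values, confidences: equal-length sequences covering the frames that
    fall within the selected averaging time (e.g., 30 seconds).
    Returns (overall_rr, overall_confidence).
    """
    total_weight = sum(confidences)
    if total_weight == 0:
        return 0.0, 0.0
    # Confidence-weighted average of the respiratory rate values.
    overall_rr = sum(r * c for r, c in zip(rr_values, confidences)) / total_weight
    # Overall confidence: fraction of nonzero confidence values in the buffer.
    overall_conf = sum(1 for c in confidences if c > 0) / len(confidences)
    return overall_rr, overall_conf

# Example: five frames, two with zero confidence, gives 3/5 overall confidence.
print(postprocess([12.0, 13.0, 0.0, 12.5, 0.0], [0.9, 0.8, 0.0, 0.7, 0.0]))
```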


An alternative embodiment to the arbitrator 1600 is illustrated in FIG. 16B. The arbitrator 1630 includes decision logic for determining respiratory rate and confidence values based on various inputs. The arbitrator 1630 receives time domain parameters, long frequency domain parameters, and short frequency domain parameters. From these parameters, the arbitrator 1630 outputs an overall respiratory rate and confidence value or indication.


In one embodiment, the time domain parameters include each respiratory rate value calculated during a given window, corresponding time stamps for the respiratory rate values, confidence values for each respiratory rate value, and an occurrence value for each respiratory rate value (see FIGS. 11B through 11D). The time domain parameters can also include the average time domain respiratory rate and time domain confidence described above for a given frame or processing window.


The long frequency domain parameters can include the frequency domain respiratory rate and confidence value calculated for a given frame or processing window. Likewise, the long frequency domain parameters can also include time stamps for the respiratory rate values and optionally occurrence values (which may be 1). In contrast, the short frequency domain parameters can include multiple respiratory rate and corresponding confidence values in a given time window. These parameters are referred to as short because a shorter time window (within the main frame or processing window) is used to calculate multiple respiratory rate values.


An illustration of calculating the short frequency domain parameters is shown with respect to FIG. 16C. In FIG. 16C, a frame 1640 or primary processing window is shown. The frame can include samples over a period of time, such as several seconds (e.g., 30 seconds). Subframes 1650, 1660, and 1670 are superimposed on the frame 1640 to illustrate subdivisions of the samples in the frame 1640. The subframes 1650, 1660, and 1670 are used to calculate the short frequency domain parameters. In the depicted embodiment, three subframes 1650, 1660, and 1670 are shown; however, two, four, or more subframes can be used to calculate the short frequency domain parameters.


In one embodiment, the time domain and frequency domain envelopes are calculated for the frame 1640, using the techniques described above. The long frequency domain parameters are thus derived from the frame 1640. The short frequency domain parameters are derived from frequency envelopes for each subframe 1650, 1660, and 1670. The frequency domain processor described above can calculate a separate respiratory rate and associated confidence value for each subframe 1650, 1660, and 1670, as well as associated time stamps and optionally occurrence values (which may be 1). These short frequency domain parameters can be used in decision logic of the arbitrator 1630 of FIG. 16B to improve respiratory rate calculations.


In certain embodiments, a logical module used to calculate the short frequency domain parameters can be considered a second frequency domain processor or engine. Thus, the respiration processor 314 of FIG. 3 can output respiratory rate values from three engines, instead of two. More engines can be used in other embodiments.


Example features of the arbitrator 1630 can be explained in the context of FIGS. 17A through 18. In FIG. 17A, a point cloud plot 1700 is depicted. The point cloud plot 1700 graphs respiratory rate values over time. Thus, points 1710 in the point cloud plot 1700 represent discrete respiratory rate values. The point cloud plot 1700 can, but need not, be actually generated, but is used herein as a tool to describe arbitration between different time domain and frequency domain respiration processors.


The points 1710 in the point cloud plot 1700 represent time domain respiratory rate values, long frequency domain respiratory rate values, and short frequency domain respiratory rate values, as indicated by the legend 1720. The time axis of the plot 1700 is divided into frames or processing windows 1730. In certain embodiments, one or more time domain respiratory rate values can be calculated for each window 1730. A single long frequency domain respiratory rate value can be calculated in each window 1730, and multiple short frequency domain respiratory rate values can be calculated in each window 1730 (see FIG. 16C).


The point cloud plot 1700 reflects trends in the respiratory rate values over time. These trends can be analyzed to determine one or more measures of how accurate respiratory rate values are for each time domain and frequency domain processor. An analysis window 1740 can be used to analyze these trends and other statistical measurements. The analysis window 1740 can be the same as the current processing window 1730 or can cover multiple processing windows 1730.


Respiratory rate data from example analysis windows is shown in more detail in FIGS. 17B and 17C. FIG. 17B depicts a plot 1730 of respiratory rate over time and is a subplot of the point cloud plot 1700. FIG. 17C depicts a similar plot 1750 that is also a subplot of the point cloud plot 1700. Points 1732, 1752 in each plot can represent respiratory rate values obtained from one of the respiration engines (e.g., time domain, long frequency domain, or short frequency domain). A fitted curve 1740, 1760 is shown in each plot. The fitted curves 1740, 1760 represent curves constructed to have a best (or estimated best) fit to the points 1732, 1752 in each plot.


Generally, the arbitrator 1630 of FIG. 16B can fit curves to respiratory rate points for each respiration engine and each analysis window 1740 (see FIG. 17A). Curves can be fit using any of a variety of methods, such as linear regression, spline functions, Bezier curves, or the like. In one embodiment, the curves are fit at least in part by taking confidence values for each respiratory rate value into account. The confidence values can act as weights in the point cloud, with higher confidence values giving greater weight to the points in the point cloud. Thus, for respiratory rate values having higher confidence, the curve fit to those points can be closer to those points than for respiratory rate values having lower confidence.


Each curve can be assessed by the arbitrator 1630 to determine the quality of an engine's respiratory rate values. This quality can be analyzed to score the outputs of the different respiration engines. One measure of quality is the stability, or goodness-of-fit, of the curves 1740, 1760. In the plot 1730, the curve 1740 has a relatively large degree of error due to the wide dispersion of the points 1732. Thus, the stability of this curve 1740 is relatively low. In contrast, the points 1752 in the plot 1750 are less dispersed, and the stability of the curve 1760 is therefore higher. Stability can be calculated, for example, with the minimum mean square error or some other similar measurement. A curve with higher stability reflects a better set of respiratory rate data for a particular engine. The arbitrator 1630 of FIG. 16B can use this stability information, at least in part, to select or arbitrate between different respiratory rate values from different engines.


In addition, a slope of the curves 1740, 1760 can be used to assess the quality of the respiratory rate values. A more positive or more negative slope can reflect a trend among respiratory rate values within a processing window. Because a processing window is a relatively short amount of time in certain embodiments, a patient's respiration rate is likely not changing significantly. Thus, a trend can reflect lower quality respiratory rate data. Conversely, a flatter curve can reflect possibly more accurate respiratory rate data. The curve 1740 has a higher slope than the curve 1760, reflecting possibly lower quality respiratory rate data.
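For illustration only, one way to obtain a slope and a stability measure from a window of respiratory rate points is sketched below using a confidence-weighted straight-line fit; weighted least squares, the mean-squared-error-based stability score, and the function name are assumptions of this sketch, and other fitting and goodness-of-fit methods could be used, as noted above.

```python
import numpy as np

def fit_quality(times, rr_values, confidences):
    """Fit a confidence-weighted line to RR points and score the fit.

    Returns (slope, stability), where slope is the fitted trend in BPM per
    second and stability is 1 / (1 + weighted MSE), so flatter, tighter
    point clouds score closer to 1.
    """
    t = np.asarray(times, dtype=float)
    r = np.asarray(rr_values, dtype=float)
    w = np.asarray(confidences, dtype=float)
    # Weighted least-squares fit of r = slope * t + intercept.
    slope, intercept = np.polyfit(t, r, deg=1, w=w)
    residuals = r - (slope * t + intercept)
    mse = float(np.average(residuals ** 2, weights=w))
    stability = 1.0 / (1.0 + mse)
    return float(slope), stability

# Example: a tight, flat point cloud scores higher stability and a smaller
# slope than a dispersed, trending one (compare FIGS. 17B and 17C).
t = [0, 5, 10, 15, 20, 25]
tight = [12.1, 12.0, 11.9, 12.2, 12.0, 12.1]
loose = [9.0, 14.0, 10.5, 16.0, 11.0, 17.5]
conf = [0.9, 0.8, 0.9, 0.7, 0.8, 0.9]
print(fit_quality(t, tight, conf))
print(fit_quality(t, loose, conf))
```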



FIG. 18 illustrates a decision logic process 1800 for selecting respiratory rate measurements. The process 1800 ties together the techniques described above with respect to FIGS. 16B through 17C and can be implemented by the arbitrator 1630 of FIG. 16B. In certain embodiments, some or all features of the process 1800 can also be combined with features of the arbitrator 1600 described above with respect to FIG. 16A.


At block 1802, for each respiration processor or engine, one or more respiratory rate and confidence values are computed in a processing window. Thus, for example, time domain respiratory rate and confidence values can be calculated, long frequency domain values can be calculated, and short frequency domain values can be calculated. Blocks 1804 through 1820 of the process 1800 are described from the perspective of a single engine. At block 1804, a curve is fit to the respiratory rate values, using any of the techniques described above with respect to FIGS. 17A through 17C. An actual curve need not be drawn on a display, although one can be; rather, values for such a curve can be stored in memory (volatile or nonvolatile). Stability and/or trend characteristics of the curve are calculated at block 1806.


A single confidence value is computed for the processing window at block 1808. This confidence value can be the average of the confidence values of each respiratory rate value calculated in the time domain (with a possible weighting applied with the occurrence value). The confidence value can also be an average for the short frequency domain engine. The confidence value from the long frequency domain engine can be the single confidence value computed (or rather, accessed) at block 1808.


Similarly, a single respiratory rate value is computed for each engine at block 1810. The time domain respiratory rate value can be computed using a simple or weighted average (see FIGS. 11B through 11D). Similarly, the short frequency domain respiratory rate values can be computed with a simple or weighted average, and the long frequency domain respiratory rate value can simply be accessed (e.g., from memory).
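A minimal sketch of the reduction performed at blocks 1808 and 1810 (assuming simple or weighted averaging; the helper name is hypothetical) might be:

```python
# Hypothetical helper that collapses a window's respiratory rate values (or
# confidence values) to a single number by simple or weighted averaging.
import numpy as np

def window_value(values, weights=None):
    """Return a single value for the window.

    With one input value (e.g., from a long frequency domain engine) that
    value is passed through; otherwise a simple or weighted average is used.
    """
    v = np.asarray(values, dtype=float)
    if v.size == 1:
        return float(v[0])
    if weights is None:
        return float(np.mean(v))
    return float(np.average(v, weights=np.asarray(weights, dtype=float)))
```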


The computed respiratory rate values from each engine are scored at block 1812. This scoring can be based on the stability and/or trend values computed above, among other factors. Other factors might include, for example, how close the respiratory rate values from the different engines are to one another. Two engines with close respiratory rate values might be given bonus points in the scoring algorithm, while an outlier is not given bonus points (or is docked points). The score can also be based on the confidence values.
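One hypothetical scoring heuristic along these lines (the agreement threshold, bonus amount, and the particular combination of stability, flatness, and confidence are illustrative assumptions, not values from this disclosure) is sketched below:

```python
# Hypothetical per-engine scoring: each engine supplies its window rate,
# confidence, stability, and flatness; engines that agree get a bonus.
def score_engines(engines, agreement_bpm=2.0, bonus=0.2):
    """Return a score for each engine's window respiratory rate.

    `engines` maps an engine name to a dict with 'rate' (breaths per
    minute), 'confidence', 'stability', and 'flatness' (each in [0, 1]).
    """
    scores = {}
    for name, e in engines.items():
        base = e['confidence'] * e['stability'] * e['flatness']
        agrees = any(
            other != name and abs(e['rate'] - o['rate']) <= agreement_bpm
            for other, o in engines.items()
        )
        scores[name] = base + (bonus if agrees else 0.0)
    return scores
```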


An overall respiratory rate value is selected from the scored respiratory rates at block 1814. The overall respiratory rate value can be selected based at least partly on the scores calculated above with respect to block 1812. Confidence values can be used as a tiebreaker between the respiratory rate values of different engines in certain embodiments. In one embodiment, respiratory rate values from different engines can be combined or averaged to produce the overall respiratory rate value. The combination of the respiratory rate values can be done as a weighted average, for example, using confidence values (or derivatives thereof) as weights.
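A sketch of the selection at blocks 1814 and 1816, reusing the hypothetical engine dictionary above and showing both the pick-the-best variant (with confidence as a tiebreaker) and, in comments, a confidence-weighted blend:

```python
# Hypothetical selection of the overall rate and confidence from the scored
# engines; neither variant is asserted to be the patented method.
def overall_rate_and_confidence(engines, scores):
    """Return (overall_rate, overall_confidence) for the window."""
    best = max(engines, key=lambda n: (scores[n], engines[n]['confidence']))
    rate = engines[best]['rate']
    confidence = engines[best]['confidence']
    # Alternative: blend all engines using confidence-derived weights.
    # total = sum(e['confidence'] for e in engines.values())
    # rate = sum(e['rate'] * e['confidence'] for e in engines.values()) / total
    # confidence = total / len(engines)
    return rate, confidence
```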


An overall confidence value is calculated at block 1816. The overall confidence value can be the confidence value of the respiratory rate value that was selected from a particular engine as the overall respiratory rate. If the respiratory rate values from different engines were combined at block 1814, the confidence values may likewise be combined.


As described above, the arbitrator 1630 can average respiratory rate values and confidence values over time, such as a plurality of processing windows. If the overall confidence value is low and the respiratory rate value is therefore considered unreliable, in one embodiment the arbitrator 1630 lengthens this averaging time. The lengthened averaging time can reduce the impact of invalid or low quality respiratory rate values on the overall respiratory rate value output to a display of a patient monitor. Conversely, higher confidence values can cause the arbitrator 1630 to decrease the averaging time.
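The confidence-dependent averaging time could be sketched, purely hypothetically (the threshold and window lengths below are illustrative assumptions), as:

```python
# Hypothetical choice of averaging time: low confidence lengthens the
# averaging window so unreliable values have less effect on the display.
def averaging_window_s(overall_confidence, short_s=30.0, long_s=120.0,
                       low_threshold=0.5):
    """Return how many seconds of overall values to average."""
    return long_s if overall_confidence < low_threshold else short_s
```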


Terminology

The modules described herein may, in certain embodiments, be implemented as software modules, hardware modules, or a combination thereof. In general, the word "module," as used herein, can refer to logic embodied in hardware or firmware or to a collection of software instructions executable on a processor. Additionally, the modules or components thereof may be implemented in analog circuitry in some embodiments.


Conditional language used herein, such as, among others, “can,” “could,” “might,” “may,” “e.g.,” and the like, unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or states. Thus, such conditional language is not generally intended to imply that features, elements and/or states are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without author input or prompting, whether these features, elements and/or states are included or are to be performed in any particular embodiment.


Depending on the embodiment, certain acts, events, or functions of any of the methods described herein can be performed in a different sequence, can be added, merged, or left out all together (e.g., not all described acts or events are necessary for the practice of the method). Moreover, in certain embodiments, acts or events can be performed concurrently, e.g., through multi-threaded processing, interrupt processing, or multiple processors or processor cores, rather than sequentially.


The various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the embodiments disclosed herein can be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. The described functionality can be implemented in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the disclosure.


The various illustrative logical blocks, modules, and circuits described in connection with the embodiments disclosed herein can be implemented or performed with a general purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general purpose processor can be a microprocessor, but in the alternative, the processor can be any conventional processor, controller, microcontroller, or state machine. A processor can also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.


The blocks of the methods and algorithms described in connection with the embodiments disclosed herein can be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module can reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, a hard disk, a removable disk, a CD-ROM, or any other form of computer-readable storage medium known in the art. An exemplary storage medium is coupled to a processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium can be integral to the processor. The processor and the storage medium can reside in an ASIC. The ASIC can reside in a user terminal. In the alternative, the processor and the storage medium can reside as discrete components in a user terminal.


While the above detailed description has shown, described, and pointed out novel features as applied to various embodiments, it will be understood that various omissions, substitutions, and changes in the form and details of the devices or algorithms illustrated can be made without departing from the spirit of the disclosure. As will be recognized, certain embodiments of the inventions described herein can be embodied within a form that does not provide all of the features and benefits set forth herein, as some features can be used or practiced separately from others. The scope of certain inventions disclosed herein is indicated by the appended claims rather than by the foregoing description. All changes which come within the meaning and range of equivalency of the claims are to be embraced within their scope.

Claims
  • 1. A system for determining a respiration rate of a patient using an acoustic signal received from an acoustic sensor adapted to monitor the patient, the system comprising: an input configured to receive the acoustic signal from the acoustic sensor; and one or more hardware processors configured to: generate a time domain envelope signal from the acoustic signal, individually tag each local extremum of a plurality of local extrema in the time domain envelope signal as potentially corresponding to an inspiration or an expiration of the patient, determine to reject at least one local extremum of the plurality of local extrema as not corresponding to the inspiration or the expiration, determine in real time a respiratory rate from the plurality of local extrema without using the at least one local extremum that was determined to be rejected, and output the respiratory rate.
  • 2. The system of claim 1, wherein the one or more hardware processors are configured to determine to reject a first local extremum of the plurality of local extrema according to a time difference between the first local extremum and a second local extremum of the plurality of local extrema, the at least one local extremum that was determined to be rejected comprising the first local extremum.
  • 3. The system of claim 2, wherein the first local extremum and the second local extremum are consecutively-occurring local extrema in the time domain envelope signal.
  • 4. The system of claim 2, wherein the one or more hardware processors are configured to determine to reject the first local extremum further according to a magnitude difference between a first magnitude of the first local extremum and a second magnitude of the second local extremum.
  • 5. The system of claim 2, wherein the one or more hardware processors are configured to: determine to reject the first local extremum from a comparison of the time difference and a threshold, and adjust the threshold.
  • 6. The system of claim 5, wherein the one or more hardware processors are configured to: determine that the patient is a neonate, and adjust the threshold responsive to determining that the patient is the neonate.
  • 7. The system of claim 1, wherein the one or more hardware processors are configured to determine to reject a first local extremum of the plurality of local extrema according to a change in a magnitude of the time domain envelope signal between the first local extremum and a second local extremum of the plurality of local extrema, the at least one local extremum that was determined to be rejected comprising the first local extremum.
  • 8. The system of claim 1, wherein the one or more hardware processors are configured to determine to reject a first local extremum of the plurality of local extrema because the first local extremum is at a beginning or an end of a processing unit, the processing unit corresponding to a set of the plurality of local extrema that the one or more hardware processors are configured to process as a batch, the at least one local extremum that was determined to be rejected comprising the first local extremum.
  • 9. The system of claim 1, wherein the one or more hardware processors are configured to determine to reject a first local extremum of the plurality of local extrema according to a width of the time domain envelope signal around the first local extremum, the at least one local extremum that was determined to be rejected comprising the first local extremum.
  • 10. The system of claim 1, wherein the one or more hardware processors are configured to: determine a statistical parameter from processing a set of the plurality of local extrema as a batch, the set of the plurality of local extrema not comprising a first local extremum of the plurality of local extrema, and determine to reject the first local extremum from a deviation of a characteristic of the first local extremum relative to the statistical parameter, the at least one local extremum that was determined to be rejected comprising the first local extremum.
  • 11. The system of claim 1, wherein the one or more hardware processors are configured to: filter the time domain envelope signal with a filter to smooth the time domain envelope signal, the filter having a cutoff frequency, and adjust the cutoff frequency.
  • 12. The system of claim 11, wherein the one or more hardware processors are configured to adjust the cutoff frequency responsive to the respiratory rate.
  • 13. The system of claim 1, wherein the one or more hardware processors are configured to individually tag each local extremum of the plurality of local extrema by identifying any local extremum of the plurality of local extrema in the time domain envelope signal that are between a lower threshold and an upper threshold, the lower threshold corresponding to a noise floor.
  • 14. The system of claim 1, further comprising: the acoustic sensor; or a display configured to present the respiratory rate.
  • 15. A method for determining a respiration rate of a patient using an acoustic signal received from an acoustic sensor monitoring the patient, the method comprising: detecting, using the acoustic sensor, the acoustic signal; generating, by one or more hardware processors, a time domain envelope signal from the acoustic signal; individually tagging, by the one or more hardware processors, each local extremum of a plurality of local extrema in the time domain envelope signal as potentially corresponding to an inspiration or an expiration of the patient; determining, by the one or more hardware processors, to reject at least one local extremum of the plurality of local extrema as not corresponding to the inspiration or the expiration; determining, by the one or more hardware processors, in real time a respiratory rate from the plurality of local extrema without using the at least one local extremum that was determined to be rejected; and presenting the respiratory rate on a display.
  • 16. The method of claim 15, wherein said determining to reject the at least one local extremum comprises determining to reject a first local extremum of the plurality of local extrema according to a time difference between the first local extremum and a second local extremum of the plurality of local extrema, the first local extremum and the second local extremum being consecutively-occurring local extrema in the time domain envelope signal, the at least one local extremum that was determined to be rejected comprising the first local extremum.
  • 17. The method of claim 16, wherein said determining to reject the at least one local extremum comprises determining to reject the first local extremum from a comparison of the time difference and a threshold, and further comprising: determining that the patient is a neonate; and adjusting the threshold responsive to determining that the patient is the neonate.
  • 18. The method of claim 15, wherein said determining to reject the at least one local extremum comprises determining to reject a first local extremum of the plurality of local extrema according to a change in a magnitude of the time domain envelope signal between the first local extremum and a second local extremum of the plurality of local extrema, the first local extremum and the second local extremum being consecutively-occurring local extrema in the time domain envelope signal, the at least one local extremum that was determined to be rejected comprising the first local extremum.
  • 19. The method of claim 15, further comprising: filtering the time domain envelope signal with a filter to smooth the time domain envelope signal, the filter having a cutoff frequency; and adjusting the cutoff frequency responsive to the respiratory rate.
  • 20. The method of claim 15, wherein said tagging comprises identifying any local extremum of the plurality of local extrema in the time domain envelope signal that are between a lower threshold and an upper threshold.
RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 15/669,201, filed Aug. 4, 2017, entitled "RESPIRATION PROCESSOR," which is a continuation of U.S. patent application Ser. No. 12/905,384, filed Oct. 15, 2010, entitled "RESPIRATION PROCESSOR," which claims priority from U.S. Provisional Patent Application No. 61/252,594, filed Oct. 16, 2009, entitled "Respiration Processor," and from U.S. Provisional Patent Application No. 61/326,200, filed Apr. 20, 2010, entitled "Respiration Processor," the disclosures of which are hereby incorporated by reference in their entirety.

US Referenced Citations (1444)
Number Name Date Kind
3682161 Alibert Aug 1972 A
4127749 Atoji et al. Nov 1978 A
4326143 Guth et al. Apr 1982 A
4507653 Bayer Mar 1985 A
4537200 Widrow Aug 1985 A
4685140 Mount, II Aug 1987 A
4714341 Hamaguri Dec 1987 A
4848901 Hood, Jr. Jul 1989 A
4884809 Rowan Dec 1989 A
4958638 Sharpe et al. Sep 1990 A
4960128 Gordon et al. Oct 1990 A
4964408 Hink et al. Oct 1990 A
5033032 Houghtaling Jul 1991 A
5041187 Hink et al. Aug 1991 A
5069213 Polczynski Dec 1991 A
5143078 Mather et al. Sep 1992 A
5163438 Gordon et al. Nov 1992 A
5273036 Kronberg et al. Dec 1993 A
5309922 Schechter et al. May 1994 A
5319355 Russek Jun 1994 A
5337744 Branigan Aug 1994 A
5341805 Stavridi et al. Aug 1994 A
5353798 Sieben Oct 1994 A
D353195 Savage et al. Dec 1994 S
D353196 Savage et al. Dec 1994 S
5377302 Tsiang Dec 1994 A
5377676 Vari et al. Jan 1995 A
D359546 Savage et al. Jun 1995 S
5431170 Mathews Jul 1995 A
5436499 Namavar et al. Jul 1995 A
D361840 Savage et al. Aug 1995 S
D362063 Savage et al. Sep 1995 S
5448996 Bellin et al. Sep 1995 A
5452717 Branigan et al. Sep 1995 A
D363120 Savage et al. Oct 1995 S
5456252 Vari et al. Oct 1995 A
5479934 Imran Jan 1996 A
5482036 Diab et al. Jan 1996 A
5490505 Diab et al. Feb 1996 A
5494043 O'Sullivan et al. Feb 1996 A
5533511 Kaspari et al. Jul 1996 A
5534851 Russek Jul 1996 A
5561275 Savage et al. Oct 1996 A
5562002 Lalin Oct 1996 A
5590649 Caro et al. Jan 1997 A
5602924 Durand et al. Feb 1997 A
5632272 Diab et al. May 1997 A
5638403 Birchler et al. Jun 1997 A
5638816 Kiani-Azarbayjany et al. Jun 1997 A
5638818 Diab et al. Jun 1997 A
5645440 Tobler et al. Jul 1997 A
5671191 Gerdt Sep 1997 A
5671914 Kalkhoran et al. Sep 1997 A
5685299 Diab et al. Nov 1997 A
5724983 Selker et al. Mar 1998 A
5726440 Kalkhoran et al. Mar 1998 A
D393830 Tobler et al. Apr 1998 S
5743262 Lepper, Jr. et al. Apr 1998 A
5747806 Khalil et al. May 1998 A
5750994 Schlager May 1998 A
5758644 Diab et al. Jun 1998 A
5760910 Lepper, Jr. et al. Jun 1998 A
5769785 Diab et al. Jun 1998 A
5782757 Diab et al. Jul 1998 A
5785659 Caro et al. Jul 1998 A
5791347 Flaherty et al. Aug 1998 A
5810734 Caro et al. Sep 1998 A
5819007 Elghazzawi Oct 1998 A
5823950 Diab et al. Oct 1998 A
5830131 Caro et al. Nov 1998 A
5833618 Caro et al. Nov 1998 A
5860919 Kiani-Azarbayjany et al. Jan 1999 A
5865168 Isaza Feb 1999 A
5865736 Baker, Jr. et al. Feb 1999 A
5890929 Mills et al. Apr 1999 A
5904654 Wohltmann et al. May 1999 A
5919134 Diab Jul 1999 A
5928156 Krumbiegel Jul 1999 A
5934925 Tobler et al. Aug 1999 A
5940182 Lepper, Jr. et al. Aug 1999 A
5987343 Kinast Nov 1999 A
5995855 Kiani et al. Nov 1999 A
5997343 Mills et al. Dec 1999 A
6002952 Diab et al. Dec 1999 A
6010937 Karam et al. Jan 2000 A
6011986 Diab et al. Jan 2000 A
6027452 Flaherty et al. Feb 2000 A
6029665 Berthon-Jones Feb 2000 A
6036642 Diab et al. Mar 2000 A
6040578 Malin et al. Mar 2000 A
6045509 Caro et al. Apr 2000 A
6064910 Andersson et al. May 2000 A
6066204 Haven May 2000 A
6067462 Diab et al. May 2000 A
6081735 Diab et al. Jun 2000 A
6083172 Baker et al. Jul 2000 A
6088607 Diab et al. Jul 2000 A
6091973 Colla et al. Jul 2000 A
6110522 Lepper, Jr. et al. Aug 2000 A
6112171 Sugiyama et al. Aug 2000 A
6115673 Malin et al. Sep 2000 A
6124597 Shehada Sep 2000 A
6128521 Marro et al. Oct 2000 A
6129675 Jay Oct 2000 A
6138675 Berthon-Jones Oct 2000 A
6139505 Murphy Oct 2000 A
6144868 Parker Nov 2000 A
6151516 Kiani-Azarbayjany et al. Nov 2000 A
6152754 Gerhardt et al. Nov 2000 A
6157850 Diab et al. Dec 2000 A
6165005 Mills et al. Dec 2000 A
6168568 Gavriely Jan 2001 B1
6178343 Bindszus et al. Jan 2001 B1
6184521 Coffin, IV et al. Feb 2001 B1
6206830 Diab et al. Mar 2001 B1
6229856 Diab et al. May 2001 B1
6232609 Snyder et al. May 2001 B1
6236872 Diab et al. May 2001 B1
6241683 Macklem et al. Jun 2001 B1
6248083 Smith et al. Jun 2001 B1
6253097 Aronow et al. Jun 2001 B1
6254551 Varis Jul 2001 B1
6255708 Sudharsanan et al. Jul 2001 B1
6256523 Diab et al. Jul 2001 B1
6261238 Gavriely Jul 2001 B1
6263222 Diab et al. Jul 2001 B1
6278522 Lepper, Jr. et al. Aug 2001 B1
6280213 Tobler et al. Aug 2001 B1
6280381 Malin et al. Aug 2001 B1
6285896 Tobler et al. Sep 2001 B1
6301493 Marro et al. Oct 2001 B1
6308089 von der Ruhr et al. Oct 2001 B1
6317627 Ennen et al. Nov 2001 B1
6321100 Parker Nov 2001 B1
6325761 Jay Dec 2001 B1
6331162 Mitchell Dec 2001 B1
6334065 Al-Ali et al. Dec 2001 B1
6343224 Parker Jan 2002 B1
6349228 Kiani et al. Feb 2002 B1
6360114 Diab et al. Mar 2002 B1
6368283 Xu et al. Apr 2002 B1
6371921 Caro et al. Apr 2002 B1
6377829 Al-Ali Apr 2002 B1
6383143 Rost May 2002 B1
6388240 Schulz et al. May 2002 B2
6397091 Diab et al. May 2002 B2
6411373 Garside et al. Jun 2002 B1
6415167 Blank et al. Jul 2002 B1
6430437 Marro Aug 2002 B1
6430525 Weber et al. Aug 2002 B1
6443907 Mansy et al. Sep 2002 B1
6463311 Diab Oct 2002 B1
6470199 Kopotic et al. Oct 2002 B1
6486588 Doron et al. Nov 2002 B2
6487429 Hockersmith et al. Nov 2002 B2
6491647 Bridger et al. Dec 2002 B1
6501975 Diab et al. Dec 2002 B2
6505059 Kollias et al. Jan 2003 B1
6515273 Al-Ali Feb 2003 B2
6517497 Rymut et al. Feb 2003 B2
6519487 Parker Feb 2003 B1
6525386 Mills et al. Feb 2003 B1
6526300 Kiani et al. Feb 2003 B1
6534012 Hazen et al. Mar 2003 B1
6541756 Schulz et al. Apr 2003 B2
6542764 Al-Ali et al. Apr 2003 B1
6580086 Schulz et al. Jun 2003 B1
6584336 Ali et al. Jun 2003 B1
6587196 Stippick et al. Jul 2003 B1
6587199 Luu Jul 2003 B1
6595316 Cybulski et al. Jul 2003 B2
6597932 Tian et al. Jul 2003 B2
6597933 Kiani et al. Jul 2003 B2
6606511 Ali et al. Aug 2003 B1
6632181 Flaherty et al. Oct 2003 B2
6635559 Greenwald et al. Oct 2003 B2
6639668 Trepagnier Oct 2003 B1
6640116 Diab Oct 2003 B2
6640117 Makarewicz et al. Oct 2003 B2
6643530 Diab et al. Nov 2003 B2
6647280 Bahr et al. Nov 2003 B2
6650917 Diab et al. Nov 2003 B2
6654624 Diab et al. Nov 2003 B2
6658276 Kiani et al. Dec 2003 B2
6659960 Derksen et al. Dec 2003 B2
6661161 Lanzo et al. Dec 2003 B1
6671531 Al-Ali et al. Dec 2003 B2
6678543 Diab et al. Jan 2004 B2
6684090 Ali et al. Jan 2004 B2
6684091 Parker Jan 2004 B2
6697656 Al-Ali Feb 2004 B1
6697657 Shehada et al. Feb 2004 B1
6697658 Al-Ali Feb 2004 B2
RE38476 Diab et al. Mar 2004 E
6699194 Diab et al. Mar 2004 B1
6709402 Dekker Mar 2004 B2
6714804 Al-Ali et al. Mar 2004 B2
RE38492 Diab et al. Apr 2004 E
6721582 Trepagnier et al. Apr 2004 B2
6721585 Parker Apr 2004 B1
6725074 Kastle Apr 2004 B1
6725075 Al-Ali Apr 2004 B2
6728560 Kollias et al. Apr 2004 B2
6735459 Parker May 2004 B2
6738652 Mattu et al. May 2004 B2
6745060 Diab et al. Jun 2004 B2
6754516 Mannheimer Jun 2004 B2
6760607 Al-Ali Jul 2004 B2
6766038 Sakuma et al. Jul 2004 B1
6770028 Ali et al. Aug 2004 B1
6771994 Kiani et al. Aug 2004 B2
6788965 Ruchti et al. Sep 2004 B2
6792300 Diab et al. Sep 2004 B1
6813511 Diab et al. Nov 2004 B2
6816241 Grubisic Nov 2004 B2
6816741 Diab Nov 2004 B2
6822564 Al-Ali Nov 2004 B2
6826419 Diab et al. Nov 2004 B2
6830711 Mills et al. Dec 2004 B2
6839581 El-Solh et al. Jan 2005 B1
6850787 Weber et al. Feb 2005 B2
6850788 Al-Ali Feb 2005 B2
6852083 Caro et al. Feb 2005 B2
6861639 Al-Ali Mar 2005 B2
6869402 Arnold Mar 2005 B2
6876931 Lorenz et al. Apr 2005 B2
6898452 Al-Ali et al. May 2005 B2
6920345 Al-Ali et al. Jul 2005 B2
6931268 Kiani-Azarbayjany et al. Aug 2005 B1
6934570 Kiani et al. Aug 2005 B2
6939305 Flaherty et al. Sep 2005 B2
6943348 Coffin, IV Sep 2005 B1
6950687 Al-Ali Sep 2005 B2
6956649 Acosta et al. Oct 2005 B2
6961598 Diab Nov 2005 B2
6970792 Diab Nov 2005 B1
6979812 Al-Ali Dec 2005 B2
6985764 Mason et al. Jan 2006 B2
6990364 Ruchti et al. Jan 2006 B2
6993371 Kiani et al. Jan 2006 B2
6996427 Ali et al. Feb 2006 B2
6998247 Monfre et al. Feb 2006 B2
6999904 Weber et al. Feb 2006 B2
7003338 Weber et al. Feb 2006 B2
7003339 Diab et al. Feb 2006 B2
7015451 Dalke et al. Mar 2006 B2
7024233 Ali et al. Apr 2006 B2
7027849 Al-Ali Apr 2006 B2
7030749 Al-Ali Apr 2006 B2
7039449 Al-Ali May 2006 B2
7041060 Flaherty et al. May 2006 B2
7044918 Diab May 2006 B2
7067893 Mills et al. Jun 2006 B2
D526719 Richie, Jr. et al. Aug 2006 S
7096052 Mason et al. Aug 2006 B2
7096054 Abdul-Hafiz et al. Aug 2006 B2
7096060 Arand et al. Aug 2006 B2
D529616 Deros et al. Oct 2006 S
7132641 Schulz et al. Nov 2006 B2
7133710 Acosta et al. Nov 2006 B2
7142901 Kiani et al. Nov 2006 B2
7149561 Diab Dec 2006 B2
7186966 Al-Ali Mar 2007 B2
7190261 Al-Ali Mar 2007 B2
7194306 Turcott Mar 2007 B1
7215984 Diab May 2007 B2
7215986 Diab May 2007 B2
7221971 Diab May 2007 B2
7225006 Al-Ali et al. May 2007 B2
7225007 Al-Ali May 2007 B2
RE39672 Shehada et al. Jun 2007 E
7239905 Kiani-Azarbayjany et al. Jul 2007 B2
7245953 Parker Jul 2007 B1
7254429 Schurman et al. Aug 2007 B2
7254431 Al-Ali Aug 2007 B2
7254433 Diab et al. Aug 2007 B2
7254434 Schulz et al. Aug 2007 B2
7267652 Coyle et al. Sep 2007 B2
7272425 Al-Ali Sep 2007 B2
7274955 Kiani et al. Sep 2007 B2
D554263 Al-Ali Oct 2007 S
7280858 Al-Ali et al. Oct 2007 B2
7289835 Mansfield et al. Oct 2007 B2
7292883 De Felice et al. Nov 2007 B2
7295866 Al-Ali Nov 2007 B2
7328053 Diab et al. Feb 2008 B1
7332784 Mills et al. Feb 2008 B2
7340287 Mason et al. Mar 2008 B2
7341559 Schulz et al. Mar 2008 B2
7343186 Lamego et al. Mar 2008 B2
D566282 Al-Ali et al. Apr 2008 S
7355512 Al-Ali Apr 2008 B1
7356365 Schurman Apr 2008 B2
7361146 Bharmi et al. Apr 2008 B1
7371981 Abdul-Hafiz May 2008 B2
7373193 Al-Ali et al. May 2008 B2
7373194 Weber et al. May 2008 B2
7376453 Diab et al. May 2008 B1
7377794 Al Ali et al. May 2008 B2
7377899 Weber et al. May 2008 B2
7383070 Diab et al. Jun 2008 B2
7395158 Monfre et al. Jul 2008 B2
7398115 Lynn Jul 2008 B2
7415297 Al-Ali et al. Aug 2008 B2
7428432 Ali et al. Sep 2008 B2
7438683 Al-Ali et al. Oct 2008 B2
7440787 Diab Oct 2008 B2
7454240 Diab et al. Nov 2008 B2
7467002 Weber et al. Dec 2008 B2
7469157 Diab et al. Dec 2008 B2
7471969 Diab et al. Dec 2008 B2
7471971 Diab et al. Dec 2008 B2
7483729 Al-Ali et al. Jan 2009 B2
7483730 Diab et al. Jan 2009 B2
7489958 Diab et al. Feb 2009 B2
7496391 Diab et al. Feb 2009 B2
7496393 Diab et al. Feb 2009 B2
D587657 Al-Ali et al. Mar 2009 S
7499741 Diab et al. Mar 2009 B2
7499835 Weber et al. Mar 2009 B2
7500950 Al-Ali et al. Mar 2009 B2
7509154 Diab et al. Mar 2009 B2
7509494 Al-Ali Mar 2009 B2
7510849 Schurman et al. Mar 2009 B2
7514725 Wojtczuk et al. Apr 2009 B2
7519406 Blank et al. Apr 2009 B2
7526328 Diab et al. Apr 2009 B2
D592507 Wachman et al. May 2009 S
7530942 Diab May 2009 B1
7530949 Al Ali et al. May 2009 B2
7530955 Diab et al. May 2009 B2
7539533 Tran May 2009 B2
7563110 Al-Ali et al. Jul 2009 B2
7593230 Abul-Haj et al. Sep 2009 B2
7596398 Al-Ali et al. Sep 2009 B2
7606608 Blank et al. Oct 2009 B2
7618375 Flaherty Nov 2009 B2
7620674 Ruchti et al. Nov 2009 B2
D606659 Kiani et al. Dec 2009 S
7629039 Eckerbom et al. Dec 2009 B2
7640140 Ruchti et al. Dec 2009 B2
7647083 Al-Ali et al. Jan 2010 B2
D609193 Al-Ali et al. Feb 2010 S
D614305 Al-Ali et al. Apr 2010 S
7690378 Turcott Apr 2010 B1
7697966 Monfre et al. Apr 2010 B2
7698105 Ruchti et al. Apr 2010 B2
RE41317 Parker May 2010 E
RE41333 Blank et al. May 2010 E
7729733 Al-Ali et al. Jun 2010 B2
7734320 Al-Ali Jun 2010 B2
7761127 Al-Ali et al. Jul 2010 B2
7761128 Al-Ali et al. Jul 2010 B2
7764982 Dalke et al. Jul 2010 B2
D621516 Kiani et al. Aug 2010 S
7791155 Diab Sep 2010 B2
7801581 Diab Sep 2010 B2
7822452 Schurman et al. Oct 2010 B2
RE41912 Parker Nov 2010 E
7844313 Kiani et al. Nov 2010 B2
7844314 Al-Ali Nov 2010 B2
7844315 Al-Ali Nov 2010 B2
7865222 Weber et al. Jan 2011 B2
7873497 Weber et al. Jan 2011 B2
7880606 Al-Ali Feb 2011 B2
7880626 Al-Ali et al. Feb 2011 B2
7891355 Al-Ali et al. Feb 2011 B2
7894868 Al-Ali et al. Feb 2011 B2
7899507 Al-Ali et al. Mar 2011 B2
7899518 Trepagnier et al. Mar 2011 B2
7904132 Weber et al. Mar 2011 B2
7909772 Popov et al. Mar 2011 B2
7910875 Al-Ali Mar 2011 B2
7919713 Al-Ali et al. Apr 2011 B2
7937128 Al-Ali May 2011 B2
7937129 Mason et al. May 2011 B2
7937130 Diab et al. May 2011 B2
7941199 Kiani May 2011 B2
7951086 Flaherty et al. May 2011 B2
7957780 Lamego et al. Jun 2011 B2
7962188 Kiani et al. Jun 2011 B2
7962190 Diab et al. Jun 2011 B1
7976472 Kiani Jul 2011 B2
7988637 Diab Aug 2011 B2
7990382 Kiani Aug 2011 B2
7991446 Al-Ali et al. Aug 2011 B2
8000761 Al-Ali Aug 2011 B2
8008088 Bellott et al. Aug 2011 B2
RE42753 Kiani-Azarbayjany et al. Sep 2011 E
8019400 Diab et al. Sep 2011 B2
8028701 Al-Ali et al. Oct 2011 B2
8029765 Bellott et al. Oct 2011 B2
8036727 Schurman et al. Oct 2011 B2
8036728 Diab et al. Oct 2011 B2
8046040 Ali et al. Oct 2011 B2
8046041 Diab et al. Oct 2011 B2
8046042 Diab et al. Oct 2011 B2
8048040 Kiani Nov 2011 B2
8050728 Al-Ali et al. Nov 2011 B2
RE43169 Parker Feb 2012 E
8118620 Al-Ali et al. Feb 2012 B2
8126528 Diab et al. Feb 2012 B2
8128572 Diab et al. Mar 2012 B2
8130105 Al-Ali et al. Mar 2012 B2
8145287 Diab et al. Mar 2012 B2
8150487 Diab et al. Apr 2012 B2
8175672 Parker May 2012 B2
8180420 Diab et al. May 2012 B2
8182443 Kiani May 2012 B1
8185180 Diab et al. May 2012 B2
8190223 Al-Ali et al. May 2012 B2
8190227 Diab et al. May 2012 B2
8203438 Kiani et al. Jun 2012 B2
8203704 Merritt et al. Jun 2012 B2
8204566 Schurman et al. Jun 2012 B2
8219172 Schurman et al. Jul 2012 B2
8224411 Al-Ali et al. Jul 2012 B2
8228181 Al-Ali Jul 2012 B2
8229532 Davis Jul 2012 B2
8229533 Diab et al. Jul 2012 B2
8233955 Al-Ali et al. Jul 2012 B2
8244325 Al-Ali et al. Aug 2012 B2
8255026 Al-Ali Aug 2012 B1
8255027 Al-Ali et al. Aug 2012 B2
8255028 Al-Ali et al. Aug 2012 B2
8260577 Weber et al. Sep 2012 B2
8265723 McHale et al. Sep 2012 B1
8274360 Sampath et al. Sep 2012 B2
8280473 Al-Ali Oct 2012 B2
8301217 Al-Ali et al. Oct 2012 B2
8306596 Schurman et al. Nov 2012 B2
8310336 Muhsin et al. Nov 2012 B2
8315683 Al-Ali et al. Nov 2012 B2
RE43860 Parker Dec 2012 E
8337403 Al-Ali et al. Dec 2012 B2
8346330 Lamego Jan 2013 B2
8353842 Al-Ali et al. Jan 2013 B2
8355766 MacNeish, III et al. Jan 2013 B2
8359080 Diab et al. Jan 2013 B2
8364223 Al-Ali et al. Jan 2013 B2
8364226 Diab et al. Jan 2013 B2
8374665 Lamego Feb 2013 B2
8385995 Al-ali et al. Feb 2013 B2
8385996 Smith et al. Feb 2013 B2
8388353 Kiani et al. Mar 2013 B2
8399822 Al-Ali Mar 2013 B2
8401602 Kiani Mar 2013 B2
8405608 Al-Ali et al. Mar 2013 B2
8414499 Al-Ali et al. Apr 2013 B2
8418524 Al-Ali Apr 2013 B2
8423106 Lamego et al. Apr 2013 B2
8428967 Olsen et al. Apr 2013 B2
8430817 Al-Ali et al. Apr 2013 B1
8437825 Dalvi et al. May 2013 B2
8455290 Siskavich Jun 2013 B2
8457703 Al-Ali Jun 2013 B2
8457707 Kiani Jun 2013 B2
8463349 Diab et al. Jun 2013 B2
8466286 Bellot et al. Jun 2013 B2
8471713 Poeze et al. Jun 2013 B2
8473020 Kiani et al. Jun 2013 B2
8478538 McGonigle et al. Jul 2013 B2
8483787 Al-Ali et al. Jul 2013 B2
8489364 Weber et al. Jul 2013 B2
8498684 Weber et al. Jul 2013 B2
8504128 Blank et al. Aug 2013 B2
8509867 Workman et al. Aug 2013 B2
8515509 Bruinsma et al. Aug 2013 B2
8523781 Al-Ali Sep 2013 B2
8529301 Al-Ali et al. Sep 2013 B2
8532727 Ali et al. Sep 2013 B2
8532728 Diab et al. Sep 2013 B2
D692145 Al-Ali et al. Oct 2013 S
8547209 Kiani et al. Oct 2013 B2
8548548 Al-Ali Oct 2013 B2
8548549 Schurman et al. Oct 2013 B2
8548550 Al-Ali et al. Oct 2013 B2
8560032 Al-Ali et al. Oct 2013 B2
8560034 Diab et al. Oct 2013 B1
8570167 Al-Ali Oct 2013 B2
8570503 Vo et al. Oct 2013 B2
8571617 Reichgott et al. Oct 2013 B2
8571618 Lamego et al. Oct 2013 B1
8571619 Al-Ali et al. Oct 2013 B2
8584345 Al-Ali et al. Oct 2013 B2
8577431 Lamego et al. Nov 2013 B2
8581732 Al-Ali et al. Nov 2013 B2
8588880 Abdul-Hafiz et al. Nov 2013 B2
8597274 Sloan Dec 2013 B2
8600467 Al-Ali et al. Dec 2013 B2
8606342 Diab Dec 2013 B2
8622902 Woehrle Jan 2014 B2
8626255 Al-Ali et al. Jan 2014 B2
8630691 Lamego et al. Jan 2014 B2
8634889 Al-Ali et al. Jan 2014 B2
8641631 Sierra et al. Feb 2014 B2
8652060 Al-Ali Feb 2014 B2
8663107 Kiani Mar 2014 B2
8666468 Al-Ali Mar 2014 B1
8667967 Al-Ali et al. Mar 2014 B2
8670811 O'Reilly Mar 2014 B2
8670814 Diab et al. Mar 2014 B2
8676286 Weber et al. Mar 2014 B2
8682407 Al-Ali Mar 2014 B2
RE44823 Parker Apr 2014 E
RE44875 Kiani et al. Apr 2014 E
8688183 Bruinsma et al. Apr 2014 B2
8690799 Telfort et al. Apr 2014 B2
8700112 Kiani Apr 2014 B2
8702627 Telfort et al. Apr 2014 B2
8706179 Parker Apr 2014 B2
8712494 MacNeish, III et al. Apr 2014 B1
8715206 Telfort et al. May 2014 B2
8718735 Lamego et al. May 2014 B2
8718737 Diab et al. May 2014 B2
8718738 Blank et al. May 2014 B2
8720249 Al-Ali May 2014 B2
8721541 Al-Ali et al. May 2014 B2
8721542 Al-Ali et al. May 2014 B2
8723677 Kiani May 2014 B1
8740792 Kiani et al. Jun 2014 B1
8754776 Poeze et al. Jun 2014 B2
8755535 Telfort et al. Jun 2014 B2
8755856 Diab et al. Jun 2014 B2
8755872 Marinow Jun 2014 B1
8761850 Lamego Jun 2014 B2
8764671 Kiani Jul 2014 B2
8768423 Shakespeare et al. Jul 2014 B2
8771204 Telfort et al. Jul 2014 B2
8777634 Kiani et al. Jul 2014 B2
8781543 Diab et al. Jul 2014 B2
8781544 Al-Ali et al. Jul 2014 B2
8781549 Al-Ali et al. Jul 2014 B2
8788003 Schurman et al. Jul 2014 B2
8790268 Al-Ali Jul 2014 B2
8792949 Baker Jul 2014 B2
8801613 Al-Ali et al. Aug 2014 B2
8821397 Al-Ali et al. Sep 2014 B2
8821415 Al-Ali et al. Sep 2014 B2
8830449 Lamego et al. Sep 2014 B1
8831700 Schurman et al. Sep 2014 B2
8840549 Al-Ali et al. Sep 2014 B2
8847740 Kiani et al. Sep 2014 B2
8849365 Smith et al. Sep 2014 B2
8852094 Al-Ali et al. Oct 2014 B2
8852994 Wojtczuk et al. Oct 2014 B2
8868147 Stippick et al. Oct 2014 B2
8868150 Al-Ali et al. Oct 2014 B2
8870792 Al-Ali et al. Oct 2014 B2
8886271 Kiani et al. Nov 2014 B2
8888539 Al-Ali et al. Nov 2014 B2
8888708 Diab et al. Nov 2014 B2
8892180 Weber et al. Nov 2014 B2
8897847 Al-Ali Nov 2014 B2
8909310 Lamego Dec 2014 B2
8911377 Al-Ali Dec 2014 B2
8912909 Al-Ali et al. Dec 2014 B2
8920317 Al-Ali et al. Dec 2014 B2
8921699 Al-Ali et al. Dec 2014 B2
8922382 Al-Ali et al. Dec 2014 B2
8929964 Al-Ali et al. Jan 2015 B2
8942777 Diab et al. Jan 2015 B2
8948834 Diab et al. Feb 2015 B2
8948835 Diab Feb 2015 B2
8965471 Lamego Feb 2015 B2
8983564 Al-Ali Mar 2015 B2
8989831 Al-Ali et al. Mar 2015 B2
8996085 Kiani et al. Mar 2015 B2
8998809 Kiani Apr 2015 B2
9028429 Telfort et al. May 2015 B2
9037207 Al-Ali et al. May 2015 B2
9060721 Reichgott et al. Jun 2015 B2
9066666 Kiani Jun 2015 B2
9066680 Al-Ali et al. Jun 2015 B1
9072474 Al-Ali et al. Jul 2015 B2
9078560 Schurman et al. Jul 2015 B2
9084569 Weber et al. Jul 2015 B2
9095316 Welch et al. Aug 2015 B2
9106038 Telfort et al. Aug 2015 B2
9107625 Telfort et al. Aug 2015 B2
9107626 Al-Ali et al. Aug 2015 B2
9113831 Al-Ali Aug 2015 B2
9113832 Al-Ali Aug 2015 B2
9119595 Lamego Sep 2015 B2
9131881 Diab et al. Sep 2015 B2
9131882 Al-Ali et al. Sep 2015 B2
9131883 Al-Ali Sep 2015 B2
9131917 Telfort et al. Sep 2015 B2
9135398 Kaib Sep 2015 B2
9138180 Coverston et al. Sep 2015 B1
9138182 Al-Ali et al. Sep 2015 B2
9138192 Weber et al. Sep 2015 B2
9142117 Muhsin et al. Sep 2015 B2
9153112 Kiani et al. Oct 2015 B1
9153121 Kiani et al. Oct 2015 B2
9161696 Al-Ali et al. Oct 2015 B2
9161713 Al-Ali et al. Oct 2015 B2
9167995 Lamego et al. Oct 2015 B2
9176141 Al-Ali et al. Nov 2015 B2
9186102 Bruinsma et al. Nov 2015 B2
9192312 Al-Ali Nov 2015 B2
9192329 Al-Ali Nov 2015 B2
9192351 Telfort et al. Nov 2015 B1
9195385 Al-Ali et al. Nov 2015 B2
9211072 Kiani Dec 2015 B2
9211095 Al-Ali Dec 2015 B1
9218454 Kiani et al. Dec 2015 B2
9220440 Addison et al. Dec 2015 B2
9226696 Kiani Jan 2016 B2
9241662 Al-Ali et al. Jan 2016 B2
9245668 Vo et al. Jan 2016 B1
9259185 Abdul-Hafiz et al. Jan 2016 B2
9267572 Barker et al. Feb 2016 B2
9277880 Poeze et al. Feb 2016 B2
9289167 Diab et al. Mar 2016 B2
9295421 Kiani et al. Mar 2016 B2
9307928 Al-Ali et al. Apr 2016 B1
9323894 Kiani Apr 2016 B2
D755392 Hwang et al. May 2016 S
9326712 Kiani May 2016 B1
9333316 Kiani May 2016 B2
9339220 Lamego et al. May 2016 B2
9341565 Lamego et al. May 2016 B2
9351673 Diab et al. May 2016 B2
9351675 Al-Ali et al. May 2016 B2
9364181 Kiani et al. Jun 2016 B2
9368671 Wojtczuk et al. Jun 2016 B2
9370325 Al-Ali et al. Jun 2016 B2
9370326 McHale et al. Jun 2016 B2
9370335 Al-ali et al. Jun 2016 B2
9375185 Ali et al. Jun 2016 B2
9378637 Kaib Jun 2016 B2
9386953 Al-Ali Jul 2016 B2
9386961 Al-Ali et al. Jul 2016 B2
9392945 Al-Ali et al. Jul 2016 B2
9397448 Al-Ali et al. Jul 2016 B2
9408542 Kinast et al. Aug 2016 B1
9436645 Al-Ali et al. Sep 2016 B2
9445759 Lamego et al. Sep 2016 B1
9466919 Kiani et al. Oct 2016 B2
9474474 Lamego et al. Oct 2016 B2
9480422 Al-Ali Nov 2016 B2
9480435 Olsen Nov 2016 B2
9492110 Al-Ali et al. Nov 2016 B2
9510779 Poeze et al. Dec 2016 B2
9517024 Kiani et al. Dec 2016 B2
9532722 Lamego et al. Jan 2017 B2
9538949 Al-Ali et al. Jan 2017 B2
9538980 Telfort et al. Jan 2017 B2
9549696 Lamego et al. Jan 2017 B2
9554737 Schurman et al. Jan 2017 B2
9560996 Kiani Feb 2017 B2
9560998 Al-Ali et al. Feb 2017 B2
9566019 Al-Ali et al. Feb 2017 B2
9579039 Jansen et al. Feb 2017 B2
9591975 Dalvi et al. Mar 2017 B2
9622692 Lamego et al. Apr 2017 B2
9622693 Diab Apr 2017 B2
D788312 Al-Ali et al. May 2017 S
9636055 Al-Ali et al. May 2017 B2
9636056 Al-Ali May 2017 B2
9649054 Lamego et al. May 2017 B2
9659475 Kaib May 2017 B2
9662052 Al-Ali et al. May 2017 B2
9668679 Schurman et al. Jun 2017 B2
9668680 Bruinsma et al. Jun 2017 B2
9668703 Al-Ali Jun 2017 B2
9675286 Diab Jun 2017 B2
9687160 Kiani Jun 2017 B2
9693719 Al-Ali et al. Jul 2017 B2
9693737 Al-Ali Jul 2017 B2
9697928 Al-Ali et al. Jul 2017 B2
9717425 Kiani et al. Aug 2017 B2
9717458 Lamego et al. Aug 2017 B2
9724016 Al-Ali et al. Aug 2017 B1
9724024 Al-Ali Aug 2017 B2
9724025 Kiani et al. Aug 2017 B1
9730640 Diab et al. Aug 2017 B2
9743887 Al-Ali et al. Aug 2017 B2
9749232 Sampath et al. Aug 2017 B2
9750442 Olsen Sep 2017 B2
9750443 Smith et al. Sep 2017 B2
9750461 Telfort Sep 2017 B1
9775545 Al-Ali et al. Oct 2017 B2
9775546 Diab et al. Oct 2017 B2
9775570 Al-Ali Oct 2017 B2
9778079 Al-Ali et al. Oct 2017 B1
9782077 Lamego et al. Oct 2017 B2
9782110 Kiani Oct 2017 B2
9787568 Lamego et al. Oct 2017 B2
9788735 Al-Ali Oct 2017 B2
9788768 Al-Ali et al. Oct 2017 B2
9795300 Al-Ali Oct 2017 B2
9795310 Al-Ali Oct 2017 B2
9795358 Telfort et al. Oct 2017 B2
9795739 Al-Ali et al. Oct 2017 B2
9801556 Kiani Oct 2017 B2
9801588 Weber et al. Oct 2017 B2
9808188 Perea et al. Nov 2017 B1
9814418 Weber et al. Nov 2017 B2
9820691 Kiani Nov 2017 B2
9833152 Kiani et al. Dec 2017 B2
9833180 Shakespeare et al. Dec 2017 B2
9839379 Al-Ali et al. Dec 2017 B2
9839381 Weber et al. Dec 2017 B1
9847002 Kiani et al. Dec 2017 B2
9847749 Kiani et al. Dec 2017 B2
9848800 Lee Dec 2017 B1
9848806 Al-Ali et al. Dec 2017 B2
9848807 Lamego Dec 2017 B2
9861298 Eckerbom et al. Jan 2018 B2
9861304 Al-Ali et al. Jan 2018 B2
9861305 Weber et al. Jan 2018 B1
9867578 Al-Ali et al. Jan 2018 B2
9872623 Al-Ali Jan 2018 B2
9876320 Coverston et al. Jan 2018 B2
9877650 Muhsin et al. Jan 2018 B2
9877686 Al-Ali et al. Jan 2018 B2
9891079 Dalvi Feb 2018 B2
9895107 Al-Ali et al. Feb 2018 B2
9924893 Schurman et al. Mar 2018 B2
9924897 Abdul-Hafiz Mar 2018 B1
9936917 Poeze et al. Apr 2018 B2
9943269 Muhsin et al. Apr 2018 B2
9949676 Al-Ali Apr 2018 B2
9955937 Telfort May 2018 B2
9965946 Al-Ali May 2018 B2
9968266 An et al. May 2018 B2
9980667 Kiani et al. May 2018 B2
D820865 Muhsin et al. Jun 2018 S
9986919 Lamego et al. Jun 2018 B2
9986952 Dalvi et al. Jun 2018 B2
9989560 Poeze et al. Jun 2018 B2
9993207 Al-Ali et al. Jun 2018 B2
10007758 Al-Ali et al. Jun 2018 B2
D822215 Al-Ali et al. Jul 2018 S
D822216 Barker et al. Jul 2018 S
10010276 Al-Ali et al. Jul 2018 B2
10032002 Kiani et al. Jul 2018 B2
10039482 Al-Ali et al. Aug 2018 B2
10052037 Kinast et al. Aug 2018 B2
10058275 Al-Ali et al. Aug 2018 B2
10064562 Al-Ali Sep 2018 B2
10086138 Novak, Jr. Oct 2018 B1
10092200 Al-Ali et al. Oct 2018 B2
10092249 Kiani et al. Oct 2018 B2
10098550 Al-Ali et al. Oct 2018 B2
10098591 Al-Ali et al. Oct 2018 B2
10098610 Al-Ali et al. Oct 2018 B2
10111591 Dyell et al. Oct 2018 B2
D833624 DeJong et al. Nov 2018 S
10123726 Al-Ali et al. Nov 2018 B2
10123729 Dyell et al. Nov 2018 B2
10130289 Al-Ali et al. Nov 2018 B2
10130291 Schurman et al. Nov 2018 B2
D835282 Barker et al. Dec 2018 S
D835283 Barker et al. Dec 2018 S
D835284 Barker et al. Dec 2018 S
D835285 Barker et al. Dec 2018 S
10149616 Al-Ali et al. Dec 2018 B2
10154815 Al-Ali et al. Dec 2018 B2
10159412 Lamego et al. Dec 2018 B2
10188296 Al-Ali et al. Jan 2019 B2
10188331 Al-Ali et al. Jan 2019 B1
10188348 Kiani et al. Jan 2019 B2
RE47218 Al-Ali Feb 2019 E
RE47244 Kiani et al. Feb 2019 E
RE47249 Kiani et al. Feb 2019 E
10194847 Al-Ali Feb 2019 B2
10194848 Kiani et al. Feb 2019 B1
10201298 Al-Ali et al. Feb 2019 B2
10205272 Kiani et al. Feb 2019 B2
10205291 Scruggs et al. Feb 2019 B2
10213108 Al-Ali Feb 2019 B2
10219706 Al-Ali Mar 2019 B2
10219746 McHale et al. Mar 2019 B2
10226187 Al-Ali et al. Mar 2019 B2
10226576 Kiani Mar 2019 B2
10231657 Al-Ali et al. Mar 2019 B2
10231670 Blank et al. Mar 2019 B2
10231676 Al-Ali et al. Mar 2019 B2
RE47353 Kiani et al. Apr 2019 E
10251585 Al-Ali et al. Apr 2019 B2
10251586 Lamego Apr 2019 B2
10255994 Sampath et al. Apr 2019 B2
10258265 Poeze et al. Apr 2019 B1
10258266 Poeze et al. Apr 2019 B1
10271748 Al-Ali Apr 2019 B2
10278626 Schurman et al. May 2019 B2
10278648 Al-Ali et al. May 2019 B2
10279247 Kiani May 2019 B2
10292628 Poeze et al. May 2019 B1
10292657 Abdul-Hafiz et al. May 2019 B2
10292664 Al-Ali May 2019 B2
10299708 Poeze et al. May 2019 B1
10299709 Perea et al. May 2019 B2
10299720 Brown et al. May 2019 B2
10305775 Lamego et al. May 2019 B2
10307111 Muhsin et al. Jun 2019 B2
10325681 Sampath et al. Jun 2019 B2
10327337 Triman et al. Jun 2019 B2
10327713 Barker et al. Jun 2019 B2
10332630 Al-Ali Jun 2019 B2
10335033 Al-Ali Jul 2019 B2
10335068 Poeze et al. Jul 2019 B2
10335072 Al-Ali et al. Jul 2019 B2
10342470 Al-Ali et al. Jul 2019 B2
10342487 Al-Ali et al. Jul 2019 B2
10342497 Al-Ali et al. Jul 2019 B2
10349895 Telfort et al. Jul 2019 B2
10349898 Al-Ali et al. Jul 2019 B2
10354504 Kiani et al. Jul 2019 B2
10357206 Weber et al. Jul 2019 B2
10357209 Al-Ali Jul 2019 B2
10366787 Sampath et al. Jul 2019 B2
10368787 Reichgott et al. Aug 2019 B2
10376190 Poeze et al. Aug 2019 B1
10376191 Poeze et al. Aug 2019 B1
10383520 Wojtczuk et al. Aug 2019 B2
10383527 Al-Ali Aug 2019 B2
10388120 Muhsin et al. Aug 2019 B2
D864120 Forrest et al. Oct 2019 S
10441181 Telfort et al. Oct 2019 B1
10441196 Eckerbom et al. Oct 2019 B2
10448844 Al-Ali et al. Oct 2019 B2
10448871 Al-Ali et al. Oct 2019 B2
10456038 Lamego et al. Oct 2019 B2
10463340 Telfort et al. Nov 2019 B2
10471159 Lapotko et al. Nov 2019 B1
10505311 Al-Ali et al. Dec 2019 B2
10524738 Olsen Jan 2020 B2
10532174 Al-Ali Jan 2020 B2
10537285 Shreim et al. Jan 2020 B2
10542903 Al-Ali et al. Jan 2020 B2
10555678 Dalvi et al. Feb 2020 B2
10568553 O'Neil et al. Feb 2020 B2
10608817 Haider et al. Mar 2020 B2
D880477 Forrest et al. Apr 2020 S
10617302 Al-Ali et al. Apr 2020 B2
10617335 Al-Ali et al. Apr 2020 B2
10637181 Al-Ali et al. Apr 2020 B2
D886849 Muhsin et al. Jun 2020 S
D887548 Abdul-Hafiz et al. Jun 2020 S
D887549 Abdul-Hafiz et al. Jun 2020 S
10667764 Ahmed et al. Jun 2020 B2
D890708 Forrest et al. Jul 2020 S
10721785 Al-Ali Jul 2020 B2
10736518 Al-Ali et al. Aug 2020 B2
10750984 Pauley et al. Aug 2020 B2
D897098 Al-Ali Sep 2020 S
10779098 Iswanto et al. Sep 2020 B2
10827961 Iyengar et al. Nov 2020 B1
10828007 Telfort et al. Nov 2020 B1
10832818 Muhsin et al. Nov 2020 B2
10849554 Shreim et al. Dec 2020 B2
10856750 Indorf Dec 2020 B2
D906970 Forrest et al. Jan 2021 S
D908213 Abdul-Hafiz et al. Jan 2021 S
10918281 Al-Ali et al. Feb 2021 B2
10932705 Muhsin et al. Mar 2021 B2
10932729 Kiani et al. Mar 2021 B2
10939878 Kiani et al. Mar 2021 B2
10956950 Al-Ali et al. Mar 2021 B2
D916135 Indorf et al. Apr 2021 S
D917046 Abdul-Hafiz et al. Apr 2021 S
D917550 Indorf et al. Apr 2021 S
D917564 Indorf et al. Apr 2021 S
D917704 Al-Ali et al. Apr 2021 S
10987066 Chandran et al. Apr 2021 B2
10991135 Al-Ali et al. Apr 2021 B2
D919094 Al-Ali et al. May 2021 S
D919100 Al-Ali et al. May 2021 S
11006867 Ai-Ali May 2021 B2
D921202 Al-Ali et al. Jun 2021 S
11024064 Muhsin et al. Jun 2021 B2
11026604 Chen et al. Jun 2021 B2
D925597 Chandran et al. Jul 2021 S
D927699 Al-Ali et al. Aug 2021 S
11076777 Lee et al. Aug 2021 B2
11114188 Poeze et al. Sep 2021 B2
D933232 Al-Ali et al. Oct 2021 S
D933233 Al-Ali et al. Oct 2021 S
D933234 Al-Ali et al. Oct 2021 S
11145408 Sampath et al. Oct 2021 B2
11147518 Al-Ali et al. Oct 2021 B1
11185262 Al-Ali et al. Nov 2021 B2
11191484 Kiani et al. Dec 2021 B2
D946596 Ahmed Mar 2022 S
D946597 Ahmed Mar 2022 S
D946598 Ahmed Mar 2022 S
D946617 Ahmed Mar 2022 S
11272839 Al-Ali et al. Mar 2022 B2
11289199 Al-Ali Mar 2022 B2
RE49034 Al-Ali Apr 2022 E
11298021 Muhsin et al. Apr 2022 B2
D950580 Ahmed May 2022 S
D950599 Ahmed May 2022 S
D950738 Al-Ali et al. May 2022 S
D957648 Al-Ali Jul 2022 S
11382567 O'Brien et al. Jul 2022 B2
11389093 Triman et al. Jul 2022 B2
11406286 Al-Ali et al. Aug 2022 B2
11417426 Muhsin et al. Aug 2022 B2
11439329 Lamego Sep 2022 B2
11445948 Scruggs et al. Sep 2022 B2
D965789 Al-Ali et al. Oct 2022 S
D967433 Al-Ali et al. Oct 2022 S
11464410 Muhsin Oct 2022 B2
11504058 Sharma et al. Nov 2022 B1
11504066 Dalvi et al. Nov 2022 B1
D971933 Ahmed Dec 2022 S
D973072 Ahmed Dec 2022 S
D973685 Ahmed Dec 2022 S
D973686 Ahmed Dec 2022 S
D974193 Forrest et al. Jan 2023 S
D979516 Al-Ali et al. Feb 2023 S
D980091 Forrest et al. Mar 2023 S
11596363 Lamego Mar 2023 B2
11627919 Kiani et al. Apr 2023 B2
11637437 Al-Ali et al. Apr 2023 B2
D985498 Al-Ali et al. May 2023 S
11653862 Dalvi et al. May 2023 B2
D989112 Muhsin et al. Jun 2023 S
D989327 Al-Ali et al. Jun 2023 S
11678829 Al-Ali et al. Jun 2023 B2
11679579 Al-Ali Jun 2023 B2
11684296 Vo et al. Jun 2023 B2
11692934 Normand et al. Jul 2023 B2
11701043 Al-Ali et al. Jul 2023 B2
D997365 Hwang Aug 2023 S
11721105 Ranasinghe et al. Aug 2023 B2
11730379 Ahmed et al. Aug 2023 B2
D998625 Indorf et al. Sep 2023 S
D998630 Indorf et al. Sep 2023 S
D998631 Indorf et al. Sep 2023 S
11766198 Pauley et al. Sep 2023 B2
D1000975 Al-Ali et al. Oct 2023 S
20010002206 Diab et al. May 2001 A1
20010034477 Mansfield et al. Oct 2001 A1
20010039483 Brand et al. Nov 2001 A1
20020010401 Bushmakin et al. Jan 2002 A1
20020058864 Mansfield et al. May 2002 A1
20020133080 Apruzzese et al. Sep 2002 A1
20020193670 Garfield et al. Dec 2002 A1
20030013975 Kiani Jan 2003 A1
20030015368 Cybulski et al. Jan 2003 A1
20030018243 Gerhardt et al. Jan 2003 A1
20030065269 Vetter Apr 2003 A1
20030076494 Bonin et al. Apr 2003 A1
20030144582 Cohen et al. Jul 2003 A1
20030156288 Barnum et al. Aug 2003 A1
20030158466 Lynn et al. Aug 2003 A1
20030163033 Dekker et al. Aug 2003 A1
20030163054 Dekker Aug 2003 A1
20030212312 Coffin, IV et al. Nov 2003 A1
20040010202 Nakatani Jan 2004 A1
20040059203 Guerrero Mar 2004 A1
20040060362 Kjellmann et al. Apr 2004 A1
20040106163 Workman, Jr. et al. Jun 2004 A1
20040133087 Ali et al. Jul 2004 A1
20040158162 Narimatsu Aug 2004 A1
20040225332 Gebhardt Nov 2004 A1
20040260186 Dekker Dec 2004 A1
20050027205 Tarassenko et al. Feb 2005 A1
20050048456 Chefd'hotel et al. Mar 2005 A1
20050055276 Kiani et al. Mar 2005 A1
20050070774 Addison et al. Mar 2005 A1
20050107699 Loftman May 2005 A1
20050116820 Goldreich Jun 2005 A1
20050199056 Strong Sep 2005 A1
20050234317 Kiani Oct 2005 A1
20060047215 Newman et al. Mar 2006 A1
20060073719 Kiani Apr 2006 A1
20060129216 Hastings et al. Jun 2006 A1
20060149144 Lynn et al. Jul 2006 A1
20060155206 Lynn Jul 2006 A1
20060155207 Lynn et al. Jul 2006 A1
20060161071 Lynn et al. Jul 2006 A1
20060189871 Al-Ali et al. Aug 2006 A1
20060189880 Lynn et al. Aug 2006 A1
20060195041 Lynn et al. Aug 2006 A1
20060235324 Lynn Oct 2006 A1
20060238333 Welch et al. Oct 2006 A1
20060241510 Halperin et al. Oct 2006 A1
20060258921 Addison et al. Nov 2006 A1
20070073116 Kiani et al. Mar 2007 A1
20070093721 Lynn et al. Apr 2007 A1
20070129643 Kwok et al. Jun 2007 A1
20070129647 Lynn Jun 2007 A1
20070135725 Hatlestad Jun 2007 A1
20070149860 Lynn et al. Jun 2007 A1
20070163353 Lec et al. Jul 2007 A1
20070180140 Welch et al. Aug 2007 A1
20070185397 Govari et al. Aug 2007 A1
20070213619 Linder Sep 2007 A1
20070239057 Pu et al. Oct 2007 A1
20070244377 Cozad et al. Oct 2007 A1
20070282212 Sierra et al. Dec 2007 A1
20070282478 Al-Ali et al. Dec 2007 A1
20080013747 Tran Jan 2008 A1
20080039735 Hickerson Feb 2008 A1
20080064965 Jay et al. Mar 2008 A1
20080071185 Beck et al. Mar 2008 A1
20080076972 Dorogusker et al. Mar 2008 A1
20080094228 Welch et al. Apr 2008 A1
20080103375 Kiani May 2008 A1
20080119716 Boric-Lubecke et al. May 2008 A1
20080161878 Tehrani et al. Jul 2008 A1
20080177195 Armitstead Jul 2008 A1
20080188733 Al-Ali Aug 2008 A1
20080188760 Al-Ali Aug 2008 A1
20080218153 Patel et al. Sep 2008 A1
20080221418 Al-Ali et al. Sep 2008 A1
20080275349 Halperin et al. Nov 2008 A1
20080304580 Ichiyama Dec 2008 A1
20090018409 Banet Jan 2009 A1
20090018429 Saliga et al. Jan 2009 A1
20090018453 Banet et al. Jan 2009 A1
20090036759 Ault et al. Feb 2009 A1
20090093687 Telfort et al. Apr 2009 A1
20090095926 MacNeish, III Apr 2009 A1
20090112096 Tamura Apr 2009 A1
20090160654 Yang Jun 2009 A1
20090167332 Forbes Jul 2009 A1
20090187065 Basinger Jul 2009 A1
20090227882 Foo Sep 2009 A1
20090240119 Schwaibold et al. Sep 2009 A1
20090247848 Baker Oct 2009 A1
20090247984 Lamego et al. Oct 2009 A1
20090275844 Al-Ali Nov 2009 A1
20090299157 Telfort et al. Dec 2009 A1
20090312612 Rantala Dec 2009 A1
20100004518 Vo et al. Jan 2010 A1
20100004552 Zhang et al. Jan 2010 A1
20100014761 Addison et al. Jan 2010 A1
20100016682 Schluess et al. Jan 2010 A1
20100016693 Addison Jan 2010 A1
20100030040 Poeze et al. Feb 2010 A1
20100099964 O'Reilly et al. Apr 2010 A1
20100130873 Yuen May 2010 A1
20100204550 Heneghan Aug 2010 A1
20100234718 Sampath et al. Sep 2010 A1
20100261979 Kiani Oct 2010 A1
20100270257 Wachman et al. Oct 2010 A1
20100274099 Telfort et al. Oct 2010 A1
20100295686 Sloan Nov 2010 A1
20100298661 McCombie et al. Nov 2010 A1
20100298730 Taressenko et al. Nov 2010 A1
20100324377 Woehrle Dec 2010 A1
20100331903 Zhang Dec 2010 A1
20110001605 Kiani Jan 2011 A1
20110009710 Kroeger et al. Jan 2011 A1
20110028806 Merritt et al. Feb 2011 A1
20110028809 Goodman Feb 2011 A1
20110040197 Welch et al. Feb 2011 A1
20110040713 Colman Feb 2011 A1
20110066062 Banet et al. Mar 2011 A1
20110074409 Stoughton Mar 2011 A1
20110082711 Poeze et al. Apr 2011 A1
20110087081 Kiani et al. Apr 2011 A1
20110105854 Kiani et al. May 2011 A1
20110118561 Tari et al. May 2011 A1
20110118573 McKenna May 2011 A1
20110125060 Telfort et al. May 2011 A1
20110137297 Kiani et al. Jun 2011 A1
20110172498 Olsen et al. Jul 2011 A1
20110172561 Kiani et al. Jul 2011 A1
20110208015 Welch et al. Aug 2011 A1
20110209915 Telfort et al. Sep 2011 A1
20110213212 Al-Ali Sep 2011 A1
20110222371 Liu et al. Sep 2011 A1
20110230733 Al-Ali et al. Sep 2011 A1
20110237911 Lamego et al. Sep 2011 A1
20120016255 Masuo Jan 2012 A1
20120059267 Lamego et al. Mar 2012 A1
20120070013 Vau Mar 2012 A1
20120101344 Desjardins Apr 2012 A1
20120116175 Al-Ali et al. May 2012 A1
20120123231 O'Reilly May 2012 A1
20120165629 Merritt et al. Jun 2012 A1
20120179006 Jansen et al. Jul 2012 A1
20120209082 Al-Ali Aug 2012 A1
20120209084 Olsen et al. Aug 2012 A1
20120226117 Lamego et al. Sep 2012 A1
20120227739 Kiani Sep 2012 A1
20120253140 Addison et al. Oct 2012 A1
20120262298 Bohm Oct 2012 A1
20120283524 Kiani et al. Nov 2012 A1
20120296178 Lamego et al. Nov 2012 A1
20120319816 Al-Ali Dec 2012 A1
20120330112 Lamego et al. Dec 2012 A1
20130023775 Lamego et al. Jan 2013 A1
20130045685 Kiani Feb 2013 A1
20130046204 Lamego Feb 2013 A1
20130041591 Lamego Mar 2013 A1
20130060147 Welch et al. Mar 2013 A1
20130096405 Garfio Apr 2013 A1
20130096936 Sampath et al. Apr 2013 A1
20130109935 Al-Ali et al. May 2013 A1
20130116578 An et al. May 2013 A1
20130128690 Gopalan May 2013 A1
20130137936 Baker, Jr. et al. May 2013 A1
20130162433 Muhsin et al. Jun 2013 A1
20130190581 Al-Ali et al. Jul 2013 A1
20130190595 Oraevsky Jul 2013 A1
20130197328 Diab et al. Aug 2013 A1
20130243021 Siskavich Sep 2013 A1
20130253334 Al-Ali et al. Sep 2013 A1
20130274571 Diab et al. Oct 2013 A1
20130296672 Dalvi et al. Nov 2013 A1
20130296713 Al-Ali et al. Nov 2013 A1
20130296726 Nievauer et al. Nov 2013 A1
20130317370 Dalvi et al. Nov 2013 A1
20130324808 Al-Ali et al. Dec 2013 A1
20130331670 Kiani Dec 2013 A1
20130338461 Lamego et al. Dec 2013 A1
20130345921 Al-Ali et al. Dec 2013 A1
20140012100 Lamego et al. Jan 2014 A1
20140025306 Weber et al. Jan 2014 A1
20140034353 Al-Ali et al. Feb 2014 A1
20140051953 Lamego et al. Feb 2014 A1
20140058230 Abdul-Hafiz et al. Feb 2014 A1
20140066783 Kiani et al. Mar 2014 A1
20140077956 Sampath et al. Mar 2014 A1
20140081100 Muhsin et al. Mar 2014 A1
20140081175 Telfort Mar 2014 A1
20140094667 Schurman et al. Apr 2014 A1
20140100434 Diab et al. Apr 2014 A1
20140114199 Lamego et al. Apr 2014 A1
20140120564 Workman et al. May 2014 A1
20140121482 Merritt et al. May 2014 A1
20140121483 Kiani May 2014 A1
20140127137 Bellott et al. May 2014 A1
20140128696 Al-Ali May 2014 A1
20140128699 Al-Ali et al. May 2014 A1
20140129702 Lamego et al. May 2014 A1
20140135588 Al-Ali et al. May 2014 A1
20140142401 Al-Ali et al. May 2014 A1
20140142402 Al-Ali et al. May 2014 A1
20140163344 Al-Ali Jun 2014 A1
20140163402 Lamego et al. Jun 2014 A1
20140166076 Kiani et al. Jun 2014 A1
20140171763 Diab Jun 2014 A1
20140180038 Kiani et al. Jun 2014 A1
20140180154 Sierra et al. Jun 2014 A1
20140180160 Brown et al. Jun 2014 A1
20140187973 Brown et al. Jul 2014 A1
20140194709 Al-Ali et al. Jul 2014 A1
20140194711 Al-Ali Jul 2014 A1
20140194766 Al-Ali et al. Jul 2014 A1
20140206963 Diab et al. Jul 2014 A1
20140213864 Abdul-Hafiz et al. Jul 2014 A1
20140243627 Diab et al. Aug 2014 A1
20140266790 Al-Ali et al. Sep 2014 A1
20140275808 Poeze et al. Sep 2014 A1
20140275835 Lamego et al. Sep 2014 A1
20140275871 Lamego et al. Sep 2014 A1
20140275872 Merritt et al. Sep 2014 A1
20140275881 Lamego et al. Sep 2014 A1
20140288400 Diab et al. Sep 2014 A1
20140296664 Bruinsma et al. Oct 2014 A1
20140303520 Telfort et al. Oct 2014 A1
20140309506 Lamego et al. Oct 2014 A1
20140316217 Purdon et al. Oct 2014 A1
20140316218 Purdon et al. Oct 2014 A1
20140316228 Blank et al. Oct 2014 A1
20140323825 Al-Ali et al. Oct 2014 A1
20140323897 Brown et al. Oct 2014 A1
20140323898 Purdon et al. Oct 2014 A1
20140330092 Al-Ali et al. Nov 2014 A1
20140330098 Merritt et al. Nov 2014 A1
20140330099 Al-Ali et al. Nov 2014 A1
20140333440 Kiani Nov 2014 A1
20140336481 Shakespeare et al. Nov 2014 A1
20140343436 Kiani Nov 2014 A1
20140357966 Al-Ali et al. Dec 2014 A1
20150005600 Blank et al. Jan 2015 A1
20150011907 Purdon et al. Jan 2015 A1
20150012231 Poeze et al. Jan 2015 A1
20150018650 Al-Ali et al. Jan 2015 A1
20150032029 Al-Ali et al. Jan 2015 A1
20150038859 Dalvi et al. Feb 2015 A1
20150073241 Lamego Mar 2015 A1
20150080754 Purdon et al. Mar 2015 A1
20150087936 Al-Ali et al. Mar 2015 A1
20150094546 Al-Ali Apr 2015 A1
20150097701 Al-Ali et al. Apr 2015 A1
20150099950 Al-Ali et al. Apr 2015 A1
20150099955 Al-Ali et al. Apr 2015 A1
20150101844 Al-Ali et al. Apr 2015 A1
20150106121 Muhsin et al. Apr 2015 A1
20150112151 Muhsin et al. Apr 2015 A1
20150116076 Al-Ali et al. Apr 2015 A1
20150165312 Kiani Jun 2015 A1
20150196249 Brown et al. Jul 2015 A1
20150216459 Al-Ali et al. Aug 2015 A1
20150238722 Al-Ali Aug 2015 A1
20150245773 Lamego et al. Sep 2015 A1
20150245794 Al-Ali Sep 2015 A1
20150257689 Al-Ali et al. Sep 2015 A1
20150272514 Kiani et al. Oct 2015 A1
20150351697 Weber et al. Dec 2015 A1
20150359429 Al-Ali et al. Dec 2015 A1
20150366507 Blank Dec 2015 A1
20160029932 Al-Ali Feb 2016 A1
20160058347 Reichgott et al. Mar 2016 A1
20160066824 Al-Ali et al. Mar 2016 A1
20160081552 Wojtczuk et al. Mar 2016 A1
20160095543 Telfort et al. Apr 2016 A1
20160095548 Al-Ali et al. Apr 2016 A1
20160103598 Al-Ali et al. Apr 2016 A1
20160143548 Al-Ali May 2016 A1
20160166182 Al-Ali et al. Jun 2016 A1
20160166183 Poeze et al. Jun 2016 A1
20160192869 Kiani et al. Jul 2016 A1
20160196388 Lamego Jul 2016 A1
20160197436 Barker et al. Jul 2016 A1
20160213281 Eckerbom et al. Jul 2016 A1
20160228043 O'Neil et al. Aug 2016 A1
20160233632 Scruggs et al. Aug 2016 A1
20160234944 Schmidt et al. Aug 2016 A1
20160270735 Diab et al. Sep 2016 A1
20160283665 Sampath et al. Sep 2016 A1
20160287090 Al-Ali et al. Oct 2016 A1
20160287786 Kiani Oct 2016 A1
20160296169 McHale et al. Oct 2016 A1
20160310052 Al-Ali et al. Oct 2016 A1
20160314260 Kiani Oct 2016 A1
20160324486 Al-Ali et al. Nov 2016 A1
20160324488 Olsen Nov 2016 A1
20160327984 Al-Ali et al. Nov 2016 A1
20160328528 Al-Ali et al. Nov 2016 A1
20160331332 Al-Ali Nov 2016 A1
20160367173 Dalvi et al. Dec 2016 A1
20170000394 Al-Ali et al. Jan 2017 A1
20170007134 Al-Ali et al. Jan 2017 A1
20170007198 Al-Ali et al. Jan 2017 A1
20170014083 Diab et al. Jan 2017 A1
20170014084 Al-Ali et al. Jan 2017 A1
20170024748 Haider Jan 2017 A1
20170027456 Kinast et al. Feb 2017 A1
20170042488 Muhsin Feb 2017 A1
20170055851 Al-Ali Mar 2017 A1
20170055882 Al-Ali et al. Mar 2017 A1
20170055887 Al-Ali Mar 2017 A1
20170055896 Al-Ali et al. Mar 2017 A1
20170079594 Telfort et al. Mar 2017 A1
20170086723 Al-Ali et al. Mar 2017 A1
20170143281 Olsen May 2017 A1
20170147774 Kiani May 2017 A1
20170156620 Al-Ali et al. Jun 2017 A1
20170173632 Al-Ali Jun 2017 A1
20170187146 Kiani et al. Jun 2017 A1
20170188919 Al-Ali et al. Jul 2017 A1
20170196464 Jansen et al. Jul 2017 A1
20170196470 Lamego et al. Jul 2017 A1
20170202490 Al-Ali et al. Jul 2017 A1
20170224262 Al-Ali Aug 2017 A1
20170228516 Sampath et al. Aug 2017 A1
20170245790 Al-Ali et al. Aug 2017 A1
20170251974 Shreim et al. Sep 2017 A1
20170251975 Shreim et al. Sep 2017 A1
20170258403 Abdul-Hafiz et al. Sep 2017 A1
20170311851 Schurman et al. Nov 2017 A1
20170311891 Kiani et al. Nov 2017 A1
20170325728 Al-Ali et al. Nov 2017 A1
20170332976 Al-Ali et al. Nov 2017 A1
20170340293 Al-Ali et al. Nov 2017 A1
20170360310 Kiani et al. Dec 2017 A1
20170367632 Al-Ali et al. Dec 2017 A1
20180008146 Al-Ali et al. Jan 2018 A1
20180014752 Al-Ali et al. Jan 2018 A1
20180028124 Al-Ali et al. Feb 2018 A1
20180055385 Al-Ali Mar 2018 A1
20180055390 Kiani et al. Mar 2018 A1
20180055430 Diab et al. Mar 2018 A1
20180064381 Shakespeare et al. Mar 2018 A1
20180069776 Lamego et al. Mar 2018 A1
20180103874 Lee et al. Apr 2018 A1
20180110478 Al-Ali Apr 2018 A1
20180116575 Perea et al. May 2018 A1
20180125368 Lamego et al. May 2018 A1
20180125430 Al-Ali et al. May 2018 A1
20180130325 Kiani et al. May 2018 A1
20180132769 Weber et al. May 2018 A1
20180132770 Lamego May 2018 A1
20180146901 Al-Ali et al. May 2018 A1
20180146902 Kiani et al. May 2018 A1
20180153442 Eckerbom et al. Jun 2018 A1
20180153446 Kiani Jun 2018 A1
20180153447 Al-Ali et al. Jun 2018 A1
20180153448 Weber et al. Jun 2018 A1
20180161499 Al-Ali et al. Jun 2018 A1
20180168491 Al-Ali et al. Jun 2018 A1
20180174679 Sampath et al. Jun 2018 A1
20180174680 Sampath et al. Jun 2018 A1
20180182484 Sampath et al. Jun 2018 A1
20180184917 Kiani Jul 2018 A1
20180192953 Shreim et al. Jul 2018 A1
20180192955 Al-Ali et al. Jul 2018 A1
20180199871 Pauley et al. Jul 2018 A1
20180206795 Al-Ali Jul 2018 A1
20180206815 Telfort Jul 2018 A1
20180213583 Al-Ali Jul 2018 A1
20180214031 Kiani et al. Aug 2018 A1
20180214090 Al-Ali et al. Aug 2018 A1
20180216370 Ishiguro et al. Aug 2018 A1
20180218792 Muhsin et al. Aug 2018 A1
20180225960 Al-Ali et al. Aug 2018 A1
20180238718 Dalvi Aug 2018 A1
20180242853 Al-Ali Aug 2018 A1
20180242921 Muhsin et al. Aug 2018 A1
20180242923 Al-Ali et al. Aug 2018 A1
20180242924 Barker et al. Aug 2018 A1
20180242926 Muhsin et al. Aug 2018 A1
20180247353 Al-Ali et al. Aug 2018 A1
20180247712 Muhsin et al. Aug 2018 A1
20180249933 Schurman et al. Sep 2018 A1
20180253947 Muhsin et al. Sep 2018 A1
20180256087 Al-Ali et al. Sep 2018 A1
20180256113 Weber et al. Sep 2018 A1
20180285094 Housel et al. Oct 2018 A1
20180289325 Poeze et al. Oct 2018 A1
20180289337 Al-Ali et al. Oct 2018 A1
20180296161 Shreim et al. Oct 2018 A1
20180300919 Muhsin et al. Oct 2018 A1
20180310822 Indorf et al. Nov 2018 A1
20180310823 Al-Ali et al. Nov 2018 A1
20180317826 Muhsin Nov 2018 A1
20180317841 Novak, Jr. Nov 2018 A1
20180333055 Lamego et al. Nov 2018 A1
20180333087 Al-Ali Nov 2018 A1
20190000317 Muhsin et al. Jan 2019 A1
20190000362 Kiani et al. Jan 2019 A1
20190015023 Monfre Jan 2019 A1
20190021638 Al-Ali et al. Jan 2019 A1
20190029574 Schurman et al. Jan 2019 A1
20190029578 Al-Ali et al. Jan 2019 A1
20190038143 Al-Ali Feb 2019 A1
20190058280 Al-Ali et al. Feb 2019 A1
20190058281 Al-Ali et al. Feb 2019 A1
20190069813 Al-Ali Mar 2019 A1
20190069814 Al-Ali Mar 2019 A1
20190076028 Al-Ali et al. Mar 2019 A1
20190082979 Al-Ali et al. Mar 2019 A1
20190090748 Al-Ali Mar 2019 A1
20190090760 Kinast et al. Mar 2019 A1
20190090764 Al-Ali Mar 2019 A1
20190104973 Poeze et al. Apr 2019 A1
20190110719 Poeze et al. Apr 2019 A1
20190117070 Muhsin et al. Apr 2019 A1
20190117139 Al-Ali et al. Apr 2019 A1
20190117140 Al-Ali et al. Apr 2019 A1
20190117141 Al-Ali Apr 2019 A1
20190117930 Al-Ali Apr 2019 A1
20190122763 Sampath et al. Apr 2019 A1
20190133525 Al-Ali et al. May 2019 A1
20190142283 Lamego et al. May 2019 A1
20190142344 Telfort et al. May 2019 A1
20190150800 Poeze et al. May 2019 A1
20190150856 Kiani et al. May 2019 A1
20190167161 Al-Ali et al. Jun 2019 A1
20190175019 Al-Ali et al. Jun 2019 A1
20190192076 McHale et al. Jun 2019 A1
20190200941 Chandran et al. Jul 2019 A1
20190201623 Kiani Jul 2019 A1
20190209025 Al-Ali Jul 2019 A1
20190214778 Scruggs et al. Jul 2019 A1
20190216319 Poeze et al. Jul 2019 A1
20190216379 Al-Ali et al. Jul 2019 A1
20190221966 Kiani et al. Jul 2019 A1
20190223804 Blank et al. Jul 2019 A1
20190231199 Al-Ali et al. Aug 2019 A1
20190231241 Al-Ali et al. Aug 2019 A1
20190231270 Abdul-Hafiz et al. Aug 2019 A1
20190239787 Pauley et al. Aug 2019 A1
20190239824 Muhsin et al. Aug 2019 A1
20190254578 Lamego Aug 2019 A1
20190261857 Al-Ali Aug 2019 A1
20190320906 Olsen Oct 2019 A1
20200060869 Telfort et al. Feb 2020 A1
20200111552 Ahmed Apr 2020 A1
20200113520 Abdul-Hafiz et al. Apr 2020 A1
20200138368 Kiani et al. May 2020 A1
20200163597 Dalvi et al. May 2020 A1
20200253474 Muhsin et al. Aug 2020 A1
20200253544 Belur Nagaraj et al. Aug 2020 A1
20200275841 Telfort et al. Sep 2020 A1
20200288983 Telfort et al. Sep 2020 A1
20200329983 Al-Ali et al. Oct 2020 A1
20200329993 Al-Ali et al. Oct 2020 A1
20210022628 Telfort et al. Jan 2021 A1
20210104173 Pauley et al. Apr 2021 A1
20210113121 Diab et al. Apr 2021 A1
20210117525 Kiani et al. Apr 2021 A1
20210118581 Kiani et al. Apr 2021 A1
20210121582 Krishnamani et al. Apr 2021 A1
20210161465 Barker et al. Jun 2021 A1
20210236729 Kiani et al. Aug 2021 A1
20210256835 Ranasinghe et al. Aug 2021 A1
20210275101 Vo et al. Sep 2021 A1
20210290072 Forrest Sep 2021 A1
20210290080 Ahmed Sep 2021 A1
20210290120 Al-Ali Sep 2021 A1
20210290177 Novak, Jr. Sep 2021 A1
20210290184 Ahmed Sep 2021 A1
20210296008 Novak, Jr. Sep 2021 A1
20210330228 Olsen et al. Oct 2021 A1
20210386382 Olsen et al. Dec 2021 A1
20210402110 Pauley et al. Dec 2021 A1
20220039707 Sharma et al. Feb 2022 A1
20220053892 Al-Ali et al. Feb 2022 A1
20220071562 Kiani Mar 2022 A1
20220096603 Kiani et al. Mar 2022 A1
20220151521 Krishnamani et al. May 2022 A1
20220218244 Kiani et al. Jul 2022 A1
20220287574 Telfort et al. Sep 2022 A1
20220296161 Al-Ali et al. Sep 2022 A1
20220361819 Al-Ali et al. Nov 2022 A1
20220379059 Yu et al. Dec 2022 A1
20220392610 Kiani et al. Dec 2022 A1
20230028745 Al-Ali Jan 2023 A1
20230038389 Vo Feb 2023 A1
20230045647 Vo Feb 2023 A1
20230058052 Al-Ali Feb 2023 A1
20230058342 Kiani Feb 2023 A1
20230069789 Koo et al. Mar 2023 A1
20230087671 Telfort et al. Mar 2023 A1
20230110152 Forrest et al. Apr 2023 A1
20230111198 Yu et al. Apr 2023 A1
20230115397 Vo et al. Apr 2023 A1
20230116371 Mills et al. Apr 2023 A1
20230135297 Kiani et al. May 2023 A1
20230138098 Telfort et al. May 2023 A1
20230145155 Krishnamani et al. May 2023 A1
20230147750 Barker et al. May 2023 A1
20230210417 Al-Ali et al. Jul 2023 A1
20230222805 Muhsin et al. Jul 2023 A1
20230222887 Muhsin et al. Jul 2023 A1
20230226331 Kiani et al. Jul 2023 A1
20230284916 Telfort Sep 2023 A1
20230284943 Scruggs et al. Sep 2023 A1
20230301562 Scruggs et al. Sep 2023 A1
20230346993 Kiani et al. Nov 2023 A1
Foreign Referenced Citations (30)
Number Date Country
2262236 Apr 2008 CA
0716628 Dec 1998 EP
0659058 Jan 1999 EP
1207536 May 2002 EP
2358546 Nov 1999 GB
6214898 Jan 1987 JP
01-309872 Jun 1998 JP
10-155755 Jun 1998 JP
2001-50713 May 1999 JP
2003-329719 Nov 2003 JP
WO 1994005207 Mar 1994 WO
WO 1994013207 Jun 1994 WO
WO 1995029632 Nov 1995 WO
WO 1999053277 Oct 1999 WO
WO 2000010462 Mar 2000 WO
WO 2001034033 May 2001 WO
WO 2001078059 Oct 2001 WO
WO 2001097691 Dec 2001 WO
WO 2002003042 Jan 2002 WO
WO 2003058646 Jul 2003 WO
WO 2003087737 Oct 2003 WO
WO 2004000111 Dec 2003 WO
WO 2004004411 Jan 2004 WO
WO 2005096931 Oct 2005 WO
WO 2005099562 Oct 2005 WO
WO 2008017246 Feb 2008 WO
WO 2008080469 Jul 2008 WO
WO 2008148172 Dec 2008 WO
WO 2009093159 Jul 2009 WO
WO 2009137524 Nov 2009 WO
Non-Patent Literature Citations (29)
Entry
US 8,845,543 B2, 09/2014, Diab et al. (withdrawn)
US 9,579,050 B2, 02/2017, Al-Ali (withdrawn)
U.S. Appl. No. 15/851,176, Non-Invasive Monitoring of Respiratory Rate, Heart Rate and Apnea, filed Dec. 21, 2017.
U.S. Appl. No. 16/557,198, Acoustic Physiological Monitoring System, filed Aug. 30, 2019.
U.S. Appl. No. 16/155,456, Acoustic Respiratory Event Processor, filed Oct. 9, 2018.
U.S. Appl. No. 16/119,215, Plethysmographic Respiration Processor, filed Aug. 31, 2018.
U.S. Appl. No. 16/459,955, Respiration Event Processor, filed Jul. 2, 2019.
U.S. Appl. No. 13/900,984, Plethysmographic Respiration Analysis System, filed May 23, 2013.
Analog Devices, 12-Bit Serial Input Multiplying D/A Converter, Product Data Sheet, 2000.
Chambrin, M-C.; “Alarms in the intensive care unit: how can the number of false alarms be reduced?”; Critical Care Aug. 2001, vol. 5 No. 4; p. 1-5.
Eldor et al., “A device for monitoring ventilation during anaesthesia; the paratracheal audible respiratory monitor”, Canadian Journal of Anaesthesia, 1990, vol. 9, No. 1, p. 95-98.
GE Healthcare, “Transport Pro™ Patient Monitor Operator's Manual” Apr. 9, 2007, in 286 pages.
Gorges, M. et al; “Improving Alarm Performance in the Medical Intensive Care Unit Using Delays and Clinical Context”; Technology, Computing, and Simulation; vol. 108, No. 5, May 2009; p. 1546-1552.
Imhoff, M. et al; “Alarm Algorithms in Critical Care Monitoring”; Anesth Analg 2006;102:1525-37.
International Search Report & Written Opinion, PCT Application PCT/US2010/052758, dated Feb. 10, 2011; 12 pages.
International Search Report & Written Opinion, PCT Application PCT/US2010/058981, dated Feb. 17, 2011; 11 pages.
International Search Report and Written Opinion issued in application No. PCT/US2010/052756 dated Feb. 6, 2012.
International Search Report, PCT Application PCT/CA2003/000536, dated Dec. 11, 2003; 2 pages.
International Search Report, PCT Application PCT/US2009/069287, dated Mar. 30, 2010; 7 pages.
Japanese Office Action for JP Application No. 2007-506626 dated Mar. 1, 2011.
Sierra et al., Monitoring Respiratory Rate Based on Tracheal Sounds. First Experiences, Proceedings of the 26th Annual Int'l Conf. of the IEEE EMBS (Sep. 2004), 317-320.
Watt, R. C.; “Alarms and Anesthesia. Challenges in the design of Intelligent systems for Patient Monitoring”; IEEE Engineering in Medicine and Biology; Dec. 1993, p. 34-41.
Welch Allyn, ECG ASIC, Product Data Sheet, 2001.
Supplementary Partial European Search Report for International Application No. 05732095.4, dated Jun. 26, 2009 in 4 pages.
Theimer et al., “Definitions of audio features for music content description”, Algorithm Engineering Report TR08-2-001, Feb. 2008.
Stewart, C., Larson, V., “Detection and classification of acoustic signals from fixed-wing aircraft,” Systems Engineering, CH3051-0/91/0000-0025, IEEE, 1991.
Johnston, Development of a Signal Processing Library for Extraction of SpO2, HR, HRV, and RR from Photoplethysmographic Waveforms, Thesis: Degree of Master of Science, Worcester Polytechnic Institute, date of presentation/defense Jul. 17, 2006, date listed Jul. 27, 2006.
Hsu, “Signals and Systems”, Schaum's Theory and Problems, 1995, Ch. 3, p. 121.
Smith, “The Scientist and Engineer's Guide to Digital Signal Processing”, https://www.dspguide.com/ch17/2.htm, Ch. 17 and Ch. 18, 2 book editions, original 1997, updated 2002, pp. 297-318.
Related Publications (1)
Number Date Country
20200253509 A1 Aug 2020 US
Provisional Applications (2)
Number Date Country
61326200 Apr 2010 US
61252594 Oct 2009 US
Continuations (2)
Number Date Country
Parent 15669201 Aug 2017 US
Child 16791241 US
Parent 12905384 Oct 2010 US
Child 15669201 US