Acoustic physiological monitoring system

Information

  • Patent Grant
  • Patent Number
    11,963,749
  • Date Filed
    Friday, August 30, 2019
  • Date Issued
    Tuesday, April 23, 2024
  • Inventors
  • Original Assignees
  • Examiners
    • Cheng; Jacqueline
    • Tran; Tho Q
  • Agents
    • Knobbe, Martens, Olson & Bear, LLP
Abstract
An acoustic sensor attached to a medical patient can non-invasively detect acoustic vibrations indicative of physiological parameters of the medical patient and produce an acoustic signal corresponding to the acoustic vibrations. The acoustic signal can be integrated one or more times with respect to time, and a physiological monitoring system can determine pulse or respiration parameters based on the integrated acoustic signal. The physiological monitoring system can, for instance, estimate a pulse rate according to pulses in the integrated acoustic signal and a respiration rate according to a modulation of the integrated acoustic signal, among other parameters. Further, the physiological monitoring system can compare the integrated acoustic signal or parameters determined based on the integrated acoustic signal with other signals or parameters to activate alarms.
Description
BACKGROUND

The “piezoelectric effect” is the appearance of an electric potential and current across certain faces of a crystal when it is subjected to mechanical stresses. Due to their capacity to convert mechanical deformation into an electric voltage, piezoelectric crystals have been broadly used in devices such as transducers, strain gauges and microphones. However, before the crystals can be used in many of these applications they must be rendered into a form which suits the requirements of the application. In many applications, especially those involving the conversion of acoustic waves into a corresponding electric signal, piezoelectric membranes have been used.


Piezoelectric membranes are typically manufactured from polyvinylidene fluoride plastic film. The film is endowed with piezoelectric properties by stretching the plastic while it is placed under a high-poling voltage. By stretching the film, the film is polarized and the molecular structure of the plastic aligned. A thin layer of conductive metal (typically nickel-copper) is deposited on each side of the film to form electrode coatings to which connectors can be attached.


Piezoelectric membranes have a number of attributes that make them attractive for use in sound detection, including: a wide frequency range; a low acoustical impedance close to that of water and human tissue; a high dielectric strength; good mechanical strength; moisture resistance; and inertness to many chemicals.


SUMMARY

Acoustic sensors, such as piezoelectric membranes, can be used to determine respiration-related parameters, such as an individual's respiration rate, from an acoustic signal sensed from the neck of an individual, such as a medical patient. Because the respiration parameters may be determined based on frequency components of the sensed acoustic signal that may exceed about 100 Hz, the sensed acoustic signal can be filtered before signal processing to remove certain frequency components that may not be used to determine the respiration parameters. In one such embodiment, the sensed acoustic signal can be high-pass filtered to remove or diminish frequencies below about 100 Hz and pass frequencies above about 100 Hz. However, such filtering can remove or diminish pulse information that may be included in the sensed acoustic signal.


The systems and methods of this disclosure, in some embodiments, advantageously may not high-pass filter a sensed acoustic signal to remove or diminish frequency components below about 100 Hz. Instead, the sensed acoustic signal can be high-pass filtered at a lower frequency, such as about 0.1 Hz, 1 Hz, 10 Hz, 30 Hz, 40 Hz, or the like. The filtered acoustic signal can be further filtered to remove or reduce effects on the acoustic signal of a sensing device, which is used to sense and/or process the acoustic signal, to thereby obtain a compensated signal that may correspond closely to a pulse signal of the individual. The compensated signal can then be used to determine numerous respiration and pulse parameters, such as the individual's respiration rate or pulse rate.
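To make the cutoff trade-off concrete, the sketch below applies a simple one-pole high-pass filter (a generic stand-in, not the patent's actual filter implementation) to a synthetic 1.5 Hz tone standing in for pulse-band content. A corner near 100 Hz strips the tone almost entirely, while a corner near 0.1 Hz passes it nearly unchanged. The sampling rate and signal are assumptions chosen only for illustration.

```python
import math

def one_pole_highpass(x, fs, cutoff_hz):
    """Discrete one-pole high-pass: y[n] = a * (y[n-1] + x[n] - x[n-1])."""
    rc = 1.0 / (2.0 * math.pi * cutoff_hz)
    a = rc / (rc + 1.0 / fs)
    y = [0.0] * len(x)
    for n in range(1, len(x)):
        y[n] = a * (y[n - 1] + x[n] - x[n - 1])
    return y

def rms(x):
    """Root-mean-square amplitude, used here to measure how much signal survives."""
    return (sum(v * v for v in x) / len(x)) ** 0.5

fs = 1000  # Hz, assumed sampling rate
# 2 s of a 1.5 Hz tone standing in for pulse-band content of an acoustic signal.
pulse_band = [math.sin(2 * math.pi * 1.5 * n / fs) for n in range(2 * fs)]

kept = one_pole_highpass(pulse_band, fs, 0.1)      # low corner retains the pulse
removed = one_pole_highpass(pulse_band, fs, 100.0)  # 100 Hz corner strips it
```

Comparing `rms(kept)` with `rms(removed)` shows the pulse-band content surviving the low corner but not the 100 Hz corner, which is the motivation for the lower cutoffs listed above.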


Acoustic sensors and associated processing modules that together form a sensing device can inherently filter and change signals output by the sensing device. For example, the mechanical properties of an acoustic sensor, such as the materials of the acoustic sensor or the acoustic impedance match of the acoustic sensor to the skin of an individual, can influence an acoustic signal output by a sensing device. In addition, the electrical properties of a high-pass, band-pass, or low-pass filter module included in a sensing device can influence an acoustic signal output by the sensing device. Such filtering and changing of signals, unfortunately, can result in an acoustic signal output by a sensing device that may hide or mask an underlying physical signal detected by the sensing device. The output acoustic signal thus can be difficult to process for determining parameters for understanding the physiological condition of an individual.


The impact of a sensing device, including an acoustic sensor and one or more associated processing modules, on a detected acoustic signal can be understood in terms of a system transfer function. The sensing device can be considered to receive an input signal (for example, the vibration of an individual's skin) and then generate an output signal based on both the received input signal and a system transfer function. The sensing system, for instance, may be considered to output a signal that corresponds to the input signal after being influenced by the system transfer function.
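This input/output relationship can be sketched as a discrete convolution of the input with a device impulse response. The impulse-response values below are hypothetical, chosen only to show the idea of a system transfer function acting on a skin-vibration input:

```python
def apply_device(x, h):
    """Model the sensing device as an FIR system: y[n] = sum_k h[k] * x[n-k]."""
    y = [0.0] * len(x)
    for n in range(len(x)):
        for k, hk in enumerate(h):
            if n - k >= 0:
                y[n] += hk * x[n - k]
    return y

# Hypothetical impulse response: the device smooths and delays the input.
h = [0.5, 0.3, 0.2]
skin_vibration = [0.0, 1.0, 0.0, 0.0, 0.0]  # a unit impulse, for illustration
output = apply_device(skin_vibration, h)     # the impulse response appears at the output
```

Driving the model with a unit impulse returns the impulse response itself, which is the time-domain counterpart of the system transfer function described above.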


Accordingly, the systems and methods of this disclosure, in some embodiments, can filter an acoustic signal so as to reverse or undo the effects on the acoustic signal of a sensing device used for sensing or processing the acoustic signal. An acoustic signal can be obtained as a result that corresponds closely to a physical signal detected by the sensing device. This acoustic signal desirably can be understood in terms of physical limitations, boundaries, or intuitions since the acoustic signal may correspond closely to a physical signal. For example, the acoustic signal can directly correspond to an expansion and contraction of the sensed skin of an individual, which can be useful in determining accurate and reliable respiration and pulse parameters for the individual.
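As a sketch of this compensation idea, assume (purely for illustration) that the sensing device behaves like a known one-pole high-pass filter. Its difference equation can then be inverted algebraically to recover the input up to its unobservable initial value; the coefficient and signal values are invented:

```python
def device_model(x, a):
    """Assumed forward model: one-pole high-pass, y[n] = a * (y[n-1] + x[n] - x[n-1])."""
    y = [0.0] * len(x)
    for n in range(1, len(x)):
        y[n] = a * (y[n - 1] + x[n] - x[n - 1])
    return y

def compensate(y, a):
    """Invert the model: x[n] = x[n-1] + y[n]/a - y[n-1].

    Recovers the input up to its (unobservable) initial value x[0]."""
    x = [0.0] * len(y)
    for n in range(1, len(y)):
        x[n] = x[n - 1] + y[n] / a - y[n - 1]
    return x

skin = [0.0, 0.3, 1.0, 0.7, 0.2, 0.0]  # hypothetical physical signal
sensed = device_model(skin, 0.9)        # what the sensing device reports
recovered = compensate(sensed, 0.9)     # compensated signal tracks the input
```

Because undoing a high-pass filter is essentially an integration, this also illustrates why integrating the acoustic signal, as described below, can yield a signal that corresponds closely to the underlying physical signal.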


One aspect of this disclosure provides a physiological monitoring system configured to determine one or more pulse or respiration parameters from one or more of an acoustic signal and a plethysmograph signal. Before determining respiration or pulse parameters from the acoustic signal, the acoustic signal can be integrated one or more times with respect to time. The physiological monitoring system can utilize the integrated acoustic signal to estimate a pulse rate based on pulses in the integrated acoustic signal and a respiration rate based on modulation of the integrated acoustic signal, among other parameters. The physiological monitoring system further can compare the determined parameters with predetermined values or pulse and respiration parameters determined based on a plethysmograph signal, for example, to activate alarms of the physiological monitor.
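A minimal sketch of these two estimates follows: pulse rate from the spacing of pulses in the signal, and respiration rate from the modulation of the pulse amplitudes. The waveform, rates, sampling rate, and threshold are all invented for illustration, and the signal is assumed to stand in for an already-integrated acoustic signal:

```python
import math

def local_maxima(seq, thresh=float("-inf")):
    """Indices of local maxima above a threshold (plateaus counted once)."""
    return [n for n in range(1, len(seq) - 1)
            if seq[n] > thresh and seq[n] >= seq[n - 1] and seq[n] > seq[n + 1]]

def rate_per_min(times):
    """Average event rate, in events per minute, from a list of event times."""
    if len(times) < 2:
        return 0.0
    return 60.0 * (len(times) - 1) / (times[-1] - times[0])

fs = 250  # Hz, assumed sampling rate
# Synthetic stand-in for an integrated acoustic signal: 1.25 Hz (75/min) pulses
# whose amplitude is modulated at 0.25 Hz (15/min) to mimic respiration.
sig = [(1.0 + 0.5 * math.sin(2 * math.pi * 0.25 * n / fs))
       * math.sin(2 * math.pi * 1.25 * n / fs) for n in range(fs * 20)]

beat_idx = local_maxima(sig, thresh=0.0)
pulse_rate = rate_per_min([i / fs for i in beat_idx])  # from pulse spacing

amps = [sig[i] for i in beat_idx]  # the pulse amplitudes carry the modulation
resp_rate = rate_per_min([beat_idx[k] / fs for k in local_maxima(amps)])
```

On this synthetic input, `pulse_rate` lands near 75 per minute and `resp_rate` near 15 per minute, matching the rates built into the waveform.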


Advantageously, in certain embodiments, the pulse and respiration parameters determined in accordance with this disclosure can increase the robustness of a physiological monitoring system. For instance, the pulse and respiration parameters can provide one or more additional parameter values to validate the accuracy of parameters determined using one or more other physiological sensors. Moreover, the pulse and respiration parameters determined in accordance with this disclosure can be sensed closer to an individual's heart or chest than using one or more other types or placements of physiological sensors.





BRIEF DESCRIPTION OF THE DRAWINGS


FIGS. 1A-B are block diagrams illustrating physiological monitoring systems;



FIG. 1C is a top perspective view illustrating portions of a sensor system;



FIG. 2A illustrates an acoustic neck sensor and a chest sensor for physiological measurements;



FIG. 2B illustrates an acoustic neck sensor and a plethysmograph for physiological measurements;



FIG. 3 is a schematic diagram of acoustic and optical sensors and sensor drive elements and a corresponding digital signal processor and I/O drive elements;



FIG. 4 is a block diagram of a pulse and respiration processor of a physiological monitor that includes an acoustic signal processor and a plethysmograph signal processor;



FIG. 5 is a block diagram of an example acoustic signal processor;



FIG. 6 is a block diagram of an example acoustic filter;



FIG. 7A is an example acoustic signal processed by an acoustic signal processor;



FIG. 7B is an example filtered acoustic signal generated by a filter;



FIG. 7C is another example filtered acoustic signal generated by a filter;



FIG. 7D is an example filtered acoustic signal generated by a filter that illustrates amplitude modulation;



FIG. 8 illustrates a process for determining a patient pulse rate based on an acoustic signal;



FIG. 9 illustrates a process for detecting an acoustic probe error;



FIG. 10 illustrates a process for determining a patient respiration rate based on an acoustic signal; and



FIG. 11 illustrates example signals processed by an acoustic signal processor.





DETAILED DESCRIPTION

In various embodiments, a physiological monitoring system that includes an acoustic signal processing system can communicate with an acoustic sensor to measure or determine any of a variety of physiological parameters of a medical patient. For example, the physiological monitoring system can include an acoustic monitor. The acoustic monitor may, in an embodiment, be an acoustic respiratory monitor that can determine one or more respiratory parameters of the patient, including respiratory rate, expiratory flow, tidal volume, minute volume, apnea duration, breath sounds, rales, rhonchi, stridor, and changes in breath sounds such as decreased volume or change in airflow. In addition, in some implementations, the acoustic signal processing system can be used to monitor or determine other physiological sounds, such as patient heart rate to help with probe off detection, heart sounds (S1, S2, S3, S4, and murmurs), or change in heart sounds including normal to murmur or split heart sounds indicating fluid overload. Moreover, the acoustic signal processing system can further communicate with a second probe placed over the patient's chest for additional heart sound detection in some implementations.


In certain embodiments, the physiological monitoring system can include an electrocardiograph (ECG or EKG) that may measure or process electrical signals generated by the cardiac system of a patient. The ECG can include one or more sensors for measuring the electrical signals. In some implementations, the electrical signals can be obtained using the same sensors that may be used to obtain acoustic signals.


In certain embodiments, the physiological monitoring system can communicate with one or more additional sensors to determine other desired physiological parameters for a patient. For example, a photoplethysmograph sensor can be used to determine the concentrations of analytes contained in the patient's blood, such as oxyhemoglobin, carboxyhemoglobin, methemoglobin, other dyshemoglobins, total hemoglobin, fractional oxygen saturation, glucose, bilirubin, and/or other analytes. In another example, a capnograph can be used to determine the carbon dioxide content in inspired and expired air from a patient. In yet another example, one or more other sensors, such as a pneumotachometer for measuring air flow and a respiratory effort belt, can be used to determine blood pressure, flow rate, air flow, and fluid flow (first derivative of pressure). In certain embodiments, the sensors can be combined in a single processing system that can process the one or more signals output from the sensors on a single multi-function circuit board.



FIGS. 1A through 1C illustrate example patient monitoring systems, sensors, and cables that can be used to provide acoustic physiological monitoring, such as acoustic pulse and respiration monitoring, of a patient.



FIG. 1A shows an embodiment of a physiological monitoring system 10. In the monitoring system 10, a medical patient 12 can be monitored using one or more sensors 13, each of which can transmit a signal over a cable 15 or other communication link or medium to a physiological monitor 17. The physiological monitor 17 can include a processor 19 and, optionally, a display 11. The one or more sensors 13 can include sensing elements such as, for example, acoustic piezoelectric devices, electrical ECG leads, pulse oximetry sensors, or the like. The one or more sensors 13 can generate respective signals by sensing a physiological condition of the patient 12. The signals can then be processed by the processor 19. The processor 19 can communicate the processed signal to the display 11 if a display 11 is provided. In an embodiment, the display 11 is incorporated in the physiological monitor 17. In another embodiment, the display 11 is separate from the physiological monitor 17. The monitoring system 10 can, for instance, be a portable monitoring system or a pod, without a display, that may be adapted to provide physiological parameter data to a display.


For clarity, a single block is used to illustrate the one or more sensors 13 shown in FIG. 1A. It should be understood that the sensor 13 shown is intended to represent one or more sensors. In an embodiment, the one or more sensors 13 include a single sensor of one of the types described below. In another embodiment, the one or more sensors 13 include at least two acoustic sensors. In still another embodiment, the one or more sensors 13 include at least two acoustic sensors and one or more ECG sensors, pulse oximetry sensors, bioimpedance sensors, capnography sensors, or the like. Additional sensors of different types can also be included. Other combinations of numbers and types of sensors are also suitable for use with the physiological monitoring system 10.


In some embodiments of the system shown in FIG. 1A, the hardware used to receive and process signals from the sensors is housed within the same housing. In other embodiments, some of the hardware used to receive or process the signals can be housed within a separate housing. In addition, the physiological monitor 17 can include hardware, software, or both hardware and software, whether in one housing or multiple housings, usable to receive and process the signals transmitted by the one or more sensors 13.


As shown in FIG. 1B, the one or more sensors 13 can include a cable 25. The cable 25 can include three conductors within an electrical shielding. One conductor 26 can provide power to a physiological monitor 17, one conductor 28 can provide a ground signal to the physiological monitor 17, and one conductor 28 can transmit signals from the one or more sensors 13 to the physiological monitor 17. For implementations with multiple sensors, one or more additional cables 115 can further be provided.


In some embodiments, the ground signal can be an earth ground, but in other embodiments, the ground signal may be a patient ground, sometimes referred to as a patient reference, a patient reference signal, a return, or a patient return. In some embodiments, the cable 25 can carry two conductors within an electrical shielding layer, and the shielding layer can act as the ground conductor. Electrical interfaces 23 in the cable 25 can enable the cable to electrically connect to electrical interfaces 21 in a connector 20 of the physiological monitor 17. In another embodiment, the sensor 13 and the physiological monitor 17 communicate wirelessly, such as via an IEEE standard (e.g., IEEE 802, IEEE 802.11 a/b/g/n, WiFi™, or Bluetooth™).



FIG. 1C illustrates an embodiment of a sensor system 100 including a sensor 101 suitable for use with the physiological monitors shown in FIGS. 1A and 1B. The sensor system 100 can include the sensor 101, a sensor cable 117, and a connector 105 attached to the sensor cable 117. The sensor 101 can include a shell 102, an acoustic coupler 103, and a frame 104, which may also be referred to as a sensor support, configured to house certain componentry of the sensor 101, and an attachment portion 107 positioned on the sensor 101 and configured to attach the sensor 101 to the patient.


The sensor 101 can be removably attached to an instrument cable 111 via an instrument cable connector 109. The instrument cable 111 can be attached to a cable hub 120, which can include a port 121 for receiving a connector 112 of the instrument cable 111 and a second port 123 for receiving another cable. In certain embodiments, the second port 123 can receive a cable connected to a pulse oximetry or other sensor. In addition, the cable hub 120 could include additional ports for receiving one or more additional cables in other embodiments. The hub 120 can include a cable 122 that terminates in a connector 124 adapted to connect to a physiological monitor. In another embodiment, no hub may be provided and the acoustic sensor 101 can be connected directly to the monitor, via an instrument cable 111, or directly by the sensor cable 117, for example. Examples of compatible hubs are described in U.S. patent application Ser. No. 12/904,775, filed on Oct. 14, 2010, which is incorporated by reference in its entirety herein. Examples of acoustic sensors are described in U.S. patent application Ser. No. 14/030,268, filed on Sep. 18, 2013, which is incorporated by reference in its entirety herein.


The component or group of components between the sensor 101 and monitor can be referred to generally as a cabling apparatus. For example, where one or more of the following components are included, such components or combinations thereof can be referred to as a cabling apparatus: the sensor cable 117, the connector 105, the cable connector 109, the instrument cable 111, the hub 120, the cable 122, or the connector 124. It should be noted that one or more of these components may not be included, and that one or more other components may be included between the sensor 101 and the monitor to form the cabling apparatus.


In an embodiment, the acoustic sensor 101 includes one or more sensing elements, such as, for example, one or more piezoelectric devices or other acoustic sensing devices. Where a piezoelectric membrane may be used, a thin layer of conductive metal can be deposited on each side of the film as electrode coatings, forming electrical poles. The opposing surfaces or poles may be referred to as an anode and a cathode, respectively. Each sensing element can be configured to mechanically deform in response to sounds emanating from the patient and generate a corresponding voltage potential across the electrical poles of the sensing element.


The shell 102 can house a frame or other support structure configured to support various components of the sensor 101. The one or more sensing elements can be generally wrapped in tension around the frame. For example, the sensing elements can be positioned across an acoustic cavity disposed on the bottom surface of the frame. Thus, the sensing elements can be free to respond to acoustic waves incident upon them, resulting in corresponding induced voltages across the poles of the sensing elements.


Additionally, the shell 102 can include an acoustic coupler, which advantageously can improve the coupling between the signal source to be measured by the sensor (for example, the patient's body) and the sensing element. The acoustic coupler can include a bump positioned to apply pressure to the sensing element so as to bias the sensing element in tension. In one example, the bump can be positioned against the portion of the sensing element that may be stretched across the cavity of the frame. The acoustic coupler further can include a protrusion on the upper portion of the inner lining, which exerts pressure on the backbone 110 and other internal components of the sensor 101.


The attachment portion 107 can help secure the sensor assembly 101 to the patient. The illustrated attachment portion 107 can include first and second attachment arms 106, 108. The attachment arms can be made of any number of materials, such as plastic, metal or fiber. Furthermore, the attachment arms can be integrated with the backbone. The underside of the attachment arms 106, 108 can include a patient adhesive (for example, tape, glue, a suction device, or the like), which can be used to secure the sensor 101 to a patient's skin. The attachment portion 107 further can include a resilient backbone member 110 which may extend into and form a portion of the attachment arms 106, 108. The backbone 110 can be placed above or below the attachment arms 106, 108, or can be placed between an upper portion and a lower portion of the attachment arms 106, 108. Furthermore, the backbone can be constructed of any number of resilient materials, such as plastic, metal, fiber, combinations thereof, or the like.


As the attachment arms 106, 108 are brought down into contact with the patient's skin on either side of the sensor 101, the adhesive affixes to the patient. Moreover, the resiliency of the backbone 110 can cause the sensor 101 to be beneficially biased in tension against the patient's skin or can reduce stress on the connection between the patient adhesive and the skin. Further examples of compatible attachment portions, associated functionality and advantages are described in U.S. application Ser. No. 12/643,939 (the '939 Application), which is incorporated by reference herein. For example, embodiments of attachment portions are shown in and described with respect to FIGS. 2B, 2C, 9A-9D and 10 of the '939 Application, which is explicitly incorporated by reference herein in its entirety.


The acoustic sensor 101 can further include circuitry for detecting and transmitting information related to biological sounds to the physiological monitor. These biological sounds can include heart, breathing, or digestive system sounds, in addition to many other physiological phenomena. The acoustic sensor 101 in certain embodiments is a biological sound sensor, such as the sensors described herein. In some embodiments, the biological sound sensor is one of the sensors such as those described in U.S. patent application Ser. No. 12/044,883, filed Mar. 7, 2008, which is incorporated in its entirety by reference herein. In other embodiments, the acoustic sensor 101 can be a biological sound sensor such as those described in the '939 Application. Other embodiments can include other suitable acoustic sensors. For example, in certain embodiments, compatible acoustic sensors can be configured to provide a variety of auscultation functions, including live or recorded audio output (e.g., continuous audio output) for listening to patient bodily or speech sounds. Examples of such sensors and sensors capable of providing other compatible functionality can be found in U.S. patent application Ser. No. 12/905,036, filed on Oct. 14, 2010, which is incorporated by reference herein in its entirety.


While the sensor system 100 has been provided as one example sensor system, embodiments described herein are compatible with a variety of sensors and associated components.



FIGS. 2A-B illustrate physiological acoustic monitoring system 200 embodiments having sensors in communication with a physiological monitor 205. As shown in FIG. 2A, a first acoustic sensor 210 can be neck-mounted and utilized for monitoring body sounds and deriving one or more physiological parameters, such as the pulse or respiration rate of the patient 201. An optional second acoustic sensor 220 can be utilized to monitor body sounds. In an embodiment, the body sound sensor 220 may be chest-mounted for monaural heart sound monitoring and for determination of heart rate. In another embodiment, the second acoustic sensor 220 can include an additional body sound sensor mounted proximate the same body site, but with sufficient spatial separation to allow for stereo sensor reception. As shown in FIG. 2B, an optional plethysmograph sensor 230 coupled to the finger of a patient can further be utilized for monitoring and deriving one or more physiological parameters, such as respiration or pulse rate of the patient 201.



FIG. 3 illustrates acoustic 301 and optical 302 sensors and sensor drive elements 303 and a corresponding digital signal processor 340 and I/O drive elements 304. Some elements in FIG. 3, such as piezoelectric membrane 317 and optical front-end 325, are denoted as within a dashed area as optional features and can be included individually or as sets of elements in some embodiments of a physiological monitoring system.


A multi-acoustic sensor configuration 301 can include a power interface 313, as well as a piezo circuit 316 and a piezoelectric membrane 317 corresponding to each sensor head 306, 307. The piezoelectric membrane 317 can sense vibrations and generate a voltage in response to the vibrations. The signal generated by the piezoelectric membrane can be communicated to the piezo circuit and transmitted to the monitor 205 (FIGS. 2A-B) for signal conditioning and processing. The piezo circuit can decouple the power supply 313 and perform preliminary signal conditioning. In an embodiment, the piezo circuit 316 can include clamping diodes to provide electrostatic discharge (ESD) protection and a mid-level voltage DC offset for the piezoelectric signal to ride on, to be superimposed on, or to be added to. The piezo circuit may also, for instance, have a high pass filter to eliminate unwanted low frequencies, such as below about 100 Hz for some breath sound applications or below about 30 Hz for some pulse sound applications, and an op amp to provide gain to the piezoelectric signal. The piezo circuit may also have a low pass filter on the output of the op amp to filter out unwanted high frequencies. In an embodiment, a high pass filter can be provided on the output in addition to or instead of the low pass filter. The piezo circuit may also provide impedance compensation to the piezoelectric membrane, such as a series/parallel combination used to control the signal level strength and frequency of interest that can be input to the op amp. In one embodiment, the impedance compensation can be used to minimize the variation of the piezoelectric element output. The impedance compensation can be constructed of any combination of resistive, capacitive, and inductive elements, such as RC or RLC circuits.
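The corner frequencies mentioned above follow from the standard first-order RC relation f_c = 1/(2πR·C). The component values below are hypothetical, chosen only to show the arithmetic of picking a capacitor for a breath-sound versus a pulse-sound corner:

```python
import math

def rc_highpass_cutoff(r_ohms, c_farads):
    """-3 dB corner of a first-order RC high-pass: f_c = 1 / (2*pi*R*C)."""
    return 1.0 / (2.0 * math.pi * r_ohms * c_farads)

def cap_for_cutoff(r_ohms, f_c_hz):
    """Capacitance giving a target corner frequency for a given resistance."""
    return 1.0 / (2.0 * math.pi * r_ohms * f_c_hz)

# Hypothetical values: with R = 100 kOhm, compare the capacitors needed for a
# ~100 Hz breath-sound corner versus a ~30 Hz pulse-sound corner.
c_breath = cap_for_cutoff(100e3, 100.0)  # about 15.9 nF
c_pulse = cap_for_cutoff(100e3, 30.0)    # about 53.1 nF
```

The lower pulse-sound corner requires the larger capacitance, which is one practical consequence of moving the high-pass corner below about 100 Hz.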


As shown in FIG. 3, a physiological acoustic monitor 300 embodiment can drive and process signals from the multi-acoustic sensor 301 and the optical sensor 302. The monitor 300 can include one or more acoustic front-ends 321, 322, an analog-to-digital (A/D) converter 331, an audio driver 370 and a digital signal processor (DSP) 340. The DSP 340 can include a wide variety of data or signal processors capable of executing programs for determining physiological parameters from input data. An optical front-end 325, digital-to-analog (D/A) converters 334 and an A/D converter 335 can drive emitters 308 and transform resulting composite analog intensity signal(s) from light sensitive detector(s) 309 received via a sensor cable 310 into digital data input to the DSP 340. The acoustic front-ends 321, 322 and A/D converter 331 can transform analog acoustic signals from piezoelectric elements 301 into digital data input to the DSP 340. The A/D converter 331 is shown as having a two-channel analog input and a multiplexed digital output to the DSP. In another embodiment, each front-end can communicate with a dedicated single channel A/D converter generating two independent digital outputs to the DSP. An acoustic front-end 321 can also feed an acoustic sensor signal 311 directly into an audio driver 370 for direct and continuous acoustic reproduction of an unprocessed (raw) sensor signal by a speaker, earphones or other audio transducer 362.


Also shown in FIG. 3, the monitor 300 may also have an instrument manager 350 that communicates between the DSP 340 and input/output 360. One or more I/O devices 360 can communicate with the instrument manager 350 including displays, alarms, user I/O and instrument communication ports. Alarms 366 may be audible or visual indicators or both. The user I/O 368 may be, as examples, keypads, touch screens, pointing devices or voice recognition devices, or the like. The displays 364 can be indicator, numeric, or graphic displays for presenting one or more of various physiological parameters or acoustic data. The instrument manager 350 may also be capable of storing or displaying historical or trending data related to one or more parameters or acoustic data.


Further shown in FIG. 3, the physiological acoustic monitor 300 may also have a “push-to-talk” feature that provides a “listen on demand” capability. For example, a button 368 on the monitor can be pushed or otherwise actuated so as to initiate acoustic sounds to be sent to a speaker, handheld device, or other listening device, either directly or via a network. The monitor 300 may also have a “mode selector” button or switch 368 that can determine the acoustic content provided to a listener, either local or remote. These controls may be actuated locally or at a distance by a remote listener. In an embodiment, push on demand audio occurs on an alarm condition in lieu of or in addition to an audio alarm. Controls 368 may include output filters like those on a high-quality stereo system so that a clinician or other user could selectively emphasize or deemphasize certain frequencies so as to home in on particular body sounds or characteristics.


In various embodiments, the monitor 300 can include one or more processor boards installed within and used for communicating with a host instrument. Generally, a processor board incorporates the front-end, drivers, converters and DSP. Accordingly, the processor board can derive physiological parameters and communicate values for those parameters to the host instrument. Correspondingly, the host instrument can incorporate the instrument manager and I/O devices. The processor board may also include one or more microcontrollers for board management, including, for example, communications of calculated parameter data or the like to the host instrument.


Communications 369 may transmit or receive acoustic data or audio waveforms via local area or wide area data networks or cellular networks. Controls may cause the audio processor to amplify, filter, shape or otherwise process audio waveforms so as to emphasize, isolate, deemphasize or otherwise modify various features of the audio waveform or spectrum. In addition, switches, such as a “push to play” button can initiate audio output of live or recorded acoustic data. Controls may also initiate or direct communications.



FIG. 4 is a block diagram of a pulse and respiration processor 400 of a physiological monitor that can include an acoustic signal processor 410, a plethysmograph (“pleth”) signal processor 420, and a collection processing module 430. The acoustic signal processor 410 can include, for instance, any of the acoustic signal processors described in this disclosure. The plethysmograph signal processor 420 can include, for instance, any of the plethysmograph signal processors described in this disclosure. The one or more processors 19 of FIGS. 1A-B and DSP 340 of FIG. 3 can include the pulse and respiration processor 400.


The pulse and respiration processor 400 can determine one or more pulse or respiration parameters from one or more of an acoustic signal 412 and a plethysmograph signal 422. The acoustic signal processor 410 can receive an input acoustic signal 412, such as an acoustic signal obtained from the neck of an individual via the first acoustic sensor 210 of FIGS. 2A-B or the sensor head 306 of FIG. 3. The acoustic signal 412 can correspond to a signal received from the A/D converter 331 of FIG. 3. The plethysmograph signal processor 420 can receive the input plethysmograph signal 422, such as a plethysmographic signal obtained from the finger of a patient via plethysmograph sensor 230 of FIG. 2B or optical sensor 302 of FIG. 3. The plethysmographic signal can correspond to a signal received from the A/D converter 335 of FIG. 3.


The acoustic signal processor 410 and plethysmograph signal processor 420 can each respectively determine pulse and respiration parameters, such as a pulse rate (“PR”) and respiration rate (“RR”) of a patient. The acoustic signal processor 410 can output 414 the parameters determined based on the acoustic signal 412 to the collection processing module 430, and plethysmograph signal processor 420 can output 424 the parameters determined based on the plethysmograph signal 422 to the collection processing module 430. The collection processing module 430 can include a decision logic module 430A (sometimes referred to as an arbiter or arbitration module) and a probe error detection module 430B. The collection processing module 430 can perform processing of received parameters and output 434 arbitrated parameters for additional processing or detected probe errors, such as for triggering alarm conditions corresponding to the status of a patient.


In some embodiments, the pulse and respiration processor 400 can determine other pulse or respiration information, such as estimating a carotid intensity or respiration events. Such carotid intensity information may be used as an indication of blood pressure changes or pulse variability of an individual. The respiratory events can include information regarding a time when inspiration or expiration begins (Ti or Te, respectively), a time duration of an inspiration or an expiration (Tie or Tei, respectively), a ratio of the time duration of inspiration to expiration, or of expiration to inspiration (Tie/Tei or Tei/Tie, respectively), or some other respiratory event (e.g., conclusion of inspiration or expiration, midpoint of inspiration or expiration, or any other marker indicating a specific time within the respiratory cycle, or the like). Such respiratory event information may be used to further identify the occurrence of various respiratory conditions, such as apnea, occlusion of the breathing passageway, or snoring, for example.



FIG. 5 is a block diagram of the acoustic signal processor 410 according to one embodiment. As illustrated, the acoustic signal processor 410 can include an acoustic filter 510 and an acoustic signal processing module 520. The acoustic filter 510 can filter the acoustic signal 412 to perform an inverse filtering relative to a transfer function of a sensing device (for example, including the piezoelectric membrane 317 and associated processing circuitry) used to sense the acoustic signal from the patient. The acoustic filter 510 can, for instance, perform a deconvolution using the transfer function of the sensing device and the acoustic signal 412 to undo, reverse, or diminish the impact of the sensing device on the acoustic signal 412. In one implementation, where the transfer function for a sensing device results in one or more derivatives with respect to time being performed on the detected acoustic signal, the acoustic filter 510 can integrate the acoustic signal 412 one or more times with respect to time to obtain a filtered acoustic signal 514 corresponding to the carotid pulse of an individual. A sensing device can have a transfer function that results in one or more derivatives with respect to time being performed on the detected acoustic signal when, for example, the sensing device includes one or more high-pass filters. Each high-pass filter in a sensing device can function as a differentiator of the acoustic signal. For instance, a sensing device may include a piezoelectric membrane, which can function as a high-pass filter of an acoustic signal, as well as one or more cutoff high-pass filters.


In some embodiments, the transfer function for a particular sensing device can be programmed or determined for the acoustic filter 510 at manufacture, setup-time, or runtime of a physiological monitor. In one example, a known input signal, which has an expected output signal, can be provided to the sensing device at manufacture. By analyzing the actual output signal, expected output signal, and known input signal, the transfer function for the particular sensing device can be determined and then stored to a memory of the monitor for later retrieval. In another example, the outputs of different sensors that may be connected to the same input signal can be compared at setup-time and used to determine the transfer function. Again, the determined transfer function can be stored to a memory of the monitor for later retrieval. In other implementations, one or more other approaches additionally or alternatively can be used to determine the transfer function for a particular sensing device.


The acoustic signal processing module 520 can include a pulse processor 520A and respiration processor 520B configured to determine one or more pulse or respiration parameters, respectively, based on the filtered acoustic signal 514. The pulse processor 520A and respiration processor 520B can output the determined pulse and respiration parameters 414A, 414B for further processing, such as by the collection processing module 430 of FIG. 4. In some embodiments, the respiration processor 520B can process the filtered acoustic signal 514 to determine one or more respiration parameters, such as respiratory rate, as disclosed in U.S. patent application Ser. No. 14/201,566, filed on Mar. 7, 2014, which is incorporated herein by reference in its entirety.



FIG. 6 is a block diagram of the acoustic filter 510 according to one embodiment. The acoustic filter 510 can filter the acoustic signal 412 in the frequency domain to reverse the effects of a transfer function of a sensing device used to sense the acoustic signal from the patient. The acoustic filter 510 can, in one implementation, integrate the acoustic signal 412 one or more times with respect to time to generate the filtered acoustic signal 514. In some embodiments, the acoustic filter 510 can integrate the acoustic signal 412 twice with respect to time to obtain the filtered acoustic signal 514. The filtered acoustic signal 514 can advantageously be a signal corresponding to an individual's carotid pulse and has relatively minimal noise or few other dominating frequency components. The filtered acoustic signal 514 can enable the straightforward determination of numerous characteristics indicative of pulse or respiration parameters of an individual.


As illustrated in FIG. 6, the acoustic filter 510 can include a frequency domain (“FD”) transform module 610, a filtering module 620, and a time domain (“TD”) transform module 630. The FD transform module 610 and TD transform module 630 together can enable performance of filtering by the filtering module 620 in a domain other than the time domain. Advantageously, in certain embodiments, performing filtering, such as integration, in the frequency domain can reduce the complexity of calculations when performing filtering. For instance, integrating in the frequency domain can permit integration calculations without accounting for additional constants that would be added if the integration were performed in the time domain.


The frequency domain transform module 610 can receive the input acoustic signal 412 and transform the acoustic signal 412 to generate a frequency domain equivalent transformed signal 614. In one embodiment, the frequency domain transform module 610 can perform a fast Fourier transform (“FFT”) of the acoustic signal 412 to generate the transformed signal 614. The filtering module 620 can receive the transformed signal 614 and, in the case of integration filtering, scale the transformed signal 614 by a frequency function, such as a function proportional to (2πf)−2, to generate a scaled signal 624. The filtering module 620 can thus integrate the transformed signal 614 with respect to time in the frequency domain. The time domain transform module 630 can then transform the scaled signal 624 to a time domain equivalent filtered acoustic signal 514. In one embodiment, the time domain transform module 630 can perform an inverse fast Fourier transform (“IFFT”) of the scaled signal 624 to generate the filtered acoustic signal 514.
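The FFT, scaling, and IFFT steps above can be sketched in code. The following is an illustrative sketch only, using NumPy; the function name and sampling-rate parameter are assumptions, as is the zeroing of the DC bin (which avoids division by zero and discards any constant offset). Note that 1/(j2πf)² = −(2πf)⁻², so a scaling function proportional to (2πf)⁻² carries a sign change for double integration.

```python
import numpy as np

def integrate_twice_fd(signal, fs):
    """Integrate a signal twice with respect to time in the frequency domain.

    Transforms to the frequency domain, scales each bin by
    1 / (j*2*pi*f)**2 = -(2*pi*f)**-2, and transforms back. The DC bin is
    zeroed to avoid division by zero (this also removes any constant offset).
    """
    n = len(signal)
    spectrum = np.fft.rfft(signal)             # frequency domain transform (610)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)     # bin frequencies in Hz
    scale = np.zeros_like(freqs)
    nonzero = freqs > 0
    scale[nonzero] = -1.0 / (2.0 * np.pi * freqs[nonzero]) ** 2
    scaled = spectrum * scale                  # filtering module (620)
    return np.fft.irfft(scaled, n=n)           # time domain transform (630)
```

Integrating a pure tone sin(2πf₀t) twice this way yields −sin(2πf₀t)/(2πf₀)², matching the analytic double integral up to the discarded constants.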



FIG. 7A is a normalized acoustic signal 700, such as the acoustic signal 412 of FIGS. 4-6, processed by an acoustic signal processor, such as the acoustic signal processor 410 of FIGS. 4 and 5. The acoustic signal 700 can be sensed from the neck of a patient via an acoustic sensor, such as the first acoustic sensor 210 of FIGS. 2A-B or the sensor head 306 of FIG. 3. The acoustic signal 700 is shown plotted on an intensity axis versus a time axis. As can be seen in FIG. 7A, the acoustic signal 700 can be a relatively chaotic signal, including numerous frequency components ranging from low to high frequency components.


In one implementation, the steps of sensing and processing the acoustic signal 700 from an individual's neck can result in a differentiation with respect to time of the individual's physiological pulse signal. Accordingly, the acoustic signal 700 can be integrated with respect to time to reverse one or more differentiations during sensing and processing. For example, the piezo circuits illustrated in FIG. 3 can output a signal corresponding to the derivative of the sensed motion of the skin of a patient. Further, before processing the signal at the DSP, a high-pass filter can be utilized and thus output the derivative with respect to time of the received signals from the piezo circuits. As a result, advantageously, in certain embodiments, the acoustic signal 700 can be filtered by an acoustic filter, such as acoustic filter 510 of FIGS. 5 and 6, by computing the double integral of the acoustic signal to obtain a signal corresponding to an individual's carotid pulse that may have relatively minimal noise or few other dominating frequency components.



FIG. 7B is a normalized filtered acoustic signal 720 generated by a filter, such as the acoustic filter 510 of FIGS. 5 and 6. The acoustic signal 700 of FIG. 7A may have been integrated twice with respect to time to generate the filtered acoustic signal 720. The filtered acoustic signal 720 is shown plotted on an intensity axis versus a time axis. As can be seen in FIG. 7B, the filtered acoustic signal 720 can be a relatively ordered signal, including fewer frequency components than the acoustic signal 700 of FIG. 7A.



FIG. 7C is another normalized filtered acoustic signal 740 generated by a filter, such as the acoustic filter of FIGS. 5 and 6. The filtered acoustic signal 740 can be a closer view of the filtered acoustic signal 720 of FIG. 7B. The filtered acoustic signal 740 is shown plotted on an intensity axis versus a time axis. Advantageously, in certain embodiments, the filtered acoustic signal 740 can be used by the acoustic signal processing module 520 of FIG. 5 to determine numerous pulse and respiration parameters of an individual.


The filtered acoustic signal 740 can have multiple pulses 742, each with a peak 744 and a valley 746 and extending over a time period 748, where the reciprocal of the time period 748 may equal a pulse rate. A carotid index (CI) value can be defined for each pulse 742:









CI = AC / DC          (1)








where “AC” 752 designates a peak amplitude 744 minus a valley amplitude 746 for a particular pulse, and “DC” 750 designates a peak amplitude 744 relative to a particular intensity level. A pulse variability measure can be calculated that may be responsive to the magnitude of pulse variations, such as the amplitude modulation described with respect to FIG. 7D and depicted by envelope 770 of FIG. 7D, for example. One pulse variability measure can be a pulse variability index (PVI). In an embodiment, PVI is calculated as:









PVI = ((CIMAX − CIMIN) / CIMAX) × 100          (2)








where “CIMAX” designates a maximum CI over a particular period of time and “CIMIN” designates a minimum CI over the particular period of time. Thus, PVI can be the CI variation, expressed as a percentage of the maximum CI. Advantageously, in certain embodiments, pulse variability measures such as PVI can provide a parameter indicative of an individual's physical condition or health.
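Equations (1) and (2) can be computed directly from per-pulse peak and valley amplitudes. The sketch below is illustrative only; the function names are assumptions, and the reference intensity level for the DC term defaults to zero.

```python
def carotid_index(peak, valley, reference=0.0):
    """Equation (1): CI = AC / DC, where AC is the peak amplitude minus the
    valley amplitude for a pulse, and DC is the peak amplitude relative to a
    reference intensity level (assumed zero by default)."""
    ac = peak - valley
    dc = peak - reference
    return ac / dc

def pulse_variability_index(ci_values):
    """Equation (2): the CI variation over a particular period of time,
    expressed as a percentage of the maximum CI."""
    ci_max = max(ci_values)
    ci_min = min(ci_values)
    return (ci_max - ci_min) / ci_max * 100.0
```

For example, a pulse with peak 1.0 and valley 0.4 has CI = 0.6, and CI values of 0.4 to 0.6 over a window give a PVI of about 33%.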


The pulse processor 520A of the acoustic signal processing module 520 can analyze the filtered acoustic signal 740 as discussed with respect to FIG. 7C to determine numerous other pulse parameters. In addition to determining a pulse rate, CI, and PVI, the pulse processor 520A can, for instance, detect blood pressure changes. Such parameter information can be useful for determining appropriate doses or timings for delivery of medicine to an individual or designing an intelligent cuff inflation system for measuring patient blood pressure. Moreover, in certain embodiments, advantageously the parameter information can be based on a carotid signal sensed closer to an individual's heart and with fewer turns in vasculature than a signal sensed from the individual's wrist or finger, and thus can be useable to determine relatively reliable or accurate parameter information.


The collection processing module 430 can receive the pulse rate and related pulse parameters from the acoustic signal processor 410. The probe error detection module 430B of the collection processing module 430 can use the parameters, for example, to determine a sensor or probe connection state including a probe-off, probe-error, or probe-on state, such as discussed with respect to FIG. 9. Further, the collection processing module 430 can use the pulse rate and other pulse parameters and available information to determine a pulse wave transit time (PWTT), corresponding to the blood pressure of an individual. Advantageously, in certain embodiments, by using the filtered acoustic signal 740 and another signal from an acoustic sensor near an individual's heart, PWTT can be determined with greater robustness and accuracy than using some other methods. The filtered acoustic signal 740 and the signal from the other acoustic sensor can provide signals in the fluid domain that may not introduce domain conversion delay. For instance, if PWTT were determined using an ECG signal, the determined PWTT value could include a domain transition delay time for a bodily electrical signal to transfer to the individual's muscles.



FIG. 7D is a filtered acoustic signal 760 that illustrates amplitude modulation. Inhalation and exhalation can create positive pressure and negative pressure, respectively, on an individual's blood vessels, which may modulate the individual's pulse signal. Under certain conditions, an individual's respiration can amplitude modulate (“AM”) 762 an acoustic signal, such as filtered acoustic signal 720 of FIG. 7B, sensed from the neck of the individual. In particular, the modulation period 764 can be inversely related to the individual's respiration rate. Certain implementations may utilize other modulations of the acoustic signal, such as a frequency modulation, to determine the respiration rate in place of or in addition to amplitude modulation.


In some embodiments, respiration rate can be determined in the frequency domain by analyzing the spectrum of the filtered acoustic signal 760. In the frequency domain, the filtered acoustic signal 760 can include at least a peak corresponding to the pulse rate and two respiration peak sidebands, displaced on either side of the pulse rate peak. By extracting the respiration peak sidebands, the respiration rate corresponding to the two respiration peaks can be determined.


In some embodiments, respiration rate can be determined in the time domain based on the respiration modulation period 764. A time domain calculation may be based upon envelope detection of the filtered acoustic signal 760, such as a curve-fit to the peaks (or valleys) of the filtered acoustic signal 760 or, alternatively, the peak-to-peak variation. Related measurements of variation in a plethysmograph envelope are described, for instance, in U.S. patent application Ser. No. 11/952,940, filed Dec. 7, 2007, which is incorporated by reference in its entirety herein.


In some embodiments, the respiration processor 520B of FIG. 5 can determine local maxima 766 and minima 770 in the upper envelope 762 of the filtered acoustic signal 760. The maxima 766 and minima 770 can correspond to, or may be further processed to determine, various respiratory events, such as the onset of inspiration Ti 766, the onset of expiration Te 770, the duration of inspiration Tie 768, the duration of expiration Tei 772, the ratio of the duration of inspiration to expiration Tie/Tei, the ratio of the duration of expiration to inspiration Tei/Tie, respiration rate, or other respiration-related events.



FIG. 8 illustrates a process 800 for determining a patient pulse rate based on an acoustic signal, such as the acoustic signal 700 of FIG. 7A. For convenience, the process 800 is described in the context of the signals, systems, and devices of FIGS. 2A-B, 3-6, and 7A-D, but may instead be implemented by other signals, systems, and devices described herein or other computing systems.


At block 805, an acoustic signal can be received from a probe. The acoustic signal can be a signal obtained from the neck of a patient via the probe, such as the first acoustic sensor 210 of FIGS. 2A-B or the sensor head 306 of FIG. 3. The acoustic signal 412 can correspond to a signal received from the A/D converter 331 by the DSP 340 of FIG. 3 or the acoustic signal 412 received by the acoustic signal processor 410 of FIGS. 4 and 5.


At block 810, the received acoustic signal can be integrated twice with respect to time. The integration can be performed by the DSP 340 or the acoustic filter 510 of FIGS. 5 and 6. In some embodiments, the integration can be performed by the acoustic filter 510 in the frequency domain as discussed with respect to FIG. 6.


At block 815, a pulse rate can be estimated based on the integrated acoustic signal. The DSP 340 or acoustic signal processor 410 can estimate the pulse rate based on the reciprocal of the time period between pulses of the integrated acoustic signal, such as time period 748 of FIG. 7C.
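The estimate at block 815 can be sketched as follows, with a simple local-maximum detector standing in for the pulse detection performed by the DSP 340 or acoustic signal processor 410; the function name and the minimum peak spacing are assumptions for illustration.

```python
import numpy as np

def estimate_pulse_rate(filtered_signal, fs, min_spacing_s=0.3):
    """Estimate pulse rate (beats per minute) as the reciprocal of the mean
    pulse-to-pulse time period of an integrated acoustic signal.

    Pulses are taken as local maxima separated by at least min_spacing_s,
    an assumed refractory interval between candidate pulses.
    """
    min_gap = int(min_spacing_s * fs)
    peaks = []
    for i in range(1, len(filtered_signal) - 1):
        if filtered_signal[i - 1] < filtered_signal[i] >= filtered_signal[i + 1]:
            if not peaks or i - peaks[-1] >= min_gap:
                peaks.append(i)
    if len(peaks) < 2:
        return None                            # not enough pulses to form a period
    mean_period = np.diff(peaks).mean() / fs   # mean time period between pulses
    return 60.0 / mean_period                  # reciprocal period, in beats/min
```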


Although block 810 can include the operation of integrating the received acoustic signal twice with respect to time in some embodiments, the operation at block 810 can include one or more other filtering operations (for example, differentiating, integrating, multiplying, subtracting, or computing the results of another function) in other embodiments to reverse or undo changes to the received acoustic signal due to the probe, as well as one or more associated processing modules.



FIG. 9 illustrates a process 900 for detecting an acoustic probe error. For convenience, the process 900 is described in the context of the signals, systems, and devices of FIGS. 2A-B, 3-5, and 7A-D, but may instead be implemented by other signals, systems, and devices described herein or other computing systems.


At block 905, an acoustic signal can be received from a probe, and a plethysmograph signal can be received from a pleth sensor. The acoustic signal can be a signal obtained from the neck of a patient via the probe, such as the first acoustic sensor 210 of FIGS. 2A-B or the sensor head 306 of FIG. 3. The acoustic signal 412 can correspond to a signal received from the A/D converter 331 by the DSP 340 of FIG. 3 or the acoustic signal 412 received by the acoustic signal processor 410 of FIGS. 4 and 5. The plethysmograph signal can be a signal obtained from the finger of a patient via a non-invasive sensor, such as the plethysmograph sensor 230 of FIG. 2B or optical sensor 302 of FIG. 3. The plethysmographic signal can correspond to a signal received from the A/D converter 335 by the DSP 340 of FIG. 3 or the plethysmograph signal 422 received by the plethysmograph processor 420 of FIG. 4.


At block 910, the received acoustic signal can be integrated twice with respect to time. The integration can be performed by the DSP 340 or the acoustic filter 510 of FIGS. 5 and 6. In some embodiments, the integration can be performed by the acoustic filter 510 in the frequency domain as discussed with respect to FIG. 6.


At block 915, a pulse rate can be estimated based on the integrated acoustic signal and the plethysmograph signal. The DSP 340 or acoustic signal processor 410 can estimate the pulse rate PRA based on the reciprocal of the time period between pulses of the integrated acoustic signal, such as time period 748 of FIG. 7C. The DSP 340 or plethysmograph signal processor 420 can estimate the pulse rate PRpleth using the plethysmograph signal 422.


At block 920, the pulse rate PRA can be compared to a pulse rate value of zero or about zero beats per minute. The DSP 340 or probe error detection module 430B can perform the comparison. In response to determining that the pulse rate equals zero or about zero, at block 925, the DSP 340 or combining module 430 can activate an alarm condition indicating a probe error. For instance, the DSP 340 can transmit a signal to the instrument manager 350 of FIG. 3 to activate an alarm 366 of one of the I/O devices 360.


At block 930, the pulse rate PRA can be compared to a first threshold pulse rate value. The DSP 340 or probe error detection module 430B can perform the comparison. The first threshold value can be a value determined based on a minimum pulse rate that would be expected for an individual. In some embodiments, the first threshold can equal 20 beats per minute. In response to determining that the pulse rate does not exceed the first threshold, at block 925, the DSP 340 or combining module 430 can activate an alarm condition indicating a probe error. For instance, the DSP 340 can transmit a signal to the instrument manager 350 to activate an alarm 366 of one of the I/O devices 360.


At block 935, the difference between the pulse rate PRA and pulse rate PRpleth can be compared to a second threshold pulse rate value. The second threshold value can be a value determined based on a minimum pulse rate difference that would be expected between an acoustically determined and a plethysmographically determined pulse rate. In some embodiments, the second threshold can equal 5 or 10 beats per minute. In response to determining that the difference exceeds or equals the second threshold, at block 925, the DSP 340 or combining module 430 can activate an alarm condition indicating a probe error. For instance, the DSP 340 can transmit a signal to the instrument manager 350 to activate an alarm 366 of one of the I/O devices 360.


At block 940, a no-probe-error state can be determined. For instance, the DSP 340 or combining module 430 can determine that the probe may be operating without error and may take no corrective action. In some embodiments, the DSP 340 or combining module 430 can utilize the absence of a probe error to determine the validity of a pulse rate or to cause the DSP 340 or combining module 430 to output a particular value for display to a patient.
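The checks at blocks 920-940 can be summarized as simple threshold logic. This is a sketch only: the threshold defaults follow the example values given above (a 20 beats per minute minimum; a 5 or 10 beats per minute difference), and the function name and return strings are illustrative assumptions.

```python
def probe_state(pr_acoustic, pr_pleth, min_rate=20.0, max_diff=10.0):
    """Determine a probe-error or no-probe-error state from acoustically and
    plethysmographically determined pulse rates (blocks 920-940 of FIG. 9)."""
    if pr_acoustic is None or abs(pr_acoustic) < 1e-6:
        return "probe-error"    # block 920: pulse rate is zero or about zero
    if pr_acoustic <= min_rate:
        return "probe-error"    # block 930: below minimum expected pulse rate
    if abs(pr_acoustic - pr_pleth) >= max_diff:
        return "probe-error"    # block 935: acoustic and pleth rates disagree
    return "no-probe-error"     # block 940: no corrective action needed
```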


In some embodiments, other approaches can be additionally or alternatively used to determine probe errors or activate alarms based on the integrated acoustic signal. For instance, the timing or shape of features of the integrated acoustic signal can be compared to features of one or more other signals, such as signals from a plethysmographic sensor or another acoustic sensor. The features can include local maxima or minima of the signals, and the like. Deviations in the timing or shape between features of the integrated acoustic signal and features of the other signals can indicate a probe error or alarm condition. As another example, detected energy levels in lower frequencies of the integrated acoustic signal can be used to determine the presence of a pulse rate and thus to indicate a no probe error state. In a further example, the integrated acoustic signal can be compared to one or more signal templates to determine whether the integrated acoustic signal has an expected form. When the integrated acoustic signal does not have an expected form, a probe error indication can be triggered and an alarm can be activated. Such other approaches are described in more detail in U.S. patent application Ser. No. 14/137,629, filed Dec. 20, 2013, which is incorporated by reference in its entirety herein.


Although block 910 can include the operation of integrating the received acoustic signal twice with respect to time in some embodiments, the operation at block 910 can include one or more other filtering operations (for example, differentiating, integrating, multiplying, subtracting, or computing the results of another function) in other embodiments to reverse or undo changes to the received acoustic signal due to the probe, as well as one or more associated processing modules.



FIG. 10 illustrates a process 1000 for determining a patient respiration rate based on an acoustic signal, such as the acoustic signal 700 of FIG. 7A. For convenience, the process 1000 is described in the context of the signals, systems, and devices of FIGS. 2A-B, 3-5, and 7A-D, but may instead be implemented by other signals, systems, and devices described herein or other computing systems.


At block 1005, the acoustic signal can be received from a probe. The acoustic signal can be a signal obtained from the neck of a patient via the probe, such as the first acoustic sensor 210 of FIGS. 2A-B or the sensor head 306 of FIG. 3. The acoustic signal 412 can correspond to a signal received from the A/D converter 331 by the DSP 340 of FIG. 3 or the acoustic signal 412 received by the acoustic signal processor 410 of FIGS. 4 and 5.


At block 1010, the received acoustic signal can be integrated twice with respect to time. The integration can be performed by the DSP 340 or the acoustic filter 510 of FIGS. 5 and 6. In some embodiments, the integration can be performed by the acoustic filter 510 in the frequency domain as discussed with respect to FIG. 6.


At block 1015, a respiration rate can be estimated based on the integrated acoustic signal. For instance, the DSP 340 or acoustic signal processor 410 can estimate the respiration rate based on amplitude modulation of the integrated acoustic signal as discussed with respect to FIG. 7D.
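The amplitude-modulation estimate at block 1015 can be sketched as follows: pulse peaks of the integrated signal are interpolated into an upper envelope, and the respiration rate is taken as the reciprocal of the mean spacing between envelope maxima (the modulation period 764 of FIG. 7D). The function name and the two minimum-spacing parameters are assumptions for illustration.

```python
import numpy as np

def respiration_rate_from_am(filtered_signal, fs, pulse_gap_s=0.3, resp_gap_s=1.5):
    """Estimate respiration rate (breaths per minute) from amplitude
    modulation of an integrated acoustic signal: build an upper envelope
    through the pulse peaks, then take the reciprocal of the mean spacing
    between envelope maxima (the modulation period).
    """
    def local_maxima(x, min_gap):
        # Local maxima separated by at least min_gap samples.
        peaks = []
        for i in range(1, len(x) - 1):
            if x[i - 1] < x[i] >= x[i + 1]:
                if not peaks or i - peaks[-1] >= min_gap:
                    peaks.append(i)
        return peaks

    pulse_peaks = local_maxima(filtered_signal, int(pulse_gap_s * fs))
    if len(pulse_peaks) < 3:
        return None
    # Upper envelope: linear interpolation through the pulse-peak amplitudes.
    envelope = np.interp(np.arange(len(filtered_signal)), pulse_peaks,
                         [filtered_signal[i] for i in pulse_peaks])
    breath_peaks = local_maxima(envelope, int(resp_gap_s * fs))
    if len(breath_peaks) < 2:
        return None
    modulation_period = np.diff(breath_peaks).mean() / fs  # seconds per breath
    return 60.0 / modulation_period                        # breaths per minute
```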


Although block 1010 can include the operation of integrating the received acoustic signal twice with respect to time in some embodiments, the operation at block 1010 can include one or more other filtering operations (for example, differentiating, integrating, multiplying, subtracting, or computing the results of another function) in other embodiments to reverse or undo changes to the received acoustic signal due to the probe, as well as one or more associated processing modules.



FIG. 11 illustrates example signals processed by an acoustic signal processor. The signals include a raw acoustic signal 1102, such as the acoustic signal 412 of FIGS. 4-6, processed by the acoustic signal processor 410 of FIGS. 4 and 5. The raw acoustic signal 1102 can be sensed from the neck of a patient via an acoustic sensor, such as the first acoustic sensor 210 of FIGS. 2A-B or the sensor head 306 of FIG. 3. The acoustic signal 1102 is shown plotted on an intensity axis versus a time axis. As can be seen in FIG. 11, the acoustic signal 1102 can be a relatively chaotic signal, including numerous frequency components ranging from low to high frequency components.


The signals of FIG. 11 further include a compensated acoustic signal 1106 that can be a filtered acoustic signal generated by a filter, such as the acoustic filter 510 of FIGS. 5 and 6. In one implementation, the raw acoustic signal 1102 may have been integrated twice with respect to time to generate the compensated acoustic signal 1106. As can be seen, the compensated acoustic signal 1106 can be a relatively ordered signal, including fewer frequency components than the raw acoustic signal 1102.


In addition, the signals of FIG. 11 include a high frequency acoustic signal 1104 and a low frequency acoustic signal 1108. The high frequency acoustic signal 1104 can illustrate just the high frequency components of the compensated acoustic signal 1106, and the low frequency acoustic signal 1108 can illustrate just the low frequency components of the compensated acoustic signal 1106 (for example, the low frequency components between about 0.2 Hz and 0.8 Hz).


Advantageously, in certain embodiments, the low frequency acoustic signal 1108 can be used to accurately and precisely determine one or more respiration parameters for a patient since the local maxima and minima of the low frequency acoustic signal 1108 can directly correspond to exhalation and inhalation. Multiple consecutive local maxima or multiple consecutive local minima can thus be correctly identified as multiple exhalations or multiple inhalations. As a result, an acoustic signal processor can, for example, determine a time when inspiration or expiration begins (Ti or Te, respectively), a time duration of an inspiration or an expiration (Tie or Tei, respectively), a ratio of the time duration of inspiration to expiration, or of expiration to inspiration (Tie/Tei or Tei/Tie, respectively) with greater confidence.
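Separating a compensated acoustic signal into a band such as that of the low frequency acoustic signal 1108 can be sketched by masking FFT bins. The 0.2 Hz to 0.8 Hz defaults follow the band mentioned above; the function name and the masking approach are illustrative assumptions rather than the disclosed implementation.

```python
import numpy as np

def band_component(signal, fs, low_hz=0.2, high_hz=0.8):
    """Isolate one frequency band of a signal by zeroing FFT bins outside
    [low_hz, high_hz] and transforming back to the time domain."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = (freqs >= low_hz) & (freqs <= high_hz)  # keep only in-band bins
    return np.fft.irfft(spectrum * mask, n=len(signal))
```

The local maxima and minima of the returned low-frequency component can then be located and mapped to exhalations and inhalations as described above.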


Embodiments have been described in connection with the accompanying drawings. However, it should be understood that the figures are not drawn to scale. Distances, angles, etc. are merely illustrative and do not necessarily bear an exact relationship to actual dimensions and layout of the devices illustrated. In addition, the foregoing embodiments have been described at a level of detail to allow one of ordinary skill in the art to make and use the devices, systems, etc. described herein. A wide variety of variation is possible. Components, elements, and/or steps can be altered, added, removed, or rearranged. While certain embodiments have been explicitly described, other embodiments will become apparent to those of ordinary skill in the art based on this disclosure.


Conditional language used herein, such as, among others, “can,” “could,” “might,” “may,” “e.g.,” and the like, unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or states. Thus, such conditional language is not generally intended to imply that features, elements and/or states are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without author input or prompting, whether these features, elements and/or states are included or are to be performed in any particular embodiment.


Depending on the embodiment, certain acts, events, or functions of any of the methods described herein can be performed in a different sequence, can be added, merged, or left out altogether (e.g., not all described acts or events are necessary for the practice of the method). Moreover, in certain embodiments, acts or events can be performed concurrently, e.g., through multi-threaded processing, interrupt processing, or multiple processors or processor cores, rather than sequentially.


The various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the embodiments disclosed herein can be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. The described functionality can be implemented in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the disclosure.


The various illustrative logical blocks, modules, and circuits described in connection with the embodiments disclosed herein can be implemented or performed with a general purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general purpose processor can be a microprocessor, but in the alternative, the processor can be any conventional processor, controller, microcontroller, or state machine. A processor can also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.


The blocks of the methods and algorithms described in connection with the embodiments disclosed herein can be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module can reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, a hard disk, a removable disk, a CD-ROM, or any other form of computer-readable storage medium known in the art. An exemplary storage medium is coupled to a processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium can be integral to the processor. The processor and the storage medium can reside in an ASIC. The ASIC can reside in a user terminal. In the alternative, the processor and the storage medium can reside as discrete components in a user terminal.


While the above detailed description has shown, described, and pointed out novel features as applied to various embodiments, it will be understood that various omissions, substitutions, and changes in the form and details of the devices or algorithms illustrated can be made without departing from the spirit of the disclosure. As will be recognized, certain embodiments of the inventions described herein can be embodied within a form that does not provide all of the features and benefits set forth herein, as some features can be used or practiced separately from others. The scope of certain inventions disclosed herein is indicated by the appended claims rather than by the foregoing description. All changes which come within the meaning and range of equivalency of the claims are to be embraced within their scope.

Claims
  • 1. A physiological monitoring system configured to non-invasively detect acoustic vibrations indicative of one or more physiological parameters of a medical patient, the physiological monitoring system comprising: an acoustic sensor; a memory; one or more hardware processors configured to: receive an acoustic signal from the acoustic sensor, the acoustic sensor configured to attach to the medical patient, detect acoustic vibrations associated with the medical patient, and generate the acoustic signal indicative of the acoustic vibrations, wherein the acoustic sensor is associated with a transfer function that affects the acoustic signal generated by the acoustic sensor, and wherein the transfer function of the acoustic sensor corresponds to a high-pass filter; retrieve, from the memory, the transfer function of the acoustic sensor; deconvolve the acoustic signal to lessen an effect of the transfer function on the acoustic signal; and generate a deconvolved acoustic signal, wherein the deconvolved acoustic signal corresponds to a scaled frequency domain equivalent of the acoustic signal; estimate a physiological parameter of the medical patient based at least on the deconvolved acoustic signal, wherein the physiological parameter includes one or more respiratory or cardiac parameters; and cause a display to display an indication of the physiological parameter.
  • 2. The physiological monitoring system of claim 1, wherein to deconvolve the acoustic signal, the one or more hardware processors are configured to scale the acoustic signal by a frequency function.
  • 3. The physiological monitoring system of claim 2, wherein the frequency function comprises a function that is proportional to (2πf)⁻².
  • 4. The physiological monitoring system of claim 1, wherein to deconvolve the acoustic signal, the one or more hardware processors are configured to integrate a frequency domain equivalent of the acoustic signal with respect to time in the frequency domain.
  • 5. The physiological monitoring system of claim 1, wherein to deconvolve the acoustic signal, the one or more hardware processors are configured to compute a double integral with respect to time of the acoustic signal.
  • 6. The physiological monitoring system of claim 1, wherein to deconvolve the acoustic signal, the one or more hardware processors are further configured to generate a frequency domain equivalent of the acoustic signal.
  • 7. The physiological monitoring system of claim 6, wherein to generate the frequency domain equivalent, the one or more hardware processors are configured to compute a Fourier transform of the acoustic signal.
  • 8. The physiological monitoring system of claim 7, wherein the Fourier transform comprises a fast Fourier transform.
  • 9. The physiological monitoring system of claim 1, wherein to deconvolve the acoustic signal, the one or more hardware processors are further configured to generate a time domain equivalent of the acoustic signal.
  • 10. The physiological monitoring system of claim 9, wherein to generate the time domain equivalent of the acoustic signal, the one or more hardware processors are configured to compute an inverse Fourier transform of the acoustic signal.
  • 11. The physiological monitoring system of claim 1, wherein to lessen the effect of the transfer function on the acoustic signal comprises removing the effect of the transfer function on the acoustic signal.
  • 12. The physiological monitoring system of claim 1, wherein at least a portion of the transfer function of the acoustic sensor is caused by at least one of a processing circuitry of the acoustic sensor, a piezoelectric membrane of the acoustic sensor, or a sensed motion of skin of the medical patient.
  • 13. The physiological monitoring system of claim 1, wherein the physiological parameter comprises at least one of pulse rate, expiratory flow, tidal volume, minute volume, apnea duration, breath sounds, rales, rhonchi, stridor, air volume, airflow, heart sounds, or change in heart sounds.
  • 14. A method for determining one or more physiological parameters of a medical patient, the method comprising: receiving an acoustic signal from an acoustic sensor attached to the medical patient, the acoustic sensor configured to detect acoustic vibrations associated with the medical patient and generate the acoustic signal indicative of the acoustic vibrations, wherein the acoustic sensor is associated with a transfer function that affects the acoustic signal generated by the acoustic sensor, and wherein the transfer function of the acoustic sensor corresponds to a high-pass filter; retrieving, from a memory, the transfer function of the acoustic sensor; deconvolving the acoustic signal to lessen an effect of the transfer function on the acoustic signal; generating a deconvolved acoustic signal, wherein the deconvolved acoustic signal comprises a scaled frequency domain equivalent of the acoustic signal; and estimating a physiological parameter of the medical patient based at least on the deconvolved acoustic signal, wherein the physiological parameter includes one or more respiratory or cardiac parameters; and causing a display to display an indication of the physiological parameter.
  • 15. The method of claim 14, wherein said deconvolving comprises integrating a frequency domain equivalent of the acoustic signal with respect to time in the frequency domain.
  • 16. The method of claim 14, wherein said deconvolving comprises computing a double integral with respect to time of the acoustic signal.
  • 17. The method of claim 14, wherein said deconvolving comprises: generating a frequency domain equivalent of the acoustic signal by computing a fast Fourier transform of the acoustic signal; and generating a time domain equivalent of the acoustic signal by computing an inverse fast Fourier transform of the acoustic signal.
  • 18. A physiological monitor comprising: an acoustic sensor;
  • 19. The physiological monitor of claim 18, wherein to deconvolve the acoustic signal, the one or more hardware processors are further configured to: generate the frequency domain equivalent of the acoustic signal by computing a fast Fourier transform of the acoustic signal; and generate a time domain equivalent of the acoustic signal by computing an inverse fast Fourier transform of the acoustic signal.
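The deconvolution recited above (claims 2-8 and 15-17) scales the signal's Fourier transform by a function proportional to (2πf)⁻², which is mathematically equivalent to integrating the acoustic signal twice with respect to time (claims 5 and 16). The following is a minimal illustrative sketch of that frequency-domain operation in NumPy; the function name and sampling-rate parameter are illustrative and not drawn from the patent, and a practical implementation would additionally handle noise amplification at low frequencies:

```python
import numpy as np

def deconvolve_double_integrate(signal, fs):
    """Double-integrate a signal with respect to time via the frequency domain.

    Multiplies each Fourier bin by 1/(j*2*pi*f)**2 == -(2*pi*f)**-2,
    the frequency-domain equivalent of integrating twice in time.
    """
    n = len(signal)
    spectrum = np.fft.rfft(signal)                 # frequency domain equivalent
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)         # bin frequencies in Hz

    # Scale factor -(2*pi*f)**-2; the DC bin is zeroed to avoid division by zero,
    # which also discards any constant offset in the input.
    scale = np.zeros_like(spectrum)
    nonzero = freqs > 0
    scale[nonzero] = -1.0 / (2.0 * np.pi * freqs[nonzero]) ** 2

    # Back to the time domain equivalent via the inverse FFT.
    return np.fft.irfft(spectrum * scale, n=n)
```

For a pure tone cos(2πft), integrating twice yields −cos(2πft)/(2πf)², which can be used as a sanity check of the scaling.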
REFERENCE TO RELATED APPLICATIONS

The present application is a continuation of U.S. patent application Ser. No. 14/636,500, filed Mar. 3, 2015, entitled “Acoustic Pulse And Respiration Monitoring System,” which is a continuation of U.S. patent application Ser. No. 14/206,900, filed Mar. 12, 2014, entitled “Acoustic Physiological Monitoring System,” which claims priority benefit from U.S. Provisional Application No. 61/780,412, filed Mar. 13, 2013, entitled “Acoustic Pulse And Respiration Monitoring System,” each of which is hereby incorporated herein by reference in its entirety.

10608817 Haider et al. Mar 2020 B2
D880477 Forrest et al. Apr 2020 S
10617302 Al-Ali et al. Apr 2020 B2
10617335 Al-Ali et al. Apr 2020 B2
10637181 Al-Ali et al. Apr 2020 B2
D886849 Muhsin et al. Jun 2020 S
D887548 Abdul-Hafiz et al. Jun 2020 S
D887549 Abdul-Hafiz et al. Jun 2020 S
10667764 Ahmed et al. Jun 2020 B2
D890708 Forrest et al. Jul 2020 S
10721785 Al-Ali Jul 2020 B2
10736518 Al-Ali et al. Aug 2020 B2
10750984 Pauley et al. Aug 2020 B2
D897098 Al-Ali Sep 2020 S
10779098 Iswanto et al. Sep 2020 B2
10827961 Iyengar et al. Nov 2020 B1
10828007 Telfort et al. Nov 2020 B1
10832818 Muhsin et al. Nov 2020 B2
10849554 Shreim et al. Dec 2020 B2
10856750 Indorf et al. Dec 2020 B2
D906970 Forrest et al. Jan 2021 S
D908213 Abdul-Hafiz et al. Jan 2021 S
10918281 Al-Ali et al. Feb 2021 B2
10932705 Muhsin et al. Mar 2021 B2
10932729 Kiani et al. Mar 2021 B2
10939878 Kiani et al. Mar 2021 B2
10956950 Al-Ali et al. Mar 2021 B2
D916135 Indorf et al. Apr 2021 S
D917046 Abdul-Hafiz et al. Apr 2021 S
D917550 Indorf et al. Apr 2021 S
D917564 Indorf et al. Apr 2021 S
D917704 Al-Ali et al. Apr 2021 S
10987066 Chandran et al. Apr 2021 B2
10991135 Al-Ali et al. Apr 2021 B2
D919094 Al-Ali et al. May 2021 S
D919100 Al-Ali et al. May 2021 S
11006867 Al-Ali May 2021 B2
D921202 Al-Ali et al. Jun 2021 S
11024064 Muhsin et al. Jun 2021 B2
11026604 Chen et al. Jun 2021 B2
D925597 Chandran et al. Jul 2021 S
D927699 Al-Ali et al. Aug 2021 S
11076777 Lee et al. Aug 2021 B2
11114188 Poeze et al. Sep 2021 B2
D933232 Al-Ali et al. Oct 2021 S
D933233 Al-Ali et al. Oct 2021 S
D933234 Al-Ali et al. Oct 2021 S
11145408 Sampath et al. Oct 2021 B2
11147518 Al-Ali et al. Oct 2021 B1
11185262 Al-Ali et al. Nov 2021 B2
11191484 Kiani et al. Dec 2021 B2
D946596 Ahmed Mar 2022 S
D946597 Ahmed Mar 2022 S
D946598 Ahmed Mar 2022 S
D946617 Ahmed Mar 2022 S
11272839 Al-Ali et al. Mar 2022 B2
11289199 Al-Ali Mar 2022 B2
RE49034 Al-Ali Apr 2022 E
11298021 Muhsin et al. Apr 2022 B2
D950580 Ahmed May 2022 S
D950599 Ahmed May 2022 S
D950738 Al-Ali et al. May 2022 S
D957648 Al-Ali Jul 2022 S
11382567 O'Brien et al. Jul 2022 B2
11389093 Triman et al. Jul 2022 B2
11406286 Al-Ali et al. Aug 2022 B2
11417426 Muhsin et al. Aug 2022 B2
11439329 Lamego Sep 2022 B2
11445948 Scruggs et al. Sep 2022 B2
D965789 Al-Ali et al. Oct 2022 S
D967433 Al-Ali et al. Oct 2022 S
11464410 Muhsin Oct 2022 B2
11504058 Sharma et al. Nov 2022 B1
11504066 Dalvi et al. Nov 2022 B1
D971933 Ahmed Dec 2022 S
D973072 Ahmed Dec 2022 S
D973685 Ahmed Dec 2022 S
D973686 Ahmed Dec 2022 S
D974193 Forrest et al. Jan 2023 S
D979516 Al-Ali et al. Feb 2023 S
D980091 Forrest et al. Mar 2023 S
11596363 Lamego Mar 2023 B2
11627919 Kiani et al. Apr 2023 B2
11637437 Al-Ali et al. Apr 2023 B2
D985498 Al-Ali et al. May 2023 S
11653862 Dalvi et al. May 2023 B2
D989112 Muhsin et al. Jun 2023 S
D989327 Al-Ali et al. Jun 2023 S
11678829 Al-Ali et al. Jun 2023 B2
11679579 Al-Ali Jun 2023 B2
11684296 Vo et al. Jun 2023 B2
11692934 Normand et al. Jul 2023 B2
11701043 Al-Ali et al. Jul 2023 B2
D997365 Hwang Aug 2023 S
11721105 Ranasinghe et al. Aug 2023 B2
11730379 Ahmed et al. Aug 2023 B2
D998625 Indorf et al. Sep 2023 S
D998630 Indorf et al. Sep 2023 S
D998631 Indorf et al. Sep 2023 S
11766198 Pauley et al. Sep 2023 B2
D1000975 Al-Ali et al. Oct 2023 S
20010002206 Diab et al. May 2001 A1
20010034477 Mansfield et al. Oct 2001 A1
20010039483 Brand et al. Nov 2001 A1
20020010401 Bushmakin et al. Jan 2002 A1
20020058864 Mansfield et al. May 2002 A1
20020133080 Apruzzese et al. Sep 2002 A1
20020193670 Garfield et al. Dec 2002 A1
20030013975 Kiani Jan 2003 A1
20030015368 Cybulski et al. Jan 2003 A1
20030018243 Gerhardt et al. Jan 2003 A1
20030065269 Vetter Apr 2003 A1
20030076494 Bonin et al. Apr 2003 A1
20030144582 Cohen et al. Jul 2003 A1
20030156288 Barnum et al. Aug 2003 A1
20030158466 Lynn et al. Aug 2003 A1
20030163033 Dekker et al. Aug 2003 A1
20030163054 Dekker Aug 2003 A1
20030212312 Coffin, IV et al. Nov 2003 A1
20040010202 Nakatani Jan 2004 A1
20040059203 Guerrero Mar 2004 A1
20040060362 Kjellmann et al. Apr 2004 A1
20040106163 Workman, Jr. et al. Jun 2004 A1
20040133087 Ali et al. Jul 2004 A1
20040158162 Narimatsu Aug 2004 A1
20040225332 Gebhardt Nov 2004 A1
20040260186 Dekker Dec 2004 A1
20050027205 Tarassenko et al. Feb 2005 A1
20050048456 Chefd'hotel et al. Mar 2005 A1
20050055276 Kiani et al. Mar 2005 A1
20050070774 Addison et al. Mar 2005 A1
20050107699 Loftman May 2005 A1
20050116820 Goldreich Jun 2005 A1
20050199056 Strong Sep 2005 A1
20050234317 Kiani Oct 2005 A1
20060047215 Newman et al. Mar 2006 A1
20060073719 Kiani Apr 2006 A1
20060129216 Hastings et al. Jun 2006 A1
20060149144 Lynn et al. Jul 2006 A1
20060155206 Lynn Jul 2006 A1
20060155207 Lynn et al. Jul 2006 A1
20060161071 Lynn et al. Jul 2006 A1
20060189871 Al-Ali et al. Aug 2006 A1
20060189880 Lynn et al. Aug 2006 A1
20060195041 Lynn et al. Aug 2006 A1
20060235324 Lynn Oct 2006 A1
20060238333 Welch et al. Oct 2006 A1
20060241510 Halperin et al. Oct 2006 A1
20060258921 Addison et al. Nov 2006 A1
20070073116 Kiani et al. Mar 2007 A1
20070093721 Lynn et al. Apr 2007 A1
20070129643 Kwok et al. Jun 2007 A1
20070129647 Lynn Jun 2007 A1
20070135725 Hatlestad Jun 2007 A1
20070149860 Lynn et al. Jun 2007 A1
20070163353 Lec et al. Jul 2007 A1
20070180140 Welch et al. Aug 2007 A1
20070185397 Govari et al. Aug 2007 A1
20070213619 Linder Sep 2007 A1
20070239057 Pu et al. Oct 2007 A1
20070244377 Cozad et al. Oct 2007 A1
20070282212 Sierra et al. Dec 2007 A1
20080013747 Tran Jan 2008 A1
20080039735 Hickerson Feb 2008 A1
20080064965 Jay et al. Mar 2008 A1
20080071185 Beck et al. Mar 2008 A1
20080076972 Dorogusker et al. Mar 2008 A1
20080094228 Welch et al. Apr 2008 A1
20080103375 Kiani May 2008 A1
20080119716 Boric-Lubecke et al. May 2008 A1
20080161878 Tehrani et al. Jul 2008 A1
20080177195 Armitstead Jul 2008 A1
20080188733 Al-Ali Aug 2008 A1
20080188760 Al-Ali Aug 2008 A1
20080218153 Patel et al. Sep 2008 A1
20080221418 Al-Ali et al. Sep 2008 A1
20080275349 Halperin et al. Nov 2008 A1
20080304580 Ichiyama Dec 2008 A1
20090018409 Banet et al. Jan 2009 A1
20090018429 Saliga et al. Jan 2009 A1
20090018453 Banet et al. Jan 2009 A1
20090036759 Ault et al. Feb 2009 A1
20090093687 Telfort et al. Apr 2009 A1
20090095926 MacNeish, III Apr 2009 A1
20090112096 Tamura Apr 2009 A1
20090160654 Yang Jun 2009 A1
20090167332 Forbes Jul 2009 A1
20090187065 Basinger Jul 2009 A1
20090227882 Foo Sep 2009 A1
20090240119 Schwaibold et al. Sep 2009 A1
20090247848 Baker Oct 2009 A1
20090247984 Lamego et al. Oct 2009 A1
20090275813 Davis Nov 2009 A1
20090275844 Al-Ali Nov 2009 A1
20090299157 Telfort et al. Dec 2009 A1
20090312612 Rantala Dec 2009 A1
20100004518 Vo et al. Jan 2010 A1
20100004552 Zhang et al. Jan 2010 A1
20100014761 Addison et al. Jan 2010 A1
20100016682 Schluess et al. Jan 2010 A1
20100016693 Addison Jan 2010 A1
20100030040 Poeze et al. Feb 2010 A1
20100099964 O'Reilly et al. Apr 2010 A1
20100130873 Yuer May 2010 A1
20100204550 Heneghan Aug 2010 A1
20100234718 Sampath et al. Sep 2010 A1
20100261979 Kiani Oct 2010 A1
20100270257 Wachman et al. Oct 2010 A1
20100274099 Telfort et al. Oct 2010 A1
20100295686 Sloan Nov 2010 A1
20100298661 McCombie et al. Nov 2010 A1
20100298730 Tarassenko et al. Nov 2010 A1
20100324377 Woehrle Dec 2010 A1
20100331903 Zhang Dec 2010 A1
20110001605 Kiani Jan 2011 A1
20110009710 Kroeger et al. Jan 2011 A1
20110028806 Merritt et al. Feb 2011 A1
20110028809 Goodman Feb 2011 A1
20110040197 Welch et al. Feb 2011 A1
20110040713 Colman Feb 2011 A1
20110066062 Banet et al. Mar 2011 A1
20110074409 Stoughton Mar 2011 A1
20110082711 Poeze et al. Apr 2011 A1
20110087081 Kiani et al. Apr 2011 A1
20110105854 Kiani et al. May 2011 A1
20110118561 Tari et al. May 2011 A1
20110118573 McKenna May 2011 A1
20110125060 Telfort et al. May 2011 A1
20110137297 Kiani et al. Jun 2011 A1
20110172498 Olsen et al. Jul 2011 A1
20110172561 Kiani et al. Jul 2011 A1
20110208015 Welch et al. Aug 2011 A1
20110209915 Telfort et al. Sep 2011 A1
20110213212 Al-Ali Sep 2011 A1
20110222371 Liu et al. Sep 2011 A1
20110230733 Al-Ali et al. Sep 2011 A1
20110237911 Lamego et al. Sep 2011 A1
20120016255 Masuo Jan 2012 A1
20120059267 Lamego et al. Mar 2012 A1
20120070013 Vau Mar 2012 A1
20120101344 Desjardins Apr 2012 A1
20120116175 Al-Ali et al. May 2012 A1
20120123231 O'Reilly May 2012 A1
20120165629 Merritt et al. Jun 2012 A1
20120179006 Jansen et al. Jul 2012 A1
20120209082 Al-Ali Aug 2012 A1
20120209084 Olsen et al. Aug 2012 A1
20120226117 Lamego et al. Sep 2012 A1
20120227739 Kiani Sep 2012 A1
20120253140 Addison et al. Oct 2012 A1
20120262298 Bohm Oct 2012 A1
20120283524 Kiani et al. Nov 2012 A1
20120296178 Lamego et al. Nov 2012 A1
20120319816 Al-Ali Dec 2012 A1
20120330112 Lamego et al. Dec 2012 A1
20130023775 Lamego et al. Jan 2013 A1
20130045685 Kiani Feb 2013 A1
20130046204 Lamego Feb 2013 A1
20130041591 Lamego Mar 2013 A1
20130060147 Welch et al. Mar 2013 A1
20130096405 Garfio Apr 2013 A1
20130096936 Sampath et al. Apr 2013 A1
20130109935 Al-Ali et al. May 2013 A1
20130116578 An et al. May 2013 A1
20130128690 Gopalan May 2013 A1
20130137936 Baker, Jr. et al. May 2013 A1
20130162433 Muhsin et al. Jun 2013 A1
20130190581 Al-Ali et al. Jul 2013 A1
20130190595 Oraevsky Jul 2013 A1
20130197328 Diab et al. Aug 2013 A1
20130243021 Siskavich Sep 2013 A1
20130253334 Al-Ali et al. Sep 2013 A1
20130274571 Diab et al. Oct 2013 A1
20130296672 Dalvi et al. Nov 2013 A1
20130296726 Nievauer et al. Nov 2013 A1
20130317370 Dalvi et al. Nov 2013 A1
20130324808 Al-Ali et al. Dec 2013 A1
20130331670 Kiani Dec 2013 A1
20130338461 Lamego et al. Dec 2013 A1
20130345921 Al-Ali et al. Dec 2013 A1
20140012100 Lamego et al. Jan 2014 A1
20140025306 Weber et al. Jan 2014 A1
20140034353 Al-Ali et al. Feb 2014 A1
20140051953 Lamego et al. Feb 2014 A1
20140058230 Abdul-Hafiz et al. Feb 2014 A1
20140066783 Kiani et al. Mar 2014 A1
20140077956 Sampath et al. Mar 2014 A1
20140081100 Muhsin et al. Mar 2014 A1
20140081175 Telfort Mar 2014 A1
20140094667 Schurman et al. Apr 2014 A1
20140100434 Diab et al. Apr 2014 A1
20140114199 Lamego et al. Apr 2014 A1
20140120564 Workman et al. May 2014 A1
20140121482 Merritt et al. May 2014 A1
20140121483 Kiani May 2014 A1
20140127137 Bellott et al. May 2014 A1
20140128696 Al-Ali May 2014 A1
20140128699 Al-Ali et al. May 2014 A1
20140129702 Lamego et al. May 2014 A1
20140135588 Al-Ali et al. May 2014 A1
20140142401 Al-Ali et al. May 2014 A1
20140142402 Al-Ali et al. May 2014 A1
20140163344 Al-Ali Jun 2014 A1
20140163402 Lamego et al. Jun 2014 A1
20140166076 Kiani et al. Jun 2014 A1
20140171763 Diab Jun 2014 A1
20140180038 Kiani et al. Jun 2014 A1
20140180154 Sierra et al. Jun 2014 A1
20140180160 Brown et al. Jun 2014 A1
20140187973 Brown et al. Jul 2014 A1
20140194709 Al-Ali et al. Jul 2014 A1
20140194711 Al-Ali Jul 2014 A1
20140194766 Al-Ali et al. Jul 2014 A1
20140206963 Diab et al. Jul 2014 A1
20140213864 Abdul-Hafiz et al. Jul 2014 A1
20140243627 Diab et al. Aug 2014 A1
20140266790 Al-Ali et al. Sep 2014 A1
20140275808 Poeze et al. Sep 2014 A1
20140275835 Lamego et al. Sep 2014 A1
20140275871 Lamego et al. Sep 2014 A1
20140275872 Merritt et al. Sep 2014 A1
20140275881 Lamego et al. Sep 2014 A1
20140288400 Diab et al. Sep 2014 A1
20140296664 Bruinsma et al. Oct 2014 A1
20140303520 Telfort et al. Oct 2014 A1
20140309506 Lamego et al. Oct 2014 A1
20140316217 Purdon et al. Oct 2014 A1
20140316218 Purdon et al. Oct 2014 A1
20140316228 Blank et al. Oct 2014 A1
20140323825 Al-Ali et al. Oct 2014 A1
20140323897 Brown et al. Oct 2014 A1
20140323898 Purdon et al. Oct 2014 A1
20140330092 Al-Ali et al. Nov 2014 A1
20140330098 Merritt et al. Nov 2014 A1
20140330099 Al-Ali et al. Nov 2014 A1
20140333440 Kiani Nov 2014 A1
20140336481 Shakespeare et al. Nov 2014 A1
20140343436 Kiani Nov 2014 A1
20150005600 Blank et al. Jan 2015 A1
20150011907 Purdon et al. Jan 2015 A1
20150018650 Al-Ali et al. Jan 2015 A1
20150073241 Lamego Mar 2015 A1
20150080754 Purdon et al. Mar 2015 A1
20150099950 Al-Ali et al. Apr 2015 A1
20150106121 Muhsin et al. Apr 2015 A1
20160196388 Lamego Jul 2016 A1
20160367173 Dalvi et al. Dec 2016 A1
20170024748 Haider Jan 2017 A1
20170042488 Muhsin Feb 2017 A1
20170173632 Al-Ali Jun 2017 A1
20170251974 Shreim et al. Sep 2017 A1
20170311891 Kiani et al. Nov 2017 A1
20180103874 Lee et al. Apr 2018 A1
20180242926 Muhsin et al. Aug 2018 A1
20180247353 Al-Ali et al. Aug 2018 A1
20180247712 Muhsin et al. Aug 2018 A1
20180256087 Al-Ali et al. Sep 2018 A1
20180296161 Shreim et al. Oct 2018 A1
20180300919 Muhsin et al. Oct 2018 A1
20180310822 Indorf et al. Nov 2018 A1
20180310823 Al-Ali et al. Nov 2018 A1
20180317826 Muhsin et al. Nov 2018 A1
20190015023 Monfre Jan 2019 A1
20190117070 Muhsin et al. Apr 2019 A1
20190200941 Chandran et al. Jul 2019 A1
20190239787 Pauley et al. Aug 2019 A1
20190320906 Olsen Oct 2019 A1
20190374139 Kiani et al. Dec 2019 A1
20190374173 Kiani et al. Dec 2019 A1
20190374713 Kiani et al. Dec 2019 A1
20200060869 Telfort et al. Feb 2020 A1
20200111552 Ahmed Apr 2020 A1
20200113435 Muhsin Apr 2020 A1
20200113488 Al-Ali et al. Apr 2020 A1
20200113496 Scruggs et al. Apr 2020 A1
20200113497 Triman et al. Apr 2020 A1
20200113520 Abdul-Hafiz et al. Apr 2020 A1
20200138288 Al-Ali et al. May 2020 A1
20200138368 Kiani et al. May 2020 A1
20200163597 Dalvi et al. May 2020 A1
20200196877 Vo et al. Jun 2020 A1
20200253474 Muhsin et al. Aug 2020 A1
20200253544 Belur Nagaraj et al. Aug 2020 A1
20200275841 Telfort et al. Sep 2020 A1
20200288983 Telfort et al. Sep 2020 A1
20200321793 Al-Ali et al. Oct 2020 A1
20200329983 Al-Ali et al. Oct 2020 A1
20200329984 Al-Ali et al. Oct 2020 A1
20200329993 Al-Ali et al. Oct 2020 A1
20200330037 Al-Ali et al. Oct 2020 A1
20210022628 Telfort et al. Jan 2021 A1
20210104173 Pauley et al. Apr 2021 A1
20210113121 Diab et al. Apr 2021 A1
20210117525 Kiani et al. Apr 2021 A1
20210118581 Kiani et al. Apr 2021 A1
20210121582 Krishnamani et al. Apr 2021 A1
20210161465 Barker et al. Jun 2021 A1
20210236729 Kiani et al. Aug 2021 A1
20210256267 Ranasinghe et al. Aug 2021 A1
20210256835 Ranasinghe et al. Aug 2021 A1
20210275101 Vo et al. Sep 2021 A1
20210290060 Ahmed Sep 2021 A1
20210290072 Forrest Sep 2021 A1
20210290080 Ahmed Sep 2021 A1
20210290120 Al-Ali Sep 2021 A1
20210290177 Novak, Jr. Sep 2021 A1
20210290184 Ahmed Sep 2021 A1
20210296008 Novak, Jr. Sep 2021 A1
20210330228 Olsen et al. Oct 2021 A1
20210386382 Olsen et al. Dec 2021 A1
20210402110 Pauley et al. Dec 2021 A1
20220026355 Normand et al. Jan 2022 A1
20220039707 Sharma et al. Feb 2022 A1
20220053892 Al-Ali et al. Feb 2022 A1
20220071562 Kiani Mar 2022 A1
20220096603 Kiani et al. Mar 2022 A1
20220151521 Krishnamani et al. May 2022 A1
20220218244 Kiani et al. Jul 2022 A1
20220287574 Telfort et al. Sep 2022 A1
20220296161 Al-Ali et al. Sep 2022 A1
20220361819 Al-Ali et al. Nov 2022 A1
20220379059 Yu et al. Dec 2022 A1
20220392610 Kiani et al. Dec 2022 A1
20230028745 Al-Ali Jan 2023 A1
20230038389 Vo Feb 2023 A1
20230045647 Vo Feb 2023 A1
20230058052 Al-Ali Feb 2023 A1
20230058342 Kiani Feb 2023 A1
20230069789 Koo et al. Mar 2023 A1
20230087671 Telfort et al. Mar 2023 A1
20230110152 Forrest et al. Apr 2023 A1
20230111198 Yu et al. Apr 2023 A1
20230115397 Vo et al. Apr 2023 A1
20230116371 Mills et al. Apr 2023 A1
20230135297 Kiani et al. May 2023 A1
20230138098 Telfort et al. May 2023 A1
20230145155 Krishnamani et al. May 2023 A1
20230147750 Barker et al. May 2023 A1
20230210417 Al-Ali et al. Jul 2023 A1
20230222805 Muhsin et al. Jul 2023 A1
20230222887 Muhsin et al. Jul 2023 A1
20230226331 Kiani et al. Jul 2023 A1
20230284916 Telfort Sep 2023 A1
20230284943 Scruggs et al. Sep 2023 A1
20230301562 Scruggs et al. Sep 2023 A1
Foreign Referenced Citations (30)
Number Date Country
2262236 Apr 2008 CA
0716628 Dec 1998 EP
0659058 Jan 1999 EP
1207536 May 2002 EP
2358546 Nov 1999 GB
6214898 Jan 1987 JP
01-309872 Jun 1998 JP
10-155755 Jun 1998 JP
2001-50713 May 1999 JP
2003-329719 Nov 2003 JP
WO 1994005207 Mar 1994 WO
WO 1994013207 Jun 1994 WO
WO 1995029632 Nov 1995 WO
WO 1999053277 Oct 1999 WO
WO 2000010462 Mar 2000 WO
WO 2001034033 May 2001 WO
WO 2001078059 Oct 2001 WO
WO 2001097691 Dec 2001 WO
WO 2002003042 Jan 2002 WO
WO 2003058646 Jul 2003 WO
WO 2003087737 Oct 2003 WO
WO 2004000111 Dec 2003 WO
WO 2004004411 Jan 2004 WO
WO 2005096931 Oct 2005 WO
WO 2005099562 Oct 2005 WO
WO 2008017246 Feb 2008 WO
WO 2008080469 Jul 2008 WO
WO 2008148172 Dec 2008 WO
WO 2009093159 Jul 2009 WO
WO 2009137524 Nov 2009 WO
Non-Patent Literature Citations (45)
Entry
US 8,845,543 B2, 09/2014, Diab et al. (withdrawn)
US 9,579,050 B2, 02/2017, Al-Ali (withdrawn)
US 2022/0192529 A1, 06/2022, Al-Ali et al. (withdrawn)
Smith, The Scientist and Engineer's Guide to Digital Signal Processing, https://www.dspguide.com/ch17/2.htm, PDF of Ch. 17 and Ch. 18 is included, two book editions, original 1997, updated 2002 (Year: 2002).
U.S. Appl. No. 12/904,775 (now pub 2011-0209915), filed Oct. 1, 2010, Telfort, Valery et al.
U.S. Appl. No. 12/904,789 (now pub 2011-0125060), filed Oct. 1, 2010, Telfort, Valery et al.
U.S. Appl. No. 12/904,823 (now pub 2011-0172551) (now U.S. Pat. No. 8,790,268), filed Oct. 1, 2010, Al-Ali et al.
U.S. Appl. No. 12/904,836 (now U.S. Pat. No. 8,523,781), filed Oct. 1, 2010, Al-Ali, Ammar.
U.S. Appl. No. 12/904,890 (now pub 2011-0213271) (now U.S. Pat. No. 8,702,627), filed Oct. 1, 2010, Telfort et al.
U.S. Appl. No. 12/904,907 (now pub 2011-0213272) (now U.S. Pat. No. 8,715,206), filed Oct. 1, 2010, Telfort et al.
U.S. Appl. No. 12/904,931 (now pub 2011-0213273) (now U.S. Pat. No. 8,690,799), filed Oct. 1, 2010, Telfort et al.
U.S. Appl. No. 12/904,938 (now pub 2011-0213274) (now U.S. Pat. No. 8,755,535), filed Oct. 1, 2010, Telfort et al.
U.S. Appl. No. 12/905,036 (now pub 2011-0172561), filed Oct. 1, 2010, Kiani et al.
U.S. Appl. No. 12/905,384 (now U.S. Pat. No. 9,724,016), filed Oct. 1, 2010, Al-Ali et al.
U.S. Appl. No. 12/905,449 (now U.S. Pat. No. 9,066,680), filed Oct. 1, 2010, Al-Ali et al.
U.S. Appl. No. 12/905,489 (now U.S. Pat. No. 9,848,800), filed Oct. 1, 2010, Weber et al.
U.S. Appl. No. 12/905,530 (now U.S. Pat. No. 8,430,817), filed Oct. 1, 2010, Al-Ali et al.
U.S. Appl. No. 12/960,325 (now pub 2011-0196211) (now U.S. Pat. No. 8,801,613), filed Dec. 1, 2010, Al-Ali, Ammar et al.
2011/0172551 (now U.S. Pat. No. 8,790,268), filed Jul. 1, 2011, Al-Ali et al.
2011/0196211 (now U.S. Pat. No. 8,801,613), filed Aug. 1, 2011, Al-Ali et al.
2011/0213271 (now U.S. Pat. No. 8,702,627), filed Sep. 1, 2011, Telfort, Valery et al.
2011/0213272 (now U.S. Pat. No. 8,715,206), filed Sep. 1, 2011, Telfort, Valery et al.
2011/0213273 (now U.S. Pat. No. 8,690,799), filed Sep. 1, 2011, Telfort, Valery et al.
2011/0213274 (now U.S. Pat. No. 8,755,535), filed Sep. 1, 2011, Telfort, Valery et al.
Analog Devices, 12-Bit Serial Input Multiplying D/A Converter, Product Data Sheet, 2000.
Chambrin, M-C.; "Alarms in the intensive care unit: how can the number of false alarms be reduced?"; Critical Care, Aug. 2001, vol. 5, No. 4; p. 1-5.
Eldor et al., “A device for monitoring ventilation during anaesthesia; the paratracheal audible respiratory monitor”, Canadian Journal of Anaesthesia, 1990, vol. 9, No. 1, p. 95-98.
Gorges, M. et al; “Improving Alarm Performance in the Medical Intensive Care Unit Using Delays and Clinical Context”; Technology, Computing, and Simulation; vol. 108, No. 5, May 2009; p. 1546-1552.
Imhoff, M. et al; “Alarm Algorithms in Critical Care Monitoring”; Anesth Analg 2006;102:1525-37.
International Search Report & Written Opinion, PCT Application PCT/US2010/052758, dated Feb. 10, 2011; 12 pages.
International Search Report & Written Opinion, PCT Application PCT/US2010/058981, dated Feb. 17, 2011; 11 pages.
International Search Report and Written Opinion issued in application No. PCT/US2010/052756 dated Feb. 6, 2012.
International Search Report, PCT Application PCT/CA2003/000536, dated Dec. 11, 2003; 2 pages.
International Search Report, PCT Application PCT/US2009/069287, dated Mar. 30, 2010; 7 pages.
Japanese Office Action for JP Application No. 2007-506626 dated Mar. 1, 2011.
Sierra et al., "Monitoring Respiratory Rate Based on Tracheal Sounds. First Experiences", Proceedings of the 26th Annual Int'l Conf. of the IEEE EMBS (Sep. 2004), 317-320.
Watt, R. C.; "Alarms and Anesthesia: Challenges in the Design of Intelligent Systems for Patient Monitoring"; IEEE Engineering in Medicine and Biology; Dec. 1993, p. 34-41.
Welch Allyn, ECG ASIC, Product Data Sheet, 2001.
Supplementary Partial European Search Report for International Application No. 05732095.4, dated Jun. 26, 2009 in 4 pages.
Theimer et al., “Definitions of audio features for music content description”, Algorithm Engineering Report TR08-2-001, Feb. 2008.
Stewart, C., Larson, V., “Detection and classification of acoustic signals from fixed-wing aircraft,” Systems Engineering, CH3051-0/91/0000-0025, IEEE, 1991.
Johnston, Development of a Signal Processing Library for Extraction of SpO2, HR, HRV, and RR from Photoplethysmographic Waveforms, Thesis: Degree of Master of Science, Worcester Polytechnic Institute, date of presentation/defense Jul. 17, 2006, date listed Jul. 27, 2006.
GE Healthcare, “Transport ProTM Patient Monitor Operator's Manual” Apr. 9, 2007, in 286 pages.
Hsu, “Signals and Systems”, Schaum's Theory and Problems, 1995, Ch. 3, p. 121.
Letter from Anthony D. Watson to Masimo Corporation re 510(k) No. K120984, U.S. Food & Drug Administration, dated Apr. 23, 2013 in 6 pages.
Related Publications (1)
Number Date Country
20200121205 A1 Apr 2020 US
Provisional Applications (1)
Number Date Country
61780412 Mar 2013 US
Continuations (2)
Number Date Country
Parent 14636500 Mar 2015 US
Child 16557198 US
Parent 14206900 Mar 2014 US
Child 14636500 US