Physiological acoustic monitoring system

Information

  • Patent Grant
  • Patent Number
    10,980,507
  • Date Filed
    Friday, October 12, 2018
  • Date Issued
    Tuesday, April 20, 2021
Abstract
A physiological acoustic monitoring system receives physiological data from an acoustic sensor, down-samples the data to generate raw audio of breathing sounds and compresses the raw audio. The acoustic monitoring system has an acoustic sensor signal responsive to tracheal sounds in a person. An A/D converter is responsive to the sensor signal so as to generate breathing sound data. A decimation filter and mixer down-samples the breathing sound data to raw audio data. A coder/compressor generates compressed audio data from the raw audio data. A decoder/decompressor decodes and decompresses the compressed audio data into decompressed audio data. The decompressed audio data is utilized to generate respiration-related parameters in real-time. The compressed audio data is stored and retrieved so as to generate respiration-related parameters in non-real-time. The real-time and non-real-time parameters are compared to verify matching results across multiple monitors.
Description

Many of the embodiments described herein are compatible with embodiments described in the above related applications. Moreover, some or all of the features described herein can be used or otherwise combined with many of the features described in the applications listed above.


BACKGROUND OF THE INVENTION

The “piezoelectric effect” is the appearance of an electric potential and current across certain faces of a crystal when it is subjected to mechanical stresses. Due to their capacity to convert mechanical deformation into an electric voltage, piezoelectric crystals have been broadly used in devices such as transducers, strain gauges and microphones. However, before the crystals can be used in many of these applications, they must be rendered into a form that suits the requirements of the application. In many applications, especially those involving the conversion of acoustic waves into a corresponding electric signal, piezoelectric membranes have been used.


Piezoelectric membranes are typically manufactured from polyvinylidene fluoride plastic film. The film is endowed with piezoelectric properties by stretching the plastic while it is placed under a high poling voltage. Stretching the film polarizes it and aligns the molecular structure of the plastic. A thin layer of conductive metal (typically nickel-copper) is deposited on each side of the film to form electrode coatings to which connectors can be attached.


Piezoelectric membranes have a number of attributes that make them interesting for use in sound detection, including: a wide frequency range of 0.001 Hz to 1 GHz; a low acoustical impedance close to that of water and human tissue; a high dielectric strength; good mechanical strength; and moisture resistance and inertness to many chemicals.


Due in large part to the above attributes, piezoelectric membranes are particularly suited for the capture of acoustic waves and the conversion thereof into electric signals and, accordingly, have found application in the detection of body sounds. However, there is still a need for a reliable acoustic sensor, particularly one suited for measuring bodily sounds in noisy environments.


SUMMARY OF THE INVENTION

An aspect of a physiological acoustic monitoring system receives physiological data from an acoustic sensor, down-samples the data to generate raw audio of breathing sounds and compresses the raw audio. The acoustic monitoring system has an acoustic sensor signal responsive to tracheal sounds in a person. An A/D converter is responsive to the sensor signal so as to generate breathing sound data. A decimation filter and mixer down-samples the breathing sound data to raw audio data. A coder/compressor generates compressed audio data from the raw audio data. A decoder/decompressor decodes and decompresses the compressed audio data into decompressed audio data. The decompressed audio data is utilized to generate respiration-related parameters in real-time. The compressed audio data is stored and retrieved so as to generate respiration-related parameters in non-real-time. The real-time and non-real-time parameters are compared to verify matching results across multiple monitors.


Another aspect of a physiological acoustic monitoring system inputs an acoustic sensor signal responsive to tracheal sounds of a person and generates breath tags and a respiration rate. The breath tags represent the acoustic envelope of the tracheal sound, and the respiration rate represents the inverse period of the acoustic envelope. The breath tags and respiration rate have a sufficiently low bandwidth to share a data channel with other physiological parameters. In an embodiment, the acoustic monitor has an acoustic sensor input and an A/D converter that digitizes the sensor input and outputs a digitized sensor signal. A decimation filter and mixer reduces the data rate of the digitized sensor signal and outputs a digitized raw audio. An acoustic parameter processor generates a respiration rate and breath tags in response to the digitized raw audio.


In various embodiments, the acoustic monitoring system has a coder/compressor that compresses the digitized raw audio to generate compressed audio data, which is stored and retrieved so as to generate respiration-related parameters in non-real-time. A decoder/decompressor decompresses the compressed audio data for the acoustic parameter processor. A D/A converter inputs the digitized raw audio and generates a raw audio analog signal for local playback and listening to the acoustic sensor signal. The compressed audio is transmitted to a remote location as a troubleshooting aid at a remote monitor.


A further aspect of a physiological acoustic monitoring system inputs a sensor signal responsive to respiratory sounds of a living being, digitizes the sensor signal so as to generate acoustic data, extracts an envelope from the acoustic data, defines an idealized envelope from the extracted envelope, describes the idealized envelope as breath tags and transmits the breath tags over a data channel. In various embodiments, the breath tags are received from the data channel, a reconstructed envelope is synthesized in response to the breath tags and reconstructed acoustic data is generated by filling the envelope with an artificial waveform. In an embodiment, the artificial waveform is white noise.


An additional aspect of a physiological acoustic monitoring system detects a physiological feature in the extracted envelope and includes the physiological feature in the breath tags. The reconstructed envelope is modified with the detected physiological feature, which may be wheezing or coughing, as examples. The respiratory sounds are approximately reproduced by playing the reconstructed acoustic data on an audio transducer.


Yet another aspect of a physiological acoustic monitoring system is a sensor signal responsive to respiratory sounds of a living being. An A/D converter digitizes the sensor signal into acoustic data. A parameter generator extracts a respiratory sound envelope from the acoustic data so as to generate a breath tag, which is transmitted over a data channel as a representation of the respiratory sounds. In various embodiments, a remote monitoring station receives the breath tag and a corresponding respiration rate. The monitoring station synthesizes an envelope from the breath tag and the respiration rate and fills the envelope with an artificial waveform so as to generate reconstituted respiratory sounds. In an embodiment, the artificial waveform is white noise.


In various other embodiments, a decimation filter and mixer down-samples the acoustic data to raw audio data, a D/A converter converts the raw audio data to a raw audio signal and a speaker plays the raw audio signal. The parameter generator detects a physiological feature in the extracted envelope and includes the physiological feature in the breath tag. The remote monitor modifies the reconstructed envelope with the detected physiological feature. An audio transducer approximately reproduces the reconstructed acoustic data as compared to the raw audio signal. A compressor generates compressed audio data, which is stored and retrieved so as to generate respiration-related parameters in non-real-time.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a general block diagram of a physiological acoustic monitoring system;



FIGS. 2A-B are illustrations of dual channel acoustic sensors;



FIG. 2A illustrates a neck sensor for physiological measurements and a chest sensor for monaural body sound monitoring;



FIG. 2B illustrates a dual acoustic sensor for stereo body sound monitoring;



FIGS. 3A-B are top and bottom perspective views of a body sound sensor;



FIG. 4 is a general schematic diagram of acoustic and optical sensors and sensor drive elements and a corresponding digital signal processor and I/O drive elements;



FIG. 5 is a matrix diagram of processor modules and corresponding functionality;



FIG. 6 is a network diagram for a physiological acoustic monitoring system;



FIGS. 7A-B are block diagrams of respiration sound generator embodiments;



FIGS. 8A-C are graphs illustrating breath tag generator embodiments;



FIG. 9 is a block diagram illustrating a physiological parameter processor embodiment for generating acoustic and optical sensor parameters, breath tags and compressed and raw audio outputs;



FIGS. 10A-B are a waveform and a block diagram illustrating a respiration beep generator embodiment; and



FIG. 11 is a block diagram of a physiological acoustic monitoring system for wireless monitoring applications.





DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS


FIG. 1 generally illustrates a physiological acoustic monitoring system 100 embodiment having one or more sensors 110 in communications with one or more processors 130 via a sensor interface 120. The processors 130 both initiate and respond to input/output 150, including audio output 152, displays and alarms 154, communications 156 and controls 158. In an embodiment, the processors 130 are implemented in firmware executing on one or more digital signal processors (DSP), as described with respect to FIGS. 4-5, below. At least a portion of the sensors 110 generate acoustic signals, which may be directly utilized by the processors 130 or recorded onto or played back from storage media 160 or both.


The processors 130 include an audio processor 132 that outputs audio waveforms 142, a parameter processor 134 that derives physiological parameters 144 from sensor signals 112 and an acoustic data processor 136 that stores, retrieves and communicates acoustic data 146. Parameters include, as examples, respiration rate, heart rate and pulse rate. Audio waveforms include body sounds from the heart, lungs, gastrointestinal system and other organs. These body sounds may include tracheal air flow, heart beats and pulsatile blood flow, to name a few. Displays allow parameters 144 and acoustic data 146 to be visually presented to a user in various forms such as numbers, waveforms and graphs, as examples. Audio 152 allows audio waveforms to be reproduced through speakers, headphones or similar transducers. Raw audio 122 allows acoustic sensor signals 112 to be continuously reproduced through speakers, headphones or similar transducers, bypassing A/D conversion 120 and digital signal processing 130.


Storage media 160 allows acoustic data 146 to be recorded, organized, searched, retrieved and played back via the processors 130, communications 156 and audio output 152. Communications 156 transmit or receive acoustic data or audio waveforms via local area or wide area data networks or cellular networks 176. Controls 158 may cause the audio processor 132 to amplify, filter, shape or otherwise process audio waveforms 142 so as to emphasize, isolate, deemphasize or otherwise modify various features of an audio waveform or spectrum. In addition, controls 158 include buttons and switches 178, such as a “push to play” button that initiates local audio output 152 or remote transmission 176 of live or recorded acoustic waveforms.


As shown in FIG. 1, acoustic data 146 is initially derived from one or more acoustic sensor signals 112, along with, perhaps, other data inputs, such as from optical, blood pressure, EEG and ECG sensors, to name a few. The acoustic data 146 provides audio outputs 142, including audio respiration indicators, described with respect to FIGS. 7-10, below. The acoustic data 146, when analyzed, provides physiological parameters 144 that provide an indication of patient status, such as respiration rate or heart rate. Such analyses may result in visual or audible alerts or alarms 154 that are viewed locally or via notifications transmitted over local or wide area networks 176 to medical staff or other persons. Acoustic data 146 is utilized in real time or stored and retrieved for later use. Acoustic data 146 may be written on various storage media 160, such as a hard drive, and organized for convenient search and retrieval. In an embodiment, acoustic data 146 is advantageously organized on one or more hard drives as virtual magnetic tape so as to more easily manage, search, retrieve and play back acoustic data volumes. Further, the virtual tape volumes and/or the acoustic data itself may be entered into a database and organized as an acoustic library according to various search parameters including patient information, dates, corresponding physiological parameters and acoustic waveform features, to name a few. Applications for a physiological acoustic monitoring system include auscultation of body sounds by medical staff or by audio processors or both; SIDS monitoring; heart distress monitoring including the early detection and mitigation of myocardial infarction and cardiopulmonary arrest, as examples; and elder care, to name a few.


In an embodiment, sensor sounds 142 may be continuously “piped” to a remote device/listener or a central monitor or both. Listening devices may variously include pagers, cell phones, PDAs, electronic pads or tablets and laptops or other computers, to name a few. Medical staff or other remote listeners are notified by the acoustic monitoring system according to flexible pre-programmed protocols and respond to the notification so as to hear breathing sounds, voice, heart sounds or other body sounds.



FIGS. 2A-B illustrate physiological acoustic monitoring system 200 embodiments each having dual channel acoustic sensors 201, 202 in communications with a physiological monitor 205. As shown in FIG. 2A, a first acoustic sensor 210 is utilized for deriving one or more physiological parameters, such as respiration rate. A second acoustic sensor 220 is utilized to continuously monitor body sounds. In an embodiment, the second acoustic sensor 220 has a different color or shape than the first acoustic sensor 210 so as to identify the sensor as a body sound listening device rather than an acoustic sensing device for determining a physiological parameter. In an embodiment, the body sound sensor 220 is placed over the heart to allow monitoring of heart sounds or determination of heart rate. In an embodiment, the body sound sensor 220 generates a signal that bypasses monitor digitization and signal processing so as to allow continuous listening to the unprocessed or “raw” body sounds. In particular, the first acoustic sensor 210 is neck-mounted so as to determine one or more physiological parameters, such as respiration rate. The second acoustic sensor 220 is chest-mounted for monaural heart sound monitoring. As shown in FIG. 2B, first and second acoustic sensors 260, 270 are mounted proximate the same body site but with sufficient spatial separation to allow for stereo sensor reception. In this manner, the listener can more easily distinguish and identify the source of body sounds.



FIGS. 3A-B illustrate a body sound sensor 300 having acoustic 310, interconnect (not visible) and attachment 350 assemblies. The acoustic assembly 310 has an acoustic coupler 312 and a piezoelectric subassembly 314. The acoustic coupler 312 generally envelops or at least partially covers some or all of the piezoelectric subassembly 314. The piezoelectric subassembly 314 includes a piezoelectric membrane and a support frame (not visible). The piezoelectric membrane is configured to move on the frame in response to acoustic vibrations, thereby generating electrical signals indicative of body sounds. The acoustic coupler 312 advantageously improves the coupling between the acoustic signal measured at a skin site and the piezoelectric membrane. The acoustic coupler 312 includes a contact portion 316 placed against a person's skin.


Further shown in FIGS. 3A-B, the acoustic assembly 310 communicates with the sensor cable 340 via the interconnect assembly. In an embodiment, the interconnect assembly is a flex circuit having multiple conductors that are adhesively bonded to the attachment assembly 350. The interconnect assembly has a solder pad or other interconnect to interface with the sensor cable 340, and the attachment assembly 350 has a molded strain relief for the sensor cable. In an embodiment, the attachment assembly 350 is a generally circular, planar member having a top side 351, a bottom side 352, and a center. A button 359 mechanically couples the acoustic assembly 310 to the attachment assembly center so that the acoustic assembly 310 extends from the bottom side 352. The sensor cable 340 extends from one end of the interconnect and attachment assemblies to a sensor connector at an opposite end so as to provide communications between the sensor and a monitor, as described in further detail below. In an embodiment, an adhesive along the bottom side 352 secures the acoustic assembly 310 to a person's skin, such as at a neck, chest, back or abdomen site. A removable backing can be provided with the adhesive to protect the adhesive surface prior to affixing to a person's skin. In other embodiments, the attachment assembly 350 has a square, oval or oblong shape, so as to allow a uniform adhesion of the sensor to a measurement site. In a resposable embodiment, the attachment assembly 350 or portions thereof are removably detachable and attachable to the acoustic assembly 310 for disposal and replacement. The acoustic assembly 310 is reusable accordingly.



FIG. 4 illustrates acoustic 401 and optical 402 sensors and sensor drive elements 403 and a corresponding digital signal processor 440 and I/O drive elements 404. A multi-acoustic sensor configuration 401 includes a power interface 413, piezo circuits 416 and a piezoelectric membrane 417 corresponding to each sensor head 406, 407. The piezoelectric membrane 417 senses vibrations and generates a voltage in response to the vibrations, as described with respect to the sensor of FIGS. 3A-B, above. The signal generated by the piezoelectric membrane is communicated to the piezo circuit 416, described immediately below, which transmits the signal to the monitor 205 (FIG. 2A) for signal conditioning and processing. The piezo circuit 416 decouples the power supply 413 and performs preliminary signal conditioning. In an embodiment, the piezo circuit 416 includes clamping diodes to provide electrostatic discharge (ESD) protection and a mid-level voltage DC offset for the piezoelectric signal to ride on, be superimposed on or be added to. The piezo circuit may also have a high pass filter to eliminate unwanted low frequencies, such as below about 100 Hz for breath sound applications, and an op amp to provide gain to the piezoelectric signal. The piezo circuit 416 may also have a low pass filter on the output of the op amp to filter out unwanted high frequencies. In an embodiment, a high pass filter is also provided on the output in addition to or instead of the low pass filter. The piezo circuit may also provide impedance compensation to the piezoelectric membrane, such as a series/parallel combination used to control the signal level strength and frequency of interest that is input to the op amp. In one embodiment, the impedance compensation is used to minimize the variation of the piezoelectric element output. The impedance compensation can be constructed of any combination of resistive, capacitive and inductive elements, such as RC or RLC circuits.
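
Purely for illustration, a minimal digital analogue of this conditioning chain (the patent describes analog circuitry): the roughly 100 Hz high-pass corner comes from the text above, while the low-pass corner, filter orders and gain are assumed values.

```python
import numpy as np
from scipy import signal

FS = 48_000  # sample rate (Hz), matching the 48 kHz A/D described with FIG. 9

def condition_breath_signal(x, fs=FS, hp_hz=100.0, lp_hz=3_000.0, gain=20.0):
    """Digital stand-in for the piezo circuit's conditioning chain:
    high-pass to reject rumble below ~100 Hz, op-amp-style gain, then
    an output low-pass to reject unwanted high frequencies."""
    b_hp, a_hp = signal.butter(2, hp_hz, btype="highpass", fs=fs)
    b_lp, a_lp = signal.butter(4, lp_hz, btype="lowpass", fs=fs)
    y = signal.filtfilt(b_hp, a_hp, np.asarray(x, dtype=float))
    y *= gain
    return signal.filtfilt(b_lp, a_lp, y)

# Example: condition one second of a synthetic sensor signal.
t = np.arange(FS) / FS
raw = 0.05 * np.sin(2 * np.pi * 60 * t) + 0.01 * np.random.randn(FS)
clean = condition_breath_signal(raw)
```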


As shown in FIG. 4, a physiological acoustic monitor 400 embodiment drives and processes signals from a multi-acoustic sensor 401 and an optical sensor 402. The monitor 400 includes one or more acoustic front-ends 421, 422, an analog-to-digital (A/D) converter 431, an audio driver 470 and a digital signal processor (DSP) 440. The DSP 440 can comprise a wide variety of data and/or signal processors capable of executing programs for determining physiological parameters from input data. An optical front-end 425, digital-to-analog (D/A) converters 434 and an A/D converter 435 drive emitters 408 and transform resulting composite analog intensity signal(s) from light sensitive detector(s) 409 received via a sensor cable 410 into digital data input to the DSP 440. The acoustic front-ends 421, 422 and A/D converter 431 transform analog acoustic signals from piezoelectric elements 401 into digital data input to the DSP 440. The A/D converter 431 is shown as having a two-channel analog input and a multiplexed digital output to the DSP. In another embodiment, each front-end communicates with a dedicated single channel A/D converter generating two independent digital outputs to the DSP. An acoustic front-end 421 can also feed an acoustic sensor signal 411 directly into an audio driver 470 for direct and continuous acoustic reproduction of an unprocessed (raw) sensor signal by a speaker, earphones or other audio transducer 462, as described with respect to FIG. 9, below.


Also shown in FIG. 4, the physiological acoustic monitor 400 may also have an instrument manager 450 that communicates between the DSP 440 and input/output 460. One or more I/O devices 460, including displays, alarms, user I/O and instrument communication ports, are in communications with the instrument manager 450. Alarms 466 may be audible or visual indicators or both. The user I/O 468 may be, as examples, keypads, touch screens, pointing devices or voice recognition devices, to name a few. The displays 464 may be indicators, numerics or graphics for displaying one or more of various physiological parameters or acoustic data. The instrument manager 450 may also be capable of storing or displaying historical or trending data related to one or more parameters or acoustic data.


Further shown in FIG. 4, the physiological acoustic monitor 400 may also have a “push-to-talk” feature that provides a “listen on demand” capability. That is, a button 468 on the monitor is pushed or otherwise actuated so as to initiate acoustic sounds to be sent to a speaker, handheld device, or other listening device, either directly or via a network. The monitor 400 may also have a “mode selector” button or switch 468 that determines the acoustic content provided to a listener, either local or remote. These controls may be actuated locally or at a distance by a remote listener. In an embodiment, push-on-demand audio occurs on an alarm condition in lieu of or in addition to an audio alarm. Controls 468 may include output filters like those on a high quality stereo system so that a clinician or other user can selectively emphasize or deemphasize certain frequencies so as to home in on particular body sounds or characteristics.


In various embodiments, the monitor 400 may be one or more processor boards installed within and communicating with a host instrument. Generally, a processor board incorporates the front-end, drivers, converters and DSP. Accordingly, the processor board derives physiological parameters and communicates values for those parameters to the host instrument. Correspondingly, the host instrument incorporates the instrument manager and I/O devices. A processor board may also have one or more microcontrollers (not shown) for board management, including, for example, communications of calculated parameter data and the like to the host instrument. A processor board embodiment is described with respect to FIG. 9, below.


Communications 469 may transmit or receive acoustic data or audio waveforms via local area or wide area data networks or cellular networks. Controls may cause the audio processor to amplify, filter, shape or otherwise process audio waveforms so as to emphasize, isolate, deemphasize or otherwise modify various features of the audio waveform or spectrum. In addition, switches, such as a “push to play” button can initiate audio output of live or recorded acoustic data. Controls may also initiate or direct communications.



FIG. 5 illustrates processor modules 500 that may execute on a DSP 440 (FIG. 4) and/or instrument manager 450 (FIG. 4) in various physiological acoustic monitoring system embodiments and the corresponding functionality of these modules. Module functionality includes processing sensor input 510, storage 520 and playback 530 of acoustic data, acoustic data analysis 540, communication of acoustic data and derived physiological parameters 550 and specific applications 560. Sensor input 510 related modules include dynamic range 512 and noise reduction 513. Dynamic range 512 functionality is described with respect to the processor board codec and FIG. 9, below. Storage 520 related modules include virtual tape 523 and database 524 functionality, described with respect to FIG. 6, below. Playback 530 functionality includes audio filters 532, sound reproduction 534 including mono/stereo/quadraphonic 533 modes and auscultation 535 enhancement. Analysis 540 related modules include audio parameters 542, multi-sensor parameters 543 and corresponding alarms 544. Communications 550 related modules include cellular 553, wireless 554 and network 555 modes. Wireless is described with respect to FIG. 11, below, and cellular 553 and networks 555 are described with respect to FIG. 6, below. Applications 560 include elder care 561 and SIDS 562, described with respect to FIG. 11, below.



FIG. 6 illustrates a physiological acoustic monitoring system 600 embodiment having a shared or open network architecture interconnecting one or more physiological monitors 610, monitoring stations 620 and mass storage 660. This interconnection includes proximity wireless devices 612 in direct wireless communication with a particular physiological monitor 610; local wireless devices 632 in communications with the monitors 610 via a wireless LAN 630; and distant wired or wireless devices 642, 652 in communications with the monitors 610 via a WAN, such as the Internet 640 or cellular networks 650. Communication devices may include local and remote monitoring stations 620 and wired or wireless communications and/or computing devices including cell phones, laptops, pagers, PDAs, tablets and pads, to name a few. Physiological information is transmitted/received directly to/from end users over a LAN or WAN. End users such as clinicians may carry wireless devices 632 in communications with the WLAN 630 so as to view physiological parameters in real time or listen to audio data and waveforms on demand or in the event of an alarm or alert.


The network server 622 in certain embodiments provides logic and management tools to maintain connectivity between physiological monitors, clinician notification devices and external systems, such as EMRs. The network server 622 also provides a web-based interface to allow installation (provisioning) of software related to the physiological monitoring system, addition of new devices to the system, assignment of notifiers to individual clinicians for alarm notification, escalation algorithms in cases where a primary caregiver does not respond to an alarm, management reporting on alarm occurrences and internal journaling of system performance metrics such as overall system uptime. The network server 622 in certain embodiments also provides a platform for advanced rules engines and signal processing algorithms that provide early alerts in anticipation of a clinical alarm.


As shown in FIG. 6, audio data and corresponding audio files are advantageously stored on virtual tape 662, which provides the storage organization of tape cartridges without the slow, bulky, physical storage of magnetic tape and the corresponding human-operator intervention to physically locate and load physical cartridges into an actual tape drive. A virtual tape controller 662 emulates standard tape cartridges and drives on modern, high capacity disk drive systems, as is well known in the art. Accordingly, virtual “audio tapes” appear the same as physical tapes to applications, allowing the use of many existing cartridge tape storage, retrieval and archival applications. Further, while the upper limit of a physical tape cartridge may be a few hundred megabytes, a virtual tape server 662 can be configured to provide considerably larger “tape” capacity. Mount time is near zero for a virtual tape and the data is available immediately. Also, while traditional physical tape systems have to read a tape from the beginning, moving sequentially through the files on the tape, a virtual drive can randomly access data at hard-disk speeds, providing tape I/O at disk access speeds.


Additionally shown in FIG. 6, a sound processing firmware module of certain embodiments accesses a database 670 of sound signatures 660 and compares the received signal with the entries in the database to characterize or identify sounds in the received signal. In another embodiment, the sound processing module generates and/or accesses a database 670 of sound signatures specific to a patient, or specific to a particular type of patient (e.g., male/female, pediatric/adult/geriatric, etc.). Samples from a person may be recorded and used to generate the sound signatures. In some embodiments, certain signal characteristics are used to identify particular sounds or classes of sounds. For example, in one embodiment, signal deviations of relatively high amplitude and/or sharp slope may be identified by the sound processing module. Sounds identified in various embodiments by the sound processing module include, but are not limited to, breathing, speech, choking, swallowing, spasms such as larynx spasms, coughing, gasping, etc.
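
As a sketch of how such a signature comparison might be implemented, assuming a simple in-memory {label: waveform} store and normalized cross-correlation matching (the patent does not specify a matching algorithm; the names and threshold here are hypothetical):

```python
import numpy as np

def match_score(segment, signature):
    """Peak normalized cross-correlation between a sensor segment and a
    stored sound signature; values near 1.0 indicate a close match.
    Assumes len(segment) >= len(signature)."""
    seg = (segment - segment.mean()) / (segment.std() + 1e-12)
    sig = (signature - signature.mean()) / (signature.std() + 1e-12)
    corr = np.correlate(seg, sig, mode="valid") / len(sig)
    return float(np.max(np.abs(corr)))

def classify_sound(segment, signature_db, threshold=0.6):
    """Return the best-matching label from a {label: waveform} signature
    database, or None if no entry clears the (assumed) threshold."""
    scores = {label: match_score(segment, sig)
              for label, sig in signature_db.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] >= threshold else None
```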


Once the sound processing module characterizes a particular type of sound, the acoustic monitoring system can, depending on the identified sound, use the characterization to generate an appropriate response. For example, the system may alert the appropriate medical personnel to modify treatment. In one embodiment, medical personnel may be alerted via an audio alarm, mobile phone call or text message, or other appropriate means. In one example scenario, the breathing of the patient can become stressed or the patient may begin to choke due to saliva, mucosal or other buildup around an endotracheal tube. In an embodiment, the sound processing module can identify the stressed breathing sounds indicative of such a situation and alert medical personnel to the situation so that a muscle relaxant medication can be given to alleviate the stressed breathing or choking.


According to some embodiments, acoustic sensors described herein can be used in a variety of other beneficial applications. For example, an auscultation firmware module may process a signal received by the acoustic sensor and provide an audio output indicative of internal body sounds of the patient, such as heart sounds, breathing sounds, gastrointestinal sounds, and the like. Medical personnel may listen to the audio output, such as by using a headset or speakers. In some embodiments, the auscultation module allows medical personnel to remotely listen for patient diagnosis, communication, etc. For example, medical personnel may listen to the audio output in a hospital room different from the patient's room, in another building, etc. The audio output may be transmitted wirelessly (e.g., via Bluetooth, IEEE 802.11, over the Internet, etc.) in some embodiments such that medical personnel may listen to the audio output from generally any location.



FIGS. 7A-B illustrate sound processing embodiments 701, 702 for generating an audio output for an acoustic sensor. As shown in FIG. 7A, in one embodiment, acoustic sensor data is A/D converted 710, down-sampled with a decimation filter 720 and compressed 730. The compressed audio data 732 is transmitted to a monitor, which decompresses the data 740 and outputs it to a speaker 742 or similar audio transducer. However, compressed audio data 732 from a physiological acoustic sensor has too high a bit rate to transmit over monitor data channels shared with other physiological processors or patient networks shared by multiple patient monitors all communicating physiological parameters, waveforms and other real-time medical data. Acoustic sensor data rates are described in further detail with respect to FIG. 9, below.
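
A minimal sketch of the FIG. 7A transmit path under stated assumptions: scipy's FIR decimation stands in for the decimation filter 720, and zlib stands in for the unspecified coder/compressor 730.

```python
import zlib
import numpy as np
from scipy import signal

def compress_acoustic(x_48k, decim=24):
    """Down-sample 48 kHz breathing-sound data to 2 kHz through an
    anti-aliasing decimation filter, then compress the raw audio.
    zlib is a generic stand-in; the patent does not name a codec."""
    x_2k = signal.decimate(np.asarray(x_48k, dtype=float), decim,
                           ftype="fir", zero_phase=True)
    return zlib.compress(x_2k.astype(np.float32).tobytes(), level=6)

def decompress_acoustic(blob):
    """Inverse path: recover the 2 kHz raw audio for playback."""
    return np.frombuffer(zlib.decompress(blob), dtype=np.float32)
```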


As shown in FIG. 7B, an envelope-based sound processing 702 embodiment advantageously allows respiration-related acoustic data to be transmitted at significantly reduced data rates compared with data compression so as to allow shared transmission over monitor data channels (990, FIG. 9) and patient networks. Respiration-related acoustic data is A/D converted 710 and input to an envelope detector 750. The detected envelopes are idealized and represented by a small number set or “tag” corresponding to each breath. In an embodiment, a breath tag represents the time-of-occurrence of the breath envelope peak for each inspiration and expiration cycle. These breath tags 760 are then transmitted over standard multiple parameter patient monitor data channels and/or patient networks. At the receiving end, a patient monitor, multiple patient monitoring system or like monitoring device synthesizes the envelopes 770 from the breath tags 760 according to the respiration rate (RR). The envelopes 770 are then filled with white noise 780 so as to simulate the original respiration acoustic data 782.
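
A minimal sketch of the envelope detector 750 and breath-tag reduction, assuming a Hilbert-transform envelope, median smoothing and simple peak picking; the patent does not specify these internals, and the thresholds are hypothetical.

```python
import numpy as np
from scipy import signal

def extract_breath_tags(audio_2k, fs=2_000):
    """Reduce raw respiration audio to breath tags: the time of
    occurrence and height of each envelope peak. The smoothing window,
    minimum peak spacing and height threshold are assumptions."""
    envelope = np.abs(signal.hilbert(audio_2k))            # envelope detector
    envelope = signal.medfilt(envelope, kernel_size=201)   # ~0.1 s smoothing
    peaks, props = signal.find_peaks(envelope, distance=fs // 2,
                                     height=0.2 * envelope.max())
    return [(p / fs, float(h))
            for p, h in zip(peaks, props["peak_heights"])]
```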



FIGS. 8A-C further illustrate envelope processing for acoustic sensor data. FIG. 8A illustrates a representative acoustic signal 801 derived by a neck sensor detecting vibrations resulting from tracheal air flow during respiration. A breath sound 810 has an envelope 820 “pulse” corresponding to either inhalation or exhalation. An envelope detector 750 (FIG. 7B) generates breath tags that numerically describe the envelope 820. As shown in FIG. 8B, in one embodiment, breath tags describe an idealized envelope 830. For example, a breath tag may be an amplitude value and a duration value for each idealized pulse. In other embodiments, a breath tag may include leading/trailing slope values for a pulse 830. As shown in FIG. 8C, in other embodiments, breath tags include detected envelope features 842, 843, 844 that are characteristic of known acoustically-related phenomena such as wheezing or coughing, as examples. At a receiving device, envelope synthesis 770 (FIG. 7B) reproduces an envelope 830, 840 and fills the envelope with an artificial waveform, such as white noise. This reconstructed or simulated breath signal is then output to a speaker or similar device. In other embodiments, breath tags are transmitted over a network to a remote device, which reconstructs breathing waveforms from the breath tags in like manner.
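
A companion sketch of the receiving side, synthesizing idealized envelope pulses 830 from the tags and filling them with white noise; the raised-cosine pulse shape and 0.4 s width are assumptions, not from the patent.

```python
import numpy as np

def reconstruct_breaths(tags, duration_s, fs=2_000, pulse_s=0.4):
    """Synthesize an idealized envelope pulse at each breath tag and
    fill it with white noise to simulate the original breath sounds.
    A tag is (peak time in seconds, amplitude)."""
    n = int(duration_s * fs)
    out = np.zeros(n)
    half = int(pulse_s * fs / 2)
    pulse = 0.5 * (1 - np.cos(np.pi * np.arange(2 * half) / half))
    for t_peak, amp in tags:
        c = int(t_peak * fs)
        lo, hi = max(0, c - half), min(n, c + half)
        env = amp * pulse[lo - (c - half):hi - (c - half)]
        out[lo:hi] += env * np.random.randn(hi - lo)       # white-noise fill
    return out
```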


In various other embodiments, acoustic breathing waveforms are detected by an acoustic sensor, processed, transmitted and played on a local or remote speaker or other audio output from actual (raw) data, synthetic data or artificial data. Actual data may be compressed, but is a nearly complete or totally complete reproduction of the actual acoustic sounds at the sensor. Synthetic data may be a synthetic version of the breathing sound with the option for the remote listener to request additional resolution. Artificial data may simulate an acoustic sensor sound with minimal data rate or bandwidth, but is not as clinically useful as synthetic or actual data. Artificial data may be, for example, white noise bursts generated in sync with sensed respiration. Synthetic data is something between actual data and artificial data, such as the acoustic envelope process described above that incorporates some information from the actual sensor signal. In an embodiment, breath sounds are artificially hi/lo frequency shifted or hi/lo volume amplified to distinguish inhalation/exhalation. In an embodiment, dual acoustic sensors placed along the neck are responsive to the relative time of arrival of tracheal sounds so as to distinguish inhalation and exhalation in order to appropriately generate the hi/lo frequency shifts. Raw and compressed acoustic respiration data is described with respect to FIG. 9, below. Artificial data “breath beeps” are described with respect to FIGS. 10A-B, below.
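
A sketch of the dual-sensor time-of-arrival idea; the cross-correlation estimator and the mapping of lag sign to respiratory phase are assumptions that would need calibration for an actual sensor placement.

```python
import numpy as np

def tracheal_lag_seconds(upper, lower, fs=2_000):
    """Estimate the arrival-time lag (s) of a tracheal sound at the
    upper-neck sensor relative to the lower-neck sensor using
    cross-correlation. Positive lag means the lower sensor heard the
    sound first."""
    a = upper - upper.mean()
    b = lower - lower.mean()
    corr = np.correlate(a, b, mode="full")
    return (np.argmax(corr) - (len(b) - 1)) / fs

def label_phase(lag_s):
    # Mapping lag sign to inhalation vs. exhalation is an assumption
    # about sensor placement and would be calibrated in practice.
    return "exhalation" if lag_s > 0 else "inhalation"
```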



FIG. 9 illustrates a processor board 900 embodiment of an acoustic monitoring system that generates both optical and acoustic data. An optical portion has D/A converters 958 responsive to emitter drives 954 and an emitter control 950 that alternately activate optical sensor 902 LEDs of multiple wavelengths so as to illuminate blood perfused tissue. An A/D converter 960 and demodulator 964 are responsive to sensor 902 detectors so as to generate plethysmographic data 968 to a digital signal processor (DSP) 980. Corresponding blood parameter algorithms 970 generate blood parameter outputs 972, such as oxygen saturation (SpO2), to a data channel 990.


Also shown in FIG. 9, an acoustic portion has an A/D converter 910, a decimation filter and mixer 920, a coder/compressor 930 and a decoder/decompressor 935 so as to generate acoustic data to the DSP 980. The A/D 910, decimation filter/mixer 920 and a D/A converter 925 are responsive to an acoustic sensor 901 so as to generate an analog “raw” audio 909 output. In an embodiment, the A/D 910 is a 48 kHz, 16-bit, 2-channel variable gain device that provides higher resolution at lower signal amplitudes and lower resolution at higher signal amplitudes. In an embodiment, the decimation filter/mixer generates 2 kHz, 32-bit (64 kbps) digitized raw audio 922. Advantageously, the raw audio 909 is routed to a proximate amplifier/speaker 122 (FIG. 1). The digitized raw audio 922 is also input to the coder/compressor 930. A 3:1 (approximate) compression generates a 20 kbps compressed (digitized) audio 908 output. The compressed audio 908 is immediately input into a decoder/decompressor 935 for use by acoustic algorithms 940 to generate respiration rate (RR) and breath tag 942 outputs to a data channel 990, as described above, among other acoustic-related parameters. Advantageously, the compression and immediate decompression of the digitized raw audio 922 provides a compressed audio output 908 that can be stored and retrieved for accurate off-line reproduction and troubleshooting of device behavior. Also, the compressed audio output 908 can be advantageously transmitted via Wi-Fi or other communication links to remote locations for processing and patient analysis.
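
The stated rates can be checked with simple arithmetic:

```python
# Data-rate bookkeeping for the FIG. 9 acoustic path described above.
adc_bps = 48_000 * 16 * 2      # 48 kHz, 16-bit, 2-channel A/D = 1.536 Mbps
raw_bps = 2_000 * 32           # 2 kHz, 32-bit raw audio = 64 kbps
compressed_bps = raw_bps / 3   # ~3:1 coder -> ~21 kbps (the ~20 kbps output)

print(f"A/D output: {adc_bps / 1e6:.3f} Mbps")
print(f"raw audio:  {raw_bps / 1e3:.0f} kbps")
print(f"compressed: {compressed_bps / 1e3:.1f} kbps")
```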



FIGS. 10A-B illustrate a “respiration beep” embodiment 1001 for communicating reduced-rate respiration data over relatively low bandwidth monitor data channels and patient networks. As shown in FIG. 10A, in some situations, acoustic respiration data 1000 presents an inspiration (I) 1010 pulse relatively closely followed by an expiration (E) 1020 pulse, where each I/E pair is separated by a relatively longer pulseless interval. That is, these I/E pairs are relatively easily distinguished. As such, I/E pairs can be transmitted simply as time-of-occurrence values.


As shown in FIG. 10B, at an inspiration time 1062, a high (HI) frequency tone 1064 is generated. At an expiration time 1062, a low (LO) frequency tone 1064 is generated. A mixer 1050 combines colored noise 1042 with the HI/LO tones 1064 to generate higher-pitched followed by lower-pitched noise pulses representative of the original acoustic waveform 1000. These respiration “beeps” are roughly analogous to pulse oximeter-generated “beeps” that coincide with optical sensor detected arterial blood pulses. In an advantageous embodiment, a processor board 900 (FIG. 9) having optical and acoustic sensors generates simultaneously occurring respiration beeps and pulse beeps, where the pulse beep tone is easily distinguished from the respiration beep HI/LO noise pulses. These combined pulse/respiration beeps advantageously allow a care provider to “monitor” a patient's respiration and pulse by sound alone.
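
A toy generator for such HI/LO respiration beeps in the spirit of the FIG. 10B mixer; the tone frequencies, noise coloring, mix ratio and burst duration are illustrative assumptions.

```python
import numpy as np

def respiration_beep(phase, dur_s=0.15, fs=8_000):
    """Generate a HI-pitched noise burst for inspiration or a LO-pitched
    burst for expiration, mixing a tone with colored noise as in the
    FIG. 10B mixer."""
    t = np.arange(int(dur_s * fs)) / fs
    tone_hz = 880.0 if phase == "inspiration" else 440.0
    tone = np.sin(2 * np.pi * tone_hz * t)
    colored = np.cumsum(np.random.randn(t.size))   # integrated white noise
    colored /= np.max(np.abs(colored)) + 1e-12
    return np.hanning(t.size) * (0.7 * tone + 0.3 * colored)
```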



FIG. 11 illustrates a wireless physiological acoustic monitor 1100 embodiment, which is particularly advantageous for out-patient applications, such as sudden infant death syndrome (SIDS) prevention and elder care. The monitor 1100 has a sensor section 1101 and a remote section 1102. The sensor section 1101 has a sensor 1110, a sensor interface 1120 and a communications element 1130. In an embodiment, the sensor 1110 is an adhesive substrate integrated with a piezoelectric assembly and interconnect cable, such as described with respect to FIGS. 3A-B, above. The sensor interface 1120 provides power to and receives the sensor signal from the sensor piezo circuit, as described with respect to FIG. 4, above. The wireless communications element 1130 receives the sensor signal from the sensor interface 1120 and transmits the signal to the corresponding communications element 1140 in the remote section 1102, which provides an amplified sensor signal sufficient to drive a small speaker. In an embodiment, the communications link 1160 conforms with IEEE 802.15 (Bluetooth).


A physiological acoustic monitoring system has been disclosed in detail in connection with various embodiments. These embodiments are disclosed by way of example only and are not intended to limit the scope of the claims that follow. One of ordinary skill in the art will appreciate many variations and modifications.

Claims
  • 1. A system for transmitting physiological sounds of a patient across a network, the system comprising: a body sound sensor configured to detect a sound signal corresponding to a physiological process; and one or more hardware processors configured to: convert the detected sound signal from an analog signal to a digital signal; compress the converted sound signal; generate a tag corresponding to a physiological sound, said tag indicating a time of occurrence of the physiological sound within the detected sound signal; and transmit, over a network, the compressed sound signal and the tag corresponding to the physiological sound, wherein the tag enables a remote computing system to identify the time of occurrence of the physiological sound in the transmitted sound signal.
  • 2. The system of claim 1, wherein the one or more hardware processors are further configured to down-sample the converted sound signal with a decimation filter prior to the compression.
  • 3. The system of claim 1, wherein the remote computing system includes a control configured to enable a user to selectively emphasize a first range of frequencies.
  • 4. The system of claim 1, wherein the one or more hardware processors are further configured to apply envelope based sound processing to the compressed sound signal.
  • 5. The system of claim 4, wherein the one or more hardware processors are configured to transmit the envelope based processed signal in lieu of the compressed sound signal.
  • 6. The system of claim 5, wherein the one or more hardware processors are configured to generate an artificial sound signal that corresponds to the detected sound signal based on the received envelope based processed signal.
  • 7. The system of claim 1, wherein the body sound sensor comprises a piezoelectric sensor.
  • 8. The system of claim 1, wherein the physiological sound comprises coughing.
  • 9. The system of claim 1, wherein the one or more hardware processors are configured to detect respiration rate from the converted sound signal.
  • 10. The system of claim 1, wherein the tag is generated based on detection of features in the converted sound signal.
  • 11. The system of claim 10, wherein the features comprise relatively high amplitude.
  • 12. The system of claim 10, wherein the features comprise a sharp slope.
Parent Case Info

This application is a continuation of U.S. patent application Ser. No. 15/184,951, filed Jun. 16, 2016, titled PHYSIOLOGICAL ACOUSTIC MONITORING SYSTEM, which is a continuation of U.S. patent application Ser. No. 14/522,474, filed Oct. 23, 2014, now U.S. Pat. No. 9,370,335, titled PHYSIOLOGICAL ACOUSTIC MONITORING SYSTEM, which is a continuation of U.S. patent application Ser. No. 13/650,775, filed Oct. 12, 2012, now U.S. Pat. No. 8,870,792, titled PHYSIOLOGICAL ACOUSTIC MONITORING SYSTEM, which claims the benefit of priority under 35 U.S.C. § 119(e) of U.S. Provisional Patent Application No. 61/547,007, filed Oct. 13, 2011, titled Physiological Acoustic Monitoring System, and is a continuation-in-part of U.S. patent application Ser. No. 12/905,036, filed Oct. 14, 2010, now U.S. Pat. No. 8,821,415, titled Physiological Acoustic Monitoring System, which claims the benefit of priority under 35 U.S.C. § 119(e) of U.S. Provisional Patent Application No. 61/252,099, filed Oct. 15, 2009, and U.S. Provisional Patent Application No. 61/391,098, filed Oct. 8, 2010, the disclosures of which are hereby incorporated in their entirety by reference herein. Additionally, this application relates to the following U.S. patent applications, the disclosures of which are incorporated in their entirety by reference herein:

App. No.     Filing Date     Title
60/893,853   Mar. 8, 2007    MULTI-PARAMETER PHYSIOLOGICAL MONITOR
60/893,850   Mar. 8, 2007    BACKWARD COMPATIBLE PHYSIOLOGICAL SENSOR WITH INFORMATION ELEMENT
60/893,858   Mar. 8, 2007    MULTI-PARAMETER SENSOR FOR PHYSIOLOGICAL MONITORING
60/893,856   Mar. 8, 2007    PHYSIOLOGICAL MONITOR WITH FAST GAIN ADJUST DATA ACQUISITION
12/044,883   Mar. 8, 2008    SYSTEMS AND METHODS FOR DETERMINING A PHYSIOLOGICAL CONDITION USING AN ACOUSTIC MONITOR
61/252,083   Oct. 15, 2009   DISPLAYING PHYSIOLOGICAL INFORMATION
12/904,823   Oct. 14, 2010   BIDIRECTIONAL PHYSIOLOGICAL INFORMATION DISPLAY
61/141,584   Dec. 30, 2008   ACOUSTIC SENSOR ASSEMBLY
61/252,076   Oct. 15, 2009   ACOUSTIC SENSOR ASSEMBLY
12/643,939   Dec. 21, 2009   ACOUSTIC SENSOR ASSEMBLY
61/313,645   Mar. 12, 2010   ACOUSTIC RESPIRATORY MONITORING SENSOR HAVING MULTIPLE SENSING ELEMENTS
12/904,890   Oct. 14, 2010   ACOUSTIC RESPIRATORY MONITORING SENSOR HAVING MULTIPLE SENSING ELEMENTS
12/904,931   Oct. 14, 2010   ACOUSTIC RESPIRATORY MONITORING SENSOR HAVING MULTIPLE SENSING ELEMENTS
12/904,938   Oct. 14, 2010   ACOUSTIC RESPIRATORY MONITORING SENSOR HAVING MULTIPLE SENSING ELEMENTS
12/904,907   Oct. 14, 2010   ACOUSTIC PATIENT SENSOR
12/904,789   Oct. 14, 2010   ACOUSTIC RESPIRATORY MONITORING SYSTEMS AND METHODS
61/252,062   Oct. 15, 2009   PULSE OXIMETRY SYSTEM WITH LOW NOISE CABLE HUB
61/265,730   Dec. 1, 2009    PULSE OXIMETRY SYSTEM WITH ACOUSTIC SENSOR
12/904,775   Oct. 14, 2010   PULSE OXIMETRY SYSTEM WITH LOW NOISE CABLE HUB
12/905,036   Oct. 14, 2010   PHYSIOLOGICAL ACOUSTIC MONITORING SYSTEM
61/331,087   May 4, 2010     ACOUSTIC RESPIRATION DISPLAY
14/473,831   Aug. 29, 2014   PHYSIOLOGICAL ACOUSTIC MONITORING SYSTEM

US Referenced Citations (905)
Number Name Date Kind
4960128 Gordon et al. Oct 1990 A
4964408 Hink et al. Oct 1990 A
5041187 Hink et al. Aug 1991 A
5069213 Polczynski Dec 1991 A
5163438 Gordon et al. Nov 1992 A
5319355 Russek Jun 1994 A
5337744 Branigan Aug 1994 A
5341805 Stavridi et al. Aug 1994 A
D353195 Savage et al. Dec 1994 S
D353196 Savage et al. Dec 1994 S
5377676 Vari et al. Jan 1995 A
D359546 Savage et al. Jun 1995 S
5431170 Mathews Jul 1995 A
D361840 Savage et al. Aug 1995 S
D362063 Savage et al. Sep 1995 S
5452717 Branigan et al. Sep 1995 A
D363120 Savage et al. Oct 1995 S
5456252 Vari et al. Oct 1995 A
5479934 Imran Jan 1996 A
5482036 Diab et al. Jan 1996 A
5490505 Diab et al. Feb 1996 A
5494043 O'Sullivan et al. Feb 1996 A
5533511 Kaspari et al. Jul 1996 A
5534851 Russek Jul 1996 A
5561275 Savage et al. Oct 1996 A
5562002 Lalin Oct 1996 A
5590649 Caro et al. Jan 1997 A
5602924 Durand et al. Feb 1997 A
5632272 Diab et al. May 1997 A
5638816 Kiani-Azarbayjany et al. Jun 1997 A
5638818 Diab et al. Jun 1997 A
5645440 Tobler et al. Jul 1997 A
5685299 Diab et al. Nov 1997 A
D393830 Tobler et al. Apr 1998 S
5743262 Lepper, Jr. et al. Apr 1998 A
5758644 Diab et al. Jun 1998 A
5760910 Lepper, Jr. et al. Jun 1998 A
5769785 Diab et al. Jun 1998 A
5782757 Diab et al. Jul 1998 A
5785659 Caro et al. Jul 1998 A
5791347 Flaherty et al. Aug 1998 A
5810734 Caro et al. Sep 1998 A
5823950 Diab et al. Oct 1998 A
5830131 Caro et al. Nov 1998 A
5833618 Caro et al. Nov 1998 A
5860919 Kiani-Azarbayjany et al. Jan 1999 A
5890929 Mills et al. Apr 1999 A
5904654 Wohltmann et al. May 1999 A
5919134 Diab Jul 1999 A
5934925 Tobler et al. Aug 1999 A
5940182 Lepper, Jr. et al. Aug 1999 A
5987343 Kinast Nov 1999 A
5995855 Kiani et al. Nov 1999 A
5997343 Mills et al. Dec 1999 A
6002952 Diab et al. Dec 1999 A
6011986 Diab et al. Jan 2000 A
6027452 Flaherty et al. Feb 2000 A
6036642 Diab et al. Mar 2000 A
6045509 Caro et al. Apr 2000 A
6067462 Diab et al. May 2000 A
6081735 Diab et al. Jun 2000 A
6088607 Diab et al. Jul 2000 A
6110522 Lepper, Jr. et al. Aug 2000 A
6124597 Shehada Sep 2000 A
6128521 Marro et al. Oct 2000 A
6129675 Jay Oct 2000 A
6144868 Parker Nov 2000 A
6151516 Kiani-Azarbayjany et al. Nov 2000 A
6152754 Gerhardt et al. Nov 2000 A
6157850 Diab et al. Dec 2000 A
6165005 Mills et al. Dec 2000 A
6184521 Coffin, IV et al. Feb 2001 B1
6206830 Diab et al. Mar 2001 B1
6229856 Diab et al. May 2001 B1
6232609 Snyder et al. May 2001 B1
6236872 Diab et al. May 2001 B1
6241683 Macklem et al. Jun 2001 B1
6253097 Aronow et al. Jun 2001 B1
6256523 Diab et al. Jul 2001 B1
6263222 Diab et al. Jul 2001 B1
6278522 Lepper, Jr. et al. Aug 2001 B1
6280213 Tobler et al. Aug 2001 B1
6285896 Tobler et al. Sep 2001 B1
6301493 Marro et al. Oct 2001 B1
6308089 von der Ruhr et al. Oct 2001 B1
6317627 Ennen et al. Nov 2001 B1
6321100 Parker Nov 2001 B1
6325761 Jay Dec 2001 B1
6334065 Al-Ali et al. Dec 2001 B1
6343224 Parker Jan 2002 B1
6349228 Kiani et al. Feb 2002 B1
6360114 Diab et al. Mar 2002 B1
6368283 Xu et al. Apr 2002 B1
6371921 Caro et al. Apr 2002 B1
6377829 Al-Ali Apr 2002 B1
6388240 Schulz et al. May 2002 B2
6397091 Diab et al. May 2002 B2
6430437 Marro Aug 2002 B1
6430525 Weber et al. Aug 2002 B1
6463311 Diab Oct 2002 B1
6470199 Kopotic et al. Oct 2002 B1
6501975 Diab et al. Dec 2002 B2
6505059 Kollias et al. Jan 2003 B1
6515273 Al-Ali Feb 2003 B2
6519487 Parker Feb 2003 B1
6525386 Mills et al. Feb 2003 B1
6526300 Kiani et al. Feb 2003 B1
6541756 Schulz et al. Apr 2003 B2
6542764 Al-Ali et al. Apr 2003 B1
6580086 Schulz et al. Jun 2003 B1
6584336 Ali et al. Jun 2003 B1
6595316 Cybulski et al. Jul 2003 B2
6597932 Tian et al. Jul 2003 B2
6597933 Kiani et al. Jul 2003 B2
6606511 Ali et al. Aug 2003 B1
6632181 Flaherty et al. Oct 2003 B2
6639668 Trepagnier Oct 2003 B1
6640116 Diab Oct 2003 B2
6643530 Diab et al. Nov 2003 B2
6650917 Diab et al. Nov 2003 B2
6654624 Diab et al. Nov 2003 B2
6658276 Kiani et al. Dec 2003 B2
6661161 Lanzo et al. Dec 2003 B1
6671531 Al-Ali et al. Dec 2003 B2
6678543 Diab et al. Jan 2004 B2
6684090 Ali et al. Jan 2004 B2
6684091 Parker Jan 2004 B2
6697656 Al-Ali Feb 2004 B1
6697657 Shehada et al. Feb 2004 B1
6697658 Al-Ali Feb 2004 B2
RE38476 Diab et al. Mar 2004 E
6699194 Diab et al. Mar 2004 B1
6714804 Al-Ali et al. Mar 2004 B2
RE38492 Diab et al. Apr 2004 E
6721582 Trepagnier et al. Apr 2004 B2
6721585 Parker Apr 2004 B1
6725075 Al-Ali Apr 2004 B2
6728560 Kollias et al. Apr 2004 B2
6735459 Parker May 2004 B2
6745060 Diab et al. Jun 2004 B2
6760607 Al-Ali Jul 2004 B2
6770028 Ali et al. Aug 2004 B1
6771994 Kiani et al. Aug 2004 B2
6792300 Diab et al. Sep 2004 B1
6813511 Diab et al. Nov 2004 B2
6816741 Diab Nov 2004 B2
6822564 Al-Ali Nov 2004 B2
6826419 Diab et al. Nov 2004 B2
6830711 Mills et al. Dec 2004 B2
6850787 Weber et al. Feb 2005 B2
6850788 Al-Ali Feb 2005 B2
6852083 Caro et al. Feb 2005 B2
6861639 Al-Ali Mar 2005 B2
6898452 Al-Ali et al. May 2005 B2
6920345 Al-Ali et al. Jul 2005 B2
6931268 Kiani-Azarbayjany et al. Aug 2005 B1
6934570 Kiani et al. Aug 2005 B2
6939305 Flaherty et al. Sep 2005 B2
6943348 Coffin, IV Sep 2005 B1
6950687 Al-Ali Sep 2005 B2
6961598 Diab Nov 2005 B2
6970792 Diab Nov 2005 B1
6979812 Al-Ali Dec 2005 B2
6985764 Mason et al. Jan 2006 B2
6993371 Kiani et al. Jan 2006 B2
6996427 Ali et al. Feb 2006 B2
6999904 Weber et al. Feb 2006 B2
7003338 Weber et al. Feb 2006 B2
7003339 Diab et al. Feb 2006 B2
7015451 Dalke et al. Mar 2006 B2
7024233 Ali et al. Apr 2006 B2
7027849 Al-Ali Apr 2006 B2
7030749 Al-Ali Apr 2006 B2
7039449 Al-Ali May 2006 B2
7041060 Flaherty et al. May 2006 B2
7044918 Diab May 2006 B2
7048687 Reuss et al. May 2006 B1
7067893 Mills et al. Jun 2006 B2
7096052 Mason et al. Aug 2006 B2
7096054 Abdul-Hafiz et al. Aug 2006 B2
7132641 Schulz et al. Nov 2006 B2
7142901 Kiani et al. Nov 2006 B2
7149561 Diab Dec 2006 B2
7186966 Al-Ali Mar 2007 B2
7190261 Al-Ali Mar 2007 B2
7215984 Diab May 2007 B2
7215986 Diab May 2007 B2
7221971 Diab May 2007 B2
7225006 Al-Ali et al. May 2007 B2
7225007 Al-Ali May 2007 B2
RE39672 Shehada et al. Jun 2007 E
7239905 Kiani-Azarbayjany et al. Jul 2007 B2
7245953 Parker Jul 2007 B1
7254429 Schurman et al. Aug 2007 B2
7254431 Al-Ali Aug 2007 B2
7254433 Diab et al. Aug 2007 B2
7254434 Schulz et al. Aug 2007 B2
7272425 Al-Ali Sep 2007 B2
7274955 Kiani et al. Sep 2007 B2
D554263 Al-Ali Oct 2007 S
7280858 Al-Ali et al. Oct 2007 B2
7289835 Mansfield et al. Oct 2007 B2
7292883 De Felice et al. Nov 2007 B2
7295866 Al-Ali Nov 2007 B2
7328053 Diab et al. Feb 2008 B1
7332784 Mills et al. Feb 2008 B2
7340287 Mason et al. Mar 2008 B2
7341559 Schulz et al. Mar 2008 B2
7343186 Lamego et al. Mar 2008 B2
D566282 Al-Ali et al. Apr 2008 S
7355512 Al-Ali Apr 2008 B1
7356365 Schurman Apr 2008 B2
7371981 Abdul-Hafiz May 2008 B2
7373193 Al-Ali et al. May 2008 B2
7373194 Weber et al. May 2008 B2
7376453 Diab et al. May 2008 B1
7377794 Al Ali et al. May 2008 B2
7377899 Weber et al. May 2008 B2
7383070 Diab et al. Jun 2008 B2
7415297 Al-Ali et al. Aug 2008 B2
7428432 Ali et al. Sep 2008 B2
7438683 Al-Ali et al. Oct 2008 B2
7440787 Diab Oct 2008 B2
7454240 Diab et al. Nov 2008 B2
7467002 Weber et al. Dec 2008 B2
7469157 Diab et al. Dec 2008 B2
7471969 Diab et al. Dec 2008 B2
7471971 Diab et al. Dec 2008 B2
7483729 Al-Ali et al. Jan 2009 B2
7483730 Diab et al. Jan 2009 B2
7489958 Diab et al. Feb 2009 B2
7496391 Diab et al. Feb 2009 B2
7496393 Diab et al. Feb 2009 B2
D587657 Al-Ali et al. Mar 2009 S
7499741 Diab et al. Mar 2009 B2
7499835 Weber et al. Mar 2009 B2
7500950 Al-Ali et al. Mar 2009 B2
7509154 Diab et al. Mar 2009 B2
7509494 Al-Ali Mar 2009 B2
7510849 Schurman et al. Mar 2009 B2
7526328 Diab et al. Apr 2009 B2
7530942 Diab May 2009 B1
7530949 Al Ali et al. May 2009 B2
7530955 Diab et al. May 2009 B2
7563110 Al-Ali et al. Jul 2009 B2
7596398 Al-Ali et al. Sep 2009 B2
7618375 Flaherty Nov 2009 B2
D606659 Kiani et al. Dec 2009 S
7647083 Al-Ali et al. Jan 2010 B2
D609193 Al-Ali et al. Feb 2010 S
D614305 Al-Ali et al. Apr 2010 S
RE41317 Parker May 2010 E
7729733 Al-Ali et al. Jun 2010 B2
7734320 Al-Ali Jun 2010 B2
7761127 Al-Ali et al. Jul 2010 B2
7761128 Al-Ali et al. Jul 2010 B2
7764982 Dalke et al. Jul 2010 B2
D621516 Kiani et al. Aug 2010 S
7791155 Diab Sep 2010 B2
7801581 Diab Sep 2010 B2
7822452 Schurman et al. Oct 2010 B2
RE41912 Parker Nov 2010 E
7844313 Kiani et al. Nov 2010 B2
7844314 Al-Ali Nov 2010 B2
7844315 Al-Ali Nov 2010 B2
7865222 Weber et al. Jan 2011 B2
7873497 Weber et al. Jan 2011 B2
7880606 Al-Ali Feb 2011 B2
7880626 Al-Ali et al. Feb 2011 B2
7891355 Al-Ali et al. Feb 2011 B2
7894868 Al-Ali et al. Feb 2011 B2
7899507 Al-Ali et al. Mar 2011 B2
7899518 Trepagnier et al. Mar 2011 B2
7904132 Weber et al. Mar 2011 B2
7909772 Popov et al. Mar 2011 B2
7910875 Al-Ali Mar 2011 B2
7919713 Al-Ali et al. Apr 2011 B2
7937128 Al-Ali May 2011 B2
7937129 Mason et al. May 2011 B2
7937130 Diab et al. May 2011 B2
7941199 Kiani May 2011 B2
7951086 Flaherty et al. May 2011 B2
7957780 Lamego et al. Jun 2011 B2
7962188 Kiani et al. Jun 2011 B2
7962190 Diab et al. Jun 2011 B1
7976472 Kiani Jul 2011 B2
7988637 Diab Aug 2011 B2
7990382 Kiani Aug 2011 B2
7991446 Al-Ali et al. Aug 2011 B2
8000761 Al-Ali Aug 2011 B2
8008088 Bellott et al. Aug 2011 B2
RE42753 Kiani-Azarbayjany et al. Sep 2011 E
8019400 Diab et al. Sep 2011 B2
8028701 Al-Ali et al. Oct 2011 B2
8029765 Bellott et al. Oct 2011 B2
8036727 Schurman et al. Oct 2011 B2
8036728 Diab et al. Oct 2011 B2
8046040 Ali et al. Oct 2011 B2
8046041 Diab et al. Oct 2011 B2
8046042 Diab et al. Oct 2011 B2
8048040 Kiani Nov 2011 B2
8050728 Al-Ali et al. Nov 2011 B2
RE43169 Parker Feb 2012 E
8118620 Al-Ali et al. Feb 2012 B2
8126528 Diab et al. Feb 2012 B2
8128572 Diab et al. Mar 2012 B2
8130105 Al-Ali et al. Mar 2012 B2
8145287 Diab et al. Mar 2012 B2
8150487 Diab et al. Apr 2012 B2
8175672 Parker May 2012 B2
8180420 Diab et al. May 2012 B2
8182443 Kiani May 2012 B1
8185180 Diab et al. May 2012 B2
8190223 Al-Ali et al. May 2012 B2
8190227 Diab et al. May 2012 B2
8203438 Kiani et al. Jun 2012 B2
8203704 Merritt et al. Jun 2012 B2
8204566 Schurman et al. Jun 2012 B2
8219172 Schurman et al. Jul 2012 B2
8224411 Al-Ali et al. Jul 2012 B2
8228181 Al-Ali Jul 2012 B2
8229533 Diab et al. Jul 2012 B2
8233955 Al-Ali et al. Jul 2012 B2
8244325 Al-Ali et al. Aug 2012 B2
8255026 Al-Ali Aug 2012 B1
8255027 Al-Ali et al. Aug 2012 B2
8255028 Al-Ali et al. Aug 2012 B2
8260577 Weber et al. Sep 2012 B2
8265723 McHale et al. Sep 2012 B1
8274360 Sampath et al. Sep 2012 B2
8280473 Al-Ali Oct 2012 B2
8301217 Al-Ali et al. Oct 2012 B2
8306596 Schurman et al. Nov 2012 B2
8310336 Muhsin et al. Nov 2012 B2
8315683 Al-Ali et al. Nov 2012 B2
RE43860 Parker Dec 2012 E
8337403 Al-Ali et al. Dec 2012 B2
8346330 Lamego Jan 2013 B2
8353842 Al-Ali et al. Jan 2013 B2
8355766 MacNeish, III et al. Jan 2013 B2
8359080 Diab et al. Jan 2013 B2
8364223 Al-Ali et al. Jan 2013 B2
8364226 Diab et al. Jan 2013 B2
8374665 Lamego Feb 2013 B2
8385995 Al-Ali et al. Feb 2013 B2
8385996 Smith et al. Feb 2013 B2
8388353 Kiani et al. Mar 2013 B2
8399822 Al-Ali Mar 2013 B2
8401602 Kiani Mar 2013 B2
8405608 Al-Ali et al. Mar 2013 B2
8414499 Al-Ali et al. Apr 2013 B2
8418524 Al-Ali Apr 2013 B2
8423106 Lamego et al. Apr 2013 B2
8428967 Olsen et al. Apr 2013 B2
8430817 Al-Ali et al. Apr 2013 B1
8437825 Dalvi et al. May 2013 B2
8455290 Siskavich Jun 2013 B2
8457703 Al-Ali Jun 2013 B2
8457707 Kiani Jun 2013 B2
8463349 Diab et al. Jun 2013 B2
8466286 Bellott et al. Jun 2013 B2
8471713 Poeze et al. Jun 2013 B2
8473020 Kiani et al. Jun 2013 B2
8483787 Al-Ali et al. Jul 2013 B2
8489364 Weber et al. Jul 2013 B2
8498684 Weber et al. Jul 2013 B2
8504128 Blank et al. Aug 2013 B2
8509867 Workman et al. Aug 2013 B2
8515509 Bruinsma et al. Aug 2013 B2
8523781 Al-Ali Sep 2013 B2
8529301 Al-Ali et al. Sep 2013 B2
8532727 Ali et al. Sep 2013 B2
8532728 Diab et al. Sep 2013 B2
D692145 Al-Ali et al. Oct 2013 S
8547209 Kiani et al. Oct 2013 B2
8548548 Al-Ali Oct 2013 B2
8548549 Schurman et al. Oct 2013 B2
8548550 Al-Ali et al. Oct 2013 B2
8560032 Al-Ali et al. Oct 2013 B2
8560034 Diab et al. Oct 2013 B1
8570167 Al-Ali Oct 2013 B2
8570503 Vo et al. Oct 2013 B2
8571617 Reichgott et al. Oct 2013 B2
8571618 Lamego et al. Oct 2013 B1
8571619 Al-Ali et al. Oct 2013 B2
8577431 Lamego et al. Nov 2013 B2
8581732 Al-Ali et al. Nov 2013 B2
8584345 Al-Ali et al. Nov 2013 B2
8588880 Abdul-Hafiz et al. Nov 2013 B2
8600467 Al-Ali et al. Dec 2013 B2
8606342 Diab Dec 2013 B2
8626255 Al-Ali et al. Jan 2014 B2
8630691 Lamego et al. Jan 2014 B2
8634889 Al-Ali et al. Jan 2014 B2
8641631 Sierra et al. Feb 2014 B2
8652060 Al-Ali Feb 2014 B2
8663107 Kiani Mar 2014 B2
8666468 Al-Ali Mar 2014 B1
8667967 Al-Ali et al. Mar 2014 B2
8670811 O'Reilly Mar 2014 B2
8670814 Diab et al. Mar 2014 B2
8676286 Weber et al. Mar 2014 B2
8682407 Al-Ali Mar 2014 B2
RE44823 Parker Apr 2014 E
RE44875 Kiani et al. Apr 2014 E
8690799 Telfort et al. Apr 2014 B2
8700112 Kiani Apr 2014 B2
8702627 Telfort et al. Apr 2014 B2
8706179 Parker Apr 2014 B2
8712494 MacNeish, III et al. Apr 2014 B1
8715206 Telfort et al. May 2014 B2
8718735 Lamego et al. May 2014 B2
8718737 Diab et al. May 2014 B2
8718738 Blank et al. May 2014 B2
8720249 Al-Ali May 2014 B2
8721541 Al-Ali et al. May 2014 B2
8721542 Al-Ali et al. May 2014 B2
8723677 Kiani May 2014 B1
8740792 Kiani et al. Jun 2014 B1
8754776 Poeze et al. Jun 2014 B2
8755535 Telfort et al. Jun 2014 B2
8755856 Diab et al. Jun 2014 B2
8755872 Marinow Jun 2014 B1
8761850 Lamego Jun 2014 B2
8764671 Kiani Jul 2014 B2
8768423 Shakespeare et al. Jul 2014 B2
8771204 Telfort et al. Jul 2014 B2
8777634 Kiani et al. Jul 2014 B2
8781543 Diab et al. Jul 2014 B2
8781544 Al-Ali et al. Jul 2014 B2
8781549 Al-Ali et al. Jul 2014 B2
8788003 Schurman et al. Jul 2014 B2
8790268 Al-Ali Jul 2014 B2
8801613 Al-Ali et al. Aug 2014 B2
8821397 Al-Ali et al. Sep 2014 B2
8821415 Al-Ali et al. Sep 2014 B2
8830449 Lamego et al. Sep 2014 B1
8831700 Schurman et al. Sep 2014 B2
8840549 Al-Ali et al. Sep 2014 B2
8847740 Kiani et al. Sep 2014 B2
8849365 Smith et al. Sep 2014 B2
8852094 Al-Ali et al. Oct 2014 B2
8852994 Wojtczuk et al. Oct 2014 B2
8868147 Stippick et al. Oct 2014 B2
8868150 Al-Ali et al. Oct 2014 B2
8870792 Al-Ali et al. Oct 2014 B2
8886271 Kiani et al. Nov 2014 B2
8888539 Al-Ali et al. Nov 2014 B2
8888708 Diab et al. Nov 2014 B2
8892180 Weber et al. Nov 2014 B2
8897847 Al-Ali Nov 2014 B2
8909310 Lamego et al. Dec 2014 B2
8911377 Al-Ali Dec 2014 B2
8912909 Al-Ali et al. Dec 2014 B2
8920317 Al-Ali et al. Dec 2014 B2
8921699 Al-Ali et al. Dec 2014 B2
8922382 Al-Ali et al. Dec 2014 B2
8929964 Al-Ali et al. Jan 2015 B2
8942777 Diab et al. Jan 2015 B2
8948834 Diab et al. Feb 2015 B2
8948835 Diab Feb 2015 B2
8965471 Lamego Feb 2015 B2
8983564 Al-Ali Mar 2015 B2
8989831 Al-Ali et al. Mar 2015 B2
8996085 Kiani et al. Mar 2015 B2
8998809 Kiani Apr 2015 B2
9028429 Telfort et al. May 2015 B2
9037207 Al-Ali et al. May 2015 B2
9060721 Reichgott et al. Jun 2015 B2
9066666 Kiani Jun 2015 B2
9066680 Al-Ali et al. Jun 2015 B1
9072474 Al-Ali et al. Jul 2015 B2
9078560 Schurman et al. Jul 2015 B2
9084569 Weber et al. Jul 2015 B2
9095316 Welch et al. Aug 2015 B2
9106038 Telfort et al. Aug 2015 B2
9107625 Telfort et al. Aug 2015 B2
9107626 Al-Ali et al. Aug 2015 B2
9113831 Al-Ali Aug 2015 B2
9113832 Al-Ali Aug 2015 B2
9119595 Lamego Sep 2015 B2
9131881 Diab et al. Sep 2015 B2
9131882 Al-Ali et al. Sep 2015 B2
9131883 Al-Ali Sep 2015 B2
9131917 Telfort et al. Sep 2015 B2
9138180 Coverston et al. Sep 2015 B1
9138182 Al-Ali et al. Sep 2015 B2
9138192 Weber et al. Sep 2015 B2
9142117 Muhsin et al. Sep 2015 B2
9153112 Kiani et al. Oct 2015 B1
9153121 Kiani et al. Oct 2015 B2
9161696 Al-Ali et al. Oct 2015 B2
9161713 Al-Ali et al. Oct 2015 B2
9167995 Lamego et al. Oct 2015 B2
9176141 Al-Ali et al. Nov 2015 B2
9186102 Bruinsma et al. Nov 2015 B2
9192312 Al-Ali Nov 2015 B2
9192329 Al-Ali Nov 2015 B2
9192351 Telfort et al. Nov 2015 B1
9195385 Al-Ali et al. Nov 2015 B2
9211072 Kiani Dec 2015 B2
9211095 Al-Ali Dec 2015 B1
9218454 Kiani et al. Dec 2015 B2
9226696 Kiani Jan 2016 B2
9241662 Al-Ali et al. Jan 2016 B2
9245668 Vo et al. Jan 2016 B1
9259185 Abdul-Hafiz et al. Feb 2016 B2
9267572 Barker et al. Feb 2016 B2
9277880 Poeze et al. Mar 2016 B2
9289167 Diab et al. Mar 2016 B2
9295421 Kiani et al. Mar 2016 B2
9307928 Al-Ali et al. Apr 2016 B1
9323894 Kiani Apr 2016 B2
D755392 Hwang et al. May 2016 S
9326712 Kiani May 2016 B1
9333316 Kiani May 2016 B2
9339220 Lamego et al. May 2016 B2
9341565 Lamego et al. May 2016 B2
9351673 Diab et al. May 2016 B2
9351675 Al-Ali et al. May 2016 B2
9364181 Kiani et al. Jun 2016 B2
9368671 Wojtczuk et al. Jun 2016 B2
9370325 Al-Ali et al. Jun 2016 B2
9370326 McHale et al. Jun 2016 B2
9370335 Al-Ali et al. Jun 2016 B2
9375185 Ali et al. Jun 2016 B2
9386953 Al-Ali Jul 2016 B2
9386961 Al-Ali et al. Jul 2016 B2
9392945 Al-Ali et al. Jul 2016 B2
9397448 Al-Ali et al. Jul 2016 B2
9408542 Kinast et al. Aug 2016 B1
9436645 Al-Ali et al. Sep 2016 B2
9445759 Lamego et al. Sep 2016 B1
9466919 Kiani et al. Oct 2016 B2
9474474 Lamego et al. Oct 2016 B2
9480422 Al-Ali Nov 2016 B2
9480435 Olsen Nov 2016 B2
9492110 Al-Ali et al. Nov 2016 B2
9510779 Poeze et al. Dec 2016 B2
9517024 Kiani et al. Dec 2016 B2
9532722 Lamego et al. Jan 2017 B2
9538949 Al-Ali et al. Jan 2017 B2
9538980 Telfort et al. Jan 2017 B2
9549696 Lamego et al. Jan 2017 B2
9554737 Schurman et al. Jan 2017 B2
9560996 Kiani Feb 2017 B2
9560998 Al-Ali et al. Feb 2017 B2
9566019 Al-Ali et al. Feb 2017 B2
9579039 Jansen et al. Feb 2017 B2
9591975 Dalvi et al. Mar 2017 B2
9622692 Lamego et al. Apr 2017 B2
9622693 Diab Apr 2017 B2
D788312 Al-Ali et al. May 2017 S
9636055 Al-Ali et al. May 2017 B2
9636056 Al-Ali May 2017 B2
9649054 Lamego et al. May 2017 B2
9662052 Al-Ali et al. May 2017 B2
9668679 Schurman et al. Jun 2017 B2
9668680 Bruinsma et al. Jun 2017 B2
9668703 Al-Ali Jun 2017 B2
9675286 Diab Jun 2017 B2
9687160 Kiani Jun 2017 B2
9693719 Al-Ali et al. Jul 2017 B2
9693737 Al-Ali Jul 2017 B2
9697928 Al-Ali et al. Jul 2017 B2
9717425 Kiani et al. Aug 2017 B2
9717458 Lamego et al. Aug 2017 B2
9724016 Al-Ali et al. Aug 2017 B1
9724024 Al-Ali Aug 2017 B2
9724025 Kiani et al. Aug 2017 B1
9730640 Diab et al. Aug 2017 B2
9743887 Al-Ali et al. Aug 2017 B2
9749232 Sampath et al. Aug 2017 B2
9750442 Olsen Sep 2017 B2
9750443 Smith et al. Sep 2017 B2
9750461 Telfort Sep 2017 B1
9775545 Al-Ali et al. Oct 2017 B2
9775546 Diab et al. Oct 2017 B2
9775570 Al-Ali Oct 2017 B2
9778079 Al-Ali et al. Oct 2017 B1
9782077 Lamego et al. Oct 2017 B2
9782110 Kiani Oct 2017 B2
9787568 Lamego et al. Oct 2017 B2
9788735 Al-Ali Oct 2017 B2
9788768 Al-Ali et al. Oct 2017 B2
9795300 Al-Ali Oct 2017 B2
9795310 Al-Ali Oct 2017 B2
9795358 Telfort et al. Oct 2017 B2
9795739 Al-Ali et al. Oct 2017 B2
9801556 Kiani Oct 2017 B2
9801588 Weber et al. Oct 2017 B2
9808188 Perea et al. Nov 2017 B1
9814418 Weber et al. Nov 2017 B2
9820691 Kiani Nov 2017 B2
9833152 Kiani et al. Dec 2017 B2
9833180 Shakespeare et al. Dec 2017 B2
9839379 Al-Ali et al. Dec 2017 B2
9839381 Weber et al. Dec 2017 B1
9847002 Kiani et al. Dec 2017 B2
9847749 Kiani et al. Dec 2017 B2
9848800 Lee et al. Dec 2017 B1
9848806 Al-Ali et al. Dec 2017 B2
9848807 Lamego Dec 2017 B2
9861298 Eckerbom et al. Jan 2018 B2
9861304 Al-Ali et al. Jan 2018 B2
9861305 Weber et al. Jan 2018 B1
9867578 Al-Ali et al. Jan 2018 B2
9872623 Al-Ali Jan 2018 B2
9876320 Coverston et al. Jan 2018 B2
9877650 Muhsin et al. Jan 2018 B2
9877686 Al-Ali et al. Jan 2018 B2
9891079 Dalvi Feb 2018 B2
9895107 Al-Ali et al. Feb 2018 B2
9913617 Al-Ali et al. Mar 2018 B2
9924893 Schurman et al. Mar 2018 B2
9924897 Abdul-Hafiz Mar 2018 B1
9936917 Poeze et al. Apr 2018 B2
9943269 Muhsin et al. Apr 2018 B2
9949676 Al-Ali Apr 2018 B2
9955937 Telfort May 2018 B2
9965946 Al-Ali May 2018 B2
9980667 Kiani et al. May 2018 B2
D820865 Muhsin et al. Jun 2018 S
9986919 Lamego et al. Jun 2018 B2
9986952 Dalvi et al. Jun 2018 B2
9989560 Poeze et al. Jun 2018 B2
9993207 Al-Ali et al. Jun 2018 B2
10007758 Al-Ali et al. Jun 2018 B2
D822215 Al-Ali et al. Jul 2018 S
D822216 Barker et al. Jul 2018 S
10010276 Al-Ali et al. Jul 2018 B2
10032002 Kiani et al. Jul 2018 B2
10039482 Al-Ali et al. Aug 2018 B2
10052037 Kinast et al. Aug 2018 B2
10058275 Al-Ali et al. Aug 2018 B2
10064562 Al-Ali Sep 2018 B2
10086138 Novak, Jr. Oct 2018 B1
10092200 Al-Ali et al. Oct 2018 B2
10092249 Kiani et al. Oct 2018 B2
10098550 Al-Ali et al. Oct 2018 B2
10098591 Al-Ali et al. Oct 2018 B2
10098610 Al-Ali et al. Oct 2018 B2
D833624 DeJong et al. Nov 2018 S
10123726 Al-Ali et al. Nov 2018 B2
10130289 Al-Ali et al. Nov 2018 B2
10130291 Schurman et al. Nov 2018 B2
D835282 Barker et al. Dec 2018 S
D835283 Barker et al. Dec 2018 S
D835284 Barker et al. Dec 2018 S
D835285 Barker et al. Dec 2018 S
10149616 Al-Ali et al. Dec 2018 B2
10154815 Al-Ali et al. Dec 2018 B2
10159412 Lamego et al. Dec 2018 B2
10188296 Al-Ali et al. Jan 2019 B2
10188331 Al-Ali et al. Jan 2019 B1
10188348 Kiani et al. Jan 2019 B2
RE47218 Al-Ali Feb 2019 E
RE47244 Kiani et al. Feb 2019 E
RE47249 Kiani et al. Feb 2019 E
10194847 Al-Ali Feb 2019 B2
10194848 Kiani et al. Feb 2019 B1
10201298 Al-Ali et al. Feb 2019 B2
10205272 Kiani et al. Feb 2019 B2
10205291 Scruggs et al. Feb 2019 B2
10213108 Al-Ali Feb 2019 B2
10219706 Al-Ali Mar 2019 B2
10219746 McHale et al. Mar 2019 B2
10226187 Al-Ali et al. Mar 2019 B2
10226576 Kiani Mar 2019 B2
10231657 Al-Ali et al. Mar 2019 B2
10231670 Blank et al. Mar 2019 B2
10231676 Al-Ali et al. Mar 2019 B2
RE47353 Kiani et al. Apr 2019 E
10251585 Al-Ali et al. Apr 2019 B2
10251586 Lamego Apr 2019 B2
10255994 Sampath et al. Apr 2019 B2
10258265 Poeze et al. Apr 2019 B1
10258266 Poeze et al. Apr 2019 B1
20060161054 Reuss et al. Jul 2006 A1
20070282478 Al-Ali et al. Dec 2007 A1
20090247984 Lamego et al. Oct 2009 A1
20090275813 Davis Nov 2009 A1
20090275844 Al-Ali Nov 2009 A1
20100004518 Vo et al. Jan 2010 A1
20100030040 Poeze et al. Feb 2010 A1
20110082711 Poeze et al. Apr 2011 A1
20110125060 Telfort et al. May 2011 A1
20110208015 Welch et al. Aug 2011 A1
20110230733 Al-Ali Sep 2011 A1
20120165629 Merritt et al. Jun 2012 A1
20120209082 Al-Ali Aug 2012 A1
20120209084 Olsen et al. Aug 2012 A1
20120283524 Kiani et al. Nov 2012 A1
20130023775 Lamego et al. Jan 2013 A1
20130041591 Lamego Feb 2013 A1
20130060147 Welch et al. Mar 2013 A1
20130096405 Garfio Apr 2013 A1
20130096936 Sampath et al. Apr 2013 A1
20130243021 Siskavich Sep 2013 A1
20130253334 Al-Ali et al. Sep 2013 A1
20130296672 O'Neil et al. Nov 2013 A1
20130296713 Al-Ali et al. Nov 2013 A1
20130324808 Al-Ali et al. Dec 2013 A1
20130331660 Al-Ali et al. Dec 2013 A1
20140012100 Al-Ali et al. Jan 2014 A1
20140051953 Lamego et al. Feb 2014 A1
20140120564 Workman et al. May 2014 A1
20140121482 Merritt et al. May 2014 A1
20140127137 Bellott et al. May 2014 A1
20140163344 Al-Ali Jun 2014 A1
20140166076 Kiani et al. Jun 2014 A1
20140171763 Diab Jun 2014 A1
20140180038 Kiani Jun 2014 A1
20140180154 Sierra et al. Jun 2014 A1
20140180160 Brown et al. Jun 2014 A1
20140187973 Brown et al. Jul 2014 A1
20140213864 Abdul-Hafiz et al. Jul 2014 A1
20140275835 Lamego et al. Sep 2014 A1
20140275871 Lamego et al. Sep 2014 A1
20140275872 Merritt et al. Sep 2014 A1
20140288400 Diab et al. Sep 2014 A1
20140316217 Purdon et al. Oct 2014 A1
20140316218 Purdon et al. Oct 2014 A1
20140316228 Blank et al. Oct 2014 A1
20140323825 Al-Ali et al. Oct 2014 A1
20140323897 Brown et al. Oct 2014 A1
20140323898 Purdon et al. Oct 2014 A1
20140330092 Al-Ali et al. Nov 2014 A1
20140330098 Merritt et al. Nov 2014 A1
20140357966 Al-Ali et al. Dec 2014 A1
20150005600 Blank et al. Jan 2015 A1
20150011907 Purdon et al. Jan 2015 A1
20150032029 Al-Ali et al. Jan 2015 A1
20150038859 Dalvi et al. Feb 2015 A1
20150080754 Purdon et al. Mar 2015 A1
20150087936 Al-Ali et al. Mar 2015 A1
20150094546 Al-Ali Apr 2015 A1
20150099950 Al-Ali et al. Apr 2015 A1
20150101844 Al-Ali et al. Apr 2015 A1
20150106121 Muhsin et al. Apr 2015 A1
20150112151 Muhsin et al. Apr 2015 A1
20150165312 Kiani Jun 2015 A1
20150196249 Brown et al. Jul 2015 A1
20150216459 Al-Ali et al. Aug 2015 A1
20150238722 Al-Ali Aug 2015 A1
20150245773 Lamego et al. Sep 2015 A1
20150245794 Al-Ali Sep 2015 A1
20150257689 Al-Ali et al. Sep 2015 A1
20150272514 Kiani et al. Oct 2015 A1
20150351697 Weber et al. Dec 2015 A1
20150359429 Al-Ali et al. Dec 2015 A1
20150366507 Blank Dec 2015 A1
20160029932 Al-Ali Feb 2016 A1
20160058347 Reichgott et al. Mar 2016 A1
20160066824 Al-Ali et al. Mar 2016 A1
20160081552 Wojtczuk et al. Mar 2016 A1
20160095543 Telfort et al. Apr 2016 A1
20160095548 Al-Ali et al. Apr 2016 A1
20160103598 Al-Ali et al. Apr 2016 A1
20160166182 Al-Ali et al. Jun 2016 A1
20160166183 Poeze et al. Jun 2016 A1
20160196388 Lamego Jul 2016 A1
20160197436 Barker et al. Jul 2016 A1
20160213281 Eckerbom et al. Jul 2016 A1
20160228043 O'Neil et al. Aug 2016 A1
20160233632 Scruggs et al. Aug 2016 A1
20160234944 Schmidt et al. Aug 2016 A1
20160270735 Diab et al. Sep 2016 A1
20160283665 Sampath et al. Sep 2016 A1
20160287090 Al-Ali et al. Oct 2016 A1
20160287786 Kiani Oct 2016 A1
20160296169 McHale et al. Oct 2016 A1
20160310052 Al-Ali et al. Oct 2016 A1
20160314260 Kiani Oct 2016 A1
20160324488 Olsen Nov 2016 A1
20160327984 Al-Ali et al. Nov 2016 A1
20160331332 Al-Ali Nov 2016 A1
20160367173 Dalvi et al. Dec 2016 A1
20170000394 Al-Ali et al. Jan 2017 A1
20170007134 Al-Ali et al. Jan 2017 A1
20170007198 Al-Ali et al. Jan 2017 A1
20170014083 Diab et al. Jan 2017 A1
20170014084 Al-Ali et al. Jan 2017 A1
20170024748 Haider Jan 2017 A1
20170042488 Muhsin Feb 2017 A1
20170055851 Al-Ali Mar 2017 A1
20170055882 Al-Ali et al. Mar 2017 A1
20170055887 Al-Ali Mar 2017 A1
20170055896 Al-Ali et al. Mar 2017 A1
20170079594 Telfort et al. Mar 2017 A1
20170086723 Al-Ali et al. Mar 2017 A1
20170143281 Olsen May 2017 A1
20170147774 Kiani May 2017 A1
20170156620 Al-Ali et al. Jun 2017 A1
20170173632 Al-Ali Jun 2017 A1
20170187146 Kiani et al. Jun 2017 A1
20170188919 Al-Ali et al. Jul 2017 A1
20170196464 Jansen et al. Jul 2017 A1
20170196470 Lamego et al. Jul 2017 A1
20170224262 Al-Ali Aug 2017 A1
20170228516 Sampath et al. Aug 2017 A1
20170245790 Al-Ali et al. Aug 2017 A1
20170251974 Shreim et al. Sep 2017 A1
20170251975 Shreim et al. Sep 2017 A1
20170258403 Abdul-Hafiz et al. Sep 2017 A1
20170311851 Schurman et al. Nov 2017 A1
20170311891 Kiani et al. Nov 2017 A1
20170325728 Al-Ali et al. Nov 2017 A1
20170332976 Al-Ali et al. Nov 2017 A1
20170340293 Al-Ali et al. Nov 2017 A1
20170360310 Kiani et al. Dec 2017 A1
20170367632 Al-Ali et al. Dec 2017 A1
20180008146 Al-Ali et al. Jan 2018 A1
20180013562 Haider et al. Jan 2018 A1
20180014752 Al-Ali et al. Jan 2018 A1
20180028124 Al-Ali et al. Feb 2018 A1
20180055385 Al-Ali Mar 2018 A1
20180055390 Kiani et al. Mar 2018 A1
20180055430 Diab et al. Mar 2018 A1
20180064381 Shakespeare et al. Mar 2018 A1
20180069776 Lamego et al. Mar 2018 A1
20180070867 Smith et al. Mar 2018 A1
20180082767 Al-Ali et al. Mar 2018 A1
20180085068 Telfort Mar 2018 A1
20180087937 Al-Ali et al. Mar 2018 A1
20180103874 Lee et al. Apr 2018 A1
20180103905 Kiani Apr 2018 A1
20180110478 Al-Ali Apr 2018 A1
20180116575 Perea et al. May 2018 A1
20180125368 Lamego et al. May 2018 A1
20180125430 Al-Ali et al. May 2018 A1
20180125445 Telfort et al. May 2018 A1
20180130325 Kiani et al. May 2018 A1
20180132769 Weber et al. May 2018 A1
20180132770 Lamego May 2018 A1
20180146901 Al-Ali et al. May 2018 A1
20180146902 Kiani et al. May 2018 A1
20180153442 Eckerbom et al. Jun 2018 A1
20180153446 Kiani Jun 2018 A1
20180153447 Al-Ali et al. Jun 2018 A1
20180153448 Weber et al. Jun 2018 A1
20180161499 Al-Ali et al. Jun 2018 A1
20180168491 Al-Ali et al. Jun 2018 A1
20180174679 Sampath et al. Jun 2018 A1
20180174680 Sampath et al. Jun 2018 A1
20180182484 Sampath et al. Jun 2018 A1
20180184917 Kiani Jul 2018 A1
20180192924 Al-Ali Jul 2018 A1
20180192953 Shreim et al. Jul 2018 A1
20180192955 Al-Ali et al. Jul 2018 A1
20180199871 Pauley et al. Jul 2018 A1
20180206795 Al-Ali Jul 2018 A1
20180206815 Telfort Jul 2018 A1
20180213583 Al-Ali Jul 2018 A1
20180214031 Kiani et al. Aug 2018 A1
20180214090 Al-Ali et al. Aug 2018 A1
20180218792 Muhsin et al. Aug 2018 A1
20180225960 Al-Ali et al. Aug 2018 A1
20180238718 Dalvi Aug 2018 A1
20180242853 Al-Ali Aug 2018 A1
20180242921 Muhsin et al. Aug 2018 A1
20180242923 Al-Ali et al. Aug 2018 A1
20180242924 Barker et al. Aug 2018 A1
20180242926 Muhsin et al. Aug 2018 A1
20180247353 Al-Ali et al. Aug 2018 A1
20180247712 Muhsin et al. Aug 2018 A1
20180249933 Schurman et al. Sep 2018 A1
20180253947 Muhsin et al. Sep 2018 A1
20180256087 Al-Ali et al. Sep 2018 A1
20180256113 Weber et al. Sep 2018 A1
20180285094 Housel et al. Oct 2018 A1
20180289325 Poeze et al. Oct 2018 A1
20180289337 Al-Ali et al. Oct 2018 A1
20180296161 Shreim et al. Oct 2018 A1
20180300919 Muhsin et al. Oct 2018 A1
20180310822 Indorf et al. Nov 2018 A1
20180310823 Al-Ali et al. Nov 2018 A1
20180317826 Muhsin Nov 2018 A1
20180317841 Novak, Jr. Nov 2018 A1
20180333055 Lamego et al. Nov 2018 A1
20180333087 Al-Ali Nov 2018 A1
20190000317 Muhsin et al. Jan 2019 A1
20190000362 Kiani et al. Jan 2019 A1
20190015023 Monfre Jan 2019 A1
20190021638 Al-Ali et al. Jan 2019 A1
20190029574 Schurman et al. Jan 2019 A1
20190029578 Al-Ali et al. Jan 2019 A1
20190038143 Al-Ali Feb 2019 A1
20190058280 Al-Ali et al. Feb 2019 A1
20190058281 Al-Ali et al. Feb 2019 A1
20190069813 Al-Ali Mar 2019 A1
20190069814 Al-Ali Mar 2019 A1
20190076028 Al-Ali et al. Mar 2019 A1
20190082979 Al-Ali et al. Mar 2019 A1
20190090748 Al-Ali Mar 2019 A1
20190090760 Kinast et al. Mar 2019 A1
20190090764 Al-Ali Mar 2019 A1
20190104973 Poeze et al. Apr 2019 A1
20190110719 Poeze et al. Apr 2019 A1
20190117070 Muhsin et al. Apr 2019 A1
20190117139 Al-Ali et al. Apr 2019 A1
20190117140 Al-Ali et al. Apr 2019 A1
20190117141 Al-Ali Apr 2019 A1
20190117930 Al-Ali Apr 2019 A1
20190122763 Sampath et al. Apr 2019 A1
Related Publications (1)
Number Date Country
20190254622 A1 Aug 2019 US
Provisional Applications (3)
Number Date Country
61547007 Oct 2011 US
61391098 Oct 2010 US
61252099 Oct 2009 US
Continuations (3)
Relation Number Date Country
Parent 15184951 Jun 2016 US
Child 16159395 US
Parent 14522474 Oct 2014 US
Child 15184951 US
Parent 13650775 Oct 2012 US
Child 14522474 US
Continuations in Part (1)
Relation Number Date Country
Parent 12905036 Oct 2010 US
Child 13650775 US