Physiological measurement logic engine

Information

  • Patent Grant
  • Patent Number
    11,399,774
  • Date Filed
    Friday, August 30, 2019
  • Date Issued
    Tuesday, August 2, 2022
  • Examiners
    • Winakur, Eric F
  • Agents
    • Knobbe, Martens, Olson & Bear LLP
Abstract
A patient monitor including a physiological measurement logic engine receives physiological data from a physiological sensor. The logic engine abstracts one or more features of the physiological data and determines a category for each abstracted feature. The logic engine further encodes the category of each feature and determines an action to perform based on the encoded categories.
Description
FIELD

The present disclosure relates to non-invasive biological parameter sensing, including sensing using optical and/or acoustic sensors and related systems and methods.


BACKGROUND

Patient monitoring of various physiological parameters of a patient is important to a wide range of medical applications. Pulse oximetry is one technique that has been developed to monitor some of these physiological characteristics. Pulse oximetry relies on a sensor attached externally to a patient to output signals indicative of various physiological parameters, such as a patient's blood constituents and/or analytes, including for example a percent value for arterial oxygen saturation, carbon monoxide saturation, methemoglobin saturation, fractional saturations, total hematocrit, bilirubins, perfusion quality, or the like. A pulse oximetry system generally includes a patient monitor, a communications medium such as a cable, and/or a physiological sensor having light emitters and a detector, such as one or more LEDs and a photodetector. The sensor is attached to a tissue site, such as a finger, toe, ear lobe, nose, hand, foot, or other site having pulsatile blood flow which can be penetrated by light from the emitters. The detector is responsive to the emitted light after attenuation by pulsatile blood flowing in the tissue site. The detector outputs a detector signal to the monitor over the communication medium, and the monitor processes the signal to provide a numerical readout of physiological parameters such as oxygen saturation (SpO2) and/or pulse rate.


High fidelity patient monitors capable of reading through motion induced noise are disclosed in U.S. Pat. Nos. 7,096,054, 6,813,511, 6,792,300, 6,770,028, 6,658,276, 6,157,850, 6,002,952, 5,769,785, and 5,758,644, which are assigned to Masimo Corporation of Irvine, Calif. (“Masimo Corp.”) and are incorporated by reference herein. Advanced physiological monitoring systems can incorporate pulse oximetry in addition to advanced features for the calculation and display of other blood parameters, such as carboxyhemoglobin (HbCO), methemoglobin (HbMet), total hemoglobin (Hbt), total hematocrit (Hct), oxygen concentrations, glucose concentrations, blood pressure, electrocardiogram data, temperature, and/or respiratory rate, as a few examples. Typically, the physiological monitoring system provides a numerical readout and/or waveform of the measured parameter. Advanced physiological monitors and multiple wavelength optical sensors capable of measuring parameters in addition to SpO2, such as HbCO, HbMet and/or Hbt, are described in at least U.S. patent application Ser. No. 11/367,013, filed Mar. 1, 2006, titled Multiple Wavelength Sensor Emitters, now issued as U.S. Pat. No. 7,764,982, and U.S. patent application Ser. No. 11/366,208, filed Mar. 1, 2006, titled Noninvasive Multi-Parameter Patient Monitor, both assigned to Masimo Laboratories, Inc. and incorporated by reference herein. Further, noninvasive blood parameter monitors and optical sensors including Rainbow™ adhesive and reusable sensors and RAD-57™ and Radical-7™ monitors capable of measuring SpO2, pulse rate, perfusion index (PI), signal quality (SiQ), pulse variability index (PVI), HbCO and/or HbMet, among other parameters, are also commercially available from Masimo Corp. of Irvine, Calif.


Another physiological monitoring system uses sensors that include piezoelectric membranes located on or near a patient's body to measure body sounds. The body sounds can then be analyzed to determine ventilation, apnea, respiration rate, or other parameters. These monitors are referred to as acoustic respiratory monitors. Acoustic respiratory monitors are also commercially available from Masimo Corp. of Irvine, Calif.


SUMMARY

The present disclosure relates to a system for simplifying logic choices in a computing environment. In an embodiment, physiological processing is simplified by abstracting relevant features, or general characteristics, of the signal. As used herein, “features” and “general characteristics” are used interchangeably. In an embodiment, features of physiological signals are abstracted and used in conjunction with a logic table in order to determine a course of action. In an embodiment, the abstracted features are used to provide a bit encoding scheme which directly relates to a specified result. In an embodiment, the system is used to encode logic choices relevant to display characteristics of the device.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 illustrates an example of an abstraction of the relative slopes of physiological signals.



FIG. 2 illustrates an example of an abstraction of the difference in outputs of different processing engines.



FIG. 3 illustrates an example of an abstraction of a trend line, linear regression, and/or standard deviation.



FIG. 4 is an expanded view of a lookup table that can be used by a patient monitor to determine what action to perform in view of the abstracted features.



FIG. 5 is a flow chart illustrating a process implemented by a patient monitor for determining what action to perform based on the abstracted features.



FIG. 6 is a flow chart illustrating an embodiment of a process implemented by a patient monitor for determining an appropriate action based on the features of a set of data.



FIG. 7 illustrates an embodiment of a patient monitoring system capable of abstracting features of physiological data.



FIG. 8 illustrates an example noninvasive multiparameter physiological monitor.



FIG. 9 illustrates an embodiment of an acoustic sensor system, which can provide physiological data to the patient monitoring system.





DETAILED DESCRIPTION

Signal Analysis


Real-time processing of physiological signals is often difficult, requiring significant computing power and fast processors along with significant power consumption and heat dissipation. Typical signal processing involves intensive and difficult mathematical operations in order to extract useful data.


Feature Abstraction


One way to reduce the computational load in a physiological processing system is to abstract features of the signal. Feature abstraction uses only relatively simple analyses, which significantly reduces the computational load. Once abstracted, the features can then be used to make determinations about the physiological signals.


The difference between feature abstraction and typical computationally intensive signal processing is best understood with an analogy to human observations vs. computations. For example, consider the situation where Person A lives next to Person B. Person B brings their trash container out to the same position every time the trash is collected. Although Person A may not consciously note the position of the trash container, Person A is likely to notice that something is different if the trash container is placed in a different position. Importantly, that recognition can be made without measuring the exact position change of the trash container or even consciously noting the normal position of the trash container. Similarly, Person A may further note other changes regarding Person B. For example, Person A is likely to note that a different car is parked in Person B's driveway. Person A can make this determination without comparing old and new license plate numbers. Another example may be that Person A notes different children playing in Person B's yard. None of these abstractions or general observations required significant thought or specific measurements on the part of Person A. Similarly, features of physiological signals can be abstracted at various levels without significant computations.


Once abstracted, each feature can potentially indicate a number of possible explanations. For example, using the same analogy above, the change in location of Person B's trash container could indicate that a different family member took out the trash. However, coupling the change in location of the trash container with the different car and unfamiliar children playing in the yard may indicate that Person A has new neighbors. Importantly, the conclusion that Person A may have new neighbors is made without having direct knowledge of a change in neighbors or actually seeing the neighbors. Similarly, combinations of abstracted features provide different indications about the signal under analysis, while using relatively low processing power. However, it is to be understood that abstracting features of the signals described herein is significantly more complex and computationally intensive than the example given above. Furthermore, the abstraction of features of the signals (or data) described herein is typically done in real-time or near real-time using a digital signal processor, microcontroller, or other processor operating at speeds far surpassing those of a human. For example, the processor may operate at hundreds, thousands, millions, billions, or even more cycles per second to ensure the features are abstracted in a timely manner. If the features are abstracted too slowly, they lose their relevance.


In an embodiment, various features of a detected signal are abstracted. For example, in an embodiment, the relative slope of a signal over one or more windows of time is abstracted. In an embodiment, the relative noise level of a signal is determined. In an embodiment, the relative signal strength is determined. In an embodiment, various features are compared across different windows of time, and the comparison itself is an abstracted feature. In an embodiment, the features are abstracted in real-time or near real-time. Other abstractions can also be made, as will be understood by those of skill in the art based on the present disclosure.


The examples provided below in FIGS. 1-3 are simplified examples of the types of abstractions that occur using the device described herein. It is to be understood that the calculations, computations, and abstractions are performed in real-time or near real-time at speeds surpassing those capable of a human. For example, the processor may perform hundreds, thousands, millions, billions or more calculations per second to determine the appropriate abstraction in real time or near real-time. In addition, the signals (or data) abstracted by the processor may be electrical signals, infrared signals, wireless signals, or other electro-magnetic wave signals that are incomprehensible to a human in their raw form and at the speeds communicated.



FIG. 1 illustrates an example of an abstraction of the relative slopes of physiological signals. As illustrated in FIG. 1, signals within two windows of time 102, 104 are analyzed. The windows can be any length of time. In an embodiment, each window is 30 seconds long. In another embodiment, each window is 60 seconds long. Other lengths of time can also be used. The windows can overlap or be non-overlapping in time. The signals in each window are then abstracted and characterized into one of five different slopes. These slopes are illustrated by the slope abstraction illustrator 114. For example, the relative slope of the signal within window 102 most closely matches slope 118. Similarly, the overall slope of the signal, or data, within window 104 most closely matches slope 130. By abstracting the overall slope of the signal into a relative slope, computations relevant to the signals are simplified. The overall slope of the signal may also be referred to as the trend of the signal. Although described with respect to five different slope abstraction values, more or fewer abstraction levels can be used. For example, one system might use ten different abstraction slopes while another system might use three.


In an embodiment, after abstraction, the signal in each window is assigned a category. In the embodiment illustrated, the category is a bit code corresponding to the slope abstraction. In an embodiment, the slope of the signal in window 102 is matched to slope 118 and assigned bits “001.” Similarly, the slope of the signal in window 104 is assigned bits “010.” As will be explained in greater detail below, bit assignments can be used to further simplify processing. Furthermore, as will be understood by those in the art from the present disclosure, the bit codes presented in this example are not intended to be limiting, and other bit codes, including different numbers of bits, can be used with the present disclosure.
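
As a concrete sketch (not the patent's implementation), the following Python routine abstracts one window of samples into one of five relative slopes and returns a 3-bit category code. The category names, thresholds, and sample_rate_hz parameter are assumptions made for illustration.

    import numpy as np

    # Hypothetical 3-bit codes for five relative slope categories,
    # ordered from steep fall to steep rise; the mapping and the
    # slope thresholds below are illustrative only.
    SLOPE_CODES = {
        "steep_fall": 0b000,
        "gentle_fall": 0b001,
        "flat": 0b010,
        "gentle_rise": 0b011,
        "steep_rise": 0b100,
    }

    def abstract_slope(window, sample_rate_hz, gentle=0.1, steep=0.5):
        """Fit a line to one window of samples and return the bit
        code of the closest relative slope category."""
        t = np.arange(len(window)) / sample_rate_hz
        slope = np.polyfit(t, window, 1)[0]  # least-squares slope
        if slope <= -steep:
            return SLOPE_CODES["steep_fall"]
        if slope <= -gentle:
            return SLOPE_CODES["gentle_fall"]
        if slope < gentle:
            return SLOPE_CODES["flat"]
        if slope < steep:
            return SLOPE_CODES["gentle_rise"]
        return SLOPE_CODES["steep_rise"]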


In an embodiment, after abstracting the slopes of two or more signals in two or more windows of time, the slopes are compared in order to determine a change in slope. The change in relative slope then provides another abstraction that can be used in further processing to make determinations regarding the signal. In an embodiment, this abstraction is also assigned a bit code based on the comparison. In an embodiment, the comparison is not necessary because the bit codes can be used in conjunction with the table described in FIG. 4 to designate an outcome based on the slopes without further comparison computations. Thus, as described below, using the bit codes in association with FIG. 4 obviates the need for further processing and computational steps that might otherwise be necessary.


Another abstraction involves the comparison of two overlapping windows of data. This is illustrated, for example, in FIG. 2. Typically, physiological signals received from sensors are processed before being displayed. In processing the physiological signals, a patient monitor, described in more detail below with reference to FIGS. 7 and 9, performs any number of functions, computations, filters, and the like. The processing can account for signal noise, motion artifact, or any number of other signal distortions that can affect signal quality and the reliability of the output.


In an embodiment, a patient monitor processes the data using one or more processing techniques, which may also be referred to as engines or processing engines, in parallel. The various engines can identify and filter specific signal distortions. For example, one engine can be configured to filter according to repetitive motion, while another engine can be configured to filter according to a single instance of motion. Other engines may be configured to filter or process the signals to account for other signal distortions, such as low signal-to-noise ratio, low signal power, low perfusion, and the like. The use of parallel engines is described in U.S. Pat. No. 6,157,850, the disclosure of which is hereby incorporated by reference in its entirety.


With continued reference to FIG. 2, window 202 is an illustration of a physiological signal processed over a first time period, or window of time. Window 204 is an illustration of the same physiological signal processed over a different window of time that overlaps with the first. The same or different engines can be used for each window of time. The x-axis 212 of the windows 202, 204 represents time, and the y-axis 214 represents signal amplitude. The data 220, 222 can be any type of physiological data, signal, or other data. Furthermore, the windows 202, 204 can be broken down into time segments 210A, 210B, 210C, 210D that divide the data based on a predefined time period. The predefined time period may be any size, ranging from less than a microsecond to minutes, hours, or more. Typically, the data within the same time segments 210A, 210B, 210C, 210D of different windows 202 and 204 is similar. As shown in FIG. 2, the data 220, 222 is generally similar throughout most of the time segments 210A, 210B, 210C, 210D. However, in time segment 210A, data 206 and 208 are substantially dissimilar. The dissimilarities may be a result of differences in the processing. The dissimilarities may also be the result of different engines, where one engine is more suited for handling an event that occurred during time segment 210A, poor signal strength during time segment 210A, an error, or the like. Thus, one feature that the patient monitor can abstract is the difference in outputs between different engines and/or different windows of time.


In an embodiment, while abstracting the data, the patient monitor compares and identifies the difference between data 206 and data 208 in time segment 210A. The patient monitor categorizes the difference depending on various factors, such as the type of data being analyzed, the type of analysis being performed, the engines being used, etc. In an embodiment, the difference is categorized as an insignificant difference, a minor difference, or a significant difference. Based on the categorization, the patient monitor can implement a predefined action using a look-up table, which will be described in greater detail below with reference to FIG. 4. Additional methods for categorizing the differences between the engine outputs may be used without departing from the spirit and scope of the description. Furthermore, the categories may be encoded using any number of different bit codes, as described above.
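
As a sketch of this abstraction under assumed thresholds, the following routine compares two engines' outputs over the same time segment and returns a 2-bit code for an insignificant, minor, or significant difference. The threshold values and the normalization by signal range are illustrative assumptions, not taken from the patent.

    import numpy as np

    # Hypothetical 2-bit codes for the three difference categories.
    DIFF_CODES = {"insignificant": 0b00, "minor": 0b01, "significant": 0b10}

    def categorize_engine_difference(out_a, out_b, minor=0.05, significant=0.20):
        """Compare two engines' outputs over one time segment and
        return the bit code of the difference category, with the
        difference expressed as a fraction of the signal range."""
        out_a, out_b = np.asarray(out_a, float), np.asarray(out_b, float)
        scale = max(np.ptp(out_a), np.ptp(out_b), 1e-9)  # guard for flat signals
        rel_diff = np.max(np.abs(out_a - out_b)) / scale
        if rel_diff < minor:
            return DIFF_CODES["insignificant"]
        if rel_diff < significant:
            return DIFF_CODES["minor"]
        return DIFF_CODES["significant"]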



FIG. 3 is a plot diagram illustrating another embodiment of a feature abstraction, involving a trend line, linear regression, and/or standard deviation. Graph 300 illustrates an example of data measurements 306 over time. The x-axis 302 of graph 300 represents time and the y-axis 304 represents the amplitude of the measurements 306. The line 308 represents a trend line of the data measurements 306 over time. In an embodiment, the data measurements are normalized.


In an embodiment, the patient monitor computes a confidence value of the data measurements 306. The confidence value can be computed using the standard deviation, average, correlation coefficient and/or linear regression of the data measurements 306. For example, a high standard deviation and/or low correlation coefficient may be equated with a low confidence value, whereas a low standard deviation and/or high correlation coefficient may be equated with a high confidence value. Based on the confidence value, the data can be sorted into different categories indicating the level of confidence that can be placed in the data. For example, a relatively low confidence level may indicate that the signals are experiencing relatively large amounts of noise or other distortions, and that little confidence should be placed in the output. A relatively high confidence level may indicate that the patient monitor is experiencing relatively little noise in the system, and that high confidence can be placed in the output. The categories may be implemented using bit codes, described above with reference to FIG. 1. It is to be understood that more or fewer categories can be used to categorize the standard deviation of the data, or the trend line.
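
A minimal sketch of this categorization, assuming normalized measurements and illustrative thresholds, follows; the correlation coefficient of the trend and the standard deviation of the residuals stand in for whatever confidence measure a given embodiment uses.

    import numpy as np

    def confidence_code(measurements):
        """Assign a 2-bit confidence category to a window of
        normalized measurements using the correlation coefficient
        of the trend line and the standard deviation of the
        residuals around it. Thresholds are illustrative."""
        y = np.asarray(measurements, dtype=float)
        t = np.arange(len(y), dtype=float)
        slope, intercept = np.polyfit(t, y, 1)
        residual_std = np.std(y - (slope * t + intercept))
        r = abs(np.corrcoef(t, y)[0, 1])  # strength of the linear trend
        if r > 0.9 and residual_std < 0.05:
            return 0b10  # high confidence: clean, consistent trend
        if r > 0.5 and residual_std < 0.15:
            return 0b01  # medium confidence
        return 0b00      # low confidence: noisy or distorted data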



FIG. 4 is a block diagram illustrating an embodiment of an expanded view of an electronic lookup table (LUT) 400 implemented in the patient monitor. In an embodiment, the LUT 400 includes at least three sections: the features logic section 402, the expansion section 404, and the output section 406. The features logic section 402 is further broken down into three subsections 408, 410, 412. Each subsection 408, 410, 412 includes a bit encoding of the category of one abstracted feature. Each bit encoding is made up of multiple bits and/or bytes 414. The LUT 400 can be used by the patient monitor to determine what action to perform based on the category, or bit code, of the various features of the physiological data. It will be understood that the LUT 400 is only an example, and other embodiments with fewer or more sections, subsections, bit encodings, and features may be used without departing from the spirit and scope of the description.


As mentioned, LUT 400 includes three sections: the features logic section 402, the expansion section 404, and the output section 406. The patient monitor uses the features logic section 402 and the expansion section 404 to “look up” the action (encoded in the output section 406) that is to be performed. Thus, each possible permutation of the features logic section 402 and the expansion section 404 can have a corresponding output section 406. In other words, the output section 406 (or action) is selected as a function of the features logic section 402 and the expansion section 404.


The feature logic section 402 is made up of one or more subsections 408, 410, and 412. Each subsection 408, 410, 412 can include one or more representations of categories of individual features in the form of individual bits and/or bytes 414. In the example illustrated, the features logic 402 includes three subsections 408, 410, 412. Each subsection 408, 410, 412 includes a bit code, made up of two bits, for a category of one individual feature. It will be understood that the feature logic section 402 can include fewer or more subsections and that the categories of the individual features may be represented with more or fewer bits as desired. For example, a greater number of categories may be desired for some features based on their complexity. As such, the features having more categories can use larger bit codes with more bits or bytes. Accordingly, in an embodiment, the bit codes for the different features are not uniform in their size. For example, one bit code for one feature may use two bits, while another bit code for another feature may use five bytes. In another embodiment, the bit codes are uniform in size for all features.


The expansion section 404 can include a number of subsections, similar to the subsections 408, 410, 412 of the feature logic section 402. The expansion subsections can include space, in the form of bits/bytes, for new features that are not included in the subsections 408, 410, 412. When not being used, the bits/bytes in the expansion section 404 can all be set to a logic ‘0’ or logic ‘1,’ as desired.


As mentioned earlier, the output section 406 is used by the patient monitor to determine the appropriate action in light of the feature logic section 402 and the expansion section 404. The patient monitor can use other logic as well in determining the appropriate output or action. The output section 406 can include a number of subsections similar to the feature logic section 402 and the expansion section 404. Furthermore, the actions to be taken by the patient monitor are encoded as bit codes within the output section 406. In an embodiment, each permutation of the feature logic section 402 and the expansion section 404 equates to a different bit code in the output section 406. In another embodiment, the bit code in the output section 406 for one or more permutations of the feature logic section 402 and the expansion section 404 is the same.
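
The layout described above can be sketched in a few lines, assuming three 2-bit feature subsections, a 2-bit expansion section held at logic ‘0’ while unused, and a small set of hypothetical output actions; none of the sizes, names, or entries below are taken from the patent.

    FEATURE_BITS = 2      # bits per feature subsection (three subsections)
    EXPANSION_BITS = 2    # reserved for future features, zero while unused
    TABLE_SIZE = 1 << (3 * FEATURE_BITS + EXPANSION_BITS)  # 256 entries

    # Hypothetical output bit codes.
    DISPLAY_VALUE, HOLD_LAST_VALUE, RAISE_ALARM = 0b00, 0b01, 0b10

    def lut_index(f1, f2, f3, expansion=0):
        """Pack three encoded feature categories and the expansion
        bits into a single index into the lookup table."""
        return (((f1 << FEATURE_BITS | f2) << FEATURE_BITS | f3)
                << EXPANSION_BITS) | expansion

    # Every permutation gets an entry, so no condition is left undefined.
    LUT = [DISPLAY_VALUE] * TABLE_SIZE
    LUT[lut_index(0b10, 0b10, 0b10)] = RAISE_ALARM  # illustrative entry

Because the table is fully populated, a lookup is a single indexed read, which also lends itself to a hardware implementation.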


By abstracting the features and using the LUT 400, the patient monitor can reduce the amount of processing resources needed to perform the appropriate action given the set of data. Rather than processing the data itself, the patient monitor is able to abstract generalizations or general characteristics of the data and make determinations based on the general characteristics themselves. Thus, the patient monitor avoids processing the individual data itself. Even in those instances where analyzing or determining a feature is resource intensive, the patient monitor is able to reduce the overall amount of processing by reducing the number of items analyzed. For instance, instead of processing hundreds or even thousands of individual pieces of data, the patient monitor is able to process all, or a large number of, the pieces of data using a relatively small number of general characteristics that apply to the pieces of data in the aggregate. In addition, the use of a lookup table allows the actions or outputs to be predetermined, allowing the patient monitor to perform a simple “lookup” rather than repeatedly determining the appropriate action for each feature or piece of data analyzed. Furthermore, the lookup table can be implemented in hardware, further saving processing resources. Another benefit of the table is that in one embodiment there are no conditions left undefined. Often, in generating large and complex if/then statements in software, conditions are inevitably left out such that the device does not know what to do under an undefined condition. The table obviates this problem by inherently providing a result for every possible state.



FIG. 5 is a flow chart illustrating a process 500 implemented by a patient monitor for determining what action to perform based on the abstracted features. In an embodiment, the process 500 is executed in real-time or near real-time by the patient monitor. At block 504, the patient monitor obtains physiological data. The physiological data can be any of the various types of physiological data described above. In some embodiments, the data need not be physiological data, as will be described in greater detail below.


At block 506, the patient monitor abstracts features, or general characteristics, of a set of data. As described above, the features may include differing engine outputs, standard deviation, slope, average, linear regression, correlation coefficient, and the like. Additional features may be used as well, such as time domain features, frequency domain features, and the like.


In abstracting the features, the patient monitor may analyze various general characteristics of the set of data in a variety of ways. For example, the patient monitor can abstract all of the features or a subset of them. The subset can be determined based on the features that require, or are likely to use, relatively little processing resources, or can be determined randomly. In an embodiment, the patient monitor uses a list of predetermined features to determine which features to analyze. In an embodiment, the list of features is stored in the memory of the patient monitor. In another embodiment, the list is stored in memory remotely located from the patient monitor. In yet another embodiment, the patient monitor determines which features are to be abstracted based on the type of data being processed or based on the abstractions that are available to the monitor at any given point in time. For example, some features may be more pronounced or more easily determined based on the type of data received. For instance, comparing the output of different engines of plethysmograph data may be less computationally intensive than calculating the standard deviation or linear regression of the plethysmograph data. In such an instance, the patient monitor can elect to abstract the comparison between the data engines and not calculate the standard deviation or linear regression. In another embodiment, the patient monitor determines both abstractions. In an embodiment, the determination of which abstraction to use is based on a confidence level for each abstraction. In an embodiment, each abstraction is further given a confidence bit code to indicate a confidence level of that abstraction.


At block 508, the patient monitor matches the features of the set of data with an appropriate output, or action, using a lookup table. The lookup table may be similar to the one described above with reference to FIG. 4. Although not shown, the patient monitor can encode the features into different categories prior to using the lookup table. Upon completing the lookup and performing the associated action, the process 500 repeats as needed throughout the operation of the device.
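
Tying the blocks together, a sketch of one pass through process 500 might look as follows, reusing the lut_index helper from the LUT sketch above; obtain_data, abstractors, and perform are hypothetical hooks standing in for the monitor's data acquisition, feature abstraction, and action-execution machinery.

    def process_500(obtain_data, abstractors, lut, perform):
        """One pass of process 500: obtain data (block 504), abstract
        and encode each feature (block 506), then look up and perform
        the corresponding action (block 508)."""
        data = obtain_data()
        codes = [abstract(data) for abstract in abstractors]
        perform(lut[lut_index(*codes)])

In this sketch each abstractor returns an already-encoded category, so the lookup is a single indexed read rather than a cascade of if/then tests.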



FIG. 6 is a flow chart illustrating an embodiment of a process 600 implemented by a patient monitor for determining an appropriate action based on the features of physiological data, which can also be referred to as a set of data.


At block 604, the patient monitor obtains a set of data, as described in greater detail above with reference to block 504 of FIG. 5. At block 606, the patient monitor determines a first feature of the set of data. As described earlier, the feature can be any one of various general characteristics of the data, including slope, standard deviation, linear regression, correlation coefficient, differences between engine outputs, time domain, frequency domain, and the like. It is to be understood that many features of the data can be abstracted, and the features should not be limited to those specifically recited herein.


At block 608, the patient monitor determines a category within the first feature of the set of data. As described earlier, with reference to FIGS. 1, 2, and 3, the categories may take many different forms. For instance, as described above with reference to FIG. 1, if the abstracted feature is the slope of the set of data, the categories may be represented as a number of different slopes, and encoded using bit codes. Thus, the patient monitor will determine the proper slope, or category, that should be associated with the set of data.


In the example illustrated in FIG. 2, the abstracted feature is the difference between engine outputs, and the categories can be insignificant difference, minor difference, and significant difference. Thus, the patient monitor determines which category is appropriate for the difference between data 206 and 208 in time segment 210A of FIG. 2.


Similarly, the patient monitor can determine the appropriate category for the standard deviation, linear regression, correlation coefficient, and/or trend line of data measurements, as described above with reference to FIG. 3. In an embodiment, the number of different categories within a feature is finite and/or predetermined. In another embodiment, the categories are determined dynamically during process 600. In an embodiment, the categories are implemented using bit codes. In another embodiment, the categories are implemented using an alphanumeric code, or word. Furthermore, the categories of each feature may be different. For instance, one feature may have two categories and another feature may have ten or more categories. Furthermore, the different categories of a feature can be based on the type of data, the type of physiological data, the type of feature, the type of analysis being performed, or the like.


At block 610, the patient monitor encodes the selected category to be used when looking up the appropriate action in a lookup table. The patient monitor can encode the category in any number of different ways and use any number of different bits and/or bytes to do so. In an embodiment, the categories are represented as different sequences of bits and/or bytes, or bit codes. The patient monitor uses the bit codes to look up the appropriate output in the lookup table. Other methods of encoding the data are envisioned without departing from the spirit and scope of the description. For example, the different categories may be represented using some alphanumeric code or word, or the like.


At determination block 612, the patient monitor determines if there are additional features of the set of data to be analyzed. As described above in greater detail with reference to FIG. 4, more than one feature can be used in the lookup table to select the appropriate action. Thus, at block 612, the patient monitor determines if an additional feature of the set of data is to be analyzed. If there is an additional feature of the set of data to be analyzed, the patient monitor determines the additional feature of the set of data, as illustrated at block 614. Block 614 is similar to block 606, described above. At block 616, the patient monitor determines a category within the additional feature of the set of data, similar to block 608, described above. At block 618, the patient monitor encodes the category to use with the lookup table, as described above with reference to block 610.


After encoding the category, as illustrated at block 618, the patient monitor again determines if there is an additional feature to be analyzed, as illustrated at block 612. If there are additional features, the patient monitor continues to analyze the additional feature(s), determine the category within the additional feature(s) and encode the category, as illustrated in blocks 614, 616, and 618, respectively. Once there are no additional features, the patient monitor looks up the action corresponding to the one or more encoded categories, as illustrated at block 620.
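
The loop of blocks 612 through 618 amounts to shifting each newly encoded category into a single lookup key, as in the following sketch; the uniform 2-bit width per category is an assumption for illustration.

    def encode_features_600(data, abstractors, bits_per_code=2):
        """Blocks 606-618 of process 600: for each feature to be
        analyzed, determine its category and shift the encoded
        category into one lookup key; when no features remain
        (block 612), the key is used to index the LUT (block 620)."""
        key = 0
        for abstract in abstractors:                      # block 612
            category_code = abstract(data)                # blocks 614, 616
            key = (key << bits_per_code) | category_code  # block 618
        return key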


To determine the appropriate action based on the encoded categories, the patient monitor can use a lookup table, similar to the LUT 400 described above, with reference to FIG. 4. Using the lookup table, the patient monitor can account for all of the different categories of the different features that were previously analyzed and determine the appropriate action. In an embodiment, abstracting features and using the lookup table reduces the number of computations processed by the patient monitor.


At block 622, the patient monitor performs the appropriate action based on the output of the lookup table. In an embodiment, the patient monitor repeats process 600 as desired.


It is to be understood that the different actions that can be performed by the patient monitor are many. For example, the patient monitor may determine that the appropriate action includes changing a display or output, activating an alarm, gathering additional data via the sensor, the internet or some other means, notifying a healthcare provider, the patient or another person, powering off, requesting additional information from a user, etc. Thus, the various actions that may be performed should be construed broadly.


Although described in terms of a patient monitor and physiological data, the processes described above may be carried out using any number of general computing devices, such as a personal computer, tablet, smart phone, and the like. As described above, abstracting features, or general characteristics, of data and then performing some type of action based on those general characteristics rather than the data itself can significantly decrease the processing resources used. The process can be useful whenever abstracting and processing features would use fewer processing resources than processing the data itself, or where a number of different potential options are available and undefined states would be harmful.


For example, in an embodiment, the table of FIG. 4 can be used in conjunction with a system that automatically changes display characteristics based on user proximity. In an embodiment, in a system that tracks proximate users, the abstraction system described above and the table of FIG. 4 can be used to determine screen changes based on proximate user preferences. There can be a number of potential inputs used in determining screen orientation. These can include proximity to the screen in distance, hierarchy of proximate users, alarms relevant to proximate users, etc. Each of these inputs acts as a signal abstraction, and each receives an appropriate bit code that can be used with the table of FIG. 4 to determine a solution. In an embodiment, use of the signal abstractions reduces the number of nested if/then statements in software and simplifies the ability to encode for every possible solution.
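
One of the inputs named above could be abstracted as in the following sketch; the distance thresholds and display policies are hypothetical, and the resulting code would be combined with codes for user hierarchy and active alarms to index a table like LUT 400.

    def categorize_viewer_distance(meters):
        """Abstract a proximate user's distance from the screen into
        a 2-bit category for use with a FIG. 4-style table."""
        if meters < 1.0:
            return 0b00  # near: show the full detailed layout
        if meters < 3.0:
            return 0b01  # mid-range: enlarge the key numerics
        return 0b10      # far: show only large, alarm-relevant data

    # action = LUT[lut_index(distance_code, hierarchy_code, alarm_code)]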



FIG. 7 illustrates an embodiment of a patient monitoring system 700 capable of abstracting features of physiological data, as described above with reference to FIGS. 1-6. The patient monitoring system 700 includes a patient monitor 702 attached to a sensor 706 by a cable 704. The sensor 706 monitors various physiological data of a patient and sends signals indicative of one or more physiological parameters to the patient monitor 702 for processing.


The patient monitor 702 generally includes a display 708, control buttons 710, and a speaker 712 for audible alerts. The display 708 is capable of displaying readings of various monitored patient parameters, which may include numerical readouts, graphical readouts, and the like. The display 708 may be a liquid crystal display (LCD), a cathode ray tube (CRT), a plasma screen, a Light Emitting Diode (LED) screen, an Organic Light Emitting Diode (OLED) screen, or any other suitable display. The patient monitor 702 may monitor SpO2, Hb, HbO2, SpHb™, SpCO®, SpOC™, SpMet®, PI, PVI®, PR, temperature, and/or other parameters.


An embodiment of a patient monitoring system 700 according to the present disclosure is capable of measuring and displaying trending data of the various parameters and preferably is capable of conducting data analysis as to the trending. Furthermore, the patient monitoring system is capable of abstracting features of the physiological data being monitored. In an embodiment, the patient monitor 702 includes an abstraction module for carrying out the processes described above. It is to be understood by one skilled in the art that the patient monitor 702 may come in various shapes, sizes, and configurations without departing from the spirit and scope of the description. For example, the patient monitor 702 may be larger, smaller, or portable, and may comprise varying size displays 708, and the like.


The sensor 706 may be one of many different types. For example, the sensor 706 may be disposable, reusable, multi-site, partially reusable, partially disposable, be adhesive or non-adhesive, monitor the physiological parameters using reflectance, transmittance, or transreflectance, and may be placed on a finger, hand, foot, forehead, neck, or ear, and may be a stereo sensor or a two-headed sensor. Thus, one of skill in the art will appreciate that sensor 706 may be any number of different types of sensors without departing from the spirit and scope of the disclosure.



FIG. 8 illustrates an example noninvasive multiparameter physiological monitor 800 that can implement any of the features, processes, steps, etc., described herein. An embodiment of the monitor 800 includes a display 801 showing data for multiple physiological parameters. For example, the display 801 can include a CRT or an LCD display including circuitry similar to that available on physiological monitors commercially available from Masimo Corporation of Irvine, Calif. sold under the name Radical™, and disclosed in U.S. Pat. Nos. 7,221,971; 7,215,986; 7,215,984 and 6,850,787, for example, the disclosures of which are hereby incorporated by reference in their entirety. In an embodiment, the multiparameter patient monitor includes an abstraction module for performing the processes described above. Many other display components can be used that are capable of displaying respiratory rate and other physiological parameter data along with the ability to display graphical data such as plethysmographs, respiratory waveforms, trend graphs or traces, and the like.


The depicted embodiment of the display 801 includes a measured value of respiratory rate 812 (in breaths per minute (bpm)) and a respiratory rate waveform graph 806. In addition, other measured blood constituents shown include SpO2 802, a pulse rate 804 in beats per minute (BPM), and a perfusion index 808. Many other blood constituents or other physiological parameters can be measured and displayed by the multiparameter physiological monitor 800, such as blood pressure, ECG readings, EtCO2 values, bioimpedance values, and the like. In some embodiments, multiple respiratory rates, corresponding to the multiple input sensors and/or monitors, can be displayed.



FIG. 9 illustrates an embodiment of a sensor system 900 including a sensor assembly 901 and a monitor cable 911 suitable for use with any of the physiological monitors shown in FIGS. 7 and 8. The sensor assembly 901 includes a sensor 915, a cable assembly 917, and a connector 905. The sensor 915, in one embodiment, includes a sensor subassembly 902 and an attachment subassembly 904. The cable assembly 917 of one embodiment includes a sensor cable 907 and a patient anchor 903. A sensor connector subassembly 905 is connected to the sensor cable 907.


The sensor connector subassembly 905 can be removably attached to an instrument cable 911 via an instrument cable connector 909. The instrument cable 911 can be attached to a cable hub 920, which includes a port 921 for receiving a connector 912 of the instrument cable 911 and a second port 923 for receiving another cable. In certain embodiments, the second port 923 can receive a cable connected to a pulse oximetry or other sensor. In addition, the cable hub 920 could include additional ports in other embodiments for receiving additional cables. The hub includes a cable 922 which terminates in a connector 924 adapted to connect to a physiological monitor (not shown).


In an embodiment, the acoustic sensor assembly 901 includes a sensing element, such as, for example, a piezoelectric device or other acoustic sensing device. The sensing element can generate a voltage that is responsive to vibrations generated by the patient, and the sensor can include circuitry to transmit the voltage generated by the sensing element to a processor for processing. In an embodiment, the acoustic sensor assembly 901 includes circuitry for detecting and transmitting information related to biological sounds to a physiological monitor. These biological sounds can include heart, breathing, and/or digestive system sounds, in addition to many other physiological phenomena. The acoustic sensor 915 in certain embodiments is a biological sound sensor, such as the sensors described herein. In some embodiments, the biological sound sensor is one of the sensors such as those described in the '883 application. In other embodiments, the acoustic sensor 915 is a biological sound sensor such as those described in U.S. Pat. No. 6,661,161, which is incorporated by reference herein in its entirety. Other embodiments include other suitable acoustic sensors.


The attachment sub-assembly 904 includes first and second elongate portions 906, 908. The first and second elongate portions 906, 908 can include patient adhesive (e.g., in some embodiments, tape, glue, a suction device, etc.). The adhesive on the elongate portions 906, 908 can be used to secure the sensor subassembly 902 to a patient's skin. One or more elongate members 910 included in the first and/or second elongate portions 906, 908 can beneficially bias the sensor subassembly 902 in tension against the patient's skin and reduce stress on the connection between the patient adhesive and the skin. A removable backing can be provided with the patient adhesive to protect the adhesive surface prior to affixing to a patient's skin.


The sensor cable 907 can be electrically coupled to the sensor subassembly 902 via a printed circuit board (“PCB”) (not shown) in the sensor subassembly 902. Through this contact, electrical signals are communicated from the multi-parameter sensor subassembly to the physiological monitor through the sensor cable 907 and the cable 911.


In various embodiments, not all of the components illustrated in FIG. 9 are included in the sensor system 900. For example, in various embodiments, one or more of the patient anchor 903 and the attachment subassembly 904 are not included. In one embodiment, for example, a bandage or tape is used instead of the attachment subassembly 904 to attach the sensor subassembly 902 to the measurement site. Moreover, such bandages or tapes can be a variety of different shapes including generally elongate, circular and oval, for example. In addition, the cable hub 920 need not be included in certain embodiments. For example, multiple cables from different sensors could connect to a monitor directly without using the cable hub 920.


Additional information relating to acoustic sensors compatible with embodiments described herein, including other embodiments of interfaces with the physiological monitor, is included in U.S. patent application Ser. No. 12/044,883, filed Mar. 7, 2008, entitled “Systems and Methods for Determining a Physiological Condition Using an Acoustic Monitor,” and U.S. Provisional Patent Application No. 61/366,866, filed Jul. 22, 2010, entitled “Pulse Oximetry System for Determining Confidence in Respiratory Rate Measurements,” the disclosures of which are hereby incorporated by reference in their entirety. An example of an acoustic sensor that can be used with the embodiments described herein is disclosed in U.S. Provisional Patent Application No. 61/252,076, filed Oct. 15, 2009, titled “Acoustic Sensor Assembly,” the disclosure of which is hereby incorporated by reference in its entirety.


Conditional language used herein, such as, among others, “can,” “could,” “might,” “may,” “e.g.,” and the like, unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or states. Thus, such conditional language is not generally intended to imply that features, elements and/or states are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without author input or prompting, whether these features, elements and/or states are included or are to be performed in any particular embodiment.


Depending on the embodiment, certain acts, events, or functions of any of the methods described herein can be performed in a different sequence, can be added, merged, or left out all together (e.g., not all described acts or events are necessary for the practice of the method). Moreover, in certain embodiments, acts or events can be performed concurrently, e.g., through multi-threaded processing, interrupt processing, or multiple processors or processor cores, rather than sequentially.


The methods, steps, processes, calculations, computations or the like (“methods”) provided herein are simplified examples that are generally performed by advanced processing devices, including complex signal processors, sensitive analog and digital signal preprocessing boards, optical/optoelectronic componentry, display drivers and devices, or similar electronic devices. An artisan will recognize from the disclosure herein that the various methods often must be performed at speeds that, as a practical matter, could never be performed entirely in a human mind. Rather, for many calculations providing real time or near real time solutions, outputs, measurements, criteria, estimates, display indicia, or the like, many of the foregoing processing devices perform tens to billions or more calculations per second. In addition, such processing devices may process electrical signals, infrared signals, wireless signals, or other electro-magnetic wave signals that are incomprehensible to a human mind in their raw form and at the speeds communicated.


The various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the embodiments disclosed herein can be implemented as electronic hardware, computer software, or combinations of both operating in real-time or near real-time and at speeds unattainable by a human. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. The described functionality can be implemented in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the disclosure.


The various illustrative logical blocks, modules, and circuits described in connection with the embodiments disclosed herein can be implemented or performed with a general purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general purpose processor can be a microprocessor, but in the alternative, the processor can be any conventional processor, controller, microcontroller, or state machine. A processor can also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.


The blocks of the methods and algorithms described in connection with the embodiments disclosed herein can be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module can reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, a hard disk, a removable disk, a CD-ROM, or any other form of computer-readable storage medium known in the art. An exemplary storage medium is coupled to a processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium can be integral to the processor. The processor and the storage medium can reside in an ASIC. The ASIC can reside in a user terminal. In the alternative, the processor and the storage medium can reside as discrete components in a user terminal.


While the above detailed description has shown, described, and pointed out novel features as applied to various embodiments, it will be understood that various omissions, substitutions, and changes in the form and details of the devices or algorithms illustrated can be made without departing from the spirit of the disclosure. As will be recognized, certain embodiments of the inventions described herein can be embodied within a form that does not provide all of the features and benefits set forth herein, as some features can be used or practiced separately from others. The scope of certain inventions disclosed herein is indicated by the appended claims rather than by the foregoing description. All changes which come within the meaning and range of equivalency of the claims are to be embraced within their scope.

Claims
  • 1. A patient monitoring system comprising: one or more processors communicatively coupled with a physiological sensor that is configured to emit light towards a tissue site, detect the light after it has interacted with tissue, and output a first sensor signal responsive to the detected light, the one or more processors configured to: receive the first sensor signal from the physiological sensor, wherein the first sensor signal corresponds to a plethysmographic signal;identify a first feature associated with the first sensor signal, wherein the first feature is a frequency domain feature;compare the first feature to a plurality of categories, wherein each category of the plurality of categories corresponds to a different category value;select a category of the plurality of categories that most closely matches the first feature;combine a first category value corresponding to the selected category with a second category value corresponding to a second feature associated with a second sensor signal; andoutput an abstraction of a physiological parameter based at least in part on the combination of the first category value and the second category value.
  • 2. The patient monitoring system of claim 1, wherein the one or more processors are further configured to receive the second sensor signal from the physiological sensor.
  • 3. The patient monitoring system of claim 1, wherein the one or more processors are further configured to receive the second sensor signal from a sensor other than the physiological sensor.
  • 4. The patient monitoring system of claim 1, wherein the physiological parameter comprises at least one of respiratory rate, pulse rate, perfusion index, or blood pressure.
  • 5. The patient monitoring system of claim 1, wherein the one or more processors are further configured to determine a plethysmograph waveform from the first sensor signal, wherein the first feature corresponds to the plethysmograph waveform.
  • 6. The patient monitoring system of claim 1, wherein the first feature corresponds to at least one of a trend line, a standard deviation, a slope, a change in the slope, a correlation coefficient, or a linear regression.
  • 7. The patient monitoring system of claim 1, wherein at least one of the first category value or the second category value comprises at least one of a bit code, a sequence of bits, a sequence of bytes, or an alphanumeric code.
  • 8. A patient monitoring system comprising: one or more processors communicatively coupled with a physiological sensor that is configured to emit light towards a tissue site, detect the light after it has interacted with tissue, and output a first sensor signal responsive to the detected light, the one or more processors configured to: receive the first sensor signal from the physiological sensor;identify a first feature associated with the first sensor signal;select, from a plurality of categories, a category that most closely matches the first feature, wherein to select the category the one or more processors are configured to compare the first feature to the plurality of categories;identify a first category value corresponding to the selected category; andoutput an abstraction of a physiological parameter based at least in part on a combination of the first category value and a second category value corresponding to a second feature associated with a second sensor signal.
  • 9. The patient monitoring system of claim 8, wherein the one or more processors are further configured to receive the second sensor signal from the physiological sensor.
  • 10. The patient monitoring system of claim 8, wherein the one or more processors are further configured to receive the second sensor signal from a sensor other than the physiological sensor.
  • 11. The patient monitoring system of claim 8, wherein the physiological parameter comprises at least one of respiratory rate, pulse rate, perfusion index, or blood pressure.
  • 12. The patient monitoring system of claim 8, wherein the first feature corresponds to at least one of a trend line, a standard deviation, a slope, a change in the slope, a correlation coefficient, or a linear regression.
  • 13. The patient monitoring system of claim 8, wherein each of the plurality of categories correspond to a different candidate feature associated with the first feature.
  • 14. The patient monitoring system of claim 13, wherein the first feature corresponds to a slope, and wherein each of the plurality of categories correspond to a different candidate slope.
  • 15. The patient monitoring system of claim 8, wherein at least one of the first category value or the second category value comprises at least one of a bit code, a sequence of bits, a sequence of bytes, or an alphanumeric code.
  • 16. The patient monitoring system of claim 8, wherein the first feature comprises at least one of a time domain feature or a frequency domain feature.
  • 17. A method for reducing processing load of a patient monitor in communication with a physiological sensor, the physiological sensor configured to emit light towards a tissue site, detect the light after it has been attenuated by tissue, and output a first sensor signal responsive to the detected light, the method comprising: receiving the first sensor signal from the physiological sensor;identifying a first feature associated with the first sensor signal;selecting, from a plurality of categories, a category that most closely matches the first feature, wherein said selecting the category comprises comparing the first feature to at least two of the plurality of categories;identifying a first category value corresponding to the selected category; andoutputting an abstraction of a physiological parameter based at least in part on a combination of the first category value and a second category value corresponding to a second feature associated with a second sensor signal.
  • 18. The method of claim 17, wherein the second sensor signal is received from the physiological sensor.
  • 19. The method of claim 17, wherein the second sensor signal is received from a sensor other than the physiological sensor.
  • 20. The method of claim 17, wherein at least one of the first category value or the second category value comprises at least one of a bit code, a sequence of bits, a sequence of bytes, or an alphanumeric code.
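
For orientation, the feature-categorization flow recited in claims 8 and 17 can be sketched in a few lines of code. The sketch below is a hypothetical illustration, not the patented implementation: it assumes slope as the abstracted feature (per claim 14), assumes two-bit codes as the category values (per claim 15), and invents a lookup table (ABSTRACTIONS) mapping the combined code to an output abstraction. All names and numeric values are illustrative.

    # Hypothetical Python sketch of the claimed categorization flow.
    # Nothing here is taken from the patent beyond the flow itself:
    # abstract a feature, pick the closest category, encode it, and
    # combine encoded values to select an output.
    from typing import Dict, List, Tuple

    # Claim 14: each category corresponds to a different candidate slope.
    # Claim 15: category values may be bit codes (two bits here).
    CANDIDATE_SLOPES: List[Tuple[float, int]] = [
        (-1.0, 0b00),  # falling
        (0.0, 0b01),   # flat
        (0.5, 0b10),   # rising slowly
        (1.5, 0b11),   # rising quickly
    ]

    def categorize_slope(slope: float) -> int:
        """Select the category whose candidate slope most closely
        matches the measured slope; return its bit-code value."""
        _, value = min(CANDIDATE_SLOPES, key=lambda c: abs(c[0] - slope))
        return value

    def combine(first_value: int, second_value: int) -> int:
        """Pack two two-bit category values into one four-bit code."""
        return (first_value << 2) | second_value

    # Invented mapping from the combined code to an output abstraction.
    ABSTRACTIONS: Dict[int, str] = {
        0b0101: "stable",
        0b1111: "rapidly rising",
    }

    if __name__ == "__main__":
        first = categorize_slope(0.1)     # feature of the first sensor signal
        second = categorize_slope(-0.05)  # feature of the second sensor signal
        print(ABSTRACTIONS.get(combine(first, second), "indeterminate"))

Because each abstracted feature collapses to a small fixed-width code, an output can be selected with a single table lookup instead of re-processing raw waveform data, consistent with the processing-load reduction recited in claim 17.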
RELATED APPLICATIONS

The present application is a continuation of U.S. patent application Ser. No. 15/634,745, filed Jun. 27, 2017, entitled “Physiological Measurement Logic Engine,” which is a continuation of U.S. patent application Ser. No. 14/967,075, filed Dec. 11, 2015, entitled “Physiological Measurement Logic Engine,” which is a continuation of U.S. patent application Ser. No. 13/425,085, filed Mar. 20, 2012, entitled “Physiological Measurement Logic Engine,” which is a continuation of U.S. patent application Ser. No. 13/272,038, filed Oct. 12, 2011, entitled “Physiological Measurement Logic Engine,” which claims priority benefit of U.S. Provisional Patent Application No. 61/392,863, filed Oct. 13, 2010, entitled “Physiological Measurement Logic Engine,” each of which is hereby incorporated by reference herein in its entirety.

Related Publications (1)
Number Date Country
20200163625 A1 May 2020 US
Provisional Applications (1)
Number Date Country
61392863 Oct 2010 US
Continuations (4)
Parent Number Parent Filing Date Child Number Country
15634745 Jun 2017 16556741 US
14967075 Dec 2015 15634745 US
13425085 Mar 2012 14967075 US
13272038 Oct 2011 13425085 US