Depth of consciousness monitor

Information

  • Patent Grant
  • Patent Number
    10,542,903
  • Date Filed
    Thursday, June 6, 2013
  • Date Issued
    Tuesday, January 28, 2020
  • Field of Search
    • US
    • 600/544
    • 128/204.23
    • CPC
    • A61B5/4821
    • A61M2205/35
    • A61M2230/04
  • International Classifications
    • A61B5/0476
  • Term Extension
    1351
Abstract
The present disclosure relates to physiological monitoring to determine the depth of consciousness of a patient under sedation. The monitoring system includes an EEG sensor and a depth of consciousness monitor. The depth of consciousness monitor can utilize treatment data, such as patient data and/or drug profile information, together with an EEG signal to determine whether the patient is adequately sedated.
Description
FIELD OF THE DISCLOSURE

The present disclosure relates to the field of patient monitoring. In some embodiments, the disclosure relates to monitoring the depth of consciousness of a patient under anesthetic sedation.


BACKGROUND

General anesthesia is often used to put patients to sleep and block pain and memory during medical or diagnostic procedures. While extremely useful, general anesthesia is not risk free. Caregivers therefore generally seek to maintain a depth of consciousness consistent with the needs of a particular medical procedure. Caregivers will monitor various physiological parameters of the patient to predict the patient's depth of consciousness. In response to monitored parameters, the caregiver may manually adjust the anesthetic dosage level to avoid over- and under-dosing. However, as a patient's depth of consciousness may frequently change, caregivers often employ a host of monitoring technologies to attempt to periodically, sporadically, or continually ascertain the wellness and consciousness of a patient. For example, caregivers may desire to monitor one or more of a patient's temperature, electroencephalogram or EEG, brain oxygen saturation, stimulus response, electromyography or EMG, respiration, body oxygen saturation or other blood analytes, pulse, hydration, blood pressure, perfusion, or other parameters or combinations of parameters. For many of the foregoing, monitoring technologies are individually readily available and widely used, such as, for example, pulse oximeters, vital signs monitors, and the like.


In their depth of consciousness monitoring, caregivers may also use recording devices to acquire EEG signals. For example, caregivers place electrodes on the skin of the forehead to detect electrical activity produced by the firing of neurons within the brain. From patterns in the electrical activity, caregivers attempt to determine, among other things, the state of consciousness of the brain. Caregivers may also use a pulse oximeter or cerebral oximetry to determine the percentage of oxygenation of the hemoglobin in the patient's blood. Caregivers may also use an EMG monitor to detect the muscular action and mechanical impulses generated by the musculature around the patient's forehead, among other bodily locations. Caregivers manually monitor such physiological parameters and then manually adjust anesthetic dosage.


However, manual monitoring and dosage adjustment could lead to serious adverse results, including death, if improperly performed. In addition, typical depth of consciousness monitors do not account for variations in responses to sedation therapies that exist between patient demographics. Furthermore, typical depth of consciousness monitors do not account for differences in physiological responses that exist between particular sedation therapies and among different patient populations. Therefore, there remains a need in the art for a depth of consciousness monitor that is configured to automatically communicate with a caregiver and/or an anesthetic dosage device to provide accurate control over patient care by accounting for variations between populations and drug actions.


SUMMARY

Based on at least the foregoing, the present disclosure seeks to overcome some or all of the drawbacks discussed above and provide additional advantages over prior technologies. The present disclosure describes embodiments of noninvasive methods, devices, and systems for monitoring depth of consciousness through brain electrical activity.


In one embodiment, a depth of consciousness monitor is configured to determine the level of sedation of a medical patient. The depth of consciousness monitor includes: an EEG interface configured to receive an EEG signal from an EEG sensor; an EEG front end configured to pre-process the EEG signal; a processor, configured to determine a level of sedation of a medical patient based at least upon the pre-processed EEG signal, wherein the processor is further configured to determine delay information associated with the time the EEG signal is received and the time the level of sedation is determined; and a drug delivery device interface, configured to provide the level of sedation and the delay information to a drug delivery device.


In some embodiments the EEG front end includes an EEG engine and an EMG engine configured to extract EEG information and EMG information from the EEG signal, respectively. In some embodiments, the processor is further configured to time stamp the EEG signal when received from the EEG sensor. In one embodiment, the depth of consciousness monitor also includes an additional sensor front end, such as an SpO2 sensor front end. In some embodiments, the depth of consciousness monitor also includes a data port configured to receive at least one of patient data and drug profile information. The processor may be configured to determine a level of sedation of a medical patient based at least upon the pre-processed EEG signal and the at least one of patient data and drug profile information. In some embodiments, the depth of consciousness monitor also includes the EEG sensor and/or the drug delivery system. In some embodiments, the drug delivery device interface includes a wireless communication device.


In another embodiment, a depth of consciousness monitor is configured to determine the level of sedation of a medical patient. The depth of consciousness monitor includes: an EEG interface configured to receive an EEG signal from an EEG sensor; an EEG front end configured to pre-process the EEG signal; a processor configured to determine a level of sedation of a medical patient based at least upon the pre-processed EEG signal; and a data port configured to transmit the patient sedation level. The processor can include: two or more computing engines, each configured to compute a possible sedation level according to a different process; and a decision logic module, configured to determine the patient's sedation level based at least upon the possible sedation level computations.
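For illustration only, the parallel computing engines and decision logic module described above might be organized as in the following Python sketch; the engine names, the placeholder computations, and the averaging decision rule are assumptions rather than the disclosed implementation.

```python
# Hypothetical sketch of a processor with parallel computing engines and a
# decision logic module, as described above. Engine internals are placeholders.
from statistics import mean
from typing import Callable, Sequence


def motion_vector_engine(eeg_spectrum: Sequence[float]) -> float:
    """Placeholder: compute a possible sedation level (0-100) by one process."""
    return 60.0  # stand-in value


def phase_coherence_engine(eeg_spectrum: Sequence[float]) -> float:
    """Placeholder: compute a possible sedation level (0-100) by another process."""
    return 64.0  # stand-in value


def decision_logic(possible_levels: Sequence[float]) -> float:
    """Combine the engines' outputs; averaging is one option the text mentions."""
    return mean(possible_levels)


def determine_sedation_level(eeg_spectrum: Sequence[float],
                             engines: Sequence[Callable[[Sequence[float]], float]]) -> float:
    possible = [engine(eeg_spectrum) for engine in engines]
    return decision_logic(possible)


# Example: run both engines on a (dummy) pre-processed EEG spectrum.
level = determine_sedation_level([0.0] * 128,
                                 [motion_vector_engine, phase_coherence_engine])
```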


In some embodiments, at least one of the computing engines is configured to implement a motion vector process, a phase coherence process, and/or utilize a brain model to compute one of the possible sedation levels. The EEG front end may include an EEG engine and an EMG engine configured to extract EEG information and EMG information from the EEG signal, respectively. In some embodiments, the data port comprises a display and/or a wireless communication device.


In yet another embodiment, a method of determining the level of sedation of a medical patient is provided. The method includes: receiving an EEG signal indicative of a medical patient's EEG; receiving treatment data associated with at least one of the medical patient and a drug to be delivered to the medical patient; and determining a level of sedation based at least upon the EEG signal and the treatment data.


In some embodiments, the treatment data includes at least one of a patient age, age classification, sex, weight, body-mass index, a physiological parameter, a temperature, a blood oxygen concentration, an EMG signal, a drug type, a drug class, a mechanism of delivery, and an active ingredient. In some embodiments, determining a level of sedation comprises determining two or more possible levels of sedation with parallel computing engines and wherein said determining the level of sedation of the medical patient is based upon said possible levels of sedation. In some embodiments, determining the level of sedation comprises averaging two or more possible levels of sedation and/or selecting one of the possible levels of sedation as the level of sedation of the medical patient. In some embodiments, determining the possible levels of sedation is performed with motion vector analysis, phase coherence, and/or by utilizing a brain model.


For purposes of summarizing the disclosure, certain aspects, advantages and novel features of the disclosure have been described herein. Of course, it is to be understood that not necessarily all such aspects, advantages or features will be embodied in any particular embodiment of the disclosure.





BRIEF DESCRIPTION OF THE DRAWINGS

The following drawings and the associated descriptions are provided to illustrate embodiments of the present disclosure and do not limit the scope of the claims.



FIG. 1 illustrates an embodiment of a depth of consciousness monitoring system under closed-loop control.



FIG. 2 illustrates a block diagram of one embodiment of the depth of consciousness monitor of FIG. 1.



FIG. 3 illustrates one embodiment of the forehead sensor of FIG. 1.



FIG. 4 illustrates one embodiment of an EEG frequency spectrum of the patient before and during sedation using the depth of consciousness monitor of FIG. 2.



FIG. 5 illustrates one embodiment of the processor of the depth of consciousness monitor of FIG. 2.



FIG. 6 illustrates another embodiment of the processor of the depth of consciousness monitor of FIG. 2.



FIG. 7 illustrates yet another embodiment of the processor of the depth of consciousness monitor of FIG. 2.



FIG. 8 illustrates one embodiment of a routine to determine patient sedation information that can be performed by any of the processors of FIGS. 5-7.



FIG. 9 illustrates another embodiment of a routine to determine patient sedation information that can be performed by any of the processors of FIGS. 5-7.



FIG. 10 illustrates another embodiment of a depth of consciousness monitoring system.



FIG. 11 illustrates one embodiment of the depth of consciousness monitoring system of FIG. 10.



FIG. 12 illustrates one embodiment of the depth of consciousness monitoring assembly of FIG. 10.



FIG. 13 illustrates the depth of consciousness monitoring assembly of FIG. 12 attached to a sensor assembly.



FIGS. 14-16 illustrate views of the display of a multi-parameter monitor displaying physiological signals received from the depth of consciousness monitor of FIG. 10.





DETAILED DESCRIPTION

The present disclosure generally relates to patient monitoring devices. In order to provide a complete and accurate assessment of the state of a patient's various physiological systems, in an embodiment, a sensor may advantageously monitor one, multiple or combinations of EEG, EMG, cerebral oximetry, temperature, pulse oximetry, respiration, and other physiological parameters. In various embodiments, the sensor includes a disposable portion and a reusable portion. For example, the disposable portion may advantageously include components near a measurement site surface (the patient's skin), including, for example, an EEG sensor, an EMG sensor, a temperature sensor, tape, adhesive elements, positioning elements, or the like. In addition, or alternatively, the reusable portion may advantageously include other components, circuitry or electronics, which, in some embodiments, include time-of-use restrictions for quality control or the like. The reusable portion can be used multiple times for a single patient, across different patients, or the like, often depending upon the effectiveness of sterilization procedures. The reusable components may include, for example, cerebral oximetry components, pulse oximetry components and other components to measure other various parameters.


In an embodiment, the disposable portion of the sensor may include an inductance connection or other electrical connection to the reusable portion of the sensor. The physiological signals from all sensors can be transmitted through a common cable to a depth of consciousness monitor. In an embodiment, the depth of consciousness monitor may include an analog to digital converter, various electrical filters, and a microcontroller for processing and controlling the various sensor components.


In an embodiment, a depth of consciousness monitor is configured to communicate with the forehead sensor and one or more host display and patient monitoring stations. In an embodiment, the patient monitoring station may be a pulse oximeter. In an embodiment, the pulse oximeter may perform integrated display, data monitoring and processing of patient parameters including a connection for power and data communication. In an embodiment, some or all communication may be through wired, wireless, or other electrical connections. In an embodiment, the depth of consciousness monitor may advantageously be housed in a portable housing. In such embodiments, the unit may advantageously be physically associated with a monitored patient, such as, for example, attached in an arm band, a patient bed pouch, a hood or hat, a pocket of a shirt, gown, or other clothing, or the like. In other embodiments, the unit may be entirely or partially housed in a cable connector. In an embodiment, the signal processing and conditioning unit and/or the depth of consciousness monitor could also monitor patient parameters through other sensors including, for example, ECG, SpO2 from the earlobe, finger, forehead or other locations, blood pressure, respiration through acoustic or other monitoring technologies, or other clinically relevant physiological parameters.


In an embodiment, the depth of consciousness monitor communicates with a sensor, such as a forehead sensor including one or more light sources configured to emit light at a patient's forehead. In an embodiment, the light source may include one or more emitters or emitter systems; such emitters or emitter systems may be embedded in a substrate. In various embodiments, the emitters could include light emitting diodes (“LEDs”), lasers, superluminescent LEDs or some other light emitting components. These components could be arranged in any pattern on the substrate and could be either a single light emitting source or several light emitting sources. In an embodiment, the emitting components could emit light that deflects off of reflective surfaces associated with a cap of the substrate. The reflective cover could be any number of shapes or sizes and could be constructed to direct light to specific points or a point on the cap or substrate.


In an embodiment, a multi-faceted splitting mirror could reflect light to an opening in the substrate that would allow the light to escape and be emitted to an emission detector that, in an embodiment, is also housed in the light source substrate. The emission detector may advantageously sample the light, providing feedback usable to create an optical bench, or at least optical bench properties, of the light source, including, for example, determinations of intensity, wavelength, or the like. In an embodiment, the light source may include a polarized filter for adjusting the emitter light, in some embodiments before exiting an opening in the emitter or being detected by the emission detector.


In an embodiment, a caregiver could analyze physiological information collected from the various sensors including a patient's temperature, EEG, brain oxygen saturation, stimulus response, electromyography or EMG, respiration using an acoustic sensor applied to the throat, body oxygen saturation, glucose concentration, or other blood analytes, pulse, hydration, blood pressure, perfusion, or other parameters or combinations of parameters to determine relevant information about the state of a patient's well-being. In another embodiment, a caregiver may advantageously analyze information collected from the various sensors to more completely assess the overall depth of a patient's sedation and obtain an assessment superior to an assessment derived from monitoring a single or a few of the parameters mentioned above.


Reference will now be made to the Figures to discuss embodiments of the present disclosure.



FIG. 1 illustrates one example of a depth of consciousness monitoring system 100. In certain embodiments, the depth of consciousness monitoring system 100 measures one or more physiological parameters including cerebral electrical activity (e.g., via EEG), cerebral muscular activity (e.g., via EMG), temperature, cerebral oxygenation, including venous and arterial oxygenation, arterial oxygenation at various other points on the body, various other blood analytes, including total hemoglobin, glucose, lipids, stimulus response, electromyography or EMG, respiration, pulse, hydration, blood pressure, perfusion, or other parameters or combination of other physiologically relevant patient characteristics. The information from these physiological parameters can be evaluated using trend analysis, absolute and relative measures of certain parameters, combined or alone to evaluate the total wellness and current state of sedation of a patient.


The depth of consciousness monitoring system 100 includes multiple or a single sensor 120 in communication with a depth of consciousness monitor 140 via a communications link 150. In the illustrated embodiment, the depth of consciousness monitoring system 100 also includes a drug delivery device 160 that receives a control signal from the depth of consciousness monitor 140 via a control link 170. The drug delivery device 160 provides one or more sedatives (e.g., narcotic, hypnotic, analgesic, opiate, etc.) to a patient 180 via a conduit 190.


The sensor 120 can take any of a variety of shapes and sizes, and could be applied to a variety of measurement sites on a patient's skin including any location on the forehead and temples or other location of the head. One example of a sensor 120 configured for placement on a patient's forehead is described below with respect to FIG. 3. The sensor 120 generally includes one or more electrodes and is configured to measure the electrical activity within the patient's head and generate an EEG signal, as discussed in further detail below.


In some embodiments, the sensor's electrodes are designed to be placed at a measurement site covered with a patient's hair. A caregiver or patient may fasten the sensor 120 to the patient's head with a variety of mechanisms including adhesive, straps, caps, combinations of the same, or other devices for fastening sensors to a patient's body or skin known in the art.


In some embodiments, the communication link 150 and/or the control link 170 are wired electrical connections (e.g., an electrical cable, etc.). In other embodiments, the communication link 150 and/or the control link 170 utilize wireless communication to provide portability, and greater flexibility in depth of consciousness monitor placement with respect to the drug delivery device 160. Wireless communications also help accommodate an ambulatory patient, or other patient in transit. For example, in one embodiment, the depth of consciousness monitor 140 may be attached to an arm band or included in an arm band or other device that is wearable by the patient, including in a cap, a hood, a sling or a pocket of a garment. In such an embodiment, the sensor 120 communicates with the arm band depth of consciousness monitor 140 via a wired or a wireless connection.


In an embodiment, the depth of consciousness monitor 140 communicates wirelessly with the drug delivery device 160 over a wireless control link 170. This allows the depth of consciousness monitor 140 to be transported between various caregiving facilities, each with its own stationary drug delivery device 160, without unhooking and reinserting hardwired electrical connections. Instead, the depth of consciousness monitor 140 could establish a wireless communication link with a stationary drug delivery device 160 as the depth of consciousness monitor 140 is brought into proximity with the drug delivery device 160. In an embodiment, the devices could establish the connection automatically and patient data may be automatically sent from the depth of consciousness monitor 140 to the drug delivery device 160. In other embodiments, caregiver interaction is required to establish a wireless control link 170 between the depth of consciousness monitor 140 and the drug delivery device 160. Such configurations advantageously facilitate portability and seamless monitoring of a patient while the patient is transported, for example, from an ambulance to a hospital room or from room to room within a hospital.


In an embodiment, the depth of consciousness monitor 140 also communicates with, or incorporates, a pulse oximeter (not shown). The pulse oximeter may be a multi-parameter monitor or other host device capable of monitoring a wide variety of vital signs and blood constituents and other parameters or combinations of parameters, such as those monitors commercially available from Masimo Corporation of Irvine, Calif., and disclosed herein with reference to U.S. Pat. Nos. 6,584,336, 6,661,161, 6,850,788, and 7,415,297, among others assigned to Masimo Corporation, and U.S. Patent Publication Nos. 2006/0211924 and 2010/0030040, among others assigned to Masimo Corporation or Masimo Laboratories, Inc. of Irvine, Calif., all of which are incorporated by reference in their entireties.


The communication link 150 and the control link 170 can include any of a variety of wired or wireless configurations. For example, in some embodiments the links 150, 170 are implemented according to one or more of an IEEE 802.11x standard (e.g., a/b/g/n, etc.), a BLUETOOTH wireless standard, a wireless medical information communications standard, etc.


The drug delivery device 160 generally includes at least a drug therapy interface 192, a drug flow device 194, and a flow controller 196. The drug therapy interface 192 receives a drug and provides a flow path to the drug flow device 194. For example, the drug therapy interface 192 can include a port or receptacle to receive or interface with a drug capsule, intravenous bag, syringe, etc. The drug flow device 194 receives the drug therapy from the drug therapy interface 192 and allows the drug to flow to the patient 180 via the conduit 190. In some embodiments, the drug flow device 194 includes one or more of a solenoid, a pump, a valve, a peristaltic pump, a variable speed pump, a compression sleeve (e.g., to squeeze an intravenous bag), etc. The action and activation of the drug flow device 194 is controlled by the flow controller 196. The flow controller 196 includes a microcontroller, a memory, a signal input to receive a control signal from the depth of consciousness monitor 140 via the control link 170 and a signal output to provide control over the functionality of the drug flow device 194. The signal input can include a wireless radio to facilitate wireless communication over a wireless control link 170. The signal input allows closed-loop control over the operation of the drug delivery device 160, as will be described in greater detail below.


In some embodiments, the drug delivery device 160 is manually controlled by a clinician (e.g., an anesthesiologist) and/or includes a manual override to allow the clinician to take control over drug delivery to the patient 180. Therefore, in some embodiments, the depth of consciousness monitoring system 100 does not include an electronic control link 170 between the depth of consciousness monitor 140 and the drug delivery device. Instead, in such an open-loop configuration, the depth of consciousness monitor 140 displays an indication of the patient's depth of consciousness. The clinician is able to manually adjust drug therapy to the patient in response to the signals displayed by the depth of consciousness monitor 140.


In some embodiments, the depth of consciousness monitor 140 is configured to generate and/or provide a delay signal (which is a form of delay information) to the drug delivery device 160. The delay signal may be used by the drug delivery device to control whether the depth of consciousness monitoring system 100 is operating under positive or negative feedback. In some embodiments, the drug delivery device 160 controls or delays the delivery of drugs provided to the patient 180 in response to the delay signal. For example, the drug delivery device 160 may delay drug delivery to make sure that the depth of consciousness monitoring system 100 is operating under negative feedback, and is therefore a stable control system.


The delay signal can be determined by the depth of consciousness monitor 140 in any of a variety of manners. In one embodiment, the depth of consciousness monitor 140 time stamps the data received from the sensor 120 and provides the time stamp information or other related time information to the drug delivery device 160 as the delay signal. For example, in some embodiments, the delay signal includes the time stamp associated with the data received from the sensor 120 as well as a time stamp associated with the time the control signal data packet is sent to the drug delivery device 160. In other embodiments, the delay signal includes the difference between the two time stamps. In some embodiments, the drug delivery device 160 is configured to calculate or otherwise determine the appropriate control delay based upon the delay signal. For example, the drug delivery device 160 may time stamp the time the control signal is received from the depth of consciousness monitor 140. The drug delivery device 160 can determine a control delay based upon the difference between the time the control signal is received from the depth of consciousness monitor 140 and the time the depth of consciousness monitor received the signal from the sensor 120 that was used to generate the associated data packet received by the drug delivery device. The signal propagation delay through the depth of consciousness monitoring system 100 may be used to keep the system's feedback negative to avoid oscillation and to achieve stability.
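A minimal sketch of the timestamp arithmetic described above, assuming the delay signal carries the sensor-acquisition timestamp and the drug delivery device stamps the control packet on receipt; the function names and the stability threshold are illustrative, not values from the disclosure.

```python
# Illustrative control-delay computation at the drug delivery device.
# The timestamp handling and the maximum acceptable delay are assumptions.
import time


def control_delay_seconds(t_sensor_acquired: float, t_control_received: float) -> float:
    """Delay between the monitor receiving the sensor signal and the drug
    delivery device receiving the control packet derived from it."""
    return t_control_received - t_sensor_acquired


# Example: a packet stamped when the EEG data were acquired and again on receipt.
t_acquired = time.time()
# ... monitor processing and transmission occur here ...
t_received = time.time()
delay = control_delay_seconds(t_acquired, t_received)

MAX_DELAY_S = 2.0  # assumed threshold for keeping the control loop stable
if delay > MAX_DELAY_S:
    # Hold or defer the next dose adjustment so feedback stays negative.
    pass
```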


In an embodiment, a caregiver or the patient may attach the depth of consciousness monitor 140 directly to the patient's arm or other part or clothing of the patient through an armband with straps or some other means known in the art to connect a portable monitoring unit to a patient. In an embodiment, a depth of consciousness monitor 140 may be integrated into a hat or other headgear wearable by the patient or some other structure near the patient. In an embodiment, the depth of consciousness monitor 140 can rest on a table or other surface near the patient.


In some embodiments, a depth of consciousness monitor 140 is integrated with the drug delivery device 160. Alternatively, the depth of consciousness monitor 140 could be a module that is docked or otherwise attached to the drug delivery device 160. The depth of consciousness monitor 140 can communicate with and/or be integrated with a variety of other devices, such as a pulse oximeter, a respiration monitor, an EKG device, a blood pressure measurement device, a respirator, and/or a multi-parameter monitor.



FIG. 2 illustrates a block diagram of one embodiment of the depth of consciousness monitor 140, sensors 120, and drug delivery device 160. In an embodiment, the depth of consciousness monitor 140 includes a processor 220 which may be a micro-controller or other processor, to control and/or process signals received from the sensors 120. For example, the processor 220 may coordinate, process or condition, or manipulate the signals received from the sensor 120 to generate electronic data, control signals, and/or delay signals that are subsequently displayed and/or communicated to the drug delivery device 160. In addition, the processor 220 may receive instructions or data control messages from the drug delivery device 160 or other patient monitoring device to provide the appropriate conditioning and controlling of the various front end components of the sensors 120. Data transmitted between the depth of consciousness monitor 140, the drug delivery device 160, the sensors 120 and any other associated components, devices, or systems in communication with the depth of consciousness monitoring system 100 may be communicated by the devices using one or more interfaces, e.g., electrical wires, wireless communication, optical communication, RFID, LAN networks, or other electronic devices for communicating data known in the art.


The depth of consciousness monitor 140 may also include various front end components to enable the depth of consciousness monitor 140 to process the various signals received by the various sensors 120 that may be communicating with the depth of consciousness monitor 140. In an embodiment, front end components may translate and transmit instructions and control signals for driving the various sensors. In an embodiment, the front end components may translate, process, or transmit instructions and control signals to the emitting or light producing components of the sensor. The front end components may also receive and transmit data acquired by the detectors of the sensors to the microcontroller 220 or other processor 220. The front end components can include one or more of an analog-to-digital converter, a preamplifier, an amplifier, a filter, a decimation filter, a demodulator, etc.
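As a rough illustration of the front end stages listed above (band-limiting and decimating one digitized EEG channel), the following sketch uses SciPy; the sample rate, cutoff frequency, filter order, and decimation factor are assumptions.

```python
# Sketch of a digital EEG front end: band-limit then decimate one channel.
# Sample rate, cutoff, filter order, and decimation factor are illustrative
# assumptions, not values taken from the patent.
import numpy as np
from scipy.signal import butter, decimate, filtfilt

FS_IN = 500.0      # assumed A/D sample rate, Hz
CUTOFF = 50.0      # assumed low-pass cutoff, Hz
DECIMATION = 4     # assumed decimation factor


def preprocess_eeg(raw: np.ndarray) -> np.ndarray:
    b, a = butter(4, CUTOFF / (FS_IN / 2), btype="low")
    filtered = filtfilt(b, a, raw)          # zero-phase low-pass filter
    return decimate(filtered, DECIMATION)   # reduce the sample rate


channel = np.random.randn(5000)             # stand-in for a digitized EEG channel
preprocessed = preprocess_eeg(channel)
```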


These front end components could include front end components for a variety of sensors 120, including sensors that detect blood oxygenation, EEG, EMG, ECG, temperature, acoustic respiration monitoring (“ARM”), such as the ARM sensors available from Masimo Corporation of Irvine, Calif., acoustic throat respiration, and brain oxygenation. In an embodiment, a caregiver could advantageously utilize a device with the ability to monitor the plurality of above mentioned parameters to more accurately determine the depth of a patient's sedation. However, in some embodiments, the depth of consciousness monitor 140 only includes EEG and EMG front end components. In an embodiment, a front end component associated with a sensor 120 that detects brain oxygenation may have a sub-component dedicated to driving emitters 230 associated with a light source of the brain oxygenation sensor and a sub-component associated with the detector 260 or detectors 260 of the brain oxygenation sensor 300 for receiving and transmitting the detected signals that pass through various body tissues. In other embodiments, the light drivers and detector front end are omitted.


In an embodiment, one of the various sensors associated with the front end components of the brain oximetry unit could be, for example, a blood oxygenation sensor 310 which may be placed at various measurement sites on a patient's skin, including the earlobe, finger, forehead or other places known in the art suitable for detecting blood oxygenation. Many suitable pulse oximeter sensors 310 are known in the art such as those blood oxygenation sensors 310 commercially available from Masimo Corporation of Irvine, Calif., including but not limited to those described in U.S. Pat. Nos. 5,638,818, 6,285,896, 6,377,829, 6,580,086, 6,985,764, and 7,341,559, which are expressly incorporated by reference in their entireties.


In an embodiment, a temperature sensor 320 communicates with an associated front end component of the depth of consciousness monitor 140. The temperature sensor 320 detects the temperature of the skin, the temperature inside the ear, the temperature under the tongue, or a temperature obtained by any other temperature measurement method known in the art. In an embodiment, the temperature sensor 320 could be any suitable thermistor, or any other temperature sensor 320 known in the art capable of detecting a surface temperature of a patient's skin. An additional temperature sensor may advantageously provide feedback to the depth of consciousness monitor 140 regarding the performance or temperature of one, combinations of, or all of the emitters 230.


An EEG sensor 330 may also be associated with the front end components of the depth of consciousness monitor 140. In an embodiment, the EEG sensor 330 may be any of a variety of EEG sensors 330 known in the art. An EEG sensor 330 could be applied to a patient at any of a multitude of locations and measurement sites on the skin of the head of a patient. In an embodiment, the EEG sensor 330 may include electrode leads that may be placed on a measurement site in contact with the skin of the patient. In an embodiment, the EEG sensor 330 may monitor the electrical activity of a patient's brain through any number of electrodes, electrode leads, and channels or other systems known in the art. One such EEG sensor 330 is illustrated in FIG. 3 and described in greater detail below.


In an embodiment, the EEG sensor 330 may monitor and collect data from a patient's brain using 4 channels and 6 electrodes. In another embodiment, the EEG sensor 330 may use 3 channels and 5 electrodes. In another embodiment, any variety or combination of sensors may be used that are suitable for obtaining an EEG signal, such as those described in U.S. Pat. Nos. 6,016,444, 6,654,626, and 6,128,521, which are incorporated by reference in their entireties.


A brain oxygenation sensor 300 may also be associated with the front end components of the depth of consciousness monitor 140. In an embodiment, the brain oxygenation sensor 300 includes a light source 230 and a detector 260. The light source 230 of the brain oxygenation sensor 300 includes emitter(s) that would emit light, sonic or other radiation into the forehead at one, two, or another plurality of measurement sites located on the skin of the patient at a plurality of predetermined wavelengths. In an embodiment, the brain oxygenation sensor 300 would include a detector 260 with photodiodes or other radiation detection devices to detect the radiation emitted from the patient at one, two, or a plurality of measurement sites on the skin of the head of a patient. Many suitable brain oxygenation sensors 300 and cerebral oximeters are known in the art including those disclosed in U.S. Pat. Nos. 7,072,701 and 7,047,054, which are expressly incorporated by reference in their entireties.


In an embodiment, the light source 230 of the brain oxygenation sensor 300 may include an emission detector 260. In an embodiment, the emission detector 260 detects the light emitted from the light source 230 before passing through or contacting the measurement site of the patient. In an embodiment, an output from the emission detector 260 would be communicated to the micro-controller 220 of the depth of consciousness monitor 140 in order to calculate an approximate output intensity of the light emitted by the emitter(s) 230. The micro-controller 220 or other processor 220 could calculate the output intensity based on the output of the emission detector 260 by comparing the data to calibration data. In an embodiment, the calibration data could include measurements of intensity of light emitted from the emitter(s) 230 and corresponding measurements of output from the emission detector 260. This data could then be correlated to real time output from the emission detector 260 while the brain oxygenation sensor 300 is in use to determine an actual or approximate intensity of light or radiation being emitted by the emitter(s) 230 utilizing a calibration curve or other suitable calculation or processing method. In an embodiment, the calibration data may be stored in an EPROM or other memory module in the depth of consciousness monitor 140 or other patient processing module or device associated with the patient monitoring system 100.
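One simple way to realize the calibration-curve lookup described above is linear interpolation between stored calibration pairs; the numbers and units below are placeholders, not calibration data from the disclosure.

```python
# Sketch of converting an emission-detector reading into an approximate
# emitter intensity using stored calibration data; all values are placeholders.
import numpy as np

# Calibration pairs: emission-detector output (counts) vs. measured intensity (mW).
detector_counts = np.array([100.0, 400.0, 800.0, 1200.0])
intensity_mw = np.array([0.2, 1.0, 2.1, 3.3])


def estimated_intensity(detector_reading: float) -> float:
    """Interpolate along the calibration curve to estimate output intensity."""
    return float(np.interp(detector_reading, detector_counts, intensity_mw))


print(estimated_intensity(600.0))  # approximate intensity for a live reading
```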


In an embodiment, the detector 260 detects light or other radiation emitted from the light source 230 after, in an embodiment, some of the light has entered the measurement site on the patient and has been attenuated by a patient's tissue. In an embodiment, the detector 260 could be any number of detectors known in the art for detecting light or other radiation including photodiodes or other types of light or radiation detectors. In one embodiment, the detector 260 may convert detected light or other radiation into a signal, for example, an electrical output signal, which may represent the intensity or other attributes of the radiation. In an embodiment, the signal from the detector 260 may be sent to a brain oxygenation detector 260 front end located in the depth of consciousness monitor 140 for processing, conditioning or transmitting to a pulse oximeter (not shown) or other patient monitoring processor. In one embodiment, the signal may be converted into a digital format by an analog-to-digital converter located in the depth of consciousness monitor 140. In an embodiment, the data from the detector 260 of the brain oxygenation sensor 300 may be processed to determine the cerebral oxygenation of a patient's brain tissue. In an embodiment, the processing of the data may include determining the changes of intensity between various wavelengths of emitted and detected light of the cerebral oxygenation sensor 300.


In an embodiment, a cerebral oximeter or multi-parameter monitor (not shown) acquires data from the depth of consciousness monitor 140 or sensor 120 derived from physiologically relevant parameters. In an embodiment, the pulse oximeter could provide visual quantitative or qualitative assessments of the patient's well-being based on one or more of the various parameters or physiological attributes measured.


In an embodiment, a caregiver may utilize various physiological parameters to make a quantitative assessment of the patient's depth of sedation as indicated by an index based on, for example, a patient's temperature, electroencephalogram or EEG, brain oxygen saturation, stimulus response, electromyography or EMG, respiration based on acoustic throat sensors, body oxygen saturation or other blood analytes, pulse, hydration, blood pressure, perfusion, or other parameters or combinations of parameters. In another embodiment, various aspects of sedation could be assessed quantitatively or qualitatively based on a visual representation of the patient's sedation in aspects including hypnosis, responsiveness, muscle relaxation or other clinically relevant facets of depth of anesthesia.


In an embodiment, the functionality of the depth of consciousness monitor 140 could be optionally controlled by the drug delivery device 160. In an embodiment, the data and qualitative and quantitative assessments of a patient's well-being could be displayed on one or more of the depth of consciousness monitor 140, the drug delivery device 160, or any other device or system in communication with the depth of consciousness monitoring system 100 (e.g., a pulse oximeter, physiological patient monitor, nurse station, etc.). Also, audible alarms and other indicators could be provided on either or both of the depth of consciousness monitor 140 and the drug delivery device 160 in response to various threshold breaches based on the assessment of the patient's wellness and depth of consciousness as determined from the various monitored parameters.



FIG. 3 illustrates one embodiment of a sensor 120 in the form of an EEG sensor 410, which is configured for placement on a patient's forehead to generate an EEG signal. Disposable and reusable portions of the sensor 410 may be connected and overlaid on top of one another. The EEG sensor 410 includes six EEG electrodes 440 with two reference electrodes (along the central axis of the EEG sensor 410) and four active channel electrodes (two on each side of the EEG sensor's central axis). In some embodiments, light source(s) and light detector(s) (not shown) may be incorporated into the EEG sensor 410, as well. All or some of the above mentioned sensor components, including the EEG electrodes 440, leads from the electrodes 440, and blood oxygen detecting light emitters and detectors (when provided), may communicate with one or more chips 420 that enable transmission of acquired signals and drive signals. In some embodiments a single chip 420 is provided. In other embodiments, each component communicates with its own individual chip through wires, or printed circuits, or other suitable electrical connections.


The EEG electrodes 440 may be any suitable electrodes for detecting the electro-potentials on the surface of the skin of a patient's head. In one embodiment, EEG electrodes 440 include a metal or other suitable conductor and utilize leads contacting the surface of the skin. In another embodiment, the electrodes 440 are gelled electrodes that make contact with the skin via gel and have metal leads that come into contact with the gel. In still yet another embodiment, the EEG electrodes 440 may be glued to the forehead with any suitable patient dermal adhesive, which may itself be electrically conductive, for connecting the EEG electrodes 440. In an embodiment, potentials from the EEG electrodes 440 are transmitted to the depth of consciousness monitor 140 for further conditioning, transmitting or processing.


The sensor 410 may also include one or more temperature sensors (not shown). The temperature sensor may be any suitable sensor that can detect the temperature of the surface of the skin or other patient temperatures. In an embodiment, the temperature sensor may include a thermistor attached to a reusable portion of the sensor 410.


In an embodiment, the sensor 410 includes an interface 450 to couple the sensor 410 to the depth of consciousness monitor 140. The interface 450 may be any suitable electrical or data connection or communication port or device including, for example, a pin connector and receiver. Various other communication or electrical connections known in the art may be utilized. In an embodiment, the interface 450 may include an inductance connection utilizing transformers to couple a data and electrical connection across an insulator. In another embodiment, the interface 450 provides a data or electronic coupling between a reusable portion and a disposable portion of the sensor 410.


In some embodiments, the sensor 410 includes a disposable portion (not shown) removably attached to a reusable portion (not shown). In an embodiment, the disposable portion attaches to a measurement site of a patient's head and provides a base to which the reusable portion may be docked, mated or connected. The disposable portion houses the components of the sensor 410 that may be less expensive than at least some of the components contained in the reusable portion, and therefore may be disposed of after a single use or multiple uses, either on the same patient or different patients. The disposable portion of the sensor 410 includes a tape substrate that provides a base or substrate to which at least some of the components of the disposable portion may adhere or be integrated. In an embodiment, the tape can be constructed from any suitable disposable material that will effectively hold the components included in the disposable portion to a patient's forehead or other measurement site. In an embodiment, the tape includes a suitable dermal adhesive on a patient side of the disposable portion for temporary adhesion of the sensor to a patient's skin.


In an embodiment, the sensor 410 may include an adhesive tape 430 that supports the EEG electrodes 440. In one embodiment, the EEG electrodes 440 may be fastened to the tape 430. In an embodiment, the EEG electrodes 440 could be embedded in the tape 430 by any adhesive known in the sensor arts or any other suitable means for connecting the EEG electrodes 440 that would allow the EEG electrode 440 leads to be exposed on a patient side of the tape 430 in an appropriate position to come in close proximity to a measurement site of a patient's skin. In an embodiment, the EEG electrodes 440 may be gelled so that the gel contacts the electrodes and a measurement site of a patient's skin to provide an electrical path between the measurement site of the patient's skin and the EEG electrodes 440. In an embodiment, the leads of the EEG electrodes 440 are connected to a chip 420 by wires or other suitable electrical connections, such as a printed circuit, flex circuit, etc.


The sensor 410 may also include a temperature sensor (not shown). In an embodiment, the temperature sensor includes a thermistor with the thermistor leads exposed on a patient contacting side of the tape 430, in order to facilitate contact between the leads of the temperature sensor and a measurement site of a patient's skin. In an embodiment, the temperature sensor is connected to a single chip through wires or other suitable electrical connections such as a flexible printed circuit. In an embodiment, the temperature sensor may be located anywhere on the tape 430, the disposable portion or the reusable portion of the sensor 410 (if the sensor is provided with disposable and reusable portions). In an embodiment, the leads for the temperature sensor may be near the center of tape 430 or anywhere on the periphery of tape 430.


In some embodiments, the sensor 410 includes a pulse oximeter sensor (not shown). The pulse oximeter sensor can include an ear pulse oximeter sensor that emits and detects radiation to determine the oxygenation of the blood travelling through the arteries of the ear. Many suitable ear pulse oximeter sensors are known in the art such as those sensors commercially available from Masimo Corporation and disclosed herein with reference to U.S. Pat. No. 7,341,599, which is expressly incorporated by reference in its entirety. In another embodiment, the pulse oximeter sensor may be a forehead pulse oximeter sensor or any other suitable pulse oximeter known in the art or disclosed herein. The pulse oximeter sensor may be connected to the sensor 410 through electrical wires, wirelessly or other suitable electrical or data connection. Data collected from the pulse oximeter sensor may be transmitted to the depth of consciousness monitor 140, pulse oximeter, or both for conditioning and further processing.


In some embodiments, signal processing and conditioning circuitry of the depth of consciousness monitor, configured to monitor the EEG signals of a patient and provide feedback on the depth of sedation or awareness of a patient undergoing anesthesia, may be partially or entirely incorporated into the sensor 410. Suitable sedation brain function monitors include those similar to the SEDLINE sedation monitor commercially available from Masimo Corporation of Irvine, Calif., as well as those described in U.S. Pat. Nos. 6,128,521, 6,301,493, 6,317,627, and 6,430,437, all of which are expressly incorporated by reference in their entireties. For example, the sensor's connector or interface 450 may house the circuit board, with six channels for six detectors and a processor configured to determine depth of consciousness.


Integration of all or the majority of the associated circuitry and processing components of several different patient monitoring sensors in a single sensor advantageously provides a caregiver a simple device that can be attached to the patient's forehead and/or other areas on the patient, with minimal discomfort to the patient and a minimal number of wires and connections that could cause electrical interference with instruments in the hospital environment. Additionally, the caregiver will spend less time attaching various sensors to a patient where each would otherwise require its own associated monitoring station. Furthermore, integration of sensor processing components allows some of the processing components to have shared functionality and therefore saves considerably on manufacturing costs. For example, memory chips, processors, or other electrical components may be shared by the various sensors.


EEG Signal Processing


Referring again to FIG. 2, the depth of consciousness monitor's processor 220 is configured to receive at least an EEG signal from an EEG sensor 330 using an interface, such as an EEG interface 332 and process the EEG signal to determine the patient's depth of consciousness. In some embodiments, the processor 220 determines an index value between 0 and 100 to indicate depth of consciousness. The depth of consciousness monitor 140 may include a display, such as a monitor, LED, speaker, etc., to indicate the patient's depth of consciousness, e.g., the index value.


In some embodiments, the processor 220 determines the frequency content of the EEG signal prior to administration of any sedatives as well as during sedation. FIG. 4 illustrates one embodiment of a graph 500 showing the patient's EEG frequency content or frequency spectrum prior to sedation as curve 510 and during sedation as curve 530. A shift in amplitude at frequencies below 10 Hz, a drop in slope at frequencies above 10 Hz, and the formation or growth of a local maximum (e.g., local maximum 540) at 10 Hz in the sedation curve 530 each indicate that the patient has entered a sedated state.
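For concreteness, pre-sedation and during-sedation spectra analogous to curves 510 and 530 could be estimated with a standard method such as Welch's periodogram; the sample rate and segment length below are assumptions.

```python
# Sketch: estimate the EEG frequency spectrum before and during sedation.
# Welch's method is one common estimator; sample rate and segment length
# are assumed values, and random noise stands in for recorded EEG.
import numpy as np
from scipy.signal import welch

FS = 125.0  # assumed EEG sample rate, Hz


def eeg_spectrum(eeg: np.ndarray):
    freqs, psd = welch(eeg, fs=FS, nperseg=512)
    return freqs, psd


pre_sedation = np.random.randn(10 * int(FS))
during_sedation = np.random.randn(10 * int(FS))
f, psd_pre = eeg_spectrum(pre_sedation)      # analogous to curve 510
_, psd_sed = eeg_spectrum(during_sedation)   # analogous to curve 530
```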


Indeed, in one embodiment, the processor 220 determines whether the patient is adequately sedated by monitoring for the presence of a local maximum 540 in the frequency curve 530 above a predetermined threshold value, at 10 Hz. However, the shape of the frequency curve 530 can vary based upon several factors, such as any one or more of the patient's age, age classification (e.g., pediatric, adult, geriatric, etc.), sex, weight, body-mass index, genetic factors, etc. The shape of the frequency curve 530 can also vary based upon one or more physiological parameters associated with the patient, such as the patient's temperature, blood oxygen concentration, EMG signal, etc. Furthermore, the shape of the frequency curve 530 can also vary based upon the particular drug administered to sedate the patient. For example, the drug type, drug class (e.g., hypnotic, analgesic, opiate, etc.), mechanism or method of delivery (inhalant, intravenous, ingestible, etc.) and/or particular active ingredient (e.g., Propofol (TIVA), Sevoflurane, nitrous oxide, morphine, etc.) can each affect the shape of the frequency curve 530. Variations in the frequency curve 530 make it more difficult for the depth of consciousness monitor 140 to accurately determine whether the patient is adequately sedated.


Therefore, to improve accuracy, in one embodiment the depth of consciousness monitor's processor 220 analyzes the frequency curve 530 by considering one or more curve profiles associated with the patient and/or the drug administered. For example, in one embodiment, the processor 220 obtains physiological information regarding the patient from sensors 120 attached to the depth of consciousness monitor 140. In other embodiments, the processor 220 obtains physiological information regarding the patient via a data port. For example, the data port can receive temperature, blood oxygen saturation, respiration rate, and/or other physiological parameter information from a separate monitor. In addition, the depth of consciousness monitor 140 can receive additional information regarding the patient and the drug via the data port, as well.


For example, in some embodiments, the data port includes a wireless radio, a network adapter, a cable, an Ethernet adapter, a modem, a cellular telephone, etc., to receive patient and/or drug information. The patient and/or drug information is provided to the processor 220 to accurately interpret the frequency curve 530 derived from the EEG sensor 330 signal. In one embodiment, the data port includes a keyboard or other data entry device that allows the clinician to manually enter data relating to patient or drug parameters, such as those examples described above. Indeed, the processor 220 can include one or more EEG processing engines that are configured based upon the patient and/or drug data received by the depth of consciousness monitor 140, as discussed in greater detail below.


In some embodiments, the patient's frequency response graph 500 is processed as four distinct, non-overlapping frequency bands 550, 552, 554, 556.


For example, the first frequency band, sometimes referred to as the delta band, is the portion of the graph 500 between 0 and 4 or about 4 Hz. The second frequency band, sometimes referred to as the theta band, is the portion of the graph 500 between 4 or about 4 Hz and 7 or about 7 Hz. The third frequency band, sometimes referred to as the alpha band, is the portion of the graph 500 between 7 or about 7 Hz and 12 or about 12 Hz; and the fourth frequency band, sometimes referred to as the beta band, is the portion of the graph 500 greater than 12 or about 12 Hz.
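A brief sketch of splitting a spectrum into the four bands just described and integrating the power in each; the band edges use the approximate values above, and the upper edge of the beta band is an assumption.

```python
# Sketch: integrate spectral power within the delta/theta/alpha/beta bands
# described above. Band edges follow the approximate values in the text;
# the beta band's upper edge (50 Hz) is an assumption.
import numpy as np

BANDS_HZ = {"delta": (0.0, 4.0), "theta": (4.0, 7.0),
            "alpha": (7.0, 12.0), "beta": (12.0, 50.0)}


def band_powers(freqs: np.ndarray, psd: np.ndarray) -> dict:
    powers = {}
    for name, (lo, hi) in BANDS_HZ.items():
        mask = (freqs >= lo) & (freqs < hi)
        powers[name] = float(np.trapz(psd[mask], freqs[mask]))
    return powers


# Example with a synthetic spectrum that has a bump near 10 Hz.
freqs = np.linspace(0, 60, 512)
psd = np.exp(-(freqs - 10.0) ** 2)
print(band_powers(freqs, psd))
```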


In some embodiments, the depth of consciousness monitor 140 determines whether there is a peak 540 greater than a predetermined threshold in the frequency curve 530 anywhere within the alpha band 554. If so, the monitor 140 may determine that the patient is adequately sedated. However, in some cases, the peak 540 can shift and appear outside of the alpha band 554. For example, a sedated patient that is experiencing hypothermia may not manifest a peak in the alpha band; instead, the peak may shift to the theta or beta bands.


Therefore, in one embodiment, the depth of consciousness monitor 140 does not limit its search for a peak 540 to a particular frequency value (e.g., 10 Hz) or a particular frequency band (e.g., alpha band 554). Instead, in such an embodiment, the depth of consciousness monitor 140 scans across all frequencies (or a larger subset of frequencies than just those within the alpha band) to search for a peak 540 (e.g., across two or more frequency bands). A detected peak may be used to determine alone (or in combination with other patient and/or drug data) whether the patient is adequately sedated.


The peak 540 can be defined in any of a variety of clinically-relevant manners. For example, the peak 540 can be defined based upon the slope of the curve segment on one or both sides of the peak 540, the relative magnitude of the peak compared to the curve values at predetermined locations or offsets on either side of the peak 540, the relative magnitude of the peak compared to the frequency curve 510 of the patient obtained prior to sedation, etc.
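The peak search described above might be sketched as follows: scan the sedation spectrum for a local maximum exceeding a threshold without restricting the search to the alpha band; the threshold and prominence criterion are illustrative assumptions.

```python
# Sketch: search the sedation-curve spectrum for a local maximum above a
# threshold across all frequencies (not only the alpha band). Threshold and
# prominence values are illustrative assumptions.
import numpy as np
from scipy.signal import find_peaks


def sedation_peak(freqs: np.ndarray, psd: np.ndarray,
                  threshold: float, min_prominence: float = 0.1):
    """Return the frequency of the largest qualifying peak, or None."""
    idx, props = find_peaks(psd, height=threshold, prominence=min_prominence)
    if len(idx) == 0:
        return None
    best = idx[np.argmax(props["peak_heights"])]
    return float(freqs[best])


freqs = np.linspace(0, 60, 512)
psd = np.exp(-(freqs - 9.0) ** 2) + 0.05         # synthetic spectrum, bump near 9 Hz
print(sedation_peak(freqs, psd, threshold=0.5))  # ~9 Hz, consistent with sedation
```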


In one embodiment, the processor 220 processes the patient's frequency spectra curve 510, 530 as deformable curves by utilizing motion vector processing. For example, the processor 220 compares each point (or a predetermined number of points) in the pre-sedation curve 510 to points within the sedation frequency curve 530 to match points having the greatest similarity (e.g., relative position with respect to its neighbors, pattern matching, sum of absolute differences, any pattern matching technique, etc.). The processor 220 determines one or more motion vectors 560 to describe the motion of the points from one curve 510 to the next 530. Each motion vector 560 includes both direction and amplitude (e.g., distance traveled) information. Although the graph 500 includes curve 510, 530 illustrated in the frequency domain (the x-axis represents frequency), the motion vectors 560 include time domain information. For example, the processor 220 can look at multiple frames of data (e.g., multiple graphs 500) and employ pattern matching techniques (e.g., sum of absolute differences) to determine which points in the graphs 500 and their curves 510, 530 to use to define the respective motion vectors 560. In some embodiments, the motion vectors 560 are determined at 0.5, 1, 2, or 2.5 Hz intervals.
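A heavily simplified sketch of the motion-vector idea: selected points on the pre-sedation curve are matched to the sedation curve by a sum-of-absolute-differences comparison over a small neighborhood, and the displacement of each match is recorded; the window size, search range, and sampling step are assumptions.

```python
# Simplified motion-vector sketch: for selected points on the pre-sedation
# curve, find the best-matching point on the sedation curve by sum of
# absolute differences (SAD) over a small window, then record the displacement.
# Window size, search range, and sampling step are assumptions.
import numpy as np


def motion_vectors(pre: np.ndarray, sed: np.ndarray,
                   step: int = 8, window: int = 5, search: int = 20):
    half = window // 2
    vectors = []
    for i in range(half, len(pre) - half, step):
        ref = pre[i - half:i + half + 1]
        best_j, best_cost = i, np.inf
        lo = max(half, i - search)
        hi = min(len(sed) - half, i + search + 1)
        for j in range(lo, hi):
            cost = np.abs(sed[j - half:j + half + 1] - ref).sum()  # SAD match
            if cost < best_cost:
                best_cost, best_j = cost, j
        vectors.append((i, best_j - i, sed[best_j] - pre[i]))  # index, dx, dy
    return vectors


# Example on two synthetic spectra: the sedation curve is a shifted copy.
x = np.linspace(0, 60, 512)
pre_curve = np.exp(-(x - 8.0) ** 2)
sed_curve = np.exp(-(x - 10.0) ** 2)
print(motion_vectors(pre_curve, sed_curve)[:3])
```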


One or more motion vector 560 profiles may be constructed based upon particular drug and patient data. For example, each drug used for sedation may be characterized by a unique set of motion vectors. When a patient is treated with a particular drug, and the patient's motion vectors match those of the drug (e.g., the drug profile), the processor 220 can determine that the patient is adequately sedated. Such profiles may be determined for any one or more of the patient's age, age classification (e.g., pediatric, adult, geriatric, etc.), sex, weight, body-mass index, genetic factors, etc., physiological parameters associated with the patient, such as the patient's temperature, blood oxygen concentration, EMG signal, etc., the particular drug administered to sedate the patient, the drug type, drug class (e.g., hypnotic, analgesic, opiate, etc.), mechanism or method of delivery (inhalant, intravenous, ingestible, etc.) and/or particular active ingredient (e.g., Propofol (TIVA), Sevoflurane, nitrous oxide, morphine, etc.). Such profiles may be stored within the depth of consciousness monitor's memory, or they may be retrieved from one or more data repositories stored at one or more remote locations (e.g., over a computer network, over the Internet, from a server, from the cloud, etc.).


In another embodiment, the EEG front end circuitry is configured not to eliminate or filter out low frequencies. The EEG front end circuitry instead allows the processor 220 to determine slow waves (e.g., time-domain signals at or below 1, 0.5, and/or 0.2 Hz). The processor 220 can employ one or more phase coherence methods to detect phase coherence between one or more slow waves and one or more patient signals falling within one of the frequency bands 550, 552, 554, 556. For example, in some embodiments, phase coherence between a slow wave and a signal from the theta band 552 indicates that the patient is awake. Once the slow wave and the signal from the theta band 552 are out of phase, the patient is sedated. In other embodiments, phase coherence analysis is performed to compare phase coherence between a selected slow wave and a different frequency band's signals (e.g., the delta band 550, the alpha band 554, and/or the beta band 556). In some embodiments, the processor 220 performs phase coherence analysis between a selected slow wave and multiple signals between 4 and 50 Hz, e.g., every 0.2, 0.5, 1, or 2 Hz. In other embodiments, phase coherence is determined along the entire frequency spectrum.
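One possible way to estimate such phase coherence is a Hilbert-transform phase-locking value, sketched below with scipy; the band edges, filter order, and the choice of a phase-locking value (rather than any particular coherence estimator named in this disclosure) are assumptions made for illustration.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def phase_coherence(eeg, fs, slow_band=(0.2, 1.0), other_band=(4.0, 8.0)):
    """Estimate the phase-locking value between the slow-wave component and
    another band (theta by default) of a single EEG channel."""
    def bandpass(x, lo, hi):
        sos = butter(4, [lo, hi], btype="band", fs=fs, output="sos")
        return sosfiltfilt(sos, x)
    slow, other = bandpass(eeg, *slow_band), bandpass(eeg, *other_band)
    dphi = np.angle(hilbert(slow)) - np.angle(hilbert(other))
    return np.abs(np.mean(np.exp(1j * dphi)))   # 1.0 = fully phase locked, 0.0 = none

# With independent noise the value is near 0; phase-locked components push it toward 1.
fs = 250.0
print(phase_coherence(np.random.randn(int(30 * fs)), fs))
```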


In yet another embodiment, the processor 220 generates and/or utilizes a mathematical or electrical model of brain activity to determine whether the patient is adequately sedated. The model can be used to predict what the EEG of a sedated patient should look like based upon a particular drug, drug delivery mechanism, concentration (or any other drug parameter, including those discussed above). Actual EEG signals may be compared to the signal predicted by the model to determine whether the patient is adequately sedated. The model can be constructed of various combinations of electrical components (e.g., resistors, capacitors, amplifiers, etc.) or computing elements.


In one embodiment, brain modeling occurs by storing EEG signals from sedated patients in a memory location and categorizing the EEG signals based upon any of a variety of drug and patient data information. For example, sedated EEG signals may be categorized based upon the particular drug, dosage, concentration, delivery method, etc. used to treat the patient. A brain response model is constructed by combining the various data into a single model.
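By way of illustration, the sketch below combines the two preceding ideas: stored sedated-EEG spectra are categorized by drug and delivery method, the category mean serves as the model prediction, and a measured spectrum is compared against that prediction. The class name, the RMS-error comparison, and the tolerance are assumptions, not the disclosed model.

```python
import numpy as np

class BrainResponseModel:
    """Toy categorized model: the expected sedated spectrum for a
    (drug, delivery method) category is the mean of the stored spectra."""

    def __init__(self):
        self.examples = {}                      # (drug, delivery) -> list of spectra

    def add(self, drug, delivery, spectrum):
        self.examples.setdefault((drug, delivery), []).append(np.asarray(spectrum, float))

    def predict(self, drug, delivery):
        return np.mean(self.examples[(drug, delivery)], axis=0)

    def adequately_sedated(self, drug, delivery, measured, tol_db=5.0):
        """Compare a measured spectrum to the model prediction (RMS error in dB)."""
        err = np.sqrt(np.mean((np.asarray(measured, float)
                               - self.predict(drug, delivery)) ** 2))
        return bool(err <= tol_db)

model = BrainResponseModel()
model.add("propofol", "intravenous", [10.0, 8.0, 3.0, 1.0])
model.add("propofol", "intravenous", [11.0, 7.0, 3.5, 0.5])
print(model.adequately_sedated("propofol", "intravenous", [10.4, 7.6, 3.2, 0.8]))
```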


In some embodiments, the processor 220 includes a pre-processor 602, a compute engine 604, and a post-processor 606, as illustrated in FIG. 5. The patient's EEG signal is received by the pre-processor 602. The pre-processor 602 performs front end processing, such as one or more of filtering, amplification, A/D sampling, decimation, demodulation, etc. of the EEG signal. In some embodiments, the pre-processor 602 includes the EEG front end functionality discussed above with respect to FIG. 2. The compute engine 604 determines the level of patient sedation and/or depth of consciousness utilizing, for example, any of the techniques described herein. In one embodiment, the compute engine 604 determines an index value representative of the patient's sedation level. The post-processor 606 provides an indication of the patient's sedation level as well as other relevant information (e.g., system delay, as discussed above, other physiological parameter information, pass-through signals, etc.) for display to the clinician and/or transmission to a drug delivery device or other physiological monitor or information display station. In some embodiments, the post-processor 606 stores patient sedation, EEG signals, patient data, and drug information in a memory location.
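A minimal sketch of this pre-processor / compute engine / post-processor decomposition is shown below; the class names, the stand-in front-end conditioning, and the toy low-frequency-power index are illustrative assumptions and not the actual processing performed by the monitor.

```python
import numpy as np

class PreProcessor:
    def run(self, raw_eeg, fs):
        # Front-end conditioning: remove the DC offset and decimate by 2 as a
        # stand-in for the filtering/sampling stages described above.
        x = np.asarray(raw_eeg, dtype=float)
        return (x - x.mean())[::2], fs / 2

class ComputeEngine:
    def run(self, eeg, fs):
        # Stand-in sedation index: fraction of spectral power below 4 Hz,
        # scaled to 0-100 (higher = more low-frequency dominance).
        spectrum = np.abs(np.fft.rfft(eeg)) ** 2
        freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
        return 100.0 * spectrum[freqs < 4.0].sum() / spectrum.sum()

class PostProcessor:
    def run(self, index, delay_s):
        # Package the index and delay information for display or transmission.
        return {"sedation_index": round(index, 1), "delay_s": delay_s}

raw = np.random.randn(5000)
eeg, fs = PreProcessor().run(raw, fs=500.0)
print(PostProcessor().run(ComputeEngine().run(eeg, fs), delay_s=0.5))
```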


Another embodiment of a processor 220 is illustrated in FIG. 6. The processor 220 includes a pre-processor 602, multiple compute engines 604a, 604b, . . . 604n, and decision logic 608. Each compute engine 604a, 604b, . . . 604n determines patient sedation information utilizing a different processing approach. For example, one compute engine 604 may determine patient sedation information utilizing motion vector analysis (e.g., as discussed above), another compute engine 604 may determine patient sedation information utilizing phase coherence analysis (e.g., as discussed above), etc. Furthermore, each compute engine 604 can be drug or patient information specific. For example, a compute engine 604 may utilize historical information (either from the patient's own history or from a model) to determine patient sedation. Each compute engine 604 could therefore correspond to a particular patient's age, age classification (e.g., pediatric, adult, geriatric, etc.), sex, weight, body-mass index, genetic factors, etc.; physiological parameters, such as temperature, blood oxygen concentration, EMG signal, etc.; or the particular drug administered to sedate the patient, such as the drug type, drug class (e.g., hypnotic, analgesic, opiate, etc.), mechanism or method of delivery (inhalant, intravenous, ingestible, etc.), and/or particular active ingredient (e.g., Propofol (TIVA), Sevoflurane, nitrous oxide, morphine, etc.). The compute engines 604 may operate simultaneously to process EEG information in parallel.


A decision logic module 608 receives the outputs of each compute engine 604 and applies logic to determine the best estimate of the patient's sedation level. For example, in some embodiments, the decision logic module 608 averages, or computes a weighted average of, the outputs of the compute engines 604. In other embodiments, the decision logic module 608 selects one or more compute engine outputs based upon known information about the patient and/or the drug(s) used for sedation. The decision logic output 610 can indicate one or more parameters relevant to patient sedation. For example, in some embodiments, the decision logic output 610 includes a suppression bar, an EMG estimation, a patient sedation index, and drug type and patient age estimates. If any one or more decision logic outputs do not match actual drug or patient profile information, an alarm can activate. In other embodiments, the clinician manually compares the decision logic outputs to actual drug and patient profile information to confirm the accuracy of the depth of consciousness monitor 140.
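For illustration, the decision logic's weighted-average combination might look like the following sketch; the engine names and the weights favoring one engine for a particular drug are hypothetical.

```python
def decision_logic(engine_outputs, weights=None):
    """Combine per-engine sedation indices into a single best estimate.

    engine_outputs: dict of engine name -> sedation index (0-100)
    weights:        optional dict of engine name -> relative weight
    """
    if weights is None:                       # default: plain average
        weights = {name: 1.0 for name in engine_outputs}
    total = sum(weights[name] for name in engine_outputs)
    return sum(engine_outputs[name] * weights[name] for name in engine_outputs) / total

outputs = {"motion_vector": 42.0, "phase_coherence": 38.0, "model_based": 47.0}
# Hypothetical weighting that favors the model-based engine for a given drug.
print(decision_logic(outputs, weights={"motion_vector": 1.0,
                                       "phase_coherence": 1.0,
                                       "model_based": 2.0}))
```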


Another embodiment of a depth of consciousness monitor's processor 220 is illustrated in FIG. 7. The processor 220 includes a pre-processor 602, compute engines 604, and decision logic 608, as discussed above with respect to FIG. 6. In addition, the processor 220 includes an EEG engine 620 and an EMG engine 630. The EEG and EMG engines receive a pre-processed EEG signal from the pre-processor 602. The pre-processed EEG signal will generally contain both EEG and EMG content. The EEG content describes the electrical activity within the patient's brain, while the EMG content describes the electrical activity associated with the muscular contractions in the patient's forehead, near the EEG sensor. The EEG and EMG engines 620, 630 separate the EEG and EMG content from the pre-processed EEG signal. The outputs of the EEG and EMG engines 620, 630 communicate with the inputs of one or more compute engines 604. The EEG signal from the EEG engine provides an indication of the patient's hypnotic state, while the EMG engine provides an indication of the patient's analgesic response, or pain state. Separating the two provides more information about the patient's state and allows improved depth of consciousness processing. When EMG content is included in the EEG signal, the frequency response curve is flatter at higher frequencies (e.g., at frequencies in the beta band).
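One simple way to approximate such a separation is to band-split the pre-processed signal, as sketched below; the 30 Hz split frequency and filter order are assumptions chosen for illustration, not a statement of how the EEG and EMG engines 620, 630 actually operate.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def split_eeg_emg(x, fs, split_hz=30.0):
    """Separate a pre-processed forehead signal into a low-frequency 'EEG'
    component and a high-frequency 'EMG' component at split_hz."""
    sos_lo = butter(4, split_hz, btype="low", fs=fs, output="sos")
    sos_hi = butter(4, split_hz, btype="high", fs=fs, output="sos")
    return sosfiltfilt(sos_lo, x), sosfiltfilt(sos_hi, x)

fs = 250.0
signal = np.random.randn(int(10 * fs))
eeg_part, emg_part = split_eeg_emg(signal, fs)
print(eeg_part.shape, emg_part.shape)
```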



FIG. 8 illustrates one embodiment of a process 700 to determine a patient's sedation level that can be implemented by any of the processors described above. The process 700 begins at block 702, where the process 700 receives an EEG signal from a patient. At block 704, the process 700 receives treatment data. The treatment data may include one or more of patient data and drug profile information, such as any of the patient data or drug profile information described above. At block 706, the process 700 selects a computing engine based upon the treatment data. At block 708, the process 700 computes patient sedation information using the EEG information and the selected computing engine. The patient sedation information can include one or more of a patient sedation level or index, an EMG level, a prediction of the drug used to sedate the patient, a prediction of the patient's age, etc. The process 700 ends at block 708.
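A minimal sketch of the selection step at block 706 follows; the treatment-data keys and the engine registry are hypothetical names used only to show how treatment data could drive the choice of computing engine.

```python
def select_engine(treatment, engines):
    """Pick a compute engine keyed by (drug, age classification); fall back to
    a default engine when no specific match exists."""
    key = (treatment.get("drug"), treatment.get("age_class"))
    return engines.get(key, engines["default"])

engines = {
    ("propofol", "adult"): "propofol_adult_engine",
    ("sevoflurane", "pediatric"): "sevo_pediatric_engine",
    "default": "generic_engine",
}
print(select_engine({"drug": "propofol", "age_class": "adult"}, engines))
```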



FIG. 9 illustrates another embodiment of a process 800 to determine a patient's sedation level that can be implemented by any of the processors described above. The process 800 begins at block 802, where the process 800 receives an EEG signal from a patient. At block 804, the process 800 receives treatment data. The treatment data may include one or more of patient data and drug profile information, such as any of the patient data or drug profile information described above. At block 806, the process 800 computes patient sedation information with parallel computing engines using the EEG information. The patient sedation information can include one or more of a patient sedation level or index, an EMG level, a prediction of the drug used to sedate the patient, a prediction of the patient's age, etc. At block 808, the process 800 determines patient sedation information by selecting the output of one of the parallel computing engines, or by combining one or more computing engine outputs (e.g., averaging, weighted averaging, etc.). The process 800 ends at block 808.
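For illustration, the parallel computation at block 806 and the combination at block 808 could be sketched as follows; the placeholder engines and the plain averaging are assumptions, and any of the combination rules described above (selection, weighted averaging, etc.) could be substituted.

```python
from concurrent.futures import ThreadPoolExecutor
import numpy as np

def engine_a(eeg):   # placeholder engine standing in for one approach above
    return 40.0 + 0.1 * float(np.mean(eeg))

def engine_b(eeg):   # placeholder engine standing in for another approach
    return 45.0 - 0.1 * float(np.std(eeg))

def parallel_sedation_estimate(eeg, engines):
    # Block 806: run the engines concurrently on the same EEG information.
    with ThreadPoolExecutor() as pool:
        results = list(pool.map(lambda engine: engine(eeg), engines))
    # Block 808: combine the engine outputs (plain average shown here).
    return float(np.mean(results))

eeg = np.random.randn(2500)
print(parallel_sedation_estimate(eeg, [engine_a, engine_b]))
```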



FIG. 10 illustrates one embodiment of a depth of consciousness monitoring system 1000. The system 1000 includes a depth of consciousness monitor assembly 1002, a multi-parameter monitor 1012, and a sensor 1014. The monitor assembly 1002 includes a depth of consciousness processor 1003, which can include any of the processors described above. In some embodiments, the processor 1003 is configured to perform one or more of the methods described above.


The assembly 1002 may be provided in the form of a cable. For example, the assembly 1002 may include one or more cables 1004 (or cable portions) that terminate in connectors 1005, 1008 located at the cable ends 1006, 1010. In the illustrated embodiment of FIG. 10, the assembly 1002 includes two cables 1004. The first cable 1004 has two ends and is coupled to the processor 1003 at one end and terminates at a connector 1005 at the other end 1006. In one embodiment, the connector 1005 (which is one embodiment of an interface, such as a multi-parameter monitor interface) is configured to facilitate communication between the processor 1003 and a medical device, such as a physiological monitor, display instrument, and/or a multi-parameter monitor 1012, etc. In some embodiments, the connector 1005 receives power from a multi-parameter monitor 1012 to power the depth of consciousness processor 1003. In some embodiments, the processor 1003 is configured to consume less than about 250 mW, 500 mW, or 1 W at about 4.75 V, 5 V, or 5.25 V.
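For reference, at a nominal 5 V supply, I = P/V yields supply currents of roughly 50 mA for a 250 mW budget, 100 mA for 500 mW, and 200 mA for 1 W.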


Physiological signals generated by the depth of consciousness processor 1003 are communicated to the multi-parameter monitor 1012 via the connector 1005. The multi-parameter monitor 1012 is configured to display one or more of the signals generated by the depth of consciousness processor 1003. In one embodiment, the cable 1004 that terminates at the multi-parameter monitor 1012 connector 1005 is configured to provide and/or receive power, ground, data+, and data− signals. For example, in one embodiment, the cable 1004 includes four conductors, one each for power, ground, data+, and data−.


An adapter or coupler (not shown) may be provided to facilitate coupling of the connector 1005 to the multi-parameter monitor 1012. For example, an adapter having first and second ends can be configured to have different shapes and pin configurations to allow communication and physical coupling of the connector 1005 to the multi-parameter monitor 1012. In some embodiments, the adapter (not shown) also includes conditioning circuitry to facilitate communication and/or power transmission between the multi-parameter monitor 1012 and the processor 1003. For example, the adapter may provide voltage regulation, electrical isolation, signal multiplexing, etc.


The second cable 1004 has two ends and is coupled to the processor 1003 at one end and terminates at a connector 1008 at the other end 1010. In one embodiment, the connector 1008 is configured to facilitate communication with a physiological sensor 1014, such as an EEG sensor and/or any other sensor described above, via an interface 1015 (e.g., interface 450 of FIG. 3). In one embodiment, power from the multi-parameter monitor 1012 is directly or indirectly (e.g., after further filtering, conditioning, pulsing, etc. by the processor 1003) communicated to the sensor 1014. Signals (e.g., measured patient signals) from the sensor 1014 are communicated to the processor 1003 via the interface 1015, which can be coupled to the connector 1008.



FIG. 11 illustrates another embodiment of a depth of consciousness monitoring assembly 1002 coupled to a multi-parameter monitor (sometimes referred to as a multi-parameter instrument) 1012. The multi-parameter monitor 1012 is configured to receive and display a plurality of physiological parameters received from a patient monitoring device, such as, but not limited to, the depth of consciousness monitoring assembly 1002. The multi-parameter monitor 1012 includes a display 1020. The display 1020 is configured to display a plurality of physiological signals 1022 related to a medical patient. In some embodiments, the display 1020 can be configured to display only selected physiological signals 1022 or groups of physiological signals 1022. In some embodiments, the monitor 1012 can be configured to display a particular view or mode on the display 1020. The view or mode can include one or more pre-selected groupings of physiological signals to display. Examples of different views that may be provided via the display 1020 are discussed below with respect to FIGS. 14-16. Other views, in addition to or instead of those illustrated in FIGS. 14-16, may be displayed on the multi-parameter monitor 1012 display 1020 as well.


In some embodiments, the multi-parameter monitor 1012 also includes a removable module 1024. The removable module 1024 can include a physiological monitor configured to determine one or more physiological parameters associated with the medical patient. For example, in some embodiments, the removable module 1024 includes a respiration rate monitor, a blood oxygen saturation monitor, a blood gas monitor, a carbon monoxide monitor, an ECG monitor, an EKG monitor, a blood pressure monitor, a temperature monitor, a heart rate monitor, etc., or a combination of any one or more of the foregoing.


In the illustrated embodiment of FIG. 11, the depth of consciousness monitor assembly 1002 includes only one cable 1004. A first end of the cable 1004 terminates at a connector (not shown) that is attached to the multi-parameter monitor 1012. The second end of the cable 1004 includes the depth of consciousness processor 1003 and the connector 1008, which are integrated within a single housing assembly. The connector end of a sensor 1014 is shown attached to the second end of the depth of consciousness monitor assembly's 1002 cable 1004.



FIG. 12 illustrates another embodiment of a depth of consciousness monitor assembly 1002. The processor 1003 is positioned between the ends 1006, 1010 of the assembly 1002. The length of the cable 1004 attached to the connector 1008 configured to attach to a sensor (not shown) may be shorter than the length of the cable 1004 attached to the connector 1005 configured to attach to the multi-parameter monitor (not shown). The shorter sensor cable 1004 length can provide additional comfort and less pulling on the sensor when attached to the patient. FIG. 13 illustrates the depth of consciousness monitor assembly 1002 coupled to a sensor 1014. The sensor 1014 can include any of the EEG sensors described above.



FIGS. 14-16 illustrate views of various parameters 1022 that may be displayed on the multi-parameter monitor 1012 display 1020. The embodiment of FIG. 14 illustrates an EEG view of a multi-parameter monitor 1012. The multi-parameter monitor 1012 view includes a numeric value indicator (e.g., EEG data, insufficient EEG data, a patient state index (PSI) value, or any other value described herein), a bar-graph indicator, menu indicators, date and time indicators, message indicators, physiological waveform indicators, and system status indicators. The physiological waveform indicators can display each of the waveforms received from each electrode (e.g., R2, R1, L1) of a sensor 1014, such as an EEG sensor.



FIG. 15 illustrates a trend view of a multi-parameter monitor 1012. The multi-parameter monitor 1012 view includes a primary indicator or display and a secondary indicator or display. The primary indicator displays the trend of one or more physiological parameters (e.g., PSI, etc.) over time. One or more secondary indicators can display additional physiological parameters of the medical patient, including, but not limited to, EMG and SR. In one embodiment, the secondary indicators display information in the same format (e.g., waveform, solid waveform, bargraph, etc.) as the primary (e.g., trend plot) indicator, but at a smaller size. In other embodiments, the secondary indicators display information in a different format than the primary (e.g., trend plot, etc.) indicator. FIG. 16 illustrates a density spectral array view of a multi-parameter monitor 1012. The multi-parameter monitor 1012 view includes a spectral density indicator.


Each of the displayed physiological parameters can be determined by the depth of consciousness processor 1003. In addition, the multi-parameter monitor 1012 can be configured to display any one or more of the parameters discussed above, as well as other parameters, such as signals from the removable module 1024, when provided.


All publications, patents, and patent applications mentioned in this specification are herein incorporated by reference to the same extent as if each individual publication, patent, or patent application was specifically and individually indicated to be incorporated by reference.


Depending on the embodiment, certain acts, events, or functions of any of the processes or algorithms described herein can be performed in a different sequence, can be added, merged, or left out altogether (e.g., not all described operations or events are necessary for the practice of the algorithm). Moreover, in certain embodiments, operations or events can be performed concurrently, e.g., through multi-threaded processing, interrupt processing, or multiple processors or processor cores or on other parallel architectures, rather than sequentially.


The various illustrative logical blocks, modules, routines, and algorithm steps described in connection with the embodiments disclosed herein can be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. The described functionality can be implemented in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the disclosure.


The steps of a method, process, routine, or algorithm described in connection with the embodiments disclosed herein can be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module can reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of a non-transitory computer-readable storage medium. An example storage medium can be coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium can be integral to the processor. The processor and the storage medium can reside in an ASIC. The ASIC can reside in a user terminal. In the alternative, the processor and the storage medium can reside as discrete components in a user terminal.


Conditional language used herein, such as, among others, “can,” “could,” “might,” “may,” “e.g.,” and the like, unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or steps. Thus, such conditional language is not generally intended to imply that features, elements and/or steps are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without author input or prompting, whether these features, elements and/or steps are included or are to be performed in any particular embodiment. The terms “comprising,” “including,” “having,” and the like are synonymous and are used inclusively, in an open-ended fashion, and do not exclude additional elements, features, acts, operations, and so forth. Also, the term “or” is used in its inclusive sense (and not in its exclusive sense) so that when used, for example, to connect a list of elements, the term “or” means one, some, or all of the elements in the list.


Conjunctive language such as the phrase “at least one of X, Y and Z,” unless specifically stated otherwise, is to be understood with the context as used in general to convey that an item, term, etc. may be either X, Y, or Z, or a combination thereof. Thus, such conjunctive language is not generally intended to imply that certain embodiments require at least one of X, at least one of Y and at least one of Z to each be present.


While the above detailed description has shown, described, and pointed out novel features as applied to various embodiments, it can be understood that various omissions, substitutions, and changes in the form and details of the devices or algorithms illustrated can be made without departing from the spirit of the disclosure. As can be recognized, certain embodiments of the inventions described herein can be embodied within a form that does not provide all of the features and benefits set forth herein, as some features can be used or practiced separately from others. The scope of certain inventions disclosed herein is indicated by the appended claims rather than by the foregoing description. All changes which come within the meaning and range of equivalency of the claims are to be embraced within their scope.

Claims
  • 1. A depth of consciousness monitor configured to determine a level of sedation of a medical patient, the depth of consciousness monitor comprising: an EEG interface located within the depth of consciousness monitor and configured to receive an EEG signal from an EEG sensor;an EEG front end configured to pre-process the EEG signal;a processor, configured to determine a level of sedation of a medical patient based at least upon the pre-processed EEG signal, wherein the processor is further configured to determine delay information using a time the EEG signal is received by the EEG interface and a time the level of sedation is determined by the processor; anda drug delivery device interface, configured to provide the level of sedation and the delay information to a drug delivery device.
  • 2. The depth of consciousness monitor of claim 1, further comprising a multi-parameter monitor interface configured to receive power from a multi-parameter monitor and provide to the multi-parameter monitor one or more of: (a) one or more physiological signals determined by the processor, (b) the level of sedation, or (c) the delay information.
  • 3. The depth of consciousness monitor of claim 1, wherein the EEG front end comprises an EEG engine and an EMG engine configured to extract EEG information and EMG information from the EEG signal, respectively.
  • 4. The depth of consciousness monitor of claim 1, wherein the processor is further configured to time stamp the EEG signal when received from the EEG sensor.
  • 5. The depth of consciousness monitor of claim 1, further comprising an additional sensor front end.
  • 6. The depth of consciousness monitor of claim 5, wherein the additional sensor front end comprises an SpO2 sensor front end.
  • 7. The depth of consciousness monitor of claim 1, further comprising a data port configured to receive at least one of patient data and drug profile information.
  • 8. The depth of consciousness monitor of claim 7, wherein the processor is configured to determine a level of sedation of a medical patient based at least upon the pre-processed EEG signal and the at least one of patient data and drug profile information.
  • 9. A depth of consciousness monitoring system comprising the depth of consciousness monitor of claim 1 and the EEG sensor.
  • 10. A depth of consciousness monitoring system comprising the depth of consciousness monitor of claim 1 and the drug delivery device.
  • 11. The depth of consciousness monitor of claim 1, wherein the drug delivery device interface comprises a wireless communication device.
CROSS REFERENCE TO RELATED APPLICATIONS

This application claims the benefit of priority from U.S. Provisional No. 61/703,747, filed Sep. 20, 2012, and U.S. Provisional No. 61/656,974, filed Jun. 7, 2012, both of which are incorporated by reference in their entireties.

Related Publications (1)
Number Date Country
20130331660 A1 Dec 2013 US
Provisional Applications (2)
Number Date Country
61703747 Sep 2012 US
61656974 Jun 2012 US