The present disclosure relates generally to patient monitoring devices and specifically to a patient monitor and medical data communication hub.
Today's patient monitoring environments are crowded with sophisticated and often electronic medical devices servicing a wide variety of monitoring and treatment endeavors for a given patient. Generally, many if not all of the devices are from differing manufacturers, and many may be portable devices. The devices may not communicate with one another and each may include its own control, display, alarms, configurations and the like. Complicating matters, caregivers often desire to associate all types of measurement and use data from these devices to a specific patient. Thus, patient information entry often occurs at each device. Sometimes, the disparity in devices leads to a need to simply print and store paper from each device in a patient's file for caregiver review.
The result of such device disparity is often a caregiver environment scattered with multiple displays and alarms leading to a potentially chaotic experience. Such chaos can be detrimental to the patient in many situations including surgical environments where caregiver distraction is unwanted, and including recovery or monitoring environments where patient distraction or disturbance may be unwanted.
Various manufacturers produce multi-monitor devices or devices that modularly expand to increase the variety of monitoring or treatment endeavors a particular system can accomplish. However, as medical device technology expands, such multi-monitor devices may be obsolete the moment they are installed.
For purposes of summarizing the disclosure, certain aspects, advantages and novel features are discussed herein. It is to be understood that not necessarily all such aspects, advantages or features will be embodied in any particular embodiment of the invention and an artisan would recognize from the disclosure herein a myriad of combinations of such aspects, advantages or features.
Some aspects of the disclosure describe a method for presenting augmented reality data from a medical monitoring device. Under control of a hardware processor, the method can include receiving physiological monitoring data comprising physiological parameter values associated with a patient from a monitoring hub. The method can further include accessing user interface configuration data. The method can further include generating, from the physiological monitoring data, a plurality of augmented reality objects according to the user interface configuration data. The method can further include receiving user interaction data from a user input device of an augmented reality device, wherein the user interaction data comprises an indication of an interaction to virtually pin the plurality of augmented reality objects to a reference object. The method can further include determining a reference position based at least in part on the user interaction data. The method can further include causing presentation of the plurality of augmented reality objects in an augmented reality display, wherein the plurality of augmented reality objects are presented relative to the reference position.
The method can further include: identifying, from the user interaction data, the reference object; determining the reference position for the reference object; and calculating a positional offset from the reference position, wherein the plurality of augmented reality objects are presented relative to the reference position according to the positional offset.
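By way of a non-limiting illustration, the offset-based placement described above can be sketched as follows. This is a minimal sketch under an assumed data model; names such as `reference_position` and `layout_offsets` are hypothetical, not taken from the disclosure.

```python
# Sketch: present AR objects relative to a pinned reference position,
# each displaced by its own positional offset (illustrative data model).

def place_objects(reference_position, layout_offsets):
    """Return world-space positions for each augmented reality object.

    reference_position: (x, y, z) of the pinned reference object.
    layout_offsets: mapping of object id -> (dx, dy, dz) offset.
    """
    rx, ry, rz = reference_position
    return {
        obj_id: (rx + dx, ry + dy, rz + dz)
        for obj_id, (dx, dy, dz) in layout_offsets.items()
    }

# Example: pin a pulse-rate readout 0.5 m above the reference object.
positions = place_objects((1.0, 1.5, 2.0), {"pulse_rate": (0.0, 0.5, 0.0)})
```

If the reference object moves (e.g., the patient), only `reference_position` needs updating; the offsets preserve the configured arrangement.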
The method can further include: determining, from the user interface configuration data, whether to present a direct overlay; identifying, from the user interface configuration data, the reference object; calculating the reference position for the reference object, wherein an overlay object of the plurality of augmented reality objects is presented at the reference position.
The method can further include: detecting a gesture from the user interaction data; and generating the user interface configuration data based at least in part on the gesture.
The plurality of augmented reality objects can be presented in a first arrangement. The method can further include: receiving second user interaction data from the user input device, the second user interaction data indicating a second arrangement of the plurality of augmented reality objects; generating second user interface configuration data from the second user interaction data; and causing presentation, in the augmented reality display, of the plurality of augmented reality objects in the second arrangement according to the second user interface configuration data.
A user interface of the augmented reality display can be configurable via a device different from the augmented reality device.
The method can further include: receiving hub user interaction data from the monitoring hub; determining a hub user interface configuration from the hub user interaction data, wherein the user interface configuration data is indicative of the hub user interface configuration; and causing presentation of a user interface on a display of the monitoring hub according to the hub user interface configuration.
The hub user interface configuration can include a plurality of user interface elements, wherein each element of the plurality of user interface elements comprises a physiological parameter value, and wherein each element of the plurality of user interface elements corresponds to an object from the plurality of augmented reality objects.
At least some of the plurality of augmented reality objects can correspond to the hub user interface configuration.
The hub user interaction data can be indicative of at least one addition, removal, or rearrangement of a user interface element.
An augmented reality device can be configured to present physiological data. The augmented reality device can include a memory device configured to store instructions, and a hardware processor. The hardware processor can be configured to execute the instructions to receive physiological monitoring data comprising physiological parameter values associated with a patient from a patient monitor. The hardware processor can be further configured to access user interface configuration data. The hardware processor can be further configured to generate, from the physiological monitoring data, a plurality of augmented reality objects according to the user interface configuration data. The hardware processor can be further configured to determine that a clinician wearing the augmented reality device is looking toward the patient monitor. In response to said determination, the hardware processor can be further configured to cause presentation of the plurality of augmented reality objects in an augmented reality display in a vicinity of the patient monitor.
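One way to determine whether the clinician is looking toward the patient monitor is to test the angle between the gaze direction and the vector to the monitor. The following is an illustrative sketch only; the disclosure does not specify this computation, and the function names and the 15-degree threshold are assumptions.

```python
import math

# Sketch: treat the gaze as a ray and check whether the monitor lies
# within a cone of max_angle_deg around the gaze direction.

def looking_toward(gaze_origin, gaze_direction, monitor_position,
                   max_angle_deg=15.0):
    """Return True if the gaze ray points within max_angle_deg of the monitor."""
    to_monitor = tuple(m - g for m, g in zip(monitor_position, gaze_origin))
    norm_t = math.sqrt(sum(c * c for c in to_monitor))
    norm_d = math.sqrt(sum(c * c for c in gaze_direction))
    if norm_t == 0 or norm_d == 0:
        return False
    cos_angle = sum(a * b for a, b in zip(gaze_direction, to_monitor)) / (norm_d * norm_t)
    angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_angle))))
    return angle <= max_angle_deg

# Clinician at the origin looking down +z; monitor slightly off-axis.
assert looking_toward((0, 0, 0), (0, 0, 1), (0.2, 0.0, 2.0))
```

When the test returns True, the augmented reality objects can be presented in the vicinity of the monitor's position.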
The hardware processor can be further configured to: receive user interaction data from a user input device of an augmented reality device; and generate the user interface configuration data from the user interaction data.
The hardware processor can be further configured to: receive user interaction data from a user input device of an augmented reality device; identify, from the user interaction data, a reference object; determine a reference position for the reference object; and calculate a positional offset from the reference position, wherein the plurality of augmented reality objects are presented relative to the reference position according to the positional offset.
The hardware processor can be further configured to: determine, from the user interface configuration data, whether to present a direct overlay; identify, from the user interface configuration data, a reference object; and calculate a reference position for the reference object, wherein an overlay object of the plurality of augmented reality objects is presented at the reference position.
The hardware processor can be further configured to: receive user input data from a user input device; detect a gesture from the user input data; and generate the user interface configuration data based at least in part on the gesture.
A user interface of the augmented reality device can be configurable via a device different from the augmented reality device.
The hardware processor can be further configured to: receive monitor user interaction data from the patient monitor; determine a monitor user interface configuration from the monitor user interaction data, wherein the user interface configuration data is indicative of the monitor user interface configuration; and cause presentation of a user interface on a display of the patient monitor according to the monitor user interface configuration.
At least some of the plurality of augmented reality objects can correspond to the monitor user interface configuration.
The monitor user interaction data can be indicative of at least one addition, removal, or rearrangement of a user interface element.
The user input data can include image data. Detecting the gesture from the user input data can further include: determining color histogram data from the image data; locating a search window in the image data according to the color histogram data; and identifying a plurality of positions of the search window in the image data.
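The histogram-and-search-window steps above can be illustrated with a minimal, pure-Python sketch in the spirit of mean-shift tracking. The toy "image" of color labels, the function names, and the convergence details are all illustrative assumptions; a real implementation would operate on per-pixel hue histograms over camera frames.

```python
# Sketch: back-project a color histogram over the image, then iteratively
# re-center a search window on the resulting weights (mean-shift style).
# All names are illustrative, not from the disclosure.

def backproject(image, histogram):
    """Score each pixel by the histogram weight of its color label."""
    return [[histogram.get(label, 0.0) for label in row] for row in image]

def mean_shift(weights, window, iterations=10):
    """Re-center a (row, col, height, width) window on the weight centroid."""
    r, c, h, w = window
    for _ in range(iterations):
        total = wr = wc = 0.0
        for i in range(r, min(r + h, len(weights))):
            for j in range(c, min(c + w, len(weights[0]))):
                total += weights[i][j]
                wr += i * weights[i][j]
                wc += j * weights[i][j]
        if total == 0:
            break  # no tracked color under the window
        nr = int(round(wr / total - (h - 1) / 2))
        nc = int(round(wc / total - (w - 1) / 2))
        if (nr, nc) == (r, c):
            break  # converged
        r, c = max(nr, 0), max(nc, 0)
    return (r, c, h, w)

# Toy example: a 2x2 "skin"-colored blob at rows/cols 3-4 pulls the
# window from its starting position toward the blob.
image = [["bg"] * 6 for _ in range(6)]
for i in (3, 4):
    for j in (3, 4):
        image[i][j] = "skin"
weights = backproject(image, {"skin": 1.0})
window = mean_shift(weights, (1, 1, 3, 3))
```

The sequence of window positions across frames yields the "plurality of positions" from which a gesture trajectory can be detected.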
The reference object can be the patient. The user input device can include a camera.
A system for presenting augmented reality data from a medical monitoring device can include: an augmented reality device comprising a hardware processor, memory, a display, and a wireless device configured to communicate with a medical monitoring device in communication with a patient. The hardware processor of the augmented reality device can be configured to identify a user interaction with the augmented reality device by a clinician wearing the augmented reality device, the user interaction comprising a gesture. The hardware processor can be further configured to determine that the user interaction is an instruction to virtually pin a user interface of the medical monitoring device to a virtual location separate from the medical monitoring device. The hardware processor can be further configured to wirelessly obtain patient data depicted in the user interface from the medical monitoring device. The hardware processor can be further configured to output for presentation to the clinician, in the augmented reality device, a display of the user interface pinned to the virtual location.
The instruction can be to virtually pin the user interface outside of and next to a hospital room of the patient so that the clinician can subsequently view the user interface while walking past the hospital room and without entering the hospital room.
The display of the user interface pinned to the virtual location can be viewable only when the display's field of view includes the virtual location, and wherein the virtual location does not obscure the patient.
The augmented reality device can be further operable to detect that the clinician has moved his or her head away from looking at the medical monitoring device and to reduce a size or content of the user interface at the virtual location.
The augmented reality device can further include a camera or a movement sensor configured to detect that the clinician has moved his or her head away from looking at the medical monitoring device.
The user interface can be configurable via a device different from the augmented reality device.
The following drawings and the associated descriptions are provided to illustrate embodiments of the present disclosure and do not limit the scope of the claims.
While the foregoing “Brief Description of the Drawings” references generally various embodiments of the disclosure, an artisan will recognize from the disclosure herein that such embodiments are not mutually exclusive. Rather, the artisan would recognize a myriad of combinations of some or all of such embodiments.
Based on at least the foregoing, a solution is needed that coordinates the various medical devices treating or monitoring a patient. The solutions described herein can provide patient identification seamlessly across the device space and can expand for future technologies without necessarily requiring repeated software upgrades. In addition, such solutions can include patient electrical isolation where desired.
Therefore, the present disclosure relates to a patient monitoring hub that is the center of patient monitoring and treatment activities for a given patient. The patient monitoring hub can interface with legacy devices without necessitating legacy reprogramming, can provide flexibility for interfacing with future devices without necessitating software upgrades, and can offer optional patient electrical isolation. The hub can include a large display that can dynamically provide information to a caregiver about a wide variety of measurement or otherwise determined parameters. The hub can include a docking station for a portable patient monitor. The portable patient monitor may communicate with the hub through the docking station or through various wireless paradigms known to an artisan from the disclosure herein, including WiFi, Bluetooth, Zigbee, or the like.
The portable patient monitor can modify its screen when docked. The undocked display indicia are transferred, in part or in whole, to a large dynamic display of the hub, and the docked display presents one or more anatomical graphics of monitored body parts. For example, the display may present a heart, lungs, a brain, kidneys, intestines, a stomach, other organs, digits, gastrointestinal systems or other body parts when it is docked. The anatomical graphics may advantageously be animated. The animation may generally follow the behavior of measured parameters, such as, for example, the lungs may inflate in approximate correlation to the measured respiration rate and/or the determined inspiration portion of a respiration cycle, and likewise deflate according to the expiration portion of the same. The heart may beat according to the pulse rate, may beat generally along understood actual heart contraction patterns, and the like. Moreover, when the measured parameters indicate a need to alert a caregiver, a changing severity in color may be associated with one or more displayed graphics, such as the heart, lungs, brain, or the like. The body portions may include animations on where, when or how to attach measurement devices to measurement sites on the patient. For example, the monitor may provide animated directions for CCHD screening procedures or glucose strip reading protocols, the application of a forehead sensor, a finger or toe sensor, one or more electrodes, an acoustic sensor, an ear sensor, a cannula sensor or the like.
The present disclosure relates to a medical monitoring hub configured to be the center of monitoring activity for a given patient. The hub can include a large easily readable display, such as an about ten (10) inch display dominating the majority of real estate on a front face of the hub. The display could be much larger or much smaller depending upon design constraints. However, for portability and current design goals, the preferred display is roughly sized proportional to the vertical footprint of one of the dockable portable patient monitors. Other considerations are recognizable from the disclosure herein by those in the art.
The display can provide measurement data for a wide variety of monitored parameters for the patient under observation in numerical or graphic form. The display can be automatically configured based on the type of data and information being received at the hub. The hub can be moveable, portable, or mountable so that it can be positioned to convenient areas within a caregiver environment. For example, the hub can be collected within a singular housing.
The hub may advantageously receive data from a portable patient monitor while docked or undocked from the hub. Typical portable patient monitors, such as oximeters or co-oximeters, can provide measurement data for a large number of physiological parameters derived from signals output from optical and/or acoustic sensors, electrodes, or the like. The physiological parameters can include, but are not limited to, oxygen saturation, carboxyhemoglobin, methemoglobin, total hemoglobin, glucose, pH, bilirubin, fractional saturation, pulse rate, respiration rate, components of a respiration cycle, indications of perfusion including perfusion index, signal quality and/or confidences, plethysmograph data, indications of wellness or wellness indexes or other combinations of measurement data, audio information responsive to respiration, ailment identification or diagnosis, blood pressure, patient and/or measurement site temperature, depth of sedation, organ or brain oxygenation, hydration, measurements responsive to metabolism, combinations of the same or the like, to name a few. The hub may also (or instead) output data sufficient to accomplish closed-loop drug administration in combination with infusion pumps or the like.
The hub can communicate with other devices in a monitoring environment that are interacting with the patient in a number of ways. For example, the hub advantageously can receive serial data from other devices without necessitating their reprogramming or that of the hub. Such other devices can include pumps, ventilators, all manner of monitors monitoring any combination of the foregoing parameters, ECG/EEG/EKG devices, electronic patient beds, and the like. Moreover, the hub advantageously can receive channel data from other medical devices without necessitating their reprogramming or that of the hub. When a device communicates through channel data, the hub may advantageously alter the large display to include measurement information from that device. Additionally, the hub can access nurse call systems to ensure that nurse call situations from the device are passed to the appropriate nurse call system.
The hub also can communicate with hospital systems to advantageously associate incoming patient measurement and treatment data with the patient being monitored. For example, the hub may communicate wirelessly or otherwise to a multi-patient monitoring system, such as a server or collection of servers, which in turn may communicate with a caregiver's data management systems, such as, for example, an Admit, Discharge, Transfer (“ADT”) system and/or an Electronic Medical Records (“EMR”) system. The hub advantageously can associate the data flowing through it with the patient being monitored, thereby allowing the electronic measurement and treatment information to be passed to the caregiver's data management systems without the caregiver associating each device in the environment with the patient.
The hub advantageously can include a reconfigurable and removable docking station. The docking station may dock additional layered docking stations to adapt to different patient monitoring devices. Additionally, the docking station itself can be modularized so that it may be removed if the primary dockable portable patient monitor changes its form factor. Thus, the hub can be flexible in how its docking station is configured.
The hub can include a large memory for storing some or all of the data it receives, processes, and/or associates with the patient, and/or communications it has with other devices and systems. Some or all of the memory may advantageously comprise removable SD memory.
The hub can communicate with other devices through at least (1) the docking station to acquire data from a portable monitor, (2) innovative universal medical connectors to acquire channel data, (3) serial data connectors, such as RJ ports to acquire output data, (4) Ethernet, USB, and nurse call ports, (5) wireless devices to acquire data from a portable monitor, and (6) other wired or wireless communication mechanisms known to an artisan. The universal medical connectors advantageously can provide optional electrically isolated power and communications, and can be designed to be smaller in cross section than isolation requirements. The connectors and the hub can communicate to advantageously translate or configure data from other devices to be usable and displayable for the hub. A software developer's kit (“SDK”) can be provided to a device manufacturer to establish or define the behavior and meaning of the data output from their device. When the output is defined, the definition can be programmed into a memory residing in the cable side of the universal medical connector and supplied as an original equipment manufacturer (“OEM”) part to the device provider. When the cable is connected between the device and the hub, the hub can understand the data and can use it for display and processing purposes without necessitating software upgrades to the device or the hub. The hub can negotiate the schema and can even add additional compression and/or encryption. Through the use of the universal medical connectors, the hub can organize the measurement and treatment data into a single display and alarm system, effectively and efficiently bringing order to the monitoring environment.
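The cable-memory mechanism can be illustrated with a short sketch. The disclosure does not specify the descriptor format; here the cable memory is modeled, purely as an assumption, as holding (parameter name, binary format) pairs that tell the hub how to decode the device's existing output stream without reprogramming either side.

```python
import struct

# Sketch: a hypothetical descriptor stored in the cable-side memory of a
# universal medical connector, and the hub-side decoding it enables.

CABLE_DESCRIPTOR = [
    ("pulse_rate", "<H"),   # unsigned 16-bit, little-endian
    ("spo2", "<B"),         # unsigned 8-bit
]

def decode_frame(descriptor, payload):
    """Decode one device output frame according to the cable-supplied descriptor."""
    values, offset = {}, 0
    for name, fmt in descriptor:
        (values[name],) = struct.unpack_from(fmt, payload, offset)
        offset += struct.calcsize(fmt)
    return values

# e.g. a device frame carrying pulse rate 72 bpm and SpO2 98%:
frame = struct.pack("<HB", 72, 98)
decoded = decode_frame(CABLE_DESCRIPTOR, frame)
```

Because the descriptor travels with the cable, a "new-to-the-hub" device only requires a newly programmed cable, not a hub software upgrade.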
As the hub can receive and track data from other devices according to a channel paradigm, the hub may advantageously provide processing to create virtual channels of patient measurement or treatment data. A virtual channel may comprise a non-measured parameter that is, for example, the result of processing data from various measured or other parameters. An example of such a parameter includes a wellness indicator derived from various measured parameters that give an overall indication of the wellbeing of the monitored patient. An example of a wellness parameter is disclosed in U.S. patent application Ser. Nos. 13/269,296, 13/371,767 and 12/904,925, by the assignee of the present disclosure and incorporated by reference herein. By organizing data into channels and virtual channels, the hub may advantageously time-wise synchronize incoming data and virtual channel data.
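A virtual channel of this kind can be sketched as a function over measured channels. The scoring scheme below (closeness of each parameter to a nominal value) is an illustrative assumption only; the actual wellness indicators are those described in the incorporated applications.

```python
# Sketch: a hypothetical "wellness" virtual channel computed from
# measured parameters. Nominal values and tolerances are illustrative.

def wellness_index(parameters, nominal):
    """Average per-parameter closeness to a nominal value (1.0 = nominal)."""
    scores = []
    for name, value in parameters.items():
        target, tolerance = nominal[name]
        deviation = min(abs(value - target) / tolerance, 1.0)
        scores.append(1.0 - deviation)
    return sum(scores) / len(scores)

NOMINAL = {
    "spo2": (98.0, 10.0),
    "pulse_rate": (75.0, 50.0),
    "resp_rate": (14.0, 10.0),
}
healthy = wellness_index(
    {"spo2": 98.0, "pulse_rate": 75.0, "resp_rate": 14.0}, NOMINAL)
```

Because the virtual channel is recomputed from channelized inputs, it can be time-wise synchronized with the measured channels that feed it.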
The hub can also receive serial data through serial communication ports, such as RJ connectors. The serial data can be associated with the monitored patient and passed on to the multi-patient server systems and/or caregiver backend systems discussed above. Through receiving the serial data, the caregiver advantageously can associate devices in the caregiver environment, often from varied manufacturers, with a particular patient, avoiding a need to have each individual device associated with the patient and possibly communicating with hospital systems. Such association can be vital as it reduces caregiver time spent entering biographic and demographic information into each device about the patient. Moreover, through the SDK the device manufacturer may advantageously provide information associated with any measurement delay of their device, thereby further allowing the hub to advantageously time-wise synchronize serial incoming data and other data associated with the patient.
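The delay-based synchronization can be illustrated as follows; the timestamped-sample representation is an assumption for illustration, not the hub's actual internal format.

```python
# Sketch: align device samples using a device-reported measurement delay
# (e.g., supplied via the SDK). Timestamps are in seconds; subtracting
# the delay places each sample at the time the measurement occurred.

def align_samples(samples, measurement_delay):
    """Shift (timestamp, value) pairs back by the device's reported delay."""
    return [(t - measurement_delay, v) for t, v in samples]

# A device that reports values 2 s after measuring them:
raw = [(10.0, 97), (11.0, 98)]
aligned = align_samples(raw, 2.0)
```

Once every channel is expressed on this common timeline, serial data, channel data, and virtual channel data can be compared sample-for-sample.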
When a portable patient monitor is docked, and it includes its own display, the hub can effectively increase its display real estate. For example, the portable patient monitor may simply continue to display its measurement and/or treatment data, which may be now duplicated on the hub display, or the docked display may alter its display to provide additional information. The docked display, when docked, presents anatomical graphical data of, for example, the heart, lungs, organs, the brain, or other body parts being measured and/or treated. The graphical data may advantageously animate similar to and in concert with the measurement data. For example, lungs may inflate in approximate correlation to the measured respiration rate and/or the determined inspiration/expiration portions of a respiration cycle, the heart may beat according to the pulse rate, may beat generally along understood actual heart contraction patterns, the brain may change color or activity based on varying depths of sedation, or the like. When the measured parameters indicate a need to alert a caregiver, a changing severity in color may be associated with one or more displayed graphics, such as the heart, lungs, brain, organs, circulatory system or portions thereof, respiratory system or portions thereof, other body parts or the like. The body portions may include animations on where, when or how to attach measurement devices.
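The timing of such parameter-driven animation can be sketched briefly. The sinusoidal inflation profile below is an illustrative assumption for "approximate correlation" with respiration rate, not the actual rendering model.

```python
import math

# Sketch: an inflation fraction in [0, 1] that completes one
# inflate/deflate cycle per measured breath.

def lung_inflation(time_s, respiration_rate_bpm):
    """Inflation fraction at time_s for the given breaths per minute."""
    period = 60.0 / respiration_rate_bpm   # seconds per breath
    phase = (time_s % period) / period     # 0..1 through the cycle
    return 0.5 - 0.5 * math.cos(2 * math.pi * phase)

# At 12 breaths/minute one cycle lasts 5 s: deflated at t=0 s,
# fully inflated at t=2.5 s.
at_start = lung_inflation(0.0, 12)
at_peak = lung_inflation(2.5, 12)
```

The same phase value could drive a heartbeat animation from pulse rate, or select an alert color as a measured parameter crosses severity thresholds.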
The hub may also advantageously overlap parameter displays to provide additional visual information to the caregiver. Such overlapping may be user definable and configurable. The display may also incorporate analog-appearing icons or graphical indicia.
In the interest of clarity, not all features of an actual implementation are described in this specification. An artisan will of course appreciate that in the development of any such actual implementation (as in any development project), numerous implementation-specific decisions must be made to achieve a developer's specific goals and subgoals, such as compliance with system- and business-related constraints, which will vary from one implementation to another. Moreover, it will be appreciated that such a development effort might be complex and time-consuming, but would nevertheless be a routine undertaking of device engineering for those of ordinary skill having the benefit of this disclosure.
To facilitate a complete understanding of the disclosure, the remainder of the detailed description describes the disclosure with reference to the drawings, wherein like reference numerals denote like elements throughout.
The display 104 may present a wide variety of measurement and/or treatment data in numerical, graphical, waveform, or other display indicia 110. The display 104 can occupy much of a front face of the housing 108, although an artisan will appreciate the display 104 may include a tablet or tabletop horizontal configuration, a laptop-like configuration or the like. Display information and data can be communicated to a tablet computer, smartphone, television, or any display system recognizable to an artisan. The upright inclined configuration of
The housing 108 may also include pockets or indentations to hold additional medical devices, such as, for example, a blood pressure monitor or temperature sensor 112, such as that shown in
The portable patient monitor 102 of
The docking station 106 of the hub 100 can include a mechanical latch 118, or mechanically releasable catch to ensure that movement of the hub 100 does not mechanically detach the monitor 102 in a manner that could damage the same.
Although disclosed with reference to particular portable patient monitors 102, an artisan will recognize from the disclosure herein a large number and wide variety of medical devices that may advantageously dock with the hub 100. Moreover, the docking station 106 may advantageously electrically and not mechanically connect with the monitor 102, and/or wirelessly communicate with the same.
As disclosed herein, the portable patient monitor 102 can communicate with the hub 100 through the docking station 106 when docked and wirelessly when undocked; however, such undocked communication is not required. The hub 100 can communicate with one or more multi-patient monitoring servers 204 or server systems, such as, for example, those disclosed in U.S. Pat. Pub. Nos. 2011/0105854, 2011/0169644, and 2007/0180140. In general, the server 204 can communicate with caregiver backend systems 206 such as EMR and/or ADT systems. The server 204 may advantageously obtain through push, pull or combination technologies patient information entered at patient admission, such as demographical information, billing information, and the like. The hub 100 can access this information to seamlessly associate the monitored patient with the caregiver backend systems 206. Communication between the server 204 and the monitoring hub 100 may be any recognizable to an artisan from the disclosure herein, including wireless, wired, over mobile or other computing networks, or the like.
Although disclosed with reference to a single docking station 106, the environment 200 may include stacked docking stations where a subsequent docking station mechanically and electrically docks to a first docking station to change the form factor for a different portable patient monitor as discussed with reference to
An artisan will recognize from the disclosure herein that the instrument board 302 may comprise a large number of electronic components organized in a large number of ways. Using different boards such as those disclosed above advantageously provides organization and compartmentalization to the complex system.
The housing 108 of the hub 100 can also include a cavity 406 housing the docking station 400. To the extent a change to the form factor for the portable patient monitor 102 occurs, the docking station 400 can be advantageously removable and replaceable. Similar to the docking station 400, the hub 100 can include, within the cavity 406 of the housing 108, electrical connectors 408 providing electrical communication to the docking station 400. The docking station 400 can include its own microcontroller and processing capabilities, such as those disclosed in U.S. Pat. Pub. No. 2002/0140675. The docking station 400 can also (or instead) pass communications through to the electrical connector 408.
Moreover, using the memory 702, the host 602 may determine to simply not enable any unused power supplies, whether that be the isolated power or one or more of the higher voltage non-isolated power supplies, thereby increasing the efficiency of the host.
An artisan will recognize from the disclosure herein that the hub 100 may not check to see if sufficient power is available or may provide one, two or many levels of non-isolated voltages based on information from the memory 702.
As shown from a different perspective in
Such open architecture can advantageously provide device manufacturers the ability to port the output of their device into the hub 100 for display, processing, and data management as disclosed in the foregoing. By implementation through the cable connector, the device manufacturer can avoid any reprogramming of their original device; rather, they simply let the hub 100 know through the cable connector how the already existing output is formatted. Moreover, by describing the data in a language already understood by the hub 100, the hub 100 also avoids software upgrades to accommodate data from “new-to-the-hub” medical devices.
Once the device provider describes the data, the hub provider can create a binary image or other file to store in a memory within a cable connector in step 1405; however, the SDK may create the image and simply communicate it to the hub provider. The cable connector can be provided as an OEM part to the provider in step 1410, who constructs and manufactures the cable to mechanically and electrically mate with output ports on their devices in step 1412.
Once a caregiver has the appropriately manufactured cable, with one end matching the device provider's system and the other OEM'ed to match the hub 100 at its channel ports 212, in step 1452 the caregiver can connect the cable between the devices. In step 1454, the hub 100 reads the memory, provides isolated or non-isolated power, and the cable controller and the hub 100 negotiate a protocol or schema for data delivery. A controller on the cable can negotiate the protocol. The controller of the hub 100 can negotiate the particular protocol with other processors on the hub. Once the protocol is set, the hub 100 can use, display and otherwise process the incoming data stream in an intelligent manner.
Through the use of the universal medical connectors described herein, connection of a myriad of devices to the hub 100 can be accomplished through straightforward programming of a cable connector as opposed to necessitating software upgrades to each device.
In
In
For example, acoustic data from an acoustic sensor may advantageously provide breath sound data, while the plethysmograph and ECG or other signals can also be presented in separate waveforms (
Providing a visual correlation between multiple physiological signals can provide a number of valuable benefits where the signals have some observable physiological correlation. As one example of such a correlation, changes in morphology (e.g., envelope and/or baseline) of the plethysmographic signal can be indicative of patient blood or other fluid levels. And, these changes can be monitored to detect hypovolemia or other fluid-level related conditions. A pleth variability index may provide an indication of fluid levels, for example. And, changes in the morphology of the plethysmographic signal are correlated to respiration. For example, changes in the envelope and/or baseline of the plethysmographic signal are correlated to breathing. This is at least in part due to aspects of the human anatomical structure, such as the mechanical relationship and interaction between the heart and the lungs during respiration.
Thus, superimposing a plethysmographic signal and a respiratory signal (
The monitor may also be configured to process the signals and determine whether there is a threshold level of correlation between the two signals, or otherwise assess the correlation. However, by additionally providing a visual indication of the correlation, such as by showing the signals superimposed with one another, the display provides operators a continuous, intuitive and readily observable gauge of the particular physiological correlation. For example, by viewing the superimposed signals, users can observe trends in the correlation over time, which may not be otherwise ascertainable.
The monitor can visually correlate a variety of other types of signals instead of, or in addition to plethysmographic and respiratory signals. For example,
The hub 100 can provide a user interface through which the user can move the signals together to overlay on one another. For example, the user may be able to drag the respiration signal down onto the plethysmographic signal using a touch screen interface. Conversely, the user may be able to separate the signals, also using the touch screen interface. The monitor can include a button the user can press, or some other user interface allowing the user to overlay and separate the signals, as desired.
In certain configurations, in addition to providing the visual correlation between the plethysmographic signal and the respiratory signal, the monitor can additionally be configured to process the respiratory signal and the plethysmographic signal to determine a correlation between the two signals. For example, the monitor may process the signals to determine whether the peaks and valleys in the changes in the envelope and/or baseline of the plethysmographic signal correspond to bursts in the respiratory signal. And, in response to the determining that there is or is not a threshold level of correlation, the monitor may provide some indication to the user. For example, the monitor may provide a graphical indication (e.g., a change in color of pleth variability index indicator), an audible alarm, or some other indication. The monitor may employ one or more envelope detectors or other appropriate signal processing componentry in making the determination.
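The envelope-based correlation check described above can be sketched as follows. This is a minimal illustration using synthetic signals, a crude sliding-window envelope detector, and a hypothetical correlation threshold; it is not the monitor's actual algorithm:

```python
import math

def envelope(signal, window=5):
    """Crude envelope: max of absolute value over a sliding window."""
    n = len(signal)
    return [max(abs(v) for v in signal[max(0, i - window):i + window + 1])
            for i in range(n)]

def correlation(a, b):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = math.sqrt(sum((x - ma) ** 2 for x in a))
    vb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (va * vb)

# Synthetic signals sampled at 100 Hz: a plethysmograph whose amplitude
# is modulated at the breathing rate, and a respiration waveform at that
# same rate.
t = [i / 100.0 for i in range(1000)]
resp = [math.sin(2 * math.pi * 0.25 * x) for x in t]           # 15 breaths/min
pleth = [(1.0 + 0.3 * r) * math.sin(2 * math.pi * 1.2 * x)     # 72 bpm carrier
         for x, r in zip(t, resp)]

r = correlation(envelope(pleth, window=40), resp)
print(f"envelope/respiration correlation: {r:.2f}")
correlated = r > 0.5  # hypothetical threshold for indicating correlation
```

When `correlated` is true, a monitor of the kind described above could change the color of a pleth variability index indicator or raise an audible alarm on loss of correlation.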
The system may further provide an audible indication of the patient's breathing sounds instead of, or in addition to the graphical indication. For example, the monitor may include a speaker, or an earpiece (e.g., a wireless earpiece) that may be provided to the monitoring personnel that can provide an audible output of the patient sounds. Examples of sensors and monitors having such capability are described in U.S. Pat. Pub. No. 2011/0172561, the disclosure of which is incorporated by reference herein.
In addition to the above described benefits, providing both the acoustic and plethysmographic signals on the same display in the manner described can allow monitoring personnel to more readily detect respiratory pause events in which breathing is absent, high ambient noise that can degrade the acoustic signal, improper sensor placement, and the like.
The user interface of the hub 100 can further be modified by the user. A user can modify any of the user interfaces described herein, such as the user interfaces of
Each analog indicator of the health indicator can include a dial that moves about an arc based on measured levels of monitored physiological parameters. As the measured physiological parameter levels increase, the dial can move clockwise, and as the measured physiological parameter levels decrease, the dial can move counter-clockwise, or vice versa. In this way, a user can quickly determine the patient's status by looking at the analog indicator. For example, if the dial is in the center of the arc, the observer can be assured that the current physiological parameter measurements are normal, and if the dial is skewed too far to the left or right, the observer can quickly assess the severity of the physiological parameter levels and take appropriate action. Normal parameter measurements can also (or instead) be indicated when the dial is to the right or left, etc.
The dial can be implemented as a dot, dash, arrow, or the like, and the arc can be implemented as a circle, spiral, pyramid, or other shape, as desired. Furthermore, the entire arc can be lit up or only portions of the arc can be lit up based on the current physiological parameter measurement level. Furthermore, the arc can turn colors or be highlighted based on the current physiological parameter level. For example, as the dial approaches a threshold level, the arc and/or dial can turn from green, to yellow, to red, shine brighter, flash, be enlarged, move to the center of the display, or the like.
Different physiological parameters can have different thresholds indicating abnormal conditions. For example, some physiological parameters may have upper and lower threshold levels, while others only have an upper threshold or a lower threshold. Accordingly, each health indicator can be adjusted based on the physiological parameter being monitored. For example, the SpO2 health indicator can have a lower threshold that when met activates an alarm, while the respiration rate health indicator can have both a lower and upper threshold, and when either is met an alarm is activated. The thresholds for each physiological parameter can be based on typical, expected thresholds and/or user-specified thresholds.
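The dial position and threshold-based coloring described above can be sketched as follows. The mapping, color bands, and example SpO2 range are hypothetical and illustrative only:

```python
def dial_state(value, alarm_low, alarm_high, arc_degrees=180.0):
    """Return (angle, color) for an analog dial on an arc.

    The dial sits at the arc center for a mid-range value and moves
    toward an end of the arc as either alarm threshold is approached.
    """
    # Clamp the value into the displayable range.
    v = min(max(value, alarm_low), alarm_high)
    frac = (v - alarm_low) / (alarm_high - alarm_low)  # 0.0 .. 1.0
    angle = frac * arc_degrees                          # 0 = far left
    # Color the arc or dial based on proximity to either threshold.
    margin = min(frac, 1.0 - frac)
    if margin <= 0.05:
        color = "red"
    elif margin <= 0.15:
        color = "yellow"
    else:
        color = "green"
    return angle, color

# Example: SpO2 displayed between a hypothetical 85% and 100% range.
print(dial_state(92.5, 85, 100))  # -> (90.0, 'green'), dial at arc center
print(dial_state(86, 85, 100))    # dial near the low threshold, 'yellow'
```

A parameter with only an upper or only a lower threshold, as noted above, could use the same mapping with the unused end of the arc treated as normal.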
The digital indicator can provide a numerical representation of the current levels of the physiological parameter. The digital indicator may indicate an actual level or a normalized level and can also be used to quickly assess the severity of a patient condition. The display can include multiple health indicators for each monitored physiological parameter. The display can also include fewer health indicators than the number of monitored physiological parameters. The health indicators can cycle between different monitored physiological parameters.
The term “and/or” herein has its broadest, least limiting meaning, which is that the disclosure includes A alone, B alone, both A and B together, or A or B alternatively, but does not require both A and B or require one of A or one of B. As used herein, the phrase “at least one of A, B, and C” should be construed to mean a logical A or B or C, using a non-exclusive logical or.
The term “plethysmograph” includes its ordinary broad meaning known in the art, which includes data responsive to changes in volume within an organ or whole body (usually resulting from fluctuations in the amount of blood or air it contains).
As described above, the hub 100 may receive serial data from a variety of medical equipment, including the patient's bed 214, infusion pumps 216, a ventilator 218, and other vital signs monitors 220. The hub 100 can pass serial data from these sources on to the MMS 2004. As described above, the MMS 2004 may then store the serial data in a caregiver backend system 206 such as an EMR system or ADT system.
The medical equipment providing this serial data may use a variety of different proprietary protocols, messaging infrastructure, and the like that may not be natively recognizable by the hub 100. Accordingly, the hub 100 may not have native capability to read parameter values or other data from this medical equipment, and as a result, may not have the capability to display parameter values or other data from these devices. Advantageously, however, the translation module 2005 at the MMS 2004 can receive serial data from these devices, translate the serial data into a format recognizable by the monitoring hub 100, and provide the serial data to the monitoring hub 100. The monitoring hub 100 can then read parameter values and other data from the translated information and output these values or data to a display, such as any of the displays described above.
The translation module 2005 can apply one or more translation rules to the serial data to translate or transform the serial data from one format to another format. The serial data may be formatted according to a Health Level Seven (“HL7”) protocol. The HL7 protocol has been developed to provide a messaging framework for the communication of clinical messages between medical computer systems and devices. However, the HL7 standard is quite flexible and merely provides a framework of guidelines. Consequently, medical devices or clinical computer systems that are all HL7-compliant may still be unable to communicate with each other. For example, the medical equipment 214-220 may each implement a version of the HL7 protocol, but these implementations may be different from an HL7 protocol implemented by the monitoring hub 100. Accordingly, the monitoring hub 100 may not be able to parse or read messages from the medical equipment 214-220, even though both use the HL7 standard. Further, the translation module 2005 may also translate between different implementations of a common standard, other than the HL7 protocol, that is implemented by the hub 100 and the medical equipment 214-220.
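The kind of implementation divergence described above can be illustrated as follows. HL7 v2 messages declare their own field separator in the MSH segment, so two compliant devices may nonetheless emit differently delimited messages; the messages below are simplified, hypothetical examples rather than complete HL7 transcripts:

```python
def parse_hl7_segments(message):
    """Split an HL7 v2 message into segments keyed by segment name,
    using the field separator the message itself declares."""
    field_sep = message[3]  # MSH-1: the character right after "MSH"
    segments = {}
    for line in message.strip().split("\r"):
        fields = line.split(field_sep)
        segments[fields[0]] = fields
    return segments

# Implementation A uses the conventional '|' field separator...
msg_a = ("MSH|^~\\&|DEVICE_A|ICU||HUB|20240101||ORU^R01|1|P|2.3\r"
         "OBX|1|NM|SPO2||98|%")
# ...while implementation B (hypothetically) declares '!' instead.
msg_b = ("MSH!^~\\&!DEVICE_B!ICU!!HUB!20240101!!ORU^R01!2!P!2.3\r"
         "OBX!1!NM!SPO2!!97!%")

for msg in (msg_a, msg_b):
    obx = parse_hl7_segments(msg)["OBX"]
    print(obx[3], "=", obx[5], obx[6])  # observation id, value, units
```

A fixed parser expecting only one of these delimiters would fail on the other message, which is the interoperability gap the translation module 2005 addresses.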
In addition to translating between different implementations of a common electronic medical communication protocol (e.g., different formatting of HL7 messages), the translation module 2005 can also translate between input and output messages adhering to different communication protocols. The translation module 2005 can be capable of responding to and translating messages from, for example, one medical communication protocol to a separate medical communication protocol. For example, the translation module 2005 can facilitate communication between messages sent according to the HL7 protocol, the ISO 11073 protocol, other open protocols, or proprietary protocols. Accordingly, the translation module 2005 can translate an input message sent according to the HL7 protocol to an output message according to a different protocol, or vice-versa. The translation module 2005 can implement any of the translation features described below in greater detail under the section entitled “Translation Module Embodiments,” as well as further in U.S. application Ser. No. 14/032,132, filed Sep. 19, 2013, titled “Medical Monitoring System,” the disclosure of which is hereby incorporated by reference in its entirety.
Advantageously, the translation module 2005 can pass translated serial data back to the hub 100 or PPM 102. Since the translated data is in a format readable by the hub 100 or PPM 102, the hub 100 or PPM 102 can output the data from the medical equipment 214-220 on the display of the hub 100 or PPM 102. In addition, the translation module 2005 can provide the translated data to devices other than the hub 100, including clinician devices (such as cell phones, tablets, or pagers) and an auxiliary device 2040 that will be described below. Moreover, since the serial data provided by the medical equipment 214-220 may include alarm notifications, the translation module 2005 can pass these alarm notifications to the hub 100 or PPM 102. The hub 100 or PPM 102 can therefore generate visual or audible alarms responsive to these alarm notifications. Further, the translation module 2005 can provide the alarm notifications to clinician devices, e.g., over a hospital network or wide area network (such as the Internet). In addition, the translation module 2005 can provide the alarm notifications to the auxiliary device 2040.
The translation module 2005 is shown as implemented in the MMS 2004 because it may be beneficial to maintain and update the translation rules of the translation module 2005 in a single location. However, the translation module 2005 may also be (or instead be) implemented in the hub 100 or PPM 102. Accordingly, the hub 100 or PPM 102 can access an internal translation module 2005 to translate serial data for output to the display of the hub 100 or PPM 102.
The auxiliary device 2040 can be a computing device having physical computer hardware, a display, and the like. For example, the auxiliary device 2040 may be a handheld computing device used by a clinician, such as a tablet, laptop, cellphone or smartphone, personal digital assistant (PDA), a wearable computer (such as a smart watch or glasses), or the like. The auxiliary device 2040 may also be simply a display device, such as a computer monitor or digital television. The auxiliary device 2040 can provide second screen functionality for the hub 100, PPM 102, or MMS 2004. As such, the auxiliary device 2040 can communicate wirelessly or through a wired connection with the hub 100, MMS 2004, or PPM 102.
As a second screen device, the auxiliary device 2040 can depict a copy of at least a portion of the display of the hub 100 (or the PPM 102) or a different version of the hub 100 (or the PPM 102) display. For instance, the auxiliary device 2040 can receive physiological parameter data, trend data, or waveforms from the hub 100, PPM 102, or MMS 2004 and display the parameter data, trend data, or waveforms. The auxiliary device 2040 can output any information available to the hub 100, PPM 102, or MMS 2004. One use of the auxiliary device 2040 is as a clinician device usable by a clinician to view data from the hub 100, PPM 102, or MMS 2004 while away from a patient's room (or even while in a patient's room). A clinician can use the auxiliary device 2040 to view more detailed information about physiological parameters than is displayed on the hub 100 or PPM 102 (see, e.g.,
One example reason for copying at least a portion of the display of the hub 100 or PPM 102 is to enable different clinicians to have the same view of the data during a surgical procedure. In some surgical procedures, for instance, two anesthesiologists monitor a patient, one anesthesiologist monitoring the brain function and brain oxygenation of the patient, while the other monitors peripheral oxygenation of the patient. A brain sensor, such as has been described above, may be attached to the patient and provide brain monitoring and oxygenation data that is output to the hub 100 or the PPM 102 for presentation to the first anesthesiologist. A finger or toe/foot optical sensor can also be attached to the patient and output data to the hub 100 or PPM 102. The hub 100 or PPM 102 can transmit this data to the auxiliary device 2040, which the second anesthesiologist can monitor to observe oxygenation in the patient's peripheral limbs. The second anesthesiologist may also need to know the oxygenation at the brain to help interpret the seriousness or lack thereof of poor peripheral oxygenation values. However, in many surgical procedures, a curtain or screen is placed over the patient as part of the procedure, blocking the second anesthesiologist's view of the hub 100 or PPM 102. Accordingly, the hub 100 or PPM 102 can output a copy of at least a portion of its display to the auxiliary device 2040 so that the second anesthesiologist can monitor brain function or oxygenation.
The auxiliary device can have a larger display area than the display of the hub 100. For instance, the hub 100 may have a relatively smaller display, such as about 10 inches, while the auxiliary device 2040 may be a television monitor or the like that has a 40 inch or larger display (although any size display may be used for the auxiliary device 2040). The auxiliary device 2040 can be a television that can include a hardware module that includes a processor, memory, and a wireless or wired networking interface or the like. The processor can execute programs from the memory, including programs for displaying physiological parameters, trends, and waveforms on the display of the television. Since a television monitor can be larger than the hub 100, the television monitor version of the auxiliary device 2040 can display more fine detail of patient waveforms and trends (see, e.g.,
The auxiliary device 2040 may display one portion of any of the displays described herein while the hub 100 displays another portion thereof. For instance, the auxiliary device 2040 may display any of the anatomical graphics described above with respect to
The auxiliary device 2040 can also perform at least some processing of physiological parameters, including any of the functionality of the monitoring hub 100. For instance, the auxiliary device 2040 may include the translation module 2005 and perform the features thereof.
For example, a first medical device having digital logic circuitry receives a physiological signal associated with a patient from a physiological sensor, obtains a first physiological parameter value based on the physiological signal, and outputs the first physiological parameter value for display. The first medical device can also receive a second physiological parameter value from a second medical device other than the first medical device, where the second physiological parameter value is formatted according to a protocol not used by the first medical device, such that the first medical device is not able to process the second physiological parameter value to produce a displayable output value. The first medical device can pass the physiological parameter data from the first medical device to a separate translation module, receive translated parameter data from the translation module at the first medical device, where the translated parameter data is able to be processed for display by the first medical device, and output a second value from the translated parameter data for display. The first medical device may be, for example, the hub 100, PPM 102, or MMS 2004, and the second medical device may be the infusion pump 216 or ventilator 218 or the like.
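The flow just described might be sketched as follows. The record formats, the `"NAME;VALUE;UNITS"` layout, and all names are hypothetical illustrations, not actual device protocols:

```python
import json

def translation_module(raw):
    """Hypothetical translation: 'NAME;VALUE;UNITS' -> the hub's native dict."""
    name, value, units = raw.split(";")
    return {"parameter": name, "value": float(value), "units": units}

def hub_display(raw):
    """Render a record, routing non-native formats through the translation
    module, as the first medical device does in the example above."""
    if raw.lstrip().startswith("{"):
        record = json.loads(raw)          # native format: parse directly
    else:
        record = translation_module(raw)  # unfamiliar format: translate first
    return f"{record['parameter']}: {record['value']} {record['units']}"

# A natively formatted record from the first device's own sensor, and a
# foreign record from a second device (e.g., a ventilator).
print(hub_display('{"parameter": "SpO2", "value": 98, "units": "%"}'))
print(hub_display("RR;18;breaths/min"))
```

The point of the sketch is the routing: the first device never needs to understand the second device's format itself, only the translated output.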
Turning to
In each of the displays of
Referring again to
Any of the following features described with respect to
Healthcare costs have been increasing and the demand for reasonably-priced, high-quality patient care is also on the rise. Health care costs can be reduced by increasing the effectiveness of hospital information systems. One factor which may affect the efficacy of a health institution is the extent to which the various clinical computer systems employed at the health institution can interact with one another to exchange information.
Hospitals, patient care facilities, and healthcare provider organizations typically include a wide variety of different clinical computer systems for the management of electronic healthcare information. Each of the clinical computer systems of the overall IT or management infrastructure can help fulfill a particular category or aspect of the patient care process. For example, a hospital can include patient monitoring systems, medical documentation and/or imaging systems, patient administration systems, electronic medical record systems, electronic practice management systems, business and financial systems (such as pharmacy and billing), and/or communications systems, etc.
The quality of care in a hospital or other patient care facility could be improved if each of the different clinical computer systems across the IT infrastructure (or even within the same hospital room; see, e.g.,
In current practice, individual clinical computer systems can be, and often are, provided by different vendors. As a result, individual clinical computer systems may be implemented using a proprietary network or communication infrastructure, proprietary communication protocols, and the like, so the various clinical computer systems used in the hospital cannot always effectively communicate with each other.
Medical device and medical system vendors sometimes develop proprietary systems that cannot communicate effectively with medical devices and systems of other vendors in order to increase their market share and to upsell additional products, systems, and/or upgrades to the healthcare provider. Thus, healthcare providers are forced to make enterprise or system-wide purchase decisions, rather than selecting the best technology available for each type of individual clinical computer system in use.
One example where this occurs is in the area of life-saving technology available for patient monitoring. For example, many different bedside devices for monitoring various physiological parameters are available from different vendors or providers. One such provider may offer a best-in-class device for monitoring a particular physiological parameter, while another such provider may offer the best-in-class device for another physiological parameter. Accordingly, it may be desirable in some circumstances for a hospital to have the freedom to use monitoring devices from more than one manufacturer, but this may not be possible if devices from different manufacturers are incapable of interfacing and exchanging patient information. Accordingly, the ability to provide reasonably-priced, high-quality patient care can be compromised. In addition, since each hospital or patient care facility may also implement its own proprietary communication protocols for its clinical computer network environment, the exchange of information can be further hindered.
As described above, the Health Level Seven (“HL7”) protocol has been developed to provide a messaging framework for the communication of clinical messages between medical computer systems and devices. The HL7 communication protocol specifies a number of standards, guidelines, and methodologies which various HL7-compliant clinical computer systems can use to communicate with each other.
The HL7 communication protocol has been adopted by many medical device manufacturers. However, the HL7 standard is quite flexible, and merely provides a framework of guidelines (e.g., the high-level logical structure of the messages); consequently, each medical device or medical system manufacturer or vendor may implement the HL7 protocol somewhat differently while still remaining HL7-compliant. For example, the format of the HL7 messages can be different from implementation to implementation, as described more fully herein. In some cases, the HL7 messages of one implementation can also include information content that is not included in messages according to another HL7 implementation. Accordingly, medical devices or clinical computer systems that are all HL7-compliant still may be unable to communicate with each other.
Consequently, a translation module can be provided that can improve the communication of medical messages between medical devices or systems that use different allowed implementations of an established communication protocol (e.g., HL7), thereby increasing the quality of patient care through the integration of multiple clinical computer systems.
The translation module 2415 receives input messages having the first protocol format from the first medical device 2405 and generates output messages to the second medical device 2410 having the second protocol format. The translation module 2415 also receives input messages having the second protocol format from the second medical device 2410 and generates output messages to the first medical device 2405 having the first protocol format. Thus, the translation module 2415 can enable the first and second medical devices 2405, 2410 to effectively and seamlessly communicate with one another without necessarily requiring modification to the communication equipment or protocol implemented by each device.
The translation module 2415 can determine the protocol format expected by an intended recipient of the input message based on, for example, the information in the input message or by referencing a database that stores the protocol format used by various devices, and then generates the output message based on the protocol format used by the intended recipient device or system. The output message can be generated based upon a comparison with, and application of, a set of translation rules 2420 that are accessible by the translation module 2415.
The translation rules 2420 can include rules that govern how to handle possible variations between formatting implementations within a common protocol. Examples of variations in formatting implementation of an electronic medical communication protocol include, for example, the delimiter or separator characters that are used to separate data fields, whether a particular field is required or optional, the repeatability of portions of the message (e.g., segments, fields, components, sub-components), the sequence of portions of the message (e.g., the order of fields or components), whether a particular portion of a message is included, the length of the message or portions of the message, and the data type used for the various portions of the message.
The translation rules 2420 can define additions, deletions, swaps, and/or modifications that should be performed in order to “translate” an input message that adheres to a first HL7 implementation into an output message that adheres to a second HL7 implementation. The output message can have, for example, different formatting than the input message, while maintaining all, or a portion of, the substance or content of the input message.
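A rule set of this kind might be sketched as a sequence of formatting rewrites applied in order, each preserving message content. The segment layout, delimiters, and rules below are invented for illustration and do not correspond to any particular HL7 implementation:

```python
def translate(message, rules):
    """Apply translation rules in order, rewriting formatting, not content."""
    for rule in rules:
        message = rule(message)
    return message

# Rule 1: the input implementation separates fields with ';', while the
# output implementation expects '|'.
def swap_delimiter(msg):
    return msg.replace(";", "|")

# Rule 2: the output implementation expects patient id before patient name.
def reorder_fields(msg):
    f = msg.split("|")
    f[1], f[2] = f[2], f[1]
    return "|".join(f)

# Rule 3: the output implementation requires a trailing optional field,
# which may be left empty.
def add_optional_field(msg):
    return msg + "|"

inbound = "PID;DOE^JOHN;12345;19700101"
outbound = translate(inbound, [swap_delimiter, reorder_fields, add_optional_field])
print(outbound)  # PID|12345|DOE^JOHN|19700101|
```

Each rule corresponds to one of the variation categories listed above (delimiter characters, field sequence, optional fields), and the composition of rules yields the full translation.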
In addition to translating between different implementations of a common electronic medical communication protocol (e.g., different formatting of HL7 messages), the translation module 2415 can also translate between input and output messages adhering to different communication protocols. The translation module 2415 can be capable of responding to and translating messages from, for example, one medical communication protocol to a separate medical communication protocol. For example, the translation module 2415 can facilitate communication between messages sent according to the HL7 protocol, the ISO 11073 protocol, other open protocols, and/or proprietary protocols. Accordingly, an input message sent according to the HL7 protocol can be translated to an output message according to a different protocol, or vice-versa.
The operation of the translation module 2415 and the translation rules 2420 will be described in more detail below. Various examples of system architectures including the translation module 2415 will now be described.
The first medical device 2405, the second medical device 2410, and the translation module 2415 can be communicatively coupled via connection to a common communications network or directly (via cables or wirelessly), for example, through the hub 100, PPM 102, and/or MMS 2004. The translation module 2415 can be communicatively coupled between the first medical device 2405 and the second medical device 2410 (with or without a communications network) such that all messages between the first and second medical devices 2405, 2410 are routed through the translation module 2415. Other architectures are also possible.
The first and second medical devices 2405, 2410 and the translation module 2415 can be included in, for example, a portion of the monitoring environments of
The translation module 2415 can facilitate communication across multiple networks within a hospital environment. The translation module 2415 can also (or instead) facilitate communication of messages across one or more networks extending outside of the hospital or clinical network environment. For example, the translation module 2415 can provide a communications interface with banking institutions, insurance providers, government institutions, outside pharmacies, other hospitals, nursing homes, or patient care facilities, doctors' offices, and the like.
The translation module 2415 of
The translation module 2415 can also facilitate communication between a first medical device that is part of the patient monitoring sub-system and a second medical device that is not part of, or is external to, the patient monitoring system 200. As such, the translation module 2415 can be capable of responding to externally-generated medical messages (such as patient information update messages, status query messages, and the like from an HIS or CIS) and generating external reporting messages (such as event reporting messages, alarm notification messages, and the like from patient monitors or nurses' monitoring stations).
The first and second medical devices 2405, 2410 can communicate with each other over a communication bus 2421. The communication bus 2421 can include any one or more of the communication networks, systems, and methods described above, including the Internet, a hospital WLAN, a LAN, a personal area network, etc. For example, any of the networks described above can be used to facilitate communication between a plurality of medical devices, including first and second medical devices 2405, 2410, discussed above. One such example is illustrated in
In
The translation module 2415 monitors the communication bus 2421 for such messages. The translation module 2415 receives the message and determines that the first medical device 2405 is attempting to communicate with the second medical device 2410. The translation module 2415 determines that message translation would facilitate communication between the first and second medical devices 2405, 2410. The translation module 2415 therefore utilizes an appropriate rule from the translation rules 2420, which can be stored in a memory, EPROM, RAM, ROM, etc.
The translation module 2415 translates the message from the first medical device 2405 according to any of the methods described herein. Once translated, the translation module 2415 delivers the translated message to the communication bus 2421. The second medical device 2410 receives the translated message and responds appropriately. For example, the second medical device may perform a function and/or attempt to communicate with the first medical device 2405. The translation module 2415 facilitates communication from the second medical device 2410 to the first medical device 2405 in a similar manner.
The first medical device 2405 and the second medical device 2410 can be, for example, any of the medical devices or systems communicatively coupled to a hospital network or hub 100, PPM 102, and/or MMS 2004. These medical devices or systems can include, for example, point-of-care devices (such as bedside patient monitors), data storage units or patient record databases, hospital or clinical information systems, central monitoring stations (such as a nurses' monitoring station), and/or clinician devices (such as pagers, cell phones, smart phones, personal digital assistants (PDAs), laptops, tablet PCs, personal computers, pods, and the like).
The first medical device 2405 can be a patient monitor that can be communicatively coupled to a patient for tracking a physiological parameter (e.g., oxygen saturation, pulse rate, blood pressure, etc.), and the second medical device 2410 can be a hospital information system (“HIS”) or clinical information system (“CIS”). The patient monitor can communicate physiological parameter measurements, physiological parameter alarms, or other physiological parameter measurement information generated during the monitoring of a patient to the HIS or CIS for inclusion with the patient's electronic medical records maintained by the HIS or CIS.
The first medical device 2405 can be a HIS or CIS and the second medical device 2410 can be a nurses' monitoring station, as described herein. However, the translation module 2415 can facilitate communication between a wide variety of medical devices and systems that are used in hospitals or other patient care facilities. For example, the translation module 2415 can facilitate communication between patient physiological parameter monitoring devices, between a monitoring device and a nurses' monitoring station, etc.
Using the translation module 2415, a patient monitoring sub-system, such as those described herein (e.g., physiological monitoring system 200), can push data to the HIS or pull data from the HIS even if the HIS uses a different implementation of the HL7 protocol, or some other electronic medical communication protocol.
The patient monitoring sub-system can be configured to push/pull data at predetermined intervals. For example, a patient monitor or clinician monitoring station can download patient data automatically from the HIS at periodic intervals so that the patient data is already available when a patient is connected to a patient monitor. The patient data sent from the HIS can include admit/discharge/transfer (“ADT”) information received upon registration of the patient. ADT messages can be initiated by a hospital information system to inform ancillary systems that, for example, a patient has been admitted, discharged, transferred or registered, that patient information has been updated or merged, or that a transfer or discharge has been canceled.
The patient monitoring sub-system can also (or instead) be configured to push/pull data to/from the HIS only when the HIS is solicited by a query. For example, a clinician may make a request for information stored in a patient's electronic medical records on the HIS.
The patient monitoring sub-system can also (or instead) be configured to push/pull data to/from the HIS in response to an unsolicited event. For example, a physiological parameter of a patient being monitored can enter an alarm condition, which can automatically be transmitted to the HIS for storing in the patient's electronic medical records. Moreover, any combination of the above methods or alternative methods for determining when to communicate messages to and from the HIS can be employed.
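The three communication triggers described above (a periodic interval, a solicited query, and an unsolicited alarm event) can be combined in a simple decision function. The sketch below is illustrative only; the function name and parameters are hypothetical and not part of the system described.

```python
def should_communicate(now, last_sync, interval, query_pending, alarm_active):
    """Decide whether the patient monitoring sub-system should push/pull
    data to/from the HIS, combining the triggers described above: a
    periodic interval, a solicited query, or an unsolicited alarm event.
    (Hypothetical sketch; names and parameters are illustrative.)"""
    periodic_due = (now - last_sync) >= interval
    return periodic_due or query_pending or alarm_active

# A periodic sync is due once the interval elapses:
print(should_communicate(now=120, last_sync=0, interval=60,
                         query_pending=False, alarm_active=False))  # True
# An alarm condition forces communication even mid-interval:
print(should_communicate(now=10, last_sync=0, interval=60,
                         query_pending=False, alarm_active=True))   # True
```

Any of the three triggers suffices on its own, matching the "any combination of the above methods" behavior described above.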
Example system architectures and example triggers for the communication of messages involving the translation module 2415 have been described. Turning now to the operation of the translation module, an example message header is first described.
The message header (“MSH”) segment 2506 can define how the message is being sent, the field delimiters and encoding characters, the message type, the sender and receiver, etc. The first symbol or character after the MSH string can define the field delimiter or separator (in this message, a “caret” symbol). The next four symbols or characters can define the encoding characters. The first symbol defines the component delimiter (“~”), the second symbol defines the repeatable delimiter (“|”), the third symbol defines the escape delimiter (“\”), and the fourth symbol defines the sub-component delimiter (“&”). All of these delimiters can vary between HL7 implementations.
The example header segment 2506 can further include the sending application (“VAFC PIMS”), the receiving application (“NPTF-508”), the date/time of the message (“20091120104609-0600”), the message type (“ADT~A01”), the message control ID (“58103”), the processing ID (“P”), and the country code (“USA”). As represented by the consecutive caret symbols, the header segment can also contain multiple empty fields.
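The delimiter discovery described above can be sketched in code. The function below is a hypothetical simplification, not an HL7 library: it reads the field delimiter and the four encoding characters directly from the MSH segment, then splits the segment into fields. It ignores escape sequences and other details a real parser would handle.

```python
def parse_msh(segment: str) -> dict:
    """Read the delimiters an MSH segment declares about itself, then
    split the segment into fields (hypothetical simplification that
    ignores escape sequences)."""
    if not segment.startswith("MSH"):
        raise ValueError("not an MSH segment")
    field_delim = segment[3]                               # first symbol after "MSH"
    component, repeatable, escape, subcomp = segment[4:8]  # four encoding characters
    return {
        "field": field_delim,
        "component": component,
        "repeatable": repeatable,
        "escape": escape,
        "subcomponent": subcomp,
        "fields": segment.split(field_delim),
    }

# The caret-delimited example implementation described above:
msh = "MSH^~|\\&^VAFC PIMS^^NPTF-508^^20091120104609-0600"
parsed = parse_msh(msh)
print(parsed["field"])      # ^
print(parsed["component"])  # ~
print(parsed["fields"][2])  # VAFC PIMS
```

Note that after splitting, the sending application lands in the third parsed field and the receiving application in the fifth, consistent with the description of the encoded header below.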
The parsed input message can be encoded.
The encoded message header segment shows some of the various data types that can be used in the message. For example, the sending application (“VAFC PIMS”) of the third parsed field and the receiving application (“NPTF-508”) of the fifth parsed field are represented using a hierarchic designator (“HD”) name data type. The date/time field (the seventh parsed field) is represented using the time stamp (“TS”) data type. The processing ID field (the eleventh parsed field) is represented using the processing type (“PT”) data type. The fields that do not include a data type identifier are represented using the string (“ST”) data type. Other possible data types include, for example, coded element, structured numeric, timing quantity, text data, date, entry identifier, coded value, numeric, and sequence identification. The data types used for the various fields or attributes of the segments can vary between formatting implementations.
Turning to the operation of the translation module, the translation module 2415 can, for example, create, generate, or produce an output message that is reflective of the input message based on an application of the set of translation rules 2420. The translation module 2415 can, for example, translate, transform, convert, reformat, configure, change, rearrange, modify, adapt, alter, or adjust the input message based on a comparison with, and application of, the set of translation rules 2420 to form the output message. The translation module 2415 can, for example, replace or substitute the input message with an output message that retains the content of the input message but has a new formatting implementation based upon a comparison with, and application of, the set of translation rules 2420.
At block 2604, the translation module 2415 determines the formatting implementation of the input message and the formatting implementation to be used for the output message. The input message can include one or more identifiers indicative of the formatting implementation. The determination of the formatting implementation can be made, for example, by analyzing the message itself by identifying the delimiter or encoding characters used, the field order, the repeatability of segments, fields, or components, the data type of the fields, or other implementation variations. The translation module 2415 can separate or parse out the formatting from the content of the message to aid in this determination.
The determination of the formatting implementation used by the output message can also be determined from the input message. For example, the input message can include a field that identifies the intended recipient application, facility, system, device, and/or destination. The input message can alternatively include a field that identifies the type of message being sent (e.g., ADT message) and the translation module 2415 can determine the appropriate recipient from the type of message being sent and/or the sending application, device, or system. The translation module 2415 can then determine the formatting implementation required by the intended recipient of the input message.
At decision block 2605, the translation module 2415 determines whether a rule set has been configured for the translation from the identified formatting implementation of the input message to the identified formatting implementation to be used for the output message. The rule set may have been manually configured prior to installation of the translation module software or may have been automatically configured prior to receipt of the input message. If a rule set has already been configured, then the translation process 2600 continues to block 2606. If a rule set has not been configured, then a rule set is configured at block 2607. The configuration of the rule set can be performed as described below.
At block 2606, the translation module 2415 identifies the pre-configured rules from the set of translation rules 2420 that govern translation between the determined formatting implementation of the input message and the formatting implementation of the output message. The identification of the pre-configured rules can be made manually.
At block 2608, the translation module 2415 generates an output message based on the configured rule set(s) of the translation rules 2420. The output message retains all, or at least a portion of, the content of the input message but has the format expected and supported by the intended recipient of the input message.
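The flow of blocks 2604 through 2608 can be sketched as a small dispatch loop. Everything below (the function names, the rule-set keying on a format pair, the rules-as-callables model) is an illustrative assumption, not the actual implementation.

```python
def translate(message, rule_sets, detect_format, recipient_format, configure_rules):
    """Sketch of blocks 2604-2608: determine the input and output
    formatting implementations, configure a rule set if none exists
    for the pair, then apply the rules to generate the output message."""
    in_fmt = detect_format(message)          # block 2604
    out_fmt = recipient_format(message)      # block 2604
    key = (in_fmt, out_fmt)
    if key not in rule_sets:                 # decision block 2605
        rule_sets[key] = configure_rules(in_fmt, out_fmt)  # block 2607
    output = message
    for rule in rule_sets[key]:              # blocks 2606 and 2608
        output = rule(output)
    return output

# Toy usage: a caret-delimited sender and a pipe-delimited recipient.
rule_sets = {("caret", "pipe"): [lambda m: m.replace("^", "|")]}
out = translate("PID^12345^DOE", rule_sets,
                detect_format=lambda m: "caret" if "^" in m else "pipe",
                recipient_format=lambda m: "pipe",
                configure_rules=lambda a, b: [])
print(out)  # PID|12345|DOE
```

Modeling each rule as a function from message to message lets rule sets be applied in sequence, which also accommodates the unidirectional, bidirectional, and compound rules described below.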
The translation rules 2420 can include, for example, unidirectional rules and/or bidirectional rules. A unidirectional rule can be one, for example, that may be applied in the case of a message from a first medical device (e.g., 2405) to a second medical device (e.g., 2410) but is not applied in the case of a message from the second medical device to the first medical device. For example, a unidirectional rule could handle a difference in the delimiters used between fields for two different formatting implementations of, for example, the HL7 communication protocol. The translation module 2415 can apply a field delimiter rule to determine if the field delimiter is supported by the intended recipient of the input message. If the field delimiter of the input message is not supported by the intended recipient, the field delimiter rule can replace the field delimiter of the input message with a field delimiter supported by the intended recipient.
For example, an input message from an input medical device can include a formatting implementation that uses a “caret” symbol (“^”) as the field delimiter or separator. However, the formatting implementation recognized by the intended recipient medical device may use a “pipe” symbol (“|”) as the field delimiter. The translation module 2415 can identify the field delimiter symbol used in the formatting implementation recognized by the intended recipient medical device from the set of translation rules 2420 and generate an output message based on the input message that uses the pipe field delimiter symbol instead of the caret field delimiter symbol used in the input message. The rule to substitute a pipe symbol for a caret symbol would, in this case, only apply to messages that are sent to a recipient device that recognizes the pipe symbol as a field delimiter. This rule could be accompanied by a complementary rule that indicates that a caret symbol should be substituted for a pipe symbol in the case of a message that is intended for a recipient device that is known to recognize the caret symbol as the field delimiter.
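A unidirectional delimiter rule of this kind can be sketched as follows. This is an illustrative simplification: it performs a plain character substitution and does not handle escaped delimiters or delimiter characters appearing inside field data, which a real rule would need to consider.

```python
def make_field_delimiter_rule(src: str, dst: str):
    """Build a unidirectional rule replacing field delimiter `src` with
    `dst`. Naive sketch: a real rule would skip escaped occurrences."""
    def rule(message: str) -> str:
        return message.replace(src, dst)
    return rule

# Caret-delimited sender to pipe-delimited recipient:
caret_to_pipe = make_field_delimiter_rule("^", "|")
print(caret_to_pipe("PID^12345^DOE^JOHN"))  # PID|12345|DOE|JOHN
```

The complementary rule for the reverse direction would simply be `make_field_delimiter_rule("|", "^")`, applied only to messages bound for a caret-delimited recipient.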
Another unidirectional rule can handle the presence or absence of certain fields between different formatting implementations. For example, an input message from an input medical device can include fields that would not be recognized by the intended recipient medical device. The translation module 2415 can generate an output message that does not include the unrecognized or unsupported fields. In situations where an input message does not include fields expected by the intended recipient medical device, the set of translation rules 2420 can include a rule to insert null entries or empty “ ” strings in the fields expected by the intended recipient medical device and/or to alert the recipient device of the absence of the expected field. The sender device may also be notified by the translation module 2415 that the recipient device does not support certain portions of the message.
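Such field presence rules might be sketched at the field-list level, as below. The function name and the fixed-count model are illustrative assumptions: fields beyond what the recipient recognizes are dropped, and missing expected fields are padded with empty strings.

```python
def conform_field_count(fields, recipient_field_count):
    """Drop fields the recipient would not recognize and insert empty
    strings for expected fields that are absent (hypothetical sketch
    of the field presence rules described above)."""
    trimmed = fields[:recipient_field_count]           # drop unsupported fields
    padding = [""] * (recipient_field_count - len(trimmed))  # pad missing fields
    return trimmed + padding

print(conform_field_count(["PID", "12345", "DOE", "UNSUPPORTED"], 3))
# ['PID', '12345', 'DOE']
print(conform_field_count(["PID", "12345"], 4))
# ['PID', '12345', '', '']
```

A fuller rule would additionally raise the alerts described above, notifying the recipient of absent expected fields and the sender of unsupported portions.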
Other unidirectional rules can facilitate, for example, the conversion of one data type to another (for example, string (“ST”) to text data (“TX”) or structured numeric (“SN”) to numeric (“NM”)), and the increase or decrease in the length of various portions of the message. Unidirectional rules can also be used to handle variations in repeatability of portions of the message. For example, the translation module 2415 can apply a field repeatability rule to repeated instances of a segment, field, component, or sub-component of the message to determine how many such repeated instances, if any, are supported by the recipient device, and to delete or add repeated instances as necessary. For example, a phone number field of a patient identification segment can be a repeatable field to allow for entry of home, work, and cell phone numbers.
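The repeatability rule can be sketched with the phone number example above. The name and the truncation behavior are illustrative assumptions; a fuller rule might also add empty instances when the recipient expects more repeats than the sender provided.

```python
def limit_repeated_instances(instances, max_supported):
    """Field repeatability rule: keep at most `max_supported` repeated
    instances of a field and delete the rest (hypothetical sketch)."""
    return list(instances)[:max_supported]

# A repeatable phone number field (home, work, cell) sent to a
# recipient that supports only two repeated instances:
phones = ["555-0100", "555-0101", "555-0102"]
print(limit_repeated_instances(phones, 2))  # ['555-0100', '555-0101']
print(limit_repeated_instances(phones, 0))  # []
```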
Bidirectional rules can also be used. Such rules may apply equally to messages between first and second medical devices (e.g., 2405, 2410) regardless of which device is the sender and which is the recipient. A bidirectional rule can be used to handle changes in sequence, for example. In certain implementations, an input message from an input medical device can include a patient name field, or fields, in which a first name component appears before a last name component. However, the intended recipient medical device may be expecting an implementation where the last name component appears before the first name component. Accordingly, the set of translation rules 2420 can include a bidirectional rule to swap the order of the first and last name components when communicating between the two medical devices, or between the two formatting implementations. In general, field order rules can be applied to determine whether the fields, components, or sub-components are in the correct order for the intended recipient and to rearrange them if necessary. Other bidirectional rules can be included to handle, for example, other sequential variations between formatting implementations or other types of variations.
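The name-order rule is a natural bidirectional rule because swapping the first two components is its own inverse: applying it twice restores the original order, so the same rule serves both directions. A minimal sketch (names illustrative):

```python
def swap_name_order(components):
    """Bidirectional rule: swap the first two components of a patient
    name field (first name <-> last name order). The rule is its own
    inverse, so it applies in both directions (illustrative sketch)."""
    if len(components) < 2:
        return list(components)
    return [components[1], components[0]] + list(components[2:])

print(swap_name_order(["JOHN", "DOE", "Q"]))  # ['DOE', 'JOHN', 'Q']
```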
The translation rules 2420 can also include compound rules. For example, a compound rule can include an if-then sequence of rules, wherein a rule can depend on the outcome of another rule. Some translation rules 2420 may employ computations and logic (e.g., Boolean logic or fuzzy logic), etc.
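An if-then compound rule can be sketched as a combinator over simpler rules, with one rule's application depending on a condition evaluated against the message. The names below are illustrative assumptions.

```python
def compound_rule(condition, then_rule, else_rule=None):
    """Compound translation rule: apply `then_rule` when `condition`
    holds for the message, otherwise apply `else_rule` or pass the
    message through unchanged (hypothetical if-then sketch)."""
    def rule(message):
        if condition(message):
            return then_rule(message)
        if else_rule is not None:
            return else_rule(message)
        return message
    return rule

# Only re-delimit caret-delimited messages; leave others untouched.
fix_delims = compound_rule(lambda m: "^" in m,
                           lambda m: m.replace("^", "|"))
print(fix_delims("PID^1"))  # PID|1
print(fix_delims("PID|1"))  # PID|1
```

Because compound rules return ordinary message-to-message functions, they compose freely with the unidirectional and bidirectional rules sketched earlier.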
As discussed above, the messages communicated over the hospital-based communication network can employ the HL7 protocol.
The translation process 2700A starts at block 2701, where the translation module 2415 receives an input message having a first HL7 format from the HIS. The input message can include information regarding, for example, the admission of a patient and/or patient identification and patient medical history information from an electronic medical records database.
At block 2703, the translation module 2415 determines the formatting implementation of the input message and the formatting implementation to be used for the output message. These determinations can be made in a similar manner to the determinations discussed above in connection with block 2604.
At block 2705, the translation module 2415 identifies the rules that govern translation between the determined HL7 format of the input message and the HL7 format of the output message and generates an output message having the second HL7 format based on the identified rules. The output message can retain the content of the input message sent by the HIS but has the format expected and supported by the intended recipient of the input message.
At block 2707, the translation module 2415 can output the output message to the intended recipient over the hospital-based communications network. The intended recipient can transmit an acknowledgement message back to the hospital information system acknowledging successful receipt or reporting that an error occurred.
The translation process 2700B starts at block 2702, where the translation module 2415 receives an input message having a first HL7 format from the medical device. The input message can include patient monitoring data or alarm data regarding one or more physiological parameters of the patient being monitored for storage in an electronic medical records database associated with the HIS.
At block 2704, the translation module 2415 determines the formatting implementation of the input message and the formatting implementation to be used for the output message. These determinations can be made in a similar manner to the determinations discussed above in connection with block 2604.
At block 2706, the translation module 2415 identifies the rules that govern translation between the determined HL7 format of the input message and the HL7 format of the output message and generates an output message having the second HL7 format based on the identified rules. The output message can retain the content of the input message sent by the medical device but has the format expected and supported by the HIS.
At block 2708, the translation module 2415 can output the output message to the hospital information system over the hospital-based communications network. The HIS can transmit an acknowledgement message back to the medical device acknowledging successful receipt or reporting that an error occurred.
The translation rules 2420 can be implemented as one or more stylesheets, hierarchical relationship data structures, tables, lists, other data structures, combinations of the same, and/or the like. The translation rules 2420 can be stored in local memory within the translation module 2415. The translation rules 2420 can also (or instead) be stored in external memory or on a data storage device communicatively coupled to the translation module 2415.
The translation module 2415 can include a single rule set or multiple rule sets. For example, the translation module 2415 can include a separate rule set for each medical device/system and/or for each possible communication pair of medical devices/systems coupled to the network or capable of being coupled to the network. The translation module 2415 can include a separate rule set for each possible pair of formatting implementations that are allowed under a medical communication protocol such as, for example, the HL7 protocol.
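One way to organize per-pair rule sets is a registry keyed by an ordered (sender format, recipient format) pair, looked up when a message needs translation. The structure and names below are illustrative assumptions, not the actual data layout.

```python
# Hypothetical registry: one rule set per ordered pair of formatting
# implementations, as described above. Each rule set is a list of
# message-to-message functions applied in sequence.
registry = {
    ("caret-impl", "pipe-impl"): [lambda m: m.replace("^", "|")],
    ("pipe-impl", "caret-impl"): [lambda m: m.replace("|", "^")],
}

def apply_rule_set(message, sender_fmt, recipient_fmt):
    """Apply the rule set registered for this ordered pair; pass the
    message through unchanged if no rules exist for the pair."""
    for rule in registry.get((sender_fmt, recipient_fmt), []):
        message = rule(message)
    return message

print(apply_rule_set("PID^12345", "caret-impl", "pipe-impl"))   # PID|12345
print(apply_rule_set("PID^12345", "caret-impl", "caret-impl"))  # PID^12345
```

Keying on the ordered pair keeps unidirectional rules one-way by construction: the (A, B) entry and the (B, A) entry are configured independently.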
The translation rules 2420 can be manually inputted using, for example, the messaging implementation software tool 2800 described herein.
The translation rules 2420 can also (or instead) be automatically generated. For example, the automatic generation of a new set, or multiple sets, of rules can be triggered by the detection of a newly recognized “communicating” medical device or system on a network. The automatic generation of a new set or multiple sets of rules can occur at the time a first message is received from or sent to a new “communicating” medical device or system coupled to the network. The automatic generation of rule sets can include updating or dynamically modifying a pre-existing set of rules.
The automatic generation of translation rule sets can be carried out in a variety of ways. For example, the translation module 2415 can automatically initiate usage of a pre-configured set of translation rules 2420 based upon, for example, the make and model of a new device that is recognized on the network. The translation module 2415 can request one or more messages from the new device or system and then analyze the messages to determine the type of formatting being implemented, as illustrated by the automatic rule configuration process 2900A, described below.
At block 2903, the translation module 2415 determines the protocol of the one or more received messages by, for example, analyzing the message or by consulting a database that indicates what communication protocol/format is implemented by each medical device or system on the network. The translation module 2415 can be configured to handle medical messages implemented using a single common protocol, such as HL7. Accordingly, if a determination is made that the received messages are implemented using a non-supported or non-recognized protocol, the translation module can ignore the messages received from the detected medical device or system, output an alert or warning, or allow the messages to be sent without being translated.
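The fallback behavior described for block 2903 might be dispatched on a configured policy, as in this illustrative sketch (the function name, policy strings, and return shape are all assumptions):

```python
def handle_incoming(message, protocol, supported=("HL7",), policy="alert"):
    """Block 2903 fallback sketch: translate supported protocols;
    otherwise ignore the message, raise an alert, or pass it through
    untranslated, per a configured policy (names are illustrative)."""
    if protocol in supported:
        return ("translate", message)
    if policy == "ignore":
        return ("ignored", None)
    if policy == "alert":
        return ("alert", f"unsupported protocol: {protocol}")
    return ("pass-through", message)

print(handle_incoming("MSH|...", "HL7"))
# ('translate', 'MSH|...')
print(handle_incoming("DATA", "proprietary"))
# ('alert', 'unsupported protocol: proprietary')
```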
At block 2905, the translation module 2415 determines the formatting implementation of the received message(s). The received messages can include one or more identifiers indicative of the formatting implementation. The determination of the formatting implementation can also (or instead) be made, for example, by analyzing the message itself by checking field order, the delimiter or encoding characters used, or other implementation variations. The translation module 2415 can separate or parse out the formatting from the content of the message to aid in the determination of the formatting implementation.
At block 2907, the translation module 2415 configures one or more rules or rule sets to handle messages received from and/or sent to the detected medical device or system. The configuration of the rules can involve the creation or generation of new rules. The configuration of the rules can also (or instead) involve the alteration or updating of existing rules. The configured rules or rule sets can be included with the translation rules 2420. If a set of rules already exists for the formatting implementation used by the new device or system, then the configuration of new translation rules may not be required. Instead, existing translation rules can be associated with the new device or system for use in communication involving that device or system. The translation module 2415 can also (or instead) create a new set of rules geared specifically for the new device or system or can modify an existing set of rules based on subtle formatting variations identified.
The translation module 2415 can also (or instead) generate test message(s) that may be useful in identifying the communication protocol and implementation used by a device or system. For example, the translation module can generate test messages to cause the newly detected device or system to take a particular action (e.g., store information) and then query information regarding the action taken by the newly detected device to determine whether or how the test message was understood. This is illustrated by the automatic rule configuration process 2900B, described below.
The automatic rule configuration process 2900B starts at block 2902, where the translation module 2415 transmits one or more test, or initialization, messages to a remote device or system detected on a network. The test messages can be configured, for example, to instruct the remote device or system to take a particular action (e.g., store patient information). The test messages can be configured to generate a response indicative of the type of formatting recognized or supported by the remote device or system. The test messages can also (or instead) be configured such that only devices or systems supporting a particular formatting implementation will understand and properly act on the test messages.
At block 2904, the translation module 2415 queries the remote device or system to receive information regarding the action taken based on the test message sent to the remote device or system to determine whether the test message was understood. For example, if the test message instructed the remote device or system to store patient information in a particular location, the translation module 2415 can query the information from the location to determine whether the test message was understood. If the test message was not understood, the translation module 2415 can, for example, continue sending test messages of known formatting implementations until a determination is made that the test message has been understood.
At block 2906, the translation module 2415 determines the protocol and formatting implementation based on the information received. As an example, the test message can include an instruction to store patient name information. The test message can include a patient name field having a first name component followed by a surname component. The translation module 2415 can then query the remote device or system to return the patient surname. Depending on whether the patient surname or the first name is returned, this query can be useful in determining information about the order of fields in the formatting implementation being used by the remote device or system. As another example, the test messages can instruct the detected device or system to store repeated instances of a component. The translation module 2415 can then query the device or system to return the repeated instances to see which, if any, were stored. This repeatability information can also be useful in determining whether certain fields are allowed to be repeated in the formatting implementation being used by the remote device or system, and, if so, how many repeated instances are permitted.
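The name-order probe described above can be sketched with a stand-in for the remote device. Everything here is illustrative: the probe function, the fake device, and the returned labels are assumptions used to show the inference, not the actual protocol.

```python
def infer_name_order(store_name, query_surname):
    """Send a test name (first name component, then surname component),
    query the surname back, and infer the field order used by the
    remote implementation (hypothetical probe sketch)."""
    store_name(["JOHN", "DOE"])   # first name component, then surname
    returned = query_surname()
    if returned == "DOE":
        return "first-then-last"
    if returned == "JOHN":
        return "last-then-first"
    return "unknown"

class FakeRemote:
    """Stand-in device that expects the surname component to come first."""
    def __init__(self):
        self.fields = []
    def store(self, fields):
        self.fields = list(fields)
    def surname(self):
        return self.fields[0]  # it treats the first component as the surname

remote = FakeRemote()
print(infer_name_order(remote.store, remote.surname))  # last-then-first
```

Because the stand-in device interprets the first stored component as the surname, it returns "JOHN", revealing that its implementation expects last-then-first ordering.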
At block 2908, the translation module 2415 configures one or more rules to handle messages received from and/or sent to the detected medical device or system. For example, the rules can convert messages from the message format used by a first medical device to that used by a second medical device, as described herein. The configuration of the rules can involve the creation or generation of new rules. The configuration of the rules can also (or instead) involve the alteration or updating of existing rules. If a set of rules already exists for the formatting implementation used by the new device or system, then the configuration of new translation rules may not be required. Instead, existing translation rules can be associated with the new device or system for use in communication involving that device or system.
The automatic rule configuration process 2900C is similar to the process 2900A, but is specific to a newly detected HL7 medical device.
At block 2917, the translation module 2415 configures one or more rules to handle messages received from and/or sent to the HL7 medical device. The configuration of the rules can involve the creation or generation of new rules for the detected formatting implementation. The configuration of the rules can also (or instead) involve the dynamic alteration or updating of existing rules. If a set of rules already exists for the formatting implementation used by the new HL7 medical device, then the configuration of new translation rules may not be required. Instead, existing translation rules can be associated with the new HL7 medical device for use in communication involving that device.
The automatic rule configuration process 2900D is similar to the process 2900B, but is specific to a newly detected HL7 medical device: the translation module 2415 transmits one or more test messages having a known HL7 format to the device.
At block 2914, the translation module 2415 queries the HL7 medical device to receive information regarding an action taken or information stored in response to the test message. At block 2916, the translation module 2415 determines the formatting implementation of the HL7 device based on the information received. The translation module 2415 can analyze the information received to determine whether the test message or messages were properly understood. If none of the test messages were properly understood, the translation module 2415 can send additional test messages having other known HL7 formats and repeat blocks 2914 and 2916.
At block 2918, the translation module 2415 configures one or more translation rules to handle messages received from and/or sent to the detected HL7 medical device. The configuration of the translation rules can involve the creation or generation of new translation rules. The configuration of the rules can also (or instead) involve the alteration or updating of existing rules. If a set of translation rules already exists for the formatting implementation used by the new HL7 medical device, then the configuration of new translation rules may not be required. Instead, existing translation rules can be associated with the new HL7 medical device for use in communication involving that HL7 medical device.
The automatic rule configuration processes described above can be triggered by the detection of a network device or system by the translation module 2415. The medical devices referred to in these processes can include any of the medical devices or systems described herein.
The automatic generation of translation rules can advantageously occur post-installation and post-compilation of the messaging sub-system software, which includes the translation module 2415. The automatic generation or dynamic modification of the translation rules 2420 can occur without having to recompile or rebuild the translation module software. This feature can be advantageous in terms of efficiently complying with U.S. Food and Drug Administration (“FDA”) requirements regarding validation of software used in healthcare environments.
Take, for example, a situation where a medical device manufacturer plans to use the translation module 2415 to facilitate communication between a particular medical device or system that is to be installed in a hospital (e.g., a patient monitoring system, as described herein), or other patient care facility, and other devices or systems that are already installed at the hospital (e.g., the HIS or CIS). Any software required for the operation of the new medical device to be installed may be at least partially validated for FDA compliance prior to installation at the hospital despite the fact that, for example, the HL7 implementations of other existing devices or systems at the hospital may still be unknown. For example, any aspects of the software for the new medical device that are dependent upon receiving messages from other hospital devices can be validated pre-installation as being capable of fully and correctly operating when the expected message format is received. Then, once the medical device is installed at the hospital, the validation of the software can be completed by showing that the translation module 2415 is able to provide messages of the expected format to the newly installed device. In this way, FDA validation tasks can be apportioned to a greater extent to the pre-installation timeframe where they can be more easily carried out in a controlled manner rather than in the field.
In addition, the translation module 2415 can further help streamline FDA validation, for example, when a medical device or system is expected to be installed at different hospitals whose existing devices use, for example, different implementations of the HL7 protocol. Normally, this type of situation could impose the requirement that the entire functionality of the software for the new medical device be completely validated at each hospital. However, if the translation module 2415 is used to interface between the new medical device and the hospital's existing devices, then much of the software functionality could possibly be validated a single time prior to installation, as just described. Then, once installed at each hospital, the software validation for the medical device can be completed by validating that correct message formats are received from the translation module (the translation rules for which are field-customizable). This can make on-site validation procedures significantly more efficient, enabling faster FDA compliance and, through the use of field-customizable translation rules, bringing life-saving medical technology to patients more quickly.
A system for providing medical data translation for output on a medical monitoring hub can include a portable physiological monitor comprising a processor that can: receive a physiological signal associated with a patient from a physiological sensor, calculate a physiological parameter based on the physiological signal, and provide a first value of the physiological parameter to a monitoring hub for display. The monitoring hub can include a docking station that can receive the portable physiological monitor. The monitoring hub can: receive the first value of the physiological parameter from the portable physiological monitor; output the first value of the physiological parameter for display; receive physiological parameter data from a medical device other than the portable physiological monitor, the physiological parameter data formatted according to a protocol other than a protocol natively readable or displayable by the monitoring hub; pass the physiological parameter data to a translation module; receive translated parameter data from the translation module, where the translated parameter data can be readable and displayable by the monitoring hub; and output a second value from the translated parameter data for display.
The system of the preceding paragraph can be combined with any subcombination of the following features: the monitoring hub is further configured to output the first value of the physiological parameter and the second value from the translated parameter data on separate displays; the monitoring hub is further configured to output the second value from the translated parameter data to an auxiliary device having a separate display from a display of the monitoring hub; the auxiliary device is selected from the group consisting of a television, a tablet, a phone, a wearable computer, and a laptop; the physiological parameter data comprises data from an infusion pump; the physiological parameter data comprises data from a ventilator; and the translation module is configured to translate the physiological parameter data from a first Health Level 7 (HL7) format to a second HL7 format.
A method of providing medical data translation for output on a medical monitoring hub can include: under the control of a first medical device comprising digital logic circuitry, receiving a physiological signal associated with a patient from a physiological sensor; obtaining a first physiological parameter value based on the physiological signal; outputting the first physiological parameter value for display; receiving a second physiological parameter value from a second medical device other than the first medical device, where the second physiological parameter value is formatted according to a protocol not used by the first medical device, such that the first medical device is not able to process the second physiological parameter value to produce a displayable output value; passing the physiological parameter data from the first medical device to a separate translation module; receiving translated parameter data from the translation module at the first medical device, the translated parameter data able to be processed for display by the first medical device; and outputting a second value from the translated parameter data for display.
The method of the preceding paragraph can be combined with any subcombination of the following features: further including translating the message by at least translating the message from a first Health Level 7 (HL7) format to a second HL7 format; the message can include data from a physiological monitor; the message can include data from an infusion pump or a ventilator; and the message can include data from a hospital bed.
A system for providing medical data translation for output on a medical monitoring hub can include a first medical device including electronic hardware that can: obtain a first physiological parameter value associated with a patient; output the first physiological parameter value for display; receive a second physiological parameter value from a second medical device other than the first medical device, the second physiological parameter value formatted according to a protocol not used by the first medical device, such that the first medical device is not able to process the second physiological parameter value to produce a displayable output value; pass the physiological parameter data from the first medical device to a translation module; receive translated parameter data from the translation module at the first medical device, the translated parameter data able to be processed for display by the first medical device; and output a second value from the translated parameter data for display.
The system of the preceding paragraph can be combined with any subcombination of the following features: the first medical device can also output the first value of the physiological parameter and the second value from the translated parameter data on the same display; the first medical device can also output the first value of the physiological parameter and the second value from the translated parameter data on separate displays; the first medical device can also output the second value from the translated parameter data to an auxiliary device; the auxiliary device can be a television monitor; the auxiliary device can be selected from the group consisting of a tablet, a phone, a wearable computer, and a laptop; the first medical device can include the translation module; the first medical device can also pass the physiological parameter data to the translation module over a network; and the physiological parameter data can include data from an infusion pump or a ventilator.
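As a rough illustration of the translation described in the preceding paragraphs, the sketch below maps parameter data from a hypothetical foreign device format into fields the receiving device could display. The function, field names, and mapping are assumptions for illustration only, not an actual device protocol or the implementation contemplated by this disclosure.

```python
# Hypothetical translation sketch: field names and formats are illustrative
# assumptions, not an actual medical device protocol.

def translate_parameter_data(raw, field_map):
    """Map fields from a foreign device's format into keys the hub can display."""
    translated = {}
    for foreign_key, hub_key in field_map.items():
        if foreign_key in raw:
            translated[hub_key] = raw[foreign_key]
    return translated

# Example: an infusion pump reporting values under vendor-specific keys.
PUMP_FIELD_MAP = {
    "flowRateMlHr": "infusion_rate_ml_per_hr",
    "vtbi": "volume_to_be_infused_ml",
}

translated = translate_parameter_data({"flowRateMlHr": 125, "vtbi": 500},
                                      PUMP_FIELD_MAP)
# translated == {"infusion_rate_ml_per_hr": 125, "volume_to_be_infused_ml": 500}
```

A production translation module would, of course, also handle units, timestamps, and format-level concerns such as converting between HL7 message versions.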
Today's patient monitoring environments provide one or more traditional displays or screens that present clinicians with data from one or more electronic medical devices associated with a wide variety of monitoring, treatment, or procedural endeavors for a patient. Thus, during such patient monitoring, treatments, or procedures, a clinician typically reviews one or more traditional displays to gather information about a patient. However, while a clinician looks at the one or more traditional displays, his or her attention may be diverted away from the patient, such as when a clinician looks away from the patient to a traditional display during a surgical procedure. For example, during some surgical procedures, such as an endoscopy or an epidural, it is common for the operating clinician to look at the patient to see where a probe is going, but the clinician must look away from the patient to view a traditional display, which is inefficient and potentially dangerous to the patient.
The systems and methods described herein may advantageously improve the presentation of data or provide improved interactive user interfaces using augmented reality. For example, a clinician using an augmented reality device, such as wearing augmented reality glasses, can be presented with medical monitoring data received from the medical monitoring hub, as described herein. One advantage of augmented reality is that the augmented reality display can overlay real-world visual information. Accordingly, a clinician can remain visually focused on a patient while simultaneously receiving augmented reality information. Another advantage of augmented reality is that the display area for an augmented reality user interface may be larger than traditional displays, such as device screens. For example, an augmented reality display area may be ten times larger than a traditional display area. The following are examples of improved augmented reality user interfaces.
An example augmented reality device presents one or more user interfaces. Example user interfaces that may be presented on the augmented reality device include any of the user interfaces described herein. Further, augmented reality user interfaces can improve the efficiency of surgical procedures. For example, during certain procedures, such as an endoscopy or an epidural, the clinician can efficiently maintain his or her view of the patient to see where a probe is going and simultaneously view an overlay user interface that includes data that would previously have been available only on a traditional display. An augmented reality user interface may be pinned to a particular area within a three-dimensional space, such as the patient's room. For example, a clinician can interact with the augmented reality device to pin an augmented reality user interface to a physical device, a location, or the patient. Continuing with the example, the clinician using the augmented reality device can view the pinned augmented reality user interface when looking near the physical device or the location that was pinned; however, if the clinician looks away from the physical device or the location, then the augmented reality user interface is not presented (in this example). The auxiliary device 2040 may be optional, or any information displayed on the auxiliary device 2040 may be presented through the augmented reality device.
Another example of improved user interfaces using augmented reality is the presentation of analog display indicia superimposed on a patient. Example analog display indicia that may be presented on the augmented reality device include any of the analog display indicia described herein. The example analog display indicia, such as a two-dimensional or three-dimensional representation of the lungs, heart, brain, or circulatory system, can be superimposed on a patient. Accordingly, a clinician looking at the patient can see the superimposed analog display indicia. The analog display indicia can be pinned to the patient such that if the clinician looks away from the patient, the analog display indicia are no longer presented. As described herein, the analog display indicia can present health indicators of various physiological parameters. Example health indicators include color-coded analog display indicia, such as green, yellow, or red indicia, which may indicate nominal, cautionary, or severe situations, respectively, as described in further detail herein.
Improved augmented reality user interfaces described herein can enable a user to configure or interact with the user interface. For example, the augmented reality user interface may be a dashboard where a user can add or remove one or more virtual display panels or change the arrangement or the location of the augmented reality panels or objects. The augmented reality device may receive user input corresponding to user interactions. Example user interactions include voice input or commands, visual or eye commands, touch input, or movement, such as head movement or hand gestures. Example head gestures include head tilt, bobbing, or nodding. As another example, a clinician may receive augmented reality patient data when outside of the patient's room or area. In the example, a clinician walking past a patient's room interacts with the augmented reality device to receive data regarding the patient or from the electronic medical devices within the patient's room. Continuing with the example, the clinician executes a hand gesture that virtually grabs the patient's data and causes presentation of the data in their augmented reality device without entering the patient room. As another example, patient data may be virtually posted outside of rooms. A clinician can pass by a room, look at the room and see a virtual user interface for the patient monitor inside the room, and pin the virtual user interface outside of the room using a gesture or verbal command. Additionally or alternatively, the patient data may be available anywhere within a healthcare facility or even remotely, such as a clinician being tens or hundreds of miles away from the physical location of the patient.
Additional example user interfaces and systems for patient monitoring and notifications are disclosed in U.S. patent application Ser. No. 14/511,972, filed by the assignee of the present disclosure, which is incorporated by reference herein.
The example augmented reality device 7200 includes augmented reality glasses, head-mounted displays, head-up displays, contact lenses, a smartphone, or a tablet. The augmented reality device 7200 may include some combination of one or more hardware processors, displays, sensors, or input devices. For example, the augmented reality device 7200 can include a camera, an accelerometer, a gyroscope, a GPS device, or a solid-state compass. The augmented reality device 7200 may include one or more wired or wireless devices that enable communication over Bluetooth, USB, wired networks, or one or more wireless networks, such as a Global System for Mobile Communications (GSM) network, a Code Division Multiple Access (CDMA) network, a Long Term Evolution (LTE) network, Wi-Fi, or some other type of wireless network. The augmented reality device 7200 may also communicate with an augmented reality server (not illustrated), which may handle some augmented reality processing. Accordingly, the example augmented reality device can offload some or all of the augmented reality processing to be performed by the augmented reality server (which may be in a cloud computing center) in a distributed manner.
Similar to the visual markers or tracking sensors associated with the patient or the one or more devices 7214, the augmented reality system may determine the position of the augmented reality display area 7210 based on visual markers or tracking sensors associated with the device 7212. As described herein, the example augmented reality system can determine the position of the augmented reality display area 7210 by identifying a reference object, here the device 7212, determining a reference position for the reference object, and calculating a positional offset from the reference position. The positional offset may be calculated as a predetermined or configurable distance and direction from the reference object. Continuing with the example, the clinician using the augmented reality device may change or update the positional offset of the augmented reality display area 7210.
At block 7224, the augmented reality system initiates a connection with a communication interface. For example, the augmented reality device 7200 may connect to a communication interface of the MMS 2004, the monitoring hub 100, or the PPM 102, wired or wirelessly. The augmented reality device 7200 can be authenticated via one or more security protocols before a connection with the communication interface can be established.
At block 7226, the augmented reality system receives data from the communication interface. As described herein, example data includes physiological monitoring data, which may include physiological parameter data from a patient that is calculated from physiological signals that are captured by physiological sensors monitoring the patient.
At block 7228, the augmented reality system formats the received data for presentation. For example, the augmented reality system may access user interface configuration data that indicates which augmented reality objects should be displayed or the arrangement of the augmented reality objects. A clinician using the augmented reality device 7200 may update or modify the user interface configuration data via one or more user interactions with the augmented reality device 7200, as described herein. Accordingly, the augmented reality system can format the received data, such as the physiological monitoring data, into user interface display data according to the user interface configuration data. For example, the augmented reality system can generate one or more augmented reality objects from the physiological monitoring data according to the user interface configuration data that specifies which objects are to be generated or where the objects should be presented on the augmented reality device display.
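The formatting step of block 7228 might be sketched as follows, with the user interface configuration selecting which parameters become augmented reality objects and where each is placed; all names and configuration fields are illustrative assumptions rather than the disclosure's implementation.

```python
# Illustrative sketch of block 7228: names and configuration fields are assumed.

def format_for_display(monitoring_data, ui_config):
    """Build augmented reality objects from physiological monitoring data
    according to a user interface configuration that selects and places them."""
    objects = []
    for entry in ui_config:
        param = entry["parameter"]
        if param in monitoring_data:
            objects.append({
                "label": entry.get("label", param),
                "value": monitoring_data[param],
                "position": entry["position"],  # placement in the AR display
            })
    return objects

config = [{"parameter": "spo2", "label": "SpO2", "position": (0.1, 0.9)}]
objs = format_for_display({"spo2": 98, "pulse_rate": 72}, config)
# Only configured parameters are rendered; pulse_rate is omitted here.
```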
At block 7230, the augmented reality system presents the formatted data in a display of the augmented reality device. Example augmented reality user interfaces are shown and described in further detail with reference to
At block 7232, the augmented reality system receives and processes input data. Example input data includes user interaction data that is received from the augmented reality device. For example, a clinician may interact with the augmented reality user interface to modify the user interface. Example user interactions include voice input or commands by the user (e.g., detectable by a microphone and interpretable by voice recognition software installed in the augmented reality system), visual or eye commands by the user (e.g., detectable by a camera and interpretable by image processing software installed in the augmented reality system), touch input, and/or movement by the user, such as head movement or hand gestures (e.g., detectable by a movement sensor such as an accelerometer or gyroscope and interpretable by movement processing software installed in the augmented reality system). Accordingly, the augmented reality system may return to block 7226 to receive additional data to continue processing user interaction data, display data, or monitoring data to present updated augmented reality user interfaces at blocks 7228 and 7230. For example, the augmented reality user interfaces may update in near-real time or real-time based on the physiological sensors capturing updated data from the patient or from new data received from the MMS 2004, the monitoring hub 100, or the PPM 102.
The augmented reality system can perform gesture recognition using the input data. A clinician can virtually pin a user interface display area in space (such as next to or superimposing a monitor display) or to an object in reality (such as a hospital room door) using a gesture such as a hand or finger gesture. The user interface display area can be pinned to or near a physical device, a location, or the patient. A gesture with respect to an augmented reality user interface and a physical device, a location, or the patient can cause the user interface to be pinned to that object. The user interface display area can be pinned to any device, such as the hub 100, the PPM 102, or the auxiliary device 2040. A gesture can be recorded by a camera of the augmented reality system, such as a camera of the augmented reality device 7200 or a camera or screen of the hub 100, the PPM 102, or the auxiliary device 2040. The augmented reality system can process the video or image data to perform gesture recognition. The augmented reality system can use currently available gesture recognition algorithms. Example gesture recognition algorithms can be based on three-dimensional models or based on appearance, such as by using images or videos for direct interpretation, and/or some combination thereof. The example gesture recognition algorithms can include mean shift algorithms, continuous adaptive mean shift (CAMSHIFT) algorithms, pose estimation algorithms, volumetric algorithms, skeletal algorithms, motion algorithms, and/or some combination thereof. An example gesture recognition process is described in more detail below with respect to
The user input data can include image data. Detecting the gesture from the user input data can include determining color histogram data from the image data. A search window can be located within the image data according to the color histogram data, such as by identifying a range of flesh colors within the image. Multiple images can be processed in a similar manner. Multiple positions of the search window in the image data can be identified that correspond to a gesture, such as a hand or finger swipe in a direction. The augmented reality system can identify a predefined gesture based on gesture definitions that can correspond to, for example, swiping left, swiping right, swiping up, or swiping down. Additional details regarding example gesture detection techniques are described in further detail with respect to
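The step of matching a series of search window positions against a predefined gesture definition can be sketched minimally as follows; the displacement threshold and gesture names are illustrative assumptions, not part of the disclosure.

```python
# Minimal sketch: classify a swipe from successive search window centers.
# The threshold (in pixels) and gesture names are illustrative assumptions.

def classify_swipe(positions, threshold=50):
    """Given (x, y) search window centers over time, return a swipe direction
    if total displacement along the dominant axis exceeds the threshold."""
    if len(positions) < 2:
        return None
    dx = positions[-1][0] - positions[0][0]
    dy = positions[-1][1] - positions[0][1]
    if abs(dx) >= abs(dy):
        if dx > threshold:
            return "swipe_right"
        if dx < -threshold:
            return "swipe_left"
    else:
        if dy > threshold:
            return "swipe_down"  # image coordinates: y grows downward
        if dy < -threshold:
            return "swipe_up"
    return None

direction = classify_swipe([(0, 0), (30, 5), (80, 10)])
# direction == "swipe_right"
```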
The augmented reality user interface can update in response to user input. Where an augmented reality user interface is pinned to a physical device, such as the hub 100, the PPM 102, or the auxiliary device 2040, a clinician looking at the device and/or augmented reality user interface can cause a characteristic of the augmented reality user interface to update. For instance, in response to a clinician looking at the hub 100, the PPM 102, or the auxiliary device 2040, the augmented reality user interface can increase in size (e.g., enlarging from 12 inches in width or height to 50 inches in width or height). The physical user interface shown on the physical patient monitor can increase in size, for instance, as an augmented reality display (i.e., the AR display can be a copy of the physical display, only larger). If the clinician looks away, the augmented reality user interface can reduce in size. The augmented reality system can detect a clinician looking at a device, user interface, or location via tracking technologies and devices that can include computer vision recognition techniques, image registration, optical sensors, accelerometers, GPS, gyroscopes, solid state compasses, RFID, or wireless sensors.
The augmented reality user interface can also (or instead) update in different ways. If the clinician looks away from the monitor, such as by looking at the patient, then the augmented reality user interface can present no data or a reduced amount of data to become uncluttered, unless the clinician looks back at the monitor (in which case the clinician may see the pinned user interfaces wherever they were pinned). Accordingly, the clinician can see the patient without having some or all augmented reality data in the way. Movement of the clinician's head or a clinician hand gesture can enable data to be viewed or not viewed. If the clinician moves her head toward the patient, the data view might change (or disappear, or move to the side of the patient, e.g., to the edge of the clinician's field of view, etc.). The clinician can also (or instead) see more of the augmented reality interface when looking at the patient unless the clinician makes a head or hand gesture (e.g., captured by a camera or movement sensor) to move the augmented reality interface to the side of the patient or dismiss it from view entirely.
The arrangement of objects in an augmented reality user interface can update in response to user input. Augmented reality objects can be presented in a first arrangement. For example, as described with respect to
The augmented reality user interface can be configurable via a device different from the augmented reality device. The arrangement and/or selection of objects in an augmented reality user interface can be based on user input. As described above with respect to
More generally, the user interface of a patient monitor can be reflected in an augmented reality user interface. A user can modify a view of a patient monitor, initiate an augmented reality user interface (with a flick or other gesture), and the initiated augmented reality user interface can present the modified view of the patient monitor. At least some of the plurality of augmented reality objects in the augmented reality user interface can correspond to the hub, monitor, or auxiliary device user interface configuration. Specifically, the hub (or monitor or auxiliary device) user interface configuration can include user interface elements. Each element of the user interface elements can include a physiological parameter value. Each element of the user interface elements can correspond to an object from the plurality of augmented reality objects presented in the augmented reality user interface.
At block 7236, the augmented reality system receives input data. Example input data includes a tagged or pinned reference object or location. An example tagged or pinned reference object may correspond to one or more physical objects, such as an auxiliary display, a device of the monitoring system, or a patient. Additionally or alternatively, the example input data includes positional or positional-related data, such as image data, video data, accelerometer data, GPS data, gyroscopic data, solid state compass data, RFID data, or wireless data. Example positional or positional-related data may correspond to data captured by the augmented reality device or one or more devices of the monitoring system, such as the auxiliary display or one or more devices attached to the patient. As another example, the image or video data may capture a known visual marker (also referred to herein as a fiducial marker) that is attached to a patient or device of the monitoring system.
At block 7238, the augmented reality system determines positional display data from the input data. For example, the augmented reality system determines or calculates a reference position for the reference object from the image data, video data, accelerometer data, GPS data, gyroscopic data, solid state compass data, RFID data, or wireless data. Typically GPS data is accurate within several meters. Accordingly, the augmented reality system may use other positional-related data, such as image or video data, accelerometer data, gyroscopic data, solid state compass data, RFID data, or wireless data to determine a more accurate position for a reference object. In a computer vision example, the augmented reality system can execute an image registration process to identify known visual markers through one or more feature detection techniques such as corner detection, blob detection, edge detection, thresholding, or other image processing techniques. Additionally or alternatively, the augmented reality system can determine a three-dimensional position for the reference object using a pose estimation technique. The augmented reality system can generate a real-world coordinate system from the obtained or generated positional data. An example real world coordinate system at least includes three-dimensional coordinates.
The example augmented reality system generates positional display data from the obtained or generated positional data. In the pinning example, the augmented reality system can determine a positional offset from a reference object, such as a patient or a display or device of the monitoring system. The augmented reality system may calculate a position offset from a predefined or configurable distance and direction from the reference object. An example predefined distance includes five or ten centimeters to the right or left of the reference object. User interaction input received from a clinician may update the position offset. For example, a clinician can interact with the augmented reality objects by moving them (virtually), such as with push, pull, or hand wave gestures. In the direct overlay or superimposed example, the augmented reality system may display one or more augmented reality objects at the reference position for the reference object. For example, the reference position of the patient may correspond to a particular coordinate within the coordinate system and the augmented reality system presents the object at the coordinate. As described herein, the augmented reality system can present analog display indicia at the reference position that corresponds to the patient, such as the coordinate position of the lung, heart, or brain areas of the patient. Accordingly, if the reference object moves, the one or more pinned augmented reality objects may move with the reference object.
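The offset-based pinning described above reduces to a simple calculation: the pinned object's display position is the reference position plus a fixed offset, recomputed as the reference object moves. The coordinate convention (centimeters in a room-fixed frame) is an assumption for illustration.

```python
# Sketch of pinning: display position = reference position + fixed offset,
# so the pinned object follows the reference object when it moves.

def pinned_position(reference_position, offset):
    """Return the display position of an object pinned at a fixed offset
    from a reference object's position (element-wise vector addition)."""
    return tuple(r + o for r, o in zip(reference_position, offset))

# Pin a display panel 10 cm to the right of a monitor in room coordinates (cm).
panel = pinned_position((120.0, 80.0, 200.0), (10.0, 0.0, 0.0))
# panel == (130.0, 80.0, 200.0); recomputing with a new reference position
# moves the panel along with the monitor.
```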
At block 7244, the augmented reality system receives input data. Example input data includes image or video data. The input data can be received from an input device, such as an input device of an augmented reality device or another device, such as a monitor. The input device can include a camera. The augmented reality system can receive an image or a video and an image (such as a frame) can be extracted from the video.
At block 7246, an initial search window is established. For example, the augmented reality system can have a default search window that starts at a particular coordinate (such as an X and Y coordinate) and has a predefined size. At block 7248, a calculation region is established. For example, the augmented reality system can have a default calculation region that is equal to the whole frame. As the process 7242 proceeds, the calculation region can change over time. The calculation region can be larger than the search window.
At block 7250, a color histogram can be determined in the calculation region. For example, a pixel distribution, such as a histogram back projection, can be determined. The purpose of the color histogram can be to identify a flesh color (such as the color of a hand) in the image. At block 7252, a color probability distribution can be determined. For example, a range of acceptable colors can be used to compute the probability that any given pixel within the calculation region corresponds to a flesh color. The range of acceptable colors can be generated from examples, such as one or more examples of skin color pixels.
At block 7254, the center of mass within the search window can be found. For example, the maximum density of the acceptable range of colors can be determined within the search window. At block 7256, the search window can be centered at the center of mass. At block 7258, convergence can be checked. For example, the center of mass can be determined once again from the new position of the search window, and if it equals the old position, then convergence can be determined. If there is no convergence, the process 7242 can return to blocks 7254 and 7256 to continue to move the search window. Convergence can additionally be defined based on a maximum number of iterations or on the search window moving less than a threshold distance. Accordingly, a greater concentration of pixels of the target color(s) can be located in the newly positioned search window.
At block 7260, coordinates of the search window can be determined. For example, coordinates such as (X, Y, and Z), pitch, yaw, and/or roll can be determined. At block 7262, a gesture can be recognized. For example, the augmented reality system can determine whether a series of coordinates from multiple iterations of the process 7242 corresponds to a predefined gesture pattern, such as a flick or swipe of the hand. If a gesture is not recognized, the process 7242 proceeds to block 7264 to perform additional iterations.
At block 7264, the determined X and Y coordinates can be used to set the search window center and the size of the search window can be set. For example, the search window can be resized to account for an object moving closer to or farther away from a camera. Thus, the window size can be adapted to the size and orientation of the target. For example, the size of the search window can be adjusted according to the formula: 2×(area of the search window)^(1/2). Other calculations can also (or instead) be used to adjust the size of the search window. The process 7242 can proceed in a loop starting at the block 7248 to reset the calculation region and proceed to the blocks 7250 and 7252 to determine color histograms and color probability distributions, and to the blocks 7254, 7256 to center the search window until convergence, and to the blocks 7260 and 7264 to determine coordinates and check for gesture recognition.
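Blocks 7254 through 7258 resemble a mean shift iteration. The sketch below recenters a search window on the center of mass of a skin-color probability map until the center stops moving; the pure-Python probability grid, square window, and iteration cap are simplifying assumptions for illustration, not the disclosure's implementation.

```python
# Simplified mean shift sketch of blocks 7254-7258: the probability map is a
# 2D grid of skin-color probabilities (assumed precomputed at blocks 7250-7252).

def center_of_mass(prob, cx, cy, half):
    """Center of mass of probabilities inside a square window around (cx, cy)."""
    total = sx = sy = 0.0
    for y in range(max(0, cy - half), min(len(prob), cy + half + 1)):
        for x in range(max(0, cx - half), min(len(prob[0]), cx + half + 1)):
            p = prob[y][x]
            total += p
            sx += p * x
            sy += p * y
    if total == 0:
        return cx, cy  # nothing in the window; stay put
    return round(sx / total), round(sy / total)

def track_window(prob, cx, cy, half=2, max_iter=20):
    """Shift the window toward the center of mass until convergence
    (blocks 7254-7258), capped at max_iter iterations."""
    for _ in range(max_iter):
        nx, ny = center_of_mass(prob, cx, cy, half)
        if (nx, ny) == (cx, cy):
            break  # converged: new center equals old center
        cx, cy = nx, ny
    return cx, cy
```

A full CAMSHIFT implementation would additionally resize and reorient the window each pass, as block 7264 describes.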
At block 7266, the process 7242 can use motion and/or three-dimensional information for gesture recognition. For example, a motion mask can combine color and movement probability distributions for gesture tracking. The motion and/or three-dimensional information can be used to detect three-dimensional motion in real or near time. Pose estimation techniques can be used to determine a three-dimensional position (such as rotation) of an object from an image.
Several additional examples associated with the auxiliary device 2040 will now be described, including authentication features (see
In general, as described above, the auxiliary device 2040 can provide second screen functionality for the hub 100, PPM 102, or MMS 2004. Further, as described above, the translation module 2005 can be implemented in any device, including the hub 100 or the MMS 2004. Data sent by the translation module 2005 (or from another device shown in
Turning to
It may be desirable to authenticate the auxiliary device 2040 to the hub 100 so that communications may take place between the two devices. Viewed another way, the process 7300 can be considered a process for pairing the auxiliary device 2040 with the hub 100. Multiple auxiliary devices 2040 can be paired or otherwise authenticated to the hub 100 at one time. This may be useful, for instance, so that multiple clinicians can each have a tablet or other auxiliary device 2040, instead of or in addition to a TV auxiliary device 2040 being present in the hospital room. Further, the augmented reality device 7200 described above can be an example of one of the auxiliary devices 2040, and multiple clinicians may have augmented reality devices 7200 that they wish to use together with the hub 100 (e.g., so that each one may have a different heads-up display with a set of views customized to that particular clinician).
At block 7302, a user requests the hub 100 (or MMS 2004, etc.) to connect to the auxiliary device 2040 wirelessly (wired connectivity is also possible in other examples). For example, the user can access the settings menu on the hub 100 to begin pairing or authentication of an auxiliary device 2040. At block 7304, the hub 100 displays a PIN or other authentication code (e.g., a string of alphanumeric characters) on its display. At block 7306, a user enters the PIN or other authentication code on the auxiliary device. At block 7308, the auxiliary device communicates the PIN or other authentication code to the hub or MMS. At block 7310, the hub or MMS communicates data gathered from patient monitors to the auxiliary device. Block 7310 can be implemented before, during, or after the process 7300.
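The pairing exchange of process 7300 might be sketched as follows; the class and method names, as well as the six-digit code format, are assumptions for illustration, not the disclosure's implementation.

```python
# Sketch of the PIN-based pairing exchange (blocks 7304-7308); names and the
# six-digit code format are illustrative assumptions.
import secrets

class Hub:
    """Minimal pairing endpoint standing in for the hub 100 (or MMS 2004)."""

    def begin_pairing(self):
        # Block 7304: generate and display a six-digit authentication code.
        self._code = f"{secrets.randbelow(10**6):06d}"
        return self._code

    def authenticate(self, code_from_auxiliary):
        # Block 7308: the auxiliary device sends the code the user entered;
        # a constant-time comparison avoids leaking timing information.
        return secrets.compare_digest(code_from_auxiliary, self._code)

hub = Hub()
displayed_code = hub.begin_pairing()       # user reads this from the hub display
paired = hub.authenticate(displayed_code)  # True: data exchange may begin
```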
Turning to
For example, the auxiliary device 2040 can control any option or setting that is able to be controlled on any of the patient monitors or hub 100. For example, alarms, layouts of user interfaces, settings, etc. may be controlled via the auxiliary device 2040. The auxiliary device 2040 may output a user interface that enables control of these settings. For instance, turning to
The manufacturer of the hub 100 can provide a software development kit (SDK) for third-party patient monitor manufacturers to use so that an auxiliary device 2040 can communicate with their hardware. Using the SDK, for instance, a third-party monitor manufacturer can install software libraries on a third-party patient monitor (e.g., 214-220) so that the auxiliary device 2040 can communicate changes to settings or other parameters directly to those devices. The software library or libraries may, for instance, include an application programming interface (API) defining routines or functions and associated parameters that the auxiliary device 2040 can call or invoke to set different settings within the third-party devices. A similar library or libraries may be installed on first-party devices, such as the channel devices (222-226). The auxiliary device 2040 can include a set of instructions or libraries that it can invoke to send settings changes to the hub 100 (or the MMS 2004), which the hub 100 passes on to the third-party or first-party devices. The hub 100 may also translate the requested settings changes from the auxiliary device 2040 into more complex instructions provided to the individual patient monitors. The manufacturer of the hub 100 can certify medical devices that are capable of control by the auxiliary device 2040. Any of the functions for controlling the third-party and first-party devices can be implemented by the hub 100 instead of, or in addition to, the auxiliary device 2040.
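The SDK arrangement described above can be sketched as follows. This is a hypothetical illustration under assumed names (`ThirdPartyMonitor`, `apply_setting`, `set_setting`); the disclosure does not specify the API's actual routines or signatures.

```python
class ThirdPartyMonitor:
    """Stands in for a third-party monitor (e.g., 214-220) running the
    SDK-provided library that exposes settings routines."""

    def __init__(self, name):
        self.name = name
        self.settings = {}

    def apply_setting(self, key, value):
        # Routine the SDK's API would define for settings changes.
        self.settings[key] = value

class Hub:
    """Stands in for the hub 100, which relays settings requests from the
    auxiliary device 2040 on to registered monitors."""

    def __init__(self):
        self.devices = {}

    def register(self, device):
        self.devices[device.name] = device

    def set_setting(self, device_name, key, value):
        # The hub may translate a simple request into one or more
        # device-specific calls; here it forwards the request directly.
        self.devices[device_name].apply_setting(key, value)

hub = Hub()
monitor = ThirdPartyMonitor("capnograph-218")
hub.register(monitor)
# A settings change originating from the auxiliary device 2040:
hub.set_setting("capnograph-218", "alarm_high", 60)
assert monitor.settings["alarm_high"] == 60
```

Routing all changes through one `set_setting` entry point mirrors the text's point that the hub can mediate, and potentially translate, requests bound for heterogeneous devices.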
Thus, either the hub 100 or the auxiliary device 2040 can essentially become the monitor itself, as either enables full control over a variety of patient monitoring equipment. Thought of another way, the auxiliary device 2040 or the hub 100 can include a unified display for managing a plurality of patient devices.
Turning again to
Turning to
In some current implementations, the translation module 2005 pushes most or all of the data it receives from most or all of the devices connected to the hub 100 on to the auxiliary device 2040. Some auxiliary devices 2040 with lower computing resources (e.g., reduced processing capability, lower memory, lower battery and power capability, lower bandwidth connection, etc.) may be overwhelmed by the received data. As a result, these devices may crash or produce bad or incomplete results.
One way to solve this problem is to have the auxiliary device 2040 request that the translation module 2005 send data less frequently. Another way is for the auxiliary device 2040 to include a module that reduces the firehose effect of the received data by requesting a specific subset of information from the translation module 2005 or hub 100. An example of such a module is shown in
Instead of being fully flexible, the IAP layer 2042 may have a predefined capability for receiving data at the auxiliary device 2040. This capability may be determined, for example, by a level of service paid for by a clinical facility using the auxiliary device 2040, or the like. For instance, greater levels of service can enable a larger number of parameters or different variety of parameters to be provided to the auxiliary device 2040.
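The subset-request and service-level behavior of the IAP layer 2042 can be sketched as follows. The tier names, parameter caps, and method names here are hypothetical assumptions for illustration; the disclosure does not specify them.

```python
# Illustrative service tiers: higher tiers allow more parameters to be
# provided to the auxiliary device 2040 (values are assumptions).
SERVICE_TIERS = {"basic": 2, "standard": 4, "premium": 8}

class IapLayer:
    """Hypothetical stand-in for the IAP layer 2042: requests a specific
    subset of parameters, bounded by a predefined (e.g., paid-for) capability,
    instead of receiving the full data stream."""

    def __init__(self, tier):
        self.max_params = SERVICE_TIERS[tier]
        self.subscribed = []

    def subscribe(self, params):
        # Request only a subset, truncated to the tier's predefined cap.
        self.subscribed = params[: self.max_params]
        return self.subscribed

    def filter(self, full_stream):
        # Keep only the subscribed parameters from data pushed by the
        # translation module 2005 or hub 100.
        return {k: v for k, v in full_stream.items() if k in self.subscribed}

iap = IapLayer("basic")  # basic tier: at most 2 parameters
iap.subscribe(["SpO2", "pulse_rate", "respiration_rate"])
data = {"SpO2": 97, "pulse_rate": 72, "respiration_rate": 16, "temp": 37.0}
assert iap.filter(data) == {"SpO2": 97, "pulse_rate": 72}
```

In this sketch the cap is enforced at subscription time, so a lower-resource device never receives, and never has to discard, parameters beyond its service level.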
Turning specifically to the process 7500 shown in
Many other variations than those described herein will be apparent from this disclosure. For example, certain acts, events, or functions of any of the algorithms described herein can be performed in a different sequence, can be added, merged, or left out altogether (e.g., not all described acts or events are necessary for the practice of the algorithms). Moreover, acts or events can be performed concurrently, e.g., through multi-threaded processing, interrupt processing, or multiple processors or processor cores or on other parallel architectures, rather than sequentially. In addition, different tasks or processes can be performed by different machines and/or computing systems that can function together.
It is to be understood that not necessarily all such advantages can be achieved in accordance with any particular embodiment of the embodiments disclosed herein. Thus, the embodiments disclosed herein can be embodied or carried out in a manner that achieves or optimizes one advantage or group of advantages as taught herein without necessarily achieving other advantages as may be taught or suggested herein.
The various illustrative logical blocks and modules described in connection with the examples disclosed herein can be implemented or performed by a machine, such as a general purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general purpose processor can be a microprocessor, but in the alternative, the processor can be a controller, microcontroller, or state machine, combinations of the same, or the like. A processor can include electrical circuitry or digital logic circuitry configured to process computer-executable instructions. In another example, a processor includes an FPGA or other programmable device that performs logic operations without processing computer-executable instructions. A processor can also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration. A computing environment can include any type of computer system, including, but not limited to, a computer system based on a microprocessor, a mainframe computer, a digital signal processor, a portable computing device, a device controller, or a computational engine within an appliance, to name a few.
The steps of a method, process, or algorithm described in connection with the examples disclosed herein can be embodied directly in hardware, in a software module stored in one or more memory devices and executed by one or more processors, or in a combination of the two. A software module can reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of non-transitory computer-readable storage medium, media, or physical computer storage known in the art. An example storage medium can be coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium can be integral to the processor. The storage medium can be volatile or nonvolatile. The processor and the storage medium can reside in an ASIC.
Conditional language used herein, such as, among others, “can,” “might,” “may,” “e.g.,” and the like, unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or states. Thus, such conditional language is not generally intended to imply that features, elements and/or states are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without author input or prompting, whether these features, elements and/or states are included or are to be performed in any particular embodiment. The terms “comprising,” “including,” “having,” and the like are synonymous and are used inclusively, in an open-ended fashion, and do not exclude additional elements, features, acts, operations, and so forth. Also, the term “or” is used in its inclusive sense (and not in its exclusive sense) so that when used, for example, to connect a list of elements, the term “or” means one, some, or all of the elements in the list. Further, the term “each,” as used herein, in addition to having its ordinary meaning, can mean any subset of a set of elements to which the term “each” is applied.
Disjunctive language such as the phrase “at least one of X, Y, or Z,” unless specifically stated otherwise, is otherwise understood within the context as used in general to present that an item, term, etc., may be either X, Y, or Z, or any combination thereof (e.g., X, Y, and/or Z). Thus, such disjunctive language is not generally intended to, and should not, imply that certain embodiments require at least one of X, at least one of Y, or at least one of Z to each be present.
Unless otherwise explicitly stated, articles such as “a” or “an” should generally be interpreted to include one or more described items. Accordingly, phrases such as “a device configured to” are intended to include one or more recited devices. Such one or more recited devices can also be collectively configured to carry out the stated recitations. For example, “a processor configured to carry out recitations A, B and C” can include a first processor configured to carry out recitation A working in conjunction with a second processor configured to carry out recitations B and C.
While the above detailed description has shown, described, and pointed out novel features as applied to various embodiments, it will be understood that various omissions, substitutions, and changes in the form and details of the devices or algorithms illustrated can be made without departing from the spirit of the disclosure. As will be recognized, certain embodiments of the inventions described herein can be embodied within a form that does not provide all of the features and benefits set forth herein, as some features can be used or practiced separately from others.
Additionally, all publications, patents, and patent applications mentioned in this specification are herein incorporated by reference to the same extent as if each individual publication, patent, or patent application was specifically and individually indicated to be incorporated by reference.
Any and all applications for which a foreign or domestic priority claim is identified in the Application Data Sheet as filed with the present application are hereby incorporated by reference under 37 CFR 1.57. This application claims benefit of U.S. Provisional Patent Application Ser. No. 62/463,517 entitled “System for Displaying and Controlling Medical Monitoring Data” filed Feb. 24, 2017, and U.S. Provisional Patent Application Ser. No. 62/463,452 entitled “Patient Monitor Communication Platform” filed Feb. 24, 2017, which are hereby incorporated by reference in their entireties.
9724016 | Al-Ali et al. | Aug 2017 | B1 |
9724024 | Al-Ali | Aug 2017 | B2 |
9724025 | Kiani et al. | Aug 2017 | B1 |
9730640 | Diab et al. | Aug 2017 | B2 |
9743887 | Al-Ali et al. | Aug 2017 | B2 |
9749232 | Sampath et al. | Aug 2017 | B2 |
9750442 | Olsen | Sep 2017 | B2 |
9750443 | Smith et al. | Sep 2017 | B2 |
9750461 | Telfort | Sep 2017 | B1 |
9757020 | Elazar | Sep 2017 | B1 |
9770203 | Berme | Sep 2017 | B1 |
9775545 | Al-Ali et al. | Oct 2017 | B2 |
9775546 | Diab et al. | Oct 2017 | B2 |
9775570 | Al-Ali | Oct 2017 | B2 |
9778079 | Al-Ali et al. | Oct 2017 | B1 |
9782077 | Lamego et al. | Oct 2017 | B2 |
9782110 | Kiani | Oct 2017 | B2 |
9787568 | Lamego et al. | Oct 2017 | B2 |
9788735 | Al-Ali | Oct 2017 | B2 |
9788768 | Al-Ali et al. | Oct 2017 | B2 |
9795300 | Al-Ali | Oct 2017 | B2 |
9795310 | Al-Ali | Oct 2017 | B2 |
9795358 | Telfort et al. | Oct 2017 | B2 |
9795739 | Al-Ali et al. | Oct 2017 | B2 |
9801556 | Kiani | Oct 2017 | B2 |
9801588 | Weber et al. | Oct 2017 | B2 |
9808188 | Perea et al. | Nov 2017 | B1 |
9814418 | Weber et al. | Nov 2017 | B2 |
9820691 | Kiani | Nov 2017 | B2 |
9833152 | Kiani et al. | Dec 2017 | B2 |
9833180 | Shakespeare et al. | Dec 2017 | B2 |
9839379 | Al-Ali et al. | Dec 2017 | B2 |
9839381 | Weber et al. | Dec 2017 | B1 |
9847002 | Kiani et al. | Dec 2017 | B2 |
9847749 | Kiani et al. | Dec 2017 | B2 |
9848800 | Lee et al. | Dec 2017 | B1 |
9848806 | Al-Ali et al. | Dec 2017 | B2 |
9848807 | Lamego | Dec 2017 | B2 |
9861298 | Eckerbom et al. | Jan 2018 | B2 |
9861304 | Al-Ali et al. | Jan 2018 | B2 |
9861305 | Weber et al. | Jan 2018 | B1 |
9867578 | Al-Ali et al. | Jan 2018 | B2 |
9872623 | Al-Ali | Jan 2018 | B2 |
9876320 | Coverston et al. | Jan 2018 | B2 |
9877650 | Muhsin et al. | Jan 2018 | B2 |
9877686 | Al-Ali et al. | Jan 2018 | B2 |
9891079 | Dalvi | Feb 2018 | B2 |
9892564 | Cvetko | Feb 2018 | B1 |
9895107 | Al-Ali et al. | Feb 2018 | B2 |
9924893 | Schurman et al. | Mar 2018 | B2 |
9924897 | Abdul-Hafiz | Mar 2018 | B1 |
9958681 | Ko | May 2018 | B2 |
9959620 | Merlet | May 2018 | B2 |
10010379 | Gibby | Jul 2018 | B1 |
10080530 | Cheng | Sep 2018 | B2 |
10092213 | Gossler | Oct 2018 | B2 |
10255690 | Bhuruth | Apr 2019 | B2 |
10304206 | Nakazato | May 2019 | B2 |
10304251 | Pahud | May 2019 | B2 |
10318811 | Gold | Jun 2019 | B1 |
10332292 | Arnicar | Jun 2019 | B1 |
10403047 | Comer | Sep 2019 | B1 |
10413666 | Al-Ali | Sep 2019 | B2 |
10803608 | Na | Oct 2020 | B1 |
10856796 | Berme | Dec 2020 | B1 |
10885530 | Mercury | Jan 2021 | B2 |
10888402 | Kim | Jan 2021 | B2 |
20020120310 | Linden | Aug 2002 | A1 |
20020140675 | Ali | Oct 2002 | A1 |
20040102683 | Khanuja | May 2004 | A1 |
20050254134 | Yamamoto | Nov 2005 | A1 |
20060241792 | Pretlove | Oct 2006 | A1 |
20080081982 | Simon | Apr 2008 | A1 |
20080183074 | Carls | Jul 2008 | A1 |
20080183190 | Adcox | Jul 2008 | A1 |
20080270188 | Garg | Oct 2008 | A1 |
20090275844 | Al-Ali | Nov 2009 | A1 |
20090311655 | Karkanias | Dec 2009 | A1 |
20090312660 | Guarino | Dec 2009 | A1 |
20100004518 | Vo et al. | Jan 2010 | A1 |
20100030040 | Poeze | Feb 2010 | A1 |
20100199232 | Mistry | Aug 2010 | A1 |
20100249540 | Lisogurski | Sep 2010 | A1 |
20100298656 | McCombie | Nov 2010 | A1 |
20110066007 | Banet | Mar 2011 | A1 |
20110082711 | Poeze et al. | Apr 2011 | A1 |
20110105854 | Kiani et al. | May 2011 | A1 |
20110124996 | Reinke | May 2011 | A1 |
20110125060 | Telfort et al. | May 2011 | A1 |
20110202084 | Hoem | Aug 2011 | A1 |
20110208015 | Welch et al. | Aug 2011 | A1 |
20110213273 | Telfort | Sep 2011 | A1 |
20110230733 | Al-Ali | Sep 2011 | A1 |
20110257552 | Banet | Oct 2011 | A1 |
20110295301 | Hoem | Dec 2011 | A1 |
20110295302 | Mohl | Dec 2011 | A1 |
20120003933 | Baker | Jan 2012 | A1 |
20120005624 | Vesely | Jan 2012 | A1 |
20120054691 | Nurmi | Mar 2012 | A1 |
20120109676 | Landau | May 2012 | A1 |
20120113140 | Hilliges | May 2012 | A1 |
20120124506 | Stuebe | May 2012 | A1 |
20120157806 | Steiger | Jun 2012 | A1 |
20120165629 | Merritt et al. | Jun 2012 | A1 |
20120209082 | Al-Ali | Aug 2012 | A1 |
20120209084 | Olsen et al. | Aug 2012 | A1 |
20120226117 | Lamego | Sep 2012 | A1 |
20120249741 | Maciocci | Oct 2012 | A1 |
20120283524 | Kiani et al. | Nov 2012 | A1 |
20120319816 | Al-Ali | Dec 2012 | A1 |
20130009993 | Horseman | Jan 2013 | A1 |
20130017791 | Wang | Jan 2013 | A1 |
20130023214 | Wang | Jan 2013 | A1 |
20130023215 | Wang | Jan 2013 | A1 |
20130023775 | Lamego et al. | Jan 2013 | A1 |
20130041591 | Lamego | Feb 2013 | A1 |
20130050258 | Liu | Feb 2013 | A1 |
20130050432 | Perez | Feb 2013 | A1 |
20130060147 | Welch et al. | Mar 2013 | A1 |
20130078977 | Anderson | Mar 2013 | A1 |
20130096405 | Garfio | Apr 2013 | A1 |
20130096936 | Sampath et al. | Apr 2013 | A1 |
20130147838 | Small | Jun 2013 | A1 |
20130149684 | Ezzell | Jun 2013 | A1 |
20130162632 | Varga | Jun 2013 | A1 |
20130234934 | Champion | Sep 2013 | A1 |
20130243021 | Siskavich | Sep 2013 | A1 |
20130253334 | Al-Ali | Sep 2013 | A1 |
20130262730 | Al-Ali | Oct 2013 | A1 |
20130267792 | Petersen | Oct 2013 | A1 |
20130293530 | Perez | Nov 2013 | A1 |
20130296672 | O'Neil et al. | Nov 2013 | A1 |
20130296713 | Al-Ali et al. | Nov 2013 | A1 |
20130316652 | Wang | Nov 2013 | A1 |
20130324808 | Al-Ali et al. | Dec 2013 | A1 |
20130331660 | Al-Ali et al. | Dec 2013 | A1 |
20130337842 | Wang | Dec 2013 | A1 |
20140012100 | Al-Ali et al. | Jan 2014 | A1 |
20140012509 | Barber | Jan 2014 | A1 |
20140035925 | Muranjan | Feb 2014 | A1 |
20140051953 | Lamego et al. | Feb 2014 | A1 |
20140065972 | Wang | Mar 2014 | A1 |
20140081175 | Telfort | Mar 2014 | A1 |
20140120564 | Workman et al. | May 2014 | A1 |
20140121482 | Merritt et al. | May 2014 | A1 |
20140127137 | Bellott et al. | May 2014 | A1 |
20140129493 | Leopold | May 2014 | A1 |
20140135588 | Al-Ali et al. | May 2014 | A1 |
20140163344 | Al-Ali | Jun 2014 | A1 |
20140163376 | Caluser | Jun 2014 | A1 |
20140163402 | Lamego et al. | Jun 2014 | A1 |
20140166076 | Kiani et al. | Jun 2014 | A1 |
20140171763 | Diab | Jun 2014 | A1 |
20140180038 | Kiani | Jun 2014 | A1 |
20140180154 | Sierra et al. | Jun 2014 | A1 |
20140180160 | Brown et al. | Jun 2014 | A1 |
20140187973 | Brown et al. | Jul 2014 | A1 |
20140207489 | Wartena | Jul 2014 | A1 |
20140213864 | Abdul-Hafiz et al. | Jul 2014 | A1 |
20140222462 | Shakil | Aug 2014 | A1 |
20140225918 | Mittal | Aug 2014 | A1 |
20140232747 | Sugimoto | Aug 2014 | A1 |
20140249431 | Banet | Sep 2014 | A1 |
20140266790 | Al-Ali et al. | Sep 2014 | A1 |
20140266983 | Christensen | Sep 2014 | A1 |
20140267419 | Ballard | Sep 2014 | A1 |
20140275808 | Poeze et al. | Sep 2014 | A1 |
20140275835 | Lamego et al. | Sep 2014 | A1 |
20140275871 | Lamego et al. | Sep 2014 | A1 |
20140275872 | Merritt et al. | Sep 2014 | A1 |
20140276115 | Dalvi et al. | Sep 2014 | A1 |
20140285521 | Kimura | Sep 2014 | A1 |
20140288400 | Diab et al. | Sep 2014 | A1 |
20140316217 | Purdon et al. | Oct 2014 | A1 |
20140316218 | Purdon et al. | Oct 2014 | A1 |
20140316228 | Blank et al. | Oct 2014 | A1 |
20140323818 | Axelgaard | Oct 2014 | A1 |
20140323825 | Al-Ali et al. | Oct 2014 | A1 |
20140323897 | Brown et al. | Oct 2014 | A1 |
20140323898 | Purdon et al. | Oct 2014 | A1 |
20140330092 | Al-Ali et al. | Nov 2014 | A1 |
20140330098 | Merritt et al. | Nov 2014 | A1 |
20140342766 | Wang | Nov 2014 | A1 |
20140357966 | Al-Ali et al. | Dec 2014 | A1 |
20140368539 | Yeh | Dec 2014 | A1 |
20150005600 | Blank et al. | Jan 2015 | A1 |
20150011907 | Purdon et al. | Jan 2015 | A1 |
20150012231 | Poeze et al. | Jan 2015 | A1 |
20150032029 | Al-Ali et al. | Jan 2015 | A1 |
20150038859 | Dalvi et al. | Feb 2015 | A1 |
20150067516 | Park | Mar 2015 | A1 |
20150067580 | Um | Mar 2015 | A1 |
20150080754 | Purdon et al. | Mar 2015 | A1 |
20150087936 | Al-Ali et al. | Mar 2015 | A1 |
20150088546 | Balram | Mar 2015 | A1 |
20150091943 | Lee | Apr 2015 | A1 |
20150094546 | Al-Ali | Apr 2015 | A1 |
20150097701 | Al-Ali | Apr 2015 | A1 |
20150099950 | Al-Ali et al. | Apr 2015 | A1 |
20150099955 | Al-Ali | Apr 2015 | A1 |
20150101844 | Al-Ali et al. | Apr 2015 | A1 |
20150106121 | Muhsin | Apr 2015 | A1 |
20150112151 | Muhsin et al. | Apr 2015 | A1 |
20150116076 | Al-Ali et al. | Apr 2015 | A1 |
20150119733 | Grubis | Apr 2015 | A1 |
20150125832 | Tran | May 2015 | A1 |
20150150518 | Cremades Peris | Jun 2015 | A1 |
20150153571 | Ballard | Jun 2015 | A1 |
20150157326 | Schiemanck | Jun 2015 | A1 |
20150165312 | Kiani | Jun 2015 | A1 |
20150186602 | Pipke | Jul 2015 | A1 |
20150196249 | Brown et al. | Jul 2015 | A1 |
20150205931 | Wang | Jul 2015 | A1 |
20150212576 | Ambrus | Jul 2015 | A1 |
20150215925 | Wang | Jul 2015 | A1 |
20150216459 | Al-Ali et al. | Aug 2015 | A1 |
20150238722 | Al-Ali | Aug 2015 | A1 |
20150245773 | Lamego et al. | Sep 2015 | A1 |
20150245794 | Al-Ali | Sep 2015 | A1 |
20150257689 | Al-Ali et al. | Sep 2015 | A1 |
20150261291 | Mikhailov | Sep 2015 | A1 |
20150272514 | Kiani et al. | Oct 2015 | A1 |
20150277699 | Algreatly | Oct 2015 | A1 |
20150286515 | Monk | Oct 2015 | A1 |
20150301597 | Rogers | Oct 2015 | A1 |
20150351697 | Weber et al. | Dec 2015 | A1 |
20150356263 | Chatterjee | Dec 2015 | A1 |
20150359429 | Al-Ali et al. | Dec 2015 | A1 |
20150366507 | Blank | Dec 2015 | A1 |
20160019352 | Cohen | Jan 2016 | A1 |
20160027216 | da Veiga | Jan 2016 | A1 |
20160029906 | Tompkins | Feb 2016 | A1 |
20160029932 | Al-Ali | Feb 2016 | A1 |
20160058347 | Reichgott et al. | Mar 2016 | A1 |
20160066824 | Al-Ali et al. | Mar 2016 | A1 |
20160081552 | Wojtczuk et al. | Mar 2016 | A1 |
20160095543 | Telfort et al. | Apr 2016 | A1 |
20160095548 | Al-Ali et al. | Apr 2016 | A1 |
20160103598 | Al-Ali et al. | Apr 2016 | A1 |
20160116979 | Border | Apr 2016 | A1 |
20160124501 | Lam | May 2016 | A1 |
20160135516 | Cobbett | May 2016 | A1 |
20160143548 | Al-Ali | May 2016 | A1 |
20160166182 | Al-Ali et al. | Jun 2016 | A1 |
20160166183 | Poeze et al. | Jun 2016 | A1 |
20160180044 | Delisle | Jun 2016 | A1 |
20160183836 | Muuranto | Jun 2016 | A1 |
20160189082 | Garrish | Jun 2016 | A1 |
20160192869 | Kiani et al. | Jul 2016 | A1 |
20160196388 | Lamego | Jul 2016 | A1 |
20160197436 | Barker et al. | Jul 2016 | A1 |
20160213281 | Eckerbom et al. | Jul 2016 | A1 |
20160225192 | Jones | Aug 2016 | A1 |
20160228043 | O'Neil et al. | Aug 2016 | A1 |
20160228640 | Pindado | Aug 2016 | A1 |
20160233632 | Scruggs et al. | Aug 2016 | A1 |
20160234944 | Schmidt et al. | Aug 2016 | A1 |
20160235323 | Tadi | Aug 2016 | A1 |
20160239252 | Nakagawa | Aug 2016 | A1 |
20160270735 | Diab et al. | Sep 2016 | A1 |
20160274358 | Yajima | Sep 2016 | A1 |
20160278644 | He | Sep 2016 | A1 |
20160283665 | Sampath et al. | Sep 2016 | A1 |
20160287090 | Al-Ali et al. | Oct 2016 | A1 |
20160287207 | Xue | Oct 2016 | A1 |
20160287786 | Kiani | Oct 2016 | A1 |
20160296169 | McHale et al. | Oct 2016 | A1 |
20160310005 | Pekander | Oct 2016 | A1 |
20160310047 | Pekander | Oct 2016 | A1 |
20160310052 | Al-Ali et al. | Oct 2016 | A1 |
20160314260 | Kiani | Oct 2016 | A1 |
20160314624 | Li | Oct 2016 | A1 |
20160324486 | Al-Ali et al. | Nov 2016 | A1 |
20160324488 | Olsen | Nov 2016 | A1 |
20160327984 | Al-Ali et al. | Nov 2016 | A1 |
20160328528 | Al-Ali et al. | Nov 2016 | A1 |
20160330573 | Masoud | Nov 2016 | A1 |
20160331332 | Al-Ali | Nov 2016 | A1 |
20160335403 | Mabotuwana | Nov 2016 | A1 |
20160335800 | DeStories | Nov 2016 | A1 |
20160351776 | Schneider | Dec 2016 | A1 |
20160357491 | Oya | Dec 2016 | A1 |
20160367173 | Dalvi et al. | Dec 2016 | A1 |
20160371886 | Thompson | Dec 2016 | A1 |
20170000394 | Al-Ali et al. | Jan 2017 | A1 |
20170000422 | Moturu | Jan 2017 | A1 |
20170004106 | Joshua | Jan 2017 | A1 |
20170007134 | Al-Ali et al. | Jan 2017 | A1 |
20170007198 | Al-Ali et al. | Jan 2017 | A1 |
20170010850 | Kobayashi | Jan 2017 | A1 |
20170014083 | Diab et al. | Jan 2017 | A1 |
20170014084 | Al-Ali et al. | Jan 2017 | A1 |
20170027456 | Kinast et al. | Feb 2017 | A1 |
20170042488 | Muhsin | Feb 2017 | A1 |
20170046872 | Geselowitz | Feb 2017 | A1 |
20170055851 | Al-Ali | Mar 2017 | A1 |
20170055882 | Al-Ali et al. | Mar 2017 | A1 |
20170055887 | Al-Ali | Mar 2017 | A1 |
20170055896 | Al-Ali et al. | Mar 2017 | A1 |
20170065379 | Cowburn | Mar 2017 | A1 |
20170079594 | Telfort et al. | Mar 2017 | A1 |
20170083104 | Namba | Mar 2017 | A1 |
20170086723 | Al-Ali et al. | Mar 2017 | A1 |
20170092002 | Mullins | Mar 2017 | A1 |
20170111824 | Wang | Apr 2017 | A1 |
20170140101 | Anderson | May 2017 | A1 |
20170143281 | Olsen | May 2017 | A1 |
20170147774 | Kiani | May 2017 | A1 |
20170156620 | Al-Ali et al. | Jun 2017 | A1 |
20170161455 | Grady | Jun 2017 | A1 |
20170172415 | Wik | Jun 2017 | A1 |
20170172515 | Banet | Jun 2017 | A1 |
20170172696 | Saget | Jun 2017 | A1 |
20170173632 | Al-Ali | Jun 2017 | A1 |
20170177816 | Ribble | Jun 2017 | A1 |
20170178356 | Bhuruth | Jun 2017 | A1 |
20170181645 | Mahalingam | Jun 2017 | A1 |
20170186157 | Boettger | Jun 2017 | A1 |
20170187146 | Kiani et al. | Jun 2017 | A1 |
20170188919 | Al-Ali et al. | Jul 2017 | A1 |
20170196464 | Jansen et al. | Jul 2017 | A1 |
20170196470 | Lamego et al. | Jul 2017 | A1 |
20170200296 | Jones | Jul 2017 | A1 |
20170202490 | Al-Ali et al. | Jul 2017 | A1 |
20170206676 | Nakazato | Jul 2017 | A1 |
20170215261 | Potucek | Jul 2017 | A1 |
20170215388 | Delecroix | Aug 2017 | A1 |
20170216524 | Haider | Aug 2017 | A1 |
20170224262 | Al-Ali | Aug 2017 | A1 |
20170228516 | Sampath et al. | Aug 2017 | A1 |
20170242480 | Dees | Aug 2017 | A1 |
20170244796 | Liu | Aug 2017 | A1 |
20170245790 | Al-Ali et al. | Aug 2017 | A1 |
20170251974 | Shreim et al. | Sep 2017 | A1 |
20170251975 | Shreim et al. | Sep 2017 | A1 |
20170255838 | Norieda | Sep 2017 | A1 |
20170258403 | Abdul-Hafiz et al. | Sep 2017 | A1 |
20170262064 | Ofir | Sep 2017 | A1 |
20170300824 | Peng | Oct 2017 | A1 |
20170311891 | Kiani et al. | Nov 2017 | A1 |
20170315774 | Meerbeek | Nov 2017 | A1 |
20170316561 | Helm | Nov 2017 | A1 |
20170323479 | Mokuya | Nov 2017 | A1 |
20170325684 | Vartiovaara | Nov 2017 | A1 |
20170325728 | Al-Ali et al. | Nov 2017 | A1 |
20170329480 | Ishikawa | Nov 2017 | A1 |
20170332976 | Al-Ali et al. | Nov 2017 | A1 |
20170340293 | Al-Ali et al. | Nov 2017 | A1 |
20170351909 | Kaehler | Dec 2017 | A1 |
20170357397 | Masumoto | Dec 2017 | A1 |
20170359467 | Norris | Dec 2017 | A1 |
20170360310 | Kiani et al. | Dec 2017 | A1 |
20170367632 | Al-Ali et al. | Dec 2017 | A1 |
20170367771 | Tako | Dec 2017 | A1 |
20180000415 | Gupta | Jan 2018 | A1 |
20180005424 | Niinuma | Jan 2018 | A1 |
20180008146 | Al-Ali et al. | Jan 2018 | A1 |
20180014752 | Al-Ali et al. | Jan 2018 | A1 |
20180024630 | Goossens | Jan 2018 | A1 |
20180025116 | Carrington | Jan 2018 | A1 |
20180028124 | Al-Ali et al. | Feb 2018 | A1 |
20180055385 | Al-Ali | Mar 2018 | A1 |
20180055390 | Kiani et al. | Mar 2018 | A1 |
20180055430 | Diab et al. | Mar 2018 | A1 |
20180059812 | Inomata | Mar 2018 | A1 |
20180064381 | Shakespeare et al. | Mar 2018 | A1 |
20180069776 | Lamego et al. | Mar 2018 | A1 |
20180074332 | Li | Mar 2018 | A1 |
20180075658 | Lanier | Mar 2018 | A1 |
20180080774 | Sink | Mar 2018 | A1 |
20180082480 | White | Mar 2018 | A1 |
20180084224 | McNelley | Mar 2018 | A1 |
20180088682 | Tsang | Mar 2018 | A1 |
20180103874 | Lee et al. | Apr 2018 | A1 |
20180116575 | Perea et al. | May 2018 | A1 |
20180125368 | Lamego et al. | May 2018 | A1 |
20180125430 | Al-Ali et al. | May 2018 | A1 |
20180130325 | Kiani et al. | May 2018 | A1 |
20180132769 | Weber et al. | May 2018 | A1 |
20180132770 | Lamego | May 2018 | A1 |
20180139203 | Dolan | May 2018 | A1 |
20180140362 | Cal | May 2018 | A1 |
20180144497 | Hirota | May 2018 | A1 |
20180147024 | Kall | May 2018 | A1 |
20180153445 | Noda | Jun 2018 | A1 |
20180157344 | Toff | Jun 2018 | A1 |
20180160881 | Okabe | Jun 2018 | A1 |
20180181810 | Jhawar | Jun 2018 | A1 |
20180188807 | Cimenser | Jul 2018 | A1 |
20180188831 | Lyons | Jul 2018 | A1 |
20180189556 | Shamir | Jul 2018 | A1 |
20180200018 | Silva | Jul 2018 | A1 |
20180217734 | Koenig | Aug 2018 | A1 |
20180225993 | Buras | Aug 2018 | A1 |
20180235478 | Khachaturian | Aug 2018 | A1 |
20180242920 | Hresko | Aug 2018 | A1 |
20180247024 | Divine | Aug 2018 | A1 |
20180250510 | Ziv | Sep 2018 | A1 |
20180251230 | Chavez | Sep 2018 | A1 |
20180253947 | Muhsin | Sep 2018 | A1 |
20180261329 | Blander | Sep 2018 | A1 |
20180264945 | Torii | Sep 2018 | A1 |
20180275837 | Getz | Sep 2018 | A1 |
20180279947 | Ummat | Oct 2018 | A1 |
20180285094 | Housel | Oct 2018 | A1 |
20180286132 | Cvetko | Oct 2018 | A1 |
20180300031 | Parkinson | Oct 2018 | A1 |
20180303558 | Thomas | Oct 2018 | A1 |
20180310822 | Indorf | Nov 2018 | A1 |
20180315490 | Jaruzel, II | Nov 2018 | A1 |
20180317826 | Muhsin | Nov 2018 | A1 |
20180333643 | Luisi | Nov 2018 | A1 |
20180342079 | Yaguchi | Nov 2018 | A1 |
20180344308 | Nawana | Dec 2018 | A1 |
20180365897 | Pahud | Dec 2018 | A1 |
20190005724 | Pahud | Jan 2019 | A1 |
20190033989 | Wang | Jan 2019 | A1 |
20190034076 | Vinayak | Jan 2019 | A1 |
20190043259 | Wang | Feb 2019 | A1 |
20190053855 | Siemionow | Feb 2019 | A1 |
20190064520 | Christensen | Feb 2019 | A1 |
20190066538 | Chao | Feb 2019 | A1 |
20190079156 | Krellmann | Mar 2019 | A1 |
20190080515 | Geri | Mar 2019 | A1 |
20190121522 | Davis | Apr 2019 | A1 |
20190138183 | Rosas | May 2019 | A1 |
20190141291 | McNelley | May 2019 | A1 |
20190146578 | Ikuta | May 2019 | A1 |
20190149797 | Casas | May 2019 | A1 |
20190155382 | Ikuta | May 2019 | A1 |
20190183577 | Fahim | Jun 2019 | A1 |
20190206104 | Rahman | Jul 2019 | A1 |
20190231436 | Panse | Aug 2019 | A1 |
20190240508 | Friman | Aug 2019 | A1 |
20190243138 | Peltola | Aug 2019 | A1 |
20190250873 | Blume | Aug 2019 | A1 |
20190254754 | Johnson | Aug 2019 | A1 |
20190298270 | Al-Ali | Oct 2019 | A1 |
20190333276 | Brown | Oct 2019 | A1 |
20190340434 | Chiu | Nov 2019 | A1 |
20190340827 | Abercrombie | Nov 2019 | A1 |
20190348169 | Gibby | Nov 2019 | A1 |
20190355182 | Nozaki | Nov 2019 | A1 |
20190380792 | Poltaretskyi | Dec 2019 | A1 |
20190387102 | Norris | Dec 2019 | A1 |
20200004328 | Blume | Jan 2020 | A1 |
20200005542 | Kocharlakota | Jan 2020 | A1 |
20200046473 | Kim | Feb 2020 | A1 |
20200074740 | Singh | Mar 2020 | A1 |
20200125322 | Wilde | Apr 2020 | A1 |
20200210679 | Kusens | Jul 2020 | A1 |
20200319770 | Varga | Oct 2020 | A1 |
20200321099 | Holladay | Oct 2020 | A1 |
20200327670 | Connor | Oct 2020 | A1 |
20200372714 | Soryal | Nov 2020 | A1 |
20200405151 | Berger | Dec 2020 | A1 |
Number | Date | Country |
---|---|---|
20180300919 A1 | Oct 2018 | US |
Number | Date | Country |
---|---|---|
62463452 | Feb 2017 | US |
62463517 | Feb 2017 | US |