Noninvasive sensor system with visual infographic display

Abstract
A sensor system for obtaining and displaying information relating to physiological parameters, such as total hemoglobin and pulse rate, for use by a user such as an athlete. The system can present the measured physiological parameters to the user in a useful way. For example, the system can display a visual multi quadrant infographic display, which can present the total hemoglobin values measured by the system in a particular season. The system can also display a visual elevation infographic display, which can present a comparison of the total hemoglobin values measured by the system over a period of time and/or at various altitudes. The system can also display a visual yin-yang infographic display, which can present a comparison of one or more metrics calculated by the system or one or more parameters measured by the system. The system can provide useful information about the user's health and/or well-being and allow the user to quickly and easily view and interpret relevant information.
Description
BACKGROUND OF THE INVENTION

User monitoring devices generally include sensors, processing equipment, and displays for obtaining and analyzing a medical patient's physiological parameters. Physiological parameters include, for example, respiratory rate, SpO2 level, pulse rate, total hemoglobin (tHb), oxygen content, carbon monoxide and methemoglobin content, and blood pressure, among others. Users can use the physiological parameters obtained from the user to determine an overall health, wellness, and/or fitness of the user. Users can use the physiological parameters to determine and make adjustments in a diet and/or exercise routine to enhance athletic performance.


User monitors capable of measuring pulse oximetry parameters, such as SpO2 and pulse rate in addition to advanced parameters, such as HbCO, HbMet and total hemoglobin (Hbt, THb, or SpHb) and corresponding multiple wavelength optical sensors are described in at least U.S. patent application Ser. No. 11/367,013, filed Mar. 1, 2006 and entitled Multiple Wavelength Sensor Emitters and U.S. patent application Ser. No. 11/366,208, filed Mar. 1, 2006 and entitled Noninvasive Multi-Parameter Patient Monitor, both assigned to Cercacor Laboratories of Irvine, Calif. (Cercacor) and both incorporated by reference herein. Further, noninvasive blood parameter monitors and corresponding multiple wavelength optical sensors, such as Rainbow™ adhesive and reusable sensors and RAD57™ and Radical-7™ monitors for measuring SpO2, pulse rate, perfusion index, signal quality, HbCO, and HbMet among other parameters are also available from Masimo Corporation, Irvine, Calif. (Masimo).


Advanced physiological monitoring systems may incorporate pulse oximetry in addition to advanced features for the calculation and display of other blood parameters, such as carboxyhemoglobin (HbCO), methemoglobin (HbMet) and total hemoglobin (Hbt or SpHb), as a few examples. Advanced physiological monitors and corresponding multiple wavelength optical sensors capable of measuring parameters in addition to SpO2, such as HbCO, HbMet and Hbt are described in at least U.S. patent application Ser. No. 11/367,013, filed Mar. 1, 2006, titled Multiple Wavelength Sensor Emitters and U.S. patent application Ser. No. 11/366,208, filed Mar. 1, 2006, titled Noninvasive Multi-Parameter Patient Monitor, which are each hereby incorporated by reference herein in their entirety. Further, noninvasive blood parameter monitors and corresponding multiple wavelength optical sensors, such as Rainbow™ adhesive and reusable sensors and RAD57™ and Radical-7™ monitors for measuring SpO2, pulse rate, perfusion index (PI), signal quality (SiQ), pleth variability index (PVI), HbCO and HbMet among other parameters are also available from Masimo.


SUMMARY

For purposes of summarizing the disclosure, certain aspects, advantages and novel features of several embodiments have been described herein. It is to be understood that not necessarily all such advantages can be achieved in accordance with any particular embodiment of the embodiments disclosed herein. Thus, the embodiments disclosed herein can be embodied or carried out in a manner that achieves or optimizes one advantage or group of advantages as taught herein without necessarily achieving other advantages as can be taught or suggested herein.


According to some embodiments, a sports training infographic method for presenting user data in a useful way for user use can include obtaining, by at least one sensor, the user data over a period of time, wherein the user data comprises a plurality of total hemoglobin measurements; displaying, by a user interface, a first total hemoglobin measurement of the plurality of total hemoglobin measurements, the displaying comprising: providing, by the user interface, an indication of the first total hemoglobin measurement associated with a first season of a plurality of seasons in a graphical presentation, wherein the graphical presentation comprises a plurality of quadrants, wherein a first quadrant of the plurality of quadrants is associated with the first season and a second quadrant is associated with a second season; displaying, by the user interface, the first total hemoglobin measurement in the first quadrant configured to indicate when the first total hemoglobin measurement was obtained; and providing, by the user interface in the graphical presentation, an optimal indicator, wherein the optimal indicator is configured to indicate an optimal total hemoglobin measurement to allow the user to compare the first total hemoglobin measurement to the optimal total hemoglobin measurement.


In some embodiments, the plurality of total hemoglobin measurements further comprises a second total hemoglobin measurement. In some embodiments, the method further comprises displaying, by the user interface in the graphical presentation, the second total hemoglobin measurement disposed at a location clockwise from the first total hemoglobin measurement. In some embodiments, the method comprises providing, by the user interface, one or more insights configured to notify the user of relevant information about the user data. In some embodiments, the user interface is configured to display a top navigation bar. In some embodiments, the user interface is configured to display a bottom navigation bar.


In some embodiments, the user interface is configured to allow a user to select a filter from a plurality of filters, wherein the filter indicates a physiological parameter from a plurality of physiological parameters. In some embodiments, the user interface is configured to allow a user to select an activity from a plurality of activities, wherein the user interface is configured to display the indication of the first total hemoglobin measurement associated with the activity.


According to some embodiments, a sports training infographic system for presenting user data in a useful way for user use can include a sensor configured to obtain the user data over a period of time, wherein the user data comprises a plurality of total hemoglobin measurements; a database configured to store the user data; and a user interface generated by a system having one or more hardware processors and one or more servers, wherein the user interface is configured to display a first total hemoglobin measurement of the plurality of total hemoglobin measurements, and an indication of the first total hemoglobin measurement associated with a first season of a plurality of seasons in a graphical presentation comprising a plurality of quadrants, wherein a first quadrant of the plurality of quadrants is associated with the first season and a second quadrant is associated with a second season; wherein the user interface is configured to provide the first total hemoglobin measurement in the first quadrant to indicate when the first total hemoglobin measurement was obtained, and wherein the user interface is configured to provide, in the graphical presentation, an optimal indicator configured to indicate an optimal total hemoglobin measurement to allow the user to compare the first total hemoglobin measurement to the optimal total hemoglobin measurement.


According to some embodiments, a sports training infographic method for presenting user data in a useful way for user use can include obtaining, by at least one sensor, the user data over a period of time, wherein the user data comprises a plurality of total hemoglobin measurements; displaying, by a user interface, a first total hemoglobin measurement of the plurality of total hemoglobin measurements, the displaying comprising: providing, by the user interface in a graphical presentation, an indication of the first total hemoglobin measurement according to changes in elevation, wherein the first total hemoglobin measurement is provided according to an elevation at which it was obtained, displaying, by the user interface, an image representing the first total hemoglobin measurement, wherein the user interface is configured to receive a selection by a user of the plurality of total hemoglobin measurements.


In some embodiments, the plurality of total hemoglobin measurements further comprises a second total hemoglobin measurement. In some embodiments, the method further comprises displaying, by the user interface in the graphical presentation, the second total hemoglobin measurement disposed at a location to the right of the first total hemoglobin measurement. In some embodiments, the first total hemoglobin measurement comprises an average of a subset of the plurality of total hemoglobin measurements obtained by the at least one sensor over a predetermined period of time.


In some embodiments, the method comprises calculating, by the sports training infographic, the average of the plurality of total hemoglobin measurements obtained by the at least one sensor over a predetermined period of time. In some embodiments, the predetermined period of time comprises a week. In some embodiments, the user interface is configured to display a top navigation bar. In some embodiments, the user interface is configured to display a bottom navigation bar.


In some embodiments, the user interface is configured to allow a user to select a filter from a plurality of filters, wherein the filter indicates a physiological parameter from a plurality of physiological parameters. In some embodiments, the user interface is configured to allow a user to select an activity from a plurality of activities, wherein the user interface is configured to display the indication of the first total hemoglobin measurement associated with the activity.


According to some embodiments, a sports training infographic system for presenting user data in a useful way for user use can include a sensor configured to obtain the user data over a period of time, wherein the user data comprises a plurality of total hemoglobin measurements; a database configured to store the user data; a user interface generated by a system having one or more hardware processors and one or more servers, wherein the user interface is configured to display a first total hemoglobin measurement of the plurality of total hemoglobin measurements, and provide in a graphical presentation, an indication of the first total hemoglobin measurement according to changes in elevation, wherein the first total hemoglobin measurement is provided according to an elevation at which it was obtained, and wherein the user interface is configured to display an image representing the first total hemoglobin measurement, and wherein the user interface is configured to receive a selection by a user of the plurality of total hemoglobin measurements.


According to some embodiments, a sports training infographic method for presenting user data in a useful way for user use can include obtaining, by at least one sensor, the user data over a period of time, wherein the user data comprises a plurality of physiological parameters; displaying, by a user interface, a total hemoglobin measurement and a resting heart rate of the plurality of physiological parameters, the displaying comprising: providing, by the user interface in a graphical presentation, a comparison of the total hemoglobin measurement and the resting heart rate; displaying, by the user interface, the total hemoglobin measurement in a first side of the graphical presentation; displaying, by the user interface, the resting heart rate in a second side of the graphical presentation; comparing, by the sports training infographic, the total hemoglobin measurement to an optimal total hemoglobin measurement; comparing, by the sports training infographic, the resting heart rate to an optimal resting heart rate; and adjusting, by the user interface, a size of each of the first side and the second side based on the comparison of the total hemoglobin measurement and the comparison of the resting heart rate.
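
The size adjustment described in the preceding embodiment can be illustrated with a small numerical sketch. The following Python example is hypothetical; the function name, the closeness score, and the example values are illustrative assumptions rather than the claimed method. It simply shows one way a side's share of the graphical presentation could grow as its parameter approaches the optimal value.

def side_sizes(thb, thb_optimal, resting_hr, resting_hr_optimal):
    # Score each parameter by closeness to its optimal value (1.0 means optimal).
    # These scores and the normalization are illustrative assumptions only.
    thb_score = max(0.0, 1.0 - abs(thb - thb_optimal) / thb_optimal)
    hr_score = max(0.0, 1.0 - abs(resting_hr - resting_hr_optimal) / resting_hr_optimal)
    total = (thb_score + hr_score) or 1.0  # avoid division by zero
    # Normalize so the two sides together always fill the whole graphic.
    return thb_score / total, hr_score / total

print(side_sizes(14.2, 15.0, 52, 60))  # roughly (0.52, 0.48): the tHb side is slightly larger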


In some embodiments, the method comprises displaying at least one metric calculated based on the user data. In some embodiments, the method comprises displaying an oxygen content measurement around a perimeter of the graphical presentation. In some embodiments, the graphical presentation comprises a first flag and a second flag, wherein the first flag extends outwardly from the first side and the second flag extends outwardly from the second side. In some embodiments, the first flag is configured to be selected by a user to send a notification to a second user to congratulate the second user. In some embodiments, the second flag is configured to be selected by a user to send a notification to a second user to encourage the second user.


In some embodiments, the user interface is configured to display a plurality of graphical presentations, wherein each of the plurality of graphical presentations indicates a different user. In some embodiments, the user interface is configured to display a top navigation bar comprising a user profile. In some embodiments, the user interface is configured to display a bottom navigation bar comprising one or more options configured to be selected by the user.


According to some embodiments, a sports training infographic system for presenting user data in a useful way for user use can include a sensor configured to obtain the user data over a period of time, wherein the user data comprises a plurality of physiological parameters; a database configured to store the user data; a user interface generated by a system having one or more hardware processors and one or more servers, wherein the user interface is configured to display a total hemoglobin measurement and a resting heart rate of the plurality of physiological parameters, wherein the user interface is configured to provide in a graphical presentation a comparison of the total hemoglobin measurement and the resting heart rate, wherein the user interface is configured to display the total hemoglobin measurement in a first side of the graphical presentation, wherein the user interface is configured to display the resting heart rate in a second side of the graphical presentation; a comparison module configured to compare the total hemoglobin measurement to an optimal total hemoglobin measurement and configured to compare the resting heart rate to an optimal resting heart rate; and an adjuster configured to adjust a size of each of the first side and the second side based on the comparison of the total hemoglobin measurement and the comparison of the resting heart rate.





BRIEF DESCRIPTION OF THE DRAWINGS

Various embodiments are depicted in the accompanying drawings for illustrative purposes, and should in no way be interpreted as limiting the scope of the embodiments. Furthermore, various features of different disclosed embodiments can be combined to form additional embodiments, which are part of this disclosure.



FIG. 1A illustrates a block diagram depicting one embodiment of a computer hardware system configured to run software for implementing one or more embodiments of the sensor system described herein.



FIGS. 2A-2B depict an example monitoring device user interface.



FIGS. 3A-3B depict an example monitoring device user interface illustrated in FIG. 1A.



FIGS. 4A-4C depict an example monitoring device user interface illustrated in FIG. 1A.



FIGS. 5A-5B depict an example monitoring device user interface illustrated in FIG. 1A.



FIGS. 6A-6C depict an example monitoring device user interface illustrated in FIG. 1A.



FIGS. 7A-7B depict an example monitoring device user interface.



FIGS. 8A-8B depict an example monitoring device user interface illustrated in FIG. 6A.



FIGS. 9A-9C depict an example monitoring device user interface illustrated in FIG. 6A.



FIGS. 10A-10C depict an example monitoring device user interface illustrated in FIG. 6A.



FIGS. 11A-11C depict an example monitoring device user interface illustrated in FIG. 6A.



FIGS. 12A-12C depict an example monitoring device user interface.





DETAILED DESCRIPTION
I. Introduction

Current athletes work extremely hard to produce results and better themselves for competition. They desire to find devices that can track and monitor their physiological parameters to understand themselves and achieve better performance. However, many current solutions offer insufficient information (only motion activity trackers, heart rate, or some basic pulse oximetry). These current devices can offer insufficient review of meaningful data (just numbers and a basic trend graph). For further insights, athletes may draw their blood once, twice, or three times in a year and review the information about certain parameters, such as their Total Hemoglobin level. Many current systems offer only single peeks into a person's health and in many instances are incapable of giving insightful data review to make better training decisions.


Similarly, many competitive elite athletes go to extreme methods of training, such as living at high elevations or training at high elevations to achieve gains in their Total Hemoglobin levels. Some athletes even sleep in tents to simulate higher elevation at home. This determination and desire leads them to work this way based on published studies that provide general information, rather than an individualized understanding of the time it takes for their specific body to acclimate at different elevations. For example, a team of endurance cyclists could benefit by knowing at what point each of their members acclimates to certain elevations, at what elevation, and by how much. Accordingly, training, style, intensity and location could all be adjusted based on this information. Currently, only invasive solutions offer insight into this information and would require an athlete to invasively test multiple times at multiple elevations to learn how they respond to elevation. Invasive procedures would be painful, expensive, and inconvenient (not mobile). Wearable and homecare technology (motion activity trackers, heart rate or some basic pulse oximetry, for example) cannot correctly complete a picture for when and how an athlete responds to elevation. Many wearable and homecare technologies do not provide insightful, intuitive and easy-to-use visual displays for communicating elevation effects.


In training, one of the struggles people face is to find daily motivation and understand how that determination positively or negatively affects them. Some traditional methods can use social interaction as a form of a support community and motivation to athletes. Many current solutions focus on social aspects solely to keep a user moving. However, many athletes are already active. Many current products help a user focus on a particular route an individual ran and/or the number of steps or elevation they climbed by using an inaccurate activity tracker. Some products allow integration of Heart Rate monitor statistics. However, many individual applications for physiological parameters and social sites fall short of providing valuable insight and comparison of blood parameters, or the balance of blood parameters, environment, and heart interaction, on an individual level or compared to top athletic or Olympic levels.


Certain graphs disclosed herein, such as the trend graph and Yin-Yang display, can offer the ability to filter the parameter and activity being displayed. This allows a user to choose to show, for example, only resting measurements for particular days or only some or all post-workout measurements. Thus, the user can filter certain results. The ability to filter can be advantageous because a user's hemoglobin value can vary drastically during a day, for example. In order for users to be able to see small seasonal variations, users can filter collected data to see the most stable and repeatable time of the day to measure, which could be the first-of-day resting period. Similarly, the user can use the devices disclosed herein to determine the most stable and repeatable time of day to measure and view a resting pulse rate.
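
As one illustrative sketch of the filtering idea above (not taken from this disclosure), a user's tHb readings could be grouped by hour of day and the hour with the smallest spread chosen as the most stable and repeatable measurement window; the function name, the three-sample minimum, and the example values are assumptions.

from collections import defaultdict
from statistics import pstdev

def most_stable_hour(readings):
    """readings: list of (hour_of_day, thb_value) tuples."""
    by_hour = defaultdict(list)
    for hour, value in readings:
        by_hour[hour].append(value)
    # Only consider hours with enough samples to estimate variability.
    candidates = {h: vals for h, vals in by_hour.items() if len(vals) >= 3}
    return min(candidates, key=lambda h: pstdev(candidates[h])) if candidates else None

readings = [(6, 14.1), (6, 14.2), (6, 14.1), (18, 13.2), (18, 14.6), (18, 13.9)]
print(most_stable_hour(readings))  # 6, i.e. early-morning resting readings are most repeatable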


Advantageously, some embodiments of the system disclosed herein can allow the user to filter the activity displayed in the graphs. In many instances, the system can be configured to allow a user to choose to display total hemoglobin (tHb) or Pulse Rate (PR). Other parameters, including Functional and Fractional Oxygen Saturation, Oxygen Content, Carbon Monoxide, Methemoglobin, Perfusion Index, Pleth Variability Index, and/or Respiration Rate, among others, can be displayed through the system.


Alternative embodiments can include other visual focus and identification elements. In some embodiments, color and glowing effects can be used to indicate points at which the user's data reflects a significant change. In some embodiments, a user may see a 3D sphere that swells in size to reflect the climb in tHb values. In yet other embodiments of the system disclosed herein, additional lifestyle integration variations of sensors (such as bands, watches, shirts, etc.) as well as variations on visual display to 3D augmented reality or heads-up display units can be used. In some embodiments, the one or more graphs can add a multiple factor overlay. In this embodiment, the system can compare seasonal variation of atmospheric pollution levels with seasonal variations in the athlete's noninvasive CO readings, or the seasonal variations in temperature with pulse rate. In additional embodiments, the system can compare elevation and tHb with the changes of atmospheric pollution levels as well as cross reference with multiple parameters. In additional embodiments, the system can compare respiration rate (pleth-acquired, acoustic, or ECG-acquired) with heart rate and pollen levels, which can help an athlete with respiratory conditions, such as asthma. Advantageously, embodiments of the system disclosed herein can allow a user to understand a balance of their respiratory system.


This disclosure describes embodiments of noninvasive sensor systems that can enable a user to view, compare, and/or interpret information relating to the respiratory system, for example, via a computing device, which may contain more advanced functionality than traditional systems and devices. The computing device can be, for instance, a cellphone or smartphone, tablet, laptop, personal digital assistant (PDA), and/or the like.


User Interfaces


Generally, the embodiments described herein can depict several example user interfaces that may be implemented in a user computing device. The user interfaces shown can depict example displays generated by the noninvasive sensor system and may be implemented in any of the user devices described herein. The example user device shown in FIGS. 2A-12C may have any of the features of the user devices described herein.


The user interfaces shown may be implemented in a mobile application such as an application that runs on a mobile operating system such as the Android™ operating system available from Google™ or the iOS™ operating system available from Apple™. Alternatively, or in addition to being a mobile application, the user interfaces shown can be implemented in a web application that runs in a browser.


The user interfaces shown are merely examples that illustrate some example embodiments described herein and may be varied in other embodiments. For instance, user interface controls shown may include buttons, touch-selective components and the like which may be altered to include any type of user interface control including, but not limited to, checkboxes, radio buttons, select boxes, dropdown boxes, textboxes or any combination of the same. Likewise, the different user interface controls may be combined or their functionality may be spread apart amongst additional controls while retaining the similar or same functionality as shown and described herein with respect to FIGS. 2A through 12C. Although touchscreen interfaces are shown, other devices may implement similar user interfaces with other types of user input devices such as a mouse, keyboard, stylus, or the like.



FIG. 1A illustrates a block diagram of an exemplary embodiment of a user monitoring system 100. As shown in FIG. 1A, the system 100 includes a user monitor 102 comprising a processing board 104 and a host instrument 108. The processing board 104 communicates with a sensor 106 to receive one or more intensity signal(s) indicative of one or more parameters of tissue of a user. The processing board 104 also communicates with a host instrument 108 to display determined values calculated using the one or more intensity signals. According to an embodiment, the processing board 104 comprises processing circuitry arranged on one or more printed circuit boards capable of installation into the monitor 102, or capable of being distributed as some or all of one or more OEM components for a wide variety of host instruments monitoring a wide variety of user information. In an embodiment, the processing board 104 comprises a sensor interface 110, a digital signal processor and signal extractor (“DSP” or “processor”) 112, and an instrument manager 114. In general, the sensor interface 110 converts digital control signals into analog drive signals capable of driving sensor emitters, and converts composite analog intensity signal(s) from light sensitive detectors into digital data.
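
The data path just described can be summarized with a simplified, hypothetical sketch: the sensor interface digitizes detector signals, the DSP derives parameter values, and the instrument manager forwards the results to the host instrument for display. The class names, method names, and placeholder values below are illustrative assumptions, not the actual implementation.

class SensorInterface:
    def digitize(self, analog_intensities):
        # Stand-in for analog-to-digital conversion of detector intensity signals.
        return [round(x, 3) for x in analog_intensities]

class DSP:
    def compute_parameters(self, samples):
        # Placeholder calculation; a real DSP derives SpO2, tHb, pulse rate, etc.
        return {"pulse_rate": 62, "tHb": 14.3}

class InstrumentManager:
    def forward(self, parameters, host):
        # Communicates calculated parameter data to the host instrument.
        host.display(parameters)

class HostInstrument:
    def display(self, parameters):
        print("Displaying:", parameters)

iface, dsp, mgr, host = SensorInterface(), DSP(), InstrumentManager(), HostInstrument()
mgr.forward(dsp.compute_parameters(iface.digitize([0.8123, 0.7991])), host)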


In an embodiment, the sensor interface 110 manages communication with external computing devices. For example, in an embodiment, a multipurpose sensor port (or input/output port) is capable of connecting to the sensor 106 or alternatively connecting to a computing device, such as a personal computer, a PDA, additional monitoring equipment or networks, or the like. When connected to the computing device, the processing board 104 may upload various stored data for, for example, off-line analysis and diagnosis. The stored data may comprise trend data for any one or more of the measured parameter data, plethysmograph waveform data, acoustic sound waveform data, or the like. Moreover, the processing board 104 may advantageously download from the computing device various upgrades or executable programs, and may perform diagnosis on the hardware or software of the monitor 102. In addition, the processing board 104 may advantageously be used to view and examine user data, including raw data, at or away from a monitoring site, through data uploads/downloads, or network connections, combinations, or the like, such as for customer support purposes including software maintenance, customer technical support, and the like. Upgradable sensor ports are disclosed in copending U.S. application Ser. No. 10/898,680, filed on Jul. 23, 2004, titled "Multipurpose Sensor Port," incorporated by reference herein.


As shown in FIG. 1A, the digital data is output to the DSP 112. According to an embodiment, the DSP 112 comprises a processing device based on the Super Harvard ARChitecture (“SHARC”), such as those commercially available from Analog Devices. However, a skilled artisan will recognize from the disclosure herein that the DSP 112 can comprise a wide variety of data and/or signal processors capable of executing programs for determining physiological parameters from input data. In particular, the DSP 112 includes program instructions capable of receiving multiple channels of data related to one or more intensity signals representative of the absorption (from transmissive or reflective sensor systems) of a plurality of wavelengths of emitted light by body tissue. In an embodiment, the DSP 112 accepts data related to the absorption of eight (8) wavelengths of light, although an artisan will recognize from the disclosure herein that the data can be related to the absorption of two (2) to sixteen (16) or more wavelengths.
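
A minimal sketch of the multi-channel input described above is shown below, assuming one intensity stream per emitted wavelength (eight channels, as in the example embodiment). The wavelength values and function name are illustrative assumptions only.

# Map each emission wavelength (nm, illustrative values) to its sampled intensities.
channels = {wavelength: [] for wavelength in (610, 620, 630, 655, 700, 720, 800, 905)}

def accept_frame(frame):
    """frame: dict mapping wavelength (nm) to detected intensity for one sample instant."""
    for wavelength, intensity in frame.items():
        channels[wavelength].append(intensity)

accept_frame({610: 0.81, 620: 0.79, 630: 0.77, 655: 0.74,
              700: 0.71, 720: 0.69, 800: 0.66, 905: 0.62})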



FIG. 1A also shows the processing board 104 including the instrument manager 114. According to an embodiment, the instrument manager 114 may comprise one or more microcontrollers controlling system management, including, for example, communications of calculated parameter data and the like to the host instrument 108. The instrument manager 114 may also act as a watchdog circuit by, for example, monitoring the activity of the DSP 112 and resetting it when appropriate.


The sensor 106 may comprise a reusable clip-type sensor, a disposable adhesive-type sensor, a combination sensor having reusable and disposable components, or the like. Moreover, an artisan will recognize from the disclosure herein that the sensor 106 can also comprise mechanical structures, adhesive or other tape structures, Velcro wraps or combination structures specialized for the type of user, type of monitoring, type of monitor, or the like. In an embodiment, the sensor 106 provides data to the board 104 and vice versa through, for example, a user cable. An artisan will also recognize from the disclosure herein that such communication can be wireless, over public or private networks or computing systems or devices, or the like.


As shown in FIG. 1A, the sensor 106 includes a plurality of emitters 116 irradiating the body tissue 118 with differing wavelengths of light, and one or more detectors 120 capable of detecting the light after attenuation by the tissue 118. In an embodiment, the emitters 116 comprise a matrix of eight (8) emission devices mounted on a flexible substrate, the emission devices being capable of emitting eight (8) differing wavelengths of light. In other embodiments, the emitters 116 may comprise twelve (12) or sixteen (16) emitters, although other numbers of emitters are contemplated, including two (2) or more emitters. As shown in FIG. 1A, the sensor 106 may include other electrical components such as, for example, a memory device 122 comprising an EPROM, EEPROM, ROM, RAM, microcontroller, combinations of the same, or the like. In an embodiment, other sensor components may include an optional temperature determination device 123 or other mechanisms for, for example, determining real-time emission wavelengths of the emitters 116.


The memory 122 may advantageously store some or all of a wide variety of data and information, including, for example, information on the type or operation of the sensor 106; type or identification of sensor buyer or distributor or groups of buyers or distributors, sensor manufacturer information, sensor characteristics including the number of emitting devices, the number of emission wavelengths, data relating to emission centroids, data relating to a change in emission characteristics based on varying temperature, history of the sensor temperature, current, or voltage, emitter specifications, emitter drive requirements, demodulation data, calculation mode data, the parameters for which the sensor is capable of supplying sufficient measurement data (e.g., HbCO, HbMet, HbT, or the like), calibration or parameter coefficient data, software such as scripts, executable code, or the like, sensor electronic elements, whether the sensor is a disposable, reusable, multi-site, partially reusable, partially disposable sensor, whether it is an adhesive or non-adhesive sensor, whether the sensor is a reflectance, transmittance, or transreflectance sensor, whether the sensor is a finger, hand, foot, forehead, or ear sensor, whether the sensor is a stereo sensor or a two-headed sensor, sensor life data indicating whether some or all sensor components have expired and should be replaced, encryption information, keys, indexes to keys or hash functions, or the like, monitor or algorithm upgrade instructions or data, some or all of parameter equations, information about the user, age, sex, medications, and other information that may be useful for the accuracy or alarm settings and sensitivities, trend history, alarm history, or the like. In an embodiment, the monitor may advantageously store data on the memory device, including, for example, measured trending data for any number of parameters for any number of users, or the like, sensor use or expiration calculations, sensor history, or the like.
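
As an illustrative sketch only, the kind of sensor information the memory device 122 might hold can be modeled as a small record; the field names and example values below are hypothetical examples drawn from the categories listed above, not a defined data format.

from dataclasses import dataclass, field

@dataclass
class SensorMemory:
    sensor_type: str = "reusable clip"
    emitter_count: int = 8
    emission_wavelengths_nm: tuple = (610, 620, 630, 655, 700, 720, 800, 905)
    supported_parameters: tuple = ("SpO2", "HbCO", "HbMet", "HbT")
    calibration_coefficients: dict = field(default_factory=dict)
    expired: bool = False  # sensor life data indicating whether the sensor should be replaced

print(SensorMemory())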



FIG. 1A also shows the user monitor 102 including the host instrument 108. In an embodiment, the host instrument 108 communicates with the board 104 to receive signals indicative of the physiological parameter information calculated by the DSP 112. The host instrument 108 preferably includes one or more display devices 124 capable of displaying indicia representative of the calculated physiological parameters of the tissue 118 at the measurement site. In an embodiment, the host instrument 108 may advantageously comprise a handheld housing capable of displaying one or more of a pulse rate, plethysmograph data, perfusion quality such as a perfusion quality index (“PI™”), signal or measurement quality (“SQ”), values of blood constituents in body tissue, including for example, SpO2, HbCO, HbMet, Hbt, or the like. In other embodiments, the host instrument 108 is capable of displaying values for one or more of Hbt, Hb, blood glucose, bilirubin, or the like. The host instrument 108 may be capable of storing or displaying historical or trending data related to one or more of the measured values, combinations of the measured values, plethysmograph data, or the like. The host instrument 108 also includes an audio indicator 126 and user input device 128, such as, for example, a keypad, touch screen, pointing device, voice recognition device, or the like.


In still additional embodiments, the host instrument 108 includes audio or visual alarms that alert caregivers that one or more physiological parameters are falling below predetermined safe thresholds. The host instrument 108 may include indications of the confidence a caregiver should have in the displayed data. In a further embodiment, the host instrument 108 may advantageously include circuitry capable of determining the expiration or overuse of components of the sensor 106, including, for example, reusable elements, disposable elements, or combinations of the same.


Although described in terms of certain embodiments, other embodiments or combination of embodiments will be apparent to those of ordinary skill in the art from the disclosure herein. For example, the monitor 102 may comprise one or more monitoring systems monitoring parameters, such as, for example, vital signs, blood pressure, ECG or EKG, respiration, glucose, bilirubin, or the like. Such systems may combine other information with intensity-derived information to influence diagnosis or device operation. Moreover, the monitor 102 may advantageously include an audio system, preferably comprising a high quality audio processor and high quality speakers to provide for voiced alarms, messaging, or the like. In an embodiment, the monitor 102 may advantageously include an audio out jack, conventional audio jacks, headphone jacks, or the like, such that any of the display information disclosed herein may be audiblized for a listener. For example, the monitor 102 may include an audible transducer input (such as a microphone, piezoelectric sensor, or the like) for collecting one or more of heart sounds, lung sounds, trachea sounds, or other body sounds and such sounds may be reproduced through the audio system and output from the monitor 102. Also, wired or wireless communications (such as Bluetooth or WiFi, including IEEE 802.11a, b, or g), mobile communications, combinations of the same, or the like, may be used to transmit the audio output to other audio transducers separate from the monitor 102.


For example, patterns or changes in the continuous noninvasive monitoring of intensity-derived information may cause the activation of other vital sign measurement devices, such as, for example, blood pressure cuffs.


II. Visual Multi Quadrant Infographic Display

Traditional methods of monitoring and displaying certain physiological parameters can lack the ability to monitor certain physiological parameters noninvasively. Traditional methods can also take a long period of time to display collected data, including the physiological parameters. In contrast, the systems disclosed herein can measure and display the physiological parameters within minutes. For example, embodiments of the system disclosed herein can frequently measure physiological parameters multiple times a day to build a substantial data set. The system can measure physiological parameters 1 to 2, 2 to 3, 3 to 4, 4 to 5, 5 to 6, and/or 1 to 6 or more times a day. In some embodiments, the physiological parameters can be displayed in an informative visual graphic display, such as a trend graph 230. The trend graph can be user-friendly and provide a well-designed display. For example, the trend graph can include a multi quadrant seasonal variation spider graph, as illustrated in FIGS. 2A-6C.


The trend graph 230 can provide a detailed look at all data points of the physiological parameters 203 within one or more seasonal years. In some embodiments, the trend graph 230 can display the physiological parameters 203 collected within one, two, three, four, five, six, seven, eight, nine, or ten or more years. Advantageously, the trend graph can allow the user to quickly and easily identify whether a seasonal variation exists in their physiological parameters.


Generally, users can have seasonal variation in several physiological parameters, such as tHb. As discussed above, the system 100 can collect user data 202 including information about a user at various intervals. The user data 202 can include a plurality of data points. For example, the system can measure and collect the data points at regular intervals throughout a period of time, such as an hour, day, month, and/or year. Advantageously, the system 100 can allow the user to track and assess user data 202 over time. For example, the system 100 can provide various points of comparison to the user through the user interface 200 so the user can make judgments based on all or a portion of the user data 202, rather than at a particular point in time. In some embodiments, it can be advantageous to measure, collect, and/or calculate the user data 202 at consistent intervals. In such configurations, the system 100 can analyze a more complete set of user data 202 so that the system may interpret the user data 202 more accurately. In some configurations, the system 100 can provide user data 202 that can allow the user to make more accurate judgments about their physiological parameters. Accordingly, the user can have the ability to increase their performance based on more accurate results and adjust their diet and exercise routines. In some embodiments, however, the system can measure and collect the user data at irregular intervals.


The user data 202 can include one or more data points corresponding to one or more physiological parameters 203. For each physiological parameter 203, the system can assign certain information relating to each recorded physiological parameter. For example, the system can assign a time, date, season, and/or location to each recorded physiological parameter. Generally, the system can display information relating to the user data 202, such as whether seasonal variation occurs in a user's tHb and/or PR, when the variation occurs, and/or how the variation affects the user. In some embodiments, the user would be able to view the display information within seconds. Accordingly, the display information can help a user determine how to adjust a training routine and/or diet, for example, to achieve better performance results throughout the year.
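
One way the tagging described above could be represented is sketched below; the record layout, the simple Northern Hemisphere season mapping, and the example values are illustrative assumptions rather than this disclosure's data model.

from datetime import datetime

def season_of(date):
    # Simple Northern Hemisphere month-to-season mapping; the system's mapping may differ.
    return {12: "Winter", 1: "Winter", 2: "Winter", 3: "Spring", 4: "Spring", 5: "Spring",
            6: "Summer", 7: "Summer", 8: "Summer", 9: "Fall", 10: "Fall", 11: "Fall"}[date.month]

def record(parameter, value, location, when=None):
    when = when or datetime.now()
    return {"parameter": parameter, "value": value, "time": when.time().isoformat(),
            "date": when.date().isoformat(), "season": season_of(when), "location": location}

print(record("tHb", 14.3, "Boulder, CO"))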


The user can select from a plurality of filters and a plurality of physiological parameters. When the user selects a filter and a type of physiological parameter, the system 100 can display a user interface 200. The user interface 200 can be presented in an aesthetically pleasing and/or user friendly manner. The user interface 200 can display the user data 202, insights 206, and/or the one or more graphs 204 that allow the user to quickly view information. To view particular information, the user can select a filter from a plurality of filters. The plurality of filters can include filters that can dictate how the user data 202 is displayed, such as by season and elevation. In some embodiments, the user can select one or more physiological parameters 203. In some embodiments, the user can select an activity from the plurality of activities, such as pre-workout, pre-hydration, during workout, post-workout and/or post-hydration, among others. The user can select all activities from the plurality of activities. The system 100 can display the user interface 200 within approximately 500 milliseconds to 1 second after a filter is selected.
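
The filter step described above can be sketched as a simple selection over the stored data points; the function name, record fields, and example values are illustrative assumptions.

def apply_filters(data_points, parameter, activity=None):
    # Keep only points matching the selected physiological parameter and, optionally, activity.
    return [p for p in data_points
            if p["parameter"] == parameter and (activity is None or p["activity"] == activity)]

data_points = [
    {"parameter": "tHb", "activity": "pre-workout", "value": 14.1},
    {"parameter": "tHb", "activity": "post-workout", "value": 13.8},
    {"parameter": "PR", "activity": "pre-workout", "value": 58},
]
print(apply_filters(data_points, "tHb", "post-workout"))  # one matching data point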


For example, a user can select a seasonal filter from the plurality of filters, a physiological parameter 203, and an activity from the plurality of activities. Upon the user's selection, the system 100 can display a user interface 200. FIGS. 2A and 2B illustrate an example of the user interface 200 in which the user has selected the seasonal filter. The user interface 200 can include a top navigation bar 208, one or more graphs 204, one or more insights 206, and a bottom navigation bar 212. The one or more graphs can include the user data 202 and the one or more physiological parameters 203.


The one or more graphs 204 can include the trend graph 230. The trend graph 230 can include a horizontal axis and a vertical axis. For example, the horizontal axis and the vertical axis can indicate the value of the physiological parameter 203 selected by the user.


The trend graph 230 can include a plurality of quadrants 234 to display the display information. In some embodiments, the trend graph 230 can include one, two, three, four, five, six, seven, eight, nine, or ten or more quadrants 234. For example, the graph 230 can include a first quadrant 234A, a second quadrant 234B, a third quadrant 234C, and a fourth quadrant 234D.


Each of the plurality of quadrants 234 can display information that corresponds to a particular seasonal time. The number of quadrants displayed in the trend graph 230 can depend on several factors, including a geographic location. For example, in geographic locations having four seasons, the trend graph 230 can include four quadrants associated with each of the four seasons. In the illustrated embodiment, the first quadrant 234A can correspond to Winter. The second quadrant 234B can correspond to Spring. The third quadrant 234C can correspond to Summer. The fourth quadrant 234D can correspond to Fall. In some embodiments, each quadrant of the plurality of quadrants 234 can be displayed in the same color. In some embodiments, each quadrant of the plurality of quadrants 234 can be displayed using different colors.


The user data 202 can be displayed in the trend graph 230 along a plurality of data rings 238. Each data point of the user data 202 can be displayed clockwise around a center of the data rings 238. In some embodiments, each data point of the user data 202 can be displayed counterclockwise around a center of the data rings 238. The data points can be presented in degree increments along the data rings 238. Each data point can represent a measurement taken and/or calculated by the system 100. For example, the system can record measurements of the user data 202 once a day. In this example, after 360 days, the trend graph 230 could display the user data 202 along a full ring of the data rings 238.
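
The clockwise placement described above can be illustrated with a short sketch: with one measurement per day and one degree per measurement, roughly 360 daily data points complete a ring. The angle convention, scaling, and function name below are illustrative assumptions.

import math

def point_position(day_index, normalized_value, ring_radius):
    angle_deg = day_index % 360               # one degree of clockwise advance per daily measurement
    theta = math.radians(90 - angle_deg)      # start at the top of the ring and move clockwise
    r = ring_radius * normalized_value        # radius scaled by the measured value (normalized 0-1)
    return r * math.cos(theta), r * math.sin(theta)

print(point_position(day_index=45, normalized_value=0.9, ring_radius=100.0))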



FIG. 3A illustrates an embodiment of a zoom user interface 250, which can display a close-up view of the trend graph 230. As shown in FIG. 3A, the trend graph 230 can include a plurality of indicator rings 242 to illustrate various ranges and/or values corresponding to each particular physiological parameter 203 selected by the user. The plurality of indicator rings 242 can present an optimal range of physiological parameters 203. For example, the trend graph 230 can include a first indicator ring 242A of the plurality of indicator rings 242. The first indicator ring 242A can represent an optimal value for the particular physiological parameter 203 selected by the user. The optimal value and/or range can be based on one or more user inputs, such as the user's age and/or gender. In some embodiments, the first indicator ring 242A can be disposed towards the center of the trend graph 230. In other embodiments, the first indicator ring 242A can be disposed towards the outer periphery of the trend graph 230 or towards the inner ring of the trend graph 230.


In some embodiments, the trend graph 230 can include a second indicator ring 242B and a third indicator ring 242C. The second indicator ring 242B can indicate a lower value and/or range of the optimal range. The second indicator ring 242B can be disposed towards an inner ring of the trend graph 230 interior of the first indicator ring 242A, for example. In some embodiments, the trend graph 230 can include a third indicator ring 242C. The third indicator ring 242C can indicate an upper value and/or range of the optimal range. The third indicator ring 242C can be disposed towards an outer ring of the trend graph 230 outwards from the first indicator ring 242A, for example.
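
A minimal sketch of how the three indicator rings could be derived is given below; the numeric values are hypothetical placeholders (not clinical guidance), and the dependence on a single user input is an assumption for illustration.

def indicator_rings(sex):
    optimal = 15.5 if sex == "male" else 14.0   # hypothetical optimal tHb values in g/dL
    return {"lower_ring": optimal - 1.5, "optimal_ring": optimal, "upper_ring": optimal + 1.5}

print(indicator_rings("female"))  # {'lower_ring': 12.5, 'optimal_ring': 14.0, 'upper_ring': 15.5}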


Generally, the close-up view of the trend graph 230 can be presented to the user when the user activates one or more zoom user interfaces 250. Activating the zoom user interface 250 can allow a user to observe individual data points of the user data 202, among other detailed information displayed in the trend graph 230. The zoom user interface can allow the user to understand the user data 202 and gain a better understanding of how to adjust their diet and/or exercise routine, which can enhance the user's performance. The zoom user interface 250 can display information such as the time, date, season, and/or location assigned to each monitored, measured, and/or calculated data point.


When the zoom user interface 250 is activated, the zoom user interface can pop up from the user interface 200, overlap at least a portion of the user interface 200, and/or replace the user interface 200. To activate the zoom user interface 250, the user can apply a plurality of zooming gestures. The zooming gestures can include a double tap, a finger zoom, and/or a pan, among other gestures. For example, the user can double tap on the user interface 200 to activate the zoom interface 250. The user can double tap any area of the user interface 200, including any portion of the trend graph 230. The zoom interface 250 can display the portion in which the user double tapped in the user interface 200 to view the trend graph 230 in more detail. In some embodiments, when the system 100 displays the zoom user interface 250, the user can double tap the zoom user interface 250 to activate the user interface 200 and view the entire trend graph 230.


In some examples, the user can activate the zoom interface 250 by applying a finger zoom gesture. For example, the user can touch the user interface 200 with at least two fingers and slide the at least two fingers apart from one another. This configuration can allow the user to gradually zoom in on a particular portion of the user interface 200 with a gradient zoom. The zoom interface 250 can be displayed gradually and can depend on the speed and/or extent of the finger zoom (for example, how far and/or how fast the user slides their fingers apart on the user interface 200). In some embodiments, the zoom interface 250 can display up to one month of data points. In some embodiments, the zoom interface 250 can display up to an hour, a day, and/or a year of data points. In some embodiments, the user can zoom out from the zoom interface 250 and activate the user interface 200 by pinching at least two fingers together. The user interface 200 can be displayed gradually and can depend on the speed and/or extent of the pinch (for example, how far and/or how fast the user slides their fingers together on the zoom user interface 250).
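
The gradient zoom behavior described above can be sketched as a zoom factor that grows with how far the two fingers have spread apart, clamped to a range; the parameter names and limits are illustrative assumptions.

def zoom_factor(start_separation_px, current_separation_px, min_zoom=1.0, max_zoom=12.0):
    if start_separation_px <= 0:
        return min_zoom
    factor = current_separation_px / start_separation_px
    return max(min_zoom, min(max_zoom, factor))

print(zoom_factor(start_separation_px=80, current_separation_px=240))  # 3.0x zoom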


In some embodiments, the user can navigate to various portions of the zoom interface 250. For example, the user can pan in any direction by swiping along the zoom interface 250. If the user pans too far to the edge of the zoom interface 250, for example, the zoom interface can indicate to the user that the user has reached the edge of the zoom interface 250. For example, at the edge of the zoom interface 250, the zoom interface can bounce back and/or rubber band.



FIGS. 6A-6C illustrate examples of the seasonal variation user interface with a varying number of data points of the user data 202. As shown in FIGS. 6A-6C, the user interface 200 can display one or more insights 206. The insights 206 can present to the user a general summary of the user data 202. For example, if no data has been recorded, the insights 206 can notify the user that no data is available (see FIG. 6A). In embodiments that include insufficient data, for example, when only one data point of the user data 202 has been measured, calculated, and/or recorded by the system 100, the insights 206 can notify the user that more user data is required (see FIG. 6B). In some embodiments, the insights 206 can remind the user to use the system 100 to monitor and/or measure user data. The insights can remind the user to use the system 100 to measure user data 202 more regularly, more often, and/or at different times.


In some embodiments, the system 100 can determine that a sufficient amount of user data 202 has been measured, calculated, and/or recorded. In some embodiments, the user interface 200 can display the user data within the trend graph 230 when a sufficient amount of user data 202, for example, at least two data points, has been measured, calculated, and/or recorded by the system 100. In some embodiments, the user interface 200 can display the user data within the trend graph 230 even when an insufficient amount of user data has been measured, calculated, and/or recorded by the system 100.



FIG. 6A illustrates an embodiment of the system 100 wherein no data is available. In this configuration, the user interface 200 can notify the user that no data is available to be displayed. For example, the insights 206 can present to the user that no data is available. In some embodiments, the trend graph 230 can present to the user that no data is available. For example, the trend graph 230 can display no data points of the user data 202. In such a configuration, the trend graph 230 can display text, an image, and/or another indication that no data is available. In this configuration, the trend graph 230 can display a range of years. The trend graph 230 can display a range of measurements on the horizontal and/or vertical axis.


In some embodiments in which no data has been measured, calculated, and/or recorded, the system 100 may not allow a user to activate the zoom interface 250. In some embodiments, the system 100 may not allow a user to apply at least some of the zooming gestures 254. In some embodiments, the trend graph 230 does not display the user data 202 because of certain selections and/or filters selected by the user. For example, the user can select a specific activity, time period, physiological parameter, and/or filter for which no data has been monitored, measured, and/or calculated. In such configurations, the user interface 200 can notify and/or otherwise present to the user that no data is available within the particular filter selected.



FIG. 6B illustrates an embodiment of the user interface 200 wherein one data point 246 has been monitored, measured, and/or calculated by the system 100. In some embodiments, the one data point can be an insufficient amount of user data 202. In such configurations, the insight 206 can indicate to the user that an insufficient amount of data has been monitored, measured, and/or calculated by the system 100. In this embodiment, the data point is displayed on the trend graph 230. The data point can be displayed in the form of a line and/or a point. As shown in the trend graph 230, the data point can be displayed as a line that begins at the center of the graph, where the horizontal axis and the vertical axis intersect, for example, and extends outwardly to the ring indicating the value of the physiological parameter measured.


Some embodiments of the user interface 200 can display more than one data point. For example, FIG. 6C illustrates an embodiment of the user interface 200 including at least two data points 246. In some embodiments, at least two data points can be a sufficient amount of user data 202. In such configurations, consecutive data points 246 can be connected by one or more connecting lines 248. For example, the connecting lines 248 can be displayed connecting consecutive data points 246 at the point of the value of the physiological parameter 203. The connecting lines 248 can be straight or rounded. In some embodiments, the connecting lines 248 can be circular. In some embodiments, the connecting lines 248 can form a web-like structure. The area beneath the connecting lines 248 can be shaded towards a point at the intersection between the horizontal axis and the vertical axis disposed at the center of the trend graph 230.


In some embodiments, the connecting lines 248 can connect data points 246 recorded one, two, three, four, five, six, seven, eight, nine, and/or ten or more days apart. In some embodiments, the connecting lines 248 can connect data points 246 recorded 11, 12, 13, 14, 15, 16, 17, 18, 19, and/or 20 or more days apart. In some embodiments, the connecting lines 248 can connect data points 246 recorded one, two, three, four, five, six, seven, eight, nine, and/or ten or more weeks apart. In some embodiments, a rounded connecting line 248 can connect two data points. In some embodiments, the system 100 can average two or more data points. In such configurations, the user interface 200 can display a connecting line 248 at the average of the two or more data points. In certain embodiments, the increment of the data points displayed by the user interface 200 in the trend graph 230 can indicate which data points are connected by the connecting line 248.
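
The averaging mentioned above can be sketched as grouping daily values into fixed windows and connecting the window averages rather than every raw point; the one-week window and example values are assumptions for illustration.

from statistics import mean

def window_averages(values, window=7):
    # One averaged point per window of consecutive daily measurements.
    return [mean(values[i:i + window]) for i in range(0, len(values), window)]

daily_thb = [14.1, 14.0, 14.3, 14.2, 14.4, 14.1, 14.2, 13.9, 14.0, 13.8]
print(window_averages(daily_thb))  # one averaged connecting point per 7-day window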



FIGS. 2A and 2B illustrate the top navigation bar 208 of the user interface 200. The top navigation bar 208 can be animated by the user interface 200. For example, the top navigation bar 208 can flip, spin, fade, and/or otherwise be displayed with an animation. The top navigation bar 208 can include a title portion 220, a subtitle portion 224, a stretch icon 216, and a filter stack icon 218.


The title portion 220 can describe the type of graph 204 that is displayed by the user interface 200. For example, the user interface 200 can display the trend graph 230. As illustrated in FIG. 2A, the title portion 220 describes the type of graph 204.


The subtitle portion 224 can describe the type of filter selected and applied to the user interface 200. For example, the user can select one or more filters from the plurality of filters, one or more physiological parameters, and one or more activities from the plurality of activities. To make these selections, the user can access a list of the plurality of filters, the plurality of physiological parameters, and/or the plurality of activities by selecting and activating the filter stack icon 218.


When the user selects the filter stack icon 218, the user interface 200 can display a filter user interface 260. FIGS. 5A and 5B illustrate an example of the filter user interface 260. The filter user interface 260 can provide a list of the plurality of filters, the plurality of activities, and/or the plurality of physiological parameters. The user can select one or more of the plurality of filters, the plurality of activities, and/or the plurality of physiological parameters. After selecting one or more of the plurality of filters, the plurality of activities, and/or the plurality of physiological parameters, the user can submit the selections by selecting a submit option. The system can receive the submission. The system 100 can update the data points depending on the user's selection and display a new version of the trend graph 230 in the user interface 200. In some embodiments, the system 100 dynamically updates the trend graph 230 according to the user's selections in real time.


In some embodiments, selection of the stretch icon 216 of the top navigation bar 208 can allow the title portion 220 to be displayed in the user interface 200 in full screen. In some embodiments, selection of the stretch icon 216 of the top navigation bar 208 can allow the title portion 220 to be displayed in the user interface 200 in the original configuration of the top navigation bar 208.



FIGS. 2A and 2B illustrate the bottom navigation bar 212 of the user interface 200. The bottom navigation bar 212 can be animated by the user interface 200. For example the bottom navigation bar 212 can flip, spin, fade, and/or otherwise be displayed with an animation. The bottom navigation bar 212 can include an information icon 226 and a share icon 228. In some embodiments, the system 100 can flip the bottom navigation bar 212 to reveal additional icons. Additional user interfaces can be accessed through selection by the user of the information icon 226 and/or the share icon 228, for example.


In some embodiments, the user can select the information icon 226 to access an information user interface 270. FIG. 3B illustrates an example of the information user interface 270. When the information user interface 270 is activated, the information user interface 270 can pop up from the user interface 200, overlap at least a portion of the user interface 200, and/or replace the user interface 200.


The information user interface 270 can include an explanation of the trend graph 230. The information user interface 270 can include instructions for interpreting the trend graph 230. In some embodiments, the information user interface 270 can include a summary 272 of the display information, including the trend graph 230. The summary can be scrollable in some examples.


In some embodiments, the information user interface 270 can include a list 274 of the plurality of physiological parameters 203. The information user interface 270 can display and/or highlight the one or more physiological parameters selected by the user. Each of the physiological parameters can be represented by an image and/or icon 275. In some embodiments, the information user interface 270 can include a list 276 of the plurality of activities. The information user interface 270 can display the one or more activities selected by the user. Each of the activities can be represented by an image and/or icon 277. In some embodiments, the information user interface 270 can allow the user to select the size adjustment icon 278. Selection of the size adjustment icon 278 can cause the size of the text to increase or decrease. In some embodiments, selection of the size adjustment icon 278 can cause the size of the images to increase or decrease.


The system 100 can include sharing capabilities. In some embodiments, the user can select the share icon 228 to access a sharing user interface 280 that can include a sharing menu with several options for sharing certain information to third parties. FIGS. 4A-4C illustrate an example of the sharing user interface 280. When the sharing user interface 280 is activated, the sharing user interface 280 can pop up from the user interface 200, overlap at least a portion of the user interface 200, and/or replace the user interface 200.


The sharing user interface 280 can allow the user to share information and/or a shared image 282 with third parties. The sharing menu can display a plurality of icons 285 representing third-party applications and/or websites for sharing the shared image 282 and/or other information. The plurality of icons can be displayed in rows, for example. The rows of icons can be scrollable in the left, right, up, and/or down direction to display additional icons. The user can select one or more of the plurality of icons to share certain user data and/or a shared image 282, as described below. For example, the shared image 282 and/or other information can be shared via digital or physical methods. In some embodiments, the shared image 282 and/or other information can be delivered to third parties through SMS, email, printing, and/or social media, among other sharing platforms. The sharing capabilities of the system 100 can allow the user to share the user data 202 with third parties, such as a trainer, who are invested in the user's athletic performance and success.


In some embodiments, the user data 202 can be shared as part of the trend graph 230. The sharing user interface 280 can allow the user to share a screenshot of all or a portion of the season variation user interface 200. In some embodiments, the sharing user interface 280 can allow the system to share an image 282 (see FIG. 4C) that includes all or a portion of the season variation user interface 200 and additional information. For example, the shared image 282 can include a view similar to the view displayed by the zoom user interface 250. In some embodiments, the shared image 282 can include a watermark 284 and a summary title 286. The watermark 284 can indicate a brand, for example. In some embodiments, the summary title 286 can display a summary of the filter and/or physiological parameter 203 selected by the user, for example. The shared image 282 can allow third parties to quickly view and interpret the user data 202 provided by the system 100. In some embodiments, the user can select certain data to be shared to the third parties. In some embodiments, the user data 202 can be shared in the form of raw data.
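As a rough, non-authoritative sketch of how the shared image 282 could be assembled, the snippet below overlays a summary title and a watermark on a screenshot of the trend graph using the third-party Pillow library. The file paths, text positions, and strings are placeholders; the actual system may compose the shared image differently.

```python
from PIL import Image, ImageDraw, ImageFont  # third-party: Pillow

def build_shared_image(screenshot_path: str, summary_title: str,
                       watermark: str, out_path: str) -> None:
    """Overlay a summary title (top-left) and a brand watermark (bottom-right)
    on a screenshot of the trend graph before it is shared."""
    img = Image.open(screenshot_path).convert("RGBA")
    draw = ImageDraw.Draw(img)
    font = ImageFont.load_default()
    draw.text((10, 10), summary_title, fill=(255, 255, 255, 255), font=font)
    draw.text((img.width - 150, img.height - 20), watermark,
              fill=(255, 255, 255, 160), font=font)
    img.convert("RGB").save(out_path, "PNG")

# Hypothetical usage:
# build_shared_image("trend.png", "tHb vs. elevation, last 30 days",
#                    "ExampleBrand", "shared.png")
```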


The user can cancel a request to share the user data and/or the shared image 282. For example, the user can select a cancel icon 288 displayed by the sharing user interface 280. In some embodiments, the user can simply select an area outside of the sharing menu displayed by the sharing user interface 280.


III. Visual Elevation Infographic Display


FIGS. 7A-11C illustrate an embodiment of the user interface 300. The user interface 300 is similar or identical to the user interface 200 discussed above in many respects. Accordingly, numerals used to identify features of the user interface 300 are incremented by a factor of one hundred to identify certain similar features of the user interface 200. For example, as shown in FIGS. 7A and 7B, the user interface 300 can include a top navigation bar 308, one or more graphs 304, and a bottom navigation bar 312 as described above in connection with the user interface 200. The top navigation bar 308 can include a title portion 320, a subtitle portion 324, a stretch icon 316, and/or a filter stack icon 318. The bottom navigation bar 312 can include an information icon 326 that can be selected to display an information user interface 370 (see FIG. 8B) and a share icon 328 that can be selected to display a sharing user interface 380 that can include a sharing menu (see FIGS. 9A-9C) for providing sharing capabilities.


User data 302 can include one or more data points corresponding to one or more physiological parameters 303 as described above in connection with the user interface 200. The user interface 300 can include any one, or any combination, of the features of the user interface 200. For example, the user interface 300 can be substantially similar to the user interface 200. However, in some embodiments, the user interface 300 can illustrate an example of the user interface 200 in which the user has selected an elevation filter (see FIGS. 7A and 7B). To select a filter, the user can select the filter stack icon 318. When the system receives the request from the user, the system can cause the user interface 300 to display a filter user interface 360. FIGS. 10A-10C illustrate an example of the filter user interface 360. The filter user interface 360 can include a list of the plurality of physiological parameters 303 and/or the plurality of activities. FIG. 10B provides an example of the filter user interface 360 illustrating that the user has selected activity 366, but did not select activity 368.


In some embodiments, the user data 302 can be displayed in one or more graphs 304, such as a trend graph 330. In the illustrated embodiment, the trend graph can include an elevation graph. The trend graph 330 can provide a detailed look at all or a selection of the physiological parameters monitored, measured and/or calculated by the system.


The trend graph 330 can allow the user to view the user data over time. The trend graph 330 can allow the user to view the user data 302, such as the selected physiological parameters 303 at a particular elevation and/or during a particular time period. For example, the trend graph 330 can include a horizontal axis and a vertical axis. The horizontal axis can illustrate a specific time or a period of time. For example, the horizontal axis can display the date associated with each of the data points of the user data 302. In some embodiments, the horizontal axis can display the month and/or year associated with each of the data points of the user data 302. In some embodiments, the horizontal axis can display a range of minutes, hours, days, weeks, months, and/or years associated with each of the data points of the user data 302. The user interface 300 can display one or more horizontal axis labels. The horizontal axis labels can include narrow dates associated with a particular data point. The horizontal axis labels can include broader dates, such as a month and/or a year associated with the overall user data displayed in the trend graph 330.


The vertical axis can illustrate a specific elevation at which the user data 302 was monitored, measured, and/or calculated. The intersection between the horizontal and vertical axis can represent sea level in an example. The vertical axis can be incremented at constant elevation intervals. The vertical axis can be incremented at varying elevation intervals. The values of the vertical axis can be displayed in any unit, such as feet or meters, for example. The values of the vertical axis can be displayed in whole numbers in some embodiments. The unit displayed by the user interface 300 in the trend graph 330 can be dependent on a region where the user is located, for example. The unit displayed by the user interface 300 in the trend graph 330 can be dependent on a region set by the user device, for example.
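A minimal sketch of the region-dependent unit choice is shown below. The set of regions that display feet and the rounding to whole numbers are illustrative assumptions.

```python
# Regions assumed (for illustration only) to customarily display elevation in feet;
# the real system may instead consult the device locale or a user setting.
FEET_REGIONS = {"US", "LR", "MM"}

METERS_PER_FOOT = 0.3048

def format_elevation(meters: float, region: str) -> str:
    """Return a whole-number elevation label in the unit expected for the region."""
    if region.upper() in FEET_REGIONS:
        return f"{round(meters / METERS_PER_FOOT)} ft"
    return f"{round(meters)} m"
```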


The trend graph 330 can display each value of each data point of the user data 302. In some embodiments, the trend graph 330 can display a range and/or average of values for each time period displayed along the horizontal axis. For example, FIGS. 7A and 7B illustrate an example of the trend graph 330. The trend graph 330 can include one or more data bars 332 and one or more data values. The data bars 332 can be lines. In some embodiments, the data bars 332 can be rectangular and/or cylindrical, among other shapes.


The data values can be a value of each data point, a range of data points, an average of data points over a particular time period, and/or another metric calculated by the system 100 using the values of each data point of the user data 302. FIGS. 7A and 7B illustrate a standard display size of the user interface 300 (for example, without activating the zoom user interface 350). In this configuration, the average of data points over a particular time period can be displayed at an upper end of the data bars 332. When the system 100 receives a request from the user to activate the zoom user interface 350, for example when the user uses one or more zooming gesture methods disclosed herein, the system 100 can display the zoom user interface 350. For example, the user can use zooming gesture methods and/or features, such as a double tap, a finger zoom, and/or a pan, among other gestures. In some embodiments, panning can cause the zoom user interface 350 to have a parallax effect. In such a configuration, a background image, such as a mountain, can move more slowly than the trend graph 330. The parallax effect can advantageously allow the zoom user interface 350 to display the relevant information more clearly.
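The parallax behavior can be approximated by applying a smaller fraction of the pan offset to the background image than to the trend graph 330, as in the following sketch; the 0.3 factor is an illustrative assumption.

```python
def parallax_offsets(pan_dx: float, background_factor: float = 0.3) -> tuple[float, float]:
    """Given a horizontal pan distance in pixels, return (graph_dx, background_dx).
    The background (e.g., a mountain image) moves more slowly than the trend
    graph, producing the parallax effect described above."""
    graph_dx = pan_dx                            # foreground follows the gesture 1:1
    background_dx = pan_dx * background_factor   # background lags behind
    return graph_dx, background_dx
```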



FIG. 8A illustrates an embodiment of the zoom user interface 350. The zoom user interface can display each value of each data point of the user data 302 disposed above the upper end of the data bars 332, for example, rather than an average. In this configuration, each value of each data point that is displayed can represent the value used by the system 100 to calculate a statistic, such as an average value displayed in the user interface 300. This configuration can advantageously allow the user to understand how the system calculated the statistic, such as the average value, displayed in the user interface 300. In some embodiments, the zoom user interface can display the statistic 338 near or adjacent to a group of values of each data point used by the system 100 to calculate the average value. The user interface 300 and zoom user interface 350 can advantageously allow the user to quickly and easily identify what user data is associated with an elevation change. The user interface 300 and zoom user interface 350 can advantageously allow the user to quickly and easily identify when individual data points or statistics calculated based on multiple data points increase as a result of an elevation change.


In some embodiments of the zoom user interface 350, the interface 350 can display the value of each data point in the same configuration as is displayed in the user interface 300 (for example, displayed within a shape 334 and/or with an animation). In some embodiments, to allow the user to more clearly view or access relevant information, the zoom user interface can display the group of values of each data point within a shape 336, as illustrated in FIG. 8A.


In some embodiments, the user interface 300 can be an interactive display. For example, when the user selects the elevation filter, the trend graph 330 can be displayed to the user by the user interface 300. When the trend graph is displayed by the user interface 300, the data bars 332 can dynamically extend upwards from the horizontal axis.


In some embodiments, the value of each data point of the user data 302 can be displayed at an upper end of the data bars 332 in various configurations. For example, as illustrated in FIG. 7A, the data value of each data point is displayed in a shape 334, such as a circle, disposed at the upper end of each data bar 332. The value of each data point can be displayed in a shape, such as a rectangle and/or square, among other shapes. Displaying the value of each data point in the trend graph 330 can allow the user to easily and quickly view and/or access relevant information. In some embodiments, the user interface can highlight the value of each data point in the trend graph 330 to allow the user to view and/or access relevant information more quickly. In some embodiments, when the trend graph 330 is displayed to the user, the shape surrounding the value of each data point and/or the value itself can swell, glow, become enlarged, and/or otherwise be animated to allow the user to view and/or access relevant information more quickly and easily.


Accordingly, the user would be able to quickly determine whether the user has experienced a change in elevation, when the user experienced a change in elevation, and how the change in elevation affected the user's physiological parameters. In some embodiments, the system 100 can automatically determine whether the user has experienced a change in elevation, when the user experienced a change in elevation, and how the change in elevation affected the user's physiological parameters. The user interface 300 can display this information in the elevation graph 330, for example. In some examples, the user interface 300 can display this information in one or more insights 306.


In some embodiments, the value of each data point is not displayed. In some embodiments, the trend graph 330 does not include a shape that surrounds the value of each data point. In yet other embodiments, the trend graph 330 does not include any animation that highlights the value of each data point.



FIGS. 11A-11C illustrate examples of the user interface 300 that include a varying number of data points of the user data 302. As shown in FIGS. 11A-11C, the user interface 300 can display one or more insights 306. The insights 306 can present to the user a general summary of the user data 302.



FIG. 11A illustrates an embodiment of the system 100 wherein no data is available. In this configuration, the user interface 300 can notify the user that no data is available to be displayed. For example, the insights 306 can present to the user that no data is available. In some embodiments, the trend graph 330 can present to the user that no data is available. For example, the trend graph 330 can display no data points of the user data 302. In such a configuration, the trend graph 330 can display text, an image, and/or another indication that no data is available.



FIG. 11B illustrates an embodiment of the user interface 300 wherein one data point 346 has been monitored, measured, and/or calculated by the system 100. In some embodiments, the one data point can be an insufficient amount of user data 302. In some embodiments the one data point can be a sufficient amount of user data 302. However, the system 100 can encourage the user to cause the system to measure, collect, and/or calculate more user data 302. In some configurations, the insight 306 can indicate to the user that an insufficient amount of data has been monitored, measured, and/or calculated by the system 100. In this embodiment, the data point is displayed on the trend graph 330. The data point can be displayed as a single data bar 332 and corresponding data value. As shown in the trend graph 330, the user data 302 can be displayed by the trend graph 330 from left to right.


Some embodiments of the user interface 300 can display more than one data point in the trend graph 330. For example, FIG. 11C illustrates an embodiment of the user interface 300 including at least two data points 346. In some embodiments, at least two data points can be a sufficient amount of user data 302. In such configurations, consecutive data points 346 or groups of data points 346 can be displayed. In some embodiments, consecutive values of the data points can be averaged by the system 100. The user interface 300 can display the average value of the data points in configurations disclosed herein.


In some embodiments, the user interface 300 can include a summary dashboard. The summary dashboard can pop up from the user interface 300, overlap at least a portion of the user interface 300, and/or replace the user interface 300. The summary dashboard can display a summary of the user data 302 displayed in the trend graph 330. The summary dashboard can advantageously provide a summary to the user and/or be shared with third parties and can allow the user to access relevant information more quickly and easily. Accordingly, the summary dashboard can allow the user and/or a third party to help the user achieve enhanced performance by more easily and quickly adjusting a diet and/or exercise routine.


In some embodiments, the summary dashboard can include several observations and/or callouts based on the user data 302 displayed in the trend graph 330. For example, the dashboard can display an average, mean, mode, and/or other statistic calculated based on data points representing a particular physiological parameter 303. In some embodiments, the system can determine and/or the summary dashboard can display any of the above-referenced statistics over various time periods, including seven, fifteen, and/or thirty or more days. In some embodiments, the system can determine and/or the summary dashboard can display an elevation the user must travel for one or more of the user's physiological parameters to be affected. In some embodiments, the system can determine and/or the summary dashboard can display the amount of time, for example the number of days, it takes for the user to acclimate to a new baseline at a different elevation. In some embodiments, the system can determine and/or the summary dashboard can display a length of time, for example the number of days, a variation in one or more of the user's physiological parameters lasts.
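As one non-limiting illustration, the windowed statistics such a dashboard could surface might be computed along the following lines; the window lengths, the (date, value) data shape, and the choice of average and median are assumptions for illustration.

```python
from datetime import date, timedelta
from statistics import mean, median
from typing import Optional

def window_stats(points: list[tuple[date, float]],
                 window_days: int, today: date) -> Optional[dict[str, float]]:
    """Average and median of a physiological parameter over the last
    `window_days` days (e.g., 7, 15, or 30), suitable as a dashboard callout."""
    cutoff = today - timedelta(days=window_days)
    values = [v for d, v in points if d >= cutoff]
    if not values:
        return None
    return {"average": mean(values), "median": median(values)}
```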


In some embodiments, the system can determine and/or the summary dashboard can display and/or notify the user of any changes in the user's physiological parameters. In some embodiments, the system can determine and/or the summary dashboard can display and/or notify the user of any particular elevations that affect one or more of the user's physiological parameters based on historical data. In some embodiments, the system can determine and/or the summary dashboard can display and/or notify the user of any particular elevations that affect one or more of the user's physiological parameters based on user data collected in real time. In some embodiments, the system can determine and/or the summary dashboard can display and/or notify the user of any particular elevations that affect one or more of the user's physiological parameters based on a comparison of user data stored in a records database. The records database can be remote from the system 100. In some embodiments, the system 100 can include the records database.


In some embodiments, the system can determine and/or the summary dashboard can display and/or notify the user of any particular elevations that affect one or more of the user's physiological parameters based on a comparison of user data measured during various activities.


In some embodiments, the system can determine and the summary dashboard can display and/or notify the user when a physiological parameter has reached a new high and/or new low. In some embodiments, the system can determine and the summary dashboard can display and/or notify the user when a physiological parameter is not generally optimal. For example, the system can determine and the summary dashboard can display and/or notify the user when any one of the user's physiological parameters is higher than an optimal physiological parameter. In another example, the system can determine and the summary dashboard can display and/or notify the user when any one of the user's physiological parameters is lower than an optimal physiological parameter.
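A notification rule for a new high, a new low, or a value outside an optimal range could be as simple as the comparison sketched below; the optimal range is a placeholder and would in practice be specific to each physiological parameter.

```python
def check_alerts(latest: float, history: list[float],
                 optimal_range: tuple[float, float]) -> list[str]:
    """Return dashboard notifications for a new high/low or a value outside
    an (illustrative) optimal range for a physiological parameter."""
    alerts = []
    if history and latest > max(history):
        alerts.append("New high for this parameter")
    if history and latest < min(history):
        alerts.append("New low for this parameter")
    low, high = optimal_range
    if latest < low:
        alerts.append("Below the optimal range")
    elif latest > high:
        alerts.append("Above the optimal range")
    return alerts
```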


In some embodiments, the system can automatically determine and display the physiological parameters and statistics disclosed herein.


IV. Visual Infographic Yin-Yang Display


FIGS. 12A-12C illustrate an embodiment of the user interface 400. The user interface 400 is similar or identical to the user interfaces 200, 300 discussed above in many respects. Accordingly, numerals used to identify features of the user interface 400 are incremented by a factor of one hundred to identify certain similar features of the user interfaces 200, 300. For example, as shown in FIGS. 12A and 12B, the user interface 400 can include a top navigation bar 408, one or more graphs 404, and a bottom navigation bar 412 as described above in connection with the user interfaces 200, 300. The top navigation bar 408 can include a title portion, a subtitle portion, a stretch icon and/or a filter stack icon. In some embodiments, the top navigation bar 408 can include a user profile 424 that can display a name and/or an image selected by the user. The bottom navigation bar 412 can include an information icon and a share icon.


The display information 401 of user interface 400 can include user data 402, such as one or more physiological parameters 403, and one or more graphs 404 as described above in connection with the user interfaces 200, 300. The user interface 400 can include any one, or any combination, of the features of the user interfaces 200, 300. For example, the user interface 400 can be substantially similar to the user interfaces 200, 300. However, in some embodiments the user interface 400 can be displayed in a different configuration than in user interfaces 200, 300.


For example, in the illustrated embodiments, the bottom navigation bar 412 can include a measure icon 414, a history icon 416, a team icon 418, and/or an options icon 420, among other icons. The measure icon 414 can be selected by the user to allow the system 100 to measure the one or more physiological parameters 403. In some embodiments, the user interface 400 can display that a user is measuring the physiological parameters 403. For example, the user interface 400 can display a signal waveform 422 within a widget (described in more detail below). In such configurations, the system 100 can push notifications in real time to the user and/or other users to notify the user and/or other users that a measurement is being taken. In some embodiments, the system 100 can push notifications at predefined intervals. The history icon 416 can be selected by the user to allow the user interface 400 to display a history of user data 402. The team icon 418 can be selected by the user to allow the user interface 400 to display user data of other team members. The options icon 420 can be selected by the user to allow the user interface 400 to display an options menu.


The user interface 400 can provide a user with an intuitive, easy and/or quick glance assessment of the balance of the user's fitness, goals, and/or wellness. Some embodiments of the user interface 400 can help to increase a user's usage of the user device by influencing the user. For example, FIGS. 12A-12B illustrate an example of the user interface 400. The user interface 400 can influence the user and provide an easily accessible comparison of one or more physiological parameters 403 by providing a plurality of graphs 430. Each graph 430 can correspond to the user and/or other users.


In some embodiments, the system can allow a user to more easily understand the balance of the user's fitness and/or athletic value. The system 100 can calculate a metric, such as a numeric value, a physiological parameter, a percentage, a weighted value, and/or a ranking. The metric can be calculated by the system 100 using various methods. For example, the system 100 can noninvasively measure one or more physiological parameters using methods disclosed herein. The system 100 can create an index for storing the user data 402, which can include the one or more physiological parameters. In some embodiments, the index can represent a comparison of a readiness score, an intensity score, a resting heart rate, a regularity of taking measurements, and/or a frequency of taking measurements, for example. In some embodiments, the index can illustrate a comparison of one or more metrics calculated based on the user data 402 and an intensity score, a resting heart rate, a regularity of taking measurements, and/or a frequency of taking measurements calculated based on another user's user data.


In some embodiments, the user interface 400 can update and display the one or more metrics in real-time. In some embodiments, the user interface 400 can automatically update the one or more metrics at predefined intervals.


Each metric can be calculated using various methods. For example, the system 100 can calculate the readiness score by performing a weighted comparison of directly and indirectly measured values of each data point. The weighted comparison can be calculated by calculating a weighted comparison of a physiological parameter, such as a first of day tHb, a first of day respiration rate, a first of day pulse rate, a previous post-workout tHb, a first of day SpO2, and/or a heart rate variability, for example, to an average physiological parameter calculated by the system 100 over a selected and/or predetermined period of time. In some embodiments, the weighted comparison can be calculated by comparing the physiological parameter to data collected by a third-party device and/or application, such as sleep and/or activity trackers. Overall, the readiness score can help to provide useful feedback to the user. For example, if the user has a readiness score of 95% calculated based on a consistently low pulse rate, low respiration rate, high tHb, full SpO2, and/or a healthy heart rate variability, for example, the readiness score can indicate to the user that the user is healthy.
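As a hedged illustration of the weighted-comparison idea (not the actual scoring formula), the sketch below compares each first of day value to its recent average and combines the normalized ratios with illustrative weights; the weights, the parameter list, and the handling of parameters for which lower values are favorable are assumptions.

```python
from statistics import mean

# Illustrative weights only; the real formula and parameter set are not specified here.
READINESS_WEIGHTS = {"tHb": 0.3, "SpO2": 0.2, "pulse_rate": 0.25, "respiration_rate": 0.25}
# Parameters where a lower first of day value than the baseline is favorable (an assumption).
LOWER_IS_BETTER = {"pulse_rate", "respiration_rate"}

def readiness_score(first_of_day: dict[str, float],
                    history: dict[str, list[float]]) -> float:
    """Weighted comparison of today's first of day values to recent averages,
    scaled to a 0-100 score (illustrative only)."""
    score = 0.0
    for name, weight in READINESS_WEIGHTS.items():
        baseline = mean(history[name])
        ratio = first_of_day[name] / baseline
        if name in LOWER_IS_BETTER:
            ratio = baseline / first_of_day[name]
        score += weight * min(ratio, 1.0)   # cap so one parameter cannot exceed its weight
    return round(100 * score, 1)
```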


In one example, the system 100 can calculate the intensity score by performing a weighted comparison of directly and indirectly measured values of each data point. The weighted comparison can be calculated by calculating a weighted comparison of physiological parameters measured during various activities, such as a post workout heart rate to a pre-workout heart rate, a post-workout SpO2 to a first of day SpO2, a post-workout respiration rate to a pre-workout respiration rate, a pre-workout tHb to a post-workout tHb, and/or third-party data such as bike power (watt-meters), running power (watt-meters), and/or stress scores collected from a third-party application such as activity trackers, for example.
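The intensity score could follow the same pattern, comparing post-workout values to pre-workout values; the parameter pairs and weights below are purely illustrative assumptions rather than the system's actual formula.

```python
def intensity_score(pre: dict[str, float], post: dict[str, float],
                    weights: dict[str, float]) -> float:
    """Weighted comparison of post-workout to pre-workout values of each
    parameter; larger relative changes yield a higher (illustrative) score."""
    total = 0.0
    for name, weight in weights.items():
        change = abs(post[name] - pre[name]) / pre[name]  # relative change
        total += weight * change
    return round(100 * total, 1)

# Hypothetical usage:
# intensity_score({"pulse_rate": 60, "SpO2": 98}, {"pulse_rate": 150, "SpO2": 95},
#                 {"pulse_rate": 0.7, "SpO2": 0.3})
```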


The user interface 400 can display the metric and/or index calculated by the system 100. In some embodiments, the user interface 400 can display a comparison of one or more metrics corresponding to one or more users. For example the one or more metrics can be stored in a database, which can be a remote database. The system 100 can compare the metric between various data points of the user data 402 and user data of other users, such as teammates, friends, and/or celebrities, among other users. The user interface 400 can display the results of this comparison.



FIGS. 12A and 12B illustrate the user interface 400 that can display a plurality of graphs 430. In the illustrated configuration, the graphs 430 display a comparison of an average pulse rate value and an average total hemoglobin value. Each of the plurality of graphs 430 can be displayed by the user interface 400 in a shape of a divided circle, such as a Yin Yang widget. In the illustrated configuration, a first half 432 of the widget represents the average pulse rate value and a second half 434 of the widget represents the average total hemoglobin value. Each of the first half and the second half of the widget can allow the user to easily determine whether the user's metrics are balanced. For example, the graph 430B displayed by the user interface 400 can represent a perfect balance. In the illustrated example, the first half and the second half of the widget are equal in shape. In some embodiments, the user interface 400 can illustrate unbalanced metrics. For example, graph 430A illustrates an unbalanced user profile. In these configurations, the first half 432 and the second half 434 of the widget are not equal in size and shape.
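One way to derive the proportions of the two halves of the widget is to normalize each metric against a target and compare the normalized values, as sketched below; the targets and the balance tolerance are assumptions introduced only for illustration.

```python
def widget_halves(pulse_avg: float, thb_avg: float,
                  pulse_target: float, thb_target: float) -> tuple[float, float]:
    """Return the fractions (summing to 1.0) of the divided-circle widget
    allotted to the pulse-rate half and the total-hemoglobin half.
    Equal fractions (0.5/0.5) indicate a perfectly balanced profile."""
    pulse_norm = pulse_avg / pulse_target
    thb_norm = thb_avg / thb_target
    total = pulse_norm + thb_norm
    return pulse_norm / total, thb_norm / total

def is_balanced(halves: tuple[float, float], tolerance: float = 0.05) -> bool:
    """Treat the widget as balanced when the two halves differ by no more
    than the given tolerance (an illustrative threshold)."""
    return abs(halves[0] - halves[1]) <= tolerance
```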


In some embodiments, the user interface 400 can display images 438 associated with each metric displayed in each half of the widget to allow the user to more easily determine what metric is being displayed by the user interface 400. In some embodiments, the outer rim 436 of the widget can be used to display certain metrics. For example, a color can be displayed by the user interface 400 around at least a portion of the widget to display one or more metrics and/or physiological parameters.


Each of the first and second halves 432, 434 of the widget can have at least one flag 440. The at least one flag 440 can be selected by the user. When the user selects the at least one flag, the user can influence another user, such as another team member. For example, when the user selects a first flag 440A of a second user's graph 430A, the system can send a notification to the second user to encourage the second user. In some embodiments, when the user selects a second flag 440B of a second user's graph 430A, the system can send a notification to the second user to congratulate the second user. The system can encourage and reward a user's positive behavior by encouraging participation between a plurality of users and/or facilitating interaction between a plurality of users. The users can socially reward other users and encourage other users to achieve a well-balanced lifestyle.


In some embodiments, the user interface 400 can display a graph 430B corresponding to a hero, professional athlete, and/or celebrity selected by the user. Accordingly, the user interface 400 can encourage the user to achieve a well-balanced lifestyle and/or enhanced performance by providing an easily accessible comparison to the hero. This can allow the user to more easily compare the balance displayed by the user's graph 430 to a goal balance displayed by the hero's graph 430B.


To view a widget in more detail, the user can select a particular graph 430. In some embodiments, the user can use the various methods for activating a zoom user interface 450, such as the zooming gestures disclosed herein. FIG. 12C illustrates an embodiment of the zoom user interface 450. The zoom user interface 450 can provide more detail and information than the graph 430 displayed in the user interface 400. In some embodiments, the zoom user interface 450 can display values and/or units of each of the metrics corresponding to the first half and the second half of the widget. In some embodiments, the zoom user interface 450 can display labels describing each of the metrics corresponding to the first half and the second half of the widget. In some embodiments, the zoom user interface 450 can display other information, such as a particular elevation at a particular location, a current location of the user, and/or an average elevation, among other relevant information.


V. Terminology

Many other variations than those described herein will be apparent from this disclosure. For example, depending on the embodiment, certain acts, events, or functions of any of the algorithms described herein can be performed in a different sequence, can be added, merged, or left out altogether (e.g., not all described acts or events are necessary for the practice of the algorithms). Moreover, in certain embodiments, acts or events can be performed concurrently, e.g., through multi-threaded processing, interrupt processing, or multiple processors or processor cores or on other parallel architectures, rather than sequentially. In addition, different tasks or processes can be performed by different machines and/or computing systems that can function together.


Each of the user interfaces shown includes one or more user interface controls that can be selected by a user, for example, using a browser or other application software associated with a patient or clinician device. The user interface controls shown are merely illustrative examples and can be varied in other embodiments. For instance, buttons, icons, dropdown boxes, select boxes, text boxes, check boxes, slider controls, and other user interface controls shown may be substituted with other types of user interface controls that provide the same or similar functionality. Further, user interface controls may be combined or divided into other sets of user interface controls such that similar functionality or the same functionality may be provided with very different looking user interfaces. Moreover, each of the user interface controls may be selected by a user using one or more input options, such as a mouse, touch screen input, or keyboard input, among other user interface input options.


The various illustrative logical blocks, modules, and algorithm steps described in connection with the embodiments disclosed herein can be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. The described functionality can be implemented in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the disclosure.


The various illustrative logical blocks and modules described in connection with the embodiments disclosed herein can be implemented or performed by a machine, such as a general purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general purpose processor can be a microprocessor, but in the alternative, the processor can be a controller, microcontroller, or state machine, combinations of the same, or the like. A processor can include electrical circuitry configured to process computer-executable instructions. In another embodiment, a processor includes an FPGA or other programmable device that performs logic operations without processing computer-executable instructions. A processor can also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration. A computing environment can include any type of computer system, including, but not limited to, a computer system based on a microprocessor, a mainframe computer, a digital signal processor, a portable computing device, a device controller, or a computational engine within an appliance, to name a few.


The steps of a method, process, or algorithm described in connection with the embodiments disclosed herein can be embodied directly in hardware, in a software module stored in one or more memory devices and executed by one or more processors, or in a combination of the two. A software module can reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of non-transitory computer-readable storage medium, media, or physical computer storage known in the art. An example storage medium can be coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium can be integral to the processor. The storage medium can be volatile or nonvolatile. The processor and the storage medium can reside in an ASIC.


Conditional language used herein, such as, among others, “can,” “might,” “may,” “e.g.,” and the like, unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or states. Thus, such conditional language is not generally intended to imply that features, elements and/or states are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without author input or prompting, whether these features, elements and/or states are included or are to be performed in any particular embodiment. The terms “comprising,” “including,” “having,” and the like are synonymous and are used inclusively, in an open-ended fashion, and do not exclude additional elements, features, acts, operations, and so forth. Also, the term “or” is used in its inclusive sense (and not in its exclusive sense) so that when used, for example, to connect a list of elements, the term “or” means one, some, or all of the elements in the list. Further, the term “each,” as used herein, in addition to having its ordinary meaning, can mean any subset of a set of elements to which the term “each” is applied.


While the above detailed description has shown, described, and pointed out novel features as applied to various embodiments, it will be understood that various omissions, substitutions, and changes in the form and details of the devices or algorithms illustrated can be made without departing from the spirit of the disclosure. As will be recognized, certain embodiments of the inventions described herein can be embodied within a form that does not provide all of the features and benefits set forth herein, as some features can be used or practiced separately from others.

Claims
  • 1. A system which assists a user in improving physiological performance during exercise by monitoring total hemoglobin measurements over extended time and providing user data in a useful way, the system comprising: a total hemoglobin sensor configured to obtain a plurality of total hemoglobin measurements of a user over a period of time, the period of time comprising a plurality of seasons of a year; a database configured to store the user data; and one or more hardware processors in communication with the database, the one or more hardware processors configured to associate the plurality of total hemoglobin measurements with corresponding ones of the plurality of seasons of the year and provide a graphical presentation indicating a relationship between at least some of the plurality of total hemoglobin measurements and a corresponding season of the plurality of seasons of the year.
  • 2. The system of claim 1, wherein the plurality of seasons of the year are graphically presented as a plurality of quadrants.
  • 3. The system of claim 2, wherein the processor is further configured to indicate a first total hemoglobin measurement in a first quadrant of the plurality of quadrants and indicate when the first total hemoglobin measurement was obtained.
  • 4. The system of claim 2, wherein the processor is further configured to provide the graphical presentation formatted with an indicator configured to indicate an optimal total hemoglobin measurement to allow the user to compare the first total hemoglobin measurement to the optimal total hemoglobin measurement.
  • 5. A method of conveying to a user physiological performance information by monitoring total hemoglobin measurements and providing user data, the method comprising: receiving a plurality of total hemoglobin measurements of a user over a period of time, the period of time comprising a plurality of seasons of a year; storing the user data; and using one or more hardware processors, associating the plurality of total hemoglobin measurements with corresponding ones of the plurality of seasons of the year, and providing a graphical presentation indicating a relationship between at least some of the plurality of total hemoglobin measurements and a corresponding season of the plurality of seasons of the year.
  • 6. The method of claim 5, wherein the plurality of seasons of the year are graphically presented as a plurality of quadrants.
  • 7. The method of claim 6, wherein the processor is further configured to indicate a first total hemoglobin measurement in a first quadrant of the plurality of quadrants and indicate when the first total hemoglobin measurement was obtained.
  • 8. The method of claim 7, wherein the processor is further configured to provide the graphical illustration formatted with an indicator configured to indicate an optimal total hemoglobin measurement to allow the user to compare the first total hemoglobin measurement to the optimal total hemoglobin measurement.
RELATED APPLICATIONS

The present application is a division of U.S. patent application Ser. No. 15/146,810, filed May 4, 2016, titled “NONINVASIVE SENSOR SYSTEM WITH VISUAL INFOGRAPHIC DISPLAY,” now U.S. Pat. No. 10,524,738, which claims priority benefit under 35 U.S.C. § 119(e) to U.S. Provisional Patent Application No. 62/156,581, filed May 4, 2015, titled “NONINVASIVE SENSOR SYSTEM WITH VISUAL MULTI QUADRANT INFOGRAPHIC DISPLAY,” U.S. Provisional Application No. 62/156,722, filed May 4, 2015, titled “NONINVASIVE SENSOR SYSTEM WITH VISUAL MULTI QUADRANT INFOGRAPHIC DISPLAY,” and U.S. Provisional Application No. 62/156,551, filed May 4, 2015, titled “NONINVASIVE SENSOR SYSTEM WITH VISUAL MULTI QUADRANT INFOGRAPHIC DISPLAY,” the disclosures of which are hereby incorporated by reference.

US Referenced Citations (1179)
Number Name Date Kind
4960128 Gordon et al. Oct 1990 A
4964408 Hink et al. Oct 1990 A
5041187 Hink et al. Aug 1991 A
5069213 Polczynski Dec 1991 A
5163438 Gordon et al. Nov 1992 A
5319355 Russek Jun 1994 A
5337744 Branigan Aug 1994 A
5341805 Stavridi et al. Aug 1994 A
D353195 Savage et al. Dec 1994 S
D353196 Savage et al. Dec 1994 S
5377676 Vari et al. Jan 1995 A
D359546 Savage et al. Jun 1995 S
5431170 Mathews Jul 1995 A
5436499 Namavar et al. Jul 1995 A
D361840 Savage et al. Aug 1995 S
D362063 Savage et al. Sep 1995 S
5452717 Branigan et al. Sep 1995 A
D363120 Savage et al. Oct 1995 S
5456252 Vari et al. Oct 1995 A
5479934 Imran Jan 1996 A
5482036 Diab et al. Jan 1996 A
5490505 Diab et al. Feb 1996 A
5494043 O'Sullivan et al. Feb 1996 A
5533511 Kaspari et al. Jul 1996 A
5534851 Russek Jul 1996 A
5561275 Savage et al. Oct 1996 A
5562002 Lalin Oct 1996 A
5590649 Caro et al. Jan 1997 A
5602924 Durand et al. Feb 1997 A
5632272 Diab et al. May 1997 A
5638816 Kiani-Azarbayjany et al. Jun 1997 A
5638818 Diab et al. Jun 1997 A
5645440 Tobler et al. Jul 1997 A
5671914 Kalkhoran et al. Sep 1997 A
5685299 Diab et al. Nov 1997 A
5726440 Kalkhoran et al. Mar 1998 A
D393830 Tobler et al. Apr 1998 S
5743262 Lepper, Jr. et al. Apr 1998 A
5747806 Khalil et al. May 1998 A
5750994 Schlager May 1998 A
5758644 Diab et al. Jun 1998 A
5760910 Lepper, Jr. et al. Jun 1998 A
5769785 Diab et al. Jun 1998 A
5782757 Diab et al. Jul 1998 A
5785659 Caro et al. Jul 1998 A
5791347 Flaherty et al. Aug 1998 A
5810734 Caro et al. Sep 1998 A
5823950 Diab et al. Oct 1998 A
5830131 Caro et al. Nov 1998 A
5833618 Caro et al. Nov 1998 A
5860919 Kiani-Azarbayjany et al. Jan 1999 A
5890929 Mills et al. Apr 1999 A
5904654 Wohltmann et al. May 1999 A
5919134 Diab Jul 1999 A
5934925 Tobler et al. Aug 1999 A
5940182 Lepper, Jr. et al. Aug 1999 A
5987343 Kinast Nov 1999 A
5995855 Kiani et al. Nov 1999 A
5997343 Mills et al. Dec 1999 A
6002952 Diab et al. Dec 1999 A
6010937 Karam et al. Jan 2000 A
6011986 Diab et al. Jan 2000 A
6027452 Flaherty et al. Feb 2000 A
6036642 Diab et al. Mar 2000 A
6040578 Malin et al. Mar 2000 A
6045509 Caro et al. Apr 2000 A
6066204 Haven May 2000 A
6067462 Diab et al. May 2000 A
6081735 Diab et al. Jun 2000 A
6088607 Diab et al. Jul 2000 A
6110522 Lepper, Jr. et al. Aug 2000 A
6115673 Malin et al. Sep 2000 A
6124597 Shehada Sep 2000 A
6128521 Marro et al. Oct 2000 A
6129675 Jay Oct 2000 A
6144868 Parker Nov 2000 A
6151516 Kiani-Azarbayjany et al. Nov 2000 A
6152754 Gerhardt et al. Nov 2000 A
6157850 Diab et al. Dec 2000 A
6165005 Mills et al. Dec 2000 A
6184521 Coffin, IV et al. Feb 2001 B1
6206830 Diab et al. Mar 2001 B1
6229856 Diab et al. May 2001 B1
6232609 Snyder et al. May 2001 B1
6236872 Diab et al. May 2001 B1
6241683 Macklem et al. Jun 2001 B1
6253097 Aronow et al. Jun 2001 B1
6255708 Sudharsanan et al. Jul 2001 B1
6256523 Diab et al. Jul 2001 B1
6263222 Diab et al. Jul 2001 B1
6278522 Lepper, Jr. et al. Aug 2001 B1
6280213 Tobler et al. Aug 2001 B1
6280381 Malin et al. Aug 2001 B1
6285896 Tobler et al. Sep 2001 B1
6301493 Marro et al. Oct 2001 B1
6308089 von der Ruhr et al. Oct 2001 B1
6317627 Ennen et al. Nov 2001 B1
6321100 Parker Nov 2001 B1
6325761 Jay Dec 2001 B1
6334065 Al-Ali et al. Dec 2001 B1
6343224 Parker Jan 2002 B1
6349228 Kiani et al. Feb 2002 B1
6360114 Diab et al. Mar 2002 B1
6368283 Xu et al. Apr 2002 B1
6371921 Caro et al. Apr 2002 B1
6377829 Al-Ali Apr 2002 B1
6388240 Schulz et al. May 2002 B2
6397091 Diab et al. May 2002 B2
6411373 Garside et al. Jun 2002 B1
6415167 Blank et al. Jul 2002 B1
6430437 Marro Aug 2002 B1
6430525 Weber et al. Aug 2002 B1
6463311 Diab Oct 2002 B1
6470199 Kopotic et al. Oct 2002 B1
6487429 Hockersmith et al. Nov 2002 B2
6501975 Diab et al. Dec 2002 B2
6505059 Kollias et al. Jan 2003 B1
6515273 Al-Ali Feb 2003 B2
6519487 Parker Feb 2003 B1
6525386 Mills et al. Feb 2003 B1
6526300 Kiani et al. Feb 2003 B1
6534012 Hazen et al. Mar 2003 B1
6541756 Schulz et al. Apr 2003 B2
6542764 Al-Ali et al. Apr 2003 B1
6580086 Schulz et al. Jun 2003 B1
6584336 Ali et al. Jun 2003 B1
6587196 Stippick et al. Jul 2003 B1
6587199 Luu Jul 2003 B1
6595316 Cybulski et al. Jul 2003 B2
6595929 Stivoric Jul 2003 B2
6597932 Tian et al. Jul 2003 B2
6597933 Kiani et al. Jul 2003 B2
6606511 Ali et al. Aug 2003 B1
6632181 Flaherty et al. Oct 2003 B2
6635559 Greenwald et al. Oct 2003 B2
6639668 Trepagnier Oct 2003 B1
6640116 Diab Oct 2003 B2
6640117 Makarewicz et al. Oct 2003 B2
6643530 Diab et al. Nov 2003 B2
6650917 Diab et al. Nov 2003 B2
6654624 Diab et al. Nov 2003 B2
6658276 Kiani et al. Dec 2003 B2
6661161 Lanzo et al. Dec 2003 B1
6671531 Al-Ali et al. Dec 2003 B2
6678543 Diab et al. Jan 2004 B2
6684090 Ali et al. Jan 2004 B2
6684091 Parker Jan 2004 B2
6697656 Al-Ali Feb 2004 B1
6697657 Shehada et al. Feb 2004 B1
6697658 Al-Ali Feb 2004 B2
RE38476 Diab et al. Mar 2004 E
6699194 Diab et al. Mar 2004 B1
6714804 Al-Ali et al. Mar 2004 B2
RE38492 Diab et al. Apr 2004 E
6721582 Trepagnier et al. Apr 2004 B2
6721585 Parker Apr 2004 B1
6725075 Al-Ali Apr 2004 B2
6728560 Kollias et al. Apr 2004 B2
6735459 Parker May 2004 B2
6738652 Mattu et al. May 2004 B2
6745060 Diab et al. Jun 2004 B2
6760607 Al-Ali Jul 2004 B2
6770028 Ali et al. Aug 2004 B1
6771994 Kiani et al. Aug 2004 B2
6788965 Ruchti et al. Sep 2004 B2
6792300 Diab et al. Sep 2004 B1
6813511 Diab et al. Nov 2004 B2
6816241 Grubisic Nov 2004 B2
6816741 Diab Nov 2004 B2
6822564 Al-Ali Nov 2004 B2
6826419 Diab et al. Nov 2004 B2
6830711 Mills et al. Dec 2004 B2
6850787 Weber et al. Feb 2005 B2
6850788 Al-Ali Feb 2005 B2
6852083 Caro et al. Feb 2005 B2
6861639 Al-Ali Mar 2005 B2
6876931 Lorenz et al. Apr 2005 B2
6898452 Al-Ali et al. May 2005 B2
6920345 Al-Ali et al. Jul 2005 B2
6931268 Kiani-Azarbayjany et al. Aug 2005 B1
6934570 Kiani et al. Aug 2005 B2
6939305 Flaherty et al. Sep 2005 B2
6943348 Coffin, IV Sep 2005 B1
6950687 Al-Ali Sep 2005 B2
6956649 Acosta et al. Oct 2005 B2
6961598 Diab Nov 2005 B2
6970792 Diab Nov 2005 B1
6979812 Al-Ali Dec 2005 B2
6985764 Mason et al. Jan 2006 B2
6990364 Ruchti et al. Jan 2006 B2
6993371 Kiani et al. Jan 2006 B2
6996427 Ali et al. Feb 2006 B2
6998247 Monfre et al. Feb 2006 B2
6999904 Weber et al. Feb 2006 B2
7003338 Weber et al. Feb 2006 B2
7003339 Diab et al. Feb 2006 B2
7015451 Dalke et al. Mar 2006 B2
7024233 Ali et al. Apr 2006 B2
7027849 Al-Ali Apr 2006 B2
7030749 Al-Ali Apr 2006 B2
7039449 Al-Ali May 2006 B2
7041060 Flaherty et al. May 2006 B2
7044918 Diab May 2006 B2
7048687 Reuss et al. May 2006 B1
7067893 Mills et al. Jun 2006 B2
D526719 Richie, Jr. et al. Aug 2006 S
7096052 Mason et al. Aug 2006 B2
7096054 Abdul-Hafiz et al. Aug 2006 B2
D529616 Deros et al. Oct 2006 S
7132641 Schulz et al. Nov 2006 B2
7133710 Acosta et al. Nov 2006 B2
7142901 Kiani et al. Nov 2006 B2
7149561 Diab Dec 2006 B2
7186966 Al-Ali Mar 2007 B2
7190261 Al-Ali Mar 2007 B2
7215984 Diab May 2007 B2
7215986 Diab May 2007 B2
7221971 Diab May 2007 B2
7225006 Al-Ali et al. May 2007 B2
7225007 Al-Ali May 2007 B2
RE39672 Shehada et al. Jun 2007 E
7239905 Kiani-Azarbayjany et al. Jul 2007 B2
7245953 Parker Jul 2007 B1
7254429 Schurman et al. Aug 2007 B2
7254431 Al-Ali Aug 2007 B2
7254433 Diab et al. Aug 2007 B2
7254434 Schulz et al. Aug 2007 B2
7272425 Al-Ali Sep 2007 B2
7274955 Kiani et al. Sep 2007 B2
D554263 Al-Ali Oct 2007 S
7280858 Al-Ali et al. Oct 2007 B2
7289835 Mansfield et al. Oct 2007 B2
7292883 De Felice et al. Nov 2007 B2
7295866 Al-Ali Nov 2007 B2
7328053 Diab et al. Feb 2008 B1
7332784 Mills et al. Feb 2008 B2
7340287 Mason et al. Mar 2008 B2
7341559 Schulz et al. Mar 2008 B2
7343186 Lamego et al. Mar 2008 B2
D566282 Al-Ali et al. Apr 2008 S
7355512 Al-Ali Apr 2008 B1
7356365 Schurman Apr 2008 B2
7371981 Abdul-Hafiz May 2008 B2
7373193 Al-Ali et al. May 2008 B2
7373194 Weber et al. May 2008 B2
7376453 Diab et al. May 2008 B1
7377794 Al Ali et al. May 2008 B2
7377899 Weber et al. May 2008 B2
7383070 Diab et al. Jun 2008 B2
7395158 Monfre et al. Jul 2008 B2
7415297 Al-Ali et al. Aug 2008 B2
7428432 Ali et al. Sep 2008 B2
7438683 Al-Ali et al. Oct 2008 B2
7440787 Diab Oct 2008 B2
7454240 Diab et al. Nov 2008 B2
7467002 Weber et al. Dec 2008 B2
7469157 Diab et al. Dec 2008 B2
7471969 Diab et al. Dec 2008 B2
7471971 Diab et al. Dec 2008 B2
7483729 Al-Ali et al. Jan 2009 B2
7483730 Diab et al. Jan 2009 B2
7489958 Diab et al. Feb 2009 B2
7496391 Diab et al. Feb 2009 B2
7496393 Diab et al. Feb 2009 B2
D587657 Al-Ali et al. Mar 2009 S
7499741 Diab et al. Mar 2009 B2
7499835 Weber et al. Mar 2009 B2
7500950 Al-Ali et al. Mar 2009 B2
7509154 Diab et al. Mar 2009 B2
7509494 Al-Ali Mar 2009 B2
7510849 Schurman et al. Mar 2009 B2
7514725 Wojtczuk et al. Apr 2009 B2
7519406 Blank et al. Apr 2009 B2
7526328 Diab et al. Apr 2009 B2
D592507 Wachman et al. May 2009 S
7530942 Diab May 2009 B1
7530949 Al Ali et al. May 2009 B2
7530955 Diab et al. May 2009 B2
7563110 Al-Ali et al. Jul 2009 B2
7593230 Abul-Haj et al. Sep 2009 B2
7596398 Al-Ali et al. Sep 2009 B2
7606608 Blank et al. Oct 2009 B2
7618375 Flaherty Nov 2009 B2
7620674 Ruchti et al. Nov 2009 B2
D606659 Kiani et al. Dec 2009 S
7629039 Eckerbom et al. Dec 2009 B2
7640140 Ruchti et al. Dec 2009 B2
7647083 Al-Ali et al. Jan 2010 B2
D609193 Al-Ali et al. Feb 2010 S
D614305 Al-Ali et al. Apr 2010 S
7697966 Monfre et al. Apr 2010 B2
7698105 Ruchti et al. Apr 2010 B2
RE41317 Parker May 2010 E
RE41333 Blank et al. May 2010 E
7729733 Al-Ali et al. Jun 2010 B2
7734320 Al-Ali Jun 2010 B2
7761127 Al-Ali et al. Jul 2010 B2
7761128 Al-Ali et al. Jul 2010 B2
7764982 Dalke et al. Jul 2010 B2
D621516 Kiani et al. Aug 2010 S
7791155 Diab Sep 2010 B2
7801581 Diab Sep 2010 B2
7822452 Schurman et al. Oct 2010 B2
RE41912 Parker Nov 2010 E
7844313 Kiani et al. Nov 2010 B2
7844314 Al-Ali Nov 2010 B2
7844315 Al-Ali Nov 2010 B2
7865222 Weber et al. Jan 2011 B2
7873497 Weber et al. Jan 2011 B2
7880606 Al-Ali Feb 2011 B2
7880626 Al-Ali et al. Feb 2011 B2
7891355 Al-Ali et al. Feb 2011 B2
7894868 Al-Ali et al. Feb 2011 B2
7899507 Al-Ali et al. Mar 2011 B2
7899518 Trepagnier et al. Mar 2011 B2
7904132 Weber et al. Mar 2011 B2
7909772 Popov et al. Mar 2011 B2
7910875 Al-Ali Mar 2011 B2
7919713 Al-Ali et al. Apr 2011 B2
7937128 Al-Ali May 2011 B2
7937129 Mason et al. May 2011 B2
7937130 Diab et al. May 2011 B2
7941199 Kiani May 2011 B2
7951086 Flaherty et al. May 2011 B2
7957780 Lamego et al. Jun 2011 B2
7962188 Kiani et al. Jun 2011 B2
7962190 Diab et al. Jun 2011 B1
7976472 Kiani Jul 2011 B2
7988637 Diab Aug 2011 B2
7990382 Kiani Aug 2011 B2
7991446 Al-Ali et al. Aug 2011 B2
8000761 Al-Ali Aug 2011 B2
8008088 Bellott et al. Aug 2011 B2
RE42753 Kiani-Azarbayjany et al. Sep 2011 E
8019400 Diab et al. Sep 2011 B2
8028701 Al-Ali et al. Oct 2011 B2
8029765 Bellott et al. Oct 2011 B2
8036727 Schurman et al. Oct 2011 B2
8036728 Diab et al. Oct 2011 B2
8046040 Ali et al. Oct 2011 B2
8046041 Diab et al. Oct 2011 B2
8046042 Diab et al. Oct 2011 B2
8048040 Kiani Nov 2011 B2
8050728 Al-Ali et al. Nov 2011 B2
RE43169 Parker Feb 2012 E
8118620 Al-Ali et al. Feb 2012 B2
8126528 Diab et al. Feb 2012 B2
8128572 Diab et al. Mar 2012 B2
8130105 Al-Ali et al. Mar 2012 B2
8145287 Diab et al. Mar 2012 B2
8150487 Diab et al. Apr 2012 B2
8175672 Parker May 2012 B2
8180420 Diab et al. May 2012 B2
8182443 Kiani May 2012 B1
8185180 Diab et al. May 2012 B2
8190223 Al-Ali et al. May 2012 B2
8190227 Diab et al. May 2012 B2
8203438 Kiani et al. Jun 2012 B2
8203704 Merritt et al. Jun 2012 B2
8204566 Schurman et al. Jun 2012 B2
8219172 Schurman et al. Jul 2012 B2
8224411 Al-Ali et al. Jul 2012 B2
8228181 Al-Ali Jul 2012 B2
8229532 Davis Jul 2012 B2
8229533 Diab et al. Jul 2012 B2
8233955 Al-Ali et al. Jul 2012 B2
8244325 Al-Ali et al. Aug 2012 B2
8255026 Al-Ali Aug 2012 B1
8255027 Al-Ali et al. Aug 2012 B2
8255028 Al-Ali et al. Aug 2012 B2
8260577 Weber et al. Sep 2012 B2
8265723 McHale et al. Sep 2012 B1
8274360 Sampath et al. Sep 2012 B2
8280473 Al-Ali Oct 2012 B2
8301217 Al-Ali et al. Oct 2012 B2
8306596 Schurman et al. Nov 2012 B2
8310336 Muhsin et al. Nov 2012 B2
8315683 Al-Ali et al. Nov 2012 B2
RE43860 Parker Dec 2012 E
8337403 Al-Ali et al. Dec 2012 B2
8346330 Lamego Jan 2013 B2
8353842 Al-Ali et al. Jan 2013 B2
8355766 MacNeish, III et al. Jan 2013 B2
8359080 Diab et al. Jan 2013 B2
8364223 Al-Ali et al. Jan 2013 B2
8364226 Diab et al. Jan 2013 B2
8374665 Lamego Feb 2013 B2
8385995 Al-ali et al. Feb 2013 B2
8385996 Smith et al. Feb 2013 B2
8388353 Kiani et al. Mar 2013 B2
8399822 Al-Ali Mar 2013 B2
8401602 Kiani Mar 2013 B2
8405608 Al-Ali et al. Mar 2013 B2
8414499 Al-Ali et al. Apr 2013 B2
8418524 Al-Ali Apr 2013 B2
8423106 Lamego et al. Apr 2013 B2
8428967 Olsen et al. Apr 2013 B2
8430817 Al-Ali et al. Apr 2013 B1
8437825 Dalvi et al. May 2013 B2
8455290 Siskavich Jun 2013 B2
8457703 Al-Ali Jun 2013 B2
8457707 Kiani Jun 2013 B2
8463349 Diab et al. Jun 2013 B2
8466286 Bellott et al. Jun 2013 B2
8471713 Poeze et al. Jun 2013 B2
8473020 Kiani et al. Jun 2013 B2
8483787 Al-Ali et al. Jul 2013 B2
8489364 Weber et al. Jul 2013 B2
8498684 Weber et al. Jul 2013 B2
8504128 Blank et al. Aug 2013 B2
8509867 Workman et al. Aug 2013 B2
8515509 Bruinsma et al. Aug 2013 B2
8523781 Al-Ali Sep 2013 B2
8529301 Al-Ali et al. Sep 2013 B2
8532727 Ali et al. Sep 2013 B2
8532728 Diab et al. Sep 2013 B2
D692145 Al-Ali et al. Oct 2013 S
8547209 Kiani et al. Oct 2013 B2
8548548 Al-Ali Oct 2013 B2
8548549 Schurman et al. Oct 2013 B2
8548550 Al-Ali et al. Oct 2013 B2
8560032 Al-Ali et al. Oct 2013 B2
8560034 Diab et al. Oct 2013 B1
8570167 Al-Ali Oct 2013 B2
8570503 Vo et al. Oct 2013 B2
8571617 Reichgott et al. Oct 2013 B2
8571618 Lamego et al. Oct 2013 B1
8571619 Al-Ali et al. Oct 2013 B2
8577431 Lamego et al. Nov 2013 B2
8581732 Al-Ali et al. Nov 2013 B2
8584345 Al-Ali et al. Nov 2013 B2
8588880 Abdul-Hafiz et al. Nov 2013 B2
8600467 Al-Ali et al. Dec 2013 B2
8606342 Diab Dec 2013 B2
8626255 Al-Ali et al. Jan 2014 B2
8630691 Lamego et al. Jan 2014 B2
8634889 Al-Ali et al. Jan 2014 B2
8641631 Sierra et al. Feb 2014 B2
8652060 Al-Ali Feb 2014 B2
8663107 Kiani Mar 2014 B2
8666468 Al-Ali Mar 2014 B1
8667967 Al-Ali et al. Mar 2014 B2
8670811 O'Reilly Mar 2014 B2
8670814 Diab et al. Mar 2014 B2
8676286 Weber et al. Mar 2014 B2
8682407 Al-Ali Mar 2014 B2
RE44823 Parker Apr 2014 E
RE44875 Kiani et al. Apr 2014 E
8688183 Bruinsma et al. Apr 2014 B2
8690799 Telfort et al. Apr 2014 B2
8700112 Kiani Apr 2014 B2
8702627 Telfort et al. Apr 2014 B2
8706179 Parker Apr 2014 B2
8712494 MacNeish, III et al. Apr 2014 B1
8715206 Telfort et al. May 2014 B2
8718735 Lamego et al. May 2014 B2
8718737 Diab et al. May 2014 B2
8718738 Blank et al. May 2014 B2
8720249 Al-Ali May 2014 B2
8721541 Al-Ali et al. May 2014 B2
8721542 Al-Ali et al. May 2014 B2
8723677 Kiani May 2014 B1
8740792 Kiani et al. Jun 2014 B1
8754776 Poeze et al. Jun 2014 B2
8755535 Telfort et al. Jun 2014 B2
8755856 Diab et al. Jun 2014 B2
8755872 Marinow Jun 2014 B1
8761850 Lamego Jun 2014 B2
8764671 Kiani Jul 2014 B2
8768423 Shakespeare et al. Jul 2014 B2
8771204 Telfort et al. Jul 2014 B2
8777634 Kiani et al. Jul 2014 B2
8781543 Diab et al. Jul 2014 B2
8781544 Al-Ali et al. Jul 2014 B2
8781549 Al-Ali et al. Jul 2014 B2
8788003 Schurman et al. Jul 2014 B2
8790268 Al-Ali Jul 2014 B2
8801613 Al-Ali et al. Aug 2014 B2
8821397 Al-Ali et al. Sep 2014 B2
8821415 Al-Ali et al. Sep 2014 B2
8830449 Lamego et al. Sep 2014 B1
8831700 Schurman et al. Sep 2014 B2
8840549 Al-Ali et al. Sep 2014 B2
8847740 Kiani et al. Sep 2014 B2
8849365 Smith et al. Sep 2014 B2
8852094 Al-Ali et al. Oct 2014 B2
8852994 Wojtczuk et al. Oct 2014 B2
8868147 Stippick et al. Oct 2014 B2
8868150 Al-Ali et al. Oct 2014 B2
8870792 Al-Ali et al. Oct 2014 B2
8886271 Kiani et al. Nov 2014 B2
8888539 Al-Ali et al. Nov 2014 B2
8888708 Diab et al. Nov 2014 B2
8892180 Weber et al. Nov 2014 B2
8897847 Al-Ali Nov 2014 B2
8909310 Lamego et al. Dec 2014 B2
8911377 Al-Ali Dec 2014 B2
8912909 Al-Ali et al. Dec 2014 B2
8920317 Al-Ali et al. Dec 2014 B2
8921699 Al-Ali et al. Dec 2014 B2
8922382 Al-Ali et al. Dec 2014 B2
8929964 Al-Ali et al. Jan 2015 B2
8942777 Diab et al. Jan 2015 B2
8948834 Diab et al. Feb 2015 B2
8948835 Diab Feb 2015 B2
8965471 Lamego Feb 2015 B2
8983564 Al-Ali Mar 2015 B2
8989831 Al-Ali et al. Mar 2015 B2
8996085 Kiani et al. Mar 2015 B2
8998809 Kiani Apr 2015 B2
9028429 Telfort et al. May 2015 B2
9037207 Al-Ali et al. May 2015 B2
9060721 Reichgott et al. Jun 2015 B2
9066666 Kiani Jun 2015 B2
9066680 Al-Ali et al. Jun 2015 B1
9072474 Al-Ali et al. Jul 2015 B2
9078560 Schurman et al. Jul 2015 B2
9084569 Weber et al. Jul 2015 B2
9095316 Welch et al. Aug 2015 B2
9106038 Telfort et al. Aug 2015 B2
9107625 Telfort et al. Aug 2015 B2
9107626 Al-Ali et al. Aug 2015 B2
9113831 Al-Ali Aug 2015 B2
9113832 Al-Ali Aug 2015 B2
9119595 Lamego Sep 2015 B2
9131881 Diab et al. Sep 2015 B2
9131882 Al-Ali et al. Sep 2015 B2
9131883 Al-Ali Sep 2015 B2
9131917 Telfort et al. Sep 2015 B2
9138180 Coverston et al. Sep 2015 B1
9138182 Al-Ali et al. Sep 2015 B2
9138192 Weber et al. Sep 2015 B2
9142117 Muhsin et al. Sep 2015 B2
9153112 Kiani et al. Oct 2015 B1
9153121 Kiani et al. Oct 2015 B2
9161696 Al-Ali et al. Oct 2015 B2
9161713 Al-Ali et al. Oct 2015 B2
9167995 Lamego et al. Oct 2015 B2
9176141 Al-Ali et al. Nov 2015 B2
9186102 Bruinsma et al. Nov 2015 B2
9192312 Al-Ali Nov 2015 B2
9192329 Al-Ali Nov 2015 B2
9192351 Telfort et al. Nov 2015 B1
9195385 Al-Ali et al. Nov 2015 B2
9211072 Kiani Dec 2015 B2
9211095 Al-Ali Dec 2015 B1
9218454 Kiani et al. Dec 2015 B2
9226696 Kiani Jan 2016 B2
9241662 Al-Ali et al. Jan 2016 B2
9245668 Vo et al. Jan 2016 B1
9259185 Abdul-Hafiz et al. Feb 2016 B2
9267572 Barker et al. Feb 2016 B2
9277880 Poeze et al. Mar 2016 B2
9289167 Diab et al. Mar 2016 B2
9295421 Kiani et al. Mar 2016 B2
9307928 Al-Ali et al. Apr 2016 B1
9323894 Kiani Apr 2016 B2
D755392 Hwang et al. May 2016 S
9326712 Kiani May 2016 B1
9333316 Kiani May 2016 B2
9339220 Lamego et al. May 2016 B2
9341565 Lamego et al. May 2016 B2
9351673 Diab et al. May 2016 B2
9351675 Al-Ali et al. May 2016 B2
9364181 Kiani et al. Jun 2016 B2
9368671 Wojtczuk et al. Jun 2016 B2
9370325 Al-Ali et al. Jun 2016 B2
9370326 McHale et al. Jun 2016 B2
9370335 Al-Ali et al. Jun 2016 B2
9375185 Ali et al. Jun 2016 B2
9386953 Al-Ali Jul 2016 B2
9386961 Al-Ali et al. Jul 2016 B2
9392945 Al-Ali et al. Jul 2016 B2
9397448 Al-Ali et al. Jul 2016 B2
9408542 Kinast et al. Aug 2016 B1
9436645 Al-Ali et al. Sep 2016 B2
9445759 Lamego et al. Sep 2016 B1
9466919 Kiani et al. Oct 2016 B2
9474474 Lamego et al. Oct 2016 B2
9480422 Al-Ali Nov 2016 B2
9480435 Olsen Nov 2016 B2
9492110 Al-Ali et al. Nov 2016 B2
9510779 Poeze et al. Dec 2016 B2
9517024 Kiani et al. Dec 2016 B2
9532722 Lamego et al. Jan 2017 B2
9538949 Al-Ali et al. Jan 2017 B2
9538980 Telfort et al. Jan 2017 B2
9549696 Lamego et al. Jan 2017 B2
9554737 Schurman et al. Jan 2017 B2
9560996 Kiani Feb 2017 B2
9560998 Al-Ali et al. Feb 2017 B2
9566019 Al-Ali et al. Feb 2017 B2
9579039 Jansen et al. Feb 2017 B2
9591975 Dalvi et al. Mar 2017 B2
9622692 Lamego et al. Apr 2017 B2
9622693 Diab Apr 2017 B2
D788312 Al-Ali et al. May 2017 S
9636055 Al-Ali et al. May 2017 B2
9636056 Al-Ali May 2017 B2
9649054 Lamego et al. May 2017 B2
9662052 Al-Ali et al. May 2017 B2
9668679 Schurman et al. Jun 2017 B2
9668680 Bruinsma et al. Jun 2017 B2
9668703 Al-Ali Jun 2017 B2
9675286 Diab Jun 2017 B2
9687160 Kiani Jun 2017 B2
9693719 Al-Ali et al. Jul 2017 B2
9693737 Al-Ali Jul 2017 B2
9697928 Al-Ali et al. Jul 2017 B2
9717425 Kiani et al. Aug 2017 B2
9717458 Lamego et al. Aug 2017 B2
9724016 Al-Ali et al. Aug 2017 B1
9724024 Al-Ali Aug 2017 B2
9724025 Kiani et al. Aug 2017 B1
9730640 Diab et al. Aug 2017 B2
9743887 Al-Ali et al. Aug 2017 B2
9749232 Sampath et al. Aug 2017 B2
9750442 Olsen Sep 2017 B2
9750443 Smith et al. Sep 2017 B2
9750461 Telfort Sep 2017 B1
9775545 Al-Ali et al. Oct 2017 B2
9775546 Diab et al. Oct 2017 B2
9775570 Al-Ali Oct 2017 B2
9778079 Al-Ali et al. Oct 2017 B1
9782077 Lamego et al. Oct 2017 B2
9782110 Kiani Oct 2017 B2
9787568 Lamego et al. Oct 2017 B2
9788735 Al-Ali Oct 2017 B2
9788768 Al-Ali et al. Oct 2017 B2
9795300 Al-Ali Oct 2017 B2
9795310 Al-Ali Oct 2017 B2
9795358 Telfort et al. Oct 2017 B2
9795739 Al-Ali et al. Oct 2017 B2
9801556 Kiani Oct 2017 B2
9801588 Weber et al. Oct 2017 B2
9808188 Perea et al. Nov 2017 B1
9814418 Weber et al. Nov 2017 B2
9820691 Kiani Nov 2017 B2
9833152 Kiani et al. Dec 2017 B2
9833180 Shakespeare et al. Dec 2017 B2
9839379 Al-Ali et al. Dec 2017 B2
9839381 Weber et al. Dec 2017 B1
9847002 Kiani et al. Dec 2017 B2
9847749 Kiani et al. Dec 2017 B2
9848800 Lee et al. Dec 2017 B1
9848806 Al-Ali et al. Dec 2017 B2
9848807 Lamego Dec 2017 B2
9861298 Eckerbom et al. Jan 2018 B2
9861304 Al-Ali et al. Jan 2018 B2
9861305 Weber et al. Jan 2018 B1
9867578 Al-Ali et al. Jan 2018 B2
9872623 Al-Ali Jan 2018 B2
9876320 Coverston et al. Jan 2018 B2
9877650 Muhsin et al. Jan 2018 B2
9877686 Al-Ali et al. Jan 2018 B2
9891079 Dalvi Feb 2018 B2
9895107 Al-Ali et al. Feb 2018 B2
9913617 Al-Ali et al. Mar 2018 B2
9924893 Schurman et al. Mar 2018 B2
9924897 Abdul-Hafiz Mar 2018 B1
9936917 Poeze et al. Apr 2018 B2
9943269 Muhsin et al. Apr 2018 B2
9949676 Al-Ali Apr 2018 B2
9955937 Telfort May 2018 B2
9965946 Al-Ali May 2018 B2
9980667 Kiani et al. May 2018 B2
D820865 Muhsin et al. Jun 2018 S
9986919 Lamego et al. Jun 2018 B2
9986952 Dalvi et al. Jun 2018 B2
9989560 Poeze et al. Jun 2018 B2
9993207 Al-Ali et al. Jun 2018 B2
10007758 Al-Ali et al. Jun 2018 B2
D822215 Al-Ali et al. Jul 2018 S
D822216 Barker et al. Jul 2018 S
10010276 Al-Ali et al. Jul 2018 B2
10032002 Kiani et al. Jul 2018 B2
10039482 Al-Ali et al. Aug 2018 B2
10052037 Kinast et al. Aug 2018 B2
10058275 Al-Ali et al. Aug 2018 B2
10064562 Al-Ali Sep 2018 B2
10086138 Novak, Jr. Oct 2018 B1
10092200 Al-Ali et al. Oct 2018 B2
10092249 Kiani et al. Oct 2018 B2
10098550 Al-Ali et al. Oct 2018 B2
10098591 Al-Ali et al. Oct 2018 B2
10098610 Al-Ali et al. Oct 2018 B2
10111591 Dyell et al. Oct 2018 B2
D833624 DeJong et al. Nov 2018 S
10123726 Al-Ali et al. Nov 2018 B2
10123729 Dyell et al. Nov 2018 B2
10130289 Al-Ali et al. Nov 2018 B2
10130291 Schurman et al. Nov 2018 B2
D835282 Barker et al. Dec 2018 S
D835283 Barker et al. Dec 2018 S
D835284 Barker et al. Dec 2018 S
D835285 Barker et al. Dec 2018 S
10149616 Al-Ali et al. Dec 2018 B2
10154815 Al-Ali et al. Dec 2018 B2
10159412 Lamego et al. Dec 2018 B2
10188296 Al-Ali et al. Jan 2019 B2
10188331 Al-Ali et al. Jan 2019 B1
10188348 Kiani et al. Jan 2019 B2
RE47218 Al-Ali Feb 2019 E
RE47244 Kiani et al. Feb 2019 E
RE47249 Kiani et al. Feb 2019 E
10194847 Al-Ali Feb 2019 B2
10194848 Kiani et al. Feb 2019 B1
10201298 Al-Ali et al. Feb 2019 B2
10205272 Kiani et al. Feb 2019 B2
10205291 Scruggs et al. Feb 2019 B2
10213108 Al-Ali Feb 2019 B2
10219706 Al-Ali Mar 2019 B2
10219746 McHale et al. Mar 2019 B2
10226187 Al-Ali et al. Mar 2019 B2
10226576 Kiani Mar 2019 B2
10231657 Al-Ali et al. Mar 2019 B2
10231670 Blank et al. Mar 2019 B2
10231676 Al-Ali et al. Mar 2019 B2
RE47353 Kiani et al. Apr 2019 E
10251585 Al-Ali et al. Apr 2019 B2
10251586 Lamego Apr 2019 B2
10255994 Sampath et al. Apr 2019 B2
10258265 Poeze et al. Apr 2019 B1
10258266 Poeze et al. Apr 2019 B1
10271748 Al-Ali Apr 2019 B2
10278626 Schurman et al. May 2019 B2
10278648 Al-Ali et al. May 2019 B2
10279247 Kiani May 2019 B2
10292628 Poeze et al. May 2019 B1
10292657 Abdul-Hafiz et al. May 2019 B2
10292664 Al-Ali May 2019 B2
10299708 Poeze et al. May 2019 B1
10299709 Perea et al. May 2019 B2
10299720 Brown et al. May 2019 B2
10305775 Lamego et al. May 2019 B2
10307111 Muhsin et al. Jun 2019 B2
10327337 Schmidt et al. Jun 2019 B2
10327713 Barker et al. Jun 2019 B2
10332630 Al-Ali Jun 2019 B2
10383520 Wojtczuk et al. Aug 2019 B2
10383527 Al-Ali Aug 2019 B2
10388120 Muhsin et al. Aug 2019 B2
D864120 Forrest et al. Oct 2019 S
10441181 Telfort et al. Oct 2019 B1
10441196 Eckerbom et al. Oct 2019 B2
10448844 Al-Ali et al. Oct 2019 B2
10448871 Al-Ali et al. Oct 2019 B2
10456038 Lamego et al. Oct 2019 B2
10463340 Telfort et al. Nov 2019 B2
10471159 Lapotko et al. Nov 2019 B1
10505311 Al-Ali et al. Dec 2019 B2
10524738 Olsen Jan 2020 B2
10532174 Al-Ali Jan 2020 B2
10537285 Shreim et al. Jan 2020 B2
10542903 Al-Ali et al. Jan 2020 B2
10555678 Dalvi et al. Feb 2020 B2
10568553 O'Neil et al. Feb 2020 B2
RE47882 Al-Ali Mar 2020 E
10608817 Haider et al. Mar 2020 B2
D880477 Forrest et al. Apr 2020 S
10617302 Al-Ali et al. Apr 2020 B2
10617335 Al-Ali et al. Apr 2020 B2
10637181 Al-Ali et al. Apr 2020 B2
D887548 Abdul-Hafiz et al. Jun 2020 S
D887549 Abdul-Hafiz et al. Jun 2020 S
10667764 Ahmed et al. Jun 2020 B2
D890708 Forrest et al. Jul 2020 S
10721785 Al-Ali Jul 2020 B2
10736518 Al-Ali et al. Aug 2020 B2
10750984 Pauley et al. Aug 2020 B2
D897098 Al-Ali Sep 2020 S
10779098 Iswanto et al. Sep 2020 B2
10827961 Iyengar et al. Nov 2020 B1
10828007 Telfort et al. Nov 2020 B1
10832818 Muhsin et al. Nov 2020 B2
10849554 Shreim et al. Dec 2020 B2
10856750 Indorf et al. Dec 2020 B2
D906970 Forrest et al. Jan 2021 S
10918281 Al-Ali et al. Feb 2021 B2
10932705 Muhsin et al. Mar 2021 B2
10932729 Kiani et al. Mar 2021 B2
10939878 Kiani et al. Mar 2021 B2
D916135 Indorf et al. Apr 2021 S
D917550 Indorf et al. Apr 2021 S
D917564 Indorf et al. Apr 2021 S
D917704 Al-Ali et al. Apr 2021 S
10987066 Chandran et al. Apr 2021 B2
10991135 Al-Ali et al. Apr 2021 B2
D919094 Al-Ali et al. May 2021 S
D919100 Al-Ali et al. May 2021 S
11006867 Al-Ali May 2021 B2
D921202 Al-Ali et al. Jun 2021 S
11024064 Muhsin et al. Jun 2021 B2
11026604 Chen et al. Jun 2021 B2
D925597 Chandran et al. Jul 2021 S
D927699 Al-Ali et al. Aug 2021 S
11076777 Lee et al. Aug 2021 B2
11114188 Poeze et al. Sep 2021 B2
11147518 Al-Ali et al. Oct 2021 B1
20010034477 Mansfield et al. Oct 2001 A1
20010039483 Brand et al. Nov 2001 A1
20020010401 Bushmakin et al. Jan 2002 A1
20020058864 Mansfield et al. May 2002 A1
20020133080 Apruzzese et al. Sep 2002 A1
20030013975 Kiani Jan 2003 A1
20030018243 Gerhardt et al. Jan 2003 A1
20030144582 Cohen et al. Jul 2003 A1
20030156288 Barnum et al. Aug 2003 A1
20030212312 Coffin, IV et al. Nov 2003 A1
20040106163 Workman, Jr. et al. Jun 2004 A1
20040206353 Conroy Oct 2004 A1
20050055276 Kiani et al. Mar 2005 A1
20050101845 Nihtila May 2005 A1
20050234317 Kiani Oct 2005 A1
20060073719 Kiani Apr 2006 A1
20060161054 Reuss et al. Jul 2006 A1
20060189871 Al-Ali et al. Aug 2006 A1
20070073116 Kiani et al. Mar 2007 A1
20070180140 Welch et al. Aug 2007 A1
20070244377 Cozad et al. Oct 2007 A1
20070271009 Conroy Nov 2007 A1
20070282478 Al-Ali et al. Dec 2007 A1
20080064965 Jay et al. Mar 2008 A1
20080094228 Welch et al. Apr 2008 A1
20080221418 Al-Ali et al. Sep 2008 A1
20090036759 Ault et al. Feb 2009 A1
20090093687 Telfort et al. Apr 2009 A1
20090095926 MacNeish, III Apr 2009 A1
20090247984 Lamego et al. Oct 2009 A1
20090275813 Davis Nov 2009 A1
20090275844 Al-Ali Nov 2009 A1
20100004518 Vo et al. Jan 2010 A1
20100030040 Poeze et al. Feb 2010 A1
20100099964 O'Reilly et al. Apr 2010 A1
20100110416 Barrett et al. May 2010 A1
20100234718 Sampath et al. Sep 2010 A1
20100270257 Wachman et al. Oct 2010 A1
20100298675 Al-Ali et al. Nov 2010 A1
20110001605 Kiani et al. Jan 2011 A1
20110028806 Merritt et al. Feb 2011 A1
20110028809 Goodman Feb 2011 A1
20110040197 Welch et al. Feb 2011 A1
20110082711 Poeze et al. Apr 2011 A1
20110087081 Kiani et al. Apr 2011 A1
20110105854 Kiani et al. May 2011 A1
20110118561 Tari et al. May 2011 A1
20110125060 Telfort et al. May 2011 A1
20110137297 Kiani et al. Jun 2011 A1
20110172498 Olsen et al. Jul 2011 A1
20110208015 Welch et al. Aug 2011 A1
20110213212 Al-Ali Sep 2011 A1
20110230733 Al-Ali Sep 2011 A1
20110237911 Lamego et al. Sep 2011 A1
20120059267 Lamego et al. Mar 2012 A1
20120123231 O'Reilly May 2012 A1
20120136582 Barrett et al. May 2012 A1
20120165629 Merritt et al. Jun 2012 A1
20120179006 Jansen et al. Jul 2012 A1
20120209082 Al-Ali Aug 2012 A1
20120209084 Olsen et al. Aug 2012 A1
20120226117 Lamego et al. Sep 2012 A1
20120227739 Kiani Sep 2012 A1
20120283524 Kiani et al. Nov 2012 A1
20120296178 Lamego et al. Nov 2012 A1
20120319816 Al-Ali Dec 2012 A1
20120330112 Lamego et al. Dec 2012 A1
20130023775 Lamego et al. Jan 2013 A1
20130041591 Lamego Feb 2013 A1
20130045685 Kiani Feb 2013 A1
20130046204 Lamego et al. Feb 2013 A1
20130060147 Welch et al. Mar 2013 A1
20130096405 Garfio Apr 2013 A1
20130096936 Sampath et al. Apr 2013 A1
20130190581 Al-Ali et al. Jul 2013 A1
20130197328 Diab et al. Aug 2013 A1
20130211214 Olsen Aug 2013 A1
20130243021 Siskavich Sep 2013 A1
20130253334 Al-Ali et al. Sep 2013 A1
20130267793 Meador et al. Oct 2013 A1
20130296672 O'Neil et al. Nov 2013 A1
20130296713 Al-Ali et al. Nov 2013 A1
20130317370 Dalvi et al. Nov 2013 A1
20130324808 Al-Ali et al. Dec 2013 A1
20130331660 Al-Ali et al. Dec 2013 A1
20130331670 Kiani Dec 2013 A1
20130338461 Lamego et al. Dec 2013 A1
20130345921 Al-Ali et al. Dec 2013 A1
20140012100 Al-Ali et al. Jan 2014 A1
20140034353 Al-Ali et al. Feb 2014 A1
20140051953 Lamego et al. Feb 2014 A1
20140058230 Abdul-Hafiz et al. Feb 2014 A1
20140066783 Kiani et al. Mar 2014 A1
20140077956 Sampath et al. Mar 2014 A1
20140081100 Muhsin et al. Mar 2014 A1
20140081175 Telfort Mar 2014 A1
20140094667 Schurman et al. Apr 2014 A1
20140100434 Diab et al. Apr 2014 A1
20140114199 Lamego et al. Apr 2014 A1
20140120564 Workman et al. May 2014 A1
20140121482 Merritt et al. May 2014 A1
20140121483 Kiani May 2014 A1
20140127137 Bellott et al. May 2014 A1
20140129702 Lamego et al. May 2014 A1
20140135588 Al-Ali et al. May 2014 A1
20140142401 Al-Ali et al. May 2014 A1
20140163344 Al-Ali Jun 2014 A1
20140163402 Lamego et al. Jun 2014 A1
20140166076 Kiani et al. Jun 2014 A1
20140171763 Diab Jun 2014 A1
20140180038 Kiani Jun 2014 A1
20140180154 Sierra et al. Jun 2014 A1
20140180160 Brown et al. Jun 2014 A1
20140187973 Brown et al. Jul 2014 A1
20140194709 Al-Ali et al. Jul 2014 A1
20140194711 Al-Ali Jul 2014 A1
20140194766 Al-Ali et al. Jul 2014 A1
20140206962 Tanii Jul 2014 A1
20140206963 Al-Ali Jul 2014 A1
20140213864 Abdul-Hafiz et al. Jul 2014 A1
20140236491 Katayev et al. Aug 2014 A1
20140243627 Diab et al. Aug 2014 A1
20140266790 Al-Ali et al. Sep 2014 A1
20140275808 Poeze et al. Sep 2014 A1
20140275835 Lamego et al. Sep 2014 A1
20140275871 Lamego et al. Sep 2014 A1
20140275872 Merritt et al. Sep 2014 A1
20140275881 Lamego et al. Sep 2014 A1
20140288400 Diab et al. Sep 2014 A1
20140303520 Telfort et al. Oct 2014 A1
20140316217 Purdon et al. Oct 2014 A1
20140316218 Purdon et al. Oct 2014 A1
20140316228 Blank et al. Oct 2014 A1
20140323825 Al-Ali et al. Oct 2014 A1
20140323897 Brown et al. Oct 2014 A1
20140323898 Purdon et al. Oct 2014 A1
20140330092 Al-Ali et al. Nov 2014 A1
20140330098 Merritt et al. Nov 2014 A1
20140330099 Al-Ali et al. Nov 2014 A1
20140333440 Kiani Nov 2014 A1
20140336481 Shakespeare et al. Nov 2014 A1
20140343436 Kiani Nov 2014 A1
20140357966 Al-Ali et al. Dec 2014 A1
20150005600 Blank et al. Jan 2015 A1
20150011907 Purdon et al. Jan 2015 A1
20150012231 Poeze et al. Jan 2015 A1
20150018650 Al-Ali et al. Jan 2015 A1
20150032029 Al-Ali et al. Jan 2015 A1
20150038859 Dalvi et al. Feb 2015 A1
20150073241 Lamego Mar 2015 A1
20150080754 Purdon et al. Mar 2015 A1
20150087936 Al-Ali et al. Mar 2015 A1
20150094546 Al-Ali Apr 2015 A1
20150097701 Al-Ali et al. Apr 2015 A1
20150099950 Al-Ali et al. Apr 2015 A1
20150099955 Al-Ali et al. Apr 2015 A1
20150101844 Al-Ali et al. Apr 2015 A1
20150106121 Muhsin et al. Apr 2015 A1
20150112151 Muhsin et al. Apr 2015 A1
20150116076 Al-Ali et al. Apr 2015 A1
20150165312 Kiani Jun 2015 A1
20150196249 Brown et al. Jul 2015 A1
20150216459 Al-Ali et al. Aug 2015 A1
20150238722 Al-Ali Aug 2015 A1
20150245773 Lamego et al. Sep 2015 A1
20150245794 Al-Ali Sep 2015 A1
20150257689 Al-Ali et al. Sep 2015 A1
20150272514 Kiani et al. Oct 2015 A1
20150351697 Weber et al. Dec 2015 A1
20150359429 Al-Ali et al. Dec 2015 A1
20150366507 Blank Dec 2015 A1
20160029932 Al-Ali Feb 2016 A1
20160058347 Reichgott et al. Mar 2016 A1
20160066824 Al-Ali et al. Mar 2016 A1
20160081552 Wojtczuk et al. Mar 2016 A1
20160095543 Telfort et al. Apr 2016 A1
20160095548 Al-Ali et al. Apr 2016 A1
20160103598 Al-Ali et al. Apr 2016 A1
20160143548 Al-Ali May 2016 A1
20160166182 Al-Ali et al. Jun 2016 A1
20160166183 Poeze et al. Jun 2016 A1
20160192869 Kiani et al. Jul 2016 A1
20160196388 Lamego Jul 2016 A1
20160197436 Barker et al. Jul 2016 A1
20160213281 Eckerbom et al. Jul 2016 A1
20160228043 O'Neil et al. Aug 2016 A1
20160233632 Scruggs et al. Aug 2016 A1
20160234944 Schmidt et al. Aug 2016 A1
20160270735 Diab et al. Sep 2016 A1
20160283665 Sampath et al. Sep 2016 A1
20160287090 Al-Ali et al. Oct 2016 A1
20160287786 Kiani Oct 2016 A1
20160296169 McHale et al. Oct 2016 A1
20160310052 Al-Ali et al. Oct 2016 A1
20160314260 Kiani Oct 2016 A1
20160324486 Al-Ali et al. Nov 2016 A1
20160324488 Olsen Nov 2016 A1
20160327984 Al-Ali et al. Nov 2016 A1
20160328528 Al-Ali et al. Nov 2016 A1
20160331332 Al-Ali Nov 2016 A1
20160367173 Dalvi et al. Dec 2016 A1
20170000394 Al-Ali et al. Jan 2017 A1
20170007134 Al-Ali et al. Jan 2017 A1
20170007198 Al-Ali et al. Jan 2017 A1
20170014083 Diab et al. Jan 2017 A1
20170014084 Al-Ali et al. Jan 2017 A1
20170024748 Haider Jan 2017 A1
20170027456 Kinast et al. Feb 2017 A1
20170042488 Muhsin Feb 2017 A1
20170055851 Al-Ali Mar 2017 A1
20170055882 Al-Ali et al. Mar 2017 A1
20170055887 Al-Ali Mar 2017 A1
20170055896 Al-Ali et al. Mar 2017 A1
20170079594 Telfort et al. Mar 2017 A1
20170086723 Al-Ali et al. Mar 2017 A1
20170143281 Olsen May 2017 A1
20170147774 Kiani May 2017 A1
20170156620 Al-Ali et al. Jun 2017 A1
20170173632 Al-Ali Jun 2017 A1
20170187146 Kiani et al. Jun 2017 A1
20170188919 Al-Ali et al. Jul 2017 A1
20170196464 Jansen et al. Jul 2017 A1
20170196470 Lamego et al. Jul 2017 A1
20170202490 Al-Ali et al. Jul 2017 A1
20170224262 Al-Ali Aug 2017 A1
20170228516 Sampath et al. Aug 2017 A1
20170245790 Al-Ali et al. Aug 2017 A1
20170251974 Shreim et al. Sep 2017 A1
20170251975 Shreim et al. Sep 2017 A1
20170258403 Abdul-Hafiz et al. Sep 2017 A1
20170311851 Schurman et al. Nov 2017 A1
20170311891 Kiani et al. Nov 2017 A1
20170325728 Al-Ali et al. Nov 2017 A1
20170332976 Al-Ali et al. Nov 2017 A1
20170340293 Al-Ali et al. Nov 2017 A1
20170360310 Kiani et al. Dec 2017 A1
20170367632 Al-Ali et al. Dec 2017 A1
20180008146 Al-Ali et al. Jan 2018 A1
20180013562 Haider et al. Jan 2018 A1
20180014752 Al-Ali et al. Jan 2018 A1
20180028124 Al-Ali et al. Feb 2018 A1
20180055385 Al-Ali Mar 2018 A1
20180055390 Kiani et al. Mar 2018 A1
20180055430 Diab et al. Mar 2018 A1
20180064381 Shakespeare et al. Mar 2018 A1
20180069776 Lamego et al. Mar 2018 A1
20180070867 Smith et al. Mar 2018 A1
20180082767 Al-Ali et al. Mar 2018 A1
20180085068 Telfort Mar 2018 A1
20180087937 Al-Ali et al. Mar 2018 A1
20180103874 Lee et al. Apr 2018 A1
20180103905 Kiani Apr 2018 A1
20180110478 Al-Ali Apr 2018 A1
20180116575 Perea et al. May 2018 A1
20180125368 Lamego et al. May 2018 A1
20180125430 Al-Ali et al. May 2018 A1
20180125445 Telfort et al. May 2018 A1
20180130325 Kiani et al. May 2018 A1
20180132769 Weber et al. May 2018 A1
20180132770 Lamego May 2018 A1
20180146901 Al-Ali et al. May 2018 A1
20180146902 Kiani et al. May 2018 A1
20180153442 Eckerbom et al. Jun 2018 A1
20180153446 Kiani Jun 2018 A1
20180153447 Al-Ali et al. Jun 2018 A1
20180153448 Weber et al. Jun 2018 A1
20180161499 Al-Ali et al. Jun 2018 A1
20180168491 Al-Ali et al. Jun 2018 A1
20180174679 Sampath et al. Jun 2018 A1
20180174680 Sampath et al. Jun 2018 A1
20180182484 Sampath et al. Jun 2018 A1
20180184917 Kiani Jul 2018 A1
20180192924 Al-Ali Jul 2018 A1
20180192953 Shreim et al. Jul 2018 A1
20180192955 Al-Ali et al. Jul 2018 A1
20180199871 Pauley et al. Jul 2018 A1
20180206795 Al-Ali Jul 2018 A1
20180206815 Telfort Jul 2018 A1
20180213583 Al-Ali Jul 2018 A1
20180214031 Kiani et al. Aug 2018 A1
20180214090 Al-Ali et al. Aug 2018 A1
20180218792 Muhsin et al. Aug 2018 A1
20180225960 Al-Ali et al. Aug 2018 A1
20180238718 Dalvi Aug 2018 A1
20180242853 Al-Ali Aug 2018 A1
20180242921 Muhsin et al. Aug 2018 A1
20180242923 Al-Ali et al. Aug 2018 A1
20180242924 Barker et al. Aug 2018 A1
20180242926 Muhsin et al. Aug 2018 A1
20180247353 Al-Ali et al. Aug 2018 A1
20180247712 Muhsin et al. Aug 2018 A1
20180249933 Schurman et al. Sep 2018 A1
20180253947 Muhsin et al. Sep 2018 A1
20180256087 Al-Ali et al. Sep 2018 A1
20180256113 Weber et al. Sep 2018 A1
20180285094 Housel et al. Oct 2018 A1
20180289325 Poeze et al. Oct 2018 A1
20180289337 Al-Ali et al. Oct 2018 A1
20180296161 Shreim et al. Oct 2018 A1
20180300919 Muhsin et al. Oct 2018 A1
20180310822 Indorf et al. Nov 2018 A1
20180310823 Al-Ali et al. Nov 2018 A1
20180317826 Muhsin Nov 2018 A1
20180317841 Novak, Jr. Nov 2018 A1
20180333055 Lamego et al. Nov 2018 A1
20180333087 Al-Ali Nov 2018 A1
20190000317 Muhsin et al. Jan 2019 A1
20190000362 Kiani et al. Jan 2019 A1
20190015023 Monfre Jan 2019 A1
20190021638 Al-Ali et al. Jan 2019 A1
20190029574 Schurman et al. Jan 2019 A1
20190029578 Al-Ali et al. Jan 2019 A1
20190038143 Al-Ali Feb 2019 A1
20190058280 Al-Ali et al. Feb 2019 A1
20190058281 Al-Ali et al. Feb 2019 A1
20190069813 Al-Ali Mar 2019 A1
20190069814 Al-Ali Mar 2019 A1
20190076028 Al-Ali et al. Mar 2019 A1
20190082979 Al-Ali et al. Mar 2019 A1
20190090748 Al-Ali Mar 2019 A1
20190090760 Kinast et al. Mar 2019 A1
20190090764 Al-Ali Mar 2019 A1
20190104973 Poeze et al. Apr 2019 A1
20190110719 Poeze et al. Apr 2019 A1
20190117070 Muhsin et al. Apr 2019 A1
20190117139 Al-Ali et al. Apr 2019 A1
20190117140 Al-Ali et al. Apr 2019 A1
20190117141 Al-Ali Apr 2019 A1
20190117930 Al-Ali Apr 2019 A1
20190122763 Sampath et al. Apr 2019 A1
20190133525 Al-Ali et al. May 2019 A1
20190142283 Lamego et al. May 2019 A1
20190142344 Telfort et al. May 2019 A1
20190150800 Poeze et al. May 2019 A1
20190150856 Kiani et al. May 2019 A1
20190167161 Al-Ali et al. Jun 2019 A1
20190200941 Chandran et al. Jul 2019 A1
20190239787 Pauley et al. Aug 2019 A1
20190320906 Olsen Oct 2019 A1
20190374139 Kiani et al. Dec 2019 A1
20190374713 Kiani et al. Dec 2019 A1
20200060869 Telfort et al. Feb 2020 A1
20200111552 Ahmed Apr 2020 A1
20200113435 Muhsin Apr 2020 A1
20200113488 Al-Ali et al. Apr 2020 A1
20200113496 Scruggs et al. Apr 2020 A1
20200113497 Triman et al. Apr 2020 A1
20200113520 Abdul-Hafiz et al. Apr 2020 A1
20200138288 Al-Ali et al. May 2020 A1
20200138368 Kiani et al. May 2020 A1
20200163597 Dalvi et al. May 2020 A1
20200196877 Vo et al. Jun 2020 A1
20200253474 Muhsin et al. Aug 2020 A1
20200253544 Belur Nagaraj et al. Aug 2020 A1
20200275841 Telfort et al. Sep 2020 A1
20200288983 Telfort et al. Sep 2020 A1
20200321793 Al-Ali et al. Oct 2020 A1
20200329983 Al-Ali et al. Oct 2020 A1
20200329984 Al-Ali et al. Oct 2020 A1
20200329993 Al-Ali et al. Oct 2020 A1
20200330037 Al-Ali et al. Oct 2020 A1
20210022628 Telfort et al. Jan 2021 A1
20210104173 Pauley et al. Apr 2021 A1
20210113121 Diab et al. Apr 2021 A1
20210117525 Kiani et al. Apr 2021 A1
20210118581 Kiani et al. Apr 2021 A1
20210121582 Krishnamani et al. Apr 2021 A1
20210161465 Barker et al. Jun 2021 A1
20210236729 Kiani et al. Aug 2021 A1
20210256267 Ranasinghe et al. Aug 2021 A1
20210256835 Ranasinghe et al. Aug 2021 A1
20210275101 Vo et al. Sep 2021 A1
20210290060 Ahmed Sep 2021 A1
20210290072 Forrest Sep 2021 A1
20210290080 Ahmed Sep 2021 A1
20210290120 Al-Ali Sep 2021 A1
20210290177 Novak, Jr. Sep 2021 A1
20210290184 Ahmed Sep 2021 A1
20210296008 Novak, Jr. Sep 2021 A1
20210330228 Olsen et al. Oct 2021 A1
Non-Patent Literature Citations (1)
US 8,845,543, 08/2001, Diab et al. (withdrawn)
Related Publications (1)
Number Date Country
20200163628 A1 May 2020 US
Provisional Applications (3)
Number Date Country
62156551 May 2015 US
62156722 May 2015 US
62156581 May 2015 US
Divisions (1)
Number Date Country
Parent 15146810 May 2016 US
Child 16688692 US