Physiological monitoring devices and methods for noise reduction in physiological signals based on subject activity type

Abstract
Methods and apparatus for monitoring a subject are described. A monitoring device configured to be attached to a body of a subject includes a sensor that is configured to detect and/or measure physiological information from the subject and at least one motion sensor configured to detect and/or measure subject motion information. The physiological sensor and motion sensor are in communication with a processor that is configured to receive and analyze signals produced by the physiological sensor and motion sensor. The processor is configured to process motion sensor signals to identify an activity characteristic of the subject. Once an activity characteristic is identified, the processor is configured to select a noise reference in response to identification of the activity characteristic of the subject, and then process physiological sensor signals using the noise reference to generate an output signal having reduced noise relative to the physiological sensor signal, to produce physiological information about the subject.
Description
FIELD

The present invention relates generally to monitoring devices and methods and, more particularly, to monitoring devices and methods for measuring physiological information.


BACKGROUND

Photoplethysmography (PPG) is based upon shining light into the human body and measuring how the scattered light intensity changes with each pulse of blood flow. The scattered light intensity will change in time with respect to changes in blood flow or blood opacity associated with heart beats, breaths, blood oxygen level (SpO2), and the like. Such a sensing methodology may require the magnitude of light energy reaching the volume of flesh being interrogated to be steady and consistent so that small changes in the quantity of scattered photons can be attributed to varying blood flow. If the incident and scattered photon count magnitude changes due to light coupling variation between the source or detector and the skin or other body tissue, then the signal of interest can be difficult to ascertain due to large photon count variability caused by motion artifacts. Changes in the surface area (and volume) of skin or other body tissue being impacted with photons, or varying skin surface curvature reflecting significant portions of the photons, may also significantly impact optical coupling efficiency. Physical activity, such as walking, cycling, running, etc., may cause motion artifacts in the optical scatter signal from the body, and time-varying changes in photon intensity due to motion artifacts may swamp out time-varying changes in photon intensity due to blood flow changes. Each of these changes in optical coupling can dramatically reduce the signal-to-noise ratio (S/N) of biometric PPG information relative to the total time-varying photonic interrogation count. This can result in much lower accuracy in metrics derived from PPG data, such as heart rate and breathing rate.
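As a non-limiting numerical illustration (not part of the patent disclosure; all amplitudes and frequencies below are hypothetical), the following Python sketch shows how a step-rate artifact of modest amplitude can dominate the pulsatile component of a simulated PPG signal, driving the S/N well below unity:

```python
import numpy as np

fs = 50.0                      # sample rate in Hz (illustrative)
t = np.arange(0, 10, 1 / fs)   # 10 s analysis window

# Clean pulsatile PPG component: ~75 BPM heart rate (1.25 Hz)
ppg = 1.0 * np.sin(2 * np.pi * 1.25 * t)

# Motion artifact: ~2.5 Hz stride rate with larger amplitude, as during running
artifact = 4.0 * np.sin(2 * np.pi * 2.5 * t)

measured = ppg + artifact

# Signal-to-noise ratio in dB: the artifact power dominates the pulse power
snr_db = 10 * np.log10(np.mean(ppg ** 2) / np.mean(artifact ** 2))
print(round(snr_db, 1))  # -12.0 dB: the artifact "swamps out" the pulse
```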


An earphone, such as a headset, earbud, etc., may be a good choice for incorporation of a photoplethysmography (PPG) device because it is a form factor that individuals are familiar with, it is a device that is commonly worn for long periods of time, and it is often used during exercise which is a time when individuals may benefit from having accurate heart rate data (or other physiological data). Unfortunately, incorporation of a photoplethysmography device into an earphone poses several challenges. For example, earphones may be uncomfortable to wear for long periods of time, particularly if they deform the ear surface. Moreover, human ear anatomy may vary significantly from person to person, so finding an earbud form that will fit comfortably in many ears may be difficult. Other form-factors for PPG devices, such as wristbands, armbands, clothing, and the like may be problematic for motion artifacts as well.


Some previous efforts to reduce motion artifacts from wearable PPG devices have focused on passive filtering, adaptive filtering (U.S. Pat. No. 8,923,941), or removing noise associated with footsteps (U.S. Pat. No. 8,157,730) or with harmonics of a subject's cadence (US 2015/0366509). Each of these methods is associated with both strengths and weaknesses, depending on the wearable PPG sensor location or the type of motion noise that is present for the subject wearing the PPG sensor.


SUMMARY

It should be appreciated that this Summary is provided to introduce a selection of concepts in a simplified form, the concepts being further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of this disclosure, nor is it intended to limit the scope of the invention.


According to some embodiments of the present invention, a monitoring device configured to be attached to a body of a subject includes a sensor that is configured to detect and/or measure physiological information from the subject (e.g., a PPG sensor, etc.), at least one motion sensor configured to detect and/or measure subject motion information, and at least one processor that is configured to receive and analyze physiological sensor signals produced by the physiological sensor and motion sensor signals produced by the motion sensor. The monitoring device may be configured to be positioned at or within an ear of the subject, secured to an appendage of the subject, or integrated into a form-factor that is wearable at a location of the body of the subject (for example, on a wrist). The physiological sensor may be an optical sensor that includes at least one optical emitter and at least one optical detector, although various other types of sensors may be utilized. The at least one processor is configured to process motion sensor signals to identify an activity characteristic of the subject. For example, the at least one processor is configured to determine if the subject is engaged in a periodic activity such as exercising, running, walking, etc., or if the subject is engaged in a lifestyle or non-periodic activity such as reading a book, desk work, eating, etc. A motion sensor's signal during random motion associated with lifestyle activities will be different than the signal during periodic activities. Once an activity characteristic is determined, the at least one processor is configured to select a noise reference in response to identification of the activity characteristic of the subject, and then process physiological sensor signals using the noise reference to generate output signals having reduced noise relative to the physiological sensor signals and produce physiological information about the subject.
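One simple way such an activity characteristic might be identified (an illustrative heuristic only, not necessarily the claimed method; the threshold and lag range are assumed tuning parameters) is to test the normalized autocorrelation of a motion sensor signal for a strong secondary peak at a plausible stride period:

```python
import numpy as np

def classify_activity(accel, fs, threshold=0.5):
    """Label motion as 'periodic' or 'non-periodic' from one accelerometer axis.

    Heuristic sketch: periodic activities (walking, running) produce a strong
    secondary peak in the signal's normalized autocorrelation, whereas random
    lifestyle motion does not. `threshold` is an illustrative tuning value.
    """
    x = accel - np.mean(accel)
    ac = np.correlate(x, x, mode="full")[len(x) - 1:]
    ac = ac / ac[0]                               # normalize so lag 0 == 1
    # Search lags corresponding to plausible stride rates (0.5-3 Hz)
    lo, hi = int(fs / 3.0), int(fs / 0.5)
    peak = np.max(ac[lo:hi])
    return "periodic" if peak > threshold else "non-periodic"

fs = 50.0
t = np.arange(0, 8, 1 / fs)
rng = np.random.default_rng(0)
running = np.sin(2 * np.pi * 2.5 * t) + 0.2 * rng.standard_normal(t.size)
desk_work = rng.standard_normal(t.size)           # random lifestyle motion

print(classify_activity(running, fs))    # periodic
print(classify_activity(desk_work, fs))  # non-periodic
```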


In some embodiments, the at least one motion sensor may be first and second motion sensors, and selecting the noise reference may include selecting the first motion sensor responsive to identification of the activity characteristic as periodic motion, or selecting the second motion sensor responsive to identification of the activity characteristic as non-periodic motion.


In some embodiments, the physiological sensor may be an optical sensor, and a signal response of the second motion sensor may more closely correspond to motion-related optical noise present in the physiological sensor signal than a signal response of the first motion sensor.


In some embodiments, processing the physiological sensor signal using the noise reference may include processing the physiological sensor signal using a frequency-domain-based algorithm or circuit responsive to identification of the activity characteristic as periodic motion, or processing the physiological sensor signal using a time-domain-based algorithm or circuit responsive to identification of the activity characteristic as non-periodic motion.


In some embodiments, the physiological sensor may be a photoplethysmography (PPG) sensor, the first motion sensor may be an inertial sensor, and the second motion sensor may be an optomechanical sensor.


In some embodiments, processing the physiological sensor signal using the noise reference may further include, in response to identification of the activity characteristic as periodic motion, processing the physiological sensor signal using a first algorithm or circuit, or, in response to identification of the activity characteristic as non-periodic motion, processing the physiological sensor signal using a second algorithm or circuit that is different from the first algorithm or circuit.


In some embodiments, the first algorithm or circuit may be based on frequency-domain processing and/or filtering, and the second algorithm or circuit may be based on time-domain processing and/or filtering.


In some embodiments, the first and second algorithms or circuits may utilize a same sensor or sensors of the at least one motion sensor.


In some embodiments, the first algorithm or circuit may be configured to perform spectral subtraction, and the second algorithm or circuit may be configured to perform adaptive filtering.
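As a hedged illustration of the frequency-domain option (spectral subtraction is a standard technique; the signals, window length, and spectral floor below are hypothetical, not values from this disclosure), the magnitude spectrum of the noise reference can be subtracted from the PPG magnitude spectrum while retaining the PPG phase:

```python
import numpy as np

def spectral_subtraction(ppg, noise_ref, floor=0.0):
    """Frequency-domain noise removal sketch: subtract the noise reference's
    magnitude spectrum from the PPG magnitude spectrum, keeping the PPG phase."""
    P = np.fft.rfft(ppg)
    N = np.fft.rfft(noise_ref)
    mag = np.maximum(np.abs(P) - np.abs(N), floor)  # clamp at the spectral floor
    return np.fft.irfft(mag * np.exp(1j * np.angle(P)), n=len(ppg))

fs = 50.0
t = np.arange(0, 8, 1 / fs)
pulse = np.sin(2 * np.pi * 1.25 * t)          # ~75 BPM heart rate component
cadence = 3.0 * np.sin(2 * np.pi * 2.5 * t)   # periodic step artifact
cleaned = spectral_subtraction(pulse + cadence, cadence)

# Residual artifact power drops sharply after subtraction
err_before = np.mean(cadence ** 2)
err_after = np.mean((cleaned - pulse) ** 2)
print(err_after < 0.01 * err_before)  # True
```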


In some embodiments, the optomechanical sensor may be configured to modulate light generated by an optical emitter in response to body motion, and to direct the light upon a pathway which is substantially unaffected by blood flow.


In some embodiments, body motion information contained in the noise reference may be greater than blood flow information contained therein.


In some embodiments, the noise reference may include information associated with motion of the subject and may be substantially free of information associated with blood flow of the subject.


In some embodiments, the physiological information may include one or more of the following: subject heart rate, subject respiration rate, subject R-R interval (RRi), subject heart rate variability (HRV), subject blood pressure, subject blood analyte levels, and subject cardiovascular properties.


According to some embodiments of the present invention, a monitoring device configured to be attached to a bodily location of a subject includes a physiological sensor configured to detect and/or measure physiological data from the subject and generate a physiological sensor signal representative thereof, at least one motion sensor configured to detect and/or measure subject motion data and generate a motion sensor signal representative thereof, and at least one processor coupled to the physiological sensor and the motion sensor. The at least one processor is configured to perform operations including processing the motion sensor signal received from the at least one motion sensor to identify an activity characteristic of the subject, selecting a noise reference in response to identification of the activity characteristic, and processing the physiological sensor signal received from the physiological sensor using the noise reference to generate an output signal having reduced noise relative to the physiological sensor signal.


According to some embodiments of the present invention, a computer program product includes a non-transitory computer readable storage medium having computer readable program code embodied therein. When executed by at least one processor, the computer readable program code causes the at least one processor to perform operations including processing a motion sensor signal received from at least one motion sensor to identify an activity characteristic of a subject, selecting a noise reference in response to identification of the activity characteristic, and processing a physiological sensor signal received from a physiological sensor using the noise reference to generate an output signal having reduced noise relative to the physiological sensor signal.


Monitoring devices according to embodiments of the present invention can be advantageous over some conventional monitoring devices because, by detecting an activity characteristic of a subject, a corresponding noise reference can be selected and/or modified to reduce noise that may be present in a physiological sensor signal so as to provide more accurate physiological information about the subject. For example, for optimal heart rate tracking accuracy, it may be advantageous to have some context about the type of activity being performed by a subject (i.e., whether the subject is engaged in lifestyle activities or periodic activities) in order to select a noise reference that more closely corresponds to the motion-related noise in the signal from which the heart rate is derived.


It is noted that aspects of the invention described with respect to one embodiment may be incorporated in a different embodiment although not specifically described relative thereto. That is, all embodiments and/or features of any embodiment can be combined in any way and/or combination, even if not specifically described in that manner. These and other aspects of the present invention are explained in detail below.





BRIEF DESCRIPTION OF THE DRAWINGS

The accompanying drawings, which form a part of the specification, illustrate various embodiments of the present invention. The drawings and description together serve to fully explain embodiments of the present invention.



FIGS. 1A-1B illustrate a monitoring device that can be positioned within an ear of a subject, according to some embodiments of the present invention.



FIG. 2A illustrates a monitoring device that can be positioned around an appendage (such as an arm, leg, finger, toe, etc.) of the body of a subject, according to some embodiments of the present invention.



FIG. 2B is a cross sectional view of the monitoring device of FIG. 2A.



FIG. 3 is a block diagram of a monitoring device according to some embodiments of the present invention.



FIGS. 4-7 are flowcharts of operations for monitoring a subject according to embodiments of the present invention.



FIG. 8 is a graph illustrating heart rate data for a subject plotted over time compared to a benchmark for a range of activity mode settings of a monitoring device, according to some embodiments of the present invention.



FIG. 9 is a graph illustrating heart rate data for a subject plotted over time compared to a benchmark for a range of activity mode settings of a monitoring device, according to some embodiments of the present invention.



FIG. 10 is a spectrogram of accelerometer data showing the subject monitored in FIGS. 8 and 9 performing non-periodic activity and periodic activity.



FIG. 11 is a flowchart of operations for selecting a noise reference responsive to monitoring a subject according to embodiments of the present invention.





DETAILED DESCRIPTION

The present invention will now be described more fully hereinafter with reference to the accompanying figures, in which embodiments of the invention are shown. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Like numbers refer to like elements throughout. In the figures, certain layers, components or features may be exaggerated for clarity, and broken lines illustrate optional features or operations unless specified otherwise. In addition, the sequence of operations (or steps) is not limited to the order presented in the figures and/or claims unless specifically indicated otherwise. Features described with respect to one figure or embodiment can be associated with another embodiment or figure although not specifically described or shown as such.


It will be understood that when a feature or element is referred to as being “on” another feature or element, it can be directly on the other feature or element or intervening features and/or elements may also be present. In contrast, when a feature or element is referred to as being “directly on” another feature or element, there are no intervening features or elements present. It will also be understood that, when a feature or element is referred to as being “secured”, “connected”, “attached” or “coupled” to another feature or element, it can be directly secured, directly connected, attached or coupled to the other feature or element or intervening features or elements may be present. In contrast, when a feature or element is referred to as being “directly secured”, “directly connected”, “directly attached” or “directly coupled” to another feature or element, there are no intervening features or elements present. Although described or shown with respect to one embodiment, the features and elements so described or shown can apply to other embodiments.


The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise.


As used herein, the terms “comprise”, “comprising”, “comprises”, “include”, “including”, “includes”, “have”, “has”, “having”, or variants thereof are open-ended, and include one or more stated features, elements, steps, components or functions but do not preclude the presence or addition of one or more other features, elements, steps, components, functions or groups thereof. Furthermore, as used herein, the common abbreviation “e.g.”, which derives from the Latin phrase “exempli gratia,” may be used to introduce or specify a general example or examples of a previously mentioned item, and is not intended to be limiting of such item. The common abbreviation “i.e.”, which derives from the Latin phrase “id est,” may be used to specify a particular item from a more general recitation.


As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items and may be abbreviated as “/”.


As used herein, phrases such as “between X and Y” and “between about X and Y” should be interpreted to include X and Y. As used herein, phrases such as “between about X and Y” mean “between about X and about Y.” As used herein, phrases such as “from about X to Y” mean “from about X to about Y.”


Spatially relative terms, such as “under”, “below”, “lower”, “over”, “upper” and the like, may be used herein for ease of description to describe one element or feature's relationship to another element(s) or feature(s) as illustrated in the figures. It will be understood that the spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. For example, if a device in the figures is inverted, elements described as “under” or “beneath” other elements or features would then be oriented “over” the other elements or features. Thus, the exemplary term “under” can encompass both an orientation of over and under. The device may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly. Similarly, the terms “upwardly”, “downwardly”, “vertical”, “horizontal” and the like are used herein for the purpose of explanation only unless specifically indicated otherwise.


It will be understood that although the terms first and second are used herein to describe various features or elements, these features or elements should not be limited by these terms. These terms are only used to distinguish one feature or element from another feature or element. Thus, a first feature or element discussed below could be termed a second feature or element, and similarly, a second feature or element discussed below could be termed a first feature or element without departing from the teachings of the present invention.


Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the specification and relevant art and should not be interpreted in an idealized or overly formal sense unless expressly so defined herein. Well-known functions or constructions may not be described in detail for brevity and/or clarity.


The term “about”, as used herein with respect to a value or number, means that the value or number can vary more or less, for example by +/−20%, +/−10%, +/−5%, +/−1%, +/−0.5%, +/−0.1%, etc.


The terms “sensor”, “sensing element”, and “sensor module”, as used herein, are interchangeable and refer to a sensor element or group of sensor elements that may be utilized to sense information, such as information (e.g., physiological information, body motion, etc.) from the body of a subject and/or environmental information in a vicinity of the subject. A sensor/sensing element/sensor module may comprise one or more of the following: a detector element, an emitter element, a processing element, optics, mechanical support, supporting circuitry, and the like. Both a single sensor element and a collection of sensor elements may be considered a sensor, a sensing element, or a sensor module.


The term “optical emitter”, as used herein, may include a single optical emitter and/or a plurality of separate optical emitters that are associated with each other.


The term “optical detector”, as used herein, may include a single optical detector and/or a plurality of separate optical detectors that are associated with each other.


The term “wearable sensor module”, as used herein, refers to a sensor module configured to be worn on or near the body of a subject.


The terms “monitoring device” and “biometric monitoring device”, as used herein, are interchangeable and include any type of device, article, or clothing that may be worn by and/or attached to a subject and that includes at least one sensor/sensing element/sensor module. Exemplary monitoring devices may be embodied in an earpiece, a headpiece, a finger clip, a digit (finger or toe) piece, a limb band (such as an arm band or leg band), an ankle band, a wrist band, a nose piece, a sensor patch, eyewear (such as glasses or shades), apparel (such as a shirt, hat, underwear, etc.), a mouthpiece or tooth piece, contact lenses, or the like.


The term “monitoring” refers to the act of measuring, quantifying, qualifying, estimating, sensing, calculating, interpolating, extrapolating, inferring, deducing, or any combination of these actions. More generally, “monitoring” refers to a way of getting information via one or more sensing elements. For example, “blood health monitoring” includes monitoring blood gas levels, blood hydration, and metabolite/electrolyte levels.


The term “headset”, as used herein, is intended to include any type of device or earpiece that may be attached to or near the ear (or ears) of a user and may have various configurations, without limitation. Headsets incorporating optical sensor modules, as described herein, may include mono headsets (a device having only one earbud, one earpiece, etc.) and stereo headsets (a device having two earbuds, two earpieces, etc.), earbuds, hearing aids, ear jewelry, face masks, headbands, and the like. In some embodiments, the term “headset” may broadly include headset elements that are not located on the head but are associated with the headset. For example, in a “medallion” style wireless headset, where the medallion comprises the wireless electronics and the headphones are plugged into or hard-wired into the medallion, the wearable medallion would be considered part of the headset as a whole. Similarly, in some cases, if a mobile phone or other mobile device is intimately associated with a plugged-in headphone, then the term “headset” may refer to the headphone-mobile device combination. The terms “headset” and “earphone”, as used herein, are interchangeable.


The term “physiological” refers to matter or energy of or from the body of a creature (e.g., humans, animals, etc.). In embodiments of the present invention, the term “physiological” is intended to be used broadly, covering both physical and psychological matter and energy of or from the body of a creature.


The term “body” refers to the body of a subject (human or animal) that may wear a monitoring device, according to embodiments of the present invention. The term “bodily location” refers to a location of the body on which (or in which) a monitoring device may be worn.


The term “processor” is used broadly to refer to a signal processor or computing system or processing or computing method which may be localized or distributed. For example, a localized signal processor may comprise one or more signal processors or processing methods localized to a general location, such as to a wearable device. Examples of such wearable devices may comprise an earpiece, a headpiece, a finger clip, a digit (finger or toe) piece, a limb band (such as an arm band or leg band), an ankle band, a wrist band, a nose piece, a sensor patch, eyewear (such as glasses or shades), apparel (such as a shirt, hat, underwear, etc.), a mouthpiece or tooth piece, contact lenses, or the like. Examples of a distributed processor comprise “the cloud”, the internet, a remote database, a remote processor computer, a plurality of remote processors or computers in communication with each other, or the like, or processing methods distributed amongst one or more of these elements. The key difference is that a distributed processor may include delocalized elements, whereas a localized processor may work independently of a distributed processing system. As a specific example, microprocessors, microcontrollers, ASICs (application specific integrated circuits), analog processing circuitry, or digital signal processors are a few non-limiting examples of physical signal processors that may be found in wearable devices.


The term “remote” does not necessarily mean that a remote device is a wireless device or that it is a long distance away from a device in communication therewith. Rather, the term “remote” is intended to reference a device or system that is distinct from another device or system or that is not substantially reliant on another device or system for core functionality. For example, a computer wired to a wearable device may be considered a remote device, as the two devices are distinct and/or not substantially reliant on each other for core functionality. However, any wireless device (such as a portable device, for example) or system (such as a remote database for example) is considered remote to any other wireless device or system.


The terms “signal analysis frequency” and “signal sampling rate”, as used herein, are interchangeable and refer to the number of samples per second (or per other unit) taken from a continuous sensor (i.e., physiological sensor and environmental sensor) signal to ultimately make a discrete signal.


The term “sensor module interrogation power”, as used herein, refers to the amount of electrical power required to operate one or more sensors (i.e., physiological sensors and environmental sensors) of a sensor module and/or any processing electronics or circuitry (such as microprocessors and/or analog processing circuitry) associated therewith. Examples of decreasing the sensor interrogation power may include lowering the voltage or current through a sensor element (such as lowering the voltage or current applied to a pair of electrodes), lowering the polling (or polling rate) of a sensor element (such as lowering the frequency at which an optical emitter is flashed on/off in a PPG sensor), lowering the sampling frequency of a stream of data (such as lowering the sampling frequency of the output of an optical detector in a PPG sensor), selecting a lower-power algorithm (such as selecting a power-efficient time-domain processing method for measuring heart rate vs. a more power-hungry frequency-domain processing method), or the like. Lowering the interrogation power may also include powering only one electrode, or powering fewer electrodes, in a sensor module or sensor element such that less total interrogation power is exposed to the body of a subject. For example, lowering the interrogation power of a PPG sensor may comprise illuminating only one light-emitting diode rather than a plurality of light-emitting diodes that may be present in the sensor module, and lowering the interrogation power of a bioimpedance sensor may comprise powering only one electrode pair rather than a plurality of electrodes that may be present in the bioimpedance sensor module.
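A minimal sketch of such a power policy (all numeric settings below are illustrative assumptions, not values from this disclosure) might map a low-power mode to the levers just described:

```python
def interrogation_settings(low_power):
    """Sketch of the power-reduction levers described above. The rates,
    algorithm labels, and emitter counts are illustrative assumptions only:
    lowering polling and sampling rates, choosing a more power-efficient
    time-domain algorithm, and powering fewer emitters all reduce the
    sensor module interrogation power."""
    if low_power:
        return {"led_poll_hz": 25, "sample_hz": 25,
                "algorithm": "time-domain", "active_leds": 1}
    return {"led_poll_hz": 100, "sample_hz": 100,
            "algorithm": "frequency-domain", "active_leds": 3}

print(interrogation_settings(True)["active_leds"])   # 1
print(interrogation_settings(False)["algorithm"])    # frequency-domain
```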


The term “polling” typically refers to controlling the intensity of an energy emitter of a sensor or to the “polling rate” and/or duty cycle of an energy emitter element in a sensor, such as an optical emitter in a PPG sensor or an ultrasonic driver in an ultrasonic sensor. Polling may also refer to the process of collecting and not collecting sensor data at certain periods of time. For example, a PPG sensor may be “polled” by controlling the intensity of one or more optical emitters, i.e. by pulsing the optical emitter over time. Similarly, the detector of a PPG sensor may be polled by reading data from that sensor only at a certain point in time or at certain intervals, i.e., as in collecting data from the detector of a PPG sensor for a brief period during each optical emitter pulse. A sensor may also be polled by turning on or off one or more elements of that sensor in time, such as when a PPG sensor is polled to alternate between multiple LED wavelengths over time or when an ultrasonic sensor is polled to alternate between mechanical vibration frequencies over time.


The terms “sampling frequency”, “signal analysis frequency”, and “signal sampling rate”, as used herein, are interchangeable and refer to the number of samples per second (or per other unit) taken from a continuous sensor or sensing element (for example, the sampling rate of the thermopile output in a tympanic temperature sensor).


The term “noise reference” refers to a reference information source for an undesired, dynamic noise signal that may be present within a sensor signal. A noise reference signal may be generated by a noise reference source, and this noise reference signal may be processed (i.e., via digital and/or analog electronic processing) to attenuate noise from the sensor signal, yielding a filtered signal that contains cleaner sensor information (i.e., such that the desired sensor information remains with reduced noise). As a specific example, for a PPG sensor worn by a subject, a motion noise reference may include an accelerometer, a pressure sensor, a piezoelectric sensor, a capacitive sensor, an inductive sensor, an optical sensor, or the like. Ideally, the noise reference signal may contain information associated with the motion of the subject, but may contain little (if any) information associated with the blood flow of the subject. Stated another way, the ratio of body motion information to arterial blood flow information contained in the noise reference signal may be relatively high.


Motion noise information generated by the motion noise reference can be processed with PPG sensor data to attenuate dynamically changing motion noise from the PPG sensor, such that the resulting signal contains cleaner information about blood flow (and less motion noise information). In accordance with some embodiments described herein, one motion noise reference for a wearable PPG sensor is an optomechanical sensor also known as a “blocked channel” sensor. This optomechanical sensor can be designed or configured to modulate light generated by an optical emitter in response to body motion, and to direct that light upon a pathway that does not interact with blood flow; in this manner, the optomechanical sensor can be configured to measure body motion and not blood flow motion, so that the noise reference signal may be used to filter out motion noise from the PPG signal with analog or digital signal processing (i.e. filtering). Nonlimiting examples of such noise references are presented in commonly-owned U.S. Pat. No. 9,289,175 entitled “LIGHT-GUIDING DEVICES AND MONITORING DEVICES INCORPORATING SAME,” and International Patent Application No. PCT/US2016/046273 entitled “METHODS AND APPARATUS FOR DETECTING MOTION VIA OPTOMECHANICS,” the disclosures of which are incorporated by reference herein.
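As an illustrative sketch of such time-domain processing (a standard normalized LMS canceller, not necessarily the algorithm of the referenced patents; the tap count, step size, and signals are assumptions), the noise reference can be adaptively mapped onto the motion artifact in the PPG signal and subtracted:

```python
import numpy as np

def nlms_cancel(ppg, noise_ref, taps=8, mu=0.1, eps=1e-6):
    """Time-domain noise cancellation sketch: a normalized LMS adaptive filter
    learns the mapping from the noise reference (e.g., a blocked-channel
    optomechanical signal) to the motion artifact in the PPG, and subtracts it."""
    w = np.zeros(taps)
    x = np.zeros(taps)
    out = np.zeros(len(ppg))
    for n in range(len(ppg)):
        x = np.roll(x, 1)
        x[0] = noise_ref[n]
        est = w @ x                       # estimated motion artifact
        e = ppg[n] - est                  # error = cleaned PPG sample
        w += mu * e * x / (x @ x + eps)   # NLMS weight update
        out[n] = e
    return out

fs = 50.0
t = np.arange(0, 20, 1 / fs)
pulse = np.sin(2 * np.pi * 1.25 * t)          # blood flow component
motion = np.sin(2 * np.pi * 2.1 * t)          # noise reference (sine for brevity)
cleaned = nlms_cancel(pulse + 3.0 * motion, motion)

# After convergence the residual tracks the pulse far better than the raw signal
tail = slice(len(t) // 2, None)
raw_err = np.mean((3.0 * motion)[tail] ** 2)
err = np.mean((cleaned - pulse)[tail] ** 2)
print(err < 0.1 * raw_err)  # True
```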


It should be noted that “algorithm” and “circuit” are referred to herein. An algorithm refers to an instruction set, such as an instruction set with sequential steps and logic, that may be stored in memory, whereas a circuit refers to electronic components and/or traces that may implement such logic operations.


It should be noted that processes for managing hysteresis are implied herein. Namely, several embodiments herein for controlling sensors (and other wearable hardware) may involve a processor sending commands to a sensor element depending on the sensor readings. Thus, in some embodiments, a sensor reading (such as a reading from an optical detector or a sensing electrode) above X may result in a processor sending a command to electrically bias another sensor element (such as an optical emitter or a biasing electrode) above Y. Similarly, as soon as the sensor reading drops below X, the processor may send a command to bias the other sensor element below Y. However, in borderline situations this may cause unwanted toggling of the biasing command, as sensor readings may rapidly fluctuate above/below X, resulting in the biasing of the other sensor element rapidly switching above/below Y. As such, hysteresis management may be integrated within the algorithm(s) controlling the execution of a processor. For example, the processor may be configured by the algorithm to delay a biasing command by a period of time Z following a prior biasing command, thereby preventing or reducing the aforementioned toggling.
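By way of illustration only, the hold-off approach described above may be sketched as follows; the threshold X, the hold-off period Z, the class name, and the reduction of the bias command to a Boolean state are hypothetical simplifications, not part of the claimed embodiments:

```python
class BiasController:
    """Toggle a bias command from a sensor reading, with a hold-off
    period Z that suppresses rapid toggling near the threshold X."""

    def __init__(self, threshold_x, holdoff_z):
        self.threshold_x = threshold_x   # sensor-reading threshold (X)
        self.holdoff_z = holdoff_z       # minimum seconds between commands (Z)
        self.bias_high = False           # current bias state (above/below Y)
        self._last_command_t = None      # time of the prior biasing command

    def update(self, reading, t):
        """Return the bias state for a sensor reading taken at time t (s)."""
        desired = reading > self.threshold_x
        if desired != self.bias_high:
            # Only issue a new biasing command once the hold-off has elapsed.
            if (self._last_command_t is None
                    or t - self._last_command_t >= self.holdoff_z):
                self.bias_high = desired
                self._last_command_t = t
        return self.bias_high
```

With readings that fluctuate around the threshold, only the first crossing within each hold-off window issues a new command, so the bias state does not chatter.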


In the following figures, various monitoring devices will be illustrated and described for attachment to the ear or an appendage of the human body. However, it is to be understood that embodiments of the present invention are not limited to those worn by humans.


The ear is an ideal location for wearable health and environmental monitors. The ear is a relatively immobile platform that does not obstruct a person's movement or vision. Monitoring devices located at an ear have, for example, access to the inner-ear canal and tympanic membrane (for measuring core body temperature), muscle tissue (for monitoring muscle tension), the pinna, earlobe, and elsewhere (for monitoring blood gas levels), the region behind the ear (for measuring skin temperature and galvanic skin response), and the internal carotid artery (for measuring cardiopulmonary functioning), etc. Virtually all areas of the ear support enough blood perfusion to enable an assessment of blood flow and blood-flow-derived metrics (such as breathing rate, heart rate, blood pressure, RR-interval, and the like) via photoplethysmography (PPG). A particularly good location for PPG may be the region covering the anti-tragus and concha regions of the ear, as the region has high perfusion, is more resistant to motion artifacts, and is in a region away from the ear canal. The ear is also at or near the point of exposure to: environmental breathable toxicants of interest (volatile organic compounds, pollution, etc.); noise pollution experienced by the ear; and lighting conditions for the eye. Furthermore, as the ear canal is naturally designed for transmitting acoustical energy, the ear provides a good location for monitoring internal sounds, such as heartbeat, breathing rate, and mouth motion.


Optical coupling into the blood vessels of the ear may vary between individuals. As used herein, the term “coupling” refers to the interaction or communication between excitation energy (such as light) entering a region and the region itself. For example, one form of optical coupling may be the interaction between excitation light generated from within an optical sensor of a wearable device (such as a wristband, armband, legband, skin-worn patch, neckband, ring, earbud, other device positioned at or within an ear, a headband, or the like) and the blood vessels. In one embodiment, this interaction may involve excitation light entering the skin and scattering from a blood vessel such that the temporal change in intensity of scattered light is proportional to a temporal change in blood flow within the blood vessel. Another form of optical coupling may be the interaction between excitation light generated by an optical emitter within a wearable device and a light-guiding region of the wearable device. Thus, a wearable device with integrated light-guiding capabilities, wherein light can be guided to multiple and/or select regions along the wearable device, can help assure that each individual wearing the device will generate an optical signal related to blood flow through the blood vessels. Light coupled to a particular skin region may yield a photoplethysmographic signal for one person but not for another. Therefore, coupling light to multiple regions may help assure that at least one blood-vessel-rich region will be interrogated for each person wearing the device. Coupling light to multiple regions of the body may also be accomplished by diffusing light from a light source in the device.


According to some embodiments of the present invention, “smart” monitoring devices including, but not limited to, wrist-worn devices, armbands, earbuds, and the like, are provided that can modify biometric signal extraction algorithms based upon a recognition of an activity level of a subject.



FIGS. 1A-1B illustrate a monitoring apparatus 20 configured to be positioned within an ear of a subject, according to some embodiments of the present invention. The illustrated apparatus 20 includes an earpiece body or housing 22, a sensor module 24, a stabilizer 25, and a sound port 26. When positioned within the ear of a subject, the sensor module 24 has a region 24a configured to contact a selected area of the ear. The illustrated sensor region 24a is contoured (i.e., is “form-fitted”) to matingly engage a portion of the ear between the anti-tragus and acoustic meatus, and the stabilizer is configured to engage the anti-helix. However, monitoring devices in accordance with embodiments of the present invention can have sensor modules with one or more regions configured to engage various portions of the ear. Various types of devices configured to be worn at or near the ear may be utilized in conjunction with embodiments of the present invention.



FIGS. 2A-2B illustrate a monitoring apparatus 30 in the form of a sensor band 32 configured to be secured to an appendage (e.g., an arm, wrist, hand, finger, toe, leg, foot, neck, etc.) of a subject. The band 32 includes a sensor module 34 on or extending from the inside surface 32a of the band 32. The sensor module 34 is configured to detect and/or measure physiological information from the subject and includes a sensor region 34a that is contoured to contact the skin of a subject wearing the apparatus 30.


Embodiments of the present invention may be utilized in various devices and articles including, but not limited to, patches, clothing, etc. Embodiments of the present invention can be utilized wherever PPG and blood flow signals can be obtained and at any location on the body of a subject. Embodiments of the present invention are not limited to the illustrated monitoring devices 20, 30 of FIGS. 1A-1B and 2A-2B.


The sensor modules 24, 34 for the illustrated monitoring devices 20, 30 of FIGS. 1A-1B and 2A-2B are configured to detect and/or measure physiological information from a subject wearing the monitoring devices 20, 30. In some embodiments, the sensor modules 24, 34 may be configured to detect and/or measure one or more environmental conditions in a vicinity of the subject wearing the monitoring devices 20, 30.


A sensor module utilized in accordance with embodiments of the present invention may be an optical sensor module that includes at least one optical emitter and at least one optical detector. Exemplary optical emitters include, but are not limited to, light-emitting diodes (LEDs), laser diodes (LDs), compact incandescent bulbs, micro-plasma emitters, IR blackbody sources, and the like. In addition, a sensor module may include various types of sensors in addition to, or in place of, optical sensors. For example, a sensor module may include one or more inertial sensors (e.g., an accelerometer, piezoelectric sensor, vibration sensor, photoreflector sensor, etc.) for detecting changes in motion, one or more thermal sensors (e.g., a thermopile, thermistor, resistor, etc.) for measuring the temperature of a part of the body, one or more electrical sensors for measuring changes in electrical conduction, one or more skin humidity sensors, and/or one or more acoustical sensors. Sensor modules may include physiological sensors for monitoring vital signs, blood flow properties, body fluids, muscle motion, body motion, and the like. Sensor modules may include environmental sensors for monitoring ambient light, ambient temperature, ambient humidity, airborne pollutants, wind, and the like. It should be understood that a wide variety of sensors may be incorporated into a wearable sensor module.


Referring to FIG. 3, a monitoring device (e.g., monitoring devices 20, 30), according to embodiments of the present invention, is illustrated schematically. The illustrated monitoring device includes a sensor module 24, 34 having one or more physiological sensors configured to detect and/or measure physiological information from the subject, and a motion sensor 50 (e.g., an accelerometer) configured to detect and/or measure subject motion information. The monitoring device 20, 30 also includes at least one processor 40 that is coupled to the sensor(s) of the sensor module 24, 34 and to the motion sensor 50, and that is configured to receive and analyze signals produced by the sensor(s) and the motion sensor 50.


According to some embodiments of the present invention, the processor 40 is configured to process motion sensor signals to identify an activity characteristic (e.g., periodic or non-periodic activity) of the subject, select or modify a biometric signal extraction algorithm (implemented by computer-readable memory and/or circuit 60) in response to identification of the type of subject activity, and then process physiological sensor signals via the modified biometric signal extraction algorithm or circuit 60 to produce physiological information about the subject, such as subject heart rate, subject respiration rate, subject RRi (i.e., the time-difference between consecutive R-peaks in the ECG or PPG waveform), subject HRV, subject blood pressure, subject blood analyte levels, subject cardiovascular properties, and the like. Although many aspects of this invention are most related to PPG sensors, a variety of other physiological sensors may be used within the scope of this invention.


In some embodiments, the motion sensor 50 is an accelerometer and a filter is applied to each axis of the accelerometer data to remove the DC component therefrom. A spectral transform is performed on each axis of the DC-removed signal over a given time span. An accelerometer spectrum is then generated by taking the root of the sum of the squares of each axis of the accelerometer that is associated with that time span. Other techniques may also be used.
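A minimal sketch of this accelerometer-spectrum computation is shown below, assuming mean subtraction as the DC-removal filter and a fast Fourier transform as the spectral transform; the function name and parameters are illustrative, not part of the embodiments:

```python
import numpy as np

def accelerometer_spectrum(ax, ay, az, fs):
    """Combined accelerometer magnitude spectrum over a time window.

    Each axis has its DC component removed (here, simple mean
    subtraction), a spectral transform is taken per axis, and the
    per-axis magnitudes are combined as the root of the sum of the
    squares, per the description above.
    """
    spectra = []
    for axis in (ax, ay, az):
        axis = np.asarray(axis, dtype=float)
        axis = axis - axis.mean()                 # remove DC component
        spectra.append(np.abs(np.fft.rfft(axis)))  # per-axis magnitude
    combined = np.sqrt(sum(s ** 2 for s in spectra))
    freqs = np.fft.rfftfreq(len(np.asarray(ax)), d=1.0 / fs)
    return freqs, combined
```

For a window containing a single dominant motion frequency (e.g., a steady cadence), the peak of the combined spectrum falls at that frequency regardless of which axis carries it.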


The 1st, 2nd, and 3rd most intense frequencies and magnitudes in the accelerometer spectrum are identified. The relative proportion of energy within these peaks is compared to that of all other frequencies to determine if the criterion for periodicity is met. If so, the given segment of time is determined to represent periodic activity and an Automatic Mode Switching feature of the monitoring device is set to inform an algorithm 60 (e.g., a heart rate algorithm, etc.) of the periodic context. If not, the Automatic Mode Switching feature of the monitoring device is set to inform the HR algorithm of the non-periodic context.


Alternatively or in addition, a determination is made whether the ratios of the peak frequencies fit a pattern associated with periodicity (e.g., whether peak 2 is twice the frequency of peak 1, etc.). If so, the given segment of time is determined to represent periodic activity and an Automatic Mode Switching feature of the monitoring device is set to inform an algorithm 60 (e.g., a heart rate algorithm, etc.) of the periodic context. If not, the Automatic Mode Switching feature is set to inform the HR algorithm of the non-periodic context.
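The two periodicity criteria above (peak-energy concentration, and a harmonic relationship among peaks) may be sketched as follows; the energy-fraction and harmonic-tolerance thresholds are illustrative assumptions, not values taken from the embodiments:

```python
def is_periodic(freqs, magnitudes, energy_fraction=0.6, harmonic_tol=0.1):
    """Classify a motion spectrum as periodic or non-periodic.

    Criterion 1: the three most intense peaks hold most of the spectral
    energy. Criterion 2 (alternative): the peak frequencies fit a
    harmonic pattern (e.g., the 2nd peak at roughly twice the 1st).
    """
    energy = [m * m for m in magnitudes]
    ranked = sorted(range(len(energy)), key=lambda i: energy[i], reverse=True)
    top3 = ranked[:3]                               # 3 strongest bins
    if sum(energy[i] for i in top3) / sum(energy) >= energy_fraction:
        return True
    f1, f2, _ = sorted(freqs[i] for i in top3)      # ascending peak freqs
    return f1 > 0 and abs(f2 / f1 - 2.0) <= harmonic_tol
```

A strongly peaked spectrum (e.g., walking cadence plus harmonics) satisfies the energy criterion, while a flat, non-harmonic spectrum satisfies neither.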


Referring now to FIG. 4, a method of monitoring a subject via a monitoring device, such as monitoring devices 20, 30, according to some embodiments of the present invention, will be described. The monitoring device includes a physiological sensor configured to detect and/or measure physiological information from the subject and a motion sensor. The physiological sensor and the motion sensor are in communication with a processor that is configured to receive and analyze signals produced by the physiological sensor and motion sensor. The processor may be part of the monitoring device or may be a remotely located processor.


The subject is monitored for a change in physical activity level or type (Block 100). If a change is detected (Block 110), the processor 40 identifies an activity characteristic (e.g., determines if the activity is periodic or non-periodic) (Block 120). For example, via an algorithm stored in or implemented by memory or circuit 60, the processor determines if the subject is engaged in a non-periodic or lifestyle activity, such as reading a book, desk work, eating, etc., or a periodic activity, such as exercising, running, walking, etc. In response to determining the type of activity, the processing of the biometric signal extraction algorithm may be modified in one or more ways (Block 130). For example, in response to determining the type of activity, at least one parameter of the biometric signal extraction algorithm may be modified, and this modified biometric signal extraction algorithm may be selected for processing. Some specific examples of parameters that may be modified are described in US 2015/0366509 entitled “CADENCE DETECTION BASED ON INERTIAL HARMONICS” (published Dec. 24, 2015), US 2016/0029964 entitled “PHYSIOLOGICAL MONITORING DEVICES WITH ADJUSTABLE SIGNAL ANALYSIS AND INTERROGATION POWER AND MONITORING METHODS USING THE SAME” (published Feb. 4, 2016), and US 2015/0011898 entitled “PHYSIOLOGICAL METRIC ESTIMATION RISE AND FALL LIMITING” (published Jan. 8, 2015), the disclosures of which are incorporated by reference herein. For example, parameters affecting the rise or fall time of biometric tracking (such as the tracking of heart rate, breathing rate, RRi, etc.), parameters affecting the subtraction or redaction of unwanted frequencies in the PPG signal (i.e., which frequency bands to subtract or redact (set to zero), based, for example, on amplitude and/or power), and/or parameters affecting which types of filters to execute (highpass, lowpass, bandpass, active, subtractive or redactive filters, or the like) may be modified.
In response to determining the type of activity, the processor may thus select a biometric signal extraction algorithm (which may be all or part of an algorithm stored in or implemented by memory or circuit 60) that is best suited for processing sensor data during such activity (Block 130), i.e., best suited for removing motion or environmental noise characteristic of that particular activity.
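As one hedged illustration of a modifiable parameter, the rise and fall limiting mentioned above might be sketched as a mode-dependent slew-rate clamp on the tracked metric; the per-mode limits and function name below are hypothetical, not calibrated or claimed values:

```python
def limit_rate(previous_hr, candidate_hr, dt, mode):
    """Clamp a new heart-rate estimate to a mode-dependent slew rate.

    The per-mode rise/fall limits (BPM per second) are illustrative
    assumptions; in practice they would be tuned per activity type.
    """
    limits = {
        "periodic":     (3.0, 3.0),   # exercise: HR may change quickly
        "non_periodic": (1.0, 1.0),   # lifestyle: HR changes slowly
    }
    rise, fall = limits[mode]
    upper = previous_hr + rise * dt   # max allowed rise over dt seconds
    lower = previous_hr - fall * dt   # max allowed fall over dt seconds
    return min(max(candidate_hr, lower), upper)
```

A noise-driven jump in the candidate estimate is thus admitted quickly during periodic exercise but heavily damped during lifestyle activity.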


It should be noted that, in addition or as an alternative to modifying or selecting an algorithm, a circuit may be modified or selected. For example, the determination of periodicity in Block 120 may send a signal to one or more transistors (Block 130) to switch the analog filtering circuits to “notch out” the periodic signal; or, the determination of non-periodicity in Block 120 may send a signal to one or more transistors (Block 130) to switch the analog filtering circuits to a bandpass filter that passes the biometric signals of interest. It should also be noted that both algorithms and circuits may be simultaneously modified or selected in the embodiments described herein. Thus, as another exemplary embodiment, the determination of periodicity in Block 120, using a low-power time-domain algorithm (such as an autocorrelation function, or the like), may initiate an algorithm in Block 130 that determines which frequencies should be removed. The output of that algorithm (i.e., the output from a processor) may send a signal to a bank of analog filters in Block 130 to remove these frequencies without the need for digital removal. This may be advantageous because executing a spectral transform may be overly power-hungry in a wearable device. Physiological sensor signals are then processed via the selected biometric signal extraction algorithm to produce physiological information about the subject, such as subject heart rate, subject respiration rate, subject RRi, subject HRV, subject blood pressure, subject blood analyte levels, subject cardiovascular properties, and the like.
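A low-power time-domain periodicity check, such as the autocorrelation function mentioned above, might be sketched as follows; the decision threshold and function name are illustrative assumptions:

```python
import numpy as np

def autocorr_is_periodic(signal, min_lag=1, threshold=0.7):
    """Time-domain periodicity check via normalized autocorrelation.

    Returns True when the autocorrelation shows a strong peak at a
    nonzero lag, suggesting repetitive motion, without requiring a
    (comparatively power-hungry) spectral transform.
    """
    x = np.asarray(signal, dtype=float)
    x = x - x.mean()                                   # remove DC component
    ac = np.correlate(x, x, mode="full")[len(x) - 1:]  # non-negative lags
    ac = ac / ac[0]                                    # normalize by lag-0 energy
    return bool(ac[min_lag:].max() >= threshold)
```

A repetitive motion trace (e.g., a steady cadence) produces a near-unity autocorrelation at the cadence period, while uncorrelated motion stays well below the threshold.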


Identification of an activity characteristic (e.g., determining if the activity is periodic or non-periodic) may also be associated with particular metrics to be assessed. For example, detecting that periodic motion exists may be a criterion for determining that a cardiac efficiency metric is to be assessed. In contrast, if the subject's motion is aperiodic, then a cardiac efficiency score (Average Cadence/Average HR) may be meaningless. Also, one or more processors or functions may be deactivated or run in power-saving mode based on the detected type of motion (e.g., a pedometer may be deactivated responsive to detection of aperiodic motion). Additionally, an energy expenditure calculation that factors both HR and footsteps into an equation for estimating calories burned may be tuned to detect “fidgeting” (subtle motions) rather than footsteps, and the energy expenditure model may be updated to incorporate fidgets and HR, rather than footsteps and HR. The weighting factors for fidgets and HR, for example, may be different than the weighting factors for footsteps and HR when estimating energy expenditure.
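The re-weighting of HR and motion counts described above might be sketched as follows; all weighting coefficients (and the notion of a single linear score) are hypothetical simplifications that would require calibration against real energy-expenditure data:

```python
def energy_expenditure(heart_rate, motion_count, motion_type):
    """Estimate a relative energy-expenditure score from HR and motion.

    When motion is periodic, the motion count is interpreted as
    footsteps; when aperiodic, as fidgets, which receive a smaller
    weight. The coefficients below are illustrative assumptions.
    """
    if motion_type == "periodic":
        w_hr, w_motion = 0.6, 0.4   # footsteps carry substantial weight
    else:
        w_hr, w_motion = 0.9, 0.1   # fidgets contribute far less
    return w_hr * heart_rate + w_motion * motion_count
```

The same raw motion count thus contributes less to the estimate when it represents fidgeting than when it represents footsteps.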


As illustrated in FIG. 5, identifying an activity characteristic of a subject (Block 120) may include filtering a signal from each axis of the accelerometer over a given time period to remove a DC component therefrom (Block 122), producing an accelerometer spectrum using the filtered signals (Block 124), identifying one or more frequencies in the spectrum having a magnitude above a threshold level (Block 126), and/or determining if the one or more frequencies in the spectrum match a pattern associated with a known subject activity mode (Block 128). It should be noted that the operations of FIG. 5 may be controlled by algorithms, circuitry, or a combination of both.


Referring now to FIG. 6, a method of monitoring a subject via a monitoring device, such as monitoring devices 20, 30, according to some embodiments of the present invention, will be described. The monitoring device includes a PPG sensor and a motion sensor which are in communication with a processor that is configured to receive and analyze signals produced by the sensor module and motion sensor. The processor may be part of the monitoring device or may be a remotely located processor. It should be noted that the operations of FIG. 6 may be controlled by algorithms, circuitry, or a combination of both.


The method includes filtering a motion signal generated by the motion sensor to attenuate DC information in the motion signal and generate a DC-filtered motion signal (Block 200). A spectral representation of the DC-filtered motion signal is generated (Block 210) and a plurality of frequencies in the spectral representation are identified having a magnitude above a threshold level (Block 220). In other words, the most intense frequencies in the spectral representation are identified. Additional operations may include filtering a step rate component from a measured heart rate component based on a function of a difference therebetween as described in US 2015/0018636 entitled “REDUCTION OF PHYSIOLOGICAL METRIC ERROR DUE TO INERTIAL CADENCE” (published Jan. 15, 2015), comparing frequency measurements from an inertial sensor with one or more thresholds as described in US 2015/0366509 entitled “CADENCE DETECTION BASED ON INERTIAL HARMONICS” (published Dec. 24, 2015), and measuring signal quality to determine how a user should adjust a monitoring device to improve biometric fit, as described in US 2016/0094899 entitled “METHODS AND APPARATUS FOR IMPROVING SIGNAL QUALITY IN WEARABLE BIOMETRIC MONITORING DEVICES” (published Mar. 31, 2016), individually or together in combination, the disclosures of which are incorporated by reference herein. Information associated with the plurality of frequencies is then processed to determine a type of motion of the subject, for example, whether the subject motion is periodic or non-periodic motion (Block 230). Information on the subject motion type is sent to a biometric tracking algorithm or circuit or is otherwise used by the processor to select a biometric tracking algorithm or circuit (Block 240), and a PPG sensor signal is processed using the biometric tracking algorithm or circuit corresponding to the subject motion type information to generate biometric information about the subject (Block 250). 
Exemplary biometric information includes, but is not limited to, subject heart rate, subject respiration rate, subject RRi, subject HRV, subject blood pressure, subject blood analyte levels, subject cardiovascular properties, etc.


Referring now to FIG. 7, a method of monitoring a subject via a monitoring device, such as monitoring devices 20, 30, according to some embodiments of the present invention, will be described. The monitoring device includes a PPG sensor and a motion sensor which are in communication with a processor that is configured to receive and analyze signals produced by the sensor module and motion sensor. The processor may be part of the monitoring device or may be a remotely located processor. It should be noted that the operations of FIG. 7 may be controlled by algorithms, circuitry, or a combination of both.


The method includes processing PPG sensor signals via the at least one processor to identify a statistical relationship between RR-intervals in the PPG sensor signals (Block 300) and processing the statistical relationship between RR-intervals to determine a type of motion of the subject (e.g., periodic or non-periodic) or a physiological status of the subject (Block 310). A biometric signal extraction algorithm or circuit is modified or selected via the processor in response to an identified statistical relationship between RR-intervals (Block 320). PPG sensor signals are then processed via the selected biometric signal extraction algorithm to produce physiological information about the subject (Block 330), such as subject heart rate, subject respiration rate, subject RR-interval (RRi), subject HRV, subject blood pressure, subject blood analyte levels, subject cardiovascular properties, etc. A particular example of processing RR-intervals to determine a type of motion of the subject may include processing a series of RR-intervals to identify the existence of stress, which may be associated with aperiodic (non-periodic) motion.


It should be noted that it is known to those skilled in the art that a series of RR-intervals may be processed to generate an assessment of fatigue, stress (particularly psychosocial stress), autonomic nervous functioning, hemodynamics, cardiac disease (such as atrial fibrillation or cardiac arrhythmia), or the like. However, the inventors have discovered that the statistical analysis alone may not generate sufficient physiological information for a diagnosis of a health condition or other physiological state. Thus, in one particular embodiment of FIG. 7, the RR-intervals may be statistically processed (Block 310) to generate an assessment that a subject status of atrial fibrillation is likely. Once this status is known to be likely, the signal extraction algorithm may be modified or selected to more concretely determine a status of atrial fibrillation or to diagnose another condition (such as poor or abnormal hemodynamics or cardiac output, which may be associated with atrial fibrillation). As a specific example, Block 320 may switch to an algorithm that allows more frequencies to pass through during signal extraction, such that the extracted signal from Block 320 may be further processed by Block 330 to generate sufficient physiological information to diagnose atrial fibrillation. It should be understood that this describes a particular embodiment, and other embodiments that rely on FIG. 7 may be used with the present invention. As another specific example, Block 310 may determine that a subject is likely in a status of fatigue. Block 320 may then modify or select a signal extraction algorithm that increases the signal resolution (i.e., the sampling rate and/or dynamic range), such that the extracted signal may be processed by Block 330 to generate a more definitive assessment of fatigue.
In yet another example, Block 310 may determine that a subject is likely in a status of stress. Block 320 may then modify or select a signal extraction algorithm that switches from a screening spectral-based (frequency-domain-based) RRi calculator to a higher-resolution time-domain-based RRi calculator, and the extracted RRi signal may then be processed by Block 330 to generate a more definitive assessment of stress.
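For illustration, common time-domain RR-interval statistics that such a screening stage might compute are sketched below; the function and its output format are illustrative conveniences, not the claimed method of Blocks 300-330:

```python
import math

def rr_statistics(rr_intervals):
    """Time-domain statistics over a series of RR-intervals (in ms).

    RMSSD (root mean square of successive differences) and SDNN
    (standard deviation of the intervals) are standard HRV measures;
    highly irregular intervals may flag a status (e.g., possible atrial
    fibrillation or stress) that warrants switching to a higher-
    resolution signal extraction algorithm.
    """
    n = len(rr_intervals)
    mean_rr = sum(rr_intervals) / n
    diffs = [rr_intervals[i + 1] - rr_intervals[i] for i in range(n - 1)]
    rmssd = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    sdnn = math.sqrt(sum((r - mean_rr) ** 2 for r in rr_intervals) / n)
    return {"mean_rr": mean_rr, "rmssd": rmssd, "sdnn": sdnn}
```

A downstream rule might, for example, treat an unusually high RMSSD relative to SDNN as a cue to switch extraction algorithms for a more definitive assessment.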



FIG. 8 is a graph illustrating heart rate data 400, 402, 404 for a subject plotted over time compared to the actual heart rate (406), the heart rate determined by a validated benchmark device, for a range of activity mode settings for a monitoring device according to some embodiments of the present invention. For the illustrated duration, the subject was not walking/jogging/running (i.e., the subject was not engaged in a periodic activity), but was doing deskwork (i.e., the subject was engaged in a non-periodic activity), where arms and hands were moving as is typical for desk work. As seen, the heart rate in Mode 3 (the “lifestyle mode” indicated by 404) most closely agrees with that of the benchmark device heart rate (406), and the monitoring device correctly uses the output for Mode 3.



FIG. 9 is a graph illustrating heart rate data 400, 402, 404 for a subject plotted over time compared to the actual heart rate (406), the heart rate determined by a validated benchmark device, for a range of activity mode settings for a monitoring device according to some embodiments of the present invention. For the illustrated duration, the subject is jogging. As seen, the heart rate in Mode 1 (400) most closely agrees with that of the benchmark device heart rate (406), and the monitoring device correctly uses the output for Mode 1. Note that the heart rate reported using Lifestyle mode (404) does not match the benchmark (406) as well as Mode 1 output (400).



FIG. 10 is a spectrogram for exemplary combined accelerometer data showing the subject monitored in FIGS. 8 and 9 performing non-periodic activity from time (t)=0 to 150 seconds, and performing periodic activity (walking) after t=150 seconds. For accurate heart rate tracking, the monitoring device uses settings associated with Lifestyle Mode (Mode 3) prior to t=150 seconds and Periodic Mode after t=150 seconds.


Returning to FIG. 4, a method of monitoring a subject via a monitoring device, Block 130 may be modified in accordance with additional embodiments based on the detection of periodic or non-periodic motion. For example, in one embodiment shown as Block 144 in FIG. 11, if the processor 40 identifies an activity characteristic (e.g., determines if the activity is periodic or non-periodic) (Block 120) responsive to monitoring a subject (Block 100) and detecting a change in activity (Block 110), then a particular noise reference or set of noise references may be selected for improved (or optimal) motion noise reduction (or removal) from a physiological signal (Block 144). As a specific example, for a PPG sensor (such as one similar to that of FIG. 2 and/or FIG. 3) worn by a subject, the PPG sensor having both optical sensors and a plurality of motion sensors, it may be desirable to select a particular type of motion sensor (or a particular set of motion sensors) to remove periodic noise and to select a different type of motion sensor (or a set of different motion sensors) to remove aperiodic noise. For example, upon determining at Block 120 that the activity characteristic is periodic, in Block 144 an accelerometer may be selected as the noise reference. As with the descriptions of FIG. 4, this selection may be controlled by one or more algorithms on a microprocessor and/or by electronic circuitry. In contrast, upon detecting that the activity characteristic is aperiodic, an optomechanical sensor, such as a “blocked channel” sensor, may be selected as the noise reference.


One reason why an accelerometer may be a more suitable noise reference than an optomechanical sensor for periodic motion is that an accelerometer may have a better dynamic range or a stronger signal response at the intense motions associated with periodic motion (such as running, sprinting, cycling, or the like). In contrast, an optomechanical sensor may have better sensitivity and/or resolution at less intense motion levels (such as resting, walking, typing, grabbing, or the like) or may have a stronger signal response at the less intense motions associated with aperiodic motion. Moreover, for a PPG sensor, because the signal response of an optomechanical noise reference may more closely resemble or otherwise correspond to the unwanted motion-related optical noise in the PPG sensor signal, the PPG and optomechanical sensor signals are likely to be more linear with each other than would be the pairing of a PPG sensor and an inertial noise reference (such as an accelerometer). Thus, time-domain-based PPG motion noise removal methods or algorithms (such as adaptive filtering and the like), often well-suited for aperiodic motion, may be more effective given this high linearity. In contrast, frequency-domain-based PPG motion noise removal methods or algorithms (such as spectral subtraction and the like), often well-suited for periodic motion, may not be as sensitive to the linearity between the PPG sensor and the noise reference and may work more effectively with an inertial noise reference.
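A time-domain adaptive-filtering approach of the kind referenced above may be sketched with a normalized least-mean-squares (NLMS) canceller; the tap count, step size, and function name are illustrative assumptions, and the sketch presumes (as discussed above) that the noise reference is nearly linear with the motion noise in the PPG signal:

```python
import numpy as np

def nlms_cancel(ppg, noise_ref, n_taps=8, mu=0.1, eps=1e-6):
    """Normalized-LMS adaptive noise cancellation sketch.

    The filter learns to predict the motion-correlated component of
    the PPG signal from the noise reference; subtracting that
    prediction leaves a residual dominated by blood-flow information.
    """
    ppg = np.asarray(ppg, dtype=float)
    noise_ref = np.asarray(noise_ref, dtype=float)
    w = np.zeros(n_taps)                   # adaptive filter weights
    out = np.zeros(len(ppg))
    for i in range(len(ppg)):
        # Most recent n_taps reference samples, newest first (zero-padded).
        x = np.zeros(n_taps)
        n = min(n_taps, i + 1)
        x[:n] = noise_ref[i - n + 1:i + 1][::-1]
        y = w @ x                          # predicted motion noise
        e = ppg[i] - y                     # residual: cleaner PPG sample
        w += (mu / (eps + x @ x)) * e * x  # NLMS weight update
        out[i] = e
    return out
```

Because the blood-flow component is uncorrelated with the reference, the filter converges to cancel only the motion-correlated noise, leaving the physiological signal in the residual.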


Considering a pressure sensor as an example of a “motion sensor” in FIG. 4 and FIG. 11, in one embodiment a detection of aperiodic motion may result in an accelerometer being deselected as the noise reference and a pressure sensor being selected as a noise reference (or a combination of a pressure sensor and an accelerometer). This is because (as with an optomechanical sensor) a pressure sensor may be more sensitive to aperiodic motion than an inertial sensor but may also be less effective at sensing periodic motion. In some embodiments, Block 144 may not change the noise reference in response to Block 120 but rather may change the algorithm or circuit for noise reduction or removal using the same noise reference. As a specific example, the detection of periodic motion (Block 120) may trigger a spectral noise (frequency domain) reduction or removal algorithm using an accelerometer as a noise reference (Block 130), but the detection of aperiodic motion may trigger a time-domain noise reduction or removal algorithm using the same accelerometer as a noise reference. This may be beneficial because spectral noise reduction or removal algorithms may be more suited for periodic activities (especially where “cross-overs” between step rate and heart rate are observed) and time-domain noise reduction or removal algorithms may be more suited for aperiodic activities (where “cross-overs” are much less likely).
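The selection logic of this and the preceding paragraphs may be summarized in a short dispatch sketch; the mapping mirrors the embodiments above, while the function name and identifier strings are merely illustrative:

```python
def select_noise_strategy(motion_is_periodic):
    """Choose a noise reference and removal domain from the activity type.

    Per the embodiments above: an inertial reference with frequency-
    domain (spectral subtraction) removal for periodic motion, and an
    optomechanical (or pressure) reference with time-domain (adaptive
    filtering) removal for aperiodic motion.
    """
    if motion_is_periodic:
        return {"reference": "accelerometer",
                "domain": "frequency",
                "method": "spectral_subtraction"}
    return {"reference": "optomechanical",
            "domain": "time",
            "method": "adaptive_filter"}
```

As noted above, an alternative embodiment could keep the same reference (e.g., the accelerometer) in both branches and switch only the removal method between the frequency and time domains.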


Example embodiments are described herein with reference to block diagrams and flowchart illustrations. It is understood that a block of the block diagrams and flowchart illustrations, and combinations of blocks in the block diagrams and flowchart illustrations, can be implemented by computer program instructions that are performed by one or more computer circuits. These computer program instructions may be provided to a processor circuit of a general purpose computer circuit, special purpose computer circuit, and/or other programmable data processing circuit to produce a machine, such that the instructions, which execute via the processor of the computer and/or other programmable data processing apparatus, transform and control transistors, values stored in memory locations, and other hardware components within such circuitry to implement the functions/acts specified in the block diagrams and flowchart block or blocks, and thereby create means (functionality) and/or structure for implementing the functions/acts specified in the block diagrams and flowchart blocks.


These computer program instructions may also be stored in a tangible computer-readable medium that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable medium produce an article of manufacture including instructions which implement the functions/acts specified in the block diagrams and flowchart blocks.


A tangible, non-transitory computer-readable medium may include an electronic, magnetic, optical, electromagnetic, or semiconductor data storage system, apparatus, or device. More specific examples of the computer-readable medium would include the following: a portable computer diskette, a random access memory (RAM) circuit, a read-only memory (ROM) circuit, an erasable programmable read-only memory (EPROM or Flash memory) circuit, a portable compact disc read-only memory (CD-ROM), and a portable digital video disc read-only memory (DVD/Blu-Ray).


The computer program instructions may also be loaded onto a computer and/or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer and/or other programmable apparatus to produce a computer-implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions/acts specified in the block diagrams and flowchart blocks. Accordingly, embodiments of the present invention may be embodied in hardware and/or in software (including firmware, resident software, micro-code, etc.) that runs on a processor such as a digital signal processor, which may collectively be referred to as “circuitry,” “a module” or variants thereof.


It should also be noted that in some alternate implementations, the functions/acts noted in the blocks may occur out of the order noted in the flowcharts. For example, two blocks shown in succession may in fact be executed substantially concurrently or the blocks may sometimes be executed in the reverse order, depending upon the functionality/acts involved. Moreover, the functionality of a given block of the flowcharts and block diagrams may be separated into multiple blocks and/or the functionality of two or more blocks of the flowcharts and block diagrams may be at least partially integrated. Finally, other blocks may be added/inserted between the blocks that are illustrated. Moreover, although some of the diagrams include arrows on communication paths to show a primary direction of communication, it is to be understood that communication may occur in the opposite direction to the depicted arrows.


The foregoing is illustrative of the present invention and is not to be construed as limiting thereof. Although a few exemplary embodiments of this invention have been described, those skilled in the art will readily appreciate that many modifications are possible in the exemplary embodiments without materially departing from the teachings and advantages of this invention. Accordingly, all such modifications are intended to be included within the scope of this invention as defined in the claims. The invention is defined by the following claims, with equivalents of the claims to be included therein.

Claims
  • 1. A method of monitoring a subject via a monitoring device, wherein the monitoring device includes a physiological sensor and a plurality of motion sensors, and wherein the physiological sensor and the plurality of motion sensors are in communication with at least one processor, the method comprising: processing, via the at least one processor, a motion sensor signal to identify an activity characteristic of the subject; selecting one of the plurality of motion sensors as a noise reference via the at least one processor based on identification of the activity characteristic; and processing, via the at least one processor, a physiological sensor signal received from the physiological sensor using the noise reference to generate an output signal having reduced noise relative to the physiological sensor signal.
  • 2. The method of claim 1, wherein the plurality of motion sensors comprises first and second motion sensors, wherein the first and second motion sensors are different types of sensors, and wherein selecting the one of the plurality of motion sensors as the noise reference comprises: selecting, from among the first and second motion sensors, the first motion sensor as the noise reference based on identification of the activity characteristic as periodic motion; or selecting, from among the first and second motion sensors, the second motion sensor as the noise reference based on identification of the activity characteristic as non-periodic motion.
  • 3. The method of claim 2, wherein the physiological sensor comprises an optical sensor, and wherein the second motion sensor is configured to provide a signal response that more closely corresponds to motion-related optical noise present in the physiological sensor signal than the first motion sensor.
  • 4. The method of claim 2, wherein processing the physiological sensor signal using the noise reference comprises: processing the physiological sensor signal using a frequency-domain-based algorithm or circuit responsive to identification of the activity characteristic as periodic motion; or processing the physiological sensor signal using a time-domain-based algorithm or circuit responsive to identification of the activity characteristic as non-periodic motion.
  • 5. The method of claim 2, wherein the physiological sensor is a photoplethysmography (PPG) sensor, wherein the first motion sensor comprises an inertial sensor, and wherein the second motion sensor comprises an optomechanical sensor.
  • 6. The method of claim 5, wherein the optomechanical sensor is configured to modulate light generated by an optical emitter in response to body motion, and to direct the light upon a pathway which is substantially unaffected by blood flow.
  • 7. The method of claim 1, wherein processing the physiological sensor signal using the noise reference further comprises: in response to identification of the activity characteristic as periodic motion, processing the physiological sensor signal using a first algorithm or circuit; or in response to identification of the activity characteristic as non-periodic motion, processing the physiological sensor signal using a second algorithm or circuit that is different from the first algorithm or circuit.
  • 8. The method of claim 7, wherein the first algorithm or circuit comprises frequency-domain processing and/or filtering, and wherein the second algorithm or circuit comprises time-domain processing and/or filtering.
  • 9. The method of claim 8, wherein the first and second algorithms or circuits utilize a same sensor or sensors of the plurality of motion sensors.
  • 10. The method of claim 8, wherein the first algorithm or circuit is configured to perform spectral subtraction, and wherein the second algorithm or circuit is configured to perform adaptive filtering.
  • 11. The method of claim 1, wherein body motion information contained in the noise reference is greater than blood flow information contained therein.
  • 12. The method of claim 1, wherein the noise reference comprises information associated with motion of the subject and is substantially free of information associated with blood flow of the subject.
  • 13. The method of claim 1, wherein the physiological information includes one or more of the following: subject heart rate, subject respiration rate, subject RR-interval (RRi), subject heart rate variability (HRV), subject blood pressure, subject blood analyte levels, subject cardiovascular properties.
  • 14. The method of claim 1, wherein the activity characteristic comprises a first activity characteristic or a second activity characteristic that is different than the first activity characteristic, wherein the plurality of motion sensors comprises first and second motion sensors, wherein the first and second motion sensors are different types of sensors, and wherein selecting the one of the plurality of motion sensors as the noise reference comprises: selecting, from among the first and second motion sensors, the first motion sensor as the noise reference based on identification of the first activity characteristic; or selecting, from among the first and second motion sensors, the second motion sensor as the noise reference based on identification of the second activity characteristic.
  • 15. The method of claim 14, wherein the first motion sensor comprises an inertial sensor, and wherein the second motion sensor comprises an optomechanical sensor.
  • 16. A monitoring device configured to be attached to a bodily location of a subject, the monitoring device comprising: a physiological sensor configured to detect and/or measure physiological data from the subject and generate a physiological sensor signal representative thereof; a plurality of motion sensors, each configured to detect and/or measure subject motion data and generate a respective motion sensor signal representative thereof; and at least one processor coupled to the physiological sensor and the plurality of motion sensors, wherein the at least one processor is configured to perform operations comprising: processing the respective motion sensor signal received from one of the plurality of motion sensors to identify an activity characteristic of the subject; selecting one of the plurality of motion sensors as a noise reference based on identification of the activity characteristic; and processing the physiological sensor signal received from the physiological sensor using the noise reference to generate an output signal having reduced noise relative to the physiological sensor signal.
  • 17. The monitoring device of claim 16, wherein the plurality of motion sensors comprises first and second motion sensors, wherein the first and second motion sensors are different types of sensors, and wherein selecting the one of the plurality of motion sensors as the noise reference comprises: selecting, from among the first and second motion sensors, the first motion sensor as the noise reference based on identification of the activity characteristic as periodic motion; or selecting, from among the first and second motion sensors, the second motion sensor as the noise reference based on identification of the activity characteristic as non-periodic motion.
  • 18. The monitoring device of claim 17, wherein the physiological sensor comprises an optical sensor, and wherein a signal response of the second motion sensor more closely corresponds to motion-related optical noise present in the physiological sensor signal than a signal response of the first motion sensor.
  • 19. The monitoring device of claim 18, wherein processing the physiological sensor signal using the noise reference comprises: processing the physiological sensor signal using a frequency-domain-based algorithm or circuit responsive to identification of the activity characteristic as periodic motion; or processing the physiological sensor signal using a time-domain-based algorithm or circuit responsive to identification of the activity characteristic as non-periodic motion.
  • 20. The monitoring device of claim 19, wherein the physiological sensor is a photoplethysmography (PPG) sensor, wherein the first motion sensor comprises an inertial sensor, and wherein the second motion sensor comprises an optomechanical sensor.
  • 21. The monitoring device of claim 20, wherein the optomechanical sensor is configured to modulate light generated by an optical emitter in response to body motion, and to direct the light upon a pathway which is substantially unaffected by blood flow.
  • 22. A computer program product comprising a non-transitory computer readable storage medium having computer readable program code embodied therein, which, when executed by at least one processor, causes the at least one processor to perform operations comprising: processing a motion sensor signal to identify an activity characteristic of the subject; selecting one of a plurality of motion sensors as a noise reference based on identification of the activity characteristic; and processing a physiological sensor signal received from the physiological sensor using the noise reference to generate an output signal having reduced noise relative to the physiological sensor signal.
CLAIM OF PRIORITY

This application claims priority from U.S. Provisional Patent Application No. 62/475,746 entitled “PHYSIOLOGICAL MONITORING DEVICES AND METHODS FOR NOISE REDUCTION IN PHYSIOLOGICAL SIGNALS BASED ON SUBJECT ACTIVITY TYPE” filed Mar. 23, 2017, and also claims priority as a continuation-in-part from U.S. patent application Ser. No. 15/299,684 entitled “PHYSIOLOGICAL MONITORING DEVICES AND METHODS THAT IDENTIFY SUBJECT ACTIVITY TYPE,” filed Oct. 21, 2016, which claims priority from U.S. Provisional Patent Application No. 62/245,919 entitled “PHYSIOLOGICAL MONITORING DEVICES AND METHODS THAT IDENTIFY SUBJECT ACTIVITY TYPE,” filed Oct. 23, 2015, in the United States Patent and Trademark Office, the disclosures of which are incorporated by reference herein in their entireties.

2008-136556 Jun 2008 JP
2008-279061 Nov 2008 JP
2009-153664 Jul 2009 JP
2010-526646 Aug 2010 JP
2014-068733 Apr 2014 JP
20-0204510 Nov 2000 KR
0024064 Apr 2000 WO
0047108 Aug 2000 WO
0108552 Feb 2001 WO
0217782 Mar 2002 WO
2005010568 Feb 2005 WO
2005020121 Mar 2005 WO
2005036212 Apr 2005 WO
2005110238 Nov 2005 WO
2006009830 Jan 2006 WO
2006067690 Jun 2006 WO
2007012931 Feb 2007 WO
2007053146 May 2007 WO
2008141306 Nov 2008 WO
2011127063 Oct 2011 WO
2013019494 Feb 2013 WO
2013038296 Mar 2013 WO
2013109389 Jul 2013 WO
2013109390 Jul 2013 WO
2014092932 Jun 2014 WO
2014196119 Dec 2014 WO
2015068066 May 2015 WO
2015128226 Sep 2015 WO
2015131065 Sep 2015 WO
2017027551 Feb 2017 WO
Non-Patent Literature Citations (49)
Asada et al. “Mobile Monitoring with Wearable Photoplethysmographic Biosensors” IEEE Engineering in Medicine and Biology Magazine (pp. 28-40) (May/Jun. 2003).
Bifulco et al. "Bluetooth Portable Device for Continuous ECG and Patient Motion Monitoring During Daily Life" Medicon 2007 IFMBE Proceedings 16:369-372 (2007).
Brodersen et al. “In-Ear Acquisition of Vital Signs Discloses New Chances for Preventive Continuous Cardiovascular Monitoring” 4th International Workshop on Wearable and Implantable Body Sensor Networks 13:189-194 (2007).
Celka et al. "Motion Resistant Earphone Located Infrared based Heart Rate Measurement Device" Proceedings of the Second IASTED International Conference on Biomedical Engineering (pp. 582-585) (Feb. 16-18, 2004).
Comtois “Implementation of Accelerometer-Based Adaptive Noise Cancellation in a Wireless Wearable Pulse Oximeter Platform for Remote Physiological Monitoring and Triage” Thesis, Worcester Polytechnic Institute (149 pages) (Aug. 31, 2007).
Comtois et al. “A Comparative Evaluation of Adaptive Noise Cancellation Algorithms for Minimizing Motion Artifacts in a Forehead-Mounted Wearable Pulse Oximeter” Proceedings of the 29th Annual International Conference of the IEEE EMBS (pp. 1528-1531) (Aug. 23-26, 2007).
Comtois et al. “A Wearable Wireless Reflectance Pulse Oximeter for Remote Triage Applications” IEEE (pp. 53-54) (2006).
Duun et al. “A Novel Ring Shaped Photodiode for Reflectance Pulse Oximetry in Wireless Applications” IEEE Sensors 2007 Conference (pp. 596-599) (2007).
FiTrainer "The Only Trainer You Need" http://itami.com © 2008 FiTrainer™ (2 pages) (Downloaded Feb. 26, 2010).
Fleming et al. "A Comparison of Signal Processing Techniques for the Extraction of Breathing Rate from the Photoplethysmogram" World Academy of Science, Engineering and Technology 30:276-280 (Oct. 2007).
Geun et al. "Measurement Site and Applied Pressure Consideration in Wrist Photoplethysmography" The 23rd International Technical Conference on Circuits/Systems, Computers and Communications (pp. 1129-1132) (2008).
Gibbs et al. “Active motion artifact cancellation for wearable health monitoring sensors using collocated MEMS accelerometers” Proc. of SPIE Smart Structures and Materials, 2005: Sensors and Smart Structures Technologies for Civil, Mechanical, and Aerospace Systems 5765:811-819 (2005).
Gibbs et al. “Reducing Motion Artifact in Wearable Bio-Sensors Using MEMS Accelerometers for Active Noise Cancellation” 2005 American Control Conference 1581-1586 (Jun. 8-10, 2005).
Haahr et al. “A Wearable “Electronic Patch” for Wireless Continuous Monitoring of Chronically Diseased Patients” Proceedings of the 5th International Workshop on Wearable and Implantable Body Sensor Networks, in conjunction with The 5th International Summer School and Symposium on Medical Devices and Biosensors (pp. 66-70) (Jun. 1-3, 2008).
Han et al. “Artifacts in wearable photoplethysmographs during daily life motions and their reduction with least mean square based active noise cancellation method” Computers in Biology and Medicine 42:387-393 (Apr. 2012).
Han et al. “Development of a wearable health monitoring device with motion artifact reduced algorithm” International Conference on Control, Automation and Systems 2007 (ICCAS 2007) (pp. 1581-1584) (Oct. 17-20, 2007).
Jiang “Motion-Artifact Resistant Design of Photoplethysmograph Ring Sensor for Driver Monitoring” Thesis, Massachusetts Institute of Technology (62 pages) (Feb. 2004).
Kuzmina et al. “Compact multi-functional skin spectrometry set-up” Advanced Optical Materials, Technologies, and Devices, Proc. of SPIE 6596:65960T-1-65960T-6 (2007).
Lee et al. "A Mobile Care System With Alert Mechanism" IEEE Transactions on Information Technology in Biomedicine 11(5):507-517 (Sep. 2007).
Lee et al. “Respiratory Rate Detection Algorithms by Photoplethysmography Signal Processing” 30th Annual International IEEE EMBS Conference (pp. 1140-1143) (Aug. 20-24, 2008).
Lindberg et al. “Monitoring of respiratory and heart rates using a fibre-optic sensor” Med Biol Eng Comput 30 (5):533-537 (Sep. 1992).
Luprano “Sensors and Parameter Extraction by Wearable Systems: Present Situation and Future” pHealth 2008 (29 pages) (May 21, 2008).
Lygouras et al. “Optical-Fiber Finger Photo-Plethysmograph Using Digital Techniques” IEEE Sensors Journal 2 (1):20-25 (Feb. 2002).
Maguire et al. “The Design and Clinical Use of a Reflective Brachial Photoplethysmograph” Signals and Systems Research Group, National University of Ireland (13 pages) (Apr. 2002).
Mendelson et al. “Measurement Site and Photodetector Size Considerations in Optimizing Power Consumption of a Wearable Reflectance Pulse Oximeter” Proceedings of the 25th Annual International Conference of the IEEE EMBS (pp. 3016-3019) (Sep. 17-21, 2003).
Mendelson et al. “Noninvasive Pulse Oximetry Utilizing Skin Reflectance Photoplethysmography” IEEE Transactions on Biomedical Engineering 35(10):798-805 (Oct. 1988).
Nakajima et al. "Monitoring of heart and respiratory rates by photoplethysmography using a digital filtering technique" Med. Eng. Phys. 18(5):365-372 (Jul. 1996).
Notification Concerning Transmittal of Copy of International Preliminary Report on Patentability in corresponding PCT Application No. PCT/US2016/058098 (10 pages) (dated May 3, 2018).
Poh et al. “Motion Tolerant Magnetic Earring Sensor and Wireless Earpiece for Wearable Photoplethysmography” IEEE Transactions on Information Technology in Biomedicine 14(3):786-794 (May 2010).
Renevey et al. “Wrist-Located Pulse Detection Using IR Signals, Activity and Nonlinear Artifact Cancellation” IEEE EMBS (4 pages) (2001).
Rhee et al. “Artifact-Resistant Power-Efficient Design of Finger-Ring Plethysmographic Sensors” IEEE Transactions on Biomedical Engineering 48(7):795-805 (Jul. 2001).
Shaltis “Analysis and Validation of an Artifact Resistant Design for Oxygen Saturation Measurement Using Photo Plethysmographic Ring Sensors” Thesis, Massachusetts Institute of Technology (103 pages) (Jun. 2004).
Shaw et al. “Warfighter Physiological and Environmental Monitoring: A Study for the U.S. Army Research Institute in Environmental Medicine and the Soldier Systems Center” Massachusetts Institute of Technology Lincoln Laboratory (141 pages) (Nov. 1, 2004).
Shin et al. “A Novel Headset with a Transmissive PPG Sensor for Heart Rate Measurement” 13th International Conference on Biomedical Engineering (pp. 519-522) (2009).
Spigulis et al. “Wearable wireless photoplethysmography sensors” Proc. of SPIE 6991:69912O-1-69912O-7 (2008).
Takatani et al. “Optical Oximetry Sensors for Whole Blood and Tissue” IEEE Engineering in Medicine and Biology (pp. 347-357) (Jun./Jul. 1994).
Vogel et al. “A System for Assessing Motion Artifacts in the Signal of a Micro-Optic In-Ear Vital Signs Sensor” 30th Annual International IEEE EMBS Conference (Aug. 20-24, 2008).
Vogel et al. “In-Ear Heart Rate Monitoring Using a Micro-Optic Reflective Sensor” Proceedings of the 29th Annual International Conference of the IEEE EMBS Cite Internationale (pp. 1375-1378) (Aug. 23-26, 2007).
Wang et al. “Multichannel Reflective PPG Earpiece Sensor With Passive Motion Cancellation” IEEE Transactions on Biomedical Circuits and Systems 1(4):235-241 (Dec. 2007).
Wang et al. “Reflective Photoplethysmograph Earpiece Sensor for Ubiquitous Heart Rate Monitoring” 4th International Workshop on Wearable and Implantable Body Sensor Networks IFMBE Proceedings 13:179-183 (2007).
Webster, John G. "Design of Pulse Oximeters" Medical Science Series, Institute of Physics Publishing (143 pages) (Aug. 1997).
Wei et al. “A New Wristband Wearable Sensor Using Adaptive Reduction Filter to Reduce Motion Artifact” Proceedings of the 5th International Conference on Information Technology and Application in Biomedicine, in conjunction with The 2nd International Symposium & Summer School on Biomedical and Health Engineering (pp. 278-281) (May 30-31, 2008).
Wikipedia “Least mean squares filter” Retrieved at URL: https://en.wikipedia.org/wiki/Least_mean_squares_filter (6 pages) (Retrieved on Mar. 17, 2016).
Wood “Motion Artifact Reduction for Wearable Photoplethysmogram Sensors Using Micro Accelerometers and Laguerre Series Adaptive Filters” Thesis, Massachusetts Institute of Technology (74 pages) (Jun. 2008).
Wood et al. “Active Motion Artifact Reduction for Wearable Sensors Using Laguerre Expansion and Signal Separation” Proceedings of the 2005 IEEE Engineering in Medicine and Biology 27th Annual Conference (pp. 3571-3574) (Sep. 1-4, 2005).
Notification of Transmittal of the International Search Report and the Written Opinion of the International Searching Authority, or the Declaration, in corresponding PCT Application No. PCT/US2016/058098 (13 pages) (dated Jan. 10, 2017).
Fukushima et al. “Estimating Heart Rate using Wrist-type Photoplethysmography and Acceleration sensor while running” Conf Proc IEEE Eng Med Biol Soc. (pp. 2901-2904) (Sep. 2012).
Notification of Transmittal of the International Search Report and the Written Opinion of the International Searching Authority, or the Declaration, in corresponding PCT Application No. PCT/US2018/002621 (21 pages) (dated Jun. 18, 2018).
Extended European Search Report corresponding to European Application No. 18771241.9 (7 pages) (dated Nov. 4, 2020).
Related Publications (1)
Number Date Country
20180199837 A1 Jul 2018 US
Provisional Applications (2)
Number Date Country
62475746 Mar 2017 US
62245919 Oct 2015 US
Continuations (1)
Number Date Country
Parent 15299684 Oct 2016 US
Child 15922610 US