An Application Data Sheet is filed concurrently with this specification as part of the present application. Each application that the present application claims benefit of or priority to as identified in the concurrently filed Application Data Sheet is incorporated by reference herein in its entirety and for all purposes.
This disclosure relates to the field of biometric monitoring devices, and particularly to the identification of a user of a biometric monitoring device.
Consumer interest in personal health has led to a variety of personal health monitoring devices being offered on the market. Such devices, until recently, tended to be complicated to use and were typically designed for use with one activity, for example, bicycle trip computers.
Advances in sensors, electronics, and power source miniaturization have allowed personal health monitoring devices, also referred to herein as “biometric tracking,” “biometric monitoring,” or, in certain embodiments, simply “wearable” devices, to be offered in extremely small sizes that were previously impractical. The number of applications for these devices is increasing as processing power and component miniaturization improve.
Certain biometric monitoring devices may be configured to monitor biometrics from a plurality of users. A user may be required to select a corresponding user profile to ensure that the biometric measurements are associated with the correct user of the device. Thus, the user may be required to perform certain interactions with the biometric monitoring device in order to have the measured biometrics associated with the correct user profile.
The systems, methods and devices of this disclosure each have several innovative aspects, no single one of which is solely responsible for the desirable attributes disclosed herein.
In one aspect, there is provided a method of operating a biometric monitoring device, the device comprising a platform configured to receive at least one foot of a user, a plurality of sensors, a memory, and processing circuitry. The method may involve: detecting that the user is standing on the platform; measuring, based on body-weight data generated by a first one of the sensors, a weight of the user in response to detecting that the user is standing on the platform; and determining, based on sensor data generated by a second one of the sensors, a user parameter indicative of the user's identification. The method may also involve: comparing the user parameter to a plurality of parameter values respectively associated with a plurality of user profiles stored in the memory; and comparing the measured weight of the user to a plurality of weight values respectively associated with each of the user profiles. The method may further involve: identifying one of the user profiles as corresponding to the user based on the comparison of the user parameter to the parameter values and the comparison of the measured weight of the user to the weight values; and updating the identified user profile based on at least one of the measured weight and the user parameter.
In another aspect, there is provided a biometric monitoring device that includes: a platform configured to receive at least one foot of a user; a first sensor configured to generate body-weight data; a second sensor configured to generate sensor data; and a memory configured to store a plurality of user profiles, each user profile including a weight value and a parameter value. The biometric monitoring device may further include processing circuitry configured to: detect that the user is standing on the platform; measure, based on the body-weight data generated by the first sensor, a weight of the user in response to detecting that the user is standing on the platform; determine, based on the sensor data generated by the second sensor, a user parameter indicative of the user's identification; compare the user parameter to a plurality of parameter values respectively associated with a plurality of user profiles; compare the measured weight of the user to a plurality of weight values respectively associated with each of the user profiles; identify one of the user profiles as corresponding to the user based on the comparison of the user parameter to the parameter values and the comparison of the measured weight of the user to the weight values; and update the identified user profile based on at least one of the measured weight and the user parameter.
In yet another aspect, there is provided a method of operating a biometric monitoring device, the device comprising a platform configured to receive at least one foot of a user, a body-weight sensor, a bioelectrical impedance sensor, a memory, and processing circuitry. The method may involve: detecting that the user is standing on the platform; measuring, based on output from the body-weight sensor, a weight of the user in response to detecting that the user is standing on the platform; applying at least one electrical signal to the user's body; measuring, based on output from the bioelectrical impedance sensor, the bioelectrical impedance of the user's body based on the application of the at least one electrical signal; determining at least one bioelectrical impedance parameter associated with the user based on the measured bioelectrical impedance of the user's body according to a bioelectrical model of the user's body as a plurality of electrical elements; and identifying one of a plurality of user profiles as corresponding to the user based on the determined at least one bioelectrical impedance parameter.
One of the biometrics which may be measured and tracked by a biometric monitoring device is the weight of a user. Since changes to a user's weight outside of normal daily fluctuations occur at a relatively slow pace (e.g., on the order of days/weeks), a body-weight monitoring device (e.g., a scale) may be used by a user at relatively infrequent intervals (e.g., once a day). A user does not need to continually update their weight measurements over the course of the day (e.g., a user does not typically measure changes in weight over the course of a workout), and multiple users can measure their weight by sharing use of the same body-weight monitoring device.
When a plurality of users measure their respective body-weights using the same body-weight biometric monitoring device, it may be necessary to determine to which user a particular weight measurement corresponds in order to update the correct user's body-weight profile. One technique for distinguishing between users may involve receiving a selection and/or confirmation from the user to identify the correct user profile. However, it may be desirable to automatically identify the user in order to streamline the user's interaction with the body-weight biometric monitoring device (e.g., by simplifying and/or streamlining user identification and/or body weight measurement).
The difficulty in automatically distinguishing between two users may be related to the difference in weight between the two users. For example, when the difference in weight between two users is less than a predetermined value (e.g., 5 pounds), the normal fluctuations in weight of the users may result in overlapping body-weight measurements for the two users. Accordingly, it may not be possible to determine to which of the users a particular measurement belongs based solely on the body-weight measurement. Aspects of the present disclosure relate to techniques for identifying a user to which a particular body-weight measurement belongs by using additional information about the user.
Biometric Monitoring Device Overview
The memory 130 may store instructions for causing the processor 120 to perform certain actions. For example, the processor 120 may be configured to measure a body-weight of a user and automatically determine to which of a plurality of user profiles the body-weight measurement corresponds. In some embodiments, the biometric sensors 160 may include one or more of a body-weight sensor, a bioelectrical impedance (also referred to as bio-impedance) sensor, an optical sensor (e.g., a photoplethysmographic (PPG) sensor), an accelerometer, a footprint scanner, and/or other biometric sensor(s). Further information regarding such biometric sensors is described in more detail below (e.g., in connection with
The biometric monitoring device 100 may collect one or more types of physiological and/or environmental data from the one or more biometric sensor(s) 160, the one or more environmental sensor(s) 150, and/or external devices and communicate or relay such information to other devices (e.g., the client device 170 and/or the server 175), thus permitting the collected data to be viewed, for example, using a web browser or network-based application. For example, while the user stands on the biometric monitoring device 100, the biometric monitoring device 100 may perform biometric monitoring via measuring the user's body-weight using the one or more biometric sensor(s) 160. The biometric monitoring device 100 may transmit data representative of the user's body-weight to an account on a web service (e.g., www.fitbit.com), computer, mobile phone, and/or health station where the data may be stored, processed, and/or visualized by the user. The biometric monitoring device 100 may measure or calculate other physiological metric(s) in addition to, or in place of, the user's body-weight. Such physiological metric(s) may include, but are not limited to: body-fat; bio-impedance; heart rate; heartbeat waveform; heart rate variability; heart rate recovery; blood pressure; blood glucose; skin conduction; skin and/or body temperature; muscle state measured via electromyography; brain activity as measured by electroencephalography; weight; caloric intake; nutritional intake from food; medication intake; pH levels; hydration levels; respiration rate; and/or other physiological metrics.
The biometric monitoring device 100 may also measure or calculate metrics related to the environment around the user (e.g., with the one or more environmental sensor(s) 150), such as, for example, barometric pressure, weather conditions (e.g., temperature, humidity, pollen count, air quality, rain/snow conditions, wind speed), light exposure (e.g., ambient light, ultra-violet (UV) light exposure, time and/or duration spent in darkness), noise exposure, radiation exposure, and/or magnetic field. Furthermore, the biometric monitoring device 100 (and/or the client device 170 and/or the server 175) may collect data from the biometric sensor(s) 160 and/or the environmental sensor(s) 150, and may calculate metrics derived from such data. For example, the biometric monitoring device 100 (and/or the client device 170 and/or the server 175) may calculate the user's stress or relaxation levels based on a combination of heart rate variability, skin conduction, noise pollution, and/or sleep quality. In another example, the biometric monitoring device 100 (and/or the client device 170 and/or the server 175) may determine the efficacy of a medical intervention, for example, medication, based on a combination of data relating to medication intake, sleep, and/or activity. In yet another example, the biometric monitoring device 100 (and/or the client device 170 and/or the server 175) may determine the efficacy of an allergy medication based on a combination of data relating to pollen levels, medication intake, sleep and/or activity. These examples are provided for illustration only and are not intended to be limiting or exhaustive.
Other examples of the biometric sensor(s) 160 include sensors to detect, measure and/or sense data which is representative of heart rate, respiratory rate, hydration, height, sun exposure, blood pressure and/or arterial stiffness. In addition thereto, or in lieu thereof, the biometric monitoring device 100 may detect, measure and/or sense (via appropriate sensors) other physiologic data. All such physiologic data or parameters, whether now known or later developed, are intended to fall within the scope of this disclosure. Similarly, although the biometric sensor(s) 160 may be depicted as independent, they may be collaborative and perform multiple types of measurements. For example, a body-weight sensor 166 in combination with bio-impedance sensor 162 may be used to measure body fat, hydration, and/or fat-free mass.
In related aspects, the processor 120 and other component(s) of the biometric monitoring device 100 (e.g., shown in
In further related aspects, the processor 120 and other component(s) of the biometric monitoring device 100 may be implemented as a SoC that may include one or more CPU cores that use one or more reduced instruction set computing (RISC) instruction sets, a WWAN radio circuit, a WLAN radio circuit, and/or other software and hardware to support the biometric monitoring device 100.
As described in more detail below, the biometric monitoring device 100 may determine a parameter which is indicative of the identity of the user based on sensor data generated by a sensor other than the body-weight sensor(s) 166. The parameter may be a physiological parameter or a non-physiological parameter. The sensor data may be generated by and/or received from at least one of the biometric monitoring device 100, the wearable device 170, the mobile device 170, and the Wi-Fi router 185. For example, in one implementation the biometric monitoring device 100 may receive the parameter indicative of the identity of the user from at least one of the wearable device 170, the mobile device 170, and the Wi-Fi router 185. Additionally, the biometric monitoring device 100 may log each of the body-weight measurements into a corresponding user account stored on a server 175 via the Wi-Fi router 185.
In the illustrated embodiment, the biometric monitoring device 100 may also include a pair of foot pads 190 and a user interface 110, which may be positioned in the illustrated regions of the platform 195. The platform 195 may be configured to receive at least one foot of a user; for example, the platform 195 may be configured to receive the feet of the user at the respective foot pads 190. However, the described technology is not so limited and the foot pad(s) and/or user interface 110 may be located in other positions. For example, the biometric monitoring device 100 may use a user interface of a client device 170 (e.g., a mobile phone) as an interface for communication with the user.
The user interface 110 of the biometric monitoring device 100 may provide or facilitate exchange of physiologic information and, in certain embodiments, other information or data. For example, the biometric monitoring device 100 may include one or more displays to present physiologic information including, for example, current information, historical information, and/or comparative information (for example, current information in view of historical information). The historical information or data may include, for example, historical body-weight and/or body-fat data measured by the biometric monitoring device 100 (which may be stored internally to and/or externally from biometric monitoring device 100), historical user activity data, food consumption data, and/or sleep data (which may be measured or monitored by other personal and/or portable devices (for example, the wearable device 302 illustrated in
In addition to or in lieu of one or more displays, user interface 110 may include one or more speakers to provide the user with such physiologic data or information (whether current, historical or both) in aural form.
The user interface 110 may also include an input mechanism to facilitate input of, for example, user data, commands and/or selections. In one embodiment, the user may communicate with the biometric monitoring device 100 via a user interface including, for example, a touch pad, touch screen, buttons, switches and/or knobs. In another embodiment, the user interface 110 of the biometric monitoring device 100 includes one or more audio sensors to detect speech or sounds. In this way, for example, the user may input data, commands and/or selections to the biometric monitoring device 100. For example, in response to speech or sounds from the user, the biometric monitoring device 100 may determine or identify the appropriate user (for example, from a list stored therein), which facilitates correlation of physiologic data (acquired by the one or more physiological sensors) with a particular user.
Certain biometric sensor(s) 160 may be located directly beneath the foot pads 190. For example, certain biometric sensor(s) 160 may generate a signal that is supplied to the skin of the user and measure a response from the user in order to measure a corresponding biometric. As an example, a PPG sensor may generate an optical signal which is applied to the user's feet and may measure the light reflected from the user. This reflected signal may be used by the PPG sensor to measure one or more of the user's heart rate, heartbeat waveform, heart rate variability, heart rate recovery, etc. Similarly, a bio-impedance sensor may generate an electrical signal which may be applied to the user through the user's feet or another part of the user's body connected to an electrode. The bio-impedance sensor may measure a response to the applied electrical signal and model certain parameters of the user's body based on the measured signal. Further examples of the bio-impedance sensor are discussed in detail below.
Foot Pads for Biometric Monitoring Device
In one embodiment, in operation, the processor 120 calculates or determines the user's weight based on or using data from the body-weight sensor 166 incorporated and/or embedded in the foot pads 190. In addition, the processor 120 may employ the data from the bio-impedance sensor(s) 162 to calculate or determine a user's body fat composition and/or body mass index. The bio-impedance sensor(s) 162 may comprise the BIA electrodes 205, from which a small current may be applied to the user's body and the characteristics of the return current measured in the electrodes may be representative of the body fat composition of the user. The processor 120, based on data acquired or detected by the BIA electrodes 205 and user information (e.g., height, age, and gender), may calculate or determine a user's body fat composition and/or body mass index.
With continued reference to
Wearable Device Overview
In certain embodiments, the client device 170 may comprise a wearable device 302 which may have a shape and/or size adapted for coupling to (e.g., secured to, worn, borne by, etc.) the body or clothing of a user.
The wearable device 302 may collect one or more types of physiological and/or environmental data from one or more biometric sensor(s), one or more environmental sensor(s), and/or external devices and communicate or relay such information to other devices (e.g., the client device 170, the server 175, and/or the biometric monitoring device 100), thus permitting the collected data to be viewed, for example, using a web browser or network-based application. For example, while being worn by the user, the wearable device 302 may perform biometric monitoring via calculating and storing the user's step count using one or more biometric sensor(s) included therein. The wearable device 302 may transmit data representative of the user's step count to an account on a web service (e.g., www.fitbit.com), computer, mobile phone, and/or health station where the data may be stored, processed, and/or visualized by the user. The wearable device 302 may measure or calculate other physiological metric(s) in addition to, or in place of, the user's step count. Such physiological metric(s) may include, but are not limited to: energy expenditure, e.g., calorie burn; floors climbed and/or descended; heart rate; heartbeat waveform; heart rate variability; heart rate recovery; location and/or heading (e.g., via a GPS, global navigation satellite system (GLONASS), or a similar system); elevation; ambulatory speed and/or distance traveled; swimming lap count; swimming stroke type and count detected; bicycle distance and/or speed; blood pressure; blood glucose; skin conduction; skin and/or body temperature; muscle state measured via electromyography; brain activity as measured by electroencephalography; weight; body fat; caloric intake; nutritional intake from food; medication intake; sleep periods (e.g., clock time, sleep phases, sleep quality and/or duration); pH levels; hydration levels; respiration rate; and/or other physiological metrics.
The wearable device 302 may also measure or calculate metrics related to the environment around the user (e.g., with one or more environmental sensor(s) included therein), such as, for example, barometric pressure, weather conditions (e.g., temperature, humidity, pollen count, air quality, rain/snow conditions, wind speed), light exposure (e.g., ambient light, ultra-violet (UV) light exposure, time and/or duration spent in darkness), noise exposure, radiation exposure, and/or magnetic field. Furthermore, the wearable device 302 (and/or the client device 170, the server 175 and/or the biometric monitoring device 100) may collect data from biometric sensor(s) and/or the environmental sensor(s), and may calculate metrics derived from such data. For example, the wearable device 302 (and/or the client device 170, the server 175, and/or the biometric monitoring device 100) may calculate the user's stress or relaxation levels based on a combination of heart rate variability, skin conduction, noise pollution, and/or sleep quality. In another example, the wearable device 302 (and/or the client device 170, the server 175, and/or the biometric monitoring device 100) may determine the efficacy of a medical intervention, for example, medication, based on a combination of data relating to medication intake, sleep, and/or activity. In yet another example, the wearable device 302 (and/or the client device 170, the server 175, and/or the biometric monitoring device 100) may determine the efficacy of an allergy medication based on a combination of data relating to pollen levels, medication intake, sleep and/or activity. These examples are provided for illustration only and are not intended to be limiting or exhaustive.
User Identification
Certain aspects of this disclosure relate to the identification of a user currently using the biometric monitoring device 100. As discussed above, since body-weight biometric monitoring devices 100 are not typically used continually over extended periods of time (e.g., for longer than 1 minute at a time), but rather are typically used intermittently for shorter periods of time (e.g., less than 1 minute), a single body-weight biometric monitoring device 100 may be used by a plurality of users. Aspects of this disclosure relate to techniques for automatically determining to which user profile a given measurement taken by the biometric monitoring device 100 corresponds.
While the embodiments described below generally relate to a body-weight biometric monitoring device 100, the techniques may also be extended to other biometric monitoring devices, such as the client device 170, which are shared between users. For example, two or more users may share a wearable device, such as the wearable device 302, in order to track certain exercise metrics. The use of sensor data generated by a secondary sensor in order to identify the current user of the biometric monitoring device 100 as described below may also be applied to the wearable device 302.
The method 400 starts at block 401. At decision block 405, the processor 120 detects whether a user is standing on a platform 195 of the biometric monitoring device 100. In response to detecting that the user is not standing on the platform 195, the method 400 returns to the start of block 405. In response to detecting that the user is standing on the platform 195, the method 400 proceeds to block 410, at which the processor 120 measures the body-weight of the user using a body-weight sensor 166. At optional decision block 415, the processor 120 determines whether the measured body-weight matches two or more user profiles. For example, the processor 120 may determine whether the measured body-weight is within a defined range of two or more weight values associated with two or more corresponding user profiles stored in the memory 130. In some embodiments, memory 130 may correspond to volatile or non-volatile storage. In some embodiments, the user profiles stored in the memory 130 may be transmitted to the body-weight biometric monitoring device 100 from another device (e.g., a server) before, during, or after the body-weight of the user is measured at block 410. The weight values may be an average, mean, and/or other mathematical combination of the previously measured body-weight values for the corresponding user profiles. For example, the processor 120 may determine an expected body-weight value for each user profile based on a trend in the historical measured body-weight values for the given user.
In other embodiments, the processor 120 may determine whether the measured body-weight matches two or more user profiles by determining whether the measured body-weight is uniquely consistent with only one of the user profiles stored in the memory 130. For example, if the measured body-weight is not consistent with an expected amount of weight fluctuation since a previous body-weight measurement for a given user profile, the processor 120 may determine that the body-weight measurement is not consistent with the given user profile.
In yet other embodiments, the method 400 may proceed directly from block 410 to block 420. That is, the method 400 may include identifying a user profile corresponding to the user based on the measured body-weight and sensor data from a second sensor. The identification of a user profile will be described in greater detail below.
In response to the processor 120 determining that the measured body-weight matches two or more user profiles (yes from block 415) or in response to measuring the body-weight of the user (block 410), the method continues at block 420. In response to the processor 120 determining that the measured body-weight does not match two or more user profiles (no from block 415), the method 400 ends at block 435. At block 420, the processor 120 receives sensor data from a second sensor. For example, the processor 120 may receive sensor data from one or more of the optical sensor(s) 168, the bio-impedance sensor(s) 162, the environmental sensor(s) 150, and/or the other biometric sensor(s) 164.
At block 425, the processor 120 determines one or more user parameter(s) indicative of the user's identity. The determining of the user parameter(s) may be based on a model of the user generated from the received sensor data. For example, the processor 120 may determine a user parameter indicative of the user's identity based on the received sensor data by fitting the received sensor data to an electrical circuit model. Examples of data from which the user parameter may be determined include: bio-impedance data, footprint data, a unique client device 170 identifier, body-weight distribution data, a time of day of the weight measurement, a material worn on the user's feet during the weight measurement, a shape of the user's feet, etc.
At block 430, the processor 120 identifies one of the user profiles as corresponding to the user based on the parameter(s) and the measured weight. For example, the processor 120 may classify the measured body-weight as corresponding to the identified user profile based on the user parameter(s) and the measured body-weight matching the historical values stored for the identified user profile. As described in detail below, this matching may involve determining a likelihood that the measurement corresponds to each of the user profiles and selecting the user profile having the highest likelihood and/or confidence as the identified user profile. The method 400 ends at block 435. It is noted that the embodiments of the present disclosure are not limited to or by the example shown in
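By way of illustration only, the flow of blocks 405 through 435 may be sketched as follows. The data structures, the five-pound tolerance, and the nearest-mean rule used to resolve an ambiguous weight match are illustrative assumptions rather than requirements of the method 400.

```python
# Illustrative sketch of blocks 410-435; the tolerance and the nearest-mean
# tie-breaking rule are assumptions, not part of the specification.
from dataclasses import dataclass, field
from typing import List, Optional

WEIGHT_TOLERANCE_LBS = 5.0  # assumed range for an "ambiguous" weight match (block 415)

@dataclass
class UserProfile:
    name: str
    weight_history: List[float] = field(default_factory=list)     # pounds
    parameter_history: List[float] = field(default_factory=list)  # e.g., bio-impedance values

def _mean(values: List[float]) -> float:
    return sum(values) / len(values)

def candidate_profiles(measured_weight: float, profiles: List[UserProfile]) -> List[UserProfile]:
    """Block 415: profiles whose average stored weight is near the new measurement."""
    return [p for p in profiles
            if p.weight_history
            and abs(_mean(p.weight_history) - measured_weight) <= WEIGHT_TOLERANCE_LBS]

def identify_and_update(measured_weight: float, user_parameter: float,
                        profiles: List[UserProfile]) -> Optional[UserProfile]:
    """Blocks 410-435: resolve an ambiguous weight match with a secondary parameter."""
    candidates = candidate_profiles(measured_weight, profiles)
    if not candidates:
        return None                          # no plausible profile; method ends (block 435)
    if len(candidates) == 1:
        best = candidates[0]                 # weight alone is unambiguous
    else:
        # Block 430: choose the candidate whose stored parameter history is
        # closest to the newly determined user parameter (block 425).
        def parameter_distance(p: UserProfile) -> float:
            if not p.parameter_history:
                return float("inf")
            return abs(_mean(p.parameter_history) - user_parameter)
        best = min(candidates, key=parameter_distance)
    best.weight_history.append(measured_weight)       # block 435: update the profile
    best.parameter_history.append(user_parameter)
    return best
```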
Bio-Impedance
In the embodiment illustrated in
The bio-impedance sensor(s) 162 may include, for example, at least two electrodes, and in certain embodiments, may include at least four electrodes. In one embodiment, a first electrode (a source electrode) and a second electrode (a sink electrode) together apply an electrical signal to the user's body and a third electrode (a reference electrode) and a fourth electrode (a measurement point) together measure a response to the applied electrical signal. The electrodes may be positioned at different locations on the user's body such that the electrical signal flows through at least a portion of the user's body. In one implementation, the first and second electrodes may be located to contact a first foot of the user and the third and fourth electrodes may be located to contact a second foot of the user. In this implementation, the electrical signal may flow through the user's body between the feet of the user. In another implementation, the electrodes may be configured to apply the electrical signal to one portion of a user's foot and receive the electrical signal from another portion of the user's foot. For example, the first and second electrodes may be configured to contact the front portion of the user's foot (e.g., near the user's toes) while the third and fourth electrodes may be configured to contact a back portion of the user's foot (e.g., near the user's heel). In certain embodiments, four electrodes may be provided for each of the user's feet.
In other implementations, at least two of the electrodes may be configured to contact another portion of the user's body. For example, in certain implementations, the biometric monitoring device 100 may include a handle (not illustrated) which the user may grab with one or more of his/her hands. The first and second electrodes may be provided on the handle such that the electrical signal may be applied to the user and/or received from the user via the handle. Accordingly, the electrical signal may be applied to and/or received from one or more hands of the user.
At block 510, the processor 120 receives bio-impedance data from one or more bio-impedance sensor(s) 162. The processor 120 fits the received bio-impedance data into a model of the human body to determine one or more parameters. The processor 120 may model the human body according to the model circuit 700 illustrated in
Based on one or more impedance measurements using the bio-impedance sensor(s) 162, the processor 120 may determine the values Re, Ri, and C corresponding to the current user of the biometric monitoring device. When a plurality of impedance measurements are taken, at least three of the applied electrical signals may have different frequencies. One implementation for determining the values Re, Ri, and C from a number of impedance measurements is shown below.
The magnitude of a given impedance measurement is given by:
Based on equation (2), an observable value may be defined as:
Here, Cnom is a scaling factor which affects the stability of the equation and may be selected to aid in solving equation (3). By multiplying each side of equation (3) by (c+x), the equation can be altered to the form:
yx = a + bx − cy  (4)
Each of a plurality of measurements may be entered into equation (4) to generate a system of equations, which can be generalized by the following equation:
From equation (5), the values represented by the vector [a b c]^T may be solved for using, for example, a least squares approach, from which each of the values Re, Ri, and C may be recovered.
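As one possible realization of that least squares approach, the sketch below builds the linear system row by row from equation (4) and recovers Re, Ri, and C from the solved coefficients. Because the model circuit 700 and equations (1) through (3) are not reproduced above, the sketch assumes the body is modeled as Re in parallel with Ri in series with C, that the observable y is the squared impedance magnitude, and that x is the squared angular frequency scaled by Cnom; those assumptions should be checked against the referenced figure and equations.

```python
# Sketch of the linearized least-squares fit suggested by equations (4) and (5).
# Assumptions (the model circuit and equations (1)-(3) are not reproduced here):
# the body is modeled as Re in parallel with (Ri in series with C), the observable
# is y = |Z|^2, and x = (omega * Cnom)^2 with Cnom chosen for numerical stability.
import numpy as np

def fit_impedance_parameters(frequencies_hz, impedance_magnitudes, c_nom=1e-6):
    """Recover (Re, Ri, C) from |Z| measured at three or more frequencies."""
    omega = 2.0 * np.pi * np.asarray(frequencies_hz, dtype=float)
    y = np.asarray(impedance_magnitudes, dtype=float) ** 2   # assumed observable
    x = (omega * c_nom) ** 2                                  # scaled frequency variable

    # Each measurement contributes one row of y*x = a + b*x - c*y (equation (4)).
    A = np.column_stack([np.ones_like(x), x, -y])
    coeffs, *_ = np.linalg.lstsq(A, y * x, rcond=None)
    a, b, c = coeffs

    re = np.sqrt(a / c)                                       # extracellular resistance
    ri = np.sqrt(b) * re / (re - np.sqrt(b))                  # intracellular resistance
    cap = c_nom / (np.sqrt(c) * (re + ri))                    # membrane capacitance
    return re, ri, cap

# Self-check with synthetic data from the assumed circuit model.
def _magnitude(f_hz, re, ri, cap):
    w = 2.0 * np.pi * np.asarray(f_hz, dtype=float)
    z = re * (1 + 1j * w * cap * ri) / (1 + 1j * w * cap * (re + ri))
    return np.abs(z)

freqs = [5e3, 50e3, 100e3, 250e3]                             # Hz
mags = _magnitude(freqs, re=600.0, ri=300.0, cap=3e-9)
print(fit_impedance_parameters(freqs, mags))                  # approximately (600, 300, 3e-9)
```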
Depending on the implementation, the bio-impedance sensor(s) 162 may be configured to apply one or more electrical signals, which may have differing frequencies, to the user's feet. For example, since the user's body may be modeled as shown in
In certain implementations, the bio-impedance sensor(s) 162 may apply a “white-noise” signal to the user and measure the electrical signal response. As used herein, a white-noise signal may refer to an electrical signal having a substantially uniform frequency distribution over a defined range of frequencies. In certain implementations employing a white-noise signal, rather than calculating values Re, Ri, and C according to the model of
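Although the remainder of that passage is not reproduced above, one common way to use a white-noise excitation is to estimate the body's frequency response directly from the excitation and the measured response. The sketch below uses a standard cross-spectral (H1) estimate; the sample rate, the one-pole filter standing in for the body, and the gain are illustrative assumptions.

```python
# Sketch of deriving a frequency-response "signature" from a white-noise excitation,
# using the standard H1 estimator (cross-spectrum over input auto-spectrum).
# The estimator choice, sample rate, and synthetic body model are assumptions.
import numpy as np
from scipy import signal

def frequency_response_signature(excitation, response, fs, nperseg=1024):
    """Return (frequencies, |H(f)|) estimated from time-domain records."""
    f, p_xx = signal.welch(excitation, fs=fs, nperseg=nperseg)          # input auto-spectrum
    _, p_xy = signal.csd(excitation, response, fs=fs, nperseg=nperseg)  # cross-spectrum
    return f, np.abs(p_xy / p_xx)                                       # H1 magnitude

# Illustrative use with synthetic data: white-noise drive through a one-pole
# stand-in for the body's low-pass behavior.
fs = 1_000_000                                    # 1 MHz sample rate (assumed)
rng = np.random.default_rng(0)
drive = rng.standard_normal(200_000)              # white-noise excitation
b, a = signal.butter(1, 100e3, fs=fs)             # corner near 100 kHz (assumed)
sensed = 600.0 * signal.lfilter(b, a, drive)      # measured response, arbitrary gain
freqs, magnitude = frequency_response_signature(drive, sensed, fs)
```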
The user classification step 520 of
The relationship between a given measurement Z(i) and a user profile may be described by the Mahalanobis distance, which may be defined as the measure of the distance between the measurement Z(i) and the mean μ of the user profile measurement history distribution, as scaled by the standard deviation. Mahalanobis distances may represent the likelihood that the measurement Z(i) belongs to the user profile. That is, a sample measurement having a first Mahalanobis distance to the mean μ1 of a first user profile that is less than a second Mahalanobis distance to the mean μ2 of a second user profile is more likely to belong to the first user profile than the second user profile. The Mahalanobis distance between a given measurement Z(i) and a given user profile k may be determined according to the following equation:
Dk(i) = √((Z(i) − μk)^T Σk^−1 (Z(i) − μk))  (6)
Additionally, a confidence that the given measurement Z(i) belongs to a user profile k may be defined as:
The processor 120 may calculate a confidence P for each of the user profiles k that the given measurement Z(i) belongs to the corresponding user profiles k. The processor 120 may then determine whether the given measurement Z(i) belongs to one of the user profiles k based on the determined confidences P.
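For example, the classification may be carried out as in the sketch below, which evaluates equation (6) for each stored profile. Because equation (7) is not reproduced above, the mapping from distance to confidence used here (a chi-square tail probability of the squared distance) and the minimum-confidence threshold are assumed stand-ins.

```python
# Sketch of classifying a measurement Z(i) against stored user profiles using
# equation (6). The chi-square confidence mapping and the 0.05 threshold stand in
# for equation (7), which is not reproduced above, and are assumptions.
import numpy as np
from scipy.stats import chi2

def mahalanobis_distance(z, mean, cov):
    """Equation (6): D_k(i) = sqrt((Z(i) - mu_k)^T Sigma_k^-1 (Z(i) - mu_k))."""
    diff = np.asarray(z, dtype=float) - np.asarray(mean, dtype=float)
    return float(np.sqrt(diff @ np.linalg.solve(cov, diff)))

def classify_measurement(z, profiles, min_confidence=0.05):
    """Return (profile_id, confidence), or (None, confidence) if no profile is credible.

    `profiles` maps a profile id to a (mean, covariance) pair computed from that
    profile's measurement history.
    """
    best_id, best_conf = None, 0.0
    for profile_id, (mean, cov) in profiles.items():
        d = mahalanobis_distance(z, mean, cov)
        confidence = chi2.sf(d ** 2, df=len(mean))   # assumed stand-in for equation (7)
        if confidence > best_conf:
            best_id, best_conf = profile_id, confidence
    return (best_id, best_conf) if best_conf >= min_confidence else (None, best_conf)

# Example with two profiles whose histories are summarized by (mean, covariance)
# over two bio-impedance parameters (e.g., Re and Ri).
profiles = {
    "user_a": (np.array([620.0, 310.0]), np.array([[150.0, 10.0], [10.0, 80.0]])),
    "user_b": (np.array([540.0, 260.0]), np.array([[120.0, 5.0], [5.0, 60.0]])),
}
print(classify_measurement(np.array([615.0, 300.0]), profiles))  # -> ('user_a', ...)
```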
After a user profile k has been identified, the processor 120 may update the corresponding user profile history, as shown in block 540 of
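The bookkeeping behind that update is not spelled out above; one simple possibility, sketched below, is to keep a rolling window of accepted measurements per profile and to recompute the mean and covariance used in equation (6) from that window. The window length and the variance floor are illustrative assumptions.

```python
# Sketch of maintaining a profile's measurement history and the statistics that
# feed equation (6). The rolling-window length and the variance floor (which keeps
# the covariance invertible for short histories) are assumptions.
from collections import deque
import numpy as np

class ProfileHistory:
    def __init__(self, window=50, variance_floor=1e-3):
        self.samples = deque(maxlen=window)      # most recent accepted measurements
        self.variance_floor = variance_floor

    def update(self, measurement):
        """Append a 1-D parameter vector classified as belonging to this profile."""
        self.samples.append(np.asarray(measurement, dtype=float))

    def statistics(self):
        """Return (mean, covariance) for use in the Mahalanobis distance."""
        data = np.stack(list(self.samples))
        mean = data.mean(axis=0)
        dim = data.shape[1]
        if len(data) < 2:
            cov = np.eye(dim) * self.variance_floor
        else:
            cov = np.cov(data, rowvar=False).reshape(dim, dim) + np.eye(dim) * self.variance_floor
        return mean, cov
```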
Weight Distribution
While the method of
In one implementation, the processor 120 may track the weight distribution over the platform 195 as the user mounts the platform 195. This may involve tracking a position of the weight distribution (e.g., a center) from a measurement of zero body-weight until movement of the weight distribution settles to within a tolerance range for movement. That is, the weight distribution may be logged until changes in the weight distribution are less than a threshold change in distribution for longer than a predetermined period of time. Certain users may have a pattern in their mounting weight distribution which may distinguish them from other users. For example, one user may consistently mount the platform 195 with his/her left foot first before putting his/her right foot on the platform 195. Other mounting habits, such as an average weight distribution which is further forward, backward, left, and/or right compared to other users, may be used to distinguish between users. The speed at which stabilization in the weight distribution is achieved may also be used to distinguish between users.
The processor 120 may also log the weight distribution of the user after the weight distribution has achieved stability (e.g., changes in the position of the weight distribution are less than the threshold change in distribution for longer than the predetermined period of time). Due to user preferences in position over the platform 195 and user posture habits, the location of the weight distribution after stabilization may not be centered on the platform 195. Thus, in certain implementations, the user profile may include a history of measurements of the stabilized position of the center of the user's weight distribution. The weight distribution history may be analyzed for classification of the user measurement similarly to the bio-impedance measurements as discussed in connection with
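A sketch of extracting such mounting and stabilization features is shown below. It assumes the platform reports a per-sample center-of-pressure estimate; the sample rate, settling tolerance, settling window, and left/right convention are illustrative values rather than values taken from this disclosure.

```python
# Sketch of mounting/stabilization features from a center-of-pressure trace.
# The sample rate, tolerance, window, and left/right convention are assumptions.
import numpy as np

def mounting_features(cop_trace, fs=50.0, settle_tol=0.01, settle_window_s=1.0):
    """cop_trace: (N, 2) array of center-of-pressure (x, y) positions in meters,
    starting from the first sample with nonzero body-weight."""
    cop = np.asarray(cop_trace, dtype=float)
    step = np.linalg.norm(np.diff(cop, axis=0), axis=1)   # per-sample movement
    window = int(settle_window_s * fs)
    settled_at = None
    for i in range(len(step) - window + 1):
        if np.all(step[i:i + window] < settle_tol):       # movement stays below tolerance
            settled_at = i                                 # ...for the whole window
            break
    if settled_at is None:
        return None                                        # never stabilized during the trace
    return {
        "time_to_settle_s": settled_at / fs,               # speed of stabilization
        "settled_position": cop[settled_at:].mean(axis=0), # stabilized weight distribution
        "first_contact_side": "left" if cop[0, 0] < 0.0 else "right",
    }
```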
In certain implementations, the body-weight sensor(s) 166 may further include one or more high-resolution pressure sensor(s). The high-resolution pressure sensor(s) may have a resolution that can determine an outline and/or pressure profile of the user's feet. Accordingly, the processor 120 may use one or more of the outline and pressure profile of the user's feet in determining the identity of the user.
Wireless Communication and Other User Identification Parameters
The processor 120 may also use other parameters in the identification of the user in addition to or in place of those discussed above. Further, in certain embodiments, the parameters may not be physiological parameters. For example, the processor 120 may use signals received via the wireless transceiver 140 to determine that a particular user is currently using the biometric monitoring device 100. That is, users of the biometric monitoring device 100 may wear and/or carry client devices 170, such as a wearable device 302 or a mobile phone which are associated with a particular user profile. These client devices 170 may wirelessly communicate with the processor 120 via the wireless transceiver 140 when within wireless communications range (e.g., the wearable device 302 and/or mobile phone may communicate with the wireless transceiver 140 via Bluetooth, Wi-Fi, NFC, etc.). Since one or more client device(s) 170 may be uniquely associated with a particular user, the processor 120 may identify the user of the biometric monitoring device 100 when the biometric monitoring device 100 is in communication with the client device(s) 170 while the biometric monitoring device 100 is measuring the body-weight of the user.
In one example, the processor 120 may establish wireless communication with a client device 170 when the user approaches the biometric monitoring device 100 and the user may subsequently mount the platform 195. After the user has completed measurement of his/her body-weight, the user may leave the wireless communication range of the wireless transceiver 140, thereby terminating communication between the processor 120 and the client device 170. Since the client device 170 was in wireless communication with the processor 120 during the measurement of the user's body-weight, the processor 120 may base the determination of the user profile corresponding to the user on a unique identifier of the client device 170 (e.g., a device identifier such as a media access control (MAC) address). The processor 120 may use the proximity of the client device 170 as a parameter for determining the user's identification. For example, the strength of the wireless communication signal between the client device 170 and the wireless transceiver 140 may be dependent on the distance between the client device 170 and the wireless transceiver 140. Accordingly, when two or more client devices 170 are in wireless communication with the wireless transceiver 140, the processor 120 may use the unique identifier of the client device 170 which is in closer proximity to the wireless transceiver 140 in determining the identity of the user.
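One way to use these signals is sketched below: among devices already associated with user profiles, the device reporting the strongest signal is taken as the closest, and its associated profile is used as the identification parameter. The identifier-to-profile mapping, the signal-strength threshold, and the example values are illustrative assumptions.

```python
# Sketch of using nearby client devices as an identification parameter: among known
# associations, the strongest received signal (closest device) selects the profile.
# The RSSI threshold, addresses, and mapping are illustrative assumptions.
from typing import Dict, Optional

def profile_from_nearby_devices(
    rssi_by_device: Dict[str, float],       # device identifier -> RSSI in dBm
    device_to_profile: Dict[str, str],      # user-configured device associations
    min_rssi_dbm: float = -70.0,            # ignore devices that are too far away
) -> Optional[str]:
    candidates = {dev: rssi for dev, rssi in rssi_by_device.items()
                  if dev in device_to_profile and rssi >= min_rssi_dbm}
    if not candidates:
        return None
    closest = max(candidates, key=candidates.get)   # strongest signal, assumed closest
    return device_to_profile[closest]

# Example: two paired devices in range; the stronger signal selects the profile.
print(profile_from_nearby_devices(
    {"AA:11:22:33:44:55": -48.0, "BB:66:77:88:99:00": -63.0},
    {"AA:11:22:33:44:55": "user_a", "BB:66:77:88:99:00": "user_b"},
))  # -> user_a
```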
Another parameter which may be used by the processor 120 in identifying the user is a footprint measurement of the user. For example, the biometric monitoring device 100 may include one or more optical sensor(s) 168, such as a footprint scanner. The footprint scanner may function similarly to a fingerprint scanner and identify patterns in the ridges of the skin of the user's feet, which may be used to identify the user. Depending on the implementation, the biometric monitoring device 100 may include two footprint scanners in locations where the user places his/her feet when standing on the platform (see the footpads 190 of
In certain implementations, the optical sensor(s) 168 may comprise a PPG sensor. The PPG sensor may be able to measure biometrics such as heart rate, heartbeat waveform, heart rate variability, heart rate recovery, etc. In other implementations, the bio-impedance sensor(s) 162 may be able to determine heart rate and/or blood pressure from the bio-impedance measurement. Any one of the parameters measured by the optical sensor(s) 168 and/or the bio-impedance sensor(s) 162 may be used by the processor 120 to determine the identity of the user.
In other implementations, the biometric monitoring device 100 may further include or be in wireless communication with a camera (e.g., the camera may be included in a smart mirror in wireless communication with the biometric monitoring device 100). The camera may be configured to identify users of the biometric monitoring device 100 based on facial recognition algorithms and/or other image processing that identifies features substantially unique to the users of the biometric monitoring device 100. The processor 120 may use the output of the camera's facial recognition algorithms as a parameter in determining the identity of the user.
Automatic Determination of User Identification Parameters
A number of parameters which may be used by the processor 120 to identify the user of the biometric monitoring device 100 when the user is measuring his/her body-weight have been described. However, depending on the particular users who share a given biometric monitoring device 100, certain parameters may be more accurate in distinguishing between the users. Accordingly, in certain implementations, the processor 120 may identify those parameters which are more accurate in distinguishing and identifying the users of the biometric monitoring device 100 and track the histories of the identified parameters in the user profiles for user identification.
For example, the processor 120 may review the histories for each of the user profiles associated with the biometric monitoring device 100 to determine which user weight distribution parameters (e.g., habits) may be useful in distinguishing between the users of the biometric monitoring device. In one instance, a first user may habitually mount the biometric monitoring device 100 with his/her left foot and have a stabilized weight distribution which is forward from the center of the platform 195. A second user may habitually mount the biometric monitoring device 100 with his/her right foot and have a stabilized weight distribution which is to the right from the center of the platform 195. In this example, the processor 120 may identify the initial mounting foot and the stabilized weight distribution location as being parameters which may be used to distinguish between the users. The processor 120 may automatically identify users of the biometric monitoring device 100 based on the identified parameters.
As another example, the processor 120 may identify significant overlap in the bio-impedance histories stored in corresponding user profiles for two users. This overlap may make it difficult to distinguish between the two users when using the biometric monitoring device 100. In this example, the processor 120 may select one or more other parameters for identifying users including, for example, footprint data, a unique client device 170 identifier, body-weight distribution data, a time of day of the weight measurement, a material worn on the user's feet during the weight measurement, and/or a shape of the user's feet.
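The criterion for judging which parameters best separate a given set of users is not prescribed above; the sketch below scores each candidate parameter by the separation between per-user means relative to their pooled spread (a Fisher-style measure, assumed here) and keeps the highest-scoring parameters for subsequent identification.

```python
# Sketch of automatically ranking candidate identification parameters by how well
# they separate this device's users. The Fisher-style separation score and the
# worst-pair aggregation are assumptions, not a criterion stated in this disclosure.
import numpy as np
from itertools import combinations

def parameter_separability(histories):
    """histories: {user_id: 1-D array of past values for a single parameter}."""
    scores = []
    for (_, values_a), (_, values_b) in combinations(histories.items(), 2):
        a = np.asarray(values_a, dtype=float)
        b = np.asarray(values_b, dtype=float)
        spread = np.sqrt(a.var() + b.var()) + 1e-9        # pooled spread of the pair
        scores.append(abs(a.mean() - b.mean()) / spread)  # pairwise separation
    return min(scores) if scores else 0.0                 # the worst pair limits usefulness

def select_parameters(per_parameter_histories, top_k=2):
    """per_parameter_histories: {parameter_name: {user_id: values}}."""
    ranked = sorted(per_parameter_histories.items(),
                    key=lambda item: parameter_separability(item[1]),
                    reverse=True)
    return [name for name, _ in ranked[:top_k]]
```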
Example Flowchart for User Identification By Biometric Monitoring Device
In one implementation, the biometric monitoring device 100 comprises a platform 195 configured to receive at least one foot of a user, a plurality of sensors 140, 150, 160, a memory 130, and a processor (also referred to as processing circuitry) 120. The method 900 begins at block 901. At block 905, the processor 120 detects that the user is standing on the platform 195. At block 910, the processor 120 measures, based on body-weight data generated by a first one of the sensors, a weight of the user in response to detecting that the user is standing on the platform 195.
At block 915, the processor 120 determines, based on sensor data generated by a second one of the sensors, a user parameter indicative of the user's identification. In some embodiments, the determination of the parameter indicative of the user's identification may be in response to determining that the measured weight is within a defined range of two or more weight values. At block 920, the processor 120 compares the user parameter to a plurality of parameter values respectively associated with a plurality of user profiles stored in the memory. At block 925, the processor 120 compares the measured weight of the user to a plurality of weight values respectively associated with each of the user profiles. At block 930, the processor 120 identifies one of the user profiles as corresponding to the user based on the comparison of the user parameter to the parameter values and the comparison of the measured weight of the user to the weight values. At block 935, the processor 120 updates the identified user profile based on at least one of the measured weight and the user parameter. The method 900 ends at block 940. It is noted that the embodiments of the present disclosure are not limited to or by the example shown in
Other Considerations
Information and signals disclosed herein may be represented using any of a variety of different technologies and techniques. For example, data, instructions, commands, information, signals, bits, symbols, and chips that may be referenced throughout the above description may be represented by voltages, currents, electromagnetic waves, magnetic fields or particles, optical fields or particles, or any combination thereof.
The various illustrative logical blocks and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure.
The techniques described herein may be implemented in hardware, software, firmware, or any combination thereof. Such techniques may be implemented in any of a variety of devices, such as, for example, wearable devices, wireless communication device handsets, or integrated circuit devices for wearable devices, wireless communication device handsets, and other devices. Any features described as devices or components may be implemented together in an integrated logic device or separately as discrete but interoperable logic devices. If implemented in software, the techniques may be realized at least in part by a computer-readable data storage medium comprising program code including instructions that, when executed, perform one or more of the methods described above. The computer-readable data storage medium may form part of a computer program product, which may include packaging materials. The computer-readable medium may comprise memory or data storage media, such as random access memory (RAM) such as synchronous dynamic random access memory (SDRAM), read-only memory (ROM), non-volatile random access memory (NVRAM), electrically erasable programmable read-only memory (EEPROM), FLASH memory, magnetic or optical data storage media, and the like. The techniques additionally, or alternatively, may be realized at least in part by a computer-readable communication medium that carries or communicates program code in the form of instructions or data structures and that can be accessed, read, and/or executed by a computer, such as propagated signals or waves.
Processor(s) in communication with (e.g., operating in collaboration with) the computer-readable medium (e.g., memory or other data storage device) may execute instructions of the program code, and may include one or more processors, such as one or more digital signal processors (DSPs), general purpose microprocessors, ASICs, field programmable gate arrays (FPGAs), or other equivalent integrated or discrete logic circuitry. Such a processor may be configured to perform any of the techniques described in this disclosure. A general purpose processor may be a microprocessor; but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, for example, a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration. Accordingly, the term “processor,” as used herein may refer to any of the foregoing structure, any combination of the foregoing structure, or any other structure or apparatus suitable for implementation of the techniques described herein. Also, the techniques could be fully implemented in one or more circuits or logic elements.
The techniques of this disclosure may be implemented in a wide variety of devices or apparatuses, including a wearable device, a wireless handset, an integrated circuit (IC) or a set of ICs (e.g., a chip set). Various components or units are described in this disclosure to emphasize functional aspects of devices configured to perform the disclosed techniques, but do not necessarily require realization by different hardware units. Rather, as described above, various units may be combined in a hardware unit or provided by a collection of inter-operative hardware units, including one or more processors as described above, in conjunction with suitable software and/or firmware.
Although the foregoing has been described in connection with various different embodiments, features or elements from one embodiment may be combined with other embodiments without departing from the teachings of this disclosure. However, the combinations of features between the respective embodiments are not necessarily limited thereto. Various embodiments of the disclosure have been described. These and other embodiments are within the scope of the following claims.