Sedentary period detection utilizing a wearable electronic device

Information

  • Patent Grant
  • Patent Number
    11,129,534
  • Date Filed
    Monday, December 2, 2019
  • Date Issued
    Tuesday, September 28, 2021
Abstract
Systems and methods for determining a sedentary state of a user are described. Sensor data is collected and analyzed to calculate metabolic equivalent of task (MET) measures for a plurality of moments of interest. Based on the MET measures and a time period for which the MET measures exceed a threshold value, it is determined whether the user is in a sedentary state. If the user is in the sedentary state, the user is provided a notification to encourage the user to perform a non-sedentary activity.
Description
INCORPORATION BY REFERENCE

An Application Data Sheet is filed concurrently with this specification as part of the present application. Each application that the present application claims benefit of or priority to as identified in the concurrently filed Application Data Sheet is incorporated by reference herein in its entirety and for all purposes.


FIELD

Embodiments described in the present disclosure relate to the field of wearable electronic devices. Specifically, the embodiments relate to automatic detection of sedentary periods and promotion of non-sedentary behavior utilizing a wearable electronic device.


BACKGROUND

Trackers have gained popularity among consumers. A tracker uses a variety of sensors to track a user's activities and helps the user maintain a healthy lifestyle. To determine the activities, the tracker collects activity data and runs computations on that data. One difficulty in obtaining accurate determinations of the activities is that trackers, because they are worn by a user, are typically packaged in a compact casing containing processor(s) less powerful than those of larger electronic devices, which makes it harder to run complex computations.


Another challenge in tracking the activities is differentiating between the user being stationary but performing an activity and the user being sedentary, e.g., expending minimal energy, etc. To illustrate, the user expends minimal energy when the user is seated, or seated and typing at a computer. Increased total sedentary time and longer sustained sedentary periods are associated with poor health and fitness, e.g., obesity, metabolic disorders, etc.


SUMMARY

In some embodiments, a wearable electronic device to be worn by a user is described. The wearable electronic device includes a set of one or more sensors to generate sensor data associated with the user when the user is wearing the wearable electronic device. The wearable electronic device further includes a set of one or more processors coupled to the set of sensors and a non-transitory machine readable storage medium coupled to the set of one or more processors and having stored therein instructions. When the instructions are executed by the set of one or more processors, the instructions cause the wearable electronic device to track a period of time during which a state of the user is determined to be sedentary. The determination is based on metabolic equivalent of task (MET) measures at moments of interest and the MET measures are calculated based on the sensor data. The instructions further cause the user to receive a notification responsive to the tracked period of time to encourage the user to limit the length of sedentary periods.


In various embodiments, the instructions further cause the wearable electronic device to classify, for each of the moments of interest, a status of the user as being sedentary or non-sedentary based on a MET measure for that moment of interest.


In several embodiments, a period of time is tracked based on a determination of contiguous ones of the moments of interest for which the status of the user is classified as sedentary.


In some embodiments, the status of the user at a moment of interest is classified as sedentary when a MET measure for that moment of interest is below a threshold MET value.


In various embodiments, the status of the user at a moment of interest is classified as sedentary when a MET measure for that moment of interest is between a first threshold value and a second threshold value, and the moment of interest is preceded by a first moment of interest within a threshold window of time for which the status of the user is sedentary, and is followed within the threshold window of time by a second moment of interest for which the status of the user is sedentary.
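By way of illustration, the classification rules of the preceding embodiments (a simple MET threshold, plus a neighbor-based rule for MET measures that fall between two thresholds) can be sketched in Python as follows. This is a minimal sketch for exposition only; the threshold values, the window size, and all names are assumptions and are not values specified by the present disclosure.

SEDENTARY_MET = 1.5    # assumed first threshold: below this, sedentary outright
AMBIGUOUS_MET = 2.0    # assumed second threshold: between the two, neighbors decide
NEIGHBOR_WINDOW = 2    # assumed threshold window, in moments of interest

def classify_moments(met_measures):
    """Classify each moment of interest as 'sedentary' or 'non-sedentary'."""
    # First pass: classify unambiguous moments by the MET thresholds alone.
    status = []
    for met in met_measures:
        if met < SEDENTARY_MET:
            status.append("sedentary")
        elif met < AMBIGUOUS_MET:
            status.append("ambiguous")
        else:
            status.append("non-sedentary")
    # Second pass: an intermediate moment is classified as sedentary when a
    # sedentary moment both precedes and follows it within the threshold window.
    resolved = []
    for i, s in enumerate(status):
        if s != "ambiguous":
            resolved.append(s)
            continue
        before = status[max(0, i - NEIGHBOR_WINDOW):i]
        after = status[i + 1:i + 1 + NEIGHBOR_WINDOW]
        if "sedentary" in before and "sedentary" in after:
            resolved.append("sedentary")
        else:
            resolved.append("non-sedentary")
    return resolved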


In some embodiments, one of the one or more sensors is a photoplethysmographic (PPG) sensor, and the MET measures are based on heart rate measures of the user calculated from PPG sensor data.


In various embodiments, the instructions cause the wearable electronic device to filter out a period of time during which the state of the user is asleep.


In some embodiments, the wearable electronic device includes a sensor to generate sensor data associated with the user when the user is wearing the wearable electronic device. The wearable electronic device further includes a set of one or more processors coupled to the sensor and a non-transitory machine readable storage medium coupled to the set of one or more processors and having instructions stored therein. The instructions, when executed by the set of one or more processors, cause the wearable electronic device to track a period of time during which the state of the user is determined to be sedentary based on the sensor data. The period of time has a beginning and an end. The instructions further cause the wearable electronic device to detect, responsive to the end of the period of time, that the state of the user has changed from sedentary to non-sedentary for a threshold period of time. The instructions cause the wearable electronic device to cause the user to receive a notification responsive to the detection to encourage the user to remain non-sedentary.


In several embodiments, a notification is a message displayed on a display device of the wearable electronic device, or a vibration of the wearable electronic device, or a sound emitted by the wearable electronic device.


In some embodiments, a notification is indicative of the user having ended the period of time during which the state of the user is sedentary.


In various embodiments, a notification is determined based on preferences set by the user.


In some embodiments, a notification is a motivational statement displayed on the display device of the wearable electronic device.
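The notification channels and user preferences described in the preceding embodiments admit a simple dispatch sketch, shown below in Python. The fragment is illustrative only; the device methods (display_message, vibrate, play_sound) and the preference keys are hypothetical stand-ins for whatever interface the wearable electronic device exposes.

def notify(device, text, preferences):
    """Deliver a notification via the channels enabled in the user's preferences."""
    if preferences.get("display", True):
        device.display_message(text)   # e.g., a motivational statement
    if preferences.get("vibration", False):
        device.vibrate()
    if preferences.get("sound", False):
        device.play_sound("chime")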


In various embodiments, an apparatus to improve an effectiveness of notifications provided to the user of the wearable electronic device is described. The apparatus includes an electronic device including a sedentary state monitor to notify the user of the wearable electronic device to encourage the user to alter his/her sedentary behavior based on tracking periods of time during which the user is sedentary. The sedentary state monitor includes a set of one or more managers to receive a current state from a plurality of states of the user during periods of time. The states include a sedentary state. The one or more managers cause the wearable electronic device to receive notifications based on the current state to notify the user to limit a length of time during which the user is in the sedentary state.


In some embodiments, the apparatus further includes a sedentary learning unit coupled to receive data from each of the one or more managers concerning the notifications. The sedentary learning unit is coupled to a set of one or more sensors of the wearable electronic device to determine which notifications have an effect of modifying the sedentary behavior of the user, and to determine an updated configuration of at least one of the one or more managers. The updated configuration improves the user's response to the notifications to limit a length of a period of time during which the user is in the sedentary state.


In various embodiments, the one or more managers include a sedentary alert manager that receives a period of time for the current state of the user. The sedentary alert manager generates a sedentary alert based on a detection that the period of time exceeds a sedentary period of time threshold value. The sedentary alert manager sends a notification to the wearable electronic device indicating that the period of time exceeds the sedentary period of time threshold value.


In some embodiments, the one or more managers further include a non-sedentary state transition manager to receive the current state of the user. The non-sedentary state transition manager generates a notification that is based on a detection of an end of a sedentary period of time for the current state. The notification is sent from the non-sedentary state transition manager to the wearable electronic device.


In various embodiments, the sedentary learning unit determines, based on notification information received from the sedentary alert manager and the non-sedentary state transition manager, an updated configuration of at least one of the sedentary alert manager and the non-sedentary state transition manager. The updated configuration improves the user's response to the notification information to limit a length of a sedentary period of time during which the user is in the sedentary state.


In some embodiments, the updated configuration includes disabling at least one of the sedentary alert manager and the non-sedentary state transition manager.


In several embodiments, the sedentary learning unit includes a decision tree, a random forest, a support vector machine, a neural network, a K-nearest neighbor model, a Naïve Bayes model, or a Hidden Markov model.


In various embodiments, the sedentary learning unit allows the user to snooze a notification.


In some embodiments, the sedentary learning unit uses data related to the snoozed notification to determine an updated configuration of at least one of the one or more managers.
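As one hedged illustration of how the sedentary learning unit might use notification and snooze data to update a manager configuration, consider the following Python sketch. The effectiveness heuristic, the field names, and the 20% cutoff are assumptions for exposition; the disclosure also contemplates trained models (e.g., decision trees, random forests) for this decision.

def update_configuration(notification_log):
    """Disable managers whose notifications rarely modify the user's behavior.

    notification_log: list of dicts such as
      {"manager": "sedentary_alert", "snoozed": False, "user_moved": True}
    (field names are hypothetical).
    """
    stats = {}
    for entry in notification_log:
        mgr = stats.setdefault(entry["manager"], {"sent": 0, "effective": 0})
        mgr["sent"] += 1
        # A notification counts as effective when it was not snoozed and the
        # user subsequently became non-sedentary.
        if entry["user_moved"] and not entry["snoozed"]:
            mgr["effective"] += 1
    config = {}
    for name, s in stats.items():
        rate = s["effective"] / s["sent"] if s["sent"] else 0.0
        config[name] = {"enabled": rate >= 0.2}   # assumed cutoff
    return config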


In various embodiments, the electronic device is a wearable electronic device.





BRIEF DESCRIPTION OF THE DRAWINGS

Embodiments described in the present disclosure are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings in which like references indicate similar elements.



FIG. 1A illustrates sedentary user state detection and sedentary alert management, according to various embodiments described in the present disclosure.



FIG. 1B illustrates a flow diagram of operations for tracking sedentary periods of time and causing a user to receive notifications based on the sedentary periods, according to some embodiments described in the present disclosure.



FIG. 2 illustrates a use of sedentary and non-sedentary statuses of the user to determine periods of time in which a state of the user is sedentary, according to several embodiments described in the present disclosure.



FIG. 3 illustrates a sedentary state monitor for notifying the user based on tracking of the sedentary periods of time to encourage the user to alter his/her sedentary behavior and to limit a length of the sedentary periods, according to some embodiments described in the present disclosure.



FIG. 4 illustrates a communication of a notification to the user based on a detection of an end of a sedentary period of time and of a beginning of a non-sedentary period of time, which has exceeded a threshold period of time, according to various embodiments described in the present disclosure.



FIG. 5 illustrates a communication of a sedentary alert to the user based on a detection of the sedentary period of time exceeding a sedentary period of time threshold value, according to some embodiments described in the present disclosure.



FIG. 6A illustrates a recordation of metabolic equivalent of task (MET) measures at consecutive moments of interest over a period of time and a classification of a status of the user at each moment of interest based on the MET measures, in accordance with various embodiments described in the present disclosure.



FIG. 6B illustrates a recordation of the MET measures at consecutive moments of interest over a period of time and the classification of the status of the user at each moment of interest based on the MET measures, in accordance with some embodiments described in the present disclosure.



FIG. 7 is a block diagram illustrating a wearable electronic device and an electronic device implementing operations disclosed herein, according to various embodiments described in the present disclosure.



FIG. 8 is a block diagram of a wrist-mounted electronic device having a button, a display, and a wrist band to secure the wrist-mounted electronic device to a forearm of the user, according to some embodiments described in the present disclosure.





DETAILED DESCRIPTION

In the following description, numerous specific details are set forth. However, it is understood that embodiments described in the present disclosure may be practiced without these specific details. In other instances, well-known circuits, structures, and techniques have not been shown in detail in order not to obscure an understanding of the embodiments. Those of ordinary skill in the art, with a description of the embodiments, will be able to implement appropriate functionality of the embodiments without undue experimentation.


In some embodiments, the terms “coupled” and “connected,” along with their derivatives, are used. It should be understood that these terms are not intended as synonyms for each other. For example, “coupled” is used to indicate that two or more elements, which are or are not in direct physical or electrical contact with each other, co-operate or interact with each other. Moreover, in this example, “connected” is used to indicate an establishment of communication between two or more elements that are coupled with each other. Furthermore, in various embodiments, a “set”, as used herein, refers to any positive whole number of items, including one item, unless stated otherwise, e.g., as a set of zero or more items.


In some embodiments, an electronic device stores code internally and/or transmits the code to other electronic devices over a computer network. The code is composed of software instructions and is sometimes referred to as computer program code or a computer program stored within a machine-readable storage medium. In some embodiments, the code includes data for execution of the code. In various embodiments, the machine-readable storage medium is a computer-readable medium. Examples of computer-readable media include magnetic disks, optical disks, read only memory (ROM), random access memory (RAM), flash memory devices, phase change memory, etc. In various embodiments, the code is sent using a machine-readable transmission media, also called a carrier, such as, for example, electrical, optical, radio, acoustical, or other forms of propagated signals. Further examples of the carrier include carrier waves, infrared signals, etc.


In various embodiments, the electronic device, e.g., a computer, etc., includes hardware and software. For example, the electronic device includes one or more processors coupled to one or more machine-readable storage media to store the code for execution on the one or more processors and/or to store data. To further illustrate, the electronic device includes a non-volatile memory containing the code and the non-volatile memory stores the code or data even when the electronic device is turned off, e.g., when power to the electronic device is removed, etc. While the electronic device is turned on, a part of the code that is to be executed by the one or more processors of the electronic device is copied from the non-volatile memory into a volatile memory, e.g., a dynamic random access memory (DRAM), a static random access memory (SRAM), etc., of the electronic device. The non-volatile memory is slower to access than the volatile memory. The electronic device typically also includes a set of one or more network interface(s), e.g., a network interface controller, a network interface card, an Internet access card, an Internet access controller, etc., each of which establishes physical or wireless network connections with the other electronic devices to communicate the code and/or data using the propagated signals. A wearable electronic device (WED), further described in detail below, is an example of the electronic device. It should be noted that one or more parts of an embodiment described in the present disclosure may be implemented using different combinations of software, firmware, and/or hardware.


Embodiments describing tracking of a sedentary state of a user and generation of a sedentary notification follow.



FIG. 1A illustrates sedentary state detection and sedentary alert management, according to some embodiments described in the present disclosure. It should be noted that task boxes 1, 2, 3A, 3B, and 4 of FIG. 1A are executed and components 110, 112, 120, 122, 124, and 126 of FIG. 1A are implemented, in some embodiments, in the wearable electronic device, or distributed between the wearable electronic device and one or more of the other electronic devices coupled to the wearable electronic device. In some embodiments, the wearable electronic device is worn on a body part, e.g., an arm, a wrist, an ankle, or a chest, etc., of the user, or embedded in a garment worn by the user. Examples of the one or more other electronic devices include a server including hardware and software, a tablet, a smartphone, a desktop computer, a laptop computer, and a smart television. In some embodiments, the one or more other electronic devices execute an application, sometimes referred to as an app, to implement, for example, a sensor data analyzer 112, a user state tracking unit 190, and/or a sedentary notification unit 192.


The task boxes 1-4 illustrate an order in which operations are performed by the components 110, 112, 120, 122, 124, and 126. As illustrated by the task box 1, one or more sensors 110 generate sensor data 150 for a plurality of time intervals. For example, the one or more sensors 110 are implemented in the wearable electronic device, such that, when worn by the user, at least some of the sensor data is indicative of an activity performed by the user. An example of the sensor data includes biometric data. In some embodiments, the one or more sensors 110 that generate the sensor data 150 include a motion sensor, e.g., a three axis accelerometer, etc. The motion sensor generates motion sensor data indicative of a motion of the user, e.g., a number of steps taken, a number of floors climbed, a number of floors descended, etc. In various embodiments, the one or more sensors 110 include a heart rate sensor, e.g., a photoplethysmographic (PPG) sensor, etc., to generate heart sensor data, e.g., PPG sensor data, etc., indicative of a heart rate of the user. In several embodiments, both the motion sensor and the heart rate sensor are housed in the same wearable electronic device or in different wearable electronic devices. In some embodiments, other types of sensors, e.g., a gyroscope, a gravity sensor, a rotation vector sensor, a magnetometer, a temperature sensor to measure a temperature of the user's skin and/or an environment surrounding the user, an ambient light sensor to measure an ambient light of the environment, a galvanic skin response sensor, a capacitive sensor, a humidity sensor, a sound sensor, etc., are housed in the wearable electronic device or in multiple wearable electronic devices. Examples of the environment surrounding the user include a room in which the user is situated, a street on which the user is standing or driving, an interior of a vehicle in which the user is situated, etc. In several embodiments, some or all of the sensor data 150 is generated by one of the one or more other electronic devices described above and received by the wearable electronic device from the one of the one or more other electronic devices.


The sensor data 150 generated during a time interval is processed by the sensor data analyzer 112. In some embodiments, a subset of the sensor data 150 is generated by performing a statistical operation, e.g., averaging, etc., on the sensor data 150 and is processed by the sensor data analyzer 112. At the task box 2, the sensor data analyzer 112 analyzes the sensor data 150 received from the one or more sensors 110 and calculates analyzed sensor information 152 for each moment of interest, and the analyzed sensor information 152 is used to determine a status, e.g., a sedentary status or a non-sedentary status, etc., of the user. In some embodiments, the analyzed sensor information 152 is calculated for a plurality of moments of interest at regular intervals of time, e.g., an interval within a range of 30 seconds to 1.5 minutes, a 1 minute interval, an interval within a range of 0.5 seconds to 1.5 seconds, a 1 second interval, etc. In various embodiments, the intervals of time are configurable and dynamically adjusted, e.g., reduced or increased based on various factors, etc., and/or capable of being automatically disabled and/or manually disabled by the user for spans of time to save power.


In some embodiments, the analyzed sensor information 152 is metabolic equivalent of task (MET) measures, where each MET measure is determined for a moment of interest. A MET measure is a normalized measure of energy expenditure, which increases with activity and is non-zero at every moment of interest. For example, a MET measure for a user who is inactive, asleep, or not wearing the device is close to 1.0, a MET measure for the user while walking is generally above 2.0, and a MET measure for the user while swimming is between 10.0 and 11.0. While in some embodiments the analyzed sensor information 152 is MET measures, various embodiments use different measures, e.g., motion measures indicative of a motion of the user wearing the wearable electronic device, heart rate measures indicative of the heart rate of the user, etc. The motion measures are sometimes referred to herein as movement measures. Examples of the motion measures include a number of steps taken by the user, a number of floors climbed or descended by the user, etc. In various embodiments, the heart rate sensor, e.g., a heart rate monitor, etc., generates the heart sensor data indicative of a heart rate of the user, from which the heart rate measures of the user are calculated.
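For concreteness, a MET-like measure can be sketched from heart rate alone, as below. The linear model and its constants are purely illustrative assumptions; the disclosure states only that MET measures are calculated from sensor data (e.g., PPG-based heart rate and/or motion data), not how.

def estimate_met(heart_rate_bpm, resting_hr=60.0, max_hr=190.0):
    """Return a MET-like measure that is ~1.0 at rest and grows with effort."""
    reserve = max(0.0, heart_rate_bpm - resting_hr) / (max_hr - resting_hr)
    return 1.0 + 9.0 * reserve   # ~1.0 at rest, ~10.0 near maximal effort

print(estimate_met(60))   # ~1.0, consistent with an inactive status
print(estimate_met(95))   # ~3.4, roughly a walking-range MET measure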


The analyzed sensor information 152 for moments of interest is used by the user state tracking unit 190 to generate a state 156 of the user for different periods of time. In some embodiments, each period of time includes multiple contiguous moments of interest. In various embodiments, each period of time is as small as one moment of interest. The user state tracking unit 190 includes a user status classifier 120 that classifies each moment 154 of interest into a status of the user based on the analyzed sensor information 152 for that moment of interest. The statuses of the user are classified into the sedentary status and the non-sedentary status, as indicated in the task box 3A.


In some embodiments, the user status classifier 120 classifies the status of the user for the moment 154 of interest as being sedentary, e.g., seated, seated and typing at a computer, typing at a computer, etc., or non-sedentary, e.g., active, running, walking, exercising, dancing, swimming, etc. It should be noted that in various embodiments, when the status of the user is classified as non-sedentary, the user is spending a higher amount of energy compared to energy expended by the user when the status of the user is classified as sedentary. Several methods for classifying the moment 154 of interest are described in more detail below with reference to FIGS. 6A-6B.


In various embodiments, a MET measure is used to determine the non-sedentary status of the user and to associate the non-sedentary status with a particular type of activity of the user. For example, based on a MET measure, the user status classifier 120 determines whether the user is running, walking, sprinting, bicycling, swimming, or performing another type of non-sedentary activity.


As described in the task box 3B, a time period detector 122 of the user state tracking unit 190 detects periods of time during which a state of the user is sedentary based on contiguous moments of interest for which the status of the user is classified as sedentary. For example, according to some embodiments described below with reference to FIG. 2, a block of time is determined by the time period detector 122 to have a sedentary state when it is determined by the time period detector 122 that the block of time includes contiguous moments of interest of sedentary statuses.
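The detection in the task box 3B amounts to run-length grouping of per-moment statuses, as in the following minimal Python sketch (the 1 minute moment spacing is an assumed example value):

from itertools import groupby

def detect_periods(statuses, moment_seconds=60):
    """Group contiguous moments of interest with the same status into
    non-overlapping periods; returns (status, start_index, duration_seconds)."""
    periods, index = [], 0
    for status, run in groupby(statuses):
        length = len(list(run))
        periods.append((status, index, length * moment_seconds))
        index += length
    return periods

statuses = ["sedentary"] * 6 + ["non-sedentary"] * 5 + ["sedentary"] * 4
for period in detect_periods(statuses):
    print(period)   # e.g., ('sedentary', 0, 360) for the first period

With the example statuses above, the sketch reproduces the structure of FIG. 2: a sedentary period, a non-sedentary period, and a second sedentary period.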


The state 156 of the user for different periods of time is used by the sedentary notification unit 192 to generate one or more sedentary alerts 158 to notify the user. Examples of the one or more sedentary alerts 158 are provided below. The one or more sedentary alerts 158 encourage the user to limit a length of sedentary time periods. At the task box 4, a sedentary state monitor 124 of the sedentary notification unit 192 generates the one or more sedentary alerts 158, e.g., a notification, etc., for providing to the user to encourage the user to alter his/her sedentary behavior. The one or more sedentary alerts 158 are provided to the user through a user interface 126, which includes a display device of the wearable electronic device. In some embodiments, the user receives the one or more sedentary alerts 158 through a vibration of the wearable electronic device, a message displayed on the display device of the wearable electronic device, and/or a sound emitted by a speaker within the wearable electronic device.


In some embodiments, the sensor data analyzer 112 is located within the wearable electronic device or within one of the other electronic devices. For example, a processor of the wearable electronic device or of one of the other electronic devices performs operations described herein as being performed by the sensor data analyzer 112. In several embodiments, the user status classifier 120 is located within the wearable electronic device or in one of the other electronic devices. For example, the processor of the wearable electronic device or of one of the other electronic devices performs operations described herein as being performed by the user status classifier 120. In various embodiments, the time period detector 122 is located within the wearable electronic device or in one of the other electronic devices. For example, the processor of the wearable electronic device or of one of the other electronic devices performs operations described herein as being performed by the time period detector 122. In several embodiments, the sedentary state monitor 124 is located within the wearable electronic device or in one of the other electronic devices. For example, the processor of the wearable electronic device or of one of the other electronic devices performs operations described herein as being performed by the sedentary state monitor 124. In some embodiments, the user interface 126 is located within the wearable electronic device or in one of the other electronic devices. For example, the processor of the wearable electronic device or of one of the other electronic devices performs operations described herein as being performed by the user interface 126. Examples of a processor include an application specific integrated circuit (ASIC), a programmable logic device (PLD), a central processing unit, a microprocessor, a controller, a microcontroller, etc.


In some embodiments, instead of the processor of the wearable electronic device performing the operations described herein as being performed by the sensor data analyzer 112, the user status classifier 120, the time period detector 122, the sedentary state monitor 124, and the user interface 126, different processors of the wearable electronic device perform the operations. For example, a processor of the wearable electronic device performs an operation described herein as being performed by the sensor data analyzer 112, another processor of the wearable electronic device performs an operation described herein as being performed by the user status classifier 120, yet another processor of the wearable electronic device performs an operation described herein as being performed by the time period detector 122, another processor of the wearable electronic device performs an operation described herein as being performed by the sedentary state monitor 124, and another processor of the wearable electronic device performs an operation described herein as being performed by the user interface 126. Similarly, in various embodiments, instead of the processor of one of the other electronic devices performing the operations described herein as being performed by the sensor data analyzer 112, the user status classifier 120, the time period detector 122, the sedentary state monitor 124, and the user interface 126, different processors of the one of the other electronic devices perform the operations.


In various embodiments, one or more processors of the wearable electronic device perform the operations described herein as being performed by the sensor data analyzer 112, the user status classifier 120, the time period detector 122, the sedentary state monitor 124, and the user interface 126. Similarly, in some embodiments, one or more processors of one of the other electronic devices perform the operations described herein as being performed by the sensor data analyzer 112, the user status classifier 120, the time period detector 122, the sedentary state monitor 124, and the user interface 126.


It should be noted that in embodiments in which tasks described above with reference to task boxes 2 through 4 are performed by the components 112, 120, 122, and 124 located in one of the other electronic devices, the wearable electronic device includes a display device to display the one or more sedentary alerts 158. Moreover, in these embodiments, the wearable electronic device and one of the other electronic devices communicate with each other via a communication medium, e.g., a universal serial bus cable, a wireless protocol air medium, a serial cable, a parallel cable, etc. An example of the wireless protocol includes Bluetooth™.



FIG. 1B illustrates a flow diagram of a method for tracking a sedentary period of time and causing the user to receive notifications based on the sedentary period of time, according to various embodiments described in the present disclosure. At an operation 102 of the method, a period of time during which the state of the user is determined to be sedentary, sometimes referred to herein as the “sedentary period of time”, is tracked by the time period detector 122. The determination of the period of time is based on the MET measures for individual non-overlapping moments of interest within each period of time. Further, in some embodiments, the MET measures are generated within the wearable electronic device based on the sensor data 150 received from the one or more sensors 110 (FIG. 1A), e.g., a three axis accelerometer and a heart rate sensor, etc. Examples of the MET measures include movement measures calculated based on the sensor data 150 from the motion sensor and heart rate measures calculated based on the sensor data 150 from the heart rate sensor housed within the same wearable electronic device. If a length of the sedentary period of time exceeds a pre-determined threshold time period, it is determined by the time period detector 122 that the state of the user is sedentary.


At an operation 104 of the method for tracking the sedentary period of time, an act of providing the user with notifications to encourage the user to limit a length of the sedentary period of time is performed based on the tracking at the operation 102. In some embodiments, the operation 104 is performed within the wearable electronic device and the user is notified by receiving a message on the display device of the wearable electronic device, a vibration of the wearable electronic device, and/or a sound emitted by speakers of the wearable electronic device.


In some embodiments, a notification, as described herein, is an electronic notification that is sent to a display device. For example, the electronic notification is rendered by a processor to be displayed on a display device. In various embodiments, the electronic notification is provided in the form of a vibration or a sound.



FIG. 2 illustrates a use of the user's sedentary and non-sedentary statuses to determine periods of time in which the state of the user is sedentary, according to some embodiments described in the present disclosure. As illustrated in FIG. 2, the statuses are shown as either sedentary or non-sedentary over time, e.g., at each moment of interest, etc. From the sedentary or non-sedentary statuses, non-overlapping, consecutive periods of time are derived. During each of the consecutive periods of time, the user is in either a sedentary state or a non-sedentary state. Each of the consecutive periods of time spans one or more moments of interest. Also, as illustrated in FIG. 2, the consecutive periods of time have different states of the user, and the consecutive periods of time are described by the task box 3B.


Specifically, FIG. 2 shows that three consecutive periods of time are derived: a sedentary period 252, a non-sedentary period 254, and a sedentary period 256, each of which spans multiple contiguous moments of interest for which the status of the user is identical. The sedentary period 252 includes 6 moments of interest, each of which has the status classified as sedentary. Comparatively, the non-sedentary period 254 includes 5 moments of interest, each of which has the status classified as non-sedentary. Transitions between the states of the user are represented by edges of the periods of time, e.g., when one time period ends and the next begins, etc. To illustrate, the state of the user transitions from the sedentary state to the non-sedentary state at an end of the sedentary period 252 of time and at a beginning of the non-sedentary period 254 of time.


In some embodiments, the time period detector 122 (FIG. 1A) detects and records alternating periods of time in which the state of the user is sedentary or non-sedentary. For example, the sedentary periods of time 252 and 256, and the non-sedentary period of time 254, illustrated in FIG. 2, are recorded over a span of time, e.g., an hour, a day, a week, etc., to be presented to the user or to perform further analysis with regard to a sedentary behavior of the user. The determined periods of time with the states, e.g., the sedentary state, the non-sedentary state, etc., are presented to the user on the display device of the wearable electronic device or a display device of one of the other electronic devices, e.g., a tablet, a smartphone, a computer, etc., which receives the determined periods of time and the states as data from the wearable electronic device. In some embodiments, the one of the other electronic devices generates the determined periods of time and the states. The user then views his/her sedentary behavior indicated by the determined periods of time and the states, and tracks his/her improvement over time.


In some embodiments, the user shares information about his/her sedentary behavior with friends, colleagues, or teammates via a network, e.g., a social network, etc. The friends, colleagues, or teammates compete with each other based on their respective sedentary states determined by their respective recorded sedentary periods.


In various embodiments, additional sensor data is used to further disambiguate between sedentary periods and other time periods during which the user is asleep and/or not wearing the wearable electronic device. The wearable electronic device is able to detect time periods during which the user is asleep or not wearing the wearable electronic device. For example, when the wearable electronic device is not being worn, the one or more sensors 110 cannot detect information, e.g., the motion sensor data, the heart sensor data, a number of steps taken, etc., about the user. To further illustrate, when the wearable electronic device is not being worn by the user, the one or more sensors detect a number of steps taken by the user to be zero for a time period. Such a time period satisfies the MET measure or motion based criteria for sedentary time, but does not qualify as sedentary because the user is not wearing the wearable electronic device or is asleep, as detected by the one or more sensors 110. MET measures for a time period during which the user is not wearing the wearable electronic device, e.g., while sleeping, while showering, etc., or is asleep are filtered out by the processor of the wearable electronic device or the processor of one of the other electronic devices before or during a time at which a determination of the sedentary and non-sedentary statuses is made.
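A minimal sketch of this filtering step follows; the asleep and worn flags are hypothetical stand-ins for the sensor-based sleep and wear detection described above.

def filter_moments(moments):
    """Keep only moments eligible for sedentary/non-sedentary classification.

    moments: list of dicts such as {"met": 1.1, "asleep": False, "worn": True}
    (field names are hypothetical).
    """
    return [m for m in moments if m["worn"] and not m["asleep"]]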


Embodiments describing the sedentary state monitor follow.


Through an automatic detection and tracking of periods of time in which the state of the user is sedentary, the user receives notifications to encourage him/her to alter his/her behavior and be less sedentary. The notifications promote breaking up long periods of sedentary states and decreasing an overall time that the user is sedentary. FIG. 3 is a block diagram of an embodiment of the sedentary state monitor 124 for notifying the user based on tracking of sedentary periods of time to encourage the user to alter his/her sedentary behavior and limit a length of the sedentary periods, according to several embodiments described in the present disclosure. The sedentary state monitor 124 includes a management unit 310 that receives the states of the user for periods of time. For example, the state is provided at an end of a period of time after which the preceding state of the user has changed. To illustrate, after a transition between two consecutive states of the user, it is indicated to the management unit 310 that a current period with a new state of the user has begun. As another example, the states are provided as a current stream of information that includes a transition between two consecutive states. As yet another example, the states are provided in bulk at regular intervals. To illustrate, a current period of time has thus far lasted a length of time X, and the management unit 310 detects transitions between two consecutive states during X.


While FIG. 3 shows a non-sedentary state transition manager 320 of the management unit 310, a sedentary alert manager 330 of the management unit 310, and a non-sedentary goal manager 340 of the management unit 310, in some embodiments, the management unit 310 has more, fewer, and/or different types of managers. In embodiments with multiple types of managers, one or some combination of these managers is used at different times to interact with the user, e.g., by sending notifications to the user through the wearable electronic device of the user or through one of the other electronic devices based on sedentary states of the user to encourage the user to alter or end a sedentary behavior, as discussed in more detail below.


Embodiments describing a behavior triggered alert, e.g., a non-sedentary state transition, etc., follow.


In accordance with several embodiments, the user is notified, e.g., via display of a notification, etc., on the wearable electronic device upon detection that the sedentary period of time has ended and the non-sedentary period of time has begun, e.g., the user started moving, etc., and upon determination that the non-sedentary period of time exceeded a threshold period of time. While in some embodiments the threshold period of time is the same for all types of activities, in various embodiments the threshold period of time is different for at least certain types of activities. The user receives the notification, e.g., a message displayed on the display device of the wearable electronic device, a vibration of the wearable electronic device, and/or a congratulatory sound on the wearable electronic device, etc., that notifies the user that he/she has just ended a sedentary period of time. The notification is intended to encourage the user to keep moving and remain active to limit a total amount of time for which the state of the user is sedentary.



FIG. 4 illustrates a communication of a notification to the user based on a detection of an end of a sedentary period 424 of time and a beginning of a non-sedentary period 426 of time, which has exceeded the threshold period of time, according to some embodiments described in the present disclosure. In some embodiments, the time period detector 122 (FIG. 1A) detects the sedentary period 424 of time of the user and upon detection 412 that the state of the user has changed from sedentary to non-sedentary for the threshold period of time, the time period detector 122 notifies the non-sedentary state transition manager 320 (FIG. 3), which communicates 414 a notification, e.g., non-sedentary state transition notification information, etc., to the wearable electronic device of the user. As an example, it is detected that the state of the user has changed from sedentary to non-sedentary when a set of one or more moments of interest included in the non-sedentary period 426 of time meets the threshold period of time. In some embodiments, the state of the user is detected to be non-sedentary when the user performs one of a variety of activities, e.g., running, sprinting, walking briskly, puttering around, etc., for a period of time, e.g., 30 seconds, 10 seconds, 1 minute, 3 minutes, etc.


In various embodiments, the non-sedentary period 426 of time is determined by the time period detector 122 based on statuses of the user at moments of interest. In some embodiments, an end of the sedentary period 424 of time is detected when a non-sedentary state of the user is detected.


In several embodiments, the non-sedentary state is detected based on a type of activity that the user is performing. For example, when the user runs for 30 seconds, e.g., the status of the user is classified as “non-sedentary, running” for 30 seconds, etc., a change in the state of the user from sedentary to non-sedentary is detected and the user receives a notification. As another example, when the user sprints for at least 10 seconds, e.g., the status of the user is classified as “non-sedentary, sprinting” for at least 10 seconds, etc., a change in the state of the user from sedentary to non-sedentary is detected and the user receives a notification. As yet another example, when the user walks briskly for 1 minute, e.g., the status of the user is classified as “non-sedentary, walking” for 1 minute, etc., a change in the state of the user from sedentary to non-sedentary is detected and the user receives a notification. As still another example, when the user putters around for at least 3 minutes, e.g., the status of the user is classified as “non-sedentary, puttering” for at least 3 minutes, etc., a change in the state of the user from sedentary to non-sedentary is detected and the user receives a notification.
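The activity-specific rule in the examples above can be sketched as a small lookup, shown below in Python. The threshold table mirrors the examples in this paragraph; the activity labels are assumed to come from the user status classifier 120, and the function name is illustrative.

TRANSITION_THRESHOLDS_S = {
    "sprinting": 10,    # sprinting for at least 10 seconds
    "running": 30,      # running for 30 seconds
    "walking": 60,      # walking briskly for 1 minute
    "puttering": 180,   # puttering around for at least 3 minutes
}

def ended_sedentary_period(activity, seconds_active):
    """True when the current activity has lasted long enough to count as a
    change in the state of the user from sedentary to non-sedentary."""
    threshold = TRANSITION_THRESHOLDS_S.get(activity)
    return threshold is not None and seconds_active >= threshold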


In some embodiments, the time period detector 122 detects the sedentary period 424 of time and, upon detection 412 of a non-sedentary moment of interest, e.g., one or more moments of interest included in the non-sedentary period 426 of time, etc., the time period detector 122 notifies the non-sedentary state transition manager 320, which determines that the non-sedentary period of time exceeds the threshold period of time and communicates 414 a notification to the wearable electronic device of the user that the threshold period of time is met. The notification that the threshold period of time is met is indicative that the user has ended the sedentary period 424 of time and is now active. The notification is intended to encourage the user to be more active and break sedentary periods more often.


In various embodiments, the notification that the threshold period of time is met is in the form of a sentence, a motivational statement, or a positive message displayed on the display device of the wearable electronic device. A non-exhaustive list of exemplary motivational statements that indicate that the threshold period of time is met includes “good job!,” “great work,” “keep moving,” “keep going”; “don't stop”; “step more”; “xx hours stationary” (where xx is how long the user has been sedentary); “keep walking”; “take a xx minute walk” (where xx is a value between 1-20, for example 2); “you got up after xx hours yy minutes of sitting” (where xx is the number of hours and yy the number of minutes the user is sedentary); “you're at x steps. can you get to x+200?” (where x is the number of steps taken since ending the sedentary period); “take the long way?”; “walking meeting?”; “let's go!”; “Walk to success!”; “Let's call u butter cuz u on a roll!”; “you're on fire!”; “movin' and a groovin'”; “Don't stop movin' ! !”; “grab a friend, keep walking!”; “you can't catch me!”; “Good job, you sat around for x hours and now you finally got up”; “up and at 'em”; “walk like you just robbed a bank”; “Staying on top of it!”; “way to go”; “Way to move!”; “Way to work it!”; “you did it!”; “Looking healthy!”; “Good for you:)”; “Great!”; “score!”; and “nice!”, etc.


Embodiments describing rolling alerts follow.


In various embodiments, the user is notified through the wearable electronic device upon detection that the user has been sedentary for a threshold amount of time, which is sometimes referred to herein as a threshold sedentary time period. When the user has been sedentary for an extended period of time, a sedentary alert is communicated to the user to inform him/her of the extended period of time and encourage him/her to end the extended period. The sedentary alert is a message displayed on the display device of the wearable electronic device, a sound emitted by the wearable electronic device, and/or a vibration of the wearable electronic device. The sedentary alert is intended to encourage the user to start moving and become active to end the sedentary period.


In some embodiments, the threshold sedentary time period is 1 hour, or 2 hours, or 20 minutes, or 40 minutes, or a few seconds, or a few minutes, or a few hours. In various embodiments, the threshold sedentary time period is configurable, e.g., the user selects a length of a window of sedentary time after which he/she would like to be notified to end the sedentary time. In several embodiments, the threshold sedentary time period is dynamically adjusted, e.g., reduced or increased based on various factors, etc., and/or capable of being automatically disabled and/or manually disabled by the user.


In some embodiments, the user sets preferences for a type of alert to be received. For example, the user selects a sound, a particular message to be displayed, and/or a vibration. In various embodiments, the user sets the sedentary state monitor 124 (FIG. 1A) such that sedentary periods of time are monitored within specific intervals of time. For example, the sedentary or non-sedentary periods of time are monitored between 8 AM and 8 PM of a day and not monitored during the remaining times of the day.


In some embodiments, an input, e.g., one or more preferences, etc., is received from the user via an input device, e.g., a keypad, a touchscreen of the display device, a stylus, etc., of the wearable electronic device or via an input device, e.g., a keypad, a touchscreen of a display device, a stylus, a keyboard, a mouse, etc., of one of the other electronic devices. In embodiments in which the input is received at the wearable electronic device and the sedentary state monitor 124 is located within one of the other electronic devices, the input is communicated from a communication device of the wearable electronic device to a communication device of the one of the other electronic devices. The communication device of the one of the other electronic devices provides the input to the sedentary state monitor 124 of the one of the other electronic devices. Examples of a communication device include a device that applies a Bluetooth protocol, an Internet Protocol (IP), an Ethernet protocol, a Transmission Control Protocol over IP (TCP/IP) protocol, a Universal Serial Bus protocol, a serial transfer protocol, a parallel transfer protocol, etc. In embodiments in which the input is received at one of the other electronic devices and the sedentary state monitor 124 is located within the wearable electronic device, the input is communicated from the communication device of the one of the other electronic devices to the communication device of the wearable electronic device. The communication device of the wearable electronic device provides the input to the sedentary state monitor 124 of the wearable electronic device.



FIG. 5 illustrates communication of a sedentary alert to the user based on a detection of a sedentary period of time exceeding the threshold sedentary time period, according to various embodiments described in the present disclosure. The time period detector 122 (FIG. 1A) detects 516 a sedentary period 528 of time of the user and upon detection 516 that the sedentary period 528 of time exceeds the threshold sedentary time period ΔT, the sedentary alert manager 330 communicates 518 a sedentary alert, e.g., sedentary alert notification information, etc., to the wearable electronic device of the user. The sedentary alert is indicative that the user has spent more than the ΔT period of time in the sedentary state and he/she is encouraged to end the sedentary period 528 of time by doing a more active task, e.g., walking, running, or performing a physical activity with higher energy expenditure than being sedentary, etc. The sedentary alert is intended to inform the user of his/her sedentary behavior and encourage him/her to be more active. In some embodiments, the sedentary alert is in the form of a vibration of the wearable electronic device, a sound emitted through the speaker of the wearable electronic device, or a sentence or message displayed on the display device of the wearable electronic device.
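The rolling alert logic can be sketched as follows; the hourly values are the examples used in this disclosure, while the function name and the repeat policy are illustrative assumptions.

def sedentary_alert_due(sedentary_seconds, delta_t_s=3600,
                        repeat_every_s=3600, alerts_sent=0):
    """True when a sedentary alert should be communicated now: once the
    sedentary period exceeds delta_t_s, and again after each repeat interval
    while the user remains sedentary."""
    if sedentary_seconds < delta_t_s:
        return False
    alerts_due = (sedentary_seconds - delta_t_s) // repeat_every_s + 1
    return alerts_sent < alerts_due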


A non-exhaustive list of sedentary alerts includes: “time to get moving!”; “How about a walk?”; “Get moving”; “care for a stroll”; “Move those muscles!”; “Let's move”; “please get up”; “how about a walk?”; “step it up!”; “take a break”; “stretch your legs”; “you were still for xx minutes” (where xx is how long the user has been sedentary); “where you are today is where your mind put you!”; “take care of your body”; “get up!”; “don't just sit there”; “be the example”; “get up, stand up”; “get up, get fit!”; “you've been sitting for 1 hour”; “you know what time it is!”; “woof! let's walk!”; “feed me steps”; “it's never to late to move!”; “time to get steppin'”; “let's move”; “Move! Move! Move!”; “I'm bored. Let's shake it!”; “go get 'em!”; “You can do anything you set your mind to”; “Gonna fly now!”; “I dare you to move!”; “More steps please”; “Move your butt”; “UP UP UP”; “Stretccch”; “Waaalk”; “GETUPANDGO”; “Take a walk”; “Grab a friend, take a walk”; “When the going gets sedentary, the sedentary get going!”; “I believe you can fly!”; “What have you done today to make me feel proud?”; “I want to run”; “Seize the day!”; “Run away!”; “I'm after you!”; “The British are coming”; “Tick tick tick, you're blood sugar is rising”; “Shutup and step”; “Hungry for steps”; “Error: Steps too low”; “Step error”; “Have you forgotten what walking feels like?”; “If you were my Fitbit, I'd walk you”; “it's been a while”; “Time to get up!”; “Get moving”; “Stop being an Eeyore”; “Hop to it”; “Make like a bunny and hop to it”; “It's that time again!”; “Let's go get some water”; “Let's go for a jaunt”; “care for a stroll?”; “stretch them legs!” “streeeeeeaaaaatttccchhh”; “step up 2 the streets”; “now walk it out”; “walk it out”; “left right LEFT!”; and “lets go find some stairs!”, etc. In some embodiments, the user alternatively or additionally receives one or more icons, and/or one or more animated images, e.g., animated feet, animated steps, etc., displayed on the display device of the wearable electronic device and the one or more icons and/or the one or more animated images indicate that the user is sedentary for greater than the threshold sedentary time period ΔT.


In various embodiments, upon receipt of the sedentary alert, the user ends the sedentary time period. In several embodiments, upon receipt of the sedentary alert, the user remains sedentary and continues to receive sedentary alerts from the sedentary state monitor 124 at regular time intervals encouraging him/her to move. For example, the user receives sedentary alerts every hour.


In some embodiments, if the user ends the sedentary period 528 of time, the user receives, via the wearable electronic device, a congratulations message from the sedentary state monitor 124, sometimes also referred to herein as a celebration message, to encourage him/her in his/her effort of ending the sedentary period 528 of time, as described above. The celebration message is one of the celebration messages described further below.


Embodiments describing mini-goal alerts follow.


In some embodiments, the user is encouraged to achieve a mini goal, e.g., 250 steps, or 15 minutes of consecutive non-sedentary moments of interest, etc., during a predetermined window of time. The mini goal is a step towards achieving a predefined goal. The sedentary periods and the non-sedentary activity of the user are tracked by the non-sedentary goal manager 340 (FIG. 3), which interacts with the display device of the wearable electronic device and sends one or more notifications, e.g., mini-goal notification information, etc., to the display device of the wearable electronic device providing information on a progress of the user for reaching the mini goal. For example, the non-sedentary goal manager 340 sends an indication of the progress to notify the user, via a vibration of the wearable electronic device, and/or a message displayed on the display device of the wearable electronic device, and/or a sound emitted by the speaker of the wearable electronic device, of remaining activity to perform to achieve the mini goal before an end of the predetermined window of time. A non-exhaustive exemplary list of notifications and messages that the user receives as part of the mini goal includes “xx MORE STEPS”; “xx STEPS TO GO!”; “xx STEPS LEFT”; “TAKE xx STEPS BEFORE 3 PM”; “xx STEPS TO HOURLY GOAL”; “10 MIN TO GET xx STEPS!”; “xx/yy STEPS DOWN, xx TO GO!”; “every step counts! xx MORE this hour!”; “Only xx steps to go till yy”; and “take xx (animated feet/steps)”, where xx is replaced by a number of steps left, and yy is replaced with a total number of steps set to be achieved for the mini-goal.
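As a hedged illustration, progress messages in the style of the list above can be generated as follows; the 250 step goal is the example from this paragraph, the message strings are drawn from the lists in this disclosure, and the function name is hypothetical.

MINI_GOAL_STEPS = 250   # e.g., 250 steps per predetermined window of time

def mini_goal_message(steps_in_window):
    """Return a progress message, or a celebration message once the mini goal
    is met within the predetermined window of time."""
    remaining = MINI_GOAL_STEPS - steps_in_window
    if remaining <= 0:
        return "another moving hour!"          # a celebration message
    return f"{remaining} STEPS TO GO!"         # progress toward the mini goal

print(mini_goal_message(180))   # "70 STEPS TO GO!"
print(mini_goal_message(260))   # "another moving hour!"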


In some embodiments, instead of receiving an indication of how many steps remain or a length of an activity that remains to achieve the mini goal, the user receives a notification, e.g., a message, mini-goal notification information, etc., via a vibration of the wearable electronic device, and/or a message displayed on the display device of the wearable electronic device, and/or a sound emitted by the speaker of the wearable electronic device, asking him/her to start being active, e.g., walk, run, etc., and later receives a “celebration message” via a vibration of the wearable electronic device, and/or a message displayed on the display device of the wearable electronic device, and/or a sound emitted by the speaker of the wearable electronic device, for achieving the mini goal. For example, the non-sedentary goal manager 340 determines that the mini goal is achieved and provides a notification of achieving the mini goal to the user via a vibration of the wearable electronic device, and/or a message displayed on the display device of the wearable electronic device, and/or a sound emitted by the speaker of the wearable electronic device. The notification of achieving the mini goal is sometimes referred to herein as a mini celebration. For example, the mini celebration is “a buzz+smiley” when the user hits yy steps during the predetermined window of time set for the mini goal, e.g., during 1 hour, etc. The buzz is an example of a vibration of the wearable electronic device. In some embodiments, the wearable electronic device includes a tactile feedback device that vibrates to provide tactile feedback as a notification to the user.


While in some embodiments, the predetermined window of time for which the mini goal, e.g., a non-sedentary goal, etc., is to be achieved is 1 hour, in various embodiments a different predetermined time window, e.g., in a range of 10 minutes to 6 hours, or in a range of 20 minutes to 3 hours, or every 2 hours, etc., is used. In several embodiments, the predetermined window of time for which the mini goal, e.g., a non-sedentary goal, etc., is to be achieved is configurable, e.g., the user selects a length of the predetermined window of time for achieving the mini goal by setting preferences, in a manner described above. In some embodiments, the predetermined window of time for which the mini goal is to be achieved is dynamically adjusted, e.g., reduced or increased based on various factors, etc., by the sedentary state monitor 124. In some embodiments, the predetermined window of time for which the mini goal is to be achieved is capable of being automatically disabled by the sedentary state monitor 124 and/or manually disabled by the user.
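

A minimal sketch of such a configurable window follows, assuming (hypothetically) that a selected length is clamped to the 10-minute-to-6-hour range mentioned above; the field and function names are illustrative:

```python
# Hypothetical sketch of a configurable mini-goal window, clamped to the
# 10-minute-to-6-hour range mentioned above; all names are illustrative.

from dataclasses import dataclass

MIN_WINDOW_MIN, MAX_WINDOW_MIN = 10, 6 * 60

@dataclass
class MiniGoalConfig:
    window_minutes: int = 60   # default 1-hour window
    enabled: bool = True       # can be disabled automatically or by the user

    def set_window(self, minutes: int) -> None:
        """Clamp a user-selected or dynamically adjusted window to the allowed range."""
        self.window_minutes = min(max(minutes, MIN_WINDOW_MIN), MAX_WINDOW_MIN)

cfg = MiniGoalConfig()
cfg.set_window(5)        # too short -> clamped to 10 minutes
assert cfg.window_minutes == 10
```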


In various embodiments, the user further determines preferences regarding timing for receiving notifications and reminders regarding his/her progress towards the mini goal and provides the preferences via the input device of the wearable electronic device or via the input device of one of the other electronic devices to the sedentary state monitor 124. For example, the user desires to receive a notification a few minutes prior to the end of the predetermined window of time, e.g., at 50 minutes into the hour, etc., if the mini goal has not yet been achieved. The notification includes information indicating that the mini goal is yet to be achieved and remaining activities, e.g., a number of non-sedentary minutes, a number of steps, etc., to perform to achieve the mini goal. If the user completes the mini goal before the predetermined window of time ends, the user receives a rewarding message from the sedentary state monitor 124 and receives a prize from the sedentary state monitor 124 for that achievement.
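

The reminder timing described above might be evaluated as in the following sketch, where the function name and parameters are hypothetical; the 50-minutes-into-the-hour example corresponds to a 10-minute lead time:

```python
# Hypothetical sketch: fire a reminder N minutes before the window ends if the
# mini goal is still unmet (e.g., at 50 minutes into a 60-minute window).

def should_remind(elapsed_min: int, window_min: int, lead_min: int,
                  steps_taken: int, goal_steps: int) -> bool:
    """True once the reminder point is reached and the goal is not yet achieved."""
    at_reminder_point = elapsed_min >= window_min - lead_min
    goal_unmet = steps_taken < goal_steps
    return at_reminder_point and goal_unmet

# 50 minutes into the hour, 170/250 steps -> remind the user of the 80 steps left.
assert should_remind(50, 60, 10, 170, 250)
```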


A non-exhaustive exemplary list of celebration messages that the user receives for achieving the mini goal is presented herein: “great job!”; “:-D:-D:-D”; “:):):):)”; “xx/yy!!!”; “another moving hour!”; “winner”; “winner winner chicken dinner”; “champion! champion!”; “xx down!”; “very good!”; “every extra step matters!”; “you=step machine”; “you=on fire!”; “you=awesome!”; “hourly step champ”; “xx steps isn't even that much”; and “my hero”, where xx is replaced by a number of steps completed during the predetermined window of time allocated for that mini goal, and yy is replaced with a number of steps set to be achieved for the mini goal. Further, in some embodiments, the user competes with friends via the social network for most mini goals reached.


In various embodiments, the non-sedentary goal manager 340 tracks and records mini goals set and achieved by the user and presents the mini goals set and/or achieved to the user. The mini goals are presented to the user on the display device of the wearable electronic device or one of the other electronic devices, which receives the mini goals from the wearable electronic device via the communication device of the wearable electronic device and the communication device of the one of the other electronic devices. The user then views his/her sedentary behavior and tracks his/her improvement in achieving mini goals over time.


In several embodiments, the sedentary alert manager 330 differs from the non-sedentary goal manager 340 in that the sedentary alert manager 330 works on a current sedentary period of time that started when the state of the user transitioned to sedentary, while the non-sedentary goal manager 340 operates on set time windows irrespective of whether the state is sedentary at a beginning of each time window. As previously discussed, two or more of the managers are used in combination to interact with the user via the sedentary state monitor 124 and the wearable electronic device to alter his/her sedentary behavior.
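

The distinction can be illustrated with the following sketch, using hypothetical helper functions: the first trigger depends on the time elapsed since the transition to the sedentary state, while the second depends only on fixed clock windows:

```python
# Hypothetical sketch contrasting the two trigger styles described above:
# the sedentary alert manager measures time since the last transition to the
# sedentary state, while the non-sedentary goal manager evaluates fixed windows.

def sedentary_alert_due(now_min: float, sedentary_since_min: float,
                        threshold_min: float) -> bool:
    """Alert when the current sedentary period exceeds a threshold length."""
    return (now_min - sedentary_since_min) >= threshold_min

def goal_window_boundaries(now_min: float, window_min: float) -> tuple[float, float]:
    """Fixed window containing 'now', independent of the user's current state."""
    start = (now_min // window_min) * window_min
    return start, start + window_min

assert sedentary_alert_due(now_min=95, sedentary_since_min=60, threshold_min=30)
assert goal_window_boundaries(now_min=95, window_min=60) == (60.0, 120.0)
```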


Embodiments describing learning alerts follow.


In some embodiments, a sedentary learning unit 350 of the sedentary state monitor 124 (FIG. 3), is coupled to the managers 320, 330, and 340, and receives notification information, e.g., one or more notifications, etc., sent to the user via the sedentary state monitor 124 (FIG. 1A) from each one of the managers 320, 330, and 340, and determines which of the one or more notifications had an effect of modifying a sedentary behavior of the user. For example, the sedentary learning unit 350 determines which of the one or more notifications succeeded in altering general sedentary behavior of the user by limiting a length of the sedentary period of time.


While in some embodiments, each of the managers 320, 330, and 340 transmits the notification information regarding a time at which the one or more notifications, e.g., the non-sedentary state transition notification information, the sedentary alert notification information, and the mini-goal notification information, etc., are sent, in various embodiments, the managers 320, 330, and 340 transmit more, less, or different data as part of the notification information. For example, the managers 320, 330, and 340 transmit a type of notification, e.g., a message, a vibration, and/or a sound, to the sedentary learning unit 350. As another example, the managers 320, 330, and 340 transmit information regarding a result of a notification sent to the user, e.g., whether the user ended his/her sedentary period, etc., to the sedentary learning unit 350.
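

As a sketch of the kind of record a manager might forward to the learning unit (the field names and types are assumptions, not from the patent):

```python
# Hypothetical sketch of notification information forwarded to the learning
# unit: when the notification was sent, its type, and its observed result.

from dataclasses import dataclass
from enum import Enum

class NotificationType(Enum):
    MESSAGE = "message"
    VIBRATION = "vibration"
    SOUND = "sound"

@dataclass
class NotificationRecord:
    sent_at_minutes: int              # minute of the day the notification was sent
    kind: NotificationType            # message, vibration, and/or sound
    ended_sedentary_period: bool      # result: did the user become active?

record = NotificationRecord(sent_at_minutes=14 * 60 + 30,
                            kind=NotificationType.VIBRATION,
                            ended_sedentary_period=True)
```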


In some embodiments, the sedentary learning unit 350 receives the sensor data 150 (FIG. 1A) from the one or more sensors 110 (FIG. 1A) to determine whether a transmitted notification had an effect of modifying a sedentary behavior of the user. The sedentary learning unit 350 records the sensor data 150 over a period of time to learn which type of notification, e.g., personality or tone of a message, etc., and which context, e.g., a time, a location, etc., has a desired effect on the user. Examples of the desired effect include a decrease in long periods of sedentary time, a decrease in a number of the sedentary states over time, a decrease in a number of sedentary statuses over time, etc. The sedentary learning unit 350 determines improved preferences and settings, e.g., configuration parameters, etc., for configuring at least one of the managers 320, 330, and 340 based on the notification information received.
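

One simple, hypothetical way to quantify which type of notification and which context has the desired effect is to aggregate past outcomes into response rates per notification type and hour of day, as in this sketch (the tuple representation and function name are assumptions):

```python
# Hypothetical sketch: learn which notification type and hour of day have the
# desired effect by aggregating past outcomes into simple response rates.

from collections import defaultdict

def response_rates(records: list[tuple[str, int, bool]]) -> dict[tuple[str, int], float]:
    """records: (kind, minute_of_day, ended_sedentary_period) -> rate per (kind, hour)."""
    sent: dict[tuple[str, int], int] = defaultdict(int)
    worked: dict[tuple[str, int], int] = defaultdict(int)
    for kind, minute, success in records:
        key = (kind, minute // 60)
        sent[key] += 1
        worked[key] += int(success)
    return {key: worked[key] / sent[key] for key in sent}

rates = response_rates([("vibration", 870, True), ("vibration", 880, False),
                        ("message", 540, True)])
assert rates[("vibration", 14)] == 0.5 and rates[("message", 9)] == 1.0
```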


The sedentary learning unit 350 learns a behavior of the user in response to the notifications from the managers 320, 330, and 340 and reacts to the user's behavior to improve the user's response to the notifications. For example, when the sedentary learning unit 350 determines that the user never responds to a notification at a particular time of day, the sedentary learning unit 350 changes a configuration of one of the managers 320, 330, and 340, e.g., by transmitting configuration parameters to that manager, etc., to shift the notification to a time of day at which the user is more likely to respond. As another example, the sedentary learning unit 350 changes a configuration of the sedentary alert manager 330 by modifying a length of the threshold sedentary period after which a notification is sent to the user. As yet another example, the sedentary learning unit 350 modifies a type of notification sent to the user, e.g., configures one of the managers 320, 330, or 340 to send a message to be displayed on the display device of the wearable electronic device instead of a vibration alert, e.g., a buzz, etc., or to cause an emission of a sound by the speaker of the wearable electronic device instead of a vibration, or to change the sound emitted, or to change the tone of a message.


In some embodiments, the sedentary learning unit 350 changes a plurality of configuration parameters such that one of the managers 320, 330, and 340 operates at a given time of the day. For example, the sedentary learning unit 350 determines that between certain hours of the day, e.g., 8 AM to 12 PM, etc., the user's response to notifications received from the non-sedentary state transition manager 320 is better than the user's response to notifications received from the sedentary alert manager 330. In this example, the sedentary learning unit 350 determines configuration parameters that disable a use of the sedentary alert manager 330 during those hours, e.g., 8 AM to 12 PM, etc. While the sedentary alert manager 330 and the non-sedentary state transition manager 320 are described in this example, in various embodiments, the sedentary learning unit 350 determines configuration parameters to disable or enable another manager, e.g., the non-sedentary goal manager 340, etc., and/or determines other hours of day during which to configure the managers 320, 330, or 340. While in several embodiments, the sedentary learning unit 350 changes a configuration of at least one of the managers 320, 330, and 340, in some embodiments, the sedentary learning unit 350 transmits a recommendation of a configuration of at least one of the managers 320, 330, and 340 to the display device of the wearable electronic device to be approved by the user prior to the configuration of the at least one of the managers 320, 330, and 340 with the changed configuration parameters.
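

A minimal sketch of such time-of-day configuration parameters follows; the manager names and the dictionary-of-hour-ranges representation are assumptions for illustration:

```python
# Hypothetical sketch of configuration parameters that disable one manager
# during hours when another manager's notifications work better for the user.

def active_managers(hour: int, disabled_hours: dict[str, range]) -> list[str]:
    """Return the managers allowed to notify at this hour of day."""
    managers = ["non_sedentary_state_transition", "sedentary_alert", "non_sedentary_goal"]
    return [m for m in managers
            if hour not in disabled_hours.get(m, range(0))]

# Disable the sedentary alert manager from 8 AM to 12 PM, as in the example above.
config = {"sedentary_alert": range(8, 12)}
assert "sedentary_alert" not in active_managers(9, config)
assert "sedentary_alert" in active_managers(13, config)
```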


In various embodiments, the sedentary learning unit 350 allows the user to snooze the sedentary alert notification information, such that the wearable electronic device reminds the user at a later time to perform a non-sedentary activity. The sedentary learning unit 350 records data related to an act, performed by the user via the input device of the wearable electronic device or the input device of one of the other electronic devices, of snoozing the sedentary alert notification information. For example, the data related to the act of snoozing includes a type of notification snoozed, a time at which the notification is snoozed, the state, e.g., sedentary, non-sedentary, etc., of the user at that time, a geographical location at which the notification is snoozed, etc. The sedentary learning unit 350 uses the data related to the act to change a configuration of one or more of the managers 320, 330, and 340. For example, if the user snoozes notifications from one of the managers 320, 330, and 340 at a particular time of day, a configuration of that manager is changed to avoid operating during that time. This is used to improve the user's experience and instill greater confidence in the wearable electronic device. In some embodiments, the sedentary learning unit 350 is implemented using one or more of the following: a decision tree, a random forest, a support vector machine, a neural network, a K-nearest neighbor classifier, a Naïve Bayes classifier, and a hidden Markov model.
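

For illustration, snooze events might be aggregated as in the following sketch, where the threshold of three snoozes per hour is an invented cutoff, not a value from the patent:

```python
# Hypothetical sketch: record snooze events and suppress a manager during the
# hours in which the user has repeatedly snoozed its notifications.

from collections import Counter

SNOOZE_SUPPRESS_THRESHOLD = 3   # illustrative cutoff, not from the patent

def hours_to_avoid(snooze_events: list[tuple[str, int]], manager: str) -> set[int]:
    """snooze_events: (manager, hour_of_day). Return hours where snoozes pile up."""
    counts = Counter(hour for m, hour in snooze_events if m == manager)
    return {hour for hour, n in counts.items() if n >= SNOOZE_SUPPRESS_THRESHOLD}

events = [("sedentary_alert", 9)] * 3 + [("sedentary_alert", 15)]
assert hours_to_avoid(events, "sedentary_alert") == {9}
```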


In various embodiments, the user is able to set preferences for a type of notification received based on his/her sedentary behavior. For example, the user is able to select a subset of subunits, e.g., the non-sedentary state transition manager 320, the sedentary alert manager 330, the non-sedentary goal manager 340, etc., of the sedentary state monitor 124 via the input device of the wearable electronic device or the input device of one of the other electronic devices for use in notifying the user based on his/her sedentary behavior. Furthermore, the user is able to select a sound, a particular message to be displayed on the display device of the wearable electronic device, and a vibration for each type of message received on the wearable electronic device. The user chooses a combination of the types of notifications to be received simultaneously. The user sets the sedentary state monitor 124 via the input device of the wearable electronic device or the input device of one of the other electronic devices such that the sedentary periods are monitored within specific intervals of time. For example, the user desires to monitor the sedentary periods or the non-sedentary periods during a day between 8 AM and 8 PM.


Embodiments describing a classification of the status of the user based on the MET measures follow.


In some embodiments, the MET measures are used to determine the sedentary status or the non-sedentary status of the user. Thus, the MET measures are sometimes referred to as sedentary coefficients. The user status classifier 120 (FIG. 1A) receives a MET measure for a moment of interest and determines whether the MET measure is below a predetermined threshold. When the MET measure is below the predetermined threshold, the status of the user for that moment of interest is classified as being the sedentary status by the user status classifier 120. When the MET measure is above the predetermined threshold, the status of the user is classified as being non-sedentary for that moment of interest by the user status classifier 120 and the user is determined to be active by the user status classifier 120.
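

A minimal sketch of this single-threshold classification follows; the 1.5 MET default is one example value from the range discussed with FIG. 6A below, and treating a MET measure exactly equal to the threshold as sedentary is an assumption, since the patent only specifies the above/below cases:

```python
# Hypothetical sketch of the single-threshold classification described above.
# The 1.5 MET default is one example value; equal-to-threshold is treated as
# sedentary (an assumption, as the patent specifies only above/below).

SEDENTARY, NON_SEDENTARY = "sedentary", "non-sedentary"

def classify_moment(met: float, threshold: float = 1.5) -> str:
    """Status for one moment of interest from its MET measure."""
    return NON_SEDENTARY if met > threshold else SEDENTARY

assert classify_moment(1.1) == SEDENTARY
assert classify_moment(3.4) == NON_SEDENTARY
```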



FIG. 6A illustrates a recordation of the MET measures at consecutive moments of interest over a period of time and a classification of a status 612 of the user by the user status classifier 120 (FIG. 1A) at each moment of interest based on the MET measures, in accordance with some embodiments described in the present disclosure. The status 612 of the user is classified 614 as being non-sedentary by the user status classifier 120 at a moment of interest based on a MET measure exceeding a threshold MET value at that moment of interest, according to several embodiments described in the present disclosure. The MET measures are generated by the sensor data analyzer 112 (FIG. 1A) for a number of moments of interest, e.g., F1, F2 . . . FN, etc., and each MET measure, e.g., MET value, etc., is compared by the user status classifier 120 with the threshold MET value used for determination 630 of the sedentary status. MET measures 624 are all below the threshold MET value as determined by the user status classifier 120, and each moment of interest for the MET measures 624 is classified 614 and recorded as having the sedentary status by the user status classifier 120 within a memory device, e.g., the computer-readable media, etc., of the wearable electronic device or of one of the other electronic devices. Comparatively, MET measures 626 all exceed the threshold MET value as determined by the user status classifier 120, and each moment of interest for the MET measures 626 is recorded by the user status classifier 120 within the memory device as having the non-sedentary status. MET measures 628 are associated with sedentary and non-sedentary moments of interest by the user status classifier 120. For example, two of the MET measures 628 are above the threshold MET value as determined by the user status classifier 120, and each moment of interest 628B for the two of the MET measures 628 is identified as having the non-sedentary status by the user status classifier 120. The two other illustrated MET values of the MET measures 628 are below the threshold MET value as determined by the user status classifier 120. Each moment of interest 628A and 628C for the two other MET values is identified as having the sedentary status by the user status classifier 120. In some embodiments, the threshold MET value is within the range of 0.8-1.8 MET, e.g., 1.5 MET, etc. The classification 614 of the moments of interest illustrated at FIG. 6A yields the sedentary and non-sedentary moments of interest illustrated in FIG. 2.



FIG. 6B illustrates a recordation of the MET measures at consecutive moments of interest over a period of time and a classification of the status of the user by the user status classifier 120 (FIG. 1A) at each moment of interest based on the MET measures, in accordance with various embodiments described in the present disclosure. The status of the user is classified as being non-sedentary at a moment of interest based on a MET measure exceeding a first threshold MET value at that moment of interest. The status of the user is classified as being sedentary at a moment of interest based on a MET measure being below a second threshold MET value at that moment of interest. In addition, the status of the user is classified as being sedentary at a moment of interest if the MET value exceeds the second threshold MET value, is below the first threshold MET value, and the moment of interest is further preceded and followed by a moment of interest with a sedentary status. In some embodiments, a group of N consecutive moments of interest is classified as having the sedentary status if a MET measure associated with each moment is between the first and the second threshold MET values and the group of N consecutive moments of interest is immediately preceded and succeeded by a moment of interest with a sedentary status. Examples of N consecutive moments of interest include moments of interest each occurring at 1 minute time intervals (e.g., where N is between 1 and 5), or each occurring at time intervals between 1 minute and 5 minutes, or each occurring at 1 second time intervals (e.g., where N is between 1 and 300), or each occurring at time intervals between 1 second and 300 seconds, etc. If the N moments of interest occur at longer time intervals, e.g., every 10 minutes, etc., then N is smaller, e.g., 2, etc. In the embodiments discussed above, the second threshold MET value is lower than the first threshold MET value.
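

The following sketch illustrates the two-threshold scheme under stated assumptions: the threshold values are invented, boundary equality is resolved arbitrarily, and a run of in-between moments that is not bracketed by sedentary moments is treated as non-sedentary, a case the patent leaves unspecified:

```python
# Hypothetical sketch of the two-threshold classification of FIG. 6B. A run of
# moments whose MET values fall between the thresholds is absorbed into the
# sedentary status only when immediately preceded and followed by sedentary
# moments; the fallback to non-sedentary otherwise is an assumption.

def classify_series(mets: list[float], first_thr: float = 1.5,
                    second_thr: float = 1.0) -> list[str]:
    """first_thr (upper) marks non-sedentary; second_thr (lower) marks sedentary."""
    # First pass: definite statuses; None marks an in-between moment.
    raw = [("non-sedentary" if m > first_thr else
            "sedentary" if m < second_thr else None) for m in mets]
    out = list(raw)
    i = 0
    while i < len(out):
        if out[i] is None:
            j = i
            while j < len(out) and out[j] is None:
                j += 1
            # Absorb the run only when it is bracketed by sedentary moments.
            bracketed = (i > 0 and out[i - 1] == "sedentary"
                         and j < len(out) and out[j] == "sedentary")
            out[i:j] = ["sedentary" if bracketed else "non-sedentary"] * (j - i)
            i = j
        else:
            i += 1
    return out

# A run of in-between MET values bracketed by sedentary moments stays sedentary.
assert classify_series([0.9, 1.2, 1.3, 0.9]) == ["sedentary"] * 4
```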


Each of the MET measures generated by the sensor data analyzer 112 is compared by the user status classifier 120 with the first threshold MET value used for recording a non-sedentary status 632 and with the second threshold MET value used for recording a sedentary status 634. The recordation of the non-sedentary status 632 and the sedentary status 634 in the memory device is performed by the user status classifier 120. MET measures 646 are all above the first threshold MET value, and each moment of interest for the MET measures 646 is determined by the user status classifier 120 to have the non-sedentary status and is recorded by the user status classifier 120. The MET measures 644 are all below the second threshold MET value as determined by the user status classifier 120, and each moment of interest for the MET measures 644 is recorded by the user status classifier 120 as having the sedentary status. Comparatively, some of the MET measures 648 exceed the second threshold MET value but are below the first threshold MET value as determined by the user status classifier 120, while other ones of the MET measures 648 are below the second threshold MET value as determined by the user status classifier 120. A first moment of interest of a group of contiguous moments of interest for the MET measures 648 has a MET value below the second threshold MET value as determined by the user status classifier 120, while a second moment of interest and a third moment of interest for the MET measures 648 have MET values between the first and the second threshold MET values as determined by the user status classifier 120, immediately followed by moments of interest with a MET value below the second threshold MET value. In this example, all moments of interest for the MET measures 648 are determined by the user status classifier 120 as having the sedentary status despite the contiguous group including two moments with MET measures exceeding the second threshold MET value.


As described above, in some embodiments, a MET measure determines the non-sedentary status of the user associated with a particular type of activity of the user. For example, according to a MET measure, the user status classifier 120 determines whether the user is running, walking, sprinting, bicycling, swimming, or performing another type of non-sedentary activity. To further illustrate, if a MET measure is within a range of 2.5 to 3.2, a status of the user is classified as “non-sedentary, bicycling”. As another example, if a MET measure is within a range of 3.2 to 3.8, a status of the user is classified as “non-sedentary, walking”. As yet another example, if a MET measure is between 6.7 and 7.3, e.g., 7.0, etc., a status of the user is classified as “non-sedentary, jogging”.
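

These example ranges might be encoded as a lookup table, as in the following sketch; the half-open interval convention and the fallback of returning None outside the listed ranges are assumptions:

```python
# Hypothetical sketch mapping MET ranges to activity labels, using the example
# ranges given above; interval boundaries and the None fallback are assumptions.

from typing import Optional

ACTIVITY_RANGES = [
    (2.5, 3.2, "non-sedentary, bicycling"),
    (3.2, 3.8, "non-sedentary, walking"),
    (6.7, 7.3, "non-sedentary, jogging"),
]

def activity_for_met(met: float) -> Optional[str]:
    """Return the activity label whose MET range contains the measure, if any."""
    for low, high, label in ACTIVITY_RANGES:
        if low <= met < high:
            return label
    return None

assert activity_for_met(7.0) == "non-sedentary, jogging"
```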


Embodiments describing a classification based on other sensor information follow.


In some embodiments, the sedentary status of the user for a moment of interest is determined by the user status classifier 120 based on the sensor data 150 (FIG. 1A), e.g., the motion sensor data and/or the biometric data, etc., received from the one or more sensors 110 (FIG. 1A) without the generation of the MET measures. For example, the sedentary status of the user is determined based on the motion measures, e.g., also sometimes referred to as movement measures, etc., and/or the heart rate measures without calculation of MET measures. Some embodiments of a classification of a status of the user at a moment of interest are described in U.S. Pat. No. 8,548,770 “Portable monitoring devices and methods of operating same,” which is incorporated by reference herein in its entirety.


A description of exemplary devices with automatic detection of the user's sedentary state or the non-sedentary state and provision of notifications to the user based on the sedentary state follows.


As previously described, while in some embodiments, one or more of the operations described above are implemented in the wearable electronic device, in various embodiments, one or more of the operations are distributed among electronic devices, e.g., the wearable electronic device and the other electronic devices, etc. FIG. 7 illustrates an example of one such distribution. FIG. 7 is a block diagram illustrating a wearable electronic device 702 and an electronic device 700 implementing operations disclosed according to various embodiments described in the present disclosure. The electronic device 700 is an example of one of the other electronic devices. The wearable electronic device 702 includes a processor 742 and the one or more sensors 110. In some embodiments, instead of the processor 742, multiple processors are used in the wearable electronic device 702.


In some embodiments, the one or more sensors 110 include the motion sensor 727, examples of which include a multi-axis accelerometer, a gyroscope, a gravity sensor, a rotation vector sensor, and a magnetometer. Moreover, in various embodiments, the one or more sensors 110 include one or more other sensors 714, which include a photoplethysmographic sensor 720. In several embodiments, the one or more other sensors 714 include a temperature sensor 721, an ambient light sensor 722, a galvanic skin response sensor 723, a capacitive sensor 724, a humidity sensor 725, and a sound sensor 726.


The wearable electronic device 702 also includes a non-transitory machine readable storage medium 718, which contains the sensor data analyzer 112 as discussed herein above. When executed by the processor 742, the sensor data analyzer 112 causes the wearable electronic device 702 to generate the analyzed sensor information 152 for moments of interest. The wearable electronic device 702 performs functionalities relating to the user status classifier 120, the time period detector 122, and/or the sedentary state monitor 124, some or all of which are included in a sedentary tracking and notification module (STNM) 750, which is stored in the non-transitory machine readable storage medium 718. When executed by the processor 742, the STNM 750 causes the wearable electronic device 702 to perform corresponding operations discussed herein above. The wearable electronic device 702 further includes the user interface 126 having a display device 732. Examples of a display device include a liquid crystal display (LCD) device, a light emitting diode (LED) display device, a plasma display device, etc. In some embodiments, the user interface 126 includes a speaker, a haptic screen, and/or a vibration mechanism, e.g., a haptic communication device, a rumble pack, a kinesthetic communication device, etc., to allow communication and interaction with the user wearing the wearable electronic device 702.


In some embodiments, the one or more other sensors 714 are not placed within the wearable electronic device 702. The one or more other sensors 714 are distributed around the user. For example, the one or more other sensors 714 are placed on a chest of the user, or a mattress on which the user lies, or a bedside table located by the user, while the wearable electronic device 702 is worn by the user.



FIG. 7 also includes an embodiment of the electronic device 700, e.g., the server including hardware and software, a tablet, a smartphone, etc., containing an application. In some embodiments, the electronic device 700 performs functionalities relating to the user status classifier 120, the time period detector 122, and/or the sedentary state monitor 124, some or all of which are included in the STNM 750, which is stored in a non-transitory machine readable storage medium 748 of the electronic device 700. For example, the STNM 750, instead of being stored in the non-transitory machine readable storage medium 718 of the wearable electronic device 702, is stored in the non-transitory machine readable storage medium 748 of the electronic device 700 for execution by a processor 752 of the electronic device 700. In some embodiments, the sensor data analyzer 112 is stored in the non-transitory machine readable storage medium 748 instead of being stored in the non-transitory machine readable storage medium 718, and is executed by the processor 752.


When executed by the processor 752, the STNM 750 causes the electronic device 700 to perform corresponding operations discussed herein above. In some embodiments, the electronic device 700 contains virtual machines (VMs) 762A to 762R, each of which executes a software instance 766 or a software instance 768 of the STNM 750. A hypervisor 754 presents a virtual operating platform for the virtual machines 762A to 762R.


The wearable electronic device 702 collects one or more types of the sensor data 150, e.g., biometric data, etc., from the one or more sensors 110 and/or external devices, and then utilizes the sensor data 150 in a variety of ways. Examples of the biometric data include data pertaining to physical characteristics of a human body, such as, for example, a heartbeat, a heart rate, perspiration levels, etc. Other examples of the sensor data 150 include data relating to a physical interaction of the human body with an environment, such as accelerometer readings, gyroscope readings, etc. An example of the external devices includes an external heart rate sensor or monitor, e.g., a chest-strap heart rate sensor or monitor, etc. Examples of utilizing the sensor data 150 in the variety of ways include making calculations based on the sensor data 150, storing the sensor data 150, storing the calculations in the non-transitory machine readable storage medium 718, automatically acting on the sensor data 150, automatically acting on the calculations, communicating the sensor data 150 to a communication device, such as, for example, one or more network interface controllers 744, etc., of the electronic device 700 over the computer network, such as, for example, the Internet, a wide-area network, a local area network, etc., and communicating the calculations to the communication device over the computer network. Examples of automatically acting on the calculations include an automatic watch check and dismissal gesture detection. As described herein, the wearable electronic device 702 also receives data, e.g., notifications, etc., from one of the other electronic devices for storage and/or display on the display device 732.


In some embodiments, the electronic device 700 includes a display device for presenting any notifications described herein, e.g., the non-sedentary state transition notification information, the sedentary alert notification information, and the mini-goal notification information, etc., which are received from the wearable electronic device 702. For example, the sedentary state monitor 124 of the wearable electronic device 702 generates a notification and sends the notification via a communication device of the wearable electronic device 702 and a communication device of the electronic device 700 to the display device of the electronic device 700 for display on the display device of the electronic device 700.


In various embodiments, the sensor data 150 is obtained by the wearable electronic device 702 and sent via the communication device of the wearable electronic device 702 and the communication device of the electronic device 700 to the STNM 750 of the electronic device 700 for performing the operations described herein.


In several embodiments, a notification is generated by the electronic device 700 and is sent from the one or more network interface controllers 744 to the communication device of the wearable electronic device 702 via the computer network for display of the notification on the display device 732. In various embodiments, the sedentary state monitor 124 of the electronic device 700 generates a notification and sends the notification via a communication device of the electronic device 700 and a communication device of the wearable electronic device 702 to the display device 732 for display on the display device 732.



FIG. 8 is a block diagram of an embodiment of a wrist-mounted electronic device having a button, a display, and a wrist band to secure the wrist-mounted electronic device to a user's forearm, according to several embodiments described in the present disclosure. For example, FIG. 8 depicts the wearable electronic device 702, such as illustrated in FIG. 7, that is worn on the user's forearm, like a wristwatch. In FIG. 8, the wrist-mounted electronic device has a housing 802 that contains electronics, e.g., components illustrated in FIG. 7, etc., associated with the wrist-mounted electronic device, a button 804, and a display screen 806 accessible or visible through the housing 802. The display screen 806 is part of the display device 732 (FIG. 7). A wristband 808 is integrated with the housing 802.


In some embodiments, the wrist-mounted electronic device incorporates one or more user interfaces including, but not limited to, visual, auditory, touch/vibration, or combinations thereof. In some embodiments, the wrist-mounted electronic device provides haptic feedback through, for instance, a vibration of a motor. In some implementations, the one or more sensors 110 (FIG. 1A) are used as part of the one or more user interfaces, e.g., accelerometer sensors are used to detect when the user taps the housing 802 of the wrist-mounted electronic device with a finger or other object, and the wrist-mounted electronic device then interprets such data as a user input for purposes of controlling the wrist-mounted electronic device. For example, double-tapping of the housing 802 of the wrist-mounted electronic device is recognized by the wrist-mounted electronic device as a user input.
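

A tap gesture of this kind might be detected from accelerometer magnitude spikes, as in the following hypothetical sketch; the spike threshold and the maximum spacing between taps are invented values, and the patent does not prescribe a detection algorithm:

```python
# Hypothetical sketch of double-tap detection from accelerometer magnitude
# spikes; threshold and tap spacing are illustrative, not from the patent.

def detect_double_tap(samples: list[float], spike: float = 2.5,
                      max_gap_samples: int = 25) -> bool:
    """True when two magnitude spikes occur within max_gap_samples of each other."""
    spikes = [i for i, a in enumerate(samples) if a >= spike]
    return any(0 < b - a <= max_gap_samples for a, b in zip(spikes, spikes[1:]))

# Two taps about 10 samples apart register as a double tap.
trace = [1.0] * 5 + [3.0] + [1.0] * 9 + [3.1] + [1.0] * 5
assert detect_double_tap(trace)
```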


While FIG. 8 illustrates an implementation of the wrist-mounted electronic device, in some embodiments, the wrist-mounted electronic device has other shapes and sizes adapted for coupling to, e.g., secured to, worn, borne by, etc., a body or clothing of the user. For example, the wrist-mounted electronic device is designed such that it is inserted into, and removed from, a plurality of compatible cases or housings or holders, e.g., a wristband that is worn on the user's forearm or a belt clip case that is attached to the user's clothing. As used herein, the term “wristband” refers to a band that is designed to fully or partially encircle the user's forearm near a wrist joint. The band is continuous, e.g., without any breaks, or is discontinuous, or is simply open. An example of the continuous band includes a band that stretches to fit over the user's hand or has an expanding portion similar to a dress watchband. An example of the discontinuous band includes a band having a clasp or other connection allowing the band to be closed, similar to a watchband. An example of the open band is one having a C-shape that clasps the user's wrist.


It should be noted that in some embodiments, information, e.g., notifications, etc., are accessed by the user after logging into a user account. For example, the user provides his/her user information, e.g., user name, password, etc., and when the user information is authenticated by the server, the user logs into the user account. In these embodiments, the notifications are posted within the user account. The user account is stored on the server.


In some embodiments, the user accesses the user account to view graphs illustrated in FIGS. 2, 4, 5, 6A, and 6B. The graphs are viewed on the display device of the wearable electronic device or on the display device on one of the other electronic devices.


Some embodiments of the wearable electronic device and of one of the other electronic devices are described in application Ser. No. 15/048,965, filed on Feb. 19, 2016 and titled “Generation of Sedentary Time Information by Activity Tracking Device”, in application Ser. No. 15/048,972, filed on Feb. 19, 2016 and titled “Temporary Suspension of Inactivity Alerts in Activity Tracking Device”, in application Ser. No. 15/048,976, filed on Feb. 19, 2016 and titled “Live Presentation of Detailed Activity Captured by Activity Tracking Device”, and in application Ser. No. 15/048,980, filed on Feb. 19, 2016 and titled “Periodic Inactivity Alerts and Achievement Messages”, all of which are incorporated by reference herein in their entirety.


It should be noted that in an embodiment, one or more features from any embodiment described herein are combined with one or more features of any other embodiment described herein without departing from a scope of various embodiments described in the present disclosure.


While the invention has been described in terms of several embodiments, those skilled in the art will recognize that the invention is not limited to the embodiments described, and can be practiced with modification and alteration within the spirit and scope of the appended claims. The description is thus to be regarded as illustrative instead of limiting.

Claims
  • 1. A system comprising: a wearable electronic device to be worn by a user, the wearable electronic device including one or more sensors configured to generate sensor data for a plurality of moments of interest; one or more processors; and a non-transitory machine-readable storage medium storing computer-executable instructions which, when executed by the one or more processors, cause the one or more processors to: obtain sensor data generated by the one or more sensors of the wearable electronic device, determine analyzed sensor information for each moment of interest of a first set of moments of interest of the plurality of moments of interest based on the sensor data for that corresponding moment of interest, the first set corresponding to a first window of time, wherein the analyzed sensor information indicates whether the user was sedentary or non-sedentary for the corresponding moment of interest, determine, at a first point within the first window of time, that a first goal for the first window of time has not been achieved based at least on the analyzed sensor information for the moments of interest of the first set, wherein the first goal is an amount of non-sedentary activity, and cause a first notification to be generated responsive to determining that the first goal has not been achieved at the first point within the first window of time.
  • 2. The system of claim 1, wherein the first goal is based on input received from a user.
  • 3. The system of claim 1, wherein the first goal is a number of steps taken during the first window of time.
  • 4. The system of claim 1, wherein the non-transitory machine-readable storage medium stores additional computer-executable instructions to cause the one or more processors to: classify each moment of interest of the first set into a status of a plurality of statuses, wherein the statuses include a sedentary status and a non-sedentary status, based at least on the analyzed sensor information, and determine whether the first goal for the first window of time has been achieved based on the statuses for the moments of interest of the first set.
  • 5. The system of claim 4, wherein the non-transitory machine-readable storage medium stores additional computer-executable instructions to cause the one or more processors to determine whether the first goal for the first window of time has been achieved based on whether a duration of consecutive moments of interest of the first set that have been classified as having the non-sedentary status exceeds a threshold.
  • 6. The system of claim 1, wherein the duration of the first window of time is based on input received from a user.
  • 7. The system of claim 1, wherein the duration of the first window of time is between 10 minutes and 6 hours.
  • 8. The system of claim 1, wherein the duration of the first window of time is 60 minutes and the first point is preset at 10 minutes before the end of the first window of time.
  • 9. The system of claim 1, wherein the first point is a predetermined amount of time prior to the end of the first window of time.
  • 10. The system of claim 9, wherein the amount of time is based on input received from a user.
  • 11. The system of claim 1, wherein the non-transitory machine-readable storage medium stores additional computer-executable instructions to cause the one or more processors to: determine, at a second point in time after the first point in time and within the first window of time, that the first goal for the first window of time has been achieved based at least on the analyzed sensor information for the moments of interest of the first set; and cause a second notification to be generated responsive to determining that the first goal has been achieved.
  • 12. The system of claim 1, wherein the non-transitory machine-readable storage medium stores additional computer-executable instructions to cause the one or more processors to: determine analyzed sensor information for each moment of interest of a second set of moments of interest of the plurality of moments of interest based on the sensor data for that corresponding moment of interest, the second set corresponding to a second window of time, the second window of time immediately after the first window of time, determine, at a second point within the second window of time, that a second goal for the second window of time has not been achieved based at least on the analyzed sensor information for the moments of interest of the second set, and cause a second notification to be generated responsive to determining that the goal has not been achieved at the second point within the second window of time.
  • 13. The system of claim 12, wherein the duration of the first window of time and the duration of the second window of time are equal.
  • 14. The system of claim 12, wherein the first point and the second point are the same amount of time prior to the end of the corresponding window of time.
  • 15. The system of claim 12, wherein the first goal and the second goal are equal.
  • 16. The system of claim 1, wherein the non-transitory machine-readable storage medium stores additional computer-executable instructions to cause the one or more processors to: classify each moment of interest in the plurality of moments of interest into a status of a plurality of statuses, the plurality of statuses including a sedentary status and a non-sedentary status, based at least on the analyzed sensor information; detect a first time period for which a number of consecutive moments of interest have been classified as having the sedentary status; classify the first time period as a sedentary state based on the first time period being greater than a threshold value; and cause a notification to be generated based on the first time period being greater than the threshold value.
  • 17. The system of claim 16, wherein the non-transitory machine-readable storage medium stores additional computer-executable instructions to cause the one or more processors to: determine that a user associated with the wearable electronic device is asleep or is not wearing the wearable electronic device during a first moment of interest; and not classify the first moment of interest as having the sedentary status or the non-sedentary status.
  • 18. The system of claim 1, wherein the analyzed sensor information is a metabolic equivalent of task measure, motion measure, or heart rate measure.
  • 19. The system of claim 1, wherein the moments of interest occur at regular time intervals.
  • 20. The system of claim 1, wherein the analyzed sensor information for each moment of interest is a single value.
  • 21. The system of claim 1, wherein the one or more processors and the non-transitory machine-readable storage medium are located in the wearable electronic device.
  • 22. The system of claim 1, wherein at least one of the one or more processors and the non-transitory machine-readable storage medium are located in a separate electronic device that is not the wearable electronic device.
US Referenced Citations (567)
Number Name Date Kind
2284849 Anderson et al. Aug 1941 A
2717736 Schlesinger Sep 1955 A
2827309 Fred Mar 1958 A
2883255 Anderson Apr 1959 A
3163856 Kirby Dec 1964 A
3250270 Lyon May 1966 A
3522383 Chang Jul 1970 A
3918658 Beller Nov 1975 A
4192000 Lipsey Mar 1980 A
4244020 Ratcliff Jan 1981 A
4281663 Pringle Aug 1981 A
4284849 Anderson et al. Aug 1981 A
4312358 Barney Jan 1982 A
4367752 Jimenez et al. Jan 1983 A
4390922 Pelliccia Jun 1983 A
4407295 Steuer et al. Oct 1983 A
4425921 Fujisaki et al. Jan 1984 A
4466204 Wu Aug 1984 A
4575804 Ratcliff Mar 1986 A
4578769 Frederick Mar 1986 A
4617525 Lloyd Oct 1986 A
4855942 Bianco Aug 1989 A
4887249 Thinesen Dec 1989 A
4930518 Hrushesky Jun 1990 A
4977509 Pitchford et al. Dec 1990 A
5058427 Brandt Oct 1991 A
5099842 Mannheimer et al. Mar 1992 A
5224059 Nitta et al. Jun 1993 A
5295085 Hoffacker Mar 1994 A
5314389 Dotan May 1994 A
5323650 Fullen et al. Jun 1994 A
5365930 Takashima et al. Nov 1994 A
5446705 Haas et al. Aug 1995 A
5456648 Edinburg et al. Oct 1995 A
5485402 Smith et al. Jan 1996 A
5553296 Forrest et al. Sep 1996 A
5583776 Levi et al. Dec 1996 A
5645509 Brewer et al. Jul 1997 A
5671162 Werbin Sep 1997 A
5692324 Goldston et al. Dec 1997 A
5704350 Williams, III Jan 1998 A
5724265 Hutchings Mar 1998 A
5817008 Rafert et al. Oct 1998 A
5890128 Diaz et al. Mar 1999 A
5891042 Sham et al. Apr 1999 A
5894454 Kondo Apr 1999 A
5899963 Hutchings May 1999 A
5941828 Archibald et al. Aug 1999 A
5947868 Dugan Sep 1999 A
5955667 Fyfe Sep 1999 A
5976083 Richardson et al. Nov 1999 A
6018705 Gaudet et al. Jan 2000 A
6077193 Buhler et al. Jun 2000 A
6078874 Piety et al. Jun 2000 A
6085248 Sambamurthy et al. Jul 2000 A
6129686 Friedman Oct 2000 A
6145389 Ebeling et al. Nov 2000 A
6183425 Whalen et al. Feb 2001 B1
6213872 Harada et al. Apr 2001 B1
6241684 Amano et al. Jun 2001 B1
6287262 Amano et al. Sep 2001 B1
6301964 Fyfe et al. Oct 2001 B1
6302789 Harada et al. Oct 2001 B2
6305221 Hutchings Oct 2001 B1
6309360 Mault Oct 2001 B1
6454708 Ferguson et al. Sep 2002 B1
6469639 Tanenhaus et al. Oct 2002 B2
6478736 Mault Nov 2002 B1
6513381 Fyfe et al. Feb 2003 B2
6513532 Mault et al. Feb 2003 B2
6527711 Stivoric et al. Mar 2003 B1
6529827 Beason et al. Mar 2003 B1
6558335 Thede May 2003 B1
6561951 Cannon et al. May 2003 B2
6571200 Mault May 2003 B1
6583369 Montagnino et al. Jun 2003 B2
6585622 Shum et al. Jul 2003 B1
6607493 Song Aug 2003 B2
6620078 Pfeffer Sep 2003 B2
6678629 Tsuji Jan 2004 B2
6699188 Wessel Mar 2004 B2
6761064 Tsuji Jul 2004 B2
6772331 Hind et al. Aug 2004 B1
6788200 Jamel et al. Sep 2004 B1
6790178 Mault et al. Sep 2004 B1
6808473 Hisano et al. Oct 2004 B2
6811516 Dugan Nov 2004 B1
6813582 Levi et al. Nov 2004 B2
6813931 Yadav et al. Nov 2004 B2
6856938 Kurtz Feb 2005 B2
6862575 Anttila et al. Mar 2005 B1
6984207 Sullivan et al. Jan 2006 B1
7020508 Stivoric et al. Mar 2006 B2
7041032 Calvano May 2006 B1
7062225 White Jun 2006 B2
7099237 Lall Aug 2006 B2
7133690 Ranta-Aho et al. Nov 2006 B2
7162368 Levi et al. Jan 2007 B2
7171331 Vock et al. Jan 2007 B2
7200517 Darley et al. Apr 2007 B2
7246033 Kudo Jul 2007 B1
7261690 Teller et al. Aug 2007 B2
7272982 Neuhauser et al. Sep 2007 B2
7283870 Kaiser et al. Oct 2007 B2
7285090 Stivoric et al. Oct 2007 B2
7373820 James May 2008 B1
7443292 Jensen et al. Oct 2008 B2
7457724 Vock et al. Nov 2008 B2
7467060 Kulach et al. Dec 2008 B2
7502643 Farringdon et al. Mar 2009 B2
7505865 Ohkubo et al. Mar 2009 B2
7539532 Tran May 2009 B2
7558622 Tran Jul 2009 B2
7559877 Parks et al. Jul 2009 B2
7608050 Shugg Oct 2009 B2
7640134 Park et al. Dec 2009 B2
7653503 Kahn et al. Jan 2010 B2
7690556 Kahn et al. Apr 2010 B1
7713173 Shin et al. May 2010 B2
7762952 Lee et al. Jul 2010 B2
7771320 Riley et al. Aug 2010 B2
7774156 Niva et al. Aug 2010 B2
7789802 Lee et al. Sep 2010 B2
7827000 Stirling et al. Nov 2010 B2
7865140 Levien et al. Jan 2011 B2
7881902 Kahn et al. Feb 2011 B1
7907901 Kahn et al. Mar 2011 B1
7925022 Jung et al. Apr 2011 B2
7927253 Vincent et al. Apr 2011 B2
7941665 Berkema et al. May 2011 B2
7942824 Kayyali et al. May 2011 B1
7953549 Graham et al. May 2011 B2
7959539 Takeishi et al. Jun 2011 B2
7983876 Vock et al. Jul 2011 B2
8005922 Boudreau et al. Aug 2011 B2
8028443 Case, Jr. Oct 2011 B2
8036850 Kulach et al. Oct 2011 B2
8055469 Kulach et al. Nov 2011 B2
8059573 Julian et al. Nov 2011 B2
8060337 Kulach et al. Nov 2011 B2
8095071 Sim et al. Jan 2012 B2
8099318 Moukas et al. Jan 2012 B2
8103247 Ananthanarayanan et al. Jan 2012 B2
8132037 Fehr et al. Mar 2012 B2
8172761 Rulkov et al. May 2012 B1
8177260 Tropper et al. May 2012 B2
8180591 Yuen et al. May 2012 B2
8180592 Yuen et al. May 2012 B2
8190651 Treu et al. May 2012 B2
8213613 Diehl et al. Jul 2012 B2
8260261 Teague Sep 2012 B2
8270297 Akasaka et al. Sep 2012 B2
8271662 Gossweiler, III et al. Sep 2012 B1
8289162 Mooring et al. Oct 2012 B2
8311769 Yuen et al. Nov 2012 B2
8311770 Yuen et al. Nov 2012 B2
8386008 Yuen et al. Feb 2013 B2
8437980 Yuen et al. May 2013 B2
8462591 Marhaben Jun 2013 B1
8463576 Yuen et al. Jun 2013 B2
8463577 Yuen et al. Jun 2013 B2
8487771 Hsieh et al. Jul 2013 B2
8533269 Brown Sep 2013 B2
8533620 Hoffman et al. Sep 2013 B2
8543185 Yuen et al. Sep 2013 B2
8543351 Yuen et al. Sep 2013 B2
8548770 Yuen et al. Oct 2013 B2
8562489 Burton et al. Oct 2013 B2
8583402 Yuen et al. Nov 2013 B2
8597093 Engelberg et al. Dec 2013 B2
8634796 Johnson Jan 2014 B2
8638228 Amico et al. Jan 2014 B2
8670953 Yuen et al. Mar 2014 B2
8684900 Tran Apr 2014 B2
8690578 Nusbaum et al. Apr 2014 B1
8712723 Kahn et al. Apr 2014 B1
8734296 Brumback et al. May 2014 B1
8738321 Yuen et al. May 2014 B2
8738323 Yuen et al. May 2014 B2
8738925 Park et al. May 2014 B1
8744803 Park et al. Jun 2014 B2
8762101 Yuen et al. Jun 2014 B2
8764651 Tran Jul 2014 B2
8825445 Hoffman et al. Sep 2014 B2
8847988 Geisner et al. Sep 2014 B2
8849610 Molettiere et al. Sep 2014 B2
8868377 Yuen et al. Oct 2014 B2
8892401 Yuen et al. Nov 2014 B2
8909543 Tropper et al. Dec 2014 B2
8949070 Kahn et al. Feb 2015 B1
8954290 Yuen et al. Feb 2015 B2
8961414 Teller et al. Feb 2015 B2
8968195 Tran Mar 2015 B2
9031812 Roberts et al. May 2015 B2
9042971 Brumback et al. May 2015 B2
9047648 Lekutai et al. Jun 2015 B1
9062976 Tanabe Jun 2015 B2
9066209 Yuen et al. Jun 2015 B2
9081534 Yuen et al. Jul 2015 B2
9113823 Yuen et al. Aug 2015 B2
9167991 Yuen et al. Oct 2015 B2
9241635 Yuen et al. Jan 2016 B2
9288298 Choudhary et al. Mar 2016 B2
9310909 Myers et al. Apr 2016 B2
9374279 Yuen et al. Jun 2016 B2
9426769 Haro Aug 2016 B2
9629558 Yuen et al. Apr 2017 B2
9728059 Arnold et al. Aug 2017 B2
9801547 Yuen et al. Oct 2017 B2
10004406 Yuen et al. Jun 2018 B2
10080530 Cheng et al. Sep 2018 B2
10497246 Arnold et al. Dec 2019 B2
10588519 Yuen et al. Mar 2020 B2
20010049470 Mault et al. Dec 2001 A1
20010055242 Deshmuhk et al. Dec 2001 A1
20020013717 Ando et al. Jan 2002 A1
20020019585 Dickenson Feb 2002 A1
20020077219 Cohen et al. Jun 2002 A1
20020082144 Pfeffer Jun 2002 A1
20020087264 Hills et al. Jul 2002 A1
20020109600 Mault et al. Aug 2002 A1
20020178060 Sheehan Nov 2002 A1
20020191797 Perlman Dec 2002 A1
20020198776 Nara et al. Dec 2002 A1
20030018523 Rappaport et al. Jan 2003 A1
20030050537 Wessel Mar 2003 A1
20030065561 Brown et al. Apr 2003 A1
20030107575 Cardno Jun 2003 A1
20030131059 Brown et al. Jul 2003 A1
20030171189 Kaufman Sep 2003 A1
20030176815 Baba et al. Sep 2003 A1
20030208335 Unuma et al. Nov 2003 A1
20030226695 Mault Dec 2003 A1
20040054497 Kurtz Mar 2004 A1
20040061324 Howard Apr 2004 A1
20040116837 Yamaguchi et al. Jun 2004 A1
20040117963 Schneider Jun 2004 A1
20040122488 Mazar et al. Jun 2004 A1
20040152957 Stivoric et al. Aug 2004 A1
20040239497 Schwartzman et al. Dec 2004 A1
20040249299 Cobb Dec 2004 A1
20040257557 Block Dec 2004 A1
20050037844 Shum et al. Feb 2005 A1
20050038679 Short Feb 2005 A1
20050054938 Wehman et al. Mar 2005 A1
20050102172 Sirmans, Jr. May 2005 A1
20050107723 Wehman et al. May 2005 A1
20050163056 Ranta-Aho et al. Jul 2005 A1
20050171410 Hjelt et al. Aug 2005 A1
20050186965 Pagonis et al. Aug 2005 A1
20050187481 Hatib Aug 2005 A1
20050195830 Chitrapu et al. Sep 2005 A1
20050216724 Isozaki Sep 2005 A1
20050228244 Banet Oct 2005 A1
20050228692 Hodgdon Oct 2005 A1
20050234742 Hodgdon Oct 2005 A1
20050248718 Howell et al. Nov 2005 A1
20050272564 Pyles et al. Dec 2005 A1
20060004265 Pulkkinen et al. Jan 2006 A1
20060020174 Matsumura Jan 2006 A1
20060020177 Seo et al. Jan 2006 A1
20060025282 Redmann Feb 2006 A1
20060039348 Racz et al. Feb 2006 A1
20060047208 Yoon Mar 2006 A1
20060047447 Brady et al. Mar 2006 A1
20060064037 Shalon et al. Mar 2006 A1
20060064276 Ren et al. Mar 2006 A1
20060069619 Walker et al. Mar 2006 A1
20060089542 Sands Apr 2006 A1
20060106535 Duncan May 2006 A1
20060111944 Sirmans et al. May 2006 A1
20060129436 Short Jun 2006 A1
20060143645 Vock et al. Jun 2006 A1
20060166718 Seshadri et al. Jul 2006 A1
20060189863 Peyser Aug 2006 A1
20060217231 Parks et al. Sep 2006 A1
20060241521 Cohen Oct 2006 A1
20060247952 Muraca Nov 2006 A1
20060277474 Robarts et al. Dec 2006 A1
20060282021 DeVaul et al. Dec 2006 A1
20060287883 Turgiss et al. Dec 2006 A1
20060288117 Raveendran et al. Dec 2006 A1
20070011028 Sweeney Jan 2007 A1
20070049384 King et al. Mar 2007 A1
20070050715 Behar Mar 2007 A1
20070051369 Choi et al. Mar 2007 A1
20070061593 Celikkan et al. Mar 2007 A1
20070071643 Hall et al. Mar 2007 A1
20070072156 Kaufivan et al. Mar 2007 A1
20070083095 Rippo et al. Apr 2007 A1
20070083602 Heggenhougen et al. Apr 2007 A1
20070123391 Shin et al. May 2007 A1
20070135264 Rosenberg Jun 2007 A1
20070136093 Rankin et al. Jun 2007 A1
20070146116 Kimbrell Jun 2007 A1
20070155277 Amitai et al. Jul 2007 A1
20070159926 Prstojevich et al. Jul 2007 A1
20070179356 Wessel Aug 2007 A1
20070179761 Wren et al. Aug 2007 A1
20070194066 Ishihara et al. Aug 2007 A1
20070197920 Adams Aug 2007 A1
20070208544 Kulach et al. Sep 2007 A1
20070276271 Chan Nov 2007 A1
20070288265 Quinian et al. Dec 2007 A1
20080001735 Tran Jan 2008 A1
20080014947 Carnall Jan 2008 A1
20080022089 Leedom Jan 2008 A1
20080032864 Hakki Feb 2008 A1
20080044014 Corndorf Feb 2008 A1
20080054072 Katragadda et al. Mar 2008 A1
20080059113 Tsubata Mar 2008 A1
20080084823 Akasaka et al. Apr 2008 A1
20080093838 Tropper et al. Apr 2008 A1
20080097550 Dicks et al. Apr 2008 A1
20080109158 Huhtala May 2008 A1
20080114829 Button et al. May 2008 A1
20080125288 Case May 2008 A1
20080125959 Doherty May 2008 A1
20080129457 Ritter et al. Jun 2008 A1
20080134102 Movold et al. Jun 2008 A1
20080139910 Mastrototaro et al. Jun 2008 A1
20080140163 Keacher et al. Jun 2008 A1
20080140338 No et al. Jun 2008 A1
20080146892 LeBoeuf et al. Jun 2008 A1
20080155077 James Jun 2008 A1
20080172204 Nagashima et al. Jul 2008 A1
20080176655 James et al. Jul 2008 A1
20080190202 Kulach et al. Aug 2008 A1
20080214360 Stirling et al. Sep 2008 A1
20080243432 Kato et al. Oct 2008 A1
20080275309 Stivoric et al. Nov 2008 A1
20080285805 Luinge et al. Nov 2008 A1
20080287751 Stivoric et al. Nov 2008 A1
20080300641 Brunekreeft Dec 2008 A1
20090012418 Gerlach Jan 2009 A1
20090018797 Kasama et al. Jan 2009 A1
20090043531 Kahn et al. Feb 2009 A1
20090047645 Dibenedetto et al. Feb 2009 A1
20090048044 Oleson et al. Feb 2009 A1
20090054737 Magar et al. Feb 2009 A1
20090054751 Babashan et al. Feb 2009 A1
20090058635 LaLonde et al. Mar 2009 A1
20090063193 Barton et al. Mar 2009 A1
20090063293 Mirrashidi et al. Mar 2009 A1
20090076765 Kulach et al. Mar 2009 A1
20090088183 Piersol Apr 2009 A1
20090093341 James et al. Apr 2009 A1
20090098821 Shinya Apr 2009 A1
20090144456 Gelf et al. Jun 2009 A1
20090144639 Nims et al. Jun 2009 A1
20090150178 Sutton et al. Jun 2009 A1
20090156172 Chan Jun 2009 A1
20090171788 Tropper et al. Jul 2009 A1
20090195350 Tsern et al. Aug 2009 A1
20090262088 Moll-Carrillo et al. Oct 2009 A1
20090264713 Van Loenen et al. Oct 2009 A1
20090271147 Sugai Oct 2009 A1
20090287921 Zhu et al. Nov 2009 A1
20090299691 Shimaoka et al. Dec 2009 A1
20090307517 Fehr et al. Dec 2009 A1
20090309742 Alexander et al. Dec 2009 A1
20090313857 Carnes et al. Dec 2009 A1
20100023348 Hardee et al. Jan 2010 A1
20100043056 Ganapathy Feb 2010 A1
20100056208 Ashida et al. Mar 2010 A1
20100058064 Kirovski et al. Mar 2010 A1
20100059561 Ellis et al. Mar 2010 A1
20100069203 Kawaguchi et al. Mar 2010 A1
20100079291 Kroll Apr 2010 A1
20100125729 Baentsch et al. May 2010 A1
20100130873 Yuen et al. May 2010 A1
20100158494 King Jun 2010 A1
20100159709 Kotani et al. Jun 2010 A1
20100167783 Alameh et al. Jul 2010 A1
20100179411 Holmström et al. Jul 2010 A1
20100185064 Bandic et al. Jul 2010 A1
20100191153 Sanders et al. Jul 2010 A1
20100205541 Rapaport et al. Aug 2010 A1
20100217099 LeBoeuf et al. Aug 2010 A1
20100222179 Temple et al. Sep 2010 A1
20100259434 Rud et al. Oct 2010 A1
20100261987 Kamath et al. Oct 2010 A1
20100262045 Heaton et al. Oct 2010 A1
20100292050 Dibenedetto Nov 2010 A1
20100292600 Dibenedetto et al. Nov 2010 A1
20100295684 Hsieh et al. Nov 2010 A1
20100298656 McCombie et al. Nov 2010 A1
20100298661 McCombie et al. Nov 2010 A1
20100304674 Kim et al. Dec 2010 A1
20100311544 Robinette et al. Dec 2010 A1
20100331145 Lakovic et al. Dec 2010 A1
20100331657 Mensinger et al. Dec 2010 A1
20110003665 Burton et al. Jan 2011 A1
20110009051 Khedouri et al. Jan 2011 A1
20110021143 Kapur et al. Jan 2011 A1
20110022349 Stirling et al. Jan 2011 A1
20110029241 Miller et al. Feb 2011 A1
20110032105 Hoffman et al. Feb 2011 A1
20110051665 Huang Mar 2011 A1
20110080349 Holbein et al. Apr 2011 A1
20110087076 Brynelsen et al. Apr 2011 A1
20110087137 Hanoun Apr 2011 A1
20110092834 Yazicioglu et al. Apr 2011 A1
20110106449 Chowdhary et al. May 2011 A1
20110109540 Milne et al. May 2011 A1
20110131005 Ueshima et al. Jun 2011 A1
20110145894 Garcia Morchon et al. Jun 2011 A1
20110153773 Vandwalle Jun 2011 A1
20110167262 Ross et al. Jul 2011 A1
20110193704 Harper et al. Aug 2011 A1
20110197157 Hoffman et al. Aug 2011 A1
20110214030 Greenberg et al. Sep 2011 A1
20110221590 Baker et al. Sep 2011 A1
20110224508 Moon Sep 2011 A1
20110230729 Shirasaki et al. Sep 2011 A1
20110258689 Cohen et al. Oct 2011 A1
20110275940 Nims et al. Nov 2011 A1
20120015778 Lee et al. Jan 2012 A1
20120035487 Werner et al. Feb 2012 A1
20120046113 Ballas Feb 2012 A1
20120072165 Jallon Mar 2012 A1
20120083705 Yuen et al. Apr 2012 A1
20120083714 Yuen et al. Apr 2012 A1
20120083715 Yuen et al. Apr 2012 A1
20120083716 Yuen et al. Apr 2012 A1
20120084053 Yuen et al. Apr 2012 A1
20120084054 Yuen et al. Apr 2012 A1
20120092157 Tran Apr 2012 A1
20120094649 Porrati et al. Apr 2012 A1
20120102008 Kääriäinen et al. Apr 2012 A1
20120116684 Ingrassia, Jr. et al. May 2012 A1
20120119911 Jeon et al. May 2012 A1
20120150483 Vock et al. Jun 2012 A1
20120165684 Sholder Jun 2012 A1
20120166257 Shiragami et al. Jun 2012 A1
20120179278 Riley et al. Jul 2012 A1
20120183939 Aragones et al. Jul 2012 A1
20120203503 Nakamura Aug 2012 A1
20120215328 Schmelzer Aug 2012 A1
20120221634 Treu et al. Aug 2012 A1
20120226471 Yuen et al. Sep 2012 A1
20120226472 Yuen et al. Sep 2012 A1
20120227737 Mastrototaro et al. Sep 2012 A1
20120245716 Srinivasan et al. Sep 2012 A1
20120254987 Ge et al. Oct 2012 A1
20120265477 Vock et al. Oct 2012 A1
20120265480 Oshima Oct 2012 A1
20120274508 Brown et al. Nov 2012 A1
20120283855 Hoffman et al. Nov 2012 A1
20120290109 Engelberg et al. Nov 2012 A1
20120296400 Bierman et al. Nov 2012 A1
20120297229 Desai et al. Nov 2012 A1
20120297440 Reams et al. Nov 2012 A1
20120316456 Rahman et al. Dec 2012 A1
20120324226 Bichsel et al. Dec 2012 A1
20120330109 Tran Dec 2012 A1
20130006718 Nielsen et al. Jan 2013 A1
20130041590 Burich et al. Feb 2013 A1
20130072169 Ross et al. Mar 2013 A1
20130073254 Yuen et al. Mar 2013 A1
20130073255 Yuen et al. Mar 2013 A1
20130080113 Yuen et al. Mar 2013 A1
20130094600 Beziat et al. Apr 2013 A1
20130095459 Tran Apr 2013 A1
20130096843 Yuen et al. Apr 2013 A1
20130102251 Linde et al. Apr 2013 A1
20130103847 Brown et al. Apr 2013 A1
20130106684 Weast et al. May 2013 A1
20130132501 Vandwalle et al. May 2013 A1
20130151193 Kulach et al. Jun 2013 A1
20130151196 Yuen et al. Jun 2013 A1
20130158369 Yuen et al. Jun 2013 A1
20130166048 Werner et al. Jun 2013 A1
20130187789 Lowe Jul 2013 A1
20130190008 Vathsangam et al. Jul 2013 A1
20130190903 Balakrishnan et al. Jul 2013 A1
20130191034 Weast et al. Jul 2013 A1
20130203475 Kil et al. Aug 2013 A1
20130209972 Carter et al. Aug 2013 A1
20130225117 Giacoletto et al. Aug 2013 A1
20130228063 Turner Sep 2013 A1
20130231574 Tran Sep 2013 A1
20130238287 Hoffman et al. Sep 2013 A1
20130261475 Mochizuki Oct 2013 A1
20130267249 Rosenberg Oct 2013 A1
20130268199 Nielsen et al. Oct 2013 A1
20130268236 Yuen et al. Oct 2013 A1
20130268687 Schrecker Oct 2013 A1
20130268767 Schrecker Oct 2013 A1
20130274904 Coza et al. Oct 2013 A1
20130281110 Zelinka Oct 2013 A1
20130289366 Chua et al. Oct 2013 A1
20130296666 Kumar et al. Nov 2013 A1
20130296672 O'Neil et al. Nov 2013 A1
20130296673 Thaveeprungsriporn et al. Nov 2013 A1
20130297220 Yuen et al. Nov 2013 A1
20130310896 Mass Nov 2013 A1
20130325396 Yuen et al. Dec 2013 A1
20130331058 Harvey Dec 2013 A1
20130337974 Yanev et al. Dec 2013 A1
20130345978 Lush et al. Dec 2013 A1
20140035761 Burton et al. Feb 2014 A1
20140039804 Park et al. Feb 2014 A1
20140039840 Yuen et al. Feb 2014 A1
20140039841 Yuen et al. Feb 2014 A1
20140052280 Yuen et al. Feb 2014 A1
20140067278 Yuen et al. Mar 2014 A1
20140077673 Garg et al. Mar 2014 A1
20140085077 Luna Mar 2014 A1
20140094941 Ellis et al. Apr 2014 A1
20140099614 Hu et al. Apr 2014 A1
20140121471 Walker May 2014 A1
20140125618 Panther et al. May 2014 A1
20140135612 Yuen et al. May 2014 A1
20140142466 Kawabe et al. May 2014 A1
20140156228 Yuen et al. Jun 2014 A1
20140164611 Molettiere et al. Jun 2014 A1
20140176475 Myers Jun 2014 A1
20140180022 Stivoric et al. Jun 2014 A1
20140188431 Barfield Jul 2014 A1
20140191866 Yuen et al. Jul 2014 A1
20140200691 Lee et al. Jul 2014 A1
20140206954 Yuen et al. Jul 2014 A1
20140207264 Quy Jul 2014 A1
20140213858 Presura et al. Jul 2014 A1
20140221791 Pacione et al. Aug 2014 A1
20140275885 Isaacson et al. Sep 2014 A1
20140278229 Hong et al. Sep 2014 A1
20140288435 Richards et al. Sep 2014 A1
20140316305 Venkatraman et al. Oct 2014 A1
20140337451 Choudhary et al. Nov 2014 A1
20140337621 Nakhimov Nov 2014 A1
20140343867 Yuen et al. Nov 2014 A1
20150026647 Park et al. Jan 2015 A1
20150057967 Albinali Feb 2015 A1
20150088457 Yuen et al. Mar 2015 A1
20150102923 Messenger et al. Apr 2015 A1
20150120186 Heikes Apr 2015 A1
20150127268 Park et al. May 2015 A1
20150134268 Yuen et al. May 2015 A1
20150137994 Rahman et al. May 2015 A1
20150141873 Fei May 2015 A1
20150198460 Yamato et al. Jul 2015 A1
20150220883 B'far et al. Aug 2015 A1
20150289802 Thomas et al. Oct 2015 A1
20150324541 Cheung et al. Nov 2015 A1
20150374267 Laughlin Dec 2015 A1
20160058331 Keen et al. Mar 2016 A1
20160058372 Raghuram et al. Mar 2016 A1
20160061626 Burton et al. Mar 2016 A1
20160063888 McCallum et al. Mar 2016 A1
20160089572 Liu et al. Mar 2016 A1
20160107646 Kolisetty et al. Apr 2016 A1
20160150978 Yuen et al. Jun 2016 A1
20160166156 Yuen et al. Jun 2016 A1
20160203691 Arnold et al. Jul 2016 A1
20160259426 Yuen et al. Sep 2016 A1
20160278669 Messenger et al. Sep 2016 A1
20160285985 Molettiere et al. Sep 2016 A1
20160323401 Messenger et al. Nov 2016 A1
20170238881 Cheng et al. Aug 2017 A1
20170239523 Cheng et al. Aug 2017 A1
20170243056 Cheng et al. Aug 2017 A1
20180055376 Yuen et al. Mar 2018 A1
20180061204 Arnold et al. Mar 2018 A1
20190059744 Yuen et al. Feb 2019 A1
20190392727 Cheng et al. Dec 2019 A1
Foreign Referenced Citations (23)
Number Date Country
101789933 Jul 2010 CN
101978374 Feb 2011 CN
102067560 May 2011 CN
102111434 Jun 2011 CN
102377815 Mar 2012 CN
102740933 Oct 2012 CN
102983890 Mar 2013 CN
103226647 Jul 2013 CN
203721002 Jul 2014 CN
1 721 237 Aug 2012 EP
11347021 Dec 1999 JP
2178588 Jan 2002 RU
WO 2002011019 Feb 2002 WO
WO 2006055125 May 2006 WO
WO 2006090197 Aug 2006 WO
WO 2008038141 Apr 2008 WO
WO 2009042965 Apr 2009 WO
WO 2012061438 May 2012 WO
WO 2012170586 Dec 2012 WO
WO 2012170924 Dec 2012 WO
WO 2012171032 Dec 2012 WO
WO 2015127067 Aug 2015 WO
WO 2016003269 Jan 2016 WO
Non-Patent Literature Citations (65)
U.S. Office Action dated Aug. 3, 2016, in U.S. Appl. No. 15/052,405.
U.S. Notice of Allowance dated Dec. 2, 2016, in U.S. Appl. No. 15/052,405.
U.S. Office Action dated Jun. 30, 2017, in U.S. Appl. No. 15/052,405.
U.S. Office Action dated May 21, 2015, in U.S. Appl. No. 14/156,413.
U.S. Notice of Allowance dated Sep. 14, 2015, in U.S. Appl. No. 14/156,413.
U.S. Office Action dated Dec. 19, 2014, in U.S. Appl. No. 14/221,234.
U.S. Final Office Action dated Oct. 7, 2015, in U.S. Appl. No. 14/221,234.
U.S. Final Office Action dated Jun. 3, 2016, in U.S. Appl. No. 14/221,234.
U.S. Final Office Action dated Dec. 7, 2016, in U.S. Appl. No. 14/221,234.
U.S. Notice of Allowance dated Jun. 30, 2017, in U.S. Appl. No. 14/221,234.
U.S. Office Action dated May 18, 2016, in U.S. Appl. No. 15/016,712.
U.S. Final Office Action dated Oct. 7, 2016, in U.S. Appl. No. 15/016,712.
U.S. Office Action dated Jun. 29, 2017, in U.S. Appl. No. 15/016,712.
U.S. Notice of Allowance dated Dec. 19, 2017, in U.S. Appl. No. 15/016,712.
U.S. Office Action dated Jun. 7, 2019, in U.S. Appl. No. 16/017,870.
U.S. Notice of Allowance dated Nov. 4, 2019, in U.S. Appl. No. 16/017,870.
U.S. Office Action dated Jun. 1, 2016, in U.S. Appl. No. 15/078,981.
U.S. Final Office Action dated Sep. 8, 2016, in U.S. Appl. No. 15/078,981.
U.S. Notice of Allowance dated Apr. 4, 2017, in U.S. Appl. No. 15/078,981.
U.S. Office Action dated Aug. 9, 2018, in U.S. Appl. No. 15/671,063.
U.S. Final Office Action dated Feb. 7, 2019, in U.S. Appl. No. 15/671,063.
U.S. Notice of Allowance dated Jul. 11, 2019, in U.S. Appl. No. 15/671,063.
U.S. Office Action dated May 26, 2016, in U.S. Appl. No. 15/048,972.
U.S. Final Office Action dated Aug. 10, 2016, in U.S. Appl. No. 15/048,972.
U.S. Office Action dated Jun. 7, 2017, in U.S. Appl. No. 15/048,972.
U.S. Office Action dated Nov. 19, 2019, in U.S. Appl. No. 16/414,780.
U.S. Final Office Action dated Apr. 27, 2020, in U.S. Appl. No. 16/414,780.
U.S. Office Action dated Apr. 26, 2016, in U.S. Appl. No. 15/048,980.
U.S. Final Office Action dated Dec. 1, 2016, in U.S. Appl. No. 15/048,980.
U.S. Office Action dated Jul. 6, 2017, in U.S. Appl. No. 15/048,980.
U.S. Final Office Action dated Dec. 21, 2017, in U.S. Appl. No. 15/048,980.
U.S. Notice of Allowance dated May 21, 2018, in U.S. Appl. No. 15/048,980.
U.S. Corrected Notice of Allowability dated Jul. 2, 2018, in U.S. Appl. No. 15/048,980.
International Search Report dated Aug. 15, 2008, in related application No. PCT/IB07/03617.
Chinese First Office Action dated Jul. 6, 2018, in Application No. 201610172515.9.
Chinese Second Office Action dated Jan. 2, 2019, in Application No. 201610172515.9.
“Activity Classification Using Realistic Data From Wearable Sensors”, Parkka, et al., IEEE Transactions on Information Technology in Biomedicine, vol. 10, No. 1, Jan. 2006, pp. 119-128.
“Altimeter and Barometer System”, Clifford, et al., Freescale Semiconductor Application Note AN1979, Rev. 3, Nov. 2006.
“Automatic classification of ambulatory movements and evaluation of energy consumptions utilizing accelerometers and barometer”, Ohtaki, et al., Microsystem Technologies, vol. 11, No. 8-10, Aug. 2005, pp. 1034-1040.
Chandrasekar et al., “Plug-and-Play, Single-Chip Photoplethysmography,” 34th Annual International Conference of the IEEE EMBS, San Diego, California, USA, Aug. 28-Sep. 1, 2012. 4 pages.
“Classification of Human Moving Patterns Using Air Pressure and Acceleration”, Sagawa, et al., Proceedings of the 24th Annual Conference of the IEEE Industrial Electronics Society, vol. 2, Aug.-Sep. 1998, pp. 1214-1219.
Definition of “Graphic” from Merriam-Webster Dictionary, downloaded from merriam-webster.com on Oct. 4, 2014.
Definition of “Graphical user interface” from Merriam-Webster Dictionary, downloaded from merriam-webster.com on Oct. 4, 2014.
“Design of a Wireless Assisted Pedestrian Dead Reckoning System—The NavMote Experience”, Fang, et al., IEEE Transactions on Instrumentation and Measurement, vol. 54, No. 6, Dec. 2005, pp. 2342-2358.
“Direct Measurement of Human Movement by Accelerometry”, Godfrey, et al., Medical Engineering & Physics, vol. 30, 2008, pp. 1364-1386.
“Drift-free dynamic height sensor using MEMS IMU aided by MEMS pressure sensor”, Tanigawa, et al., Workshop on Positioning, Navigation and Communication, Mar. 2008, pp. 191-196.
“Evaluation of a New Method of Heading Estimation of Pedestrian Dead Reckoning Using Shoe Mounted Sensors”, Stirling et al., Journal of Navigation, vol. 58, 2005, pp. 31-45.
“Fitbit Automatically Tracks Your Fitness and Sleep” published online at web.archive.org/web/20080910224820/http://www.fitbit.com, downloaded Sep. 10, 2008, 1 page.
“Foot Mounted Inertial System for Pedestrian Navigation”, Godha et al., Measurement Science and Technology, vol. 19, No. 7, May 2008, pp. 1-9.
“A Hybrid Discriminative/Generative Approach for Modeling Human Activities”, Lester, et al., Proc. of the Int'l Joint Conf. Artificial Intelligence, 2005, pp. 766-772.
“Improvement of Walking Speed Prediction by Accelerometry and Altimetry, Validated by Satellite Positioning”, Perrin, et al., Medical & Biological Engineering & Computing, vol. 38, 2000, pp. 164-168.
“Indoor Navigation with MEMS Sensors”, Lammel, et al., Proceedings of the Eurosensors XIII conference, vol. 1, No. 1, Sep. 2009, pp. 532-535.
“An Intelligent Multi-Sensor System for Pedestrian Navigation”, Retscher, Journal of Global Positioning Systems, vol. 5, No. 1, 2006, pp. 110-118.
Lee, Suevon, “Jawbone Gets 2 Patents Nixed in Fitbit Infringement Suit,” Mar. 3, 2017; retrieved from URL https://www.law360.com/articles/898111/jawbone-gets-2-patents-nixed-in-fitbit-infringement-suit on May 22, 2017.
Minetti et al. Energy cost of walking and running at extreme uphill and downhill slopes. J Appl Physiol 2002; 93: 1039-1046.
“Non-restricted measurement of walking distance”, Sagawa, et al., IEEE Int'l Conf. on Systems, Man, and Cybernetics, vol. 3, Oct. 2000, pp. 1847-1852.
O'Donovan et al., 2009, A context aware wireless body area network (BAN), Proc. 3rd Intl. Conf. Pervasive Computing Technologies for Healthcare, pp. 1-8.
“On Foot Navigation: When GPS alone is not Enough”, Ladetto, et al., Journal of Navigation, vol. 53, No. 2, Sep. 2000, pp. 279-285.
“SCP 1000-D01/D11 Pressure Sensor as Barometer and Altimeter”, VTI Technologies Application Note 33, Jun. 2006.
“Specification of the Bluetooth® System”, Core Package version 4.1, Dec. 2013, vol. 0 and vol. 1, 283 pp.
“Suunto LUMI User Guide”, Jun. and Sep. 1997.
Thompson et al., (Jan. 1996) “Predicted and measured resting metabolic rate of male and female endurance athletes,” Journal of the American Dietetic Association 96(1): 30-34.
“Using MS5534 for altimeters and barometers”, Intersema Application Note AN501, Jan. 2006.
“Validated caloric expenditure estimation using a single body-worn sensor”, Lester, et al., Proc. of the Int'l Conf. on Ubiquitous Computing, 2009, pp. 225-234.
U.S. Appl. No. 16/783,029, filed Feb. 5, 2020, Yuen et al.
Related Publications (1)
Number Date Country
20200184793 A1 Jun 2020 US
Provisional Applications (2)
Number Date Country
62137750 Mar 2015 US
61752826 Jan 2013 US
Continuations (3)
Number Date Country
Parent 15671063 Aug 2017 US
Child 16700006 US
Parent 15078981 Mar 2016 US
Child 15671063 US
Parent 14156413 Jan 2014 US
Child 14221234 US
Continuation in Parts (1)
Number Date Country
Parent 14221234 Mar 2014 US
Child 15078981 US