Combining data sources to provide accurate effort monitoring

Information

  • Patent Grant
  • Patent Number: 8,795,138
  • Date Filed: Wednesday, September 25, 2013
  • Date Issued: Tuesday, August 5, 2014
Abstract
By combining data from different sensors (on a fitness device, a mobile smartphone, smart clothing, or other devices or people in the same location), an intelligent system provides a better indicator of an individual's physical effort, using rich data sources to enhance quantified metrics such as distance, pace, and altitude gain and to provide a clearer picture of an individual's exercise and activity.
Description

This application claims priority to U.S. provisional patent application Ser. No. 61/878,835, filed Sep. 17, 2013.


FIELD OF THE INVENTION

The present application relates generally to digital ecosystems that are configured for use when engaging in physical activity and/or fitness exercises.


BACKGROUND OF THE INVENTION

Society is becoming increasingly health-conscious. A wide variety of exercises and workouts are now offered to encourage people to stay fit. As understood herein, while stationary exercise equipment often comes equipped with data displays for the information of the exerciser, the information is not tailored to the individual and is frequently repetitive and monotonous. As further understood herein, people enjoy listening to music as a workout aid, but the music typically is whatever is broadcast within a gymnasium or provided on a recording device the user may wear, again being potentially monotonous and unchanging in pattern and beat in a way that is uncoupled from the actual exercise being engaged in. Furthermore, general fitness devices that monitor physical exertion during exercise do not always provide an accurate enough picture of true effort.


SUMMARY OF THE INVENTION

Present principles recognize that by combining data from different sensors (on a fitness device, mobile smartphone, smart clothing, or other devices/people in the same location at the same time), a monitoring system can provide a better indicator of true effort. The result need not be a quantified “calories” or “distance” value, but rather a way to factor in difficulty, adding nuance and qualification to a quantified measurement.


Accordingly, a device includes a computer readable storage medium bearing instructions executable by a processor, and a processor configured for accessing the computer readable storage medium to execute the instructions to configure the processor for receiving signals from a position sensor from which the processor can calculate a speed and a distance over an interval of time ΔT. The processor is configured for receiving a signal representing a weather condition, and another signal representing a biometric condition of a user of the device. The processor then adjusts a baseline value associated with the speed and/or distance based on the biometric condition and weather condition to render an adjusted baseline, and outputs an indicia of exercise effort based on the adjusted baseline.


In some embodiments, the processor when executing the instructions is further configured for receiving a signal representing a slope of terrain associated with the exercise effort, and adjusting the baseline value based on the slope. The processor also may receive a signal representing an elevation and/or type of terrain associated with the exercise effort, and adjust the baseline value based on the elevation and/or type.


The weather condition can include one or more of humidity, temperature, barometric pressure, and wind condition. The biometric condition can include one or more of heart rate, leg stride condition, sleep condition, and skin temperature.


In another aspect, a method includes establishing a baseline effort indicator at least partially based on a pace and distance of an exercise of a person. The baseline effort indicator is adjusted based on a biometric condition of the person, a weather condition, and a terrain condition, and an adjusted effort indicator is output based at least in part on the adjusting steps.


In another aspect, a device includes a computer readable storage medium bearing instructions executable by a processor, and a processor configured for accessing the computer readable storage medium to execute the instructions to configure the processor for combining data from at least one biometric sensor and one or more of weather information and terrain information. This provides an indication of an individual's physical effort during an exercise to enhance quantified metrics and provide an accurate picture of an individual's exercise and activity.


The details of the present invention, both as to its structure and operation, can best be understood in reference to the accompanying drawings, in which like reference numerals refer to like parts, and in which:





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a block diagram of an example system including an example in accordance with present principles;



FIG. 2 is a block diagram of an example specific system;



FIG. 3 is a flowchart of example logic; and



FIG. 4 is a schematic diagram illustrating present principles.





DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT

This disclosure relates generally to consumer electronics (CE) device based user information. A system herein may include server and client components, connected over a network such that data may be exchanged between the client and server components. The client components may include one or more computing devices including portable televisions (e.g. smart TVs, Internet-enabled TVs), portable computers such as laptops and tablet computers, and other mobile devices including smart phones and additional examples discussed below. These client devices may operate with a variety of operating environments. For example, some of the client computers may employ, as examples, operating systems from Microsoft, or a Unix operating system, or operating systems produced by Apple Computer or Google. These operating environments may be used to execute one or more browsing programs, such as a browser made by Microsoft or Google or Mozilla or other browser program that can access web applications hosted by the Internet servers discussed below.


Servers may include one or more processors executing instructions that configure the servers to receive and transmit data over a network such as the Internet. Or, a client and server can be connected over a local intranet or a virtual private network.


Information may be exchanged over a network between the clients and servers. To this end and for security, servers and/or clients can include firewalls, load balancers, temporary storage, proxies, and other network infrastructure for reliability and security. One or more servers may form an apparatus that implements methods of providing a secure community, such as an online social website, to network members.


As used herein, instructions refer to computer-implemented steps for processing information in the system. Instructions can be implemented in software, firmware or hardware and include any type of programmed step undertaken by components of the system.


A processor may be any conventional general purpose single- or multi-chip processor that can execute logic by means of various lines such as address lines, data lines, and control lines and registers and shift registers.


Software modules described by way of the flow charts and user interfaces herein can include various sub-routines, procedures, etc. Without limiting the disclosure, logic stated to be executed by a particular module can be redistributed to other software modules and/or combined together in a single module and/or made available in a shareable library.


Present principles described herein can be implemented as hardware, software, firmware, or combinations thereof; hence, illustrative components, blocks, modules, circuits, and steps are set forth in terms of their functionality.


Further to what has been alluded to above, logical blocks, modules, and circuits described below can be implemented or performed with a general purpose processor, a digital signal processor (DSP), a field programmable gate array (FPGA) or other programmable logic device such as an application specific integrated circuit (ASIC), discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A processor can be implemented by a controller or state machine or a combination of computing devices.


The functions and methods described below, when implemented in software, can be written in an appropriate language such as but not limited to C# or C++, and can be stored on or transmitted through a computer-readable storage medium such as a random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), compact disk read-only memory (CD-ROM) or other optical disk storage such as digital versatile disc (DVD), magnetic disk storage or other magnetic storage devices including removable thumb drives, etc. A connection may establish a computer-readable medium. Such connections can include, as examples, hard-wired cables including fiber optics and coaxial wires and digital subscriber line (DSL) and twisted pair wires. Such connections may include wireless communication connections including infrared and radio.


Components included in one embodiment can be used in other embodiments in any appropriate combination. For example, any of the various components described herein and/or depicted in the Figures may be combined, interchanged or excluded from other embodiments.


“A system having at least one of A, B, and C” (likewise “a system having at least one of A, B, or C” and “a system having at least one of A, B, C”) includes systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.


Before describing FIG. 1, it is to be understood that the CE devices and software described herein are understood to be usable in the context of a digital ecosystem. Thus, as understood herein, a computer ecosystem, or digital ecosystem, may be an adaptive and distributed socio-technical system that is characterized by its sustainability, self-organization, and scalability. Inspired by environmental ecosystems, which consist of biotic and abiotic components that interact through nutrient cycles and energy flows, complete computer ecosystems consist of hardware, software, and services that in some cases may be provided by one company, such as Sony Electronics. The goal of each computer ecosystem is to provide consumers with everything that may be desired, at least in part as services and/or software that may be exchanged via the Internet. Moreover, interconnectedness and sharing among elements of an ecosystem, such as applications within a computing cloud, provide consumers with increased capability to organize and access data and characterize efficient, integrative ecosystems going forward.


Two general types of computer ecosystems exist: vertical and horizontal computer ecosystems. In the vertical approach, virtually all aspects of the ecosystem are associated with the same company (e.g. produced by the same manufacturer), and are specifically designed to seamlessly interact with one another. Horizontal ecosystems, on the other hand, integrate aspects such as hardware and software that are created by differing entities into one unified ecosystem. The horizontal approach allows for a greater variety of input from consumers and manufacturers, increasing the capacity for novel innovations and adaptations to changing demands. Regardless, it is to be understood that some digital ecosystems, including those referenced herein, may embody characteristics of both the horizontal and vertical ecosystems described above.


Accordingly, it is to be further understood that these ecosystems may be used while engaged in physical activity to e.g. provide inspiration, goal fulfillment and/or achievement, automated coaching/training, health and exercise analysis, convenient access to data, group sharing (e.g. of fitness data), and increased accuracy of health monitoring, all while doing so in a stylish and entertaining manner. Further still, the devices disclosed herein are understood to be capable of making diagnostic determinations based on data from various sensors (such as those described below in reference to FIG. 1) for use while exercising, for exercise monitoring (e.g. in real time), and/or for sharing of data with friends (e.g. using a social networking service) even when not all people have the same types and combinations of sensors on their respective CE devices.


Thus, it is to be understood that the CE devices described herein may allow for easy and simplified user interaction with the device so as to not be unduly bothersome or encumbering e.g. before, during, and after an exercise.


Now specifically referring to FIG. 1, an example system 10 is shown, which may include one or more of the example devices mentioned above and described further below to enhance fitness experiences in accordance with present principles. The first of the example devices included in the system 10 is an example consumer electronics (CE) device 12 that may be waterproof (e.g., for use while swimming). The CE device 12 may be, e.g., a computerized Internet enabled (“smart”) telephone, a tablet computer, a notebook computer, a wearable computerized device such as e.g. computerized Internet-enabled watch, a computerized Internet-enabled bracelet, other computerized Internet-enabled fitness devices, a computerized Internet-enabled music player, computerized Internet-enabled head phones, a computerized Internet-enabled implantable device such as an implantable skin device, etc., and even e.g. a computerized Internet-enabled television (TV). Regardless, it is to be understood that the CE device 12 is configured to undertake present principles (e.g. communicate with other CE devices to undertake present principles, execute the logic described herein, and perform any other functions and/or operations described herein).


Accordingly, to undertake such principles the CE device 12 can include some or all of the components shown in FIG. 1. For example, the CE device 12 can include one or more touch-enabled displays 14, one or more speakers 16 for outputting audio in accordance with present principles, and at least one additional input device 18 such as e.g. an audio receiver/microphone for e.g. entering audible commands to the CE device 12 to control the CE device 12. The example CE device 12 may also include one or more network interfaces 20 for communication over at least one network 22 such as the Internet, a WAN, a LAN, etc. under control of one or more processors 24. It is to be understood that the processor 24 controls the CE device 12 to undertake present principles, including the other elements of the CE device 12 described herein such as e.g. controlling the display 14 to present images thereon and receiving input therefrom. Furthermore, note the network interface 20 may be, e.g., a wired or wireless modem or router, or other appropriate interface such as, e.g., a wireless telephony transceiver, WiFi transceiver, etc.


In addition to the foregoing, the CE device 12 may also include one or more input ports 26 such as, e.g., a USB port to physically connect (e.g. using a wired connection) to another CE device and/or a headphone port to connect headphones to the CE device 12 for presentation of audio from the CE device 12 to a user through the headphones. The CE device 12 may further include one or more tangible computer readable storage medium 28 such as disk-based or solid state storage, it being understood that the computer readable storage medium 28 may not be a carrier wave. Also in some embodiments, the CE device 12 can include a position or location receiver such as but not limited to a GPS receiver and/or altimeter 30 that is configured to e.g. receive geographic position information from at least one satellite and provide the information to the processor 24 and/or determine an altitude at which the CE device 12 is disposed in conjunction with the processor 24. However, it is to be understood that another suitable position receiver other than a GPS receiver and/or altimeter may be used in accordance with present principles to e.g. determine the location of the CE device 12 in e.g. all three dimensions.


Continuing the description of the CE device 12, in some embodiments the CE device 12 may include one or more cameras 32 that may be, e.g., a thermal imaging camera, a digital camera such as a webcam, and/or a camera integrated into the CE device 12 and controllable by the processor 24 to gather pictures/images and/or video in accordance with present principles (e.g. to share aspects of a physical activity such as hiking with social networking friends). Also included on the CE device 12 may be a Bluetooth transceiver 34 and another Near Field Communication (NFC) element 36 for communication with other devices using Bluetooth and/or NFC technology, respectively. An example NFC element can be a radio frequency identification (RFID) element.


Further still, the CE device 12 may include one or more motion sensors 37 (e.g., an accelerometer, gyroscope, cyclometer, magnetic sensor, infrared (IR) motion sensors such as passive IR sensors, an optical sensor, a speed and/or cadence sensor, a gesture sensor (e.g. for sensing gesture commands), etc.) providing input to the processor 24. The CE device 12 may include still other sensors such as e.g. one or more climate sensors 38 (e.g. barometers, humidity sensors, wind sensors, light sensors, temperature sensors, etc.) and/or one or more biometric sensors 40 (e.g. heart rate sensors and/or heart monitors, calorie counters, blood pressure sensors, perspiration sensors, odor and/or scent detectors, fingerprint sensors, facial recognition sensors, iris and/or retina detectors, DNA sensors, oxygen sensors (e.g. blood oxygen sensors and/or VO2 max sensors), glucose and/or blood sugar sensors, sleep sensors (e.g. a sleep tracker), pedometers and/or speed sensors, body temperature sensors, nutrient and metabolic rate sensors, voice sensors, lung input/output and other cardiovascular sensors, etc.) also providing input to the processor 24. In addition to the foregoing, it is noted that in some embodiments the CE device 12 may also include a kinetic energy harvester 42 to e.g. charge a battery (not shown) powering the CE device 12.


Still referring to FIG. 1, in addition to the CE device 12, the system 10 may include one or more other CE device types such as, but not limited to, a computerized Internet-enabled bracelet 44, computerized Internet-enabled headphones and/or ear buds 46, computerized Internet-enabled clothing 48, a computerized Internet-enabled exercise machine 50 (e.g. a treadmill, exercise bike, elliptical machine, etc.), etc. Also shown is a computerized Internet-enabled gymnasium entry kiosk 52 permitting authorized entry to a gymnasium housing the exercise machine 50. It is to be understood that other CE devices included in the system 10 including those described in this paragraph may respectively include some or all of the various components described above in reference to the CE device 12, such as but not limited to e.g. the biometric sensors and motion sensors described above, as well as the position receivers, cameras, input devices, and speakers also described above.


Thus, for instance, the headphones/ear buds 46 may include a heart rate sensor configured to sense a person's heart rate when a person is wearing the headphones, the clothing 48 may include sensors such as perspiration sensors, climate sensors, and heart sensors for measuring the intensity of a person's workout, and the exercise machine 50 may include a camera mounted on a portion thereof for gathering facial images of a user so that the machine 50 may thereby determine whether a particular facial expression is indicative of a user struggling to keep the pace set by the exercise machine 50 and/or an NFC element to e.g. pair the machine 50 with the CE device 12 and hence access a database of preset workout routines, and the kiosk 52 may include an NFC element permitting entry to a person authenticated as being authorized for entry based on input received from a complementary NFC element (such as e.g. the NFC element 36 on the device 12). Also note that all of the devices described in reference to FIG. 1, including a server 54 to be described shortly, may communicate with each other over the network 22 using a respective network interface included thereon, and may each also include a computer readable storage medium that may not be a carrier wave for storing logic and/or software code in accordance with present principles.


Now in reference to the afore-mentioned at least one server 54, it includes at least one processor 56, at least one tangible computer readable storage medium 58 that may not be a carrier wave such as disk-based or solid state storage, and at least one network interface 60 that, under control of the processor 56, allows for communication with the other CE devices of FIG. 1 over the network 22, and indeed may facilitate communication therebetween in accordance with present principles. Note that the network interface 60 may be, e.g., a wired or wireless modem or router, WiFi transceiver, or other appropriate interface such as, e.g., a wireless telephony transceiver.


Accordingly, in some embodiments the server 54 may be an Internet server, may facilitate fitness coordination and/or data exchange between CE devices in accordance with present principles, and may include and perform “cloud” functions such that the CE devices of the system 10 may access a “cloud” environment via the server 54 in example embodiments to e.g. stream music to listen to while exercising and/or pair two or more devices (e.g. to “throw” music from one device to another).



FIG. 2 shows a more specific example of a system according to the general principles above. A CE device 70 may be embodied as a wristwatch as shown (or a mobile telephone or other portable CE device or combination of devices) and may include one or more microprocessors 72 accessing one or more computer readable storage media 74 to output visible information to the wearer on a display 76. Speakers also may be provided to output audible information. A position sensor such as a GPS sensor 78 may provide position information to the processor 72, from which the processor 72 can calculate a speed over an interval of time ΔT using the equation speed=distance/ΔT.
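
By way of a non-limiting illustrative sketch (not part of the original disclosure; the helper names are assumptions), the speed computation above can be carried out by converting two successive GPS fixes to a distance and dividing by the sampling interval ΔT:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS fixes."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def speed_mps(fix_a, fix_b, delta_t_s):
    """speed = distance / delta-T, per the equation in the text; fixes are (lat, lon) tuples."""
    return haversine_m(fix_a[0], fix_a[1], fix_b[0], fix_b[1]) / delta_t_s
```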


The processor 72 also can receive information from Internet servers discussed further below using a wireless network interface 80 such as a WiFi or telephony interface. The processor 72 may communicate with nearby devices such as the biometric sensors discussed further below and such as audio headphones 81 using, for example, a Bluetooth transceiver 82 or a radiofrequency identification (RFID) transceiver or other wireless and typically short range (<100 meters in useful transmission range) transceiver. User input of, e.g., recent food and beverage intake may be received on an input device 84 such as a keypad, microphone coupled to voice entry software, touch display, etc. The processor 72 may also access other information stored on the computer readable storage media 74 as received from another CE device or Internet server using one or more of the transceivers above. For example, the processor 72 may access calendar information of the user that lists future meetings and events for which the user is scheduled.


As discussed above, the processor 72 can receive information from various Internet servers or other network servers by means of the network interface 80. For example, the processor 72 can receive map information from a map server 86 from which, knowing its position from signals from the position sensor 78, the processor 72 can ascertain the elevation, slope, and other terrain information pertaining to the present location of the CE device 70.


Also, the processor 72 can receive local weather information from a weather server 88. The weather server 88 may access one or more of a humidity sensor 90, a temperature sensor 92, a barometer 94, and a wind sensor 96 and, in response to receiving a query from the processor 72 using the current location of the CE device 70 as derived from the position sensor 78 as an automatically uploaded entering argument, provide local weather conditions to the processor 72. The signals from the servers may be received by the CE device processor 72 through the appropriate communication interface and stored on the computer readable media 74 and/or on local cache memory associated with the processor 72, for processing of the information through the registers of the processor 72 according to description herein to provide output on the display 76 and/or headphones 81 or other output device.
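
Purely as an illustrative sketch (the weather server interface is not specified herein; the URL and response field names below are hypothetical), a query keyed on the current position derived from the position sensor 78 might take the following form:

```python
import json
import urllib.request

def local_weather(lat, lon, server="https://weather.example.com/conditions"):
    """Query a weather server using the device's current position as the
    automatically uploaded entering argument (endpoint and fields are assumed)."""
    with urllib.request.urlopen(f"{server}?lat={lat}&lon={lon}") as resp:
        data = json.load(resp)
    # Assumed response fields: humidity, temperature, pressure, wind_speed, wind_dir
    return {k: data.get(k) for k in ("humidity", "temperature", "pressure", "wind_speed", "wind_dir")}
```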


In addition to accessing information from network servers, the processor 72 of the CE device 70 may access information from one or more biometric sensors that can be worn by or can otherwise be engaged with the user of the CE device 70. A heart rate sensor 98 may be provided as an example in which signals from a pulse sensor 100 are provided to the CE device 70 through a wireless transceiver 102 such as but not limited to a Bluetooth transceiver under control of one or more processors 104 accessing one or more computer readable media 106. The signals from the biometric sensor may be received by the CE device processor 72 through the appropriate communication interface and stored on the computer readable media 74 and/or on local cache memory associated with the processor 72, for processing of the information through the registers of the processor 72 according to description herein to provide output on the display 76 and/or headphones 81 or other output device.


A stride sensor 108 may be provided as another example in which signals from a stride sensor 110 (which may include, e.g., an accelerometer and/or gyroscope and/or force sensing resistor or other jolt-sensing or pressure-sensing device) are provided to the CE device 70 through a wireless transceiver 112 such as but not limited to a Bluetooth transceiver under control of one or more processors 114 accessing one or more computer readable media 116. The signals from the biometric sensor 108 may be received by the CE device processor 72 through the appropriate communication interface and stored on the computer readable media 74 and/or on local cache memory associated with the processor 72, for processing of the information through the registers of the processor 72 according to description herein to provide output on the display 76 and/or headphones 81 or other output device.


A sleep rate sensor 118 may be provided as another example (typically sensing sleep prior to exercise and data from which is stored by the CE device 70 for later retrieval according to principles discussed below) in which signals from a sleep quality sensor 120 (which may include an actigraphy-type mechanism) are provided to the CE device 70 through a wireless transceiver 122 such as but not limited to a Bluetooth transceiver under control of one or more processors 124 accessing one or more computer readable media 126. The signals from the biometric sensor 118 may be received by the CE device processor 72 through the appropriate communication interface and stored on the computer readable media 74 and/or on local cache memory associated with the processor 72, for processing of the information through the registers of the processor 72 according to description herein to provide output on the display 76 and/or headphones 81 or other output device.


Other biometric sensors may be provided, including a skin temperature sensor 128 that has onboard sensing, processing, and transceiving components similar to those discussed above in relation to other biometric sensors. The biometric sensors may be mounted on the CE device 70, fitness devices such as treadmills, mobile telephones, clothing worn by the user, or even other devices and/or people in the same location as the user at the same time as the user.



FIG. 3 shows logic in which factors from multiple sources are combined to produce an accurate effort monitoring output. As discussed further below, these factors can include personal biometrics such as instantaneous heart rate, skin temperature, pace (e.g., as derived from position signals per the above description), stride length, and stride cadence. The factors also can include the duration of the exercise activity (as accessed from the internal processor clock of the CE device 70, for instance), location data such as position, absolute altitude, elevation gain (as derived from downloaded map data from the map server, for example), and terrain (type of surface for running as derived from downloaded map data). The factors may also include weather factors such as wind speed and direction, humidity, temperature, barometric pressure, and sun or cloud cover as derived from downloaded information from the weather server. Still further, the factors can include life context factors such as food intake as obtained from user input, stress levels as inferred from the number and frequency of meetings on the user's calendar data, for example, and the user's prior behavior as learned by the processor 72.


Accordingly, at block 130 in FIG. 3 the CE device 70 in FIG. 2 receives biometric information from one or more of the sensors shown in FIG. 2. Weather information from, e.g., the weather server 88 is received at block 132 and location information is received from the position sensor 78 at block 134. From this information, the user's distance run and pace over an exercise period can be determined.


Map and terrain information may be received at block 136 from, e.g., the map server 86, and user-input information can be received at block 138 by means of, e.g., the user input device 84 indicating food and beverage intake of the user for the past N hours. Lifestyle information as derived from, e.g., a number of upcoming or immediately past meetings within a threshold time period (for instance, within the past or future 24 hours) can be retrieved from calendar information or elsewhere at block 140.
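
As a non-limiting sketch of how the meeting counts at block 140 could be derived from calendar information (the data structure and threshold handling are assumptions):

```python
from datetime import datetime, timedelta

def meeting_counts(meeting_times, now=None, z_hours=24):
    """Count meetings within the past and next Z hours from a list of
    meeting start times (datetime objects), e.g. from the user's calendar."""
    now = now or datetime.now()
    window = timedelta(hours=z_hours)
    past = sum(1 for t in meeting_times if now - window <= t <= now)
    upcoming = sum(1 for t in meeting_times if now < t <= now + window)
    return past, upcoming
```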


Weights for each factor may be applied at block 142 if desired, with each factor having its own respective weight; some factors may share the same weight, or each factor may have a unique weight. The weights can be positive or negative, e.g., for a run up a slope the weight accorded to a slope factor may be positive while for a run down a slope the weight can be negative. An effort level and/or coaching tips are output at block 144 based on the weighted combined factors, as sketched below.
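
A minimal sketch of the weighting and combining at blocks 142 and 144 follows (the factor names, weight values, and clamping to the 0-100 range used below are illustrative assumptions):

```python
def combined_effort(baseline, factor_points, weights):
    """Sum signed, per-factor point adjustments (weighted) onto the baseline."""
    adjusted = baseline
    for name, points in factor_points.items():
        # A weight may be negative, e.g. for a run down a slope.
        adjusted += weights.get(name, 1.0) * points
    return max(0, min(100, adjusted))

# Example: uphill run with an elevated average heart rate
effort = combined_effort(55, {"slope": 4, "avg_heart_rate": 3}, {"slope": 1.0, "avg_heart_rate": 0.5})
```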


An example of the above now follows.


A baseline effort indicator may be established as a number between 0 and 100 based on the distance and pace (speed) of the workout. The longer the distance and the faster the pace, the higher the baseline number. Note that the baseline number for each user may be established for that user by averaging the user's first several workout times, distances, and paces, so that a baseline for one user may not be the same as the baseline for another user.
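
One way such a per-user baseline could be computed is sketched below (a non-limiting illustration; the 50-point midpoint and the equal weighting of distance and pace are assumptions):

```python
def baseline_effort(distance_km, pace_kmh, history):
    """Map distance and pace to a 0-100 baseline, scaled against the averages
    of the user's first several workouts (history is a list of (distance, pace))."""
    if history:
        avg_dist = sum(d for d, _ in history) / len(history)
        avg_pace = sum(p for _, p in history) / len(history)
    else:
        avg_dist, avg_pace = distance_km, pace_kmh  # first workout defines its own norm
    # 50 corresponds to a workout matching the user's historical average;
    # longer distances and faster paces push the number toward 100.
    score = 50 * (0.5 * distance_km / avg_dist + 0.5 * pace_kmh / avg_pace)
    return max(0, min(100, round(score)))
```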


That baseline number is then adjusted upward for factors that increase the difficulty of the workout and downward for factors that decrease the difficulty of the workout. As examples:


For average workout heart rates in excess of a test value such as but not limited to a median rate or average rate or other test value, which can be empirically determined if desired, add A points (multiplied if desired by a heart rate weighting factor) to the baseline number, wherein A, like the other adjustment “points” referred to herein, can be an integer. For average workout heart rates below a median rate, subtract A points (multiplied if desired by a heart rate weighting factor) from the baseline number. The adjustment points “A”, like the other adjustment points discussed below and designated by letters of the alphabet, can vary with the amount of excess/shortfall between the measured factor and the median. For example, the magnitude of “A” can increase (or decrease) linearly with the magnitude of the difference between the median and the measured value. Also, the median to which an excessive measurement value is compared may be the same as or different from the median to which a deficient measurement value is compared. Average values may be used, e.g., an average elevation or slope over the course of an exercise run or ride may be used. Or, instantaneous values may be used and the baseline adjusted and output updated accordingly.
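
The heart-rate adjustment above, with the magnitude of “A” scaling linearly with the gap between the measured value and the test value, might be sketched as follows (the scale constant and cap are assumptions, not taken from this disclosure):

```python
def adjustment_points(measured, test_value, points_per_unit, weight=1.0, cap=10.0):
    """Signed adjustment: positive when the measurement exceeds the test value,
    negative when it falls short; magnitude grows linearly with the difference."""
    points = (measured - test_value) * points_per_unit
    points = max(-cap, min(cap, points))  # optional cap on any single factor
    return weight * points

# e.g., an average workout heart rate of 162 bpm against a 150 bpm median
adjusted = 50 + adjustment_points(162, 150, points_per_unit=0.25)
```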


For peak workout heart rates in excess of a median rate, which can be empirically determined if desired, add B points (multiplied if desired by a peak heart rate weighting factor) to the baseline number. For peak workout heart rates below a median rate, subtract B points (multiplied if desired by a peak heart rate weighting factor) from the baseline number.


For skin temperature in excess of a median, which can be empirically determined if desired, add C points (multiplied if desired by a skin temperature weighting factor) to the baseline number. For skin temperature below a median, which can be empirically determined if desired, subtract C points (multiplied if desired by a skin temperature weighting factor) from the baseline number.


For stride length in excess of a median, which can be empirically determined if desired, add D points (multiplied if desired by a stride length weighting factor) to the baseline number. For stride length below a median, which can be empirically determined if desired, subtract D points (multiplied if desired by a stride length weighting factor) from the baseline number.


For stride cadence in excess of a median, which can be empirically determined if desired, add E points (multiplied if desired by a stride cadence weighting factor) to the baseline number. For stride cadence below a median, which can be empirically determined if desired, subtract E points (multiplied if desired by a stride cadence weighting factor) from the baseline number.


For sleep quality in excess of a median, which can be empirically determined if desired, subtract F points (multiplied if desired by a sleep quality weighting factor) from the baseline number. For sleep quality below a median, which can be empirically determined if desired, add F points (multiplied if desired by a sleep quality weighting factor) to the baseline number.


For ambient temperature in excess of a median, which can be empirically determined if desired, add G points (multiplied if desired by an ambient temperature weighting factor) to the baseline number. For ambient temperature below a median, which can be empirically determined if desired, subtract G points (multiplied if desired by an ambient temperature weighting factor) from the baseline number.


For ambient humidity in excess of a median, which can be empirically determined if desired, add H points (multiplied if desired by a humidity weighting factor) to the baseline number. For ambient humidity below a median, which can be empirically determined if desired, subtract H points (multiplied if desired by a humidity weighting factor) from the baseline number.


For ambient pressure in excess of a median, which can be empirically determined if desired, add I points (multiplied if desired by an ambient pressure weighting factor) to the baseline number. For ambient pressure below a median, which can be empirically determined if desired, subtract I points (multiplied if desired by an ambient pressure weighting factor) from the baseline number.


For the speed vector of the ambient wind that is directly against the user's direction of travel in excess of a median, which can be empirically determined if desired, add K points (multiplied if desired by an ambient wind weighting factor) to the baseline number. For a similar speed vector going with the user, subtract K points (multiplied if desired by an ambient wind weighting factor) from the baseline number.
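
The opposing wind-speed vector can be estimated as the component of the wind along the user's direction of travel; a brief sketch follows (function and parameter names are assumptions):

```python
import math

def headwind_mps(wind_speed, wind_from_deg, heading_deg):
    """Component of the wind opposing the user's heading.
    Positive values indicate a headwind (harder), negative a tailwind (easier).
    wind_from_deg is the compass direction the wind blows from."""
    wind_to_deg = (wind_from_deg + 180.0) % 360.0  # direction the wind blows toward
    angle = math.radians(wind_to_deg - heading_deg)
    return -wind_speed * math.cos(angle)

# A 5 m/s wind from due north against a runner heading due north is a 5 m/s headwind.
assert abs(headwind_mps(5.0, 0.0, 0.0) - 5.0) < 1e-9
```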


For an upward slope of terrain in the direction of the user's travel in excess of a median, which can be empirically determined if desired, add L points (multiplied if desired by a slope weighting factor) to the baseline number. For a downward slope, subtract L points (multiplied if desired by a slope weighting factor) from the baseline number.


For a difficult terrain surface in excess of a median, which can be empirically determined if desired, add M points (multiplied if desired by a terrain weighting factor) to the baseline number. For an easy terrain, subtract M points (multiplied if desired by a terrain weighting factor) from the baseline number.


For a caloric intake within the last Z hours below a median which can be empirically determined if desired, add N points (multiplied if desired by a calorie weighting factor) to the baseline number. For a caloric intake within the last Z hours above a median which can be empirically determined if desired, subtract N points (multiplied if desired by a calorie weighting factor) from the baseline number.


For a number of meetings within the last Z hours below a median which can be empirically determined if desired, subtract P points (multiplied if desired by a stress weighting factor) from the baseline number. For a number of meetings within the last Z hours above a median which can be empirically determined if desired, add P points (multiplied if desired by a stress weighting factor) to the baseline number.


For a number of meetings in the next Z hours below a median which can be empirically determined if desired, subtract Q points (multiplied if desired by a stress weighting factor) from the baseline number. For a number of meetings in the next Z hours above a median which can be empirically determined if desired, add Q points (multiplied if desired by a stress weighting factor) to the baseline number.


For an elevation in excess of a median, which can be empirically determined if desired, add R points (multiplied if desired by an elevation weighting factor) to the baseline number. For elevation below a median, which can be empirically determined if desired, subtract R points (multiplied if desired by an elevation weighting factor) from the baseline number.


When all of the adjustments to the baseline number, such as some or all of the above, are made, an adjusted baseline number is arrived at. The magnitude of the adjusted baseline number may then be used to output an effort level and/or coaching tips. In one example, the magnitude of the adjusted baseline number may be used as an entering argument in a table lookup as follows:














Adjusted Baseline Number   Effort Level      Coaching Tip
 0-10                      very low          “need a lot more from you!”
10-20                      moderately low    “try harder”
20-30                      low               “try a little harder”
30-40                      minimally low     “effort not quite there yet”
40-50                      below average     “almost average”
50-60                      above average     “doing OK today”
60-70                      minimally high    “effort higher than expected”
70-80                      high              “you are cranking it today!”
80-90                      moderately high   “doing more than enough!”
90-100                     very high         “take it easy Secretariat!”
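
A brief sketch of such a table lookup follows (the tiers mirror the table above; the helper name is an assumption):

```python
EFFORT_TABLE = [
    (10, "very low", "need a lot more from you!"),
    (20, "moderately low", "try harder"),
    (30, "low", "try a little harder"),
    (40, "minimally low", "effort not quite there yet"),
    (50, "below average", "almost average"),
    (60, "above average", "doing OK today"),
    (70, "minimally high", "effort higher than expected"),
    (80, "high", "you are cranking it today!"),
    (90, "moderately high", "doing more than enough!"),
    (100, "very high", "take it easy Secretariat!"),
]

def effort_feedback(adjusted_baseline):
    """Return (effort level, coaching tip) for an adjusted baseline number of 0-100."""
    for upper, level, tip in EFFORT_TABLE:
        if adjusted_baseline <= upper:
            return level, tip
    return EFFORT_TABLE[-1][1], EFFORT_TABLE[-1][2]
```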










FIG. 4 shows that a runner 200 wearing the CE device 70 shown in FIG. 2, absent present principles, would not have understood why a workout felt harder than the CE device indicated, as suggested by the thought balloon 202. However, when accounting according to present principles for biometric readings 204, terrain, position, and slope 206, climate conditions 208, and life context conditions 210, an adjusted output 212 may be presented on the CE device 70 visibly and/or audibly, correctly reflecting the combined-factor effort level of the runner 200.


With no particular reference to any figure, it is to be understood that lactate sensors may also be included on, and/or in communication with, the CE devices described herein for sensing lactate levels, which can be e.g. measured in sweat, to thus determine an effort level in accordance with present principles and accordingly be another biometric parameter to be factored into a determination/adjustment of a baseline in accordance with present principles. Thus, e.g., for lactate levels in excess of a median, which can be empirically determined if desired, XYZ points may be added as set forth herein (multiplied if desired by a lactate level weighting factor) to the baseline number. For lactate levels below a median, which can be empirically determined if desired, points may be subtracted (multiplied if desired by a lactate level weighting factor) from the baseline number.


While the particular Combining Data Sources to Provide Accurate Effort Monitoring is herein shown and described in detail, it is to be understood that the subject matter which is encompassed by the present invention is limited only by the claims.

Claims
  • 1. A device comprising: at least one computer readable storage medium bearing instructions executable by a processor; at least one processor configured for accessing the computer readable storage medium to execute the instructions to configure the processor for: receiving signals from a position sensor from which the processor can calculate a speed and a distance over an interval of time ΔT; receiving at least one signal representing at least one weather condition; receiving at least one signal representing at least one biometric condition of a user of the device; adjusting a baseline value associated with the speed and/or distance based at least in part on the biometric condition and weather condition to render an adjusted baseline; and outputting an indicia of exercise effort based at least in part on the adjusted baseline.
  • 2. The device of claim 1, wherein the processor when executing the instructions is further configured for: receiving at least one signal representing a slope of terrain associated with the exercise effort; and adjusting the baseline value based at least in part on the slope.
  • 3. The device of claim 1, wherein the processor when executing the instructions is further configured for: receiving at least one signal representing an elevation of terrain associated with the exercise effort; and adjusting the baseline value based at least in part on the elevation.
  • 4. The device of claim 1, wherein the processor when executing the instructions is further configured for: receiving at least one signal representing a type of terrain associated with the exercise effort; and adjusting the baseline value based at least in part on the type.
  • 5. The device of claim 1, wherein the weather condition includes humidity.
  • 6. The device of claim 1, wherein the weather condition includes temperature.
  • 7. The device of claim 1, wherein the weather condition includes barometric pressure.
  • 8. The device of claim 1, wherein the weather condition includes a wind condition.
  • 9. The device of claim 1, wherein the biometric condition includes heart rate.
  • 10. The device of claim 1, wherein the biometric condition includes a leg stride condition.
  • 11. The device of claim 1, wherein the biometric condition includes a sleep condition.
  • 12. The device of claim 1, wherein the biometric condition includes skin temperature.
  • 13. Method comprising: establishing a baseline effort indicator at least partially based on a pace and distance of an exercise of a person; adjusting the baseline effort indicator at least in part based on at least one biometric condition of the person; adjusting the baseline effort indicator at least in part based on at least one weather condition; adjusting the baseline effort indicator (BEI) at least in part based on at least one terrain condition; and outputting an adjusted effort indicator based at least in part on the adjusting steps.
  • 14. The method of claim 13, comprising, for average workout heart rates in excess of a test rate, increasing the BEI and for average workout heart rates below a test rate, decreasing the BEI.
  • 15. The method of claim 13, comprising, for a terrain slope associated with the distance in excess of a test value, increasing the BEI and for terrain slope below a test value, decreasing BEI.
  • 16. The method of claim 13, comprising, for temperature in excess of a test value, increasing the BEI and for temperature below a test value, decreasing BEI.
  • 17. The method of claim 13, comprising, for humidity in excess of a test value, increasing the BEI and for humidity below a test value, decreasing BEI.
  • 18. The method of claim 13, comprising, for a wind value in excess of a test value, increasing the BEI and for a wind value below a test value, decreasing BEI.
  • 19. The method of claim 13, comprising, for a terrain elevation associated with the distance in excess of a test value, increasing the BEI and for terrain elevation below a test value, decreasing BEI.
Provisional Applications (1)
Number Date Country
61878835 Sep 2013 US