Fitness training system with energy expenditure calculation that uses a form factor

Information

  • Patent Grant
  • Patent Number
    10,420,982
  • Date Filed
    Tuesday, June 4, 2013
  • Date Issued
    Tuesday, September 24, 2019
Abstract
Systems and methods are provided for prompting a user to perform an exercise and to monitor the exercise. The form of the user may be monitored, such as with one or more video cameras and/or other sensors, to determine how well the user is performing the exercise. Energy expenditure is estimated based on factors that include the type of the exercise, the form of the user, and a contribution value that associates energy expenditure with form.
Description
BACKGROUND

While most people appreciate the importance of physical fitness, many have difficulty finding the motivation required to maintain a regular exercise program. Some people find it particularly difficult to maintain an exercise regimen that involves continuously repetitive motions, such as running, walking and bicycling.


Additionally, individuals may view exercise as work or a chore and thus, separate it from enjoyable aspects of their daily lives. Often, this clear separation between athletic activity and other activities reduces the amount of motivation that an individual might have toward exercising. Further, athletic activity services and systems directed toward encouraging individuals to engage in athletic activities might also be too focused on one or more particular activities while an individual's interests are ignored. This may further decrease a user's interest in participating in athletic activities or using the athletic activity services and systems.


Therefore, improved systems and methods to address these and other shortcomings in the art are desired.


BRIEF SUMMARY

The following presents a simplified summary in order to provide a basic understanding of some aspects of the disclosure. The summary is not an extensive overview of the disclosure. It is neither intended to identify key or critical elements of the disclosure nor to delineate the scope of the disclosure. The following summary merely presents some concepts of the disclosure in a simplified form as a prelude to the description below.


Aspects of this disclosure relate to processing of data taken while a user performs an athletic activity to determine an estimate of energy expenditure such as, for example, an amount of calories burned.


Example embodiments may relate to a system, method, apparatus, and computer readable media configured for prompting a user to perform an exercise, monitoring form of the user while performing the exercise, and calculating an energy expenditure estimate for the user performing the exercise based on a type of the exercise and on the form of the user. In other embodiments, the energy expenditure estimate may be, or comprise, for example, an estimate of calories burned by the user. In certain embodiments, energy expenditure calculations comprise determinations relating to: effort, oxygen consumed, and/or oxygen kinetics of the user.


In various aspects, a system, method, apparatus, and/or computer readable media may be configured for processing data captured of a user performing an athletic activity over a time interval, and determining a location of a center of mass of a body part, body region, or entire body of the user at a first time instant and at a second time instant within the time interval. In further aspects, a system, method, apparatus, and/or computer readable media may be configured for identifying a change in the location of the center of mass from the first time instant to the second time instant, and calculating an energy expenditure estimate for the user due to the change.
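
Although the disclosure does not reduce this center-of-mass computation to a formula at this point, one hedged reading is a simple potential-energy model, E = m*g*Δh. The Python sketch below assumes that model; the function name and all values are illustrative only and are not the claimed method.

```python
# Illustrative sketch only: approximates energy expenditure from a change in
# center-of-mass height between two time instants using E = m * g * delta_h.
# The model, names, and values are assumptions, not the claimed method.

G = 9.81                 # gravitational acceleration, m/s^2
JOULES_PER_KCAL = 4184.0

def com_energy_kcal(mass_kg: float, h_first_m: float, h_second_m: float) -> float:
    """Energy (kcal) associated with moving a center of mass between two heights."""
    work_j = mass_kg * G * abs(h_second_m - h_first_m)
    return work_j / JOULES_PER_KCAL

# A 75 kg user whose whole-body center of mass rises from 0.95 m to 1.35 m:
print(com_energy_kcal(75.0, 0.95, 1.35))  # ~0.07 kcal for one such movement
```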


These and other aspects of the embodiments are discussed in greater detail throughout this disclosure, including the accompanying drawings.





BRIEF DESCRIPTION OF THE DRAWINGS

The present disclosure is illustrated by way of example and not limited in the accompanying figures in which like reference numerals indicate similar elements and in which:



FIGS. 1A-B illustrate an example of a system for providing personal training in accordance with example embodiments, wherein FIG. 1A illustrates an example network configured to monitor athletic activity, and FIG. 1B illustrates an example computing device in accordance with example embodiments.



FIGS. 2A-B illustrate example sensor assemblies that may be worn by a user in accordance with example embodiments.



FIG. 3 illustrates an example flow diagram of a method for calculating an energy expenditure estimate for a user that accounts for a user's form while exercising as part of the estimate, in accordance with example embodiments.



FIG. 4 illustrates example points on a user's body for monitoring during exercising in accordance with example embodiments.



FIG. 5 illustrates an example posture assessment in accordance with example embodiments.



FIG. 6 illustrates example displays of a virtual avatar of a user performing an exercise in accordance with example embodiments.



FIGS. 7A-B illustrate example displays of a virtual avatar of a user performing a squat in accordance with example embodiments.



FIG. 8 illustrates an example flow diagram of a method for calculating an energy expenditure estimate for a user while performing an athletic activity based on monitoring changes in potential energy, in accordance with example embodiments.



FIGS. 9, 10A-B, and 11 illustrate example locations of centers of mass for a virtual avatar of a user, in accordance with example embodiments.





DETAILED DESCRIPTION

In the following description of the various embodiments, reference is made to the accompanying drawings, which form a part hereof, and in which is shown by way of illustration various embodiments in which the disclosure may be practiced. It is to be understood that other embodiments may be utilized and structural and functional modifications may be made without departing from the scope and spirit of the present disclosure. Further, headings within this disclosure should not be considered as limiting aspects of the disclosure. Those skilled in the art with the benefit of this disclosure will appreciate that the example embodiments are not limited to the example headings.


I. Example Personal Training System
A. Illustrative Computing Devices


FIG. 1A illustrates an example of a personal training system 100 in accordance with example embodiments. Example system 100 may include one or more electronic devices, such as computer 102. Computer 102 may comprise a mobile terminal, such as a telephone, music player, tablet, netbook or any portable device. In other embodiments, computer 102 may comprise a set-top box (STB), desktop computer, digital video recorder(s) (DVR), computer server(s), and/or any other desired computing device. In certain configurations, computer 102 may comprise a gaming console, such as, for example, Microsoft® XBOX, Sony® Playstation, and/or Nintendo® Wii gaming consoles. Those skilled in the art will appreciate that these are merely example consoles for descriptive purposes and this disclosure is not limited to any console or device.


Turning briefly to FIG. 1B, computer 102 may include computing unit 104, which may comprise at least one processing unit 106. Processing unit 106 may be any type of processing device for executing software instructions, such as for example, a microprocessor device. Computer 102 may include a variety of non-transitory computer readable media, such as memory 108. Memory 108 may include, but is not limited to, random access memory (RAM) such as RAM 110, and/or read only memory (ROM), such as ROM 112. Memory 108 may include any of: electronically erasable programmable read only memory (EEPROM), flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic storage devices, or any other medium that can be used to store the desired information and that can be accessed by computer 102.


The processing unit 106 and the system memory 108 may be connected, either directly or indirectly, through a bus 114 or alternate communication structure to one or more peripheral devices. For example, the processing unit 106 or the system memory 108 may be directly or indirectly connected to additional memory storage, such as a hard disk drive 116, a removable magnetic disk drive, an optical disk drive 118, and a flash memory card. The processing unit 106 and the system memory 108 also may be directly or indirectly connected to one or more input devices 120 and one or more output devices 122. The output devices 122 may include, for example, a display device 136, television, printer, stereo, or speakers. In some embodiments one or more display devices may be incorporated into eyewear. The display devices incorporated into eyewear may provide feedback to users. Eyewear incorporating one or more display devices also provides for a portable display system. The input devices 120 may include, for example, a keyboard, touch screen, a remote control pad, a pointing device (such as a mouse, touchpad, stylus, trackball, or joystick), a scanner, a camera or a microphone. In this regard, input devices 120 may comprise one or more sensors configured to sense, detect, and/or measure athletic movement from a user, such as user 124, shown in FIG. 1A.


Looking again to FIG. 1A, image-capturing device 126 and/or sensor 128 may be utilized in detecting and/or measuring athletic movements of user 124. In one embodiment, data obtained from image-capturing device 126 or sensor 128 may directly detect athletic movements, such that the data obtained from image-capturing device 126 or sensor 128 is directly correlated to a motion parameter. For example, and with reference to FIG. 4, image data from image-capturing device 126 may detect that the distance between sensor locations 402g and 402i has decreased and therefore, image-capturing device 126 alone may be configured to detect that user's 124 right arm has moved. Yet, in other embodiments, data from image-capturing device 126 and/or sensor 128 may be utilized in combination, either with each other or with other sensors to detect and/or measure movements. Thus, certain measurements may be determined from combining data obtained from two or more devices. Image-capturing device 126 and/or sensor 128 may include or be operatively connected to one or more sensors, including but not limited to: an accelerometer, a gyroscope, a location-determining device (e.g., GPS), light sensor, temperature sensor (including ambient temperature and/or body temperature), heart rate monitor, image-capturing sensor, moisture sensor and/or combinations thereof. Example uses of illustrative sensors 126, 128 are provided below in Section I.C, entitled “Illustrative Sensors.” Computer 102 may also use touch screens or an image-capturing device to determine where a user is pointing to make selections from a graphical user interface. One or more embodiments may utilize one or more wired and/or wireless technologies, alone or in combination, wherein examples of wireless technologies include Bluetooth® technologies, Bluetooth® low energy technologies, and/or ANT technologies.


B. Illustrative Network

Still further, computer 102, computing unit 104, and/or any other electronic devices may be directly or indirectly connected to one or more network interfaces, such as example interface 130 (shown in FIG. 1B) for communicating with a network, such as network 132. In the example of FIG. 1B, network interface 130 may comprise a network adapter or network interface card (NIC) configured to translate data and control signals from the computing unit 104 into network messages according to one or more communication protocols, such as the Transmission Control Protocol (TCP), the Internet Protocol (IP), and the User Datagram Protocol (UDP). These protocols are well known in the art, and thus will not be discussed here in more detail. An interface 130 may employ any suitable connection agent for connecting to a network, including, for example, a wireless transceiver, a power line adapter, a modem, or an Ethernet connection. Network 132, however, may be any one or more information distribution network(s), of any type(s) or topology(s), alone or in combination(s), such as internet(s), intranet(s), cloud(s), LAN(s). Network 132 may be any one or more of cable, fiber, satellite, telephone, cellular, wireless, etc. Networks are well known in the art, and thus will not be discussed here in more detail. Network 132 may be variously configured such as having one or more wired or wireless communication channels to connect one or more locations (e.g., schools, businesses, homes, consumer dwellings, network resources, etc.), to one or more remote servers 134, or to other computers, such as similar or identical to computer 102. Indeed, system 100 may include more than one instance of each component (e.g., more than one computer 102, more than one display 136, etc.).


Regardless of whether computer 102 or other electronic device within network 132 is portable or at a fixed location, it should be appreciated that, in addition to the input, output and storage peripheral devices specifically listed above, the computing device may be connected, such as either directly, or through network 132 to a variety of other peripheral devices, including some that may perform input, output and storage functions, or some combination thereof. In certain embodiments, a single device may integrate one or more components shown in FIG. 1A. For example, a single device may include computer 102, image-capturing device 126, sensor 128, display 136 and/or additional components. In one embodiment, sensor device 138 may comprise a mobile terminal having a display 136, image-capturing device 126, and one or more sensors 128. Yet, in another embodiment, image-capturing device 126, and/or sensor 128 may be peripherals configured to be operatively connected to a media device, including for example, a gaming or media system. Thus, it follows from the foregoing that this disclosure is not limited to stationary systems and methods. Rather, certain embodiments may be carried out by a user 124 in almost any location.


C. Illustrative Sensors

Computer 102 and/or other devices may comprise one or more sensors 126, 128 configured to detect and/or monitor at least one fitness parameter of a user 124. Sensors 126 and/or 128 may include, but are not limited to: an accelerometer, a gyroscope, a location-determining device (e.g., GPS), light sensor, temperature sensor (including ambient temperature and/or body temperature), sleep pattern sensors, heart rate monitor, image-capturing sensor, moisture sensor and/or combinations thereof. Network 132 and/or computer 102 may be in communication with one or more electronic devices of system 100, including for example, display 136, an image capturing device 126 (e.g., one or more video cameras), and sensor 128, which may be an infrared (IR) device. In one embodiment, sensor 128 may comprise an IR transceiver. For example, sensors 126, and/or 128 may transmit waveforms into the environment, including towards the direction of user 124 and receive a “reflection” or otherwise detect alterations of those released waveforms. In yet another embodiment, image-capturing device 126 and/or sensor 128 may be configured to transmit and/or receive other wireless signals, such as radar, sonar, and/or audible information. Those skilled in the art will readily appreciate that signals corresponding to a multitude of different data spectrums may be utilized in accordance with various embodiments. In this regard, sensors 126 and/or 128 may detect waveforms emitted from external sources (e.g., not system 100). For example, sensors 126 and/or 128 may detect heat being emitted from user 124 and/or the surrounding environment. Thus, image-capturing device 126 and/or sensor 128 may comprise one or more thermal imaging devices. In one embodiment, image-capturing device 126 and/or sensor 128 may comprise an IR device configured to perform range phenomenology. As a non-limiting example, image-capturing devices configured to perform range phenomenology are commercially available from Flir Systems, Inc. of Portland, Oreg. Although image capturing device 126 and sensor 128 and display 136 are shown in direct (wirelessly or wired) communication with computer 102, those skilled in the art will appreciate that any may directly communicate (wirelessly or wired) with network 132.


1. Multi-Purpose Electronic Devices

User 124 may possess, carry, and/or wear any number of electronic devices, including sensory devices 138, 140, 142, and/or 144. In certain embodiments, one or more devices 138, 140, 142, 144 may not be specially manufactured for fitness or athletic purposes. Indeed, aspects of this disclosure relate to utilizing data from a plurality of devices, some of which are not fitness devices, to collect, detect, and/or measure athletic data. In one embodiment, device 138 may comprise a portable electronic device, such as a telephone or digital music player, including an IPOD®, IPAD®, or iPhone® brand devices available from Apple, Inc. of Cupertino, Calif. or Zune® or Microsoft® Windows devices available from Microsoft of Redmond, Wash. As known in the art, digital media players can serve as both an output device for a computer (e.g., outputting music from a sound file or pictures from an image file) and a storage device. In one embodiment, device 138 may be computer 102, yet in other embodiments, computer 102 may be entirely distinct from device 138. Regardless of whether device 138 is configured to provide certain output, it may serve as an input device for receiving sensory information. Devices 138, 140, 142, and/or 144 may include one or more sensors, including but not limited to: an accelerometer, a gyroscope, a location-determining device (e.g., GPS), light sensor, temperature sensor (including ambient temperature and/or body temperature), heart rate monitor, image-capturing sensor, moisture sensor and/or combinations thereof. In certain embodiments, sensors may be passive, such as reflective materials that may be detected by image-capturing device 126 and/or sensor 128 (among others). In certain embodiments, sensors 144 may be integrated into apparel, such as athletic clothing. For instance, the user 124 may wear one or more on-body sensors 144a-b. Sensors 144 may be incorporated into the clothing of user 124 and/or placed at any desired location of the body of user 124. Sensors 144 may communicate (e.g., wirelessly) with computer 102, sensors 128, 138, 140, and 142, and/or camera 126. Examples of interactive gaming apparel are described in U.S. patent application Ser. No. 10/286,396, filed Oct. 30, 2002, and published as U.S. Pat. Pub. No. 2004/0087366, the contents of which are incorporated herein by reference in their entirety for any and all non-limiting purposes. In certain embodiments, passive sensing surfaces may reflect waveforms, such as infrared light, emitted by image-capturing device 126 and/or sensor 128. In one embodiment, passive sensors located on user's 124 apparel may comprise generally spherical structures made of glass or other transparent or translucent surfaces which may reflect waveforms. Different classes of apparel may be utilized in which a given class of apparel has specific sensors configured to be located proximate to a specific portion of the user's 124 body when properly worn. For example, golf apparel may include one or more sensors positioned on the apparel in a first configuration and yet soccer apparel may include one or more sensors positioned on apparel in a second configuration.


Devices 138-144 may communicate with each other, either directly or through a network, such as network 132. Communication between one or more of devices 138-144 may occur through computer 102. For example, two or more of devices 138-144 may be peripherals operatively connected to bus 114 of computer 102. In yet another embodiment, a first device, such as device 138 may communicate with a first computer, such as computer 102 as well as another device, such as device 142, however, device 142 may not be configured to connect to computer 102 but may communicate with device 138. Those skilled in the art will appreciate that other configurations are possible.


Some implementations of the example embodiments may alternately or additionally employ computing devices that are intended to be capable of a wide variety of functions, such as a desktop or laptop personal computer. These computing devices may have any combination of peripheral devices or additional components as desired. Also, the components shown in FIG. 1B may be included in the server 134, other computers, apparatuses, etc.


2. Illustrative Apparel/Accessory Sensors

In certain embodiments, sensory devices 138, 140, 142 and/or 144 may be formed within or otherwise associated with user's 124 clothing or accessories, including a watch, armband, wristband, necklace, shirt, shoe, or the like. Examples of shoe-mounted and wrist-worn devices (devices 140 and 142, respectively) are described immediately below, however, these are merely example embodiments and this disclosure should not be limited to such.


i. Shoe-Mounted Device

In certain embodiments, sensory device 140 may comprise footwear which may include one or more sensors, including but not limited to: an accelerometer, location-sensing components, such as GPS, and/or a force sensor system. FIG. 2A illustrates one example embodiment of a sensor system 202. In certain embodiments, system 202 may include a sensor assembly 204. Assembly 204 may comprise one or more sensors, such as for example, an accelerometer, location-determining components, and/or force sensors. In the illustrated embodiment, assembly 204 incorporates a plurality of sensors, which may include force-sensitive resistor (FSR) sensors 206. In yet other embodiments, other sensor(s) may be utilized. Port 208 may be positioned within a sole structure 209 of a shoe. Port 208 may optionally be provided to be in communication with an electronic module 210 (which may be in a housing 211) and a plurality of leads 212 connecting the FSR sensors 206 to the port 208. Module 210 may be contained within a well or cavity in a sole structure of a shoe. The port 208 and the module 210 include complementary interfaces 214, 216 for connection and communication.


In certain embodiments, at least one force-sensitive resistor 206 shown in FIG. 2A may contain first and second electrodes or electrical contacts 218, 220 and a force-sensitive resistive material 222 disposed between the electrodes 218, 220 to electrically connect the electrodes 218, 220 together. When pressure is applied to the force-sensitive material 222, the resistivity and/or conductivity of the force-sensitive material 222 changes, which changes the electrical potential between the electrodes 218, 220. The change in resistance can be detected by the sensor system 202 to detect the force applied on the sensor 206. The force-sensitive resistive material 222 may change its resistance under pressure in a variety of ways. For example, the force-sensitive material 222 may have an internal resistance that decreases when the material is compressed, similar to the quantum tunneling composites described in greater detail below. Further compression of this material may further decrease the resistance, allowing quantitative measurements, as well as binary (on/off) measurements. In some circumstances, this type of force-sensitive resistive behavior may be described as “volume-based resistance,” and materials exhibiting this behavior may be referred to as “smart materials.” As another example, the material 222 may change the resistance by changing the degree of surface-to-surface contact. This can be achieved in several ways, such as by using microprojections on the surface that raise the surface resistance in an uncompressed condition, where the surface resistance decreases when the microprojections are compressed, or by using a flexible electrode that can be deformed to create increased surface-to-surface contact with another electrode. This surface resistance may be the resistance between the material 222 and the electrodes 218, 220 and/or the surface resistance between a conducting layer (e.g., carbon/graphite) and a force-sensitive layer (e.g., a semiconductor) of a multi-layer material 222. The greater the compression, the greater the surface-to-surface contact, resulting in lower resistance and enabling quantitative measurement. In some circumstances, this type of force-sensitive resistive behavior may be described as “contact-based resistance.” It is understood that the force-sensitive resistive material 222, as defined herein, may be or include a doped or non-doped semiconducting material.


The electrodes 218, 220 of the FSR sensor 216 can be formed of any conductive material, including metals, carbon/graphite fibers or composites, other conductive composites, conductive polymers or polymers containing a conductive material, conductive ceramics, doped semiconductors, or any other conductive material. The leads 212 can be connected to the electrodes 218, 220 by any suitable method, including welding, soldering, brazing, adhesively joining, fasteners, or any other integral or non-integral joining method. Alternately, the electrode 218, 220 and associated lead 212 may be formed of a single piece of the same material.


ii. Wrist-Worn Device

As shown in FIG. 2B, device 226 (which may resemble or be sensory device 142 shown in FIG. 1A) may be configured to be worn by user 124, such as around a wrist, arm, ankle or the like. Device 226 may monitor athletic movements of a user, including all-day activity of user 124. In this regard, device assembly 226 may detect athletic movement during user's 124 interactions with computer 102 and/or operate independently of computer 102. For example, in one embodiment, device 226 may be an all-day activity monitor that measures activity regardless of the user's proximity or interactions with computer 102. Device 226 may communicate directly with network 132 and/or other devices, such as devices 138 and/or 140. In other embodiments, athletic data obtained from device 226 may be utilized in determinations conducted by computer 102, such as determinations relating to which exercise programs are presented to user 124. In one embodiment, device 226 may also wirelessly interact with a mobile device, such as device 138 associated with user 124 or a remote website such as a site dedicated to fitness or health related subject matter. At some predetermined time, the user may wish to transfer data from the device 226 to another location.


As shown in FIG. 2B, device 226 may include an input mechanism, such as a depressible input button 228, to assist in operation of the device 226. The input button 228 may be operably connected to a controller 230 and/or any other electronic components, such as one or more of the elements discussed in relation to computer 102 shown in FIG. 1B. Controller 230 may be embedded or otherwise part of housing 232. Housing 232 may be formed of one or more materials, including elastomeric components, and comprise one or more displays, such as display 234. The display may be considered an illuminable portion of the device 226. The display 234 may include a series of individual lighting elements or light members such as LED lights 234 in an exemplary embodiment. The LED lights may be formed in an array and operably connected to the controller 230. Device 226 may include an indicator system 236, which may also be considered a portion or component of the overall display 234. It is understood that the indicator system 236 can operate and illuminate in conjunction with the display 234 (which may have pixel member 235) or completely separate from the display 234. The indicator system 236 may also include a plurality of additional lighting elements or light members 238, which may also take the form of LED lights in an exemplary embodiment. In certain embodiments, the indicator system 236 may provide a visual indication of goals, such as by illuminating a portion of lighting members 238 to represent accomplishment towards one or more goals.


A fastening mechanism 240 can be unlatched wherein the device 226 can be positioned around a wrist of the user 124 and the fastening mechanism 240 can be subsequently placed in a latched position. The user can wear the device 226 at all times if desired. In one embodiment, fastening mechanism 240 may comprise an interface, including but not limited to a USB port, for operative interaction with computer 102 and/or devices 138, 140.


In certain embodiments, device 226 may comprise a sensor assembly (not shown in FIG. 2B). The sensor assembly may comprise a plurality of different sensors. In an example embodiment, the sensor assembly may comprise or permit operative connection to an accelerometer (including in the form of a multi-axis accelerometer), heart rate sensor, location-determining sensor, such as a GPS sensor, and/or other sensors. Detected movements or parameters from device's 142 sensor(s) may include (or be used to form) a variety of different parameters, metrics or physiological characteristics including but not limited to speed, distance, steps taken, and energy expenditure such as calories, heart rate, sweat detection, effort, oxygen consumed, and/or oxygen kinetics. Such parameters may also be expressed in terms of activity points or currency earned by the user based on the activity of the user.


II. Illustrative Athletic Monitoring Methods

System 100 may prompt a user to perform one or more exercises, monitor user movement while performing the exercises, and provide the user with an energy expenditure estimate based on their movement. System 100 may analyze a user's form to determine if the user is making an exercise more or less difficult, and adjust the energy expenditure estimate accordingly. Energy expenditure estimates may be, or comprise, an estimate of calories burned by the user. In certain embodiments, energy expenditure determinations may be based on, and/or conveyed as, a point system. In one embodiment, calories may be converted to a point system, yet in other embodiments, measurements may be directly obtained in one or more point systems. In one implementation, activity points may be based upon: form, body movements, and/or completion of certain activities. In further embodiments, energy expenditure calculations may comprise determinations relating to: effort, oxygen consumed, and/or oxygen kinetics of the user. In one embodiment, computer 102, camera 126, sensor 128, and display 136 may be implemented within the confines of a user's residence, although other locations, including gyms and/or businesses, are contemplated. Further, as discussed above, computer 102 may be a portable device, such as a cellular telephone; therefore, one or more aspects discussed herein may be conducted in almost any location. In this regard, the example embodiments of this disclosure are discussed in the context of being implemented with one or more of the example components of system 100. Those skilled in the art will appreciate that reference(s) to a particular component, such as computer 102, is not meant to be limiting, but rather to provide an illustrative example of one of many possible implementations. Thus, although certain components may be referenced, it is to be assumed that other components of system 100 may be utilized unless expressly disclaimed or physically impossible. Further, aspects disclosed herein are not limited to example system 100.


A. Monitoring User Movements

While exercising, the system 100 may use one or more techniques to monitor user movement. FIG. 3 illustrates an example flow diagram of a method for calculating an energy expenditure estimate for a user that accounts for a user's form while exercising as part of the estimate, in accordance with example embodiments. The method may be implemented by a computer, such as, for example, computer 102, device 138, 140 and/or 142, as well as other apparatuses. The blocks shown in FIG. 3 may be rearranged, some blocks may be removed, additional blocks may be added, each block may be repeated one or more times, and the flow diagram may be repeated one or more times. The flow diagram may begin at block 302.


1. Perform User Assessment

In block 302, the method may include performing an initial assessment of the user. A user, such as user 124, may be positioned in range of a sensor, such as in front of the image capturing device 126 and/or sensor 128, which may comprise an infrared transceiver. Display 136 may present a representation of user 124 that may be a “mirror-image” or depict a virtual avatar, such as a user avatar, that moves to correspond with user movement. Computer 102 may prompt the user to move into a certain region relative to the image capturing device 126 and/or relative to the infrared transceiver 128 so that the user is within frame and/or range. When properly positioned, system 100 may process movement of the user. Although the term “initial” has been utilized, this assessment may occur each time the user initiates system 100, performs certain movements, upon passage of time, or for any other reason. Thus, references to assessments herein are not limited to a single assessment.


a. Identify Sensory Locations

System 100 may process sensory data to identify user movement data. In one embodiment, sensory locations on a user's body may be identified. With reference to FIG. 4, sensory locations 402a-402o may correspond to locations of interest on the user's 124 body (e.g., ankles, elbows, shoulders, etc.). For example, images of recorded video, such as from camera 126, may be utilized in an identification of the sensory locations 402a-402o. For example, the user may stand a certain distance, which may or may not be predefined, from the camera 126, and system 100 may process the images to identify the user 124 within the video, for example, using disparity mapping techniques. In an example, image capturing device 126 may be a stereo camera having two or more lenses that are spatially offset from one another and that simultaneously capture two or more images of the user. System 100 may process the two or more images taken at a same time instant to generate a disparity map for determining a location of certain parts of the user's body in each image (or at least some of the images) in the video using a coordinate system (e.g., Cartesian coordinates). The disparity map may indicate a difference between an image taken by each of the offset lenses.
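
As a hedged aside, the standard stereo relation depth = (focal length × baseline) / disparity is one common way a disparity map can be converted into body-part locations; the disclosure does not specify a particular formula. The sketch below assumes that relation, with hypothetical parameter values.

```python
# Illustrative sketch of the standard stereo-disparity relation
# depth = (focal_length * baseline) / disparity. Names and values are
# assumptions for illustration, not taken from the disclosure.

def depth_from_disparity(focal_length_px: float, baseline_m: float, disparity_px: float) -> float:
    """Approximate distance (meters) to a point seen by two spatially offset lenses."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return (focal_length_px * baseline_m) / disparity_px

# A point shifted 24 px between two lenses 6 cm apart, focal length 800 px:
print(depth_from_disparity(800.0, 0.06, 24.0))  # ~2.0 m
```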


In a second example, one or more sensors may be located on or proximate to the user's 124 body at the sensory locations 402a-402o or the user 124 may wear a suit having sensors situated at various locations. Yet, in other embodiments, sensor locations may be determined from other sensory devices, such as devices 138, 140 and/or 142. In this regard, sensors may be physical sensors located on a user's clothing, yet in other embodiments, sensor locations 402a-402o may be based upon identification of relationships between two moving body parts. For example, sensor location 402a may be determined by identifying motions of user 124. In this regard, the overall shape or portion of a user's body may permit identification of certain body parts. Regardless of whether a camera, such as camera 126, is utilized and/or a physical sensor located on the user 124, such as sensors within device(s) 138, 140, 142 are utilized, the sensors may sense a current location of a body part and/or track movement of the body part.


In certain embodiments, a time stamp may be added to the data collected (such as collected as part of block 302 in FIG. 3) indicating a specific time when a body part was at a certain location. Sensor data may be received at computer 102 (or other device) via wireless or wired transmission. A computer, such as computer 102 and/or devices 138, 140, 142, may process the time stamps to determine the locations of the body parts using a coordinate system (e.g., Cartesian coordinates) within each (or at least some) of the images in the video. Data received from camera 126 may be corrected, modified, and/or combined with data received from one or more other devices 138, 140, and 142.


In a third example, system 100 may use infrared pattern recognition to detect user movement and locations of body parts of the user 124. For example, sensor 128 may include an infrared transceiver, which may be part of camera 126, or another device, that may emit an infrared signal to illuminate the user's 124 body using infrared signals. The infrared transceiver 128 may capture a reflection of the infrared signal from the body of user 124. Based on the reflection, the system 100 may identify a location of certain parts of the user's body using a coordinate system (e.g., Cartesian coordinates) at particular instances in time. Which body parts are identified, and how they are identified, may be predetermined based on a type or types of exercise a user is requested to perform.


As part of a workout routine, system 100 may make an initial postural assessment of the user 124 as part of the initial user assessment in block 302 of FIG. 3. With reference to FIG. 5, system 100 may analyze front and side images of a user 124 to determine a location of one or more of a user's shoulders, upper back, lower back, hips, knees, and ankles. On-body sensors and/or infrared techniques may also be used, either alone or in conjunction with camera 126, to determine the locations of various body parts for the postural assessment. For example, system 100 may determine assessment lines 124a-g and/or regions 502-512 to determine the locations of various points on a user's body, such as, for example, ankles, knees, hips, upper back, lower back, and shoulders.


b. Identify Sensory Regions

In further embodiments, system 100 may identify sensory regions (see, e.g., block 302). In one embodiment, assessment lines 124a-g may be utilized to divide the user's body into regions. For example, lines 124b-f may be horizontal axes. For example, a “shoulders” region 502 may correlate to a body portion having a lower boundary around the user's shoulders (see line 124b), region 504 may correlate to the body portion between the shoulders (line 124b) and about half the distance to the hips (see line 124c) and thus be an “upper back” region, and region 506 may span the area between line 124c to the hips (see line 124d) to comprise a “lower back region.” Similarly, region 508 may span the area between the “hips” (line 124d) and the “knees” (see line 124e), region 510 may span between lines 124e and 124f, and region 512 (see “ankles”) may have an upper boundary around line 124f. Regions 502-512 may be further divided, such as into quadrants, such as by using axes 124a and 124g. To aid in the identification of one or more sensory regions, system 100 may prompt the user to make one or more specific movements. For example, system 100 may prompt a user to move a specific body part or region (e.g., wave their right arm, or wave the left arm in a specific pattern) to aid the system 100 (e.g., computer algorithm processing information received from the infrared transceiver 128) in determining which body part or region is in a specific location within a coordinate system.
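
For illustration only, if the assessment lines are reduced to vertical boundaries in normalized image coordinates, region assignment might be sketched as follows. All boundary values and names are hypothetical, not taken from the disclosure.

```python
# A minimal sketch of dividing tracked sensory locations into regions by
# vertical position, in the spirit of assessment lines 124b-f. Boundary
# values are hypothetical placeholders.

REGION_BOUNDARIES = [  # (upper y-boundary in normalized coords, region name)
    (0.20, "shoulders"),    # region 502
    (0.35, "upper back"),   # region 504
    (0.50, "lower back"),   # region 506
    (0.70, "hips/thighs"),  # region 508
    (0.85, "knees/shins"),  # region 510
    (1.00, "ankles"),       # region 512
]

def region_for(y: float) -> str:
    """Map a normalized vertical coordinate (0 = top of frame) to a body region."""
    for boundary, name in REGION_BOUNDARIES:
        if y <= boundary:
            return name
    return "ankles"

print(region_for(0.35))  # "upper back"
```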


c. Categorize Locations or Regions

In certain embodiments, body parts or regions that are not proximate to each other may nonetheless be categorized into the same movement category (see, e.g., block 302). For example, as shown in FIG. 5, the “upper back”, “hips”, and “ankles” regions 504, 508, 512 may be categorized as belonging to a “mobility” category. In another embodiment, the “lower back” and “knees” regions 506, 510 may be categorized as belonging to a “stability” category. The categorizations are merely examples, and in other embodiments, a location or region may belong to multiple categories. For example, a “center of gravity” region may be formed from regions 504 and 506. In one embodiment, a “center of gravity” may comprise portions of regions 504 and 506. In another embodiment, a “center of moment” category may be provided, either independently, or alternatively, as comprising a portion of at least another category. In one embodiment, a single location may be weighted in two or more categories, such as being 10% weighted in a “stability” category and 90% weighted in a “mobility” category.


System 100 may also process the image to determine a color of clothing of the user or other distinguishing features to differentiate the user from their surroundings. After processing, system 100 may identify a location of multiple points on the user's body and track locations of those points, such as locations 402 in FIG. 4. System 100 may also prompt the user to answer questions to supplement the postural assessment, such as, for example, age, weight, etc. Again, block 302 is optional and is not required in accordance with various embodiments.


2. Providing Form

With reference again to FIG. 3, in block 304, various embodiments may include demonstrating proper form for an exercise and prompting the user to perform the exercise. For example, after or in addition to the initial postural assessment, the system 100 (such as with computer 102) may cause the display 136 to present a virtual trainer demonstrating an exercise to instruct the user on proper form and/or may present a depiction and/or an actual video of a real person demonstrating proper form for an exercise. System 100 may then prompt the user to begin performing the exercise.


With reference to FIG. 3, in block 306, various embodiments may include monitoring form of a user performing the exercise. As seen in FIG. 6, system 100, such as through computer 102, may cause the display 136 to present a virtual avatar 602 of the user. The virtual avatar 602 may move in synchronism with the user 124. Also, the display 136 may present video of the actual user, rather than avatar 602. System 100 may process one or more frames in the video to determine at least some of the sensory locations 402, or may receive data from sensors worn on-body by the user. As shown in FIG. 6, sensory locations 402 may be displayed on the virtual avatar.


For proper form during many exercise routines, a user may proceed through multiple positions during a repetition of an exercise. Certain aspects disclosed herein relate to defining one or more measurement positions and/or desired locations for one or more sensory locations 402. For example, a measurement position may refer to a particular relationship between various body parts during a repetition. For example, a measurement position may indicate a desired location for a user's body part (e.g., desired location of user's left elbow) and may indicate a desired relationship between multiple body parts (e.g., angle between a user's torso and thigh). For a movement or series of movements (such as an exercise routine), system 100 may define one or more measurement positions and/or desired locations for one or more of the sensory locations 402 for a measurement position. In various implementations, each repetition of an exercise can be broken down into one or more measurement positions.


System 100, such as through computer 102, may process video or sensor data of a user performing an exercise to determine when a user's body has reached a measurement position. For each measurement position, system 100 may compare the measured sensory locations to desired sensory locations to monitor the user's form while performing the exercise. For example, frame 1 of FIG. 6 may correspond to a first measurement position and frame 2 may correspond to a second measurement position. System 100 may determine a distance between sensory locations 402c and 402d at each measurement position. Other relationships between sensory locations may be specified (e.g., certain angle, certain position, etc.).
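
One such relationship, the angle at a joint formed by three sensory locations, can be computed with elementary vector math. The following sketch is an assumption-laden illustration rather than the disclosed implementation; the coordinates are hypothetical.

```python
# Sketch: angle (degrees) at a joint defined by three tracked sensory
# locations, e.g., the hip angle between torso and thigh. Illustrative only.

import math

Point = tuple[float, float]

def joint_angle(a: Point, vertex: Point, c: Point) -> float:
    """Angle at `vertex` formed by segments vertex->a and vertex->c."""
    v1 = (a[0] - vertex[0], a[1] - vertex[1])
    v2 = (c[0] - vertex[0], c[1] - vertex[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    norm = math.hypot(*v1) * math.hypot(*v2)
    return math.degrees(math.acos(dot / norm))

# Hypothetical shoulder, hip, and knee coordinates near the bottom of a squat:
shoulder, hip, knee = (0.0, 1.4), (0.1, 0.9), (0.5, 0.9)
print(joint_angle(shoulder, hip, knee))  # ~101 degrees between torso and thigh
```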


With reference again to FIG. 3, in block 308, various embodiments may include calculating an energy expenditure estimate for the user. Calculations may be based on a type of the exercise, on the form of the user, and/or on a contribution value that correlates energy expenditure with form for a given exercise or drill. The energy expenditure estimate may be, or comprise, for example, an estimate of calories burned by the user. In certain embodiments, energy expenditure calculations comprise determinations relating to: effort, oxygen consumed, and/or oxygen kinetics of the user. During a workout session or upon its completion, the system 100 may inform the user of energy expended. In one embodiment, system 100 may provide an indication of a quantity of calories they have burned. To provide a more accurate calories burned estimate, system 100 may account for a user's form while performing an exercise as well as the type of exercise that was performed.


Form may impact energy expenditure during some exercises or drills and not others. For example, performing a drill, such as an agility shuffle, with improper form may result in a lower or higher energy expenditure than performing the same drill with proper form. In one embodiment, multiple aspects of a drill or exercise are monitored. The monitoring may be performed with a camera, multiple cameras and/or other sensors. Each aspect of the exercise or drill may be assigned a form score and a contribution value that indicates the relevance of proper form for that aspect of the exercise or drill to energy expenditure. For example, a squat drill or exercise may have form scores for the level of back straightness, depth of a squat and how well heels remain on the ground. A user may receive form scores of 0.3 for back straightness, 0.6 for squat depth and 0.1 for how well her heels remain on the ground. In one embodiment of the invention, contribution values may be either one or zero. Of course, several other values may also be used. Returning to the squat example, it may be determined by prior monitoring or calculations that back straightness and how well heels remain on the ground have no relevance to energy expenditure, while squat depth is relevant. Contribution values of zero, one and zero may be assigned for back straightness, squat depth and how well heels remain on the ground, respectively. Form scores may be multiplied by contribution values when determining energy expenditure.
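
Continuing the squat example, a minimal sketch of multiplying form scores by contribution values might look like the following; the data structures and names are illustrative only.

```python
# Sketch of the squat example above: form scores of 0.3, 0.6, and 0.1 with
# contribution values of 0, 1, and 0, respectively. Structure is illustrative.

form_scores = {"back_straightness": 0.3, "squat_depth": 0.6, "heels_down": 0.1}
contributions = {"back_straightness": 0.0, "squat_depth": 1.0, "heels_down": 0.0}

def form_contribution(scores: dict, weights: dict) -> float:
    """Multiply each form score by its contribution value and sum the products."""
    return sum(scores[aspect] * weights[aspect] for aspect in scores)

print(form_contribution(form_scores, contributions))  # 0.6: only squat depth counts
```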


Further embodiments may utilize user attributes to more accurately identify a number of calories burned by a user. Example user attributes may be height, weight, age, etc. One or more sensors may determine the user attributes, or the user may input the user attributes via an interface to a computer, such as computer 102.


System 100 may use information from sensory locations 402 detected at measurement positions of an exercise in combination with one or more known values to obtain a more accurate determination of calories burned. In one embodiment, a known value may comprise or be part of a Metabolic Equivalent of Task (MET) table. A MET table, for example, may be defined for a particular exercise (e.g., squat, lunge, etc.) and used to determine how many calories a user burned during a workout. System 100 may store or have access to multiple MET tables corresponding to different exercises (e.g., squat, lunge, jumping rope, push up, running, etc.). System 100 may process data from the video and/or sensors to determine a number of repetitions of an exercise that a user has performed or duration of an exercise, and may estimate a number of calories burned by the user based on the repetitions and/or duration information and the one or more known values, such as may be obtained from MET tables.
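
As a hedged illustration of a MET-table lookup, the sketch below assumes the widely used convention kcal = MET × body mass (kg) × duration (h); the MET values themselves are placeholders, and the disclosure does not commit to this or any specific formula.

```python
# Illustrative MET-table lookup. MET values and the kcal formula are
# assumptions for illustration, not values from the disclosure.

MET_TABLE = {"squat": 5.0, "lunge": 4.0, "jumping rope": 11.0, "push up": 3.8}

def met_calories(exercise: str, weight_kg: float, duration_hours: float) -> float:
    """Estimate calories burned from a MET value, body weight, and duration."""
    return MET_TABLE[exercise] * weight_kg * duration_hours

# A 70 kg user jumping rope for 15 minutes:
print(met_calories("jumping rope", 70.0, 0.25))  # ~192 kcal
```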


MET tables, however, are statistical averages and are not as accurate as they could be. Thus, conventional calorie measurement systems that rely on MET tables merely provide a user with a rough estimate of how many calories they burned during a workout. Although embodiments of this disclosure may utilize one or more values from a MET table, aspects of this disclosure are not limited by the deficiencies of prior measurement systems. For example, in one embodiment the user's form may be accounted for. System 100 may apply a scaling factor to a calories burned estimate based on detected sensory location information. The scaling factor may reflect how well a user has performed an exercise and in certain embodiments may consider attributes of the user. For example, the scaling factor may be a function of one or more of the sensory location information, a duration during which the user performed an exercise, information reported by the user (e.g., age, weight), a user's heart rate taken by a heart rate monitor, a pressure measurement, and/or other data. A pressure measurement may be obtained from pressure sensor 140 located in a shoe, for example, to determine how much force a user exerts during movement. For example, a user may be holding a weight in each hand and the pressure sensor 140 may monitor pressure at the shoe. The pressure sensor 140 may also indicate how quickly a user changes direction (e.g., how hard a user made a cut) or how much power was exerted when jumping.


To determine the scaling factor, system 100 may monitor for relationships between one or more body parts at one or more measurement positions during a repetition of an exercise. Modifications to these relationships may make an exercise easier or harder to perform. The scaling factor may consider factors indicative of whether a user is making the exercise more or less difficult to complete, and may adjust a calories burned estimate accordingly. In a squat, for example, relationships may be defined for a first angle between a user's torso and thighs, and a second angle between a user's thighs and shin while performing the squat. System 100 may process sensory location information to measure the first and second angle of the user over time for comparison with the desired first and second angle.


In an example, with reference to FIGS. 7A-B, a virtual avatar 702 of a user is displayed performing a squat. Virtual avatar 702 is depicted as a stick figure, and proper technique for an exercise is shown as a shaded region 704. At the lowest part of the squat (for example, as shown in FIG. 7A), the desired form may specify a relationship between a user's thigh and shin, between a user's back and arms, and/or any other two parts or locations of the user. In one embodiment, the desired form may specify a first predetermined angle between locations or parts, for example, between a user's upper leg and lower leg, and/or a second predetermined angle between a user's back and arms. System 100 may process the sensory location information to compare the user's form to the desired form. For example, system 100 may process the sensory location information to determine an angle between the user's thigh and shin, and an angle between the user's back and arms when performing a squat.


System 100 may define thresholds for the relationships between various body parts for adjusting the scaling factor. The thresholds may permit the user's form to differ by a certain amount from the desired form. For a preferred threshold, system 100 may determine that the user has good form that does not require any adjustment of the scaling factor (e.g., less than a 5% difference between the angle between the user's upper leg and lower leg and the desired angle). For an acceptable threshold, the system 100 may nominally adjust the scaling factor upward or downward to reflect increased or reduced effort by the user (e.g., a 5-15% difference between the angle between the user's upper leg and lower leg and the desired angle). For an unacceptable threshold, the system 100 may determine that the user's form has reduced the amount of effort to perform the exercise and may downwardly adjust the scaling factor (e.g., greater than a 15% difference between the angle between the user's upper leg and lower leg and the desired angle).
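
A sketch of this threshold scheme follows, using the example percentage bands above; the returned factor values are hypothetical, since the disclosure specifies the bands but not particular factor amounts.

```python
# Sketch of the preferred / acceptable / unacceptable threshold bands above.
# Only the bands come from the text; the factor values are hypothetical.

def scaling_factor(measured_angle: float, desired_angle: float) -> float:
    """Form-based scaling factor for a calorie estimate."""
    deviation = abs(measured_angle - desired_angle) / desired_angle
    if deviation < 0.05:
        return 1.0   # preferred: good form, no adjustment
    if deviation <= 0.15:
        return 0.95  # acceptable: nominal adjustment (could instead be upward
                     # if the deviation reflects increased effort)
    return 0.80      # unacceptable: form reduced effort, larger downward adjustment

print(scaling_factor(measured_angle=100.0, desired_angle=90.0))  # ~11% off -> 0.95
```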


System 100 may also adjust the scaling factor based on omissions or additions a user makes when performing an exercise. For example, a user may not be doing an arm movement in an exercise that requires movement of both arms and legs. Also, if the user is performing an additional movement beyond what is specified for an exercise, the system 100 may adjust the scaling factor to increase the calorie estimate.


Upon determining the scaling factor, the system 100 may determine an amount of calories burned as a function of the scaling factor(s) and the calorie estimate. The function may be a multiplication of the calorie estimate by the scaling factor, or via other relationships. For example, the scaling factor may be adjustments to a number of variables in a mathematical equation for adjusting calories burned by one or more of multiplication, addition, and subtraction. In further embodiments, system 100 may cease determinations relating to caloric expenditure if the user deviates from a threshold. For example, a user may be interrupted during a workout routine and either forget or be too distracted to “pause” the determination, thus, certain embodiments may cease determining caloric expenditure upon detecting that a user is not performing an exercise. Further embodiments may cease or otherwise alter determinations of caloric expenditure if one or more variation thresholds are exceeded, such as for example, if a user is over-extending or under-extending a body region or part. In certain embodiments, if a user's movements are prone to cause injury, measurements and/or determinations relating to caloric expenditure may be stopped. In one implementation, system 100 may provide cues and/or instructions to correct the user's deficiencies or incorrect movements.


The following provides an example equation for calculating an amount of calories burned by a user during a workout.

Calories burned=BMR*(Activity modifier)*(Completeness modifier)  Equation (1):


In equation (1), BMR is an acronym for Basal Metabolic Rate. The system 100 may calculate the BMR using the Mifflin-St. Jeor Equation, BMR=(10*w)+(6.25*h)−(5.0*a)+(5 for men, −161 for women), where “*” is the multiplication symbol, “w”=weight in kilograms, “h”=height in centimeters, “a”=age in years. The system 100 may also use the Harris-Benedict equation instead of, or in addition to, the Mifflin-St. Jeor Equation.
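
The Mifflin-St. Jeor computation translates directly into code; the following sketch implements the equation exactly as stated above.

```python
# The Mifflin-St. Jeor equation as given in the text:
# BMR = (10*w) + (6.25*h) - (5.0*a) + (5 for men, -161 for women).

def mifflin_st_jeor_bmr(weight_kg: float, height_cm: float,
                        age_years: float, male: bool) -> float:
    """Basal Metabolic Rate in kcal/day."""
    return 10 * weight_kg + 6.25 * height_cm - 5.0 * age_years + (5 if male else -161)

print(mifflin_st_jeor_bmr(70.0, 175.0, 30.0, male=True))  # 1648.75 kcal/day
```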


The activity modifier may be an adjustment corresponding to a type of exercise being performed by a user. The activity modifier may be larger for more strenuous exercises, and smaller for less strenuous exercises. System 100 may store a file containing activity modifiers, where each activity modifier may have a value for a particular exercise type. Two or more exercises may have activity modifiers with a same value, or a certain exercise may have a unique value for the activity modifier. The activity modifier may have a default value. In one example embodiment, the default value may be 0.1. In a second embodiment, the default value may be 1.0. The default value may be any value, including 0.0. System 100 may update the default value to correspond to the activity modifier for an exercise currently being performed by the user. Over a duration of the workout, system 100 may use different ones of the activity modifiers to calculate calories burned using equation (1) corresponding to different exercises the user is prompted to perform. One or more factors may contribute to the activity modifier and/or adjustment of the modifier. Examples include, but are not limited to: pace, type of exercise, duration, and combinations thereof. Further, activity modifiers and/or variation of activity modifiers may be determined from predetermined values (such as a value assigned to an exercise or movement that a user is prompted to perform), the user's performance, information from a MET table on a particular exercise, and combinations thereof.


The completeness modifier may be used for adjusting the BMR based on how well a user's form corresponds to a desired form when performing an exercise. In an example, the completeness modifier may indicate what percentage of full movement was achieved for each repetition when performing an exercise (e.g., determine a percentage of a measured angle between the user's torso and thighs for a particular repetition of an exercise relative to a desired angle), or may be an average of the percentage of full movement for a predetermined number of repetitions (e.g., last three exercises, last five exercises, all exercises, etc.). The completeness modifier may have a default value. In one example embodiment, the default value may be 0.1. In a second embodiment, the default value may be 1.0. The default value may be any value, including 0.0. System 100 may update the completeness modifier over time based on how well the user's form conforms to a desired form. One or more factors may contribute to the completeness modifier and/or adjustment of the modifier. Examples include, but are not limited to: pace, type of exercise, duration, and combinations thereof. Further, completeness modifiers and/or variation of completeness modifiers may be determined from predetermined values (such as a value assigned to an exercise or movement that a user is prompted to perform), the user's performance, and combinations thereof.
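
Putting the pieces of equation (1) together, a hedged sketch might look as follows. The activity-modifier values and the completeness computation (average fraction of full movement over recent repetitions) are illustrative assumptions, and the disclosure leaves unit and time scaling to the modifiers.

```python
# Sketch of equation (1): Calories burned = BMR * (Activity modifier)
# * (Completeness modifier). All modifier values below are hypothetical.

ACTIVITY_MODIFIERS = {"squat": 1.2, "lunge": 1.1, "running": 1.6}

def completeness_modifier(measured_angles: list[float], desired_angle: float) -> float:
    """Average fraction of the desired movement achieved over recent repetitions."""
    return sum(min(a / desired_angle, 1.0) for a in measured_angles) / len(measured_angles)

def calories_eq1(bmr: float, exercise: str,
                 measured_angles: list[float], desired_angle: float) -> float:
    return bmr * ACTIVITY_MODIFIERS[exercise] * completeness_modifier(
        measured_angles, desired_angle)

# Using the BMR computed above and three recent squat repetitions:
print(calories_eq1(1648.75, "squat", [85.0, 88.0, 90.0], 90.0))
```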


Equation (2), provided below, may be utilized in further embodiments.

Calories burned=BMR*(Activity modifier)*(Completeness modifier)*(Multiply Modifier)+(Addition Modifier)  Equation (2):


Values for BMR, Activity Modifier, and/or Completeness Modifier of Equation (2) may be determined in accordance with one or more embodiments described above in reference to Equation (1). In one embodiment, the value of the Multiply Modifier may be defined for each type of exercise. In one example embodiment, the default value may be 0.1. In a second embodiment, the default value may be 1.0. The default value may be any value, including 0.0. System 100 may update the Multiply Modifier during a workout to correspond to a type of exercise the user is prompted to perform. In certain embodiments, the Multiply Modifier may be obtained (either partially or entirely) from empirical data.


In certain embodiments, the value of the Addition Modifier may be defined for each type of exercise. In one example embodiment, the default value may be 0.1. In a second embodiment, the default value may be 1.0. The default value may be any value, including 0.0. System 100 may update the Addition Modifier during a workout to correspond to a type of exercise the user is prompted to perform. In certain embodiments, the Addition Modifier may be obtained (either partially or entirely) from empirical data.
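The following sketch combines the modifiers of Equation (2); the default arguments shown (1.0 and 0.0) are simply two of the example default values mentioned above.

    def calories_burned_eq2(bmr, activity_mod, completeness_mod,
                            multiply_mod=1.0, addition_mod=0.0):
        # Equation (2): BMR * Activity * Completeness * Multiply + Addition.
        return bmr * activity_mod * completeness_mod * multiply_mod + addition_mod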


System 100 may calculate the calories burned over a duration of a workout, which may utilize equation (1) or (2). System 100 may cause the display 136 to display a running total of calories burned. In certain embodiments, the total may be determined for one or more completed repetitions and one or more completed sets of each exercise. System 100 may also calculate and cause display of calories burned by type of exercise performed. Other information such as, for example, peak/minimum/average calorie burning rate by workout, by repetition, by set, or by exercise type may also be calculated and displayed. System 100 may periodically determine an amount of calories burned by the user while exercising using equation (1). System 100 may indicate a current amount of calories burned that is continually updated over a workout (e.g., a running total), or may update the calories burned amount at predetermined times (e.g., when the user completes a set of a first type of exercise and begins a set of a second type of exercise, at the end of the workout session, etc.). System 100 may also inform the user how many calories were burned during each repetition as well as in each set of an exercise.
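A running total such as the one described above might be accumulated as in the following illustrative sketch; the class and method names are hypothetical.

    class CalorieTotals:
        # Running totals over a workout, updated per repetition or per set.
        def __init__(self):
            self.total = 0.0
            self.by_exercise_type = {}

        def add(self, exercise_type, calories):
            self.total += calories
            self.by_exercise_type[exercise_type] = (
                self.by_exercise_type.get(exercise_type, 0.0) + calories)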


One or more of the inputs and/or variables used in the determination of caloric expenditure (such as with equation (1)) may remain the same regardless of the type of exercise being performed by the user, yet others may vary. For example, the BMR may be the same over the entire workout because a user's weight, height, and age do not change appreciably over the course of a workout. Further, one or more of the Activity modifier, Completeness modifier, Multiply Modifier, and Addition Modifier may vary over the workout. These values (and/or their variation) may depend on the type of exercise currently being performed by the user.


The Completeness modifier may vary from repetition to repetition. As noted above, system 100 may generate the Completeness modifier based on monitoring a user's form while they perform an exercise. Generally, an exercise includes a sequence of motions to perform one repetition, and a user typically performs a set that includes two or more repetitions. A user's form may vary from repetition to repetition, and so may the Completeness modifier.


System 100 may determine calories burned using equation (1) based on a Completeness modifier that varies from repetition to repetition, or based on a filtered version of the Completeness modifier. To filter the Completeness modifier, the system 100 may, for example, determine a Completeness modifier for one or more repetitions, may average some or all of the Completeness modifiers, and may use the average in equation (1). Also, system 100 may generate the Completeness modifier as a weighted average, where Completeness modifiers of some repetitions may be given greater weight than others. For example, system 100 may apply a decaying function where more recent Completeness modifiers are weighted more heavily than less recent ones when generating an average.
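One way to realize such a decaying function, assuming a geometric weighting (the decay constant here is arbitrary), is sketched below.

    def filtered_completeness(history, decay=0.8):
        # Weighted average in which more recent repetitions count more.
        # history holds completeness modifiers ordered oldest to newest;
        # the repetition i steps before the newest receives weight decay**i.
        if not history:
            return 0.0
        weights = [decay ** i for i in range(len(history) - 1, -1, -1)]
        return sum(w * c for w, c in zip(weights, history)) / sum(weights)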


System 100 may also allow a user to make desired movements, and calculate an amount of calories burned for such movement. In one embodiment, all detected movements may be utilized in calculations. Yet in other embodiments, only certain (e.g., system supported and/or those prompted to be performed) movements may be considered. System 100 may process data from image capturing device 126 and/or from various sensors to attempt to classify a user's movement. For example, system 100 may compare the user's movement to other known movements for which a MET table has been defined. If a user's movement corresponds to a known movement for which a MET table has been defined, then system 100 may use the identified MET table for calculating an amount of calories burned.
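For illustration, a MET-based lookup might resemble the following; the MET values are hypothetical, and the conversion relies on the conventional approximation that 1 MET is roughly 1 kcal per kilogram of body mass per hour.

    # Hypothetical MET values; actual tables are defined per exercise.
    MET_TABLE = {"squat": 5.0, "pushup": 3.8, "jumping jacks": 8.0}

    def calories_from_met(exercise_type, weight_kg, duration_hours):
        met = MET_TABLE.get(exercise_type)
        if met is None:
            return None  # movement not matched to a known MET table
        # kcal ~= MET * body mass (kg) * duration (hours)
        return met * weight_kg * duration_hours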


If the user's movement does not match an exercise defined by a MET table, the system 100 may identify one or more exercises that include movements similar to the movement being performed by the user. For example, system 100 may determine that the user's lower body moves similarly to a squat and the upper body moves similarly to a pushup. System 100 may calculate the number of calories the user would burn using the identified MET tables, as if the user were doing a squat and as if the user were doing a pushup, as approximations for the amount of calories burned by the user. In further embodiments, a new entry may be created. In this regard, certain embodiments may permit the entry and later identification of new movements and/or exercises. In certain embodiments, the user may provide inputs regarding an approximate caloric expenditure for an unidentified movement/exercise. Yet in other embodiments, system 100 may calculate caloric expenditure, such as from one or more sensors as discussed herein. In still yet further embodiments, system 100 may utilize one or more sensor readings as well as an input from a user (and/or third party) in determining attributes, such as caloric expenditure, for previously unknown movements or exercises. Examples of estimating caloric expenditure without MET tables may include, but are not limited to, determining changes in potential energy. Examples of using changes in potential energy are provided in the next section.


System 100 may be configured to transmit calories burned estimates to a social networking website. The users may be ranked based on their total number of calories burned for a desired time interval (e.g., rank by day, week, month, year, etc.). With reference again to FIG. 3, the method may end or may return to any of the preceding blocks.


i. Energy Expenditure Estimate Based on Changes in Potential Energy

System 100 may also calculate an energy expenditure estimate of a user for physical activities not defined by a MET table. For example, system 100 may calculate an amount of calories burned by a user performing any desired combination of movements. During a workout, a user works against their own body weight and gravity. A location of a user's center of mass, or of a center of mass of a particular body part, may be utilized in estimating an amount of calories burned by the user performing an athletic activity.



FIG. 8 illustrates an example flow diagram of a method for calculating an energy expenditure estimate for a user while performing an athletic activity based on monitoring changes in potential energy, in accordance with example embodiments. The method may be implemented by a computer, such as, for example, computer 102, device 138, 140 and/or 142 as well as other apparatuses. The blocks shown in FIG. 8 may be rearranged, some blocks may be removed, additional blocks may be added, each block may be repeated one or more times, and the flow diagram may be repeated one or more times. The flow diagram may begin at block 802.


In block 802, various embodiments may involve processing data captured of a user performing an athletic activity over a time interval. In an example, system 100 may prompt a user to perform ten repetitions of a lunge and may process data captured of the user performing the lunge. The data may be video captured by the camera 126 or may be captured by the infrared transceiver 128, and/or by the other device sensors 138, 140, and 142.


In block 804, various embodiments may involve determining a location of a center of mass of a body part, body region, or of an entire body of the user at a first time instant and at a second time instant within the time interval. Yet in other embodiments, a center of movement may be utilized. For simplicity purposes, however, a center of mass will be discussed. In an example, system 100 may instruct the user to place sensors at locations corresponding to a center of mass for one or more body parts of the user. With reference to FIG. 9, one or more of the center of mass locations may be at example locations 904A-D and 906, or at other locations on the user's body. Any number of locations may be monitored. At least one sensor may wirelessly transmit sensor data indicating a time and a location of the sensor (or a location of a body part as detected by the sensor). A location may be coordinates in a coordinate system (e.g., a Cartesian coordinate system) and may be associated with a time stamp indicating when the sensor was at a particular coordinate. In certain embodiments, system 100 may process the sensor data to periodically determine locations 904A-D and 906. For example, system 100 may receive sensor data, such as from device sensors 138, 140 and/or 142. Computer 102 (or another component of system 100) may process the data as part of determining locations (such as locations 904A-D and 906). In one embodiment, data may be processed on a routine, ongoing basis, such as four times per second. In another example, computer 102 (or another component of system 100) may process data from image capturing device 126 to determine locations 904A-D and/or 906.
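A minimal sketch of handling such timestamped sensor locations follows; the field names and the helper function are illustrative assumptions.

    from dataclasses import dataclass

    @dataclass
    class SensorSample:
        # Timestamped coordinates reported by a body-worn wireless sensor.
        timestamp_s: float
        x_m: float
        y_m: float
        z_m: float  # height above the ground

    def location_at(samples, t):
        # Most recent sample at or before time t; assumes samples arrive in
        # timestamp order (e.g., four times per second, as noted above).
        earlier = [s for s in samples if s.timestamp_s <= t]
        return earlier[-1] if earlier else None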


In block 806, various embodiments may involve identifying a change in the location of the center of mass from the first time instant to the second time instant. As discussed above, system 100 may determine locations 904A-D and 906 at one time and at a subsequent time. For example and with reference to FIGS. 10A-B, a user is shown performing a lunge. FIG. 10A corresponds to a first time instant and FIG. 10B corresponds to a second time instant. In FIG. 10A, a location 906 of a user's center of mass is at a height “h1” (designated by 908A) off of the ground. In FIG. 10B, a location 906 of a user's center of mass is at a height “h2” (designated by 908B) off of the ground. One or more components of system 100 may determine a difference between height “h1” and “h2” to determine a change in a location 906 of the center of mass. System 100 may also calculate changes to locations 904A-D of centers of mass for other body parts, or changes to other locations of body parts or body regions of the user. System 100 may also process video of a user taken from different angles, as shown in FIG. 11, to determine locations 904A-D and 906. For example, system 100 may determine height “h1” for location 906 in a perspective view and height “h2” for location 906 in a front view of the user. System 100 may average the different height measurements, or may use one or the other.


With reference again to FIG. 8, in block 808, various embodiments may calculate an energy expenditure estimate for the user due to the change. In an example, the physics concept of potential energy may be used to estimate the amount of work done by the user, and to calculate calories burned based on work.


In an example, one or more components of system 100 may determine changes of a location 906 from one time instant to another to determine an amount of work performed by the user. Potential Energy (PE)=m*g*h, where m=mass of the user (or body part), g=the acceleration due to gravity, and h=height above the ground. Work (W)=−ΔPE, where Δ represents the change in potential energy. Substituting m*g*h, Work (W)=−m*g*Δh. Based on the above example in FIGS. 10A-B, W=−m*g*(h1−h2). System 100 may determine the amount of calories burned based on the amount of work and a physiology of human efficiency (PHE) scaling factor. The system 100 may determine the PHE scaling factor as a function of one or more of the user's heart rate, pressure sensor data, and other information input by the user (e.g., age, weight, etc.).
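A minimal numeric sketch of this work-based estimate follows. Taking the magnitude of the work and the particular PHE value are assumptions made here for illustration; as noted above, the PHE scaling factor may instead be determined from heart rate, pressure sensor data, and user inputs.

    G = 9.81                  # acceleration due to gravity, m/s**2
    JOULES_PER_KCAL = 4184.0  # energy conversion constant

    def calories_between_instants(mass_kg, h1_m, h2_m, phe_factor=5.0):
        # W = -m * g * (h1 - h2); the magnitude is used here since effort is
        # expended whether the center of mass is raised or lowered.
        work_joules = abs(-mass_kg * G * (h1_m - h2_m))
        # Scale mechanical work by the assumed physiology-of-human-efficiency
        # factor and convert joules to kilocalories.
        return phe_factor * work_joules / JOULES_PER_KCAL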


System 100 may keep and/or transmit a running total of calories burned between subsequent time instants and inform the user of a total amount of calories burned up to that point in an exercise session. For example, system 100 may determine a height h of location 906 at a certain frequency (e.g., 2 times per second), and may calculate calories burned based on a difference in calories burned between each determination of height h. The system 100 may also track a total number of calories burned over a predetermined time range covering one or more workouts. A time range may include a week, month, year, cumulative time since a user began working out, or other defined metrics. One or more metrics may comprise default values, predefined values, user-selectable values, and/or user-defined values. For example, system 100 may inform the user of how many calories they have burned during a specified time period, such as a day, week, month, and/or year. System 100 may also maintain data on an average number of calories burned per workout, an average number of calories burned based on a type of workout, a greatest number of calories burned during a single workout or during a predetermined time interval (e.g., the month where the highest amount of calories was burned), or other types of data.


In another example, system 100 may determine calories burned by movement of a particular body part or by a collection of body parts. For instance, a user may desire to know how many calories were burned by movement of their right leg. Using the above relationship between work and potential energy, and with reference again to FIG. 9, system 100 may monitor changes in the location 904A of the center of mass of the user's right leg (e.g., height 908B) from one time instant to a different time instant to calculate work. System 100 may estimate the mass of the user's right leg based on the user's weight and proportions. System 100 may then determine an amount of calories burned based on the amount of work and the physiology of human efficiency scaling factor, as described above. During an exercise session, system 100 may display, such as through display 136, a running total of calories burned attributable to movement of the user's right leg. System 100 may similarly determine calories burned based on locations 904B-D for the other limbs of the user. During an exercise session, system 100 may display a running total of calories burned by a user's entire body, as well as by each limb.
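Estimating a limb's mass from the user's weight and proportions might be sketched as follows; the segment-mass fractions are rough anthropometric figures (a whole leg is commonly estimated near 16% of body mass) used only for illustration, and the function reuses calories_between_instants from the previous sketch.

    # Illustrative anthropometric segment-mass fractions.
    SEGMENT_MASS_FRACTION = {"right_leg": 0.16, "left_leg": 0.16,
                             "right_arm": 0.05, "left_arm": 0.05}

    def limb_calories(body_mass_kg, limb, h1_m, h2_m, phe_factor=5.0):
        limb_mass_kg = body_mass_kg * SEGMENT_MASS_FRACTION[limb]
        return calories_between_instants(limb_mass_kg, h1_m, h2_m, phe_factor)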


System 100 may also permit a user to review an exercise session to determine how many calories were burned at certain times. For example, an exercise may involve performing repetitive motions (e.g., pushups). System 100 may identify each repetition within a set (e.g., each pushup within a set of 10), as well as a number of calories burned during each repetition. Over a set, one or more components of system 100 may identify the repetition where the user burned the highest number of calories as well as the lowest number of calories. In further embodiments, system 100 may estimate an average number of calories burned per repetition. These are merely exemplary statistics, and those skilled in the art will readily appreciate that other analyses may be conducted without departing from the scope of this disclosure.


If an exercise session involves different types of exercises, system 100 may rank the exercise types based on the amount of calories burned by type. For example, an exercise session may involve 3 different types of exercises (e.g., pushups, sit-ups, squats). After completing the exercise session, system 100 may determine how many calories were burned by each exercise type (e.g., 10 calories for pushups, 13 calories for sit-ups, and 18 calories for squats), and rank the exercise types based on the number of calories burned (e.g., first squats, second sit-ups, third pushups). In further embodiments, energy expenditure (e.g., a quantity of calories burned) may be ranked as a percentage over an ideal value or range for an exercise or routine. For example, if perfectly performing an exercise would burn about 100 calories, a first user who burned 90 calories may be assigned a better ranking than a second user who burned only 85 for the same exercise. Because users may have different ideal values or ranges, the determinations may utilize the detected and/or estimated values as a percentage of each user's ideal value. In further embodiments, a user who is closer to 100% of their ideal value may be ranked higher than users who are over 100% of the ideal quantity of calories burned. In this regard, a user who expends more energy than estimated or calculated for an activity (e.g., exercise) may exhibit improper movements, inefficiency, an increased likelihood of injury, and/or combinations thereof. In certain implementations, the method of FIG. 8 may then end, or may return to any of the preceding blocks and/or other processes.
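The two rankings described above might be sketched as follows; both helpers are illustrative assumptions rather than a definitive implementation.

    def rank_exercise_types(calories_by_type):
        # Highest calories first, e.g., {"pushups": 10, "sit-ups": 13,
        # "squats": 18} -> [("squats", 18), ("sit-ups", 13), ("pushups", 10)].
        return sorted(calories_by_type.items(), key=lambda kv: kv[1], reverse=True)

    def percent_of_ideal(calories_burned, ideal_calories):
        # Basis for ranking users against their own ideal value
        # (e.g., 90 burned against an ideal of 100 -> 90.0 percent).
        return 100.0 * calories_burned / ideal_calories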


System 100 may also determine calories expended from pre-recorded videos. For example, a user may upload video of a professional basketball player dunking a basketball to system 100. One or more components of system 100 may process the video to determine locations of a center of mass of the player, or of particular body parts, at various points in time, and determine the amount of calories expended during the physical activity (e.g., by the player during the dunk) using the work-based calorie determination, described above.


Conclusion

Providing an activity environment having one or more of the features described herein may provide a user with an immersive experience that will encourage and motivate the user to engage in athletic activities and improve his or her fitness. Users may further communicate through social communities and challenge one another to reach various levels of fitness, and to view their fitness level and activity.


Aspects of the embodiments have been described in terms of illustrative embodiments thereof. Numerous other embodiments, modifications and variations within the scope and spirit of the appended claims will occur to persons of ordinary skill in the art from a review of this disclosure. For example, one of ordinary skill in the art will appreciate that the steps illustrated in the illustrative figures may be performed in other than the recited order, and that one or more steps illustrated may be optional in accordance with aspects of the embodiments.

Claims
  • 1. A computer-implemented method comprising: prompting, at a first location, a user to perform an exercise comprising multiple repetitions; determining a location of a computer device on a body of the user, from data received from a sensor integrated into an item of apparel worn by the user; calculating, by the computer device, an angle between a first body part and a second body part of the user, based on the determined location; monitoring, at the first location, with at least a first sensor of the computer device, a form of the user while performing the exercise, based on the calculated angle, wherein the form of the user comprises at least two form scores and at least two completeness modifiers corresponding to at least two parts of the exercise, wherein the at least two completeness modifiers adjust the at least two form scores based on how well the user's form corresponds to a desired form, and wherein each of the at least two completeness modifiers is a weighted average comprising a decaying function that weights more heavily recent repetitions of the exercise; and calculating, by a processor, an energy expenditure estimate for the user performing the exercise based on a type of the exercise, the form of the user, and at least two contribution values that correlate the at least two form scores with energy expenditure.
  • 2. The method of claim 1, wherein the calculating of the energy expenditure estimate is determined utilizing a metabolic equivalent of task (MET) table for the type of the exercise.
  • 3. The method of claim 1, wherein the energy expenditure estimate is based on a basal metabolic rate of the user.
  • 4. The method of claim 1, wherein the monitoring of the form is based on processing of video of the user performing the exercise.
  • 5. The method of claim 1, wherein the monitoring of the form comprises: processing of an infrared reflection of the user.
  • 6. The method of claim 1, wherein the monitoring of the form is based on identifying locations of a first body part of the user with respect to at least a second body part of the user, at different times.
  • 7. The method of claim 6, wherein at least a portion of the body part locations are determined based on one or more of data from a plurality of sensors positioned on the body of the user, processing video of the user, and processing infrared reflections of the user.
  • 8. The method of claim 1, wherein the calculating of the energy expenditure estimate comprises increasing or decreasing the estimate based on the form and the at least two contribution values.
Parent Case Info

The present application is a continuation-in-part of U.S. patent application Ser. No. 13/324,812, filed Dec. 13, 2011, entitled “Processing Data of a User Performing an Athletic Activity to Estimate Energy Expenditure,” which is a continuation-in-part of U.S. patent application Ser. No. 13/304,056, filed Nov. 23, 2011, entitled “Fatigue Indices and Uses Thereof,” and of U.S. patent application Ser. No. 13/304,064, filed Nov. 23, 2011, entitled “Method and System for Automated Personal Training that Includes Training Programs,” which is a continuation-in-part of U.S. application Ser. No. 13/290,359, filed Nov. 7, 2011, entitled “Method and System for Automated Personal Training,” which claims the benefit of and priority to, U.S. Provisional Patent Application Ser. No. 61/433,792 filed Jan. 18, 2011, 61/432,472 filed Jan. 13, 2011, and 61/422,511 filed Dec. 13, 2010, each of which is entitled “Method and System for Automated Personal Training.” The present application also claims priority to U.S. Provisional Patent Application Ser. No. 61/655,153, filed Jun. 4, 2012, entitled “Fitness Training System with Energy Expenditure Calculation that Uses a Form Factor.” The contents of each of the above-identified applications are expressly incorporated herein by reference in its entirety for any and all non-limiting purposes.

Related Publications (1)
Number Date Country
20130268205 A1 Oct 2013 US
Provisional Applications (4)
Number Date Country
61433792 Jan 2011 US
61432472 Jan 2011 US
61422511 Dec 2010 US
61655153 Jun 2012 US
Continuations (1)
Number Date Country
Parent 13290359 Nov 2011 US
Child 13304064 US
Continuation in Parts (3)
Number Date Country
Parent 13324812 Dec 2011 US
Child 13909826 US
Parent 13304056 Nov 2011 US
Child 13324812 US
Parent 13304064 Nov 2011 US
Child 13304056 US