While most people appreciate the importance of physical fitness, many have difficulty finding the motivation required to maintain a regular exercise program. Some people find it particularly difficult to maintain an exercise regimen that involves continuously repetitive motions, such as running, walking and bicycling.
Additionally, individuals may view exercise as work or a chore and thus, separate it from enjoyable aspects of their daily lives. Often, this separation between athletic activity and other activities reduces the amount of motivation that an individual might have toward exercising. Further, athletic activity services and systems directed toward encouraging individuals to engage in athletic activities might also be too focused on one or more particular activities while an individual's interests are ignored. This may further decrease a user's interest in participating in athletic activities or using the athletic activity services and systems.
Therefore, improved systems and methods to address these and other shortcomings in the art are desired.
The following presents a simplified summary in order to provide a basic understanding of some aspects of the disclosure. The summary is not an extensive overview of the disclosure. It is neither intended to identify key or critical elements of the disclosure nor to delineate the scope of the disclosure. The following summary merely presents some concepts of the disclosure in a simplified form as a prelude to the description below.
Aspects of the invention provide systems and methods for creating personalized exercise programs. A computer device, such as a video game console, may be used with an image capture device, such as a group of cameras, to capture images of a user performing athletic movements. As used herein, an “athletic movement” includes movements relating to fitness, exercise, and flexibility, including movements that may be part of one or more single and multiple participant athletic competitions, exercise routines, and/or combinations thereof. The images may then be evaluated to create a human movement screen score. The human movement screen score may be used to create a personalized exercise program tailored to the specific user. A human movement screen (HMS) is a ranking and grading system that documents movement patterns that may be key to normal function. The functional movement screen (FMS) developed by Gray Cook is an example of a human movement screen.
In some embodiments the user may also provide preference data, such as data relating to time commitments, preferred exercises and a preferred number of exercise sessions in a predetermined time period. The computer device may consider these factors when creating a personalized exercise program.
Certain other embodiments may capture athletic movement data with accelerometers, gyroscopes or position locating devices, such as GPS devices.
In other embodiments, the present invention can be partially or wholly implemented on a tangible non-transitory computer-readable medium, for example, by storing computer-executable instructions or modules, or by utilizing computer-readable data structures.
Of course, the methods and systems of the above-referenced embodiments may also include other additional elements, steps, computer-executable instructions, or computer-readable data structures.
These and other aspects of the embodiments are discussed in greater detail throughout this disclosure, including the accompanying drawings.
The present disclosure is illustrated by way of example and not limited in the accompanying figures in which like reference numerals indicate similar elements and in which:
In the following description of the various embodiments, reference is made to the accompanying drawings, which form a part hereof, and in which is shown by way of illustration various embodiments in which the disclosure may be practiced. It is to be understood that other embodiments may be utilized and structural and functional modifications may be made without departing from the scope and spirit of the present disclosure. Further, headings within this disclosure should not be considered as limiting aspects of the disclosure. Those skilled in the art with the benefit of this disclosure will appreciate that the example embodiments are not limited to the example headings.
I. Example Personal Training System
A. Illustrative Computing Devices
Turning briefly to
The processing unit 106 and the system memory 108 may be connected, either directly or indirectly, through a bus 114 or alternate communication structure to one or more peripheral devices. For example, the processing unit 106 or the system memory 108 may be directly or indirectly connected to additional memory storage, such as a hard disk drive 116, a removable magnetic disk drive, an optical disk drive 118, and a flash memory card, as well as to input devices 120, and output devices 122. The processing unit 106 and the system memory 108 also may be directly or indirectly connected to one or more input devices 120 and one or more output devices 122. The output devices 122 may include, for example, a display device 136, television, printer, stereo, or speakers. In some embodiments one or more display devices may be incorporated into eyewear. The display devices incorporated into eyewear may provide feedback to users. Eyewear incorporating one or more display devices also provides for a portable display system. The input devices 120 may include, for example, a keyboard, touch screen, a remote control pad, a pointing device (such as a mouse, touchpad, stylus, trackball, or joystick), a scanner, a camera or a microphone. In this regard, input devices 120 may comprise one or more sensors configured to sense, detect, and/or measure athletic movement from a user, such as user 124, shown in
Looking again to
B. Illustrative Network
Still further, computer 102, computing unit 104, and/or any other electronic devices may be directly or indirectly connected to one or more network interfaces, such as example interface 130 (shown in
Regardless of whether computer 102 or other electronic device within network 132 is portable or at a fixed location, it should be appreciated that, in addition to the input, output and storage peripheral devices specifically listed above, the computing device may be connected, such as either directly, or through network 132 to a variety of other peripheral devices, including some that may perform input, output and storage functions, or some combination thereof. In certain embodiments, a single device may integrate one or more components shown in
C. Illustrative Sensors
Computer 102 and/or other devices may comprise one or more sensors 126, 128 configured to detect and/or monitor at least one fitness parameter of a user 124. Sensors 126 and/or 128 may include, but are not limited to: an accelerometer, a gyroscope, a location-determining device (e.g., GPS), light sensor, temperature sensor (including ambient temperature and/or body temperature), sleep pattern sensors, heart rate monitor, image-capturing sensor, moisture sensor, and/or combinations thereof. Network 132 and/or computer 102 may be in communication with one or more electronic devices of system 100, including, for example, display 136, an image capturing device 126 (e.g., one or more video cameras), and sensor 128, which may be an infrared (IR) device. In one embodiment, sensor 128 may comprise an IR transceiver. For example, sensors 126 and/or 128 may transmit waveforms into the environment, including towards the direction of user 124, and receive a “reflection” or otherwise detect alterations of those released waveforms. In yet another embodiment, image-capturing device 126 and/or sensor 128 may be configured to transmit and/or receive other wireless signals, such as radar, sonar, and/or audible information. Those skilled in the art will readily appreciate that signals corresponding to a multitude of different data spectrums may be utilized in accordance with various embodiments. In this regard, sensors 126 and/or 128 may detect waveforms emitted from external sources (e.g., not system 100). For example, sensors 126 and/or 128 may detect heat being emitted from user 124 and/or the surrounding environment. Thus, image-capturing device 126 and/or sensor 128 may comprise one or more thermal imaging devices. In one embodiment, image-capturing device 126 and/or sensor 128 may comprise an IR device configured to perform range phenomenology. As a non-limiting example, image-capturing devices configured to perform range phenomenology are commercially available from Flir Systems, Inc. of Portland, Oregon. Although image capturing device 126, sensor 128, and display 136 are shown in direct (wireless or wired) communication with computer 102, those skilled in the art will appreciate that any may directly communicate (wirelessly or wired) with network 132.
1. Multi-Purpose Electronic Devices
User 124 may possess, carry, and/or wear any number of electronic devices, including sensory devices 138, 140, 142, and/or 144. In certain embodiments, one or more devices 138, 140, 142, 144 may not be specially manufactured for fitness or athletic purposes. Indeed, aspects of this disclosure relate to utilizing data from a plurality of devices, some of which are not fitness devices, to collect, detect, and/or measure athletic data. In one embodiment, device 138 may comprise a portable electronic device, such as a telephone or digital music player, including an IPOD®, IPAD®, or iPhone® brand device available from Apple, Inc. of Cupertino, Calif. or Zune® or Microsoft® Windows devices available from Microsoft of Redmond, Wash. As known in the art, digital media players can serve as both an output device for a computer (e.g., outputting music from a sound file or pictures from an image file) and a storage device. In one embodiment, device 138 may be computer 102, yet in other embodiments, computer 102 may be entirely distinct from device 138. Regardless of whether device 138 is configured to provide certain output, it may serve as an input device for receiving sensory information. Devices 138, 140, 142, and/or 144 may include one or more sensors, including but not limited to: an accelerometer, a gyroscope, a location-determining device (e.g., GPS), light sensor, temperature sensor (including ambient temperature and/or body temperature), heart rate monitor, image-capturing sensor, moisture sensor and/or combinations thereof. In certain embodiments, sensors may be passive, such as reflective materials that may be detected by image-capturing device 126 and/or sensor 128 (among others). In certain embodiments, sensors 144 may be integrated into apparel, such as athletic clothing. For instance, the user 124 may wear one or more on-body sensors 144a-b.
Sensors 144 may be incorporated into the clothing of user 124 and/or placed at any desired location of the body of user 124. Sensors 144 may communicate (e.g., wirelessly) with computer 102, sensors 128, 138, 140, and 142, and/or camera 126. Examples of interactive gaming apparel are described in U.S. patent application Ser. No. 10/286,396, filed Oct. 30, 2002, and published as U.S. Pat. Pub. No. 2004/0087366, the contents of which are incorporated herein by reference in their entirety for any and all non-limiting purposes. In certain embodiments, passive sensing surfaces may reflect waveforms, such as infrared light, emitted by image-capturing device 126 and/or sensor 128. In one embodiment, passive sensors located on user's 124 apparel may comprise generally spherical structures made of glass or other transparent or translucent surfaces which may reflect waveforms. Different classes of apparel may be utilized in which a given class of apparel has specific sensors configured to be located proximate to a specific portion of the user's 124 body when properly worn. For example, golf apparel may include one or more sensors positioned on the apparel in a first configuration and yet soccer apparel may include one or more sensors positioned on apparel in a second configuration.
Devices 138-144 may communicate with each other, either directly or through a network, such as network 132. One or more of devices 138-144 may also communicate through computer 102. For example, two or more of devices 138-144 may be peripherals operatively connected to bus 114 of computer 102. In yet another embodiment, a first device, such as device 138, may communicate with a first computer, such as computer 102, as well as another device, such as device 142; however, device 142 may not be configured to connect to computer 102 but may communicate with device 138. Those skilled in the art will appreciate that other configurations are possible.
Some implementations of the example embodiments may alternately or additionally employ computing devices that are intended to be capable of a wide variety of functions, such as a desktop or laptop personal computer. These computing devices may have any combination of peripheral devices or additional components as desired. Also, the components shown in
2. Illustrative Apparel/Accessory Sensors
In certain embodiments, sensory devices 138, 140, 142 and/or 144 may be formed within or otherwise associated with user's 124 clothing or accessories, including a watch, armband, wristband, necklace, shirt, shoe, or the like. Examples of shoe-mounted and wrist-worn devices (devices 140 and 142, respectively) are described immediately below; however, these are merely example embodiments and this disclosure should not be limited to such.
i. Shoe-Mounted Device
In certain embodiments, sensory device 140 may comprise footwear which may include one or more sensors, including but not limited to: an accelerometer, location-sensing components, such as GPS, and/or a force sensor system.
In certain embodiments, at least one force-sensitive resistor 206 shown in
The electrodes 218, 220 of the FSR sensor 206 can be formed of any conductive material, including metals, carbon/graphite fibers or composites, other conductive composites, conductive polymers or polymers containing a conductive material, conductive ceramics, doped semiconductors, or any other conductive material. The leads 212 can be connected to the electrodes 218, 220 by any suitable method, including welding, soldering, brazing, adhesively joining, fasteners, or any other integral or non-integral joining method. Alternately, the electrodes 218, 220 and associated lead(s) 212 may be formed of a single piece of the same material 222/224. In further embodiments, material 222 is configured to have at least one electric property (e.g., conductivity, resistance, etc.) different from that of material 224. Examples of such sensors are disclosed in U.S. patent application Ser. No. 12/483,824, filed on Jun. 12, 2009, the contents of which are incorporated herein in their entirety for any and all non-limiting purposes.
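A force-sensitive resistor's resistance typically falls as applied force rises, so force can be estimated from a measured resistance. The sketch below illustrates this with the common inverse model; the calibration constant is a hypothetical value, not a property of sensor 206 or the referenced application.

```python
def fsr_force(resistance_ohms, k=1.0e6):
    """Approximate applied force from FSR resistance.

    Uses the common inverse model F ~ k / R. The constant k is a
    hypothetical calibration value chosen for illustration only.
    """
    if resistance_ohms <= 0:
        raise ValueError("resistance must be positive")
    return k / resistance_ohms

# Lower resistance corresponds to greater applied force.
light_press = fsr_force(500_000)  # 2.0 (arbitrary force units)
firm_press = fsr_force(50_000)    # 20.0
```

In practice the calibration curve would be measured for the specific sensor materials rather than assumed.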
ii. Wrist-Worn Device
As shown in
As shown in
A fastening mechanism 240 can be unlatched wherein the device 226 can be positioned around a wrist of the user 124 and the fastening mechanism 240 can be subsequently placed in a latched position. The user can wear the device 226 at all times if desired. In one embodiment, fastening mechanism 240 may comprise an interface, including but not limited to a USB port, for operative interaction with computer 102 and/or devices 138, 140, and/or recharging an internal power source.
In certain embodiments, device 226 may comprise a sensor assembly (not shown in
iii. Identify Sensory Locations
The system 100 may process sensory data to identify user movement data. In one embodiment, sensory locations may be identified. For example, images of recorded video, such as from image-capturing device 126, may be utilized in an identification of user movement. For example, the user may stand a certain distance, which may or may not be predefined, from the image-capturing device 126, and computer 102 may process the images to identify the user 124 within the video, for example, using disparity mapping techniques. In an example, the image capturing device 126 may be a stereo camera having two or more lenses that are spatially offset from one another and that simultaneously capture two or more images of the user. Computer 102 may process the two or more images taken at a same time instant to generate a disparity map for determining a location of certain parts of the user's body in each image (or at least some of the images) in the video using a coordinate system (e.g., Cartesian coordinates). The disparity map may indicate a difference between an image taken by each of the offset lenses.
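Disparity mapping ultimately yields depth: for a calibrated stereo camera, a point's depth follows from how far it shifts between the two offset lenses. A minimal sketch of that standard relation, where the focal length and baseline values are hypothetical illustrations rather than properties of image-capturing device 126:

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Depth (meters) from stereo disparity via Z = f * B / d.

    disparity_px: pixel shift of a point between the two lens images.
    focal_px, baseline_m: illustrative camera parameters.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# A point shifting 40 px between lenses 0.1 m apart, focal length 800 px:
z = depth_from_disparity(40, 800, 0.1)  # 2.0 meters
```

Halving the disparity doubles the estimated depth, which is why a disparity map suffices to locate body parts in three dimensions.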
In a second example, one or more sensors may be located on or proximate to the user's 124 body at various locations, or the user may wear a suit having sensors situated at various locations. Yet, in other embodiments, sensor locations may be determined from other sensory devices, such as devices 138, 140, 142 and/or 144. With reference to
In certain embodiments, a time stamp may be added to the data collected, indicating a specific time when a body part was at a certain location. Sensor data may be received at computer 102 (or other device) via wireless or wired transmission. A computer, such as computer 102 and/or devices 138, 140, 142, 144, may process the time stamps to determine the locations of the body parts using a coordinate system (e.g., Cartesian coordinates) within each (or at least some) of the images in the video. Data received from image-capturing device 126 may be corrected, modified, and/or combined with data received from one or more other devices 138, 140, 142 and 144.
In a third example, computer 102 may use infrared pattern recognition to detect user movement and locations of body parts of the user 124. For example, the sensor 128 may include an infrared transceiver, which may be part of image-capturing device 126, or another device, that may emit an infrared signal to illuminate the user's 124 body. The infrared transceiver 128 may capture a reflection of the infrared signal from the body of user 124. Based on the reflection, computer 102 may identify a location of certain parts of the user's body using a coordinate system (e.g., Cartesian coordinates) at particular instances in time. Which body parts are identified, and how, may be predetermined based on a type of exercise a user is requested to perform.
As part of a workout routine, computer 102 may make an initial postural assessment of the user 124 as part of the initial user assessment. With reference to
3. Identify Sensory Regions
In further embodiments, system 100 may identify sensor regions. In one embodiment, assessment lines 144a-g may be utilized to divide the user's body into regions. For example, lines 144b-f may be horizontal axes. For example, a “shoulders” region 402 may correlate to a body portion having a lower boundary around the user's shoulders (see line 144b), region 404 may correlate to the body portion between the shoulders (line 144b) and about half the distance to the hips (see line 144c) and thus be an “upper back” region, and region 406 may span the area between line 144c and the hips (see line 144d) to comprise a “lower back region.” Similarly, region 408 may span the area between the “hips” (line 144d) and the “knees” (see line 144e), region 410 may span between lines 144e and 144f, and region 412 (see “ankles”) may have an upper boundary around line 144f. Regions 402-412 may be further divided, such as into quadrants, such as by using axes 144a and 144g.
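Dividing the body into regions 402-412 by horizontal assessment lines amounts to a threshold lookup on a point's vertical coordinate. A sketch, assuming hypothetical normalized y-coordinates for lines 144b-f (0 at the head, 1 at the feet); the specific boundary values are illustrative, not taken from the disclosure:

```python
# Upper boundaries for each region, in head-to-foot order. The numeric
# positions of assessment lines 144b-f are hypothetical.
REGIONS = [
    (0.15, "shoulders"),    # region 402: above line 144b
    (0.30, "upper back"),   # region 404: line 144b to line 144c
    (0.45, "lower back"),   # region 406: line 144c to the hips (144d)
    (0.65, "hips/thighs"),  # region 408: line 144d to the knees (144e)
    (0.85, "lower legs"),   # region 410: line 144e to line 144f
    (1.01, "ankles"),       # region 412: below line 144f
]

def region_for(y):
    """Map a normalized vertical coordinate to a body region name."""
    for upper, name in REGIONS:
        if y < upper:
            return name
    raise ValueError("y out of range")
```

A quadrant subdivision using vertical axes 144a and 144g would add a similar threshold on the horizontal coordinate.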
4. Categorize Locations or Regions
Regardless of whether specific points (e.g., locations shown in
Computer 102 may also process the image to determine a color of clothing of the user or other distinguishing features to differentiate the user from their surroundings. After processing, computer 102 may identify a location of multiple points on the user's body and track locations of those points, such as locations 302 in
II. Creation of Personal Training Programs
A. Overview
Next, an image capture device may be used to capture images of an athlete performing the athletic movements in step 504. The image capture device may include multiple cameras. In one embodiment the image capture device includes three cameras and is used to capture movement in three dimensions. Various embodiments may include cameras that capture light in the visible and/or infrared spectrums.
In step 506, it is determined whether data from one or more other sensors is available. Other sensors may include an accelerometer worn on the wrist or embedded in or attached to footwear, a gyroscope, a heart rate monitor, a compass, a location tracking device, such as a GPS device, pressure sensors inserted into footwear, or any of the sensors described above that can be used to capture athletic movements and/or athletic performance. The data received from the image capture device and one or more sensors may be used to generate a human movement screen score. When only data from the image capture device is available, in step 508 a human movement screen score is generated with data from the image capture device. When additional sensor data is available, in step 510 a human movement screen score is generated with data from the image capture device and data from one or more additional sensors. In alternative embodiments, a human movement screen score may be generated with only data from the image capture device even when other sensor data is available. For example, sensor data may be available but determined not to be credible or determined to fall below a threshold. In some embodiments, the system may also selectively use data from any of the available sensors.
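The branch between steps 508 and 510 can be sketched as follows. `score_from_images` and `score_from_all` are hypothetical stand-ins for the actual scoring routines, and the credibility filter mirrors the example in which available sensor data may still be excluded:

```python
def generate_hms_score(image_data, sensor_data=None, credibility_threshold=0.5):
    """Sketch of the step 506-510 decision flow (illustrative only)."""
    # Step 506: keep only sensor readings deemed credible.
    usable = [s for s in (sensor_data or [])
              if s.get("credibility", 0.0) >= credibility_threshold]
    if usable:
        return score_from_all(image_data, usable)  # step 510
    return score_from_images(image_data)           # step 508

def score_from_images(image_data):
    """Hypothetical image-only score: mean of per-frame values."""
    return sum(image_data) / len(image_data)

def score_from_all(image_data, sensors):
    """Hypothetical combined score: average of image and sensor scores."""
    img = score_from_images(image_data)
    sens = sum(s["value"] for s in sensors) / len(sensors)
    return (img + sens) / 2
```

With no credible sensor data, the score falls back to images alone, matching the alternative embodiment described above.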
After a human movement screen score is generated, in step 512 a personalized exercise program is generated based on a human movement screen score. The personalized exercise program may be generated via a device, such as a video game console, a server, or computer 102, that includes one or more processors. The human movement screen score may reveal areas that can be improved and the personalized exercise program may address those areas.
In alternative embodiments, a user may also provide preference data that is used to generate the personalized exercise program. The preference data may include time commitments, numbers of exercise sessions, preferred days to exercise, preferred exercises, and goals. In one embodiment, a user may provide access to an electronic calendar, such as one stored on a website, that shows the user's availability to exercise, and the personal training system scans the calendar to determine availability and time commitments. The personal training system may look at historical calendar data to determine probable best times and available time commitments, or future calendar data to determine actual availability. The personal training system may also be configured to update the exercise program based on the user's actual availability. For example, a user may have an exercise session scheduled for Monday evening, and a scan of the user's calendar reveals that the user has an appointment Monday evening that makes exercising impractical. The personal training system may modify the exercise program to reschedule the exercise to another day. Other changes to the exercise program may also be made to keep the user on track to reach goals. The personal training system may even add calendar events to the user's calendar.
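The rescheduling behavior described above can be sketched as a conflict check against busy calendar days, with a conflicting session shifted to the next free day. This is a minimal illustration, not an actual calendar integration:

```python
from datetime import date, timedelta

def reschedule(planned, busy):
    """Move planned exercise sessions off days with calendar conflicts.

    planned and busy are lists of dates; a session on a busy (or already
    rescheduled) day shifts forward one day at a time until free.
    """
    busy_set = set(busy)
    result = []
    for day in planned:
        while day in busy_set or day in result:
            day += timedelta(days=1)
        result.append(day)
    return result

monday = date(2024, 1, 1)
# Session planned Monday; an appointment Monday evening moves it to Tuesday.
new_plan = reschedule([monday], [monday])
# [date(2024, 1, 2)]
```

A fuller system would also weigh historical availability patterns when picking the replacement day, as described above.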
Users may exercise at locations away from the personal training system. Exercise data may be captured by a variety of sensors, such as accelerometers worn on the wrist or other body parts. Accelerometers may also be embedded in or attached to footwear or articles of clothing. Other sensors that may be used to capture exercise data away from the personal training system include gyroscopes, location tracking devices, such as a GPS device, heart rate monitors, pressure sensor systems placed in footwear and any of the sensors described above. The captured exercise data may be provided to the personal training system via a network connection or hardware port, such as a USB port. Returning to
When sensor data is received, in step 518, the personal training system may modify the personalized exercise program based on the exercise data captured by the sensor. Modifications may include one or more changes to the types of exercises or durations of exercises. For example, if the sensor data indicates that the user recently ran, the next session of the personalized exercise program may be modified to not exercise the primary muscle groups involved in running. Other exemplary modifications include reducing the duration or eliminating an exercise session.
B. Illustrative Embodiments
When a user begins an exercise program, the computer 102 may prompt the user to perform a series of exercises in front of an image capturing device. The computer 102 may process the images and assign a score indicating how well the user was able to complete each of the exercises to establish a baseline physical fitness level. When performing an exercise, the computer 102 may instruct the user to position him or herself at a certain distance and orientation relative to an image capturing device. The computer 102 may process each image to identify different parts of the user's body, such as, for example, their head, shoulders, arms, elbows, hands, wrists, torso, hips, knees, ankles, feet, or other body parts. The computer 102 may generate a set of data identifying a location of various body parts within the image. The computer 102 may process the data set to determine a relationship between certain body parts. These relationships may include an angle of one body part relative to another. For example, when the user is doing a squat, the computer 102 may compare the angle of a user's torso with an angle of the user's thigh. In another example, the computer 102 may compare a location of a user's shoulder relative to their elbow and hand during a push up.
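The torso-versus-thigh comparison during a squat reduces to measuring the angle between two body segments defined by identified joint locations. A minimal sketch, where the joint coordinates are hypothetical examples:

```python
import math

def segment_angle(p1, p2):
    """Angle (degrees) of the segment p1 -> p2 relative to horizontal."""
    return math.degrees(math.atan2(p2[1] - p1[1], p2[0] - p1[0]))

def relative_angle(a1, a2, b1, b2):
    """Smallest angle between two body segments, e.g. torso vs. thigh."""
    d = abs(segment_angle(a1, a2) - segment_angle(b1, b2)) % 360
    return min(d, 360 - d)

# Hypothetical squat frame: shoulder-to-hip defines the torso segment,
# hip-to-knee defines the thigh segment.
torso_vs_thigh = relative_angle((0, 2), (0, 1), (0, 1), (1, 1))
# vertical torso vs. horizontal thigh -> about 90 degrees
```

The same pairwise construction applies to the push up example, comparing shoulder, elbow, and hand locations.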
The computer 102 may compare the data set to a desired data set for each exercise to monitor the user's form while performing an exercise. The desired data set may include multiple comparison points throughout an exercise. For example, a push up may be divided into four events: (1) the lowest point, where the user's chest is nearest to the ground and their arms are bent; (2) the highest point, where the user's chest is farthest from the ground and their arms are straightened; (3) an upward event, where the user transitions from the lowest point to the highest point; and (4) a downward event, where the user transitions from the highest point to the lowest point. The desired data set may specify comparison points for each of these events focusing on certain body parts. For example, at each comparison point during a pushup, the computer 102 may monitor the spacing of the user's hands, the straightness of the user's back, a location of the user's head relative to their torso, the spacing of the user's feet relative to one another, or other aspects. The desired data set may specify desired locations for each body part being monitored during comparison points within an exercise, as well as permitted variations from the desired locations. If the user's body part varies beyond what is permitted, the computer 102 may provide the user with feedback identifying the body part and a correction to the user's form (e.g., back is arched, and not straight, during a pushup).
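The comparison against a desired data set with permitted variations can be sketched as a per-body-part tolerance check. All body-part names, positions, and tolerance values below are illustrative, not taken from the disclosure:

```python
def check_form(measured, desired, tolerance):
    """Compare measured body-part positions to a desired data set.

    measured and desired map body-part names to (x, y) locations;
    tolerance maps names to the permitted variation per axis. Returns
    feedback strings for parts outside their tolerance.
    """
    feedback = []
    for part, (dx, dy) in desired.items():
        mx, my = measured[part]
        if abs(mx - dx) > tolerance[part] or abs(my - dy) > tolerance[part]:
            feedback.append(f"adjust {part}")
    return feedback

desired = {"back": (0.0, 1.0), "head": (0.5, 1.0)}
tolerance = {"back": 0.1, "head": 0.2}
ok = check_form({"back": (0.05, 1.0), "head": (0.5, 1.1)}, desired, tolerance)
# [] -> form within permitted variation
bad = check_form({"back": (0.3, 1.0), "head": (0.5, 1.1)}, desired, tolerance)
# ["adjust back"]
```

A full system would run this check at each of the comparison points defined for an exercise's events.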
The computer 102 may also score the user's performance of an exercise. Scoring may be based on the user's form, how quickly the user was able to complete the exercise (e.g., 20 pushups in 60 seconds), a number of repetitions the user completed, the amount of weight the user used during an exercise, or other exercise metrics. In addition to processing the images, the computer 102 may receive data from other sources. For example, the user may run a predetermined distance as measured by a sensor attached to the user (e.g., sensor in a shoe) or global positioning system (GPS) device and may upload the data to the computer 102. Based on the images and/or data acquired by other sensors, the computer 102 may determine areas of weakness for the user (e.g., inability to do a pull up) and design a workout to help the user improve their overall fitness level. The score may be a function of a particular drill and may be focused on position, accuracy, and correct execution. Scoring may also be based on time and/or a number of sets or repetitions within a set time period.
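One way to combine the metrics listed above (form, repetition count, completion time) is a weighted composite. The metrics match those described; the weighting scheme itself is a hypothetical choice for illustration:

```python
def exercise_score(form_score, reps, target_reps, seconds, target_seconds,
                   weights=(0.5, 0.3, 0.2)):
    """Weighted composite of form quality, repetitions, and speed.

    form_score is in [0, 1]; rep and time components are capped at 1.0
    so exceeding a target does not inflate the score. Weights are an
    illustrative assumption.
    """
    rep_score = min(reps / target_reps, 1.0)
    time_score = min(target_seconds / seconds, 1.0)
    wf, wr, wt = weights
    return round(100 * (wf * form_score + wr * rep_score + wt * time_score), 1)

# 20 pushups in 60 seconds against a 20-rep / 60-second target, good form:
s = exercise_score(form_score=0.9, reps=20, target_reps=20,
                   seconds=60, target_seconds=60)
# 95.0
```

Drill-specific scoring would swap in metrics appropriate to each drill (e.g., weight used) with their own weights.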
After establishing the baseline physical fitness level for the user, the computer 102 may then create an initial personalized program. The initial personalized program may be a function of user input, a static assessment of the user, and a human movement screen. User input may include a user's time commitment, as well as a number of exercise sessions per week and one or more goals. The static assessment may provide the user with information and coaching on exercises. The human movement screen score may be an assessment of the user's performance of the exercise drills.
To obtain these inputs, the computer 102 may present a graphical user interface (GUI) on the display 302 prompting the user to start a new program and to provide input for the initial personalized program, as shown in
The drills may be used for assessing user performance relative to performance pillars and body movement categories, as depicted in
The computer 102 may instruct the user to perform the same drill at various tempos, as described in
Based on the human movement screen scoring, the computer 102 may generate a workout structure for the user, an example of which is depicted in
If a user receives a human movement screen score of 3 in all categories, the computer 102 may prompt the user to perform the exercises shown in the month 1 column. If the user receives a human movement screen score of 1 or 2 in any body movement category, the computer 102 may prompt the user to perform the body movement in the score 1 or score 2 columns for that category. For example, if the user receives a score of 1 in the pull category, the computer 102 may prompt the user to perform the reach roll'n lift exercise in month 1, the Lying T's in month 2, and so forth along that row, and the six month program would end at the bent over row exercise from the month 4 column.
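The workout structure described above behaves like a per-category table lookup keyed on the human movement screen score. A sketch of the pull-category example; exercise names beyond those stated above are placeholders, and the table layout is an assumption:

```python
# Hypothetical fragment of the workout structure for the "pull" body
# movement category: a month-by-month progression per score band. Only
# the first two score-1 exercises and the final one come from the
# example above; the rest are placeholders.
PULL_ROW = {
    1: ["reach roll'n lift", "Lying T's",
        "placeholder month 3 exercise", "bent over row"],
    3: ["placeholder month 1 exercise", "placeholder month 2 exercise",
        "placeholder month 3 exercise", "placeholder month 4 exercise"],
}

def exercises_for(category_row, hms_score):
    """Return the month-by-month progression for a category.

    A score of 1 or 2 starts the user on the remedial progression;
    a score of 3 follows the standard month columns.
    """
    key = 1 if hms_score in (1, 2) else 3
    return category_row[key]

first_exercise = exercises_for(PULL_ROW, hms_score=1)[0]
# "reach roll'n lift"
```

Each body movement category would contribute its own row, and the user's per-category scores select the row variant independently.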
In another example, the workout plan may include a baseline workout and six month long programs, examples of which are depicted in
With reference to
Each month long program over the six month program may be divided into 4 phases each lasting a week, an example of which is depicted in
The feedback may allow the user to compete against their own benchmarks to see improvement in real-time and over time.
As shown by the examples in
The computer 102 may also calculate a fatigue index indicating how well the user maintained good form over the duration of a drill. For example, the fatigue index may indicate that the user was in the preferred zone for the first 4 repetitions, in the good zone for the next 5 repetitions, and in the red zone for the last repetition.
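The fatigue index in the example summarizes which form zone each repetition fell into, in order. A minimal sketch using the zone names from the example; the run-length summary format is an illustrative choice:

```python
def fatigue_index(zone_per_rep):
    """Summarize form quality over the repetitions of a drill.

    zone_per_rep lists the zone ("preferred", "good", "red") assigned
    to each repetition; returns (zone, count) runs in performance order,
    making declining form over a drill easy to see.
    """
    summary = []
    for zone in zone_per_rep:
        if summary and summary[-1][0] == zone:
            summary[-1][1] += 1
        else:
            summary.append([zone, 1])
    return [(z, n) for z, n in summary]

# The example above: preferred for 4 reps, good for 5, red for the last.
reps = ["preferred"] * 4 + ["good"] * 5 + ["red"]
index = fatigue_index(reps)
# [("preferred", 4), ("good", 5), ("red", 1)]
```

A trend from "preferred" toward "red" over a drill is exactly the fatigue signal described above.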
If the user trains hard during a session, the computer 102 may award a greater number of points and unlock new workouts. Upon reaching point milestones, the user may unlock workouts and online challenges, or the user may purchase these items online through a gaming console. Other incentives may include obtaining certification as a trainer upon reaching certain fitness milestones. The user may also purchase products from a particular clothing or footwear supplier to increase rewards. For example, a product may have an embedded barcode or other information that a user may scan or otherwise input to the computer 102 to unlock new training sessions (e.g., a session about stretching for a run). In some embodiments, the purchase of certain products may allow a user to unlock new workouts. The new workouts may be related to or use the purchased products.
A display device may present a graphical user interface of a post-workout dashboard permitting a user to review training data with analysis to view progress and improve future sessions. The user may also elect to post their workout online via social networking (e.g., via a social networking website) or otherwise share their workout sessions. Users may post comments and provide recommendations when reviewing workouts of other users. Users may also post messages to provide motivation to other users. The computer 102 may also post information to a social network when a user improves their fitness level (e.g., Bob improved his fitness level from intermediate to advanced). The computer 102 may have a dynamic recommendation engine that suggests new workouts based on the user's profile and previous training successes. Trainers may also recommend different types of engagements, such as joining a challenge or going head to head with a friend. The computer 102 may then suggest a time and date for a next workout session.
To initiate a drop-in session, the user may select a drop-in workout session tab of a graphical user interface (see
In a challenge session, a user may compete against a ghost of their previous workout or against another user. For example, the computer 102 may store video of a user performing a set of exercises, as well as performance metrics. The display may present the video of the user where the user appears translucent, and hence is denoted as a ghost. The display may overlay video recorded by the image capture device for comparison with the ghost. The computer 102 may provide a demonstration of the challenge, and the user may perform the challenge. Upon completion of the challenge, the computer 102 may display the challenge results.
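Displaying challenge results amounts to comparing the live attempt's performance metrics against those stored with the ghost. A minimal sketch, in which the metric names are illustrative assumptions:

```python
# Minimal sketch of scoring a challenge against a stored "ghost" (the
# user's previous performance of the same exercises). Metric names are
# assumed examples, not metrics specified in the disclosure.

def challenge_result(ghost: dict[str, float],
                     attempt: dict[str, float]) -> dict[str, float]:
    """Per-metric difference between the live attempt and the ghost.

    Positive values mean the attempt beat the stored performance on
    that metric; negative values mean the ghost still wins.
    """
    return {metric: attempt[metric] - ghost[metric] for metric in ghost}

ghost = {"repetitions": 10, "avg_form_score": 0.80}
attempt = {"repetitions": 12, "avg_form_score": 0.75}
print(challenge_result(ghost, attempt))
```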
The user may also create their own challenges and workout sessions for more focused training or for sharing with a social network. The user may receive points, money, or other incentives based on the number of other users who download a user-created workout session. The user may also cause the computer 102 to request ghost workouts from friends or professionals for guidance or comparison.
Challenges may also be against multiple players at a single location (e.g., house), or via a network.
Aspects of the embodiments have been described in terms of illustrative embodiments thereof. Numerous other embodiments, modifications and variations within the scope and spirit of the appended claims will occur to persons of ordinary skill in the art from a review of this disclosure. For example, one of ordinary skill in the art will appreciate that the steps illustrated in the illustrative figures may be performed in other than the recited order, and that one or more steps illustrated may be optional in accordance with aspects of the embodiments.
This application is a continuation-in-part of U.S. patent application Ser. No. 13/290,359 filed Nov. 7, 2011 and claims the benefit of, and priority to, U.S. Provisional Patent Application Nos. 61/410,777 filed Nov. 5, 2010, 61/417,102 filed Nov. 24, 2010, 61/422,511 filed Dec. 13, 2010, 61/432,472 filed Jan. 13, 2011, and 61/433,792 filed Jan. 18, 2011, each of which is entitled “Method and System for Automated Personal Training.” The content of each of the applications is expressly incorporated herein by reference in its entirety for any and all non-limiting purposes.
Number | Date | Country
---|---|---
20120277891 A1 | Nov 2012 | US

Number | Date | Country
---|---|---
61410777 | Nov 2010 | US
61417102 | Nov 2010 | US
61422511 | Dec 2010 | US
61432472 | Jan 2011 | US
61433792 | Jan 2011 | US

 | Number | Date | Country
---|---|---|---
Parent | 13290359 | Nov 2011 | US
Child | 13304064 | | US