This disclosure relates to exercise machines. More specifically, this disclosure relates to a method and system for using artificial intelligence to determine a user's progress during interval training.
Exercise and rehabilitation devices, such as a cycling machine and balance equipment, are used to facilitate exercise, strength training, osteogenesis, and/or rehabilitation of a user. A user may perform an exercise (e.g., cycling, balancing, bench press, pull down, arm curl, etc.) using the osteogenic, isometric exercise, rehabilitation, and/or strength training equipment to improve osteogenesis, bone growth, bone density, muscular hypertrophy, flexibility, balance, and coordination, to reduce pain, to decrease rehabilitation time, to increase strength, or some combination thereof. The isometric exercise, rehabilitation, and/or strength training equipment may include moveable portions onto which the user adds a load or balances. For example, to perform a cycling exercise, the user may sit in a seat, place each of the user's feet on a respective pedal of a cycling machine, and push on the pedals with the user's feet while each of the pedals rotates in a circular motion. To perform a balancing exercise, the user may stand on a balance board and balance on top of the balance board as it shifts in one or more directions. The isometric exercise, rehabilitation, and/or strength training equipment may include non-movable portions onto which the user adds a load. For example, to perform a leg-press-style exercise, the user may sit in a seat, place each of the user's feet on a respective foot plate, and push on the foot plates with the user's feet while the foot plates remain in the same position.
Representative embodiments set forth herein disclose various techniques for an adjustment of exercise based on artificial intelligence, an exercise plan, and user feedback. As used herein, the terms “exercise apparatus,” “exercise device,” “electromechanical device,” “exercise machine,” “rehabilitation device,” “cycling machine,” “balance board,” and “isometric exercise and rehabilitation assembly” may be used interchangeably. The terms “exercise apparatus,” “exercise device,” “electromechanical device,” “exercise machine,” “rehabilitation device,” “cycling machine,” “balance board,” and “isometric exercise and rehabilitation assembly” may also refer to an osteogenic, strength training, isometric exercise, and/or rehabilitation assembly.
In one embodiment, a method is disclosed for using an artificial intelligence engine to modify a resistance of one or more pedals of an exercise device. The method includes generating, by the artificial intelligence engine, a machine learning model trained to receive one or more measurements as input, and outputting, based on the one or more measurements, a control instruction that causes the exercise device to modify the resistance of the one or more pedals. The method includes receiving the one or more measurements from a sensor associated with the one or more pedals of the exercise device, determining whether the one or more measurements satisfy a trigger condition, and responsive to determining that the one or more measurements satisfy the trigger condition, transmitting the control instruction to the exercise device.
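The trigger-condition logic above can be sketched as follows. This is a minimal illustration only: the function names, the measurement format, and the shape of the control instruction are assumptions for the sake of example and are not part of the disclosure.

```python
# Illustrative sketch: check whether sensor measurements from the pedals
# satisfy a trigger condition and, if so, transmit a control instruction
# that modifies the pedal resistance. All names are hypothetical.

def check_and_control(measurements, trigger_threshold, send_instruction):
    """If any pedal measurement satisfies the trigger condition,
    transmit a control instruction to modify pedal resistance."""
    if any(m >= trigger_threshold for m in measurements):
        # Control instruction format is assumed for illustration.
        send_instruction({"action": "modify_resistance", "delta": -5})
        return True
    return False
```

In practice, the machine learning model would determine both the trigger condition and the content of the control instruction; the fixed threshold here only stands in for that output.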
In one embodiment, a method is disclosed for using an artificial intelligence engine to perform a control action. The control action is based on one or more measurements from a wearable device. The method includes generating, by the artificial intelligence engine, a machine learning model trained to receive the one or more measurements as input, and outputting, based on the one or more measurements, a control instruction that causes the control action to be performed. The method includes receiving the one or more measurements from the wearable device being worn by a user, determining whether the one or more measurements indicate, during an interval training session, that one or more characteristics of the user are within a desired target zone, and responsive to determining that the one or more measurements indicate the one or more characteristics of the user are not within the desired target zone during the interval training session, performing the control action.
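The target-zone determination above can be illustrated with a short sketch. The heart-rate measurement and the zone bounds are hypothetical examples of a wearable-device characteristic and a desired target zone; the disclosure is not limited to these.

```python
# Illustrative sketch: decide whether a control action should be performed
# based on whether wearable measurements (e.g., heart rate in BPM) fall
# inside a desired target zone during an interval. Names are hypothetical.

def in_target_zone(measurement, zone):
    """Return True when a single measurement is inside the target zone."""
    low, high = zone
    return low <= measurement <= high

def control_action_needed(measurements, zone):
    """A control action is performed only when the user's characteristics
    are NOT within the desired target zone."""
    return not all(in_target_zone(m, zone) for m in measurements)
```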
In one embodiment, a method is disclosed for using an artificial intelligence engine to modify resistance of one or more pedals of an exercise device. The method includes generating, by the artificial intelligence engine, a machine learning model trained to receive one or more measurements as input, and outputting, based on the one or more measurements, a control instruction that causes the exercise device to modify, independently from each other, the resistance of the one or more pedals. The method includes, while a user performs an exercise using the exercise device, receiving the one or more measurements from the one or more sensors associated with the one or more pedals of the exercise device, and determining, based on the one or more measurements, a quantifiable or qualitative modification to the resistance provided by a pedal of the one or more pedals. In one embodiment, the resistance provided by another pedal of the one or more pedals is not modified. The method includes transmitting the control instruction to the exercise device to cause the resistance provided by the pedal to be modified.
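The independent, per-pedal modification described above can be sketched as follows, where per-pedal resistance is represented as a mapping; this representation is an assumption for illustration only.

```python
# Illustrative sketch: modify the resistance of one pedal independently,
# leaving the resistance of the other pedal unchanged. The dict-based
# representation of per-pedal resistance is hypothetical.

def modify_pedal_resistance(resistances, pedal, delta):
    """Return an updated resistance mapping in which only the named
    pedal's resistance is changed by delta."""
    updated = dict(resistances)
    updated[pedal] = updated[pedal] + delta
    return updated
```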
In one embodiment, a method is disclosed for using an artificial intelligence engine to present a user interface capable of presenting the progress of a user in one or more domains. The method includes generating, by the artificial intelligence engine, a machine learning model trained to receive one or more measurements as input, and outputting, based on the one or more measurements, a user interface that causes one or more graphical elements to dynamically change position on the user interface. The method includes, while a user performs an exercise using the exercise device, receiving the one or more measurements from the one or more sensors associated with the exercise device, and presenting, on a computing device associated with the exercise device, one or more sections of the user interface. The one or more sections of the user interface may each be related to a separate domain of the one or more domains, and, based on the one or more measurements, each section may include the one or more graphical elements placed therein.
In one embodiment, a method is disclosed for using an artificial intelligence engine to interact with a user of an exercise device during an exercise session. The method includes generating, by the artificial intelligence engine, a machine learning model trained to receive data as input, and based on the data, to provide an output. The method includes, while a user performs an exercise using the exercise device, receiving the data from an input peripheral of a computing device associated with the user, and based on the data being received from the input peripheral, determining, via the machine learning model, the output such that control of an aspect of the exercise device is enabled.
In one embodiment, a method is disclosed for using an artificial intelligence engine to onboard a user for an exercise plan. The method includes generating, by the artificial intelligence engine, a machine learning model trained to receive as input both onboarding data associated with a user and an onboarding protocol and, based on the onboarding data and the onboarding protocol, output an exercise plan. The method includes, while a user performs an exercise using the exercise device, receiving the onboarding data associated with the user. The method includes determining, by the machine learning model using the onboarding data and the onboarding protocol, a fitness level of the user, wherein the onboarding protocol comprises exercises with tiered difficulty levels, wherein the onboarding protocol increases a difficulty level for a subsequent exercise of the exercises when the user completes an exercise of the exercises, and, further wherein, based on a completion state of a last exercise performed by the user, the fitness level of the user is determined. The method includes, by associating the difficulty level for each exercise with the fitness level of the user, selecting a difficulty level for each exercise comprising the exercise plan.
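The tiered onboarding protocol above can be sketched as a simple walk through completion states, where the difficulty level climbs with each completed exercise and the fitness level is set by the last exercise the user performed. The integer-level representation is an assumption for illustration.

```python
# Illustrative sketch: derive a fitness level from the completion states
# of a tiered onboarding protocol. Difficulty increases after each
# completed exercise; the level reached when the user can no longer
# complete an exercise stands in for the determined fitness level.

def determine_fitness_level(completion_states):
    """completion_states: ordered booleans, one per onboarding exercise,
    from lowest to highest difficulty. Returns the highest tier the
    user completed."""
    level = 0
    for completed in completion_states:
        if not completed:
            break
        level += 1
    return level
```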
In one embodiment, a tangible, non-transitory computer-readable medium is disclosed, the medium storing instructions that, when executed, cause a processing device to perform any of the operations of any of the methods disclosed herein.
In one embodiment, a system includes a memory device storing instructions and a processing device communicatively coupled to the memory device. The processing device may execute the instructions to perform any of the operations of any of the methods disclosed herein.
Other technical features may be readily apparent to one skilled in the art from the following figures, descriptions, and claims.
For a detailed description of example embodiments, reference will now be made to the accompanying drawings in which:
Various terms are used to refer to particular system components. Different entities may refer to a component by different names—this document does not intend to distinguish between components that differ in name but not function. In the following discussion and in the claims, the terms “including” and “comprising” are used in an open-ended fashion, and thus should be interpreted to mean “including, but not limited to . . . ” Also, the term “couple” or “couples” is intended to mean either an indirect or direct connection. Thus, if a first device couples to a second device, that connection may be through a direct connection or through an indirect connection via other devices and connections.
The terminology used herein is for the purpose of describing particular example embodiments only, and is not intended to be limiting. As used herein, the singular forms “a,” “an,” and “the” may be intended to include the plural forms as well, unless the context clearly indicates otherwise. The method steps, processes, and operations described herein are not to be construed as necessarily requiring their performance in the particular order discussed or illustrated, unless specifically identified as an order of performance. It is also to be understood that additional or alternative steps may be employed.
The terms first, second, third, etc. may be used herein to describe various elements, components, regions, layers and/or sections; however, these elements, components, regions, layers and/or sections should not be limited by these terms. These terms may be only used to distinguish one element, component, region, layer or section from another region, layer or section. Terms such as “first,” “second,” and other numerical terms, when used herein, do not imply a sequence or order unless clearly indicated by the context. Thus, a first element, component, region, layer or section discussed below could be termed a second element, component, region, layer or section without departing from the teachings of the example embodiments. The phrase “at least one of,” when used with a list of items, means that different combinations of one or more of the listed items may be used, and only one item in the list may be needed. For example, “at least one of: A, B, and C” includes any of the following combinations: A, B, C, A and B, A and C, B and C, and A and B and C. In another example, the phrase “one or more” when used with a list of items means there may be one item or any suitable number of items exceeding one.
Spatially relative terms, such as “inner,” “outer,” “beneath,” “below,” “lower,” “above,” “upper,” “top,” “bottom,” and the like, may be used herein. These spatially relative terms can be used for ease of description to describe one element's or feature's relationship to another element(s) or feature(s) as illustrated in the figures. The spatially relative terms may also be intended to encompass different orientations of the device in use, or operation, in addition to the orientation depicted in the figures. For example, if the device in the figures is turned over, elements described as “below” or “beneath” other elements or features would then be oriented “above” the other elements or features. Thus, the example term “below” can encompass both an orientation of above and below. The device may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptions used herein interpreted accordingly.
Moreover, various functions described below can be implemented or supported by one or more computer programs, each of which is formed from computer readable program code and embodied in a computer readable medium. The terms “application” and “program” refer to one or more computer programs, software components, sets of instructions, procedures, functions, objects, classes, instances, related data, or a portion thereof adapted for implementation in a suitable computer readable program code. The phrase “computer readable program code” includes any type of computer code, including source code, object code, and executable code. The phrase “computer readable medium” includes any type of medium capable of being accessed by a computer, such as read only memory (ROM), random access memory (RAM), a hard disk drive, a compact disc (CD), a digital video disc (DVD), solid state drives (SSDs), flash memory, or any other type of memory. A “non-transitory” computer readable medium excludes wired, wireless, optical, or other communication links that transport transitory electrical or other signals. A non-transitory computer readable medium includes media where data can be permanently stored and media where data can be stored and later overwritten, such as a rewritable optical disc or an erasable memory device.
The term “bone geometry” may refer to bone diameter, bone density, bone shape, bone cross-section, bone length, bone weight, or any suitable bone dimension(s) and/or measurement(s).
The term “empirical data” may refer to data obtained and/or derived based on observation, experience, measurement, and/or research.
The term “strain,” when used in context with a bone of a user, may refer to an amount, proportion, or degree of deformation of the bone material.
The terms “exercise machine” and “isometric exercise and rehabilitation assembly” may be used interchangeably herein.
The terms “body part” and “body portion” may be used interchangeably herein.
An exercise plan may include one or more exercise sessions. Each exercise session may include one or more exercises of any type (e.g., cycling, running, pull-ups, sit-ups, stretching, yoga, etc.). The one or more exercises may include or be based on various specifications (e.g., parameters, properties, values, attributes, etc.), such as a number of repetitions, a number of sets, a periodicity, a frequency, a difficulty level, an amount of weight, a range of motion, a degree of flexion, a degree of extension, a skill level, or the like.
Definitions for certain other words and phrases are provided throughout this patent document. Those of ordinary skill in the art should understand that in many if not most instances, such definitions apply to prior as well as future uses of such defined words and phrases.
As typically healthy people grow from infants to children to adults, they experience bone growth. Such growth, however, typically stops at approximately age 30. After that point, without interventions as described herein, bone loss (called osteoporosis) can start to occur. This does not mean that the body stops creating new bone. Rather, it means that the rate at which it creates new bone tends to slow, while the rate at which bone loss occurs tends to increase.
In addition, as people age and/or become less active than they once were, they may experience muscle loss. For example, muscles that are not used often may reduce in muscle mass. As a result, the muscles become weaker. In some instances, people may be affected by a disease, such as muscular dystrophy, that causes the muscles to become progressively weaker and to have reduced muscle mass. To increase the muscle mass and/or reduce the rate of muscle loss, people may exercise a muscle to cause muscular hypertrophy, thereby strengthening the muscle as the muscle grows. Muscular hypertrophy may refer to an increase in a size of skeletal muscle through a growth in size of its component cells. There are two factors that contribute to muscular hypertrophy: (i) sarcoplasmic hypertrophy (an increase in muscle glycogen storage), and (ii) myofibrillar hypertrophy (an increase in myofibril size). The growth in the cells may be caused by an adaptive response that serves to increase an ability to generate force or resist fatigue.
The rate at which such bone or muscle loss occurs generally accelerates as people age. A net growth in bone can ultimately become a net loss in bone over time. In general, by the time women are over 50 and men are over 70, net bone loss can reach a point where the bones are so brittle that life-altering fractures can occur. Examples of such fractures include fractures of the hip and femur. Of course, fractures can also occur due to participation in athletics or due to accidents. In such cases, bone growth is just as relevant, because it heals or speeds the healing of the fracture.
To understand why such fractures occur, it is useful to recognize that bone is itself porous, with a somewhat honeycomb-like structure. This structure may be dense and therefore stronger, or it may be variegated, spread out, and/or sparse, such latter structure being incapable of continuously or continually supporting the weight (load) stresses experienced in everyday living. When such loads exceed the support capability of the structure at a stressor point or points, a fracture occurs. This is true whether the individual had a fragile bone structure or a strong one: it is a matter of physics, of the literal “breaking point.”
It is therefore preferable to have a means of mitigating or ameliorating bone loss and of healing fractures. Further, it is preferable to encourage new bone growth, thus increasing the density of the structure described hereinabove. The increased bone density may increase the load-bearing capacities of the bone, thus making first or subsequent fractures less likely to occur. Reduced fractures may improve a quality of life of the individual. The process of bone growth itself is referred to as osteogenesis, literally the creation of bone.
It is also preferable to have a means for mitigating or ameliorating muscle mass loss and weakening of the muscles. Further, it is preferable to encourage muscle growth by increasing the muscle mass through exercise. The increased muscle mass may enable a person to exert more force with the muscle and/or to resist fatigue in the muscle for a longer period of time.
In order to create new bone, at least three factors are necessary. First, the individual must have a sufficient intake of calcium; second, in order to absorb that calcium, the individual must have a sufficient intake and absorption of Vitamin D, a matter problematic for those who have cystic fibrosis, who have undergone gastric bypass surgery, or who have other absorption disorders or conditions which limit absorption. Separately, supplemental estrogen for women and supplemental testosterone for men can further ameliorate bone loss. On the other hand, abuse of alcohol and smoking can harm one's bone structure. Medical conditions such as, without limitation, rheumatoid arthritis, renal disease, overactive parathyroid glands, diabetes, or organ transplants can also exacerbate osteoporosis. Ethical pharmaceuticals such as, without limitation, hormone blockers, seizure medications, and glucocorticoids are also capable of inducing such exacerbations. But even in the absence of medical conditions as described hereinabove, Vitamin D and calcium taken together do not create osteogenesis to a desirable degree or ameliorate bone loss to a desirable degree.
To achieve osteogenesis, therefore, one must add in the third factor: exercise. Specifically, one must subject one's bones to a force at least equal to a certain multiple of body weight, such multiples varying depending on the individual and the specific bone in question. As used herein, “MOB” means Multiples of Body Weight. It has been determined through research that subjecting a given bone to a certain threshold MOB (this may also be known as a “weight-bearing exercise”), even for an extremely short period of time, one simply sufficient to exceed the threshold MOB, encourages and fosters osteogenesis in that bone.
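The threshold relationship above reduces to a simple comparison: the force on the bone is measured against the product of the user's body weight and the applicable MOB multiple. The sketch below illustrates this; the units and function name are assumptions for illustration.

```python
# Illustrative sketch: a weight-bearing exercise fosters osteogenesis
# in a bone when the applied force reaches the threshold multiple of
# body weight (MOB) for that individual and bone. Units are assumed
# to be pounds; any consistent unit would work.

def meets_osteogenic_threshold(applied_force, body_weight, mob):
    """Return True when the applied force meets or exceeds the
    osteogenic threshold: body weight times the MOB multiple."""
    return applied_force >= mob * body_weight
```

For example, the leg-press-style exercise discussed later uses a multiple of approximately 4.2 MOB, so a 150-pound user would need to exert roughly 630 pounds of force to exceed that threshold.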
Further, a person can achieve muscular hypertrophy by exercising the muscles for which increased muscle mass is desired. Strength training and/or resistance exercise may cause muscle tissue to increase. For example, pushing against or pulling on a stationary object with a certain amount of force may trigger the cells in the associated muscle to change and cause the muscle mass to increase.
In some embodiments disclosed herein, a control system for an exercise machine is disclosed, not only capable of enabling an individual, preferably an older, less mobile individual or preferably an individual recovering from a fracture, to engage easily in osteogenic exercises and/or muscle strengthening exercises, but capable of using predetermined thresholds or dynamically calculating them, such that the person using the machine can be immediately informed through real-time visual and/or other sensorial feedback, that the osteogenic threshold has been exceeded, thus triggering osteogenesis for the subject bone (or bones), and/or that the muscular strength threshold has been exceeded, thereby triggering muscular hypertrophy for the subject muscle (or muscles). The control system may be used to improve compliance with an exercise plan including one or more exercises.
The control system may receive one or more load measurements associated with forces exerted by both the left and right sides on left and right portions (e.g., handles, foot plate or platform) of the exercise machine to enhance osteogenesis, bone growth, bone density improvement, and/or muscle mass. The one or more load measurements may be a left load measurement of a load added to a left load cell on a left portion of the exercise machine and a right load measurement of a load added to a right load cell on a right portion of the exercise machine. A user interface may be provided by the control system that presents visual representations of the separately measured left load and right load where the respective left load and right load are added to the respective left load cell and right load cell at the subject portions of the exercise machine.
In some embodiments, initially, the control system may receive load measurements via a data channel associated with each exercise of the machine. For example, there may be a data channel for a leg-press-style exercise, a pull-down-style exercise, a suitcase-lift-style exercise, an arm-curl-style exercise, and so forth. Each data channel may include one or more load cells (e.g., a left load cell and a right load cell) that measure added load or applied force and transmit the load measurement to the control system via its respective data channel. The control system may receive the load measurements from each of the data channels at a first rate (e.g., 1 Hertz). If the control system detects a load from a data channel (e.g., hands resting on the handles including the respective load cells, or feet resting on the foot plate including the respective load cells), the control system may set that data channel as active and start reading load measurements from that data channel at a second rate (e.g., 10 Hertz) that is higher than the first rate. Further, the control system may set the other exercises associated with the other data channels as inactive and stop reading load measurements from the other data channels until the active exercise is complete. The active exercise may be complete when the one or more load measurements received via the data channel exceed one or more target thresholds. In some embodiments, the control system may determine an average load measurement by accumulating raw load measurements over a certain period of time (e.g., 5 seconds) and averaging the raw load measurements into an average load measurement to smooth the data (e.g., to eliminate jumps or spikes in the data).
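The dual-rate polling and smoothing described above can be sketched as follows. The rate values mirror the examples in the text (1 Hertz idle, 10 Hertz active); the function names and data representation are assumptions for illustration.

```python
# Illustrative sketch of the two behaviors described above:
# (1) poll an idle data channel at a low first rate and switch to a
#     higher second rate once a load is detected on that channel;
# (2) smooth raw load-cell readings collected over a window into a
#     single averaged load measurement.

IDLE_RATE_HZ = 1     # first rate, per the example in the text
ACTIVE_RATE_HZ = 10  # second, higher rate once a channel is active

def select_rate(channel_has_load):
    """Return the read rate for a channel: higher when active."""
    return ACTIVE_RATE_HZ if channel_has_load else IDLE_RATE_HZ

def average_load(raw_measurements):
    """Average raw load measurements accumulated over a window
    (e.g., 5 seconds) to eliminate jumps or spikes in the data."""
    return sum(raw_measurements) / len(raw_measurements)
```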
The control system may compare the one or more load measurements (e.g., raw load measurements, or averaged load measurements) to one or more target thresholds. In some embodiments, a single load measurement may be compared to a single specific target threshold (e.g., a one-to-one relationship). In some embodiments, a single load measurement may be compared to more than one specific target threshold (e.g., a one-to-many relationship). In some embodiments, more than one load measurement may be compared to a single specific target threshold (e.g., a many-to-one relationship). In some embodiments, more than one load measurement may be compared to more than one specific target threshold (e.g., a many-to-many relationship).
The target thresholds may be an osteogenesis target threshold, a muscular strength target threshold, and/or a rehabilitation threshold. The osteogenesis target threshold may be determined based on a disease protocol pertaining to the user, an age of the user, a gender of the user, a sex of the user, a height of the user, a weight of the user, a bone density of the user, etc. A disease protocol may refer to any illness, disease, fracture, or ailment experienced by the user and any treatment instructions provided by a caretaker for recovery and/or healing. The disease protocol may also include a condition of health where the goal is to avoid a problem. The muscular strength target threshold may be determined based on a historical performance of the user using the exercise machine (e.g., amount of pounds lifted for a particular exercise, amount of force applied associated with each body part, etc.) and/or other exercise machines, a fitness level (e.g., how active the user is) of the user, a diet of the user, a protocol for determining a muscular strength target, etc. The rehabilitation target threshold may be determined based on historical performance of the user using the exercise machine (e.g., amount of force applied associated with each body part, speed of cycling, level of stability, etc.) and/or other exercise machines, a fitness level (e.g., how active the user is, the flexibility of the user, etc.) 
of the user, a diet of the user, an exercise plan for determining a rehabilitation target, the condition of the user (e.g., type of surgery the user underwent, the type of injury the user sustained), physical characteristics of the user (e.g., an age of the user, a gender of the user, a sex of the user, a height of the user, a weight of the user, a bone density of the user), condition of the user's body part(s) (e.g., the pain level of a user), an exertion level of a user (e.g., how easy/hard the exercise session is for the user), any other suitable characteristic, or combination thereof.
The control system may determine whether the one or more load measurements exceed the one or more target thresholds. Responsive to determining that the one or more load measurements exceed the one or more target thresholds, the control system may cause a user interface to present an indication that the one or more target thresholds have been exceeded and an exercise is complete. Additionally, when the one or more target thresholds are exceeded, the control system may cause the user interface to present an indication that instructs the user to apply additional force (less than a safety limit) to attempt to set a personal maximum record of weight lifted, pressed, pulled, or otherwise exerted for that exercise.
Further, the user interface may present an indication when a load measurement is approaching a target threshold for the user. In another example, when the load measurement exceeds the target threshold, the user interface may present an indication that the target threshold has been exceeded, that the exercise is complete, and, if there are any remaining incomplete exercises in the exercise plan, that there is another exercise to be completed by the user. If there are no remaining exercises in the exercise plan to complete, then the user interface may present an indication that all exercises in the exercise plan are complete and the user can rest. In addition, when the exercise plan is complete, the control system may generate a performance report that presents various information (e.g., charts and graphs of the right and left load measurements received during each of the exercises, left and right maximum loads for the user received during each of the exercises, historical right and left load measurements received in the past, comparison of the current right and left load measurements with the historical right and left load measurements, an amount of pounds lifted or pressed that is determined based on the load measurements for each of the exercises, percent gained in load measurements over time, etc.).
Further, the one or more load measurements may each be compared to a safety limit. For example, a left load measurement and a right load measurement may each be compared to the safety limit for the user. The safety limit may be determined for the user based on the user's disease protocol. There may be different safety limits for different portions of the user's body on the left and the right side, one extremity versus another extremity, a top portion of the user's body and a bottom portion of the user's body, etc., and for different exercises. For example, if someone underwent left knee surgery, the safety limit for a user for a left load measurement for a leg-press-style exercise may be different from the safety limit for a right load measurement for that exercise and user. If the safety limit is exceeded, an indication may be presented on the user interface to instruct the user to reduce the amount of force the user is applying and/or to instruct the user to stop applying force because the safety limit is exceeded.
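The per-side safety-limit comparison above can be sketched as follows, with per-side limits that may differ (as in the left-knee-surgery example). The data representation and function name are assumptions for illustration.

```python
# Illustrative sketch: compare each side's load measurement against its
# own safety limit. Limits may differ per side and per exercise (e.g.,
# a lower left-side limit after left knee surgery). Returns the sides
# that exceeded their limit so the UI can instruct the user to reduce
# or stop applying force.

def check_safety(left_load, right_load, limits):
    """limits: mapping with per-side safety limits, e.g.
    {"left": 100, "right": 150}. Returns a list of sides that
    exceeded their limit (empty when the exercise is within limits)."""
    exceeded = []
    if left_load > limits["left"]:
        exceeded.append("left")
    if right_load > limits["right"]:
        exceeded.append("right")
    return exceeded
```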
For those with any or all of the osteoporosis-exacerbating medical conditions described herein, such a control system and exercise machine can slow the rate of net bone loss by enabling osteogenesis to occur without requiring exertions that would not be possible for someone whose health is fragile rather than robust. Another benefit of the present disclosure, therefore, is its ability to speed the healing of fractures in athletically robust individuals. Further, another benefit is the increase in muscle mass by using the exercise machine to trigger muscular hypertrophy. The control system may provide an automated interface that improves compliance with an exercise plan by using a real-time feedback loop to measure loads added during each of the exercises, compare the load measurements to target thresholds and/or safety limits that are uniquely determined for the user using the exercise machine, and provide various indications based on the comparison. For example, the indications may pertain to when the user should add more load, when the target thresholds are exceeded, when the safety limit is exceeded, when the exercise is complete, when the user should begin another exercise, and so forth.
Bone Exercises and their Benefits
The following exercises achieve bone-strengthening results by exposing relevant parts of a user's body to isometric forces that are selected multiples of body weight (MOB) of the user, at or above a threshold level above which bone mineral density increases. A MOB may be any fraction or rational number excluding zero. The specific MOB multiple necessary to effect such increases will naturally vary from individual to individual and may be more or less for any given individual. “Bone-strengthening,” as used herein, specifically includes, without limitation, a process of osteogenesis, whether due to the creation of new bone as a result of an increase in bone mineral density, or due proximately to the introduction or causation of microfractures in the underlying bone. The exercises referred to are as follows.
Leg Press
A leg-press-style exercise to improve isometric muscular strength in the following key muscle groups: gluteals, hamstrings, quadriceps, spinal extensors, and grip muscles, as well as to increase resistance to skeletal fractures in leg bones such as the femur. In one example, the leg-press-style exercise can be performed at approximately 4.2 MOB or more of the user.
Chest Press
A chest-press-style exercise to improve isometric muscular strength in the following key muscle groups: pectorals, deltoids, triceps, and grip muscles, as well as to increase resistance to skeletal fractures in the humerus, clavicle, radial, ulnar, and rib pectoral regions. In one example, the chest-press-style exercise can be performed at approximately 2.5 MOB or more of the user.
Suitcase Lift
A suitcase-lift-style exercise to improve isometric muscular strength in the following key muscle groups: gluteals, hamstrings, quadriceps, spinal extensors, abdominals, and upper back and grip muscles as well as to increase resistance to skeletal fractures in the femur and spine. In one example, the suitcase-lift-style exercise can be performed at approximately 2.5 MOB or more of the user.
Arm Curl
An arm-curl-style exercise to improve isometric muscular strength in the following key muscle groups: biceps, brachialis, brachioradialis, grip muscles, and trunk, as well as to increase resistance to skeletal fractures in the humerus, ribs, and spine. In one example, the arm-curl-style exercise can be performed at approximately 1.5 MOB or more of the user.
Core Pull
A core-pull-style exercise to improve isometric muscular strength in the following key muscle groups: elbow flexors, grip muscles, latissimus dorsi, hip flexors, and trunk, as well as to increase resistance to skeletal fractures in the ribs and spine. In one example, the core-pull-style exercise can be performed at approximately 1.5 MOB or more of the user.
Grip Strength
A grip-strengthening-style exercise, which may preferably be situated at a station of an exercise machine, to improve strength in the muscles of the hand and forearm. Grip strength is medically salient because it has been positively correlated with better states of health.
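The MOB multiples listed for the exercises above translate directly into per-user target loads. A minimal sketch, assuming loads are tracked in pounds and using illustrative names:

```python
# Hedged sketch: converting the MOB multiples listed above into per-user
# target loads. The MOB values come from the text; the function and
# dictionary names are illustrative assumptions.

MOB_TARGETS = {
    "leg_press": 4.2,
    "chest_press": 2.5,
    "suitcase_lift": 2.5,
    "arm_curl": 1.5,
    "core_pull": 1.5,
}

def target_load_lbs(exercise, body_weight_lbs):
    """Target load (in pounds) = MOB multiple x user's body weight."""
    return MOB_TARGETS[exercise] * body_weight_lbs

# A 150 lb user performing the leg-press-style exercise:
print(target_load_lbs("leg_press", 150))  # 630.0
```

A control system could compare measured loads against these targets to decide when to indicate that the user should add more load.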
In some embodiments, a balance board may be communicatively coupled to the control system. For example, the balance board may include a network interface that communicates with the control system via any suitable interface protocol (e.g., Bluetooth, WiFi, cellular). The balance board may include pressure sensors and may obtain measurements of the locations and amounts of pressure applied to the balance board. The measurements may be transmitted to the control system. The control system may present a game or interactive exercise on a user interface. The game or interactive exercise may modify screens or adjust graphics that are displayed based on the measurements received from the balance board. The balance board may be used by a user to perform any suitable type of plank (e.g., knee plank, regular feet and elbow plank, table plank with elbows, or the like). Accordingly, the balance board may be configured to be used with arms on the balance board, knees on the balance board, and/or feet standing on the balance board. The games or interactive exercises may encourage the user, for example, to increase compliance and neuro-motor control after a surgery.
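One plausible way the control system could turn raw balance-board pressure readings into a graphics adjustment is a center-of-pressure estimate. The sensor layout and names below are assumptions for illustration:

```python
# Illustrative sketch: reducing balance-board pressure-sensor readings to a
# center-of-pressure point that a game could use to move on-screen graphics.
# The sensor layout (corner sensors on a unit square) is an assumption.

def center_of_pressure(readings):
    """Pressure-weighted average of sensor positions.

    `readings` is a list of (x, y, pressure) tuples from the board's
    pressure sensors; returns the (x, y) center of pressure.
    """
    total = sum(p for _, _, p in readings)
    if total == 0:
        return (0.0, 0.0)
    x = sum(xi * p for xi, _, p in readings) / total
    y = sum(yi * p for _, yi, p in readings) / total
    return (x, y)

# Four corner sensors; the user is leaning toward the right (+x) edge.
cop = center_of_pressure([(-1, -1, 10), (1, -1, 30), (-1, 1, 10), (1, 1, 30)])
print(cop)  # (0.5, 0.0)
```

The game could then map the returned point to an on-screen cursor or avatar position each time a new batch of measurements arrives.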
The exercise machine, balance board, wristband, goniometer, and/or any suitable accessory may be used for various reasons in various markets. For example, users may use the exercise machine, balance board, wristband, goniometer, and/or any suitable accessory in the orthopedic market if the users suffer from chronic musculoskeletal pain (e.g., in the knees, hips, shoulders, and back). The exercise machine, balance board, wristband, goniometer, and/or any suitable accessory may be used to help with prehabilitation (prehab), as well as to optimize post-surgical outcomes. Users may use the exercise machine, balance board, wristband, goniometer, and/or any suitable accessory in the back and neck pain market if the users suffer from chronic back and neck pain and want to avoid surgery and experience long-term relief, as well as users who are in recovery following surgery. Users may use the exercise machine, balance board, wristband, goniometer, and/or any suitable accessory in the cardiovascular market if they desire to prevent or recover from life-threatening cardiovascular disease, especially heart attacks and stroke. Users may use the exercise machine, balance board, wristband, goniometer, and/or any suitable accessory in the neurological market if they desire to recover from stroke, or have conditions like Parkinson's Disease and/or Multiple Sclerosis, and desire to achieve better balance, strength, and muscle symmetry in order to slow progression of the medical condition.
In some embodiments, bone growth, muscle growth, rehabilitation, prehabilitation, and the like may be needed to perform certain physical activities. For example, a person may require a certain amount of muscle mass to move an object having a particular weight. While the physical activity may be desirable, some people may lack the appropriate bone mass, muscle mass, or physical ability in general to perform the physical activity. In one example, a grandparent may desire to play with their grandchildren, and may want to select that physical activity as a goal. However, the grandparent may not be aware of what levels of attainment are associated with the physical activity goal of playing with their grandchildren. As used herein, levels of attainment may refer to any physical, emotional, intellectual or other quality associated with such attainment, e.g., strength, endurance, balance, intelligence, neurological responsiveness, emotional well-being, range of motion, and mobility. A user may lack the proper knowledge, training, and/or education to determine which exercises to perform to target appropriate body portions used, for example, to achieve the appropriate levels of attainment to be able to play with their grandchildren. Further, another problem that users may experience is the ability to determine when the user may be at risk for having or developing various comorbidities in a real-time or near real-time manner. Such knowledge may be useful for a user to prevent the comorbidity from arising and/or to encourage or suggest to a user to consult with a health professional to take preventative care measures.
Accordingly, some embodiments of the present disclosure provide a technical solution for enabling a user to select one or more physical activity goals they desire to achieve and for generating an improved exercise plan that enables the user to achieve the one or more physical activity goals. The system may use an artificial intelligence engine to generate machine learning models that use one or more curated, multi-disciplinary data sources to generate the improved exercise plan. A given data source may include associations between the selected physical activity goal and one or more levels of attainment pertaining to achieving the physical activity goal, associations between the one or more levels of attainment and one or more body portions, and associations between the one or more body portions and one or more exercises that target the one or more body portions. Using the data source, the artificial intelligence engine may generate a machine learning model to use the associations to generate improved exercise plans. Further, a machine learning model may be trained to predict a length of time it will take a user, if they follow the improved exercise plan, to achieve their physical activity goal.
The levels of attainment may be objectively monitored and/or measured using various performance measurements from one or more sensors, characteristics of users of the exercise machine, user-reported difficulty levels of exercises, user-reported pain levels, and the like. An onboarding protocol may be used to establish a baseline describing a fitness level of the user, and the fitness level of the user, in the improved exercise plan, may be used to select difficulty levels of exercises. A machine learning model may be trained to perform the onboarding protocol and to determine the fitness level of the user. The improved exercise plan may be dynamically updated based on characteristics of the user, selected physical activity goals, performance measurements, user-reported difficulties of the exercises, user-reported pain levels, and the like. In some embodiments, to comply with the exercise plan, the exercise machine may be controlled using a signal that indicates changing an attribute of an operating parameter of the exercise machine. The control system may change the attribute of the operating parameter in response to receiving the signal.
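The feedback-driven plan update described above might, in its simplest form, nudge an exercise's difficulty level up or down from user-reported difficulty and pain. The thresholds and step sizes below are illustrative assumptions, not the disclosed values:

```python
# Hedged sketch: adjusting an exercise's difficulty level from user-reported
# difficulty and pain (both assumed to be on 1-10 scales). Thresholds and
# step sizes are illustrative assumptions.

def adjust_difficulty(current_level, reported_difficulty, reported_pain,
                      min_level=1, max_level=10):
    """Return a new difficulty level clamped to [min_level, max_level]."""
    if reported_pain >= 7:          # high pain: back off regardless of difficulty
        return max(min_level, current_level - 2)
    if reported_difficulty <= 3:    # too easy: increase intensity
        return min(max_level, current_level + 1)
    if reported_difficulty >= 8:    # too hard: decrease intensity
        return max(min_level, current_level - 1)
    return current_level            # comfortable range: keep as-is

print(adjust_difficulty(5, reported_difficulty=2, reported_pain=1))  # 6
```

A trained model could replace these fixed rules, but the input/output contract — reports in, updated plan parameters out — would be the same.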
In some embodiments, numerous enhanced user interfaces may be used to enable the user to create a profile, select physical activity goals, view generated improved exercise plans, perform exercises, view/listen to multimedia regarding the exercises, provide user-reported feedback, view comorbidity information, view evidential trails for the comorbidity information and the exercise plans, control the exercise machine, and the like. The user interfaces may present information in a beneficial manner, especially on a small screen used by mobile devices (e.g., smartphones, tablets), such that the user is presented with pertinent information without having to drill down into numerous other user interfaces or to open up different applications or websites. Accordingly, the enhanced user interface may improve the user's experience using a computing device, thus providing a technical improvement to computing technology.
The following discussion is directed to various embodiments of the present disclosure. Although these embodiments are given as examples, the embodiments disclosed should not be interpreted, or otherwise used, as limiting the scope of the disclosure, including the claims. In addition, one of ordinary skill in the art will understand that the following description has broad application, and the discussion of any embodiment is meant only to be exemplary of that embodiment, and not intended to intimate that the scope of the disclosure, including the claims, is limited to that embodiment.
The network interface devices may enable communication via a wireless protocol for transmitting data over short distances, such as Bluetooth, ZigBee, near field communication (NFC), etc. In some embodiments, the computing device 12 is communicatively coupled to the exercise machine 100 via Bluetooth. Additionally, the network interface devices may enable communicating data over long distances, and in one example, the computing device 12 may communicate with a network 20. Network 20 may be a public network (e.g., connected to the Internet via wired (Ethernet) or wireless (WiFi)), a private network (e.g., a local area network (LAN), wide area network (WAN), virtual private network (VPN)), or a combination thereof.
The computing device 12 may be any suitable computing device, such as a laptop, tablet, smartphone, or computer. The computing device 12 may include a display that is capable of presenting a user interface 18 of an application 17. The application 17 may be implemented in computer instructions stored on the one or more memory devices of the computing device 12 and executable by the one or more processing devices of the computing device 12. The application 17 may be a stand-alone application that is installed on the computing device 12 or may be an application (e.g., website) that executes via a web browser. The user interface 18 may present various screens to a user that enable the user to login, enter personal information (e.g., health information; a disease protocol prescribed by a physician, trainer, or caretaker; age; gender; activity level; bone density; weight; height; patient measurements; etc.), view an exercise plan, initiate an exercise in the exercise plan, view visual representations of left load measurements and right load measurements that are received from left load cells and right load cells during the exercise, view a weight in pounds that is pushed, lifted, or pulled during the exercise, view an indication when the user has almost reached a target threshold, view an indication when the user has exceeded the target thresholds, view an indication when the user has set a new personal maximum for a load measurement and/or pounds pushed, lifted, or pulled, view an indication when a load measurement exceeds a safety limit, view an indication to instruct the user to begin another exercise, view an indication that congratulates the user for completing all exercises in the exercise plan, and so forth, as described in more detail below.
The computing device 12 may also include instructions stored on the one or more memory devices that, when executed by the one or more processing devices of the computing device 12, perform operations to control the exercise machine 100.
The computing device 15 may execute an application 21. The application 21 may be implemented in computer instructions stored on the one or more memory devices of the computing device 15 and executable by the one or more processing devices of the computing device 15. The application 21 may present a user interface 22 including various screens to a physician, trainer, or caregiver that enable the person to create an exercise plan for a user based on a treatment (e.g., surgery, medical procedure, etc.) the user underwent and/or an injury (e.g., sprain, tear, fracture, etc.) the user suffered, view progress of the user throughout the exercise plan, and/or view measured properties (e.g., force exerted on portions of the exercise machine 100) of the user during exercises of the exercise plan. The exercise plan specific to a patient may be transmitted via the network 20 to the cloud-based computing system 16 for storage and/or to the computing device 12 so the patient may begin the exercise plan. The exercise plan may specify one or more exercises that are available at the exercise machine 100.
The exercise machine 100 may be an osteogenic, muscular strengthening, isometric exercise and/or rehabilitation assembly. Solid state, static, or isometric exercise and rehabilitation equipment (e.g., exercise machine 100) can be used to facilitate osteogenic exercises that are isometric in nature and/or to facilitate muscular strengthening exercises. Such exercise and rehabilitation equipment can include equipment in which there are no moving parts while the user is exercising. While there may be some flexing under load, incidental movement resulting from the tolerances of interlocking parts, and parts that can move while performing adjustments on the exercise and rehabilitation equipment, such flexions and movements do not exclude such equipment from the field of isometric exercise and rehabilitation equipment.
The exercise machine 100 may include various load cells 110 disposed at various portions of the exercise machine 100. For example, one or more left load cells 110 may be located at one or more left foot plates or platforms, and one or more right load cells may be located at one or more right foot plates or platforms. Also, one or more left load cells may be located at one or more left handles, and one or more right load cells may be located at one or more right handles. Each exercise in the exercise plan may be associated with both a left and a right portion (e.g., handle or foot plate) of the exercise machine 100. For example, a leg-press-style exercise is associated with a left foot plate and a right foot plate. The left load cell at the left foot plate and the right load cell at the right foot plate may independently measure a load added onto the left foot plate and the right foot plate, respectively, and transmit the left load measurement and the right load measurement to the computing device 12. The load added onto the load cells 110 may represent an amount of weight added onto the load cells. In some embodiments, the load added onto the load cells 110 may represent an amount of force exerted by the user on the load cells. Accordingly, the left load measurement and the right load measurement may be used to present a left force (e.g., in newtons) and a right force (e.g., in newtons). The left force and the right force may be totaled and converted into a total weight in pounds for the exercise. Each of the left force, the right force, and/or the total weight in pounds may be presented on the user interface 18.
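The totaling-and-conversion step described above is a straightforward unit conversion. A minimal sketch, using the standard pound-force/newton relationship and illustrative names:

```python
# Sketch of the left/right load presentation described above: independent
# left and right forces in newtons are totaled and converted to pounds.
# The conversion constant is standard (1 lbf = 4.44822 N); the function
# name is an illustrative assumption.

NEWTONS_PER_POUND = 4.44822

def total_weight_lbs(left_newtons, right_newtons):
    """Total of the left and right load-cell forces, expressed in pounds."""
    return (left_newtons + right_newtons) / NEWTONS_PER_POUND

# Left and right load cells during a leg-press-style exercise:
left, right = 900.0, 880.0
print(round(total_weight_lbs(left, right), 1))  # 400.2
```

The user interface could then display the left force, right force, and this total side by side.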
In some embodiments, the cloud-based computing system 16 may include one or more servers 28 that form a distributed, grid, and/or peer-to-peer (P2P) computing architecture. Each of the servers 28 may include one or more processing devices, memory devices, data storage, and/or network interface devices. The servers 28 may be in communication with one another via any suitable communication protocol. The servers 28 may store profiles for each of the users that use the exercise machine 100. The profiles may include information about the users such as one or more disease protocols, one or more exercise plans, a historical performance (e.g., loads applied to the left load cell and right load cell, total weight in pounds, etc.) for each type of exercise that can be performed using the exercise machine 100, health, age, race, credentials for logging into the application 17, and so forth.
In some embodiments, the cloud-based computing system 16 may include a training engine 50 and/or an artificial intelligence engine 65. The cloud-based computing system 16 may include one or more servers 28 that execute the artificial intelligence engine 65 that uses one or more machine learning models 60 to perform at least one of the embodiments disclosed herein. In some embodiments, the training engine 50 may be included as part of the artificial intelligence engine 65 and the artificial intelligence engine 65 may execute the training engine 50. In some embodiments, the artificial intelligence engine 65 may use the training engine 50 to generate the one or more machine learning models 60.
The artificial intelligence engine 65, the training engine 50, and/or the one or more machine learning models 60 may be communicatively coupled to the servers 28 or may be included in one of the servers 28. In some embodiments, the artificial intelligence engine 65, the training engine 50, and/or the machine learning models 60 may be included in the computing device 12.
The one or more machine learning models 60 may refer to model artifacts created by the artificial intelligence engine 65 and/or the training engine 50 using training data that includes training inputs and corresponding target outputs (correct answers for respective training inputs). The training engine 50 may find patterns in the training data that map the training input to the target output (the answer to be predicted), and provide the machine learning models 60 that capture these patterns. As described in more detail below, the set of machine learning models 60 may be composed of, e.g., a single level of linear or non-linear operations (e.g., a support vector machine (SVM)) or may be a deep network, i.e., a machine learning model composed of multiple levels of non-linear operations. Examples of deep networks are neural networks, including convolutional neural networks, recurrent neural networks with one or more hidden layers, and fully connected neural networks.
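The pattern-finding idea above — map training inputs to target outputs — can be illustrated with the simplest possible model. The one-feature least-squares fit below stands in for the much richer models 60; the toy data is fabricated purely for illustration:

```python
# Minimal pure-Python illustration of the training idea described above:
# find a pattern mapping training inputs to target outputs. A one-feature
# ordinary-least-squares fit stands in for the models 60; the data is
# fabricated for illustration only.

def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

# Toy mapping: baseline fitness score (input) -> weeks to reach a goal (target).
a, b = fit_line([1, 2, 3, 4], [10, 8, 6, 4])
print(a, b)  # -2.0 12.0
```

A real model 60 would take many more inputs (characteristics, performance measurements, reported pain, etc.) and could be an SVM or deep network, but the training contract is the same: inputs and target outputs in, a predictive artifact out.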
In some embodiments, the training data may include various inputs (e.g., a physical activity goal, range of motion of users, user-reported pain level of users, user-reported difficulty levels of exercises, exercise information, levels of attainment, characteristics of users (e.g., age, weight, height, gender, procedures performed, condition of user, goals for outcomes of exercising, etc.), performance measurements, and the like) and mapped outputs. The mapped outputs may include an exercise plan composed of various exercise sessions each including various exercises, a schedule of the exercise sessions, etc. In some embodiments, the training data may include other inputs (e.g., state of the exercise session, exercise, or exercise machine 100; progress of the user; events; characteristics of the user; measurements received from sensors; etc.) and other mapped outputs. The other mapped outputs may include comorbidity information pertaining to the user. The other mapped outputs may further include multimedia (e.g., video/audio) clips or segments for a virtual coach to speak, graphic images, video, and the like to be presented on the user interface 18 of the computing device 12 before, during, or after the user performs the exercises. The virtual coach may be implemented in computer instructions as part of application 17 executing on the computing device 12. The virtual coach may be driven and controlled by artificial intelligence (e.g., via one or more machine learning models 60). For example, the machine learning model 60 may be trained to implement the virtual coach. Further, the training data may include inputs pertaining to user feedback and/or progress of the user and outputs pertaining to a persona for the virtual coach to implement. The training data may include inputs of the progress of the user (e.g., completion of an exercise) and output various incentives, rewards, and/or certificates.
The training data may include inputs of the progress of the user and/or the exercise plan and may output notifications pertaining to the progress and/or the exercise plan. The training data may include inputs of user-reported pain levels, user-reported difficulty of exercises, difficulty levels of the exercises, etc., and may include mapped outputs of modifying the exercise plan (e.g., removing an exercise, switching from one exercise to another, adding an exercise, modifying an exercise session, adding an exercise session, removing an exercise session, etc.). The training data may include one or more of measurements from sensors and/or of characteristics of users and may further include mapped outputs of control instructions that modify operating parameters of the exercise device 100, as described further herein. Further, the training data may include one or more user inputs and mapped outputs of a virtual character to present on a user interface, as described further herein. Further, the training data may include one or more data outputs (e.g., user feedback) and mapped outputs related to controlling an aspect of the exercise device 100. Further, the training data may include one or more measurements and mapped outputs related to modifying, on a user interface, various icons in a manner that positions or repositions the various icons relative to graphical elements that represent domains associated with an exercise plan, as described further herein. Further, the training data may include onboarding data and/or an onboarding protocol, as well as mapped outputs of an exercise plan, as described further herein. The machine learning model 60 may be trained using any and/or all of the training data.
In some embodiments, the training engine 50 may train the machine learning models 60 to output an exercise plan, wherein such plan may include a schedule of exercise sessions and selected exercises for each of the exercise sessions. Based on the inputs described herein, the trained machine learning model 60 may select the exercises by filtering a set of exercises included in a tagged data structure (e.g., data source). The machine learning model 60 may be trained to control the virtual coach executing on the computing device 12. The machine learning model 60 may also be trained to provide incentives, rewards, and/or certificates to the user. The machine learning model 60 may also be trained to modify the exercise plan and/or directly or indirectly control the exercise machine 100 based on the progress of the user and/or feedback of the user (e.g., indications of a difficulty level of an exercise). For example, if the user indicates an exercise is too easy, the machine learning model 60 may choose a new intensity for the exercise and the cloud-based computing system 16 may distally control the exercise machine 100 by increasing the intensity. Any suitable number of machine learning models 60 may be used. For example, separate machine learning models 60 may be used for each respective function described above, and the machine learning models 60 may be linked such that the output from one machine learning model 60 may be input into another machine learning model 60.
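The filtering step described above — selecting exercises from a tagged data structure — can be sketched simply. The tags and exercise entries below are illustrative assumptions, not the contents of the actual data source:

```python
# Sketch of the exercise-selection step described above: filtering a tagged
# data structure of exercises by the body portions a goal requires. The
# entries and tag names are illustrative assumptions.

EXERCISES = [
    {"name": "leg_press", "targets": {"quadriceps", "gluteals", "hamstrings"}},
    {"name": "chest_press", "targets": {"pectorals", "deltoids", "triceps"}},
    {"name": "arm_curl", "targets": {"biceps", "brachialis", "grip"}},
]

def select_exercises(required_portions, exercises=EXERCISES):
    """Keep exercises that target at least one required body portion."""
    return [e["name"] for e in exercises if e["targets"] & required_portions]

# A goal whose levels of attainment map to the quadriceps and grip muscles:
print(select_exercises({"quadriceps", "grip"}))  # ['leg_press', 'arm_curl']
```

In the trained system, the machine learning model 60 would drive this filtering from the goal-to-attainment-to-body-portion associations rather than a hard-coded set.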
The cloud-based computing system 16 may include a data source 67 that stores the training data for the training engine 50 and/or the artificial intelligence engine 65 to use to train the one or more machine learning models 60. The data source 67 may include exercises, physical activity goals, levels of attainment, body portions targeted by exercises, weights and/or parameters used to configure a prioritization of certain levels of attainment throughout an exercise schedule, comorbidity information, health-related information, audio segments, video segments, motivational quotations, and so forth. The data source 67 may include various tags and/or keys (e.g., primary, foreign, etc.) to associate items of the data with each other in the data source 67. The data source 67 may be a relational database, a pivot table, or any suitable type of data structure configured to store data used for any of the operations described herein.
During exercise, a user can grip and apply force to one of the pairs of load handles 104, 106, 108. The term “apply force” can include a single force, more than one force, a range of forces, etc. and may be used interchangeably with “addition of load”. Each load handle in the pairs of load handles 104, 106, 108 can include at least one load cell 110 for separately and independently measuring a force applied to, or a load added onto, respective load handles. Further, each foot plate 118 (e.g., a left foot plate and a right foot plate) can include at least one load cell 110 for separately and independently measuring a force applied to, or a load added onto, respective foot plates.
The placement of a load cell 110 in each pair of load handles 104, 106, 108 and/or foot plates 118 can provide the ability to read variations in the force applied between the left and right sides of the user. This allows a user or trainer to understand relative strength, which is also useful when recovering from an injury.
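The left/right comparison described above can be summarized as a single symmetry ratio for a user or trainer to track during recovery. The formula and names are assumptions for illustration:

```python
# Illustrative sketch: a symmetry ratio summarizing the left/right force
# variation read from the paired load cells. The formula (weaker side as a
# fraction of the stronger side) is an assumption, not the disclosed method.

def symmetry_ratio(left, right):
    """Weaker side as a fraction of the stronger side (1.0 = symmetric)."""
    if max(left, right) == 0:
        return 1.0
    return min(left, right) / max(left, right)

# E.g., a recovering left leg producing 320 N versus 400 N on the right:
print(symmetry_ratio(320.0, 400.0))  # 0.8
```

A ratio trending back toward 1.0 over successive sessions would be one simple way to visualize recovery progress on the user interface.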
In some embodiments, the assembly 101 further can include the computing device 12. One or more of the load cells 110 can be individually in electrical communication with the computing device 12 either via a wired or wireless connection. In some embodiments, the user interface 18 presented via a display of the computing device 12 may indicate how to perform an exercise, how much force is being applied, a target force to be applied, historical information for the user about how much force they applied at prior sessions, comparisons to averages, etc., as well as additional information, recommendations, notifications, and/or indications described herein.
In some embodiments, the assembly 101 further includes a seat 112 supported by the frame 102 in which a user sits while applying force to the load handles and/or foot plates. In some embodiments, the seat 112 can include a support such as a backboard 114. In some embodiments, the position of the seat 112 is adjustable in a horizontal and/or vertical dimension. In some embodiments, the angle of the seat 112 is adjustable. In some embodiments, the angle of the backboard 114 is adjustable. Examples of how adjustments to the seat 112 and backboard 114 can be implemented include, but are not limited to, using telescoping tubes and pins, hydraulic pistons, electric motors, etc. In some embodiments, the seat 112 can further include a fastening system 116.
In one example, the seat 112 can include a base 113 that is slidably mounted to a horizontal rail 111 of the frame 102. The seat 112 can be selectively repositionable and secured as indicated by the double-headed arrow. In another example, the seat 112 can include one or more supports 117 (e.g., two shown) that are slidably mounted to a substantially vertical rail 115 of the frame 102. The seat 112 can be selectively repositionable and secured as indicated by the double-headed arrow.
In some embodiments, a pair of foot plates 118 can be located angled toward and in front of the seat 112. The user can apply force to the foot plates 118 in a leg-press-style exercise.
In some embodiments, adjustments can be made to the position of the pair of foot plates 118. For example, these adjustments can include the height of the pair of foot plates 118, the distance between the pair of foot plates 118 and the seat 112, the distance between each foot plate of the pair of foot plates 118, the angle of the pair of foot plates 118 relative to the user, etc. In some embodiments, to account for natural differences in limb length or injuries, each foot plate of the pair of foot plates 118 can be adjusted separately.
In some embodiments, a first pair of load handles 104 can be located above and in front of the seat 112. The user can apply force to the load handles 104.
In some embodiments, adjustments can be made to the position of the first pair of load handles 104. For example, these adjustments can include the height of the first pair of load handles 104, the distance between the first pair of load handles 104 and the seat 112, the distance between each handle of the first pair of load handles 104, the angle of the first load handles 104 relative to the user, etc. In some embodiments, to account for natural differences in limb length or injuries, each handle of the first pair of load handles 104 can be adjusted separately.
In one example, the first pair of load handles 104 can include a sub-frame 103 that is slidably mounted to a vertical rail 105 of the frame 102. The first pair of load handles 104 can be selectively repositionable and secured as indicated by the double-headed arrow.
In some embodiments, a second pair of load handles 106 can be spaced apart from and in front of the seat 112. While seated, the user can apply force to the second pair of load handles 106.
In some embodiments, adjustments can be made to the position of the second pair of load handles 106. These adjustments can include the height of the second pair of load handles 106, the distance between the second pair of load handles 106 and the seat 112, the distance between each handle of the second pair of load handles 106, the angle of the second load handles 106 relative to the user, etc. In some embodiments, to account for natural differences in limb length or injuries, each handle of the second pair of load handles 106 can be adjusted separately.
In one example, the second pair of load handles 106 can include the sub-frame 103 that is slidably mounted to the vertical rail 105 of the frame 102. The sub-frame 103 can be the same sub-frame 103 provided for the first pair of load handles 104, or a different, independent sub-frame. The second pair of load handles 106 can be selectively repositionable and secured as indicated by the double-headed arrow.
In some embodiments (
In some embodiments, adjustments can be made to the position of the third pair of load handles 108. These adjustments can include the height of the third pair of load handles 108, the distance between the third pair of load handles 108 and the seat 112, the distance between each handle of the third pair of load handles 108, the angle of the third load handles 108 relative to the user, etc. In some embodiments, to account for natural differences in limb length or injuries, each handle of the third pair of load handles 108 can be adjusted separately.
In one example, each load handle 108 of the third pair of load handles 108 can include a sub-frame 109 that is slidably mounted in or to a vertical tube 107 of the frame 102. Each load handle 108 of the third pair of load handles 108 can be selectively repositionable and secured as indicated by the double-headed arrows.
In other embodiments (not shown), the third pair of load handles 108 can be reconfigured to be coaxial and located horizontally in front of the user along an axis that is perpendicular to the vertical plane. The user can apply force to the third pair of load handles 108 in a deadlift-style exercise. Like the suitcase-lift-style exercise, the deadlift-style exercise can provide or enable osteogenesis, bone growth or bone density improvement for a portion of the skeletal system of the user. Further, the deadlift-style exercise can provide or enable muscular hypertrophy for one or more muscles of the user. In the deadlift-style exercise, the user can stand on the floor or a horizontal portion of the frame 102, bend their knees, hold the third pair of load handles 108 in front of them, and extend their legs to apply an upward force to the third pair of load handles 108. In some embodiments, the third pair of load handles 108 can be adjusted (e.g., rotated) from the described coaxial position used for the deadlift-style exercise, to the parallel position (
In general, the user interface 18 may present real-time visual feedback of the current load measurements or the current forces corresponding to the load measurements, a weight in pounds associated with the load measurements, incentive messages that encourage the user to exceed target thresholds (e.g., to trigger osteogenesis and/or muscular hypertrophy) and/or set personal records for maximum loads, historical performance of the user performing the exercise, and/or scripted prompts that display images of one or more body portions indicating proper technique for performing the exercise. The control system may provide various multimedia (visual, audio), and/or haptic feedback to encourage the user to exceed their target thresholds.
Initially, when the user has not added load onto any portion of the exercise machine 100 including one or more load cells 110, the computing device 12 may be operating in an idle mode. During the idle mode, the computing device 12 may be receiving load measurements at a first frequency from each data channel associated with an exercise. For example, there may be four data channels, one for each of a chest-press-style exercise, a leg-press-style exercise, a suitcase-lift-style exercise, and a pulldown-style exercise. Although four data channels are described for explanatory purposes, it should be understood that there may be any suitable number of data channels, where “any” refers to one or more. Each data channel may provide load measurements to the computing device 12 from a respective left load cell and a respective right load cell that are located at the portion of the exercise machine 100 where the user pushes or pulls for the respective exercises. The user interface 18 may present the load measurement from each left and right load cells (e.g., 8 load measurements for the 4 data channels associated with the 4 exercises). Further, any target thresholds and/or safety limits for the user performing the exercises may be presented on the user interface 18 during the idle mode. For example, a left target threshold, a right target load threshold, a safety limit, and/or a total weight target threshold for each of the exercises may be presented on the user interface 18 during the idle mode.
If the computing device 12 detects a minimum threshold amount of load (e.g., at least 10 pound-force (lbf)) added onto any of the load cells, the computing device switches from an idle mode to an exercise mode. The data channel including the load cell that sent the detected load measurement may be set to active by the computing device 12. Further, the computing device 12 may set the other data channels to inactive and may stop receiving load measurements from the load cells corresponding to the inactive data channels. The computing device 12 may begin reading data from the load cells at the active data channel at a second frequency higher (e.g., high frequency data collection) than the first frequency when the computing device 12 was operating in the idle mode. Further, the user interface 18 may switch to presenting information pertaining to the exercise associated with the active data channel and stop presenting information pertaining to the exercises associated with the inactive data channels.
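The idle-to-exercise mode transition described above can be sketched as follows. This is a minimal illustrative sketch, not the actual implementation of the computing device 12: the class name, the 10 lbf constant, and the two sampling frequencies are assumptions chosen for illustration.

```python
# Minimal sketch of the idle/exercise mode transition described above.
# All names (Controller, MIN_LOAD_LBF, the frequencies) are illustrative
# assumptions, not details taken from the disclosure.

MIN_LOAD_LBF = 10.0   # minimum load that wakes the device (e.g., 10 lbf)
IDLE_HZ = 10          # low-frequency polling while idle (assumed value)
EXERCISE_HZ = 100     # high-frequency collection while active (assumed value)

class Controller:
    def __init__(self, channels):
        # e.g., one data channel per exercise: chest press, leg press,
        # suitcase lift, pulldown
        self.channels = list(channels)
        self.mode = "idle"
        self.active_channel = None
        self.sample_hz = IDLE_HZ

    def on_sample(self, channel, left_lbf, right_lbf):
        """Process one pair of left/right load-cell readings from a channel."""
        if self.mode == "idle" and max(left_lbf, right_lbf) >= MIN_LOAD_LBF:
            # Switch to exercise mode: this channel becomes active, the
            # other channels are set inactive, and sampling speeds up.
            self.mode = "exercise"
            self.active_channel = channel
            self.sample_hz = EXERCISE_HZ
        return self.mode

ctl = Controller(["chest", "leg", "suitcase", "pulldown"])
ctl.on_sample("chest", 2.0, 3.0)    # below 10 lbf: stays idle
ctl.on_sample("chest", 12.5, 1.0)   # exceeds 10 lbf: switches to exercise mode
```

Once in exercise mode, only the active channel is read, at the higher frequency, until the exercise ends.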
For example, the user may grip the second pair of handles 106 and apply force. The computing device 12 may detect the load from the load cells 110 located at the second pair of handles 106 and may set the data channel associated with the chest-press-style exercise to active to begin high frequency data collection from the load cells 110 via the active data channel.
As depicted, the user interface 18 presents a left load measurement 1000 as a left force and a right load measurement 1002 as a right force in real-time or near real-time as the user is pressing on the second pair of handles 106. The values of the forces for the left load measurement 1000 and the right load measurement 1002 are presented. There are separate visual representations for the left load measurement 1000 and the right load measurement 1002. In some embodiments, these load measurements 1000 and 1002 may be represented in a bar chart, line chart, graph, or any suitable visual representation. In some embodiments, a left target threshold and a right target threshold for the user may be presented on the user interface 18. In some embodiments, there may be more than one left target threshold and more than one right target threshold. For example, the left target thresholds may relate to an osteogenesis target threshold determined using a user's disease protocol and/or a muscular strength target threshold determined using a historical performance of the user for a particular exercise. The right target thresholds may relate to an osteogenesis target threshold determined using a user's disease protocol and/or a muscular strength target threshold determined using a historical performance of the user for a particular exercise. For example, if the user fractured their left arm and is rehabilitating the left arm, but the user's right arm is healthy, the left osteogenesis target threshold may be different from the right osteogenesis target threshold.
If the left load measurement 1000 exceeds any of the left target thresholds, an indication (e.g., starburst) may be presented on the user interface 18 indicating that the particular left target threshold has been exceeded and/or osteogenesis and/or muscular hypertrophy has been triggered in one or more portions of the body. If the right load measurement 1002 exceeds any of the right target thresholds, an indication (e.g., starburst) may be presented on the user interface 18 indicating that the particular right target threshold has been exceeded and/or osteogenesis and/or muscular hypertrophy has been triggered in another portion of the body. Further, if either or both of the left and right target thresholds are exceeded, the indication may indicate that the exercise is complete and a congratulatory message may be presented on the user interface 18. In some embodiments, another message may be presented on the user interface 18 that encourages the user to continue adding load to set a new personal maximum left load measurement and/or right load measurement for the exercise.
In some embodiments, there may be a single target threshold to which both the left load measurement and the right load measurement are compared. If either of the left or right load measurement exceed the single target threshold, the above-described indication may be presented on the user interface 18.
In some embodiments, there may be a single safety limit to which the left and right load measurements are compared. The single safety limit may be determined based on the user's disease protocol (e.g., what type of disease the user has, a severity of the disease, an age of the user, the height of the user, the weight of the user, what type of injury the user sustained, what type of surgery the user underwent, the portion of the body affected by the disease, the exercise plan to rehabilitate the user's body, instructions from a caregiver, etc.). If either or both of the left and right load measurements exceed the single safety limit, an indication may be presented on the user interface 18. The indication may warn the user that the safety limit has been exceeded and recommend reducing the amount of load added to the load cells 110 associated with the exercise being performed by the user.
In some embodiments, more than one safety limit may be used. For example, if the user is rehabilitating a left leg, but a right leg is healthy, there may be a left safety limit that is determined for the left leg based on the user's disease protocol and there may be a right safety limit determined for the right leg based on the user's disease protocol. The left load measurement may be compared to the left safety limit, and the right load measurement may be compared to the right safety limit. If the left load measurement and/or the right load measurement exceed the left safety limit and/or the right safety limit, respectively, an indication may be presented on the user interface 18. The indication may warn the user that the respective safety limit has been exceeded and recommend reducing the amount of load added to the load cells 110 associated with the exercise being performed by the user.
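The per-side threshold and safety-limit comparisons described above can be sketched as a single check. This is an illustrative sketch only; the function name, argument names, and result structure are assumptions, and the actual control system is not limited to this form.

```python
# Illustrative sketch of the per-side target-threshold and safety-limit
# checks described above. Names and structure are assumptions.

def check_loads(left_lbf, right_lbf,
                left_target, right_target,
                left_safety, right_safety):
    """Compare left/right load measurements to their targets and safety limits."""
    result = {
        "left_target_met": left_lbf >= left_target,
        "right_target_met": right_lbf >= right_target,
        "left_safety_exceeded": left_lbf > left_safety,
        "right_safety_exceeded": right_lbf > right_safety,
    }
    # The total weight shown on the user interface is the sum of both sides.
    result["total_weight_lbf"] = left_lbf + right_lbf
    return result

status = check_loads(55.0, 70.0,
                     left_target=50.0, right_target=60.0,
                     left_safety=80.0, right_safety=90.0)
# Both targets met, neither safety limit exceeded; total weight is 125.0 lbf.
```

A target being met could trigger the starburst indication and congratulatory message, while a safety flag could trigger the warning to reduce load.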
Further, a total weight 1004 in pounds that is determined based on the left and right load measurements is presented on the user interface 18. The total weight 1004 may dynamically change as the user adds load onto the load cells 110. A target weight 1006 for the exercise for the current day is also presented. This target weight 1006 may be determined based on the user's historical performance for the exercise. If the total weight 1004 exceeds the target weight 1006, an indication (e.g., starburst) may be presented on the user interface 18 indicating that osteogenesis and/or muscular hypertrophy has been triggered. Further, the indication may indicate that the exercise is complete and a congratulatory message may be presented on the user interface 18. In some embodiments, another message may be presented on the user interface 18 that encourages the user to continue adding load to set a new personal maximum record for the exercise.
Additionally, the user interface 18 may present a left grip strength 1008 and a right grip strength 1010. In some embodiments, the left grip strength and the right grip strength may be determined based on the left load measurement and the right load measurement, respectively. Numerical values representing the left grip strength 1008 and the right grip strength 1010 are displayed. Any suitable visual representation may be used to present the grip strengths (e.g., bar chart, line chart, etc.). The grip strengths may only be presented when the user is performing an exercise using handles.
The user interface 18 may also present a prompt 1012 that indicates the body position the user should be in to perform the exercise, as well as indicate which body portions will be targeted by performing the exercise. The user interface 18 may present other current and historical information related to the user performing the particular exercise. For example, the user interface 18 may present a visual representation 1014 of the user's maximum weight lifted, pressed, pulled, or otherwise exerted force for the day or a current exercise session. The user interface 18 may present a visual representation 1016 of the user's previous maximum weight lifted, pressed, pulled, or otherwise exerted force. The user interface 18 may present a visual representation 1018 of the user's maximum weight lifted, pressed, pulled, or otherwise exerted force the first time the user performed the exercise. The user interface 18 may present one or more visual representations 1020 for a weekly goal including how many sessions should be performed in the week and progress of the sessions as they are being performed. The user interface 18 may present a monthly goal including how many sessions should be performed in the month and progress of the sessions as they are being performed. Additional information and/or indications (e.g., incentivizing messages, recommendations, warnings, congratulatory messages, etc.) may be presented on the user interface 18, as discussed further below.
After a person has an injury (e.g., sprain or fractured bone), a surgery (e.g., knee replacement), or a disease (e.g., muscular dystrophy), the person's body is typically in a weakened state (e.g., physically disabled). Thus, clinicians, such as doctors and physical therapists, can prescribe exercise plans for rehabilitating their patients. The exercises in these exercise plans help restore function, improve mobility, relieve pain, improve strength, improve flexibility, and, among other benefits, prevent or limit permanent physical disability in the patients. Patients who follow their exercise plans typically show signs of physical improvement and reduced pain at a faster rate (i.e., a faster rate of recovery or rehabilitation).
In addition, after an injury or surgery, patients typically become less active than they once were, and they may experience muscle loss. As explained above, muscles that are not used often may reduce in muscle mass and become weaker. To increase the muscle mass and/or reduce the rate of muscle loss, people may conduct exercises according to an exercise plan.
Balancing and/or resistance exercise may cause muscle tissue to increase. For example, balancing on a balance board or pushing and pulling on a stationary object (e.g., pedals of an exercise cycle) with a certain amount of force may trigger the cells in the associated muscle to change and cause the muscle mass to increase.
The subject matter disclosed herein relates to a control system for an exercise machine. The control system is not only capable of enabling an individual, preferably an individual recovering from a fracture, an injury, or a surgery, to easily engage in exercises according to an exercise plan, but is also capable of using predetermined thresholds or dynamically calculating them, such that the person using the exercise machine can be immediately informed, through real-time visual and/or other sensorial feedback, that goals of the exercise plan have been met or exceeded, thereby triggering osteogenesis for the subject bone (or bones), and/or that the muscular strength threshold has been exceeded, thereby triggering muscular hypertrophy for the subject muscle (or muscles). The control system may be used to improve compliance with an exercise plan, whereby the exercise plan includes one or more exercises.
The control system may receive one or more measurements, such as load measurements, associated with forces exerted by both the left and right sides on left and right portions (e.g., pedals, base, or platform) of the exercise machine to enhance osteogenesis, bone growth, bone density improvement, stability, flexibility, range of motion, and/or muscle mass. The one or more measurements (e.g., a load measurement) may be a left measurement of a load or an increased resistance added to a left load cell on a left portion of the exercise machine (e.g., a left pedal or a left portion of the platform) and a right measurement of a load or an increased resistance added to a right load cell on a right portion of the exercise machine (e.g., a right pedal or a right portion of the platform). A user interface may be provided by the control system that presents visual representations of the separately measured left and right loads or resistances where the respective left and right loads or resistances are added to the respective left and right load cells or sensors at the subject portions of the exercise machine. For example, the user interface may provide a video game that has an avatar representing the user (e.g., the patient in rehabilitation). The avatar may move in the video game and those moves may correlate with the moves of the patient. As the one or more measurements increase, the movement of the avatar may increase (e.g., if the video game is a car racing video game, as the patient increases the force exerted on the pedals, the speed of the avatar, in its car, will increase). Similarly, the control system may receive one or more measurements associated with speed, repetitions, balance, any other suitable measurement, or combination thereof. Such measurements can be used to move the avatar. The measurements can be received from sensors coupled to the exercise machine.
For example, sensors can be coupled to the pedals of the exercise machine or to a base of the exercise machine.
In some embodiments, initially, the control system may determine measurements in accordance with an exercise plan associated with each exercise of the video game. For example, there may be a first level of the video game that applies a first resistance to the pedals of the exercise machine (e.g., the cycling machine) and a second level of the video game that applies a second resistance to the pedals. Further, the control system may receive measurements associated with each exercise as a patient is using the exercise machine. The control system may generate a target threshold in accordance with an exercise plan associated with each exercise of the video game. For example, there may be a first threshold associated with the first level and a second threshold associated with the second level. The exercise may be complete when the one or more measurements are received and the one or more measurements exceed one or more target thresholds. For example, if the patient is playing the first level of the video game and one or more measurements exceed a first target threshold, the first level may end and the control system may select level two for the patient to play. In some embodiments, the control system may determine an average measurement by accumulating raw measurements over a certain period of time (e.g., 5 seconds) and averaging the raw measurements to smooth the data (e.g., eliminating jumps or spikes in the data) into an average measurement.
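The averaging step above can be sketched with a fixed-size window. This is a minimal illustration assuming one sample per second, so a window of 5 samples approximates the 5-second period mentioned; the class name and window handling are assumptions.

```python
from collections import deque

# Sketch of the smoothing step described above: raw measurements are
# accumulated over a window and averaged to eliminate jumps or spikes.
# The class name and windowing scheme are illustrative assumptions.

class SmoothedMeasurement:
    def __init__(self, window_size):
        # deque with maxlen drops the oldest sample once the window is full
        self.window = deque(maxlen=window_size)

    def add(self, raw):
        """Add one raw measurement and return the current average measurement."""
        self.window.append(raw)
        return sum(self.window) / len(self.window)

# Assuming one sample per second, window_size=5 approximates a 5-second average.
smoother = SmoothedMeasurement(window_size=5)
for raw in [40.0, 42.0, 100.0, 41.0, 43.0]:   # the 100.0 is a spike
    avg = smoother.add(raw)
# avg is about 53.2, far closer to the typical readings than the 100.0 spike
```

The averaged measurement, rather than the raw one, would then be compared to the target thresholds.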
The control system may compare the one or more measurements (e.g., raw measurements, or averaged measurements) to one or more target thresholds. In some embodiments, a single measurement may be compared to a single specific target threshold (e.g., a one-to-one relationship). In some embodiments, a single measurement may be compared to more than one specific target threshold (e.g., a one-to-many relationship). In some embodiments, more than one measurement may be compared to a single specific target threshold (e.g., a many-to-one relationship). In some embodiments, more than one measurement may be compared to more than one specific target threshold (e.g., a many-to-many relationship).
The target thresholds may be an osteogenesis target threshold, a muscular strength target threshold, a balance threshold, a speed threshold, a range of motion threshold, a repetition threshold, any other suitable threshold, or combination thereof. In addition to the threshold explanations described above, the balance target threshold, the speed threshold, and/or the range of motion threshold may be determined based on a rehabilitation protocol pertaining to the user, an age of the user, a gender of the user, a sex of the user, a height of the user, a weight of the user, a bone density of the user, an injury of the user, a type of surgery of the user, a type of bone fracture of the user, etc. A rehabilitation protocol may refer to any illness, disease, fracture, surgery, or ailment experienced by the user and any treatment instructions provided by a caretaker for recovery and/or healing. The rehabilitation protocol may also include a condition of health where the goal is to avoid a problem. Any of the target thresholds may be determined based on a historical performance of the user using the exercise machine (e.g., amount of pounds lifted for a particular exercise, amount of force applied associated with each body part, the range of motion for pedaling, the level of exertion, the level of pain, etc.) and/or other exercise machines, a fitness level (e.g., how active the user is) of the user, a diet of the user, a protocol for determining a muscular strength target, a range of motion target, etc.
The control system may determine whether the one or more measurements exceed the one or more target thresholds. Responsive to determining that the one or more measurements exceed the one or more target thresholds, the control system may cause a user interface to present an indication that the one or more target thresholds have been met or exceeded and an exercise is complete. For example, the user has completed a level of the video game. Additionally, when the one or more target thresholds are met or exceeded, the control system may cause the user interface to present an indication that instructs the user to apply additional force (less than a safety limit) to attempt to set a personal maximum record or achievement (e.g., of a rate of speed, of a level of stability, of a number of repetitions, of an amount of weight lifted, pressed, pulled, or otherwise exerted force) for that exercise. The control system may also determine that one or more target thresholds (e.g., a level of pain or an exertion level) are met or exceeded and end the exercise game being played. The control system may present the same game at an easier exercise game level or present a different game for the user to engage in different exercises to reduce the level of pain. In this way, the user can continue exercising rather than stopping the rehabilitation session due to pain. The video game may have one or more games, each of which has one or more exercises that target one or more muscle groups at one or more different levels of intensity.
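The level-progression logic above can be sketched as a small decision rule: advance when the level's target is exceeded, but step down to an easier level when a pain or exertion threshold is met. This is an illustrative sketch only; the function name, the dictionary of per-level targets, and the pain scale are assumptions.

```python
# Sketch of the level-progression decision described above. A level ends
# when its measurement exceeds the level's target threshold; a pain or
# exertion threshold being met drops the user to an easier level instead.
# All names and the 0-10 pain scale are illustrative assumptions.

def next_level(current_level, measurement, level_targets,
               pain_level=0, pain_threshold=7):
    """Return the level the control system presents next."""
    if pain_level >= pain_threshold:
        # Too much pain: step down so the user can keep exercising
        # rather than stopping the rehabilitation session.
        return max(1, current_level - 1)
    if measurement >= level_targets[current_level]:
        # Target met: the level is complete; advance (capped at the top level).
        return min(max(level_targets), current_level + 1)
    return current_level  # target not yet met: stay on this level

targets = {1: 50.0, 2: 65.0, 3: 80.0}
assert next_level(1, 55.0, targets) == 2                 # target exceeded: advance
assert next_level(2, 60.0, targets) == 2                 # below target: stay
assert next_level(2, 70.0, targets, pain_level=8) == 1   # pain: easier level
```

A fuller system might instead switch to a different game targeting different muscle groups, as the passage notes.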
Further, the user interface may present an indication when a measurement is approaching a target threshold for the user. In another example, when the measurement meets or exceeds the target threshold, the user interface may present an indication that the target threshold has been met or exceeded, respectively, and that the exercise is complete. The control system may provide visual and/or audio encouragement and/or coaching to the user during a video game. For example, as the user is nearing the target threshold, the control system may provide an audio of a human voice encouraging the user to maintain or increase speed on the cycling machine to earn an achievement or reach the end of the exercise game level. The control system may indicate, if there are any remaining incomplete exercise game levels of the video game as part of the exercise plan, that there is another game or another level (e.g., with a different exercise and/or goal) to be completed by the user. If there are no remaining games or levels (i.e., exercises in the exercise plan) to complete, then the user interface may present an indication that all exercises in the exercise plan are complete and the user can rest. In addition, when the exercise plan is complete, the control system may generate a performance report that presents various information (e.g., charts and graphs of the right and left measurements received during each of the exercises, left and right maximum loads for the user received during each of the exercises, historical right and left measurements received in the past, comparison of the current right and left measurements with the historical right and left measurements, an amount of pounds lifted or pressed that is determined based on the measurements for each of the exercises, percent gained in measurements over time, achievements earned, goals reached, exercise game levels completed, rankings as compared to a video game history of playing, etc.).
Further, the one or more measurements may each be compared to a safety limit. For example, a left measurement and a right measurement may each be compared to the safety limit for the user. The safety limit may be determined for the user based on the user's disease protocol. There may be different safety limits for different portions of the user's body on the left and the right side, one extremity versus another extremity, a top portion of the user's body and a bottom portion of the user's body, etc., and for different exercises. For example, if someone underwent left knee surgery, the safety limit for a left measurement for a cycling exercise using the left leg may be different from the safety limit for a right measurement for that exercise and user. If the safety limit is exceeded, an indication may be presented on the user interface to instruct the user to reduce the amount of force or speed that the user is applying and/or to instruct the user to stop applying force because the safety limit has been exceeded.
Another benefit of the present disclosure is its ability to speed the healing of fractures in athletically robust individuals. Further, another benefit is the increase in muscle mass by using the exercise machine to trigger muscular hypertrophy. The control system may provide an automated interface that improves compliance with an exercise plan by using a real-time feedback loop to measure loads added during each of the exercises (e.g., resistance applied to the pedals), compare the measurements to target thresholds and/or safety limits that are uniquely determined for the user using the exercise machine, and provide various indications based on the comparison. For example, the indications pertain to when the user should add more load, when the target thresholds are met or exceeded, when the safety limit is met or exceeded, when the exercise is complete, when the user should begin another game, when the user should begin another level of the exercise game, and so forth.
Rehabilitation Exercises and their Benefits
The following exercises achieve rehabilitation results by exposing relevant parts of a user to exercises that build strength, increase flexibility, increase range of motion, increase balance, increase coordination, decrease pain, decrease the amount of time required for recovery, or any combination thereof. In addition to the exercise machines or devices described above in this disclosure, the exercise machines or devices used to facilitate the rehabilitation exercises referred to herein are as follows.
Cycling Machine
A cycling machine refers to a stationary bicycle used as exercise equipment and/or rehabilitation equipment. The cycling machine includes pedals configured to rotate. The cycling machine may include attached handlebars or may be used in combination with detached handlebars. The cycling machine may include an attached seat or may be used in combination with a detached seat. The cycling machine can be used for exercise targeted to improve the following key muscle groups: gluteals, hamstrings, quadriceps, thighs, adductors, abs, and grip muscles, as well as to increase flexibility, range of motion, and strength.
Balance Equipment
Balance equipment refers to an exercise machine or device, such as a balance board or a rocker device, for a user to stand on and maintain balance and control as the balance board moves in various directions. The balance board can be used for exercise targeted to improve mobility, flexibility, proprioception, and strength in the following key muscle groups: peroneals, gluteals, hamstrings, quadriceps, thighs, adductors, abs, and grip muscles, as well as to increase flexibility, range of motion, and core strength.
The following discussion is directed to various embodiments of the present disclosure. Although these embodiments are given as examples, the embodiments disclosed should not be interpreted, or otherwise used, as limiting the scope of the disclosure, including the claims. In addition, one of ordinary skill in the art will understand that the following description has broad application, and the discussion of any embodiment is meant only to be exemplary of that embodiment, and not intended to intimate that the scope of the disclosure, including the claims, is limited to that embodiment.
Exercise machines can include moving parts to provide dynamic exercises to facilitate rehabilitation. A dynamic exercise can be, but is not limited to, an exercise where a user participates in an activity in which the user moves and some resistance or load may be provided against the movement of the user. The
Embodiments of a first housing 1914, generally indicated, can be coupled to the base 1902. The first housing 1914 can be disposed adjacent to the rear side 1906. A handlebar including one or more handles 1916 can be coupled to the first housing 1914. The handles 1916 can include grip pads to prevent slipping during use of the exercise machine 1900.
The exercise machine 1900 comprises a multidimensional exercise control system. The control system comprises a user interface 1918. The user interface can be coupled to the first housing 1914. The user interface 1918 may be or function as the user interface 18 in
Embodiments of a second housing 1920, generally indicated, can be coupled to the base 1902. The second housing 1920 can be disposed between the front and rear sides 1904, 1906. The second housing 1920 can be disposed adjacent to and/or coupled to the first housing 1914. In the present embodiment of the second housing 1920, and as illustrated in the drawings, the second housing 1920 is cylindrically shaped. However, the second housing 1920 could be of any shape.
A wheel 1926 can be operatively coupled to the exercise machine 1900. In certain embodiments, the exercise machine 1900 can have the wheel 1926 coupled to the base 1902. The wheel 1926 can be a single wheel 1926, and the wheel 1926 may be a flywheel. In certain embodiments, the exercise machine 1900 can have a pair of wheels, and the wheels may be flywheels. The wheel 1926 can be disposed in the second housing 1920, and the wheel 1926 can be independently rotatable about an axis. The wheel 1926 can be disposed in a cavity of the second housing 1920. The wheel 1926 can be partially disposed in an opening of the second housing 1920. One of skill in the art will appreciate that the wheel 1926 may be coupled to the base 1902 by various means known in the art. As one example, a support beam can extend from the base 1902 to a first axle, where the axle extends along the axis. In this embodiment, the wheel 1926 can be coupled to and independently rotatable about the axle.
In some embodiments, a motor may be disposed in the second housing 1920 and may be configured to be controlled by the computing device 12, the computing device 15, and/or the cloud-based computing system 16. The motor may be configured to operate at a desired speed, which may be dynamically modified by a control instruction. The motor may be electrically coupled, physically coupled, and/or communicatively coupled to the wheel 1926, and may drive the wheel 1926 to rotate at a desired revolutions per minute, which may cause pedals 1922, 1924 to rotate at or near the desired revolutions per minute. The revolutions per minute may be dynamically modified based on an attribute of an operating parameter specified in a control instruction received from the computing device 12, the computing device 15, and/or the cloud-based computing system 16. Further, a resistance provided by pedals 1922, 1924 may be dynamically configured.
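The dynamic modification of operating parameters by a control instruction can be sketched as follows. This is a minimal illustrative sketch; the `Motor` class, the dictionary-based instruction format, and the field names are assumptions, not the disclosure's actual control protocol.

```python
# Sketch of applying a control instruction's operating parameters to the
# motor, as described above. The instruction format is an illustrative
# assumption: a dictionary with optional "rpm" and "resistance" fields.

class Motor:
    def __init__(self):
        self.target_rpm = 0.0       # desired revolutions per minute
        self.pedal_resistance = 0   # resistance level provided by the pedals

    def apply_instruction(self, instruction):
        """Dynamically modify whichever operating parameters the instruction carries."""
        if "rpm" in instruction:
            self.target_rpm = float(instruction["rpm"])
        if "resistance" in instruction:
            self.pedal_resistance = instruction["resistance"]
        return self.target_rpm, self.pedal_resistance

motor = Motor()
motor.apply_instruction({"rpm": 60})         # pedals rotate at or near 60 rpm
motor.apply_instruction({"resistance": 3})   # resistance changes; rpm is unchanged
```

An instruction from the computing device 12, the computing device 15, or the cloud-based computing system 16 could carry either or both parameters.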
In some embodiments, the pedals 1922, 1924 may each be attached to a pedal arm for rotation about an axle. In some embodiments, the pedals 1922, 1924 may be movable on the pedal arms in order to adjust a range of motion used by a user in pedaling. For example, pedals located inwardly toward the axle correspond to a smaller range of motion than pedals located outwardly away from the axle. A pressure sensor may be attached to or embedded within each of the pedals 1922, 1924 for measuring an amount of force applied by the user on the pedals 1922, 1924. The measurement from the sensor may be sent to the cloud-based computing system 16. The sensor may communicate wirelessly with the computing device 12, the computing device 15, the cloud-based computing system 16, the exercise machine 1900, or the like. The pedals 1922, 1924 may be moved along the pedal arms based on operating parameters provided in a control instruction generated by a machine learning model 60. For example, a motor and/or actuator communicatively coupled to the pedals 1922, 1924 may cause the pedals to move along the pedal arms to desired positions associated with desired ranges of motion.
A machine learning model 60 may be trained to receive input (e.g., measurements) and to output a control instruction that causes an operating parameter of the exercise device to change. In some embodiments, the operating parameter may represent or correspond to one or more resistances provided by the one or more pedals 1922, 1924; a range of motion of the one or more pedals 1922, 1924; a speed of the motor; a revolutions per minute of the wheel 1926; or some combination thereof. It should be noted that the operating parameters may be controlled independently (e.g., a first pedal may be set to a first range of motion and a second pedal may be set to a second range of motion) or the operating parameters may be similarly controlled (e.g., each of the first and second pedals is set to the same range of motion).
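A minimal sketch of how such a control instruction might update operating parameters is shown below. The class, field names, and the attribute-per-parameter representation are hypothetical illustrations, not the disclosure's actual data format; the point is that unset attributes leave state unchanged, so each pedal can be controlled independently or both pedals can be set the same way.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ControlInstruction:
    # Hypothetical operating parameters; None means "leave unchanged".
    motor_rpm: Optional[float] = None
    left_rom: Optional[float] = None
    right_rom: Optional[float] = None

def apply_instruction(state: dict, instr: ControlInstruction) -> dict:
    """Apply only the attributes present in the instruction, so each
    pedal's range of motion can be adjusted independently."""
    updated = dict(state)
    for field in ("motor_rpm", "left_rom", "right_rom"):
        value = getattr(instr, field)
        if value is not None:
            updated[field] = value
    return updated

state = {"motor_rpm": 50.0, "left_rom": 30.0, "right_rom": 30.0}
# Independent control: each pedal gets its own range of motion.
independent = apply_instruction(state, ControlInstruction(left_rom=25.0, right_rom=40.0))
# Similar control: both pedals set to the same range of motion.
mirrored = apply_instruction(state, ControlInstruction(left_rom=35.0, right_rom=35.0))
```

In this sketch, a motor-speed change and a pedal adjustment are simply two instructions (or one instruction with both attributes set); the controller applies whatever the model emits.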
In some embodiments, a pair of pedals (e.g., a right pedal 1922 and a left pedal 1924) can be coupled to and extend from the wheel 1926. The pedals 1922, 1924 can be configured to be engaged by the user, and the pedals 1922, 1924 can facilitate rotation of the wheel 1926. The pedals 1922, 1924 can be movably coupled to the wheel 1926. More specifically, the pedals 1922, 1924 can be adjusted radially by the user to various positions to accommodate the needs of the user. During use of the exercise machine 1900, the user can sit in a seat 1930 and engage the pedals 1922, 1924. The seat 1930 may be detached from the exercise machine 1900. In some embodiments, the seat 1930 may be attached to the exercise machine 1900. It should be readily appreciated that the user may adjust the seat 1930 and/or the pedals 1922, 1924 to a desired position to accommodate the needs of the user for exercise or rehabilitation. When the user engages the pedals 1922, 1924, the user may apply a force to the respective pedals 1922, 1924 to engage and cause rotation of the wheel 1926. By engaging the respective pedals 1922, 1924 and applying a force to the same, the user engages various muscles to push the respective pedals 1922, 1924, which supports osteogenesis and/or increases a range of motion of the user's legs. The pedals 1922, 1924 may have straps or engagements for a user to engage with and pull the pedals 1922, 1924. Pulling the pedals 1922, 1924 may aid in the strengthening and rehabilitation of additional muscles. A sensor 1934 can be coupled to the right pedal 1922. An additional sensor 1936 can be coupled to the left pedal 1924. As described above, the sensors 1934, 1936 can be configured to collect sensor data correlating to the respective pedals 1922, 1924. The sensors 1934, 1936 can be Bluetooth sensors, load sensors, accelerometers, gyroscopes, magnetometers, any other suitable sensors, or a combination thereof.
To further support osteogenesis during use of the exercise machine 1900 by a user, the exercise machine 1900 can include a first resistance mechanism (not shown). The resistance mechanism can be coupled to the base 1902, and the resistance mechanism can be disposed in the second housing 1920 adjacent to the wheel 1926. When the pedals 1922, 1924 are engaged by the user, the resistance mechanism can be configured to resist rotation of the wheel 1926. The resistance mechanism may resist rotation of the wheel 1926 by any means known in the art.
It is to be appreciated that the exercise machine 1900 could comprise a motor 1928 coupled to the wheel 1926, where the motor 1928 is configured to affect or regulate the rotation of the wheel 1926. Moreover, the motor 1928 affects or regulates the rotation of the wheel 1926 by engaging the wheel 1926 and selectively causing or resisting rotation of the wheel 1926. The motor 1928 can engage the wheel 1926 by any means known in the art. In one example, the motor 1928 could engage gears to cause rotation of the wheel 1926. It is to be appreciated that the motor 1928 can operate congruently with or independently of the resistance mechanism to affect or regulate the rotation of the wheel 1926. In certain embodiments, the motor 1928 can cause rotation of the wheel 1926, and the motor 1928 can resist rotation of the wheel 1926. In other embodiments with the motor 1928 and the resistance mechanism, the motor 1928 can rotate the wheel 1926 and the resistance mechanism can resist or stop rotation of the wheel 1926 when the motor 1928 stops rotating the wheel 1926. For regulating or affecting the rotation of the wheel 1926, the present disclosure allows for many variations and combinations of the motor 1928 and the resistance mechanism.
During use of the exercise machine 1900 by a user, when the user applies a force to the pedals 1922, 1924, the control system can maintain a constant rotational velocity of the wheel 1926. Alternatively, in embodiments with more than one wheel, the wheels can be mechanically interconnected. For example, the wheels could be mechanically interconnected by a chain, belt, gear system, or any other means to maintain a constant rotational velocity between the wheels.
In a further embodiment of the exercise machine 1900, a control system can be coupled to an actuator, and the control system can be configured to control the actuator. Moreover, the control system can be configured to vary the resistance applied to the wheel 1926 to maintain a select rotational velocity thereof, and to stop rotation of the wheel 1926. More specifically, the control system can control the actuator to activate the resistance mechanism to vary the resistance applied to the wheel 1926. In certain embodiments, the control system can be coupled to the motor 1928, and the control system can be configured to control the motor 1928. Additionally, the control system can be configured to maintain select rotational velocities of the wheel 1926, and to stop rotation of the wheel 1926. More specifically, the control system can control the motor 1928 to maintain select rotational velocities of the wheel 1926 by rotating, resisting, or stopping rotation of the wheel 1926. It is to be appreciated that the control system may control the actuator and/or the motor 1928 simultaneously or independently to maintain the select rotational velocities of the wheel 1926. For communicating the rotational velocities or accelerations of the wheel 1926 to the control system, the control system may also include sensors located on the user or coupled to the wheel 1926. With the rotational velocities or accelerations received from the sensors, the control system can determine, with a processor of the control system, a select rotational velocity of the wheel 1926. The control system can then control the motor 1928 and/or the actuator to maintain the select rotational velocities of the wheel 1926.
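The velocity-maintenance behavior described above can be sketched as a single proportional control step, where a positive output corresponds to the motor driving the wheel and a negative output corresponds to resisting it. The gain, units, and function shape are illustrative assumptions, not values from this disclosure.

```python
def velocity_control_step(measured_rpm: float, select_rpm: float, gain: float = 0.1) -> float:
    """One proportional control step: return a torque adjustment that
    nudges the wheel toward the select rotational velocity."""
    error = select_rpm - measured_rpm
    return gain * error

# Wheel spinning slower than selected: positive (driving) adjustment.
drive = velocity_control_step(measured_rpm=55.0, select_rpm=60.0)
# Wheel spinning faster than selected: negative (resisting) adjustment.
resist = velocity_control_step(measured_rpm=65.0, select_rpm=60.0)
```

A real control system would likely add integral/derivative terms and saturation limits, and would route negative adjustments to the resistance mechanism rather than the motor where appropriate.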
In some embodiments of the exercise machine 1900, a switch, not illustrated, can be disposed on the first housing 1914 for activating the control system. In another embodiment, a button, not illustrated, may be disposed on the first housing 1914 for activating the control system. In yet another embodiment, a display 1932 of a user interface 1918, such as a computer screen, iPad, or like device, can be coupled to the exercise machine 1900 to activate the control system. The switch, the display 1932, and/or the button may be coupled to the exercise machine 1900 by alternative or other means. For example, the switch, the display 1932, and/or the button could be coupled to the handle 1916. It is further to be appreciated that alternative means could be used to activate the control system, and the use of the switch, the display 1932, or the button is not meant to be limiting.
In another embodiment, one or more biometric sensors, not shown, may be coupled to the exercise machine 1900 for activating the control system. The biometric sensor could be for, inter alia, detection, recognition, validation and/or analysis of data relating to: facial characteristics; a fingerprint, hand, eye (iris), or voice signature; DNA; and/or handwriting. In yet another embodiment, the biometric sensor can comprise position sensors located on the user. In addition, it is contemplated that advancements of such biometric sensors may result in alternative sensors that could be incorporated in the exercise machine 1900, i.e., biometric type sensors not currently on the market may be utilized. Further, the one or more biometric sensors may comprise a biometric system, which may be standalone or integrated.
In one embodiment, adjustment of exercise based on artificial intelligence, an exercise plan, and user feedback is disclosed. An exercise plan may include one or more exercise sessions. For example, an exercise plan may include a schedule of a certain number of exercise sessions for a certain time period (e.g., 3 exercise sessions each week for 4 weeks) that, if performed by the user, should result in a desired outcome (e.g., rehabilitation of a body part, strengthening a muscle, etc.). The exercise session may include one or more exercises for various sections (e.g., warm up, strength, flexibility, cycling, cool down, etc.). The exercise plan may be generated using artificial intelligence via one or more trained machine learning models as described herein. The exercise plan may include a plan of one or more exercise sessions including exercises for a patient for rehabilitating a body part. The exercise plan may include exercises for one or more muscle groups. The exercise plan may be generated by artificial intelligence and/or prescribed by a doctor, a physical therapist, or any other qualified clinician.
For example, a machine learning model may be trained to select one or more exercises for an exercise session based on various inputs. The inputs may include the pain level of the user, the range of motion of the user, and/or characteristics of the user. These inputs may be used to determine an exercise level of the user. The machine learning model may receive the exercise level as input and select corresponding exercises from a data structure by matching the exercise level of the user to exercises having a tagged corresponding user exercise level. Various other techniques may be used to select the exercises for the exercise session.
The machine learning models may be trained to control a virtual coach executing on a computing device associated with the exercise machine 100. The virtual coach may speak via a speaker of the computing device, may be a virtual avatar displayed on the user interface 18 of the computing device 12, and may cause one or more messages, emails, texts, notifications, prompts, etc. to be presented on the user interface 18. The virtual coach may perform actions based on various information, such as progress of the user performing an exercise, the exercise plan details, user feedback, and the like. For example, the virtual coach may provide encouragement to the user based on the progress of the user during an exercise. The virtual coach may provide incentives, rewards, and/or certificates to the user as the user completes exercises. The virtual coach may have a particular persona that is selected for a particular user. For example, some users may respond better and perform exercises completely in response to a nice and encouraging persona for the virtual coach, while other users may respond better to a more demanding and strict (e.g., drill sergeant) persona.
By tailoring the exercise plan for the specific user and dynamically adjusting it using artificial intelligence, compliance with the exercise plan may be enhanced. Further, the user may achieve their desired goal faster by using the generated exercise plan because it is based on their progress and feedback (e.g., pain level, exercise difficulty level). By achieving the desired outcome faster, computing resources (e.g., processing, memory, network, etc.) may be reduced because the exercise machine 100, the computing device 12, and/or the cloud-based computing system 16 may not have to continuously update the exercise plan.
Further, the virtual coach may provide a companion type of feel for the user, which may further cause the user to comply with the exercise plan more efficiently and completely, thereby achieving their desired outcome faster. The virtual coach may improve the user experience of using the computing device 12 and/or the exercise machine 100 because the persona may be selected specifically for the particular user. In some instances, the user may form a bond with the persona of the virtual coach if the persona matches a friend in real life, a family member, a significant other, or the like, and the bond may cause the user to feel a desire to listen to the virtual coach and/or complete the exercise plan such that they do not let the virtual coach down. Such a situation may also save computing resources because the exercise plan may not have to be adjusted and lengthened by adding additional exercise sessions.
As a result, various technical benefits may be achieved by the disclosed embodiments, as described above. Further, the user experience of using the exercise machine 100, the computing device 12, or both may be improved based on the disclosed techniques due to exercising with the virtual coach, the incentives, the rewards, the certificates, and the like.
The processing device may be configured to execute the instructions to receive user input data. As illustrated in
At 2702, the processing device may receive a set of inputs. The set of inputs may include an indication of a level of pain of the user, a range of motion of a body part of the user, a set of characteristics of the user, or some combination thereof. The indication of the level of pain of the user may be entered by the user using any suitable peripheral device (e.g., microphone, keyboard, touchscreen, mouse, etc.) at a particular user interface 18 displayed on a computing device 12. The range of motion of the body part of the user may be determined by the processing device by the user performing a baseline exercise for a certain amount of time. The baseline exercise may include setting a pedal at an initial position and having the user cycle at that position for a period of time. If the user does not experience any pain after the period of time, the pedal may be moved to a second position and the user may cycle for the period of time again. If the user experiences pain, then the range of motion of the user may be determined based on the previous position of the pedal at which the user was able to cycle without pain. As may be appreciated, if the user does not experience pain, the position of the pedal may continue to change until the user experiences pain, and the range of motion of the user may be determined based on the prior position where the user did not experience pain. The characteristics of the user may include an age of the user, a height of the user, a weight of the user, a gender of the user, a condition that caused the pain in the body part, one or more procedures performed on the user, a goal of the user, whether the user is in a pre-procedure stage or a post-procedure stage, or some combination thereof. The characteristics may be included in a user profile for the user that is stored at the cloud-based computing system 16, the computing device 12, the computing device 15, or some combination thereof.
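The pedal-stepping procedure for determining range of motion can be sketched as a simple loop: step through candidate pedal positions from smallest to largest range of motion, stop at the first position where the user reports pain, and take the previous pain-free position as the baseline. The function and parameter names here are illustrative, and the pain report is modeled as a callable for simplicity.

```python
def determine_range_of_motion(positions, pain_at):
    """Step the pedal through candidate positions, ordered from smallest
    to largest range of motion. `pain_at(pos)` returns True when the user
    reports pain at that position. Returns the last pain-free position,
    or None if even the initial position caused pain."""
    last_pain_free = None
    for pos in positions:
        if pain_at(pos):
            break
        last_pain_free = pos
    return last_pain_free

# Example: the user first reports pain once the pedal passes position 3,
# so the determined range of motion corresponds to position 3.
rom = determine_range_of_motion([1, 2, 3, 4, 5], pain_at=lambda p: p > 3)
```

In practice each position would be held for the period of time described above before moving on, and the pain report would come from the user interface rather than a callback.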
At 2704, the processing device may determine, based on the set of inputs, an exercise level of the user. The exercise levels may range from 1 to 5, where 1 is the lowest exercise level and 5 is the most advanced exercise level. Any suitable range of exercise levels may be used. The following chart illustrates an example of how the exercise level may be determined:
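Since the chart itself is not reproduced here, the following is only an illustrative way such a mapping might be computed: combine a pain score and a range-of-motion fraction into a single score, then bucket it into levels 1 through 5. The weights, scales, and formula are assumptions for demonstration, not values from the chart.

```python
def exercise_level(pain_level: int, rom_fraction: float) -> int:
    """Map inputs to an exercise level from 1 (lowest) to 5 (most advanced).
    pain_level: 0 (no pain) to 10 (worst pain).
    rom_fraction: achieved range of motion as a fraction of a healthy baseline.
    The 50/50 weighting is an illustrative assumption."""
    score = (10 - pain_level) / 10 * 0.5 + rom_fraction * 0.5
    # Bucket the combined score (0.0-1.0) into levels 1 through 5.
    return min(5, max(1, 1 + int(score * 5)))
```

User characteristics (age, procedure stage, etc.) could enter the same computation as additional weighted terms, or the mapping could be learned by the machine learning model 60 rather than hand-coded.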
At 2706, the processing device may generate, using the machine learning model 60, an exercise session for the user by selecting, based on the exercise level of the user, one or more exercises to be performed by the user using the exercise machine 100. In some embodiments, a data structure may include entries for a set of exercises (e.g., tens, hundreds, thousands, etc.) that are each tagged with an exercise level. For example, the processing device may tag each exercise of the set of exercises with a respective user exercise level. The machine learning model 60 may access the data structure to select the exercises for the exercise session by filtering the set of exercises, as further discussed with reference to
At 2708, the processing device may cause initiation of the exercise session on the exercise machine 100 and a virtual coach executed by the computing device 12 associated with the exercise machine 100 to provide instructions pertaining to the exercise session. The virtual coach may be driven by artificial intelligence via one or more trained machine learning models 60. For example, the trained machine learning models may receive various inputs, such as the exercise session for the user, the exercise being performed, instructions pertaining to the exercise being performed, completion of the exercise being performed, and progress of the exercise being performed, and may be trained to provide certain outputs based on the inputs. The virtual coach may output audible noise (e.g., speech) that pertains to the various inputs. For example, the virtual coach may say, via a speaker of the computing device 12, encouraging words while a user is performing an exercise, congratulatory words when the user completes an exercise, instructions when the user is about to start another exercise, and the like. The virtual coach may have a persona (e.g., a cheerleader type of persona, a drill sergeant type of persona) that is selected based on progress of the user, feedback of the user, or both, as described further below with reference to
In some embodiments, the processing device may receive, from the user while the user is performing an exercise of the one or more exercises in the exercise session, feedback pertaining to the exercise. The feedback may include an indication that the exercise is too easy or too hard. For example, the user may use a display screen or microphone of the computing device 12 to enter or say the exercise is “too easy” or “too hard”. Responsive to receiving the feedback, the processing device may cause an intensity of the exercise to increase or decrease. For example, if the user says “too easy” the intensity of the exercise may be increased. If the user says “too hard”, the intensity of the exercise may be decreased. Other dimensions, parameters, characteristics, etc. of the exercise or exercise session may be changed based on whether the feedback is too easy or too hard. For example, the other dimensions, parameters, characteristics, etc. may include a number of sets, a number of repetitions, a hold time, a rest time, and the like. When one of the dimensions, parameters, characteristics, etc. changes, the virtual coach may provide an indication of the change. For example, the virtual coach may say, via a speaker of the computing device 12, “The intensity for this exercise has increased”.
In some embodiments, the processing device may track how many times the user has provided the feedback for a particular exercise in an exercise session or across every exercise session in an exercise plan for the user. Responsive to determining the feedback has been received more than a threshold number of times (e.g., 3, 4, 5, etc.), the processing device may control, in real-time or near real-time, the exercise machine 100 to initiate a more advanced exercise than the exercise currently being performed, a less advanced exercise than the exercise currently being performed, or the like. Further, the processing device may remove the exercise for which the feedback was received more than the threshold number of times from subsequent exercise sessions and replace it with another exercise. The processing device may cause the virtual coach to provide an indication via the computing device 12 (e.g., voice emitted through the speaker, graphic on the user interface 18, text on the user interface 18, or the like) of the change to the exercise.
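The feedback-tracking behavior above can be sketched as a small counter keyed by exercise and feedback type, flagging an exercise for replacement once the count exceeds a threshold. The class name and the threshold value are illustrative assumptions.

```python
from collections import Counter

class FeedbackTracker:
    """Counts "too easy"/"too hard" feedback per exercise and flags when
    the exercise should be swapped for a more or less advanced one."""

    def __init__(self, threshold: int = 3):
        self.threshold = threshold
        self.counts = Counter()

    def record(self, exercise: str, feedback: str) -> bool:
        """Record one piece of feedback; return True once the feedback
        has been received more than `threshold` times for this exercise."""
        self.counts[(exercise, feedback)] += 1
        return self.counts[(exercise, feedback)] > self.threshold

tracker = FeedbackTracker(threshold=3)
# The first three "too easy" reports only adjust intensity; the fourth
# exceeds the threshold and triggers a swap to a more advanced exercise.
results = [tracker.record("cycling", "too easy") for _ in range(4)]
```

When `record` returns True, the processing device would control the exercise machine to initiate the replacement exercise and remove the flagged exercise from subsequent sessions, with the virtual coach announcing the change.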
In some embodiments, the processing device may monitor the progress of the user while the user uses the exercise machine to perform the one or more exercises. The progress may include an amount of time the user performs the one or more exercises, the range of motion of the user while the user performs the one or more exercises, the level of pain of the user while the user performs the one or more exercises, whether the user completes the one or more exercises, an indication by the user of a level of difficulty of the one or more exercises, or some combination thereof. The progress may be determined based on measurement data received from any sensor associated with the exercise machine 100, any user feedback received by the computing device 12, and the like. The user may use any suitable peripheral to input the level of difficulty (e.g., too hard or too easy) while the user performs the exercises. The processing device may adjust, by executing the machine learning model 60, a subsequent exercise session based on the progress of the user. The adjusting may be based on advancing the exercise level of the user to a next exercise level, on achieving a desired goal as defined by the user and/or a medical professional, or some combination thereof.
In some embodiments, the processing device may monitor progress of the user while the user uses the exercise machine 100 to perform the one or more exercises. The processing device may cause, based on the progress of the user, an incentive, reward, or both to be elicited by the computing device 12 associated with the exercise machine 100. The incentive, reward, or both may include an animation, video, audio, haptic feedback, image, push notification, email, text, or some combination thereof. The processing device may cause the virtual coach to perform an encouraging action (e.g., shoot virtual fireworks on the user interface 18, cause an avatar displayed on the user interface 18 to dance or give a virtual high five, emit an audible noise from the speaker congratulating the user). Providing incentives, rewards, or both may encourage the user to continue to perform exercises and comply with the exercise session, which in turn, may decrease the amount of time it takes for the user to achieve their goal. Reducing the amount of time it takes for the user to achieve their goal may include technical benefits because if the user achieves their goal faster, the computing device 12, exercise machine 100, and/or the cloud-based computing system 16 may save computing resources (e.g., processing, memory, network) by not having to execute as long to guide the user through the exercise plan. That is, if the user does not comply with the exercise plan efficiently or as directed, then the exercise plan may be adjusted to add additional exercise sessions, thereby causing the computing device 12, exercise machine 100, and/or the cloud-based computing system 16 to execute longer and waste computing resources until the user achieves their goal.
In some embodiments, the processing device may determine when a number of incentives, rewards, or both elicited by the computing device 12 satisfy a threshold value (e.g., 3, 4, 5). Responsive to determining that the threshold value is satisfied, the processing device may cause a certificate to be transmitted to the computing device 12 and associated with an account of the user using the exercise machine 100. For example, the certificate may be stored in a digital wallet of the user's account in the application 17 executing on the computing device 12. In some embodiments, the certificate may have a particular value that may be exchanged for certain items (e.g., gift certificate, clothing, coupons, discounts, etc.).
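The reward-to-certificate logic can be sketched as a small ledger that counts rewards and issues a certificate each time the count reaches the threshold. The class name, threshold, and certificate format are illustrative assumptions.

```python
class RewardLedger:
    """Tracks incentives/rewards for a user account and issues a
    certificate each time the running count satisfies the threshold."""

    def __init__(self, threshold: int = 3):
        self.threshold = threshold
        self.rewards = 0
        self.certificates = []

    def add_reward(self, reward: str) -> None:
        self.rewards += 1
        if self.rewards % self.threshold == 0:
            # In the described system, the certificate would be transmitted
            # to the computing device and stored in the account's wallet.
            self.certificates.append(f"certificate-{len(self.certificates) + 1}")

ledger = RewardLedger(threshold=3)
for reward in ["animation", "video", "haptic", "fireworks", "high-five", "audio"]:
    ledger.add_reward(reward)
```

With a threshold of 3, six rewards yield two certificates (after the third and sixth rewards).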
In some embodiments, the processing device may determine, by executing the machine learning model 60, a set of audio segments for the virtual coach to say while the user performs the one or more exercises. The audio segments may be based on a state of the exercise (e.g., beginning, middle, end), progress of the user performing the exercise, or any suitable information. For example, at the initiation of the exercise, the audio segment may provide instructions to the user on the details of the exercise (e.g., 2 reps, 30 seconds, etc.). Based on the progress of the user, the audio segment may say “pedal faster” if the user is not pedaling fast enough, “good job” if the user is satisfying the criteria for the exercise, “almost finished” if the user is almost finished with the exercise, or the like. The audio segments may be dynamically determined, in real-time or near real-time, by the machine learning model 60 based on the inputs described above. It should be noted that real-time or near real-time may refer to a relatively short amount of time (e.g., less than 5 seconds) after an action occurring.
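The audio-segment selection described above can be sketched as a rule over the exercise state and the user's progress toward a target. The phrases come from the examples in the text; the thresholds and the rule-based form (rather than the machine learning model 60 itself) are simplifying assumptions.

```python
def select_audio_segment(state: str, progress: float, target: float) -> str:
    """Pick a coaching phrase from the exercise state and progress.
    state: "beginning", "middle", or "end"; progress/target share units
    (e.g., pedal revolutions or elapsed seconds)."""
    if state == "beginning":
        # At initiation, give the user the details of the exercise.
        return "instructions"
    if state == "end" or progress >= 0.9 * target:
        return "almost finished"
    if progress < 0.5 * target:
        # User is not pedaling fast enough relative to the target.
        return "pedal faster"
    return "good job"
```

In the described system the model would re-evaluate this choice in real-time or near real-time (e.g., within a few seconds of each measurement) rather than once per exercise.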
In some embodiments, the processing device may determine, by executing the machine learning model 60, a schedule of a set of exercise sessions to be performed by the user to achieve a desired goal specified by the user, a medical professional (e.g., physical therapist), or both. The machine learning model may be trained to determine the schedule based on various inputs, such as the desired goal (e.g., full recovery, near full recovery at a fastest pace possible, strength improvement, flexibility improvement, etc.), a procedure performed on the user, characteristics of the user (e.g., age, weight, height, etc.), a daily schedule of the user (e.g., job schedule, parenting schedule, school schedule, etc.), and the like. The schedule may be optimized for the user and may comply with the various inputs described above.
In some embodiments, the virtual coach may be controlled, in real-time or near real-time, by the machine learning model 60. For example, the virtual coach may provide indications (e.g., emit audible noises, present various screens or notifications or indications or avatars or graphics, etc.) via the computing device 12 as parameters of the exercise, exercise session, exercise machine 100, etc. change, or as characteristics or progress of the user changes.
The exercise Sitting Knee Extension has been tagged as a suitable exercise for levels 1, 2, and 3, and is an option for the section comprising Warm Up. It should be noted that each exercise session may include various sections: warm up, cardio, strength, cycle, cool down, flexibility, etc. Each section of an exercise session may be assigned one or more exercises that are appropriate for that section, based on the entry in the data structure 2750, and the exercise level of the user that matches the level in the data structure 2750.
At 2802, the processing device may filter a set of exercises to obtain the one or more exercises for a particular exercise session in an exercise plan. Operation 2802 may include operations 2804, 2806, 2808, 2810, and/or 2812.
At 2804, the processing device may identify, based on the tagging of the exercises in the data structure, a subset of exercises having the respective user exercise level that matches the exercise level of the user. At 2806, the processing device may identify a first subset of exercises having a respective section of a set of sections, wherein the set of sections include warm-up, cycling, strength, flexibility, or some combination thereof. At 2808, the processing device may identify a second subset of exercises that result in a desired outcome specified by a medical professional, wherein the desired outcome pertains to increasing a range of motion, mobility, strength, flexibility, or some combination thereof. At 2810, the processing device may identify, using a historical performance of the user, a third subset of exercises that have been performed by the user less than a threshold number of times. At 2812, the processing device may identify, based on feedback from the user, a fourth subset of exercises that have been performed by the user and indicated as being too easy or too hard for the user.
In some embodiments, the processing device may select at least one of the subset of exercises, the first subset of exercises, the second subset of exercises, the third subset of exercises, or the fourth subset of exercises as the one or more exercises for the exercise session. That is, any combination of the subset, the first subset, the second subset, the third subset, and the fourth subset of exercises may be selected as the one or more exercises for the exercise session.
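A sketch of the filtering at 2802 is shown below, applying the level filter (2804), section filter (2806), and history filter (2810); the outcome (2808) and feedback (2812) filters are omitted for brevity. The dictionary keys and catalog entries are hypothetical stand-ins for the data structure 2750.

```python
def filter_exercises(exercises, user_level, section, max_history=3):
    """Filter a tagged exercise catalog down to candidates for one
    section of an exercise session."""
    # 2804: keep exercises whose tagged levels match the user's level.
    by_level = [e for e in exercises if user_level in e["levels"]]
    # 2806: keep exercises assigned to the requested section.
    by_section = [e for e in by_level if e["section"] == section]
    # 2810: prefer exercises performed fewer than max_history times.
    return [e for e in by_section if e["times_performed"] < max_history]

catalog = [
    {"name": "Sitting Knee Extension", "levels": {1, 2, 3},
     "section": "warm-up", "times_performed": 1},
    {"name": "Sprint Interval", "levels": {4, 5},
     "section": "cycling", "times_performed": 0},
    {"name": "Slow Pedal", "levels": {1, 2},
     "section": "warm-up", "times_performed": 5},
]
warm_up = filter_exercises(catalog, user_level=2, section="warm-up")
```

The Sitting Knee Extension entry mirrors the example above: tagged for levels 1 through 3 and assigned to the warm-up section, so it survives the filters for a level-2 user, while the over-performed "Slow Pedal" (a hypothetical entry) is dropped by the history filter.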
At 2902, the processing device may receive, from the user while the user is performing an exercise of the one or more exercises, feedback pertaining to the exercise, wherein the feedback includes an indication of a level of difficulty of the exercise. For example, the feedback may be entered by the user using any suitable peripheral (e.g., microphone, touchscreen, mouse, keyboard, etc.) of the computing device 12. The feedback may include the user saying the exercise is too easy or too hard.
At 2904, the processing device may determine whether the feedback has been received more than a threshold number of times for the exercise. At 2906, responsive to determining the feedback has been received more than the threshold number of times for the exercise, the processing device may adjust, in real-time or near real-time, the exercise session. In some embodiments, adjusting the exercise session may include changing to another exercise, controlling the exercise machine to stop the exercise, removing the exercise from the exercise session, changing an intensity of the exercise, or some combination thereof. At 2908, the processing device may cause the virtual coach to provide an indication of the adjustment. The indication may be provided via the user interface 18, a speaker of the computing device 12, or the like.
At 3002, the processing device may select, for the virtual coach, a persona from a plurality of personas. The virtual coach may be implemented in computer instructions stored in a memory device and executable by a processing device. The virtual coach may include a particular voice (e.g., male, female) and have a particular persona. The persona may be randomly selected at first and the user's response to the persona may be tracked over time. The response may include whether the user performs the exercises completely or incompletely as the virtual coach guides the user through the exercises. The personas may range from a cheerleader type that provides a lot of encouragement to a drill sergeant type that is more aggressive, harsher, stricter, and/or demands compliance with the exercise or demands the user tries harder.
At 3004, the processing device may cause the virtual coach to provide instructions as the user performs the one or more exercises. The instructions may be provided visually on the user interface 18, audibly via a speaker of the computing device 12, or both.
At 3006, the processing device may monitor a parameter associated with the user while the user performs the one or more exercises. The parameter may include a vital sign (e.g., heartrate, blood pressure), sensor measurement data (e.g., ROM, pressure exerted on pedals, etc.), or characteristics of the user (e.g., respiratory rate, temperature, perspiration, etc.). The monitoring may be based on any suitable sensor measurement data associated with the user, the exercise machine 100, or both. In some embodiments, the parameter pertains to a progress of the user, an indication of whether the user likes the persona of the virtual coach, or both. For example, the user may provide feedback that they like the persona of the virtual coach via the user interface 18 or by speaking to the computing device 12 via a microphone.
At 3008, the processing device may select, based on the parameter, a subsequent persona for the virtual coach. For example, if the user indicated the user does not like the persona, the processing device may select a different persona for a subsequent exercise and/or exercise session.
At 3010, the processing device may switch, in real-time or near real-time, based on the parameter, to a different persona for the virtual coach while the user performs the one or more exercises. Dynamically switching may be based on whether the user is performing the exercise well or not. For example, if the user is pedaling at a substantially slower rate than desired for the exercise, the processing device may determine the user is not responding well to the persona and may switch to a different persona immediately during the exercise. The progress of the user may be tracked to see if the switch of personas impacts the progress of the user. Further, if the user indicates the user does not like the persona, the processing device may switch to a different persona immediately while the user performs the exercise.
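The dynamic persona switch at 3010 may be sketched as a simple decision function. The 80%-of-target pedaling rate cutoff and the persona names are illustrative assumptions, not parameters of the disclosed system:

```python
def choose_persona(current, user_likes_persona, pedal_rate, target_rate,
                   personas=("cheerleader", "drill_sergeant")):
    """Return the persona to use next, switching when the user dislikes the
    current persona or is performing substantially below the target rate."""
    if not user_likes_persona or pedal_rate < 0.8 * target_rate:
        # Switch to any other available persona.
        for persona in personas:
            if persona != current:
                return persona
    return current
```

For example, a user pedaling at 40 RPM against a 60 RPM target would be switched away from the current persona mid-exercise, while a user near the target keeps the current persona.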
Further, the user interface 1918 may present one or more visual representations 3106 of target load thresholds tailored for the user. For example, the one or more target thresholds may include a left target threshold, a right target threshold, or some combination thereof. Presenting the visual representations 3106 of the target thresholds concurrently with the real-time display of the measurements in the visual representations 3102 and/or 3104 may enable the user to determine how close they are to exceeding the target thresholds and/or when they exceed the target thresholds.
The computer system 3200 includes a processing device 3202, a main memory 3204 (e.g., read-only memory (ROM), flash memory, dynamic random access memory (DRAM) such as synchronous DRAM (SDRAM)), a static memory 3206 (e.g., flash memory, static random access memory (SRAM)), and a data storage device 3208, which communicate with each other via a bus 3210.
Processing device 3202 represents one or more general-purpose processing devices such as a microprocessor, central processing unit, or the like. More particularly, the processing device 3202 may be a complex instruction set computing (CISC) microprocessor, reduced instruction set computing (RISC) microprocessor, very long instruction word (VLIW) microprocessor, or a processor implementing other instruction sets or processors implementing a combination of instruction sets. The processing device 3202 may also be one or more special-purpose processing devices such as an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a digital signal processor (DSP), network processor, or the like. The processing device 3202 is configured to execute instructions for performing any of the operations and steps discussed herein.
The computer system 3200 may further include a network interface device 3212. The computer system 3200 also may include a video display 3214 (e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT)), one or more input devices 3216 (e.g., a keyboard and/or a mouse), and one or more speakers 3218 (e.g., a speaker). In one illustrative example, the video display 3214 and the input device(s) 3216 may be combined into a single component or device (e.g., an LCD touch screen).
The data storage device 3208 may include a computer-readable storage medium 3220 on which the instructions 3222 (e.g., implementing the application 17 or 21 executed by any device and/or component depicted in the FIGURES and described herein) embodying any one or more of the methodologies or functions described herein are stored. The instructions 3222 may also reside, completely or at least partially, within the main memory 3204 and/or within the processing device 3202 during execution thereof by the computer system 3200. As such, the main memory 3204 and the processing device 3202 also constitute computer-readable media. The instructions 3222 may further be transmitted or received over a network via the network interface device 3212.
While the computer-readable storage medium 3220 is shown in the illustrative examples to be a single medium, the term “computer-readable storage medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions. The term “computer-readable storage medium” shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present disclosure. The term “computer-readable storage medium” shall accordingly be taken to include, but not be limited to, solid-state memories, optical media, and magnetic media.
In some embodiments, the cloud-based computing system 16 and/or the computing device 12 may connect to and/or use an application programming interface (API) exposed by a third-party entity, such as an electronic medical records (EMR) system, and/or a social network system. The API may be used by the application 17 to extract information pertaining to EMR of the user, if proper authorization is given and authorization of a user account is completed. The EMR information may automatically populate the appropriate fields in the user profile. For example, the medical procedures identified in the EMR information may be populated. In some embodiments, the format of the data obtained by the API may be in a different format than the format the application 17 uses. In such an instance, the application 17 may transform the data's format into an acceptable format (e.g., extensible markup language (XML)) for the application 17. In some embodiments, the application 17 may use the API to access the user's information on a social media or social network system (e.g., Facebook®, Twitter®, Instagram®, etc.) to obtain information publicly available on the social network system.
The received input of the physical activity level and the pain level may be used by the one or more machine learning models 60 to generate an improved exercise plan. For example, the machine learning model may determine, using a data source including various associations, the levels of attainment associated with achieving the physical activity goal, where the levels of attainment may include range of motion, strength, endurance, balance, intelligence, neurological responsiveness, emotional well-being, and mobility. Further, the machine learning model 60 may determine which body portions to target for the various levels of attainment, and which exercises to select to include in the exercise plan that target the appropriate body portions. In some embodiments, the pain level reported by the user may be used to select exercises, difficulty levels of the exercises, and the like.
In some embodiments, upon the user's selecting the physical activity goal and the pain level, an onboarding protocol that uses a baseline fitness test may be initiated.
Selection of the graphical element 3504 may cause the machine learning model 60 to select a next exercise that is more difficult. The onboarding protocol may include exercises having tiered difficulty levels and may select subsequent exercises for the user to perform, wherein the subsequent exercises advance in difficulty until the user has either completed all of the exercises or reached a point where the user can no longer perform the exercise because it is too difficult or painful. The machine learning model 60 may determine a fitness level for the user based on a completion state (e.g., a degree of completion, a percentage of completion, a value of completion, etc.) of a last exercise performed by the user. The machine learning model 60 may select a difficulty level for each exercise in the improved exercise plan by associating the difficulty level for each exercise with the fitness level of the user.
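The tiered onboarding protocol may be sketched as follows, where the user attempts exercises of increasing difficulty until one cannot be completed. The completion-fraction scoring and the partial-credit rule are illustrative assumptions:

```python
def run_onboarding(tiered_exercises, attempt):
    """Advance through exercises of increasing difficulty until the user
    cannot complete one; return the inferred fitness level.

    `attempt(exercise)` is assumed to return a completion fraction in [0, 1].
    """
    fitness_level = 0
    for level, exercise in enumerate(tiered_exercises, start=1):
        completion = attempt(exercise)
        if completion < 1.0:
            # Award partial credit from the last exercise performed.
            return fitness_level + completion
        fitness_level = level
    return fitness_level
```

For example, a user who completes the first two tiers but only half of the third would receive a fitness level of 2.5, which may then be used to associate difficulty levels with exercises in the improved exercise plan.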
In some embodiments, a multimedia segment (e.g., recording or feed) may be presented in a digital media player 3506. The multimedia segment may include video and/or audio of a coaching character providing instructions and guidance on how to perform the first exercise. Various options may be provided by the digital media player that enable the user to play, pause, or stop the multimedia segment. There may be options to enable the user to fast forward or rewind the multimedia segment, as well.
As further depicted, each exercise includes an energy consumption metric (“50”). The energy consumption metric may vary for each exercise and it may provide a target metric for the user to achieve during each exercise. The energy consumption metric may be based on a combination of various types of information and metrics associated therewith, such as a metabolic indicator associated with performing the exercise, fitness results of the user, and/or a user-reported pain level of the user, among other information. The energy consumption metric may be determined for the user while they perform the exercise, and when the target energy consumption metric has been exceeded, the user may be done with the exercise. The application 17 may track the user's progress over time if and when the user exceeds or meets the target energy consumption metric. Each determined energy consumption metric for each exercise may be summed to determine an energy score associated with an amount of energy it will take to achieve the physical activity goal. If the summed energy consumption metrics equal or exceed the energy score, then the user may have enough energy to achieve the physical activity goal. As may be appreciated, the user may exceed or match the energy score faster or slower than predicted based on a number of factors, such as their performance, their drive, their health (e.g., physical and mental), their compliance with the exercise plan, and the like.
The method 4200 may enable generating an improved exercise plan for a user to perform using at least an exercise machine 100. At 4202, the processing device may receive data pertaining to the user. The data pertaining to the user may include at least one selection of a physical activity goal the user desires to achieve. In some embodiments, the user may select more than one physical activity goal to achieve. The physical activity goals may include any activity that includes physical motion of a portion of the user's body. For example, the physical activity goals may include ameliorating knee pain, traversing stairs, gardening, performing yardwork, playing, walking, running, meditating, learning faster, improving concentration, improving focus, increasing response time to stimuli, improving relationships, improving sex drive, changing a state of mind, improving cardiovascular performance, improving heart rate, improving blood pressure, sitting without pain, standing without pain, feeling energized, performing more advanced exercises, performing more exercises, carrying groceries, performing house chores, losing weight, or some combination thereof. In some embodiments, the user may use a user interface including one or more graphical elements to select the physical activity goal via the computing device 12.
In some embodiments, the processing device may execute a machine learning model 60 trained to generate the exercise plan using an energy score. An energy score may refer to an amount of energy it takes to achieve the physical activity goal. The energy score may be based on a metabolic indicator associated with performing each exercise. The energy score may be an accumulation of at least all the metabolic indicators for the exercises included in the exercise plan. An energy consumption metric may be associated with each exercise, and the energy consumption metric may be determined using the metabolic indicator for an exercise, user fitness test results, user-reported pain levels, an indication of a pain level the user is in, heartrate, step count, blood pressure, perspiration, blood oxygen levels, body temperature, or some combination thereof. The energy score may indicate that by performing the exercises included in the exercise plan, the user will have enough energy to be able to achieve the physical activity goal. To determine whether the user exerted enough energy for that particular exercise, progress toward the energy score may be tracked at each exercise by calculating the energy consumption metric. One or more graphical elements (e.g., charts, tables, etc.) may be used to dynamically visualize, by depicting respective energy consumption metrics over a time series, the progress the user is making toward the energy score.
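The relationship between per-exercise energy consumption metrics and the overall energy score may be sketched as below. The scaling formula, the 0-10 pain scale, and the pain discount are assumptions chosen for illustration; the disclosure does not fix a particular formula:

```python
def energy_consumption(metabolic_indicator, fitness_result, pain_level):
    """Illustrative per-exercise metric: scale the metabolic indicator by a
    fitness result and discount for user-reported pain (assumed 0-10 scale)."""
    return metabolic_indicator * fitness_result * (1 - pain_level / 20)


def energy_score(exercises):
    """Accumulate the metrics over all exercises in the plan; the total is
    the energy score for achieving the physical activity goal."""
    return sum(energy_consumption(e["met"], e["fitness"], e["pain"]) for e in exercises)


def goal_achievable(tracked_metrics, score):
    """The user is on track when the summed tracked metrics meet or exceed the score."""
    return sum(tracked_metrics) >= score
```

Progress toward the energy score may then be tracked per exercise by calling `energy_consumption` on each exercise's measurements and comparing the running sum to the score.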
Each of the physical activity goals may include one or more levels of attainment to achieve. As described herein, the one or more levels of attainment may refer to range of motion, strength, endurance, balance, intelligence, neurological responsiveness, emotional well-being, cardiovascular well-being, and/or mobility. The levels of attainment may be associated with each physical activity goal. For example, a gardening physical activity goal may cause an exercise plan to be generated that includes exercises that improve the levels of attainment involving range of motion (e.g., kneeling down and bending over to garden) and endurance (e.g., energy consumed by gardening), more than the level of attainment of mobility (e.g., since the user is typically not moving around very much while planting flowers or the like). The levels of attainment may be quantified by, measured by, or associated with measurements (e.g., range of motion extension and/or flexion angles, exerted force measurements, amount of weight lifted, pressed, or curled, etc.), achievements (e.g., number of sets completed, number of repetitions completed, number of exercise sessions completed, weight lost, calories consumed, steps walked, etc.), and the like. The measurements may be obtained via one or more sensors associated with the exercise machine 100, the user, or the environment in which the user uses the exercise machine 100. In some embodiments, the sensor may be a wearable device that the user wears while using and while not using the exercise machine 100. For example, a step counting wearable may be worn by the user while not engaged in an active exercise session (e.g., while walking around a grocery store, etc.).
The data obtained from the sensors may enable monitoring the user's comprehensive lifestyle to enable accurately predicting when the user will achieve the physical activity goal and to provide recommendations pertaining to the user's health. In some embodiments, data from each user of the application 17, the exercise machine 100, or both, may be monitored and stored to enable training the machine learning models 60 to perform one or more functions. For example, the machine learning models 60 may be generated using training data that enables the machine learning models 60 to receive input data (e.g., characteristic of the user, performance measurement, pain level of the user, etc.) pertaining to the users and/or the selected physical activity goal and to predict a length of time it will take the user to achieve the physical activity goal if they comply with a particular exercise plan. The machine learning models 60 may identify patterns between the data pertaining to the user and other data pertaining to other users, and may determine that the other users, by following the recommended exercise plan, achieved the same physical activity goals in the length of time. In some embodiments, the processing device may transmit the amount of time it will take the user to achieve the physical activity goal to the computing device 12 for presentation. In some embodiments, the machine learning models 60 may be generated and trained to receive the data pertaining to the user and determine one or more comorbidities of the user (e.g., the user is at risk for diabetes because they are overweight and depressed, etc.). The machine learning models 60 may be trained to identify patterns between the user and other users that have similar data and may determine the similarly situated users have the comorbidities.
At 4204, the processing device may generate, by executing the artificial intelligence engine 65, the improved exercise plan. The artificial intelligence engine 65 may generate the machine learning models 60 trained to perform the generating of the improved exercise plan using the data source. The improved exercise plan may include at least one set of exercises to be performed by the user to achieve at least one of the one or more levels of attainment associated with the physical activity goal. The artificial intelligence engine 65 may use at least one data source 67 configured to include information pertaining to one or more exercises and at least one of the one or more levels of attainment associated with the physical activity goal.
In some embodiments, the data source may include a set of rankings and each ranking of the set of rankings may pertain to a priority level for each of the one or more levels of attainment pertaining to achieving the physical activity goal. The set of exercises selected may be arranged in the improved exercise plan based on the set of rankings. For example, for a physical activity goal of gardening, the levels of attainment of range of motion and endurance may be ranked higher than the level of attainment of mobility, and as a result, exercises that target the portion(s) of the body associated with achieving range of motion and endurance may be prioritized in the exercise plan. Prioritizing those exercises may refer to including more exercises that are associated with the higher ranking levels of attainment, including more repetitions and/or sets for those exercises, including longer durations for performing the exercises, and the like. In some embodiments, (i) a first portion of the set of exercises associated with a level of attainment having a certain ranking may be included in a set of initial exercises to perform in the improved exercise plan, and (ii) a second portion of the set of exercises associated with the level of attainment having another ranking may be included as a set of last exercises to perform in the improved exercise plan.
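The ranking-based arrangement may be sketched as a sort keyed on each exercise's associated level of attainment. The dictionary shapes and the convention that a lower ranking value means a higher priority are assumptions for illustration:

```python
def arrange_plan(exercises, rankings):
    """Order exercises so those targeting higher-priority levels of attainment
    come first; `rankings` maps an attainment name to a priority value
    (lower value = higher priority). Unranked attainments sort last."""
    return sorted(exercises, key=lambda e: rankings.get(e["attainment"], len(rankings)))
```

For a gardening goal with range of motion and endurance ranked above mobility, exercises targeting range of motion and endurance would appear earlier in the improved exercise plan than balance-board work targeting mobility.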
The data source 67 may include a first association between the physical activity goal and the one or more levels of attainment pertaining to achieving the physical activity goal, a second association between the one or more levels of attainment and one or more body portions of a human being, and a third association between the one or more body portions and one or more exercises that target the one or more body portions. The processing device may select the at least one set of exercises based on the first association, the second association, and the third association in order to provide an exercise plan that targets the body portions associated with the levels of attainment for the physical activity goal. In some embodiments, the data source 67 may include exercises that are curated by one or more health professionals, such as a trainer, a medical doctor, a physical therapist, a surgeon, or the like. Further, the associations between the levels of attainment, exercises, body portions, and the like may be curated, filtered, reviewed, revised, and the like by the health professionals.
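The three associations may be sketched as a chained lookup through the data source. The specific mappings below (goals, attainments, body portions, exercises) are hypothetical sample data, not contents of the actual data source 67:

```python
# Hypothetical data source contents: goal -> attainments -> body portions -> exercises.
GOAL_TO_ATTAINMENT = {"gardening": ["range_of_motion", "endurance"]}
ATTAINMENT_TO_BODY = {"range_of_motion": ["knees", "hips"], "endurance": ["legs"]}
BODY_TO_EXERCISES = {"knees": ["leg press"], "hips": ["cycling"],
                     "legs": ["cycling", "step-ups"]}


def select_exercises(goal):
    """Follow the first, second, and third associations to select exercises
    targeting the body portions tied to the goal's levels of attainment."""
    selected = []
    for attainment in GOAL_TO_ATTAINMENT.get(goal, []):
        for body_portion in ATTAINMENT_TO_BODY.get(attainment, []):
            for exercise in BODY_TO_EXERCISES.get(body_portion, []):
                if exercise not in selected:  # avoid duplicates across portions
                    selected.append(exercise)
    return selected
```

In this sketch a gardening goal yields leg press, cycling, and step-ups, each traceable back through a body portion and level of attainment to the goal.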
In some embodiments, to generate the improved exercise plan, the processing device may execute the one or more machine learning models 60 trained to determine, using an onboarding protocol, a fitness level of the user. The onboarding protocol may include exercises with tiered difficulty levels. The onboarding protocol may advance a difficulty level for a subsequent exercise in the exercises when the user completes an exercise in the exercises. The fitness level of the user may be determined based on a completion state (e.g., percentage, amount completed, performance measurement, user-reported difficulty level, user-reported pain level, etc.) of a last exercise performed by the user. The machine learning models 60 may select a difficulty level for each exercise in the improved exercise plan by associating the difficulty level for each exercise with the fitness level of the user. This onboarding protocol may be referred to as a baseline fitness test. One purpose of the onboarding protocol may be to match users having particular fitness levels with exercises that have difficulty levels the user should be able to perform, and to optimize compliance and reduce the amount of time it takes to achieve the physical activity goal.
At 4206, the processing device may transmit the improved exercise plan to a computing device. For example, the improved exercise plan may be transmitted to computing device 12 and may be presented by the user interface 18 of the application 17. In some embodiments, the processing device may execute the artificial intelligence engine 65 and/or machine learning models 60 to transmit a signal to the exercise machine 100. In response to the exercise machine 100 receiving the signal, a portion of the exercise machine 100 may be adjusted. The adjustment may be based on an attribute of an operating parameter specified in the improved exercise plan. For example, an attribute of a speed operating parameter may indicate a particular pedaling exercise should be performed at 5 miles per hour. When the exercise machine 100 receives the signal including a control instruction specifying the speed at which a motor of the exercise machine 100 should operate, a processing device of the exercise machine 100 may use the attribute of the operating parameter to control the motor to operate at 5 miles per hour. There may be numerous attributes and numerous operating parameters specified in the improved exercise plan. For example, each exercise selected may be associated with various attributes for various operating parameters. The exercises and their attributes of operating parameters may be selected in order to improve a rate at which the user achieves the physical activity goal, improve compliance, ameliorate boredom, enhance enjoyment, and the like. Based on characteristics of the user, performance measurements of the user, user-reported difficulty levels of exercises, and/or user-reported pain levels, the exercises and attributes of operating parameters may change dynamically as a user performs the exercise plan.
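The control-instruction flow between the cloud side and the exercise machine may be sketched as a serialize/apply pair. The JSON wire format and all field names here are assumptions for illustration; the disclosure does not specify a particular message protocol:

```python
import json


def build_control_instruction(exercise):
    """Cloud side: serialize a control instruction from an exercise's
    operating parameters and their attributes (field names are assumed)."""
    return json.dumps({"exercise": exercise["name"],
                       "operating_parameters": exercise["operating_parameters"]})


def apply_control_instruction(message, motor):
    """Exercise-machine side: set motor attributes from the received instruction."""
    instruction = json.loads(message)
    for parameter, attribute in instruction["operating_parameters"].items():
        motor[parameter] = attribute
    return motor
```

For the 5-miles-per-hour example above, a `speed_mph` attribute in the instruction would be applied to the motor state on receipt, and the same mechanism extends to any other operating parameter of the plan.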
In some embodiments, the processing device prompts, via the computing device 12, the user for feedback pertaining to one or more levels of enjoyment while performing the improved exercise plan. The set of exercises in the improved exercise plan may be arranged in a performance order to ameliorate boredom based on the one or more levels of enjoyment. For example, a machine learning model 60 may be trained to match patterns between the user and other users that performed exercise plans, and to determine the other similar matched users indicated they enjoyed a particular order of exercises in an exercise plan or did not enjoy the particular order. The machine learning model 60 may be trained to select the exercises and/or an order of the exercises for a user based on whether the user has indicated they enjoy the exercise and/or order or other users indicated they enjoyed the exercise and/or order. In some embodiments, a healthcare professional (e.g., physical therapist, etc.) may use empirical evidence and/or data to select the exercises and/or performance order to maximize enjoyment, minimize boredom, and/or maximize compliance. In some embodiments, the machine learning models 60 may be trained to solve optimization (e.g., maximization, minimization, etc.) problems to generate an improved exercise plan.
In some embodiments, the processing device may determine a fitness level of a user. The fitness level of the user may be determined by one or more machine learning models 60 using data pertaining to the user. The data may include characteristics of the user, such as height, weight, age, medical history, etc., performance measurements (e.g., range of motion, speed, force, exercise duration, etc.), and indications of pain levels provided by the user. The fitness level of the user may include a value or quantification using any suitable scale. For example, a scale of 1 to 5 may be used to rate the fitness level of the user. A 1 may indicate the user is a beginner or has a lowest fitness level and a 5 may indicate the user is an elite athlete and has a highest fitness level.
In some embodiments, the processing device may cause presentation of a user interface on the computing device 12, and the user interface may present multimedia of a coaching character configured to provide instructions on how to perform an exercise of the improved exercise plan. Based on the determined fitness level of the user, the processing device may modify the multimedia (e.g., video, and/or audio) of a coaching character (e.g., a human, a virtual representation of a human, an animated character, an augmented reality character, a virtual reality character, etc.) performing an exercise in the improved exercise plan. The modifying of the multimedia may include slowing down playback of the video if the user has a low fitness level (e.g., 1, 2, etc.) or speeding up playback of the video if the user has a high fitness level (e.g., 3, 4, 5, etc.). The multimedia may be selected from the data source 67. The data source 67 may store numerous multimedia files and each multimedia file may correspond with a particular exercise (e.g., a video of a coaching character performing a seated bar curl). The processing device may modify the multimedia playback according to the user's fitness level such that the user has an engaging, productive, enjoyable, and/or appropriate exercise session.
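The fitness-level-to-playback-rate modification may be sketched as a simple mapping. The specific rate values are assumptions chosen only to illustrate slowing playback for lower fitness levels and speeding it up for higher ones:

```python
def playback_rate(fitness_level):
    """Map a 1-5 fitness level to a multimedia playback rate: slower for
    beginners, faster for fit users. The rate values are assumed."""
    rates = {1: 0.75, 2: 0.9, 3: 1.0, 4: 1.1, 5: 1.25}
    return rates.get(fitness_level, 1.0)  # default to normal speed
```

A coaching-character video selected from the data source 67 would then be played back at `playback_rate(level)` times normal speed for the user's determined fitness level.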
In some embodiments, based on one or more factors, the processing device may pair an audio clip and a video clip to generate a paired audio and video clip. The one or more factors may include a fitness level of the user, types of exercises included in the improved exercise plan, one or more characteristics of the user (e.g., height, weight, age, gender, medical history, medical procedures, etc.), one or more performance measurements (e.g., range of motion, force, speed, distance, etc.), a sensor measurement, feedback from the user pertaining to a difficulty of an exercise, or some combination thereof. While the user performs the improved exercise plan, the processing device may cause playback of the paired audio and video clip. For example, if the user indicates that an exercise is too hard using the user interface, then the processing device may generate a paired audio and video clip that provides an encouraging statement (e.g., “Almost done!”, “You got this”, etc.) and/or slows down playback speed of the coaching character performing the exercise in the video. Such a technical solution may also provide engaging, productive, enjoyable, and/or appropriate exercise sessions for the user. The user's experience using the computing device and/or exercise machine 100 may be enhanced, thereby improving technology.
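The pairing of an audio clip with a video clip based on user feedback may be sketched as below. The feedback values, library keys, and clip names are hypothetical; an implementation could condition on any of the factors listed above:

```python
def pair_clips(user_feedback, video_library, audio_library):
    """Pair an audio clip with a video clip based on user-reported difficulty.
    Feedback of "too hard" selects encouragement audio and a slowed video."""
    if user_feedback == "too hard":
        return {"audio": audio_library["encouragement"],
                "video": video_library["slowed"]}
    return {"audio": audio_library["neutral"], "video": video_library["normal"]}
```

The resulting paired clip would then be played back while the user performs the improved exercise plan.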
In some embodiments, the processing device may transmit a notification for presentation on the computing device 12. The notification may include an indication that an exercise performed by the user also helps to achieve a second physical activity goal. For example, the user may be performing a leg press exercise using the exercise machine 100, and the leg press exercise may have been selected because it improves a strength level of attainment; however, the leg press exercise may also improve mobility, and thus, the notification may be presented on the computing device 12 to indicate the same. Further, the notification can indicate that improving mobility may help the user achieve a physical activity goal of jogging a mile, playing with their grandchildren, or any suitable physical goal associated with the mobility level of attainment.
In some embodiments, while the user performs the exercise plan using the exercise machine 100, the processing device may monitor one or more characteristics of the user, performance measurements of the user, user-reported pain feedback, and the like. The processing device may determine whether an exercise in the set of exercises results in a desired outcome. The processing device may determine an exercise is successful if the user exceeds a performance measurement threshold, completes the exercise to a certain threshold percentage, reports they are not experiencing pain, or the like. The artificial intelligence engine 65 may generate one or more machine learning models 60 that are trained to generate improved exercise plans based on whether the exercise in the set of exercises results in the desired outcome. Accordingly, the processing device may implement a feedback loop to iteratively improve the generated exercise plans according to whether they are providing desired results.
At 4302, while the user performs the improved exercise plan, the processing device may receive data pertaining to the user. The data pertaining to the user may include one or more characteristics of the user, performance measurements, sensor measurements, user-reported difficulty of an exercise, user-reported pain level, or some combination thereof.
At 4304, the processing device may select, based on the data pertaining to the user, a multimedia clip from the data source 67, a website (e.g., music video streaming website), a multimedia application (e.g., music video streaming application), or any suitable source. In some embodiments, the artificial intelligence engine 65 may generate one or more machine learning models 60 trained to select the multimedia clip based on the data pertaining to the user. For example, the data pertaining to the user may indicate the user is having a difficult time completing an exercise, and the machine learning model 60 may be trained to select a motivational audio clip to playback using the computing device 12 in real-time or near real-time as the user performs the exercise. At 4306, while the user performs the improved exercise plan, the processing device may cause, via the computing device 12, playback of the multimedia clip.
At 4402, the processing device may execute the artificial intelligence engine 65 to generate one or more machine learning models 60 trained to determine one or more comorbidities for one or more users based on one or more characteristics of the one or more users. The machine learning models 60 may be trained with training data that maps inputs to corresponding target outputs. For example, the training data may map certain characteristics of users as inputs to comorbidities as outputs. The characteristics may include information pertaining to medical histories, familial medical histories, medical procedures, demographics, psychographics, physical, mental, emotional, cardiovascular, and neurological conditions, performance measurements, user-reported difficulties of exercises, user-reported pain levels, and the like.
At 4404, the processing device may receive one or more characteristics of a particular user. The processing device may input the one or more characteristics of the particular user into the trained machine learning model 60. The trained machine learning model 60 may use the one or more characteristics of the user to determine at least one comorbidity for the user. The processing device may cause a notification to be presented on the user interface of the computing device 12. In some embodiments, there are various resources (e.g., medical papers, medical journals, evidence-based guidelines) that are referenced by the machine learning models 60 when determining what the comorbidities of the user are. The resources may be curated by health professionals and approved to be included in the data source 67. The data source 67 may be referred to as a multi-disciplinary repository that includes resources and exercises curated from health professionals having different backgrounds, such as physical therapy, medicine, neurology, cardiology, psychology, etc. Thus, the data source 67 may be used as a single source to provide improved exercise plans and notifications to enable a person to improve their entire lifestyle (e.g., physical and mental).
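The training-data mapping at 4402-4404 can be illustrated with a minimal sketch. A 1-nearest-neighbor lookup stands in for the trained machine learning model 60; the feature vector (age, body mass index, resting heart rate), the example records, and the labels are hypothetical.

```python
import math

# Hypothetical sketch: a 1-nearest-neighbor stand-in for a model trained
# on data mapping user characteristics (inputs) to comorbidities (outputs).
# Feature vectors: (age, body_mass_index, resting_heart_rate).

TRAINING_DATA = [
    ((72, 31.0, 88), "hypertension"),
    ((65, 34.5, 80), "type 2 diabetes"),
    ((30, 22.0, 60), "none"),
]

def predict_comorbidity(characteristics):
    """Return the comorbidity label of the closest training example."""
    _, label = min(
        TRAINING_DATA,
        key=lambda pair: math.dist(pair[0], characteristics),
    )
    return label
```

A new user's characteristics are compared against the training inputs, and the comorbidity associated with the most similar record is returned, which could then drive the notification presented on the computing device 12.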
The control instruction 4502 also provides additional explanation, “The following control instruction has been transmitted to the exercise device: increase resistance provided by both pedals by X.” This additional explanation indicates that the measurements received from the one or more sensors resulted in the resistance provided by both pedals being increased by X (e.g., X may be a value, an amount, a percentage, etc.). The control instruction 4502 may also indicate that “The resistance has been increased accordingly.” Presenting the control instruction 4502 on the user interface 4500 may provide clarity and enhance the user's understanding of why the resistance provided by the pedals has been changed.
Although resistance is shown as the operating parameter modified by the control instruction 4502, any suitable operating parameter of the pedals and/or motor may be modified, such as range of motion, revolutions per minute, speed, etc. Further, the operating parameters may be modified identically, similarly or completely independently on different sides of the exercise device 100. Further, the pedals may include foot pedals, hand pedals, or some combination thereof.
The user interface 4500 may include various graphical elements 4504 (e.g., buttons): a button associated with increasing the resistance, and a button associated with decreasing the resistance. The user may use an input peripheral, such as a mouse, a keyboard, a microphone, a touchscreen, etc. to select one of the graphical elements 4504. For example, as depicted by a hand cursor, the user may select the button to cause the resistance to increase. The selection may be transmitted to the cloud-based computing system 16 and/or the computing device 12. The selection may cause a control instruction to be generated and transmitted to the exercise device 100. The control instruction may cause an operating parameter associated with the resistance to increase. In some embodiments, when generating subsequent exercise sessions, the selection may cause one or more machine learning models 60 to be retrained to learn the user's preferences for resistance levels.
In some embodiments, one or more machine learning models 60 may be generated and trained by the artificial intelligence engine 65 and/or the training engine 50 to perform one or more of the operations of the method 4600. For example, to perform the one or more operations, the processing device may execute the one or more machine learning models 60. In some embodiments, the one or more machine learning models 60 may be iteratively retrained to select different features capable of enabling optimization of output. The features that may be modified may include a number of nodes included in each layer of the machine learning models 60, an objective function executed at each node, a number of layers, various weights associated with outputs of each node, and the like.
The method 4600 may use the artificial intelligence engine 65 to modify resistance of one or more pedals of an exercise device 100. The pedals may be foot pedals, hand pedals, or some combination thereof. At 4602, the processing device may generate, by the artificial intelligence engine 65, a machine learning model 60 trained to receive one or more measurements as input. In some embodiments, the one or more measurements may be associated with a force exerted on the one or more pedals, a revolution per minute of the one or more pedals, a speed of rotation of the one or more pedals, an angular moment of the one or more pedals, a range of motion of the one or more pedals, a radius of an arc traversed by the one or more pedals, a degree of flexion, a degree of extension, a skill level, or some combination thereof.
At 4604, the processing device may output, based on the one or more measurements, a control instruction that causes the exercise device 100 to modify the resistance of the one or more pedals. In some embodiments, the control instruction may cause other operating parameters associated with the exercise device to be modified. For example, any combination of the resistance, range of motion, speed, revolutions per minute, etc. may be modified by the control instruction.
At 4606, the processing device may receive the one or more measurements from a sensor associated with the one or more pedals of the exercise device 100. As described herein, the sensor may include a pressure sensor including one or more load cells, proximity sensors, haptic sensors, piezoelectric sensors, optical sensors, temperature sensors, electrical sensors, mechanical sensors, chemical sensors, electromechanical sensors, electrochemical sensors, mechanicochemical sensors, or the like. There may be multiple sensors disposed in, on, around, or near the exercise device 100. The multiple sensors may be configured to obtain the one or more measurements. The sensors may include a processing device, a memory device, a network interface device, a sensing device, etc. The sensors may be configured to wirelessly communicate the one or more measurements to the computing device 12 and/or the cloud-based computing system 16.
At 4608, the processing device may determine whether the one or more measurements satisfy a trigger condition. In some embodiments, the trigger condition may include the one or more measurements being less than a threshold value, less than or equal to a threshold value, equal to a threshold value, more than or equal to a threshold value, or more than a threshold value. For example, if a user is able to cycle at a range of motion of 50 degrees, and the trigger condition includes a range of motion being above 45 degrees, then the trigger condition may be satisfied.
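The threshold comparison at 4608 can be sketched as follows; the comparator keys and the range-of-motion example values are illustrative, mirroring the "less than / equal to / greater than" variants described above.

```python
import operator

# Minimal sketch of the trigger-condition check: a measurement is compared
# against a configurable threshold using a configurable comparator.

COMPARATORS = {
    "lt": operator.lt,  # less than
    "le": operator.le,  # less than or equal
    "eq": operator.eq,  # equal
    "ge": operator.ge,  # greater than or equal
    "gt": operator.gt,  # greater than
}

def trigger_satisfied(measurement: float, comparator: str, threshold: float) -> bool:
    """Return True when the measurement satisfies the trigger condition."""
    return COMPARATORS[comparator](measurement, threshold)
```

For the example above, a cycled range of motion of 50 degrees checked against a "greater than 45 degrees" condition satisfies the trigger: `trigger_satisfied(50, "gt", 45)`.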
In some embodiments, the processing device may receive one or more characteristics of a user operating the exercise device 100. For example, the one or more characteristics may include personal information, performance information, and measurement information. The personal information may include, e.g., demographic, psychographic or other information, such as an age, a weight, a gender, a height, a body mass index, a medical condition, a familial medical history, an injury, a medical procedure, a medication prescribed, a comorbidity, or some combination thereof. The performance information may include, e.g., an elapsed time of using an exercise device, an amount of force exerted on a portion of the exercise device, a range of motion achieved on the exercise device, a duration of use of the exercise device, a speed of a portion of the exercise device, an indication of a plurality of pain levels using the exercise device, or some combination thereof. The measurement information may include, e.g., one or more vital signs of the user, a respiration rate of the user, a heartrate of the user, a temperature of the user, an SpO2-measurement of the blood oxygen level of the user (e.g., oxygen saturation level), a blood pressure of the user, a glucose level of the user, other suitable measurement information of the user, microbiome related data pertaining to the user, or a combination thereof.
In some embodiments, based on the one or more characteristics of the user and the one or more measurements, the processing device may determine whether the trigger condition has been satisfied. For example, if the user is over a certain age and the measurement indicates the user has a certain heartrate, then the trigger condition may become satisfied.
At 4610, responsive to determining that the one or more measurements and/or the one or more characteristics of the user satisfy the trigger condition, the processing device may transmit the control instruction to the exercise device 100. In some embodiments, transmitting the control instruction to the exercise device 100 may cause the exercise device 100 to modify the resistance of the one or more pedals in real-time or near real-time. In some embodiments, while a user uses the exercise device 100 to perform an exercise (e.g., cycling), the resistance of the one or more pedals may be modified. In some embodiments, the modification may include modifying the resistance by the same degree, percentage or amount provided by a pedal on each side of the exercise device 100, or modifying the resistance a different degree, percentage or amount on different sides of the exercise device 100. In some embodiments, the control instruction may be associated with changing an operating parameter (e.g., speed, revolutions per minute) of a motor connected to the one or more pedals of the exercise device 100.
In some embodiments, transmitting the control instruction to the exercise device 100 may cause the exercise device 100 to modify a parameter associated with a pedal on each side of the exercise device 100 a same degree, percentage or amount or a different degree, percentage or amount. The parameter may include a range of motion, resistance, speed of a motor, revolutions per minute, or some combination thereof.
In some embodiments, responsive to determining that the trigger condition has been satisfied, the processing device may modify another operating parameter (e.g., different than resistance) of the exercise device 100. The another operating parameter may include a number of revolutions per minute, a speed value, a torque value, a temperature degree, a vibration level, or some combination thereof.
In some embodiments, the processing device may receive a second input from the user. The second input may include an instruction to modify an operating parameter of the exercise device 100. The second input may be received via a microphone, a touchscreen, a keyboard, a mouse, a proprioceptive sensor, or some combination thereof. For example, the second input may include a spoken voice from the user, wherein the spoken voice instructs the exercise device 100 to increase resistance. Accordingly, the processing device may use natural language processing to digitize an audio signal representing the spoken voice and to process the digitized audio signal. Based on the digitized audio signal, the processing device may transmit a control instruction that causes the resistance provided by one or more of the pedals to be increased in real-time or near real-time.
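The voice-input path above can be sketched with simple keyword matching standing in for full natural language processing of the digitized audio signal; a real system would first transcribe the audio, so the transcript is assumed as input, and the instruction format is hypothetical.

```python
# Hedged sketch: map a spoken-command transcript to a control instruction.
# Keyword matching stands in for natural language processing.

def voice_to_control(transcript: str):
    """Return a control-instruction dict for a recognized command, else None."""
    text = transcript.lower()
    if "increase resistance" in text:
        return {"parameter": "resistance", "action": "increase"}
    if "decrease resistance" in text:
        return {"parameter": "resistance", "action": "decrease"}
    return None  # unrecognized command
```

A recognized "increase resistance" command would then be transmitted as the control instruction that raises pedal resistance in real-time or near real-time.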
In some embodiments, the processing device may present the control instruction on a user interface of a computing device associated with the exercise device 100. In some embodiments, the processing device may receive a selection from the computing device, and based on the selection, may modify the resistance of the one or more pedals, range of motion of the one or more pedals, speed of a motor, or some combination thereof.
In some embodiments, the machine learning model 60 may be retrained based on the selection from the user because the selection may indicate a preference of the user, that the user is in pain, that the exercise is too easy for the user, that the exercise is too hard for the user, or some combination thereof. In some embodiments, the selection may cause the machine learning model 60 to select a different exercise to be immediately performed in place of an exercise currently being performed, to select a different exercise to be subsequently performed in an exercise session, or both. In some embodiments, the selection may cause the artificial intelligence engine 65 to modify a feature of the machine learning model 60, such that the machine learning model 60 accounts for the selection when generating additional exercise sessions and/or exercise plans. For example, the feature may include a weight of a node, a number of nodes in a layer, a number of layers, or some combination thereof. Such a technique may improve the exercises that are selected, such that they are based on input from the user.
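The retraining-on-selection idea can be illustrated with a minimal sketch in which a single resistance-preference weight is nudged by each user selection; this single weight and the learning rate stand in for modifying the features (node weights, layer counts) of the full machine learning model 60.

```python
# Illustrative sketch: fold a user's resistance selections back into a
# model parameter. The single preference weight and learning rate are
# hypothetical stand-ins for retraining the full model.

def update_preference(weight: float, selection: str, lr: float = 0.1) -> float:
    """Nudge a resistance-preference weight toward the user's selection."""
    if selection == "increase":
        return weight + lr
    if selection == "decrease":
        return weight - lr
    return weight  # no change for other selections
```

Repeated "increase" selections drift the weight upward, so subsequently generated exercise sessions would favor higher resistance for this user.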
The desired target zone for each user may be tailored based on the one or more characteristics of the user. The one or more characteristics may pertain to personal information, performance information, and/or measurement information. The personal information may include, e.g., demographic, psychographic or other information, such as an age, a weight, a gender, a height, a body mass index, a medical condition, a familial medical history, an injury, a medical procedure, a medication prescribed, a comorbidity, or some combination thereof. The performance information may include, e.g., an elapsed time of using an exercise device, an amount of force exerted on a portion of the exercise device, a range of motion achieved on the exercise device, a duration of use of the exercise device, an indication of a plurality of pain levels using the exercise device, or some combination thereof. The measurement information may include, e.g., one or more vital signs of the user, a respiration rate of the user, a heartrate of the user, a temperature of the user, an SpO2-measurement of the blood oxygen level of the user (e.g., oxygen saturation level), a blood pressure of the user, a glucose level of the user, other suitable measurement information of the user, microbiome related data pertaining to the user, a perspiration rate, or some combination thereof.
Further, the desired target zone may be based on a physical activity goal specified by the user. For example, the user may have selected to be able to play with their grandchildren, and the desired target zone may be tailored according to one or more aspects (e.g., domains including flexibility, strength, endurance, etc.) associated with the selected physical activity goal and/or the characteristics of the user.
The user interface 4700 may include various graphical elements 4704 (e.g., buttons): a button associated with increasing the resistance, and a button associated with decreasing the resistance. The user may use an input peripheral, such as a mouse, a keyboard, a microphone, a touchscreen, a virtual or augmented reality input device, etc. to select one of the graphical elements 4704. For example, as depicted by a hand cursor, the user may select the button to cause the resistance to increase. The selection may be transmitted to the cloud-based computing system 16 and/or the computing device 12. The selection may cause a control instruction to be generated and transmitted to the exercise device 100. The control instruction may cause an operating parameter associated with the resistance to increase. In some embodiments, the selection may cause one or more machine learning models 60 to be retrained to learn the user's preferences for resistance levels when generating subsequent exercise sessions.
In some embodiments, one or more machine learning models 60 may be generated and trained by the artificial intelligence engine 65 and/or the training engine 50 to perform one or more of the operations of the method 4800. For example, to perform the one or more operations, the processing device may execute the one or more machine learning models 60. In some embodiments, the one or more machine learning models 60 may be iteratively retrained to select different features capable of enabling the optimization of output. The features that may be modified may include a number of nodes included in each layer of the machine learning models 60, an objective function executed at each node, a number of layers, various weights associated with outputs of each node, and the like.
The method 4800 may use the artificial intelligence engine 65 to perform a control action. The control action may be based on one or more measurements from a wearable device or smart device (e.g., a watch, a necklace, an anklet, a bracelet, a belt, a ring, a hat, a shoe, a piece of clothing, an earplug, etc.). The wearable device or smart device may be worn by a user performing an exercise or while the user is not performing the exercise and is stationary, relaxing, sleeping, or the like.
At 4802, the processing device may generate, by the artificial intelligence engine 65, a machine learning model 60 trained to receive the one or more measurements as input. In some embodiments, the one or more measurements may include one or more vital signs of the user, a respiration rate of the user, a heartrate of the user, a temperature of the user, an SpO2-measurement of the blood oxygen level of the user (e.g., oxygen saturation level), a blood pressure of the user, a glucose level of the user, other suitable measurement information of the user, microbiome related data pertaining to the user, a perspiration rate, a revolutions per minute, a number of steps, a speed, an amount of force, or some combination thereof.
At 4804, the processing device may output, based on the one or more measurements, a control instruction that causes the control action to be performed. In some embodiments, the control action may include transmitting a notification for presentation on a user interface of a computing device associated with the exercise device 100. The notification may include feedback to encourage the user to perform, during the interval training session, an exercise within the target training zone. In some embodiments, the control action may include controlling an operating parameter of the exercise device 100. For example, the control instruction may be received by a processing device of the exercise device 100, and based on the control instruction, the processing device may transmit a signal to control the operating parameter (e.g., resistance, range of motion, speed, revolutions per minute, etc.).
At 4806, the processing device may receive the one or more measurements from the wearable device being worn by a user. The one or more measurements may be received at a certain periodicity, on demand, or when certain trigger events occur, for example. The one or more measurements may be received during an interval training session. In some embodiments, the interval training session may be included in an exercise plan associated with a rehabilitation program which the user is performing. In some embodiments, the interval training session may include short, high intensity bursts of activity with periods of rest and recovery between. In other words, rest and exercise intervals of controlled duration may be alternated during an interval training session.
At 4808, the processing device may determine whether the one or more measurements indicate, during an interval training session, that one or more characteristics of the user are within a desired target zone. The one or more characteristics of the user may be associated with an activity level determined based on physiological factors associated with the one or more measurements. The physiological factors may include heart rate, respiratory rate, perspiration rate, muscular state, readiness state, or the like. In some embodiments, the characteristic of the user may be associated with a physiological state (e.g., resting, active, hyperactive, etc.) based on the one or more measurements. The characteristic of the user may be determined to be within the desired target zone when a value, attribute, score, measure, property, etc. associated with the characteristic is within a certain range (e.g., desired target zone). The certain range may be any numerical range, quantifiable range, quantitative range, etc.
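The in-zone determination at 4808 reduces to a range check once the characteristic has been quantified; the heart-rate zone values below are illustrative assumptions.

```python
# Minimal sketch: decide whether a characteristic derived from wearable
# measurements falls inside the desired target zone (a numeric range).

def in_target_zone(value: float, zone: tuple) -> bool:
    """Return True when the value lies within the inclusive (low, high) zone."""
    low, high = zone
    return low <= value <= high
```

For example, with a hypothetical interval-training heart-rate zone of 110-140 bpm, a measured 125 bpm is in zone and would trigger the motivational notification, while 150 bpm is out of zone and would trigger the control action at 4810.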
In some embodiments, responsive to determining that the one or more measurements indicate, during the interval training session, that the characteristic of the user is within the desired target zone, the processing device may perform the control action, wherein performing the control action includes transmitting a notification to a computing device associated with an exercise device. The notification may provide a motivational message to the user.
At 4810, responsive to determining that the one or more measurements indicate the characteristic of the user is not within the desired target zone during the interval training session, the processing device may perform the control action.
In some embodiments, the processing device may receive data associated with the user. In some embodiments, based on the data and the one or more measurements, the processing device may predict, via the machine learning model 60, a medical condition (e.g., hypertension, asthma, diabetes, etc.) associated with the user. For example, the machine learning model 60 may be trained on a corpus of training data that matches patterns between certain user data and measurements associated with the presence (or absence) of certain medical conditions.
In some embodiments, the processing device may receive second input from the user. The second input may include an instruction to modify an operating parameter of the exercise device 100. The second input may be received via a microphone, a touchscreen, a virtual or augmented reality input device, a keyboard, a mouse, a proprioceptive sensor, or some combination thereof. For example, the second input may include a spoken voice from the user that instructs the exercise device 100 to increase resistance. Accordingly, the processing device may use natural language processing to digitize an audio signal representing the spoken voice and to process the digitized audio signal. Based on the digitized audio signal, the processing device may transmit a control instruction that causes the resistance provided by one or more of the pedals to be increased in real-time or near real-time.
In some embodiments, one or more machine learning models 60 may be generated and trained by the artificial intelligence engine 65 and/or the training engine 50 to perform one or more of the operations of the method 4900. For example, to perform the one or more operations, the processing device may execute the one or more machine learning models 60. In some embodiments, the one or more machine learning models 60 may be iteratively retrained to select different features capable of enabling the optimization of output. The features that may be modified may include a number of nodes included in each layer of the machine learning models 60, an objective function executed at each node, a number of layers, various weights associated with outputs of each node, and the like.
At 4902, the processing device may determine that the one or more measurements indicate the characteristic of the user is within an undesired target zone. In some embodiments, the undesired target zone may include ranges for excessive heartrates, blood pressures, temperatures, and/or perspiration rates, for example.
At 4904, responsive to determining the one or more measurements indicate the characteristic of the user is within the undesired target zone, the processing device may perform the control action that includes transmitting the control instruction to cause the exercise device 100 to stop operating, to slow down, to speed up, to generate a warning, or some combination thereof. Further, the control action may include transmitting a notification to be presented by the computing device 12, and the notification may include a warning, an alert, or the like instructing the user to stop exercising or slow down.
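The safety branch at 4902-4904 can be sketched as follows; the instruction format, the "slow_down" action name, and the heart-rate threshold are illustrative assumptions, not the disclosed data model.

```python
# Hypothetical sketch: when a characteristic enters an undesired zone
# (e.g., an excessive heartrate), emit a slow-down control instruction
# together with a warning notification for the computing device.

def safety_action(heart_rate: float, max_safe: float):
    """Return a control action for an out-of-range heartrate, else None."""
    if heart_rate > max_safe:
        return {
            "control": "slow_down",
            "notification": "Warning: heart rate too high, please slow down.",
        }
    return None  # heartrate within safe range; no action needed
```

The returned instruction would cause the exercise device 100 to slow down while the warning is presented by the computing device 12.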
In some embodiments, one or more machine learning models 60 may be generated and trained by the artificial intelligence engine 65 and/or the training engine 50 to perform one or more of the operations of the method 5000. For example, to perform the one or more operations, the processing device may execute the one or more machine learning models 60. In some embodiments, the one or more machine learning models 60 may be iteratively retrained to select different features capable of enabling the optimization of output. The features that may be modified may include a number of nodes included in each layer of the machine learning models 60, an objective function executed at each node, a number of layers, various weights associated with outputs of each node, and the like.
At 5002, the processing device may generate, by the artificial intelligence engine 65, a machine learning model 60 trained to receive one or more measurements as input. In some embodiments, the one or more measurements are associated with a force exerted on the one or more pedals, a revolution per minute of the one or more pedals, a speed of rotation of the one or more pedals, an angular moment of the one or more pedals, a range of motion of the one or more pedals, a radius of an arc traversed by the one or more pedals, a degree of flexion, a degree of extension, a skill level, or some combination thereof.
At 5004, the processing device may output, based on the one or more measurements, a control instruction that causes the exercise device to modify, independently from each other, the resistance of the one or more pedals. The control instruction may cause modification of any combination of operating parameters, such as the resistance, range of motion, speed, revolutions per minute, etc. The control instruction may include a value for configuring the operating parameter of the exercise device 100. The control instruction may be received by a processing device of the exercise device 100, and the processing device may transmit a control signal to appropriate circuitry (e.g., motor, pedal, actuator, etc.) to control the operating parameter of the exercise device 100.
At 5006, while a user performs an exercise using the exercise device, the processing device may receive the one or more measurements from one or more sensors associated with the one or more pedals of the exercise device 100. As described herein, the sensor may include a pressure sensor including one or more load cells, proximity sensors, haptic sensors, piezoelectric sensors, optical sensors, temperature sensors, electrical sensors, mechanical sensors, chemical sensors, electromechanical sensors, electrochemical sensors, mechanicochemical sensors, or the like. There may be multiple sensors disposed in, on, around, or near the exercise device 100. The multiple sensors may be configured to obtain the one or more measurements. The sensors may include a processing device, a memory device, a network interface device, a sensing device, etc. The sensors may be configured to wirelessly communicate the one or more measurements to the computing device 12 and/or the cloud-based computing system 16.
At 5008, the processing device may determine, based on the one or more measurements, a quantifiable or qualitative modification to the resistance provided by a pedal of the one or more pedals. In some embodiments, the quantifiable or qualitative modification may include a specific force (e.g., newtons) or pounds of resistance to be provided by the pedal of the one or more pedals. In some embodiments, the resistance provided by another pedal of the one or more pedals is not modified. In some embodiments, range of motion for one pedal may be modified independently from the range of motion for another pedal. In some embodiments, revolutions per minute for one pedal may be modified independently of the revolutions per minute for another pedal. In some embodiments, any combination of operating parameters (e.g., resistance, range of motion, revolutions per minute, etc.) may be modified independently for different pedals. In some embodiments, the pedals may include hand pedals, foot pedals, or some combination thereof.
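The per-pedal independence described above can be sketched with a small data structure holding each pedal's operating parameters; the field names and units are assumptions, not the patent's actual data model.

```python
from dataclasses import dataclass

# Sketch: each pedal keeps its own operating parameters, so a control
# instruction can modify one pedal without touching the other.

@dataclass
class PedalState:
    resistance_lbs: float
    range_of_motion_deg: float
    rpm: float

def modify_pedal(pedal: PedalState, resistance_lbs=None,
                 range_of_motion_deg=None, rpm=None) -> PedalState:
    """Modify only the supplied parameters, leaving the rest untouched."""
    if resistance_lbs is not None:
        pedal.resistance_lbs = resistance_lbs
    if range_of_motion_deg is not None:
        pedal.range_of_motion_deg = range_of_motion_deg
    if rpm is not None:
        pedal.rpm = rpm
    return pedal
```

For example, the pedal actuated by an affected limb can receive a lower resistance while the other pedal's state is left unchanged, matching the independent modification described at 5008.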
At 5010, the processing device may transmit the control instruction to the exercise device 100 to cause the resistance provided by the pedal to be modified. In some embodiments, the control instruction may automatically cause the resistance provided by the pedal to be modified in real-time or near real-time. In some embodiments, the pedal may be actuated by an affected limb, and the affected limb is associated with rehabilitation, prehabilitation, post-habilitation, or some combination thereof. In some embodiments, the another pedal of the one or more pedals may be actuated by a second limb.
In some embodiments, the processing device may present a notification on a user interface of a computing device associated with the user and/or the exercise device 100. The notification may include a prompt to modify the resistance provided by the pedal. In some embodiments, the processing device may cause an audio device (e.g., speaker) to generate a notification. The notification may include a prompt (e.g., auditory, visual, haptic, etc.) to modify the resistance provided by the pedal, a range of motion provided by the pedal, a speed of the pedal, a revolutions per minute of the pedal, etc.
In some embodiments, the processing device may receive one or more subsequent measurements from the one or more sensors. The processing device may determine whether the one or more subsequent measurements indicate at least one of the at least two strength characteristic levels for the at least two limbs of the user. Responsive to determining that the one or more subsequent measurements indicate at least one of the at least two strength characteristic levels for the at least two limbs, the processing device may modify a value of resistance to be provided by the one or more pedals.
In some embodiments, the processing device may receive a second input from the user. The second input may include an instruction to modify an operating parameter of the exercise device 100. The second input may be received via a microphone, a touchscreen, a keyboard, a mouse, a proprioceptive sensor, or some combination thereof. For example, the second input may include a spoken voice, wherein the spoken voice instructs the exercise device 100 to increase resistance. Accordingly, the processing device may use natural language processing to digitize an audio signal representing the spoken voice and to process the digitized audio signal. Based on the digitized audio signal, the processing device may transmit a control instruction that causes the resistance provided by one or more of the pedals to be increased in real-time or near real-time.
In some embodiments, one or more machine learning models 60 may be generated and trained by the artificial intelligence engine 65 and/or the training engine 50 to perform one or more of the operations of the method 5100. For example, to perform the one or more operations, the processing device may execute the one or more machine learning models 60. In some embodiments, the one or more machine learning models 60 may be iteratively retrained to select different features capable of enabling the optimization of output. The features that may be modified may include a number of nodes included in each layer of the machine learning models 60, an objective function executed at each node, a number of layers, various weights associated with outputs of each node, and the like.
At 5102, the processing device may receive one or more subsequent measurements from the one or more sensors. At 5104, the processing device may determine whether the one or more subsequent measurements indicate at least two strength characteristic levels for at least two limbs of the user. The at least two strength characteristic levels may be associated with an amount of force able to be exerted by a first limb on a first pedal and an amount of force able to be exerted by a second limb on a second pedal. For example, the strength characteristic level may be configured to be 100 pounds of force. If the force exerted by each of the first and second limb on the first and second pedal is equal to or greater than 100 pounds of force, then the first and second limb achieve the strength characteristic level. In such an instance, an affected limb may have recovered to be as strong as the other unaffected limb.
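The two-limb comparison at 5104 can be sketched directly from the 100-pound example above; the function name and default target are illustrative.

```python
# Minimal sketch: each limb achieves its strength characteristic level
# when the force it exerts on its pedal reaches the configured target
# (100 lbs in the example above).

def limbs_at_level(force_limb_1: float, force_limb_2: float,
                   target: float = 100.0):
    """Return a (limb 1, limb 2) pair of booleans for level achievement."""
    return (force_limb_1 >= target, force_limb_2 >= target)
```

When both entries are True, the affected limb has recovered to be as strong as the unaffected limb, and per 5106 the resistance for its pedal may be adjusted to match the other pedal's.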
At 5106, responsive to determining that the one or more subsequent measurements indicate at least one of the at least two strength characteristic levels for the at least two limbs, the processing device may modify the resistance to be provided by the one or more pedals. For example, if the affected limb actuates a pedal that generates measurements indicating the affected limb has achieved the strength characteristic level, then the affected limb may have recovered to a sufficient degree. Accordingly, the resistance provided by the pedal associated with the affected limb may be adjusted or modified similarly to the resistance provided by a pedal associated with the unaffected or other limb.
Each domain is associated with a respective graphical element, and as depicted, the graphical element represents a side of a mountain. Any suitable graphical element may be used (e.g., a rope climb, a road, a building, a river, etc.). An icon 5232 (e.g., represented as a circle) may be presented relative to the graphical element 5204. The position of the icon 5232 relative to the graphical element 5204 may represent a progress of the user with regard to the specific domain. For example, when the icon 5232 is located at a bottom of the graphical element 5204, the user has not made any progress toward that particular domain associated with the graphical element 5204. However, if the icon is located at a top of the graphical element, then the user has completed the progress toward that particular domain associated with the graphical element 5204. As depicted, with regard to the graphical element 5204 associated with the Range of Motion domain, the icon 5232 may move in real-time (as depicted by the dashed circle moving, via the dashed line, to the enclosed circle) as the user performs an exercise, such as cycling, and increases progress toward a target goal or desired completion state. As further depicted by the graphical element 5210 associated with the Endurance domain, the icon has advanced in real-time as the user performs the cycling exercise. Accordingly, as the user performs various exercises, any suitable number of domains may be modified to reflect the user's progress accurately.
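The bottom-to-top icon movement can be expressed as a linear interpolation from a progress fraction to a vertical screen coordinate. The pixel coordinates below are hypothetical placeholders for whatever layout the user interface uses.

```python
# Sketch of positioning the progress icon on a graphical element:
# 0.0 progress places the icon at the bottom edge, 1.0 at the top.
# Pixel values are hypothetical; screen y grows downward here.

def icon_y_position(progress, top_px=0, bottom_px=400):
    """Linearly interpolate the icon's y-coordinate from progress in [0, 1]."""
    progress = max(0.0, min(1.0, progress))  # clamp out-of-range values
    return round(bottom_px - progress * (bottom_px - top_px))
```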
As depicted, the user may select (e.g., via a hand cursor) a particular graphical element 5204 associated with a domain to drill down to view more detailed information pertaining to that domain (e.g., Range of Motion). Accordingly, additional information related to the progress of the user associated with the selected domain may be presented.
In some embodiments, one or more machine learning models 60 may be generated and trained by the artificial intelligence engine 65 and/or the training engine 50 to perform one or more of the operations of the method 5400. For example, to perform the one or more operations, the processing device may execute the one or more machine learning models 60. In some embodiments, the one or more machine learning models 60 may be iteratively retrained to select different features capable of enabling the optimization of output. The features that may be modified may include a number of nodes included in each layer of the machine learning models 60, an objective function executed at each node, a number of layers, various weights associated with outputs of each node, and the like.
The method 5400 may use the artificial intelligence engine 65 to present a user interface capable of presenting the progress of a user in one or more domains. At 5402, the processing device may generate, by the artificial intelligence engine 65, a machine learning model 60 trained to receive one or more measurements as input. In some embodiments, the one or more measurements may be associated with a force exerted on the one or more pedals, a revolution per minute of the one or more pedals, a speed of rotation of the one or more pedals, an angular moment of the one or more pedals, a range of motion of the one or more pedals, a radius of an arc traversed by the one or more pedals, a degree of flexion, a degree of extension, a skill level, or some combination thereof.
At 5404, the processing device may output, based on the one or more measurements, a user interface that causes one or more icons to dynamically change position on the user interface. The icons may represent an amount of progress the user has made in a respective domain associated with an exercise plan. Each domain may have a target progress goal toward which the user is striving. The target progress goal may be represented by a number, a value, a percentage, a state, etc.
At 5406, while a user performs an exercise using the exercise device, the processing device may receive the one or more measurements from one or more sensors associated with the exercise device 100. As described herein, the one or more sensors may include a pressure sensor including one or more load cells, proximity sensors, haptic sensors, piezoelectric sensors, optical sensors, temperature sensors, electrical sensors, mechanical sensors, chemical sensors, electromechanical sensors, electrochemical sensors, mechanicochemical sensors, or the like. There may be multiple sensors disposed in, on, around, or near the exercise device 100. The multiple sensors may be configured to obtain the one or more measurements. The sensors may include a processing device, a memory device, a network interface device, a sensing device, etc. The sensors may be configured to wirelessly communicate the one or more measurements to the computing device 12 and/or the cloud-based computing system 16.
At 5408, the processing device may present, on a computing device associated with the exercise device 100, one or more sections of the user interface. The one or more sections may include independent graphical elements, objects, representations, or the like. The one or more sections may include overlapping graphical elements, objects, representations, or the like. In some embodiments, one section may share an aspect (e.g., border, graphical element, region, object, shape, representation, text, etc.) with another section. The one or more sections may each include respective text, graphical elements, colors, and the like. The one or more sections may each be related to a separate domain included in the one or more domains, and, based on the one or more measurements, each section may include the one or more icons placed within it. In some embodiments, the processing device may present each of the one or more sections as portions of a mountain on the user interface. The portions may include parts, subsections, sections, portions, segments, or the like.
In some embodiments, the processing device may predict, based on the progress the user has made in each of the one or more domains, a completion date for an exercise plan. The predicting may be performed by the processing device executing a machine learning model 60 trained to receive the progress and output the predicted completion date. For example, the machine learning model 60 may be trained using a corpus of training data that includes inputs of various progression as measured with respect to the domains matched to completion statistics of the exercise plan.
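The completion-date prediction can be illustrated with a deliberately simple stand-in for the trained machine learning model 60: an ordinary least-squares fit of "days remaining" against mean domain progress, learned from the kind of progress-to-completion training corpus described above. All function names and the training values are hypothetical.

```python
# Hedged sketch: fit days-remaining against mean progress with ordinary
# least squares, standing in for the trained machine learning model.

from datetime import date, timedelta

def fit_days_remaining(progress_values, days_remaining):
    """Fit days = a * progress + b by ordinary least squares."""
    n = len(progress_values)
    mean_x = sum(progress_values) / n
    mean_y = sum(days_remaining) / n
    cov = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(progress_values, days_remaining))
    var = sum((x - mean_x) ** 2 for x in progress_values)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

def predict_completion_date(model, mean_progress, today):
    """Project the exercise-plan completion date from current progress."""
    a, b = model
    days = max(0.0, a * mean_progress + b)
    return today + timedelta(days=round(days))
```

For instance, a corpus in which 0% progress corresponds to 100 days remaining and 100% to 0 days yields a prediction of roughly 25 days remaining at 75% progress.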
In some embodiments, the processing device may receive, via an input peripheral, a selection of a domain included in the one or more domains. In some embodiments, the processing device may present additional information related to the progress of the user associated with the selected domain. In some embodiments, the additional information may include characteristics, details, attributes, properties, parameters, values, descriptions, identifiers, and the like. In some embodiments, the one or more domains may include range of motion, strength, balance, endurance, mobility, stability, pliability, flexibility, pain, or some combination thereof.
In some embodiments, the processing device may receive a second input from the user. The second input may include an instruction to modify an operating parameter of the exercise device 100. The second input may be received via a microphone, a touchscreen, a keyboard, a mouse, a proprioceptive sensor, or some combination thereof. For example, the second input may include a spoken voice from the user that instructs the exercise device 100 to increase resistance. Accordingly, the processing device may use natural language processing to digitize an audio signal representing the spoken voice and process the digitized audio signal. Based on the digitized audio signal, the processing device may transmit a control instruction that causes the resistance provided by one or more of the pedals to be increased in real-time or near real-time.
In some embodiments, one or more machine learning models 60 may be generated and trained by the artificial intelligence engine 65 and/or the training engine 50 to perform one or more of the operations of the method 5500. For example, to perform the one or more operations, the processing device may execute the one or more machine learning models 60. In some embodiments, the one or more machine learning models 60 may be iteratively retrained to select different features capable of enabling the optimization of output. The features that may be modified may include a number of nodes included in each layer of the machine learning models 60, an objective function executed at each node, a number of layers, various weights associated with outputs of each node, and the like.
At 5502, the processing device may receive, from the computing device associated with the user, an adjustment to a value associated with a domain included in the one or more domains. That is, the user is enabled to configure their progress in any of the domains as they desire. For example, the user may increase their completion progress toward a range of motion goal in the domain associated with range of motion.
At 5504, based on the adjustment to the value, the processing device may select, by the machine learning model 60, one or more exercises to include in an exercise plan for the user. Continuing the above example, since the user increased their completion progress toward the range of motion goal, the machine learning model 60 may select exercises with ranges of motion increased in relation to the completion progress indicated by the user.
At 5506, the processing device may determine a completion state of an exercise included in the one or more exercises. The completion state may be determined based on a metric associated with the exercise (e.g., elapsed time, distance traveled, number of repetitions completed, number of sets completed, range of motion achieved, force achieved, speed achieved, etc.). The completion state may be represented by a percentage, an amount, a binary value (e.g., 1 or 0), or any suitable indicator.
At 5508, based on the completion state, the processing device may modify a difficulty level of the exercise. For example, if the completion state indicates the completion state for the exercise is below a threshold completion level, then the difficulty for the exercise may be reduced. Continuing the above example, if the completion state indicates the user is not able to sufficiently complete the exercise having the range of motion increased by the user, then the difficulty may be decreased for that exercise (e.g., the range of motion may be decreased back to a prior value describing a past progress of the user in the domain associated with range of motion). If the completion state indicates the completion state for the exercise is above the threshold completion level, then the difficulty for the exercise may be maintained or increased. The machine learning model may be retrained to select exercises having the increased range of motion and/or an increased difficulty level.
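The threshold rule at 5508 can be sketched as a small adjustment function. The 0.8 threshold and the one-step policy are hypothetical choices for illustration; the disclosure permits maintaining as well as increasing the difficulty above the threshold.

```python
# Sketch of the difficulty-adjustment rule: below the completion
# threshold, ease the exercise; at or above it, step the difficulty up
# (one possible policy; the threshold value 0.8 is hypothetical).

COMPLETION_THRESHOLD = 0.8

def adjust_difficulty(current_level, completion_state):
    """Return the next difficulty level based on the completion state."""
    if completion_state < COMPLETION_THRESHOLD:
        return max(1, current_level - 1)  # e.g., reduce range of motion
    return current_level + 1              # user is ready for more
```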
The virtual character 5602 may be animated to graphically move and to audibly generate a motivational quote 5604 to the user. In some embodiments, the virtual character 5602 may include a virtual coach as described further herein. In some embodiments, the virtual character 5602 may include a live-person performing an exercise in real-time. In some embodiments, the virtual character 5602 may include a prerecording of a live-person performing an exercise.
The user interface 5600 may include various graphical elements 5606 (e.g., buttons): a button associated with increasing the resistance, and a button associated with decreasing the resistance. The user may use an input peripheral, such as a mouse, a keyboard, a microphone, a touchscreen, etc. to select one of the graphical elements 5606. For example, as depicted by a hand cursor, the user may select the button to cause the resistance to increase. The selection may be transmitted to the cloud-based computing system 16 and/or the computing device 12. The selection may cause a control instruction to be generated and transmitted to the exercise device 100. The control instruction may cause an operating parameter associated with the resistance to increase. In some embodiments, when generating subsequent exercise sessions, the selection may cause one or more machine learning models 60 to be retrained to learn the user's preferences for resistance levels.
In some embodiments, one or more machine learning models 60 may be generated and trained by the artificial intelligence engine 65 and/or the training engine 50 to perform one or more of the operations of the method 5700. For example, to perform the one or more operations, the processing device may execute the one or more machine learning models 60. In some embodiments, the one or more machine learning models 60 may be iteratively retrained to select different features capable of enabling the optimization of output. The features that may be modified may include a number of nodes included in each layer of the machine learning models 60, an objective function executed at each node, a number of layers, various weights associated with outputs of each node, and the like.
During an exercise session, the method 5700 may use the artificial intelligence engine 65 to interact with a user of an exercise device 100. At 5702, the processing device may generate, by the artificial intelligence engine 65, a machine learning model 60 trained to receive data as input. In some embodiments, the data may include an electronic recording of a voice of the user received via a microphone associated with the computing device. In some embodiments, the data may be associated with a difficulty of an exercise the user is currently performing, and the processing device may modify an exercise plan based on the data. In some embodiments, the data may include an indication from the user that an exercise is too difficult, and the machine learning model may be trained to select a less difficult exercise for the user in a subsequent exercise session.
At 5704, based on the data, the processing device may generate an output. In some embodiments, the output may include a virtual character animated to graphically move and to audibly generate a motivational quote to the user. In some embodiments, the output may include a control instruction that causes the exercise device 100 to change an operating parameter. In some embodiments, the operating parameter may include changing a range of motion of one or more pedals, changing a speed of a motor, changing a revolutions per minute of a motor, changing a parameter of a fan associated with the exercise bike, changing a temperature of a portion of the exercise bike, changing a haptic setting of a portion of the exercise bike, or some combination thereof.
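The control-instruction output at 5704 amounts to changing one named operating parameter of the device. A minimal sketch, with hypothetical parameter names standing in for the device's real control interface:

```python
# Sketch of applying a control instruction to the device's operating
# parameters (range of motion, motor speed/RPM, fan, temperature,
# haptics). Parameter names are hypothetical.

ALLOWED_PARAMETERS = {
    "pedal_range_of_motion", "motor_speed", "motor_rpm",
    "fan_speed", "surface_temperature", "haptic_intensity",
}

def apply_control_instruction(state, instruction):
    """Return a new device state with one operating parameter changed."""
    param = instruction["parameter"]
    if param not in ALLOWED_PARAMETERS:
        raise ValueError(f"unknown operating parameter: {param}")
    new_state = dict(state)  # leave the prior state unmodified
    new_state[param] = instruction["value"]
    return new_state
```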
At 5706, while a user performs an exercise using the exercise device 100, the processing device may receive the data from an input peripheral of a computing device 12 associated with a user. The input peripheral may include a microphone, a keyboard, a mouse, a touchscreen, etc.
At 5708, based on the data being received from the input peripheral, the processing device may determine, via the machine learning model 60, the output to control an aspect of the exercise device 100. An “aspect” may refer to a characteristic, an attribute, a property, a part, information, detail, a controlling mechanism, or some combination thereof. For example, the aspect may refer to controlling an amount of resistance provided by one or more pedals, a range of motion provided by one or more pedals, a speed provided by one or more pedals and/or a motor, a revolutions per minute of one or more pedals and/or a wheel, etc.
In some embodiments, the processing device may select a virtual character, a prerecorded message, or both, to present via the computing device. The virtual character, the prerecorded message, or both may include a motivational quote. The virtual character may be a virtual coach, as described herein. Further, the virtual character may be a live-person performing an exercise in real-time. The virtual character may be a prerecorded live-person performing an exercise.
In some embodiments, the processing device may receive a second input from the user. The second input may include an instruction to modify an operating parameter of the exercise device 100. The second input may be received via a microphone, a touchscreen, a keyboard, a mouse, a proprioceptive sensor, or some combination thereof. For example, the second input may include a spoken voice from the user that instructs the exercise device 100 to increase resistance. Accordingly, the processing device may use natural language processing to digitize an audio signal representing the spoken voice and process the digitized audio signal. Based on the digitized audio signal, the processing device may transmit a control instruction that causes the resistance provided by one or more of the pedals to be increased in real-time or near real-time.
In some embodiments, one or more machine learning models 60 may be generated and trained by the artificial intelligence engine 65 and/or the training engine 50 to perform one or more of the operations of the method 5800. For example, to perform the one or more operations, the processing device may execute the one or more machine learning models 60. In some embodiments, the one or more machine learning models 60 may be iteratively retrained to select different features capable of enabling optimization of output. The features that may be modified may include a number of nodes included in each layer of the machine learning models 60, an objective function executed at each node, a number of layers, various weights associated with outputs of each node, and the like.
The method 5800 may use the artificial intelligence engine 65 to onboard a user for an exercise plan. In some embodiments, onboarding may refer to the fulfillment of a set of conditions that form a predicate needed to enable the user to start using the system. At 5802, the processing device may generate, by the artificial intelligence engine 65, a machine learning model 60 trained to receive as input onboarding data associated with a user and an onboarding protocol and, based on the onboarding data and the onboarding protocol, to output an exercise plan.
At 5804, while the user performs an exercise using the exercise device 100, the processing device may receive the onboarding data associated with the user. The onboarding data may include one or more measurements associated with a force exerted on the one or more pedals, a revolution per minute of the one or more pedals, a speed of rotation of the one or more pedals, an angular moment of the one or more pedals, a range of motion of the one or more pedals, a radius of an arc traversed by the one or more pedals, a degree of flexion, a degree of extension, a skill level, or some combination thereof. The onboarding data may also include an indication of a pain level the user experiences while performing an exercise. The onboarding data may include personal information associated with the user and/or performance information associated with the user.
At 5806, the processing device may determine, by the machine learning model 60 using the onboarding data and the onboarding protocol, a fitness level of the user. The onboarding protocol may include exercises with tiered difficulty levels. The exercises may be associated with a plurality of domains comprising range of motion, strength, balance, endurance, mobility, stability, pliability, flexibility, pain, or some combination thereof. When the user completes an exercise included in the exercises, the onboarding protocol may increase a difficulty level for a subsequent exercise included in the exercises. Based on a completion state of a last exercise performed by the user, the fitness level of the user may be determined.
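The tiered onboarding protocol at 5806 can be sketched as a loop that raises difficulty after each completed exercise and takes the fitness level from the last exercise the user completed. The completion callback is a stand-in for the real measurement-based completion check.

```python
# Sketch of the tiered onboarding protocol: attempt exercises of
# increasing difficulty, stopping at the first failure; the fitness
# level is the difficulty of the last exercise completed.
# completion_fn(level) -> bool is a stand-in for the real check.

def determine_fitness_level(max_level, completion_fn):
    """Return the highest consecutive difficulty level the user completed."""
    fitness_level = 0
    for level in range(1, max_level + 1):
        if not completion_fn(level):
            break  # user could not complete this tier
        fitness_level = level
    return fitness_level
```

For example, a user who completes the level-1 through level-3 exercises but fails the level-4 exercise is assigned fitness level 3, and the exercise plan's difficulty levels are selected accordingly at 5808.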
At 5808, by associating the difficulty level for each exercise with the fitness level of the user, the processing device may select a difficulty level for each exercise included in the exercise plan. In some embodiments, the machine learning model 60 may be trained to generate the exercise plan to reduce a pain level of the user, a dependency of the user on a certain medication (e.g., opioids or other addictive drugs), or some combination thereof.
In some embodiments, as the user performs the exercise plan, the processing device may receive one or more measurements from one or more sensors, the computing device associated with the user and/or exercise device 100, or some combination thereof. Based on the one or more measurements, the processing device may modify a portion of the exercise plan to include different exercises. For example, the portion of the exercise plan that is modified may include one exercise, two exercises, three exercises, or any suitable number of exercises. That is, any combination of exercises may be modified in the exercise plan. Modification may refer to addition, change, deletion, etc. In some embodiments, the processing device may present the exercise plan on a computing device associated with the exercise device 100.
In some embodiments, the processing device may receive feedback from the user. The feedback may pertain to a pain level of the user, an enjoyment level of the user performing the exercise plan, or some combination thereof. Based on the feedback, the processing device may modify the exercise plan to include different exercises.
In some embodiments, a virtual character may be presented on a computing device associated with the exercise device 100 or the user. The virtual character may include a coach that provides instructions regarding how to properly perform an exercise. The virtual coach may be animated on the user interface and may perform the proper movements associated with the exercise. Further, the coach may speak, and audio representing the virtual coach's speech may be generated. The speech may include instructions regarding how to properly perform the exercise. The virtual character may be a virtual avatar, a live person presented in real-time, a pre-recorded live person, etc.
In some embodiments, the processing device may receive input from the user. The input may include an instruction to modify an operating parameter of the exercise device 100. The input may be received via a microphone, a touchscreen, a keyboard, a mouse, a proprioceptive sensor, or some combination thereof. For example, the input may include a spoken voice from the user that instructs the exercise device 100 to increase a resistance. Accordingly, the processing device may use natural language processing to digitize an audio signal representing the spoken voice and to process the digitized audio signal. Based on the digitized audio signal, the processing device may transmit a control instruction that causes the resistance provided by one or more of the pedals to be increased in real-time or near real-time.
The various aspects, embodiments, implementations or features of the described embodiments can be used separately or in any combination. The embodiments disclosed herein are modular in nature and can be used in conjunction with or coupled to other embodiments, including both statically-based and dynamically-based equipment. In addition, the embodiments disclosed herein can employ selected equipment such that they can identify individual users and auto-calibrate threshold multiple-of-body-weight targets, as well as other individualized parameters, for individual users.
Consistent with the above disclosure, the examples of systems and methods enumerated in the following clauses are specifically contemplated and are intended as a non-limiting set of examples.
Clauses:
1. A computer-implemented method for using an artificial intelligence engine to perform a control action, wherein the control action is based on one or more measurements from a wearable device, and wherein the computer-implemented method comprises:
2. The computer-implemented method of any preceding clause, wherein the one or more measurements comprise a heartrate, a blood pressure, a blood oxygen level, a blood glucose level, a temperature, a perspiration rate, a revolutions per minute, a number of steps, a speed, an amount of force, or some combination thereof.
3. The computer-implemented method of any preceding clause, wherein the control action comprises transmitting a notification for presentation on a user interface of a computing device associated with the exercise device, wherein the notification comprises feedback to encourage the user to perform, during the interval training session, an exercise within the target training zone.
4. The computer-implemented method of any preceding clause, further comprising:
5. The computer-implemented method of any preceding clause, further comprising:
6. The computer-implemented method of any preceding clause, wherein the wearable device comprises a watch, a necklace, an anklet, a bracelet, a belt, a ring, a hat, a shoe, a piece of clothing, or some combination thereof.
7. The computer-implemented method of clause 1, wherein the control action comprises controlling an operating parameter of an exercise device.
8. The computer-implemented method of any preceding clause, further comprising, responsive to determining that the one or more measurements indicate, during the interval training session, that the one or more characteristics of the user are within the desired target zone, performing the control action comprising transmitting a notification to a computing device associated with an exercise device, wherein the notification provides a motivational message to the user.
9. The computer-implemented method of any preceding clause, wherein the interval training session is included in an exercise plan associated with a rehabilitation program which the user is performing.
10. The computer-implemented method of any preceding clause, further comprising:
11. A tangible, non-transitory computer-readable medium storing instructions that, when executed, cause a processing device to:
12. The computer-readable medium of any preceding clause, wherein the one or more measurements comprise a heartrate, a blood pressure, a blood oxygen level, a blood glucose level, a temperature, a perspiration rate, a revolutions per minute, a number of steps, a speed, an amount of force, or some combination thereof.
13. The computer-readable medium of any preceding clause, wherein the control action comprises transmitting a notification for presentation on a user interface of a computing device associated with the exercise device, wherein the notification comprises feedback to encourage the user to perform, during the interval training session, an exercise within the target training zone.
14. The computer-readable medium of any preceding clause, wherein the processing device is configured to:
15. The computer-readable medium of any preceding clause, wherein the processing device is configured to:
16. The computer-readable medium of any preceding clause, wherein the wearable device comprises a watch, a necklace, an anklet, a bracelet, a belt, a ring, a hat, a shoe, a piece of clothing, or some combination thereof.
17. The computer-readable medium of any preceding clause, wherein the control action comprises controlling an operating parameter of an exercise device.
18. A system comprising:
19. The system of any preceding clause, wherein the one or more measurements comprise a heartrate, a blood pressure, a blood oxygen level, a blood glucose level, a temperature, a perspiration rate, a revolutions per minute, a number of steps, a speed, an amount of force, or some combination thereof.
20. The system of any preceding clause, wherein the control action comprises transmitting a notification for presentation on a user interface of a computing device associated with the exercise device, wherein the notification comprises feedback to encourage the user to perform, during the interval training session, an exercise within the target training zone.
No part of the description in this application should be read as implying that any particular element, step, or function is an essential element that must be included in the claim scope. The scope of patented subject matter is defined only by the claims. Moreover, none of the claims is intended to invoke 35 U.S.C. § 112(f) unless the exact words “means for” are followed by a participle.
The foregoing description, for purposes of explanation, used specific nomenclature to provide a thorough understanding of the described embodiments. However, it should be apparent to one skilled in the art that the specific details are not required in order to practice the described embodiments. Thus, the foregoing descriptions of specific embodiments are presented for purposes of illustration and description. They are not intended to be exhaustive or to limit the described embodiments to the precise forms disclosed. It should be apparent to one of ordinary skill in the art that many modifications and variations are possible in view of the above teachings.
The information below may provide a guideline of the rules employed in delivering an effective, well-structured exercise program for rehab, conditioning, and/or long-term fitness adapted to the capabilities of each user. The information below is for explanatory purposes and the subject matter of the present disclosure is not limited to the examples provided below.
2. Determining A User's Exercise Level
How the Combination of Pain and ROM Test Results Define Levels:
How the Combination of Level and ROM Test Results Define Resistance:
a) Measuring ROM
a) Establishing Degree of Pain
b) Tracking ROM and Degree of Knee Pain to Advance Through the Exercise Levels
c) Heart Rate Test: Level 1-3
d) Heart Rate Test: Level 4-5
e) Fitness Test: Portable Product
f) Fitness Test: Subsequent Products
3. Exercise Sessions
a) Composition of Each Exercise Level
b) Exercise Filtering:
c) Evaluative Sessions
4. Exercise Adaptations
a) When a User Presses or Commands (e.g., by Voice) the Too Easy Button
b) When a User Presses Button or Says (e.g., by Voice) Too Hard
c) When a User Presses Button or Says (e.g., by Voice) Skip
5. In-Session Exercise Switching
6. Counting Reps & Sets
7. Completing An Exercise or an Exercise Session
8. Advancing Out of Levels
9. Exercise Session Example for Level-1
a) Warm Up Cycling
b) Exercises
c) Strengthening Cycling
d) Cool Down
Bands
The above discussion is meant to be illustrative of the principles and various embodiments of the present disclosure. Numerous variations and modifications will become apparent to those skilled in the art once the above disclosure is fully appreciated. It is intended that the following claims be interpreted to embrace all such variations and modifications.
This application is a continuation-in-part of and claims priority to U.S. application Ser. No. 16/869,954, filed May 8, 2020, titled “System, Method and Apparatus for Rehabilitation and Exercise”, which claims priority to both U.S. Prov. Application No. 62/858,244, filed Jun. 6, 2019, titled “System for Individualized Rehabilitation Using Load Cells in Handles and Foot Plates and Providing Haptic Feedback to a User” and U.S. Prov. Application No. 62/846,434, filed May 10, 2019, titled “Exercise Machine”. The current application further claims priority to U.S. Prov. Application No. 63/168,175, filed Mar. 30, 2021, titled “System and Method for an Artificial Intelligence Engine That Uses a Multi-Disciplinary Data Source to Determine Comorbidity Information Pertaining to Users and to Generate Exercise Plans for Desired User Goals”. All applications are hereby incorporated by reference in their entirety for all purposes.
20210350899 | Mason et al. | Nov 2021 | A1 |
20210350901 | Mason et al. | Nov 2021 | A1 |
20210350902 | Mason et al. | Nov 2021 | A1 |
20210350914 | Guaneri et al. | Nov 2021 | A1 |
20210350926 | Mason et al. | Nov 2021 | A1 |
20210361514 | Choi et al. | Nov 2021 | A1 |
20210366587 | Mason et al. | Nov 2021 | A1 |
20210383909 | Mason et al. | Dec 2021 | A1 |
20210391091 | Mason | Dec 2021 | A1 |
20210398668 | Chock et al. | Dec 2021 | A1 |
20210407670 | Mason | Dec 2021 | A1 |
20210407681 | Mason et al. | Dec 2021 | A1 |
20220000556 | Casey et al. | Jan 2022 | A1 |
20220001232 | DeForest | Jan 2022 | A1 |
20220016480 | Bissonnette et al. | Jan 2022 | A1 |
20220016482 | Bissonnette | Jan 2022 | A1 |
20220016485 | Bissonnette et al. | Jan 2022 | A1 |
20220016486 | Bissonnette | Jan 2022 | A1 |
20220020469 | Tanner | Jan 2022 | A1 |
20220044806 | Sanders et al. | Feb 2022 | A1 |
20220047921 | Bissonnette et al. | Feb 2022 | A1 |
20220072362 | Hopson | Mar 2022 | A1 |
20220079690 | Mason et al. | Mar 2022 | A1 |
20220080256 | Arn et al. | Mar 2022 | A1 |
20220080265 | Watterson | Mar 2022 | A1 |
20220105384 | Hacking et al. | Apr 2022 | A1 |
20220105385 | Hacking et al. | Apr 2022 | A1 |
20220105390 | Yuasa | Apr 2022 | A1 |
20220115133 | Mason et al. | Apr 2022 | A1 |
20220118218 | Bense et al. | Apr 2022 | A1 |
20220126169 | Mason | Apr 2022 | A1 |
20220133576 | Choi et al. | May 2022 | A1 |
20220148725 | Mason et al. | May 2022 | A1 |
20220158916 | Mason et al. | May 2022 | A1 |
20220176039 | Lintereur et al. | Jun 2022 | A1 |
20220181004 | Zilca et al. | Jun 2022 | A1 |
20220183557 | Mason et al. | Jun 2022 | A1 |
20220193491 | Mason et al. | Jun 2022 | A1 |
20220230729 | Mason et al. | Jul 2022 | A1 |
20220238222 | Neuberg | Jul 2022 | A1 |
20220238223 | Mason et al. | Jul 2022 | A1 |
20220262483 | Rosenberg et al. | Aug 2022 | A1 |
20220262504 | Bratty et al. | Aug 2022 | A1 |
20220266094 | Mason et al. | Aug 2022 | A1 |
20220270738 | Mason et al. | Aug 2022 | A1 |
20220273985 | Jeong et al. | Sep 2022 | A1 |
20220273986 | Mason | Sep 2022 | A1 |
20220288460 | Mason | Sep 2022 | A1 |
20220288461 | Ashley et al. | Sep 2022 | A1 |
20220288462 | Ashley et al. | Sep 2022 | A1 |
20220293257 | Guaneri et al. | Sep 2022 | A1 |
20220300787 | Wall et al. | Sep 2022 | A1 |
20220304881 | Choi et al. | Sep 2022 | A1 |
20220304882 | Choi | Sep 2022 | A1 |
20220305328 | Choi et al. | Sep 2022 | A1 |
20220314072 | Bissonnette et al. | Oct 2022 | A1 |
20220314073 | Bissonnette et al. | Oct 2022 | A1 |
20220314074 | Bissonnette et al. | Oct 2022 | A1 |
20220314075 | Mason et al. | Oct 2022 | A1 |
20220314077 | Bissonnette et al. | Oct 2022 | A1 |
20220323826 | Khurana | Oct 2022 | A1 |
20220327714 | Cook et al. | Oct 2022 | A1 |
20220327807 | Cook et al. | Oct 2022 | A1 |
20220328181 | Mason et al. | Oct 2022 | A1 |
20220330823 | Janssen | Oct 2022 | A1 |
20220331663 | Mason | Oct 2022 | A1 |
20220338761 | Maddahi et al. | Oct 2022 | A1 |
20220339052 | Kim | Oct 2022 | A1 |
20220339501 | Mason et al. | Oct 2022 | A1 |
20220384012 | Mason | Dec 2022 | A1 |
20220392591 | Guaneri et al. | Dec 2022 | A1 |
20220395232 | Locke | Dec 2022 | A1 |
20220401783 | Choi | Dec 2022 | A1 |
20220415469 | Mason | Dec 2022 | A1 |
20220415471 | Mason | Dec 2022 | A1 |
20230001268 | Bissonnette et al. | Jan 2023 | A1 |
20230013530 | Mason | Jan 2023 | A1 |
20230014598 | Mason et al. | Jan 2023 | A1 |
20230029639 | Roy | Feb 2023 | A1 |
20230048040 | Hacking et al. | Feb 2023 | A1 |
20230051751 | Hacking et al. | Feb 2023 | A1 |
20230058605 | Mason | Feb 2023 | A1 |
20230060039 | Mason | Feb 2023 | A1 |
20230072368 | Mason | Mar 2023 | A1 |
20230078793 | Mason | Mar 2023 | A1 |
20230119461 | Mason | Apr 2023 | A1 |
20230190100 | Stump | Jun 2023 | A1 |
20230201656 | Hacking et al. | Jun 2023 | A1 |
20230207097 | Mason | Jun 2023 | A1 |
20230207124 | Walsh et al. | Jun 2023 | A1 |
20230215539 | Rosenberg et al. | Jul 2023 | A1 |
20230215552 | Khotilovich et al. | Jul 2023 | A1 |
20230245747 | Rosenberg et al. | Aug 2023 | A1 |
20230245748 | Rosenberg et al. | Aug 2023 | A1 |
20230245750 | Rosenberg et al. | Aug 2023 | A1 |
20230245751 | Rosenberg et al. | Aug 2023 | A1 |
20230253089 | Rosenberg et al. | Aug 2023 | A1 |
20230255555 | Sundaram et al. | Aug 2023 | A1 |
20230263428 | Hull et al. | Aug 2023 | A1 |
20230274813 | Rosenberg et al. | Aug 2023 | A1 |
20230282329 | Mason et al. | Sep 2023 | A1 |
20240029856 | Rosenberg | Jan 2024 | A1 |
Number | Date | Country |
---|---|---|
3193419 | Mar 2022 | CA |
2885238 | Apr 2007 | CN |
101964151 | Feb 2011 | CN |
201889024 | Jul 2011 | CN |
202220794 | May 2012 | CN |
102670381 | Sep 2012 | CN |
103263336 | Aug 2013 | CN |
103390357 | Nov 2013 | CN |
103473631 | Dec 2013 | CN |
103488880 | Jan 2014 | CN |
103501328 | Jan 2014 | CN |
103721343 | Apr 2014 | CN |
203677851 | Jul 2014 | CN |
104335211 | Feb 2015 | CN |
105620643 | Jun 2016 | CN |
105683977 | Jun 2016 | CN |
103136447 | Aug 2016 | CN |
105894088 | Aug 2016 | CN |
105930668 | Aug 2016 | CN |
205626871 | Oct 2016 | CN |
106127646 | Nov 2016 | CN |
106236502 | Dec 2016 | CN |
106510985 | Mar 2017 | CN |
106621195 | May 2017 | CN |
107066819 | Aug 2017 | CN |
107430641 | Dec 2017 | CN |
107551475 | Jan 2018 | CN |
107736982 | Feb 2018 | CN |
107930021 | Apr 2018 | CN |
108078737 | May 2018 | CN |
208224811 | Dec 2018 | CN |
109191954 | Jan 2019 | CN |
109363887 | Feb 2019 | CN |
208573971 | Mar 2019 | CN |
110148472 | Aug 2019 | CN |
110201358 | Sep 2019 | CN |
110215188 | Sep 2019 | CN |
110322957 | Oct 2019 | CN |
110808092 | Feb 2020 | CN |
110931103 | Mar 2020 | CN |
110993057 | Apr 2020 | CN |
111105859 | May 2020 | CN |
111111110 | May 2020 | CN |
111370088 | Jul 2020 | CN |
111460305 | Jul 2020 | CN |
111790111 | Oct 2020 | CN |
112071393 | Dec 2020 | CN |
212141371 | Dec 2020 | CN |
112289425 | Jan 2021 | CN |
212624809 | Feb 2021 | CN |
12603295 | Apr 2021 | CN |
213190965 | May 2021 | CN |
113384850 | Sep 2021 | CN |
113499572 | Oct 2021 | CN |
215136488 | Dec 2021 | CN |
113885361 | Jan 2022 | CN |
114049961 | Feb 2022 | CN |
114203274 | Mar 2022 | CN |
216258145 | Apr 2022 | CN |
114632302 | Jun 2022 | CN |
114694824 | Jul 2022 | CN |
114898832 | Aug 2022 | CN |
114983760 | Sep 2022 | CN |
217472652 | Sep 2022 | CN |
110270062 | Oct 2022 | CN |
218420859 | Feb 2023 | CN |
115954081 | Apr 2023 | CN |
95019 | Jan 1897 | DE |
7628633 | Dec 1977 | DE |
8519150 | Oct 1985 | DE |
3732905 | Jul 1988 | DE |
19619820 | Dec 1996 | DE |
29620008 | Feb 1997 | DE |
19947926 | Apr 2001 | DE |
102018202497 | Aug 2018 | DE |
102018211212 | Jan 2019 | DE |
102019108425 | Aug 2020 | DE |
199600 | Oct 1986 | EP |
0383137 | Aug 1990 | EP |
634319 | Jan 1995 | EP |
1034817 | Sep 2000 | EP |
1159989 | Dec 2001 | EP |
1391179 | Feb 2004 | EP |
1968028 | Sep 2008 | EP |
2564904 | Mar 2013 | EP |
1909730 | Apr 2014 | EP |
2815242 | Dec 2014 | EP |
2869805 | May 2015 | EP |
2997951 | Mar 2016 | EP |
2688472 | Apr 2016 | EP |
3264303 | Jan 2018 | EP |
3323473 | May 2018 | EP |
3627514 | Mar 2020 | EP |
3671700 | Jun 2020 | EP |
3688537 | Aug 2020 | EP |
3731733 | Nov 2020 | EP |
3984508 | Apr 2022 | EP |
3984509 | Apr 2022 | EP |
3984510 | Apr 2022 | EP |
3984511 | Apr 2022 | EP |
3984512 | Apr 2022 | EP |
3984513 | Apr 2022 | EP |
4054699 | Sep 2022 | EP |
4112033 | Jan 2023 | EP |
2527541 | Dec 1983 | FR |
3127393 | Mar 2023 | FR |
141664 | Nov 1920 | GB |
2336140 | Oct 1999 | GB |
2372459 | Aug 2002 | GB |
2512431 | Oct 2014 | GB |
2591542 | Mar 2022 | GB |
201811043670 | Jul 2018 | IN |
2000005339 | Jan 2000 | JP |
2003225875 | Aug 2003 | JP |
2005227928 | Aug 2005 | JP |
2009112336 | May 2009 | JP |
2013515995 | May 2013 | JP |
3193662 | Oct 2014 | JP |
3198173 | Jun 2015 | JP |
5804063 | Nov 2015 | JP |
2018102842 | Jul 2018 | JP |
2019028647 | Feb 2019 | JP |
2019134909 | Aug 2019 | JP |
6573739 | Sep 2019 | JP |
6659831 | Mar 2020 | JP |
6710357 | Jun 2020 | JP |
6775757 | Oct 2020 | JP |
2021027917 | Feb 2021 | JP |
6871379 | May 2021 | JP |
2022521378 | Apr 2022 | JP |
3238491 | Jul 2022 | JP |
7198364 | Dec 2022 | JP |
7202474 | Jan 2023 | JP |
7231750 | Mar 2023 | JP |
7231751 | Mar 2023 | JP |
7231752 | Mar 2023 | JP |
20020009724 | Feb 2002 | KR |
200276919 | May 2002 | KR |
20020065253 | Aug 2002 | KR |
100582596 | May 2006 | KR |
101042258 | Jun 2011 | KR |
101258250 | Apr 2013 | KR |
20140128630 | Nov 2014 | KR |
20150017693 | Feb 2015 | KR |
20150078191 | Jul 2015 | KR |
101580071 | Dec 2015 | KR |
101647620 | Aug 2016 | KR |
20160093990 | Aug 2016 | KR |
20170038837 | Apr 2017 | KR |
20180004928 | Jan 2018 | KR |
20190029175 | Mar 2019 | KR |
20190056116 | May 2019 | KR |
101988167 | Jun 2019 | KR |
101969392 | Aug 2019 | KR |
102055279 | Dec 2019 | KR |
102088333 | Mar 2020 | KR |
20200025290 | Mar 2020 | KR |
20200029180 | Mar 2020 | KR |
102116664 | May 2020 | KR |
102116968 | May 2020 | KR |
20200056233 | May 2020 | KR |
102120828 | Jun 2020 | KR |
102121586 | Jun 2020 | KR |
102142713 | Aug 2020 | KR |
102162522 | Oct 2020 | KR |
20200119665 | Oct 2020 | KR |
102173553 | Nov 2020 | KR |
102180079 | Nov 2020 | KR |
102188766 | Dec 2020 | KR |
102196793 | Dec 2020 | KR |
20210006212 | Jan 2021 | KR |
102224188 | Mar 2021 | KR |
102224618 | Mar 2021 | KR |
102246049 | Apr 2021 | KR |
102246050 | Apr 2021 | KR |
102246051 | Apr 2021 | KR |
102246052 | Apr 2021 | KR |
20210052028 | May 2021 | KR |
102264498 | Jun 2021 | KR |
102352602 | Jan 2022 | KR |
102352603 | Jan 2022 | KR |
102352604 | Jan 2022 | KR |
102387577 | Apr 2022 | KR |
102421437 | Jul 2022 | KR |
20220102207 | Jul 2022 | KR |
102427545 | Aug 2022 | KR |
102467495 | Nov 2022 | KR |
102467496 | Nov 2022 | KR |
102469723 | Nov 2022 | KR |
102471990 | Nov 2022 | KR |
20220145989 | Nov 2022 | KR |
20220156134 | Nov 2022 | KR |
102502744 | Feb 2023 | KR |
20230019349 | Feb 2023 | KR |
20230019350 | Feb 2023 | KR |
20230026556 | Feb 2023 | KR |
20230026668 | Feb 2023 | KR |
20230040526 | Mar 2023 | KR |
20230050506 | Apr 2023 | KR |
20230056118 | Apr 2023 | KR |
102528503 | May 2023 | KR |
102531930 | May 2023 | KR |
102532766 | May 2023 | KR |
102539190 | Jun 2023 | KR |
2014131288 | Feb 2016 | RU |
2607953 | Jan 2017 | RU |
200910231 | Mar 2009 | TW |
M474545 | Mar 2014 | TW |
I442956 | Jul 2014 | TW |
201531278 | Aug 2015 | TW |
M638437 | Mar 2023 | TW |
1998009687 | Mar 1998 | WO |
0149235 | Jul 2001 | WO |
0151083 | Jul 2001 | WO |
2001050387 | Jul 2001 | WO |
2001056465 | Aug 2001 | WO |
02062211 | Aug 2002 | WO |
02093312 | Nov 2002 | WO |
2003043494 | May 2003 | WO |
2005018453 | Mar 2005 | WO |
2006004430 | Jan 2006 | WO |
2006012694 | Feb 2006 | WO |
2007102709 | Sep 2007 | WO |
2008114291 | Sep 2008 | WO |
2009008968 | Jan 2009 | WO |
2011025322 | Mar 2011 | WO |
2012128801 | Sep 2012 | WO |
2013002568 | Jan 2013 | WO |
2023164292 | Mar 2013 | WO |
2013122839 | Aug 2013 | WO |
2014011447 | Jan 2014 | WO |
2014163976 | Oct 2014 | WO |
2015026744 | Feb 2015 | WO |
2015065298 | May 2015 | WO |
2015082555 | Jun 2015 | WO |
2015112945 | Jul 2015 | WO |
2016154318 | Sep 2016 | WO |
2017030781 | Feb 2017 | WO |
2017166074 | May 2017 | WO |
2017091691 | Jun 2017 | WO |
2017165238 | Sep 2017 | WO |
2018081795 | May 2018 | WO |
2018171853 | Sep 2018 | WO |
2019022706 | Jan 2019 | WO |
2019075185 | Apr 2019 | WO |
2019143940 | Jul 2019 | WO |
2020075190 | Apr 2020 | WO |
2020130979 | Jun 2020 | WO |
2020149815 | Jul 2020 | WO |
2020229705 | Nov 2020 | WO |
2020245727 | Dec 2020 | WO |
2020249855 | Dec 2020 | WO |
2020252599 | Dec 2020 | WO |
2020256577 | Dec 2020 | WO |
2021021447 | Feb 2021 | WO |
2021022003 | Feb 2021 | WO |
2021038980 | Mar 2021 | WO |
2021055427 | Mar 2021 | WO |
2021061061 | Apr 2021 | WO |
2021090267 | May 2021 | WO |
2021138620 | Jul 2021 | WO |
2022047006 | Mar 2022 | WO |
2022092493 | May 2022 | WO |
2022092494 | May 2022 | WO |
2022212532 | Oct 2022 | WO |
2022212883 | Oct 2022 | WO |
2022212921 | Oct 2022 | WO |
2022216498 | Oct 2022 | WO |
2022251420 | Dec 2022 | WO |
2023008680 | Feb 2023 | WO |
2023008681 | Feb 2023 | WO |
2023022319 | Feb 2023 | WO |
2023022320 | Feb 2023 | WO |
2023052695 | Apr 2023 | WO |
2023091496 | May 2023 | WO |
Entry |
---|
Claris Healthcare Inc., Claris Reflex Patient Rehabilitation System Brochure, retrieved on Oct. 2, 2019, 5 pages, https://clarisreflex.com/. |
Fysiomed, 16983—Vario adjustable pedal arms, retrieved on Aug. 4, 2020, 1 page, https://www.fysiomed.com/en/products/16983-vario-adjustable-pedal-arms. |
HCI Fitness, HCI Fitness PhysioTrainer Upper Body Ergonometer, announced 2009 [online], retrieved on Aug. 19, 2021, 8 pages, www.amazon.com/HCI-Fitness-PhysioTrainer-Upper-Ergonometer/dp/B001P5GUGM. |
HCI Fitness, HCI Fitness PhysioTrainer Pro, 2017, retrieved on Aug. 23, 2021, 6 pages, https://www.amazon.com/HCI-Fitness-PhysioTrainer-Electronically-Controlled/dp/B0759YMW78/. |
Matrix, R3xm Recumbent Cycle, retrieved on Aug. 19, 2021, 7 pages, https://www.matrixfitness.com/en/cardio/cycles/r3xm-recumbent. |
ROM3 Rehab, ROM3 Rehab System, Apr. 20, 2015, retrieved on Aug. 31, 2018, 12 pages, https://vimeo.com/125438463. |
Davenport et al., “The Potential For Artificial Intelligence In Healthcare”, Future Healthcare Journal, 2019, vol. 6, No. 2, pp. 1-5. |
Ahmed et al., “Artificial Intelligence With Multi-Functional Machine Learning Platform Development For Better Healthcare And Precision Medicine”, Database (Oxford), 2020, baaa010, doi: 10.1093/database/baaa010, pp. 1-35. |
Ruiz Ivan et al., “Towards a physical rehabilitation system using a telemedicine approach”, Computer Methods in Biomechanics and Biomedical Engineering: Imaging & Visualization, vol. 8, No. 6, Jul. 28, 2020, pp. 671-680, XP055914810. |
De Canniere Helene et al., “Wearable Monitoring and Interpretable Machine Learning Can Objectively Track Progression in Patients during Cardiac Rehabilitation”, Sensors, vol. 20, No. 12, Jun. 26, 2020, XP055914617, pp. 1-15. |
Boulanger Pierre et al., “A Low-cost Virtual Reality Bike for Remote Cardiac Rehabilitation”, Dec. 7, 2017, Advances in Biometrics: International Conference, ICB 2007, Seoul, Korea, pp. 155-166. |
Yin Chieh et al., “A Virtual Reality-Cycling Training System for Lower Limb Balance Improvement”, BioMed Research International, vol. 2016, pp. 1-10. |
Jennifer Bresnick, “What is the Role of Natural Language Processing in Healthcare?”, pp. 1-7, published Aug. 18, 2016, retrieved on Feb. 1, 2022 from https://healthitanalytics.com/features/what-is-the-role-of-natural-language-processing-in-healthcare. |
Alex Bellec, “Part-of-Speech tagging tutorial with the Keras Deep Learning library,” pp. 1-16, published Mar. 27, 2018, retrieved on Feb. 1, 2022 from https://becominghuman.ai/part-of-speech-tagging-tutorial-with-the-keras-deep-learning-library-d7f93fa05537. |
Kavita Ganesan, All you need to know about text preprocessing for NLP and Machine Learning, pp. 1-14, published Feb. 23, 2019, retrieved on Feb. 1, 2022 from https://towardsdatascience.com/all-you-need-to-know-about-text-preprocessing-for-nlp-and-machine-learning-bclc5765ff67. |
Badreesh Shetty, “Natural Language Processing (NLP) for Machine Learning,” pp. 1-13, published Nov. 24, 2018, retrieved on Feb. 1, 2022 from https://towardsdatascience.com/natural-language-processing-nlp-for-machine-learning-d44498845d5b. |
Website for “Pedal Exerciser”, p. 1, retrieved on Sep. 9, 2022 from https://www.vivehealth.com/collections/physical-therapy-equipment/products/pedalexerciser. |
Website for “Functional Knee Brace with ROM”, p. 1, retrieved on Sep. 9, 2022 from http://medicalbrace.gr/en/product/functional-knee-brace-with-goniometer-mbtelescopicknee/. |
Website for “ComfySplints Goniometer Knee”, pp. 1-5, retrieved on Sep. 9, 2022 from https://www.comfysplints.com/product/knee-splints/. |
Website for “BMI FlexEze Knee Corrective Orthosis (KCO)”, pp. 1-4, retrieved on Sep. 9, 2022 from https://orthobmi.com/products/bmi-flexeze%C2%AE-knee-corrective-orthosis-kco. |
Website for “Neoprene Knee Brace with goniometer—Patella ROM MB.4070”, pp. 1-4, retrieved on Sep. 9, 2022 from https://www.fortuna.com.gr/en/product/neoprene-knee-brace-with-goniometer-patella-rom-mb-4070/. |
Kuiken et al., “Computerized Biofeedback Knee Goniometer: Acceptance and Effect on Exercise Behavior in Post-total Knee Arthroplasty Rehabilitation,” Biomedical Engineering Faculty Research and Publications, 2004, pp. 1-10. |
Website for “OxeFit XS1”, pp. 1-3, retrieved on Sep. 9, 2022 from https://www.oxefit.com/xs1. |
Website for “Preva Mobile”, pp. 1-6, retrieved on Sep. 9, 2022 from https://www.precor.com/en-us/resources/introducing-preva-mobile. |
Website for “J-Bike”, pp. 1-3, retrieved on Sep. 9, 2022 from https://www.magneticdays.com/en/cycling-for-physical-rehabilitation. |
Website for “Excy”, pp. 1-12, retrieved on Sep. 9, 2022 from https://excy.com/portable-exercise-rehabilitation-excy-xcs-pro/. |
Website for “OxeFit XP1”, p. 1, retrieved on Sep. 9, 2022 from https://www.oxefit.com/xp1. |
Barrett et al., “Artificial intelligence supported patient self-care in chronic heart failure: a paradigm shift from reactive to predictive, preventive and personalised care,” EPMA Journal (2019), pp. 445-464. |
Oerkild et al., “Home-based cardiac rehabilitation is an attractive alternative to no cardiac rehabilitation for elderly patients with coronary heart disease: results from a randomised clinical trial,” BMJ Open Accessible Medical Research, Nov. 22, 2012, pp. 1-9. |
Bravo-Escobar et al., “Effectiveness and safety of a home-based cardiac rehabilitation programme of mixed surveillance in patients with ischemic heart disease at moderate cardiovascular risk: A randomised, controlled clinical trial,” BMC Cardiovascular Disorders (2017) 17:66, pp. 1-11. |
Thomas et al., “Home-Based Cardiac Rehabilitation,” Circulation, 2019, pp. e69-e89. |
Thomas et al., “Home-Based Cardiac Rehabilitation,” Journal of the American College of Cardiology, vol. 74, Nov. 1, 2019, pp. 133-153. |
Thomas et al., “Home-Based Cardiac Rehabilitation,” HHS Public Access, Oct. 2020, pp. 1-39. |
Dittus et al., “Exercise-Based Oncology Rehabilitation: Leveraging the Cardiac Rehabilitation Model,” Journal of Cardiopulmonary Rehabilitation and Prevention 2015, pp. 130-139. |
Chen et al., “Home-based cardiac rehabilitation improves quality of life, aerobic capacity, and readmission rates in patients with chronic heart failure,” Medicine (2018) 97:4, pp. 1-5. |
Lima de Melo Ghisi et al., “A systematic review of patient education in cardiac patients: Do they increase knowledge and promote health behavior change?,” Patient Education and Counseling, 2014, pp. 1-15. |
Fang et al., “Use of Outpatient Cardiac Rehabilitation Among Heart Attack Survivors—20 States and the District of Columbia, 2013 and Four States, 2015,” Morbidity and Mortality Weekly Report, vol. 66, No. 33, Aug. 25, 2017, pp. 869-873. |
Beene et al., “AI and Care Delivery: Emerging Opportunities For Artificial Intelligence To Transform How Care Is Delivered,” Nov. 2019, American Hospital Association, pp. 1-12. |
Malloy, Online Article “AI-enabled EKGs find difference between numerical age and biological age significantly affects health, longevity”, Website: https://newsnetwork.mayoclinic.org/discussion/ai-enabled-ekgs-find-difference-between-numerical-age-and-biological-age-significantly-affects-health-longevity/, Mayo Clinic News Network, May 20, 2021, retrieved: Jan. 23, 2023, p. 1-4. |
Jeong et al., “Computer-assisted upper extremity training using interactive biking exercise (iBikE) platform,” Sep. 2012, pp. 1-5, 34th Annual International Conference of the IEEE EMBS. |
Website for “Esino 2022 Physical Therapy Equipments Arm Fitness Indoor Trainer Leg Spin Cycle Machine Exercise Bike for Elderly,” https://www.made-in-china.com/showroom/esinogroup/product-detailYdZtwGhCMKVR/China-Esino-2022-Physical-Therapy-Equipments-Arm-Fitness-Indoor-Trainer-Leg-Spin-Cycle-Machine-Exercise-Bike-for-Elderly.html, retrieved on Aug. 29, 2023, 5 pages. |
Abedtash, “An Interoperable Electronic Medical Record-Based Platform for Personalized Predictive Analytics”, ProQuest LLC, Jul. 2017, 185 pages. |
Alcaraz et al., “Machine Learning as Digital Therapy Assessment for Mobile Gait Rehabilitation,” 2018 IEEE 28th International Workshop on Machine Learning for Signal Processing (MLSP), Aalborg, Denmark, 2018, 6 pages. |
Androutsou et al., “A Smartphone Application Designed to Engage the Elderly in Home-Based Rehabilitation,” Frontiers in Digital Health, Sep. 2020, vol. 2, Article 15, 13 pages. |
Silva et al., “SapoFitness: A mobile health application for dietary evaluation,” 2011 IEEE 13th International Conference on e-Health Networking, Applications and Services, Columbia, MO, USA, 2011, 6 pages. |
Wang et al., “Interactive wearable systems for upper body rehabilitation: a systematic review,” Journal of NeuroEngineering and Rehabilitation, 2017, 21 pages. |
Marzolini et al., “Eligibility, Enrollment, and Completion of Exercise-Based Cardiac Rehabilitation Following Stroke Rehabilitation: What Are the Barriers?,” Physical Therapy, vol. 100, No. 1, 2019, 13 pages. |
Nijjar et al., “Randomized Trial of Mindfulness-Based Stress Reduction in Cardiac Patients Eligible for Cardiac Rehabilitation,” Scientific Reports, 2019, 12 pages. |
Lara et al., “Human-Robot Sensor Interface for Cardiac Rehabilitation,” IEEE International Conference on Rehabilitation Robotics, Jul. 2017, 8 pages. |
Ishraque et al., “Artificial Intelligence-Based Rehabilitation Therapy Exercise Recommendation System,” 2018 IEEE MIT Undergraduate Research Technology Conference (URTC), Cambridge, MA, USA, 2018, 5 pages. |
Zakari et al., “Are There Limitations to Exercise Benefits in Peripheral Arterial Disease?,” Frontiers in Cardiovascular Medicine, Nov. 2018, vol. 5, Article 173, 12 pages. |
You et al., “Including Blood Vasculature into a Game-Theoretic Model of Cancer Dynamics,” Games 2019, 10, 13, 22 pages. |
Chrif et al., “Control design for a lower-limb paediatric therapy device using linear motor technology,” Article, 2017, pp. 119-127, Science Direct, Switzerland. |
Robben et al., “Delta Features From Ambient Sensor Data are Good Predictors of Change in Functional Health,” Article, 2016, pp. 2168-2194, vol. 21, No. 4, IEEE Journal of Biomedical and Health Informatics. |
Kantoch et al., “Recognition of Sedentary Behavior by Machine Learning Analysis of Wearable Sensors during Activities of Daily Living for Telemedical Assessment of Cardiovascular Risk,” Article, 2018, 17 pages, Sensors, Poland. |
Warburton et al., “International Launch of the PAR-Q+ and ePARmed-X+: Validation of the PAR-Q+ and ePARmed-X+,” Health & Fitness Journal of Canada, 2011, 9 pages, vol. 4, No. 2. |
Number | Date | Country | |
---|---|---|---|
20220016485 A1 | Jan 2022 | US |
Number | Date | Country | |
---|---|---|---|
63168175 | Mar 2021 | US | |
62858244 | Jun 2019 | US | |
62846434 | May 2019 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 16869954 | May 2020 | US |
Child | 17395639 | US |