This disclosure relates generally to ergonomics and, more particularly, to systems, apparatus, and methods for musculoskeletal ergonomic improvement.
An individual may experience a musculoskeletal injury (e.g., an injury to muscle(s), nerve(s), and/or joint(s) of the individual's body) while performing activities. Such injuries can stem from conditions in a work environment and/or a manner in which the activities are performed.
An example apparatus includes a performance analyzer to predict a musculoskeletal strain event for a portion of a body of a user based on strain sensor data collected via one or more strain sensors associated with the user and transmit, in response to the prediction of the strain event, an instruction including an alert to be output by an output device. The example apparatus includes an ergonomic form recommendation generator to transmit, in response to the prediction of the strain event, an instruction including an ergonomic form measure to be output by the output device.
An example system includes a first sensor and an ergonomic analysis controller to execute a neural network model to predict a musculoskeletal strain event for a user based on first sensor data generated by the first sensor; generate an ergonomic form measure for the user based on the first sensor data; and cause an output device to present the ergonomic form measure in response to the prediction of the musculoskeletal strain event.
An example non-transitory computer readable medium includes instructions that, when executed by at least one processor, cause the at least one processor to predict a musculoskeletal strain event based on sensor data generated in response to movement by a user and transmit, in response to the prediction of the musculoskeletal strain event, an instruction including an ergonomic form measure to be output by an output device.
An example method includes predicting a musculoskeletal strain event for a portion of a body of a user based on strain sensor data collected via one or more strain sensors associated with the user and transmitting, in response to the prediction of the musculoskeletal strain event, an instruction including an alert and an ergonomic form measure to be output by an output device.
The figures are not to scale. In general, the same reference numbers will be used throughout the drawing(s) and accompanying written description to refer to the same or like parts.
Unless specifically stated otherwise, descriptors such as “first,” “second,” “third,” etc. are used herein without imputing or otherwise indicating any meaning of priority, physical order, arrangement in a list, and/or ordering in any way, but are merely used as labels and/or arbitrary names to distinguish elements for ease of understanding the disclosed examples. In some examples, the descriptor “first” may be used to refer to an element in the detailed description, while the same element may be referred to in a claim with a different descriptor such as “second” or “third.” In such instances, it should be understood that such descriptors are used merely for identifying those elements distinctly that might, for example, otherwise share a same name.
An individual may experience a musculoskeletal injury (e.g., an injury to muscle(s), nerve(s), and/or joint(s) of the individual's body) while performing activities. Such injuries can stem from conditions in a work environment and/or a manner in which the activities are performed. For instance, performing repetitive tasks, lifting heavy objects, and/or other types of overuse or overexertion activities can cause musculoskeletal injuries that, in addition to causing pain, may affect worker productivity. Workplace conditions such as a layout of a workspace and/or a design of objects in the workspace, such as a height of a desk, can contribute to musculoskeletal injuries in an individual over time. In some examples, repeated exposure to conditions in the environment such as vibrations can also cause musculoskeletal injuries. However, musculoskeletal injuries are often not addressed until the worker is already experiencing pain.
Disclosed herein are example systems, apparatus, and methods that predict a likelihood that a user is experiencing or is likely to experience a musculoskeletal strain event and notify the user of the predicted strain event. Examples disclosed herein access data generated by one or more sensors associated with the user and/or located in the environment in which the user is performing movements. The sensors can include wearable sensors (e.g., biosensors to detect body temperature, heart rate, hydration level, etc.; strain sensors carried by a fabric worn by the user that detect muscle strain and/or tension). In some examples, the sensors include environmental sensors such as video cameras to capture images of the user in the environment and/or infrared or thermal cameras to detect heat generated by the user.
In examples disclosed herein, an ergonomic analysis controller executes neural network model(s) to evaluate ergonomic form(s) associated with a user's body and to identify a risk of strain event(s) for one or more portions of the user's body based on the sensor data. The neural network model(s) can be generated for detecting strain event(s) at particular portions of the user's body, such as a shoulder. In some examples disclosed herein, the ergonomic analysis controller analyzes the results of the neural network analysis in view of data previously collected from the user and/or other users to identify trends in user movement that can indicate that the user is overstressing one or more portions of his or her body (e.g., to detect repetitive motion or to identify anomalies in user movement that can lead to injury). In examples disclosed herein, data collected from the user and/or other users is used to refine the neural network model(s) and, thus, the predictions of musculoskeletal strain event(s).
Examples disclosed herein provide feedback to the user to alert the user to the predicted strain event. The alert(s) can be provided via one or more output devices, such as via a user application on a smartphone and/or wearable device such as a smartwatch. Some examples disclosed herein provide recommendations or mitigation instructions as to how the user can alleviate strain and/or otherwise improve ergonomic form. The recommendations can include, for instance, audio instructions and/or visual instructions advising the user how to perform a movement safely, recommendations as to the number of repetitions of a movement to perform, etc. Examples disclosed herein dynamically respond to changes in user characteristics and/or behavior in evaluating the risk for strain event(s) and/or developing ergonomic form recommendations, rather than relying on static reference data that may or may not be accurate for the user.
In some examples disclosed herein, the sensor data and/or results of analysis of the sensor data performed by the ergonomic analysis controller for the user and/or a population of users are provided to third parties such as a healthcare provider. Such information can be used by healthcare providers to, for instance, monitor the user(s) and develop a customized health plan to reduce the risk of musculoskeletal injuries.
The example system 100 includes one or more sensors to collect biological data from the user 102. For example, the sensor(s) can include biosensor(s) 104 carried by the user 102 to collect biological data for the user 102 such as heart rate, respiration rate, blood pressure, body temperature, hydration level, etc. In some examples, the biosensor(s) 104 are carried by one or more user devices 105, such as a smartwatch or a health tracker. The user 102 may carry (e.g., wear) the user device(s) 105 to enable the biosensor(s) 104 of the user device(s) 105 to collect data from the user 102.
The example system 100 includes one or more strain sensor(s) 106 to detect strain and/or stress on joint(s) of the user 102 and/or with respect to the muscle(s) of the user 102. The strain sensor(s) 106 can include electromyography (EMG) sensor(s) worn by the user 102 to detect muscle tension. In some examples, the strain sensor(s) 106 include sensor(s) to detect skin and/or muscle temperature, which are indicative of muscle activity. In some examples, other types of sensors, such as position sensors and/or accelerometers are carried by the user 102 and/or by user device(s) 105 associated with the user 102 to output data indicative of muscle strain.
In some examples, the strain sensor(s) 106 include fabric sensing wearable(s) 107. The fabric sensing wearable(s) 107 include wearable fabrics (e.g., a shirt or other garment) that include sensor(s) to output data indicative of strain on the muscle(s) and/or skeleton (e.g., joint(s)) of the user 102. For example, motion-sensing fabrics can include pressure and/or strain sensor(s) that output signal(s) in response to changes in pressure and/or deformation of the sensor(s) during movement by the user 102.
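For illustration only (the disclosure does not specify a sensor transduction scheme), the following is a minimal Python sketch of how a resistive fabric strain sensor reading might be converted into a strain estimate using the standard gauge-factor relation; the resistance and gauge-factor values are assumed calibration constants.

```python
# Illustrative sketch, not part of the disclosed examples: a resistive strain
# sensor's fractional resistance change relates to mechanical strain via the
# gauge factor (GF): strain = (delta_R / R0) / GF. R0 and GF are assumed.

def strain_from_resistance(r_measured: float,
                           r_unstrained: float = 350.0,  # ohms, assumed calibration
                           gauge_factor: float = 2.0) -> float:
    """Estimate mechanical strain from a resistive strain sensor reading."""
    delta_r = r_measured - r_unstrained
    return (delta_r / r_unstrained) / gauge_factor

# Example: 353.5 ohms on a 350-ohm sensor with GF = 2 gives
# (3.5 / 350) / 2 = 0.005, i.e., 0.5% strain.
print(strain_from_resistance(353.5))  # 0.005
```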
In some examples, the system 100 includes environmental sensor(s) 108, or sensor(s) located in the environment 103, that collect data with respect to the environment 103 and/or the user 102 in the environment 103. The environmental sensor(s) 108 can include, for example, camera(s) (e.g., video camera(s), still camera(s)) to generate image data of the user 102 in the environment 103, audio sensor(s) to capture audio in the environment 103, vibration sensor(s) to detect vibrations in the environment 103, motion capture sensor(s), etc. In some examples, the environmental sensor(s) 108 include infrared camera(s) that detect changes in a temperature of a skin of the user 102 due to muscle activity.
The example system 100 can include other types of sensors than the example sensors 104, 106, 107, 108 disclosed herein. Also, in some examples, the system 100 includes fewer types of sensor(s). For example, the system 100 can include the biosensor(s) 104 and/or the strain sensor(s) 106 but not the environmental sensor(s) 108.
In the example of
In the example of
The example ergonomic analysis controller 110 analyzes the sensor signal data from the respective sensor(s) 104, 106, 108 to predict a likelihood that one or more portions of the body of the user 102 is under strain such that there is a risk of compromising musculoskeletal integrity. As disclosed herein, the ergonomic analysis controller 110 implements neural network model(s) to predict if the one or more portions of the user's body (e.g., muscle(s), joint(s)) is experiencing a musculoskeletal strain event or is likely to experience a strain event. The neural network model(s) can be trained using previously collected data (e.g., biometric sensor data, image data, reference anthropometric data) associated with the user and/or other individuals. The training data can define baseline or threshold information for determining if the user is at risk for experiencing a musculoskeletal strain event. In such examples, the ergonomic analysis controller 110 predicts the musculoskeletal strain event(s) by executing the trained neural network model(s) for the sensor signal data generated by the sensor(s) 104, 106, 108. In some examples, the ergonomic analysis controller 110 determines that the user 102 is experiencing a musculoskeletal strain event or is likely to experience a strain event by mapping one or more user parameters (e.g., gender, age, weight, athletic ability) to population profile data. The population profile data can include, for example, average ranges of motion for users based on parameters such as weight, gender, athletic ability; average weight that can be safely lifted based on age, gender, etc.
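For illustration only, the following minimal Python sketch shows the two prediction paths just described: executing a trained model on current sensor features, and mapping user parameters to population profile data. The profile values, the 0.8 threshold, and the model interface are assumptions, not the disclosure's implementation.

```python
# Hypothetical sketch (not from the disclosure): combine a trained model's
# score on current sensor features with a population-profile check keyed by
# user parameters. Profile values and the 0.8 threshold are invented.
from dataclasses import dataclass

@dataclass
class PopulationProfile:
    max_safe_lift_kg: float      # average weight safely lifted by the group
    shoulder_rom_degrees: float  # average safe shoulder range of motion

# Assumed profiles keyed by (age band, gender) for illustration.
POPULATION_PROFILES = {
    ("30-39", "F"): PopulationProfile(18.0, 150.0),
    ("50-59", "M"): PopulationProfile(20.0, 135.0),
}

def predict_strain_event(model, sensor_features, user_params, lifted_kg):
    """Return True if either prediction path indicates a strain event risk."""
    risk_score = model(sensor_features)  # assumed: returns probability in [0, 1]
    profile = POPULATION_PROFILES.get((user_params["age_band"], user_params["gender"]))
    over_limit = profile is not None and lifted_kg > profile.max_safe_lift_kg
    return risk_score > 0.8 or over_limit
```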
In examples disclosed herein, the ergonomic analysis controller 110 predicts the strain event(s) based on the signal data generated by the sensor(s) 104, 106, 108 while the user 102 is in the environment 103 and/or data previously collected from the user (in the environment 103 and/or in other environment(s)). The previously collected data can include biosensor data and/or strain sensor data and can serve as baseline or reference data for the user 102. In some examples, previously collected data from the environment 103 by the environmental sensor(s) 108 (e.g., vibration levels) and/or previously collected sensor data from other environments similar to the environment 103 (e.g., manufacturing environments) serves as reference environmental data. As disclosed herein (
In some examples, the ergonomic analysis controller 110 compares the results of the neural network analysis in view of previously collected sensor data for the user 102 and/or previously generated neural network analysis results for the user 102 to verify that a prediction that the user 102 is or is not likely to experience a strain event is accurate. For example, for a given set of sensor data, the ergonomic analysis controller 110 may determine that the user 102 is not likely to experience a strain event. However, based on previously generated neural network results and/or historical sensor data for the user 102, the ergonomic analysis controller 110 may determine that the user 102 is at risk for a strain event due to the cumulative effect of strain from, for instance, performing a repetitive motion. Thus, in some examples, the ergonomic analysis controller 110 predicts that the user 102 is experiencing or is likely to experience a musculoskeletal strain event based on cumulative results from the neural network analysis and/or changes in the sensor data collected from the user 102 and/or the environment 103 over time. The neural network model(s) implemented by the ergonomic analysis controller 110 of
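The cumulative analysis described above can be pictured with a short sketch: even when no single reading triggers a prediction, repeated near-threshold readings within a rolling window can indicate risk from a repetitive motion. This is a hypothetical illustration; the window size, threshold, and fraction are invented parameters.

```python
# Illustrative sketch of a cumulative-exposure check: flag a risk when a
# large fraction of recent risk scores are elevated, even if none is acute.
from collections import deque

class CumulativeStrainMonitor:
    def __init__(self, window: int = 200, near_threshold: float = 0.6,
                 repeat_fraction: float = 0.5):
        self.history = deque(maxlen=window)    # recent per-sample risk scores
        self.near_threshold = near_threshold   # "elevated but not acute" level
        self.repeat_fraction = repeat_fraction # fraction of window that must be elevated

    def update(self, risk_score: float) -> bool:
        """Return True if cumulative exposure suggests a strain event risk."""
        self.history.append(risk_score)
        if len(self.history) < self.history.maxlen:
            return False  # not enough history yet
        elevated = sum(1 for s in self.history if s >= self.near_threshold)
        return elevated / len(self.history) >= self.repeat_fraction
```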
In the example of
In some examples, the output device(s) 112 include user device(s) (e.g., electronic tablets, smartphones, laptops) associated with a third party who is authorized to receive report(s), alert(s), etc. with respect to the analysis of the sensor data and/or prediction(s) of strain event(s). The third party can include, for example, a medical professional. In some examples, the ergonomic analysis controller 110 transmits the data collected by the sensor(s) 104, 106, 108 and/or data derived therefrom (e.g., average muscle strain data) for display at the output device(s) 112. Thus, the authorized third party can track changes in the user 102 with respect to musculoskeletal events over time.
In some examples, the ergonomic analysis controller 110 generates ergonomic form recommendation(s) for the user 102 in response to predicting a likelihood that the user 102 is experiencing strain event(s). For example, the ergonomic form recommendation(s) can include instruction(s) or action(s) that the user 102 can take to alleviate stress or strain on the portion(s) of the user's body (e.g., by re-positioning the user's body part, taking a break from the movement, etc.). The ergonomic form recommendation(s) can include, for instance, recommended limits on a number of repetitions of a movement performed by the user 102, recommended limits on an amount of weight that the user 102 can safely carry, etc. As disclosed herein, the ergonomic form recommendation(s) can be generated based on the data collected from the user 102 via the sensor(s) 104, 106, 108 over time and predefined ergonomic form rule(s). The ergonomic form recommendation(s) generated by the ergonomic analysis controller 110 can include, for instance, visual instruction(s) that are displayed via a display screen of the output device(s) 112 and/or audio instruction(s) that are presented via speaker(s) of the output device(s) 112, etc.
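As a hypothetical illustration of predefined ergonomic form rule(s) of the kind described above, the sketch below maps a movement type to repetition and weight limits; the specific limits and messages are invented placeholders rather than values from the disclosure.

```python
# Illustrative rule table and lookup for ergonomic form recommendations.
# Limits and messages are assumed placeholders.
ERGONOMIC_FORM_RULES = {
    "shoulder_overhead": {"max_repetitions": 30,
                          "message": "Rest your shoulder; limit overhead reaches to 30 per hour."},
    "lift":              {"max_weight_kg": 20.0,
                          "message": "Use a two-person lift or mechanical assist above 20 kg."},
}

def form_measure(movement: str, repetitions: int = 0, weight_kg: float = 0.0):
    """Return a recommendation string when a rule threshold is exceeded, else None."""
    rule = ERGONOMIC_FORM_RULES.get(movement)
    if rule is None:
        return None
    if repetitions > rule.get("max_repetitions", float("inf")):
        return rule["message"]
    if weight_kg > rule.get("max_weight_kg", float("inf")):
        return rule["message"]
    return None
```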
As disclosed herein, the neural network analysis with respect to musculoskeletal strain event(s) and/or the generation of the ergonomic form recommendation(s) can be based on sensor data collected from the user 102 for whom the analysis is performed, including sensor data collected from the user 102 over time. In some examples, the ergonomic analysis controller 110 also uses data collected from other users to refine the neural network analysis and/or generate the ergonomic form recommendation(s). Thus, in some examples, the ergonomic analysis controller 110 performs a population-based analysis of strain event(s) associated with the user 102.
In the example system 100 of
The example population data aggregator 114 of
The example population data aggregator 114 receives data associated with the user 102 and other users. For instance, the population data aggregator 114 can receive biosensor data collected from other users in response to the other users performing movements in the environment 103 and/or different environments. In some examples, data is collected from the other users in response to the users performing substantially the same movements as performed by the user 102 (e.g., an overhead movement). Additionally or alternatively, the data can be collected from the other users in response to the users performing different movements than the user 102.
The example population data aggregator 114 classifies or groups the data associated with the plurality of users based on variables such as individual characteristics (e.g., age, gender, etc.), movement types, and/or environment(s) from which the data was collected. As a result, the population data aggregator 114 generates population profile data including data profiles defined by different classifications (e.g., demographics, environment type, movement type). The classifications defined by the population data aggregator 114 can be customized based on, for instance, properties of the environment 103 (e.g., type of work performed) and/or reference data such as anthropometric measurements for individuals of different ages, genders, etc.
In some examples, the population data aggregator 114 aggregates data from individuals in the population over time and determines average or threshold data for detecting strain event(s) based on the data collected from the population over time. For instance, the population data profile(s) can define averages of, for instance, biosensor data (e.g., heart rate data) and/or strain sensor data (e.g., amount of muscle strain or tension detected) from multiple users who experienced musculoskeletal injury. In the example of
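For illustration, a minimal sketch of such aggregation: records are grouped by classification key and a per-group baseline and alert threshold are derived. The mean-plus-two-standard-deviations statistic is an assumption for the example, not a statistic specified by the disclosure.

```python
# Illustrative sketch: group (classification_key, reading) records and derive
# a per-group baseline and alert threshold from the aggregated data.
from collections import defaultdict
from statistics import mean, stdev

def build_population_profiles(records):
    """records: iterable of (classification_key, tension_reading) pairs."""
    groups = defaultdict(list)
    for key, reading in records:
        groups[key].append(reading)
    profiles = {}
    for key, readings in groups.items():
        if len(readings) < 2:
            continue  # need at least two samples for a standard deviation
        mu, sigma = mean(readings), stdev(readings)
        profiles[key] = {"baseline": mu, "alert_threshold": mu + 2 * sigma}
    return profiles

# Example classification key: (age band, gender, movement type, environment).
profiles = build_population_profiles([
    (("30-39", "F", "overhead", "factory"), 0.42),
    (("30-39", "F", "overhead", "factory"), 0.48),
    (("30-39", "F", "overhead", "factory"), 0.45),
])
```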
In some examples of
As also disclosed herein, the example population data aggregator 114 is constructed to aggregate or compile data associated with a plurality of users (including, for example, the user 102 of
In the example of
In the example of
In some examples, the ergonomic analysis controller 110 includes the database 200. In other examples, the database 200 is located external to the ergonomic analysis controller 110 in a location accessible to the ergonomic analysis controller 110 as shown in
The example ergonomic analysis controller 110 includes a signal modifier 206. The signal modifier 206 can perform operations to modify the sensor data 201, 202, 204 from the sensor(s) 104, 106, 107, 108 to, for example, filter the data, convert time domain audio data into the frequency spectrum (e.g., via Fast Fourier Transform (FFT) processing) for spectral analysis, etc. In some examples, the data 201, 202, 204 undergoes modification(s) by the signal modifier 206 before being stored in the database 200.
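A brief sketch of preprocessing of the kind the signal modifier 206 might perform follows: removing a DC offset and converting a time-domain signal to a one-sided magnitude spectrum via an FFT. The sampling rate and test signal are assumed for the example.

```python
# Illustrative FFT preprocessing for spectral analysis of a sensor signal.
import numpy as np

def to_spectrum(signal: np.ndarray, sample_rate_hz: float):
    """Return (frequencies, magnitudes) for a real-valued time-domain signal."""
    signal = signal - np.mean(signal)            # remove DC offset
    magnitudes = np.abs(np.fft.rfft(signal))     # one-sided magnitude spectrum
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / sample_rate_hz)
    return freqs, magnitudes

# Example: a 1 s capture at 100 Hz containing a 5 Hz vibration component.
t = np.arange(0, 1.0, 0.01)
freqs, mags = to_spectrum(np.sin(2 * np.pi * 5 * t), sample_rate_hz=100.0)
print(freqs[np.argmax(mags)])  # ~5.0
```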
The example ergonomic analysis controller 110 of
The user profile generator 208 generates one or more user profile(s) 212 for the user 102 based on the analysis of the sensor data 201, 202, 204. For example, the user profile generator 208 can generate a first user profile 212 including heart rate data for the user 102 collected over time. The user profile generator 208 can generate a second user profile 212 including muscle tension detected by the strain sensor(s) 106, 107 during movement of one or more portion(s) of the body of the user 102 over time. The example user profile generator 208 of
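For illustration only, one plausible in-memory layout for the user profile(s) 212 is sketched below as a per-metric time series with a historical baseline; the field names are hypothetical.

```python
# Hypothetical per-user, per-metric profile holding sensor samples over time.
from dataclasses import dataclass, field

@dataclass
class UserProfile:
    user_id: str
    metric: str                                  # e.g., "heart_rate" or "shoulder_tension"
    samples: list = field(default_factory=list)  # (timestamp, value) pairs over time

    def add(self, timestamp: float, value: float) -> None:
        self.samples.append((timestamp, value))

    def baseline(self) -> float:
        """Historical average used as reference data for the user."""
        values = [v for _, v in self.samples]
        return sum(values) / len(values) if values else 0.0
```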
The example ergonomic analysis controller 110 of
In the example of
In the example of
In some examples, the database 220 of the population data aggregator 114 stores individual user profiles and/or sensor data associated with respective users (e.g., the user 102 of
The example ergonomic analysis controller 110 of
In some examples, the performance analyzer 224 executes neural network model(s) to determine a likelihood of the user 102 experiencing a musculoskeletal strain event with respect to one or more portions of the body of the user 102.
Artificial intelligence (AI), including machine learning (ML), deep learning (DL), and/or other artificial machine-driven logic, enables machines (e.g., computers, logic circuits, etc.) to use a model to process input data to generate an output based on patterns and/or associations previously learned by the model via a training process. For instance, the model may be trained with data to recognize patterns and/or associations and follow such patterns and/or associations when processing input data such that other input(s) result in output(s) consistent with the recognized patterns and/or associations.
In general, implementing a ML/AI system involves two phases, a learning/training phase and an inference phase. In the learning/training phase, a training algorithm is used to train a model to operate in accordance with patterns and/or associations based on, for example, training data. In general, the model includes internal parameters that guide how input data is transformed into output data, such as through a series of nodes and connections within the model to transform input data into output data. Additionally, hyperparameters are used as part of the training process to control how the learning is performed (e.g., a learning rate, a number of layers to be used in the machine learning model, etc.). Hyperparameters are defined to be training parameters that are determined prior to initiating the training process.
Different types of training may be performed based on the type of ML/AI model and/or the expected output. For example, supervised training uses inputs and corresponding expected (e.g., labeled) outputs to select parameters (e.g., by iterating over combinations of select parameters) for the ML/AI model that reduce model error. As used herein, labelling refers to an expected output of the machine learning model (e.g., a classification, an expected output value, etc.). Alternatively, unsupervised training (e.g., used in deep learning, a subset of machine learning, etc.) involves inferring patterns from inputs to select parameters for the ML/AI model (e.g., without the benefit of expected (e.g., labeled) outputs).
Training is performed using training data. In examples disclosed herein, the training data originates from previously generated sensor data (e.g., biosensor data, strain sensor data such as EMG data or fabric stretch sensor data, image data of user(s) performing different movement(s), user parameter data (e.g., weight, gender), motion capture sensor data, etc.) associated with user(s) who have experienced a musculoskeletal injury to a portion of his or her body (e.g., shoulder, knee, arm, back, neck). Because supervised training is used, the training data is labeled.
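As a condensed, hypothetical sketch of the supervised approach described above, the following trains a small feed-forward network on labeled sensor-feature vectors (1 = strain event, 0 = no event); the architecture and hyperparameters are illustrative choices, not the disclosure's.

```python
# Illustrative supervised training loop for a binary strain-event classifier.
import torch
from torch import nn

def train_strain_model(features: torch.Tensor, labels: torch.Tensor,
                       epochs: int = 50, lr: float = 1e-3) -> nn.Module:
    model = nn.Sequential(
        nn.Linear(features.shape[1], 32), nn.ReLU(),
        nn.Linear(32, 1),                 # logit for "strain event"
    )
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.BCEWithLogitsLoss()
    for _ in range(epochs):
        optimizer.zero_grad()
        loss = loss_fn(model(features).squeeze(1), labels.float())
        loss.backward()
        optimizer.step()
    return model
```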
Once training is complete, the model is deployed for use as an executable construct that processes an input and provides an output based on the network of nodes and connections defined in the model. The model(s) are stored at one or more databases (e.g., the database 240 of
Once trained, the deployed model may be operated in an inference phase to process data. In the inference phase, data to be analyzed (e.g., live data) is input to the model, and the model executes to create an output. This inference phase can be thought of as the AI “thinking” to generate the output based on what it learned from the training (e.g., by executing the model to apply the learned patterns and/or associations to the live data). In some examples, input data undergoes pre-processing before being used as an input to the machine learning model. Moreover, in some examples, the output data may undergo post-processing after it is generated by the AI model to transform the output into a useful result (e.g., a display of data, an instruction to be executed by a machine, etc.).
In some examples, output of the deployed model may be captured and provided as feedback. By analyzing the feedback, an accuracy of the deployed model can be determined. If the feedback indicates that the accuracy of the deployed model is less than a threshold or other criterion, training of an updated model can be triggered using the feedback and an updated training data set, hyperparameters, etc., to generate an updated, deployed model.
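A minimal sketch of such a feedback check follows, assuming field outcomes are available to score deployed predictions; the 0.9 accuracy criterion is an invented threshold.

```python
# Illustrative retraining trigger: compare deployed predictions against
# observed outcomes and flag retraining when accuracy falls below a threshold.
def needs_retraining(predictions, outcomes, accuracy_threshold: float = 0.9) -> bool:
    """predictions/outcomes: parallel sequences of 0/1 values from the field."""
    correct = sum(1 for p, o in zip(predictions, outcomes) if p == o)
    accuracy = correct / len(outcomes)
    return accuracy < accuracy_threshold
```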
Referring to
The example first computing system 226 of
The example first computing system 226 of
In the example of
One or more strain event exposure models 238 are generated as a result of the neural network training. For example, a first strain event exposure model 238 can be generated to predict shoulder strain events based on training data associated with shoulder injuries. A second strain event exposure model 238 can be generated to predict knee strain events based on training data associated with knee injuries. The strain event exposure model(s) 238 are stored in a database 240. The databases 236, 240 may be the same storage device or different storage devices.
The performance analyzer 224 of
In some examples, the performance analyzer 224 predicts that the user 102 is at risk for strain event(s) based on the predicted strain event(s) 242 (e.g., based (only) on a prediction generated using real-time sensor data). In other examples, the performance analyzer 224 determines or verifies that the user 102 is at a risk of strain event(s) by comparing the result(s) of the neural network analysis to the user profile(s) 212, previously predicted strain event(s) 242, and/or the population profile(s) 218. For example, execution of the strain event exposure model(s) 238 based on sensor data 201, 202, 204 collected during a first time period may indicate that the user 102 is not experiencing a strain event. However, the performance analyzer 224 may determine that the user 102 is at risk for a strain event based on a comparison of the data 201, 202, 204 collected during the first time period and historical data for the user 102 captured in the user profile(s) 212 indicating changes (e.g., reduction) in user muscle strength over time. Additionally or alternatively, the performance analyzer 224 can determine that the user is experiencing or is likely to experience a strain event based on previously predicted strain event(s) 242, which can indicate that the user 102 is performing a repetitive motion. Thus, the performance analyzer 224 can detect changes indicative of a risk of injury over time based on the neural network analysis and historical data.
In the example of
In the example of
In some examples, the ergonomic analysis controller 110 generates recommendations for improving ergonomic form(s). The example ergonomic analysis controller 110 of
In the example of
In other examples, the ergonomic form measure(s) 246 include reminders to the user 102 to, for example, check his or her posture when performing a movement. In such examples, the mitigation instruction(s) include audio, visual, and/or haptic feedback reminders to cause the user 102 to be aware of his or her body position, a number of times the movement has been performed, etc. Thus, in some examples, the ergonomic form measure(s) 246 are generated independent of the sensor data 201, 202, 204.
The ergonomic form recommendation generator 244 transmits the ergonomic form measure(s) 246 for output by the output device(s) 105, 112. The ergonomic form measure(s) 246 can be presented via audio output(s) (e.g., audio output(s) that include a recommended number of repetitions of a movement to perform) and/or visual output(s) (e.g., visual content in the form of text and/or graphics with respect to a recommended number of repetitions of a movement to perform, an image of a person performing the movement with correct posture, etc.). The ergonomic form recommendation generator 244 can output the ergonomic form measure(s) 246 in response to or independent of the alert(s) generated in response to the prediction of the strain event(s) by the performance analyzer 224.
In some examples, the communicator 214 of the ergonomic analysis controller 110 transmits one or more of the sensor data 201, 202, 204; the user profile(s) 212; and/or the predicted strain event(s) 242 to the output device(s) 112. Also, in some examples, the population data aggregator 114 transmits the population profile(s) 218 to the output device(s) 112. The data can be displayed via user interface(s) accessible by the user 102 and/or by authorized third parties. In some examples, the user interface(s) can display changes over time in the data and/or risk exposure associated with the user 102, compare the user relative to a larger population (e.g., based on the population profile(s) 218), etc.
While an example manner of implementing the ergonomic analysis controller 110 of
While an example manner of implementing the population data aggregator 114 of
While an example manner of implementing the first computing system 226 is illustrated in
A flowchart representative of example hardware logic, machine readable instructions, hardware implemented state machines, and/or any combination thereof for implementing the example population data aggregator 114 is shown in
The machine readable instructions described herein may be stored in one or more of a compressed format, an encrypted format, a fragmented format, a compiled format, an executable format, a packaged format, etc. Machine readable instructions as described herein may be stored as data or a data structure (e.g., portions of instructions, code, representations of code, etc.) that may be utilized to create, manufacture, and/or produce machine executable instructions. For example, the machine readable instructions may be fragmented and stored on one or more storage devices and/or computing devices (e.g., servers) located at the same or different locations of a network or collection of networks (e.g., in the cloud, in edge devices, etc.). The machine readable instructions may require one or more of installation, modification, adaptation, updating, combining, supplementing, configuring, decryption, decompression, unpacking, distribution, reassignment, compilation, etc. in order to make them directly readable, interpretable, and/or executable by a computing device and/or other machine. For example, the machine readable instructions may be stored in multiple parts, which are individually compressed, encrypted, and stored on separate computing devices, wherein the parts when decrypted, decompressed, and combined form a set of executable instructions that implement one or more functions that may together form a program such as that described herein.
In another example, the machine readable instructions may be stored in a state in which they may be read by processor circuitry, but require addition of a library (e.g., a dynamic link library (DLL)), a software development kit (SDK), an application programming interface (API), etc. in order to execute the instructions on a particular computing device or other device. In another example, the machine readable instructions may need to be configured (e.g., settings stored, data input, network addresses recorded, etc.) before the machine readable instructions and/or the corresponding program(s) can be executed in whole or in part. Thus, machine readable media, as used herein, may include machine readable instructions and/or program(s) regardless of the particular format or state of the machine readable instructions and/or program(s) when stored or otherwise at rest or in transit.
The machine readable instructions described herein can be represented by any past, present, or future instruction language, scripting language, programming language, etc. For example, the machine readable instructions may be represented using any of the following languages: C, C++, Java, C#, Perl, Python, JavaScript, HyperText Markup Language (HTML), Structured Query Language (SQL), Swift, etc.
As mentioned above, the example processes of
“Including” and “comprising” (and all forms and tenses thereof) are used herein to be open ended terms. Thus, whenever a claim employs any form of “include” or “comprise” (e.g., comprises, includes, comprising, including, having, etc.) as a preamble or within a claim recitation of any kind, it is to be understood that additional elements, terms, etc. may be present without falling outside the scope of the corresponding claim or recitation. As used herein, when the phrase “at least” is used as the transition term in, for example, a preamble of a claim, it is open-ended in the same manner as the term “comprising” and “including” are open ended. The term “and/or” when used, for example, in a form such as A, B, and/or C refers to any combination or subset of A, B, C such as (1) A alone, (2) B alone, (3) C alone, (4) A with B, (5) A with C, (6) B with C, and (7) A with B and with C. As used herein in the context of describing structures, components, items, objects and/or things, the phrase “at least one of A and B” is intended to refer to implementations including any of (1) at least one A, (2) at least one B, and (3) at least one A and at least one B. Similarly, as used herein in the context of describing structures, components, items, objects and/or things, the phrase “at least one of A or B” is intended to refer to implementations including any of (1) at least one A, (2) at least one B, and (3) at least one A and at least one B. As used herein in the context of describing the performance or execution of processes, instructions, actions, activities and/or steps, the phrase “at least one of A and B” is intended to refer to implementations including any of (1) at least one A, (2) at least one B, and (3) at least one A and at least one B. Similarly, as used herein in the context of describing the performance or execution of processes, instructions, actions, activities and/or steps, the phrase “at least one of A or B” is intended to refer to implementations including any of (1) at least one A, (2) at least one B, and (3) at least one A and at least one B.
As used herein, singular references (e.g., “a”, “an”, “first”, “second”, etc.) do not exclude a plurality. The term “a” or “an” entity, as used herein, refers to one or more of that entity. The terms “a” (or “an”), “one or more”, and “at least one” can be used interchangeably herein. Furthermore, although individually listed, a plurality of means, elements or method actions may be implemented by, e.g., a single unit or processor. Additionally, although individual features may be included in different examples or claims, these may possibly be combined, and the inclusion in different examples or claims does not imply that a combination of features is not feasible and/or advantageous.
The population data aggregator 114 assigns classifications to the data (e.g., the sensor data 201, 202, 204, the user profile(s) 212) based on the population data classification rule(s) 222 (block 304). For example, the population data aggregator 114 can classify biosensor data received from a user based on data type (e.g., heart rate data) and user properties (e.g., age, gender). As another example, the population data aggregator 114 can classify environmental data based on data type (e.g., image data) and environment type (e.g., factory, office, etc.).
The population data aggregator 114 aggregates data from two or more users based on the classifications to generate the population profile(s) 218 (block 306). The population profile(s) 218 are stored in the database 220 associated with the population data aggregator 114.
If additional sensor data and/or user profile data is received from user(s), the population data aggregator 114 continues to classify and aggregate the data to generate and/or update the population profile(s) 218 (block 308). The instructions 300 of
The example instructions 400 begin with the training controller 232 accessing sensor data and/or profile data associated with user(s) and/or population(s) stored in the database 236 (block 402). The sensor data can include, for example, one or more of previously generated biosensor data 201, strain sensor data 202, environmental data 204, user profile(s) 212, and/or population profile(s) 218. In some examples, the data includes the previously predicted strain event(s) 242 generated by the performance analyzer 224 as part of feedback training. In some examples, the sensor data is associated with a particular portion of the body of interest with respect to strain events, such as a shoulder, a knee, a wrist, neck, back, etc.
The example training controller 232 labels the data as indicative of strain event(s) (block 404). For example, when the sensor data includes image data of a user performing a movement, the training controller 232 labels the image(s) corresponding to the user in a position in which one or more portions of the user's body is stressed and/or strained such that an injury could occur. As another example, the training controller 232 labels muscle tension data with thresholds for detecting strain events based on, for example, previously generated or known reference data including, for instance, anthropometric data, population data generated by the population data aggregator 114, etc.
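For illustration, a minimal sketch of threshold-based labeling of the kind described above; the threshold value is assumed to come from reference or population data.

```python
# Illustrative labeling of muscle-tension samples against a reference
# threshold; samples at or above the threshold are labeled as strain events.
def label_tension_samples(samples, strain_threshold: float):
    """Return (sample, label) pairs; label 1 marks a strain event."""
    return [(s, 1 if s >= strain_threshold else 0) for s in samples]

# Example with an assumed threshold drawn from reference/population data.
labeled = label_tension_samples([0.31, 0.72, 0.55, 0.81], strain_threshold=0.7)
# -> [(0.31, 0), (0.72, 1), (0.55, 0), (0.81, 1)]
```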
The example training controller 232 generates the training data 234 based on the labeled sensor data (block 406).
The example training controller 232 instructs the neural network trainer 230 to perform training of the neural network 228 using the training data 234 (block 408). In the example of
The example instructions 500 begin with the ergonomic analysis controller 110 accessing sensor data associated with a user (e.g., the user 102 of
The performance analyzer 224 predicts a likelihood that the user is experiencing or is likely to experience a strain event with respect to one or more portions of the body of the user (block 506). In some examples, the performance analyzer 224 executes the strain event exposure model(s) 238 based on one or more of the sensor data 201, 202, 204 to predict the likelihood that the user is experiencing or is likely to experience a strain event with respect to one or more portions of the body of the user. In some examples, the performance analyzer 224 verifies the predicted strain event(s) 242 in view of the user profile(s) 212 and/or the population profile(s) 218 (block 508). For example, the performance analyzer 224 can confirm a likelihood that the user is experiencing a strain event or is likely to experience a strain event by comparing the sensor data 201, 202, 204 used in the neural network analysis to the historical or baseline user profile data 212. In some examples, the performance analyzer 224 verifies the prediction of the strain event(s) by comparing the sensor data 201, 202, 204 to the population profile(s) 218 generated by the population data aggregator 114.
Additionally or alternatively, at block 506, the performance analyzer 224 can predict a likelihood that the user is experiencing or is likely to experience a strain event with respect to one or more portions of the body of the user by mapping parameter(s) of the user such as weight, age, gender, etc. to the population profile(s) 218 to determine, for instance, an average or optimal amount of weight to be lifted by the user based on average data for other users having similar profiles; an optimal range of motion of a shoulder of the user based on other users having similar medical conditions such as arthritis, etc. The performance analyzer 224 can compare the average or optimal ergonomic data from the population profile(s) with the sensor data 201, 202, 204 to determine if the user is experiencing or is likely to experience a strain event.
If the performance analyzer 224 predicts a likelihood of strain event(s) for one or more portions of the user's body (block 510), the performance analyzer 224 instructs the output device(s) 112 (e.g., a smartphone) to output alert(s) to alert the user as to the strain event(s) (block 512). The alert(s) can include audio, visual, and/or haptic feedback alert(s).
In the example of
The ergonomic form recommendation generator 244 instructs the output device(s) 112 to output the ergonomic form measure(s) 246 for presentation to the user (block 516). The ergonomic form measure(s) 246 can be presented in visual and/or audio format, for example.
The ergonomic analysis controller 110 continues to update the user profile data 212, predict a likelihood of strain event(s), and provide ergonomic form measure(s) as additional sensor data is received by the ergonomic analysis controller 110 (block 518). The example instructions 500 of
The processor platform 600 of the illustrated example includes a processor 612. The processor 612 of the illustrated example is hardware. For example, the processor 612 can be implemented by one or more integrated circuits, logic circuits, microprocessors, GPUs, DSPs, or controllers from any desired family or manufacturer. The hardware processor may be a semiconductor based (e.g., silicon based) device. In this example, the processor implements the example population data aggregator 114.
The processor 612 of the illustrated example includes a local memory 613 (e.g., a cache). The processor 612 of the illustrated example is in communication with a main memory including a volatile memory 614 and a non-volatile memory 616 via a bus 618. The volatile memory 614 may be implemented by Synchronous Dynamic Random Access Memory (SDRAM), Dynamic Random Access Memory (DRAM), RAMBUS® Dynamic Random Access Memory (RDRAM®) and/or any other type of random access memory device. The non-volatile memory 616 may be implemented by flash memory and/or any other desired type of memory device. Access to the main memory 614, 616 is controlled by a memory controller.
The processor platform 600 of the illustrated example also includes an interface circuit 620. The interface circuit 620 may be implemented by any type of interface standard, such as an Ethernet interface, a universal serial bus (USB), a Bluetooth® interface, a near field communication (NFC) interface, and/or a PCI express interface.
In the illustrated example, one or more input devices 622 are connected to the interface circuit 620. The input device(s) 622 permit(s) a user to enter data and/or commands into the processor 612. The input device(s) can be implemented by, for example, an audio sensor, a microphone, a camera (still or video), a keyboard, a button, a mouse, a touchscreen, a track-pad, a trackball, isopoint and/or a voice recognition system.
One or more output devices 624 are also connected to the interface circuit 620 of the illustrated example. The output devices 624 can be implemented, for example, by display devices (e.g., a light emitting diode (LED), an organic light emitting diode (OLED), a liquid crystal display (LCD), a cathode ray tube display (CRT), an in-place switching (IPS) display, a touchscreen, etc.), a tactile output device, a printer and/or speaker. The interface circuit 620 of the illustrated example, thus, typically includes a graphics driver card, a graphics driver chip and/or a graphics driver processor.
The interface circuit 620 of the illustrated example also includes a communication device such as a transmitter, a receiver, a transceiver, a modem, a residential gateway, a wireless access point, and/or a network interface to facilitate exchange of data with external machines (e.g., computing devices of any kind) via a network 626. The communication can be via, for example, an Ethernet connection, a digital subscriber line (DSL) connection, a telephone line connection, a coaxial cable system, a satellite system, a line-of-sight wireless system, a cellular telephone system, etc.
The processor platform 600 of the illustrated example also includes one or more mass storage devices 628 for storing software and/or data. Examples of such mass storage devices 628 include floppy disk drives, hard drive disks, compact disk drives, Blu-ray disk drives, redundant array of independent disks (RAID) systems, and digital versatile disk (DVD) drives.
Coded instructions 632 of
The processor platform 700 of the illustrated example includes a processor 712. The processor 712 of the illustrated example is hardware. For example, the processor 712 can be implemented by one or more integrated circuits, logic circuits, microprocessors, GPUs, DSPs, or controllers from any desired family or manufacturer. The hardware processor may be a semiconductor based (e.g., silicon based) device. In this example, the processor implements the example neural network processor 228, the example trainer 230, and the example training controller 232.
The processor 712 of the illustrated example includes a local memory 713 (e.g., a cache). The processor 712 of the illustrated example is in communication with a main memory including a volatile memory 714 and a non-volatile memory 716 via a bus 718. The volatile memory 714 may be implemented by Synchronous Dynamic Random Access Memory (SDRAM), Dynamic Random Access Memory (DRAM), RAMBUS® Dynamic Random Access Memory (RDRAM®) and/or any other type of random access memory device. The non-volatile memory 716 may be implemented by flash memory and/or any other desired type of memory device. Access to the main memory 714, 716 is controlled by a memory controller.
The processor platform 700 of the illustrated example also includes an interface circuit 720. The interface circuit 720 may be implemented by any type of interface standard, such as an Ethernet interface, a universal serial bus (USB), a Bluetooth® interface, a near field communication (NFC) interface, and/or a PCI express interface.
In the illustrated example, one or more input devices 722 are connected to the interface circuit 720. The input device(s) 722 permit(s) a user to enter data and/or commands into the processor 712. The input device(s) can be implemented by, for example, an audio sensor, a microphone, a camera (still or video), a keyboard, a button, a mouse, a touchscreen, a track-pad, a trackball, isopoint and/or a voice recognition system.
One or more output devices 724 are also connected to the interface circuit 720 of the illustrated example. The output devices 724 can be implemented, for example, by display devices (e.g., a light emitting diode (LED), an organic light emitting diode (OLED), a liquid crystal display (LCD), a cathode ray tube display (CRT), an in-place switching (IPS) display, a touchscreen, etc.), a tactile output device, a printer and/or speaker. The interface circuit 720 of the illustrated example, thus, typically includes a graphics driver card, a graphics driver chip and/or a graphics driver processor.
The interface circuit 720 of the illustrated example also includes a communication device such as a transmitter, a receiver, a transceiver, a modem, a residential gateway, a wireless access point, and/or a network interface to facilitate exchange of data with external machines (e.g., computing devices of any kind) via a network 726. The communication can be via, for example, an Ethernet connection, a digital subscriber line (DSL) connection, a telephone line connection, a coaxial cable system, a satellite system, a line-of-sight wireless system, a cellular telephone system, etc.
The processor platform 700 of the illustrated example also includes one or more mass storage devices 728 for storing software and/or data. Examples of such mass storage devices 728 include floppy disk drives, hard drive disks, compact disk drives, Blu-ray disk drives, redundant array of independent disks (RAID) systems, and digital versatile disk (DVD) drives.
Coded instructions 732 of
The processor platform 800 of the illustrated example includes a processor 812. The processor 812 of the illustrated example is hardware. For example, the processor 812 can be implemented by one or more integrated circuits, logic circuits, microprocessors, GPUs, DSPs, or controllers from any desired family or manufacturer. The hardware processor may be a semiconductor based (e.g., silicon based) device. In this example, the processor implements the example signal modifier 206, example user profile generator 208, the example communicator 214, the example performance analyzer 224, and the example ergonomic form recommendation generator 244.
The processor 812 of the illustrated example includes a local memory 813 (e.g., a cache). The processor 812 of the illustrated example is in communication with a main memory including a volatile memory 814 and a non-volatile memory 816 via a bus 818. The volatile memory 814 may be implemented by Synchronous Dynamic Random Access Memory (SDRAM), Dynamic Random Access Memory (DRAM), RAMBUS® Dynamic Random Access Memory (RDRAM®) and/or any other type of random access memory device. The non-volatile memory 816 may be implemented by flash memory and/or any other desired type of memory device. Access to the main memory 814, 816 is controlled by a memory controller.
The processor platform 800 of the illustrated example also includes an interface circuit 820. The interface circuit 820 may be implemented by any type of interface standard, such as an Ethernet interface, a universal serial bus (USB), a Bluetooth® interface, a near field communication (NFC) interface, and/or a PCI express interface.
In the illustrated example, one or more input devices 822 are connected to the interface circuit 820. The input device(s) 822 permit(s) a user to enter data and/or commands into the processor 812. The input device(s) can be implemented by, for example, an audio sensor, a microphone, a camera (still or video), a keyboard, a button, a mouse, a touchscreen, a track-pad, a trackball, isopoint and/or a voice recognition system.
One or more output devices 824 are also connected to the interface circuit 820 of the illustrated example. The output devices 824 can be implemented, for example, by display devices (e.g., a light emitting diode (LED), an organic light emitting diode (OLED), a liquid crystal display (LCD), a cathode ray tube display (CRT), an in-place switching (IPS) display, a touchscreen, etc.), a tactile output device, a printer and/or speaker. The interface circuit 820 of the illustrated example, thus, typically includes a graphics driver card, a graphics driver chip and/or a graphics driver processor.
The interface circuit 820 of the illustrated example also includes a communication device such as a transmitter, a receiver, a transceiver, a modem, a residential gateway, a wireless access point, and/or a network interface to facilitate exchange of data with external machines (e.g., computing devices of any kind) via a network 826. The communication can be via, for example, an Ethernet connection, a digital subscriber line (DSL) connection, a telephone line connection, a coaxial cable system, a satellite system, a line-of-sight wireless system, a cellular telephone system, etc.
The processor platform 800 of the illustrated example also includes one or more mass storage devices 828 for storing software and/or data. Examples of such mass storage devices 828 include floppy disk drives, hard drive disks, compact disk drives, Blu-ray disk drives, redundant array of independent disks (RAID) systems, and digital versatile disk (DVD) drives.
Coded instructions 832 of
From the foregoing, it will be appreciated that example systems, methods, apparatus, and articles of manufacture have been disclosed that predict a likelihood that a user is experiencing or is likely to experience a musculoskeletal strain event at one or more portions of the user's body (e.g., shoulder, knee, wrist, arm, back, neck) and alert the user in response to the predicted strain event(s). Examples disclosed herein implement neural network model(s) to predict the likelihood of the strain event(s) based on sensor data associated with the user, such as biosensor data, strain sensor data, and/or data from sensor(s) in the environment in which the user is located that capture data associated with the user (e.g., image data of the user) and/or conditions in the environment (e.g., vibrations). Example neural network(s) are developed and/or refined based on data collected from the user and/or other user(s) over time. As such, rather than relying on static reference data, examples disclosed herein dynamically respond to changes in user and/or movement characteristics to provide customized detection of strain event(s) and optimized ergonomic form recommendations for the user.
Example systems, apparatus, and methods for musculoskeletal ergonomic improvement are disclosed herein. Further examples and combinations thereof include the following:
Clause 1 includes an apparatus including a performance analyzer to predict a musculoskeletal strain event for a portion of a body of a user based on strain sensor data collected via one or more strain sensors associated with the user; and transmit, in response to the prediction of the musculoskeletal strain event, an instruction including an alert to be output by an output device; and an ergonomic form recommendation generator to transmit, in response to the prediction of the musculoskeletal strain event, an instruction including an ergonomic form measure to be output by the output device.
Clause 2 includes the apparatus of clause 1, further including an aggregator to aggregate the strain sensor data for the user with strain sensor data for a population of users to generate a population profile, the performance analyzer to predict the musculoskeletal strain event based on the population profile.
Clause 3 includes the apparatus of clauses 1 or 2, wherein the performance analyzer is to predict the musculoskeletal strain event based on biosensor data collected via one or more biosensors associated with the user.
Clause 4 includes the apparatus of any of clauses 1-3, further including a user profile generator to generate a user profile for the user based on the strain sensor data, the user profile including the strain sensor data and historical sensor data for the user, the ergonomic form recommendation generator to generate the ergonomic form measure based on the user profile.
Clause 5 includes the apparatus of any of clauses 1-4, wherein the ergonomic form recommendation generator is to further generate the ergonomic form measure based on a rule defining a threshold associated with movement by the user.
Clause 6 includes the apparatus of any of clauses 1-5, wherein the performance analyzer is to execute a neural network model to predict the musculoskeletal strain event.
Clause 7 includes the apparatus of any of clauses 1-6, wherein the alert includes one or more of a visual alert, an audio alert, or a haptic feedback alert.
Clause 8 includes a system including a first sensor; and an ergonomic analysis controller to execute a neural network model to predict a musculoskeletal strain event for a user based on first sensor data generated by the first sensor; generate an ergonomic form measure for the user based on the first sensor data; and cause an output device to present the ergonomic form measure in response to the prediction of the musculoskeletal strain event.
Clause 9 includes the system of clause 8, wherein the first sensor data includes strain sensor data.
Clause 10 includes the system of clauses 8 or 9, wherein the first sensor is carried by a wearable fabric.
Clause 11 includes the system of any of clauses 8-10, further including a second sensor, the second sensor including a camera to capture image data of the user in an environment, the ergonomic analysis controller to update a user profile for the user based on one or more of the strain sensor data or the image data, the user profile including historical sensor data for the user; and generate the ergonomic form measure based on the user profile.
Clause 12 includes the system of any of clauses 8-11, wherein the ergonomic analysis controller is to verify the prediction of the musculoskeletal strain event based on reference sensor data for the user.
Clause 13 includes the system of any of clauses 8-12, wherein the user is a first user and further including an aggregator, the ergonomic analysis controller to transmit the first sensor data to the aggregator, the aggregator to aggregate the first sensor data with second sensor data associated with a second user to generate a population profile.
Clause 14 includes the system of any of clauses 8-13, wherein one or more of the neural network model or the ergonomic form measure is based on the population profile.
Clause 15 includes a non-transitory computer readable medium comprising instructions that, when executed by at least one processor, cause the at least one processor to predict a musculoskeletal strain event based on sensor data generated in response to movement by a user; and transmit, in response to the prediction of the musculoskeletal strain event, an instruction including an ergonomic form measure to be output by an output device.
Clause 16 includes the non-transitory computer readable medium of clause 15, wherein the instructions, when executed, cause the at least one processor to execute a neural network model to predict the musculoskeletal strain event.
Clause 17 includes the non-transitory computer readable medium of clauses 15 or 16, wherein the neural network model is trained to generate the prediction for a shoulder of the user.
Clause 18 includes the non-transitory computer readable medium of any of clauses 15-17, wherein the instructions, when executed, cause the at least one processor to generate a user profile for the user based on the sensor data, the user profile including the sensor data and historical sensor data for the user; and generate the ergonomic form measure based on the user profile.
Clause 19 includes the non-transitory computer readable medium of any of clauses 15-18, wherein the ergonomic form measure is to include an instruction for the user with respect to the movement.
Clause 20 includes the non-transitory computer readable medium of any of clauses 15-19, wherein the sensor data is first sensor data, the user is a first user, and the instructions, when executed, cause the at least one processor to aggregate the first sensor data with second sensor data associated with a second user to generate a population profile, the ergonomic form measure to be based on the population profile.
Clause 21 includes a method including predicting a musculoskeletal strain event for a portion of a body of a user based on strain sensor data collected via one or more strain sensors associated with the user; and transmitting, in response to the prediction of the musculoskeletal strain event, an instruction including an alert and an ergonomic form measure to be output by an output device.
Clause 22 includes the method of clause 21, further including aggregating the strain sensor data for the user with strain sensor data for a population of users to generate a population profile; and predicting the musculoskeletal strain event based on the population profile.
Clause 23 includes the method of clauses 21 or 22, wherein the predicting of the musculoskeletal strain event is based on biosensor data collected via one or more biosensors associated with the user.
Clause 24 includes the method of any of clauses 21-23, further including generating a user profile for the user based on the strain sensor data, the user profile including the strain sensor data and historical sensor data for the user; and generating the ergonomic form measure based on the user profile.
Clause 25 includes the method of any of clauses 21-24, further including generating the ergonomic form measure based on a rule defining a threshold associated with movement by the user.
Clause 26 includes the method of any of clauses 21-25, wherein the predicting the musculoskeletal strain event includes executing a neural network model.
Although certain example methods, apparatus and articles of manufacture have been disclosed herein, the scope of coverage of this patent is not limited thereto. On the contrary, this patent covers all methods, apparatus and articles of manufacture fairly falling within the scope of the claims of this patent.
The following claims are hereby incorporated into this Detailed Description by this reference, with each claim standing on its own as a separate embodiment of the present disclosure.