Patients may use treatment apparatuses for any suitable purpose, such as rehabilitation of a body part, pre-habilitation of a body part, strengthening a body part, exercising a body part, and the like.
A method is disclosed. The method includes controlling, while a patient uses a treatment apparatus and based on a treatment plan for the patient, the treatment apparatus. The method includes receiving, by a processing device, data from an electronic device, wherein the data comprises one of a position of a body part of the patient or a force exerted by the body part. The method includes storing, via the processing device, the data for the patient in a computer-readable medium. The method includes causing, via the processing device, presentation of a user interface on a patient interface. The user interface comprises an adjustment confirmation control, and the adjustment confirmation control is configured to solicit a response regarding the patient's comfort level with the one of the position of the body part or the force exerted by the body part.
A computer-implemented system for physical rehabilitation is provided. The computer-implemented system comprises a clinician interface including a patient profile display configured to present data regarding performance, by a patient, of a regimen for a body part, the body part comprising at least one of a joint, a bone, or a muscle group. The computer-implemented system also comprises a sensor configured to measure one of a position of the body part or a force exerted by the body part. The computer-implemented system also comprises a patient interface including an output device and an input device for communicating information regarding the performance of the regimen, respectively to and from the patient. The patient interface is configured to present instructions and status information to the patient regarding the performance of the regimen. The patient interface is configured to present an adjustment confirmation control configured to solicit a response regarding the patient's comfort or discomfort with the one of the position of the body part or the force exerted by the body part.
A system for remote treatment is also provided. The system for remote treatment comprises: a clinician interface configured to present controls for modifying a treatment plan comprising a regimen for treatment of a body part of a patient, with the body part comprising at least one of a joint, a bone, or a muscle group. The system also comprises a treatment apparatus for performing the regimen upon the body part, the treatment apparatus is configured to be manipulated by the patient. The system also comprises a patient interface including an output device and an input device for communicating information regarding the performance of the regimen, respectively to and from the patient. The patient interface and the treatment apparatus are each configured to enable operation from a patient location geographically separate from a location of the clinician interface. The patient interface is configured to present an adjustment confirmation control configured to solicit a response regarding the patient's comfort level with one of a position of the body part or a force exerted by the body part.
A patient user interface generated by a computer is also provided. The patient user interface comprises a session period action screen configured to present real-time status of a measurement regarding a patient's use of a treatment apparatus for performing a regimen for a body part, the body part comprising at least one of a joint, a bone, or a muscle group. The patient user interface also comprises an adjustment confirmation control configured to solicit a response regarding the patient's comfort level with one of a position of the body part or a force exerted by the body part. The measurement regarding the patient's use of the treatment apparatus includes the one of the position of the body part or the force exerted by the body part.
For a detailed description of example embodiments, reference will now be made to the accompanying drawings in which:
Various terms are used to refer to particular system components. Different companies may refer to a component by different names—this document does not intend to distinguish between components that differ in name but not function. In the following discussion and in the claims, the terms “including” and “comprising” are used in an open-ended fashion, and thus should be interpreted to mean “including, but not limited to . . . .” Also, the term “couple” or “couples” is intended to mean either an indirect or direct connection. Thus, if a first device couples to a second device, that connection may be through a direct connection or through an indirect connection via other devices and connections.
The terminology used herein is for the purpose of describing particular example embodiments only, and is not intended to be limiting. As used herein, the singular forms “a,” “an,” and “the” may be intended to include the plural forms as well, unless the context clearly indicates otherwise. The method steps, processes, and operations described herein are not to be construed as necessarily requiring their performance in the particular order discussed or illustrated, unless specifically identified as an order of performance. It is also to be understood that additional or alternative steps may be employed.
The terms first, second, third, etc. may be used herein to describe various elements, components, regions, layers and/or sections; however, these elements, components, regions, layers and/or sections should not be limited by these terms. These terms may be only used to distinguish one element, component, region, layer, or section from another region, layer, or section. Terms such as “first,” “second,” and other numerical terms, when used herein, do not imply a sequence or order unless clearly indicated by the context. Thus, a first element, component, region, layer, or section discussed below could be termed a second element, component, region, layer, or section without departing from the teachings of the example embodiments. The phrase “at least one of,” when used with a list of items, means that different combinations of one or more of the listed items may be used, and only one item in the list may be needed. For example, “at least one of: A, B, and C” includes any of the following combinations: A, B, C, A and B, A and C, B and C, and A and B and C. In another example, the phrase “one or more” when used with a list of items means there may be one item or any suitable number of items exceeding one.
Spatially relative terms, such as “inner,” “outer,” “beneath,” “below,” “lower,” “above,” “upper,” “top,” “bottom,” and the like, may be used herein. These spatially relative terms can be used for ease of description to describe one element's or feature's relationship to another element(s) or feature(s) as illustrated in the figures. The spatially relative terms may also be intended to encompass different orientations of the device in use, or operation, in addition to the orientation depicted in the figures. For example, if the device in the figures is turned over, elements described as “below” or “beneath” other elements or features would then be oriented “above” the other elements or features. Thus, the example term “below” can encompass both an orientation of above and below. The device may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptions used herein interpreted accordingly.
The following discussion is directed to various embodiments of the disclosure. Although one or more of these embodiments may be preferred, the embodiments disclosed should not be interpreted, or otherwise used, as limiting the scope of the disclosure, including the claims. In addition, one skilled in the art will understand that the following description has broad application, and the discussion of any embodiment is meant only to be exemplary of that embodiment, and not intended to intimate that the scope of the disclosure, including the claims, is limited to that embodiment.
The system 10 includes a clinician interface 20 for a clinician, such as a doctor, a nurse, a physical therapist, or a technician, to use to review and to configure various aspects of a treatment plan for use in treating a patient. The clinician interface 20 includes a clinician input device 22 and a clinician display 24, which may be collectively called a clinician user interface 22, 24. The clinician input device 22 may include one or more of a keyboard, a mouse, a trackpad, or a touch screen, for example. Alternatively or additionally, the clinician input device 22 may include one or more microphones and voice-based functionalities, with hardware and/or software configured to interpret spoken instructions by the clinician by using the one or more microphones. The clinician input device 22 may include functionality provided by or similar to existing voice-based assistants such as Siri by Apple, Alexa by Amazon, Google Assistant, or Bixby by Samsung. The clinician input device 22 may include other hardware and/or software components. The clinician input device 22 may include one or more general purpose devices and/or special-purpose devices.
The clinician display 24 may take one or more different forms including, for example, a computer monitor or display screen on a tablet, smartphone, or a smart watch. The clinician display 24 may include other hardware and/or software components such as a projector, virtual reality capability, or augmented reality capability etc. The clinician display 24 may incorporate various different visual, audio, or other presentation technologies. For example, the clinician display 24 may include a non-visual display, such as an audio signal, which may include spoken language and/or other sounds such as tones, chimes, and/or melodies which may signal different conditions and/or directions. The clinician display 24 may comprise one or more different display screens presenting various data and/or interfaces or controls for use by the clinician. The clinician display 24 may include graphics, which may be presented by a web-based interface and/or by a computer program or application (App.).
The system 10 also includes a server 30 configured to store and to provide data related to managing the treatment plan. The server 30 may include one or more computers and may take the form of a distributed and/or virtualized computer or computers. In some embodiments, the server 30 may generate aspects of the clinician display 24 for presentation by the clinician interface 20. For example, the server 30 may include a web server configured to generate the display screens for presentation upon the clinician display 24. In some embodiments, the clinician display 24 may be configured to present a virtualized desktop that is hosted by the server 30. The server 30 also includes a first communication interface 32 configured to communicate with the clinician interface 20 via a first network 34. In some embodiments, the first network 34 may include a local area network (LAN), such as an Ethernet network. In some embodiments, the first network 34 may include the Internet, and communications between the server 30 and the clinician interface 20 may be secured via encryption, such as, for example, by using a virtual private network (VPN). In some embodiments, the first network 34 may include wired and/or wireless network connections such as Wi-Fi, Bluetooth, ZigBee, Near-Field Communications (NFC), cellular data network, etc. The server 30 includes a first processor 36 and a first machine-readable storage memory 38, which may be called a “memory” for short, holding first instructions 40 for performing the various actions of the server 30 for execution by the first processor 36. The server 30 is configured to store data regarding the treatment plan. For example, the memory 38 includes a system data store 42 configured to hold system data, such as data pertaining to treatment plans for treating one or more patients. The server 30 is also configured to store data regarding performance by a patient in following a treatment plan. For example, the memory 38 includes a patient data store 44 configured to hold patient data, such as data pertaining to the one or more patients, including data representing each patient's performance within the treatment plan.
The system 10 also includes a patient interface 50 configured to communicate information to a patient and to receive feedback from the patient. Specifically, the patient interface 50 includes an input device 52 and an output device 54, which may be collectively called a patient user interface 52, 54. The input device 52 may include one or more devices, such as a keyboard, a mouse, a touch screen input, a gesture sensor, and/or a microphone and processor configured for voice recognition. The output device 54 may take one or more different forms including, for example, a computer monitor or display screen on a tablet, smartphone, or a smart watch. The output device 54 may include other hardware and/or software components such as a projector, virtual reality capability, augmented reality capability, etc. The output device 54 may incorporate various different visual, audio, or other presentation technologies. For example, the output device 54 may include a non-visual display, such as an audio signal, which may include spoken language and/or other sounds such as tones, chimes, and/or melodies, which may signal different conditions and/or directions. The output device 54 may comprise one or more different display screens presenting various data and/or interfaces or controls for use by the patient. The output device 54 may include graphics, which may be presented by a web-based interface and/or by a computer program or application (App.).
As shown in
The patient interface 50 includes a second processor 60 and a second machine-readable storage memory 62 holding second instructions 64 for execution by the second processor 60 for performing various actions of patient interface 50. The second machine-readable storage memory 62 also includes a local data store 66 configured to hold data, such as data pertaining to a treatment plan and/or patient data, such as data representing a patient's performance within a treatment plan. The patient interface 50 also includes a local communication interface 68 configured to communicate with various devices for use by the patient in the vicinity of the patient interface 50. The local communication interface 68 may include wired and/or wireless communications. In some embodiments, the local communication interface 68 may include a local wireless network such as Wi-Fi, Bluetooth, ZigBee, Near-Field Communications (NFC), cellular data network, etc.
The system 10 also includes a treatment apparatus 70 configured to be manipulated by the patient and/or to manipulate a body part of the patient for performing activities according to the treatment plan. In some embodiments, the treatment apparatus 70 may take the form of an exercise and rehabilitation apparatus configured to perform and/or to aid in the performance of a rehabilitation regimen, which may be an orthopedic rehabilitation regimen, and the treatment includes rehabilitation of a body part of the patient, such as a joint or a bone or a muscle group. More specifically, the regimen may be a physical rehabilitation regimen for improving strength and/or range of motion of the body part. The body part may include, for example, a spine, a hand, a foot, a knee, or a shoulder. The body part may include a part of a joint, a bone, or a muscle group, such as one or more vertebrae or a ligament. As shown in
The internal sensors 76 may measure one or more operating characteristics of the treatment apparatus 70 such as, for example, a force, a position, a speed, and/or a velocity. In some embodiments, the internal sensors 76 may include a position sensor configured to measure at least one of a linear motion or an angular motion of a body part of the patient. For example, an internal sensor 76 in the form of a position sensor may measure a distance that the patient is able to move a part of the treatment apparatus 70, where such distance may correspond to a range of motion that the patient's body part is able to achieve. In some embodiments, the internal sensors 76 may include a force sensor configured to measure a force applied by the patient. For example, an internal sensor 76 in the form of a force sensor may measure a force or weight the patient is able to apply, using a particular body part, to the treatment apparatus 70.
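By way of a non-limiting illustration (the class and field names below are hypothetical and are not part of the disclosed system), measurements of the kind described above may be represented as simple records from which a quantity such as range of motion can be derived:

```python
from dataclasses import dataclass
from enum import Enum
import time


class MeasurementKind(Enum):
    """Categories of operating characteristics a sensor may report."""
    POSITION = "position"   # e.g., linear or angular motion of a body part
    FORCE = "force"         # e.g., force or weight applied to the apparatus
    SPEED = "speed"


@dataclass
class SensorReading:
    """One measurement taken by an internal or external sensor."""
    kind: MeasurementKind
    value: float            # millimeters, degrees, newtons, or RPM, depending on kind
    unit: str
    timestamp: float        # seconds since the epoch


def range_of_motion(readings):
    """Estimate range of motion as the span of the position readings."""
    positions = [r.value for r in readings if r.kind is MeasurementKind.POSITION]
    return max(positions) - min(positions) if positions else 0.0


# Example: angular positions (degrees) recorded while the patient moves a knee joint.
samples = [SensorReading(MeasurementKind.POSITION, v, "deg", time.time())
           for v in (5.0, 20.0, 42.5, 61.0, 55.0)]
print(range_of_motion(samples))  # 56.0
```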
The system 10 shown in
The system 10 shown in
The system 10 shown in
The system 10 also includes a wearable device 90 configured to be worn or carried on the patient's person. The wearable device 90 may take one of several different forms such as, for example, a smart watch, a wristband, a pendant, or a smartphone. The wearable device 90 may include a means of attachment, such as a pin, a belt clip, a strap, or a lanyard, to facilitate the device's being worn or carried by the patient. In some embodiments, and as shown in
The wearable device 90 includes a wearable input device 92 and a wearable display 94, which may be collectively called a wearable user interface 92, 94. The wearable input device 92 may include one or more devices, such as a keyboard, a mouse, a touch screen input, a gesture sensor, and/or a microphone and processor configured for voice recognition. The wearable display 94 may take one or more different forms including, for example, a display screen, and/or one or more lights or other indicators. The wearable display 94 may incorporate various different visual, audio, or other presentation technologies. For example, the wearable display 94 may include a non-visual display, such as a haptic or tactile device and/or an audio signal, which may include spoken language and/or other sounds such as tones, chimes, and/or melodies, and the non-visual display may signal different conditions and/or directions. The wearable display 94 may comprise one or more different display screens configured to present various data and/or interfaces or controls for use by the patient. The wearable display 94 may include graphics, which may be presented by a web-based interface and/or by a computer program or application (App.). The wearable user interface 92, 94 may be configured to present different types of information to the patient. For example, the wearable user interface 92, 94 may be configured to present a reminder when it is time for the patient to perform a rehabilitation session. The wearable user interface 92, 94 may allow the patient to track daily goals or to receive messages from a clinician, etc. This function of the wearable device 90 may be especially useful when the patient is away from the patient interface 50.
The system 10 shown in
The system 10 shown in
In some embodiments, the patient interface 50 and the treatment apparatus 70 are each configured to operate from a patient location geographically separate from a location of the clinician interface 20. For example, the patient interface 50 and the treatment apparatus 70 may be used as part of an in-home rehabilitation system, which may be monitored remotely by using the clinician interface 20 at a centralized location, such as a clinic or hospital. In some embodiments, either or both of the patient interface 50 and/or the treatment apparatus 70 are configured to communicate with a remote computer, such as the server 30, to receive the treatment plan and to report back to the remote computer with data regarding performance by the patient in following the treatment plan.
The example patient profile display 130 presents information regarding a treatment history of the patient. For example, the example patient profile display 130 includes a plurality of different treatment graphs 136 showing the effect of various treatment parameters over time. The treatment graphs 136 shown in the example patient profile display 130 of
The protocol session control 176 allows the clinician to adjust the number, the order, and the types of the session periods within a given session of the treatment protocol 156. Each session period has a type that corresponds to a category of activity to be performed upon a body part during that session period. For example, the session periods may be one of a passive period, an assisted period, an active period, or a resistance period. Each passive period is associated with a particular activity that includes moving a body part by an external force; each assisted period is associated with a particular activity that includes moving the body part by the patient with assistance of the external force; each active period is associated with a particular activity that includes the patient moving the body part without assistance of the external force; and each resistance period is associated with a particular activity that includes the patient actively moving the body part against a resistance force. For example, where the treatment apparatus 70 includes a stationary cycling machine 100, a passive period may include an actuator 78, such as a motor, that rotates the pedals 108 with the patient's feet and legs attached thereto and without any action or force being applied by the patient. An assisted period may include the patient applying force to rotate the pedals 108 with some additional help or assistance from the actuator 78. An active period may include the patient applying force to rotate the pedals 108 without any assistance from any outside force. A resistance period may include the patient exerting some force to rotate the pedals 108 in opposition to a resistance force applied by the actuator 78. In some embodiments, the actuator 78 may produce the external forces for each of the different categories of the session periods. The external forces may have different attributes, such as directions, intensities, or rates of changes, for each of the different categories of the session periods. Each session may include any number of session periods in any combination.
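By way of a non-limiting illustration, the four categories of session periods described above may be modeled with a simple enumeration; the names and fields below are hypothetical and are shown only to suggest how a protocol might organize its session periods:

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional, Tuple


class PeriodType(Enum):
    PASSIVE = "passive"        # an external force moves the body part
    ASSISTED = "assisted"      # the patient moves the body part with external assistance
    ACTIVE = "active"          # the patient moves the body part without assistance
    RESISTANCE = "resistance"  # the patient moves the body part against a resistance force


@dataclass
class SessionPeriod:
    period_type: PeriodType
    duration_min: float
    target_rpm: Optional[Tuple[int, int]] = None  # e.g., (40, 50) for an active period
    resistance_level: Optional[int] = None


# Example session for a stationary cycling machine: warm-up, work, cool-down.
session = [
    SessionPeriod(PeriodType.PASSIVE, 5),
    SessionPeriod(PeriodType.ACTIVE, 20, target_rpm=(40, 50)),
    SessionPeriod(PeriodType.RESISTANCE, 5, resistance_level=2),
]
print([p.period_type.value for p in session])  # ['passive', 'active', 'resistance']
```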
In some embodiments, the protocol session icons 144 may be modified using a drag-and-drop interface. Additional session periods may be added to the protocol session using a session period control 177. Additionally, parameters for any or all of the session periods may be adjusted using various session parameter controls 178. For example, a duration and direction of each session period may be adjusted using the session parameter controls 178 located below an associated one of the protocol session icons 144. Various other parameters, such as resistance, target speed range (RPM), pedal radius limits, etc. may be adjusted using other session parameter controls 178. In some embodiments, the number and the type of session parameter controls 178 may change depending on the type of session period selected. For example, selecting a protocol session icon 144 for an active type of session period may cause the target speed range (RPM) session parameter control 178 to be visible and adjustable, but the target speed range (RPM) session parameter control 178 may not be visible and/or adjustable in response to selecting a protocol session icon 144 for a passive type session.
In some embodiments, the system 10 may impose limits on values that can be set using the session parameter controls 178. For example, the treatment plan 154 may include a maximum session time. In some embodiments, to satisfy a rule of the system 10 or a rule within the treatment plan 154, one or more of the values of the parameters may be automatically changed by the system 10. For example, the treatment plan 154 may require a resistance type of session period after an active type of session period, wherein the former is at least 25% as long as the active type of session to allow the patient to cool down after active exercise. The system 10 may automatically create the resistance type session period in response to the clinician creating an active type session period. The system 10 may also automatically adjust the time of the resistance type session period to satisfy the requirement of it lasting at least 25% as long as the active type of session.
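A minimal sketch of a rule of this kind, with hypothetical names and under the assumption that session periods are represented as simple records, might automatically append a cool-down resistance period whenever an active period is created:

```python
def append_active_with_cooldown(session, active_duration_min, min_fraction=0.25):
    """When an active period is added, also add a resistance period lasting at least
    min_fraction (e.g., 25%) of the active period, per the treatment-plan rule."""
    cooldown_minutes = active_duration_min * min_fraction
    session.append({"type": "active", "duration_min": active_duration_min})
    session.append({"type": "resistance", "duration_min": cooldown_minutes})
    return session


# Example: a 20-minute active period triggers a 5-minute resistance cool-down.
print(append_active_with_cooldown([], 20))
# [{'type': 'active', 'duration_min': 20}, {'type': 'resistance', 'duration_min': 5.0}]
```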
In some embodiments, the treatment plan 154 may include maximum values for certain parameters until an associated condition is satisfied. For example, the pedal radius limit may be limited to 40 mm until an associated condition is satisfied. Associated conditions may include, for example, approval by an authorized person, such as an orthopedic surgeon; the elapsing of a particular time, such as 5 days after a surgical procedure; or successful completion of a post-operation checkup. Similarly, the treatment plan 154 may place limits on the types of session periods that may be performed until an associated condition is satisfied. The treatment plan 154 may be limited to only passive or assisted session periods (and not active periods or resistance periods) until an associated condition is satisfied. Different associated conditions may be associated with each of the different parameters and/or with limits on the types of session periods available.
In some embodiments, the patient interface 50 presents an adjustment confirmation control configured to solicit a response regarding the patient's comfort level with the position of the body part or the force exerted by the body part. The comfort level may be indicated by a binary selection (e.g., comfortable or not comfortable). In some embodiments, the comfort level may be an analog value that may be indicated numerically or with an analog input control, such as a slider or a rotary knob. In some embodiments, the comfort level may be indicated by one of several different comfort level values, such as an integer number from 1 to 5. In some embodiments, the comfort level may be indicated using controls for the patient to maintain a setting or for the patient to change the setting. More specifically, the adjustment confirmation control for the patient to change the setting may provide for the patient to change the setting in either of two or more directions. For example, the controls may allow the patient to maintain the value of a setting, to increase the value of the setting, or to decrease the value of the setting.
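Purely as an illustrative sketch (the function below is hypothetical and not part of the disclosed system), the several forms of comfort-level response described above may be normalized onto a single internal scale:

```python
def normalize_comfort(response):
    """Map different adjustment-confirmation responses onto a 0.0 to 1.0 scale.

    Accepted forms (all hypothetical examples):
      - binary:    True / False            (comfortable / not comfortable)
      - integer:   1..5                    (discrete comfort levels)
      - analog:    0.0..1.0                (slider or rotary knob position)
      - direction: "decrease" / "stay" / "increase"
    """
    if isinstance(response, bool):
        return 1.0 if response else 0.0
    if isinstance(response, int) and 1 <= response <= 5:
        return (response - 1) / 4.0
    if isinstance(response, float) and 0.0 <= response <= 1.0:
        return response
    if response in ("decrease", "stay", "increase"):
        return {"decrease": 0.25, "stay": 0.5, "increase": 0.75}[response]
    raise ValueError(f"unrecognized comfort response: {response!r}")


print(normalize_comfort(4))       # 0.75
print(normalize_comfort("stay"))  # 0.5
```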
In some embodiments, the patient interface 50 and/or a server may generate and/or present the adjustment confirmation control using one or more machine learning models. The one or more machine learning models may be trained using training data including inputs that are mapped to outputs, such that the machine learning models identify patterns in the data to generate a certain output. The training data may include input data of types and/or arrangements of graphical user interface elements to present that are associated with a higher likelihood of a patient providing feedback. The training data may include input data of values of comfort levels to present that are associated with a higher likelihood of a patient providing feedback. The training data may include input data of values of positions of body parts to present that are associated with a higher likelihood of a patient providing feedback.
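The paragraph above describes supervised training only at a high level. The following is a hedged, illustrative sketch using scikit-learn (an assumption; the disclosure does not name any particular library) in which hypothetical user-interface features are mapped to whether a patient previously provided feedback:

```python
# Hypothetical example: predict which UI presentation is most likely to elicit feedback.
from sklearn.linear_model import LogisticRegression

# Each row: [slider_used, number_of_buttons, suggested_comfort_level, suggested_position_mm]
X = [
    [1, 2, 3, 40.0],
    [0, 3, 4, 45.0],
    [1, 3, 2, 35.0],
    [0, 2, 5, 50.0],
    [1, 2, 4, 42.0],
    [0, 3, 3, 38.0],
]
# Label: 1 if the patient provided feedback when shown that arrangement, else 0.
y = [1, 0, 1, 0, 1, 0]

model = LogisticRegression().fit(X, y)

# Score two candidate presentations and pick the one more likely to elicit feedback.
candidates = [[1, 2, 3, 41.0], [0, 3, 5, 48.0]]
probs = model.predict_proba(candidates)[:, 1]
best = candidates[int(probs.argmax())]
print(best, probs)
```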
The adjustment confirmation control may take the form of an adjustment confirmation screen 720, as shown, for example, in
Unless explicitly stated otherwise, the phrase “ICON” refers to an ‘increase control’, the phrase “DCON” refers to a ‘decrease control’, and the phrase “SCON” refers to a ‘stay control’; these phrases are intended to be understood as noun phrases meaning controls that serve the functions of increasing, decreasing, or maintaining corresponding values, respectively.
The adjustment confirmation screen 720 includes text and/or graphics requesting the patient to confirm their satisfaction with the position of the treatment apparatus 70 during and/or after the automatic adjustments are made. The adjustment confirmation screen 720 includes an increase control that the patient may select to indicate a desire to increase the value of a corresponding parameter. The corresponding parameter may be a position of the treatment apparatus 70 such as the radius of the pedal 102 on the pedal arm 104. The corresponding parameter may be a setting for a force or a speed of an exercise performed as part of the regimen. For example, the corresponding parameter may be a target pressure or a target RPM speed in a given session period. The increase control may take the form of an increase button 722, such as the button shown on
In some embodiments, one or more of the increase, the decrease, and/or the stay control(s) may be provided by one or more of the sensors 76, 84, 86. For example, the patient interface 50 may prompt the patient to move a body part until they start to feel discomfort, the system 10 may use one or more of the sensors 76, 84, 86 to measure the range of motion that the body part moved, and that range of motion may be used for performing the rehabilitation regimen. In another example, one or more of the sensors 76, 84, 86, such as a pressure sensor 76 and/or a goniometer 84, may measure a physical response by the patient, such as a flinch that indicates pain. A target value of the parameter may be set based upon the value of the parameter where the patient indicated pain or discomfort, and that target value may then be used for performing the rehabilitation regimen. The target parameter value may be set to X % of P, where X is a predetermined percentage, and P is the value of the parameter where the patient indicated pain or discomfort. For example, if a patient indicated pain at a pedal radius of 6.0 cm, and X is 90%, the target parameter value for the pedal position may be set to 5.4 cm, or 90% of 6.0 cm. Alternatively, the target parameter value may be set using an offset value that is added to or subtracted from the value of the parameter where the patient indicated pain or discomfort. For example, if a patient indicated pain at a pedal radius of 8.0 cm, and the offset value is −1.2 cm, then the target parameter value for the pedal radius may be set to 6.8 cm. Values of other parameters, such as target pressure or target speed, may be similarly adjusted.
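The two ways of deriving a target parameter value described above, a percentage of the value at which discomfort was indicated or an offset from that value, amount to simple arithmetic. The following sketch reproduces the worked figures from the description:

```python
def target_from_percentage(discomfort_value, percent):
    """Target = X% of the parameter value at which the patient indicated discomfort."""
    return discomfort_value * percent / 100.0


def target_from_offset(discomfort_value, offset):
    """Target = discomfort value plus a (typically negative) offset."""
    return discomfort_value + offset


# Examples from the description above:
print(target_from_percentage(6.0, 90))  # 5.4 cm pedal radius
print(target_from_offset(8.0, -1.2))    # 6.8 cm pedal radius
```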
In some embodiments, the system 10 may be configured to persuasively motivate the patient to use one or more settings for the position of the body part and/or the force exerted by the body part. For example, the patient interface 50 may show a target value or a target range for the position of the body part and/or the force exerted by the body part. In another example, the patient interface 50 may periodically encourage the patient to increase a setting for the position of the body part and/or the force exerted by the body part, particularly where that setting is below a target value or a target range. The system 10 may gradually increase a setting for the position of the body part and/or the force exerted by the body part while the patient is using the body part to perform the rehabilitation regimen. In some embodiments, the adjustment confirmation control may be presented to the patient only after the setting for the position of the body part and/or the force exerted by the body part has been actively used in performing the rehabilitation regimen for some period of time. In some embodiments, the adjustment confirmation control may not be presented to the patient, even after the setting for the position of the body part and/or the force exerted by the body part is adjusted.
In some embodiments, the patient interface 50 may present the adjustment confirmation control before the patient performs the rehabilitation regimen. Such a pre-performance adjustment allows the patient to use a confirmed or adjusted position and/or force setting while performing the rehabilitation regimen. Additionally or alternatively, the patient interface 50 may present the adjustment confirmation control during and/or after the rehabilitation regimen. For example, the adjustment confirmation screen 720 may be presented to the patient during a session or between sessions of the rehabilitation regimen. In some embodiments, the adjustment confirmation control may be presented in response to a triggering event. The triggering event may include, for example, the patient reporting pain in excess of a given value, or an inability to complete one or more activities within the treatment plan 154, or a sudden decrease in walking performed by the patient. Additionally or alternatively, the adjustment confirmation screen 720 may be presented to the patient after the patient has completed a session of the rehabilitation regimen. Such a post-session confirmation may be used to determine the patient's comfort, which may be a proxy for satisfaction with the session of the rehabilitation regimen. The post-session confirmation may be used to determine one or more settings for use in subsequent sessions. For example, an indication of “stay” or “increase” may cause a target value for position and/or pressure of the body part to be increased in subsequent sessions of the rehabilitation regimen.
In some embodiments, the clinician interface 20 may present information regarding the position of the body part and/or the force exerted by the body part. This information may include actual and/or target positions and/or forces as measured by one or more of the sensors 76, 84, 86. Additionally or alternatively, the information regarding the position of the body part and/or the force exerted by the body part may include a target value or a target range of values for either or both of the position of the body part and/or the force exerted by the body part. For example, the clinician interface 20 may provide a control for the clinician to adjust a value or a range of values as a target for a parameter such as a position, a force, or a speed used in a session or a session period or for a particular exercise within the rehabilitation regimen. Similarly, the clinician interface 20 may provide a control for the clinician to adjust minimum and/or maximum values for the parameter. For example, the patient may adjust the value of a pedal radius parameter from the preset target value up to the maximum value for that parameter, where the preset target value and the maximum value are both set by the clinician using corresponding controls on the clinician interface 20.
The session period action screen 760 also includes a speed indicator 766 showing a speed that the pedals 106 are turning, as measured by an internal sensor 76 of the stationary cycling machine 100. The speed indicator 766 is shown as a rotary gauge, but other types of displays may be used, such as a bar graph and/or a numeric indicator. The speed indicator 766 includes an optimal or desired speed range, which may be determined by the clinician using an associated session parameter control 178 on the protocol management display 170, as shown, for example, on
In some embodiments, a computer, such as the server 30, is configured to automatically modify the treatment plan 154 in response to satisfaction by the patient of a predetermined condition. For example, the treatment plan 154 may be limited in speed, velocity, or pressure settings, or in the number of sessions per day, until a predetermined condition is satisfied. In another example, the treatment plan 154 may include only certain types of session periods, such as passive type exercises, until the predetermined condition is satisfied. The predetermined condition may include, for example, a successful post-operative checkup, completion of a predetermined number of sessions, or satisfaction of a performance benchmark within the treatment plan. Such a benchmark may include, for example, walking X number of steps in a day, or achieving a given RPM speed or a given number of pounds of force using the treatment apparatus 70. In some embodiments, the computer is configured to increase at least one of a frequency, a duration, or an intensity of an aspect of the treatment plan 154 in response to performance or occurrence of the predetermined condition. In some embodiments, the computer is configured to decrease at least one of a frequency, a duration, or an intensity of an aspect of the treatment plan 154 in response to performance or occurrence of the predetermined condition. The predetermined condition may include, for example, the patient reporting pain in excess of a given value, or an inability to complete one or more activities within the treatment plan 154, or a sudden decrease in walking performed by the patient.
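Purely as an illustrative sketch (the function and field names are hypothetical), the automatic modification described above may be expressed as a check of a predetermined condition followed by an increase or decrease of a plan parameter:

```python
def adjust_plan(plan, condition_met, easing_needed=False, step=0.1):
    """Scale an aspect of the treatment plan when a predetermined condition occurs.

    plan: dict with keys such as 'sessions_per_day' and 'intensity' (hypothetical).
    condition_met: e.g., a successful post-operative checkup or a met benchmark.
    easing_needed: e.g., pain above a threshold or a sudden decrease in walking.
    """
    if easing_needed:
        plan["intensity"] = max(0.0, plan["intensity"] * (1 - step))
    elif condition_met:
        plan["intensity"] = plan["intensity"] * (1 + step)
        plan["sessions_per_day"] = plan["sessions_per_day"] + 1
    return plan


plan = {"sessions_per_day": 1, "intensity": 1.0}
print(adjust_plan(plan, condition_met=True))
# {'sessions_per_day': 2, 'intensity': 1.1}
```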
In some embodiments, the patient interface 50 may provide a prompt to the patient in response to occurrence of the predetermined condition. For example, in a session period where the patient is expected to maintain the stationary cycling machine at a speed of between 40 and 50 RPM, the predetermined condition may include the cycling machine operating below 30 RPM for a period of 5 seconds. In that case, the patient interface 50 may provide a prompt asking the patient if they are having trouble or pain in performing the activity. The prompts may narrow down a problem. For example, if the patient is unable to perform a given activity, then a computer, such as the server 30, may automatically modify the treatment plan 154 to include activities that are easier for the patient to complete, such as only passive or only assisted session periods. Alternatively, the treatment plan 154 may be suspended until the clinician or another qualified person, such as an orthopedic surgeon, directs the system 10 to re-enable the treatment plan 154. Additionally or alternatively, the patient's responses to the prompts may generate an alert to the clinician.
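The triggering logic described above, such as a speed below 30 RPM sustained for 5 seconds during a 40 to 50 RPM session period, can be sketched as a simple check over timestamped speed samples; the helper below is hypothetical and illustrative only:

```python
def sustained_low_speed(samples, threshold_rpm=30.0, window_s=5.0):
    """Return True if speed stayed below threshold_rpm for at least window_s seconds.

    samples: list of (timestamp_seconds, rpm) tuples in chronological order.
    """
    run_start = None
    for t, rpm in samples:
        if rpm < threshold_rpm:
            if run_start is None:
                run_start = t
            if t - run_start >= window_s:
                return True   # would trigger the "having trouble or pain?" prompt
        else:
            run_start = None
    return False


# Example: speed drops under 30 RPM at t=10 s and stays there through t=16 s.
readings = [(t, 45.0) for t in range(0, 10)] + [(t, 25.0) for t in range(10, 17)]
print(sustained_low_speed(readings))  # True
```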
In some embodiments, the system may communicate an alert message to the clinician using a communication message, such as a pager message or a text message or an email. The alert message may include pseudonymized data and/or anonymized data or use any privacy enhancing technology to prevent confidential patient data from being communicated in a way that could violate patient confidentiality requirements. Such privacy enhancing technologies may enable compliance with laws, regulations, or other rules of governance such as, but not limited to, the Health Insurance Portability and Accountability Act (HIPAA), or the General Data Protection Regulation (GDPR), wherein the patient may be deemed a “data subject”. For example, an alert message may direct the clinician that a particular type of alert exists, such as a patient reporting wound splitting, without identifying which patient made the report. The alert message may direct the clinician to check the clinician interface 20 for more specific details regarding the alert.
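A minimal sketch of a pseudonymized alert of the kind described above follows; the hashing scheme and message format are assumptions rather than the disclosed implementation, and the token replaces the patient's identity while directing the clinician to the clinician interface 20 for details:

```python
import hashlib
import hmac

SECRET_KEY = b"replace-with-a-securely-stored-key"  # hypothetical key material


def pseudonym(patient_id: str) -> str:
    """Derive a stable, non-reversible token so the alert carries no direct identifier."""
    return hmac.new(SECRET_KEY, patient_id.encode(), hashlib.sha256).hexdigest()[:12]


def build_alert(patient_id: str, alert_type: str) -> str:
    """Compose an alert message that omits confidential patient details."""
    return (f"Alert [{alert_type}] for subject {pseudonym(patient_id)}: "
            "see the clinician interface for details.")


print(build_alert("patient-001234", "wound splitting reported"))
```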
According to further aspects, the computer-implemented system 10 may be configured to automatically modify one or more parameters of the treatment plan based upon progress made by the patient in performing the treatment plan. For example, the server 30 may be configured to adjust one or more settings, such as frequency of sessions, a range of motion setting, and/or a pressure setting based on how the patient is progressing in the treatment plan. In some embodiments, the parameters available to be modified by the system may be adjusted within a corresponding range of values set by the clinician. For example, the clinician interface 20 may present one or more controls for the clinician to set a range of values that the system can use for each of the adjustable parameters. The system 10 may use an algorithm to add more sessions (e.g., if the patient is behind schedule). Alternatively, the system 10 may accelerate ahead to more difficult sessions if the recovery is proceeding faster than expected.
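As a hedged illustration of the progress-based adjustment described above (the schedule model and names are hypothetical), the system might add a session when completed sessions lag the expected count, or increase difficulty when the patient is ahead, while clamping any automatic change to the clinician-set range:

```python
def auto_adjust(plan, completed, expected, session_range, intensity_range):
    """Modify parameters based on progress, clamped to clinician-set ranges.

    plan: dict with 'sessions_per_week' and 'intensity' (hypothetical fields).
    session_range / intensity_range: (min, max) limits set on the clinician interface.
    """
    def clamp(value, bounds):
        low, high = bounds
        return max(low, min(high, value))

    if completed < expected:      # behind schedule: schedule an additional session
        plan["sessions_per_week"] = clamp(plan["sessions_per_week"] + 1, session_range)
    elif completed > expected:    # ahead of schedule: move to more difficult sessions
        plan["intensity"] = clamp(plan["intensity"] + 0.1, intensity_range)
    return plan


plan = {"sessions_per_week": 3, "intensity": 1.0}
print(auto_adjust(plan, completed=4, expected=6,
                  session_range=(2, 5), intensity_range=(0.5, 2.0)))
# {'sessions_per_week': 4, 'intensity': 1.0}
```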
For simplicity of explanation, the method 1700 is depicted and described as a series of operations. However, operations in accordance with this disclosure can occur in various orders and/or concurrently, and/or with other operations not presented and described herein. For example, the operations depicted in the method 1700 may occur in combination with any other operation of any other method disclosed herein. Furthermore, not all illustrated operations may be required to implement the method 1700 in accordance with the disclosed subject matter. In addition, those skilled in the art will understand and appreciate that the method 1700 could alternatively be represented as a series of interrelated states via a state diagram, a directed graph, a deterministic finite state automaton, a non-deterministic finite state automaton, a Bayesian model, a Markov diagram, or an event diagram.
At 1702, while the patient uses a treatment apparatus 70, the processing device may control, based on a treatment plan for a patient, the treatment apparatus 70. In some embodiments, the processing device may be separate from the treatment apparatus 70. For example, the processing device may be included in the patient interface, in a server, in the clinician interface, in any other interface discussed herein, in a sensor, in a computing device, or the like. In some embodiments, the processing device may be included in the treatment apparatus 70. In some embodiments, the treatment plan is a physical rehabilitation regimen for improving strength or range of motion of a body part.
At 1704, the processing device may receive data from an electronic device (e.g., patient interface, computing device of an individual (patient, clinician, staff member, nurse, etc.), clinician interface, sensor internal or external to the treatment apparatus 70, or some combination thereof). The data may include one of a position of a body part of the patient or a force exerted by the body part. The data may include a measurement (e.g., a pressure measurement from a sensor in a pedal of the treatment apparatus 70, a speed of a motor operating within the treatment apparatus 70, a range of motion (of a limb of the patient) received from a goniometer, etc.) pertaining to performance of a treatment plan by a patient using the treatment apparatus 70, a characteristic (e.g., a heart rate, a blood pressure, a percentage or other measurement of blood oxygen, a glucose level, a temperature, a perspiration rate, a pain level, etc.) pertaining to the patient, or both. In some embodiments, the body part is a joint, and the position of the body part comprises an angle of the joint. In some embodiments, the body part may include at least one of a joint, a bone, or a muscle group.
At 1706, the processing device may store the data for the patient in a computer-readable medium. At 1708, the processing device may cause a user interface to be presented on a patient interface. The user interface may include an adjustment confirmation control configured to solicit a response regarding the patient's comfort level with the one of the position of the body part or the force exerted by the body part. In some embodiments, the adjustment confirmation control may be configured to solicit the response regarding the patient's comfort level with the force exerted by the body part. In some embodiments, the adjustment confirmation control may be configured to solicit the response regarding the patient's comfort level with the position of the body part. In some embodiments, the processing device may cause presentation of a user interface on a clinician interface, wherein the user interface comprises information regarding the one of the position of the body part or the force exerted by the body part. Causing a user interface to be presented on any computing device may include transmitting data and/or computer instructions to the computing device. The computing device may use the data and/or execute the instructions to present the user interface on a display screen. The user interface may be included in a standalone application executing on the computing device and/or in an application (website) executing within another application (web browser).
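Taken together, operations 1702 through 1708 may be summarized in the following illustrative sketch; the class and method names are hypothetical and simply mirror the four operations described above:

```python
class TreatmentController:
    """Hypothetical sketch of the processing device's role in operations 1702-1708."""

    def __init__(self):
        self.storage = {}               # stands in for the computer-readable medium
        self.apparatus_settings = None  # last settings sent to the treatment apparatus 70

    def control_apparatus(self, treatment_plan):
        # 1702: control the treatment apparatus based on the treatment plan.
        self.apparatus_settings = treatment_plan["settings"]

    def receive_data(self, patient_id, data):
        # 1704: receive position and/or force data from an electronic device.
        # 1706: store the data for the patient in the computer-readable medium.
        self.storage.setdefault(patient_id, []).append(data)

    def present_ui(self, patient_id):
        # 1708: cause presentation of a user interface including an adjustment
        # confirmation control soliciting the patient's comfort level.
        latest = self.storage[patient_id][-1]
        return {"adjustment_confirmation": True, "measurement": latest}


controller = TreatmentController()
controller.control_apparatus({"settings": {"pedal_radius_cm": 5.4, "target_rpm": (40, 50)}})
controller.receive_data("patient-001", {"position_deg": 42.0, "force_n": 110.0})
print(controller.present_ui("patient-001"))
```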
Consistent with the above disclosure, the examples of assemblies enumerated in the following clauses are specifically contemplated and are intended as a non-limiting set of examples.
Clauses:
1. A method comprising:
2. The method of clause 1, wherein the processing device is separate from the treatment apparatus, and the method further comprises using the processing device separate from the treatment apparatus to perform the controlling of the treatment apparatus.
3. The method of clause 1, wherein the treatment plan is a physical rehabilitation regimen for improving strength or range of motion of the body part.
4. The method of clause 1, wherein the adjustment confirmation control is configured to solicit the response regarding the patient's comfort level with the force exerted by the body part.
5. The method of clause 1, wherein the adjustment confirmation control is configured to solicit the response regarding the patient's comfort level with the position of the body part.
6. The method of clause 5, wherein the body part is a joint, and the position of the body part comprises an angle of the joint.
7. The method of clause 1, further comprising causing, via the processing device, presentation of a user interface on a clinician interface, wherein the user interface comprises information regarding the one of the position of the body part or the force exerted by the body part.
8. A computer-implemented system for physical rehabilitation, comprising:
9. The computer-implemented system of clause 8, wherein the regimen is a physical rehabilitation regimen for improving strength or range of motion of the body part.
10. The computer-implemented system of clause 8, wherein the adjustment confirmation control is configured to solicit the response associated with the patient's comfort level with the force exerted by the body part.
11. The computer-implemented system of clause 8, wherein the adjustment confirmation control is configured to solicit the response associated with the patient's comfort level with the position of the body part.
12. The computer-implemented system of clause 11, wherein the body part is a joint, and the position of the body part comprises an angle of the joint.
13. The computer-implemented system of clause 8, wherein the clinician interface is configured to present information regarding the one of the position of the body part or the force exerted by the body part.
14. The computer-implemented system of clause 8, wherein the adjustment confirmation control provides an ICON configured to increase the one of the position of the body part or the force exerted by the body part during the regimen.
15. The computer-implemented system of clause 8, wherein the adjustment confirmation control provides a DCON configured to decrease the one of the position of the body part or the force exerted by the body part during the regimen.
16. The computer-implemented system of clause 8, wherein the adjustment confirmation control provides a SCON configured to maintain the one of the position of the body part or the force exerted by the body part during the regimen.
17. The computer-implemented system of clause 8, wherein the patient interface presents the adjustment confirmation control during or after the regimen.
18. The computer-implemented system of clause 8, further comprising, for performing the regimen, a treatment apparatus configured to be manipulated by the patient.
19. The computer-implemented system of clause 18, wherein the treatment apparatus comprises an actuator configured to adjust the position of the body part.
20. The computer-implemented system of clause 18, wherein the sensor is an internal sensor within the treatment apparatus.
21. A system for remote treatment, comprising:
22. The system of clause 21, wherein the treatment plan comprises a target setting for the one of the position of the body part or the force exerted by the body part.
23. The system of clause 21, wherein the regimen is a physical rehabilitation regimen for improving strength or range of motion of the body part.
24. The system of clause 21, wherein the adjustment confirmation control is configured to solicit the response regarding the patient's comfort level with the position of the body part.
25. The system of clause 24, wherein the body part is a joint, and the position of the body part comprises an angle of the joint.
26. A patient user interface generated by a computer and comprising:
27. The patient user interface of clause 26, wherein the adjustment confirmation control provides an ICON configured to increase the one of the position of the body part or the force exerted by the body part during the regimen; and
28. The patient user interface of clause 26, wherein the adjustment confirmation control provides a SCON configured to maintain the one of the position of the body part or the force exerted by the body part during the regimen.
As will readily be appreciated by a person of ordinary skill in the art in light of having read the present disclosure, as used herein, actions described as being performed in real-time include actions performed in near-real-time without departing from the scope and intent of the present disclosure.
The various aspects, embodiments, implementations, or features of the described embodiments can be used separately or in any combination. The embodiments disclosed herein are modular in nature and can be used in conjunction with or coupled to other embodiments.
This application claims priority to and the benefit of U.S. Provisional Patent Application Ser. No. 62/923,829, filed Oct. 21, 2019, titled “Persuasive Motivation for Orthopedic Treatment,” the entire disclosure of which is hereby incorporated by reference for all purposes.
20150351664 | Ross | Dec 2015 | A1 |
20150351665 | Ross | Dec 2015 | A1 |
20150360069 | Marti et al. | Dec 2015 | A1 |
20150379232 | Mainwaring et al. | Dec 2015 | A1 |
20160007885 | Basta et al. | Jan 2016 | A1 |
20160117471 | Belt et al. | Apr 2016 | A1 |
20160140319 | Stark et al. | May 2016 | A1 |
20160151670 | Dugan | Jun 2016 | A1 |
20160166881 | Ridgel et al. | Jun 2016 | A1 |
20160275259 | Nolan et al. | Sep 2016 | A1 |
20160302721 | Wiedenhoefer et al. | Oct 2016 | A1 |
20160317869 | Dugan | Nov 2016 | A1 |
20160322078 | Bose et al. | Nov 2016 | A1 |
20160325140 | Wu | Nov 2016 | A1 |
20160332028 | Melnik | Nov 2016 | A1 |
20160361597 | Cole | Dec 2016 | A1 |
20170004260 | Moturu et al. | Jan 2017 | A1 |
20170033375 | Ohmori et al. | Feb 2017 | A1 |
20170042467 | Herr et al. | Feb 2017 | A1 |
20170046488 | Pereira | Feb 2017 | A1 |
20170065851 | Deluca et al. | Mar 2017 | A1 |
20170080320 | Smith | Mar 2017 | A1 |
20170095670 | Ghaffari et al. | Apr 2017 | A1 |
20170095692 | Chang et al. | Apr 2017 | A1 |
20170095693 | Chang et al. | Apr 2017 | A1 |
20170106242 | Dugan | Apr 2017 | A1 |
20170128769 | Long et al. | May 2017 | A1 |
20170132947 | Maeda | May 2017 | A1 |
20170136296 | Barrera et al. | May 2017 | A1 |
20170143261 | Wiedenhoefer et al. | May 2017 | A1 |
20170147789 | Wiedenhoefer et al. | May 2017 | A1 |
20170148297 | Ross | May 2017 | A1 |
20170168555 | Munoz et al. | Jun 2017 | A1 |
20170181698 | Wiedenhoefer et al. | Jun 2017 | A1 |
20170190052 | Jaekel et al. | Jul 2017 | A1 |
20170209766 | Riley et al. | Jul 2017 | A1 |
20170243028 | LaFever et al. | Aug 2017 | A1 |
20170265800 | Auchinleck et al. | Sep 2017 | A1 |
20170266501 | Sanders et al. | Sep 2017 | A1 |
20170278209 | Olsen et al. | Sep 2017 | A1 |
20170282015 | Wicks et al. | Oct 2017 | A1 |
20170300654 | Stein et al. | Oct 2017 | A1 |
20170312614 | Tran et al. | Nov 2017 | A1 |
20170329917 | McRaith et al. | Nov 2017 | A1 |
20170333755 | Rider | Nov 2017 | A1 |
20170337033 | Duyan et al. | Nov 2017 | A1 |
20170337334 | Stanczak | Nov 2017 | A1 |
20170344726 | Duffy et al. | Nov 2017 | A1 |
20170360586 | Dempers et al. | Dec 2017 | A1 |
20170368413 | Shavit | Dec 2017 | A1 |
20180017806 | Wang et al. | Jan 2018 | A1 |
20180036593 | Ridgel et al. | Feb 2018 | A1 |
20180052962 | Van Der Koijk et al. | Feb 2018 | A1 |
20180056104 | Cromie et al. | Mar 2018 | A1 |
20180071572 | Gomberg et al. | Mar 2018 | A1 |
20180075205 | Moturu et al. | Mar 2018 | A1 |
20180078843 | Tran et al. | Mar 2018 | A1 |
20180085615 | Astolfi et al. | Mar 2018 | A1 |
20180096111 | Wells et al. | Apr 2018 | A1 |
20180102190 | Hogue et al. | Apr 2018 | A1 |
20180116741 | Garcia Kilroy et al. | May 2018 | A1 |
20180177612 | Trabish et al. | Jun 2018 | A1 |
20180178061 | O'larte et al. | Jun 2018 | A1 |
20180199855 | Odame et al. | Jul 2018 | A1 |
20180200577 | Dugan | Jul 2018 | A1 |
20180220935 | Tadano et al. | Aug 2018 | A1 |
20180228682 | Bayerlein et al. | Aug 2018 | A1 |
20180240552 | Tuyl et al. | Aug 2018 | A1 |
20180253991 | Tang et al. | Sep 2018 | A1 |
20180256079 | Yang et al. | Sep 2018 | A1 |
20180263530 | Jung | Sep 2018 | A1 |
20180264312 | Pompile et al. | Sep 2018 | A1 |
20180271432 | Auchinleck et al. | Sep 2018 | A1 |
20180272184 | Vassilaros et al. | Sep 2018 | A1 |
20180280784 | Romeo et al. | Oct 2018 | A1 |
20180296157 | Bleich et al. | Oct 2018 | A1 |
20180326243 | Badi et al. | Nov 2018 | A1 |
20180330058 | Bates | Nov 2018 | A1 |
20180330824 | Athey et al. | Nov 2018 | A1 |
20180360340 | Rehse et al. | Dec 2018 | A1 |
20180373844 | Ferrandez-Escamez et al. | Dec 2018 | A1 |
20190019578 | Vaccaro | Jan 2019 | A1 |
20190030415 | Volpe, Jr. | Jan 2019 | A1 |
20190031284 | Fuchs | Jan 2019 | A1 |
20190046794 | Goodall et al. | Feb 2019 | A1 |
20190060708 | Fung | Feb 2019 | A1 |
20190065970 | Bonutti et al. | Feb 2019 | A1 |
20190066832 | Kang | Feb 2019 | A1 |
20190076701 | Dugan | Mar 2019 | A1 |
20190080802 | Ziobro et al. | Mar 2019 | A1 |
20190088356 | Oliver et al. | Mar 2019 | A1 |
20190111299 | Radcliffe | Apr 2019 | A1 |
20190115097 | Macoviak et al. | Apr 2019 | A1 |
20190117128 | Chen et al. | Apr 2019 | A1 |
20190118038 | Tana et al. | Apr 2019 | A1 |
20190126099 | Hoang | May 2019 | A1 |
20190132948 | Longinotti-Buitoni et al. | May 2019 | A1 |
20190134454 | Mahoney et al. | May 2019 | A1 |
20190137988 | Cella et al. | May 2019 | A1 |
20190167988 | Shahriari et al. | Jun 2019 | A1 |
20190172587 | Park et al. | Jun 2019 | A1 |
20190175988 | Volterrani et al. | Jun 2019 | A1 |
20190183715 | Kapure et al. | Jun 2019 | A1 |
20190200920 | Tien et al. | Jul 2019 | A1 |
20190209891 | Fung | Jul 2019 | A1 |
20190223797 | Tran | Jul 2019 | A1 |
20190240103 | Hepler et al. | Aug 2019 | A1 |
20190240541 | Denton et al. | Aug 2019 | A1 |
20190244540 | Errante et al. | Aug 2019 | A1 |
20190269343 | Ramos Murguialday et al. | Sep 2019 | A1 |
20190274523 | Bates et al. | Sep 2019 | A1 |
20190275368 | Maroldi | Sep 2019 | A1 |
20190304584 | Savolainen | Oct 2019 | A1 |
20190366146 | Tong et al. | Dec 2019 | A1 |
20190388728 | Wang et al. | Dec 2019 | A1 |
20200066390 | Svendrys | Feb 2020 | A1 |
20200085300 | Kwatra et al. | Mar 2020 | A1 |
20200143922 | Chekroud et al. | May 2020 | A1 |
20200151595 | Jayalath et al. | May 2020 | A1 |
20200151646 | De La Fuente Sanchez | May 2020 | A1 |
20200152339 | Pulitzer et al. | May 2020 | A1 |
20200160198 | Reeves et al. | May 2020 | A1 |
20200170876 | Kapure et al. | Jun 2020 | A1 |
20200176098 | Lucas et al. | Jun 2020 | A1 |
20200197744 | Schweighofer | Jun 2020 | A1 |
20200221975 | Basta et al. | Jul 2020 | A1 |
20200267487 | Siva | Aug 2020 | A1 |
20200275886 | Mason | Sep 2020 | A1 |
20200289045 | Hacking et al. | Sep 2020 | A1 |
20200289046 | Hacking et al. | Sep 2020 | A1 |
20200289879 | Hacking et al. | Sep 2020 | A1 |
20200289880 | Hacking et al. | Sep 2020 | A1 |
20200289881 | Hacking et al. | Sep 2020 | A1 |
20200289889 | Hacking et al. | Sep 2020 | A1 |
20200293712 | Potts et al. | Sep 2020 | A1 |
20200334972 | Gopalakrishnan | Oct 2020 | A1 |
20200357299 | Patel et al. | Nov 2020 | A1 |
20200395112 | Ronner | Dec 2020 | A1 |
20200401224 | Cotton | Dec 2020 | A1 |
20210005319 | Otsuki et al. | Jan 2021 | A1 |
20210035674 | Volosin et al. | Feb 2021 | A1 |
20210074178 | Ilan et al. | Mar 2021 | A1 |
20210076981 | Hacking et al. | Mar 2021 | A1 |
20210077860 | Posnack et al. | Mar 2021 | A1 |
20210098129 | Neumann | Apr 2021 | A1 |
20210101051 | Posnack et al. | Apr 2021 | A1 |
20210127974 | Mason et al. | May 2021 | A1 |
20210128080 | Mason et al. | May 2021 | A1 |
20210128255 | Mason et al. | May 2021 | A1 |
20210128978 | Gilstrom et al. | May 2021 | A1 |
20210134412 | Guaneri et al. | May 2021 | A1 |
20210134425 | Mason et al. | May 2021 | A1 |
20210134428 | Mason et al. | May 2021 | A1 |
20210134430 | Mason et al. | May 2021 | A1 |
20210134432 | Mason et al. | May 2021 | A1 |
20210134456 | Posnack et al. | May 2021 | A1 |
20210134457 | Mason et al. | May 2021 | A1 |
20210134458 | Mason et al. | May 2021 | A1 |
20210134463 | Mason et al. | May 2021 | A1 |
20210138304 | Mason et al. | May 2021 | A1 |
20210142875 | Mason et al. | May 2021 | A1 |
20210142893 | Guaneri et al. | May 2021 | A1 |
20210142898 | Mason et al. | May 2021 | A1 |
20210142903 | Mason et al. | May 2021 | A1 |
20210144074 | Guaneri et al. | May 2021 | A1 |
20210186419 | Van Ee et al. | Jun 2021 | A1 |
20210202090 | O'Donovan et al. | Jul 2021 | A1 |
20210202103 | Bostic et al. | Jul 2021 | A1 |
20210244998 | Hacking et al. | Aug 2021 | A1 |
20210245003 | Turner | Aug 2021 | A1 |
20210343384 | Purushothaman et al. | Nov 2021 | A1 |
20210345879 | Mason et al. | Nov 2021 | A1 |
20210345975 | Mason et al. | Nov 2021 | A1 |
20210350888 | Guaneri et al. | Nov 2021 | A1 |
20210350898 | Mason et al. | Nov 2021 | A1 |
20210350899 | Mason et al. | Nov 2021 | A1 |
20210350901 | Mason et al. | Nov 2021 | A1 |
20210350902 | Mason et al. | Nov 2021 | A1 |
20210350914 | Guaneri et al. | Nov 2021 | A1 |
20210350926 | Mason et al. | Nov 2021 | A1 |
20210361514 | Choi et al. | Nov 2021 | A1 |
20210366587 | Mason et al. | Nov 2021 | A1 |
20210383909 | Mason et al. | Dec 2021 | A1 |
20210391091 | Mason | Dec 2021 | A1 |
20210398668 | Chock et al. | Dec 2021 | A1 |
20210407670 | Mason et al. | Dec 2021 | A1 |
20210407681 | Mason et al. | Dec 2021 | A1 |
20220000556 | Casey et al. | Jan 2022 | A1 |
20220015838 | Posnack et al. | Jan 2022 | A1 |
20220016480 | Bissonnette et al. | Jan 2022 | A1 |
20220044806 | Sanders et al. | Feb 2022 | A1 |
20220047921 | Bissonnette et al. | Feb 2022 | A1 |
20220079690 | Mason et al. | Mar 2022 | A1 |
20220080256 | Am et al. | Mar 2022 | A1 |
20220105384 | Hacking et al. | Apr 2022 | A1 |
20220105385 | Hacking et al. | Apr 2022 | A1 |
20220115133 | Mason et al. | Apr 2022 | A1 |
20220118218 | Bense et al. | Apr 2022 | A1 |
20220126169 | Mason | Apr 2022 | A1 |
20220133576 | Choi et al. | May 2022 | A1 |
20220148725 | Mason et al. | May 2022 | A1 |
20220158916 | Mason et al. | May 2022 | A1 |
20220193491 | Mason et al. | Jun 2022 | A1 |
20220230729 | Mason et al. | Jul 2022 | A1 |
20220238222 | Neuberg | Jul 2022 | A1 |
20220238223 | Mason et al. | Jul 2022 | A1 |
20220262483 | Rosenberg et al. | Aug 2022 | A1 |
20220266094 | Mason et al. | Aug 2022 | A1 |
20220270738 | Mason et al. | Aug 2022 | A1 |
20220273985 | Jeong et al. | Sep 2022 | A1 |
20220273986 | Mason | Sep 2022 | A1 |
20220288460 | Mason | Sep 2022 | A1 |
20220288461 | Ashley et al. | Sep 2022 | A1 |
20220288462 | Ashley et al. | Sep 2022 | A1 |
20220293257 | Guaneri et al. | Sep 2022 | A1 |
20220304881 | Choi et al. | Sep 2022 | A1 |
20220304882 | Choi | Sep 2022 | A1 |
20220305328 | Choi et al. | Sep 2022 | A1 |
20220314075 | Mason et al. | Oct 2022 | A1 |
20220327714 | Cook et al. | Oct 2022 | A1 |
20220327807 | Cook et al. | Oct 2022 | A1 |
20220328181 | Mason et al. | Oct 2022 | A1 |
20220331663 | Mason | Oct 2022 | A1 |
20220339052 | Kim | Oct 2022 | A1 |
20220339501 | Mason et al. | Oct 2022 | A1 |
20220384012 | Mason | Dec 2022 | A1 |
20220392591 | Guaneri et al. | Dec 2022 | A1 |
20220395232 | Locke | Dec 2022 | A1 |
20220401783 | Choi | Dec 2022 | A1 |
20220415469 | Mason | Dec 2022 | A1 |
20220415471 | Mason | Dec 2022 | A1 |
20230013530 | Mason | Jan 2023 | A1 |
20230014598 | Mason et al. | Jan 2023 | A1 |
20230048040 | Hacking et al. | Feb 2023 | A1 |
20230051751 | Hacking et al. | Feb 2023 | A1 |
20230058605 | Mason | Feb 2023 | A1 |
20230060039 | Mason | Feb 2023 | A1 |
20230072368 | Mason | Mar 2023 | A1 |
20230078793 | Mason | Mar 2023 | A1 |
20230119461 | Mason | Apr 2023 | A1 |
20230207124 | Walsh et al. | Jun 2023 | A1 |
20230215552 | Khotilovich et al. | Jul 2023 | A1 |
20230245747 | Rosenberg et al. | Aug 2023 | A1 |
20230245748 | Rosenberg et al. | Aug 2023 | A1 |
20230245750 | Rosenberg et al. | Aug 2023 | A1 |
20230245751 | Rosenberg et al. | Aug 2023 | A1 |
20230253089 | Rosenberg et al. | Aug 2023 | A1 |
20230263428 | Hull et al. | Aug 2023 | A1 |
Number | Date | Country |
---|---|---|
2698078 | Mar 2010 | CA |
112603295 | Feb 2003 | CN |
2885238 | Apr 2007 | CN |
103473631 | Dec 2013 | CN |
103488880 | Jan 2014 | CN |
104335211 | Feb 2015 | CN |
105683977 | Jun 2016 | CN |
103136447 | Aug 2016 | CN |
105894088 | Aug 2016 | CN |
105930668 | Sep 2016 | CN |
106127646 | Nov 2016 | CN |
106510985 | Mar 2017 | CN |
107066819 | Aug 2017 | CN |
107430641 | Dec 2017 | CN |
107736982 | Feb 2018 | CN |
207220817 | Apr 2018 | CN |
108078737 | May 2018 | CN |
208573971 | Mar 2019 | CN |
110148472 | Aug 2019 | CN |
110215188 | Sep 2019 | CN |
110808092 | Feb 2020 | CN |
111105859 | May 2020 | CN |
111370088 | Jul 2020 | CN |
111790111 | Oct 2020 | CN |
112603295 | Apr 2021 | CN |
114203274 | Mar 2022 | CN |
114632302 | Jun 2022 | CN |
114898832 | Aug 2022 | CN |
110270062 | Oct 2022 | CN |
102018202497 | Aug 2018 | DE |
102018211212 | Jan 2019 | DE |
102019108425 | Aug 2020 | DE |
0383137 | Aug 1990 | EP |
1391179 | Feb 2004 | EP |
2815242 | Dec 2014 | EP |
2869805 | May 2015 | EP |
2997951 | Mar 2016 | EP |
2688472 | Apr 2016 | EP |
3264303 | Jan 2018 | EP |
3323473 | May 2018 | EP |
3627514 | Mar 2020 | EP |
3671700 | Jun 2020 | EP |
3688537 | Aug 2020 | EP |
3731733 | Nov 2020 | EP |
3984508 | Apr 2022 | EP |
3984509 | Apr 2022 | EP |
3984510 | Apr 2022 | EP |
3984511 | Apr 2022 | EP |
3984512 | Apr 2022 | EP |
3984513 | Apr 2022 | EP |
4112033 | Jan 2023 | EP |
2512431 | Oct 2014 | GB |
2003225875 | Aug 2003 | JP |
2005227928 | Aug 2005 | JP |
2013515995 | May 2013 | JP |
3198173 | Jun 2015 | JP |
2019028647 | Feb 2019 | JP |
2019134909 | Aug 2019 | JP |
6573739 | Sep 2019 | JP |
6659831 | Mar 2020 | JP |
6710357 | Jun 2020 | JP |
6775757 | Oct 2020 | JP |
2021027917 | Feb 2021 | JP |
2022521378 | Apr 2022 | JP |
7198364 | Dec 2022 | JP |
7202474 | Jan 2023 | JP |
7231750 | Mar 2023 | JP |
7231751 | Mar 2023 | JP |
7231752 | Mar 2023 | JP |
20020009724 | Feb 2002 | KR |
20020065253 | Aug 2002 | KR |
20110099953 | Sep 2011 | KR |
20140128630 | Nov 2014 | KR |
20150017693 | Feb 2015 | KR |
20150078191 | Jul 2015 | KR |
101580071 | Dec 2015 | KR |
20160093990 | Aug 2016 | KR |
20170038837 | Apr 2017 | KR |
20190029175 | Mar 2019 | KR |
101988167 | Jun 2019 | KR |
101969392 | Aug 2019 | KR |
102055279 | Dec 2019 | KR |
20200025290 | Mar 2020 | KR |
20200029180 | Mar 2020 | KR |
102116664 | May 2020 | KR |
102116968 | May 2020 | KR |
20200056233 | May 2020 | KR |
102120828 | Jun 2020 | KR |
102142713 | Aug 2020 | KR |
102162522 | Oct 2020 | KR |
102173553 | Nov 2020 | KR |
102180079 | Nov 2020 | KR |
102188766 | Dec 2020 | KR |
102196793 | Dec 2020 | KR |
20210006212 | Jan 2021 | KR |
102224188 | Mar 2021 | KR |
102224618 | Mar 2021 | KR |
102246049 | Apr 2021 | KR |
102246050 | Apr 2021 | KR |
102246051 | Apr 2021 | KR |
102246052 | Apr 2021 | KR |
20210052028 | May 2021 | KR |
102264498 | Jun 2021 | KR |
102352602 | Jan 2022 | KR |
102352603 | Jan 2022 | KR |
102352604 | Jan 2022 | KR |
20220004639 | Jan 2022 | KR |
102387577 | Apr 2022 | KR |
102421437 | Jul 2022 | KR |
20220102207 | Jul 2022 | KR |
102467495 | Nov 2022 | KR |
102467496 | Nov 2022 | KR |
102469723 | Nov 2022 | KR |
102471990 | Nov 2022 | KR |
20230019349 | Feb 2023 | KR |
20230019350 | Feb 2023 | KR |
20230026556 | Feb 2023 | KR |
20230026668 | Feb 2023 | KR |
20230040526 | Mar 2023 | KR |
0149235 | Jul 2001 | WO |
0151083 | Jul 2001 | WO |
2001050387 | Jul 2001 | WO |
02062211 | Aug 2002 | WO |
2003043494 | May 2003 | WO |
2005018453 | Mar 2005 | WO |
2006004430 | Jan 2006 | WO |
2008114291 | Sep 2008 | WO |
2009008968 | Jan 2009 | WO |
2012128801 | Sep 2012 | WO |
2013122839 | Aug 2013 | WO |
2014011447 | Jan 2014 | WO |
2014163976 | Oct 2014 | WO |
2015026744 | Feb 2015 | WO |
2015082555 | Jun 2015 | WO |
2016154318 | Sep 2016 | WO |
2017030781 | Feb 2017 | WO |
2017166074 | May 2017 | WO |
2017091691 | Jun 2017 | WO |
2017165238 | Sep 2017 | WO |
2018081795 | May 2018 | WO |
2018171853 | Sep 2018 | WO |
2019022706 | Jan 2019 | WO |
2019204876 | Apr 2019 | WO |
2020185769 | Mar 2020 | WO |
2020075190 | Apr 2020 | WO |
2020130979 | Jun 2020 | WO |
2020149815 | Jul 2020 | WO |
2020229705 | Nov 2020 | WO |
2020245727 | Dec 2020 | WO |
2020249855 | Dec 2020 | WO |
2020252599 | Dec 2020 | WO |
2020256577 | Dec 2020 | WO |
2021021447 | Feb 2021 | WO |
2021022003 | Feb 2021 | WO |
2021038980 | Mar 2021 | WO |
2021055427 | Mar 2021 | WO |
2021055491 | Mar 2021 | WO |
2021061061 | Apr 2021 | WO |
2021081094 | Apr 2021 | WO |
2021138620 | Jul 2021 | WO |
2021216881 | Oct 2021 | WO |
2021236542 | Nov 2021 | WO |
2021236961 | Nov 2021 | WO |
2021262809 | Dec 2021 | WO |
2022092493 | May 2022 | WO |
2022092494 | May 2022 | WO |
2022212883 | Oct 2022 | WO |
2022212921 | Oct 2022 | WO |
2022216498 | Oct 2022 | WO |
2022251420 | Dec 2022 | WO |
2023008680 | Feb 2023 | WO |
2023008681 | Feb 2023 | WO |
2023022319 | Feb 2023 | WO |
2023022320 | Feb 2023 | WO |
Entry |
---|
Claris Healthcare Inc.; Claris Reflex Patient Rehabilitation System Brochure, https://clarisreflex.com/, retrieved from the internet on Oct. 2, 2019; 5 pages. |
International Searching Authority, Search Report and Written Opinion for International Application No. PCT/US2021/032807, dated Sep. 6, 2021, 11 pages. |
International Searching Authority, Search Report and Written Opinion for International Application No. PCT/US2021/038617, dated Oct. 15, 2021, 12 pages. |
Jennifer Bresnick, “What is the Role of Natural Language Processing in Healthcare?”, pp. 1-7, published Aug. 18, 2016, retrieved on Feb. 1, 2022 from https://healthitanalytics.com/features/what-is-the-role-of-natural-language-processing-in-healthcare. |
Alex Bellec, “Part-of-Speech tagging tutorial with the Keras Deep Learning library,” pp. 1-16, published Mar. 27, 2018, retrieved on Feb. 1, 2022 from https://becominghuman.ai/part-of-speech-tagging-tutorial-with-the-keras-deep-learning-library-d7f93fa05537. |
Kavita Ganesan, “All you need to know about text preprocessing for NLP and Machine Learning,” pp. 1-14, published Feb. 23, 2019, retrieved on Feb. 1, 2022 from https://towardsdatascience.com/all-you-need-to-know-about-text-preprocessing-for-nlp-and-machine-learning-bclc5765ff67. |
Badreesh Shetty, “Natural Language Processing (NLP) for Machine Learning,” pp. 1-13, published Nov. 24, 2018, retrieved on Feb. 1, 2022 from https://towardsdatascience.com/natural-language-processing-nlp-for-machine-learning-d44498845d5b. |
Barrett et al., “Artificial intelligence supported patient self-care in chronic heart failure: a paradigm shift from reactive to predictive, preventive and personalised care,” EPMA Journal (2019), pp. 445-464. |
Oerkild et al., “Home-based cardiac rehabilitation is an attractive alternative to no cardiac rehabilitation for elderly patients with coronary heart disease: results from a randomised clinical trial,” BMJ Open Accessible Medical Research, Nov. 22, 2012, pp. 1-9. |
Bravo-Escobar et al., “Effectiveness and safety of a home-based cardiac rehabilitation programme of mixed surveillance in patients with ischemic heart disease at moderate cardiovascular risk: A randomised, controlled clinical trial,” BMC Cardiovascular Disorders, 2017, pp. 1-11, vol. 17:66. |
Thomas et al., “Home-Based Cardiac Rehabilitation,” Circulation, 2019, pp. e69-e89, vol. 140. |
Thomas et al., “Home-Based Cardiac Rehabilitation,” Journal of the American College of Cardiology, Nov. 1, 2019, pp. 133-153, vol. 74. |
Thomas et al., “Home-Based Cardiac Rehabilitation,” HHS Public Access, Oct. 2, 2020, pp. 1-39. |
Dittus et al., “Exercise-Based Oncology Rehabilitation: Leveraging the Cardiac Rehabilitation Model,” Journal of Cardiopulmonary Rehabilitation and Prevention, 2015, pp. 130-139, vol. 35. |
Chen et al., “Home-based cardiac rehabilitation improves quality of life, aerobic capacity, and readmission rates in patients with chronic heart failure,” Medicine, 2018, pp. 1-5, vol. 97:4. |
Lima de Melo Ghisi et al., “A systematic review of patient education in cardiac patients: Do they increase knowledge and promote health behavior change?,” Patient Education and Counseling, 2014, pp. 1-15. |
Fang et al., “Use of Outpatient Cardiac Rehabilitation Among Heart Attack Survivors—20 States and the District of Columbia, 2013 and Four States, 2015,” Morbidity and Mortality Weekly Report, vol. 66, No. 33, Aug. 25, 2017, pp. 869-873. |
Beene et al., “AI and Care Delivery: Emerging Opportunities For Artificial Intelligence To Transform How Care Is Delivered,” Nov. 2019, American Hospital Association, pp. 1-12. |
Website for “Pedal Exerciser”, p. 1, retrieved on Sep. 9, 2022 from https://www.vivehealth.com/collections/physical-therapy-equipment/products/pedalexerciser. |
Website for “Functional Knee Brace with ROM”, p. 1, retrieved on Sep. 9, 2022 from http://medicalbrace.gr/en/product/functional-knee-brace-with-goniometer-mbtelescopicknee/. |
Website for “ComfySplints Goniometer Knee”, pp. 1-5, retrieved on Sep. 9, 2022 from https://www.comfysplints.com/product/knee-splints/. |
Website for “BMI FlexEze Knee Corrective Orthosis (KCO)”, pp. 1-4, retrieved on Sep. 9, 2022 from https://orthobmi.com/products/bmi-flexeze%C2%AE-knee-corrective-orthosis-kco. |
Website for “Neoprene Knee Brace with goniometer—Patella ROM MB.4070”, pp. 1-4, retrieved on Sep. 9, 2022 from https://www.fortuna.com.gr/en/product/neoprene-knee-brace-with-goniometer-patella-rom-mb-4070/. |
Kuiken et al., “Computerized Biofeedback Knee Goniometer: Acceptance and Effect on Exercise Behavior in Post-total Knee Arthroplasty Rehabilitation,” Biomedical Engineering Faculty Research and Publications, 2004, pp. 1-10. |
Ahmed et al., “Artificial intelligence with multi-functional machine learning platform development for better healthcare and precision medicine,” Database, 2020, pp. 1-35. |
Davenport et al., “The potential for artificial intelligence in healthcare,” Digital Technology, Future Healthcare Journal, 2019, pp. 1-5, vol. 6, No. 2. |
Website for “OxeFit XS1”, pp. 1-3, retrieved on Sep. 9, 2022 from https://www.oxefit.com/xs1. |
Website for “Preva Mobile”, pp. 1-6, retrieved on Sep. 9, 2022 from https://www.precor.com/en-us/resources/introducing-preva-mobile. |
Website for “J-Bike”, pp. 1-3, retrieved on Sep. 9, 2022 from https://www.magneticdays.com/en/cycling-for-physical-rehabilitation. |
Website for “Excy”, pp. 1-12, retrieved on Sep. 9, 2022 from https://excy.com/portable-exercise-rehabilitation-excy-xcs-pro/. |
Website for “OxeFit XP1”, p. 1, retrieved on Sep. 9, 2022 from https://www.oxefit.com/xp1. |
Ruiz Ivan et al., “Towards a physical rehabilitation system using a telemedicine approach”, Computer Methods in Biomechanics and Biomedical Engineering: Imaging & Visualization, vol. 8, No. 6, Jul. 28, 2020, pp. 671-680, XP055914810. |
De Canniere Helene et al., “Wearable Monitoring and Interpretable Machine Learning Can Objectively Track Progression in Patients during Cardiac Rehabilitation”, Sensors, vol. 20, No. 12, Jun. 26, 2020, XP055914617, pp. 1-15. |
Boulanger Pierre et al., “A Low-cost Virtual Reality Bike for Remote Cardiac Rehabilitation”, Dec. 7, 2017, Advances in Biometrics: International Conference, ICB 2007, Seoul, Korea, pp. 155-166. |
Yin Chieh et al., “A Virtual Reality-Cycling Training System for Lower Limb Balance Improvement”, BioMed Research International, vol. 2016, pp. 1-10. |
Jeong et al., “Computer-assisted upper extremity training using interactive biking exercise (iBikE) platform,” Sep. 2012, pp. 1-5, 34th Annual International Conference of the IEEE EMBS. |
Malloy, Online Article “AI-enabled EKGs find difference between numerical age and biological age significantly affects health, longevity”, Website: https://newsnetwork.mayoclinic.org/discussion/ai-enabled-ekgs-find-difference-between-numerical-age-and-biological-age-significantly-affects-health-longevity/, Mayo Clinic News Network, May 20, 2021, retrieved: Jan. 23, 2023, pp. 1-4. |
International Search Report and Written Opinion for PCT/US2023/014137, dated Jun. 9, 2023, 13 pages. |
Website for “Esino 2022 Physical Therapy Equipments Arm Fitness Indoor Trainer Leg Spin Cycle Machine Exercise Bike for Elderly,” https://www.made-in-china.com/showroom/esinogroup/product-detailYdZtwGhCMKVR/China-Esino-2022-Physical-Therapy-Equipments-Arm-Fitness-Indoor-Trainer-Leg-Spin-Cycle-Machine-Exercise-Bike-for-Elderly.html, retrieved on Aug. 29, 2023, 5 pages. |
Abedtash, “An Interoperable Electronic Medical Record-Based Platform For Personalized Predictive Analytics”, ProQuest LLC, Jul. 2017, 185 pages. |
Number | Date | Country
---|---|---|
20210113890 A1 | Apr 2021 | US |
Number | Date | Country
---|---|---|
62923829 | Oct 2019 | US |