Motion analysis to study joint-related biomechanics can be performed by measuring body kinematics via motion capture systems such as optical systems, inertial measurement units (IMUs), electro-goniometers and mechanical tracking. Each technology has advantages and limitations pertaining to its effectiveness and convenience of use. For example, optical systems are often used for full-body motion capture, but the required complex setups and data processing systems can make these systems less attractive for daily life health assessment and for monitoring a single joint or body part. Electro-goniometer and mechanical tracking systems are adapted to monitor single-joint motion but have poor portability, restricting their flexibility for use in various situations such as during an athletic competition. IMU-based systems are wearable and portable but have limited fields of operation, significant integration drift issues and high error rates that require more frequent sensor calibration to compensate.
A joint movement monitoring system can be used for health monitoring and interactive skeletal tracking of an indoor or an outdoor exercise program. The joint movement monitoring system allows monitoring of the motions of body parts with respect to a joint. Such human body joint monitoring can help a person by regularly monitoring mobility status and assisting in maintaining regular physical activities/exercises. The joint movement monitoring systems described, which allow for continuous monitoring of joint activity and health, can record and extract important parameters for early diagnosis, leading to early treatment of mobility-related problems. This facilitates analysis of human joint posture and movement that can be used for an extensive range of mobility-related activities such as rehabilitation, sports medicine, human activity assessment and virtually guided training. The joint movement monitoring system enables the gathering of accurate data reliably in real life using a comfortable appliance and without frequent sensor calibration.
A joint health monitoring system can include a calibration-free skeletal tracking module that utilizes three-dimensional (3D) magnetic sensors and a kinematic model to track motion, a muscle force and joint wear-out monitoring module that utilizes a microelectromechanical systems (MEMS) sensor to track joint condition, and a biosensor module that utilizes photoplethysmography (PPG) and/or electrodermal activity (EDA) to track biomarkers. A joint health monitoring system can be configured to fuse data and generate a skeletal/character model, which can be monitored and animated in real time on a display. For example, the skeletal models are used to interact in a world scene (gym, running trail, yoga studio, golf course, swimming pool), or a shared animated space with an artificially intelligent health examiner/personal trainer or an animated movie character.
For example, a joint health monitoring system can be worn by multiple users in a competition, a group exercise or training class setup. In such examples, each user from a remote location may use two, four or another number of joint movement monitoring devices. Joint health monitoring systems can be used to produce data that is monitored, stored and studied, for example, using a cloud-based health monitoring system. A joint health monitoring system can be configured to detect and report emergency situations.
For example, portable device 903 can be configured to execute a health monitoring application for fusing data and generating a skeletal model that can be monitored and tracked in real time on a display of portable device 903 or a display of some other device such as another portable device, a computer display, a display panel, a display used by remotely connected users and so on. For example, the monitoring can be in real time so that as a user changes leg positions, for example, when running, biking, swimming or walking, the health indications and changes in skeletal joints are changed in real time on the display.
For example, portable device 903 can be connected to a cellular base station 904 via a cellular network 902 and can function as a client in communication over a network 905 with a cloud-based health monitoring center 907.
For example, portable device 903 transmits GPS data, as well as inputs from joint movement monitoring device 911 and joint movement monitoring device 912, to health monitoring center 907. For example, health monitoring center 907 processes the inputs using artificial intelligence such as knowledge representation and reasoning (KRR) to monitor vital signs, biomarkers and so on. The resulting information can be transmitted to portable device 903, and others such as a client 909 and a client 910 that are connected to network 905. For example, client 909 and client 910 can each be a service center or another type of user connected to network 905. For example, portable device 903 can display a message and provide vibration feedback to user 908. A service center or other user on network 905 can be notified to provide assistance if needed.
For example, health monitoring center 907 stores the historical health data, health analysis result and medical record of user 908.
For example, based on portable device 903's GPS data, health monitoring center 907 can create an animated world scene such as a gym, a running trail, an exercise class, a golf course or a swimming pool in which users wearing joint movement monitoring devices can interact with each other, communicating, sharing information, skeletal models and so on. For example, the voice volume of each portable device can be weighted by source based on user distance in the animated scene, so that the voice volume of those closest to a user in the animated world scene will be louder than those more distant from the user in the animated world scene. This allows users that are remotely distant to interact as if they were physically present in the animated world scene.
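The distance-based voice weighting can be sketched as follows. The inverse-distance law, function name and parameter values are illustrative assumptions; the description above states only that users nearer in the animated world scene sound louder.

```python
def voice_gain(distance_m, ref_distance_m=1.0, min_gain=0.05):
    """Per-source voice gain in the animated world scene.

    Hypothetical inverse-distance weighting: a user at or inside the
    reference distance plays at full volume; more distant users are
    attenuated, with a floor so no source becomes fully inaudible.
    """
    gain = ref_distance_m / max(distance_m, ref_distance_m)
    return max(gain, min_gain)
```

Each remote user's audio stream would then be scaled by its gain before mixing, so that scene distance maps monotonically onto loudness.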
For example, as shown in
For example, user information (name, company, etc.) may be related to the location obtained by direction and step counts from an access point and/or a Wi-Fi positioning system (WPS). User information, including location, may be shared with other users in the scene. For example, health monitoring center 907 processes the inputs using artificial intelligence. The output from the knowledge representation and reasoning (KRR), such as vital signs, biomarkers, etc., is transmitted to computer 921, other users in shared space 910 and service centers 909, which are also clients in the network. Computer 921 can display a message and provide audible feedback to user 908. Service centers 909 and other users in shared space 910 can provide assistance.
Joint movement monitoring device 911 can be used to monitor and track motion of knee joint 23. For example, a 3D magnetic sensor 111 and a 3D magnetic sensor 112 are incorporated into a primary section 24 of joint pad 27. For example, 3D magnetic sensor 111 and 3D magnetic sensor 112 may be placed directly on joint pad 27 or may be on a rigid section of a carrier board 800, as shown in
Joint pad 27 also has an opposing section 25 on which is incorporated a magnet 121. Primary section 24 and opposing section 25 of joint pad 27 are on opposite sides of knee joint 23.
The circuitry in carrier board 800, shown in
Instead of just one magnet 121, as shown in
Additional sensors may also be added. These additional sensors can include, for example, pressure sensors, inertial measurement unit (IMU) sensors, electromyography (EMG) sensors, galvanic skin response (GSR) sensors, acoustic emission (AE) sensors, micro electro-mechanical systems (MEMS) sensors, photoplethysmography (PPG) sensors, electrodermal activity (EDA) and other types of sensors including other types of biosensors.
For example,
For example, joint movement monitoring device 911 can be configured to fuse data, generate a pose of skeletal joint 23 and derive joint 23 health indicators (muscle strength, pulse rate, etc.). For example, PPG sensor 201 can provide continuous information regarding a user's pulse rate, respiratory rate and oxygen saturation. However, PPG sensors are susceptible to motion, often producing false signals due to motion artifacts. When exposed to knee 23 bending, PPG sensor 201 will move, and the cross-sectional area of the blood vessel may change. Thus, the PPG sensor 201 reading may be disturbed during movement. The distortion of the PPG waveform is strongly correlated with the knee bending. For example, a recursive least squares active noise cancellation technique using the 3D magnetic sensor 111 and 3D magnetic sensor 112 readings as input for a Finite Impulse Response (FIR) or Laguerre model can be used to compensate for noise resulting from joint movement. In the FIR model, the actual PPG sensor output p is corrupted and e is the distortion signal component added to the true signal. po is the heart portion of the PPG signal and is given by po = p − e. The estimate of e can be predicted by:
e(t) = ρ(t)ᵀβ(t)

where

ρ(t) = [θ(t−1) … θ(t−i) … θ(t−n)]ᵀ

β(t) = [a₁ … aᵢ … aₙ]ᵀ

In the above equations, θ(t) is the joint angle measured by the 3D magnetic sensor at each time step and a₁ … aₙ are coefficients to be determined with various real-time computation algorithms, including the standard Recursive Least Squares (RLS).
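The standard RLS update for the FIR artifact model above can be sketched as follows. The model order n, forgetting factor, initialization and the synthetic test signals are illustrative assumptions, not values from the disclosure.

```python
import numpy as np

def rls_motion_cancel(p, theta, n=8, lam=0.99, delta=1e3):
    """Estimate the motion artifact e(t) = rho(t)^T beta(t) with standard
    RLS, where rho(t) stacks the last n joint angles theta(t-1)..theta(t-n)
    from the 3D magnetic sensors, and return po = p - e, the heart portion
    of the PPG signal."""
    beta = np.zeros(n)              # FIR coefficients a1..an
    P = np.eye(n) * delta           # inverse correlation matrix estimate
    po_hat = np.zeros(len(p))
    for t in range(n, len(p)):
        rho = theta[t - n:t][::-1]  # [theta(t-1), ..., theta(t-n)]
        e_hat = rho @ beta          # predicted motion artifact
        err = p[t] - e_hat          # residual = estimated clean PPG sample
        k = P @ rho / (lam + rho @ P @ rho)
        beta = beta + k * err
        P = (P - np.outer(k, rho @ P)) / lam
        po_hat[t] = err
    return po_hat
```

Because the joint angle drives only the artifact and not the cardiac component, the filter converges to the artifact coefficients and the residual approximates po.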
Alternatively, histogram and factorial design can be used to derive Venous Recovery Time (VRT) in daily life for detecting and preventing varicose veins at an earlier stage. A change in knee angle drives the venous blood in the veins in the proximal direction toward the heart, or refills the veins in the legs with blood, and this produces a PPG signal wave. VRT, the duration from bending until the signal returns to its baseline at rest, is a strong indication of possible incompetency of the venous valves.
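A minimal sketch of extracting VRT from a PPG trace, assuming the knee-bend instant is already known from the 3D magnetic sensor joint angle; the baseline window, tolerance band and function names are illustrative assumptions.

```python
import numpy as np

def venous_recovery_time(ppg, bend_idx, fs, baseline_window_s=2.0, tol=0.05):
    """Seconds from the knee-bend event at sample bend_idx until the PPG
    signal returns to its resting baseline, or None if it never returns.

    Baseline is the median of the window preceding the bend; "returned"
    means back within a tolerance band after having left it.
    """
    start = max(0, bend_idx - int(baseline_window_s * fs))
    baseline = np.median(ppg[start:bend_idx])
    band = tol * max(np.abs(ppg).max(), 1e-9)
    left = False
    for i in range(bend_idx, len(ppg)):
        if abs(ppg[i] - baseline) > band:
            left = True            # signal has deviated after the bend
        elif left:
            return (i - bend_idx) / fs   # first return to baseline
    return None
```

A prolonged return time relative to a per-user histogram of daily VRT values would then flag possible venous valve incompetency for follow-up.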
For example, an expanded section 70 of
In the above equation, Bx1, By1 and Bz1 are magnetic sensor readings.
For example, the global reference frame may be further referenced by gravity. For example,
For example, vector algebra provides a means to calculate the joint absolute angle ϕ in the midstance phase of a running/walking gait cycle using the apparent gravity vectors from the IMU 191 and IMU 192 accelerometer readings. The scalar product a·b between the two IMU accelerometer vectors a and b yields the joint angle ϕ, since a·b = |a||b| cos ϕ. This result is easily proved by applying the triangle cosine theorem to the triangle with sides comprised of the vectors a, b and a − b.
In the equation above, ax, ay, az are IMU 191 acceleration readings, and bx, by, bz are IMU 192 acceleration readings.
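The scalar-product computation described above can be sketched as follows; the function name and the choice to return degrees are assumptions for illustration.

```python
import math

def joint_angle(a, b):
    """Absolute joint angle phi, in degrees, from the apparent gravity
    vectors a (IMU 191) and b (IMU 192) at midstance:
    cos(phi) = (a . b) / (|a| |b|)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    # Clamp to guard against rounding pushing the ratio outside [-1, 1].
    c = max(-1.0, min(1.0, dot / (na * nb)))
    return math.degrees(math.acos(c))
```

Because only the ratio of the scalar product to the vector magnitudes matters, the result is insensitive to accelerometer scale, which is why it suits a calibration-light starting angle.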
The joint absolute angle ϕ calculated above may then be used as the starting angle for motion tracking in the remaining phases of the running/walking gait cycle. 3D Hall sensor data is then fused in to determine relative motion between thigh 22 and shank 21 during fast joint movement, in order to minimize IMU drift and improve accuracy. Both absolute and relative movement of the joint can be monitored with this configuration.
Calibration-free health measurement is highly desired for daily health assessment. 3D magnetic sensor 111 and 3D magnetic sensor 112, providing magnetic field detection in the x, y and z directions and being linked together with a defined separation between them, can provide calibration-free joint movement tracking. For example,
c² = a² + d² − 2 × a × d × cos θ   (Equation 1)
In Equation 1, a, c and d are spatial distances, as shown in
Also, for Equation 1, a can be calculated as set out in the Equation below
In the equation above, L is the distance between 3D magnetic sensor 111 and 3D magnetic sensor 112. Bx1, By1, Bz1 are readings from 3D magnetic sensor 111 (S1). Bx2, By2, Bz2 are readings from 3D magnetic sensor 112 (S2). a and θ are known values.
At another sampling point, M1 is rotated around the X axis to another position, as shown in Equation 2 below:
c² = a′² + d² − 2 × a′ × d × cos θ′   (Equation 2)
In Equation 2, a′, c and d are spatial distances and θ′ is the spatial angle between a′ and d. The magnet rotation radius c and the distance d do not change during joint rotation.
By solving Equation 1 and Equation 2 together, the magnet rotation radius c and the distance d can be derived.
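Solving the two equations has a closed form: subtracting Equation 2 from Equation 1 eliminates c², giving d directly, after which c follows from Equation 1. A minimal sketch, with an assumed function name and angles in radians:

```python
import math

def solve_c_d(a, theta, a_p, theta_p):
    """Solve Equations 1 and 2 for the magnet rotation radius c and the
    distance d, given two sampling points (a, theta) and (a', theta').

    Eq1 - Eq2:  0 = a^2 - a'^2 - 2*d*(a*cos(theta) - a'*cos(theta'))
    =>          d = (a^2 - a'^2) / (2*(a*cos(theta) - a'*cos(theta')))
    """
    denom = 2.0 * (a * math.cos(theta) - a_p * math.cos(theta_p))
    d = (a * a - a_p * a_p) / denom
    # Back-substitute into Equation 1 to recover c.
    c = math.sqrt(a * a + d * d - 2.0 * a * d * math.cos(theta))
    return c, d
```

Degenerate sampling points with a·cos θ ≈ a′·cos θ′ make the denominator vanish, which is one reason well-separated joint positions are needed.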
Joint moving angle ϕ is understood to be:
For a general placement of joint movement monitoring device 911, 3D magnetic sensor 111 and 3D magnetic sensor 112, the local measuring coordinate frame may not be in line with the global coordinate frame. The sensor readings must be modified with the following transfer function before applying the above kinematic model:
In the equation above, angle σ is the Z axis rotation angle of the 3D magnetic sensor measuring coordinate frame for lining it up with the joint moving coordinate frame, and angle ρ is the Y axis rotation angle of the 3D magnetic sensor measuring coordinate frame for lining it up with the joint moving coordinate frame.
A third measuring point is needed to calculate angle σ and angle ρ using a least-squares method with machine learning (ML) and artificial intelligence (AI), taking into account that magnet 121 always moves in the same plane. This is illustrated by the 3D plot shown in
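One way the same-plane constraint can be exploited is sketched below: fit the best plane through three or more magnet position samples by least squares (via singular value decomposition, standing in here for the ML/AI fitting named above) and read the alignment angles off the plane normal. The convention that σ and ρ are the Z- and Y-axis angles derived from the normal's direction is an assumption for illustration.

```python
import numpy as np

def alignment_angles(points):
    """Least-squares plane fit through the sampled magnet positions,
    returning (sigma, rho) in radians: the Z-axis and Y-axis rotation
    angles that line the measuring frame up with the plane of motion."""
    P = np.asarray(points, dtype=float)
    centered = P - P.mean(axis=0)
    # The plane normal is the right singular vector with the smallest
    # singular value (the direction of least variance).
    _, _, vt = np.linalg.svd(centered)
    n = vt[-1]
    if n[2] < 0:
        n = -n                      # fix the sign convention
    sigma = np.arctan2(n[1], n[0])                 # Z-axis rotation angle
    rho = np.arctan2(np.hypot(n[0], n[1]), n[2])   # Y-axis rotation angle
    return sigma, rho
```

With three non-collinear samples the plane is exact; additional samples make the fit a genuine least-squares estimate that averages out sensor noise.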
For example, the kinematic model illustrated in
For example, the look-ahead algorithm can be implemented using the following steps:
Repeat steps two through six with the updated magnet location. The path tracking is aborted when the goal is placed outside the desired path.
For example, an algorithm that employs a quaternion representation of orientation and is not subject to the problematic singularities associated with Euler angles may be used for effective performance at low computational expense.
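A minimal sketch of such a quaternion-based orientation update, assuming simple integration of gyroscope rates with renormalization; the disclosure does not name a specific algorithm, so this stands in for the singularity-free representation it describes.

```python
import math

def quat_integrate(q, gyro, dt):
    """One orientation update step using a unit quaternion q = (w, x, y, z)
    and body angular rates gyro = (wx, wy, wz) in rad/s. Unlike Euler
    angles, this parameterization has no gimbal-lock singularities."""
    w, x, y, z = q
    gx, gy, gz = gyro
    # Quaternion kinematics: q_dot = 0.5 * q (x) (0, gx, gy, gz)
    qd = (0.5 * (-x * gx - y * gy - z * gz),
          0.5 * ( w * gx + y * gz - z * gy),
          0.5 * ( w * gy - x * gz + z * gx),
          0.5 * ( w * gz + x * gy - y * gx))
    q = tuple(qi + qdi * dt for qi, qdi in zip(q, qd))
    # Renormalize to stay on the unit sphere despite integration error.
    norm = math.sqrt(sum(qi * qi for qi in q))
    return tuple(qi / norm for qi in q)
```

The per-step cost is a handful of multiplies and one square root, which is why quaternion integration is attractive for low computational expense on a wearable controller.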
For example, IMU 191 is a 6-axis motion processing unit with an integrated 3-axis gyroscope and a 3-axis accelerometer. 3D magnetic sensor 111 and 3D magnetic sensor 112 are configured to detect magnetic field changes within their detection regions as produced by magnet 121 and any other magnets that are moving. As discussed above, other magnets may be added, for example, such as magnet 122 shown in
For example, the skeletal joint movement may also be supplied as input to a health monitoring module 653, and may be combined with other health indicators, such as muscle strength, pulse rate and so on, to derive useful health information, as well as to monitor warning signs of ill health. The result can be monitored and tracked in real-time on a display 657 along with the animation.
In block 882, additional signals are transmitted from additional sensors to a controller. For the example of joint movement monitoring device 911 as shown in
In block 883 a controller signal is generated from the 3D magnetic sensor signal and the output of other sensors. For the example of joint movement monitoring device 911 as shown in
For example, optionally, in a block 854 the controller signal is transmitted to a receiver. For the example of joint movement monitoring device 911 as shown in
For example, optionally, in a block 855 a transceiver signal is transmitted to a remote processor. For the example of joint movement monitoring device 911 as shown in
The foregoing discussion discloses and describes merely exemplary methods and embodiments. As will be understood by those familiar with the art, the disclosed subject matter may be embodied in other specific forms without departing from the spirit or characteristics thereof. Accordingly, the present disclosure is intended to be illustrative, but not limiting, of the scope of the invention, which is set forth in the following claims.
Number | Name | Date | Kind |
---|---|---|---|
4737710 | Van Antwerp et al. | Apr 1988 | A |
5792077 | Gomes | Aug 1998 | A |
5826578 | Curchod | Oct 1998 | A |
5980472 | Seyl | Nov 1999 | A |
6050962 | Kramer et al. | Apr 2000 | A |
6872187 | Stark et al. | Mar 2005 | B1 |
8165844 | Luinge et al. | Apr 2012 | B2 |
8246462 | Tran et al. | Aug 2012 | B1 |
8279091 | Tran et al. | Oct 2012 | B1 |
8421448 | Tran et al. | Apr 2013 | B1 |
9019349 | Richardson | Apr 2015 | B2 |
9119569 | Chen et al. | Sep 2015 | B2 |
9551562 | Janisch | Jan 2017 | B2 |
9642572 | Mahfouz et al. | May 2017 | B2 |
10061891 | Grundlehner et al. | Aug 2018 | B2 |
10278647 | Salehizadeh et al. | May 2019 | B2 |
20060113990 | Schodlbauer | Jun 2006 | A1 |
20130217998 | Mahfouz | Aug 2013 | A1 |
20160242646 | Obma | Aug 2016 | A1 |
20160246369 | Osman | Aug 2016 | A1 |
20160338621 | Kanchan | Nov 2016 | A1 |
20160338644 | Connor | Nov 2016 | A1 |
20170090668 | Chen et al. | Mar 2017 | A1 |
20170181689 | Lin | Jun 2017 | A1 |
20170231533 | Qu | Aug 2017 | A1 |
20170277138 | Kaji | Sep 2017 | A1 |
20190283247 | Chang | Sep 2019 | A1 |
20200033958 | Bieglmayer | Jan 2020 | A1 |
20200054288 | Vural | Feb 2020 | A1 |
Entry |
---|
Zhao, "A Review of Wearable IMU (Inertial-Measurement-Unit)-based Pose Estimation and Drift Reduction Technologies", IOP Conf. Series: Journal of Physics: Conf. Series 1087 (2018). |
Orlando Adas Saliba Junior et al., "Pre- and Postoperative Evaluation by Photoplethysmography in Patients Receiving Surgery for Lower-Limb Varicose Veins", International Journal of Vascular Medicine, vol. 2014, Article ID 562782, Feb. 2014. |
Ryan Clark, et al., “Tracking joint angles during whole-arm movements using electromagnetic sensors”, Brigham Young University Faculty Publications 4174, BYU ScholarsArchive, https://scholarsarchive.byu.edu/facpub/4174, 2020. |
Number | Date | Country | |
---|---|---|---|
63019432 | May 2020 | US |