This application claims the priority benefit of China application serial no. 202310783234.7, filed on Jun. 29, 2023 and China application serial no. 202310784631.6, filed on Jun. 29, 2023. The entirety of each of the above-mentioned patent applications is hereby incorporated by reference herein and made a part of this specification.
The disclosure relates to an image analysis technology, and in particular to a posture comparison method, an electronic device, and a computer-readable storage medium.
In modern life, learning exercises through various exercise teaching videos has become a very common way of learning. However, for the general public, it is difficult to objectively judge whether the actions they are doing are consistent with the demonstration actions presented in the exercise teaching videos.
Embodiments of the disclosure provide a posture comparison method, an electronic device, and a computer-readable storage medium, which can be used to solve the above technical issues.
An embodiment of the disclosure provides a posture comparison method, which is applicable to an electronic device and includes the following steps. M first image frames of a first video stream within a specific time interval are obtained, and N second image frames of a second video stream within the specific time interval are obtained, where M and N are positive integers. K first joint points corresponding to K specified joints are determined in each of the first image frames, and K second joint points corresponding to the K specified joints are determined in each of the second image frames. The K first joint points respectively correspond to the K second joint points, and K is a positive integer. An m-th first image frame among the M first image frames and an n-th second image frame among the N second image frames are obtained, where 1≤m≤M and 1≤n≤N. A difference between each of the first joint points and each of the corresponding second joint points in the m-th first image frame and the n-th second image frame is determined, and a posture difference degree corresponding to the specific time interval is determined accordingly. In another embodiment, a posture correction recommendation corresponding to the specific time interval is also provided accordingly.
An embodiment of the disclosure provides an electronic device, which includes a storage circuit and a processor. The storage circuit stores a program code. The processor is coupled to the storage circuit, accesses the program code and executes the following steps. M first image frames of a first video stream within a specific time interval are obtained, and N second image frames of a second video stream within the specific time interval are obtained, where M and N are positive integers. K first joint points corresponding to K specified joints are determined in each of the first image frames, and K second joint points corresponding to the K specified joints are determined in each of the second image frames. The K first joint points respectively correspond to the K second joint points, and K is a positive integer. An m-th first image frame among the M first image frames and an n-th second image frame among the N second image frames are obtained, where 1≤m≤M and 1≤n≤N. A difference between each of the first joint points and each of the corresponding second joint points in the m-th first image frame and the n-th second image frame is determined, and a posture difference degree corresponding to the specific time interval is determined accordingly. In another embodiment, a posture correction recommendation corresponding to the specific time interval is also provided accordingly.
An embodiment of the disclosure provides a computer-readable storage medium, which records an executable computer program. The executable computer program is loaded by a posture comparison device to execute the following steps. M first image frames of a first video stream within a specific time interval are obtained, and N second image frames of a second video stream within the specific time interval are obtained, where M and N are positive integers. K first joint points corresponding to K specified joints are determined in each of the first image frames, and K second joint points corresponding to the K specified joints are determined in each of the second image frames. The K first joint points respectively correspond to the K second joint points, and K is a positive integer. An m-th first image frame among the M first image frames and an n-th second image frame among the N second image frames are obtained, where 1≤m≤M and 1≤n≤N. A difference between each of the first joint points and each of the corresponding second joint points in the m-th first image frame and the n-th second image frame is determined, and a posture difference degree corresponding to the specific time interval is determined accordingly. In another embodiment, a posture correction recommendation corresponding to the specific time interval is also provided accordingly.
The drawings are included to provide a further understanding of the disclosure, and the drawings are incorporated into the specification and constitute a part of the specification. The drawings illustrate embodiments of the disclosure and serve to explain principles of the disclosure together with the description.
Reference will now be made in detail to the exemplary embodiments of the disclosure, examples of which are illustrated in the drawings. Wherever possible, the same reference numerals are used in the drawings and the description to refer to the same or similar parts.
Please refer to
In this embodiment, the electronic device 100 includes a storage circuit 102 and a processor 104. The storage circuit 102 is, for example, configured to record the program codes and modules accessible to the processor 104.
The processor 104 is coupled to the storage circuit 102 and may be a general purpose processor, a special purpose processor, a conventional processor, a digital signal processor, multiple microprocessors, one or more microprocessors combined with a digital signal processor core, a controller, a microcontroller, an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), any other type of integrated circuits, a state machine, a processor based on an advanced reduced instruction set computer (RISC) machine (ARM), and the like.
In the embodiment of the disclosure, the processor 104 may access the modules and the program codes recorded in the storage circuit 102 to implement the methods (for example, a posture comparison method in a first embodiment and a posture recommendation method in a second embodiment) provided in the disclosure, and the details thereof are described below.
Please refer to
First, in Step S210, the processor 104 obtains M first image frames of a first video stream within a specific time interval (hereinafter referred to as T), and obtains N second image frames of a second video stream within the specific time interval T, where M and N are positive integers.
Please refer to
In this embodiment, the second user 312 is, for example, an action demonstrator, such as a coach, and the second video stream is, for example, a video in which the second user 312 demonstrates the specific action (for example, sit-ups). In addition, the first user 311 is, for example, a person such as a student who executes the specific action with reference to the second video stream. In an embodiment, when the first user 311 watches the second video stream and imitates the second user 312 to execute the specific action, the relevant process may be recorded as the first video stream by a camera or other imaging device, and provided to the electronic device 100.
In an embodiment, the camera or the imaging device may be built in the electronic device 100 or externally connected to the electronic device 100 by wire or wirelessly. In an embodiment, the first video stream and/or the second video stream may be captured in real time by the electronic device 100 or obtained by the electronic device 100 from a relevant storage device, but not limited thereto.
In an embodiment, M may be determined according to the frame rate of the first video stream and the length of the considered specific time interval T. For example, assuming that the frame rate of the first video stream is 30 frames per second, and the length of the specific time interval T is 1 second, then M is, for example, 30 (that is, 30*1). For another example, assuming that the frame rate of the first video stream is 30 frames per second, and the length of the specific time interval T is 1.5 seconds, then M is, for example, 45 (that is, 30*1.5).
Similarly, N may be determined according to the frame rate of the second video stream and the length of the considered specific time interval T. For example, assuming that the frame rate of the second video stream is 25 frames per second, and the length of the specific time interval T is 1 second, then N is, for example, 25 (that is, 25*1). For another example, assuming that the frame rate of the second video stream is 25 frames per second, and the length of the specific time interval T is 2 seconds, then N is, for example, 50 (that is, 25*2).
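For illustration only, the arithmetic above may be sketched as follows in Python; the function name and the rounding behavior are assumptions of this sketch and are not limiting.

```python
# A minimal sketch of how M and N might be derived from frame rate and
# interval length; an actual implementation may round or clamp differently.
def frame_count(frames_per_second: float, interval_seconds: float) -> int:
    return int(frames_per_second * interval_seconds)

M = frame_count(30, 1.5)  # 45 first image frames
N = frame_count(25, 2.0)  # 50 second image frames
```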
In an embodiment, the specific time interval T may be determined by the designer according to requirements and may be used as a time unit for the electronic device 100 to determine a posture difference degree between the first user 311 and the second user 312 executing the specific action.
For example, assuming that the specific time interval T is set to be from an A-th second to a B-th second (that is, the length of the specific time interval T is (B−A) seconds), then the processor 104 may obtain the image frames corresponding to the A-th second to the B-th second from the first video stream as the M first image frames, and obtain the image frames corresponding to the A-th second to the B-th second from the second video stream as the N second image frames in Step S210, but not limited thereto.
In Step S220, the processor 104 determines K first joint points corresponding to K specified joints in each first image frame, and determines K second joint points corresponding to the K specified joints in each second image frame, where K is a positive integer.
In an embodiment, the K specified joints are, for example, K joints of interest that may be used to determine the posture difference degree between the first user 311 and the second user 312. In the embodiment of
In an embodiment, the value of K (for example, 21) may be determined by the designer according to requirements and the considered specific action, but not limited thereto. In another embodiment, the processor 104 considers the same K specified joints for different specific actions.
In an embodiment, after obtaining the M first image frames, the processor 104 may execute a relevant joint point recognition algorithm (for example, DeepPose) on each first image frame to determine the position of each body joint of the first user 311 in each first image frame, and then find joints corresponding to the K specified joints from the body joints in each first image frame as the K first joint points.
Similarly, after obtaining the N second image frames, the processor 104 may execute a relevant joint point recognition algorithm (for example, DeepPose) on each second image frame to determine the position of each body joint of the second user 312 in each second image frame, and then find joints corresponding to the K specified joints from the body joints in each second image frame as the K second joint points, but not limited thereto.
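For illustration only, the joint point extraction described above may be sketched as follows; the estimate_body_joints function stands in for whatever joint point recognition algorithm (for example, DeepPose) is actually used, and the joint names listed are hypothetical examples rather than the K specified joints of any particular embodiment.

```python
import numpy as np

# Hypothetical pose-estimation call; in practice this would wrap an actual
# joint point recognition model (for example, DeepPose) and return a mapping
# from joint name to (x, y) pixel coordinates for one image frame.
def estimate_body_joints(image_frame) -> dict:
    raise NotImplementedError("plug in an actual joint point recognition model")

# Example list of K specified joints; the actual joints are chosen by the designer.
SPECIFIED_JOINTS = ["left_knee", "right_knee", "left_hip", "right_hip"]

def extract_joint_points(image_frames, specified_joints=SPECIFIED_JOINTS) -> np.ndarray:
    """Return an array of shape (num_frames, K, 2) holding the pixel
    coordinates of the K specified joints in every frame."""
    points = []
    for frame in image_frames:
        all_joints = estimate_body_joints(frame)
        points.append([all_joints[name] for name in specified_joints])
    return np.asarray(points, dtype=float)
```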
In an embodiment of the disclosure, the K first joint points correspond to the K second joint points one-to-one. In this case, the first and second joint points corresponding to each other all correspond to the same specified joint. For example, assuming that an i-th first joint point among the K first joint points corresponds to a knee joint, then an i-th second joint point among the K second joint points also corresponds to the knee joint. For another example, assuming that a j-th first joint point among the K first joint points corresponds to a hip joint, then a j-th second joint point among the K second joint points also corresponds to the hip joint, but not limited thereto (where i and j are index values).
Afterwards, in Step S230, the processor 104 obtains an m-th first image frame among the M first image frames and an n-th second image frame among the N second image frames, where 1≤m≤M and 1≤n≤N.
Next, in Step S240, the processor 104 determines the difference between each first joint point and each corresponding second joint point in the m-th first image frame and the n-th second image frame, and determines the posture difference degree corresponding to the specific time interval T accordingly.
In the first embodiment, the details of Step S240 will be illustrated with reference to Steps S410 to S440 below.
In Step S410, the processor 104 determines a reference three-dimensional data structure W0 based on the difference between each first joint point and each corresponding second joint point in the m-th first image frame and the n-th second image frame, wherein the reference three-dimensional data structure W0 has M×N×K reference data elements.
During the process of executing Step S410, the processor 104 may first initialize the reference three-dimensional data structure W0. For example, the processor 104 may establish a data array with a dimension of M×N×K as the initialized reference three-dimensional data structure W0.
Afterwards, the processor 104 may obtain a k-th first joint point among the K first joint points in the m-th first image frame as a first reference joint point, and obtain a k-th second joint point among the K second joint points in the n-th second image frame as a second reference joint point, where 1≤k≤K. Afterwards, the processor 104 may determine a position difference between the first reference joint point and the second reference joint point, and set a specific data element in the reference three-dimensional data structure W0 accordingly, wherein the specific data element corresponds to the m-th first image frame, the n-th second image frame, and a k-th specified joint among the K specified joints.
In the first embodiment, the position of the specific data element in the reference three-dimensional data structure W0 is, for example, (m, n, k), and the specific data element may be correspondingly expressed as W0(m, n, k), but not limited thereto.
In an embodiment, when determining the position difference between the first reference joint point and the second reference joint point, the processor 104 may be configured to: obtain first pixel coordinates of the first reference joint point in the m-th first image frame; obtain second pixel coordinates of the second reference joint point in the n-th second image frame; and determine the position difference between the first reference joint point and the second reference joint point based on a distance between the first pixel coordinates and the second pixel coordinates.
For example, it is assumed that the k-th specified joint (which corresponds to the k-th first joint point and the k-th second joint point) is a knee joint, and the m-th first image frame and the n-th second image frame are respectively the first image frame 321 and the second image frame 322 in
In some embodiments, the processor 104 may, for example, determine the (normalized) Euclidean distance between the first reference joint point and the second reference joint point as the position difference, but not limited thereto.
After obtaining the position difference, the processor 104 may set the content of the reference data element (that is, W0(m, n, k)) at the position (m, n, k) in the reference three-dimensional data structure W0 as the position difference (for example, the (normalized) Euclidean distance between the first reference joint point and the second reference joint point), but not limited thereto.
In the embodiment of the disclosure, the processor 104 may determine the content of each reference data element in the reference three-dimensional data structure W0 based on the above mechanism.
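For illustration only, the construction of the reference three-dimensional data structure W0 may be sketched as follows; the use of the Euclidean distance normalized by the image diagonal is an assumption of this sketch, as the exact normalization is left open above.

```python
import numpy as np

def build_reference_structure(first_points, second_points, image_diagonal=1.0):
    """first_points: (M, K, 2) pixel coordinates of the K first joint points
    in each of the M first image frames; second_points: (N, K, 2) pixel
    coordinates of the K second joint points in each of the N second image
    frames. Returns W0 with shape (M, N, K), where W0[m, n, k] is the
    (normalized) Euclidean distance between the k-th first joint point in the
    m-th first image frame and the k-th second joint point in the n-th
    second image frame."""
    M = first_points.shape[0]
    N = second_points.shape[0]
    K = first_points.shape[1]
    W0 = np.zeros((M, N, K))
    for m in range(M):
        for n in range(N):
            diff = first_points[m] - second_points[n]            # shape (K, 2)
            W0[m, n] = np.linalg.norm(diff, axis=1) / image_diagonal
    return W0
```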
In an embodiment, the reference three-dimensional data structure W0 may be regarded as K reference two-dimensional data structures W01 to W0K, each of which has M×N reference data elements and corresponds to one of the K specified joints. That is, the reference data elements corresponding to the same specified joint in the reference three-dimensional data structure W0 form one reference two-dimensional data structure.
In Step S420, the processor 104 determines a first two-dimensional data structure W1 through combining the K reference two-dimensional data structures, wherein the first two-dimensional data structure W1 has M×N first data elements.
During the process of determining the first two-dimensional data structure W1, the processor 104 may first initialize the first two-dimensional data structure W1. For example, the processor 104 may first establish a data array with a dimension of M×N as the initialized first two-dimensional data structure W1, but not limited thereto. After that, the processor 104 may find the reference data elements corresponding to the m-th first image frame and the n-th second image frame in each of the reference two-dimensional data structures W01 to W0K, and determine the first data element corresponding to the m-th first image frame and the n-th second image frame in the first two-dimensional data structure W1 accordingly.
Please refer to
Similarly, the reference data element corresponding to the m-th first image frame and the n-th second image frame in the reference two-dimensional data structure W0K is, for example, the reference data element located at (m, n), and the reference data element may be expressed as W0K(m, n). For example, the reference data element W0K(1, 1) is, for example, the reference data element corresponding to a 1-st first image frame and a 1-st second image frame in the reference two-dimensional data structure W0K; the reference data element W0K(1, N) is, for example, the reference data element corresponding to the 1-st first image frame and an N-th second image frame in the reference two-dimensional data structure W0K; the reference data element W0K(M, 1) is, for example, the reference data element corresponding to an M-th first image frame and the 1-st second image frame in the reference two-dimensional data structure W0K; and the reference data element W0K(M, N) is, for example, the reference data element corresponding to the M-th first image frame and the N-th second image frame in the reference two-dimensional data structure W0K.
In addition, the first data element corresponding to the m-th first image frame and the n-th second image frame in the first two-dimensional data structure W1 is, for example, the first data element located at (m, n), and the first data element may be expressed as W1(m, n). For example, the first data element W1(1, 1) is, for example, the first data element corresponding to a 1-st first image frame and a 1-st second image frame in the first two-dimensional data structure W1; the first data element W1(1, N) is, for example, the first data element corresponding to the 1-st first image frame and an N-th second image frame in the first two-dimensional data structure W1; the first data element W1(M, 1) is, for example, the first data element corresponding to an M-th first image frame and the 1-st second image frame in the first two-dimensional data structure W1; and the first data element W1(M, N) is, for example, the first data element corresponding to the M-th first image frame and the N-th second image frame in the first two-dimensional data structure W1.
Based on this, during the process of determining the first two-dimensional data structure W1, the processor 104 may respectively find the reference data elements W01(m, n) to W0K(m, n) in the reference two-dimensional data structures W01 to W0K, and determine the first data element W1(m, n) in the first two-dimensional data structure W1 accordingly.
In an embodiment, the processor 104 may, for example, determine the first data element W1(m, n) based on statistical characteristics (for example, a mean value) and/or a linear combination of the reference data elements W01(m, n) to W0K(m, n). In an embodiment, the processor 104 may first normalize the reference data elements W01(m, n) to W0K(m, n) and then take the statistical characteristics and/or the linear combination as the first data element W1(m, n), but not limited thereto.
For example, the processor 104 may be configured to determine the first data element W1(1, 1) (for example, the value 4 located at the upper left corner of the first two-dimensional data structure W1) based on the statistical characteristics and/or the linear combination of the reference data elements W01(1, 1) to W0K(1, 1).
For other first data elements in the first two-dimensional data structure W1, the processor 104 may determine based on similar principles, and the details will not be repeated here.
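For illustration only, the combination of the K reference two-dimensional data structures into the first two-dimensional data structure W1 may be sketched as follows; taking the mean over the joint axis is one of the statistical characteristics mentioned above, and the optional weights illustrate a linear combination.

```python
import numpy as np

def combine_reference_structures(W0, weights=None):
    """W0: array of shape (M, N, K), viewed as K reference two-dimensional
    data structures W01 to W0K. Returns W1 with shape (M, N), where
    W1[m, n] is the mean (or a weighted linear combination) of
    W01(m, n) to W0K(m, n)."""
    if weights is None:
        return W0.mean(axis=2)
    weights = np.asarray(weights, dtype=float)
    return (W0 * weights).sum(axis=2) / weights.sum()
```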
Next, in Step S430, the processor 104 finds a surrounding element corresponding to each first data element, and generates M×N second data elements accordingly, wherein the M×N second data elements form a second two-dimensional data structure W2.
During the process of executing Step S430, the processor 104 may first initialize the second two-dimensional data structure W2. For example, the processor 104 may first establish a data array with a dimension of M×N as the initialized second two-dimensional data structure W2. In this embodiment, the second data element corresponding to the m-th first image frame and the n-th second image frame is, for example, the second data element at the position (m, n) in the second two-dimensional data structure W2, and may be expressed as W2(m, n).
In an embodiment, when m and n are both 1, the processor 104 may determine that the second data element W2(m, n) is the first data element W1(m, n). That is, when determining the second data element W2(1, 1) located at the upper left corner of the second two-dimensional data structure W2, the processor 104 may directly use the first data element W1(1, 1) at the upper left corner of the first two-dimensional data structure W1 as the second data element W2(1, 1), but not limited thereto.
In addition, when m or n is not 1, the processor 104 may obtain the surrounding element corresponding to the first data element W1(m, n) in the second two-dimensional data structure W2, and add the first data element W1(m, n) and the corresponding surrounding element into the second data element W2(m, n).
In an embodiment, the surrounding element corresponding to the first data element W1(m, n) in the second two-dimensional data structure W2 includes at least one of the second data elements W2(m−1, n), W2(m, n−1), and W2(m−1, n−1). In addition, the surrounding element corresponding to the first data element W1(m, n) in the second two-dimensional data structure W2 may include the smallest of the second data elements W2(m−1, n), W2(m, n−1), and W2(m−1, n−1).
For example, when determining the second data element W2(1, 2), the processor 104 may find the surrounding element, such as the second data element W2(1, 1), corresponding to the first data element W1(1, 2) in the second two-dimensional data structure W2. Afterwards, the processor 104 may add the first data element W1(1, 2) and the second data element W2(1, 1) into the second data element W2(1, 2) (that is, 1+4=5).
For example, when determining the second data element W2(2, 1), the processor 104 may find the surrounding element, such as the second data element W2(1, 1), corresponding to the first data element W1(2, 1) in the second two-dimensional data structure W2. Afterwards, the processor 104 may add the first data element W1(2, 1) and the second data element W2(1, 1) into the second data element W2(2, 1) (that is, 9+4=13).
As another example, when determining the second data element W2(2, 2), the processor 104 may find the surrounding element, such as the smallest of the second data elements W2(1, 1), W2(1, 2), and W2(2, 1) (that is, 4 corresponding to the second data element W2(1, 1)), corresponding to the first data element W1(2, 2) in the second two-dimensional data structure W2. Afterwards, the processor 104 may add the first data element W1(2, 2) and the second data element W2(1, 1) into the second data element W2(2, 2) (that is, 4+4=8).
In short, when m or n is not 1, the processor 104 may select one (such as the smallest) of the second data elements W2(m−1, n), W2(m, n−1), and W2(m−1, n−1) to be added with the first data element W1(m, n) into the second data element W2(m, n). From another point of view, the processor 104 may also be understood as selecting one of the second data elements on the left, the upper left, and the top of the second data element W2(m, n) to be added with the first data element W1(m, n), so as to determine the second data element W2(m, n), but not limited thereto.
Based on the above mechanism, the processor 104 may correspondingly determine other second data elements in the second two-dimensional data structure W2, and the details thereof will not be repeated here.
Afterwards, in Step S440, the processor 104 determines the posture difference degree corresponding to the specific time interval T based on the second two-dimensional data structure W2.
In an embodiment, the processor 104 may find the second data element corresponding to the M-th first image frame among the M first image frames and the N-th second image frame among the N second image frames in the M×N second data elements as a second specific data element (that is, a second data element W2(M, N)), and determine the second specific data element as the posture difference degree corresponding to the specific time interval T.
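For illustration only, Steps S430 and S440 may be sketched as follows; the accumulation mirrors a classic dynamic-time-warping cost matrix, and only the left, top, and upper-left neighbours named above are considered.

```python
import numpy as np

def accumulate(W1):
    """W1: (M, N). Returns the second two-dimensional data structure W2:
    W2[0, 0] = W1[0, 0]; otherwise W2[m, n] equals W1[m, n] plus the
    smallest of the already-computed neighbours on the left, the top, and
    the upper left (0-based indexing is used here)."""
    M, N = W1.shape
    W2 = np.zeros((M, N))
    W2[0, 0] = W1[0, 0]
    for m in range(M):
        for n in range(N):
            if m == 0 and n == 0:
                continue
            neighbours = []
            if m > 0:
                neighbours.append(W2[m - 1, n])
            if n > 0:
                neighbours.append(W2[m, n - 1])
            if m > 0 and n > 0:
                neighbours.append(W2[m - 1, n - 1])
            W2[m, n] = W1[m, n] + min(neighbours)
    return W2

def posture_difference_degree(W2):
    # The second specific data element W2(M, N), i.e. the bottom-right element.
    return W2[-1, -1]
```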
In
For other specific time intervals, the processor 104 may determine the corresponding posture difference degree based on the above mechanism. In this way, the first user 311 may know whether the executed action is similar to that of the second user 312, and may execute corresponding action correction accordingly.
However, for a user with less knowledge about the relevant exercise, knowing only whether the actions are similar may still not be enough to perform the action correctly. Based on this, an embodiment of the disclosure further provides a posture recommendation method, which may be used to provide relevant action correction recommendations to the user, and the relevant details are described as follows.
Please refer to
In Step S710, the processor 104 obtains the M first image frames of the first video stream within the specific time interval T, and obtains the N second image frames of the second video stream within the specific time interval T. In Step S720, the processor 104 determines the K first joint points corresponding to the K specified joints in each first image frame, and determines the K second joint points corresponding to the K specified joints in each second image frame. In Step S730, the processor 104 obtains the m-th first image frame among the M first image frames and the n-th second image frame among the N second image frames.
In the second embodiment, reference may be made to the relevant description of Steps S210 to S230 in the first embodiment for the details of Steps S710 to S730, which will not be repeated here.
In Step S740, the processor 104 determines the difference between each first joint point and each corresponding second joint point in the m-th first image frame and the n-th second image frame, and accordingly provides a posture correction recommendation corresponding to the specific time interval T.
In the second embodiment, the details of Step S740 will be explained with reference to Steps S810 to S840 below.
In Step S810, the processor 104 determines the reference three-dimensional data structure W0 based on the difference between each first joint point and each corresponding second joint point in the m-th first image frame and the n-th second image frame. In Step S820, the processor 104 determines the first two-dimensional data structure W1 through combining the K reference two-dimensional data structures W01 to W0K. In Step S830, the processor 104 finds the surrounding element corresponding to each first data element, and generates the M×N second data elements accordingly, wherein the M×N second data elements form the second two-dimensional data structure W2.
In the second embodiment, reference may be made to the relevant description of Steps S410 to S430 in the first embodiment for the details of Steps S810 to S830, which will not be repeated here.
In Step S840, the processor 104 determines the posture correction recommendation corresponding to the specific time interval T based on the second two-dimensional data structure W2.
In the second embodiment, the details of Step S840 will be illustrated with reference to Steps S910 to S940 below.
First, in Step S910, the processor 104 may find the second data element corresponding to the M-th first image frame among the M first image frames and the N-th second image frame among the N second image frames in the M×N second data elements as the second specific data element (that is, the second data element W2(M, N)).
In Step S920, the processor 104 may determine a specific path in each of the reference two-dimensional data structures W01 to W0K based on the second specific data element.
Please refer to
In an embodiment, starting from the second specific data element W2(M, N), the processor 104 may, for example, iteratively select the smallest of the second data elements adjacent to the currently considered second data element (that is, the second data elements on the left, the top, and the upper left thereof) as a third data element, and repeat the operation from the newly selected third data element until the second data element W2(1, 1) is reached.
Afterwards, the processor 104 may determine a reference path RP with multiple coordinates of the third data elements in the second two-dimensional data structure W2. For example, the reference path RP includes the coordinates (1, 1), (2, 2), (3, 3), (4, 4), and (4, 5).
Next, the processor 104 may find the path corresponding to the reference path RP in each of the reference two-dimensional data structures W01 to W0K as a corresponding specific path.
For example, in the reference two-dimensional data structure W01, the processor 104 may take the path corresponding to the reference path RP as a corresponding specific path P1 (which includes the coordinates (1, 1), (2, 2), (3, 3), (4, 4), and (4, 5)); in the reference two-dimensional data structure W02, the processor 104 may take the path corresponding to the reference path RP as a corresponding specific path P2 (which includes the coordinates (1, 1), (2, 2), (3, 3), (4, 4), and (4, 5)); and in the reference two-dimensional data structure W0K, the processor 104 may take the path corresponding to the reference path RP as a corresponding specific path PK (which includes the coordinates (1, 1), (2, 2), (3, 3), (4, 4), and (4, 5)).
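For illustration only, the determination of the reference path RP (whose coordinates also define the specific paths P1 to PK) may be sketched as follows, by tracing the second two-dimensional data structure W2 backwards from the second specific data element; stepping to the smallest neighbour is an assumption of this sketch, consistent with the surrounding-element rule described earlier.

```python
def reference_path(W2):
    """Trace back from W2(M, N) to W2(1, 1). Returns the list of (m, n)
    coordinates (0-based here) ordered from the start of the specific time
    interval to its end; the same coordinates are used as the specific path
    in every reference two-dimensional data structure."""
    m, n = W2.shape[0] - 1, W2.shape[1] - 1
    path = [(m, n)]
    while (m, n) != (0, 0):
        candidates = []
        if m > 0:
            candidates.append(((m - 1, n), W2[m - 1, n]))
        if n > 0:
            candidates.append(((m, n - 1), W2[m, n - 1]))
        if m > 0 and n > 0:
            candidates.append(((m - 1, n - 1), W2[m - 1, n - 1]))
        (m, n), _ = min(candidates, key=lambda item: item[1])
        path.append((m, n))
    return list(reversed(path))
```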
Next, in Step S930, the processor 104 may determine a reference value corresponding to each of the reference two-dimensional data structures W01 to W0K based on each of the specific paths P1 to PK corresponding to each of the reference two-dimensional data structures W01 to W0K.
In an embodiment, the processor 104 may find multiple specific data elements corresponding to a specific path in a k-th reference two-dimensional data structure among the reference two-dimensional data structures W01 to W0K, and add the specific data elements into the reference value corresponding to the k-th reference two-dimensional data structure.
For example, for the reference two-dimensional data structure W01 (that is, a 1-st reference two-dimensional data structure), the processor 104 may determine the reference data element located on the specific path P1 as the specific data element in the reference two-dimensional data structure W01. After that, the processor 104 may add the specific data elements in the reference two-dimensional data structure W01 into the reference value corresponding to the reference two-dimensional data structure W01.
As another example, for the reference two-dimensional data structure W02 (that is, a 2-nd reference two-dimensional data structure), the processor 104 may determine the reference data element located on the specific path P2 as the specific data element in the reference two-dimensional data structure W02. Afterwards, the processor 104 may add the specific data elements in the reference two-dimensional data structure W02 into the reference value corresponding to the reference two-dimensional data structure W02.
As another example, for the reference two-dimensional data structure W0K (that is, a K-th reference two-dimensional data structure), the processor 104 may determine the reference data element located on the specific path PK as the specific data element in the reference two-dimensional data structure W0K. After that, the processor 104 may add the specific data elements in the reference two-dimensional data structure W0K into the reference value corresponding to the reference two-dimensional data structure W0K.
For other reference two-dimensional data structures, the processor 104 may determine the corresponding reference value based on the above mechanism, and the details thereof will not be repeated here.
In some embodiments, the processor 104 may also normalize the reference values corresponding to the reference two-dimensional data structures W01 to W0K, but not limited thereto.
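For illustration only, the reference value of each reference two-dimensional data structure may be computed as follows; the normalization shown (dividing by the sum of all reference values) is only one possible choice, since the exact form is left open above.

```python
import numpy as np

def joint_reference_values(W0, path, normalize=True):
    """W0: (M, N, K); path: list of (m, n) coordinates of the specific path.
    Returns a length-K vector whose k-th entry is the reference value of the
    k-th reference two-dimensional data structure, i.e. the sum of its
    reference data elements along the specific path."""
    K = W0.shape[2]
    values = np.array([sum(W0[m, n, k] for m, n in path) for k in range(K)])
    if normalize and values.sum() > 0:
        values = values / values.sum()  # one possible normalization
    return values
```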
Afterwards, in Step S940, the processor 104 provides the posture correction recommendation corresponding to the specific time interval T based on the reference value corresponding to each of the reference two-dimensional data structures W01 to W0K.
In the second embodiment, the reference two-dimensional data structures W01 to W0K may be divided to respectively correspond to different parts of the body, such as head, torso, buttocks, arms, legs, upper body, and lower body. In an embodiment, the reference two-dimensional data structures W01 to W0K may be divided into a first group and a second group respectively corresponding to a first body part and a second body part. For ease of understanding, the following assumes that the first body part and the second body part are respectively the upper body and the lower body, but not limited thereto. In this case, one or more of the reference two-dimensional data structures W01 to W0K corresponding to the joints belonging to the upper body may be determined as belonging to the first group, and one or more of the reference two-dimensional data structures W01 to W0K corresponding to the joints belonging to the lower body may be determined as belonging to the second group.
In an embodiment, the processor 104 may be configured to: determine a first difference degree value (hereinafter referred to as V1) based on the (normalized) reference value corresponding to each reference two-dimensional data structure belonging to the first group; and determine a second difference degree value (hereinafter referred to as V2) based on the (normalized) reference value corresponding to each reference two-dimensional data structure belonging to the second group.
For ease of understanding, it is assumed below that K is 9, the reference two-dimensional data structures W01 to W06 belong to the first group, and the reference two-dimensional data structures W07 to W09 belong to the second group, but the same is only exemplary and is not intended to limit the possible implementations of the disclosure.
Based on this, the processor 104 may determine the first difference degree value V1 based on the reference value corresponding to each of the reference two-dimensional data structures W01 to W06 belonging to the first group, and determine the second difference degree value V2 based on the reference value corresponding to each of the reference two-dimensional data structures W07 to W09 belonging to the second group. In an embodiment, the processor 104 may, for example, use the statistical characteristics and/or the linear combination of the reference values corresponding to the reference two-dimensional data structures W01 to W06 as the first difference degree value V1, and use the statistical characteristics and/or the linear combination of the reference values corresponding to the reference two-dimensional data structures W07 to W09 as the second difference degree value V2, but not limited thereto.
Afterwards, the processor 104 may provide the posture correction recommendation corresponding to the specific time interval T based on a comparison result of the first difference degree value V1 and the second difference degree value V2.
In an embodiment, in response to determining that the first difference degree value V1 is greater than the second difference degree value V2, the processor 104 provides a first posture correction recommendation for correcting the first body part (for example, the upper body) as the posture correction recommendation. On the other hand, in response to determining that the first difference degree value V1 is not greater than the second difference degree value V2, the processor 104 may provide a second posture correction recommendation for correcting the second body part (for example, the lower body) as the posture correction recommendation.
In an embodiment, the processor 104 may also first determine the posture difference degree according to the content of the first embodiment, and determine whether the posture difference degree is lower than a difference degree threshold. In response to determining that the posture difference degree is lower than the difference degree threshold, it means that the first user 311 imitates the action of the second user 312 well. In this case, the processor 104 may provide a maintain posture recommendation as the posture correction recommendation or may not provide any recommendation.
On the other hand, in response to determining that the posture difference degree is not lower than the difference degree threshold, it means that the first user 311 does not properly imitate the action of the second user 312. In this case, the processor 104 may then provide the corresponding first posture correction recommendation and/or second posture correction recommendation as the posture correction recommendation according to the mechanism of the second embodiment, but not limited thereto.
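For illustration only, the grouping of the reference values into a first body part and a second body part, the comparison of the first and second difference degree values V1 and V2, and the threshold check of the first embodiment may be combined as in the following sketch; the group indices, the threshold value, and the recommendation wording are assumptions of this sketch.

```python
import numpy as np

def posture_correction_recommendation(joint_values, first_group, second_group,
                                      posture_difference, difference_threshold=0.5):
    """joint_values: length-K reference values (one per reference
    two-dimensional data structure). first_group / second_group: index lists
    of the specified joints belonging to the first and second body parts
    (for example, upper body and lower body)."""
    if posture_difference < difference_threshold:
        # The first user already imitates the demonstrated action well.
        return "Maintain the current posture."
    V1 = float(np.mean([joint_values[k] for k in first_group]))   # first difference degree value
    V2 = float(np.mean([joint_values[k] for k in second_group]))  # second difference degree value
    if V1 > V2:
        return "Correct the first body part (for example, the upper body)."
    return "Correct the second body part (for example, the lower body)."
```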
In another embodiment, the reference two-dimensional data structures W01 to W0K may be divided into multiple different groups respectively corresponding to multiple different body parts. The processor 104 may respectively determine the posture difference degrees of the different groups, select the group corresponding to the largest difference degree value, and determine whether the posture difference degree is not lower than the difference degree threshold, thereby providing the corresponding posture correction recommendation for the body part corresponding to the group with the largest difference degree.
In yet another embodiment, in addition to providing the posture correction recommendations for different body parts, the processor 104 may also provide a quantified posture difference indicator or a posture difference interval according to the posture difference degree for the user to refer to the posture difference degree.
In summary, in the method according to the embodiments of the disclosure, the actions presented in different video streams may be compared and/or the posture correction recommendation may be provided, so that the user may know whether a specific action is correctly executed when performing the specific action with reference to the video and know which specific body parts need to be adjusted. Thereby, the effectiveness of learning actions through videos can be improved.
Finally, it should be noted that the above embodiments are only used to illustrate, but not to limit, the technical solutions of the disclosure. Although the disclosure has been described in detail with reference to the above embodiments, persons skilled in the art should understand that the technical solutions described in the above embodiments may still be modified or some or all of the technical features thereof may be equivalently replaced. However, such modifications or replacements do not cause the essence of the corresponding technical solutions to deviate from the scope of the technical solutions of the embodiments of the disclosure.