This disclosure relates to the field of wearable device technologies, and in particular, to a motion data calibration method and system.
As people pay more attention to scientific exercise and physical health, motion monitoring devices are developing rapidly. Currently, a motion monitoring device mainly uses a sensor to monitor action parameters during a movement of a user. In the existing technologies, when the motion monitoring device performs coordinate calibration with the sensor, the user is required to perform a series of calibration actions (for example, raising two arms forward or raising two arms laterally), so that the motion monitoring device can convert posture data in a sensor coordinate system to posture data in a human body coordinate system based on the calibration actions. Whenever the user adjusts a position of the motion monitoring device, coordinate calibration needs to be performed again; otherwise, a calculation result may be affected, causing poor user experience.
Therefore, it is necessary to provide a motion data calibration method and system that can calibrate motion data without requiring a user to perform a calibration action.
This disclosure provides a motion data calibration method and system that can calibrate motion data without requiring a user to perform a calibration action.
In a first aspect, the present disclosure provides a motion data calibration system, including: at least one storage medium storing at least one instruction set for calibrating motion data; and at least one processor in communication with the at least one storage medium, where during operation, the at least one processor executes the at least one instruction set to: obtain action data during a movement of a user, where the action data includes at least one posture signal corresponding to at least one measurement position on a body of the user, and each posture signal of the at least one posture signal includes three-dimensional posture data of a corresponding measurement position in an original coordinate system; establish a target coordinate system, where the target coordinate system includes an X-axis, a Y-axis, and a Z-axis mutually perpendicular to each other; and convert each posture signal to two-dimensional posture data in the target coordinate system.
In a second aspect, the present disclosure provides a method for calibrating motion data, including: obtaining action data during a movement of a user, where the action data includes at least one posture signal corresponding to at least one measurement position on a body of the user, and each posture signal of the at least one posture signal includes three-dimensional posture data of a corresponding measurement position in an original coordinate system; establishing a target coordinate system, where the target coordinate system includes an X-axis, a Y-axis, and a Z-axis mutually perpendicular to each other; and converting each posture signal to two-dimensional posture data in the target coordinate system.
As can be learned from the foregoing technical solutions, in the motion data calibration method and system provided in this disclosure, the action data during the movement of the user can be converted from three-dimensional posture data on the mutually perpendicular X-axis, Y-axis, and Z-axis to two-dimensional posture data in the target coordinate system, that is, posture data in the horizontal plane and posture data in the vertical plane. In this way, actions during movement of the user are classified into movement in a horizontal direction and movement in a vertical direction, and a data discrepancy caused by different orientations of the user is avoided. The method and system can eliminate the influence of the orientation of the user on the motion data. Therefore, the method and system can calibrate the motion data without requiring the user to perform any calibration action.
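Purely for illustration, the following Python sketch shows one way such a decomposition could be computed, assuming the posture vector is already expressed in a frame whose Z-axis points opposite to gravity (as established later in this disclosure); the function name and sample values are hypothetical, not part of the claimed method.

```python
import numpy as np

def decompose_to_2d(vec_xyz):
    """Split a 3-D posture vector, expressed in a frame whose Z-axis is
    vertical (opposite to gravity), into two orientation-independent parts:
    the magnitude of its projection onto the horizontal (X-Y) plane and
    its component along the vertical Z-axis."""
    horizontal = float(np.hypot(vec_xyz[0], vec_xyz[1]))  # invariant to heading
    vertical = float(vec_xyz[2])
    return horizontal, vertical

# Hypothetical angular velocity (rad/s) of one measurement position:
# the same arm swing yields the same 2-D data whichever way the user faces.
h, v = decompose_to_2d(np.array([0.8, 0.6, -0.2]))
print(h, v)  # 1.0 -0.2
```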
This disclosure is further described by using exemplary embodiments. The exemplary embodiments are described in detail with reference to accompanying drawings. The embodiments are non-restrictive. In the embodiments, same numerals represent same structures.
To describe the technical solutions in the embodiments of this disclosure in a clearer way, the following briefly introduces the accompanying drawings required for describing the embodiments. Apparently, the accompanying drawings in the following description merely show some examples or embodiments of this disclosure, and a person of ordinary skill in the art may further apply this disclosure to other similar scenarios based on the accompanying drawings without creative efforts. Unless explicitly indicated from a linguistic context or specifically explained otherwise, same reference numbers in the figures represent same structures or operations.
It should be understood that the terms “system”, “apparatus”, “unit”, and/or “module” used in this disclosure are for the purpose of distinguishing between different components, elements, parts, portions, or assemblies at different levels. However, if other terms can achieve the same purpose, the terms may be replaced with other expressions.
As shown in this disclosure or claims, unless explicitly indicated in an exceptional case in a context, the terms “a”, “one”, “one type of”, and/or “the” do not necessarily refer to singular forms, but may also include plural forms. Generally, the terms “comprise” and “include” merely indicate that explicitly identified steps and elements are included, but the steps and elements do not constitute an exclusive list, and a method or device may also include other steps or elements.
In this disclosure, a flowchart is used to describe operations performed by a system according to some embodiments of this disclosure. It should be understood that previous or subsequent operations are not necessarily performed in a fixed order. Conversely, steps may be performed in a reverse order or processed simultaneously. In addition, other operations may also be added to the processes, or one or some of the operations may be removed from the processes.
This disclosure provides a motion monitoring system. The motion monitoring system may obtain an action signal during a movement of a user, where the action signal includes at least a myoelectric signal, a posture signal, an electrocardio signal, a respiratory frequency signal, or the like. The system may monitor the action of the movement of the user based on at least feature information corresponding to the myoelectric signal or feature information corresponding to the posture signal. For example, by using frequency information and amplitude information corresponding to the myoelectric signal, and an angular velocity, a direction of the angular velocity, an acceleration of the angular velocity, an angle, displacement information, stress, and the like corresponding to the posture signal, the system determines an action type of the user, a quantity of actions, action quality, and an action time, or physiological parameter information or the like when the user implements the action. In some exemplary embodiments, the motion monitoring system may further generate feedback about an exercise action of the user based on a result of analysis of the exercise action of the user, to guide the user in exercise. For example, when the exercise action of the user does not meet the standard, the motion monitoring system may send out prompt information (for example, a voice prompt, a vibration prompt, or a current stimulation) to the user. The motion monitoring system may be applied to a wearable device (for example, a garment, a wristband, or a helmet), a medical test device (for example, a myoelectric tester), an exercise device, or the like. By obtaining the action signal during movement of the user, the motion monitoring system may accurately monitor the action of the user and provide feedback, without the involvement of any professional. Therefore, exercise costs of the user can be reduced while exercise efficiency of the user is improved.
For example, the motion monitoring system 100 may monitor an exercise action of the user and provide feedback. When the user wearing the wearable device 130 performs exercises, the wearable device 130 may obtain the action signal of the user. The processing device 110 or the mobile terminal device 140 may receive and analyze the action signal of the user, to determine whether the exercise action of the user meets the standard and monitor the action of the user. Specifically, the monitoring of the action of the user may include determining an action type of the action, a quantity of actions, action quality, and an action time, or physiological parameter information or the like when the user implements the action. Further, the motion monitoring system 100 may generate feedback about the exercise action of the user based on a result of analysis of the exercise action of the user, so as to guide the user in exercise.
In another example, the motion monitoring system 100 may monitor a running action of the user and provide feedback. For example, when the user wearing the wearable device 130 takes running exercise, the motion monitoring system 100 may monitor whether the running action of the user meets the standard, and whether a running time conforms to a health standard. When the running time of the user is too long, or the running action is incorrect, the exercise device may provide feedback on the movement status to the user, to prompt the user to adjust the running action or the running time.
In some exemplary embodiments, the processing device 110 may be configured to process information and/or data related to movement of the user. For example, the processing device 110 may receive the action signal of the user (for example, the myoelectric signal, the posture signal, the electrocardio signal, or the respiratory frequency signal), and further extract feature information corresponding to the action signal (for example, feature information corresponding to the myoelectric signal in the action signal, or feature information corresponding to the posture signal). In some exemplary embodiments, the processing device 110 may perform specific signal processing, for example, signal segmentation or signal preprocessing (for example, signal correction processing or filtering processing), on the myoelectric signal or the posture signal captured by the wearable device 130. In some exemplary embodiments, the processing device 110 may also determine, based on the action signal of the user, whether the action of the user is correct. For example, the processing device 110 may determine, based on the feature information (for example, amplitude information or frequency information) corresponding to the myoelectric signal, whether the action of the user is correct. In another example, the processing device 110 may determine, based on the feature information (for example, an angular velocity, a direction of the angular velocity, an acceleration of the angular velocity, an angle, displacement information, and stress) corresponding to the posture signal, whether the action of the user is correct. In another example, the processing device 110 may determine, based on the feature information corresponding to the myoelectric signal and the feature information corresponding to the posture signal, whether the action of the user is correct. In some exemplary embodiments, the processing device 110 may further determine whether the physiological parameter information during movement of the user conforms to the health standard. In some exemplary embodiments, the processing device 110 may further send a corresponding instruction for feeding back the movement status of the user. For example, when the user takes running exercise, the motion monitoring system 100 detects that the user has been running for too long. In this case, the processing device 110 may send an instruction to the mobile terminal device 140 to prompt the user to adjust the running time. It should be noted that the feature information corresponding to the posture signal is not limited to the foregoing angular velocity, direction of the angular velocity, acceleration of the angular velocity, angle, displacement information, stress, and the like, and may also be other feature information. Any parameter information that can reflect relative motion of the body of the user may be the feature information corresponding to the posture signal. For example, when a posture sensor is a strain sensor, a bending angle and a bending direction at the user's joint may be obtained by measuring a magnitude of resistance that varies with a tensile length in the strain sensor.
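As a hedged illustration of how such feature information might be extracted, the following sketch computes an amplitude feature (root mean square) and a frequency feature (mean frequency of the power spectrum) from one window of a myoelectric signal; the thresholds, sampling rate, and function name are assumptions for illustration, not values from this disclosure.

```python
import numpy as np

def emg_features(signal, fs):
    """Extract an amplitude feature (root mean square) and a frequency
    feature (mean frequency of the power spectrum) from one window of a
    myoelectric signal sampled at fs Hz."""
    rms = np.sqrt(np.mean(signal ** 2))
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    mean_freq = np.sum(freqs * spectrum) / np.sum(spectrum)
    return rms, mean_freq

# Hypothetical thresholds; real criteria would come from standard actions.
rms, mf = emg_features(np.random.randn(1000) * 0.1, fs=1000)
action_ok = rms > 0.05 and 20.0 < mf < 450.0
```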
In some exemplary embodiments, the processing device 110 may be a local or remote device. For example, through the network 120, the processing device 110 may access information and/or data stored in the wearable device 130 and/or the mobile terminal device 140. In some exemplary embodiments, the processing device 110 may be directly connected to the wearable device 130 and/or the mobile terminal device 140 to access information and/or data stored therein. For example, the processing device 110 may be located in the wearable device 130, and may exchange information with the mobile terminal device 140 through the network 120. In another example, the processing device 110 may be located in the mobile terminal device 140, and may exchange information with the wearable device 130 through the network. In some exemplary embodiments, the processing device 110 may be implemented on a cloud platform.
In some exemplary embodiments, the processing device 110 may process data and/or information related to motion monitoring to perform one or more of functions described in this disclosure. In some exemplary embodiments, the processing device 110 may obtain the action signal captured by the wearable device 130 during movement of the user. In some exemplary embodiments, the processing device 110 may send a control instruction to the wearable device 130 or the mobile terminal device 140. The control instruction may control a switch status of the wearable device 130 and each sensor of the wearable device 130, and may further control the mobile terminal device 140 to send out prompt information. In some exemplary embodiments, the processing device 110 may include one or more sub-processing devices (for example, single-core processing devices or multi-core processing devices).
The network 120 may facilitate the exchange of data and/or information in the motion monitoring system 100. In some exemplary embodiments, one or more components of the motion monitoring system 100 may send data and/or information to other components of the motion monitoring system 100 through the network 120. For example, the action signal captured by the wearable device 130 may be transmitted to the processing device 110 through the network 120. In another example, a confirmation result about the action signal in the processing device 110 may be transmitted to the mobile terminal device 140 through the network 120. In some exemplary embodiments, the network 120 may be any type of wired or wireless network.
The wearable device 130 refers to a garment or device that can be worn. In some exemplary embodiments, the wearable device 130 may include, but is not limited to, an upper garment apparatus 130-1, a pant apparatus 130-2, a wristband apparatus 130-3, a shoe apparatus 130-4, and the like. In some exemplary embodiments, the wearable device 130 may include a plurality of sensors. The sensors may obtain various action signals (for example, the myoelectric signal, the posture signal, temperature information, a heart rate, and the electrocardio signal) during movement of the user. In some exemplary embodiments, the sensor may include, but is not limited to, one or more of a myoelectric sensor, a posture sensor, a temperature sensor, a humidity sensor, an electrocardio sensor, a blood oxygen saturation sensor, a Hall sensor, a galvanic skin sensor, a rotation sensor, and the like. For example, the myoelectric sensor may be disposed at a position of a muscle (for example, a biceps brachii, a triceps brachii, a latissimus dorsi, or a trapezius) of a human body in the upper garment apparatus 130-1, and the myoelectric sensor may fit onto the user's skin and capture the myoelectric signal during movement of the user. In another example, an electrocardio sensor may be disposed near a left pectoral muscle of the human body in the upper garment apparatus 130-1, and the electrocardio sensor may capture the electrocardio signal of the user. In another example, the posture sensor may be disposed at a position of a muscle (for example, a gluteus maximus, a vastus lateralis, a vastus medialis, or a gastrocnemius) of the human body in the pant apparatus 130-2, and the posture sensor may capture the posture signal of the user. In some exemplary embodiments, the wearable device 130 may further provide feedback about the action of the user. For example, during movement of the user, when an action of a part of the body does not conform to the standard, a myoelectric sensor corresponding to the part may generate a stimulation signal (for example, a current stimulation or striking signal) to alert the user.
It should be noted that the wearable device 130 is not limited to the upper garment apparatus 130-1, the pant apparatus 130-2, the wristband apparatus 130-3, and the shoe apparatus 130-4 shown in
In some exemplary embodiments, the mobile terminal device 140 may obtain information or data in the motion monitoring system 100. In some exemplary embodiments, the mobile terminal device 140 may receive processed motion data from the processing device 110, and feed back a motion record or the like based on the processed motion data. Exemplary feedback manners may include, but are not limited to, a voice prompt, an image prompt, a video presentation, a text prompt, and the like. In some exemplary embodiments, the user may obtain an action record during movement of the user by using the mobile terminal device 140. For example, the mobile terminal device 140 may be connected to the wearable device 130 through the network 120 (for example, a wired connection or a wireless connection), and the user may obtain the action record during movement of the user by using the mobile terminal device 140, where the action record may be transmitted to the processing device 110 through the mobile terminal device 140. In some exemplary embodiments, the mobile terminal device 140 may include one or any combination of a mobile apparatus 140-1, a tablet computer 140-2, a notebook computer 140-3, and the like. In some exemplary embodiments, the mobile apparatus 140-1 may include a mobile phone, a smart home apparatus, a smart mobile apparatus, a virtual reality apparatus, an augmented reality apparatus, or the like, or any combination thereof. In some exemplary embodiments, the smart home apparatus may include a smart appliance control apparatus, a smart monitoring apparatus, a smart television, a smart camera, or the like, or any combination thereof. In some exemplary embodiments, the smart mobile apparatus may include a smartphone, a personal digital assistant (PDA), a game apparatus, a navigation apparatus, a POS apparatus, or the like, or any combination thereof. In some exemplary embodiments, the virtual reality apparatus and/or the augmented reality apparatus may include a virtual reality helmet, virtual reality glasses, virtual reality eye masks, an augmented reality helmet, augmented reality glasses, augmented reality eye masks, or the like, or any combination thereof.
In some exemplary embodiments, the motion monitoring system 100 may further include a motion data calibration system 180. The motion data calibration system 180 may be used to process action data related to user motion and can perform the motion data calibration method described in this specification. Specifically, the motion data calibration system 180 may receive the action data during movement of the user, and can convert the action data from three-dimensional posture data on the mutually perpendicular X-axis, Y-axis, and Z-axis to two-dimensional posture data in the target coordinate system, that is, posture data in the horizontal plane and posture data in the vertical plane. In this way, actions during movement of the user are classified into movement in a horizontal direction and movement in a vertical direction, and a data discrepancy caused by different orientations of the user is avoided. The motion data calibration system 180 can eliminate the adverse impact of the orientation of the user on the motion data. Therefore, the motion data may be calibrated without requiring the user to perform any calibration action. The action data may be three-dimensional posture data of a measurement position on a body of the user during movement. The action data and the motion data calibration method are described in detail later. In some exemplary embodiments, the motion data calibration system 180 may be integrated with the processing device 110. In some exemplary embodiments, the motion data calibration system 180 may alternatively be integrated with the mobile terminal device 140. In some exemplary embodiments, the motion data calibration system 180 may alternatively exist independently of the processing device 110 and the mobile terminal device 140. The motion data calibration system 180 may be in communication with the processing device 110, the wearable device 130, and the mobile terminal device 140 to transmit and exchange information and/or data. In some exemplary embodiments, through the network 120, the motion data calibration system 180 may access information and/or data stored in the processing device 110, the wearable device 130, and/or the mobile terminal device 140. In some exemplary embodiments, the motion data calibration system 180 may be directly connected to the processing device 110, the wearable device 130, and/or the mobile terminal device 140 to access information and/or data stored therein. For example, the motion data calibration system 180 may be located in the processing device 110, and may exchange information with the wearable device 130 and the mobile terminal device 140 through the network 120. In another example, the motion data calibration system 180 may be located in the mobile terminal device 140, and may exchange information with the processing device 110 and the wearable device 130 through the network. In some exemplary embodiments, the motion data calibration system 180 may be executed on the cloud platform, and may exchange information with the processing device 110, the wearable device 130, and the mobile terminal device 140 through the network.
For ease of presentation, the motion data calibration system 180 located in the processing device 110 is hereinafter used as an example in the following description.
In some exemplary embodiments, the motion monitoring system 100 may further include a database. The database may store data (for example, an initially set threshold condition) and/or an instruction (for example, a feedback instruction). In some exemplary embodiments, the database may store data obtained from the wearable device 130 and/or the mobile terminal device 140. In some exemplary embodiments, the database may store information and/or instructions for execution or use by the processing device 110, to perform the exemplary method described in this disclosure. In some exemplary embodiments, the database may be connected to the network 120 to communicate with one or more components of the motion monitoring system 100 (for example, the processing device 110, the wearable device 130, and the mobile terminal device 140). Through the network 120, the one or more components of the motion monitoring system 100 may access data or instructions stored in the database. In some exemplary embodiments, the database may be directly connected to or communicate with the one or more components of the motion monitoring system 100. In some exemplary embodiments, the database may be a part of the processing device 110.
The obtaining module 210 may be configured to obtain an action signal during a movement of a user. In some exemplary embodiments, the obtaining module 210 may include a sensor unit, where the sensor unit may be configured to obtain one or more action signals during movement of the user. In some exemplary embodiments, the sensor unit may include, but is not limited to, one or more of a myoelectric sensor, a posture sensor, an electrocardio sensor, a respiration sensor, a temperature sensor, a humidity sensor, an inertial sensor, a blood oxygen saturation sensor, a Hall sensor, a galvanic skin sensor, a rotation sensor, and the like. In some exemplary embodiments, the action signal may include one or more of a myoelectric signal, a posture signal, an electrocardio signal, a respiratory frequency signal, a temperature signal, a humidity signal, and the like. The sensor unit may be placed in different positions of the wearable device 130 depending on the type of the action signal to be obtained. For example, in some exemplary embodiments, the myoelectric sensor (also referred to as an electrode element) may be disposed at a position of a muscle of a human body, and the myoelectric sensor may be configured to capture a myoelectric signal during movement of the user. The myoelectric signal and feature information (for example, frequency information or amplitude information) corresponding to the myoelectric signal may reflect a status of the muscle during movement of the user. The posture sensor may be disposed at different positions of the human body (for example, positions corresponding to a torso, four limbs, and joints, in the wearable device 130), and the posture sensor may be configured to capture a posture signal during movement of the user. The posture signal and feature information (for example, a direction of an angular velocity, an angular velocity value, an acceleration value of the angular velocity, an angle, displacement information, and stress) corresponding to the posture signal may reflect a posture during movement of the user. The electrocardio sensor may be disposed at a position around a chest of the human body, and the electrocardio sensor may be configured to capture electrocardio data during movement of the user. The respiration sensor may be disposed at a position around the chest of the human body, and the respiration sensor may be configured to capture respiratory data (for example, a respiratory frequency and a respiratory amplitude) during movement of the user. The temperature sensor may be configured to capture temperature data (for example, a shell temperature) during movement of the user. The humidity sensor may be configured to capture humidity data of an external environment during movement of the user.
The processing module 220 may process data from the obtaining module 210, the control module 230, the communications module 240, the power supply module 250, and/or the input/output module 260. For example, the processing module 220 may process the action signal during movement of the user from the obtaining module 210. In some exemplary embodiments, the processing module 220 may preprocess the action signal (for example, the myoelectric signal or the posture signal) obtained by the obtaining module 210. For example, the processing module 220 may perform segmentation processing on the myoelectric signal or the posture signal during movement of the user. In another example, the processing module 220 may perform preprocessing (for example, filtering processing or signal correction processing) on the myoelectric signal during movement of the user, to improve quality of the myoelectric signal. In another example, the processing module 220 may determine, based on the posture signal during movement of the user, feature information corresponding to the posture signal. In some exemplary embodiments, the processing module 220 may process an instruction or an operation from the input/output module 260. In some exemplary embodiments, the processed data may be stored in a memory or a hard disk. In some exemplary embodiments, the processing module 220 may transmit, through the communications module 240 or the network 120, the data processed by the processing module 220 to one or more components of the motion monitoring system 100. For example, the processing module 220 may send a motion monitoring result of the user to the control module 230, and the control module 230 may execute a subsequent operation or instruction based on an action determining result.
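For illustration, a minimal sketch of the kind of preprocessing described above might look as follows; the filter order and the 20-450 Hz pass band are common surface-EMG assumptions rather than parameters specified in this disclosure.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def preprocess_emg(raw, fs=1000):
    """One possible preprocessing chain: a 20-450 Hz band-pass filter
    (an assumed, commonly used surface-EMG band) followed by full-wave
    rectification."""
    b, a = butter(4, [20 / (fs / 2), 450 / (fs / 2)], btype="bandpass")
    filtered = filtfilt(b, a, raw)   # zero-phase filtering avoids time lag
    return np.abs(filtered)          # rectification

emg = preprocess_emg(np.random.randn(2000))  # hypothetical raw signal
```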
The control module 230 may be connected to other modules in the wearable device 130. In some exemplary embodiments, the control module 230 may control a running status of another module in the wearable device 130. For example, the control module 230 may control a power supply status (for example, a normal mode or a power saving mode), a power supply time, and the like of the power supply module 250. In another example, the control module 230 may control the input/output module 260 based on the action determining result of the user, and may further control the mobile terminal device 140 to send a motion feedback result of the user to the user. When there is a problem with the action (for example, the action does not conform to a standard) during movement of the user, the control module 230 may control the input/output module 260, and may further control the mobile terminal device 140 to provide feedback for the user, so that the user can learn about his or her movement status and adjust the action. In some exemplary embodiments, the control module 230 may further control one or more sensors in the obtaining module 210 or another module to provide feedback about the human body. For example, when a muscle exerts excessive force during movement of the user, the control module 230 may control an electrode module at a position of the muscle to electrically stimulate the user to prompt the user to adjust the action in time.
In some exemplary embodiments, the communications module 240 may be configured to exchange information or data. In some exemplary embodiments, the communications module 240 may be used for communication between internal components of the wearable device 130. For example, the obtaining module 210 may send an action signal (for example, the myoelectric signal or the posture signal) of the user to the communications module 240, and the communications module 240 may send the action signal to the processing module 220. In some exemplary embodiments, the communications module 240 may be further used for communication between the wearable device 130 and other components of the motion monitoring system 100. For example, the communications module 240 may send status information (for example, a switch status) of the wearable device 130 to a processing device 110, and the processing device 110 may monitor the wearable device 130 based on the status information. The communications module 240 may use wired, wireless, or hybrid wired and wireless technologies.
In some exemplary embodiments, the power supply module 250 may supply power to other components of the motion monitoring system 100.
The input/output module 260 may obtain, transmit, and send signals. The input/output module 260 may be connected to or communicate with other components of the motion monitoring system 100. The other components of the motion monitoring system 100 may be connected or communicate through the input/output module 260.
It should be noted that the foregoing description of the motion monitoring system 100 and modules thereof is merely for ease of description and is not intended to limit one or more embodiments of this disclosure to the scope of the illustrated embodiments. It may be understood that, after learning the principle of the system, a person skilled in the art may combine the modules, or connect a constituent subsystem to other modules, or omit one or more modules, without departing from the principle. For example, the obtaining module 210 and the processing module 220 may be one module, and the module may have functions for obtaining and processing the action signal of the user. In another example, alternatively, the processing module 220 may not be disposed in the wearable device 130, but is integrated with the processing device 110. All such variations shall fall within the scope of protection of one or more embodiments of this disclosure.
The internal communications bus 310 may implement data communication between various components in the computing device 300. For example, the at least one processor 320 may send data to other hardware such as the at least one storage medium or the input/output port 360 through the internal communications bus 310. In some exemplary embodiments, the internal communications bus 310 may be an industry standard architecture (ISA) bus, an extended industry standard architecture (EISA) bus, a video electronics standards association (VESA) bus, a peripheral component interconnect (PCI) bus, or the like. In some exemplary embodiments, the internal communications bus 310 may be used to connect various modules (for example, the obtaining module 210, the processing module 220, the control module 230, the communications module 240, and the input/output module 260) in the motion monitoring system 100 shown in
The at least one storage medium of the computing device 300 may include a data storage apparatus. The data storage apparatus may be a non-transitory storage medium, or may be a transitory storage medium. For example, the data storage apparatus may include one or more of a read-only memory (ROM) 330, a random access memory (RAM) 340, a hard disk 370, and the like. Exemplary ROMs may include a mask ROM (MROM), a programmable ROM (PROM), an erasable programmable ROM (EPROM), an electrically erasable programmable ROM (EEPROM), a compact disc ROM (CD-ROM), a digital versatile disc ROM (DVD-ROM), or the like. Exemplary RAMs may include a dynamic RAM (DRAM), a double data rate synchronous dynamic RAM (DDR SDRAM), a static RAM (SRAM), a thyristor RAM (T-RAM), a zero-capacitance RAM (Z-RAM), or the like. The storage medium may store data/information obtained from any other component of the motion monitoring system 100. The storage medium further includes at least one instruction set stored in the data storage apparatus. The instruction set is computer program code, and the computer program code may include a program, a routine, an object, a component, a data structure, a process, a module, and the like that perform the motion data calibration method provided in this specification. In some exemplary embodiments, the storage medium of the computing device 300 may be located in the wearable device 130, or may be located in the processing device 110.
The at least one processor 320 may be communicatively connected to the at least one storage medium. The at least one processor 320 is configured to execute the at least one instruction set. When the computing device 300 runs, the at least one processor 320 may read the at least one instruction set, and execute a computer instruction (program code) based on an instruction of the at least one instruction set, so as to perform a function of the motion monitoring system 100 described in this disclosure. The processor 320 may perform all steps included in the data processing method. The computer instruction may include a program, an object, a component, a data structure, a process, a module, and a function (the function is a particular function described in this disclosure). For example, the processor 320 may process an action signal (for example, a myoelectric signal or a posture signal) obtained from a wearable device 130 and/or a mobile terminal device 140 of the motion monitoring system 100 during a movement of a user, and monitor a movement action of the user based on the action signal during movement of the user. For example, the processor 320 may be configured to process the action data during movement of the user obtained from the wearable device 130 and/or the mobile terminal device 140 of the motion monitoring system 100, and perform the motion data calibration method described in this specification based on the instruction of the at least one instruction set to convert the action data to two-dimensional posture data. In some exemplary embodiments, the processor 320 may include a microcontroller, a microprocessor, a reduced instruction set computer (RISC), an application-specific integrated circuit (ASIC), an application-specific instruction set processor (ASIP), a central processing unit (CPU), a graphics processing unit (GPU), a physical processing unit (PPU), a microcontroller unit, a digital signal processor (DSP), a field programmable gate array (FPGA), an advanced RISC machine (ARM), a programmable logic device, any circuit or processor that can implement one or more functions, and the like, or any combination thereof. For a purpose of description only, only one processor 320 is depicted in the computing device 300 in
The hard disk 370 may be configured to store information and data generated by the processing device 110 or received from the processing device 110. For example, the hard disk 370 may store user confirmation information of the user. In some exemplary embodiments, the hard disk 370 may be disposed in the processing device 110 or the wearable device 130.
The user interface 380 may implement interaction and information exchange between the computing device 300 and a user. In some exemplary embodiments, the user interface 380 may be configured to present a motion record generated by the motion monitoring system 100 to the user. In some exemplary embodiments, the user interface 380 may include a physical display, such as a display with a speaker, an LCD display, an LED display, an OLED display, or an electronic ink (E-Ink) display.
The input/output interface 360 may be configured to input or output signals, data, or information. In some exemplary embodiments, the input/output interface 360 may enable the user to interact with the motion monitoring system 100.
Step 510: Obtain an action signal during a movement of a user.
In some exemplary embodiments, step 510 may be performed by the obtaining module 210. The action signal refers to human parameter information during movement of the user. In some exemplary embodiments, the human parameter information may include, but is not limited to, one or more of a myoelectric signal, a posture signal, an electrocardio signal, a temperature signal, a humidity signal, a blood oxygen concentration, a respiratory frequency, and the like. In some exemplary embodiments, a myoelectric sensor in the obtaining module 210 may capture a myoelectric signal during movement of the user. For example, when the user performs seated chest fly, a myoelectric sensor corresponding to a position of a pectoral muscle, a latissimus dorsi, or the like of a human body, in a wearable device may capture a myoelectric signal at a corresponding muscle position of the user. In another example, when the user performs a deep squat action, a myoelectric sensor corresponding to a position of a gluteus maximus, a quadriceps femoris, or the like of the human body, in the wearable device may capture a myoelectric signal at a corresponding muscle position of the user. In another example, when the user takes running exercise, a myoelectric sensor corresponding to a position of a gastrocnemius or the like of the human body, in the wearable device may capture a myoelectric signal at the position of the gastrocnemius or the like of the human body. In some exemplary embodiments, a posture sensor in the obtaining module 210 may capture a posture signal during movement of the user. For example, when the user performs a barbell bench press action, a posture sensor corresponding to a position of a triceps brachii or the like of the human body, in the wearable device may capture a posture signal at the position of the triceps brachii or the like of the user. In another example, when the user performs a dumbbell fly action, a posture sensor disposed at a position of a deltoid or the like of the human body may capture a posture signal at the position of the deltoid or the like of the user. In some exemplary embodiments, the obtaining module 210 may include a plurality of posture sensors, and the plurality of posture sensors may obtain posture signals of a plurality of parts of the body during movement of the user, where the plurality of posture signals may reflect a status of relative movement between different parts of the human body. For example, a posture signal at an arm and a posture signal at a torso may reflect a movement status of the arm relative to the torso. In some exemplary embodiments, posture signals are associated with types of posture sensors. For example, when a posture sensor is a three-axis angular velocity sensor, an obtained posture signal is angular velocity information. In another example, when posture sensors are a three-axis angular velocity sensor and a three-axis acceleration sensor, obtained posture signals are angular velocity information and acceleration information. In another example, when a posture sensor is a strain sensor, the strain sensor may be disposed at a joint position of the user, a posture signal obtained by measuring a magnitude of resistance varying with a tensile length in the strain sensor may be displacement information, stress, and the like, and the posture signal may represent a bending angle and a bending direction at the user's joint.
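As a hedged example of the strain-sensor case just described, the following sketch maps a resistance change to a bending angle and direction under an assumed linear calibration; the resistance values and the coefficient are hypothetical, not parameters from this disclosure.

```python
def bend_angle_from_resistance(r_measured, r_rest, k_ohm_per_deg):
    """Map a strain sensor's resistance change to a joint bending angle,
    assuming (purely for illustration) a linear calibration in which the
    resistance grows by k_ohm_per_deg for every degree of flexion; the
    sign of the change indicates the bending direction."""
    delta = r_measured - r_rest
    angle = delta / k_ohm_per_deg
    direction = "flexion" if delta >= 0 else "extension"
    return angle, direction

# Hypothetical values: 350 ohm at rest, 0.5 ohm per degree of flexion.
print(bend_angle_from_resistance(380.0, 350.0, 0.5))  # (60.0, 'flexion')
```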
It should be noted that any parameter information that can reflect relative motion of the body of the user may serve as feature information corresponding to a posture signal, and different types of posture sensors may be used to obtain the information depending on the type of the feature information.
In some exemplary embodiments, the action signal may include a myoelectric signal of a specific part of the body of the user and a posture signal of the specific part. The myoelectric signal and the posture signal may reflect a movement status of the specific part of the body of the user from different perspectives. In short, the posture signal of the specific part of the body of the user may reflect an action type, an action amplitude, an action frequency, or the like of the specific part. The myoelectric signal may reflect a muscle status of the specific part during movement. In some exemplary embodiments, myoelectric signals and/or posture signals of the same part of the body may be used to better evaluate whether the action of the part meets the standard.
Step 520: Monitor the action of the movement of the user based on at least feature information corresponding to the myoelectric signal or feature information corresponding to the posture signal.
In some exemplary embodiments, this step may be performed by the processing module 220 and/or the processing device 110. In some exemplary embodiments, the feature information corresponding to the myoelectric signal may include, but is not limited to, one or more of frequency information, amplitude information, and the like. The feature information corresponding to the posture signal is parameter information used to represent relative motion of the body of the user. In some exemplary embodiments, the feature information corresponding to the posture signal may include, but is not limited to, one or more of a direction of an angular velocity, an angular velocity value, an acceleration value of the angular velocity, and the like. In some exemplary embodiments, the feature information corresponding to the posture signal may further include an angle, displacement information (for example, a tensile length in a strain sensor), stress, and the like. For example, when a posture sensor is a strain sensor, the strain sensor may be disposed at a joint position of the user, a posture signal obtained by measuring a magnitude of resistance varying with a tensile length in the strain sensor may be displacement information, stress, and the like, and the posture signal may represent a bending angle and a bending direction at the user's joint. In some exemplary embodiments, the processing module 220 and/or the processing device 110 may extract the feature information (for example, the frequency information and the amplitude information) corresponding to the myoelectric signal or the feature information (for example, the direction of the angular velocity, the angular velocity value, the acceleration value of the angular velocity, the angle, the displacement information, and the stress) corresponding to the posture signal, and monitor the action of the user based on the feature information corresponding to the myoelectric signal or the feature information corresponding to the posture signal. Herein, the monitoring of the action of the movement of the user includes monitoring information related to the action of the user. In some exemplary embodiments, the information related to the action may include one or more of an action type of the user, a quantity of actions, action quality (for example, whether the action of the user conforms to a standard), an action time, and the like. The action type refers to an exercise action taken during movement of the user. In some exemplary embodiments, the action type may include, but is not limited to, one or more of seated chest fly, deep squat exercise, deadlift exercise, plank exercise, running, swimming, and the like. The action quantity is a quantity of actions performed during movement of the user. For example, if the user performs seated chest fly 10 times, the action quantity herein is 10. The action quality refers to a degree of conformity of an exercise action performed by the user to a standard exercise action. For example, when the user performs a deep squat action, the processing device 110 may determine an action type of the action of the user based on feature information corresponding to an action signal (a myoelectric signal and a posture signal) at a specific muscle position (a gluteus maximus, a quadriceps femoris, or the like), and determine action quality of the deep squat action of the user based on an action signal of a standard deep squat action.
The action time refers to the time corresponding to one or more action types of the user or the total time of the motion process.
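Purely as a sketch of how an action quantity might be counted from such signals (not the method this disclosure claims), repetitions can be detected as peaks in a smoothed envelope of the action signal; all thresholds below are illustrative assumptions.

```python
import numpy as np
from scipy.signal import find_peaks

def count_repetitions(envelope, fs, min_period_s=1.0, min_height=0.2):
    """Count repeated actions as peaks in a smoothed signal envelope
    (for example, a rectified myoelectric signal or a joint angle);
    min_period_s and min_height are illustrative thresholds."""
    peaks, _ = find_peaks(envelope,
                          height=min_height,
                          distance=int(min_period_s * fs))
    return len(peaks)

# Ten synthetic repetitions: one 4-second cycle per action, sampled at 100 Hz.
t = np.arange(0, 40, 0.01)
envelope = 0.5 * (1 + np.sin(2 * np.pi * 0.25 * t))
print(count_repetitions(envelope, fs=100))  # 10
```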
As described above, the feature information corresponding to the posture signal may include parameter information that can be used to reflect relative motion of the body of the user. To monitor the action of the movement of the user, the motion monitoring system 100 needs to obtain relative motion between different parts of the body of the user. As described above, the posture signal may be posture data obtained by a posture sensor. The posture sensors may be distributed at different parts of the body of the user. To obtain relative motion between a plurality of different parts of the body of the user, the motion monitoring system 100 may calibrate posture signals between the plurality of different parts of the body of the user.
S3020. Obtain action data during a movement of a user.
In some exemplary embodiments, this step may be performed by the processing module 220 and/or the processing device 110. The action data refers to human motion parameter information during movement of the user. In some exemplary embodiments, the action data may include at least one posture signal corresponding to at least one measurement position on the body of the user. The posture signal and feature information (for example, a direction of an angular velocity, an angular velocity value, an acceleration value of the angular velocity, an angle, displacement information, and stress) corresponding to the posture signal may reflect a posture during movement of the user. The at least one measurement position corresponds to the at least one posture signal on a one-to-one basis. Measurement positions may be different parts of the body of the user. The at least one posture signal corresponds to an actual posture of the at least one measurement position on the body of the user during movement of the user. Each posture signal of the at least one posture signal may include three-dimensional posture data of the measurement position corresponding to the posture signal in an original coordinate system. The original coordinate system may be a coordinate system in which the posture signal is located. When the posture of the user does not change, a change of the original coordinate system may also cause a change of the posture signal. Each posture signal may include one or more types of three-dimensional posture data, for example, three-dimensional angle data, three-dimensional angular velocity data, three-dimensional angular acceleration data, three-dimensional velocity data, three-dimensional displacement data, and three-dimensional stress data.
In some exemplary embodiments, a posture signal may be obtained by a posture sensor in the wearable device 130. As described above, the sensor unit of the obtaining module 210 in the wearable device 130 may include a posture sensor. Specifically, the wearable device 130 may include at least one posture sensor. The at least one posture sensor may be located in at least one measurement position on the body of the user. The posture sensor may capture a posture signal of the corresponding measurement position on the body of the user. Posture sensors in the wearable device 130 may be distributed on four limbs of the human body (for example, arms and legs), the torso of the human body (for example, the chest, abdomen, back, and waist), the head of the human body, and the like, so that the posture sensors can capture posture signals at the corresponding parts of the human body, such as the limbs and the torso. The posture sensors may be placed at different positions of the wearable device 130 based on the posture signals to be obtained, to measure the posture signals corresponding to different positions of the human body. In some exemplary embodiments, a posture sensor may also be a sensor having an attitude and heading reference system (AHRS) with a posture fusion algorithm. The posture fusion algorithm may fuse data of a nine-axis inertial measurement unit (IMU) having a three-axis acceleration sensor, a three-axis angular velocity sensor, and a three-axis geomagnetic sensor into an Euler angle or a quaternion, to obtain a posture signal of the body part of the user at which the posture sensor is located. In some exemplary embodiments, the processing module 220 and/or the processing device 110 may determine, based on the posture signal, the feature information corresponding to the posture signal. In some exemplary embodiments, the feature information corresponding to the posture signal may include, but is not limited to, an angular velocity value, a direction of an angular velocity, an acceleration value of the angular velocity, and the like. In some exemplary embodiments, the posture sensor may be a strain sensor, and the strain sensor may obtain a bending direction and a bending angle at the user's joint to obtain a posture signal during movement of the user. For example, the strain sensor may be disposed at a knee joint of the user; during movement of the user, a body part of the user acts on the strain sensor, and a bending direction and a bending angle at the knee joint of the user may be calculated based on a resistance or length change of the strain sensor, to obtain a posture signal of a leg of the user. In some exemplary embodiments, the posture sensor may further include a fiber sensor, and the posture signal may be represented by a change of a direction of light after bending at the fiber sensor. In some exemplary embodiments, the posture sensor may alternatively be a magnetic flux sensor, and the posture signal may be represented by a change of a magnetic flux. It should be noted that the type of the posture sensor is not limited to the foregoing sensors, but may also be other sensors. All sensors capable of obtaining a posture signal of a user shall fall within the scope of the posture sensor in this disclosure.
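The posture fusion itself typically runs on the sensor; purely as an illustrative sketch (not the AHRS algorithm of any particular device), a single-axis complementary filter that blends gyroscope integration with an accelerometer gravity estimate might look like this, with all readings and the weighting factor being hypothetical:

```python
import math

def complementary_filter(angle_prev, gyro_rate, accel_angle, dt, alpha=0.98):
    """Blend an angle integrated from the angular velocity (smooth but
    drifting) with an angle derived from the accelerometer's gravity
    vector (noisy but drift-free); alpha is an assumed weighting."""
    return alpha * (angle_prev + gyro_rate * dt) + (1 - alpha) * accel_angle

# One update step with hypothetical readings: pitch from the accelerometer
# is computed as atan2 of the measured gravity components (ax, az in g).
accel_pitch = math.degrees(math.atan2(0.17, 0.98))
pitch = complementary_filter(9.5, gyro_rate=2.0, accel_angle=accel_pitch, dt=0.01)
```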
As described above, each posture signal may include one or more types of three-dimensional posture data, and the posture sensor may accordingly include a variety of sensors. In some exemplary embodiments, the posture sensor may include at least one of an acceleration sensor, an angular velocity sensor, and a magnetic sensor.
When the posture signal is data measured by the posture sensor, the original coordinate system may be a coordinate system in which the posture sensor is located. In some exemplary embodiments, the original coordinate system is a coordinate system corresponding to a posture sensor disposed on the human body. When the user uses the wearable device 130, the posture sensors in the wearable device 130 are distributed at different parts of the human body, so that installation angles of the posture sensors on the human body are different, and the posture sensors at different parts respectively use the coordinate systems of their own device bodies as original coordinate systems. Therefore, the posture sensors at different parts have different original coordinate systems. In some exemplary embodiments, a posture signal obtained by each posture sensor may be an expression in the original coordinate system of the corresponding posture sensor. The posture signal obtained by the posture sensor may be a posture signal of a preset fixed coordinate system in the original coordinate system. The preset fixed coordinate system may be a geodetic coordinate system, or may be any other preset coordinate system. A conversion relationship between the original coordinate system and the preset fixed coordinate system may be pre-stored in the motion data calibration system 180.
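For illustration, applying such a pre-stored conversion relationship could look like the following sketch, where the rotation matrix is a hypothetical placeholder standing in for the calibrated relationship:

```python
import numpy as np

def to_fixed_frame(vec_sensor, R_fixed_from_sensor):
    """Re-express a vector measured in a posture sensor's own (original)
    coordinate system in the preset fixed coordinate system, using a
    pre-stored rotation matrix as the conversion relationship."""
    return R_fixed_from_sensor @ vec_sensor

# Example only (a 90-degree yaw); the real relationship is calibrated and stored.
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])
print(to_fixed_frame(np.array([1.0, 0.0, 0.0]), R))  # [0. 1. 0.]
```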
In some exemplary embodiments, the posture signal may be a signal directly obtained by the posture sensor, or may be a posture signal formed after a signal processing procedure such as conventional filtering, rectification, wavelet transformation, or burr (spike) removal is performed on the signal directly obtained by the posture sensor, or may be a signal obtained by permuting or combining any one or more of the foregoing processing procedures.
In some exemplary embodiments, the posture signal may be data obtained by an image sensor. The image sensor may be an image sensor capable of obtaining depth information, such as a 3D structured light camera or a binocular camera. The image sensor may be installed at any position capable of shooting an image during movement of the user. There may be one or more image sensors. When there are a plurality of image sensors, the plurality of image sensors may be installed at a plurality of different positions. The image sensor may obtain a depth image during movement of the user. The depth image may include depth information of at least one measurement position on the body of the user relative to a coordinate system in which the image sensor is located. When the user moves, the motion data calibration system 180 may obtain a posture signal at each of the at least one measurement position through calculation based on a change in a plurality of frames of depth images. As described above, each of the posture signals may include one or more types of three-dimensional posture data. The motion data calibration system 180 may obtain different three-dimensional posture data through calculation.
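As a hedged sketch of this calculation, the three-dimensional velocity of one measurement position can be estimated from its coordinates in two consecutive depth frames; the keypoint positions below are made-up values, and obtaining them would require a separate detector not described here.

```python
import numpy as np

def velocity_from_depth_frames(p_prev, p_curr, dt):
    """Estimate the 3-D velocity of one measurement position from its
    coordinates in two consecutive depth frames, both expressed in the
    image sensor's coordinate system."""
    return (np.asarray(p_curr) - np.asarray(p_prev)) / dt

# Hypothetical wrist positions (metres) 1/30 s apart.
v = velocity_from_depth_frames([0.10, 0.42, 1.50], [0.13, 0.40, 1.49], dt=1 / 30)
print(v)  # ~[0.9 -0.6 -0.3] m/s
```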
When the posture signal is data measured by the image sensor, the original coordinate system may be a coordinate system in which the image sensor itself is located. In some exemplary embodiments, the posture signal obtained by the image sensor may be an expression in a coordinate system (original coordinate system) of the image sensor corresponding to the posture signal. The image sensor may perform calibration in advance. To be specific, a conversion relationship between the coordinate system of the image sensor and the foregoing preset fixed coordinate system may be pre-stored in the motion data calibration system 180.
For ease of presentation, in the following description, an example in which the action data is data measured by at least one posture sensor is used for description. The original coordinate system is defined as an o-xyz coordinate system, where o is a coordinate origin of the original coordinate system o-xyz, and an x-axis, a y-axis, and a z-axis are respectively three mutually perpendicular coordinate axes of the original coordinate system o-xyz.
As described above, each posture signal may be three-dimensional posture data of the measurement position corresponding to the posture signal in the original coordinate system o-xyz. In some exemplary embodiments, the three-dimensional posture data may be posture data on the three mutually perpendicular coordinate axes of the coordinate system in which the three-dimensional posture data is located. In some exemplary embodiments, the posture data may include angle data and angular velocity data. In some exemplary embodiments, the three-dimensional posture data in the original coordinate system o-xyz may include angle data and angular velocity data on the three mutually perpendicular coordinate axes: the x-axis, the y-axis, and the z-axis. For ease of presentation, the three-dimensional angle (Euler) data and the three-dimensional angular velocity (Gyro) data of each posture signal in the original coordinate system o-xyz are denoted as Esens and Gsens, respectively. The three-dimensional angle data Esens may include angle data Esens_x on the x-axis, angle data Esens_y on the y-axis, and angle data Esens_z on the z-axis. The three-dimensional angular velocity data Gsens may include angular velocity data Gsens_x on the x-axis, angular velocity data Gsens_y on the y-axis, and angular velocity data Gsens_z on the z-axis.
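For illustration only (not part of the claimed method), one posture signal could be represented in Python as in the following minimal sketch; the record name PostureSignal and the sample values are hypothetical assumptions:

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class PostureSignal:
    # Three-dimensional angle (Euler) data (Esens_x, Esens_y, Esens_z)
    # in the original o-xyz coordinate system, e.g., in degrees.
    Esens: np.ndarray
    # Three-dimensional angular velocity (Gyro) data
    # (Gsens_x, Gsens_y, Gsens_z) in o-xyz, e.g., in degrees per second.
    Gsens: np.ndarray

# Example values are arbitrary placeholders.
signal = PostureSignal(Esens=np.array([10.0, -5.0, 30.0]),
                       Gsens=np.array([0.5, 1.2, -0.3]))
```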
S3040. Establish a target coordinate system.
In some exemplary embodiments, this step may be performed by the processing module 220 and/or the processing device 110. For ease of presentation, the target coordinate system is defined as O-XYZ. To conveniently determine relative motion between different parts of the user, the motion data calibration system 180 may convert the action data to posture data in a same known coordinate system, that is, the target coordinate system O-XYZ. The target coordinate system O-XYZ may include three mutually perpendicular coordinate axes: an X-axis, a Y-axis, and a Z-axis.
In some exemplary embodiments, the target coordinate system O-XYZ may be any calibrated coordinate system. In some exemplary embodiments, the target coordinate system O-XYZ may be the foregoing preset fixed coordinate system. In some exemplary embodiments, the target coordinate system O-XYZ and the foregoing preset fixed coordinate system may be different coordinate systems. A conversion relationship between the foregoing preset fixed coordinate system and the target coordinate system may be pre-stored in the motion data calibration system 180.
The motion data calibration system 180 is configured to calibrate the action data during movement of the user, and its measurement object is the user. Therefore, in some exemplary embodiments, a length direction of the torso when the human body is standing may be used as the Z-axis of the target coordinate system O-XYZ. To be specific, the Z-axis points in a direction opposite to that of gravitational acceleration. In other words, the Z-axis is a coordinate axis perpendicular to the ground and pointing to the sky. A plane formed by the X-axis and the Y-axis is a horizontal plane perpendicular to the Z-axis. In some exemplary embodiments, the X-axis and the Y-axis may be any two mutually perpendicular coordinate axes in the horizontal plane perpendicular to the Z-axis. In some exemplary embodiments, the X-axis may be an east-west coordinate axis, for example, pointing east, and the Y-axis may be a north-south coordinate axis, for example, pointing north.
S3060. Convert each posture signal to two-dimensional posture data in the target coordinate system.
S3062. Obtain a pre-stored conversion relationship between the target coordinate system O-XYZ and the original coordinate system o-xyz.
As described above, the posture signal obtained by each posture sensor may be an expression in the original coordinate system corresponding to the posture sensor. Specifically, the posture signal obtained by the posture sensor may be a posture signal of the preset fixed coordinate system in the original coordinate system at the measurement position corresponding to the current posture signal. The motion data calibration system 180 may obtain, in a reverse conversion manner based on each posture signal, an expression of the original coordinate system o-xyz of the posture sensor corresponding to the posture signal in the preset fixed coordinate system. As described above, the conversion relationship between the preset fixed coordinate system and the target coordinate system O-XYZ may be pre-stored in the motion data calibration system 180. The motion data calibration system 180 may calculate and determine the conversion relationship between each original coordinate system o-xyz and the target coordinate system O-XYZ based on the conversion relationship between the preset fixed coordinate system and the target coordinate system O-XYZ. The motion data calibration system 180 may convert posture information in the original coordinate system o-xyz to posture information in the target coordinate system O-XYZ based on the conversion relationship. In some exemplary embodiments, the conversion relationship may be represented by one or more rotation matrices. The rotation matrix may be pre-stored in the motion data calibration system 180.
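For illustration only, the following Python sketch shows one plausible way to compose the pre-stored conversion relationships by using rotation objects; the example Euler angles and the names R_fixed_to_target, R_fixed_in_orig, and R_orig_to_target are hypothetical assumptions, not the disclosure's exact implementation:

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

# Pre-stored: conversion from the preset fixed coordinate system to the
# target coordinate system O-XYZ (example value only).
R_fixed_to_target = R.from_euler('z', 90, degrees=True)

# Measured: the sensor reports the preset fixed coordinate system as an
# expression in its own original o-xyz system; the "reverse conversion"
# described above corresponds to an inversion.
R_fixed_in_orig = R.from_euler('xyz', [10, 20, 30], degrees=True)
R_orig_to_fixed = R_fixed_in_orig.inv()

# Composing the two gives the conversion from the original o-xyz system
# directly to the target O-XYZ system; (A * B).apply(v) applies B first.
R_orig_to_target = R_fixed_to_target * R_orig_to_fixed

rotation_matrix = R_orig_to_target.as_matrix()  # 3x3 rotation matrix
```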
S3064. Convert each posture signal to three-dimensional motion data in the target coordinate system O-XYZ based on the conversion relationship between the target coordinate system O-XYZ and the original coordinate system o-xyz.
As described above, each posture signal may include three-dimensional posture data Esens_x, Esens_y, Esens_z, Gsens_x, Gsens_y, and Gsens_z in the original coordinate system o-xyz corresponding to the posture signal. The motion data calibration system 180 may determine, based on the conversion relationship between the target coordinate system O-XYZ and the original coordinate system o-xyz, three-dimensional motion data of the original coordinate system o-xyz at the measurement position corresponding to each posture signal in the target coordinate system O-XYZ. In some exemplary embodiments, the three-dimensional motion data includes at least angular velocity data Gglobal_X on the X-axis, angular velocity data Gglobal_Y on the Y-axis, and angular velocity data Gglobal_Z on the Z-axis.
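As an illustration of this step (again with hypothetical values), the angular velocity components of one posture signal may be converted to the target coordinate system by applying the conversion relationship as a rotation:

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

# Hypothetical conversion from one sensor's o-xyz system to O-XYZ.
R_orig_to_target = R.from_euler('xyz', [10, 20, 30], degrees=True)

G_sens = np.array([0.5, 1.2, -0.3])        # (Gsens_x, Gsens_y, Gsens_z)
G_global = R_orig_to_target.apply(G_sens)  # (Gglobal_X, Gglobal_Y, Gglobal_Z)
Gglobal_X, Gglobal_Y, Gglobal_Z = G_global
```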
S3066. Convert the three-dimensional motion data in the target coordinate system O-XYZ to two-dimensional posture data in the target coordinate system O-XYZ.
The two-dimensional posture data is data in a two-dimensional coordinate system. The two-dimensional coordinate system in the target coordinate system O-XYZ may include a motion plane when limbs of the user 001 swing and a motion plane when the torso of the user 001 rotates. In some exemplary embodiments, the two-dimensional posture data in the target coordinate system O-XYZ may include horizontal posture data and vertical posture data. The horizontal posture data may include horizontal angle data Eglobal_Z and horizontal angular velocity data Gglobal_Z during a movement in a horizontal plane perpendicular to the Z-axis. The vertical posture data may include vertical angle data Eglobal_XY and vertical angular velocity data Gglobal_XY during a movement in any vertical plane perpendicular to the horizontal plane. The vertical plane may be any plane perpendicular to the horizontal plane. The horizontal angle data Eglobal_Z may be a rotational angle of the measurement position in the horizontal plane in the target coordinate system O-XYZ. The horizontal angular velocity data Gglobal_Z may be a rotational angular velocity of the measurement position in the horizontal plane in the target coordinate system O-XYZ. The vertical angle data Eglobal_XY may be a rotational angle of the measurement position in any vertical plane in the target coordinate system O-XYZ. The vertical angular velocity data Gglobal_XY may be a rotational angular velocity of the measurement position in any vertical plane in the target coordinate system O-XYZ.
For most exercise actions in a gymnasium, a major portion of an action of the user 001 may be decomposed into motion in two planes, for example, motion in place in the horizontal plane and motion in any one vertical plane. Different actions performed by the user 001 during movement can thus be distinguished by using only the motion in the horizontal plane and the motion in the vertical plane. For example, while the user 001 is running on a treadmill, the running action of the user 001 mainly involves rotational movement around joint axes parallel to the horizontal plane. In this case, the rotational movement occurs in a vertical plane perpendicular to the horizontal plane, and the vertical plane extends in the direction of the body orientation of the user 001. The center of gravity of the user 001 moves linearly up and down in this vertical plane, and each limb of the user swings with the center of gravity. In another example, when the user 001 performs a biceps curl, the action includes only movement in the vertical plane. In yet another example, when the user 001 performs a seated chest fly, the action includes only movement in the horizontal plane.
As shown in the accompanying figure, S3066 may include the following steps:
S3066-2. Convert the angular velocity data Gglobal_X on the X-axis and the angular velocity data Gglobal_Y on the Y-axis to the vertical angular velocity data Gglobal_XY by using the vector law.

Because the X-axis and the Y-axis are mutually perpendicular, the vertical angular velocity data Gglobal_XY may be expressed as the following formula:

$$G_{global\_XY} = \sqrt{G_{global\_X}^2 + G_{global\_Y}^2} \quad \text{formula (3)}$$
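For illustration only, the vector law in formula (3) reduces to taking the magnitude of the two perpendicular components, as in the following sketch with hypothetical values:

```python
import numpy as np

# Magnitude of the perpendicular X- and Y-axis angular velocity components.
Gglobal_X, Gglobal_Y = 0.8, 0.6
Gglobal_XY = np.hypot(Gglobal_X, Gglobal_Y)  # sqrt(0.8**2 + 0.6**2) = 1.0
```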
S3066-4. Perform time integration on the vertical angular velocity data based on a time corresponding to a start position and a time corresponding to an end position during movement of the user 001, to obtain the vertical angle data Eglobal_XY.
The vertical angle data Eglobal_XY may be expressed as the following formula:
$$E_{global\_XY} = \int_{startpos}^{endpos} G_{global\_XY}\,dt \quad \text{formula (4)}$$

where startpos and endpos are the start time and the end time corresponding to a start position and an end position of an action, respectively.
S3066-6. Use the angular velocity data Gglobal_Z on the Z-axis as the horizontal angular velocity data Gglobal_Z.
S3066-8. Perform time integration on the horizontal angular velocity data Gglobal_Z based on the time corresponding to the start position and the time corresponding to the end position during movement of the user 001, to obtain the horizontal angle data Eglobal_Z.
The horizontal angle data Eglobal_Z may be expressed as the following formula:
$$E_{global\_Z} = \int_{startpos}^{endpos} G_{global\_Z}\,dt \quad \text{formula (5)}$$

where startpos and endpos are the start time and the end time corresponding to a start position and an end position of an action, respectively.
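For illustration only, the time integrations in formula (4) and formula (5) may be approximated numerically over sampled angular velocity data, as in the following sketch; the sample timestamps and angular velocity series are hypothetical:

```python
import numpy as np

t = np.linspace(0.0, 2.0, 201)          # sample timestamps, in seconds
Gglobal_XY = np.abs(np.sin(np.pi * t))  # hypothetical vertical angular velocity
Gglobal_Z = 0.1 * np.ones_like(t)       # hypothetical horizontal angular velocity

start, end = 0, len(t) - 1              # sample indices of startpos and endpos

def integrate(g, t, i0, i1):
    # Trapezoidal approximation of the integral of g over t[i0]..t[i1].
    g, t = g[i0:i1 + 1], t[i0:i1 + 1]
    return np.sum(0.5 * (g[1:] + g[:-1]) * np.diff(t))

Eglobal_XY = integrate(Gglobal_XY, t, start, end)  # formula (4)
Eglobal_Z = integrate(Gglobal_Z, t, start, end)    # formula (5)
```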
In some exemplary embodiments, the motion data calibration method 3000 may further include:
S3080. Determine relative motion between the at least one measurement position based on the two-dimensional posture data corresponding to each posture signal.
The motion data calibration system 180 may determine relative motion between different moving parts of the body of the user 001 by using the at least one piece of two-dimensional posture data corresponding to the at least one posture signal at the at least one measurement position on the body of the user 001. For example, the relative motion between the arms and the torso during movement of the user 001 may be determined by using the two-dimensional posture data corresponding to the posture sensors at the arms of the user 001 and the two-dimensional posture data corresponding to a posture sensor at the torso of the user 001.
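For illustration only, the relative motion between two measurement positions could be characterized by comparing their two-dimensional posture data, as in the following sketch; the angle series for the arm sensor and the torso sensor are hypothetical:

```python
import numpy as np

# Hypothetical vertical angle data Eglobal_XY sampled at an arm sensor
# and at a torso sensor over the same action, in degrees.
arm_vertical_angle = np.array([0.0, 15.0, 40.0, 70.0])
torso_vertical_angle = np.array([0.0, 2.0, 3.0, 4.0])

# A large difference indicates the arm rotates substantially relative to
# the torso, for example during a biceps curl.
relative_vertical_angle = arm_vertical_angle - torso_vertical_angle
```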
In summary, the motion data calibration method 3000 and the system 180 provided in this disclosure can convert the action data during movement of the user 001 from the three-dimensional posture data on the three mutually perpendicular coordinate axes to the two-dimensional posture data in the target coordinate system, that is, the posture data in the horizontal plane and the posture data in the vertical plane, thereby dividing the actions of the user 001 during movement into motion in the horizontal direction and motion in the vertical direction, and avoiding a data discrepancy caused by different orientations of the user 001. The method 3000 and the system 180 can eliminate the influence of the orientations of the user 001 on the motion data. Therefore, the method 3000 and the system 180 can calibrate the motion data without requiring the user 001 to perform a calibration action.
Basic concepts have been described above. Apparently, for a person skilled in the art, the foregoing detailed disclosure is only an example and should not be construed as a limitation on this disclosure. Although not explicitly stated herein, a person skilled in the art may make various changes, improvements, and modifications to this disclosure. Such changes, improvements, and modifications are suggested by this disclosure, and therefore still fall within the spirit and scope of the exemplary embodiments of this disclosure.
In addition, specific terms are used in this disclosure to describe the embodiments of this disclosure. For example, "one embodiment", "an embodiment", and/or "some exemplary embodiments" mean that a feature, a structure, or a characteristic is described in connection with at least one embodiment of this disclosure. Therefore, it should be emphasized and noted that two or more references to "an embodiment", "one embodiment", or "an alternative embodiment" in various places throughout this disclosure do not necessarily refer to the same embodiment. Furthermore, some of the features, structures, or characteristics in one or more embodiments of this disclosure may be appropriately combined.
This application is a continuation of PCT application No. PCT/CN2021/134421, filed on Nov. 30, 2021, the contents of which are incorporated herein by reference in their entirety.
Parent application: PCT/CN2021/134421, filed Nov. 2021 (US). Child application: 18421955 (US).